The variance equation for linear combinations of random variables applies to the dot product; together with, say, the cosine of the angle between the vectors, that explains a lot about neural nets. Also the CLT.
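Quick numpy sketch of what I mean (my own toy example, not from the talk): for independent inputs x_i with variance s², the variance equation predicts Var(w·x) = s²·Σw_i², and by the CLT the dot product of a long random input with a fixed weight vector comes out roughly Gaussian.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1000
w = rng.normal(size=n)  # a fixed weight vector, like one neuron's weights

# Draw many independent inputs and collect the dot products.
s = 2.0  # per-component standard deviation of the input
samples = np.array([w @ rng.normal(scale=s, size=n) for _ in range(20000)])

# Variance equation for a linear combination of independent variables:
predicted_var = s**2 * np.sum(w**2)
print(np.var(samples) / predicted_var)  # ratio should be close to 1
```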
ReLU: Connect is f(x)=x. Disconnect is f(x)=0. A switch.
A ReLU net is a switched system of dot products that collapses to a particular simple matrix for a particular input.
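To make the "collapses to a matrix" point concrete, here's a tiny sketch (my own illustration, any sizes are arbitrary): once an input fixes the 0/1 state of every ReLU switch, the whole net is just one matrix acting on that input.

```python
import numpy as np

rng = np.random.default_rng(1)
W1 = rng.normal(size=(8, 4))
W2 = rng.normal(size=(3, 8))

def relu_net(x):
    # one hidden layer of dot products, ReLU switches, then output dot products
    return W2 @ np.maximum(W1 @ x, 0.0)

x = rng.normal(size=4)

# For this particular input, each ReLU is either "connect" (1) or "disconnect" (0):
mask = (W1 @ x > 0).astype(float)

# The net collapses to one particular simple matrix for this input:
M = W2 @ (np.diag(mask) @ W1)
print(np.allclose(relu_net(x), M @ x))  # True
```

A different input generally flips some switches, giving a different collapsed matrix — that's the switched system of dot products.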
Inside-out nets: fixed dot products and adjustable, parametric activation functions. Fast transforms are a source of fixed dot products (FFT, WHT). Sign flipping before the net acts as a simple random projection, which makes good use of the first layer — otherwise the first transform just hands you the spectrum of the input, which you don't want. A final fast transform can act as a pseudo-read-out layer.
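Here's a small sketch of the sign-flipping idea (my own toy code, not from the talk), using an unnormalized fast Walsh–Hadamard transform as the fixed dot products: a structured input dumps all its energy into one WHT bin, while a fixed random sign flip spreads it across all bins.

```python
import numpy as np

def wht(x):
    # fast Walsh–Hadamard transform (unnormalized), O(n log n), n a power of 2
    x = x.copy().astype(float)
    h = 1
    while h < len(x):
        for i in range(0, len(x), 2 * h):
            for j in range(i, i + h):
                a, b = x[j], x[j + h]
                x[j], x[j + h] = a + b, a - b
        h *= 2
    return x

rng = np.random.default_rng(2)
n = 16
signs = rng.choice([-1.0, 1.0], size=n)  # fixed random sign-flip pattern

x = np.ones(n)  # worst case: all energy lands in one WHT coefficient
print(wht(x))          # concentrated spectrum: [16, 0, 0, ...]
print(wht(signs * x))  # energy spread over many coefficients
```

The sign flip costs only n multiplications, so the projection stays as cheap as the transform itself.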
Very informative talk! I just wish the instructor had talked a little slower; it would have been easier to follow, especially in real time. I had to toggle the playback speed a few times.
The lecturer talks very fast, LOL, I found it hard to understand.
Given my weak English, I may need to rewatch the lecture at a slower playback speed 😅