13. Machine Learning for Mammography

  • Published: 12 Sep 2024

Comments • 3

  • @nguyenngocly1484 3 years ago +1

    The variance equation for linear combinations of random variables applies to the dot product, together with say cosine angle that explains a lot about neural nets. Also the CLT.
    ReLU: Connect is f(x)=x. Disconnect is f(x)=0. A switch.
    A ReLU net is a switched system of dot products that collapses to a particular simple matrix for a particular input.
    Inside-out nets: fixed dot products and adjustable, parametric activation functions. Fast transforms (FFT, WHT) are a source of fixed dot products. Sign flipping as a simple random projection before the net makes good use of the first layer, since you don't want the spectrum of the input. A final fast transform acts as a pseudo-read-out layer.
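The "switched system of dot products" claim above can be checked numerically: for a fixed input, the ReLU on/off pattern is a fixed diagonal 0/1 mask, so the whole net reduces to one matrix. A minimal NumPy sketch (all weights and shapes here are illustrative, not from the lecture):

```python
import numpy as np

rng = np.random.default_rng(0)
W1 = rng.standard_normal((8, 4))   # first layer weights (illustrative sizes)
W2 = rng.standard_normal((3, 8))   # second layer weights

x = rng.standard_normal(4)

# Ordinary forward pass through a ReLU layer
h = np.maximum(W1 @ x, 0.0)
y = W2 @ h

# The ReLU "switch" pattern for this particular input:
# 1 where the unit is connected (f(x)=x), 0 where disconnected (f(x)=0)
s = (W1 @ x > 0).astype(float)

# With the switches frozen, the net collapses to a single matrix M
M = W2 @ (np.diag(s) @ W1)
assert np.allclose(M @ x, y)   # same output as the nonlinear forward pass
```

The matrix `M` is only valid for inputs that produce the same switch pattern; a different input region gets a different collapsed matrix, which is the sense in which a ReLU net is piecewise linear.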
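The fast-transform-plus-sign-flip idea can also be sketched: a random ±1 diagonal followed by a normalized Walsh–Hadamard transform (WHT) is a cheap orthogonal random projection, so it spreads the input's energy across all coordinates while preserving lengths. A small sketch under these assumptions (the `wht` helper is a textbook in-place butterfly, not code from the talk):

```python
import numpy as np

def wht(x):
    """Fast Walsh-Hadamard transform; len(x) must be a power of 2."""
    x = x.astype(float).copy()
    n = len(x)
    h = 1
    while h < n:
        for i in range(0, n, 2 * h):
            for j in range(i, i + h):
                a, b = x[j], x[j + h]
                x[j], x[j + h] = a + b, a - b   # butterfly step
        h *= 2
    return x

rng = np.random.default_rng(1)
n = 8
signs = rng.choice([-1.0, 1.0], size=n)   # random sign flip (diagonal +/-1)

x = rng.standard_normal(n)
y = wht(signs * x) / np.sqrt(n)           # normalized WHT of the sign-flipped input

# Both the sign flip and the normalized WHT are orthogonal, so norms are preserved
assert np.isclose(np.linalg.norm(y), np.linalg.norm(x))
```

The cost is O(n log n) with no stored weight matrix, which is why fast transforms are attractive as fixed dot-product layers.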

  • @benettantal1728 2 years ago +1

    Very informative talk! I just wish the instructor had spoken a little slower; it would have been easier to follow, especially in real time. I had to toggle the playback speed a few times.

  • @murdhialharbi1270 1 year ago +2

    The lecturer speaks very fast, LOL, I found it hard to understand.
    Given my weak English, I may need to watch the lecture at a slower playback setting 😅