Higher Order Company - Origins of the HVM

  • Published: 5 Sep 2024

Comments • 15

  • @youssef.elmoumen 3 months ago +38

    Who came from Fireship?

  • @johnkost2514 3 months ago +4

    This is going to be BIG. This is no small advance in runtime research.
    Bravo!!!

  • @VictorTaelin 3 months ago +19

    If you're here because of Fireship, be warned: my English pronunciation is terrible, and I sound like a potato trying to talk on it :( Also, I made many simplifications here, so as not to over-complicate the explanation for my colleagues (including not explaining the difference between inets and icombs, not going deep into the literature on optimal reduction, etc.). If you're looking for a proper overview of ICs, this is the video you're looking for: ruclips.net/video/sDPuQ-UjhVQ/видео.html - it is in Portuguese, but the English subs are good, and I made it as pedagogical as I could. It is worth watching (unlike this one haha)

    • @drapala97 3 months ago +2

      Your English is impeccable. Great job!

  • @DennisKorolevych 3 months ago +3

    I don't understand much of it yet, but I was so fascinated by Bend that I had to try it out. The performance improvements in parallel computing are massive, but so far I have only scratched the surface.

  • @lrdass 3 months ago +3

    Can you imagine that Emacs will finally be multithreaded?
    100% sure that was Taelin's goal

  • @joaopedro-wv8kp 8 months ago +3

    Wow, I didn't know this video existed. Great video!

  • @afmikasenpai 3 months ago +1

    Good stuff!

  • @Dr.Wiley9000 3 months ago

    This is gnarly... It's like a real-life Silicon Valley episode. Instead of finding the "Jerk Ratio", he used old hieroglyphic scriptures from the past.

  • @radumihaifilipescu2615 3 months ago +1

    brilliant

  • @britannio 4 months ago

    Really inspiring, thanks!

  • @EdnaMiner-w1c 2 days ago

    Russell Gardens

  • @schalkdormehl3057 3 months ago

    LOVE THIS!

  • @HamiltonBess-h1w 3 days ago

    Pfannerstill Burg

  • @jonabirdd 3 months ago

    Sorry, but I think you'd have better luck scaling unconventional neural networks trained using backprop (especially sparse neural networks) with CUDA.
    Scale is not the only piece of the puzzle. It's scale plus learning algorithms that scale with compute, and the only one known to do so is deep learning.