A Visual Guide to Mixture of Experts (MoE) in LLMs

  • Published: 30 Jan 2025

Comments •

  • @bidyapatip · 2 days ago

    Very well explained... it answers every why and how in the MoE architecture.

  • @ScottzPlaylists · 16 days ago +3

    👍 I really like your style of teaching / presentation. I subscribed❗ I'll look through all your videos and your book. ❗

  • @marloncajamarca2793 · 2 months ago +1

    Best explanation of MoEs I have come across so far! The high-quality explanation of key concepts, the production, and the visuals are superb. Keep up this amazing work, and thanks for sharing it for free, Maarten.

  • @vardhan254 · 6 days ago

    So much effort went into this. Thanks Maarten, I hope it gets the views it deserves.

  • @ringpolitiet · 2 months ago +2

    Very high production value! Very useful, thanks.

  • @YuxinWu-m1i · 22 days ago +1

    You explain this so well! Thank you!

  • @jacehua7334 · 2 months ago +2

    Always very useful, thank you Maarten!

  • @ShivangiTomar-p7j · 2 months ago

    AWESOME!!! THANKS A LOT

  • @mohajeramir · 14 days ago

    Subscribed!

  • @tantzer6113 · 11 days ago +1

    I am not clear on how the different experts end up with different specializations.

  • @datahouse24 · 19 days ago

    Thank you.

  • @Mohamed_Shokry · 2 months ago

    Thanks for the video! I can tell you put a lot of work into it.

  • @ekramhossain4601 · 2 months ago

    Can you also make this kind of video to explain the Transformer as well?
    Thanks :)

  • @Deshwal.mahesh · 2 months ago

    Can we have a fine-tuning or building-it-from-scratch video too? 😶