Mixtral - Mixture of Experts (MoE) Free LLM that Rivals ChatGPT (3.5) by Mistral | Overview & Demo

  • Published: 11 Sep 2024

Comments • 4

  • @venelin_valkov  9 months ago  +2

    Sign up for the AI Bootcamp (preview drops on Christmas): www.mlexpert.io/membership

  • @NiketBahety  9 months ago  +1

    I saw your videos on document classification with LayoutLMv3 and they were amazing. Could you please make a similar series on text labelling/form understanding with LayoutLMv3?

  • @jishnunair5629  9 months ago  +2

    How do you fine-tune these MoE models? Is there any strategy for this? I believe it would differ from fine-tuning a standard model. (See the sketch after the comments.)

  • @kolkoki  8 months ago  +1

    In my limited experience, it seems that Dolphin Mixtral is able to handle Chinese and Japanese. I'm not sure it's excellent at those, but the little I've thrown at it didn't look like nonsense, so there's that.
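
Regarding the fine-tuning question above: the video does not cover this, but a common approach is parameter-efficient fine-tuning, where LoRA adapters are attached to the attention projections while the MoE expert weights and the router stay frozen. The following is a minimal sketch assuming the Hugging Face transformers, peft, bitsandbytes, and datasets libraries; the dataset and hyperparameters are illustrative placeholders, not recommendations from the video, and training Mixtral-8x7B even in 4-bit requires substantial GPU memory.

# Minimal sketch: LoRA fine-tuning of Mixtral with 4-bit quantization (QLoRA-style).
# Assumptions: transformers + peft + bitsandbytes + datasets installed; dataset is a placeholder.
import torch
from datasets import load_dataset
from transformers import (AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig,
                          DataCollatorForLanguageModeling, Trainer, TrainingArguments)
from peft import LoraConfig, get_peft_model, prepare_model_for_kbit_training

model_id = "mistralai/Mixtral-8x7B-Instruct-v0.1"

# Load the base model in 4-bit to fit it on a single large GPU.
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,
)

tokenizer = AutoTokenizer.from_pretrained(model_id)
tokenizer.pad_token = tokenizer.eos_token

model = AutoModelForCausalLM.from_pretrained(
    model_id, quantization_config=bnb_config, device_map="auto"
)
model = prepare_model_for_kbit_training(model)

# LoRA adapters on the attention projections only; the MoE experts (w1/w2/w3)
# and the router (gate) stay frozen, so the expert routing of the base model is preserved.
lora_config = LoraConfig(
    r=16,
    lora_alpha=32,
    lora_dropout=0.05,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()

# Placeholder instruction dataset with a single "text" column.
dataset = load_dataset("timdettmers/openassistant-guanaco", split="train[:1000]")

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=512)

tokenized = dataset.map(tokenize, batched=True, remove_columns=dataset.column_names)

trainer = Trainer(
    model=model,
    args=TrainingArguments(
        output_dir="mixtral-lora",
        per_device_train_batch_size=1,
        gradient_accumulation_steps=8,
        learning_rate=2e-4,
        num_train_epochs=1,
        logging_steps=10,
        bf16=True,
    ),
    train_dataset=tokenized,
    # Causal-LM collator: pads batches and copies input_ids into labels.
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()

Whether to also adapt the expert or router weights is a design choice: including them increases trainable parameters and memory use and can destabilize routing, so many setups leave them frozen and adapt only the attention layers.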