Training and deploying open-source large language models

  • Published: 27 Oct 2024

Comments • 15

  • @aayushgarg5437
    @aayushgarg5437 9 months ago +1

    Thank you, Niels, for this short video on training and deploying LLMs. Really enjoyed it. Keep making such videos. :)

  • @thenewexeptor
    @thenewexeptor 9 months ago

    Awesome video. Simple and insightful.

  • @RezaZaheri-ow3qm
    @RezaZaheri-ow3qm 5 months ago

    Fantastic breakdown, thank you, Niels

  • @onangarodney7746
    @onangarodney7746 9 months ago

    Thanks for the video. I learned so much

  • @michelguenard8391
    @michelguenard8391 9 months ago +1

    Thanks for this comprehensive work.
    I was eager to get all these bricks consolidated as you did.
    I am not a developer, but I am now certain to have a plan for my own small project; at least
    to prove it can be useful to people, enhancing their knowledge with pleasure!
    Thanks again.
    Best wishes for 2024
    Michel from France

  • @RajaSekharaReddyKaluri
    @RajaSekharaReddyKaluri 9 months ago

    All I can say is Thank you!
    If we ever meet somehow, I would be more than happy to give you a treat.

  • @giviz
    @giviz 9 months ago

    That was a really nice talk, thank you!

  • @Yocoda24
    @Yocoda24 9 months ago

    Insightful, and very straightforward! Awesome video

  • @Hypersniper05
    @Hypersniper05 9 months ago

    What a great breakdown of exactly what we saw in 2023

  • @codingthefunway9852
    @codingthefunway9852 9 months ago

    Thank you very much for this video. You've sincerely been so helpful to me.

  • @thegrumpydeveloper
    @thegrumpydeveloper 9 months ago

    Interesting that we’re rapidly following the history of mainframe-based computing down to local and mobile-based computing. We’ll always have some form of API-based LLM, but running something like a Mistral 7B on mobile, or perhaps a Mixtral and beyond, may become commonplace in just a few years’ time.

  • @DailyProg
    @DailyProg 9 months ago

    This was a great one

  • @giofou711
    @giofou711 9 months ago

    4:33, 6:05 How to evaluate the output of LLM models: Hugging Face's Open LLM Leaderboard or LMSYS's Chatbot Arena

  • @akshatkant1423
    @akshatkant1423 6 months ago

    If we train an LLM with our data and deploy it on our own server, so everything is ours, will there still be token limits, like response output token limits? I want my model to generate around 25k output tokens. Is that possible if it's deployed only on our server, without using any big organisation's LLM API?
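    To make the question above concrete: a self-hosted model has no provider-imposed token quota, so the only hard limit is the model's context window (prompt plus output must fit within it). A minimal sketch of that arithmetic, with assumed numbers for illustration (the 32k window matches, e.g., Mistral 7B Instruct v0.2; the prompt length is hypothetical):

    ```python
    # Sketch: a self-hosted model has no billing-style token quota.
    # The hard constraint is that prompt tokens + generated tokens
    # must fit inside the model's context window.
    context_window = 32_768   # e.g. a 32k-context model (assumption)
    prompt_tokens = 1_500     # hypothetical prompt length
    desired_output = 25_000   # the ~25k output tokens asked about

    def fits(prompt: int, output: int, window: int) -> bool:
        """True if a generation of `output` tokens fits after `prompt` tokens."""
        return prompt + output <= window

    print(fits(prompt_tokens, desired_output, context_window))  # True
    ```

    In a serving stack (e.g. Hugging Face `transformers`' `generate` with `max_new_tokens`), that cap is a parameter you choose yourself rather than a limit imposed by an API provider.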

  • @mohsenghafari7652
    @mohsenghafari7652 7 months ago

    Hi, please help me: how do I create a custom model from many PDFs in the Persian language? Thank you.