HuggingFace Fundamentals with LLMs such as TinyLlama and Mistral 7B

  • Published: 4 Feb 2024
  • Chris looks under the hood of Hugging Face models such as TinyLlama and Mistral 7B. In the video, Chris presents a high-level reference model of large language models and uses it to show how tokenization and the AutoTokenizer module from the Hugging Face transformers library work, linking them back to the Hugging Face repository. He also examines the tokenizer config and shows that Mistral and Llama-2 use the same tokenizer and embedding architecture (albeit with different vocabularies). Finally, Chris shows how to inspect the model configuration and model architecture of Hugging Face models.
    As we start to build towards our own large language model, understanding these fundamentals is critical, whether you are a builder or a consumer of AI.
    Google Colab:
    colab.research.google.com/dri...
  • Science
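The tokenizer and model-configuration inspection the video walks through can be sketched in a few lines. This is a minimal sketch, assuming the transformers library is installed; the public TinyLlama chat checkpoint is used here as a stand-in, and the exact repo id used in the video may differ:

```python
from transformers import AutoConfig, AutoTokenizer

# Public checkpoint on the Hugging Face Hub (assumed; any Hub model id works here).
MODEL = "TinyLlama/TinyLlama-1.1B-Chat-v1.0"

# Tokenization: text -> sub-word pieces -> ids -> text.
tok = AutoTokenizer.from_pretrained(MODEL)
pieces = tok.tokenize("Ada Lovelace")   # SentencePiece pieces; '▁' marks a word start
ids = tok.encode("Ada Lovelace")        # token ids, with the BOS token prepended
print(pieces, ids)
print(tok.decode(ids, skip_special_tokens=True))  # round-trips to "Ada Lovelace"

# Model configuration: the config.json in the Hub repo describes the architecture.
cfg = AutoConfig.from_pretrained(MODEL)
print(cfg.model_type)                   # architecture family ("llama" for TinyLlama)
print(cfg.vocab_size)                   # embedding / vocabulary size
print(cfg.num_hidden_layers, cfg.hidden_size)
```

Loading the same two objects for `mistralai`'s Mistral 7B repo and diffing the printed fields is an easy way to see the shared architecture (and the differing vocabularies) the video describes.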

Comments • 45

  • @ukaszrozewicz7488
    @ukaszrozewicz7488 3 months ago +8

    The best video I've watched on YouTube about LLMs so far. You explain complex topics in an accessible language, clearly and understandably. You are doing a very good job. I'm eagerly waiting for the next videos :)

    • @TheMariukz
      @TheMariukz 3 months ago +3

      same here

    • @chrishayuk
      @chrishayuk  3 months ago +2

      Wow, thanks! This one actually took a long time to get right, glad you liked it

  • @mindurownbussines
    @mindurownbussines 1 month ago +2

    Thank you so much Chris
    I truly believe that if one has a great understanding of a subject, he can teach it clearly, and you simply did that!
    God bless you

    • @chrishayuk
      @chrishayuk  1 month ago

      You are too kind, and thank you. Glad it was useful

  • @kennethmichaelreda
    @kennethmichaelreda 3 months ago +3

    Insanely valuable video. Thank you!

    • @chrishayuk
      @chrishayuk  3 months ago +1

      Glad it’s useful

  • @janstrunk
    @janstrunk 3 months ago +3

    Great video! Looking forward to your next videos…

    • @chrishayuk
      @chrishayuk  3 months ago +2

      Yeah, next ones in series will be fun, glad you’re enjoying it

  • @keithrule4240
    @keithrule4240 3 months ago +2

    Great video. Just the right amount of detail. Thanks.

    • @chrishayuk
      @chrishayuk  2 months ago

      Glad it was helpful!

  • @wadejohnson4542
    @wadejohnson4542 2 months ago

    For the very first time, I finally get it, thanks to you. Thank you for your service to the community.

  • @BipinRimal314
    @BipinRimal314 3 months ago +2

    Looking really forward to the next video.

  • @narenkrishnanGenius
    @narenkrishnanGenius 3 months ago +3

    very well explained and useful

    • @chrishayuk
      @chrishayuk  3 months ago +2

      So glad to hear that, thank you

  • @Jaypatel512
    @Jaypatel512 2 months ago +1

    Amazing way to get people comfortable with the model architecture. Thank you so much for sharing your knowledge.

  • @kenchang3456
    @kenchang3456 3 months ago +2

    Excellent explanation. Although I don't have a use case to fine-tune a model currently, I presume I will eventually, so it'll be great to have what you've shared in my back pocket. Thanks a bunch.

    • @chrishayuk
      @chrishayuk  2 months ago

      Awesome, glad it was useful

  • @atifsaeedkhan9207
    @atifsaeedkhan9207 2 months ago +1

    Thanks for being so detailed. That was really a refresher for me. Glad someone like you is doing such good work.

    • @chrishayuk
      @chrishayuk  2 months ago

      thank you, very much appreciate that

  • @nguyenhuuanhtuan5360
    @nguyenhuuanhtuan5360 3 months ago +3

    Always awesome content ❤

    • @chrishayuk
      @chrishayuk  3 months ago +2

      Super glad it’s useful, thank you

  • @BiranchiNarayanNayak
    @BiranchiNarayanNayak 2 months ago +1

    Excellent tutorial to get started with LLMs.

  • @prashantkowshik5637
    @prashantkowshik5637 25 days ago

    Thanks a lot Chris.

  • @ilyanemihin6029
    @ilyanemihin6029 2 months ago +1

    Thank you! This video brings light into the black box of LLM magic)

    • @chrishayuk
      @chrishayuk  2 months ago +1

      more to come, the next set of videos reveal a bunch more

  • @javaneze
    @javaneze 2 months ago +1

    great video - many thanks!

  • @MannyBernabe
    @MannyBernabe 3 months ago +1

    great video. Thx!

  • @AncientSlugThrower
    @AncientSlugThrower 3 months ago +1

    Great video.

  • @john6268
    @john6268 3 months ago +2

    How does the tokenizer decode sub-word embeddings? Specifically, how do you determine which sequence is concatenated into a word vs. standing on its own? As shown, the answer would be decoded with spaces between the embeddings, which wouldn't make "Lovelace" into a word.

    • @chrishayuk
      @chrishayuk  3 months ago +2

      Certain tokens will have spaces and others won't, so _lace would be a different token from lace. I have a deep dive on the tiktoken tokenizer where I spend a lot of time on this. I'm planning to do a "building a tokenizer" video soon as part of this series

    • @chrishayuk
      @chrishayuk  3 months ago

      how the tokenizer for gpt-4 (tiktoken) works and why it can't reverse strings
      ruclips.net/video/NMoHHSWf1Mo/видео.html

    • @john6268
      @john6268 3 months ago

      @@chrishayuk Thanks, I'll check out the other video and looking forward to the next one.
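The space-handling described in the reply above can be illustrated with a toy decoder. This is a hand-rolled sketch of how SentencePiece-style tokenizers (as used by Llama-2 and Mistral) mark word starts with a "▁" symbol, so that "Love" followed by a marker-less "lace" concatenates into one word; the vocabulary here is hypothetical, not taken from any real model:

```python
# Toy illustration of SentencePiece-style word-boundary markers ("▁").
# A piece starting with "▁" begins a new word; a piece without it
# continues the previous word, so no space is inserted between them.
def decode(pieces):
    # Concatenate all pieces, then turn each "▁" marker back into a space.
    text = "".join(pieces)
    return text.replace("▁", " ").strip()

# "lace" (continuation) vs "▁lace" (word start) are distinct tokens:
print(decode(["▁Ada", "▁Love", "lace"]))   # -> "Ada Lovelace"
print(decode(["▁Ada", "▁Love", "▁lace"]))  # -> "Ada Love lace"
```

Because the marker is part of the token itself, the decoder never has to guess where spaces go: the token identity alone determines whether pieces merge into a single word.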

  • @tec-earning8672
    @tec-earning8672 3 months ago +1

    Great job, sir. One video request: how to build Llama APIs? I want to use my own trained model on my website.

    • @chrishayuk
      @chrishayuk  3 months ago +1

      That’s where we’re working up to, but you can check out my existing fine-tuning Llama-2 video

  • @huiwencheng4585
    @huiwencheng4585 3 months ago +2

    Bro, just turn on the Big Thanks so I can donate to you

    • @chrishayuk
      @chrishayuk  2 months ago

      lol, not gonna happen but appreciate the gesture and glad you like the videos