LLM Chronicles: #5.2: Making LLMs from Transformers Part 1: BERT, Encoder-based

  • Published: 19 Nov 2024

Comments • 20

  • @sameersayyad6170
    @sameersayyad6170 8 months ago +3

    Great to see you back, sir!

    • @donatocapitella
      @donatocapitella  8 months ago +1

      Thanks! I'll soon be publishing a lab related to this!

  • @MikePemberton
    @MikePemberton 8 months ago +3

    This entire video series is completely brilliant, well done Donato 🙂 Keep 'em coming!

  • @micbab-vg2mu
    @micbab-vg2mu 8 months ago +1

    Donato - great video :)
    For better understanding (I am not a technical guy) I used the following prompt:
    '''
    As an experienced educator specializing in simplifying complex technology concepts for beginners, your task is to explain the core ideas from the video [transcription] "Making LLMs from Transformers Part 1: BERT, Encoder-based" in a way that is easy to understand. Imagine that the person you are explaining to has no background in information technology or artificial intelligence and is eager to grasp the technology. Your explanation should focus on the fundamental concepts of BERT, Transformers, and how Large Language Models (LLMs) are constructed from them. Avoid using jargon or technical terms that require prior knowledge, and instead use analogies and simple examples to illustrate these concepts clearly.
    Please provide your explanation in a concise and comprehensible format, using plain language and avoiding unnecessary complexity. Aim for a friendly and approachable tone, keeping in mind that the audience has no prior technical knowledge.
    [transcription]=
    '''

    • @donatocapitella
      @donatocapitella  8 months ago

      That's a great idea... I'm curious to see the result!

    • @micbab-vg2mu
      @micbab-vg2mu 8 months ago +2

      @@donatocapitella I like the output from Opus:
      "'
      Imagine you have a special machine called a Transformer. This machine is really good at understanding and translating languages. It has two main parts:
      1) The Encoder: This part reads a sentence in one language, like English, and tries to understand the meaning and how the words relate to each other. It creates a "summary" of the sentence.
      2) The Decoder: This part takes the "summary" from the Encoder and uses it to write the sentence in another language, like French. It does this word by word.
      Now, some clever people realized that you can use just the Encoder part of the Transformer to create something called BERT. BERT is like a student that learns about language by doing special exercises:
      - In one exercise, some words in a sentence are hidden, and BERT has to guess what those hidden words are. This helps BERT understand how words fit together.
      - In another exercise, BERT is given two sentences and has to guess if the second sentence would naturally come after the first one in a story or conversation. This helps BERT understand how sentences relate to each other.
      By doing lots of these exercises with many, many sentences, BERT becomes really good at understanding language, almost like it has read a whole library of books!
      Once BERT has learned a lot about language in general, it can be taught to do specific tasks more easily, like:
      - Figuring out if a movie review is positive or negative (sentiment analysis)
      - Answering questions about a piece of text (question answering)
      - Identifying names of people, places, or things in a sentence (named entity recognition)
      To teach BERT these specific tasks, we just need to show it a few examples and fine-tune it a bit. It's like teaching a sports star a new game - they already have the basic skills, so they can learn the new rules quickly.
      BERT and other models like it are called Large Language Models (LLMs) because they are trained on huge amounts of text data and become very knowledgeable about language. They form an important building block for creating AI systems that can understand and communicate using human language.
      '''
      Tell me if the output is correct.

    • @donatocapitella
      @donatocapitella  8 months ago +1

      @@micbab-vg2mu Opus did a really good job, I like it! I wish I had used some of those sentences in my script!
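
For anyone who wants to reproduce @micbab-vg2mu's workflow programmatically, here is a minimal sketch of sending the prompt quoted above to an LLM API. This is purely illustrative: the commenter used Claude (Opus) through its own interface, and the OpenAI client, model name, and file names below are assumptions.

'''
# A sketch (not from the thread) of running the explainer prompt above
# through an LLM API. The OpenAI client and "gpt-4o" model name are
# assumptions; the commenter actually used Claude (Opus).
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# The prompt text is the one quoted in the comment above; it ends with
# "[transcription]=", to which the video transcript is appended.
prompt = open("explainer_prompt.txt").read()      # hypothetical file
transcript = open("video_transcript.txt").read()  # hypothetical file

response = client.chat.completions.create(
    model="gpt-4o",  # placeholder model name
    messages=[{"role": "user", "content": prompt + transcript}],
)
print(response.choices[0].message.content)
'''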
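
The Opus explanation above can also be tried hands-on. Here is a minimal sketch of the masked-word exercise and of one fine-tuned task (sentiment analysis), assuming the Hugging Face transformers library and two publicly available checkpoints; the video itself does not use this code.

'''
# Minimal sketch of the two ideas in the explanation above, assuming the
# Hugging Face `transformers` library (pip install transformers torch).
from transformers import pipeline

# Exercise 1: masked-word guessing (masked language modelling).
# BERT predicts the hidden [MASK] token from the surrounding words.
fill_mask = pipeline("fill-mask", model="bert-base-uncased")
for guess in fill_mask("The capital of France is [MASK]."):
    print(guess["token_str"], round(guess["score"], 3))

# A fine-tuned task: sentiment analysis on a movie review, using a
# BERT-family checkpoint already fine-tuned for this purpose.
sentiment = pipeline(
    "sentiment-analysis",
    model="distilbert-base-uncased-finetuned-sst-2-english",
)
print(sentiment("This entire video series is completely brilliant!"))
'''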

  • @KhapitarBalakAagya
    @KhapitarBalakAagya 5 months ago +1

    Excellent explanation.
    I also feel there is a lot of information in every video; maybe you could consider adding a 30-second to 2-minute summary of the complete video at the end. It would help learners like me a lot.
    Or maybe a recap in the next video.

    • @donatocapitella
      @donatocapitella  5 months ago +1

      Thank you for the feedback! I'll definitely include summaries at the end of future videos.

  • @user-wr4yl7tx3w
    @user-wr4yl7tx3w 8 months ago +2

    This is really high production value. Thanks!

  • @arunavadasgupta2147
    @arunavadasgupta2147 6 months ago +1

    The circuit for encoding and decoding is correct.

  • @fiseticamente
    @fiseticamente 8 months ago

    Another incredibly high-quality lesson, grazie mille!

  • @MUHAMMADIFTIKHAR-x6j
    @MUHAMMADIFTIKHAR-x6j 8 months ago +1

    Thank you!

  • @noomondai
    @noomondai 8 months ago

    Thank you, Don

  • @suryatejasama9213
    @suryatejasama9213 8 months ago +1

    Great stuff, sir. Could you please share the PPT used in the video?

    • @donatocapitella
      @donatocapitella  8 months ago +1

      There's no PPTX; these slides are not made with PowerPoint. But I have the canvas as a PDF; you can find it in the comments or download it from llm-chronicles.com.

    • @suryatejasama9213
      @suryatejasama9213 8 months ago

      @@donatocapitella Thanks for sharing ❤️✨

  • @matti7529
    @matti7529 8 months ago

    Good stuff! Thanks.

  • @vishnuoad2190
    @vishnuoad2190 6 months ago

    4:42