NLP for Developers: Word Embeddings | Rasa

  • Published: Jan 29, 2025

Comments •

  • @vivian_who · 3 years ago +6

    Clear and concise explanation! You have a great aptitude for teaching and I am so happy I came across your channel!

  • @arsalan2780 · 4 years ago +8

    Please increase the volume... the content is great, to the point, and thorough.
    It cleared up my concepts.

  • @ritamsarkar7753 · 2 years ago +1

    I am using the highest volume to listen, as the tutorial is very useful.

  • @LeticiaMartinFuertes · 4 years ago +3

    This is great! I love that it is so direct :)

  • @samarthbhandari1360 · 4 years ago +5

    Great introduction. A good follow-up might be introduction to more advanced, contextual embeddings like BERT. Something I would love personally would be a comparison between BERT, XLNet, Universal Sentence Encoder, etc and the best model to pick based on the use case. For example, BERT would work well for predicting next sentence or missing words in text whereas USE would work better for semantic similarity. Just a suggestion!
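    The distinction drawn in this comment, that some models suit semantic similarity better than others, usually comes down to comparing sentence vectors. A minimal sketch of the idea, using made-up 2-dimensional word vectors and mean pooling (real models such as word2vec, GloVe, or the Universal Sentence Encoder use hundreds of dimensions and learned pooling; the `EMB` table and function names here are purely illustrative):

    ```python
    import math

    # Hypothetical word vectors, invented for illustration only.
    EMB = {
        "cat":   [1.0, 0.1],
        "dog":   [0.9, 0.2],
        "car":   [0.1, 1.0],
        "truck": [0.2, 0.9],
    }

    def sentence_vector(tokens):
        """Mean-pool word vectors into a single sentence vector."""
        dims = len(next(iter(EMB.values())))
        total = [0.0] * dims
        for t in tokens:
            for i, v in enumerate(EMB[t]):
                total[i] += v
        return [x / len(tokens) for x in total]

    def cosine(a, b):
        """Cosine similarity: dot product over the product of norms."""
        dot = sum(x * y for x, y in zip(a, b))
        na = math.sqrt(sum(x * x for x in a))
        nb = math.sqrt(sum(x * x for x in b))
        return dot / (na * nb)

    pets = sentence_vector(["cat", "dog"])
    vehicles = sentence_vector(["car", "truck"])
    print(cosine(pets, pets))       # identical vectors score ~1.0
    print(cosine(pets, vehicles))   # unrelated topics score lower
    ```

    Swapping the toy `EMB` table for vectors from a pretrained sentence encoder is exactly the "semantic similarity" use case the comment describes.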

  • @hameddadgour · 2 years ago

    Clear and concise explanation!

  • @mawiyah360 · 3 years ago +1

    Very nice video, and helpful for learning word embeddings.

  • @cleanpoop9929 · 4 years ago +1

    What an amazing explanation. Thanks

  • @HarishRaoS · 1 year ago

    ❤ the videos. Thanks

  • @GEB-Loop · 4 years ago

    Thank you; this is an excellent explanation!

  • @fernandomaximoferreira1067 · 4 years ago

    Amazing content, I have learned a lot.
    Thanks for the video.

  • @ahmedelsabagh6990 · 2 years ago

    Great intro.

  • @luis96xd · 4 years ago

    This was a great video! Thanks

  • @TheMehrdadIE · 4 years ago

    Perfect! Thanks for a clear explanation

  • @marcosandoval7260 · 4 years ago

    Great video, precise content! Thank you.

  • @AhmedGamal-xi3vj · 4 years ago

    Great! I love it so much. Can we get this presentation?

  • @TheCalvin765 · 4 years ago

    Helped out a lot. Thanks

  • @sahibsingh1563 · 4 years ago +1

    Awesome tutorial @rctatman

  • @its_me7363 · 4 years ago

    If it has so many disadvantages and errors, then why is it still used today (if it is used at all)?

    • @RasaHQ · 4 years ago +4

      The errors are ones you're likely to run into during implementation rather than flaws with the approach. Overall, the advantages (fast to train for new data, approximations of meanings) tend to outweigh the disadvantages for most applications.
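      A minimal sketch of one such implementation-time error: out-of-vocabulary (OOV) words. The embedding table, vectors, and `UNK` fallback below are invented for illustration; a naive dictionary lookup fails the first time a user types a word the training corpus never contained, which is an implementation pitfall rather than a flaw in the embedding approach itself:

      ```python
      # Hypothetical embedding table; real tables come from a trained model.
      EMB = {
          "hello": [0.2, 0.7],
          "world": [0.6, 0.1],
      }
      UNK = [0.0, 0.0]  # fallback vector for unseen words

      def embed(token):
          # A bare EMB[token] would raise KeyError on an OOV word;
          # mapping unknowns to a shared UNK vector avoids the crash.
          return EMB.get(token, UNK)

      print(embed("hello"))  # known word: its learned vector
      print(embed("helo"))   # typo / OOV: the UNK fallback
      ```

      Subword methods (e.g. fastText's character n-grams) address the same problem by composing vectors for unseen words instead of falling back to a single UNK.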

  • @wajidiqbal5633 · 3 years ago

    Madam, please help me with the following:
    the total number of unique words in T
    the total number of training examples in T
    the ratio of positive examples to negative examples in T
    the average length of document in T
    the max length of document in T

  • @vikx02 · 4 years ago

    Succinct and informative. @Rasa A bit nitpicky, but at 3:46 you probably meant homonyms and not homophones.

    • @bay-bicerdover · 1 year ago

      She did mean homophones; that was correct.

    • @harshathammegowda1615 · 1 month ago

      Term      | Meaning   | Spelling         | Pronunciation
      ----------|-----------|------------------|--------------
      Homonym   | Different | Same             | Same
      Homophone | Different | (No requirement) | Same

  • @yashsolanki069 · 4 years ago +1

    I've watched the full playlist in reverse order 😂

  • @JaiSreeRam466 · 4 years ago

    Suggestion: the audio quality is poor.

  • @bay-bicerdover · 1 year ago

    Could you please speak louder?