Contextual word embeddings in spaCy

  • Published: 10 Jan 2025

Comments • 12

  • @python-programming
    @python-programming 2 years ago +2

    I was looking for a good video to share with a colleague to explain this concept in spaCy. This was fantastic, as always! Thanks so much.

  • @SP-db6sh
    @SP-db6sh 2 years ago +2

    Best tutorial on such a complex topic.

  • @Julia-ej4jz
    @Julia-ej4jz 2 years ago

    Thank you for sharing this opportunity to learn!

  • @RaminH
    @RaminH 3 years ago +3

    Thank you for the clear and concise explanation. Great job!

  • @ajitkumar15
    @ajitkumar15 2 years ago

    Great Video !!!

  • @gareebmanus2387
    @gareebmanus2387 3 years ago +2

    Thanks for a very succinct description. However, I am familiar only with the W2V (static) representation, in which you store a lookup table of word–vector pairs. When we need to use the embeddings, we simply look up the word in the table and then plug the retrieved embedding in place of the word in the input to some neural net, etc. How does one store and use the contextual embeddings? Obviously the store-and-lookup paradigm won't work. Can you please explain... For example, if I wish to do NMT from English to Finnish... then?

    • @AppliedLanguageTechnology
      @AppliedLanguageTechnology 3 years ago

      Hi Gareeb! An excellent question - in this case, you would simply use BERT or some other model capable of learning contextual embeddings to extract representations for the entire text to be translated, and then use these representations to train some sequence-to-sequence model to translate from one language to another. In this video, we simply pick out embeddings for certain words to show that despite their similar form, their representations differ.
      In other words, with contextual word embeddings, you typically learn representations for entire sequences as opposed to individual words, as in word2vec.
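      To make the contrast concrete, here is a toy sketch of the idea in Python. The vectors below are made-up illustrative values, not real model output; with spaCy you would load a transformer pipeline (e.g. en_core_web_trf) and read per-token vectors from the processed Doc instead.

      ```python
      import numpy as np

      def cosine(a, b):
          """Cosine similarity between two vectors."""
          return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

      # Static (word2vec-style) lookup: one vector per word form,
      # regardless of context. Toy values for illustration only.
      static_table = {"bank": np.array([0.2, 0.7, 0.1])}

      # "bank" retrieves the identical vector in both sentences:
      v_river = static_table["bank"]   # "...sat on the river bank..."
      v_money = static_table["bank"]   # "...opened a bank account..."
      assert cosine(v_river, v_money) > 0.999  # always identical

      # Contextual embeddings: the encoder sees the whole sentence,
      # so the same word form receives different vectors (toy values
      # standing in for what a BERT-style model would produce).
      ctx_river_bank = np.array([0.9, 0.1, 0.0])
      ctx_money_bank = np.array([0.1, 0.2, 0.9])
      print(cosine(ctx_river_bank, ctx_money_bank))  # well below 1.0
      ```

      The point of the sketch: a static table cannot distinguish the two senses of "bank", while a contextual encoder assigns them clearly different vectors, which is why contextual representations are recomputed per input sequence rather than stored in a lookup table.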

    • @gareebmanus2387
      @gareebmanus2387 3 years ago

      @@AppliedLanguageTechnology Thank you very much for your clarification. Also, thanks again for sharing your excellent course material.

  • @ruwang3132
    @ruwang3132 1 year ago

    It is a nice talk! But why does the code sometimes work and sometimes not?