Making the best NLU with Rasa and BERT, Rasa Developer Summit 2019

  • Published: 23 Jul 2024
  • Mady Mantha, AI Platform Leader at Sirius Computer Solutions, shares how to build highly performant NLP by integrating BERT with a custom NLU pipeline.
    Bidirectional Encoder Representations from Transformers (BERT) is an NLP pre-training technique released by Google. BERT's key innovation is pre-training deep bidirectional, contextual language representations on a large text corpus. The pre-trained model can then be fine-tuned for downstream NLP tasks such as Natural Language Understanding (NLU) and question answering. Named Entity Recognition (NER) is a subtask of NLU that identifies and classifies entities in a given text into pre-defined categories such as names, places, organizations, currencies, and quantities. An NER model can be trained using BERT. Integrating BERT NER with Rasa through a custom pipeline component produced highly performant NLP and more engaging conversations between humans and Rasa agents (a minimal sketch of such a component follows this list).
  • Science
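
Several commenters below ask for the GitHub repo mentioned in the talk; it is not linked here. As a stand-in, the following is a minimal sketch of what a custom BERT NER component might look like, written against the Rasa 1.x Component API (current at the time of the talk) together with a Hugging Face transformers token-classification pipeline. The class name BertNER, the model "dslim/bert-base-NER", and the entity mapping are illustrative assumptions, not the speaker's actual code.

# A hypothetical custom Rasa NLU component that adds entities predicted
# by a pre-trained BERT NER model. Written against the Rasa 1.x
# Component API (current at the time of this talk); the model name
# "dslim/bert-base-NER" and the entity mapping are illustrative.

from typing import Any, Dict, List, Optional, Text

from rasa.nlu.components import Component
from rasa.nlu.training_data import Message
from transformers import pipeline


class BertNER(Component):
    """Appends BERT NER predictions to the entities found by the pipeline."""

    name = "bert_ner"
    provides = ["entities"]
    defaults = {"model_name": "dslim/bert-base-NER"}  # assumed model

    def __init__(self, component_config: Optional[Dict[Text, Any]] = None) -> None:
        super().__init__(component_config)
        # "simple" aggregation merges BERT word pieces back into
        # whole entity spans (e.g. "San" + "##ta" -> "Santa").
        self._ner = pipeline(
            "ner",
            model=self.component_config["model_name"],
            aggregation_strategy="simple",
        )

    def process(self, message: Message, **kwargs: Any) -> None:
        # Run NER on the user utterance and convert each prediction
        # into Rasa's entity dictionary format.
        entities: List[Dict[Text, Any]] = []
        for pred in self._ner(message.text):
            entities.append({
                "entity": pred["entity_group"],  # e.g. PER, ORG, LOC
                "value": pred["word"],
                "start": pred["start"],
                "end": pred["end"],
                "confidence": float(pred["score"]),
                "extractor": self.name,
            })
        # Append to any entities found by earlier pipeline components.
        message.set(
            "entities",
            message.get("entities", []) + entities,
            add_to_output=True,
        )

In Rasa 1.x, such a component would be registered by its module path (e.g. your_module.BertNER) in the pipeline: section of config.yml, alongside the stock tokenizer and classifier components.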

Comments • 8

  • @ogabrielluiz
    @ogabrielluiz 4 years ago +15

    Very interesting talk!
    The speaker mentioned a GitHub repo. Can you provide it?
    Thanks :)

  • @AaronWacker
    @AaronWacker 4 years ago

    Awesome talk - thanks. Very excited about further use of BERT and DIET.

  • @maheswaran6628
    @maheswaran6628 3 years ago +1

    Can anyone share the GitHub repo mentioned in the talk?

  • @varungondu7053
    @varungondu7053 4 years ago +3

    GitHub repo?

  • @d3v487
    @d3v487 1 year ago

    Can you provide the code? I'm curious about custom components for loading your own language model into the pipeline.

  • @aqibfayyaz1619
    @aqibfayyaz1619 3 years ago +1

    GitHub repo?

  • @shushantpudasaini2306
    @shushantpudasaini2306 3 years ago +1

    GitHub?