BERT: Pre-training of deep bidirectional transformers for language understanding

  • Published: 25 Sep 2024
  • Topic: Background II (BERT: Pre-training of deep bidirectional transformers for language understanding)
    Sub-topics:
    [1] Devlin, J., Chang, M. W., Lee, K., & Toutanova, K. (2018). BERT: Pre-training of deep bidirectional transformers for
    language understanding. arXiv preprint arXiv:1810.04805.
    Korea University Smart Production Systems Lab. (sps.korea.ac.kr)
