Training State-of-the-Art Text Embedding & Neural Search Models

  • Published: 24 Jan 2025

Comments • 14

  • @EduardoFloresV • 2 years ago +1

    Great presentation. Very useful; thanks for putting in the effort and sharing it!

  • @枕头不软睡不香 • 2 years ago

    Really great presentation!

  • @boscojay1381 • 2 years ago

    16:00, I have heard you talk about the multiple negatives ranking loss quite often; is it the same as the SupCon loss or the contrastive loss? I also noticed that in your SimCSE implementation with the sentence-transformers library you again use the MNR loss. Is it a special case of contrastive loss? I appreciate your response to this question. Thanks
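
In brief, the multiple negatives ranking loss is the in-batch softmax objective (InfoNCE / NT-Xent), so it can be viewed as a batched relative of the contrastive loss: each anchor's positive is ranked against all other positives in the batch. Below is a minimal PyTorch sketch of the computation; the function name is illustrative, and the scale of 20 follows the sentence-transformers default.

```python
import torch
import torch.nn.functional as F

def mnr_loss(anchors, positives, scale=20.0):
    """In-batch softmax loss over (anchor, positive) pairs.

    anchors, positives: (batch, dim) embeddings, where positives[i]
    belongs to anchors[i] and every other positive in the batch
    serves as a negative for anchor i.
    """
    a = F.normalize(anchors, dim=-1)
    p = F.normalize(positives, dim=-1)
    scores = a @ p.T * scale  # (batch, batch) scaled cosine similarities
    labels = torch.arange(scores.size(0), device=scores.device)
    # Cross-entropy per row: raise scores[i, i], lower scores[i, j != i].
    return F.cross_entropy(scores, labels)
```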

  • @nomanshahid9771 • 11 months ago

    How do we use triplets with the multiple negatives ranking loss? Doesn't this loss only take positive pairs?
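
The loss does accept triplets: in the sentence-transformers implementation, a third text in each example is treated as a hard negative and is scored alongside the in-batch negatives. A minimal training sketch using the classic InputExample / model.fit API; the model name and example texts are illustrative.

```python
from torch.utils.data import DataLoader
from sentence_transformers import SentenceTransformer, InputExample, losses

model = SentenceTransformer("all-MiniLM-L6-v2")  # illustrative model choice

# (anchor, positive, hard negative): the third text joins the candidate
# pool, so each anchor is ranked against its positive, the other in-batch
# positives, and all hard negatives in the batch.
train_examples = [
    InputExample(texts=["how do I bake bread?",
                        "a simple recipe for baking bread",
                        "how do I fix a flat bicycle tire?"]),
    InputExample(texts=["what is the capital of France?",
                        "Paris is the capital of France.",
                        "Madrid is the capital of Spain."]),
]
loader = DataLoader(train_examples, shuffle=True, batch_size=2)
loss = losses.MultipleNegativesRankingLoss(model)
model.fit(train_objectives=[(loader, loss)], epochs=1)
```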

  • @ldrn-b4b • 2 years ago

    Thanks! Great content

  • @francescocariaggi1145 • 3 years ago

    Very interesting talk!

  • @zurechtweiser • 2 years ago

    Some pre-trained models are optimized for cosine similarity, others for the dot product. How does using the respective other one affect the results?
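
The mismatch matters because the dot product is sensitive to vector norms while cosine similarity is not, so the two metrics can rank the same candidates differently. A small comparison sketch follows; the model name is an illustrative cosine-tuned choice, and util.cos_sim / util.dot_score are the sentence-transformers scoring helpers.

```python
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("multi-qa-MiniLM-L6-cos-v1")  # tuned for cosine similarity

query = model.encode("How are bi-encoders trained?", convert_to_tensor=True)
docs = model.encode(
    ["Bi-encoders are trained with contrastive objectives.",
     "The weather was pleasant yesterday."],
    convert_to_tensor=True,
)

# Cosine similarity ignores embedding length; the dot product rewards longer
# vectors, so a model tuned for one metric can rank differently under the other.
print(util.cos_sim(query, docs))    # the metric this model was trained for
print(util.dot_score(query, docs))  # the mismatched metric
```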

  • @tianshuwang7543 • 3 years ago

    Hello, thank you for the awesome talk. I have a question: why might the contrastive / triplet loss only optimize the local structure? Don't these losses increase the distance between negative cases, which usually include random pairs in a batch?

    • @NilsReimersTalks • 3 years ago +1

      They increase the distance only for the pairs you provide. If you provide poor pairs, they might only optimize some local structures.

    • @tianshuwang7543 • 3 years ago

      @NilsReimersTalks Thanks, I got your point.
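
The point in the reply above is visible in the pairwise contrastive objective itself: only the provided pairs receive a gradient, so the loss shapes the embedding space just where those pairs lie. A sketch in the spirit of a margin-based contrastive loss with cosine distance; the function name is illustrative.

```python
import torch
import torch.nn.functional as F

def pairwise_contrastive_loss(emb_a, emb_b, labels, margin=0.5):
    """labels[i] = 1 for a positive pair, 0 for a negative pair."""
    dist = 1.0 - F.cosine_similarity(emb_a, emb_b)     # cosine distance per pair
    pos = labels * dist.pow(2)                         # pull positives together
    neg = (1 - labels) * F.relu(margin - dist).pow(2)  # push negatives past the margin
    # Only these explicit pairs contribute gradient, hence "local" structure.
    return 0.5 * (pos + neg).mean()
```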

  • @muhammadhammadkhan1289 • 3 years ago

    Can you share the slides?

    • @NilsReimersTalks • 3 years ago +1

      Slides are here: nils-reimers.de/talks/2021-09-State-of-the-art-Bi-Encoders.zip

  • @高长宽 • 3 years ago

    Awesome!