16:00, I have heard you talk about the Multiple Negatives Ranking loss quite often. Is it the same as the SupCon loss or the contrastive loss? I also noticed that in your SimCSE paper implementation with the sentence-transformers library, you again use the MNR loss. Is it a special case of contrastive loss? I appreciate your response to this question. Thanks
There are some pretrained models optimized for cosine similarity and others for the dot product. How does it affect the results if you use the other one instead?
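(For context: the two scores differ only by vector normalization, so a model trained for one can produce misleadingly scaled scores with the other. A minimal numpy sketch of the relationship; the example vectors are illustrative:)

```python
import numpy as np

a = np.array([1.0, 2.0, 2.0])   # norm 3
b = np.array([3.0, 0.0, 4.0])   # norm 5

dot = a @ b                      # 1*3 + 2*0 + 2*4 = 11
# cosine similarity = dot product of length-normalized vectors
cos = dot / (np.linalg.norm(a) * np.linalg.norm(b))  # 11 / 15

print(dot, cos)
```

Models tuned for the dot product can encode relevance in the vector length, which cosine similarity discards.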
Great presentation. Very useful, thanks for putting the effort and sharing it!
Really great presentation!
How do we use triplets with Multiple Negatives Ranking loss? Doesn't this loss only take positive pairs?
Thanks! Great content
Very interesting talk!
Hello, thank you for the awesome talk. I have a question: why might contrastive / triplet loss only optimize the local structure? Don't these losses increase the distance between negative cases, which usually include random pairs in a batch?
They increase the distance only for the pairs you provide. If you provide poor pairs, they might only optimize some local structures.
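(Illustration of this point: a minimal numpy sketch of an in-batch MNR-style loss. The function name and scale value are mine, not the sentence-transformers implementation. Only embeddings inside the batch enter the loss, which is why the optimization is local to the pairs you provide:)

```python
import numpy as np

def mnr_loss(anchors, positives, scale=20.0):
    """In-batch ranking loss: for each anchor i, positives[i] is the
    positive and every other positives[j] in the batch is a negative.
    Embeddings outside the batch never affect the loss."""
    a = anchors / np.linalg.norm(anchors, axis=1, keepdims=True)
    p = positives / np.linalg.norm(positives, axis=1, keepdims=True)
    sim = scale * (a @ p.T)  # scaled cosine similarity matrix
    # softmax cross-entropy with the diagonal (true pairs) as labels
    log_probs = sim - np.log(np.exp(sim).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_probs))

rng = np.random.default_rng(0)
anchors = rng.normal(size=(4, 8))
positives = anchors + 0.1 * rng.normal(size=(4, 8))  # well-aligned pairs
print(mnr_loss(anchors, positives))  # low loss: each positive ranks first
```

If the sampled pairs are easy or unrepresentative, the gradient says nothing about the rest of the embedding space.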
@NilsReimersTalks Thanks, I got your point.
Can you share the slides?
Slides are here: nils-reimers.de/talks/2021-09-State-of-the-art-Bi-Encoders.zip
Fantastic!