This was fantastic. The questions you ask really get to the heart of it so quickly.
What an outclass podcast!
It's my understanding that it's cosine similarity, which is based on the dot product: (a1, a2, a3) · (b1, b2, b3) = a1 b1 + a2 b2 + a3 b3 .... if you divide this scalar by the product of the two vectors' lengths, it is literally cos( angle(a,b) )
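A quick sketch of what that formula looks like in code (plain Python, function names mine):

```python
import math

def dot(a, b):
    # (a1, a2, a3) · (b1, b2, b3) = a1*b1 + a2*b2 + a3*b3
    return sum(x * y for x, y in zip(a, b))

def cosine_similarity(a, b):
    # Divide the dot product by the product of the vectors' lengths
    # to get cos(angle(a, b)).
    return dot(a, b) / (math.sqrt(dot(a, a)) * math.sqrt(dot(b, b)))

# Parallel vectors score 1, orthogonal vectors score 0.
print(cosine_similarity((1, 0, 0), (2, 0, 0)))  # → 1.0
print(cosine_similarity((1, 0, 0), (0, 1, 0)))  # → 0.0
```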
Wow! What an amazing interview! Really skilled questions. You will definitely have over 100k subs in 6 months if this is the normal level of quality that you deliver 💪🏻
Thanks so much! I hope your prediction comes true. 😁
Thanks for the talk, would be great to hear more on the vector databases from someone involved.
This is such a great explanation of vector embeddings and vector databases for a non-mathematician audience!
This is really helpful. Thank you ever so much 🎉😊
awesome interview, great information
Thanks for listening!
Well done, thanks, very interesting
Awesome!
Thanks! 😊
Very well done.
clear explanation, thx
I am a new sub😂😂❤
I need a whole podcast series about these topics