Vector Search in Azure Cognitive Search using Langchain | azure openai | embeddings | openai | llms
- Published: 24 Jul 2024
- Learn about vector search in Azure Cognitive Search using LangChain and Azure OpenAI! In this tutorial, we'll learn how to use LangChain and Azure OpenAI to perform vector search in Azure Cognitive Search.
Vector search in Azure Cognitive Search
Azure Machine Learning service
Azure Machine Learning studio
How to create embeddings using Azure OpenAI
Using the text-embedding-ada-002 model to create embeddings
Retrieval Augmented Generation (RAG)
How Google and YouTube search work
Approximate nearest neighbor
k nearest neighbor
Model catalog
Azure Cognitive Search
Vector support for Azure Cognitive Search
Azure OpenAI vector embeddings
LangChain framework
Azure OpenAI vector
Clustering algorithm
Transformer
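The k-nearest-neighbor search listed above can be sketched as a brute-force cosine-similarity lookup. This is an illustrative toy (the document ids and 3-dimensional vectors are made up, not from the video; real text-embedding-ada-002 embeddings have 1536 dimensions, and Azure Cognitive Search uses an approximate HNSW index rather than this exact scan):

```python
import math

def cosine_similarity(a, b):
    # Cosine similarity: dot(a, b) / (|a| * |b|)
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def knn_search(query, vectors, k=2):
    # Exact (brute-force) k-nearest-neighbor search over (id, vector)
    # pairs, ranked by cosine similarity to the query vector.
    scored = [(doc_id, cosine_similarity(query, vec)) for doc_id, vec in vectors]
    scored.sort(key=lambda pair: pair[1], reverse=True)
    return scored[:k]

# Toy 3-dimensional "embeddings" standing in for real ones.
docs = [
    ("doc-a", [1.0, 0.0, 0.0]),
    ("doc-b", [0.9, 0.1, 0.0]),
    ("doc-c", [0.0, 1.0, 0.0]),
]
print(knn_search([1.0, 0.0, 0.0], docs, k=2))
```

Approximate nearest neighbor algorithms such as HNSW trade a little recall for much faster lookups on large collections, which is why services like Azure Cognitive Search use them instead of this exact scan.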
Vector search: learn.microsoft.com/en-us/azu...
LangChain: python.langchain.com/docs/int...
Hugging Face embeddings and FAISS vector search: • Dive into LangChain an...
Azure OpenAI Service: • First look at Azure Op...
/ girish-goudar-42a56426
/ girishtweek
/ @learnwithgirishgoudar
Really on point 🚀, Thank you
What a perfect and simple example, thanks
You are welcome!
Informative video
Thanks
Hi, I tried the same but I'm getting an error now.
from langchain.vectorstores.azuresearch import AzureSearch

vector_store: AzureSearch = AzureSearch(
    azure_search_endpoint=vector_store_address,
    azure_search_key=vector_store_password,
    index_name=index_name,
    embedding_function=embeddings.embed_query,
)
When I run this snippet I get the error below.
vector_search_configuration is not a known attribute of class and will be ignored
ImportError: cannot import name 'HnswVectorSearchAlgorithmConfiguration' from 'azure.search.documents.indexes.models'
Can you please help me with the versions of azure-search-documents and langchain you are using?
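This ImportError usually points to a version mismatch: `HnswVectorSearchAlgorithmConfiguration` existed only in certain 11.4.0 beta releases of azure-search-documents and was later renamed `HnswAlgorithmConfiguration`, so an SDK release and a LangChain release that expect different names cannot work together. A sketch of a fix (the exact pin is an assumption; check the requirements of the LangChain version you have installed):

```shell
# Assumed version pin; verify against your LangChain release notes.
pip install "azure-search-documents==11.4.0b8"
# If the error persists, upgrade langchain so both sides agree on the class name:
pip install --upgrade langchain
```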
Nice tutorial! Could you make a part 2 for endpoint deployment so that one could make HTTP requests to send queries to the script?
Yes, soon
Hey, very interesting work with lots of uses! I have a question. Cognitive Search computes the vector similarity between a "query" and a "source". In the image shown, "source" and "query" don't have to be the same modality; the "query" can be text and the "source" can be a video file. Does that mean the embeddings produced from those two are directly comparable, or must the query_emb and source_emb be generated by the same model? In other terms, my question is: can I search a query_emb (from a textual model) over a source_emb (generated by a visual model)?
As far as I know, the source and the query should use the same model. In this video I am using the OpenAI
text-embedding-ada-002 embeddings, and in another video I am using Hugging Face embeddings. As far as I know, multimodality is not yet available in these embedding models, but it might come in the future: a single model to rule them all. Link to my other video: ruclips.net/video/UXMXPoYYRHo/видео.html
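The same-model constraint in the answer above can be made concrete: each model defines its own vector space, so a similarity score between vectors from different models is meaningless even when the dimensions happen to match. A small sketch (the `(model_name, vector)` pairing and the `compare` helper are illustrative, not part of any library API):

```python
import math

def cosine(a, b):
    # Plain cosine similarity between two equal-length vectors.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def compare(query, source):
    # query and source are hypothetical (model_name, vector) pairs.
    # Refuse cross-model comparison: vectors from different models
    # live in unrelated spaces, so the score would be meaningless.
    q_model, q_vec = query
    s_model, s_vec = source
    if q_model != s_model:
        raise ValueError(
            f"cannot compare a {q_model} embedding with a {s_model} embedding"
        )
    return cosine(q_vec, s_vec)
```

Jointly trained text-image models do exist, but the point here is the one from the reply: with the models used in the video, query and source must be embedded by the same model.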
Try to make more videos
Nice video! Really learned a lot. Is the notebook available online?
You can refer to this article: techcommunity.microsoft.com/t5/azure-ai-services-blog/azure-cognitive-search-and-langchain-a-seamless-integration-for/bc-p/3941381
I have one doubt: does Azure Cognitive Search provide a vector database service too or not? Please help me.
Yes, it provides vector database support. The LangChain API which we are using in the scripts stores the data in vector format.
@@learnwithgirishgoudar thank you for the answer and reply.
If it supports a vector DB, where does the data get stored? Is it in Blob Storage, ADLS, or some other storage system? Could you let me know, please?