Build and Deploy LLM Application in AWS Lambda - BedRock - LangChain
- Published: 18 Jun 2024
- Building and deploying a Large Language Model (LLM) application on AWS Lambda, leveraging Bedrock and LangChain. We explore how to create a serverless LLM chain application that lets users ask questions in natural language.
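The chain described above can be sketched as a minimal Lambda handler. This is an illustrative sketch, not the exact code from the video: the model ID, prompt template, and event shape are assumptions, and the LangChain imports assume the packages are provided via a Lambda layer (covered later in the video).

```python
import json

# Assumptions: adjust the model ID and template to the model you enabled in Bedrock.
MODEL_ID = "anthropic.claude-v2"
PROMPT_TEMPLATE = "You are a helpful assistant. Answer the question:\n{question}"


def build_prompt(question: str) -> str:
    """Fill the prompt template with the user's question."""
    return PROMPT_TEMPLATE.format(question=question)


def lambda_handler(event, context):
    # Imports are done lazily so the function fails with a clear error
    # if the Lambda layer providing these packages is missing.
    from langchain_community.llms import Bedrock
    from langchain.prompts import PromptTemplate
    from langchain.chains import LLMChain

    # Accept either a raw invocation payload or an API Gateway proxy event.
    question = (
        json.loads(event["body"])["question"] if "body" in event else event["question"]
    )

    llm = Bedrock(model_id=MODEL_ID)
    chain = LLMChain(
        llm=llm,
        prompt=PromptTemplate(input_variables=["question"], template=PROMPT_TEMPLATE),
    )
    answer = chain.run(question=question)
    return {"statusCode": 200, "body": json.dumps({"answer": answer})}
```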
⭐️ Contents ⭐️
00:00 Introduction
1:57 Bedrock and Lambda Service
4:23 Create Lambda Function
8:04 Deployment and Test
9:17 Create and Add Custom Lambda Layer
16:47 Test
17:29 Summary and Conclusion
📚 Resources 📚
▸ ARN for popular packages: api.klayers.cloud/api/v2/p3.1...
▸ Amazon ECR Public Gallery - SAM : gallery.ecr.aws/sam/build-pyt...
▸ Official Documentation: aws.amazon.com/blogs/compute/...
▸ Docker: docs.docker.com/desktop/insta...
▸ Article and Code: / build-and-deploy-llm-a...
__________________________________________________________________________________________________
🔔 My Newsletter and Featured Articles: abonia1.github.io/newsletter/
🔗 Linkedin: / aboniasojasingarayar
🔗 Find me on Github : github.com/Abonia1
🔗 Medium Articles: / abonia
Please do more on AWS Bedrock for developing RAG applications... your explanation is simple and effective. Stay motivated and upload more videos about LLMs!
Thanks for your kind words! Sure I will do it.
Yes, I want the same thing @AboniaSojasingarayar
Here is the tutorial link for deploying a Retrieval-Augmented Generation (RAG) application in AWS: ruclips.net/video/gicsb9p7uj4/видео.html
Thanks for the video. Very useful for me as I am new to AWS lambda and bedrock. Can you please upload the lambda function source code? Thanks again!
Glad it helped.
Sure, you can find the code and the complete article on this topic in the description.
In any case, here is the link to the code: medium.com/@abonia/build-and-deploy-llm-application-in-aws-cca46c662749
Very informative, thanks for uploading.
Glad it helped!
Hi Abonia, thanks for the thorough guide, but I'm a bit confused about the lambda_layer.zip file. Why did you have to create it through Docker? Is there an easier way to provide the dependencies in a zip file without going through Docker? Thanks in advance!
Hi Humayoun Khan, yes we can, but Docker facilitates the inclusion of the runtime interface client for Python, making the image compatible with AWS Lambda.
It also ensures a consistent and reproducible environment for the Lambda function's dependencies, which is crucial for avoiding discrepancies between development, testing, and production environments.
Hope this helps.
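As a concrete illustration of that Docker workflow, the layer build might look like the sketch below. The Python version, image tag, and package list are assumptions; adjust them to match your Lambda runtime.

```shell
# Build lambda_layer.zip inside the public SAM build image so that any
# compiled wheels match the Amazon Linux environment Lambda runs on.
# Assumptions: a Python 3.12 runtime and the public.ecr.aws/sam/build-python3.12 image.
mkdir -p python   # Lambda layers expect packages under a top-level "python/" folder
cat > requirements.txt <<'EOF'
langchain
langchain-community
boto3
EOF
if command -v docker >/dev/null 2>&1; then
  docker run --rm -v "$PWD":/work -w /work \
    public.ecr.aws/sam/build-python3.12 \
    pip install -r requirements.txt -t python
  zip -r lambda_layer.zip python
fi
```

The resulting lambda_layer.zip can then be uploaded as a custom layer and attached to the function.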
Very informative
Glad it was helpful!
👍👍
😊😊
Hi Abonia, thanks for sharing. I am facing this error, can you please tell me how to resolve it? "errorMessage": "Unable to import module 'lambda_function': No module named 'langchain_community'"
Hello,
You are most welcome.
You must prepare your ZIP file with all the necessary packages. You can refer to the instructions starting at 09:04 in the video.
Can we use OpenAI and ChromaDB on AWS?
Yes we can!
In the tutorial below, I demonstrate how to create and deploy a Lambda layer via a container for larger dependencies: ruclips.net/video/gicsb9p7uj4/видео.htmlsi=F_X7-6YCAb0Kz3Jc
@AboniaSojasingarayar yes, but can this be done without EKS or containers?
Yes! You can try it by creating a custom Lambda layer. If you face issues, try to use only the required libraries and remove any unnecessary dependencies from your ZIP file. Hope this helps.
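For readers who want to skip containers entirely, one container-free option (an assumption, not shown in the video) is pip's cross-platform wheel install, which works as long as every dependency ships pre-built manylinux wheels:

```shell
# Build a layer ZIP locally, without Docker, by asking pip to fetch
# Linux-compatible wheels. Assumption: a Python 3.12 Lambda runtime.
mkdir -p python
if command -v pip >/dev/null 2>&1; then
  pip install \
    --platform manylinux2014_x86_64 \
    --implementation cp \
    --python-version 3.12 \
    --only-binary=:all: \
    --target python \
    langchain-community || true   # fails if a dependency has no pre-built wheel
  zip -r lambda_layer.zip python || true
fi
```

If a package only builds from source (no manylinux wheel), this approach fails and the Docker route shown earlier is the reliable fallback.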