7:08 virtual environment and pip install
13:13 prompt template
14:11 chain
18:40 langchain_helper.py and Streamlit
27:14 set output key
29:05 agents
32:42 giving a task to the agent
36:13 indexes and building the YouTube assistant
49:43 the k parameter
I appreciate the effort to deliver a well-structured and very informative course. I just want to point out that rather than using multiple if statements for the pet_color as in the snippet below,
if animal_type == "Dog":
pet_color = st.sidebar.text_area(
label="What color is your dog?",
max_chars=15
)
if animal_type == "Cat":
pet_color = st.sidebar.text_area(
label="What color is your cat?",
max_chars=15
) ...
you could do the below to avoid multiple if statements.
pet_color = st.sidebar.text_area(label=f"What color is your {animal_type}?", max_chars=15)
My thoughts exactly. You could even use animal_type.lower() to remove the capital first letter
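Putting both together, a minimal sketch (reusing the selectbox from the video; the exact options may differ):
animal_type = st.sidebar.selectbox("What is your pet?", ("Cat", "Dog", "Cow", "Hamster"))
pet_color = st.sidebar.text_area(label=f"What color is your {animal_type.lower()}?", max_chars=15)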
I think it was done this way so that less experienced viewers could follow along.
I cannot explain the usefulness of this tutorial. Helped me understand a lot of topics and also gave me some project ideas. 10/10. 🙌
awesome, excited to see what you build! ✨
I'm not done with the HTML/CSS tutorial and yet they upload another tutorial. HOW PASSIONATE THEY ARE 😭
@@nicknico4121 here's a pro tip: don't learn new ones constantly. Pick one you think you might like, and build those skills until you can complete an application or project that you designed and implemented yourself. There is no need to learn every new framework and language every two minutes; you cannot keep up, and even the best developers in the world don't. Get the core skills first, and then you can build applications in any language or framework your project calls for.
@@nicknico4121 you shouldn't be worried about that; learn at your own pace and you'll be grateful afterwards. Also, you should really choose just one programming language you find interesting and stick to it.
What?
Dude, this is a million miles away from HTML/CSS... Come back in 6-12 months.
@@defaultdefault812 bro, that's not the point of my comment. I just admire how passionately these people create videos as soon as possible. I know I'm not that far along, but what I'm sure of is that I'm consistent at my own pace. Good luck in your journey.
Beautiful introduction to Langchain. Amazing that you made 2 demos in 1 hour with such clarity and simplicity
Very interesting. I was struggling to grasp the concepts of LLMs and LangChain, and you make them appear as simple as a regular program.
Thanks
text-davinci-003 has since been deprecated. The same tutorial works for me if I replace it with "gpt-3.5-turbo-instruct". Good Luck.
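For example, something like this (a quick sketch with the classic langchain OpenAI wrapper; pick whatever temperature you like):
from langchain.llms import OpenAI
llm = OpenAI(model_name="gpt-3.5-turbo-instruct", temperature=0.7)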
Just fantastic !!! Thanks a lot.
Some questions that come to my mind:
- How to use it with Hugging Face models or gpt4free?
- How to use it with image or video based models like DALL-E?
- Let's imagine I have a PDF that I convert to a vector DB. What is the difference between asking an AI based only on the information in this PDF, as opposed to the total knowledge of ChatGPT plus the information in the PDF? How can you combine and compare them?
- Since you are an AWS pro: show how to deploy everything in the cloud with Elastic Beanstalk or the other web services
Amazingly well done video on langchain! Thank you 🙏
This is brilliant! Definitely the best LangChain course for beginners. I saw several other courses on YouTube and still couldn't fully understand how all of its tools work together. Only after this one did I finally get it! Thank you so much!
This was incredible!! Thank you so much for this video, it was really easy to understand and follow! I can't wait to start doing my own projects with langchain!!
Glad you found it helpful!
Hi, I really like your PS1 setup, it looks very neat! Could you make a video about it?
Around minute 23, how about:
st.title("Pets name generator")
animal_type = st.sidebar.selectbox("What is your pet?", ("Cat", "Dog", "Cow", "Hamster"))
pet_color = st.sidebar.text_area(label="What color is your " + str.lower(animal_type) + "?", max_chars=15)
Why does the OpenAI LLM not respond with all the fluff like "Certainly! Finding a fitting name for a pet is a difficult process, and I'm happy to help in this regard. With this said, here are five examples of names that might fit your cat, which is black of color:
1. Shadow - Shadow is a common name for any black animal, so it would fit really well for your pet cat.
2. Midnight - The name midnight refers to the time of day at 12 am when it's really dark outside. The darkness is a reference to your cat's color!
3. etc etc etc
Always remember that it's a big responsibility to choose a proper name for a pet. It's not easy to make such a decision lightly!"
Not sure, but I do know that if you tell ChatGPT to provide the output in a particular format, it will do so. E.g. tell it to "provide the output in a numbered list format and do not include any other text than the numbered list" and it will do that.
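For example, a rough sketch in the style of the video's prompt template (the wording here is mine, not the exact template from the video):
from langchain.prompts import PromptTemplate

prompt = PromptTemplate(
    input_variables=["animal_type", "pet_color"],
    template=(
        "I have a {pet_color} {animal_type} and I want a cool name for it. "
        "Suggest five names. Provide the output as a numbered list and do not include any other text."
    ),
)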
Thanks for such a helpful course.
The section for the YouTube Assistant is much too dense and a bit all over the place.
You don't run the langchain helper to check if the file is okay (a noob like me has to),
and then you bounce between tabs, which also makes things more confusing.
Break that section down into specific chunks in the video, so that those of us who are only starting out at coding can follow.
You mean you can switch to and query Gemini or Claude models from this app, using LangChain? If so, how is that done?
Thanks for this video. It's very interesting and very easy to understand.
Great tutorial. Streamlit seems incredibly useful. I would support a streamlit tutorial.
do you have a list of the extensions you are currently using? your VS code setup is super sleek. :) thanks for providing this
Very informative, thanks, but this ugly bit of code around 23:50 made me extremely uncomfortable. Instead of copy-pasting the same code multiple times, why not use a simple f-string,
f"What color is your {animal_type}?", and drop all the "if" statements completely?
Easier for beginners to follow
At least I got basic concepts cleared :) now I am making my app :D
The code for the first example is now wrong and outdated.
Sadly none of this code works anymore by April 2024
I still can't see why we need LangChain. We can do templating with Jinja and use vLLM for serving the LLM. Integrating with APIs is basic programming. Getting back structured data works much better with Guidance, LMQL or Jsonformer. So why use LangChain? I don't seem to get it.
A question here: when I was following the agent part, I used the wikipedia and llm-math tools, but the agent only chose to use the calculator, not wikipedia, throughout the process. For the first part it gives:
"I need to find the average age of a dog and then multiply it by 3
Action: Calculator
Action Input: 3 * (12 + 15 + 10 + 8 + 5) / 5
Observation: Answer: 30.0"
which is very weird because I expected it to use wikipedia instead. Anyone know why?
Excellent video. Cleared up a number of topics.
glad you liked it!
It would be helpful if someone could help with the answers.
Why do we need to use an embedding model? Couldn't we just ask the GPT-4 model to answer our question based on our custom data?
What is the use of an embedding model over GPT-4?
And if I want to create a text classifier based on my custom data, what should I use?
Hey, this was a good learning experience, but one question: can we do these things with Microsoft Azure OpenAI instead of the OpenAI ChatOpenAI setup? Can you share some notes on that?
Very, very cool! Thanks for making this video. Hopefully, you'll make a "Langchain: Taking it to the next level". :)
Was literally searching for this course and you guys uploaded it
Much better than a course I bought on Udemy :D
So much complexity could have been avoided with f-strings, right? Instead of using the LLM prompt template, just use an f-string; instead of using if statements for each animal type, use an f-string; ...
This is so useful. I was having trouble thinking of a name for my cat.
😂😂😂😂😂
Getting an error like the one below when running the pip install langchain command; my installed Python version is 3.12.
ERROR: Ignored the following versions that require a different python version: 0.55.2 Requires-Python
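One workaround, assuming you also have an older Python such as 3.11 installed, is to create the virtual environment with that version instead (Windows commands shown; adjust for your OS):
py -3.11 -m venv venv
venv\Scripts\activate
pip install langchain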
what is the extension or package you use that shows the CPU and Mem on the command line? Looks really cool
Failing at the start, unfortunately, when running the dog name generating script. I can print the model name, so things are set up correctly package-wise, but when the code reaches name = llm("Write 5 dog names") it throws the error: "module 'openai' has no attribute 'error'".
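In case it helps: this usually means the installed openai package is newer than what the older langchain release used in the video expects (the openai.error module was removed in openai 1.x). Pinning an older version is a common workaround (the exact pin is an assumption about your setup):
pip install "openai<1.0"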
Really cool tutorial and very helpful for beginners. The best on YouTube, I would say. Just a quick tip for starters: start by trying a few no-code tools like Langflow or Flowise. They really help to visualize what you are doing. For me they really helped me understand the fundamental concepts and which components to use.
Great Video!! First AI related tutorial I watched end to end!
I've not gone through it yet, but can someone tell me: this is not just another set of API calls to the OpenAI API, right?
What can I say? Super, super helpful. Thank you!
Awesome content! Thanks for the video!
Cheers from Brazil
How is it possible to have k=4? That leaves only 97 tokens for the prompt template and the output.
I think you should update this video, because I keep getting errors due to new versions etc., and I can't solve them!
This has been great so far, but for some reason it breaks down for me at the output key. I am getting a key error: "KeyError: 'pet_name'". Any ideas?
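If it helps, I think that key only exists when you set output_key on the chain. A sketch with the classic LLMChain API (llm, prompt, animal_type and pet_color are whatever you already defined):
from langchain.chains import LLMChain

chain = LLMChain(llm=llm, prompt=prompt, output_key="pet_name")
response = chain({"animal_type": animal_type, "pet_color": pet_color})
print(response["pet_name"])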
Please do a crash course for Beginners for Local LLMs.
Amazing intro video, thank you so much.
I tried the agents as per your example. I added both wikipedia and llm-math as tools and asked the exact same question, but the response starts with Action: Calculator and it tries to compute the math first rather than using wikipedia to search first. The agent is not reasoning... May I have your views? @rishabincloud
Good intro tq 😅
What's the command to show memory information and PC utilization in the PowerShell terminal?
This is great! Thank you!
My man really started the tutorial by copying the Rabbitmetrics video on the subject word for word.
Great course - THANKS. Q: Is there a playlist or other videos that go into deeper details?
thanks a ton for this!
Thanks for your video. I want to connect 7b-chat-hf to LangChain for summarization, but neither map-reduce nor refine responds in the last step:
map-reduce took 2 hours without responding, and refine gives me a blank document. Have you faced this problem?
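For context, I'm calling it roughly like this (a sketch; llm is the 7b-chat-hf model already wrapped for LangChain, and long_text is my input document):
from langchain.text_splitter import RecursiveCharacterTextSplitter
from langchain.chains.summarize import load_summarize_chain

docs = RecursiveCharacterTextSplitter(chunk_size=1000, chunk_overlap=100).create_documents([long_text])
chain = load_summarize_chain(llm, chain_type="map_reduce")  # also tried chain_type="refine"
summary = chain.run(docs)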
Thanks for the nice introduction. Can you clarify which versions you have for the requirements?
Craving a full Flowise course; it's a no-code UI on top of LangChain.
You are crushing it bro.
❤ From India
Thanks for this great vid 💕👌
Thanks for the brilliant video.
There is a small bug: when using the lch.get_response_from_query method, please pass the k variable a value.
Do you use any extension for the terminal?
Thanks
Brilliant stuff man. Keep up the good work.💯
6:03 "Python was not found; run without arguments to install from the Microsoft Store, or disable this shortcut from Settings > Manage App Execution Aliases." Even though i have installed python on my system. Please resolve this issue. Thank you!
You probably need to add Python to your PATH (or disable the Store alias), then create and activate a virtual environment and install the packages there.
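Roughly, assuming you used the python.org installer (which also adds the py launcher):
py -m venv venv
venv\Scripts\activate
pip install langchain openai streamlit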
Thanks Rishabh.
Do I need to learn any languages or skills before learning LangChain?
Question to the crowd: what are the main Python libraries to know apart from LangChain? Hugging Face? OpenAI? Is AutoGPT a library?
Sorry, I am a bit lost.
Just learn one and stop trying to run before you can walk.
LangChain is a framework.
Hugging Face is a platform for hosting and deploying models.
OpenAI is a service provider.
AutoGPT is an open-source autonomous agent project.
Go start with the OpenAI APIs.
@@defaultdefault812 Hi, can you help me? What embeddings can I use if I want to build the YouTube assistant with open-source models?
What are the embeddings you imported? It wasn't explained, or did I miss it?
Fantastic course thank you
I hated my English teacher because she just made me feel weird and stupid, but I'm good at English 😿
Can anyone explain: does it send 4000 words at a time, or a total of 4000 words, because of the token limit?
If it only sends 4000 words when k=4, how does it come to a conclusion without reading the whole transcript?
Thank you for the help, very informative and interesting video.
Based on my understanding, the YouTube assistant finds the 4 most similar chunks of the transcript (each chunk is about 1000 characters), merges them, and feeds the merged text to text-davinci-003.
So text-davinci-003 tries to answer the user's question based on those roughly 4000 characters, not the whole transcript.
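Roughly, the retrieval step looks like this (a sketch of the FAISS similarity search used in the video; db is the vector store built from the transcript):
docs = db.similarity_search(query, k=4)  # the 4 most similar ~1000-character chunks
docs_page_content = " ".join(d.page_content for d in docs)  # roughly 4000 characters passed to the LLM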
Can someone else just appreciate with me that at approx. 16:00 we learn it takes 28 GB of memory to choose a cat name? I died laughing. Great video. I shall now continue watching.
great video!!
Can anyone please help me: how do I get the exact PowerShell setup the guy above is using?
Please make a video for Time series forecasting
Thanks so much 😊
Getting "No module named 'langchain'", which Streamlit doesn't like.
Can anyone tell me, do we have to pay for the API key before we can execute it?
Hi, no. You need to provide your card info, and then, at the end of the month, OpenAI will charge you for what you spent. BTW, they don't take any money if you've only used a small amount, for example 2 cents :)
Did I hear zero shot react uses react framework?
Sorry, I am like a total beginner; at 0:05:04 what is the app you are running?
That's Command Prompt; just search 'cmd' on your Windows machine.
Lots of Love
Thanks for the effort. Do you have any Udemy courses on this topic?
Thanks
Robot Operating System (ROS) course plz 😢
very cool.
Nice 👍👍
What tool is he using to get his command prompt to show his CPU and memory usage? I have been using my best google-fu to find it with no luck.
I think it may be "oh my posh" 🤔
powershell
I am using oh-my-posh, theme called “clean-detailed”
Nice one.
Hi, very good. I need a course on finding clients,
I mean a client-hunting crash course for every purpose, with the extremely deep techniques and things that you know. If you or anyone has made such a course, tell me.
Thanks ❤❤❤
Why not teach LangChain using gpt4free instead of an OpenAI key?
FCC for the win!
You took most of the statements in the introduction of this video from the 8-month-old video on LangChain from Rabbitmetrics. You should have the decency and courtesy to at least mention and cite that. It is very bad practice to copy stuff from others and not cite it.
Thank you!
Most of this video is based on langchain documentation 😂😂
Aw, I wish this were in JavaScript, but this works too.
I like it.
👏👌
Thanks! I keep getting rate limit errors lol
Can we have a course for advanced users or experts, instead of beginners all the time?
pay
Fcc...winning!
Me setting temperature to 100 and it gave me 'Cat K!ller' as my dog's name 💀
First .❤😊
First 🥇
How is this for beginners? Already failing at 5:33! Explain the steps, for fuck's sake!