Everything, literally everything I've been searching for over the past month. Please make more videos on this topic for more complex situations. Please make that video, I really need it. Thanks a lot.
Highly useful! At a time when we have a huge amount of marketing AI videos and only a small amount of technical ones, your video is very interesting and efficient! Thank you!
Thank you - means a lot! :)
Wonderful, finally someone who doesn’t use a chatbot mechanism and explains how this works!!!
Your function calling tutorial was such a lifesaver! Made it sooo easy to put it into my project. You're a coding angel!
Comments like this make all the work feel like it was worth it, ha! Thanks so much for the kind words!
Really nice video, explaining things very well. The thing I like best is that Sam talks at a normal speed and explains everything very simply, with clear examples. Thanks
Bro, I'm gonna be honest with you… I have seen a lot of videos trying to understand how to integrate custom functions into my assistant, but none of them worked for me until I saw your video 🙏🙏 Thanks a lot, you have great teaching skills 🤙 You're the best
I watched about 8 other videos trying to figure out this process. This one cracked it for me. Nice one.
Awesome, very straightforward explanation.
Thank you for the tutorial, it was very much appreciated and very well explained, such that even an amateur such as myself can understand it. I will be building an assistant to learn about my work site. As a very quick example -- showing it a diagram of the door-closing mechanism on the front entrance door, so that in the future it can help with troubleshooting/advising how to make certain adjustments, such as the door closing speed. Over time I'll have something similar to the computer on the Enterprise.... maybe. Anyway, it might be a fun video for you to consider making, and selfishly I might also get some pointers or ideas from seeing how you approach such a project.
all you need in 12 minutes - thanks for this !!
Please do more videos with the OpenAI API, tools, and the Assistants API. Worth it.
This was very informative. I’ve thought about doing something like this to take my current location and maybe a compass direction, use an ADS-B API to fetch the planes in that direction from me, and return them in order of how close they are to me. An additional feature might be to use a plane reference site to gather additional details or emphasize more unusual ones. This would let me essentially ask “what’s flying low to the west of me” and have it describe the sub-hunter plane that sometimes trains around here. It’s essentially the same thing as me pulling up a flight radar app on my phone, then googling for details about the one that caught my attention.
If practical, this might be an interesting follow-up.
That helped a lot. Keep going with these Assistant tutorials. They are great.
Thanks so much for the kind words! I'll get to work on some more :)
Really helpful in getting me started with the API, thanks!
Great tutorial, very clear screen and voice, helped a lot with function calling
Thanks for the kind words, and thanks again for subscribing! ❤
Thanks a lot for the clear and very informative explanation
Excellent video, it was very helpful. Could you attach the notebook that you show in the video in order to make the interpretation and understanding of the concepts easier?
No problem - have put the link in the description :)
This was very helpful, thank you!
Amazing Video, Got my Respect! Thank you for that!
How can I call a function just like you do? Normally we add the message to the thread, but in this video the instruction is sent to the run. Please help. Also, your notebook link is not correct (it points to a different file).
Very nice, well explained. Thank you! =]
Great! Could you post a link to the entire project in Streamlit?
Thank you very much. Could you please share the ipynb file you used in the video? The file mentioned in the comments seems to be different from the one in the video.
Excellent. Working for me even post Assistant upgrades. How did you integrate your Streamlit app? Do you have this on Git too?
Great video, exactly the kind of instructions I was looking for
could I have your ipynb?
Yep - have put the link in the description :)
Can you provide a link to the code / notebook for the flight search API function? I see you posted the lookup album but not the flight search.
Great introduction, thanks (music redundant)
I would like to see an example of how to obtain the flight number from the user rather than hard-coding it.
Nice, can you do it with Ollama?
Yep, I'll put together a video about it!
Is there a way to add the flight API code (or any API) using only the Assistant Playground rather than VS3?
I had a bit of a dig into this - it looks like some people have been experimenting with this on the OpenAI forums, and someone seems to have found a way of doing it via Retrieval. I haven't tested it, but it might be worth a try! community.openai.com/t/functions-in-the-assistant-api-playground/479721/8
Thank you for the video. The notebook you had shared is openai_assistant_retrieval.ipynb. Are you able to share the notebook openai_assistant_functions.ipynb you used in the video? Thank you
Your link to the github repo is pointing to another project
Excellent ! Thanks
Thanks Paul! Hope it helped a bit :)
I'm a newbie at coding, so sorry in advance for the dumb question, but... where is this function deployed? Is it running on your computer? From my understanding, the information we provide to the assistant is only which inputs are needed, but the exact logic is not stored under "functions"? What if I don't want to run the logic on my computer? Do I have to deploy the function somewhere and set up an API call as the function for the assistant?
What UI framework are you using for the demo? Gradio?
Nice tutorial
Streamlit! :)
Very nice video
Is there any better flight API than the one you are using?
Can you please show how to set up an assistant that fetches information from websites and gives you the answers?
I am creating that chatbot, but I noticed that even within threads the model didn't remember the previous chat. For example, I wrote a prompt "change the weight in the previous prompt and keep the rest of the details the same". Will that prompt work?
I think as long as you're working in the same thread (and your context/conversation isn't too long), the assistant will reference that thread in the response. Hope that helps!
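In case a concrete example helps, here's a minimal sketch of reusing one thread for a follow-up (the IDs are placeholders and this assumes the same Assistants beta client as in the video, not the exact code shown):

    from openai import OpenAI

    client = OpenAI()
    thread_id = "thread_..."     # placeholder: the thread you created earlier
    assistant_id = "asst_..."    # placeholder: your assistant's ID

    # Adding the follow-up message to the *same* thread keeps the earlier
    # conversation in context when the next run executes.
    client.beta.threads.messages.create(
        thread_id=thread_id,
        role="user",
        content="Change the weight from the previous prompt and keep the rest the same.",
    )
    run = client.beta.threads.runs.create(thread_id=thread_id, assistant_id=assistant_id)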
Great video, very helpful
Thanks so much!
The bot answers the question based on the data fetched via the AviationStack API. How can I make the bot fetch the necessary data itself, so that the flight number doesn't need to be entered into the AviationStack API?
Hey - two approaches here: one would be to fetch the data from your own database (so instead of calling AviationStack, you'd query your own database), or you could potentially use RAG (Retrieval) to fetch context from provided documents, if the information is static. I have another video on OpenAI Assistants Retrieval which might help here.
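If you go the database route, the function you register with the assistant just swaps the API call for a query - a rough sketch along these lines (the SQLite file, table, and column names are made up for illustration):

    import sqlite3

    def get_flight_status(flight_number: str) -> dict:
        # Hypothetical local database standing in for AviationStack.
        conn = sqlite3.connect("flights.db")
        row = conn.execute(
            "SELECT status, departure_airport, arrival_airport"
            " FROM flights WHERE flight_number = ?",
            (flight_number,),
        ).fetchone()
        conn.close()
        if row is None:
            return {"error": f"No record found for flight {flight_number}"}
        return {
            "flight_number": flight_number,
            "status": row[0],
            "departure": row[1],
            "arrival": row[2],
        }

The assistant's tool definition stays the same; only the body of the function changes.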
Best of luck!
At the step with

    assistant = client.beta.assistants.retrieve(
        assistant_id=assistant_id
    )
    assistant

I get an error. The traceback points into the SDK's retrieve method:

    timeout: Override the client-level default timeout for this request, in seconds
    """
    extra_headers = {"OpenAI-Beta": "assistants=v1", **(extra_headers or {})}
    --> return self._get(
            f"/assistants/{assistant_id}",
            options=make_request_options(
                extra_headers=extra_headers, extra_query=extra_query, extra_body=extra_body, timeout=timeout
            ),
            cast_to=Assistant,
        )

I don't know what to do :(
Issue resolved
Then why do we use agents?
Use OpenAI function calling + prompt engineering and you get more control.
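For example, a minimal sketch with the plain chat completions endpoint (the tool name, parameters, and model here are illustrative, not from the video):

    import json
    from openai import OpenAI

    client = OpenAI()

    # Describe the function to the model; the definition is a schema only.
    tools = [{
        "type": "function",
        "function": {
            "name": "get_flight_status",
            "description": "Look up the live status of a flight",
            "parameters": {
                "type": "object",
                "properties": {"flight_number": {"type": "string"}},
                "required": ["flight_number"],
            },
        },
    }]

    resp = client.chat.completions.create(
        model="gpt-4-turbo",  # any tools-capable model
        messages=[{"role": "user", "content": "What's the status of BA123?"}],
        tools=tools,
    )

    # The model returns which function to call and with what arguments;
    # running it (and any follow-up prompting) stays entirely in your hands.
    call = resp.choices[0].message.tool_calls[0]
    args = json.loads(call.function.arguments)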
The part I don't understand yet is: with the Python code, where do you host it so the Assistant can call it? Then in the OpenAI Assistants dashboard, under functions, say you have a function named Get_weather and you hit the test button. How does the Assistant know where the code is for Get_weather? There's no endpoint etc. set up? I'm so confused about this part. Someone please help! haha
Hey! Kind of a late response but still.
The model doesn't really call the function, so it doesn't need to know where the Python function is. All the model does (in a general sense) is take the question/message from the user, take a bunch of **descriptions** of some functions (and their parameters), and return the name of the function to call. Basically, the model only **tells** you which function you should call (and which parameters you should call it with) based on the descriptions you provided.
After you get that response from the model, it's up to you to call the function the model suggested (as shown around 8:30 in the video). After doing this, you can optionally pass the values returned by those functions back to the model, so the model can come up with a more "human-sounding" response (which is what's happening around 10:42).
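If it helps, here's a rough sketch of that loop with the Python SDK - get_flight_status, the IDs, and the stubbed return value are placeholders, not the exact code from the video:

    import json
    import time
    from openai import OpenAI

    client = OpenAI()
    thread_id = "thread_..."    # placeholders from your earlier setup
    assistant_id = "asst_..."

    def get_flight_status(flight_number: str) -> dict:
        # Stand-in for whatever local function your tool description refers to.
        return {"flight_number": flight_number, "status": "landed"}

    run = client.beta.threads.runs.create(thread_id=thread_id, assistant_id=assistant_id)

    while True:
        run = client.beta.threads.runs.retrieve(thread_id=thread_id, run_id=run.id)
        if run.status == "requires_action":
            # The model has only *named* the function and its arguments;
            # actually calling it happens right here, on your machine.
            outputs = []
            for call in run.required_action.submit_tool_outputs.tool_calls:
                args = json.loads(call.function.arguments)
                result = get_flight_status(**args)
                outputs.append({"tool_call_id": call.id, "output": json.dumps(result)})
            run = client.beta.threads.runs.submit_tool_outputs(
                thread_id=thread_id, run_id=run.id, tool_outputs=outputs
            )
        elif run.status in ("completed", "failed", "cancelled", "expired"):
            break
        time.sleep(1)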
Hope that helps!
Great reply, thanks for covering this for me!
@@MakeStuffWithAI happy to help!
Super video
Where is the full code?
Which API are you using? Can you share it?
I used aviationstack.com/ for the flight API :)
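In case it's useful, here's a rough sketch of what the call can look like (the endpoint and parameter names are from memory, so double-check them against the AviationStack docs):

    import requests

    AVIATIONSTACK_KEY = "YOUR_API_KEY"  # placeholder

    def get_flight_status(flight_number: str) -> dict:
        # Query the flights endpoint for a single flight by IATA code.
        resp = requests.get(
            "http://api.aviationstack.com/v1/flights",
            params={"access_key": AVIATIONSTACK_KEY, "flight_iata": flight_number},
            timeout=10,
        )
        resp.raise_for_status()
        data = resp.json().get("data", [])
        return data[0] if data else {"error": f"No data returned for {flight_number}"}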
I subbed and liked
The tutorial is not for beginners.
Thanks! Why not create a simple React front end (like the one you showed)... call the API, get the result, and show it in the front end as a complete project.
Thanks for the idea! I'll get to work on it :)