5 Steps to Build Your Own LLM Classification System
- Published: Jul 3, 2024
- Want to get started with freelancing? Let me help: www.datalumina.com/data-freel...
Need help with a project? Work with me: www.datalumina.com/consulting
🔗 GitHub Gist
gist.github.com/daveebbelaar/...
👋🏻 About Me
Hi there! I'm Dave, an AI Engineer and the founder of Datalumina. On this channel, I share practical coding tutorials to help you become better at building intelligent systems. If you're interested in that, consider subscribing!
Maaaaaan.... You have THE BEST content, HANDS DOWN, for Gen AI Development. Clear, concise, every step explained, context.... Context is key... Bravo! And thanks a lot for this, it's inspiring.
Wow, thanks!
Great! would love to see more of these.
Such great content. I was going to gist this, and then I saw that's even how you're sharing it! I wanted a use case for the Instructor library, as it looked interesting, but I wasn't sure what it added beyond Pydantic... and here it is. Thanks!
This is exactly what I needed!
Thanks !!
Great, man, absolutely brilliant.
You did an amazing job, thank you so much for sharing this.
Congratulations this is just perfect!
Excellent video. Can you go into a bit more detail on how a database for this type of information might look and operate, or any type of automation that would be involved? You mentioned sentiment analysis and doing analytics.
So cool that you make such great content, with clear explanations, and are so transparent
I appreciate that!
Good tip, thanks!
How do you deal with objections to sending this 'sensitive' data to OpenAI? We are doing a project now where we have to clean the data before sending it to OpenAI, which is a big challenge. Curious to hear other people's thoughts on this...
We use Azure OpenAI. Clients are generally okay with that in our experience.
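For the data-cleaning step mentioned above, a common approach is to redact obvious PII before the text ever reaches the API. Below is a rough regex-based sketch; the patterns and labels are illustrative assumptions (production projects often use dedicated tools such as Microsoft Presidio instead).

```python
import re

# Illustrative patterns only -- real PII detection needs far more coverage.
PATTERNS = {
    "EMAIL": re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+"),
    "PHONE": re.compile(r"\+?\d[\d\s().-]{7,}\d"),
}

def redact(text: str) -> str:
    """Replace matched PII spans with placeholder tokens before the API call."""
    for label, pattern in PATTERNS.items():
        text = pattern.sub(f"<{label}>", text)
    return text
```

The placeholders keep the message structure intact, so the classifier can still reason about the text without seeing the raw identifiers.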
Loved the content.
What are the advantages of using this instead of function calling?
@@sumitbindra streamlines prompt engineering, less code, and auto retries.
@@daveebbelaar makes sense. thank you
You have 50,000 class transcripts and need to build a recommendation engine. Best approach?
Why don't you just use the JSON response from OpenAI directly?
This unifies your data structures without relying on prompt engineering. You still have to provide a JSON schema when using the JSON response with OpenAI, and there is also no automated retry mechanism if it fails to load your Pydantic model afterwards. Overall, this streamlines the development experience, especially if you're working with multiple developers who might all have slightly different prompting styles for JSON. Instructor uses the JSON response and Function Calling under the hood.
@@daveebbelaar How good or bad is this solution compared to alternatives like the LangChain and LlamaIndex output parsers?
@@AbdulBasit-ff6tq I don't think it's related.
Gold
Combined it with FastAPI to turn it into an endpoint and call it from the frontend... ooooofff, faster development for machine learning web systems.