Finally! How to Leverage JSON Mode in OpenAI's New GPT 4 Turbo 128k
- Published: 8 Nov 2023
- @siliconsociety Lead AI Engineer Robert Sharp shows how to use OpenAI's new GPT-4 Turbo 128k with JSON mode.
We'll be posting more pro tips as we go. If you liked this, please like the video to appease The Algorithm, and subscribe if you're interested in more 🙂
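For anyone skimming the comments first, here is a minimal sketch of the JSON-mode call the video covers. The model name, prompts, and the `build_request` helper are illustrative placeholders, not the exact ones from the video:

```python
import json

def build_request(user_prompt: str) -> dict:
    """Assemble keyword arguments for a JSON-mode chat completion."""
    return {
        "model": "gpt-4-1106-preview",  # GPT-4 Turbo preview with the 128k context window
        # JSON mode: constrains the model to emit syntactically valid JSON
        "response_format": {"type": "json_object"},
        "messages": [
            # JSON mode requires the word "JSON" to appear somewhere in the messages
            {"role": "system", "content": "You are a helpful assistant. Respond in JSON."},
            {"role": "user", "content": user_prompt},
        ],
    }

# With the real client you would then call something like:
#   from openai import OpenAI
#   completion = OpenAI().chat.completions.create(**build_request("Describe an RPG character."))
#   data = json.loads(completion.choices[0].message.content)

params = build_request("Describe an RPG character.")
```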
Cheers. That was helpful. I'm surprised they aren't as clear about these instructions in their documentation...
Very inspiring! I reversed your approach to extract JSON objects matching predefined pydantic classes from unstructured text data. The pydantic schema guarantees that only the data that fits the slots is extracted.
On top of that, I added instructions to reason about the most probable information that is not present in the source data but can be deduced from what it did find!
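The "only data fitting the slots" idea above can be sketched without pydantic, using stdlib dataclasses in its place to keep the example dependency-free. The `Character` schema and field names are made up for illustration:

```python
import json
from dataclasses import dataclass, fields

@dataclass
class Character:
    # The "slots" we want filled from unstructured text
    name: str
    char_class: str
    level: int

def fit_to_slots(raw_json: str) -> Character:
    """Keep only the keys that fit the schema's slots; drop everything else."""
    data = json.loads(raw_json)
    allowed = {f.name for f in fields(Character)}
    return Character(**{k: v for k, v in data.items() if k in allowed})

# A model reply may include extra keys; only the defined slots survive.
reply = '{"name": "Ayla", "char_class": "ranger", "level": 3, "mood": "cheerful"}'
character = fit_to_slots(reply)
```

Pydantic adds type coercion and validation errors on top of this filtering, which is why it is the better fit in practice.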
Thanks!
Excellent video!
Excellent tutorial, thank you. Please provide more.
Awesome! I'm in the process of doing something similar and trying to figure out the best way to get my output as consistent as possible. Will definitely look into using a method like this!
Great tutorial, I am surprised you don't have more subscribers.
What a video, comments section also super helpful.
Cheers everyone.
This is exactly what I was looking for, thanks!
You need to start making more content. 👍
This is great! I have been working on a project that needed JSON output from the OpenAI API, and was describing the JSON object in the system prompt. As you showed, that works most of the time but fails ~5% of the time. I was solving that by making the call again whenever the output couldn't be parsed, but that eats up a lot more tokens.
This seems much more elegant.
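The retry fallback described above can be sketched as a small loop. Here `call_model` is a stand-in for whatever function actually hits the API; each failed parse costs another round of tokens, which is exactly the waste JSON mode avoids:

```python
import json

def get_json(call_model, prompt: str, max_retries: int = 3) -> dict:
    """Retry the model call until the reply parses as valid JSON."""
    last_error = None
    for _ in range(max_retries):
        reply = call_model(prompt)
        try:
            return json.loads(reply)
        except json.JSONDecodeError as err:
            last_error = err  # malformed reply; pay for another attempt
    raise ValueError(f"No valid JSON after {max_retries} attempts: {last_error}")

# Simulated model that fails once, then returns valid JSON.
replies = iter(["not json", '{"ok": true}'])
result = get_json(lambda prompt: next(replies), "give me JSON")
```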
I was in the same boat. This new approach gives me tons more confidence in the outputs.
great idea, this helped me so much
This is so cool. Thanks for sharing.
Very cool. I am using a JSON schema to define the structure in the prompt, but pydantic looks cool too. Also wondering if you can just use function calling for this, since you have to define a JSON schema for the function anyway. Have you tested the differences between JSON mode and function calling?
Ask the AI agent to check and increase your font size.
Yeah fr. Had to play the video on my TV
Very helpful; a great way to get structured output from the GPT API. Just wondering if it supports nested objects, such as a graph structure G(V, E) with nodes and edges?
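On the nesting question: JSON mode guarantees syntactically valid JSON, and valid JSON can be arbitrarily nested, so a graph G(V, E) is representable as nested objects. A sketch of the target shape, using stdlib dataclasses with illustrative field names:

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class Node:
    id: str
    label: str

@dataclass
class Edge:
    source: str
    target: str

@dataclass
class Graph:
    nodes: list  # list of Node
    edges: list  # list of Edge

# The nested shape you would describe in the prompt and parse out of the reply.
graph = Graph(
    nodes=[Node("a", "start"), Node("b", "end")],
    edges=[Edge("a", "b")],
)
payload = json.dumps(asdict(graph))  # asdict recurses into nested dataclasses
restored = json.loads(payload)
```

Whether the model reliably fills a deeply nested schema is a separate question from whether the syntax is valid; validation on the parsed result is still worthwhile.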
I really want to know whether this is a better way than function calling.
What's the difference between using JSON mode and function calling, since both extract structured data? Which is more accurate and viable? Thanks for the pydantic tip, btw.
How can I make user_prompt dynamic? Meaning, instead of RPG as the input, I want to send some other input.
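One way to answer the question above: template the changing part of the prompt instead of hard-coding "RPG". The helper name and wording here are made up, not from the video:

```python
def make_user_prompt(topic: str) -> str:
    """Build the user prompt around any topic, not just RPG."""
    return f"Generate a {topic} character sheet as a JSON object."

rpg_prompt = make_user_prompt("RPG")
scifi_prompt = make_user_prompt("sci-fi")
```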
Hi, I also need to get a JSON response from the OpenAI API for my project. The JSON is fairly big, around 5082 tokens, so after some time the response turns into gibberish completely unrelated to how it started. Do you think there are alternatives or any solution to my problem?