Understanding ChatGPT/OpenAI Tokens
- Published: 29 Sep 2024
- This video is from my course "Building an A.I ChatBot with OpenAI and Node.js"
You can get it at 74% off using this link www.udemy.com/...
ChatGPT/OpenAI converts natural language into tokens and uses those tokens both to process requests to their GPT models and to calculate pricing on their API.
In this video, you will learn how natural language is broken down into tokens.
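A rough sketch of the idea described above. The ~4-characters-per-token ratio is OpenAI's published rule of thumb for English text; the actual count depends on the tokenizer, and the per-1K rate below is a made-up placeholder, not a real OpenAI price:

```python
# Rough token/cost estimator: ~1 token per 4 characters of English text.
# The price rate is a hypothetical example, not an actual OpenAI price.
def estimate_tokens(text: str) -> int:
    """Approximate token count using the ~4-chars-per-token heuristic."""
    return max(1, round(len(text) / 4))

def estimate_cost(text: str, price_per_1k_tokens: float) -> float:
    """Estimated dollar cost of a prompt at a given per-1K-token rate."""
    return estimate_tokens(text) / 1000 * price_per_1k_tokens

prompt = "Explain how tokens work in one sentence."
print(estimate_tokens(prompt))  # ~10 tokens by this heuristic
```

For exact counts, OpenAI's Tokenizer tool (or their tokenizer libraries) should be used instead of this heuristic.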
This is the BEST explanation for tokens/token pricing I've come across. Thank you!
Glad it was helpful!
I love it, but it's useless for me because you have to buy tokens through the API. So basically this is about the API, not ChatGPT 4 or 3 itself.
Does use of a custom ChatGPT application, created with OpenAI Builder and hosted on their site, count towards both the User's and/or the Builder's 40-per-3-hour limit? i.e. If someone else uses my application 40 times within 3 hours, then not only will they need to wait, but I, as the app creator, will also need to wait to do any prompt?
What if I use something like MindStudio or AirTable to create an AI application (not hosted on OpenAI) that uses the ChatGPT API.. how does the pricing and/or hourly limit work?
And how does the pricing change between a prompt that requires a simple yes/no response, versus one that requires image analysis?
My goal is to drag a pdf of a handwritten scan to a web app (hosted locally or on the OpenAI marketplace store) that calls GPT-4 Vision to do OCR and extract data from a table, then perform some calculations on it, and finally print the results to a pdf (or append a new page to the original pdf).
Should I perhaps use Perplexity instead (also $20/mo)?
For any app hosted on the GPT store, usage is charged to your account while OpenAI manages the rate limits (not totally sure how the pricing works there; I haven't checked thoroughly enough to give an informed take).
For apps built with services like Airtable, those services handle the rate limits themselves and fold the cost into their pricing tiers. Standard OpenAI pricing, as described in this video, works behind the scenes.
If you're hosting the app yourself using the API, then the pricing structure in this video applies and you're responsible for controlling your own and your users' rate/usage limits.
How do I reduce the cost of ChatGPT? I have a platform but it does not list ChatGPT 4o as an option. How can my platform use this new cheaper model?
@@noelroberts2635 do you own the platform?
I have open ai credits
Are you in Nigeria? If yes, how can I pay for OpenAI credit using a card?
@@SamuelIfeoluwa-r1y I think ChipperCash should work
Thanks for these clear explanations 😊
I'm here after the Nvidia conference, because the CEO talked about tokens, so...
thanks for the video
Thanks for the detailed explanation bro!
What I don't understand: You say it charges me per token. But I pay a monthly fee for ChatGPT Plus. So what exactly does it charge from and how is it related to my Plus fee? How do I know what I have been charged and how do I know when I'm "broke"?
ChatGPT works on a fixed fee where usage is capped, more like an Internet bundle. Token pricing is more granular and applies when you're using the OpenAI API to build AI-powered apps.
How do you guys manage to pay??
This video is criminally underrated. Straight to the point, precise, and was able to easily understand in order to move forward. Thank you!
You're welcome 😊
I agree this is the best explanation I have ever gotten. All of a sudden I am paying so much for ChatGPT; I needed to know why. I have a question: ChatGPT is driving my platform. Each of my AI tools and chatbots has a prompt/instruction on the back end. Does that prompt instruction count towards token usage each time someone makes a request on the front end?
@@noelroberts2635 Glad you found it useful. And to answer your question, yes it does
@@DevTalk Thank you.
How can OpenAI charge you if you're using the free version? And does that mean that if I'm using Pro for GPT-4, not only am I paying for the subscription, I'm also paying for the number of characters I type? Did I understand this video correctly?
They don't charge you for the free version but your prompts have a token limit set on them. That is why an answer can get truncated if it is too long. The paid version works the same way but with a higher token limit. When using the API, the pricing is enforced strictly
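The truncation behavior mentioned above corresponds to the output-token cap on API requests. A sketch of where that cap lives in a Chat Completions request body (this only builds the payload; no network call is made, and the model name is just an example):

```python
# Sketch of a Chat Completions request body. "max_tokens" caps the
# completion length; if the model hits the cap, the reply is cut off.
import json

payload = {
    "model": "gpt-4o",
    "messages": [{"role": "user", "content": "Summarize tokenization."}],
    "max_tokens": 100,  # completion truncated beyond this many tokens
}
print(json.dumps(payload, indent=2))
```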
Thank you for such detailed information!
thanks!
You're welcome
If I have 23 tokens, can I use any sentence of up to 114 characters?
This is where their Tokenizer tool comes in; you can use it to check that.
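As a quick back-of-envelope check, assuming the rough 4-characters-per-token rule for English (the Tokenizer tool gives the exact count for a specific sentence):

```python
# 23 tokens at ~4 characters each covers roughly 92 characters,
# so a 114-character sentence would likely exceed the budget.
tokens_available = 23
chars_per_token = 4  # rough average for English text
max_chars = tokens_available * chars_per_token
print(max_chars, max_chars >= 114)  # 92 False
```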
I am building with OpenAI and I am stuck because of no quota. How do I go about this? I am solely looking at the Whisper aspect.
I have open ai credits if you are interested
@@nelzsignature3022 I am interested
Thank you for this video. It helped me to clear my doubts.
You are welcome!
Appreciated man
I have a paid account. Where can I buy tokens? I'm getting an error in Zapier saying I'm out of tokens.
You can manage your payments and billing at platform.openai.com/account/billing/overview
What if I have an account with $2500 in credits I don't need?
Great. That's helped clarify how tokens are calculated. Thanks bro
You're welcome ☺️
Thanks for the breakdown. I'm still curious how the words get broken down into multiple tokens. Is it a universal thing, an AI thing or something Chat GPT made up to monetize the use of their product?
I believe it's all based on how their natural language model is designed to consume words/phrases. The rough approximation is that a token represents about 75% of a word. And yeah, I understand the perspective that this could be gamed for profit, but I believe there are regulations that curb that.
Tokenization is something that's done in NLP. They've likely based their pricing on how much it costs to "do work" per 1,000 tokens, factoring in overheads like the cost of electricity, the cost of training, cost of staff, etc.
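The per-1,000-token billing mentioned above works out as simple arithmetic. The rates here are illustrative placeholders, not real OpenAI prices; input (prompt) and output (completion) tokens are metered separately:

```python
# Illustrative per-1K-token billing: prompt and completion tokens are
# billed at separate rates. These rates are placeholders, not real prices.
input_rate = 0.01   # $ per 1K prompt tokens (hypothetical)
output_rate = 0.03  # $ per 1K completion tokens (hypothetical)

prompt_tokens = 1500
completion_tokens = 500

cost = (prompt_tokens / 1000) * input_rate + (completion_tokens / 1000) * output_rate
print(f"${cost:.4f}")  # $0.0300
```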
Thank you, this was very helpful!
You're welcome ☺️
Nigeria?
Thank you for the straightforward explanation!
You're welcome, glad you found it useful 😊
Very well explained
Glad it was helpful!