[Image: the power of artificial intelligence]

What is ChatGPT's prompt context size?

When using ChatGPT, we often wonder how much text we can fit into a prompt's context.
To answer this question, open your favorite web browser, type 'openai models' in the search bar, and click the link to OpenAI's models page.
You will land on a page with information about each model OpenAI has developed, in particular its 'max tokens' value, i.e. the maximum number of tokens your prompt can use.
A token is a unit of text that does not necessarily correspond to a single word or a fixed number of syllables: the model splits text into subword pieces, so a long word may span several tokens while a short, common word is usually just one.
For example, at the time I discovered this page, GPT-4 limited the context to 8,192 tokens, and GPT-3.5-turbo to 4,096.
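The check these limits imply can be sketched in a few lines. The numbers below are the snapshot values mentioned above; OpenAI updates its models, so treat them as an assumption to verify against the models page:

```python
# Context-window limits as listed on OpenAI's models page at the time
# of writing -- check the page for current values before relying on them.
MAX_TOKENS = {
    "gpt-4": 8192,
    "gpt-3.5-turbo": 4096,
}

def fits_in_context(n_tokens: int, model: str) -> bool:
    """Return True if a prompt of n_tokens fits within the model's context window."""
    return n_tokens <= MAX_TOKENS[model]
```

So a 5,000-token prompt would fit GPT-4 but not GPT-3.5-turbo.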

Okay, it's all well and good to know the maximum number of tokens I can use, but how do I know how many words that represents, given that a word is not necessarily one token?
This is where the OpenAI tokenizer comes into play.
The procedure is the same as for 'openai models': type 'openai tokenizer' in the browser search bar and click the corresponding link.
Copy and paste the text of your prompt into the box provided, and you will see how many tokens your prompt contains.

[Image: ChatGPT's tokenizer]

With that, let's get to work!


Main picture created by tungnguyen0905



Our lives through technology

This blog is about technology and its impact on our lives. It talks about devices, cybersecurity, coding...
