GPT-4.1 Has a 1M Token Context Window, but ChatGPT Still Limits Users to 32K
One of GPT-4.1’s core features is its 1 million token context window. But in ChatGPT, GPT-4.1 is still limited to 32,000 tokens—the same as GPT-4o.
This limitation isn’t clearly disclosed in the ChatGPT UI, subscription page, or rollout announcement. You only learn it’s API-only by reading developer documentation or forum posts.
If the model supports 1 million tokens and OpenAI already delivers that through the API, why is the core user-facing product—ChatGPT—still capped at 32K?
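For a rough sense of scale, here is a quick comparison of the two limits. This uses the common ~0.75 words-per-token heuristic for English text, which is an approximation rather than an official OpenAI figure:

```python
# Compare GPT-4.1's context cap in ChatGPT vs. the API.
# WORDS_PER_TOKEN is a rough rule of thumb, not an exact conversion.
WORDS_PER_TOKEN = 0.75

chatgpt_limit = 32_000      # reported GPT-4.1 cap in ChatGPT
api_limit = 1_000_000       # GPT-4.1 context window via the API

print(f"ChatGPT cap: ~{int(chatgpt_limit * WORDS_PER_TOKEN):,} words")
print(f"API window:  ~{int(api_limit * WORDS_PER_TOKEN):,} words")
print(f"Gap: {api_limit // chatgpt_limit}x")
```

In other words, API users can feed the model roughly an entire novel series, while ChatGPT users are limited to a long essay's worth of context.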
At a minimum, OpenAI should:
Clearly state token limits within the ChatGPT interface.
Stop advertising 1M-token context in GPT-4.1 materials without a disclaimer.
Provide a roadmap to enable full context support in ChatGPT.
Would appreciate hearing how others feel about this. Shouldn’t Plus users have access to what they’re paying for?