GPT-4.1 Has a 1M-Token Context, yet ChatGPT Still Limits Users to 32K

Asmordikai | 5/27/2025, 6:17:23 AM
I’m a long-time ChatGPT Plus subscriber and started using GPT-4.1 in ChatGPT as soon as it was released.

One of GPT-4.1’s core features is its 1 million token context window. But in ChatGPT, GPT-4.1 is still limited to 32,000 tokens—the same as GPT-4o.

This limitation isn’t clearly disclosed in the ChatGPT UI, on the subscription page, or in the rollout announcement. You only learn that the full 1M context is API-only by reading developer documentation or forum posts.

If the model supports 1 million tokens and OpenAI already delivers that through the API, why is the core user-facing product—ChatGPT—still capped at 32K?
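To make the gap concrete, here is a minimal sketch of what the two limits mean in practice. The window sizes come from the post itself (1M via the API, 32K in ChatGPT); the token counts are illustrative placeholders, not output from a real tokenizer.

```python
# Illustrative only: window sizes are the figures discussed in this post,
# and the 200K "document" is a made-up example, not a measured count.

GPT41_API_CONTEXT = 1_000_000  # GPT-4.1 context window via the API
CHATGPT_CONTEXT = 32_000       # cap reported for GPT-4.1 inside ChatGPT

def fits(prompt_tokens: int, reply_budget: int, limit: int) -> bool:
    """True if the prompt plus a reserved reply budget fit the window."""
    return prompt_tokens + reply_budget <= limit

# A ~200K-token input (e.g. a large codebase or document dump):
doc_tokens = 200_000
print(fits(doc_tokens, 4_096, GPT41_API_CONTEXT))  # True  - fits via the API
print(fits(doc_tokens, 4_096, CHATGPT_CONTEXT))    # False - rejected in ChatGPT
```

In other words, a prompt that the API accepts without issue is roughly six times over ChatGPT's cap.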

At a minimum, OpenAI should:

Clearly state token limits within the ChatGPT interface.

Stop advertising 1M-token context in GPT-4.1 materials without a disclaimer.

Provide a roadmap to enable full context support in ChatGPT.

Would appreciate hearing how others feel about this. Shouldn’t Plus users have access to what they’re paying for?
