Sam Altman Adds Three New Modes To The GPT-5 Model, But You May Not Be Able To Use Them

GPT-5, OpenAI's latest model, now has three new modes that let you pick the one that fits your needs. But there is a catch!

Published By: Divya | Published: Aug 13, 2025, 01:45 PM (IST)

ChatGPT has received another important upgrade, one that puts more control over the AI bot in your hands. OpenAI CEO Sam Altman has revealed in a post on X that the latest GPT-5 model is getting three modes; however, they are available exclusively to paid ChatGPT users.

Sam Altman revealed in a post on X that you will soon be able to choose between the Auto, Fast, and Thinking modes of GPT-5. He added that most users will prefer to stay on Auto mode, which picks the appropriate mode based on your chats. What else has been revealed about the GPT-5 model? Have a look.

GPT-5 Model: What’s New

Sam Altman has also revealed that the weekly rate limit for GPT-5 Thinking mode has been expanded to 3,000 messages. After hitting that limit, users get extra capacity on GPT-5 Thinking mini. For now, the context limit for GPT-5 Thinking is 196k tokens, though Altman mentioned that these limits may change over time depending on usage.

Paid Users Have More Options, Even With 4o

Yes, you heard that right. Sam Altman has confirmed that the 4o model is back in the model picker for all paid ChatGPT users. How will it work?

You will find the "Show Additional Models" toggle in ChatGPT's web settings. It lets you choose between o3, 4.1, and GPT-5 Thinking mini. The 4.5 model, on the other hand, is only available to Pro users, and the reason is simply that it costs OpenAI a lot of GPUs to run!
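For developers, this kind of model choice also exists outside the ChatGPT interface: OpenAI's API lets you name the model on each request. Below is a minimal sketch using the official openai Python SDK; the "gpt-5" identifier is used purely for illustration, and the exact model names available depend on your account and OpenAI's current lineup.

```python
# Minimal sketch: picking a model per request with the openai Python SDK.
# The "gpt-5" name is illustrative; check which models your API key can access.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

response = client.chat.completions.create(
    model="gpt-5",  # swap in another identifier here to use a different model
    messages=[
        {"role": "user", "content": "Summarize today's AI news in two sentences."}
    ],
)

print(response.choices[0].message.content)
```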