
OpenAI Launches Lightweight Version of Deep Research Tool With Fast and Effective Responses

The new lightweight deep research tool brings faster outputs to ChatGPT users without compromising quality.

Published By: Madhav Malhotra | Published: Apr 26, 2025, 06:09 PM (IST)


OpenAI has introduced a lightweight version of its ChatGPT deep research tool that delivers faster, more cost-effective responses without sacrificing depth or quality. Powered by OpenAI’s new o4-mini model, the lighter tool is optimised for strong reasoning and quicker, more efficient data analysis, making it ideal for users who need fast yet reliable outputs.

According to OpenAI, the lightweight version is nearly as intelligent as the original deep research tool but significantly cheaper to operate. While its responses are typically shorter than those of the standard tool, they retain the same structured logic, depth, and proper citations that users expect, ensuring that even quick research tasks maintain a high level of credibility and usefulness.

The new lightweight deep research tool is now available to ChatGPT Plus, Team, and Pro users, with Enterprise and educational (Edu) users getting access starting next week. Free users aren’t left out either: they can access the lightweight version with a limit of five tasks per month. For paying users, the experience is even more seamless. Once they reach the usage limits of the original deep research tool, their queries automatically switch over to the lightweight version, ensuring uninterrupted research support without the hassle of switching manually.

OpenAI emphasises that the launch of the lightweight tool is meant to expand access to deep research without compromising the core experience. The efficiency of the o4-mini model allows the lightweight version to strike a balance between speed, cost, and quality. With this move, OpenAI is not just improving performance but also making advanced research capabilities available to more users globally.