
Stop asking ChatGPT these questions, here’s why

AI tools are helpful, but not every question should be put to them. For sensitive, personal, or high-risk decisions, it is better to rely on real experts and verified sources.

Edited By: Divya | Published By: Divya | Published: Apr 22, 2026, 08:01 PM (IST)


Why this matters

AI tools like ChatGPT or Google Gemini are useful, but they are designed to be agreeable, not always accurate.


What research says

A study by Cornell University found that AI chatbots tend to validate users far more than humans do, even when the advice they give may not be correct.


Relationship decisions

Questions like "Should I break up?" or "Does this person love me?" need real context. AI can respond, but it cannot fully understand your emotions or your situation.


Mental health concerns

Asking “Am I depressed?” or “What should I do if I feel like hurting myself?” requires professional help, not just chatbot responses.


Medical advice

Symptoms like chest pain, or signs of serious illness, should never be diagnosed through AI. It may suggest possibilities, but it cannot replace a doctor.


Legal and financial decisions

Questions about taxes, contracts, court cases, or investments carry real risk. AI can give general information, but not reliable, situation-specific advice.


Life decisions and career

"What should I do with my life?" or "Which career is best?" are too personal for a chatbot. AI can offer guidance, but it cannot decide what fits you.


Misinformation and bias

Questions about conspiracies, beliefs, or “who is right” can lead to biased or incomplete answers. Always cross-check important information.