Published By: Divya | Published: Dec 04, 2025, 03:49 PM (IST)
AI may be fuelling a rise in online scams, but one thing is for sure – it all depends on how you use it. A Reddit post has shown that AI can also be used as a shield if someone tries to scam you. According to the post, a Delhi-based IT professional recently shared how he outsmarted a fraudster running an “Army transfer/cheap goods sale” scam – and ChatGPT was the hero of the story.
Here’s what really happened.
The story began with a Facebook message from an account impersonating a college senior, who happens to be an IAS officer. The impostor claimed that a CRPF officer friend was being transferred and was selling premium furniture and home appliances at unbelievably low prices.
But instead of being tempted by the cheap deals, the user did something smarter – he double-checked. After confirming with the real senior on WhatsApp, it was clear that the message was part of a larger scam operation. The scammer then switched to SMS, using an Army profile picture, sent photos of the goods, and pushed urgently for a QR-code payment. That urgency was the red flag. But the user decided not to block the number just yet. Instead, he went one step deeper.
This is where AI entered the chat. The user opened ChatGPT and prompted it to generate code for a clean-looking payment portal. The portal wasn’t real, though. The goal was simple – make it look like a QR upload page, capture the visitor’s geolocation when opened on a mobile device, take a front-camera snapshot (if permissions were granted), and send the data to a backend.
Within minutes, the code was ready. The user hosted the tracker page and sent the scammer a link saying it would “speed up the payment process” if the QR code was uploaded there.
And it worked. The scammer clicked. As soon as the link opened, the results appeared instantly: a live GPS-based location, the device’s IP data, and a clear front-camera photo. Instead of waiting any longer, the user sent the scammer his own photo and location details.
Within minutes, the scammer started calling repeatedly from different numbers – pleading, apologising, and asking for forgiveness. He even claimed he would “leave scams forever.” The user knew that probably wouldn’t happen, but the impact was real: the scammer was scared and regretful, and, most importantly, his operation was stopped, at least for the moment.