ChatGPT-related scams are surging: even though OpenAI offers a free version of ChatGPT, scammers lure victims to fraudulent websites and claim they must pay for these services, a report said on Thursday. Using these chatbots carries another significant risk.
“They might collect and steal the input you provide. In other words, providing anything sensitive or confidential could put you in danger. The chatbot’s responses could also be manipulated to give you incorrect answers or misleading information,” according to researchers from Palo Alto Networks Unit 42.
Unit 42 observed a 910 percent increase in monthly registrations for domains related to ChatGPT between November 2022 and April 2023.
The researchers also recorded more than 100 detections per day of ChatGPT-related malicious URLs, captured from traffic seen in the company's Advanced URL Filtering system. They also spotted nearly 18,000 percent growth in squatting domains in DNS security logs over the same period. Scammers might use ChatGPT-related social engineering for identity theft or financial fraud.
“The fake ChatGPT sites try to lure victims into providing their confidential information, such as credit card details and email addresses,” the report added.
Some scammers are also exploiting the growing popularity of OpenAI for crypto fraud, using Elon Musk’s name to draw victims to fraudulent cryptocurrency giveaway events.
“Whether or not they’re offered free of charge, these copycat chatbots are not trustworthy. Many of them are actually based on GPT-3 (released June 2020), which is less powerful than the recent GPT-4 and GPT-3.5,” the report mentioned.
“To stay safe, ChatGPT users should exercise caution with suspicious emails or links related to ChatGPT. Moreover, the usage of copycat chatbots will bring extra security risks. Users should always access ChatGPT through the official OpenAI website,” the report emphasised.
— IANS
Author: Shubham Verma