Check Point highlights new ChatGPT account theft concerns
Check Point Research (CPR) has identified a growing trend on the dark web and in the hacking underground involving the theft and sale of stolen ChatGPT Premium accounts. Despite OpenAI's geofencing capabilities, cybercriminals have exploited the ChatGPT API to bypass these restrictions, using account checkers and bruteforcing tools such as SilverBullet to gain unauthorised access.
“Since December 2022, we have raised concerns about ChatGPT’s implications for cybersecurity. Now, we warn that there is an increase in the trade of stolen ChatGPT Premium accounts, which enable cybercriminals to get around OpenAI’s geofencing restrictions and get unlimited access to ChatGPT,” say CPR researchers.
“The market for account takeovers (ATOs), stolen accounts for different online services, is one of the most flourishing markets in the hacking underground and on the dark web. Traditionally, this market’s focus was on stolen financial services accounts (banks, online payment systems, etc.), social media, online dating websites, emails, and more.”
CPR reports that since March 2023 it has seen an increase in the discussion and trade of stolen ChatGPT accounts, with a focus on premium accounts.
“This includes the leak and free publication of credentials to ChatGPT accounts; the trade of stolen premium ChatGPT accounts; bruteforcing and checker tools for ChatGPT – tools that allow cybercriminals to hack into ChatGPT accounts by running huge lists of email addresses and passwords, trying to guess the right combination to access existing accounts; and ChatGPT Accounts as a Service – a dedicated service that offers to open ChatGPT premium accounts, most likely using stolen payment cards,” adds the CPR team.
OpenAI imposes geofencing restrictions that block access to ChatGPT from certain countries (including Russia, China and Iran). However, CPR has recently highlighted that the ChatGPT API allows cybercriminals to bypass these restrictions and use ChatGPT's premium features. All of this is driving increasing demand for stolen ChatGPT accounts, especially paid premium accounts.
“Meanwhile, during the last few weeks there have been discussions on ChatGPT's privacy issues, with Italy banning ChatGPT and Germany considering banning it as well. We highlight another potential privacy risk of this platform. ChatGPT accounts store the recent queries of the account's owner. So when cybercriminals steal existing accounts, they gain access to the queries from the account's original owner. This can include personal information, details about corporate products and processes, and more,” CPR notes.
Cybercriminals often exploit the fact that users recycle the same password across multiple platforms. Using this knowledge, malicious actors load combinations of email addresses and passwords into dedicated software (known as an account checker) and run it against a specific online platform to identify which credentials successfully log in. An account takeover occurs when a malicious actor takes control of an account without the account holder's authorisation.
During the last month, CPR observed increased chatter in underground forums about leaking or selling compromised ChatGPT premium accounts. Most of these stolen accounts are sold, but some actors share stolen ChatGPT premium accounts for free to advertise their services or the tools they use to steal the accounts.
The CPR team elaborated on the tools used to hack into ChatGPT premium accounts.
SilverBullet is a web testing suite that allows users to perform requests against a target web application and offers numerous tools for working with the results. It can be used for scraping and parsing data, automated penetration testing, unit testing through Selenium and much more. However, cybercriminals also frequently use it to conduct credential stuffing and account-checking attacks against different websites, stealing accounts for online platforms.
Because SilverBullet is a configurable suite, a checking or bruteforcing attack against a particular website requires a "configuration" file that tailors the process to that website and allows cybercriminals to steal its accounts in an automated way.
In this specific case, CPR identified cybercriminals offering a configuration file for SilverBullet that checks a set of credentials against OpenAI's platform in an automated way, enabling them to steal accounts at scale. The process is fully automated and can run between 50 and 200 checks per minute (CPM). It also supports proxies, which in many cases allows it to bypass website protections against such attacks.
Another cybercriminal, who focuses exclusively on abuse and fraud against ChatGPT products, has even named himself "gpt4". In his threads, he offers ChatGPT accounts for sale, along with a configuration for another automated tool that checks the validity of credentials.
On March 20, an English-speaking cybercriminal began advertising a ChatGPT Plus lifetime account service with 100% satisfaction guaranteed. A lifetime upgrade of a regular ChatGPT Plus account (opened via an email address provided by the buyer) costs US$59.99, while OpenAI's legitimate pricing for this service is US$20 per month. To reduce costs, the underground service also offers shared access to a ChatGPT account with another cybercriminal for US$24.99 for a lifetime. Several underground users have left positive feedback for the service and vouched for it.
“As in other illicit cases where a threat actor provides services at a price significantly lower than the original legitimate one, we assess that the payment for the upgrade is made using previously compromised payment cards,” says CPR.