Fake ChatGPT apps are beginning to distribute malware and steal credit card information

Feb 24, 2023

Last week, I wrote about why you should avoid downloading ChatGPT apps for Android and iOS. Now you can add another item to the list: malware.

Fake ChatGPT apps are distributing malware on Android and Windows
Over the past few months, ChatGPT has steadily been growing in popularity. People are using it for fun, to learn new things, for research, and to write programs; students have even been caught using the chatbot to do their homework.

Image courtesy: Cyble

There's no denying that OpenAI's tool has become an internet sensation. Unsurprisingly, the company introduced a premium tier called ChatGPT Plus. The subscription, which costs $20 a month, is available for users in the U.S. and grants unrestricted access to the chatbot. Unfortunately, ChatGPT's rising popularity has also drawn the attention of hackers. Martin wrote an article about info-stealing malware called Stealc, in which he mentioned another strain named Redline.

Fake ChatGPT apps being used to spread malware and steal user data

Attackers have packaged the Redline malware in a fake ChatGPT app for Windows. The tool was analyzed by security researcher Dominic Alvieri, who discovered that it redirected users to a domain that infected visitors with the Redline malware.

Fake ChatGPT social media pages used for phishing campaigns

A report published by Cyble Research and Intelligence Labs (CRIL) goes into more detail about how the malware was being distributed. The findings reveal that threat actors were using a Facebook page to promote the malicious app; the page even sported ChatGPT logos to make it look like the real deal, a tactic known as malvertising.

Image via: Cyble

BleepingComputer reports that fake ChatGPT apps were also pushing malware on the Google Play Store. Cyble identified over 50 fake ChatGPT Android apps that were used for nefarious purposes such as SMS billing fraud to activate premium subscriptions, and that contained different types of malware (adware, spyware) to steal call logs, contacts, messages, media files, and other data. These fake apps used ChatGPT's name and icon (logo) to trick users.

ChatGPT phishing campaigns

CRIL also discovered that hackers were running phishing campaigns using clones of ChatGPT's website. The attackers modified the Try ChatGPT button so that clicking it initiated a download of a malicious file (Lumma Stealer, Aurora Stealer, clipper malware, and others). The threat actors also used cloned ChatGPT websites to lead users to a fake ChatGPT payment page designed to steal their credit card information.

Recently, a number of fake authenticator apps made their way onto the iOS App Store after Twitter announced it would end support for SMS 2FA for free accounts. Hackers wasted no time exploiting the situation, trying to scam users with expensive subscriptions in the fake 2FA apps, and even using them to steal users' 2FA QR codes. Apple has delisted some of these apps from its storefront, but many of them still exist.

Google and Apple need to improve their review checks before allowing apps on their stores. There is no official ChatGPT app for Windows, Android, iOS, etc. If you want to use ChatGPT on your phone, just use the official website at chat.openai.com. Bookmark the page, or add a shortcut for it on your mobile's home screen. You may also be interested in Bing Chat, which is now available in three of Microsoft's apps for Android and iOS. Access to the AI-powered tool is currently only available via a waitlist.




  1. Martin P. said on February 24, 2023 at 2:01 pm


    Wouldn’t the title of your article be:

    « Fake ChatGPT apps are beginning… »

    instead of:

    « Fake ChatGPT apps are being… »


    1. Martin Brinkmann said on February 24, 2023 at 2:38 pm

      Thanks Martin, I changed the title!

      1. Tom Hawack said on February 24, 2023 at 2:59 pm

        Nice to see that Martin Brinkman has the authority to correct an article written by another author.

        Interesting article by the details and explanations provided, but not a surprise.

  2. boris said on February 24, 2023 at 10:15 am

    “Google and Apple need to improve their review checks before allowing apps on their stores.”

    They are more interested in cutting jobs rather than improving their security. Customers are just afterthought for big tech.
