FraudGPT: Another malicious chatbot emerged
Earlier this month, a hacker was found selling WormGPT, a ChatGPT-like bot that lets users generate malware and phishing emails. Now, security researchers have come across a brand-new malicious chatbot named "FraudGPT."
Over the weekend, the owner of FraudGPT began promoting the malicious chatbot on a hacker forum, claiming that the tool will change the community and the way its users work forever.
What is FraudGPT?
FraudGPT offers functionality similar to WormGPT's: users type commands into a chat box, and the bot responds accordingly. In a video demonstration, FraudGPT quickly produces a convincing SMS phishing message impersonating a bank. The bot can also be asked to list the websites best suited to credit card fraud, and it can supply bank identification numbers (BINs) for cards not protected by Visa verification, helping the user exploit stolen credit cards.
The creator of FraudGPT also appears to sell stolen credit card numbers and other hacker-obtained data, and offers guides on how to commit fraud. It is therefore possible that the chatbot service will incorporate all of this data.
FraudGPT spreads on Telegram and the dark web
FraudGPT has been circulating on Telegram channels since July 22, according to research published today by Rakesh Krishnan, a senior threat analyst at cybersecurity company Netenrich. According to Krishnan's report, the AI bot is intended purely for offensive use, such as crafting phishing emails, building attack tools, and carding. The tool is currently for sale on Telegram and on several dark web marketplaces.
It is important to remember that FraudGPT is a dangerous chatbot built to aid cybercriminals, and it should not be used for any purpose. Understanding the dangers of FraudGPT and its potential impact can help us better appreciate the value of using technology ethically and responsibly.