**AI Made Crypto Scams Far More Dangerous**
The first half of 2025 saw one of the worst waves of crypto hacks to date, with more than $3.01 billion stolen. The use of artificial intelligence (AI) played a significant role in making these scams easier to run and allowing even low-skill criminals to join the fray.
In the United States alone, nearly 160,000 crypto-related fraud complaints were reported in 2024. According to Norah Beers, CISO at Grayscale, "The adversaries themselves aren't fundamentally different between traditional finance and the crypto industry, but certain of the tactics they employ are distinct and the sophistication of attackers in the crypto space is notably higher."
AI is fueling a new wave of attacks that are becoming increasingly sophisticated. Attackers use AI to analyze data from social media, online forums, and blockchain transactions. By correlating data across these sources, they can detect patterns and choose potential victims for phishing or impersonation campaigns.
Fake websites, social media accounts, and videos have become so realistic that it is difficult to determine their authenticity. Some criminals use fake trading bots to display false profits or give misleading signals, encouraging users to deposit funds or follow bad financial advice. Entire trading platforms or apps can be built around fake algorithms that promise high returns but steal deposited crypto.
Scammers can bypass KYC checks using generated images or credentials. Bots appear in crypto communities on Discord and Telegram, impersonating moderators or project admins to trick users into sharing wallet details or clicking malicious links. Others mimic support agents in live chats to steal login credentials or recovery phrases.
Long-term scams, known as "pig butchering," involve building trust over weeks or months before convincing victims to invest large sums in fake platforms. Researchers have found that cybercriminals are selling deepfake tools and services on forums, social media, and messaging platforms. These tools let users create fake audio, video, and images, including face swaps and deepfake videos.
Prices for these tools vary: face-swapping services like Swapface range from free tiers to $249 per month, while custom deepfake videos usually cost $60 to $500, depending on complexity and quality. A single attacker can now use AI to create and manage thousands of phishing messages, fake support agents, or investment bots.
Deepfake crypto scams are particularly prevalent on TikTok and YouTube, which have billions of active users. These scams often operate under a mask of legitimacy, using deepfake videos of well-known figures like Elon Musk, MrBeast, or Donald Trump to lure users into fraudulent cryptocurrency schemes.
For example, the National Cyber Security Centre (NCSC) reported an AI-assisted crypto scam on YouTube. The channel featured a likely AI-generated crypto expert and gained over 100,000 followers in one day. Videos instructed viewers to run code that supposedly activated developer mode in TradingView (a charting and trading platform); the code instead installed malware that stole passwords, email access, and crypto wallet contents.
New York authorities recently froze $300,000 in stolen cryptocurrency and shut down more than 100 scam websites tied to a Vietnam-based group that targeted Russian-speaking residents in Brooklyn with fake Facebook investment ads. The consequences of these scams are not only financial; they also erode trust and make people question the security of crypto exchanges.
Investors will think twice about investing if they cannot distinguish legitimate communications from sophisticated deepfake impersonations. As the use of AI in crypto scams continues to evolve, it's essential for crypto firms to take proactive steps to defend their users against these scams.
**How Can Crypto Firms Avoid Scams?**
To protect themselves and their users, crypto firms can take several measures:
* Implement robust KYC checks using machine learning algorithms to detect generated images or credentials.
* Use AI-powered monitoring tools to track suspicious activity on social media and online forums.
* Educate users about the risks of deepfake scams and provide guidance on how to spot fake content.
* Collaborate with law enforcement agencies to shut down scam websites and disrupt phishing campaigns.
* Invest in research and development to stay ahead of emerging threats and develop new technologies to counter them.
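The first measure, KYC checks that screen for AI-generated imagery, can be sketched as a risk-scoring step that combines the outputs of upstream detection models into a coarse decision. Everything below (field names, scores, and thresholds) is an illustrative assumption, not a real vendor API:

```python
from dataclasses import dataclass

@dataclass
class KycSignals:
    """Hypothetical signals a KYC pipeline might collect per submission."""
    liveness_score: float         # 0-1, from a liveness-detection model
    face_match_score: float       # 0-1, selfie vs. ID-document face
    synthetic_image_score: float  # 0-1, higher = more likely AI-generated
    document_metadata_ok: bool    # issuer/metadata consistency checks passed

def assess_kyc(s: KycSignals) -> str:
    """Combine model outputs into 'approve', 'review', or 'reject'."""
    # Strong evidence of a generated image or tampered document: reject outright.
    if s.synthetic_image_score > 0.8 or not s.document_metadata_ok:
        return "reject"
    # Weak liveness or face match: route to a human reviewer.
    if s.liveness_score < 0.5 or s.face_match_score < 0.6:
        return "review"
    # Borderline generated-image signal: human review rather than auto-approve.
    if s.synthetic_image_score > 0.4:
        return "review"
    return "approve"

# A submission whose ID photo scores high on the generated-image check:
print(assess_kyc(KycSignals(0.9, 0.85, 0.92, True)))  # prints "reject"
```

The point of the sketch is the layered decision: hard rejections only on high-confidence signals, with ambiguous cases escalated to human review instead of being auto-approved, which limits the damage when any single detector is fooled.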
By taking these steps, crypto firms can help prevent the spread of AI-powered scams and protect their users' financial security.