Most people receive scam phone calls on a weekly basis, and usually they are just minor inconveniences. However, the FBI is now warning that scammers are escalating their tactics in a new way: by using artificial intelligence to impersonate federal officials.
Scammers can reportedly use AI programs to trick unsuspecting Americans into sharing personal information or sending money to fake accounts. According to the FBI, these scammers have been impersonating "current or former senior U.S. federal or state government officials" to target individuals, creating voice messages that claim to come from a senior official to establish rapport before gaining access to personal accounts.
The scammers may also send text messages posing as government officials; the FBI refers to these tactics as smishing and vishing. Smishing is "malicious targeting using text messages," while vishing uses audio messages that may include AI-generated voices, said USA Today. Both are similar to phishing scams perpetrated via email.
If scammers obtain your personal contact information, they could use it to impersonate people you know and elicit information or funds from you, according to the FBI. It's unclear what the end goal of these hacking efforts is or who is behind them, but AI tools have made it easier for scammers and spies to impersonate friends, relatives, and colleagues of just about anyone, said CNN.
To protect yourself, you should be skeptical of unsolicited content featuring public officials, according to Newsweek. The FBI has provided some specific instructions on how to avoid becoming a victim of these scams. The most important step is to verify the identity of the person calling you or sending text or voice messages before responding.
You should research the originating number, organization, and/or person purporting to contact you. If a voice message is left, listen closely to the tone and word choice; the FBI says this can help distinguish a legitimate call or message from a known contact from AI-generated voice cloning, which can sound nearly identical.
Even the FBI admits that AI-generated content has advanced to the point where it's often difficult to identify. The agency urges anyone who is unsure to contact relevant security officials or the FBI for help. The best way to protect your information is to never share sensitive information or an associate's contact information with people you have met only online or over the phone.
Additionally, do not send money, gift cards, cryptocurrency, or other assets to people you do not know. This is not the first time that foreign scammers have worked to undermine Americans. Throughout the 2024 presidential election season, scammers created content designed to deceive Americans, said NBC News. Russia was notably accused of masterminding two sprawling influence campaigns aimed at swaying American voters.
Stay safe and vigilant: be cautious when receiving unsolicited calls or messages from individuals claiming to be government officials, verify their identity, and never share sensitive details with unknown individuals or send money without confirmation.