Impersonation Schemes Using AI Target US Officials, FBI Warns
Hackers are escalating their phishing tactics, using AI-generated voice deepfakes and text messages to impersonate high-ranking US officials, according to a warning issued by the FBI on 15 May.
The campaign, active since April, has targeted both current and former federal and state officials in an effort to extract sensitive information under the guise of legitimate communication.
The agency said:
“If you receive a message claiming to be from a senior US official, do not assume it is authentic.”
The FBI cautions that if one official’s account is breached, the fallout could rapidly widen.
Compromised credentials may allow attackers to exploit trusted contact lists to reach additional government personnel or associates—amplifying the scale and credibility of the scam.
These operations often rely on malicious links that lure victims to spoofed platforms, where login credentials and other sensitive data can be quietly harvested.
The agency added:
“Contact information acquired through social engineering schemes could also be used to impersonate contacts to elicit information or funds.”
Hackers Use Deepfakes to Target Prominent Crypto Figures
In a separate deepfake-related scam, Polygon co-founder Sandeep Nailwal issued a public warning on 13 May via X (formerly known as Twitter), revealing that bad actors had used AI-generated deepfakes to impersonate him in live video calls.
“The attack vector is horrifying,” Nailwal wrote, noting that multiple people had contacted him on Telegram to ask whether he had recently joined Zoom calls with them or asked them to install a script, neither of which he had done.
According to Nailwal, the attackers gained control of the Telegram account of Shreyansh Singh, Polygon's head of ventures, and used it to lure victims into Zoom meetings featuring deepfakes of Nailwal, Singh, and a third unidentified individual.
Nailwal noted:
“The audio is disabled and since your voice is not working, the scammer asks you to install some SDK, if you install game over for you. Other issue is, there is no way to complain this to Telegram and get their attention on this matter. I understand they can’t possibly take all these service calls but there should be a way to do it, maybe some sort of social way to call out a particular account.”
At least one user confirmed being targeted, and prominent Web3 investor Dovey Wan also reported falling victim to a similar impersonation scheme.
The incident underscores a growing threat in the crypto space, where deepfakes are being used to build false credibility in real time.
Nailwal advised users to avoid installing any software during unsolicited online interactions and to use a dedicated device for crypto transactions.
Meanwhile, the FBI recommends a multi-pronged approach to avoid falling for these scams: verify identities through known channels, scrutinise sender addresses and media content for inconsistencies, and avoid sharing sensitive information or clicking on suspicious links.
Enabling multi-factor authentication remains one of the most effective defences against unauthorised access.