
Scammers Use AI to Impersonate U.S. Secretary of State Marco Rubio

Scammers send voicemails and texts to senior government officials impersonating Secretary of State Marco Rubio.

What Happened?

Last week, scammers using artificial intelligence programs that can duplicate voices left messages for several world leaders, claiming to be U.S. Secretary of State Marco Rubio. According to a State Department cable, the impostor sent fake voice messages and texts that mimicked Rubio’s voice and writing style to targets including three foreign ministers, a U.S. governor, and a member of Congress.

The cable, which was dated 3 July, said the impostor ‘left voicemails on Signal for at least two targeted individuals’ and sent text messages inviting others to communicate on the platform.

Why it Matters

AI programs that can duplicate voices and images have given scammers the ability to impersonate nearly anyone online. While some AI programs produce sounds or images that can easily be spotted as fake, others are more sophisticated, generating authentic-looking and authentic-sounding content nearly indistinguishable from the real thing.

Scammers have been using these AI programs to leave fake voice messages or to send phony texts to intended victims in order to try to conduct a number of different fraudulent activities. For the most part, these scams seek money or personal identifying information. A scammer might duplicate the voice of a friend or loved one, then send the recipient a message asking for money, for example. These types of scams have been on the rise over the past several years.


The use of the same AI programs to impersonate a senior government official is rarer, but not unprecedented. In May, a scammer impersonated White House Chief of Staff Susie Wiles in an attempt to gain access to information about other senior government officials. Both the Wiles and Rubio impersonations are under investigation by law enforcement agencies, but identifying the perpetrator or perpetrators will be difficult.

Because the AI technology in question is widely available and free, anyone with internet access can use it. While digital impersonations used to require a high degree of skill in computer science and coding, now, thanks to AI, virtually anyone can do it. That makes the number of potential suspects almost unlimited.

A year ago, a likely Russian scam circulated fake video messages of Ukrainian President Zelenskyy telling Ukrainian military forces to surrender. While the Ukrainian forces did not fall for the con, this kind of deception is likely to increase as AI technology becomes refined enough that phony videos are indistinguishable from real ones.

How it Affects You

Scams involving AI duplication of voices and images are a threat to everyone, not just government officials. Even when the audio and video imagery appear authentic, a last line of defense for potential victims is to watch for behavior that is out of character. If you get a voicemail that sounds like someone you know, but the content of the message is something you know that person would never say, that is an indicator it could be fraudulent.