The U.S. State Department warned of AI-driven impersonation scams, including a case in which an impostor used AI in attempts to contact foreign officials and U.S. politicians, underscoring growing concern that deepfake technology can be used to deceive high-level officials.
A U.S. diplomatic cable reports that an unknown actor used AI to impersonate Secretary of State Marco Rubio, contacting multiple officials in an attempt to gain access to information, while a separate Russia-linked campaign targeted the personal Gmail accounts of officials and activists, highlighting ongoing cyber threats against U.S. and allied personnel.
An unknown actor used AI to impersonate U.S. Secretary of State Marco Rubio via a Signal account, contacting officials and attempting to manipulate them, prompting an investigation by the State Department. The incident highlights ongoing cybersecurity threats involving impersonation and espionage, including previous related incidents linked to Russian cyber actors.
The campaign of Democratic presidential candidate Dean Phillips is distancing itself from consultant Steve Kramer, who allegedly commissioned a robocall that used AI to impersonate President Joe Biden during the New Hampshire primary election. The call urged voters not to vote in the primary and to save their vote for the November election. Phillips's campaign confirmed that Kramer worked on its ballot-access efforts but said it had no knowledge of his reported role in the robocall. The campaign denounced Kramer's alleged actions and confirmed he is no longer working for it. New Hampshire's attorney general has made the call the subject of a criminal investigation, and senior U.S. law enforcement officials are monitoring the incident.
The Federal Trade Commission is proposing new rules to combat AI-driven impersonation fraud, seeking public comment on prohibiting the impersonation of individuals and on extending those protections to cover AI-generated deepfakes. The agency has also finalized a rule targeting scammers who impersonate businesses or government agencies, allowing the FTC to seek monetary relief directly in federal court from those engaged in such fraud. The proposed changes respond to fraudsters' growing use of AI tools to impersonate individuals, as the FTC works to strengthen its toolkit against AI-enabled scams and protect consumers from impersonator fraud.