In the final weeks of the 2020 election campaign, Chinese and Iranian operatives created fake, AI-generated content in an attempt to influence US voters, according to current and former US officials briefed on the intelligence. Although the deepfake audio and video were never disseminated publicly, the incident raised concerns about foreign powers spreading false information about the voting process. The National Security Agency collected intelligence on China's and Iran's capabilities for producing deepfakes, reflecting fears about how artificial intelligence could be exploited to mislead voters.

As the technology to produce deepfake audio and video becomes more accessible, US officials have grown increasingly worried that foreign influence campaigns will use AI to propagate misinformation. In a simulated exercise preparing for the 2024 election, senior US officials grappled with how to respond to a scenario in which Chinese operatives created a fake, AI-generated video depicting a Senate candidate destroying ballots. FBI officials have emphasized AI's role in amplifying election disinformation and manipulating public opinion.

While the content of the deepfakes prepared by Chinese and Iranian operatives in 2020 remains unknown, some US officials were initially skeptical that they would have affected the presidential election. A former senior official noted that advances in technology, such as ChatGPT, have made deepfake production more sophisticated in recent years. The challenge, however, lies in effectively acting on intelligence about AI advancements and foreign influence efforts inside the US.

US officials continue to monitor the advancements in AI and deepfake technology made by countries like China, Iran, and Russia since the 2020 election. Concerns remain about the ability to quickly identify and counter anomalous activities that could influence US elections. The Senate Intelligence Committee is set to hold a hearing on foreign threats to elections, where officials will address the potential risks posed by deepfakes and foreign influence operations.

While Iranian operatives attempted to influence voters in 2020 by impersonating the Proud Boys and disseminating false information, they did not deploy deepfakes. The effectiveness of foreign influence operations relies on cultural understanding and the ability to resonate with the American public. Generative AI has increased the efficiency of content creation, but its impact on campaign effectiveness remains uncertain. Experts caution against overemphasizing the influence of foreign AI-driven operations, noting that distribution remains a significant challenge.

Despite the growing concerns about foreign influence operations, the US remains vulnerable to conspiracy theories and misinformation. Public trust in government institutions is at historic lows, and a significant portion of Republicans do not believe President Biden's election win was legitimate. Ahead of the 2024 election, there are fears that the conflict in Ukraine could serve as an animating event for Russian interference or influence operations. The FBI is particularly concerned about the potential for foreign actors to disrupt the election through disinformation campaigns.
