A political consultant in New Hampshire has made his first court appearance after being charged with voter suppression and impersonating a candidate for sending artificial intelligence-generated robocalls mimicking President Joe Biden’s voice. Steven Kramer, who also faces a proposed $6 million fine from the Federal Communications Commission, admitted to orchestrating a message that was sent to thousands of voters before the state’s presidential primary. The AI-generated voice falsely suggested that voting in the primary would prevent voters from casting ballots in November. Kramer faces multiple felony and misdemeanor charges for violating New Hampshire’s law against using misleading information to deter voters and for falsely representing himself as a candidate.

At Kramer’s arraignment, the Assistant Attorney General argued for a $10,000 cash bail to ensure Kramer returns to court. Kramer’s attorney argued for personal recognizance bail, stating that Kramer has a history of appearing at regulatory proceedings and has never missed a court date. Kramer declined to comment as he left the courthouse, with his attorney stating that they are reviewing the charges and will engage in discussions with the attorney general’s office.

Kramer, who owns a firm specializing in get-out-the-vote projects, previously stated that he wasn’t trying to influence the outcome of the primary election but rather wanted to highlight the potential dangers of artificial intelligence. Voter suppression and impersonating a candidate carry significant penalties in New Hampshire, with voter suppression carrying a prison sentence of 3 1/2 to 7 years and impersonating a candidate punishable by up to a year in jail. Since the incident, the FCC has taken steps to combat the use of artificial intelligence tools in political communications, banning AI voice-cloning tools in robocalls and introducing a proposal to require political advertisers to disclose the use of AI-generated content in broadcast ads.

The new rules proposed by the FCC aim to add transparency to political advertising and combat the use of generative AI tools that can create lifelike images, videos, and audio clips to potentially mislead voters. The charges against Kramer were announced on the same day the FCC proposed fines against him and the company accused of transmitting the robocalls. Lingo Telecom, the company involved, strongly disagreed with the FCC’s actions, calling them an attempt to impose new rules retroactively. The proposed fines mark the agency’s first enforcement actions related to generative AI technology, highlighting growing concern over the use of AI in political communications and the need for increased transparency in the electoral process.
