Steven Kramer, a Democratic political consultant, faces charges for using AI to create robocalls mimicking President Biden's voice, urging New Hampshire residents not to vote in the Democratic primary.

Steven Kramer, a Democratic political consultant from New Orleans, has been indicted for his role in an alleged voter suppression scheme involving AI-generated robocalls. According to the New Hampshire Attorney General’s Office, Kramer used artificial intelligence to mimic President Joe Biden’s voice in thousands of calls sent to New Hampshire residents in January, urging them not to vote in the Democratic primary.


The indictment, issued on May 23, accuses Kramer of impersonating a candidate during New Hampshire’s Democratic primary election. Kramer was working for rival candidate Dean Phillips at the time. The deepfake calls delivered a misleading message to potential voters, telling them to "save [their] vote for the November election" and implying that their votes would matter more in November than on primary day.


Attorney General John Formella has brought 13 felony voter suppression charges and 13 misdemeanor impersonation charges against the 54-year-old Kramer. The Federal Communications Commission (FCC) has also proposed a $6 million fine against Kramer, stating that the deepfake robocalls violated caller ID regulations.


Additionally, Lingo Telecom, the phone company that transmitted the calls, faces a proposed $2 million fine from the FCC. The calls were incorrectly labeled with the highest level of caller ID attestation, making it harder for other providers to detect them as spoofed calls.
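For context, the "attestation" at issue comes from the STIR/SHAKEN caller ID authentication framework, under which the originating carrier signs each call with an attestation level of "A" (full), "B" (partial), or "C" (gateway); an "A" label tells downstream providers the carrier has verified the caller's right to use the displayed number. The sketch below is illustrative only and not part of the reporting: it decodes a hypothetical, unsigned PASSporT token (the JWT carried in a SIP Identity header) to read its attestation claim. The example token and helper name are assumptions for demonstration, and real verification would also check the signature.

```python
import base64
import json


def attestation_level(passport_jwt: str) -> str:
    """Return the STIR/SHAKEN attestation claim ("A", "B", or "C")
    from a PASSporT JWT carried in a SIP Identity header.

    Illustrative sketch only: signature verification is omitted,
    so this merely decodes the payload and reads the "attest" claim.
    """
    payload_b64 = passport_jwt.split(".")[1]
    # Restore base64url padding before decoding.
    payload_b64 += "=" * (-len(payload_b64) % 4)
    claims = json.loads(base64.urlsafe_b64decode(payload_b64))
    return claims.get("attest", "unknown")


# Hypothetical, unsigned example token for demonstration purposes.
header = base64.urlsafe_b64encode(
    json.dumps({"alg": "ES256", "typ": "passport"}).encode()
).decode().rstrip("=")
payload = base64.urlsafe_b64encode(
    json.dumps({"attest": "A", "origid": "example"}).encode()
).decode().rstrip("=")
example_jwt = f"{header}.{payload}.signature"

print(attestation_level(example_jwt))  # -> "A" (full attestation)
```

In this framing, the FCC's allegation is that calls which should never have carried full "A" attestation were signed as if they had been verified, which is why other providers had a harder time flagging them as spoofed.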


Attorney General Formella emphasized the importance of these enforcement actions as a deterrent against future election interference, especially through the use of artificial intelligence. “I hope that our respective enforcement actions send a strong deterrent signal to anyone who might consider interfering with elections, whether through the use of artificial intelligence or otherwise,” he stated.


In defense of his actions, Kramer told NBC News in February that he viewed the robocalls as an act of civil disobedience to highlight the dangers of AI in politics. “This is a way for me to make a difference, and I have,” he said, claiming that the $500 spent on the robocalls generated about $5 million worth of media and regulatory attention.


This incident has heightened concerns about AI-generated content misleading voters ahead of the 2024 elections. The Biden campaign has reportedly assembled an interdepartmental team to counter potential threats from malicious AI-generated deepfakes, according to Reuters.


In March, Cointelegraph reported an increase in AI-generated deepfakes during the election season, stressing the importance for voters to recognize such content. Furthermore, in February, twenty major AI tech companies pledged to prevent their software from influencing elections.


As AI technology advances, the potential for its misuse in political contexts becomes increasingly significant. The case against Steven Kramer underscores the need for vigilance and robust regulatory measures to protect the integrity of the electoral process.


(MARTIN YOUNG, COINTELEGRAPH, 2024)