Elon Musk Boosted a Kamala Harris Deepfake. He’s Only Getting Started

The parody ad, which called the vice president a “deep state puppet,” underscores how AI voice-cloning might sow chaos in the election to come—especially when promoted by people with huge followings.
Elon Musk, chief executive officer of Tesla Inc., at the US Capitol in Washington, DC, US, on Wednesday, July 24, 2024. Samuel Corum/Bloomberg via Getty Images

Elon Musk, facing a torrent of backlash after sharing an AI “parody” of a Kamala Harris campaign ad last week, responded to his critics Monday in characteristically puerile fashion. “I checked with renowned world authority, Professor Suggon Deeznutz, and he said parody is legal in America,” the right-wing billionaire wrote on the social media site he owns. “Not to mention Pullitsir Prize winner Dr Head, first name Dick,” he added.

It was a typical Musk response—astonishingly unfunny; defiant in the most embarrassing way possible—that came after California Governor Gavin Newsom promised to sign a bill outlawing deepfakes like the one Musk shared, which featured an AI-generated likeness of Harris’s voice describing herself as the “ultimate diversity hire.” “I am both a woman and a person of color, so if you criticize anything I say you’re both sexist and racist,” the mock-Harris says in a voiceover. “I may not know the first thing about running the country, but remember, that’s a good thing if you’re a deep state puppet.”

“This is amazing,” Musk wrote of the video, cut to look like a campaign ad, when he shared it Friday.

The substance of the video was far from amazing; the satire was about at the level of the Babylon Bee, which Musk also seems to find hysterical. But the sophistication of the voice-cloning underscores concerns about how AI could be used to sow chaos in the upcoming election—and about how Musk may seek to wield his influence against Harris and the Democrats.

Indeed, Musk has assailed Harris since she became the Democratic nominee: Over the weekend, he described her as an “extinctionist” who could usher in a “de facto holocaust for all of humanity.” He also endorsed Donald Trump after a shooting at a Pennsylvania campaign rally, has appeared to indulge in conspiracy theories about that assassination attempt, and seems to be offering some financial support to Trump through his pro-“meritocracy” and deregulation super PAC—albeit apparently not at the $45 million-a-month level he had reportedly pledged earlier. Driven partly by bitterness toward Joe Biden and partly by his preoccupation with the so-called “woke mind virus,” Musk has grown increasingly vocal in his support for Trump and other right-wing causes, deepening concerns that his influential social media platform may not remain “politically neutral,” as he had previously suggested it should. (The suspension Monday of a “White Dudes for Harris” account after a lucrative online fundraiser attended by Jeff Bridges, Pete Buttigieg, and other celebrities and political figures only added to those questions, though it’s still not clear why the account was locked out.)

Musk’s antics have led to calls for action from Democrats like Barbara Lee, the California congresswoman: “It shows you just how dangerous [Musk] is and how dangerous it is for social media not to have guardrails,” she said on CNN Monday. “We need to make sure that, as we look at AI and move forward, there are some regulatory guardrails and rules that it has to follow.” Some have already been put in place; in February, for instance, the Federal Communications Commission banned the use of AI-generated voices in robocalls after New Hampshire voters received calls from a computer-generated Biden voice urging them not to vote in the primary. Others, like an FCC proposal that would force advertisers to disclose the use of AI on TV and radio, are in the works. But it’s unclear if that and other regulations can be enacted in time for the election, now just three months away. And even if additional rules are put in place, it’s almost certain that there will still be plenty of vulnerabilities for bad actors to exploit. What, exactly, that will mean for November remains to be seen. But at the very least, it could exacerbate a climate of distrust in a political moment in which Americans are already divided over the nature of reality.