Every Single Shot in RNC’s Dystopian ‘Beat Biden’ Ad Was Entirely AI-Generated
If Republicans’ first attack ad of the 2024 presidential election cycle has the look and feel of a low-budget disaster movie, there’s a reason for that. Every image in the 30-second video is fake.
In response to President Joe Biden’s announcement on Tuesday that he is running for reelection, the Republican National Committee predictably released an ad painting Biden’s America as a disaster. But there’s a catch: Every dystopian image in the 30-second spot was entirely generated by AI. The YouTube description calls it “an AI-generated look into the country’s possible future if Joe Biden is re-elected in 2024.”
That’s right. Instead of calling out any actual policy or action of the sitting president, or citing anything that has actually happened during his more than two years in office, they just made some stuff up. It’s an attack ad without an actual attack, just some hypothetical nightmare scenarios and no real message. It’d be laughable if it weren’t so disturbing.
In this imaginary world that previously existed only in the minds of everyone’s racist uncles, immigrants have invaded Texas, banks have literally crumbled, China has invaded Taiwan, and decent lighting has ceased to exist. San Francisco is “closed,” whatever that means, due to vague crime and a dude with his dreadlocks in a ponytail and a gang sign on his forehead. The streets are filled with tanks and armed soldiers, but apparently all of the U.S.’s vast military might is no match for the single ponytailed hipster — excuse me, “gang member.” And all of it — ALL OF IT — is future Biden’s fault, fictionally speaking.
So obviously the RNC fed the AI bot a few weeks of 24-hour Fox News programming, and this is what it spit out. It’s familiar right-wing fearmongering made to look like actual news footage.
Republicans are not hiding the fact that it’s all fake; in fact, they seem weirdly smug about producing the first AI-generated political ad. In addition to the YouTube description, the words “built entirely with AI imagery” appear in the upper left corner of the screen, in tiny pale lettering.
But will everyone read the description or notice that barely visible label? If someone sees the ad without having read about it and isn’t paying close attention, it could be easy to miss what’s real (nothing) and what’s fabricated (everything).
I can easily imagine some of my elderly relatives, or even some of my less-tuned-in relatives, struggling to understand what “AI-generated” even means. They’ll get the main message of “BIDEN = BAD” and not look any closer. It’s probably safe to assume the proud pioneers of propaganda behind this ad know that and are taking advantage.
It’s also far from guaranteed that political action committees, internet trolls, and the like are going to be that open about their use of AI, a technology so new that there are no rules yet. It’s already possible to create images, video, and audio of things that never happened, so realistic that even savvy consumers can fall for it, the way so many of us briefly thought that the pope actually wore that puffy white coat. We already have deceptively edited clips that remove context to make someone look bad; now those clips could just be manufactured.
It’s a whole new world of disinformation and deepfakes in which viewers have to pay extra close attention to the media they’re consuming to sort the truth from the lies. And when have most voters ever been known to do that? The dystopian future pictured by Republicans might be fake, but the futuristic threat that’s already here feels very real.