Dizzying deep fake video turns an iconic American moment into a horror story

Two filmmakers make history a toy, and offer a warning for the future.

by David Grossman

A new short film begins with one of the most trusted figures in the history of news: Walter Cronkite. We see him reporting on the launch of Apollo 11 for CBS.

Then, Francesca Pancetta and Halsey Burgund, the filmmakers behind In Event of Moon Disaster, take the viewer through a dizzying tour of the modern world of film manipulation, offering a horror story that serves as a warning for how easily the public could be duped:

  • They make Neil Armstrong and Buzz Aldrin’s iconic lunar landing look like a horror film, filled with quick cuts and worried voices.
  • Then, at the end of the film, a stunning visual plays out: Richard Nixon, sitting in the Oval Office, describes how the heroic astronauts died on the Moon.

That it all feels so real is a powerful indictment of how the most sanctified historical moments could be altered given enough computing power and determination.

The idea was helped along by a quirk of actual history — that Nixon’s speechwriters, accepting the tremendous risks of spaceflight, wrote a never-delivered speech, the title of which inspired the name of this short film. The speech, now safely stored at Nixon’s Presidential Library, announces gravely that Armstrong and Aldrin “know that there is no hope for their recovery.”

It then goes on to compare them to the heroes of Greek myth:

“In ancient days, men looked at stars and saw their heroes in the constellations,” Nixon would have said. “In modern times, we do much the same, but our heroes are epic men of flesh and blood.”

The speech is written with an eye toward history and deep sadness. It’s exactly what a president would say in the moment, which is what drew Pancetta and Burgund to it, the two tell Inverse.

“There are deepfakes, there’s this speech, there’s the Moon landing’s 50th anniversary. Whoa, wow, these things could come together,” is how Burgund describes it.

The idea emerged out of discussions between artists and journalists that Pancetta, formerly of the British newspaper The Guardian and currently the Creative Director at MIT's Center for Virtuality, would hold during a fellowship at Harvard.

After securing funding for the project through a grant from the Mozilla Foundation, the first question Pancetta had was, “Who can make the best deepfake?” They settled on CannyAI, based out of Israel, for the visuals and the Ukrainian Respeecher for the vocals.

The project was ambitious, even for these companies. “Doing a deepfake from material that was 50 years old, is, you know, not something they've really done before,” Burgund says.

But the effort was worth it. Describing her first reaction to early drafts of the project, Pancetta remembers thinking, “Oh wow, this is really, really realistic. We didn't have the voice by then and it wasn't the final visual version but, like, it was pretty obvious very quickly that this was going to be very, very, real.”

Working with actor Lewis D. Wheeler, the two needed to make sure that the Nixon in the video resembled the Nixon of real life — one who would speak slowly in this moment, with pauses, and the occasional shaking of his head.

“So much synthetically created media is, you know, a little robotic. It doesn't really necessarily have the sort of human feel,” Burgund says. Their Nixon doesn’t feel like an animatronic Disney president; it feels like Nixon.

Both Pancetta and Burgund realize that their deep well of funding and resources isn’t available to most malicious actors, although it is definitely available to some, such as foreign states. Their entire project took 14 months, and most malicious actors would likely want to affect something on a shorter timeline. But video manipulation doesn’t need cutting-edge technology to be effective; it can rely on simple measures that anyone with basic computer literacy can access.

Manipulative editing “is a potentially even more worrying type of misinformation because it's so easy for people to use media out of context, to reverse it, to speed it up, to [choose] exactly which bits you want to use,” Pancetta says. Basic editing was able to take a successful Moon landing and make it look like a disaster.

“We are storytellers and documentary makers. We’re really well aware of those tools we've been using — the tools of editing. It was clear to us that we would get to construct a scenario where we crash the lunar lander onto the moon and that's what we did. We reverse the lunar lander which was going from the moon, back up into space, and we make it plummet downwards by reversing it.”

While well-done and professionally made, these techniques were not high tech. “Anyone can do that. We've all got iMovie and various editing software now. Kids are brilliant at doing this,” Pancetta says. She cites a 2019 manipulated video that originated on Facebook and appeared to show Nancy Pelosi slurring her speech; it was actually a regular video altered with “significant distortion,” according to experts.

They’re both insistent that this technology can be used for positive purposes. Burgund refers to the Shoah Foundation’s Dimensions in Testimony, which allows users to ask real-time questions and get real-time responses about the nature of the Holocaust, using pre-recorded interviews to learn about genocidal history. The project lists further uses of "synthetic" media, as opposed to fake, on its website.

For now, worries abound as the 2020 election grows closer. But “Moon Disaster” offers both a warning and a chance to show how “synthetic” media, as Burgund refers to the film, can generate real emotion.