
The late and much lamented showbiz personality Inday Badiday makes a surprise comeback in an ’80s-inspired music video, and it’s all thanks to deepfake technology.


The likeness is uncanny, the voice unmistakable to any Filipino who owned a television two or three decades ago.

“Magandang gabi sa inyong lahat” (“Good evening to you all”), says Lourdes Carvajal, better known as Inday Badiday, in her signature husky drawl.

With its ’80s-inspired set design, analog drums, old-school dance beat, and degraded VHS-like quality, one could easily mistake the video for a lost episode of the Queen of Intrigues’ talk show, See True.

Actually, it’s the official music video for “Uwian Na”, a song by Pinoy band Truefaith. The video was created in 2019, sixteen years after Inday Badiday’s death.

Even more interesting is what made this time-traveling labor of love possible: deepfaking, a computer technique often linked to fake news, identity theft, and even revenge porn.

Eye to eye

“The basic idea was brought up [in 2018] in my first meeting with Truefaith, in preparation for their 25th anniversary concert in October,” explained director and visual effects (VFX) specialist Adrian Arcega.

During Arcega’s discussions with the band, the idea of a music video diving deep into ’80s Pinoy showbiz culture popped up. With it came an opportunity to take things a notch higher via deepfaking.

A combination of the words “deep learning” and “fake,” deepfaking is a relatively new approach to VFX production in the Philippines. Deepfaking uses artificial intelligence (AI) to modify photos and videos by superimposing existing images and footage onto them.

Through deepfaking, a user can create fake videos that are almost indistinguishable from the real thing. (Does that sound familiar? If so, you’ve probably used FaceApp, Meitu, Mug Life, and other similar smartphone apps.)

In a recent interview, Arcega described the deepfaking process:

“[W]hat deepfaking does is it uses what we call neural networks, computational models that are loosely inspired by (but not identical to) the way real brains process information, to analyze faces from every angle, similar to how we process the faces of new people we meet. Think of it as ‘getting to know the person’ by meeting him or her often; in the machine’s case, it needs to ‘learn’ how the person looks from every angle as well.”
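To make that idea concrete, here is a minimal sketch of the architecture behind most face-swap tools: one shared encoder that learns general face structure, plus a separate decoder per person. The sketch uses PyTorch, and the layer sizes and names are illustrative assumptions, not Arcega’s actual setup.

```python
# A minimal sketch of the shared-encoder, two-decoder idea behind most
# face-swap deepfakes. Dimensions and names are illustrative assumptions.
import torch
import torch.nn as nn

class Encoder(nn.Module):
    """Maps an aligned 64x64 RGB face crop to a compact latent code."""
    def __init__(self, latent_dim=256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 4, stride=2, padding=1), nn.ReLU(),    # 64 -> 32
            nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.ReLU(),   # 32 -> 16
            nn.Conv2d(64, 128, 4, stride=2, padding=1), nn.ReLU(),  # 16 -> 8
            nn.Flatten(),
            nn.Linear(128 * 8 * 8, latent_dim),
        )

    def forward(self, x):
        return self.net(x)

class Decoder(nn.Module):
    """Reconstructs a 64x64 face from the latent code; one per identity."""
    def __init__(self, latent_dim=256):
        super().__init__()
        self.fc = nn.Linear(latent_dim, 128 * 8 * 8)
        self.net = nn.Sequential(
            nn.ConvTranspose2d(128, 64, 4, stride=2, padding=1), nn.ReLU(),   # 8 -> 16
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(),    # 16 -> 32
            nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1), nn.Sigmoid(),  # 32 -> 64
        )

    def forward(self, z):
        return self.net(self.fc(z).view(-1, 128, 8, 8))

encoder = Encoder()
decoder_src = Decoder()  # learns the body double's face
decoder_dst = Decoder()  # learns the target's face
# The swap: encode a frame of the body double, decode with the *other* decoder.
swapped = decoder_dst(encoder(torch.rand(1, 3, 64, 64)))
```

The trick is the shared encoder: because both decoders read the same latent code, swapping decoders swaps identities while keeping the pose and expression of the input face.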

Arcega’s experiences and challenges as a VFX specialist led to his fascination with deepfake technology. “I always try to find ways to make VFX affordable to use here,” shared the director.

Arcega is no stranger to countless reshoots, scheduling issues, and even the idea of body doubles. In the past, for instance, he has had to replace heads from a bad take with those from a good take. It’s the same technique George Lucas employed in the 2004 re-release of Star Wars: Episode VI – Return of the Jedi. In the remastered version of the film, Lucas replaced the head of actor Sebastian Shaw (Anakin Skywalker’s Force ghost) with Hayden Christensen’s.

“It’s all about wanting to control the entire mise-en-scène in post-production, really.”

See true

Months before Truefaith approved the project, Arcega had already been experimenting with deepfaking.

Despite his initial results being “horrible,” the tests helped him get a better grasp of the technique and its limitations.

By the time the band gave the go-signal in July 2019, Arcega was ready. He already had the tools: a Ryzen 5 2600 processor, an 8GB GTX 1070 graphics card, and 32GB of RAM.

Using DeepFaceLab for deepfaking, After Effects for compositing, and Adobe Premiere for primary editing, Arcega ran tests for a month.
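For context, the first stage of a DeepFaceLab-style workflow is extraction: pulling frames from footage and cropping out faces to train on. DeepFaceLab handles this with its own bundled scripts; the sketch below shows the same idea using OpenCV, with the file names and the 256-pixel crop size as assumptions.

```python
# Illustrative sketch of the "extraction" stage that precedes training:
# pull frames from a clip, detect faces, and save cropped face images.
# Paths and crop size are assumptions; DeepFaceLab ships its own scripts
# for this step.
import cv2
import os

detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

os.makedirs("faces_src", exist_ok=True)
cap = cv2.VideoCapture("reference_clip.mp4")  # e.g. an old talk show clip
frames = saved = 0
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for (x, y, w, h) in detector.detectMultiScale(gray, 1.1, 5):
        crop = cv2.resize(frame[y:y + h, x:x + w], (256, 256))
        cv2.imwrite(f"faces_src/{saved:06d}.jpg", crop)
        saved += 1
    frames += 1
cap.release()
print(f"Saved {saved} face crops from {frames} frames.")
```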

(Image: Adrian Arcega)

At this point, Arcega also had a clear idea of what he should and shouldn’t do during the actual shoot, thanks in part to American VFX company Corridor Digital. Around the time Arcega was running tests, Corridor Digital released a detailed, now-famous behind-the-scenes look at how they deepfaked actor Keanu Reeves stopping a robbery, with quite convincing results.

Arcega’s team shot the video in August, inside the studio of the University of the Philippines Film Institute (UPFI) Media Center. Apart from the band, Pinoy celebrities Ricky Davao, Yayo Aguila, and Rey “PJ” Abellana also appear in the video as themselves. Inday Badiday’s daughter, entertainment writer Dolly Anne Carvajal, joined their shoots as well. Carvajal reportedly “loved the idea of bringing back her mom” for the video. “When Dolly Anne and the actors saw our set, it was an instant throwback for them,” Arcega recalled.

The analysis stage is perhaps the most complex part of the deepfake process. This is where the AI searches for similar facial patterns in different clips, using them to construct the deepfake. Due to time and budget constraints, Arcega limited the shots that needed deepfaking to 18.
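In code terms, that analysis stage is the training loop: the shared encoder and the two decoders sketched earlier repeatedly reconstruct each face until they have “learned” it. The loop below is a hedged illustration, not DeepFaceLab’s actual internals; `load_faces` is a hypothetical stand-in for a real data loader, and production tools add face alignment, masking, and far longer schedules.

```python
# The "analysis" stage as a training loop over the encoder/decoders
# sketched earlier. A hedged illustration, not DeepFaceLab's internals.
import torch
import torch.nn.functional as F

def load_faces(folder, batch_size=16):
    # Hypothetical loader: in practice this would read cropped face images
    # from disk and return a (batch, 3, 64, 64) tensor scaled to [0, 1].
    return torch.rand(batch_size, 3, 64, 64)  # placeholder random data

opt = torch.optim.Adam(
    list(encoder.parameters())
    + list(decoder_src.parameters())
    + list(decoder_dst.parameters()),
    lr=1e-4)

for step in range(10_000):  # days of GPU time in practice
    batch_src = load_faces("faces_src")  # the body double's face crops
    batch_dst = load_faces("faces_dst")  # the target's face crops
    # Each decoder learns to reconstruct its own person from the shared code.
    loss = (F.l1_loss(decoder_src(encoder(batch_src)), batch_src)
            + F.l1_loss(decoder_dst(encoder(batch_dst)), batch_dst))
    opt.zero_grad()
    loss.backward()
    opt.step()

# Conversion, by contrast, is a single forward pass per frame, which is why
# each finished shot only took minutes: encode the body double's face, then
# decode it with the *target* decoder.
with torch.no_grad():
    swapped = decoder_dst(encoder(load_faces("faces_src", batch_size=1)))
```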

“I did 2 more rounds of testing after I locked down the edit,” said Arcega. “The final stretch of implementation lasted two and a half weeks. Each round of analysis took two and a half days, and the conversion per shot was actually quick (15 minutes per shot).”

(Image: Adrian Arcega)

Arcega uploaded the finished video to Truefaith’s official YouTube channel on October 5, 2019. To date, it has received over 8,200 views.

Nothing but the truth

The Inday Badiday we see in “Uwian Na” is the product of three key factors: deepfaking, careful acting, and nailing that iconic voice.

Arcega’s reference videos for Inday Badiday were basically low-resolution YouTube clips, making the endeavor a bit challenging. “A good deepfake would require a large amount of high-quality data. That’s what deepfaking essentially is: data analysis and parsing.”
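One practical consequence: when your only sources are low-resolution YouTube clips, curating the data matters as much as collecting it. A common trick, sketched below, is to discard blurry face crops using the variance of the Laplacian as a sharpness score; the threshold is an assumption you would tune per source.

```python
# Curating training data: drop face crops too blurry to be useful.
# Variance of the Laplacian is a standard sharpness score; the threshold
# is an assumption to tune for each batch of footage.
import cv2
import glob
import os

THRESHOLD = 100.0  # higher variance means a sharper image

for path in glob.glob("faces_src/*.jpg"):
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    sharpness = cv2.Laplacian(gray, cv2.CV_64F).var()
    if sharpness < THRESHOLD:
        os.remove(path)  # too blurry to teach the model anything
```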

The director had initially wanted popular impersonator Inday Garutay to be the host’s body double. However, a last-minute emergency prevented the comedian from participating, forcing Arcega to think on his feet.

“I realized that our production designer, Miel Cabañes, had a vague resemblance to a young Inday Badiday, too,” Arcega explained.

To prepare Cabañes for the role, Arcega shared some pointers with her regarding Inday Badiday’s fashion, manner of talking, and body language. “Thankfully, Miel had a keen eye (again, being a production designer).”

FROM MIEL TO “ATE LUDS”: An unedited scene with Cabañes and Aguila (top)
versus the deepfaked version in the final cut (bottom). (Image: Adrian Arcega)

Abante Radyo host Tito Sandino completed the equation by providing Inday Badiday’s voice. “I needed a reality to the vocal performance, as going over-the-top did not work,” Arcega revealed.

Fortunately, Sandino had the nuances and specifics of Inday Badiday’s “hosting voice” down to a science. This helped make the deepfake sound as real as possible.

“Get the iconic voice wrong,” stressed Arcega, “and it all falls flat.”

Heart to heart

More often than not, discussions on deepfaking emphasize the technology’s perils, not its practical uses. For Arcega, however, there’s plenty of potential for deepfaking, especially in the local film industry.

For starters, it’s affordable. Arcega noted that only the analysis stage would likely prompt an increase in your electric bill. “Assuming the analysis went well (which ultimately depends on the amount of data you put in), then you just need manpower for the compositing,” explained Arcega.
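A quick back-of-the-envelope check shows why. Using the article’s own figure of two and a half days per analysis round, plus assumed values for the GPU’s power draw and local electricity rates, the bill per round comes out to pocket change next to a reshoot:

```python
# Rough cost of one analysis round. The duration comes from Arcega's own
# figures; the GPU draw and electricity rate are assumptions.
GPU_WATTS = 150        # assumed sustained draw of a GTX 1070 under load
HOURS = 2.5 * 24       # one analysis round: two and a half days
PRICE_PER_KWH = 10.0   # assumed rate in PHP per kilowatt-hour

kwh = GPU_WATTS / 1000 * HOURS
print(f"~{kwh:.0f} kWh, or about PHP {kwh * PRICE_PER_KWH:.0f} per round")
# ~9 kWh, about PHP 90 per round: cheap next to a reshoot.
```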

Furthermore, it can help filmmakers fix things in post-production that their budgetary limitations wouldn’t normally permit. “I see it as a tool to change representations of reality on film and video,” Arcega said.

Arcega also emphasized that just like any digital image manipulation tool, deepfaking can be used in a variety of ways, for a variety of purposes. “While it has the potential to be used for fake news, the same can be applied for any tool we currently have at our disposal,” he asserted. “The challenge is to get experts [in] image manipulation to spot and combat this, but thankfully there are AI-based developments [in that area] as well.”

Based on what he saw, though, his foray into deepfaking yielded some pretty convincing (and satisfying) results.

“During the video launch, I took a look at Dolly Anne when the video started,” Arcega recalled. “Her smile when her mom appeared was all the approval I needed.”



Author: Mikael Angelo Francisco

Bitten by the science writing bug, Mikael has years of writing and editorial experience under his belt. As the editor-in-chief of FlipScience, Mikael has sworn to help make science more fun and interesting for geeky readers and casual audiences alike.