Over the last few years, media outlets and brands have pushed to steer clear of news and visuals that even hint at misinformation. Beyond the damage fake news stories do to their businesses, fabricated text and fabricated visuals alike can misinform a public accustomed to trusting its preferred media sources.

Meanwhile, the technology that once powered social media face filters, and later "face tuning," evolved again: the "deepfake" was born. On one hand, when younger generations amuse themselves on TikTok with the Reface app, deepfakes can seem harmless. On the other, political figures have fallen victim to the same technology, with doctored clips circulated online to a non-digital-native public that may not recognize the signs of edited footage.

What if there were a chance to seize the technology for good rather than harm? The question comes up often in my field, and the lessons we take to heart could help media industries like advertising and film chart a course that keeps the technology on the right side of history.

Deepfake technology, to get technical, isn't a VFX technology; it's an AI and machine learning technology. But the general public doesn't really grasp that distinction, and looks to VFX for guidance on how the technology works and how to use it responsibly. While not yet a widespread industry standard, there is room for this image-science AI to pair with VFX as a tool for creating cutting-edge entertainment rather than manipulated videos.

Tracking and mapping capabilities are useful for animation techniques and faster motion capture methods in VFX production; viewers see this most often in the aging and de-aging work in movies. Films like the Avengers franchise and The Irishman have already used machine learning and deepfake technology in their VFX pipelines for exactly this kind of work. At Alkemy X, we have also implemented AI and continue to R&D tools for face replacement and de-aging. Deepfakes may never produce perfect face-replacement results on their own, but if the underlying tracking and movement data could be exposed, we would have a fantastic face tracking and movement tool, as in the sketch below. In some cases, we may even be able to combat misinformation by training machine learning models to spot deepfake videos on the web, turning the technology against the very misuse that made it familiar to the public.
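To make the idea of "exposing the tracking data" concrete, here is a minimal sketch of how per-frame facial landmark tracks could be exported from a video so they might feed a downstream face-replacement or de-aging setup. It assumes the open-source MediaPipe Face Mesh and OpenCV libraries; the file names and the JSON layout are illustrative, not a description of any studio's production pipeline.

```python
# Sketch: export per-frame facial landmark tracking data from a video clip.
# Assumes MediaPipe Face Mesh and OpenCV are installed; paths and the output
# format are hypothetical examples, not a production standard.
import cv2
import json
import mediapipe as mp

def extract_face_tracks(video_path: str, out_path: str) -> None:
    face_mesh = mp.solutions.face_mesh.FaceMesh(
        static_image_mode=False,   # treat the frames as a continuous video stream
        max_num_faces=1,
        refine_landmarks=True,
    )
    capture = cv2.VideoCapture(video_path)
    tracks = []  # one entry per frame: normalized (x, y, z) landmark positions
    frame_index = 0
    while True:
        ok, frame_bgr = capture.read()
        if not ok:
            break
        # MediaPipe expects RGB input; OpenCV decodes frames as BGR.
        results = face_mesh.process(cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2RGB))
        if results.multi_face_landmarks:
            landmarks = results.multi_face_landmarks[0].landmark
            tracks.append({
                "frame": frame_index,
                "points": [[p.x, p.y, p.z] for p in landmarks],
            })
        frame_index += 1
    capture.release()
    face_mesh.close()
    with open(out_path, "w") as f:
        json.dump(tracks, f)

# Example usage (hypothetical file names):
# extract_face_tracks("actor_plate.mov", "face_tracks.json")
```

The point of the sketch is simply that the same per-frame geometry a deepfake model relies on can be captured as ordinary tracking data, which is exactly the kind of input a VFX face-replacement or de-aging workflow already knows how to use.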

Until the technology improves, and until regulation clarifies which uses of deepfakes are permitted, audiences will likely remain suspicious of it. Rather than squander such a pivotal technology on deception, however, we have the chance to let it improve our media and pave the way to stronger creative results.

Bilali Mack is a VFX supervisor at Alkemy X
