From Netflix’s personalized thumbnails to deepfake Luke Skywalker, AI and machine learning are changing the way the film and television industry operates. While the industry has been slow to embrace this technology, the leading media giants are increasingly embedding AI into their operations, proving that AI and creative ventures can go hand-in-hand. Let’s look at some of the most exciting uses of AI in film and television today:
Back in 2020, Warner Bros signed a deal with LA startup Cinelytic to predict film success using machine learning.
The software allows Warner Bros to effectively play “fantasy football” with its films to decide which are best to greenlight or how they should be promoted. They can input different variables related to their project, such as genre or actors, and see how audiences would respond.
For example: If we made our romance drama more of a romantic comedy, would that increase the engagement of under 25s in Australia? How would replacing Tom Holland with Timothée Chalamet impact global box office revenue?
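The Cinelytic tool itself is proprietary, but the "what if" comparison it enables can be sketched as a simple scoring model. Everything below is invented for illustration: the feature names, the weights, and the baseline figure are placeholders, and a real system would learn its weights from historical box-office data rather than hard-code them.

```python
# Hypothetical sketch of Cinelytic-style scenario comparison.
# FEATURE_WEIGHTS are made-up multipliers, not real industry data.
FEATURE_WEIGHTS = {
    "genre:romcom": 1.4,
    "genre:drama": 1.1,
    "actor:star_a": 2.0,
    "actor:star_b": 1.6,
}

def predict_revenue(base_millions, features):
    """Scale a baseline revenue estimate by the weight of each chosen feature."""
    score = base_millions
    for feature in features:
        score *= FEATURE_WEIGHTS.get(feature, 1.0)
    return round(score, 1)

# Compare two versions of the same $50M-baseline project.
drama = predict_revenue(50, ["genre:drama", "actor:star_b"])    # 88.0
romcom = predict_revenue(50, ["genre:romcom", "actor:star_a"])  # 140.0
```

Swapping one variable at a time, as in the actor example above, lets a studio see which single change moves the predicted number most.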
“We make tough decisions every day that affect what — and how — we produce and deliver films to theaters around the world, and the more precise our data is, the better we will be able to engage our audiences,” explained Warner Bros’ Senior Vice President of Distribution, Tonis Kiis.
Not only does Netflix use AI to understand what content you would most likely enjoy, but it also uses it to personalize how that content is promoted to you.
For every film and TV show that Netflix hosts, it has a bank of different thumbnails and title cards. Netflix’s algorithm chooses which thumbnail you see based on your past viewing history so that it has the highest chance of converting.
Each time a thumbnail successfully converts you, the algorithm learns more about your tastes, so its choices become more accurate over time.

For example, if someone has a history of watching LGBTQ+ content on Netflix, the thumbnail they see on other films and series will typically feature its LGBTQ+ characters.
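This pick-the-best-converting-artwork loop is commonly framed as a multi-armed bandit problem. The sketch below is a minimal epsilon-greedy version with invented thumbnail names; Netflix's real system is a more sophisticated contextual bandit conditioned on viewing history, not this exact code.

```python
import random

class ThumbnailBandit:
    """Toy epsilon-greedy selector: mostly show the best-converting
    thumbnail, occasionally explore an alternative."""

    def __init__(self, thumbnails, epsilon=0.1):
        self.stats = {t: {"shows": 0, "clicks": 0} for t in thumbnails}
        self.epsilon = epsilon

    def _rate(self, thumbnail):
        s = self.stats[thumbnail]
        return s["clicks"] / s["shows"] if s["shows"] else 0.0

    def choose(self):
        if random.random() < self.epsilon:
            return random.choice(list(self.stats))  # explore
        return max(self.stats, key=self._rate)      # exploit best performer

    def record(self, thumbnail, clicked):
        self.stats[thumbnail]["shows"] += 1
        self.stats[thumbnail]["clicks"] += int(clicked)

bandit = ThumbnailBandit(["ensemble_cast", "lead_couple", "action_scene"])
bandit.record("lead_couple", clicked=True)
bandit.record("action_scene", clicked=False)
```

After enough impressions, the highest click-through thumbnail dominates, which is exactly the "converting" behavior described above.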
20th Century Fox partnered with IBM Research to use artificial intelligence to create a trailer for its horror feature film, “Morgan”.

First, the AI system was trained on 100 horror movie trailers to understand tone, shot composition, and actor emotion. This allowed the AI to learn what categories of scenes fit into a typical trailer of this genre.
It was then fed the full-length feature film and after “watching” it, it selected the 10 moments it recommended to be included in the trailer. A human editor then took over from there.
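The moment-selection step can be sketched as scoring every scene on trailer-friendly signals and keeping the top candidates for a human editor. The scene data, feature names, and weights below are all invented for illustration; IBM's actual system analyzed visuals, audio, and composition with far richer models.

```python
def score_scene(scene):
    # Hypothetical weighting: a horror trailer favors tension and surprise
    # over dialogue-heavy scenes.
    return 2.0 * scene["tension"] + 1.0 * scene["surprise"] - 0.5 * scene["dialogue"]

def pick_trailer_moments(scenes, k=10):
    """Rank scenes by score and return the ids of the top k candidates."""
    ranked = sorted(scenes, key=score_scene, reverse=True)
    return [s["id"] for s in ranked[:k]]

scenes = [
    {"id": 1, "tension": 0.9, "surprise": 0.8, "dialogue": 0.1},
    {"id": 2, "tension": 0.2, "surprise": 0.1, "dialogue": 0.9},
    {"id": 3, "tension": 0.7, "surprise": 0.9, "dialogue": 0.3},
]
print(pick_trailer_moments(scenes, k=2))  # -> [1, 3]
```

The key design point, mirrored here, is that the AI only shortlists candidates: the final cut still comes from the human editor.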
According to IBM, this entire process would typically take 10 to 30 days manually, but the AI completed the task in 24 hours:
“Reducing the time of a process from weeks to hours – that is the true power of AI.”
Cinedigm is working with London-based startup Papercup to translate all 31 seasons of the Bob Ross series “The Joy of Painting” into Spanish using AI.
The translation software takes the original voice track in the Bob Ross show and automatically creates a new voiceover in Spanish, retaining some characteristics of the original speaker. The dubbed video is then reviewed and edited by a professional translator.
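That workflow amounts to a speech-to-text, translation, and voice-synthesis pipeline with a human review step before release. The sketch below is purely illustrative: the function names are placeholders and the string outputs are canned stand-ins, not Papercup's actual API or models.

```python
def transcribe(audio_path):
    # Stand-in for a speech-to-text model.
    return "hello, welcome to the show"

def translate(text, target="es"):
    # Stand-in for machine translation.
    return "hola, bienvenidos al programa"

def synthesize(text, voice_profile):
    # Stand-in for text-to-speech that keeps traits of the original voice.
    return {"text": text, "voice": voice_profile}

def dub(audio_path, voice_profile, reviewer=None):
    """Run the full pipeline, letting a human reviewer correct the
    translation before the voiceover is generated."""
    translated = translate(transcribe(audio_path))
    if reviewer is not None:
        translated = reviewer(translated)
    return synthesize(translated, voice_profile)

track = dub("episode1.wav", voice_profile="calm_narrator",
            reviewer=lambda text: text)  # identity reviewer for the sketch
```

Putting the reviewer between translation and synthesis mirrors the article's point: the AI does the heavy lifting, but a professional translator signs off on the words before they are voiced.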
This has allowed Cinedigm to reach millions of potential viewers across Latin America without the prohibitive cost of hiring a recording studio and voice actors.
“As streaming services continue to expand worldwide, this will allow us to reach a previously untapped global audience,” said Tony Huidor, Chief Technology and Product Officer at Cinedigm. “This is a pivotal time for the entertainment industry as demand for streaming content adapted for local markets continues to grow. We seek to make the most of our channel portfolio by adapting our content for international markets. Partnering with Papercup will allow us to achieve this goal and scale our business by utilizing their AI technology.”
Disney’s The Mandalorian season 2 finale shocked fans when a young Luke Skywalker appeared on screen, despite the actor Mark Hamill now being 70 years old. The effect was the result of blending Mark Hamill with a younger body double using VFX.
The result was not universally praised, and one deepfake YouTuber demonstrated how much better it could have looked with deepfake technology. What did Disney do? They hired him.
Subsequent appearances of Luke in other Disney media, namely The Book of Boba Fett, have used his deepfake skills to map a young Hamill’s face onto the body double in post-production, an approach generally praised as an improvement on the VFX method.
Luke’s voice was also synthesized with AI software from Respeecher, using a combination of existing recordings from the original movies and audio plays.
Want to see more of AI being applied to video content? Why not book a demo with Papercup so we can show you behind the scenes.