Remote Production and AI: The Future of Post Production

Published in Miscellaneous

By Jonathan Solomon, Strategic Initiative Engineer, Streaming, IBM Aspera

As technology continues to progress at a rapid pace, post-production is set to change dramatically. Alongside great new software features, compute (and therefore render) power continues to increase with faster processors and more powerful xPUs (GPUs, TPUs, etc.). This means real-time editing of HD, 4K, 8K, AR and VR content is fast becoming a reality, ultimately shortening post-production timeframes.

Over the next few years, we’ll continue to see the commoditization of hardware and virtualization of software, which will not only lower the cost of operating post-production tools but also remove the requirement that the user be near the compute power, or even the storage.

Remote production rules

Continued technology improvements will reduce the financial burden and enable the most talented specialists to work from any location, with remote capabilities and responsiveness on par with being onsite. Producers will enjoy the same results while reducing the overall cost of system and production operations. Large facilities and boutique houses will still exist, but the idea of “work from home” for an editor, colorist, etc., will become a reality.

Similarly, recent advances in file and video transport now allow post-production equipment and personnel to remain in the comfort of their own suites while supporting remote events with minimal delay, resulting in higher-quality content and happier staff. This means the concept of “Edit at Home” for live events will really take shape. Similar to the “Production at Home” model, in which specific (often expensive) production personnel remain in the studio and produce the game remotely, “Edit at Home” keeps post-production specialists and their equipment at the home facility while giving them near-instant access to all feeds from the event. In this case, the incoming feeds are IP based and can be written directly to high-speed disk and made immediately available for editing.
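To make the “written directly to high-speed disk” idea concrete, here is a minimal sketch of receiving an IP-based feed over UDP and appending it straight to local storage. The port number, packet size, and output path are hypothetical, and a real contribution feed would use a managed transport protocol rather than raw UDP; this only illustrates the pattern of a file that grows on disk as the event happens.

```python
import socket

# Hypothetical values for illustration only.
FEED_PORT = 5004          # assumed port for the incoming feed
PACKET_SIZE = 1316        # typical MPEG-TS payload (7 x 188-byte packets)

def capture_feed(output_path: str, max_packets: int) -> int:
    """Append incoming UDP packets directly to disk; return bytes written."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("0.0.0.0", FEED_PORT))
    written = 0
    with open(output_path, "ab") as growing_file:
        for _ in range(max_packets):
            data, _addr = sock.recvfrom(PACKET_SIZE)
            growing_file.write(data)   # file grows while editors can read it
            written += len(data)
    sock.close()
    return written
```

Because the file is appended in place, an editor with access to the same storage can start cutting from the head of the recording while the tail is still arriving.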

For example, Fox Sports used IP transport over the public internet for post-production workflows as part of its 2018 FIFA World Cup coverage, sending live, file-based video from Russia to its Los Angeles post-production facility. By standardizing the transport of high-bit-rate or uncompressed media signals, producers aren’t tied to costly proprietary hardware and can instead interface with whichever hardware is available to them.

Augmenting post-production with AI

All major cloud providers are offering varying forms of AI to assist with post-production. From shot logging and speech-to-text to scene and object identification, AI augments human logging, providing richer metadata for each scene and shot. Some post-production software integrates directly with cloud AI for a seamless in-application experience.
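The augmentation pattern described above can be sketched in a few lines: the human logger’s notes are kept, and an AI service adds labels alongside them. Here `detect_objects()` is a hypothetical stand-in for a cloud scene/object-identification call, and the shot IDs and labels are invented for illustration.

```python
from dataclasses import dataclass, field

@dataclass
class Shot:
    shot_id: str
    logger_notes: str                         # the human operator's log entry
    ai_labels: list = field(default_factory=list)

def detect_objects(shot_id: str) -> list:
    """Hypothetical stand-in for a cloud object-identification service."""
    canned = {"sc01_sh03": ["goalkeeper", "net", "crowd"]}
    return canned.get(shot_id, [])

def enrich(shot: Shot) -> Shot:
    # AI augments, rather than replaces, the human log.
    shot.ai_labels = detect_objects(shot.shot_id)
    return shot
```

The key design point is that the AI output lands in its own field, so the human log stays authoritative and the machine labels can be re-generated as the models improve.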

As facilities need to cope with more media and metadata for each production, automated (AI and/or ML) processing, analysis and decision making will help them select the best shots and angles. Enhanced metadata will provide richer search results from the MAM system or for the consumer. As ML becomes more proficient, automated shot selection and scene grading will become more reliable. As well as boosting efficiency, this may offer a faster way to produce alternative angles or deliverables, including custom versions and fixes.
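A small sketch shows why the enriched metadata pays off at search time: a query can match against both the human notes and the AI labels. The field names and sample records here are hypothetical; a real MAM would expose its own query API.

```python
def search_shots(shots, term):
    """Return IDs of shots whose logger notes or AI labels mention `term`."""
    term = term.lower()
    return [
        s["id"] for s in shots
        if term in s["notes"].lower()
        or any(term in label.lower() for label in s["labels"])
    ]

# Hypothetical catalog: one shot found via its AI labels, one via its notes.
catalog = [
    {"id": "sh1", "notes": "wide crowd shot", "labels": ["stadium"]},
    {"id": "sh2", "notes": "close-up", "labels": ["goalkeeper", "net"]},
]
```

With only the human notes, a query for “goalkeeper” would miss `sh2`; the AI labels surface it.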

Automated post-production based on AI and ML will be more common and may even become the norm for short-form production such as sports highlights. This will help with “custom” media playlists for consumers.  

Though the use of AI will continue to advance, fully autonomous post-production in the cloud is many years away (especially for larger, higher-profile projects). The use of AI tools to augment post-production professionals will grow as producers realize the monetary value of the richer metadata and insights those tools provide.

Most organizations today are exploring how they can best leverage and embrace these new technologies. Learning how AI technologies can help the entire production chain by improving quality and efficiency should benefit everyone. New things shouldn’t scare people; they should excite them. Twenty years ago, people were worried about non-linear editing; today, no one would think to do otherwise.


About the Author

Jonathan Solomon
Jonathan Solomon is a Strategic Initiative Engineer at IBM Aspera, helping build the company’s streaming ecosystem. He is an experienced technical leader who seeks the best solutions by listening first, asking questions, then offering ideas. Over the past 20 years, he worked in various capacities in Broadcast Engineering before transitioning into Systems and Sales Engineering roles. His technical background and high-level thinking help him explain the most technical details to a non-technical audience. He is highly skilled in system design and workflow automation and uses those skills to deliver improved workflows and cost savings by making technology more efficient and automating repetitive tasks. Prior to his current role, he was the Sr. Engineering Project Manager at WWE. He also served as the Director of Engineering for Mid-Atlantic Sports Network and as a Mobile Unit EIC for Game Creek Video.

