Remote Production and AI: The Future of Post Production

Published in Miscellaneous

By Jonathan Solomon, Strategic Initiative Engineer, Streaming, IBM Aspera

As technology continues to progress at a rapid pace, post-production is set to change dramatically. Alongside powerful new software features, compute (and therefore render) power continues to increase with faster processors and more capable xPUs (GPUs, TPUs, etc.). This means real-time editing of HD, 4K, 8K, AR and VR content is fast becoming a reality, ultimately shortening post-production timeframes.

Over the next few years, we’ll continue to see the commoditization of hardware and virtualization of software, which will not only lower the cost of operating post-production tools but also remove the requirement that the user be near the compute power, or even the storage.

Remote production rules

Continued technology improvements will reduce the financial burden and enable the most talented specialists to work from any location, with remote capabilities and responsiveness on par with being onsite. Producers will enjoy the same quality of work while reducing the overall cost of system and production operations. Large facilities and boutique houses will still exist, but the idea of “work from home” for an editor, colorist, etc., will become a reality.

Similarly, recent advances in file and video transport now allow post-production equipment and personnel to remain in the comfort of their own suites while supporting remote events with minimal delay, resulting in higher-quality content and happier staff. This means the concept of “Edit at Home” for live events will really take shape. Similar to the “Production at Home” model, where specific (often expensive) production personnel remain in the studio and produce the game remotely, “Edit at Home” provides the same experience for post-production specialists: their equipment remains at home, giving them near-instant access to all feeds from the event. In this case, the incoming feeds are IP-based and can be written directly to high-speed disk and made immediately available for editing.
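As a minimal sketch of that “edit while ingesting” idea, received feed chunks can simply be appended to a growing file on disk, and an editor-side tool can see how much of the feed has already landed. The file path and chunk sizes below are illustrative assumptions, not any specific transport product’s behavior:

```python
# Sketch of "edit while ingesting": incoming IP feed chunks are appended to a
# growing file on disk, and an editor can read whatever has landed so far.
# Paths and chunk sizes here are illustrative assumptions.

import os
import tempfile

def ingest_chunk(path, chunk: bytes):
    """Append one received chunk to the growing media file."""
    with open(path, "ab") as f:
        f.write(chunk)

def bytes_available(path):
    """How much of the feed an editor could already open."""
    return os.path.getsize(path)

feed_path = os.path.join(tempfile.mkdtemp(), "feed.ts")
for chunk in (b"\x47" * 188, b"\x47" * 188):  # two MPEG-TS-sized packets
    ingest_chunk(feed_path, chunk)

print(bytes_available(feed_path))  # 376
```

A real implementation would receive chunks from the network and handle out-of-order or lost packets; the point here is only that the media becomes readable as it arrives rather than after the event ends.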

For example, Fox Sports used IP transport over the public internet for post-production workflows as part of its 2018 FIFA World Cup coverage, sending live, file-based video from Russia to its Los Angeles post-production facility. By standardizing the transport of high-bit-rate or uncompressed media signals, producers aren’t tied to costly proprietary hardware and can instead interface with whatever hardware is available to them.

Augmenting post-production with AI

All major cloud providers are offering varying forms of AI to assist with post-production. From shot logging and speech-to-text, to scene and object identification, AI augments human logging, providing richer metadata for each scene and shot. Some post-production software integrates directly with cloud AI for a seamless in-application experience.
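As a concrete sketch, labels returned by a cloud AI service can be folded into a conventional shot log to produce the richer, searchable metadata described above. The record layout (shot_id, timecode, labels) is a hypothetical schema for illustration, not any particular vendor’s or MAM’s format:

```python
# Merge AI-generated labels into a human shot log to enrich its metadata.
# The record layout (shot_id, timecode, labels) is a hypothetical schema,
# not any specific cloud provider's or MAM's format.

def enrich_shot_log(shot_log, ai_labels):
    """Return shot records with AI labels merged in by shot_id."""
    labels_by_shot = {entry["shot_id"]: entry["labels"] for entry in ai_labels}
    enriched = []
    for shot in shot_log:
        record = dict(shot)  # copy so the original log is untouched
        record["ai_labels"] = sorted(labels_by_shot.get(shot["shot_id"], []))
        enriched.append(record)
    return enriched

# A human-logged shot list plus labels an AI service might return:
shot_log = [
    {"shot_id": "S001", "timecode": "00:01:12:05", "note": "wide of stadium"},
    {"shot_id": "S002", "timecode": "00:02:45:18", "note": "close-up, keeper"},
]
ai_labels = [
    {"shot_id": "S002", "labels": ["goalkeeper", "crowd", "slow-motion"]},
]

for shot in enrich_shot_log(shot_log, ai_labels):
    print(shot["shot_id"], shot["ai_labels"])
```

The human note and the machine labels coexist on each record, which is exactly what makes downstream search richer than either source alone.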

As facilities need to cope with more media and metadata for each production, automated (AI and/or ML) processing, analysis and decision making will help them select the best shots and angles. Enhanced metadata will provide richer search results from the MAM system or for the consumer. As ML becomes more proficient, automated shot selection and scene grading will become more reliable. As well as boosting efficiency, this may offer a faster way to produce alternative angles or deliverables, including custom versions and fixes.
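Automated shot selection reduces, at its simplest, to ranking candidates by a model’s confidence and keeping the top few. The score field and threshold below are assumptions standing in for whatever metric a trained model would actually emit:

```python
# Rank candidate shots by an ML confidence score and keep the best N.
# "score" stands in for whatever quality metric a trained model would emit;
# the threshold and field names are illustrative assumptions.

def select_best_shots(candidates, top_n=2, min_score=0.5):
    """Return the top_n shots scoring at or above min_score, best first."""
    usable = [c for c in candidates if c["score"] >= min_score]
    return sorted(usable, key=lambda c: c["score"], reverse=True)[:top_n]

candidates = [
    {"shot_id": "S010", "angle": "high wide", "score": 0.91},
    {"shot_id": "S011", "angle": "sideline",  "score": 0.42},
    {"shot_id": "S012", "angle": "goal cam",  "score": 0.87},
    {"shot_id": "S013", "angle": "drone",     "score": 0.66},
]

for shot in select_best_shots(candidates):
    print(shot["shot_id"], shot["angle"], shot["score"])
```

In practice a human would review the ranked list rather than accept it blindly, which is the augmentation (not replacement) model the article describes.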

Automated post-production based on AI and ML will be more common and may even become the norm for short-form production such as sports highlights. This will help with “custom” media playlists for consumers.  
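The “custom” playlist idea can be sketched as filtering an AI-tagged clip pool against a viewer’s stated interests. The tag names and preference model here are hypothetical, for illustration only:

```python
# Build a per-consumer highlight playlist from AI-tagged clips.
# Tag names and the preference model are illustrative assumptions.

def build_playlist(clips, preferences, max_clips=10):
    """Keep clips whose tags overlap the viewer's preferences, in timeline order."""
    wanted = set(preferences)
    picked = [c for c in clips if wanted & set(c["tags"])]
    return sorted(picked, key=lambda c: c["start_sec"])[:max_clips]

clips = [
    {"clip_id": "C1", "start_sec": 30,  "tags": ["goal", "team_a"]},
    {"clip_id": "C2", "start_sec": 95,  "tags": ["save", "team_b"]},
    {"clip_id": "C3", "start_sec": 140, "tags": ["goal", "team_b"]},
]

playlist = build_playlist(clips, preferences=["goal"])
print([c["clip_id"] for c in playlist])  # ['C1', 'C3']
```

The same tagged pool serves every consumer; only the preference filter changes, which is what makes per-viewer highlight reels economical for short-form content like sports.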

Though the use of AI will continue to advance, the idea of fully autonomous post-production in the cloud is many years away (especially for higher-profile projects). The use of AI tools to augment post-production professionals will grow as producers realize the monetary value gained from the richer metadata and insights the tools provide.

Most organizations today are exploring how they can best leverage and embrace these new technologies. Learning how AI technologies can help the entire production chain by improving quality and efficiency should benefit everyone. New things shouldn’t scare people; they should excite them. Twenty years ago, people were worried about non-linear editing; today, no one would think to do otherwise.


About the Author

Jonathan Solomon
Jonathan Solomon is a Strategic Initiative Engineer at IBM Aspera, helping build the company’s streaming ecosystem. He is an experienced technical leader who seeks the best solutions by listening first, asking questions, then offering ideas. Over the past 20 years, he worked in various capacities in Broadcast Engineering before transitioning into roles as a Systems and Sales Engineer. His technical background and high-level thinking help him understand and explain the most technical details to the non-technical community. He is highly skilled in system design and workflow automation and uses those skills to deliver improved workflows and cost savings by making technology more efficient and automating repetitive tasks. Prior to his current role, he was the Sr. Engineering Project Manager at WWE. He also served as the Director of Engineering for Mid-Atlantic Sports Network and as a Mobile Unit EIC for Game Creek Video.
