In 2020, the markerless motion capture market continues to evolve, expanding beyond the core entertainment/videogame creation world into new sectors such as biomedicine, scientific research, and architecture, which are finding the software helpful. Michael Nikonov, Founder & Chief Technology Officer of Moscow-based iPi Soft, maker of the leading markerless motion capture solution iPi Motion Capture, continues to stay true to the company’s motto “Motion Capture for the Masses,” most recently unveiling real-time tracking and live streaming capabilities into popular game engines such as Unity and Unreal.
Nikonov spoke to ProductionHUB about the trends he sees in markerless motion capture, and specifically what enhancements iPi Soft users can look forward to seeing in the near future.
Last year iPi Motion Capture added real-time rendering. Was it the game-changer you thought it would be?
MN: Adding real-time was a huge developmental milestone for the company. It was a significant technical achievement for us just to make it happen, and it’s something we’re tremendously proud of. That said, it was one step for us. We are focused on other improvements, specifically with regard to the overall motion tracking user experience of the software.
Animation students in the US are getting ready to head back to college this fall, and for many, that will be a virtual classroom environment. What are the benefits that iPi Mocap offers to help students learn your software from home?
MN: Because iPi Mocap is a portable solution, it can be used at home. Kinect sensors allow capture in a space smaller than 5 by 4 feet. The new Azure Kinect sensor released last year has a wide-view mode that allows users to stand closer than 3 feet to the sensor. So the system can be operated by a single person, which makes it convenient for online learning. Also, our user license can be transferred between different computers without any limitation, so a license owned by a college or university can be used by multiple students from their homes.
How about the iPi Motion Capture user experience? Have you made improvements there as well?
MN: We recognize that for some users working with more than one camera, configuring and calibrating the system can be a challenge. Our development team is working to simplify this for a more user-friendly motion tracking experience. Some examples include the ability to specify the distance between any pair of cameras, which helps set scene scale during multi-camera system calibration; the ability to control real-time tracking and live streaming to Unity from third-party software via our Automation Add-on; and other improvements.
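As a rough illustration of what driving such a workflow from an external tool might look like, here is a minimal Python sketch that builds JSON-style control messages. The command names, parameters, and transport are hypothetical placeholders, not the actual Automation Add-on protocol; consult iPi Soft’s Automation Add-on documentation for the real message format.

```python
import json

# NOTE: "StartRealTimeTracking" / "StopRealTimeTracking" and the message
# shape below are illustrative assumptions, not the documented iPi
# Automation Add-on commands.

def build_command(command, **params):
    """Serialize a control command as a JSON message string."""
    message = {"command": command}
    if params:
        message["params"] = params
    return json.dumps(message)

# Example: start real-time tracking with live streaming to Unity enabled,
# then stop it. In a real setup these strings would be sent to the
# application over whatever channel the add-on exposes.
start_msg = build_command("StartRealTimeTracking", streamToUnity=True)
stop_msg = build_command("StopRealTimeTracking")

print(start_msg)
print(stop_msg)
```

The point of the sketch is simply that third-party software can script the capture session by emitting structured commands rather than requiring a user at the GUI.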
Due to the Coronavirus pandemic, Hollywood live-action productions have been largely suspended. Is this moment perhaps an opportunity for greater adoption of iPi Mocap among filmmakers/gamers?
MN: From the very beginning of the worldwide pandemic lockdown, we've seen increased interest in the software from hobbyists, indie filmmakers and game developers, as well as small studios. Our software can find its place in the complex production pipelines of big studios as well. It’s particularly effective for concept testing, previz, and background animation of large crowds or secondary characters that don’t require high levels of detail. Because this can be done at home or in a small office space, the system is a perfect fit for remote work.
When you look at the motion capture industry what are the big picture trends you’re seeing?
MN: The opportunities as we see them for markerless motion capture remain primarily in the entertainment/gaming sectors, and increasingly in the biomedical/scientific research world. Architectural design firms are also using motion capture, but the majority of our users are professional and semi-professional animators and digital artists.
What specific workflow benefits does iPi Soft’s integration into the Unity and Unreal game engines provide iPi Mocap customers?
MN: One of the most important development improvements we made in regards to the Unity game engine was to enable live streaming from our software to Unity. We’re currently working on delivering this for the Unreal game engine, which should be online by 2021. Live streaming integration with game engines is essential for animators because they need to see their character models in the gaming environment as quickly as possible to decide if a scene works or needs to be redone or edited in some way. This spares artists from constantly waiting on their scenes to render and quickens the creative workflow.
We also added a preset for Unreal in iPi Mocap for working with its standard bipedal characters. In recent versions of Unreal Engine, the standard skeleton is stable, so now it is as easy as selecting ‘Unreal’ from the menu of available characters and rigs.
I know you’re a huge fan of videogames, anything you’ve seen recently that is particularly great?
MN: I am amazed at how indie game developers are able to create complex and beautiful games with great-looking animation with such small teams. “The End of the Sun” is one recent example. In their Kickstarter video, the developers explain how they brought animation from iPi Motion Capture software into the Unity game engine and describe their first experience with motion capture.
Last question, could we ever see an iPi Motion Capture app?
MN: We actually had a meeting last summer with a major cell phone manufacturer, who, after watching a demonstration of our software, asked us that same question. Unfortunately, right now, the answer is no. At present, markerless motion capture is simply too compute-intensive to exist as an app. A simplified body tracking app is quite possible, but it is not accurate enough for animation needs, so I would not call it “motion capture.” But who knows what the future holds.