As the world adapts to new, burgeoning virtual environments, so too does the world of performance art. With the help of Orpheus VR and Reallusion, re:Naissance Opera combines classical storytelling and opera with 3D animation. Using advanced animation and motion-capture software, coupled with traditional talents and interdisciplinary skills, re:Naissance Opera has empowered the world of opera, leading it into a brand-new virtual space.
The First Steps into a New World
Inspired by her love of both video games and opera, Debi Wong of re:Naissance Opera conceived of merging these two realms to create something truly unique: a true operatic experience built on 3D animation, character creation, and motion capture. With help from Art Director Conrad Sly and his wealth of experience in the 3D art space, re:Naissance set out to chart a brand-new direction for world-class virtual performance art.
“I think this is where the world of performance is going. I sit there in the room…constantly being blown away by the innovation.” – Omari Newton, Writer/Co-Director
This road is by no means simple. The challenge of working across such vastly different disciplines can be daunting, from having to learn new tools to finding a way to naturally merge art and technology. But these difficulties were overcome with help from Reallusion and its live motion-capture technology.
Merging Technology and Art
The development of this cutting-edge operatic experience had to work seamlessly with established traditional art forms such as dance and music. Transferring these elements into a virtual space while preserving details like facial expressions and natural body movement would be difficult for a fully experienced team of 3D artists, let alone a team of relative newcomers to the field. Even when led by experienced artists like Conrad Sly, whose years of work in the industry made him an invaluable part of the team, finding the right tools and learning to use them in a way that fit their vision would not be easy. However, Debi and Conrad's team found a partnership with Reallusion that allowed their vision to come to life.
“It is pretty complicated, but at the same time we’ve entered into these partnerships with, for example, Reallusion Software, and they’ve really created these user-friendly tools that have really helped us as indie producers.” – Debi Wong, Director
Debi Wong – Creative Director, Co-writer, Co-Director of Orpheus VR
The Orpheus VR team combines motion capture, sound engineering, and 3D animation in a live performance.
iClone 3D animation used for motion capture and animation editing.
Reallusion's suite of tools, most notably iClone, allowed Debi and her team to create beautiful cinematic animations and characters that are fully rigged and synced to both movement and voice. Using the Xsens suit and its host of sensors, the team captured the live motion performance of professional dancer Erika Mitsuhashi. Within iClone, that performance could be seamlessly recorded and tracked onto a high-quality virtual avatar, created in the same program, in real time, producing a virtual performance that matches the exact movements and dynamics of the real thing. This opens up a bevy of possibilities for live performance, with artists and dancers no longer bound to the stage.
The live performance from dancer Erika Mitsuhashi connects to iClone avatars.
In addition, the vocal performance of singer Mireille Asselin was effortlessly realized in the virtual production. Through its integration with the iPhone's depth-capture camera, Reallusion's iClone can track and record Mireille's facial performance and transfer it directly onto the virtual avatar's facial rig. This allows the high-level dancing and singing performances to be combined on a single avatar. Merging two performances of this caliber breaks unprecedented new ground, as staging such a performance would be nearly impossible any other way.
Beyond that, the ability to stage digital virtual performances in almost any setting provides a new dimension of storytelling. Within the same tool used to record, track, and create these characters and movements, Debi's team was able to build unique 3D environments in which to hold these performances. This allows storytellers to truly let their imaginations run wild and to see the full extent of their vision realized in ways never thought possible.
The Orpheus VR team combines voices and live performances in virtual production.
Connecting iClone avatars and animation to Unreal Engine through the Live Link plugin.
The iClone program opened a host of new possibilities for the team at re:Naissance Opera. With real-time capture of full-body and facial motion, plus tracking and rigging of customized virtual avatars, iClone offers a new avenue for virtual performance and creation. It is simple enough for a mixed, interdisciplinary group to grasp and learn, yet advanced enough to create a virtual performance befitting a professional operatic story. iClone has become invaluable in the world of real-time 3D animation.
Final Render in Unreal Engine
The post The Future of Live Performances: How Modern Performance Art is Advancing into the Virtual World appeared first on ArtStation Magazine.