How augmented reality put five Madame Xs on stage at once
Engadget.com – the original home for technology news and reviews – features a very interesting article by Kris Holt discussing how augmented reality brought four virtual versions of Madame X – a secret agent, a musician, a cha-cha instructor and a bride – on stage at Wednesday’s Billboard Music Awards. As you may remember, Madonna herself teased the new technology she was eager to share with her fans a few weeks ago, and even posted some images from the green-screen set where what you saw on TV last night was first being created.
Holt explains how the avatars – not holograms, as they weren’t visible to the naked eye – wove in and out of the inventive performance, bursting into butterflies and puffs of smoke with the help of volumetric capture – essentially 3D video – and Unreal Engine.
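To give a rough sense of what “essentially 3D video” means in practice: a volumetric capture can be thought of as a sequence of timestamped 3D meshes, and playback amounts to drawing whichever mesh matches the current show clock. Here is a minimal sketch in plain C++ – every type and name is a hypothetical stand-in, not Sequin’s pipeline or an Unreal Engine API:

```cpp
#include <cstdint>
#include <vector>

struct Vec3 { float x, y, z; };

// One reconstructed 3D "video frame" of the performer.
struct VolumetricFrame {
    double timestampSec;            // when this frame occurs in the capture
    std::vector<Vec3> vertices;     // reconstructed surface geometry
    std::vector<uint32_t> indices;  // triangle list into `vertices`
};

// A whole capture: typically 30-60 such frames per second of performance.
struct VolumetricClip {
    std::vector<VolumetricFrame> frames;  // sorted by timestampSec

    // Return the frame to draw at `clockSec` (nearest-previous lookup).
    const VolumetricFrame& frameAt(double clockSec) const {
        size_t lo = 0, hi = frames.size() - 1;
        while (lo < hi) {                  // binary search on timestamps
            size_t mid = (lo + hi + 1) / 2;
            if (frames[mid].timestampSec <= clockSec) lo = mid;
            else hi = mid - 1;
        }
        return frames[lo];
    }
};

int main() {
    VolumetricClip clip{{{0.000, {}, {}}, {0.033, {}, {}}, {0.066, {}, {}}}};
    const VolumetricFrame& f = clip.frameAt(0.05);  // picks the 0.033s frame
    return f.timestampSec > 0.0 ? 0 : 1;
}
```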
There were several environmental effects that livened up the show, including digital rain, clouds, greenery and splashes of color, all of which blended with the physical staging in an attempt to tell a cohesive story. Madonna has reinvented herself countless times over her storied career, so it’s perhaps little surprise that she tried something like this.
As one member of Madonna’s creative team explains in the article: “After meeting with Guy Oseary, we settled on the idea of incorporating augmented reality into the performance. I wanted to explore a way to involve her Madame X personas in the performance, as well as the possibility of the real Madonna actually being able to perform with them.”
The team brought the concept to a new creative AR company called Sequin, which took on the challenge of piecing the performance together. While it was the first time Madonna and Maluma performed the song live, it also marked the first project for Sequin.
Co-founder Lawrence Jones, who oversees creative, production and technology development at Sequin, believes it was the first time there’s been a broadcast AR performance using volumetric capture, which he called “the next revolution” of the medium.
“What’s new about this is that it’s a completely choreographed performance where Madonna and Maluma are dancing with four digital versions of Madonna in perfect choreography,” Jones said.
The show was something of a global affair. The volumetric capture process took place at a studio in London, while a Canadian company created the digital assets and environments in Unreal Engine for Sequin to pull together.
“The interesting thing about real-time visual effects in broadcast augmented reality [is that] a big portion of the work is happening in pre-production,” Jones added. “All the creation of the assets, all of the animation, most of the lighting is all done ahead of time.”
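One way to picture that split between pre-production and the live broadcast is a pre-authored cue list: every effect is built and timed offline, and the real-time system merely fires each cue as the show clock passes it. A minimal sketch, with invented names and assuming a simple per-frame update loop (nothing here reflects Sequin’s actual tooling):

```cpp
#include <functional>
#include <iostream>
#include <string>
#include <vector>

struct Cue {
    double atSec;                 // pre-authored trigger time in the song
    std::string name;             // e.g. "burst_into_butterflies"
    std::function<void()> fire;   // effect prepared entirely offline
};

class CueTimeline {
    std::vector<Cue> cues_;       // sorted by atSec at author time
    size_t next_ = 0;             // index of the next unfired cue
public:
    explicit CueTimeline(std::vector<Cue> cues) : cues_(std::move(cues)) {}

    // Called once per rendered frame with the current show clock.
    void update(double clockSec) {
        while (next_ < cues_.size() && cues_[next_].atSec <= clockSec) {
            cues_[next_].fire();  // the heavy lifting happened in pre-production
            ++next_;
        }
    }
};

int main() {
    CueTimeline timeline({
        {12.5, "burst_into_butterflies", [] { std::cout << "butterflies\n"; }},
        {48.0, "digital_rain",           [] { std::cout << "rain\n"; }},
    });
    for (double t = 0.0; t < 60.0; t += 1.0 / 30.0)  // pretend 30 fps loop
        timeline.update(t);
}
```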
A critical aspect of making performances such as this work is real-time camera tracking. Jones and his team use a tool called Brainstorm, layering broadcast objects, including motion graphics, character generation and real-time data, on top of Unreal Engine. Jones explained that Brainstorm feeds data from the physical cameras into Unreal Engine so that the real set lines up with its digital replica, ensuring the AR renders appear in the right place at the right time.
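The underlying geometry problem is easy to sketch: given the tracked camera’s lens parameters, rebuild a virtual camera whose projection matches the physical one, so AR objects land on the right pixels. The fragment below shows just the projection half of that in plain C++, using a standard OpenGL-style perspective matrix – the types are hypothetical, and the real Brainstorm-to-Unreal data path is naturally far richer:

```cpp
#include <cmath>

// Per-frame data reported by the camera tracking system (hypothetical).
struct TrackedCamera {
    float posX, posY, posZ;   // physical camera position (metres)
    float yaw, pitch, roll;   // orientation (radians)
    float verticalFovRad;     // field of view from the lens encoder
    float aspect;             // image aspect ratio (width / height)
};

struct Mat4 { float m[16]; };  // column-major 4x4 matrix

// Build a perspective projection matching the physical lens, so the
// virtual scene projects exactly like the broadcast camera's image.
Mat4 projectionFrom(const TrackedCamera& cam, float nearZ, float farZ) {
    float f = 1.0f / std::tan(cam.verticalFovRad * 0.5f);
    Mat4 p{};  // zero-initialised
    p.m[0]  = f / cam.aspect;
    p.m[5]  = f;
    p.m[10] = (farZ + nearZ) / (nearZ - farZ);
    p.m[11] = -1.0f;
    p.m[14] = (2.0f * farZ * nearZ) / (nearZ - farZ);
    return p;
}
```

The position and orientation fields would feed a matching view matrix in the same way, which is how the digital set stays registered to the real one as the camera moves.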
Once Sequin was on site in Las Vegas, the team tweaked the production so it would fully integrate with the actual performance setting, making adjustments for factors such as lighting, shadows, reflections, timing and placement. According to Jones, the Billboard Music Awards and Dick Clark Productions (which produced the broadcast) were “essential in getting this to happen” and were “super accommodating” to Sequin and Madonna. Sufficient stage time was vital to make sure the live and virtual aspects lined up correctly – no mean feat for a show with more than a dozen performers who all needed rehearsal time.
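Those on-site tweaks can be imagined as a handful of measured correction values – a placement offset here, a latency compensation there – layered on top of the pre-produced content each frame. A deliberately simplified sketch, assuming the corrections reduce to a spatial offset and a clock shift (purely illustrative, not how Sequin models calibration):

```cpp
// Measured on-site corrections applied on top of the pre-built show.
struct Calibration {
    float dx, dy, dz;      // placement correction for the AR content (metres)
    double latencySec;     // measured camera-to-render delay
};

// Shift the render clock so the virtual performers stay locked to the
// live choreography despite pipeline latency.
double renderClock(double showClock, const Calibration& cal) {
    return showClock - cal.latencySec;
}

// Nudge a pre-authored position by the measured placement error.
void applyPlacement(const Calibration& cal, float& x, float& y, float& z) {
    x += cal.dx; y += cal.dy; z += cal.dz;
}
```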
Read the full story on Engadget.com.