The unstoppable talent show genre has moved from the humiliation of wannabes on American Idol, through the kinder turning chairs of The Voice, to its latest twist on injecting gentler competition back into the format: Fox’s Alter Ego, a kind of reincarnation for frustrated and inhibited singers.
Alter Ego features real singers disguised as digital avatars, accomplished with a near-real-time performance capture process. The singers perform in motion capture suits, covered by witness cameras and microphones, and their avatars are composited onto the stage of a live audience studio in the same building as the performance capture volume.
The render is simultaneously broadcast to the TV audience as a kind of hologram, while the judges and live audience watch the show on various monitors placed on-stage.
Unreal Engine is the rendering powerhouse behind the show, and Epic Games describes Alter Ego as a striking experience. “There are the singers, who are competing to be recognized as professional stars; the famous judges including Grimes, will.i.am, Nick Lachey, and Alanis Morissette; and even the live audience of 200 fans. There’s just one very important difference: every single competitor performing live on Alter Ego’s stage is an avatar.”
Director Sam French admits to having anxiety about relying on such complicated technology for a “shot as live” entertainment show. “I was excited but also nervous,” he said. “I had concerns over feet being on the ground and tracking and mocap. All of those things as you really are bringing two worlds together.”
The show had just 10 weeks to create the 20 avatar characters, each of which had to be outfitted with four different wardrobe options. Each avatar had its own individual effects, as well as several visual effects shared across all the characters.
Multiple cameras are tracked in the performers’ volume space, which is the same size as the actual stage. Michael Zinman, Lulu AR executive producer and co-executive producer of Alter Ego, detailed the workflow. “All of that data from the motion capture, both the body and the face go to the engines, including the camera tracking and the lighting. From there it gets composited on the stage,” he said.
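In practice, that means several capture streams (body mocap, facial capture, tracked camera, and lighting data) have to arrive at the engines as one synchronized frame. The sketch below, in Python with illustrative field names rather than the show’s actual data format, shows the general shape of bundling those streams by timecode.

```python
# Illustrative sketch only: bundling per-frame capture streams by timecode
# before they reach the render engines. Field names and structures are
# assumptions for clarity, not the show's actual data format.
from dataclasses import dataclass

@dataclass
class FramePacket:
    timecode: str      # e.g. "01:02:03:04" at the stage frame rate
    body_pose: dict    # joint name -> (x, y, z, qx, qy, qz, qw)
    face_curves: dict  # blendshape name -> weight, 0.0-1.0
    camera: dict       # tracked studio camera position, rotation, lens
    lighting: dict     # lighting values for the virtual stage

def merge_streams(timecode, body, face, camera, lighting):
    """Bundle the samples recorded for one timecode so body, face, camera,
    and lighting stay in sync when the avatar is rendered and composited."""
    return FramePacket(timecode, body[timecode], face[timecode],
                       camera[timecode], lighting[timecode])
```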
“The two important features for us from Unreal would be the DMX that gives us the success to actually program a show quickly and efficiently. And then certainly the compositing so that you buy into the actual avatar on the stage.
“I didn’t think it was going to be possible to do. Not because you couldn’t have an avatar on stage singing in a motion body capture suit. But how do you do 10 of these performances in a day, keep it on budget, and keep it transparent, as elementary as you can through the network, and through the producers on stage — no different to a show with live-action people.”
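The compositing Zinman mentions is, at its core, a standard alpha-over operation: the engine’s rendered avatar, with transparency, is laid over the live camera plate so the character appears to stand on the physical stage. A minimal sketch, assuming a straight-alpha RGBA render and images normalized to the 0-1 range:

```python
import numpy as np

def composite_over(avatar_rgba: np.ndarray, plate_rgb: np.ndarray) -> np.ndarray:
    """Alpha-over composite: lay the rendered avatar (H x W x 4, straight alpha,
    values 0-1) over the live camera plate (H x W x 3, values 0-1)."""
    rgb, alpha = avatar_rgba[..., :3], avatar_rgba[..., 3:4]
    return rgb * alpha + plate_rgb * (1.0 - alpha)
```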
Epic’s Unreal Engine team further explained the technical process. “Every performance on the show would be captured live using 14 cameras, eight of which were fitted with stYpe tracking technology. Avatar data including eye color, height, and special effects, as well as motion-capture data, lighting data, and camera data were then sent to a hub of Unreal Engines behind the main stage.
“The result was a virtual avatar that would always accurately reflect the motion-captured performance behind the scenes. When a contestant cried, the avatar would cry, too. And blush. And run their fingers through their hair. All final performances could then be seen directly through on-set monitors, making it easy for the live audience to engage with the final characters and be immersed in their journey, rather than feeling removed from the story due to the computer-generated imagery.”
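The stYpe camera tracking Epic describes is what lets the avatar stay locked to the physical stage: for every frame, the virtual camera must match the real camera’s position, rotation, and lens. The simplified pinhole-projection sketch below illustrates why that per-frame camera data is needed; the real system relies on calibrated lens profiles inside Unreal rather than this bare model.

```python
import numpy as np

def project_point(point_world, cam_rotation, cam_position, focal_px, principal_point):
    """Project a 3D stage-space point (e.g. an avatar joint) into the tracked
    camera's image. cam_rotation: 3x3 world-to-camera rotation; cam_position:
    camera position in world units; focal_px: focal length in pixels;
    principal_point: (cx, cy) image centre in pixels."""
    p_cam = cam_rotation @ (np.asarray(point_world, float) - np.asarray(cam_position, float))
    u = focal_px * p_cam[0] / p_cam[2] + principal_point[0]
    v = focal_px * p_cam[1] / p_cam[2] + principal_point[1]
    return u, v
```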
Dan Pack, managing director of Silver Spoon Animation, underlined the importance of a rigid control protocol. “The great thing about Unreal Engine versus traditional rendering pipelines is that we can preview incredible changes that would normally take a lot of time in post to render… so we are able to change character hair color, eye color, texture, and control the effects, all through this DMX control panel. We pushed DMX on this show further than it has ever gone before in a real-time setting.”
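DMX itself is simple: a universe carries up to 512 channels, each holding a value from 0 to 255, and anything patched to a channel can be driven from a lighting console. The sketch below shows one hypothetical way avatar attributes could be mapped onto a universe and decoded each frame; the channel layout is an illustration, not the show’s actual patch.

```python
# Sketch of how avatar "fixtures" can sit on a DMX universe alongside lights.
# The channel layout here is hypothetical; a real show's patch would differ.
CHANNEL_MAP = {
    "hair_color":   (0, 3),   # channels 1-3: RGB, 0-255 each
    "eye_color":    (3, 3),   # channels 4-6: RGB
    "fx_intensity": (6, 1),   # channel 7: effect strength
}

def decode_avatar_channels(universe: bytes) -> dict:
    """Turn raw DMX values (one byte per channel, 0-255) into normalized
    avatar attributes the render engine can apply each frame."""
    attrs = {}
    for name, (start, count) in CHANNEL_MAP.items():
        values = [universe[start + i] / 255.0 for i in range(count)]
        attrs[name] = values[0] if count == 1 else tuple(values)
    return attrs
```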
“In order to pull off a show like Alter Ego, you need Unreal Engine’s DMX capabilities; this is the vocabulary and protocol we all speak and the only way to bring in a network competition show on schedule,” said Zinman. “When lighting directors can program virtual and real studio lights on one console, and the characters can be programmed on another, you run through rehearsals much faster. This is how we were able to do 10 unique performances each day.”
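Console data typically reaches software over a network protocol such as Art-Net (DMX over UDP on port 6454), one of the protocols Unreal’s DMX plugin can ingest; whether Alter Ego used Art-Net or sACN isn’t stated. Purely as a rough sketch of the wire format, a single ArtDmx packet can be sent like this:

```python
import socket

def send_artdmx(channels: bytes, universe: int = 0, ip: str = "255.255.255.255"):
    """Send one DMX universe (up to 512 channel bytes) as an Art-Net ArtDmx
    packet over UDP. Sketch only; a console and DMX plugin handle this in production."""
    if len(channels) % 2:                 # ArtDmx data length must be even
        channels += b"\x00"
    packet = (
        b"Art-Net\x00"                    # protocol ID
        + (0x5000).to_bytes(2, "little")  # OpCode: ArtDmx
        + bytes([0, 14])                  # protocol version 14
        + bytes([0, 0])                   # sequence, physical
        + bytes([universe & 0xFF, (universe >> 8) & 0x7F])  # SubUni, Net
        + len(channels).to_bytes(2, "big")
        + channels
    )
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
    sock.sendto(packet, (ip, 6454))       # Art-Net uses UDP port 6454
    sock.close()
```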
In tandem, Silver Spoon was tapping Unreal Engine’s Live Link capabilities to stream data from more than ten mocap instances into the avatars. With virtually no latency, Unreal Engine is able to render high-quality animation captured from the actors onto the avatars in real time, as it happens. Live Link is also used for facial animation via the dedicated Live Link Face app, which enables facial performances to be captured via an iPhone or iPad and rendered in real time in the engine. Using Live Link removed the need for Alter Ego to write a dedicated plugin for the Vicon data packets, giving the team a free-flowing way to animate their skeletal meshes.
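Live Link Face streams ARKit-style facial blendshape curves from the phone into the engine. Conceptually, each frame is a set of named curves between 0 and 1 that drive the avatar’s facial rig; the sketch below illustrates that idea with an assumed rig interface and smoothing step, not Unreal’s actual Live Link API.

```python
# Conceptual sketch of what the Live Link Face data amounts to once it reaches
# the engine: per-frame ARKit-style blendshape curves (0.0-1.0) driving the
# avatar's facial morph targets. The rig class and smoothing are assumptions.
ARKIT_CURVES = ["eyeBlinkLeft", "eyeBlinkRight", "jawOpen",
                "browInnerUp", "mouthSmileLeft", "mouthSmileRight"]

class FaceRig:
    """Stands in for the avatar's facial morph targets."""
    def __init__(self):
        self.weights = {name: 0.0 for name in ARKIT_CURVES}

    def apply_frame(self, curves: dict, smoothing: float = 0.2):
        # Blend toward the incoming capture values to soften single-frame noise.
        for name, target in curves.items():
            if name in self.weights:
                current = self.weights[name]
                self.weights[name] = current + (target - current) * (1 - smoothing)

rig = FaceRig()
rig.apply_frame({"jawOpen": 0.8, "eyeBlinkLeft": 1.0})  # one captured frame
```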
Director Sam French sums up the potential of the tech behind Alter Ego: “Alter Ego is probably the best example of the scale of what you can do. Watching them interact on a very normal TV production schedule was incredible. And I think it’s just the start of what can be done.”