How Flume and Unreal Engine Brought Coachella to the Metaverse

A version of this article was published in TIME’s newsletter Into the Metaverse. Sign up for a weekly guide to the future of the Internet. Previous issues of the newsletter can be found here.

If you were watching Coachella’s YouTube livestream on Saturday night, you might have done a double take when giant leafy trees and a parrot the size of Godzilla slowly rose above electronic artist Flume’s stage. Were they giant bouncy castles? Mirages on a 60-meter-high LED screen? Pure hallucinations?

None of the above. This year, Coachella teamed up with Unreal Engine, Epic Games’ 3D software development tool, which I wrote about in this newsletter two weeks ago, to create what organizers say is the first livestream to add augmented reality (AR) technology to a music festival performance. Unreal Engine worked with Flume’s artistic team and other technical collaborators to create massive psychedelic 3D images that blended seamlessly into his set design and scenery, floating around the artist and in the Indio sky.

But no one at the festival could see those huge parrots – only viewers at home. The result, which lasted only a few minutes, serves as a template for how live event planners can use metaverse technology in the future to create unique experiences for viewers at home. Many metaverse builders believe that live events will increasingly hybridize, with both digital and real-world components — and that immersive tools can help make each version of an event distinctive and desirable in its own right. “There’s no point in virtually simulating the live music experience,” said Sam Schoonover, Coachella’s innovation lead. “Instead, you have to give fans something new and different that they can’t get in real life.”

In recent years, AR visuals have made their way into live broadcasts, albeit mostly as minor gimmicks. Riot Games brought a giant dragon to the opening ceremony of the 2017 League of Legends World Championship finals; a camera followed the screeching beast as it flew around the fans in the stadium. Last September, a giant panther similarly leaped across the Carolina Panthers’ stadium. (The panther was also made with Unreal Engine.)

Schoonover has been trying to bring similar effects to Coachella’s livestream for years in an effort to broaden audiences beyond the Empire Polo Club’s confines. “The online audience for shows is growing exponentially, to the point that maybe 10 or 20 times more people are watching the show through a livestream than at the festival,” says Schoonover. “Because the experience at home can never compare to the experience at the festival, we want to offer artists new ways to express themselves and increase viewership around the world.”

However, previous AR experiments at Coachella were thwarted by production costs and a lack of artist interest. This year, it took a partnership with Epic – which aims to lower the barrier to entry for 3D creators – and buy-in from Flume – an electronic musician who has long emphasized visual craftsmanship in his concerts – to bring the project to fruition. Key players in the process were artist Jonathan Zawada, who has worked extensively on audiovisual projects with Flume, including NFTs, and director Michael Hili, who directed Flume’s extremely trippy recent music video “Say Nothing.”

The result: huge Australian birds (Flume is Australian), brightly colored flowers, and leafy trees swaying in the wind above the stage and the teeming crowd. Three broadcast cameras equipped with additional hardware tracking allowed the production team to insert those 3D graphics into the video feed in real time.

The graphics are just the beginning of what could be created in AR for live concert settings, Schoonover says. Future performers could, for example, have lighting effects swirling around their faces, or synchronize their dance moves with those of surrounding avatars. It’s easy to imagine production designers adding in real time the kind of effects that ubiquitous music video director Cole Bennett adds to his videos in post-production, or a Snoop Dogg performance in which he’s flanked by his avatars from The Sandbox metaverse.

And Schoonover says these AR experiences will reach another level once AR glasses become mainstream. Eventually, you may be able to see the concert in 3D from your own home, surrounded by floating AR birds, plants, and anything else 3D artists dream up. “When it comes to people who want to get that Coachella experience from their couch, this is the starting point,” he says.
