
AJA - WePlay - Unreal Engine 5

WePlay Studios, a company specializing in eSports content production, has consolidated a live event broadcast workflow that blurs the boundary between the physical and the digital, processing 12G-SDI signals through the Unreal Engine 5 graphics engine with AJA solutions.

With locations in Kyiv (Ukraine) and Los Angeles (California), WePlay fuses video game content and storytelling to produce broadcasts of eSports tournaments for top-tier titles (Dota 2, CS:GO, Valorant and Rocket League), as well as live event productions such as the VTuber Awards, for which its team delivered a fully virtual production last year. Hosted by virtual character Filian in partnership with talent management agency Mythic Talent, the five-hour event paid tribute to the best virtual creators online.

WePlay Studios helped bring the event to audiences around the world with a virtual broadcast, combining physical production facilities and equipment with extensive virtual production engineering and design. In the words of Aleksii Gutiantov, head of virtual production, “While we’d previously incorporated AR into live esports productions, this show marked our first foray into a fully virtual event managed with virtual cues; it’s the most challenging technological endeavor we’ve ever taken on.”

To bring the event to fruition, Gutiantov directed and coordinated the Los Angeles production remotely from his laptop in Europe, using intercom communication with more than 16 crew members and orchestrating eight days of non-stop pre-production to prepare the broadcast. His team first created a real-time rendering of a fully virtual Filian to incorporate into the live production using motion capture (mocap) technology. They then used 20 witness cameras to capture the full-body performance, including precise finger movements, and combined it with additional technology to transmit facial motion capture data.
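
The article does not detail the data format WePlay used to combine body, finger and facial capture; purely as an illustration, the sketch below merges hypothetical body and face samples by timestamp and streams the result to a render host as UDP/JSON. All type and field names are invented for the example.

```python
import json
import socket
import time
from dataclasses import dataclass, field

# Hypothetical frame types: the article does not specify WePlay's data
# format, so these fields are illustrative only.
@dataclass
class BodyFrame:
    timestamp: float                                  # capture time, seconds
    joints: dict = field(default_factory=dict)        # joint name -> (x, y, z)

@dataclass
class FaceFrame:
    timestamp: float
    blendshapes: dict = field(default_factory=dict)   # shape name -> 0..1 weight

def merge(body: BodyFrame, faces: list[FaceFrame]) -> dict:
    """Attach the face sample closest in time to a body sample."""
    face = min(faces, key=lambda f: abs(f.timestamp - body.timestamp))
    return {"t": body.timestamp, "joints": body.joints,
            "blendshapes": face.blendshapes}

def stream_pose(pose: dict, addr=("127.0.0.1", 54321)) -> None:
    """Push one merged pose to the render host as a UDP/JSON datagram."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.sendto(json.dumps(pose).encode("utf-8"), addr)

if __name__ == "__main__":
    body = BodyFrame(time.time(), {"index_finger_01": (0.0, 12.5, 3.1)})
    faces = [FaceFrame(time.time(), {"jawOpen": 0.4, "eyeBlinkLeft": 1.0})]
    stream_pose(merge(body, faces))
```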

The live broadcast of the event included a vast virtual stage, but Filian’s character was situated on a smaller stage, surrounded by a digitally reconstructed version of WePlay’s physical stage in Los Angeles. To ensure that all physical pan, tilt and focus movements were translated directly into the virtual environment, WePlay Studios’ camera operators ran three physical cameras synchronized with virtual cameras. Operators on the physical set could switch between various angles within the virtual arena using iPads connected to the virtual cameras, creating the illusion of a dozen cameras instead of three.
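
The article does not name the camera-tracking protocol that carried pan, tilt and focus into the engine; FreeD, a common UDP protocol for encoded camera heads, is assumed here. The sketch decodes one FreeD type-D1 packet using the scaling factors from the published spec; treat it as illustrative rather than a description of WePlay’s rig.

```python
def s24(b: bytes) -> int:
    """Decode a big-endian signed 24-bit integer."""
    v = int.from_bytes(b, "big")
    return v - (1 << 24) if v & 0x800000 else v

def parse_freed_d1(pkt: bytes) -> dict:
    """Decode a 29-byte FreeD type-D1 camera tracking packet.

    Per the FreeD spec, angles arrive as degrees * 32768 and positions
    as millimetres * 64; zoom and focus are raw encoder counts.
    """
    if len(pkt) != 29 or pkt[0] != 0xD1:
        raise ValueError("not a FreeD D1 packet")
    return {
        "camera_id": pkt[1],
        "pan_deg":  s24(pkt[2:5])  / 32768.0,
        "tilt_deg": s24(pkt[5:8])  / 32768.0,
        "roll_deg": s24(pkt[8:11]) / 32768.0,
        "x_mm": s24(pkt[11:14]) / 64.0,
        "y_mm": s24(pkt[14:17]) / 64.0,
        "z_mm": s24(pkt[17:20]) / 64.0,
        "zoom":  int.from_bytes(pkt[20:23], "big"),
        "focus": int.from_bytes(pkt[23:26], "big"),
    }

# Example: bind a UDP socket to the tracker's port and feed each
# datagram into parse_freed_d1() to drive a virtual camera transform.
```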


AJA in WePlay’s virtual world

To make the production look more authentic, WePlay Studios connected the physical stage lights to corresponding virtual lights, allowing the team to manipulate the virtual stadium lighting by adjusting the real environment from a lighting control console. Video playback was also integrated into the virtual world: live event visuals software connected to the virtual venue launched and controlled the graphics displayed on the virtual stage screens. AJA Kona 5 video I/O cards played a crucial role in the 12G-SDI signal chain, with the final SDI feed forwarded to an AJA Kumo 3232-12G video router for availability throughout the broadcast chain.
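
The article does not specify how the lighting console reached the virtual lights; Art-Net (DMX over UDP) is a common bridge and is assumed in the sketch below, which parses ArtDMX packets and maps a dimmer channel (an assumed assignment) onto a virtual light intensity callback.

```python
import socket

ARTNET_PORT = 6454
OP_DMX = 0x5000

def parse_artdmx(pkt: bytes):
    """Return (universe, dmx_data) for an ArtDMX packet, else None."""
    if pkt[:8] != b"Art-Net\x00":
        return None
    if int.from_bytes(pkt[8:10], "little") != OP_DMX:
        return None
    universe = int.from_bytes(pkt[14:16], "little")   # SubUni + Net
    length = int.from_bytes(pkt[16:18], "big")
    return universe, pkt[18:18 + length]

def run(virtual_lights: dict) -> None:
    """Map incoming DMX dimmer levels onto virtual light intensities.

    `virtual_lights` maps a DMX channel number (1-based, an assumed
    patch) to a callable that sets a light's 0..1 intensity in the
    engine.
    """
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("0.0.0.0", ARTNET_PORT))
    while True:
        parsed = parse_artdmx(sock.recv(1024))
        if parsed is None:
            continue
        _, dmx = parsed
        for channel, set_intensity in virtual_lights.items():
            if channel <= len(dmx):
                set_intensity(dmx[channel - 1] / 255.0)
```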

“Our Kona 5 cards were instrumental in allowing us to receive 12G-SDI signals, integrate them into an Unreal Engine 5 environment, and composite the final output in SDI. It’s the best product on the market,” explained Gutiantov. “And our Kumo routers let us build infrastructure for large remote and on-site productions like this one and manage everything from a single, convenient web interface thousands of kilometers away. We also love that we can save pre-programmed salvo routing configurations for SDI signals, and we never have to worry about them going down; I’ve been working with them since 2017 on various projects, and they’ve never failed me.”
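
Gutiantov mentions managing Kumo routing from a web interface and recalling salvos. As a sketch of what scripted control of such a router might look like: AJA routers accept HTTP GET requests, but the exact parameter naming below is an assumption modelled on AJA’s REST-style interface, not the confirmed API; check the KUMO manual for the real identifiers.

```python
import urllib.request

def set_crosspoint(router_ip: str, destination: int, source: int) -> None:
    """Route `source` to `destination` over the router's HTTP interface.

    The parameter name (eParamID_XPT_Destination{N}_Status) is an
    assumption styled after AJA's REST parameters -- verify it against
    the KUMO documentation before use.
    """
    url = (f"http://{router_ip}/config?action=set"
           f"&paramid=eParamID_XPT_Destination{destination}_Status"
           f"&value={source}")
    urllib.request.urlopen(url, timeout=5)

def recall_salvo(router_ip: str, routes: dict) -> None:
    """Apply a pre-programmed set of routes (destination -> source)."""
    for destination, source in routes.items():
        set_crosspoint(router_ip, destination, source)

# Hypothetical example: send the composited program feed (source 1) to
# transmission (destination 1) and a clean feed to an ISO record path.
# recall_salvo("192.168.0.12", {1: 1, 2: 7})
```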


Bringing Kona and Unreal Engine 5 together

Kona 5 allowed the WePlay Studios team to harness the power of Unreal Engine to create a complete virtual production hub capable of handling 12G-SDI workflows. This let them take full advantage of augmented reality technology, from camera tracking to motion capture and data-driven graphics, while delivering live virtual production broadcasts without technical mishaps in the composite. It also allowed them to produce UltraHD key and fill signals from a single card in all common formats, using Pixotope as a keyer for 4K with the failover functions familiar from FHD workflows.
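
For readers unfamiliar with key and fill: the fill carries the graphic’s color and the key carries its transparency, so a downstream keyer (Pixotope in this production) combines the pair over the program image. A minimal numpy sketch of that composite, assuming the broadcast convention of a fill premultiplied by its key:

```python
import numpy as np

def composite_key_fill(background, fill, key):
    """Composite a key/fill pair over a background frame.

    `fill` is assumed premultiplied by `key`; all arrays are float32
    in the 0..1 range (HxWx3 images, HxWx1 key).
    """
    return fill + background * (1.0 - key)

# 2160p frame buffers, as in the UltraHD workflow described above.
h, w = 2160, 3840
bg   = np.zeros((h, w, 3), dtype=np.float32)   # virtual stage render
fill = np.zeros((h, w, 3), dtype=np.float32)   # graphics fill signal
key  = np.zeros((h, w, 1), dtype=np.float32)   # matching key (alpha)
program = composite_key_fill(bg, fill, key)
```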

“Kona 5 really helps accelerate operations on projects like this, which require a lot of compute power for motion-adaptive deinterlacing. Furthermore, the card’s multi-channel hardware processing accelerated compute-intensive operations so that we could combine multiple video sources into a single output in Unreal Engine 5, up/down/cross-convert, and mix/composite at all resolutions. These processes are essential for handling video content of any resolution, ensuring that the final output meets broadcast quality standards,” explains Gutiantov.
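
Motion-adaptive deinterlacing, which Gutiantov cites as the compute-heavy step, chooses per pixel between keeping woven field lines (static areas) and interpolating them (moving areas). A toy numpy sketch of that decision logic, not AJA’s actual hardware implementation:

```python
import numpy as np

def deinterlace_motion_adaptive(prev_frame, curr_frame, threshold=0.04):
    """Toy motion-adaptive deinterlace of one grayscale frame (0..1).

    Where a pixel changed little since the previous frame we keep the
    woven field lines (full vertical detail); where it moved we fall
    back to line interpolation (bob) to avoid combing artifacts.
    """
    # Per-pixel motion estimate against the previous frame.
    motion = np.abs(curr_frame - prev_frame) > threshold

    # Bob: rebuild interior lines from the average of their neighbours.
    bob = curr_frame.copy()
    bob[1:-1] = 0.5 * (curr_frame[:-2] + curr_frame[2:])

    # Weave where static, bob where moving.
    return np.where(motion, bob, curr_frame)

# Example on synthetic frames: a static scene keeps the weave everywhere.
# prev = np.random.rand(1080, 1920); curr = prev.copy()
# out = deinterlace_motion_adaptive(prev, curr)
```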


More AJA solutions at WePlay’s virtual events

In addition to Kona 5 and Kumo, WePlay also used AJA’s Ki Pro Ultra 12G recorders to meet the high quality recording standards demanded by the project: “The flexibility and reliability of our Ki Pro Ultra 12G recorders were essential. The devices are indispensable; they allow us to support multi-channel HD recording or single-channel UltraHD, and we can swap out recording media on the fly, which is convenient and reliable, especially for long-format live broadcasts and when clients require high-bitrate UltraHD materials for post. Plus, they continue to run smoothly for more than 12 consecutive hours,” says WePlay’s head of virtual production.
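
The Ki Pro family can be driven over the network as well as from the front panel, which is convenient when several recorders must roll together for a long-format broadcast. The sketch below mirrors the style of AJA’s HTTP parameter interface, but the parameter name and command codes are assumptions to verify against the Ki Pro Ultra 12G manual.

```python
import urllib.request

# Transport command codes are assumptions modelled on AJA's HTTP
# parameter interface; verify against the Ki Pro Ultra 12G manual.
RECORD, STOP = 3, 4

def transport(recorder_ip: str, command: int) -> None:
    """Send a transport command to a Ki Pro over its HTTP interface."""
    url = (f"http://{recorder_ip}/config?action=set"
           f"&paramid=eParamID_TransportCommand&value={command}")
    urllib.request.urlopen(url, timeout=5)

# Hypothetical example: roll every recorder in the rack at once.
# for ip in ("192.168.0.21", "192.168.0.22"):
#     transport(ip, RECORD)
```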

The WePlay team also built a preview infrastructure comprising a series of mini-converters to down-convert the 12G-SDI signal and forward the resulting 3G-SDI signals to their AJA Kumo video router. Using AJA HD5DA SDI distribution amplifiers, the team distributed the preview feeds across all monitors in the stadium for easier management. The setup, which also used salvo routing configurations for SDI signals regardless of the nature of the data source, allowed precise control over the production view that WePlay Studios provided to its partners, talent, camera operators, motion capture crew and the entire production team at any given moment. AJA’s ROI-DP DisplayPort-to-SDI mini-converters proved to be a key part of this preview infrastructure, allowing the team to mirror computer monitors into the broadcast pipeline with region-of-interest scaling.
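
As an illustration of the region-of-interest idea the ROI-DP implements in hardware, here is a minimal software sketch (nearest-neighbour, numpy) of cropping a window from a desktop frame and scaling it to a broadcast raster; the window coordinates are hypothetical.

```python
import numpy as np

def roi_scale(frame, roi, out_size):
    """Crop a region of interest from a desktop frame and scale it to
    the broadcast raster (nearest-neighbour for brevity).

    frame: HxWx3 array; roi: (x, y, w, h) in source pixels;
    out_size: (out_w, out_h) of the SDI raster.
    """
    x, y, w, h = roi
    crop = frame[y:y + h, x:x + w]
    out_w, out_h = out_size
    # Map each output pixel back to a source pixel in the crop.
    ys = np.arange(out_h) * h // out_h
    xs = np.arange(out_w) * w // out_w
    return crop[ys][:, xs]

# Hypothetical example: lift a 1600x900 window from a 2560x1440
# desktop into a 1080p preview feed.
# feed = roi_scale(desktop, roi=(160, 90, 1600, 900), out_size=(1920, 1080))
```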

The intricate set-up of WePlay Studios’ VTuber Awards production is just one of many examples of how virtual production technology is transforming modern entertainment experiences. According to Aleksii Gutiantov, the level of interactivity it brings opens up exciting new possibilities for live entertainment genres, blurring the boundaries between viewers and the virtual worlds we create: “WePlay is not just staying within the confines of the gaming industry; we’re branching out to music and broader entertainment directions. We’re currently in the early stages of planning and discussions for projects that straddle these new frontiers.”

6 May, 2024 • Section: AR / VR / XR, Studio