

The studio Little Shadow created a virtual environment, supported by tools such as Blackmagic’s Ultimatte compositor, DaVinci Resolve and Unreal Engine, to bring ‘Shark Attack 360’, a Disney+ documentary, to life.

Shark Attack 360, a series of eight 60-minute episodes, explores the true behaviour of sharks and investigates why they attack humans, showcasing personal accounts and scientific evidence along the way. The production, which is part of National Geographic’s Sharkfest, rounds out its storytelling with set pieces built on cutting-edge visual effects.

London-based production company Arrow Media commissioned Little Shadow to create the three-dimensional graphics and visual effects. As part of the brief, the studio had to bring interactive virtual imagery to life and build a 360-degree shark laboratory to show the movement and anatomy of the animals.

Simon Percy, director of Little Shadow, chose Collins’ Music Hall in Islington, an alternative theatre that was never completed, as the filming environment. It was at this location that the unique 360-degree virtual shark laboratory was built, a visual undertaking combining real and computer-generated imagery. “To start, we used a LiDAR scan to create a 3D model of the venue for planning and scaling the project. That digital replica of the venue allowed for meticulous planning and previsualization, ensuring that each shot could be crafted ahead of time, helping to streamline both the shoot and edit,” explains Percy.
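As an illustration of that first step, the sketch below shows how a LiDAR scan of a venue might be loaded and measured for planning and scaling. It uses the open-source Open3D library and an illustrative file name; the article does not say which scanning or modelling tools Little Shadow actually used.

```python
# A minimal sketch of inspecting a LiDAR venue scan for planning and
# scaling. The point cloud file name is illustrative, not from the production.
import open3d as o3d

# Load the scanned point cloud of the venue.
pcd = o3d.io.read_point_cloud("collins_music_hall_scan.ply")

# Downsample to a manageable density for previsualization work.
pcd_down = pcd.voxel_down_sample(voxel_size=0.05)  # 5 cm voxels

# Measure the venue's overall footprint so virtual set elements and
# camera moves can be scaled to the real space.
aabb = pcd_down.get_axis_aligned_bounding_box()
width, depth, height = aabb.get_extent()  # axis-aligned extents in metres
print(f"Venue extents (m): {width:.1f} x {depth:.1f} x {height:.1f}")

# Optional visual check of the digital replica.
o3d.visualization.draw_geometries([pcd_down, aabb])
```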


Creating a virtual environment from scratch

Although the theatre met the lighting requirements, the lack of acoustic isolation and the layout of the spaces presented some challenges: “Everything echoed, and any sound from the kit was amplified, so we stationed our core workflow away from the set floor. This meant running a 4K signal across 110 meters and four floors using BNC cable,” says Little Shadow’s director.

Also, instead of using large LED screens, Little Shadow deployed a combination of custom systems and other tools on Shark Attack 360 to facilitate live overlays against a green screen background. In developing the workflow, Percy worked closely with experienced visual effects artist Lucas Zoltowski and the show’s director of photography, Matthew Beckett: “We needed a flexible solution to capture live content and integrate CGI assets, which played back on a virtual production (VP) box and provided immediate visual feedback. This was crucial for building the immersive underwater setting.”

After evaluating different options, the Little Shadow team decided to go with Unreal Engine and the Ultimatte 12 4K compositor to create the overlays on the fly. Signal management was handled with a Blackmagic Videohub 20×20 12G matrix switcher, while multiple HyperDeck Studio 4K recorders were used to capture and play back content.
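Both the Videohub and HyperDeck families can also be driven remotely over Blackmagic’s documented plain-text Ethernet protocols, which is one way a routing and capture setup of this kind can be automated. The sketch below assumes illustrative IP addresses, router ports and recorder assignments rather than details from the production.

```python
# A minimal sketch of remote control over Blackmagic's plain-text Ethernet
# protocols: the Videohub protocol (TCP 9990) for routing and the HyperDeck
# protocol (TCP 9993) for transport control. Addresses and port indices
# below are assumptions for illustration only.
import socket

def send_block(host: str, port: int, payload: str) -> str:
    """Open a TCP connection, send one protocol block and return the reply."""
    with socket.create_connection((host, port), timeout=5) as sock:
        sock.recv(4096)                      # consume the initial status dump
        sock.sendall(payload.encode("ascii"))
        return sock.recv(4096).decode("ascii")

# Route Videohub input 4 (e.g. the Ultimatte programme output) to output 7
# (e.g. a HyperDeck recorder). Indices are zero-based and the block is
# terminated by a blank line, per the Videohub protocol.
print(send_block("10.0.0.20", 9990, "VIDEO OUTPUT ROUTING:\n7 4\n\n"))

# Start and stop a recording on a HyperDeck Studio over its protocol.
print(send_block("10.0.0.31", 9993, "record\r\n"))
# ... capture the take ...
print(send_block("10.0.0.31", 9993, "stop\r\n"))
```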

“This was crucial for real-time decision making on set and immediate footage review, ensuring optimal takes and adjustments. To bring the underwater scenes to life, we used a green screen and the Ultimatte, which allowed us to integrate the virtual 3D elements into the scenes using augmented reality (AR). This enabled the presenter to interact with the sharks in real time, creating a more engaging viewing experience,” noted Percy.
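For context, the kind of green-screen keying described here can be illustrated in software, although the Ultimatte 12 4K performs it in dedicated hardware in real time. The sketch below uses OpenCV and illustrative file names to show the basic idea: pull a matte from the green backing, soften its edges and composite the presenter over a CG background plate.

```python
# A simplified software analogue of green-screen keying and compositing.
# This only illustrates the underlying idea; it is not how the Ultimatte
# hardware works internally. File names are illustrative.
import cv2
import numpy as np

# Foreground plate shot against green, and a rendered underwater background.
fg = cv2.imread("presenter_greenscreen.png")
bg = cv2.imread("underwater_cg_plate.png")
bg = cv2.resize(bg, (fg.shape[1], fg.shape[0]))

# Pull a matte by isolating the green backing in HSV colour space.
hsv = cv2.cvtColor(fg, cv2.COLOR_BGR2HSV)
green_lo = np.array([40, 60, 60])     # hue/sat/value thresholds (tune per shoot)
green_hi = np.array([85, 255, 255])
backing = cv2.inRange(hsv, green_lo, green_hi)

# Soften the matte edge to reduce fringing around fine detail such as hair.
matte = cv2.GaussianBlur(255 - backing, (5, 5), 0) / 255.0
matte = matte[:, :, np.newaxis]

# Composite: foreground over the CG background, weighted by the matte.
comp = (fg * matte + bg * (1.0 - matte)).astype(np.uint8)
cv2.imwrite("composite.png", comp)
```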


Camera movements in hybrid environments

One of the main challenges of Shark Attack 360 was managing the complex camera movements inherent in this hybrid approach. The project required extensive tracking and rotoscoping to blend live footage seamlessly with virtual backgrounds, particularly around fine details such as hair or close interactions with the virtual sharks.

“The Ultimatte was essential for live green screen keying as it guided the camera movements based on the virtual elements in the shot, meaning everyone on set could view this in real time. It gave us more room to experiment with the creative possibilities in the scenes while ensuring that we were all on the same page moving into post production,” Percy explains.

However, some shots and corrections did not involve green backgrounds at all, as the studio’s director concludes: “DaVinci Resolve Studio’s AI tools such as Magic Mask for rotoscoping proved to be hugely effective in these scenarios. We are increasingly exploring the use of DaVinci Resolve Studio and Fusion as more of our process is starting to lean that way. It’s incredibly fast in terms of EXRs and it’s all GPU accelerated, which is amazing. On this project it really defined the visuals and helped to blur the lines between CGI and live action.”
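Magic Mask itself is applied interactively on Resolve’s Color page, but DaVinci Resolve Studio also exposes a Python scripting API that can handle repetitive steps around it, such as batch-importing EXR renders into a project. The sketch below assumes a running Resolve instance with the scripting environment configured and an illustrative render folder path; it is not a description of Little Shadow’s actual pipeline.

```python
# A minimal sketch of DaVinci Resolve Studio's Python scripting API, used
# here only to batch-import an EXR render folder into the media pool.
# Assumes Resolve is running and the scripting environment variables are set.
import DaVinciResolveScript as dvr

resolve = dvr.scriptapp("Resolve")                     # connect to running Resolve
project = resolve.GetProjectManager().GetCurrentProject()
media_storage = resolve.GetMediaStorage()

# Add an EXR render folder (illustrative path) to the current project's media pool.
clips = media_storage.AddItemListToMediaPool(["/renders/shark_lab_exr/"])
print(f"Imported {len(clips)} clip(s) into {project.GetName()}")
```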

