Zero Density Virtual Studio Demo at NAB Show 2022

Wednesday, 25 May 2022

It was great to be back at NAB Show. As part of its ongoing efforts to help productions work in better ways and create engaging broadcast experiences, Zero Density is upgrading its ZD Ecosystem with a new set of features. Featuring three separate spaces (an AR demo stage, a state-of-the-art virtual studio and demo pods), the Zero Density booth provided a front-row seat to the real-time graphics technologies powering daily shows, live events and more from the world's biggest broadcasters.


About Zero Density

Zero Density is a world leader in virtual studio, augmented reality and real-time graphics technologies for the broadcast, live events and esports industries. From the Olympics to Louis Vuitton virtual fashion shows, Zero Density's Unreal Engine-native platform, Reality Engine — which includes a real-time broadcast compositing system and its proprietary keying technology, Reality Keyer — has been used by some of the biggest companies in the world. Clients include The Weather Channel, RTL, Fox Sports and Warner Media.

Reality Engine 4.27 was at work during the demonstrations. The real-time node-based compositor, which can key, comp and render photoreal graphics in real time, is natively built to take advantage of Unreal Engine features that make virtual studio graphics more photorealistic. The virtual studio was ray-traced, showcasing that cinematic-quality visuals are possible for ultra-realistic, live on-air graphics and virtual sets. With ray tracing, reflections, soft shadows, area lights, refractions and other effects of light travelling around us produce a result that is superior in realism to rasterization.
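
To illustrate the principle behind that difference (a generic Python sketch, not Reality Engine code): a ray tracer follows light paths explicitly, so a mirror reflection is computed by bouncing the camera ray around the surface normal and shading whatever the new ray hits, rather than relying on rasterized approximations.

    import numpy as np

    def reflect(direction, normal):
        # Mirror an incoming ray direction around a surface normal:
        # R = D - 2 * (D . N) * N -- the step a ray tracer uses to spawn
        # the reflection ray whose hit point supplies the reflected colour.
        return direction - 2.0 * np.dot(direction, normal) * normal

    view_ray = np.array([0.0, -0.7071, -0.7071])   # camera ray angled 45 degrees down
    floor_normal = np.array([0.0, 1.0, 0.0])       # upward-facing studio floor
    print(reflect(view_ray, floor_normal))         # -> [0.  0.7071  -0.7071]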

RealityEngine AMPERE is the hardware product that delivers the highest performance and stability for virtual studio and broadcast graphics productions.

Reality Keyer, the image-based keyer that runs on the GPU, provides high-quality results for keying contact shadows, transparent objects and sub-pixel details like hair, in any shot. With Reality Keyer, racking focus back and forth between the presenter's hair and the background is handled without visual defects.
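
Reality Keyer's implementation is proprietary and GPU-based, but a toy keyer sketch in Python (with invented thresholds) shows the general idea of producing a soft, per-pixel alpha matte rather than a hard on/off mask, which is what preserves fine detail such as hair and transparency:

    import numpy as np

    def soft_matte(rgb, low=0.1, high=0.4):
        # rgb: float image in [0, 1], shape (H, W, 3), shot against a green screen.
        # "Greenness" measures how far green exceeds the other channels; mapping it
        # through a soft ramp instead of a hard threshold keeps sub-pixel edges.
        r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
        greenness = g - np.maximum(r, b)
        alpha = 1.0 - np.clip((greenness - low) / (high - low), 0.0, 1.0)
        return alpha  # 1 = opaque foreground, 0 = fully keyed out, in between = soft edge

    frame = np.random.rand(4, 4, 3)   # stand-in for a camera frame
    print(soft_matte(frame).round(2))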

Coupled with TRAXIS talentS, the demo showcased how visual accuracy is vital for virtual studio productions to achieve the highest photorealism. talentS, the AI-powered talent tracking system, automatically recognizes the presenter's 3D location and sends that positional data to Reality Engine, which then places the talent inside the 3D world at the correct depth. As a result, the presenter's real reflections and shadows fall exactly where they should.
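
The exact data exchange is Zero Density's own, but the depth logic can be sketched generically in Python (hypothetical names, not the Reality Engine API): once the presenter has a tracked 3D position, their camera-space depth can be compared against that of each virtual object to decide what renders in front of the talent and what renders behind.

    import numpy as np

    def camera_depth(world_pos, camera_pos, camera_forward):
        # Depth of any tracked or virtual point along the camera's viewing axis.
        return float(np.dot(np.asarray(world_pos) - np.asarray(camera_pos), camera_forward))

    camera_pos = np.array([0.0, 1.6, 0.0])
    camera_forward = np.array([0.0, 0.0, 1.0])   # camera looking down +Z

    presenter = [0.5, 0.0, 3.0]                  # position reported by the tracking system
    virtual_props = {"desk": [0.0, 0.0, 2.0], "backdrop": [0.0, 0.0, 8.0]}

    talent_depth = camera_depth(presenter, camera_pos, camera_forward)
    for name, pos in virtual_props.items():
        side = "in front of" if camera_depth(pos, camera_pos, camera_forward) < talent_depth else "behind"
        print(f"{name} renders {side} the presenter")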

This accurate positional data enables virtual objects to reflect and cast shadows onto the real world, and the real world to reflect on the virtual objects. The presenter can also interact with the AR objects and build advanced scenarios, such as a virtual light that follows them or AR objects that trigger automatically when they walk towards a defined location.
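
As a rough illustration of the "walk towards a location to trigger an AR object" scenario (a standalone Python sketch with made-up values, not the actual Reality Engine interface), the tracked position stream is simply compared against a predefined trigger point every frame:

    import math

    def near(position, target, radius=0.75):
        # True when the tracked presenter is within `radius` metres of the trigger spot.
        return math.dist(position, target) <= radius

    trigger_spot = (2.0, 0.0, 4.0)   # floor location that should spawn the AR object
    ar_object_visible = False

    # Simulated per-frame positions coming from the talent tracking system.
    for frame_pos in [(0.0, 0.0, 3.0), (1.0, 0.0, 3.5), (1.8, 0.0, 3.9)]:
        if not ar_object_visible and near(frame_pos, trigger_spot):
            ar_object_visible = True
            print("Presenter reached the trigger zone: show the AR object")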

For more information on the latest virtual studio solutions, among many other innovations, and for the opportunity to experience the technology hands-on, please visit or contact Me Ga Company via https://megasystems.com.vn/