
Tracking in Virtual Production | The Technical Code of the Oscars' Biggest Winner


The Biggest Winner of the Oscars: Dune

01


Recently, the winners of the 94th Academy Awards were announced, and "Dune" swept six categories: Best Visual Effects, Best Cinematography, Best Production Design, Best Sound, Best Original Score, and Best Film Editing. Ten nominations and six wins, almost all of them technical awards: such an eye-catching record undoubtedly made it the biggest winner of this cinematic feast!


The technical code of Dune

02

Some netizens commented: the visual effects of "Dune" are simply stunning! Indeed, the film's lead visual effects team deserves much of the credit for making "Dune" such a hit. They have won six Academy Awards in the past eight years, and this award marks their seventh Oscar for Best Visual Effects. That team is DNEG.

Virtual Production Short Film Series: Tracking

03


For more than 20 years, DNEG has been exploring VFX technology and art, always staying at the forefront of the industry, especially in the field of virtual production that has emerged in recent years, where DNEG offers some of the most advanced technology and services. Recently, DNEG collaborated for the first time with motion capture industry veterans Dimension and production team Sky Studios to launch a series of short films focused on virtual production, showcasing state-of-the-art virtual production technology, workflows and creative concepts. The series is still being updated; today, Xiao Di first brings you the behind-the-scenes technology sharing on "tracking", looking at how motion capture technology is used to track cameras, actors and props in virtual production. Without further ado, let's take a look.

Virtual production shoot scene

Case Showcase

04

Paul Franklin, who directed the project, is also a creative director at DNEG. He said: "We put motion capture markers on the flashlight prop held by one of the actors, then used the captured flashlight data to drive a virtual light on the LED screen: when the actor points the flashlight at the screen, a circle of light appears at that spot. This gives us a degree of spontaneous interaction with the virtual environment, which creates a strong sense of presence in the LED scene. We also had another performer put on a motion capture suit and used the captured performance data to drive the real-time movement of a virtual character on the screen. The flashlight-wielding live actor can then interact with the virtual character on the screen in real time, creating a dramatic scene in which a real person interacts closely with a virtual character."
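The article does not reveal DNEG's actual pipeline, but the flashlight effect Franklin describes boils down to a ray-plane intersection: take the tracked prop's position and aim direction, find where that ray hits the LED wall, and place the light ring there. A minimal sketch, assuming a flat wall at a known depth in tracking space (the function name and wall geometry are illustrative, not from the project):

```python
def flashlight_hit_point(origin, direction, wall_z=5.0):
    """Where the tracked flashlight's beam lands on a flat LED wall.

    origin:    (x, y, z) of the prop in tracking space, in metres
    direction: (dx, dy, dz) aim vector (need not be normalised)
    wall_z:    depth of the LED wall plane in tracking space
    Returns the (x, y, z) hit point, or None if the beam misses.
    """
    ox, oy, oz = origin
    dx, dy, dz = direction
    if abs(dz) < 1e-9:               # beam parallel to the wall
        return None
    t = (wall_z - oz) / dz
    if t < 0:                        # pointing away from the wall
        return None
    return (ox + t * dx, oy + t * dy, oz + t * dz)
```

Each capture frame, the render engine would re-evaluate this with the latest marker data and move the virtual light ring to the returned point.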

Motion capture tracking in front of the LED screen

Precautions

05


Compared with a traditional shooting environment, what do you need to watch out for when doing motion capture and tracking in front of an LED screen?

Tim Doubleday, Virtual Production Director at Dimension, offers an answer to this question:

When running full-body motion capture in an LED shooting environment, the first thing to consider is the latency between the motion capture actor's live performance and the data-driven virtual character that appears on the LED screen. Data processing, image rendering and transmission to the LED screen all introduce inherent delays; as long as the total delay stays within an acceptable range, it will not be noticeable on camera. Consider, for example, a real actor high-fiving a CG character in front of an LED screen: if the delay is too long, the CG character cannot react in time and the shot fails. The same is true for prop tracking. In the flashlight example, we must ensure minimal delay between the actor moving the flashlight in physical space and the CG flashlight moving and casting light on the LED screen. That is why we chose the Vicon motion capture system to track cameras, props and motion capture actors: Vicon's advantage over other products is its very low latency, which keeps the virtual character on the LED screen almost perfectly in sync with the motion capture actor.
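Doubleday's point about latency can be made concrete with a back-of-the-envelope budget: sum each pipeline stage's delay and compare it with one camera-frame interval. The stage timings below are illustrative guesses, not measured values from the production:

```python
# Rough latency-budget check for an LED-wall motion-capture pipeline.
# Stage numbers are illustrative, not figures from the article.
STAGES_MS = {
    "optical capture + solve": 4.0,
    "character retarget": 3.0,
    "engine render": 8.3,          # about one frame at 120 fps render
    "LED processor + panels": 8.0,
}

def total_latency_ms(stages=STAGES_MS):
    """End-to-end delay from live motion to pixels on the wall."""
    return sum(stages.values())

def latency_in_frames(camera_fps=24.0, stages=STAGES_MS):
    """How many camera frames behind the live actor the on-screen
    CG character appears; under ~1 frame is usually imperceptible."""
    frame_ms = 1000.0 / camera_fps
    return total_latency_ms(stages) / frame_ms
```

With these example numbers the total is about 23 ms, or roughly half a frame at 24 fps, which would pass the "not noticeable on camera" test described above.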

Another important consideration is to ensure that the tracking system you use does not emit visible light or otherwise interfere with what the camera captures. In the flashlight example we used passive markers that reflect infrared light, but on other projects we have used actively illuminated markers that are invisible to the live-action camera and work correctly without affecting the filmed image.

It is also important to decide where the motion capture area is located. In this project's virtual shoot, the motion capture actor, the LED screen and the live actors were all in the same space. This means they can play off each other, and the actors can see themselves on the screen, but it may also distract them, so consider using a completely separate motion capture area so the actors can focus on their performances without being affected by the LED screen.

Finally, the difference in line of sight and perspective between real and virtual characters also needs attention.


Motion capture real-time driven CG characters

Usage Scenarios

06

In what situations can you take advantage of real-time-driven CG characters in virtual production?

Tim Doubleday argues:

The main benefit of a live motion capture character is that it keeps the scene dynamic: the CG character can be manipulated in real time, and the live actors can react to it as they perform. Of course, it is still difficult to deliver final-quality CG characters in real time, but by recording the camera motion and the CG character's motion at the same time, traditional VFX processing can be done in post to deliver the higher-quality renders required for film and television.
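The article says camera motion and CG character motion are recorded simultaneously so the shot can be re-rendered at final quality in post. One small piece of that workflow is aligning the two recorded streams by timestamp. A minimal nearest-sample sketch (not DNEG's actual tooling), assuming both streams carry synchronised timecode in seconds:

```python
def nearest_samples(camera_times, mocap_times):
    """For each camera frame time, pick the index of the closest
    recorded mocap sample. Both lists must be sorted ascending.
    A 120 fps mocap stream paired with a 24 fps camera yields
    one mocap index per camera frame, ready for post rendering."""
    out, j = [], 0
    for t in camera_times:
        # advance while the next mocap sample is at least as close
        while j + 1 < len(mocap_times) and \
                abs(mocap_times[j + 1] - t) <= abs(mocap_times[j] - t):
            j += 1
        out.append(j)
    return out
```

For a 24 fps camera against a 120 fps capture stream this picks every fifth mocap sample; a real pipeline would interpolate between samples rather than snap to the nearest one.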

Virtual production can also help actors interact with the CG environment. For example, an actor waves a hand and the virtual environment on the LED screen reacts with moving virtual lights or falling leaves. All of this can be achieved by tracking the actors, bringing their performances into harmony with the content seen on the LED screen and giving them an immersive experience.
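A gesture trigger like the hand-wave described above can be as simple as thresholding the tracked hand marker's speed between capture frames. A rough sketch with made-up numbers (a real production would likely use a proper gesture classifier):

```python
def detect_wave(hand_positions, dt=1 / 120, speed_threshold=1.5):
    """Flag a 'wave' event whenever the tracked hand marker moves
    faster than speed_threshold (m/s) between consecutive capture
    frames sampled dt seconds apart. Returns the frame indices at
    which the environment (lights, leaves) should react."""
    events = []
    for i in range(1, len(hand_positions)):
        (x0, y0, z0) = hand_positions[i - 1]
        (x1, y1, z1) = hand_positions[i]
        dist = ((x1 - x0) ** 2 + (y1 - y0) ** 2 + (z1 - z0) ** 2) ** 0.5
        if dist / dt > speed_threshold:
            events.append(i)
    return events
```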

LED screens and augmented reality experiences offer the possibility of combining motion capture with live performances, and Vicon's motion capture technology opens up a world of endless possibilities for virtual production to create stunning interactive and visual experiences.

Epilogue

07

Dickson Digital focuses on virtual production, providing a complete range of virtual production, motion capture and animation solutions and technical services to inspire endless possibilities for your artistic creation!

Xiao Di will continue to follow DNEG's virtual production short film series and share updates with you. Follow Dickson to learn more about virtual production news and services.
