Shifting the world with data offset (tracked data)

Good day. Could you please advise on the best way to implement this? Thank you very much in advance!

I would like to implement a fill light on the left side of the actor using an Unreal scene displayed on a monitor. (It's not an LED wall, and I don't plan to film it.)

A separate machine will be used for this.

The image should shift dynamically based on the tracking data while always maintaining the offset: the monitor should always display the left side of the Unreal scene relative to the actor, shifting along with the tracking data.

It would also be great to be able to configure the monitor's base position relative to the actor, such as its distance and rotation angle!

I tried to do this myself, but I don't fully understand the best way to implement it, and I'm not entirely sure which modules to use for the transformation after receiving the tracking data.

With best regards, Maria.

   Maria Moon

Comments

Eifert@Aximmetry

Hi Maria,

If I understand your goal correctly, your setup looks like this:

Studio setup:
There is a green screen with an actor (talent) in front of it, and a monitor placed close by to provide light on the actor.

Aximmetry setup:
One computer is used for the green-screen production.
A second computer renders the image shown on the monitor so it can illuminate the actor realistically.

If that is the case, then the monitor's content should not shift based on the tracking data. Instead, it should remain fixed relative to the actor, or to the actor's virtual Billboard, depending on whether the Billboard is tracked directly (talent tracking) or via optical tracking, or whether Auto Position is turned off.

My impression is that most of the difficulty comes from trying to use camera tracking data in a situation where it is not needed.

If you are familiar with LED wall production, this is very similar to how Fixed Fill Position works in Aximmetry, which is also the recommended method for LED wall workflows: https://aximmetry.com/learn/virtual-production-workflow/led-wall-production/setting-up-the-led-walls/fill-adjustments/#use-fixed-position 


What you need instead is a virtual camera for the monitor fill light that stays in a fixed position or moves with the actor's virtual Billboard. In a multi-machine setup, you can achieve the fixed position by assigning one of the Cameras in the camera compound to the machine that will render the monitor: in the SELECT CAMERA panel, set the Cam # Engine parameter for that Camera. After that, go to the Camera's ORIGIN # panel and use Delta Cam Transf to place it somewhere near the Billboard or the actor.
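To make the placement concrete: the fixed offset you enter into Delta Cam Transf is just a transform that puts the camera at a configurable distance and angle to the actor's side, looking back at the actor. Here is a minimal Python sketch of that geometry; the function name, the 2D simplification, and the axis convention are my own illustration, not Aximmetry's API:

```python
import math

def fill_camera_offset(actor_pos, distance, angle_deg):
    """Place a camera `distance` units from the actor, rotated
    `angle_deg` around the actor's vertical axis.
    0 degrees = directly to the actor's left (+X by this sketch's convention).
    Returns (camera_position, yaw_toward_actor_deg).
    """
    ax, ay, az = actor_pos
    a = math.radians(angle_deg)
    # Offset in the horizontal X/Z plane; Y is up in this sketch.
    cam = (ax + distance * math.cos(a), ay, az + distance * math.sin(a))
    # Yaw that turns the camera back toward the actor.
    yaw = math.degrees(math.atan2(ax - cam[0], az - cam[2]))
    return cam, yaw

# Example: camera 2 m to the actor's left, no extra rotation.
pos, yaw = fill_camera_offset((0.0, 1.7, 0.0), distance=2.0, angle_deg=0.0)
```

The resulting position and yaw would correspond to the values you dial into Delta Cam Transf; because they are constant, nothing needs to follow the camera tracking data.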

In the INPUT # panel, you will probably also want to enable Override Zoom and adjust the Zoom value so that more of the scene is visible on the monitor. You should also turn off that INPUT with the power button on its panel, since it will not be receiving video or tracking data; the Override Zoom parameter remains effective even after the input is turned off.
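As a rough intuition for choosing the Zoom value: zoom and field of view are inversely related, so a zoom below 1 widens the view. The sketch below uses the standard lens relation tan(fov/2) = tan(base_fov/2) / zoom; this is a generic formula for illustration, not Aximmetry's exact internal mapping:

```python
import math

def fov_for_zoom(base_fov_deg, zoom):
    """Horizontal field of view after applying a zoom factor,
    using the standard relation tan(fov/2) = tan(base/2) / zoom.
    zoom < 1 widens the view, zoom > 1 narrows it."""
    half = math.radians(base_fov_deg) / 2.0
    return 2.0 * math.degrees(math.atan(math.tan(half) / zoom))

# Example: a zoom of 0.5 on a 60-degree base FOV
# gives a much wider view (roughly 98 degrees).
wide = fov_for_zoom(60.0, 0.5)
```

In practice you would simply lower the Zoom value until enough of the scene appears on the monitor; the formula just explains why lowering it widens the view.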

Also, using two computers is not strictly required. In Unreal, you could create a setup with a Scene Capture 2D actor and the Set Aximmetry Video node inside a custom or level Blueprint, allowing the same computer to render both the green-screen production and the monitor output. However, this uses more machine resources, especially at higher rendering resolutions for the monitor. The multi-machine workflow is not only lighter on resources, it is simpler to set up as well.
More information about Set Aximmetry Video is available here: https://aximmetry.com/learn/virtual-production-workflow/obtaining-graphics-and-virtual-assets/creating-content-in-ax-scene-editor/additional-control-with-blueprints/#passing-data-from-unreal-to-aximmetry 

Warmest regards,