Green Screen-Based Virtual Production: Multi-Machine Setup Questions

Hello. I’m a student studying Aximmetry at school.

I'm currently working on a green screen-based virtual production setup using Aximmetry and Unreal Engine, and I have a few questions regarding a multi-machine configuration. I would appreciate any guidance or corrections based on your experience.

1. System Diagram Feedback

I've created a system diagram to outline how I'm planning to set up the machines, capture cards, and signal flow. If you notice any mistakes or areas for improvement, I’d be very grateful for your feedback.


2. Best Role Assignment for Each Machine

I have access to two computers:

Machine A: AMD Ryzen 9 5950X + NVIDIA RTX A6000

Machine B: Intel i9-12900K + NVIDIA RTX 3090


Based on performance and compatibility, which machine should run Unreal Engine, and which should run Aximmetry for optimal performance in a green screen virtual production workflow?


3. Capture Cards on Both Machines?

In a multi-machine setup (e.g., one machine for Aximmetry control, another for UE rendering), do both computers need capture cards, or can only one handle the SDI/NDI input/output?


4. Is Genlock Required in Green Screen Setups?

I understand that Genlock is essential in LED volume environments, but is it still necessary for chroma key setups, especially when using SDI cameras and multiple machines?


Thanks in advance for any insights or advice. Looking forward to hearing your thoughts!


   gibeom

Comments

JohanF

Why would you run Unreal on a separate workstation? Unreal scenes open directly in Aximmetry, and that is a big point of the whole system. Your proposed setup is needlessly complicated and will miss a lot of the functionality you get when rendering the UE5 scene inside Aximmetry.

Regarding genlock, you will need to genlock the camera and the Vive Mars tracking system. With a high-frame-rate tracking system like ReTracker Bliss you won’t need it, since the tracking data is locked to the camera video input in Aximmetry.
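To picture why a high-rate tracker can be aligned in software, here is a rough Python sketch of the idea only, not Aximmetry's actual implementation. The 50 fps video rate and 240 Hz tracking rate are assumptions for illustration; it just pairs each video frame with the nearest tracking sample by timestamp.

```python
# Illustrative sketch only: pairing video frames with the nearest tracking
# sample by timestamp. The 50 fps / 240 Hz rates are assumed for the example;
# this is not Aximmetry's internal logic.
import bisect

VIDEO_FPS = 50       # assumed camera frame rate
TRACKING_HZ = 240    # assumed high-rate tracker sample rate

video_timestamps = [i / VIDEO_FPS for i in range(50)]        # 1 second of frames
tracking_timestamps = [i / TRACKING_HZ for i in range(240)]  # 1 second of samples

def nearest_sample(ts, samples):
    """Return the tracking timestamp closest to a video frame timestamp."""
    i = bisect.bisect_left(samples, ts)
    candidates = samples[max(i - 1, 0):i + 1]
    return min(candidates, key=lambda s: abs(s - ts))

worst_error_ms = max(
    abs(nearest_sample(ts, tracking_timestamps) - ts) * 1000
    for ts in video_timestamps
)
print(f"Worst-case pairing error: {worst_error_ms:.2f} ms "
      f"(about {worst_error_ms / (1000 / VIDEO_FPS):.1%} of one video frame)")
```

With these assumed rates the worst-case mismatch is only a couple of milliseconds, a small fraction of a 20 ms video frame, which is the intuition behind a fast tracker not needing hardware genlock itself; the camera video input still sets the timing reference.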

gibeom

Thank you for the helpful advice.

The reason I was considering running Unreal Engine on a separate workstation was to spread the rendering load across machines. I wanted to explore whether a distributed setup could balance performance better.

As for Genlock, I am currently applying it to the camera, Vive Mars tracking system, and the capture card.

Thanks again for your insight!

Eifert@Aximmetry

Hi,

In an Aximmetry multi-machine setup using a tracked green-screen camera compound, each remote computer must have only one camera input and only one rendered camera output. The studio operator should switch between the outputs from each computer using either a physical video switcher or another computer running Aximmetry's Video Switcher compound, found at: [Common_Studio]:Compounds\Switcher\Video_Switcher.xcomp.
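To make that topology concrete, here is a conceptual Python sketch of the signal flow only; the machine names and camera labels are invented for illustration and nothing here corresponds to actual Aximmetry configuration.

```python
# Conceptual sketch of the signal flow only; machine names are invented for
# illustration and do not correspond to actual Aximmetry configuration.
from dataclasses import dataclass

@dataclass
class RemoteMachine:
    name: str
    camera_input: str      # exactly one camera (SDI) input per remote machine
    rendered_output: str   # exactly one rendered camera output per remote machine

remotes = [
    RemoteMachine("Render-A", camera_input="CAM 1 (SDI)", rendered_output="PGM A"),
    RemoteMachine("Render-B", camera_input="CAM 2 (SDI)", rendered_output="PGM B"),
]

def switch_program(selected: int) -> str:
    """The operator (physical switcher or Video Switcher compound) picks one
    rendered output as the program feed."""
    return remotes[selected].rendered_output

print("Program out:", switch_program(0))  # e.g. cut to Render-A's output
```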

If you are working with a limited budget, the control machine could also run the Video Switcher compound within the same compound as the tracked camera, allowing you to save on the cost of a third machine or a physical switcher.

In theory, you could modify the structure of the camera compounds so that one computer handles Unreal rendering, while the other manages keying and some other processes. However, this setup would not offer significant performance savings, as Unreal rendering is far more demanding than other tasks performed in the camera compound in Aximmetry.
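As a back-of-the-envelope illustration of why offloading keying buys little, here is a small Python sketch; every millisecond figure is an assumption for illustration, not a measured Aximmetry or Unreal number.

```python
# Back-of-the-envelope frame-budget arithmetic. All millisecond costs are
# assumptions for illustration, not measured Aximmetry/Unreal numbers.
FRAME_BUDGET_MS = 1000 / 60      # ~16.7 ms per frame at 60 fps
UNREAL_RENDER_MS = 13.0          # assumed cost of rendering the UE scene
KEYING_AND_COMPOSITE_MS = 2.0    # assumed cost of keying + compositing

single_machine = UNREAL_RENDER_MS + KEYING_AND_COMPOSITE_MS
split_render_machine = UNREAL_RENDER_MS   # the other machine only does keying

print(f"Single machine:         {single_machine:.1f} ms of {FRAME_BUDGET_MS:.1f} ms budget")
print(f"Render machine (split): {split_render_machine:.1f} ms of {FRAME_BUDGET_MS:.1f} ms budget")
print(f"Headroom gained by splitting: {single_machine - split_render_machine:.1f} ms per frame")
```

Under these assumed numbers the render machine gains only a couple of milliseconds of headroom per frame, while the split adds transport delay and complexity, which matches the point above.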

If you are on a tight budget, you can save on a second capture card by using one computer to receive both camera inputs with a single capture card. You can then send the video to the other computer over NDI and return the final render the same way. However, note that NDI may cause some loss in image quality, add delay, and put a higher performance load on the computer. For best results, each computer should have its own capture card.
More on transmitting video between machines over NDI in a multi-machine setup here: https://aximmetry.com/learn/virtual-production-workflow/multi-machine-environment/advanced-information-and-features/transmitting-videos-from-render-to-control-machine/
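For a rough feel of the trade-off, here is a small Python sketch comparing an uncompressed 1080p60 feed with an NDI stream; the NDI bitrate and the extra frames of latency are assumptions for illustration, not measured values.

```python
# Rough, illustrative numbers only: comparing uncompressed 1080p60 video with a
# full-bandwidth NDI stream. The NDI bitrate and latency figures are assumptions.
FPS = 60
UNCOMPRESSED_GBPS = 1920 * 1080 * 20 * FPS / 1e9    # ~2.5 Gbit/s for 10-bit 4:2:2
NDI_MBPS_ASSUMED = 150                              # assumed NDI bitrate for 1080p60
EXTRA_LATENCY_FRAMES_ASSUMED = 2                    # assumed encode/network/decode delay

print(f"Uncompressed 1080p60 (10-bit 4:2:2): ~{UNCOMPRESSED_GBPS:.1f} Gbit/s over SDI")
print(f"NDI 1080p60 (assumed):               ~{NDI_MBPS_ASSUMED} Mbit/s over the network")
print(f"Assumed extra latency from NDI:      ~{EXTRA_LATENCY_FRAMES_ASSUMED} frames "
      f"({EXTRA_LATENCY_FRAMES_ASSUMED * 1000 / FPS:.0f} ms at {FPS} fps)")
```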


If you have only one studio camera but you really want to use two computers, you could use the second computer to render AR elements over the final image from the first computer. Note that this setup is more complex than standard multi-machine workflows that use camera compounds.
You can find information on how to send camera tracking data and video from one computer to another in such a case here:
https://aximmetry.com/learn/virtual-production-workflow/led-wall-production/combine-different-productions-in-separate-machines/ 
Note that you could also use Aximmetry to render the AR elements; more on that here: https://aximmetry.com/learn/virtual-production-workflow/obtaining-graphics-and-virtual-assets/creating-content-for-aximmetry-de/advanced-information-and-features/aximmetry-and-unreal-combined-render/

Warmest regards,