Questions about RTX 5000 Ada graphics cards, RTX 4090 graphics cards, and Genlock

 

We used one server as the main compositing server and another server for rendering LED wall tests. We tested the Nvidia RTX 5000 Ada card (with Nvidia Quadro Sync) and the RTX 4090 card for rendering the LED wall. Our main server has an Nvidia RTX 5000 Ada card with Nvidia Quadro Sync and a DeckLink 8K Pro, which we linked to external genlock, the tracking system, and the cameras. Even though both the rendering server and the compositing server have RTX 5000 Ada cards and are locked to the genlock signal, we found that the animations on the LED wall and the external Digital Extension animations could not be seamlessly linked together: the internal and external animations appear at the wrong times. This matches the results we obtained when rendering with a 4090 card without frame synchronization.

Therefore, we suspect that frame synchronization is not working with UE5 or Aximmetry, but if we unplug the frame-sync connection, Aximmetry displays an error. So I would like to ask: what additional settings are needed to synchronize animations between different machines in UE5? Is it impossible to synchronize animations between different machines? If not, what is the difference between using consumer-grade and professional graphics cards? Or did we miss something about genlock in Aximmetry?

Ruizhong

 
Eifert@Aximmetry

Hi,

You can synchronize animation between different machines and also between the Digital Extension and the LED walls. However, the latter won't happen automatically.

You are likely experiencing the animation being out of sync with the Digital Extension because you don't have a Picture Delay(s) set:

If you don't set a Picture Delay and leave it at 0, the Digital Extension will fit nicely on your LED wall's image. However, the animation will differ between the LED wall's image and the Digital Extension's image (meaning the animations/movements will occur sooner in the Digital Extension). Additionally, switching between cameras will be out of sync.

If you set the correct Picture Delay, then animation and switching between cameras will be in sync. However, the Digital Extension fitting around the LED Wall will be less accurate. If you render the Digital Extension separately, then the animation and switching between cameras will be in sync, and the Digital Extension fitting will be accurate. However, you will need at least two machines and must set the delay between animations or events in the Flow Editor.
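As a rough illustration of what the Picture Delay represents (my own sketch, not Aximmetry's actual implementation): it is the total latency of the LED chain expressed in whole frames, so a measured glass-to-glass latency can be converted like this (the latency and frame-rate numbers below are made-up examples; measure your own chain):

```python
# Convert a measured LED-chain latency into a Picture Delay in frames.
# The numbers are illustrative; measure your own chain.
import math

def picture_delay_frames(latency_ms: float, fps: float) -> int:
    """Round the measured latency up to whole frames."""
    frame_ms = 1000.0 / fps
    return math.ceil(latency_ms / frame_ms)

# e.g. ~85 ms measured from render output to camera capture, at 50 fps:
print(picture_delay_frames(85.0, 50.0))  # 85 ms / 20 ms per frame -> 5 frames
```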

You can read how to use two machines like that in the Switching Between Cameras paragraph in the documentation: https://aximmetry.com/learn/virtual-production-workflow/preparation-of-the-production-environment-phase-i/led-wall-production/using-led-walls-for-virtual-production/#switching-between-cameras


Note that Nvidia Quadro Sync is not yet fully supported. Your ADA cards will still be genlocked, but each machine might have a slightly different delay. This can become a visible issue when using more than one LED wall. However, in the Digital Extension, you probably won't be able to notice a one-frame delay.
You can fix such delays by adding a one-frame delay using a Delayer module to the LED wall output that is not lagging behind:


Note that in a future release of Aximmetry, Nvidia Quadro Sync will be fully supported. Additionally, there will likely be an automatic Picture Delay calculation option, and the Animation Delay process will be simplified.

Warmest regards,

 
Ruizhong

The delay setting here only seems to work for switching between multiple cameras; setting it does not alleviate the asynchrony between the Digital Extension and the animation within the LED wall.

Is there any other way to do it?

 
Eifert@Aximmetry

Hi,

It could be that the asynchrony you are experiencing is actually the imperfection of the Digital Extension fitting. Adjusting only the Picture Delay will always rectify the animation delay; however, doing so affects the accuracy of the Digital Extension fitting around the LED wall. That inaccuracy becomes noticeable only when you move the camera, and it can give the impression that the animation is out of sync. It can be rectified with the multi-machine setup where the Digital Extension is rendered only by the controller machine.

Note that some level of tracking inaccuracy is inherent to LED wall production, as it is impossible to predict exactly where the camera will be positioned in the future. You can mitigate it by reducing the delay between the LED wall, your computer, and your tracking system. Additionally, ensure that Aximmetry's In-to-Out Latency is set as low as possible.

To debug the animation delay and see how the Picture Delay fixes it, you can render each frame's index number and compare the values displayed in the Digital Extension part and in the video input. For example:
In the above image's red rectangle, a video image is rendered with the frame index number displayed multiple times. This image is then connected to the LED Wall compound. In your studio, position the camera so that its frustum simultaneously captures both the LED Wall and the Digital Extension.
In the image above, the output of the LED Wall compound goes through Delayer modules, which display frames at one-second intervals instead of the actual frame rate. This allows easy comparison of the frame index numbers on the LED Wall with those in the Digital Extension in the final image.
Increase the Picture Delay by the difference observed between these numbers. Once adjusted correctly, all the numbers should match, ensuring there is no delay in the animation.
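The comparison step above boils down to simple arithmetic (a sketch of my own; the function name is illustrative, not an Aximmetry API): subtract the frame index seen on the LED wall from the one seen in the Digital Extension and add the difference to the current Picture Delay:

```python
# Derive the corrected Picture Delay from the frame-index numbers
# observed in the final composited image (illustrative sketch).
def corrected_picture_delay(current_delay: int,
                            index_on_led_wall: int,
                            index_in_digital_extension: int) -> int:
    # The Digital Extension runs ahead of the LED wall image, so its
    # frame index is higher; the gap is the missing delay in frames.
    gap = index_in_digital_extension - index_on_led_wall
    return current_delay + gap

# e.g. Picture Delay is 2, the wall shows frame 1540, the extension 1543:
print(corrected_picture_delay(2, 1540, 1543))  # -> 5
```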

Warmest regards,

 
Ruizhong

Hello!

We tried to reduce the In-to-Out Latency as much as possible, and here is what we got:

Next we tried to adjust the Picture Delay. But for the original LED wall compound, there seems to be no corresponding Picture Delay option.

We tried to go inside the compound and adjust the delay of the different images, but the result was not ideal. I think we may not have adjusted the right place. Where should we adjust the delay?

By the way, can Aximmetry rebroadcast an existing tracking signal? The Sony FR7 we are using does not seem to support sending tracking data to multiple devices at the same time.

 
Eifert@Aximmetry

Hi,

You should replace the LEDWallCam_3-Cam_4-Wall compound in your project with the original [Common_Studio]:Camera\LEDWallCam\LEDWallCam_3-Cam_4-Wall.xcomp. Because you previously unlinked it, it is no longer updated when you update Aximmetry, which can lead to various issues.
You can learn more about Linked Compounds here: https://aximmetry.com/learn/virtual-production-workflow/scripting-in-aximmetry/flow-editor/compound/#linked-compound 


I am sorry, I think I may have misled you earlier: I did not fully take into account that you have a multi-machine setup. This is why you see different Frame IDs; they come from different computers, and Frame IDs are not synced.
Instead of the System Params module's Frame ID, you could build the same setup with a Timer module and see how it stays in sync if you start it while both computers are part of the multi-machine setup, and how the Picture Delay changes it.

Do you have a multi-machine setup where the controller machine only renders the Digital Extension?
https://aximmetry.com/learn/virtual-production-workflow/led-wall-production/using-led-walls-for-virtual-production/#multi-machine

If so, you will need to delay the start of Unreal animations, as detailed here under the Animation Delay section:
https://aximmetry.com/learn/virtual-production-workflow/led-wall-production/using-led-walls-for-virtual-production/#animation-delay

You can start Unreal animations from Aximmetry by using the Get Aximmetry Trigger blueprint node. More on it here:
https://aximmetry.com/learn/virtual-production-workflow/obtaining-graphics-and-virtual-assets/creating-content-for-aximmetry-de/additional-control-with-blueprints/#get-aximmetry-trigger

If you do not have a multi-machine setup where the controller machine only renders the Digital Extension, you will still need to start the animations on both computers at the same time. This is necessary because Unreal Engine itself does not have synchronization between machines, and there is no way to ensure that all engines start at exactly the same time.

Note: For the best Digital Extension fitting, it is recommended to use a multi-machine setup with the controller machine rendering only the Digital Extension.

Warmest regards,

 
Ruizhong

Hi,

We are indeed deploying in a multi-machine environment: a compositing server for Digital Extension rendering, and two rendering servers that render the floor LED and the wall LED respectively.

Maybe our multi-machine deployment method is wrong? But the animation is perfectly continuous across the rendering servers; it only goes out of sync where it meets the external extension.

Does this animation-delay logic only apply to animations or sequences that can be controlled by Blueprints? Our scene currently uses plants whose fluttering is controlled by materials, so I am not sure whether it will work.

Also, due to licensing reasons, the version we use in the current environment is Aximmetry 2024.1. The latest version of Aximmetry is deployed at another site, and we will go back and test with it in a while.

We also noticed that the new version seems to have an option called Strict Frame Sync in the multi-machine settings. We are very curious what effect it has and under what circumstances it can be turned on. (The picture shows where we found the option; it is unrelated to the multi-machine environment we actually set up.)


 
Eifert@Aximmetry

Hi,

There is nothing wrong with your setup.

If I understand correctly, you have three machines: two serve as renderers for the LED Walls, and the third machine does not render any LED Walls but receives all the camera inputs. This third machine is the so-called compositing server for Digital Extension rendering.

This is the recommended way to set up LED Wall production. However, to ensure that animations remain in sync, you will need to perform some additional steps. As detailed in the Animation Delay section of the documentation:
https://aximmetry.com/learn/virtual-production-workflow/led-wall-production/using-led-walls-for-virtual-production/#animation-delay
Unfortunately, the current documentation does not explain this very clearly. We are working on completely new LED Wall documentation, which we hope will provide much clearer guidance.

In the meantime, you may find these two images I posted helpful in understanding why animation delay is needed with this kind of multi-machine setup: https://my.aximmetry.com/post/3651-how-can-i-set-the-digital-extension-dela
In summary, when you have a dedicated machine rendering only the Digital Extension, the entire scene is re-rendered on that machine using new tracking data. For this reason, you need to apply the Animation Delay as described in the documentation.
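In other words, the Animation Delay on the Digital Extension machine mirrors the Picture Delay applied to the wall image. As a back-of-the-envelope sketch (my own illustration, not Aximmetry internals): if the LED wall image reaching the camera is N frames old, the Digital Extension machine must start the same animation N frames later so both show the same animation time.

```python
# Illustrative sketch: how many seconds later the Digital Extension
# machine should start an animation, given the Picture Delay in frames.
def extension_start_offset_seconds(picture_delay_frames: int, fps: float) -> float:
    return picture_delay_frames / fps

# e.g. a Picture Delay of 5 frames at 50 fps:
print(extension_start_offset_seconds(5, 50.0))  # -> 0.1 seconds
```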

If you do not set up a dedicated machine for the Digital Extension, the Picture Delay will delay the Digital Extension too, and animations should match perfectly if you are only using a single computer. (But because the Digital Extension is not rendered by a separate machine, it won't fit as well as it would in your multi-machine setup.)
However, since you are using two computers to render the LED walls, the Unreal animations will probably not be in sync even between them, unless you start the animations from Aximmetry after Unreal is running. In that case, they will be in sync, because actions performed from Aximmetry's UI are synced through your multi-machine setup. (To do this, you will need to expose the start of the animation playback on the Unreal module in Aximmetry's Flow Editor.)

However, the fluttering of plants that is controlled by materials is not something I would classify as animation. In Unreal, "animation" typically refers to systems that can play back actions defined using keyframes, similar to how Aximmetry’s Sequencer works.
I am not even sure it is possible to trigger such material effects from a fixed point in time. Perhaps you could apply the material to the object from a Blueprint and expose that action on the Unreal module in Aximmetry, so that the materials are created at the same time on all machines.
Additionally, if random numbers are used, the fluttering will likely be randomized differently on each machine. To fix this, you can usually set the randomizing node in the material to use a fixed seed instead of a random one. If you can share the material, I can take a closer look at how to make it syncable.
Particle systems usually have the same issue.
(Note, even Unreal's multiplayer solution wouldn't sync such things, as in a multiplayer game, the players don't necessarily need to see the same exact effects.)
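The fixed-seed point can be demonstrated outside Unreal with plain Python: two independent "machines" seeded identically produce the same pseudo-random flutter values, while an unseeded generator diverges:

```python
import random

def flutter_samples(seed, n=5):
    rng = random.Random(seed)            # per-machine generator
    return [round(rng.uniform(-1.0, 1.0), 4) for _ in range(n)]

machine_a = flutter_samples(seed=42)     # renderer 1
machine_b = flutter_samples(seed=42)     # renderer 2, same fixed seed
print(machine_a == machine_b)            # True: identical motion

machine_c = flutter_samples(seed=None)   # seeded from system entropy
print(machine_a == machine_c)            # almost certainly False
```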


The "Strict Frame Sync" option was a test for a very specific case; it is no longer available in Aximmetry settings and was never available to anyone except a couple of testers, which is why it is grayed out for you. Nonetheless, we are continuing to develop more and easier synchronization options.

Warmest regards,

 
Ruizhong

Hi,

Thank you very much for your help! We are preparing to verify whether it is feasible.

The scene we are currently using is the free Rural Australia scene in UE5. We hope to solve this problem of the vegetation material animation being out of sync so that we can avoid it when we build scenes ourselves in the future.

As you said, the animation was not out of sync when testing on a single machine; the problem only occurred in the multi-machine environment.

But I have a question: we used three machines, yet the animations rendered by the two renderers on the LED wall were synchronized. If random numbers are used, shouldn't they animate differently? Why is only the animation of the external extension, rendered by the compositor, out of sync with them?

In addition, I have another question. Can we upload custom models to specify the size and shape of the LED wall? Sometimes we encounter screens with curved edges, and sometimes they are used as boxes. If we want to do xR on these, handling the curvature of the corners with the wall that comes with Aximmetry seems very tricky.


(This is a sample LED image. You can see that the walls curve slightly at the turns, but the individual panels are still straight.)

And will the lens calibration software be updated in the future? Our FR7 uses a 28-135mm lens, and we noticed that it does not calibrate well beyond a certain range. This also causes the virtual wall to appear offset from the actual wall when the camera moves beyond a certain range, even though we calibrated it as closely as possible at the initial position. (This problem often occurs with other tracking systems and cameras as well, and we are not sure if we did something wrong. We strictly followed the documentation to align the coordinates when calibrating the LED wall.)

Also, the slight shake of the FR7 when it stops after a fast movement does not seem to be passed to Aximmetry through tracking. We are not sure whether this is a problem with the FR7 itself or caused by something else; we placed it on a tripod on level ground as best we could for testing. (This causes the camera image and the out-of-frustum extension to be slightly offset by the shake when the camera stops.)
