I am testing Aximmetry in a virtual production environment with an LED wall.
In a multi-machine setup, I have applied a digital extension using the VIVE Mars tracking system.
Here are the result videos for the assets:
- Cave asset: https://youtu.be/Y0sXkTko02E
- Subway asset: https://youtu.be/y5pH1UBxCUI
At first glance, the results may seem fine, but upon closer inspection:
- The assets are misaligned:
This is evident in the subway asset, where the metal pillars are slightly out of place and the no-smoking sign is also misaligned (https://youtu.be/F7rB-a9BJ-s; I highlighted the issues with a red box).
In the cave asset, stones on the floor overlap and appear misaligned (https://youtu.be/CiCphCgv-08, highlighted with a red box).
Even after trying to adjust transforms, the current result is the best we could achieve.
- The digital extension jitters:
This issue is most noticeable in the cave asset. Even when the digital extension around the LED wall is stationary, it visibly shakes (https://youtu.be/so5655Mqohs, highlighted with a red box).
I tested for potential tracking delays, but the problem does not seem to be related to delay. The digital extension does not lag or move too fast but instead jitters up and down.
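Something like the following is one way to quantify the difference between latency and jitter from logged tracker positions while the rig is held completely still. It is only a rough sketch: the CSV file and column names are placeholders, not an actual Aximmetry or VIVE Mars export format.

```python
# Rough sketch: distinguish latency from jitter using per-frame tracker positions
# recorded while the camera rig is held completely still.
# The file name and column names below are placeholders (hypothetical export).
import numpy as np
import pandas as pd

log = pd.read_csv("tracker_log.csv")            # columns: time_s, x_m, y_m, z_m
pos = log[["x_m", "y_m", "z_m"]].to_numpy()

# Frame-to-frame movement magnitude of the tracked position.
step = np.linalg.norm(np.diff(pos, axis=0), axis=1)

print(f"mean step : {step.mean() * 1000:.2f} mm/frame")
print(f"max step  : {step.max() * 1000:.2f} mm/frame")
print(f"std (pos) : {np.round(pos.std(axis=0) * 1000, 2)} mm per axis")

# Pure latency shows up as smooth but delayed motion, so a truly static camera
# should give near-zero per-frame steps; visible per-frame steps on a static
# camera point to positional jitter instead.
```

If the camera is genuinely static, any visible per-frame step comes from tracker noise rather than from render delay, which is what made me rule out a simple delay problem.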
These are my questions:
- Are these issues common?
- Could the problem stem from the VIVE Mars tracking system we are using?
- Is this level of output typically acceptable in similar virtual production setups?
Thank you in advance for your response.
I’d say this is about as good as you can expect the alignment to get with the Vive Mars system. Besides the precision of the tracking, there’s also the accuracy of the lens calibration and the tracker-to-sensor calibration, both of which can be less than perfect with the Aximmetry (or Vive Mars) calibrator.
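To give a sense of scale, here is a rough back-of-envelope sketch of how small residual calibration errors turn into visible misalignment at the wall boundary. All the numbers (resolution, FOV, residual FOV error, tracker offset, distance to the wall) are assumptions picked only for illustration, not measurements from your setup.

```python
# Back-of-envelope: how small calibration errors project into on-screen pixel shifts.
# Every number here is an illustrative assumption, not a measured value.
import math

image_width_px = 1920
horizontal_fov_deg = 60.0      # assumed calibrated horizontal FOV
fov_error_deg = 0.5            # assumed residual lens-calibration error
tracker_offset_mm = 5.0        # assumed tracker-to-sensor translation error
distance_to_wall_m = 4.0       # assumed camera-to-LED-wall distance

# Pinhole focal length in pixels for the assumed FOV.
focal_px = (image_width_px / 2) / math.tan(math.radians(horizontal_fov_deg / 2))

# Content near the frame edge shifts by roughly this much if the real FOV
# differs from the calibrated one by fov_error_deg.
edge_shift_px = abs(
    (image_width_px / 2)
    - focal_px * math.tan(math.radians((horizontal_fov_deg + fov_error_deg) / 2))
)

# A lateral offset of the virtual camera relative to the real one projects to
# roughly focal_px * offset / distance at the wall.
offset_shift_px = focal_px * (tracker_offset_mm / 1000) / distance_to_wall_m

print(f"shift near frame edge from {fov_error_deg} deg FOV error: ~{edge_shift_px:.1f} px")
print(f"shift from {tracker_offset_mm} mm tracker-to-sensor offset: ~{offset_shift_px:.1f} px")
```

Even a few pixels of shift is enough to make the seam between the wall content and the extension readable, which is consistent with the kind of misalignment you are seeing.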