Aximmetry Eye as a tracker for a studio camera?

 

Is there any way of adding a 'proper' camera to an Aximmetry Eye setup? I understand the process of creating the offset and lens distortion parameters, but can sync be achieved between the iPhone and the main camera? 

I love the simplicity and accuracy of Aximmetry Eye, but the iPhone is obviously limited from a lens/video quality perspective.

Thank you! 


 

   megascapes

 
Eifert@Aximmetry

Hi,

The delay between your studio camera's feed and the iPhone's tracking will change over time. Sadly, there is no way to circumvent this hardware limitation. However, if you are capturing short takes of 3-10 minutes, you are unlikely to experience any significant delay drift.
Between takes, you can easily correct the delay value by using the Detect Delay feature: https://aximmetry.com/learn/virtual-production-workflow/preparation-of-the-production-environment-phase-i/green-screen-production/tracked-camera-workflow/inputs-tracked-camera/#delays

In most professional tracking systems, the delay can be fixed to a constant value by genlocking the camera and the tracking system.

Note that an iPhone will not be able to track your zoom and focus.

Warmest regards,

 
megascapes

Thanks for the reply. I will experiment with it. If I just reset the app on the phone (either by stopping the tracking transmission with the red button, or by restarting the app), will that also reset the delay, or would I need to reset/restart the external camera as well? 

I've seen it recommended elsewhere to use 60fps video recording from the iPhone. Will using 60fps also help to minimise the delay? 



 
Eifert@Aximmetry

Hi,

Note that the delay itself is not a problem; it can be easily corrected using the Detect Delay feature. The issue arises because this delay changes over time due to the lack of synchronization between the studio camera and the iPhone tracking.
Restarting any or all of your devices will not reset the delay to the originally measured Tracking Delay value. Instead, it will likely introduce a different delay value, as the devices will lose their rhythm of capturing frames and will begin recording and sending frames from different starting points in time.

I am uncertain whether 60 fps will help minimize the delay change over time. However, it will enable you to specify a more accurate initial delay value, which will probably save you a few minutes before the change in delay becomes noticeable.
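To make the "more accurate initial delay value" point concrete: a measured delay can only be expressed to a resolution of about one frame period, so doubling the capture rate halves that quantization step. A rough illustration (not Aximmetry code, just the arithmetic):

```python
# Rough illustration: the finest step at which an initial tracking
# delay can be dialed in is roughly one frame period, so a higher
# capture rate allows a more accurate starting value.

def frame_period_ms(fps: float) -> float:
    """Duration of a single frame in milliseconds."""
    return 1000.0 / fps

for fps in (30, 60):
    print(f"{fps} fps -> one frame = {frame_period_ms(fps):.1f} ms")
# 30 fps -> one frame = 33.3 ms
# 60 fps -> one frame = 16.7 ms
```

So at 60 fps the initial delay can be set about 16-17 ms more precisely than at 30 fps, which is why it can buy a few extra minutes before drift becomes noticeable.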

Ultimately, you should test this with your hardware since the delay changes over time will depend on your specific devices.

Warmest regards,


 
megascapes

Hi there, 

So, the issue I'm currently having is this: I have the iPhone and the external camera rigged together, I run the advanced camera calibration and generate the camera profile, and I apply the camera profile to the iPhone tracker in the Device Mapper, but I cannot get Detect Tracking Delay to work. With a solid printed red circle in view, the X should be roughly centered on it, but the X is offset (it moves with the camera but never sits near the circle). I suspect there may be an issue with the overall offset between the camera and the iPhone, despite running the calibration process? Do I need to manually enter the delta transform in the Origin tab? During the calibration process the iPhone was sending tracking data. 


Going back into the camera calibration and testing it, I can see that the floor and markers are not in the correct positions; the floor is at head height. Also, when setting the origin using the ArUco tag, should I use the video input from the phone itself, or the video coming from the external camera (assuming the camera calibration file is in use here, and I assume it should already contain the offset information)? 


Thank you!


 
Eifert@Aximmetry

Hi,

In the Detect Delay function, the position of the X is not related to the calibration or the tracking. It is simply image detection: Aximmetry locates the red marker in the video input's picture. Please note that in Aximmetry 2024.2.0 BETA, this feature has been improved for more accurate detection of the red marker (X).
Otherwise, if the X is still not over the red circle, make sure no other red objects are in the camera's view, the red circle is in focus, and the image is not overexposed.
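To see why stray red objects, soft focus, or overexposure break this step, here is a hedged sketch of the kind of red-marker detection involved. This is not Aximmetry's actual implementation; `find_red_centroid` and its thresholds are illustrative only:

```python
# Hypothetical sketch of a "find the red marker" detector, assuming
# an H x W x 3 uint8 RGB frame. Not Aximmetry's real algorithm.
import numpy as np

def find_red_centroid(frame: np.ndarray):
    """Return (x, y) of the average position of strongly red pixels,
    or None if no sufficiently red pixels are found."""
    r = frame[..., 0].astype(np.int16)
    g = frame[..., 1].astype(np.int16)
    b = frame[..., 2].astype(np.int16)
    # A pixel counts as "red" only if red strongly dominates the other
    # channels; an overexposed (washed-out) circle fails this test.
    mask = (r > 150) & (r - g > 60) & (r - b > 60)
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        return None
    # A second red object in view would pull this centroid away from
    # the circle, which is why other red items must stay out of frame.
    return float(xs.mean()), float(ys.mean())
```

Under these assumptions, any detector that averages over "red" pixels is only as good as the isolation, focus, and exposure of the circle itself.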

For iPhone tracking, you need to set the Delta Cam Transf in the Origin panel, as the iPhone will not know the floor's location. In professional tracking systems, the tracking system usually knows where the floor is.
When using an ArUco tag to detect the floor, use the video from the studio camera, since that is the camera you performed the tracking calibration with.

Also, ensure that you leave the Delta Head Transf in the Origin panel at zero, as the calibration file contains this offset.

Warmest regards,
