Aximmetry Eye and iPad for Camera Tracking

 

Hi, thanks for a great app. I have tried using the iPad Pro 11 with the Aximmetry Eye app and the overall experience was really good, despite some minor jitters when moving the camera around. I use the iPad Pro over the iPhone as it doesn't overheat as easily.

I then experimented with mounting the iPad on a camera to use it as a tracker, and unfortunately it drifted quite a bit. I understand that the iPad's LiDAR tracking is not meant for professional use, but would the additional Aximmetry Camera Calibrator help in this area? I'm on the free Studio DE version currently.

For static shots, this iPad/camera combo is great though: no jitter or drift, rock solid.

   Boonstar

 
Eifert@Aximmetry

Hi,

Yes, using the Aximmetry Camera Calibrator will help reduce the drifting. The drift occurs because, when you use Aximmetry Eye's tracking with a video input other than the phone's camera, Aximmetry does not know the camera's location relative to the tracked position. The Camera Calibrator's tracking calibration calculates this misalignment, known as the Delta Head Transform in Aximmetry. Although it is possible to specify the Delta Head Transform manually in Aximmetry by measuring the offset by hand, achieving good precision, particularly for the rotation of the offset, is very challenging. Furthermore, the Camera Calibrator can also do lens calibration, which can enhance realism.
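As a rough illustration only (this is not Aximmetry's internal code, and the numbers are made up), the Delta Head Transform can be thought of as a fixed rigid offset between the tracked device and the cine camera's sensor that is applied to every tracked pose:

```python
# Illustrative sketch of a "delta head" style offset, not Aximmetry's implementation.
import numpy as np

def apply_delta_head(tracker_pose: np.ndarray, delta_head: np.ndarray) -> np.ndarray:
    """Both arguments are 4x4 homogeneous transforms (rotation + translation).
    The cine camera rides on the tracker, so the offset is applied in the
    tracker's local frame."""
    return tracker_pose @ delta_head

# Made-up example: the lens sensor sitting 12 cm to the side of the tracked iPad.
delta_head = np.eye(4)
delta_head[:3, 3] = [0.12, 0.0, 0.0]

tracker_pose = np.eye(4)  # pose reported by Aximmetry Eye for the iPad
camera_pose = apply_delta_head(tracker_pose, delta_head)
```

Measuring the translation part of such an offset by hand is feasible; it is the rotation part that is nearly impossible to measure accurately without a calibration tool, which is why the Camera Calibrator is recommended.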

Additionally, to minimize drifting you should also define the origin:
https://aximmetry.com/learn/virtual-production-workflow/preparation-of-the-production-environment-phase-i/starting-with-aximmetry/aximmetry-eye/what-is-aximmetry-eye-and-how-to-use-it/#defining-the-null-point-of-the-virtual-enviroment

Warmest regards,

 
Boonstar

Thanks for the reply. Yes, it's super challenging to dial in the offset manually, so I reckon I'll go ahead and purchase the Camera Calibrator. Can I confirm that the Camera Calibrator can calibrate the following setup: Aximmetry Eye with an iPad/iPhone for camera tracking, and a Sony FX3 with a Blackmagic capture card for video?

I'm using the above setup for a low-budget experimental short film. I understand it's not an ideal tracking solution, so I'm not expecting it to track like a professional system.

 

 
Eifert@Aximmetry

Hi,

Yes, the Camera Calibrator will help with that setup and the offset calculation.

Note that if you also wish to perform lens calibration with the Camera Calibrator, you must use the camera with a fixed zoom setting or a fixed (prime) lens, because the setup will not be able to track changes if you adjust the zoom. This will not affect your calculated offset (the Delta Head Transform from the tracking calibration). Therefore, when performing lens calibration in the Camera Calibrator, ensure you select Fixed.

You can of course do more than one fixed lens calibration and switch between them in Aximmetry between shoots.
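To give a sense of why a lens calibration is tied to one zoom setting, here is a small, purely illustrative sketch (the profile fields and values are assumptions, not the Camera Calibrator's actual output format): the calibrated field of view follows from a single focal length, so changing the zoom invalidates the profile, while keeping several fixed profiles lets you switch between them per shot.

```python
# Illustrative fixed-lens profiles; field names and values are assumptions.
import math
from dataclasses import dataclass

@dataclass
class FixedLensProfile:
    name: str
    focal_length_mm: float
    sensor_width_mm: float

    @property
    def horizontal_fov_deg(self) -> float:
        # Pinhole model: FOV is fixed once the focal length is fixed.
        return math.degrees(2 * math.atan(self.sensor_width_mm / (2 * self.focal_length_mm)))

# Two fixed calibrations you could switch between shoots (made-up values,
# 35.6 mm being roughly the FX3's full-frame sensor width).
profiles = {
    "wide":  FixedLensProfile("wide", 24.0, 35.6),
    "tight": FixedLensProfile("tight", 50.0, 35.6),
}
print(profiles["wide"].horizontal_fov_deg)   # ~73 degrees
print(profiles["tight"].horizontal_fov_deg)  # ~39 degrees
```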

Additionally, ensure a stable connection between Aximmetry Eye and your computer to avoid any issues during calibration. A stable connection can be ensured by using a wired Ethernet connection for the device, and by using an external cooler on the iPhone. Even though you use the device only for tracking, it can still overheat because iOS uses its camera to calculate the position.

Warmest regards,

 
Boonstar

Hi, I got everything set up with the Camera Calibrator, and I must say the tracking is impressive after calibration. For my learning, I have some questions regarding the origin point when using Aximmetry Eye. How can I tell the Aximmetry Camera Calibrator my origin point when using Aximmetry Eye as the tracker? Here's a picture showing the origin point marker floating in the air after the tracking calibration; the tracking itself was good, though. Is it because I don't have a hardware tracker like the Vive or an Antilatency tracking volume, so the Camera Calibrator doesn't know where my origin point is?

I'm not sure whether what I did was proper: after the tracking calibration, I applied the calibration profile in Aximmetry and then used the ArUco marker to define the origin. Everything seems to be in order and the tracking was good after dialing in some tracking delay. Is this how it should be done, or have I missed a step, such as detecting the origin with the ArUco marker in Aximmetry first, before using the Camera Calibrator? I don't see a way to input the origin point into the Camera Calibrator, though. Everything is working; I just want to understand the process a bit more. Thanks.



 
TwentyStudios

Yes, since the origin point is set in Aximmetry and not in the tracker itself when using Aximmetry Eye, it’s natural that the Camera Calibrator (which doesn’t have the ArUco tag detection) won’t show the correct results. 
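For anyone curious what the ArUco origin step does conceptually, here is a rough, hypothetical sketch using OpenCV (this is not how Aximmetry implements it; it assumes OpenCV 4.7 or newer and a known printed marker size): the marker's pose relative to the camera is estimated from one frame, and that pose can then serve as the null point the tracking is re-based to.

```python
# Hypothetical sketch of ArUco-based origin detection, not Aximmetry's implementation.
# camera_matrix / dist_coeffs would come from a lens calibration.
import cv2
import numpy as np

MARKER_SIZE_M = 0.15  # printed edge length of the marker in meters (assumption)

def find_origin_pose(frame, camera_matrix, dist_coeffs):
    dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
    detector = cv2.aruco.ArucoDetector(dictionary, cv2.aruco.DetectorParameters())
    corners, ids, _ = detector.detectMarkers(frame)
    if ids is None:
        return None

    # 3D corners of the marker in its own frame
    # (top-left, top-right, bottom-right, bottom-left).
    s = MARKER_SIZE_M / 2.0
    obj_points = np.array([[-s, s, 0], [s, s, 0], [s, -s, 0], [-s, -s, 0]], dtype=np.float32)
    ok, rvec, tvec = cv2.solvePnP(obj_points,
                                  corners[0].reshape(4, 2).astype(np.float32),
                                  camera_matrix, dist_coeffs)
    if not ok:
        return None

    # Marker pose in the camera frame as a 4x4 transform; inverting it gives the
    # camera's pose relative to the marker, i.e. relative to the new origin.
    rotation, _ = cv2.Rodrigues(rvec)
    pose = np.eye(4)
    pose[:3, :3] = rotation
    pose[:3, 3] = tvec.ravel()
    return pose
```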

 
Boonstar

Thank you TwentyStudios.

Here are some of my observations after using Aximmetry Eye and the iPad for a few days. I'm using a Sony FX3 camera and the Aximmetry Studio DE edition. This is a very basic setup with no genlocking; the camera is connected to a Blackmagic card, and everything runs at 1080/30p. GPU usage hovers around 30% and CPU around 15%. There's no overheating on the iPad even after many long hours, possibly because I only use it to send tracking data. The iPad is mounted securely on the camera and wired to the computer over Ethernet. I have made multiple calibrations of the lens and tracking using the Camera Calibrator.

After the calibration, in Aximmetry Composer I dial in the tracking delay, say 2 frames, and manage to get everything to 'stick'. However, after about 30 seconds or so it starts to drift. I then have to dial in the tracking delay again; the initial 2-frame delay no longer works, and I have to keep entering a new frame delay just to get the tracking to stick for a little while before it starts to drift again. So am I correct to say that this setup will never work for camera tracking, because the delay varies over time and there's no way to have a constant delay setting without genlocking in a prosumer setup?

I repeated the same test using Detect Camera Delay and got the same result: Detect Camera Delay shows a 2-frame delay, the tracking sticks for a while, then starts to drift again. When I activate Detect Camera Delay again it shows 3.2 frames, and the number varies every time I run it; nothing is constant. The fact that Detect Camera Delay gives a new tracking delay number on every run makes me think the delay is not constant and is fluctuating all the time.

Like I said earlier, I'm not expecting this to perform like a professional tracking system; this is for my learning and to understand the limitations of the Aximmetry Eye app for my use case. I'm also thinking of buying the ReTracker Bliss, but without genlock, will I see anything different from using the Aximmetry Eye app? The reason for considering the ReTracker Bliss is mobility, as I need to set up in different locations.

 
TwentyStudios

If Aximmetry Eye sends tracking data at 30 FPS and your camera also captures at 30p, there is a natural timing variability of up to one frame between them, since they are both running independently without a common clock reference (genlock). Even if you could genlock the tracking data, there would still be one frame of variability since your FX3 doesn't genlock.

There might be a way for Aximmetry to (partially) fix this, since they do have the timing reference of your camera via the video input into Aximmetry. If they could clock (or re-clock) and timestamp the Eye tracking data (possibly using PTP) against the video signal reference, they could increase the timing accuracy. Since I doubt they could actually sync the sensor shutter on the iPad to PTP, it would also require capturing the tracking at a higher frame rate and doing some interpolation, so it's not a trivial thing to accomplish.
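To make the idea concrete, here is a hypothetical sketch (not an existing Aximmetry feature) of the buffering-and-interpolation part: tracking samples are timestamped as they arrive, and a pose is interpolated at each video frame's timestamp so the wander between the two free-running clocks is smoothed out. Positions are interpolated linearly here for brevity; real rotations would need slerp.

```python
# Hypothetical timestamp-and-interpolate scheme for free-running tracking data.
import bisect

class TrackingBuffer:
    def __init__(self):
        self.times = []   # monotonically increasing sample timestamps (seconds)
        self.poses = []   # one position tuple per timestamp

    def push(self, t, pose):
        self.times.append(t)
        self.poses.append(pose)

    def sample(self, frame_time):
        """Linearly interpolate the pose at the video frame's timestamp."""
        i = bisect.bisect_left(self.times, frame_time)
        if i == 0:
            return self.poses[0]
        if i >= len(self.times):
            return self.poses[-1]
        t0, t1 = self.times[i - 1], self.times[i]
        a = (frame_time - t0) / (t1 - t0)
        p0, p1 = self.poses[i - 1], self.poses[i]
        return tuple(x0 + a * (x1 - x0) for x0, x1 in zip(p0, p1))
```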

The ReTracker Bliss sends tracking data into Aximmetry at up to 1000 fps, and since Aximmetry is already locked to your video input, it will always have matching tracking data for each frame. Instead of one frame of variability you're down to about 0.03 frames of variability at 30p, which is virtually undetectable. This is why the ReTracker Bliss doesn't need genlock. They do offer a hardware genlock solution through the FIZZ encoder box, but that's for systems that won't accept the 1000 fps tracking data, and it still requires a camera that can be genlocked as well.
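The 0.03-frame figure is just the ratio of the tracking sample period to the video frame period; a quick back-of-the-envelope check (assuming ideal, evenly spaced samples):

```python
# Back-of-the-envelope check of the variability figures above (assumes ideal timing).
video_fps = 30.0
frame_period = 1.0 / video_fps                 # ~33.3 ms per video frame

for tracking_fps in (30.0, 1000.0):
    worst_case_offset = 1.0 / tracking_fps     # at worst, one tracking sample period stale
    print(tracking_fps, round(worst_case_offset / frame_period, 2))
# 30 fps tracking   -> 1.0 frame of variability
# 1000 fps tracking -> 0.03 frames, as quoted above
```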

 
Boonstar

Thanks, I will definitely check out the ReTracker Bliss. Currently I don't need the Broadcast DE version, and the plus is that the ReTracker Bliss supports both FreeD and OSC; I think OSC will probably work with my Studio DE version. One question, though: is there a difference in quality between the two protocols?

 
Eifert@Aximmetry

Hi,

When using the ReTracker Bliss OSC, you will experience a decrease in quality compared to the Free-D protocol. This is partly because the ReTracker OSC relies on a custom compound to provide tracking functionality. Unlike built-in tracking systems, it does not utilize Aximmetry's internal mechanisms.

Warmest regards,