Hi, I have just started using a Panasonic AW-UE100 PTZ camera to display augmented reality elements within real environments. I am finding it very difficult to calibrate the virtual scene against the real one: when I pan or tilt the camera, the virtual object drifts slightly relative to the real scene captured by the camera, which gives a pretty bad result.
Any advice on how to align the two scenes properly?
I am using the TrackedCam_AR_Unreal_Prev_3-Cam compound in Aximmetry.
Could it be a camera calibration problem?
Is there any best practice guide?
Thanks.
You need to understand how to get the tracking data from the PTZ camera into Aximmetry, and then use Aximmetry's camera calibration tools to match the virtual camera to the real lens. Once that is done, working with AR becomes easy.
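Before calibrating, it can help to confirm that the tracking data is actually reaching your machine. Below is a minimal sketch (not an official tool) that listens for FreeD Type D1 packets, which the AW-UE100 can be configured to send over UDP; the port number 1111 and the standard FreeD scaling (pan/tilt/roll as signed 24-bit values with 15 fractional bits) are assumptions here, so adjust them to match your camera's tracking-output settings.

```python
import socket

# Assumption: the AW-UE100 is configured to stream FreeD (Type D1)
# packets over UDP to this machine. Change PORT to match the port set
# in the camera's tracking-output configuration.
PORT = 1111

def s24(b):
    """Decode a signed 24-bit big-endian integer."""
    v = (b[0] << 16) | (b[1] << 8) | b[2]
    return v - (1 << 24) if v & 0x800000 else v

def u24(b):
    """Decode an unsigned 24-bit big-endian integer."""
    return (b[0] << 16) | (b[1] << 8) | b[2]

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(("0.0.0.0", PORT))
print(f"Listening for FreeD packets on UDP port {PORT}...")

while True:
    data, addr = sock.recvfrom(64)
    # A FreeD Type D1 packet is 29 bytes long and starts with 0xD1.
    if len(data) < 29 or data[0] != 0xD1:
        continue
    pan   = s24(data[2:5])  / 32768.0   # degrees
    tilt  = s24(data[5:8])  / 32768.0
    roll  = s24(data[8:11]) / 32768.0
    zoom  = u24(data[20:23])            # raw encoder value
    focus = u24(data[23:26])
    print(f"pan={pan:8.3f}  tilt={tilt:8.3f}  roll={roll:8.3f}  "
          f"zoom={zoom}  focus={focus}")
```

If the pan/tilt values update smoothly while you move the camera, the tracking link is fine, and the drift you see is more likely a calibration issue (zoom-to-field-of-view mapping, lens distortion, nodal offset) that Aximmetry's calibration workflow is meant to solve, rather than a data problem.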