Change origin of a tracked cam

 

When using a virtual cam, it's pretty simple to place it where you want in the 3D scene: you go into edit mode, navigate, find your spot and "put in front" the billboard.

When using a tracked cam, I know you have to use the Scene Node to change the origin. What is your workflow to reach the same position or area? How do you get oriented? For example, I use a pretty large scene and the origin of my scene is randomly outside the building, so I'm in full black and stuck with the FOV of my tracked lens. No landmarks or reference points to know where to go in that sea of black :)

   whoozben

 
ericmarodon

First, note that you can switch to "free camera" mode even if you're in a tracked camera compound, so you can move the view around freely to understand how your scene is oriented and placed.

Then, I try to make sure the origin of my 3D scene is in a sensible spot (e.g. in the middle of the set, not in a wall). The Scene tab can help move things around.

Finally, I set an ArUco marker on the ground in the studio where I need the origin to be, and use the "detect origin" trigger to roughly position the cameras relative to the studio.


 
whoozben

That's the main problem: free camera is a great mode to go wherever you want, but when you go back to normal mode it's pretty hard to return to where you wanted to be.

When you say origin of the 3D scene, do you mean the origin of the Unreal scene? Because I thought that was something you couldn't change.

I'll give the ArUco marker a go, but if it's the same procedure as setting up an origin center with Vive Mars, been there, done that :)


 
RikvdReijen

Hey there, I am trying to achieve the same thing: aligning the tracked camera with an ArUco marker on the table. To make it easy (or so I had hoped), I also placed an ArUco marker in the Unreal Engine scene so that I can easily get the exact coordinates from there for alignment. However, it turns out not to be so easy. I have changed the scene coordinates to match the ArUco marker placed within the 3D scene (compare the values in the screenshot below).

[screenshot]

And here is a screenshot from Aximmetry Eye:

[screenshot]

Furthermore, to get it roughly aligned I rotated the scene by 90 degrees. I was still wondering whether that's correct or not (the ArUco marker is also rotated by 90 degrees, but that could just be a coincidence?).

 
RikvdReijen

Furthermore, I discovered that the scale of my tracked camera movement in Aximmetry is not equal to that in the real world. I tested this by moving the camera along the table's edge, but it's not one-to-one.

 
TwentyStudios
First of all: Aximmetry and Unreal use different scales and coordinate systems, so you can’t just copy the values from one application to the other. There’s a compound in Aximmetry that helps you convert the values via copy/paste. 
Secondly, the ArUco marker is intended for matching the floor plane of the tracking data and the virtual world. If the floor plane isn’t at zero in your Unreal scene, the tracking data will be offset and won’t be correct anymore unless you compensate for it. 
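The kind of conversion that compound performs can be sketched in plain Python. The exact axis mapping below is my assumption (Unreal uses centimeters with a Z-up, left-handed system; I'm assuming a meters, Y-up convention on the Aximmetry side), so treat it as an illustration of why raw copy/paste fails rather than the real conversion:

```python
# Illustration only: Unreal positions are centimeters, Z-up; the target
# convention (meters, Y-up) is an assumption, not Aximmetry's documented API.
def unreal_to_meters_y_up(x_cm, y_cm, z_cm):
    # Centimeters to meters
    x, y, z = x_cm / 100.0, y_cm / 100.0, z_cm / 100.0
    # Swap the up axis: Unreal's Z becomes Y (assumed mapping)
    return (x, z, y)

# An Unreal position of (350, -120, 90) cm is nowhere near (350, -120, 90)
# in a meters-based system:
print(unreal_to_meters_y_up(350.0, -120.0, 90.0))  # -> (3.5, 0.9, -1.2)
```

This is why values pasted verbatim from Unreal into Aximmetry put the scene in a completely different place.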
 
whoozben

I would like to get back to the topic of my post.

> When you start to work with a tracked cam, why is it not located at the origin of the scene (or pretty close)?

> Why is this random origin of the tracked cam so different from the one you get when you start free mode? How can you get oriented if your tracked cam starts way outside the scene and you're in full black?

 
TwentyStudios
@whoozben: It should be located at the origin of your scene. Are you sure your Unreal scene origin (0,0,0) is where you think the world origin is? Whatever your tracking system sends as the world origin will correspond to the world origin in Unreal. How else should it work?
 
Eifert@Aximmetry

Hi,

If your intended location in Unreal is not near the 0,0,0 position, you should select an object near that location in Unreal and copy its position into the [Common_Studio]:Compounds\Tools\Unreal_Transformation.xcomp compound.
You can even create a setup using the Set Transformation Pin module to then easily set the SCENE panel's transformation in Aximmetry:

Note that you will have to change the Module pin above according to which camera compound you are using.
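The idea behind copying an object's position into the transformation compound can be sketched as follows. This is a hypothetical illustration, not an Aximmetry API: offsetting the whole scene by the negated position of your point of interest brings that point to the origin, which is where the tracked camera starts.

```python
# Hypothetical sketch (not an Aximmetry API): translate the scene by the
# negated position of the spot you care about, so it lands at the origin.
def scene_offset_for_target(target_pos):
    return tuple(-c for c in target_pos)

# If your set's center sits at (12.0, 1.5, -4.5) in scene units, this offset
# moves it to (0, 0, 0):
print(scene_offset_for_target((12.0, 1.5, -4.5)))  # -> (-12.0, -1.5, 4.5)
```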

Regarding your previous comment, Whoozben, about the difficulty of moving the SCENE to where the Free camera is located: we are aware of this. A solution will be added in a future update of Aximmetry.

Warmest regards,

 
RikvdReijen

Thanks for the advice, guys. It turns out that even though my locations don't perfectly match up, the FOV of the virtual camera does not match the tracked camera of Aximmetry Eye (shouldn't that already be handled by the calibration file, since the iPhone is pretty standard?).

By the way, I tested whether my movement was scaled by doing a 3D scan of my studio and walking from wall to wall (after first positioning the tracked camera at a reference point). Notice in the picture below that even though the electrical wall outlets (marked in red) are in approximately the same spot, the virtual camera shows much more of the triangle (the side of a dormer).

[screenshot]

 
Eifert@Aximmetry

Hi RikvdReijen,

Yes, Aximmetry Eye does transmit the Field of View (FOV) and Lens Data, so your virtual camera's FOV should be the same as your iPhone's camera. Unless you have Manual Lens turned on, more on that here: https://aximmetry.com/learn/virtual-production-workflow/green-screen-production/tracked-camera-workflow/inputs-tracked-camera/#manual-lens-parameters
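As a side note, the relationship between focal length and FOV is the usual pinhole-camera model; a quick sanity check (not Aximmetry code, and the numbers are made up, not real iPhone lens data) looks like this:

```python
import math

# Pinhole-camera horizontal FOV from focal length and sensor width.
# Illustrative values only, not real iPhone lens data.
def horizontal_fov_deg(focal_mm, sensor_width_mm):
    return math.degrees(2.0 * math.atan(sensor_width_mm / (2.0 * focal_mm)))

# A sensor exactly twice as wide as the focal length gives a 90-degree FOV:
print(round(horizontal_fov_deg(2.8, 5.6), 1))  # -> 90.0
```

If the lens data transmitted by Eye is correct, the virtual camera's FOV should fall out of exactly this kind of relationship without manual tweaking.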

It's better not to use 3D scans for comparisons, as they tend to lack accuracy. Instead, consider using the ArUco marker with the Marker Detector module, which provides a precise position for the ArUco in the real world. You can capture the Marker Detector's transformation output from various iPhone camera positions to observe any changes, or simply use a measurement tape to measure the movement of your iPhone.
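The comparison suggested here can be expressed very simply: record the marker's reported position from two camera poses, compute the distance between them, and compare it with a tape measurement. A toy sketch with made-up positions (not the Marker Detector's actual output format):

```python
import math

# Marker positions reported from two different camera poses (made-up values)
p0 = (0.0, 0.0, 0.0)
p1 = (1.2, 0.0, 0.9)

detected_m = math.dist(p0, p1)   # distance the tracking thinks you moved
tape_m = 1.5                     # distance measured with a tape

scale_error = detected_m / tape_m
print(round(detected_m, 3), round(scale_error, 3))  # -> 1.5 1.0
```

A scale_error far from 1.0 would confirm the scaling problem described earlier in the thread.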

Also, keep in mind that iPhone tracking isn't very precise over long distances. Since it doesn't use an external device for measurements from a fixed point, errors can accumulate over time. Consequently, returning to your original position with the iPhone will not really reset measurement inaccuracies.
To actually reset the inaccuracies, you could use the ArUco marker to detect the origin during breaks in your production, not just when starting Aximmetry Eye and Aximmetry (when it is essential).
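The drift accumulation described above can be simulated in a few lines. This is a toy model, not real tracking data: each step adds a small random error to the estimate, and "re-detecting the origin" simply replaces the accumulated estimate with a known-good position.

```python
import random

random.seed(0)  # deterministic toy run
true_pos = 0.0
est_pos = 0.0
for _ in range(1000):
    step = 0.01
    true_pos += step
    est_pos += step + random.uniform(-0.0005, 0.0005)  # per-step drift

drift_before = abs(est_pos - true_pos)
# "Detect origin" during a break: re-anchor the estimate to a known position
est_pos = true_pos
drift_after = abs(est_pos - true_pos)
print(drift_after < drift_before)  # -> True
```

Returning the phone to its starting point does nothing comparable, because the per-step errors have already been baked into the estimate; only the re-anchoring step removes them.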

Warmest regards,
