Hi,
We have been trying for a long time to get our setup working, but it seems we are missing some steps.
Right now our setup has the following elements:
- An 8x8 green cyclorama
- Two Blackmagic URSA Broadcast cameras with Fujinon zoom lenses
- 4 HTC Vive Base Stations 2.0 mounted on truss
- Up to 4 Vive Trackers (2018): 2 for the cameras and 2 for crew and objects
- 4 Glassmark I encoders (focus/zoom for each camera)
- Aximmetry DE Broadcast
- Blackmagic DeckLink Duo
Our usual workflow is the following:
- Place our Vive Pro headset on a fixed reference spot to set the 0 point.
- Open SteamVR and run the room calibration through the developer settings.
- Create a Wi-Fi hotspot from our internal Wi-Fi card.
- Open the LONET server.
- Power up the Glassmark lens encoders.
- Assign each one to its corresponding position (A1 and A2 to camera 1, B1 and B2 to camera 2).
Here we have two approaches:
- Leave it unmapped. In this case, all the data we see on the Camera Calibrator zoom goes from 0.000 to approximately 0.256.
- Create a LONET lens map. This gives a wider range of operation (usually from 0.195 to 3.1xx).
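Just to be explicit about what we mean by "mapping", this is how we picture it (only our mental model, not LONET's actual code; the calibration marks below are invented for illustration):

```python
# How we picture a lens map: the raw, narrow encoder reading is stretched
# into a wider lens-specific range by interpolating between calibration
# marks. Purely illustrative -- these marks are invented and this is not
# LONET's real algorithm.

# (raw encoder value, mapped value) pairs taken at marked zoom positions
LENS_MAP_MARKS = [
    (0.000, 0.195),   # widest end (8 mm)
    (0.128, 1.600),   # somewhere mid-zoom
    (0.256, 3.100),   # tightest end (128 mm)
]

def map_encoder(raw: float) -> float:
    """Piecewise-linear interpolation of a raw encoder value through the map."""
    marks = LENS_MAP_MARKS
    if raw <= marks[0][0]:
        return marks[0][1]
    if raw >= marks[-1][0]:
        return marks[-1][1]
    for (x0, y0), (x1, y1) in zip(marks, marks[1:]):
        if raw <= x1:
            t = (raw - x0) / (x1 - x0)
            return y0 + t * (y1 - y0)

print(map_encoder(0.128))   # -> 1.6
```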
OK, we chose the second one to get a bigger range and a smoother transition between values. So we continue:
- Open Camera Calibrator
- Select the video source for camera 1.
- Set the tracking/video delay to something that seems pretty close (we never got it perfect).
- Move a stand or tripod inside our cyclorama as a positional reference. Sometimes we use two.
- Move the marker close to it after measuring its real-life position from the 0 point, and also set its actual height (and the same for marker two if we used two stands).
- Set the camera properties (just the sensor size, we don't set any other parameter).
- Zoom the camera out to its widest view.
- Create a new lens definition. Adjust Focal Length while panning/tilting the camera and save the value that seems to cause the least drift. (If we read the data, it seems to be a little different from our "real" value: 8 mm on the camera versus usually something between 9 and 11 mm in Aximmetry.)
- Zoom the camera in to its narrowest view.
- Create a new point and again adjust the Focal Length parameter while panning/tilting the camera until it gets close.
- Zoom in and out just to check that the map is working.
From here we can go two ways:
- If it works well, we add 2 or 3 more points just to get better interpolation values.
- If it doesn't, we delete the values and try again.
We usually run into some problems with Camera Calibrator:
- The zoom data does crazy things. When we zoom in and then zoom out, we cannot reach the initial value again (zooming in starts at 0.195, but when we zoom back out our widest value reads anywhere from 0.196 to 0.205). Whenever we move the camera or zoom, we need to zero the encoder manually (which is very frustrating). We know the Glassmark I is really, really sensitive and any small movement will be read, but we think the range of travel from 8 mm to 128 mm should be much, much greater than 0.195 to 3.xxx, so these tiny value changes should not be a problem.
- While zooming, we get a jump at some point in between. We tested with just two calibration points (widest and narrowest) and somewhere in the middle the FOV jumps even though the zoom data doesn't (see the sketch right after this list).
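To show why the jump surprises us: with only two calibration points we would expect something like a plain interpolation between them, so a continuous zoom value should produce a continuous FOV. This is only how we picture it, not Aximmetry's real interpolation, and the FOV numbers are rough values we made up for a roughly 9.6 mm wide image area:

```python
# What we expect the zoom -> FOV mapping to do between two calibration
# points: a smooth, monotonic interpolation, so a continuous zoom value
# should never produce a sudden FOV jump. Numbers are rough/invented;
# this is not Aximmetry's actual interpolation.
import numpy as np

# (zoom encoder value, horizontal FOV in degrees) calibration points
CAL_POINTS = np.array([
    [0.195, 61.9],   # widest, roughly 8 mm on a ~9.6 mm wide sensor area
    [3.100,  4.3],   # tightest, roughly 128 mm
])

def fov_from_zoom(zoom: float) -> float:
    """Linear interpolation between the calibration points."""
    z, fov = CAL_POINTS[:, 0], CAL_POINTS[:, 1]
    return float(np.interp(zoom, z, fov))

# Sweeping the zoom smoothly sweeps the FOV smoothly as well
for z in np.linspace(0.195, 3.100, 6):
    print(f"zoom={z:.3f} -> fov={fov_from_zoom(z):.1f} deg")
```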
At this point we keep the least-bad camera mapping and close Camera Calibrator. As our cameras and lenses are exactly the same, we continue with the next steps:
- Go to Aximmetry Composer.
- Create a new compound and add the MixedCam_Unreal template.
- Add our Unreal scene to the compound and link all the pins.
- On TRK INPUTS, set the video source, the tracking data (positional and lens) and the lens profile, and activate Use External Lens Data.
- On each Origin square, set the offset from the tracker to the camera sensor (there is a small sketch of what we mean right after this list).
- Set up the Studio using all the measurements from real life.
- Check by moving the camera. If it's not correct, in Manage Devices -> OpenVR we set the 3 points of the space calibration with a tracker.
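For reference, this is how we understand the tracker-to-sensor offset on the Origin square: the camera pose is the tracker pose composed with a fixed rigid offset, so any error in that offset turns into position/rotation errors that grow when the camera rotates. A minimal numpy sketch of the idea (the offset values are invented and this is not Aximmetry's internal math):

```python
# How we understand the Origin offset: camera (sensor) pose in studio space
# is the tracker pose composed with a fixed rigid offset measured from the
# tracker to the sensor / entrance pupil. Invented example numbers; not
# Aximmetry's internal math.
import numpy as np

def pose_matrix(rotation: np.ndarray, translation: np.ndarray) -> np.ndarray:
    """Build a 4x4 homogeneous transform from a 3x3 rotation and a translation."""
    m = np.eye(4)
    m[:3, :3] = rotation
    m[:3, 3] = translation
    return m

# Example tracker pose in studio space (no rotation, mounted 1.8 m high)
tracker_in_studio = pose_matrix(np.eye(3), np.array([0.0, 1.8, 0.0]))

# Fixed offset from the tracker to the camera sensor, e.g. 12 cm down and
# 20 cm back along the lens axis (made-up numbers)
sensor_in_tracker = pose_matrix(np.eye(3), np.array([0.0, -0.12, -0.20]))

# Compose: camera pose = tracker pose * fixed offset
camera_in_studio = tracker_in_studio @ sensor_in_tracker
print(camera_in_studio[:3, 3])   # prints roughly [0. 1.68 -0.2]
```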
Sometimes this mostly works and both cameras see something plausible. Other times everything has strange rotations, which we correct manually on the Origin nodes. But this doesn't hold forever: we start "playing" our show and, from one moment to the next, the whole scene moves up, there is a huge offset between the virtual 0,0,0 and the real one, there are strange rotations, or when we rotate the camera it looks like the lens calibration has gone for a walk. When these things happen, we try to recalibrate the whole space again (first from SteamVR and then inside Aximmetry). On other days, just stopping and playing again will mess up the whole scene.
So, to summarize, our problems are:
- Calibrating the camera: we cannot get a good lens calibration, or at least a consistent one (one that doesn't vary randomly from one moment to the next).
- Calibrating the camera: in the point interpolation there are sometimes strange jumps instead of smooth curves, but only in the FOV values.
- Calibrating the scene: there is a weird offset between the real-world reference points and the virtual ones, even after resetting the calibration from SteamVR or Manage Devices. This can happen randomly in the middle of a live show and it is really, really difficult to get it working again (not good...).
- Live: at the beginning of the show both cameras see something "correct". From time to time, differences start to appear between one camera and the other (the talent is in a different spot on each camera, just as if the lens mapping or the offset had changed over time).
Some workarounds we have done to help with calibration:
- To calibrate the studio: link the transformation to a Vive Tracker and add a button to "catch" the current position, so we can move and rotate it easily.
- To calibrate the UE scene: same as for the studio, we link the transformation to a Vive Tracker and store the current transform with a push button. Usually this is a must in order to set the scene floor quickly.
- We know that Vive Trackers have a lot of jitter, so we built a "filter" to at least lock the camera position while the camera is not moving. It simply rejects data that is way off what is expected (such as moving 1 m from one frame to the next; our camera operators are not that fast). Now we are trying to add a Kalman filter to it (a rough sketch of the idea is below).
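Roughly, the filter idea is the following (a simplified Python sketch just to show what we mean; the real thing is built from Aximmetry nodes, and the thresholds and noise values here are invented):

```python
# Simplified sketch of our tracker "filter": per-frame outlier rejection
# (drop physically impossible jumps) followed by a simple constant-position
# Kalman update for smoothing. Thresholds and noise values are invented;
# our real version is an Aximmetry compound, not Python.
import numpy as np

class TrackerFilter:
    def __init__(self, max_jump_m=1.0, process_var=1e-4, measurement_var=1e-3):
        self.max_jump = max_jump_m   # reject jumps larger than this per frame
        self.q = process_var         # how much the true position may move per frame
        self.r = measurement_var     # how noisy we believe the tracker is
        self.x = None                # filtered position estimate (3-vector)
        self.p = np.ones(3)          # per-axis estimate variance

    def update(self, measured) -> np.ndarray:
        measured = np.asarray(measured, dtype=float)
        if self.x is None:
            self.x = measured.copy()
            return self.x
        # Outlier rejection: a camera operator cannot move 1 m in one frame,
        # so treat such samples as glitches and keep the previous estimate.
        if np.linalg.norm(measured - self.x) > self.max_jump:
            return self.x
        # Constant-position Kalman update, independently per axis.
        self.p = self.p + self.q                     # predict: uncertainty grows a bit
        k = self.p / (self.p + self.r)               # Kalman gain
        self.x = self.x + k * (measured - self.x)    # correct toward the measurement
        self.p = (1.0 - k) * self.p
        return self.x

# Example: a stationary but jittery tracker, with one glitched sample rejected
f = TrackerFilter()
for sample in ([0.00, 1.80, 0.00], [0.01, 1.79, 0.00], [1.50, 1.80, 0.00]):
    print(f.update(sample))
```

The outlier check is what we have today; the constant-position Kalman update is what we are trying to add for the smoothing part.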
Sorry for the long write-up; I wanted to explain everything in as much detail as I can. Maybe some point in our workflow is missing, or maybe it helps somebody else.
Thank you so much!
Adrian
Thanks, Adrian, for making this post.
We have a very similar problem with our setup, but we are using Antilatency.
I tried multiple calibrations with the Camera Calibrator, but it didn't seem to change the outcome.
To add to that, I also had the issue of the zoom in AX being reversed from the real camera after calibration: I would zoom the real camera in and the VP camera would zoom out, and vice versa.
I am not sure if it is because of the calibration data or something else, but the drift and slide also seem way more off than in the calibrator.
When I set a manual FOV it is OK, but we need to work with zoom.
Also, I tested the Glassmark directly in Unreal and the values seem stable and correct.
The documentation video is nice, but it assumes a straightforward scenario; some troubleshooting and problem-solving advice would be very appreciated.
Thank you.