Correct workflow for camera calibration with Glassmark encoders and HTC Vive

 

Hi,

We have been trying for a long time to get our setup working, but it seems we are missing some steps.

Right now our setup has the following elements:

  • An 8x8 green cyclorama
  • Two Blackmagic URSA Broadcast cameras with Fujinon zoom lenses
  • Four HTC Vive Base Station 2.0 units mounted on truss
  • Up to four Vive Trackers (2018): two for the cameras and two for crew and objects
  • Four Glassmark I encoders (focus/zoom for each camera)
  • Aximmetry DE Broadcast
  • A Blackmagic DeckLink Duo

Our usual workflow is the following:

  • Put our Vive Pro headset on a fixed common spot to set the zero point.
  • Open SteamVR and run room calibration through the developer settings.
  • Create a WiFi hotspot from our internal WiFi card.
  • Open the LONET server.
  • Power up the Glassmark lens encoders.
  • Set each to its corresponding position (A1 and A2 for camera 1, B1 and B2 for camera 2).

    Here we have two approaches:

  1. Leave it unmapped. In this case all the data we see on the Camera Calibrator Zoom goes from 0.000 to approximately 0.256.
  2. Create a LONET lens map. This results in a wider range of operation (usually from 0.195 to 3.1xx).

    Ok, we choose the second one to get a bigger range and smoother transitions between values. So we continue:

  • Open Camera Calibrator
  • Select the video source for camera 1.
  • Set the tracking/video delay to something that seems pretty close (we never got it perfect).
  • Move a stand or tripod inside our cyclorama as a positional reference. Sometimes we use two.
  • Move the marker to a nearby position after measuring its real-world position from the zero point, and also set its actual height (likewise for marker two if we used two stands).
  • Set the camera properties (just the sensor size; we don't set any other parameter).
  • Zoom the camera to its widest view.
  • Create a new lens definition. Adjust the Focal Length while panning/tilting the camera and save the value that seems to cause the least drift. (If we read the data, it seems to be a little different between our "real" value (8mm on the camera) and the Aximmetry one (usually somewhere between 9 and 11mm).)
  • Zoom the camera to its narrowest view.
  • Create a new point and again adjust the Focal Length parameter while panning/tilting the camera until it gets close.
  • Zoom in and out just to check the map is working.

    Here things can go one of two ways:

  1. If it works well, we add 2 or 3 more points just to get better interpolation values.
  2. If it doesn't, we delete the values and try again.

    We usually run into some problems with Camera Calibrator:

  1. The zoom data does crazy things. When we zoom in and then zoom out, we cannot get back to the initial value (zooming in starts at 0.195, and when we zoom out our widest value ends up between 0.196 and 0.205). Whenever we move the camera or zoom, we need to zero the encoder manually (which is very frustrating). We know the Glassmark I is really, really sensitive, so even a little movement will be read. But we think the range of movement from 8mm to 128mm should be much, much greater than 0.195 to 3.xxx, so these tiny value changes would not be a problem.
  2. While zooming we get a jump at some point in between. We tested with just two calibration points (widest and narrowest), and somewhere in the middle the FOV jumps even though the zoom data doesn't.

    At this point we have the least-bad camera mapping and close Camera Calibrator. As our cameras and lenses are exactly the same, we continue with the next steps:

  • Go to Aximmetry Composer.
  • Create a new compound and add the MixedCam_Unreal template.
  • Add our Unreal scene to the compound and link all pins.
  • On TRK INPUTS, set the video source, tracking data (positional and lens) and lens profile, and activate Use External Lens Data.
  • On each ORIGIN panel, set the offset from the tracker to the camera sensor.
  • Set up the Studio using all the real-life measurements.
  • Check the moving camera. If it's not correct, we set the 3 points of the space calibration with a tracker under Manage Devices -> OpenVR.

Sometimes this mostly works: both cameras see something plausible. Other times everything has strange rotations that we correct manually on the Origin nodes. But this doesn't hold forever... we start "playing" our show and at some point the whole scene moves up, there is a huge offset between the virtual 0,0,0 and the real one, or strange rotations appear, or when we rotate the camera it looks like the lens calibration went for a walk. When these things happen, we try to recalibrate the whole space again (first from SteamVR and then also inside Aximmetry). On other days, just stopping and playing again will mess up the whole scene.

So, to summarize, our problems are:

  • Calibrating the camera: we cannot get a good lens calibration, or at least a consistent one (one that doesn't vary randomly from one moment to another).
  • Calibrating the camera: in the point interpolation there are sometimes strange jumps instead of smooth curves, only on the FOV values.
  • Calibrating the scene: there is a weird offset between the real-world reference points and the virtual ones, even after resetting the calibration from SteamVR or Manage Devices. This can happen randomly in the middle of a live show, and it is really, really difficult to get things working again (not good...).
  • Live: at the beginning of the show both cameras see something "correct". From time to time, differences start to appear between one camera and the other (the talent stands in a different spot on each camera, just as if the lens mapping or offset had changed over time).

Some workarounds we have built to help calibration:

  • To calibrate the studio: link the transformation to a Vive Tracker, and add a button to "catch" the current position so it is easy to move and rotate it.
  • To calibrate the UE scene: same as for the studio, link the transformation to a Vive Tracker and store the current transform with a push button. This is usually a must to set the scene floor quickly.
  • We know that Vive Trackers have a lot of jitter, so we made a "filter" that at least locks the camera position while the camera is not moving. It simply rejects data that is way off expectations (such as moving 1m from one frame to the next; our camera operators are not that fast). We are now trying to add a Kalman filter to it. A sketch of the gate is shown after this list.
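For reference, here is a minimal sketch of that rejection gate (a hypothetical standalone version with assumed thresholds; the real one runs inside an Aximmetry scripting node):

    # Minimal sketch of the noise gate we use on tracker positions.
    # Thresholds are assumptions to tune per setup; ours runs inside an
    # Aximmetry scripting node rather than as standalone Python.

    MAX_STEP = 0.10     # reject jumps larger than 10 cm per frame as impossible
    FREEZE_EPS = 0.002  # treat movement below 2 mm as jitter and hold position

    def gate(prev, new):
        """prev/new are (x, y, z) positions; returns the position to use this frame."""
        dist = sum((a - b) ** 2 for a, b in zip(prev, new)) ** 0.5
        if dist > MAX_STEP or dist < FREEZE_EPS:
            return prev  # keep the last good position
        return new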

Sorry for the long write-up; I wanted to explain everything in as much detail as I can. Maybe some point of our workflow is missing, or maybe it helps somebody else.


Thank you so much!

Adrian

   Qyuipo

 
marioreverist

Thanks, Adrian, for making this post.
We have a very similar problem with our setup, but we are using Antilatency.

I tried multiple calibrations with the Camera Calibrator, but it didn't seem to change the outcome.

To add, I also had the issue of the zoom in AX being reversed from the real camera after calibration: I would zoom the real camera in and the VP camera would zoom out, and vice versa.

I am not sure if it is because of the calibration data or something else, but the drift & slide also seems way more off than in the calibrator.

When I set a manual FOV it is OK, but we need to work with zoom.

Also, I tested the Glassmark directly in Unreal and the values seem stable and correct.


The documentation video is nice, but it assumes a straightforward scenario; some troubleshooting advice would be very appreciated.


Thank You.

 
Zoltan@Aximmetry

Hi Adrian,


Thank you for the detailed and comprehensive info on the situation.


we choose the second one to get a bigger range and smoother transitions between values

Firstly, unmapped data is just as smooth as mapped data. Yes, you only see 3 decimals on the screen, but the value itself has the same precision internally.

But what is more important, I'm not sure you always get values that are linearly proportional to the raw encoder values. In that case you will get a very different interpolation result in Aximmetry.

This is why we usually recommend using the unmapped mode.
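To illustrate the point (the lens map below is a made-up nonlinear curve, not LONET's actual mapping): if the map is not linear in the raw counts, a zoom position that is halfway in raw encoder terms is no longer halfway in mapped terms, so the interpolated FOV lands somewhere else:

    # Illustration of how a nonlinear lens map changes interpolation results.
    # lens_map is a hypothetical curve, not LONET's real mapping.

    def lens_map(raw):
        return raw ** 2

    def lerp(a, b, t):
        return a + (b - a) * t

    fov_wide, fov_tele = 60.0, 10.0   # FOVs at the two calibration points
    raw = 0.5                         # halfway in raw encoder terms

    print(lerp(fov_wide, fov_tele, raw))            # unmapped: 35.0 degrees
    print(lerp(fov_wide, fov_tele, lens_map(raw)))  # mapped:   47.5 degrees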


Set the tracking/video delay to something that seems pretty close (we never got it perfect).

You can try fractional values as well; sometimes that helps.
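For intuition (this is only a sketch of the concept, not Aximmetry's implementation), a fractional delay can be thought of as linear interpolation between the two nearest buffered tracking samples:

    # Conceptual sketch of a fractional tracking delay: interpolate between
    # the two nearest buffered samples. Not Aximmetry's actual code.

    def delayed_value(buffer, delay_frames):
        """buffer[0] is the newest sample; delay_frames may be fractional."""
        i = int(delay_frames)
        frac = delay_frames - i
        newer, older = buffer[i], buffer[i + 1]
        return newer + (older - newer) * frac

    samples = [10.0, 10.2, 10.5, 10.9]   # newest first
    print(delayed_value(samples, 1.5))   # 10.35, halfway between samples 1 and 2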


Set the camera properties (just the sensor size; we don't set any other parameter)

Why? This is a key point. If you use Camera Calibrator then you should set the Delta head transformation here. The ORIGIN panels' Delta head properties in the scene are primarily intended for Manual Lens mode. If you want a proper Focal Length calibration you have to see the scene properly, which means having the proper parameters already set here in Camera Calibrator.


seems to be a little different between our "real" value (8mm on the camera) and the Aximmetry one (usually somewhere between 9 and 11mm)

This can be because the Sensor Width value you provide usually comes from the camera's tech specs or is found on the Net, but it may differ from the effective sensor area currently used by the camera. Also, we ignore lens distortion with this method, which can cause a difference too. This is why we offer this visual method instead of simply entering the focal lengths you see on the lens.
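To make the relation concrete (the numbers below are illustrative assumptions, not measured values): the calibrator visually fits a focal length that reproduces the observed field of view for the sensor width you entered, so an entered width wider than the effective one yields a longer fitted focal length:

    # The FOV satisfies tan(FOV/2) = W / (2*f), so matching the same observed
    # FOV with a different width gives f_fit = f_real * W_entered / W_effective.
    # All numbers here are assumptions for illustration.

    f_real = 8.0        # mm, marked on the lens
    w_effective = 9.0   # mm, sensor area the camera actually uses (assumed)
    w_entered = 11.0    # mm, spec-sheet width entered in Camera Calibrator (assumed)

    f_fit = f_real * w_entered / w_effective
    print(f_fit)  # ~9.8 mm, i.e. in the 9-11 mm range reported above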


The zoom data does crazy things. When we zoom in and then zoom out, we cannot get back to the initial value

Aximmetry does not do anything with the data coming from LONET. There's nothing in Camera Calibrator that can cause this. But to rule out any chance, I suggest the following: start LONET's own encoderview.exe app (it's in the utilities folder). Reset everything, then check whether the values displayed by encoderview return to (near) the initial value when you zoom in and out.

Please let us know if you find different results than with Camera Calibrator.


While zooming we get a jump at some point in between.

If you mean occasional jumps, that is something we and other customers have also experienced, and it is probably rooted in stuttering of the WiFi network. This is why LOLED already offers a wired solution, both for the original Glassmark I and for the new Indiemark, which is purely USB based.

If you mean a constant bump in the curve, you can try the other interpolation method, which can be set in Camera Properties.


and activate Use External Lens Data

No problem here, just for your information: this option has no effect in this case. It means receiving the full calculated lens data from the tracking system itself ("external" refers to that), and it is only relevant with professional tracking systems that provide that data. In your case, since no external data is coming in, the system uses the "internal" data, which is calculated from the lens file you created with Camera Calibrator.


On each ORIGIN panel, set the offset from the tracker to the camera sensor.

We discussed this above.


at some point the whole scene moves up, there is a huge offset between the virtual 0,0,0 and the real one, or strange rotations appear

There's nothing in Aximmetry that randomly shifts the virtual space over time or between sessions. In our limited experience with Vive we encountered similar fluctuations in the incoming data. That is why we designate it as an "experimental" device and do not offer official support for it.


Apart from these, the steps you described are correct.

Using trackers for calibrating the studio and scene is a good addition. (However, it also depends on the reliability of the tracking system.)


Thank you again for sharing these issues.

 
Zoltan@Aximmetry

Hi Mario,

Regarding the reversed zoom: you have to know that the Glassmark sends unsigned data. This means that if you rotate it backwards and cross zero, the values will rise again. Below zero it works inverted.

To avoid that, always make sure to reset the Glassmark while your lens is at its widest zoom state before use.
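A tiny illustration of the effect (my own sketch, not LOLED's code): if the encoder effectively reports the magnitude of the travel, any movement below the reset point reads as rising values again, so the zoom direction appears reversed:

    # Sketch of the unsigned-encoder effect: with zero set mid-range, travel
    # below zero reads as rising values again (direction appears inverted).

    signed_travel = [-2, -1, 0, 1, 2, 3]        # rotation relative to reset point
    unsigned_read = [abs(x) for x in signed_travel]
    print(unsigned_read)                        # [2, 1, 0, 1, 2, 3]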

 
Qyuipo

Hi Zoltan, thank you so much for your response.

About the camera properties: we understood from the docs that those parameters were just for PTZ cameras, and this also sounds weird to us. Maybe that's it.

About zooming: yes, we first talked to Andy from LOLED. We explained our problem, and he told us it seemed to be an Aximmetry thing, as the raw data is usually correct. There is also a mechanical problem: the encoders are really, really sensitive and count steps from just small movements of the rig (panning or tilting). We just don't understand why our data sometimes starts at 0.195 and other times at 0.000.

About the range: we found that sometimes while zooming the interpolation is not good. To explain it more clearly, it is as if we had a keyframe every two frames with the interpolation set to none; the zoom "jumps" between poses. At the beginning we thought it was a camera framerate/interlacing problem, but after ruling that out we found no other explanation. We suspected it might be related to the range, but since you said it internally has more decimals than shown... we will keep searching.

Other times, when we are calibrating, we put in just two keys, widest and narrowest, and somewhere in between it is as if there were another phantom point doing crazy things (this is related to Mario's problem, I think). And yes, we set the zero of the encoder correctly at the widest point (we now do that each time we move the camera).

About External Lens Data: oh, OK! We thought that button referred to external tracking data from Composer (as read here: Camera Calibrator). Good point.

About the Vive Trackers: yes, but one thing I don't understand is this: if we put on the headset, all the base stations appear in place correctly; we can point at them and their virtual positions match the real ones. Why does everything go terrible once the data is brought into Aximmetry?

We will try again and record a video to show you.


While I have your attention, I have a few suggestions that we think could improve the DE workflow a lot:

  • As we mentioned before, a solution to "catch" the zero point and rotation from trackers for the Scene and Studio is really useful (if tracking works).
  • To get better tracking and defeat jitter, we are adding a Kalman filter using a Lua scripting node, and we also made a noise gate to "freeze" the position of the camera when it is not moving, or not moving enough. Maybe it would be interesting to apply these early in the pipeline (when the tracking data is read) and have an option to switch them on or off (the Kalman filter would also work better in Composer native code, C++ I guess, and could use acceleration data from the HTC). A minimal filter sketch is shown after this list.
  • In the new VP Toolkit from Ian Fursa, he is developing a quick system to calibrate the camera offset: he puts another tracker on other points of the camera (such as the lens, to get the forward point) and it automatically corrects the orientation of the tracker. It is nothing fancy, but it is really quick.
  • Maybe it would be interesting to have a measurement tool inside Composer, or to continue with the tracker helpers, to measure the real distances of the studio: for example, move the tracker to the right wall, click, done. This would help match both worlds as closely as possible (again, if the trackers are OK). We are making a UE project to do this.
  • We have reworked the whole preview system to match a traditional workflow. We now have a mosaic on one output with all cameras: tracked, virtual and other sources too. If you are interested we can provide more info by mail.
  • We are working on a show where one of the talents is a virtual puppy with prerecorded animations and speech, and we have a script that we must follow. When we were setting up the Control Board for this, we missed having some node to add text or otherwise customize the UI. In Flow we have note nodes, but on the Control Board we have nothing. We are working around this with pin collectors, but it is not a good way. Why is it useful? Well, we have a button for each animation take; it would be really cool to be able to read the transcription of that animation, or the presenter's speech, next to it, working as an interactive script. Our next goal for this is to make an HTML or Python GUI and send all the commands to AX by cmd. Simple things like this would greatly improve the possibilities of AX Composer/Pilot and make it more interesting.
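For completeness, here is the shape of the per-axis 1D Kalman filter we are experimenting with (a minimal sketch with assumed noise values, not our production Lua node):

    # Minimal 1D Kalman filter for one position axis, sketching what our Lua
    # scripting node does. Noise values q and r are assumptions to tune.

    class Kalman1D:
        def __init__(self, q=1e-4, r=4e-4):
            self.q = q      # process noise: how much real motion we expect
            self.r = r      # measurement noise: tracker jitter variance
            self.x = None   # filtered position estimate
            self.p = 1.0    # estimate variance

        def update(self, z):
            if self.x is None:               # first sample initializes the state
                self.x = z
                return z
            self.p += self.q                 # predict: uncertainty grows per frame
            k = self.p / (self.p + self.r)   # Kalman gain
            self.x += k * (z - self.x)       # correct toward the measurement
            self.p *= (1.0 - k)
            return self.x

    kf = Kalman1D()
    for z in [1.00, 1.02, 0.99, 1.01]:       # jittery tracker samples (metres)
        print(kf.update(z))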

Thank you so much!



 
heimspiel

Hi Qyuipo,

I would like to ask if you could provide the script/Kalman filter for fighting jitter, and explain how you implemented it.

Best regards,

Ferenc


 
solovai

Hello Qyuipo,

I would also like to ask if you could provide the script/Kalman filter for fighting jitter, and how you implemented it.

??