Is it possible to genlock the camera feed, the Engine, and tracking data without any sync generator (external hardware) as a master clock?

 

Hi Aximmetry Support Team:

Unreal Engine can adopt the timecode values from an input SDI video feed coming in from an AJA or Blackmagic capture card and lock its frame rate to that feed, so that it generates only one frame of output for each frame of input (from the Unreal Engine documentation). A Timecode Provider, a Custom Time Step, and the Timed Data Monitor are needed for this workflow. Is Aximmetry using the same workflow for genlocking the camera feed, the Engine, and tracking data? Can they actually be synced without external genlock hardware?


   zeketan

 
Eifert@Aximmetry

Hi Zeketan,

Aximmetry uses a different workflow and things are named a bit differently.

An "external" genlock syncs the external devices (camera and tracking) so that they capture at the same time. It won't specify the delay between your computer, camera, and tracking; however, it makes that delay constant, so it won't change over time. We have this guide on how to set up the delay between cameras and tracking: https://aximmetry.com/learn/tutorials/for-studio-operators/setting-up-virtual-sets-with-tracked-cameras/#inputs
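As a rough illustration (not Aximmetry's actual implementation), a constant camera-vs-tracking delay can be compensated by buffering tracking samples and always reading the one that is a fixed number of frames old; the class name and frame counts here are hypothetical:

```python
from collections import deque

class TrackingDelay:
    """Hold tracking samples in a ring buffer and return the sample that
    matches the (older) video frame currently arriving from the capture card.
    `delay_frames` is the camera-vs-tracking offset you measure once; note it
    only stays constant over time if both devices are genlocked."""

    def __init__(self, delay_frames):
        self.delay_frames = delay_frames
        # keep just enough history to reach back `delay_frames` samples
        self.buffer = deque(maxlen=delay_frames + 1)

    def push(self, sample):
        """Call once per tracking frame; returns the delayed sample,
        or None until enough history has accumulated."""
        self.buffer.append(sample)
        if len(self.buffer) <= self.delay_frames:
            return None
        return self.buffer[0]
```

With a delay of 2 frames, the first two pushes return None while the buffer fills, and every push after that returns the sample from 2 frames earlier.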

What the Unreal documentation refers to as genlocking Unreal is done automatically in Aximmetry: "In some cases, you may want to go even further, and lock the engine so that it only produces one single frame for each frame of video that comes in through a reference input — we refer to this as genlock." (https://docs.unrealengine.com/4.27/en-US/WorkingWithMedia/IntegratingMedia/ProVideoIO/TimecodeGenlock/)
However, in most cases you actually want to sync your render frame rate to the output using the Sync option:

If you don't have an "external" genlock, then it doesn't make much sense to render on timecode change, as your various devices will capture at different moments.


Also, in most cases the timecodes from various devices are not synced by the genlock. But if you have hardware that does sync them, then you can use the Timecode Sync option in the inputs to automatically sync the inputs (tracking and video) based on their timecode.
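Conceptually, timecode-based sync just pairs each video frame with the tracking sample carrying the same timecode. A minimal sketch, assuming both devices stamp their data with the same "HH:MM:SS:FF" timecode from a shared generator (the function name and tuple layout are made up for illustration):

```python
def match_by_timecode(video_frames, tracking_samples):
    """Pair each video frame with the tracking sample that carries the same
    timecode. Both inputs are lists of (timecode, payload) tuples. This only
    works when a sync generator feeds the same timecode to both devices;
    frames with no matching tracking sample get None."""
    tracking_by_tc = {tc: data for tc, data in tracking_samples}
    return [(tc, frame, tracking_by_tc.get(tc)) for tc, frame in video_frames]
```

If the two devices free-run their timecode instead, the lookup pairs unrelated moments, which is why the option only helps with suitable hardware.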

Timecodes are mostly used in post-production, and you can record your devices' timecode into the recorded video: https://aximmetry.com/learn/tutorials/for-studio-operators/recording-camera-tracking-data/#timecode

Warmest regards,



 
TwentyStudios

@Zeketan: The UE4 documentation is really confusing and mixes up the timecode, genlock, and frame sync concepts a lot. From experience I can tell you it just works in Aximmetry.

@Eifert: I’m interested in learning more about how non-genlocked tracking data is handled. For example, the new ReTracker Bliss sends out 1000 FPS FreeD tracking data, essentially eliminating the need for genlocking the tracker, given that the data is interpolated and re-clocked appropriately to the video input. Is that how it’s currently handled?

 
Eifert@Aximmetry

Hi TwentyStudios,

To my understanding, without genlock it will be very hard to find the exact delay down to 1000 FPS accuracy.
Even if Aximmetry rendered at 1000 fps with a 1000 fps camera, I think it would be humanly impossible to find the exact tracking data frame that matches the incoming video.

Warmest regards,

 
TwentyStudios

@Eifert: That’s not the point I was trying to make. If you interpolate and re-clock the incoming 1000 frames per second down to the frame rate Aximmetry is working at (which in turn would be locked to the video input from the camera), you would only have to adjust the delays in increments equal to the camera frame rate. For example, at 30 fps you would receive around 33 tracking frames per frame “tick” if the tracking data is 1000 FPS. Instead of throwing away all that data, you would interpolate the values down to 30p and read the interpolated value on engine tick (where the engine tick is synced to the camera input). I hope that’s a better explanation of what I’m trying to get across?
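The re-clocking step above can be sketched as a simple resampler: find the two high-rate samples that bracket the engine tick and linearly interpolate between them. A minimal, hypothetical sketch (at ~1000 FPS the bracketing samples are at most 1 ms apart, so linear interpolation loses essentially nothing):

```python
def resample_to_tick(samples, tick_time):
    """Interpolate high-rate tracking samples at the engine tick time.
    `samples` is a time-sorted list of (timestamp_seconds, value) pairs,
    e.g. ~1000 of them per second from the tracker; `tick_time` is when
    the engine tick (locked to the camera input) fires."""
    # find the pair of consecutive samples that brackets the tick
    for (t0, v0), (t1, v1) in zip(samples, samples[1:]):
        if t0 <= tick_time <= t1:
            w = (tick_time - t0) / (t1 - t0) if t1 > t0 else 0.0
            return v0 + w * (v1 - v0)
    # tick falls outside the buffered range: hold the newest value
    return samples[-1][1]
```

A tick halfway between two samples returns the halfway value; a real implementation would interpolate each tracking channel (position, rotation, zoom) this way.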

 
Eifert@Aximmetry

Hi TwentyStudios,

I think I still don't really understand it. What would the interpolation process be? Don't you just want the average of those 33 frames?
If so, that's something the tracking system should probably do, or already does, in the first place.

In my previous post I thought that specifying the right delay, one that points to the right frame out of those 33, would be the ideal solution. But it is impossible to manually specify the correct delay out of those 33 frames. And without genlock, that delay will probably change at every frame with 1000 fps tracking.
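A quick numeric sketch of why that delay drifts: with two free-running clocks (the rates below are made up for illustration), the offset between each video frame and the nearest tracking sample comes out different on every frame.

```python
def nearest_sample_offsets(camera_fps=30.0, tracker_hz=1000.5, frames=4):
    """For each video frame time, compute the offset (in milliseconds) to the
    nearest tracking sample, assuming both clocks started together but run at
    unrelated rates (i.e. no genlock). The 1000.5 Hz rate is hypothetical."""
    tick = 1.0 / camera_fps
    period = 1.0 / tracker_hz
    return [round((n * tick - round(n * tick / period) * period) * 1000.0, 3)
            for n in range(frames)]
```

The first frame lines up exactly (offset 0), but later frames land at varying sub-millisecond offsets, so no single fixed delay picks the "right" one of the 33 samples every time.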

Warmest regards,

 
TwentyStudios

While the frame smoothing/interpolation could be done in the tracker, you would then have to genlock it to the camera on the tracker side.

Interpolation would be more or less an average of neighboring values, similar to how an image is downsampled in image processing. Right now you’re using nearest neighbor, just picking whichever value is closest to the frame tick in Aximmetry and throwing away all the other values. What you could do instead is calculate a moving average (or moving mean) over all the tracking values and read from that average on tick instead. That way it would be less sensitive to any bad or dropped tracking values. A moving average of 33 positional tracking values per frame wouldn’t add any noticeable lag to the tracking and could reduce jitter without sacrificing responsiveness.
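The moving average described above could look something like this sketch (class name and default window are hypothetical; the ~33-sample window matches 1000 FPS tracking read at 30 fps):

```python
from collections import deque

class MovingAverage:
    """Moving average over the last `window` tracking values, e.g. the ~33
    samples a 1000 FPS tracker delivers per 30 fps video frame. Reading the
    averaged value on engine tick uses every sample instead of only the
    nearest one, so a single bad or dropped packet barely shifts the result."""

    def __init__(self, window=33):
        self.values = deque(maxlen=window)  # old samples fall off automatically

    def push(self, value):
        """Call once per incoming tracking sample."""
        self.values.append(value)

    def read(self):
        """Call once per engine tick to get the smoothed value."""
        return sum(self.values) / len(self.values)
```

In practice each positional channel (x, y, z, pan, tilt, roll) would get its own averager, and rotations would need proper angular averaging rather than this naive mean.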

Does that make sense?

 
Eifert@Aximmetry

Hi TwentyStudios,

It makes sense now.
I am not sure this would reduce jitter in a meaningful way, but I guess at least we could provide access to the unused frames. I have added this to our request list.

I think it is very unlikely that a tracking system that can output 1000 fps tracking will drop or produce bad tracking values, or that such values would be fixed by this kind of smoothing.

Warmest regards,

 
TwentyStudios

@Eifert: Has anything happened in this area? It seems the native LiveLink/FreeD implementation in vanilla Unreal gives smoother results with ReTracker Bliss because it uses the high-frame-rate tracking data correctly. It especially seems to help when sending tracking data over UDP (instead of direct USB), since single data packets might be dropped or arrive in the wrong order, and correctly interpolating the data alleviates this. I hope Aximmetry can adopt a similar approach so that we don’t have to compromise on the results when using your awesome software.

 
canalLUZ

Dear all, we plan to buy two Panasonic AG-CX10 cameras to use with Aximmetry and the HTC VIVE MARS. These cameras do not have a genlock input. We have a Blackmagic sync generator, with which we could only synchronize the HTC and the DeckLink Duo 2 capture board. Is not being able to synchronize the cameras a problem?

 
TwentyStudios

@canalLUZ: Genlocking the DeckLink Duo 2 will do absolutely nothing for tracking. With the Vive Mars, you do need a camera with a genlock input for best results.