Can I use Movie Render Queue to generate the background and then composite in AE?

 

Can I record the tracking data in Aximmetry, then apply the camera FBX in Unreal and render the background using Movie Render Queue?

My goal is to composite the live image recorded by the camera with the virtual background generated by Movie Render Queue in After Effects, to achieve the best quality. Is this possible now?

And when do we need input recording, as described below?

Is it for when the scene is too performance-heavy, so we just record the camera images and tracking data, then use Aximmetry to do offline rendering?

Input recording

Its purpose is to enable offline rendering of the final show using pre-recorded camera images and tracking data.

   RogerFT

 
Ahmed@Aximmetry

Hello Roger,

Recording tracking data is possible. This guide has what you are looking for.

Input recording allows you to get the maximum-quality render possible. For example, if you use a Blackmagic Pocket Cinema Camera 6K, you can't feed Aximmetry the highest-quality footage the camera can create, because it has no SDI output and its HDMI output is limited to 1080p.

Thus, recording that footage internally in the camera, then matching it later in Aximmetry with the tracking data, will get you the best result.

 
jim4586

@Ahmed -

What about utilizing the Unreal Movie Render Queue for rendering? Is there a way to get render passes out of Aximmetry? For example, object IDs, motion vectors, normals, depth maps, etc.
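(For reference, here is a minimal sketch of what the Unreal side of this looks like: queueing a Movie Render Queue job from the editor's Python console. The asset paths are hypothetical, and the setting class names can vary between engine versions, so treat it as a starting point rather than a recipe.)

```python
import unreal

# Grab the editor's Movie Render Queue and add a job to it.
subsystem = unreal.get_editor_subsystem(unreal.MoviePipelineQueueSubsystem)
queue = subsystem.get_queue()

job = queue.allocate_new_job(unreal.MoviePipelineExecutorJob)
job.map = unreal.SoftObjectPath('/Game/Maps/Stage')                   # hypothetical path
job.sequence = unreal.SoftObjectPath('/Game/Sequences/TrackedShot')   # hypothetical path

config = job.get_configuration()
# Base render pass plus multilayer EXR output. Additional passes (e.g. an
# object ID pass for Cryptomatte-style mattes) are added the same way via
# their setting classes, whose names differ between engine versions.
config.find_or_add_setting_by_class(unreal.MoviePipelineDeferredPassBase)
config.find_or_add_setting_by_class(unreal.MoviePipelineImageSequenceOutput_EXR)

output = config.find_or_add_setting_by_class(unreal.MoviePipelineOutputSetting)
output.output_resolution = unreal.IntPoint(3840, 2160)

# Render in-editor using the Play-In-Editor executor.
subsystem.render_queue_with_executor(unreal.MoviePipelinePIEExecutor)
```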

 
bigdream

Yes, you can. Record the camera tracking data (don't use master timecode) and output it as an FBX file, then import it into a sequence in UE4.

When importing, uncheck the name option (Match by Name Only) in the import dialog; see the sketch below.
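(A sketch of the same import done through Unreal's Python API rather than the Sequencer UI, which also shows the name option mentioned above. The FBX path and sequence path are hypothetical, and in UE5 the function was renamed to import_level_sequence_fbx.)

```python
import unreal

# Hypothetical paths; point these at your recorded FBX and target sequence.
fbx_path = 'C:/Recordings/aximmetry_camera.fbx'
sequence = unreal.load_asset('/Game/Sequences/TrackedShot')
world = unreal.EditorLevelLibrary.get_editor_world()

settings = unreal.MovieSceneUserImportFBXSettings()
settings.set_editor_property('create_cameras', True)
# The "name" checkbox from the post above: leave it unchecked so the
# importer doesn't insist on a binding with a matching name.
settings.set_editor_property('match_by_name_only', False)
settings.set_editor_property('reduce_keys', False)

unreal.SequencerTools.import_fbx(world, sequence, sequence.get_bindings(),
                                 settings, fbx_path)
```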

 
MEGO-XR

@bigdream @Ahmed@Aximmetry

I tried importing the FBX into a sequence as a camera. I get the same camera movement, but it doesn't match the scene shown in Aximmetry. Even though everything in the scene node is 0, when I import the camera movement into the UE Sequencer, it's different.

 
Eifert@Aximmetry

Hi MEGO-XR,

You might have used a master timecode or a different timecode, which results in the animation being very long, with the actual keyframes only at the very end of it.
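(If that is the case, a few lines in the editor's Python console can show where the keys actually sit and pull the playback range onto them. The sequence path and frame numbers below are placeholders.)

```python
import unreal

seq = unreal.load_asset('/Game/Sequences/TrackedShot')  # placeholder path

# Print the frame range of every section, which makes keys parked at the
# far end of a long sequence easy to spot.
for binding in seq.get_bindings():
    for track in binding.get_tracks():
        for section in track.get_sections():
            print(binding.get_display_name(), track.get_display_name(),
                  section.get_start_frame(), section.get_end_frame())

# Then trim the playback window to that range (placeholder frame numbers).
seq.set_playback_start(108000)
seq.set_playback_end(110000)
```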

If not, can you share some screenshots of the situation inside the Unreal Editor?

Warmest regards,

 
MEGO-XR

Yes, that was the case; I found the animation at the very end. The length of the animation seems to match what I recorded, but the camera is not in the right place.


By the way, what is the right way of doing this? Should I use the camera-generated timecode or not? Using the camera TC will almost always make the animation very long.

 
Eifert@Aximmetry

Hi MEGO-XR,

Were you using the Record_3-Audio compound or a Camera compound to record the tracking data? For some reason, the camera's Origin or the Scene Base Transformation is probably not included in the tracking recording.

I think you should only use camera-generated timecode if you want to use that timecode for some reason in post-production.
For example, while recording with Aximmetry, you were also saving the video in the camera or with another piece of software in a different format, and you want to mix that video with Aximmetry's recording. In that case, the camera's timecode can help you easily sync the two recordings.

Warmest regards,