Offline Rendering using UE Path Tracer

 

Hi!


Is it possible to offline render in Aximmetry using the Path Tracer in UE?


If not directly, is there any kind of workaround? The idea would be to use a lighter setup of the scene, with Lumen or baked lighting for example, and record the tracking data. Then later, offline render the same scene with the Path Tracer for best results.


Even just the UE background alone, without the comp/actors, would be perfect.


This would be easy to do in UE if the tracking data could be properly imported into UE. But I guess it's still not possible to export ST maps with lens distortion data to accompany the FBX files, so there is still no solution for that? If not, I hope there is a way to do this with the tracking data inside Aximmetry.


Emil


   Nestruction Studios

 
Nestruction Studios

A little bump.

UE 5.3 was just announced, and the Path Tracer is gonna get even more love, so this topic is getting more and more critical with future updates.

Eifert? :P Anyone? Thanks in advance! 


Emil

 
Eifert@Aximmetry

Hi Emil,

Note that since version 2023.2.0 you can save focus data into the FBX.
Also, the Path Tracer won't work in real-time rendering; you can only use it in post-production, and you will most likely want to use it with Movie Render Queue.

I am not sure that the Path Tracer supports ST maps. But when you use Unreal camera compounds in Aximmetry, the lens distortion is done by Aximmetry. So, for example, if you export the tracking data to Unreal with FBX, re-render the scene using that tracking data and Movie Render Queue, then add that video back to the Unreal camera project in Aximmetry by overriding the Unreal node's video out:

And then re-render again in Aximmetry with the billboard included. That way, you don't need to export the lens data at all: Aximmetry can apply the lens data to the rendered footage, provided it is rendered at a resolution that includes the Lens Distortion's Edge Expand.

Nevertheless, running Movie Render Queue from Aximmetry is a recurring request, and we will consider adding it in future releases.

Warmest regards,


 
Nestruction Studios

Thanks Eifert! 

"The Path Tracer won't work in real-time rendering; you can only use it in post-production, and you will most likely want to use it with Movie Render Queue."

Yes, the idea is to render the backplate at the best possible quality for post-production, so it can later be used in any VFX compositing application, like Fusion or Nuke etc. It would be absolutely massive to get access to Movie Render Queue inside Aximmetry, or to be able to export data that can render the final backplate with lens distortion straight from UE MRQ. Now I know Aximmetry can't do that yet, but very cool to hear you're considering adding it in the future. Please give it a lot of weight!

So meanwhile, I can achieve the same now with the workaround you suggested. Fantastic! I will need to export the FBX to UE, render in MRQ, import the sequence into Aximmetry, replace the UE scene output with the rendered sequence, and render again in Aximmetry with lens distortion.

Some extra steps, but if that actually works, I'll be absolutely thrilled! Totally doable, and of course Aximmetry would do the second rendering round very fast.

So I'll just need to take the edge expand into consideration when setting the resolution. Is there anything else that needs to be considered when rendering in UE? Any camera settings that need to be changed on the default Cine Camera Actor to match the real camera in the studio?

Thanks a ton 🙏


Emil




 
Eifert@Aximmetry

Hi Emil,

So a few things you should consider:

There will be no shadows, reflections, or Unreal lighting on the billboard, as you will render the Unreal scene without the billboard in it. However, when replaying in Aximmetry, you can put the billboard in and have geometries occlude it (Use Billboards - On, Allow Virtuals - Off).

For some reason, Unreal's coordinate system is rotated by 90 degrees when importing FBX into level sequences:

So it is best to rotate it by -90 degrees when saving the FBX.
To do so, you will need to edit the Record_3-Audio compound in place:

And add a Rotation Y module with a -90 Angle after the Collection Transformation module:


After importing the FBX, you will likely want to change the camera's Filmback setting to something that matches your aspect ratio:


For Path Tracing make sure these Unreal Project settings are turned on: https://docs.unrealengine.com/5.2/en-US/path-tracer-in-unreal-engine/#enablingthepathtracerinyourproject
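If you prefer editing config files over the Project Settings UI, these are, to the best of my knowledge, the corresponding flags in `Config/DefaultEngine.ini`; the linked docs page remains the authoritative reference:

```ini
[/Script/Engine.RendererSettings]
; "Support Hardware Ray Tracing" (required by the Path Tracer)
r.RayTracing=True
; Compile ray tracing shaders for skinned meshes
r.SkinCache.CompileShaders=True
; "Path Tracing" support
r.PathTracing=True
```

A restart of the editor is needed after changing these, since they trigger shader recompilation.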

Note, you can play back image sequences in Aximmetry just like videos. So you can save your render, for example, as an .exr sequence:

Then you can add one image from the sequence to a Video Player module to import the whole sequence. More on that here: https://aximmetry.com/learn/tutorials/for-content-creators/using-an-image-sequence-as-a-video/

For some reason, Unreal once saved a few extra frames for me with the following naming: NewLevelSequence.-0001.exr
Sadly, these names with the minus sign crashed Aximmetry, so delete them before adding the image sequence to Aximmetry.
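To clean those stray frames up automatically, a small script along these lines should work (the function name and folder layout are just examples; print the matched names first if you want to double-check before deleting anything):

```python
import re
from pathlib import Path

def remove_negative_frames(folder):
    """Delete EXR frames with a negative frame number in the name,
    e.g. 'NewLevelSequence.-0001.exr', which can crash Aximmetry's
    image sequence playback. Returns the deleted file names."""
    # Matches names ending in ".-<digits>.exr" (minus sign before the frame number)
    pattern = re.compile(r"\.-\d+\.exr$", re.IGNORECASE)
    removed = []
    for f in Path(folder).iterdir():
        if f.is_file() and pattern.search(f.name):
            f.unlink()
            removed.append(f.name)
    return removed
```

Run it on the render output folder before pointing the Video Player module at the sequence.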

And then you can connect the Video Player module to the rendered input pin of the camera compound:


If you use Edge Expand:

Then you will have to render at the resolution shown on the Out pin of the Unreal node:
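Just to illustrate the arithmetic (the authoritative numbers are whatever Aximmetry shows on the Out pin), assuming Edge Expand adds a given fraction of the frame on each side, the render resolution grows like this:

```python
def expanded_resolution(width, height, edge_expand):
    """Illustrative only: compute the render resolution when an
    edge-expand fraction is added on each side of the frame.
    Always use the actual value shown on the Unreal node's Out pin."""
    new_w = round(width * (1 + 2 * edge_expand))
    new_h = round(height * (1 + 2 * edge_expand))
    return new_w, new_h

# e.g. a 1920x1080 output with 10% expand on each side
print(expanded_resolution(1920, 1080, 0.1))  # -> (2304, 1296)
```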


For playback, you can use a Playlist module to trigger everything you want at once: https://aximmetry.com/learn/virtual-production-workflow/preparation-of-the-production-environment-phase-i/scripting-in-aximmetry/automation/playlists/

When playing back, make sure everything starts on the same frame; otherwise the B mask will occlude the geometries incorrectly. If for some reason there is a 1-2 frame difference between when Unreal started rendering and when your playback starts in Aximmetry, you can try correcting it with a Delay module (with its Frames setting turned on), routing a copy of the trigger through it to delay either the playback of the image sequence or the playback of the compound's camera tracking.

Warmest regards,

 
Nestruction Studios

Great! Thank you for the tips. I was kinda expecting some issues like the rotation difference, but if that's all, then that's easy!

Now I got wondering: would this same workflow work with Blender/Maya/Cinema4D as well? As Aximmetry will do the lens distortion on the prerendered image anyway, wouldn't the FBX with camera tracking data be sufficient to render the backplate from any 3D application?

Think of a situation where Aximmetry is used more for recording and as a previz tool, and the client wants to use the scene they have in Maya, Blender, or whatever else. The environment could be brought to UE or recreated in UE, then shot with Aximmetry recording the tracking data. The tracking data is exported as FBX and brought to, say, Blender, to render the backplate from the actual scene with all the needed render passes. Then the backplate would be brought back to Aximmetry to render again with the lens distortion. Finally, the client would take the raw camera footage and the backplate and composite them in Nuke, Fusion, AE, or whatever.

Probably there could be some issues like the rotation thing that would need to be matched to the 3D application used, possibly the Z/Y axes, but after sorting that out, shouldn't it work with Blender or Maya as well?
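For intuition, the axis fix-up is usually just a flip and a rescale. This hypothetical helper sketches one common Unreal-to-Blender mapping (left-handed, Z-up, centimeters to right-handed, Z-up, meters); it is not an official converter, so always verify against your own FBX import rather than trusting any fixed convention:

```python
def ue_to_blender_location(x_cm, y_cm, z_cm):
    """Sketch of a common Unreal -> Blender location mapping:
    flip the Y axis (left-handed to right-handed) and convert
    centimeters to meters. Verify against your own FBX import."""
    return (x_cm * 0.01, -y_cm * 0.01, z_cm * 0.01)

print(ue_to_blender_location(100.0, 200.0, 50.0))  # -> (1.0, -2.0, 0.5)
```

Rotations need the equivalent handedness flip as well, which is exactly the kind of thing the -90 degree FBX fix above is compensating for on the Unreal side.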

If you had rendered a bunch of render passes into an EXR file from Blender, for example, would Aximmetry be capable of rendering that sequence again as EXR, applying the lens distortion to all the included render passes and keeping them all in the final file?

That would be huge! 


 
Eifert@Aximmetry

Hi Emil,

One thing I forgot to say is that you can also render at a higher resolution in the same aspect ratio and use Aximmetry to downscale to your intended resolution after the lens distortion.

Yes, I don't see why you couldn't do the same with Blender/Maya/Cinema4D. In Blender, if I remember correctly, you don't even need to do the rotation; the tracking FBX will be in the correct coordinate system.

Aximmetry could render in EXR:

However, you won't be able to play back different layers in an EXR image sequence or save EXRs with different layers in Aximmetry. If you want, I can add these two features to our request list.
So, for now, you will have to save the different layers to different files.


Note, for individual EXR images, you can actually open different layers:

This way, you could probably use Aximmetry to programmatically save the layers to separate image sequence files.

I also just noticed that Unreal will save different layers into the EXR file if more than one Rendering type is specified in the Movie Render Queue:

Warmest regards,

 
Nestruction Studios

Thank you! That is great news. Gonna explore the possibilities soon. 


"However, you won't be able to play back different layers in an EXR image sequence or save EXRs with different layers in Aximmetry. If you want, I can add these two features to our request list.
So, for now, you will have to save the different layers to different files."


PLEASE DO! It would be a HUGE deal. Seriously, I don't get why I see so little talk about using Aximmetry for recording, rendering 3D masks, and previz as part of bigger VFX pipelines. It brings such insane tools for this use. However, a lot of VFX artists would expect many different render passes, so the lack of this feature would still be critical for this workflow. Not just a "nice to have" but possibly the last missing feature that could take Aximmetry from being tedious to a game changer in the mentioned workflow.

Extremely happy to hear you think it can be fixed in a future version! 

Thanks again, Eifert, for being so thorough and badass as always! :P🤘🤘🤘


Emil

 
Eifert@Aximmetry

Hi Emil,

We added better EXR layer functionalities to our request list and we will consider adding it in future releases.

Warmest regards,


 
viustudio

Hi, everyone. How are you?

I was just thinking about this workflow the other day:  

1) Recording with real-time visualization  

2) Offline rendering with path tracing  

It seems that this functionality isn’t available in Aximmetry yet. I understand that the Aximmetry team has many demands to address, but I just wanted to encourage the development of this feature. Recording in real-time and being able to render offline with path tracing feels like the best of both worlds.

Would be amazing to have this feature!

Thank you!

 
Eifert@Aximmetry

Hi Viustudio,

I bumped the relevant feature request with your post in our internal feature request list.

It is true that currently you cannot force Unreal to do offline rendering with path tracing directly from Aximmetry. However, as mentioned before, you can achieve this in Unreal Editor with Movie Render Queue by exporting the tracking as an FBX from Aximmetry.

Warmest regards,

 
viustudio

Hello, Eifert. 

Thank you very much for bumping it up! 

Yes, I read the previous messages about how to handle the recording process by exporting the FBX and rendering it in Unreal with Path Tracing. I haven’t tested it yet.

The main downside of this process (besides being a bit of work with the back-and-forth between programs) is that we lose the shadows and reflections of the Billboard in the scene. That's quite a drawback. If it were possible to render using Path Tracing while still keeping the Billboard present in the scene, it would be perfect. But I know you already know that =)

Looking forward to the day when this becomes possible!

As always, thank you very much for your support!
