Multicam, fill control and 360 player

Multi-Machine Camera Tracking:

We have successfully set up a multi-machine system but are encountering limitations in tracking two different cameras on two different computers simultaneously. The camera tab in the LEDWall tool currently only allows for switching between cameras, not rendering two different ones concurrently. Despite this, we can output two distinct tracked Unreal feeds from the two computers.


Our goal is to have one computer render from one camera and the other from a different camera. We’ve tested this by using the Composer app on both computers and feeding one computer’s feed to the other, and it works, but it’s not a streamlined solution. The tracking information is available to both machines, but we need a way to specify that one machine tracks one camera while the other tracks a different camera. Is there a way to achieve this? Could a composition or script be created to enable this functionality?


Outer Frustum Control (Fill Control):

We need to control the outer frustum (fill) with a custom image or video to minimize flickering. Using different outer frustums increases flickering for the talent. While we can feed the fill with an image or video, it gets distorted by the wall setup. Our goal is to feed the entire canvas with a video source without any wrapping or distortion. Currently, feeding the fill with an image or video results in an offset display, likely due to how the software handles the input. How can we make the fill display the content accurately, without the software assuming it needs to wrap around or adjust the feed?


Displaying 360 Content on a 180-Degree Wall:

We are exploring the best way to display 360 content on a 180-degree circular wall. This is important for some of our creative filmings. What is the recommended approach for achieving a seamless display of 360 content on a 180-degree setup? Any guidance or solutions you can offer would be highly valuable.


mathiashaughom

Comments

Eifert@Aximmetry

Hi,

The biggest problem with rendering two different camera perspectives concurrently on the LED Wall is that the LED Walls display one image at a time. There are some ways to partially mitigate this fundamental limitation of LED Wall productions, for example, having very careful camera operators who ensure their cameras' perspectives never cross. We might be able to provide a solution depending on how you want to overcome this limitation.

Also, note that the latest version of Aximmetry (2024.2.0) supports camera inputs running on different remote computers in LED Walls:


However, this only allows switching between them, not rendering them concurrently.


Regarding the Fill control, there is an option to Freeze the Fill:
[Screenshot: the Freeze Fill option]
This should stop the flickering, but it will also stop the Fill from updating.
You can also do this per LED Wall in the LED Wall panels, instead of applying it to all LED Walls in the above image.

If you are experiencing flickering when getting too close to the LED Wall or being enclosed by it, you can fix it by using multiple LED Wall panels and then using a stitcher module to stitch them together before outputting them. If you can describe the kind of flickering you experience in more detail or show a video of it, I can better determine if this solution is appropriate. If it is something else, I am still confident that we can come up with a solution that won't require you to change the entire Fill.
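To illustrate the stitching idea only (the real setup is done with Aximmetry panels and modules, not code), the operation amounts to laying the panels' renders next to each other on one output canvas. A minimal Python sketch, assuming two panels that sit side by side:

```python
import numpy as np

def stitch_panels(left_panel: np.ndarray, right_panel: np.ndarray) -> np.ndarray:
    """Place two LED Wall panel renders side by side on one output canvas.

    Both inputs are H x W x 3 arrays with the same height; the result is the
    single image that would then go out to the LED processor.
    """
    if left_panel.shape[0] != right_panel.shape[0]:
        raise ValueError("panels must have the same pixel height to be stitched")
    return np.concatenate([left_panel, right_panel], axis=1)
```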



Regarding the 360 content, I am assuming it is a video or video stream. You can make a skybox from it, as discussed here: https://my.aximmetry.com/post/1951-is-it-possible-to-use-a-360-degree-image
There is even more information about using 360 content for LED Wall production here: https://my.aximmetry.com/post/2527-360-degree-video-on-led-wall
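As a rough idea of what the skybox mapping does under the hood: for a 180-degree wall, only the front half of the equirectangular 360 frame is sampled, based on each wall pixel's azimuth and elevation from the viewing point. Aximmetry's skybox handles this internally; the Python/numpy sketch below only illustrates the math, with the wall radius and height as placeholder values:

```python
import numpy as np

def sample_equirect_on_180_wall(equirect, wall_w, wall_h,
                                wall_radius_m=5.0, wall_height_m=4.0):
    """Project an equirectangular (360x180) frame onto a 180-degree
    cylindrical wall, as seen from the wall's center point.

    equirect: H x W x 3 array (the 360 source frame).
    Returns a wall_h x wall_w x 3 array for the wall canvas.
    Radius and height are placeholder numbers for illustration.
    """
    src_h, src_w = equirect.shape[:2]

    # Azimuth of each wall column: -90..+90 degrees across the 180-degree wall.
    theta = np.linspace(-np.pi / 2, np.pi / 2, wall_w)
    # Physical height of each wall row, measured from the wall's vertical center.
    y = np.linspace(wall_height_m / 2, -wall_height_m / 2, wall_h)

    theta_g, y_g = np.meshgrid(theta, y)
    # Elevation angle from the viewer at the center to each wall point.
    phi = np.arctan2(y_g, wall_radius_m)

    # Equirectangular lookup: u spans -180..+180 degrees, v spans +90..-90 degrees.
    u = (theta_g / (2 * np.pi) + 0.5) * (src_w - 1)
    v = (0.5 - phi / np.pi) * (src_h - 1)

    return equirect[v.round().astype(int), u.round().astype(int)]
```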

Warmest regards,




mathiashaughom

Perfect! Thank you very much.

The concurrent camera rendering is because we are using frame remapping/ghost frame. Because the cameras are synced to different subframes, we can overlap the frustums without a problem. How can we build a good solution for rendering two different camera angles concurrently on two different computers? Can you make a custom script for us?

I can't seem to find the 2024.2 version anywhere. When will it be available?


Fill:

We really just need the fill to be customizable with a custom image. We can do this by feeding the image into "F1 Rendered", but then it gets distorted.


Thank you for the 360 answers!


Regards, Mathias Haughom.

Eifert@Aximmetry

Hi Mathias,

Since version 2024.2.0 is still in beta, it is only available through the downloads page: https://my.aximmetry.com/User/MyDownloads


Creating Subframes with Multi-Machine Setup:

If your hardware handles the subframes, and you only need to provide videos for the different subframes, then there's an easy solution for this. You don't even need to edit the camera compound.
To accomplish this, you need to hijack the Playlist Select Cam feature.
First, turn it on in the CAMERA MODE panel:
[Screenshot: enabling Playlist Select Cam in the CAMERA MODE panel]

Then, add the following logic to switch which camera input is active on the specific computers:
[Screenshot: the camera input switching logic]
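In plain terms, the intent of that logic is a fixed mapping from engine (computer) to camera input, so each machine keeps a different camera active. The Python below is pseudocode for illustration only; the names are not Aximmetry pins or modules, and the actual setup is built from flow modules as in the screenshot:

```python
# Illustration only: each engine (computer) forces a different camera input
# to stay active by driving the Playlist Select Cam value.
ENGINE_TO_CAMERA = {
    0: 1,  # control machine keeps camera input 1 active
    1: 2,  # remote engine keeps camera input 2 active
}

def camera_for_engine(engine_index: int) -> int:
    """Return the camera input this machine should render (hypothetical helper)."""
    return ENGINE_TO_CAMERA.get(engine_index, 1)
```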

This should work with two computers as long as you correctly set the Cam Engines in the SELECT CAMERA panel:
[Screenshot: Cam Engines in the SELECT CAMERA panel]


Additionally, the same LED Wall panel must be used by both computers. There are two methods to achieve this:

Method 1:

Change the Segments parameter of the LED Wall panel to 2 Slices. Then, set the Resolution for each segment to the same as the LED wall’s Resolution. Make sure the Pixel Offset for both segments is set to 0,0:

Method 2:
 
Copy the settings of the LED Walls you use to unused LED Wall panels:
[Screenshot: copying LED Wall panel settings]
And set its Engine to the engine you use for the second CAM.
[Screenshot: the copied panel's Engine setting]

This can be extended beyond two computers. For example, in the switch module you could add more subframes (Camera Inputs) and use more than 2 slices of Segments in the LED Wall panel (or copy to several LED Wall panels when using Method 2).
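As a sketch of the bookkeeping involved when this scales up (this is not Aximmetry configuration syntax, and the resolution is a placeholder), each additional machine gets one camera input (subframe) plus one identically sized segment or copied panel:

```python
def build_subframe_plan(num_engines: int, wall_resolution=(3840, 2160)):
    """Sketch of how the per-machine assignments scale to N computers.

    Each engine renders one camera input (subframe) and drives one LED Wall
    segment (Method 1) or one copied LED Wall panel (Method 2); every slice
    covers the full wall resolution with a 0,0 pixel offset.
    """
    return [
        {
            "engine": engine,
            "camera_input": engine + 1,             # subframe rendered by this engine
            "segment_resolution": wall_resolution,  # same as the wall itself
            "pixel_offset": (0, 0),
        }
        for engine in range(num_engines)
    ]
```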



Changing Fill to a Custom Image:

Regarding the custom image on the Fill, if it doesn't matter that the image is visible on the Frustum, the solution is simple. Just place the image on the LED Wall's input before it goes out:

However, I assume you don't want it to show up on the Frustum. In that case, the solution is a bit more complex, as the Frustum and Fill are mixed by a shader. Probably the easiest solution is to duplicate the shader and its modules to use it as a mask by feeding in a solid white image for the fill and a solid black one for the Frustum.
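Conceptually, that duplicated chain gives you a per-pixel mask of where the Fill is on the outgoing canvas, and the custom image is blended in only there. A minimal numpy sketch of the blend (the actual work is done by the shader and modules inside the compound; the function and names here are only for illustration):

```python
import numpy as np

def apply_custom_fill(wall_output: np.ndarray,
                      custom_fill: np.ndarray,
                      fill_mask: np.ndarray) -> np.ndarray:
    """Blend a custom image into the Fill area of the outgoing LED Wall image.

    wall_output: H x W x 3 image normally sent to the LED Wall (Frustum and
                 Fill already mixed).
    custom_fill: H x W x 3 custom image laid out on the same canvas.
    fill_mask:   H x W mask from the duplicated shader chain, 1.0 where the
                 Fill is (white fed in), 0.0 on the Frustum (black fed in).
    """
    mask = fill_mask[..., None]  # broadcast the mask over the color channels
    return wall_output * (1.0 - mask) + custom_fill * mask
```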
First, you should understand how linked compounds can be edited and how to create a hierarchy of compounds. You can read about it here: https://aximmetry.com/learn/virtual-production-workflow/preparation-of-the-production-environment-phase-i/scripting-in-aximmetry/flow-editor/compound/

Inside the LED Wall camera compound, you can find the logic you need to edit at: LED WALLS/LED Wall %N=1/FINAL.
Duplicate the logic and use it as a mask like this:
Note that this will not change what you see in the STUDIO preview; it will only affect the image that goes out to the LED Wall.

Instead of having an image or video module there, the best approach is to use a Transmit Video module to feed in the custom fill image from outside of the camera compound. More on the Transmit module can be found here: https://aximmetry.com/learn/virtual-production-workflow/preparation-of-the-production-environment-phase-i/scripting-in-aximmetry/flow-editor/pin/#transmit-modules


After this, you can apply the changes to every LED Wall panel and its logic by unlinking the LED Wall %N=1/ compound you edited:
[Screenshot: unlinking the LED Wall %N=1 compound]

Then save it somewhere outside the built-in Aximmetry libraries as a new compound:
[Screenshot: saving the compound outside the built-in Aximmetry libraries]

Then, you can select all the other LED Wall compounds to change their Import Source to the new compound file:

Keep in mind that you will likely need to redo everything above for the custom Fill image each time you update Aximmetry, as these compounds are typically changed in every release.

Warmest regards,

tanguanlong

Hi Eifert,

I would like to check if the method above is still the method to use if we want to set up a multi-camera LED Wall with ghostframe/frame remapping?

Or is there an updated workflow now?

Eifert@Aximmetry

Hi,

It is the same: if your hardware (such as the LED wall controller) can handle the subframes and you only need to provide videos for the different subframes, then the Multi-Machine Subframe solution from my previous post still applies. (This solution is quite straightforward if you are familiar with Aximmetry and multi-machine setups. If you have any questions or need further clarification, please feel free to ask.)
If this is not the case, meaning your hardware cannot handle subframes and you expect Aximmetry to output the different subframes from a single computer, please let me know.

Additionally, I wanted to mention that you can use the LED Wall's Segments option with Subframe, rather than setting up multiple identical LED Wall panels. I have updated my previous comment to include this trick (method 1).

Warmest regards,

tanguanlong

Thanks, would XR set extension functions work in this case for multi-machine subframes? 

Eifert@Aximmetry

Hi,

Yes, the Digital Extension (XR) will be rendered on the remote machines and included in the final output from those machines. This happens when a remote machine is set as the engine for one of the cameras, as shown here:
[Screenshot: camera Engine assignment]

Warmest regards,

tanguanlong

Hi Eifert,

Thanks for the info. 

Normally we have one XR system and multiple remote engines if it's a bigger wall.

In this case, if it's a big wall that requires multiple engines for each portion of the subframe, how does the XR configuration work?