Three camera angles for different outputs

 
Hi, we are just starting with Aximmetry and I would like to ask whether it's possible to build a multi-cam studio on one PC — specifically, 3 camera angles with 3 different outputs. We have a TriCaster, and our idea is to send 3 different views of the Aximmetry studio to the TriCaster via NDI (3 video outputs from Aximmetry, so 3 inputs on the TriCaster), so that we can cut live in the TriCaster and use Aximmetry only for the virtual studio and green screen.
For example: output 1 – the whole studio; output 2 – two people; output 3 – a close-up of one person.
We can't figure out how to do it: how to create 3 different angles/outputs in one project, and whether it's possible without additional Aximmetry computers.
Thank you for your answer.

   Vojtěch Vysoudil

 
Eifert@Aximmetry

Hi,

Unreal can only render one virtual camera angle per computer, since at most one Unreal Engine instance can run on each computer.

The most you can do is preview the keyed-out billboards in the TriCaster, similar to what you see in the Matrix view:


For this to work with the TriCaster, you will need to edit the camera compound a little; if you want, I can show you how.


In the case of scenes rendered by Aximmetry (rather than Unreal), you can have more than one virtual camera angle.
You can observe this again in the Matrix view, for example using the [Studio]:News Room\News Room - VirtualCam_3-Cam.xcomp example scene:

However, we also recommend using multiple computers in this case, as it means rendering about 3 times as much as otherwise, which is a significant drain on your computer's resources. If your scene puts so little load on your hardware that you can do this, we suggest rendering at a higher resolution instead of adding more camera angles.

And for multiple camera angles to work with the TriCaster and scenes rendered by Aximmetry, you will need to edit the camera compound a little; if you want, I can show you how.

Warmest regards,

 
Jeff Fuchs

Eifert@Aximmetry, what if I'm close to the same scenario, but with 3 tracked cameras where I only want the backgrounds rendered simultaneously and sent to different outputs on my DeckLink? I have everything set up: the trackers show correctly, and I have outputs routed to individual outs on the DeckLink, which each go to an Ultimatte and then to an ATEM for switching/recording. I was disheartened when I hit this roadblock of needing more computers. The computers aren't even the issue; it's needing multiple licenses. I have a Broadcast DE license, but I just don't have the budget to purchase 2 more right now. It would be nice if I could run Unreal on the other machines and have them recognized as remote renderers, but no, it will only look for Aximmetry. I'm a very small studio, and it took me a while to be able to upgrade to Broadcast DE. Anyway, if there are any workarounds that might help, that would be awesome.

 
TwentyStudios

@Jeff Fuchs: Sorry, but it seems to me like you should have researched this properly before investing in the Broadcast license. Rendering 3 different perspectives of the Unreal scene isn't really feasible from a technical standpoint, and there's nothing Aximmetry can do about that. Rendering 3 camera angles simultaneously means rendering the scene 3 times, which would have a huge performance hit. We usually hit 90% GPU rendering a single camera angle, so how could we expect to render 2 more on a single computer?

How would accessing Unreal on the other machines even work? Aximmetry handles the video I/O, camera animation, color correction, keying, and tracking internally, so none of that is available in Unreal. Multi-machine means each computer renders the scene and receives and outputs video locally, while being controlled from a single computer.

The correct way to do it, if you're not using multiple workstations, is to switch between cameras inside Aximmetry. So you would have 3 SDI inputs with your camera signals and one output going to the switcher. Using the Ultimatte shouldn't be necessary; you can get comparable quality with the built-in Aximmetry keyer if you set it up correctly.

 
Jeff Fuchs

I clearly should have researched more. I have been using Ax for a little over a year, but doing all the multicam recording first, then bringing the recordings into Ax, rendering out each angle, then editing. It works fantastically, but it's a slow process. I finally got camera tracking and was trying to streamline the process by recording with the background live. I purchased Ultimattes to take a bunch of the load off of Ax, at least that was my thought. So now it's back to the drawing board. Thanks again for the reply, TwentyStudios.

 
TwentyStudios

The Aximmetry keyer doesn't have a big performance impact. Maybe 10% on a 3090 (not scientifically tested). If you want to use the Ultimatte, the recommended way would be to send the camera into the Ultimatte and take key & fill from the Ultimatte over two SDI connections into your DeckLink. That way your keyed video source can be rendered into the scene and be occluded by foreground objects.

If you're working in 4K, I think the performance impact of sending an additional 4K signal into Aximmetry would be comparable to that of the external keyer. Running 3 cameras with key & fill on a single workstation wouldn't be feasible.

 
marc.colemont

We currently use 2 Broadcast DE servers, one for close-ups and one for wide shots, with a close-up and a wide-shot SDI camera feeding each server.

I automated this by creating a small firmware engine that connects the ATEM and the Aximmetrys together over the network, so the video director can select each Aximmetry in Preset or Program on the 2 M/E panel and choose an A-B preset. They don't like operating live with Stream Decks or a mouse.

If an Aximmetry is selected in Preview on the ATEM, it automatically waits at position A. As soon as that source goes live on the ATEM, the transition to position B starts automatically. This is ideal for crane shots or slow moves without anyone touching the Aximmetry GUI, and there's no chance of cutting halfway through an A-B move, which otherwise looks like crap. The video director runs the live show with multiple Aximmetrys fully automated that way through the 2 M/E panel. The Aximmetry operator is basically there to babysit the machines and adjust the green key and A-B presets during rehearsals.
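The tally-driven trigger logic described above could be sketched roughly like this (a minimal Python sketch, not marc.colemont's actual firmware; the returned command strings are hypothetical placeholders for whatever ATEM/Aximmetry transport you use):

```python
from enum import Enum
from typing import Optional

class State(Enum):
    IDLE = "idle"          # source not selected on the ATEM
    PARKED_A = "parked_a"  # selected in Preview: hold the camera at position A
    MOVING_B = "moving_b"  # went to Program: run the A->B transition

class TallyAutomation:
    """Tracks one Aximmetry source's ATEM tally flags and decides when
    to park the virtual camera at position A or start the A->B move."""

    def __init__(self) -> None:
        self.state = State.IDLE

    def update(self, preview: bool, program: bool) -> Optional[str]:
        """Feed the current tally flags; returns a command string
        (placeholder for a real control message) or None if nothing
        needs to happen."""
        if program and self.state != State.MOVING_B:
            # Source just went live: start the slow A->B camera move.
            self.state = State.MOVING_B
            return "start_transition_A_to_B"
        if preview and not program and self.state == State.IDLE:
            # Source selected in Preview: jump to and hold position A.
            self.state = State.PARKED_A
            return "jump_to_position_A"
        if not preview and not program:
            # Source deselected: ready for the next A-B cycle.
            self.state = State.IDLE
        return None
```

A typical cycle: `update(preview=True, program=False)` parks the camera at A, and the later `update(preview=False, program=True)` kicks off the A→B transition, so the move can never be cut halfway by an early take.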

 
Jeff Fuchs

@marc.colemont, I would very much be interested in trying that firmware engine. It sounds like we run a very similar setup.

 
marc.colemont

We are thinking of turning it into a small product if there's interest. It's currently in-house firmware running on an embedded board, and it does much more than just the Aximmetry part: it also translates the ATEM shading panel to 8 BirdDog cameras, and automates IMAG outputs for live events, excluding the wide-shot cameras. So we have to strip it down. We are now adding a small database and a web GUI, and will put the embedded board in a 1/3-width 19" box, the same size as a HyperDeck or Teranex Mini, so it can sit on the same rackmount shelves. Most likely it will run from PoE+.


 
Jeff Fuchs

@marc.colemont, I can only speak for myself, but you have my interest.

 
marc.colemont

As soon as we have built some extra units, I will let you know. It's probably best to drop me an email, as this is out of scope for this forum: http://www.tools4video.be