Aximmetry Broadcast DE - Drivers for 2x BMD Decklink 8K Pro

 

We have Aximmetry Broadcast DE on Win 11 Enterprise here, running 3 cameras in and outputting 3 individual feeds to a BMD ATEM for switching. The format is 1080p59.94, and both cards and the switcher are genlocked to a BMD sync generator. The cards are 2x BMD Decklink 8K Pro (not G2).

There seem to be issues with the drivers for these cards. The newest version of BMD Desktop Video (do I even need this?) won't even recognize the cards, and older versions seem to have serious stability problems.



   Stefan Reck

 
TwentyStudios
We have zero issues with the Decklink running the latest driver (I think?) and the latest version of Aximmetry. We never had any issues with the Decklink. Try downgrading a driver version or two.
I wonder if this could instead be an issue with running two Decklink cards on the same machine. You could easily saturate the PCIe bus with all those video signals. Also, how are you outputting separate feeds for the cameras? Unreal can only render one camera at a time.
 
Stefan Reck

The issue already starts with getting even one Decklink 8K Pro to be reliably recognized by BMD Desktop Video (and Aximmetry), and we've tried pretty much all available versions. With your confirmation that this should work with no problems, I suspect either a hardware failure or a messed-up Windows installation underneath...

"Also, how are you outputting separate feeds for the cameras? Unreal can only render one camera at a time. "
Does that also apply to Aximmetry itself or is this purely a limitation of Unreal?
We're not there yet, and I wasn't really aware of this limitation, as the project brief was changed quite a bit after it was all commissioned and approved. This is a very valuable bit of information, thank you for that. It means that I should still be able to deliver what the client needs, but I will need to adapt some of the signal flow and workflow...


This was originally planned as a very clean News/Talk Studio setup with Aximmetry SE to provide just the backgrounds for three tracked live cameras, with keying, switching, ISO recording, streaming, camera control, tally and talkback offloaded to an ATEM HD8 ISO to keep things straightforward.

A short while ago the client requested 3 changes:

- Change to UE (Aximmetry DE) to take advantage of existing in-house UE resources.
- Change from 3 tracked cameras to 2 tracked cameras and 1 fixed camera, everything in 1 workstation for budget reasons.
- Ability to cast light and shadows onto real foreground objects, plus animation of the fixed camera -> the keying had to move from the ATEM into Aximmetry.



At this point it was decided to add a second Decklink card, as we thought it would still be possible to output the three individual live compositions and mix, record and deliver them in the ATEM...


So now the can of worms is open ;-) and there are several questions:


- Can Aximmetry Broadcast DE do what we want to do here (keying and switching of 2 tracked cameras, 1 animated fixed camera, all in 1080p59.94) on a single workstation?
- Is it possible to record the "timeline" of a show in Aximmetry so we can make it re-render the individual outputs in full resolution later for post?
- Is it possible to pipe the preview/program buttons of the ATEM straight to the Aximmetry mixer, or do I have to go through Bitfocus Companion? Implementing a T-bar is probably out of the question with UE only being able to render one camera at a time, but what about short (10-frame) mix transitions?
- What's the best way to interface Aximmetry to a streaming/video conferencing setup on a separate PC? NDI?


 
Stefan Reck

So the IT guys sorted out the issues between Windows and the Decklink cards (the BMD update service wouldn't start reliably), the 3 camera signals go into and the PGM signal comes out of Aximmetry, and I'm mainly left with 2 questions:

- Is there an example/tutorial on how to directly link the PGM input buttons on an ATEM HD8 ISO with the camera buttons in Aximmetry? It's probably doable over OSC in Companion as well (see the sketch below), but it would be nice not to depend entirely on the Companion PC for this critical connection. IIRC the main problem here is BMD's control protocol not being publicly documented.

- Is it possible to do a short (10-frame) mix transition between cameras on PGM, or is UE strictly limited to hard cuts here?
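For the first question, something along these lines is what I have in mind as a fallback, untested sketch only: a small Node.js helper using the community atem-connection library (the same one Companion builds on) to watch the ATEM program bus and forward the selection as OSC. The OSC address, port and input-to-camera mapping are placeholders that would have to match whatever the Aximmetry control board (or Companion) is set up to listen on.

```typescript
// Untested sketch: mirror ATEM program-bus selections to an OSC listener.
// Requires: npm install atem-connection node-osc
import { Atem } from 'atem-connection';
import { Client } from 'node-osc';

const atem = new Atem();
const osc = new Client('127.0.0.1', 8000); // placeholder OSC target

atem.on('stateChanged', (state, paths) => {
  // React only when the M/E 1 program input changes.
  if (paths.some((p) => p.startsWith('video.mixEffects.0'))) {
    const input = state.video.mixEffects[0]?.programInput;
    if (input !== undefined && input >= 1 && input <= 3) {
      // Placeholder address; maps SDI inputs 1-3 to camera indices 1-3.
      osc.send('/aximmetry/camera', input);
    }
  }
});

atem.connect('192.168.10.240'); // ATEM IP address, adjust to your network
```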


 
TwentyStudios

@Stefan Reck: I’m not sure it’s a good idea to involve the ATEM at all in this scenario. Typically you would have one workstation per camera, each outputting a final composite to the video switcher. In a single-workstation setup, you would instead send the raw camera feeds into Aximmetry via a Decklink capture card and do all the switching in Aximmetry. Since Unreal can only render one angle at a time, this has some clear limitations. You can’t crossfade between the cameras (no mix transitions) and you can’t use preview/program switching, again since you would only get a single composite with the Unreal background. Another limitation is that even an instant switch between camera angles in Unreal can cause glitches when using Lumen, since Lumen relies on accumulating several frames to achieve smooth lighting. When you switch to a new angle, the entire lighting will be recalculated, which can sometimes cause noticeable artifacts.

Doing the keying in Aximmetry is a much better idea than doing it in the ATEM. The Aximmetry keyer is much better, and you can color grade the cameras properly (including applying custom LUTs that can even be tweaked in realtime using Pomfort Livegrade). Regarding casting virtual lights and shadows on the live video, that’s a bit of an unrealistic pipe dream. The video is just a flat plane, so it won’t react realistically to virtual lights, and it definitely won’t receive virtual shadows in a realistic way. This isn’t really a limitation of Aximmetry or Unreal; until we can get precise normal maps and depth maps generated in realtime, this will continue to be the case. I think that’s still several years away.

You can record a timeline of the camera moves and switching and re-render offline (although I think you might get glitches with camera switches), but it would require some custom work to include things like animation triggers and other events triggered during the show. 

For streaming, you could do that internally in Aximmetry, but the most robust way would be to send the SDI output of the Decklink to another computer (in a single-workstation setup) or to just send the output of the video switcher to another computer.

While you could certainly get a somewhat workable setup with a single workstation, I strongly recommend convincing your client to go the one-workstation-per-camera route. It will be a much more robust and professional setup in every way.

 
Stefan Reck

Thanks for your insights, and yes, I can certainly see this evolving into a multi-machine installation rather soon. For the moment, though, I'll have to make do with a single workstation to produce some results first. What plays in my favor here is that this is a kind of "test balloon" project.

The client needs to present rather small physical products (furniture fasteners) in the context of a variety of rooms. Currently they do this with real sets, lit with tons of tungsten lighting, setting up and breaking down a lot of external camera and sound equipment every time. They were looking for a much more lightweight solution: a permanently installed green box complete with cameras, sound, control and graphics for generating a virtual environment that is then used for presenting and explaining the physical products in the palm of a hand or on a small table. Hence the need for two manually operated, tracked cameras plus a third virtual camera for overviews and maybe the odd intro/outro camera flight. Transitions between sets will be handled by fullscreen stingers.

In this case the ATEM HD8 ISO would simply act as a remote for switching cameras on Aximmetry, but it will still be busy enough as a CCU, talkback controller, PGM cleanfeed recorder, DSK for lower thirds and streaming encoder.

What I am curious about is your experience with real-life performance limits of a system like this. For the moment I'm fine with only hard cuts between cameras, it's a studio talk format after all. But what about the Lumen glitches you mentioned? What do I need to tell the Unreal guys about what can and cannot be done here?

Playing with realistic looking light and shadows in the 3D foreground is obviously not going to happen; this will be more of a special effect, if it gets realized at all...

The other thing is post. Obviously the client wants to wring the most out of the live recorded material in post, so what would be a possible workflow to make the system record every input it received (tracking, switch commands, animation commands, stinger triggers...), combine it with the raw camera recordings (available outside of Aximmetry to take some load off the machine if necessary) and use this to re-render the three different perspectives offline?


 
TwentyStudios
Ah, so then you’re feeding the output of Aximmetry back to the ATEM? Not sure if you could configure the ATEM to just send out a switch trigger without actually switching the input. A Stream Deck would be the most practical option for switching. If you want to avoid relying on a separate computer for this, you could alternatively use a MIDI controller and map buttons to switch cameras. You might also be able to find a suitable DMX controller for this.
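If you go the MIDI route, the helper can stay tiny. A sketch in the same spirit as the ATEM bridge above, using the community easymidi Node.js library; the device name, OSC target and the note-to-camera mapping are all placeholder assumptions:

```typescript
// Sketch: map three buttons on a MIDI controller to camera switches.
// Requires: npm install easymidi node-osc
import * as easymidi from 'easymidi';
import { Client } from 'node-osc';

const input = new easymidi.Input('MyController'); // placeholder device name
const osc = new Client('127.0.0.1', 8000);        // placeholder OSC target

input.on('noteon', (msg) => {
  // Arbitrary mapping: notes 60/61/62 -> cameras 1/2/3.
  const camera = msg.note - 59;
  if (camera >= 1 && camera <= 3) {
    osc.send('/aximmetry/camera', camera);
  }
});
```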
Performance shouldn’t be a big issue on a single system since it’s just rendering one camera at a time. 
The best way to avoid Lumen issues is to just not use Lumen. If there isn’t a lot of dynamic lighting in the scene, you could use mostly static lighting and bake it. That will look much better (if the UE5 artist knows how to work with baked lighting), and it will have much better performance as well. That being said, it’s very possible to use Lumen and not notice these issues, but it’s something to be aware of. Lumen is still problematic in many ways, so for typical indoor broadcast sets I always recommend baked lighting.
One thing to note is that keying will look much better if you send the camera to Aximmetry in 4K. You can set Aximmetry to render internally in 4K and lower ScreenPercentage in Unreal until you’re below around 85% GPU load. If you’re doing live streaming, I’m not sure running 59.94 is the ideal choice either. 30p will save a lot of bandwidth and performance while giving better image quality for what sounds like a fairly low-motion production.
Hopefully someone at Aximmetry can chime in on how a workflow that records everything for later offline rendering would work. We’re almost 100% focused on doing everything in realtime, so haven’t really looked into that personally. 
 
Stefan Reck

Yes, the PGM output of Aximmetry gets sent back into a free input of the ATEM HD8 ISO, where it is then simply stomped on top of the ATEM's actual PGM bus by DSK1. So I'm free to repurpose the PGM bus buttons on the ATEM as a remote, with the added benefit that this also automatically records all cuts into a DaVinci project file. DSK2 then adds lower thirds. It's a bit crude, but it gives me streaming and recording (both clean and with lower thirds, as well as raw feeds of the cameras including a DaVinci project file) outside of Aximmetry directly to an SSD, whilst also being able to keep audio (with the exception of stingers) on a separate desk that feeds the ATEM's analog inputs.

Unfortunately the signal available from the cameras tops out at 2K as we don't have fiber heads. Scratch that, they will actually give me 2160p over SDI. It just means I can't record or monitor them in the ATEM any more, and I need to figure out a way to route the audio from channels 13-16 through Aximmetry into the 1080p PGM signal that goes back into the ATEM to preserve talkback. Lowering the PGM frame rate to 30p sounds like a good idea too.

 
Stefan Reck

I can record 4K in the cameras themselves though. Being able to use that to improve the keying quality in post would be great, so I really need some input on how to design a practical workflow for this. Ideally I would like to end up (after some heavy re-rendering overnight...) with a DaVinci project containing the same content as the live show, but with complete camera tracks for re-editing and in better quality.

 
Stefan Reck

TwentyStudios wrote:

"One thing to note is that keying will look much better if you send the camera to Aximmetry in 4K. You can set Aximmetry to render internally in 4K and lower ScreenPercentage in Unreal.until you’re below around 85% GPU load. "

Would that still work with 2 tracked cameras and 1 fixed/virtual camera on a single workstation for a 1080p30 PGM output or do I need a multi machine setup for this?




 
TwentyStudios

I think you’ll find that the quality difference with baked lighting isn’t that noticeable even with the highest scalability settings in Unreal. With a properly optimized scene, you should be able to render in realtime at the highest quality without having to resort to an offline rendering workflow. With large exterior scenes using Lumen and lots of dynamic lighting, animated objects and particles, the situation is different, so this only applies to typical virtual studio sets.

Regarding handling three switched cameras on a single workstation, that shouldn’t be an issue. Aximmetry only renders the active camera, so as long as you switch between the cameras and don’t have the SDI inputs connected to any active preview output, the currently unused cameras shouldn’t consume any GPU resources. There still might be PCIe bandwidth consumption for unused cameras, which might be an issue with 3x 4K30p inputs, but I’m not sure. We hit some sort of bottleneck with two Decklink 8K cards in the same system when going over 5 simultaneous 4K30p SDI inputs/outputs, but that’s probably system dependent. PCIe bandwidth issues are generally hard to pinpoint and can cause unpredictable frame rate drops.
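For a rough back-of-envelope (my own numbers, so treat them as an estimate): an uncompressed 10-bit 4:2:2 UHD feed at 30p is about 3840 x 2160 x 30 x 20 bits, which is roughly 5 Gbit/s or around 0.6 GB/s per feed, so five simultaneous feeds come to about 3 GB/s. If I have the card's interface right, a Decklink 8K Pro sits on a PCIe 3.0 x8 link with a theoretical ceiling of about 7.9 GB/s (noticeably less in practice, and with two cards the lanes may be shared), so hitting a wall in that region is plausible.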

 
Stefan Reck

"Aximmetry only renders the active camera, so as long as you switch between the cameras and don’t have the SDI inputs connected to any active preview output, true currently unused cameras shouldn’t consume any GPU resources."

Well, I do need the standard Aximmetry matrix view quad (3x Cam, 1x PGM) for live switching. But that's heavily scaled down and shouldn't consume that many resources, right?

 
TwentyStudios

You won’t get the matrix quad view with Unreal scenes. Again, only one camera angle is rendered at a time. Only the Aximmetry native 3D engine will give you the quad view with the 3D background. You will get the live video feeds in the quad view, but that would definitely consume the full PCIe bandwidth. Not sure how much GPU/CPU it consumes. I would advise using the quad view of the ATEM instead.

 
Stefan Reck

I can pull up a quad view of the 3 raw 4K camera signals and the composited 1080p PGM on an external monitor. But live switching without any sort of preview of the composited signal might become a bit of a challenge, depending on how much the talent decides to move around in the green box...

I'm wondering what it would take to generate an Unreal "PVW feed" on a separate machine, as a sort of interim solution short of upgrading this into a full 3-machine installation. What kind of Aximmetry license would I need here? Do I also need to feed it the cameras over HD-SDI, or is it possible (and practical) to just pipe them there from the main workstation over NDI?


 
Eifert@Aximmetry

Hi Stefan,

Regarding recording for post-production, yes, you can combine the raw camera recordings with the tracking recorded in Aximmetry. Detailed instructions on how to do this can be found here: https://aximmetry.com/learn/virtual-production-workflow/preparation-of-the-production-environment-phase-i/setting-up-inputs-outputs-for-virtual-production/video/recording/how-to-record-camera-tracking-data/#recording-the-image-on-the-camera
After recording, you can re-render each perspective offline sequentially.
Currently, Aximmetry does not include a built-in system to record actions performed during production in the Control Boards or elsewhere in Aximmetry. However, you can build such a system in the Flow Editor, although it is quite complex. Detailed instructions for this process can be found here: https://my.aximmetry.com/post/2753-replaying-a-real-time-session-with-high-



About the multi-machine setup:

- The controlling machine requires a Broadcast Edition license.
- Satellite (renderer) machines can use any license, depending on the input interface of the camera and the camera tracking system connected to the satellite machine.

It is recommended to use an SDI connection for each machine, as you will lose quality using NDI. Additionally, SDI is more stable.

As for generating only previews on a second machine: it is theoretically possible to create a system that generates three previews, for example by switching to a different camera on each frame, delaying it to the next frame, and combining the frames into a single preview view. However, this will result in a lower frame rate. Another issue with this approach is that Unreal's Lumen requires a few frames to generate a full-quality rendered image.
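To illustrate the idea, here is conceptual pseudologic only, not Flow Editor syntax; render() and composite() are hypothetical stand-ins for the equivalent Aximmetry building blocks:

```typescript
// Conceptual sketch of the round-robin preview described above.
type Frame = unknown;

const CAMERAS = [1, 2, 3];
const held: Frame[] = []; // last rendered frame per camera, held for reuse
let tick = 0;

function onEngineFrame(
  render: (cam: number) => Frame,
  composite: (tiles: Frame[]) => Frame,
): Frame {
  const cam = CAMERAS[tick % CAMERAS.length]; // different camera each frame
  held[cam - 1] = render(cam);                // delay/hold the fresh tile
  tick += 1;
  // Each tile refreshes only every 3rd engine frame, hence the lower
  // effective frame rate, and Lumen never gets consecutive frames per view.
  return composite(held);
}
```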

Warmest regards,

 
Stefan Reck

Thanks for the links on post workflows, I'll dig deeper into this and come back. I might not even need the actual camera switching commands, as these are also recorded by the ATEM HD8 ISO in a DaVinci project file if I use it as a remote. They will of course be off by a set number of frames due to the processing delay in Aximmetry, but that shouldn't be too hard to measure and correct.

Regarding adding a PVW feed to a single-machine DE setup: it does not need to be that elaborate, and I can probably do without a live composited preview of all the cameras at once. Basically all I need is a second Aximmetry system working in sync with the main "PGM" system (same UE room including set changes, same camera inputs, same tracking and lens data). This would be controlled by the PVW bus buttons on the mixer, effectively giving the TD a classic PVW/PGM broadcast user experience. The output of the PVW system does not need to be rendered in full quality, nor does it need to run transitions, billboards and the like. All it needs to do is give the TD (and possibly the cameramen) a live composition of the "next" camera and its associated UE rendering in order to line up the shot correctly.

 
Eifert@Aximmetry

Hi,

You could set up the Matrix Preview (PVW) as I mentioned, using one remote machine connected to your main controller machine. This way they would be in sync, but it would require significant editing in the Flow Editor.
It is much easier to use additional low-spec computers and run them at a low frame rate and resolution, especially if you have some old computers lying around. You would then have three machines plus the main controller machine in a multi-machine setup, where each low-spec computer renders one input. The low-spec computers render the PVW at low quality, while the controller machine renders the final output. This works because the Select Camera panel runs on the controller machine even in a multi-machine setup, and even if all the cameras are set to remote:
[screenshot: Select Camera panel in a multi-machine setup]
In a typical multi-machine setup, you would use the Select Camera panel only for setting things up and rely on an external switcher during live production. However, in your case, you could use Select Camera during live production.

Also, you don't need capture cards in your remote machines. You can receive everything with the controller machine and transmit it to the other machines via NDI using Aximmetry. This would decrease the quality of the input, but as you said, it is only going to be used for PVW. However, it would slightly decrease the performance of the controller machine.

This setup would also make it much easier for you to later transition into a full multi-machine setup with a powerful computer for each camera. However, depending on your tracking, you would need additional licenses.

Warmest regards,

 
Stefan Reck

"You could set up the Matrix Preview (PVW) as I mentioned by using one remote machine connected to your main controller machine. This way, they would be in sync, but it would require a significant edit in the Flow Editor."

Well, how practical and stable is this? A few additional frames of delay certainly won't matter here, plus I don't even need all the cameras at once. One at a time, with a standard camera selector module that I can remote over HTTP, would be enough.


 
Eifert@Aximmetry

Hi,

If you only need one camera at a time for the PVW, then you have a very easy job, because you can change the Cam 1-3 Engine values using Set Integer Pin modules:
[screenshot: Set Integer Pin modules]

You don't even need to edit the camera compound, as you can specify the pin and the module's location within locked compounds when using Set Pin modules.
For example, if you are using the TrackedCam_Unreal_3-Cam camera compound, you can target the SELECT CAMERA panel (Pin Collector) with TrackedCam_Unreal_3-Cam\SELECT CAMERA as the Module pin in the Set Integer Pin module. This way, you can create the following logic:
When you select a CAM in the Pin Collector, it will change that CAM to Remote #1 in the SELECT CAMERA panel and set the rest to Local.
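Written out as pseudologic (a sketch of the rule above, not real Flow Editor syntax; setIntegerPin and the 0/1 engine values are hypothetical stand-ins for the Set Integer Pin module and its Local/Remote #1 options):

```typescript
// Hypothetical stand-in for the Set Integer Pin module described above.
declare function setIntegerPin(modulePath: string, pin: string, value: number): void;

const LOCAL = 0;    // assumed value for "Local"
const REMOTE_1 = 1; // assumed value for "Remote #1"

// When a CAM is selected in the Pin Collector, route it to Remote #1
// and set the other cameras back to Local.
function onCamSelected(selected: 1 | 2 | 3): void {
  for (const cam of [1, 2, 3] as const) {
    setIntegerPin(
      'TrackedCam_Unreal_3-Cam\\SELECT CAMERA', // Module pin target (single backslash in the actual path)
      `Cam ${cam} Engine`,
      cam === selected ? REMOTE_1 : LOCAL,
    );
  }
}
```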


You can read more about Pin Collectors here: https://aximmetry.com/learn/virtual-production-workflow/preparation-of-the-production-environment-phase-i/scripting-in-aximmetry/flow-editor/special-compound-pin-collector/
Also, this setup needs a multi-machine configuration, as described here: https://aximmetry.com/learn/virtual-production-workflow/preparation-of-the-production-environment-phase-i/multi-machine-environment/multi-machine-setup/

If the second computer (remote machine) is less powerful than your controller computer, you can adjust its resolution with a Set System Params module. You would want to execute the Set System Params module only on the remote machine, so that the controller machine still renders at the Frame Size you set in Edit > Preferences. You can determine which machine (engine) is being used like this:

Warmest regards,

 
Stefan Reck

TwentyStudios wrote:

"You won’t get the matrix quad view with Unreal scenes."

If I put an Unreal file into a new composition and link it to the Unreal 3+3 Mixed Camera compound, it does actually give me a live rendered matrix quad. The content changes between PGM plus 3 virtual cameras and PGM plus 3 tracked cameras, depending on which camera is currently selected. Is this a new feature, or does this indicate that I am missing something and the PGM output isn't really rendered in Unreal yet?


TwentyStudios wrote:

"You can set Aximmetry to render internally in 4K"

Are you referring to the normal resolution setting under Edit/Preferences/Rendering/FrameSize? 

TwentyStudios wrote:

"and lower ScreenPercentage in Unreal.until you’re below around 85% GPU load. "

Ok, but where do I define the PGM output resolution and frame rate then? In Unreal or in Aximmetry?
 
TwentyStudios

Do you have a screenshot of the matrix quad view rendering different perspectives of the UE5 scene simultaneously? If they haven’t changed this quietly, you should only get the billboard rendered on top of a black background for every camera except the selected one. The limitation is that you can’t get the cameras rendered on top of their respective matching UE5 camera perspectives as separate outputs, since that would require rendering the scene for each camera perspective, which would surely tank performance.

Your UE5 rendering resolution is determined by the resolution you set in the Aximmetry project settings. This also determines what resolution Aximmetry renders its internal effects in, which is where you will see an improvement in things like the keyer when working in 4K, even with a 1080p video source. The rendering resolution is independent of the output resolution. We usually output 1080p but still render in 4K internally because of the increased image quality.

You can override the UE5 rendering resolution by specifying a resolution in the UE5 scene node. You can also execute a console command in the UE5 level blueprint to render at a lower screen percentage. I don’t think there’s a project setting in the UE5 editor that affects the rendering resolution, so you need to set it via screen percentage as a console command or in the .ini file.
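For reference, the relevant console variable is the stock Unreal r.ScreenPercentage. A minimal example of the .ini route (the value 75 is an arbitrary illustration; tune it against GPU load as suggested above):

```ini
; DefaultEngine.ini - render UE at 75% screen percentage
[SystemSettings]
r.ScreenPercentage=75
```

At runtime the same variable can be set with the console command r.ScreenPercentage 75, e.g. from an Execute Console Command node in the level blueprint.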

