Build a broadcast studio

Hello, I want to build a studio consisting of three fixed cameras and a fourth full-frame tracking camera. What offers do you have, and what do you advise me to do? Note that our staff is experienced with Unreal Engine.

Please do not just refer me to documentation.

I want a contact I can communicate with.


   huseen

Comments

Eifert@Aximmetry
  -  

Hi,

You can contact us directly at sales@aximmetry.com. Our sales team can answer questions like yours and also offer consultations or training packages—not for Unreal, since you are already proficient with it, but for operating virtual studio production with Aximmetry and related topics. We also respond there to questions that are not sales-related, so feel free to ask about any other topics as well.

Warmest regards,

huseen
  -  

I sent a message to the sales team, but I have not received a response yet. This is the second time I have been asked to build a studio for a satellite channel, and I have not received any positive response from you.

Eifert@Aximmetry
  -  

Hi,

Our team does not work on weekends, except in special cases that require prior arrangement. This is why you did not receive a reply to your email.

Warmest regards,

huseen
  -  

The sales team replied to my message, but they gave me the same documents as on this site!

huseen
  -  

Eifert@Aximmetry 

Can I know how your company will implement this project? These are the most important points I believe you need to know.


- I use a green screen with chroma keying.
- Unreal is an essential part of the business.
- I'm not targeting XR or AR at the moment.
- I need three fixed cameras and a fourth tracking camera.
- The cameras we currently use are Canon DSLRs. If you have any advice, I'd like to hear it.
- We have a diverse team of experienced people because we are a satellite channel.

Eifert@Aximmetry
  -  

Hi,

To perform green screen chroma keying with tracked cameras in Unreal, you should use the TrackedCam_Unreal_8-Cam in Aximmetry, along with the corresponding Unreal Aximmetry camera in Unreal Editor.


Once you are comfortable with this process, you can explore the MixedCam_Unreal_3+3-Cam compound. It lets you use fixed cameras as dedicated virtual cameras, whereas in the TrackedCam_Unreal_8-Cam you have to specify their positions manually. However, starting with TrackedCam_Unreal_8-Cam is recommended for a simpler introduction.

A Canon DSLR is not the best camera for virtual production, but it should work without major issues.

For consultations or training, ask our team at sales@aximmetry.com. This is the most direct way to get professional support. If you have Unreal Engine experience but are new to Aximmetry or virtual production, operator training would be a good starting point.

Regarding your question, "Can I know how your company will implement this project?": I am not sure what you are expecting from us.

Also, I recommend reading the relevant parts of the official documentation, as it covers many important topics. If you still have questions after that, our experts can address them in a consultation, or I can answer some questions here.

Warmest regards,

huseen
  -  

I have a few questions, please.
1. Can I connect three or four cameras to one computer?
2. Is it possible to have two ports, one input and one output, for each computer? Connecting two computers would then give me four inputs and four outputs.

No tracking

Eifert@Aximmetry
  -  

Hi,

1. You can connect as many video streams as you want to a single Aximmetry instance. However, note that each computer can only render one camera perspective at a time when using Camera compounds. One reason for this is that Unreal Engine only supports rendering from one CineCamera at a time. Even though scenes rendered by Aximmetry's own engine do allow multiple cameras, we do not recommend it, and the camera compounds don't support it, because, among other things, it places too much load on one computer.

In summary, you can connect multiple cameras to one computer, but only one camera output can be rendered at any given time. During production, you can switch which camera is used as the output.

2. In green screen and augmented reality (AR) productions, each remote computer in a multi-machine setup can handle one video input and one output. For example, to support four cameras (four inputs and four outputs), you would need four computers. Alternatively, you could use two computers—one acting as the controller that switches between up to three cameras, and the other as a remote computer that consistently renders one camera.


In most multi-machine setups, each computer renders the output from one camera. A separate video switcher device is used to switch between these computers, and this device is typically operated by someone other than the Aximmetry operator.
Note, since Aximmetry 2024.3.0, the software includes the [Common_Studio]:Compounds\Switcher\Video_Switcher.xcomp compound. This allows you to switch between the video outputs of different computers directly from a separate computer, effectively eliminating the need for a video switcher device.
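To picture the topology described above, here is a minimal conceptual sketch in Python. This is not Aximmetry's actual API (the Video_Switcher compound is configured graphically); the names are purely illustrative. It only shows the key idea: each remote machine renders exactly one camera's perspective, and the switcher merely selects which machine feeds the program output.

```python
# Conceptual sketch of a multi-machine program switcher.
# NOT Aximmetry's API; the classes and names are hypothetical.
# Each remote machine renders one fixed perspective, and the
# switcher only chooses which remote goes to program.

class RemoteMachine:
    """One render machine: one camera in, one rendered output."""
    def __init__(self, name, camera):
        self.name = name
        self.camera = camera

    def render_frame(self, frame_no):
        # Each machine renders only its own camera's perspective.
        return f"{self.camera}:frame{frame_no}"

class ProgramSwitcher:
    """Selects which machine's output goes to program."""
    def __init__(self, machines):
        self.machines = machines
        self.active = 0

    def cut_to(self, index):
        # Cutting does not re-render anything; it only selects
        # a different machine's already-running output.
        self.active = index

    def program_output(self, frame_no):
        return self.machines[self.active].render_frame(frame_no)

machines = [RemoteMachine(f"PC{i + 1}", f"cam{i + 1}") for i in range(4)]
switcher = ProgramSwitcher(machines)

print(switcher.program_output(1))  # cam1:frame1
switcher.cut_to(2)
print(switcher.program_output(2))  # cam3:frame2
```

The point of the sketch is that the per-machine render load never changes when you cut: every machine keeps rendering its single camera, and only the selection changes.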

For live productions, such as news studios, it is usually best to use a multi-machine setup with each computer rendering one camera. This is preferred for various technical and operational reasons.

Warmest regards,

huseen
  -  

I once designed a newscast setup for a satellite channel. We used Zero Density. We had two computers, each with two camera inputs and two camera outputs. One was for tracking and the other was fixed. Is this possible using Aximmetry?


JohanF
  -  

@huseen: He just explained this: it's Unreal that will only output one perspective at a time, and that has nothing to do with tracking. If you're using a fixed camera, you can use a pre-rendered background for it and output that simultaneously, giving you two outputs from one computer. Even if you could get two perspectives from UE5 at once, it would be a very bad idea, since it would cut the performance available to each camera angle in half.

huseen
  -  

@Eifert@Aximmetry

After cooking the project in Unreal, Unreal's role is over. Aximmetry then controls the DeckLink or AJA inputs and outputs.

Right?



Eifert@Aximmetry
  -  
Hi,

Aximmetry's render camera in Unreal uses the video input and tracking data provided by Aximmetry Composer, and it outputs the rendered image back into Aximmetry Composer.

However, if you configure Unreal Engine to use its own built-in solutions for video input, tracking, or something else, those settings will still be present in the cooked version of your Unreal project. Ultimately, when you run the project through Aximmetry, the cooked Unreal project runs in the background. Aximmetry does not magically convert your entire Unreal Engine project to be rendered by Aximmetry's own render engine—instead, it launches and runs the Unreal Engine in the background, similar to the way a computer game built with Unreal Editor operates.

Warmest regards,

huseen
  -  

Yes, Aximmetry takes a copy of the project and keeps it running in the background.

What I mean is that the distribution of inputs and outputs remains under Aximmetry's control.

So why is Aximmetry limited to one camera output per computer?

Eifert@Aximmetry
  -  

Hi,

If you mean, for example: why doesn't Unreal Editor for Aximmetry just cook the Unreal project twice and run two instances of the same project?

One reason is that this would more than double the load on your computer. Running two instances doesn't just double the resource usage—they also compete for the same type of computing resources in some cases, which can significantly reduce system performance.
Furthermore, Unreal Engine features like Lumen lighting allocate much larger caches than your project might need. This prevents another instance of Unreal from reliably accessing an equal share of resources, as system-wide resource allocation is not split evenly between processes.

A better alternative would be running two virtual cameras within a single instance of Unreal Engine. This is somewhat possible using RenderTarget2D, but not feasible with CineCamera, which is the high-quality camera used in virtual production workflows. Many features in Unreal, such as planar reflections, depend on the camera's perspective. For example, planar reflections are rendered from the camera's viewpoint as a secondary pass before the main render, which means the second camera would get reflections rendered from the perspective of the first camera.
Supporting multiple camera outputs natively would require major changes to Unreal Engine, potentially making it slower and more complex even when using a single camera. Keep in mind, Unreal Engine is primarily designed for games, where almost always only one camera output is needed.

As discussed previously, in virtual production, rendering multiple outputs simultaneously is also not practical, as you usually want the highest possible quality on a single output, rather than splitting resources and compromising on quality.
If your scene happens not to use much of the computer's processing power, you could instead increase your rendering resolution (Frame Size) and downscale to your required output size at the end. This increases the quality of the final image, a technique often called double sampling.
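The downscaling step can be sketched in a few lines of plain Python. This is only a conceptual illustration using a simple 2x2 box filter on a grayscale image; real renderers and Aximmetry's own pipeline use higher-quality filters.

```python
# Double sampling sketch: render at 2x the target resolution,
# then average each 2x2 block down to one output pixel.
# A plain box filter is used for clarity; production
# downscalers typically use better filters.

def downscale_2x(image):
    """Average 2x2 blocks of a grayscale image (list of rows)."""
    h, w = len(image), len(image[0])
    out = []
    for y in range(0, h, 2):
        row = []
        for x in range(0, w, 2):
            block = (image[y][x] + image[y][x + 1]
                     + image[y + 1][x] + image[y + 1][x + 1])
            row.append(block / 4)
        out.append(row)
    return out

# A 4x4 "render" downscaled to a 2x2 output.
hi_res = [
    [0, 0, 4, 4],
    [0, 0, 4, 4],
    [8, 8, 2, 2],
    [8, 8, 2, 2],
]
print(downscale_2x(hi_res))  # [[0.0, 4.0], [8.0, 2.0]]
```

Each output pixel averages four rendered samples, which is why edges and fine detail look smoother than rendering at the output resolution directly.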

Lastly, Aximmetry isn’t limited to only one output per computer. In theory, you can have up to 200 outputs from a single Aximmetry instance, depending on your hardware. The real limitation is rendering more than one camera output with Unreal Engine.

Warmest regards,

huseen
  -  

Is there a tutorial video on connecting multiple devices for learning?

Eifert@Aximmetry
  -  

Hi,

Could you please clarify what type of video you are looking for? We rarely produce our own video documentation, but there are many helpful videos about Aximmetry created by various authors on YouTube.

If you are interested in learning how to add devices in the new Startup Screen of Aximmetry, then this reaction video might help:
https://www.youtube.com/watch?v=RrVWqpRm6go
However, the actual documentation of it is much more detailed and straightforward: https://aximmetry.com/learn/virtual-production-workflow/starting-with-aximmetry/aximmetry-composer/startup-configuration/

Warmest regards,

huseen
  -  

The old interface was smoother.

I need to understand the process of connecting multiple computers, whether with or without tracking.

Documentation is important, but it's not enough for building a studio.

I need to understand how to build a studio with multiple computers, step by step.


https://www.youtube.com/watch?v=LqiuLhk_8Iw&ab_channel=StudioPlus

Eifert@Aximmetry
  -  

Hi,

The multi-machine setup is not part of the startup screen.

We have documentation on multi-machine environments starting here:  https://aximmetry.com/learn/virtual-production-workflow/multi-machine-environment/introduction-to-multi-machine-environment/ (the page with the studio diagrams could be very useful if you're curious about the steps you need to make outside of Aximmetry)
However, please note that this documentation is somewhat outdated. In version 2025.1.0, the multi-machine window was replaced by the Multi-Machine Manager window, but the documentation has not yet been updated to reflect this change.
The new Multi-Machine Manager window allows you to set the inputs for remote machines directly from the controller machine, making the whole process very straightforward. Previously, setting inputs and outputs was only possible by configuring the remote machines individually before starting the Launcher application.

Warmest regards,

huseen
  -  

This is a chart of what I need right now. Is this enough?

(chart image attached)

huseen
  -  

There is currently no tracking in this project. 

1- Do I need a calibrated camera?

2- If we need an Ethernet switch and a sync generator, what is the best type you have tried?

Eifert@Aximmetry
  -  

Hi,

For a virtual camera production without tracking, the chart and the Ethernet switch should be sufficient.
However, note that your chart does not include capture cards, which you will likely need for video input and output.

If you are not using tracked cameras, there is no need for calibrated cameras or lens data.

Please note that we do not and cannot endorse or recommend specific camera models or brands.
Maybe Johan or other users on this forum can help you with that.

Using a sync generator is helpful for keeping your cameras and computers synchronized. This ensures that when you switch between cameras, the outputs are perfectly aligned and not a few frames ahead or behind each other. When purchasing a sync generator, make sure it supports your intended frame rate.
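To illustrate why this matters, here is a small sketch with purely hypothetical numbers: two free-running cameras whose internal clocks started at different moments disagree about where frame boundaries fall, so at any instant they are on different frame numbers. A sync generator forces all devices onto the same frame boundaries.

```python
# Sketch of why genlock matters. Two free-running 50 fps cameras
# whose clocks started at different times disagree about frame
# boundaries, so a cut between them lands frames apart.
# All numbers are illustrative only; integer milliseconds are
# used to avoid floating-point rounding.

FRAME_MS = 20  # 50 fps -> 20 ms per frame

def frame_index(t_ms, start_offset_ms):
    """Which frame number a camera is on at time t_ms."""
    return (t_ms - start_offset_ms) // FRAME_MS

t = 1000  # one second into the show, in milliseconds
genlocked    = frame_index(t, 0)   # locked to house sync
free_running = frame_index(t, 13)  # clock started 13 ms late

print(genlocked, free_running)  # 50 49 -> one frame apart
```

With genlock, every camera's `start_offset_ms` is effectively zero relative to house sync, so all devices agree on the current frame and a switch between them is seamless.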

For your Ethernet switch, make sure that only the Aximmetry computers, or any computers that absolutely need to access them, are connected to it. This helps prevent network congestion caused by unrelated activity during your production.

Warmest regards,