Inquiry About Modules for Color Value Sampling from Images or Videos

 

Hi,

I would like to ask if there is any module, or a combination of modules, that can extract color values from an image or video, similar to how MadMapper performs pixel mapping.

   studioMagicHour

 
Eifert@Aximmetry

Hi,

In general, the Measurer module can extract color values from video:
[Screenshot: a Measurer module extracting color values from a video in the Flow Editor]
Note that color output pins from the Measurer can be directly connected to vector input pins if you need the color data as a vector (a group of numerical values representing the color channels).
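
Aximmetry's Flow Editor is node-based, so there is no scripting involved, but if it helps to picture what the Measurer's color output represents, here is a rough Python sketch of the same idea outside of Aximmetry (the file name and region values are placeholders):

```python
# Conceptual sketch only: the Measurer does this inside the Flow Editor without
# any code. It just shows the idea of "average color of a rectangular region".
import numpy as np
from PIL import Image

def average_color(image_path, left, top, width, height):
    """Return the mean R, G, B of a region, each channel in the 0-1 range."""
    img = np.asarray(Image.open(image_path).convert("RGB"), dtype=np.float32) / 255.0
    region = img[top:top + height, left:left + width]
    return region.mean(axis=(0, 1))  # -> array([r, g, b]), i.e. a 3-component vector

# e.g. the mean color of a 100x100 patch whose top-left corner is at (200, 50):
# print(average_color("frame.png", 200, 50, 100, 100))
```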

If you want to measure color in multiple areas of the video, you can create an array compound, which essentially groups multiple Measurer modules together. More about array compounds here: https://aximmetry.com/learn/virtual-production-workflow/scripting-in-aximmetry/flow-editor/compound/#array-compound

For example, with an array of size 100, you can divide the video into 100 separate regions, with each Measurer handling one region:
[Screenshot: a Group x100 array compound of Measurer modules]

In the above array compound (Group x100), the measuring is divided into 100 areas.
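
To make the grid idea concrete, here is a small Python sketch of the equivalent logic outside of Aximmetry (the 10x10 split and the file name are placeholders; in the compound, one Measurer corresponds to one cell):

```python
# Conceptual sketch of what the Group x100 array compound achieves: the frame is
# split into a 10x10 grid and each cell gets its own average color.
import numpy as np
from PIL import Image

def grid_average_colors(image_path, rows=10, cols=10):
    """Return a (rows, cols, 3) array of mean RGB values, one per grid cell."""
    img = np.asarray(Image.open(image_path).convert("RGB"), dtype=np.float32) / 255.0
    h, w, _ = img.shape
    colors = np.zeros((rows, cols, 3), dtype=np.float32)
    for r in range(rows):
        for c in range(cols):
            cell = img[r * h // rows:(r + 1) * h // rows,
                       c * w // cols:(c + 1) * w // cols]
            colors[r, c] = cell.mean(axis=(0, 1))
    return colors

# colors = grid_average_colors("frame.png")   # colors[0, 0] is the top-left cell
```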

Keep in mind that using dozens of Measurer modules will be quite resource-intensive. If you need to sample color values across many dozens of regions, let us know so we can suggest a more performance-optimized solution if necessary.
Also, I think MadMapper can be used for various types of pixel mapping. To provide more specific advice, let us know what and how many devices or surfaces you intend to output the pixels on, which protocol you plan to use (for example, DMX), and what resolutions are required.

Warmest regards,

 
studioMagicHour

Hi Eifert,

Thank you for your quick and detailed explanation!

If I want to select a specific region of a video to feed into the Measurer, is there an easier method than using the Placer module and manually adjusting its parameters?

By the way, I’m currently researching ways to send RGB or intensity values via the OSC protocol into QLC+, which handles my DMX output. MadMapper is also an option, but for now I’m exploring alternative approaches.
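
To make that concrete, the kind of OSC bridge I have in mind looks roughly like this (a minimal python-osc sketch; the address paths and port are placeholders that would have to match my QLC+ OSC input profile):

```python
# Rough sketch of the intended bridge using the python-osc package.
# QLC+ would map these OSC messages to DMX channels via its input profile.
from pythonosc.udp_client import SimpleUDPClient

client = SimpleUDPClient("127.0.0.1", 7700)   # QLC+ machine and its OSC input port

def send_rgb(fixture, r, g, b):
    """Send three normalized (0-1) channel values as separate OSC messages."""
    client.send_message(f"/{fixture}/red", float(r))
    client.send_message(f"/{fixture}/green", float(g))
    client.send_message(f"/{fixture}/blue", float(b))

# send_rgb("backlight1", 0.8, 0.2, 0.1)
```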

Many thanks,
Moon

 
Eifert@Aximmetry

Hi Moon,

In the images above, I am following the same workflow you would use with the Placer module, but instead of cropping the video, I am using the Area input pins of the Measurer module.

There are four different Placer modules available. For example, you might find the Placer Area module easier to use, depending on your needs.

Are you sure you need QLC+? Aximmetry can send DMX signals directly. However, configuring DMX control in Aximmetry can be more challenging than in dedicated lighting control software like QLC+, especially if you are more familiar with QLC+ than with the Aximmetry Flow Editor.
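
Just for background: over the network, DMX is typically carried by Art-Net, which is a very simple UDP packet. You will never need to build these yourself, since Aximmetry (or QLC+) does it for you, but here is a purely illustrative Python sketch of what one DMX frame looks like on the wire (the IP address, universe, and channel layout are placeholders):

```python
# Illustrative only: a minimal ArtDMX packet as defined by the Art-Net protocol.
import socket

def send_artnet_dmx(ip, universe, channels):
    """Send one DMX frame (a list of 0-255 channel values) via Art-Net over UDP."""
    data = bytes(channels)
    packet = (
        b"Art-Net\x00"                     # protocol ID
        + (0x5000).to_bytes(2, "little")   # OpCode: ArtDMX
        + (14).to_bytes(2, "big")          # protocol version 14
        + bytes([0, 0])                    # sequence, physical
        + universe.to_bytes(2, "little")   # 15-bit universe address
        + len(data).to_bytes(2, "big")     # DMX data length
        + data
    )
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.sendto(packet, (ip, 6454))        # standard Art-Net UDP port
    sock.close()

# e.g. drive one RGB fixture on the first three channels of universe 0:
# send_artnet_dmx("192.168.1.50", 0, [204, 51, 25] + [0] * 509)
```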

How many DMX lights or pixels are you working with in total? Knowing this will help me determine how many Measurer modules you will need.

Warmest regards,

 
studioMagicHour

Hi Eifert,

Thank you very much for your swift reply.
I’ll try experimenting with the Placer Area module as you suggested.

To give you a better idea of my current setup: I use about 24 lights in the DMX system of my chroma key studio. Specifically, 6 bi-color lights on top for base lighting, 3 RGB lights for back lighting, 6 bi-color lights illuminating the green screen, 4 bi-color lights on the ground for key and fill, plus some additional RGB lights for supplementary effects.

I’ve tried several lighting apps, including MadMapper, and found that QLC+ is excellent for configuring custom fixtures and organizing scene-based lighting, even though it’s open-source. However, it lacks dynamic lighting features that interact with video or images in real time.

I understand that Aximmetry supports DMX control, and if it fits my needs, I’m open to integrating it. Most of the DMX-related information I’ve found on Aximmetry focuses on Unreal Engine 5 integration, but I’m aiming to build a 2D or 2.5D virtual production setup where cameras are on tripods and only provide PTZ values.

If you have any advice or learning resources relevant to this kind of setup, I would greatly appreciate it.

Sorry for the lengthy message, but I wanted to share the full context. Thanks again for your help.

Best regards,
Moon

 
Eifert@Aximmetry

Hi Moon,

It seems I misunderstood your setup a bit earlier. I thought you might be working with LED panel lighting, where you would need to control each individual LED pixel using DMX channels.

For more information, you can find documentation about using DMX modules in the Flow Editor here: https://aximmetry.com/learn/virtual-production-workflow/setting-up-inputs-outputs-for-virtual-production/external-controllers/using-dmx-with-aximmetry/

There is also a general overview of DMX and how to connect your lighting through the DMX protocol here: https://aximmetry.com/learn/virtual-production-workflow/supported-hardware/controllers/controllers/#dmx-artnet

Additionally, here is the main documentation page for the Flow Editor: https://aximmetry.com/learn/virtual-production-workflow/scripting-in-aximmetry/flow-editor/introduction-to-the-flow-editor/

It sounds like you are working on a green screen production and want to adjust the real-world lighting on your talent based on the final rendered graphics.
Note that some software and hardware setups can capture the lighting conditions at the position of the virtual billboard and recalculate them for your actual studio lights. This approach is especially useful if you have complex lighting in your scene, for example, light sources that are close to the talent, as such lights do not illuminate the entire rendered scene and might even be blocked from most of it by virtual objects.

Warmest regards,
