Camera Projection Mapping

 

Can we do real-time camera projection in Aximmetry software?

like this one

https://www.youtube.com/watch?v=KGwCSim_rFk

Thank you for your great support.

   ahmed ahmed

 
Aximmetry

I don't think this should be a function of Aximmetry itself. The geometry and the projection have to be done in your modeler software. Then you can import it into Aximmetry and do the camera motion.

In other words, only the camera motion and the post effects (reflection, dust, etc.) require the real-time rendering system. The rest is mostly manual work in the modeler software.

 
ahmed ahmed

I think that some other real-time CG software packages have this feature, for example Vizrt

https://docs.vizrt.com/viz-artist-guide/3.3/container_plugins_projector_source_and_projector_target.html

or Ventuz 

https://www.ventuz.com/support/help/latest/NodeColorMaterial.html#Projection

or TouchDesigner, Notch, etc.

So this effect is not limited to modeler software.


 
Aximmetry

Maybe I wasn't clear. I meant that the specific usage we can see in the video might be easier to put together in the modeler itself.

But of course you can create a shader in Aximmetry that does this kind of projection.



Also, you can use a Spot Light to project images onto objects.

(But it might not be what you want, because it won't light all sides of the objects evenly.)

 
ahmed ahmed

Are there any new shaders or compounds in Aximmetry to do this effect?


 
Aximmetry

We are not planning to include a shader like this, because it is very specific, which means it always has to be tailored to the individual need.

But if you play a little bit with our shader system, you will see that it is quite easy to put together something like this.

For example, I added one extra node to Basic.xshad and I was able to project a texture onto any object from any direction (and any position).

Please note that "World Position" is used instead of "Tex Coords 1" as the input for "Sample Color". (Of course, an extra transformation is added to be able to set the projection's direction and position.)

For a test you can use a simple sphere and an arbitrary image. Please note: if you use a sphere you will see two instances of the image.
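(Not the actual Aximmetry node graph, just a minimal sketch of the equivalent math in Python/NumPy, assuming a simple planar projection: the world position is transformed into the projector's space by an extra matrix, and the resulting local X/Y are used as the UV input of "Sample Color". The function and parameter names are placeholders for illustration only.)

```python
import numpy as np

def planar_projection_uv(world_pos, projector_world_matrix):
    """Planar projection: transform a world-space position into the
    projector's local space and use the local X/Y as texture coordinates.

    world_pos              -- 3-element world-space position of the shaded point
    projector_world_matrix -- assumed 4x4 world matrix of the projector
                              (its position and direction); the inverse maps
                              world space into projector space
    """
    to_projector = np.linalg.inv(projector_world_matrix)
    local = to_projector @ np.append(world_pos, 1.0)  # homogeneous point
    return local[:2]                                  # local X/Y become U/V
```

This also suggests why a sphere shows the image twice: the projection runs straight through the object, so both the front and the back surfaces pick up the same texture.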

 
ahmed ahmed

Can you share this shader?

 
Aximmetry

Unfortunately the forum doesn't yet support file sharing functionality and we would like to avoid using common file sharing sites.

But it is really easy to alter a shader. Just load it as you would load a normal xcomp. Double-click on the "Pixel" node and you will see the image that I have already sent you. (There you just have to add one node and make 2-3 extra connections.)

Just make sure you don't overwrite the original Basic.xshad file. Make a copy in your own project dir.

You can even edit a shader while you are using it in another open xcomp file. Your changes will be automatically applied to the running xcomp file every time you save the shader.

 
ahmed ahmed

I still can't get the results I want.

Can you please help?


 
ahmed ahmed

To remind you, I need this type of effect:

https://www.youtube.com/watch?v=96JwSK3wG0E

 
Aximmetry

Several things are missing for that exact effect.

- First of all, you started from the Norm shader, not the Basic one, so you have to use the transformed UV for both Sample Color and Sample Normal.

- To achieve the motion effect, you must not use the same camera for the rendering and for the projection of the texture. You need a separate camera for projection.

- You don't need the world transformation of the camera, but the Screen Transf instead, which contains the full perspective projection transformation.

Here Camera 2 represents the camera position from where the photo was taken, while Camera 1 is for rendering the scene. This way you'll be able to move away from the original camera position.

- Also, to use a projection transformation you need a somewhat more complicated shader.

In words: you need a Transform Homog and then an Unhomog to apply the projection transformation. But this will project into the [-1, 1] interval, therefore you have to multiply it by 0.5 and then offset it by 0.5.

And after that, as I said, wire this to both the Color and Normal samplers.
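(Again, not the actual node graph, just a rough sketch of the same chain in Python/NumPy, under the assumption that Camera 2's Screen Transf behaves like an ordinary 4x4 view-projection matrix. Names are placeholders for illustration only.)

```python
import numpy as np

def projective_uv(world_pos, screen_transf):
    """Projective texture coordinates for camera projection mapping:
    Transform Homog -> Unhomog -> remap from [-1, 1] to [0, 1].

    world_pos     -- 3-element world-space position of the shaded point
    screen_transf -- assumed 4x4 view-projection matrix of the projecting
                     camera (Camera 2 in the description above)
    """
    clip = screen_transf @ np.append(world_pos, 1.0)  # Transform Homog
    ndc = clip[:3] / clip[3]                          # Unhomog (perspective divide)
    uv = ndc[:2] * 0.5 + 0.5                          # multiply by 0.5, offset by 0.5
    return uv
```

The resulting UV is what gets wired into both the Color and Normal samplers, while Camera 1 keeps rendering the scene from its own, freely movable viewpoint.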

 
ahmed ahmed

Thank you, but when I use lights and shadows a strange problem happens:

the shadow keeps flickering.


 
ahmed ahmed
I think that using Spotlight GW is the cause of the problem. After I changed it to a normal spotlight, there are no flickering shadows.
 
ahmed ahmed

Can Aximmetry add such a shader to the default packages?

Maybe add some other functionality to it.

My idea is that this solution can help when you have a live feed from a still camera and you need to use a real studio set as a background and add some graphics to it.

You can record the footage and then do some 3D tracking to get the camera and extract the objects, or do some basic modeling in any 3D software.

Finally, add the basic 3D scene to Aximmetry and do the camera projection mapping from the same live still camera onto the imported objects.

Also, the camera movement range can only be very small, but I think this will do the job.

 
Aximmetry

The idea is good. The only problem is that you cannot remove the talent from the live camera feed.
So it could only work if you used a still photo of the studio without the talent, and then added the talent as a billboard.
(You cannot create a proper basic model for the talent itself.)
