Use 3d max camera image as a texture input

Message 1 of 22

Anonymous
Not applicable

Hello. I want to use a camera in my scene as an input to the Material Editor (as a map input). For example, say you have a box in the scene and a camera looking at the box. I want to have that image (the camera's view) available as a map, use it for texturing, and have the map update with any change in the scene (for example, the position of the box). The quality of the map is not important (just what you see in the viewport).

Is there any way to do this, or could somebody write a script for it?

Thanks.

Accepted solutions (2)

Replies (21)
Message 2 of 22

Anonymous
Not applicable

Nobody?

 

Message 3 of 22

10DSpace
Advisor

Hi @Anonymous 

 

It is not completely clear what you are trying to do here, so clarifying your desired outcome will help find a solution. It sounds like you are trying to create a live camera feed of sorts, with the rendered image output from the camera being placed in the Material Editor. You also want the camera image to be updated with any change in the scene. Do you mean automatically updated by any change in the scene, or do you, the user, take action after creating your animation, like rendering out the frames from the camera view?

 

"..and the quality of map is not important just that you see in viewport."

 

This makes it sound like you don't want a render, but rather just a live viewport view.  Can you clarify why you want this?  

 

Also,  you say you want  "to use my camera in the scene as an input in my material editor " and that you want "to use it in texturing".    Texturing what?

 

If your overall goal is to use the series of images from the live camera feed to map onto a model of a television screen, for example, then all you have to do is render out the animation from the camera and place the image sequence into the diffuse slot of the material for the screen (a rough script version of that is sketched below). Please clarify your overall goal and I will try to help.
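A rough MaxScript sketch of that workflow (the sequence path and the object name Screen are illustrative assumptions, and the animation is assumed to be rendered to numbered files already):

-- build a Standard material whose diffuse map is the rendered frame sequence
screenMat = StandardMaterial name:"ScreenMat"
-- load the first frame; ticking "Sequence" when picking the file (or using an .ifl file) makes the map animate
screenMat.diffuseMap = Bitmaptexture fileName:@"C:\renders\camFeed_0000.png"
showTextureMap screenMat on   -- display the map in the viewport
$Screen.material = screenMat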

Message 4 of 22

Anonymous
Not applicable

Thanks. About "This makes it sound like you don't want a render, but rather just a live viewport view": that's true, this is my goal. I want it for making a live, updatable texture for my project.

What I am doing is really more than just a TV-camera use case.

Can you write a script for it?🙂

Message 5 of 22

10DSpace
Advisor

Sorry @Anonymous, but I am still not clear on what you want to achieve.

 

"I want it for makeing a live and update able texture for my project."

If you want a texture map/image sequence, the only way I know how to do this in Max is to render the Camera/viewport image.  This will always take a finite amount of time, so there would not be real-time updating of the texture map image of the camera view as things are animating in the scene.

 

On the other hand, the Max viewports and camera views do automatically update with what is going on in the scene in real time (unless the scene gets very heavy in terms of the number of objects, total polycount, number and size of textures, etc., in which case the viewport framerate suffers). But even with a light scene, the visual feedback from the viewport does not give you an image map.

 

So you seem to be asking for the equivalent of a custom User Viewport for the realtime updating of the scene but with the added feature of an automatic and instantaneous generation of an image map of the viewport.    To the best of my knowledge the combination of both requirements is not possible in Max or any 3D Software that I am familiar with.    

 

I asked what your overall objective is because, with 3D programs and graphics in general, there are always multiple ways to achieve an end. But you still haven't explained exactly what you are trying to achieve, so in the absence of understanding why you want this, I can't think of anything else to say that would be helpful.

 

 

Message 6 of 22

Anonymous
Not applicable

Thanks @10DSpace. I know there is a way where you render out frames and use them as an animated texture. About "So you seem to be asking for the equivalent of a custom User Viewport for the realtime updating of the scene but with the added feature of an automatic and instantaneous generation of an image map of the viewport."

Yes, I need something like this, but just with the quality we see in the viewport (quality is not my goal). I think it could be possible with MaxScript.

My project is about making an animatable texture that is created directly in the scene. Not the texture itself; its mask would animate directly with a black-and-white object (in a flat view, without shadows, etc.).

Please, could someone write a script for it? :/

 

Message 7 of 22

Anonymous
Not applicable

Nobody?

What should I do?

HELP

PLEASE

THANKS

😞

Message 8 of 22

ads_royje
Alumni

Hi @Anonymous ,

 

@10DSpace is absolutely right!

To be able to display an image in a material, it has to be a saved image on disk.
3ds Max textures cannot use a live viewport feed.

There are two ways I can think of. The first is what has already been explained and you know about: render the sequence and use it as an animated texture.

The second is if you need the image to be the viewport drawing, with the grid and the viewport menus displayed, in Wireframe or Shaded mode... i.e. the viewport as an image.

You can capture the viewport as a bitmap and then save the bitmap to disk, to load it as a texture map.
To capture the viewport as a bitmap, you need its viewport ID. With the default quad viewports, the IDs run from top left to bottom right: Top Left: 1, Top Right: 2, Bottom Left: 3, Bottom Right: 4.

To capture a camera view, set a viewport to that camera. Let's say we are using the Top Left viewport (by default the Top view); that is viewport ID 1. Create a variable and assign the viewport capture to it.

-- get the viewport window handle by ID
viewport.getHWnd index:1
-- take a snapshot of it
windows.snapshot (viewport.getHWnd index:1)
-- assign the snapshot (which is a bitmap) to a variable, so it can be viewed and saved as a texture map
myViewportCapture = (windows.snapshot (viewport.getHWnd index:1))
-- now that we have the viewport capture as a bitmap, we can see it.
-- this displays the viewport capture bitmap. Note that the bitmap is the size of the viewport on screen
display myViewportCapture
-- it must be saved to disk to be usable as a texture map.
-- assign a path\filename to a variable so it can be re-used when assigning to the material.
textureFileSaved = @"C:\savetopath\name.ext"
myViewportCapture.filename = textureFileSaved
-- this saves the image to the path & name defined above
save myViewportCapture
-- now the saved image can be loaded as the image on the material.
-- depending on the material type, the image is loaded like this:
-- using a Standard material
$.material.diffuseMap.filename = textureFileSaved
-- using a Physical Material
$.material.base_color_map.filename = textureFileSaved


The simplest approach would be to "render" the viewport from a for loop and assign the resulting sequence to the material.
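A minimal sketch of such a loop (the output folder, file names, and viewport ID are illustrative assumptions):

-- step through the active animation range, grab viewport 1 at each frame, and save a numbered file
for t = animationRange.start to animationRange.end do
(
    sliderTime = t
    local cap = windows.snapshot (viewport.getHWnd index:1)
    cap.filename = @"C:\temp\viewCap_" + (formattedPrint (t.frame as integer) format:"04d") + ".png"
    save cap
    close cap
)
-- the numbered files can then be loaded as an animated bitmap (sequence / .ifl) in the material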


You would get very similar results using Make Preview as an image sequence, with no scripting involved.
Make Preview also allows you to control the size of the image.
It is in the Max main menu: Tools > Preview - Grab Viewport.

I hope this can help,
Regards



 

Message 9 of 22

Anonymous
Not applicable

Thanks @ads_royje. But I want a real-time result, and as you said, there isn't any way to do that. Thanks.

Message 10 of 22

ads_royje
Alumni
Accepted solution

Sorry, that does not exist by default.

 

I've tried a little trick that seems to work, but honestly it is quite laggy to use.

Using node callbacks, when the camera is moved around, a function is run. The function does the screen capture, saves the image to disk, and loads it as a texture.
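For reference, the registration part of such a callback could look roughly like this. This is only a minimal sketch, not the attached script; the file path, viewport ID, and use of Material Editor slot 1 are illustrative assumptions:

-- handler: capture viewport 1, save it, and reload the texture held in Material Editor slot 1
fn updateCamTexture ev nodes =
(
    local cap = windows.snapshot (viewport.getHWnd index:1)
    cap.filename = @"C:\temp\camFeed.png"
    save cap
    close cap
    -- assumes slot 1 holds a Standard material with a Bitmaptexture in its diffuse slot
    meditMaterials[1].diffuseMap.filename = @"C:\temp\camFeed.png"
    meditMaterials[1].diffuseMap.reload()
    completeRedraw()
)
-- fire on any node change in the scene, throttled by a small delay (in milliseconds)
global camFeedCB = NodeEventCallback delay:100 all:updateCamTexture
-- to stop it later: camFeedCB = undefined; gc light:true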


Attached is a video showing it: "CameraViewportCaptureAsTextureRealTime.mp4"

The .ms file and the Max scene used for the video are in the zip file "CameraLiveFeedback.zip".

Edit:
I wrote, and say in the video, that it is executed on camera movement. In fact, the callback fires on any node movement in the scene.
Regards,

Message 11 of 22

10DSpace
Advisor

Hi @ads_royje 

 

That is a very cool callback technique! Thanks for posting. I am guessing the quirky lagging has to do with the variable state of ongoing Windows events and competing in the queue for file save and file load operations, in addition to the time for callbacks and Max functions, etc.

 

In reviewing @Anonymous's posts, I guess technically what he wants is a result that will be perceived by the viewer as real time. So that would mean a result quick enough to trick the viewer into seeing it as continuous, i.e., persistence of vision, or about 1/10th to 1/12th of a second (~83 to 100 ms). Out of curiosity, do you have any idea of the time in ms it takes different parts of the script to run? I am guessing it is likely to be variable depending on Windows load, but it might give a ballpark timeframe. Also, would a lower-level retrieval and file save from the viewport's video frame buffer (if that is even the correct way to describe it) be feasible and quicker? Probably C code and a plug-in rather than MaxScript, I guess.

Message 12 of 22

ads_royje
Alumni

Thanks @10DSpace for your feedback!! 🙂

 

That is also my suspicion: a lot of events going on, and as you saw, the script is very straightforward.
It is an experiment, to see how feasible this is.


I understand it the same way: as close to real time as possible!

Yes, I fully agree with you, this "feature" would require a faster language than scripting.

What could make a huge difference is a material that would accept an unsaved bitmap, so we'd be able to bypass any file i/o.

I did not try to measure where the performance hits are. I highly suspect file I/O to be one of the big ones.

The video frame buffer capture time should not be the biggest problem; it is not among my major performance suspects. Make Preview uses the same capture method (from C++, that is, not MaxScript) and is quite fast. Where Make Preview loses time is in compression and file I/O.
In this experiment, the callback listens for any node movement in the scene; the short test has 6 nodes, including cameras and targets... I haven't tried it with hundreds of nodes. 😕

Message 13 of 22

10DSpace
Advisor

@ads_royje 

 

What could make a huge difference is a material that would accept an unsaved bitmap, so we'd be able to bypass any file i/o.

 

OSL and/or data channel?

Message 14 of 22

Anonymous
Not applicable

Thanks @ads_royje and @10DSpace. First, the video @ads_royje sent was good, but really not enough; it was too slow. And I don't want to change the camera position.

And as @10DSpace said, I think OSL could help a lot.

And about the language: do you think C++ is better for this kind of work, or would Python, for example, be good too?

This code could really help me and... Is there really no way to do it in 3ds Max???

Message 15 of 22

Anonymous
Not applicable

Hello everyone, I am new here. Interesting thread, thanks for the information 🤗

Message 16 of 22

ads_royje
Alumni

I don't know enough about OSL to say; I would suspect it would still require a saved image on disk. To be looked at! 🙂

For speed, Python is also a scripting language, so it is not going to be any faster; in Max, it will be slower than MaxScript.
A compiled language like C++ would most likely be much faster.

 

Regarding the experiment with MaxScript and node events: in the video I show the camera moving, but it is the same when moving any object in the scene; the node event fires for any moved node in the scene.

I did some local measurements and this is the time it took for each part (see the attached Measure.mp4):

viewport capture: 8 ms
set a file name on the bitmap: 0 ms
save bitmap to file: 12 ms
load and update texture on the material: ~1000 to 1024 ms
force viewport redraw: ~20 to 70 ms

From what I have tried, the bottleneck of this experiment is the texture loading/update in the material: the file reading.
If there were a way to assign a bitmap directly to a material, it'd be much faster.
Materials, by default, all require a saved image on disk.
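One thing that might be worth testing (an assumption on my part, not a confirmed workaround): MaxScript's Bitmaptexture exposes a bitmap property, so in principle the capture could be pushed into the map in memory; whether the material and viewport actually refresh from it without a file on disk is exactly what would need to be verified.

-- hypothetical sketch: assign the viewport capture straight to the map, skipping save/load
cap = windows.snapshot (viewport.getHWnd index:1)
meditMaterials[1].diffuseMap.bitmap = cap -- assumes a Bitmaptexture sits in the diffuse slot of slot 1
completeRedraw()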

To get something faster, I would look into a material that can take a virtual bitmap.
I'm not sure; maybe the OSL language allows this, otherwise I would expect it to be done in C++ using the Max SDK.


Or something I am not thinking of. 🙂

Message 17 of 22

10DSpace
Advisor

@ads_royje 

 

Great follow-up with the time measurements. It really helps focus on the main bottlenecks.

 

Regarding OSL, I don't know if this helps but @madsd  had posted a video of an OSL realtime cloud shader here:

 

https://www.youtube.com/watch?v=xOYI9k9gsf8

 

I know very little about OSL, but I remember being impressed by some of Mads' postings. Thanks again for your follow-up info.

Message 18 of 22

madsd
Advisor

Hi, yes - with OSL we can create image projections, or camera projections, that run at 100 FPS.
I can't directly share certain initiatives, because of things. So you will have to wait or try to dive in yourself.

But it is indeed possible.

Message 19 of 22

10DSpace
Advisor
Accepted solution

@madsd 

 

Thanks for the (provocative) follow-up. It sounds like an interesting initiative. I guess the camera projection technique you are talking about is demonstrated here:

 

https://cgpress.org/archives/real-time-projection-mapping-with-3ds-max.html

 

 

Message 20 of 22

madsd
Advisor

Eloi is using a physical projector there if I recall correctly, beaming on some wall in his room.

With OSL we can project UVs through, for example, a camera, a line, or a plane. All we need is its position and orientation, and we can use that as a source to send out a projection plane that hits surfaces and applies some kind of texture or procedural map, as seen from that specific angle of approach.

Like the Arnold Camera Projection shader node.