I need to query a point on a mesh, convert that point to UV coordinates, and then sample the image to determine what color corresponds to that point on the mesh (and then do something if the color matches certain RGB values).
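For the first half, something like this seems to get the UV at an arbitrary world-space point. It's a rough, untested sketch using the Python API 2.0; the shape name and the exact return signatures of getClosestPoint/getUVAtPoint are my assumptions, so worth double-checking against the OpenMaya docs:

```python
import maya.api.OpenMaya as om

def uv_at_point(shape_name, world_point):
    """Return the (u, v) on the mesh closest to an arbitrary world-space point."""
    sel = om.MSelectionList()
    sel.add(shape_name)  # expects the shape node, e.g. 'pSphereShape1'
    fn_mesh = om.MFnMesh(sel.getDagPath(0))

    point = om.MPoint(*world_point)
    # Closest point on the surface, then the UV at that position.
    closest_point, _face_id = fn_mesh.getClosestPoint(point, om.MSpace.kWorld)
    u, v, _face_id = fn_mesh.getUVAtPoint(closest_point, om.MSpace.kWorld)
    return u, v

# e.g. uv_at_point('pSphereShape1', (1.0, 0.5, 0.0))
```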
I'm close to solving the point-on-mesh-to-UV-space part, but I've found nothing for sampling images. I could use Pillow to extract pixel information, but then I'd have to re-map it for UV space (I think Pillow's origin is the upper left, while Maya's UV origin is the lower left).
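The Pillow route I have in mind would look roughly like this; an untested sketch that assumes UVs stay in the 0-1 range (no tiling) and a plain RGB texture, with the V flip to account for the origin difference:

```python
from PIL import Image

def sample_uv(image_path, u, v):
    """Return the (r, g, b) color at a (u, v) coordinate of an image file."""
    img = Image.open(image_path).convert('RGB')
    width, height = img.size

    # Clamp to the 0-1 range so we never index outside the image.
    u = min(max(u, 0.0), 1.0)
    v = min(max(v, 0.0), 1.0)

    x = min(int(u * width), width - 1)
    # Flip V: Maya's V axis points up, Pillow's row index grows downward.
    y = min(int((1.0 - v) * height), height - 1)
    return img.getpixel((x, y))  # (r, g, b), each 0-255
```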
Maya does this image sampling already for displacement maps, for emitting particles from a surface (emitter1.textureRate), and in various other ways. But I can't find any documentation on how to do it programmatically.
There are probably ways to do it in Bifrost, but I'd rather do it in Python so it can be integrated into existing Python toolsets (plus I hate node-based programming. Just a personal dislike; it's fine if that's your thing, but I find it annoying.)
Use cases: telling meshes (trees/grass/bushes) where to be instanced, or pulling the red contour lines from a USGS topographic map to create height displacement.
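And the color check itself would just be a tolerance test on the sampled RGB, something like this (sketch; the 0-255 values assume an 8-bit image like the Pillow example above):

```python
def color_matches(sample, target, tolerance=10):
    """True if an (r, g, b) sample is within a per-channel tolerance of a target color."""
    return all(abs(s - t) <= tolerance for s, t in zip(sample, target))

# e.g. flag the red contour lines of a scanned topo map:
# if color_matches(sample_uv('topo.png', u, v), (255, 0, 0)): ...
```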
Any help is appreciated.
Thanks!