I'm noticing that a texture map in my scene appears to render significantly darker on my object when rendered with the CPU vs the GPU. See the images attached for a comparison.
If you look closely at the single sprite on the right, it looks like it might be a premultiplied-alpha issue, but I'm using a TIFF file with alpha transparency, as the docs recommend, and I can't figure out why the GPU would interpret it differently from the CPU. I'm using the Arnold 7.1.3 SDK and the Python API. Is there anything obvious that I'm missing here?
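For context on the premult suspicion: if a texture stores premultiplied (associated) alpha but one backend treats it as straight (unassociated) alpha and multiplies by alpha again, partially transparent texels come out darker by a factor of alpha, which matches a "darker on one backend" symptom. A minimal numeric sketch in plain Python (values are illustrative, not taken from the scene):

```python
def composite_straight(fg_rgb, alpha, bg_rgb):
    # Straight (unassociated) alpha: the compositor multiplies color by alpha.
    return tuple(f * alpha + b * (1 - alpha) for f, b in zip(fg_rgb, bg_rgb))

def composite_premultiplied(fg_rgb, alpha, bg_rgb):
    # Premultiplied (associated) alpha: color is already scaled by alpha.
    return tuple(f + b * (1 - alpha) for f, b in zip(fg_rgb, bg_rgb))

authored = (0.8, 0.8, 0.8)                      # color as authored
alpha = 0.5
premult = tuple(c * alpha for c in authored)    # stored in file: (0.4, 0.4, 0.4)
bg = (0.0, 0.0, 0.0)                            # black background

# Correct interpretation of premultiplied data:
correct = composite_premultiplied(premult, alpha, bg)   # (0.4, 0.4, 0.4)

# Wrong interpretation: alpha gets applied a second time:
too_dark = composite_straight(premult, alpha, bg)       # (0.2, 0.2, 0.2)
```

Here the double-multiplied result (0.2) is exactly alpha times the correct one (0.4), so a half-transparent sprite would look noticeably darker on whichever backend misreads the alpha mode.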
To me, the GPU result seems correct and the CPU one is wrong. Could it be that you are using a node that isn't supported on the GPU?
I agree, I think the CPU is wrong too. But there's nothing special about the node setup: an AiImage node plugged into the base color socket of an AiStandardSurface, with the alpha channel routed to the opacity socket. No other parameters are touched. Even if there were an unsupported node, wouldn't that mean the GPU was wrong rather than the other way around? That's why this one has me so confused.
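For reference, here is a minimal sketch of that shader graph built through the Arnold Python API. This is an assumption-laden repro, not the poster's actual script: the core node entry names for AiImage and AiStandardSurface are `image` and `standard_surface`, the texture path is a placeholder, and `None` is used for the default universe:

```python
from arnold import *

AiBegin(AI_SESSION_BATCH)

# Texture node ("image" is the core entry name behind AiImage).
img = AiNode(None, "image", "sprite_tex")
AiNodeSetStr(img, "filename", "sprite.tif")  # placeholder path

# Surface shader ("standard_surface" is the core name behind AiStandardSurface).
surf = AiNode(None, "standard_surface", "sprite_shader")

# RGB output drives base_color; the alpha ("a") component drives opacity.
AiNodeLink(img, "base_color", surf)
AiNodeLinkOutput(img, "a", surf, "opacity")

AiEnd()
```

If a stripped-down scene like this still shows the CPU/GPU difference, it would make a compact bug report.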
That is weird, yes. To me, it looks like a bug. I would test it with a different file format, like PNG or even TGA, and see if the same thing happens. If it still happens, then it's something else.
At this point I've tried a bunch of different file formats for the texture and they all behave the same way: the GPU renders properly, the CPU renders dark. The only other thing I can think of is something odd with my geometry, but again, you would normally see issues the other way around, with the CPU looking right and the GPU not, and that isn't what's happening here. I'm at a loss at this point.
Any chance you could save a small scene so that I could test it on my machine?