Sorry if this subject has already been addressed on the forum and I missed it.
My question is: should we users expect visual consistency between rendering on the CPU and on the GPU? Or has it been made clear that the two are effectively different render engines?
It's clear that, under certain conditions, Arnold's GPU-rendered images do not match the CPU-rendered results. A single example is attached to this post, using the standard glass shader from a downloadable material library.
As GPUs become so much more powerful and cost-effective than CPUs (not to mention easier to replace or upgrade), it's concerning that a user cannot design in a CPU environment and then render in a GPU environment. Or, even worse, cannot trust that an animation will look the same down the road, when technology changes and a re-render becomes necessary. The point of this post is NOT to debate which of the two is better; that's relative to an artist's situation.
Are the designers of Arnold intending the CPU and GPU renders to look the same, and this is a bug? Or are we artists expected to choose a render technology, commit to it, and not stray from it?
Thanks all!
Tech info:
Attached example rendered in 3D Studio Max
CPU example used an Intel i9-9900X.
GPU example used an Nvidia EVGA RTX 3060 Ti.
The goal is visual consistency: both should ultimately converge to the same result.
Can you attach an .ass file of that test scene?
And what were the sampling settings for CPU vs. GPU? (Those have to be different to get the same result.)
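For reference, Arnold GPU only uses the camera (AA) samples and ignores the per-ray-type sample counts (diffuse, specular, etc.), so a GPU render typically needs a higher AA value to reach a similar noise level. A quick way to compare the two on equal footing is kick on the command line; the scene file name and sample values below are only placeholders:

    # CPU render: AA samples plus the per-ray-type samples apply
    kick glass_test.ass -device cpu -set options.AA_samples 4

    # GPU render: only AA samples are honored, so raise them to compensate
    kick glass_test.ass -device gpu -set options.AA_samples 8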
Hi, Stephen.
Your response is a relief and I appreciate your time.
I didn't save the test scene but was able to quickly reproduce it. Hopefully, I'm giving you what you need.
Naturally, I'm less concerned about the inherent noise from the sampling settings than about the noticeable difference in how transparency is interpreted between the CPU and GPU renders.
Thanks.
Hi, Stephen.
I was wondering whether there has been any progress in investigating this problem.
Thanks.
I've only done a quick investigation, but converting the PNG to a 3-channel JPG makes the two renders match. I suspect it's related to some combination of single-channel textures and some other attribute handling being broken in Arnold. We'll keep investigating.
A workaround I've found so far is to specify the color_space of the image node as linear when you have single-channel images. If you do that, the CPU and GPU renders will match. Hopefully, once we fix this bug, you won't need to do this, and any color space you specify will match.
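For example, the override can be applied from the command line with kick (the scene and image node names here are just placeholders for whatever your scene uses):

    # force the single-channel texture to be read as linear so CPU and GPU agree
    kick glass_test.ass -device gpu -set my_roughness_map.color_space linear

The same kick render with -device cpu should then match.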
Hi, Stephen.
This post is going on 4 months old.
I continue to experience substantial problems with Arnold CPU and GPU renders not matching.
I'm not sure how you can honestly advertise the product with the claim that your GPU rendering is a solid solution.
Has there been ANY advancement at all on this glaring issue? Has there been any public announcement or warning that your GPU solution is risky to use in conjunction with your CPU solution? If there has and I missed it, I apologize.
Please let me know if I can continue to be of help to your development team.
We should have fixed the bug you reported above in Arnold core 7.1.4. What version of Arnold are you testing? If you're finding other issues, please make new threads so we can fix those issues as well.
https://help.autodesk.com/view/ARNOL/ENU/?guid=arnold_user_guide_ac_arnold_gpu_ac_features_limitatio... has a list of the current known limitations.
Thanks so much for responding. Just yesterday I encountered an inconsistency with SSS.
I'll do some tests and create a new post if I find issues!