Hi Guys,
I'm working on some computer vision stuff, and I thought generating some simple synthetic data in Maya would be really convenient, especially because I need a normal pass and a depth pass. Anyway, I've rendered out my beauty pass with mental ray along with a depth pass. I then used the camera characteristics to create a 3D point for each pixel and applied the depth to turn the image into a point cloud. The results are pretty nifty, but I noticed that my ground plane is not flat!
The attached image shows my results. I know this might be a bit out of scope for a Maya users board, but I was wondering if Maya/mental ray adds distortion to output images/depth maps by default. (I'm confident I didn't do it!)
I've attached the relevant passes.
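For reference, here is a minimal sketch of the back-projection step described above. The camera parameters (resolution, focal length, film aperture) are placeholders, not the actual render settings, and it assumes the depth pass stores distance along the optical axis (planar z):

```python
import numpy as np

# Hypothetical camera parameters -- substitute your Maya render settings.
width, height = 640, 480      # image resolution
focal_mm = 35.0               # focal length of the render camera
aperture_mm = 36.0            # horizontal film aperture in mm

# Focal length expressed in pixels (square pixels assumed)
fx = focal_mm / aperture_mm * width
fy = fx
cx, cy = width / 2.0, height / 2.0

def backproject(depth):
    """Turn an (H, W) depth map into an (H*W, 3) point cloud.

    Assumes `depth` holds planar z (distance along the optical
    axis), not Euclidean distance to the eye point.
    """
    u, v = np.meshgrid(np.arange(width), np.arange(height))
    x = (u - cx) / fx * depth
    y = (v - cy) / fy * depth
    return np.stack([x, y, depth], axis=-1).reshape(-1, 3)
```

With a constant depth map this produces a flat plane facing the camera, which is a quick sanity check for the reconstruction code itself.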
Maybe it's a gamma issue. Have you checked (or taken into account) that the passes are no longer linear data but were written out with gamma correction?
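If that's the case, the depth values need to be linearized before back-projection. A rough sketch, assuming a plain 2.2 power curve was applied on write-out (Maya's color management may use the piecewise sRGB transfer instead, which would need the full formula):

```python
import numpy as np

def linearize(encoded, gamma=2.2):
    """Undo a simple gamma curve applied on write-out.

    `encoded` is normalized [0, 1] pixel data as read from disk.
    Assumes a plain power curve; a true sRGB transfer would need
    the piecewise sRGB formula instead.
    """
    return np.power(encoded, gamma)
```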
One further possibility: a mismatch between depth (distance to a plane perpendicular to the optical axis) and distance (to the eye point)?
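That mismatch would explain a flat floor bowing into a curved surface: a distance-to-eye map grows toward the image corners even for a flat plane. If the pass stores Euclidean distance, dividing each pixel by the length of its viewing ray recovers planar z. A sketch with placeholder intrinsics (match them to your render camera):

```python
import numpy as np

# Hypothetical intrinsics -- substitute your own camera values.
width, height = 640, 480
fx = fy = 560.0
cx, cy = width / 2.0, height / 2.0

def distance_to_planar_depth(dist):
    """Convert an (H, W) map of distances-to-eye into planar depth (z).

    Each pixel's viewing ray ((u-cx)/fx, (v-cy)/fy, 1) has length
    sqrt(((u-cx)/fx)**2 + ((v-cy)/fy)**2 + 1); dividing the Euclidean
    distance by that length yields the z component.
    """
    u, v = np.meshgrid(np.arange(width), np.arange(height))
    ray_len = np.sqrt(((u - cx) / fx) ** 2 + ((v - cy) / fy) ** 2 + 1.0)
    return dist / ray_len
```

At the image center the ray length is 1, so depth equals distance there; the correction grows toward the edges, which is exactly where the ground plane curves away in the reconstruction.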