Decimation always on ... bug???

Message 1 of 6

aaronfross
Collaborator

Hi,

I saw that someone back in May posted an apparent bug about Decimation. I think I'm having the same issue. I'm importing .e57 and .las files from an external application, Cloud Compare. Even though Decimation is disabled during import, the level of detail I'm seeing in ReCap Pro is very low compared to the original imported files. I double-checked the files and they are fine. It seems that ReCap Pro is spuriously decimating the point cloud. I don't think ReCap Pro lets me see a vertex count, but the original file had about 200,000 points.

Perhaps it's happening because the bounding box is very, very small? Is ReCap Pro culling all points within a hard-coded radius? Changing the project settings to millimeters did no good, especially since that setting is only available AFTER the point cloud is imported.

Please help. This is the only way I can get point clouds into 3ds Max, and it seems to be broken.

Aaron

Replies (5)

Message 2 of 6

aaronfross
Collaborator

OK, I just applied a 100x scale in my external application, re-imported the data to ReCap Pro, and the bug is confirmed.

ReCap Pro is culling points that are within a radius that is unknown to the user.

This is surely an optimization that was introduced with the assumption that all point clouds will be scaled in the range of common household objects. No feature smaller than, I'm guessing, 0.01 millimeter can be resolved.

The optimization needs to be removed; we already have a Decimation feature that we can control.

 

Thanks

 

Aaron

Message 3 of 6

ryan.frenz
Alumni

Can you give some background on your data?  What scanner, how many setups, etc?

 

ReCap's format is not a 1:1 replication of the input file (it is lossy).  Points that are deemed statistically equivalent are unified.  The criterion is based on knowledge and testing of commercial laser scanners.  In practice, as you mention, the effect is the removal of points that are within a small radius (much less than 1 mm) of another point in the same scan.

 

The important assumption here is that the uncertainty of the measurement device is equal to or larger than this radius.  'Uncertainty' here includes consideration of both the device's accuracy (correctness) and its precision (noise/repeatability) on the measured surface.
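
ReCap's exact unification rule isn't published, so the following is purely an illustration of the general idea, not ReCap's algorithm: a minimal numpy sketch that collapses points via a voxel grid, with a placeholder 1e-5 (0.01 mm) radius. It shows why a very small bounding box loses almost everything:

```python
import numpy as np

# Illustration only -- NOT ReCap's actual algorithm. A crude voxel-grid
# stand-in for "unify points that fall within some small radius".
def unify_points(points, radius=1e-5):
    # Snap each point to a grid cell of size `radius`, then keep one
    # representative point per occupied cell.
    cells = np.floor(points / radius).astype(np.int64)
    _, keep = np.unique(cells, axis=0, return_index=True)
    return points[np.sort(keep)]

# ~200,000 points crammed into a 0.1 mm cube (a "very, very small"
# bounding box, as in the original post): nearly everything merges.
pts = np.random.rand(200_000, 3) * 1e-4
print(unify_points(pts).shape[0])  # on the order of 1,000 points survive
```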

 

If ReCap is removing points that you need, it's likely for one of two reasons:

  • You have a laser scanner that is more accurate/precise than assumed.  Such devices do exist in the metrology world.  Formats like LAS and E57 also make this tricky because they don't always carry information about the measurement device (i.e., more general assumptions must be made).
  • You've consolidated multiple, non-unified scans into a single input file.  This causes a problem because it changes the validity of the above assumption.  ReCap can remove points that are just 'overlap' and not actually equivalent.  The easy fix in this case is to keep your scan files separate when importing to ReCap.

In short, the best way to maximize the fidelity of your ReCap point cloud is to import the native format of your scanner and turn off decimation.

 

-Ryan

 

Message 4 of 6

aaronfross
Collaborator

Thank you, Ryan. I feel vindicated that I was able to determine the issue on my own. And of course, "it's not a bug, it's a feature". 🙂

The developers' assumption that the point cloud data is coming from a real-world scanning device (with real-world accuracy limits) is not a good assumption in this case. I'm dealing with purely computer generated data; there is no scanner in my pipeline.

As I stated in my original post, the point cloud data is coming from an external application, which is called Cloud Compare. But that is not the original source of the data. The data came from a 3D fractal application called Mandelbulb3D. I had to run it through Cloud Compare because ReCap does not support .PLY point clouds. And I am using ReCap in the first place because 3ds Max does not support any point cloud format other than ReCap.

So, in order to get the point cloud data into 3ds Max, I have to import the .PLY point cloud into Cloud Compare, then export to .E57. Then load the .E57 into ReCap and save as .RCP. Then I can finally load into 3ds Max. But now I know that I must scale the cloud by a factor of 100 to keep points from being merged.
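
For anyone following the same pipeline: the 100x scale can be baked into the .PLY before Cloud Compare ever sees it. A minimal sketch using the plyfile package (the filenames, and the assumption that the vertices carry standard float x/y/z properties, are mine):

```python
from plyfile import PlyData  # pip install plyfile

SCALE = 100.0  # lift point spacing above ReCap's apparent merge radius

ply = PlyData.read("mandelbulb.ply")  # placeholder filename
verts = ply["vertex"].data            # numpy structured array
for axis in ("x", "y", "z"):
    verts[axis] *= SCALE              # scale coordinates in place
ply.write("mandelbulb_x100.ply")
```

The scaled file then goes through Cloud Compare to .E57 and into ReCap as before; just remember the resulting .RCP arrives in 3ds Max at 100x life size.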

Can you tell me if there are any more gotchas? Is there any other way in which the ReCap format is inherently lossy?

Thanks again!

Aaron

Message 5 of 6

ryan.frenz
Alumni

Hi Aaron,

 

That sounds really cool.  But ReCap was built for real-world capture (hence the name, and the assumptions).  The format assumptions are not necessarily a 'feature', but a reasonable optimization in light of that goal.

 

As far as 'gotchas', nothing else jumps out at me.  It might be useful to consider the details above when generating your synthetic scenes (i.e., when choosing density/sampling rate, etc.).  You also might run into performance issues in Max if you get too crazy with the point cloud size/density (I haven't tested it lately).
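
To make that concrete (a hypothetical helper; the ~0.01 mm merge radius is only the guess from earlier in this thread, not a published figure), one could estimate the cloud's typical nearest-neighbor spacing and derive the scale factor needed to clear the radius:

```python
import numpy as np
from scipy.spatial import cKDTree

def required_scale(points, merge_radius=1e-5):
    """Scale factor that lifts the median nearest-neighbor spacing
    above merge_radius. The default radius is an assumption, not
    ReCap's published value."""
    d, _ = cKDTree(points).query(points, k=2)  # d[:, 0] is self-distance (0)
    spacing = np.median(d[:, 1])
    return max(1.0, merge_radius / spacing)
```

Running this on the synthetic cloud before export and scaling by at least the returned factor would keep the spacing safely above the assumed radius (Aaron's 100x worked for his data).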

 

-Ryan

Message 6 of 6

marcel_winklmueller
Explorer

Disregard. I have a different issue here.

The point still stands, though: if there is an off button, it should behave as such. I'm just not sure anymore whether it does or not.
