Revit API: IExportContext converting UV to the range (0,1)

Anonymous
Not applicable

Hello, I'm writing an exporter for Revit and have encountered a problem when exporting UVs. When I call PolymeshTopology.GetUV, the coordinates returned are often larger than 1. My questions are:

 

1. How can I convert them to the desired range (0,1)? Or how should I understand these UV values?

2. Are they coordinates on the image? If they are, how can I access the size of the image in IExportContext?

 

Thanks in advance!

Replies (14)

Anonymous
Not applicable

To give a concrete example:

Say I have a texture of size 4096x4096; the corresponding UV data I read in IExportContext is around 25, and the RealWorldScale from the UnifiedBitmapAsset is about 141. I cannot seem to discern a formula relating them...


jeremytammik
Autodesk

Anonymous
Not applicable

@jeremytammik Thanks a lot for the reply... the post you linked to explains UV as two variables that parameterize a face, but as far as I can see they relate to the structure of the face in 3D space, not to how each vertex of the face is mapped to the bitmap. I'd appreciate it very much if you could elaborate on this, that is, how I can convert the UV coordinates returned by PolymeshTopology.GetUV to pixel coordinates on the image, or into the range (0,1).

 

Basically I found this post (https://forums.autodesk.com/t5/revit-api-forum/polymeshtopology-uvs/td-p/8641007) to be relevant, but the answer given is not detailed enough (in my view...) to actually solve the problem.


Anonymous
Not applicable

Okay, I think I've figured it out, and I'll list the solution I found in case someone else has this problem...

 

The PolymeshTopology.GetUV method returns a UV coordinate, which is, as @jeremytammik 's link points out, a pair of parameters of a face. Therefore I guess they are related to the dimensions of the face that they parameterize. These UV coordinates can be converted to display units via UnitUtils.ConvertFromInternalUnits (which is probably deprecated, but anyway...); in my add-in centimeters are used. At this point we have the UV in centimeters.

The next step is to obtain the size of the bitmap, which is given by the properties named UnifiedBitmap.TextureRealWorldScaleX/Y. These values are actually given in inches, so you can call UnitUtils.Convert(value, DisplayUnitType.DUT_DECIMAL_INCHES, DisplayUnitType.DUT_CENTIMETERS) to convert them. In fact, I've found DisplayUnitType.DUT_FRACTIONAL_INCHES gives the same result; I don't know what the difference is... maybe someone could elaborate on it.

After that the UV coordinates can be scaled to the range [0,1] or whatever.
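Putting those steps together, here is a minimal C# sketch of what I mean (pre-2021 unit API; the polymesh and asset variables, and reading only UV index 0, are placeholder assumptions):

// using Autodesk.Revit.DB; using Autodesk.Revit.DB.Visual;

// 1. Raw UV from the polymesh, converted from internal units (feet) to centimeters.
UV uv = polymesh.GetUV(0);
double uCm = UnitUtils.ConvertFromInternalUnits(uv.U, DisplayUnitType.DUT_CENTIMETERS);
double vCm = UnitUtils.ConvertFromInternalUnits(uv.V, DisplayUnitType.DUT_CENTIMETERS);

// 2. Real-world texture size from the appearance asset, stored in inches.
AssetPropertyDistance px = asset.FindByName(UnifiedBitmap.TextureRealWorldScaleX) as AssetPropertyDistance;
AssetPropertyDistance py = asset.FindByName(UnifiedBitmap.TextureRealWorldScaleY) as AssetPropertyDistance;
double widthCm = UnitUtils.Convert(px.Value, DisplayUnitType.DUT_DECIMAL_INCHES, DisplayUnitType.DUT_CENTIMETERS);
double heightCm = UnitUtils.Convert(py.Value, DisplayUnitType.DUT_DECIMAL_INCHES, DisplayUnitType.DUT_CENTIMETERS);

// 3. Scale into [0,1]; values outside that range indicate the texture tiles.
double u = uCm / widthCm;
double v = vCm / heightCm;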

 

Am I doing it right? @jeremytammik 

thanks


Anonymous
Not applicable

P.S. the AssetPropertyDistance class actually has a DisplayUnitType property, which should be used instead of hard-coding DUT_DECIMAL_INCHES.
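For example (a sketch, with px being the AssetPropertyDistance from the snippet above):

// Convert using the unit the asset property actually stores.
double widthCm = UnitUtils.Convert(px.Value, px.DisplayUnitType, DisplayUnitType.DUT_CENTIMETERS);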


jeremytammik
Autodesk

It sounds perfectly sensible and correct to me. I hope others will test and verify as well. Thank you very much for your research and documentation!

 



Jeremy Tammik
Developer Technical Services
Autodesk Developer Network, ADN Open
The Building Coder


jeremytammik
Autodesk

Thank you very much for the helpful explanation. I am sure others will find it useful as well. I preserved it for posterity on the blog:

 

https://thebuildingcoder.typepad.com/blog/2020/07/revit-20211-update-and-normalising-custom-export-u...

 



Jeremy Tammik
Developer Technical Services
Autodesk Developer Network, ADN Open
The Building Coder


Anonymous
Not applicable

I'm a bit confused, because UVs are generally a relative spatial coordinate system. "Relative" is the key word, as they do not correspond to measurable distances. So I don't understand why there would be a need to convert TextureRealWorldScaleX/Y to another unit if normalization is ultimately x/y or y/x.

 

Of course, I'm pretty sure I don't understand how Revit is calculating its UVs. It seems crazy that when we ask for UVs it gives us UVs that are not usable as UVs (unless the texture dimensions are 1:1); the whole purpose of UVs is to allow us to ignore texture dimensions. But I understand it may make the Revit developers' lives easier, as it allows for interesting tiling dynamics when interchanging textures.

 

In any case, I attempted what I thought was a way to normalize given the texture dimensions:

normalize_multiplier = scaleX / (scaleY + ((scaleX - scaleY) / 2))

 

If scaleX was 14 and scaleY was 10 (inch, cm, it shouldn't really matter), then I see UVs that go from (0, 0) to (1.167, 0.833). normalize_multiplier then comes out to be 1.167, so (U / multiplier, V * multiplier) results in roughly (1, 1).
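As a quick sketch of that attempt (plain C#; the numbers are the ones from the example above):

// Attempted normalisation; as noted below, this does not hold up in general.
double scaleX = 14.0, scaleY = 10.0;
double multiplier = scaleX / (scaleY + (scaleX - scaleY) / 2.0); // 14 / 12 ≈ 1.167
double u = 1.167 / multiplier; // = 1.0
double v = 0.833 * multiplier; // ≈ 0.97, i.e. roughly 1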

 

But, unfortunately, that doesn't always work and I see erroneous UVs elsewhere.  So I need more information:

  1. What are the UVs we are getting from Revit? They are not classic UVs, so what do they really represent?
  2. What is the appropriate algorithm to convert or normalize the UVs based on texture dimensions? 
  3. Is there a function that already does this conversion?
  4. How does TextureRealWorldOffsetX/Y affect the UVs?
  5. How does TextureWAngle affect the UVs?

Thanks,

Jason

 


jeremytammik
Autodesk

Thank you for your questions. I passed them on to the development team for you.

 



Jeremy Tammik
Developer Technical Services
Autodesk Developer Network, ADN Open
The Building Coder


RPTHOMAS108
Mentor

I think this depends on the type of face; I find they are not always normalised from 0 to 1.

 

You can plot the UV co-ords on the surface using AVF; I believe the UV tends to follow the parameters of the curves around the edges of the face.

 

So I believe, last time I checked a cylindrical face, the straight edges had an ordinate related to the raw parameter of the curve (line), while the curved edges had the normalised parameter of the arc, i.e. for the face on a vertical curved wall the V was raw and the U was normalised (or the other way around). Sometimes, especially with cylindrical faces, the system is not orientated with V increasing in the same direction as Basis.Z (it depends on face orientation). If you have a complete circle with two cylindrical faces, one will have V pointing downwards and the other pointing up, to maintain the face normal outwards.

 

Anyway, my suggestion is to use AVF to understand how UV is applied to different types of faces.
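For anyone who wants to try that, here is a minimal, hypothetical sketch of plotting a face's raw U parameter with AVF; view, doc and faceReference are assumed to be in hand, and it must run inside an open Transaction:

// using Autodesk.Revit.DB; using Autodesk.Revit.DB.Analysis;
// using System.Collections.Generic;

SpatialFieldManager sfm = SpatialFieldManager.GetSpatialFieldManager(view)
    ?? SpatialFieldManager.CreateSpatialFieldManager(view, 1); // one value per point

int primitiveIdx = sfm.AddSpatialFieldPrimitive(faceReference);

Face face = doc.GetElement(faceReference)
    .GetGeometryObjectFromReference(faceReference) as Face;
BoundingBoxUV bb = face.GetBoundingBox();

// Sample an 11x11 grid between BoundingBoxUV Min and Max and record the raw U.
List<UV> points = new List<UV>();
List<ValueAtPoint> values = new List<ValueAtPoint>();
for (int i = 0; i <= 10; ++i)
{
    for (int j = 0; j <= 10; ++j)
    {
        UV uv = new UV(
            bb.Min.U + i * (bb.Max.U - bb.Min.U) / 10.0,
            bb.Min.V + j * (bb.Max.V - bb.Min.V) / 10.0);
        points.Add(uv);
        values.Add(new ValueAtPoint(new List<double> { uv.U })); // use uv.V for the V plot
    }
}

int schemaIdx = sfm.RegisterResult(new AnalysisResultSchema("Raw U", "Raw face U parameter"));
sfm.UpdateSpatialFieldPrimitive(primitiveIdx,
    new FieldDomainPointsByUV(points), new FieldValues(values), schemaIdx);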


RPTHOMAS108
Mentor

Actually, I recalled wrongly; the results are as below:

 

Generally the UV is based on the raw parameter of the curved edges, but for some faces, e.g. a ruled face, it is normalised. Below are some examples (note the direction of increase in all cases). They are based on the difference between the Face BoundingBoxUV Max/Min (dividing the range into 10 segments and adding the UV values to the ValueAtPoint collection). Many of the walls below are 9.8 ft high with base at 0.

 

[Images: AVF plots of PlanarFace U, PlanarFace V, RuledFace U, RuledFace V, CylindricalFace U, CylindricalFace V]


RPTHOMAS108
Mentor

This seems logical to me, in that a ruled face has to transition from one curve to another (opposite edges), so it makes sense for those two opposite curves to be normalised, so that the points along them can be mapped to one another. Imagine otherwise: you would have an arc at one edge opposing a line at the other (both with different lengths), and you would have to normalise at some stage.

 

At the same time, other types of faces such as cylindrical or planar faces may be constructed with a curve projected in a given direction by a length, so raw parameters seem more appropriate for that form of construction. Regarding the cylindrical face, you can also see the U values are related to the parameter of the curve, i.e. that curved wall is obviously longer than 1.5 ft if it is 9.8 ft high (so U needs to be multiplied by its radius).
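In other words (a hypothetical sketch; cylFace and uv are assumed to come from the model):

// For a cylindrical face the raw U is angular, so arc length ≈ U * radius.
CylindricalFace cylFace = face as CylindricalFace;
double radius = cylFace.get_Radius(0).GetLength(); // Radius(0) is an XYZ radius vector
double arcLengthFt = uv.U * radius; // distance along the curved edge, in feet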

 

I always prefer normalised curve parameters; they inherently tell you more, and tell you everything when combined with a length and a start point. I know what my preference would be, but I think we just get the leftovers of the internal geometry system.


jeremytammik
Autodesk

You never stop impressing me, Richard!

 

What a brilliant way to debug the UV coords!

 

Thank you for sharing this and resolving this complex topic!

 

I edited and shared it on the blog:

 

https://thebuildingcoder.typepad.com/blog/2020/12/dynamo-book-and-texture-bitmap-uv-coordinates.html...

 

Happy weekend to all!

 



Jeremy Tammik
Developer Technical Services
Autodesk Developer Network, ADN Open
The Building Coder


Anonymous
Not applicable

After educating myself on various aspects of Revit geometry and material creation, I ultimately got this working pretty much as sunsflower described. Here's my code for calculating U...

 

// Convert the texture's real-world scale to meters, using the unit stored on the asset property.
Texture.scaleU = UnitUtils.Convert(propX.Value, propX.DisplayUnitType, DisplayUnitType.DUT_METERS);

// Raw UV from the polymesh, converted from internal units (feet) to meters.
uv = polyMeshTopology.GetUVs()[0];
valueU = UnitUtils.ConvertFromInternalUnits(uv.U, DisplayUnitType.DUT_METERS);

// Normalised U.
u = valueU / Texture.scaleU;

 

Thanks to everyone who tried to help. My initial attempts to follow sunsflower's description never aligned with my expectations, so at first I messed up the calculation.