Hi all,
I'm able to query a geometry's closest uv at a given point, but how can I reverse that? If I have this float2 uv value then how can I convert it to a position? Was anybody able to do this?
Cheers!
Jason
Solved by mjcg91. Go to Solution.
You query the closest location on the UVs, but sample the location on the actual 3D geometry. Since your UVs and the 3D mesh have the same number of faces, and closest location works using barycentric coordinates, the sampled location will give you the closest position on the 3D mesh.
The tricky thing is that you have to construct a mesh from your UV data to do the query, because the query has to be done on the closest location on a face. The query will still happen in 3D even if you input float2 values as positions.
Hope this helps.
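The idea above can be sketched outside Bifrost. Below is a minimal plain-Python illustration (the function names are mine, not Bifrost nodes): compute the barycentric weights of the query point against a UV-space triangle, then apply those same weights to that triangle's 3D positions.

```python
def barycentric(p, a, b, c):
    """Barycentric weights of 2D point p in triangle (a, b, c)."""
    v0 = (b[0] - a[0], b[1] - a[1])
    v1 = (c[0] - a[0], c[1] - a[1])
    v2 = (p[0] - a[0], p[1] - a[1])
    d00 = v0[0] * v0[0] + v0[1] * v0[1]
    d01 = v0[0] * v1[0] + v0[1] * v1[1]
    d11 = v1[0] * v1[0] + v1[1] * v1[1]
    d20 = v2[0] * v0[0] + v2[1] * v0[1]
    d21 = v2[0] * v1[0] + v2[1] * v1[1]
    denom = d00 * d11 - d01 * d01
    v = (d11 * d20 - d01 * d21) / denom
    w = (d00 * d21 - d01 * d20) / denom
    return (1.0 - v - w, v, w)

def uv_to_position(uv, tri_uvs, tri_positions):
    """Sample the 3D position matching a UV, for one triangle:
    weights found in UV space, applied to the 3D point positions."""
    u, v, w = barycentric(uv, *tri_uvs)
    return tuple(u * xa + v * xb + w * xc
                 for xa, xb, xc in zip(*tri_positions))
```

In the real graph, finding *which* face contains the UV is what the closest-location query on the flat UV mesh does for you.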
And here's how to get the standard uv data from a mesh.
Hey Max,
That looks really cool! So from what I understand you're building a mesh from the uvs to make that plane geometry, getting the closest location from it using the locator, then finally sampling the location's position but with the original sphere to get the red point's position?
I guess the thing I'm struggling with is even if I built that mesh from uvs I won't have that locator on my setup so I would have nothing to query from.
I kind of figured I would have to extract more info from my initial closest location when I get the uv value so I can use it later to map onto its updated 3d position. Which leads me to another question: how can you sample any other info from a location? For the life of me I can only pull position out of it, but from what I understand in this link there's way more properties that I should be able to pull. I'm not too sure what the names of the properties are and how to check though as the watchpoint is empty.
That looks really cool! So from what I understand you're building a mesh from the uvs to make that plane geometry, getting the closest location from it using the locator, then finally sampling the location's position but with the original sphere to get the red point's position?
Yes, that's exactly what it is doing.
I guess the thing I'm struggling with is even if I built that mesh from uvs I won't have that locator on my setup so I would have nothing to query from.
This is using a locator in this case, but it could be anything, like another mesh's point positions... If you don't plug anything into the position port, you won't query anything either.
I kind of figured I would have to extract more info from my initial closest location when I get the uv value so I can use it later to map onto its updated 3d position. Which leads me to another question: how can you sample any other info from a location? For the life of me I can only pull position out of it, but from what I understand in this link there's way more properties that I should be able to pull. I'm not too sure what the names of the properties are and how to check though as the watchpoint is empty.
You can query almost any property from a mesh, relying on either the point_component or the face_component. You just have to make sure the "default" type you set on "sample_property" matches the actual property you want to query. If you can't see the data in your watchpoint, you can use a "dump_object" node to write your object to a text file. There you can see every single property it holds, along with its data.
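The same barycentric weights that drive position sampling can blend any per-point property (normals, colors, masks). A hypothetical sketch in plain Python, with illustrative names:

```python
def sample_point_property(weights, face_point_ids, per_point_data):
    """Blend any per-point property at a location on a triangle,
    given its barycentric weights and the face's three point IDs."""
    values = [per_point_data[i] for i in face_point_ids]
    return tuple(sum(w * v[axis] for w, v in zip(weights, values))
                 for axis in range(len(values[0])))
```

This is what a property-sampling node is conceptually doing once the location (face + weights) is known.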
Ok! Got it working!! Thanks for the help, Max! At first I wasn't completely understanding your first post and how that relates to getting the out position, but after replicating it and playing around I got it.
I'll share a test scene that does this, so if anyone else has the same issue they can check out the graphs, where I left a few comments.
This scene does the following:
So this specific example is sort of redundant, but the point is that if the geometry deforms later on, and I grabbed its UVs at the start, I can still use them to get its deformed positions. Basically I'm thinking of trying to build a mini wrap-deformer compound that's driven by UVs.
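That two-phase wrap idea can be sketched like this (illustrative plain Python, not Bifrost nodes): bind once on the rest shape by storing each point's (face, barycentric weights) found in UV space, then re-apply those weights to the driver's deformed positions every frame.

```python
def bind(points, closest_location_on_uv_mesh):
    """Bind phase: store (face_id, barycentric_weights) per point.
    closest_location_on_uv_mesh is whatever spatial query you use."""
    return [closest_location_on_uv_mesh(p) for p in points]

def deform(bindings, faces, deformed_points):
    """Deform phase: topology is unchanged, so the stored weights
    still apply; blend the driver's deformed positions with them."""
    out = []
    for face_id, (u, v, w) in bindings:
        a, b, c = (deformed_points[i] for i in faces[face_id])
        out.append(tuple(u * a[k] + v * b[k] + w * c[k] for k in range(3)))
    return out
```

The key design point is that the binding never has to be recomputed: the UVs (and the face/weight pairs they resolve to) are stable under deformation.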
Hi Maxime,
I just tried recreating your "create_mesh_from_uv" and I'm getting an empty mesh. Is the data type of "map1" array<Math::float2> or array<Math::float3>? Doesn't point position require the latter? But "map1" should be the former. I've tried both types and got an empty mesh.
"face_offset" and "indices" should be array<uint>, right?
@mcw0
The UV position property is array<Math::float2>, and face_offset and face_vertex_uv_index are array<uint>.
Also, the UV position property is no longer called "map1". If you import a mesh from Maya, the UV property is called "face_vertex_uv".
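Since the UV property is float2 but point positions need float3, the UV values have to be lifted into 3D (with z = 0) before they can drive the flat UV mesh's points. A trivial sketch of that conversion step:

```python
def uv_positions_to_points(face_vertex_uv):
    """Lift float2 UVs into float3 positions (z = 0), so the UV data
    can be used as point positions when building the flat UV mesh."""
    return [(u, v, 0.0) for (u, v) in face_vertex_uv]
```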
Hi Maxime,
I didn't see "map1" in the watchpoint even though that's the default uvSet. So I did try "face_vertex_uv" as that was the only property with "uv" in the name. I'm still getting an empty mesh.
I think I know what it is. I have to also change "map1_index".
That was it. I have a mesh now. Thank you
But this does raise a question. How does Bifrost handle multiple uvsets?
Bifrost imports all uvSets with different property names.
To get the face_vertex_uv_index, you can't use get_geo_property. You have to get "face_vertex_uv_index" using "get_sub_object", and then get the "indices" using "get_property".
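The relationship between those two properties is a classic indexed layout: "face_vertex_uv" is a shared table of UV values, and "face_vertex_uv_index" maps each face-vertex to a row in that table. A plain-Python illustration (data here is made up):

```python
def per_face_vertex_uvs(face_vertex_uv, face_vertex_uv_index):
    """Expand the shared UV table into one UV per face-vertex
    by looking up each face-vertex's row in the table."""
    return [face_vertex_uv[i] for i in face_vertex_uv_index]
```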
Since Bifrost doesn't seem to respect uvSet names, will the properties be "face_vertex_uv", "face_vertex_uv1"...? And of course, the same for indices?
Yes.
Hi Maxime,
I am trying to wrap one mesh to another using the UVs but I am having some troubles with it.
I am generating the UV meshes using your template, then I get the barycentric coordinates of the two UV meshes, but when I try to set the point positions of the wrapped mesh, the vertex IDs obviously don't match those of the UV mesh. I was wondering if there is a way to rebuild the sampled locations with the correct IDs. I guess I have to find a way to convert the face_vertex_uv_index to the vertex ID, but I have no idea how to do that.
Now that I think about it, I do not even need to generate a UV mesh; I just need an array of the UV coordinates sorted per vertex ID.
Thank you
Maurizio
Hi Maurizio,
I'm not quite sure why you want to deal with indices as I think spatial queries would allow you to do such effects without dealing with indices. If you have MJCG_compounds, you can have a look at deform_by_uv. It does what you are looking for using only a spatial query.
Hi Maxime,
What I am trying to do is, I think, a bit different. I am trying to replicate the effect of Maya's Transfer Attributes by UV.
So I am creating the correspondence of the points between the two UV sets.
So for the target object I need the UV locations of the vertices.
This is the API version of my little project:
https://www.youtube.com/watch?v=Pn5FmfX1dJY
I am so close to it I just have to find a way to get the UV coordinates per vert ID.
I see. Technically speaking, if you are just looking for point ID correspondence, a 3D vertex can have multiple UV vertex correspondences in UV space; this is the case when you have a seam. The points on the seam will have more than one correspondence.
Maybe you could try to spatial query the mesh on itself and use sample_property to sample face_vertex_uv.
Also, this tutorial from Paul Smith could help you. It's not using UVs, but it's transferring movement between different topologies.
@mjcg91 wrote: I see. Technically speaking, if you are just looking for point ID correspondence, a 3D vertex can have multiple UV vertex correspondences in UV space; this is the case when you have a seam. The points on the seam will have more than one correspondence.
Yes, I was aware of that. I think it would be fine just picking the first value that comes from the vertex.
I was wondering if there is a way to iterate all the vertices and query the UV coordinates related to that vertex.
I started getting into Bifrost a couple of days ago, so I don't know whether using an iterator will slow down the graph, and I don't know whether there is some way to convert the vertex component to UV.
Thanks for the video. At first glance it seems like he is doing the same thing I was doing following your example.
I will watch it again more carefully.
I have attached a scene with the meshes. I think the compound would work if the IDs on the UV mesh were organized the same way as the mesh's IDs. I have the feeling there is an easy solution to this; I just can't figure it out...
Thank you for the help
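The "one UV per vertex ID, first value wins at seams" idea discussed in the thread can be sketched in plain Python (illustrative names; in the graph this would be an iteration over the face-vertex data):

```python
def uv_per_vertex(face_vertex_ids, face_vertex_uvs, num_points):
    """Build one UV per vertex ID from per-face-vertex data.
    A vertex on a UV seam appears with several UVs; keep the
    first one seen, as suggested above."""
    uvs = [None] * num_points
    for vid, uv in zip(face_vertex_ids, face_vertex_uvs):
        if uvs[vid] is None:
            uvs[vid] = uv
    return uvs
```

With that array in hand, the target mesh's vertices can be looked up directly in UV space without building a UV mesh at all.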