Researching whether it is possible in VRED to grab and move (transform) objects with the VR hands in a natural way.
I know how to move objects with the laser, but actually grabbing and moving them is so much more intuitive.
Has anyone actually succeeded in doing this in VRED in a good way?
I was thinking of parenting/constraining the object to be moved to the touching VR hand using a sensor and a Vset, but is that the best way?
Any thoughts?
Sorry, double post on the same subject. Mod can remove/hide this one if possible. Sorry again.
I know that we can use controllers to control functionality in VRED. What are the possibilities for doing the same with hand gestures? Is this possible with the VRED APIs that currently exist?
Hello,
Yes, it's possible by utilizing the 'getPinchStrength' method along with constraints, collisions and timers.
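For the timer part, one approach (a sketch, not verified in VRED) is to poll the pinch strength periodically and apply hysteresis so the grab state doesn't flicker around a single threshold; the threshold values below are arbitrary assumptions:

```python
def pinch_state(strength, was_pinching, press=0.8, release=0.5):
    """Hysteresis for pinch detection: start grabbing when strength rises
    to `press` or above, keep grabbing until it drops below `release`."""
    if was_pinching:
        return strength > release
    return strength >= press

# Inside VRED this could be driven from a timer callback, e.g.:
#   hand = vrDeviceService.getRightTrackedHand()
#   pinching = pinch_state(hand.getPinchStrength(), pinching)
```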
See:
I tried to call vrdTrackedHand getPinchStrength(), but I get an error that it requires a parameter, yet in the API documentation this function just returns a value.
Is there an example anywhere in the examples? I couldn't find any.
Hi, here is an example.
# Check RH pinch strength - while virtual hands are in view of HMD
rh_tracked = vrDeviceService.getRightTrackedHand()
print(rh_tracked.getPinchStrength())
Using a Pico headset with VRED: the Pico headset has a standard hand-tracking setting. If we enable it, I assume it will work with VRED? Or do we need to program the grabbing of objects manually with Python? Any tips?
My example was for the Varjo XR-3 HMD which has integrated hand tracking functionality.
I don't have experience with Pico headsets, but the link below has some tips for Setting up Hand Tracking for Other Devices.
I implemented something like this for hand tracking and controllers before. I had to struggle with the hierarchy of the nodes: when you want to move a wheel and select it with the laser pointer, you may actually select a screw, and then you have to go up the hierarchy until you have selected the whole wheel. This was easy to do with the controllers via the touchpad, but with hand tracking you need to control the hierarchy with an extra menu. You can also prepare your scene so that there are no hierarchy problems. I can help you if you want to implement something like that.
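The "go up the hierarchy until you have the whole wheel" step can also be scripted instead of handled through a menu. A minimal sketch, assuming a naming convention where grabbable assemblies carry a marker such as "GRAB" in their node name (that convention is an assumption, not a VRED feature); it works on any object exposing `getName()` and `getParent()`, as VRED's `vrdNode` does:

```python
def find_grab_root(node, marker="GRAB"):
    """Walk up from the picked node (e.g. a screw) to the first ancestor
    whose name contains `marker`; fall back to the picked node itself.
    Note: in VRED's v2 API the scene root's getParent() returns an invalid
    node rather than None, so an isValid() check may be needed there."""
    current = node
    while current is not None:
        if marker in current.getName():
            return current
        current = current.getParent()
    return node
```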
Thanks for that. This doesn't work for Pico controllers; I think I need to find out why.
Which Pico device do you have? I'm thinking about implementing custom hand tracking for the Quest 3, maybe the implementation will also work for the Pico glasses.
Hi. The GIFs you attached: I am able to do those hand movements separately with the Pico controller.
But how is it possible to do this with Python in VRED? I couldn't see any examples in the VRED examples.
Do we use external APIs like Ultraleap, or OpenCV?
I was able to do the hand tracking very easily with OpenCV.
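For reference, a common OpenCV-based route uses MediaPipe's hand landmarks. The capture loop below is a sketch (it requires the opencv-python and mediapipe packages; landmark indices 4 and 8 are thumb tip and index tip in MediaPipe's documented hand model), while the pinch measure itself is just the distance between those two points:

```python
import math

def pinch_distance(thumb_tip, index_tip):
    """Euclidean distance between two (x, y) landmark points; a small
    value means the thumb and index finger are pinching."""
    return math.hypot(thumb_tip[0] - index_tip[0],
                      thumb_tip[1] - index_tip[1])

def track_webcam():
    """Sketch of a MediaPipe Hands loop (needs opencv-python + mediapipe)."""
    import cv2                      # imported here so the helper above
    import mediapipe as mp          # stays usable without these packages
    hands = mp.solutions.hands.Hands(max_num_hands=2)
    cap = cv2.VideoCapture(0)
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        result = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if result.multi_hand_landmarks:
            lm = result.multi_hand_landmarks[0].landmark
            # 4 = thumb tip, 8 = index tip (normalized coordinates)
            d = pinch_distance((lm[4].x, lm[4].y), (lm[8].x, lm[8].y))
            print("pinch" if d < 0.05 else "open")
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break
    cap.release()
```

Getting those pinch events from OpenCV back into VRED (e.g. via a socket or the web interface) is a separate step.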
With the sample code I am able to do controls like sectioning with the controllers (Pico 4 controllers), but I'm not able to connect the blue hands (currently they don't move). I assume I have to connect the coordinates of the controllers to the left and right hands. See the picture.
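One way to drive the hand geometry from the controllers is to copy the controller transform onto the hand node every frame. A sketch using VRED's v1 Python functions `findNode`, `getTransformNodeMatrix` and `setTransformNodeMatrix` (the node names below are assumptions and must be adapted to your scene graph):

```python
def sync_hands_to_controllers():
    """Copy each controller's transform onto the matching hand node once;
    call this from a timer to update every frame. VRED-only globals."""
    pairs = [("LeftController", "HandLeft"),     # node names are assumed;
             ("RightController", "HandRight")]   # adjust to your scene graph
    for ctrl_name, hand_name in pairs:
        ctrl = findNode(ctrl_name)
        hand = findNode(hand_name)
        setTransformNodeMatrix(hand, getTransformNodeMatrix(ctrl))
```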