Ideally, we should have the option to generate grasp points for specific target avatars, scaling the hand w.r.t. the intended avatar. If no avatar is provided, we fall back to a default scale.
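A minimal sketch of the scaling logic described above. All names (`Avatar`, `hand_length`, `REFERENCE_HAND_LENGTH`) are hypothetical placeholders for whatever the project's avatar representation actually provides; the point is only the per-avatar scale with a default fallback:

```python
from dataclasses import dataclass
from typing import List, Optional, Tuple

Point = Tuple[float, float, float]

# Hand length (wrist to middle fingertip, metres) the raw grasp
# points were authored for -- an assumed reference value.
REFERENCE_HAND_LENGTH = 0.19
DEFAULT_HAND_SCALE = 1.0  # used when no target avatar is given

@dataclass
class Avatar:
    name: str
    hand_length: float  # measured on the avatar's hand mesh

def scale_grasp_points(points: List[Point],
                       avatar: Optional[Avatar] = None) -> List[Point]:
    """Scale grasp-point offsets (relative to the wrist) to the
    target avatar's hand size; default scale if no avatar is given."""
    if avatar is not None:
        scale = avatar.hand_length / REFERENCE_HAND_LENGTH
    else:
        scale = DEFAULT_HAND_SCALE
    return [(x * scale, y * scale, z * scale) for (x, y, z) in points]
```

A child-sized avatar with half the reference hand length would then pull all grasp offsets halfway toward the wrist.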
In addition, the hand should be adjusted w.r.t. two aspects:
(1) The visualization should be a proper meshed hand, ideally a copy of the target avatar's hand (if possible) or otherwise a deformable model. This would help with getting the distances to the object right and reduce mesh collisions.
(2) Using single-chain IK on each finger would allow adjusting only the fingertip positions, with the rest of the hand following.
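Point (2) can be sketched with a standard single-chain solver. This is not the project's IK code, just an illustrative 2D CCD (cyclic coordinate descent) solver for one finger modeled as a planar chain of segments; in the engine, the per-joint rotation limits and the third dimension would come from the hand rig:

```python
import math
from typing import List, Tuple

Vec2 = Tuple[float, float]

def fk(angles: List[float], lengths: List[float],
       base: Vec2 = (0.0, 0.0)) -> List[Vec2]:
    """Forward kinematics: positions of all joints (base .. fingertip)
    for a planar chain with relative joint angles."""
    pts = [base]
    total = 0.0
    for a, l in zip(angles, lengths):
        total += a
        x, y = pts[-1]
        pts.append((x + l * math.cos(total), y + l * math.sin(total)))
    return pts

def ccd_ik(angles: List[float], lengths: List[float], target: Vec2,
           iters: int = 50, tol: float = 1e-4) -> List[float]:
    """Rotate each joint (fingertip to base) so the tip moves toward
    `target`; the rest of the chain follows automatically."""
    angles = list(angles)
    for _ in range(iters):
        for i in reversed(range(len(angles))):
            pts = fk(angles, lengths)
            tip, (jx, jy) = pts[-1], pts[i]
            # Rotate joint i by the angle between joint->tip and joint->target.
            a_tip = math.atan2(tip[1] - jy, tip[0] - jx)
            a_tgt = math.atan2(target[1] - jy, target[0] - jx)
            angles[i] += a_tgt - a_tip
        tip = fk(angles, lengths)[-1]
        if math.hypot(tip[0] - target[0], tip[1] - target[1]) < tol:
            break
    return angles
```

Running one solver per finger keeps the chains independent, which matches the "adjust the fingertips only" idea: moving one fingertip target never disturbs the other fingers.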
Create a helper function to spawn hands for reach and grasp points, with the respective scripts for extracting constraints attached.
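A possible shape for that helper, sketched in Python. The script names (`ReachConstraintExtractor`, `GraspConstraintExtractor`) are hypothetical stand-ins for whatever constraint-extraction components the project actually has; the sketch only shows the dispatch from spawn kind to attached scripts:

```python
from dataclasses import dataclass
from typing import List, Tuple

Vec3 = Tuple[float, float, float]

# Hypothetical mapping from spawn kind to constraint-extraction scripts.
SCRIPTS_BY_KIND = {
    "reach": ["ReachConstraintExtractor"],
    "grasp": ["GraspConstraintExtractor"],
}

@dataclass
class HandSpawn:
    kind: str          # "reach" or "grasp"
    position: Vec3     # world-space wrist position of the spawned hand
    scripts: List[str] # constraint-extraction scripts attached to it

def spawn_hand(kind: str, position: Vec3) -> HandSpawn:
    """Spawn a hand for a reach or grasp point and attach the
    constraint-extraction scripts for that kind."""
    if kind not in SCRIPTS_BY_KIND:
        raise ValueError(f"unknown spawn kind: {kind!r}")
    return HandSpawn(kind=kind, position=position,
                     scripts=list(SCRIPTS_BY_KIND[kind]))
```

Keeping the kind-to-scripts mapping in one table means adding a new point type later only touches the table, not the spawn logic.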