Demo always predicts 2 hands #9
Comments
Hi, sorry about this. As you mentioned, the demo code is hard-coded for the two-hand case.
Hi, I still can't find where to change the confidence of the hand boxes.
Hi, sorry for the late reply. See Line 112 in 0521504.
Hi, sometimes it is obvious that there is only the right hand, but the confidence of the left-hand bbox will suddenly increase. Is there a standard for judging this?
Sorry, those could be wrong predictions from the model.
Then would it be correct to understand that DetectNet isn't robust enough to distinguish "one hand" from "two hands heavily occluding each other"? Or is this only a problem with the demo code itself?
I think you're right. There should be some room to improve DetectNet.
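For anyone else hitting this: since the demo is hard-coded for two hands, one workaround is to keep only the hand boxes whose predicted confidence clears a threshold before visualizing. This is a minimal sketch, not the repo's actual code; the names `hand_scores` and `HAND_CONF_THRESH` and the 0.5 cutoff are assumptions for illustration.

```python
# Hypothetical sketch: filter predicted hand boxes by confidence so that
# single-hand frames don't always render both hands. The threshold value
# and the score dictionary layout are assumptions, not taken from the repo.

HAND_CONF_THRESH = 0.5  # assumed cutoff; tune against your own model's scores


def select_hands(hand_scores, thresh=HAND_CONF_THRESH):
    """Return the hand labels whose predicted box confidence reaches thresh.

    hand_scores: dict mapping 'right'/'left' to the model's box confidence.
    """
    return [hand for hand, score in hand_scores.items() if score >= thresh]


# Example: a single-hand frame where the left-hand confidence is spurious.
scores = {"right": 0.93, "left": 0.12}
print(select_hands(scores))  # only the right hand survives the cutoff
```

The spurious left-hand confidences mentioned above mean a fixed threshold can still mis-fire on some frames, so this only mitigates the demo's hard-coding; it doesn't fix DetectNet itself.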
Hi, even on the InterHand2.6M set that the model was trained on, when I try to visualize predictions on single-hand images using demo.py, it shows both hands. I tried to debug and find where the number of hands is evaluated, but I couldn't; it looks like the two-hand case is hard-coded in the demo code (the 3D color was changed, but nothing more).
Thank you :)