Error should be more explicit when diverging #65
It happens on this dataset: https://archive.ics.uci.edu/dataset/713/auction+verification
I printed the feature weights, and it looks like some random feature (a different one each time) has the probability for the first outcome at 1, while the other outcomes have probabilities not equal to 0. Example:
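For illustration only (these are hypothetical weights, not the actual ones from the dataset): if one outcome already has probability 1 while the others are non-zero, the vector sums to more than 1, and numpy's multinomial sampler rejects it:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical weights: the first outcome has probability 1 while the
# others are non-zero, so the vector sums to 1.5 instead of 1.
pvals = np.array([1.0, 0.3, 0.2])

try:
    rng.multinomial(1, pvals)
except ValueError as exc:
    print("multinomial rejected pvals:", exc)
```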
Thanks for reporting this. The warnings suggest that the estimation is diverging and likely introducing NaNs in the parameters, which triggers errors in the sampling function. One way to address this is to rely on the simpler

I also see that you are using the

Regarding your second post on the sum of probabilities: are you sure this is triggering the error? This is likely the result of numerical calculations: the probabilities won't always perfectly sum to 1. For example, I tested the numpy multinomial sampling with the distribution
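A minimal sketch of that numerical point (not the distribution tested in the comment above): probabilities produced by a floating-point computation are usually only close to 1, yet numpy's sampler tolerates that tiny drift, while NaNs raise exactly the error reported in this issue:

```python
import numpy as np

rng = np.random.default_rng(0)

# Probabilities from a softmax-style computation: close to 1, but not
# guaranteed to sum to exactly 1.0 in float64.
logits = rng.normal(size=4)
pvals = np.exp(logits) / np.exp(logits).sum()
print(pvals.sum())  # very close to, but not necessarily exactly, 1.0

# Tiny rounding drift is tolerated by the sampler...
sample = rng.multinomial(10, pvals)
print(sample)

# ...but NaNs in pvals are not:
try:
    rng.multinomial(10, np.array([np.nan, 0.5, 0.5]))
except ValueError as exc:
    print(exc)
```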
Thanks for the help. I will try different models, but in my testing the
Added this as a potential improvement. Models diverging will happen, but we should have a better error message when they do.
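One way such an improvement could look, as a hypothetical sketch (`sample_outcome` is not part of the library discussed here): validate the fitted probabilities before handing them to numpy, so a diverged model fails with a message about the model rather than about `pvals`:

```python
import numpy as np

def sample_outcome(rng, pvals, n=1):
    """Hypothetical guard: fail with an explicit message when the
    fitted probabilities show that the estimation diverged."""
    pvals = np.asarray(pvals, dtype=float)
    if np.isnan(pvals).any():
        raise ValueError(
            "Fitted probabilities contain NaNs: the estimation likely "
            "diverged. Try a simpler model or rescale the features."
        )
    # Absorb harmless floating-point drift instead of tripping
    # numpy's sum-to-1 check.
    pvals = pvals / pvals.sum()
    return rng.multinomial(n, pvals)

rng = np.random.default_rng(0)
print(sample_outcome(rng, [0.1, 0.2, 0.7], n=5))
```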
I am getting

ValueError: pvals < 0, pvals > 1 or pvals contains NaNs

while running this code. Full Traceback:
So apparently it's related to the numpy random multinomial function.
I also get these runtime warnings: