[Re] The Discriminative Kalman Filter for Bayesian Filtering with Nonlinear and Non-Gaussian Observation Models #74
Very sorry for the late answer. We'll assign an editor soon. |
@benoit-girard @oliviaguest @gdetor Can any of you handle this submission? |
Sorry for the much-delayed answer: yes, I think I can handle this submission. |
@junpenglao : would you be interested in reviewing this submission? |
@rchavarriaga : would you be interested in reviewing this submission? |
@Josuelmet : do you have any reviewers you could suggest? I'm having difficulty identifying who, among the usual ReScience C reviewers, could do the job adequately. |
@benoit-girard Sure, |
@ozancaglayan @mxochicale @gdetor : would any of you be interested in reviewing this submission? |
Hello. I'm no longer in academia and have limited time, but I'll see if I can do it if nobody else volunteers. |
@ozancaglayan Your status wrt. academia is not a problem (we don't have such requirements at ReScience C), as long as you have the expertise and know-how to perform the review. |
Okay, you can assign this to me |
Review
First of all, I'd like to thank the authors: the paper is very clear and fun to read! The authors chose to replicate experiment 4.5, which they argue is the most significant type of experiment for BCI studies. The replication confirms the trends shown by the original authors, with one method underperforming and another outperforming the original results. I was particularly curious about the LSTM/NN approach, which often requires quite some tuning to get the best out of it; I'm fairly sure it could have surpassed the other methods given enough time and patience. The authors already show that by changing a couple of settings they were able to exceed the originally reported performance. Unfortunately, I'm having issues running the experiments, so let's first focus on making the code work, and then I'll follow up with the replication status. Test environments:
Readability
Issues
× python setup.py egg_info did not run successfully.
│ exit code: 1
╰─> [15 lines of output]
The 'sklearn' PyPI package is deprecated, use 'scikit-learn'
rather than 'sklearn' for pip commands.
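For anyone hitting this install error: the deprecated `sklearn` shim on PyPI fails deliberately at install time, while the Python import name is unaffected. Assuming the repo's requirements pin `sklearn` (an assumption; I have not checked the file), swapping that entry for `scikit-learn` should resolve the egg_info failure without any source changes:

```python
# After installing the 'scikit-learn' distribution (pip install scikit-learn),
# the import name in code remains 'sklearn', so nothing else needs to change:
import sklearn

print(sklearn.__version__)  # sanity check that the real package is installed
```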
Iteration 0:
DKF-NN
/home/ozan.caglayan/Discriminative-Kalman-Filter-4.5-Python/optimize_sx.py:43: RuntimeWarning: Mean of empty slice
return np.nanmean((xtrain - preds) ** 2)
DKF-GP
/home/ozan.caglayan/Discriminative-Kalman-Filter-4.5-Python/optimize_sx.py:43: RuntimeWarning: Mean of empty slice
return np.nanmean((xtrain - preds) ** 2)
Traceback (most recent call last):
File "/home/ozan.caglayan/Discriminative-Kalman-Filter-4.5-Python/Paper_Script_45.py", line 268, in <module>
zpred.loc[iRun, "DKF_GP"] = DKF_filtering(
^^^^^^^^^^^^^^
File "/home/ozan.caglayan/Discriminative-Kalman-Filter-4.5-Python/DKF_filtering.py", line 30, in DKF_filtering
if not np.all(eig(inv(bb) - inv(VZ))[0] > 0):
^^^^^^^^^^^^^^^^^^^^^^
File "/home/ozan.caglayan/mambaforge/envs/kalman/lib/python3.12/site-packages/numpy/linalg/linalg.py", line 1329, in eig
_assert_finite(a)
File "/home/ozan.caglayan/mambaforge/envs/kalman/lib/python3.12/site-packages/numpy/linalg/linalg.py", line 218, in _assert_finite
raise LinAlgError("Array must not contain infs or NaNs")
numpy.linalg.LinAlgError: Array must not contain infs or NaNs
Other
|
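For context on the traceback above: the "Mean of empty slice" warnings indicate that NaNs already appear in the held-out predictions inside optimize_sx.py, and they then propagate into the covariances that DKF_filtering.py feeds to eig, which refuses non-finite input. A minimal defensive version of that check is sketched below; the helper name and the fallback behavior are assumptions on my part, not the repo's actual code.

```python
import numpy as np
from numpy.linalg import inv, eigvalsh


def posterior_cov_is_valid(bb, VZ):
    """Hypothetical guard around the check at DKF_filtering.py line 30.

    numpy.linalg.eig raises LinAlgError on non-finite input, so test
    finiteness first and report an invalid covariance instead of crashing.
    """
    if not (np.all(np.isfinite(bb)) and np.all(np.isfinite(VZ))):
        return False  # NaNs/infs propagated from upstream estimates
    M = inv(bb) - inv(VZ)
    # eigvalsh is the natural choice for this symmetric matrix and avoids
    # the general eig call that crashed in the traceback above
    return bool(np.all(eigvalsh(M) > 0))
```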
Thanks for responding @ozancaglayan ! |
@ozancaglayan Thanks a lot for your review. |
I am asking again in the hope that this notification gets through: @mxochicale @gdetor : would any of you be interested in reviewing this submission? |
@ozancaglayan I updated the repo a bit, here is my progress:
I will let you know when I have cleared the Linux issues. Edit: I have cleared the Linux issues in the new code as of 3/6/2024. |
@ozancaglayan I have figured out the issue behind the instability, and there are two ways to fix it:
I recommend the latter; let me know when you are able to give running the code another chance. P.S. I ran the code for one seed (42) in a new Linux environment (Colab); some of the numbers for Table 2 jittered a bit, but overall the trends were the same. The new numbers matched Table 3 very closely, with some jitter in the DKF-NN row. |
@ozancaglayan were you able to run the code again? |
Sorry for the delay, I'm on leave with no access to a computer. I will look into it after July 15.
|
Hi,
Remarks about the results in Table 2
Remarks about the results in Table 3
Some minor/further improvements:
/tmp/ipykernel_387075/859728690.py:10: FutureWarning: Setting an item of incompatible dtype is deprecated and will raise an error in a future version of pandas. Value '-19%' has dtype incompatible with float64, please explicitly cast to a compatible dtype first.
table1.loc[idx] = table1.loc[idx].map(lambda n: '{:.0%}'.format(n))
Conclusion
@Josuelmet I'm curious about your take on the result differences. Thanks! |
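A plausible fix for the pandas FutureWarning quoted above (a sketch only: the DataFrame contents, column names, and `idx` are stand-ins for the reviewer's objects): cast the affected data to object dtype before writing percentage strings into float64 columns.

```python
import pandas as pd

# Stand-in for the reviewer's table1/idx; the values are illustrative only.
table1 = pd.DataFrame({"DKF-NN": [0.95, -0.19], "LSTM": [0.90, 0.02]})
idx = 1

# Explicitly cast before assigning strings into float64 columns, as the
# FutureWarning asks; the formatting line itself stays unchanged.
table1 = table1.astype(object)
table1.loc[idx] = table1.loc[idx].map(lambda n: "{:.0%}".format(n))
print(table1)
```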
@ozancaglayan I took a deep look at the differences between two runtimes (one on my PC, and one on Colab), and even when using the same version of PyTorch, I was unable to make the two sets of results fully agree, particularly for the simple NN-based architectures: DKF-NN, EKF/UKF. My best guess is that this difference stems from somewhere within PyTorch. The LSTM seems to be more stable since it runs for much fewer epochs than the DKF-NN, EKF, and UKF. Between my two runtimes, I looked at the NN losses, and for some reason they would agree fully up until 330 epochs, after which they diverged. As a result, I tried modifying the training parameters to use fewer epochs, but in the end I was unable to fully remove all discrepancies between my two runtimes. Let me know what you think. Thanks! |
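For readers trying to reproduce this themselves: a best-effort seeding setup along the lines below is the usual starting point (a sketch, not the authors' actual configuration). Even with it, PyTorch does not promise bit-identical results across different machines or builds, which is consistent with the divergence described above.

```python
import os
import random

import numpy as np
import torch


def seed_everything(seed: int = 42) -> None:
    """Best-effort determinism within a single runtime.

    Cross-machine agreement is still not guaranteed: different CPUs/GPUs
    and BLAS/cuDNN builds can order floating-point reductions differently.
    """
    random.seed(seed)
    np.random.seed(seed)
    torch.manual_seed(seed)
    torch.cuda.manual_seed_all(seed)
    # Fail loudly if an op has no deterministic implementation
    torch.use_deterministic_algorithms(True)
    torch.backends.cudnn.benchmark = False
    os.environ["CUBLAS_WORKSPACE_CONFIG"] = ":4096:8"  # required on CUDA
```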
@benoit-girard Any progress on that? |
Hi everyone! |
I had some minor remarks in my last comment. Besides the inconsistencies in a small part of the results that we discussed, I'm fine with this, sorry for the delays! |
Original article: The Discriminative Kalman Filter for Bayesian Filtering with Nonlinear and Non-Gaussian Observation Models
PDF URL: https://github.com/Josuelmet/Discriminative-Kalman-Filter-4.5-Python/blob/main/article/article.pdf
Metadata URL: https://github.com/Josuelmet/Discriminative-Kalman-Filter-4.5-Python/blob/main/article/metadata.yaml
Code URL: https://github.com/Josuelmet/Discriminative-Kalman-Filter-4.5-Python/tree/main
Scientific domain: Computational Neuroscience
Programming language: Python
Suggested editor: