How to fix a seed in training CEBRA? #145
-
Hi, guys! I've just started using CEBRA in my research project. Thanks for developing this innovative tool! Is there a way to fix a random seed when training a CEBRA model? I searched the Docs and the discussion panel here, but I failed to find any meaningful information ;-( Sincerely,
Replies: 3 comments 1 reply
-
Thanks for using CEBRA :) So, one of its strengths is its inherent consistency: if you train the model and it's learning (we can discuss that), then even if you run it 100 times you should get nearly the same result; see Figure 1 in the paper. We have built-in functions for you to compute consistency metrics.
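For readers curious what "consistency across runs" means operationally, here is a minimal, self-contained sketch in plain NumPy. It is not CEBRA's own implementation (the library ships a helper along the lines of `cebra.sklearn.metrics.consistency_score` for this); the idea is simply that embeddings from repeated runs should match up to a linear transformation, which we can quantify with the R² of a least-squares fit between two runs:

```python
import numpy as np

def consistency_r2(emb_a, emb_b):
    """Fit a linear map from emb_a to emb_b and return the R^2 of the fit.

    A simplified stand-in for the consistency metric described in the
    CEBRA paper: embeddings from repeated runs are compared up to a
    linear transformation. Not the library's own implementation.
    """
    # Solve emb_a @ W ~= emb_b in the least-squares sense.
    W, *_ = np.linalg.lstsq(emb_a, emb_b, rcond=None)
    pred = emb_a @ W
    ss_res = np.sum((emb_b - pred) ** 2)
    ss_tot = np.sum((emb_b - emb_b.mean(axis=0)) ** 2)
    return 1.0 - ss_res / ss_tot

# Two "runs" that differ only by a rotation are perfectly consistent.
rng = np.random.default_rng(0)
emb1 = rng.normal(size=(500, 3))
theta = np.pi / 4
rot = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                [np.sin(theta),  np.cos(theta), 0.0],
                [0.0,            0.0,           1.0]])
emb2 = emb1 @ rot
print(round(consistency_r2(emb1, emb2), 4))  # close to 1.0
```

An R² near 1 between runs means the embeddings agree up to a linear map, which is the sense in which re-running CEBRA "should already get nearly the same result" without fixing a seed.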
-
Hi Jinwoo, please see @MMathisLab 's response on how to resolve this. We have not implemented seeding yet for this reason: there are many sources of randomness in training (model init, sampling from various distributions, non-deterministic CUDA kernels), and there is no real advantage to fixing the seed given the consistency properties. Does this clarify the question, or do you have a specific reason why you would like to seed the model?
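For completeness: CEBRA itself does not expose a seed parameter, but since it trains with PyTorch, the standard PyTorch recipe below controls the seedable sources of randomness mentioned above (model init, sampling). This is a generic sketch, not a CEBRA API, and note that non-deterministic CUDA kernels can still cause run-to-run variation even after seeding:

```python
import random

import numpy as np
import torch

def seed_everything(seed: int) -> None:
    """Seed the common RNGs used in a PyTorch training loop.

    Generic PyTorch recipe (not a CEBRA API). Non-deterministic CUDA
    kernels may still introduce small run-to-run differences.
    """
    random.seed(seed)        # Python's built-in RNG
    np.random.seed(seed)     # NumPy-based sampling
    torch.manual_seed(seed)  # CPU and all CUDA RNGs (model init, dropout)
    torch.backends.cudnn.deterministic = True  # prefer deterministic kernels
    torch.backends.cudnn.benchmark = False     # disable autotuned kernels

# Seeding makes e.g. a model-initialization draw reproducible:
seed_everything(42)
init_a = torch.randn(4)
seed_everything(42)
init_b = torch.randn(4)
print(torch.equal(init_a, init_b))  # True
```

Even with all of this in place, the consistency metrics above are the better way to check that your CEBRA results are stable, which is why seeding was not prioritized.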
-
Hi @Jinwoo-Yi , just wanted to check in here. Did we address your question? If so, feel free to mark one of the replies as the answer to your question. :)