
hardware requirements #21

Open
antgr opened this issue Aug 27, 2019 · 4 comments
Comments


antgr commented Aug 27, 2019

Hi, I ran the experiment on my machine and also in Colab (https://colab.research.google.com/drive/10z-ZpmTRBIegicA4p9ueA_BOLet-7fHJ),
but both halt: my machine at 1810932it [37:24, 2382.33it/s] and Colab at 1748626it [21:51, 1.29s/it].
So, what are the hardware requirements to run it smoothly?


antgr commented Aug 30, 2019

@titipata ?

titipata (Owner) commented
Hi @antgr, I used one of the Lambda machines (https://lambdalabs.com/deep-learning/workstations/4-gpu) to train the model. It's probably the GPU memory that causes the problem for you. I'll have a more refined answer later on.


antgr commented Oct 15, 2019

Hi @titipata, is there any workaround I could use to train with one GPU, even if the final model will be less capable? Specifically, I would like to jointly train your model with another argumentation mining task. Do you think your model could help me on the other task?

titipata (Owner) commented

@antgr I actually train with one GPU. However, GPU memory usage probably gets a bit high, ~6-7 GB (out of a maximum of 10 GB). I'd say the easiest workaround is to reduce the batch size or the size of the model.

Definitely, I think this will help improve other tasks, especially if the argument mining task is in the science domain.
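A common way to reduce the batch size without changing the effective update (the standard trick when a batch doesn't fit in GPU memory) is gradient accumulation: compute gradients over small micro-batches and average them before applying one optimizer step. The repo's actual training config isn't shown here, so this is only a generic toy sketch of the idea, with a made-up linear model and hypothetical function names:

```python
# Toy sketch of gradient accumulation (hypothetical example, not the repo's code):
# splitting one "logical" batch into micro-batches and averaging their gradients
# yields the exact same update as a full-batch step, with less memory in use.

def grad(batch, w):
    # Gradient of mean squared error for the toy model y_hat = w * x.
    return sum(2 * (w * x - y) * x for x, y in batch) / len(batch)

def full_batch_step(batch, w, lr=0.01):
    # Ordinary update: needs the whole batch in memory at once.
    return w - lr * grad(batch, w)

def accumulated_step(batch, w, lr=0.01, micro=2):
    # Same update computed from micro-batches of size `micro`:
    # weight each micro-batch gradient by its size, then apply one update.
    chunks = [batch[i:i + micro] for i in range(0, len(batch), micro)]
    g = sum(grad(c, w) * len(c) for c in chunks) / len(batch)
    return w - lr * g

data = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0), (4.0, 8.0)]
w_full = full_batch_step(data, w=0.0)
w_accum = accumulated_step(data, w=0.0, micro=2)
assert abs(w_full - w_accum) < 1e-12  # identical update, smaller memory footprint
```

In a real PyTorch training loop the equivalent is to call `loss.backward()` on each micro-batch (gradients accumulate in `.grad` by default) and only call `optimizer.step()` and `optimizer.zero_grad()` once per logical batch.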
