Data Commons JSON upload error #16
The JSON decode error is likely due to the website returning an error response that is not JSON, perhaps HTML. I also tried uploading a file using
Next I am trying to upload a 2G file, but based on the earlier performance this could take 1 hour.
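On the decode-error point above: if the client always calls `.json()` on the response, an HTML error page from the server will surface as a JSON decode error instead of the real status. A minimal sketch of guarding against that, assuming the upload is done with Python requests (the helper name is made up for illustration, not dva's actual code):

```python
import requests

def post_and_decode(url, **kwargs):
    """POST, but surface non-JSON error pages instead of a bare JSONDecodeError."""
    resp = requests.post(url, **kwargs)
    content_type = resp.headers.get("Content-Type", "")
    if resp.status_code >= 400 or "application/json" not in content_type:
        # Show the status and the start of the raw body (often HTML for a 500).
        raise RuntimeError(
            f"Upload failed: HTTP {resp.status_code}, Content-Type {content_type!r}, "
            f"body starts with {resp.text[:200]!r}"
        )
    return resp.json()
```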
I tested uploading a 50 MB file on OSC with curl, and it uploaded much faster.
It seems hit or miss with the 500 error. I tried uploading the same 2G file on OSC and it uploaded fine.
Code that often reproduces the 500 error (after filling in TODO with your datacommons token):
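The snippet itself is not reproduced above, but the request it makes presumably looks something like the sketch below, assuming Data Commons exposes a Dataverse-style native API (the host, DOI, and file name are placeholders, and the endpoint is an assumption rather than confirmed):

```python
import requests

API_TOKEN = "TODO"                           # your Data Commons API token
SERVER = "https://datacommons.example.edu"   # placeholder host
DOI = "doi:10.00000/FAKE/EXAMPLE"            # placeholder dataset DOI

# Note: passing the file via `files=` builds the multipart body in memory,
# so a 2G upload also needs roughly that much RAM.
with open("large_file.zip", "rb") as f:
    resp = requests.post(
        f"{SERVER}/api/datasets/:persistentId/add",
        params={"persistentId": DOI},
        headers={"X-Dataverse-key": API_TOKEN},
        files={"file": f},
        timeout=3600,  # large uploads can legitimately run for a long time
    )

print(resp.status_code)
print(resp.text[:500])  # when a 500 occurs, the body is often HTML rather than JSON
```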
So a 500 error is the server's fault, right? Would it make sense to split files over 1 GB into smaller parts, so each part takes less time to upload and is less likely to hit a 500 mid-transfer, and to add some error handling that retries when a 500 occurs?
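A rough sketch of the retry idea, reopening the file on each attempt so every try sends the full content (the function and its parameters are illustrative, not existing dva code):

```python
import time
import requests

def upload_with_retries(url, path, headers=None, params=None, attempts=5, wait=60):
    """POST a file, retrying only on 5xx responses (the server's fault, not ours)."""
    resp = None
    for attempt in range(1, attempts + 1):
        with open(path, "rb") as f:          # reopen so each attempt sends the whole file
            resp = requests.post(url, headers=headers, params=params,
                                 files={"file": f}, timeout=3600)
        if resp.status_code < 500:
            return resp                      # success, or a client error not worth retrying
        print(f"Attempt {attempt}: HTTP {resp.status_code}, retrying in {wait}s")
        time.sleep(wait)
    return resp                              # still failing after the last attempt
```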
Wouldn't that mean a user would need to merge the file parts back together after downloading?
That could be integrated into the download command. Or might it be better to check with Data Commons support on what might be causing the 500 error and see if that root issue can be sorted out? I feel like this shouldn't be normal, but maybe there's a fundamental technical limitation on their side that we should be prepared to deal with. I don't know enough about networking to know what exactly is normal.
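If splitting does turn out to be necessary, the mechanics are straightforward on both ends; here is a sketch of hypothetical split/merge helpers that the upload and download paths could wrap (names invented for illustration, not part of dva):

```python
import os

def split_file(path, part_size=1024**3, buf_size=64 * 1024**2):
    """Split `path` into path.part000, path.part001, ...; returns the part names."""
    parts = []
    with open(path, "rb") as src:
        index = 0
        while True:
            part_name = f"{path}.part{index:03d}"
            written = 0
            with open(part_name, "wb") as dst:
                while written < part_size:
                    buf = src.read(min(buf_size, part_size - written))
                    if not buf:
                        break
                    dst.write(buf)
                    written += len(buf)
            if written == 0:                 # nothing left to read; drop the empty part
                os.remove(part_name)
                break
            parts.append(part_name)
            index += 1
    return parts

def merge_parts(parts, out_path, buf_size=64 * 1024**2):
    """Reassemble downloaded parts (sorted by name) into the original file."""
    with open(out_path, "wb") as dst:
        for part_name in sorted(parts):
            with open(part_name, "rb") as src:
                while True:
                    buf = src.read(buf_size)
                    if not buf:
                        break
                    dst.write(buf)
```

If the merge step were folded into the download command, users would never have to handle the parts themselves.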
I attempted to upload a 4.45 GB zip file to Data Commons using `dva upload <zipfile> <doi>`. After about an hour of the terminal stating `Uploading <zipfile>`, it printed a JSON decode error.