Limited to 100TB storage #370

Open
zenjabba opened this issue Aug 2, 2016 · 13 comments
zenjabba commented Aug 2, 2016

acd_cli reports 100TB of storage to FUSE, so once you go above that amount, it will report "out of space".

Can we report 1PB of free space?

charlymr commented Aug 3, 2016

I would say Amazon will ask you questions before you reach that...
It is unlimited, with a fair-usage policy... Are you close to that amount of space already? Just curious whether Amazon has told you anything?

zenjabba commented Aug 3, 2016

Yes, I am close to that amount, and Amazon has not questioned the amount stored in my two accounts.


charlymr commented Aug 3, 2016

Fair play 👍

jetbalsa commented Aug 4, 2016

I think the 100TB figure is the quota data from Amazon's endpoint; if you look in .cache/endpoint_cache, it says the max size is 100TB.

I think this might be Unlimited* on ACD's part

asabla commented Aug 5, 2016

I've noticed this as well. @zenjabba, could you please comment here again with the result if you get past 100TB?


zenjabba commented Aug 6, 2016

So I've looked at other systems, and they also report 100TB as the available storage space. NetDrive, for example, reports

100TB of 100TB Free

[screenshot: screen shot 2016-08-05 at 8 02 14 pm]

so I don't believe it's a limitation on Amazon's side.

bryan commented Aug 9, 2016

It appears to be a 100TB soft cap, where you can call Amazon to allow for more space (probably have to give them a good reason given that 100TB is quite a bit of storage). Not certain though since I can't seem to find anyone hitting the 100TB cap. Keep us in the loop!

streamaserver/streama#63

roaima commented Sep 24, 2016

If it's any help, S3QL (another cloud filesystem) reports double the usage with a minimum of 1TB, so you never exceed 50% of the reported space.
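[Editor's note: the S3QL-style scheme described above can be sketched as follows. This is a hypothetical helper for illustration, not S3QL's actual code.]

```python
TIB = 1024 ** 4  # one tebibyte in bytes


def reported_size(used_bytes: int) -> int:
    """Advertise double the current usage, with a 1 TiB floor,
    so reported usage never exceeds 50% of the volume."""
    return max(2 * used_bytes, TIB)
```

With this policy an empty filesystem reports 1 TiB total, and a filesystem holding 3 TiB reports 6 TiB, so the volume always appears at most half full.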

@zenjabba
Author

The problem is S3QL doesn’t support ACD, so apples and oranges (last time I looked at S3QL it didn’t)

@yadayada yadayada added the FUSE label Sep 24, 2016
@yadayada
Owner

Does the reported disk size really have any practical implications?

@zenjabba
Author

Yes. When you try to upload a file and the disk reports 100% usage, you get an error.


@tristaoeast

@zenjabba did you try uploading using rclone? I'm just curious to see if it presents the same limitation. I'm currently using acdcli to mount my ACD and rclone to upload to it

(Sorry @yadayada for PRing the competition; it's just that I'm more familiar with the rsync syntax -- although you have some interesting options in your upload command that I still ought to try.)

@zenjabba
Author

So finally got a chance to get back to this

ACDFuse 107374182400 -9444732965617890843136 -14025401856 100% /mnt/amazon

is what is reported when it's over 100TB. Can we please just get it to report 1PB of storage so it will never run out?

Thanks
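[Editor's note: the requested change would live in the filesystem's statfs handler. Below is a minimal sketch assuming a fusepy-style interface, where statfs returns a dict of statvfs fields; the handler body is hypothetical and is not acd_cli's actual code.]

```python
PIB = 1024 ** 5  # one pebibyte in bytes
BLOCK = 512      # advertised block size in bytes


def statfs(used_bytes: int = 0) -> dict:
    """Report a fixed 1 PiB volume so the kernel never sees it as full."""
    total_blocks = PIB // BLOCK
    free_blocks = max(total_blocks - used_bytes // BLOCK, 0)
    return {
        'f_bsize': BLOCK,    # preferred I/O block size
        'f_frsize': BLOCK,   # fundamental block size
        'f_blocks': total_blocks,
        'f_bfree': free_blocks,
        'f_bavail': free_blocks,
    }
```

Even with 100 TiB stored, the reported volume is only about 10% used, so uploads would no longer fail with "out of space" at the 100TB mark.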
