feat(server): [WIP] migration endpoints from google photos #9804
Conversation
An alternative approach might be:
This seems more reliable to me than having a user upload big zip files to the server (residential upload speeds are often very slow). One downside is that you would have to provide your own Google API key for the server. Edit: It seems that exporting Google Photos data is sadly not yet supported by the API, according to this bug report.
If we are adding support for the Google API (which would require creating an API app), wouldn't it be easier to skip the takeout and have a background process download each picture/album individually? Smells like scope creep to me, though. The feature described in the PR as-is would be a good first step. My 2 cents: after the migration, are these files deleted? If the files stay on the server, we might need to ensure that only authorized users can get info about or the contents of them.
Re: directly accessing the Google Photos API, I've understood that it has issues where it (sometimes?) doesn't return the original-quality files.
Currently, nothing. The zip files just sit on disk. I already have the code in there to auto-delete on successful migration; I just need to call it. And yeah, this tool will only really work for those with good connections or small takeout files. That's just the nature of the beast, though, with Google not having proper API support for takeout. Larger migrations will have to be done via the command line.
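The auto-delete step described above could look something like the sketch below: once a migration reports success, remove every uploaded zip from the staging directory. All names here (`cleanupTakeoutZips`, the directory layout) are illustrative assumptions, not the PR's actual code.

```typescript
import { promises as fs } from "fs";
import * as path from "path";

// Hypothetical cleanup step: after a successful migration, delete every
// takeout zip left in the upload directory and report how many were removed.
async function cleanupTakeoutZips(uploadDir: string): Promise<number> {
  const entries = await fs.readdir(uploadDir);
  let removed = 0;
  for (const entry of entries) {
    // Only touch zip archives; leave any other files alone.
    if (entry.toLowerCase().endsWith(".zip")) {
      await fs.unlink(path.join(uploadDir, entry));
      removed++;
    }
  }
  return removed;
}
```

A real implementation would presumably only run this after the migration library confirms success, so a failed run leaves the zips available for retry.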
And yeah, while using the API would be a hell of a lot simpler for the users it would make things worse:
AFAIK, this is the simplest method to get a full-quality import with all location data intact.
Would it be reasonable to allow for starting a takeout import, then manually placing the zip files in the appropriate folder rather than uploading?
Yeah, that shouldn't be hard at all. My current system doesn't take into account the different users on a server (so every user can see and import any zip files in the
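One way to address the multi-user concern raised here is to scope the upload directory per user and reject any zip path outside that user's own subdirectory. This is a minimal sketch under assumed names (`takeoutDirFor`, `canAccessZip`, a flat base directory), not the PR's design.

```typescript
import * as path from "path";

// Build the per-user staging directory, rejecting ids that could
// traverse outside the base directory (e.g. "..", slashes).
function takeoutDirFor(baseDir: string, userId: string): string {
  if (!/^[A-Za-z0-9-]+$/.test(userId)) {
    throw new Error(`invalid user id: ${userId}`);
  }
  return path.join(baseDir, userId);
}

// A user may only see or import zips inside their own subdirectory.
function canAccessZip(zipPath: string, baseDir: string, userId: string): boolean {
  const resolved = path.resolve(zipPath);
  return resolved.startsWith(path.resolve(takeoutDirFor(baseDir, userId)) + path.sep);
}
```

With this shape, the "drop files in manually" workflow also stays safe: an admin places archives under the target user's directory and no other account can list or import them.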
I'm going to close this PR for now. In the future we'll probably look at adding support for https://dtinit.org/. |
(Demo video attachment: 2024-05-27.15-43-46.1.mp4)
The goal of this PR is to lay the groundwork for a seamless and simple way to migrate from Google Photos to Immich via the Web-UI.
So this PR includes 3 new endpoints:
The idea behind these endpoints is the user can drag and drop individual zip files from the takeout into a new page in the administration panel, and then they can click a button to begin the migration once all zip files are uploaded.
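The upload-then-migrate flow described above can be sketched as a small state machine: zips accumulate while the migration is in the "uploading" state, and a single "begin" action closes uploads and kicks off processing. The class and method names below are hypothetical, not the PR's actual endpoints.

```typescript
// Hypothetical server-side tracker for one takeout migration session.
type MigrationState = "uploading" | "migrating" | "done";

class TakeoutMigration {
  private zips = new Set<string>();
  state: MigrationState = "uploading";

  // Called once per uploaded archive (e.g. from an upload endpoint).
  addZip(name: string): void {
    if (this.state !== "uploading") throw new Error("uploads are closed");
    this.zips.add(name);
  }

  // Called when the user clicks "begin migration"; returns the number
  // of archives queued for extraction.
  begin(): number {
    if (this.zips.size === 0) throw new Error("no zip files uploaded");
    this.state = "migrating";
    return this.zips.size;
  }
}
```

In a real NestJS controller each transition would map to one endpoint, but the core invariant is the same: no new uploads are accepted once the migration has started.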
This PR is very much WIP and should not be viewed as the final product. I just want to see what you guys think about this concept and the direction I'm taking it before I fill out all the tests and begin on the Web-UI. Also, while immich-go is the community choice for large migrations, I instead used google-photos-migrate, as it has an npm package that can easily be used with the server stack (as opposed to Go, which could be integrated but would be more complicated). It is likely that immich-go is better suited for large imports, but this feature is targeted at those with smaller libraries that can be successfully uploaded over a web UI.