What does HackerNews think of gphotos-sync?
Google Photos and Albums backup with Google Photos Library API
It's not perfect, but as a backup, it works well.
Please, if you aren't backing up your cloud storage photos/important files, stop reading HN and go and set it up now.
I can recommend Backblaze for cheap, reliable storage and the restic backup client, which is brilliant (a single, small binary). There's also rclone, or even the Backblaze CLI client. [Sorry, I sound like a Backblaze shill!]
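For what it's worth, here is a minimal sketch of that kind of restic-to-Backblaze job, written as a small Python wrapper you could call from a scheduler. The bucket name, source path and credentials are placeholders, and it assumes restic is installed and the repository was already created with restic init:

    # Hypothetical backup job: push the photo archive to a Backblaze B2
    # bucket with restic. The credentials, bucket, path and source
    # directory are all placeholders.
    import os
    import subprocess

    env = dict(
        os.environ,
        B2_ACCOUNT_ID="your-b2-key-id",
        B2_ACCOUNT_KEY="your-b2-application-key",
        RESTIC_PASSWORD="your-repository-password",
    )

    subprocess.run(
        ["restic", "-r", "b2:my-photo-bucket:backups", "backup", "/data/photos"],
        env=env,
        check=True,  # raise if the backup fails so the scheduler can flag it
    )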
Then I have a scheduled task that starts the Docker container at 2am every morning; it automatically backs up any new images and then the container stops.
Works really well.
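The scheduled job can be as small as a Python wrapper like the one below, called from cron or Task Scheduler at 2am. The image name and mount points are assumptions based on my reading of the gphotos-sync docs, so adjust them to whatever your container setup actually uses:

    # Hypothetical 2am job: start the gphotos-sync container, let it download
    # any new images, then exit. "--rm" removes the stopped container.
    import subprocess

    subprocess.run(
        [
            "docker", "run", "--rm",
            "-v", "/srv/gphotos/config:/config",    # OAuth client secret + token (assumed mount point)
            "-v", "/srv/gphotos/storage:/storage",  # where the photos land (assumed mount point)
            "ghcr.io/gilesknap/gphotos-sync",       # image name as I recall it; worth double-checking
            "/storage",
        ],
        check=True,
    )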
It works extremely well, and you can re-run it any time to sync new files locally too.
I used it like that for a few months before I finally installed syncthing on my phone and stopped using Google Photos altogether.
Now what I do is take photos on my phone and have them sync to a NAS. On the NAS I use a modified version of this https://forum.syncthing.net/t/android-photo-sync-with-exifto... to build up a YYYY/MM folder organisation and move files older than 30 days from the syncthing folder into my archive. My archive is then in my Plex, so it's still accessible to me.
In essence: 1) take a photo (it syncs to the NAS implicitly), 2) wait 30 days, 3) archive the photo under the long-term directory naming convention, making it available to Plex and removing it from my phone (deleting the syncthing copy deletes the one on the phone too). A rough sketch of the archive step is below.
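This is a standard-library-only sketch of step 3: anything in the syncthing folder older than 30 days gets moved into an ARCHIVE/YYYY/MM tree. The forum script uses exiftool to read the real capture date; this stand-in just uses the file's modification time, and the paths are placeholders:

    import shutil
    import time
    from datetime import datetime
    from pathlib import Path

    SYNC_DIR = Path("/nas/syncthing/camera")   # placeholder
    ARCHIVE_DIR = Path("/nas/archive/photos")  # placeholder
    CUTOFF = time.time() - 30 * 24 * 3600      # 30 days ago

    for photo in SYNC_DIR.iterdir():
        if not photo.is_file() or photo.stat().st_mtime > CUTOFF:
            continue
        taken = datetime.fromtimestamp(photo.stat().st_mtime)
        dest = ARCHIVE_DIR / f"{taken:%Y}" / f"{taken:%m}"
        dest.mkdir(parents=True, exist_ok=True)
        # Moving the file out of the syncthing folder is what eventually
        # removes the copy on the phone as well.
        shutil.move(str(photo), str(dest / photo.name))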
For reference, check the best-effort project https://github.com/gilesknap/gphotos-sync and its "Known Issues with Google API" section.
In short:
* photos lose some metadata, including GPS
* you will lose your RAW images (AFAIK)
* videos are always transcoded with a lossy conversion
I've walked into the same trap. I'll probably use Google Takeout to hopefully recover my own data and treat Google Photos as an ephemeral destination from now on.
My reasoning is that I don't trust Google not to lock me out of my account at some point, so having both a local and a remote backup gives me peace of mind. I periodically verify that the offsite backup is still working. Total cost for about a terabyte of files (it's not only photos and videos) is about $6/month, which is pretty reasonable.
I run one on my wife's account with a cron job to grab new photos daily.
There's a decent guide here: https://ubuntu.com/blog/safely-backup-google-photos.
I run this every night on a raspberry pi, syncing them to my local NAS which is in turn backed up to cloud storage.
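As a rough sketch (assuming gphotos-sync and rclone are installed and already authorised, and with placeholder paths and remote names), a nightly job like that can boil down to:

    # Nightly job (e.g. "0 2 * * *" in crontab): pull new items with
    # gphotos-sync, then mirror the local tree to cloud storage with rclone.
    import subprocess

    LOCAL_DIR = "/mnt/nas/google-photos"  # placeholder

    # Incremental: re-running only fetches items added since the last run.
    subprocess.run(["gphotos-sync", LOCAL_DIR], check=True)

    # Off-site copy of the local tree.
    subprocess.run(
        ["rclone", "sync", LOCAL_DIR, "b2-photos:my-bucket/google-photos"],
        check=True,
    )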
I back up everything to my local server.
I also do some side work; feel free to email me if you would like me to help you set this up. If you have a server, we can do it all remotely. p n u t@ (borowicz).org, remove spaces and ()
I've used this 3rd party tool and it worked OK: https://github.com/gilesknap/gphotos-sync/
I run it nightly from my home server. It has a few limitations (due to limitations of the Google Photos API) but overall it does what you want.
https://github.com/gilesknap/gphotos-sync
is a command-line tool (in Python) that downloads (incrementally) your Google Photos content directly using the Photos API, not through Drive. I just set it up recently.
The two main caveats are the lack of location info and that it won't re-download photos that have been edited.
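If you want to verify the location caveat on your own archive, a spot-check like the sketch below (assuming a reasonably recent Pillow and a placeholder path) counts how many of the downloaded JPEGs still carry a GPS block in their EXIF data:

    from pathlib import Path
    from PIL import Image

    GPS_IFD = 0x8825  # EXIF pointer tag for the GPS info block
    root = Path("/mnt/nas/google-photos")  # placeholder

    with_gps = without_gps = 0
    for jpg in root.rglob("*.jpg"):
        with Image.open(jpg) as img:
            if img.getexif().get_ifd(GPS_IFD):
                with_gps += 1
            else:
                without_gps += 1

    print(f"{with_gps} files with GPS data, {without_gps} without")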