What does HackerNews think of gphotos-sync?

Google Photos and Albums backup with Google Photos Library API

Language: Python

I pull mine from Google Photos, with a docker image running a tool: https://github.com/gilesknap/gphotos-sync

It's not perfect, but as a backup, it works well.
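For anyone curious, running the published Docker image once looks roughly like this (a sketch only — the host paths are placeholders, and the config directory must already contain a `client_secret.json` Google API OAuth credential):

```shell
# Sketch: one-shot sync with the gphotos-sync Docker image.
# /home/me/gphotos/config and /home/me/gphotos/storage are
# placeholder host paths; adjust to your setup.
docker run --rm \
  -v /home/me/gphotos/config:/config \
  -v /home/me/gphotos/storage:/storage \
  ghcr.io/gilesknap/gphotos-sync /storage
```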

Exactly this. I have Google Photos and I use it all the time, for every photo/video I take. But I also back up everything I can using the [gphotos-sync](https://github.com/gilesknap/gphotos-sync) python script. As its docs say, it doesn't preserve 100% of the original quality for photos or video, but it's still the memories.
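For reference, getting started with the script itself is roughly this (the target path is a placeholder; the first run prompts for Google OAuth in a browser):

```shell
# Sketch: install from PyPI and do an initial sync.
# ~/photos-backup is a placeholder target directory.
pip install gphotos-sync
gphotos-sync ~/photos-backup   # first run walks you through OAuth
```

Subsequent runs against the same directory are incremental, so you can re-run it on a schedule.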

Please if you aren't backing up your cloud storage photos/important files, stop reading HN and go and set it up now.

I can recommend Backblaze for cheap, reliable storage and the restic backup client, which is brilliant (a single, small binary). There's also rclone, or even the Backblaze CLI client. [Sorry I sound like a Backblaze shill!]
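A sketch of the restic-to-B2 flow, under assumptions: the bucket name and backup path are placeholders, and the environment variable names are restic's standard ones for its B2 backend:

```shell
# Sketch: back up a photo directory to a Backblaze B2 bucket with restic.
# "my-bucket" and ~/photos-backup are placeholders.
export B2_ACCOUNT_ID="..."    # your B2 key ID
export B2_ACCOUNT_KEY="..."   # your B2 application key

restic -r b2:my-bucket:photos init                    # one-time repo setup
restic -r b2:my-bucket:photos backup ~/photos-backup  # incremental backup
```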

I use a docker build of this on my Synology NAS: https://github.com/gilesknap/gphotos-sync

Then I have a task set to start the Docker container at 2am every morning; it automatically backs up any new images and then the container stops.

Works really well.

I used this not too long ago: https://github.com/gilesknap/gphotos-sync and it stores photos and videos in a YYYY/MM directory structure.

It works extremely well, and you can re-run it any time to sync new files locally too.

I used it like that for a few months before I finally installed syncthing on my phone and stopped using Google Photos altogether.

Now what I do is take photos on my phone and have them sync to a NAS. On the NAS I use a modified version of this https://forum.syncthing.net/t/android-photo-sync-with-exifto... to build up a YYYY/MM folder organisation and move files older than 30 days from the syncthing folder into my archive. My archive is then in my Plex, so it's still accessible to me.

In essence: 1) Take photo (implicit sync to NAS), 2) wait 30 days, 3) archive photo into long term directory naming convention, whilst making available to Plex and deleting the version from my phone (by deleting the syncthing version it will delete the one on the phone after 30 days too).
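The archive step can be sketched with exiftool, which can write the `Directory` tag and thereby physically move each file into a date-derived folder (paths here are assumptions, not the poster's actual setup):

```shell
# Sketch: move files older than 30 days out of the Syncthing folder
# into an archive organised as YYYY/MM, based on each file's
# DateTimeOriginal EXIF tag. Both paths are placeholders.
find /volume1/syncthing/camera -type f -mtime +30 \
  -exec exiftool '-Directory<DateTimeOriginal' -d /volume1/archive/%Y/%m {} +
```

Because Syncthing sees the file disappear from its folder, the copy on the phone is removed too, matching step 3 above.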

To the best of my knowledge, there is technically no way to do a lossless synchronization/export out of Google Photos apart from using Google Takeout.

For reference, check the best-effort project https://github.com/gilesknap/gphotos-sync and its "Known Issues with Google API" section.

In short:

* photos lose some metadata, including GPS
* you will lose your RAW images (AFAIK)
* videos are always transcoded with a lossy conversion

I've walked into the same trap. I'll probably use Google Takeout to hopefully recover my own data and treat Google Photos as an ephemeral destination from now on.

I use Google Photos on my Android phone, then use gphotos-sync[0] to sync the files to a hard drive on my DIY NAS. The contents of the hard drive are periodically backed up with restic[1] to B2[2].

My reasoning is that I don't trust Google not to lock me out of my account at some point, so having both a local and a remote backup gives me peace of mind. I periodically check the offsite backup to make sure it's still all working. Total cost for about a terabyte of files (it's not only photos and videos) is about $6/month, which is pretty reasonable.

[0] https://github.com/gilesknap/gphotos-sync

[1] https://restic.net/

[2] https://www.backblaze.com/b2/cloud-storage.html

There are a few projects on GitHub that let you configure an API key for Google access and pull your photos without using Takeout: https://github.com/dtylman/gitmoo-goog https://github.com/gilesknap/gphotos-sync

I run one on my wife's account with a cron job to grab new photos daily.
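The daily cron job can be a single crontab entry along these lines (a config fragment; the 3am time, binary location, and paths are assumptions):

```shell
# Sketch crontab entry: sync new photos every day at 3am,
# appending output to a log file. Paths are placeholders.
0 3 * * * /usr/local/bin/gphotos-sync /home/me/photos-backup >> /home/me/gphotos-sync.log 2>&1
```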

Yeah, https://github.com/gilesknap/gphotos-sync.

There's a decent guide here: https://ubuntu.com/blog/safely-backup-google-photos.

I run this every night on a raspberry pi, syncing them to my local NAS which is in turn backed up to cloud storage.

I use the google api and this, https://github.com/gilesknap/gphotos-sync

I back up everything to my local server.

I also do some side work, feel free to email me if you would like me to help you set this up, if you have a server, we can do it all remotely. p n u t@ (borowicz).org, remove spaces and ()

Looking at the talk about various ways to download Google Photos, I use this for regular backups: https://github.com/gilesknap/gphotos-sync
Takeout doesn't work in practice for bigger collections (archive creation routinely fails, downloads time out, and the 50GB max size results in many splits).

I've used this 3rd party tool and it worked OK: https://github.com/gilesknap/gphotos-sync/

https://github.com/gilesknap/gphotos-sync

I run it nightly from my home server. It has a few limitations (due to limitations of the Google Photos API) but overall it does what you want.

Since I haven't seen it mentioned yet:

https://github.com/gilesknap/gphotos-sync

is a command-line tool (in Python) that downloads (incrementally) your Google Photos content directly using the Photos API, not through Drive. I just set it up recently.

The two main caveats are the lack of location info and that it won't re-download photos that have been edited.