Git Backups

For a developer, the most valuable data resides in the Git repositories, so regular backups are a must. I'd like to describe the workflow I have found useful in my daily work.

Pushing to Remote Locations

Having your data distributed over multiple locations is a good idea, especially if your house burns down. I push my data to more than one remote location: one is on my Synology NAS, the other on a web server.

But a push usually only sends data to one remote at a time, which is annoying, especially when using SourceTree - which I do. A little trick makes it possible to define a remote that pushes to multiple URLs:

[remote "origin"]
url = https://myaccount@bitbucket.org/mynickname/myrepo
url = myuser@192.168.1.61:/srv/git/myrepo.git
fetch = +refs/heads/*:refs/remotes/origin/*
pushurl = https://myaccount@bitbucket.org/mynickname/myrepo
pushurl = myuser@192.168.1.61:/srv/git/myrepo.git
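Instead of editing the config file by hand, the same multi-URL remote can be set up with `git remote set-url --add`. The sketch below runs against a scratch repository, reusing the placeholder URLs from the config above:

```shell
set -e
repo=$(mktemp -d)
git -C "$repo" init -q
# The first URL becomes both the fetch URL and the implicit push URL.
git -C "$repo" remote add origin https://myaccount@bitbucket.org/mynickname/myrepo
# Add a second fetch URL.
git -C "$repo" remote set-url --add origin myuser@192.168.1.61:/srv/git/myrepo.git
# Explicit push URLs: once any pushurl is set, only pushurls are used for
# pushing, so the first URL has to be listed again.
git -C "$repo" remote set-url --add --push origin https://myaccount@bitbucket.org/mynickname/myrepo
git -C "$repo" remote set-url --add --push origin myuser@192.168.1.61:/srv/git/myrepo.git
git -C "$repo" remote -v
```

Note that setting any `pushurl` disables pushing to the plain `url` entries, which is why both addresses appear twice in the resulting config.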

Now pushing to origin distributes the data to all the listed locations, and with the "Push changes immediately" checkbox activated in SourceTree this happens on every commit without further steps.


Updating on USB Devices

The previous trick works well for locations that are available whenever you are online, but what about USB drives? There I do it the other way round, i.e. I update the repositories on such a drive by fetching from the source repositories.

This requires some preparation. First you need to clone the Git repository to the drive like this: git clone --mirror ~/work/example.
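As a quick sanity check, the mirror clone and a later update can be exercised with throwaway directories; the temporary paths below stand in for ~/work/example and the USB drive:

```shell
set -e
# Throwaway directories stand in for the work repo and the USB drive.
work=$(mktemp -d)
drive=$(mktemp -d)
git -C "$work" init -q
git -C "$work" -c user.name=me -c user.email=me@example.com \
    commit -q --allow-empty -m "first"
# Mirror-clone the working repo onto the "drive".
git clone -q --mirror "$work" "$drive/example.git"
# New commits in the source repo...
git -C "$work" -c user.name=me -c user.email=me@example.com \
    commit -q --allow-empty -m "second"
# ...arrive on the drive with a plain fetch.
git -C "$drive/example.git" fetch -q
```

A mirror clone fetches with the refspec +refs/*:refs/*, so the fetch updates all branches and tags on the drive without any merging.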

Now you can simply run git fetch there. But of course nobody likes to type that for each repo, so we add some shell magic:

find git-backup -name "*.git" -print -exec git -C {} fetch --all \; 

Wrapped in a nice shell script or an Automator app this is easy to use, and any repo cloned to that drive is picked up by the fetch command.
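Such a wrapper script might look like the sketch below; the default backup directory is an assumption here, so point BACKUP_DIR at the actual mount point of your drive:

```shell
#!/bin/sh
# Hypothetical wrapper around the find/fetch one-liner.
# BACKUP_DIR is an assumed location - adjust it to your drive's mount point.
BACKUP_DIR="${BACKUP_DIR:-$PWD/git-backup}"
mkdir -p "$BACKUP_DIR"
# Visit every mirror repo below BACKUP_DIR and fetch from all of its remotes.
find "$BACKUP_DIR" -name "*.git" -print -exec git -C {} fetch --all \;
```

Because the script just walks the directory tree, adding a new repository to the backup set is nothing more than another git clone --mirror into that folder.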
