Backing Up a Google Calendar

by Jake

Google’s hosted apps are a dream come true for a small office worker (the office is small, not the worker… usually), but they do not answer one nagging question. What happens if Google’s services are unavailable? Now, it is very possible to be overly paranoid about data loss, but there is a legitimate concern here (and with all online services). Today, I’m going to tackle backing up a Google Calendar.

This need comes directly from the office where I work full-time. We're making the total office switch from Exchange to Google Apps. The reasons are many, and may warrant a future post, but that's a topic for another day. My job is to figure out how to make the switch smooth, easy, and safe. For most people in the office, backups of their calendar are either not critical or easily accomplished by subscribing to the calendar's iCal feed in iCal, Sunbird, or Outlook (via remote calendars). That isn't the case for the boss, who does not use a computer, and whose data is the most important (at least from the perspective of creating headaches for the rest of the staff).

There are a couple of ways to do this. The first, and most straightforward, solution is to have a staff member back up the calendar daily. They could do this by downloading the iCal file or subscribing to it in a client. Not a large amount of effort, but what if they forget, or are on vacation, or just don't do it right? Plus, we have computers. Aren't they supposed to solve this whole repetitive task thing? So, I started asking how we could do this automatically. And while we're at it, wouldn't it be better to keep multiple backups of the calendar in case someone deletes something they shouldn't? Exactly.

The Plan

Use what you have. We have a web server (hosted by Media Temple — highly recommended). My solution was to write a script that runs on a schedule to download and save a copy of the calendar. This called for a shell script (created in TextMate — shell scripts have a .sh extension) and cron (which Media Temple makes very easy — a lot of other hosts also have good cron tools). I set up a web-accessible folder to store the backups in. The only thing I still needed was the iCal address, which you can find by opening the calendar's settings in Google Calendar and clicking the private iCal address link at the bottom of the page.

Here’s the code for the script¹:

#!/bin/sh
# Timestamp for this backup, e.g. 200711191145
INCREMENT=`date +%Y%m%d%H%M`
DIR="/home/domains/domain.com/html/backup_folder"
# Fetch the private iCal feed and save a timestamped copy in the backup folder
curl -s https://www.google.com/calendar/ical/address/privateurl/basic.ics -o "$DIR/cal_$INCREMENT.ics"

The addresses of the iCal feed and my backup folder have been changed to protect the innocent. The INCREMENT variable is there because I wanted my backups named according to when they were saved, so I could go back to a specific point in time if needed. Also, without some sort of increment, each backup would overwrite the previous one, which might be fine for some people, but wasn't quite what I was after.
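The scheduling side is handled by cron. Media Temple exposes this through their control panel, so you may never edit a crontab directly, but for reference, here's roughly what a daily entry might look like (the script path is made up; point it at wherever you saved your .sh file):

# Run the calendar backup every day at 2:00 AM (example path, adjust as needed)
0 2 * * * /bin/sh /home/domains/domain.com/scripts/backup_calendar.sh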

There are two problems left.

  1. I can’t get to the backups if the internet is down at work, although in that situation we can use a phone or go to a cafe to get to things.
  2. I need to figure out a way to have the script remove old backups, say anything older than two weeks or a month.

This does seem to do the trick rather well, although I'm sure there are some improvements that could be made. Are there any wise Linux users out there with suggestions?

Updates

Thanks to a tip in the comments, I have updated the script to delete files in that directory that were modified (created in this case) more than 7 days ago. Check out the full and improved script below:

#!/bin/sh
# Timestamp for this backup, e.g. 200711191145
INCREMENT=`date +%Y%m%d%H%M`
DIR="/home/domains/domain.com/html/backup_folder"
# Fetch the private iCal feed and save a timestamped copy in the backup folder
curl -s https://www.google.com/calendar/ical/address/privateurl/basic.ics -o "$DIR/cal_$INCREMENT.ics"
# Delete backup files last modified more than 7 days ago (files only, not the folder itself)
find "$DIR" -type f -name "cal_*.ics" -mtime +7 -exec rm -f {} \;

Please be careful. Anytime you add file deletion to an automatically run script, you want to make sure things are working properly before you set it loose.
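One easy way to test it (just a suggestion, not part of the original setup): run the find command with -print instead of -exec rm, so it only lists what would be deleted and removes nothing:

# Dry run: list the files that would be deleted, without removing anything
find "$DIR" -type f -name "cal_*.ics" -mtime +7 -print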

  1. I am not a Linux expert. Please do not use this script in mission critical applications. Basically, any use of this script and/or advice is at your own risk.
