Four years ago, long before Time Machine and the wide availability of cloud storage, I purchased a RAID/NAS for home backups. It’s done its job admirably, and has given us the confidence to back up the whole family without fear of drive failure. I even went as far as drilling holes in the floor and threading CAT-5 under the house so I could keep the Infrant in the closet, where it would make less noise.
It’s worked well, but the big problem it doesn’t solve is the fire/flood/theft scenario. One good earthquake and all those images and videos of our child’s early years would be Gone Daddy Gone. Plus, my backup system was based on rsync. That worked fine, but it was a bit too manual, and I had occasional problems getting backups to complete to the non-Mac filesystem on the Infrant.
This problem had been hovering in the back of my mind for quite a while when a dad at the local park mentioned that he’d had success with Backblaze. For $5/month, you get hands-off, unlimited backup of your entire system to their data center. Drive space is dirt cheap these days, so it’s tempting to rely on purchased drives, but let’s do the math. Say you spend $100 on a 500 GB drive. That’s the equivalent of 20 months of Backblaze service. If you go for the one-year commitment, the service drops to $4/month, so call it two years for the drive you just bought “cheap” to pay for itself. And you still haven’t got fire/flood/theft insurance. Seemed like a no-brainer to me, so I went for it.
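For the curious, here’s that break-even arithmetic spelled out as a quick sketch. The figures are just the rough numbers from above, not official pricing:

```python
# Back-of-the-envelope break-even: a one-time drive purchase vs. Backblaze's
# monthly fee. These are my rough figures, not official pricing.
drive_cost = 100.00     # ~$100 for a 500 GB external drive
monthly_rate = 5.00     # month-to-month plan
committed_rate = 4.00   # effective monthly cost with the one-year commitment

print(drive_cost / monthly_rate)    # 20.0 months before the drive "pays for itself"
print(drive_cost / committed_rate)  # 25.0 months -- call it two years
```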
My starter data set was 300 GB – a healthy pile of bytes. Backblaze noted that the initial backup could take a couple of weeks, but mine took more than three weeks, even over a fast broadband connection. After the initial backup is complete, incrementals happen quickly, with no interaction required.
Installation and backup management take place through a preference pane on the Mac. It’s elegant, but I did have some problems along the way. About halfway through the initial backup period, the pref pane informed me that the backup was complete, even though it wasn’t. It continued to report this for the next 10 days, even though I could see the bztransmit process chugging away in the background. The pref pane provides a count of the number of files backed up and their total size; to get this to update, I’d have to unmount and remount my external data drive, then wait 3-4 hours for the process to rescan volumes and report new information.
At this point, I’ve made it through the initial backup and have added 150 MB of new data to the external drive. The preference pane does not report any change to the totals, even though I’ve confirmed that the newly added files are available on Backblaze’s servers. I also had a number of instances where the bztransmit process would swell to consume very large (> 2 GB) amounts of memory. In some cases, the process memory would eventually come back down on its own; in others, I had to manually kill all the bz* processes and restart the backup. It’s as if the backup itself is running fine, but the preference pane is unaware of what those processes are actually doing. Annoying, but not a deal-breaker.
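If you want to keep an eye on the same memory issue yourself, here’s a rough sketch of the kind of check I mean, written in Python with the third-party psutil module. The 2 GB threshold is just the point where I started to worry; none of this comes from Backblaze:

```python
# Rough sketch: list the Backblaze (bz*) processes and flag any whose resident
# memory exceeds a threshold. Requires the third-party psutil package
# (pip install psutil). The 2 GB cutoff is my own arbitrary worry point.
import psutil

THRESHOLD = 2 * 1024 ** 3  # 2 GB, in bytes

for proc in psutil.process_iter(['name', 'memory_info']):
    name = proc.info['name'] or ''
    mem = proc.info['memory_info']
    if name.startswith('bz') and mem:  # bztransmit and the other bz* helpers
        flag = '  <-- runaway?' if mem.rss > THRESHOLD else ''
        print(f'{name}: {mem.rss / 1024 ** 2:.0f} MB{flag}')
```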
I corresponded with Backblaze tech support throughout the process and found them super-responsive, and not afraid to share detailed technical analysis. They weren’t able to answer all of my questions about why the pref pane didn’t seem to know what the backup processes were actually doing, but their replies were thorough and quick, and I appreciate that.
Despite these glitches, my test restores have all gone well.
There is one little financial hitch in my plan: that $4/month covers only one machine. I’ll have to spend more to back up the other computers in the house. I’m still mulling that one. In any case, it feels great to know that my backups are complete, even if disaster hits the home someday. And now that the glitches of the initial backup period have passed, it should be pretty smooth sailing ahead.
There are other cloud backup systems for the home out there, like CrashPlan and Amazon S3 with S3Hub. I haven’t tried them. If you have, what have your experiences been like?