22

It's interesting to see the technological split between structured corporate environments and more developer-driven/startup environments. Some of the Microsoft technologies I take for granted (VSS, Folder Redirection, etc.) simply are not available when managing the increasing number of Apple laptops I see in DevOps shops.

I'm interested in centralized and automated backup strategies for a group of 30-40 Apple laptops...

How is this typically done safely and securely, assuming these are company-owned machines (versus BYOD)?

  • While Apple has Time Machine, it's geared toward individual computer backups and doesn't seem to work reliably in a group setting. Another issue with these workstations is the presence of Vagrant/VirtualBox VMs on the developers' systems. Time Machine and virtual machines typically don't work well together unless the VM disk images are excluded from the backup set.
  • I'd like a push-based backup process with some flexible scheduling options.
  • I know how to handle the backend storage, but I'm not sure what needs to be presented to the client systems.
  • Due to the nature of the data here, cloud-based backup may not be a viable option.
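(For what it's worth, keeping the VMs out of Time Machine itself is scriptable with `tmutil addexclusion`. A sketch with hypothetical paths; it prints the commands rather than running them, since `tmutil` exists only on OS X:)

```shell
# Print (rather than run) the exclusion commands, since tmutil is OS X-only.
# The paths are hypothetical examples of where Vagrant/VirtualBox keep images.
tm_exclusions() {
    for p in "$HOME/VirtualBox VMs" "$HOME/.vagrant.d/boxes"; do
        # -p makes the exclusion path-based (like the System Preferences
        # exclusion list) rather than a sticky file attribute
        printf 'sudo tmutil addexclusion -p "%s"\n' "$p"
    done
}
tm_exclusions
```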

Any suggestions about how you handle this in your environment would be appreciated.

Edit: The virtual machine backups are no longer important. They can be excluded from the process and planning.

ewwhite
  • 201,205

7 Answers

9

We're just trying to bring our Macs into the fold here. My original plan was to use Backup Exec's Mac agent. Then I found out that the agent doesn't support 10.9, or even 10.8. So if you're keeping the OS up-to-date, that's out. I've heard legend tell that the next SP will get it up to speed, but I'm not holding my breath.

It has been a few years, but Retrospect used to be the gold (and only) standard for Mac backup. Install the agent and you could set a schedule so the Macs would back up once connected to the network. I don't have recent experience with it, though it did work via VPN many moons ago. You'd then want to have it save the backup sets to storage that you would sweep into your existing backup environment.

If you get a Mac Mini with OS X Server, you can redirect Time Machine on the laptops to the network, then sweep that destination up with another disk backup tool. I don't know if there's any granularity to Time Machine, though -- I believe it grabs the entire disk, or nothing.

I know you mentioned cloud may not be an option, but if that is because of the VMs (which are now out of scope?), then perhaps that makes your CrashPlan/BackBlaze/Carbonite options more palatable.

If you do want to bring the VMs in scope, you could install a Windows-based agent in the VM, and treat that as you would anything else.

CC.
  • 1,196
7

Acronis supports Macs and a centralized backup server. Symantec also supports Macs and has a centralized appliance. There's also Retrospect, a long-time established Mac backup package that also appears to support a local backup server. I'm sure there are more. (I've intentionally excluded cloud services.)

Of course, the way we're using Acronis (for Windows!) qualifies more as business continuity rather than disaster recovery. We're using it for the users who have SSDs; when the SSDs inevitably die, Acronis gets them back up and working fast. The actual DR data is all server data and is handled differently based on whether it's client data or internal data.

You didn't explicitly state whether you were looking for business continuity answers or disaster recovery answers, but I've answered more along the lines of continuity. Then again, if the building burns down, your devs will probably have their laptops with them, so continuity is likely what you need anyway.

[Edit]

I had intentionally excluded CrashPlan due to the "no cloud" restriction, despite liking the home version a lot. CrashPlan and Acronis are different use cases, though; Acronis does actual imaging, and CrashPlan is data only (by default, the user's home directory only). Acronis is scheduled, and CrashPlan is continuous (whenever the storage is available).

In our particular environment, developers are allowed to customize their machines in whatever way is most efficient for them, so they need an image level backup so they can get back up and running fast in case of emergency. If your devs use their machines the same way, they probably need an image-level backup, too. One more thing to look at in the product offerings, alas. (It looks like Acronis' Mac imaging is providing a central repository for Time Machine, but I could be misreading.)

(I've heard of home users telling CrashPlan to back up their entire hard drive, including the Windows directory, but that's misusing it: restoring an OS from it would be wading into unsupported territory. It's all about backing up data.)

7

I used to use CrashPlan at a previous job to back up a couple of hundred Mac laptops, a few Windows VMs, and even a couple of Linux servers.

They have a cloud-based solution, but we used the on-premises server (I think they've since renamed it to CrashPlan ProE) and it was rock solid.

I liked it enough that I use their cloud consumer solution to back up all my personal Macs.

re: Mac filesystem attributes mentioned in another answer -- OS X is fully supported by CrashPlan, and we never had any issues restoring Mac resource forks. You can run the server on OS X, but we ran ours on a Dell running Ubuntu.

re: Pricing - the seats are per-computer, not per-user, so if a user has a laptop and a desktop, that counts as two seats which seems reasonable. The seat price was on the low end of the range of different products we looked at.

CrashPlan has the typical enterprise features for configuring how long to keep backups (we kept hourly changes for a couple of weeks, dailies for a month, weeklies for six months, and monthlies after that), and you can set up different organizations that have different settings. Pointing our server at our LDAP for authentication took about 5 minutes; I recall being shocked at how quickly we got everything set up.

Joe Block
  • 726
3

I use Backblaze for many of my clients and on all of my machines (well, all the Windows and OS X ones anyway -- there's no support for anything else), and I can recommend them highly. The downsides are that the initial backup can take a while and a complete restore can be cumbersome (they will overnight you a drive for something like $200, but it can take time to prepare), but it's completely automatic and very lightweight, and it works well on both Macs and Windows machines. (I also use Acronis locally for a Windows machine that I like to abuse; I've never used their Mac products.) Backblaze also supports versioning and local encryption (i.e., they don't have your keys), and it works from any internet connection, which is great for laptops.

CrashPlan is more expensive for business versions but they do have the advantage that you can seed your initial backup by sending them a drive.

I have never had a positive experience with Backup Exec (or Symantec anything at all), or Time Machine with anything more than a few machines.

quadruplebucky
  • 5,314
2

If I were you, I'd use network home folders over NFS or AFP and have a standardized image built with something like DeployStudio or Apple's built-in deployment tools.

When a laptop fails, all of the data and user state is safe on your server (which is being backed up by something more enterprisey than Time Capsule, hopefully) and you can lay down a fresh image on fresh hardware and not think about it. Of course, this has some prerequisites that many smaller dev shops scoff at, such as Open Directory or Active Directory (unless you want to configure it all by hand).

MDMarra
  • 101,323
0

If you want to try running Time Machine against a file server, you can run netatalk on a generic *nix box to get the required AFP protocol support.
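If you go the netatalk route, the Time Machine bits are just a couple of lines in `afp.conf`. A sketch (the share path is a hypothetical example; option names are from netatalk 3):

```ini
[Global]
; optional: present the share the way a Time Capsule would
mimic model = TimeCapsule6,106

[Time Machine Backups]
path = /srv/timemachine
; advertise this volume as a Time Machine destination
time machine = yes
```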

--

A quick tip to make VM backups less painful regardless of your backup strategy:

Make regular snapshots of the VMs and work from the snapshots instead of the original. This way the original disk files won't be changed.

Alternately/additionally, make the VMs dataless and revert them to the snapshot state after each run, storing files that will be changing on a file server. VMware has a bundled Samba you can use for sharing folders from the host; if VirtualBox doesn't, you can install your own Samba if needed.

You can script this stuff up to make it quick and easy to start and stop your VMs. VMware, again, has command-line options via the vmrun program (it's at the core of the app; look around with ps and you'll see it). You can do stuff like:

vmrun stop "/Users/foobar/Documents/VMs/win7.vmwarevm/win7.vmx" hard

which will kill the running VM, and revert to the snapshot.

Poke around and I bet you'll find similar stuff with VirtualBox.
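For VirtualBox, the rough equivalents live in `VBoxManage` (the VM name below is a hypothetical example). They're collected in a variable and printed rather than executed, since `VBoxManage` is only present where VirtualBox is installed:

```shell
# Rough VBoxManage equivalents of the vmrun snapshot workflow above.
# "win7-dev" is a hypothetical VM name; printed, not executed.
VBOX_CMDS='VBoxManage snapshot "win7-dev" take clean-base
VBoxManage controlvm "win7-dev" poweroff
VBoxManage snapshot "win7-dev" restore clean-base
VBoxManage startvm "win7-dev" --type headless'
printf '%s\n' "$VBOX_CMDS"
```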

--

One other thing you might try is BackupPC. It uses rsync or tar over ssh as a transport and does file-level deduplication on the back end. I've used it for years with Linux clients.

The only trick with Macs is that you need to be sure you're getting whatever Mac-filesystem-specific stuff you need -- resource forks, etc. People on the mailing list have reported success with "Xtar", a tar extended for OS X. In your case you probably don't have any of these, but make sure.
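For reference, a per-host BackupPC config sketch along those lines (the host file name, login, and exclude patterns are hypothetical; option names are from BackupPC 3.x):

```perl
# Hypothetical per-host override, e.g. /etc/backuppc/pc/dev-mbp-01.pl
$Conf{XferMethod}     = 'rsync';
$Conf{RsyncShareName} = ['/Users'];    # back up home directories only
$Conf{BackupFilesExclude} = {
    '/Users' => ['*/VirtualBox VMs', '*/.vagrant.d', '*/Library/Caches'],
};
# run rsync over ssh as an unprivileged backup user
$Conf{RsyncClientCmd} = '$sshPath -q -x -l backup $host $rsyncPath $argList+';
```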

Dan Pritts
  • 3,336
0

I took an unconventional approach: I set up Git to push to a private remote server, driven by a script and a cron job.

It obviously doesn't handle ACLs, but the "repair permissions" command in Disk Utility works fine for that.
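A minimal sketch of that kind of script (the function name and the `backup` remote are hypothetical); cron would then call it on a schedule:

```shell
# backup_repo DIR REMOTE: stage everything, commit only if something
# changed, and push the current branch to the given remote.
backup_repo() {
    dir=$1
    remote=$2
    (
        cd "$dir" || return 1
        git add -A
        # commit only when the index differs from HEAD
        git diff --cached --quiet || \
            git commit -q -m "autobackup $(date -u +%Y-%m-%dT%H:%MZ)"
        git push -q "$remote" HEAD
    )
}
```

A crontab entry such as `*/30 * * * * /usr/local/bin/git-backup.sh` (a hypothetical wrapper path) would then run it every half hour.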

Twitch
  • 101