Compare and contrast backup strategies

On the one hand, you have Jeff Atwood’s Coding Horror, a blog about programming read by thousands, if not hundreds of thousands, of people. And, by the same guy, blog.stackoverflow.com. His backup strategy was to make copies of both blogs but leave them on his hosting site, trusting that when the ISP said they had it backed up, they really had it backed up. Of course, the ISP had some sort of hardware failure, and when they went to restore their backups, they found that the backups didn’t work. He’s now trying to reconstruct his articles (though not the comments, and only a very few of the accompanying images) from Google’s cache, the Wayback Machine, and his readers’ web caches.

On the other hand, you have this blog, which is about nothing in particular and read by probably 15 people tops. My backup strategy is this:

  1. Daily database dumps, copied to another file system on a different physical volume on the same box. That’s there mostly so I can respond quickly if I accidentally delete the database or an upgrade goes bad. If my blog got more traffic and more comments, I’d do those dumps more frequently.
  2. Another backup and a tar file just before I do an upgrade.
  3. Daily rsyncs back to my Linux server at home. I keep a week’s worth of those.
  4. Daily copies of that local copy to removable hard drives. I keep a month’s worth of those.
  5. Every week or so, I move one of those removable hard drives to a physically remote location.
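The dump-and-rotate idea behind steps 1 and 3 can be sketched as a small shell function. Everything here is a placeholder, not my actual setup: the real version would run a database dump command rather than tar, and the rsync would be cron’d.

```shell
# Take a dated snapshot of $src into $dest, keeping only the newest
# $keep snapshots. tar stands in for the real database dump here.
backup_snapshot() {
    src=$1    # what to back up
    dest=$2   # a directory on a different file system/volume
    keep=$3   # how many dated snapshots to keep
    stamp=$(date +%Y-%m-%d-%H%M%S)
    mkdir -p "$dest"
    # step 1: the dated snapshot itself
    tar -czf "$dest/blog-$stamp.tar.gz" -C "$(dirname "$src")" "$(basename "$src")"
    # prune so only the newest $keep snapshots survive (the "week's worth")
    ls -1t "$dest"/blog-*.tar.gz | tail -n +$((keep + 1)) | xargs -r rm -f
}
# Step 3 is then just a pull from the home box, something like:
#   rsync -a server:/backups/blog/ ~/backups/blog/
```

The pruning assumes snapshot names sort by age and contain no spaces, which the dated naming scheme guarantees.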

And I did this when my blog was hosted on a VPS that the ISP claimed had some sort of backups, and I still do it now that my blog is hosted on a 1U box that I bought on eBay and stuck in a local colo facility. As far as I’m concerned, you’re not backed up until the backup is in your pocket.

Oh yeah, did I mention that some of those Coding Horror blog entries that went missing were about backups and how important they are?

I’m sorry, but the idiocy of this just leaves me shaking my head in wonder about why anybody ever believed anything he ever said about computers. On the other hand, it also makes me glad that I don’t have a huge audience hanging on my every word, because someday I might get something wrong (hey, I know, not likely, right?), and schadenfreude’s a bitch.

Editing fail

So I was reading a story in this month’s Analog magazine, and encountered the following paragraph:

“They company wantoffered me a promotions me to learn aboutlead a new technology group, something I saw in the Dakotas a few months ago,” Gus replied. “It’s a nice increase career move if I accept,” Gus offered.

I swear all spelling, punctuation and spacing is exactly as it is in the magazine. It’s almost as if somebody took the blue pencil markup version and put in all the new stuff without taking out the old stuff. And then later in the same story:

His The aesthetic principles approach would disappear be lost when Phil’s new technology was perfectedused.

Doesn’t Analog have proofreaders for this sort of thing?

(What does it say about my age and penchant for trivia that I know this much about obsolete book and magazine editing even though I’m not a writer or an editor?)

So long, Kodak

I found out yesterday that Kodak has shut down the Digital Cinema group that I belonged to for over six years, a victim of Kodak’s inability to keep up with an incredibly rapidly changing marketplace. Some years before that, I’d had the pleasure of working with many of the same people on a product called “Cineon”, a very high-end post-production and digital editing program for movies. Alas, technology marched on faster than we did, and today people are doing on their Macintoshes and PCs what we were doing on 16-processor, million-dollar SGI Onyx computers.

But in both cases, I was working with the finest group of programmers, QA people, applications specialists, and sysadmins I’ve ever had the pleasure of working with (with the possible exception of GeoVision, which was also exceptional). And although I might be cutting my own throat, since I’m still in the job market and many of them will be entering it very shortly, I sent out this message to the Peernet Rochester Yahoo Group.

I just found out that my old colleagues on the Digital Cinema team at Kodak all got their notices today. And while I’m probably going to be competing with them for some of the same jobs, I’d just like to put a shout out to any hiring managers here to let them know that if you see a software developer or tester with experience in the Kodak Theatre Management System on their resume, you could not do better than to hire them. They are positively the best group of people I’ve worked with in my 25 years of working all over the world.

OK, if there were some way to put these things on a scale and see how it balances, I’d probably put the team at GeoVision (not the Albany group, the original ones) and the Cineon team tied for first, the Digital Cinema group a fairly close second, and a couple of the people at SunGard right up there.

Man, I hope we all end up employed again soon. And I hope we all end up working together some time.

Oh, and if you’re one of my former colleagues from Kodak, give me a shout off-line and I’ll hook you up with the Peernet group – it’s really been helpful.

So… C++? Delphi? Markov Chains?

I have a line on a job that involves porting some code that was originally written in R, then rewritten in Delphi; now the researcher wants it rewritten in C++, made multi-processor/multi-computer friendly (using MPI?), and turned into a plug-in for R. The program as it stands is pretty primitive: he apparently just puts a bunch of parameters into the actual Delphi code, recompiles, and runs it, and it writes its output to a data file. Obviously the first step would be a wrapper program that gets the parameters from a data file, and later a wrapper that gets them from however R passes parameters to plug-ins.
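That first-step wrapper could start as something this small. To be clear, `read_params` and the key=value file format are my own invention for illustration, not anything from the actual project:

```cpp
// Hypothetical sketch: pull numeric parameters out of a key=value
// text file so the researcher no longer has to recompile to change them.
#include <istream>
#include <map>
#include <sstream>
#include <string>

std::map<std::string, double> read_params(std::istream& in) {
    std::map<std::string, double> params;
    std::string line;
    while (std::getline(in, line)) {
        if (line.empty() || line[0] == '#') continue;  // skip blanks/comments
        std::istringstream fields(line);
        std::string key, value;
        if (std::getline(fields, key, '=') && std::getline(fields, value))
            params[key] = std::stod(value);  // throws on malformed numbers
    }
    return params;
}
```

Feeding it a real file is just `std::ifstream f("params.txt"); auto p = read_params(f);`, and the later R plug-in version would fill the same map from whatever R hands across instead of a stream.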

It’s been a while since I used C++, and the language has changed a lot since then. Namespaces, the STL, Boost, auto_ptr: all of this is new to me. It’s going to take some frantic reading to get up to speed. Even worse, I have to read the existing code, which means learning a bit of Delphi/Pascal. And I’m going to have to find a decent IDE for C++ – although the consensus on StackOverflow seems to be to go back to the way I always worked until I started using Eclipse last year: gvim, make, gdb, and a web browser open to the man pages.

Even better, the job would mean working from home. The dogs will be happy about that.