Some of you may recall that I was having drive problems the other day. I was getting the data off my old drive and moving it to a new one when it started issuing bad-sector and impending-drive-failure warnings. As a result, I failed to copy the hidden files in my Linux home directory (the most critical loss was much of my sent email from the last three years). It was looking pretty grim. ddrescue, which is supposed to be a very powerful data-recovery tool, couldn’t see anything, and attempting to look at the drive with fdisk, just to see if it was even there, resulted in an overflow error. I took it to a local data-recovery guy; he ran his diagnostics and told me that it was hanging after the first 30 GB (of a 2 TB drive) and that he couldn’t do anything with it (though I think he understood Windows much better than he did ext4). I was almost resolved to simply mourning the loss, or sending the drive to a specialist, potentially spending many hundreds of dollars only to get back individual randomly named files with no directory structure. But I tried one more thing.
I ran the program most of yesterday (it took from mid-morning until late evening to work through all two terabytes), but when it was done, the logical volume was restored and the drive looked like new (it’s even possible that it would boot, though I haven’t tried, and don’t really have any reason to). I can’t recommend it highly enough.
And yes, I will be backing up religiously, with a cron job every night (possibly also to my remote server).
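For what it’s worth, the nightly cron job could be a couple of entries along these lines — the times, paths, and remote host here are all made-up examples, not my actual setup:

```shell
# Edit with `crontab -e`. Schedule, paths, and the remote host are illustrative.
# 02:15 every night: mirror the home directory to a second drive
15 2 * * * rsync -a --delete /home/gregg/ /mnt/backup/home/
# 03:20: push a copy to the remote server over SSH
20 3 * * * rsync -a -e ssh /home/gregg/ gregg@example.com:backups/home/
```

rsync only transfers what changed since the last run, so after the first pass the nightly job is quick.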
Yeah, I tried to reply to your original thread on this to suggest giving TestDisk a try. But for some reason I kept getting JavaScript errors when I tried to post, and I eventually gave up. Glad you found your way to that program on your own. At my work I recover data files from crashed hard drives several times a week, and TestDisk does a great job of recovering files from a damaged filesystem. And the fact that it’s free makes it all the better.
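For anyone else who lands on this thread: TestDisk runs from a terminal against the raw device. The device name below is just an example — check `lsblk` first to find the right one:

```shell
# Interactive partition-table and filesystem recovery (device name is an example)
sudo testdisk /dev/sdb
# PhotoRec ships alongside TestDisk; it carves raw files off the disk
# when the filesystem itself is beyond repair (no names or directory structure)
sudo photorec /dev/sdb
```

Run it against the whole disk, not a partition, so it can find lost partition tables too.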
I haven’t used it but one of the other techs I work with swears by SpinRite. Though that’ll set you back about $90.
Cloud backup is probably a good idea too, although I must admit I haven’t done that comprehensively enough myself. I did manage to recover most of my data when I was stupid enough to leave my smartphone on the train and it was stolen.
If you’re worried about people reading your data, you may want to use Tahoe LAFS, which stores only encrypted data in the cloud, and does so redundantly.
Instapundit says Amazon is having a big sale on external hard drives. link.
He seems as fascinated with Amazon’s latest sales as he is with politics and culture.
In my opinion, external hard drives are the way to go. Whenever I am going to do something even slightly hazardous, I copy my /gregg folder to the external HD, and rename it with a date stamp.
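Concretely, that copy-and-rename step is only a couple of lines of shell — the mount point below is hypothetical, and the defaults can be overridden from the environment:

```shell
#!/bin/sh
# Snapshot a folder to a date-stamped copy on the external drive.
# SRC and DEST_ROOT default to my layout; override them via the environment.
SRC="${SRC:-/gregg}"
DEST_ROOT="${DEST_ROOT:-/mnt/external}"
DEST="$DEST_ROOT/gregg-$(date +%Y-%m-%d)"
if [ -d "$SRC" ]; then
  # -a preserves permissions, timestamps, and symlinks
  cp -a "$SRC" "$DEST"
fi
```

Each run leaves an independent dated snapshot, so a bad change can be rolled back to any earlier day.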
I’m about to build a new PC (still a lot I don’t know) and auto backups to a secondary drive will be part of the new system. Haven’t decided if that will be via RAID or a simple backup.
Setting up Thunderbird to save sent messages in an IMAP folder on the server instead of (or in addition to) locally would be my added suggestion. I was never satisfied with TB’s archiving options for either POP or IMAP (to save old messages locally so they can be cleared from the server) though. The archiving addons always had problems, or simply failed to save everything.
Who needs backups anymore? Just send a FOIA request to the NSA.
CrashPlan. There’s a free version. Then separate the disks via sneakernet.
Tahoe LAFS lets you do this without having to move physical disks. You could back up data on servers or even desktop machines owned by friends and family members without worrying about privacy, because all data is stored in encrypted form. Uptime isn’t a problem either, because the storage is redundant: the data survives even when some of the machines are offline.
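A rough sketch of how that looks, from memory of the tahoe CLI — treat the exact commands and the alias name as assumptions and check them against the documentation:

```shell
# Sketch only: commands and alias name from memory, verify against the docs.
# The redundancy comes from erasure coding: with shares.needed=3 and
# shares.total=10 in tahoe.cfg, any 3 of the 10 storage servers holding
# shares can reconstruct a file.
tahoe create-client                    # point a client at the friends-and-family grid
tahoe create-alias backups             # a root directory to hang backups off
tahoe backup /home/gregg backups:home  # encrypted, incremental snapshots
```

The servers only ever see encrypted shares, so the friends and family hosting them can’t read anything.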