I really don't care what "classified" papers say, I want proof. It's silly to overwrite data over 30 times, when it's quite likely that a few passes is enough. If you need to overwrite tens of times, then you shouldn't rely on overwriting in the first place.
I agree that it's better to have proof than just speculation. But that's the problem. Who has the means to do something like "forensic-proof pattern testing"? First you have to gather many different drives that use different encodings. That's no problem for some research at a university. Buying EnCase should be no problem either. But the NSA won't give you its best hard drive recovery technology just so you can find out how to defeat it. So, as I said: 2 random passes are enough if your data isn't too sensitive. With 10 passes you'll defeat all software tools. But as Pfitzner said: even after 20 overwrites you can recover data. If you want a high security level, you have to do 30 passes and nothing less.
Using true random numbers is not feasible even if you have a hardware random number generator attached to your computer. It's just too slow. If using a cryptographically strong pseudorandom number generator is good enough for creating encryption keys, it's more than good enough for overwriting.
I totally agree.
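For what it's worth, an overwrite pass drawing from the OS CSPRNG can be sketched in a few lines. This is only an illustration, not any of the wiping tools discussed here; the function name and chunk size are my own, and it works on a file rather than a raw device:

```python
import os

def overwrite_file(path, passes=2, chunk_size=1 << 20):
    """Overwrite a file in place with cryptographically strong random data.

    os.urandom() draws from the OS CSPRNG, which is fast enough to keep up
    with disk-speed overwriting and far stronger than a plain PRNG.
    """
    size = os.path.getsize(path)
    with open(path, "r+b") as f:
        for _ in range(passes):
            f.seek(0)
            remaining = size
            while remaining > 0:
                n = min(chunk_size, remaining)
                f.write(os.urandom(n))
                remaining -= n
            f.flush()
            os.fsync(f.fileno())  # push this pass out to the device
```

Note that overwriting a file through the filesystem does not guarantee the old blocks are hit (journaling, remapping); wiping a whole device avoids that caveat.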
Based on the coercivity of modern hard drives and Dr. Gutmann's 1996 paper, a magnetic field of around 1 tesla (i.e. 10,000 gauss) should be enough for even the latest drives. Degaussers capable of that are expensive, though.
I doubt that. If you have a paper that says "with 1 tesla everything is gone and we cannot recover anything with magnetic force microscopy" (though I admit MFM is really time-consuming), then it's OK; otherwise you have no proof.
Earlier you imply that only true random numbers might be good enough, and then you recommend a program that uses the Mersenne Twister? Sure, that PRNG has a long enough period, but it's not cryptographically strong.
You are right; I should have explained better. Of course the Mersenne Twister with seed files (most will use /dev/urandom) is not true randomness.
But it's quite hard to get true random numbers. Still, it's better than nothing. BTW, read about an attack on ISAAC by Marina Pudovkina:
http://eprint.iacr.org/2001/049.pdf (ISAAC is still a secure algorithm for practical needs, though.)
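The distinction between the two kinds of generators can be illustrated in a few lines of Python (the modules here are just for illustration; they are not what any of the wiping programs discussed actually use):

```python
import random
import secrets

# Mersenne Twister: fully determined by its seed. Anyone who learns the
# seed (or observes 624 consecutive 32-bit outputs) can reproduce the
# whole stream -- fine for simulations, not for security.
mt_a = random.Random(1234)
mt_b = random.Random(1234)
assert mt_a.getrandbits(128) == mt_b.getrandbits(128)  # same seed, same stream

# A CSPRNG (here the OS generator behind the `secrets` module) exposes no
# seed, and its output cannot feasibly be predicted from earlier output.
wipe_block = secrets.token_bytes(32)
assert len(wipe_block) == 32
```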
I wouldn't call IBAS a normal data recovery lab, they use very sophisticated equipment for recovering data from badly damaged drives. The fact that they are not able to recover overwritten data, or don't even want to try, should tell us something.
IBAS does have sophisticated means, no doubt about that. The (German) IT magazine iX
http://www.heise.de/ix/ once did a test of erasing software using one pass of PRNG data and then sent the drives to IBAS. They were not able to recover anything, though IBAS mentioned that they did not use their full potential. Anyway, I doubt that the NSA, or some other intelligence agency, is merely on the same level as IBAS when it comes to computer forensics.
Why? Because an intelligence agency has more people (IBAS has five people in its forensic team), more money, and better research facilities.
Most likely the people who made the decision aren't data recovery professionals.
Maybe. But suppose you're just some managing director, and an IT expert from the Department of the Interior comes over and says: do this and you won't be able to recover anything from the criminal next door. What will you do? You'll have to hide that. If all criminals were smart and had all this information, the police would never get evidence from a PC.
OK, I just noticed this, and that's just plain wrong. Even if a pattern is not designed for the specific encoding scheme used by your hard drive, overwriting with the pattern is still better than not overwriting at all. It's just not as efficient as it would be on an older drive using the targeted encoding.
So the efficiency of the Gutmann method on modern drives does not reduce to the efficiency of only its 8 random passes. Some of the passes are simply less efficient than others, but they still count. However, on modern drives, I'd say 20-30 passes of random data are quite likely more efficient than the 35-pass Gutmann method.
Well, the Gutmann pattern consists of 4 passes of random data, then 27 passes of MFM/RLL-specific patterns, and again 4 passes of random data. (Don't torture me on whether it's truly random or not.)
If your hard drive uses PRML, it's completely useless to write MFM patterns, because your drive simply does not use that encoding!
Gutmann: "For any modern PRML/EPRML drive, a few passes of random scrubbing is the best you can do."
If you want to do 35 passes, do them with random data (preferably cryptographically strong random numbers)!
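Following that advice, a 35-pass schedule of purely random data could be sketched like this. This is a toy illustration under my own naming, yielding one buffer per pass rather than writing to a real device:

```python
import os

def random_pass_schedule(passes=35, block_size=512):
    """Yield one block of CSPRNG data per overwrite pass.

    For PRML/EPRML drives the encoding-specific MFM/RLL patterns buy
    nothing, so every pass here is plain random data, as Gutmann himself
    recommends for modern drives.
    """
    for _ in range(passes):
        yield os.urandom(block_size)

blocks = list(random_pass_schedule())
```

Replacing the fixed-pattern passes with random data also removes any dependence on guessing the drive's encoding correctly.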
Thanks a lot for your replies.
greets, viper