The truth about Overwriting patterns

hello,

if you want to know the truth about overwriting patterns, visit my site:

http://www.sicher-loeschen.de.vu/

You'll see that most advertising claims like "Our software even exceeds the Department of Defense standard" are simply bull....
E.g. many people use the Gutmann 35-pass method. They don't know that 27 of the 35 passes are designed to flip the bits in MFM/RLL encoding. But modern hard drives don't use that encoding. If you do 8 passes with "random" numbers, it will have the same effect.

greets, Viper
 
nice article, very well written and simple enough for a novice to follow. i'm very curious what the more knowledgeable members of this forum have to say about it.

i've always wondered why anyone came up with "patterns". it seems to me that just random data written over and over would be the best protection. anytime someone creates a pattern, there's always someone else who can reverse that pattern.
 
The article is very accurate.

The prime objective of erasing is to remove the file and any traces of it. Pattern erasing leaves the trace of the pattern, thus indicating there was something erased.

I agree on using random data, but 30 times on a modern drive could be a bit excessive. I can see the original author's point in that saving the same pattern again and again on a magnetic surface will create a deep memory over time, necessitating such a deep erasing process. In modern drives the medium may be too thin to require such drastic steps. In most cases files are just written once or twice so 4 passes of erasing will clean the drive perfectly well.

What has never been researched AFAIK is the effect of erasing on the different drive types available today.

One final point to note: to an intelligence collector, the evidence of the past existence of something may be as good as the real thing. Pattern erasing leaves a very plain fingerprint of this.

Garrett
 
thx

Thank you for your comments.
The problem with overwriting is that normal citizens and researchers don't have access to classified papers. Intelligence agencies won't tell us when they reach the point of "no recovery". So my conclusion is derived from Gutmann's statement that "a few passes with random scrubbing is the best you can do". I know that 30 passes is quite high, and even after a single overwriting pass, normal data recovery labs like IBAS cannot recover anything except perhaps with very sophisticated methods. Anyway, I think Pfitzner's recommendation of more than 30 passes with random numbers is "true". Pfitzner works for the Department of the Interior (Brandenburg), and it's highly probable that he can access classified papers. Another point is that before working for the Interior, he worked for the "State Commissioner for Data Protection and Access to Information", and the paper was intended to be made publicly available. He then had to send it to the Federal Office for IT Security, and those guys marked it as Secret. So, why should you classify a paper that won't cause problems to law enforcement?

best wishes, viper
 
From the web page:
Do NOT use the Gutmann (35-pass) Method
I agree, and I have previously suggested in this forum that the default erasing method in Eraser should be changed to something more reasonable. No answer.

Overwrite more than 30 times with strong random numbers
I really don't care what "classified" papers say, I want proof. It's silly to overwrite data over 30 times, when it's quite likely that a few passes is enough. If you need to overwrite tens of times, then you shouldn't rely on overwriting in the first place.

data is overwritten with (true?) random numbers
Using true random numbers is not feasible even if you have a hardware random number generator attached to your computer. It's just too slow. If using a cryptographically strong pseudorandom number generator is good enough for creating encryption keys, it's more than good enough for overwriting.
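To make the speed argument concrete, here is a minimal sketch (in Python, not Eraser's actual code) of overwriting a file in place with output from the operating system's CSPRNG via os.urandom. The function name, file name and pass count are placeholders I made up, and the sketch ignores filesystem details such as journaling, slack space and wear levelling.

import os

# Minimal sketch: overwrite a file in place with cryptographically strong
# pseudorandom bytes from the OS CSPRNG (os.urandom). Illustration only.
def overwrite_with_csprng(path, passes=3, chunk_size=1024 * 1024):
    size = os.path.getsize(path)
    with open(path, "r+b") as f:
        for _ in range(passes):
            f.seek(0)
            remaining = size
            while remaining > 0:
                n = min(chunk_size, remaining)
                f.write(os.urandom(n))    # fresh pseudorandom data every chunk
                remaining -= n
            f.flush()
            os.fsync(f.fileno())          # push the written data out to the disk

# Example call (hypothetical file name):
# overwrite_with_csprng("scratch.bin", passes=3)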

I have no data which might indicate magnetic forces (T), that are proven to be sufficient for securely erasing data from magnetic media
Based on the coercivity of modern hard drives and Dr. Gutmann's 1996 paper, a magnetic field around 1 Tesla (i.e. 10000 Gauss) should be enough for even the latest drives. Degaussers capable of that are expensive, though.

As software, I'd recommend Darik's Boot and Nuke. It allows you to use seed files for the PRNG
Earlier you imply that using only true random numbers might be enough, and then you recommend a program that uses the Mersenne Twister? Sure, the PRNG has a long enough period, but it's not cryptographically strong. Eraser, for instance, uses ISAAC and has a seeding system similar to the one used in GNU Privacy Guard. Just in case the user feels paranoid. :)

it is better to erase full partitions or even the complete drive, than just single files
I definitely agree with you on this one. Better yet, use encryption.
 
Anonymous said:
there's always someone else who can reverse that pattern.
That's not the point of overwriting patterns. The goal is to disturb the magnetic field on the drive platter as much as possible. The patterns are designed to do that, provided that a specific encoding scheme is used on the drive. However, with modern drives using very complex encoding schemes, using specific patterns is useless, and falling back to random data is pretty much the only choice.
 
admin said:
Pattern erasing leaves the trace of the pattern, thus indicating there was something erased.
Well, it's not like the drive contains random data when you buy it from the store. I'd say finding statistically random data on the drive is a much better sign that something was erased.

I can see the original author's point in that saving the same pattern again and again on a magnetic surface will create a deep memory
Did I misread something? I thought the author suggested using random data, not a specific pattern.

In most cases files are just written once or twice so 4 passes of erasing will clean the drive perfectly well.
Finally, some sanity in this discussion. So how about changing the default overwriting method in Eraser?

pattern erasing leaves a very plain fingerprint of this.
See above.
 
Re: thx

Viper said:
normal data recovery labs like IBAS
I wouldn't call IBAS a normal data recovery lab, they use very sophisticated equipment for recovering data from badly damaged drives. The fact that they are not able to recover overwritten data, or don't even want to try, should tell us something.

It's highly unlikely that your local law enforcement or government officials are better equipped to recover anything. And if you are worried about military intelligence, well, you will most likely have more important things to worry about than the data on your hard drive.

So, why should you classify a paper that won't cause problems to law enforcement?
Because the people who classified it don't know if it would cause problems, and wanted to play it safe? Most likely the people who made the decision aren't data recovery professionals. If you had ever worked with military people, you'd know what I am talking about.
 
Viper said:
E.g. many people use the Gutmann 35-pass method. If you do 8 passes with "random" numbers, it will have the same effect.
OK, I just noticed this, and that's just plain wrong. Even if a pattern is not designed for the specific encoding scheme used by your hard drive, overwriting with the pattern is still better than not overwriting at all. It's just not as efficient as it would be on an older drive using the targeted encoding.

So the efficiency of the Gutmann method on modern drives does not equal the efficiency of only its 8 random passes. Some of the passes written are simply less efficient than others, but they still do count. However, on modern drives, I'd say 20-30 passes of random data quite likely is more efficient than the 35-pass Gutmann method.
 
I really don't care what "classified" papers say, I want proof. It's silly to overwrite data over 30 times, when it's quite likely that a few passes is enough. If you need to overwrite tens of times, then you shouldn't rely on overwriting in the first place.
I agree that it's better to have proof than to just speculate. But that's the problem. Who has the means to do something like "forensic-proof pattern testing"? First you have to gather many different drives which use different encodings. That's no problem for university research. Buying EnCase should be no problem either. But the NSA won't give you its best hard drive recovery technology just to find out what you can do to defy it. So, as I said: 2 passes of random data are enough if your data isn't too sensitive. With 10 passes you will defeat all software tools. But as Pfitzner said: even after 20 overwrites you can recover data. If you want a high security level, you have to do 30 passes and nothing less.

Using true random numbers is not feasible even if you have a hardware random number generator attached to your computer. It's just too slow. If using a cryptographically strong pseudorandom number generator is good enough for creating encryption keys, it's more than good enough for overwriting.
I totally agree.
Based on the coercivity of modern hard drives and Dr. Gutmann's 1996 paper, a magnetic field around 1 Tesla (i.e. 10000 Gauss) should be enough for even the latest drives. Degaussers capable of that are expensive, though.
I doubt that. If you have a paper that says: With 1 Tesla everything is gone and we cannot recover anything with magnetic force microscopy (though I admit that is really time-consuming), then it's OK; otherwise you have no proof.
Earlier you imply that using only true random numbers might be enough, and then you recommend a program that uses the Mersenne Twister? Sure, the PRNG has a long enough period, but it's not cryptographically strong.
You are right; I should have explained it better. Of course the Mersenne Twister and seed files (most will use /dev/urandom) are not true randomness.
But it's quite hard to get true random numbers. Still, it's better than nothing. BTW, read about an attack on ISAAC by Marina Pudovkina:
http://eprint.iacr.org/2001/049.pdf Though ISAAC is still a secure algorithm for practical needs.

I wouldn't call IBAS a normal data recovery lab, they use very sophisticated equipment for recovering data from badly damaged drives. The fact that they are not able to recover overwritten data, or don't even want to try, should tell us something.
IBAS does have sophisticated means, no doubt about that. The (German) IT magazine iX http://www.heise.de/ix/ once did a test of erasing software with one pass of PRNG data and then sent the drives to IBAS. They were not able to recover anything. IBAS mentioned that they did not use their full potential. Anyway, I doubt that the NSA, or some other intelligence agency, is merely on the same level as IBAS concerning computer forensics.
Why? Because an intelligence agency has more people (IBAS has five people on its forensics team), more money and better research facilities.
Most likely the people who made the decision aren't data recovery professionals.
Maybe. But even if you're just some managing director and an IT expert from the Department of the Interior comes over and says: do this and you won't be able to recover anything from the criminal next door, what will you do? You'll want to hide that. If all criminals were smart and had all the information, then the police would never get evidence from a PC.

OK, I just noticed this, and that's just plain wrong. Even if a pattern is not designed for the specific encoding scheme used by your hard drive, overwriting with the pattern is still better than not overwriting at all. It's just not as efficient as it would be on an older drive using the targeted encoding.

So the efficiency of the Gutmann method on modern drives does not equal the efficiency of only its 8 random passes. Some of the passes written are simply less efficient than others, but they still do count. However, on modern drives, I'd say 20-30 passes of random data quite likely is more efficient than the 35-pass Gutmann method.
Well, the Gutmann method consists of 4 passes of random data, then 27 passes of MFM/RLL-specific patterns, and again 4 passes of random data. (Don't torture me on random or not :)
If your hard drive does use PRML, it's completely useless to use MFM patterns because your drive does not use them!
Gutmann: "For any modern PRML/EPRML drive, a few passes of random scrubbing is the best you can do."
If you want to do 35 passes, do it with random (or cryptographically strong random) numbers!
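For readers who haven't seen the paper, here is a rough sketch of that pass layout in Python. The three fixed patterns listed are only examples taken from Gutmann's table, not the full set of 27, and the helper names are mine, not from any real tool.

import os

# Rough sketch of the 4 + 27 + 4 pass layout described above.
# None marks a random pass; bytes objects are repeated fixed patterns.
EXAMPLE_FIXED_PATTERNS = [
    b"\x55",          # 01010101...
    b"\xAA",          # 10101010...
    b"\x92\x49\x24",  # one of the MFM/RLL-targeted 3-byte patterns
    # ...the remaining patterns are listed in Gutmann's 1996 paper...
]

def gutmann_style_pass_list(fixed_patterns):
    # four leading random passes, the fixed patterns, four trailing random passes
    return [None] * 4 + list(fixed_patterns) + [None] * 4

def pass_data(pattern, length):
    if pattern is None:
        return os.urandom(length)               # random scrubbing pass
    reps = length // len(pattern) + 1
    return (pattern * reps)[:length]            # repeated fixed pattern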

Thx a lot for your replies :)
greets, viper
 
i have a question for you guys. i keep seeing reference to "true" random number generation. what's the difference between "true" random number generation and, well, the "other kind"?

is Eraser capable of "true" random number generation using the "pseudorandom data" option? or does pseudorandom mean it can only imitate it but not actually do it? i hear the word pseudo used a lot in videogames, for example when they say pseudo-3D: it's not really 3-D, it's just spoofing it.
 
hello mate,

"true" random numbers are unpredictable. they are really random. You can get random data from radioactive decay measurements and other physical phenomena.
The "other kind" are pseudo-random. Pseudo-random means that it is not true random but quite hard to foretell from a statistical point of view.
To "calculate" pseudorandom numbers, PRNGs (Pseudo Random Number Generators) are used. There are a lot of PRNGs out there. Some are secure, some less good.
Eraser uses the ISAAC. This is a quite good CSPRNG (cryptographically secure PRNG). Though there exists an attack, it is not possible to do in a reasonable time.
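If you want to see the difference in code, here is a small Python illustration (the module choices are mine and have nothing to do with Eraser itself): random is a plain statistical PRNG (the Mersenne Twister), while secrets draws from the operating system's CSPRNG, which is in turn seeded from physical entropy sources.

import random
import secrets

# Plain PRNG: the Mersenne Twister. Statistically fine, but entirely
# determined by its seed, so anyone who learns the state can predict it.
prng = random.Random(12345)
print(prng.getrandbits(32))       # same output on every run with this seed

# CSPRNG: seeded from OS entropy (timings, hardware noise, ...).
# Predicting the next value from earlier output is computationally infeasible.
print(secrets.randbits(32))

# "True" random data would come straight from a physical process such as
# radioactive decay; that needs dedicated hardware, not a library call.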

greets, viper

www.sicher-loeschen.de.vu
 
Viper said:
I doubt that. If you have a paper that says: With 1 Tesla everything is gone
I don't, but I have a paper that says that with approximately 5 x the coercivity everything is gone, and I know the coercivity of modern drives, so this can be easily calculated.
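For what it's worth, the arithmetic is short; the 2000 Oe coercivity below is an assumed, illustrative figure (roughly the ballpark for drives of that time), not a value taken from the paper:

# Back-of-the-envelope check of the "about 5 x coercivity" rule of thumb.
coercivity_oe = 2000                      # assumed drive coercivity in Oersted
required_oe = 5 * coercivity_oe           # field suggested by the rule
required_gauss = required_oe              # in air, 1 Oe corresponds to 1 G
required_tesla = required_gauss / 10000   # 10,000 Gauss = 1 Tesla
print(required_tesla)                     # -> 1.0, i.e. roughly 1 T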

You are right; I should have explained it better. Of course the Mersenne Twister and seed files (most will use /dev/urandom) are not true randomness.
There seems to be a slight misunderstanding regarding the terms here. When I say true random data, I mean entropy gathered from a physical source. Something that cannot be achieved using arithmetical means. When I say cryptographically strong pseudorandom data, I mean randomness where it is computationally impossible to predict the next random number even if all previous data is known. When I speak of just pseudorandom data, I mean a stream of bits that seems (statistically) random, but is not unpredictable.

But it's quite hard to get true random numbers. Still, it's better than nothing.
I did not imply that true random data should be used; I merely noted that the Mersenne Twister is not even cryptographically strong, so we could indeed do better here. Although I am not personally convinced using cryptographically strong random data is necessary for overwriting.

BTW, read about an attack on ISAAC by Marina Pudovkina
I have read it, but with a running time of 4.67x10^1240 it's pretty much theoretical.

I doubt that the NSA, or some other intelligence agency, is merely on the same level as IBAS concerning computer forensics.
I agree, but the point I made is that if you are worried about NSA, overwriting the hard drive should be the least of your concerns. Moreover, it would be foolish to rely only on overwriting in that case.

If your hard drive does use PRML, it's completely useless to use MFM patterns
No, it's not completely pointless to use these patterns. Even if they are not optimal for your hard drive's encoding, they are still better than nothing. This means that the total effectiveness of the 35-pass Gutmann method is better than that of a method using only 8 passes of random data.

because your drive does not use them!
Like I said, even if the pattern does not optimally flip the magnetic fields on the drive, it still gets written there. Thus, it's better to overwrite with something than not to overwrite at all. I don't follow you on a drive "using" the pattern.

If you want to do 35 passes, do it with random
I definitely agree with you here. I am simply saying that using a constant pattern is not totally useless, even if it is not as effective as random data.
 
i keep seeing reference to "true" random number generation. what's the difference between "true" random number generation and, well, the "other kind"?
Viper already gave you an answer, but you may still want to read the answer to question "8.7. What does "random" mean in cryptography?" in the sci.crypt FAQ: http://www.faqs.org/faqs/cryptography-faq/part08/.

If you are interested in the mathematics behind pseudorandom sequences, I suggest reading "The Art of Computer Programming, Volume 2: Seminumerical algorithms" by Donald Knuth: http://www-cs-faculty.stanford.edu/~knuth/taocp.html.
 
OK, so with all these knowledgeable people chiming in, I have a couple of basic questions (I know the answers may not be clear-cut):

* Is overwriting with zeroes worthwhile, marginally effective, or essentially useless?

* If a user were to overwrite with pseudorandom data, how many passes would be a good choice? (Let's suppose this user is concerned about thwarting law enforcement and safeguarding privacy against data-recovery professionals. Let's also suppose that this user does use strong encryption where really required, and isn't looking for an argument about using it more extensively instead of overwriting.)
 
And yes, I did read the page linked to in the opening post. But there seems to be disagreement on how many passes should be recommended. I've been using 12, which I guess is quite enough.

Why "strongly" recommend against using Schneirer's method? After all, it does use 5 random passes, which is enough for moderate security. At least say that Schneirer's method is good to use, but not for maximum security. I mean, 2 passes are recommended for "low security" in the same article.
 
Anonymous said:
Is overwriting with zeroes worthwhile, marginally effective, or essentially useless?
If there's actually a chance someone is going to spend time recovering the data, it's only marginally effective, but still not entirely useless. However, since it has already been established that you are better off using pseudorandom data for overwriting, why bother using a constant pattern at all?

If a user were to overwrite with pseudorandom data, how many passes would be a good choice?
I think three passes should be enough for a normal person; you can increase the dosage if you feel you need more security, but anything over 30 passes really should be enough for anyone. As I said, if you need this level of security, you might be better off not relying on just overwriting. Or Windows, for that matter.
 
Anonymous said:
Why "strongly" recommend against using Schneirer's method?
I suppose the author feels that way, because if you can tolerate waiting for seven overwriting passes to complete, you might just as well use pseudorandom data for all of them while you're at it.
 
Thanks. The only reason I wondered about overwriting with zeroes is that I had been using a novel utility called mst RealDelete, and I had to use it in "Fast" mode, which simply overwrites deleted files with zeroes. (The reason I had to use "Fast" mode is that my system crashed constantly when any other option was selected. This is a known bug in the software.)

It's a moot point now, since I've removed mst RealDelete. I love the concept, but it's just too poorly implemented, and too buggy. I really wish someone would take that utility idea and do it right, though.
 