What are some of the best hashing algorithms to use for data integrity and deduplication?
I'm trying to hash a large number of files with binary data inside of them in order to: (1) check for corruption in the future, and (2) eliminate duplicate files (which might have completely different names and other metadata).

I know about md5 and sha1 and their relatives, but my understanding is that these are designed for security and therefore are deliberately slow in order to reduce the efficacy of brute force attacks. In contrast, I want algorithms that run as fast as possible, while reducing collisions as much as possible.

Any suggestions?

Lovettalovich answered 27/7, 2012 at 22:20 Comment(0)

You are mostly right. If your system does not face any adversary, using cryptographic hash functions is overkill given their security properties.


Collision probability depends on the number of bits, b, of your hash function and on the number of hash values, N, you expect to compute. The academic literature argues that this collision probability must be below the hardware error probability, so that a hash collision is less likely than an error slipping through a byte-by-byte comparison of the data [ref1,ref2,ref3,ref4,ref5]. Hardware error probability is in the range of 2^-12 to 2^-15 [ref6]. If you expect to generate N = 2^q hash values, then your collision probability is given by this equation, which already takes the birthday paradox into account:

P ≈ N^2 / 2^(b-1) = 2^(2q - b + 1)

The number of bits of your hash function is directly proportional to its computational cost. So you want to find the hash function with the fewest bits possible that still keeps the collision probability at an acceptable value.


Here's an example on how to make that analysis:

  • Let's say you have f = 2^15 files;
  • The average size of each file, lf, is 2^20 bytes;
  • You plan to divide each file into chunks of average size lc = 2^10 bytes;
  • Each file will then be divided into c = lf/lc = 2^10 chunks;
  • You will therefore hash N = f*c = 2^25 objects, i.e. q = 25.

From that equation the collision probability for several hash sizes is the following:

  • P(hash=64 bits) = 2^(2*25-64+1) = 2^-13 (less than 2^-12)
  • P(hash=128 bits) = 2^(2*25-128+1) = 2^-77 (far below 2^-12)
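The arithmetic above is easy to check with a short script. This is just a sketch using the same upper bound P ≈ 2^(2q - b + 1) as in the calculations above; the function name is mine, not from any library:

```python
# Sketch: reproduce the collision-probability estimates above using the
# birthday-bound approximation P ≈ 2^(2q - b + 1), where q is the base-2 log
# of the number of hashed objects and b is the hash size in bits.
from math import log2

def collision_probability_log2(q_bits: int, hash_bits: int) -> int:
    """Return log2 of the approximate collision probability."""
    return 2 * q_bits - hash_bits + 1

f = 2 ** 15            # number of files
lf = 2 ** 20           # average file size in bytes
lc = 2 ** 10           # average chunk size in bytes
c = lf // lc           # chunks per file = 2^10
q = int(log2(f * c))   # log2 of the number of hashed objects = 25

print(collision_probability_log2(q, 64))    # -> -13, i.e. P ≈ 2^-13
print(collision_probability_log2(q, 128))   # -> -77, i.e. P ≈ 2^-77
```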

Now you just need to decide which non-cryptographic hash function of 64 or 128 bits to use, knowing that 64 bits keeps the probability fairly close to the hardware error rate (but will be faster), while 128 bits is a much safer option (though slower).


Below is a short list, taken from Wikipedia, of non-cryptographic hash functions. I know MurmurHash3, and it is much faster than any cryptographic hash function:

  1. Fowler–Noll–Vo : 32, 64, 128, 256, 512 and 1024 bits
  2. Jenkins : 64 and 128 bits
  3. MurmurHash : 32, 64, 128, and 160 bits
  4. CityHash : 64, 128 and 256 bits
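As an illustration of how simple these functions can be, here is a plain-Python sketch of the 64-bit FNV-1a variant (the first entry in the list). It is written for clarity, not speed; a real deployment would use a native implementation:

```python
# Sketch of 64-bit FNV-1a, one of the non-cryptographic hashes listed above.
# Constants are the standard FNV-1a 64-bit offset basis and prime.
FNV64_OFFSET = 0xcbf29ce484222325
FNV64_PRIME = 0x100000001b3
MASK64 = 0xFFFFFFFFFFFFFFFF

def fnv1a_64(data: bytes) -> int:
    h = FNV64_OFFSET
    for byte in data:
        h ^= byte                        # XOR the byte in first (the "1a" order)
        h = (h * FNV64_PRIME) & MASK64   # multiply by the FNV prime, mod 2^64
    return h

# By definition, the hash of the empty input is the offset basis.
print(hex(fnv1a_64(b"")))   # -> 0xcbf29ce484222325
```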
Corundum answered 21/10, 2012 at 21:22 Comment(1)
First of all, thank you for putting in the time to explain this; I really appreciate it. Second, I wanted to ask a clarifying question: how do you defend differently against an adversary versus a huge number of files? Isn't the end result the same: generation of enough data that you finally find two pieces of data that hash the same? (Either randomly, or through targeted analysis of the algorithm.) – Lovettalovich

MD5 and SHA1 are not designed for security, no, so they are not particularly secure, and hence not really very slow, either. I've used MD5 for deduplication myself (with Python), and performance was just fine.

This article claims machines today can compute the MD5 hash of 330 MB of data per second.

SHA-1 was developed as a safer alternative to MD5 when it was discovered that you could craft inputs that would hash to the same value with MD5, but I think for your purposes MD5 will work fine. It certainly did for me.
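For what it's worth, the deduplication pattern is only a few lines with the standard library's hashlib. A minimal sketch (the file contents and helper names here are illustrative, not a fixed API):

```python
# Minimal deduplication sketch using hashlib.md5 from the standard library.
# Content is fed to the hash in chunks, the same way you would stream a large
# file from disk without holding it all in memory.
import hashlib

def md5_of_chunks(chunks) -> str:
    """Hash an iterable of byte chunks and return the hex digest."""
    h = hashlib.md5()
    for chunk in chunks:
        h.update(chunk)
    return h.hexdigest()

def find_duplicates(files: dict) -> dict:
    """Group file names by content digest; groups with >1 name are duplicates.

    `files` maps a name to the file's content as bytes (an illustrative
    stand-in for reading real files from disk).
    """
    by_digest = {}
    for name, content in files.items():
        digest = md5_of_chunks([content])
        by_digest.setdefault(digest, []).append(name)
    return {d: names for d, names in by_digest.items() if len(names) > 1}

dupes = find_duplicates({"a.bin": b"\x00\x01", "b.bin": b"\x00\x01", "c.bin": b"\x02"})
print(dupes)  # the two files with identical content are grouped under one digest
```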

Undersecretary answered 27/9, 2012 at 11:43 Comment(2)
MD5 and SHA1 are cryptographic hash functions, thus designed for security purposes. Just because their security has been compromised (SHA1 less so), it doesn't mean they were not designed for security. – Corundum
(SHA1 is not compromised that much)* – Corundum

If security is not a concern for you, you can take one of the secure hash functions and reduce the number of rounds. This makes them cryptographically unsound but still perfectly fine for equality testing.

Skein is very strong. It has 80 rounds. Try reducing to 10 or so.

Or encrypt with AES and XOR the output blocks together. AES is hardware-accelerated on modern CPUs and insanely fast.

Osborn answered 21/10, 2012 at 21:25 Comment(0)
