MD5 Flaw Threatens File Integrity

edited December 2004 in Science & Tech
According to a report from security researcher Dan Kaminsky, the MD5 cryptographic algorithm may be at risk. This means that files, applications and programs supposedly authenticated and verified by MD5 could potentially be compromised.
In a research paper titled "MD5 To Be Considered Harmful Some Day," Kaminsky expanded on the theoretical work on "Collisions for MD5 Hash Functions" done by Chinese security researchers Xiaoyun Wang, Dengguo Feng, Xuejia Lai and Hongbo Yu. Kaminsky also released a tool, Stripwire, to demonstrate some of the attacks he describes.

A hash collision means that two different inputs produce the same output from a hash function. A hash algorithm in which collisions can be found is no longer considered cryptographically secure and can be attacked. In August, French researcher Antoine Joux presented an unpublished paper at the Crypto 2004 conference with findings similar to the original Chinese research that Kaminsky expanded upon.
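To make that concrete: checking for a collision is just a matter of hashing two different inputs and comparing the digests. A minimal sketch in Python follows; the placeholder byte strings stand in for real colliding blocks, which in practice come from a collision-finding tool.

    import hashlib

    def md5_hex(data: bytes) -> str:
        # MD5 digest of a byte string, as a hex string
        return hashlib.md5(data).hexdigest()

    # Placeholders: actual colliding blocks would be produced by a
    # collision-finding tool, not typed in by hand.
    block_a = b"first message"
    block_b = b"second message"

    if block_a != block_b and md5_hex(block_a) == md5_hex(block_b):
        print("Collision: two different inputs, one MD5 digest")
    else:
        print("No collision for these particular inputs")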

At the time, the disclosure prompted data storage giant EMC to reassure its customers that the MD5 algorithm it uses is enhanced and buried deep in its platform, making it virtually unexploitable.

"Some people have said there's no applied implications to Joux and Wang's research," Kaminsky wrote. "They're wrong; arbitrary payloads can be successfully integrated into a hash collision."
Source: Internet News
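Kaminsky's "arbitrary payloads" point rests on how MD5 processes input block by block: once two block-aligned messages collide, appending the same suffix to both keeps the digests equal, because the hash's internal state after the colliding prefix is already identical. Below is a rough sketch of that check; collision_a.bin and collision_b.bin are hypothetical colliding blocks assumed to have been written out by a collision generator.

    import hashlib

    def md5_hex(data: bytes) -> str:
        return hashlib.md5(data).hexdigest()

    # Hypothetical colliding, block-aligned messages from a collision generator.
    with open("collision_a.bin", "rb") as f:
        block_a = f.read()
    with open("collision_b.bin", "rb") as f:
        block_b = f.read()

    payload = b"arbitrary payload appended to both messages"

    # If the prefixes collide, the extended messages collide too: MD5's
    # internal state after the colliding prefix is the same for both.
    assert md5_hex(block_a) == md5_hex(block_b)
    assert md5_hex(block_a + payload) == md5_hex(block_b + payload)
    print("Digests still match after appending the payload")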

Comments

  • Straight_Man Geeky, in my own way Naples, FL Icrontian
    edited December 2004
    Problem is, ANY checksum algorithm can generate hash collisions. ANY. It's harder with a 256 bit checksum, that is all. AND, MD5Sum IS a 256 bit hash as used right now. PGP uses a variant of it, also. 512 bit checksums will probably come into play for many software integrity checks, and ARE used for comm security and possibly some file verification security at government level now. NERO uses MD5Sum to validate burns. BTW, there are two ways to use MD5Sum for evaluating files. One is to checksum each file AND the overall archive. Modern Linux does this, with the person burning responsible for checking the MD5Sum of the ISO file before and after burning (overall), and the installer (if this is a Linux RPM package installation, for instance) is responsible for checking:

    An MD5Sum checksum for each RPM file, and
    A security signature for each RPM file (PGP plus MD5 special hash).

    The only way to make this tighter is to put MD5Sums for each file in the RPM into the archive headers themselves, and cross-check those headers against an md5sum of each file as extracted. Some RPM-based Linuxes do THAT in their latest versions also. If all of them do not check out, you have to override the install to make it happen. EVERY time I have done that, I have ended up having to fix SOMETHING.

    How do you eliminate a single-layer collision-type penetration? Make it MULTILAYER, and require that ALL layers match at install time, not just a single check of the archive and the burned archive itself. It's very, very hard to get synchronous multilayer identicals in a hash collision exploitation scenario.
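    A rough sketch of that multilayer idea in Python using hashlib; the file names and the per-file manifest below are made up for illustration, and the expected checksums would come from the distributor's published values.

        import hashlib

        def md5_of_file(path: str) -> str:
            # Hash a file in chunks so a large ISO does not need to fit in memory.
            h = hashlib.md5()
            with open(path, "rb") as f:
                for chunk in iter(lambda: f.read(65536), b""):
                    h.update(chunk)
            return h.hexdigest()

        def verify_layers(iso_path, expected_iso_md5, per_file_manifest):
            # Layer 1: the overall image must match the published checksum.
            if md5_of_file(iso_path) != expected_iso_md5:
                return False
            # Layer 2: every extracted file must match its own checksum.
            return all(md5_of_file(p) == want for p, want in per_file_manifest.items())

        # Hypothetical usage; checksums would come from the distributor.
        ok = verify_layers("distro.iso", "expected-iso-md5-here",
                           {"extracted/pkg.rpm": "expected-rpm-md5-here"})
        print("all layers match" if ok else "verification failed, do not install")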
  • TheBaron Austin, TX
    edited December 2004
    absolutely
  • edited December 2004
    Uh, no.