Date: Tue, 18 Dec 2012 12:23:52 -0500
From: Matt Weir <cweir@...edu>
To: crypt-dev@...ts.openwall.com
Subject: Re: Intentionally Increasing Collisions in Password
Hashing Algorithms
Hey Solar,
> Why look at a really weak hash first? I think it only makes sense to
> consider controversial changes such as deliberate hash collisions on top
> of state of the art setups.
I'd like to begin by saying I mostly agree with you. Referring back to
my earlier post:
>> Ideally we should be focusing on getting people to at least upgrade to
>> phpass before we start mucking around with hash collisions
The reason I wanted to start with a fast hash like vBulletin's was
twofold. First, I was trying to define an edge case (a lower bound on
security) to see if this whole approach even had a chance of being
useful. To that end I wanted to model a situation where there was a
huge disparity between online and offline cracking speeds. Once that
edge case was examined I could move on to more conservative setups. My
second reason gets into some issues that can stray way off topic from
the current thread, but it's due to the fact that adoption of bcrypt
and other computationally expensive hashing schemes has been very
limited. I can name three sites that I know use bcrypt: Twitter,
Cloudflare and @bitweasil's website. The number of websites I can name
that use portable phpass hashes or vBulletin is much greater. This
might be something you could shed a lot of light on for me. In your
estimation, what's the likelihood/time-frame for getting a hash of
bcrypt-equivalent strength as the default in phpBB3? I know phpass
supports bcrypt as one of its modes, which is why I'm specifically
wondering about its implementation in common software like phpBB3.
It's been my experience that when many people hear "computationally
expensive hashes" they immediately think it will cause massive
performance issues or open them up to new types of DoS attacks.
That's why I like seeing tests/studies like those run by @DefuseSec,
https://defuse.ca/website-key-stretching-feasibility.htm, that
investigate, and hopefully disprove, the worries that site admins
have. That's also part of the reason why I'm interested in hash
collisions. Since they add no real performance overhead, might they
gain some adoption among sites that are unwilling to take the
performance hit of using PBKDF2 or bcrypt with a large number of
iterations?
To answer the question of what happens when portable phpass hashes or
bcrypt are used with collisions, let me refer back to the model I
described previously. The reason I focused on a model first is that
it's now fairly easy to look at the various assumptions I made, tweak
them, and then see the results.
Let's start with portable phpass hashes, like those used in phpBB3.
Referring back to the previous hardware setup, it can be expected that
our 'average' attacker can generate around 6 million hashes a second.
That's a huge decrease from the 4 billion hashes a second they were
getting with vBulletin hashes, but it still amounts to 1.8 billion
guesses in a five-minute time-frame. Thanks to the wonders of Excel, I
can simply plug the shorter cracking session into the Excel workbook I
posted earlier, (I just ended the checks at row 16955 fwiw), and I see that
69,322 collisions would be generated on average. That's taking into
account that the attacker would crack nearly 37% of all the passwords
in that time-frame.
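As a quick sanity check on the guess count above, here's the
arithmetic in a couple of lines of Python. This covers only the
guesses-per-window figure; the 69,322 collision estimate still comes
from the workbook's model, which I'm not reproducing here.

```python
# Guesses available to an attacker in a five-minute cracking session
# at the portable phpass rate assumed above (~6 million hashes/sec).
hash_rate = 6_000_000   # portable phpass hashes per second
window = 5 * 60         # five-minute session, in seconds

guesses = hash_rate * window
print(guesses)          # 1.8 billion guesses
```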
Unfortunately, the passwordproject's webpage doesn't have benchmarks
for bcrypt, but referring to hashcat's main webpage, (hashcat.net), an
attacker can expect to get around 8,000 c/s against bcrypt $2a$ hashes
with two HD 6990 GPUs. That's about 2,400,000 guesses in a five-minute
period, (and is why bcrypt is really nice). Referring back to the
Excel document, (row 2274), we are now averaging around 132 collisions
per user, with the attacker cracking 9.3% of the passwords.
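The same back-of-the-envelope arithmetic for the bcrypt case, plus
the ratio between the two rates (both rates as quoted above; this is
a sketch of the guess counts, not a benchmark):

```python
# Five-minute guess budgets at the two hash rates quoted above.
window = 5 * 60            # five-minute session, in seconds
bcrypt_rate = 8_000        # bcrypt $2a$ hashes/sec (two HD 6990s)
phpass_rate = 6_000_000    # portable phpass hashes/sec

bcrypt_guesses = bcrypt_rate * window   # 2.4 million guesses
phpass_guesses = phpass_rate * window   # 1.8 billion guesses

# bcrypt shrinks the attacker's guess budget by this factor:
print(phpass_guesses // bcrypt_guesses)  # 750
```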
What does this all mean? I'm still trying to figure that out... While
132 collisions do add some cost for an attacker, it might not be
enough to justify the approach, especially considering the amount of
underlying risk that's accepted in that example. That being said,
nearly 70k collisions for a popular hashing algorithm like phpass
would have a huge impact on an attacker.
In short, I'm still on the fence about whether adding collisions is a
good idea, and if so, when and where it would be applicable. That's
why I'm still investigating it, and why I'm very interested to hear
everyone else's opinions.