Frequently Asked Questions For
Information Security Intelligence: Cryptographic Principles & Applications


Assume a cryptographic algorithm in which the work for the good guys (those who know the key) grows linearly with the length of the key, and for which the only way to break it is a brute force attack of trying all possible keys. Suppose the performance for the good guys is adequate (e.g. it can encrypt and decrypt as fast as the bits can be transmitted over the wire) at a certain key size. Then suppose advances in computer technology make computers twice as fast. Given that both the good guys and the bad guys get faster computers, does this advance in computer speed work to the advantage of the good guys, the bad guys, or does it matter at all?

Moore's law states that computing power doubles periodically (say, every year). This means an attacker can exhaust the brute force search space twice as fast. But adding one bit to the key costs the good guys only a linear increase in work while doubling the size of the search space, so the advantage goes to the good guys. All they have to do is absorb a small linear (and, given the faster hardware, imperceptible) increase in computational cost by adding one bit to the key each time computing power doubles, while all the extra power gained by the bad guys only lets them achieve the same rate of success that they have today.
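
For concreteness, here is a small illustrative sketch in Python (with made-up figures for key length and attack speed, not values from the text) showing the arithmetic: doubling the attacker's speed is exactly cancelled by adding one bit to the key, while the defender's linear cost barely changes.

def brute_force_time(key_bits, keys_per_second):
    # Expected seconds to exhaust the full keyspace of a key_bits-bit key.
    return (2 ** key_bits) / keys_per_second

def defender_cost(key_bits):
    # Per the premise, the defender's work grows only linearly with key length
    # (arbitrary linear units for illustration).
    return key_bits

# Today: a 64-bit key, attacker tries 10^9 keys per second (assumed figures).
today_attack = brute_force_time(64, 1e9)

# After the speedup: computers are twice as fast, defender adds one bit.
tomorrow_attack = brute_force_time(65, 2e9)

print("Attack time today:   ", today_attack, "s, defender cost", defender_cost(64))
print("Attack time tomorrow:", tomorrow_attack, "s, defender cost", defender_cost(65))

# The two attack times are identical: 2^65 / (2 * 10^9) == 2^64 / 10^9.
# The extra bit exactly cancels the attacker's speedup, while the defender's
# linear cost rises only from 64 to 65 units.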

 


Many of the answers to FAQs are from lectures presented at JWU.