Remember to take the value of the key into account. Public keys are often used to secure things of great value for a long time: the bank's master key for a digital cash system, the key the government uses to certify its passports, or a notary public's digital signature key. It probably isn't worth the effort to spend months of computing time to break an individual's private key, but if you can print your own money with a broken key the idea becomes more attractive. A 1024-bit key is long enough to sign something that will be verified within the week, or month, or even a few years. But you don't want to stand up in court 20 years from now with a digitally signed document and have the opposition demonstrate how to forge documents with the same signature.
Making predictions beyond the near future is even more foolish. Who knows what kind of advances in computing, networking, and mathematics are going to happen by 2020? However, if you look at the broad picture, in every decade we can factor numbers twice as long as in the previous decade. This leads to Table 7.7.
On the other hand, factoring technology may reach its Omega point long before 2045. Twenty years from now, we may be able to factor anything. I think that is unlikely, though.
Not everyone will agree with my recommendations. The NSA has mandated 512-bit to 1024-bit keys for their Digital Signature Standard (see Section 20.1), far less than I recommend for long-term security. Pretty Good Privacy (see Section 24.12) has a maximum RSA key length of 2047 bits. Arjen Lenstra, the world's most successful factorer, refuses to make predictions past 10 years [949]. Table 7.8 gives Ron Rivest's key-length recommendations, originally made in 1990, which I consider much too optimistic [1323]. While his analysis looks fine on paper, recent history illustrates that surprises regularly happen. It makes sense to choose your keys to be resilient against future surprises.
Table 7.7 Long-range Factoring Predictions

| Year | Key Length (in bits) |
|---|---|
| 1995 | 1024 |
| 2005 | 2048 |
| 2015 | 4096 |
| 2025 | 8192 |
| 2035 | 16,384 |
| 2045 | 32,768 |
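As a rough illustration, the extrapolation behind Table 7.7 can be reproduced in a few lines of Python. This is only a sketch of the doubling-per-decade assumption, anchored at the table's 1024-bit figure for 1995; it is not a model of factoring itself.

```python
# Sketch: compound the assumption that the factorable key length
# doubles every decade, starting from 1024 bits in 1995.
start_year, start_bits = 1995, 1024

for decade in range(6):                  # 1995 through 2045
    year = start_year + 10 * decade
    bits = start_bits * 2 ** decade
    print(f"{year}: {bits:,} bits")
```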
The low, average, and high estimates in Table 7.8 rest on different assumptions. Low estimates assume a budget of $25,000, the quadratic sieve algorithm, and a technology advance of 20 percent per year. Average estimates assume a budget of $25 million, the general number field sieve algorithm, and a technology advance of 33 percent per year. High estimates assume a budget of $25 billion, a general quadratic sieve algorithm running at the speed of the special number field sieve, and a technology advance of 45 percent per year.
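For reference, the heuristic asymptotic running times of the two algorithms named in these estimates, written in the standard L-notation for a modulus n, are:

$$\text{quadratic sieve:}\quad L_n\!\left[\tfrac{1}{2},\,1\right] = e^{(1+o(1))\,(\ln n)^{1/2}(\ln\ln n)^{1/2}}$$

$$\text{general number field sieve:}\quad L_n\!\left[\tfrac{1}{3},\,(64/9)^{1/3}\right] = e^{\left((64/9)^{1/3}+o(1)\right)\,(\ln n)^{1/3}(\ln\ln n)^{2/3}}$$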
There is always the possibility that an advance in factoring will surprise me as well, but I factored that into my calculations. But why trust me? I just proved my own foolishness by making predictions.
DNA Computing
Now it gets weird. In 1994 Leonard M. Adleman actually demonstrated a method for solving an NP-complete problem (see Section 11.2) in a biochemistry laboratory, using DNA molecules to represent guesses at solutions to the problem [17]. (That's "solutions" meaning answers, not meaning liquids containing solutes. Terminology in this field is going to be awkward.) The problem that Adleman solved was an instance of the Directed Hamiltonian Path problem: Given a map of cities connected by one-way roads, find a path from City A to City Z that passes exactly once through all other cities on the map. Each city was represented by a different random 20-base string of DNA; with conventional molecular biology techniques, Adleman synthesized 50 picomols (30 million million molecules) of the DNA string representing each city. Each road was also represented by a 20-base DNA string, but these strings were not chosen randomly: They were cleverly chosen so that the beginning end of the DNA string representing the road from City P to City K (Road PK) would tend to stick to the DNA string representing City P, and the end of Road PK would tend to stick to City K.
Table 7.8 Rivest's Optimistic Key-length Recommendations (in bits)

| Year | Low | Average | High |
|---|---|---|---|
| 1990 | 398 | 515 | 1289 |
| 1995 | 405 | 542 | 1399 |
| 2000 | 422 | 572 | 1512 |
| 2005 | 439 | 602 | 1628 |
| 2010 | 455 | 631 | 1754 |
| 2015 | 472 | 661 | 1884 |
| 2020 | 489 | 677 | 2017 |
Adleman synthesized 50 picomols of the DNA representing each road, mixed them all together with the DNA representing all the cities, and added a ligase enzyme, which links together the ends of DNA molecules. The clever relationship between the road DNA strings and the city DNA strings causes the ligase to link the road DNA strings together in a legal fashion. That is, the exit end of the road from P to K will always be linked to the entrance end of some road that originates at City K, never to the exit end of any road and never to the entrance end of a road that originates at some city other than K. After a carefully limited reaction time, the ligase has built a large number of DNA strings representing legal but otherwise random multiroad paths within the map.
From this soup of random paths, Adleman can find the tiniest trace, perhaps even a single molecule, of the DNA that represents the answer to the problem. Using common techniques of molecular biology, he discards all the DNA strings representing paths that are too long or too short. (The number of roads in the desired path must equal the number of cities minus one.) Next he discards all the DNA strings that do not pass through City A, then those that miss City B, and so forth. If any DNA survives this screening, it is examined to find the sequence of roads that it represents: This is the solution to the directed Hamiltonian path problem.
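To make these screening steps concrete, here is a toy in-silico sketch in Python. The seven-city map is hypothetical (it is not Adleman's actual instance); the random-walk step stands in for the ligase reaction, and the list filters stand in for the molecular screening.

```python
import random

# Hypothetical seven-city map with one-way roads (illustrative only).
roads = {
    "A": ["B", "C"], "B": ["C", "D"], "C": ["D", "E"],
    "D": ["E", "F"], "E": ["F", "G"], "F": ["G"], "G": [],
}
cities = set(roads)
start, end = "A", "G"

def random_legal_path():
    """Stand-in for the ligase step: extend a path one legal road at a time."""
    path = [start]
    # Stop at a dead end or once the path is already as long as it can usefully be.
    while roads[path[-1]] and len(path) <= len(cities):
        path.append(random.choice(roads[path[-1]]))
    return path

# "Synthesize" a soup of random legal paths, then screen it.
soup = [random_legal_path() for _ in range(100_000)]
right_length = [p for p in soup if len(p) == len(cities) and p[-1] == end]
survivors = [p for p in right_length if set(p) == cities]  # every city visited once

print(survivors[0] if survivors else "no solution found in this soup")
```

The point of the DNA version, of course, is that the chemistry performs these screening steps on an enormous number of candidate paths at once, whereas the sketch above examines them one at a time.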
By definition, an instance of any NP-complete problem can be transformed, in polynomial time, into an instance of any other NP-complete problem, and therefore into an instance of the directed Hamiltonian path problem. Since the 1970s, cryptologists have been trying to use NP-complete problems for public-key cryptography.
While the instance that Adleman solved was very modest (seven cities on his map, a problem that can be solved by inspection in a few minutes), the technique is in its infancy and has no forbidding obstacles keeping it from being extended to larger problems. Thus, arguments about the security of cryptographic protocols based on NP-complete problems, arguments that heretofore have begun, "Suppose an adversary has a million processors, each of which can perform a million tests each second," may soon have to be replaced with, "Suppose an adversary has a thousand fermentation vats, each 20,000 liters in capacity."