Although the algorithm was one of the first public-key algorithms, and there were no successful cryptanalytic results against it, it has never gained wide acceptance in the cryptographic community. The scheme is two to three orders of magnitude faster than RSA, but has some problems. The public key is enormous: 2^19 bits long. The data expansion is large: the ciphertext is twice as long as the plaintext.
Some attempts at cryptanalysis of this system can be found in [8, 943, 1559, 306]. None of these were successful in the general case, although the similarity between the McEliece algorithm and knapsacks worried some.
In 1991, two Russian cryptographers claimed to have broken the McEliece system with some parameters [882]. Their paper contained no evidence to substantiate their claim, and most cryptographers discount the result. Another Russian attack, one that cannot be used directly against the McEliece system, is in [1447, 1448]. Extensions to McEliece can be found in [424, 1227, 976].
Other Algorithms Based on Linear Error-Correcting Codes
The Niederreiter algorithm [1167] is closely related to the McEliece algorithm, and assumes that the public key is a random parity-check matrix of an error-correcting code. The private key is an efficient decoding algorithm for this matrix.
Another algorithm, used for identification and digital signatures, is based on syndrome decoding [1501]; see [306] for comments. An algorithm based on error-correcting codes [1621] is insecure [698, 33, 31, 1560, 32].
Elliptic curves have been studied for many years and there is an enormous amount of literature on the subject. In 1985, Neal Koblitz and V. S. Miller independently proposed using them for public-key cryptosystems [867, 1095]. They did not invent a new cryptographic algorithm with elliptic curves over finite fields, but they implemented existing public-key algorithms, like Diffie-Hellman, using elliptic curves.
Elliptic curves are interesting because they provide a way of constructing elements and rules of combination that produce groups. These groups have enough familiar properties to build cryptographic algorithms, but they don't have certain properties that may facilitate cryptanalysis. For example, there is no good notion of "smooth" with elliptic curves. That is, there is no set of small elements in terms of which a random element has a good chance of being expressed by a simple algorithm. Hence, index-calculus discrete logarithm algorithms do not work. See [1095] for more details.
Elliptic curves over the finite field GF(2^n) are particularly interesting. The arithmetic processors for the underlying field are easy to construct and are relatively simple to implement for n in the range of 130 to 200. They have the potential to provide faster public-key cryptosystems with smaller key sizes. Many public-key algorithms, like Diffie-Hellman, ElGamal, and Schnorr, can be implemented in elliptic curves over finite fields.
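As an illustration of the group law these systems are built on, here is a toy Python sketch: point addition and scalar multiplication on a curve over a small prime field (a prime field rather than GF(2^n), purely for simplicity), used to run a miniature elliptic-curve Diffie-Hellman exchange. The curve, base point, and secret values are invented for the demonstration; real systems use standardized curves over fields of the sizes mentioned above.

```python
# Toy curve y^2 = x^3 + 2x + 3 over GF(97); all parameters are made up
# for illustration -- real systems use much larger, standardized values.
P, A = 97, 2
INF = None  # the point at infinity, the group's identity element

def ec_add(p1, p2):
    """Add two points on the curve (the elliptic-curve group law)."""
    if p1 is INF:
        return p2
    if p2 is INF:
        return p1
    (x1, y1), (x2, y2) = p1, p2
    if x1 == x2 and (y1 + y2) % P == 0:
        return INF                      # p2 is the inverse of p1
    if p1 == p2:                        # tangent line (point doubling)
        lam = (3 * x1 * x1 + A) * pow(2 * y1, -1, P) % P
    else:                               # chord through the two points
        lam = (y2 - y1) * pow(x2 - x1, -1, P) % P
    x3 = (lam * lam - x1 - x2) % P
    return (x3, (lam * (x1 - x3) - y1) % P)

def ec_mul(k, pt):
    """Scalar multiplication k*pt by repeated doubling."""
    acc = INF
    while k:
        if k & 1:
            acc = ec_add(acc, pt)
        pt = ec_add(pt, pt)
        k >>= 1
    return acc

# A miniature Diffie-Hellman exchange in the curve's group.
G = (3, 6)                 # base point: 6^2 = 3^3 + 2*3 + 3 (mod 97)
alice_secret, bob_secret = 13, 29
alice_pub = ec_mul(alice_secret, G)
bob_pub = ec_mul(bob_secret, G)
# Both sides derive the same shared point:
assert ec_mul(alice_secret, bob_pub) == ec_mul(bob_secret, alice_pub)
```

The exchange works exactly as ordinary Diffie-Hellman does, with the group operation swapped from modular multiplication to point addition; the security of the real thing rests on the difficulty of the elliptic-curve discrete logarithm problem.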
The mathematics here are complex and beyond the scope of this book. Those interested in this topic are invited to read the two references previously mentioned, and the excellent book by Alfred Menezes [1059]. Two analogues of RSA work in elliptic curves [890, 454]. Other papers are [23, 119, 1062, 869, 152, 871, 892, 25, 895, 353, 1061, 26, 913, 914, 915]. Elliptic curve cryptosystems with small key lengths are discussed in [701]. Next Computer, Inc.'s Fast Elliptic Encryption (FEE) algorithm also uses elliptic curves [388]. FEE has the nice feature that the private key can be any easy-to-remember string. There are proposed public-key cryptosystems using hyperelliptic curves [868, 870, 1441, 1214].
Some cryptographers have developed generalizations of RSA that use various permutation polynomials instead of exponentiation. A variation called Kravitz-Reed, using irreducible binary polynomials [898], is insecure [451, 589]. Winfried Müller and Wilfried Nöbauer use Dickson polynomials [1127, 1128, 965]. Rudolf Lidl and Müller generalized this approach in [966, 1126] (a variant is called the Rédei scheme), and Nöbauer looked at its security in [1172, 1173]. (Comments on prime generation with Lucas functions are in [969, 967, 968, 598].) Despite all of this prior art, a group of researchers from New Zealand managed to patent this scheme in 1993, calling it LUC [1486, 521, 1487].
The nth Lucas number, V_n(P,1), is defined as

V_n(P,1) = P * V_{n-1}(P,1) - V_{n-2}(P,1)

where V_0(P,1) = 2 and V_1(P,1) = P.
There's a lot more theory to Lucas numbers; I'm ignoring all of it. A good theoretical treatment of Lucas sequences is in [1307, 1308]. A particularly nice description of the mathematics of LUC is in [1494, 708].
In any case, to generate a public-key/private-key key pair, first choose two large primes, p and q. Calculate n, the product of p and q. The encryption key, e, is a random number that is relatively prime to p - 1, q - 1, p + 1, and q + 1.
There are four possible decryption keys:

d = e^-1 mod (lcm((p + 1), (q + 1)))
d = e^-1 mod (lcm((p + 1), (q - 1)))
d = e^-1 mod (lcm((p - 1), (q + 1)))
d = e^-1 mod (lcm((p - 1), (q - 1)))

where lcm is the least common multiple.
The public key is e and n; the private key is d and n. Discard p and q.
To encrypt a message, P (P must be less than n), calculate

C = V_e(P,1) (mod n)

And to decrypt:

P = V_d(C,1) (mod n)
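The scheme can be sketched in a few lines of Python, with made-up toy primes (real moduli would be hundreds of digits long). V_k(P,1) mod n is computed with a Lucas ladder; and rather than choosing among the four message-dependent decryption keys, the sketch inverts e modulo lcm(p - 1, p + 1, q - 1, q + 1), which yields a single decryption key valid for every message.

```python
from math import gcd

def lucas_v(P, k, n):
    """Compute V_k(P, 1) mod n with a Lucas ladder.

    Maintains the pair (V_j, V_{j+1}) and steps through the bits of k
    using the Q = 1 identities V_{2j} = V_j^2 - 2 and
    V_{2j+1} = V_j * V_{j+1} - P.
    """
    v0, v1 = 2, P % n          # V_0 = 2, V_1 = P
    for bit in bin(k)[2:]:
        if bit == '1':
            v0, v1 = (v0 * v1 - P) % n, (v1 * v1 - 2) % n
        else:
            v0, v1 = (v0 * v0 - 2) % n, (v0 * v1 - P) % n
    return v0

def lcm(*xs):
    out = 1
    for x in xs:
        out = out * x // gcd(out, x)
    return out

# Toy key generation; p, q, and e are invented demo values.
p, q = 1949, 2089
n = p * q
e = 1103                       # must be coprime to p-1, p+1, q-1, q+1
assert all(gcd(e, m) == 1 for m in (p - 1, p + 1, q - 1, q + 1))
# One decryption key that works for all messages: invert e modulo
# lcm(p-1, p+1, q-1, q+1), which covers all four cases at once.
d = pow(e, -1, lcm(p - 1, p + 1, q - 1, q + 1))

msg = 123456
c = lucas_v(msg, e, n)          # encrypt: C = V_e(P, 1) mod n
assert lucas_v(c, d, n) == msg  # decrypt: P = V_d(C, 1) mod n
```

Decryption works because V_d(V_e(P,1),1) = V_de(P,1), and de = 1 modulo both p - 1 and p + 1 (likewise for q), so the result reduces back to P regardless of which of the four cases applies.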
At best, LUC is no more secure than RSA. And recent, still-unpublished results show how to break LUC in at least some implementations. I just don't trust it.
Chinese cryptographer Tao Renji has developed a public-key algorithm based on finite automata [1301, 1302, 1303, 1300, 1304, 666]. Just as it is hard to factor the product of two large primes, it is also hard to factor the composition of two finite automata. This is especially so if one or both of them is nonlinear.
Much of this research took place in China in the 1980s and was published in Chinese. Renji is starting to write in English. His main result was that certain nonlinear automata (the quasilinear automata) possess weak inverses if, and only if, they have a certain echelon matrix structure. This property disappears if they are composed with another automaton (even a linear one). In the public-key algorithm, the secret key is an invertible quasilinear automaton and a linear automaton, and the corresponding public key can be derived by multiplying them out term by term. Data is encrypted by passing it through the public automaton, and decrypted by passing it through the inverses of its components (in some cases provided they have been set to a suitable initial state). This scheme works for both encryption and digital signatures.
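The composition-and-inverse idea can be illustrated with a deliberately trivial Python sketch. The two automata below are tiny *linear* feedforward automata over GF(2) (the tap sets are invented for the demo), so unlike Renji's quasilinear construction they offer no security whatsoever; the point is only to show data passing through a composed automaton and being recovered by running the components' inverses in reverse order.

```python
def run_fa(bits, taps):
    """Linear finite automaton over GF(2): y_t = XOR of x_{t-i} for i in taps.

    taps must include 0, which guarantees a delay-0 inverse exists.
    """
    out = []
    for t in range(len(bits)):
        y = 0
        for i in taps:
            if t - i >= 0:
                y ^= bits[t - i]
        out.append(y)
    return out

def invert_fa(bits, taps):
    """Inverse automaton: recover x_t = y_t XOR (already-recovered x_{t-i},
    for taps i > 0)."""
    out = []
    for t in range(len(bits)):
        x = bits[t]
        for i in taps:
            if i > 0 and t - i >= 0:
                x ^= out[t - i]
        out.append(x)
    return out

# Two toy component automata; their composition plays the role of the
# public automaton.
TAPS_A, TAPS_B = (0, 1), (0, 2)       # invented tap sets for the demo
plaintext = [1, 0, 1, 1, 0, 0, 1]
ciphertext = run_fa(run_fa(plaintext, TAPS_A), TAPS_B)  # through A, then B
# Decrypt by inverting the components in reverse order: B first, then A.
recovered = invert_fa(invert_fa(ciphertext, TAPS_B), TAPS_A)
assert recovered == plaintext
```

Because both components here are linear, the composition is also linear and trivially invertible by anyone; the security of the real scheme comes from composing a nonlinear quasilinear automaton with a linear one, which destroys the echelon structure needed to find a weak inverse.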
The performance of such systems can be summed up by saying that, like McEliece's system, they run much faster than RSA but require longer keys. The key length thought to give security similar to 512-bit RSA is 2792 bits, and to 1024-bit RSA, 4152 bits. For the former case, the system encrypts data at 20,869 bytes/sec and decrypts data at 17,117 bytes/sec, running on a 33 MHz 80486.
Renji has published three algorithms. The first is FAPKC0. This is a weak system which uses linear components, and is primarily illustrative. Two serious systems, FAPKC1 and FAPKC2, use one linear and one nonlinear component each. The latter is more complex, and was developed in order to support identity-based operation.
As for their strength, quite a lot of work has been done on them in China (where there are now over 30 institutes publishing cryptography and security papers). One can see from the considerable Chinese-language literature that the problem has been studied.
One possible attraction of FAPKC1 and FAPKC2 is that they are not encumbered by any U.S. patents. Thus, once the Diffie-Hellman patent expires in 1997, they will unquestionably be in the public domain.