
# Is there cryptography after the cat?

Posted on 09/15/2016, by Miguel Herrero (INCIBE)

Or, stated differently: what will happen to cryptography once quantum computers arrive? Let me explain. Cryptography today rests on a very simple premise: current computing power makes it impossible to solve the underlying cryptographic challenge within a reasonable period of time. Almost all cryptography is built on one-way mathematical functions: functions that are very easy to compute but very hard to invert.

One very clear example of such a function is the hash. It is easy to compute the hash of any input, but impossible to recover the input from its hash. In theory, and with current technology, of course: if it were possible to recover the input from a hash in a reasonable period of time, hashes could no longer be used to verify the integrity of a message.
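As an illustration, the forward direction is a one-line computation with Python's standard library (the message below is arbitrary):

```python
import hashlib

message = b"attack at dawn"
digest = hashlib.sha256(message).hexdigest()
print(digest)  # computing the hash: microseconds

# Even a tiny change in the input yields an unrelated digest, and there
# is no known way back from digest to message other than brute-force
# guessing over the input space.
digest2 = hashlib.sha256(b"attack at dusk").hexdigest()
print(digest != digest2)  # True
```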

Integer factorization poses a similar problem. Many current cryptographic schemes (such as RSA) rely on the difficulty of factoring a large number when that number is the product of two sufficiently large primes. Given the two factors, multiplication is a trivial task; it is the difficulty of the inverse operation that secures these public and private key algorithms.
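The asymmetry can be felt even at toy scale: multiplying two primes is instant, while the obvious way to undo it, trial division, takes on the order of √n steps. The primes below are illustrative and vastly smaller than anything RSA would use:

```python
def trial_factor(n):
    """Find the smallest prime factor of n by trial division."""
    if n % 2 == 0:
        return 2, n // 2
    f = 3
    while f * f <= n:
        if n % f == 0:
            return f, n // f
        f += 2
    return n, 1  # n itself is prime

p, q = 999_983, 1_000_003       # toy primes, each around 10**6
n = p * q                       # forward direction: one multiplication
print(trial_factor(n))          # inverse direction: ~500,000 divisions
# → (999983, 1000003)
```

Doubling the size of the primes roughly squares the work for trial division, which is why key sizes keep growing.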

This situation changes when quantum computing comes into play. Quantum computing differs from traditional computing in that it is not based on discrete information states (0 or 1) but on a superposition of states, so that each bit (called a qubit in this field) is both 0 and 1 at the same time. The phenomenon of quantum superposition is often illustrated with the paradox of Schrödinger's cat: as long as the state of the cat is not observed (the box is not opened to check whether the poison has taken effect), the cat is in a state we could summarise as 'both alive and dead' (50% alive, 50% dead). Superposition lets calculations be performed as if the value of each qubit were both 0 and 1 at once, which multiplies the capacity for parallel operations. Thus, a quantum computer with only 30 qubits would be equivalent to a traditional computer running at 10 teraflops (an Intel Core i7-920 processor at 3.4 GHz delivers about 70 gigaflops, 142 times less).
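A toy way to see why qubit count matters: describing n qubits classically means tracking 2^n amplitudes, one per basis state. This sketch (plain Python, no quantum library, purely illustrative) builds the equal superposition that putting every qubit into (|0⟩ + |1⟩)/√2 would produce:

```python
from math import sqrt

def equal_superposition(n):
    """Amplitudes after putting each of n qubits into (|0> + |1>)/sqrt(2)."""
    amp = 1 / sqrt(2 ** n)
    return [amp] * (2 ** n)

state = equal_superposition(10)                # 1,024 amplitudes
print(len(state))                              # 1024
print(round(sum(a * a for a in state), 9))     # 1.0 -- probabilities sum to 1

# The classical bookkeeping doubles with every qubit: 30 qubits already
# mean 2**30 amplitudes, over a billion, evolving in parallel.
print(2 ** 30)                                 # 1073741824
```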

In 1994, Peter Shor described a quantum algorithm that enormously reduces the time needed to factor a number compared with traditional computing. The algorithm was implemented in 2001 by a quantum computing group at IBM; although the experiment was modest, consisting of factoring the number 15 correctly 50% of the time (the algorithm is probabilistic), it proved the validity of the approach. The field has evolved since 2001 and, as of 2016, the largest number factored by a quantum computer is 200,099, obtained with a quantum annealing algorithm.
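Shor's quantum speedup lies entirely in finding the multiplicative order r of a modulo N; the surrounding reduction is classical number theory. Here is a sketch of that classical skeleton, with the order found by brute force (exactly the step a quantum computer accelerates), applied to the N = 15 of the IBM experiment:

```python
from math import gcd

def order(a, n):
    """Smallest r > 0 with a**r ≡ 1 (mod n), found here by brute force;
    this is the step Shor's algorithm performs exponentially faster."""
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_reduction(n, a):
    """Classical post-processing of Shor's algorithm for odd composite n."""
    g = gcd(a, n)
    if g != 1:
        return g, n // g          # lucky guess: a already shares a factor
    r = order(a, n)
    if r % 2 == 1:
        return None               # odd order: retry with another a
    y = pow(a, r // 2, n)
    if y == n - 1:
        return None               # trivial square root: retry with another a
    return gcd(y - 1, n), gcd(y + 1, n)

print(shor_reduction(15, 7))      # (3, 5)
```

For N = 15 and a = 7 the order is r = 4, so 7² = 49 ≡ 4 (mod 15), and gcd(4 − 1, 15) and gcd(4 + 1, 15) recover the factors 3 and 5.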

Quantum computing is here to stay and has enormous potential to change the way we currently do things. The truth is that it is still far from being able to factor the numbers handled in RSA today, as the primes now in use are around 10^200, with a tendency to grow as computing capacity increases. But it is also true that this is a losing battle in the long run. Let us work on that assumption: sooner or later, RSA will become insecure.

The same thing may occur as with IPv6 and the exhaustion of IPv4 addresses, which never seems to arrive: we may manage to keep current cryptography alive by increasing the size of the primes or keys. Even so, we must start thinking about cryptography that is resistant to the onset of quantum computing while remaining resistant to traditional computing.

This cryptography has been baptised post-quantum cryptography and, at present, the following lines are being researched:

• Lattice-based cryptography: asymmetric cryptography whose primitives are based on lattices. Lattice-based cryptosystems are quite efficient and easy to implement, and the underlying mathematical problems are hard and apparently resistant to quantum computing, which has yet to produce an algorithm for them more efficient than the best classical ones.
• Multivariate cryptography: also asymmetric, based on polynomials in several variables over a finite field. Its usefulness has been demonstrated for digital signatures, although it has yet to be shown usable for building secure encryption schemes.
• Code-based cryptography: cryptographic systems based on error-correcting codes, such as the McEliece cryptosystem, proposed in 1978 and still unbroken. Although quite fast, the key sizes involved are a problem: in the original algorithm, the key was 262,000 bits. Variants that reduced this size by introducing more structure into the key have proven vulnerable to attacks and are therefore insecure. Digital signature schemes have been proposed on top of this type of cryptography, although it has proven more useful for building encryption schemes. To resist the appearance of quantum computers, the proposed key size is some 8,400,000 bits.
• Hash-based signatures: systems such as the Lamport signature and the Merkle signature scheme; Ralph Merkle invented this type of cryptography in the 1970s. Their main drawback is that each hash-based public key supports only a limited number of signatures from its corresponding private key. This limit can be raised, although doing so also increases the signature size. In addition, the signer must keep track of exactly how many messages they have signed, and any inaccuracy in this count makes the system insecure.
• Other algorithms, like the supersingular isogeny key exchange, are still under development, but are considered good candidates for inclusion in the post-quantum cryptography group.
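As an illustration of the hash-based family, here is a minimal sketch of a Lamport one-time signature. The parameter choices are for brevity only; a real deployment would follow a vetted scheme, and a Lamport key pair must never sign more than one message:

```python
import hashlib
import secrets

def H(data):
    return hashlib.sha256(data).digest()

def keygen():
    # 256 pairs of random secrets; the public key is their hashes.
    sk = [(secrets.token_bytes(32), secrets.token_bytes(32)) for _ in range(256)]
    pk = [(H(s0), H(s1)) for s0, s1 in sk]
    return sk, pk

def bits(digest):
    # Yield the 256 bits of a SHA-256 digest, most significant first.
    for byte in digest:
        for i in range(7, -1, -1):
            yield (byte >> i) & 1

def sign(sk, message):
    # Reveal one secret per bit of the message digest -- one-time use only.
    return [sk[i][b] for i, b in enumerate(bits(H(message)))]

def verify(pk, message, signature):
    return all(H(s) == pk[i][b]
               for (i, b), s in zip(enumerate(bits(H(message))), signature))

sk, pk = keygen()
sig = sign(sk, b"post-quantum")
print(verify(pk, b"post-quantum", sig))   # True
print(verify(pk, b"tampered", sig))       # False
```

Signing a second message with the same key reveals more secrets and lets an attacker forge signatures, which is exactly the one-time limitation, and the need to count past signatures, described above. Note that security rests only on the hash being one-way, not on factoring.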

Today's post-quantum algorithms do not yet seem capable of replacing traditional ones: their key sizes are much larger than those of the algorithms they aim to replace.

This increase in key size significantly affects many internet protocols, such as TLS and IKE, to the point that they may need to be modified in order to accommodate these algorithms, and that is no easy undertaking. Furthermore, none of them provides complete security against all quantum attacks; as such attacks evolve, new ways of breaking these encryption systems may be discovered.

This already happens today, when new advances in computing render algorithms that were once considered secure unusable, as is the case with MD5, SHA-1, DES…