The world needs codes that quantum computers can’t break
September 4, 2024

QUANTUM COMPUTERS, which exploit strange properties of the subatomic realm to perform certain calculations in powerful new ways, do not yet work at a useful scale. But once they do, they will be able to break the cryptographic algorithms that currently protect online communications, financial transactions, medical records and corporate secrets.

Today’s algorithms often rely on the fact that classical computers find it very hard to factor large numbers. For example, finding the factors of the large numbers used by the RSA-2048 algorithm, a benchmark often used to measure progress in the field, has eluded classical computers for decades. But experts believe a quantum computer could emerge within a decade or two that can do it in a day. That already makes cryptographers nervous: illegally obtained sensitive data can be kept on ice for years until a sufficiently powerful decryption machine is built.
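To see why factoring is the crux, consider a toy sketch (an illustration only, not part of any real cryptosystem): an RSA modulus is the product of two secret primes, and recovering them is easy for tiny numbers but becomes hopeless as the modulus grows. RSA-2048’s modulus has 617 decimal digits; the small primes below are placeholders chosen so the example runs instantly.

```python
# Illustration: RSA's security rests on the difficulty of factoring n = p * q.
# Trial division recovers the factors of a tiny modulus immediately, but its
# cost grows with sqrt(n), which is astronomical for a 2048-bit modulus.

def trial_factor(n: int) -> tuple[int, int]:
    """Factor a semiprime by trial division -- feasible only for small n."""
    d = 2
    while d * d <= n:
        if n % d == 0:
            return d, n // d
        d += 1
    raise ValueError("n is prime")

# A toy 'RSA modulus' built from two small placeholder primes:
p, q = 2003, 3001
n = p * q
print(trial_factor(n))  # (2003, 3001) -- instant at this size
```

A real RSA-2048 modulus would require on the order of 2^1024 trial divisions, which is why the scheme has survived decades of classical attack, and why Shor’s quantum factoring algorithm is such a threat.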

New algorithms are therefore needed. And since switching to them will take years, the transition to post-quantum cryptography (PQC) should begin as soon as possible. America’s standards agency, the National Institute of Standards and Technology (NIST), has now fired the starting gun: on August 13th it announced that three algorithms had been approved as official PQC standards. Two are based on lattice problems, a type of mathematical puzzle that is hard for quantum and classical computers alike. The third, based on hash functions, avoids putting all the eggs in one basket.

The announcement marks an important step in a long-running process. NIST began its search for quantum-safe algorithms in 2016, when it launched a competition for codes that future quantum computers could not crack. Dozens of algorithms were submitted; mathematicians and cryptographers did their best to find flaws in them, and most candidates fell by the wayside. In July 2022 NIST announced a shortlist of four algorithms as candidates for standardization. Three were based on lattice problems. The fourth involved hash functions.

NIST also said it would continue to evaluate four backup algorithms, some of which may become standards in the future. That is because no one can be sure how secure an algorithm really is; there is always a risk that someone will discover a clever way to break it. NIST deliberately chose backup algorithms that did not rely on lattices. One of them, SIKE, was based on the mathematics of isogenies between elliptic curves. Elliptic curves are already used in some cryptographic systems today, but they are not considered quantum-safe. Isogeny-based schemes were thought to be.

That assumption turned out to be wrong. In July 2022 Wouter Castryck and Thomas Decru, mathematicians at Katholieke Universiteit Leuven in Belgium, announced that they had found a way to break SIKE. Worse, their method could decrypt SIKE-encrypted data in just four minutes on a decade-old desktop computer. Fortunately, SIKE was the only isogeny-based cryptosystem NIST was evaluating, so the result did not compromise other algorithms. Amid a huge sigh of relief, SIKE was removed from the list of PQC candidates.

Then, in April 2024, came another unexpected result. Yilei Chen of Tsinghua University in Beijing published a paper detailing a quantum algorithm that could solve certain lattice problems. It suggested that algorithms based on such problems might ultimately be vulnerable to quantum attack. That was a potentially disastrous finding, given that three-quarters of NIST’s preferred algorithms were of this type. Fortunately, a flaw was found in the paper almost immediately, and cryptographers once again breathed a sigh of relief.

ML-KEM, one of the lattice-based algorithms approved by NIST, is a key-encapsulation mechanism: a way of securely establishing the secret keys that allow the correct recipient, and nobody else, to decrypt encrypted data. The other, ML-DSA, is a digital-signature algorithm, a technique that lets users prove their identity.
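The interface a key-encapsulation mechanism exposes is simple, even though ML-KEM’s lattice internals are not. The toy sketch below shows only that three-function shape (key generation, encapsulation, decapsulation); its hash-and-XOR internals are invented for illustration and provide no security whatsoever, unlike the lattice mathematics specified for ML-KEM in FIPS 203.

```python
import hashlib
import os

def _xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

# Toy KEM showing the three-function interface that ML-KEM standardizes.
# The internals are hash-and-XOR games with NO security value.

def keygen() -> tuple[bytes, bytes]:
    sk = os.urandom(32)                      # secret key
    pk = hashlib.sha256(sk).digest()         # toy "public" key
    return pk, sk

def encapsulate(pk: bytes) -> tuple[bytes, bytes]:
    m = os.urandom(32)                       # fresh randomness per session
    shared = hashlib.sha256(pk + m).digest() # sender's copy of the key
    ct = _xor(m, pk)                         # toy "ciphertext" sent over
    return ct, shared

def decapsulate(sk: bytes, ct: bytes) -> bytes:
    pk = hashlib.sha256(sk).digest()
    m = _xor(ct, pk)                         # recover the randomness
    return hashlib.sha256(pk + m).digest()   # receiver's copy of the key

pk, sk = keygen()
ct, key_sender = encapsulate(pk)
key_receiver = decapsulate(sk, ct)
assert key_sender == key_receiver            # both sides now share a key
```

The shared key that both sides end up with is then used with ordinary symmetric encryption; the KEM’s only job is getting that key across safely.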

The third approved algorithm, SLH-DSA, is an alternative to ML-DSA built on hash functions, “to avoid relying solely on the security of lattices,” as NIST puts it. NIST will also continue to evaluate three other algorithms that rely on neither lattices nor elliptic curves as possible alternatives to ML-KEM. These are considered highly secure, but require more storage space for encryption keys and encrypted data than ML-KEM does.

There is strength in such diversity, says Bruce Schneier, a cryptography expert at Harvard University. He notes that the furor over Chen’s paper highlights how little analysis has yet been done to establish just how secure lattice-based systems really are. People have tried and failed for decades to crack lattice-based algorithms with classical computers, but there has been much less research into how they might be cracked with a quantum one. Adoption of the new NIST standards should continue, he says, but large organizations should aim to be “crypto-agile” as they make the transition to PQC. That means migrating in a way that makes further migrations easy, as better algorithms become available or flaws are found in existing ones.
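Crypto-agility is as much a software-engineering habit as a cryptographic one: route every signing or encryption call through a registry keyed by algorithm name, so that retiring a broken scheme becomes a configuration change rather than a rewrite. The sketch below is a minimal illustration of that pattern; the algorithm names and the HMAC stand-ins are assumptions for demonstration, not real PQC schemes.

```python
import hashlib
import hmac
from typing import Callable

# Registry mapping algorithm names to implementations. Application code
# never hard-codes a scheme; it asks the registry by name, so a compromised
# algorithm can be swapped out by changing one configuration value.
SIGNERS: dict[str, Callable[[bytes, bytes], bytes]] = {}

def register(name: str):
    def wrap(fn):
        SIGNERS[name] = fn
        return fn
    return wrap

@register("hmac-sha256")       # stand-in for today's deployed scheme
def sign_sha256(key: bytes, msg: bytes) -> bytes:
    return hmac.new(key, msg, hashlib.sha256).digest()

@register("hmac-sha3-256")     # stand-in for a future replacement
def sign_sha3(key: bytes, msg: bytes) -> bytes:
    return hmac.new(key, msg, hashlib.sha3_256).digest()

def sign(algorithm: str, key: bytes, msg: bytes) -> bytes:
    return SIGNERS[algorithm](key, msg)   # the one indirection point

tag = sign("hmac-sha256", b"demo-key", b"hello")
```

Migrating then means registering the new algorithm, flipping the configured name, and eventually deleting the old entry, with no call sites touched.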

The groundwork for a successful transition has been under way for some time, says Scott Crowder, a quantum expert at IBM, a computing giant. IBM, for example, has released a PQC software update for its Z-series mainframes, which are still widely used in many industries. Similarly, Apple implemented ML-KEM earlier this year in the iMessage service used on its iPhones, iPads and Macs.

For a typical large company, 80% of the work to transition to PQC will be done by vendors providing upgrades and patches, Mr. Crowder says. The other 20% is more difficult and requires companies to reorganize their custom-built internal systems.

One approach that can ease migration, and provide extra assurance besides, is known as “hybrid” or “composite” cryptography. This involves layering existing, classical cryptography together with PQC, so that if one scheme is broken the other still provides some protection. It can act as an insurance policy for organizations that are required by regulators to adopt PQC but worry that it may not be entirely safe.
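In practice, hybrid schemes typically run a classical key exchange and a post-quantum KEM side by side, then feed both shared secrets through a key-derivation step, so an attacker must break both to recover the final key. The sketch below illustrates only that combining step, with placeholder byte strings standing in for the real protocol outputs; production designs use a proper KDF such as HKDF rather than a bare hash.

```python
import hashlib

# Hybrid key combination: the final key depends on BOTH input secrets,
# so breaking only the classical exchange (or only the PQC one) reveals
# nothing about the derived key. The inputs below are placeholders for
# real outputs of, say, an elliptic-curve exchange and a PQC KEM.

def combine(classical_ss: bytes, pq_ss: bytes, context: bytes) -> bytes:
    # Simple hash-based derivation for illustration; real protocols
    # use a standardized KDF (e.g. HKDF) with the same structure.
    return hashlib.sha256(context + classical_ss + pq_ss).digest()

key = combine(b"\x01" * 32, b"\x02" * 32, b"hybrid-demo-v1")
```

Because the context string is mixed in as well, the same pair of secrets yields different keys for different protocols or versions, a common hygiene measure in key derivation.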
