# Researchers are perfecting cryptographic algorithms to deal with a powerful quantum threat

October 19, 2022 – As the race to develop large-scale, reliable quantum computers gathers pace, efforts are intensifying to fortify digital systems against a technology that could completely undermine their security. Commonwealth Cyber Initiative researchers at Virginia Tech and George Mason University are developing security protocols to ward off cyberattacks — both from today’s computers and from the quantum computers coming down the pike.

To stay ahead of the threat, the US National Institute of Standards and Technology (NIST) called on cryptographers around the world to develop and vet encryption techniques that can withstand attacks from powerful quantum computers. Six years later, in July 2022, NIST announced its first selections, and researchers from Virginia Tech’s Commonwealth Cyber Initiative are among those working to make the chosen algorithms more efficient and secure.

**Precautionary protocols**

Sensitive data such as personal identification information, online banking accounts and medical records, as well as government and military communications, are protected by cryptography – algorithms that secure data through encryption. Quantum computers can solve some of the underlying mathematical problems much faster than classical computers, which threatens to sweep away the cryptosystems currently in place.

But what is the point in mounting a defense against a technology that is still in development?

“First, we need to have techniques ready to prevent attacks before they become real problems. People are working on quantum computers right now and they’re making progress,” said Jason LeGrow, assistant professor of mathematics and member of the Commonwealth Cyber Initiative. “We have to think about protecting our privacy for the future. We do not want the messages we currently send to be compromised when someone has the means to decrypt them using more powerful technology than is currently available.”

**Authenticity, integrity and encryption**

Cryptography is the science of secret messages – but it’s not just about keeping information secret; it also verifies the authenticity of the person who sent the message and preserves the integrity of the message so no one can tamper with it, said Travis Morrison, assistant professor in the Virginia Tech Department of Mathematics in the College of Science.

Different cryptographic protocols perform different tasks. The cryptographic algorithms selected by NIST are designed for two main cryptographic tasks: digital signatures, used for identity authentication and data integrity, and public key encryption, which allows two parties to have a secure conversation without ever meeting.
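As a concrete (and deliberately simplified) illustration of the digital-signature task, the sketch below implements a Lamport one-time signature, a classic hash-based construction. It is not one of the NIST-selected schemes, and real deployments are far more sophisticated, but it shows the basic pattern: the private key signs, and the public key lets anyone verify authenticity and detect tampering.

```python
# Toy Lamport one-time signature -- a hash-based digital signature sketch.
# Illustrative only; NOT one of the NIST-selected algorithms.
import hashlib
import secrets

def keygen():
    # Private key: 256 pairs of random 32-byte secrets, one pair per bit
    # of the message hash. Public key: the SHA-256 hash of each secret.
    sk = [(secrets.token_bytes(32), secrets.token_bytes(32)) for _ in range(256)]
    pk = [(hashlib.sha256(a).digest(), hashlib.sha256(b).digest()) for a, b in sk]
    return sk, pk

def bits(msg):
    # Hash the message and expand the digest into its 256 individual bits.
    digest = hashlib.sha256(msg).digest()
    return [(digest[i // 8] >> (7 - i % 8)) & 1 for i in range(256)]

def sign(sk, msg):
    # Reveal one secret from each pair, chosen by the corresponding hash bit.
    return [sk[i][b] for i, b in enumerate(bits(msg))]

def verify(pk, msg, sig):
    # Hash each revealed secret and check it against the public key.
    return all(hashlib.sha256(s).digest() == pk[i][b]
               for (i, b), s in zip(enumerate(bits(msg)), sig))

sk, pk = keygen()
sig = sign(sk, b"pay Alice $100")
assert verify(pk, b"pay Alice $100", sig)      # authentic and untampered
assert not verify(pk, b"pay Alice $900", sig)  # tampering is detected
```

Because signing reveals half of the private secrets, each key pair can safely sign only one message – one reason practical hash-based schemes build large trees of such one-time keys.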

“Take Amazon or Microsoft: anyone or everyone wants to communicate with them, and they can’t create a new cipher for every person,” said William Mahaney, a Commonwealth Cyber Initiative mathematics researcher working with Morrison.

In public key cryptography, a user has a public key and a private key. Sharing the public key does not disclose the private key, but the two are mathematically related: recovering the private key from the public key is theoretically possible, just computationally infeasible in practice.
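A textbook RSA key pair makes that relationship concrete. In this minimal sketch, with toy-sized primes that would never be used in practice, the private key is a pair of secret primes and the public key includes their product, which can be shared freely because recovering the primes from it means factoring:

```python
# Textbook RSA with tiny primes -- illustrative only, far too small to be secure.
from math import gcd

p, q = 61, 53              # private: two secret primes
n = p * q                  # public modulus: 3233, safe to share
e = 17                     # public exponent
phi = (p - 1) * (q - 1)    # computable only by someone who knows p and q
assert gcd(e, phi) == 1
d = pow(e, -1, phi)        # private exponent, derived from the secret primes

m = 42                     # a message, encoded as a number smaller than n
c = pow(m, e, n)           # anyone can encrypt with the public key (e, n)
assert pow(c, d, n) == m   # only the private key d can decrypt
```

An attacker who could factor n = 3233 back into 61 and 53 could recompute d; with a modulus thousands of bits long, that factoring step is what makes the scheme secure against classical computers.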

“One way to think about it is to mix salt and sand – very easy to do, very hard to undo. They’re tied together in the mix,” said Gretchen Matthews, professor of mathematics and director of the Commonwealth Cyber Initiative. “Could you separate them? Yes, but it would take a long time.”

For currently deployed cryptosystems, “a long time” is usually good enough.

“It’s not that this problem is impossible,” Morrison said. “We all learned to factor numbers in grade school. But if the key is long enough, then the time it would take to factor it is on the order of the age of the universe. That is what classical public-key cryptography bases its security on – the problem is difficult, or takes a very long time, to solve.”

It’s also about having the right tools for the job; a Bunsen burner makes separating salt and sand much easier. With a quantum computer, the problems underlying most public-key cryptosystems become easy, too.

**The Threat and Wonder of Quantum Computing**

“Quantum computers are dangerous for some cryptosystems because within hours or days they can solve the underlying mathematical problems that would have taken more than a lifetime before,” said Mahaney, who is also a Dr. Julian Chin Fellow in cybersecurity.

Conventional computers deal with bits – 0s and 1s, this or that, yes or no, Matthews said.

“But when we consider quantum bits, a whole spectrum of possibilities opens up between 0 and 1,” Matthews said. Because they are governed by quantum mechanics, quantum computers behave in fundamentally different ways, making certain types of calculations incredibly fast.
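A minimal sketch of that “spectrum”: a qubit can be written as two amplitudes rather than a single 0 or 1. Applying a Hadamard gate to the state |0⟩ yields an equal superposition, with a 50/50 chance of measuring either value – simulated here with plain Python, no quantum hardware required.

```python
# A qubit as a pair of amplitudes (alpha, beta) with alpha^2 + beta^2 = 1.
import math

ket0 = [1.0, 0.0]            # the classical bit 0, written as a quantum state

# Hadamard gate: rotates |0> into an equal superposition of |0> and |1>.
s = 1 / math.sqrt(2)
H = [[s,  s],
     [s, -s]]

# Apply the gate: a 2x2 matrix-vector product.
state = [H[0][0] * ket0[0] + H[0][1] * ket0[1],
         H[1][0] * ket0[0] + H[1][1] * ket0[1]]

# Measurement probabilities are the squared amplitudes: here, 0.5 and 0.5.
probs = [a * a for a in state]
```

Each additional qubit doubles the number of amplitudes a classical simulation must track, which is one intuition for why quantum machines can outrun classical ones on certain problems.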

**Post-quantum protection**

“Post-quantum cryptosystems are protocols that run on an ordinary computer, but which we believe are secure even when an attacker has a quantum computer,” LeGrow said. You can run them on your cell phone, according to Morrison.

There are several subfields within post-quantum cryptography, LeGrow said, and experts in each subfield design algorithms based on different mathematical frameworks. Each family of algorithms has its own strengths and weaknesses.

Code-based cryptography, which is part of Matthews’ area of expertise, is the oldest approach under study. These protocols are slower than some of the other systems, but researchers have had more time to examine them for vulnerabilities.

Lattice-based systems, which Mahaney and Morrison are exploring, are faster but have large key sizes compared to current systems — not ideal for memory and bandwidth.

Isogeny-based schemes, studied by LeGrow and Morrison, are at the other extreme, offering very small keys but running slower than other post-quantum schemes on average.

The NIST process aimed to define a suite of post-quantum cryptographic standards that use varied approaches to encryption and could serve as an algorithm toolkit for different situations. The submissions were public so that the global cryptanalysis community could review each one. Once an algorithm was submitted, “cryptanalysts would use every trick possible to crack it,” Mahaney said. They swarmed and dug into every corner of each system, sniffing out weaknesses in the implementation or the code – in some cases even in the configuration of a computer or a network – because if they couldn’t find the cracks now, a hacker would find them later and could expose private, proprietary, or classified information.

NIST’s candidate pool was winnowed down until July, when the agency released the first four algorithms selected for standardization and announced a follow-up round for two remaining candidates – all from an original slate of 82.

Among the schemes selected for standardization is a digital signature protocol known as FALCON. Morrison and Mahaney are focusing on algorithms to optimize FALCON cryptosystems, which will be released as a full standard in the coming months. Together with Krzysztof Gaj, a Commonwealth Cyber Initiative researcher at George Mason University, they are also studying the susceptibility of these schemes to side-channel attacks, which take advantage of information leaked through timing, power consumption, and electromagnetic radiation.
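Side channels are easier to grasp with a toy example than with FALCON’s actual internals. The hypothetical PIN check below leaks information through timing: a comparison that exits at the first mismatch does more work, and therefore takes longer, the closer a guess is to the secret, while a constant-time version does the same work regardless.

```python
# Toy timing side channel (hypothetical PIN check, not FALCON itself).
def leaky_equals(secret, guess):
    # Bails out at the first mismatch: the operation count leaks how many
    # leading characters the attacker guessed correctly.
    ops = 0
    for a, b in zip(secret, guess):
        ops += 1
        if a != b:
            return False, ops
    return secret == guess, ops

def constant_time_equals(secret, guess):
    # Always examines every character; the work done is independent
    # of where (or whether) the strings differ.
    ops = 0
    diff = len(secret) ^ len(guess)
    for a, b in zip(secret, guess):
        ops += 1
        diff |= ord(a) ^ ord(b)
    return diff == 0, ops

# The leaky version does more work the closer the guess gets:
_, ops_close = leaky_equals("7251", "7299")   # 2 leading chars match -> 3 ops
_, ops_far = leaky_equals("7251", "1299")     # 0 leading chars match -> 1 op
assert ops_close > ops_far

# The constant-time version always does the same amount of work:
_, a = constant_time_equals("7251", "7299")
_, b = constant_time_equals("7251", "1299")
assert a == b
```

Real side-channel attacks measure wall-clock time, power draw, or electromagnetic emissions rather than an operation counter, but the principle is the same: any observable that varies with secret data can be exploited.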

Both follow-up-round candidates are code-based algorithms, and Matthews and her team are developing code families with the goal of speeding up the algorithms without sacrificing security. Both Morrison’s and Matthews’ projects are supported by the Commonwealth Cyber Initiative in southwestern Virginia.

As the NIST process draws to a close, the vibrant field of post-quantum cryptography continues to generate new techniques and algorithms for a wide range of security applications that will protect information today and into a quantum future.

*Source: Kelly Izlar, Virginia Tech*