
Qubits and Quibbles

September 22nd, 2021 / in Uncategorized / by Khari Douglas
Scott Aaronson, 2020 ACM Prize in Computing Winner

Khari Douglas will be covering the 8th Heidelberg Laureate Forum (HLF) on the CCC blog all week. Stay tuned and check out the HLF blog for more coverage of the event. 

On the opening day of the 8th Heidelberg Laureate Forum, Scott Aaronson, winner of the 2020 ACM Prize in Computing, discussed recent advances in quantum computing and the impact that “quantum supremacy” could have on the future of computing. Aaronson described quantum mechanics as the operating system of the universe, through which everything in nature runs as an application program.

As Aaronson explained, the state of any isolated physical system can be represented as a unit vector of complex numbers called “amplitudes,” the quantum mechanical analogue of probabilities. The ability to accurately simulate isolated physical systems would be very useful to many scientific fields; however, this is exponentially hard to do on a classical computer because the dimension of a quantum state grows exponentially with the number of particles in the system. This is because a quantum bit, or qubit, can exist in a state of superposition, behaving as if it were both 0 and 1 simultaneously. This is written in Dirac notation as α|0⟩ + β|1⟩, where α and β are amplitudes: complex numbers whose squared magnitudes, |α|² and |β|², give the probabilities of measuring 0 or 1. Once you measure the qubit, the superposition “collapses” and shows you either a 0 or a 1.
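
To make the amplitude picture a bit more concrete, here is a minimal Python/NumPy sketch of a single qubit (my own illustration, not something from Aaronson's talk): the state is a two-entry unit vector of complex amplitudes, the squared magnitudes give the measurement probabilities, and a measurement yields a definite 0 or 1.

```python
import numpy as np

# The state alpha|0> + beta|1> is a unit vector of two complex amplitudes.
alpha, beta = 1 / np.sqrt(2), 1j / np.sqrt(2)   # an equal superposition
state = np.array([alpha, beta])
assert np.isclose(np.linalg.norm(state), 1.0)   # amplitudes form a unit vector

# Measurement probabilities are the squared magnitudes (the Born rule).
probs = np.abs(state) ** 2
print("P(0) =", probs[0], " P(1) =", probs[1])  # 0.5 and 0.5

# "Measuring" collapses the superposition to a definite outcome.
outcome = np.random.choice([0, 1], p=probs)
print("measured:", outcome)
```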

Since a qubit can exist in this superposition, representing a system of n qubits requires a vector of 2^n amplitudes. For example, a 1,000-qubit system, which would represent a small molecule, requires 2^1000 amplitudes. 2^1000 is greater than the number of particles in the observable universe, so simulating the behavior of such a system is not possible for classical computers.
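
For a rough feel for those numbers, here is a short Python sketch (my own back-of-the-envelope, using the commonly cited estimate of roughly 10^80 particles in the observable universe):

```python
# An n-qubit state needs a vector of 2^n complex amplitudes.
for n in (10, 53, 300, 1000):
    digits = len(str(2 ** n))
    print(f"{n:>5} qubits -> 2^{n} ~ 10^{digits - 1} amplitudes")

# Roughly 10^80 particles are estimated to exist in the observable universe.
print(2 ** 1000 > 10 ** 80)   # True: the state vector dwarfs the universe
```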

Aaronson went on to discuss the recent hunt for “quantum supremacy,” or “quantum advantage.” The term “quantum supremacy” was coined in 2012 by John Preskill, a physicist and professor at Caltech, and it refers to the point at which a quantum computer can perform some well-defined mathematical task faster than any current classical computer (regardless of the usefulness of that task). In 2019, a team from Google published a claim of quantum supremacy in Nature.

The quantum processor designed by the Google team is called “Sycamore” and was able to create quantum states on 53 qubits. According to the team, the Sycamore processor “takes about 200 seconds to sample one instance of a quantum circuit a million times—our benchmarks currently indicate that the equivalent task for a state-of-the-art classical supercomputer would take approximately 10,000 years.” This is a truly game-changing speed-up; however, it is still limited to one very specific, and not particularly useful, task. A response from IBM at the time claimed “that an ideal simulation of the same task can be performed on a classical system in 2.5 days and with far greater fidelity.” IBM never actually ran the task on a classical system, but, either way, it is clear that much research remains to be done to create more useful quantum computers.
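
To put the two estimates side by side, here is a quick back-of-the-envelope calculation in Python (my own arithmetic, using only the figures quoted above):

```python
# Compare Sycamore's 200 seconds with the two classical estimates.
seconds_per_year = 365 * 24 * 3600
sycamore_seconds = 200

google_classical_seconds = 10_000 * seconds_per_year  # Google: ~10,000 years
ibm_classical_seconds = 2.5 * 24 * 3600               # IBM: ~2.5 days

print(google_classical_seconds / sycamore_seconds)  # ~1.6 billion x speed-up
print(ibm_classical_seconds / sycamore_seconds)     # ~1,000x speed-up
```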

While more advanced quantum computers would be useful to fields such as biochemistry and nuclear physics, they are also predicted to threaten encryption systems, like RSA, that we rely on to transmit data over the Internet. Put very simply, RSA is a public-key cryptography algorithm that generates public and private keys from large prime numbers. Currently, RSA is a very secure encryption scheme because it takes a long time to factor the large semiprimes (products of two prime numbers) that RSA uses. In 1994, mathematician Peter Shor devised Shor’s algorithm, which can find the prime factors of any integer N efficiently on a quantum computer. Thus if (or when) a quantum computer is built that is capable of running Shor’s algorithm at scale, it could be used to break RSA and other public-key cryptography whose security relies on the hardness of factoring.
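
To make the link between factoring and RSA concrete, here is a toy Python sketch (my own illustration, with deliberately tiny, insecure parameters; Shor's algorithm itself is not shown): it generates an RSA key pair from two primes and shows how anyone who can factor the public modulus immediately recovers the private key.

```python
from math import gcd

# Toy RSA with tiny primes -- for exposition only, never for real security.
p, q = 61, 53                 # secret primes
n = p * q                     # public modulus (a semiprime)
phi = (p - 1) * (q - 1)
e = 17                        # public exponent
assert gcd(e, phi) == 1
d = pow(e, -1, phi)           # private exponent (Python 3.8+ modular inverse)

message = 42
ciphertext = pow(message, e, n)       # encrypt with the public key (n, e)
print(pow(ciphertext, d, n))          # decrypt with the private key -> 42

# An attacker who can factor n (as Shor's algorithm would do efficiently
# on a quantum computer) recovers the private key directly.
p_found = next(f for f in range(2, n) if n % f == 0)
q_found = n // p_found
d_recovered = pow(e, -1, (p_found - 1) * (q_found - 1))
print(pow(ciphertext, d_recovered, n))   # also 42: RSA is broken
```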

Not only would this impact, say, your online banking transactions, but it could also jeopardize the security of military or intelligence information. Encrypted data could be collected by adversaries in the present and, once a sufficiently advanced quantum computer is created, that data could then be decrypted and the intel utilized. The fear of such a future has prompted organizations like the U.S. National Institute of Standards and Technology (NIST) to create programs on post-quantum cryptography. In 2016, NIST announced a program to “develop and standardize one or more additional public-key cryptographic algorithms to augment FIPS 186-4, Digital Signature Standard (DSS).” This program is currently in round three of candidate algorithm submissions. More information can be found on NIST’s post-quantum cryptography project page.

A few years ago, the Computing Community Consortium, a visioning body for computer science research (and my employer), ran a visioning workshop on post-quantum cryptography migration and released a workshop report that highlights outstanding research questions in this area. The most important takeaway from the report is: “While NIST’s much-needed standardization initiative addresses the selection of new cryptographic algorithms, there is an urgent need to consider the complex problem of migration from today’s widely deployed algorithms to PQC alternatives” (p. 18). This is not merely a technical problem, but also a social and policy issue, since you need to ensure that all of the developers who work in this space are able to migrate to post-quantum alternatives if it becomes necessary.

While I can’t say what the future will hold, starting next week, you can watch the entirety of Scott Aaronson’s lecture on the HLF Youtube channel.
