As billions of dollars pour into quantum computing and countries build communication networks secured by quantum encryption, the prominence of quantum information science has become increasingly hard to ignore.
This year’s Breakthrough Prize in Fundamental Physics honors four pioneers who combined math, computer science and physics to do “foundational work in the field of quantum information.” The prize is shared between Charles Bennett of IBM, Gilles Brassard of the University of Montreal, David Deutsch of the University of Oxford and Peter Shor of the Massachusetts Institute of Technology.
“These four people really contributed heavily to the emergence of quantum information theory,” says Nicolas Gisin, an experimental quantum physicist at the University of Geneva. “It’s nice to see that these prizes come closer to my heart.”
The Breakthrough Prizes were co-founded by Israeli-Russian billionaire and physicist Yuri Milner in 2012, and they have been lavishly supported by other moguls, including co-founders Mark Zuckerberg and Sergey Brin. Like Alfred Nobel, whose prize-funding fortune arose from his invention of dynamite, Milner has drawn scrutiny over the origins of his wealth: his past financial ties to the Kremlin have come under particular examination in light of Russia’s ongoing invasion of Ukraine. In previous interviews, Milner has emphasized his independence and his donations to Ukrainian refugees. A spokesperson pointed out to Scientific American that Milner relocated to the US in 2014 and has not returned to Russia since.
But recognition for quantum information science has not always come easily—or with such financial support. Broadly speaking, the field is a combination of two theories: quantum mechanics, which describes the counterintuitive behavior of the atomic and subatomic world, and information theory, which details the mathematical and physical limits of computation and communication. Its history is a messier story, with sporadic advances that were often overlooked by conventional scientific journals.
In 1968, Stephen Wiesner, then a graduate student at Columbia University, developed a new way of encoding information with polarized photons. Among other things, Wiesner proposed that the inherently fragile nature of quantum states could be used to create counterfeit-resistant quantum money. Unable to publish many of his heady theoretical ideas and drawn to religion, Wiesner, who died last year, largely quit academia to become a construction worker in Israel.
Before Wiesner left Columbia, he passed along some of his ideas to another young researcher. “One of my roommates’ boyfriends was Stephen Wiesner, who started telling me about his ‘quantum money,’” Bennett recalls. “[It] struck me as interesting, but it didn’t seem like the beginning of a whole new field.” In the late 1970s Bennett met Brassard, and the two began discussing Wiesner’s money, which they imagined might require the improbable task of trapping photons with mirrors to create a quantum banknote.
“Photons are not meant to stay—they’re meant to travel,” Brassard says, explaining the thought process. “If they travel, what’s more natural than communicating?” The protocol Bennett and Brassard proposed, called BB84, would launch the field of quantum cryptography. Later detailed and popularized in Scientific American, BB84 allowed two parties to exchange messages with utmost secrecy. If a third party snooped, they would leave indelible evidence of their interference—like damaging a quantum wax seal.
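The logic of BB84 can be sketched with a toy classical simulation in which the quantum content is reduced to one rule: a photon measured in the wrong basis yields a random bit. The function name `bb84_sift` and its parameters are made up for illustration, and the sketch assumes an ideal channel with no eavesdropper.

```python
import random

def bb84_sift(n=1000, seed=1):
    """Toy simulation of BB84 basis sifting (ideal channel, no eavesdropper).

    Alice encodes random bits in random bases ('+' rectilinear, 'x' diagonal);
    Bob measures in random bases. A wrong-basis measurement gives a random bit,
    so only the matching-basis rounds are kept for the shared key.
    """
    rng = random.Random(seed)
    alice_bits = [rng.randint(0, 1) for _ in range(n)]
    alice_bases = [rng.choice('+x') for _ in range(n)]
    bob_bases = [rng.choice('+x') for _ in range(n)]
    # Bob's result agrees with Alice's bit only when his basis matches hers
    bob_bits = [b if ab == bb else rng.randint(0, 1)
                for b, ab, bb in zip(alice_bits, alice_bases, bob_bases)]
    # The two publicly compare bases (never bits) and discard mismatches
    key_a = [b for b, ab, bb in zip(alice_bits, alice_bases, bob_bases) if ab == bb]
    key_b = [b for b, ab, bb in zip(bob_bits, alice_bases, bob_bases) if ab == bb]
    return key_a, key_b

key_a, key_b = bb84_sift()
assert key_a == key_b  # without an eavesdropper, the sifted keys always agree
```

Roughly half the photons survive sifting. An eavesdropper who measures in between would guess the wrong basis about half the time, scattering detectable errors into the sifted key, which is the “wax seal” the article describes.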
While Bennett and Brassard developed quantum cryptography, another radical idea was beginning to emerge: quantum computing. At a now famous meeting at MIT Endicott House in Dedham, Mass., in May 1981, physicist Richard Feynman proposed that a computer using quantum principles could solve problems impossible for a computer bound by the laws of classical physics. Although he didn’t attend the conference, Deutsch heard about the idea and was hooked. “I gradually got more and more convinced of the links between computation and physics,” he says.
Chatting with Bennett later that year, Deutsch experienced a crucial epiphany: then prevailing computational theory was based on the wrong physics—the “classical” mechanics of Isaac Newton and the relativistic approach of Albert Einstein rather than the deeper quantum reality. “So I thought I’d rewrite the theory of computation, basing it on quantum theory instead of basing it on classical theory,” Deutsch says matter-of-factly. “I didn’t expect anything fundamentally new to come out of it. I just expected it to be more rigorous.” Soon, however, he realized he was describing a vastly different kind of computer. Even when it achieved the same results as a classical machine, it got there by exploiting the principles of quantum mechanics.
Deutsch’s new theory provided a crucial link between quantum mechanics and information theory. “It made quantum mechanics accessible to me in my language of computer science,” says Umesh Vazirani, a computer scientist at the University of California, Berkeley. Later, with Australian mathematician Richard Jozsa, Deutsch proposed, as a proof of principle, the first algorithm that would be exponentially faster than any classical algorithm—although it didn’t do anything practical.
But soon more useful applications emerged. In 1991 Artur Ekert, then a graduate student at Oxford, proposed a new quantum cryptography protocol, E91. The technique caught the attention of many physicists because of its elegance and practicality—as well as the fact that it was published in a leading physics journal. “It’s a beautiful idea. It’s a bit surprising that Ekert is not part of the list” of winners of this year’s fundamental physics Breakthrough Prize, Gisin says.
Two years later, when Bennett, Brassard, Jozsa, computer science researcher Claude Crépeau, and physicists Asher Peres and William Wootters proposed quantum teleportation, physicists were paying attention. The new technique would give one party the ability to transmit information, such as the result of a coin flip, to another via entanglement, the quantum correlation that can link objects such as electrons. Despite popular science-fiction assertions, this technique does not allow for faster-than-light messaging—but it has dramatically expanded the possibilities of real-world quantum communications. “That’s the most mind-boggling idea,” says Chao-Yang Lu, a quantum physicist at the University of Science and Technology of China, who has helped implement the technique from space.
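The protocol itself can be verified with a small state-vector simulation, sketched below under textbook conventions (the helper names `op`, `cnot` and `teleport` are made up for illustration): Alice entangles the state to be sent with her half of a Bell pair, measures her two qubits, and Bob applies at most two corrections to recover the state exactly, whichever of the four outcomes Alice reports.

```python
import numpy as np

# Single-qubit gates
I = np.eye(2)
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
X = np.array([[0, 1], [1, 0]])
Z = np.diag([1.0, -1.0])

def op(gate, q):
    """Lift a 1-qubit gate to act on qubit q of a 3-qubit register."""
    mats = [gate if i == q else I for i in range(3)]
    full = mats[0]
    for m in mats[1:]:
        full = np.kron(full, m)
    return full

def cnot(control, target):
    """CNOT as a permutation of the 8 basis states (qubit 0 = leftmost bit)."""
    M = np.zeros((8, 8))
    for i in range(8):
        bits = [(i >> k) & 1 for k in (2, 1, 0)]
        if bits[control]:
            bits[target] ^= 1
        M[4 * bits[0] + 2 * bits[1] + bits[2], i] = 1
    return M

def teleport(psi):
    """Teleport the 1-qubit state psi from qubit 0 to qubit 2 via a Bell pair."""
    bell = np.array([1, 0, 0, 1]) / np.sqrt(2)   # entangled pair on qubits 1, 2
    state = np.kron(psi, bell)
    state = op(H, 0) @ cnot(0, 1) @ state        # rotate into the Bell basis
    results = []
    for m0 in (0, 1):                            # every possible outcome of
        for m1 in (0, 1):                        # Alice's two measurements
            branch = np.array([state[4 * m0 + 2 * m1 + b2] for b2 in (0, 1)])
            branch = branch / np.linalg.norm(branch)
            if m1:
                branch = X @ branch              # Bob's classical corrections
            if m0:
                branch = Z @ branch
            results.append(branch)
    return results

psi = np.array([0.6, 0.8])
assert all(np.allclose(out, psi) for out in teleport(psi))
```

Note that Bob cannot finish without Alice’s two measurement bits, which must travel over an ordinary channel, which is why teleportation cannot outrun light.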
Words such as “revolution” are overused to describe progress in science, which is usually plodding and incremental. But in 1994 Shor quietly began one. While working at AT&T Bell Laboratories, he had absorbed talks by Vazirani and Bennett. “I started thinking about what useful things you could do with a quantum computer,” he says. “I thought it was a long shot. But it was a very interesting area. So I started working on it. I didn’t really tell anybody.”
Inspired by the success other quantum algorithms had with tasks that were periodic, or repeating, Shor developed an algorithm that could divide numbers into their prime factors (for example, 21 = 7 × 3) exponentially faster than any known classical algorithm. The implications were immediately obvious: the difficulty of prime factorization was the backbone of modern encryption. At last, quantum computers had a truly game-changing practical application. Shor’s algorithm “just made it absolutely clear that you have to drop everything” to work on quantum computing, Vazirani says.
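The heart of Shor’s insight, stripped of the quantum speedup, can be shown classically: factoring N reduces to finding the period r of a^x mod N. The quantum computer’s only job is to find r exponentially faster; the brute-force search below is just an illustration, and the function name is hypothetical.

```python
from math import gcd

def factor_via_period(N, a):
    """Classical illustration of Shor's reduction: factor N via the period
    of a**x mod N. (A quantum computer finds the period exponentially
    faster; here we find it by brute force.)"""
    if gcd(a, N) != 1:
        return gcd(a, N)      # a lucky guess already shares a factor with N
    r, x = 1, a % N
    while x != 1:             # find the smallest r with a**r ≡ 1 (mod N)
        x = (x * a) % N
        r += 1
    if r % 2 == 1:
        return None           # odd period: retry with a different a
    y = pow(a, r // 2, N)     # then gcd(y ± 1, N) often yields a factor
    f = gcd(y - 1, N)
    return f if 1 < f < N else None

print(factor_via_period(21, 2))  # the period of 2 mod 21 is 6, giving factor 7
```

For N = 21 and a = 2 the powers of 2 cycle through 2, 4, 8, 16, 11, 1, so r = 6; then 2³ = 8 and gcd(8 − 1, 21) = 7 hands over a prime factor.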
Although Shor had found a powerful use case for a quantum computer, he had not solved the harder problem of how to build one—even in theory. The fragile quantum states such devices could exploit to surpass classical computing also made them extremely vulnerable to errors. Moreover, error correction strategies for classical computers could not be used in quantum computers. Undeterred, at a quantum computing conference in Turin, Italy, in 1995, Shor bet other researchers that a quantum computer would factor a 500-digit number before a classical computer did so. (Even with today’s classical supercomputers, factoring 500 digits would likely take billions of years.) No one took Shor’s bet, and some asked for a third option: that the sun would burn out first.
Two types of errors plague quantum computers: bit errors and phase errors. These errors are akin to flipping a compass needle from north to south or east to west, respectively. Unfortunately, correcting bit errors makes phase errors worse, and vice versa. In other words, a more precise bearing north results in a less accurate bearing east or west. But later in 1995 Shor figured out how to combine bit correction and phase correction—a chain of operations not unlike solving a Rubik’s Cube without altering a completed side. Shor’s factoring algorithm will remain impractical until quantum computers become far more powerful (the largest number yet factored with the algorithm is 21, so classical factoring remains in the lead—for now). But his error-correction scheme made quantum computing possible, if not yet practical. “That is when the whole thing became real,” Brassard says.
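The classical core of bit-flip correction is the repetition code: store three copies of each bit and take a majority vote. Quantum states cannot simply be copied (the no-cloning theorem forbids it), but Shor’s nine-qubit code nests a version of this idea twice, once for bit flips and once, in a rotated basis, for phase flips. A minimal classical sketch of the voting step:

```python
from collections import Counter

def encode(bit):
    """Three-copy repetition code, the classical core of bit-flip correction."""
    return [bit] * 3

def correct(block):
    """Majority vote recovers the bit if at most one copy was flipped."""
    return Counter(block).most_common(1)[0][0]

codeword = encode(1)
codeword[0] ^= 1               # a single bit-flip error strikes the first copy
assert correct(codeword) == 1  # the vote still recovers the original bit
```

The quantum version must extract the vote without reading the encoded bit itself, since measurement would destroy the superposition, which is what makes Shor’s construction so much subtler than this sketch.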
All of this work led to new views of quantum mechanics and computing. For Deutsch, it inspired an even more fundamental theory of “constructors”—which, he says, describe “the set of all physical transformations.” Others remain agnostic about the likelihood of further deep insights emerging from the quantum realm. “Quantum mechanics is really strange, and I don’t think there’s ever going to be any easy way of understanding it,” Shor says. Asked whether his work on quantum computing makes the nature of reality easier or harder to understand, he impishly says, “It certainly makes it more mysterious.”
What began as a pastime or eclectic intellectual pursuit has now grown far beyond the wildest imaginings of the field’s pioneers. “We never thought it would ever become practical. It was just a lot of fun to think about these crazy ideas,” Brassard says. “At some point, we decided we were serious, but people didn’t follow us. It was frustrating. Now that it’s being recognized to such an extent is extremely gratifying.”