The Lure of Quantum Computing (QC) – Part 1


Spike Narayan

Spike Narayan is a seasoned hi-tech executive managing exploratory research in science and technology at IBM.

 

In this article, the first of a two-part series on quantum computing, we will explore why quantum computing is such a big deal. In the second part, we will look at what quantum computing actually is, and why it is a hard scientific and engineering challenge that the scientific community has embraced and is collectively starting to make commendable progress on.

Theory and computation have always been at the heart of physical sciences research. Centuries ago, scientists relied on pen and paper to formulate hypotheses and demonstrate proofs. Be it the mathematical treatises of Ramanujan, Maxwell's equations, or Einstein's theories, our ability to perform calculations has been at the center of major advances in our understanding of the world around us. As our comprehension of the secrets of the universe improved, so too did the need for more sophisticated computational methods to keep pushing the boundaries of science.

The description of the von Neumann architecture in 1945, the invention of the transistor in 1947, and later the integrated circuit in 1958 started cycles of innovation that improved our ability to perform complicated calculations exponentially for decades. Until very recently, this computational speed-up has served the scientific community well. There is even a list of the 500 fastest supercomputers, published twice a year (https://www.top500.org/), that touts the eye-popping speed-ups that have become commonplace.

IBM has famously made this list countless times and, as of June 2021, has two of the top three supercomputers in the world. They are "Summit," installed at Oak Ridge National Lab, which clocks in at about 148 PFlop/s, and "Sierra," installed at Lawrence Livermore National Lab, which comes in at just under 100 PFlop/s. One PFlop/s is one quadrillion (10^15) floating-point operations per second – truly mind-boggling!
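To make the unit concrete, here is the simple conversion behind Summit's headline number (plain arithmetic, nothing beyond what the article states):

\[
148\ \text{PFlop/s} \;=\; 148 \times 10^{15}\ \text{flop/s} \;\approx\; 1.5 \times 10^{17}\ \text{floating-point operations every second.}
\]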

If supercomputers are this good, what then is all the fuss about quantum computers? The reason stems from the fact that many classes of problems fall outside the reach of even the best supercomputers today. Accurate weather modeling, complex optimization, modeling of chemical reactions, and breaking encryption codes are just a few of them.

Let us take weather modeling as a first example. You may have noticed that, the world over, severe weather appears to be more common, and yet the supercomputing community has struggled to predict such drastic changes in weather patterns; it is getting more difficult to do so each year. This could be attributed to the fact that many more variables seem to impact global weather than was the case five years ago. Just half a degree of change in our planet's temperature has taken the reasonably tractable problem of weather prediction and made it more and more intractable. Enter the need for a different computing paradigm.

As a second, more concrete example, let's take modeling the reactions of a caffeine molecule. We would need approximately 10^48 bits to accurately represent the energy configuration of a single caffeine molecule at any single instant. Building a supercomputer with this gargantuan number of bits would require nearly 10% of all the atoms in our planet. So it is impossible to ever build a conventional supercomputer big enough to study the reactions of caffeine.
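A rough back-of-the-envelope check makes the scale clear (the Earth atom count of roughly 1.3 × 10^50 is a commonly cited outside estimate, not a figure from this article):

\[
\frac{10^{48}\ \text{bits}}{\sim 1.3 \times 10^{50}\ \text{atoms on Earth}} \;\approx\; 0.008 \;\approx\; 1\%.
\]

Even at an idealized one bit per atom, the memory alone would consume about 1% of the planet's atoms; since any practical memory cell needs several atoms per bit, figures approaching the 10% quoted above follow quickly.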

As we go to more complicated molecules like sucrose or penicillin, we run out of atoms in the entire solar system, or even the universe. This again points to the need for innovation in computing paradigms, and the examples are endless. This stark realization is what makes the exploration of other non-conventional (beyond von Neumann) computing schemes so important. However, novel computing architectures have been hard to come by. It has been known for decades that quantum computing has the potential to tame hard computational problems, but that potential could never cross from the domain of theory into reality.

It is not for lack of trying; it simply turns out to be a hard scientific and technological challenge. That is why, as quantum computing hardware starts to appear on the scene, there is so much hope and excitement. The first quantum computer accessible to anyone in the world was introduced just a few years ago, when IBM made a 5-qubit system available for public use in the cloud. Just a few months ago, IBM unveiled a 127-qubit quantum computer that offers a path to solving even more complex problems.
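To give a flavor of how approachable these cloud systems have become, here is a minimal sketch using Qiskit, IBM's open-source quantum SDK. The two-qubit entangling circuit and the local simulator are our illustrative choices, not something specified in this article; running on real IBM cloud hardware additionally requires an IBM Quantum account.

    # Minimal sketch: build a 2-qubit entangled "Bell state" circuit with
    # Qiskit and inspect it with the local statevector simulator.
    # Submitting to real IBM cloud hardware would additionally require
    # an IBM Quantum account and a runtime provider.
    from qiskit import QuantumCircuit
    from qiskit.quantum_info import Statevector

    qc = QuantumCircuit(2)   # two qubits, both starting in |0>
    qc.h(0)                  # Hadamard: put qubit 0 into an equal superposition
    qc.cx(0, 1)              # CNOT: entangle qubit 1 with qubit 0

    # Compute the exact final state locally and show outcome probabilities.
    state = Statevector.from_instruction(qc)
    print(state.probabilities_dict())   # expected: {'00': 0.5, '11': 0.5}

On the public cloud systems described above, the same circuit would be queued to real superconducting qubits, with measured shot counts standing in for these exact probabilities.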

There are a few promising approaches to quantum computing. While IBM and Google have chosen to explore superconducting qubit technology, others are pursuing technologies such as trapped ions and silicon spins. Large companies like Intel and Microsoft have also entered the race, and several dozen start-ups are emerging in this space. A vibrant ecosystem of companies and partners is absolutely essential to tackle the scientific and engineering challenges that will propel this field to new heights. To make it even more exciting, several governments have started funding quantum initiatives: the USA, the European Union, China, and several other governments across Asia have made long-term commitments.

We have discussed the importance of and need for this new computing paradigm and seen that global investments to accelerate the field are already underway. In part two of this series we will dig a little deeper into what quantum computing is, why this novel compute architecture holds promise, and what challenges we must surmount to realize the promise of QC at scale.