Quantum computing promises to accelerate artificial intelligence (AI) dramatically. But first, this futuristic technology must prove its worth as an alternative to more mature, traditional approaches to processing data-driven statistical algorithms.
Achieving quantum advantage for AI apps
Quantum computing might be capable, in its current form, of performing feats that are practically impossible for computers built on traditional von Neumann architectures. However, that has not been proven, and IBM isn’t making this claim, often known as “quantum supremacy,” pertaining to its own quantum R&D efforts.
In fact, IBM has taken a practical approach that keeps expectations for the technology's prowess in check. It has also been in the vanguard of debunking claims of this nature by other tech vendors. A recent case in point was Google's claim in fall 2019 that Sycamore, its 53-qubit quantum hardware platform, had completed a calculation in a few minutes that would have taken 10,000 years on the world's most powerful existing supercomputer, IBM's Summit.
Google's benchmark didn't fall into any of the core use cases—including AI, optimization, simulation, or even cryptography—for which quantum computing might someday hold an advantage over classical architectures. The proof of the pudding for AI is whether a computer built on quantum principles can do data-driven algorithmic inferencing faster than a classical computer, or, more optimistically, faster than the fastest supercomputers currently in existence.
For its own R&D efforts in this field, IBM is merely aiming at the more realistic goal of “quantum advantage.” This refers to any demonstration that a quantum device can solve a problem faster than a classical computer. Considering the range of commercial activity in this field, the likelihood that quantum architecture will soon show a clear performance advantage for core use cases—especially AI—grows by the day.
In that regard, it's notable that the recent quantum product announcements by IBM and other leading tech vendors all focus on AI use cases:
- IBM’s Qiskit open-source developer tool includes a library of cross-domain quantum algorithms for experimenting with AI, simulation, optimization, and finance applications for quantum computers.
- Microsoft’s Azure Quantum includes a software development kit with libraries for quantum apps in ML, cryptography, optimization, and other domains, as well as supporting Q#, a domain-specific programming language for expressing quantum algorithms.
- AWS' Amazon Braket service provides a single development environment to build quantum algorithms—including ML—test them on simulated quantum computers, and run them on a range of different hardware architectures.
- Honeywell is rolling out a high-capacity quantum computer and is making investments in partners with expertise in AI and other cross-vertical quantum use cases.
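To make concrete the kind of program these SDKs let developers express, here is a minimal, purely illustrative sketch (plain Python, no quantum SDK required) of the state-vector math behind a two-qubit Bell-state circuit—the "hello world" of quantum programming that tools like Qiskit, Q#, and Braket all support. The function names are ours, not any vendor's API:

```python
import math

# Two-qubit state vector over the basis |00>, |01>, |10>, |11>.
# Start in |00>.
state = [1.0, 0.0, 0.0, 0.0]

def hadamard_q0(s):
    # Hadamard on qubit 0 (the left bit): mixes |0x> with |1x>.
    h = 1 / math.sqrt(2)
    return [h * (s[0] + s[2]), h * (s[1] + s[3]),
            h * (s[0] - s[2]), h * (s[1] - s[3])]

def cnot_q0_q1(s):
    # CNOT with qubit 0 as control: flips qubit 1 when qubit 0 is 1,
    # i.e. swaps the |10> and |11> amplitudes.
    return [s[0], s[1], s[3], s[2]]

bell = cnot_q0_q1(hadamard_q0(state))
probs = [abs(a) ** 2 for a in bell]
print(probs)  # [0.5, 0.0, 0.0, 0.5] — measurement yields 00 or 11, each half the time
```

The entangled output is what gives quantum ML algorithms their distinctive representational power; the vendor SDKs above hide this linear algebra behind circuit-building APIs and route execution to simulators or real hardware.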
Growing the development ecosystem for programmable quantum AI
All of these vendors are building developer ecosystems around their various quantum computing platforms.
In January, IBM announced the expansion of Q Network, its 3-year-old quantum developer ecosystem. To encourage the development of practical quantum AI applications, IBM provides Q Network participants with Qiskit; IBM Quantum platform, which provides cloud-based software for developers to access IBM quantum computers anytime; and IBM Quantum Experience, a free, publicly available, and cloud-based environment for team exploration of quantum applications. Many of the workloads being run include AI, as well as real-time simulations of quantum computing architectures.
Another key industry milestone came in March when Google launched TensorFlow Quantum. This new software-only framework extends TensorFlow so that it can work with a wide range of quantum computing platforms, not limited to its own hardware, software, and cloud computing services.
Taking on AI’s grand challenges where classical computing has fallen short
As quantum techniques start to prove their practicality on core AI use cases, they will almost certainly be applied to AI’s grand challenges.
At the level of pure computer/data science, AI’s grand challenges include neuromorphic cognitive models, adaptive machine learning, reasoning under uncertainty, representation of complex systems, and collaborative problem solving.
We expect that quantum AI developers in the ecosystems of IBM and its rivals will tackle all of these grand challenges using their respective quantum AI tools, libraries, and platforms.
The most important grand challenges for quantum AI will have compelling practical applications. Chief among these is trying to mitigate a key risk that quantum technology has itself unleashed on the world: the prospect that it might break public-key cryptography as we know it. Fortunately, IBM and others are making progress on developing “quantum-resistant” cryptographic schemes.
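To see why quantum computing threatens public-key cryptography, consider the number-theoretic core of Shor's algorithm. The sketch below is a toy classical illustration with names of our own invention: it factors a small RSA-style modulus via order finding. On a classical machine the `order` step is exponentially slow for large moduli; a quantum computer performs exactly that step efficiently, which is the threat driving quantum-resistant cryptography:

```python
from math import gcd

def order(a, n):
    # Smallest r > 0 with a**r % n == 1. Brute force here;
    # this is the step a quantum computer speeds up exponentially.
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_factor(n, a):
    # Classical post-processing of Shor's algorithm: if the order r
    # of a mod n is even, gcd(a**(r//2) - 1, n) often yields a factor.
    r = order(a, n)
    if r % 2 != 0:
        return None  # unlucky choice of a; retry with another base
    y = pow(a, r // 2, n)
    f1 = gcd(y - 1, n)
    if 1 < f1 < n:
        return (f1, n // f1)
    return None

print(shor_factor(15, 7))  # (3, 5)
```

Here the order of 7 mod 15 is 4, so gcd(7² − 1, 15) = 3 recovers a prime factor. Scaled up to 2048-bit moduli with a fault-tolerant quantum computer, this same recipe breaks RSA—hence the push toward quantum-resistant schemes.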
Though it's not clear how much IBM is investing in the R&D needed to combat the technology's more malignant misuses, it's a safe bet that the company is deeply enmeshed in some fairly secretive projects in these domains.
Developers are everything to the future of quantum AI. The most important investment that IBM is making in quantum AI is to build out its developer and partner ecosystem and to provide them with sophisticated but consumable tools, libraries, and cloud services.
Among commercial solution providers, IBM’s quantum developer ecosystem, Q Network, is the most mature and extensive. Let’s hope that sometime this year IBM begins to support TensorFlow Quantum within Q Network and integrates it seamlessly into IBM Quantum Experience.