CIO Influence

Quantum Computing In The Now


Anyone paying attention to quantum computing can likely relate to a general sense of confusion. On one hand, quantum computing seems perpetually a decade away; on the other, multiple quantum computing companies and forward-leaning industry players are making big claims about near-term value from quantum computing. Can both of these trends be true at the same time? The simple answer is a resounding yes!


Since the 1980s, we have seen an ever-increasing number of publications about quantum computing and related fields. Because most of these publications came out years ahead of any practical ability to run the algorithms they describe, the industry featured many examples of theoretical solutions to hypothetical use cases requiring imaginary computing resources. These examples rely on mathematical proofs, and while intellectually engaging, they are not the kind of proof points that most industries look for as signs of maturity. Furthermore, many of the algorithms cited in these publications assume a theoretical level of perfection known as Fault-Tolerant Quantum Computing (FTQC). That is indeed where the industry is heading, but it is not there yet. Today, we are in the Noisy Intermediate-Scale Quantum (NISQ) era, and the tension between FTQC and NISQ causes most of what we experience as industry confusion.

This challenge is exacerbated by the lack of a precise definition of "fault tolerance." One common approach to improving the quality of qubits (the basic units of quantum information) is to virtualize them by grouping multiple physical qubits and running constant error correction across them, producing what are known as logical qubits. While promising, this technique does not produce perfection. In fact, all existing logical-qubit approaches provide incremental improvements in fidelity, measured in "number of 9s": a qubit with 99.99% fidelity is said to have "four 9s." How many 9s does a system need to be genuinely fault-tolerant? There is no simple answer, and in some cases a mathematically defined algorithm might require fifteen 9s to work correctly, an astonishingly high bar (comparable to the reliability expected of solid-state drives).
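
To make the "number of 9s" framing concrete, here is a minimal Python sketch with illustrative numbers only: it converts a fidelity figure into its count of 9s and shows how per-operation errors compound over a deep circuit, which is why an algorithm with many gates can demand far more 9s than any single operation provides. The independent-error assumption is a simplification.

```python
import math

def nines(fidelity: float) -> float:
    """Number of 9s in a fidelity figure, e.g. 0.9999 -> 4.0."""
    return -math.log10(1.0 - fidelity)

def circuit_success(per_gate_fidelity: float, gate_count: int) -> float:
    """Probability that a circuit of `gate_count` operations runs with no
    error at all, assuming independent errors (a simplification)."""
    return per_gate_fidelity ** gate_count

# A "four 9s" gate: 99.99% fidelity.
f = 0.9999
print(nines(f))                      # ~4.0
print(circuit_success(f, 100_000))   # ~4.5e-5: deep circuits almost always fail
# For a million-gate circuit to succeed ~99% of the time, each gate needs
# roughly a 1e-8 error rate, i.e. about eight 9s:
print(circuit_success(1 - 1e-8, 1_000_000))  # ~0.99
```

The gap widens further once an algorithm's required circuit depth grows, which is how requirements like "fifteen 9s" arise for some mathematically defined algorithms.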

So, how much value could quantum computers offer during the NISQ era? The answer depends on one's perspective. The quantum industry has multiple modalities (technical approaches to implementing qubits), and each modality has advantages, disadvantages, and a time horizon for becoming practical. If you evaluate an early-stage modality with lower-quality physical qubits (such as neutral atoms), you will likely hear that NISQ-era value is minimal and that the technology providers are aiming directly at FTQC. This massive leap is often supported by an early prototype demonstration of logical qubits, which contributes to further confusion, as having logical qubits is by no means evidence of fault tolerance. Most near-term prototype demonstrations of logical qubits are still anchored deeply in the NISQ era and must overcome multiple significant engineering challenges before they become commercially impactful.
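
The gap between "we have logical qubits" and "we are fault-tolerant" can be illustrated with the simplest error-correcting construction, a classical repetition code under independent bit-flips. The sketch below (illustrative physical error rate, not any vendor's figure) shows that encoding does reduce the logical error rate, yet the result remains many orders of magnitude short of the most demanding algorithmic requirements.

```python
from math import comb

def repetition_logical_error(p: float, n: int = 3) -> float:
    """Logical error rate of an n-qubit repetition code under independent
    bit-flips with probability p: an error survives majority voting only
    when more than half of the physical qubits flip."""
    k_min = n // 2 + 1
    return sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(k_min, n + 1))

p_physical = 0.01  # a "two 9s" physical error rate, for illustration
for n in (3, 5, 7):
    print(n, repetition_logical_error(p_physical, n))
# Each increase in code size helps, but even n = 7 lands around seven 9s,
# nowhere near a fifteen-9s requirement: logical qubits alone are not
# evidence of fault tolerance.
```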

This picture changes completely if you evaluate a more mature quantum modality, like trapped ions, with higher-quality physical qubits, longer coherence times, and better connectivity between the qubits. Companies in this space have engaged with customers for several years now, exploring new algorithms and focusing on real-world market needs. Unlike the more theoretical algorithms, NISQ algorithms are designed with error rates in mind, as part of a larger hybrid data-processing pipeline involving error mitigation techniques. Both technology providers and industry players recognize that NISQ-era quantum computers offer specific value, not a general-purpose computing platform. Yet that value can help revolutionize entire industries if embraced at the right time.
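
As one concrete illustration of the kind of error mitigation such hybrid pipelines employ, the sketch below implements zero-noise extrapolation (ZNE) in plain Python: the same observable is measured at several artificially amplified noise levels, and a classical fit extrapolates back to the zero-noise limit. The noise model and numbers here are invented for illustration; real pipelines amplify noise on hardware (for example, via gate folding) rather than with a toy function.

```python
def zero_noise_extrapolation(measure, scale_factors):
    """Linear (Richardson-style) extrapolation to the zero-noise limit.

    `measure(scale)` returns the noisy expectation value of some observable
    when hardware noise is amplified by `scale`. A least-squares line is fit
    through the samples and evaluated at scale = 0.
    """
    xs = list(scale_factors)
    ys = [measure(s) for s in xs]
    n = len(xs)
    x_mean = sum(xs) / n
    y_mean = sum(ys) / n
    slope = (sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, ys))
             / sum((x - x_mean) ** 2 for x in xs))
    return y_mean - slope * x_mean  # intercept = estimate at zero noise

# Toy "hardware": the true value is 1.0, and the signal decays linearly
# with the noise scale (invented model, illustration only).
def noisy_measure(scale, true_value=1.0, error_per_unit=0.08):
    return true_value * (1.0 - error_per_unit * scale)

est = zero_noise_extrapolation(noisy_measure, [1.0, 1.5, 2.0])
print(est)  # ~1.0, versus the raw 0.92 measured at scale 1
```

This is the NISQ design pattern in miniature: classical pre- and post-processing wrapped around noisy quantum measurements, with the algorithm chosen to tolerate the residual error.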


How would you know if NISQ-era algorithms could address your specific needs? One way of thinking about it is through a logical hierarchy of value creation. It starts with specific quantum algorithms that prove to be both error-resilient and advantageous compared to their classical counterparts. It has been observed repeatedly that many quantum algorithms are flexible and can deliver value across a broad range of use cases and industries. Selecting the right use cases for these algorithms in the context of a specific industry yields candidates for commercially advantageous target applications. Examples can be found in chemistry algorithms, which can already deliver meaningful value in the manufacturing and pharmaceutical verticals today, and in optimization and hybrid quantum+AI algorithms, with use cases across multiple industries and verticals.

While expertise in quantum algorithms can often be found with the quantum technology providers, identifying use cases and defining what counts as a tangible benefit is entirely within the expertise of each industry. It is only through a meeting of the minds between the two sides (technology providers and industry) that meaningful progress is made, and these are the types of commercial relationships that have been flourishing over the last year or so, giving us a glimpse of what is happening behind the scenes.

When considering quantum applications for production, factors such as seamless integration into hybrid pipelines and choosing an enterprise-grade technology partner are also crucial. The technological path to scale and the raw performance metrics (error rates, connectivity, speed) contribute to the final result but can also be confusing. Performance ultimately needs to be measured in terms of the final outcome, with metrics such as "time to solution," "cost to solution," and "energy to solution." Additionally, quantum computing can often provide solution qualities and insights that exceed what classical methods deliver.
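
One way to compare candidate solvers on these outcome-level metrics is simply to tabulate and weight them. The sketch below does this for hypothetical classical-only and hybrid-quantum runs; all figures and weights are invented for illustration, and real evaluations would use application-specific numbers.

```python
from dataclasses import dataclass

@dataclass
class SolutionMetrics:
    name: str
    time_to_solution_s: float     # wall-clock time to an acceptable answer
    cost_to_solution_usd: float   # total compute spend for the run
    energy_to_solution_kwh: float # total energy consumed
    solution_quality: float       # domain-specific score, higher is better

def pick_best(candidates, w_time=1.0, w_cost=1.0, w_energy=1.0):
    """Rank candidates by a simple weighted score: lower resource use wins,
    with solution quality subtracted as a bonus. Weights are application-
    specific and would be set by the business owner of the use case."""
    def score(m):
        return (w_time * m.time_to_solution_s
                + w_cost * m.cost_to_solution_usd
                + w_energy * m.energy_to_solution_kwh
                - m.solution_quality)
    return min(candidates, key=score)

# Hypothetical numbers for illustration only.
runs = [
    SolutionMetrics("classical-only", 3600, 120.0, 9.5, 0.90),
    SolutionMetrics("hybrid-quantum", 1800, 150.0, 4.0, 0.93),
]
print(pick_best(runs).name)
```

Changing the weights changes the winner, which is the point: "better" is defined by the final business outcome, not by any single hardware metric.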

The long-term vision for quantum is unchanged and highly compelling. Leading companies and organizations that invest in quantum now are securing their competitive position and potentially unlocking new value ahead of their competitors. NISQ quantum algorithms are extraordinarily powerful, and when applied to the correct use cases with the right technology choices, they can help enable first-of-their-kind solutions. The proven path for industry in quantum is to partner with an experienced quantum technology provider and establish a productive collaboration between like-minded scientists and engineers. The "ChatGPT moment" of quantum computing is around the corner, and it will likely be powered by NISQ algorithms and the creativity of innovators across industries.

