Quantum computing is entering an era of more focused investment, more commercial contracts, and more competition. Many start-ups are transitioning to scale-ups with growing teams, ambitions, and revenue streams. As quantum computing capabilities become more sophisticated, so does engagement from research institutions and supercomputing centers, which are investing tens of millions for on-premises installations of the latest machines.
Yet, although breakthrough achievements abound across the ecosystem, the industry is still at a relatively early stage. Uncertainty remains as to which hardware approaches will be the most successful. Roadmaps suggest a turning point for advantage and value creation should only be a few years away, but significant technology challenges lie ahead to truly achieve the scalability and revolutionary promise of quantum computing.
So, as pressure mounts on quantum computing to continue showing it is on track to deliver a return on the investment made by governments and private ventures alike – what are the crucial next steps? IDTechEx outlines some key trends below, taken from its recently released report, “Quantum Computing Market 2025-2045: Technology, Trends, Players, Forecasts”.
A representation of how commercial value from quantum computing (focusing on gate-based) is anticipated to develop over the next 5-10 years as capabilities expand from use-case development tools to versatile machines. Source: IDTechEx
More error correction
Awareness of the importance of reducing errors in quantum computers has grown significantly. Individual physical qubits are notoriously vulnerable to decoherence from a variety of noise sources – from temperature and electromagnetic radiation to crosstalk. Decoherence is catastrophic for quantum advantage, as qubits no longer simultaneously represent 1s and 0s, but quite classically 1s or 0s.
One method of overcoming the impact of noise and decoherence is quantum error correction (QEC). In simple terms, this means creating abstracted, error-free, logical qubits from a collection of noisy physical qubits. By comparing the properties of the group, enough information about the noise can be extracted to correct it – analogous to playing a game of broken telephone enough times to decode the original message. The exact mathematical approaches to large-scale error correction remain a highly active area of research – particularly by the likes of experts at Riverlane. Yet the conclusion is clear: the number of logical qubits per system is becoming a more important benchmark of quantum computer hardware’s long-term potential for success.
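The redundancy-plus-comparison idea can be illustrated with a classical analogy: the three-bit repetition code, where a majority vote recovers the original bit as long as fewer than half the copies were flipped. This is only a sketch – real QEC codes (such as surface codes) measure error syndromes without directly reading out the encoded state – and the noise level `p = 0.05` is an arbitrary value chosen for illustration:

```python
import random

def encode(bit, n=3):
    # Repetition code: copy the logical bit onto n physical bits
    return [bit] * n

def apply_noise(bits, p):
    # Flip each physical bit independently with probability p
    return [b ^ (random.random() < p) for b in bits]

def decode(bits):
    # Majority vote recovers the logical bit if fewer than half flipped
    return int(sum(bits) > len(bits) / 2)

random.seed(0)
trials = 100_000
p = 0.05  # per-bit flip probability (illustrative assumption)

# Error rate of a single unprotected bit vs. the encoded logical bit
raw_errors = sum(random.random() < p for _ in range(trials))
coded_errors = sum(decode(apply_noise(encode(0), p)) != 0 for _ in range(trials))

# The encoded error rate (~3p^2) should come out well below the raw rate p
print(f"raw: {raw_errors / trials:.4f}, encoded: {coded_errors / trials:.4f}")
```

The same logic underpins why so many physical qubits are needed per logical qubit: the protection comes from redundancy, and stronger protection generally demands more redundancy.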
Strikingly, the required ratio of physical to logical qubits varies dramatically between qubit modalities. Evidence suggests that for photonic designs it could be as low as 2:1, for neutral atom and trapped ion nearer 10:1 – while superconducting could require more than 1000:1. To some extent, this has temporarily leveled the playing field in the quantum computing market, allowing challengers such as QuEra to catch up with, if not overtake, giants like IBM and Google in the race for high numbers of logical qubits.
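The significance of these ratios is easy to see with back-of-the-envelope arithmetic. Using the approximate figures quoted above and a hypothetical target of 100 logical qubits (both the target and the exact ratios are illustrative assumptions, not hardware specifications):

```python
# Approximate physical-to-logical qubit ratios cited above (illustrative)
ratios = {"photonic": 2, "neutral atom": 10, "trapped ion": 10, "superconducting": 1000}

target_logical = 100  # hypothetical logical-qubit target, for illustration only

# Physical qubits each modality would need at its claimed ratio
physical_needed = {m: r * target_logical for m, r in ratios.items()}
for modality, count in physical_needed.items():
    print(f"{modality}: ~{count:,} physical qubits for {target_logical} logical qubits")
```

At these ratios, the same 100-logical-qubit target spans two to three orders of magnitude in physical qubit count – which is why the ratio itself, not just raw qubit numbers, shapes the competitive landscape.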
Overall, the need to now transition into a ‘logical era’ is clear. This is well evidenced by the focus on this benchmark in the latest roadmaps of multiple players across the industry. Yet, unfortunately, solely optimizing system design towards reducing errors won’t be enough to secure long-term success. For this, the impact on overall size and power consumption must also be considered.
Less demanding infrastructure
Overcoming the infrastructure limitations associated with scaling quantum computer hardware is no easy task. Almost all systems today require cooling, whether using cryostats or lasers, and it is often the cooling system that places the greatest demand on space. However, as logical qubit counts grow, the space within each cooling system to house them is running out.
As a result, many hardware roadmaps today show a modular approach with multiple systems connected. Since quantum computing is designed for high-value problems to be solved over the cloud, requiring a large footprint within a data center isn’t necessarily a huge barrier to adoption. However, in some instances, the associated power demand of this approach for a large-scale fault-tolerant (LSFT) machine is calculated to be in the megawatts – enough to warrant its own small modular reactor. To truly follow the trend of classical computing from vacuum tube to smartphone, it’s time to start making components smaller before capabilities can get bigger.
One key aspect impacting infrastructure demand is qubit density, or the physical size of qubits. Some modalities claim a significant advantage in this area over others. For example, it is currently estimated that superconducting and photonic designs could integrate thousands of qubits per chip, trapped-ion tens of thousands, and silicon-spin billions. This is partly limited by the dimensions of the quantum state utilized, as well as the manufacturing methods available to produce them. The size advantage offered by silicon-spin is largely a result of leveraging the highly optimized techniques already adopted by the semiconductor industry for transistor and CMOS manufacture. Notably, Microsoft is also working towards hardware-protected Majorana qubits, microns in scale, specifically citing the advantage of enabling a ‘single module machine of practical size’. That being said, given the impact of crosstalk and other noise sources, how the required spacing between qubits will change at scale across all modalities remains uncertain.
Furthermore, it can’t be overlooked that, as well as the qubits themselves, it is often the manipulation and readout systems that need the most space. For example, moving from hundreds to thousands of qubits can lead to unfeasible requirements for microwave cabling, interconnects, lasers, and more. As a result, many players are now also developing more optimized approaches to scalable manipulation and control. SEEQC have created a digital, on-chip alternative to analog control for superconducting qubits, which is now of growing interest to other modalities in the ecosystem. Similarly, Oxford Ionics have recently patented an ‘electronic qubit control’, an on-chip interface for trapped-ion modalities. In fact, overcoming ‘the wiring challenge’ is an almost ubiquitous focus of research among start-ups and established players alike. Looking ahead, remaining agile across the quantum stack will offer an advantage over vertical integration in this regard.
Market outlook
In this increasingly competitive industry, the coming years will illuminate which strategies hold the greatest promise for securing a lasting quantum commercial advantage. This task will be an uphill balancing act between reducing errors and scaling up logical qubit numbers while also optimizing for resource efficiency. This is without even considering gate-speed, algorithm development, and many other crucial factors. The enormity of the task will likely see many players fail to survive until the end of the decade. Yet with market consolidation and convergence of talent should come increased clarity as to where and when quantum advantage could be offered first – serving only to increase end-user confidence and engagement. Despite the headwinds, the world-changing potential of quantum computers within finance, healthcare, sustainability, and security will remain a tantalizing enough carrot for not only individual companies but entire nations to chase.
An overview of IDTechEx's independent quantum commercial readiness level (QCRL). Source: IDTechEx
IDTechEx’s report, “Quantum Computing Market 2025-2045: Technology, Trends, Players, Forecasts”, covers the hardware that promises a revolutionary approach to solving the world’s unmet challenges. The quantum computing market is pitched as enabling exponentially faster drug discovery, battery chemistry development, multi-variable logistics, vehicle autonomy, accurate asset pricing, and much more. Drawing on extensive primary and secondary research, including interviews with companies and attendance at multiple conferences, this report provides an in-depth evaluation of the competing quantum computing technologies: superconducting, silicon-spin, photonic, trapped-ion, neutral-atom, topological, diamond-defect and annealing. IDTechEx also presents an independent score for ‘quantum commercial readiness level’ to assess how the quantum computing industry is progressing compared to the evolution of the classical computing industry that came before it. The total addressable market for quantum computer use is converted to hardware sales over time, accounting for advancing capabilities and the cloud access business model. The quantum computing market is forecast to surpass US$10B by 2045 with a CAGR of 30%.
To find out more about this IDTechEx report, including downloadable sample pages, please visit www.IDTechEx.com/QuantumComputing.
For the full portfolio of quantum technologies market research available from IDTechEx, please see www.IDTechEx.com/Research/Quantum.
Upcoming free-to-attend webinar
How Is the Quantum Computing Market Evolving?
Dr Tess Skyrme, Senior Technology Analyst at IDTechEx and author of this article, will be presenting a free-to-attend webinar on the topic on Thursday 19 December 2024 – How Is the Quantum Computing Market Evolving?
Key aspects covered in this webinar include:
- What is quantum computing, and what is the state of the industry?
- How is the industry evolving as commercialisation ramps up?
- How do scalability challenges vary between quantum computing modalities?
- What are some key updates from major players in the ecosystem from 2024?
- Which trends can be expected in the quantum computing market in 2025 and beyond?
We will be holding exactly the same webinar three times in one day. Please click here to register for the session most convenient for you.
If you are unable to make the date, please register anyway to receive the links to the on-demand recording (available for a limited time) and webinar slides as soon as they are available.