Getting to a 100,000-qubit supercomputer


At IBM Quantum, our goal is to scale quantum computers to the point where they can solve the most difficult real-world problems.

We have a primary objective in mind to get there: a 100,000-qubit system by 2033. We are funding and collaborating with the University of Tokyo and the University of Chicago on focused research to create a system that could tackle some of the most important issues facing the world, including problems that even today's most sophisticated supercomputers may never be able to resolve.

Why 100,000? Last year, the quantum community convened at the IBM Quantum Summit 2022 to see the 433-qubit IBM Quantum Osprey processor unveiled, and to explore the newest Qiskit Runtime features that would speed up research and development and pave the way for quantum-centric supercomputing. There, we showed that we have mapped the path to scaling quantum computers to thousands of qubits, but the directions beyond that are less obvious.


Why? Scaling beyond that point combines challenges in supply chain, footprint, cost, chip yield, and energy use, to name a few. We need to collaborate on basic research spanning physics, engineering, and computer science to make sure these obstacles don't halt our progress. Just as no single company built the modern computing era, the world's top institutions are now banding together to address this challenge and usher in the new era. We need the help of a larger quantum industry.

Quantum computer scaling

Last year, we revealed our strategy for scaling quantum computers to useful size. Now that the groundwork has been laid, we can identify four critical areas that still need improvement to realize the 100,000-qubit supercomputer: quantum communication, quantum middleware, quantum algorithms and error correction (capable of utilizing multiple quantum processors and quantum communication), and components with the required supply chain.



To advance each of these four areas, we will fund research at the Universities of Chicago and Tokyo. The University of Tokyo will spearhead the search for, scaling of, and execution of end-to-end quantum algorithm demonstrations. They will also begin to build out the supply chain and develop new parts, such as cryogenics and control electronics, needed for such a large system. The University of Tokyo has already shown leadership in these fields: it leads the Quantum Innovation Initiative Consortium (QIIC), which brings together academia, government, and industry to develop quantum computing technology and create an ecosystem around it.

Through the IBM-UTokyo Lab, the university has already begun investigating and developing quantum computing techniques and applications, laying the foundation for the hardware and supply chain required to build a computer of this size.

The University of Chicago will spearhead efforts to bring quantum communication to quantum computing by combining quantum networks with classical and quantum parallelization. They will also lead the development of quantum middleware, including serverless quantum execution. This requires circuit knitting, physics-based error resilience, and program execution across different systems. Circuit knitting techniques divide large quantum circuits into smaller subcircuits whose results are recombined classically.
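The core idea behind circuit knitting can be illustrated with a toy sketch: if a cut separates a circuit into subcircuits with no entanglement across the cut, expectation values of product observables factor, so each piece can run on a smaller device and the results are multiplied classically. (This is only the simplest case; real circuit-knitting methods also cut through entangling gates using quasi-probability decompositions, at the cost of running extra subcircuit variants. The rotation angles and helper names below are illustrative, not from any IBM tool.)

```python
import numpy as np

# Pauli-Z observable for a single qubit.
Z = np.array([[1, 0], [0, -1]], dtype=complex)

def ry(theta):
    """Single-qubit Y-rotation gate matrix."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]], dtype=complex)

def expval(state, obs):
    """Expectation value <state| obs |state>."""
    return float(np.real(state.conj() @ obs @ state))

# Two independent single-qubit "subcircuits": |0> rotated by RY(theta).
theta_a, theta_b = 0.7, 1.3  # arbitrary illustrative angles
psi_a = ry(theta_a) @ np.array([1, 0], dtype=complex)
psi_b = ry(theta_b) @ np.array([1, 0], dtype=complex)

# "Knit": run each subcircuit separately, combine results classically.
knitted = expval(psi_a, Z) * expval(psi_b, Z)

# Reference: simulate the full two-qubit circuit and measure Z (x) Z.
psi_full = np.kron(psi_a, psi_b)
full = expval(psi_full, np.kron(Z, Z))

# For an entanglement-free cut, the two agree exactly.
assert abs(knitted - full) < 1e-12
```

In practice the payoff is that each subcircuit needs only half the qubits of the full circuit, which is exactly the kind of trade (more circuit executions for fewer qubits per execution) that middleware must orchestrate at scale.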

Through the Chicago Quantum Exchange (CQE), the University of Chicago has already demonstrated a track record of innovation in quantum technology and quantum communication. The CQE runs a 124-mile quantum network for research into long-range quantum communication. Several software approaches developed at the University of Chicago have also influenced IBM's middleware, and that of other companies in the industry, helping to give structure to quantum software.



We know how difficult it will be to build a 100,000-qubit machine. We know the road ahead, and we have a list of known knowns and known unknowns. As an industry, we must be eager to tackle the unanticipated difficulties as well. Together with the Universities of Chicago and Tokyo, we believe the target of 100,000 connected qubits by 2033 is achievable.

At IBM, we’ll keep working toward realizing quantum-centric supercomputing while empowering the community to pursue continuous performance enhancements. This means finding the advantages quantum processors offer over classical ones while treating quantum as one component of a larger HPC paradigm in which classical and quantum processors operate as a single computing unit. Together, we’re going to bring practical quantum computing to the world through our comprehensive strategy and our drive to reach the 100,000-qubit milestone.

