Chinese scientists create quantum processor 60,000 times faster than current supercomputers
The race is on to develop a quantum computer capable of outperforming a conventional supercomputer. If scaled up to practical sizes, quantum computers would represent the biggest leap forward in computing in decades, with the potential to leave our current machines in the dust, but significant hurdles remain.
Now, a team of Chinese researchers has created a superconducting quantum processor with 66 functional qubits that, when faced with a complex sampling task, was able to overtake even the most powerful supercomputers and complete it in a fraction of the time.
What makes the research so impressive is that it demonstrates a huge leap toward quantum primacy, a milestone at which a quantum computer accomplishes a task that no conventional computer feasibly can.
The research is published in Physical Review Letters.
The team is led by Jian-Wei Pan of the University of Science and Technology of China, who has produced both this superconducting processor and an alternative system based on photonics, or light. To achieve quantum primacy, the team used “sampling problems” as the computational task: problems whose solutions are not single answers but many random “samples” drawn from a probability distribution.
With such a vast space of possible outputs, it is possible to design a sampling problem that a conventional computer cannot feasibly solve but a quantum computer can, thereby demonstrating quantum primacy.
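To make the idea concrete, here is a toy sketch (not the team's actual method, and `random_circuit_sampling` is a hypothetical helper written for illustration) of what such a sampling task looks like when simulated classically: build the full quantum state of a small random circuit, then draw bitstring samples from its output distribution. The state holds 2^n amplitudes, so the classical memory and time cost grows exponentially with the qubit count, which is the crux of the primacy argument.

```python
import numpy as np

def random_circuit_sampling(n_qubits, depth, shots, seed=0):
    """Toy classical simulation of sampling from a random quantum circuit.

    Illustrative only: the state vector has 2**n_qubits complex amplitudes,
    so this approach becomes infeasible long before 56 qubits.
    """
    rng = np.random.default_rng(seed)
    dim = 2 ** n_qubits
    state = np.zeros(dim, dtype=complex)
    state[0] = 1.0  # start in |00...0>
    for _ in range(depth):
        # Apply a Haar-random unitary layer (a stand-in for gate layers).
        z = rng.normal(size=(dim, dim)) + 1j * rng.normal(size=(dim, dim))
        q, r = np.linalg.qr(z)
        d = np.diag(r)
        q *= d / np.abs(d)  # phase fix so q is Haar-distributed
        state = q @ state
    probs = np.abs(state) ** 2
    probs /= probs.sum()  # guard against rounding drift
    samples = rng.choice(dim, size=shots, p=probs)
    return [format(int(s), f"0{n_qubits}b") for s in samples]

print(random_circuit_sampling(n_qubits=4, depth=3, shots=5))
```

A quantum processor produces such samples directly by running the circuit and measuring, with no need to store the exponentially large state.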
To this end, Pan and his colleagues needed to upgrade their quantum processors. Quantum computers use qubits to process data, and a convincing demonstration requires processors with more qubits than had previously been achieved.
The largest quantum processors can currently handle around 50 qubits, largely due to the physical limitations of the chip. Pan’s new tunable superconducting processor, called Zuchongzhi, has 66 functional qubits.
When faced with an extremely complex sampling problem, estimated by the researchers to be two to three orders of magnitude more demanding than previous tasks given to quantum processors, Zuchongzhi finished it in 1.2 hours. Pan and his colleagues estimate that the same problem would take the most powerful supercomputers around 8 years to solve.
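The headline figure follows directly from these two numbers; a quick back-of-the-envelope check:

```python
# 8 years on a classical supercomputer versus 1.2 hours on Zuchongzhi.
hours_per_year = 365.25 * 24          # ~8,766 hours
classical_hours = 8 * hours_per_year  # ~70,000 hours
quantum_hours = 1.2
speedup = classical_hours / quantum_hours
print(round(speedup))  # ~58,000, consistent with the "60,000x" headline
```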
In this case, the researchers used only 56 qubits for the sampling problem, 3 more than in Google’s earlier claim to primacy. Even such a small increase, however, demands far more computing power from a conventional computer, which the team hopes will cement their claim to primacy.
Whenever researchers claim primacy, they meet intense skepticism. Skeptics typically argue that the best available classical algorithms were not used when conventional computers were pitted against their quantum rivals; with such a large margin over previous claims, Pan and his colleagues hope to settle the debate over whether primacy has been achieved.
So what does all this mean? First, when it comes to sampling problems, quantum computers do appear to be significantly better than conventional ones. That’s not to say they’re practical just yet: a lot more innovation is needed before quantum computers are used for real-world tasks, and that probably won’t happen any time soon.
However, it is quite possible that for certain computational tasks quantum processors are the ideal solution, and they may find use in niche scenarios in the future.