Researchers from NYU have found that classical computers can keep up with, and in some circumstances even surpass, quantum computers. By adopting an innovative algorithmic method, classical computers can gain a boost in speed and accuracy, which could mean they still have a future in a world of quantum computers.
Many experts believe that quantum computing is the future and that we are moving away from classical computing, primarily because classical computers are significantly slower and less powerful than their quantum counterparts. However, it turns out that quantum computers are delicate and prone to information loss, and even when information is preserved, it is difficult to convert it into the classical form needed for practical computation.
According to Interesting Engineering, classical computers don’t suffer from this issue of information loss and translation. Furthermore, classical algorithms can be designed to exploit these weaknesses and simulate a quantum computer with far fewer resources than previously believed, as explained in the study published in “PRX Quantum”.
The study claims that classical computing can perform faster and more accurate calculations than state-of-the-art quantum computers, a result achieved with an algorithm that keeps only part of the information stored in the quantum state, just enough to compute the final outcome accurately.
Dries Sels, one of the paper’s authors and an assistant professor in NYU’s Department of Physics, explains: “This work shows that there are many potential routes to improving computations, encompassing both classical and quantum approaches. Moreover, our work highlights how difficult it is to achieve quantum advantage with an error-prone quantum computer.”
The research team focused on a tensor network approach to modeling the data. Recent advances in the field allow these networks, long thought to be challenging to work with, to be optimized using tools borrowed from statistical inference. The new method keeps only the most important pieces of information and ignores the rest, much like compressing a photo into a smaller file without making it unrecognizable.
This way, their technique simplifies the quantum computing problem so that a regular computer can handle it more efficiently.
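To make the compression idea more concrete, here is a minimal, purely illustrative sketch in Python/NumPy. It is not the authors’ actual algorithm (which works with full tensor networks); the system size, the cut, and the rank kept are arbitrary choices for this toy example. It approximates a weakly entangled quantum state by keeping only its largest singular values across a cut between two halves of the qubits, discarding the rest, in the same spirit as keeping only the most important pieces of information.

```python
# Illustrative sketch only: compressing a quantum state with a truncated
# SVD, in the spirit of tensor-network compression. The number of qubits,
# the cut, and the rank kept (k) are arbitrary toy choices, not parameters
# of the NYU study.
import numpy as np

rng = np.random.default_rng(0)
n_qubits = 12
dim = 2 ** n_qubits
half = 2 ** (n_qubits // 2)        # split the qubits into two halves

# Construct a weakly entangled state: nearly rank-3 across the cut,
# plus a little noise. Such states are highly compressible.
A = rng.normal(size=(half, 3)) + 1j * rng.normal(size=(half, 3))
B = rng.normal(size=(3, half)) + 1j * rng.normal(size=(3, half))
M = A @ B + 0.01 * (rng.normal(size=(half, half))
                    + 1j * rng.normal(size=(half, half)))
psi = M.reshape(dim)
psi /= np.linalg.norm(psi)

# The SVD across the cut exposes how the two halves are entangled.
U, s, Vh = np.linalg.svd(psi.reshape(half, half), full_matrices=False)

# Keep only the k largest singular values (the "most important pieces
# of information"), discarding the rest like lossy image compression.
k = 3
M_trunc = (U[:, :k] * s[:k]) @ Vh[:k, :]
psi_trunc = (M_trunc / np.linalg.norm(M_trunc)).reshape(dim)

# Fidelity measures how faithfully the compressed state reproduces the
# original, while storage drops from 2^n amplitudes to two thin matrices.
fidelity = abs(np.vdot(psi, psi_trunc)) ** 2
print(f"kept {k} of {len(s)} singular values, fidelity = {fidelity:.4f}")
```

Tensor-network methods apply this kind of truncation repeatedly, one cut at a time, so the memory needed grows far more slowly than the full list of 2^n amplitudes, which is what allows an ordinary computer to keep pace in the regimes studied.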