Quantum Fluctuations Enhance Algorithm Efficiency.
Originally Published 2 years ago — by Quanta Magazine

Randomness has played an important role in computer science since the field's inception. Counterintuitively, injecting randomness into an algorithm can help it compute the correct answer to an unambiguous true-or-false question, provided a small probability of error is acceptable. Randomized algorithms have been used in primality testing and in graph theory to solve problems efficiently. And even when a deterministic alternative exists, it is often efficient only in principle; randomized algorithms remain popular because they tend to run faster in practice, and de-randomizing an algorithm without losing that efficiency can be tricky. Randomness has found countless other uses in computer science, from cryptography to game theory to machine learning.
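The primality-testing example above can be made concrete with the classic Miller-Rabin test, a randomized algorithm that decides the true-or-false question "is n prime?" with an error probability that shrinks exponentially in the number of random trials. The sketch below is an illustrative implementation, not code from the article; the function name and the choice of 20 rounds are my own.

```python
import random

def is_probably_prime(n, rounds=20):
    """Miller-Rabin test: False means n is definitely composite;
    True means n is prime with error probability at most 4**-rounds."""
    if n < 2:
        return False
    # Handle small primes and their multiples deterministically.
    for p in (2, 3, 5, 7, 11, 13):
        if n % p == 0:
            return n == p
    # Write n - 1 as d * 2**s with d odd.
    d, s = n - 1, 0
    while d % 2 == 0:
        d //= 2
        s += 1
    for _ in range(rounds):
        # Pick a random base; each trial independently catches
        # at least 3/4 of composites.
        a = random.randrange(2, n - 1)
        x = pow(a, d, n)  # modular exponentiation: a**d mod n
        if x in (1, n - 1):
            continue
        for _ in range(s - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False  # a is a witness that n is composite
    return True
```

This is exactly the trade the paragraph describes: the answer is only probably correct, but with 20 rounds the chance of error is below one in a trillion, and the test runs in polynomial time, whereas naive deterministic trial division takes exponential time in the number of digits.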