What exactly is a computer? Until the mid-20th century, the word typically referred to humans who did calculations for big science and engineering projects. Those human computers were then replaced by electronic behemoths hard-wired to solve specific math problems. Today we can use a single laptop for a host of unrelated tasks that seem far removed from number crunching. A definition of computing that encompasses all of this might seem too vague to be useful. Yet computing does have a precise definition, one that dates back nearly 90 years to a paper by the famed logician Alan Turing. Turing imagined hypothetical devices (now called Turing machines) that would read and write 0s and 1s on an infinitely long tape according to a set of simple rules. Computing, he then proposed, was just whatever Turing machines could do. That definition may sound circular, but Turing's simple model proved enormously influential. It revealed the fundamental limits of computing and laid the theoretical foundations for the general-purpose computers we now use every day.

Turing devised his model of computation to answer a major open problem in the foundations of mathematics about the limits of step-by-step theorem-proving procedures, or algorithms. He showed that Turing machines could in principle execute any known algorithm, and then proved that there are fundamental limits on what problems Turing machines can solve. This suggested that certain "undecidable" problems are beyond the reach of algorithms altogether.

Turing's milestone result wasn't all bad news. He also proved that it's theoretically possible to build a universal Turing machine that can simulate the computations of any other Turing machine. That insight inspired the mathematician John von Neumann to devise a more practical plan for a programmable computer, one that could run different programs to solve different problems.
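Turing's tape-and-rules model is simple enough to sketch in a few lines of code. The simulator and the little "unary increment" machine below are illustrative inventions, not anything from Turing's paper: a machine is just a table of rules mapping each (state, symbol read) pair to a symbol to write, a direction to move the head, and a next state.

```python
# Minimal Turing machine simulator (an illustrative sketch).
# The tape holds 0s and 1s and is stored sparsely in a dict,
# so it is effectively unbounded in both directions.

def run(rules, tape_bits, start="A", halt="H", max_steps=10_000):
    tape = {i: b for i, b in enumerate(tape_bits)}
    head, state, steps = 0, start, 0
    while state != halt and steps < max_steps:
        # Look up the rule for the current state and the symbol
        # under the head (blank cells read as 0).
        write, move, state = rules[(state, tape.get(head, 0))]
        tape[head] = write
        head += 1 if move == "R" else -1
        steps += 1
    return steps, sum(tape.values())

# Example machine (hypothetical): scan right over 1s, then
# write one more 1 and halt -- a unary increment.
increment = {
    ("A", 1): (1, "R", "A"),  # keep moving right over 1s
    ("A", 0): (1, "R", "H"),  # first blank cell: write a 1, halt
}

print(run(increment, [1, 1, 1]))  # (4, 4): 4 steps, four 1s on the tape
```

Everything a modern computer does can, in principle, be broken down into table lookups like these; that is the sense in which Turing's definition of computing is precise.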
Turing's universality result also prompted later researchers to explore intimate connections between computation and the fundamental laws of physics, as Michael Nielsen explained in a 2015 column for Quanta. Alan Turing never intended his imaginary machine to be a blueprint for a real computer. But his invention nonetheless played a central role in the history of computer science, and it inspired questions that researchers are still grappling with today.

What's New and Noteworthy

Turing machines aren't the only way to model computation mathematically. Researchers have devised countless other models that are "Turing complete," meaning they can theoretically execute all the same algorithms that Turing machines can. Some of these models are much less practical than others. In 2024, Quanta editor Jordana Cepelewicz wrote about a whimsical new addition to the list: origami computing, in which different ways to fold paper represent elementary mathematical operations.

Turing completeness has a dark side: Any universal computational system must be plagued by the same undecidability that afflicts Turing machines. Charlie Wood recently surveyed decades of work on Turing completeness and undecidability in physics. For some physics problems that seem simple, researchers have proved that it's impossible to devise any general-purpose algorithm for finding a solution, even if you have perfect knowledge of the system's configuration.

Even as many researchers have turned to newer models of computation, others continue to explore the behavior of Turing machines. One of my favorite topics in computer science is the busy beaver game, in which researchers aim to find the longest possible running time for Turing machines with a given number of rules. Last year, I reported on an online community of amateur mathematicians who found the busy beaver for five-rule Turing machines, settling a question that had remained open for 50 years.
This summer, I wrote a follow-up story about progress on the six-rule case that's forced researchers to grapple with staggeringly large numbers. And in 2020, John Pavlus showed how researchers have used the busy beaver game to gauge the difficulty of the thorniest problems in mathematics.
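For a concrete taste of the game, the two-state case has long been settled: the champion machine runs for 6 steps and leaves four 1s on the tape before halting. Here is a sketch that simulates it (the simulator is an illustrative implementation, not the actual tooling used by busy beaver researchers):

```python
# Simulate the known two-state, two-symbol busy beaver champion.
# Illustrative sketch; the tape is a sparse dict of 0s and 1s.

def run(rules, max_steps=100):
    tape, head, state, steps = {}, 0, "A", 0
    while state != "H" and steps < max_steps:
        write, move, state = rules[(state, tape.get(head, 0))]
        tape[head] = write
        head += 1 if move == "R" else -1
        steps += 1
    return steps, sum(tape.values())

# (state, symbol read) -> (symbol to write, head move, next state)
bb2 = {
    ("A", 0): (1, "R", "B"),
    ("A", 1): (1, "L", "B"),
    ("B", 0): (1, "L", "A"),
    ("B", 1): (1, "R", "H"),  # the halting transition
}

print(run(bb2))  # (6, 4): halts after 6 steps with four 1s written
```

"Settling" a busy beaver case means proving that every other machine of that size either halts sooner or never halts at all, and it's that exhaustive analysis which becomes staggeringly hard by five and six rules.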