Complexity as History and History as Complexity

Santa Fe Institute
May 31, 2023

A BEYOND BORDERS column by David Krakauer, President of the Santa Fe Institute.

Detail from “The Temple of Time” (1846) by Emma Willard

At a recent meeting at SFI, my colleagues Kyle Harper (history of Rome and pandemics) and David Wolpert (physics of information and computation) encouraged a small group of historians and complexity scientists to reflect on recent developments in the thermodynamics of computation. The premise of the meeting was that a full account of history requires some means of connecting the energetics of the natural world with the information required for human societies to access and utilize this free energy. In Harper’s account, history is a series of “algorithmic” inventions that more effectively couple the energy supply to societal growth, diversification, and maintenance.

The “special relationship” between history and complexity has its own curious history — starting in medias res with Philip Anderson’s foundational paper, “More is Different” (1972), in which he pointed out the absolutely central role of broken symmetry in establishing the basis for all complexity. Broken symmetry describes the “historical” selection of one state from a set of states, all more or less equivalent with respect to the underlying symmetrical laws of physics. The best-known example is the chirality of biological molecules: the amino acids of living things are almost exclusively “left-handed,” whereas physical law tells us that left- and right-handed forms should occur in equal proportion. All mesoscopic and macroscopic structures (molecules and up) are built on broken symmetries, including mechanisms of gene regulation and neural decision-making — the “algorithms” that have come to define the evolution of complex life.

In 1948 Claude Shannon introduced a mathematical theory of communication that sought to explain the absolute limits on reproducing at one point in space a message encoded at another point in space. The information is measured by the number of messages that can be reliably selected from a large set (more precisely, by the logarithm of that number), “all choices being equally likely.” In other words, the Shannon information measures broken symmetry in a space of equally probable messages. Consider the game of 20 questions. One starts in a state of maximum uncertainty (symmetry — where all possibilities are exchangeable) and, through a series of binary questions (animal or vegetable? etc.), symmetries are broken. If we slightly change Shannon’s framework, replacing space with time (transmitting messages forward in time rather than across space), then history describes those moments in time when accidents become frozen (the archduke was shot, the Bastille was stormed, or a plague rat boarded a trade ship).
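
To make the bookkeeping concrete (a back-of-the-envelope illustration, not Shannon’s own example): when all N alternatives are equally likely, identifying one of them yields log2(N) bits of information. Twenty well-chosen yes/no questions can therefore single out one possibility from as many as 2^20, roughly a million, equally likely candidates; each answer halves the remaining space and freezes one more accident into place.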

In 1961 a researcher at IBM, Rolf Landauer, attempted to connect Shannon’s message-bits to thermodynamics. Landauer realized that for every broken symmetry, energy would need to be expended. He proposed that, in an environment at a given temperature, the energy expended could never be less than kT ln 2, where T is the temperature and k is the Boltzmann constant. Hence information requires a source of energy, and the noisier (higher in temperature) the environment, the more energy is required. Landauer unites elements of Anderson and Shannon with thermodynamics, and in effect tells us that the longer a history (the more frozen accidents the Shannon information counts), the more energy needs to be expended.
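
To put a number on the bound (an illustrative calculation using standard values, not a figure from the meeting): at room temperature, with T ≈ 300 K and k ≈ 1.38 × 10^-23 joules per kelvin,

kT ln 2 ≈ 1.38 × 10^-23 × 300 × 0.693 ≈ 3 × 10^-21 joules per bit.

The price of a single frozen accident is minuscule, but it is never zero, and it grows with the temperature of the surroundings.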

The problem with the Landauer bound is that while it places a lower bound on the cost of information, it tells us nothing about the cost of the decision the information encodes. By analogy, Landauer might say that opening and closing a door requires more energy than flicking a switch, while ignoring the fact that the door provides access to meager calories in a pantry, whereas the switch turns on a monstrous hydroelectric dam. As Shannon and Weaver wrote in 1964, “information must not be confused with meaning,” such that for two equally informative messages, one is “heavily loaded with meaning and the other . . . is pure nonsense.” As David Wolpert puts it, we are less often worried about the cost of a bit and more interested in “the bang per bit.”

One of the keys to understanding the cost of the meaning of information is to explore more carefully how the information is used — how an algorithm or procedure takes an input and processes it into an output that we would describe as the “correct” solution to a local problem. This goes beyond moving a message around in space or time to consider how the message is transformed into something useful — how a message is computed. This is the problem that Charles Bennett tackled in his 1988 paper “Logical Depth and Physical Complexity.” Bennett showed that a principled measure of complexity can be calculated as the running time of the shortest program (the program whose length is the Kolmogorov complexity) required to produce the desired solution.
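
In rough notation (a sketch of the idea rather than Bennett’s exact formulation, which also admits programs a few bits longer than the minimal one): the logical depth of an object x is

depth(x) ≈ min { time(p) : U(p) = x and |p| = K(x) },

the time a universal computer U needs to produce x from a program p whose length matches x’s Kolmogorov complexity K(x). A random string is shallow, since its shortest description is essentially the string itself and prints almost instantly; an object is deep when even its most compressed description takes a long computation to unfold.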

The Harper–Wolpert meeting sought to take these foundational insights and ponder how they might be extended by considering recent progress in the non-equilibrium thermodynamics of computation. This, in brief, builds on a profound connection between energy dissipation and irreversible transitions between metastable states described by a Markov process. It is not clear how this will be done. What is clear is that measures of history bear a very close resemblance to measures of complexity, and that, as Murray Gell-Mann and Seth Lloyd showed us in their work on “Effective Complexity,” anything complex must have a long history. History should not be confused with time but identified with the steps of social procedures. History does not flow like a river but clicks like a wheel, and while in some centuries it is cacophonous, in others it is barely audible.

— David Krakauer
President, Santa Fe Institute

From the Spring 2023 edition of the SFI Parallax newsletter. Subscribe here for the monthly email version, or email “news at santafe.edu” to request quarterly home delivery in print.

