Information was mathematically defined by Claude Shannon, who, in 1949, described how a message from an information source could be transmitted through a noisy channel to a receiver.[^1] In his work, Shannon defined a **signal** as the physical transmission of **information** through a **medium**. In Shannon's framework, information is the meaningful message embedded within a signal—an identifiable **pattern** amidst the noise. To quantify this, Shannon borrowed the concept of *entropy* from Ludwig Boltzmann’s statistical mechanics, using it to measure the degree of uncertainty in a signal. Put simply, the *more* unpredictable a signal is before it is received, the *more* information it carries when it arrives. If you already know tomorrow’s weather with certainty, then a forecast is redundant. If, however, the sky’s mood is a mystery, a forecast reduces your uncertainty—thus, it provides information. Shannon's core insight was that information is defined by the **reduction of uncertainty**.
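Shannon's measure can be made concrete in a few lines of code. The sketch below computes entropy in bits for a discrete probability distribution (the weather probabilities are illustrative, not from the text):

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H = sum(-p * log2(p)), in bits.

    Measures the uncertainty of an outcome before it is observed --
    equivalently, the average information gained when it arrives.
    """
    return sum(-p * math.log2(p) for p in probs if p > 0)

# A forecast you already know with certainty carries no information.
certain_forecast = [1.0]              # rain with probability 1
print(shannon_entropy(certain_forecast))   # 0.0 bits

# A coin-flip sky: maximum uncertainty across two outcomes.
unknown_forecast = [0.5, 0.5]         # rain or sun, equally likely
print(shannon_entropy(unknown_forecast))   # 1.0 bit
```

The more evenly spread the probabilities, the higher the entropy, and the more a message resolving them is worth: exactly Shannon's identification of information with the reduction of uncertainty.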
Shannon’s contributions, along with the invention of the transistor, laid the foundation for modern computing. However, for our purposes, his framework is too narrow. He acknowledged that a message is only meaningful in the correct **context**—one must understand the language of a message or possess the cryptographic key to decode it. Yet, Shannon deliberately excluded *meaning* from his theory, as it was irrelevant to his engineering concerns. His model primarily addressed communication between humans (or their machines), but reality is far broader than human chatter. Thus, we must expand his definitions accordingly.
Shannon studied information as **patterns within signals**, but information is far more pervasive. Patterns exist everywhere, whether or not they are transmitted through a signal. All structural arrangements of matter and energy possess statistical properties that form patterns—and all these patterns are information. A salt crystal, for instance, encodes information about atomic symmetry, electrical conductivity, and optical behaviour. The DNA molecule encodes information about its double-helix structure, nucleotide sequences, and transcription mechanisms for self-replication. A galaxy’s spiral arms tell the story of density waves, gravitational interactions, and cosmic evolution. Thus, information is not confined to human or machine messages. It permeates the universe. It is, at its core, **all patterns embedded in matter and energy**.
## Dimensions
These patterns form the basis of the information that is embedded in **all matter**. As we have seen on the [[Vibrant Matter|previous page]], at each level of structural emergence (e.g. sub-atomic, atomic, molecular), matter embeds additional layers of pattern, and therefore, information. This can be understood as information structured across **dimensions**. Higher dimensions of pattern have the capacity to store more information. A two-dimensional map, for example, cannot perfectly preserve all aspects of a three-dimensional surface, a difficulty known as the _map projection problem_. Any such map necessarily distorts some features—whether scale, angles, or distances—because only certain details can be retained when reducing dimensionality. The same applies to digital imaging: a 4K image encodes more information than a lower-resolution version, allowing for greater detail and accuracy.
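The imaging comparison is simple arithmetic. A minimal sketch (assuming uncompressed images at a conventional 24 bits per pixel; the figures are illustrative):

```python
# Illustrative arithmetic, not a measurement of any real image:
# the raw storage capacity of an uncompressed image grows with its
# pixel dimensions, so a higher-resolution version can hold more pattern.

def raw_capacity_bits(width, height, bits_per_pixel=24):
    """Upper bound on the information an uncompressed image can encode."""
    return width * height * bits_per_pixel

uhd = raw_capacity_bits(3840, 2160)   # 4K UHD
hd = raw_capacity_bits(1920, 1080)    # Full HD

print(uhd / hd)   # 4.0 -- four times the capacity
```

The 4K frame has four times the pixels, and thus four times the capacity to retain fine detail that a lower-resolution projection of the same scene must discard.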
The same principle applies to emergent levels of structural arrangements. Each new level retains the information of its constituent parts while adding a **new dimension** by arranging them into a higher-order structure. Consider an amino acid, a simple molecule containing about 10 atoms—its informational content is relatively small. But when hundreds of amino acids bond to form proteins, they take on intricate three-dimensional structures that vastly increase the information embedded within them. This additional complexity enables proteins to catalyse reactions as enzymes, provide structural support in cells, and serve as molecular machines.
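The growth in sequence information alone can be estimated with a back-of-the-envelope calculation (a sketch with assumed numbers: 20 standard amino acids, and a protein length of 300 residues chosen as a representative example):

```python
import math

# With an alphabet of 20 standard amino acids, specifying one sequence
# of length n out of 20**n possibilities requires n * log2(20) bits.
# The folded three-dimensional structure then adds further pattern on top.

def sequence_bits(n_residues, alphabet=20):
    """Bits needed to specify one sequence among alphabet**n possibilities."""
    return n_residues * math.log2(alphabet)

print(round(sequence_bits(1)))     # ~4 bits for a single amino acid
print(round(sequence_bits(300)))   # ~1297 bits for a modest protein
```

Even before accounting for folding, chaining hundreds of units multiplies the informational content by orders of magnitude, which is the point of the emergent-dimension argument above.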
To summarise, the laws of thermodynamics have sculpted a cosmos in which energy dissipation drives the **spontaneous emergence of structured order**. From the moment of the Big Bang, the dissipation of energy gradients has driven the spontaneous embedding of ever-greater **dimensions of information** in matter. This process has unfolded over the last 14 billion years, giving rise to levels of complexity that physicist **Bobby Azarian** describes thus:
> _Through a series of hierarchical emergences—a nested sequence of parts coming together to form ever-greater wholes—the universe is undergoing a grand and majestic self-organizing process, and at this moment in time, in this corner of the universe, we are the stars of the show. (...) Through the evolution and eventual outward expansion of self-aware beings like ourselves, and their efforts to organize matter into arrangements that support information processing and computation, the universe is, in a very real and literal sense, waking up._ [^2]
[[The Self and the Will to Power|Next page]]
___
[^1]: Shannon, C. E., & Weaver, W. (1998). The mathematical theory of communication. University of Illinois Press.
[^2]: Azarian, B. (2022). The romance of reality: How the universe organizes itself to create life, consciousness, and cosmic complexity. BenBella Books, Inc.