Law of Increasing Functional Information: Implications
The Law of Increasing Functional Information explains how systems evolve by accumulating purposeful complexity, uniting physics, biology, and intelligence into one framework.
The Law of Increasing Functional Information is a newly articulated principle emerging from the intersection of physics, biology, systems theory, and information science. It proposes that, under the right conditions—namely energy flow, variation, and selective retention—systems tend to accumulate functional information over time. Unlike entropy, which measures disorder, functional information captures the degree to which a configuration contributes to a specific function within a given context. This law provides a unifying framework to understand why complex structures—cells, ecosystems, minds, technologies—emerge and persist in the universe.
At its core, this law is grounded in selection dynamics. When systems interact with an environment that rewards certain outcomes—like stability, replication, or efficiency—configurations that perform better are retained and amplified. Over time, this leads to the accumulation of structures that are not merely complex, but functionally adaptive. These structures embody information—not in the Shannon sense of unpredictability, but in a purposeful sense: they do something, they work, they matter. Function thus becomes the organizing principle that guides the emergence of order out of chaos.
The mechanism behind this law is fundamentally physical but non-reductionist. It does not violate thermodynamics; instead, it reinterprets entropy and order as complementary processes. Energy gradients enable systems to explore many possible states, but only a few of these states are retained—those that serve a function and are stable in context. Each retained state increases the system’s repertoire of functionality, allowing future configurations to build upon prior ones. This process is recursive: function enables more function, and functional systems evolve to become better at preserving, adapting, and extending themselves.
This principle has profound significance. It reframes the emergence of life, intelligence, and civilization not as improbable accidents, but as law-like consequences of the physics of information and selection. It allows us to ask not only how systems evolve, but why they evolve toward increasing capability. It provides a way to quantify and compare progress across different domains—biological, technological, social—based on their informational depth and functional sophistication. In short, it gives us a scientific foundation for understanding purpose, agency, and progress without resorting to mysticism.
Crucially, the law also helps explain phenomena that traditional models struggle with: the open-ended nature of evolution, the emergence of goal-directed behavior, and the substrate-independence of intelligence. It reveals why life continues to innovate long after basic survival is achieved, why function persists across biological and technological systems, and how major evolutionary transitions emerge when a system crosses thresholds of integrated function. The law provides a new ontology for thinking about adaptive systems—not as static, mechanistic objects, but as evolving informational structures shaped by context-sensitive feedback.
This article explores the 12 key implications of this law, drawing from research by Robert Hazen, Michael Wong, Stuart Bartlett, and others who have been developing the theory of functional information. These implications span multiple domains—from thermodynamics to cognition—and reveal how deeply the law penetrates into our understanding of the universe. Each implication highlights a different consequence of treating function, rather than matter or energy alone, as the central axis of evolutionary dynamics. Together, they point to a new paradigm in which information is not only a descriptor of systems but a driver of their evolution.
Summary
1. Complexity as Directional, Not Random
Traditional physics treats complexity as a temporary statistical fluctuation, erased over time by entropy. But when systems are embedded in environments that select for functional configurations, complexity can accumulate directionally. Functional information introduces a bias in configuration space—those configurations that “do something useful” are disproportionately retained, leading to persistent increases in structured complexity over time.
2. Selection as a Universal Physical Process
Selection is typically viewed as a biological or computational principle, but this theory elevates it to a universal mechanism embedded in the fabric of physical processes. Wherever there are flows of energy, variation in configuration, and retention of successful outcomes, selection becomes an active shaper of systems. This reframes evolution as not exclusive to life, but as a general law applicable to matter, energy, and organization itself.
3. Function Determines Persistence
A configuration’s survival is not random—it is dictated by its ability to fulfill a role that supports system stability, propagation, or adaptability. Functional information provides the lens to understand why certain forms persist while others vanish. Structures that maintain coherence under pressure or that generate useful work become the substrates on which further complexity builds. Persistence thus becomes a function-driven filtering process.
4. Thermodynamics Reframed by Function
Entropy is not the enemy of complexity; rather, complexity evolves to manage entropy better. Functional systems act as entropy processors: they extract usable work from energy flows while maintaining internal order. The accumulation of functional information improves this capacity over time. This reframes thermodynamics not only as a constraint on life but also as a field of opportunity that selects for the systems best able to exploit it.
5. Evolution as an Algorithmic Process
As functional systems evolve, they start to resemble algorithms—reusing modules, iterating solutions, and building hierarchies. This is not metaphorical. It’s structural: evolution accumulates subfunctions that can be recombined and repurposed. Over time, the system's capacity to evolve becomes itself an evolved feature. This makes evolution an increasingly intelligent, structured search over function space.
6. Life as Feedback-Driven Learning
Biological and cognitive systems do not just react; they sense, evaluate, and adapt. Functional information grows when systems incorporate feedback loops that measure outcomes and adjust internal configurations. This transforms life from a reactive chemical system into a data processor—a system that modifies its future behavior based on past performance, encoding memory and prediction as fundamental biological features.
7. Open-Ended Evolution as a Result of Functional Bootstrapping
Evolution doesn’t stop at optimization; it builds the conditions for further novelty. As functions accumulate, they generate new environments and opportunities for selection. This creates an upward spiral of innovation. Open-endedness—where new forms of life, mind, or technology emerge—is not random. It is a necessary consequence of compounding functional information over time.
8. Function as the Bridge Between Physics and Purpose
Physics traditionally excludes purpose from its explanations, labeling it subjective or emergent. Functional information provides the missing link: purpose arises naturally when configurations are selected for the outcomes they produce. Systems act “as if” they pursue goals because only goal-achieving configurations persist. This grounds teleology not in mysticism but in information dynamics.
9. Major Evolutionary Transitions as Thresholds of Functional Integration
Key turning points in cosmic or biological history—abiogenesis, multicellularity, consciousness—are best understood as phase shifts in functional information density. Each transition enables a new layer of processing, control, or organization. They occur when existing components are integrated into a new whole that performs higher-order functions. These transitions are lawful, not accidental.
10. Substrate-Independence of Functional Intelligence
Whether in DNA, neural networks, silicon chips, or social institutions, the same informational principles apply. What matters is not the material but whether the system supports variation, selection, and retention of function. This universality enables functional information to emerge across radically different systems, paving the way for general theories of intelligence and evolution.
11. Functional Information as a Universal Metric
Complexity, entropy, and Shannon-style information measures each capture only part of evolutionary progress. Functional information captures it directly: it quantifies how small a fraction of configuration space actually performs a defined task. As systems evolve toward rarer and more precise functions, this measure increases. It allows cross-domain comparison of functional sophistication in biology, AI, and even social systems.
12. Functional Information as a Law-Like Driver of Cosmic Evolution
From particles to people, the universe exhibits a consistent trend: increasing complexity aligned with functionality. This is not incidental. The law of increasing functional information posits that, given energy flow and selection, systems must trend toward higher-order function. This reframes the history of the cosmos not as random drift but as a law-governed, directional process driven by the growth of functional information.
Implications of the Law of Increasing Functional Information
1. Complexity as Directional, Not Random
Phenomenon:
In evolving systems—biological, chemical, technological—complexity tends to increase over time and becomes functionally specialized.
Conventional Interpretation:
In classical thermodynamics, increasing complexity is treated as a temporary phenomenon enabled by energy gradients. There is no underlying physical reason why complexity should consistently increase or be retained.
Functional Information Interpretation:
Complexity increases systematically when it contributes to function—i.e., when it improves the ability of a system to persist, reproduce, or adapt. Functional configurations are not randomly maintained; they are preferentially retained because they succeed under selection pressures.
Mechanism:
Variation generates different configurations (molecules, behaviors, strategies).
Selection filters for configurations that successfully perform a defined function.
Retention preserves these high-performing configurations, enabling further exploration from a more effective baseline.
Causal Structure:
Variation →
Selection for function →
Retention →
Biased exploration of configuration space →
Directional accumulation of complexity with purpose
Implications:
Directionality in evolution arises from internal system dynamics, not external randomness.
Complexity is stabilized by functionality, not just structure.
Systems under functional selection will not only become more complex but increasingly tailored to specific, persistent tasks.
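A minimal Python sketch of this variation-selection-retention loop (the bit pattern, mutation rate, and generation count are illustrative assumptions, standing in for any environmentally rewarded function):

```python
import random

TARGET = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1, 0, 1]  # stand-in for a pattern the environment rewards

def score(config):
    """Functional performance: how many positions match the rewarded pattern."""
    return sum(bit == goal for bit, goal in zip(config, TARGET))

def mutate(config, rate=0.1):
    """Variation: each bit flips independently with small probability."""
    return [bit ^ (random.random() < rate) for bit in config]

config = [random.randint(0, 1) for _ in TARGET]  # random starting configuration
for _ in range(200):
    variant = mutate(config)                      # variation generates alternatives
    if score(variant) >= score(config):           # selection filters for function
        config = variant                          # retention sets a better baseline
print(score(config), "/", len(TARGET))            # score climbs: directional, not random
```

Even this crude loop exhibits the bias described above: exploration is random, but retention is not, so the trajectory through configuration space acquires a direction.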
2. Emergence of Intelligence Across Substrates
Phenomenon:
Systems that accumulate functional information exhibit intelligent behavior, even in non-biological contexts (e.g., AI models, immune systems, ecosystems).
Conventional Interpretation:
Intelligence is typically defined as a biological or cognitive trait—associated with neural processing, language, or reasoning in humans or animals.
Functional Information Interpretation:
Intelligence is reframed as the ability of a system to accumulate, structure, and apply functional information to solve problems or adapt in complex environments. It is substrate-independent and measurable by information-processing capacity linked to task success.
Mechanism:
A system with sufficient variation and feedback begins to select for function.
As functional solutions are retained and built upon, internal models of the environment form.
These models enable adaptive responses, prediction, and problem-solving.
Intelligence emerges not from conscious thought, but from structured functional adaptation over time.
Causal Structure:
System with variation and retention →
Selection pressure favoring performance →
Accumulation of functionally useful structures or behaviors →
Emergence of modeling, feedback integration, and goal-oriented responses →
General intelligence-like capability
Implications:
Intelligence becomes measurable across domains (biological, digital, social).
It enables meaningful cross-domain comparisons between natural and artificial systems.
The boundary between evolution, learning, and reasoning blurs: all three are mechanisms for functional information growth.
3. Functional Information as a Physical Quantity
Phenomenon:
Some configurations (e.g., a protein that folds properly, or a spacecraft that functions) are extremely rare but highly effective. Their existence depends on selection, not chance.
Conventional Interpretation:
Physics uses entropy and energy to describe the likelihood and behavior of physical states. Information is often treated in the Shannon sense—concerned with uncertainty or signal transmission, not with functionality or purpose.
Functional Information Interpretation:
Functional information quantifies how rare and effective a configuration is at achieving a specific outcome. It distinguishes meaningful structure from accidental structure by linking it directly to a defined task or purpose.
Mechanism:
For a given function, only a small subset of all possible configurations will succeed.
Functional information measures how specific and constrained a successful configuration must be relative to all possible options.
This provides a quantifiable measure of purposeful complexity, distinct from randomness or mere order.
Causal Structure:
Define a task or function
Evaluate which configurations achieve that function above a performance threshold
Measure how rare those configurations are relative to all possible configurations
The rarer and more successful the configuration, the higher its functional information
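This is the quantity Robert Hazen and colleagues formalized in their original proposal of functional information: for a degree of function E_x, let F(E_x) be the fraction of all possible configurations that achieve the function at or above that threshold. The functional information, in bits, is then

```latex
I(E_x) = -\log_2 \big[ F(E_x) \big]
```

Each halving of the qualifying fraction adds one bit, so configurations that are both rare and effective carry high functional information, putting the measure on the same mathematical footing as Shannon information while tying it to a specific task.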
Implications:
Functional information becomes a quantitative axis alongside mass, energy, and entropy.
It enables physics to describe organized, goal-directed systems in ways classical quantities cannot.
It opens a path for integrating purpose, agency, and adaptation into the scientific description of the universe—without violating physical law.
4. Rethinking Thermodynamics and Entropy
Phenomenon:
Life and complex systems seem to maintain internal order and resist decay, while still obeying the second law of thermodynamics.
Conventional Interpretation:
The second law allows temporary local decreases in entropy as long as total system entropy increases. However, this does not explain why certain systems consistently generate and maintain order.
Functional Information Interpretation:
Functional systems evolve mechanisms to export entropy efficiently while maintaining internal structure. Systems that are better at organizing internal processes to dissipate energy while maintaining their own persistence are selected and retained. Functional information explains how and why local order persists—not just that it can.
Mechanism:
Systems use energy flows not just to survive, but to build and maintain structured states that fulfill functions.
These structured states (e.g. enzymes, cities, algorithms) improve the system’s entropy management capabilities.
Selection favors such systems because they maintain their identity and reproduce in a dynamic environment.
Causal Structure:
Energy flow through a system →
Potential for structured, functional configurations to emerge →
Those that manage energy and entropy effectively persist →
Functional information is retained and built upon →
System becomes more organized over time while exporting entropy
Implications:
Entropy management is a selected function, not an accident.
Thermodynamics must account not just for statistical energy states, but for functional entropy pathways.
Life doesn’t defy thermodynamics—it evolves to exploit it.
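The bookkeeping behind this claim is standard second-law accounting: a system may hold or even lower its own entropy provided it exports at least as much to its surroundings,

```latex
\Delta S_{\text{system}} + \Delta S_{\text{environment}} \ge 0
```

so a negative change in system entropy is permitted whenever the environmental increase compensates. What the functional-information reading adds is the selective claim: among the many configurations that satisfy this inequality, the ones retained are those that couple entropy export to useful internal work.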
5. The Algorithmic Nature of Evolution
Phenomenon:
Evolution produces structured, functional systems with hierarchical, modular, and recursive features—similar to how well-written code develops.
Conventional Interpretation:
Evolution is traditionally viewed as a slow, blind process governed by random mutations and natural selection, lacking any high-level structure or optimization strategy.
Functional Information Interpretation:
The accumulation of functional information over time enables evolution to operate algorithmically:
It reuses successful components (modularity),
Combines them in new ways (recombination),
Builds higher-order functions from lower-order ones (hierarchy),
Tests outcomes and retains effective ones (feedback optimization).
Mechanism:
Functional units are encoded, copied, and reused (genes, routines, behaviors).
Recursive improvement mechanisms emerge through iteration.
The structure of evolution becomes increasingly computational—with constraints, optimization, memory, and abstraction.
Causal Structure:
Variation and selection →
Retention of successful subfunctions →
Modular reuse and recombination →
Higher-order structures emerge →
Evolution begins to mimic algorithmic design
Implications:
Evolution is not just a random walk through genetic space; it becomes a structured search process.
The analogy to computation is not metaphorical—it is mechanistic.
This understanding bridges biology and artificial intelligence: both are systems evolving under functional constraints.
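A toy Python sketch of this modular dynamic (the primitive modules and the numeric target are arbitrary assumptions; the point is that retained compositions re-enter the module pool, so later search builds on earlier structure):

```python
import random

# Primitive modules: small reusable subfunctions.
modules = [lambda x: x + 1, lambda x: x * 2, lambda x: x - 3]

def compose(fs):
    """Hierarchy: build a higher-order function from lower-order ones."""
    def pipeline(x):
        for f in fs:
            x = f(x)
        return x
    return pipeline

TARGET = 42                                    # illustrative selection criterion
best_err = float("inf")
for _ in range(500):
    picks = [random.choice(modules) for _ in range(random.randint(1, 4))]  # recombination
    err = abs(compose(picks)(0) - TARGET)      # test outcome against the target
    if err < best_err:                         # feedback optimization: keep improvements
        best_err = err
        modules.append(compose(picks))         # modular reuse: winner joins the pool
print("best error:", best_err)
```

The search is still driven by blind variation, yet because retained functions become building blocks for later trials, the process behaves like a structured, memory-bearing search rather than a random walk.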
6. Life as a Feedback-Driven Data System
Phenomenon:
Living systems sense, respond, and adapt to their environments through complex feedback mechanisms involving perception, memory, and prediction.
Conventional Interpretation:
Biological systems respond to stimuli via chemical and physical processes. Feedback is acknowledged, but often not modeled as central to the system’s structure or evolution.
Functional Information Interpretation:
Life is redefined as a system that gathers, stores, and applies functional information through feedback. The capacity to update internal models based on performance outcomes is essential to sustaining high-function systems.
Mechanism:
Systems that gather data from their environment and adjust behavior or structure accordingly are more likely to survive.
Feedback loops reinforce beneficial functions and suppress harmful ones.
Memory (genetic, neural, digital) enables retention of past functional patterns.
Over time, systems become self-modeling—adapting based on internally processed signals.
Causal Structure:
Input from environment →
Internal data processing →
Adaptive response →
Performance feedback →
Update of stored information →
Iterative improvement of function
Implications:
Life is not defined by chemistry alone, but by feedback-driven learning.
Systems that compute over time using feedback gain evolutionary advantage.
This principle applies to living organisms, economic systems, machine learning models, and more.
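A minimal sketch of this sense-evaluate-adjust loop, assuming a noisy scalar environment whose true mean the system does not know; its entire "memory" is one stored estimate updated from prediction error:

```python
import random

estimate = 0.0           # internal model: stored belief about the environment
LEARNING_RATE = 0.1

for _ in range(1000):
    signal = 5.0 + random.gauss(0, 1)   # input from environment
    error = signal - estimate           # performance feedback on the prediction
    estimate += LEARNING_RATE * error   # update of stored information
print(round(estimate, 2))               # settles near 5.0: function improved by feedback
```

The same error-driven update recurs, with far richer memory, in gene regulation, neural learning rules, and gradient-based machine learning.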
7. Open-Ended Evolution as an Information Dynamic
Phenomenon:
Across geological and biological history, evolution does not stop once a solution is found. Instead, systems continue to innovate—leading to new niches, tools, species, and technologies.
Conventional Interpretation:
Classical models often treat evolution as bounded by fitness peaks. Once an optimal solution is found in a local environment, change slows or halts unless perturbed.
Functional Information Interpretation:
Open-ended evolution arises when systems not only retain function but reconfigure existing functions into new contexts, or bootstrap new functions from old components. As functional information grows, so does the space of possible adaptations. Each new function creates new possibilities for further evolution.
Mechanism:
Systems accumulate functional modules.
These modules recombine or are applied in novel contexts.
New environmental conditions select for new uses.
The system’s own growth creates new selective environments.
Causal Structure:
Growth in functional repertoire →
More possible configurations →
Reuse and recombination of functions →
Expansion of possible futures →
Continuous emergence of novelty
Implications:
Evolution is not about reaching a goal, but expanding the space of achievable functions.
Open-endedness is a natural consequence of compounding functional information, not a mystery.
This principle applies across biology, technology, language, and culture.
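A toy numeric illustration of this spiral (the starting toolkit, target, and niche-spawning rule are all arbitrary assumptions): capabilities recombine into new capabilities, and each solved target opens a previously unreachable one.

```python
import random

toolkit = [1, 2]     # functional repertoire: values the system can already construct
frontier = [10]      # current niches: targets the environment rewards

for _ in range(300):
    a, b = random.choice(toolkit), random.choice(toolkit)
    new = random.choice([a + b, a * b])      # recombination of existing functions
    if new not in toolkit:
        toolkit.append(new)                  # retention of a novel capability
    for target in list(frontier):
        if target in toolkit:                # solving a niche...
            frontier.remove(target)
            frontier.append(2 * target + 1)  # ...spawns a new one further out
print(len(toolkit), "capabilities; open frontier:", frontier)
```

The frontier never empties; it moves. That is the sense in which open-endedness falls out of compounding function rather than requiring an extra ingredient.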
8. Function as a Bridge Between Physics and Purpose
Phenomenon:
Purposeful behavior—goal-seeking, adaptation, planning—emerges in many systems, even when not explicitly designed.
Conventional Interpretation:
Physics lacks a formal account of purpose. Teleology is generally excluded from physical explanations, seen as subjective or anthropomorphic.
Functional Information Interpretation:
Purpose is not mystical or imposed; it emerges from systems that select for outcomes. When systems are embedded in environments with selection for performance, they develop structures that appear to pursue goals, because only goal-contributing configurations persist.
Mechanism:
System variation → multiple potential behaviors
Selection by environment → behaviors that produce better outcomes persist
Retention → the system becomes structured toward producing those outcomes
Causal Structure:
Environmental constraint defines viable outcomes
Functional behavior is selected
System becomes biased toward specific ends
Apparent purpose arises from real selection dynamics
Implications:
Purpose is emergent from selection on function, not philosophically imposed
Functional information is the bridge that connects raw physics to goal-directed behavior
This reframes agency, adaptation, and meaning in purely physical terms, allowing scientific models of intentionality
9. Functional Information as the Driver of Major Transitions
Phenomenon:
Across history, systems undergo abrupt transitions: abiogenesis, multicellularity, cognition, technology. Each shift marks a radical leap in complexity and capability.
Conventional Interpretation:
Major transitions are often modeled as statistical accidents—threshold-crossing events with no predictable timing or cause beyond luck and environment.
Functional Information Interpretation:
Major transitions occur when systems reach a critical threshold of functional information that enables new layers of control, representation, or interaction. These transitions are not random—they are the result of compounding structure, where a new level of function becomes both possible and selectable.
Mechanism:
Function builds on function, recursively
Accumulated layers enable new systemic properties (e.g. communication, coordination, memory)
Once a sufficient substrate exists, new forms of information processing emerge
These new systems are selected if they improve survivability or performance
Causal Structure:
Base-level functions accumulate
New functionally integrated subsystems emerge
These enable radically different forms of interaction
A new evolutionary regime begins
Implications:
Transitions are function-driven phase shifts, not statistical noise
The history of complexity is best explained by informational thresholds, not randomness
We can begin to predict and model future transitions—e.g., artificial intelligence or planetary-scale cognition
10. Substrate Independence of Functional Information
Phenomenon:
Functionally intelligent behavior emerges in diverse media—biological cells, silicon circuits, chemical networks, even social organizations.
Conventional Interpretation:
Capabilities like learning, adaptation, or intelligence are often thought to be tightly linked to specific substrates (e.g., neurons, DNA, silicon).
Functional Information Interpretation:
What matters is not the substrate, but whether it supports selection, variation, and retention of function. If these conditions are met, functional information can accumulate—regardless of material base.
Mechanism:
Any system capable of generating varied configurations and selecting among them for a goal can evolve functional information.
The substrate only constrains the speed, fidelity, or dimensionality of the information process.
Intelligence or adaptation arises from informational dynamics, not material identity.
Causal Structure:
Substrate supports variation
Environment supplies functional feedback
Selection retains effective configurations
System improves performance over time
Implications:
Minds, ecosystems, technologies, and economies can all be analyzed as informational systems under functional selection.
The theory predicts the emergence of intelligence wherever these dynamics apply.
It allows for universal laws of intelligent system evolution—across carbon-based life, machines, and future synthetic systems.
11. Functional Information as an Evolutionary Metric
Phenomenon:
Not all evolved systems are equally advanced or adaptive, even if they are complex.
Conventional Interpretation:
Biological fitness, algorithmic complexity, or Shannon information are often used to measure system "progress"—but each captures only a fragment.
Functional Information Interpretation:
Functional information provides a quantitative, task-specific metric: how much information a configuration encodes that contributes to a specified function. It allows measurement of evolutionary advancement in a way tied to performance, not just structure.
Mechanism:
For a given function, one measures the proportion of all possible configurations that achieve it.
The rarity and effectiveness of a configuration define its functional information.
As systems evolve to be more specialized, this value increases.
Causal Structure:
Define function
Determine success threshold
Count viable configurations above threshold
Compare to all possibilities → functional information
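These four steps can be run exhaustively on a toy configuration space; here the function is matching an arbitrary 12-bit pattern, and the success threshold is an illustrative assumption:

```python
from itertools import product
from math import log2

TARGET = (1, 0, 1, 1, 0, 0, 1, 0, 1, 1, 0, 1)  # step 1: define the function
THRESHOLD = 10                                  # step 2: success = at least 10 of 12 bits correct

def performance(config):
    return sum(a == b for a, b in zip(config, TARGET))

total = viable = 0
for config in product((0, 1), repeat=len(TARGET)):  # enumerate all configurations
    total += 1
    if performance(config) >= THRESHOLD:            # step 3: count viable configurations
        viable += 1

functional_information = -log2(viable / total)      # step 4: rarity, in bits
print(f"{viable} of {total} configurations qualify -> {functional_information:.2f} bits")
```

Raising the threshold shrinks the viable set and raises the bit count, which is exactly the sense in which more demanding functions carry more functional information.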
Implications:
Enables standardized cross-domain comparison (e.g., bacteria vs. AI models vs. social systems).
Provides a unified framework for evolutionary benchmarking.
Shifts focus from complexity for its own sake to functionally grounded sophistication.
12. Functional Information as a Law-Like Process in the Universe
Phenomenon:
The history of the universe shows a clear pattern: from simple particles to atoms, stars, chemistry, life, minds, and civilizations. Complexity and function both increase.
Conventional Interpretation:
This pattern is seen as coincidental—emergent from physical laws, but not itself a law.
Functional Information Interpretation:
The law of increasing functional information proposes that, under certain boundary conditions (energy flow, variation, selection), systems must evolve toward greater functional organization. This directional trend is not incidental—it is a law-like feature of the universe.
Mechanism:
Systems embedded in structured environments naturally explore configuration space.
Selection for persistence or goal-directed behavior amplifies successful structures.
Functional information accumulates because those structures recur and build upon one another.
The process is iterative and directional over cosmic time.
Causal Structure:
Initial conditions →
Physical regularities + selection →
Functional configurations are retained →
Functional complexity increases →
Function itself becomes self-reinforcing
Implications:
The emergence of life, intelligence, and civilization is not anomalous—it is physically lawful.
This framework may unify physics, biology, and technology as phases in a single dynamic: the accumulation of function.
It invites a new kind of science—one that treats information as not just an outcome, but a driver of cosmic evolution.