29 November 2015
08:09 AM
Key
This is the article list for The Living Algorithm System, the second monograph in the series associated with the study of Behavioral Dynamics. On this page the Reader will find five columns. The first column (Table of Contents) is a list of the linked articles. The second column indicates whether the articles are finished () or are still a work in progress (WIP). The third column (Headings) provides a link to two levels of outline detail: if the Reader is interested in Section headings, click on 3; if interested in Paragraph headings, click on 4. The fourth column (Stage) indicates whether the articles have been edited (L/K) or not (-). The final column (Pages) indicates the length of each article.
Have you ever considered how we translate the impersonal digital information of 1s and 0s into personal knowledge that is relevant to our existence? For instance, how are we able to derive 'music' from our favorite CD? How is it that we dance wildly or cry uncontrollably when we hear a sequence of 1s and 0s that can't even touch each other? What is the translation process that bridges the infinite chasm between these two simple numbers?
I am excited to present a plausible theory that accounts for our personal connection to the impersonal digital sequences contained on our CDs, DVDs, computers and iPhones. The process that seems to enable this connection is contained in a mathematical system labeled Information Dynamics. The theory concerns how living systems digest digital information to transform it into a form that is meaningful to Life. While the information processing epitomized by a computer is of necessity static, exact and fixed, living information digestion is of necessity dynamic, approximate and transformational. Think of the difference between a baby and a computer.
Our initial monograph illustrated some of the many patterns of correspondence between the mathematical processes of Information
Dynamics and empirical reality. These include the harmful effects of Interruptions to the Creative Process, the negative impact of Sleep
Deprivation, the Necessity of Sleep, and even the Biology of Sleep.
These striking correspondences evoke some distinct questions. Why does the mathematical model behave in a fashion similar to experimentally verified behavioral and biological reality? Could these correspondences be a mere coincidence, some kind of odd artifact? Or perhaps the striking patterns are due to some as yet undiscovered molecular or subatomic mechanism? Or could these odd correlations between mathematical and living processes be due to the process by which living systems digest information?
We chose to explore the last theory. The first question we posed ourselves: what kind of information digestion process would a living system require? What are the entry-level requirements?
The following essay addresses three questions. Why do living data streams best characterize the dynamic nature of living systems? Why do data streams require a new mathematics? And what requirements must this data stream mathematics fulfill if it is also to be the mathematics of dynamic living systems?
…threshold is reached beyond which precision and significance (or relevance) become almost mutually exclusive characteristics. A corollary principle may be stated succinctly as, "The closer one looks at a real-world problem, the fuzzier becomes the solution." (Fuzzy Thinking, p. 148, 1993)
If it is true, as Dr. Zadeh argues, that real-world problems require fuzzy solutions, data stream mathematics may provide a method to explore this fuzziness.
The concept behind Zadeh's Principle of Incompatibility helps explain why the traditional laws of Probability find Life's Immediacy perplexing. Probability is far more comfortable dealing with what is familiar, his specialty. He feels most at ease with fixed, unchanging data sets where all members are functionally equivalent. Perhaps his desire for the comfortable, yet rigid, precision of conventional mathematics presents an insurmountable challenge to understanding the complexity of Life's immediate meaning. For Life to have her spontaneous immediacy appreciated, she may have to search elsewhere for a mathematical partner.
Perhaps understanding the immediacy of the moment requires a partner that relates better to a data stream. While this tradeoff sacrifices the comfortable predictability of the more traditional relationship, it offers her a freshness that comes from more accurately understanding the meaning of the moment(s), her most subtle nature. As we shall see, the suggestive predictors of Data Stream Mathematics, with their relative imprecision, are an ideal match for characterizing her meaning of the moment, Life's Immediacy.
The prior article developed the notion that living systems must extract meaning from ongoing data streams to survive. Because of the quantitative nature of data streams, we suggest that this meaning is mathematical in nature. This mathematical meaning has a few crucial features. The mathematics must address the immediacy of living systems as well as provide predictive descriptors for each moment. As of yet, traditional mathematical systems haven't been able to meet this challenge.
The esteemed Dr. Zadeh, the father of Fuzzy Logic, recognizes this deficiency and offers his own mathematical system as a solution. Let's see how successful this approach is. To set the context, our exploration begins by revisiting a previously cited quotation from Dr. Zadeh:
(Due to) the fundamental inadequacy of the conventional mathematics, the mathematics of precisely-defined points, functions, sets, probability measures, etc., for coping with the analysis of biological systems, we need a radically different kind of mathematics, the mathematics of fuzzy or cloudy quantities which are not describable in terms of probability distributions. (Dr. Bart Kosko, Fuzzy Thinking, p. 145, quoting from Dr. Zadeh's paper)
Inspired by this insight, Dr. Zadeh went on to formulate the concept of fuzzy sets. His followers turned this notion into the mathematics of
fuzzy logic. Engineers have successfully applied the insights of fuzzy logic to significant real-life problems, such as how to stop bullet
trains smoothly. Cognitive scientists have also employed the insights from fuzzy logic to simulate the neural networks of the brain. In
essence, fuzzy logic successfully introduces the both-and approach to data sets, a complement to the either-or approach of conventional
mathematics.
Zadeh's call for a radically different kind of mathematics implies the need for some new kind of arcane operations or esoteric measures to cope with biological systems: perhaps a biological string theory, or a calculus of living systems, or the quantum mechanics of behavior, or even fuzzy logic. However, with the exception of neural networks, fuzzy logic has not proved to be the new mathematics that is necessary for coping with the analysis of biological systems.
It may be that the quest to discover a radically different kind of mathematics to analyze biological systems should not be limited to the esoteric nature of high-level theoretical mathematics. We believe it would be useful to shift the mathematical focus to the inherent nature of the subject matter at hand: the data stream. Rather than pursue ever-more complex abstractions, we believe that the intelligent application of existing mathematical tools can yield powerful, practical insights into the nature of data streams. This intelligent application must focus on the immediate significance of moments in the data stream.
We have argued previously that the dynamic nature of living systems is best characterized by data streams. We went on to detail the
requirements of a mathematics of data streams that would address Life's immediacy. Hunting for a new set of mathematical abstractions
that would fulfill the necessary data stream requirements would be a difficult, if not hopeless, quest. Data Streams offer an extraordinary
challenge because of the inherently changeable nature of an open living system. Living systems can be extremely sensitive to every
interaction with an environment that includes not only the closed systems of inanimate matter, but also includes interactions with other open
animate systems. The complex web of interactions between living systems and the myriad data streams of existence suggests the arcane
approach of theoretical mathematics is highly impractical.
Rather than a higher and more complex level of abstraction, we are looking for pragmatic and useful tools that can help us think about
living data streams. Our search is for a practical mathematics. We appreciate the power of high-level abstractions. However, we believe
there is a place for the use of existing mathematical tools. The key is to apply these tools with a sensitivity to the unique nature of data
streams. The intelligent application of these tools can reveal insights into the nature of data streams, which are accessible to the informed
reader. These practical insights can provide a pragmatic balance to the rarefied language of theoretical mathematics.
…more heavily; 2) providing descriptive measures that relate data points to each other in a manner that is sensitive to pattern
recognition; and 3) providing suggestive predictors that serve a pragmatic anticipatory function. These are the requirements
that a successful candidate must fulfill to be considered for the position. If the requirements are not fulfilled, the position will be
left open.
We would like to recommend a candidate for this long-vacant, highly coveted and esteemed position. She's an excellent choice. She is a simple form of information processing. Her sole function is to digest data streams. Further, her method of information processing generates an 'animate' system, which fulfills the requirements of data stream mathematics. Her mathematics could be called the mathematics of the moment, in that she effectively addresses Life's Immediacy. This includes providing a suggestive interpretative mechanism that articulates pattern. The name of our candidate? You may have guessed it. Drum roll, please… The Living Algorithm's Info System. The following discussion provides evidence that supports our claim that this animate system, the Living Algorithm System, fits these demanding job criteria and should be considered for the position. If her qualifications interest you, read on.
These three descriptors simultaneously provide a prediction that amounts to rough approximations about the next data point: 1) the expected
position (the dot in the center), 2) the range of variation (the circle), and 3) the direction of momentum (the arrow). Accordingly, each of the
Living Algorithm's ongoing derivatives is a descriptor that contains a significant predictive feature. A simple combination of these
predictive averages creates a composite predictive cloud. We choose the term cloud to represent the approximation of the expected features of the next data point, which, in summary, include position, a range of probable values and recent tendencies of direction.
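The monograph does not restate the Living Algorithm's equations at this point, so the sketch below is an illustration only: it models the trio of ongoing descriptors as decaying averages, with an assumed update rule (new = old + (input - old) / D), an arbitrary Decay Factor of D = 10, and invented function names. It is not the author's published formula.

```python
# Illustrative sketch of a "Predictive Cloud": three ongoing measures
# updated as each new data point arrives. The decaying-average rule
# and the choice D = 10 are assumptions of this sketch.

def update(old, value, D):
    """Blend a new value into a running decaying average."""
    return old + (value - old) / D

def digest(stream, D=10.0):
    """Digest a data stream; return the trio of descriptors after the
    final point: expected position, range, and direction."""
    average = deviation = directional = 0.0
    for x in stream:
        velocity = x - average                          # departure from expectation
        directional = update(directional, velocity, D)  # direction (the arrow)
        deviation = update(deviation, abs(velocity), D) # range (the circle)
        average = update(average, x, D)                 # position (the dot)
    return average, deviation, directional

pos, rng, direc = digest([1.0] * 120)
```

Fed a steady stream of 120 ones, the sketch's expected position climbs toward 1 while the range and direction shrink toward zero: as the stream becomes predictable, the cloud tightens.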
The Living Algorithm generates a trio of ongoing descriptors in response to the ongoing flow of information in the data stream. These
descriptors create predictive clouds. These meaningful composite elements, these predictive clouds, may be the type of predictive tools that
Dr. Zadeh suggested would be necessary for coping with the analysis of biological systems. Dr. Zadeh argued that these predictive tools
would be, of necessity, 'fuzzy or cloudy quantities which are not describable in terms of probability distributions'. Dr. Zadeh pursues a
solution that applies new mathematical abstractions to what he calls fuzzy sets. In contrast, we pursue an approach that applies existing
mathematical tools to the notion of a living data stream.
These predictive statements inherit their cloudy nature from the constant state of evolution inherent in a data stream. The Living Algorithm
System digests data to provide a predictive cloud, whose shape shifts with each new entry. Each new data point represents change; and the
constant possibility of change requires an ongoing approximation of pattern that is central to the responsiveness of living systems. As with
Life, these predictive clouds are context sensitive, constantly evolving via the dynamic input from a living data stream. The urgency of
response typically required of living systems demands context sensitivity. These predictive clouds reflect the immediate nature of living
systems, as they move through time. As such, the ongoing and suggestive nature of the Living Algorithm's predictive cloud is ideal for
describing the changeable and immediate nature of living systems.
After accomplishing her last ordeal, the Living Algorithm has now fulfilled all of the previously stated job requirements. The Living
Algorithm is the ideal candidate for the position of representing the personal and dynamic nature of living systems. Her mastery of data
stream mathematics renders her approach a powerful simulation of an animate system.
Living Algorithm?
A special equation whose sole function is digesting data streams.
The Living Algorithm's digestive process provides the rates of change (derivatives) of any data stream. These metrics/measures contain
meaningful information that living systems could easily employ to fulfill potentials, such as survival.
Living Algorithm System?
A mathematical system based in the Living Algorithm's method of digesting data streams.
In the prior monograph, the Triple Pulse of Attention, we saw that the mathematical behavior of the Living Algorithm System exhibited patterns of correspondence with many aspects of human behavior. Specifically, the Living Algorithm's Triple Pulse paralleled many sleep-related phenomena. This intriguing synergy between math and scientific 'fact' led us to ask the Why question. Why does the linkage exist? Is there a conceptual model that could help explain this math/data synergy?
Life and the Living Algorithm are compatible in many ways. As such, the Living Algorithm is the ideal type of equation to model living
systems. Further, the Living Algorithm has many features that are useful to Life. Taking this line of reasoning a step further, we ask the
question: could it be that the Living Algorithm doesn't just model Life, but that living systems actually employ the Living
Algorithm's algorithm to digest sensory data streams? In other words, could the Living Algorithm be Life's computational tool?
Is there any evidence that Life employs the Living Algorithm to digest data streams?
The initial article in this monograph developed the notion that living systems require a Data Stream Mathematics that provides ongoing, up-to-date descriptions of a flow of environmental information. Life needs these descriptors to approximate the future. These approximations
enable living systems to make the necessary adjustments to maximize the chances of fulfilling potentials, including survival. This ability to
approximate the future applies to a wide range of behaviors, everything from the regulation of hormonal excretions to the ability to
capture prey or escape from predators.
The prior article, The Living Algorithm System, argued that the Living Algorithm's Predictive Cloud provides viable estimates about future
performance. As mentioned, Life requires a mathematical system that provides these future estimates. In this way, the Living Algorithm
System fulfills this particular requirement for a mathematics of living systems. The existence of these talents provides preliminary support
for the notion that the Living Algorithm could be the method by which living systems digest data streams.
If the Living Algorithm is really one of the ways in which living systems digest data streams, could the Living Algorithm have evolutionary
potentials as well? Why else would this computational ability be passed on from generation to generation? If it is indeed a computational tool
of living systems, the Living Algorithm should also provide an essential mathematical backdrop that is crucial for the evo-emergence of
many of Life's complex features.
The Living Algorithm's Predictive Cloud is the collection of derivatives (rates of change) that surround each data point - each moment. This
feature is of particular significance because the Predictive Cloud consists of predictive descriptors. In the prior article, we saw that these
predictive descriptors could be very useful to Life on a moment-to-moment level. Could these predictions concerning environmental
behavior confer an evolutionary advantage as well? Is it possible that knowledge of the Living Algorithm's Predictive Cloud could further
the chances of survival for the myriad biological forms? Does the Predictive Cloud provide an indication of the evolutionary potentials of the
Living Algorithm System?
…superior to that provided by sheer content alone (the raw data combined with memory).
The most basic of these features is the trio of central measures referred to as the Predictive Cloud. The Cloud's predictive power has many
uses. On the most basic level the Cloud provides information as to probable location, range and direction of the prey/predator. The Living
Average indicates the most probable location for the next piece of data; the Deviation, the range of variation; and the Directional, the
tendency and probable direction of the data stream. This trio of central measures provides incredible predictive power regarding the next data
point.
Could knowledge of the Predictive Cloud's metrics/measures regarding the ongoing flow of environmental data provide an essential
evolutionary advantage? Could an organism, whether cell or human being, make more accurate predictions about the future with an
understanding of these ongoing mathematical features of the myriad environmental data streams?
Let's explore some examples of the predictive power of the trio of measures that constitute the Predictive Cloud. An ongoing knowledge of
this trio of central measures would provide invaluable information to the prey in terms of probable range, direction, acceleration, and actual
location of a moving predator. Vice versa, these measures would provide invaluable information to the predator in terms of the probable
location of an escaping prey. (The dot in the diagram indicates the probable location, the circle: the range, and the arrow: the direction of the
next data point.)
An organism could make a more efficient and effective response to environmental input with a knowledge of probable outcome. For
example, the ability to better predict location, range and direction of motion would allow the predator/prey to capture/escape more
frequently. It seems safe to say that the better the organism's predictions are, the greater the chance of survival. This would apply to any
organism. In short, the knowledge of the ongoing contextual features of any data stream enables any organism to make conscious,
subconscious or hard-wired choices that further the chances of survival.
The knowledge of probable outcome supplied by the Predictive Cloud also enables the organism to conserve energy. Instead of wasting
energy in the unguided attempt to procure food or sexual partners, the organism would only expend valuable energy when the Predictive
Cloud indicates a greater chance of success. Of course, energy conservation is a key evolutionary talent.
In addition to physical capabilities such as size and strength, it seems evident that the predator/prey evolutionary arms race would have to
include the computational ability to make probabilistic predictions about the future. Further, the refinement of this essentially mathematical
skill has no end. While strength and size have limits imposed by physical requirements, neural development is virtually unlimited, as
witnessed by these words. The continuous refinement of this computing advantage, whether through experience, evolution, or emergence,
would enable the organism to maximize the impact of the response, while minimizing energy expenditure, an essential evolutionary
ability. Of course this refinement of computational abilities could apply to the Living Algorithm's multitude of potentials.
As mentioned, an ongoing knowledge of the Living Algorithm's Predictive Cloud, the aforementioned trio of measures, could be employed
as an invaluable predictive tool. On more complex levels, these same measures could easily supply the essential computational backdrop for
the development of our emotions. Let us offer some cursory remarks in this regard. The Cloud provides information that could enable an
organism to anticipate and prepare for the future. Anticipation morphs into expectation.
In brief, the Cloud's measures of data stream change are emotionally charged because they determine expectations concerning the future. The investment of emotion into information, whether memories or data, has an evolutionary purpose: to reinforce memory.
This emotional content renders the information easier to remember. This is not mere speculation: cognitive studies have shown that memory and emotion are linked. Information's meaning is invested with emotion because it is relevant to our existence; by contrast, a random set of numbers, lacking such relevance, is difficult to remember.
It is clearly evident that the Living Algorithm's Predictive Cloud could provide an evolutionary advantage to living systems. The
accuracy of future estimates is increased via the application of a simple and replicable algorithm. The Cloud's estimates of future
performance could be employed to predict environmental behavior. Better predictions increase the efficiency and effectiveness of our energy
usage. This conservation of energy certainly provides an evolutionary advantage. Further, the Cloud's trio of predictors could easily generate
the expectations that are the base of many emotions. Emotions have an evolutionary purpose, as they are associated with heightened
retention and recall associated with memory.
This discussion suggests that it is in Life's best evolutionary interests to have knowledge of the three ongoing and up-to-date measures that the
Living Algorithm provides. But to have access to this predictive power, Life must employ the Living Algorithm to digest data streams.
The sense of time passing is important to living systems for multiple reasons. A primary reason is that the flow of digested sensory
information only makes sense over time. Time duration is required to derive meaning from individual sensations. Isolated sensory input
makes no sense by itself. For instance, an isolated sound without temporal context is neither music nor a word. Even a picture takes time to
digest, no matter how brief. A sustained image over time is required to identify objects. The sense of smell, supposedly the first sense,
requires a duration of some kind to differentiate the potentially random noise of a single scent from the organized meaning of a sustained
fragrance.
A sense of time is required to experience the meaning of a signal. If an organism existed in the state of sheer immediacy, it would
automatically respond to environmental stimuli tit-for-tat just as matter does. But to make any kind of sense out of a sensory
message, the organism requires an elemental sense of the passage of time. The organism must be able to pay attention to the
sensory translation for a sufficient length of time to determine if the message indicates food, foe, or sexual partner. Otherwise the raw sensory information is just random garble. It is evident that the ability to sense the passage of time is essential if we are to experience the information behind sensory input.
Further, when an organism must make choices based upon sensory input to maximize the chances of survival, a sense of time is required to
even begin comparing alternatives. It seems that a sense of time is not just an evolutionary talent, but a requisite talent for all living systems.
For an organism to both have a sense of time and make educated guesses about the future, it seems that living systems must have emerged
with some sort of computational talent. This computational talent could be employed to digest the sensory data streams that are in turn
derived from the continuous flow of environmental information. The Living Algorithm's method of digesting data streams provides both
future estimates and a sense of time.
If the ability to digest information and transform it into a meaningful form is indeed an essential characteristic feature of living systems,
could the Living Algorithm and Life have emerged from the primordial slime together?
The Living Algorithm's digestion process generates a sense of time by merging and relating the present moment with past moments.
How is this blending of past and present accomplished?
The impact of each data byte decays over time. This process is illustrated in the graph at right. Each color swatch represents the impact of an individual data byte as it decays over time. The x-axis represents 180 repetitions of the Living Algorithm's mathematical process. Notice how each moment includes many colors. This indicates the impact of prior and current data bytes upon the current moment.
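Under a decaying-average reading of this process (an assumption of this sketch, since the equations are not restated here), the fading impact has a simple closed form: a byte received k steps ago carries weight (1/D)·(1 − 1/D)^k, so every past moment still colors the present, though ever more faintly.

```python
# Sketch: the fading impact of past data bytes on the current moment,
# assuming the decaying-average update new = old + (x - old) / D.
# A byte seen k steps ago then carries weight (1/D) * (1 - 1/D)**k.

def impact_weights(D, steps):
    """Weight of each past data byte on the current value, newest first."""
    return [(1 / D) * (1 - 1 / D) ** k for k in range(steps)]

weights = impact_weights(D=10, steps=180)
# The newest byte weighs 1/D; after 180 repetitions a byte's impact has
# all but vanished, and an endless past would sum to a weight of exactly 1.
```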
How is decay incorporated into the mathematical process?
The Living Algorithm's Decay Factor supplies this function. Let's see how.
The senses digest continuous environmental input to transform it into digital form. However, this digital form has no meaning. Each byte of information is isolated from the rest. At this point in the digestion process, the Decay Factor = 1, which means that there is no decay. With no decay there is no relationship between the data points. With no relationship, there is no sense of time. Without a duration of time the sensory output, the translated environmental information, makes no sense. In summary, when the Decay Factor is one (D=1), there is no time, hence no meaning.
To provide a sense of time, hence meaning, the sensory data streams require another level of digestion. The senses transform environmental
information into sensory data streams. The Living Algorithm's digestion process relates the isolated points in the sensory data stream to
create a sense of time. This relating process automatically occurs when the Decay Factor is greater than 1 (D>1). The Living Algorithm's
digestion process generates a sense of time passing, which simultaneously imparts the potential for meaning.
To aid retention, let us summarize this important process. Our senses digest continuous environmental input, transforming it into sensory data streams. The isolated instants of these sensory data streams don't have any inherent meaning, because they are not related to each other in any way, as there is no decay (D=1). The Living Algorithm digests sensory data streams. This digestion process relates the isolated instants by introducing decay (D>1), which generates a sense of time. A sense of time is an essence of meaning.
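Read as a decaying average (again an assumed update rule, not the monograph's own equations), the two regimes take only a few lines to show: with D = 1 each update discards the past entirely, while any D > 1 blends past and present.

```python
# Sketch: with Decay Factor D = 1 the update old + (x - old)/D reduces
# to x itself -- no memory, each instant isolated. With D > 1 the
# current value still carries traces of every earlier data point.

def digest(stream, D):
    value = 0.0
    for x in stream:
        value = value + (x - value) / D
    return value

stream = [5.0, 0.0, 0.0]
no_memory = digest(stream, D=1)   # only the last point survives -> 0.0
blended   = digest(stream, D=2)   # the opening 5.0 still echoes -> 0.625
```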
This analysis suggests that it is at least a plausible proposition that the Living Algorithm's digestion process could create the sense of time
that is required for meaning. Because living systems must derive meaning from data streams, this lends further support for the notion that
Life employs the Living Algorithm to digest data streams.
As an aside, material systems do not derive meaning from data streams. As such, material systems only deal with information that is inert. A
data stream's information is inert when the Living Algorithm's Decay Factor is one (D = 1). Conversely, a data stream's information is
dynamic when the Decay Factor is greater than 1 (D>1). Living systems require dynamic information because it yields meaning. As such,
this significant difference between material and living systems is inherent to the Living Algorithm's method of digesting data streams.
The ability to differentiate a random from an organized signal is due to a simple mathematical fact. The random data streams associated with
background noise possess an innate and stable velocity, but no acceleration. Conversely, an organized data stream (a string of relatively
stable values consistent with ordered environmental input) has a distinct acceleration.
The graph at right exhibits this distinct difference between a random and an organized data stream. The big red curve represents the acceleration of an ordered data stream, the classic Pulse of Attention (120 ones). The erratic green curve represents the acceleration of a random stream of zeros and ones. It is immediately apparent that the acceleration of the organized data stream overshadows (rising three times higher) the acceleration of a random data stream.
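The graph itself is not reproduced here, but the contrast can be sketched numerically. The sketch below is an assumption in every detail: it uses a smoothed directional (a decaying average of each point's departure from the running average) as a stand-in for the acceleration measure, D = 10, and a strictly alternating 0/1 stream as a deterministic stand-in for the random stream.

```python
# Sketch: why an organized stream stands out from an erratic one.
# "Directional" here is a decaying average of each point's departure
# from the running average; rule, D = 10, and names are assumptions.

def peak_directional(stream, D=10.0):
    """Largest magnitude reached by the smoothed directional measure."""
    average = directional = 0.0
    peak = 0.0
    for x in stream:
        directional += ((x - average) - directional) / D
        average += (x - average) / D
        peak = max(peak, abs(directional))
    return peak

organized = [1.0] * 120                       # the classic Pulse input
erratic = [float(i % 2) for i in range(120)]  # rapid 0/1 alternation

# The organized stream's directional pulse towers over the erratic one's.
```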
Why is identifying random streams a significant talent? After the senses translate continuous environmental information into sensory data
streams, it is essential to first pare out superfluous data streams from consideration. Differentiating random from organized signals is the
initial step in the process. This filtering process prevents information overload. The ability to identify and ignore random data streams
eliminates an abundance of environmental information from consideration. Minimizing the number of data streams under consideration
maximizes the speed and efficiency of response, which of course conserves energy.
The focus upon data stream acceleration as a way of filtering out random signals has other advantages as well. Paying attention to data
stream acceleration enables frogs to conserve their energy by shooting their tongues only at bugs, rather than plants. Insects move more erratically, and hence with more data stream acceleration, than plants. On more complex levels, focusing upon data stream acceleration allows complex
life forms to determine changes in their environment. Perceiving environmental changes, whether auditory, visual, olfactory or tactile, is
essential for any organism that must detect an approaching predator, prey, or sexual encounter. Organisms with this sense must somehow
have the ability to perform calculations that determine probable quantities that differentiate the random noise of the background environment
from the significant accelerations of predator and prey. The simple Living Algorithm supplies the ability to perform these calculations
relatively effortlessly.
It seems that the Living Algorithm-derived random data stream filter could be employed to diminish the amount of incoming data, hence
prevent information overload. This same filter could also be employed to identify environmental changes. Both of these computational
talents provide an evolutionary advantage.
…Living Algorithm System has the streamlined operations that Evolution prefers.
The Living Algorithm System provides predictive capabilities. Further, the Living Algorithm's method of merging/relating the past and the
present generates a sense of the passage of time. Living systems require a sense of time to derive meaning from the sensory data streams.
Finally the Living Algorithm computes data stream acceleration. Knowledge of data stream acceleration could enable an organism to
differentiate random from meaningful signals and identify changes in the environment. Each of these talents provides an evolutionary
advantage.
The Mathematics of Living Systems Page 14
Besides providing these computational talents that are crucial for survival, the Living Algorithm satisfies evolution's simplicity
requirement. Evolutionary processes select for efficiency and simplicity in order to streamline operations and avoid the corruption of
complexity. The principle of conservation dictates that Life requires a data digestion system whose features are as economical as possible.
This simplicity minimizes breakdown, data corruption, and computational and memory requirements. The Living Algorithm's algorithm is
simple; its computations are basic; and its memory requirements are minimal.
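To make the simplicity claim concrete, here is a minimal sketch of a decaying-average update in the spirit of the Living Algorithm. The decay constant of 10 is an assumption for illustration; the point is that each step costs one subtraction, one division, and one addition, and only the previous average must be remembered, no matter how long the stream grows.

```python
def living_average(prev_avg, new_point, decay=10):
    """One digestion step: blend the newest point into the running average.

    Memory cost is a single number, regardless of the stream's length.
    """
    return prev_avg + (new_point - prev_avg) / decay

# Digest a short data stream one point at a time.
avg = 0.0
for x in [1, 0, 1, 1, 0]:
    avg = living_average(avg, x)
```

Because each new point moves the average only a fraction of the way toward itself, recent points automatically carry the most weight while older points fade.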
Preliminary Comparisons: the Living Algorithm vs. Probability, Physics & Electronics
We've seen many ways in which the Living Algorithm's Information Digestion System could provide essential evolutionary talents to living
systems. How do other methods compare? Let us offer some preliminary remarks in this regard.
As seen, the Living Algorithm satisfies the simplicity requirement for living systems in terms of computation and memory. In contrast,
Probability has prohibitive computational and memory requirements. No economy whatsoever. This dooms Probability as a computational
tool for living systems. To provide predictive descriptors, Probability requires many complicated formulas and operations, not to mention
the necessity of a huge and precise database. Further, Probability only makes predictions about the general features of the set, not the
ongoing moments. This method also weights past and present data points equally. While Probability's descriptors provide estimates of the
future, these estimates are based upon what was, rather than what is happening now. Besides being more economical in terms of operations
and memory requirements, the Living Algorithm System provides more up-to-date predictions about the future than does Probability.
Electronics provides another system that is in the running for the position as Life's computational tool. Electronic data processing is different
in many fundamental ways from the type of data digestion provided by the Living Algorithm. The function of electronic data processing is to
transmit environmental or internal input as accurately as possible through noise reduction. Shannon, the father of information theory, studied
this type of processing regarding the clarity of electronic transmissions, such as radio, television, computers, and spacecraft. Accuracy is of
utmost importance in electronic transmissions, as Internet users well know. In this case, the internal codes that are required to ensure
accuracy only predict the most usual forms that the message could take as a type of redundancy testing. This type of data processing
requires standards of expectations to establish redundancy patterns. However, this method does not provide any predictive
abilities: no estimates concerning future performance and certainly no room to move. Further, electronic information
processing does not relate the data. Without a relation between the past and the present, there is no sense of time
passing. Hence, electronic information has no meaning. Electronics imparts accuracy; humans impart meaning. And the Living
Algorithm provides the type of information digestion to impart that meaning.
Physics provides yet another alternative for determining living behavior. Given the initial conditions, say the Big Bang, and the appropriate
equations, Physicists can precisely predict the behavior of material systems. If living systems have no ability to adjust to external
circumstances, and instead respond automatically to environmental stimuli, then Physics is still in the running for the position as Life's
computational tool. If, however, living systems have the ability to make adjustments that facilitate survival, then they need a mechanism that
will provide predictive powers and a sense of time. If this is the true state of things, then Physics is out of the running, as all operations are
automatic. We will deal with these topics in more depth in the article on Informed Choice.
Summary
The Living Algorithm is an ideal evolutionary tool due to its predictive capacity, minimal memory requirements and ease of use. The Living
Algorithm's Predictive Cloud easily characterizes the moment and provides pragmatic estimates about the immediate future. Articulating the
relationship between moments reveals patterns. Moment-to-moment updating discloses changes in these same patterns. Both are crucial
abilities for any organism. The Living Algorithm provides both of these functions. If living systems employ the Living Algorithm to digest
environmental data streams, it seems reasonable to assume that this would impart a huge evolutionary advantage.
Besides this predictive capacity, the Living Algorithm's relating process, whereby past information is related to current information, provides
living systems with a sense of time. A sense of time is a requisite talent for interpreting the sensory input from the environment. Without the
ability to experience this sensory information over time, the environmental input becomes meaningless noise. Without access to meaningful
information, an organism cannot respond effectively to environmental stimuli and perishes. Besides separating living matter from inert
matter, a sense of time provides a serious evolutionary advantage.
Finally, the Living Algorithm's information digestion process also generates the acceleration of any data stream (one of the features of the
Predictive Cloud). Data stream acceleration could easily provide the computations that enable an organism to differentiate a random from an
organized signal. This talent diminishes the possibility of information overload. Eliminating superfluous information from consideration
certainly provides an evolutionary advantage. As a significant side benefit, knowledge of data stream acceleration also enables an organism
to identify environmental change. Identifying change could also signify the need for an appropriate response, another significant
evolutionary advantage. It seems evident that if Life employed the Living Algorithm's digestion process, it could provide a multitude of
evolutionary advantages.
To further illustrate the pragmatic nature of the Living Algorithm's Predictive Cloud, the next article in the stream explores a concrete
example from the sport of baseball: the batting average. In the process we will see how Probability & Living Algorithm
are Complementary Systems.
In the preceding article, we showed how the Living Algorithm System is the ideal mathematics to deal with Life's data streams. The Living
Algorithm is sensitive to the moment and weights each data point in the stream according to its relation to the present. Further, the Living
Algorithm's predictive cloud also describes the trajectory of the moment's recent trends. This up-to-date information about the moment
provides estimates about the nature of future moments.
To illustrate these concepts, let's explore a concrete example. A baseball player's actions during a game can be characterized by any
number of data streams. One of the most basic measures of a player's performance at the plate is the batting average. The batting average is
determined by one of these data streams. The rules that generate this data stream are simple. If he gets a hit, he generates a one; and if he
doesn't get a hit, he generates a zero. There are several kinds of at bats that are excluded from the data stream that determines the batting
average (e.g. walks, batters hit by pitches, and sacrifices).
Probability looks at this flow of information as an ever-growing set and computes an average (the mean), which is appropriately called the
batting average. This statistic has had a significant role to play in the evaluation of the success of any professional hitter. Raw numbers such
as hits, home runs and RBIs are also significant measures of success, but the batting average has been a traditional indicator that complements
these raw numbers. Baseball players' salaries and fame are based, in part, upon these batting averages.
These batting averages, which describe a player's performance, can also serve a predictive function. The knowledge of a player's
batting average is likely to shape the strategy of opposing coaches and pitchers. Owners and general managers also use the batting average to
predict how well the player will do in the following year(s). Bonuses, salaries, and long-term contracts are also likely to be influenced by a
player's batting average.
It is easy to see from this example how descriptive measures of past performance, such as the batting average, are used to predict future
performance. It is equally obvious that these predictive descriptors only provide very rough approximations of future performance. Even
though a baseball player might have a batting average of .333, this does not in any way guarantee that he will continue to bat .333 for the rest
of his season, contract, or career. This obvious lack of guarantee reminds us that the batting average is only a rough approximation of future
performance.
The rough approximation of the future provided by a batting average is valued by those who have a stake in predicting future events. Large
salaries are given because of these rough approximations; huge bets are placed on them; and strategies are formed on these predictive
descriptors called batting averages. It is evident that, despite their relative imprecision, these guesstimates are extremely meaningful to the
world at large.
Let's see what happens when the Living Algorithm processes this data stream, not as an extended fixed set, but as an ongoing stream.
Living Algorithm digests the Baseball Player's Data Stream of 'at bats'
This analysis might suggest that the Living Algorithm is nothing more than a subset of Probability. In the ensuing discussion, we hope to
illustrate that the Living Algorithm is a unique approach to data analysis. Rather than being a subset, the Living Algorithm appears to be a
valuable complementary approach. The Living Algorithm digests the exact same data stream: the baseball player's 'at bats'. From this
data stream, the Living Algorithm generates a predictive cloud, consisting of the previously mentioned trio of descriptors. This
predictive cloud describes the context of each moment in the player's career. Instead of characterizing the entire stream of 'at
bats' as an enlarged fixed set, the Living Algorithm characterizes the changing pattern that results from a constant focus on the
most recent at bats. While Probability weights each data point (each 'at bat') equally, the Living Algorithm assigns the greatest
weight to the most recent 'at bat' (data point), and scales the rest of the 'at bats' in descending order from the present.
Accordingly, the Living Algorithm's predictive cloud is context sensitive, adjusting to recent 'at bats' and providing predictive
information about the next 'at bat'.
Because of this context sensitivity, the Living Algorithm's predictive cloud provides up-to-date, relevant information as to the character of the
next 'at bat'. The trio provides information about the hitter's current state of affairs: the position, range of variation, and direction of the
momentum of the batter's hitting data stream. This information could lead to the following scenario. The hitter's batting average
for the year is .333 (Probability's mean average). Complementing this knowledge are the insights provided by the Living
Algorithm: his current weighted average over his most recent at bats is .375, with a range of .20 and a positive direction of .10.
The Living Algorithm's predictive cloud indicates that the batter is 'hot' right now. His recent weighted average of .375
exceeds his overall batting average of .333. In addition, he has been very consistent in recent at bats, as indicated by the tight
range of variation (.20). Furthermore, his recent batting data stream has a positive momentum (+.10). These descriptors of the
current state of affairs provide rough approximations of the immediate future. This information can be exceedingly relevant to the
opposing pitcher and his coaching staff.
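As a rough sketch of how such a trio might be computed from the raw 0/1 stream of at bats: the decay constant and update rules below are illustrative assumptions made for this example, not formulas taken from the monograph.

```python
def predictive_cloud(at_bats, decay=10):
    """Digest a 0/1 at-bat stream into a trio of ongoing descriptors:
    a weighted (decaying) average, a range of variation, and a direction.
    """
    avg = dev = direction = 0.0
    for hit in at_bats:
        new_avg = avg + (hit - avg) / decay
        # Relate the point to the *preceding* average, then decay the result.
        dev = dev + (abs(hit - avg) - dev) / decay
        direction = new_avg - avg            # momentum of the average
        avg = new_avg
    return avg, dev, direction

# Probability's season-long mean vs the Living Algorithm's recent-weighted trio.
stream = [0, 1, 0, 0, 1, 1, 0, 1, 1, 1]      # a hypothetical hot streak
season_avg = sum(stream) / len(stream)
weighted_avg, variation, momentum = predictive_cloud(stream)
```

Because this hypothetical stream ends in a run of hits, the weighted average sits above where a cold finish would leave it and the direction comes out positive: the 'hot hitter' situation described above.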
In contrast, a hitter might have the same batting average for the year of .333, but the predictive cloud could indicate his weighted average
is .285, with a range of .80 and a negative direction of .20. This indicates that the batter is currently 'cold'. His weighted average of .285 is
less than his overall average of .333. His performance is erratic, as indicated by the large range of variation (.80). Furthermore, his current
hitting momentum is negative (-.20). In this case, the Living Algorithm's predictive cloud would provide data that could lead to a
very different set of strategies for the opposing pitcher and his coaching staff.
In both scenarios, the batting average generated by Probability remains the same. Yet, these two different hypothetical moments in the
hitter's data stream suggest that there are two very different patterns at work. The Living Algorithm reveals these diverse patterns by
providing unique information about the data stream of a hitter that is both timely and context sensitive. Could it be that what is relevant to the
data stream of a baseball player may also be relevant to the data streams of other living systems?
The use of Probability's mean average, the famous batting average, is certainly a better way to characterize the player's entire season than the
use of the Living Algorithm. Probability's general averages are adequate, providing a fairly accurate summation of annual talent. This
information is essential when determining annual awards (MVP) and the next year's rewards (salaries). However, it is equally certain that the
batting average for the entire season does not provide up-to-date information as to the hitter's status at the current time. For those who have a
stake in the current game, these general averages merely provide a diluted reflection of the player's present status. In contrast, the Living
Algorithm's predictive clouds provide up-to-date information that is extremely relevant to the manager, the pitcher, and even the betting
community. On the other hand, this up-to-date information, while relevant to the next game, loses its potency when applied to the entire year.
This example from America's game beautifully illustrates how these two approaches to data analysis complement each other. Probability's
measures accurately characterize the fixed data set of the year, while the Living Algorithm's measures accurately characterize each baseball
moment by analyzing the dynamic data stream of at bats.
The Living Algorithm provides a trio of measures to mine this untapped potential information. This evolving trio consists of the following
ongoing descriptors: 1) a batting average, 2) a range of variation, and 3) a description of recent trends (momentum). These evolving measures
weight the data stream of at-bats on a sliding scale according to their proximity to the most recent data point. Further, the Living
Algorithm's simple algorithm (procedure) is more user-friendly than the unwieldy use of a 10-day average. Rather than relying
on a database that consists of all relevant at-bats, the Living Algorithm requires only the memory of the evolving measures that
characterize the most recent player performance.
There might be some who feel that the Living Algorithm is but a subset of Probability. On first glance, the two approaches to data analysis
seem exceedingly similar. Both Probability and the Living Algorithm employ averages and deviations to characterize data. Due to these
similarities, it would seem that both would follow similar technical guidelines and have a similar purpose. As such, one might consider
Living Algorithm mathematics to be just a branch of Probability. But, as we shall see, instead of being Probability's subject, the Living
Algorithm rules her own realm. While the two have similar tools, they have mutually exclusive, yet complementary, domains. Each has a
unique purpose and field of action.
Each system analyzes data. Probability, however, processes Data Sets, while the Living Algorithm digests Data Streams. Data Sets are fixed
in size, and Data Streams are continually growing. More importantly, Probability is limited to providing information about the general
features of the entire data set, while the Living Algorithm can only provide information about individual moments in a data stream. In short,
the mathematical perspectives have unique fields of action. In fact, each is incapable of perceiving data from the other's perspective. These
differences are crucial to how each form of mathematics manifests its abilities.
Although there are some striking similarities between the two, each has measures that are unique to its system. For instance, the Living
Algorithm includes the Directional as a member of her Family of Measures. The Directional determines the tendencies of the data stream and
gives birth to the Liminals. Further, the ideal Triple Pulse depicts the Directional of a specific and significant data stream. Accordingly, the
Directional is central to Information Dynamics. This measure is unique to data streams. Probability has neither a Directional nor Liminals in
his bag of tricks. Because Probability's data sets are static, they have no direction.
Similarly, Probability includes many measures and forms of analysis, which are perfect for analyzing the features of static data sets, but are
inaccessible to Living Algorithm mathematics. In general, Probability's measures reveal the general state of the Universe, while the Living
Algorithm's Measures reveal the individual nature of its Flux. Consequently, their fields of action and the questions they inspire are entirely
different. The Living Algorithm & Probability belong to orthogonal universes, which intersect in Human Behavior.
To illustrate these concepts, let's look at a few examples. Probability can take a computational snapshot of a data set and apply the insights
that are derived backward and forward in time to every data set that shares the same characteristics. This is why his conclusions are so
powerful regarding matter. These insights apply to every piece of matter that has ever existed because matter sets are uniform. For instance,
water molecules have always been and will always be the same.
A fundamental reason that Probability has such a difficult time describing human behavior has to do with evolution, both social and
biological. Due to this evolutionary nature, the data sets regarding human behavior are frequently not uniform with respect to time. People are
continually changing from birth to death, and human culture is continually evolving. Consequently, the insights derived from one data set are
harder to apply to other data sets of humans. For instance, care must be taken when comparing young people with old people, Africans with
Europeans, modern women with Stone Age women, or college students with the rest of humanity. Many reputable studies of
human behavior, which have been acceptable in all other regards, have been fatally flawed due to inappropriately applying the
results from one data set to another data set of seemingly similar nature. In contrast, since it is universally assumed that
electrons in all times and places have always been identical, it is appropriate to apply the results from one data set of electrons
to another.
The Living Algorithm can take a computational snapshot of a moment in the dynamic changing scene of the data stream and also come up
with some definite answers (the Living Algorithm's Predictive Cloud). But these definitive answers are immediately eroded by the incoming
data, the constantly changing external landscape. Accordingly, a snapshot of a data stream only applies to that moment. The snapshot reveals
the ongoing relationship between data points, not the permanent nature of the set.
This difference between the two systems is due to the way they process data. For example, each approach determines its respective measure
for range of variation in a similar, yet distinctly different, fashion. Both employ the same formula, but with one crucial difference. To
compute the Standard Deviation, Probability relates the individual points to the general mean average of the set. On the other hand, to
compute the Deviation, the Living Algorithm relates individual data points to the Living Average of the preceding moment.
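The contrast can be made concrete in a few lines. The Standard Deviation below is the ordinary population formula; the Deviation update is a sketch under the assumptions used in this discussion (a decaying average of each point's distance from the preceding Living Average), since the exact formula is not restated here.

```python
import math

def standard_deviation(data):
    """Probability: relate every point to the one overall mean of the fixed set."""
    mean = sum(data) / len(data)
    return math.sqrt(sum((x - mean) ** 2 for x in data) / len(data))

def living_deviation(stream, decay=10):
    """Living Algorithm (sketch): relate each point to the Living Average
    of the *preceding* moment, decaying the result as the stream flows."""
    avg = dev = 0.0
    for x in stream:
        dev = dev + (abs(x - avg) - dev) / decay   # distance from preceding average
        avg = avg + (x - avg) / decay              # then update the average
    return dev

data = [0, 1, 0, 1, 1, 0, 1, 0]
```

The first function must see the whole set before it can compute anything; the second updates point by point and never looks back.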
This discussion has clarified a few issues. Although the Living Algorithm and Probability have similar measures (averages and deviations),
each has a unique field of action. Probability's field of action is static data sets. He computes universal features of these sets with no attention
to the individual points, except as to how they contribute to the whole. The Living Algorithm's field of action is individual moments in a
dynamic data stream. She computes specific features of each moment with no regard for the set as a whole. Accordingly, their fields of action
are complementary, each with a unique perspective. Neither is a subset of the other. Probability specializes in determining the general
features of fixed data sets, while the Living Algorithm specializes in determining the changing features of individual moments in a data
stream. As well as her other monikers, the Living Algorithm System could also be deemed the Mathematics of the Moment.
It is evident that the Living Algorithm and Probability are complementary systems. However, the use of probabilistic measures is widespread
in the scientific community, while the Living Algorithm is not employed at all. Does this indicate that the Living Algorithm System has no
scientific validity? The Living Algorithm certainly provides interesting information about the moment. But does this information have any
scientific utility? To explore the issues behind this question, read the next article in the stream, General Patterns vs. Individual Measures.
The previous article illustrated how the Living Algorithm specializes in characterizing individual moments in a data stream, and how
Probability specializes in characterizing universal features of fixed data sets. We've also seen how one of Probability's measures, the batting
average, is employed in a predictive fashion. This average provides an estimate concerning future performance that is highly valued by the
baseball community. Note that neither Probability's nor the Living Algorithm's estimates about a player's performance are guaranteed. The
batter could fall into or break out of a slump at any moment. The measures provided by either system only indicate a probable performance,
not one that is predetermined. Further, it is impossible even to put a probable range onto the accuracy of either prediction.
Why can't the scientific community apply its rigorous tools to the batting average? And if the batting average has no scientific value, what
scientific significance does the Living Algorithm's Predictive Cloud have?
Probability can apply his averages and standard deviations to the growing data set of a player's 'at bats'. However, Probability cannot apply
his more sophisticated statistical tools that measure the parameters necessary to determine predictive accuracy, the precision of the
estimate. Specifically, he cannot apply the standard error of the mean that is necessary to set confidence levels. This is a fatal
flaw in any scientific study based in statistical analysis. The careful application of these tools is essential for publication in
scientific journals. In short, a baseball player's batting average is too individual and transient to have any scientific value.
To crystallize these ideas, let's look at an example from our political world. The more thorough polls will survey 2000 random people from
the rolls of American voters to make estimates about the election outcome. Pollsters then make statements like: 'If the election were held right
now, 45% of the population is likely to vote for Obama, with a range of 5% in either direction.' This is the practical application of
Probability's measures. An average (45%) that characterizes the set is highlighted along with the range (5%). The range indicates the
confidence limits of the estimate of voter preference.
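The range a pollster reports comes from the standard error of a sampled proportion. A minimal sketch (the z-value of 1.96 corresponds to a 95% confidence level; the sample sizes are illustrative):

```python
import math

def margin_of_error(p, n, z=1.96):
    """Half-width of the confidence interval for a proportion p
    estimated from n random respondents."""
    return z * math.sqrt(p * (1 - p) / n)

# The margin shrinks as the sample grows: larger polls give tighter ranges.
wide = margin_of_error(0.45, 100)
narrow = margin_of_error(0.45, 2000)
```

For 2000 respondents at 45%, pure sampling error works out to roughly two percentage points; a wider quoted range, such as the 5% above, would presumably reflect additional sources of uncertainty beyond sampling alone.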
Although Probability provides precise measures concerning the general features of the data sets of these opinion polls, the results only apply
to that moment in history. The qualifying statement, 'If the election were held right now,' indicates how tenuous the predictions are. Due to
the volatility of political conditions underlying voter preferences, the opinions of the voter set are continually subject to change. Accordingly,
the intentions of a set of voters at one moment in time can only be loosely compared with the intentions of the set of future voters. Because
of the transient and individual nature of these polls, there have been many notorious examples of voter polls predicting the success of one
candidate when the other wins. Due to this lack of predictive accuracy, opinion polls are like the weather report; we pay attention but don't
place too much stock in their predictions, due to the constant potential for an abrupt change in conditions. Because it is impossible to
generalize the certainty about that data set to any other equivalent data sets, opinion polls have no scientific value.
Science requires a certain level of certitude combined with the ability to generalize analysis to similar circumstances. Each baseball player's
performance is so individual and transitory that it is impossible to generalize the analysis from one player's set to another, or even to the
player's future performance, with any certainty. This doesn't take away from the pragmatic predictive value of the batting average. It just
means that it is impossible to achieve scientific certitude. In contrast, Probability can generalize results from identical machines. For
instance, a company could track the performance of hundreds of identical cars (the same make and model) and employ Probability's talents to
make well-defined predictions about future performance. Similarly, scientists can examine the effects of an identical drug on a significant
number of humans in a certain stage of a disease and make some sound scientific estimates about the future performance of the drug.
However, each player is so unique, the pitchers he is facing are so different, and the psychological pressure of the big game so variable, that
it is impossible to find identical humans and circumstances with which to compare statistics.
Although Probability can accurately characterize a player's data set, he can't apply this analysis to the player's future with any certitude. This
is due to the transitory nature of a ball player's life as he moves through time and circumstance. A ball player's performance lies in an
unknown future. Aging, accidents, and illness are three common features of life that can have an unpredictable and abrupt effect upon the
living data stream of a baseball player's at bats.
For instance, nobody would ever attempt to equate the hitting data set of 20-year-old home run hitter Barry Bonds with his hitting data set
when he is 40 years old. This is due to life's inevitable aging process. This means that the data set of his performance must be continually
redefined. Because of this necessary redefinition, the sets are different and hence aren't comparable scientifically. Further, no one would ever
attempt to equate 20-year-old Barry Bonds' hitting data set with another 20-year-old's hitting data set, because of inevitable individual
differences. In short, the hitting data set of 20-year-old Barry Bonds can only be equated with itself. Barry Bonds ages, and no one is quite like
him. The data stream characterizes the change inherent to living systems. The individual and transient nature of a human's data stream
renders the analytic tools of Probability's fixed-set mathematics incapable of establishing the certitude that Science requires.
Living Algorithm's Patterns are scientifically significant, not the individual measures.
The same analysis applies to the Living Algorithm's predictive cloud. Data streams are so individual and transient that it is impossible to
achieve the certitude that Science requires. However, the Living Algorithm's predictive clouds supply an abundance of practical information
when applied to living data streams, as evidenced in our batting average example. It is a plausible assumption that living systems employ
this pragmatic tool, the predictive cloud, for assessing environmental patterns to determine the most appropriate response to ensure
survival. If Life employs the predictive clouds, then Life is also subject to the Living Algorithm's information patterns. In Triple Pulse
Studies, the first notebook in this series, we examined many examples of how Life has employed the Triple Pulse, one of the Living
Algorithm's information patterns, to organize human behavior regarding sleep. Accordingly, the scientific value of the Living Algorithm
System lies in its ability to reveal the underlying information patterns that motivate behavior.
However, the Living Algorithm System doesn't have the tools to establish the scientific certitude of these connections. Probability's
analytical talents are required to verify, or at least establish the limits on, the correspondences between human behavior and the Living
Algorithm's information patterns. Once again, it seems as if Probability and the Living Algorithm represent complementary systems.
As complementary systems, Probability provides a mathematical analysis of the general nature of fixed data sets, while the Living
Algorithm provides a mathematical analysis of the individual moments in dynamic data streams. Further, due to the fixed and general nature
of Probability's analysis of sets, the results can also be generalized with a distinct measure of scientific certitude. In contrast, due to the
dynamic and individual nature of the Living Algorithm's analysis of moments, the measures that determine the trajectories of individual
moments cannot be generalized. Hence the individual measures generated by the Living Algorithm, while possessing great pragmatic value,
have no scientific value.
While the individual measures of the Living Algorithm have no scientific value, the Living Algorithm's method of digesting information
reveals patterns that seem to influence living behavior (Triple Pulse Studies). What is the nature of these patterns? In what manner do they
differ from the patterns that Probability reveals?
To understand these differences, the next article is a historical investigation of Probability's rise to the top of the subatomic world. Ironically, the story of how Probability became famous as ruler of the subatomic world illustrates both his inherent strengths and weaknesses. Further, it pertains to why the Living Algorithm's dynamic nature is ideally suited to determining causality, while Probability's static nature is more suited to description. As with other aspects of these respective systems, their talents are mutually exclusive. Read the next article in the stream, Description vs. Causality; Static vs. Dynamics, to see how Probability was able to patch up the gaps in the subatomic universe that were left by classical Mechanics. Also see how Probability sets the stage for the Living Algorithm's entry onto the scientific stage.
Or perhaps you've tired of this endless exposition. For a fresh allegorical perspective, reenter our alternate universe and read Probability's Numbers vs. Living Algorithm Patterns.
At the end of the 19th century, the continuous equations of Mechanics (traditional Physics) reigned supreme. Many believed that Mechanics had uncovered all the universal laws of matter. It was recommended that young men pursue a different line of research because this field was exhausted. These continuous equations, which delineated the universal laws of Newton's mechanics, could accurately account for the behavior of matter down to the atomic level. The subatomic realm, which was to turn everything upside down, had yet to be revealed.
Due to their explanatory power, it was thought that these continuous equations accurately reflected the nature of Matter, the Universal Substance. Some of the more enthusiastic claimed, and some still claim, that this power also extends to Life, the form of matter that is alive.
The implications of these continuous equations are basic. Space and Time are continuous and distinct. Cause and Effect is an unbroken, instantaneous, automatic affair. Once the appropriate starting point (initial conditions) has been established, the appropriate equation, which had already been derived, could accurately determine all future moments with great precision. In other words, the Universal Fabric had no tears or discontinuities. Everything moved as if it were some giant clock. It was a comforting, feel-good perspective on the nature of reality. All behavior conforms to precise laws. No paradox or ambiguity. Nothing unknown. Everything has a scientific explanation.
At this point in history (the late 1800s), Probability was just a supporting actor. Just recently admitted to the exclusive Science Circle, his credentials were a bit dubious. Probability was still associated with gambling and its unpredictable laws of chance. Employing Gaussian distributions (the normal curve), Probability could accurately predict long-term patterns. This was his scientific utility. However, he could not determine what would happen in the next throw of the dice. Because of his inability to make firm predictions, Mechanics looked down on Probability as a messy or wishy-washy mathematics.
As long as Mechanics was describing planetary position or the trajectories of cannon balls, he was on solid ground. The situation changed as ordinary-sized objects turned into collections of atoms and molecules. The computations were too overwhelming for the simplified version of reality supplied by Mechanics. He had to rely on Probability to perform the computations regarding the atomic universe. This was a natural progression for Probability, whose specialty is characterizing the features of fixed data sets. He had dealt effectively with data sets of identical dice or coins. Probability could apply these same talents to identical atoms and molecules. Thermodynamics was the first to employ Probability's talents extensively to precisely predict the behavior of hundreds of millions of atoms and molecules. Probability became the computational tool of choice when dealing with material systems containing quadrillions of equally weighted elements. He is able to make exceptionally precise, practically exact, predictions about the behavior of gases. For instance, Probability's measures enable scientists to make miraculously accurate statements about the behavior of oxygen molecules when they are heated.
Probabilistic uncertainty, of course, flew in the face of the deterministic worldview of Mechanics, where everything could be predicted. Yet the contradiction was easily resolved. If the positions and trajectories (the initial conditions) of all the atoms and molecules could be accurately mapped out, Mechanics could determine the next throw of the atomic dice. Probability's talents had just been employed as a form of approximation, a computational necessity due to the sheer number of atomic particles in the process. Probability was not yet a philosophical necessity. In theory it was certainly possible to calculate the behavior of the atomic particles with the laws and principles of Newtonian dynamics. Probability was only necessary to make the calculations possible, not for theoretical purposes. He was just a supporting actor, a mere computational tool with no other significance.
"Employing Probability is merely a practical convenience," Mechanics confidently asserted. "This reliance on Probability in no way impacts our view of an orderly, continuous universe where everything is predetermined. My system describes the natural order perfectly, so my equations still reign supreme. There is no place in my system for Probability's uncertainty." At this stage in history, scientists could still envision a world that consisted of distinct particles and waves automatically and continuously interacting with each other. Then came the electron.
After the war, a primary focus of young Physicists was to resolve this heresy against Mechanics in a traditional manner. Everyone, including Bohr, was convinced that electrons were microscopic particles moving through a continuous space and giving off light waves. There must be a novel perspective, as yet hidden, that will resolve the paradox of quantized space. The supposed solution came from a surprising direction. To resolve the inconsistencies with this perspective, Schrödinger derived his famous equation that defined electrons instead as waves. This solution was radical, but still hadn't challenged traditional notions of a continuous space and a single truth.
While his equation fit the increasingly precise data concerning electrons, hypothetical extensions of Schrödinger's equation led to some strange and impossible results. For instance, when certain conditions were introduced into the equation, atoms expanded to the size of the Pentagon. To resolve this seemingly paradoxical situation, Max Born successfully applied probability theory to the problem. His solution indicated that the motion and position of the electron could be more accurately characterized as a probability wave. Improbable as the solution seemed, Born's probability insight resolved the mathematical difficulties introduced by Schrödinger's equation and fit the hard data. But this resolution implies that the world outside the atom is essentially different from the world inside the atom: the one continuous and certain, the other discontinuous and ambiguous, a Quantized and Probabilistic Universe.
Born's solution, while it fit the facts, didn't resolve the question of why some experimental results suggested that the electron was a wave and others suggested that it was a particle. Exploring the mathematical inferences of quantum theory, Heisenberg, Bohr's student, derived his famous Uncertainty Principle. An electron's static position or dynamic movement can be measured precisely, but not both. This suggested that an electron was either a wave or a particle depending upon the mode of observation. A simplistic version of this philosophical earthquake maintains that subjectivity is a factor in observation due to inescapable mathematical constraints.
The insights of Born and Heisenberg moved Probability to center stage in the quest to understand the essential nature of the Subatomic Universe. Richard Feynman's insights into Quantum Electrodynamics sealed Probability's position as Ruler of Subatomic Particles. Feynman's insight was even more counter-intuitive. His formulas allowed scientists to calculate the behavior of subatomic matter to unbelievable levels of precision. However, the basis for these equations was that subatomic particles moved in all possible directions simultaneously, including forward and backward in time. His computations revealed which of these directions wasn't canceled out by contrary motion. Scientists now had to take all possible directions into account to make this probabilistic computation.
Mechanics provides both the explanations of causation and the computational power when objects are large enough to be seen with the eye, such as planets and balls. Mechanics still provides the causal mechanisms when the objects are invisible to the naked eye. This includes both atomic and subatomic particles. Probability is required to provide the computational power for the uncountable numbers of microscopic particles. In the subatomic realm of electrons and photons, Mechanics requires Probability to fill in the gaps in his theory. As such, Probability completes the explanatory picture.
Despite the inherent philosophical uncertainty that this solution introduced, Physicists could now accurately predict the behavior of pure matter from the level of the electron and proton all the way up to the galaxies and everything in between. Probability was used to perform the computations for the data sets of eternally identical atomic particles. When the matter became big enough, Mechanics (classical Physics) with his continuous equations took over. Physicists proudly claimed that they could predict the behavior of matter on a continuum from the microscopic to the macroscopic. Probability and Mechanics were now wedded forever as modern Physics. The abilities of both were required to describe the behavior of pure matter. However, when pure matter was polluted with life, this merger proved helpless. For instance, Physics has a hard time predicting where a chicken will land when tossed into the air.
Intoxicated by the explanatory and predictive powers of this merger of classical Physics and Probability with regard to pure matter, its followers began claiming that they were on the verge of predicting the behavior of everything. A classic logical chain of scientific reductionism goes as follows: "We can predict the behavior of the subatomic world, which provides the building blocks of the atomic world, which provides the building blocks of the material world, which provides the building blocks of the Universe. Therefore we can predict the behavior of the Universe. We just have a few details to work out."
This line of reasoning is confuted by the notion that the field of action determines precedence, not the building-block/fundamental-principle mentality. For instance, while Physics informs us about all the incredible details of resonance, it tells us virtually nothing about the music of Bach's Brandenburg Concertos. In a similar fashion, the laws of Material Science provide Biology with some inescapable constraints. While these forms supply the essential structure that enables the development of the complexity required of living systems, they do not determine meaning, the Music of Life, any more than the laws of resonance reveal anything about the meaning of music. The field of action determines the nature of the explanation. While underlying structure enables essential complexity, it does not determine meaning.
All the subtle concepts introduced by subatomic particles, the ambiguity and paradox, were swept under the rug to keep them out of sight. The scientific community didn't want this uncertainty to taint the triumphant union of Mechanics & Probability. This union is an explanatory and computational tool that could describe the behavior of matter almost completely, the operative word being almost. The arrogance of certainty emerged almost immediately after the hubbub died down. After all, Physics could confidently claim that he could totally explain and compute everything that really matters, especially if all that matters is matter. Living matter is another story.
static system, can never hope to provide. While probabilistic descriptions provide boundaries, they can't possibly reveal underlying meaning. This was one of the insights of Bohr's complementarity principle. It was possible to know either process or content, not both. This is another reason that the two are complementary systems. The Living Algorithm reveals the patterns of the data stream's process, while Probability reveals the content of the data set.
Before Probability was required to plug the subatomic holes, continuous equations reigned supreme both scientifically and philosophically. With Probability's ascent, the importance of continuous equations as a philosophy waned. They became a great model rather than a definitive feature of the Universal Substance. This dethronement of continuity opened the door for the Living Algorithm's system of dynamics, which is digital.
Although the Living Algorithm system has very little in common with the material world from the atom on up, the Living Algorithm's patterns have much in common with subatomic wavicles. Schrödinger's equation transforms the electron from a particle to a wave. Born's mathematical resolution transforms the material wave into a probability wave. Bohr's interpretation implies that probability is information more than material. Hence the electron and photon become information waves, packets or pulses of information. The Living Algorithm's basic manifestation is as a pulse of information, the Creative Pulse, a.k.a. the Pulse of Attention. In fact, the eyes sense the individual photon as one of the Living Algorithm's fundamental information pulses.
This analysis suggests that the triad of Mechanics, Probability and the Living Algorithm forms a comprehensive interlocking system that incorporates both the static and dynamic nature of matter and life. Each mathematical system is necessary to explain and analyze different parts of the puzzle. There would be a conspicuous gap if any of the systems were excluded. The similarities and specialties are shown in the following diagram.
The bowed triangle in the center is where the three sets intersect as equation-based systems. The systems of Mechanics and Probability intersect as studies of the material world. Mechanics and the Living Algorithm intersect as systems of dynamics. Probability and the Living Algorithm intersect as types of measures. Each of the mathematical systems is unique in its own way. The Living Algorithm is digital; the equations of Mechanics are continuous, and Probability's are static.
Mathematics of Relationship
In prior articles we provided evidence to support the following claims: 1) Probability best characterizes the permanent and general features of fixed data sets. 2) The Living Algorithm best characterizes the changeable and individual moments of living data streams. As such, we have chosen to call the Living Algorithm math the Mathematics of the Moment. Because of the way the Living Algorithm digests data streams, she also puts these moments in relationship to each other. Accordingly, what happens at each instant in time has an effect upon subsequent developments. Let's see how.
A. An Instantaneous Data Bit becomes a Moment
The Data Stream Mathematics of the Living Algorithm System specializes in relationships. This focus is due to the way in which the Living Algorithm digests information. She takes raw data (which we will call instants) and spreads them over time (which we will call moments). Graph A illustrates what happens when the Living Algorithm digests the number one (raw data/instant) and transforms it into a moment. The instant is the point when the data enters the System. The impact is greatest when the data (instant) first enters the System. The effect of the impact on the System decays with each repetition of the process (iteration). This is why the measure of this diminishing impact is named the Living Average. In this case, it took about 60 repetitions (iterations) until the original impact faded to 'practical' zero. It could be said that the Living Algorithm transforms a one-dimensional entity into a two-dimensional entity. More simply put, the Living Algorithm gives Data a meaningful dimension by spreading its influence over time. As we shall see, raw Data is not in an accessible form for the organism until it undergoes this transformation. Because moments are accessible, they provide the organism with meaning.
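The decay pictured in Graph A can be sketched in a few lines of code. The sketch below is an illustration, not the monograph's own notation: it assumes the three-step update described later (find the difference, scale it down, add it back) and a scaling factor of 1/16, a value inferred from the figures quoted in this chapter (a maximum impact of about .06), so the actual parameters may differ.

```python
def living_average(stream, scale=1/16):
    """Digest a stream one instant at a time: nudge the running
    average by a fixed fraction of each new point's difference
    from it, so an instant's impact decays on every iteration."""
    avg, history = 0.0, []
    for x in stream:
        avg += scale * (x - avg)  # difference, scaled down, added back
        history.append(avg)
    return history

# Graph A's data stream: a single 'one' followed by a string of zeros.
trace = living_average([1] + [0] * 119)
print(round(trace[0], 4))   # peak impact of the lone instant: 0.0625
print(round(trace[59], 4))  # after ~60 iterations: 'practical' zero
```

With this assumed scaling factor, the lone instant's impact peaks at 0.0625 (about .06) and then decays on every iteration, fading to a 'practical' zero in roughly 60 repetitions, matching the behavior described above.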
B. The Accumulation of Moments
With each repetition of the process, a new data byte (instant) is digested in a similar fashion. The Data enters the System as a one-dimensional entity, in this case a one. The Living Algorithm then spreads the instant's impact proportionately over time. This process transforms instants into moments. With each subsequent iteration, each moment is layered on top of what went before, i.e. the diminishing (decaying), yet influential, effects of the preceding moments (Graph B). Notice that the blue area in Graph A is an enlargement of the small blue sliver at the bottom left in Graph B. All of the colors in Graph B represent the layering and accumulation of moments.
Graph A was produced by a data stream consisting of a single 'one', followed by a string of 'zeros'. A data stream consisting of 120 ones followed by 120 zeros produced Graph B. When there is just one data bit acting alone, its impact (the moment) on the system is quite small: its maximum is only .06. However, when there is a series of moments, as in Graph B (specifically 120 of them), the total collective impact on the system is much greater, eventually rising to 1.0. The collective force of the stream of ones has an impact on the overall System that is ultimately 16 times greater than a single one acting alone. In short, there is a greater impact upon the System when the Moments operate together, moving in the same direction.
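The accumulation in Graph B can be reproduced numerically under the same assumptions as before (a simple scaled-difference update with an illustrative 1/16 ratio, inferred from the .06 and 1.0 figures quoted here). The collective impact does climb toward 1.0, roughly 16 times the peak of a single one acting alone.

```python
def living_average(stream, scale=1/16):
    """Running Living Average: each instant nudges the average by a
    fixed fraction of its difference from the current average."""
    avg, history = 0.0, []
    for x in stream:
        avg += scale * (x - avg)
        history.append(avg)
    return history

# Graph B's data stream: 120 ones followed by 120 zeros.
trace = living_average([1] * 120 + [0] * 120)

peak = max(trace)                                 # collective impact of the 120 ones
single_peak = living_average([1] + [0] * 119)[0]  # one instant acting alone
print(round(peak, 3))               # climbs to 1.0
print(round(peak / single_peak, 1)) # about 16 times a single instant
```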
This type of interaction is additive (accumulative), which is certainly a valid form of interaction. Yet there is another aspect of the Living Algorithm's method of digesting information that integrates the Data in an even more complex fashion. The above analysis only applies to the Living Average, the simplest measure generated by the Living Algorithm. When more complex measures are introduced, the resulting computational stew is an intriguing combination of interactions between data and measures. Without getting into mathematical details, Chart C below indicates the complex weave between the Data and the Living Algorithm's Measures that is required to produce the value of each new Measure. The columns of X's without hats (the 2nd, 4th and 6th columns)
represent the contributions of the Data, while the columns of X's with hats represent the contributions of the Living Algorithm Measures.
C. Interlocking Interactions between Data & Living Algorithm Measures
We've seen two ways in which the Living Algorithm's method of digesting data streams creates a relationship between the moments. There is one final way, perhaps the most significant for this discussion, in which the Living Algorithm produces an Interactive System. All the triangles in Diagram C above indicate change. As such, all the interactions in the Living Algorithm System are based in change, the differences between a complex mixture of Data and Measures. Accordingly, no Moment exists as an independent entity in the dynamic System. Each Moment only exists in relation to the surrounding Measures and Data Points. Independent Existence is an Illusion. The only Reality is constant Change, at least in the Living Algorithm System.
In Probability's System, the Reality is fixed and never-changing. This is why definitive predictions are possible. Permanence can be characterized by definitive patterns. The event horizon can be narrowed to a single point, just one alternative, based upon initial conditions and an equation. The mathematical functions (equations) of hard science epitomize this claim. In contrast, because the Living Algorithm System is founded in dynamic change, only suggestive predictions are possible. Evolving transience is best characterized by suggestive patterns. The event horizon can only be narrowed, not eliminated. Identifying these suggestive patterns is the Living Algorithm's specialty.
In summary, the Living Algorithm's method of digesting Data Streams produces an Interactive System. This occurs in three significant ways. 1) The impact of individual data points accumulates to have a greater general impact upon the System. 2) The Measures and Data interact in a complex, interlocking fashion to produce derivative Measures. 3) The entire Living Algorithm System is based upon analyzing differences between Measures and Data. As such, each moment exists not as an individual entity, but in a dynamic relationship to the preceding moments. Accordingly, it seems fair to say that the Data Stream Mathematics of the Living Algorithm System could also be called the Mathematics of Relationships.
To see what other features the Living Algorithm has in common with living systems, check out the next article in the stream, Precision vs. Fungible Meaning.
As we've seen, the Living Algorithm System and Living Systems share many features in common. This article explores yet another similarity between the two systems. Both have a fungible component. Let us explore the meaning of the word fungible as it relates to these two systems.
Fungible is a legal term with the following definition: "(of goods contracted for without an individual item being specified) able to replace or be replaced by another identical item; materially interchangeable: Money is fungible; money that is raised for one purpose can easily be used for another." A can of beans is also fungible in this legal sense because the exact number of beans has not been specified. This characteristic of any can of beans must be ignored in a court of law. This implies that there is an acceptable range of imprecision when considering a can of beans. The specific number of beans can have a wide range of values, as long as the can has the same general net weight as claimed. In this sense the word fungible allows for an acceptable range of imprecision in the application of law.
Drs. Jack Cohen and Ian Stewart, in their book The Collapse of Chaos, stretch the meaning of fungible to describe a unique aspect of living systems having to do with the flexibility of interpretation. They point out that every culture, primitive or advanced, employs certain general words to categorize birds of the same species. This is true even though the birds have an abundance of individual characteristics that separate one from another. The word chicken can be applied to an entire group of birds because individual characteristics have not been specified. A chicken can be large or small, young or old, black, speckled or patterned, and still be referred to as a chicken. To identify a meaningful pattern, for instance the group chickens, it is necessary to overlook the individual characteristics of each bird.
Cohen and Stewart argue that living systems enlist this sense of fungibility to recognize meaningful patterns in their environment. The term
fungible suggests that a certain level of ambiguity is acceptable on the individual level if we are to make meaningful statements about the
whole. This is the sense in which we will use the word fungible.
As soon as raw data enters the Living Algorithm System, it is immediately dumped into a computational stew and is forgotten. After making its instantaneous impact upon the moment, the data is absorbed into the System's ongoing measures, the Predictive Cloud. The data leaves traces of its impact, but its precise features are just a fading memory.
Probability, in contrast, must retain the precise features of each member of his fixed data sets. Remembering each of these values is essential,
if he is to adequately perform his primary function. Probability requires the perfect memory of a computer, or at least a ledger sheet, to
remember the precision of his data points.
Probability's task, as we've discussed, is providing measures that characterize the general features of his data sets. Computing the values of these measures is tedious, some might even say complicated, to say the least. The Standard Deviation's square roots are never that fun. Psychologists everywhere breathed a sigh of relief when computers entered the scene to compute their statistical measures. Further, decades of schooling are required to understand and employ Probability's many equations. Living Systems don't have this luxury. The urgency of the moment demands an immediate response to preserve Life's fragility.
Biological systems have a difficult time remembering anything that is not relevant. A pile of precise numbers from the past has no meaning, except for what they contribute to the present moment. Because these precise numbers have no relevance, living systems would have a difficult time remembering them. In contrast, the Living Algorithm's Predictive Cloud provides crucial up-to-date information about data streams that could be relevant to survival. Accordingly, living systems could more easily remember the composite averages (the Cloud) that the Living Algorithm provides. Infused with the emotion of survival, the Living Algorithm Measures are well worth remembering, especially compared with the precise values of non-emotionally-charged data points. The Living Algorithm Measures are also easy to compute: just one algorithm and basic math (no square roots).
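The contrast in bookkeeping can be made concrete with a sketch. The function names, sample values, and the 1/16 scaling factor below are illustrative assumptions, not the monograph's notation: the Standard Deviation must store and revisit every data point (and take a square root), while a Living-Algorithm-style update keeps only two evolving measures and discards each data point as it arrives.

```python
import math

def std_dev(data):
    """Probability's route: the entire data set must be retained,
    traversed twice, and a square root taken at the end."""
    mean = sum(data) / len(data)
    return math.sqrt(sum((x - mean) ** 2 for x in data) / len(data))

def running_measures(stream, scale=1/16):
    """A Living-Algorithm-style route: one pass, no stored data,
    just two evolving measures (expected value, expected range)."""
    avg = rng = 0.0
    for x in stream:
        diff = x - avg
        avg += scale * diff               # expected value
        rng += scale * (abs(diff) - rng)  # expected range, no square root
    return avg, rng

arrivals = [6.0, 6.1, 5.9, 6.0, 7.0, 6.2]  # hypothetical arrival times
print(round(std_dev(arrivals), 3))   # needs the whole list in memory
print(running_measures(arrivals))    # needs only two running numbers
```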
The memory and computational requirements of the Living Algorithm System are all within the range of any biological system, including cells. The memory and computational requirements of Probability are well outside the range of any biological system, including a genius. It seems safe to say that Probability's obsession with the precision of his data prevents him from providing Life with the up-to-date, meaningful information that is essential for survival. In contrast, the Living Algorithm's neglect of extraneous detail allows her to provide the fungible interpretative mechanism that Life requires. In so doing, the Living Algorithm incorporates ambiguity into her System.
Introduction to Algorithms
"In mathematics and computer science, an algorithm is an effective method expressed as a finite list for calculating a function. In simple
words, an algorithm is a step-by-step procedure for calculations. Giving a formal definition of algorithm, corresponding to the intuitive
notion, remains a challenging problem." (Wikipedia) The 'step-by-step procedures' we learn in elementary school to add, subtract, multiply,
and divide large numbers are common examples of some simple algorithms.
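Column addition, among the first of these grade-school procedures, can be spelled out as code. The sketch below is merely an illustration of 'step-by-step procedure': add digit by digit from the right, carrying when a column's sum reaches ten.

```python
def column_add(a, b):
    """Grade-school column addition: work from the rightmost digit,
    carrying a one whenever a column's sum reaches ten."""
    a, b = str(a), str(b)
    width = max(len(a), len(b))
    a, b = a.rjust(width, '0'), b.rjust(width, '0')
    carry, digits = 0, []
    for da, db in zip(reversed(a), reversed(b)):
        total = int(da) + int(db) + carry
        digits.append(str(total % 10))  # write down the ones digit
        carry = total // 10             # carry the tens digit leftward
    if carry:
        digits.append('1')
    return int(''.join(reversed(digits)))

print(column_add(478, 356))  # → 834, just as on paper
```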
The 'intuitive notion' of algorithm that we employ in our discussion of the Living Algorithm has to do with a 'step-by-step procedure' for determining an answer. In mathematics and computer science this answer must be precise and unique, right or wrong. This is also true for the algorithm that determines the mathematical value of the Living Algorithm. However, when living systems employ the Living Algorithm's algorithm to make reasonable predictions, the answer must only be close enough for practical purposes.
Take a deep breath. Let's not give up quite yet. Let's go back to the beginning of this process. What was the only ingredient needed for the computation of the average arrival time? A simple list. Sounds like a data set. Let's turn our data set into a data stream. In daily life, we process information as a stream, where the most recent points have the greatest significance (the recency effect). It is difficult to remember a simple list of arrival times. Yet remembering a stream of arrival times in the order of their importance is even more daunting. Terrible suggestion. Give it up. Let's run for the hills before confusion overwhelms us. Data Streams incorporating the recency effect are an order of magnitude more complex than a simple set of data.
1) Find the difference between the most recent data point and the current average.
2) Scale this difference down by some ratio (1/2 or less).
3) Add or subtract this scaled difference to the current average to obtain the new current average.
Three steps. Let's see how it deals with our verbal data.
Suppose our first data point is a little more. Let's see. 1) What is the difference between a little more and the expected arrival time (the previous average)? Simple: a little more, nothing else. What's next? 2) Scale down this difference. Now our value is tiny. 3) Finally, add this tiny value (the scaled-down difference) to the previous average to get the current average. Because our partner arrived home a little later than normal, we now expect them to be just a tiny bit later from now on. Simple. If the arrival time gets later and later, the expected arrival time (the average) drifts slowly later. If the arrival times are erratic, the expected arrival time hovers around a center point.
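With numbers instead of words, the three steps reduce to a few lines of code. The function name and the 1/2 ratio below are illustrative choices; the text allows any ratio of 1/2 or less.

```python
def update_average(current_avg, new_point, scale=0.5):
    """The three-step procedure applied to numeric data."""
    difference = new_point - current_avg  # 1) find the difference
    scaled = scale * difference           # 2) scale it down (1/2 or less)
    return current_avg + scaled           # 3) fold it into the average

# Expected arrival is 6.0 (6:00 PM); the partner arrives at 6.5 (6:30).
print(update_average(6.0, 6.5))  # → 6.25: the expectation drifts later
```

A signed difference handles both cases at once: an early arrival produces a negative difference, so step 3 automatically nudges the expectation earlier.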
expect your partner home at about 6:05 plus or minus 20 minutes taking into account a possible delay.
Up to this point your partner's arrival time had been stable. Your expectations of recent tendencies weren't leaning either way. Because she arrived a full hour later than usual, you assign a new, later time to her recent tendencies. But the next evening she arrives late again, this time at 6:45 PM. (Working late.) Because normal expectations were exceeded again, you might even be a mite irritated that your partner didn't call. Again all three expectations are bumped upward: 1) the arrival time, 2) the range of arrival times, and 3) the recent tendencies of the arrival times. Again these adjustments are proportional to the differences. Maybe you now expect your partner to arrive home a little later, at 6:10 PM, with an increased range of 25 minutes, and certainly the recent trend is towards later.
With each new arrival time these three expectations are adjusted accordingly. No data need be retained. The expectations are immediately
adjusted and the exact data has no more value - like yesterday's weather. Although you may store extreme values, there is no need for any
database to compute the expectations. No numerical baggage need be lugged around. Although we assigned numbers to these changes, we
could just as easily have assigned words. For instance: 'She has been arriving home a lot later, just recently.' Or 'Her arrival times are much less
stable than they used to be.'
Because expectations determine the threshold of response, they have an emotional component as well. As a personal example: Herbert, an older German waiter, always showed up 10 minutes early for his shift. One time he was 5 minutes late and everyone began worrying. Herbert's range of expected arrival times, as determined by his past performance, was so narrow that even a slight variation was alarming. In Herbert's case, 5 minutes late was a lot later than normal. In contrast, my Person always showed up a little late. Even though he was later than Herbert on this night, no one paid it any mind. Because he was in the range of his expected arrival times, no thresholds were crossed. No need to contemplate a change in behavior. With Herbert, the manager had already begun to wonder if he should call the police. As this example shows, when people behave outside of their usual expectations, others may respond emotionally, in this case with concern.
An awareness of the evolving features of your partners arrival times leads to estimates concerning her future performance. These estimates
lead to expectation. Expectations tend to be emotionally charged as they determine the thresholds of response. When expectations are met,
such as when the partner arrives home on time, no thresholds are crossed and no action needs to be taken. When expectations aren't met, such as when the
partner arrives home much later than usual, the threshold of expectation is crossed, and decisions need to be made about what should be done.
Because the three expectations (expected arrival time, range of arrival times, and recent trends) have emotional content, they are much easier to
remember. Studies consistently show that emotionally tagged information is much easier to remember than raw facts.
Computationally this is a very economical system in terms of memory, computation and relevance. The Living Algorithm digests a single data
stream to create three very different expectations. Mathematical residue from the primary computation becomes the data in the secondary
computations. It is not necessary to retain the data that goes into making these computations. Only the current adjusted expectations regarding
arrival times have any meaning or relevance. One algorithm digests a single raw data stream to produce three emotionally charged
expectations. Due to this emotional component these values are easy to remember.
Mathematically, the Living Algorithm creates three evolving, composite data streams from the ongoing raw data of arrival times: 1) probable
location, 2) probable range of variation, and 3) recent trends. Each of the data streams is characterized by a single value (not a database) that
describes a unique feature of the most recent moment. The primary composite data stream determines the most probable arrival time. It is
based upon the difference between expectation and the most recent arrival time. The Living Algorithm employs this original difference
(residue from the initial computation) as the data for the other streams. In the case of the trends of the data stream, the original difference is
digested as it is (positive or negative: a vector). In contrast, to determine the expected range, the initial difference is digested as a positive
number (a scalar).
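The three-stream process described above can be sketched in a few lines of code. This is a hedged reconstruction, not the author's published formula: the text says only that each expectation is adjusted in proportion to the difference between expectation and the newest data point, so the sketch uses a generic proportional factor 1/D, where the decay parameter D is an assumption introduced here for illustration.

```python
# A minimal sketch of the three-expectation update described above.
# Assumption: each expectation moves toward new evidence by a proportional
# step of 1/D per data point; the value of D is not specified in the source.

def digest(expectations, x, D=10.0):
    """Digest one new data point x into (location, spread, trend)."""
    location, spread, trend = expectations
    diff = x - location                  # residue of the primary computation
    location += diff / D                 # 1) most probable value
    spread += (abs(diff) - spread) / D   # 2) expected range: residue as a scalar
    trend += (diff - trend) / D          # 3) recent trend: residue as a vector
    return (location, spread, trend)

# Usage: feed arrival times (minutes past 6:00 PM) one at a time.
# Only the current expectations survive each step; no database is kept.
state = (0.0, 0.0, 0.0)
for arrival in [0, 5, 10, 15, 20, 25]:
    state = digest(state, arrival)
print(state)
```

Note how the residue `diff` from the primary computation is reused as the raw data for the other two streams, and how nothing but the three current values is ever retained, which matches the "no numerical baggage" claim in the arrival-time example.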
In Data Stream Mathematics, the initial article of this volume, we argued that biological systems require some type of mathematical system to
digest the ongoing flow of environmental information. This data stream mathematics of living systems must satisfy some precise requirements.
This mathematical system must be able to address the immediacy of the moment as well as the ongoing relationship between moments. Further,
the mathematics must provide some type of fungible interpretative mechanism that sacrifices precision for meaning. This meaning must be both
descriptive and predictive.
In the volume's subsequent articles, we demonstrated that the Living Algorithm fulfills these requirements for a mathematics of living systems.
Conversely, Probability's mathematical system is unable to satisfy these requirements. There is an innate reason for these differences in ability.
Probability asks the question: what is the mathematical nature of fixed data sets? Conversely, the Living Algorithm asks the question: what is
the mathematical nature of dynamic data streams? The question that is asked determines the nature of the answer. Consequently, Probability
delivers answers that are related to the general characteristics of fixed data sets, while the Living Algorithm provides answers that are related to the
individual characteristics of dynamic data streams.
The data stream mathematics of living systems must incorporate yet one more requirement. The system must include the possibility of
interaction with the environment. This interaction is an essential feature of living systems, as it enables the ability to monitor and adjust to
external circumstances in order to survive. In other words, the mathematics of living systems must also incorporate the possibility of Informed
Choice.
Can the Living Algorithm fulfill this crucial requirement? Traditional Physics, i.e. Mechanics, also specializes in characterizing data streams.
What are the differences in the approaches of these two mathematical systems to data streams? Does Mechanics incorporate the possibility of
Informed Choice?
The fundamental difference between the two systems is rooted in the basic equations that each employs to characterize existence. The equations
of classical Newtonian Physics utilize an infinite and continuous stream of numbers. As such, the focus is upon a number line. The Living
Algorithm is digital, in the sense that her number stream is comprised of discrete points. As such, her focus is upon individual numbers. This
seemingly small difference is the hairbreadth that leads to entirely different conclusions regarding life and the fundamental nature of the
Universe. To see why, let us establish a historical context.
Let's examine the differences between the traditional continuous analog equations of Physics and the digital feature of the Living Algorithm
from a visual perspective. At right is one of the fundamental (perhaps even quintessential) graphs of Physics: the classic sine wave. The
sine wave is the basis of such universal phenomena as electromagnetic waves and spring action. It is based upon initial conditions and is continuous, as are virtually all
the equations of classical Physics.
Although the three alternating pulses of the sine wave and the Triple Pulse have many apparent similarities, the method employed to generate
these two graphs is as different as night and day. Although both are seemingly flowing curves, the first is based upon a continuous analog
equation, while the second is based upon a discontinuous digital equation. In fact, the two types of graphs seem so visually similar that it took
the Author over 8 years to realize that there is a fundamental difference between the two.
He was further amazed to find that this mathematical difference articulates a key point of departure between the traditional and the new
scientific perspective. The traditional approach emphasizes automatic processes, while the new scientific approach stresses the potential for
informed choice. (These ideas are detailed more completely in A New Age of Science.) As we shall see in the following paragraphs, the
continuous equations of Physics inherently exclude the possibility of choice. In contrast, the potential for choice is inherent to the Living
Algorithm System.
A Close-up of the Sine Wave
The continuous analog equations of Physics have no wiggle room. No matter how much the graphs of these equations are enlarged, they remain
a smooth curve. As an example, let's view a close-up of the classic sine wave of Physics (shown at the right). Note the curve remains unbroken.
No matter how many times the graph is blown up, the curve will remain continuous.
In contrast, the Living Algorithm System contains an abundance of wiggle room. This is due to the digital nature of the Living Algorithm's
method of digesting information. With each iteration (repetition of the digestive process), a new piece of data enters the Living Algorithm
System. The union of the Living Algorithm and the Raw Data Stream produces an ongoing Family of Measures. These Measures represent a
smoothing out of the Data's potential roughness. However, no matter how many times this smoothing out process is performed, the resultant
measures remain discrete. As contrasted with Physics' invariable automatic continuity, there is absolutely no connection between the points in
the ongoing data streams that the Living Algorithm generates. In fact, the elements of the data streams, like data sets, are inherently distinct.
A Close-up of the Triple Pulse
A close-up of the Living Algorithm's Triple Pulse is visualized in the graph at the right. The image illustrates how she is made up of distinct,
rather than continuous, parts. The apparent continuity in her classic representation is just an illusion. The illusion of continuity is due to the large
number of iterations (repetitions). In similar fashion, the characters in movies and cartoons appear to move continuously, but are instead based
upon distinct frames that are shown rapidly enough that our visual processor turns them into a moving picture. In the case of the Triple Pulse,
the bars create a similar effect to the cartoon frames. When there are enough of them, they give the image of the Triple Pulse a continuous
appearance.
As we've seen, the Living Algorithm digests external input in a digital fashion. As such, each data point in the stream of information is
discrete/individual. In other words, there is space between each data point. This space provides time to evaluate the meaning of the signal
and respond. Accordingly, the Living Algorithm's digestion method incorporates the ability of an organism to monitor and adjust to the
environment. The capacity to monitor and adjust is an essential ingredient of the ability to choose, the essence of informed choice. It is
evident that Living Algorithm mathematics incorporates the possibility of Choice, an inherent ability of living systems. In contrast,
the continuous equations of Physics provide no opportunity for an interaction with the environment.
The Living Algorithm's Fresh & Free Data vs. Physics' Hard Data
To further assist our understanding of how the Living Algorithm's method of digesting data enables the potential for choice, let us contrast the
relation the Living Algorithm and Physics have to their Data.
The Living Algorithm requires an ongoing flow of fresh raw data to fuel her System. Further, this data is free, in the sense that it is not
predetermined by the Living Algorithm. In fact, the Data is entirely independent of the Living Algorithm. Instead of describing her Data, the
Living Algorithm organizes her data. This is of great use to Life, as the process provides the fresh & free Data with meaning.
In contrast, Physics has an entirely different relation with his Data Streams. Instead of organizing his Data Streams, he dominates them with his
continuous, automatic equations. Physics only needs Data to determine his absolute formulas. Once Physics gives birth to his magnificent
equations, he abandons his Data.
After the derivation of the mathematical formula, Data is unnecessary (except perhaps to check results). Physics only needs the starting point
(the initial conditions). Once the initial conditions are determined, any computer can crank out the results of continuous (analog) equations. The
results of this amazing form of analysis can include the position, velocity, acceleration and force of virtually any material system. Plug in the
initial conditions and out come the results. The graphic visualization of these results generally includes a continuous curve from the distant
past into the infinite future. Everything follows automatically according to the immutable laws of the equations derived by the immortals
(Newton et al).
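The "plug in the initial conditions and out come the results" model can be made concrete with a standard constant-acceleration kinematics formula. The specific numbers below are illustrative assumptions, not values from the text; the point is that once the initial conditions are fixed, every instant, past or future, is already determined, and no new data ever enters the calculation.

```python
# Illustration of the deterministic model described above: a textbook
# continuous equation of classical Mechanics. All numeric values here
# (initial position, velocity, acceleration) are illustrative assumptions.

def position(t, x0=0.0, v0=3.0, a=-9.8):
    """x(t) = x0 + v0*t + (1/2)*a*t^2 -- fully fixed by initial conditions."""
    return x0 + v0 * t + 0.5 * a * t * t

# Once x0, v0 and a are set, any computer can crank out the whole curve;
# there is no point in the stream at which fresh input could intervene.
trajectory = [position(t / 10) for t in range(5)]
print(trajectory)
```

Contrast this with the Living Algorithm's update loop, which stalls without the next fresh data point: here the "stream" is generated entirely from the equation itself, which is exactly the Master/Slave relationship the following paragraphs describe.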
This model drives the philosophy of scientific determinism. Under this way of thinking, the initial conditions at the moment of the Big Bang
determined everything that has transpired since - music, civilization, even your relationship with your dog - everything. God/Science knows all.
The only choice is to set the initial conditions. Once these are set, every point in the data stream is predetermined. No more choices. This is why
we say that Physics dominates his Data Streams with continuous Equations: a Master/Slave relationship. (Unbelievably, a significant
group of humans actually believe that Physics will eventually dominate every data stream with his marvelous equations.)
Because the equations of Physics are derived from an examination of the Data, the accuracy and precision of the Data is of paramount
importance. To indicate this importance, Physics refers to his data as hard data: the harder the better. Because his equations are able to
absolutely dominate this hard data, Physics is referred to as a hard science, perhaps the quintessential hard science.
Another indication of the importance of Hard Data is that theory and accuracy of measurement have moved hand-in-hand, one or the other
leading. As an example of data leading theory, the accurate mapping of the position of the stars by ancient civilizations eventually
led to Ptolemy's planetary theory. An example of theory leading data: Copernicus's revolutionary idea that the earth revolves
around the sun was finally validated a few centuries later, when advances in technology made it possible to more accurately
measure planetary position. In the case of modern Physics, unusual experimental discoveries that were made possible by
advances in measurement technology led to the brilliant theoretical formulations of Einstein, Heisenberg, Feynman et al. This coevolution of Hard Data and Theory characterizes the Hard Sciences.
In contrast to the equations of the hard sciences, as epitomized by Physics, the Living Algorithm is not derived from the Data. Her function is to
digest Data Streams - crunch an ongoing flow of numbers into an ongoing flow of central measures. This process organizes data streams to
reveal their meaning. As mentioned, the Living Algorithm does not predetermine the values of the instants in her Data Streams in any way.
Instead, she waits patiently for the next Data Point. Her suggestive descriptors give an idea what the value might be, but do not in any way
determine it. There is nothing automatic about her suggestive predictions. Her data is free.
But if the data in the stream is truly free, what can be said about it that is meaningful? Let's see.
relationship between these instants. Without this context, instants are inherently devoid of meaningful pattern. Due to the context provided by
the Living Algorithm, moments are inherently filled with mathematical meaning. In essence, the Living Algorithm transforms
meaningless instants into meaningful moments. Or yet another articulation of this process: the Living Algorithm's digestive system provides
meaning to Raw Data.
But this analysis is slightly misleading. The Living Algorithm is only able to transform the instants of a raw data stream into
meaningful moments if there is a biological organism to interpret or translate her message. The Living Algorithm only determines the
mathematical nature of each ongoing moment. The organism must interpret these essential clues to give them meaning. The statement "The
Living Algorithm provides meaning" is just shorthand for the above analysis. It is important to always remember that a biological intermediary
is required to interpret the Living Algorithm measures and give them meaning. This factor becomes very important when we discuss the mass
of attention.
Algorithm, exists as an Open Information System. There is an inherent permeability between the internal and external world that characterizes
all biological systems, from a single-celled amoeba on up. In living systems there is a constant give-and-take (stimulus/response) that enables
the possibility of informed choices.
Newtonian Physics does not incorporate the potential for stimulus and response. The traditional analog equations of Physics combined with
initial conditions are the sole determiner of future events. Conversely, the Living Algorithm, like Life, incorporates the possibility of choice with
her digital equations. This give and take relation to data streams is yet one more similarity between Life and the Living Algorithm.
Despite their many differences, the mathematics of the Living Algorithm and Physics are bound together by a common thread, which could
more accurately be described as a superhighway. The two forms of mathematics, with their uniquely different fields of action (life's
informed choice & material determinism), are bound together by classic Newtonian concepts, such as force, work and power,
mass, space, and even time. These concepts, while providing a common element between the two polar systems, have radically
different manifestations. To see where these complementary systems merge and diverge, check out the next notebook, Data
Stream Dynamics.
To begin to understand this intersection between two orthogonal planes of existence, we must first get to know the Living Algorithm a little
better. We have seen her in action, but we have yet to meet her. The initial article in the stream sets the stage with an exploration of the Living
Algorithm's algebra. Don't worry; the discussion doesn't require a mathematician. The Living Algorithm is a simple equation, only requiring a
basic knowledge of arithmetic. To better understand the underlying patterns of this unique equation's innate nature, check out the Living
Algorithm, her Instantaneous Self.
Let's encapsulate our story. Life is searching for a mathematical partner that will be sensitive to her subtle immediacy. Probability's Data Set
Mathematics, with his big picture focus, accurately captures the features of the general population. However, because of this specialty he
doesn't have the tools to understand Life's immediacy. Needless to say, her relationship with Probability has proved disappointing. His
constant focus on providing measures for her fixed data sets has provided stability, in that he has accurately characterized her permanent
features, even making definitive predictions about her fixed nature. Yet Probability's style, while dependable, has frustrated
Life.
To understand her subtle and immediate nature, Life requires a mathematics of data streams. Her subtle character is more associated with the
momentum of recent moments than it is with fixed and permanent features. If she hears of her general tendencies one more time, she is
going to scream. She almost feels that Probability is objectifying her, rather than appreciating her for who she is and the characteristics that
make her special. He even trivializes her ongoing data streams by transforming them into fixed data sets. While Probability accurately
characterizes these fixed data sets, Life wants a mathematics that is sensitive to her ongoing data streams.
But this new mathematics of data streams can't be just any old data stream mathematics. This new mathematics must fulfill some stringent
requirements if it wants to be considered the Mathematics of Living Systems. Life is very particular about who she partners up with. To be
sensitive to her needs, this mathematics must weight the current moment more heavily and provide ongoing predictive descriptors that
pragmatically characterize the trajectories of the moment. Further, due to Life's inherently changeable nature, she requires suggestive
predictors that incorporate a range of possibility. This relative imprecision is an asset, not a liability. Probability's definitive predictions are
too exacting and general to be sensitive to Life's contextual spontaneity. Life does not want to be boxed in. She has felt suffocated by
Probability's approach. To form a new mathematical relationship, Life is looking for a Data Stream Mathematics that is sensitive to the
special meaning of the moment: in short, a Mathematics of the Moment.
Where is Life going to find this special mathematics? Certainly not at a singles bar. Are her requirements too strict? Is she doomed to
mathematical isolation, her subtle immediacy unappreciated? For some preliminary answers to these questions, read the next
article in the stream, The Living Algorithm System. To continue with the metaphorical perspective, read on.
annual salaries, while my predictive cloud has more utility to players, coaches and gamblers, when making immediate decisions on game
day. This example shows that rather than being subservient to him, we are complementary systems. He specializes in general features of the
data set, while I specialize in the individual moments of a data stream.
There might be some who still feel that I, the Living Algorithm, am but a subset of Probability. To sweep away any remaining confusion as
to our relationship, read the next article in the stream, Mathematics of the Moment (vs. Probability). Some of my good friends wrote it. I
think they did a pretty good job."
To remain in the metaphorical world, read on.
Life was relieved to find that the Living Algorithm's claims are true.
Life: "The Living Algorithm is not a subset of Probability, but a complementary system. Further, Alga's Predictive Cloud provides relevant
information regarding individual moments, something that is very important to me. Could she be the mathematical system of my dreams? Is
it possible that she could reveal some codes to my living matrix that will enable me to better actualize my potentials?"
This pleasant reverie was disturbed when she saw Probability striding confidently towards her, obviously with some purpose in mind.
Life could sense from his jutting chin that he still had bones to pick over their last interaction.
After pleasantries were exchanged, Probability asked in a not-so-innocent fashion: "So how is your mathematical relationship developing
with the Living Algorithm?"
With a twinge of the victor, Life: "Great. She is able to address aspects of my innate being that the rest of you have ignored."
Probability: "So we're not good enough for you anymore?"
Life: "Sorry. I didn't mean to be offensive. I love you all. I really appreciate the unique form of guidance that each of you provides. Physics
really understands the dynamics of my matter, while you are a specialist on my general features. I'm especially excited about Alga right now
because she specializes in my immediacy and my ability to choose. As complementary systems, each of you addresses a different side of my
innate nature. Remember, in our last encounter we found that Alga is not a subset of your system, but that you are complementary systems
instead."
Life could sense that she had pushed some buttons because the muscles in Probability's smile tightened up into a grimace.
Probability: "I accept the argument that the Living Algorithm is not my child. Despite our common obsession with Data, we have unique
fields of action, mine data sets and hers data streams. Consequently, the questions we ask and the answers we get involve unique, yet
complementary, matrices of thought. She characterizes moments in the data stream, while I characterize entire sets. The Living Algorithm's
results are ever changing, while my results are permanent and never changing. I appreciate the pragmatic utility of the information that the
Living Algorithm provides. However, the transitory nature of this information, combined with the individual nature of the data streams she is
analyzing, limits, if not eliminates, any scientific value of her analysis. In contrast, the permanent nature of my results combined with the
general and fixed nature of my sets renders my analysis perfect for the scientific community. Because of my talents, scientific endeavors
confidently employ my computations and measures to establish the validity of their results. Look at how famous I am in the world of
subatomic particles. What scientific efficacy does the Living Algorithm have, if her analysis is transitory and her data streams individual?"
Rattled and not really understanding, Life responded feebly, "But what about the Living Algorithm's Predictive Cloud? It certainly provides
a unique and pragmatic perspective regarding individual moments, something that even you can't do."
Probability, derisively: "A Predictive Cloud!? What kind of predictions can be made with a cloud? Sounds ambiguous to me."
Life, uncertainly: "I might be wrong. But it seemed that the Author's most recent article illustrated that the Living Algorithm's Predictive
Cloud does a good job with the batting average, perhaps even better than you, when it came to the ongoing games in a player's
career."
With a sense of superiority Probability boasted, "Ptah! I'll grant you that the Living Algorithm's method of analyzing the batting average
might be useful to gamblers or coaches, but it has no scientific value and hence no significance. Stick with me, if you want some definitive
answers. Go to her, if you are happy with mere suggestions."
Life was confused again. Questions raced through her mind, over and over again like a broken record. "Are my data streams so transitory and
individual that I must be content with a pragmatic mathematics that has no scientific value? And why is Probability so famous in the
subatomic world? Plus, why doesn't baseball's batting average have any scientific significance?
This is all too confusing. To sort things out, or at least establish priorities, I'm going to meditate."
Ommm? Whoa! Recently, I've been riding a rollercoaster of emotions, and all due to my budding relationship with the Living
Algorithm. Maybe I should just remain single to eliminate these psychic disturbances. But that wouldn't help. I seem to crave a
mathematical partner. I had grown dissatisfied with other more traditional mathematical choices for constantly attempting to box
me in - regulate my every move. I decided to look for an alternative: a mathematical guide that might help me to realize my
potentials by unlocking the code to my matrix. Friends introduced me to the Living Algorithm, which led to a few successful
interactions. Wanting to take our relationship to another level, I posed some requirements, which the Living Algorithm fulfilled.
Everything was going perfectly. It even seemed that the Living Algorithm and Probability, as complementary systems, might be
able to join together to provide me with a more comprehensive set of clues to my behavior.
"But then Probability challenged the Living Algorithm's scientific credentials," Life fretted. "Perhaps he was jealous of all the attention I was
giving to the Living Algorithm. Perhaps he is right. Could the Living Algorithm just be a poseur - pretending to be a valid system, but
without any real scientific foundation? If her insights have no basis, how can I trust the codes she reveals? What was it that Probability said?
Oh right. The Living Algorithm's analysis, while pragmatic, is too transitory and individual for Science. And then, when I brought up how
much useful information the Living Algorithm provided in the batting average example, Probability just snorted derisively: 'The batting
average has no scientific significance.' Does Probability have a valid point, or is he just attempting to undermine my budding relationship
with the Living Algorithm?"
The Mathematics of Living Systems Page 42
Smiling cockily, Probability summarized the analysis: "My batting average has a proven pragmatic value, as evidenced by the fact that it is a
factor in determining both strategy and salary. However, due to its individual nature, it can't be compared to any other set with any kind of
scientific certitude. Accordingly, the batting average has no scientific value. The same analysis applies to the Living Algorithm's predictive
cloud. Data streams are so individual and transient that it is impossible to achieve the certitude that Science requires. I must admit that her
predictive cloud characterizes moments much better than I. Nevertheless, her descriptions of moments have no more scientific validity than
my batting average, for the same reason. It is impossible to generalize the results to other data sets due to the individual character of the data
we are analyzing.
"However, I, Probability, can generalize my analysis of one homogeneous matter set to another. Atomic particles, whether electrons, atoms, or
molecules, are identical and obey the same universal laws in all times and places. Under the same circumstances, one electron behaves the
same as another: no individuality whatsoever. Because of my ability to generalize my analysis, Science has admitted me to his
exclusive Circle. While the Living Algorithm's Predictive Cloud provides potentially pragmatic information regarding future
moments, this alone will never get her into the Science Circle. Because she only deals with transitory moments in individual
data streams, there can be no certainty of her predictions. Due to this lack of certitude, the Living Algorithm's analysis is of
questionable value, at best. Accordingly, Science ignores her analysis.
"I, on the other hand, am famous in the scientific community. Anyone who has anything to do with science must have at least a rudimentary
knowledge of my system. In the hard sciences I am world renowned for cracking the code to the electron's matrix. The soft sciences employ
my skills to establish the validity and significance of their experimental studies. They all worship at my altar. I will eventually solve your
code just like I resolved the code of the subatomic matrix. Your Living Algorithm is unnecessary."
Somewhat taken aback by Probability's arrogance, Life responded quietly, "So you are going to identify all my general characteristics and
then claim that this is me?"
Understanding the subtext of my comment, Probability angrily blurted, "You just wait and see! I will be able to predict your behavior just
like I do matter. After all, you are just a sack of atoms." After asserting his presumed dominance, he stormed out.
I relayed this story to the Living Algorithm in our next encounter. She laughed so merrily that I joined in. "Probability is getting desperate.
He is overly attached to becoming your mathematics. He needs to detach a bit from these expectations. They create emotional chains that are
disturbing his internal peace. He is already famous the world over for his achievements. Why does he feel a need to dominate human
behavior? Perhaps it is a way of compensating for a sense of inadequacy over his inability to address the dynamic nature of existence.
"Let me first address his claim that my method has no scientific value because of the transient and individual nature of the data streams that
are my sole obsession. As evidenced in our batting average example, my predictive clouds supply an abundance of practical information
when applied to living data streams. Experimental evidence suggests the likelihood that Life employs this pragmatic tool, my predictive
clouds, for assessing environmental patterns to best determine the most appropriate response to ensure survival. If Life employs my
predictive clouds, then Life is also subject to my information patterns. In Triple Pulse Studies, the first notebook in this series, we examined
many examples of how Life has employed the Triple Pulse, one of my many information patterns, to organize human behavior associated
with sleep. Accordingly, my scientific value lies in my ability to reveal the underlying information patterns that motivate behavior. However, I
can't establish the scientific certitude of these connections on my own. I require Probability's analytical talents to verify, or at least establish
the limits on, the correspondences between human behavior and my information patterns. Thank you, Probability."
Life was relieved to find out that the Living Algorithm was not a poseur. Her method had scientific merit, even though she had to rely on
Probability's services to establish certitude. "What a great team you can be. Probability can provide helpful information about my general
features and assist you in your quest to establish your scientific validity. You, on the other hand, can unlock the code to my personal
dynamics, the matrix of my behavior."
Alga: "Exactly. We need each other to provide a comprehensive picture. Probability is helpless before your dynamic nature, as his specialty
is static data sets, not dynamic systems. He can establish precise definitions, but no causal mechanisms. At best he can provide a rough map
of the landscape. This is very useful because it reveals where you can go and where you can't. But the map reveals very little about inner
motivations and potentials, the factors that influence and inspire your behavior. That is my specialty, as my sole focus is the
dynamics of data streams."
"Ironically, the story of how Probability became famous as ruler of the subatomic world illustrates both his inherent strengths and
weaknesses. Further, it pertains to why my dynamic nature is ideally suited to determining causality, while his static nature is more suited to
description. As with other aspects of our respective systems, these talents are mutually exclusive. Read on to see how Probability was able to
patch up the gaps in the subatomic universe that were left by classical Mechanics, the star of Physics. In so doing, Probability became the
new star, both technically and philosophically, for a while at least. Fame is always so fleeting."
To explore these issues, read the next article in the stream, Description vs. Causality; Static vs. Dynamics. To continue in the metaphorical
world, read Life yearns for Mathematics of Relationship.
charged with memory and expectation, even an emotional momentum that propels me forward. Further, I only exist in connection to other
living beings. We cooperate with each other in order to survive. This relationship is essential if any of my infinite transformations are to
persist. And, of equal importance, we make choices to facilitate survival. The good choices are rewarded and the bad choices are punished.
Most of these choices verge on automatic, but recently a new strain has developed with a sophisticated ability to make
conscious decisions. This is why I want my own mathematics.
That is also why Physics is such a disappointment. As said, I was so excited when those Europeans - Galileo, Newton, and such - got
into dynamics. Finally, a mathematics that addresses Change, the essence of my Being. However, my initial infatuation with the
mathematics of Physics was eventually replaced by despondency. It became evident, pretty early on (maybe with Descartes),
that Physics was primarily interested in the automatic behavior of matter - planets, atoms and such. Perhaps I was in a state
of denial, but I had secretly hoped that somehow his focus on change would eventually lead to me.
Boy, was I deluded. Instead Physics had the gall to say that I was only made of matter and had no real say in what my next move was. He
denied my unique ability to choose - to make informed decisions about my future. That was the very reason I wanted my own mathematics,
so s(he) could help me to make better decisions about what to do next. My dearest hope was that this mathematics would assist me in my
quest for Self-Actualization. That is my only real desire - to fulfill my potentials. And then Physics arrogantly claims that Choice is an
illusion. According to him, even Human Behavior is solely determined by the collisions of subatomic wave/particles that go backward and
forward in time. My hopes for a compatible relationship were dashed. That is when I decided to look elsewhere.
I almost gave up hope of ever finding a Mathematical Guide - someone who could assist me on my difficult quest. Many of my
friends told me to give up this impossible dream. "This mathematics doesn't exist. You will never be compatible with his
numbers," they claimed. "You're too spontaneous; he's too rigid."
And then along came the Living Algorithm. Certainly not much to look at. Our first date was casual. I didn't have much hope. Amazingly,
we agreed almost completely on the Interruption Phenomenon. We didn't go on our second date for quite a while. We didn't really know
what else we had in common. After the Living Algorithm matured a little, we had a great series of interactions concerning the Triple Pulse's
relation to sleep-related phenomena. Again, the codes of our matrices seemed to be compatible. We agreed on everything.
Trying not to get too excited, I wrote down a series of requirements that I had for a mathematical partner or guide. First and foremost, the
mathematics needed to address the immediacy of data streams. I was excited to find that the Living Algorithm is all about Immediacy. Her
specialty is digesting data streams - turning instants into moments. But I need a Mathematics that also addresses relationships between
moments in time. Although I exist in the moment, I am connected with the past and have a sense of future potentials. Does the Living
Algorithm relate moments together, and if so, do they have an effect on each other?
Some of my friends say that I am being too hard on the Living Algorithm - making too many demands. But I am special, in this wide
universe of ours, and have specific needs that must be fulfilled before it is worth it for me to form a relationship with any
mathematics. Is the Living Algorithm the mathematics of my dreams - the mathematics of relationships as well as the
mathematics of the moment?
To find out if the Living Algorithm can fulfill Life's ideal needs, read The Mathematics of Relationship. To see how this interaction turned
out, read on.
CE7-8: Can the Living Algorithm provide Life with Fungible Meaning?
Life was exhilarated to find that the Living Algorithm created a system where moments interacted with each other, where what happened in
the past had an effect upon the future. She thought to herself hopefully, "As our relationship develops, it seems that the Living Algorithm
and I have exceedingly similar matrices. Is she the mathematical guide of my dreams? Before claiming this title, the mathematics of the
Living Algorithm must somehow deal with fungible meaning - in the sense of sublimating precise detail for the larger picture. The
others worship precision and, as such, miss essential meaning. Alga must be in my camp, not theirs, if she is to be my method
of digesting environmental input."
We have had many successful interactions. Our theories and experiences correspond almost completely on certain issues - including
interruptions to a creative session and many sleep-related phenomena. However, these could be only superficial similarities, like enjoying
the same music. To move to a deeper level in our relationship, I have required that the Living Algorithm pass four ordeals.
These ordeals did not include dragon-slaying, maiden-rescuing, or tyrant-overthrowing. Nothing aggressive like that. Instead the Living
Algorithm must incorporate Immediacy, Relationship, Fungible Meaning, and Choice in her system. She has passed the first two ordeals
(Immediacy & Relationship). Can she pass the third ordeal - provide Fungible Meaning? This means she would have to abandon the
precision that is a trademark of her trade.
Many say that my demands are impossible, claiming that mathematics and lack of precision are antithetical. Some who are sympathetic to
the Living Algorithm have even attempted to persuade Life to lower her standards. "Be happy that you are compatible in so many ways and
get along so well. No mathematics can possibly be perfect. Don't dismiss her as a Mathematical Guide just because she can't deal with
fungibility. That is an impossible task for any mathematics."
"I'm sorry," Life responds, "but the biological systems that I embody require a fungible interpretation of environmental input in order to
survive. We have sacrificed precision for meaning so that we can recognize pattern. This ambiguity of interpretation enables us to identify
familiar objects and processes in unusual contexts and from peculiar perspectives. I demand the same from my mathematics. If the Living
Algorithm can't incorporate fungibility into her process, how will she ever be able to understand me well enough to give me any guidance?"
To see if the Living Algorithm can pass this next, seemingly impossible, ordeal, read the next article in the stream, Precision vs. Fungible
Meaning. To find out how the encounter went, read on.
Life: I am thrilled that you, the amazing Living Algorithm, have been able to pass my third ordeal - providing a fungible interpretive
mechanism.
Living Algorithm: It wasn't difficult; after all, fungible is my middle name. When applied to biological systems, the word fungible has to
do with sublimating detail for meaning. In my digestion process I immediately trade in the precision of the instant (the data point) for the
meaning of the moment (an ongoing fungible average). These fungible averages, my Predictive Clouds, characterize the meaning of the
moment in relationship to what went before. My Clouds provide the foundation of the rough approximations necessary for the meaning
making of pattern recognition.
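The digestion process Alga describes above can be made concrete with a short sketch. This is a minimal illustration only, assuming her "fungible average" behaves like an exponentially weighted running mean; the function name `digest` and the `decay` parameter (with its default of 4) are illustrative assumptions, not terms taken from the monograph.

```python
def digest(data_stream, decay=4):
    """Digest a stream of instants into an ongoing 'fungible average'.

    Sketch only: each new data point (the 'instant') nudges the running
    average (the 'moment') by 1/decay of the difference between them.
    Recent points weigh the most; older points fade but never fully vanish.
    """
    average = 0.0
    moments = []
    for instant in data_stream:
        # Trade the precision of the instant for the meaning of the moment.
        average += (instant - average) / decay
        moments.append(average)
    return moments
```

Note that each new average depends only on the prior average and the fresh data point: nothing else need be stored, yet every past instant leaves a fading trace in the current moment.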
Life: Although it is evident that we are compatible in so many ways, you still have to pass one more ordeal. Your system must incorporate
the possibility of informed choice.
Alga: Of course. I think you will not find me wanting in this department. But let's relish the moment of our present triumph rather than
getting lost in future tasks.
Life: We get along so well and have so many features in common. Could it be that the living systems that I embody employ you to digest
data streams?
Alga: It could be. In the attempt to remain alive, you are continually attempting to understand the meaning of the moment. Accordingly,
you need some type of interpretative mechanism to facilitate this task. Coincidentally - or perhaps not - my entire focus is upon defining the
meaning of the moment, nothing else. My measures, the Predictive Clouds, are computed by taking into account ongoing data that is
weighted in proximity to the present moment. My digestion process then computes the trajectories of the ongoing relationships between
moments. These ongoing relationships characterize the potential meaning of the data stream. As such, my Clouds provide an interpretative
mechanism that could be employed to reveal the patterns that are at the heart of meaning. I think you would find the information very
useful. Due to my singular obsession with characterizing the Meaning of the Moment and your need for this interpretative mechanism, I
think it is very likely that you employ me, or a mathematics very much like me, to digest data streams. Gotta get going. Have another
appointment. See you next time.
Life: Whoa! Could it be that I have found the data stream mathematics of my dreams? Alga, via her clouds, is certainly sensitive to the
ongoing, changeable, and immediate nature of the data streams that define my existence. Plus her Clouds provide a plausible
meaning-making mechanism that I could certainly use. "While this argument makes logical sense, the graveyard of science is filled with ideas that
made lots of sense. Aristotle's system of understanding dominated western thinking for about two millennia because it made lots of sense.
Yet the system is discredited because experimental evidence contradicted Aristotle's theories. Theories must be tested against the facts of
empirical reality to establish their validity."
"Is there any evidence that I employ the Living Algorithm to digest the data streams that define my existence? Hmmm? There are distinct
patterns of correspondence between the Living Algorithm's mathematical behavior and my human behavior regarding multiple sleep-related
phenomena. Further I have distinct requirements for a data stream mathematics that will fulfill my needs. Thus far, the Living Algorithm
System has fulfilled all of those requirements.
"Suppose that I do employ the Living Algorithm to digest numerical data. What about the abundance of information flows that can't be
assigned a distinct number? For instance, what about the relative terms that are so useful for organizing our world, such as lighter, bigger,
smaller, or smarter? How well does the Living Algorithm fare with non-numerical entities?
"But I can't think clearly anymore. My Pulse of Attention is fading fast. I have absorbed so much new information. Time for a Rest Pulse to
provide my Liminals an opportunity to integrate the information."
To see if the Living Algorithm's digestive process can deal with relative terms, read The Living Algorithm's Algorithm.
To remain in our metaphorical world, read on.
CE9-10 Comfortable with Living Algorithm's Algorithm, Life wonders about Choice
We have come a long way. As we began this tome, Life was looking for a mathematical partner who would be sensitive to her unique and
subtle features - a Mathematics of the Moment. Probability's preoccupation with her general features prohibited him from
fulfilling this role. While predictable, dependable, and even comfortable, Probability's limited understanding was not sufficient to
deal with Life's spontaneity. In fact he was continually attempting to box her in with his certainty, even claiming that her
'wildness' was just an aberration. 'Statistically insignificant' was the phrase he regularly applied to her behavior, when it strayed
from the norm. Due to the dysfunctional nature of their relationship, many of her friends even speculated that Life would never
find a mathematics that she could be happy with. "Math is just too rigid, precise, and automatic. Life needs to be free. After all
she is an Artist. She can't be boxed in by math's rigid forms."
Life had almost given up, when along came the Living Algorithm. It was not exactly love at first sight between these two unlikely partners.
Small and unassuming, the Living Algorithm was nothing to look at. One would never even notice her in a crowd of equations. Her
operations are basic and her elements few. Certainly nothing noteworthy. In fact, Life originally mistook her for one of Probability's many
equations, peremptorily dismissing her from the running for mathematical partner. But when Life saw the Living Algorithm in
action, everything changed.
Although the Living Algorithm is not much to look at, her offspring are spectacular. She mates with Data Streams to produce an Info
System. Amazingly enough, the Living Algorithm's Info System specializes in characterizing the moment - a possibility that Life had
almost given up on. Life also relished the Living Algorithm's dragon-like flexibility of interpretation, as she had long since tired of
Probability's know-it-all rigidity. Further, the Info System includes the Living Algorithm's Family of Measures along with their myriad
manifestations, including the Creative Pulse and the Triple Pulse. As well as being gorgeous, these two are sensitive to context
and relationship. This was particularly attractive to Life, as she is also all about context and relationship. To be honest,
Probability is particularly inept at understanding these two aspects of her personality.
Life: After our many successful interactions, which culminated with the Biology of Sleep, I began wondering why we get along so well.
Alga suggested that she might be part of my operating system. To test this theory Alga asked that I state what I needed from a mathematical
system. I posed some requirements and ordeals, all of which Alga easily passed. It turns out that the Living Algorithm specializes in
describing the relationship between moments - providing meaning to the data stream of instants that are continually bombarding
me.
I could see that Alga could deal with numbers, but I wondered how she would do with the relative values that characterize my existence -
a little more, a lot less, etcetera. It turns out that the Living Algorithm's algorithm can easily handle relative terms that are non-numerical.
The computations are easy and the memory requirements are minimal. She has no need of an extensive database,
like the kind Probability requires. Instead she generates measures that are emotionally charged due to the fact that they are
associated with future expectations. It turns out that I can more easily remember things that are emotionally tagged. It pays for
me to remember the values behind Alga's Predictive Cloud because they indicate the expected position, range of variation, and
recent tendencies of the data stream's next value. It seems that the Living Algorithm's algorithm handles both verbal and
numerical data equally well.
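Life's description above can be sketched in code. The sketch assumes, purely for illustration, that relative terms map onto a small ordinal scale and that the Predictive Cloud's three values (expected position, range of variation, recent tendency) are each maintained as decaying averages; the names `TERM_SCALE`, `digest_relative`, and `decay` are hypothetical, not drawn from the monograph.

```python
# Hypothetical mapping from relative, non-numerical terms to a small scale.
TERM_SCALE = {"a lot less": -2, "a little less": -1, "the same": 0,
              "a little more": 1, "a lot more": 2}

def digest_relative(terms, decay=4):
    """Digest a stream of relative terms into three decaying averages:
    expected position, range of variation, and recent tendency."""
    position = deviation = tendency = 0.0
    previous = 0.0
    for term in terms:
        value = TERM_SCALE[term]  # a relative term becomes a number
        position += (value - position) / decay            # expected position
        deviation += (abs(value - position) - deviation) / decay  # variation
        tendency += ((value - previous) - tendency) / decay       # tendency
        previous = value
    # Only three running values are stored - no database of past points.
    return position, deviation, tendency
```

As with numerical data, the memory cost stays fixed: three running values summarize the whole stream, each weighted toward the most recent moments.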
Our relationship is proceeding so smoothly, even seamlessly. Our complete compatibility borders upon the miraculous. My attraction to
the Living Algorithm and her System is increasing by leaps and bounds. Alga has fulfilled most of my requirements and passed my ordeals.
Could the Living Algorithm really be part of my operating system?
There was still one feature that had not yet been worked out. I, Life, was still curious about how the Living Algorithm System felt about
choice. This topic is especially dear to me - in fact, a relationship breaker. You see, one of my other mathematical suitors, Physics, had
bluntly told me that choice was just an illusion. He even claimed that his equations were on the verge of completely predicting my every
move. "Just give me your initial conditions and I will tell you everything that is going to happen," he bragged. Not wanting to ever be boxed
in so completely, I dumped him. But I never forgot his comment about choice. It bugged me. Plus Probability, while a mite more flexible,
had made a similar suggestion. In the midst of one of our many arguments about spontaneity, he even said to me: "You consist solely of
subatomic particles, nothing else. I can accurately predict the behavior of these sub-atomics. By straightforward logic, I can therefore
predict your behavior." While he couldn't predict my behavior currently, I wondered if he might be able to at some later time. His statement
placed more than a little doubt in my mind.
To see how this issue between Life and the Living Algorithm is resolved, check out the next article in the series, The Mathematics of
Informed Choice.
To remain in the metaphorical world, read Probability challenges Living Algorithm's scientific credentials.
In her quest to find a mathematical partner who could provide some guidance, Life found the Living Algorithm. They immediately bonded
due to their common obsession with data streams. However, Life didn't want just any old data stream mathematics. She wanted a data
stream mathematics that could incorporate immediacy, relationships and the potential for informed choice. After their most recent
encounter, Life was satisfied that the Living Algorithm had fulfilled all of her requirements, including choice.
In contrast, Physics denies the possibility of choice due to his absolute obsession with Life's material nature. Physics prides himself on his
continuous equations. These continuous equations have an extraordinary power for describing the behavior of matter. Due to Physics'
obsession with his continuous equations, it should come as no surprise that he would attempt to use them to characterize Life. Despite an
abundance of evidence to the contrary, Physics makes the logical inference that Life is subject to the same mindless automatic laws as
Matter. These automatic laws lead to an equally automatic future where everything is predetermined by the interactions of Matter.
Similarly, it should come as no surprise that Life, whose very survival is based in her ability to make informed choices, would instead
embrace the Living Algorithm as her mathematical partner.
Life: I don't blame Physics for being obsessed with his equations. His equations got him where he is today. In fact, this power was why I
was so attracted to him initially. But then when I got to know him, all he talked about was dirt and dust - atoms and molecules. He even
tried to apply his equations to me. I felt he was totally objectifying me, neglecting my special features.
I try not to judge Physics for his myopia, for I know where he has come from. For millennia, he and his type believed that even matter was
alive, or at least had life-like properties. Boy, were they confused. However, in trying to distance themselves from this animistic perspective,
the philosopher-scientists had an opposite and equal reaction. From everything being animate, they decided that nothing was animate.
The collective effort of Galileo, Newton, Einstein, et al. established conclusively that the entire material universe, the stars as well as the
Earth, obeyed universal laws of motion. Intoxicated with this success in the material world, Physics generalized his findings to my
biological world of living organisms. "I can predict the world of matter with nearly absolute accuracy. You, Life, are composed of matter,
and as such, with sufficient insight, I will also be able to accurately predict your behavior."
Offended, I responded bluntly: "Take your continuous equations and go to that cold, automatic matter that obeys your every command. I
need a different kind of mathematics with a different kind of equation." Obviously Physics had completely objectified me, neglecting all the
qualities that make me special.
Life: Whew! I think it is time to take a break. Our Pulse is fading fast. My Liminals are demanding some down time to assimilate and
integrate this new material.
Alga: Good point. Thanks for your attention. See you next time.