different from those familiar in simpler chemical or physical
processes. The ∂S_G/∂K are analogous to thermodynamic
forces in a chemical system, and may be subject to override by
external physiological or other driving mechanisms: biological
and cognitive phenomena, unlike simple physical systems, can
make choices as to resource allocation.
That is, an essential contrast with simple physical systems
driven by (say) entropy maximization is that complex bio-
logical or cognitive structures can make decisions about re-
source allocation, to the extent resources are available. Thus
resource availability is a context, not a determinant, of be-
havior.
Equations (22) and (23) can be derived in a simple
parameter-free covariant manner which relies on the under-
lying topology of the information source space implicit to the
development (e.g., Wallace and Wallace, 2008b). We will not
pursue that development here.
The dynamics, as we have presented them so far, have
been noiseless, while biological systems are always very noisy.
Equation (23) might be rewritten as
$$\frac{dK_j}{dt} = \sum_i L_{j,i}\,\frac{\partial S_G}{\partial K_i} + \sigma W(t)$$
where σ is a constant and W(t) represents white noise. This
leads directly to a family of classic stochastic differential equa-
tions having the form
$$dK^j_t = L_j(t, K)\,dt + \sigma_j(t, K)\,dB_t,
\tag{24}$$

where the $L_j$ and $\sigma_j$ are appropriately regular functions of
$t$ and $K$, $dB_t$ represents the noise structure, and we have
readjusted the indices.
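As a purely illustrative aside, equations of the form (24) can be explored numerically with a standard Euler–Maruyama discretization. The sketch below is not part of the formal development: the gradient function, the matrix of Onsager-like coefficients, and the noise amplitude are hypothetical placeholders chosen only to show the mechanics.

```python
import numpy as np

def euler_maruyama(grad_S, L, sigma, K0, dt=1e-3, n_steps=10_000, seed=0):
    """Integrate dK_t = L grad_S(K) dt + sigma dB_t (illustrative sketch).

    grad_S : callable returning the vector dS_G/dK at K
    L      : (n, n) array of Onsager-like coefficients (taken constant here)
    sigma  : scalar or length-n array of noise amplitudes
    K0     : initial state vector
    """
    rng = np.random.default_rng(seed)
    K = np.array(K0, dtype=float)
    path = np.empty((n_steps + 1, K.size))
    path[0] = K
    for t in range(n_steps):
        drift = L @ grad_S(K)                    # deterministic Onsager-like term
        noise = sigma * rng.normal(size=K.size)  # Brownian increment, scaled below
        K = K + drift * dt + noise * np.sqrt(dt)
        path[t + 1] = K
    return path

# Toy case: quadratic S_G = -|K|^2 / 2, so grad_S(K) = -K (hypothetical choice).
if __name__ == "__main__":
    path = euler_maruyama(lambda K: -K, L=np.eye(2), sigma=0.3, K0=[1.0, -1.0])
    print(path[-1])
```

With a positive definite L and zero noise the iteration simply climbs the gradient of S_G; the stochastic term perturbs that climb, which is the feature exploited below.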
Further progress in this direction requires introduction of
methods from stochastic differential geometry and related
topics in the sense of Emery (1989). The obvious inference
is that noise - not necessarily ‘white’ - can serve as a tool
to shift the system between various topological modes, as a
kind of crosstalk and the source of a generalized stochastic
resonance.
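A minimal caricature of that inference, again purely hypothetical, takes a one-dimensional S_G with two maxima standing in for distinct topological modes: at small noise amplitude the noisy gradient ascent stays confined to one mode, while larger σ drives repeated transitions between them, the crudest version of the generalized stochastic resonance suggested above.

```python
import numpy as np

def count_switches(sigma, dt=1e-3, n_steps=200_000, seed=1):
    """Count crossings between the two maxima of S_G(K) = -(K^2 - 1)^2 / 4.

    dS_G/dK = -K (K^2 - 1), so the noiseless gradient ascent has stable
    states at K = +1 and K = -1 (two 'modes') separated by a barrier at K = 0.
    Purely illustrative; the functional form is invented for the example.
    """
    rng = np.random.default_rng(seed)
    K, switches, side = 1.0, 0, 1
    for _ in range(n_steps):
        drift = -K * (K * K - 1.0)
        K += drift * dt + sigma * np.sqrt(dt) * rng.normal()
        if K * side < 0:          # crossed the barrier at K = 0
            switches += 1
            side = -side
    return switches

for sigma in (0.2, 0.4, 0.8):
    print(sigma, count_switches(sigma))
```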
Effectively, topological shifts between and within dynamic
manifolds constitute another theory of phase transitions (Pet-
tini, 2007), and this phenomenological Onsager treatment
would likely be much enriched by explicit adoption of a Morse
theory perspective.
18.4 The Tuning Theorem
Messages from an information source, seen as symbols xj from
some alphabet, each having probabilities Pj associated with
a random variable X, are ‘encoded’ into the language of a
‘transmission channel’, a random variable Y with symbols
y_k, having probabilities P_k, possibly with error. Someone
receiving the symbol y_k then retranslates it (without error)
into some x_k, which may or may not be the same as the x_j
that was sent.

More formally, the message sent along the channel is characterized by a random variable X having the distribution

$$P(X = x_j) = P_j, \quad j = 1, \ldots, M.$$

The channel through which the message is sent is characterized by a second random variable Y having the distribution

$$P(Y = y_k) = P_k, \quad k = 1, \ldots, L.$$

Let the joint probability distribution of X and Y be defined as

$$P(X = x_j, Y = y_k) = P(x_j, y_k) = P_{j,k}$$

and the conditional probability of Y given X as

$$P(Y = y_k \mid X = x_j) = P(y_k \mid x_j).$$

Then the Shannon uncertainty of X and Y independently and the joint uncertainty of X and Y together are defined respectively as

$$H(X) = -\sum_{j=1}^{M} P_j \log(P_j)$$
$$H(Y) = -\sum_{k=1}^{L} P_k \log(P_k)$$
$$H(X,Y) = -\sum_{j=1}^{M}\sum_{k=1}^{L} P_{j,k} \log(P_{j,k}).
\tag{25}$$

The conditional uncertainty of Y given X is defined as

$$H(Y \mid X) = -\sum_{j=1}^{M}\sum_{k=1}^{L} P_{j,k} \log[P(y_k \mid x_j)].
\tag{26}$$
For any two stochastic variates X and Y, H(Y) ≥ H(Y|X),
as knowledge of X generally gives some knowledge of Y.
Equality occurs only in the case of stochastic independence.
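These definitions are easy to check numerically. In the sketch below the joint distribution P_{j,k} is invented purely for illustration; the script evaluates equations (25) and (26) for it and confirms both the inequality H(Y) ≥ H(Y|X) and the standard chain-rule identity H(X,Y) = H(X) + H(Y|X).

```python
import numpy as np

# Hypothetical joint distribution P_{j,k} (rows: x_j, columns: y_k);
# the numbers are invented for the example and sum to 1.
P = np.array([[0.40, 0.10],
              [0.05, 0.45]])

Px = P.sum(axis=1)                 # marginal P(X = x_j)
Py = P.sum(axis=0)                 # marginal P(Y = y_k)

def H(p):
    """Shannon uncertainty -sum p log p (base 2, so the result is in bits)."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

H_X, H_Y, H_XY = H(Px), H(Py), H(P.ravel())

# Conditional uncertainty, directly from equation (26).
cond = P / Px[:, None]             # P(y_k | x_j)
H_Y_given_X = -np.sum(P * np.log2(cond))

print(f"H(X) = {H_X:.3f}  H(Y) = {H_Y:.3f}  H(X,Y) = {H_XY:.3f}  H(Y|X) = {H_Y_given_X:.3f}")
print("H(Y) >= H(Y|X):", H_Y >= H_Y_given_X)   # equality only under independence
print("Chain rule H(X,Y) = H(X) + H(Y|X):", np.isclose(H_XY, H_X + H_Y_given_X))
```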