Figure captions
Figure 1. Growth of the density p' of 1s in the composite codevectors of higher hierarchical levels (see
equations 2.1 and 2.2). Each codevector of a higher level is formed by bit disjunction of S codevectors
of the preceding level. Items at each level are uncorrelated. The codevectors of base-level items are
independent, with the density of 1s equal to p. The number of hierarchical levels is L. (For any given
number of base-level items, the total number of 1s in a composite codevector is bounded by the number
of 1s in the disjunction of the base-level codevectors.)
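Assuming equations 2.1 and 2.2 take the standard form for the density of a disjunction of S independent codevectors, p_{l+1} = 1 - (1 - p_l)^S, the growth plotted in the figure can be reproduced with a short sketch (Python; the function name and parameter values are illustrative):

```python
def density_growth(p, S, L):
    """Density of 1s at each of L levels, where every level superimposes
    S independent codevectors of the previous level by bit disjunction:
    p_{l+1} = 1 - (1 - p_l)**S  (assumed form of equations 2.1 and 2.2)."""
    densities = [p]
    for _ in range(L - 1):
        p = 1.0 - (1.0 - p) ** S
        densities.append(p)
    return densities

print(density_growth(p=0.01, S=3, L=5))
# ~[0.010, 0.030, 0.086, 0.238, 0.557]: the density roughly triples per
# level while it is small, then saturates toward 1
```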
Figure 2. Hatched circles represent patterns of active units encoding items; formed connections are
shown by arrowed lines. (A) Formation of a false assembly. When three assemblies (abd, bce, acf) are
consecutively formed in a neural network by connecting all active units of the patterns encoding their
items, a fourth assembly abc (grid hatching) is formed as well, although its pattern was never explicitly
presented to the network. (B) Prevention of a false assembly. If each of the three assemblies is formed
by connecting only subsets of the active units encoding the component items, then the connectivity of
the false assembly is weak. xyz denotes the subset of units encoding item x when it is combined with
items y and z. The pairwise intersections of the small circles represent the false assembly.
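For concreteness, the effect in (A) and (B) can be checked numerically. The following sketch (Python/NumPy; the field size N, pattern size M, and the fraction of units kept per context are illustrative choices, not the paper's parameters) forms the three assemblies with full versus subset connectivity and measures how densely the units of the unpresented pattern abc are interconnected:

```python
import numpy as np

rng = np.random.default_rng(0)
N, M = 1000, 50                      # units per field, active units per item

def pattern():
    v = np.zeros(N, dtype=bool)
    v[rng.choice(N, M, replace=False)] = True
    return v

a, b, c, d, e, f = (pattern() for _ in range(6))

def connect(*items):
    # Hebbian step: connect all active units of the superimposed patterns
    s = np.logical_or.reduce(items)
    return np.outer(s, s)

def subset(v, frac=1/3):
    # keep a different random subset of v's units on each call,
    # mimicking a context-dependent subset
    idx = np.flatnonzero(v)
    keep = np.zeros(N, dtype=bool)
    keep[rng.choice(idx, int(len(idx) * frac), replace=False)] = True
    return keep

abc = a | b | c
# (A) full connectivity: abd, bce, acf also wire up abc almost completely
W_full = connect(a, b, d) | connect(b, c, e) | connect(a, c, f)
# (B) only context-dependent subsets of each item are connected
W_sub = (connect(subset(a), subset(b), subset(d))
         | connect(subset(b), subset(c), subset(e))
         | connect(subset(a), subset(c), subset(f)))
print(W_full[np.ix_(abc, abc)].mean())   # close to 1: strong false assembly
print(W_sub[np.ix_(abc, abc)].mean())    # much smaller: weak false assembly
```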
Figure 3. A neural-network implementation of permutive conjunctive thinning. The same N-dimensional
binary pattern, a superposition of several component codevectors, is activated in the input neural fields
fin1 and fin2. fin1 is connected to fout by a bundle of direct projective connections, and fin2 is connected
to fout by a bundle of permutive connections. The conjunction of the superimposed component
codevectors with their permuted version is obtained in the output neural field fout, where the neural
threshold is θ = 1.5.
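Since the connection weights are unitary, the threshold θ = 1.5 means that an fout neuron fires only when both its direct input and its permuted input are active. Functionally the network therefore computes a bitwise conjunction, as in this minimal sketch (Python/NumPy; N and the input density are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
N = 1000
z = rng.random(N) < 0.1        # superposition pattern, density ~0.1

perm = rng.permutation(N)      # fixed wiring of the permutive bundle
# threshold 1.5 with unit weights: fire only if the direct input AND the
# permuted input are both active
z_thinned = z & z[perm]

print(z.mean(), z_thinned.mean())   # density drops roughly from p to p**2
```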
Figure 4. (A), (B). Algorithmic implementations of the additive version of the Context-Dependent
Thinning procedure. The parameter seed defines the configuration of shift permutations. For small K,
checking that each shift r is unique would be useful.
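The algorithms themselves appear in the figure rather than in the text; the following is a hedged reconstruction of what such an additive procedure might look like, assuming the permutations are cyclic shifts by random amounts r drawn from a generator initialized with seed (the function name cdt_additive is hypothetical; z, r, w, K, and seed follow the caption and Figure 5):

```python
import numpy as np

def cdt_additive(z, K, seed=0):
    """Additive Context-Dependent Thinning (sketch): conjoin z with the
    disjunction of K randomly shifted copies of itself."""
    rng = np.random.default_rng(seed)   # seed fixes the shift configuration
    N = len(z)
    shifts = set()
    w = np.zeros(N, dtype=bool)
    while len(shifts) < K:
        r = int(rng.integers(1, N))     # random shift amount
        if r in shifts:                 # for small K, ensure r is unique
            continue
        shifts.add(r)
        w |= np.roll(z, r)              # accumulate permutations by disjunction
    return z & w                        # <z> = z AND (OR of K permuted copies)
```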
Figure 5. A neural-network implementation of the additive version of the Context-Dependent Thinning
procedure. There are four neural fields with the same number of neurons: two input fields fin1 and fin2,
the output field fout, and the intermediate field fint. The neurons of fin1 and fout are connected by a
bundle of direct projective connections (1-to-1); fint and fout are connected in the same manner. The
same binary pattern z (corresponding to the superimposed component codevectors) is in the input fields
fin1 and fin2. The intermediate field fint is connected to the input field fin2 by K bundles of permutive
projective connections. The number K of required bundles is estimated in Table 3. Only two bundles are
shown here: one by solid lines and one by dotted lines. The threshold of the fint neurons is 0.5; therefore
fint accumulates (by bit disjunction) various permutations of the pattern z in fin2. The threshold of fout
is 1.5; hence this field performs the conjunction of the pattern z from fin1 with the pattern of K permuted
and superimposed copies of z from fint. z, ⟨z⟩, and w correspond to the notation of Figure 4.
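In Boolean terms the two thresholds implement disjunction and conjunction respectively. A minimal sketch of the field dynamics (Python/NumPy; cyclic shifts stand in for the permutive bundles, and N, K, and the density are illustrative):

```python
import numpy as np

rng = np.random.default_rng(2)
N, K = 1000, 3
z = rng.random(N) < 0.1                      # pattern in f_in1 and f_in2

shifts = rng.choice(np.arange(1, N), K, replace=False)
# f_int neurons sum K permutive inputs; threshold 0.5 means "fire if any
# permuted copy of z is active here" -> bit disjunction w of K permutations
w = sum(np.roll(z, int(r)).astype(int) for r in shifts) > 0.5
# f_out neurons get one direct input from f_in1 and one from f_int;
# threshold 1.5 means "fire only if both are active" -> <z> = z AND w
z_thinned = (z.astype(int) + w.astype(int)) > 1.5
```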
Figure 6. A neural-network implementation of the subtractive version of the Context-Dependent
Thinning procedure. There are three neural fields with the same number of neurons: two input fields
fin1 and fin2, as well as the output field fout. A copy of the input vector z is in both input fields. The
neurons of fin1 and fout are connected by a bundle of direct projective connections (1-to-1). The neurons
of fin2 and fout are connected by K bundles of independent permutive connections. (Only two bundles of
permutive connections are shown here: one by solid lines and one by dotted lines.) Unlike Figure 5, the
synapses of the permutive connections are inhibitory (their weight is -1). The threshold of the output-field
neurons is 0.5. Therefore a neuron of z remains active in fout only if none of the permutive connections
reaching it from active neurons of z are active. As follows from Table 4, K is approximately the same for
S = 2, ..., 5 component codevectors of a given density p.
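The inhibitory weights turn the conjunction of Figure 5 into a set difference. A minimal sketch, under the same assumptions as the previous one (cyclic shifts as permutive bundles; all sizes illustrative):

```python
import numpy as np

rng = np.random.default_rng(3)
N, K = 1000, 3
z = rng.random(N) < 0.1

shifts = rng.choice(np.arange(1, N), K, replace=False)
# each f_out neuron gets +1 from its direct input and -1 from each active
# permutive input; threshold 0.5 keeps it active only when z is 1 there
# and no permuted copy of z inhibits it -> <z> = z AND NOT (OR of K copies)
inhibition = sum(np.roll(z, int(r)).astype(int) for r in shifts)
z_thinned = (z.astype(int) - inhibition) > 0.5
```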