Draft of paper published in:



Context-Dependent Thinning

Figure captions

Figure 1. Growth of the density p' of 1s in the composite codevectors of higher hierarchical levels (see equations 2.1 and 2.2). Here each codevector of the higher level is formed by bit disjunction of S codevectors of the preceding level. Items at each level are uncorrelated. The codevectors of base-level items are independent, with the density of 1s equal to p. The number of hierarchical levels is L. (For any given number of base-level items, the total number of 1s in the composite codevectors is obviously limited by the number of 1s in the disjunction of the base-level codevectors.)

Figure 2. Hatched circles represent patterns of active units encoding items; formed connections are plotted by arrowed lines. (A) Formation of a false assembly. When three assemblies (abd; bce; acf) are consecutively formed in a neural network by connecting all active units of the patterns encoding their items, the fourth assembly abc (grid hatching) is formed as well, though its pattern was not explicitly presented to the network. (B) Prevention of a false assembly. If each of the three assemblies is formed by connecting only subsets of the active units encoding the component items, then the connectivity of the false assembly is weak. xyz denotes the subset of units encoding item x when it is combined with items y and z. The pairwise intersections of the small circles represent the false assembly.

Figure 3. A neural-network implementation of permutive conjunctive thinning. The same N-dimensional binary pattern is activated in the input neural fields fin1 and fin2; it is a superposition of several component codevectors. fin1 is connected to fout by a bundle of direct projective connections, and fin2 is connected to fout by a bundle of permutive connections. The conjunction of the superimposed component codevectors with their permutation is obtained in the output neural field fout, where the neural threshold θ = 1.5.
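With threshold θ = 1.5 on two unit-weight inputs, an fout neuron fires only when both its direct and its permutive input are active, i.e. the field computes a bitwise AND of the pattern with a permutation of itself. A minimal sketch, with a cyclic shift standing in for the permutive bundle (an illustrative choice, not prescribed by the figure):

```python
import numpy as np

def permutive_conjunctive_thinning(z: np.ndarray, shift: int = 1) -> np.ndarray:
    """AND a binary (0/1) pattern with a fixed permutation of itself.

    np.roll plays the role of the bundle of permutive connections
    between fin2 and fout; the & is the threshold-1.5 conjunction."""
    return z & np.roll(z, shift)

rng = np.random.default_rng(0)
z = (rng.random(1000) < 0.1).astype(int)   # sparse binary input pattern
w = permutive_conjunctive_thinning(z)
print(z.sum(), w.sum())                     # thinning reduces the number of 1s
```

For a sparse pattern of density p, the expected density of the result is roughly p², which is why a single conjunction thins so aggressively.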

Figure 4. (A), (B). Algorithmic implementations of the additive version of the Context-Dependent Thinning procedure. The parameter seed defines a configuration of shift permutations. For small K, it would be useful to check that r is unique.

Figure 5. A neural-network implementation of the additive version of the Context-Dependent Thinning procedure. There are four neural fields with the same number of neurons: two input fields fin1 and fin2, the output field fout, and the intermediate field fint. The neurons of fin1 and fout are connected by a bundle of direct projective connections (1-to-1); fint and fout are connected in the same manner. The same binary pattern z (corresponding to the superimposed component codevectors) is in the input fields fin1 and fin2. The intermediate field fint is connected to the input field fin2 by K bundles of permutive projective connections. The number K of required bundles is estimated in Table 3. Only two bundles are shown here: one by solid lines and one by dotted lines. The threshold of the fint neurons is 0.5; therefore fint accumulates (by bit disjunction) various permutations of the pattern z in fin2. The threshold of fout is 1.5; hence this field performs the conjunction of the pattern z from fin1 with the pattern of K permuted and superimposed copies of z from fint. z, {z}, w correspond to the notation of Figure 4.
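The additive procedure reduces to two Boolean steps: fint (threshold 0.5) takes the disjunction of K permutations of z, and fout (threshold 1.5) conjoins that disjunction with z itself. A minimal sketch, assuming random cyclic shifts derived from a seed stand in for the K permutive bundles:

```python
import numpy as np

def additive_cdt(z: np.ndarray, K: int, seed: int = 0) -> np.ndarray:
    """Additive Context-Dependent Thinning (sketch).

    fint accumulates the bit disjunction of K permutations of z;
    fout conjoins that accumulated pattern with z."""
    rng = np.random.default_rng(seed)
    accumulated = np.zeros_like(z)
    for s in rng.integers(1, z.size, size=K):   # configuration of shift permutations
        accumulated |= np.roll(z, s)            # bit disjunction in fint
    return z & accumulated                      # threshold-1.5 conjunction in fout

rng = np.random.default_rng(1)
z = (rng.random(2000) < 0.05).astype(int)
for K in (1, 2, 4, 8):
    print(K, additive_cdt(z, K).sum())
```

As K grows, more 1s of z survive the conjunction, so K controls the density of the thinned result.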

Figure 6. A neural-network implementation of the subtractive Context-Dependent Thinning procedure. There are three neural fields with the same number of neurons: two input fields fin1 and fin2, as well as the output field fout. A copy of the input vector z is in both input fields. The neurons of fin1 and fout are connected by a bundle of direct projective connections (1-to-1). The neurons of fin2 and fout are connected by K bundles of independent permutive connections. (Only two bundles of permutive connections are shown here: one by solid lines and one by dotted lines.) Unlike Figure 5, the synapses of the permutive connections are inhibitory (the weight is -1). The threshold of the output-field neurons is 0.5. Therefore the neurons of z remaining active in fout are those for which none of the permutive connections coming from z are active. As follows from Table 4, K is approximately the same for the number S = 2,...,5 of component codevectors of a given density p.
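In Boolean terms the inhibitory connections implement an AND-NOT: a bit of z survives only if no permutation of z delivers a 1 to it. A corresponding sketch, again assuming seeded cyclic shifts as the K independent permutations:

```python
import numpy as np

def subtractive_cdt(z: np.ndarray, K: int, seed: int = 0) -> np.ndarray:
    """Subtractive Context-Dependent Thinning (sketch).

    A neuron of z stays active in fout only if none of the K inhibitory
    (weight -1) permutive inputs is active at its position."""
    rng = np.random.default_rng(seed)
    inhibition = np.zeros_like(z)
    for s in rng.integers(1, z.size, size=K):   # K independent permutations
        inhibition |= np.roll(z, s)             # disjunction of inhibitory inputs
    return z & (1 - inhibition)                 # survives only where uninhibited

rng = np.random.default_rng(2)
z = (rng.random(2000) < 0.05).astype(int)
for K in (1, 2, 4, 8):
    print(K, subtractive_cdt(z, K).sum())
```

Here increasing K removes more 1s, the opposite trend to the additive version, which is why the two versions need different values of K for a target density.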


