
Rachkovskij, 1993) and in one-level APNNs applied to the recognition of vowels (Rachkovskij &
Fedoseyeva, 1990, 1991), textures (Artykutsa et al., 1991; Kussul, Rachkovskij, & Baidyk, 1991b),
shapes (Kussul & Baidyk, 1990), and handprinted characters (Lavrenyuk, 1995), as well as to logical
inference (Kasatkin & Kasatkina, 1991).

5. Procedures of auto-thinning, hetero-thinning, self-exclusive thinning and notation

In sections 4.2-4.4 we considered the versions of the thinning procedure where a single vector
(the superposition of component codevectors) was the input. The corresponding pattern of activity was
present in both fields fin1 and fin2 (Figures 3, 5, 6), and therefore the input vector thinned itself. Let us
call these procedures "auto-thinning" or "auto-CDT" and denote them as

⟨u⟩_label.    (5.1)

Here u is the codevector to be thinned (usually a superposition of component codevectors), which
is present in the input fields fin1 and fin2 of Figures 3, 5, 6. The subscript label denotes a particular
configuration of thinning (a particular realization of the bundles of permutive connections). Let us note
that angle brackets are used by Plate to denote the normalization operation in HRRs (e.g., Plate, 1995;
see also section 9.1.5).

Many orthogonal configurations of permutive connections are possible, and differently labeled
CDT procedures implement different thinnings. In the algorithmic implementations (Figure 4) different
labels use different seeds. The absence of a label corresponds to some fixed configuration of thinning.
Unless otherwise specified, it is assumed that the number K of bundles is chosen to maintain the preset
density of the thinned vector ⟨u⟩, usually |⟨u⟩| ≈ M.
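
Since different labels amount to different seeds, a minimal sketch of one way a labeled configuration
could be realized is given below (Python with NumPy; the function name permutations_for_label and the
direct use of the label as a random seed are illustrative assumptions, not the original implementation):

import numpy as np

def permutations_for_label(N, K, label=0):
    # Illustrative assumption: the configuration label seeds a pseudorandom
    # generator, so each label yields its own fixed set of K permutations of
    # the N vector positions (the K "bundles" of permutive connections).
    rng = np.random.default_rng(label)
    return [rng.permutation(N) for _ in range(K)]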

∨_k (P_k u) can be expressed as Ru thresholded at 1/2, where the matrix R is the disjunction (or,
equivalently, the sum) of the K permutation matrices P_k. This, in turn, can be written as a function T(u),
so that we get

⟨u⟩ = u ∧ T(u).    (5.2)
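
A minimal sketch of the additive auto-CDT in this form, building on permutations_for_label above
(the function name auto_thin_additive and the toy sizes N, M, K are assumptions made for illustration):

def auto_thin_additive(u, perms):
    # T(u) = OR_k P_k u: disjunction of K independently permuted copies of u
    # (equivalently, R u thresholded at 1/2, with R the sum of the K
    # permutation matrices).
    t = np.zeros_like(u)
    for p in perms:
        t |= u[p]
    return u & t          # <u> = u AND T(u), cf. equation 5.2

# Toy usage: three component codevectors of M ones among N positions.
N, M = 10_000, 100
rng = np.random.default_rng(7)
def random_codevector():
    v = np.zeros(N, dtype=np.uint8)
    v[rng.choice(N, size=M, replace=False)] = 1
    return v

a, b, c = (random_codevector() for _ in range(3))
u = a | b | c                                      # composite codevector, about 3M ones
perms = permutations_for_label(N, K=13, label=1)   # K chosen here so that |<u>| comes out near M
print(int(u.sum()), int(auto_thin_additive(u, perms).sum()))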

It is possible to thin one codevector by another if the pattern to be thinned is activated in fin1
and the pattern which does the thinning is activated in fin2. Let us call such a procedure hetero-CDT or
hetero-thinning: thinning u with w. We denote hetero-thinning as

⟨u⟩^w_label.    (5.3)

Here the superscript w is the pattern that does the thinning; it is activated in fin2 of Figures 3, 5, 7. u is
the pattern which is thinned; it is activated in fin1. The subscript label is the configuration label of the
thinning. For auto-thinning, we may write ⟨u⟩ = ⟨u⟩^u.

For the additive hetero-thinning, equation 4.10 can be rewritten as

⟨u⟩^w = u ∧ (∨_k P_k w) = u ∧ T(w).    (5.4)

For the subtractive hetero-thinning, equation 4.15 can be rewritten as

⟨u⟩^w = u ∧ ¬(∨_k P_k w) = u ∧ ¬T(w).    (5.5)
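
Both variants can be sketched under the same illustrative assumptions as above (the function names
are again ours, not the paper's):

def T(w, perms):
    # T(w) = OR_k P_k w
    t = np.zeros_like(w)
    for p in perms:
        t |= w[p]
    return t

def hetero_thin_additive(u, w, perms):
    # Equation 5.4: <u>^w = u AND T(w); the density of the result grows with K.
    return u & T(w, perms)

def hetero_thin_subtractive(u, w, perms):
    # Equation 5.5: <u>^w = u AND NOT T(w); the density of the result falls with K.
    return u & (1 - T(w, perms))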

Examples.

As before, we denote the composite codevector u to be thinned by its component codevectors, e.g.
u = a ∨ b ∨ c or simply u = abc.

Auto-thinning of the composite codevector u:

⟨u⟩^u = ⟨a ∨ b ∨ c⟩^(a ∨ b ∨ c) = ⟨a ∨ b ∨ c⟩ = ⟨abc⟩^abc = ⟨abc⟩.

Hetero-thinning of the composite codevector u with codevector d:

⟨u⟩^d = ⟨a ∨ b ∨ c⟩^d = ⟨abc⟩^d.
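
In terms of the sketches above (with an extra codevector d assumed for illustration), these two
examples read:

d = random_codevector()
# Auto-thinning is hetero-thinning of u with itself: <u> = <u>^u.
assert np.array_equal(auto_thin_additive(u, perms),
                      hetero_thin_additive(u, u, perms))
# Hetero-thinning of the composite u = a | b | c with d: <abc>^d.
thinned_by_d = hetero_thin_additive(u, d, perms)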

For both the additive and the subtractive CDT procedures:

⟨abc⟩ = ⟨a⟩^abc ∨ ⟨b⟩^abc ∨ ⟨c⟩^abc.

We can also write ⟨abc⟩ = (a ∧ T(abc)) ∨ (b ∧ T(abc)) ∨ (c ∧ T(abc)). An analogous expression can be
written for composite patterns with other numbers of components. Let us note that K should be the same
for thinning of the composite pattern as a whole and for thinning of its individual components.
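
A quick numerical check of this decomposition for the additive variant, reusing the sketch definitions
above (note that the same perms, i.e. the same K and label, are used on both sides):

lhs = auto_thin_additive(u, perms)                 # <abc>
rhs = (hetero_thin_additive(a, u, perms)           # <a>^abc
       | hetero_thin_additive(b, u, perms)         # <b>^abc
       | hetero_thin_additive(c, u, perms))        # <c>^abc
assert np.array_equal(lhs, rhs)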

For the additive CDT procedure it is also true that

⟨abc⟩ = ⟨a⟩^abc ∨ ⟨b⟩^abc ∨ ⟨c⟩^abc = ⟨a⟩^a ∨ ⟨a⟩^b ∨ ⟨a⟩^c ∨ ⟨b⟩^a ∨ ⟨b⟩^b ∨ ⟨b⟩^c ∨ ⟨c⟩^a ∨ ⟨c⟩^b ∨ ⟨c⟩^c,

since for the additive procedure T(a ∨ b ∨ c) = T(a) ∨ T(b) ∨ T(c).
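
A corresponding check of this additive-only distributivity, again with the toy vectors assumed above:

full = auto_thin_additive(u, perms)
parts = np.zeros_like(u)
for x in (a, b, c):                                   # thinned component
    for y in (a, b, c):                               # thinning component
        parts |= hetero_thin_additive(x, y, perms)    # <x>^y
assert np.array_equal(full, parts)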


