Draft of paper published in:



Context-Dependent Thinning

These provide one of the reasons why composite items should be represented not by all 1s of their component codevectors, but only by a fraction of them, chosen so that the total number of 1s is approximately equal to M (i.e. only by some M representative active neurons encoding the components).
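This density-preservation argument can be checked numerically. Below is a minimal sketch (the values of N, M, and S are assumed for illustration only, not taken from the paper): plain elementwise disjunction of S component codevectors yields about S*M ones, while keeping only about M/S ones from each component keeps the composite near the target density M.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters: N-dimensional binary codevectors with M ones each,
# S components per composite item.
N, M, S = 10_000, 100, 3

# S random sparse component codevectors.
components = np.zeros((S, N), dtype=bool)
for v in components:
    v[rng.choice(N, size=M, replace=False)] = True

# Plain elementwise disjunction: the composite carries close to S*M ones,
# so the density of 1s grows with the number of components.
composite = components.any(axis=0)
print(composite.sum())  # close to S*M = 300 (minus a few random overlaps)

# Keeping only ~M/S ones from each component restores the target density.
thinned = np.zeros(N, dtype=bool)
for v in components:
    ones = np.flatnonzero(v)
    keep = rng.choice(ones, size=M // S, replace=False)
    thinned[keep] = True
print(thinned.sum())  # close to M = 100
```

Note that this random selection preserves density but, unlike the thinning procedure developed later in the paper, it is not yet context-dependent: which ones are kept does not depend on the other components.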

2.3. Ghost patterns and false memories

The well-known problem of ghost patterns, or the superposition catastrophe, was mentioned in the Introduction. It consists in the loss of information about which component codevectors belong to which particular composite codevector when several composite codevectors are in turn superimposed.

This problem is due to an essential property of the superposition operation: the contribution of each member to the superposition does not depend on the contributions of the other members. For superposition by elementwise disjunction, the representation of a in ab and in ac is the same. The result of superposing several base-level component codevectors contains only the information on which components participate, and no information about the combinations in which they meet. Therefore, if common items are constituents of several composite items, the combination of the latter generally cannot be inferred from their superposition codevector. For example, let a complex composite item consist of the base-level items a, b, c, d, e, f. How could one then determine that it really consists of the composite items abd, bce, caf, if there may also be other items, such as abc, def, etc.? In the formulation of "false" or "spurious" patterns, the superposition of the composite patterns abc and def generates the false patterns ("ghosts") abd, bce, caf, etc.
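The superposition catastrophe is easy to reproduce. In the sketch below (codevector size and density are assumed for illustration), the disjunction of the composites abc and def is exactly equal to the disjunction of the ghost composites abd, bce, caf, so the two memories cannot be told apart:

```python
import numpy as np

rng = np.random.default_rng(1)
N, M = 10_000, 100  # illustrative codevector size and number of ones

# Random sparse codevectors for the base-level items a..f.
items = {name: np.zeros(N, dtype=bool) for name in "abcdef"}
for v in items.values():
    v[rng.choice(N, size=M, replace=False)] = True

def superpose(names):
    """Composite codevector: elementwise disjunction of the components."""
    out = np.zeros(N, dtype=bool)
    for n in names:
        out |= items[n]
    return out

# Two different sets of composite items over the same six base items...
memory_1 = superpose("abc") | superpose("def")
memory_2 = superpose("abd") | superpose("bce") | superpose("caf")

# ...superimpose to the same codevector: membership information is lost,
# and the ghosts abd, bce, caf are indistinguishable from abc and def.
print(np.array_equal(memory_1, memory_2))  # True
```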

The problem of introducing "false assemblies" or "spurious memories" (unforeseen attractors) into a neural network (e.g. Kussul, 1980; Hopfield, Feinstein, & Palmer, 1983; Vedenov, 1987, 1988) has the same origin as the problem of ghosts. Training of a matrix-type associative memory is usually performed using some version of the Hebbian learning rule, implemented by superimposing in the weight matrix the outer products of the memorized codevectors. For binary connections, e.g.

Wij' = Wij ∨ xi xj ,   (2.4)

where xi and xj are the states of the i-th and the j-th neurons when the pattern x to be memorized is presented (i.e. the values of the corresponding bits of x), Wij and Wij' are the connection weights between the i-th and the j-th neurons before and after training, respectively, and ∨ stands for disjunction.
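Rule (2.4) can be sketched as follows, assuming binary weights and patterns represented as NumPy boolean arrays; the outer product xi xj is OR-ed into the weight matrix:

```python
import numpy as np

def memorize(W, x):
    """Rule (2.4): W'_ij = W_ij OR (x_i AND x_j), binary weights and patterns."""
    return W | np.outer(x, x)

N = 8  # illustrative network size
W = np.zeros((N, N), dtype=bool)

# An illustrative pattern: neurons 0, 1, 3 are active.
x = np.array([1, 1, 0, 1, 0, 0, 0, 0], dtype=bool)
W = memorize(W, x)

# Connections are set exactly between the neurons active in x.
print(W[0, 1], W[0, 3], W[0, 4])  # True True False
```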

When this learning rule is used sequentially to memorize several composite codevectors with partially coinciding components, false assemblies (attractors) may appear, that is, assemblies corresponding to composite codevectors that were never presented to the network. For example, when representations of the items abd, bce, caf are memorized, the false assembly abc (an unforeseen attractor) is formed in the network (Figure 2A). Moreover, various two-item assemblies ab, ad, etc. are also present, though they too were not explicitly presented for storing.
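The emergence of the false assembly abc can be checked with rule (2.4) on a toy network. For brevity the sketch uses one neuron per base item (a localist reduction assumed for illustration only; in the paper each item is a distributed sparse codevector):

```python
import numpy as np

# Toy network: one neuron per base item a..f.
idx = {n: i for i, n in enumerate("abcdef")}
N = len(idx)

def pattern(names):
    """Binary pattern with the neurons of the given items active."""
    x = np.zeros(N, dtype=bool)
    for n in names:
        x[idx[n]] = True
    return x

# Memorize abd, bce, caf with rule (2.4).
W = np.zeros((N, N), dtype=bool)
for composite in ("abd", "bce", "caf"):
    x = pattern(composite)
    W |= np.outer(x, x)

def is_assembly(names):
    """True if all pairs of the given neurons are mutually connected."""
    x = pattern(names)
    return bool(W[np.ix_(x, x)].all())

print(is_assembly("abd"))  # True: explicitly stored
print(is_assembly("abc"))  # True: a false assembly, never presented
print(is_assembly("def"))  # False: d and e never co-occurred
```

The pair (a,b) comes from abd, (b,c) from bce, and (a,c) from caf, so the three connections needed for the abc assembly are all present even though abc was never stored.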

The problem of introducing false assemblies can be avoided if a non-distributed associative memory is used, where the stored patterns are not superimposed and each composite codevector is placed into a separate memory word. However, the problem of false patterns, or the superposition catastrophe, still persists.

2.4. An idea of the thinning procedure

A systematic use of distributed representations provides the prerequisite for solving both the problem of codevector density growth and the superposition catastrophe. The idea of the solution consists in including into the representation of a composite item not the full sets of 1s encoding its component items, but only subsets of them. If we choose the fraction of 1s taken from each component codevector so that the number of 1s in the codevector of a composite item is equal to M, then the density of 1s will be preserved in codevectors of various complexity. For example, if S=3 items of level l comprise an item of level l+1, then approximately M/3 of the 1s should be preserved from each codevector of the l-th level. Then the codevector of level l+1 will have approximately M 1s. If two items of level l+1 comprise an item of level l+2, then approximately M/2 of the 1s should be preserved from each codevector of level l+1. Thus the low number M of 1s in the codevectors of composite items of various complexity is maintained, and


