Draft of paper published in:

Neural Computation (2001) v. 13 n. 2, pp. 411-452

Binding and Normalization of Binary Sparse Distributed
Representations by Context-Dependent Thinning

Dmitri A. Rachkovskij

V. M. Glushkov Cybernetics Center

Pr. Acad. Glushkova 40

Kiev 03680

Ukraine

[email protected]
(Preferred contact method)


Ernst M. Kussul

Centro de Instrumentos
Universidad Nacional Autonoma de Mexico

Apartado Postal 70186

04510 Mexico D.F.

Mexico

[email protected]


Keywords: distributed representation, sparse coding, binary coding, binding, variable binding,
representation of structure, structured representation, recursive representation, nested representation,
compositional distributed representations, connectionist symbol processing.

Abstract

Distributed representations have often been criticized as inappropriate for encoding data with a complex structure. However, Plate's Holographic Reduced Representations and Kanerva's Binary Spatter Codes are recent schemes that allow on-the-fly encoding of nested compositional structures by real-valued or dense binary vectors of fixed dimensionality.

In this paper we consider the Context-Dependent Thinning procedures, which were developed for the representation of complex hierarchical items in the architecture of Associative-Projective Neural Networks. These procedures provide binding of items represented by sparse binary codevectors (with a low probability of 1s). Such an encoding is biologically plausible and allows a high storage capacity of the distributed associative memory in which the codevectors may be stored.

In contrast to known binding procedures, Context-Dependent Thinning preserves the same low density (or sparseness) of the bound codevector for a varying number of component codevectors. Moreover, a bound codevector is not only similar to other bound codevectors with similar components (as in other schemes), but it is also similar to the component codevectors themselves. This allows the similarity of structures to be estimated simply from the overlap of their codevectors, without retrieval of the component codevectors. It also allows easy retrieval of the component codevectors.

Examples of algorithmic and neural-network implementations of the thinning procedures are
considered. We also present representation examples for various types of nested structured data
(propositions using role-filler and predicate-arguments representation schemes, trees, directed acyclic
graphs) using sparse codevectors of fixed dimension. Such representations may provide a fruitful
alternative to the symbolic representations of traditional AI, as well as to the localist and microfeature-
based connectionist representations.
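
The abstract claims that Context-Dependent Thinning binds several sparse binary codevectors into a single codevector whose density stays low and which remains similar to each of its components. The Python sketch below illustrates one way such behavior can arise: components are superimposed by OR and the superposition is then thinned by ANDing it with permuted copies of itself. This is only a schematic illustration motivated by the abstract, not the paper's exact procedure; the function name cdt_bind, the NumPy setup, and the parameter values are our own assumptions.

    import numpy as np

    rng = np.random.default_rng(0)
    N = 10000   # codevector dimensionality
    M = 100     # number of 1s in each sparse component codevector

    def random_codevector():
        # Sparse binary codevector with exactly M ones.
        v = np.zeros(N, dtype=bool)
        v[rng.choice(N, size=M, replace=False)] = True
        return v

    # Fixed random permutations shared by all binding operations.
    PERMS = [rng.permutation(N) for _ in range(50)]

    def cdt_bind(components, target_ones=M):
        # Superimpose the components by OR, then thin the superposition by
        # ANDing it with permuted copies of itself until roughly
        # target_ones bits remain set.
        z = np.logical_or.reduce(components)
        thinned = np.zeros(N, dtype=bool)
        for p in PERMS:
            thinned |= z & z[p]
            if thinned.sum() >= target_ones:
                break
        return thinned

    a, b, c = (random_codevector() for _ in range(3))
    bound = cdt_bind([a, b, c])
    print("density of bound codevector:", bound.sum())    # close to M despite 3 components
    print("overlap with component a:", (bound & a).sum()) # bound vector shares 1s with a

Under this reading, the thinned result keeps roughly the density of a single component while still sharing 1s with every component, which is what allows the similarity of structures to be estimated directly from the overlap of their codevectors.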


