Dendritic Inhibition Enhances Neural Coding Properties



[Figure 3 appears here: eleven panels, each pairing a six-element input pattern with the six-node network's response, as described in the caption below.]
Figure 3: Representing multiple, overlapping, input patterns. A network consisting of six nodes and
six inputs (‘a’, ‘b’, ‘c’, ‘d’, ‘e’, and ‘f’) is wired up so that nodes receive input from patterns ‘a’, ‘ab’,
‘abc’, ‘cd’, ‘de’, and ‘def’. The response of the network to each of these input patterns is shown on the
top row. Pre-integration lateral inhibition (lateral weights have been omitted from the figures) enables
each node to respond exclusively to its preferred pattern. In addition, the response to multiple and partial
patterns is shown on the bottom row. Pattern ‘abcd’ causes the nodes representing ‘ab’ and ‘cd’ to be active
simultaneously, despite the fact that this pattern overlaps strongly with pattern ‘abc’. Input ‘abcde’ is parsed
as ‘abc’ together with ‘de’, and input ‘abcdef’ is parsed as ‘abc’ + ‘def’. Input ‘abcdf’ is parsed as ‘abc’ +
two-thirds of ‘def’, hence the addition of ‘f’ to the pattern ‘abcd’ radically changes the representation that
is generated. Input ‘bcde’ is parsed as two-thirds of ‘abc’ plus pattern ‘de’. Input ‘acef’ is parsed as ‘a’ +
one half of ‘cd’ + two-thirds of pattern ‘def’.

on the frequency with which nodes, or pairs of nodes, are active. Such a network can thus respond
appropriately to any combination of input patterns; for example, it can directly solve the problem of repre-
senting any arbitrary combination of the inputs ‘a’, ‘b’ and ‘c’. A more challenging problem is shown in
figure 3. Here nodes represent six overlapping patterns. The network responds correctly to each of these
patterns and to multiple, overlapping, patterns (even in cases where only partial patterns are presented).

Ambiguity

In some circumstances there simply is no correct parsing of the input pattern. Consider a neural network
with two nodes and three inputs (‘a’, ‘b’ and ‘c’). If one node represents the pattern ‘ab’ and the other
represents the pattern ‘bc’ then the input ‘b’ is ambiguous since it equally matches the preferred input of
both nodes. In this situation, most implementations of post-synaptic lateral inhibition would allow one
node, chosen at random, to be active at half its normal strength. An alternative implementation (Marshall,
1995) is to use weaker lateral weights to enable both nodes to respond with one-quarter of the maximum

[Figure 4 appears here. The recoverable panel data, pairing each input pattern with the two nodes' responses:

  Input (abc) | Response (node 1, node 2)
  000         | 0,   0
  001         | 0,   0.5
  010         | 0,   0
  011         | 0,   1
  100         | 0.5, 0
  101         | 0.5, 0.5
  110         | 1,   0
  111         | 0.5, 0.5]

Figure 4: Representing ambiguous input patterns. A network consisting of two nodes and three inputs (‘a’,
‘b’, and ‘c’) is wired up so that the first node receives input from ‘ab’ and the second node receives input
from ‘bc’ (all weights have a value of ½). The response of the network to each possible pattern of inputs
is shown. Pre-integration lateral inhibition (lateral weights have been omitted from the figures) suppresses
any response to pattern ‘b’ (010) which overlaps equally with each node’s preferred input pattern. Similarly,
when the input is ‘abc’ the ambiguous contribution from input ‘b’ is suppressed so that both nodes respond
at half strength. It can be seen that in other conditions each node responds at half strength when the input
matches half its preferred input, and at full strength when its preferred input is presented.
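The behaviour described in this caption can be reproduced with a small simulation. The sketch below is an illustrative formalization, not the paper's exact activation rule: it assumes each input line to a node is attenuated in proportion to the strongest competing node's weighted activity on that line, i.e. y_j = Σ_i w_ji x_i · max(0, 1 − α·max_{k≠j} w_ki y_k), with the inhibition strength α annealed upward and damped updates used to reach a steady state. The parameter values (α, step count, damping) are assumptions chosen for illustration.

```python
import numpy as np

def respond(W, x, alpha_max=50.0, steps=300, damping=0.1):
    """Approximate steady-state response of a network with pre-integration
    (dendritic) lateral inhibition.

    W[j, i] is the weight from input i to node j.  Each input line to node j
    is attenuated by the strongest competing claim on that line,
    alpha * max over k != j of W[k, i] * y[k].  Annealing alpha from zero and
    damping the updates are assumptions of this sketch, used to avoid
    oscillation; they are not details taken from the text.
    """
    n_nodes, n_inputs = W.shape
    y = np.zeros(n_nodes)
    for t in range(steps):
        alpha = alpha_max * (t + 1) / steps  # gradually strengthen inhibition
        y_new = np.zeros(n_nodes)
        for j in range(n_nodes):
            total = 0.0
            for i in range(n_inputs):
                competing = [W[k, i] * y[k] for k in range(n_nodes) if k != j]
                inhibition = alpha * max(competing, default=0.0)
                total += W[j, i] * x[i] * max(0.0, 1.0 - inhibition)
            y_new[j] = total
        y = (1.0 - damping) * y + damping * y_new  # damped update
    return y

# The two-node, three-input network of figure 4: node 1 prefers 'ab',
# node 2 prefers 'bc'; every non-zero weight is 1/2.
W = np.array([[0.5, 0.5, 0.0],
              [0.0, 0.5, 0.5]])

for pattern in ([0, 1, 0], [1, 1, 0], [1, 1, 1], [1, 0, 0]):
    print(pattern, np.round(respond(W, np.array(pattern, dtype=float)), 2))
```

With these assumed parameters the simulation reproduces the caption's description: the ambiguous input ‘b’ (010) is suppressed to near zero, ‘ab’ (110) drives node 1 to full strength while node 2 is silenced, ‘abc’ (111) leaves both nodes at roughly half strength, and ‘a’ (100) gives a half-strength response from node 1 alone.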


