Content-addressable memory with spiking neurons

Innovationskolleg Theoretische Biologie, Humboldt-Universität zu Berlin, Invalidenstrasse 43, 10115 Berlin, Germany
Time evolution and cooperative phenomena of networks with spiking model neurons are analyzed with emphasis on systems that could be used as content-addressable memories. Stored memories are represented by distributed patterns of neural activity where single cells either fire periodically or remain quiescent. Two distinct mechanisms to generate relaxation behavior toward such periodic solutions are investigated: delayed feedback and subthreshold oscillations. Using theoretical analysis and numerical simulations it is shown that in both cases model networks with integrate-and-fire neurons possess storage capabilities similar to those of conventional associative neural networks. [S1063-651X(99)09202-8]

PACS number(s): 87.10.+e, 05.20.-y, 07.05.Mh, 64.60.Cn
I. INTRODUCTION

Recurrent neural networks may be programmed to function as content-addressable memories that recover stored patterns from incomplete or noisy inputs. To do so, correlations within patterns to be memorized are encoded in the synaptic weights. By this procedure, multiple patterns can be implemented as fixed-point attractors of the network dynamics. Starting from an initial state close to one of the stored attractors, the system dynamics relaxes to this attractor and thus retrieves the stored pattern [1-3].

Various brain regions have been hypothesized to operate as content-addressable memories, for example, the CA3 region in the hippocampus and locally connected systems of pyramidal cells in neocortical association areas [4-6]. Traditionally, these systems have been modeled using coarse-grained dynamical descriptions based on short-time averaged firing rates. This approach leads either to models with continuous time and real-valued state variables ("graded-response neurons") [2,3] or to systems with discrete time and binary [1,7] or real-valued state variables [8,9]. Extensive theoretical results regarding the convergence properties and storage capacity of such "attractor neural networks" have been derived [10-12].

Most cortical neurons communicate using discrete pulses of electrical activity, called "action potentials" or "spikes." Firing-rate models neglect this characteristic feature of neural signals. It is, therefore, important to compare the collective properties of biologically more realistic approaches with those of the traditional network models. Previous studies have demonstrated that systems with spiking model neurons offer computational possibilities not shared by models based on firing rates, such as computations based on relative spike times [13-18]. On the other hand, it has also been shown that for stationary solutions, firing-rate descriptions cover a large class of networks with spiking model neurons [19]. The apparent discrepancy between these two lines of research is resolved if one realizes that results based on the dynamics of specific models can only demonstrate the richness of phenomena generated by that very class of model. Other models may or may not exhibit the same phenomena. Results obtained with specific models are nevertheless extremely useful to prove the feasibility of computational mechanisms. It is in this spirit that we investigate associative capabilities of model networks with integrate-and-fire neurons.

This study focuses on systems in which stored binary memories are represented by distributed patterns of neural activity where single cells either fire periodically (the "on" state) or remain completely quiescent (the "off" state). Two different mechanisms to achieve such behavior are investigated in detail: delayed feedback within the model network and externally generated subthreshold oscillations. We demonstrate that both mechanisms give rise to collective properties that are almost identical with those of the Little model [7], one of the classical attractor neural networks. Since periodic membrane oscillations due to rhythmic background activity are typical for various brain regions, our results indicate that such oscillations may play a beneficial role in content-addressable memory processes [20]. The results also show that although the microscopic time evolution of integrate-and-fire neurons strongly differs from that of graded-response neurons or binary neurons, the collective network dynamics may nevertheless exhibit rather similar behavior.

II. MODEL SYSTEMS

Biological neurons generate an action potential when their cell body (soma) is sufficiently depolarized. The action potential then propagates along the axon to synapses on the dendritic trees of downstream (postsynaptic) neurons. When the action potential arrives at a synapse it initiates the release of neurotransmitter which leads to a flow of ionic currents that depolarize or hyperpolarize the postsynaptic cell. Depending on the integrated inputs from the many thousand cells a cortical neuron is typically connected with, the postsynaptic neuron will in turn fire an action potential at some later time and thus influence further neurons. Feedback through recurrent connections results in complicated spatiotemporal dynamics.

Model neural networks are constructed to capture important features of these intricate dynamical processes. In order to allow for an analysis of the collective behavior of large feedback networks, the microscopic dynamics has to be simplified as much as possible. Individual neurons are therefore often considered to be electrotonically compact. Within this
class of model, "integrate-and-fire" neurons [16,21-29] are characterized by a particularly simple description in that the state of each neuron i (1 ≤ i ≤ N) is modeled by a single dynamical variable u_i, the membrane potential at the soma. If the membrane potential u_i is below the firing threshold u_thresh, integrate-and-fire neurons operate as leaky integrators,

  C du_i(t)/dt = -R^(-1) [u_i(t) - u_0] + I_i(t).   (1)

The capacitance C and the resistance R of the cell membrane determine the time scale of the dynamics; in the absence of an input current I_i(t), the membrane potential u_i(t) approaches its rest value u_0.

Within the mathematical formulation, the term u_0/R can be absorbed in I_i(t) and we will therefore focus on the case u_0 = 0 without loss of generality. By rescaling time t and input current I_i, the capacitance C and resistance R can be eliminated from Eq. (1),

  du_i(t)/dt = -u_i(t) + I_i(t).   (2)

When the membrane potential of an integrate-and-fire model neuron reaches the firing threshold u_thresh, the cell generates a uniform action potential modeled as a δ pulse, and the membrane potential is reset. The output of an integrate-and-fire neuron is thus just a sequence of δ pulses. For convenience, units can be chosen such that u_thresh = 1 and the reset value is 0. Since the reset is assumed to be instantaneous, the membrane potential u_i takes two values when cell i generates an action potential at time t. Where necessary, these two values will be denoted by u_i(t-) and u_i(t+). To simplify the notation, temporal arguments otherwise always refer to the time immediately prior to firing. Special care has to be taken in systems with δ-shaped input currents that may cause a cell to generate an action potential at time t even if, immediately prior to that instant, its membrane potential is strictly below the firing threshold. In particular, a positive input of the form I δ(t - t~) will cause an action potential in a subthreshold cell if I ≥ 1 - u_i(t~-). This example shows that in models with integrate-and-fire neurons one has to take the dynamical consequences of δ-shaped postsynaptic currents into account.

The reset process can be formally included in the time evolution (2) by an additional effective current I_reset that guarantees that the membrane potential u_i is reset to zero immediately after neuron i generates an action potential. Denoting input currents to cell i from other neurons in the modeled network by I_network and input currents due to external stimuli and background activity by I_external, we get

  du_i(t)/dt = -u_i(t) + I_i^network(t) + I_i^external(t) + I_i^reset(t),   (3)

with

  I_i^reset(t) = -Σ_f u_i(t_{i,f}-) δ(t - t_{i,f}).   (4)

Here, t_{i,f} with f ∈ N are the times when neuron i generates an action potential. The current (4) takes the convention mentioned above into account and guarantees that the membrane potential of cell i is reset to zero.

From a biological point of view, Eq. (4) describes a situation where ionic currents due to spike generation are so strong that they override any simultaneous input currents. Consistent with this picture, currents caused by previous synaptic inputs will be set to zero when a cell emits an action potential. Backpropagating action potentials [30] could provide a biophysical mechanism for this phenomenon in biological neurons. In order to properly formulate this characteristic of the dynamics, let us now focus on the postsynaptic currents.

If, say, neuron j spikes, an action potential travels along its axon to other neurons. When it arrives at a synapse with neuron i, neurotransmitter is released and triggers a postsynaptic current in cell i. The shape of the current resulting at the soma is denoted by α(τ) where τ measures the time since the action potential arrived at the synapse. The functional form of α(τ) may thus be used to describe the effects of synaptic transmission and/or passive dendritic conduction, with α(τ) = 0 for τ < 0. The input to cell i from other neurons of the network is then

  I_i^network(t) = Σ_j T_ij Σ_f α(t - τ_ax - t_{j,f}) Θ((t_{j,f} + τ_ax) - t_{i,l}(t)),   (5)

where t_{i,l}(t) is the last firing time of neuron i before t, and Θ(x) is the theta function, i.e., Θ(x) = 0 for x < 0 and Θ(x) = 1 for x ≥ 0. The term τ_ax denotes the axonal delay from neuron j to neuron i. The factor Θ((t_{j,f} + τ_ax) - t_{i,l}(t)) assures that currents due to previous synaptic inputs are reset to zero when neuron i fires. Unless stated otherwise, the kernels α(τ) are normalized according to

  ∫_0^∞ α(τ) dτ = 1,   (6)

so that the coefficients T_ij are equivalent to the total integrated synaptic strength from neuron j to neuron i.

In the following two sections we study the associative capabilities of specific realizations of this general model. First we consider systems with no oscillations of the background activity and with postsynaptic currents modeled as δ pulses. We then turn to systems with periodic background oscillations and postsynaptic currents modeled as rectangular pulses or differences of two exponential functions.

III. MODEL WITH CONSTANT BACKGROUND ACTIVITY AND FAST SYNAPTIC CURRENTS

To capture the effects of randomly impinging synaptic inputs known to put cortical neurons close to firing threshold under in vivo conditions [31], all model neurons receive a constant positive background current I_external(t) = I_B > 0 which is slightly less than one so that, without recurrent or further external input, the membrane potentials relax to a level below but close to the firing threshold, lim_{t→∞} |u_thresh - u_i(t)| = 1 - I_B << 1.
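The subthreshold relaxation toward u = I_B described by Eq. (2) is easy to visualize in a short simulation. The following Python fragment is our own illustrative sketch, not code from the paper; the value I_B = 0.95 and the step size are arbitrary choices.

```python
def simulate_lif(i_b=0.95, u0=0.0, dt=1e-3, t_max=10.0):
    """Euler integration of Eq. (2), du/dt = -u + I_B, in rescaled units
    where the membrane time constant and the threshold u_thresh = 1."""
    u = u0
    for _ in range(int(t_max / dt)):
        u += dt * (-u + i_b)   # leaky integration toward the fixed point u = I_B
    return u

u_final = simulate_lif()
# With I_B < u_thresh = 1 the neuron relaxes close to threshold but never fires:
print(u_final)   # close to I_B = 0.95
```

The exact solution u(t) = I_B [1 - exp(-t)] is recovered up to the small Euler discretization error.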
Postsynaptic currents are approximated by δ pulses,

  α(τ) = δ(τ),   (7)

which is justified if synaptic time constants are short compared to the membrane time constant.

Let us now investigate how such a system responds to an external stimulus pattern presented to the network at time t_0 = 0. We assume input patterns that raise the membrane potentials of some neurons above firing threshold whereas the membrane potentials of all other neurons remain at their resting level. The group of neurons receiving suprathreshold positive input is denoted by G_0.

The time evolution of the network can be readily understood by the following step-by-step consideration. All neurons within the group G_0 fire action potentials at time t_0 = 0. Immediately afterwards, their membrane potentials are reset to u_i = 0. At time t = τ_ax, the action potentials arrive at the postsynaptic neurons. Due to the constant background current I_external(t) = I_B in Eq. (2), the membrane potentials of the neurons that fired have recovered to u_i = I_B [1 - exp(-τ_ax)] at that time. Neurons that did not fire at t_0 = 0 still hover at the fixed point u = I_B. If the axonal delay τ_ax is large compared to one (the membrane time constant), all u_i are thus again just below threshold. The arrival of the action potentials fired by neurons j ∈ G_0 at time t_0 = 0 then triggers postsynaptic currents which instantaneously change the membrane potential of neuron i by Σ_{j∈G_0} T_ij. For sufficiently large τ_ax, neuron i will therefore fire an action potential at time τ_ax if Σ_{j∈G_0} T_ij is positive, and will stay quiescent if the sum is negative. Repeating this argument shows that at times t_k = k τ_ax, where k ∈ N, certain sets G_k of neurons are active. If we denote a neuron i that is active by A_i = 1 and a neuron that stays quiescent by A_i = 0, the time evolution takes the form

  A_i(t_{k+1}) = Θ( Σ_j T_ij A_j(t_k) ).   (8)

This time evolution is identical to the update rule of the Little model [7] with a 0/1-representation. The result implies that the group of neurons that fires action potentials at time k τ_ax in the integrate-and-fire model is the same group that is active (A_i = 1) in the kth iteration of a Little model [7] with identical initial conditions (same group G_0 of neurons with A_i = 1 at time t_0) and identical couplings T_ij. It follows that both networks retrieve the same pattern and that eventually the group of neurons which fire remains unchanged. The duration of the transient phase depends on the number of patterns stored in the network and whether the dynamics relaxes to one of the stored patterns or to a spurious attractor [10-12]. Note that the Little model literally reaches a fixed point, i.e., the activity A_i of cell i approaches a constant value, one or zero, whereas in the present network the binary pattern is represented by a τ_ax-periodic firing pattern.

In the above argument, a large axonal delay τ_ax is required for two different reasons. First, τ_ax >> 1 guarantees that a drop of u_i due to a negative total recurrent input at time (k-1) τ_ax has decayed sufficiently by time k τ_ax, so that u_i is again just below threshold. Second, a large delay is also required to ensure that the constant input current I_B has raised the membrane potential sufficiently close to threshold by time k τ_ax for a neuron which fired and was reset to u_i = 0 at time (k-1) τ_ax. Note that the second requirement can be avoided for arbitrary τ_ax by simply adding positive self-couplings T_ii = I_B exp(-τ_ax). This has been done in the numerical simulations.

It is known from analytical studies and numerical simulations that one can store an extensive number (p ∝ N) of random patterns in the Little model [10-12]. The critical storage level α_c = p_c/N is reached for α_c ≈ 0.14 [32]. To be precise, this result holds for a Little (or Hopfield) model with ±1-representation ("on" = +1, "off" = -1) whose dynamics is given by

  S_i(k+1) = sgn( Σ_j T_ij S_j(k) ).   (9)

In the Little model, all neurons are updated in parallel; in the Hopfield model [1], Eq. (9) is applied to only one neuron at a time, chosen in a serial or random sequential order. In both models, the synaptic weights T_ij are given by the Hebb rule

  T_ij = N^(-1) Σ_{μ=1}^p ξ_i^μ ξ_j^μ for i ≠ j,  T_ii = 0,   (10)

where the components ξ_i^μ of the p stored patterns are drawn with equal probability from {-1,1}, i.e., Prob(ξ_i^μ = 1) = Prob(ξ_i^μ = -1) = 1/2.

Below α_c, it is known that the system remains near a stored pattern if it is initialized with that pattern, i.e., the overlap m(t),

  m(t) = N^(-1) Σ_i ξ_i^μ S_i(t),   (11)

remains at a value that is close to one for this particular pattern. At the critical storage level, m ≈ 0.97 so that about 1.5% of the bits of a pattern are flipped, i.e., not retrieved correctly [10-12]. Above α_c, the system tends to relax to states that are only vaguely reminiscent of the initial pattern, m ≈ 0.35. Strictly speaking, the phase transition at α_c occurs only in the limit N → ∞; however, the qualitative change between the behavior below α_c and above α_c can also be seen in finite systems and may be used to determine the storage capacity in numerical studies: below α_c, the probability to remain near a stored pattern increases with system size; above α_c it decreases [33]. As in previous studies with systems with two-state or graded-response neurons, we will use this phenomenon to determine the storage capacity of networks with integrate-and-fire neurons.

Integrate-and-fire neurons that do not fire do not have any influence on the state of the other model neurons. This implied the 0/1-representation in Eq. (8). However, most analytical results on the Little model have been obtained for the ±1-representation Eq. (9). In order to compare the collective dynamics of the present model with those results, we therefore transform the ±1-representation into the 0/1 representation.
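The discrete dynamics realized at times k τ_ax, i.e., the parallel 0/1 update A_i(k+1) = Θ(Σ_j T_ij A_j(k)) with Hebbian couplings (10), can be sketched in a few lines of Python. This is our own illustration; the tiny network size and the hand-picked pattern are not from the simulations reported here. For a single stored pattern, the corresponding 0/1 activity vector is a fixed point of the parallel update:

```python
import numpy as np

N = 10
xi = np.array([1] * 5 + [-1] * 5)   # one stored +/-1 pattern (p = 1)
T = np.outer(xi, xi) / N            # Hebb rule, Eq. (10)
np.fill_diagonal(T, 0.0)            # T_ii = 0

def little_step(A, T):
    """Parallel 0/1 update: A_i(k+1) = Theta(sum_j T_ij A_j(k)),
    with the convention Theta(x) = 1 for x >= 0."""
    return (T @ A >= 0).astype(int)

A = (xi + 1) // 2                   # 0/1 activity corresponding to the pattern
A_next = little_step(A, T)
print(np.array_equal(A_next, A))    # the stored pattern is a fixed point
```

Active neurons receive positive recurrent input and fire again; quiescent neurons receive negative input and stay silent, reproducing the group structure G_k described above.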
The transformation reads

  S_i = 2 A_i - 1.   (12)

Inserting Eq. (12) into Eq. (9) demonstrates that the time evolution (9) is identical to the modified dynamics

  A_i(k+1) = Θ( Σ_j T_ij A_j(k) - (1/2) Σ_j T_ij ).   (13)

For patterns where the number of +1's balances exactly the number of -1's, the term Σ_j T_ij vanishes if the Hebb rule (10) is used to determine the T_ij. For random patterns, this is only true on average. Fluctuations from the mean do, however, decrease the storage capacity of the model with the original dynamics (8).
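That the ±1 update (9) and the 0/1 update (13) describe the same time evolution under the substitution S_i = 2 A_i - 1 can be verified numerically; the following minimal sketch is our own check (network size, pattern number, and seed are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(1)
N, p = 50, 5
xi = rng.choice([-1, 1], size=(p, N))    # p random +/-1 patterns
T = (xi.T @ xi) / N                      # Hebb rule, Eq. (10)
np.fill_diagonal(T, 0.0)

S = rng.choice([-1, 1], size=N)          # arbitrary +/-1 network state
A = (S + 1) // 2                         # Eq. (12): S_i = 2 A_i - 1

# Eq. (9): parallel +/-1 dynamics (sign convention: sgn(0) = +1)
S_next = np.where(T @ S >= 0, 1, -1)

# Eq. (13): equivalent 0/1 dynamics with the balance term (1/2) sum_j T_ij
A_next = (T @ A - 0.5 * T.sum(axis=1) >= 0).astype(int)

print(np.array_equal(S_next, 2 * A_next - 1))   # the two updates agree
```

The agreement is exact because T S = 2 (T A - (1/2) Σ_j T_ij), so both updates threshold the same quantity.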
On the level of the macroscopic iteration equations there are thus two major differences between the models: first, the occurrence of the terms Σ_j T_ij in Eq. (13) as opposed to Eq. (8) and, second, the idealization of long axonal delays implicit in Eq. (8). We are interested in the effect of realistic axonal delays on the dynamics and not in the side effect of the terms Σ_j T_ij arising from a change of the representations used in the Little model. We therefore separate the two effects by artificially balancing the second term in Eq. (13) through an additional auxiliary neuron i = 0. This cell is triggered by the firing activity in the network that is received at the times t_k. The synaptic strengths from this neuron to all other neurons are chosen proportional to Σ_j T_ij to provide the balance term in Eq. (13).

For the numerical study of the retrieval quality, p = αN unbiased ±1 random patterns were stored using the Hebb rule Eq. (10). For given storage level α, averages from multiple realizations of networks with up to 2000 neurons were analyzed. Each simulation consisted of p runs in which the dynamics was started in states that consecutively resembled each of the stored memories. Initial overlaps m(0) less than one were used to test the capability of the network to retrieve a stored pattern from incomplete or noisy inputs. Upon reaching an attractor state, the overlap of the final state with the corresponding memory was determined. In Figs. 1, 2, and 5, these overlaps are represented by histograms that display the fraction of final states with given overlap with the original pattern, averaged over all patterns and realizations. For the reader's convenience, the characteristic shape of the postsynaptic potential (PSP) in each simulation is shown as an inset in the figures, together with a sketch of the background activity I_external(t).

In Fig. 1 the behavior of the network is compared with that of a Little model. The same storage level α = 0.145 is chosen for both systems. In the Little model the bin with the highest overlap (0.95 < m ≤ 1.00) slightly decreases with increasing system size and a peak at m ≈ 0.35 develops, indicating that α = 0.145 is already above the storage capacity α_c of the Little model, in accordance with the literature. In the present model the bin with highest overlap stays approximately constant and no peak at m ≈ 0.3 develops, indicating α_c > 0.145. The storage capacity of the integrate-and-fire model is thus slightly higher than that of the Little model. A similar conclusion can be drawn from simulations with initial overlaps less than one corresponding to noisy input patterns. As an example, results for α = 0.135 and initial overlap m(0) = 0.7 are shown in Fig. 2.

FIG. 1. Associative capability of the integrate-and-fire network with constant background activity and fast synaptic currents (left) and the Little model (right). Shown are numerical results where the network dynamics was initialized with one of the stored memory patterns and then simulated until it reached a stationary state. The fraction of final states thus obtained is plotted as a function of the overlap with the initial memory pattern (in five-percentile bins). Data are averaged over all stored patterns and 20 realizations of the synaptic coupling matrix; error bars denote standard deviations. The networks consisted of 250 neurons (A), (B) or 2000 neurons (C), (D). Comparison of (A) and (C) shows distributions of final states that are almost independent of system size, indicating that the storage capacity of the network with spiking neurons is close to α = 0.145, the storage level used in the simulations. In (B), the fraction of final states in the highest bin is lower than in (A) and decreases with increasing system size (D). The growing peak at m ≈ 0.35 for the Little model indicates that its storage capacity is lower than 0.145, in accordance with the literature.

A systematic comparison of various simulations is presented in Fig. 3. Here the fraction of final configurations with an overlap larger than 0.9 is plotted as a function of the initial overlaps. The simulations were performed for both models and storage levels below and above the storage capacity of the Little model. The results demonstrate that for low storage levels, both models exhibit very similar collective behavior. With increasing storage level, however, the network with spiking neurons performs increasingly better than the Little model. Apart from this difference, the overall similarity between the associative capabilities of both models is highly surprising if one takes into account that for the simulations, the axonal delay was chosen to be τ_ax = 0.2, far below the value τ_ax >> 1 required for the heuristic argument leading to Eq. (8).

Let us now try to understand the differences between both models in detail. For the chosen value of the axonal delay, τ_ax = 0.2, the total negative recurrent input for neurons which did not fire at time (k-1) τ_ax has only decayed by 18.1% at time k τ_ax, far from the assumption of a 100% decrease in Eq. (8). Therefore higher positive recurrent input for such a neuron is needed to generate an action potential at time k τ_ax.
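The 18.1% figure follows directly from the membrane time constant (normalized to one): a perturbation of the membrane potential decays by a factor exp(-τ_ax) between two update times. A one-line check:

```python
import math

tau_ax = 0.2   # axonal delay in units of the membrane time constant
decay = 1.0 - math.exp(-tau_ax)
print(round(100 * decay, 1))   # 18.1 percent, far from the 100% decrease assumed in Eq. (8)
```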
FIG. 2. Pattern completion in the integrate-and-fire network with constant background activity and fast synaptic currents (left) and the Little model (right). The networks were initialized with input patterns that were generated by randomly flipping 15% of the bits of each pattern, resulting in an initial overlap m(0) = 0.7; network dynamics, numerical analysis, and representation of the data are otherwise as in Fig. 1. As shown by the simulations, run at a storage level of α = 0.135, the network with spiking neurons has a higher noise tolerance than the Little model.

FIG. 3. Systematic comparison of pattern completion in the integrate-and-fire network with constant background activity and fast synaptic currents (solid line) and the Little model (dashed line). For each initial overlap m_0, the fraction of final configurations with an overlap m_∞ > 0.9, averaged over all stored patterns and 20 realizations of the coupling matrix, is plotted for three storage levels: the smallest (left curves), α = 0.14 (middle curves), and α = 0.16 (right curves). Vertical bars denote standard deviations. The data show that for low storage levels, the spiking and nonspiking model exhibit almost identical cooperative behavior. With increasing storage level and increasing initial overlap, the integrate-and-fire network performs increasingly better than the Little model.

This implies that neurons which do not fire in one iteration have an increased tendency not to fire in the next cycle.
This effect is an advantage if a neuron is supposed to be quiescent in the desired memory pattern because it admits larger noise levels induced by the presence of other memories and thus allows for a higher storage capacity. However, by the same token it is a disadvantage if the neuron is supposed to be in the on-state. The simulation results reveal that the advantage is larger than the disadvantage. This phenomenon can be understood by considering that close to a stored pattern, most neurons "benefit" from the effect mentioned above. For example, let us assume that the overlap with a certain stored pattern is m = 0.6. In that case, about 80% of the off-neurons are supposed to be off in the memory pattern (and thus have an advantage) but only about 20% of the off-neurons are supposed to be on (and thus have a disadvantage). Furthermore, if patterns have been learned using the Hebb rule (10), the expected recurrent input to an off-neuron in a stored pattern is negative whereas the expected recurrent input to an on-neuron is positive. This implies that in a noisy pattern where some neurons are flipped the recurrent input changes: flipped on-neurons receive a negative input, but the absolute size of this negative input is generally less than the absolute size of the negative input received by an off-neuron that is not flipped. Therefore the effect of the advantage is again larger than the effect of the disadvantage.

A number of phenomena are intimately connected with this difference between the two dynamics. First, it is well known that in the Little model, trajectories that start near a stored pattern but end up in the peak at m ≈ 0.35 show an increasing overlap with the desired target pattern during the first iterations of the dynamics, before the overlap decreases due to cross-talk from the other stored patterns [34]. Because of the effect mentioned above, this phenomenon is less pronounced in the present model. Second, the average number of iterations needed to reach a fixed point is smaller than in the Little model, in particular for slowly relaxing solutions. Third, the difference between the performance of the two models increases with decreasing initial overlap and increasing storage level.

As argued above, the increased performance of the present model is mainly due to the improved associative behavior of neurons that are supposed to be in the off-state. The latter can be observed directly by comparing the fractions of correctly retrieved off- and on-neurons. For the Little model, the size of both fractions should be identical (apart from statistical fluctuations) and be equal to the final overlap. Plotted against the final overlap, one would thus expect that data points scatter around the diagonal but without major differences between the off- and on-neurons. Figure 4 shows that, as predicted, this is not the case for the present model: the fraction of correctly retrieved off-neurons is significantly higher than that of on-neurons.

IV. MODEL WITH SUBTHRESHOLD OSCILLATION AND LONG-LASTING POSTSYNAPTIC CURRENTS

Subthreshold membrane oscillations have been observed in many brain regions and are prominent in the hippocampus [35-38]. Oscillation frequencies usually range from a few Hertz (alpha waves) to 40-60 Hz (gamma waves). Experimental results suggest that under in vivo conditions certain classes of neurons fire at most a few action potentials in one cycle [39]. The generation of action potentials occurs mainly
CONTENT-ADDRESSABLE MEMORY WITH SPIKING NEURONS
For given amplitude Iac and frequency of the oscillatingcurrent, we thus choose the dc part Idc of the input current tobe slightly less than 1ϪA(I
1. Without loss of generality, we shift the
time axis and set ϭ0 so that the maxima of uosc(t) arereached at times k P with kN.
Two different model variants will be investigated to study
the dynamical effects of long-lasting postsynaptic currents.
FIG. 4. Difference in the behavior of firing and quiescent neu-
In the first variant, current pulses are approximated by rect-
rons in the integrate-and-fire network with constant background ac-
tivity and fast synaptic currents (Nϭ2000, ␣ϭ0.135, m ϭ
Shown are the fractions of firing neurons ͑open squares͒ and quies-
cent neurons ͑open circles͒ in the final state as a function of theoverlap of the final state with the stored memory pattern, averaged
This choice implies that the change of ui due to a single
over all stored pattern and 20 realizations of the coupling matrix.
action potential generated by neuron j at time t is maximal in
The length of the vertical bars denotes one standard deviation. In
psc . Its size ⌬ u i j at that time, as
systems without symmetry breaking between on- and off-neurons,
such as the Little model, both fractions would be identical ͑apartfrom statistical fluctuations͒ and be equal to the final overlap. The
systematic deviations demonstrate the different dynamic roles offiring and nonfiring cells in models with spiking neurons.
The simple shape of the current pulse admits an analytictreatment of the retrieval behavior of the network as will be
in the rising part of the oscillation ͓40͔. Furthermore, de-
shown below. However, this choice might be oversimplified.
pending on the specific neurotransmitter, the duration of
For the second model variant, we therefore follow the litera-
postsynaptic potentials may be of the same order as the os-
ture and use the difference of two exponentials as an effec-
cillation period. This suggests that spike activity in one os-
tive description for the time course of the postsynaptic cur-
cillation cycle may generate postsynaptic potentials that last
long enough to trigger action potentials far into the nextcycle.
Within this paper we do not attempt to model a specific
brain region or experimental paradigm. We therefore onlyincorporate the essence of these experimental findings into
The time constant r is the rise time and d is the decay time
the model ͑2͒–͑6͒ and investigate whether the resulting sys-
r) . The constant c is cho-
tem is again capable of functioning as a content-addressable
sen such that the maximum of ␣() equals one as in Eq.
memory. To account for global subthreshold oscillations of
͑18͒. Both normalizations differ from Eq. ͑6͒ used in the
the membrane potential in a minimal way, the time-
preceding section and are more convenient in the present
dependent part of the external input current Iexternal(t) in Eq.
context since they allow a direct comparison of systems with
different shapes and time constants of postsynaptic currents.
To mimic the experimental observation that certain
Iexternal͑t͒ϭI ϩ
classes of neurons mainly fire during the rising phase of the
membrane oscillations, we consider parameter regimes such
that the model neurons only fire within an interval of length
As in the preceding section, model parameters are chosen
Dfire before the maximum of the oscillation, i.e., cannot reach
such that without input from other cells or external stimuli, a
the threshold in the intervals „kP,(kϩ1)PϪDfire…. Because
neuron does not fire. The time evolution below the firing
uosc(t)рuosc(Dfire) for t„kPϩDfire ,(kϩ1)PϪDfire…, the
threshold is described by Eq. ͑2͒, a linear differential equa-
constraint can be satisfied for arbitrary 0рI Ͻ
tion. One can therefore readily determine conditions that
intervals (kP + D_fire, (k+1)P − D_fire) if even in the worst case, where some neuron receives the maximal positive input (19), the firing threshold is not reached at time kP − D_fire. In these intervals the total input current I_i(t) equals I_external(t), and for t → ∞ all neurons approach the synchronous oscillation. This condition can be met without loss of generality by rescaling the synaptic weights, T_ij → γT_ij, with a suitable scale factor γ: this normalization has no effect on the time evolution of the Little model with which we want to compare the dynamics of the present system.

FIG. 5. Associative capability of the integrate-and-fire network with subthreshold oscillation and long-lasting postsynaptic currents. The simulations were performed in systems with N = 400 neurons at a storage level of α = 0.135 (A), (B), (C) and α = 0.1 (D). The initial overlap with the stored patterns was m = …, in systems with block-shaped (A), (C) and double-exponential (B), (D) postsynaptic currents, resulting in time courses for the postsynaptic potentials as depicted in the upper insets. Axonal delays τ_ax were either long (A), (B) or short (C), (D) compared with the RC time constant, which was normalized to one. The oscillation period P was 6.67 and the amplitude of the oscillation was 0.4. Otherwise, network dynamics, numerical analysis, and data presentation are as in Fig. 1. Comparison of (A) with (C) or (B) with (D) demonstrates that the associative capabilities decrease with decreasing axonal delay. This effect is less pronounced for block-shaped postsynaptic currents than for postsynaptic currents with a double-exponential shape; in (A) and (C), the same value for α has been used, whereas in (D) it was lowered to α = 0.1 to obtain a distribution …

To guarantee that neurons do not fire in the remaining intervals (kP, kP + D_fire], two conditions have to be fulfilled. First, postsynaptic potentials due to action potentials generated in earlier cycles should reach their maximum and trigger a spike before the oscillation maximum is reached at time kP. This can be achieved by properly adjusting the time constant of the postsynaptic current and the membrane time constant. Second, postsynaptic currents due to action potentials generated in a given cycle should not trigger a neuron to fire after the oscillation maximum. The sufficient (though nonoptimal) condition (22) will be used in some of the simulations. This constraint also guarantees that neurons do not fire twice within a cycle, idealizing the experimental finding that at most a few spikes are generated in one cycle [39]. Note that Eq. (22) can easily be relaxed because long rise times of the postsynaptic potential eliminate the possibility of multiple firings within one cycle or of firing after the maximum of the cycle. As in the previous model, we balance the second term in Eq. (13) by adding an auxiliary spiking model neuron i = 0 to the system. This cell does not receive input from the other neurons. By choosing an increased input current I_dc for this neuron, it will reach threshold and fire at or slightly before the maxima of the oscillation. The synaptic strengths from this cell to all other neurons …
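The oscillation-gated firing described above, including the auxiliary cell that an increased I_dc drives to threshold near the oscillation maxima, can be illustrated with a single leaky integrate-and-fire neuron. This is a minimal sketch, not the paper's model: recurrent input and Eqs. (13)–(22) are omitted, the values P = 6.67, amplitude 0.4, and a unit RC constant are taken from the Fig. 5 caption, and I_dc = 0.9 is an assumption chosen for the sketch.

```python
import math

def simulate(i_dc, amp, period=6.67, tau=1.0, u_thresh=1.0,
             t_max=100.0, dt=0.001):
    """Single leaky integrate-and-fire neuron driven by a constant
    current plus a sinusoidal subthreshold oscillation (Euler scheme).
    Returns the list of spike times."""
    u, t, spikes = 0.0, 0.0, []
    while t < t_max:
        drive = i_dc + amp * math.sin(2.0 * math.pi * t / period)
        u += dt * (drive - u) / tau      # RC-type membrane relaxation
        if u >= u_thresh:                # threshold crossing: emit a spike
            spikes.append(t)
            u = 0.0                      # reset the membrane potential
        t += dt
    return spikes

# Without the oscillation the cell stays below threshold and is silent;
# with it, the drive exceeds threshold only around the oscillation maxima,
# so spikes are confined to the depolarized phase of each cycle.
silent = simulate(i_dc=0.9, amp=0.0)
active = simulate(i_dc=0.9, amp=0.4)
print(len(silent), len(active))
```

The sketch shows only the gating effect of the oscillation: the same cell that is silent under constant subthreshold drive fires during the depolarized phase of every cycle once the oscillation is added.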
Let us now analyze the dynamics of networks with block-shaped postsynaptic currents (18). The systems are initialized with input currents such that the membrane potentials of a group G_0 of neurons are raised above u_thresh at or slightly before the maximum of an oscillation cycle, whereas the other neurons receive no external input.

Once the initial spikes are generated, the membrane potentials of all neurons in G_0 are reset to zero. At time τ_ax, i.e., after one axonal delay time, the action potentials arrive at the postsynaptic neurons and trigger rectangular current pulses with a duration τ_psc that is chosen such that the postsynaptic current lasts till the maximum of the next oscillation cycle. By the time D_fire before the maximum of the next oscillation cycle, all the postsynaptic currents are sufficiently increased, …, with arbitrarily small ε > 0 if P is long enough. Thus in the limit P ≫ 1, a postsynaptic neuron …

Repeating this argument, we see that the group G_k of neurons that generate action potentials in the time interval D_fire before the maximum of the kth oscillation cycle is again identical with the group of neurons that are in the on-state in the kth iteration of a Little model [7] with the same couplings T_ij and initial activity pattern. This implies that, as in the preceding section, all results obtained for the Little model regarding fixed points, convergence, and storage capacity apply to the present model as well.

Note that a firing pattern in the present model consists of all neurons i that generate action potentials (S_i = 1 in the Little model) within a time interval of fixed length D_fire before the maximum of one cycle of the background oscillations. In general, these neurons will not fire at the same time because of variations in the recurrent input. This situation differs from the model discussed in the preceding section, where all active neurons fire in strict synchrony.

The equivalence of the present model with the Little model has now been proven for an ideal situation required for the mathematical analysis. To analyze the behavior of the network under more realistic conditions, numerical simulations with both block-shaped (18) and double-exponential (20) postsynaptic currents with physiologically realistic parameters were performed. With a membrane time constant of …, these parameters correspond to 80 ms, τ_r = 1 ms, and τ_ax = 3 ms.

Some of the results are shown in Fig. 5 and can be summarized as follows. For periods P that are long compared to one (the RC time constant) and for long axonal delays (as required for the analytic convergence proof), systems with rectangular or double-exponential postsynaptic currents do indeed have associative capabilities similar to those of the Little model.
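Since the firing groups G_k reproduce the iterations of a Little model with couplings T_ij, the reference dynamics can be sketched in a few lines. This is a minimal sketch rather than the code behind Fig. 5: it uses ±1 units instead of the on/off firing representation, keeps N = 400 as in the simulations, and the storage level (p = 5) and 15% cue corruption are arbitrary choices for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)
N, p = 400, 5                            # N as in the simulations; p is arbitrary here
xi = rng.choice([-1, 1], size=(p, N))    # p random binary patterns xi^mu
T = (xi.T @ xi) / N                      # Hebbian couplings T_ij
np.fill_diagonal(T, 0.0)                 # no self-coupling

S = xi[0].copy()                         # cue with a corrupted version of pattern 0
flip = rng.random(N) < 0.15              # flip 15% of the units
S[flip] *= -1

for _ in range(10):                      # parallel (Little-model) updates S -> sgn(T S)
    S = np.where(T @ S >= 0.0, 1, -1)

m = (S @ xi[0]) / N                      # overlap with the stored pattern
print(m)
```

At this low load the cued pattern is recovered almost exactly; pushing p toward the classical capacity limit of roughly 0.14·N, close to the α = 0.135 used in Fig. 5, degrades retrieval.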
If the axonal delay is reduced to a realistic value of τ_ax = 0.2 (i.e., 3 ms), the storage capacity of the model decreases because neurons that fire more than τ_ax before the oscillation maximum may influence the firing behavior of other neurons in the same cycle. The decrease of the storage capacity is larger for the model with double-exponential current shape because the rise time to the maximum of the postsynaptic potential is shorter than for the model with rectangular postsynaptic current shape, so the above-mentioned change in the membrane potential of the other neurons is larger. The size of the postsynaptic potential is also more strongly influenced by the precise arrival time of the presynaptic spike than in the model with rectangular current shape. This is not desirable because the system shows best pattern retrieval if the postsynaptic potentials have the same size at the oscillation maximum.

V. DISCUSSION

The present study demonstrates that model systems with integrate-and-fire neurons can be programmed to relax, for appropriate initial conditions, to periodic oscillations representing stored memories. Two different mechanisms were investigated in detail. In the first case, delayed feedback within the network forces neurons to generate action potentials at integer multiples of the feedback delay. In the second case, periodic solutions are obtained through an interplay of subthreshold oscillations and long-lasting postsynaptic currents.

In both cases, the networks exhibit associative properties similar to those of the Little model. The results demonstrate that although the microscopic time evolution of integrate-and-fire neurons strongly differs from that of graded-response or binary neurons, the collective network dynamics may nevertheless be almost identical. On the level of macroscopic order parameters and for stationary solutions, similar findings have been reported in the literature for systems where the number p of stored patterns scales logarithmically with the number N of neurons [19]. The current study reveals that even at high storage levels (p ∝ N) and during transient relaxations, spiking and nonspiking systems can exhibit strikingly similar cooperative phenomena.

The numerical simulations showed that various idealizations required for the mathematical analysis may be violated without a significant decrease in the performance of the models as content-addressable memories. In particular, time constants governing oscillation periods may be of the same order as the membrane time constant. Let us mention in passing that for the model with constant background activity and fast synaptic currents, the collective behavior remains qualitatively the same even for much smaller values of τ_ax if one arranges several networks without internal recurrent connections in a staggered loop structure so that the output of one network serves as input to the subsequent network.

With respect to oscillatory background activity, our findings indicate that systems with spiking neurons, realistic axonal delays, and long-lasting postsynaptic currents may utilize subthreshold oscillations of their membrane potentials to function as content-addressable memories. Thus previous approaches modeling the hippocampus as an associative memory using highly simplified networks with two-state neurons and a discrete time evolution may be justified on a phenomenological level even if they do not capture major aspects of the microscopic dynamics.

Systems with subthreshold oscillations studied in this paper demonstrate another interesting feature. When these networks settle in a periodic activity pattern, the firing time of a neuron relative to the underlying oscillation varies from neuron to neuron and depends on the attractor reached. Cross-correlations between the spike activity of one neuron and the subthreshold oscillation, or between the firings of two neurons, would indicate a temporal code where information about stored patterns is encoded in the fine temporal structure of neural activity. Indeed, these temporal patterns could be used for further signal processing within the present scheme. However, these temporal phenomena do not play any functional role in the dynamic mechanism used for associative memory storage in the networks analyzed here; the temporal fine structure is a mere epiphenomenon of the network dynamics. This observation provides a simple example that even measuring stimulus-dependent temporal correlations in neural systems cannot be used to verify that those systems actually use the temporal domain to represent information.

A study that is conceptually related to the present work shows that the dynamics of a Hopfield network [3] can be implemented in a system with spiking neurons [43]. Through a careful adjustment of the temporal characteristics of postsynaptic potentials, a neural code based on relative spike times can be accomplished. Apart from this difference in the representation of memory patterns, both studies aim at the same general goal: to realize the time evolution of one dynamical system, the Hopfield model in [43] and the Little model in the present paper, within the dynamical repertoire of another. The success of the two approaches demonstrates that this goal can indeed be accomplished. However, as indicated in the Introduction, this does not imply that biological neural networks do indeed operate as content-addressable memories in the way envisaged in both studies. Similar remarks apply to theoretical studies about the dynamics and computational capabilities of "synfire chains" [13,14,17]. Nevertheless, all these results are helpful in that they prove the principal feasibility of associative pattern storage and retrieval in systems of spiking neurons.

Associative memory storage in a dynamical system requires the existence of multiple attractors. In the models studied in this paper, attractors are imprinted using a Hebbian learning rule in large networks. The attractors exhibit the same simple temporal characteristics (periodic firing patterns) but differ in their spatial structure. However, multistability can also arise in small neural systems whose dynamics allows multiple attractors that differ in their temporal signature [42]. The two scenarios highlight two options for generating multistability in neural networks: spatial versus temporal complexity. The existence of multiple concurrent rhythms in various brain structures [37] indicates that in nature both mechanisms may operate in parallel. It would therefore be interesting to link both approaches and study how complex spatiotemporal patterns can be stored at will in extended neural systems, and how synaptic noise influences the emergent behavior of such systems.

ACKNOWLEDGMENTS

We are grateful to David MacKay for most helpful discussions, and to Martin Stemmler for critical comments on the manuscript.
[1] J.J. Hopfield, Proc. Natl. Acad. Sci. USA 79, 2554 (1982).
[2] M.A. Cohen and S. Grossberg, IEEE Trans. Syst. Man Cybern. 13, 815 (1983).
[3] J.J. Hopfield, Proc. Natl. Acad. Sci. USA 81, 3088 (1984).
[4] D. Marr, Proc. R. Soc. London, Ser. B 176, 161 (1970).
[5] D. Marr, Philos. Trans. R. Soc. London, Ser. B 262, 24 (1971).
[6] A. Treves and E.T. Rolls, Hippocampus 4, 374 (1994).
[7] W.A. Little, Math. Biosci. 19, 101 (1974).
[8] C.M. Marcus and R.M. Westervelt, Phys. Rev. A 40, 501 (1989).
[9] A.V.M. Herz and C.M. Marcus, Phys. Rev. E 47, 2155 (1993).
[10] D.J. Amit, Modeling Brain Function: The World of Attractor Neural Networks (Cambridge University Press, Cambridge, England, 1989).
[11] J.L. van Hemmen and R. Kühn, in Physics of Neural Networks, edited by E. Domany, J.L. van Hemmen, and K. Schulten (Springer, Berlin, 1991), pp. 1–106.
[12] J. Hertz, A. Krogh, and R.G. Palmer, Introduction to the Theory of Neural Computation (Addison-Wesley, Redwood City, CA, 1991).
[13] M. Abeles, Corticonics: Neuronal Circuits of the Cerebral Cortex (Cambridge University Press, Cambridge, England, 1991).
[14] M. Herrmann, J.A. Hertz, and A. Prügel-Bennett, Network 6, 403 (1995).
[15] J.J. Hopfield, Nature (London) 376, 33 (1995).
[16] J.J. Hopfield and A.V.M. Herz, Proc. Natl. Acad. Sci. USA 92, 6655 (1995).
[17] A. Aertsen, M. Diesmann, and M.O. Gewaltig, J. Physiol. (Paris) 90, 243 (1996).
[18] W. Maass, Neural Comput. 9, 279 (1997).
[19] W. Gerstner and J.L. van Hemmen, Network 3, 139 (1992).
[20] R. Mueller, D.J.C. MacKay, and A.V.M. Herz, in Proceedings BioNet'96, edited by G.K. Heinz (Gesellschaft zur Foerderung angewandter Informatik, GFaI, Berlin, 1996), pp. 70–80.
[21] R.E. Mirollo and S.H. Strogatz, SIAM (Soc. Ind. Appl. Math.) J. Appl. Math. 50, 1645 (1990).
[22] Y. Kuramoto, Physica D 50, 15 (1991).
[23] L.F. Abbott and C. van Vreeswijk, Phys. Rev. E 48, 1483 (1993).
[24] M. Tsodyks, I. Mitkov, and H. Sompolinsky, Phys. Rev. Lett. 71, 1280 (1993).
[25] M. Usher, H. Schuster, and E. Niebur, Neural Comput. 5, 570 (1993).
[26] C. van Vreeswijk and L.F. Abbott, SIAM (Soc. Ind. Appl. Math.) J. Appl. Math. 53, 253 (1993).
[27] D. Hansel, G. Mato, and C. Meunier, Neural Comput. 7, 307 (1995).
[28] U. Ernst, K. Pawelzik, and T. Geisel, Phys. Rev. Lett. 74, 1570 (1995).
[29] A. Nischwitz and H. Glünder, Biol. Cybern. 73, 389 (1995).
[30] G.J. Stuart and B. Sakmann, Nature (London) 367, 69 (1994).
[31] W.R. Softky, Curr. Opin. Neurobiol. 5, 239 (1995).
[32] T. Stiefvater, K.R. Müller, and R. Kühn, Physica A 232, 61 (1996).
[33] D.J. Amit, H. Gutfreund, and H. Sompolinsky, Ann. Phys. (N.Y.) 173, 30 (1987).
[34] H. Horner, D. Bormann, M. Frick, H. Kinzelbach, and A. Schmidt, Z. Phys. B 76, 381 (1989).
[35] T.H. Brown and A.M. Zador, in The Synaptic Organization of the Brain, edited by G.M. Shepherd (Oxford University Press, New York, 1990).
[36] S.R. Cobb, E.H. Buhl, K. Halasy, O. Paulsen, and P. Somogyi, Nature (London) 378, 75 (1995).
[37] G. Buzsáki and J.J. Chrobak, Curr. Opin. Neurobiol. 5, 504 (1995).
[38] Y. Gutfreund, Y. Yarom, and I. Segev, J. Physiol. (London) 483, 621 (1995).
[39] K.J. Jeffery, J.G. Donnett, and J. O'Keefe, NeuroReport 6, 2166 (1995).
[40] S.E. Fox, S. Wolfson, and J.B. Ranck, Jr., Exp. Brain Res. 62, 495 (1986).
[41] W. Rall, J. Neurophysiol. 30, 1138 (1967).
[42] J. Foss, F. Moss, and J. Milton, Phys. Rev. E 55, 4536 (1997).
[43] W. Maass and T. Natschläger, Network 8, 355 (1997).