
Introduction Step 1 Step 2 Step 3 Step 4 Step 5 Hints

Scientific Python Tutorial – CCN Course 2013


How to code a neural network simulation

Malte J. Rasch

National Key Laboratory of Cognitive Neuroscience and Learning


Beijing Normal University
China

July 10, 2013



Goal of tutorial

We will program a neural network simulation together.


We will practice on the way:

Writing scripts
Usage of array notation
How to integrate ODEs
How to plot results
How to simulate neurons and synapses
How to program a quite realistic network simulation
What has to be done in principle

There are n neurons, excitatory and inhibitory, that are inter-connected with synapses.
The network gets some input.
Each neuron and each synapse follows a particular dynamics over time.
The simulation solves the interplay of all components and, e.g., yields the spiking activity of the network for given inputs, which can be further analyzed (e.g. plotted).
We will proceed in 5 successive steps


1 Simulate a single neuron with current step input
2 Simulate a single neuron with Poisson input
3 Simulate 1000 neurons (no recurrent connections)
4 Simulate a recurrent network
5 Simulate a simple orientation column

Result plots

[Figure: four result panels — a single qIF neuron with current step input and a single qIF neuron with 100 Poisson inputs (membrane voltage [mV] vs. time [ms], 0–1000 ms); an unconnected network and a recurrent network of 1000 qIF neurons (spike rasters: neuron number [#] vs. time [ms], Exc. and Inh.).]
Which neuron model to use?

Biophysical model (e.g. the Hodgkin–Huxley model)

C_m dV_m/dt = −(V_m − V_L)/R_m − Σ_i g_i(t) (V_m − E_i) + I

including the non-linear dynamics of many channels in g_i(t).

Mathematical simplification (Izhikevich, book chapter 8)

if v < 35:
    v̇ = (0.04 v + 5) v + 140 − u + I
    u̇ = a (b v − u)
if v ≥ 35:
    v ← c
    u ← u + d

With b = 0.2, c = −65, and d = 8, a = 0.02 for excitatory neurons and d = 2, a = 0.1 for inhibitory neurons.
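The simplified model above can be integrated with a plain Euler scheme; a minimal sketch (dt, tmax, and the applied current Iapp are illustrative choices, not values fixed by the slides):

```python
import numpy as np

# Euler integration of the simplified (Izhikevich-type) neuron, RS parameters.
a, b, c, d = 0.02, 0.2, -65.0, 8.0
dt, tmax, Iapp = 0.5, 200.0, 10.0   # illustrative values
T = int(tmax / dt)
v = np.full(T, -70.0)   # membrane voltage [mV]
u = np.full(T, -14.0)   # recovery variable

for t in range(T - 1):
    if v[t] < 35:
        v[t + 1] = v[t] + dt * ((0.04 * v[t] + 5) * v[t] + 140 - u[t] + Iapp)
        u[t + 1] = u[t] + dt * a * (b * v[t] - u[t])
    else:
        v[t] = 35        # clip the spike peak for plotting
        v[t + 1] = c     # reset
        u[t + 1] = u[t] + d
```

With a constant positive current the voltage accelerates quadratically toward the peak, is clipped at 35 mV, and is reset to c — exactly the if/else structure the later solution scripts use.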
Neuron model

[Figure (after Izhikevich): the model v′ = 0.04v² + 5v + 140 − u + I with u′ = a(bv − u); when v reaches the peak of 30 mV, v ← c and u ← u + d. Parameter-plane panels relate the parameters (a: decay rate, 0.02–0.1; b: sensitivity, 0.2–0.25; c: reset, −65 to −50 mV; d: reset increment, 2–8) to the standard firing types — regular spiking (RS), intrinsically bursting (IB), chattering (CH), fast spiking (FS), thalamo-cortical (TC), resonator (RZ), low-threshold spiking (LTS) — with example voltage traces v(t) for step input I(t).]
Step 1: Simulate a single neuron with injected current

Exercise 1
Simulate one excitatory neuron for 1000ms and plot the resulting voltage
trace. Apply a current step (Iapp = 7pA) between time 200ms and 700ms.

Neuron model:

if v < 35:
    v̇ = (0.04 v + 5) v + 140 − u + Iapp
    u̇ = a (b v − u)
if v ≥ 35:
    v ← c
    u ← u + d

with d = 8, a = 0.02, b = 0.2, c = −65.

[Figure: voltage trace of a single qIF neuron with current step input (membrane voltage [mV] vs. time [ms])]
Step 1 in detail:
Open Spyder and create a new file (script) that will simulate the neuron. Import the
necessary modules (from pylab import *)

Proceed as follows:
1 Initialize parameter values (∆t = 0.5ms, a = 0.02, d = 8, · · · )
2 Reserve memory for voltage trace v and u (of length T = 1000/∆t) and set first
element to −70 and −14, respectively.
3 Loop over T − 1 time steps and do for each step t
1 set Iapp ← 7 if t∆t is between 200 and 700 (otherwise 0)
2 if vt < 35: update element t + 1 of v and u according to

vt+1 ← vt + ∆t {(0.04 vt + 5) vt − ut + 140 + Iapp }


ut+1 ← ut + ∆t a (b vt − ut )
3 if vt ≥ 35: set the variables, vt ← 35, vt+1 ← c, and ut+1 ← ut + d.
Introduction Step 1 Step 2 Step 3 Step 4 Step 5 Hints

Step 1 in detail:
Open Spyder and create a new file (script) that will simulate the neuron. Import the
necessary modules (from pylab import *)

Proceed as follows:
1 Initialize parameter values (∆t = 0.5ms, a = 0.02, d = 8, · · · )
2 Reserve memory for voltage trace v and u (of length T = 1000/∆t) and set first
element to −70 and −14, respectively.
3 Loop over T − 1 time steps and do for each step t
1 set Iapp ← 7 if t∆t is between 200 and 700 (otherwise 0)
2 if vt < 35: update element t + 1 of v and u according to

vt+1 ← vt + ∆t {(0.04 vt + 5) vt − ut + 140 + Iapp}
ut+1 ← ut + ∆t a (b vt − ut)
3 if vt ≥ 35: set the variables, vt ← 35, vt+1 ← c, and ut+1 ← ut + d.
4 Plot the voltage trace v versus t
Solution to step 1 (Python)


from pylab import *

# 1) initialize parameters
tmax = 1000.
dt = 0.5

# 1.1) Neuron/Network pars
a = 0.02   # RS, IB: 0.02, FS: 0.1
b = 0.2    # RS, IB, FS: 0.2
c = -65    # RS, FS: -65, IB: -55
d = 8.     # RS: 8, IB: 4, FS: 2

# 1.2) Input pars
Iapp = 10
tr = array([200., 700]) / dt  # stim time

# 2) reserve memory
T = int(ceil(tmax / dt))
v = zeros(T)
u = zeros(T)
v[0] = -70  # resting potential
u[0] = -14  # steady state

# 3) for-loop over time
for t in range(T - 1):
    # 3.1) get input
    if t > tr[0] and t < tr[1]:
        I = Iapp
    else:
        I = 0

    if v[t] < 35:
        # 3.2) update ODE
        dv = (0.04 * v[t] + 5) * v[t] + 140 - u[t]
        v[t + 1] = v[t] + (dv + I) * dt
        du = a * (b * v[t] - u[t])
        u[t + 1] = u[t] + dt * du
    else:
        # 3.3) spike!
        v[t] = 35
        v[t + 1] = c
        u[t + 1] = u[t] + d

# 4) plot voltage trace
figure()
tvec = arange(0., tmax, dt)
plot(tvec, v, 'b', label='Voltage trace')
xlabel('Time [ms]')
ylabel('Membrane voltage [mV]')
title("""A single qIF neuron
with current step input""")
show()
Synapse model

Conductance based synaptic input

A simple synaptic input model would be


X
Isyn = wj sj (v − Ej )
j

where wj is the weight of the jth synapse and Ej its reversal potential (for
instance 0 mV for excitatory and −85 mV for inhibitory synapses).
Variable sj implements the dynamics of the jth synapse:

s˙j = −sj /τs


sj ← sj + 1, if pre-synaptic neuron spikes

Optional: Synaptic depression


Change the update to

sj ← sj + hj , hj ← 1 − (1 + (U − 1) hj) e^(−∆tj/τd),

with e.g. U = 0.5, τd = 500ms. ∆tj is the interval between current and previous spike of
neuron j.
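The depression update can be sketched as follows; `depress` is a hypothetical helper name, and the spike intervals are illustrative:

```python
import numpy as np

U, tau_d = 0.5, 500.0   # parameters from the slide

def depress(h, dt_since_last_spike):
    # Hypothetical helper: updated release variable h after a spike that
    # follows the previous one by dt_since_last_spike [ms].
    return 1 - (1 + (U - 1) * h) * np.exp(-dt_since_last_spike / tau_d)

# After a long silence the synapse recovers fully (h -> 1):
h_rested = depress(0.2, 1e9)

# Rapid firing (spikes every 10 ms) progressively depresses the synapse:
hs = [1.0]
for _ in range(20):
    hs.append(depress(hs[-1], 10.0))
```

The sequence hs decreases from 1 toward a small steady-state value: closely spaced spikes deliver ever smaller increments hj.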
Step 2: Single neuron with synaptic input

Exercise 2
Simulate the neuron model for 1000ms and plot the resulting voltage trace.
Assume that 100 synapses are attached to the neuron, with each pre-synaptic
neuron firing with a Poisson process of rate frate = 2 Hz between time 200ms
and 700ms.

Synaptic input model:

Isyn = Σ_j w_j^in s_j^in(t) (E_j^in − v(t))
ṡ_j^in = −s_j^in/τs
s_j^in ← s_j^in + hj, if input synapse j spikes

with hj = 1∗, τs = 10, w_j^in = 0.07, Ej = 0, j = 1 … 100. Poisson: input synapse j spikes if rj(t) < frate ∆t, where rj(t) ∈ [0, 1] are uniform random numbers drawn for each step t.

[Figure: voltage trace of a single qIF neuron with 100 Poisson inputs (membrane voltage [mV] vs. time [ms])]
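The per-step Poisson approximation above can be checked numerically; the seed and step count below are illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)
dt = 0.5          # time step [ms]
f_rate = 2.0      # target input rate [Hz]
n_in = 100
p_spike = f_rate * 1e-3 * dt   # per-step spike probability (rate in 1/ms)

# One simulation step: each input spikes independently
spikes = rng.random(n_in) < p_spike

# Over many steps the empirical rate approaches f_rate
n_steps = 20000
count = (rng.random((n_steps, n_in)) < p_spike).sum()
rate_hz = count / (n_in * n_steps * dt * 1e-3)
```

Because p_spike is small, at most one spike per synapse per step is a good approximation of a Poisson process at rate frate.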
Step 2 in detail:
Use the last script, save it under a new file name, and add the necessary lines.

Proceed as follows:
1 Initialize new parameter values (τs = 10, frate = 0.002ms−1 )
2 Reserve memory and initialize the vectors sin = (sjin ), win = (wjin ), and E = (Ej )
with nin = 100 constant elements (same values as in Step 1)
3 Inside the for-loop change/add the following:
1 Set pj = 1 if rj ≤ frate ∆t (otherwise 0) during times of applied input. rj is an
uniform random number between 0 and 1. Use array notation to set the
input for all nin input synapses. Hint
2 Before the vt update: implement the conductance dynamics s and set Iapp according to the input. Use array notation with the dot "·" product and the element-wise "⊙" product. Hint

s_j^in ← s_j^in + pj
Iapp ← w^in · (s^in ⊙ E^in) − (w^in · s^in) vt
s_j^in ← (1 − ∆t/τs) s_j^in


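A minimal numpy sketch of that inner step (values illustrative; five spiking inputs are forced by hand rather than drawn at random):

```python
import numpy as np

n_in, dt, tau_s = 100, 0.5, 10.0
w_in = np.full(n_in, 0.07)     # input weights
E_in = np.zeros(n_in)          # excitatory reversal potentials [mV]
s_in = np.zeros(n_in)          # synaptic gating variables
v_t = -70.0                    # current membrane voltage [mV]

p = np.zeros(n_in, dtype=bool)
p[:5] = True                   # pretend 5 input synapses spiked this step

s_in = s_in + p                                              # s <- s + p
Iapp = np.dot(w_in, s_in * E_in) - np.dot(w_in, s_in) * v_t  # dot + elementwise
s_in = (1 - dt / tau_s) * s_in                               # exponential decay
```

With E_in = 0 and v_t = −70 mV the five active synapses drive a depolarizing current of 5 · 0.07 · 70 = 24.5.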
Solution to step 2 (Python)


from pylab import *

# 1) initialize parameters
tmax = 1000.
dt = 0.5

# 1.1) Neuron/Network pars
a = 0.02
b = 0.2
c = -65
d = 8.
tau_s = 10  # decay of synapses [ms]

# 1.2) Input pars
tr = array([200., 700]) / dt
rate_in = 2   # input rate
n_in = 100    # number of inputs
w_in = 0.07   # input weights
W_in = w_in * ones(n_in)  # vector

# 2) reserve memory
T = int(ceil(tmax / dt))
v = zeros(T)
u = zeros(T)
v[0] = -70  # resting potential
u[0] = -14  # steady state
s_in = zeros(n_in)  # synaptic variable
E_in = zeros(n_in)  # rev potential
p_rate = dt * rate_in * 1e-3  # abbrev

# 3) for-loop over time
for t in range(T - 1):
    # 3.1) get input
    if t > tr[0] and t < tr[1]:
        # NEW: get input Poisson spikes
        p = uniform(size=n_in) < p_rate
    else:
        p = 0  # no input

    # NEW: calculate input current
    s_in = (1 - dt / tau_s) * s_in + p
    I = dot(W_in, s_in * E_in)
    I -= dot(W_in, s_in) * v[t]

    if v[t] < 35:
        # 3.2) update ODE
        dv = (0.04 * v[t] + 5) * v[t] + 140 - u[t]
        v[t + 1] = v[t] + (dv + I) * dt
        du = a * (b * v[t] - u[t])
        u[t + 1] = u[t] + dt * du
    else:
        # 3.3) spike!
        v[t] = 35
        v[t + 1] = c
        u[t + 1] = u[t] + d

# 4) plot voltage trace
figure()
tvec = arange(0., tmax, dt)
plot(tvec, v, 'b', label='Voltage trace')
xlabel('Time [ms]')
ylabel('Membrane voltage [mV]')
title("""A single qIF neuron
with %d Poisson inputs""" % n_in)
show()
Solution to step 2 with STP (Python)


from pylab import *

# 1) initialize parameters
tmax = 1000.
dt = 0.5

# 1.1) Neuron/Network pars
a = 0.02
b = 0.2
c = -65
d = 8.
tau_s = 10   # decay of synapses [ms]
tau_d = 500  # synaptic depression [ms]
stp_u = 0.5  # STP parameter

# 1.2) Input pars
tr = array([200., 700]) / dt
rate_in = 10  # input rate
n_in = 1      # number of inputs
w_in = 0.03   # input weights
W_in = w_in * ones(n_in)  # vector

# 2) reserve memory
T = int(ceil(tmax / dt))
v = zeros(T)
u = zeros(T)
v[0] = -70  # resting potential
u[0] = -14  # steady state
s_in = zeros(n_in)  # synaptic variable
E_in = zeros(n_in)  # rev potential
p_rate = dt * rate_in * 1e-3  # abbrev
h = ones(n_in)
lastsp = -inf * ones(n_in)

# 3) for-loop over time
for t in range(T - 1):
    # 3.1) get input
    if t > tr[0] and t < tr[1]:
        # NEW: get input Poisson spikes
        p = uniform(size=n_in) < p_rate

        # update synaptic depression
        tmp = exp(dt * (lastsp[p] - t) / tau_d)
        h[p] = 1 - (1 + (stp_u - 1) * h[p]) * tmp
        lastsp[p] = t
    else:
        p = 0  # no input

    # NEW: calculate input current
    s_in = (1 - dt / tau_s) * s_in + p * h
    I = dot(W_in, s_in * E_in)
    I -= dot(W_in, s_in) * v[t]

    if v[t] < 35:
        # 3.2) update ODE
        dv = (0.04 * v[t] + 5) * v[t] + 140 - u[t]
        v[t + 1] = v[t] + (dv + I) * dt
        du = a * (b * v[t] - u[t])
        u[t + 1] = u[t] + dt * du
    else:
        # 3.3) spike!
        v[t] = 35
        v[t + 1] = c
        u[t + 1] = u[t] + d

# 4) plot voltage trace
figure()
tvec = arange(0., tmax, dt)
plot(tvec, v, 'b', label='Voltage trace')
xlabel('Time [ms]')
ylabel('Membrane voltage [mV]')
title("""A single qIF neuron
with %d Poisson inputs""" % n_in)
show()
Step 3: Simulate 1000 neurons (not inter-connected)

Exercise 3
Simulate 1000 neurons for 1000 ms and plot the resulting spikes. Assume
that each neuron receives (random) 10% of the 100 Poisson spike trains of
rate frate = 2 Hz between time 200 ms and 700 ms. Note that the neurons
are not yet inter-connected.

Excitatory and inhibitory neurons:

A neuron is, with probability pinh = 0.2, a (fast-spiking) inhibitory neuron (a = 0.1, d = 2); the others are (regular-spiking) excitatory neurons (a = 0.02, d = 8).
The input weight from input synapse j to neuron i is set to w_in = 0.07 if connected (otherwise 0).

[Figure: spike raster of an unconnected network of 1000 qIF neurons (neuron number [#] vs. time [ms], Exc. and Inh.)]
Step 3 in detail:
Modify the last script (after saving it under new name).

Proceed as follows:
1 Initialize new parameter values (n = 1000)
2 Initialize 2 logical vectors kinh and kexc of length n, where kinh (i) is True with
probability p = 0.2 (marking an inhibitory neuron) and False otherwise. And
kexc = ¬kinh .
3 Reserve memory and initialize vi,t , ui,t (now being T × n matrices). Set
parameter vectors a and d according to kexc and kinh .
4 The weights w_ij^in = 0.07 now form an n × nin matrix. Set a random 90% of the elements to 0 to account for the connection probability.
5 Inside the for-loop change/add the following:
1 Same update equations (for vi,t+1 and ui,t+1 ) but use array notation.
6 Plot the spike raster. Plot black dots at {(t, i)|vit ≥ 35} for excitatory neurons i.
Use red dots for inhibitory neurons.
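Marking inhibitory neurons and assigning the per-neuron parameters can be sketched with boolean masks (the seed is illustrative, and `np.where` is used here in place of the solution's `choose`):

```python
import numpy as np

rng = np.random.default_rng(0)
n, p_inh = 1000, 0.2
k_inh = rng.random(n) < p_inh   # True marks an inhibitory neuron
k_exc = ~k_inh                  # logical NOT

# Per-neuron parameter vectors (equivalent to inh.choose(0.02, 0.1)):
a = np.where(k_inh, 0.1, 0.02)  # fast-spiking inh.: a = 0.1
d = np.where(k_inh, 2.0, 8.0)   # regular-spiking exc.: d = 8
```

Because a and d are now vectors, the same Euler update lines work unchanged for all n neurons at once.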
Solution to step 3 (Python)


from pylab import *

# 1) initialize parameters
tmax = 1000.
dt = 0.5

# 1.1) Neuron/Network pars
n = 1000     # number of neurons
p_inh = 0.2  # prob of inh neuron
inh = (uniform(size=n) < p_inh)  # whether inh.
exc = logical_not(inh)
a = inh.choose(0.02, 0.1)  # exc=0.02, inh=0.1
b = 0.2
c = -65
d = inh.choose(8, 2)  # exc=8, inh=2
tau_s = 10

# 1.2) Input pars
tr = array([200., 700]) / dt
rate_in = 2
n_in = 100
w_in = 0.07
p_conn_in = 0.1  # input conn prob
C = uniform(size=(n, n_in)) < p_conn_in
W_in = C.choose(0, w_in)  # matrix

# 2) reserve memory
T = int(ceil(tmax / dt))
v = zeros((T, n))  # now matrix
u = zeros((T, n))  # now matrix
v[0] = -70  # set 1st row
u[0] = -14
s_in = zeros(n_in)
E_in = zeros(n_in)
p_rate = dt * rate_in * 1e-3

# 3) for-loop over time
for t in range(T - 1):
    # 3.1) get input
    if t > tr[0] and t < tr[1]:
        p = uniform(size=n_in) < p_rate
    else:
        p = 0

    s_in = (1 - dt / tau_s) * s_in + p
    I = W_in.dot(s_in * E_in)
    I -= W_in.dot(s_in) * v[t]

    # NEW: handle all neurons
    fired = v[t] >= 35

    # 3.2) update ODE, simply update all
    dv = (0.04 * v[t] + 5) * v[t] + 140 - u[t]
    v[t + 1] = v[t] + (dv + I) * dt
    du = a * (b * v[t] - u[t])
    u[t + 1] = u[t] + dt * du

    # 3.3) spikes!
    v[t][fired] = 35
    v[t + 1][fired] = c
    u[t + 1][fired] = u[t][fired] + d[fired]

# 4) plotting
# NEW: get spikes and plot
tspk, nspk = nonzero(v == 35)
idxi = in1d(nspk, nonzero(inh)[0])  # find inh
idxe = logical_not(idxi)  # all others are exc

figure()
plot(tspk[idxe] * dt, nspk[idxe], 'k.',
     label='Exc.', markersize=2)
plot(tspk[idxi] * dt, nspk[idxi], 'r.',
     label='Inh.', markersize=2)
xlabel('Time [ms]')
ylabel('Neuron number [#]')
xlim((0, tmax))
title("""An unconnected network
of %d qIF neurons""" % n)
legend(loc='upper right')
show()
Step 4: Simulate recurrent network

Exercise 4
Simulate 1000 neurons as before but with added recurrent connections.

Recurrent synaptic activations

A neuron i is sparsely connected to a neuron j (with probability pconn = 0.1). Thus neuron i receives an additional current Iisyn of the form:

Iisyn = Σ_{j=1..n} wij sj(t) (Ej − vi(t))

Weights are Gamma distributed (wavg = 0.005 and gsc = 0.002). Set the inhibitory-to-excitatory connections twice as strong on average.

[Figure: spike raster of a recurrent network of 1000 qIF neurons (neuron number [#] vs. time [ms], Exc. and Inh.)]
Step 4 in detail:
Modify the last script (after saving it under new name).
Proceed as follows:
1 Initialize and allocate memory for the new variables (s = (sj ), Ej ). Set Ej = −85
if j is an inhibitory neuron (otherwise 0).
2 Reserve memory and initialize weights W = (wij ) to zero. Randomly choose 10%
of the matrix elements.
3 Set the chosen weight matrix elements to values drawn from a Gamma
distribution of scale gsc = 0.002 and shape gsh = wgavg
sc
, with wavg = 0.005. Hint

4 Make the weight matrix “sparse” to speed up computations. Hint

5 Scale weights from inh. to exc. neurons by the factor of 2. Hint

6 Inside the for-loop change/add the following:


1 add the equations for recurrent synaptic dynamics sj and add Isyn to the total
applied current.
sj ← sj + 1, if vj (t − 1) ≥ 35
syn
I ← W · (s ⊙ E) − (W · s) ⊙ v
sj ← (1 − ∆t/τs ) sj
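These update rules can be sketched in vectorized NumPy (a minimal sketch with toy sizes and made-up weights; n, dt and tau_s below are placeholders, not the tutorial's parameters):

```python
import numpy as np

rng = np.random.default_rng(0)
n, dt, tau_s = 5, 0.5, 10.0               # toy values for illustration

W = rng.uniform(size=(n, n))              # toy recurrent weight matrix
E = np.zeros(n); E[1] = -85.0             # neuron 1 inhibitory, others excitatory
v = np.full(n, -70.0); v[0] = 36.0        # neuron 0 has crossed threshold
s = np.zeros(n)                           # synaptic activations

fired = v >= 35                           # s_j <- s_j + 1   if v_j(t-1) >= 35
s = s + fired
Isyn = W.dot(s * E) - W.dot(s) * v        # I_syn <- W.(s*E) - (W.s)*v
s = (1 - dt / tau_s) * s                  # s_j <- (1 - dt/tau_s) s_j
```

Only the spiking neuron contributes to s here, so Isyn reduces to the column of W belonging to neuron 0, weighted by the membrane potentials.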
Solution to step 4

from pylab import *
from scipy.sparse import csr_matrix

# 1) initialize parameters
tmax = 1000.
dt = 0.5

# 1.1) Neuron/Network pars
n = 1000
pinh = 0.2
inh = (uniform(size=n) < pinh)
exc = logical_not(inh)
a = inh.choose(0.02, 0.1)
b = 0.2
c = -65
d = inh.choose(8, 2)
tau_s = 10

# NEW recurrent parameters
w = 0.005        # average recurrent weight
pconn = 0.1      # recurrent connection prob
scaleEI = 2      # scale I->E
gsc = 0.002      # scale of gamma
E = inh.choose(0, -85)

# NEW: make weight matrix
W = zeros((n, n))
C = uniform(size=(n, n))
idx = nonzero(C < pconn)                   # sparse connectivity
W[idx] = gamma(w/gsc, scale=gsc, size=idx[0].size)
W[ix_(exc, inh)] *= scaleEI                # submatrix indexing
W = csr_matrix(W)                          # make row sparse

# 1.2) Input pars
tr = array([200., 700])/dt
rate_in = 2
n_in = 100
w_in = 0.07
pconn_in = 0.1
C = uniform(size=(n, n_in)) < pconn_in
W_in = C.choose(0, w_in)

# 2) reserve memory
T = int(ceil(tmax/dt))
v = zeros((T, n))
u = zeros((T, n))
v[0] = -70
u[0] = -14
s_in = zeros(n_in)
E_in = zeros(n_in)
p_rate = dt*rate_in*1e-3
s = zeros(n)                               # rec synapses

# 3) for-loop over time
for t in arange(T-1):
    # 3.1) get input
    if t > tr[0] and t < tr[1]:
        p = uniform(size=n_in) < p_rate
    else:
        p = 0

    s_in = (1 - dt/tau_s)*s_in + p
    I = W_in.dot(s_in*E_in)
    I -= W_in.dot(s_in)*v[t]

    fired = v[t] >= 35

    # NEW: recurrent input
    s = (1 - dt/tau_s)*s + fired
    Isyn = W.dot(s*E) - W.dot(s)*v[t]
    I += Isyn                              # add to input vector

    # 3.2) update ODE
    dv = (0.04*v[t] + 5)*v[t] + 140 - u[t]
    v[t+1] = v[t] + (dv + I)*dt
    du = a*(b*v[t] - u[t])
    u[t+1] = u[t] + dt*du

    # 3.3) spikes!
    v[t][fired] = 35
    v[t+1][fired] = c
    u[t+1][fired] = u[t][fired] + d[fired]

# 4) plotting
tspk, nspk = nonzero(v == 35)
Step 5: Simulate an orientation column

Exercise 5
Restructure the connection matrix and the input to simulate an orientation
column. That is, all E-E neurons only connect to neighboring neurons and the
network resembles a 1D ring.

Ring structure
A neuron i is still sparsely connected to a neuron j, but now with probability
pconn = 0.4. However, if all neurons are arranged on a ring from 0 to 2π,
exc-to-exc connections are only possible if two neurons are nearer than π/4.
Input is only delivered to half of the neurons (e.g. from 0 to π). Use the
same input connection probability as before.

[Figure: raster plot of a recurrent network of 1000 qIF neurons — neuron
number vs. time in ms, excitatory and inhibitory spikes marked separately.]
Step 5 in detail:
Modify the last script (after saving it under a new name).
Proceed as follows:
1 Set those entries of the weight matrix to zero which belong to exc-exc
connections further apart than π/4. Hint: One can use scipy.linalg.circulant
2 Change the input so that only half of the neurons (e.g. from 0 to π) receive
input (again with connection probability 0.2).
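The circulant trick from the hint can be checked on a toy ring (a sketch; the exercise itself uses n = 1000 and width = π/4 — here width = π/3 is chosen so that each of the 8 toy neurons keeps exactly its two nearest neighbors):

```python
import numpy as np
from scipy.linalg import circulant

n, width = 8, np.pi / 3
theta = np.linspace(0, 2 * np.pi, n, endpoint=False)

# circulant(cos(theta))[i, j] equals cos(theta_i - theta_j) on the ring, and
# angular distance < width  <=>  cos(theta_i - theta_j) > cos(width)
R = circulant(np.cos(theta)) > np.cos(width)
```

R is a symmetric boolean mask; applying where(R[:, exc], W[:, exc], 0) then keeps only the local exc-to-exc connections.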
Solution to step 5

from pylab import *
from scipy.sparse import csr_matrix
from scipy.linalg import circulant

# 1) initialize parameters
tmax = 1000.
dt = 0.5

# 1.1) Neuron/Network pars
n = 1000
pinh = 0.2
inh = (uniform(size=n) < pinh)
exc = logical_not(inh)
a = inh.choose(0.02, 0.1)
b = 0.2
c = -65
d = inh.choose(8, 2)
tau_s = 10

width = pi/4     # half-width of the orientation tuning
w = 0.005
pconn = 0.4      # set a bit higher
scaleEI = 2
gsc = 0.002
E = inh.choose(0, -85)
W = zeros((n, n))
C = uniform(size=(n, n))
idx = nonzero(C < pconn)
W[idx] = gamma(w/gsc, scale=gsc, size=idx[0].size)
W[ix_(exc, inh)] *= scaleEI
theta = linspace(0, 2*pi, n)                  # NEW
R = circulant(cos(theta)) > cos(width)        # NEW
W[:, exc] = where(R[:, exc], W[:, exc], 0)    # NEW
W = csr_matrix(W)

# 1.2) Input pars
tr = array([200., 700])/dt
rate_in = 2
in_width = pi/2
w_in = 0.07
pconn_in = 0.2
n_in = 100
C = uniform(size=(n, n_in)) < pconn_in
W_in = C.choose(0, w_in)
W_in[n//2:, :] = 0                            # NEW: only half of the ring gets input

# 2) reserve memory
T = int(ceil(tmax/dt))
v = zeros((T, n))
u = zeros((T, n))
v[0] = -70
u[0] = -14
s_in = zeros(n_in)
E_in = zeros(n_in)
p_rate = dt*rate_in*1e-3
s = zeros(n)                                  # rec synapses

# 3) for-loop over time
for t in arange(T-1):
    # 3.1) get input
    if t > tr[0] and t < tr[1]:
        p = uniform(size=n_in) < p_rate
    else:
        p = 0

    s_in = (1 - dt/tau_s)*s_in + p
    I = W_in.dot(s_in*E_in)
    I -= W_in.dot(s_in)*v[t]

    fired = v[t] >= 35

    # NEW: recurrent input
    s = (1 - dt/tau_s)*s + fired
    Isyn = W.dot(s*E) - W.dot(s)*v[t]
    I += Isyn                                 # add to input vector

    # 3.2) update ODE
    dv = (0.04*v[t] + 5)*v[t] + 140 - u[t]
    v[t+1] = v[t] + (dv + I)*dt
    du = a*(b*v[t] - u[t])
    u[t+1] = u[t] + dt*du

    # 3.3) spikes!
    v[t][fired] = 35
    v[t+1][fired] = c
Congratulations!

You have just coded and simulated a quite realistic network model!
Hint for generating random numbers

Use the uniform function to generate arrays of random numbers
between 0 and 1.

from numpy.random import uniform
n_in = 100
r = uniform(size=n_in)

To set the elements of a vector v to a with probability p and otherwise
to b, one can use the choose method:

r = uniform(size=n_in)
v = (r < p).choose(b, a)

back
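A tiny deterministic check of this pattern (the values of p, a and b below are made up for illustration, and a fixed r stands in for the uniform draw):

```python
import numpy as np

p, a, b = 0.5, 7.0, -1.0               # assumed example values
r = np.array([0.1, 0.9, 0.3, 0.6])     # stands in for uniform(size=4)
v = (r < p).choose(b, a)               # where r < p pick a, otherwise b
```

The boolean array indexes the two choices: True (index 1) selects a, False (index 0) selects b.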
Hint for generating gamma-distributed random numbers

Use the numpy.random.gamma function to generate arrays of
gamma-distributed random numbers.

from numpy.random import gamma

gshape, gscale, n = 0.003, 2, 1000
r = gamma(gshape, gscale, n)

back
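As a quick sanity check (a sketch, not from the slides): a Gamma(shape, scale) distribution has mean shape × scale, so with the step-4 values gsh = wavg/gsc = 2.5 and gsc = 0.002 the sample mean should land near wavg = 0.005.

```python
import numpy as np

rng = np.random.default_rng(42)
gshape, gscale, n = 2.5, 0.002, 100000   # shape = w_avg / g_sc from step 4
r = rng.gamma(gshape, gscale, n)

# Gamma(shape, scale): mean = shape*scale, variance = shape*scale**2
```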
Hint for making sparse matrices

There are several forms of sparse matrices in the module scipy.sparse.
The one of interest for our purposes is the “row-wise” (CSR) sparse
matrix (see the documentation of scipy.sparse for more information).

from pylab import *
from scipy.sparse import csr_matrix

R = uniform(size=(100, 100))   # example matrix
W = where(R < 0.1, R, 0)       # mostly 0
W2 = csr_matrix(W)             # make sparse matrix

v = uniform(size=100)          # example vector
x = W2.dot(v)                  # matrix dot product

back
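To convince yourself that the sparse product gives the same numbers as the dense one (a minimal sketch extending the example above):

```python
import numpy as np
from scipy.sparse import csr_matrix

rng = np.random.default_rng(1)
R = rng.uniform(size=(100, 100))
W = np.where(R < 0.1, R, 0)      # ~90% of entries are zero
W2 = csr_matrix(W)               # CSR stores only the nonzeros

v = rng.uniform(size=100)
dense = W.dot(v)
sparse = W2.dot(v)               # identical result, less work
```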
Hint for submatrix indexing

numpy.array provides a shortcut for (MatLab-like) submatrix indexing.
Assume one has a matrix W and wants to add 1 to a selection of rows
and columns. One can use the convenient function ix_ and write

from pylab import *

W = uniform(size=(10, 10))      # example matrix
irow = uniform(size=10) < 0.5   # select some rows
icol = uniform(size=10) < 0.5   # select some cols

W[ix_(irow, icol)] += 1         # add 1 to the selected elements

back
Hint for using the dot product

Use the dot method of a numpy.array to compute the dot product.
Caution: the operator * yields an element-wise multiplication!

from pylab import *
a = array([1., 2, 3, 4])
b = array([1., 1, 5, 5])
c = a*b        # element-wise!
c.size
4
d = a.dot(b)   # scalar product
d.size
1

back
