Structural Plasticity in Arbor¶
In this tutorial, we are going to demonstrate how a network can be built using plasticity and homeostatic connection rules. Although this does not play to Arbor's strengths, we choose a LIF (Leaky Integrate and Fire) neuron model, as we are primarily interested in examining the required scaffolding.
We will build up the simulation in stages, starting with an unconnected network and finishing with a dynamically built connectome.
Concepts and Requirements¶
We cover some advanced topics in this tutorial, mainly structural plasticity. Please refer to other tutorials for the basics of network building. The model employed here, storing an explicit connection matrix, is not advisable in most scenarios.
In addition to Arbor and its requirements, scipy, matplotlib, and networkx need to be installed.
Unconnected Network¶
Consider a collection of N LIF cells. This will be the starting point for our exploration. For now, we set up each cell with a Poissonian input such that it will produce spikes periodically at a low frequency.
The Python file 01-setup.py is the scaffolding we will build our simulation around; it thus contains some passages that might seem redundant now, but will be helpful in later steps.
We begin by defining the global settings:
# global parameters
# cell count
N = 10
# total runtime [ms]
T = 1000
# one interval [ms]
t_interval = 10
# numerical time step [ms]
dt = 0.1
- N is the cell count of the simulation.
- T is the total runtime of the simulation in ms.
- t_interval defines the interval such that the simulation advances in discrete steps [0, 1, 2, ...] * t_interval. Later, this will be the timescale of plasticity.
- dt is the numerical timestep on which cells evolve.
These parameters are used here:
if __name__ == "__main__":
    rec = unconnected(N)
    sim = A.simulation(rec)
    sim.record(A.spike_recording.all)
    t = 0
    while t < T:
        t += t_interval
        sim.run(t * U.ms, dt * U.ms)
    plot_spikes(sim, rec.num_cells(), t_interval, T, prefix="01-")
where we run the simulation in increments of t_interval.
Back to the recipe, we set a prototypical cell:
self.cell = A.lif_cell("src", "tgt")
and deliver it for every gid:
def cell_description(self, gid: int):
    return self.cell
Also, each cell has an event generator attached, using a Poisson point process seeded with the cell's gid:
def event_generators(self, gid: int):
    return [
        A.event_generator(
            "tgt",
            self.gen_weight,
            A.poisson_schedule(freq=self.gen_freq, seed=self.cell_seed(gid)),
        )
    ]
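The helper cell_seed is not shown in this excerpt. A minimal sketch, assuming it merely derives a reproducible, per-cell seed from the recipe's global seed (the actual helper in the tutorial file may differ):

def cell_seed(self, gid: int) -> int:
    # Hypothetical: offset the global seed by the gid so every cell
    # draws from its own, reproducible random stream.
    return self.seed + gid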
All other parameters are set in the constructor:
def __init__(self, N) -> None:
    super().__init__()
    self.N = N
    # Cell prototype
    self.cell = A.lif_cell("src", "tgt")
    # random seed [0, 100]
    self.seed = 42
    # event generator parameters
    self.gen_weight = 20
    self.gen_freq = 1 * U.kHz
We also proceed to add spike recording and generate plots using a helper function plot_spikes from util.py. You can skip the following details for now and come back later if you are interested in how it works. Rates are computed by binning spikes into t_interval and the neuron id; the mean rate is the average across the neurons, smoothed using a Savitzky-Golay filter (scipy.signal.savgol_filter).
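As a rough sketch of the rate computation described above (the real helper in util.py handles the plotting as well; the filter window and order here are illustrative assumptions):

import numpy as np
from scipy.signal import savgol_filter

def spike_rates(spikes, n_cells, t_interval, T):
    # Bin spikes by (neuron id, interval) and convert counts to rates.
    n_bins = int(T / t_interval)
    counts = np.zeros((n_cells, n_bins))
    for (gid, _), time in spikes:
        counts[gid, min(int(time / t_interval), n_bins - 1)] += 1
    rates = counts / t_interval  # kHz, as times are in ms
    # Mean across neurons, smoothed with a Savitzky-Golay filter.
    mean = savgol_filter(rates.mean(axis=0), window_length=5, polyorder=2)
    return rates, mean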
We plot per-neuron and mean rates. We also generate raster plots via scatter.
A Randomly Wired Network¶
We use inheritance to derive a new recipe that contains all the functionality of the unconnected recipe. We then add a random connectivity matrix during construction, fix connection weights, and deliver the resulting connections via the connections_on callback, with the only extra consideration of allowing multiple connections between two neurons.
In detail, the recipe stores the connection matrix, the current incoming/outgoing connections per neuron, and the maximum for both directions:
# format [to, from]
self.connections = np.zeros(shape=(N, N), dtype=np.uint8)
self.inc = np.zeros(N, np.uint8)
self.out = np.zeros(N, np.uint8)
self.max_inc = 4
self.max_out = 4
The connection matrix is used to construct connections,
def connections_on(self, gid: int):
    return [
        A.connection((source, "src"), "tgt", self.syn_weight, self.syn_delay)
        for source in range(self.N)
        for _ in range(self.connections[gid, source])
    ]
together with the fixed connection parameters:
self.syn_weight = 80
self.syn_delay = 0.5 * U.ms
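To see the comprehension in connections_on at work, consider a hypothetical toy matrix:

# connections[tgt, src] counts the synapses from src onto tgt.
connections = np.zeros((3, 3), dtype=np.uint8)
connections[2, 0] = 2  # two connections 0 -> 2
connections[2, 1] = 1  # one connection  1 -> 2
# connections_on(2) then yields three A.connection objects: two with
# source (0, "src"), one with source (1, "src"), all onto "tgt" of cell 2.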
We define helper functions add_connection and del_connection to manipulate the connection table while upholding these invariants:
- no self-connections, i.e., connections[i, i] == 0
- inc[i] is the sum of connections[:, i]
- no more incoming connections than allowed by max_inc, i.e., inc[i] <= max_inc
- out[i] is the sum of connections[i, :]
- no more outgoing connections than allowed by max_out, i.e., out[i] <= max_out
These methods return True on success and False otherwise.
def add_connection(self, src: int, tgt: int) -> bool:
    if tgt == src or self.inc[tgt] >= self.max_inc or self.out[src] >= self.max_out:
        return False
    self.inc[tgt] += 1
    self.out[src] += 1
    self.connections[tgt, src] += 1
    return True

def del_connection(self, src: int, tgt: int) -> bool:
    if tgt == src or self.connections[tgt, src] <= 0:
        return False
    self.inc[tgt] -= 1
    self.out[src] -= 1
    self.connections[tgt, src] -= 1
    return True
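A hypothetical sanity check of these invariants, starting from an explicitly emptied table:

rec = random_network(10)
rec.connections[:] = 0
rec.inc[:] = 0
rec.out[:] = 0
assert rec.add_connection(0, 1)      # 0 -> 1 succeeds
assert not rec.add_connection(1, 1)  # self-connections are rejected
assert rec.inc[1] == rec.connections[:, 1].sum() == 1
assert rec.del_connection(0, 1)      # and can be removed again
assert not rec.del_connection(0, 1)  # but only once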
Both are used in rewire to produce a random connection matrix.
def rewire(self):
    tries = self.N * self.N * self.max_inc * self.max_out
    while (
        tries > 0
        and self.inc.sum() < self.N * self.max_inc
        and self.out.sum() < self.N * self.max_out
    ):
        src, tgt = np.random.randint(self.N, size=2, dtype=int)
        self.add_connection(src, tgt)
        tries -= 1
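Since every attempt goes through add_connection, the resulting matrix respects the invariants by construction; a hypothetical post-condition check:

rec = random_network(10)
rec.rewire()
assert (rec.inc <= rec.max_inc).all()         # in-degree bounded
assert (rec.out <= rec.max_out).all()         # out-degree bounded
assert (np.diag(rec.connections) == 0).all()  # no self-connections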
We then proceed to run the simulation:
Note that we also added a plot of the network connectivity using plot_network from util. This generates images of the graph and the connection matrix.
Adding Homeostasis¶
Under the homeostatic model, each cell has a setpoint for its firing rate \(\nu^*\), which is used to determine the creation or destruction of synaptic connections via
\[ \frac{dC}{dt} = \alpha\,(\nu^* - \nu), \]
where \(\nu\) is the cell's current firing rate and \(\alpha\) the sensitivity.
Thus we need to add some extra information to our simulation, namely the setpoint \(\nu^*_i\) for each neuron \(i\) and the sensitivity parameter \(\alpha\). We will also use a simplified version of the differential equation above: we add or delete exactly one connection whenever the deviation of the observed from the desired spiking frequency, scaled by \(\alpha\), exceeds \(\pm 1\). This is both for simplicity and to avoid sudden changes in the network structure.
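As a worked instance of this rule, using the parameter values chosen below (setpoint 0.1 kHz, sensitivity 200): the cast to int means a change is triggered only once the deviation exceeds \(1/\alpha = 0.005\) kHz.

alpha, setpoint = 200, 0.1       # gain and target rate [kHz]
int((setpoint - 0.102) * alpha)  # 0: within tolerance, no change
int((setpoint - 0.120) * alpha)  # negative: too fast, delete a connection
int((setpoint - 0.080) * alpha)  # positive: too slow, add a connection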
As before, we set up global parameters:
# global parameters
# cell count
N = 10
# total runtime [ms]
T = 10000
# one interval [ms]
t_interval = 100
# numerical time step [ms]
dt = 0.1
# set the seed of numpy's PRNG
np.random.seed(23)
# setpoint rate in kHz
setpoint_rate = 0.1
# sensitivity towards deviations from the setpoint
sensitivity = 200
and prepare our simulation:
rec = homeostatic_network(N, setpoint_rate, sensitivity)
sim = A.simulation(rec)
sim.record(A.spike_recording.all)
Note that our new recipe is almost unaltered from the random network.
class homeostatic_network(random_network):
    def __init__(self, N, setpoint_rate, sensitivity) -> None:
        super().__init__(N)
        self.max_inc = 8
        self.max_out = 8
        self.setpoint = setpoint_rate
        self.alpha = sensitivity
All changes are contained in the way we run the simulation. To add a further interesting feature, we skip the rewiring for the first half of the simulation. The initial network is unconnected, but could be populated randomly (or in any other way) if desired, by calling self.rewire() in the constructor of homeostatic_network before setting the maxima to eight.
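A hypothetical variant of the constructor that does exactly that, wiring randomly under the stricter inherited limits before loosening them:

class homeostatic_network(random_network):
    def __init__(self, N, setpoint_rate, sensitivity) -> None:
        super().__init__(N)
        self.rewire()     # populate under the inherited maxima of four
        self.max_inc = 8  # then give homeostasis more head room
        self.max_out = 8
        self.setpoint = setpoint_rate
        self.alpha = sensitivity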
Plasticity is implemented by tweaking the connection table inside the recipe between calls to run and calling simulation.update with the modified recipe:
sim.update(rec)
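To put this call into context, the driving loop might look roughly like the following sketch (simplified; the actual tutorial script differs in details such as bookkeeping and plotting):

t = 0
while t < T:
    sim.run((t + t_interval) * U.ms, dt * U.ms)
    if t >= T / 2:       # plasticity is only active in the second half
        # ... compute rates from sim.spikes() and adjust rec.connections
        # via add_connection/del_connection (see below) ...
        sim.update(rec)  # make Arbor pick up the modified table
    t += t_interval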
Note: As it is the central point here, it is worth emphasizing why this yields a changed network. The call to sim.update(rec) causes Arbor to internally re-build the connection table from scratch, based on the data returned by rec.connections_on. Here, however, this method just inspects the matrix in rec.connections and converts the data into an arbor.connection. Thus, changing this matrix before update will build a different network.
Important caveats:
- without update, changes to the recipe have no effect;
- vice versa, update has no effect if the recipe doesn't return different data than before;
- update will delete all existing connections and their parameters, so all connections to be kept must be explicitly re-instantiated;
- update will not delete synapses or their state, e.g., ODEs will still be integrated even if not connected and currents might be produced;
- neither synapses/targets nor detectors/sources can be altered; create all endpoints up front;
- only the network is updated (this might change in future versions!);
- be very aware that connections_on might be called in arbitrary order and by multiple (potentially different) threads and processes! This requires some thought and synchronization when dealing with random numbers and updating data inside connections_on.
Changes are based on the difference between the current rate, which we compute from the spikes during the last interval,
rates = np.zeros(N)
for (gid, _), time in sim.spikes():
    if time < t:
        continue
    rates[gid] += 1
rates /= t_interval  # kHz
and the setpoint, times the sensitivity:
dC = ((rec.setpoint - rates) * rec.alpha).astype(int)
Then, each potential pairing of target and source is checked in random order for whether adding or removing a connection is required:
added, deled = [], []  # bookkeeping of the changes made this interval
for tgt in randrange(N):
    if dC[tgt] == 0:
        continue
    for src in randrange(N):
        if dC[tgt] > 0 and rec.add_connection(src, tgt):
            added.append((src, tgt))
            break
        elif dC[tgt] < 0 and rec.del_connection(src, tgt):
            deled.append((src, tgt))
            break
If we find an option to fulfill that requirement, we do so and proceed to the next target. The randomization is important here, especially for adding connections, to avoid biases when there are too few eligible connection partners. The randrange function produces a shuffled range [0, N). We leverage the helper functions from the random network recipe to manipulate the connection table; see the discussion above.
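A minimal sketch of randrange, assuming a NumPy-based implementation (the actual helper lives in util.py):

def randrange(n: int):
    # Return the integers [0, n) in random order.
    res = np.arange(n, dtype=int)
    np.random.shuffle(res)
    return res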
Finally, we plot spiking rates as before; the jump at the half-way point is the effect of the plasticity activating, after which each neuron moves towards its setpoint.
And the resulting network:
Conclusion¶
This concludes our foray into structural plasticity. While the building blocks
- an explicit representation of the connections;
- running the simulation in batches (and calling simulation.update!);
- a rule to derive the change;
will likely be the same in all approaches, the concrete implementation of the rules is the centerpiece here. For example, although spike rate homeostasis was used here, mechanism states and ion concentrations, extracted via the normal probe and sample interface, can be leveraged to build rules. Due to the way the Python interface links to measurements, using the C++ API for access to streaming spike and measurement data could help to address performance issues. Plasticity as shown also meshes with the high-level connection builder. External tools to build and update connections might be useful as well.