#NESTSimulator

Fabrizio Musacchio (@FabMusacchio)
2026-02-23

The Urbanczik-Senn plasticity model is a powerful framework for understanding synaptic plasticity. It integrates dendritic prediction errors to unify supervised, unsupervised, and reinforcement learning under a single rule. Its predictive coding mechanism and robust learning dynamics make it valuable for simulating neural processing and exploring plasticity. Here’s a short simulation using the #NESTsimulator:

🌍 fabriziomusacchio.com/blog/202

Evolution of synaptic weights of Urbanczik synapses. Urbanczik-Senn plasticity: membrane potentials, somatic conductances, dendritic currents, firing rates, and rate derivative.
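The core of the rule can be sketched in plain Python as a toy rate-based reduction (not NEST's urbanczik_synapse implementation; the gain function phi, the learning rate eta, and the fixed somatic target are illustrative choices):

```python
import math

def phi(u):
    """Sigmoidal gain mapping a potential to a firing rate (toy choice)."""
    return 1.0 / (1.0 + math.exp(-u))

def urbanczik_senn_step(w, x, u_soma, eta=0.1):
    """One toy weight update: the dendritic prediction V_w = w*x is pushed
    toward the somatic potential u_soma via the prediction error
    phi(u_soma) - phi(V_w), gated by the presynaptic input x."""
    v_dend = w * x
    error = phi(u_soma) - phi(v_dend)
    return w + eta * error * x

# With a fixed somatic target, the dendritic prediction converges toward it.
w = 0.0
for _ in range(2000):
    w = urbanczik_senn_step(w, x=1.0, u_soma=2.0)
print(round(w, 2))
```

At the fixed point the dendrite "predicts" the soma, so the error term, and hence the weight change, vanishes.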
Fabrizio Musacchio (@FabMusacchio)
2026-02-05

Incorporating structural plasticity in spiking neural networks (#SNN) enables dynamic connectivity, reflecting the brain's adaptability. By modeling synaptic growth and pruning based on calcium concentration, we can simulate processes such as learning and memory formation. In this post, I reproduce the NEST tutorial on structural plasticity, demonstrating its impact on network stability and connectivity:

🌍 fabriziomusacchio.com/blog/202

Sketch illustrating structural plasticity during learning and memory formation. The sketch illustrates the dynamic remodeling of synaptic connectivity through dendritic spine turnover. Left: Under baseline conditions, synaptic networks exhibit continuous formation and elimination of dendritic spines, reflecting ongoing structural plasticity. Middle: During learning or learning-related activity, this baseline turnover is transiently increased, leading to enhanced formation and pruning of synaptic contacts. Newly formed spines preferentially emerge near previously activated synapses, promoting the local clustering of synaptic inputs and enabling adaptive rewiring of circuits. Right: A subset of newly formed and activated synapses becomes selectively stabilized, providing a structural substrate for the long-term retention of behaviorally relevant connections and memory traces. Source: Bernardinelli, Y., Nikonenko, I., Muller, D., Structural plasticity: mechanisms and contribution to developmental psychiatric disorders, Frontiers in Neuroanatomy, 2014, 8:123, DOI 10.3389/fnana.2014.00123 (license: CC BY 4.0).

Simulation results of the structural plasticity model. The plot shows the temporal evolution of the mean calcium concentration of excitatory and inhibitory neurons (blue and red lines, respectively) and the total number of connections of excitatory and inhibitory neurons (magenta and black dashed lines, respectively). The horizontal lines represent the growth curves for excitatory (blue) and inhibitory (red) neurons. The model demonstrates how the network’s connectivity changes over time based on the calcium concentration in the neurons. The growth curves determine the threshold at which synapses are created or pruned.
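The growth rule driving this can be sketched as a toy linear growth curve, in the spirit of NEST's structural plasticity framework (parameter names nu and eps are illustrative, not NEST's API):

```python
def update_elements(z, ca, nu=1e-4, eps=0.05):
    """Linear growth curve: synaptic elements z grow while the calcium
    trace ca (a proxy for firing activity) is below the target eps,
    and retract (are pruned) when activity exceeds the target."""
    return max(0.0, z + nu * (1.0 - ca / eps))

# Hypoactive neurons grow elements (more potential synapses);
# hyperactive neurons retract them.
z_low = z_high = 1.0
for _ in range(5000):
    z_low = update_elements(z_low, ca=0.02)    # below-target activity
    z_high = update_elements(z_high, ca=0.10)  # above-target activity
print(z_low > 1.0, z_high < 1.0)
```

This homeostatic feedback is what lets the simulated network's connectivity track the calcium set point shown in the plot.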
Fabrizio Musacchio (@pixeltracker@sigmoid.social)
2025-09-04

🧠 Pastorelli et al. (2025) present a "simplified two-compartment #neuron with #CalciumDynamics capturing #brain-state-specific apical-amplification, -isolation and -drive". This Ca-#AdEx model replicates distinct #dendritic mechanisms across wakefulness, #NREM & #REM sleep using a compact ThetaPlanes transfer function. Cool implementation using the #NESTsimulator 💻!

🌍 doi.org/10.3389/fncom.2025.156

#Neuroscience #CompNeuro

Figure 1. Brain-state specific apical mechanisms in pyramidal neurons. (A) Cortical pyramidal cell. Green: soma and (peri)somatic dendrites. Orange: apical dendritic tuft. Gray: apical dendrite. Black: axon. Inputs from other areas, representing internally generated priors and top-down signals from areas higher in the hierarchy, are segregated to touch the apical tuft (light red arrow). Sensory evidence and input from areas at lower hierarchical levels (light blue arrow) target the somatic/perisomatic zone together with local signaling (green arrow). (B) During wakefulness, the AA mechanism signals the temporal coincidence of (peri-)somatic and apical inputs by emitting high-frequency bursts. (C) In deep sleep, AI induces the soma to ignore apical signals. (D) When dreaming, AD induces multi-areal integration driven by internal imagery in the absence of sensory input.
Fabrizio Musacchio (@pixeltracker@sigmoid.social)
2025-08-29

I recently played around with #RateModels using #NESTsimulator. Compared to #SNN, RM focus on average firing rates of #NeuronPopulations, simplifying analysis of large networks. They effectively capture collective dynamics like #oscillations and #synchronization, though they miss precise spike timing details. Thus, both approaches have their merits. Here is a brief overview:

🌍 fabriziomusacchio.com/blog/202

#CompNeuro #Neuroscience #Python #PythonTutorial #SpikingNeuralNetwork

Simulated population activity of the excitatory population using mesoscopic and microscopic simulations. The top panel shows the mesoscopic activity from the rate model: A_N (blue), computed from spike-recorder data as a binned histogram (discrete, noisier), and Ā (orange), computed from multimeter data as a continuous measure (smoother). A_N is inherently noisier and strongly dependent on bin size, compared to Ā, which averages activity continuously over the recording interval and therefore appears smoother. This is because spike-recorder-based histograms capture discrete spike counts, while the multimeter integrates population firing as a continuous variable. The bottom panel shows, in contrast to the rate model’s results, the microscopic activity A_N derived from simulated spiking GIF neurons. Mesoscopic and microscopic traces are not identical, since one averages firing rates and the other emerges from explicit spikes, but both capture the population’s strong activation after 1500 ms. Rate models thus offer efficient and smooth approximations, while spiking models preserve variability and spike-level detail.
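The mesoscopic half of this comparison boils down to integrating a rate equation; here is a minimal single-population sketch (the sigmoidal gain and all parameters are illustrative choices, not the GIF-based mesoscopic model from the post):

```python
import math

def simulate_rate(I_ext, w=0.5, tau=10.0, dt=0.1, steps=2000):
    """Euler-integrate a single-population rate model:
    tau * dr/dt = -r + F(w*r + I_ext), with a sigmoidal gain F.
    The state r is a population firing rate: no spikes, just a
    smooth scalar trajectory relaxing to a fixed point."""
    F = lambda x: 1.0 / (1.0 + math.exp(-x))
    r = 0.0
    for _ in range(steps):
        r += dt / tau * (-r + F(w * r + I_ext))
    return r

print(round(simulate_rate(I_ext=2.0), 3))
```

The smoothness of the trajectory, versus the bin-size dependence of a spike histogram, is exactly the mesoscopic/microscopic contrast in the figure.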
Fabrizio Musacchio (@pixeltracker@sigmoid.social)
2025-08-18

Here is a direct follow-up on this, now showing how to implement #GapJunctions in a network of #spiking #neurons (#SNN) using the #NESTsimulator. We simulate a network of 500 inhibitory neurons with gap junctions and analyze the effects on #synchrony and #oscillations. The code is also available on GitHub. Feel free to modify and expand upon it 🤗

🌍 fabriziomusacchio.com/blog/202

#CompNeuro #Neuroscience sigmoid.social/@pixeltracker/1

Spike raster plot and histogram of the spiking rate of a network of 500 inhibitory neurons with gap junctions. The simulation was run for different gap junction weights, here: 0.5. The average spike rate of the network is shown in the title of each plot.
Fabrizio Musacchio (@pixeltracker@sigmoid.social)
2025-08-17

📝 New blog post: #GapJunctions (#ElectricalSynapses) enable direct electrical and chemical communication between #neurons, synchronizing activity and supporting rapid signal propagation. Their #modeling is crucial for understanding #NeuralNetworkDynamics, #oscillations, and #brain 🧠 function. Here is a brief summary including a small #PythonTutorial using the #NESTsimulator.

🌍 fabriziomusacchio.com/blog/202

#CompNeuro #Neuroscience #Python #OpenSource

Schematic representation of gap junctions between two cells. Shown are patches of cell membranes (blue) of two cells, which are connected by connexons (orange). The connexons are formed by six connexin proteins each. The connexons of the two cells align and form a continuous hydrophilic channel between the cells, bridging the intercellular space, and are able to open and close. In the lower right corner, three exemplary cells are shown, which are connected by gap junctions. Source: Wikimedia Commons (license: CC BY-SA 4.0)

Membrane potential of two Hodgkin-Huxley neurons connected by a gap junction. The neurons exhibit synchronized activity due to the gap junction.
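The basic mechanism can be sketched with two passive membranes instead of the Hodgkin-Huxley neurons from the post (a toy model; all parameters are illustrative):

```python
def step(v1, v2, I1, I2, g_gap=0.1, g_leak=0.05, dt=0.1):
    """Euler step for two passive membranes coupled by a gap junction.
    The junction passes an ohmic current proportional to the voltage
    difference, pulling both membranes toward each other (bidirectional)."""
    i_gap = g_gap * (v2 - v1)           # current into cell 1 from cell 2
    v1 += dt * (-g_leak * v1 + I1 + i_gap)
    v2 += dt * (-g_leak * v2 + I2 - i_gap)
    return v1, v2

# Only cell 1 is driven, but the gap junction depolarizes cell 2 as well.
v1 = v2 = 0.0
for _ in range(5000):
    v1, v2 = step(v1, v2, I1=1.0, I2=0.0)
print(round(v1, 2), round(v2, 2))  # → 12.0 8.0 at steady state
```

In the spiking case the same voltage-difference current is what entrains the two neurons' spike times, producing the synchrony shown in the figure.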
Fabrizio Musacchio (@pixeltracker@sigmoid.social)
2024-07-22

In 2000, Nicolas Brunel presented a framework for studying sparsely connected #SpikingNeuralNetworks (#SNN) with random connectivity & varied excitation-inhibition balance. The model, characterized by high sparseness & low firing rates, captures diverse neural dynamics such as synchronized regular and asynchronous irregular activity and global oscillations. Here is a brief summary of these concepts & a #PythonTutorial using the #NESTsimulator.

🌍 fabriziomusacchio.com/blog/202
#CompNeuro #Neuroscience

Spike raster plots (top) and histograms of the spiking rate (bottom) for a simulation of the Brunel network.
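A minimal sketch of how the external drive is parameterized in this framework, using the standard Brunel (2000) relations (the default values below are the commonly used ones, stated here as an assumption):

```python
def brunel_external_rate(eta, theta=20.0, J=0.1, C_E=1000, tau_m=20.0):
    """External Poisson rate per input in Brunel's (2000) network,
    expressed via the threshold rate nu_thr = theta / (J * C_E * tau_m):
    the rate at which C_E external inputs alone just reach threshold
    theta (mV), given synaptic weight J (mV) and membrane tau_m (ms)."""
    nu_thr = theta / (J * C_E * tau_m)   # spikes per ms
    return eta * nu_thr

# The ratios g = inhibitory/excitatory weight and eta = nu_ext/nu_thr
# select the dynamical regime; e.g. g = 5, eta = 2 lies in the
# asynchronous-irregular region of the phase diagram.
nu_ext = brunel_external_rate(eta=2.0)
print(round(nu_ext * 1000, 1), "Hz per external input")  # → 20.0 Hz
```

Sweeping g and eta and re-running the raster/histogram plots is the quickest way to reproduce the regimes (SR, AI, SI) from the paper.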
Fabrizio Musacchio (@pixeltracker@sigmoid.social)
2024-07-15

This #tutorial explores the oscillatory #PopulationDynamics of generalized #IntegrateAndFire (GIF) neurons simulated with #NESTSimulator. The GIF #NeuronModel is a biophysically detailed model that captures the essential features of spiking neurons, including #SpikeFrequencyAdaptation and #DynamicThreshold behavior:

🌍 fabriziomusacchio.com/blog/202

#ComputationalNeuroscience #Python #Neuroscience

Oscillatory population dynamics of GIF neurons: spike times (top) and spike frequency (bottom).
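The dynamic-threshold idea can be sketched with a toy neuron (not NEST's GIF implementation; all parameters are illustrative):

```python
import math

def simulate_gif_threshold(n_steps=400, dt=1.0, I=1.2):
    """Toy GIF-style neuron: a leaky membrane plus a dynamic threshold
    that jumps after each spike and decays back, yielding
    spike-frequency adaptation (inter-spike intervals grow)."""
    tau_m, v_th0, dv_th, tau_th = 10.0, 1.0, 0.8, 50.0
    v, theta, spikes = 0.0, 0.0, []
    for t in range(n_steps):
        v += dt / tau_m * (-v + I)
        theta *= math.exp(-dt / tau_th)   # adaptive threshold decays
        if v >= v_th0 + theta:            # dynamic threshold crossed
            spikes.append(t)
            theta += dv_th                # threshold jumps on each spike
            v = 0.0                       # membrane reset
    return spikes

spikes = simulate_gif_threshold()
isis = [b - a for a, b in zip(spikes, spikes[1:])]
print(isis[0] < isis[-1])  # adaptation: later intervals are longer
```

In a population, this shared adaptation timescale is one ingredient behind the oscillatory dynamics in the figure.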
Fabrizio Musacchio (@pixeltracker@sigmoid.social)
2024-07-08

It’s actually very easy and straightforward to set up a large-scale, multi-population #SpikingNeuralNetwork (#SNN) with the #NESTsimulator. Here is an example with two distinct populations of #Izhikevich neurons:

🌍 fabriziomusacchio.com/blog/202

#ComputationalNeuroscience #CompNeuro #Neuroscience
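For reference, the single-neuron dynamics behind such populations can be written in a few lines (the standard Izhikevich equations and the textbook RS/FS parameter sets, used here as a toy Euler simulation, not NEST's izhikevich model):

```python
def izhikevich_step(v, u, I, a, b, dt=0.5):
    """One Euler step of the Izhikevich model:
    v' = 0.04 v^2 + 5 v + 140 - u + I,  u' = a (b v - u)."""
    v += dt * (0.04 * v * v + 5.0 * v + 140.0 - u + I)
    u += dt * a * (b * v - u)
    return v, u

def run(params, I, n_steps=1000):
    """Count spikes over n_steps with the reset v <- c, u <- u + d
    whenever v reaches the 30 mV spike cutoff."""
    a, b, c, d = params
    v, u, spikes = -65.0, b * -65.0, 0
    for _ in range(n_steps):
        v, u = izhikevich_step(v, u, I, a, b)
        if v >= 30.0:
            v, u, spikes = c, u + d, spikes + 1
    return spikes

# Two populations with different dynamics: regular spiking (RS) vs
# fast spiking (FS), the usual choice for excitatory vs inhibitory cells.
rs = run((0.02, 0.2, -65.0, 8.0), I=10.0)
fs = run((0.10, 0.2, -65.0, 2.0), I=10.0)
print(rs > 0 and fs > rs)  # FS fires faster than RS at the same drive
```

Swapping the (a, b, c, d) tuple is all it takes to give each population its own firing phenotype.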

Fabrizio Musacchio (@pixeltracker@sigmoid.social)
2024-06-25

Before building more complex neural networks in #NESTSimulator, it is crucial to understand its connection rules. I’ve put together an overview of all available rules, which mainly follows NEST's documentation (nest-simulator.readthedocs.io/)

🌍 fabriziomusacchio.com/blog/202

#CompNeuro #ComputationalNeuroscience #Neuroscience #SNN #SpikingNeuralNetwork #NEST

Sketch of the all-to-all connection scheme in NEST.
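The semantics of two of these rules can be sketched in plain Python (illustrative re-implementations of what "all_to_all" and "fixed_indegree" mean, not NEST's internals; sampling with replacement mirrors NEST's default of allowing multapses):

```python
import random

def all_to_all(sources, targets):
    """NEST's 'all_to_all' rule: every source connects to every target."""
    return [(s, t) for t in targets for s in sources]

def fixed_indegree(sources, targets, indegree, rng=random):
    """NEST's 'fixed_indegree' rule: each target receives exactly
    `indegree` connections, with sources drawn at random
    (with replacement, i.e. multapses are possible)."""
    return [(rng.choice(sources), t) for t in targets for _ in range(indegree)]

src, tgt = [1, 2, 3], [10, 11]
print(len(all_to_all(src, tgt)))         # 3 sources x 2 targets = 6
print(len(fixed_indegree(src, tgt, 2)))  # 2 targets x indegree 2 = 4
```

In NEST itself these become conn_spec dictionaries such as {"rule": "fixed_indegree", "indegree": 2} passed to nest.Connect.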
Fabrizio Musacchio (@pixeltracker@sigmoid.social)
2024-06-17

The #NEST #simulator is a powerful software for simulating large-scale #SpikingNeuralNetworks (#SNN). I’ve composed an introductory #tutorial showing the main commands for getting started. It's applied to examples with single neurons to reduce complexity. Feel free to share:

🌍 fabriziomusacchio.com/blog/202

#CompNeuro #ComputationalNeuroscience #Python #PythonTutorial #NESTSimulator

A single neuron simulated with the iaf_psc_alpha model (leaky integrate-and-fire model with alpha-shaped input currents) with an input current of 376.0 pA in NEST.
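The figure's setup can be approximated without NEST by a plain leaky integrate-and-fire integration (a sketch assuming iaf_psc_alpha-like defaults of C_m = 250 pF, tau_m = 10 ms, V_th = -55 mV, E_L = V_reset = -70 mV, and a constant rather than alpha-shaped input current):

```python
def lif_spike_times(I_pA, t_sim=1000.0, dt=0.1):
    """Euler-integrate a LIF neuron with constant input current.
    With I = 376 pA the steady-state potential E_L + I*tau_m/C_m
    = -54.96 mV sits just above threshold, so the neuron fires slowly;
    at 300 pA the steady state (-58 mV) stays subthreshold."""
    C_m, tau_m, V_th, E_L = 250.0, 10.0, -55.0, -70.0
    v, t, spikes = E_L, 0.0, []
    while t < t_sim:
        v += dt / tau_m * (-(v - E_L) + I_pA * tau_m / C_m)
        if v >= V_th:
            spikes.append(t)
            v = E_L                      # reset to resting potential
        t += dt
    return spikes

print(len(lif_spike_times(376.0)) > 0, len(lif_spike_times(300.0)) == 0)
```

This is why the post's example uses 376 pA: it is just above the rheobase of the default parameter set.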
