#SpikingNeuralNetwork

New preprint! What happens if you add neuromodulation to spiking neural networks and let them go wild with it? TLDR: it can improve performance, especially in challenging sensory processing tasks.

Preprint:

biorxiv.org/content/10.1101/20

Short explainer thread on Bluesky:

bsky.app/profile/neural-reckon

#neuroscience #ComputationalNeuroscience #SpikingNeuralNetwork

Submissions (short!) due for SNUFA spiking neural networks conference in <2 weeks!

forms.cloud.microsoft/e/XkZLav

More info at snufa.net/2025/

Note that we normally get around 700 participants, and recordings go on YouTube and get 100s-1000s of views, so it's a good place to promote your work.

Please repost.

#neuroscience #SpikingNeuralNetwork #SpikingNeuralNetworks #snn #snufa

Fabrizio Musacchio (pixeltracker@sigmoid.social)
2025-08-29

I recently played around with #RateModels using the #NESTsimulator. Compared to #SNN, rate models focus on the average firing rates of #NeuronPopulations, which simplifies the analysis of large networks. They effectively capture collective dynamics like #oscillations and #synchronization, though they miss precise spike-timing details. Thus, both approaches have their merits. Here is a brief overview:

🌍 fabriziomusacchio.com/blog/202

#CompNeuro #Neuroscience #Python #PythonTutorial #SpikingNeuralNetwork

Simulated population activity of the excitatory population using mesoscopic and microscopic simulations. The top panel shows the mesoscopic activity from the rate model: A_N (blue), computed from spike-recorder data as a binned histogram (discrete, noisier), and Ā (orange), computed from multimeter data as a continuous measure (smoother). A_N is inherently noisier and strongly dependent on bin size, compared to Ā, which averages activity continuously over the recording interval and therefore appears smoother. This is due to the fact that spike-recorder-based histograms capture discrete spike counts, while the multimeter integrates population firing as a continuous variable. The bottom panel shows, in contrast to the rate model's results, the microscopic activity A_N derived from simulated spiking GIF neurons. Mesoscopic and microscopic traces are not identical, since one averages firing rates and the other emerges from explicit spikes, but both capture the population's strong activation after 1500 ms. Rate models thus offer efficient and smooth approximations, while spiking models preserve variability and spike-level detail.
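[Editor's note: not part of the original post, but for readers who want a feel for what a rate model boils down to, here is a minimal NumPy sketch of a single-population firing-rate equation. It is a generic Wilson-Cowan-style toy model, not the NEST mesoscopic GIF model used in the post, and all parameter values are illustrative.]

```python
import numpy as np

# Generic single-population rate model (illustrative parameters):
#   tau * dA/dt = -A + F(w * A + I_ext)
# where A is the population rate and F is a sigmoidal gain function.

def F(x, gain=1.0, thresh=2.0):
    """Sigmoidal rate transfer function."""
    return 1.0 / (1.0 + np.exp(-gain * (x - thresh)))

tau = 10.0      # time constant (ms)
w = 3.0         # recurrent coupling
dt = 0.1        # integration step (ms)
T = 3000.0      # total simulated time (ms)

steps = int(T / dt)
A = np.zeros(steps)           # population rate (a.u.)
I_ext = np.zeros(steps)
I_ext[int(1500 / dt):] = 3.0  # step input after 1500 ms, as in the figure

for t in range(1, steps):
    drive = w * A[t - 1] + I_ext[t - 1]
    A[t] = A[t - 1] + dt / tau * (-A[t - 1] + F(drive))

# A now holds a smooth rate trace that switches to a high-activity state
# once the external input turns on at 1500 ms.
```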
Fabrizio Musacchio (pixeltracker@sigmoid.social)
2025-08-11

📚 New preprint by Vafaii, Galor & Yates: Brain-like variational inference. They derive #SpikingNeuralNetwork dynamics directly from variational free energy minimization via online natural #GradientDescent, yielding the iterative Poisson #VAE (iP-VAE) with strong sparsity, reconstruction & #BiologicalPlausibility.

🌍 arxiv.org/abs/2410.19315
🧑‍💻 github.com/hadivafaii/Iterativ

#Neuroscience #MachineLearning #SNN #CompNeuro

Figure 1: Inferential and dynamical accounts of perception are unified under variational inference.
(a) Perception is framed as a dynamical process of convergence to attractors in a neural state space,
where membrane potentials evolve and generate spikes along the way.
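[Editor's note: this sketch is not from the preprint. As a rough intuition for what "iterative inference with Poisson latents" means, here is a generic NumPy toy: a latent log-rate vector u (playing the role of a membrane potential) is updated by plain gradient ascent on a Poisson log-likelihood with a sparsity penalty. The dictionary Phi, the update rule, and all values are illustrative assumptions; the paper's actual iP-VAE update (online natural gradient descent on the free energy) differs.]

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative, made-up dimensions and data (not from the paper)
n_pix, n_lat = 64, 32
Phi = rng.random((n_pix, n_lat)) * 0.5      # nonnegative dictionary
x = rng.poisson(Phi @ rng.random(n_lat))    # synthetic count observation

u = np.zeros(n_lat)   # "membrane potential" = log of the latent rate
lam = 0.1             # sparsity penalty on the rates
lr = 0.05             # step size for the iterative updates

for _ in range(200):
    r = np.exp(u)                     # latent Poisson rates
    pred = Phi @ r + 1e-9             # predicted input intensity
    # Gradient of the Poisson log-likelihood minus a sparsity penalty,
    # taken w.r.t. u (the chain rule brings in a factor of r):
    grad = r * (Phi.T @ (x / pred) - Phi.sum(axis=0) - lam)
    u += lr * grad                    # plain gradient step (not natural gradient)

rates = np.exp(u)   # sparse, nonnegative latent rates after iterative inference
```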

New preprint for #neuromorphic and #SpikingNeuralNetwork folk (with Pengfei Sun and awesome MSc student Ziqiao Yu).

arxiv.org/abs/2507.16043

Surrogate gradients are popular for training SNNs, but some worry whether they really learn complex temporal spike codes. TLDR: we tested this, and yes they can!

We also find that delay-based spiking neural networks seem to degrade in more human-like ways than networks without delays.

Check the next post for links to the code and dataset which you can easily use to test your own spike based learning algorithms and models.
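[Editor's note: not from the preprint itself, but for anyone new to the idea, surrogate gradient training replaces the non-differentiable spike with a smooth pseudo-derivative on the backward pass. Below is a minimal PyTorch sketch using a fast-sigmoid surrogate with illustrative parameters; the preprint's actual setup may differ.]

```python
import torch

class SurrGradSpike(torch.autograd.Function):
    """Heaviside spike with a fast-sigmoid surrogate gradient
    (a common generic choice, not necessarily the one in the preprint)."""

    scale = 10.0  # surrogate steepness (illustrative value)

    @staticmethod
    def forward(ctx, v):
        ctx.save_for_backward(v)
        return (v > 0).float()        # spike when the potential crosses 0

    @staticmethod
    def backward(ctx, grad_output):
        (v,) = ctx.saved_tensors
        # d(spike)/dv is replaced by 1 / (scale*|v| + 1)^2
        surrogate = 1.0 / (SurrGradSpike.scale * v.abs() + 1.0) ** 2
        return grad_output * surrogate

spike_fn = SurrGradSpike.apply

# Minimal single-layer LIF loop (illustrative sizes and parameters)
T, batch, n_in, n_out = 100, 8, 50, 10
w = torch.randn(n_in, n_out) * 0.1
w.requires_grad_()
inputs = (torch.rand(T, batch, n_in) < 0.05).float()  # random input spike trains
beta = 0.9                                            # membrane decay

v = torch.zeros(batch, n_out)
spikes = []
for t in range(T):
    v = beta * v + inputs[t] @ w
    s = spike_fn(v)
    v = v - s                 # soft reset after a spike
    spikes.append(s)

loss = torch.stack(spikes).sum()  # dummy loss; gradients flow through the surrogate
loss.backward()
```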

Giacomo Indiveri :emacs:🧠 (giacomoi@fediscience.org)
2024-12-31

Proud to have managed to finish a #neuromorphic manuscript, with Chiara De Luca, Mirco Tincani and Elisa Donati just before the end of the year!

It demonstrates the benefits of using #braininspired principles of computation for achieving robust computation across multiple time-scales, despite the inherent variability of the underlying computational substrate (silicon neurons that faithfully emulate biological ones):
A neuromorphic multi-scale approach for heart rate and state detection
doi.org/10.21203/rs.3.rs-57373
#neuromorphic #wearable #neuroai #SpikingNeuralNetwork

Fabrizio Musacchio (pixeltracker@sigmoid.social)
2024-07-08

It’s actually very easy and straightforward to set up a large-scale, multi-population #SpikingNeuralNetwork (#SNN) with the #NESTsimulator. Here is an example with two distinct populations of #Izhikevich neurons:

🌍 fabriziomusacchio.com/blog/202

#ComputationalNeuroscience #CompNeuro #Neuroscience
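[Editor's note: for readers who haven't used NEST before, the basic pattern looks roughly like the sketch below, written with NEST 3.x-style calls. Population sizes, connectivity, weights, and Izhikevich parameters are illustrative placeholders, not the values from the linked post.]

```python
import nest

nest.ResetKernel()

# Two populations of Izhikevich neurons (sizes and parameters are illustrative)
exc = nest.Create("izhikevich", 800)          # regular-spiking-like defaults
inh = nest.Create("izhikevich", 200)
inh.set({"a": 0.1, "d": 2.0})                 # fast-spiking-like parameters

# Background Poisson drive to both populations
noise = nest.Create("poisson_generator", params={"rate": 8000.0})
nest.Connect(noise, exc, syn_spec={"weight": 2.0})
nest.Connect(noise, inh, syn_spec={"weight": 2.0})

# Random recurrent connectivity within and between populations
nest.Connect(exc, exc, {"rule": "fixed_indegree", "indegree": 80},
             {"weight": 0.5, "delay": 1.5})
nest.Connect(exc, inh, {"rule": "fixed_indegree", "indegree": 80},
             {"weight": 0.5, "delay": 1.5})
nest.Connect(inh, exc, {"rule": "fixed_indegree", "indegree": 20},
             {"weight": -1.0, "delay": 1.5})
nest.Connect(inh, inh, {"rule": "fixed_indegree", "indegree": 20},
             {"weight": -1.0, "delay": 1.5})

# Record spikes and simulate
rec = nest.Create("spike_recorder")
nest.Connect(exc, rec)
nest.Connect(inh, rec)
nest.Simulate(1000.0)

events = rec.get("events")       # spike times and sender IDs
print(len(events["times"]), "spikes recorded")
```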

Mmm...
Once, while surfing the net for papers in computational neuroscience (CNS), I came across a paper about Natural Language Processing all of a sudden, and I realized there aren't any real spiking neural networks that are biologically plausible for NLP.

I started reading some papers and books to gain more knowledge about language comprehension and language generation, but I haven't found any real model suggestions yet.

Does anyone know of any labs working on NLP in computational neuroscience, or how to connect with them?

#computationalneuroscience #cns #SpikingNeuralNetwork #NLP

A Thousand Brains: A New Theory of Intelligence

Hi :)
I'm a new member of this amazing community, and I would like my first post to be about the amazing breakthrough by Jeff Hawkins.

Before giving my opinion, I would like everyone to tell me whether they know the theory or have read the book or the original papers, and if so, what their insights on them are.

I think the material in this research is fascinating, and I would like to engage with and talk about it more.

#cns #SpikingNeuralNetwork #theory_of_brain #jeff_hawkins
#neuroscience #computationalneuroscience
