#SpikingNeuralNetworks

Spiking neural networks people, this message is for you!

The annual SNUFA workshop is now open for abstract submission (deadline Sept 26) and (free) registration. This year's speakers include Elisabetta Chicca, Jason Eshraghian, Tomoki Fukai, Chengcheng Huang, and... you?

snufa.net/2025/

Please boost!

#neuroscience #computationalneuroscience #SpikingNeuralNetworks

We are hiring a postdoc. It's a broadly scoped position but I think it would be of interest to someone in #Neuromorphic or #SpikingNeuralNetworks. See ad below. Inquiries to Dan Akarca because I will be on holiday.

Note, the application deadline is very soon! (Unavoidable admin issues.)

imperial.ac.uk/jobs/search-job

@root The closest technologies we have to how the human brain works are not LLMs, but some less well-known ones: reinforcement learning algorithms and hyperdimensional computing. If you want to see what HDC is capable of, check out this video:

youtu.be/P_WRCyNQ9KY?si=JgAuOJ

#HDC #HyperdimensionalComputing
#VSA #VectorSymbolicArchitecture
#HRR #HolographicReducedRepresentation
#SpikingNeuralNetworks
#AGI #ArtificialGeneralIntelligence
#LLM

Word cloud of abstracts we've received for #SNUFA #SpikingNeuralNetworks conference 2024. Register (free) by tomorrow afternoon UTC if you want to take part in selecting which abstracts get offered talk slots at the workshop!

snufa.net/2024/

#neuroscience

We got 50% more submissions this year for the #SNUFA #SpikingNeuralNetworks conference compared to last year: thanks! ❤️

We will shortly send out to registered participants a survey to allow you to take part in the approval voting scheme that will decide which abstracts we select as talks.

Register soon if you want to take part!

snufa.net/2024/

Submit your abstracts for the #SNUFA #SpikingNeuralNetworks conference by tomorrow. The conference is free, online, and usually has around 700 highly engaged participants. Talks are selected by participant interest.

Please do signal boost this!

snufa.net/2024/

#compneuro #neuroscience

2024-09-11
  • Extends the HOTS algorithm, improving its performance by adding homeostatic gain control on neuronal activity to improve the learning of spatio-temporal patterns; we also prove an analogy with off-the-shelf LIF #SpikingNeuralNetworks

This needs a hand clap! 👏

A hand clap in the DVS-Gesture dataset.

New preprint on our "collaborative modelling of the brain" (COMOB) project. Over the last two years, a group of us (led by @marcusghosh) have been working together, openly, online, with anyone free to join, on a computational neuroscience research project.

biorxiv.org/content/10.1101/20

This was an experiment in a more bottom up, collaborative way of doing science, rather than the hierarchical PI-led model. So how did we do it?

We started from the tutorial I gave at @CosyneMeeting 2022 on spiking neural networks that included a starter Jupyter notebook that let you train a spiking neural network model on a sound localisation task.

neural-reckoning.github.io/cos

youtube.com/watch?v=GTXTQ_sOxa

Participants were free to use and adapt this to any question they were interested in (we gave some ideas for starting points, but there was no constraint). Participants worked in groups or individually, sharing their work on our repository and joining us for monthly meetings.

The repository was set up to automatically build a website using @mystmarkdown showing the current work in progress of all projects, and (later in the project) the paper as we wrote it. This kept everyone up to date with what was going on.

comob-project.github.io/snn-so

We started from a simple feedforward network of leaky integrate-and-fire neurons, but others adapted it to include learnable delays, alternative neuron models, biophysically detailed models, incorporated Dale's law, etc.
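For readers who haven't seen the starting notebook: the core of such a model is just a discretised leaky integrate-and-fire update. Here is a minimal NumPy sketch in that spirit (illustrative only — the function name and all parameter values are made up, not the project's actual code):

```python
import numpy as np

# Minimal sketch of one layer of leaky integrate-and-fire neurons.
# Names and parameters are illustrative, not the COMOB notebook's.
def lif_step(v, spikes_in, w, alpha=0.95, v_th=1.0):
    """One Euler step: leak, add synaptic input, threshold, reset."""
    v = alpha * v + w @ spikes_in            # leaky integration of input spikes
    spikes_out = (v >= v_th).astype(float)   # emit a spike where threshold is crossed
    v = np.where(spikes_out > 0, 0.0, v)     # reset neurons that spiked
    return v, spikes_out

rng = np.random.default_rng(0)
w = rng.normal(0.0, 0.5, size=(8, 16))       # 16 inputs -> 8 LIF neurons
v = np.zeros(8)
for t in range(100):                         # drive with random input spike trains
    v, s = lif_step(v, rng.binomial(1, 0.2, 16).astype(float), w)
```

The adaptations mentioned below (learnable delays, alternative neuron models, Dale's law) all slot into this same update loop.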

We found some interesting results, including that shorter time constants improved performance (consistent with what we see in the auditory system). Surprisingly, the network seemed to be using an "equalisation cancellation" strategy rather than the expected coincidence detection.

Ultimately, our scientific results were not incredibly strong, but we think this was a valuable experiment for a number of reasons. Firstly, it shows that there are other ways of doing science. Secondly, many people got to engage in a research experience they otherwise wouldn't. Several participants have been motivated to continue their work beyond this project. It also proved useful for generating teaching material, and a number of MSc projects were based on it.

With that said, we learned some lessons about how to do this better, and yes, we will be doing this again (call for participation in September/October hopefully). The main challenge will be to keep the project more focussed without making it top down / hierarchical.

We believe this is possible, and we are inspired by the recent success of the Busy Beaver challenge, a bottom-up project of amateur mathematicians that found a proof of a 40-year-old conjecture.

quantamagazine.org/amateur-mat

We will be calling for proposals for the next project, engaging in an open discussion with all participants to refine the ideas before starting, and then inviting the proposer of the most popular project to act as a 'project lead' keeping it focussed without being hierarchical.

If you're interested in being involved in that, please join our (currently fairly quiet) new discord server, or follow me or @marcusghosh for announcements.

discord.gg/kUzh5MHjVE

I'm excited for a future where scientists work more collaboratively, and where everyone can participate. Diversity will lead to exciting new ideas and progress. Computational science has huge potential here, something we're also pursuing at @neuromatch.

Let's make it happen!

#neuroscience #computationalscience #computationalneuroscience #compneuro #science #metascience #SpikingNeuralNetworks #auditory

Diagram showing the process from starting material to group/solo research, online workshops and writing together.
Neural network model from audio signal to left/right ears, delay lines to auditory nerve fibres, hidden layer of LIF neurons, output layer of non-spiking neurons and an argmax decision.
Diagram of weight matrices showing equalisation-cancellation structure.

Could we decide whether a simulated spiking neural network uses spike timing or not, given that we have full access to the state of the network and can simulate perturbations? Ideas for how we could decide? Would everyone agree? #neuroscience #SpikingNeuralNetworks #computationalneuroscience #compneuro

Fabrizio Musacchio (pixeltracker@sigmoid.social)
2024-07-22

In 2000, Nicolas Brunel presented a framework for studying sparsely connected #SpikingNeuralNetworks (#SNN) with random connectivity & varied excitation-inhibition balance. The model, characterized by high sparseness & low firing rates, captures diverse neural dynamics such as synchronized regular and asynchronous irregular activity and global oscillations. Here is a brief summary of these concepts & a #PythonTutorial using the #NESTSimulator.

🌍 fabriziomusacchio.com/blog/202
#CompNeuro #Neuroscience

Spike raster plots (top) and histograms of the spiking rate (bottom) for a simulation of the Brunel network.
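For intuition, the key ingredients of a Brunel-style network — sparse random connectivity, an inhibition/excitation weight ratio g, and external drive — can be sketched in a few lines of plain NumPy. This is a toy discrete-time version, not Brunel's model or the NEST code from the post, and every parameter value is illustrative:

```python
import numpy as np

# Toy sparse excitatory-inhibitory network in the spirit of Brunel (2000).
# All parameters are illustrative, not the paper's.
rng = np.random.default_rng(1)
N_e, N_i, p = 80, 20, 0.1            # 80 excitatory, 20 inhibitory, 10% connectivity
g, w = 5.0, 0.1                      # inhibition/excitation ratio and base weight
N = N_e + N_i
J = (rng.random((N, N)) < p) * w     # sparse random coupling matrix
J[:, N_e:] *= -g                     # columns from inhibitory neurons are negative
alpha, v_th = 0.9, 1.0               # leak factor and firing threshold
v, s = np.zeros(N), np.zeros(N)
rates = []
for t in range(500):
    ext = rng.binomial(1, 0.05, N) * 0.5   # external Poisson-like drive
    v = alpha * v + J @ s + ext            # leak + recurrent + external input
    s = (v >= v_th).astype(float)
    v = np.where(s > 0, 0.0, v)            # reset spiking neurons
    rates.append(s.mean())
```

Sweeping g and the external drive in a sketch like this is what moves the network between the regular/irregular and synchronous/asynchronous regimes the post describes.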

SPIKING NEURAL NETWORKS!

If you love them, join us at SNUFA24. Free, online workshop, Nov 5-6 (2-6pm CET). Usually ~700 participants.

Invited speakers: Chiara Bartolozzi, David Kappel, Anna Levina, Christian Machens

Posters + 8 contributed talks selected by participant vote.

Abstract submission is quick and easy (300 word max), and now open until the deadline Sept 27.

Registration is free, but mandatory.

Hope to see you there!

snufa.net/2024/

#SpikingNeuralNetworks #neuroscience #computationalneuroscience #neuromorphic #neuromorphiccomputing #Neuromorphicengineering

SNUFA 2024

Brief summary. This online workshop brings together researchers in the fields of computational neuroscience, machine learning, and neuromorphic engineering to present their work and discuss ways of translating these findings into a better understanding of neural circuits. Topics include artificial and biologically plausible learning algorithms and the dissection of trained spiking circuits toward understanding neural processing. We have a manageable number of talks with ample time for discussions.

Executive committee. Melika Payvand, Laurent Perrinet, Dan Goodman, and Friedemann Zenke.

Key information

Workshop. 5-6 November 2024, European afternoons (online).

Abstract submission deadline. 27 September 2024. Submit an abstract

Registration. Registration is free but required. Register via EventBrite

Invited speakers.

    Chiara Bartolozzi (IIT Genova),
    David Kappel (University of Bochum),
    Anna Levina (Uni Tübingen),
    Christian Machens (Champalimaud)

Fabrizio Musacchio (pixeltracker@sigmoid.social)
2024-06-17

The #NEST #simulator is a powerful software for simulating large-scale #SpikingNeuralNetworks (#SNN). I’ve composed an introductory #tutorial showing the main commands for getting started. It's applied to examples with single neurons to reduce complexity. Feel free to share:

🌍 fabriziomusacchio.com/blog/202

#CompNeuro #ComputationalNeuroscience #Python #PythonTutorial #NESTSimulator

A single neuron simulated with the iaf_psc_alpha model (leaky integrate-and-fire model with alpha-shaped input currents) with an input current of 376.0 pA in NEST.
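The underlying dynamics of such a neuron can be sketched without NEST: Euler integration of the membrane equation with a constant current. A hedged stand-in only — this is not NEST's iaf_psc_alpha implementation, and the parameter values below are made up rather than NEST defaults:

```python
# Euler integration of a leaky integrate-and-fire neuron driven by a constant
# current. Illustrative parameters, NOT NEST's iaf_psc_alpha defaults.
dt, tau_m = 0.1, 10.0                    # time step and membrane time constant (ms)
E_L, V_th, V_reset = -70.0, -55.0, -70.0 # rest, threshold, reset (mV)
R, I = 40.0, 0.5                         # input resistance (MOhm) and current (nA)
v, spike_times = E_L, []
for step in range(int(200 / dt)):        # simulate 200 ms
    v += dt / tau_m * (-(v - E_L) + R * I)   # R*I = 20 mV steady-state drive
    if v >= V_th:                        # threshold crossing: record and reset
        spike_times.append(step * dt)
        v = V_reset
```

Because the drive (20 mV) exceeds the threshold distance (15 mV), the neuron fires regularly, as in the raster above.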

Fabrizio Musacchio (pixeltracker@sigmoid.social)
2024-05-29

Due to its computational efficiency and biological plausibility, the #IzhikevichModel is an exceptional tool for understanding #neuronal interactions within #SpikingNeuralNetworks (#SNN). Here’s a quick #Python implementation of Izhikevich's original #Matlab code along with examples using different synaptic weights and neuron types, each leading to diverse spiking behaviors and network dynamics:

🌍 fabriziomusacchio.com/posts/iz

#CompNeuro #Neuroscience #ComputationalScience #NeuralNetworks #modeling
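For reference, the model itself fits in a few lines. A sketch of the two-variable Izhikevich equations under simple Euler integration (a, b, c, d set to the standard "regular spiking" values; the input current and time step are illustrative, and this is not the linked implementation):

```python
# Euler sketch of Izhikevich's (2003) two-variable neuron model.
# a, b, c, d below give "regular spiking"; other choices give bursting, etc.
# Input current I and the time step are illustrative.
a, b, c, d = 0.02, 0.2, -65.0, 8.0
v, u = -65.0, b * -65.0            # membrane potential (mV) and recovery variable
I, dt = 10.0, 0.5                  # constant input current, time step (ms)
spikes = []
for step in range(int(1000 / dt)):               # simulate 1 second
    v += dt * (0.04 * v**2 + 5 * v + 140 - u + I)
    u += dt * a * (b * v - u)
    if v >= 30.0:                  # spike cutoff: record, then reset v and bump u
        spikes.append(step * dt)
        v, u = c, u + d
```

Swapping in other (a, b, c, d) sets is exactly how the post's examples produce the diverse spiking behaviors and network dynamics.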

Spiking neural network community. We are thinking of holding the annual SNUFA workshop on Nov 5-6 or 12-13. Preferences? Are there any clashes we should know about? #SpikingNeuralNetworks #Neuroscience #ComputationalNeuroscience

2024-05-07

Dear colleagues,

It's a pleasure to share with you this fully-funded #PhD position in #computational neuroscience in interaction with #neuromorphic engineering and #neuroscience:

laurentperrinet.github.io/post

TL;DR: This PhD subject focuses on the association between #attention and #SpikingNeuralNetworks for defining new efficient AI models for embedded systems such as drones, robots and more generally autonomous systems. The thesis will take place between the LEAT research lab in Sophia-Antipolis and the INT institute in Marseille which both develop complementary approaches on bio-inspired AI from neuroscience to embedded systems design.

The application should include:
• Curriculum vitæ,

• Motivation Letter,

• Letter of recommendation of the master supervisor.

and be sent to Benoit Miramond (benoit.miramond@unice.fr), Laurent Perrinet (Laurent.Perrinet@univ-amu.fr), and Laurent Rodriguez (laurent.rodriguez@univ-cotedazur.fr).

Cheers,
Laurent

PS: related references:

  • Emmanuel Daucé, Pierre Albigès, Laurent U Perrinet (2020). A dual foveal-peripheral visual processing model implements efficient saccade selection. Journal of Vision. doi: doi.org/10.1167/jov.20.8.22

  • Jean-Nicolas Jérémie, Emmanuel Daucé, Laurent U Perrinet (2024). Retinotopic Mapping Enhances the Robustness of Convolutional Neural Networks. arXiv: arxiv.org/abs/2402.15480

Animal camouflage illustrates the importance of exploration in vision: looking straight ahead reveals only vegetation, while making the right saccade reveals a cheetah ready to hunt its prey.
2024-03-15

Just in time for your weekend, we released Brian 2.6, the new version of your friendly spiking network simulator. 🚀
It comes with many small improvements, bug and compatibility fixes, and offers a major new feature for running standalone simulations repeatedly (or in parallel) without recompiling code. In addition, it comes with general infrastructure improvements all around (official wheels for Python 3.12! Docker images on Docker hub! Apple Silicon builds/tests!).
Enjoy (and let us know if you run into any issues, of course…) 🥳

briansimulator.org/posts/2024/

#SpikingNeuralNetworks #ComputationalNeuroscience #FOSS #Python #NoDeployFriday

2024-02-28

Special Session on #SpikingNeuralNetworks and #Neuromorphic Computing at the 33rd International Conference on Artificial Neural Networks (ICANN) 2024 - Call for Papers

Sep 17 - 20, Lugano, Switzerland

The special session invites contributions on recent advances in spiking neural networks. Spiking neural networks have gained substantial attention recently as a candidate for low latency and low power AI substrate, with implementations being explored in neuromorphic hardware. This special session aims to bring together practitioners interested in efficient learning algorithms, data representations, and applications.

ORGANIZERS:

  • Sander Bohté (CWI Amsterdam, Netherlands)
  • Sebastian Otte (University of Lübeck, Germany)

Find more details at: e-nns.org/wp-content/uploads/2

#icann #enns #ai #neuralnetworks #spiking #snn #neuromorphic #edgeai

Special Session on Spiking Neural Networks and Neuromorphic Computing at the 33rd International Conference on Artificial Neural Networks (ICANN2024) Sep 17 - 20, Lugano

I'm on the latest episode of Brain Inspired talking about #Neuroscience, #SpikingNeuralNetworks, #MachineLearning and #Metascience! Thanks Paul Middlebrooks (not on Mastodon I think) for the invite and the extremely fun conversation. For the explanation of why this picture you'll have to listen to the episode. 😉

braininspired.co/podcast/183/

Also, if you're not yet listening to Brain Inspired you should be - and support Paul on Patreon. He provides this free for the community with no adverts. What a hero!

Lion in a top hat in the style of van Gogh, generated by DALLE
2023-11-30

We are finally on Mastodon, time for a little #introduction 👋 !

Brian is a #FOSS simulator for biological #SpikingNeuralNetworks, for research in #ComputationalNeuroscience and beyond. It makes it easy to go from a high-level model description in Python, based on mathematical equations and physical units, to a simulation running efficiently on the CPU or GPU.

We have a friendly community and extensive documentation, links to everything on our homepage: briansimulator.org

This account will mostly announce news (releases, other notable events), but we're also looking forward to discussing with y'all 💬

#opensource #neuroscience #researchsoftware #introductions

Python code:
from brian2 import *
G = PoissonGroup(100, rates=50*Hz)  # 100 independent Poisson spike sources at 50 Hz
M = SpikeMonitor(G)                 # record every spike (time and neuron index)
run(100*ms)
plot(M.t/ms, M.i, '.k')             # raster plot: time (ms) vs neuron index

Below, a raster plot generated by the above code, showing random spiking activity.

All talks (but one) from SNUFA 2023 #SpikingNeuralNetworks workshop now available on our Youtube channel:

youtube.com/playlist?list=PL09

#neuroscience
