#CompNeuro

2026-03-07

💼 Paid Opportunity: Join Neuromatch Academy and Climatematch Academy as a #virtual #TeachingAssistant this July. 8hrs/day, Monday to Friday during the course dates.

We are hiring Teaching Assistants for the following courses:

➡️ Learn more here: neuromatch.io/become-a-teachin
➡️ Apply before 15 March portal.neuromatchacademy.org/s

Apply to join Neuromatch Academy as a Teaching Assistant: 8 hours/day, Monday to Friday during the course dates in July, fully remote. Support a small pod of students on Zoom and Discord, and help create an inclusive and equitable learning environment.

You'll need:
Proficiency in Python
Knowledge of the course subject you’re applying to
An undergraduate degree
Previous teaching experience preferred, but not required
2026-03-05

Computational and Systems Neuroscience #COSYNE2026 is taking place in Lisbon, Portugal, from 12–17 March.

We’re excited to be there connecting with our global community, learning from the latest research, and sharing the work Neuromatch is doing.

If you’re attending, we’d love to meet! Let’s connect and continue building an open, collaborative #CompNeuro ecosystem.

cosyne.org/

#COSYNE #ComputationalNeuroscience #Neuroscience #OpenScience #Neuromatch

Will you be at COSYNE 2026?

Come see us at the Neuromatch booth
2026-03-05

🛡️ BrainGuard

An interdisciplinary team including Bernstein member Christian Klaes (#RUB) is developing protective measures for brain-computer interfaces and other technologies.

Read the whole story 👉 bernstein-network.de/en/newsro

#BernsteinNetwork #CompNeuro

2026-03-04

Recurrent cortical networks encode natural sensory statistics via sequence filtering www.cell.com/neuron/fullt... #compneuro #neuroskyence


2026-03-04

🛰️ We’re inviting proposals for Satellite Workshops at the #BernsteinConference 2026!

🗓️ Deadline: April 29, 15:00 CEST

More information 👉 bit.ly/3OV2BNO

#BernsteinNetwork #CompNeuro

2026-03-03

We’re excited to share that Foresight Institute is hosting a free webinar this Friday: “Neuromatch: Building a Global Ecosystem for NeuroAI Training & Innovation”

Friday, March 6
10:00–11:00 AM PST
Free and open to all!
➡️ Register: luma.com/88xhfzsh

Everyone is welcome. Please register and join us!

#NeuroAI #ComputationalNeuroscience #CompNeuro #Neuromatch #ArtificialIntelligence #Neuroscience #MachineLearning

2026-03-03

This video explains what makes Neuromatch and Climatematch Academy special, shares everything you need to know about our intensive 2–3 week online courses, and helps you prepare to apply! Applications close 15 March.
⬇️ ⬇️ ⬇️
youtu.be/Ja-4meM2dWc?si=dMvXeK

#CompNeuro #DeepLearning #NeuroAI #ClimateScience

2026-03-03

Applications are open for #CompNeuro, a live, intensive online course. Participants learn to combine modern #MachineLearning and causality frameworks with advanced modeling approaches to tackle real #neuroscience questions.

neuromatch.io/computational-ne

Computational Neuroscience 
3 Weeks, 6–24 July 2026
2026-03-02

Applications for Neuromatch and Climatematch Academy 2026 are open for students and Teaching Assistants. Don’t wait; the 15 March deadline is coming fast!

Don’t miss your chance. Start your application today.

➡️ Apply now: portal.neuromatchacademy.org/s

➡️ Learn more: neuromatch.io/courses/

#NeuromatchAcademy #ClimatematchAcademy #ComputationalScience #DeepLearning #CompNeuro #ClimateScience #NeuroAI #OnlineLearning #GlobalLearning #STEMEducation #ResearchSkills #Neuroscience

Student and Teaching Assistant applications close 15 March!
2026-02-27

🙋 Is there a cost to apply?
➡️ No! No payments are needed to apply.

🤓 Applications for Neuromatch Academy close 15 March!

Learn more about our #CompNeuro, #DeepLearning, and #NeuroAI courses and apply now: neuromatch.io/courses/

#NeuromatchAcademy #Neuromatch #STEMEducation #SummerSchool #VirtualSummerSchool #OnlineLearning #ResearchTraining #GlobalEducation #ClimateScience #ComputationalScience

Is there a cost to apply?
2026-02-25

Applications for July #CompNeuro, #DeepLearning, #NeuroAI & #ClimateScience are open.

Learn in small collaborative pods with a dedicated TA, build real research skills, and join 13,000+ alumni worldwide.

➡️ Apply by 15 March:
neuromatch.io/courses/

"I saw what open, rigorous, and collaborative science could look like. I saw people I hadn't known were there." - Kirollus Abdallah, Egypt
2026-02-24

🌈 How psychedelic drugs affect the brain

Research findings by Bernstein member Dirk Jancke (Ruhr University Bochum) reinforce new approaches in psychology, using psychedelic substances under medical supervision to treat certain clinical conditions.

Read the whole story 👉 bernstein-network.de/en/newsro

#BernsteinNetwork #CompNeuro

Fabrizio Musacchio @FabMusacchio
2026-02-23

The Urbanczik-Senn plasticity model is a powerful framework for understanding synaptic plasticity in neural networks. It integrates dendritic prediction errors to unify supervised, unsupervised, and reinforcement learning under a single rule. Its predictive coding mechanism and robust learning dynamics make it valuable for simulating neural processing and exploring plasticity. Here's a short simulation:

🌍 fabriziomusacchio.com/blog/202

Evolution of synaptic weights of Urbanczik synapses. Urbanczik-Senn plasticity dynamics: membrane potentials, somatic conductances, dendritic currents, firing rates, and rate derivative.
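The dendritic-prediction idea in the post above can be sketched in a few lines: a somatic "teacher" potential drives learning of dendritic weights until the dendritic prediction matches the somatic firing rate. This is a minimal rate-based sketch, not the code from the linked post; the transfer function, sizes, and constants are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def phi(v):
    """Sigmoidal transfer function mapping potential to firing rate (illustrative choice)."""
    return 1.0 / (1.0 + np.exp(-v))

# Toy setting: the somatic potential is clamped by a "teacher" (nudging) input,
# and the dendritic weights learn so that the dendritic prediction phi(v_dend)
# matches the somatic firing rate.
n_inputs, eta, n_steps = 20, 0.05, 2000
w = rng.normal(0.0, 0.1, n_inputs)          # dendritic synaptic weights (learned)
w_teacher = rng.normal(0.0, 0.5, n_inputs)  # defines the imposed somatic drive

errors = []
for _ in range(n_steps):
    x = rng.random(n_inputs)    # presynaptic rates for this step
    v_dend = w @ x              # dendritic potential (linear summation)
    v_soma = w_teacher @ x      # somatic potential set by the teacher
    # Urbanczik-Senn-style update: dendritic prediction error times presynaptic activity
    err = phi(v_soma) - phi(v_dend)
    w += eta * err * x
    errors.append(abs(err))

# The dendritic prediction error shrinks as the weights align with the teacher.
```

The key design point is locality: each synapse only needs the presynaptic activity and the soma-minus-dendrite mismatch, yet the same update serves supervised, unsupervised, and reward-modulated settings depending on what drives the soma.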
2026-02-19

🔍️ Brain network identified for effective treatment of Parkinson’s disease

Deep brain stimulation is a key procedure in the treatment of Parkinson's disease. Researchers including Bernstein member Wolf-Julian Neumann @UniKoeln have now identified the optimal target network in the human brain.

Read the whole story 👉 bernstein-network.de/en/newsro

#CompNeuro #BernsteinNetwork

Fabrizio Musacchio @FabMusacchio
2026-02-19

Just came across an elegant new framework by Maskeen and Lashkare, which implements a two-layer SNN with a local learning rule to classify, e.g., digits. Here is an example where I apply it to a 6-class subset of MNIST. The model reaches around 85% accuracy, and the learned synapses show digit-like patterns. Quite impressive in my view, given the simplicity of the architecture and the local learning rule:

🌍fabriziomusacchio.com/blog/202

Top: Evolution of the receptive field of the winner neuron across epochs for sample 61, visualized as tiles. Each tile shows the RF of the winner neuron at a specific epoch, allowing us to see how it evolves during training. The title of each tile indicates the epoch number, the index of the winner neuron, its spike count, its mapped label according to the final neuron label map, and the true label of the sample. Bottom: Summary plot of weight metrics (L1 norm, L2 norm, and mean weight) for the winner neurons across epochs for sample 61.

Learned synapses, visualized by summing over output neurons of the same predicted class. We trained the model on the classes 0 to 5, and we can see that the learned synaptic patterns for each class show distinct features that resemble the corresponding digit shapes, indicating that the network has successfully learned to differentiate between the classes based on the input spike patterns.

Confusion matrix, row-normalized.
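For intuition, the winner-take-all flavor of such a local rule can be sketched in rate-based form. This is not the framework from the post (no spikes are simulated); the toy inputs, network sizes, and learning rate are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

# Minimal winner-take-all network with a purely local learning rule:
# only the most strongly driven output neuron updates, moving its weights
# toward the current input, so each neuron converges to a class prototype.
n_in, n_out, eta = 64, 6, 0.1
W = rng.random((n_out, n_in))
W /= W.sum(axis=1, keepdims=True)  # normalize each neuron's weight vector

def toy_digit(cls, noise=0.1):
    """Hypothetical stand-in for an MNIST digit: a fixed random prototype per class plus noise."""
    proto = np.random.default_rng(cls).random(n_in)
    return np.clip(proto + noise * rng.normal(size=n_in), 0.0, 1.0)

def train_step(x):
    """Local rule: the winner (largest input drive) moves toward the input."""
    winner = int(np.argmax(W @ x))
    W[winner] += eta * (x - W[winner])  # uses only x and the winner's own weights
    return winner

for cls in rng.integers(0, n_out, 3000):
    train_step(toy_digit(int(cls)))
# After training, each active output neuron's weight vector approximates a class prototype,
# analogous to the digit-like synaptic patterns shown in the figures above.
```

Because every update is a convex combination of the old weights and a bounded input, the weights stay in [0, 1] without any explicit clipping, which mirrors the soft weight bounds often used in spiking WTA models.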
Alessandro Torcini @torcini
2026-02-18

@hnp_geneva
@virati

We are very pleased to announce the 4th International Workshop on Neurodynamics (NDy'26), to be held in Castro-Urdiales, Spain, May 26-29, 2026.

cody.unizar.es/events/neurodyn

Fabrizio Musacchio @FabMusacchio
2026-02-17

Spike-timing-dependent plasticity (#STDP) is a core learning rule in computational neuroscience that adjusts synaptic strength based on the precise timing of pre- vs. postsynaptic spikes, enabling learning and memory formation in neural networks. In this post, I summarize its mathematical formulation and its functional consequences for learning, along with a simple example:

🌍 fabriziomusacchio.com/blog/202

STDP learning window W(Δt) as a function of the relative spike timing Δt.

Synaptic weight dynamics with and without spike-timing-dependent plasticity. Left: STDP-enabled network; synaptic weights differentiate over time and converge toward a bimodal distribution. Right: control simulation without STDP; synaptic weights remain at their initial random values and show no dynamical reorganization. Top panels show the final synaptic weights, middle panels show the distribution of synaptic weights, and bottom panels show the time course of two example synapses.
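The pair-based formulation behind that learning window can be written down directly. The double-exponential window below is the classic textbook form; the amplitudes and time constants are illustrative, not values from the post.

```python
import numpy as np

# Pair-based STDP window:
#   W(dt) =  A_plus  * exp(-dt / tau_plus)   for dt > 0  (pre before post -> potentiation)
#   W(dt) = -A_minus * exp( dt / tau_minus)  for dt < 0  (post before pre -> depression)
A_PLUS, A_MINUS = 0.01, 0.012
TAU_PLUS, TAU_MINUS = 20.0, 20.0  # ms

def stdp_window(dt):
    """Weight change for one pre/post spike pair with dt = t_post - t_pre (ms)."""
    if dt > 0:
        return A_PLUS * np.exp(-dt / TAU_PLUS)
    if dt < 0:
        return -A_MINUS * np.exp(dt / TAU_MINUS)
    return 0.0

def apply_stdp(w, pre_spikes, post_spikes, w_min=0.0, w_max=1.0):
    """Sum all-to-all pair contributions and clip the weight to its bounds."""
    dw = sum(stdp_window(tp - tr) for tr in pre_spikes for tp in post_spikes)
    return float(np.clip(w + dw, w_min, w_max))

# Causal pairing (pre fires 5 ms before post) potentiates the synapse,
# anti-causal pairing depresses it:
w_pot = apply_stdp(0.5, pre_spikes=[10.0], post_spikes=[15.0])
w_dep = apply_stdp(0.5, pre_spikes=[15.0], post_spikes=[10.0])
```

With A_minus slightly larger than A_plus, uncorrelated spike trains drift toward depression while consistently causal inputs are strengthened, which is what drives the bimodal weight distribution seen in the left panel of the figure.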
