#SpikingNeuralNetworks

Postdoc fellowship opportunity for ECRs (<3 yrs post-PhD). Note that if you want to apply to work with me as your mentor, our dept has an internal deadline of Dec 4th so please email me asap. Our internal process is shorter than the full application.

royalcommission1851.org/fellow

#Neuroscience #ComputationalNeuroscience #SpikingNeuralNetworks

Reminder: if you missed the #SNUFA spiking neural network and neuromorphic workshop earlier this month, all the talks were recorded and are now available to watch.

youtube.com/playlist?list=PL09

#Neuroscience #ComputationalNeuroscience #SpikingNeuralNetworks

Psst - #neuromorphic folks. Did you know that you can solve the SHD dataset with 90% accuracy using only 22 kB of parameter memory by quantising weights and delays? Check out our preprint with Pengfei Sun and Danyal Akarca:

arxiv.org/abs/2510.27434

Or check out the TLDR thread on Bsky:

bsky.app/profile/did:plc:niqde

#SpikingNeuralNetworks #ComputationalNeuroscience #Neuroscience
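The memory saving above comes from quantising parameters to a few bits each. Here's a toy sketch of uniform weight quantisation in plain Python; the function name, bit width, and clipping range are illustrative, not the preprint's actual scheme.

```python
# Toy uniform quantisation of a weight to n bits -- a sketch of the general
# idea behind shrinking parameter memory, NOT the method of the preprint.

def quantise(w, n_bits=4, w_max=1.0):
    """Map a float weight in [-w_max, w_max] to the nearest of 2**n_bits levels."""
    n_steps = 2 ** n_bits - 1                 # number of intervals between levels
    w = max(-w_max, min(w_max, w))            # clip to the representable range
    step = 2 * w_max / n_steps                # spacing between adjacent levels
    return round((w + w_max) / step) * step - w_max

# A 4-bit weight needs 4 bits of storage instead of 32 for a float,
# at the cost of a small rounding error:
print(quantise(0.3333, n_bits=4))
```

With 4-bit weights (and similarly quantised delays), each parameter costs an eighth of a 32-bit float, which is how a useful network can fit in tens of kilobytes.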

Spiking NN fans - the #SNUFA workshop (Nov 5-6) agenda is finalised and online now. Make sure to register (free) soon. (Note you can register for either day and come to both.)

Agenda: snufa.net/2025/
Registration: eventbrite.co.uk/e/snufa-2025-

Thanks to all who voted on abstracts!

#Neuroscience #ComputationalNeuroscience #SpikingNeuralNetworks

2025-10-10

"A few weeks ago a Chinese research team presented the model "SpikingBrain 1.0" – an #KI (AI) based on a #SpikingNeuralNetworks. This technique is said not only to use less energy, but also to get by without Nvidia chips and without large amounts of data." 🤔
Lina Knees via #Handelsblatt

Message for participants of the #SNUFA 2025 spiking neural network workshop. We got almost 60 awesome abstract submissions, and we'd now like your help to select which ones should be offered talks. To take part, follow the "abstract voting" link at:

snufa.net/2025/

It should take <15m. Thanks! ❤️

#neuroscience #SpikingNeuralNetworks #ComputationalNeuroscience

C:\mylife\> -=EV=- ev@c.im
2025-09-27

Neuromorphic Chips: How the Brain Inspires the Machines of the Future

Imagine a computer that thinks like you. Not just crunching numbers, but learning, adapting, and solving problems faster than you can blink, all while sipping less power than a fridge light bulb. Sounds like sci-fi? Welcome to neuromorphic chips—tech that mimics the human brain and is set to redefine computing. Let’s dive into this wild ride where science meets nature’s genius and see how these chips are shaping tomorrow.

What Are Neuromorphic Chips?

Forget traditional processors that plod through data like accountants. Neuromorphic chips are the rock stars of tech, inspired by the ultimate supercomputer: your brain. They mimic neurons and synapses, sending info via short electrical “spikes,” just like your brain does. These chips don’t just process—they learn and adapt with incredible efficiency.

“Neuromorphic” mixes “neuron” and “morphology” (form). These chips borrow the brain’s wiring: millions of artificial neurons linked by synapses work in parallel, like a data symphony. The best part? They use a fraction of the energy of regular chips.

How Do They Work? A Brain in Silicon

Picture your brain as a party where neurons only speak when they’ve got something juicy to share. Neuromorphic chips roll the same way:

1. Spiking Neural Networks (SNN): Unlike regular neural nets that buzz constantly, spiking networks only fire when needed, like texting only big news. Energy savings? Huge!
2. Parallel Power: These chips juggle data across multiple channels, like a brain planning dinner while grooving to music. Perfect for facial recognition or live video analysis.
3. Memory on Deck: Regular computers shuttle data like rush-hour couriers. Neuromorphic chips keep memory in the neurons—no traffic, just speed.
4. Self-Learning: These chips learn on the fly, like a brain memorizing a new route. Think robots that train themselves or cars that react faster than you.
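The "fire only when needed" behaviour in point 1 can be seen in a minimal leaky integrate-and-fire (LIF) neuron. This is a toy sketch with illustrative constants, not the model used by any particular chip:

```python
# Minimal leaky integrate-and-fire (LIF) neuron: a toy illustration of the
# event-driven "spike only when needed" behaviour described above.

def lif_run(inputs, tau=0.9, threshold=1.0):
    """Simulate one LIF neuron over a list of input currents.

    Returns the time steps at which the neuron spiked.
    """
    v = 0.0                     # membrane potential
    spikes = []
    for t, i_in in enumerate(inputs):
        v = tau * v + i_in      # leaky integration of input
        if v >= threshold:      # fire only when the threshold is crossed
            spikes.append(t)
            v = 0.0             # reset after the spike
    return spikes

# Silence in, silence out: no input means no spikes and (almost) no work.
print(lif_run([0.0] * 10))                  # -> []
# A burst of input produces a few spikes, then the neuron goes quiet again.
print(lif_run([0.6] * 5 + [0.0] * 5))       # -> [1, 3]
```

Compare this with a conventional artificial neuron, which produces an output on every single time step regardless of whether anything interesting happened.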

Why Are They Awesome?

1. Firefly Energy: Your brain runs on 20 watts—less than a hallway bulb. Neuromorphic chips aim for that efficiency, ideal for drones, smartwatches, or gadgets lasting months without a charge.
2. Lightning Speed: Parallel processing makes these chips tackle tasks that’d fry regular processors. Your phone could spot faces in a crowd in milliseconds.
3. AI on Fire: Built for AI, these chips supercharge neural networks, making AI smarter and cheaper to run.
4. Scalable Future: From tiny sensors to brain-mimicking supercomputers, these chips scale effortlessly.

Who’s Making the Magic?

Neuromorphic chips aren’t a lab dream—they’re real, thanks to top innovators:

- IBM TrueNorth: Launched in 2014, it rocks a million neurons and 256 million synapses, analyzing video faster than you can say “cool.”
- Intel Loihi: Debuted in 2017, upgraded in 2021, Loihi learns on-device, perfect for robots or smart cameras.
- BrainChip Akida: Built for offline smart devices, like cameras that recognize faces without the cloud.
- SpiNNaker: A University of Manchester project for brain simulation, using a massively parallel machine to run spiking neural networks in real time.

Where Will We See Them?

These chips are ready to invade our lives:

- Self-Driving Cars: Cars that see the road like humans and dodge pedestrians in a flash.
- Smart Robots: Robots learning new tasks solo, from factory work to surgical assists.
- Next-Gen Medicine: Analyzing MRIs or monitoring heartbeats in real time with precision.
- IoT and Smart Homes: Smarter speakers and cameras that work offline, sipping power.
- Near-Human AI: AI that gets context, emotions, and learns like a kid.

The Catch…

Every revolution has bumps:

- Programming’s Tough: Spiking networks need new programming models and toolchains; standard deep-learning workflows don’t map onto them directly.
- Not Universal: Great for AI, but overkill for simple tasks like taxes.
- Pricey Development: Building these chips costs billions and years.
- Algorithm Hunt: New algorithms are needed to unlock their full power.

What’s Next?

Neuromorphic chips are like the moon landing—early days, big dreams:

- Smart Everything: From AR glasses to medical implants, they’ll weave tech into life.
- Quantum Combo: Pairing with quantum computing could create light-speed brains.
- Soulful AI: We might get AI that’s not just smart but feels almost alive.

In Conclusion

Neuromorphic chips bridge machines and humans, silicon and neurons. They’re not just powerful—they’re smart, efficient, and nature-inspired. We’re entering an era where machines might think, learn, and even dream like us. One day, a neuromorphic chip might ask, “What’s love?”—and we’ll be the ones stumped.


#NeuromorphicComputing #ArtificialIntelligence #BrainInspiredTech #FutureOfComputing #AI #MachineLearning #SpikingNeuralNetworks #TechInnovation #IoT #Robotics #EdgeComputing #EnergyEfficiency #IntelLoihi #IBMTrueNorth #BrainChip #SpiNNaker #SmartDevices #AutonomousSystems

Today is the last day for submissions to the free online SNUFA spiking neural network workshop! This is a great opportunity to present your work (hundreds of participants and sometimes thousands of YouTube views per talk), and abstract submission is easy (300 words). Do it now!

snufa.net/2025/

#neuroscience #ComputationalNeuroscience #SpikingNeuralNetworks

2025-09-18

Zoomposium with Martin Bogdan: “When AI gets bored – paths to (artificial) consciousness”

What if a machine felt boredom?
And: How close are we to the vision of the digital self?

Prof. Martin Bogdan talks about brain-computer interfaces, artificial consciousness, spiking neural networks – and why boredom, of all things, could be a possible indicator of true intelligence.

What can AI teach us about consciousness?
And how do we assess risks when machines suddenly feel more than we would like?

💡 Listen now, think along, join the discussion at: open.spotify.com/episode/1R0JG

or 📖 reading: philosophies.de/index.php/2025

#Zoomposium #MartinBogdan #ArtificialIntelligence #ArtificialConsciousness #BrainComputerInterface #NeuromorphicInformationProcessing #DigitalImmortality #Singularity #BoredomInAI #Transhumanism #SpikingNeuralNetworks #Neuroengineering #ConsciousnessResearch


If you're interested in spiking neural networks, you should know about surrogate gradient descent and @fzenke.bsky.social's SPyTorch tutorial which shows just how easy it is to apply this method that's changing the whole field. Check out my paean below, published in @thetransmitter

thetransmitter.org/this-paper-

#SpikingNeuralNetworks #Neuroscience #ComputationalNeuroscience
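The core trick behind surrogate gradient descent is simple: keep the hard spike in the forward pass, but swap in a smooth stand-in derivative for the backward pass. Here's a framework-free sketch; the fast-sigmoid shape and the beta constant are illustrative choices, not the exact ones from the SPyTorch tutorial.

```python
# Surrogate gradient idea in plain Python: the forward pass uses the true,
# non-differentiable Heaviside spike function, while training uses a smooth
# surrogate derivative in its place during backprop.

def spike_forward(v, threshold=1.0):
    """Heaviside step: the actual (non-differentiable) spiking nonlinearity."""
    return 1.0 if v >= threshold else 0.0

def spike_surrogate_grad(v, threshold=1.0, beta=10.0):
    """Fast-sigmoid-style surrogate derivative used in place of the true one.

    The true derivative of the Heaviside step is zero almost everywhere,
    so gradient descent would stall; this surrogate is smooth and peaks
    at the threshold, giving a usable learning signal.
    """
    x = beta * (v - threshold)
    return beta / (1.0 + abs(x)) ** 2

print(spike_forward(1.2))                    # -> 1.0
print(spike_forward(0.5))                    # -> 0.0
print(round(spike_surrogate_grad(1.0), 3))   # -> 10.0 (peak at threshold)
```

In an autodiff framework this is implemented as a custom gradient rule for the spike function (e.g. a custom autograd Function in PyTorch), which is exactly what the SPyTorch tutorial walks through.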

Submissions (short!) due for SNUFA spiking neural networks conference in <2 weeks!

forms.cloud.microsoft/e/XkZLav

More info at snufa.net/2025/

Note that we normally get around 700 participants and recordings go on YouTube and get 100s-1000s views, so it's a good place to promote your work.

Please repost.

#neuroscience #SpikingNeuralNetwork #SpikingNeuralNetworks #snn #snufa

Spiking neural networks people, this message is for you!

The annual SNUFA workshop is now open for abstract submission (deadline Sept 26) and (free) registration. This year's speakers include Elisabetta Chicca, Jason Eshraghian, Tomoki Fukai, Chengcheng Huang, and... you?

snufa.net/2025/

Please boost!

#neuroscience #computationalneuroscience #SpikingNeuralNetworks

We are hiring a postdoc. It's a broadly scoped position but I think it would be of interest to someone in #Neuromorphic or #SpikingNeuralNetworks. See ad below. Inquiries to Dan Akarca because I will be on holiday.

Note, the application deadline is very soon! (Unavoidable admin issues.)

imperial.ac.uk/jobs/search-job

@root The closest technologies we have to how the human brain works are not LLMs, but some less well-known ones: reinforcement learning algorithms and hyperdimensional computing. If you want to see what HDC is capable of, check out this video:

youtu.be/P_WRCyNQ9KY?si=JgAuOJ

#HDC #HyperdimensionalComputing
#VSA #VectorSymbolicArchitecture
#HRR #HolographicReducedRepresentation
#SpikingNeuralNetworks
#AGI #ArtificialGeneralIntelligence
#LLM

Word cloud of abstracts we've received for #SNUFA #SpikingNeuralNetworks conference 2024. Register (free) by tomorrow afternoon UTC if you want to take part in selecting which abstracts get offered talk slots at the workshop!

snufa.net/2024/

#neuroscience

We got 50% more submissions this year for the #SNUFA #SpikingNeuralNetworks conference compared to last year: thanks! ❤️

We will shortly send out to registered participants a survey to allow you to take part in the approval voting scheme that will decide which abstracts we select as talks.

Register soon if you want to take part!

snufa.net/2024/

Submit your abstracts for the #SNUFA #SpikingNeuralNetworks conference by tomorrow. The conference is free, online, and usually has around 700 highly engaged participants. Talks are selected by participant interest.

Please do signal boost this!

snufa.net/2024/

#compneuro #neuroscience

2024-09-11
  • Extends the HOTS algorithm, improving its performance by adding homeostatic gain control on neuron activity to better learn spatio-temporal patterns; we prove an analogy with off-the-shelf LIF #SpikingNeuralNetworks

This needs a hand clap! 👏

hand clap in DVS-gesture
