Publication: The Evolution of Training Hyperparameters for SNNs with Hebbian Learning (video)

The Evolution of Training Parameters for Spiking Neural Networks with Hebbian Learning

In Proceedings of the Artificial Life Conference 2018 (ALIFE 2018), MIT Press.
Tokyo, Japan, 23-27 July 2018.

Abstract — Spiking neural networks, thanks to their sensitivity to the timing of their inputs, are a promising tool for the unsupervised processing of spatio-temporal data. However, they do not yet perform as well as traditional machine learning approaches, and their real-world applications remain limited. Various supervised and reinforcement learning methods for optimising spiking neural networks have been proposed, but more recently the evolutionary approach has regained attention as a tool for training neural networks. Here, we describe a simple evolutionary approach for optimising spiking neural networks. This is the first published use of an evolutionary algorithm to develop hyperparameters for fully unsupervised spike-timing-dependent learning for pattern clustering with spiking neural networks. Our results show that combining evolution and unsupervised learning leads to faster convergence on optimal solutions, better stability of fit solutions and higher fitness of the whole population than using either approach separately.
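To give a flavour of the approach described in the abstract, here is a minimal sketch of a generational evolutionary loop over STDP hyperparameters. The genome layout, parameter ranges, mutation scheme and the surrogate fitness function are assumptions made for this illustration only; in the paper the fitness would come from the clustering performance of an STDP-trained spiking network, not from the stand-in used here.

```python
# Minimal sketch (not the paper's implementation): a generational evolutionary
# algorithm over hypothetical STDP hyperparameters.
import random

# Hypothetical genome: (A_plus, A_minus, tau_plus_ms, tau_minus_ms, w_max)
BOUNDS = [(0.001, 0.1), (0.001, 0.1), (5.0, 50.0), (5.0, 50.0), (0.1, 2.0)]

def random_genome():
    return [random.uniform(lo, hi) for lo, hi in BOUNDS]

def mutate(genome, sigma=0.1):
    # Gaussian perturbation of each gene, clipped to its allowed range.
    child = []
    for (lo, hi), g in zip(BOUNDS, genome):
        g = g + random.gauss(0.0, sigma * (hi - lo))
        child.append(min(hi, max(lo, g)))
    return child

def fitness(genome):
    # Placeholder surrogate: in practice this would build a spiking network
    # with these STDP parameters, train it unsupervised on the input patterns,
    # and return a clustering score. Here we simply reward a balanced
    # A_plus/A_minus ratio so the example runs end to end.
    a_plus, a_minus, *_ = genome
    return -abs(a_plus - a_minus)

def evolve(pop_size=20, generations=50, elite=4):
    population = [random_genome() for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=fitness, reverse=True)
        parents = population[:elite]
        # Elitism plus mutated offspring of the best individuals.
        population = parents + [mutate(random.choice(parents))
                                for _ in range(pop_size - elite)]
    return max(population, key=fitness)

if __name__ == "__main__":
    best = evolve()
    print("best hyperparameters:", [round(g, 4) for g in best])
```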

A recording of the Neuron session at the ALIFE conference, day 2, featuring talks by Vegard Edwardsen (grid cells), Jason Yoder (neuromodulation) and me (evolution of artificial networks).
