Lazy Learning: A Biologically-Inspired Plasticity Rule for Fast and Energy-Efficient Synaptic Plasticity
Aaron Pache, Mark van Rossum, University of Nottingham, United Kingdom
Session:
Posters 2B (Poster)
Presentation Time:
Fri, 25 Aug, 13:00–15:00 United Kingdom Time
Abstract:
When artificial neural networks learn classification tasks, the backpropagation algorithm updates the network parameters on every trial, even when the sample was classified correctly. Is this incessant updating necessary? Humans, in contrast, concentrate their learning effort on errors. A possible reason for this selective updating is to save the metabolic energy associated with synaptic updates. Inspired by this, we introduce lazy learning, which updates the network only on incorrectly classified samples. We find that lazy learning saves significant energy compared to vanilla backpropagation, particularly when datasets are large.
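The abstract does not include an implementation, but the rule it describes is straightforward to sketch. Below is a minimal, hypothetical PyTorch-style training step (the names `model`, `optimizer`, `loss_fn`, `lazy_training_step` are illustrative, not from the paper), assuming a standard classifier trained with cross-entropy: gradients are computed and applied only for the samples the network currently misclassifies, and the update is skipped entirely when the whole batch is correct.

```python
import torch
import torch.nn as nn


def lazy_training_step(model: nn.Module,
                       optimizer: torch.optim.Optimizer,
                       loss_fn: nn.Module,
                       x: torch.Tensor,
                       y: torch.Tensor):
    """One lazy-learning step: update weights only on misclassified samples."""
    logits = model(x)
    wrong = logits.argmax(dim=1) != y  # mask of incorrectly classified samples

    if wrong.any():
        # Backpropagate the loss on the errors only; correctly classified
        # samples trigger no synaptic update, saving the associated energy.
        loss = loss_fn(logits[wrong], y[wrong])
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
        return loss.item(), int(wrong.sum())

    # Entire batch correct: no update at all (vanilla backprop would
    # still compute and apply a gradient here).
    return 0.0, 0
```

As a sketch this omits details the paper may specify (e.g. batch size, how ties or margins are handled); the point is only that the gradient computation and weight write are gated on classification error.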