Spatially-embedded recurrent spiking neural networks reveal patterns of topologically structured computations
Andrew Siyoon Ham, Duncan E. Astle, University of Cambridge, United Kingdom; Jascha Achterberg, University of Cambridge / Intel Labs, United Kingdom; Danyal Akarca, University of Cambridge, United Kingdom
Session:
Posters 3B Poster
Presentation Time:
Sat, 26 Aug, 13:00 - 15:00 United Kingdom Time
Abstract:
Recent work has shown that by placing recurrent neural networks (RNNs) within a geometric space with local communication constraints, it is possible to recapitulate numerous structural and functional empirical neuroscience findings. While promising, it is not known how well these findings, which are based on rate RNNs, generalize to (1) contexts where networks must solve more challenging tasks or (2) architectures that incorporate spikes rather than rate-based codes. Here, we present the first implementation of a spatially-embedded recurrent spiking neural network (seRSNN), which solves a neuromorphic gestures task with complex temporal dynamics. We show that seRSNNs, without compromising on training or performance, converge on highly communicative, modular, small-world network topologies consistent with prior findings, prioritizing short local connectivity while solving the task. These early findings open the door to numerous exciting research directions for studying the topological structure underlying computations in both neuroscience and neuromorphic computing. We provide an openly available notebook allowing researchers to train their own seRSNNs.
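The sketch below illustrates the core idea only, under stated assumptions; it is not the authors' released notebook. It shows a recurrent spiking layer of leaky integrate-and-fire units assigned fixed 3D coordinates, with a distance-weighted L1 penalty on the recurrent weights added to the task loss so that long-range connections are discouraged during training. All names and hyperparameters (N_HID, BETA, THRESHOLD, LAMBDA_SPATIAL, the fast-sigmoid surrogate gradient) are illustrative assumptions.

```python
# Minimal sketch of a spatially-embedded recurrent spiking network (seRSNN-style).
# Hidden LIF units live at fixed 3D coordinates; the spatial penalty makes
# long connections with large weights expensive, so short local connectivity
# is prioritized while the task loss is optimized.
import torch
import torch.nn as nn

N_IN, N_HID, N_OUT = 64, 100, 11    # illustrative sizes (e.g. 11 gesture classes)
BETA, THRESHOLD = 0.9, 1.0          # LIF membrane decay and firing threshold (assumed)
LAMBDA_SPATIAL = 1e-4               # strength of the distance * |weight| penalty (assumed)


class SurrogateSpike(torch.autograd.Function):
    """Heaviside spike in the forward pass, fast-sigmoid surrogate gradient in backward."""

    @staticmethod
    def forward(ctx, mem):
        ctx.save_for_backward(mem)
        return (mem > 0).float()

    @staticmethod
    def backward(ctx, grad_output):
        (mem,) = ctx.saved_tensors
        return grad_output / (1.0 + 10.0 * mem.abs()) ** 2


class SpatialRSNN(nn.Module):
    """Recurrent LIF layer whose hidden units are embedded at fixed 3D positions."""

    def __init__(self):
        super().__init__()
        self.w_in = nn.Linear(N_IN, N_HID, bias=False)
        self.w_rec = nn.Linear(N_HID, N_HID, bias=False)
        self.w_out = nn.Linear(N_HID, N_OUT, bias=False)
        # Random positions in a unit cube; pairwise Euclidean distances act as
        # the communication cost between hidden units.
        coords = torch.rand(N_HID, 3)
        self.register_buffer("dist", torch.cdist(coords, coords))

    def spatial_penalty(self):
        # Long-range connections with large weights are penalized the most.
        return (self.dist * self.w_rec.weight.abs()).sum()

    def forward(self, x):  # x: (time, batch, N_IN) binary spike trains
        T, B, _ = x.shape
        mem = torch.zeros(B, N_HID, device=x.device)
        spk = torch.zeros(B, N_HID, device=x.device)
        out_sum = torch.zeros(B, N_OUT, device=x.device)
        for t in range(T):
            # Leaky integration with soft reset after a spike.
            mem = BETA * mem + self.w_in(x[t]) + self.w_rec(spk) - THRESHOLD * spk
            spk = SurrogateSpike.apply(mem - THRESHOLD)
            out_sum = out_sum + self.w_out(spk)
        return out_sum / T  # time-averaged readout used as class logits


# Toy training step on random spike data, showing how the spatial term enters
# the loss alongside the task loss.
model = SpatialRSNN()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()

x = (torch.rand(50, 8, N_IN) < 0.1).float()   # (time, batch, inputs) Poisson-like spikes
y = torch.randint(0, N_OUT, (8,))

optimizer.zero_grad()
logits = model(x)
loss = criterion(logits, y) + LAMBDA_SPATIAL * model.spatial_penalty()
loss.backward()
optimizer.step()
```

In this sketch the only spatial ingredient is the regularizer; swapping the random unit-cube coordinates for a structured embedding, or adding further communicability-based terms, would follow the same pattern of adding a weighted penalty to the task loss.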