Humans and Neural Networks Show Similar Patterns of Transfer and Interference in a Continual Learning Task
Eleanor Holton, Lukas Braun, Jessica Thompson, Christopher Summerfield, University of Oxford, United Kingdom
Session: Posters 3B (Poster)
Presentation Time: Sat, 26 Aug, 13:00 - 15:00 United Kingdom Time
Abstract:
When learning a new task, old task knowledge can incur both benefits (transfer) and costs (interference). How biological and artificial agents trade off these costs and benefits is an unsolved computational problem. One clue comes from recent analyses of learning in artificial neural networks (ANNs), which paradoxically show the strongest interference for new tasks of intermediate similarity to previous tasks (relative to those that are very similar or very different). Here, we directly compare this effect in humans and ANNs. In two successive tasks (A and B), humans and ANNs learned to map stimuli onto a continuously valued circular output in two distinct contexts, where the outputs across contexts were related by a rule. By varying the similarity between task rules, we found that in both humans and ANNs, more similar tasks led to faster learning of task B, whereas more dissimilar tasks led to less interference with memory of task A. These results point to key parallels between humans and ANNs in continual learning settings: task similarity promotes shared representations that enable faster acquisition of the new task, while task dissimilarity reduces catastrophic interference by invoking distinct representations.
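The two-task paradigm described above can be sketched as a minimal simulation. This is not the authors' code: the architecture, hyperparameters, context encoding, and the choice of a rotation angle as the "rule" relating tasks A and B are all illustrative assumptions.

```python
# Minimal sketch of the two-task continual learning paradigm: a small
# network learns task A (stimulus -> point on a circle), then task B,
# whose targets are task A's rotated by a rule angle delta; forgetting
# of A is then measured. All design choices here are assumptions.
import numpy as np

rng = np.random.default_rng(0)
n_stim, n_hidden, lr, steps = 8, 32, 0.05, 1000

# Inputs: one-hot stimulus plus a one-hot context cue (task A vs. B).
def make_inputs(ctx):
    return np.hstack([np.eye(n_stim), np.tile(ctx, (n_stim, 1))])

X_A = make_inputs(np.array([1.0, 0.0]))
X_B = make_inputs(np.array([0.0, 1.0]))

# Circular targets encoded as (cos, sin) pairs; task B's angles are
# task A's shifted by delta, so small delta means similar task rules.
theta_A = rng.uniform(0, 2 * np.pi, n_stim)
delta = np.pi / 4
Y_A = np.stack([np.cos(theta_A), np.sin(theta_A)], axis=1)
Y_B = np.stack([np.cos(theta_A + delta), np.sin(theta_A + delta)], axis=1)

W1 = rng.normal(0, 0.5, (n_stim + 2, n_hidden))
W2 = rng.normal(0, 0.5, (n_hidden, 2))

def loss(W1, W2, X, Y):
    P = np.tanh(X @ W1) @ W2
    return float(np.mean((P - Y) ** 2))

def train(W1, W2, X, Y):
    # Plain batch gradient descent on the mean squared error.
    for _ in range(steps):
        H = np.tanh(X @ W1)
        err = H @ W2 - Y
        W1 -= lr * X.T @ ((err @ W2.T) * (1 - H ** 2)) / n_stim
        W2 -= lr * H.T @ err / n_stim
    return W1, W2

init_loss_A = loss(W1, W2, X_A, Y_A)
W1, W2 = train(W1, W2, X_A, Y_A)        # learn task A first
loss_A_after_A = loss(W1, W2, X_A, Y_A)
W1, W2 = train(W1, W2, X_B, Y_B)        # then learn task B
forgetting = loss(W1, W2, X_A, Y_A) - loss_A_after_A
print(f"forgetting of task A after learning B: {forgetting:.3f}")
```

Sweeping `delta` over [0, π] in a sketch like this is one way to probe how rule similarity trades off transfer (faster learning of B) against interference (forgetting of A), the manipulation at the heart of the study.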