Paper: https://www.nature.com/articles/s42256-023-00748-9
Project page and related work: https://www.jachterberg.com/seRNN
Code (Code Ocean capsule): https://codeocean.com/capsule/2879348/tree/v2
Code (GitHub repository): https://github.com/8erberg/spatially-embedded-rnn
Pre-print version on bioRxiv: https://www.biorxiv.org/content/10.1101/2022.11.17.516914v1
University of Cambridge press release: https://www.cam.ac.uk/research/news/ai-system-self-organises-to-develop-features-of-brains-of-complex-organisms
Abstract:
Brain networks exist within the confines of resource limitations. As a result, a brain network must overcome the metabolic costs of growing and sustaining the network within its physical space, while simultaneously implementing its required information processing. Here, to observe the effect of these processes, we introduce the spatially embedded recurrent neural network (seRNN). seRNNs learn basic task-related inferences while existing within a three-dimensional Euclidean space, where the communication of constituent neurons is constrained by a sparse connectome. We find that seRNNs converge on structural and functional features that are also commonly found in primate cerebral cortices. Specifically, they converge on solving inferences using modular small-world networks, in which functionally similar units spatially configure themselves to utilize an energetically efficient mixed-selective code. Because these features emerge in unison, seRNNs reveal how many common structural and functional brain motifs are strongly intertwined and can be attributed to basic biological optimization processes. seRNNs incorporate biophysical constraints within a fully artificial system and can serve as a bridge between structural and functional research communities to move neuroscientific understanding forwards.
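To make the idea in the abstract concrete, here is a minimal sketch of the core mechanism: hidden units get fixed coordinates in a 3D Euclidean space, and the network is trained under a task loss plus a wiring cost that penalizes long, strong recurrent connections. This is not the authors' code; PyTorch, the names (`SpatiallyEmbeddedRNN`, `wiring_cost`, `reg_strength`) and the plain distance-weighted L1 penalty are illustrative assumptions, and the published seRNN regularizer additionally factors in network communicability (see the paper and the repositories above).

```python
# Minimal sketch (not the authors' implementation): an RNN whose hidden units
# live at fixed points in a 3D Euclidean space, with recurrent weights
# penalized in proportion to the wiring length they imply.
import torch
import torch.nn as nn


class SpatiallyEmbeddedRNN(nn.Module):
    def __init__(self, input_size, hidden_size, output_size, reg_strength=1e-3):
        super().__init__()
        self.rnn = nn.RNN(input_size, hidden_size, batch_first=True)
        self.readout = nn.Linear(hidden_size, output_size)
        self.reg_strength = reg_strength
        # Place each hidden unit at a random, fixed point in a unit cube
        # and precompute the pairwise Euclidean distance matrix.
        coords = torch.rand(hidden_size, 3)
        self.register_buffer("distances", torch.cdist(coords, coords))

    def forward(self, x):
        hidden_states, _ = self.rnn(x)
        return self.readout(hidden_states[:, -1])  # decision from the last step

    def wiring_cost(self):
        # L1 penalty on recurrent weights, scaled by the distance between the
        # units each weight connects: long connections cost more, so training
        # prunes them toward a sparse, spatially local connectome.
        recurrent_w = self.rnn.weight_hh_l0
        return self.reg_strength * (recurrent_w.abs() * self.distances).sum()


# Usage: add the wiring cost to the task loss during training.
model = SpatiallyEmbeddedRNN(input_size=10, hidden_size=100, output_size=4)
x = torch.randn(32, 20, 10)            # batch of 32 sequences, 20 time steps
targets = torch.randint(0, 4, (32,))
loss = nn.functional.cross_entropy(model(x), targets) + model.wiring_cost()
loss.backward()
```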