The human brain contains approximately 86 billion neurons. These cells fire electrical impulses that help the brain store memories and send information and commands throughout the brain and the nervous system.
The brain also comprises billions of astrocytes — star-shaped cells with numerous long extensions that enable them to engage with millions of neurons. Although they were traditionally considered primarily supportive cells, recent research has indicated that astrocytes might be involved in memory retention and other cognitive processes.
Researchers at MIT have now put forward a new hypothesis for how astrocytes might contribute to memory storage. The architecture suggested by their model could help explain the brain’s enormous storage capacity, which far exceeds what would be expected from neurons alone.
“Astrocytes were originally thought to just clean up around neurons, but there’s no good reason that evolution didn’t take advantage of the fact that each astrocyte can connect to hundreds of thousands of synapses, which means they could also carry out computations,” says Jean-Jacques Slotine, an MIT professor of mechanical engineering and of brain and cognitive sciences, and a co-author of the new study.
Dmitry Krotov, a research staff member at the MIT-IBM Watson AI Lab and IBM Research, is the senior author of the open-access paper, which appeared May 23 in the Proceedings of the National Academy of Sciences. Leo Kozachkov PhD ’22 is the paper’s lead author.
Memory Capacity
Astrocytes perform various supportive roles in the brain: they eliminate debris, supply nutrients to neurons, and help maintain sufficient blood circulation.
Astrocytes also extend numerous slender tentacles, known as processes, which can each encircle a single synapse — the junctions where two neurons interact — forming a tripartite (three-part) synapse.
In recent years, neuroscientists have demonstrated that disrupting the connections between astrocytes and neurons in the hippocampus impairs both memory retention and retrieval.
Unlike neurons, astrocytes cannot produce action potentials, the electrical signals that distribute information across the brain. However, they are capable of using calcium signaling to communicate amongst themselves. Over the last few decades, as calcium imaging resolution has improved, researchers have discovered that calcium signaling also allows astrocytes to synchronize their activity with neurons at the synapses where they are connected.
These studies imply that astrocytes can sense neural activity, prompting them to modify their own calcium levels. Those alterations may lead astrocytes to release gliotransmitters — signaling molecules akin to neurotransmitters — into the synapse.
“There’s a closed feedback loop between neuron signaling and astrocyte-to-neuron signaling,” Kozachkov notes. “What remains uncertain is exactly what kinds of computations the astrocytes can perform with the data they are sensing from neurons.”
The MIT team aimed to model the potential functions of those connections and how they might play a role in memory retention. Their model is grounded in Hopfield networks — a particular type of neural network that is capable of storing and recalling patterns.
Originally developed by John Hopfield and Shun-Ichi Amari in the 1970s and 1980s, Hopfield networks are widely used to model the brain. However, it has been shown that these networks cannot store enough information to account for the vast memory capacity of the human brain. A newer, modified type of Hopfield network, known as dense associative memory, can store far more information through higher-order couplings that link more than two neurons at a time.
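To make the pairwise case concrete, here is a minimal classical Hopfield network in Python. This is a standard textbook construction, not code from the paper, and the network sizes are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

n = 100                 # neurons
p = 5                   # stored patterns; the classical limit is ~0.14 * n
patterns = rng.choice([-1, 1], size=(p, n))

# Hebbian rule: couplings are sums of pairwise outer products of the patterns
W = (patterns.T @ patterns) / n
np.fill_diagonal(W, 0)  # no self-coupling

# Recall: corrupt 10 bits of pattern 0, then iterate the sign update rule
state = patterns[0].copy()
state[rng.choice(n, size=10, replace=False)] *= -1

for _ in range(10):
    state = np.sign(W @ state)
    state[state == 0] = 1

recovered = np.mean(state == patterns[0])
print(f"fraction of bits recovered: {recovered:.2f}")
```

With only five stored patterns, the corrupted cue falls back into the right attractor; pushing the number of patterns well past roughly 0.14 times the number of neurons makes recall fail, which is the capacity ceiling the article refers to.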
However, it was unclear how the brain could implement such multi-neuron couplings in its actual circuitry, since a traditional synapse connects only two neurons: a presynaptic cell and a postsynaptic cell. This is where astrocytes come into play.
“When you have a network of neurons that couple in pairs, there is only so much information you can encode in those networks,” Krotov says. “To build dense associative memories, you need to couple more than two neurons together. Since a single astrocyte can connect to many neurons and many synapses, it is tempting to hypothesize that there may be an information transfer between synapses mediated by this biological cell. That was our main inspiration to look into astrocytes and to think about how to build dense associative memories in biology.”
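The higher-order couplings Krotov describes can be sketched with a minimal dense associative memory in the Krotov-Hopfield energy style. This is an illustration of the abstract model, not the paper’s neuron-astrocyte network; raising the energy’s power from 2 to 4 is mathematically equivalent to coupling four neurons at a time:

```python
import numpy as np

rng = np.random.default_rng(1)

n = 100
p = 40                  # well above the ~0.14 * n pairwise limit
patterns = rng.choice([-1, 1], size=(p, n))

def energy(state, power=4):
    # Dense associative memory energy: E = -sum_mu (pattern_mu . state)**power.
    # power=2 recovers the classical Hopfield network; power=4 couples
    # quadruples of neurons, the kind of higher-order interaction the
    # article suggests astrocytes could mediate biologically.
    return -np.sum((patterns @ state).astype(float) ** power)

# Asynchronous recall: flip each neuron to whichever sign lowers the energy
state = patterns[0].copy()
state[rng.choice(n, size=15, replace=False)] *= -1

for _ in range(3):
    for i in range(n):
        for s in (1, -1):
            trial = state.copy()
            trial[i] = s
            if energy(trial) < energy(state):
                state = trial

recovered = np.mean(state == patterns[0])
print(f"fraction of bits recovered: {recovered:.2f}")
```

At power 4, the same 100-neuron network reliably holds 40 patterns, where the pairwise version would fail; the catch, as the article notes, is that ordinary two-neuron synapses have no obvious way to implement such four-way couplings, which is the gap the astrocyte hypothesis is meant to fill.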
The neuron-astrocyte associative memory model developed by the researchers in their recent paper can store markedly more information than a traditional Hopfield network — exceeding what is required to account for the brain’s memory capacity.
Complex Connections
The intricate biological links between neurons and astrocytes support the notion that this type of model might clarify how the brain’s memory storage systems function, the researchers explain. They hypothesize that within astrocytes, memories are encoded through gradual changes in calcium flow patterns. This information is transmitted to neurons via gliotransmitters released at synapses connected by astrocyte processes.
“By carefully coordinating these two elements — the spatiotemporal pattern of calcium within the cell and the signaling back to the neurons — you can get exactly the dynamics needed for this enormous increase in memory capacity,” Kozachkov says.
A key feature of the new model is that it treats astrocytes as collections of processes, rather than as single entities. Each of these processes can act as a computational unit. Because dense associative memories can store so much information, the ratio of stored information to the number of computational units is exceptionally high, and it grows with the size of the network. This makes the system not only high-capacity but also energy-efficient.
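That capacity-per-unit claim can be illustrated with the generic scaling laws for the two kinds of abstract associative memory (these are standard results for the idealized models, not capacity figures from the paper): a pairwise Hopfield network stores about 0.14 * n patterns across n units, while a dense associative memory with an energy of degree d scales like n**(d - 1), so its stored patterns per unit keep growing with network size:

```python
# Illustrative scaling laws only; the neuron-astrocyte model in the
# paper has its own capacity analysis.
def patterns_per_unit_pairwise(n):
    # Classical Hopfield: ~0.14 * n patterns over n units, a constant ratio
    return 0.14 * n / n

def patterns_per_unit_dense(n, d=4):
    # Dense associative memory, degree-d energy: ~n**(d - 1) patterns
    # (up to constants), so the per-unit ratio grows as n**(d - 2)
    return n ** (d - 1) / n

for n in (100, 1000, 10000):
    print(n, patterns_per_unit_pairwise(n), patterns_per_unit_dense(n))
```

The pairwise ratio stays flat as the network grows, while the dense ratio climbs, which is the sense in which the process-based system becomes more storage-efficient at scale.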
“By framing tripartite synaptic domains — where astrocytes dynamically interact with pre- and postsynaptic neurons — as the brain’s fundamental computational units, the authors suggest that each unit can retain as many memory patterns as there are neurons in the network. This leads to the remarkable implication that, theoretically, a neuron-astrocyte network could store an infinitely large number of patterns, limited only by its size,” states Maurizio De Pitta, an assistant professor of physiology at the Krembil Research Institute at the University of Toronto, who was not involved in the study.
To test whether this model accurately reflects how the brain stores memories, researchers could look for ways to precisely manipulate the connections between astrocytic processes, then measure how those manipulations affect memory function.
“We hope that one consequence of this work could be that experimentalists seriously consider this idea and conduct experiments to test this hypothesis,” Krotov remarks.
In addition to shedding light on how the brain may retain memories, this model could also provide direction for researchers developing artificial intelligence. By modifying the connectivity of the process-to-process network, researchers could create a vast array of models to explore for various applications, such as establishing a continuum between dense associative memories and attention mechanisms in large language models.
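The link to attention can be made concrete: in the continuous formulation of dense associative memory popularized by Ramsauer and colleagues (a known result about the abstract model, not a claim from this paper), one retrieval step is exactly a softmax attention read over the stored patterns. A minimal sketch, with made-up dimensions:

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

rng = np.random.default_rng(2)

stored = rng.normal(size=(8, 64))                # 8 stored continuous patterns
query = stored[3] + 0.3 * rng.normal(size=64)    # noisy cue for pattern 3

beta = 4.0                                       # inverse temperature / attention scale
for _ in range(3):
    # One memory-retrieval step == one softmax attention read over the store,
    # with the patterns serving as both keys and values
    query = stored.T @ softmax(beta * (stored @ query))

retrieved = int(np.argmax(stored @ query))
print(f"retrieved pattern index: {retrieved}")
```

The noisy cue converges onto the stored pattern it most resembles, which is why varying the connectivity of such networks can interpolate between associative memory and the attention mechanisms used in large language models.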
“While neuroscience initially sparked crucial ideas in AI, the past 50 years of neuroscience research have had minimal impact on the field, and many contemporary AI algorithms have strayed from neural analogies,” Slotine observes. “In this regard, this work may represent one of the first contributions to AI informed by recent neuroscience investigations.”