
Dreams are, for the most part, delightful. As we sleep, visual and audio fragments combine into nonsensical snippets and epic narratives. Loosely recalled moments merge with vivid, imagined scenes; we interact with characters known and characters conjured up; we explore our fantasies and, sometimes, face our fears. Yet sleeping and dreaming do not exist for our nocturnal pleasure alone.
As we slumber, our brains filter information collected in waking hours. Neurological processes spring into action. They discard what is irrelevant and consolidate what is important; they form and store memories. These mechanisms — found throughout the mammal world — are so effective that a team of Italian researchers have mathematically modeled them for use in artificial intelligence.
The result is an algorithm that expands the storage capacity of artificial networks by forcing them into an off-line sleep phase during which they reinforce relevant memories (pure states) and erase irrelevant ones (spurious states).
When math meets mammals
Adriano Barra is a theoretical physicist at the University of Salento in Italy. Barra is eager and animated when he speaks to me, frequently using a pack of Marlboro cigarettes as an unlikely prop to illustrate the finer points of A.I.
Along with his colleagues Elena Agliari and Alberto Fachechi, Barra studies complex systems, such as brains, and makes mathematical models of their neurobiology. “We theoretical physicists have a teeny advantage over engineers,” says Barra. “As the mathematics is the same, almost for free, we can apply our results in artificial intelligence. We are a bridge between neurobiology and engineering.”
The classic blueprint for artificial neural networks is the Hopfield model. Developed by John Hopfield in 1982, it describes how artificial networks learn and retrieve information using mechanisms, such as pattern recognition, that mimic real brains. The most popular learning rule for a Hopfield network is Hebbian learning, which proposes how the synapses between neurons are strengthened during the learning process.
There is a drawback, however. The Hopfield model is now decades old, and it can only store a limited amount of information. Expressed mathematically, the maximum storage capacity for this symmetric network is α ∼ 0.14. The theoretical limit, however, is α = 1.
A network’s capacity is determined by the number of neurons (n) it contains. Each item it memorizes, such as a pixelated image, is stored as a distinct pattern (p) across those neurons. “You can store, at maximum, a number of patterns that is p equal to 0.14 times n,” says Barra.
Why, then, is the storage capacity only a fraction of what is theoretically possible? Why is it 0.14, rather than 1?
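For readers who like to see the machinery, here is a minimal Python sketch of a Hopfield network trained with the Hebbian rule. The network size, the random patterns, and the variable names are illustrative choices of mine, not details taken from the team's work; the point is simply how patterns get folded into a single matrix of synaptic weights, and what the 0.14 times n budget means in practice.

```python
# Minimal Hopfield network with Hebbian learning (illustrative sketch).
import numpy as np

rng = np.random.default_rng(0)

n = 200                              # number of neurons
capacity = int(0.14 * n)             # Hebbian limit: roughly 0.14 * n = 28 patterns
p = 20                               # stay below that limit so recall stays clean
patterns = rng.choice([-1, 1], size=(p, n))   # p random binary patterns to store

# Hebbian rule: neurons that fire together wire together,
# J_ij = (1/n) * sum over patterns of xi_i * xi_j
J = (patterns.T @ patterns) / n
np.fill_diagonal(J, 0)               # no self-connections

def recall(state, steps=20):
    """Run the (synchronous) network dynamics until it settles into an attractor."""
    for _ in range(steps):
        state = np.sign(J @ state)
        state[state == 0] = 1
    return state

# Corrupt a stored pattern, then ask the network to retrieve it.
probe = patterns[0].copy()
probe[rng.choice(n, size=n // 10, replace=False)] *= -1   # flip 10% of the bits
overlap = (recall(probe) @ patterns[0]) / n
print(f"overlap with the stored pattern: {overlap:.2f}")  # near 1.0 means clean recall
# Pushing p past roughly 0.14 * n makes retrieval like this break down.
```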

Networks in an awake, or online, state are always learning new patterns of information. As well as desirable patterns, however, they collect irrelevant, even fake patterns. “You get the network to store these important patterns but unavoidably it will also store mistakes,” says Barra. “You cannot avoid them. They are automatically generated.”
He grabs his Marlboros and compares them to a pack of Camel cigarettes. “If you keep storing details about these packets of cigarettes — the blue one is Camel and the red one is Marlboro, etc. — but you also take in all the redundant information, the network eventually gets stuck.” Jammed with irrelevant and spurious information, the network hits its maximum capacity at 0.14.
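What do those automatically generated mistakes look like? In the Hopfield picture a classic example is a “mixture” state: blend three stored patterns by majority vote and the result, which corresponds to nothing the network was ever asked to remember, can be just as stable as a real memory. The short sketch below, again with sizes and names of my own choosing, checks exactly that.

```python
# Spurious "mixture" states in a Hebbian Hopfield network (illustrative sketch).
import numpy as np

rng = np.random.default_rng(1)
n = 500
patterns = rng.choice([-1, 1], size=(3, n))   # three genuine memories

J = (patterns.T @ patterns) / n               # Hebbian couplings
np.fill_diagonal(J, 0)

# A spurious pattern: the element-wise majority vote of the three pure ones.
mixture = np.sign(patterns.sum(axis=0))

updated = np.sign(J @ mixture)                # one step of the network dynamics
updated[updated == 0] = 1
print("mixture is a fixed point:", np.array_equal(updated, mixture))
# Typically True: the blend sits in the network as if it were a stored memory,
# even though no one ever asked for it.
```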
This limited capacity does not stop A.I. from carrying out specific tasks, but clogging up valuable space with useless data is wasteful and inefficient. The Italian team’s solution is an algorithm that forces networks into an off-line phase that echoes mammal sleep, which is used to reinforce important memories and erase irrelevant ones.
“It is deeply related to sleep,” says Barra, “because if you don’t get rid of all these spurious mistakes, they become too much, and the network is no longer able to discriminate between what is a Marlboro and what is a Camel.”
Freeing up space
Mammalian brains are also constantly collecting new patterns of information. But, like a fishing boat indiscriminately trawling the seabed, they suck up unimportant details along with the relevant ones.
“When you are awake, you store a lot of information passively; you don’t really need it and don’t really want to store it,” says Barra. “We also produce spurious and fake patterns of information. We want to get rid of that. We do this by sleeping.”
During rapid eye movement (REM) sleep, the phase of sleep in which we usually dream, our brains are busy erasing irrelevant memories. This makes room for storing new ones. In slow-wave (SW) sleep, important memories are reinforced. While most dreaming occurs during REM, we may experience indistinct dreams that are harder to recall during SW sleep.
The research team knew that mammalian brains use highly efficient mechanisms to clear out storage space during sleep. Their analysis of this fact was the starting point for their algorithm.
“We said, okay, let’s go through the neurobiological paper that explains this phenomenon in a real brain and try to model it mathematically,” says Barra. “Then we asked: What can happen if we apply this mathematical backbone to the Hopfield model?”
Their answer is presented in a paper published in Neural Networks in April. Under the new framework, artificial networks continue to learn and store patterns of information (as memories) during standard online, or awake, sessions. However, when storage capacity hits 0.14, the network is forced into an off-line, or sleep, session. This sleep phase is used to delete irrelevant information and consolidate the important stuff, or more technically, for “spurious-pattern removal and pure-pattern reinforcement.” By implementing a new off-line regimen, the team was able to free up storage space and increase capacity to 1.
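To give a flavor of what a sleep session can look like in code, here is a rough Python sketch: the network drifts from random states into whatever attractors it has formed, those dreamed-up states are unlearned with a small anti-Hebbian correction, and the pure patterns receive a Hebbian boost. This illustrates the general idea of spurious-pattern removal and pure-pattern reinforcement, not the precise update rule the team derives; the learning rates, the number of “dreams,” and the loop structure are assumptions of mine.

```python
# Illustrative off-line "sleep" session for a Hopfield network:
# unlearn dreamed-up (spurious) attractors, reinforce the pure patterns.
import numpy as np

rng = np.random.default_rng(2)
n = 200
p = int(0.14 * n)                    # load the network right up to the Hebbian limit
patterns = rng.choice([-1, 1], size=(p, n))

J = (patterns.T @ patterns) / n      # standard Hebbian ("awake") learning
np.fill_diagonal(J, 0)

def relax(state, J, steps=30):
    """Let the network settle into whichever attractor is nearby."""
    for _ in range(steps):
        state = np.sign(J @ state)
        state[state == 0] = 1
    return state

def sleep(J, patterns, dreams=200, eps_minus=0.01, eps_plus=0.01):
    """One off-line session: erase dreamed states, reinforce pure ones."""
    n = J.shape[0]
    for _ in range(dreams):
        dream = relax(rng.choice([-1, 1], size=n), J)    # wander into an attractor
        J = J - eps_minus * np.outer(dream, dream) / n   # anti-Hebbian: unlearn it
    for xi in patterns:
        J = J + eps_plus * np.outer(xi, xi) / n          # Hebbian: reinforce pure states
    np.fill_diagonal(J, 0)
    return J

def recall_overlap(J, xi, corrupt=0.2):
    """Corrupt a stored pattern and measure how well the network retrieves it."""
    probe = xi.copy()
    probe[rng.choice(len(xi), size=int(corrupt * len(xi)), replace=False)] *= -1
    return (relax(probe, J) @ xi) / len(xi)

print(f"overlap before sleep: {recall_overlap(J, patterns[0]):.2f}")
J = sleep(J, patterns)
print(f"overlap after sleep:  {recall_overlap(J, patterns[0]):.2f}")
```

Nothing about this toy loop is tuned; it simply makes the division of labor visible. Online sessions keep adding patterns, and off-line sessions clean up after them so the couplings stop being clogged with states nobody wanted.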
Evolution has provided the criteria mammalian brains rely upon to determine what should be consolidated and what should be erased, but artificial networks need people to take on the task. “They [networks] are absolute slaves of our perception, so we have to say to them, ‘This is important, pay attention to it. This is not important,’” says Barra. “They don’t have the concept of importance.”
There are other technical differences too. For instance, in human brains, erasing and consolidation occur during two distinct stages of sleep (REM and SW); when networks sleep, both tasks happen simultaneously. Perhaps the thing that most distinguishes human from machine sleep is that we have autonomy over our slumber and naps. Artificial networks’ sleep sessions are triggered by math — no cozy blanket required.
With the new framework in the public sphere, Barra’s job is over. “I am a theoretical physicist; I develop the mathematical backbone,” he says. “Now it is a matter for the engineers to implement it.” And maybe he — and artificial networks — can take a well-deserved nap.
All Rights Reserved for Karen Emslie
