Researchers Made an AI Whose Performance Increases if They Let It Sleep And Dream

15 FEB 2019

Sleep is pretty great. In humans, evidence suggests it has a whole range of benefits, including this one: it keeps the brain healthy by letting neurons prune unnecessary synaptic connections we make during the day.

This process, called synaptic homeostasis, prevents the brain from being overrun by useless memories and may improve our cognitive performance, while dreaming is thought to help us process our memories.

As it turns out, something similar may be occurring when artificial neural networks are allowed to sleep and dream.

Yep, you read that correctly. And it appears to work much as the process is thought to work in humans.

Of course, artificial neural networks (ANNs) – a type of artificial intelligence based on biological neural networks – don’t automatically and instinctively fall asleep and dream. Which is why mathematicians in Italy programmed a type of ANN called a Hopfield network to be able to sleep.

“Inspired by sleeping and dreaming mechanisms in mammal brains, we propose an extension of this model displaying the standard on-line (awake) learning mechanism (that allows the storage of external information in terms of patterns) and an off-line (sleep) unlearning & consolidating mechanism,” they wrote in their paper.

In other words, while the ANN is ‘awake’, it’s learning and storing patterns. But its storage capacity is limited.

So the team worked out a way to mathematically implement human sleep patterns – rapid-eye movement sleep and slow-wave sleep, the former of which is thought to remove unnecessary memories, and the latter of which is thought to consolidate important ones.

The ANN's 'sleep' state does the same thing: it cycles through what the network has stored, unlearning unnecessary information and then consolidating what's left, the important stuff.
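The awake/sleep cycle described above can be sketched in a few lines of NumPy. This is a minimal illustration, not the authors' exact algorithm: the 'awake' phase is standard Hebbian storage in a Hopfield network, and the 'sleep' phase uses a classic Crick–Mitchison-style unlearning step (dream from random states, then weakly unlearn whatever attractor the network settles into, which tends to hit spurious mixture states before the stored patterns). All sizes and rates here are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)
N, P = 100, 3  # neurons and stored patterns; load alpha = P/N = 0.03

patterns = rng.choice([-1, 1], size=(P, N))

# "Awake" phase: standard Hebbian storage of the patterns in the couplings.
J = (patterns.T @ patterns).astype(float) / N
np.fill_diagonal(J, 0.0)

def recall(J, s, steps=20):
    """Deterministic parallel dynamics: s_i <- sign(sum_j J_ij * s_j)."""
    for _ in range(steps):
        s = np.where(J @ s >= 0, 1, -1)
    return s

# "Sleep" phase (illustrative unlearning, not the paper's exact rule):
# relax from random states and weakly unlearn the attractor reached;
# spurious mixture states are reached often, so they are erased first.
def sleep(J, epsilon=0.01, dreams=50):
    J = J.copy()
    for _ in range(dreams):
        s = recall(J, rng.choice([-1, 1], size=N))
        J -= epsilon * np.outer(s, s) / N
        np.fill_diagonal(J, 0.0)
    return J

J_slept = sleep(J)

# At this low load the stored patterns are stable fixed points of recall.
print(all((recall(J, p) == p).all() for p in patterns))
```

The key design point is that only the coupling matrix `J` changes between the two phases; 'sleeping' is just a second, offline learning rule applied to the same network.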

The result was remarkable. Without sleep, the maximal capacity was α ≈ 0.14, where α is the storage load: the number of patterns the network can store per neuron. With a sleep cycle incorporated, the network reached the theoretical limit for networks of this type: α = 1.
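Reading α as the standard Hopfield load (stored patterns divided by the number of neurons), the jump is easy to put in concrete numbers; the N = 1000 figure below is purely illustrative.

```python
N = 1000                       # neurons in the network (illustrative size)
without_sleep = int(0.14 * N)  # classic Hopfield limit: ~140 retrievable patterns
with_sleep = int(1.0 * N)      # theoretical maximum reached with the sleep cycle
print(without_sleep, with_sleep)  # 140 1000
```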

Extensive simulations validated the result, showing that letting a neural network nap once in a while (with the right napping algorithm) can improve its performance.

“We believe that in the process of cognition – while certainly learning and retrieval keep covering a pivotal role – also sleeping is mandatory for artificial intelligence, as it is for the biological one,” mathematician Adriano Barra of the University of Salento said.

Let’s hope their dreams are nice ones.

The team’s research has been published in the journal Neural Networks, and can be read in full on arXiv.

Dr. Hans C. Mumm