Spiking neural networks have a complex, biologically inspired design, but they have not seen widespread practical use because they are difficult to train. They are, however, far more energy-efficient than conventional networks, a considerable impetus for transitioning to spiking network technology over the next decade.
Many AIs can only master one set of well-defined tasks – they can't acquire additional knowledge later without losing everything they previously learned. However, interspersing focused training periods with sleep-like periods prevents this forgetting. Sleep in the neural network is induced by activating the network's artificial neurons in a noisy pattern reminiscent of the firing pattern seen during training. This replay strengthens the connections related to the task. In addition, rapidly alternating sessions of training and sleep help consolidate the connections formed for the first task that would otherwise have been overwritten.
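The alternating train-and-sleep schedule described above can be sketched with a toy Hebbian associative memory. This is not the researchers' actual spiking model – the network size, noise level, and learning rates below are illustrative assumptions – but it shows the core idea: the "sleep" phase reactivates the network with noisy versions of a task's firing pattern, reinforcing the same connections that focused training built.

```python
import numpy as np

rng = np.random.default_rng(0)

def hebbian_train(W, pattern, lr=1.0):
    # Focused training: strengthen connections between co-active units.
    W += lr * np.outer(pattern, pattern)
    np.fill_diagonal(W, 0.0)  # no self-connections
    return W

def sleep_replay(W, pattern, noise=0.1, steps=20, lr=0.1):
    # "Sleep": drive the network with noisy variants of the task's
    # firing pattern; on average this reinforces the same connections.
    for _ in range(steps):
        noisy = np.where(rng.random(pattern.size) < noise, -pattern, pattern)
        W = hebbian_train(W, noisy, lr=lr)
    return W

def recall(W, cue):
    # One synchronous update: each unit takes the sign of its input.
    return np.sign(W @ cue)

n = 32
task_a = rng.choice([-1.0, 1.0], size=n)  # hypothetical task patterns
task_b = rng.choice([-1.0, 1.0], size=n)

W = np.zeros((n, n))
# Alternate focused training on each task with a sleep-like replay phase.
for task in (task_a, task_b):
    W = hebbian_train(W, task)
    W = sleep_replay(W, task)

# Task A should still be recalled even after training on task B.
print(np.array_equal(recall(W, task_a), task_a))
print(np.array_equal(recall(W, task_b), task_b))
```

In this sketch the replay noise averages out, so sleep strengthens the learned connection pattern rather than erasing it – a loose analogue of how the noisy sleep phase protects the first task's links from being overwritten.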
Such a network can combine consecutively learned knowledge in intelligent ways and apply this learning to novel situations – just like animals and humans do.
Read more: AI uses artificial sleep to learn a new task without forgetting the last.
Image credit: Image by iuriimotov on Freepik