Towards Efficient Continual Learning in Deep Neural Networks
N Mehta - 2022 - search.proquest.com
Abstract
Deep learning, trained primarily on a single task under the assumption of independent and identically distributed (iid) data, has made enormous progress in recent years. However, when naively trained sequentially on multiple tasks, without revisiting previous tasks, neural networks are known to suffer catastrophic forgetting: the ability to perform old tasks is often lost while learning new ones. In contrast, biological life is capable of learning many tasks throughout a lifetime from decidedly non-iid experiences, acquiring new skills, and reusing old ones to learn fresh abilities, all while retaining important previous knowledge. As we strive to make artificial systems increasingly more intelligent, natural life's ability to learn continually is an important capability to emulate.
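The catastrophic forgetting the abstract describes can be illustrated with a small toy experiment. Below is a minimal sketch, not the dissertation's method: it assumes PyTorch, a synthetic permuted-feature analogue of the permuted-MNIST benchmark, and a small MLP, and it naively trains on each task in turn without revisiting old data, reporting accuracy on all tasks after each stage.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# Base synthetic dataset: 20-dim inputs, label depends on the sum of the
# first 5 features. (Illustrative stand-in for a real benchmark.)
X = torch.randn(2000, 20)
y = (X[:, :5].sum(dim=1) > 0).long()

# Each "task" sees the same data under a fixed random feature permutation,
# a toy analogue of the permuted-MNIST continual-learning setup.
perms = [torch.randperm(20) for _ in range(2)]
tasks = [(X[:, p], y) for p in perms]

model = nn.Sequential(nn.Linear(20, 64), nn.ReLU(), nn.Linear(64, 2))
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

def accuracy(xb, yb):
    with torch.no_grad():
        return (model(xb).argmax(dim=1) == yb).float().mean().item()

# Naive sequential training: fit each task fully, never revisiting old ones.
for t, (xb, yb) in enumerate(tasks):
    for _ in range(200):
        opt.zero_grad()
        loss_fn(model(xb), yb).backward()
        opt.step()
    report = ", ".join(f"task {i} acc={accuracy(xi, yi):.2f}"
                       for i, (xi, yi) in enumerate(tasks))
    print(f"after training on task {t}: {report}")
```

Running a sketch like this typically shows accuracy on the first task dropping sharply once training moves to the second task, since the weights that solved the first permutation are overwritten; that drop is the forgetting behavior the abstract refers to.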