A comprehensive survey of continual learning: theory, method and application

L Wang, X Zhang, H Su, J Zhu - IEEE Transactions on Pattern …, 2024 - ieeexplore.ieee.org
To cope with real-world dynamics, an intelligent system needs to incrementally acquire,
update, accumulate, and exploit knowledge throughout its lifetime. This ability, known as …

Continual lifelong learning with neural networks: A review

GI Parisi, R Kemker, JL Part, C Kanan, S Wermter - Neural Networks, 2019 - Elsevier
Humans and animals have the ability to continually acquire, fine-tune, and transfer
knowledge and skills throughout their lifespan. This ability, referred to as lifelong learning, is …

Learning to prompt for continual learning

Z Wang, Z Zhang, CY Lee, H Zhang… - Proceedings of the …, 2022 - openaccess.thecvf.com
The mainstream paradigm behind continual learning has been to adapt the model
parameters to non-stationary data distributions, where catastrophic forgetting is the central …

DualPrompt: Complementary prompting for rehearsal-free continual learning

Z Wang, Z Zhang, S Ebrahimi, R Sun, H Zhang… - … on Computer Vision, 2022 - Springer
Continual learning aims to enable a single model to learn a sequence of tasks without
catastrophic forgetting. Top-performing methods usually require a rehearsal buffer to store …

Data distributional properties drive emergent in-context learning in transformers

S Chan, A Santoro, A Lampinen… - Advances in …, 2022 - proceedings.neurips.cc
Large transformer-based models are able to perform in-context few-shot learning, without
being explicitly trained for it. This observation raises the question: what aspects of the …

2022 roadmap on neuromorphic computing and engineering

DV Christensen, R Dittmann… - Neuromorphic …, 2022 - iopscience.iop.org
Modern computation based on von Neumann architecture is now a mature cutting-edge
science. In the von Neumann architecture, processing and memory units are implemented …

Neuroscience-inspired artificial intelligence

D Hassabis, D Kumaran, C Summerfield, M Botvinick - Neuron, 2017 - cell.com
The fields of neuroscience and artificial intelligence (AI) have a long and intertwined history.
In more recent times, however, communication and collaboration between the two fields has …

Brain-inspired replay for continual learning with artificial neural networks

GM Van de Ven, HT Siegelmann, AS Tolias - Nature Communications, 2020 - nature.com
Artificial neural networks suffer from catastrophic forgetting. Unlike humans, when these
networks are trained on something new, they rapidly forget what was learned before. In the …

Mechanisms of systems memory consolidation during sleep

JG Klinzing, N Niethard, J Born - Nature Neuroscience, 2019 - nature.com
Long-term memory formation is a major function of sleep. Based on evidence from
neurophysiological and behavioral studies mainly in humans and rodents, we consider the …

Overcoming catastrophic forgetting in neural networks

J Kirkpatrick, R Pascanu… - Proceedings of the …, 2017 - National Academy of Sciences
The ability to learn tasks in a sequential fashion is crucial to the development of artificial
intelligence. Until now neural networks have not been capable of this and it has been widely …
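
The Kirkpatrick et al. entry above refers to elastic weight consolidation (EWC), which discourages changes to parameters judged important for earlier tasks. Below is a minimal, illustrative sketch of that quadratic penalty, assuming PyTorch; the names fisher, old_params, and lam, and the way the per-parameter importance estimates are obtained, are assumptions of this sketch rather than details taken from the listing itself.

import torch

def ewc_penalty(model, fisher, old_params, lam=1000.0):
    # EWC-style penalty: (lam / 2) * sum_i F_i * (theta_i - theta_star_i)^2.
    # `fisher` maps parameter names to importance estimates computed after the
    # previous task; `old_params` maps names to a frozen snapshot of the
    # parameters at that point. Both are plain dicts of tensors (assumed here,
    # not the paper's own code).
    penalty = torch.zeros(())
    for name, param in model.named_parameters():
        if name in fisher:
            penalty = penalty + (fisher[name] * (param - old_params[name]) ** 2).sum()
    return 0.5 * lam * penalty

# Typical use while training on a new task (illustrative):
#   loss = task_loss(outputs, targets) + ewc_penalty(model, fisher, old_params)
#   loss.backward()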