- Research article, December 2024
Future-proofing class-incremental learning
Machine Vision and Applications (MVAA), Volume 36, Issue 1. https://doi.org/10.1007/s00138-024-01635-y
Abstract: Exemplar-free class-incremental learning is a highly challenging setting in which no replay memory is available. Methods relying on frozen feature extractors have recently drawn attention in this setting due to their impressive performance and lower ...
- Research article, December 2022
Balanced softmax cross-entropy for incremental learning with and without memory
Computer Vision and Image Understanding (CVIU), Volume 225, Issue C. https://doi.org/10.1016/j.cviu.2022.103582
Abstract: When incrementally trained on new classes, deep neural networks are subject to catastrophic forgetting, which leads to an extreme deterioration of their performance on the old classes while learning the new ones. Using a small memory containing ...
Highlights:
- Proposes Balanced Softmax for bias mitigation in class-incremental learning.
- Applicable to class-incremental learning both with and without memory.
- Combines seamlessly with state-of-the-art approaches to improve accuracy.
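The general Balanced Softmax idea referenced in the highlights above can be sketched as follows: each logit is shifted by the log of its class's sample count before the softmax, so over-represented (typically new) classes do not dominate the loss. This is a hedged illustration of the generic Balanced Softmax formulation; the exact incremental-learning variant in the cited paper may differ, and the class counts here are hypothetical.

```python
import numpy as np

def balanced_softmax_cross_entropy(logits, label, class_counts):
    """Cross-entropy over a count-adjusted softmax (generic Balanced Softmax sketch).

    logits: 1-D array of raw class scores
    label: index of the true class
    class_counts: 1-D array of per-class training-sample counts
    """
    # Shift each logit by the log of its class frequency: frequent classes
    # must score proportionally higher to receive the same probability.
    adjusted = logits + np.log(class_counts)
    adjusted = adjusted - adjusted.max()  # numerical stability
    log_probs = adjusted - np.log(np.exp(adjusted).sum())
    return -log_probs[label]

# Hypothetical example: old classes have few stored exemplars, new class has many.
logits = np.array([2.0, 1.0, 0.1])
counts = np.array([10.0, 10.0, 500.0])  # class 2 is the over-represented new class
loss = balanced_softmax_cross_entropy(logits, 0, counts)
```

When all class counts are equal, the log-count shift is a constant and cancels in the normalization, so the loss reduces to ordinary softmax cross-entropy.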
- Research article, February 2022
CVPR 2020 continual learning in computer vision competition: Approaches, results, current challenges and future directions
- Vincenzo Lomonaco,
- Lorenzo Pellegrini,
- Pau Rodriguez,
- Massimo Caccia,
- Qi She,
- Yu Chen,
- Quentin Jodelet,
- Ruiping Wang,
- Zheda Mai,
- David Vazquez,
- German I. Parisi,
- Nikhil Churamani,
- Marc Pickett,
- Issam Laradji,
- Davide Maltoni
Artificial Intelligence (ARTI), Volume 303, Issue C. https://doi.org/10.1016/j.artint.2021.103635
Abstract: In the last few years, we have witnessed a renewed and fast-growing interest in continual learning with deep neural networks, with the shared objective of making current AI systems more adaptive, efficient and autonomous. However, despite the ...
- Article, September 2021
Balanced Softmax Cross-Entropy for Incremental Learning
Artificial Neural Networks and Machine Learning – ICANN 2021, Pages 385–396. https://doi.org/10.1007/978-3-030-86340-1_31
Abstract: Deep neural networks are prone to catastrophic forgetting when incrementally trained on new classes or new tasks, as adaptation to the new data leads to a drastic decrease of performance on the old classes and tasks. By using a small memory for ...
- Article, September 2019
Transfer Learning with Sparse Associative Memories
Artificial Neural Networks and Machine Learning – ICANN 2019: Theoretical Neural Computation, Pages 497–512. https://doi.org/10.1007/978-3-030-30487-4_39
Abstract: In this paper, we introduce a novel layer designed to be used as the output of pre-trained neural networks in the context of classification. Based on Associative Memories, this layer can help design deep neural networks which support incremental ...