Self-paced Weight Consolidation for Continual Learning
Authors
Abstract
Continual learning algorithms that keep the parameters learned for new tasks close to those of previous tasks are popular for preventing catastrophic forgetting in sequential task settings. However, 1) the performance of the continual learner degrades if the contributions of previously learned tasks are not distinguished; and 2) the computational cost increases greatly with the number of tasks, since most existing algorithms need to regularize all previous tasks when learning a new one. To address these challenges, we propose a self-paced Weight Consolidation (spWC) framework that attains robust continual learning by evaluating the discriminative contributions of previous tasks. Specifically, we develop a self-paced regularization that reflects the priorities of past tasks by measuring their difficulty based on a key performance indicator (i.e., accuracy). When encountering a new task, all previous tasks are sorted from "difficult" to "easy" according to these priorities; selectively maintaining the knowledge of the more difficult past tasks then overcomes catastrophic forgetting at lower computational cost. We adopt an alternative convex search to iteratively update the model parameters and the priority weights in the resulting bi-convex formulation. The proposed spWC framework is plug-and-play: it is applicable to most continual learning algorithms (e.g., EWC, MAS and RCIL) and to different directions (e.g., classification and segmentation). Experimental results on several public benchmark datasets demonstrate that our framework can effectively improve performance compared with other popular continual learning algorithms.
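The priority computation described in the abstract can be sketched as follows. This is a hedged illustration, not the paper's exact formulation: the function name `spwc_priorities`, the difficulty measure (one minus accuracy), and the fraction of hardest tasks kept for regularization are all assumptions made for the sketch.

```python
import numpy as np

def spwc_priorities(task_accuracies, keep_ratio=0.5):
    """Illustrative spWC-style task selection: tasks with lower accuracy
    are treated as more 'difficult' and receive higher priority for
    weight consolidation. The difficulty measure and keep_ratio are
    assumptions, not the authors' exact definitions."""
    acc = np.asarray(task_accuracies, dtype=float)
    difficulty = 1.0 - acc                       # lower accuracy -> harder task
    order = np.argsort(-difficulty)              # sort from "difficult" to "easy"
    n_keep = max(1, int(np.ceil(keep_ratio * len(acc))))
    selected = order[:n_keep]                    # only the hardest tasks are regularized
    weights = np.zeros_like(acc)
    weights[selected] = difficulty[selected] / difficulty[selected].sum()
    return order, weights

# Four past tasks; task 1 (accuracy 0.60) is the most difficult.
order, w = spwc_priorities([0.95, 0.60, 0.80, 0.70])
```

Regularizing only the selected subset is what reduces the cost relative to methods (e.g., EWC) that penalize drift from every previous task.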
Similar resources
Self-Paced Curriculum Learning
Curriculum learning (CL) or self-paced learning (SPL) represents a recently proposed learning regime inspired by the learning process of humans and animals that gradually proceeds from easy to more complex samples in training. The two methods share a similar conceptual learning paradigm, but differ in specific learning schemes. In CL, the curriculum is predetermined by prior knowledge, and rema...
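As a minimal illustration of the self-paced scheme mentioned in this snippet (a sketch under common SPL conventions, not the cited paper's implementation), the classic hard weighting includes a sample only when its current loss falls below an "age" threshold `lam`, while a soft linear variant down-weights harder samples gradually:

```python
import numpy as np

def spl_hard_weights(losses, lam):
    """Hard self-paced weights: include sample (v=1) iff its loss < lam."""
    return (np.asarray(losses, dtype=float) < lam).astype(float)

def spl_linear_weights(losses, lam):
    """Soft (linear) variant: easier samples get weights closer to 1."""
    l = np.asarray(losses, dtype=float)
    return np.clip(1.0 - l / lam, 0.0, 1.0)

losses = [0.2, 1.5, 0.1, 3.0]
v_hard = spl_hard_weights(losses, lam=1.0)    # only the two easy samples
v_soft = spl_linear_weights(losses, lam=2.0)  # graded weights
```

In a full SPL loop, `lam` is gradually increased so that harder samples enter training later, which is the "easy to complex" curriculum the snippet describes.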
Multi-view Self-Paced Learning for Clustering
Exploiting the information from multiple views can improve clustering accuracy. However, most existing multi-view clustering algorithms are nonconvex and are thus prone to becoming stuck into bad local minima, especially when there are outliers and missing data. To overcome this problem, we present a new multi-view self-paced learning (MSPL) algorithm for clustering, that learns the multi-view ...
Self-Paced Learning for Latent Variable Models
Latent variable models are a powerful tool for addressing several tasks in machine learning. However, the algorithms for learning the parameters of latent variable models are prone to getting stuck in a bad local optimum. To alleviate this problem, we build on the intuition that, rather than considering all samples simultaneously, the algorithm should be presented with the training data in a me...
Self-Paced Learning for Semisupervised Image Classification
In this project, I plan to apply self-paced learning to the bounding-box problem using the VOC2011 dataset.
Self-Paced Learning with Diversity
Self-paced learning (SPL) is a recently proposed learning regime inspired by the learning process of humans and animals that gradually incorporates easy to more complex samples into training. Existing methods are limited in that they ignore an important aspect in learning: diversity. To incorporate this information, we propose an approach called self-paced learning with diversity (SPLD) which f...
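The diversity idea summarized in this snippet can be sketched as a rank-dependent threshold applied within each group of samples. The helper below is a hedged reconstruction of an SPLD-style selection rule; the function name, the grouping interface, and the parameter values are illustrative assumptions:

```python
import numpy as np

def spld_weights(losses, groups, lam, gamma):
    """Illustrative SPLD-style selection: within each group, samples are
    ranked by loss and selected if the loss falls below a rank-dependent
    threshold. The threshold shrinks with rank, so only a few samples per
    group are taken, spreading selection across groups (diversity)."""
    losses = np.asarray(losses, dtype=float)
    v = np.zeros_like(losses)
    for g in set(groups):
        idx = [i for i, gi in enumerate(groups) if gi == g]
        order = sorted(idx, key=lambda i: losses[i])   # easiest first
        for rank, i in enumerate(order, start=1):
            thresh = lam + gamma / (np.sqrt(rank) + np.sqrt(rank - 1))
            if losses[i] < thresh:
                v[i] = 1.0
    return v
```

With `gamma > 0`, the first-ranked sample in every group faces the loosest threshold, so easy samples from many groups are preferred over many easy samples from one group.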
Journal
عنوان ژورنال: IEEE Transactions on Circuits and Systems for Video Technology
Year: 2023
ISSN: 1051-8215, 1558-2205
DOI: https://doi.org/10.1109/tcsvt.2023.3304567