Similar resources
Continuous Adaptation via Meta-Learning in Nonstationary and Competitive Environments
The ability to continuously learn and adapt from limited experience in nonstationary environments is an important milestone on the path towards general intelligence. In this paper, we cast the problem of continuous adaptation into the learning-to-learn framework. We develop a simple gradient-based meta-learning algorithm suitable for adaptation in dynamically changing and adversarial scenarios. Add...
6 Competitive Networks and Competitive Learning
Competitive neural networks belong to a class of recurrent networks, and they are based on algorithms of unsupervised learning, such as the competitive algorithm explained in this section. In competitive learning, the output neurons of a neural network compete among themselves to become active (to be "fired"). Whereas in multilayer perceptrons several output neurons may be active simultaneousl...
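The winner-take-all rule sketched in this abstract can be illustrated with a minimal NumPy example. The function below is a generic sketch, not the algorithm from any one of the listed papers: for each input, only the single closest prototype (the winning neuron) is adapted, while all others stay fixed. The data, learning rate, and epoch count are illustrative assumptions.

```python
import numpy as np

def competitive_learning(data, n_units=2, lr=0.1, epochs=20, seed=0):
    """Winner-take-all competitive learning: for each input vector,
    only the closest prototype (the 'winner') moves toward the input."""
    rng = np.random.default_rng(seed)
    # Initialize prototypes from randomly chosen data points.
    protos = data[rng.choice(len(data), n_units, replace=False)].astype(float)
    for _ in range(epochs):
        for x in data[rng.permutation(len(data))]:
            winner = np.argmin(np.linalg.norm(protos - x, axis=1))
            protos[winner] += lr * (x - protos[winner])  # only the winner adapts
    return protos

# Toy data: two well-separated 2-D clusters around (0, 0) and (5, 5).
rng = np.random.default_rng(1)
data = np.vstack([rng.normal(0, 0.1, (50, 2)), rng.normal(5, 0.1, (50, 2))])
protos = competitive_learning(data, n_units=2)
```

With two prototypes and two clusters, each prototype ends up near one cluster center, since a prototype that wins a cluster's points keeps winning them and is pulled toward that cluster's mean.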
Competitive Learning in Autonomous
ABSTRACT Artificial Neural Network (ANN) algorithms provide powerful techniques for the construction of computing systems that are capable of adapting to non-stationary or incompletely specified operating environments. Unsupervised learning is of particular interest in such situations. This paper reports the results of empirical studies into the characteristics of standard and soft competitiv...
Maximum Likelihood Competitive Learning
One popular class of unsupervised algorithms is competitive algorithms. In the traditional view of competition, only one competitor, the winner, adapts for any given case. I propose to view competitive adaptation as attempting to fit a blend of simple probability generators (such as Gaussians) to a set of data points. The maximum likelihood fit of a model of this type suggests a "softer" form ...
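The "softer" competition this abstract alludes to can be sketched as follows: instead of moving only the winner, every prototype adapts in proportion to the posterior probability that its Gaussian generated the data point. This is a generic illustration of soft competitive learning under an assumed shared isotropic variance, not the exact update rule of the cited paper.

```python
import numpy as np

def soft_competitive_step(protos, x, lr=0.05, var=1.0):
    """One soft-competitive update: each prototype moves toward x in
    proportion to its Gaussian responsibility for x (shared variance)."""
    d2 = np.sum((protos - x) ** 2, axis=1)
    resp = np.exp(-d2 / (2.0 * var))
    resp /= resp.sum()  # posterior responsibilities, summing to 1
    return protos + lr * resp[:, None] * (x - protos)

# Example: the nearer prototype receives most of the responsibility
# and therefore moves much more than the distant one.
protos = np.array([[0.0, 0.0], [4.0, 0.0]])
x = np.array([0.5, 0.0])
updated = soft_competitive_step(protos, x)
```

As the variance shrinks, the responsibilities concentrate on the nearest prototype and the rule recovers hard winner-take-all competition.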
Approximate kernel competitive learning
Kernel competitive learning has been successfully used to achieve robust clustering. However, kernel competitive learning (KCL) is not scalable for large-scale data processing, because (1) it has to calculate and store the full kernel matrix, which is too large to compute and keep in memory, and (2) it cannot be computed in parallel. In this paper we develop a framework of approximate ke...
Journal
Journal title: IEEE/CAA Journal of Automatica Sinica
Year: 2023
ISSN: ['2329-9274', '2329-9266']
DOI: https://doi.org/10.1109/jas.2023.123354