Search results for: overfitting

Number of results: 4333

Journal: Entropy 2017
Xiaoyue Feng, Yanchun Liang, Xiaohu Shi, Dong Xu, Xu Wang, Renchu Guan

Overfitting is an important problem in machine learning. Several algorithms, such as the extreme learning machine (ELM), suffer from this issue when facing high-dimensional sparse data, e.g., in text classification. One common issue is that the extent of overfitting is not well quantified. In this paper, we propose a quantitative measure of overfitting referred to as the rate of overfitting (RO...

1997
A. J. M. M. Weijters, H. Jaap van den Herik, Antal van den Bosch, Eric O. Postma

Overfitting is a well-known problem in the fields of symbolic and connectionist machine learning. It describes the deterioration of generalisation performance of a trained model. In this paper, we investigate the ability of a novel artificial neural network, bp-som, to avoid overfitting. bp-som is a hybrid neural network which combines a multi-layered feed-forward network (mfn) with Kohonen’s s...

Journal: Proceedings of the ... AAAI Conference on Artificial Intelligence 2021

Although fast adversarial training has demonstrated both robustness and efficiency, the problem of "catastrophic overfitting" has been observed. This is a phenomenon in which, during single-step training, robust accuracy against projected gradient descent (PGD) suddenly decreases to 0% after a few epochs, whereas robust accuracy against the fast gradient sign method (FGSM) increases to 100%. In this paper, we demonstrate that catastrophic overf...

2007
Yue Liu, Janusz A. Starzyk, Zhen Zhu

In this paper, a novel and effective criterion based on the estimation of the signal-to-noise-ratio figure (SNRF) is proposed to optimize the number of hidden neurons in neural networks to avoid overfitting in function approximation. SNRF can quantitatively measure the useful information left unlearned, so that overfitting can be automatically detected from the training error alone, without us...
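The SNRF criterion itself is not specified in this truncated abstract. As a stand-in, the general idea of tuning model capacity (here, polynomial degree rather than hidden-neuron count) by monitoring held-out validation error can be sketched as follows; all names and the toy data are illustrative, not from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Noisy samples of a smooth target function.
x = rng.uniform(-1, 1, 60)
y = np.sin(np.pi * x) + 0.1 * rng.normal(size=60)

# Hold out a validation set; training error alone keeps decreasing
# with capacity and cannot reveal overfitting by itself.
x_tr, y_tr = x[:40], y[:40]
x_va, y_va = x[40:], y[40:]

def val_error(degree):
    coeffs = np.polyfit(x_tr, y_tr, degree)   # fit on training data only
    pred = np.polyval(coeffs, x_va)
    return np.mean((pred - y_va) ** 2)        # validation MSE

# Model capacity (polynomial degree) stands in for hidden-neuron count.
errors = {d: val_error(d) for d in range(1, 11)}
best = min(errors, key=errors.get)
print("chosen capacity:", best)
```

This uses validation error where the paper proposes SNRF, whose stated advantage is detecting overfitting from the training error only.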

2010
Jirí Grim, Jan Hora

We discuss the problem of overfitting of probabilistic neural networks in the framework of statistical pattern recognition. The probabilistic approach to neural networks provides a statistically justified subspace method of classification. The underlying structural mixture model includes binary structural parameters and can be optimized by the EM algorithm in full generality. Formally, the structur...

2000
Steve Lawrence, C. Lee Giles

Methods for controlling the bias/variance tradeoff typically assume that overfitting or overtraining is a global phenomenon. For multi-layer perceptron (MLP) neural networks, global parameters such as the training time (e.g. based on validation tests), network size, or the amount of weight decay are commonly used to control the bias/variance tradeoff. However, the degree of overfitting can vary...
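One of the global controls mentioned above, choosing the training time via validation tests, is commonly implemented as early stopping. A minimal generic sketch (not from the paper; synthetic data, gradient descent on a linear model, and all parameter names are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic regression task split into training and validation sets.
X = rng.normal(size=(200, 20))
w_true = rng.normal(size=20)
y = X @ w_true + 0.5 * rng.normal(size=200)
X_tr, y_tr = X[:150], y[:150]
X_va, y_va = X[150:], y[150:]

w = np.zeros(20)
lr, patience = 0.001, 10
best_val, best_w, bad_epochs = np.inf, w.copy(), 0

for epoch in range(5000):
    # One full-batch gradient step on training MSE.
    grad = X_tr.T @ (X_tr @ w - y_tr) / len(y_tr)
    w -= lr * grad
    # Validation MSE decides when to stop, not training MSE.
    val = np.mean((X_va @ w - y_va) ** 2)
    if val < best_val:
        best_val, best_w, bad_epochs = val, w.copy(), 0
    else:
        bad_epochs += 1
        if bad_epochs >= patience:   # no improvement for `patience` epochs
            break

print("best validation MSE: %.3f" % best_val)
```

The point of the Lawrence and Giles abstract is that such a single global stopping time may be suboptimal when the degree of overfitting varies across the network.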

Journal: :Information Fusion 2009
Eulanda Miranda dos Santos, Robert Sabourin, Patrick Maupin

Information fusion research has recently focused on the characteristics of the decision profiles of ensemble members in order to optimize performance. These characteristics are particularly important in the selection of ensemble members. However, even though the control of overfitting is a challenge in machine learning problems, much less work has been devoted to the control of overfitting in s...

2007
Alexander Vezhnevets, Olga Barinova

Boosting methods are known to exhibit noticeable overfitting on some datasets, while being immune to overfitting on others. In this paper we show that standard boosting algorithms are not appropriate in the case of overlapping classes. This inadequacy is likely to be the major source of boosting overfitting when working with real-world data. To verify our conclusion we use the fact that an...

Journal: Journal of Physics A: Mathematical and Theoretical 2019

Chart: number of search results per year
