Search results for: pruning
Number of results: 9637
In this paper, we propose a grouped scheme that can be applied to compute the pruning fast Fourier transform (pruning FFT) with a power-of-two partial transformation length. The group-based pruning FFT algorithm uses grouped frequency indices to accelerate the computation of selected discrete Fourier transform (DFT) outputs. The proposed pruning FFT algorithm has f...
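The core idea of output pruning, computing only the DFT bins that are actually needed, can be sketched directly. The paper's grouped power-of-two scheme is more elaborate; this is only the naive selected-output computation, with an illustrative function name:

```python
import cmath

def pruned_dft(x, wanted_bins):
    """Compute only the selected DFT outputs directly.

    When only a few frequency bins are needed, computing them
    directly costs O(N * |wanted_bins|) instead of running the
    full O(N log N) FFT and discarding most of its outputs.
    """
    n = len(x)
    return {
        k: sum(x[t] * cmath.exp(-2j * cmath.pi * k * t / n)
               for t in range(n))
        for k in wanted_bins
    }
```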
We describe an experimental study of pruning methods for decision tree classifiers when the goal is minimizing loss rather than error. In addition to two common methods for error minimization, CART's cost-complexity pruning and C4.5's error-based pruning, we study the extension of cost-complexity pruning to loss and one pruning variant based on the Laplace correction. We perform an empirical com...
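For reference, the quantity that drives cost-complexity pruning is each node's effective complexity parameter; nodes with the smallest value are pruned first. A minimal sketch, with an illustrative interface:

```python
def effective_alpha(leaf_errors, subtree_errors, n_subtree_leaves):
    """Effective complexity parameter of an internal node t.

    alpha = (R(t) - R(T_t)) / (|leaves(T_t)| - 1): the penalty per
    leaf at which collapsing the subtree rooted at t into a single
    leaf becomes worthwhile. Pruning nodes in order of increasing
    alpha yields the nested sequence of subtrees that
    cost-complexity pruning selects from.
    """
    return (leaf_errors - subtree_errors) / (n_subtree_leaves - 1)
```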
Cost-complexity pruning generates nested subtrees and selects the best one. However, its computational cost is large because it uses a holdout sample or cross-validation. On the other hand, pruning algorithms based on posterior calculations, such as BIC (MDL) and MEP, are faster, but they sometimes produce trees that are too big or too small and therefore yield poor generalization error. In this paper, we propose a...
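A minimal sketch of a BIC-style prune/keep decision, assuming multinomial class counts at each node. This illustrates the penalized-likelihood idea behind such posterior-based pruning, not the exact criterion of any particular paper:

```python
import math

def bic_score(counts, n_params):
    """BIC-style score for multinomial class counts: higher is better.

    Log-likelihood of the counts under their empirical distribution,
    minus the usual 0.5 * k * ln(n) complexity penalty.
    """
    n = sum(counts)
    loglik = sum(c * math.log(c / n) for c in counts if c > 0)
    return loglik - 0.5 * n_params * math.log(n)

def should_prune(leaf_counts, child_counts_list):
    """Prune a split if collapsing it to one leaf scores at least as well."""
    n = sum(leaf_counts)
    leaf = bic_score(leaf_counts, len(leaf_counts) - 1)
    split_loglik = sum(
        sum(c * math.log(c / sum(counts)) for c in counts if c > 0)
        for counts in child_counts_list
    )
    split_params = sum(len(counts) - 1 for counts in child_counts_list)
    split = split_loglik - 0.5 * split_params * math.log(n)
    return leaf >= split
```

An uninformative split (children mirror the parent's class mix) is pruned, while a split that separates the classes is kept.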
–Independent Work Report, Spring 2015– Synaptic Pruning Mechanisms in Learning. Abstract: Synaptic pruning is the process of removing synapses in neural networks and has been considered a method of learning. During the developmental stages of the brain, synaptic pruning helps regulate efficiency and energy conservation. However, a destructive algorithm seems counter-intuitive to “le...
We describe an experimental study of pruning methods for decision tree classifiers in two learning situations: minimizing loss and probability estimation. In addition to the two most common methods for error minimization, CART's cost-complexity pruning and C4.5's error-based pruning, we study the extension of cost-complexity pruning to loss and two pruning variants based on Laplace corrections. W...
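The Laplace correction referred to above smooths a leaf's raw class frequencies toward the uniform prior, so small leaves do not produce extreme probability estimates. A minimal sketch:

```python
def laplace_probability(class_count, total, n_classes):
    """Laplace-corrected class probability estimate at a tree leaf.

    Instead of the raw frequency n_c / n, return (n_c + 1) / (n + k),
    where k is the number of classes. An empty leaf then estimates
    the uniform probability 1 / k rather than being undefined.
    """
    return (class_count + 1) / (total + n_classes)
```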
The min-max modular neural network with Gaussian zero-crossing function (M-GZC) has a locally tuned response characteristic and an emergent incremental learning ability, but it suffers from quadratic complexity in storage space and response time. Redundant sample pruning and redundant structure pruning can be considered to overcome these weaknesses. This paper aims at the latter; it analyzes the prop...
This paper presents a new class of pruning rule for unordered search. Previous pruning rules for unordered search identify operators that should not be applied in order to prune nodes reached via those operators. In contrast, the new pruning rules identify operators that should be applied and prune nodes that are not reached via those operators. Specific pruning rules employing both these appro...
This paper presents a new class of pruning axiom for unordered search. Previous pruning axioms for unordered search identify operators that should not be applied in order to prune states reached via those operators. In contrast, the new pruning axioms identify operators that should be applied and prune states that are not reached via those operators. Specific pruning axioms employing both these...
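A minimal sketch of the "apply-only" style of pruning rule described above, in a depth-first search: when a rule designates operators that should be applied at a state, successors reached via any other operator are pruned. All names and the interface are illustrative:

```python
def search(state, goal, successors, must_apply, path=()):
    """Depth-first search with an apply-only pruning rule.

    must_apply(state) returns the set of operators that should be
    applied at this state (for example, a single safe operator);
    states reached via other operators are pruned. Returning None
    means no rule fires and every operator is expanded.
    """
    if state == goal:
        return list(path)
    ops = must_apply(state)
    for op, nxt in successors(state):
        if ops is not None and op not in ops:
            continue  # pruned: not one of the mandated operators
        result = search(nxt, goal, successors, must_apply, path + (op,))
        if result is not None:
            return result
    return None
```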
Recently, classifier ensemble methods have been gaining increasing attention in the machine-learning and data-mining communities. In most cases, the performance of an ensemble is better than that of a single classifier. Many methods for creating diverse classifiers were developed during the past decade. Once these diverse classifiers are generated, it is important to select the proper base classifier to j...
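One simple instance of such selection is greedy forward search on a validation set: keep adding the member that most improves majority-vote accuracy, and stop when nothing helps. This sketch is illustrative, not a specific published method; classifiers are modeled as plain prediction functions:

```python
from collections import Counter

def greedy_ensemble_pruning(classifiers, X_val, y_val, max_size):
    """Greedy forward selection of ensemble members.

    Repeatedly adds the candidate whose inclusion most improves
    majority-vote accuracy on the validation set; stops when no
    candidate helps or max_size members are selected.
    """
    def vote_accuracy(members):
        correct = 0
        for x, y in zip(X_val, y_val):
            votes = Counter(clf(x) for clf in members)
            if votes.most_common(1)[0][0] == y:
                correct += 1
        return correct / len(y_val)

    selected, remaining, best_acc = [], list(classifiers), 0.0
    while remaining and len(selected) < max_size:
        acc, best = max(
            ((vote_accuracy(selected + [c]), c) for c in remaining),
            key=lambda t: t[0],
        )
        if acc <= best_acc and selected:
            break
        selected.append(best)
        remaining.remove(best)
        best_acc = acc
    return selected, best_acc
```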
Neural network pruning methods on the level of individual network parameters (e.g. connection weights) can improve generalization, as is shown in this empirical study. However, an open problem in the pruning methods known today (e.g. OBD, OBS, autoprune, epsiprune) is selecting the number of parameters to remove in each pruning step (the pruning strength). This work presents a pruning me...
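The simplest way to apply a given pruning strength is magnitude-based: zero out the stated fraction of smallest-magnitude weights. A sketch assuming a flat weight list; this is not OBD/OBS, which use second-order saliency information rather than magnitudes:

```python
def magnitude_prune(weights, strength):
    """Zero out the fraction `strength` of smallest-magnitude weights.

    `strength` plays the role of the pruning strength discussed
    above: how many parameters are removed in one pruning step.
    """
    n_prune = int(len(weights) * strength)
    if n_prune == 0:
        return list(weights)
    threshold = sorted(abs(w) for w in weights)[n_prune - 1]
    pruned, removed = [], 0
    for w in weights:
        if abs(w) <= threshold and removed < n_prune:
            pruned.append(0.0)  # weight removed
            removed += 1
        else:
            pruned.append(w)    # weight kept
    return pruned
```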