Search results for: clustering error
Number of results: 353239
In this paper, we propose and study a semi-random model for the Correlation Clustering problem on arbitrary graphs G. We give two approximation algorithms for Correlation Clustering instances from this model. The first algorithm finds a solution of value (1 + δ)·opt-cost + O_δ(n log n) with high probability, where opt-cost is the value of the optimal solution (for every δ > 0). The second algorit...
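The abstract above does not spell out either algorithm, so as a point of reference here is a minimal sketch of the classic randomized Pivot (KwikCluster) heuristic for Correlation Clustering on a complete graph with +/- edge labels. This is not the semi-random-model algorithm from the paper, and the `positive_edges` representation is just an illustrative choice.

```python
import random

def pivot_correlation_clustering(nodes, positive_edges, seed=None):
    """Randomized Pivot (KwikCluster) heuristic for correlation clustering.

    positive_edges: set of frozenset({u, v}) pairs labelled '+'; every other
    pair is implicitly labelled '-'. Repeatedly pick a random pivot and group
    it with all remaining '+'-neighbours.
    """
    rng = random.Random(seed)
    remaining = set(nodes)
    clusters = []
    while remaining:
        pivot = rng.choice(sorted(remaining))
        cluster = {pivot} | {v for v in remaining
                             if v != pivot and frozenset((pivot, v)) in positive_edges}
        clusters.append(cluster)
        remaining -= cluster
    return clusters

# toy usage: '+' edges form a triangle {0, 1, 2}; node 3 ends up alone
plus = {frozenset(e) for e in [(0, 1), (1, 2), (0, 2)]}
print(pivot_correlation_clustering(range(4), plus, seed=1))
```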
A mixed approach for reducing the order of higher-order systems, based on an improved Padé pole-clustering method, is presented for deriving a reduced-order approximation of a stable continuous-time system. In this method, the denominator polynomial of the reduced-order model is derived by the improved pole-clustering approach and the numerator polynomial is obtained through the Padé approximation technique and by...
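As a rough illustration of the mixed idea (cluster-based denominator, Padé-matched numerator), the sketch below assumes real, stable poles grouped by the caller, uses a harmonic-mean style cluster centre, and matches the leading power-series coefficients about s = 0. The grouping and the centre formula are assumptions for illustration; the improved pole-clustering rule in the paper may differ.

```python
import numpy as np

def series_coeffs(num, den, r):
    """First r Taylor coefficients of num(s)/den(s) about s = 0.
    num, den are coefficient arrays in ascending powers of s."""
    c = np.zeros(r)
    c[0] = num[0] / den[0]
    for i in range(1, r):
        a_i = num[i] if i < len(num) else 0.0
        conv = sum(den[j] * c[i - j] for j in range(1, min(i, len(den) - 1) + 1))
        c[i] = (a_i - conv) / den[0]
    return c

def reduced_model(num, den, poles_by_cluster):
    """Mixed reduction sketch: denominator from pole-cluster centres,
    numerator from Pade (power-series) matching."""
    # assumed harmonic-mean style centre for each pole cluster
    centers = [len(p) / np.sum(1.0 / np.asarray(p, dtype=float)) for p in poles_by_cluster]
    den_r = np.real(np.poly(centers))[::-1]            # ascending powers
    r = len(centers)
    c = series_coeffs(num, den, r)
    # match the first r series coefficients: n_j = sum_{i<=j} den_r[i] * c[j-i]
    num_r = np.array([sum(den_r[i] * c[j - i] for i in range(j + 1)) for j in range(r)])
    return num_r, den_r

# toy usage: 3rd-order system with poles at -1, -2, -5 reduced to 2nd order
num = [10.0, 2.0]                       # 10 + 2s
den = [10.0, 17.0, 8.0, 1.0]            # (s+1)(s+2)(s+5), ascending powers
print(reduced_model(num, den, poles_by_cluster=[[-1.0], [-2.0, -5.0]]))
```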
This paper is concerned with transductive learning. We study a recent transductive learning approach based on clustering. In this approach, one constructs a diversity of unsupervised models of the unlabeled data using clustering algorithms. These models are then exploited to construct a number of hypotheses using the labeled data, and the learner selects a hypothesis that minimizes a transductiv...
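A minimal cluster-then-label sketch of this idea, assuming k-means over several cluster counts as the family of unsupervised models and using labelled-set error as a crude stand-in for the transductive criterion the paper minimizes; the function name and parameters are illustrative, not the authors' method.

```python
import numpy as np
from sklearn.cluster import KMeans

def cluster_then_label(X_lab, y_lab, X_unlab, ks=(2, 3, 4, 5), seed=0):
    """Build several clusterings of all data, turn each into a hypothesis by
    majority vote of labelled points per cluster, and keep the hypothesis with
    the lowest error on the labelled set. Assumes integer class labels."""
    y_lab = np.asarray(y_lab)
    X_all = np.vstack([X_lab, X_unlab])
    n_lab = len(X_lab)
    best = None
    for k in ks:
        assign = KMeans(n_clusters=k, n_init=10, random_state=seed).fit_predict(X_all)
        cluster_label = {}
        for c in range(k):
            members = y_lab[assign[:n_lab] == c]
            # clusters without labelled members get a dummy label of -1
            cluster_label[c] = np.bincount(members).argmax() if len(members) else -1
        pred_lab = np.array([cluster_label[c] for c in assign[:n_lab]])
        err = np.mean(pred_lab != y_lab)
        if best is None or err < best[0]:
            best = (err, np.array([cluster_label[c] for c in assign[n_lab:]]))
    return best[1]   # predicted labels for the unlabelled points
```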
Dynamic clustering problems can be solved by finding several clustering solutions with different numbers of clusters and choosing the one that minimizes a given evaluation function value. This kind of brute-force approach is general but not very efficient. We propose a dynamic local search that solves the number and location of the clusters jointly. The algorithm uses a set of basic operatio...
Dynamic clustering problems can be solved by finding several clustering solutions with different numbers of clusters and choosing the one that minimizes a given evaluation function. This kind of brute-force approach is general but not very efficient. We propose a new dynamic local search that solves the number and location of the clusters jointly. The algorithm uses a set of basic operations...
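A toy sketch of such a dynamic local search, assuming a simple MSE-plus-cluster-count penalty as the evaluation function and add/remove/relocate moves on the centroid set; the actual operations and criterion in the papers above may differ.

```python
import numpy as np

def evaluate(X, centers):
    """Assumed evaluation function: mean squared distance to the nearest
    centroid, mildly penalised by the number of clusters."""
    d = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
    return d.min(axis=1).mean() * (1 + 0.05 * len(centers))

def dynamic_local_search(X, iters=200, seed=0):
    """Start with one centroid and repeatedly try add / remove / relocate
    moves, keeping a move only when it lowers the evaluation function, so the
    number and location of clusters are searched jointly."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), 1)]
    best = evaluate(X, centers)
    for _ in range(iters):
        op = rng.choice(["add", "remove", "relocate"])
        cand = centers.copy()
        if op == "add":
            cand = np.vstack([cand, X[rng.integers(len(X))]])
        elif op == "remove" and len(cand) > 1:
            cand = np.delete(cand, rng.integers(len(cand)), axis=0)
        else:  # relocate one centroid onto a random data point
            cand[rng.integers(len(cand))] = X[rng.integers(len(X))]
        score = evaluate(X, cand)
        if score < best:
            centers, best = cand, score
    return centers
```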
We propose a new clustering approach, called optimality-based clustering, that clusters data points based on their latent decision-making preferences. We assume each data point is a decision generated by a decision-maker who (approximately) solves an optimization problem, and we cluster the points by identifying a common objective function of the underlying problems for which the worst-case optimality error is minimized. Three different models ...
Assigning a set of objects to groups such that objects in one group or cluster are more similar to each other than to objects in the other clusters is the main task of clustering analysis. The SSPCO optimization algorithm is a new optimization algorithm inspired by the behavior of a type of bird called the see-see partridge. One of the problems that smart algorithms are applied to solve is the problem ...
Research on measurement error in network data has typically focused on missing data. We embed missing data, which we term false negative nodes and edges, in a broader classification of error scenarios. This includes false positive nodes and edges and falsely aggregated and disaggregated nodes. We simulate these six measurement errors using an online social network and a publication citation net...
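A small simulation sketch along these lines, assuming networkx and covering only three of the six error scenarios (false negative nodes, false negative edges, false positive edges); node aggregation and disaggregation are omitted and the probabilities are arbitrary illustrative values.

```python
import random
import networkx as nx

def perturb_network(G, p_fn_node=0.1, p_fn_edge=0.1, p_fp_edge=0.05, seed=0):
    """Return a copy of G with simulated measurement error: nodes and edges
    dropped at the given rates (false negatives) and a proportional number of
    spurious edges added between random non-adjacent pairs (false positives)."""
    rng = random.Random(seed)
    H = G.copy()
    for v in list(H.nodes):                       # false negative nodes
        if rng.random() < p_fn_node:
            H.remove_node(v)
    for u, v in list(H.edges):                    # false negative edges
        if rng.random() < p_fn_edge:
            H.remove_edge(u, v)
    nodes = list(H.nodes)                         # false positive edges
    target = int(p_fp_edge * H.number_of_edges())
    added, attempts = 0, 0
    while added < target and len(nodes) > 1 and attempts < 100 * max(target, 1):
        attempts += 1
        u, v = rng.sample(nodes, 2)
        if not H.has_edge(u, v):
            H.add_edge(u, v)
            added += 1
    return H

# usage: compare sizes of the true and the error-laden network
G = nx.karate_club_graph()
H = perturb_network(G)
print(len(G), "->", len(H), "nodes;", G.number_of_edges(), "->", H.number_of_edges(), "edges")
```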
We investigate the robust PCA problem of decomposing an observed matrix into the sum of a low-rank matrix and a sparse error matrix via the convex program Principal Component Pursuit (PCP). In contrast to previous studies that assume the support of the sparse error matrix is generated by uniform Bernoulli sampling, we allow non-uniform sampling, i.e., entries of the low-rank matrix are ...
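For reference, a minimal ADMM sketch of standard Principal Component Pursuit (nuclear norm plus l1 penalty, with the usual lambda = 1/sqrt(max(m, n)) choice). It implements the generic convex program only, not the non-uniform-sampling analysis discussed in the abstract.

```python
import numpy as np

def shrink(X, tau):
    """Elementwise soft-thresholding."""
    return np.sign(X) * np.maximum(np.abs(X) - tau, 0.0)

def svt(X, tau):
    """Singular value thresholding: soft-threshold the singular values."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return U @ np.diag(shrink(s, tau)) @ Vt

def pcp(M, max_iter=500, tol=1e-7):
    """min ||L||_* + lam * ||S||_1  subject to  L + S = M, solved by ADMM."""
    m, n = M.shape
    lam = 1.0 / np.sqrt(max(m, n))
    mu = m * n / (4.0 * np.abs(M).sum() + 1e-12)
    L = np.zeros_like(M); S = np.zeros_like(M); Y = np.zeros_like(M)
    for _ in range(max_iter):
        L = svt(M - S + Y / mu, 1.0 / mu)          # low-rank update
        S = shrink(M - L + Y / mu, lam / mu)       # sparse update
        resid = M - L - S
        Y = Y + mu * resid                         # dual update
        if np.linalg.norm(resid) <= tol * np.linalg.norm(M):
            break
    return L, S
```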
In this paper, a new clustering technique called the Dimensional Split Phonetic Decision Tree (DS-PDT) is proposed. In DS-PDT, state distributions are split dimensionally when applying a phonetic question. This technique is an extension of decision-tree-based acoustic modeling. It gives a proper context-dependent sharing structure for each dimension automatically while maintaining the correlations ...