Pseudo-Likelihood Inference Underestimates Model Uncertainty: Evidence from Bayesian Nearest Neighbours

Authors

  • Hugh Chipman
  • Mu Zhu
  • Wanhua Su
Abstract:

When using the K-nearest neighbours (KNN) method, one often ignores the uncertainty in the choice of K. To account for such uncertainty, Bayesian KNN (BKNN) has been proposed and studied (Holmes and Adams 2002; Cucala et al. 2009). We present some evidence to show that the pseudo-likelihood approach for BKNN, even after being corrected by Cucala et al. (2009), still significantly underestimates model uncertainty.
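The following is a minimal sketch of the kind of pseudo-likelihood posterior over K that is at issue, assuming the common leave-one-out formulation: for each candidate K, the data are scored by the product over points of the (smoothed) fraction of each point's K nearest neighbours sharing its class, and a uniform prior on K then gives a posterior by normalisation. The function name and smoothing constant are illustrative assumptions; this is not the exact model of Holmes and Adams (2002) or the corrected version of Cucala et al. (2009).

```python
# Illustrative sketch only: a leave-one-out pseudo-likelihood posterior over K
# with a uniform prior; not the exact BKNN model of the cited papers.
import numpy as np

def knn_pseudo_posterior(X, y, k_values):
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)           # leave-one-out: a point is never its own neighbour
    order = np.argsort(d, axis=1)         # neighbours of each point, nearest first
    log_pl = []
    for k in k_values:
        nbr_classes = y[order[:, :k]]     # classes of the k nearest neighbours
        agree = (nbr_classes == y[:, None]).sum(axis=1)
        p = (agree + 0.5) / (k + 1.0)     # smoothed agreement fraction, avoids log(0)
        log_pl.append(np.log(p).sum())    # log pseudo-likelihood for this k
    log_pl = np.array(log_pl)
    post = np.exp(log_pl - log_pl.max())  # uniform prior on K
    return post / post.sum()

# Toy usage: two overlapping Gaussian classes.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 1.0, (50, 2)), rng.normal(1.5, 1.0, (50, 2))])
y = np.array([0] * 50 + [1] * 50)
ks = list(range(1, 21))
print(dict(zip(ks, knn_pseudo_posterior(X, y, ks).round(3))))
```

The spread of a posterior of this general form over K is the model uncertainty that, according to the abstract, the pseudo-likelihood approach understates.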


Similar resources

An Efficient Algorithm for Bayesian Nearest Neighbours

K-Nearest Neighbours (k-NN) is a popular classification and regression algorithm, yet one of its main limitations is the difficulty in choosing the number of neighbours. We present a Bayesian algorithm to compute the posterior probability distribution for k given a target point within a dataset, efficiently and without the use of Markov Chain Monte Carlo (MCMC) methods or simulation—alongside a...
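As a rough illustration of averaging over k by direct enumeration rather than MCMC, the sketch below weights each candidate k by a simple leave-one-out score on the training data and averages the k-NN class probabilities at a target point under those weights. The scoring rule and function names are illustrative assumptions, not the specific algorithm of the paper above.

```python
# Illustrative sketch only: model averaging over k by direct enumeration
# (no MCMC); not the specific algorithm of the paper above.
import numpy as np

def bayes_knn_predict(X, y, x0, k_values, n_classes):
    d_train = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    np.fill_diagonal(d_train, np.inf)
    order = np.argsort(d_train, axis=1)                 # training-set neighbour lists
    nbrs = np.argsort(np.linalg.norm(X - x0, axis=1))   # neighbours of the target point
    weights = []
    for k in k_values:
        # un-normalised weight for k: leave-one-out accuracy on the training set
        votes = y[order[:, :k]]
        loo_pred = np.array([np.bincount(v, minlength=n_classes).argmax() for v in votes])
        weights.append((loo_pred == y).mean())
    weights = np.array(weights)
    weights /= weights.sum()                            # normalise into posterior-like weights
    probs = np.zeros(n_classes)
    for w, k in zip(weights, k_values):
        counts = np.bincount(y[nbrs[:k]], minlength=n_classes)
        probs += w * counts / k                         # class probabilities for this k
    return probs

# Toy usage: predicted class probabilities at one target point.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0.0, 1.0, (40, 2)), rng.normal(1.5, 1.0, (40, 2))])
y = np.array([0] * 40 + [1] * 40)
print(bayes_knn_predict(X, y, np.array([0.7, 0.7]), list(range(1, 16)), 2))
```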


Extended k-Nearest Neighbours based on Evidence Theory

An evidence-theoretic classification method is proposed in this paper. In order to classify a pattern, we consider its neighbours, which are taken as parts of a single source of evidence to support the class membership of the pattern. A single mass function or basic belief assignment is then derived, and the belief function and the pignistic ("betting rates") probability function can be calculat...
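A minimal sketch of an evidence-theoretic k-NN rule along the lines described above: each neighbour contributes a simple mass function that places some mass on its own class and the remainder on the whole frame of discernment, the k mass functions are combined with Dempster's rule, and a pignistic probability is read off at the end. The discounting form alpha*exp(-gamma*d^2) and the constants are illustrative assumptions rather than the paper's exact choices.

```python
# Illustrative sketch only: an evidence-theoretic k-NN rule with Dempster
# combination and a pignistic read-out; constants are illustrative.
import numpy as np

def dempster_combine(m1, m2):
    """Dempster's rule for mass functions stored as {frozenset: mass}."""
    out, conflict = {}, 0.0
    for a, ma in m1.items():
        for b, mb in m2.items():
            inter = a & b
            if inter:
                out[inter] = out.get(inter, 0.0) + ma * mb
            else:
                conflict += ma * mb
    return {s: v / (1.0 - conflict) for s, v in out.items()}

def evidential_knn(X, y, x0, k=5, n_classes=2, alpha=0.95, gamma=1.0):
    frame = frozenset(range(n_classes))
    d = np.linalg.norm(X - x0, axis=1)
    combined = {frame: 1.0}                       # start from the vacuous mass function
    for i in np.argsort(d)[:k]:
        s = alpha * np.exp(-gamma * d[i] ** 2)    # support for this neighbour's class
        combined = dempster_combine(combined, {frozenset([int(y[i])]): s, frame: 1.0 - s})
    # pignistic transform: split each mass equally among the classes in its focal set
    betp = np.zeros(n_classes)
    for focal, mass in combined.items():
        for c in focal:
            betp[c] += mass / len(focal)
    return betp

# Toy usage: pignistic class probabilities for one query point.
rng = np.random.default_rng(2)
X = np.vstack([rng.normal(0.0, 1.0, (30, 2)), rng.normal(2.0, 1.0, (30, 2))])
y = np.array([0] * 30 + [1] * 30)
print(evidential_knn(X, y, np.array([1.0, 1.0])))
```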


Feature Reduction and Nearest Neighbours

Feature reduction is a major preprocessing step in the analysis of high-dimensional data, particularly from biomolecular high-throughput technologies. Reduction techniques are expected to preserve the relevant characteristics of the data, such as neighbourhood relations. We investigate the neighbourhood preservation properties of feature reduction empirically and theoretically. Our results indic...
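One simple way to probe neighbourhood preservation empirically, sketched below under illustrative choices (PCA as the reduction, Jaccard overlap of k-nearest-neighbour sets as the measure), is to compare each point's neighbours before and after the reduction. This is not the paper's exact protocol.

```python
# Illustrative sketch only: neighbourhood preservation of a PCA reduction,
# measured by Jaccard overlap of k-nearest-neighbour sets.
import numpy as np

def knn_sets(X, k):
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)
    return [set(row[:k]) for row in np.argsort(d, axis=1)]

def neighbourhood_preservation(X, n_components=2, k=10):
    Xc = X - X.mean(axis=0)
    _, _, vt = np.linalg.svd(Xc, full_matrices=False)   # PCA via SVD
    Z = Xc @ vt[:n_components].T                        # reduced representation
    before, after = knn_sets(X, k), knn_sets(Z, k)
    jacc = [len(a & b) / len(a | b) for a, b in zip(before, after)]
    return float(np.mean(jacc))                         # 1.0 means perfectly preserved

# Toy usage: 200 points in 50 dimensions with three dominant directions.
rng = np.random.default_rng(3)
X = rng.normal(size=(200, 50))
X[:, :3] *= 10.0
print(neighbourhood_preservation(X, n_components=3, k=10))
```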


Applications of Nearest Neighbours Statistics

Jörg D. Wichard, Ulrich Parlitz and Werner Lauterborn, Drittes Physikalisches Institut, Georg-August-Universität Göttingen, D-37073 Göttingen, Germany. [email protected] Abstract: Based on an efficient method for finding nearest neighbours in the phase space of a dynamical system, several applications of nearest neighbour statistics are presented, including methods to detect nonstationarity...
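A minimal sketch of the basic ingredient mentioned above: delay-embedding a scalar time series into a phase space and finding each embedded point's nearest neighbour while excluding temporally close points (a Theiler window). The embedding dimension, delay, and window size are illustrative assumptions; the specific nonstationarity statistics are not shown.

```python
# Illustrative sketch only: delay embedding plus nearest-neighbour search with
# a Theiler window; parameters are illustrative.
import numpy as np

def delay_embed(x, dim=3, tau=5):
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[i * tau : i * tau + n] for i in range(dim)])

def nearest_neighbours(Y, theiler=10):
    d = np.linalg.norm(Y[:, None, :] - Y[None, :, :], axis=-1)
    idx = np.arange(len(Y))
    d[np.abs(idx[:, None] - idx[None, :]) <= theiler] = np.inf   # exclude temporal neighbours
    return np.argmin(d, axis=1), np.min(d, axis=1)

# Toy usage: a noisy sine wave as the scalar observable.
t = np.linspace(0.0, 40.0, 1000)
x = np.sin(t) + 0.05 * np.random.default_rng(4).normal(size=t.size)
Y = delay_embed(x)
nn_index, nn_dist = nearest_neighbours(Y)
print(nn_index[:5], nn_dist[:5].round(4))
```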


On the Underestimation of Model Uncertainty by Bayesian K-nearest Neighbors

When using the K-nearest neighbors method, one often ignores uncertainty in the choice of K. To account for such uncertainty, Holmes and Adams (2002) proposed a Bayesian framework for K-nearest neighbors (KNN). Their Bayesian KNN (BKNN) approach uses a pseudo-likelihood function and standard Markov chain Monte Carlo (MCMC) techniques to draw posterior samples. Holmes and Adams (2002) focused o...
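The MCMC step mentioned above can be made concrete with a random-walk Metropolis sampler over the discrete parameter K, targeting a distribution proportional to a leave-one-out pseudo-likelihood times a uniform prior on K. The pseudo-likelihood below is the same illustrative smoothed-agreement form used earlier on this page, not the exact function of Holmes and Adams (2002) or Cucala et al. (2009).

```python
# Illustrative sketch only: random-walk Metropolis over K with a leave-one-out
# pseudo-likelihood target; not the exact sampler or likelihood of the cited papers.
import numpy as np

def make_log_pseudo_likelihood(X, y):
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)
    order = np.argsort(d, axis=1)
    cache = {}
    def log_pl(k):
        if k not in cache:
            agree = (y[order[:, :k]] == y[:, None]).sum(axis=1)
            cache[k] = np.log((agree + 0.5) / (k + 1.0)).sum()
        return cache[k]
    return log_pl

def sample_k(X, y, k_max=30, n_iter=5000, seed=0):
    rng = np.random.default_rng(seed)
    log_pl = make_log_pseudo_likelihood(X, y)
    k, draws = 5, []
    for _ in range(n_iter):
        prop = k + rng.choice([-1, 1])            # symmetric +/-1 random-walk proposal
        if 1 <= prop <= k_max and np.log(rng.uniform()) < log_pl(prop) - log_pl(k):
            k = prop                              # Metropolis accept
        draws.append(k)
    return np.array(draws)

# Toy usage: sampled posterior over K = 1, ..., 30.
rng = np.random.default_rng(5)
X = np.vstack([rng.normal(0.0, 1.0, (50, 2)), rng.normal(1.5, 1.0, (50, 2))])
y = np.array([0] * 50 + [1] * 50)
draws = sample_k(X, y)
print(np.bincount(draws, minlength=31)[1:] / len(draws))
```

With a symmetric proposal on a single discrete parameter, this sampler simply recovers the normalised pseudo-likelihood in the long run; it is included only to make the sampling step concrete.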


Journal title

volume 10

pages 167-180

publication date 2011-11
