Weighting of the k-Nearest-Neighbors

Konstantin Chernoff, Mads Nielsen

Research output: Chapter in Book/Report/Conference proceeding › Article in proceedings › Research › peer-review

11 Citations (Scopus)

Abstract

This paper presents two distribution-independent weighting schemes for k-Nearest-Neighbors (kNN). Applying the first scheme in a Leave-One-Out (LOO) setting corresponds to performing complete b-fold cross validation (b-CCV), while applying the second scheme corresponds to performing bootstrapping in the limit of infinite iterations. We demonstrate that the soft kNN errors obtained through b-CCV can be reproduced by applying the weighted kNN in a LOO setting, and that the proposed weighting schemes can decrease the variance and improve the generalization of kNN in a CV setting.
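The abstract does not give the explicit weight formulas, so the sketch below only illustrates the general mechanism it refers to: a kNN classifier whose k neighbours vote with rank-dependent weights, evaluated in a leave-one-out setting to produce a soft error. The function name `weighted_knn_loo_error` and the uniform placeholder weights are assumptions for illustration; the paper's b-CCV- and bootstrap-derived weights would be substituted for the `weights` argument.

```python
import numpy as np

def weighted_knn_loo_error(X, y, k, weights):
    """Leave-one-out soft error of a weighted kNN classifier.

    X       : (n, d) array of feature vectors
    y       : (n,) array of class labels
    k       : number of neighbours
    weights : (k,) array; weights[j] is the weight of the (j+1)-th
              nearest neighbour (a generic placeholder here, not the
              paper's derived scheme)
    """
    X = np.asarray(X, dtype=float)
    y = np.asarray(y)
    weights = np.asarray(weights, dtype=float)
    n = len(y)
    errors = np.empty(n)
    for i in range(n):
        # Squared Euclidean distances from the held-out point to all others.
        d2 = np.sum((X - X[i]) ** 2, axis=1)
        d2[i] = np.inf                      # exclude the point itself (LOO)
        nn = np.argsort(d2)[:k]             # indices of the k nearest neighbours
        # Soft error: fraction of total weight assigned to neighbours
        # whose label differs from the held-out point's label.
        errors[i] = weights[y[nn] != y[i]].sum() / weights.sum()
    return errors.mean()

# Example with uniform weights (plain kNN) on two Gaussian clusters.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(2, 1, (50, 2))])
y = np.repeat([0, 1], 50)
print(weighted_knn_loo_error(X, y, k=5, weights=np.ones(5)))
```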

Original language: English
Title of host publication: 2010 20th International Conference on Pattern Recognition (ICPR)
Number of pages: 4
Publisher: IEEE
Publication date: 2010
Pages: 666-669
ISBN (Print): 978-1-4244-7542-1
ISBN (Electronic): 978-1-4244-7541-4
DOIs
Publication status: Published - 2010
Event: 20th International Conference on Pattern Recognition - Istanbul, Turkey
Duration: 23 Aug 2010 - 26 Aug 2010

Conference

Conference: 20th International Conference on Pattern Recognition
Country/Territory: Turkey
City: Istanbul
Period: 23/08/2010 - 26/08/2010
