Selection of Distance Metrics and Feature Subsets for k-Nearest Neighbor Classifiers

Author:
Barker, Allen L., Department of Computer Science, University of Virginia
Advisor:
Martin, Worthy, Department of Computer Science, University of Virginia
Abstract:

The k-nearest neighbor (kNN) classifier is a popular and effective method for associating a feature vector with a unique element in a known, finite set of classes. A common choice for the distance metric used in kNN classification is the quadratic distance Q(x; A; y) = (x - y)'A(x - y), where x and y are n-vectors of features, A is a symmetric n x n matrix, and the prime denotes transpose. For finite sets of training samples, the choice of the matrix A is important in optimizing classifier performance. We show that A can be approximately optimized via gradient descent on a sigmoidally smoothed estimate of the classifier's probability of error. We describe an algorithm for performing the metric selection, and compare the performance of our method with that of other methods. We demonstrate that adding noise during the descent process can reduce the effects of overfitting. We further suggest how feature subset selection can be treated as a special case of this metric selection.
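To make the setting concrete, the following is a minimal sketch of kNN classification under the quadratic distance Q(x; A; y) = (x - y)'A(x - y) described above. The function names (quadratic_distance, knn_classify) and the toy data are illustrative assumptions, not taken from the dissertation; note that with A equal to the identity matrix, Q reduces to the squared Euclidean distance.

```python
from collections import Counter

def quadratic_distance(x, y, A):
    """Q(x; A; y) = (x - y)' A (x - y), with A a symmetric n x n matrix
    given as a list of rows."""
    d = [xi - yi for xi, yi in zip(x, y)]
    n = len(d)
    # Expand the quadratic form as a double sum over matrix entries.
    return sum(d[i] * A[i][j] * d[j] for i in range(n) for j in range(n))

def knn_classify(x, training, k, A):
    """Classify x by majority vote among its k nearest training samples,
    where 'nearest' is measured by the quadratic distance Q(.; A; .).
    'training' is a list of (feature_vector, class_label) pairs."""
    neighbors = sorted(training, key=lambda s: quadratic_distance(x, s[0], A))[:k]
    votes = Counter(label for _, label in neighbors)
    return votes.most_common(1)[0][0]

# Toy example: two clusters, and A = I (squared Euclidean distance).
training = [([0.0, 0.0], 'a'), ([0.1, 0.2], 'a'),
            ([1.0, 1.0], 'b'), ([0.9, 1.1], 'b')]
I = [[1.0, 0.0], [0.0, 1.0]]
print(knn_classify([0.2, 0.1], training, k=3, A=I))  # prints 'a'
```

The dissertation's contribution is the selection of A itself (by gradient descent on a sigmoidally smoothed error estimate), for which the classifier above is the inner loop being optimized.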

Degree:
PHD (Doctor of Philosophy)
Rights:
All rights reserved (no additional license for public reuse)
Issued Date:
1997/05/30