Athitsos and Sclaroff experimented with using boosting to learn a distance metric and applying it to a k-NN classifier [19]. This is achieved by using AdaBoost to learn a weighted distance measure, a linear combination of a family of distance measures. Boosted Distance is another algorithm that also uses AdaBoost.
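The published algorithms are more involved than this, but a minimal sketch can illustrate the idea of using AdaBoost to weight a family of distance measures. In the Python sketch below, the choice of base distances (Euclidean, Manhattan, Chebyshev), the use of fixed leave-one-out 1-NN predictions as weak learners, and all function names are illustrative assumptions rather than details taken from either paper; the accumulated AdaBoost coefficients serve as the weights of the linear combination.

```python
import numpy as np

def base_distances(X):
    """Pairwise distance matrices for a small, illustrative family of base measures."""
    diff = X[:, None, :] - X[None, :, :]
    return {
        "euclidean": np.sqrt((diff ** 2).sum(-1)),
        "manhattan": np.abs(diff).sum(-1),
        "chebyshev": np.abs(diff).max(-1),
    }

def loo_1nn_predict(D, y):
    """Leave-one-out 1-NN labels under a precomputed distance matrix D."""
    D = D.copy()
    np.fill_diagonal(D, np.inf)          # an instance may not be its own neighbour
    return y[D.argmin(axis=1)]

def boost_distance_weights(X, y, rounds=10):
    """AdaBoost-style weights for each base distance (a sketch, not the published method)."""
    dists = base_distances(X)
    preds = {name: loo_1nn_predict(D, y) for name, D in dists.items()}
    w = np.full(len(y), 1.0 / len(y))    # AdaBoost instance weights
    alpha = {name: 0.0 for name in dists}
    for _ in range(rounds):
        # pick the base distance whose 1-NN rule has the lowest weighted error this round
        errors = {name: w[preds[name] != y].sum() for name in dists}
        best = min(errors, key=errors.get)
        eps = float(np.clip(errors[best], 1e-10, 1 - 1e-10))
        a = 0.5 * np.log((1 - eps) / eps)
        alpha[best] += a
        # re-weight instances, emphasising those the chosen measure misclassified
        w = w * np.where(preds[best] != y, np.exp(a), np.exp(-a))
        w /= w.sum()
    # combined distance: sum over measures of alpha[name] * d_name(x, x')
    return alpha

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(60, 4))
    y = (X[:, 0] + X[:, 1] > 0).astype(int)
    print(boost_distance_weights(X, y))
```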
The direct boosting algorithm for the k-nearest neighbor classifier takes a different route: the distance metric is locally warped using modified weights assigned to each instance, and those weights are trained by the boosting procedure.
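The paper's exact weight-update rule is not given in these snippets, so the following Python sketch only illustrates the mechanism they describe: each training instance carries a weight that warps the distance to it (here, the distance is divided by the weight, so heavier instances attract neighbours), and the weights are adjusted iteratively from leave-one-out k-NN errors. The division-based warping, the multiplicative update, the step size, and the normalisation are all assumptions made for illustration, not the paper's rule.

```python
import numpy as np

def warped_knn_predict(X_train, y_train, w, X_query, k=3):
    """k-NN prediction under the instance-warped distance d(x, x_i) / w[i]."""
    d = np.linalg.norm(X_query[:, None, :] - X_train[None, :, :], axis=-1)
    d_warped = d / w                       # larger weight => instance is effectively closer
    preds = np.empty(len(X_query), dtype=y_train.dtype)
    for q, row in enumerate(d_warped):
        nn = np.argsort(row)[:k]
        preds[q] = np.bincount(y_train[nn]).argmax()
    return preds

def train_instance_weights(X, y, k=3, rounds=20, step=0.1):
    """Boosting-style training of the per-instance weights (an illustrative rule only)."""
    n = len(y)
    w = np.ones(n)
    for _ in range(rounds):
        d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1) / w
        np.fill_diagonal(d, np.inf)        # leave-one-out: an instance ignores itself
        for i in range(n):
            nn = np.argsort(d[i])[:k]
            votes = np.bincount(y[nn], minlength=int(y.max()) + 1)
            if votes.argmax() != y[i]:
                # assumed update: emphasise same-class neighbours of a misclassified
                # instance so that the metric warps locally in its favour
                same = nn[y[nn] == y[i]]
                w[same] *= 1.0 + step
                w /= w.mean()              # keep the weights on a stable scale
    return w

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    X = rng.normal(size=(80, 2))
    y = (X[:, 0] * X[:, 1] > 0).astype(int)
    w = train_instance_weights(X, y)
    print(warped_knn_predict(X, y, w, X[:5]))
    print(y[:5])
```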
An empirical study conducted on 10 standard databases from the UCI repository shows that this new Boosted k-NN algorithm has increased generalization accuracy.
An algorithm that uses boosting to learn a distance measure for multiclass k-nearest neighbor classification achieves lower error rates in some of the reported experiments.
Toh Koon Charlie Neo, Dan Ventura: A direct boosting algorithm for the k-nearest neighbor classifier via local warping of the distance metric.