The nearest neighbor (NN) rule is a classic in pattern recognition. In its general form, the k-nearest neighbors algorithm (kNN) is a non-parametric method used for classification and regression; in both cases, the input consists of the k closest training examples in the feature space. By the very nature of its decision rule, the performance of kNN classification depends crucially on the distance metric used to compare examples, which has motivated distance metric learning methods such as large margin nearest neighbor, and on the stored training set, which has motivated sample set condensation schemes such as the condensed nearest neighbor decision rule. In this article, we propose a new kNN-based classifier, called the local mean-based pseudo nearest neighbor (LMPNN) rule. It is motivated by the local mean-based k-nearest neighbor (LMKNN) rule and the pseudo nearest neighbor (PNN) rule, with the aim of improving the classification performance.
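As a point of reference for the variants discussed below, here is a minimal sketch of plain kNN classification in Python; the function and parameter names are ours, not from any of the cited papers.

```python
import numpy as np
from collections import Counter

def knn_predict(X_train, y_train, x, k=3):
    """Classify query x by majority vote among its k nearest
    training examples under the Euclidean distance."""
    dists = np.linalg.norm(X_train - x, axis=1)   # distance to every training point
    nearest = np.argsort(dists)[:k]               # indices of the k closest
    return Counter(y_train[nearest]).most_common(1)[0][0]
```

With k = 1 this reduces to the NN rule discussed above.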
The NN rule is also attractive as a baseline: it is intuitive, there is essentially no algorithm to describe, and everybody who programs it obtains the same results. It is thereby very suitable as a base routine in comparative studies. In this paper, we propose a new pseudo nearest neighbor classification rule (PNNR). Different from the previous nearest neighbor rule (NNR), this new rule utilizes distance-weighted local learning in each class to obtain a new nearest neighbor of the unlabeled pattern, the pseudo nearest neighbor (PNN), and then assigns the label associated with the PNN to the unlabeled pattern using the NNR; a sketch follows.
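The description above fixes the structure but not the weights, so the following sketch assumes the common 1/i weighting over the i-th nearest same-class neighbor; treat the weighting as an assumption, not as the authors' exact choice.

```python
import numpy as np

def pnn_predict(X_train, y_train, x, k=3):
    """Pseudo nearest neighbor rule: within each class, the distances of
    the k nearest neighbors of x are combined into a single weighted
    'pseudo' distance; x receives the class whose pseudo distance is least."""
    weights = 1.0 / np.arange(1, k + 1)  # assumed weighting: w_i = 1/i
    best_class, best_dist = None, np.inf
    for c in np.unique(y_train):
        dists = np.sort(np.linalg.norm(X_train[y_train == c] - x, axis=1))[:k]
        pseudo = np.dot(weights[: len(dists)], dists)  # distance to the class's PNN
        if pseudo < best_dist:
            best_class, best_dist = c, pseudo
    return best_class
```

Because every class contributes a pseudo neighbor, a single mislabeled or outlying training point can no longer decide the outcome on its own, which is the intuition behind the rule.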
The output depends on whether kNN is used for classification or regression; in kNN classification, the output is a class membership. The rule is simple to implement, and the amount of computation can be minimized by careful bookkeeping. It is not infallible, though: artificial examples can be constructed that show the NN rule failing, for instance a 2D scatter plot of almost separable classes on which it performs badly. Motivated by such cases, we propose in this paper a new reliable classification approach, called the pseudo nearest centroid neighbor (PNCN) rule, which is based on the pseudo nearest neighbor rule (PNN) and the nearest centroid neighborhood (NCN). In the proposed PNCN, the nearest centroid neighbors, rather than the nearest neighbors, of each class are first searched by means of NCN, as sketched below.
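The text names the NCN search but does not spell it out; the usual greedy formulation of a nearest centroid neighborhood is sketched here, and in PNCN this search would be run within each class before the PNN-style weighting is applied. The function name and the greedy formulation are ours.

```python
import numpy as np

def ncn_search(X, x, k=3):
    """Greedy nearest centroid neighborhood: the first neighbor is the
    ordinary nearest neighbor of x; each further neighbor is the point
    whose inclusion keeps the centroid of all chosen neighbors closest to x."""
    remaining = list(range(len(X)))
    chosen = []
    running_sum = np.zeros(X.shape[1])
    for step in range(1, k + 1):
        cands = np.array(remaining)
        centroids = (running_sum + X[cands]) / step   # centroid if candidate joins
        best = cands[np.argmin(np.linalg.norm(centroids - x, axis=1))]
        chosen.append(best)
        remaining.remove(best)
        running_sum += X[best]
    return chosen
```

Unlike plain nearest neighbors, the centroid criterion pushes the selected neighbors to surround the query rather than cluster on one side of it.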
The PNN rule itself was introduced by Zeng, Yang and Zhao (2008), "Pseudo nearest neighbor rule for pattern classification". The algorithm for the so-called nearest neighbor rule hardly needs summarizing: it is the k = 1 case of the kNN sketch above. Nearest neighbor rules in effect implicitly compute the decision boundary. Their weak spot is the situation in which the distances to nearest neighbors of different classes are similar to those of the same class; this is exactly what the local mean and pseudo nearest neighbor variants try to address. Linear discriminant analysis (LDA) is widely used as a form of linear preprocessing for pattern classification and can serve the same role ahead of a nearest neighbor rule. The LMPNN rule [48] can be regarded as an improvement of kNN [49], the local mean-based k-nearest neighbor rule [50], and the pseudo nearest neighbor rule [51], and it obtains very satisfactory classification results in many pattern recognition problems; a sketch follows.
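Combining the two preceding ideas, per-class local means and PNN-style weighting, gives the usual reading of LMPNN; again the 1/i weights are an assumption on our part.

```python
import numpy as np

def lmpnn_predict(X_train, y_train, x, k=3):
    """Local mean-based pseudo nearest neighbor: for each class, the i
    nearest same-class neighbors (i = 1..k) are averaged into local mean
    vectors, and the weighted distances from x to those local means are
    summed; x receives the class with the smallest sum."""
    weights = 1.0 / np.arange(1, k + 1)  # assumed weighting, as in PNN
    best_class, best_dist = None, np.inf
    for c in np.unique(y_train):
        Xc = X_train[y_train == c]
        order = np.argsort(np.linalg.norm(Xc - x, axis=1))[:k]
        counts = np.arange(1, len(order) + 1)[:, None]
        local_means = np.cumsum(Xc[order], axis=0) / counts  # running means
        pseudo = np.dot(weights[: len(order)],
                        np.linalg.norm(local_means - x, axis=1))
        if pseudo < best_dist:
            best_class, best_dist = c, pseudo
    return best_class
```

Averaging the i nearest neighbors before measuring distance smooths out individual outliers, which is where LMPNN improves on both kNN and PNN.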