I was taking a tutorial on building a recommendation system, and it said that Nearest Neighbor is different from a KNN classifier. Could anyone explain what Nearest Neighbor is and how it differs from KNN?
2 Answers
Not really sure about it, but KNN means K-Nearest Neighbors to me, so both are essentially the same. The K just corresponds to the number of nearest neighbours taken into account when classifying.
Maybe what you call Nearest Neighbor is a KNN with K = 1.
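A minimal sketch of that idea, assuming scikit-learn and the built-in iris dataset (neither is mentioned in the question): the same `KNeighborsClassifier` gives you "nearest neighbor" when `n_neighbors=1` and ordinary KNN for larger k.

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# k=1: predict the label of the single closest training sample ("nearest neighbor").
nn = KNeighborsClassifier(n_neighbors=1).fit(X_train, y_train)

# k=5: predict the majority label among the five closest training samples (ordinary KNN).
knn = KNeighborsClassifier(n_neighbors=5).fit(X_train, y_train)

print("k=1 accuracy:", nn.score(X_test, y_test))
print("k=5 accuracy:", knn.score(X_test, y_test))
```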
The scikit-learn documentation says:
sklearn.neighbors provides functionality for unsupervised and supervised neighbors-based learning methods. Unsupervised nearest neighbors is the foundation of many other learning methods, notably manifold learning and spectral clustering. Supervised neighbors-based learning comes in two flavors: classification for data with discrete labels, and regression for data with continuous labels.
The principle behind nearest neighbor methods is to find a predefined number of training samples closest in distance to the new point, and predict the label from these. The number of samples can be a user-defined constant (k-nearest neighbor learning), or vary based on the local density of points (radius-based neighbor learning). The distance can, in general, be any metric measure: standard Euclidean distance is the most common choice.
You can read more about it here: https://scikit-learn.org/stable/modules/neighbors.html
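To illustrate the two flavors the quoted passage mentions, here is a small sketch with made-up toy points (my own example, not from the docs): a k-nearest query always returns a fixed number of neighbors, while a radius-based query returns however many points fall within the given distance.

```python
import numpy as np
from sklearn.neighbors import NearestNeighbors

# Toy 2D points; the last one is deliberately far away.
X = np.array([[0.0, 0.0], [0.0, 1.0], [1.0, 0.0], [5.0, 5.0]])
query = np.array([[0.1, 0.2]])

nn = NearestNeighbors(n_neighbors=2, metric="euclidean").fit(X)

# k-nearest: exactly 2 neighbors, no matter how far they are.
dist_k, idx_k = nn.kneighbors(query)
print("k-nearest indices:", idx_k, "distances:", dist_k)

# radius-based: all neighbors within distance 1.5; the count varies with local density.
dist_r, idx_r = nn.radius_neighbors(query, radius=1.5)
print("radius-based indices:", idx_r, "distances:", dist_r)
```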