k-Nearest neighbours
Nearest-neighbour classification
Nearest-neighbor methods: To get the prediction Ŷ for a point x, use those observations (k of them) in the training set T closest in input space to x. Remember the training set is a set of pairs (xᵢ, yᵢ). "Closest" usually means Euclidean distance. Concretely, Ŷ(x) = (1/k) Σ_{xᵢ ∈ Nₖ(x)} yᵢ, where Nₖ(x) is the neighbourhood of x defined by the k closest training points.
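A minimal sketch of the classification version in plain NumPy (the function name and the k=5 default are my own illustrative choices):

```python
import numpy as np
from collections import Counter

def knn_classify(X_train, y_train, x, k=5):
    """Majority vote among the k training points nearest to x
    (Euclidean distance). All names here are illustrative."""
    dists = np.linalg.norm(X_train - x, axis=1)  # distance from x to every training point
    nearest = np.argsort(dists)[:k]              # indices of the k closest
    return Counter(y_train[nearest]).most_common(1)[0][0]
```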
Lecture about theoretical analysis and understanding of k-NN
Can also use k-NN for regression: see Regression KNN. A sketch follows below.
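For regression the majority vote becomes an average over the neighbourhood, i.e. Ŷ(x) = (1/k) Σ_{xᵢ ∈ Nₖ(x)} yᵢ. A sketch under the same assumptions as above (names are mine):

```python
import numpy as np

def knn_regress(X_train, y_train, x, k=5):
    """k-NN regression: average the targets of the k nearest neighbours."""
    dists = np.linalg.norm(X_train - x, axis=1)  # Euclidean distance to each training point
    nearest = np.argsort(dists)[:k]              # indices of the k closest
    return y_train[nearest].mean()
```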
It turns out that the effective number of parameters of k-nearest neighbors is N/k, even though technically there is only one parameter, k. For example, with N = 1000 training points and k = 10, that's 100 effective parameters.
Interactive app: https://github.com/lettier/interactiveknn
Unbounded capacity, but it generalizes at a non-trivial rate. Indeed, in Nonparametric statistics, overparametrization isn't incompatible with Generalization!
Of course, we need to assume something about the target function, typically Lipschitz continuity, though other smoothness assumptions work too.
→ To me it seems more like a method in Nonparametric statistics! Indeed it is (see Wiki).
Classification w/ K Nearest Neighbors Intro - Practical Machine Learning Tutorial with Python p.13
A k-NN classifier with k = 1 induces a Voronoi tessellation of the input space.
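A quick way to see this (a sketch assuming scikit-learn, SciPy, and Matplotlib are available; the random points are made up for illustration): give every training point its own class, colour the plane by the 1-NN prediction, and overlay the Voronoi diagram. The coloured regions coincide with the Voronoi cells.

```python
import numpy as np
import matplotlib.pyplot as plt
from sklearn.neighbors import KNeighborsClassifier
from scipy.spatial import Voronoi, voronoi_plot_2d

rng = np.random.default_rng(0)
X = rng.random((15, 2))        # 15 random 2-D training points
y = np.arange(len(X))          # each point gets its own class label

clf = KNeighborsClassifier(n_neighbors=1).fit(X, y)

# Colour each grid cell by its nearest training point.
xx, yy = np.meshgrid(np.linspace(0, 1, 400), np.linspace(0, 1, 400))
Z = clf.predict(np.c_[xx.ravel(), yy.ravel()]).reshape(xx.shape)

plt.pcolormesh(xx, yy, Z, cmap="tab20", shading="auto")
voronoi_plot_2d(Voronoi(X), ax=plt.gca(), show_vertices=False, line_colors="k")
plt.scatter(X[:, 0], X[:, 1], c="k", s=15)
plt.show()
```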