Weighted nearest neighbour (WNN) is an extension of the basic K-Nearest Neighbour (K-NN) algorithm. The idea behind WNN is that not all neighbours are equally important: closer neighbours should have more influence on the prediction than farther ones. It is often used for regression tasks. Each neighbour is assigned a weight based on its distance to the new example.
How it works
- Choose a distance metric (e.g. Euclidean, Manhattan)
- Find the $k$ nearest neighbours to the new data point $x_q$
- Assign a weight to each neighbour, often using: $w_i = \frac{1}{d(x_q, x_i)^2}$ Where:
- $d(x_q, x_i)$ is the distance between the example $x_q$ and neighbour $x_i$
- $w_i$ is the weight given to neighbour $i$
Closer points → smaller distance → larger weight
- Prediction for $x_q$ using the formula (see the Python sketch after this list): $\hat{f}(x_q) = \frac{\sum_{i=1}^{k} w_i \, f(x_i)}{\sum_{i=1}^{k} w_i}$ Equivalently, with normalised weights $\bar{w}_i = \frac{w_i}{\sum_{j=1}^{k} w_j}$: $\hat{f}(x_q) = \sum_{i=1}^{k} \bar{w}_i \, f(x_i)$ Where:
- $k$: the number of nearest neighbours
- $f(x_i)$: the target value (e.g. salary, house price) of the $i$-th nearest neighbour
- $w_i$: the weight of the $i$-th neighbour, calculated in step 3
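A minimal Python sketch of the four steps above, assuming Euclidean distance and inverse-squared-distance weights; the function name `wnn_predict` and the toy salary data are illustrative, not from the course:

```python
import numpy as np

def wnn_predict(X_train, y_train, x_query, k=3):
    """Weighted k-NN regression: predict the target for x_query from
    its k nearest training points, weighted by 1 / d(x_q, x_i)^2."""
    # Steps 1-2: Euclidean distance from the query to every training point
    dists = np.linalg.norm(X_train - x_query, axis=1)
    nearest = np.argsort(dists)[:k]        # indices of the k closest points

    # Guard: an exact match gives d = 0, so return its target directly
    if dists[nearest[0]] == 0:
        return y_train[nearest[0]]

    # Step 3: inverse-squared-distance weights
    w = 1.0 / dists[nearest] ** 2

    # Step 4: weighted average of the neighbours' target values
    return np.sum(w * y_train[nearest]) / np.sum(w)

# Toy usage: predict a salary from years of experience
X_train = np.array([[1.0], [2.0], [4.0], [7.0]])
y_train = np.array([40_000.0, 48_000.0, 60_000.0, 85_000.0])
print(wnn_predict(X_train, y_train, np.array([3.0]), k=3))  # ~52444
```

With $k=3$, the query at 3 years sits between the neighbours at 2 and 4 years (distance 1 each), so their targets dominate the weighted average, while the neighbour at 1 year (distance 2) gets only a quarter of their weight.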
Alternative using all training examples
In some cases, you don't just use the $k$ nearest neighbours: you use every point in the training set, weighting each one by $w_i = \frac{1}{d(x_q, x_i)^2}$. Distant points contribute very little (almost zero weight), because most of them are far away from $x_q$. However, the algorithm will be slower, since every prediction must compute a distance to the whole training set.
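A sketch of this global variant under the same assumptions (the name `wnn_predict_all` is illustrative); weighting all training points this way is sometimes called Shepard's method:

```python
import numpy as np

def wnn_predict_all(X_train, y_train, x_query):
    """Global variant: weight every training point by 1 / d^2
    instead of only the k nearest."""
    dists = np.linalg.norm(X_train - x_query, axis=1)
    if np.any(dists == 0):                  # exact match in the training set
        return y_train[np.argmin(dists)]
    w = 1.0 / dists ** 2                    # distant points get near-zero weight
    return np.sum(w * y_train) / np.sum(w)  # O(n) per prediction, hence slower
```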
Back to parent page: Supervised Machine Learning
AI Machine_Learning COMP3308 Supervised_Learning Lazy_Learning Classification KNN K-Nearest_Neighbour WNN Weighted_Nearest_Neighbour