K-Nearest Neighbors Classifier

K-Nearest Neighbors (KNN) is a simple, versatile, and non-parametric supervised learning algorithm used for classification and regression. This visualization demonstrates how KNN classifies data points based on the majority class of their k nearest neighbors.

Visualization

Interactive canvas showing training points from three classes (Class A, Class B, and Class C). Click on the canvas to classify a new point; classification metrics for the query and the algorithm controls appear alongside the plot.

Algorithm Explanation

KNN works by:

  1. Calculating the distance between the query point and every training point (a distance-function sketch follows this list)
  2. Selecting the k nearest training points
  3. Assigning the class by majority vote among those k neighbors
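
Step 1 relies on a distance metric. Below is a minimal sketch of the calculateDistance helper used by the prediction code further down, assuming the visualization stores 2-D points as { x, y } objects and uses plain Euclidean distance:

// Euclidean distance between two 2-D points of the form { x, y }
// (assumption: points carry x and y coordinates)
function calculateDistance(a, b) {
    const dx = a.x - b.x;
    const dy = a.y - b.y;
    return Math.sqrt(dx * dx + dy * dy);
}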

Time & Space Complexity

Time Complexity: O(n×d) per prediction for the distance computations, where n is the number of training points and d is the number of dimensions (the sort-based implementation below adds an O(n log n) sorting step)

Space Complexity: O(n) to store all n training points, or O(n×d) counting the d feature values per point
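
The prediction step follows directly from the three steps listed above. The function below assumes the calculateDistance helper sketched earlier and an array of training points that each carry a class label: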

function predict(queryPoint, trainingData, k) {
    // Calculate distances
    const distances = trainingData.map(point => {
        return {
            distance: calculateDistance(queryPoint, point),
            class: point.class
        };
    });
    
    // Sort by distance
    distances.sort((a, b) => a.distance - b.distance);
    
    // Take k nearest neighbors
    const kNearest = distances.slice(0, k);
    
    // Count class frequencies
    const classCounts = {};
    kNearest.forEach(neighbor => {
        classCounts[neighbor.class] = 
            (classCounts[neighbor.class] || 0) + 1;
    });
    
    // Return the class with highest frequency
    return Object.keys(classCounts).reduce((a, b) => 
        classCounts[a] > classCounts[b] ? a : b);
}
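
Ties in the majority vote resolve to whichever class appears first among the counted keys; weighting votes by inverse distance is a common refinement. A hypothetical usage example with a few hard-coded training points (the { x, y, class } format is an assumption that matches the distance helper above):

// Example usage with made-up data; with k = 3 the two nearest
// Class A points outvote the single Class C neighbor.
const trainingData = [
    { x: 1, y: 2, class: 'A' },
    { x: 2, y: 1, class: 'A' },
    { x: 8, y: 9, class: 'B' },
    { x: 9, y: 8, class: 'B' },
    { x: 5, y: 9, class: 'C' }
];

const queryPoint = { x: 2, y: 2 };
console.log(predict(queryPoint, trainingData, 3)); // logs "A"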