
Problem 72 (Difficulty: Medium)

Implement a **K-Nearest Neighbors (KNN)** classifier from scratch.

Algorithm:

1. **fit(X, y):** Simply store the training data.

2. **predict(X):** For each test sample:

a. Compute Euclidean distances to all training samples

b. Find the k nearest neighbors

c. Return the majority class among those neighbors

Example:

```python
knn = KNNClassifier(k=3)
knn.fit(X_train, y_train)
predictions = knn.predict(X_test)
```

**Explanation:** KNN is a lazy learner -- training consists of nothing more than storing the data. At prediction time, it finds the k training points closest to the query and returns the class that appears most frequently among them.

Constraints:

  • Use Euclidean distance: sqrt(sum((x1 - x2)^2))
  • Use majority voting (most common class among k neighbors)
  • Break ties arbitrarily (use Counter.most_common)
Test Cases:

    Test Case 1
    Input: Train: [[1,1],[2,2],[3,3],[6,6],[7,7],[8,8]], labels: [0,0,0,1,1,1], predict: [[4,4]]
    Expected: [0]
    Test Case 2
    Input: Train: [[0,0],[10,10]], labels: [0,1], predict: [[1,1]]
    Expected: [0]
    Test Case 3
    Input: k=1, Train: [[1,1],[2,2]], labels: [0,1], predict: [[2,2]]
    Expected: [1]
    + 2 hidden test cases
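A minimal sketch of the algorithm above, using NumPy for the distance computation. The class name and method signatures follow the usage example; the internal helper `_predict_one` is an implementation choice, not part of the required interface:

```python
from collections import Counter
import numpy as np

class KNNClassifier:
    """K-Nearest Neighbors classifier (a lazy learner)."""

    def __init__(self, k=3):
        self.k = k

    def fit(self, X, y):
        # Lazy learning: just store the training data.
        self.X_train = np.asarray(X, dtype=float)
        self.y_train = np.asarray(y)
        return self

    def predict(self, X):
        return [self._predict_one(np.asarray(x, dtype=float)) for x in X]

    def _predict_one(self, x):
        # Euclidean distances to every training sample: sqrt(sum((x1 - x2)^2)).
        dists = np.sqrt(((self.X_train - x) ** 2).sum(axis=1))
        # Indices of the k nearest neighbors.
        nearest = np.argsort(dists)[: self.k]
        # Majority vote; Counter.most_common breaks ties arbitrarily.
        votes = Counter(self.y_train[i] for i in nearest)
        return votes.most_common(1)[0][0]
```

For Test Case 1, `KNNClassifier(k=3)` finds that the three points nearest to `[4,4]` include `[3,3]` plus two of `{[2,2], [6,6]}`, so class 0 holds at least two of the three votes either way.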