73. Distance-Weighted KNN (Medium)

Implement a **Distance-Weighted KNN** classifier where closer neighbors have more influence on the prediction.

Algorithm:

Instead of simple majority voting, weight each neighbor's vote by the inverse of its distance:

weight_i = 1 / (distance_i + epsilon)

For each class, sum the weights of its neighbors. The class with the highest total weight wins.
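The weighted vote above can be sketched as a small helper (a hypothetical function for illustration, not part of the required API), assuming the distances to the k neighbors have already been computed:

```python
import numpy as np

def weighted_vote(distances, labels, epsilon=1e-8):
    """Return the class with the largest total inverse-distance weight."""
    weights = 1.0 / (np.asarray(distances, dtype=float) + epsilon)
    totals = {}
    for w, label in zip(weights, labels):
        totals[label] = totals.get(label, 0.0) + w
    return max(totals, key=totals.get)

# One neighbor at distance 0.1 outweighs two at distance 2.0:
# weights are roughly [10, 0.5, 0.5], so class 0 wins despite losing the headcount.
print(weighted_vote([0.1, 2.0, 2.0], [0, 1, 1]))  # -> 0
```

This is exactly where weighted KNN diverges from majority voting: the single close neighbor carries roughly ten times the weight of each distant one.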

Example:

```python
wknn = WeightedKNN(k=3)
wknn.fit(X_train, y_train)
predictions = wknn.predict(X_test)
# Closer neighbors contribute more to the prediction
```

**Explanation:** In standard KNN, a neighbor at distance 0.1 counts the same as one at distance 10. Weighted KNN fixes this by giving more weight to closer points, leading to better decision boundaries.

Constraints:

  • Use Euclidean distance
  • Use epsilon = 1e-8 to avoid division by zero
  • Weight = 1 / (distance + epsilon)
Test Cases:

Test Case 1
Input: Train: [[0,0],[10,10],[1,1]], labels: [0,1,0], predict: [[0.5,0.5]], k=2
Expected: [0]

Test Case 2
Input: Train: [[0,0],[5,5]], labels: [0,1], predict: [[2,2]]
Expected: [0]

Test Case 3
Input: Train: [[0,0],[0,0.1],[10,10]], labels: [0,1,1], predict: [[0,0]], k=3
Expected: [0] (closest point has highest weight)

+ 2 hidden test cases
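A minimal reference sketch satisfying the constraints above (Euclidean distance, epsilon = 1e-8, inverse-distance weights); the class and method names follow the usage example, while the internals are one reasonable way to implement them:

```python
import numpy as np

class WeightedKNN:
    """Distance-weighted KNN: each neighbor votes with weight 1 / (distance + epsilon)."""

    def __init__(self, k=3, epsilon=1e-8):
        self.k = k
        self.epsilon = epsilon

    def fit(self, X, y):
        self.X = np.asarray(X, dtype=float)
        self.y = np.asarray(y)
        return self

    def predict(self, X):
        preds = []
        for x in np.asarray(X, dtype=float):
            dists = np.linalg.norm(self.X - x, axis=1)       # Euclidean distances
            nearest = np.argsort(dists)[: self.k]            # indices of k closest
            weights = 1.0 / (dists[nearest] + self.epsilon)  # inverse-distance weights
            totals = {}
            for w, label in zip(weights, self.y[nearest]):
                totals[label] = totals.get(label, 0.0) + w
            preds.append(max(totals, key=totals.get))        # heaviest class wins
        return preds

# Test Case 1: both of the 2 nearest neighbors are class 0
wknn = WeightedKNN(k=2)
wknn.fit([[0, 0], [10, 10], [1, 1]], [0, 1, 0])
print(wknn.predict([[0.5, 0.5]]))  # -> [0]
```

Note that epsilon does double duty: it avoids division by zero for an exact match, and it makes such a match dominate the vote (weight ~ 1e8), which is what Test Case 3 checks.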