Click4Ai

146. Triplet Loss

Difficulty: Medium

Implement the Triplet Loss function, a key loss in metric learning that operates on triplets of samples: an anchor, a positive (same class as anchor), and a negative (different class from anchor). The goal is to ensure the anchor is closer to the positive than to the negative by at least a margin.

Formula:

L = max(0, d(anchor, positive) - d(anchor, negative) + margin)

Where:

d(a, b) = ||a - b||_2 (Euclidean distance)

anchor = reference embedding

positive = embedding of the same class as anchor

negative = embedding of a different class from anchor

margin = minimum desired gap (default 1.0)
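The formula above translates directly into a few lines of NumPy. Below is a minimal sketch of one possible solution; the function name `triplet_loss` and the input coercion via `np.asarray` are choices of this sketch, not requirements of the problem.

```python
import numpy as np

def triplet_loss(anchor, positive, negative, margin=1.0):
    """Triplet loss for a single (anchor, positive, negative) triplet."""
    anchor = np.asarray(anchor, dtype=float)
    positive = np.asarray(positive, dtype=float)
    negative = np.asarray(negative, dtype=float)
    d_pos = np.linalg.norm(anchor - positive)  # d(anchor, positive), Euclidean
    d_neg = np.linalg.norm(anchor - negative)  # d(anchor, negative), Euclidean
    # max(0, ...) guarantees the loss is non-negative.
    return float(np.maximum(0.0, d_pos - d_neg + margin))
```

`np.linalg.norm` computes the L2 norm of the difference vector, which is exactly the Euclidean distance d(a, b) defined above.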

Example:

Input: anchor = [1, 2], positive = [3, 4], negative = [5, 6], margin = 1

d(anchor, positive) = sqrt((3-1)^2 + (4-2)^2) = sqrt(8) ≈ 2.828

d(anchor, negative) = sqrt((5-1)^2 + (6-2)^2) = sqrt(32) ≈ 5.657

L = max(0, 2.828 - 5.657 + 1) ≈ max(0, -1.828) = 0.0

Output: 0.0

The loss is 0 because the negative is already sufficiently farther from the anchor than the positive (by more than the margin). If the negative were closer, the loss would be positive, pushing the model to adjust its embeddings to separate the classes.

Constraints:

  • All three inputs (anchor, positive, negative) are 1D NumPy arrays of equal length.
  • `margin` is a non-negative float (default 1.0).
  • Use NumPy for distance and max operations.
  • The loss must be non-negative.
Test Cases:

Test Case 1
Input: [[1,2],[3,4],[5,6]]
Expected: 0.0

Test Case 2
Input: [[7,8],[9,10],[11,12]]
Expected: 0.0

+ 3 hidden test cases
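In practice triplet loss is applied to batches of triplets, and the same formula vectorizes cleanly. The sketch below (the name `triplet_loss_batch` and the (N, D) input layout are assumptions of this sketch) evaluates both public test cases in one call:

```python
import numpy as np

def triplet_loss_batch(anchors, positives, negatives, margin=1.0):
    """Per-triplet losses for (N, D) arrays of embeddings; returns shape (N,)."""
    d_pos = np.linalg.norm(anchors - positives, axis=1)  # N pairwise distances
    d_neg = np.linalg.norm(anchors - negatives, axis=1)
    return np.maximum(0.0, d_pos - d_neg + margin)

anchors = np.array([[1.0, 2.0], [7.0, 8.0]])
positives = np.array([[3.0, 4.0], [9.0, 10.0]])
negatives = np.array([[5.0, 6.0], [11.0, 12.0]])
losses = triplet_loss_batch(anchors, positives, negatives)  # both triplets give 0.0
```

Note that both test cases yield the same distances (sqrt(8) and sqrt(32)), since the second triplet is just the first shifted by a constant offset, and Euclidean distance is translation-invariant.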