Click4Ai

145. Contrastive Loss (Medium)

Implement the Contrastive Loss function, a fundamental loss used in similarity and metric learning. It trains a model to produce embeddings where similar pairs are pulled close together and dissimilar pairs are pushed apart beyond a specified margin.

Formula:

L = (1 - Y) * 0.5 * D^2 + Y * 0.5 * max(0, margin - D)^2

Where:

D = Euclidean distance between embedding1 and embedding2

= ||embedding1 - embedding2||_2

Y = label (0 for similar pairs, 1 for dissimilar pairs)

margin = minimum desired distance for dissimilar pairs (default 2)
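The formula above translates almost directly into NumPy. The sketch below is one possible implementation; the function name `contrastive_loss` and the keyword default `margin=2.0` are assumptions, not part of the original statement:

```python
import numpy as np

def contrastive_loss(embedding1, embedding2, label, margin=2.0):
    """Contrastive loss with the convention label=0 (similar), label=1 (dissimilar)."""
    e1 = np.asarray(embedding1, dtype=float)
    e2 = np.asarray(embedding2, dtype=float)
    d = np.linalg.norm(e1 - e2)                       # Euclidean distance D
    similar_term = 0.5 * d**2                         # pulls similar pairs together
    dissimilar_term = 0.5 * max(0.0, margin - d)**2   # pushes dissimilar pairs past the margin
    return (1 - label) * similar_term + label * dissimilar_term
```

Note the hinge `max(0, margin - D)`: once a dissimilar pair is already farther apart than the margin, its loss (and gradient) is zero, so the model stops pushing it.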

Example:

Input: embedding1 = [1, 2, 3], embedding2 = [4, 5, 6], label = 0 (similar), margin = 2

D = sqrt((4-1)^2 + (5-2)^2 + (6-3)^2) = sqrt(27) = 5.196

Since label = 0 (similar): L = 0.5 * D^2 = 0.5 * 27 = 13.5

But with label = 1 (dissimilar): L = 0.5 * max(0, 2 - 5.196)^2 = 0.5 * 0 = 0

Output (label=0): 13.5

For similar pairs (label = 0), the loss is proportional to the squared distance, penalizing embeddings that are far apart. For dissimilar pairs (label = 1), the loss only activates when the distance is less than the margin, pushing them further apart.
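The arithmetic in the worked example can be checked step by step (variable names here are illustrative):

```python
import numpy as np

e1 = np.array([1.0, 2.0, 3.0])
e2 = np.array([4.0, 5.0, 6.0])
margin = 2.0

d = np.linalg.norm(e1 - e2)                       # sqrt(27) ≈ 5.196
loss_similar = 0.5 * d**2                         # label = 0: 0.5 * 27 = 13.5
loss_dissimilar = 0.5 * max(0.0, margin - d)**2   # label = 1: D > margin, hinge is 0, loss 0.0
```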

Constraints:

  • Both embedding arrays are 1D NumPy arrays of the same shape.
  • `label` is 0 (similar) or 1 (dissimilar).
  • `margin` is a non-negative float (default 2).
  • Use NumPy for distance computation.
Test Cases:

Test Case 1
Input: [[1, 2, 3], [4, 5, 6], 0]
Expected: 13.5
Test Case 2
Input: [[1, 2, 3], [1, 2, 3], 1]
Expected: 2.0
+ 3 hidden test cases
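In training, the loss is typically evaluated over a whole batch of pairs at once. A hypothetical vectorized variant (names and shapes assumed) replaces Python's `max` with `np.maximum` so the hinge applies element-wise:

```python
import numpy as np

def contrastive_loss_batch(emb1, emb2, labels, margin=2.0):
    """emb1, emb2: (N, d) arrays of embedding pairs; labels: (N,) of 0/1."""
    d = np.linalg.norm(emb1 - emb2, axis=1)             # per-pair Euclidean distances
    similar = 0.5 * d**2
    dissimilar = 0.5 * np.maximum(0.0, margin - d)**2   # element-wise hinge
    return (1 - labels) * similar + labels * dissimilar
```

Averaging the returned vector (e.g. with `.mean()`) would give a single scalar loss for the batch.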