
### 228. Self-Attention

Difficulty: Hard

Implement a basic self-attention mechanism for a transformer model.

Example:

* Input: input_sequence = np.array([[1, 2, 3], [4, 5, 6], [7, 8, 9]])

* Output: np.array([[7.0, 8.0, 9.0], [7.0, 8.0, 9.0], [7.0, 8.0, 9.0]])

Constraints:

* Use NumPy for calculations.
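Since the problem does not specify learned weight matrices, here is a minimal sketch of scaled dot-product self-attention, assuming the input itself serves as the queries, keys, and values (no Q/K/V projections — that is an assumption, not something the problem states):

```python
import numpy as np

def self_attention(X):
    # Assumption: the raw input acts as queries, keys, and values
    # (no learned projection matrices).
    Q, K, V = X, X, X
    d_k = X.shape[-1]

    # Scaled dot-product scores: (n, n) matrix of pairwise similarities.
    scores = Q @ K.T / np.sqrt(d_k)

    # Row-wise softmax, with max-subtraction for numerical stability.
    scores = scores - scores.max(axis=-1, keepdims=True)
    weights = np.exp(scores)
    weights = weights / weights.sum(axis=-1, keepdims=True)

    # Each output row is an attention-weighted mix of the value rows.
    return weights @ V
```

On the first example's input, the scores grow with the row magnitudes, so the softmax saturates toward the last row and every output row approaches [7, 8, 9], matching Test Case 1. Whether this convention reproduces the hidden test cases depends on the weight-matrix setup the grader expects.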

Test Cases

Test Case 1

* Input: np.array([[1, 2, 3], [4, 5, 6], [7, 8, 9]])
* Expected: np.array([[7.0, 8.0, 9.0], [7.0, 8.0, 9.0], [7.0, 8.0, 9.0]])

Test Case 2

* Input: np.array([[10, 20, 30], [40, 50, 60], [70, 80, 90]])
* Expected: np.array([[90.0, 100.0, 110.0], [90.0, 100.0, 110.0], [90.0, 100.0, 110.0]])

Plus 3 hidden test cases.