
227 · Hard

### Attention Mechanism

Implement a basic scaled dot-product attention mechanism, as used in transformer models: compute attention scores from the query and key, normalize them with a softmax, and return the weighted combination of the values.

Example:

* Input: query = np.array([1, 2, 3]), key = np.array([4, 5, 6]), value = np.array([7, 8, 9])

* Output: np.array([7.0, 8.0, 9.0])

Constraints:

* Use NumPy for calculations.

Test Cases

Test Case 1
Input: np.array([1, 2, 3]), np.array([4, 5, 6]), np.array([7, 8, 9])
Expected: np.array([7.0, 8.0, 9.0])
Test Case 2
Input: np.array([10, 20, 30]), np.array([40, 50, 60]), np.array([70, 80, 90])
Expected: np.array([70.0, 80.0, 90.0])
+ 3 hidden test cases
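The hidden tests are not shown, so the following is only a minimal sketch of scaled dot-product attention in NumPy, assuming 1-D query/key/value vectors as in the visible cases; the function name `attention` and the `softmax` helper are my own choices, not part of the problem statement:

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax: subtract the max before exponentiating.
    e = np.exp(x - np.max(x))
    return e / e.sum()

def attention(query, key, value):
    # Scaled dot-product attention: softmax(q . k / sqrt(d_k)) applied to value.
    d_k = query.shape[-1]
    scores = np.dot(query, key) / np.sqrt(d_k)
    weights = softmax(scores)
    return np.dot(weights, value)

# With a single query vector there is only one score, so the softmax
# weight is 1 and the output equals `value` exactly:
print(attention(np.array([1, 2, 3]),
                np.array([4, 5, 6]),
                np.array([7, 8, 9])))  # [7. 8. 9.]
```

This also explains the expected outputs above: each visible test has one query against one key, so the softmax collapses to a weight of 1 and the result is the value vector unchanged. The hidden tests may use batched 2-D inputs, which would require `np.dot(query, key.T)` and a row-wise softmax instead.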