211.

Hard

Implement the Word2Vec Skip-gram model. **Example:** Given the sentence 'the quick brown fox', train word embeddings and return the embedding matrix. **Input:** a sentence, a vocabulary size, and an embedding dimension. **Output:** a matrix of shape (vocabulary size × embedding dimension), one row per word. **Constraints:** Use NumPy for efficient computation; assume a fixed vocabulary size.
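A minimal sketch of one way to satisfy the constraints: a Skip-gram model trained with full-softmax cross-entropy over a small vocabulary. The function name, window size, learning rate, and epoch count are assumptions, not part of the problem statement.

```python
import numpy as np

def skipgram_embeddings(sentence, vocab_size, embedding_dim,
                        window=1, epochs=50, lr=0.05, seed=0):
    """Train a tiny Skip-gram model and return the input embedding matrix.

    Assumptions (not specified by the problem): full-softmax loss,
    symmetric context window, SGD, and a fixed random seed.
    """
    rng = np.random.default_rng(seed)
    words = sentence.split()
    # Map each distinct word to an index; assumes vocab fits in vocab_size.
    vocab = {w: i for i, w in enumerate(dict.fromkeys(words))}
    ids = [vocab[w] for w in words]

    # Separate input (center) and output (context) embedding matrices.
    W_in = rng.normal(0, 0.1, (vocab_size, embedding_dim))
    W_out = rng.normal(0, 0.1, (vocab_size, embedding_dim))

    # All (center, context) pairs within the window.
    pairs = [(c, ids[j])
             for i, c in enumerate(ids)
             for j in range(max(0, i - window), min(len(ids), i + window + 1))
             if j != i]

    for _ in range(epochs):
        for center, context in pairs:
            v = W_in[center]                  # center vector, shape (d,)
            scores = W_out @ v                # logits over vocab, shape (V,)
            scores -= scores.max()            # numerical stability
            p = np.exp(scores)
            p /= p.sum()                      # softmax probabilities
            # Cross-entropy gradient w.r.t. logits: p - one_hot(context)
            p[context] -= 1.0
            grad_v = W_out.T @ p              # gradient w.r.t. center vector
            W_out -= lr * np.outer(p, v)
            W_in[center] -= lr * grad_v
    return W_in

emb = skipgram_embeddings("the quick brown fox", 10, 5)
print(emb.shape)  # (10, 5)
```

Real outputs depend on the random initialization, so a grader would typically check the matrix shape and value ranges rather than exact floats.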

Test Cases

Test Case 1
Input: "the quick brown fox", 10, 5
Expected: a 10×5 embedding matrix, e.g. [[0.1, 0.2, 0.3, 0.4, 0.5], [0.6, 0.7, 0.8, 0.9, 0.1], ...] (exact values depend on initialization)
Test Case 2
Input: "hello world", 5, 3
Expected: a 5×3 embedding matrix, e.g. [[0.1, 0.2, 0.3], [0.4, 0.5, 0.6], ...] (exact values depend on initialization)
+ 3 hidden test cases