### Word Tokenization
**Example:** Split a sentence into individual words or tokens.
**Constraints:** Input sentence: a string of words separated by spaces
**Test Cases**

Test Case 1
Input: `"Hello world"`
Expected: `["Hello", "world"]`

Test Case 2
Input: `"This is a test sentence"`
Expected: `["This", "is", "a", "test", "sentence"]`

+ 3 hidden test cases
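A minimal sketch of one way to solve this in Python, assuming (per the constraints) that tokens are separated only by spaces; the function name `tokenize` is illustrative, not prescribed by the problem:

```python
def tokenize(sentence: str) -> list[str]:
    # str.split() with no argument splits on runs of whitespace
    # and discards empty tokens, so extra spaces are harmless.
    return sentence.split()

print(tokenize("Hello world"))              # ['Hello', 'world']
print(tokenize("This is a test sentence"))  # ['This', 'is', 'a', 'test', 'sentence']
```

If the hidden tests required splitting on single spaces exactly (preserving empty tokens), `sentence.split(" ")` would be the variant to use instead.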