Implement **Agglomerative Hierarchical Clustering** (bottom-up approach).
**Algorithm:**
1. Start with each point as its own cluster
2. Compute distances between all pairs of clusters
3. Merge the two closest clusters
4. Repeat until the desired number of clusters is reached
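The steps above can be sketched from scratch. This is a naive O(n³) illustration, not the library's implementation; `agglomerative` is a hypothetical helper name, and only single and complete linkage are shown:

```python
import numpy as np

def agglomerative(X, n_clusters, linkage="single"):
    """Naive bottom-up clustering sketch (single/complete linkage only)."""
    clusters = [[i] for i in range(len(X))]      # step 1: every point alone
    while len(clusters) > n_clusters:            # step 4: repeat until done
        best = None
        for a in range(len(clusters)):
            for b in range(a + 1, len(clusters)):
                # step 2: all pairwise point distances between two clusters
                d = [np.linalg.norm(X[i] - X[j])
                     for i in clusters[a] for j in clusters[b]]
                score = min(d) if linkage == "single" else max(d)
                if best is None or score < best[0]:
                    best = (score, a, b)
        _, a, b = best
        clusters[a] += clusters.pop(b)           # step 3: merge closest pair
    labels = np.empty(len(X), dtype=int)
    for k, members in enumerate(clusters):
        labels[members] = k
    return labels

X = np.array([[0, 0], [0, 1], [1, 0], [10, 10], [10, 11], [11, 10]], dtype=float)
print(agglomerative(X, n_clusters=2))  # → [0 0 0 1 1 1]
```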
**Linkage Methods:**
- **Single:** minimum distance between any two points in different clusters
- **Complete:** maximum distance between any two points in different clusters
- **Average:** mean distance over all pairs of points in different clusters
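The three linkage criteria are just different reductions of the same pairwise-distance matrix. A small sketch with two hand-picked clusters (the points here are illustrative, not from the problem):

```python
import numpy as np
from scipy.spatial.distance import cdist

A = np.array([[0.0, 0.0], [0.0, 1.0]])   # cluster A
B = np.array([[3.0, 0.0], [4.0, 0.0]])   # cluster B
D = cdist(A, B)                          # all pairwise distances A x B

print(D.min())    # single linkage: closest pair → 3.0
print(D.max())    # complete linkage: farthest pair
print(D.mean())   # average linkage: mean over all pairs
```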
**Example:**
```python
from sklearn.cluster import AgglomerativeClustering

hc = AgglomerativeClustering(n_clusters=2, linkage='single')
hc.fit(X)
print(hc.labels_)  # cluster assignment for each point
```
**Explanation:** Unlike K-Means, hierarchical clustering builds the full merge hierarchy, so k does not have to be fixed before clustering: you can cut the resulting dendrogram at any level to obtain any number of clusters (scikit-learn's `AgglomerativeClustering` still needs `n_clusters` or a `distance_threshold` to decide where to stop). Single linkage tends to produce elongated, chain-like clusters, while complete linkage produces compact ones.
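The dendrogram-cutting idea can be sketched with SciPy, which separates building the merge tree from extracting flat clusters; the sample points are an assumption matching the test-case shape (6 points in 2 groups):

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

X = np.array([[0, 0], [0, 1], [1, 0], [10, 10], [10, 11], [11, 10]])
Z = linkage(X, method='single')  # full merge tree (the dendrogram), built once

# Cut the same tree at different levels without re-clustering:
print(fcluster(Z, t=2, criterion='maxclust'))    # at most 2 flat clusters
print(fcluster(Z, t=5.0, criterion='distance'))  # cut at merge height 5.0
```

Both cuts reuse the same `Z`; only the cut level changes, which is why k can be chosen after the fact.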
**Test Cases**

| Input | Expected |
| --- | --- |
| 6 points in 2 groups, `n_clusters=2`, `linkage='single'` | Two clusters separating the groups |
| `labels_` has length `n_samples` | True |
| `n_clusters=1` | All labels are 0 |
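The test cases above can be checked directly against scikit-learn; the 6 sample points are an assumption chosen to form two well-separated groups:

```python
import numpy as np
from sklearn.cluster import AgglomerativeClustering

X = np.array([[0, 0], [0, 1], [1, 0], [10, 10], [10, 11], [11, 10]])

hc = AgglomerativeClustering(n_clusters=2, linkage='single').fit(X)
assert len(hc.labels_) == len(X)          # labels_ has length n_samples
assert len(set(hc.labels_)) == 2          # exactly two clusters
assert len(set(hc.labels_[:3])) == 1      # first group kept together
assert len(set(hc.labels_[3:])) == 1      # second group kept together

hc1 = AgglomerativeClustering(n_clusters=1).fit(X)
assert set(hc1.labels_) == {0}            # n_clusters=1 → all labels are 0
```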