The correct answer is D. all of the mentioned.
Hierarchical clustering is a method of cluster analysis which seeks to build a hierarchy of clusters. Strategies for hierarchical clustering generally fall into two types:
- Agglomerative: a “bottom up” approach in which each observation starts in its own cluster, and pairs of clusters are merged as one moves up the hierarchy (a short code sketch follows this list).
- Divisive: a “top down” approach in which all observations start in one cluster, and splits are performed recursively as one moves down the hierarchy.
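For illustration, here is a minimal sketch of the agglomerative strategy using SciPy's `scipy.cluster.hierarchy.linkage`. The toy points and the choice of Ward linkage are assumptions made for the example, not something specified by the original question.

```python
# Minimal agglomerative ("bottom up") clustering sketch with SciPy.
# The toy points below are made up purely for illustration.
import numpy as np
from scipy.cluster.hierarchy import linkage

X = np.array([[1.0, 2.0], [1.5, 1.8], [5.0, 8.0],
              [8.0, 8.0], [1.0, 0.6], [9.0, 11.0]])

# Each observation starts as its own cluster; 'ward' repeatedly merges the
# pair of clusters whose union gives the smallest increase in within-cluster variance.
Z = linkage(X, method="ward")
print(Z)  # one row per merge: [cluster i, cluster j, merge distance, new cluster size]
```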
The output of a hierarchical clustering algorithm is a tree diagram called a dendrogram, which shows the hierarchy of clusters. The leaves of the dendrogram represent the individual data points, and the internal nodes represent the clusters formed by merges (or splits). The height of an internal node represents the dissimilarity (distance) at which its two child clusters were merged, so lower merges correspond to more similar clusters.
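As a sketch of how that tree is visualized, the linkage matrix can be drawn as a dendrogram with SciPy and matplotlib; the data are the same assumed toy points as above.

```python
# Drawing the dendrogram for a linkage matrix (illustrative toy data).
import numpy as np
import matplotlib.pyplot as plt
from scipy.cluster.hierarchy import linkage, dendrogram

X = np.array([[1.0, 2.0], [1.5, 1.8], [5.0, 8.0],
              [8.0, 8.0], [1.0, 0.6], [9.0, 11.0]])
Z = linkage(X, method="ward")

dendrogram(Z)                 # leaves = individual observations
plt.ylabel("merge distance")  # height = dissimilarity at which clusters are joined
plt.show()
```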
The assignment of each point to a cluster is read off the dendrogram: each leaf starts as its own singleton cluster, the clusters at each internal node are merged into the cluster at the parent node, and a flat assignment is obtained by cutting the tree at a chosen height or at a chosen number of clusters.
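One way to turn the hierarchy into flat labels is SciPy's `fcluster`, shown in the sketch below; the toy data and the choice of cutting at two clusters are assumptions for illustration.

```python
# Cutting the hierarchy into flat cluster labels (illustrative toy data).
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

X = np.array([[1.0, 2.0], [1.5, 1.8], [5.0, 8.0],
              [8.0, 8.0], [1.0, 0.6], [9.0, 11.0]])
Z = linkage(X, method="ward")

# Cut the dendrogram so that at most two clusters remain; each point
# receives the label of the cluster it falls into below the cut.
labels = fcluster(Z, t=2, criterion="maxclust")
print(labels)  # e.g. [1 1 2 2 1 2]
```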
The final estimate of cluster centroids can be obtained by calculating the mean of the points in each cluster.
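A minimal sketch of that centroid computation, reusing the same assumed toy data and two-cluster cut:

```python
# Estimating centroids as the mean of the points assigned to each cluster.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

X = np.array([[1.0, 2.0], [1.5, 1.8], [5.0, 8.0],
              [8.0, 8.0], [1.0, 0.6], [9.0, 11.0]])
labels = fcluster(linkage(X, method="ward"), t=2, criterion="maxclust")

centroids = np.array([X[labels == k].mean(axis=0) for k in np.unique(labels)])
print(centroids)  # one centroid (mean vector) per cluster
```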
Therefore, all of the mentioned (the dendrogram, the assignment of each point to a cluster, and the final estimate of cluster centroids) are finally produced by hierarchical clustering.