In [[statistics]], '''Ward's method''' is a criterion applied in [[Hierarchical clustering|hierarchical cluster analysis]]. '''Ward's minimum variance method''' is a special case of the [[objective function]] approach originally presented by Joe H. Ward, Jr.<ref>Ward, J. H., Jr. (1963), "Hierarchical Grouping to Optimize an Objective Function", ''[[Journal of the American Statistical Association]]'', 58, 236–244.</ref> Ward suggested a general [[agglomerative hierarchical clustering]] procedure, where the criterion for choosing the pair of clusters to merge at each step is based on the optimal value of an objective function. This objective function could be "any function that reflects the investigator's purpose." Many of the standard clustering procedures are contained in this very general class. To illustrate the procedure, Ward used the example where the objective function is the [[Lack-of-fit sum of squares|error sum of squares]], and this example is known as ''Ward's method'' or, more precisely, ''Ward's minimum variance method''.
==The minimum variance criterion==
Ward's minimum variance criterion minimizes the total within-cluster variance. At each step the pair of clusters with the minimum between-cluster distance is merged. To implement this method, at each step find the pair of clusters whose merger leads to the minimum increase in total within-cluster variance. This increase is a weighted squared distance between the cluster centers. At the initial step, all clusters are singletons (clusters containing a single point). To apply a [[recursive algorithm]] under this [[objective function]], the initial distance between individual objects must be (proportional to) the squared [[Euclidean distance]].
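As a hedged illustration of the merging cost, consider two clusters <math>A</math> and <math>B</math> with <math>n_A</math> and <math>n_B</math> points and centers <math>c_A</math> and <math>c_B</math>; the increase in the error sum of squares on merging them is <math>\tfrac{n_A n_B}{n_A + n_B}\|c_A - c_B\|^2</math> (a standard identity, not spelled out in the article itself). The following minimal R sketch computes this cost; the function name <code>ward_merge_cost</code> and the matrix representation of clusters are illustrative assumptions.
<syntaxhighlight lang="r">
# Increase in total within-cluster variance (error sum of squares) caused by
# merging clusters A and B, given as numeric matrices with one row per point.
# Function name and data representation are assumptions made for this sketch.
ward_merge_cost <- function(A, B) {
  nA <- nrow(A); nB <- nrow(B)
  cA <- colMeans(A); cB <- colMeans(B)        # cluster centers
  (nA * nB / (nA + nB)) * sum((cA - cB)^2)    # weighted squared center distance
}

# For two singletons the cost is half the squared Euclidean distance,
# i.e. proportional to it, as the criterion requires:
ward_merge_cost(matrix(c(0, 0), 1), matrix(c(3, 4), 1))   # (1/2) * 25 = 12.5
</syntaxhighlight>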
The initial cluster distances in Ward's minimum variance method are therefore defined to be the squared Euclidean distance between points:
:<math>d_{ij} = d(\{X_i\}, \{X_j\}) = \|X_i - X_j\|^2.</math>
Note: In software that implements Ward's method, it is important to check whether the function arguments should specify Euclidean distances or squared Euclidean distances. In the [[R (programming language)|R function]] <code>hclust</code>, it is necessary to pass the '''squared''' Euclidean distances to obtain Ward's '''minimum variance''' method. For the other methods provided by <code>hclust</code> (single, complete, etc.), Euclidean distances are required.
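A minimal sketch of this usage, assuming a version of R in which <code>hclust</code> offers the method name <code>"ward.D"</code> (older versions of R call the same method simply <code>"ward"</code>):
<syntaxhighlight lang="r">
x <- matrix(rnorm(40), ncol = 2)   # toy data: 20 points in the plane
d <- dist(x)                       # Euclidean distances

# Ward's minimum variance method: pass *squared* Euclidean distances.
hc_ward <- hclust(d^2, method = "ward.D")

# Other linkages (single, complete, ...) take plain Euclidean distances.
hc_single <- hclust(d, method = "single")
</syntaxhighlight>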
==Lance–Williams algorithms==
Ward's minimum variance method can be defined and implemented recursively by a Lance–Williams algorithm.<ref name="Cormack1971">Cormack, R. M. (1971), "A Review of Classification", ''[[Journal of the Royal Statistical Society]], Series A'', 134(3), 321–367.</ref> The Lance–Williams algorithms are an infinite family of agglomerative hierarchical clustering algorithms which are represented by a recursive formula for updating cluster distances at each step (each time a pair of clusters is merged). At each step it is necessary to optimize the objective function, that is, to find the optimal pair of clusters to merge. The recursive formula simplifies finding the optimal pair.
Suppose that clusters <math>C_i</math> and <math>C_j</math> are next to be merged. At this point all of the current pairwise cluster distances are known. The recursive formula gives the updated cluster distances following the pending merge of clusters <math>C_i</math> and <math>C_j</math>. Let
*<math>d_{ij}</math>, <math>d_{ik}</math>, and <math>d_{jk}</math> be the pairwise distances between clusters <math>C_i</math>, <math>C_j</math>, and <math>C_k</math>, respectively,
*<math>d_{(ij)k}</math> be the distance between the new cluster <math>C_i \cup C_j</math> and <math>C_k</math>.
An algorithm belongs to the Lance–Williams family if the updated cluster distance <math>d_{(ij)k}</math> can be computed recursively by
:<math>d_{(ij)k} = \alpha_i d_{ik} + \alpha_j d_{jk} + \beta d_{ij} + \gamma |d_{ik} - d_{jk}|,</math>
where <math>\alpha_i, \alpha_j, \beta,</math> and <math>\gamma</math> are parameters, which may depend on cluster sizes, that together with the cluster distance function <math>d_{ij}</math> determine the clustering algorithm. Several standard clustering algorithms, such as [[Single-linkage clustering|single linkage]], [[Complete-linkage clustering|complete linkage]], and the group average method, have a recursive formula of the above type. A table of parameters for standard methods is given by several authors.<ref name="Cormack1971" /><ref>Gordon, A. D. (1999), ''Classification, 2nd Edition'', Chapman and Hall, Boca Raton.</ref><ref>Milligan, G. W. (1979), "Ultrametric Hierarchical Clustering Algorithms", ''Psychometrika'', 44(3), 343–346.</ref>
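A minimal R sketch of one step of this recursion; the function name <code>lw_update</code> and its argument names are assumptions made for this illustration, and the single-linkage parameters quoted in the example come from the standard tables cited above.
<syntaxhighlight lang="r">
# One Lance–Williams update: distance from the merged cluster (i,j) to a
# third cluster k, given the pre-merge distances and the four parameters.
lw_update <- function(d_ik, d_jk, d_ij, alpha_i, alpha_j, beta, gamma) {
  alpha_i * d_ik + alpha_j * d_jk + beta * d_ij + gamma * abs(d_ik - d_jk)
}

# Single linkage has alpha_i = alpha_j = 1/2, beta = 0, gamma = -1/2,
# which reduces the update to min(d_ik, d_jk):
lw_update(3, 5, 2, 0.5, 0.5, 0, -0.5)   # returns 3 = min(3, 5)
</syntaxhighlight>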
Ward's minimum variance method can be implemented by the Lance–Williams formula. For disjoint clusters <math>C_i, C_j,</math> and <math>C_k</math> with sizes <math>n_i, n_j,</math> and <math>n_k</math> respectively:
:<math>
d(C_i \cup C_j, C_k) =
\frac{n_i+n_k}{n_i+n_j+n_k}\;d(C_i,C_k) +
\frac{n_j+n_k}{n_i+n_j+n_k}\;d(C_j,C_k) -
\frac{n_k}{n_i+n_j+n_k}\;d(C_i,C_j).
</math>
Hence Ward's method can be implemented as a Lance–Williams algorithm with
:<math>
\alpha_l = \frac{n_l+n_k}{n_i+n_j+n_k} \quad (l \in \{i, j\}), \qquad
\beta = \frac{-n_k}{n_i+n_j+n_k}, \qquad
\gamma = 0.
</math>
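Putting the pieces together, a hedged R sketch of this specialization (the function name <code>ward_update</code> and its argument names are illustrative, not from the literature):
<syntaxhighlight lang="r">
# Ward's Lance–Williams update: distance from the merged cluster (i,j) to
# cluster k.  The distances are the squared-Euclidean-based cluster distances
# defined above; n_i, n_j, n_k are the cluster sizes.
ward_update <- function(d_ik, d_jk, d_ij, n_i, n_j, n_k) {
  n <- n_i + n_j + n_k
  ((n_i + n_k) * d_ik + (n_j + n_k) * d_jk - n_k * d_ij) / n
}

# Three singletons: the merged pair (i,j) is compared with k via
# (2*d_ik + 2*d_jk - d_ij) / 3.
ward_update(25, 16, 9, 1, 1, 1)   # (50 + 32 - 9) / 3 = 24.333...
</syntaxhighlight>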
==References==
{{reflist}}
== Further reading ==
* Everitt, B. S., Landau, S. and Leese, M. (2001), ''Cluster Analysis, 4th Edition'', Oxford University Press, Inc., New York; Arnold, London. ISBN 0340761199
* Hartigan, J. A. (1975), ''Clustering Algorithms'', New York: Wiley.
* [[Anil K. Jain|Jain, A. K.]] and Dubes, R. C. (1988), ''Algorithms for Clustering Data'', New Jersey: Prentice–Hall.
* Kaufman, L. and Rousseeuw, P. J. (1990), ''Finding Groups in Data: An Introduction to Cluster Analysis'', New York: Wiley.
{{DEFAULTSORT:Ward's method}}
[[Category:Data mining]]
[[Category:Data clustering algorithms]]
[[Category:Machine learning]]