Law of total cumulance
{{main|cumulant}}
In [[probability theory]] and [[mathematics|mathematical]] [[statistics]], the '''law of total cumulance''' is a generalization to [[cumulant]]s of the [[law of total probability]], the [[law of total expectation]], and the [[law of total variance]]. It has applications in the analysis of [[time series]]. It was introduced by David Brillinger.<ref>David Brillinger, "The calculation of cumulants via conditioning", ''Annals of the Institute of Statistical Mathematics'', Vol. 21 (1969), pp. 215–218.</ref>

It is most transparent when stated in its most general form, for ''joint'' cumulants, rather than for cumulants of a specified order for just one [[random variable]]. In general, we have

:<math>\kappa(X_1,\dots,X_n)=\sum_\pi \kappa(\kappa(X_i : i\in B \mid Y) : B \in \pi),</math>

where
* κ(''X''<sub>1</sub>, ..., ''X''<sub>''n''</sub>) is the joint cumulant of the ''n'' random variables ''X''<sub>1</sub>, ..., ''X''<sub>''n''</sub>,
* the sum is over all [[partition of a set|partitions]] <math>\pi</math> of the set { 1, ..., ''n'' } of indices,
* "''B'' ∈ π" means ''B'' runs through the whole list of "blocks" of the partition π, and
* κ(''X''<sub>''i''</sub> : ''i'' ∈ ''B'' | ''Y'') is a conditional cumulant given the value of the random variable ''Y''. It is therefore a random variable in its own right—a function of the random variable ''Y''.
==Examples==

===The special case of just one random variable and ''n'' = 2 or 3===

Only for ''n'' = 2 or ''n'' = 3 is the ''n''th cumulant the same as the ''n''th [[central moment]]. The case ''n'' = 2 is well known (see [[law of total variance]]). Below is the case ''n'' = 3, where μ<sub>3</sub> denotes the third central moment:

:<math>\mu_3(X)=E(\mu_3(X\mid Y))+\mu_3(E(X\mid Y))
+3\,\operatorname{cov}(E(X\mid Y),\operatorname{var}(X\mid Y)).\,</math>
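Because the identity is exact, it can be checked with exact rational arithmetic. The following Python sketch conditions on a Bernoulli ''Y''; the two conditional distributions ''F'', ''G'' and the mixing probability ''p'' are hypothetical values chosen only for illustration.

```python
from fractions import Fraction as Fr

# Hypothetical example: Y is Bernoulli(p); given Y=1, X has law F,
# and given Y=0, X has law G (finite distributions, value -> probability).
F = {0: Fr(1, 2), 2: Fr(1, 2)}
G = {1: Fr(1, 4), 3: Fr(3, 4)}
p = Fr(1, 3)
q = 1 - p

def mean(d):
    return sum(v * w for v, w in d.items())

def central_moment(d, k):
    m = mean(d)
    return sum((v - m) ** k * w for v, w in d.items())

# Marginal law of X: the (p, q)-mixture of F and G.
X = {}
for d, weight in ((F, p), (G, q)):
    for v, pr in d.items():
        X[v] = X.get(v, 0) + weight * pr

# E(mu3(X|Y)): average of the conditional third central moments.
e_cond_mu3 = p * central_moment(F, 3) + q * central_moment(G, 3)

# mu3(E(X|Y)): third central moment of the two-point law of E(X|Y).
cond_mean = {mean(F): p, mean(G): q}
mu3_cond_mean = central_moment(cond_mean, 3)

# cov(E(X|Y), var(X|Y)): both are functions of the Bernoulli Y.
mbar = p * mean(F) + q * mean(G)
vbar = p * central_moment(F, 2) + q * central_moment(G, 2)
cov_term = (p * (mean(F) - mbar) * (central_moment(F, 2) - vbar)
            + q * (mean(G) - mbar) * (central_moment(G, 2) - vbar))

lhs = central_moment(X, 3)
rhs = e_cond_mu3 + mu3_cond_mean + 3 * cov_term
assert lhs == rhs
```

With these particular values both sides equal −1; any other finite conditional laws would do equally well, since the decomposition is an identity rather than an approximation.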
===General 4th-order joint cumulants===

For general 4th-order cumulants, the rule gives a sum of 15 terms, as follows:

:<math>\kappa(X_1,X_2,X_3,X_4)\,</math>
::<math>=\kappa(\kappa(X_1,X_2,X_3,X_4\mid Y))\,</math>
:::<math>\left.\begin{matrix}
& {}+\kappa(\kappa(X_1,X_2,X_3\mid Y),\kappa(X_4\mid Y)) \\ \\
& {}+\kappa(\kappa(X_1,X_2,X_4\mid Y),\kappa(X_3\mid Y)) \\ \\
& {}+\kappa(\kappa(X_1,X_3,X_4\mid Y),\kappa(X_2\mid Y)) \\ \\
& {}+\kappa(\kappa(X_2,X_3,X_4\mid Y),\kappa(X_1\mid Y))
\end{matrix}\right\}(\mathrm{partitions}\ \mathrm{of}\ \mathrm{the}\ 3+1\ \mathrm{form})</math>
:::<math>\left.\begin{matrix}
& {}+\kappa(\kappa(X_1,X_2\mid Y),\kappa(X_3,X_4\mid Y)) \\ \\
& {}+\kappa(\kappa(X_1,X_3\mid Y),\kappa(X_2,X_4\mid Y)) \\ \\
& {}+\kappa(\kappa(X_1,X_4\mid Y),\kappa(X_2,X_3\mid Y))\end{matrix}\right\}(\mathrm{partitions}\ \mathrm{of}\ \mathrm{the}\ 2+2\ \mathrm{form})</math>
:::<math>\left.\begin{matrix}
& {}+\kappa(\kappa(X_1,X_2\mid Y),\kappa(X_3\mid Y),\kappa(X_4\mid Y)) \\ \\
& {}+\kappa(\kappa(X_1,X_3\mid Y),\kappa(X_2\mid Y),\kappa(X_4\mid Y)) \\ \\
& {}+\kappa(\kappa(X_1,X_4\mid Y),\kappa(X_2\mid Y),\kappa(X_3\mid Y)) \\ \\
& {}+\kappa(\kappa(X_2,X_3\mid Y),\kappa(X_1\mid Y),\kappa(X_4\mid Y)) \\ \\
& {}+\kappa(\kappa(X_2,X_4\mid Y),\kappa(X_1\mid Y),\kappa(X_3\mid Y)) \\ \\
& {}+\kappa(\kappa(X_3,X_4\mid Y),\kappa(X_1\mid Y),\kappa(X_2\mid Y))
\end{matrix}\right\}(\mathrm{partitions}\ \mathrm{of}\ \mathrm{the}\ 2+1+1\ \mathrm{form})</math>
:::<math>{}+\kappa(\kappa(X_1\mid Y),\kappa(X_2\mid Y),\kappa(X_3\mid Y),\kappa(X_4\mid Y)).\,</math>
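The bookkeeping behind the 15 terms is just the enumeration of set partitions of { 1, 2, 3, 4 }: one partition of the 4 form, four of the 3 + 1 form, three of the 2 + 2 form, six of the 2 + 1 + 1 form, and one of the 1 + 1 + 1 + 1 form. A short Python sketch that enumerates them:

```python
from collections import Counter

def set_partitions(elems):
    """Yield every set partition of the list elems as a tuple of blocks."""
    if not elems:
        yield ()
        return
    first, rest = elems[0], elems[1:]
    for part in set_partitions(rest):
        # Insert `first` into each existing block in turn...
        for i, block in enumerate(part):
            yield part[:i] + (block + (first,),) + part[i + 1:]
        # ...or start a new singleton block with it.
        yield part + ((first,),)

parts = list(set_partitions([1, 2, 3, 4]))
# Group the partitions by the multiset of their block sizes.
shapes = Counter(tuple(sorted((len(b) for b in part), reverse=True))
                 for part in parts)

assert len(parts) == 15          # the Bell number B_4
assert shapes[(4,)] == 1         # one term of the 4 form
assert shapes[(3, 1)] == 4       # four terms of the 3+1 form
assert shapes[(2, 2)] == 3       # three terms of the 2+2 form
assert shapes[(2, 1, 1)] == 6    # six terms of the 2+1+1 form
assert shapes[(1, 1, 1, 1)] == 1 # one term of the 1+1+1+1 form
```

Each partition contributes one κ(κ(... | ''Y''), ...) term, with one inner conditional cumulant per block, exactly as displayed above.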
===Cumulants of compound Poisson random variables===

Suppose ''Y'' has a [[Poisson distribution]] with [[expected value]] 1, and ''X'' is the sum of ''Y'' [[statistical independence|independent]] copies of ''W'':

:<math>X=\sum_{y=1}^Y W_y.\,</math>

All of the cumulants of the Poisson distribution are equal to each other, and so in this case are all equal to 1. Also recall that if random variables ''W''<sub>1</sub>, ..., ''W''<sub>''m''</sub> are [[statistical independence|independent]], then the ''n''th cumulant is additive:

:<math>\kappa_n(W_1+\cdots+W_m)=\kappa_n(W_1)+\cdots+\kappa_n(W_m).\,</math>
We will find the 4th cumulant of ''X''. We have:

:<math>\kappa_4(X)=\kappa(X,X,X,X)\,</math>
::<math>=\kappa_1(\kappa_4(X\mid Y))+4\kappa(\kappa_3(X\mid Y),\kappa_1(X\mid Y))+3\kappa_2(\kappa_2(X\mid Y))\,</math>
:::<math>{}+6\kappa(\kappa_2(X\mid Y),\kappa_1(X\mid Y),\kappa_1(X\mid Y))+\kappa_4(\kappa_1(X\mid Y))\,</math>
::<math>=\kappa_1(Y\kappa_4(W))+4\kappa(Y\kappa_3(W),Y\kappa_1(W))
+3\kappa_2(Y\kappa_2(W))\,</math>
:::<math>{}+6\kappa(Y\kappa_2(W),Y\kappa_1(W),Y\kappa_1(W))
+\kappa_4(Y\kappa_1(W))\,</math>
::<math>=\kappa_4(W)\kappa_1(Y)+4\kappa_3(W)\kappa_1(W)\kappa_2(Y)
+3\kappa_2(W)^2 \kappa_2(Y)\,</math>
:::<math>{}+6\kappa_2(W) \kappa_1(W)^2 \kappa_3(Y)+\kappa_1(W)^4 \kappa_4(Y)\,</math>
::<math>=\kappa_4(W)+4\kappa_3(W)\kappa_1(W)
+3\kappa_2(W)^2+6\kappa_2(W) \kappa_1(W)^2+\kappa_1(W)^4\,</math>
::<math>=E(W^4)\,</math> (the punch line—see the explanation below).
We recognize this last sum as the sum, over all partitions of the set { 1, 2, 3, 4 }, of the product over all blocks of the partition of cumulants of ''W'' of order equal to the size of the block. That is precisely the 4th raw [[moment (mathematics)|moment]] of ''W'' (see [[cumulant]] for a more leisurely discussion of this fact). Hence the moments of ''W'' are the cumulants of ''X''.
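The partition identity above between moments and cumulants can be checked directly for a concrete ''W''. In the Python sketch below the distribution of ''W'' is a hypothetical choice; its cumulants are obtained from the raw moments by the standard inversion formulas, and the sum over partitions collapses to the five symmetric terms of the derivation above.

```python
from fractions import Fraction as Fr

# A hypothetical finite distribution for W (value -> probability).
W = {0: Fr(1, 6), 1: Fr(1, 2), 3: Fr(1, 3)}

def moment(d, k):
    return sum(v ** k * w for v, w in d.items())

m1, m2, m3, m4 = (moment(W, k) for k in (1, 2, 3, 4))

# Cumulants from raw moments (standard inversion formulas).
k1 = m1
k2 = m2 - m1 ** 2
k3 = m3 - 3 * m1 * m2 + 2 * m1 ** 3
k4 = m4 - 4 * m1 * m3 - 3 * m2 ** 2 + 12 * m1 ** 2 * m2 - 6 * m1 ** 4

# Sum over partitions of {1,2,3,4}, one product of cumulants per partition:
# 1 of the 4 form, 4 of 3+1, 3 of 2+2, 6 of 2+1+1, 1 of 1+1+1+1.
total = k4 + 4 * k3 * k1 + 3 * k2 ** 2 + 6 * k2 * k1 ** 2 + k1 ** 4

assert total == m4  # kappa_4(X) = E(W^4)
```

Since the arithmetic is exact, the assertion verifies the identity for this ''W'' with no rounding error; the same check passes for any finite distribution.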
In this way we see that every moment sequence is also a cumulant sequence (the converse cannot be true, since cumulants of even order ≥ 4 are in some cases negative, and also because the cumulant sequence of the [[normal distribution]] is not a moment sequence of any probability distribution).
===Conditioning on a Bernoulli random variable===

Suppose ''Y'' = 1 with probability ''p'' and ''Y'' = 0 with probability ''q'' = 1 − ''p''. Suppose the conditional probability distribution of ''X'' given ''Y'' is ''F'' if ''Y'' = 1 and ''G'' if ''Y'' = 0. Then we have

:<math>\kappa_n(X)=p\kappa_n(F)+q\kappa_n(G)+\sum_{\pi<\widehat{1}} \kappa_{\left|\pi\right|}(Y)\prod_{B\in\pi}
(\kappa_{\left|B\right|}(F)-\kappa_{\left|B\right|}(G)),</math>

where <math>\pi<\widehat{1}</math> means π runs through the partitions of the set { 1, ..., ''n'' } that are strictly finer than the one-block partition <math>\widehat{1}</math> (the coarsest partition); that is, the sum is over every partition except <math>\widehat{1}</math> itself. For example, if ''n'' = 3, then we have

:<math>\kappa_3(X)=p\kappa_3(F)+q\kappa_3(G)
+3pq(\kappa_2(F)-\kappa_2(G))(\kappa_1(F)-\kappa_1(G))
+pq(q-p)(\kappa_1(F)-\kappa_1(G))^3.\,</math>
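Again the identity is exact, so it can be verified with rational arithmetic. In this Python sketch ''F'', ''G'', and ''p'' are hypothetical choices, and κ<sub>1</sub>, κ<sub>2</sub>, κ<sub>3</sub> are computed as the mean, the variance, and the third central moment, since cumulants of order up to 3 coincide with central moments (apart from κ<sub>1</sub>, which is the mean).

```python
from fractions import Fraction as Fr

# Hypothetical conditional laws (value -> probability).
F = {0: Fr(1, 2), 2: Fr(1, 2)}   # law of X given Y = 1
G = {1: Fr(1, 4), 3: Fr(3, 4)}   # law of X given Y = 0
p = Fr(2, 5)
q = 1 - p

def kappa(d, n):
    """Cumulants of order 1..3 of a finite distribution d.

    kappa_1 is the mean; kappa_2 and kappa_3 equal the central moments.
    """
    m = sum(v * w for v, w in d.items())
    if n == 1:
        return m
    return sum((v - m) ** n * w for v, w in d.items())

# Marginal law of X: the (p, q)-mixture of F and G.
X = {}
for d, weight in ((F, p), (G, q)):
    for v, pr in d.items():
        X[v] = X.get(v, 0) + weight * pr

# Right-hand side of the n = 3 formula.
d1 = kappa(F, 1) - kappa(G, 1)
d2 = kappa(F, 2) - kappa(G, 2)
rhs = (p * kappa(F, 3) + q * kappa(G, 3)
       + 3 * p * q * d2 * d1
       + p * q * (q - p) * d1 ** 3)

assert kappa(X, 3) == rhs
```

The two correction terms correspond to the three partitions of { 1, 2, 3 } of the 2 + 1 form (each contributing κ<sub>2</sub>(''Y'') = ''pq'') and to the single 1 + 1 + 1 partition (contributing κ<sub>3</sub>(''Y'') = ''pq''(''q'' − ''p'')).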
==References==
{{reflist}}

{{DEFAULTSORT:Law Of Total Cumulance}}
[[Category:Algebra of random variables]]
[[Category:Theory of probability distributions]]
[[Category:Statistical theorems]]
[[Category:Statistical laws]]
Revision as of 03:09, 3 February 2014