A '''ratio distribution''' (or '''quotient distribution''') is a [[probability distribution]] constructed as the distribution of the [[ratio]] of [[random variable]]s having two other known distributions.
Given two (usually [[statistically independent|independent]]) random variables ''X'' and ''Y'', the distribution of the random variable ''W'' that is formed as the ratio

: <math>W = X/Y</math>

is a ''ratio distribution''.

The [[Cauchy distribution]] is an example of a ratio distribution: the random variable associated with it arises as the ratio of two [[normal distribution|Gaussian (normal) distributed]] variables with zero mean.
Thus the Cauchy distribution is also called the ''normal ratio distribution''.{{citation needed|date=April 2013}}
A number of researchers have considered more general ratio distributions.<ref name="GearyR1930Frequency">{{Cite journal | title = The Frequency Distribution of the Quotient of Two Normal Variates | last = Geary | first = R. C. | authorlink = Roy C. Geary | journal = [[Journal of the Royal Statistical Society]] | volume = 93 | issue = 3 | year = 1930 | pages = 442–446 | doi = 10.2307/2342070 | jstor = 2342070 }}</ref><ref>{{Cite journal | title = The Distribution of the Index in a Normal Bivariate Population | last = Fieller | first = E. C. | journal = [[Biometrika]] | volume = 24 | issue = 3/4 | date = November 1932 | pages = 428–440 | doi = 10.2307/2331976 | url = http://biomet.oxfordjournals.org/cgi/content/citation/24/3-4/428 | jstor = 2331976 }}</ref><ref name="CurtissJ1941On">{{Cite journal | last = Curtiss | first = J. H. | title = On the Distribution of the Quotient of Two Chance Variables | journal = [[The Annals of Mathematical Statistics]] | volume = 12 | issue = 4 | date = December 1941 | pages = 409–421 | doi = 10.1214/aoms/1177731679 | jstor = 2235953 }}</ref><ref>[[George Marsaglia]] (April 1964). ''[http://www.dtic.mil/dtic/tr/fulltext/u2/600972.pdf Ratios of Normal Variables and Ratios of Sums of Uniform Variables]''. [[Defense Technical Information Center]].</ref><ref>{{Cite journal | last = Marsaglia | first = George | authorlink = George Marsaglia | title = Ratios of Normal Variables and Ratios of Sums of Uniform Variables | journal = [[Journal of the American Statistical Association]] | volume = 60 | issue = 309 | date = March 1965 | pages = 193–204 | doi = 10.2307/2283145 | jstor = 2283145 }}</ref><ref name="HinkleyD1969On">{{Cite journal | last = Hinkley | first = D. V. | authorlink = D. V. Hinkley | title = On the Ratio of Two Correlated Normal Random Variables | journal = [[Biometrika]] | volume = 56 | issue = 3 | date = December 1969 | pages = 635–639 | doi = 10.2307/2334671 | jstor = 2334671 }}</ref><ref name="HayyaJ1975On">{{Cite journal | last1 = Hayya | first1 = Jack | authorlink1 = Jack Hayya | last2 = Armstrong | first2 = Donald | last3 = Gressis | first3 = Nicolas | title = A Note on the Ratio of Two Normally Distributed Variables | journal = [[Management Science (journal)|Management Science]] | date = July 1975 | volume = 21 | issue = 11 | pages = 1338–1341 | doi = 10.1287/mnsc.21.11.1338 | jstor = 2629897 }}</ref><ref name="SpringerM1979Algebra">{{Cite book | last = Springer | first = Melvin Dale | title = The Algebra of Random Variables | publisher = [[John Wiley & Sons|Wiley]] | year = 1979 | isbn = 0-471-01406-0 }}</ref><ref name="PhamGiaT2006Density">{{Cite journal | last1 = Pham-Gia | first1 = T. | last2 = Turkkan | first2 = N. | last3 = Marchand | first3 = E. | title = Density of the Ratio of Two Normal Random Variables and Applications | journal = Communications in Statistics – Theory and Methods | publisher = [[Taylor & Francis]] | volume = 35 | issue = 9 | year = 2006 | pages = 1569–1591 | doi = 10.1080/03610920600683689 }}</ref>

Two distributions often used in test statistics, the [[t-distribution|''t''-distribution]] and the [[F-distribution|''F''-distribution]], are also ratio distributions:
the ''t''-distributed random variable is the ratio of a [[Gaussian distribution|Gaussian]] random variable to an independent [[chi distribution|chi-distributed]] random variable (i.e., the square root of a [[chi-squared distribution|chi-squared distributed]] random variable), while the ''F''-distributed random variable is the ratio of two independent chi-squared distributed random variables.

Ratio distributions are often [[heavy-tailed]], and it may be difficult to work with such distributions and to develop an associated [[statistical test]].
A method based on the [[median]] has been suggested as a "work-around".<ref>{{Cite journal | title = Significance and statistical errors in the analysis of DNA microarray data | last1 = Brody | first1 = James P. | last2 = Williams | first2 = Brian A. | last3 = Wold | first3 = Barbara J. | last4 = Quake | first4 = Stephen R. | authorlink4 = Stephen R. Quake | journal = [[Proc Natl Acad Sci U S A]] | date = October 2002 | volume = 99 | issue = 20 | pages = 12975–12978 | doi = 10.1073/pnas.162468199 | pmid = 12235357 | pmc = 130571 }}</ref>

==Algebra of random variables==
{{main|Algebra of random variables}}

The ratio is one type of algebra for random variables:
related to the ratio distribution are the [[product distribution]], [[sum distribution]] and [[difference distribution]]. More generally, one may talk of combinations of sums, differences, products and ratios.
Many of these distributions are described in [[Melvin D. Springer]]'s 1979 book ''The Algebra of Random Variables''.<ref name="SpringerM1979Algebra" />

The algebraic rules known for ordinary numbers do not apply to the algebra of random variables.
For example, if a product is ''C'' = ''AB'' and a ratio is ''D'' = ''C''/''A'', it does not necessarily mean that the distributions of ''D'' and ''B'' are the same.
Indeed, a peculiar effect is seen for the [[Cauchy distribution]]: the product and the ratio of two independent Cauchy distributions (with the same scale parameter and the location parameter set to zero) give the same distribution.<ref name="SpringerM1979Algebra" />
This becomes evident when regarding the Cauchy distribution as itself a ratio distribution of two Gaussian distributions: consider two Cauchy random variables, <math>C_1</math> and <math>C_2</math>, each constructed as the ratio of two independent zero-mean Gaussian variables, <math>C_1 = G_1/G_2</math> and <math>C_2 = G_3/G_4</math>; then

: <math>\frac{C_1}{C_2} = \frac{{G_1}/{G_2}}{{G_3}/{G_4}} = \frac{G_1 G_4}{G_2 G_3} = \frac{G_1}{G_2} \times \frac{G_4}{G_3} = C_1 \times C_3,</math>

where <math>C_3 = G_4/G_3</math>. The first term is the ratio of two Cauchy distributions while the last term is the product of two such distributions.
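
The coincidence of the product and ratio distributions is easy to check empirically. The following is a minimal Monte Carlo sketch (it assumes Python with NumPy and SciPy; it is an illustration, not part of the cited references):

<syntaxhighlight lang="python">
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 100_000

# Four independent standard Cauchy samples, each built as the ratio of two
# independent zero-mean Gaussian samples.
c1 = rng.standard_normal(n) / rng.standard_normal(n)
c2 = rng.standard_normal(n) / rng.standard_normal(n)
c3 = rng.standard_normal(n) / rng.standard_normal(n)
c4 = rng.standard_normal(n) / rng.standard_normal(n)

ratio = c1 / c2      # ratio of two independent standard Cauchy variables
product = c3 * c4    # product of two independent standard Cauchy variables

# A two-sample Kolmogorov-Smirnov test should not reject the hypothesis
# that the ratio and the product samples come from the same distribution.
print(stats.ks_2samp(ratio, product))
</syntaxhighlight>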

==Derivation==
A way of deriving the distribution of the ratio ''Z'' = ''X''/''Y'' from the joint distribution of the two other random variables, ''X'' and ''Y'', is by integration of the following form<ref name="CurtissJ1941On" />

: <math>p_Z(z) = \int^{+\infty}_{-\infty} |y|\, p_{X,Y}(zy, y) \, dy. </math>

This is not always straightforward.
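
For independent variables the joint density factorises, and the integral can often be evaluated numerically. The sketch below (Python with NumPy and SciPy assumed; the helper name <code>ratio_pdf</code> is illustrative, not from the cited source) recovers the standard Cauchy density from two standard normal densities:

<syntaxhighlight lang="python">
import numpy as np
from scipy import integrate, stats

def ratio_pdf(z, pdf_x, pdf_y):
    """Numerically evaluate p_Z(z) = integral of |y| p_X(z*y) p_Y(y) dy."""
    integrand = lambda y: abs(y) * pdf_x(z * y) * pdf_y(y)
    value, _ = integrate.quad(integrand, -np.inf, np.inf)
    return value

phi = stats.norm.pdf   # standard normal density
for z in (-2.0, 0.0, 0.5, 3.0):
    numeric = ratio_pdf(z, phi, phi)
    exact = stats.cauchy.pdf(z)   # 1 / (pi * (1 + z^2))
    print(f"z={z:+.1f}  integral={numeric:.6f}  Cauchy={exact:.6f}")
</syntaxhighlight>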

The [[Mellin transform]] has also been suggested for derivation of ratio distributions.<ref name="SpringerM1979Algebra" />

==Gaussian ratio distribution==
When ''X'' and ''Y'' are independent and have a [[Gaussian distribution]] with zero mean, the form of their ratio distribution is simple:
it is a [[Cauchy distribution]].
However, when the two distributions have non-zero means, the form of the distribution of the ratio is much more complicated.
In 1969 [[David Hinkley]] derived a form for this distribution.<ref name="HinkleyD1969On" /> In the absence of correlation (cor(''X'',''Y'') = 0), the [[probability density function]] of the ratio ''Z'' = ''X''/''Y'' of the two normal variables ''X'' = ''N''(''μ<sub>X</sub>'', ''σ<sub>X</sub>''<sup>2</sup>) and ''Y'' = ''N''(''μ<sub>Y</sub>'', ''σ<sub>Y</sub>''<sup>2</sup>) is given by the following expression:

: <math> p_Z(z)= \frac{b(z) \cdot c(z)}{a^3(z)} \frac{1}{\sqrt{2 \pi} \sigma_x \sigma_y} \left[2 \Phi \left( \frac{b(z)}{a(z)}\right) - 1 \right] + \frac{1}{a^2(z) \cdot \pi \sigma_x \sigma_y } e^{- \frac{1}{2} \left( \frac{\mu_x^2}{\sigma_x^2} + \frac{\mu_y^2}{\sigma_y^2} \right)} </math>

where

: <math> a(z)= \sqrt{\frac{1}{\sigma_x^2} z^2 + \frac{1}{\sigma_y^2}} </math>

: <math> b(z)= \frac{\mu_x }{\sigma_x^2} z + \frac{\mu_y}{\sigma_y^2} </math>

: <math> c(z)= e^{\frac {1}{2} \frac{b^2(z)}{a^2(z)} - \frac{1}{2} \left( \frac{\mu_x^2}{\sigma_x^2} + \frac{\mu_y^2}{\sigma_y^2} \right)} </math>

: <math> \Phi(z)= \int_{-\infty}^{z}\, \frac{1}{\sqrt{2 \pi}} e^{- \frac{1}{2} u^2}\ du\ </math>

The above expression becomes even more complicated when the variables ''X'' and ''Y'' are correlated.
It can also be shown that ''p''(''z'') is a standard [[Cauchy distribution]] if ''μ<sub>X</sub>'' = ''μ<sub>Y</sub>'' = 0 and ''σ<sub>X</sub>'' = ''σ<sub>Y</sub>'' = 1. In that case ''b''(''z'') = 0, and

: <math>p(z)= \frac{1}{\pi} \frac{1}{1 + z^2} </math>
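
A direct implementation of this density is straightforward. The sketch below (Python with NumPy and SciPy assumed; the function name <code>hinkley_pdf</code> is illustrative, not from the cited paper) checks that the density integrates to one and reduces to the standard Cauchy density for zero means and unit variances:

<syntaxhighlight lang="python">
import numpy as np
from scipy import integrate, stats

def hinkley_pdf(z, mu_x, mu_y, sigma_x, sigma_y):
    """Density of Z = X/Y for independent X ~ N(mu_x, sigma_x^2), Y ~ N(mu_y, sigma_y^2)."""
    a = np.sqrt(z**2 / sigma_x**2 + 1.0 / sigma_y**2)
    b = mu_x * z / sigma_x**2 + mu_y / sigma_y**2
    c = np.exp(0.5 * b**2 / a**2 - 0.5 * (mu_x**2 / sigma_x**2 + mu_y**2 / sigma_y**2))
    term1 = (b * c / a**3) / (np.sqrt(2.0 * np.pi) * sigma_x * sigma_y) \
            * (2.0 * stats.norm.cdf(b / a) - 1.0)
    term2 = np.exp(-0.5 * (mu_x**2 / sigma_x**2 + mu_y**2 / sigma_y**2)) \
            / (a**2 * np.pi * sigma_x * sigma_y)
    return term1 + term2

# The density integrates to one (the example parameters are arbitrary) ...
total, _ = integrate.quad(hinkley_pdf, -np.inf, np.inf, args=(1.0, 2.0, 1.0, 0.5))
print(total)   # ~ 1.0

# ... and reduces to the standard Cauchy density for zero means and unit variances.
print(hinkley_pdf(0.7, 0.0, 0.0, 1.0, 1.0), stats.cauchy.pdf(0.7))
</syntaxhighlight>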

If ''μ<sub>X</sub>'' = ''μ<sub>Y</sub>'' = 0 but <math>\sigma_X \neq 1</math>, <math>\sigma_Y \neq 1</math> or <math>\rho \neq 0</math>, the more general Cauchy distribution is obtained:

: <math>p_Z(z) = \frac{1}{\pi} \frac{\beta}{(z-\alpha)^2 + \beta^2},</math>

where ''ρ'' is the [[Pearson product-moment correlation coefficient|correlation coefficient]] between ''X'' and ''Y'' and

: <math>\alpha = \rho \frac{\sigma_x}{\sigma_y},</math>

: <math>\beta = \frac{\sigma_x}{\sigma_y} \sqrt{1-\rho^2}.</math>

The complicated distribution arising in the general case has also been expressed with Kummer's [[confluent hypergeometric function]] or the [[Hermite function]].<ref name="PhamGiaT2006Density" />

===A transformation to Gaussianity===
A transformation has been suggested so that, under certain assumptions, the transformed variable ''T'' would approximately have a standard Gaussian distribution:<ref name="GearyR1930Frequency" />

: <math>t = \frac{\mu_y z - \mu_x}{\sqrt{\sigma_y^2 z^2 - 2\rho \sigma_x \sigma_y z + \sigma_x^2}}</math>

The transformation has been called the Geary–Hinkley transformation,<ref name="HayyaJ1975On" /> and the approximation is good if ''Y'' is unlikely to assume negative values.
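
As a rough illustration, the transformation can be applied to simulated ratios and the result compared with a standard normal. The sketch below uses arbitrary example parameters, chosen so that ''Y'' is essentially always positive (Python with NumPy and SciPy assumed):

<syntaxhighlight lang="python">
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
mu_x, mu_y = 10.0, 20.0                  # arbitrary example means
sigma_x, sigma_y, rho = 1.0, 1.0, 0.0    # independent case; Y is essentially always positive

x = rng.normal(mu_x, sigma_x, 100_000)
y = rng.normal(mu_y, sigma_y, 100_000)
z = x / y

# Geary-Hinkley transformation of the simulated ratios.
t = (mu_y * z - mu_x) / np.sqrt(sigma_y**2 * z**2 - 2.0 * rho * sigma_x * sigma_y * z + sigma_x**2)

# The transformed values should be close to standard normal:
# mean ~ 0, standard deviation ~ 1, skewness ~ 0, excess kurtosis ~ 0.
print(t.mean(), t.std(), stats.skew(t), stats.kurtosis(t))
</syntaxhighlight>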

==Uniform ratio distribution==
With two independent random variables following a [[uniform distribution (continuous)|uniform distribution]], e.g.,

: <math>p_X(x) = \begin{cases} 1 \qquad 0 < x < 1 \\ 0 \qquad \mbox{otherwise}\end{cases}</math>

the ratio distribution becomes

: <math>p_Z(z) = \begin{cases}
1/2 \qquad & 0 < z < 1 \\
\frac{1}{2z^2} \qquad & z \geq 1 \\
0 \qquad & \mbox{otherwise} \end{cases}</math>
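
A quick Monte Carlo check of this piecewise density (a sketch; Python with NumPy assumed, and the helper name is illustrative):

<syntaxhighlight lang="python">
import numpy as np

def uniform_ratio_pdf(z):
    """Density of Z = X/Y for independent X, Y ~ Uniform(0, 1)."""
    z = np.asarray(z, dtype=float)
    return np.where(z <= 0.0, 0.0, np.where(z < 1.0, 0.5, 0.5 / z**2))

rng = np.random.default_rng(2)
z = rng.uniform(size=1_000_000) / rng.uniform(size=1_000_000)

# Compare an empirical density estimate with the formula on (0, 3).
hist, edges = np.histogram(z[z < 3.0], bins=30, range=(0.0, 3.0))
width = edges[1] - edges[0]
empirical = hist / (len(z) * width)      # normalise by the *full* sample size
centres = 0.5 * (edges[:-1] + edges[1:])
print(np.max(np.abs(empirical - uniform_ratio_pdf(centres))))   # small (~0.01 or less)
</syntaxhighlight>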

==Cauchy ratio distribution==
If two independent random variables, ''X'' and ''Y'', each follow a [[Cauchy distribution]] with median equal to zero and shape factor <math>a</math>,

: <math>p_X(x|a) = \frac{a}{\pi (a^2 + x^2)},</math>

then the ratio distribution for the random variable <math>Z = X/Y</math> is<ref name="Kermond2010">{{Cite journal | title = An Introduction to the Algebra of Random Variables | last = Kermond | first = John | journal = Mathematical Association of Victoria 47th Annual Conference Proceedings – New Curriculum. New Opportunities | publisher = The Mathematical Association of Victoria | year = 2010 | isbn = 978-1-876949-50-1 | pages = 1–16 }}</ref>

: <math>p_Z(z|a) = \frac{1}{\pi^2(z^2-1)} \ln \left(z^2\right).</math>

This distribution does not depend on <math>a</math>; the result stated by Springer<ref name="SpringerM1979Algebra" /> (p. 158, Question 4.6) is not correct.
The ratio distribution is similar to but not the same as the [[product distribution]] of the random variable <math>W=XY</math>:

: <math>p_W(w|a) = \frac{a^2}{\pi^2(w^2-a^4)} \ln \left(\frac{w^2}{a^4}\right).</math><ref name="SpringerM1979Algebra" />

More generally, if two independent random variables ''X'' and ''Y'' each follow a [[Cauchy distribution]] with median equal to zero and shape factors <math>a</math> and <math>b</math> respectively, then:

1. The ratio distribution for the random variable <math>Z = X/Y</math> is<ref name="Kermond2010" />

: <math>p_Z(z|a,b) = \frac{ab}{\pi^2(b^2z^2-a^2)} \ln \left(\frac{b^2 z^2}{a^2}\right).</math>

2. The [[product distribution]] for the random variable <math>W = XY</math> is<ref name="Kermond2010" />

: <math>p_W(w|a,b) = \frac{ab}{\pi^2(w^2-a^2b^2)} \ln \left(\frac{w^2}{a^2b^2}\right).</math>

The result for the ratio distribution can be obtained from the product distribution by replacing <math>b</math> with <math>\frac{1}{b}</math>.
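
These closed forms are easy to check against simulation; the following is a minimal sketch (Python with NumPy and SciPy assumed; the function name is illustrative) comparing the general ratio density with empirical frequencies:

<syntaxhighlight lang="python">
import numpy as np
from scipy import stats

def cauchy_ratio_pdf(z, a, b):
    """Density of Z = X/Y for independent X ~ Cauchy(0, a) and Y ~ Cauchy(0, b)."""
    return a * b / (np.pi**2 * (b**2 * z**2 - a**2)) * np.log(b**2 * z**2 / a**2)

a, b = 2.0, 0.5
rng = np.random.default_rng(3)
x = stats.cauchy.rvs(scale=a, size=1_000_000, random_state=rng)
y = stats.cauchy.rvs(scale=b, size=1_000_000, random_state=rng)
z = x / y

# Estimate the density on a few narrow windows (avoiding the removable
# singularities at z = +/- a/b) and compare with the formula.
for centre in (-10.0, -1.5, 1.5, 10.0):
    half_width = 0.05
    empirical = np.mean(np.abs(z - centre) < half_width) / (2.0 * half_width)
    print(f"z={centre:+.1f}  empirical={empirical:.4f}  formula={cauchy_ratio_pdf(centre, a, b):.4f}")
</syntaxhighlight>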

==Ratio of standard normal to standard uniform==
{{main|Slash distribution}}

If ''X'' has a standard normal distribution and ''Y'' has a standard uniform distribution, then ''Z'' = ''X''/''Y'' has a distribution known as the ''[[slash distribution]]'', with probability density function

: <math>p_Z(z) = \begin{cases}
\left[ \phi(0) - \phi(z) \right] / z^2 \quad & z \ne 0 \\
\phi(0) / 2 \quad & z = 0, \\
\end{cases}</math>

where φ(''z'') is the probability density function of the standard normal distribution.<ref name=nist>{{cite web | url = http://www.itl.nist.gov/div898/software/dataplot/refman2/auxillar/slappf.htm | title = SLAPPF | publisher = Statistical Engineering Division, National Institute of Science and Technology | accessdate = 2009-07-02}}</ref>
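
A short numerical sanity check of this density (a sketch; Python with NumPy and SciPy assumed, helper name illustrative) confirms that it is continuous at zero and integrates to one:

<syntaxhighlight lang="python">
import numpy as np
from scipy import integrate, stats

def slash_pdf(z):
    """Slash density: [phi(0) - phi(z)] / z^2 for z != 0, and phi(0)/2 at z = 0."""
    phi0 = stats.norm.pdf(0.0)
    if z == 0.0:
        return phi0 / 2.0
    return (phi0 - stats.norm.pdf(z)) / z**2

# Continuity at z = 0 and normalisation.
print(slash_pdf(1e-6), slash_pdf(0.0))             # both ~ 0.1995
print(integrate.quad(slash_pdf, -np.inf, np.inf))  # ~ (1.0, small error estimate)
</syntaxhighlight>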

==Other ratio distributions==
Let ''X'' be a normal(0,1) random variable, and let ''Y'' and ''Z'' be [[chi-squared distribution|chi-squared distributed]] random variables with ''m'' and ''n'' [[degrees of freedom]] respectively, all three mutually independent. Then

: <math> \frac{ X }{ \sqrt{ Y / m } } \sim t_m </math>

: <math> \frac{ Y / m }{ Z / n } \sim F_{ m, n } </math>

: <math> \frac{ Y }{ Y + Z } \sim \mathrm{beta}( m / 2, n / 2 )</math>

where ''t<sub>m</sub>'' is [[Student's t-distribution|Student's ''t'' distribution]] with ''m'' degrees of freedom, ''F'' is the [[F-distribution|''F'' distribution]] and beta is the [[beta distribution]].
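
These relationships are easy to confirm by simulation; a minimal sketch follows (Python with NumPy and SciPy assumed; the variable names are illustrative):

<syntaxhighlight lang="python">
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
m, n, size = 5, 7, 200_000

x = rng.standard_normal(size)        # X ~ N(0, 1)
y = rng.chisquare(m, size)           # Y ~ chi-squared with m degrees of freedom
w = rng.chisquare(n, size)           # the text's Z; renamed to avoid clashing with the ratio

t_samples = x / np.sqrt(y / m)       # should follow Student's t with m degrees of freedom
f_samples = (y / m) / (w / n)        # should follow F(m, n)
beta_samples = y / (y + w)           # should follow Beta(m/2, n/2)

# One-sample Kolmogorov-Smirnov tests against the claimed distributions.
print(stats.kstest(t_samples, "t", args=(m,)))
print(stats.kstest(f_samples, "f", args=(m, n)))
print(stats.kstest(beta_samples, "beta", args=(m / 2, n / 2)))
</syntaxhighlight>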

== Ratio distributions in multivariate analysis ==
Ratio distributions also appear in [[multivariate analysis]].
If the random matrices '''X''' and '''Y''' follow a [[Wishart distribution]], then the ratio of the [[determinant]]s

: <math>\phi = |\mathbf{X}|/|\mathbf{Y}|</math>

is proportional to the product of independent [[F-distribution|F]] random variables. In the case where '''X''' and '''Y''' are from independent standardized [[Wishart distribution]]s, the ratio

: <math>\Lambda = {|\mathbf{X}|/|\mathbf{X}+\mathbf{Y}|} </math>

has a [[Wilks' lambda distribution]].
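
As a brief illustration (a sketch only; Python with NumPy and SciPy assumed, with arbitrary example parameters), these determinant ratios can be computed from sampled Wishart matrices; in the 1×1 case Λ reduces to the beta-distributed ratio given in the previous section:

<syntaxhighlight lang="python">
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
p, n, m = 3, 10, 7                   # dimension and degrees of freedom (example values)
scale = np.eye(p)

X = stats.wishart.rvs(df=n, scale=scale, size=5000, random_state=rng)
Y = stats.wishart.rvs(df=m, scale=scale, size=5000, random_state=rng)

phi = np.linalg.det(X) / np.linalg.det(Y)        # ratio of determinants
lam = np.linalg.det(X) / np.linalg.det(X + Y)    # Wilks' lambda statistic
print(phi.mean(), lam.mean())

# In the 1x1 case the Wishart variables are chi-squared, and
# Lambda = X/(X+Y) follows a Beta(n/2, m/2) distribution.
x1 = rng.chisquare(n, 100_000)
y1 = rng.chisquare(m, 100_000)
print(stats.kstest(x1 / (x1 + y1), "beta", args=(n / 2, m / 2)))
</syntaxhighlight>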

==See also==
*[[Inverse distribution]]
*[[Product distribution]]
*[[Ratio estimator]]
*[[Slash distribution]]

==References==
{{Reflist}}

==External links==
*[http://mathworld.wolfram.com/RatioDistribution.html Ratio Distribution] at [[MathWorld]]
*[http://mathworld.wolfram.com/NormalRatioDistribution.html Normal Ratio Distribution] at [[MathWorld]]
*[http://www.mathpages.com/home/kmath042/kmath042.htm Ratio Distributions] at MathPages

[[Category:Algebra of random variables]]
[[Category:Statistical ratios]]
[[Category:Types of probability distributions]]