{{distinguish|Median absolute deviation}}
The '''mean difference''' is a [[Statistical dispersion#Measures of statistical dispersion|measure of statistical dispersion]] equal to the average [[absolute difference]] of two independent values drawn from a [[probability distribution]]. A related statistic is the '''[[#Relative_mean_difference|relative mean difference]]''', which is the mean difference divided by the [[arithmetic mean]]. An important relationship is that the relative mean difference is equal to twice the [[Gini coefficient]], which is defined in terms of the [[Lorenz curve]].
 
The mean difference is also known as the '''absolute mean difference''' and the '''[[Corrado Gini|Gini]] mean difference'''.  The mean difference is sometimes denoted by Δ or as MD.  The [[mean deviation]] is a different measure of dispersion.
 
== Definition ==
The mean difference is defined as the "average" or "mean", formally the [[expected value]], of the absolute difference of two [[Independent and identically distributed random variables|independent and identically distributed]] [[random variables]] ''X'' and ''Y'' with a common (unknown) distribution, henceforth called ''Q''.
 
:<math>\mathrm{MD} := E[|X - Y|] .</math>
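
For example, if ''X'' and ''Y'' are independent fair coin flips taking the values 0 and 1, then |''X'' − ''Y''| equals 1 exactly when the two flips differ, so MD = Pr(''X'' ≠ ''Y'') = 1/2; this matches the [[Bernoulli distribution|Bernoulli]] entry with ''p'' = 1/2 in the [[#Examples|examples]] below.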
 
== Calculation ==
Specifically,
 
*if ''Q'' has a [[Discrete probability distribution|discrete probability function]] ''f''(''y''), where ''y''<sub>''i''</sub>, ''i'' = 1 to ''n'', are the values with nonzero probabilities:
:<math>\mathrm{MD} = \sum_{i=1}^n \sum_{j=1}^n f(y_i)\, f(y_j)\, | y_i - y_j | .</math>
 
*if ''Q'' has a [[probability density function]] ''f''(''x''):
:<math>\mathrm{MD} = \int_{-\infty}^\infty \int_{-\infty}^\infty f(x)\,f(y)\,|x-y|\,dx\,dy .</math>
 
*if ''Q'' has a [[cumulative distribution function]] ''F''(''x'') with [[quantile function]] ''F''<sup>−1</sup>(''x''):
:<math>\mathrm{MD} = \int_0^1 \int_0^1 |F^{-1}(x)-F^{-1}(y)|\,dx\,dy .</math>
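
For example, applying the density formula above to the [[Uniform distribution (continuous)|continuous uniform distribution]] on [0,&nbsp;1], for which ''f''(''x'') = 1 on the interval:
:<math>\mathrm{MD} = \int_0^1 \int_0^1 |x-y|\,dx\,dy = 2\int_0^1 \int_0^y (y-x)\,dx\,dy = 2\int_0^1 \frac{y^2}{2}\,dy = \frac{1}{3},</math>
which agrees with the value listed in the [[#Examples|examples]] below.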
 
For a random sample of size ''n'' of a population distributed according to ''Q'', the (empirical) mean difference of the sequence of sample values ''y''<sub>''i''</sub>, ''i'' = 1 to ''n'', can be calculated as the [[arithmetic mean]] of the absolute value of all possible differences:
:<math>\mathrm{MD} = \frac{1}{n^2} \sum_{i=1}^n \sum_{j=1}^n | y_i - y_j | .</math>
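
A minimal computational sketch of this empirical calculation, assuming the [[NumPy]] library is available (the sample values are purely illustrative):
<syntaxhighlight lang="python">
import numpy as np

y = np.array([2.0, 4.0, 7.0, 1.0])        # illustrative sample values
diffs = np.abs(y[:, None] - y[None, :])   # matrix of |y_i - y_j| over all pairs (i, j)
md_empirical = diffs.sum() / len(y) ** 2  # divide the total by n^2, as in the formula above
print(md_empirical)                       # 2.5 for this sample
</syntaxhighlight>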
 
== Relative mean difference ==
When the probability distribution has a finite and nonzero [[arithmetic mean]], the relative mean difference, sometimes denoted by ∇ or RMD, is defined by
 
:<math>\mathrm{RMD} = \frac{\mathrm{MD}}{\mbox{arithmetic mean}}.</math>
 
The relative mean difference quantifies the mean difference in comparison to the size of the mean and is a dimensionless quantity.  The relative mean difference is equal to twice the [[Gini coefficient]] which is defined in terms of the [[Lorenz curve]].  This relationship gives complementary perspectives to both the relative mean difference and the Gini coefficient, including alternative ways of calculating their values.
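
For example, the [[Uniform distribution (continuous)|continuous uniform distribution]] on [0,&nbsp;1] has mean 1/2 and mean difference 1/3, so its relative mean difference is 2/3 and its Gini coefficient is 1/3.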
 
== Properties ==
The mean difference is invariant to translations and negation, and varies proportionally to positive scaling.  That is to say, if '''''X''''' is a random variable and ''c'' is a constant:
*MD('''''X''''' + ''c'') = MD('''''X'''''),
*MD(-'''''X''''') = MD('''''X'''''), and
*MD(''c'' '''''X''''') = |''c''| MD('''''X''''').
 
The relative mean difference is invariant to positive scaling, commutes with negation, and varies under translation in proportion to the ratio of the original and translated arithmetic means.  That is to say, if '''''X''''' is a random variable and ''c'' is a constant:
*RMD('''''X''''' + ''c'') = RMD('''''X''''') · mean('''''X''''')/(mean('''''X''''') + ''c'') = RMD('''''X''''') / (1+''c'' / mean('''''X''''')) for ''c'' ≠ -mean('''''X'''''),
*RMD(-'''''X''''') = −RMD('''''X'''''), and
*RMD(''c'' '''''X''''') = RMD('''''X''''') for ''c'' > 0.
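
For example, rescaling the continuous uniform distribution from [0,&nbsp;1] to [0,&nbsp;''c''] with ''c'' > 0 multiplies the mean difference by ''c'' (from 1/3 to ''c''/3) but leaves the relative mean difference unchanged at 2/3.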
 
If a random variable has a positive mean, then its relative mean difference will always be greater than or equal to zero.  If, additionally, the random variable can only take on values that are greater than or equal to zero, then its relative mean difference will be less than 2; this follows because |''X'' − ''Y''| = ''X'' + ''Y'' − 2 min(''X'', ''Y'') for nonnegative values, so that MD is strictly less than twice the mean whenever the mean is positive.
 
== Compared to standard deviation ==
The mean difference is twice the [[L-scale]] (the second [[L-moment]]), while the standard deviation is the square root of the variance about the mean (the second conventional central moment). The differences between L-moments and conventional moments are first seen in comparing the mean difference and the standard deviation (the first L-moment and first conventional moment are both the mean).
 
Both the [[standard deviation]] and the mean difference measure dispersion—how spread out are the values of a population or the probabilities of a distribution. The mean difference is not defined in terms of a specific measure of central tendency, whereas the standard deviation is defined in terms of the deviation from the arithmetic mean. Because the standard deviation squares its differences, it tends to give more weight to larger differences and less weight to smaller differences compared to the mean difference. When the arithmetic mean is finite, the mean difference will also be finite, even when the standard deviation is infinite. See the [[#Examples|examples]] for some specific comparisons.
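
The contrast in weighting can be illustrated numerically; the following is a rough sketch assuming the [[NumPy]] library (the data values and the added outlier are purely illustrative):
<syntaxhighlight lang="python">
import numpy as np

def mean_difference(y):
    """Empirical mean difference: average of |y_i - y_j| over all ordered pairs."""
    y = np.asarray(y, dtype=float)
    return np.abs(y[:, None] - y[None, :]).mean()

base = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
with_outlier = np.append(base, 50.0)   # add one large value

# The outlier increases the standard deviation by a larger factor than the
# mean difference, reflecting the extra weight that squaring gives to large gaps.
print(np.std(base), mean_difference(base))                  # ~1.41 and 1.6
print(np.std(with_outlier), mean_difference(with_outlier))  # ~17.6 and ~14.2
</syntaxhighlight>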
 
The recently introduced [[distance standard deviation]] plays a similar role to the mean difference, but the distance standard deviation works with centered distances. See also [[E-statistic]]s.
 
== Sample estimators ==
For a random sample ''S'' from a random variable '''''X''''', consisting of ''n'' values ''y''<sub>''i''</sub>, the statistic
 
:<math>MD(S) = \frac{\sum_{i=1}^n \sum_{j=1}^n | y_i - y_j |}{n(n-1)}</math>
 
is a [[estimator#Consistency|consistent]] and [[estimator#Point estimators|unbiased]] [[estimator]] of MD('''''X'''''). The statistic:
:<math>RMD(S) = \frac{\sum_{i=1}^n \sum_{j=1}^n | y_i - y_j |}{(n-1)\sum_{i=1}^n y_i}</math>
is a [[estimator#Consistency|consistent]] [[estimator]] of RMD('''''X'''''), but is not, in general,  [[estimator#Point estimators|unbiased]].
 
Confidence intervals for RMD('''''X''''') can be calculated using bootstrap sampling techniques.
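
A minimal sketch of a percentile bootstrap for RMD, assuming the [[NumPy]] library (the exponential sample and the number of resamples are illustrative choices):
<syntaxhighlight lang="python">
import numpy as np

rng = np.random.default_rng(0)

def rmd(y):
    """Sample RMD: sum of |y_i - y_j| divided by (n - 1) times the sum of the values."""
    y = np.asarray(y, dtype=float)
    n = len(y)
    return np.abs(y[:, None] - y[None, :]).sum() / ((n - 1) * y.sum())

y = rng.exponential(scale=1.0, size=200)    # illustrative sample; the true RMD is 1
boot = np.array([rmd(rng.choice(y, size=len(y), replace=True))
                 for _ in range(2000)])     # recompute RMD on each bootstrap resample
lower, upper = np.percentile(boot, [2.5, 97.5])
print(rmd(y), (lower, upper))               # point estimate and 95% percentile interval
</syntaxhighlight>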
 
There does not exist, in general, an unbiased estimator for RMD('''''X'''''), in part because of the difficulty of finding an unbiased estimate of the reciprocal of the mean.  For example, even where the sample is known to be taken from a random variable '''''X'''''(''p'') for an unknown ''p'', and '''''X'''''(''p'') − 1 has the [[Bernoulli distribution]], so that Pr('''''X'''''(''p'') = 1) = 1&nbsp;−&nbsp;''p'' and {{nowrap|1=Pr('''''X'''''(''p'') = 2) = ''p''}}, then
 
:RMD('''''X'''''(''p'')) = 2''p''(1&nbsp;−&nbsp;''p'')/(1&nbsp;+&nbsp;''p'').
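
(Here the two independent draws differ with probability 2''p''(1&nbsp;−&nbsp;''p'') and differ by exactly 1 when they do, so MD('''''X'''''(''p'')) = 2''p''(1&nbsp;−&nbsp;''p''), while the mean of '''''X'''''(''p'') is 1&nbsp;+&nbsp;''p''.)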
 
But the expected value of any estimator ''R''('''''S''''') of RMD('''''X'''''(''p'')) will be of the form:{{Citation needed|date=October 2010}}
 
:<math>\operatorname{E}(R(S)) = \sum_{i=0}^n p^i (1-p)^{n-i} r_i ,</math>
 
where the ''r''<sub>''i''</sub> are constants.  So E(''R''('''''S''''')) is always a polynomial in ''p'', and since RMD('''''X'''''(''p'')) = 2''p''(1&nbsp;−&nbsp;''p'')/(1&nbsp;+&nbsp;''p'') is not a polynomial in ''p'', E(''R''('''''S''''')) can never equal RMD('''''X'''''(''p'')) for all ''p'' between 0 and 1.
 
== Examples ==
 
{| class='wikitable' style="text-align:center"
|+ Examples of Mean Difference and Relative Mean Difference
! Distribution !! Parameters !! Mean !! Standard Deviation !! Mean Difference !! Relative Mean Difference
|-
| [[Uniform distribution (continuous)|Continuous uniform]] || a = 0 ; b = 1 || 1 / 2 = 0.5 || <math>\frac{1}{\sqrt{12}}</math> ≈ 0.2887 || 1 / 3 ≈ 0.3333 || 2 / 3 ≈ 0.6667
|-
| [[Normal distribution|Normal]] || μ = 1 ; σ = 1 || 1 || 1 || <math>\frac{2}{\sqrt{\pi}}</math> ≈ 1.1284 || <math>\frac{2}{\sqrt{\pi}}</math> ≈ 1.1284
|-
| [[Exponential distribution|Exponential]] || λ = 1 || 1 || 1 || 1 || 1
|-
| [[Pareto distribution|Pareto]] || ''k'' > 1 ; ''x''<sub>m</sub>&nbsp;=&nbsp;1 || <math>\frac{k}{(k-1)}</math> || <math>\frac{1}{(k-1)}\,\sqrt{\frac{k}{(k-2)}}</math> (for&nbsp;''k''&nbsp;>&nbsp;2) || <math>\frac{2 k} {(k-1) (2k-1)} \,</math> || <math>\frac{2}{2k-1}\,</math>
|-
| [[Gamma distribution|Gamma]] || ''k'' ; θ || ''k''θ || <math>\sqrt{k}\,\theta</math> || ''k'' θ (2 − 4 ''I''<sub> 0.5 </sub>(''k''+1 , ''k'')) † || 2 − 4 ''I''<sub> 0.5 </sub>(''k''+1 , ''k'') †
|-
| [[Gamma distribution|Gamma]] || ''k'' = 1 ; θ&nbsp;=&nbsp;1 || 1 || 1 || 1 || 1
|-
| [[Gamma distribution|Gamma]] || ''k'' = 2 ; θ = 1 || 2 || <math>\sqrt{2}</math> ≈ 1.4142 || 3 / 2 = 1.5 || 3 / 4 = 0.75
|-
| [[Gamma distribution|Gamma]] || ''k'' = 3 ; θ = 1 || 3 || <math>\sqrt{3}</math> ≈ 1.7321 || 15 / 8 = 1.875 || 5 / 8 = 0.625
|-
| [[Gamma distribution|Gamma]] || ''k'' = 4 ; θ = 1 || 4 || 2 || 35 / 16 = 2.1875 || 35 / 64 = 0.546875
|-
| [[Bernoulli distribution|Bernoulli]] || 0 ≤ ''p'' ≤ 1 || ''p'' || <math>\sqrt{p(1-p)}</math> || 2 ''p'' (1&nbsp;−&nbsp;''p'') || 2 (1&nbsp;−&nbsp;''p'') for ''p'' &gt; 0
|-
| [[Student's t-distribution|Student's ''t'']], 2 [[degrees of freedom (statistics)|d.f.]] || ''ν'' = 2 || 0 || <math>\infty</math> || ''π'' / √2 ≈ 2.2214 || undefined
|}
:† ''I''<sub>''z''</sub>(''x'', ''y'') is the [[Beta function#Incomplete beta function|regularized incomplete beta function]]
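
The closed-form entries above can be checked by simulation; for instance, a Monte Carlo sketch of the normal row, assuming the [[NumPy]] library (the sample size and seed are arbitrary):
<syntaxhighlight lang="python">
import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(loc=1.0, scale=1.0, size=100_000)
y = rng.normal(loc=1.0, scale=1.0, size=100_000)
md_mc = np.mean(np.abs(x - y))    # Monte Carlo estimate of E|X - Y|
print(md_mc, 2 / np.sqrt(np.pi))  # both values should be close to 1.1284
</syntaxhighlight>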
 
==See also==
* [[Mean deviation]]
* [[Estimator]]
* [[Coefficient of variation]]
* [[L-moment]]
 
{{No footnotes|date=November 2010}}
 
== References ==
 
*{{Cite journal | author=Xu, Kuan | title=How Has the Literature on Gini's Index Evolved in the Past 80 Years? | publisher=Department of Economics, Dalhousie University | date=January 2004 | version= | url=http://economics.dal.ca/RePEc/dal/wparch/howgini.pdf | accessdate=2006-06-01 }}
*{{cite book | first=Corrado | last=Gini | year=1912 | title=Variabilità e Mutabilità | publisher=Tipografia di Paolo Cuppini | location=Bologna }}
*{{cite journal | author=Gini, Corrado | title=Measurement of Inequality and Incomes | journal=The Economic Journal | year=1921 | volume=31 | pages=124–126 | doi=10.2307/2223319 | issue=121 | publisher=The Economic Journal, Vol. 31, No. 121 | jstor=2223319}}
*{{cite book | first=S. R. | last=Chakravarty | year=1990 | title=Ethical Social Index Numbers | publisher=Springer-Verlag | location=New York }}
*{{cite journal | author=Mills, Jeffrey A.; Zandvakili, Sourushe | title=Statistical Inference via Bootstrapping for Measures of Inequality | journal=Journal of Applied Econometrics | year=1997 | volume=12 | pages=133–150 | doi=10.1002/(SICI)1099-1255(199703)12:2<133::AID-JAE433>3.0.CO;2-H | issue=2}}
*{{cite journal | author=Lomnicki, Z. A. | title=The Standard Error of Gini's Mean Difference | journal=Annals of Mathematical Statistics | year=1952 | volume=23 | pages=635–637 | doi=10.1214/aoms/1177729346 | issue=4 }}
*{{cite journal | author=Nair, U. S. | title=Standard Error  of Gini's Mean Difference | journal=Biometrika | year=1936 | volume=28 | pages=428–436}}
*{{cite journal | author=Yitzhaki, Shlomo |authorlink=Shlomo Yitzhaki (economics) | title=Gini's Mean difference: a superior measure of variability for non-normal distributions | url=ftp://metron.sta.uniroma1.it/RePEc/articoli/2003-2-285-316.pdf | journal=Metron - International Journal of Statistics | year=2003 | volume=61 | pages=285–316}}
 
{{DEFAULTSORT:Mean Difference}}
[[Category:Statistical deviation and dispersion]]
[[Category:Summary statistics]]
[[Category:Theory of probability distributions]]
[[Category:Scale statistics]]
