In [[statistics]], '''Yates' correction for continuity''' (or '''Yates' chi-squared test''') is used in certain situations when testing for [[independence (probability theory)|independence]] in a [[contingency table]]. In some cases, Yates' correction may adjust too far, and so its current use is limited.


==Correction for approximation error==
Using the [[chi-squared distribution]] to interpret [[Pearson's chi-squared test|Pearson's chi-squared statistic]] requires one to assume that the [[Discrete probability distribution|discrete]] probability of observed [[binomial distribution|binomial frequencies]] in the table can be approximated by the continuous [[chi-squared distribution]]. This assumption is not quite correct, and introduces some error.
 
To reduce the error in approximation, [[Frank Yates]], an [[England|English]] [[statistician]], suggested a correction for continuity that adjusts the formula for [[Pearson's chi-squared test]] by subtracting 0.5 from the absolute difference between each observed value and its expected value in a 2&nbsp;&times;&nbsp;2 contingency table.<ref name=Yates>[[Frank Yates|Yates, F.]] (1934). "Contingency tables involving small numbers and the χ<sup>2</sup> test". ''Supplement to the [[Journal of the Royal Statistical Society]]'' '''1'''(2): 217&ndash;235. {{jstor|2983604}}</ref> This reduces the chi-squared value obtained and thus increases its [[p-value]].
 
The effect of Yates' correction is to prevent overestimation of statistical significance for small data sets. This formula is chiefly used when at least one cell of the table has an expected count smaller than 5. Unfortunately, Yates' correction may tend to overcorrect, producing an overly conservative result that fails to reject the [[null hypothesis]] when it should (a [[type II error]]). It has therefore been suggested that Yates' correction is unnecessary even with quite low sample sizes,<ref name=Sokal1981>Sokal, R.R., Rohlf, F.J. (1981). ''Biometry: The Principles and Practice of Statistics in Biological Research.'' Oxford: W.H. Freeman. ISBN 0-7167-1254-7.</ref> such as a total sample size of:
 
:<math> \sum_{i=1}^N O_i = 20 \, </math>
 
The following is Yates' corrected version of [[Pearson's chi-squared test|Pearson's chi-squared statistic]]:
 
:<math> \chi_\text{Yates}^2 = \sum_{i=1}^{N} {(|O_i - E_i| - 0.5)^2 \over E_i}</math>
 
where:
 
:''O<sub>i</sub>'' = an observed frequency
:''E<sub>i</sub>'' = an expected (theoretical) frequency, asserted by the null hypothesis
:''N'' = number of distinct events
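
Below is a minimal Python sketch, not part of the original article, that evaluates this statistic directly from the definition; the function name <code>yates_chi_squared</code> and the example counts are illustrative choices only.

<syntaxhighlight lang="python">
# Illustrative sketch: Yates-corrected chi-squared statistic computed
# cell by cell from the definition above.

def yates_chi_squared(observed, expected):
    """Sum of (|O_i - E_i| - 0.5)^2 / E_i over all cells."""
    return sum((abs(o - e) - 0.5) ** 2 / e for o, e in zip(observed, expected))

# A 2 x 2 table flattened in the order a, b, c, d, with expected counts
# under independence (row total * column total / grand total).
observed = [21, 15, 3, 11]
row_totals = [observed[0] + observed[1], observed[2] + observed[3]]
col_totals = [observed[0] + observed[2], observed[1] + observed[3]]
grand_total = sum(observed)
expected = [row_totals[i // 2] * col_totals[i % 2] / grand_total
            for i in range(4)]

print(round(yates_chi_squared(observed, expected), 3))  # 4.121
</syntaxhighlight>

For reference, SciPy's <code>scipy.stats.chi2_contingency</code> applies Yates' correction to 2&nbsp;&times;&nbsp;2 tables by default (its <code>correction</code> argument), so it can serve as a cross-check for a hand-rolled calculation like the one above.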
 
== 2 &times; 2 table ==
 
As a short-cut, for a 2&nbsp;×&nbsp;2 table with the following entries:
 
{| class="wikitable"
! &nbsp; !! S !! F !! &nbsp;
|-
! A
| ''a'' || ''b'' || ''N''<sub>A</sub>
|-
! B
| ''c'' || ''d'' || ''N''<sub>B</sub>
|-
! &nbsp;
| ''N''<sub>S</sub> || ''N''<sub>F</sub> || ''N''
|}
 
we can write
 
: <math>\chi_\text{Yates}^2 = \frac{N(|ad - bc| - N/2)^2}{N_S N_F N_A N_B}.</math>
 
When |''ad'' &minus; ''bc''| is smaller than ''N''/2, the subtraction overshoots zero, and squaring the negative remainder yields a spuriously positive statistic. Clamping the corrected difference at zero avoids this:
 
: <math>\chi_\text{Yates}^2 = \frac{N( \max(0, |ad - bc| - N/2) )^2}{N_S N_F N_A N_B}.</math>
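
The following sketch, again not from the article, evaluates the short-cut and its clamped variant; the helper name <code>yates_2x2</code> and the example counts are arbitrary.

<syntaxhighlight lang="python">
# Illustrative sketch: the 2 x 2 short-cut formula and its clamped variant.

def yates_2x2(a, b, c, d, clamp=False):
    n = a + b + c + d
    n_a, n_b = a + b, c + d      # row totals
    n_s, n_f = a + c, b + d      # column totals
    diff = abs(a * d - b * c) - n / 2
    if clamp:
        diff = max(0.0, diff)    # do not let the correction overshoot zero
    return n * diff ** 2 / (n_s * n_f * n_a * n_b)

print(round(yates_2x2(21, 15, 3, 11), 3))           # 4.121, matches the cell-by-cell sum
print(round(yates_2x2(6, 5, 5, 4), 3))              # 0.165, spurious: |ad - bc| < N/2
print(round(yates_2x2(6, 5, 5, 4, clamp=True), 3))  # 0.0 with the clamped form
</syntaxhighlight>

The clamped call returns zero because the corrected difference would otherwise be negative, as discussed above.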
 
== See also ==
 
* [[Continuity correction]]
* [[Binomial_proportion_confidence_interval#Wilson_score_interval_with_continuity_correction|Wilson score interval with continuity correction]]
 
==References==
 
{{reflist}}
 
{{DEFAULTSORT:Yates' Correction For Continuity}}
[[Category:Categorical data]]
[[Category:Statistical tests]]
