<?xml version="1.0"?>
<feed xmlns="http://www.w3.org/2005/Atom" xml:lang="en">
	<id>https://en.formulasearchengine.com/index.php?action=history&amp;feed=atom&amp;title=Complementary_monopoly</id>
	<title>Complementary monopoly - Revision history</title>
	<link rel="self" type="application/atom+xml" href="https://en.formulasearchengine.com/index.php?action=history&amp;feed=atom&amp;title=Complementary_monopoly"/>
	<link rel="alternate" type="text/html" href="https://en.formulasearchengine.com/index.php?title=Complementary_monopoly&amp;action=history"/>
	<updated>2026-04-10T04:42:22Z</updated>
	<subtitle>Revision history for this page on the wiki</subtitle>
	<generator>MediaWiki 1.43.0-wmf.28</generator>
	<entry>
		<id>https://en.formulasearchengine.com/index.php?title=Complementary_monopoly&amp;diff=18128&amp;oldid=prev</id>
		<title>en&gt;Mild Bill Hiccup: spelling</title>
		<link rel="alternate" type="text/html" href="https://en.formulasearchengine.com/index.php?title=Complementary_monopoly&amp;diff=18128&amp;oldid=prev"/>
		<updated>2012-06-03T19:02:26Z</updated>

		<summary type="html">&lt;p&gt;spelling&lt;/p&gt;
&lt;p&gt;&lt;b&gt;New page&lt;/b&gt;&lt;/p&gt;&lt;div&gt;In [[statistics]], the &amp;#039;&amp;#039;&amp;#039;Chapman–Robbins bound&amp;#039;&amp;#039;&amp;#039; or &amp;#039;&amp;#039;&amp;#039;Hammersley–Chapman–Robbins bound&amp;#039;&amp;#039;&amp;#039; is a lower bound on the [[variance]] of [[estimator]]s of a deterministic parameter. It is a generalization of the [[Cramér–Rao bound]]; compared to the Cramér–Rao bound, it is both tighter and applicable to a wider range of problems. However, it is usually more difficult to compute.&lt;br /&gt;
&lt;br /&gt;
The bound was independently discovered by [[John Hammersley]] in 1950,&amp;lt;ref&amp;gt;{{Citation&lt;br /&gt;
  | last = Hammersley   | first = J. M. |authorlink=John Hammersley&lt;br /&gt;
  | title = On estimating restricted parameters&lt;br /&gt;
  | journal = [[Journal of the Royal Statistical Society]], Series B&lt;br /&gt;
  | volume = 12 | issue = 2 | pages = 192–240 | year = 1950&lt;br /&gt;
  | mr = 40631&lt;br /&gt;
  | jstor = 2983981&lt;br /&gt;
}}&amp;lt;/ref&amp;gt; and by Douglas Chapman and [[Herbert Robbins]] in 1951.&amp;lt;ref&amp;gt;{{Citation&lt;br /&gt;
  | last = Chapman | first =  D. G.&lt;br /&gt;
  | last2 = Robbins | first2 = H. | author2-link = Herbert Robbins&lt;br /&gt;
  | title = Minimum variance estimation without regularity assumptions&lt;br /&gt;
  | journal = [[Annals of Mathematical Statistics]]&lt;br /&gt;
  | volume = 22 | issue =4 | pages =581–586 | year =1951&lt;br /&gt;
  | doi = 10.1214/aoms/1177729548&lt;br /&gt;
  | mr = 44084&lt;br /&gt;
  | jstor = 2236927&lt;br /&gt;
}}&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Statement ==&lt;br /&gt;
Let {{nowrap|&amp;#039;&amp;#039;θ&amp;#039;&amp;#039; ∈ &amp;#039;&amp;#039;&amp;#039;R&amp;#039;&amp;#039;&amp;#039;&amp;lt;sup&amp;gt;&amp;#039;&amp;#039;n&amp;#039;&amp;#039;&amp;lt;/sup&amp;gt;}} be an unknown, deterministic parameter, and let {{nowrap|&amp;#039;&amp;#039;X&amp;#039;&amp;#039; ∈ &amp;#039;&amp;#039;&amp;#039;R&amp;#039;&amp;#039;&amp;#039;&amp;lt;sup&amp;gt;&amp;#039;&amp;#039;k&amp;#039;&amp;#039;&amp;lt;/sup&amp;gt;}} be a random variable, interpreted as a measurement of &amp;#039;&amp;#039;θ&amp;#039;&amp;#039;. Suppose the [[probability density function]] of &amp;#039;&amp;#039;X&amp;#039;&amp;#039; is given by &amp;#039;&amp;#039;p&amp;#039;&amp;#039;(&amp;#039;&amp;#039;x&amp;#039;&amp;#039;; &amp;#039;&amp;#039;θ&amp;#039;&amp;#039;). It is assumed that &amp;#039;&amp;#039;p&amp;#039;&amp;#039;(&amp;#039;&amp;#039;x&amp;#039;&amp;#039;; &amp;#039;&amp;#039;θ&amp;#039;&amp;#039;) is well-defined and that {{nowrap|&amp;#039;&amp;#039;p&amp;#039;&amp;#039;(&amp;#039;&amp;#039;x&amp;#039;&amp;#039;; &amp;#039;&amp;#039;θ&amp;#039;&amp;#039;) &amp;gt; 0}} for all values of &amp;#039;&amp;#039;x&amp;#039;&amp;#039; and &amp;#039;&amp;#039;θ&amp;#039;&amp;#039;.&lt;br /&gt;
&lt;br /&gt;
Suppose &amp;#039;&amp;#039;δ&amp;#039;&amp;#039;(&amp;#039;&amp;#039;X&amp;#039;&amp;#039;) is an [[bias (statistics)|unbiased]] estimate of an arbitrary scalar function {{nowrap|&amp;#039;&amp;#039;g&amp;#039;&amp;#039;:&amp;amp;thinsp;&amp;#039;&amp;#039;&amp;#039;R&amp;#039;&amp;#039;&amp;#039;&amp;lt;sup&amp;gt;&amp;#039;&amp;#039;n&amp;#039;&amp;#039;&amp;lt;/sup&amp;gt; → &amp;#039;&amp;#039;&amp;#039;R&amp;#039;&amp;#039;&amp;#039;}} of &amp;#039;&amp;#039;θ&amp;#039;&amp;#039;, i.e., &lt;br /&gt;
&lt;br /&gt;
:&amp;lt;math&amp;gt;E\{\delta(X)\} = g(\theta)\text{ for all }\theta.\,&amp;lt;/math&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The Chapman–Robbins bound then states that&lt;br /&gt;
&lt;br /&gt;
:&amp;lt;math&amp;gt;\mathrm{Var}(\delta(X)) \ge \sup_{\Delta \neq 0} \frac{\left[ g(\theta+\Delta) - g(\theta) \right]^2}{E_{\theta} \left[ \left( \tfrac{p(X;\theta+\Delta)}{p(X;\theta)} - 1 \right)^2 \right]}.&amp;lt;/math&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Note that the denominator in the lower bound above is exactly the [[F-divergence#Instances_of_f-divergences|&amp;lt;math&amp;gt; \chi^2&amp;lt;/math&amp;gt;-divergence]] of &amp;lt;math&amp;gt; p(\cdot; \theta+\Delta)&amp;lt;/math&amp;gt; with respect to &amp;lt;math&amp;gt; p(\cdot; \theta)&amp;lt;/math&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
== Relation to Cramér–Rao bound ==&lt;br /&gt;
The Chapman–Robbins bound converges to the [[Cramér–Rao bound]] when {{nowrap|Δ → 0}}, assuming the regularity conditions of the Cramér–Rao bound hold. Since the supremum over Δ is at least as large as this limiting value, it follows that, when both bounds exist, the Chapman–Robbins bound is always at least as tight as the Cramér–Rao bound; in many cases, it is substantially tighter.&lt;br /&gt;
&lt;br /&gt;
The Chapman–Robbins bound also holds under much weaker regularity conditions. For example, no assumption is made regarding differentiability of the probability density function &amp;#039;&amp;#039;p&amp;#039;&amp;#039;(&amp;#039;&amp;#039;x&amp;#039;&amp;#039;; &amp;#039;&amp;#039;θ&amp;#039;&amp;#039;). When &amp;#039;&amp;#039;p&amp;#039;&amp;#039;(&amp;#039;&amp;#039;x&amp;#039;&amp;#039;; &amp;#039;&amp;#039;θ&amp;#039;&amp;#039;) is non-differentiable, the [[Fisher information]] is not defined, and hence the Cramér–Rao bound does not exist.&lt;br /&gt;
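A worked illustration of this point (added here as an assumption, following the standard textbook case rather than anything stated in the article): for a single observation X ~ Uniform(0, θ) with g(θ) = θ, the support depends on θ, the Fisher information is undefined, and the Cramér–Rao bound does not exist, yet the Chapman–Robbins bound still applies:

```latex
% Assumed example: X ~ Uniform(0, theta), g(theta) = theta.
% For Delta in (-theta, 0), the likelihood ratio equals theta/(theta+Delta)
% on (0, theta+Delta) and 0 on (theta+Delta, theta), so
E_\theta\!\left[\Bigl(\tfrac{p(X;\theta+\Delta)}{p(X;\theta)} - 1\Bigr)^{2}\right]
  = \frac{-\Delta}{\theta+\Delta},
\qquad
\mathrm{Var}(\delta(X)) \ge \sup_{\Delta\in(-\theta,\,0)}
  \frac{\Delta^{2}(\theta+\Delta)}{-\Delta}
  = \sup_{\Delta\in(-\theta,\,0)} |\Delta|\,(\theta-|\Delta|)
  = \frac{\theta^{2}}{4}.
```

The supremum is attained at Δ = −θ/2, giving a nontrivial variance bound in a setting where the Cramér–Rao machinery is unavailable.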
&lt;br /&gt;
== See also ==&lt;br /&gt;
* [[Cramér–Rao bound]]&lt;br /&gt;
* [[Estimation theory]]&lt;br /&gt;
&lt;br /&gt;
== References ==&lt;br /&gt;
{{reflist}}&lt;br /&gt;
&lt;br /&gt;
== Further reading ==&lt;br /&gt;
* {{Citation&lt;br /&gt;
  | last = Lehmann&lt;br /&gt;
  | first = E. L.&lt;br /&gt;
  | last2 = Casella | first2 = G.&lt;br /&gt;
  | title = Theory of Point Estimation&lt;br /&gt;
  | year = 1998&lt;br /&gt;
  | publisher = Springer&lt;br /&gt;
  | isbn = 0-387-98502-6&lt;br /&gt;
  | edition = 2nd&lt;br /&gt;
  | pages = 113–114 }}&lt;br /&gt;
&lt;br /&gt;
{{DEFAULTSORT:Chapman-Robbins bound}}&lt;br /&gt;
[[Category:Statistical inequalities]]&lt;br /&gt;
[[Category:Estimation theory]]&lt;/div&gt;</summary>
		<author><name>en&gt;Mild Bill Hiccup</name></author>
	</entry>
</feed>