[[Inequality (mathematics)|Inequalities]] are very important in the study of [[information theory]]; they arise in a number of different contexts.


==Shannon-type inequalities==
   
Consider a finite collection of [[Discrete probability distribution|finitely (or at most countably) supported]] [[random variable]]s on the same [[probability space]]. For a collection of ''n'' random variables, there are 2<sup>''n''</sup>&nbsp;&minus;&nbsp;1 non-empty subsets for which joint entropies can be defined.  For example, when ''n''&nbsp;=&nbsp;2, we may consider the entropies <math>H(X_1),</math> <math>H(X_2),</math> and <math>H(X_1, X_2),</math> and express the following inequalities (which together characterize the range of the marginal and joint entropies of two random variables; a numerical check appears after the list):
* <math>H(X_1) \ge 0</math>
* <math>H(X_2) \ge 0</math>
* <math>H(X_1) \le H(X_1, X_2)</math>
* <math>H(X_2) \le H(X_1, X_2)</math>
* <math>H(X_1, X_2) \le H(X_1) + H(X_2).</math>
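
These five bounds are easy to verify numerically. Below is a minimal sketch in Python (assuming NumPy is available; the joint distribution is an arbitrary choice made up for illustration, not taken from any source):

<syntaxhighlight lang="python">
import numpy as np

def entropy(p):
    """Shannon entropy (in bits) of a probability vector, ignoring zero entries."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# An arbitrary joint distribution p(x1, x2) on a 2 x 3 alphabet (rows: X1, columns: X2).
p_joint = np.array([[0.10, 0.25, 0.15],
                    [0.30, 0.05, 0.15]])

H12 = entropy(p_joint.ravel())        # H(X1, X2)
H1  = entropy(p_joint.sum(axis=1))    # H(X1): marginalize out X2
H2  = entropy(p_joint.sum(axis=0))    # H(X2): marginalize out X1

# The five inequalities characterizing the entropies of two random variables.
assert H1 >= 0 and H2 >= 0
assert H1 <= H12 and H2 <= H12
assert H12 <= H1 + H2
print(H1, H2, H12)
</syntaxhighlight>

For this particular distribution all five inequalities hold strictly; equality in the last one would require <math>X_1</math> and <math>X_2</math> to be independent.
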
In fact, these can all be expressed as special cases of a single inequality involving the '''[[conditional mutual information]]''', namely
:<math>I(A;B|C) \ge 0,</math>
where <math>A</math>, <math>B</math>, and <math>C</math> each denote the joint distribution of some arbitrary (possibly empty) subset of our collection of random variables.  Inequalities that can be derived from this are known as '''Shannon-type''' inequalities.  More formally (following the notation of Yeung<ref>{{cite journal |first=R.W. |last=Yeung |title=A framework for linear information inequalities |journal=IEEE Transactions on Information Theory |location=New York |year=1997 |volume=43 |issue=6 |pages=1924–1934 |doi=10.1109/18.641556 }}</ref>), define <math>\Gamma^*_n</math> to be the set of all ''constructible'' points in <math>\mathbb R^{2^n-1},</math> where a point is said to be constructible if and only if there is a joint, discrete distribution of ''n'' random variables such that each coordinate of that point, indexed by a non-empty subset of {1,&nbsp;2,&nbsp;...,&nbsp;''n''}, is equal to the joint entropy of the corresponding subset of the ''n'' random variables.  The [[Closure (topology)|closure]] of <math>\Gamma^*_n</math> is denoted <math>\overline{\Gamma^*_n}.</math>
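
To make the notion of a constructible point concrete, the following sketch (again a hypothetical Python/NumPy illustration with an arbitrarily chosen joint distribution) lists the 2<sup>3</sup>&nbsp;&minus;&nbsp;1&nbsp;=&nbsp;7 joint entropies of three random variables, one coordinate for each non-empty subset of {1,&nbsp;2,&nbsp;3}, and then evaluates one conditional mutual information, which is non-negative as required:

<syntaxhighlight lang="python">
import itertools
import numpy as np

def entropy(p):
    """Shannon entropy (in bits) of an array of probabilities."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# Arbitrary joint pmf p(x1, x2, x3) on a 2 x 2 x 2 alphabet (illustrative only).
p = np.random.default_rng(0).dirichlet(np.ones(8)).reshape(2, 2, 2)

def joint_entropy(subset):
    """Entropy of the subcollection of variables indexed by `subset` (1-based)."""
    drop = tuple(i for i in range(3) if (i + 1) not in subset)
    return entropy(p.sum(axis=drop).ravel())

# The constructible point in R^(2^3 - 1) determined by this distribution.
point = {s: joint_entropy(s)
         for r in range(1, 4)
         for s in itertools.combinations((1, 2, 3), r)}
for subset, h in point.items():
    print(subset, round(h, 4))

# I(X1;X2|X3) = H(X1,X3) + H(X2,X3) - H(X1,X2,X3) - H(X3) >= 0.
cmi = point[(1, 3)] + point[(2, 3)] - point[(1, 2, 3)] - point[(3,)]
assert cmi >= -1e-12
print("I(X1;X2|X3) =", round(cmi, 4))
</syntaxhighlight>
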

The '''[[Cone (linear algebra)|cone]]''' in <math>\mathbb R^{2^n-1}</math> characterized by all Shannon-type inequalities among ''n'' random variables is denoted <math>\Gamma_n.</math> In general
:<math>\Gamma^*_n \subseteq \overline{\Gamma^*_n} \subseteq \Gamma_n.</math>
Software has been developed to automate the task of proving such inequalities.<ref>{{cite journal |first=R.W. |last=Yeung |first2=Y.O. |last2=Yan |title=ITIP - Information Theoretic Inequality Prover |year=1996 |url=http://user-www.ie.cuhk.edu.hk/~ITIP }}</ref><ref>{{cite journal |first=R. |last=Pulikkoonattu |first2=E. |last2=Perron |first3=S. |last3=Diggavi |title=Xitip - Information Theoretic Inequalities Prover |year=2007 |url=http://xitip.epfl.ch/ }}</ref> Given an inequality, such software is able to determine whether the region defined by the inequality contains the cone <math>\Gamma_n,</math> in which case the inequality is guaranteed to hold, since <math>\Gamma^*_n \subseteq \Gamma_n.</math>
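
The verification step can be phrased as a small linear program: a linear inequality holds on all of <math>\Gamma_n</math> exactly when minimizing its left-hand side over the cone cut out by the elemental Shannon inequalities cannot go below zero. The sketch below shows the idea for ''n''&nbsp;=&nbsp;2, where the cone is cut out by <math>H(X_1|X_2) \ge 0,</math> <math>H(X_2|X_1) \ge 0,</math> and <math>I(X_1;X_2) \ge 0</math> (a toy illustration only, assuming SciPy is available; it is not the actual ITIP or Xitip implementation):

<syntaxhighlight lang="python">
import numpy as np
from scipy.optimize import linprog

# Coordinates of a point in R^(2^2 - 1): h = (H(X1), H(X2), H(X1,X2)).
# Elemental Shannon inequalities for n = 2, written as G @ h >= 0:
#   H(X1|X2) >= 0,  H(X2|X1) >= 0,  I(X1;X2) >= 0.
G = np.array([[ 0.0, -1.0,  1.0],
              [-1.0,  0.0,  1.0],
              [ 1.0,  1.0, -1.0]])

def is_shannon_type(b):
    """True if b @ h >= 0 on the whole cone Gamma_2 (the minimum over the cone is 0)."""
    res = linprog(c=b, A_ub=-G, b_ub=np.zeros(3), bounds=(None, None))
    return res.status == 0 and res.fun > -1e-9   # status 3 would mean unbounded below

# H(X1,X2) <= H(X1) + H(X2)  <=>  (1, 1, -1) @ h >= 0 : a Shannon-type inequality.
print(is_shannon_type(np.array([1.0, 1.0, -1.0])))   # True
# H(X1,X2) >= H(X1) + H(X2)  <=>  (-1, -1, 1) @ h >= 0 : fails on Gamma_2.
print(is_shannon_type(np.array([-1.0, -1.0, 1.0])))  # False
</syntaxhighlight>

Provers such as ITIP rest on the same linear-programming principle, applied to the full set of elemental inequalities for general ''n''.
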


==Non-Shannon-type inequalities==
Other, less trivial inequalities have been discovered among the entropies and joint entropies of four or more random variables, which cannot be derived from Shannon's basic inequalities.  These are known as '''non-Shannon-type''' inequalities.  In 1997 and 1998, Zhang and Yeung reported two non-Shannon-type inequalities.<ref>{{cite journal |first=Z. |last=Zhang |first2=R. W. |last2=Yeung |title=A non-Shannon-type conditional inequality of information quantities |journal=IEEE Transactions on Information Theory |location=New York |year=1997 |volume=43 |issue=6 |pages=1982–1986 |doi=10.1109/18.641561 }}</ref><ref>{{cite journal |first=Z. |last=Zhang |first2=R. W. |last2=Yeung |title=On characterization of entropy function via information inequalities |journal=IEEE Transactions on Information Theory |location=New York |year=1998 |volume=44 |issue=4 |pages=1440–1452 |doi=10.1109/18.681320 }}</ref>  The latter implies that
:<math> \overline{\Gamma^*_n} \subset \Gamma_n,</math>
where the inclusion is strict for <math>n \ge 4.</math> The two sets above are, in fact, [[convex cone]]s.
 
Further non-Shannon-type inequalities have since been reported.<ref>{{cite journal |first=F. |last=Matus |title=Conditional independences among four random variables III: Final conclusion |journal=Combinatorics, Probability and Computing |volume=8 |issue=3 |pages=269–276 |year=1999 }}</ref><ref>{{cite journal |first=K. |last=Makarychev |last2=''et al.'' |title=A new class of non-Shannon-type inequalities for entropies |journal=Communications in Information and Systems |volume=2 |issue=2 |pages=147–166 |year=2002 |url=http://www.cs.princeton.edu/~ymakaryc/papers/nonshann.pdf }}</ref><ref>{{cite journal |first=Z. |last=Zhang |title=On a new non-Shannon-type information inequality |journal=Communications in Information and Systems |volume=3 |issue=1 |pages=47–60 |year=2003 }}</ref>  Dougherty et al.<ref>{{cite conference |first=R. |last=Dougherty |last2=''et al.'' |title=Six new non-Shannon information inequalities |conference=2006 IEEE International Symposium on Information Theory |year=2006 }}</ref> found a number of non-Shannon-type inequalities by computer search.  Matus<ref>{{cite conference |first=F. |last=Matus |title=Infinitely many information inequalities |conference=2007 IEEE International Symposium on Information Theory |year=2007 }}</ref> proved the existence of infinitely many linear non-Shannon-type inequalities.
 
==Lower bounds for the Kullback–Leibler divergence==
Many important inequalities in information theory are in fact lower bounds for the [[Kullback–Leibler divergence]].  Even the Shannon-type inequalities can be considered part of this category, since the bivariate [[mutual information]] can be expressed as the Kullback–Leibler divergence of the joint distribution with respect to the product of the marginals, and thus these inequalities can be seen as a special case of [[Gibbs' inequality]].
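
As a concrete instance of this viewpoint, the following sketch (a Python/NumPy illustration with an arbitrarily chosen joint distribution) computes the mutual information of two discrete variables directly as the Kullback–Leibler divergence between the joint distribution and the product of its marginals; Gibbs' inequality guarantees a non-negative result:

<syntaxhighlight lang="python">
import numpy as np

def kl_divergence(p, q):
    """D_KL(P || Q) in bits for discrete distributions, assuming q > 0 wherever p > 0."""
    mask = p > 0
    return np.sum(p[mask] * np.log2(p[mask] / q[mask]))

# Arbitrary joint distribution of (X, Y) on a 2 x 3 alphabet (illustrative only).
p_xy = np.array([[0.20, 0.10, 0.15],
                 [0.05, 0.30, 0.20]])
p_x = p_xy.sum(axis=1, keepdims=True)   # marginal of X
p_y = p_xy.sum(axis=0, keepdims=True)   # marginal of Y

# I(X;Y) = D_KL( p(x,y) || p(x) p(y) ) >= 0 by Gibbs' inequality.
mi = kl_divergence(p_xy.ravel(), (p_x * p_y).ravel())
print("I(X;Y) =", round(mi, 4), "bits")
assert mi >= 0
</syntaxhighlight>
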
 
On the other hand, it seems to be much more difficult to derive useful upper bounds for the Kullback–Leibler divergence.  This is because the Kullback–Leibler divergence ''D''<sub>''KL''</sub>(''P''||''Q'') depends very sensitively on events that are very rare in the reference distribution ''Q''.  ''D''<sub>''KL''</sub>(''P''||''Q'') increases without bound as an event of finite non-zero probability in the distribution ''P'' becomes exceedingly rare in the reference distribution ''Q'', and in fact ''D''<sub>''KL''</sub>(''P''||''Q'') is not even defined if an event of non-zero probability in ''P'' has zero probability in ''Q''.  (Hence the requirement that ''P'' be absolutely continuous with respect to ''Q''.)
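
This blow-up is easy to observe numerically. In the short sketch below (Python, with a made-up two-outcome example), ''P'' assigns probability 0.5 to an outcome that becomes ever rarer under ''Q'', and the divergence grows without bound:

<syntaxhighlight lang="python">
import numpy as np

def kl_divergence(p, q):
    """D_KL(P || Q) in bits; assumes q > 0 wherever p > 0."""
    mask = p > 0
    return np.sum(p[mask] * np.log2(p[mask] / q[mask]))

p = np.array([0.5, 0.5])
for eps in [1e-1, 1e-3, 1e-6, 1e-9]:
    q = np.array([1 - eps, eps])        # the second outcome becomes very rare under Q
    print(f"Q(rare event) = {eps:.0e}  ->  D_KL(P||Q) = {kl_divergence(p, q):.2f} bits")
</syntaxhighlight>
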
 
===Gibbs' inequality===
{{main|Gibbs' inequality}}
This fundamental inequality states that the [[Kullback–Leibler divergence]] is non-negative.
 
===Kullback's inequality===
{{main|Kullback's inequality}}
Another inequality concerning the Kullback–Leibler divergence is known as '''Kullback's inequality'''.<ref>{{cite journal |first=Aimé |last=Fuchs |first2=Giorgio |last2=Letta |title={{lang|fr|L'inégalité de Kullback. Application à la théorie de l'estimation}} |journal=Séminaire de probabilités |location=Strasbourg |volume=4 |issue= |pages=108–131 |year=1970 |mr=267669 }}</ref>  If ''P'' and ''Q'' are [[probability distribution]]s on the real line with ''P'' [[Absolute continuity|absolutely continuous]] with respect to ''Q,'' and whose first moments exist, then
:<math>D_{KL}(P\|Q) \ge \Psi_Q^*(\mu'_1(P)),</math>
where <math>\Psi_Q^*</math> is the [[large deviations theory|large deviations]] '''[[rate function]]''', i.e. the [[convex conjugate]] of the [[cumulant]]-generating function, of ''Q'', and <math>\mu'_1(P)</math> is the first [[Moment (mathematics)|moment]] of ''P''.
 
The [[Cramér–Rao bound]] is a corollary of this result.
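
As a concrete check, take ''Q'' to be the standard normal distribution, whose cumulant-generating function is <math>\Psi_Q(t)=t^2/2</math> and whose rate function is therefore <math>\Psi_Q^*(\mu)=\mu^2/2.</math> The sketch below (a Python illustration relying on the standard closed form for the divergence between two normal distributions) compares <math>D_{KL}(P\|Q)</math> with this bound when ''P'' is also normal; equality occurs exactly when ''P'' has unit variance:

<syntaxhighlight lang="python">
import numpy as np

def kl_normal_vs_standard(mu, sigma2):
    """D_KL( N(mu, sigma2) || N(0, 1) ) in nats, from the standard closed form."""
    return 0.5 * (sigma2 + mu**2 - 1.0 - np.log(sigma2))

def rate_function_standard_normal(mu):
    """Legendre transform of the cumulant-generating function t^2/2 of N(0, 1)."""
    return 0.5 * mu**2

for mu in [0.0, 0.5, 2.0]:
    for sigma2 in [0.25, 1.0, 4.0]:
        d = kl_normal_vs_standard(mu, sigma2)
        bound = rate_function_standard_normal(mu)
        assert d >= bound - 1e-12       # Kullback's inequality
        print(f"mu={mu:4.1f} sigma^2={sigma2:4.2f}  D_KL={d:.4f} >= {bound:.4f}")
</syntaxhighlight>
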
 
===Pinsker's inequality===
{{main|Pinsker's inequality}}
 
Pinsker's inequality relates [[Kullback&ndash;Leibler divergence]] and [[total variation distance]].  It states that if ''P'', ''Q'' are two [[probability distribution]]s, then
 
: <math>\sqrt{\frac{1}{2}D_{KL}^{(e)}(P\|Q)} \ge \sup \{ |P(A) - Q(A)| : A\text{ is an event to which probabilities are assigned} \},</math>
 
where
 
: <math>D_{KL}^{(e)}(P\|Q)</math>
 
is the Kullback&ndash;Leibler divergence in [[Nat (information)|nats]] and
 
: <math> \sup_A |P(A) - Q(A)| \, </math>
 
is the total variation distance.
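
The sketch below (a Python illustration with two arbitrarily chosen distributions on a four-letter alphabet) checks the inequality numerically; for distributions on a finite set, the supremum over events equals half the ''L''<sub>1</sub> distance between the probability vectors:

<syntaxhighlight lang="python">
import numpy as np

def kl_nats(p, q):
    """D_KL(P || Q) in nats; assumes q > 0 wherever p > 0."""
    mask = p > 0
    return np.sum(p[mask] * np.log(p[mask] / q[mask]))

# Two arbitrary distributions on a four-letter alphabet (illustrative only).
p = np.array([0.40, 0.30, 0.20, 0.10])
q = np.array([0.25, 0.25, 0.25, 0.25])

total_variation = 0.5 * np.sum(np.abs(p - q))   # equals sup_A |P(A) - Q(A)| for finite alphabets
pinsker_bound = np.sqrt(0.5 * kl_nats(p, q))

assert pinsker_bound >= total_variation
print(f"sqrt(D_KL/2) = {pinsker_bound:.4f} >= TV = {total_variation:.4f}")
</syntaxhighlight>
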
 
==Other inequalities==
 
===Hirschman uncertainty===
{{main|Hirschman uncertainty}}
 
In 1957,<ref>{{cite journal |first=I. I. |last=Hirschman |title=A Note on Entropy |journal=[[American Journal of Mathematics]] |year=1957 |volume=79 |issue=1 |pages=152–156 |jstor=2372390 |doi=10.2307/2372390}}</ref> Hirschman showed that for a (reasonably well-behaved) function <math>f:\mathbb R \rightarrow \mathbb C</math> such that <math>\int_{-\infty}^\infty |f(x)|^2\,dx = 1,</math> and its [[Fourier transform]] <math>g(y)=\int_{-\infty}^\infty f(x) e^{-2 \pi i x y}\,dx,</math> the sum of the [[differential entropy|differential entropies]] of <math>|f|^2</math> and <math>|g|^2</math> is non-negative, i.e.
:<math>-\int_{-\infty}^\infty |f(x)|^2 \log |f(x)|^2 \,dx -\int_{-\infty}^\infty |g(y)|^2 \log |g(y)|^2 \,dy \ge 0.</math>
Hirschman conjectured, and it was later proved,<ref>{{cite journal |first=W. |last=Beckner |title=Inequalities in Fourier Analysis |journal=[[Annals of Mathematics]] |volume=102 |issue=6 |pages=159–182 |year=1975 |jstor=1970980 |doi=10.2307/1970980}}</ref> that a sharper bound of <math>\log(e/2),</math> which is attained in the case of a [[Normal distribution|Gaussian distribution]], could replace the right-hand side of this inequality. This is especially significant since it implies, and is stronger than, Weyl's formulation of Heisenberg's [[uncertainty principle]].
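
As a sketch of the extremal case, take the normalized Gaussian <math>f(x) = (2a)^{1/4} e^{-\pi a x^2}</math> for any <math>a>0.</math> Its Fourier transform is <math>g(y) = (2a)^{1/4} a^{-1/2} e^{-\pi y^2/a},</math> so <math>|f|^2</math> and <math>|g|^2</math> are centered normal densities with variances <math>1/(4\pi a)</math> and <math>a/(4\pi)</math> respectively, and their differential entropies sum to
:<math>\tfrac12\log\frac{2\pi e}{4\pi a}+\tfrac12\log\frac{2\pi e\,a}{4\pi}=\log\frac{e}{2},</math>
so the sharper bound is attained for every value of <math>a.</math>
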
 
===Tao's inequality===
Given discrete random variables <math>X</math>, <math>Y</math>, and <math>Y'</math>, such that <math>X</math> takes values only in the interval [&minus;1,&nbsp;1] and  <math>Y'</math> is determined by <math>Y</math> (so that <math>H(Y'|Y)=0</math>), we have<ref>{{cite journal |authorlink=Terence Tao |first=T. |last=Tao |title=Szemerédi's regularity lemma revisited |journal=Contrib. Discrete Math. |volume=1 |year=2006 |issue= |pages=8–28 |doi= |arxiv=math/0504472 }}</ref><ref>{{cite journal |authorlink=Rudolf Ahlswede |first=Rudolf |last=Ahlswede |title=The final form of Tao's inequality relating conditional expectation and conditional mutual information |journal=Advances in Mathematics of Communications |volume=1 |issue=2 |year=2007 |pages=239–242 |doi=10.3934/amc.2007.1.239 }}</ref>
 
:<math>\mathbb E \big( \big| \mathbb E(X|Y') - \mathbb E(X|Y) \big| \big)
    \le \sqrt { 2 \log 2 \, I(X;Y|Y') },</math>
 
relating the conditional expectation to the [[conditional mutual information]].  This is a simple consequence of [[Pinsker's inequality]]. (Note: the correction factor log&nbsp;2 inside the radical arises because we are measuring the conditional mutual information in [[bit]]s rather than [[Nat (information)|nats]].)
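
A small numerical check is sketched below (a hypothetical Python illustration; the joint distribution of <math>(X, Y)</math> and the coarsening map defining <math>Y'</math> are arbitrary choices). Both sides of the inequality are evaluated by direct enumeration, with the conditional mutual information measured in bits as in the statement above:

<syntaxhighlight lang="python">
import numpy as np

def entropy_bits(p):
    """Shannon entropy in bits of a probability vector."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def coarsen(y):
    """Deterministic map defining Y' = g(Y), so that H(Y'|Y) = 0."""
    return 0 if y == 0 else 1

# Arbitrary joint distribution of (X, Y): rows are X in {-1, +1}, columns are Y in {0, 1, 2}.
x_vals = np.array([-1.0, 1.0])
p_xy = np.array([[0.10, 0.20, 0.15],
                 [0.25, 0.05, 0.25]])

p_y = p_xy.sum(axis=0)                       # p(y)
p_x_given_y = p_xy / p_y                     # p(x|y); columns sum to 1
e_x_given_y = x_vals @ p_x_given_y           # E(X | Y = y) for y = 0, 1, 2

# Collapse the columns of p_xy according to Y' to get p(x, y') and p(y').
p_xyp = np.stack([p_xy[:, [y for y in range(3) if coarsen(y) == v]].sum(axis=1)
                  for v in (0, 1)], axis=1)
p_yp = p_xyp.sum(axis=0)
e_x_given_yp = x_vals @ (p_xyp / p_yp)       # E(X | Y' = y')

# Left-hand side: E| E(X|Y') - E(X|Y) |  (both conditional expectations are functions of Y).
lhs = sum(p_y[y] * abs(e_x_given_yp[coarsen(y)] - e_x_given_y[y]) for y in range(3))

# I(X;Y|Y') = H(X|Y') - H(X|Y) in bits, because Y' is determined by Y.
h_x_given_y = sum(p_y[y] * entropy_bits(p_x_given_y[:, y]) for y in range(3))
h_x_given_yp = sum(p_yp[v] * entropy_bits(p_xyp[:, v] / p_yp[v]) for v in (0, 1))
cmi_bits = h_x_given_yp - h_x_given_y

rhs = np.sqrt(2 * np.log(2) * cmi_bits)      # sqrt(2 log 2 * I(X;Y|Y')), with I in bits
assert lhs <= rhs + 1e-12
print(f"E|E(X|Y') - E(X|Y)| = {lhs:.4f} <= {rhs:.4f}")
</syntaxhighlight>
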
 
==See also==
*[[Cramér–Rao bound]]
*[[Entropy power inequality]]
*[[Fano's inequality]]
*[[Jensen's inequality]]
*[[Kraft inequality]]
*[[Pinsker's inequality]]
*[[Multivariate mutual information]]
 
==References==
<references/>
 
==External links==
* Thomas M. Cover, Joy A. Thomas. ''Elements of Information Theory'', Chapter 16, "Inequalities in Information Theory". John Wiley & Sons, Inc., 1991. Print ISBN 0-471-06259-6; Online ISBN 0-471-20061-1. [http://www.matf.bg.ac.rs/nastavno/viktor/Inequalities_in_Information_Theory.pdf pdf]
* Amir Dembo, Thomas M. Cover, Joy A. Thomas. ''Information Theoretic Inequalities.'' IEEE Transactions on Information Theory, Vol. 37, No. 6, November 1991. [http://www.stanford.edu/~cover/papers/dembo_cover_thomas_91.pdf pdf]
 
{{DEFAULTSORT:Inequalities In Information Theory}}
[[Category:Inequalities| ]]
[[Category:Entropy and information]]
[[Category:Information theory]]
