The '''Hartley function''' is a measure of uncertainty, introduced by [[Ralph Hartley]] in 1928. If we pick a sample from a finite set ''A'' uniformly at random, the information revealed after we know the outcome is given by the Hartley function
:<math> H_0(A) := \log_b \vert A \vert .</math>

If the [[base (exponentiation)|base]] of the [[logarithm]] is 2, then the uncertainty is measured in bits. If it is the [[natural logarithm]], then the unit is [[Nat (information)|nats]]. (Hartley himself used a base-ten logarithm, and this unit of information is sometimes called the '''[[Ban (information)|hartley]]''' in his honor.) It is also known as the Hartley entropy.
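
For example, if ''A'' has eight equally likely elements, then

:<math>H_0(A) = \log_2 8 = 3,</math>

so learning the outcome resolves three bits of uncertainty; equivalently, three yes/no questions suffice to identify the chosen element.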

== Hartley function, Shannon's entropy, and Rényi entropy ==

The Hartley function coincides with the [[Shannon entropy]] (as well as with the Rényi entropies of all orders) in the case of a uniform probability distribution. It is actually a special case of the [[Rényi entropy]] since:

:<math>H_0(X) = \frac 1 {1-0} \log \sum_{i=1}^{|X|} p_i^0 = \log |X|.</math>
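
The agreement with the Shannon entropy in the uniform case can also be checked directly: if each of the <math>|X|</math> outcomes has probability <math>1/|X|</math>, then

:<math>H(X) = -\sum_{i=1}^{|X|} \frac{1}{|X|} \log \frac{1}{|X|} = \log |X| = H_0(X).</math>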

But it can also be viewed as a primitive construction, since, as emphasized by Kolmogorov and Rényi (see George J. Klir's ''Uncertainty and Information'', p. 423), the Hartley function can be defined without introducing any notions of probability.

==Characterization of the Hartley function==
The Hartley function depends only on the number of elements in a set, and hence can be viewed as a function on natural numbers. Rényi showed that the Hartley function in base 2 is the only function mapping natural numbers to real numbers that satisfies

# <math>H(mn) = H(m)+H(n)</math> (additivity)
# <math>H(m) \leq H(m+1)</math> (monotonicity)
# <math>H(2)=1</math> (normalization)

Condition 1 says that the uncertainty of the Cartesian product of two finite sets ''A'' and ''B'' is the sum of the uncertainties of ''A'' and ''B''. Condition 2 says that a larger set has larger uncertainty.
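
For example, additivity and normalization alone already determine the values at powers of two:

:<math>H(4) = H(2 \cdot 2) = H(2) + H(2) = 2, \qquad H(8) = H(2 \cdot 4) = H(2) + H(4) = 3,</math>

in agreement with <math>\log_2 4 = 2</math> and <math>\log_2 8 = 3</math>.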

==Derivation of the Hartley function==
We want to show that the Hartley function, log<sub>2</sub>(''n''), is the only function mapping natural numbers to real numbers that satisfies

# <math>H(mn) = H(m)+H(n)\,</math> (additivity)
# <math>H(m) \leq H(m+1)\,</math> (monotonicity)
# <math>H(2)=1\,</math> (normalization)

Let ''ƒ'' be a function on positive integers that satisfies the above three properties. From the additive property, we can show that for any positive integers ''n'' and ''k'',

:<math>f(n^k) = kf(n).\,</math>
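
This follows by induction on ''k'': the case <math>k = 1</math> is immediate, and if the identity holds for <math>k-1</math>, then additivity gives

:<math>f(n^k) = f(n^{k-1} \cdot n) = f(n^{k-1}) + f(n) = (k-1)f(n) + f(n) = kf(n).</math>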

Let ''a'' and ''b'' be integers greater than 1, and let ''t'' be any positive integer. There is a unique integer ''s'' determined by

:<math>a^s \leq b^t < a^{s+1}. \qquad(1)</math>
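
Explicitly, ''s'' is the integer part of <math>t \log_a b</math>, that is, <math>s = \lfloor t \log_a b \rfloor</math>.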

Therefore,

:<math>s \log_2 a\leq t \log_2 b \leq (s+1) \log_2 a \, </math>

and

:<math>\frac{s}{t} \leq \frac{\log_2 b}{\log_2 a} \leq \frac{s+1}{t}.</math>

On the other hand, applying monotonicity to the inequalities in (1),

:<math>f(a^s) \leq f(b^t) \leq f(a^{s+1}). \, </math>

Using the identity <math>f(n^k) = kf(n)</math>, this becomes

:<math>s f(a) \leq t f(b) \leq (s+1) f(a),\,</math>

and, since <math>f(a) \geq f(2) = 1 > 0</math> by monotonicity and normalization,

:<math>\frac{s}{t} \leq \frac{f(b)}{f(a)} \leq \frac{s+1}{t}.</math>

Hence, since both ratios lie in the interval <math>[s/t, (s+1)/t]</math> of length <math>1/t</math>,

:<math>\Big\vert \frac{f(b)}{f(a)} - \frac{\log_2(b)}{\log_2(a)} \Big\vert \leq \frac{1}{t}.</math>

Since ''t'' can be arbitrarily large, the difference on the left-hand side of the above inequality must be zero,

:<math>\frac{f(b)}{f(a)} = \frac{\log_2(b)}{\log_2(a)}.</math>

So,

:<math>f(a) = \mu \log_2(a)\,</math>

for some constant ''μ'', which must be equal to 1 by the normalization property.
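
Indeed, taking ''a'' = 2 gives <math>1 = f(2) = \mu \log_2 2 = \mu</math>, so <math>f(a) = \log_2 a</math>, which is the Hartley function in base 2.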

==See also==

* [[Rényi entropy]]

{{PlanetMath attribution|id=6070|title=Hartley function}}
{{PlanetMath attribution|id=6082|title=Derivation of Hartley function}}

[[Category:Information theory]]