<?xml version="1.0"?>
<feed xmlns="http://www.w3.org/2005/Atom" xml:lang="en">
	<id>https://en.formulasearchengine.com/index.php?action=history&amp;feed=atom&amp;title=Integer-valued_function</id>
	<title>Integer-valued function - Revision history</title>
	<link rel="self" type="application/atom+xml" href="https://en.formulasearchengine.com/index.php?action=history&amp;feed=atom&amp;title=Integer-valued_function"/>
	<link rel="alternate" type="text/html" href="https://en.formulasearchengine.com/index.php?title=Integer-valued_function&amp;action=history"/>
	<updated>2026-05-13T04:39:41Z</updated>
	<subtitle>Revision history for this page on the wiki</subtitle>
	<generator>MediaWiki 1.43.0-wmf.28</generator>
	<entry>
		<id>https://en.formulasearchengine.com/index.php?title=Integer-valued_function&amp;diff=29936&amp;oldid=prev</id>
		<title>en&gt;Gandalf61: Add examples</title>
		<link rel="alternate" type="text/html" href="https://en.formulasearchengine.com/index.php?title=Integer-valued_function&amp;diff=29936&amp;oldid=prev"/>
		<updated>2013-07-03T12:42:22Z</updated>

		<summary type="html">&lt;p&gt;Add examples&lt;/p&gt;
&lt;p&gt;&lt;b&gt;New page&lt;/b&gt;&lt;/p&gt;&lt;div&gt;In [[graph theory]], the &amp;#039;&amp;#039;&amp;#039;Shannon capacity of a graph&amp;#039;&amp;#039;&amp;#039; is a [[graph invariant]] defined from the number of [[independent set (graph theory)|independent set]]s of [[strong graph product]]s. It measures the [[Shannon capacity]] of a [[communications channel]] defined from the graph, and is upper bounded by the [[Lovász number]], which can be computed in [[polynomial time]]. However, the [[computational complexity]] of the Shannon capacity itself remains unknown.&lt;br /&gt;
&lt;br /&gt;
==Definition==&lt;br /&gt;
The Shannon capacity &amp;lt;ref name=&amp;quot;sh56&amp;quot;&amp;gt;{{citation | first = Claude E. | last = Shannon | title = The Zero-Error Capacity of a Noisy Channel | journal = IRE Trans. Inform. Th. | volume = 2 |pages = 8–19 | year = 1956 }}. Reprinted in &amp;lt;em&amp;gt;Key papers in the development of information theory&amp;lt;/em&amp;gt; (D. Slepian, Ed.), New York: IEEE Press, 1974, 112–123.&amp;lt;/ref&amp;gt; of a graph &amp;#039;&amp;#039;G&amp;#039;&amp;#039; is defined as&lt;br /&gt;
:&amp;lt;math&amp;gt;&lt;br /&gt;
 \Theta(G)&lt;br /&gt;
 = \sup_k \sqrt[k]{\alpha(G^k)}&lt;br /&gt;
 = \lim_{k \rightarrow \infty} \sqrt[k]{\alpha(G^k)},&lt;br /&gt;
&amp;lt;/math&amp;gt;&lt;br /&gt;
&lt;br /&gt;
where α(&amp;#039;&amp;#039;G&amp;#039;&amp;#039;) is the [[independence number]] of [[graph (mathematics)|graph]] &amp;#039;&amp;#039;G&amp;#039;&amp;#039; (the size of a largest [[independent set (graph theory)|independent set]] of &amp;#039;&amp;#039;G&amp;#039;&amp;#039;) and &amp;#039;&amp;#039;G&amp;#039;&amp;#039;&amp;lt;sup&amp;gt;k&amp;lt;/sup&amp;gt; is the [[strong graph product]] of &amp;#039;&amp;#039;G&amp;#039;&amp;#039; with itself &amp;#039;&amp;#039;k&amp;#039;&amp;#039; times.&lt;br /&gt;
&lt;br /&gt;
This quantity measures the [[Shannon capacity]] of a communication channel in which the vertices of &amp;#039;&amp;#039;G&amp;#039;&amp;#039; represent distinct symbols that may be transmitted on the channel in each time step, and the edges of &amp;#039;&amp;#039;G&amp;#039;&amp;#039; represent pairs of symbols that may be confused with each other. A [[code]] of length &amp;#039;&amp;#039;k&amp;#039;&amp;#039; using these symbols has no two mutually confusable codewords if and only if it corresponds to an [[independent set (graph theory)|independent set]] of &amp;#039;&amp;#039;G&amp;lt;sup&amp;gt;k&amp;lt;/sup&amp;gt;&amp;#039;&amp;#039;, so the Shannon capacity of the graph represents the effective number of distinct symbols that may be used per time step, for sufficiently long codes.&lt;br /&gt;
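As an illustration (not part of the original article; the helper functions below are hypothetical and run in exponential time), the lower bounds obtained from finitely many powers can be checked by brute force for a small graph such as the 5-cycle:&lt;br /&gt;

```python
# Brute-force sketch (hypothetical helpers, exponential time): compute
# alpha(G^k) for small k and take k-th roots to lower-bound Theta(G).
from itertools import combinations, product

def has_edge(edges, u, v):
    return (u, v) in edges or (v, u) in edges

def strong_power(edges, n, k):
    """k-fold strong product of G: two distinct k-tuples are adjacent
    iff every coordinate pair is equal or adjacent in G."""
    verts = list(product(range(n), repeat=k))
    new_edges = set()
    for a, b in combinations(verts, 2):
        if all(x == y or has_edge(edges, x, y) for x, y in zip(a, b)):
            new_edges.add((a, b))
    return new_edges, verts

def independence_number(edges, verts):
    """Size of a largest independent set, found by exhaustive search."""
    best = 0
    for r in range(1, len(verts) + 1):
        found = any(
            not any(has_edge(edges, a, b) for a, b in combinations(s, 2))
            for s in combinations(verts, r))
        if not found:
            break        # no independent set of size r, so none larger
        best = r
    return best

# The 5-cycle C5: vertex i adjacent to i +/- 1 (mod 5).
c5 = {(i, (i + 1) % 5) for i in range(5)}
a1 = independence_number(c5, list(range(5)))       # alpha(C5)   = 2
a2 = independence_number(*strong_power(c5, 5, 2))  # alpha(C5^2) = 5
```

Here a1 = 2 gives only Θ(&amp;#039;&amp;#039;C&amp;#039;&amp;#039;&amp;lt;sub&amp;gt;5&amp;lt;/sub&amp;gt;) ≥ 2, while a2 = 5 improves this to Θ(&amp;#039;&amp;#039;C&amp;#039;&amp;#039;&amp;lt;sub&amp;gt;5&amp;lt;/sub&amp;gt;) ≥ {{sqrt|5}} ≈ 2.236; since α(&amp;#039;&amp;#039;C&amp;#039;&amp;#039;&amp;lt;sub&amp;gt;5&amp;lt;/sub&amp;gt;&amp;lt;sup&amp;gt;2&amp;lt;/sup&amp;gt;) = 5 exceeds α(&amp;#039;&amp;#039;C&amp;#039;&amp;#039;&amp;lt;sub&amp;gt;5&amp;lt;/sub&amp;gt;)&amp;lt;sup&amp;gt;2&amp;lt;/sup&amp;gt; = 4, the supremum over &amp;#039;&amp;#039;k&amp;#039;&amp;#039; genuinely matters.&lt;br /&gt;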
&lt;br /&gt;
==Example==&lt;br /&gt;
[[File:Cycle graph C5.png|thumb|100px|Pentagon graph]]&lt;br /&gt;
Let &amp;#039;&amp;#039;C&amp;#039;&amp;#039;&amp;lt;sub&amp;gt;5&amp;lt;/sub&amp;gt; be the graph of the [[pentagon]]. Using its five vertices as symbols, there exists a code of length 2 for this graph consisting of the five codewords &amp;quot;11&amp;quot;, &amp;quot;23&amp;quot;, &amp;quot;35&amp;quot;, &amp;quot;54&amp;quot;, &amp;quot;42&amp;quot;. Every two of these words differ, in at least one position, by a pair of non-adjacent symbols: for instance, for the two words &amp;quot;11&amp;quot; and &amp;quot;23&amp;quot;, the symbols in the second position, &amp;quot;1&amp;quot; and &amp;quot;3&amp;quot;, are non-adjacent. Therefore, these words form an independent set in &amp;#039;&amp;#039;C&amp;#039;&amp;#039;&amp;lt;sub&amp;gt;5&amp;lt;/sub&amp;gt;&amp;lt;sup&amp;gt;2&amp;lt;/sup&amp;gt;, showing that &amp;amp;Theta;(&amp;#039;&amp;#039;C&amp;#039;&amp;#039;&amp;lt;sub&amp;gt;5&amp;lt;/sub&amp;gt;)&amp;amp;nbsp;≥&amp;amp;nbsp;{{sqrt|5}}. But as {{harvtxt|Lovász|1979}} showed, this bound is tight:&lt;br /&gt;
Θ(&amp;#039;&amp;#039;C&amp;#039;&amp;#039;&amp;lt;sub&amp;gt;5&amp;lt;/sub&amp;gt;)&amp;amp;nbsp;=&amp;amp;nbsp;{{sqrt|5}}.&lt;br /&gt;
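That these five codewords are pairwise non-confusable can be checked directly; the following sketch (not from the article) hard-codes the pentagon adjacency:&lt;br /&gt;

```python
# Sketch: check that the five codewords above are pairwise
# non-confusable on C5 (symbols 1..5, symbol i adjacent to i +/- 1 mod 5).
from itertools import combinations

def adjacent(u, v):
    return (u - v) % 5 in (1, 4)    # neighbours on the 5-cycle

def confusable(w1, w2):
    # Two words can be confused iff in every position the
    # symbols are equal or adjacent on the cycle.
    return all(a == b or adjacent(a, b) for a, b in zip(w1, w2))

words = [(1, 1), (2, 3), (3, 5), (5, 4), (4, 2)]
ok = not any(confusable(a, b) for a, b in combinations(words, 2))
```

Since no two of the words are confusable, they form an independent set of size 5 in the square of the pentagon, as claimed above.&lt;br /&gt;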
&lt;br /&gt;
==Relation to Lovász number==&lt;br /&gt;
The [[Lovász number]] ϑ(&amp;#039;&amp;#039;G&amp;#039;&amp;#039;) is a different graph invariant that can be computed numerically to high accuracy in [[polynomial time]] by an algorithm based on the [[ellipsoid method]]. The Shannon capacity of a graph &amp;#039;&amp;#039;G&amp;#039;&amp;#039; is bounded from below by α(&amp;#039;&amp;#039;G&amp;#039;&amp;#039;), and from above by ϑ(&amp;#039;&amp;#039;G&amp;#039;&amp;#039;).&amp;lt;ref name=&amp;quot;l79&amp;quot;&amp;gt;{{citation | first = László | last = Lovász | authorlink = László Lovász | title = On the Shannon Capacity of a Graph | journal = [[IEEE Transactions on Information Theory]] | volume = IT-25 | issue = 1 | year = 1979}}.&amp;lt;/ref&amp;gt; In some cases, ϑ(&amp;#039;&amp;#039;G&amp;#039;&amp;#039;) and the Shannon capacity coincide; for instance, for the graph of a [[pentagon]], both are equal to {{sqrt|5}}. However, there exist other graphs for which the Shannon capacity and the Lovász number differ.&amp;lt;ref name=&amp;quot;h79&amp;quot;&amp;gt;{{citation | first = Willem H. | last = Haemers | title = On Some Problems of Lovász Concerning the Shannon Capacity of a Graph | journal = IEEE Transactions on Information Theory | volume = 25 |pages = 231–232  | year = 1979 | url = http://arno.uvt.nl/show.cgi?fid=80311}}.&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
==Haemers&amp;#039; bound==&lt;br /&gt;
Haemers provided another bound on the Shannon capacity, which is sometimes better than the Lovász bound:&amp;lt;ref&amp;gt;{{citation | first = Willem H. | last = Haemers | title = An upper bound for the Shannon capacity of a graph | journal = Colloquia Mathematica Societatis János Bolyai | volume = 25 | pages = 267–272 | year = 1978 | url = http://arno.uvt.nl/show.cgi?fid=80314}}. The definition here corrects a typo in this paper.&amp;lt;/ref&amp;gt;&lt;br /&gt;
:&amp;lt;math&amp;gt;&lt;br /&gt;
 \Theta(G) \leq R(G) = \min_{B} \operatorname{rank}(B),&lt;br /&gt;
&amp;lt;/math&amp;gt;&lt;br /&gt;
where &amp;#039;&amp;#039;B&amp;#039;&amp;#039; is an &amp;#039;&amp;#039;n&amp;#039;&amp;#039;&amp;amp;nbsp;&amp;amp;times;&amp;amp;nbsp;&amp;#039;&amp;#039;n&amp;#039;&amp;#039; matrix over some [[field (mathematics)|field]], such that &amp;#039;&amp;#039;b&amp;lt;sub&amp;gt;ii&amp;lt;/sub&amp;gt;&amp;#039;&amp;#039;&amp;amp;nbsp;≠&amp;amp;nbsp;0 and &amp;#039;&amp;#039;b&amp;lt;sub&amp;gt;ij&amp;lt;/sub&amp;gt;&amp;#039;&amp;#039;&amp;amp;nbsp;=&amp;amp;nbsp;0 if vertices &amp;#039;&amp;#039;i&amp;#039;&amp;#039; and &amp;#039;&amp;#039;j&amp;#039;&amp;#039; are not adjacent.&lt;br /&gt;
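As a sketch of how the bound is applied (not from the article): any single matrix fitting this pattern certifies an upper bound equal to its rank; the hard part is minimising the rank over all fitting matrices and all fields. For the pentagon, the natural candidate &amp;#039;&amp;#039;B&amp;#039;&amp;#039; = &amp;#039;&amp;#039;I&amp;#039;&amp;#039; + &amp;#039;&amp;#039;A&amp;#039;&amp;#039; over the rationals (an illustrative choice, not Haemers&amp;#039; construction) only recovers the trivial bound &amp;#039;&amp;#039;n&amp;#039;&amp;#039; = 5:&lt;br /&gt;

```python
# Sketch: one fitting matrix for Haemers' bound on C5. Any B with a
# nonzero diagonal and zeros on non-adjacent pairs certifies that
# Theta(G) is at most rank(B); minimising over all fitting matrices
# (and all fields) is the hard part of computing the bound.
from fractions import Fraction

def rank(M):
    """Rank over the rationals by exact Gaussian elimination."""
    M = [[Fraction(x) for x in row] for row in M]
    r = 0
    for c in range(len(M[0])):
        pivot = next((i for i in range(r, len(M)) if M[i][c] != 0), None)
        if pivot is None:
            continue
        M[r], M[pivot] = M[pivot], M[r]
        for i in range(len(M)):
            if i != r and M[i][c] != 0:
                f = M[i][c] / M[r][c]
                M[i] = [a - f * b for a, b in zip(M[i], M[r])]
        r += 1
    return r

n = 5
# Adjacency matrix of the 5-cycle, then B = I + A.
A = [[1 if (i - j) % n in (1, n - 1) else 0 for j in range(n)] for i in range(n)]
B = [[(1 if i == j else 0) + A[i][j] for j in range(n)] for i in range(n)]

# B fits the pattern: nonzero diagonal, zero on every non-edge.
fits = all(B[i][j] == 0 for i in range(n) for j in range(n)
           if i != j and A[i][j] == 0)
haemers_bound = rank(B)   # 5 here: this B only gives the trivial bound n
```
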
&lt;br /&gt;
==Computational complexity==&lt;br /&gt;
The [[computational complexity]] of the Shannon capacity is unknown, and even the value of the Shannon capacity for certain small graphs such as &amp;#039;&amp;#039;C&amp;#039;&amp;#039;&amp;lt;sub&amp;gt;7&amp;lt;/sub&amp;gt; (a [[cycle graph]] of seven vertices) remains unknown.&amp;lt;ref&amp;gt;{{citation|title=Rough problems|work=Gödel&amp;#039;s Lost Letter and P=NP|first=Kenneth W.|last=Regan|date=July 10, 2013|url=http://rjlipton.wordpress.com/2013/07/10/rough-problems/}}.&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
A natural approach to this problem would be to compute a finite number of powers of the given graph &amp;#039;&amp;#039;G&amp;#039;&amp;#039;, find their independence numbers, and infer from these numbers some information about the limiting behavior of the sequence from which the Shannon capacity is defined. However, even ignoring the computational difficulty of computing the independence numbers of these graphs (an [[NP-hard]] problem), the unpredictable behavior of the sequence of independence numbers of powers of &amp;#039;&amp;#039;G&amp;#039;&amp;#039; implies that this approach cannot be used to accurately approximate the Shannon capacity.&amp;lt;ref&amp;gt;{{citation|title=The Shannon capacity of a graph and the independence numbers of its powers|arxiv=cs/0608021|first1=Noga|last1=Alon|author1-link=Noga Alon|first2=Eyal|last2=Lubetzky|journal=IEEE Transactions on Information Theory|volume=52|year=2006|pages=2172–2176}}.&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
==References==&lt;br /&gt;
{{reflist}}&lt;br /&gt;
&lt;br /&gt;
[[Category:Graph invariants]]&lt;br /&gt;
[[Category:Information theory]]&lt;/div&gt;</summary>
		<author><name>en&gt;Gandalf61</name></author>
	</entry>
</feed>