In [[computational complexity theory]], the '''Sipser–Lautemann theorem''' or '''Sipser–Gács–Lautemann theorem''' states that [[Bounded-error probabilistic polynomial|bounded-error probabilistic polynomial]] (BPP) time is contained in the [[Polynomial hierarchy|polynomial-time hierarchy]], and more specifically in Σ<sub>2</sub> ∩ Π<sub>2</sub>.

In 1983, [[Michael Sipser]] showed that BPP is contained in the [[Polynomial hierarchy|polynomial-time hierarchy]]. [[Péter Gács]] showed that BPP is in fact contained in Σ<sub>2</sub> ∩ Π<sub>2</sub>, and [[Clemens Lautemann]] contributed a simple proof of this containment, also in 1983. It is conjectured that in fact BPP = [[P (complexity)|P]], which is a much stronger statement than the Sipser–Lautemann theorem.

== Proof ==

[[Michael Sipser|Michael Sipser's]] version of the proof works as follows. Without loss of generality, a machine ''M'' ∈ BPP with error ≤ 2<sup>−|''x''|</sup> can be chosen, since any BPP machine can be amplified to reduce its error probability exponentially. The basic idea of the proof is to define a Σ<sub>2</sub> sentence that is equivalent to stating that ''x'' is in the language ''L'' decided by ''M'', using a set of transforms of the random inputs.
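
The amplification step can be illustrated numerically. The following is a minimal sketch; the base error 1/3 and the repetition counts are invented example parameters, not values from the proof.

```python
from math import comb

# Illustrative amplification sketch: exact error of k-fold majority voting
# for a hypothetical base algorithm that errs with probability 1/3.
# (The base error and the values of k are invented example parameters.)

def majority_error(p, k):
    """Probability that a majority of k independent runs (k odd) are wrong,
    when each run is wrong independently with probability p."""
    return sum(comb(k, i) * p**i * (1 - p)**(k - i)
               for i in range((k + 1) // 2, k + 1))

# The error probability falls exponentially in k.
for k in (1, 5, 21, 51):
    print(k, majority_error(1/3, k))
```

Running the loop shows the error dropping rapidly as ''k'' grows, which is why the error ≤ 2<sup>−|''x''|</sup> assumption is harmless.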

Since the output of ''M'' depends on its random input as well as on ''x'', it is useful to define the set of random strings that produce the correct output: ''A''(''x'') = {''r'' | ''M''(''x'',''r'') accepts}. The key to the proof is to note that when ''x'' ∈ ''L'', ''A''(''x'') is very large, and when ''x'' ∉ ''L'', ''A''(''x'') is very small. Using [[Exclusive disjunction|bitwise parity]], ⊕, a set of transforms can be defined as ''A''(''x'') ⊕ ''t'' = {''r'' ⊕ ''t'' | ''r'' ∈ ''A''(''x'')}. The first main lemma of the proof shows that the union of a small finite number of these transforms covers the entire space of random input strings. Using this fact, a Σ<sub>2</sub> sentence and a Π<sub>2</sub> sentence can be generated that are true if and only if ''x'' ∈ ''L'' (see the corollary below).
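
To make the transform concrete, here is a small illustrative sketch; the accepting set and translation vector are invented for the example, not derived from any actual machine.

```python
# Illustrative sketch only: the accepting set A and the translation vector
# t below are invented for the example, not derived from any machine M.

def xor_translate(A, t):
    """Return the transform A XOR t = {r XOR t : r in A}."""
    return {r ^ t for r in A}

# Hypothetical accepting set A(x) over the 3-bit random strings 0..7:
# all strings whose high bit is 0.
A = {0b000, 0b001, 0b010, 0b011}
t = 0b100
shifted = xor_translate(A, t)
print(sorted(shifted))              # [4, 5, 6, 7]

# Here A together with this single translate already covers the space.
print(sorted(A | shifted) == list(range(8)))  # True
```

Note that XOR with a fixed vector is a bijection on bit strings, so each translate has the same size as ''A''(''x'') itself.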

=== Lemma 1 ===

The general idea of Lemma 1 is to prove that if ''A''(''x'') covers a large part of the random space <math>R = \{ 0,1 \}^{|r|}</math>, then there exists a small set of translations that will cover the entire random space. In more mathematical language:

:''If'' <math>\frac{|A(x)|}{|R|} > 1 - \frac{1}{2^{|x|}}</math>, ''then'' <math>\exists t_1,t_2,\ldots,t_{|r|}</math>, ''where'' <math>t_i \in \{ 0,1 \}^{|r|}</math>, ''such that'' <math> \bigcup_i A(x) \oplus t_i = R.</math>

'''Proof.''' Randomly pick ''t''<sub>1</sub>, ''t''<sub>2</sub>, ..., ''t''<sub>|''r''|</sub>. Let <math>S=\bigcup_i A(x)\oplus t_i</math> (the union of all transforms of ''A''(''x'')).

Then, for any fixed ''r'' in ''R'', over the random choice of the ''t''<sub>''i''</sub>,
| :<math>\Pr [r \notin S] = \Pr [r \notin A(x) \oplus t_1] \cdot \Pr [r \notin A(x) \oplus t_2] \cdots \Pr [r \notin A(x) \oplus t_{|r|}] \le { \frac{1}{2^{|x| \cdot |r|}} }.</math>

The probability that there exists at least one element of ''R'' not in ''S'' is

:<math>\Pr \Bigl[ \bigvee_i (r_i \notin S)\Bigr] \le \sum_i \frac{1}{2^{|x| \cdot |r|}} \le \frac{1}{2^{|x|}} < 1.</math>

Therefore
| :<math>\Pr [S = R] \ge 1 - \frac{1}{2^{|x|}}.</math>

Thus there is a selection of values <math>t_1,t_2,\ldots,t_{|r|}</math> such that
| :<math> \bigcup_i A(x) \oplus t_i = R.</math>
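
Lemma 1 can also be checked empirically. The following Monte-Carlo sketch uses arbitrarily chosen toy parameters (an 8-bit space with three strings removed from a hypothetical accepting set); it is an illustration, not part of the proof.

```python
import random

# Monte-Carlo illustration of Lemma 1 (toy parameters, invented for this
# sketch): when A covers all but a tiny part of R = {0,1}^m, a family of m
# uniformly random XOR-translates of A almost surely covers all of R.

def covers(A, ts, m):
    """True iff the union of A XOR t over t in ts equals {0,1}^m."""
    covered = set()
    for t in ts:
        covered |= {r ^ t for r in A}
    return len(covered) == 2 ** m

m = 8
A = set(range(2 ** m)) - {3, 77, 200}   # hypothetical "large" accepting set

random.seed(0)                          # reproducible experiment
trials = 200
hits = sum(covers(A, [random.randrange(2 ** m) for _ in range(m)], m)
           for _ in range(trials))
print(hits, "of", trials, "random translate families covered all of R")
```

With these parameters the failure probability per trial is bounded by the union-bound argument above, so essentially every trial succeeds.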

=== Lemma 2 ===

The previous lemma shows that ''A''(''x'') can cover every possible point in the space using a small set of translations when ''x'' ∈ ''L''. Complementary to this, when ''x'' ∉ ''L'', only a small fraction of the space is covered by ''A''(''x''), so no small set of translation vectors ''t<sub>i</sub>'' can satisfy
| :<math>R = \bigcup_i A(x) \oplus t_i</math>

that is, ''R'' cannot be written as the union of the accepting random strings ''A''(''x'') exclusive-or'd with vectors ''t<sub>i</sub>''. Formally:

:<math>\frac{|A(x)|}{|R|} \le \frac{1}{2^{|x|}} \implies \neg\, \exists t_1,t_2,\dots,t_{|r|} : \bigcup_i A(x) \oplus t_i = R,</math>

since the |''r''| translates can cover at most <math>|r| \cdot |A(x)| \le |r| \cdot |R| / 2^{|x|} < |R|</math> strings (|''r''| being polynomial in |''x''|).

=== Corollary ===

An important corollary of the lemmas is that the result of the proof can be expressed as a Σ<sub>2</sub> expression, as follows.

:<math>x \in L \iff \exists t_1,t_2,\dots,t_{|r|}\, \forall r \in R \bigvee_{ 1 \le i \le |r|} (M(x, r \oplus t_i) \text{ accepts}).</math>

That is, ''x'' is in the language ''L'' if and only if there exist |''r''| binary vectors ''t<sub>i</sub>'' such that, for every random bit vector ''r'', the machine ''M'' accepts on at least one shifted string ''r'' ⊕ ''t<sub>i</sub>''.

The above expression is in [[Polynomial hierarchy|Σ<sub>2</sub>]] because it is first existentially and then universally quantified. Therefore BPP ⊆ Σ<sub>2</sub>. Because BPP is closed under [[Complement (complexity)|complement]], this proves BPP ⊆ Σ<sub>2</sub> ∩ Π<sub>2</sub>.

== Lautemann's proof ==

Here we present the proof (due to Lautemann) that BPP ⊆ Σ<sub>2</sub>. See Trevisan's notes for more information.

=== Lemma 3 ===

Based on the definition of BPP, we show the following:

If ''L'' is in BPP then there is an algorithm ''A'' such that for every ''x'',
| :<math>{\rm Pr}_r(A(x,r) = \mbox{right answer}) \ge 1 - \frac{1}{3m},</math>

where ''m'' is the number of random bits, <math>|r| = m = |x|^{O(1)}</math>, and ''A'' runs in time <math>|x|^{O(1)}</math>.

'''Proof:''' Let ''A{{'}}'' be a BPP algorithm for ''L''. For every ''x'', <math>\Pr_r(A'(x,r) = \mbox{wrong answer}) \le 1/3</math>. ''A{{'}}'' uses ''m{{'}}''(''n'') random bits, where ''n'' = |''x''|.

Run ''k''(''n'') independent repetitions of ''A{{'}}'' and accept if and only if at least ''k''(''n'')/2 of the executions accept; call this new algorithm ''A''. Then ''A'' uses ''k''(''n'')''m{{'}}''(''n'') random bits and, by a [[Chernoff bound]],
| :<math>{\rm Pr}_r(A(x,r) = \mbox{wrong answer}) \le 2^{-ck(n)}.</math>

We can then choose ''k''(''n'') with <math>k(n)=\Theta (\log m'(n))</math> such that
| :<math>\frac{1}{2^{ck(n)}} \le \frac{1}{3k(n)m'(n)}.</math>
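
As a numerical illustration, one can compute the smallest ''k'' satisfying this inequality and observe its logarithmic growth. The constant ''c'' below is an arbitrary placeholder for the Chernoff-bound constant, not a value derived in the lemma.

```python
# Numerical illustration of choosing k(n) in Lemma 3. The Chernoff-bound
# constant c below is an arbitrary placeholder, not a value from the lemma.
C = 0.1

def repetitions_needed(m_prime, c=C):
    """Smallest k with 2^(-c*k) <= 1 / (3 * k * m_prime)."""
    k = 1
    while 2.0 ** (-c * k) > 1.0 / (3 * k * m_prime):
        k += 1
    return k

# k grows roughly logarithmically in m', matching k(n) = Theta(log m'(n)).
for mp in (10, 100, 1000, 10000):
    print(mp, repetitions_needed(mp))
```

Multiplying ''m′'' by 1000 only adds an additive term to the required ''k'', consistent with the Θ(log ''m′''(''n'')) bound.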

=== Theorem 1 ===

'''Proof:''' Let ''L'' be in BPP and let ''A'' be as in Lemma 3. We want to show
| :<math>x \in L \iff \exists y_1,\dots,y_m \in \{0,1\}^m\, \forall z \in \{0,1\}^m \bigvee_{i=1}^mA(x,y_i \oplus z)=1.</math>

where ''m'' is the number of random bits used by ''A'' on input ''x''. First suppose <math>x \in L</math>. Then

:<math>\begin{align}
{\rm Pr}_{y_1,\dots,y_m}(\exists z\, & A(x,y_1 \oplus z)=\dots=A(x,y_m \oplus z)=0)\\
&\le \sum_{z \in \{0,1\}^m} {\rm Pr}_{y_1,\dots,y_m}(A(x,y_1 \oplus z) = \dots = A( x, y_m \oplus z) = 0)\\
&\le 2^m \frac{1}{(3m)^m}\\
&< 1.
\end{align}</math>

Thus
| :<math>{\rm Pr}_{y_1,\dots,y_m}\Bigl( \forall z \bigvee_i A(x,y_i \oplus z)\Bigr)=1 - {\rm Pr}_{y_1,...,y_m}(\exists z A(x,y_1 \oplus z)=\dots=A(x,y_m \oplus z)=0).</math>

Thus a suitable <math>(y_1,\dots,y_m)</math> exists.

Conversely, suppose <math>x \notin L</math>. Then, for any fixed <math>y_1,\dots,y_m</math>,

:<math>{\rm Pr}_z \Bigl( \bigvee_i A(x,y_i \oplus z) \Bigr) \le \sum_i {\rm Pr}_z (A(x,y_i \oplus z)=1)\le m \frac{1}{3m}= \frac{1}{3}.</math>

Thus
| :<math>{\rm Pr}_z(A(x,y_1 \oplus z)=\dots=A(x,y_m \oplus z)=0)= 1 - {\rm Pr}_z \Bigl( \bigvee_i A(x,y_i \oplus z) \Bigr)\ge \frac{2}{3} > 0.</math>

Thus, for every choice of <math>y_1,\dots,y_m \in \{0,1\}^m</math>, there is a ''z'' such that <math> \bigvee_i A(x,y_i \oplus z)=0.</math>
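
Both directions of the characterization can be checked by brute force at a toy size. In the sketch below, ''m'' = 3 and the accepting sets are invented stand-ins for ''A''(''x'',·) in the two cases; they are not derived from a real algorithm.

```python
from itertools import product

# Brute-force check of the Sigma_2 characterization at a toy size m = 3.
# The accepting sets below are invented stand-ins for A(x, .) in the two
# cases x in L and x not in L; they are not derived from a real machine.

M = 3
R = range(2 ** M)

def sigma2_holds(accepting):
    """Exists y_1..y_m such that for all z, some A(x, y_i XOR z) accepts,
    where A(x, r) accepts iff r is in the given accepting set."""
    return any(
        all(any((y ^ z) in accepting for y in ys) for z in R)
        for ys in product(R, repeat=M)
    )

in_L = set(R) - {5}      # "x in L": all but one random string accepts
not_in_L = {5}           # "x not in L": a single random string accepts

print(sigma2_holds(in_L))      # True
print(sigma2_holds(not_in_L))  # False
```

In the second case, three translates of a single point can cover at most three of the eight strings, so some ''z'' always escapes, exactly as in the converse direction above.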

== Stronger version ==

The theorem can be strengthened to <math>\mathsf{BPP} \subseteq \mathsf{MA} \subseteq \mathsf{S}_2^P \subseteq \Sigma_2 \cap \Pi_2</math> (see [[MA (complexity)|MA]], [[S2P (complexity)|S{{su|p=P|b=2}}]]).

== References ==

* [[Michael Sipser|M. Sipser]], "A complexity theoretic approach to randomness", in ''Proceedings of the 15th ACM Symposium on Theory of Computing'', pp. 330–335, ACM Press, 1983.
* C. Lautemann, "[http://www.sciencedirect.com/science/article/pii/0020019083900443 BPP and the polynomial hierarchy]", ''Information Processing Letters'' 17 (4), pp. 215–217, 1983.
* [[Luca Trevisan]]'s [http://www.cs.berkeley.edu/~luca/notes/ lecture notes], University of California, Berkeley.

{{DEFAULTSORT:Sipser-Lautemann theorem}}
[[Category:Structural complexity theory]]
[[Category:Probabilistic complexity theory]]
[[Category:Theorems in computational complexity theory]]
[[Category:Articles containing proofs]]