A '''Z-channel''' is a [[communications channel]] used in [[coding theory]] and [[information theory]] to model the behaviour of some data storage systems.
== Definition ==

A ''Z-channel'' (or a ''binary asymmetric channel'') is a channel with binary input and binary output, where the crossover 1 → 0 occurs with nonnegative probability ''p'', whereas the crossover 0 → 1 never occurs. In other words, if ''X'' and ''Y'' are the [[random variable]]s describing the input and the output of the channel, respectively, then the crossovers of the channel are characterized by the [[conditional probability|conditional probabilities]]:
: Prob{''Y'' = 0 | ''X'' = 0} = 1

: Prob{''Y'' = 0 | ''X'' = 1} = ''p''

: Prob{''Y'' = 1 | ''X'' = 0} = 0

: Prob{''Y'' = 1 | ''X'' = 1} = 1 − ''p''
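As an illustration of the definition (not drawn from the references below; the function name <code>z_channel</code> is an arbitrary choice), a Z-channel can be simulated in a few lines of Python:

<syntaxhighlight lang="python">
import random

def z_channel(bits, p, rng=random.random):
    """Transmit a binary sequence over a Z-channel with crossover probability p.

    Each 1 is received as 0 with probability p; a 0 is always received correctly.
    """
    return [0 if (b == 1 and rng() < p) else b for b in bits]

# Only 1 -> 0 errors can occur:
# z_channel([0, 1, 1, 0, 1], p=0.3) might return [0, 1, 0, 0, 1].
</syntaxhighlight>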
== Capacity ==

The [[channel capacity|capacity]] <math>\mathsf{cap}(\mathbb{Z})</math> of the Z-channel <math>\mathbb{Z}</math> with the crossover 1 → 0 probability ''p'', when the input random variable ''X'' is distributed according to the [[Bernoulli distribution]] with probability ''α'' for the occurrence of 0, is calculated as follows.
:<math>\begin{align}
\mathsf{cap}(\mathbb{Z}) &= \max_\alpha\{\mathsf{H}(Y) - \mathsf{H}(Y \mid X)\} = \max_\alpha\left\{\mathsf{H}(Y) - \sum_{x \in \{0,1\}}\mathsf{H}(Y \mid X = x)\, \mathsf{Prob}\{X = x\}\right\} \\
&= \max_\alpha\{\mathsf{H}((1-\alpha)(1-p)) - \mathsf{H}(Y \mid X = 1)\, \mathsf{Prob}\{X = 1\}\} \\
&= \max_\alpha\{\mathsf{H}((1-\alpha)(1-p)) - (1-\alpha)\mathsf{H}(p)\},
\end{align}</math>
where <math>\mathsf{H}(\cdot)</math> is the [[binary entropy function]]. Here <math>\mathsf{Prob}\{Y = 1\} = (1-\alpha)(1-p)</math>, and the term for ''X'' = 0 drops out of the sum because <math>\mathsf{H}(Y \mid X = 0) = 0</math>: when the input is 0, the output is deterministically 0.
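The step above can be checked numerically: the following Python sketch (illustrative only) computes I(''X''; ''Y'') directly from the joint distribution of the channel and compares it with the closed form <math>\mathsf{H}((1-\alpha)(1-p)) - (1-\alpha)\mathsf{H}(p)</math>.

<syntaxhighlight lang="python">
from math import log2

def h2(q):
    """Binary entropy function in bits."""
    return 0.0 if q in (0.0, 1.0) else -q * log2(q) - (1 - q) * log2(1 - q)

def mutual_information(alpha, p):
    """I(X; Y) for the Z-channel with Prob{X = 0} = alpha and 1 -> 0 probability p."""
    joint = {(0, 0): alpha, (0, 1): 0.0,
             (1, 0): (1 - alpha) * p, (1, 1): (1 - alpha) * (1 - p)}
    px = {0: alpha, 1: 1 - alpha}
    py = {y: joint[(0, y)] + joint[(1, y)] for y in (0, 1)}
    return sum(pr * log2(pr / (px[x] * py[y]))
               for (x, y), pr in joint.items() if pr > 0)

# Agrees with the closed form, e.g. for alpha = 0.4, p = 0.2:
# abs(mutual_information(0.4, 0.2) - (h2(0.6 * 0.8) - 0.6 * h2(0.2))) < 1e-12
</syntaxhighlight>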
The maximum is attained for

:<math>\alpha = 1 - \frac{1}{(1-p)\left(1+2^{\mathsf{H}(p)/(1-p)}\right)},</math>

yielding the following value of <math>\mathsf{cap}(\mathbb{Z})</math> as a function of ''p'':

:<math>\mathsf{cap}(\mathbb{Z}) = \mathsf{H}\left(\frac{1}{1+2^{\mathsf{s}(p)}}\right) - \frac{\mathsf{s}(p)}{1+2^{\mathsf{s}(p)}} = \log_2\left(1+2^{-\mathsf{s}(p)}\right) = \log_2\left(1+(1-p)\, p^{p/(1-p)}\right), \quad \text{where } \mathsf{s}(p) = \frac{\mathsf{H}(p)}{1-p}.</math>
For small ''p'', the capacity is approximated by

:<math>\mathsf{cap}(\mathbb{Z}) \approx 1 - \tfrac{1}{2}\mathsf{H}(p),</math>

as compared to the capacity <math>1 - \mathsf{H}(p)</math> of the [[binary symmetric channel]] with crossover probability ''p''.
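For concreteness, the closed-form expressions above are easy to evaluate; the following Python sketch (illustrative, with arbitrarily chosen function names) computes the exact capacity, the small-''p'' approximation, the binary symmetric channel capacity, and the optimal input distribution for a sample value of ''p''.

<syntaxhighlight lang="python">
from math import log2

def h2(q):
    """Binary entropy function in bits."""
    return 0.0 if q in (0.0, 1.0) else -q * log2(q) - (1 - q) * log2(1 - q)

def z_capacity(p):
    """Exact Z-channel capacity log2(1 + (1 - p) * p**(p / (1 - p)))."""
    if p <= 0.0:
        return 1.0
    if p >= 1.0:
        return 0.0
    return log2(1 + (1 - p) * p ** (p / (1 - p)))

def optimal_alpha(p):
    """Capacity-achieving Prob{X = 0} = 1 - 1 / ((1 - p) * (1 + 2**s(p)))."""
    s = h2(p) / (1 - p)
    return 1 - 1 / ((1 - p) * (1 + 2 ** s))

p = 0.1
print(z_capacity(p))        # ~0.763 bits per channel use
print(1 - 0.5 * h2(p))      # small-p approximation, ~0.766
print(1 - h2(p))            # binary symmetric channel capacity, ~0.531
print(optimal_alpha(p))     # optimal Prob{X = 0}
</syntaxhighlight>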
== Bounds on the size of an asymmetric-error-correcting code ==

Define the following distance function <math>\mathsf{d}_A(\mathbf{x}, \mathbf{y})</math> on the words <math>\mathbf{x}, \mathbf{y} \in \{0,1\}^n</math> of length ''n'' transmitted via a Z-channel:

:<math>\mathsf{d}_A(\mathbf{x}, \mathbf{y}) \stackrel{\vartriangle}{=} \max\left\{\Big|\{i \mid x_i = 0, y_i = 1\}\Big|,\ \Big|\{i \mid x_i = 1, y_i = 0\}\Big|\right\}.</math>
Define the sphere <math>V_t(\mathbf{x})</math> of radius ''t'' around a word <math>\mathbf{x} \in \{0,1\}^n</math> of length ''n'' as the set of all the words at distance ''t'' or less from <math>\mathbf{x}</math>, in other words,

:<math>V_t(\mathbf{x}) = \{\mathbf{y} \in \{0, 1\}^n \mid \mathsf{d}_A(\mathbf{x}, \mathbf{y}) \leq t\}.</math>

A [[code]] <math>\mathcal{C}</math> of length ''n'' is said to be ''t''-asymmetric-error-correcting if for any two distinct codewords <math>\mathbf{c}, \mathbf{c}' \in \mathcal{C}</math>, one has <math>V_t(\mathbf{c}) \cap V_t(\mathbf{c}') = \emptyset</math>. Denote by <math>M(n,t)</math> the maximum size of a ''t''-asymmetric-error-correcting code of length ''n''.
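For small ''n'', these definitions can be checked exhaustively; the following Python sketch (illustrative only, not from the cited references) implements <math>\mathsf{d}_A</math>, the spheres <math>V_t</math>, and the disjointness test.

<syntaxhighlight lang="python">
from itertools import product

def d_A(x, y):
    """Asymmetric distance: max of the number of 0->1 and 1->0 positions."""
    n01 = sum(1 for a, b in zip(x, y) if a == 0 and b == 1)
    n10 = sum(1 for a, b in zip(x, y) if a == 1 and b == 0)
    return max(n01, n10)

def sphere(x, t):
    """V_t(x): all words of the same length within asymmetric distance t of x."""
    return {y for y in product((0, 1), repeat=len(x)) if d_A(x, y) <= t}

def is_t_asymmetric_error_correcting(code, t):
    """True iff the radius-t spheres around distinct codewords are pairwise disjoint."""
    code = [tuple(c) for c in code]
    return all(not (sphere(c, t) & sphere(cp, t))
               for i, c in enumerate(code) for cp in code[i + 1:])

# is_t_asymmetric_error_correcting([(0, 0, 0), (1, 1, 1)], t=1)  # -> True
</syntaxhighlight>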
'''The Varshamov bound'''.

For ''n'' ≥ 1 and ''t'' ≥ 1,

:<math>M(n,t) \leq \frac{2^{n+1}}{\sum_{j = 0}^t \left( \binom{\lfloor n/2\rfloor}{j}+\binom{\lceil n/2\rceil}{j}\right)}.</math>
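The bound is straightforward to evaluate numerically, as in the following illustrative Python sketch.

<syntaxhighlight lang="python">
from math import comb

def varshamov_bound(n, t):
    """2^(n+1) / sum_{j=0}^{t} (C(floor(n/2), j) + C(ceil(n/2), j)), an upper bound on M(n, t)."""
    denom = sum(comb(n // 2, j) + comb((n + 1) // 2, j) for j in range(t + 1))
    return 2 ** (n + 1) / denom

# Since M(n, t) is an integer, M(n, t) <= floor(varshamov_bound(n, t)).
# For example, varshamov_bound(5, 1) = 64 / 7, so M(5, 1) <= 9.
</syntaxhighlight>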
Let <math>A(n,d,w)</math> denote the maximal number of binary vectors of length ''n'' of weight ''w'' and with [[Hamming distance]] at least ''d'' apart.
'''The constant-weight code bound'''.

For ''n'' > 2''t'' ≥ 2, let the sequence ''B''<sub>0</sub>, ''B''<sub>1</sub>, ..., ''B''<sub>''n''−2''t''−1</sub> be defined as

:<math>B_0 = 2, \quad B_i = \min_{0 \leq j < i}\{ B_j + A(n{+}t{+}i{-}j{-}1, 2t{+}2, t{+}i)\}</math> for <math>i > 0</math>.

Then <math>M(n,t) \leq B_{n-2t-1}.</math>
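Evaluating the recursion requires values of <math>A(n,d,w)</math>, which are usually taken from published tables of constant-weight codes; the following Python sketch (illustrative only) instead computes them by a naive exhaustive search, which is feasible only for very small parameters.

<syntaxhighlight lang="python">
from itertools import combinations

def hamming(a, b):
    return bin(a ^ b).count("1")

def A_exact(n, d, w):
    """Exact A(n, d, w) by branch-and-bound search; feasible only for small n."""
    words = [sum(1 << i for i in c) for c in combinations(range(n), w)]
    best = 0
    def grow(size, candidates):
        nonlocal best
        best = max(best, size)
        if size + len(candidates) <= best:
            return  # cannot beat the best set found so far
        for k, v in enumerate(candidates):
            grow(size + 1, [u for u in candidates[k + 1:] if hamming(u, v) >= d])
    grow(0, words)
    return best

def constant_weight_bound(n, t, A=A_exact):
    """Upper bound B_{n-2t-1} on M(n, t) from the constant-weight code bound."""
    B = [2]
    for i in range(1, n - 2 * t):
        B.append(min(B[j] + A(n + t + i - j - 1, 2 * t + 2, t + i) for j in range(i)))
    return B[n - 2 * t - 1]

# constant_weight_bound(5, 1) evaluates B_0 = 2, B_1 = 5, B_2 = 9, giving M(5, 1) <= 9.
</syntaxhighlight>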
== References ==
* {{Smallcaps|T. Kløve}}, Error correcting codes for the asymmetric channel, ''Technical Report 18–09–07–81'', Department of Informatics, University of Bergen, Norway, 1981.
* {{Smallcaps|L. G. Tallini, S. Al-Bassam, B. Bose}}, On the capacity and codes for the Z-channel, ''Proceedings of the IEEE International Symposium on Information Theory'', Lausanne, Switzerland, 2002, p. 422.
[[Category:Coding theory]]
[[Category:Information theory]]
[[Category:Inequalities]]