{{mergefrom|Lewis Carroll identity|date=February 2013}}


In [[mathematics]], '''Dodgson condensation''' is a method of computing the [[determinant]]s of [[square matrix|square matrices]].  It is named for its inventor Charles Dodgson (better known as [[Lewis Carroll]]).  The method in the case of an ''n'' × ''n'' matrix is to construct an (''n'' − 1) × (''n'' − 1) matrix, then an (''n'' − 2) × (''n'' − 2) matrix, and so on, finishing with a 1 × 1 matrix whose single entry is the determinant of the original matrix.
==The General Method==
This algorithm can be described in the following four steps (a short code sketch follows the list):
 
# Let A be the given ''n''&nbsp;&times;&nbsp;''n'' matrix. Arrange A so that no zeros occur in its interior. An explicit definition of interior would be all a<sub>i,j</sub> with <math>i,j\ne1,n</math>. We can do this using any operation that we could normally perform without changing the value of the determinant, such as adding a multiple of one row to another.
#Create an (''n''&nbsp;&minus;&nbsp;1)&nbsp;&times;&nbsp;(''n''&nbsp;&minus;&nbsp;1) matrix B, consisting of the determinants of every 2&nbsp;&times;&nbsp;2 submatrix of A formed from adjacent rows and columns. Explicitly, we write <math> b_{i,j}=\begin{vmatrix}  a_{i, j} & a_{i, j + 1} \\ a_{i + 1, j} & a_{i + 1, j + 1} \end{vmatrix}. </math>
 
#Using this (''n''&nbsp;&minus;&nbsp;1)&nbsp;&times;&nbsp;(''n''&nbsp;&minus;&nbsp;1) matrix, perform step 2 to obtain an (''n''&nbsp;&minus;&nbsp;2)&nbsp;&times;&nbsp;(''n''&nbsp;&minus;&nbsp;2) matrix C. Divide each term in C by the corresponding term in the interior of A. <math> c_{i,j} = \dfrac{c_{i,j}}{a_{i + 1, j + 1}}.\,\!</math>
#Let A = B, and B = C. Repeat step 3 as necessary until the 1&nbsp;&times;&nbsp;1 matrix is found; its only entry is the determinant.
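A minimal Python sketch of these four steps is given below (illustrative only; the function name, the assumption of integer entries, and the use of exact integer division are choices made for this sketch, and it presumes that no division by zero occurs):

<syntaxhighlight lang="python">
def dodgson_condensation(matrix):
    """Compute a determinant by repeated condensation.

    Assumes integer entries with exact divisions, and that no zero
    appears where a division is required; otherwise the rows of the
    input must first be rearranged as described above.
    """
    a = [row[:] for row in matrix]   # matrix at the current stage
    prev = None                      # matrix from the previous stage
    while len(a) > 1:
        n = len(a)
        # determinants of all 2 x 2 sub-blocks of adjacent entries
        b = [[a[i][j] * a[i + 1][j + 1] - a[i][j + 1] * a[i + 1][j]
              for j in range(n - 1)] for i in range(n - 1)]
        # from the second condensation on, divide by the interior
        # of the matrix two stages back
        if prev is not None:
            b = [[b[i][j] // prev[i + 1][j + 1] for j in range(n - 1)]
                 for i in range(n - 1)]
        prev, a = a, b
    return a[0][0]
</syntaxhighlight>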
 
 
==Examples==
 
===Without Zeros===
We wish to find
 
:<math>
\begin{vmatrix}
-2 & -1 & -1 & -4 \\
-1 & -2 & -1 & -6 \\
-1 & -1 & 2 & 4 \\
2 & 1 & -3 & -8
\end{vmatrix}.
</math>
 
We make a matrix of the determinants of its 2&nbsp;&times;&nbsp;2 submatrices of adjacent entries.
 
:<math>
\begin{bmatrix}
\begin{vmatrix} -2 & -1 \\ -1 & -2 \end{vmatrix} &
\begin{vmatrix} -1 & -1 \\ -2 & -1 \end{vmatrix} &
\begin{vmatrix} -1 & -4 \\ -1 & -6 \end{vmatrix} \\ \\
\begin{vmatrix} -1 & -2 \\ -1 & -1 \end{vmatrix} &
\begin{vmatrix} -2 & -1 \\ -1 & 2 \end{vmatrix} &
\begin{vmatrix} -1 & -6 \\ 2 & 4 \end{vmatrix} \\ \\
\begin{vmatrix} -1 & -1 \\ 2 & 1 \end{vmatrix} &
\begin{vmatrix} -1 & 2 \\ 1 & -3 \end{vmatrix} &
\begin{vmatrix} 2 & 4 \\ -3 & -8 \end{vmatrix}
\end{bmatrix}
=
\begin{bmatrix}
3 & -1 & 2 \\
-1 & -5 & 8 \\
1 & 1 & -4
\end{bmatrix}.
</math>
 
We then find another matrix of determinants:
 
:<math>
\begin{bmatrix}
\begin{vmatrix} 3 & -1 \\ -1 & -5 \end{vmatrix} &
\begin{vmatrix} -1 & 2 \\ -5 & 8 \end{vmatrix} \\ \\
\begin{vmatrix} -1 & -5 \\ 1 & 1 \end{vmatrix} &
\begin{vmatrix} -5 & 8 \\ 1 & -4 \end{vmatrix}
\end{bmatrix}
=
\begin{bmatrix}
-16 & 2 \\
4 & 12
\end{bmatrix}.
</math>
 
We must then divide each element by the corresponding element of our original matrix. The interior of our original matrix is
<math>
\begin{bmatrix}
-2 & -1 \\
-1 & 2
\end{bmatrix}
</math>, so after dividing we get
<math>
\begin{bmatrix}
8 & -2 \\
-4 & 6
\end{bmatrix}
</math>.
The process must be repeated to arrive at a 1&nbsp;&times;&nbsp;1 matrix.
<math>
\begin{bmatrix}
\begin{vmatrix}
8 & -2 \\
-4 & 6
\end{vmatrix}
\end{bmatrix}
=
\begin{bmatrix}
40
\end{bmatrix}.
</math>
We divide by the interior of the 3&nbsp;&times;&nbsp;3 matrix from the first condensation step, which is just &minus;5, giving us
<math>\begin{bmatrix} -8 \end{bmatrix}</math>. Indeed, &minus;8 is the determinant of the original matrix.
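Assuming the illustrative <code>dodgson_condensation</code> sketch given after the list of steps above, this computation can be reproduced as follows:

<syntaxhighlight lang="python">
A = [[-2, -1, -1, -4],
     [-1, -2, -1, -6],
     [-1, -1,  2,  4],
     [ 2,  1, -3, -8]]
print(dodgson_condensation(A))   # prints -8, as computed above
</syntaxhighlight>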
 
===With Zeros===
Simply writing out the matrices:
 
:<math>
\begin{bmatrix}
2 & -1 & 2 & 1 & -3 \\
1 & 2 & 1 & -1 & 2  \\
1 & -1 & -2 & -1 & -1 \\
2 & 1 & -1 & -2 & -1 \\
1 & -2 & -1 & -1 & 2
\end{bmatrix}
\to
\begin{bmatrix}
5 & -5 & -3 & -1 \\
-3 & -3 & -3 & 3 \\
3 & 3 & 3 & -1 \\
-5 & -3 & -1 & -5
\end{bmatrix}
\to
\begin{bmatrix}
-30 & 6 & -12 \\
0 & 0 & 6 \\
6 & -6 & 8
\end{bmatrix}.
</math>
 
Here we run into trouble. If we continue the process, we will eventually be dividing by zero. We can instead move the first row of the initial matrix to the bottom (four adjacent row exchanges, which leave the determinant unchanged) and repeat the process, with most of the determinants already calculated:
 
:<math>
\begin{bmatrix}
1 & 2 & 1 & -1 & 2  \\
1 & -1 & -2 & -1 & -1 \\
2 & 1 & -1 & -2 & -1 \\
1 & -2 & -1 & -1 & 2 \\
2 & -1 & 2 & 1 & -3
\end{bmatrix}
\to
\begin{bmatrix}
-3 & -3 & -3 & 3 \\
3 & 3 & 3 & -1 \\
-5 & -3 & -1 & -5 \\
3 & -5 & 1 & 1
\end{bmatrix}
\to
\begin{bmatrix}
0 & 0 & 6 \\
6 & -6 & 8 \\
-17 & 8 & -4
\end{bmatrix}
\to
\begin{bmatrix}
0 & 12 \\
18 & 40
\end{bmatrix}
\to
\begin{bmatrix}
36
\end{bmatrix}.
</math>
 
Hence, we arrive at a determinant of 36.
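The row-exchange workaround can be imitated in code by retrying the condensation on cyclically shifted copies of the original matrix; the sketch below, built on the earlier <code>dodgson_condensation</code> sketch, is one simple heuristic rather than a general remedy:

<syntaxhighlight lang="python">
def condense_with_cycling(matrix):
    """Retry condensation on cyclic row shifts of the matrix,
    compensating for the sign of the row permutation."""
    n = len(matrix)
    for shift in range(n):
        rows = matrix[shift:] + matrix[:shift]
        sign = (-1) ** (shift * (n - 1))   # one shift = n - 1 adjacent row swaps
        try:
            return sign * dodgson_condensation(rows)
        except ZeroDivisionError:
            continue                       # hit a zero; try the next shift
    raise ValueError("condensation failed for every cyclic row shift")
</syntaxhighlight>

For the 5&nbsp;&times;&nbsp;5 example above, the unshifted matrix fails with a division by zero, and the first cyclic shift (the first row moved to the bottom) succeeds, returning 36.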
 
==The Desnanot-Jacobi identity and proof of correctness of the condensation algorithm==
The proof that the condensation method computes the determinant of the matrix if no divisions by zero are encountered is based on an identity known as the Desnanot-Jacobi identity.
<br /><br />
Let <math>M=(m_{i,j})_{i,j=1}^k</math> be a square matrix, and for each <math>1\le i, j\le k</math> denote by <math>M_i^j</math> the matrix that results from <math>M</math> by deleting the <math>i</math>-th row and the <math>j</math>-th column. Similarly, for
<math>1\le i, j, p,q\le k</math> denote by <math>M_{i,j}^{p,q}</math> the matrix that results from <math>M</math> by deleting the <math>i</math>-th and <math>j</math>-th rows and the <math>p</math>-th and <math>q</math>-th columns.
 
===The Desnanot-Jacobi identity===
:<math>\det(M) \det(M_{1,k}^{1,k}) = \det(M_1^1)\det(M_k^k) - \det(M_1^k) \det(M_k^1). </math>
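As an informal numerical check, the identity can be verified for a concrete matrix, here the leading 4&nbsp;&times;&nbsp;4 block of the second example above; the helper functions <code>det</code> (cofactor expansion) and <code>minor</code> (row and column deletion) are introduced purely for illustration:

<syntaxhighlight lang="python">
def det(m):
    # determinant by cofactor expansion along the first row
    if len(m) == 1:
        return m[0][0]
    return sum((-1) ** j * m[0][j] * det([row[:j] + row[j + 1:] for row in m[1:]])
               for j in range(len(m)))

def minor(m, rows, cols):
    # delete the listed (0-based) rows and columns
    return [[x for j, x in enumerate(row) if j not in cols]
            for i, row in enumerate(m) if i not in rows]

M = [[2, -1, 2, 1], [1, 2, 1, -1], [1, -1, -2, -1], [2, 1, -1, -2]]
last = len(M) - 1                    # 0-based index of the last row and column
lhs = det(M) * det(minor(M, {0, last}, {0, last}))
rhs = (det(minor(M, {0}, {0})) * det(minor(M, {last}, {last}))
       - det(minor(M, {0}, {last})) * det(minor(M, {last}, {0})))
assert lhs == rhs                    # the Desnanot-Jacobi identity holds
</syntaxhighlight>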
 
===Proof of the correctness of Dodgson condensation===
Rewrite the identity as
<br />
:<math>\det(M) = \frac{\det(M_1^1)\det(M_k^k) - \det(M_1^k) \det(M_k^1)}{\det(M_{1,k}^{1,k})}. </math><br />
Now note that by induction it follows that when applying the Dodgson condensation procedure to a square matrix <math>A</math> of order <math>n</math>, the matrix in the <math>k</math>-th stage of the computation
(where the first stage <math>k=1</math> corresponds to the matrix <math>A</math> itself) consists of all the ''connected minors'' of order <math>k</math>
of <math>A</math>, where a connected minor is the determinant of a connected <math>k\times k</math> sub-block of adjacent entries of <math>A</math>. In particular, in the last stage <math>k=n</math> we get a matrix containing a single element equal to the unique connected minor of order <math>n</math>, namely the determinant of <math>A</math>.
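This claim can likewise be checked informally: using the illustrative <code>det</code> helper from the previous subsection, the connected minors of order 3 of the first example's matrix reproduce the third-stage matrix computed there.

<syntaxhighlight lang="python">
A = [[-2, -1, -1, -4],
     [-1, -2, -1, -6],
     [-1, -1,  2,  4],
     [ 2,  1, -3, -8]]
k = 3
connected = [[det([row[j:j + k] for row in A[i:i + k]])      # connected k x k minors
              for j in range(len(A) - k + 1)]
             for i in range(len(A) - k + 1)]
print(connected)   # [[8, -2], [-4, 6]], the third-stage matrix from the first example
</syntaxhighlight>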
===Proof of the Desnanot-Jacobi identity===
We follow the treatment in Bressoud's book; for an alternative combinatorial proof see the paper by Zeilberger.
Denote <math>a_{i,j} = (-1)^{i+j} \det(M_i^j)</math> (up to sign, the <math>(i,j)</math>-th minor of <math>M</math>), and define a <math>k\times k</math>
matrix <math>M'</math> by
<br />
:<math>
M' = \begin{pmatrix} a_{1,1} & 0 & 0 & 0 & \ldots & 0 & a_{k,1} \\
a_{1,2} & 1 & 0 & 0 & \ldots & 0 & a_{k,2} \\
a_{1,3} & 0 & 1 & 0 & \ldots & 0 & a_{k,3} \\
a_{1,4} & 0 & 0 & 1 & \ldots & 0 & a_{k,4} \\
\vdots & \vdots & \vdots & \vdots & & \vdots & \vdots \\
a_{1,k-1} & 0 & 0 & 0 & \ldots & 1 & a_{k,k-1} \\
a_{1,k} & 0 & 0 & 0 & \ldots & 0 & a_{k,k}
\end{pmatrix}.
</math>
<br />
(Note that the first and last columns of <math>M'</math> are equal to those of the [[adjugate matrix]] of <math>M</math>.) The identity is now obtained by computing <math>\det(M M')</math> in two ways. First, we can compute the matrix product <math>M M'</math> directly (using simple properties of the adjugate matrix, or alternatively using the formula for the expansion of a matrix determinant in terms of a row or a column)
to arrive at
<br />
:<math>
M M' = \begin{pmatrix}
\det(M) & m_{1,2} & m_{1,3} & \ldots &  m_{1,k-1} & 0 \\
0 &  m_{2,2} & m_{2,3} & \ldots & m_{2,k-1} & 0 \\
0 &  m_{3,2} & m_{3,3} & \ldots & m_{3,k-1} & 0 \\
\vdots & \vdots & \vdots & & \vdots & \vdots \\
0 &  m_{k-1,2} & m_{k-1,3} & \ldots & m_{k-1,k-1} & 0 \\
0 & m_{k,2} & m_{k,3} & \ldots & m_{k,k-1} & \det(M)
\end{pmatrix}
</math>
<br />
where we use <math>m_{i,j}</math> to denote the <math>(i,j)</math>-th entry of <math>M</math>. The determinant of this matrix is <math>\det(M)^2 \cdot \det(M_{1,k}^{1,k})</math>.
<br />Second, this is equal to the product of the determinants, <math>\det(M) \cdot \det(M')</math>. But clearly
<br />
<math>
\det(M') = a_{1,1} a_{k,k} - a_{k,1} a_{1,k} = \det(M_1^1)\det(M_k^k) - \det(M_1^k) \det(M_k^1),
</math>
<br />
so the identity follows from equating the two expressions we obtained for <math>\det(M M')</math> and dividing out by <math>\det(M)</math> (this is allowed if one thinks of the identities as polynomial identities over the ring of polynomials in the <math>k^2</math> indeterminate variables <math>(m_{i,j})_{i,j=1}^k</math>).
 
==References and further reading==
* [[David Bressoud|Bressoud, David M.]], ''Proofs and Confirmations: The Story of the Alternating Sign Matrix Conjecture'', MAA Spectrum, Mathematical Association of America, Washington, D.C., 1999.
* [[David Bressoud|Bressoud, David M.]] and Propp, James, How the alternating sign matrix conjecture was solved, ''Notices of the American Mathematical Society'', 46 (1999), 637-646.
* Dodgson, C. L., [http://www.jstor.org/stable/112607?origin=JSTOR-pdf Condensation of Determinants, Being a New and Brief Method for Computing their Arithmetical Values], Proceedings of the Royal Society of London, 15 (1866-1867), 150-155.
* [[D. Knuth|Knuth, Donald]], [http://www.emis.de/journals/EJC/Volume_3/PDFFiles/v3i2r5.pdf Overlapping Pfaffians], ''Electronic Journal of Combinatorics'', '''3''' no. 2 (1996).
* Mills, William H., Robbins, David P., and Rumsey, Howard, Jr., Proof of the Macdonald conjecture, ''Inventiones Mathematicae'', 66 (1982), 73-87.
* Mills, William H., Robbins, David P., and Rumsey, Howard, Jr., Alternating sign matrices and descending plane partitions, ''Journal of Combinatorial Theory, Series A'', 34 (1983), 340-359.
* Robbins, David P., The story of <math>1, 2, 7, 42, 429, 7436, \cdots</math>, ''The Mathematical Intelligencer'', 13 (1991), 12-19.
* [[Doron Zeilberger|Zeilberger, Doron]], [http://www.combinatorics.org/ojs/index.php/eljc/article/view/v4i2r22 Dodgson's determinant evaluation rule proved by two-timing men and women], ''Electronic Journal of Combinatorics'', '''4'''  no. 2 (1997).
 
==External links==
*[http://mathworld.wolfram.com/Condensation.html Dodgson Condensation] entry in [[MathWorld]]
 
[[Category:Determinants]]
[[Category:Lewis Carroll]]
