{{distinguish2|a [[triangular array]], a related concept}}
{{for|the rings|triangular matrix ring}}
[[File:Cyclic group Z4; Cayley table; powers of Gray code permutation (small).svg|thumb|[[Logical matrix|Binary]] lower unitriangular [[Toeplitz matrix|Toeplitz]] matrices, multiplied using [[Finite field|'''F'''<sub>2</sub>]] operations<br>They form the [[Cayley table]] of [[cyclic group|Z<sub>4</sub>]] and correspond to [[v:Gray code permutation powers#4 bit|powers of the 4-bit Gray code permutation]].]]
In the [[mathematics|mathematical]] discipline of [[linear algebra]], a '''triangular matrix''' is a special kind of [[square matrix|square]] [[matrix (mathematics)|matrix]]. A square matrix is called '''lower triangular''' if all the entries ''above'' the [[main diagonal]] are zero. Similarly, a square matrix is called '''upper triangular''' if all the entries ''below'' the [[main diagonal]] are zero. A triangular matrix is one that is either lower triangular or upper triangular. A matrix that is both upper and lower triangular is called a [[diagonal matrix]].

Because matrix equations with triangular matrices are easier to solve, they are very important in [[numerical analysis]]. By the [[LU decomposition]] algorithm, an [[invertible matrix]] may be written as the product of a lower triangular matrix ''L'' and an upper triangular matrix ''U'' [[if and only if]] all its leading principal [[minor (linear algebra)|minors]] are non-zero.

== Description ==
A matrix of the form
:<math> L=
\begin{bmatrix}
l_{1,1} & & & & 0 \\
l_{2,1} & l_{2,2} & & & \\
l_{3,1} & l_{3,2} & \ddots & & \\
\vdots & \vdots & \ddots & \ddots & \\
l_{n,1} & l_{n,2} & \ldots & l_{n,n-1} & l_{n,n}
\end{bmatrix}
</math>
is called a '''lower triangular matrix''' or '''left triangular matrix''', and analogously a matrix of the form
:<math> U =
\begin{bmatrix}
u_{1,1} & u_{1,2} & u_{1,3} & \ldots & u_{1,n} \\
& u_{2,2} & u_{2,3} & \ldots & u_{2,n} \\
& & \ddots & \ddots & \vdots \\
& & & \ddots & u_{n-1,n}\\
0 & & & & u_{n,n}
\end{bmatrix}
</math>
is called an '''upper triangular matrix''' or '''right triangular matrix'''. The variable ''L'' (standing for lower or left) is commonly used to represent a lower triangular matrix, while the variable ''U'' (standing for upper) or ''R'' (standing for right) is commonly used to represent an upper triangular matrix. A matrix that is both upper and lower triangular is [[diagonal matrix|diagonal]].

Matrices that are [[similar (linear algebra)|similar]] to triangular matrices are called '''triangularisable'''.

The standard operations on triangular matrices preserve the triangular shape:
* The sum of two upper triangular matrices is upper triangular.
* The product of two upper triangular matrices is upper triangular.
* The inverse of an invertible upper triangular matrix is upper triangular.
* A scalar multiple of an upper triangular matrix is upper triangular.

Together these facts mean that the upper triangular matrices form a [[subalgebra]] of the [[associative algebra]] of square matrices of a given size. This also shows that the upper triangular matrices can be viewed as a Lie subalgebra of the [[Lie algebra]] of square matrices of a fixed size, where the [[Lie bracket]] [''a'',''b''] is given by the [[Commutator#Ring_theory|commutator]] ''ab'' − ''ba''. The Lie algebra of all upper triangular matrices is often referred to as the [[Borel subalgebra]] and is denoted <math>\mathfrak{b}</math>. The analogous results hold for lower triangular matrices, so they also form a Lie subalgebra. However, note that the product of a ''lower'' triangular matrix with an ''upper'' triangular matrix is not necessarily triangular.

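For example, if <math>a</math> and <math>b</math> are upper triangular, then <math>ab</math> and <math>ba</math> have the same diagonal, namely the entrywise product of the diagonals of <math>a</math> and <math>b</math>, so the commutator <math>ab - ba</math> is ''strictly'' upper triangular.
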
===Examples===
This matrix
:<math>
\begin{bmatrix}
1 & 4 & 2 \\
0 & 3 & 4 \\
0 & 0 & 1 \\
\end{bmatrix}
</math>
is upper triangular and this matrix
:<math>
\begin{bmatrix}
1 & 0 & 0 \\
2 & 8 & 0 \\
4 & 9 & 7 \\
\end{bmatrix}
</math>
is lower triangular.

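Squaring the upper triangular example illustrates the closure properties listed above: the product is again upper triangular, and its diagonal entries are the products of the corresponding diagonal entries:
:<math>
\begin{bmatrix}
1 & 4 & 2 \\
0 & 3 & 4 \\
0 & 0 & 1 \\
\end{bmatrix}^2
=
\begin{bmatrix}
1 & 16 & 20 \\
0 & 9 & 16 \\
0 & 0 & 1 \\
\end{bmatrix}.
</math>
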
==Special forms==

=== Unitriangular matrix ===
If the entries on the [[main diagonal]] of an (upper or lower) triangular matrix are all 1, the matrix is called (upper or lower) '''unitriangular'''. All unitriangular matrices are [[unipotent]]. Other names used for these matrices are '''unit''' (upper or lower) '''triangular''' (of which "unitriangular" might be a contraction){{fact|date=March 2012}}, or very rarely '''normed''' (upper or lower) '''triangular'''. However, a ''unit'' triangular matrix is not the same as the ''[[identity matrix|unit matrix]]'', and a ''normed'' triangular matrix has nothing to do with the notion of a [[matrix norm]]. The [[identity matrix]] is the only matrix which is both upper and lower unitriangular.

The set of unitriangular matrices forms a [[Lie group]].

=== Strictly triangular matrix ===
If the entries on the main diagonal of an (upper or lower) triangular matrix are all 0, the matrix is called '''strictly''' (upper or lower) '''triangular'''. All strictly triangular matrices are [[nilpotent matrix|nilpotent]], and the set of strictly upper (or lower) triangular matrices forms a [[nilpotent Lie algebra]], denoted <math>\mathfrak{n}.</math> This algebra is the [[derived Lie algebra]] of <math>\mathfrak{b}</math>, the Lie algebra of all upper triangular matrices; in symbols, <math>\mathfrak{n} = [\mathfrak{b},\mathfrak{b}].</math> In addition, <math>\mathfrak{n}</math> is the Lie algebra of the Lie group of unitriangular matrices.

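For example, powers of a strictly upper triangular 3 × 3 matrix push its non-zero entries further from the main diagonal until none remain:
:<math>
N = \begin{bmatrix} 0 & a & b \\ 0 & 0 & c \\ 0 & 0 & 0 \end{bmatrix}, \qquad
N^2 = \begin{bmatrix} 0 & 0 & ac \\ 0 & 0 & 0 \\ 0 & 0 & 0 \end{bmatrix}, \qquad
N^3 = 0.
</math>
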
In fact, by [[Engel's theorem]], any finite-dimensional nilpotent Lie algebra is conjugate to a subalgebra of the strictly upper triangular matrices; that is to say, a finite-dimensional nilpotent Lie algebra is simultaneously strictly upper triangularisable.

=== Atomic triangular matrix ===
An '''atomic''' (upper or lower) '''triangular matrix''' is a special form of unitriangular matrix, where all of the off-diagonal entries are zero, except for the entries in a single column. Such a matrix is also called a '''Gauss matrix''' or a '''Gauss transformation matrix'''. So an atomic lower triangular matrix is of the form
:<math> \mathbf{L}_{i} =
\begin{bmatrix}
1 & & & & & & & 0 \\
0 & \ddots & & & & & & \\
0 & \ddots & 1 & & & & & \\
0 & \ddots & 0 & 1 & & & & \\
& & 0 & l_{i+1,i} & 1 & & & \\
\vdots & & 0 & l_{i+2,i} & 0 & \ddots & & \\
& & \vdots & \vdots & \vdots & \ddots & 1 & \\
0 & \dots & 0 & l_{n,i} & 0 & \dots & 0 & 1 \\
\end{bmatrix}.
</math>
The inverse of an atomic triangular matrix is again atomic triangular. Indeed, we have
:<math> \mathbf{L}_{i}^{-1} =
\begin{bmatrix}
1 & & & & & & & 0 \\
0 & \ddots & & & & & & \\
0 & \ddots & 1 & & & & & \\
0 & \ddots & 0 & 1 & & & & \\
& & 0 & -l_{i+1,i} & 1 & & & \\
\vdots & & 0 & -l_{i+2,i} & 0 & \ddots & & \\
& & \vdots & \vdots & \vdots & \ddots & 1 & \\
0 & \dots & 0 & -l_{n,i} & 0 & \dots & 0 & 1 \\
\end{bmatrix},
</math>
i.e., the off-diagonal entries are replaced in the inverse matrix by their additive inverses.

==== Examples ====
The matrix
:<math>
\begin{bmatrix}
1 & 0 & 0 & 0 \\
0 & 1 & 0 & 0 \\
0 & 4 & 1 & 0 \\
0 & 2 & 0 & 1 \\
\end{bmatrix}
</math>
is atomic lower triangular. Its inverse is
:<math>
\begin{bmatrix}
1 & 0 & 0 & 0 \\
0 & 1 & 0 & 0 \\
0 & -4 & 1 & 0 \\
0 & -2 & 0 & 1 \\
\end{bmatrix}.
</math>

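Multiplying the two matrices confirms this: the only off-diagonal entries of the product that could be non-zero lie in the second column, and they cancel, e.g. the (3,2) entry is <math>4 \cdot 1 + 1 \cdot (-4) = 0</math> and the (4,2) entry is <math>2 \cdot 1 + 1 \cdot (-2) = 0</math>, so the product is the identity matrix.
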
==Special properties==
A matrix which is simultaneously triangular and [[normal matrix|normal]] is also diagonal. This can be seen by looking at the diagonal entries of ''A''<sup>*</sup>''A'' and ''AA''<sup>*</sup>, where ''A'' is a normal, triangular matrix.

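For instance, if ''A'' is upper triangular, then the first column of ''A'' has ''a''<sub>1,1</sub> as its only non-zero entry, so comparing the (1,1) entries of ''A''<sup>*</sup>''A'' and ''AA''<sup>*</sup> gives
:<math> |a_{1,1}|^2 = \sum_{j=1}^{n} |a_{1,j}|^2, </math>
which forces <math>a_{1,j} = 0</math> for <math>j > 1</math>; repeating the argument down the remaining rows shows that every off-diagonal entry vanishes.
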
The [[transpose]] of an upper triangular matrix is a lower triangular matrix and vice versa.

The [[determinant]] of a triangular matrix equals the product of the diagonal entries. Since for any triangular matrix ''A'' the matrix <math>x I-A</math>, whose determinant is the [[characteristic polynomial]] of ''A'', is also triangular, the diagonal entries of ''A'' in fact give the [[multiset]] of [[eigenvalue]]s of ''A'' (an eigenvalue with multiplicity ''m'' occurs exactly ''m'' times as diagonal entry).<ref name="axler">{{Harv|Axler|1996|loc=pp. 86–87, 169}}</ref>

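For the upper triangular example above,
:<math> \det \begin{bmatrix} 1 & 4 & 2 \\ 0 & 3 & 4 \\ 0 & 0 & 1 \end{bmatrix} = 1 \cdot 3 \cdot 1 = 3, </math>
and the eigenvalues are the diagonal entries: 3, and 1 with multiplicity 2.
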
==Triangularisability==
A matrix that is [[similar matrix|similar]] to a triangular matrix is referred to as '''triangularisable'''. Abstractly, this is equivalent to stabilising a [[flag (linear algebra)|flag]]: upper triangular matrices are precisely those that preserve the [[standard flag]], which is given by the standard ordered basis <math>(e_1,\ldots,e_n)</math> and the resulting flag <math>0 < \left\langle e_1\right\rangle < \left\langle e_1,e_2\right\rangle < \cdots < \left\langle e_1,\ldots,e_n \right\rangle = K^n.</math> All flags are conjugate (as the general linear group acts transitively on bases), so any matrix that stabilises a flag is similar to one that stabilises the standard flag.

Any complex square matrix is triangularisable.<ref name="axler"/> In fact, a matrix ''A'' over a [[field (mathematics)|field]] containing all of the eigenvalues of ''A'' (for example, any matrix over an [[algebraically closed field]]) is similar to a triangular matrix. This can be proven by induction on dimension: ''A'' has an eigenvector; taking the quotient space by the span of that eigenvector and inducting shows that ''A'' stabilises a flag, and it is thus triangularisable with respect to a basis for that flag.

A more precise statement is given by the [[Jordan normal form]] theorem, which states that in this situation, ''A'' is similar to an upper triangular matrix of a very particular form. The simpler triangularisation result is often sufficient, however, and in any case it is used in proving the Jordan normal form theorem.<ref name="axler"/><ref name="herstein">{{Harv|Herstein|1975|loc=pp. 285–290}}</ref>

In the case of complex matrices, it is possible to say more about triangularisation, namely, that any square matrix ''A'' has a [[Schur decomposition]]. This means that ''A'' is unitarily equivalent (i.e. similar, using a [[unitary matrix]] as change of basis) to an upper triangular matrix; this follows by taking a Hermitian basis for the flag.

===Simultaneous triangularisability===
{{see also|Simultaneously diagonalizable}}
A set of matrices <math>A_1, \ldots, A_k</math> is said to be '''{{visible anchor|simultaneously triangularisable}}''' if there is a basis under which they are all upper triangular; equivalently, if they are upper triangularisable by a single similarity matrix ''P''. Such a set of matrices is more easily understood by considering the algebra of matrices it generates, namely all polynomials in the <math>A_i,</math> denoted <math>K[A_1,\ldots,A_k].</math> Simultaneous triangularisability means that this algebra is conjugate into the Lie subalgebra of upper triangular matrices, and is equivalent to this algebra being a Lie subalgebra of a [[Borel subalgebra]].

The basic result is that (over an algebraically closed field), the [[commuting matrices]] <math>A,B</math> or more generally <math>A_1,\ldots,A_k</math> are simultaneously triangularisable. This can be proven by first showing that commuting matrices have a common eigenvector, and then inducting on dimension as before. This was proven by Frobenius, starting in 1878 for a commuting pair, as discussed at [[commuting matrices]]. As for a single matrix, over the complex numbers these can be triangularised by unitary matrices.

The fact that commuting matrices have a common eigenvector can be interpreted as a result of [[Hilbert's Nullstellensatz]]: commuting matrices form a commutative algebra <math>K[A_1,\ldots,A_k]</math> over <math>K[x_1,\ldots,x_k]</math> which can be interpreted as a variety in ''k''-dimensional affine space, and the existence of a (common) eigenvalue (and hence a common eigenvector) corresponds to this variety having a point (being non-empty), which is the content of the (weak) Nullstellensatz. In algebraic terms, these operators correspond to an [[algebra representation]] of the polynomial algebra in ''k'' variables.

This is generalized by [[Lie's theorem]], which shows that any representation of a [[solvable Lie algebra]] is simultaneously upper triangularisable, the case of commuting matrices being the [[abelian Lie algebra]] case, abelian being a fortiori solvable.

More generally and precisely, a set of matrices <math>A_1,\ldots,A_k</math> is simultaneously triangularisable if and only if the matrix <math>p(A_1,\ldots,A_k)[A_i,A_j]</math> is [[nilpotent]] for all polynomials ''p'' in ''k'' ''non''-commuting variables, where <math>[A_i,A_j]</math> is the [[commutator]]; note that for commuting <math>A_i</math> the commutator vanishes, so this holds. This was proven in {{Harv|Drazin|Dungey|Gruenberg|1951}}; a brief proof is given in {{Harv|Prasolov|1994|loc=[http://books.google.com/books?id=fuONq1od6nsC&pg=PA178 pp. 178–179]}}. One direction is clear: if the matrices are simultaneously triangularisable, then in the triangularising basis each <math>[A_i, A_j]</math> is ''strictly'' upper triangular (hence nilpotent), and this is preserved by multiplication by any <math>A_k</math> or combination thereof – the product will still have 0s on the diagonal.

==Generalizations==
Because the product of two upper triangular matrices is again upper triangular, the set of upper triangular matrices forms an [[associative algebra|algebra]]. Algebras of upper triangular matrices have a natural generalization in [[functional analysis]] which yields [[nest algebra]]s on [[Hilbert space]]s.

A non-square (or sometimes any) matrix with zeros above (below) the diagonal is called a lower (upper) trapezoidal matrix. The non-zero entries form the shape of a [[trapezoid]].

===Borel subgroups and Borel subalgebras===
{{main|Borel subgroup|Borel subalgebra}}
The set of invertible triangular matrices of a given kind (upper or lower) forms a [[group (mathematics)|group]], indeed a [[Lie group]], which is a subgroup of the [[general linear group]] of all invertible matrices; a triangular matrix is invertible exactly when all its diagonal entries are invertible (non-zero).

Over the real numbers, this group is disconnected, having <math>2^n</math> components according to whether each diagonal entry is positive or negative. The identity component consists of the invertible triangular matrices with positive entries on the diagonal, and the group of all invertible triangular matrices is a [[semidirect product]] of this group and the group of [[diagonal matrix|diagonal matrices]] with <math>\pm 1</math> on the diagonal, corresponding to the components.

The [[Lie algebra]] of the Lie group of invertible upper triangular matrices is the set of all upper triangular matrices, not necessarily invertible, and is a [[solvable Lie algebra]]. These are, respectively, the standard [[Borel subgroup]] ''B'' of the Lie group GL<sub>n</sub> and the standard [[Borel subalgebra]] <math>\mathfrak{b}</math> of the Lie algebra gl<sub>n</sub>.

The upper triangular matrices are precisely those that stabilize the [[Flag (linear algebra)|standard flag]]. The invertible ones among them form a subgroup of the general linear group, whose conjugate subgroups are those defined as the stabilizer of some (other) complete flag. These subgroups are [[Borel subgroup]]s. The group of invertible lower triangular matrices is such a subgroup, since it is the stabilizer of the standard flag associated to the standard basis in reverse order.

The stabilizer of a partial flag obtained by forgetting some parts of the standard flag can be described as a set of block upper triangular matrices (but its elements are ''not'' all triangular matrices). The conjugates of such a group are the subgroups defined as the stabilizer of some partial flag. These subgroups are called [[parabolic subgroup]]s.

=== Examples ===
The group of 2 by 2 upper unitriangular matrices is [[isomorphic]] to the [[Abelian group|additive group]] of the field of scalars; in the case of complex numbers it corresponds to a group formed of parabolic [[Möbius transformation]]s; the 3 by 3 upper unitriangular matrices form the [[Heisenberg group]].

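Concretely, the 3 by 3 upper unitriangular matrices are those of the form
:<math> \begin{bmatrix} 1 & a & c \\ 0 & 1 & b \\ 0 & 0 & 1 \end{bmatrix}, </math>
and multiplying two of them composes the parameters non-commutatively: <math>(a,b,c)(a',b',c') = (a+a',\ b+b',\ c+c'+ab')</math>.
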
==Forward and back substitution==
<!-- Section is linked from several redirects (Back substitution etc.) – please update if you change the section title -->
A matrix equation in the form <math>\mathbf{L}\mathbf{x} = \mathbf{b}</math> or <math>\mathbf{U} \mathbf{x} = \mathbf{b}</math> is very easy to solve by an iterative process called '''forward substitution''' for lower triangular matrices and analogously '''back substitution''' for upper triangular matrices.
The process is so called because for lower triangular matrices, one first computes <math>x_1</math>, then substitutes that ''forward'' into the ''next'' equation to solve for <math>x_2</math>, and repeats through to <math>x_n</math>. In an upper triangular matrix, one works ''backwards,'' first computing <math>x_n</math>, then substituting that ''back'' into the ''previous'' equation to solve for <math>x_{n-1}</math>, and repeating through <math>x_1</math>.

Notice that this does not require inverting the matrix.

===Forward substitution===
The matrix equation '''L'''''x'' = ''b'' can be written as a system of linear equations

:<math>
\begin{matrix}
l_{1,1} x_1 & & & & & = & b_1 \\
l_{2,1} x_1 & + & l_{2,2} x_2 & & & = & b_2 \\
\vdots & & \vdots & \ddots & & & \vdots \\
l_{m,1} x_1 & + & l_{m,2} x_2 & + \dotsb + & l_{m,m} x_m & = & b_m \\
\end{matrix}
</math>

Observe that the first equation (<math>l_{1,1} x_1 = b_1</math>) only involves <math>x_1</math>, and thus one can solve for <math>x_1</math> directly. The second equation only involves <math>x_1</math> and <math>x_2</math>, and thus can be solved once one substitutes in the already solved value for <math>x_1</math>. Continuing in this way, the <math>k</math>-th equation only involves <math>x_1,\dots,x_k</math>, and one can solve for <math>x_k</math> using the previously solved values for <math>x_1,\dots,x_{k-1}</math>.

The resulting formulas are:
:<math> x_1 = \frac{b_1}{l_{1,1}}, </math>
:<math> x_2 = \frac{b_2 - l_{2,1} x_1}{l_{2,2}}, </math>
::<math> \vdots </math>
:<math> x_m = \frac{b_m - \sum_{i=1}^{m-1} l_{m,i}x_i}{l_{m,m}}. </math>

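For example, applying forward substitution to the lower triangular matrix from the examples above,
:<math>
\begin{bmatrix} 1 & 0 & 0 \\ 2 & 8 & 0 \\ 4 & 9 & 7 \end{bmatrix}
\begin{bmatrix} x_1 \\ x_2 \\ x_3 \end{bmatrix}
=
\begin{bmatrix} 1 \\ 10 \\ 27 \end{bmatrix}
</math>
gives <math>x_1 = 1</math>, then <math>x_2 = (10 - 2 \cdot 1)/8 = 1</math>, and finally <math>x_3 = (27 - 4 \cdot 1 - 9 \cdot 1)/7 = 2</math>.
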
A matrix equation with an upper triangular matrix '''U''' can be solved in an analogous way, only working backwards.

===Algorithm===
The following is an example implementation of this algorithm in the [[C Sharp (programming language)|C#]] [[programming language]]. Note that rectangular arrays such as <code>double[,]</code> are handled less efficiently than jagged arrays in C#, so this implementation favours clarity over performance; the method of forward and backward substitution itself ''can'' be highly efficient.
<source lang="csharp">
double[] luEvaluate(double[,] L, double[,] U, double[] b)
{
    // Ax = b -> LUx = b. Define y = Ux, solve Ly = b for y, then Ux = y for x.
    int n = b.Length;
    double[] x = new double[n];
    double[] y = new double[n];
    // Forward substitution: solve Ly = b
    for (int i = 0; i < n; i++)
    {
        y[i] = b[i];
        for (int j = 0; j < i; j++)
        {
            y[i] -= L[i, j] * y[j];
        }
        y[i] /= L[i, i];
    }
    // Back substitution: solve Ux = y
    for (int i = n - 1; i >= 0; i--)
    {
        x[i] = y[i];
        for (int j = i + 1; j < n; j++)
        {
            x[i] -= U[i, j] * x[j];
        }
        x[i] /= U[i, i];
    }
    return x;
}
</source>

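As a brief usage sketch (the factors below are a hand-computed LU factorisation of the 2 by 2 matrix with rows (4, 3) and (6, 3); the names are illustrative only):

<source lang="csharp">
// A = {{4, 3}, {6, 3}} factors as A = LU with
// L = {{1, 0}, {1.5, 1}} and U = {{4, 3}, {0, -1.5}}.
// Solving Ax = b for b = (7, 9) should give x = (1, 1).
double[,] L = { { 1.0, 0.0 }, { 1.5, 1.0 } };
double[,] U = { { 4.0, 3.0 }, { 0.0, -1.5 } };
double[] b = { 7.0, 9.0 };
double[] x = luEvaluate(L, U, b);  // x[0] == 1.0, x[1] == 1.0
</source>
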
===Applications===
Forward substitution is used in financial [[Bootstrapping (finance)|bootstrapping]] to construct a [[yield curve]].

== See also ==
* [[Gaussian elimination]]
* [[QR decomposition]]
* [[Cholesky decomposition]]
* [[Hessenberg matrix]]
* [[Tridiagonal matrix]]
* [[Invariant subspace]]

== Notes ==
{{reflist|group=note}}

== References ==
{{reflist}}
{{refbegin}}
* {{Citation | first = Sheldon | last = Axler | title = Linear Algebra Done Right | publisher = Springer-Verlag | year = 1996 | isbn = 0-387-98258-2}}
* {{Citation | first1 = M. P. | last1 = Drazin | first2 = J. W. | last2 = Dungey | first3 = K. W. | last3 = Gruenberg | title = Some theorems on commutative matrices | journal = J. London Math. Soc. | volume = 26 | issue = 3 | pages = 221–228 | year = 1951 | url = http://jlms.oxfordjournals.org/cgi/pdf_extract/s1-26/3/221 | doi = 10.1112/jlms/s1-26.3.221}}
* {{Citation | first = I. N. | last = Herstein | title = Topics in Algebra | edition = 2nd | publisher = John Wiley and Sons | year = 1975 | isbn = 0-471-01090-1}}
* {{Citation | first = Viktor | last = Prasolov | title = Problems and theorems in linear algebra | year = 1994 | url = http://books.google.com/books?id=fuONq1od6nsC&lpg=PP1&dq=victor%20prasolov%20Problems%20and%20theorems%20in%20linear%20algebra&pg=PP1#v=onepage&q&f=false | isbn = 9780821802366}}
{{refend}}

{{Numerical linear algebra}}

{{DEFAULTSORT:Triangular Matrix}}
[[Category:Numerical linear algebra]]
[[Category:Matrices]]