[[Image:Taxonomy of Complex Matrices.svg|thumb|247px|right|Several important classes of matrices are subsets of each other.]]
This page lists some important classes of [[matrix (mathematics)|matrices]] used in [[mathematics]], [[science]] and [[engineering]]. A '''matrix''' (plural matrices, or less commonly matrixes) is a rectangular [[Array data structure|array]] of [[number]]s called ''entries''. Matrices have a long history of both study and application, leading to diverse ways of classifying them. A first group consists of matrices satisfying concrete conditions on their entries, including constant matrices. An important example is the [[identity matrix]] given by
:<math>
I_n = \begin{bmatrix}
1 & 0 & \cdots & 0 \\
0 & 1 & \cdots & 0 \\
\vdots & \vdots & \ddots & \vdots \\
0 & 0 & \cdots & 1 \end{bmatrix}.</math>

Further ways of classifying matrices are according to their [[eigenvalue]]s, or by imposing conditions on the [[matrix product|product]] of the matrix with other matrices. Finally, many domains, both in mathematics and other sciences including [[physics]] and [[chemistry]], have particular matrices that are applied chiefly in these areas.

<!-- !!!!!! PLEASE NOTE! The — dash used is not the usual hyphen. It is an 'em dash' — please either copy and paste an existing one, or use the Insert facility just below the "Save page" button. -->

| ==Matrices with explicitly constrained entries==
| |
The following lists matrices whose entries are subject to certain conditions. Many of them apply to ''square matrices'' only, that is, matrices with the same number of rows and columns. The [[main diagonal]] of a square matrix is the [[diagonal]] joining the upper left corner and the lower right one, or equivalently the entries ''a''<sub>''i'',''i''</sub>. The other diagonal is called the anti-diagonal (or counter-diagonal).
| |
| | |
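For example, in the 3 × 3 matrix

:<math>\begin{bmatrix} 1 & 2 & 3 \\ 4 & 5 & 6 \\ 7 & 8 & 9 \end{bmatrix}</math>

the main diagonal consists of the entries 1, 5 and 9, and the anti-diagonal of the entries 3, 5 and 7.
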
| {| class="wikitable sortable"
| |
| ! Name !! Explanation !! Notes, References
| |
| |-
| |
| | [[(0,1)-matrix]] || A matrix with all elements either 0 or 1. || Synonym for binary matrix, Boolean matrix and logical matrix.
| |
| |-
| |
| | [[Alternant matrix]] || A matrix in which successive columns have a particular function applied to their entries. ||
| |
| |-
| |
| | [[Anti-diagonal matrix]] || A square matrix with all entries off the anti-diagonal equal to zero. ||
| |
| |-
| |
| | [[Anti-Hermitian matrix]] || || Synonym for skew-Hermitian matrix.
| |
| |-
| |
| | [[Anti-symmetric matrix]] || || Synonym for skew-symmetric matrix.
| |
| |-
| |
| | [[Arrowhead matrix]] || A square matrix containing zeros in all entries except for the first row, first column, and main diagonal. ||
| |
| |-
| |
| | [[Band matrix]] || A square matrix whose non-zero entries are confined to a diagonal ''band''. ||
| |
| |-
| |
| | [[Bidiagonal matrix]] || A matrix with elements only on the main diagonal and either the superdiagonal or subdiagonal. || Sometimes defined differently, see article.
| |
| |-
| |
| | [[Logical matrix|Binary matrix]] || A matrix whose entries are all either 0 or 1. ||Synonym for (0,1)-matrix, Boolean matrix or logical matrix.<ref>{{Harvard citations | last1=Hogben |nb=yes|loc=Ch. 31.3| year=2006}}</ref>
| |
| |-
| |
| | [[Bisymmetric matrix]] || A square matrix that is symmetric with respect to its main diagonal and its main cross-diagonal. ||
| |
| |-
| |
| | [[Block-diagonal matrix]] || A [[block matrix]] with entries only on the diagonal. ||
| |
| |-
| |
| | [[Block matrix]] || A matrix partitioned in sub-matrices called blocks. ||
| |
| |-
| |
| | [[Block tridiagonal matrix]] || A block matrix which is essentially a tridiagonal matrix but with submatrices in place of scalar elements ||
| |
| |-
| |
| | [[Boolean matrix]] || A matrix whose entries are all either 0 or 1. || Synonym for (0,1)-matrix, binary matrix or logical matrix.
| |
| |-
| |
| | [[Cauchy matrix]] || A matrix whose elements are of the form 1/(''x<sub>i</sub>'' + ''y<sub>j</sub>'') for (''x<sub>i</sub>''), (''y<sub>j</sub>'') injective sequences (i.e., taking every value only once). ||
| |
| |-
| |
| | [[Centrosymmetric matrix]] || A matrix symmetric about its center; i.e., ''a''<sub>''ij''</sub> = ''a''<sub>''n''−''i''+1,''n''−''j''+1</sub> ||
| |
| |-
| |
| | [[Conference matrix]] || A square matrix with zero diagonal and +1 and −1 off the diagonal, such that C<sup>T</sup>C is a multiple of the identity matrix. ||
| |
| |-
| |
| | [[Complex Hadamard matrix]] || A matrix with all rows and columns mutually orthogonal, whose entries are unimodular. ||
| |
| |-
| |
| | [[Copositive matrix]] || A square matrix ''A'' with real coefficients, such that <math>f(x)=x^TAx</math> is nonnegative for every nonnegative vector ''x'' ||
| |
| |-
| |
| [[Diagonally dominant matrix]] || A matrix whose entries satisfy <math>|a_{ii}| > \sum_{j \neq i} |a_{ij}|</math>. ||
| |
| |-
| |
| | [[Diagonal matrix]] || A square matrix with all entries outside the [[main diagonal]] equal to zero. ||
| |
| |-
| |
| [[DFT matrix|Discrete Fourier Transform Matrix]] || Multiplying it by a vector gives the [[discrete Fourier transform]] of that vector. ||
| |
| |-
| |
| | [[Elementary matrix]] || A square matrix derived by applying an elementary row operation to the identity matrix. ||
| |
| |-
| |
| | [[Equivalent matrix]] || A matrix that can be derived from another matrix through a sequence of elementary row or column operations.||
| |
| |-
| |
| | [[Frobenius matrix]] || A square matrix in the form of an identity matrix but with arbitrary entries in one column below the main diagonal. ||
| |
| |-
| |
| | [[Generalized permutation matrix]] || A square matrix with precisely one nonzero element in each row and column. ||
| |
| |-
| |
| | [[Hadamard matrix]]|| A square matrix with entries +1, −1 whose rows are mutually orthogonal.||
| |
| |-
| |
| | [[Hankel matrix]] || A matrix with constant skew-diagonals; also an upside down Toeplitz matrix. || A square Hankel matrix is symmetric.
| |
| |-
| |
| | [[Hermitian matrix]] || A square matrix which is equal to its [[conjugate transpose]], ''A'' = ''A''<sup>*</sup>.||
| |
| |-
| |
| | [[Hessenberg matrix]] || An "almost" triangular matrix, for example, an upper Hessenberg matrix has zero entries below the first subdiagonal. ||
| |
| |-
| |
| | [[Hollow matrix]] || A square matrix whose main diagonal comprises only zero elements. ||
| |
| |-
| |
| | [[Integer matrix]] || A matrix whose entries are all integers. ||
| |
| |-
| |
| | [[Logical matrix]] || A matrix with all entries either 0 or 1. || Synonym for (0,1)-matrix, binary matrix or Boolean matrix. Can be used to represent a ''k''-adic [[relation (mathematics)|relation]].
| |
| |-
| |
| | [[Markov matrix]]|| A matrix of non-negative real numbers, such that the entries in each row sum to 1. ||
| |
| |-
| |
| | [[Metzler matrix]]|| A matrix whose off-diagonal entries are non-negative. ||
| |
| |-
| |
| | [[Monomial matrix]] || A square matrix with exactly one non-zero entry in each row and column. || Synonym for generalized permutation matrix.
| |
| |-
| |
| | [[Moore matrix]] ||A row consists of ''a'', ''a''<sup>''q''</sup>, ''a''<sup>''q''²</sup>, etc., and each row uses a different variable. ||
| |
| |-
| |
| | [[Nonnegative matrix]] || A matrix with all nonnegative entries. ||
| |
| |-
| |
| [[Partitioned matrix]] || A matrix partitioned into sub-matrices, or equivalently, a matrix whose entries are themselves matrices rather than scalars. || Synonym for block matrix.
| |
| |-
| |
| [[Parisi matrix]] || A block-hierarchical matrix. It consists of growing blocks placed along the diagonal, each of which is itself a Parisi matrix of smaller size. || In the theory of spin glasses it is also known as a replica matrix.
| |
| |-
| |
| | [[Pentadiagonal matrix]] || A matrix with the only nonzero entries on the main diagonal and the two diagonals just above and below the main one. ||
| |
| |-
| |
| | [[Permutation matrix]] || A matrix representation of a [[permutation]], a square matrix with exactly one 1 in each row and column, and all other elements 0. ||
| |
| |-
| |
| [[Persymmetric matrix]] || A matrix that is symmetric about its northeast-southwest diagonal, i.e., ''a''<sub>''ij''</sub> = ''a''<sub>''n''−''j''+1,''n''−''i''+1</sub>. ||
| |
| |-
| |
| | [[Polynomial matrix]] || A matrix whose entries are [[polynomial]]s. ||
| |
| |-
| |
| | [[Positive matrix]] || A matrix with all positive entries.||
| |
| |-
| |
| | [[Quaternionic matrix]] || A matrix whose entries are [[quaternion]]s. ||
| |
| |-
| |
| | [[Sign matrix]] || A matrix whose entries are either +1, 0, or −1.||
| |
| |-
| |
| | [[Signature matrix]] || A diagonal matrix where the diagonal elements are either +1 or −1.||
| |
| |-
| |
| | [[Skew-Hermitian matrix]] || A square matrix which is equal to the negative of its [[conjugate transpose]], ''A''<sup>*</sup> = −''A''. ||
| |
| |-
| |
| | [[Skew-symmetric matrix]] || A matrix which is equal to the negative of its [[transpose]], ''A''<sup>''T''</sup> = −''A''. ||
| |
| |-
| |
| | [[Skyline matrix]] || A rearrangement of the entries of a banded matrix which requires less space. ||
| |
| |-
| |
| | [[Sparse matrix]] || A matrix with relatively few non-zero elements. || Sparse matrix algorithms can tackle huge sparse matrices that are utterly impractical for dense matrix algorithms.
| |
| |-
| |
| | [[Sylvester matrix]] || A square matrix whose entries come from coefficients of two [[polynomials]]. || The Sylvester matrix is nonsingular if and only if the two polynomials are [[coprime]] to each other.
| |
| |-
| |
| | [[Symmetric matrix]] || A square matrix which is equal to its [[transpose]], ''A'' = ''A''<sup>T</sup> (''a''<sub>''i'',''j''</sub> = ''a''<sub>''j'',''i''</sub>).||
| |
| |-
| |
| | [[Toeplitz matrix]] || A matrix with constant diagonals. ||
| |
| |-
| |
| | [[Triangular matrix]] || A matrix with all entries above the main diagonal equal to zero (lower triangular) or with all entries below the main diagonal equal to zero (upper triangular).||
| |
| |-
| |
| | [[Tridiagonal matrix]] || A matrix with the only nonzero entries on the main diagonal and the diagonals just above and below the main one.||
| |
| |-
| |
| | [[Unitary matrix]] || A square matrix whose inverse is equal to its [[conjugate transpose]], ''A''<sup>−1</sup> = ''A''<sup>*</sup>. ||
| |
| |-
| |
| | [[Vandermonde matrix]] || A row consists of 1, ''a'', ''a''², ''a''³, etc., and each row uses a different variable.||
| |
| |-
| |
| | [[Walsh matrix]] || A square matrix, with dimensions a power of 2, the entries of which are +1 or -1.||
| |
| |-
| |
| [[Z-matrix (mathematics)|Z-matrix]] || A matrix whose off-diagonal entries are less than or equal to zero. ||
| |
| |}
| |
| | |
| ===Constant matrices===
| |
| | |
The list below comprises matrices whose elements are constant for any given dimension (size) of matrix. The matrix entries will be denoted ''a<sub>ij</sub>''. The table below uses the [[Kronecker delta]] δ<sub>''ij''</sub> for two integers ''i'' and ''j'', which is 1 if ''i'' = ''j'' and 0 otherwise.
| |
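For example, for ''n'' = 3 the symbolic description ''a<sub>ij</sub>'' = δ<sub>''n''+1−''i'',''j''</sub> (the exchange matrix listed below) yields

:<math>\begin{bmatrix} 0 & 0 & 1 \\ 0 & 1 & 0 \\ 1 & 0 & 0 \end{bmatrix}.</math>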
| | |
| {| class="wikitable sortable"
| |
| ! Name !! Explanation !! Symbolic description of the entries !! Notes
| |
| |-
| |
| |width=15%| [[Exchange matrix]] || A [[binary matrix]] with ones on the anti-diagonal, and zeroes everywhere else. || ''a<sub>ij</sub>'' = δ<sub>''n + 1 − i,j''</sub> || A [[permutation matrix]].
| |
| |-
| |
| | [[Hilbert matrix]] || || ''a''<sub>''ij''</sub> = (''i'' + ''j'' − 1)<sup>−1</sup>. || A [[Hankel matrix]].
| |
| |-
| |
| | [[Identity matrix]] || A square diagonal matrix, with all entries on the main diagonal equal to 1, and the rest 0 || ''a<sub>ij</sub>'' = δ<sub>''ij''</sub> ||
| |
| |-
| |
| | [[Lehmer matrix]] || || ''a<sub>ij</sub>'' = min(''i,j'') ÷ max(''i,j''). || A [[positive matrix|positive]] [[symmetric matrix]].
| |
| |-
| |
| | [[Matrix of ones]] || A matrix with all entries equal to one || ''a<sub>ij</sub>'' = 1. ||
| |
| |-
| |
| | [[Pascal matrix]] || A matrix containing the entries of [[Pascal's triangle]]. || ||
| |
| |-
| |
| | [[Pauli matrices]] || A set of three 2 × 2 complex Hermitian and unitary matrices. When combined with the ''I''<sub>2</sub> identity matrix, they form an orthogonal basis for the 2 × 2 complex Hermitian matrices. || ||
| |
| |-
| |
| | [[Redheffer matrix]] || || ''a''<sub>''ij''</sub> are 1 if ''i'' divides ''j'' or if ''j'' = 1; otherwise, ''a''<sub>''ij''</sub> = 0. || A (0, 1)-matrix.
| |
| |-
| |
| | [[Shift matrix]] || A matrix with ones on the superdiagonal or subdiagonal and zeroes elsewhere. || ''a<sub>ij</sub>'' = δ<sub>''i''+1,''j''</sub> or ''a<sub>ij</sub>'' = δ<sub>''i''−1,''j''</sub> || Multiplication by it shifts matrix elements by one position.
| |
| |-
| |
| | [[Zero matrix]] || A matrix with all entries equal to zero.|| ''a<sub>ij</sub>'' = 0. ||
| |
| |}
| |
| | |
| ==Matrices with conditions on eigenvalues or eigenvectors==
| |
| {| class="wikitable sortable"
| |
| ! Name !! Explanation !! Notes
| |
| |-
| |
| [[Companion matrix]] || A matrix constructed from a polynomial, whose eigenvalues are equal to the roots of that polynomial. ||
| |
| |-
| |
| | [[Convergent matrix]] || A square matrix whose successive powers approach the [[zero matrix]]. || Its [[eigenvalues and eigenvectors|eigenvalues]] have magnitude less than one.
| |
| |-
| |
| | [[Defective matrix]] || A square matrix that does not have a complete basis of [[eigenvectors]], and is thus not [[diagonalisable]]. ||
| |
| |-
| |
| | [[Diagonalizable matrix]] || A square matrix [[similar matrix|similar]] to a diagonal matrix. || It has an [[eigenbasis]], that is, a [[complete set]] of [[linearly independent]] eigenvectors.
| |
| |-
| |
| | [[Hurwitz matrix]] || A matrix whose eigenvalues have strictly negative real part. A stable system of differential equations may be represented by a Hurwitz matrix. ||
| |
| |-
| |
| | [[Positive-definite matrix]] || A Hermitian matrix with every eigenvalue positive. ||
| |
| |-
| |
| | [[Stability matrix]] || || Synonym for [[Hurwitz matrix]].
| |
| |-
| |
| | [[Stieltjes matrix]] || A real symmetric positive definite matrix with nonpositive off-diagonal entries. || Special case of an [[M-matrix]].
| |
| |}
| |
| | |
| ==Matrices satisfying conditions on products or inverses==
| |
A number of matrix-related notions concern the properties of products or inverses of the given matrix. The [[matrix product]] of an ''m''-by-''n'' matrix ''A'' and an ''n''-by-''k'' matrix ''B'' is the ''m''-by-''k'' matrix ''C'' given by
| |
| :<math> (C)_{i,j} = \sum_{r=1}^n A_{i,r}B_{r,j}.</math>
| |
This matrix product is denoted ''AB''. Unlike the product of numbers, matrix products are not [[commutative]], that is to say ''AB'' need not be equal to ''BA''. A number of notions are concerned with the failure of this commutativity. An [[inverse of a matrix|inverse]] of a square matrix ''A'' is a matrix ''B'' (necessarily of the same dimension as ''A'') such that ''AB'' = ''I''; equivalently, ''BA'' = ''I''. An inverse need not exist. If it exists, ''B'' is uniquely determined, and is also called ''the'' inverse of ''A'', denoted ''A''<sup>−1</sup>.
| |
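For example, for

:<math>A = \begin{bmatrix} 1 & 2 \\ 3 & 4 \end{bmatrix}, \qquad B = \begin{bmatrix} 0 & 1 \\ 1 & 0 \end{bmatrix},</math>

the two products are

:<math>AB = \begin{bmatrix} 2 & 1 \\ 4 & 3 \end{bmatrix} \ne \begin{bmatrix} 3 & 4 \\ 1 & 2 \end{bmatrix} = BA,</math>

so this particular pair of matrices does not commute.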
| | |
| {| class="wikitable sortable"
| |
| ! Name !! Explanation !! Notes
| |
| |-
| |
| | [[Matrix congruence|Congruent matrix]] || Two matrices ''A'' and ''B'' are congruent if there exists an invertible matrix ''P'' such that {{nowrap|''P''<sup>T</sup> ''A'' ''P''}} = ''B''. || Compare with similar matrices.
| |
| |-
| |
| [[Idempotent matrix]] or <br /> [[Projection (linear algebra)|Projection Matrix]] || A matrix that has the property ''A''² = ''AA'' = ''A''. || The name ''projection matrix'' comes from the observation that projecting a point onto a subspace (a plane or a line) several times gives the same result as [[Projection_(linear_algebra)#Properties_and_classification|projecting it once]].
| |
| |-
| |
| | [[Invertible matrix]] || A square matrix having a multiplicative [[inverse matrix|inverse]], that is, a matrix ''B'' such that ''AB'' = ''BA'' = ''I''. || Invertible matrices form the [[general linear group]].
| |
| |-
| |
| [[Involutory matrix]] || A square matrix which is its own inverse, i.e., ''AA'' = ''I''. || [[Signature matrix|Signature matrices]] and [[Householder_transformation#Definition_and_properties|Householder matrices]] (also known as reflection matrices, which reflect a point about a plane or line) have this property.
| |
| |-
| |
| | [[Nilpotent matrix]] || A square matrix satisfying ''A''<sup>''q''</sup> = 0 for some positive integer ''q''. || Equivalently, the only eigenvalue of ''A'' is 0.
| |
| |-
| |
| | [[Normal matrix]]|| A square matrix that commutes with its [[conjugate transpose]]: ''AA''<sup>∗</sup> = ''A''<sup>∗</sup>''A'' || They are the matrices to which the [[spectral theorem]] applies.
| |
| |-
| |
| | [[Orthogonal matrix]] || A matrix whose inverse is equal to its [[transpose]], ''A''<sup>−1</sup> = ''A''<sup>''T''</sup>. || They form the [[orthogonal group]].
| |
| |-
| |
| | [[Orthonormal matrix]] || A matrix whose columns are [[orthonormal]] vectors. ||
| |
| |-
| |
| | [[Singular matrix]] || A square matrix that is not invertible. ||
| |
| |-
| |
| | [[Unimodular matrix]] || An invertible matrix with entries in the integers ([[integer matrix]]) || Necessarily the determinant is +1 or −1.
| |
| |-
| |
| | [[Unipotent matrix]] || A square matrix with all eigenvalues equal to 1. || Equivalently, {{nowrap|''A'' − ''I''}} is nilpotent. See also [[unipotent group]].
| |
| |-
| |
| | [[Totally unimodular matrix]] || A matrix for which every non-singular square submatrix is [[unimodular matrix|unimodular]]. This has some implications in the [[linear programming]] [[linear programming relaxation|relaxation]] of an [[integer program]]. ||
| |
| |-
| |
| | [[Weighing matrix]] || A square matrix the entries of which are in {{nowrap|{0, 1, −1}}}, such that ''AA''<sup>T</sup> = ''wI'' for some positive integer ''w''. ||
| |
| |}
| |
| | |
| ==Matrices with specific applications==
| |
| | |
| {| class="wikitable sortable"
| |
| ! Name !! Explanation !! Used in !! Notes
| |
| |-
| |
| [[Adjugate matrix]] || The transpose of the matrix of [[cofactor (linear algebra)|cofactors]] (signed [[minor (linear algebra)|minors]]) of a given square matrix. || Calculating [[inverse matrix|inverse matrices]] via [[Laplace's formula]]. ||
| |
| |-
| |
| [[Alternating sign matrix]] || A square matrix with entries 0, 1 and −1 such that the sum of each row and column is 1 and the nonzero entries in each row and column alternate in sign. || [[Dodgson condensation]] to calculate determinants ||
| |
| |-
| |
| | [[Augmented matrix]] || A matrix whose rows are concatenations of the rows of two smaller matrices. || Calculating [[inverse matrix|inverse matrices]]. ||
| |
| |-
| |
| | [[Bézout matrix]] || A square matrix which may be used as a tool for the efficient location of polynomial zeros || [[Control theory]], [[Stable polynomial]]s ||
| |
| |-
| |
| | [[Carleman matrix]] || A matrix that converts composition of functions to multiplication of matrices. || ||
| |
| |-
| |
| | [[Cartan matrix]] || A matrix associated with a finite-dimensional [[associative algebra]], or a [[semisimple Lie algebra]] (the two meanings are distinct). || ||
| |
| |-
| |
| | [[Circulant matrix]] || A matrix where each row is a circular shift of its predecessor. || [[System of linear equations]], [[discrete Fourier transform]] ||
| |
| |-
| |
| [[Cofactor matrix]] || A matrix containing the [[cofactor (linear algebra)|cofactors]], i.e., signed [[minor (linear algebra)|minors]], of a given matrix. || ||
| |
| |-
| |
| | [[Commutation matrix]] || A matrix used for transforming the vectorized form of a matrix into the vectorized form of its transpose. || ||
| |
| |-
| |
| | [[Coxeter matrix]] || A matrix related to [[Coxeter groups]], which describe [[symmetry|symmetries]] in a structure or system.|| ||
| |
| |-
| |
| | [[Distance matrix]] ||A square matrix containing the distances, taken pairwise, of a set of [[point (geometry)|points]]. || [[Computer vision]], [[Network analysis (electronics)|network analysis]]. || See also [[Euclidean distance matrix]].
| |
| |-
| |
| | [[Duplication matrix]] || A linear transformation matrix used for transforming half-vectorizations of matrices into [[vectorization (mathematics)|vectorization]]s. || ||
| |
| |-
| |
| | [[Elimination matrix]] || A linear transformation matrix used for transforming [[vectorization (mathematics)|vectorization]]s of matrices into half-vectorizations. || ||
| |
| |-
| |
| | [[Euclidean distance matrix]] || A matrix that describes the pairwise distances between [[point (geometry)|points]] in [[Euclidean space]]. || || See also [[distance matrix]].
| |
| |-
| |
| | [[Fundamental matrix (linear differential equation)]] || A matrix containing the fundamental solutions of a linear [[ordinary differential equation]]. || ||
| |
| |-
| |
| | [[Generator matrix]] || A matrix whose rows generate all elements of a [[linear code]]. || [[Coding theory]] ||
| |
| |-
| |
| [[Gramian matrix]] || A matrix containing the pairwise [[inner product space|inner products]] of a set of vectors. || Testing [[linear independence]] of vectors, including ones in [[function space]]s. || They are real symmetric.
| |
| |-
| |
| | [[Hessian matrix]] || A square matrix of [[Partial derivative|second partial derivatives]] of a scalar-valued function. || Detecting [[local minimum|local minima]] and maxima of scalar-valued functions in several variables; [[Blob detection]] ([[computer vision]]) ||
| |
| |-
| |
| | [[Householder transformation|Householder matrix]] || A transformation matrix widely used in matrix algorithms. || [[QR decomposition]]. ||
| |
| |-
| |
| | [[Jacobian matrix]] || A matrix of first-order partial derivatives of a vector-valued function. || [[Implicit function theorem]]; [[Smooth morphism]]s ([[algebraic geometry]]). ||
| |
| |-
| |
| [[Payoff matrix]] || A matrix in [[game theory]] and [[economics]] that represents the payoffs in a [[normal form game]] where players move simultaneously. || ||
| |
| |-
| |
| | [[Pick matrix]] || A matrix that occurs in the study of analytical interpolation problems. || ||
| |
| |-
| |
| [[Random matrix]] || A matrix whose entries are random numbers drawn from some specified [[probability distribution]]. || ||
| |
| |-
| |
| | [[Rotation matrix]] || A matrix representing a rotational geometric transformation. || [[Special orthogonal group]], [[Euler angle]]s ||
| |
| |-
| |
| | [[Seifert matrix]] || A matrix in [[knot theory]], primarily for the algebraic analysis of topological properties of knots and links.|| [[Alexander polynomial]] ||
| |
| |-
| |
| | [[Shear matrix]]|| An elementary matrix whose corresponding geometric transformation is a [[shear transformation]]. || ||
| |
| |-
| |
| | [[Similarity matrix]] || A matrix of scores which express the similarity between two data points. || [[Sequence alignment]] ||
| |
| |-
| |
| | [[Symplectic matrix]] || A square matrix preserving a standard skew-symmetric form. || [[Symplectic group]], [[symplectic manifold]]. ||
| |
| |-
| |
| [[Totally positive matrix]] || A matrix in which the determinants of all square submatrices are positive. || Generating the reference points of [[Bézier curve]]s in [[computer graphics]]. ||
| |
| |-
| |
| | [[Transformation matrix]] || A matrix representing a [[linear transformation]], often from one co-ordinate space to another to facilitate a geometric transform or projection.|| ||
| |
| |-
| |
| | [[Wedderburn matrix]] || A matrix of the form <math>A - (y^T A x)^{-1} A x y^T A</math>, used for rank-reduction & biconjugate decompositions || Analysis of matrix decompositions ||
| |
| |}
| |
| | |
| *[[Derogatory matrix]] — a square ''n×n'' matrix whose [[Minimal polynomial (linear algebra)|minimal polynomial]] is of order less than ''n''.
| |
| *[[Moment matrix]] — a symmetric matrix whose elements are the products of common row/column index dependent [[monomials]].
| |
| *[[X-Y-Z matrix]] — a generalisation of the (rectangular) matrix to a cuboidal form (a 3-dimensional array of entries).
| |
| | |
| ==Matrices used in statistics==
| |
| The following matrices find their main application in [[statistics]] and [[probability theory]].
| |
| *[[Bernoulli matrix]] — a square matrix with entries +1, −1, with equal [[probability]] of each.
| |
| *[[Centering matrix]] — a matrix which, when multiplied with a vector, has the same effect as subtracting the mean of the components of the vector from every component.
| |
| *[[Correlation matrix]] — a symmetric ''n×n'' matrix, formed by the pairwise [[Pearson product-moment correlation coefficient|correlation coefficient]]s of several [[random variable]]s.
| |
| *[[Covariance matrix]] — a symmetric ''n×n'' matrix, formed by the pairwise [[covariance]]s of several random variables. Sometimes called a ''dispersion matrix''.
| |
| *[[Dispersion matrix]] — another name for a ''covariance matrix''.
| |
| *[[Doubly stochastic matrix]] — a non-negative matrix such that each row and each column sums to 1 (thus the matrix is both ''left stochastic'' and ''right stochastic'')
| |
| *[[Fisher information matrix]] — a matrix representing the variance of the partial derivative, with respect to a parameter, of the log of the likelihood function of a random variable.
| |
| *[[Hat matrix]] - a square matrix used in statistics to relate fitted values to observed values.
| |
| *[[Precision matrix]] — a symmetric ''n×n'' matrix, formed by inverting the ''covariance matrix''. Also called the ''information matrix''.
| |
| *[[Stochastic matrix]] — a [[non-negative]] matrix describing a [[stochastic process]]. The sum of entries of any row is one.
| |
| *[[Transition matrix]] — a matrix representing the [[probabilities]] of conditions changing from one state to another in a [[Markov chain]]
| |
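For example, the 2 × 2 centering matrix is

:<math>C_2 = I_2 - \tfrac{1}{2}J_2 = \begin{bmatrix} \tfrac{1}{2} & -\tfrac{1}{2} \\ -\tfrac{1}{2} & \tfrac{1}{2} \end{bmatrix},</math>

where ''J''<sub>2</sub> is the 2 × 2 [[matrix of ones]]; applying it to a vector (''x''<sub>1</sub>, ''x''<sub>2</sub>) gives (''x''<sub>1</sub> − ''m'', ''x''<sub>2</sub> − ''m''), where ''m'' = (''x''<sub>1</sub> + ''x''<sub>2</sub>)/2 is the mean of the components.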
| | |
| ==Matrices used in graph theory==
| |
| The following matrices find their main application in [[graph theory|graph]] and [[network theory]].
| |
| *[[Adjacency matrix]] — a square matrix representing a graph, with ''a<sub>ij</sub>'' non-zero if vertex ''i'' and vertex ''j'' are adjacent.
| |
| *[[Biadjacency matrix]] — a special class of [[adjacency matrix]] that describes adjacency in [[bipartite graph]]s.
| |
| *[[Degree matrix]] — a diagonal matrix defining the degree of each [[vertex (graph theory)|vertex]] in a graph.
| |
| *[[Edmonds matrix]] — a square matrix of a bipartite graph.
| |
| *[[Incidence matrix]] — a matrix representing a relationship between two classes of objects (usually [[vertex (graph theory)|vertices]] and [[edge (graph theory)|edges]] in the context of graph theory).
| |
| *[[Laplacian matrix]] — a matrix equal to the degree matrix minus the adjacency matrix for a graph, used to find the number of spanning trees in the graph.
| |
| *[[Seidel adjacency matrix]] — a matrix similar to the usual [[adjacency matrix]] but with −1 for adjacency; +1 for nonadjacency; 0 on the diagonal.
| |
| *[[Tutte matrix]] — a generalisation of the Edmonds matrix for a balanced bipartite graph.
| |
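For example, for the [[path graph]] on three vertices (with edges between vertices 1 and 2 and between vertices 2 and 3), the adjacency matrix ''A'', the degree matrix ''D'' and the Laplacian matrix ''D'' − ''A'' mentioned above are

:<math>A = \begin{bmatrix} 0 & 1 & 0 \\ 1 & 0 & 1 \\ 0 & 1 & 0 \end{bmatrix}, \qquad D = \begin{bmatrix} 1 & 0 & 0 \\ 0 & 2 & 0 \\ 0 & 0 & 1 \end{bmatrix}, \qquad D - A = \begin{bmatrix} 1 & -1 & 0 \\ -1 & 2 & -1 \\ 0 & -1 & 1 \end{bmatrix}.</math>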
| | |
| ==Matrices used in science and engineering==
| |
| *[[Cabibbo-Kobayashi-Maskawa matrix]] — a unitary matrix used in [[particle physics]] to describe the strength of ''flavour-changing'' weak decays.
| |
| *[[Density matrix]] — a matrix describing the statistical state of a quantum system. [[Hermitian matrix|Hermitian]], [[non-negative matrix|non-negative]] and with [[trace (linear algebra)|trace]] 1.
| |
| *[[Fundamental matrix (computer vision)]] — a 3 × 3 matrix in [[computer vision]] that relates corresponding points in stereo images.
| |
| *[[Fuzzy associative matrix]] — a matrix in [[artificial intelligence]], used in machine learning processes.
| |
| *[[Gamma matrices]] — 4 × 4 matrices in [[quantum field theory]].
| |
| *[[Gell-Mann matrices]] — a generalisation of the [[Pauli matrices]], these matrices are one notable representation of the [[Lie group#The Lie algebra associated to a Lie group|infinitesimal generator]]s of the [[special unitary group]], SU(3).
| |
| *[[Hamiltonian matrix]] — a matrix used in a variety of fields, including [[quantum mechanics]] and [[linear quadratic regulator]] (LQR) systems.
| |
| *[[Irregular matrix]] — a matrix used in [[computer science]] which has a varying number of elements in each row.
| |
| *[[Overlap matrix]] — a type of [[Gramian matrix]], used in [[quantum chemistry]] to describe the inter-relationship of a set of [[basis vector]]s of a [[Quantum mechanics|quantum]] system.
| |
| *[[S matrix]] — a matrix in [[quantum mechanics]] that connects asymptotic (infinite past and future) particle states.
| |
| *[[State-transition matrix|State transition matrix]] — Exponent of state matrix in control systems.
| |
| *[[Substitution matrix]] — a matrix from [[bioinformatics]], which describes mutation rates of [[amino acid]] or [[DNA]] sequences.
| |
| *[[Z-matrix (chemistry)|Z-matrix]] — a matrix in [[chemistry]], representing a molecule in terms of its relative atomic geometry.
| |
| | |
| ==Other matrix-related terms and definitions==
| |
| | |
| *[[Jordan canonical form]] — an 'almost' diagonalised matrix, where the only non-zero elements appear on the lead and super-diagonals.
| |
| *[[Linear independence]] — two or more [[coordinate vector|vectors]] are linearly independent if there is no way to construct one from [[linear combination]]s of the others.
| |
| *[[Matrix exponential]] — defined by the [[Exponential function#Formal definition|exponential series]].
| |
| *[[Matrix representation of conic sections]]
| |
| *[[Pseudoinverse]] — a generalization of the [[inverse matrix]].
| |
| *[[Quaternionic matrix]] - matrix using quaternions as numbers
| |
| *[[Row echelon form]] — a matrix in this form is the result of applying the ''forward elimination'' procedure to a matrix (as used in [[Gaussian elimination]]).
| |
| *[[Wronskian]] — the determinant of a matrix of functions and their derivatives such that row ''n'' is the ''(n-1)''<sup>th</sup> derivative of row one.
| |
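For example, the Wronskian of the functions sin ''x'' and cos ''x'' is

:<math>W(x) = \det\begin{bmatrix} \sin x & \cos x \\ \cos x & -\sin x \end{bmatrix} = -\sin^2 x - \cos^2 x = -1.</math>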
| | |
| ==See also==
| |
| *[[Perfect matrix]]
| |
| {{Portal|Mathematics}}
| |
| | |
| ==Notes==
| |
| <references/>
| |
| | |
| ==References==
| |
| * {{Citation | last1=Hogben | first1=Leslie | title=Handbook of Linear Algebra (Discrete Mathematics and Its Applications) | publisher=Chapman & Hall/CRC | location=Boca Raton | isbn=978-1-58488-510-8 | year=2006}}
| |
| | |
| {{DEFAULTSORT:List Of Matrices}}
| |
| [[Category:Mathematics-related lists|Matrices]]
| |
| [[Category:Matrices]]
| |