QR algorithm

In [[numerical linear algebra]], the '''QR algorithm''' is an [[eigenvalue algorithm]]: that is, a procedure to calculate the [[eigenvalues and eigenvectors]] of a [[Matrix (mathematics)|matrix]]. The QR transformation was developed in the late 1950s by [[John G.F. Francis]] (England) and by [[Vera N. Kublanovskaya]] (USSR), working independently.<ref>
J.G.F. Francis, "The QR Transformation, I", ''The Computer Journal'', vol. 4, no. 3, pages 265–271 (1961, received Oct 1959) [http://comjnl.oxfordjournals.org/cgi/content/abstract/4/3/265 online at oxfordjournals.org];<br />
J.G.F. Francis, "The QR Transformation, II", ''The Computer Journal'', vol. 4, no. 4, pages 332–345 (1962) [http://comjnl.oxfordjournals.org/cgi/content/abstract/4/4/332 online at oxfordjournals.org].<br />
Vera N. Kublanovskaya, "On some algorithms for the solution of the complete eigenvalue problem," ''USSR Computational Mathematics and Mathematical Physics'', vol. 1, no. 3, pages 637–657 (1963, received Feb 1961). Also published in: ''Zhurnal Vychislitel'noi Matematiki i Matematicheskoi Fiziki'', vol.1, no. 4, pages 555–570 (1961).</ref>  The basic idea is to perform a [[QR decomposition]], writing the matrix as a product of an [[orthogonal matrix]] and an upper [[triangular matrix]], multiply the factors in the reverse order, and iterate.
 
==The practical QR algorithm==
Formally, let ''A'' be a real matrix whose eigenvalues we want to compute, and let ''A''<sub>0</sub> := ''A''. At the ''k''-th step (starting with ''k'' = 0), we compute the QR decomposition ''A''<sub>''k''</sub> = ''Q''<sub>''k''</sub>''R''<sub>''k''</sub>, where ''Q''<sub>''k''</sub> is an [[orthogonal matrix]] and ''R''<sub>''k''</sub> is an upper triangular matrix. We then form ''A''<sub>''k''+1</sub> = ''R''<sub>''k''</sub>''Q''<sub>''k''</sub>. Note that
:<math> A_{k+1} = R_k Q_k = Q_k^T Q_k R_k Q_k = Q_k^T A_k Q_k = Q_k^{-1} A_k Q_k, </math>
so all the ''A''<sub>''k''</sub> are [[Similar matrix|similar]] and hence they have the same eigenvalues. The algorithm is [[numerical stability|numerically stable]] because it proceeds by ''orthogonal'' similarity transforms.
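
The iteration above can be sketched directly in a few lines of NumPy. This is a minimal illustration of the unshifted iteration, not a production implementation; the function name and the fixed iteration count are arbitrary choices for the example.

<syntaxhighlight lang="python">
import numpy as np

def unshifted_qr(A, num_steps=200):
    """Run the basic iteration A_{k+1} = R_k Q_k on a copy of A.

    Each step is an orthogonal similarity transform, so every iterate
    has the same eigenvalues as the original matrix.
    """
    Ak = np.array(A, dtype=float)
    for _ in range(num_steps):
        Q, R = np.linalg.qr(Ak)   # A_k = Q_k R_k
        Ak = R @ Q                # A_{k+1} = R_k Q_k = Q_k^T A_k Q_k
    return Ak

A = np.array([[4.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])
print(np.sort(np.diag(unshifted_qr(A))))   # approximate eigenvalues
print(np.sort(np.linalg.eigvalsh(A)))      # reference values
</syntaxhighlight>

When eigenvalues are close in magnitude this plain iteration converges slowly, which is what the shifting strategies discussed below are designed to address.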
 
Under certain conditions,<ref>Golub, G. H. and Van Loan, C. F.: Matrix Computations, 3rd ed., Johns Hopkins University Press, Baltimore, 1996, ISBN 0-8018-5414-8.</ref> the matrices ''A''<sub>''k''</sub> converge to a triangular matrix, the [[Schur form]] of ''A''. The eigenvalues of a triangular matrix are listed on the diagonal, and the eigenvalue problem is solved. In testing for convergence it is impractical to require exact zeros, but the [[Gershgorin circle theorem]] provides a bound on the error.
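
For the symmetric case, where the iterates tend to a diagonal matrix, a concrete test in this spirit is to look at the Gershgorin radii of the current iterate: every eigenvalue lies in the union of discs centred at the diagonal entries, each with radius equal to the sum of the absolute off-diagonal entries in its row. The sketch below assumes that convention; the tolerance is an arbitrary choice.

<syntaxhighlight lang="python">
import numpy as np

def gershgorin_radii(Ak):
    """Radius of each Gershgorin disc: sum of off-diagonal magnitudes per row."""
    return np.sum(np.abs(Ak), axis=1) - np.abs(np.diag(Ak))

def has_converged(Ak, tol=1e-10):
    """Accept the diagonal as eigenvalue estimates once all discs are tiny."""
    return np.all(gershgorin_radii(Ak) < tol)
</syntaxhighlight>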
 
In this crude form the iterations are relatively expensive. This can be mitigated by first bringing the matrix ''A'' to upper [[Hessenberg form]] with a finite sequence of orthogonal similarity transforms, somewhat like a two-sided QR decomposition; this reduction costs <math>\begin{matrix}\frac{10}{3}\end{matrix} n^3 + O(n^2)</math> arithmetic operations using a technique based on [[Householder transformation|Householder reduction]].<ref name=Demmel>[[James W. Demmel]], ''Applied Numerical Linear Algebra'' (SIAM, 1997).</ref><ref name=Trefethen>[[Lloyd N. Trefethen]] and David Bau, ''Numerical Linear Algebra'' (SIAM, 1997).</ref> (For QR decomposition, the Householder reflectors are multiplied only on the left, but for the Hessenberg reduction they are multiplied on both the left and the right.) Determining the QR decomposition of an upper Hessenberg matrix costs <math>6 n^2 + O(n)</math> arithmetic operations. Moreover, because the Hessenberg form is already nearly upper-triangular (it has just one nonzero entry below the diagonal in each column), using it as a starting point reduces the number of steps required for convergence of the QR algorithm.
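
The Householder reduction to Hessenberg form can be sketched as follows; each step uses one reflector to zero the entries below the first subdiagonal of one column, applied from both the left and the right so that the result stays similar to ''A''. This is a minimal sketch (a library version is available as <code>scipy.linalg.hessenberg</code>); the function name is arbitrary.

<syntaxhighlight lang="python">
import numpy as np

def hessenberg_reduce(A):
    """Reduce A to upper Hessenberg form H = Q^T A Q by Householder reflectors.

    Returns (H, Q) with Q orthogonal and Q @ H @ Q.T close to the input.
    """
    H = np.array(A, dtype=float)
    n = H.shape[0]
    Q = np.eye(n)
    for k in range(n - 2):
        x = H[k + 1:, k]
        v = x.copy()
        v[0] += np.copysign(np.linalg.norm(x), x[0])
        vnorm = np.linalg.norm(v)
        if vnorm == 0.0:
            continue                      # column is already in the required form
        v /= vnorm                        # unit Householder vector
        # Apply P = I - 2 v v^T from the left (rows k+1:) ...
        H[k + 1:, k:] -= 2.0 * np.outer(v, v @ H[k + 1:, k:])
        # ... and from the right (columns k+1:), preserving similarity.
        H[:, k + 1:] -= 2.0 * np.outer(H[:, k + 1:] @ v, v)
        Q[:, k + 1:] -= 2.0 * np.outer(Q[:, k + 1:] @ v, v)
    return H, Q
</syntaxhighlight>

A quick check of the similarity relation is <code>H, Q = hessenberg_reduce(A); np.allclose(Q @ H @ Q.T, A)</code>.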
 
If the original matrix is [[symmetric matrix|symmetric]], then the upper Hessenberg matrix is also symmetric and thus [[tridiagonal matrix|tridiagonal]], and so are all the ''A''<sub>''k''</sub>. This procedure costs <math>\begin{matrix}\frac{4}{3}\end{matrix} n^3 + O(n^2)</math> arithmetic operations using  a technique based on Householder reduction.<ref name=Demmel/><ref name=Trefethen/> Determining the QR decomposition of a symmetric tridiagonal matrix costs <math>O(n)</math> operations.<ref>James M. Ortega and Henry F. Kaiser, "The ''LL<sup>T</sup>'' and ''QR'' methods for symmetric tridiagonal matrices," ''The Computer Journal'' '''6''' (1), 99–101 (1963).</ref>
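
That the Hessenberg form of a symmetric matrix is tridiagonal is easy to verify numerically. The snippet below assumes the <code>hessenberg_reduce</code> sketch from the previous section is in scope; the matrix size and random seed are arbitrary.

<syntaxhighlight lang="python">
import numpy as np

rng = np.random.default_rng(0)
M = rng.standard_normal((6, 6))
S = (M + M.T) / 2.0                      # a random symmetric matrix

T, Q = hessenberg_reduce(S)              # from the sketch above
# All entries more than one position away from the diagonal vanish,
# i.e. the Hessenberg form of a symmetric matrix is tridiagonal.
mask = np.abs(np.subtract.outer(np.arange(6), np.arange(6))) > 1
print(np.allclose(T[mask], 0.0))         # True (up to rounding)
print(np.allclose(T, T.T))               # symmetry is preserved
</syntaxhighlight>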
 
The rate of convergence depends on the separation between eigenvalues, so a practical algorithm will use shifts, either explicit or implicit, to increase separation and accelerate convergence. A typical symmetric QR algorithm isolates each eigenvalue (then reduces the size of the matrix) with only one or two iterations, making it efficient as well as robust.{{clarify|date=June 2012}}
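
The following is a sketch of an explicitly shifted QR iteration with deflation for a real symmetric matrix, using the Wilkinson shift taken from the trailing 2×2 block. The tolerance, the function name and the restriction to the symmetric case are assumptions of this illustration, not part of the general algorithm.

<syntaxhighlight lang="python">
import numpy as np

def shifted_qr_eigenvalues(A, tol=1e-12):
    """Explicit single-shift QR with deflation for a real symmetric matrix."""
    Ak = np.array(A, dtype=float)
    n = Ak.shape[0]
    eigenvalues = []
    while n > 1:
        # Iterate on the leading n-by-n block until its last subdiagonal entry is negligible.
        while abs(Ak[n - 1, n - 2]) > tol * (abs(Ak[n - 1, n - 1]) + abs(Ak[n - 2, n - 2])):
            # Wilkinson shift: the eigenvalue of the trailing 2x2 block
            # closest to the corner entry.
            d = (Ak[n - 2, n - 2] - Ak[n - 1, n - 1]) / 2.0
            b = Ak[n - 1, n - 2]
            s = 1.0 if d >= 0.0 else -1.0
            mu = Ak[n - 1, n - 1] - s * b * b / (abs(d) + np.hypot(d, b))
            Q, R = np.linalg.qr(Ak[:n, :n] - mu * np.eye(n))
            Ak[:n, :n] = R @ Q + mu * np.eye(n)
        eigenvalues.append(Ak[n - 1, n - 1])   # deflate: one eigenvalue has converged
        n -= 1
    eigenvalues.append(Ak[0, 0])
    return np.array(eigenvalues)
</syntaxhighlight>

In this sketch each eigenvalue typically needs only a handful of iterations before the corresponding subdiagonal entry is negligible and the problem deflates, which is the behaviour described above.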
 
== The implicit QR algorithm ==
In modern computational practice,<ref>Golub and Van Loan, chapter 7</ref> the QR algorithm is performed in an implicit version which makes it easier to introduce multiple shifts. The matrix is first brought to upper Hessenberg form <math>A_0=QAQ^T</math> as in the explicit version; then, at each step, the first column of <math>A_k</math> is transformed via a small-size Householder similarity transformation to the first column of <math>p(A_k)</math> (or <math>p(A_k)e_1</math>), where <math>p(A_k)</math>, of degree <math>r</math>, is the polynomial that defines the shifting strategy (often <math>p(x)=(x-\lambda)(x-\bar\lambda)</math>, where <math>\lambda</math> and <math>\bar\lambda</math> are the two eigenvalues of the trailing <math>2 \times 2</math> principal submatrix of <math>A_k</math>, the so-called ''implicit double-shift''). Successive Householder transformations of size <math>r+1</math> are then performed in order to return the working matrix <math>A_k</math> to upper Hessenberg form. This operation is known as ''bulge chasing'', due to the peculiar shape of the non-zero entries of the matrix along the steps of the algorithm. As in the first version, deflation is performed as soon as one of the sub-diagonal entries of <math>A_k</math> is sufficiently small.
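
A full implicit double-shift step with bulge chasing is somewhat long, but its starting point is easy to show: for an upper Hessenberg <math>H</math>, the first column of <math>p(H) = (H-\lambda)(H-\bar\lambda) = H^2 - sH + tI</math>, where <math>s</math> and <math>t</math> are the trace and determinant of the trailing <math>2 \times 2</math> block, has only its first three entries nonzero, and those three numbers determine the small Householder reflector that creates the bulge. The sketch below covers this setup only (chasing the bulge back to Hessenberg form is not shown), and the function name is arbitrary.

<syntaxhighlight lang="python">
import numpy as np

def francis_first_column(H):
    """First three entries of (H^2 - s H + t I) e_1 for upper Hessenberg H (n >= 3).

    s and t are the trace and determinant of the trailing 2x2 block, so the
    two shifts are its eigenvalues without forming them explicitly (they may
    be a complex-conjugate pair, yet x, y, z below stay real).
    """
    n = H.shape[0]
    s = H[n - 2, n - 2] + H[n - 1, n - 1]                     # trace
    t = (H[n - 2, n - 2] * H[n - 1, n - 1]
         - H[n - 2, n - 1] * H[n - 1, n - 2])                 # determinant
    x = H[0, 0] * H[0, 0] + H[0, 1] * H[1, 0] - s * H[0, 0] + t
    y = H[1, 0] * (H[0, 0] + H[1, 1] - s)
    z = H[1, 0] * H[2, 1]
    return np.array([x, y, z])
</syntaxhighlight>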
 
===Renaming proposal===
Since in the modern implicit version of the procedure no [[QR decomposition]]s are explicitly performed, some authors, for instance Watkins,<ref>Watkins, David S.: The Matrix Eigenvalue Problem: GR and Krylov Subspace Methods, SIAM, Philadelphia, PA, USA, 2007, ISBN 0-89871-641-1, ISBN 978-0-89871-641-2</ref> suggested changing its name to ''Francis algorithm''. [[Gene H. Golub|Golub]] and [[Charles F. Van Loan|Van Loan]] use the term ''Francis QR step''.
 
== Interpretation and convergence ==
The QR algorithm can be seen as a more sophisticated variation of the basic "power" [[eigenvalue algorithm]]. Recall that the power algorithm repeatedly multiplies ''A'' times a single vector, normalizing after each iteration. The vector converges to an eigenvector associated with the largest-magnitude eigenvalue. Instead, the QR algorithm works with a complete basis of vectors, using QR decomposition to renormalize (and orthogonalize). For a symmetric matrix ''A'', upon convergence, ''AQ'' = ''Q&Lambda;'', where ''&Lambda;'' is the diagonal matrix of eigenvalues to which ''A'' converged, and where ''Q'' is the product of all the orthogonal similarity transforms required to get there. Thus the columns of ''Q'' are the eigenvectors.
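
This interpretation can be checked numerically by accumulating the orthogonal factors of the unshifted iteration for a small symmetric matrix. The sketch below builds on the earlier example; the matrix and the iteration count are arbitrary.

<syntaxhighlight lang="python">
import numpy as np

A = np.array([[4.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])

Ak = A.copy()
Qacc = np.eye(3)                 # product of all Q_k so far
for _ in range(200):
    Q, R = np.linalg.qr(Ak)
    Ak = R @ Q
    Qacc = Qacc @ Q

Lam = np.diag(np.diag(Ak))       # Ak is (nearly) diagonal on convergence
print(np.allclose(A @ Qacc, Qacc @ Lam, atol=1e-6))   # A Q = Q Lambda
print(np.allclose(Qacc.T @ Qacc, np.eye(3)))          # columns are orthonormal
</syntaxhighlight>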
 
== History ==
The QR algorithm was preceded by the ''LR algorithm'', which uses the [[LU decomposition]] instead of the QR decomposition. The QR algorithm is more stable, so the LR algorithm is rarely used nowadays. However, it represents an important step in the development of the QR algorithm.
 
The LR algorithm was developed in the early 1950s by [[Heinz Rutishauser]], who worked at that time as a research assistant of [[Eduard Stiefel]] at [[ETH Zurich]]. Stiefel suggested that Rutishauser use the sequence of moments ''y''<sub>0</sub><sup>T</sup> ''A''<sup>''k''</sup> ''x''<sub>0</sub>, ''k'' = 0, 1, … (where ''x''<sub>0</sub> and ''y''<sub>0</sub> are arbitrary vectors) to find the eigenvalues of ''A''. Rutishauser took an algorithm of [[Alexander Aitken]] for this task and developed it into the ''quotient&ndash;difference algorithm'' or ''qd algorithm''. After arranging the computation in a suitable shape, he discovered that the qd algorithm is in fact the iteration ''A''<sub>''k''</sub> = ''L''<sub>''k''</sub>''U''<sub>''k''</sub> (LU decomposition), ''A''<sub>''k''+1</sub> = ''U''<sub>''k''</sub>''L''<sub>''k''</sub>, applied to a tridiagonal matrix, from which the LR algorithm follows.<ref>{{Citation | last1=Parlett | first1=Beresford N. | last2=Gutknecht | first2=Martin H. | title=From qd to LR, or, how were the qd and LR algorithms discovered? | doi=10.1093/imanum/drq003 | year=2011 | journal=IMA Journal of Numerical Analysis | issn=0272-4979 | volume=31 | pages=741–754}}</ref>
 
== Other variants ==
One variant of the ''QR algorithm'', the ''Golub–Kahan–Reinsch algorithm'', starts by reducing a general matrix to a bidiagonal one.<ref>Bochkanov Sergey Anatolyevich. ALGLIB User Guide - General Matrix operations - Singular value decomposition. ALGLIB Project. 2010-12-11. URL:http://www.alglib.net/matrixops/general/svd.php. Accessed: 2010-12-11. (Archived by WebCite at http://www.webcitation.org/5utO4iSnR)</ref> This variant was first described by {{harvtxt|Golub|Kahan|1965}}. The [[LAPACK]] subroutine [http://www.netlib.org/lapack/double/dbdsqr.f DBDSQR] implements this iterative method, with some modifications to cover the case where the singular values are very small {{harv|Demmel|Kahan|1990}}. Together with a first step using Householder reflections and, if appropriate, [[QR decomposition]], this forms the [http://www.netlib.org/lapack/double/dgesvd.f  DGESVD] routine for the computation of the [[singular value decomposition]].
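
From SciPy, the QR-iteration-based LAPACK route described above can be requested explicitly via the <code>lapack_driver</code> argument of <code>scipy.linalg.svd</code>; this usage sketch assumes <code>'gesvd'</code> selects the xGESVD path rather than the default divide-and-conquer driver.

<syntaxhighlight lang="python">
import numpy as np
from scipy.linalg import svd

rng = np.random.default_rng(0)
A = rng.standard_normal((5, 3))

# Request the QR-iteration-based LAPACK routine (xGESVD) instead of the
# default divide-and-conquer driver (xGESDD).
U, s, Vt = svd(A, lapack_driver='gesvd')
print(np.allclose(A, U[:, :3] * s @ Vt))   # reconstructs A from its SVD
</syntaxhighlight>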
 
== References ==
<references/>
 
== External links ==
* {{planetmath reference|id=1474|title=Eigenvalue problem}}
* [http://www.math.umn.edu/~olver/aims_/qr.pdf Prof. Peter Olver's notes on orthogonal bases and the workings of the QR algorithm]
* [http://math.fullerton.edu/mathews/n2003/QRMethodMod.html Module for the QR Method]
 
{{Numerical linear algebra}}
 
{{DEFAULTSORT:Qr Algorithm}}
[[Category:Numerical linear algebra]]
