<?xml version="1.0"?>
<feed xmlns="http://www.w3.org/2005/Atom" xml:lang="en">
	<id>https://en.formulasearchengine.com/index.php?action=history&amp;feed=atom&amp;title=Kylee</id>
	<title>Kylee - Revision history</title>
	<link rel="self" type="application/atom+xml" href="https://en.formulasearchengine.com/index.php?action=history&amp;feed=atom&amp;title=Kylee"/>
	<link rel="alternate" type="text/html" href="https://en.formulasearchengine.com/index.php?title=Kylee&amp;action=history"/>
	<updated>2026-04-17T23:47:50Z</updated>
	<subtitle>Revision history for this page on the wiki</subtitle>
	<generator>MediaWiki 1.43.0-wmf.28</generator>
	<entry>
		<id>https://en.formulasearchengine.com/index.php?title=Kylee&amp;diff=17186&amp;oldid=prev</id>
		<title>en&gt;BG19bot: /* Covers on YouTube */WP:CHECKWIKI error fix for #61.  Punctuation goes before References. Do general fixes if a problem exists. - using AWB (9876)</title>
		<link rel="alternate" type="text/html" href="https://en.formulasearchengine.com/index.php?title=Kylee&amp;diff=17186&amp;oldid=prev"/>
		<updated>2014-01-27T06:02:57Z</updated>

		<summary type="html">&lt;p&gt;&lt;span class=&quot;autocomment&quot;&gt;Covers on YouTube: &lt;/span&gt;&lt;a href=&quot;/index.php?title=WP:CHECKWIKI&amp;amp;action=edit&amp;amp;redlink=1&quot; class=&quot;new&quot; title=&quot;WP:CHECKWIKI (page does not exist)&quot;&gt;WP:CHECKWIKI&lt;/a&gt; error fix for #61.  Punctuation goes before References. Do &lt;a href=&quot;https://en.wikipedia.org/wiki/GENFIXES&quot; class=&quot;extiw&quot; title=&quot;wikipedia:GENFIXES&quot;&gt;general fixes&lt;/a&gt; if a problem exists. - using &lt;a href=&quot;/index.php?title=Testwiki:AWB&amp;amp;action=edit&amp;amp;redlink=1&quot; class=&quot;new&quot; title=&quot;Testwiki:AWB (page does not exist)&quot;&gt;AWB&lt;/a&gt; (9876)&lt;/p&gt;
&lt;p&gt;&lt;b&gt;New page&lt;/b&gt;&lt;/p&gt;&lt;div&gt;In the [[mathematics|mathematical]] fields of [[geometry]] and [[linear algebra]], a &amp;#039;&amp;#039;&amp;#039;principal axis&amp;#039;&amp;#039;&amp;#039; is a certain line in a [[Euclidean space]] associated to an [[ellipsoid]] or [[hyperboloid]], generalizing the [[ellipse|major and minor axes of an ellipse]].  The &amp;#039;&amp;#039;principal axis theorem&amp;#039;&amp;#039; states that the principal axes are perpendicular, and gives a constructive procedure for finding them.&lt;br /&gt;
&lt;br /&gt;
Mathematically, the principal axis theorem is a generalization of the method of [[completing the square]] from [[elementary algebra]].  In [[linear algebra]] and [[functional analysis]], the principal axis theorem is a geometrical counterpart of the [[spectral theorem]].  It has applications to the [[statistics]] of [[principal components analysis]] and the [[singular value decomposition]].  In [[physics]], the theorem is fundamental to the study of [[angular momentum]].&lt;br /&gt;
&lt;br /&gt;
==Motivation==&lt;br /&gt;
The equations in the [[Cartesian plane]] &amp;#039;&amp;#039;&amp;#039;R&amp;#039;&amp;#039;&amp;#039;&amp;lt;sup&amp;gt;2&amp;lt;/sup&amp;gt;:&lt;br /&gt;
:&amp;lt;math&amp;gt;\frac{x^2}{9}+\frac{y^2}{25}=1&amp;lt;/math&amp;gt;&lt;br /&gt;
:&amp;lt;math&amp;gt;{}\frac{x^2}{9}-\frac{y^2}{25}=1&amp;lt;/math&amp;gt;&lt;br /&gt;
define, respectively, an ellipse and a hyperbola.  In each case, the &amp;#039;&amp;#039;x&amp;#039;&amp;#039; and &amp;#039;&amp;#039;y&amp;#039;&amp;#039; axes are the principal axes.  This is easily seen, given that there are no &amp;#039;&amp;#039;cross-terms&amp;#039;&amp;#039; involving products &amp;#039;&amp;#039;xy&amp;#039;&amp;#039; in either expression.  However, the situation is more complicated for equations like&lt;br /&gt;
:&amp;lt;math&amp;gt;5x^2+8xy+5y^2=1.&amp;lt;/math&amp;gt;&lt;br /&gt;
Here some method is required to determine whether this is an ellipse or a hyperbola.  The basic observation is that if, by completing the square, the expression can be reduced to a sum of two squares then it defines an ellipse, whereas if it reduces to a difference of two squares then it is the equation of a hyperbola:&lt;br /&gt;
:&amp;lt;math&amp;gt;u(x,y)^2+v(x,y)^2=1\qquad\text{(ellipse)}&amp;lt;/math&amp;gt;&lt;br /&gt;
:&amp;lt;math&amp;gt;u(x,y)^2-v(x,y)^2=1\qquad\text{(hyperbola)}.&amp;lt;/math&amp;gt;&lt;br /&gt;
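For instance, completing the square directly in the example above gives&lt;br /&gt;
:&amp;lt;math&amp;gt;5x^2+8xy+5y^2=5\left(x+\tfrac{4}{5}y\right)^2+\tfrac{9}{5}y^2,&amp;lt;/math&amp;gt;&lt;br /&gt;
a sum of two squares, so the curve is an ellipse; this form, however, does not make the principal axes apparent.&lt;br /&gt;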
Thus, in our example expression, the problem is how to absorb the coefficient of the cross-term 8&amp;#039;&amp;#039;xy&amp;#039;&amp;#039; into the functions &amp;#039;&amp;#039;u&amp;#039;&amp;#039; and &amp;#039;&amp;#039;v&amp;#039;&amp;#039;.  Formally, this problem is similar to the problem of [[matrix diagonalization]], where one tries to find a suitable coordinate system in which the matrix of a linear transformation is diagonal.  The first step is to find a matrix to which the technique of diagonalization can be applied.&lt;br /&gt;
&lt;br /&gt;
The trick is to write the equation in the following form:&lt;br /&gt;
:&amp;lt;math&amp;gt;5x^2+8xy+5y^2=&lt;br /&gt;
\begin{bmatrix}&lt;br /&gt;
x&amp;amp;y&lt;br /&gt;
\end{bmatrix}&lt;br /&gt;
\begin{bmatrix}&lt;br /&gt;
5&amp;amp;4\\4&amp;amp;5&lt;br /&gt;
\end{bmatrix}&lt;br /&gt;
\begin{bmatrix}&lt;br /&gt;
x\\y&lt;br /&gt;
\end{bmatrix}&lt;br /&gt;
=\mathbf{x}^TA\mathbf{x} &lt;br /&gt;
&amp;lt;/math&amp;gt;&lt;br /&gt;
where the cross-term has been split into two equal parts.  The matrix &amp;#039;&amp;#039;A&amp;#039;&amp;#039; in the above decomposition is a [[symmetric matrix]].  In particular, by the [[spectral theorem]], it has [[real numbers|real]] [[eigenvalues]] and is [[diagonalizable]] by an [[orthogonal matrix]] (&amp;#039;&amp;#039;orthogonally diagonalizable&amp;#039;&amp;#039;).&lt;br /&gt;
&lt;br /&gt;
To orthogonally diagonalize &amp;#039;&amp;#039;A&amp;#039;&amp;#039;, one must first find its eigenvalues, and then find an [[orthonormal]] [[eigenbasis]].  Calculation reveals that the eigenvalues of &amp;#039;&amp;#039;A&amp;#039;&amp;#039; are&lt;br /&gt;
:&amp;lt;math&amp;gt;\lambda_1 = 1,\quad \lambda_2 = 9&amp;lt;/math&amp;gt;&lt;br /&gt;
with corresponding eigenvectors&lt;br /&gt;
:&amp;lt;math&amp;gt;\mathbf{v}_1 = \begin{bmatrix}1\\-1\end{bmatrix},\quad \mathbf{v}_2=\begin{bmatrix}1\\1\end{bmatrix}.&amp;lt;/math&amp;gt;&lt;br /&gt;
Dividing these by their respective lengths yields an orthonormal eigenbasis:&lt;br /&gt;
:&amp;lt;math&amp;gt;\mathbf{u}_1 = \begin{bmatrix}1/\sqrt{2}\\-1/\sqrt{2}\end{bmatrix},\quad \mathbf{u}_2=\begin{bmatrix}1/\sqrt{2}\\1/\sqrt{2}\end{bmatrix}.&amp;lt;/math&amp;gt;&lt;br /&gt;
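These eigenvalues are the roots of the characteristic polynomial&lt;br /&gt;
:&amp;lt;math&amp;gt;\det(A-\lambda I)=(5-\lambda)^2-16=(\lambda-1)(\lambda-9),&amp;lt;/math&amp;gt;&lt;br /&gt;
and the corresponding eigenvectors are found by solving the homogeneous system for each root.&lt;br /&gt;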
&lt;br /&gt;
Now the matrix &amp;#039;&amp;#039;S&amp;#039;&amp;#039; = [&amp;#039;&amp;#039;&amp;#039;u&amp;#039;&amp;#039;&amp;#039;&amp;lt;sub&amp;gt;1&amp;lt;/sub&amp;gt; &amp;#039;&amp;#039;&amp;#039;u&amp;#039;&amp;#039;&amp;#039;&amp;lt;sub&amp;gt;2&amp;lt;/sub&amp;gt;] is an orthogonal matrix, since it has orthonormal columns, and &amp;#039;&amp;#039;A&amp;#039;&amp;#039; is diagonalized by:&lt;br /&gt;
:&amp;lt;math&amp;gt;A = SDS^{-1} = SDS^T =&lt;br /&gt;
\begin{bmatrix}&lt;br /&gt;
1/\sqrt{2}&amp;amp;1/\sqrt{2}\\&lt;br /&gt;
-1/\sqrt{2}&amp;amp;1/\sqrt{2}&lt;br /&gt;
\end{bmatrix}&lt;br /&gt;
\begin{bmatrix}&lt;br /&gt;
1&amp;amp;0\\&lt;br /&gt;
0&amp;amp;9&lt;br /&gt;
\end{bmatrix}&lt;br /&gt;
\begin{bmatrix}&lt;br /&gt;
1/\sqrt{2}&amp;amp;-1/\sqrt{2}\\&lt;br /&gt;
1/\sqrt{2}&amp;amp;1/\sqrt{2}&lt;br /&gt;
\end{bmatrix}.&lt;br /&gt;
&amp;lt;/math&amp;gt;&lt;br /&gt;
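Multiplying out the right-hand side confirms the factorization:&lt;br /&gt;
:&amp;lt;math&amp;gt;SDS^T=\frac{1}{2}\begin{bmatrix}1&amp;amp;9\\-1&amp;amp;9\end{bmatrix}\begin{bmatrix}1&amp;amp;-1\\1&amp;amp;1\end{bmatrix}=\frac{1}{2}\begin{bmatrix}10&amp;amp;8\\8&amp;amp;10\end{bmatrix}=\begin{bmatrix}5&amp;amp;4\\4&amp;amp;5\end{bmatrix}=A.&amp;lt;/math&amp;gt;&lt;br /&gt;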
&lt;br /&gt;
This diagonalization applies to the problem of &amp;quot;diagonalizing&amp;quot; the original equation, through the observation that&lt;br /&gt;
:&amp;lt;math&amp;gt;5x^2+8xy+5y^2=\mathbf{x}^TA\mathbf{x}= (S^T\mathbf{x})^TD(S^T\mathbf{x})=1\left(\frac{x-y}{\sqrt{2}}\right)^2+9\left(\frac{x+y}{\sqrt{2}}\right)^2.&amp;lt;/math&amp;gt;&lt;br /&gt;
Thus, the equation is that of an ellipse, since it is the sum of two squares.&lt;br /&gt;
&lt;br /&gt;
It is tempting to simplify this expression by pulling out factors of 2. However, it is important &amp;#039;&amp;#039;not&amp;#039;&amp;#039; to do this.  The quantities&lt;br /&gt;
:&amp;lt;math&amp;gt;c_1=\frac{x-y}{\sqrt{2}},\quad c_2=\frac{x+y}{\sqrt{2}}&amp;lt;/math&amp;gt;&lt;br /&gt;
have a geometrical meaning.  They determine an &amp;#039;&amp;#039;orthonormal coordinate system&amp;#039;&amp;#039; on &amp;#039;&amp;#039;&amp;#039;R&amp;#039;&amp;#039;&amp;#039;&amp;lt;sup&amp;gt;2&amp;lt;/sup&amp;gt;.  In other words, they are obtained from the original coordinates by the application of a rotation (and possibly a reflection).  Consequently, one may use the &amp;#039;&amp;#039;c&amp;#039;&amp;#039;&amp;lt;sub&amp;gt;1&amp;lt;/sub&amp;gt; and &amp;#039;&amp;#039;c&amp;#039;&amp;#039;&amp;lt;sub&amp;gt;2&amp;lt;/sub&amp;gt; coordinates to make statements about &amp;#039;&amp;#039;length and angles&amp;#039;&amp;#039; (particularly length), which would be more difficult in another choice of coordinates (one obtained by rescaling, for instance).  For example, the maximum distance from the origin on the ellipse &amp;#039;&amp;#039;c&amp;#039;&amp;#039;&amp;lt;sub&amp;gt;1&amp;lt;/sub&amp;gt;&amp;lt;sup&amp;gt;2&amp;lt;/sup&amp;gt; + 9&amp;#039;&amp;#039;c&amp;#039;&amp;#039;&amp;lt;sub&amp;gt;2&amp;lt;/sub&amp;gt;&amp;lt;sup&amp;gt;2&amp;lt;/sup&amp;gt; = 1 occurs when &amp;#039;&amp;#039;c&amp;#039;&amp;#039;&amp;lt;sub&amp;gt;2&amp;lt;/sub&amp;gt; = 0, that is, at the points &amp;#039;&amp;#039;c&amp;#039;&amp;#039;&amp;lt;sub&amp;gt;1&amp;lt;/sub&amp;gt; = ±1.  Similarly, the minimum distance occurs where &amp;#039;&amp;#039;c&amp;#039;&amp;#039;&amp;lt;sub&amp;gt;1&amp;lt;/sub&amp;gt; = 0, at the points &amp;#039;&amp;#039;c&amp;#039;&amp;#039;&amp;lt;sub&amp;gt;2&amp;lt;/sub&amp;gt; = ±1/3.&lt;br /&gt;
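Explicitly, since the coordinates are orthonormal, the squared distance from the origin is&lt;br /&gt;
:&amp;lt;math&amp;gt;x^2+y^2=c_1^2+c_2^2=(1-9c_2^2)+c_2^2=1-8c_2^2&amp;lt;/math&amp;gt;&lt;br /&gt;
on the ellipse, which decreases from 1 at &amp;#039;&amp;#039;c&amp;#039;&amp;#039;&amp;lt;sub&amp;gt;2&amp;lt;/sub&amp;gt; = 0 to 1/9 at &amp;#039;&amp;#039;c&amp;#039;&amp;#039;&amp;lt;sub&amp;gt;2&amp;lt;/sub&amp;gt; = ±1/3.&lt;br /&gt;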
&lt;br /&gt;
It is possible now to read off the major and minor axes of this ellipse.  These are precisely the individual eigenspaces of the matrix &amp;#039;&amp;#039;A&amp;#039;&amp;#039;, since these are where &amp;#039;&amp;#039;c&amp;#039;&amp;#039;&amp;lt;sub&amp;gt;2&amp;lt;/sub&amp;gt; = 0 or &amp;#039;&amp;#039;c&amp;#039;&amp;#039;&amp;lt;sub&amp;gt;1&amp;lt;/sub&amp;gt;=0.  Symbolically, the principal axes are&lt;br /&gt;
:&amp;lt;math&amp;gt;&lt;br /&gt;
E_1 = \text{span}\left(\begin{bmatrix}1/\sqrt{2}\\-1/\sqrt{2}\end{bmatrix}\right),\quad&lt;br /&gt;
E_2 = \text{span}\left(\begin{bmatrix}1/\sqrt{2}\\1/\sqrt{2}\end{bmatrix}\right).&lt;br /&gt;
&amp;lt;/math&amp;gt;&lt;br /&gt;
To summarize:&lt;br /&gt;
* The equation is for an ellipse, since both eigenvalues are positive.  (Otherwise, if one were positive and the other negative, it would be a hyperbola.)&lt;br /&gt;
* The principal axes are the lines spanned by the eigenvectors.&lt;br /&gt;
* The minimum and maximum distances to the origin can be read off the equation in diagonal form.&lt;br /&gt;
Using this information, it is possible to attain a clear geometrical picture of the ellipse: to graph it, for instance.&lt;br /&gt;
&lt;br /&gt;
==Formal statement==&lt;br /&gt;
The &amp;#039;&amp;#039;&amp;#039;principal axis theorem&amp;#039;&amp;#039;&amp;#039; concerns [[quadratic forms]] in &amp;#039;&amp;#039;&amp;#039;R&amp;#039;&amp;#039;&amp;#039;&amp;lt;sup&amp;gt;n&amp;lt;/sup&amp;gt;, which are [[homogeneous polynomial]]&amp;lt;nowiki/&amp;gt;s of degree 2.  Any quadratic form may be represented as&lt;br /&gt;
:&amp;lt;math&amp;gt;Q(\mathbf{x})=\mathbf{x}^TA\mathbf{x}&amp;lt;/math&amp;gt;&lt;br /&gt;
where &amp;#039;&amp;#039;A&amp;#039;&amp;#039; is a symmetric matrix.&lt;br /&gt;
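For instance, in two variables, half of the cross-term coefficient is placed in each of the two symmetric off-diagonal entries:&lt;br /&gt;
:&amp;lt;math&amp;gt;Q(x,y)=ax^2+bxy+cy^2 \quad\longleftrightarrow\quad A=\begin{bmatrix}a&amp;amp;b/2\\b/2&amp;amp;c\end{bmatrix},&amp;lt;/math&amp;gt;&lt;br /&gt;
exactly as in the example of the previous section.&lt;br /&gt;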
&lt;br /&gt;
The first part of the theorem is contained in the following statements guaranteed by the spectral theorem:&lt;br /&gt;
* The eigenvalues of &amp;#039;&amp;#039;A&amp;#039;&amp;#039; are real.&lt;br /&gt;
* &amp;#039;&amp;#039;A&amp;#039;&amp;#039; is diagonalizable, and the eigenspaces of &amp;#039;&amp;#039;A&amp;#039;&amp;#039; are mutually orthogonal.&lt;br /&gt;
In particular, &amp;#039;&amp;#039;A&amp;#039;&amp;#039; is &amp;#039;&amp;#039;orthogonally diagonalizable&amp;#039;&amp;#039;, since one may take a basis of each eigenspace and apply the [[Gram-Schmidt process]] separately within the eigenspace to obtain an orthonormal eigenbasis.&lt;br /&gt;
&lt;br /&gt;
For the second part, suppose that the eigenvalues of &amp;#039;&amp;#039;A&amp;#039;&amp;#039; are &amp;amp;lambda;&amp;lt;sub&amp;gt;1&amp;lt;/sub&amp;gt;, ..., &amp;amp;lambda;&amp;lt;sub&amp;gt;n&amp;lt;/sub&amp;gt; (possibly repeated according to their algebraic multiplicities) and the corresponding orthonormal eigenbasis is &amp;#039;&amp;#039;&amp;#039;u&amp;#039;&amp;#039;&amp;#039;&amp;lt;sub&amp;gt;1&amp;lt;/sub&amp;gt;,...,&amp;#039;&amp;#039;&amp;#039;u&amp;#039;&amp;#039;&amp;#039;&amp;lt;sub&amp;gt;n&amp;lt;/sub&amp;gt;.  Then&lt;br /&gt;
* &amp;lt;math&amp;gt;Q(\mathbf{x}) = \lambda_1c_1^2+\lambda_2c_2^2+\dots+\lambda_nc_n^2,&amp;lt;/math&amp;gt;&lt;br /&gt;
where the &amp;#039;&amp;#039;c&amp;#039;&amp;#039;&amp;lt;sub&amp;gt;i&amp;lt;/sub&amp;gt; are the coordinates with respect to the given eigenbasis.  Furthermore,&lt;br /&gt;
* The &amp;#039;&amp;#039;i&amp;#039;&amp;#039;-th &amp;#039;&amp;#039;&amp;#039;principal axis&amp;#039;&amp;#039;&amp;#039; is the line determined by the &amp;#039;&amp;#039;n&amp;#039;&amp;#039;-1 equations &amp;#039;&amp;#039;c&amp;#039;&amp;#039;&amp;lt;sub&amp;gt;j&amp;lt;/sub&amp;gt; = 0, &amp;#039;&amp;#039;j&amp;#039;&amp;#039; &amp;amp;ne; &amp;#039;&amp;#039;i&amp;#039;&amp;#039;.  This axis is the span of the vector &amp;#039;&amp;#039;&amp;#039;u&amp;#039;&amp;#039;&amp;#039;&amp;lt;sub&amp;gt;i&amp;lt;/sub&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
==See also==&lt;br /&gt;
* [[Sylvester&amp;#039;s law of inertia]]&lt;br /&gt;
&lt;br /&gt;
==References==&lt;br /&gt;
* {{cite book|authorlink=Gilbert Strang|first=Gilbert|last=Strang|title=Introduction to Linear Algebra|publisher=Wellesley-Cambridge Press|year=1994|isbn=0-9614088-5-5}}&lt;br /&gt;
&lt;br /&gt;
[[Category:Theorems in geometry]]&lt;br /&gt;
[[Category:Theorems in linear algebra]]&lt;/div&gt;</summary>
		<author><name>en&gt;BG19bot</name></author>
	</entry>
</feed>