Eigenvalues and Eigenvectors of Matrices

  • Mathematical Formulation

Consider a transformation matrix {A} that transforms a vector {X} into a vector {Y}. Thus, we can write,

{Y = AX}

Recall that two vectors directed along the same direction are simply scalar multiples of each other. For example, {\vec P = 3 \hat i + 4 \hat j = (3,4)} and {\vec Q = 6 \hat i + 8 \hat j = (6,8)} point in the same direction (check!), and we can write {\vec Q = 2 \vec P}.

Now, suppose there exists a scalar {\lambda}, such that

{Y = \lambda X}

This is interesting, because {Y} and {X} are now related both via a matrix {A} and via a scalar (a plain number) {\lambda}, i.e. {Y = AX} as well as {Y = \lambda X}.

Thus,

{Y = AX = \lambda X = \lambda IX}

So,

{AX - \lambda IX = 0 \quad \text{or} \quad (A - \lambda I)X = 0}

Since the RHS is a null vector, this forms a homogeneous system, which has non-trivial solutions only when {|A - \lambda I| = 0}. On expanding the determinant, we get {n} values of {\lambda} if {A} is of order {n \times n}. These values are known as the eigenvalues. (In German, eigen means "own" or "characteristic".)
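For a concrete illustration (this matrix is chosen here for convenience and is not from the original post), take {A = \begin{pmatrix} 2 & 1 \\ 1 & 2 \end{pmatrix}}. Then {|A - \lambda I| = (2 - \lambda)^2 - 1 = \lambda^2 - 4\lambda + 3 = (\lambda - 1)(\lambda - 3)}, so the eigenvalues are {\lambda_1 = 1} and {\lambda_2 = 3}. Note that their sum {(4)} equals the trace of {A} and their product {(3)} equals {|A|}, which previews the properties listed below.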

  • More About Eigenvalues

I) The determinant {|A - \lambda I|} is known as the characteristic determinant, and the polynomial obtained on expanding it is known as the characteristic polynomial. If {A} is of order {n}, the degree of the polynomial is {n}, and hence the matrix has {n} eigenvalues, which may be distinct or repeated.

The set of eigenvalues is known as the spectrum.

II) {\sum \limits_{i=1}^n \lambda_i = \sum \limits_{i=1}^n a_{ii} =} trace of {A}

III) {\prod \limits_{i=1}^n \lambda_i = |A|}

This implies that if any one of the eigenvalues is {0}, then {|A| = 0}, i.e. {A} is singular.

IV) Eigenvalues of {A^{n}} are {\lambda_i^n}, where {n} is a non-negative integer.

V) Eigenvalues of {A} and {A^T} are the same.

VI) Eigenvalues of {A- KI} are {\lambda_i - K}, where {K} is any number.

VII) If {A} is a real symmetric matrix, then its eigenvalues are real.
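A quick numerical spot-check of properties II, III, IV and V, using NumPy and an arbitrary illustrative matrix (a sketch, not part of the original derivation):

```python
import numpy as np

# Arbitrary illustrative matrix (not from the original post)
A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 4.0]])

lam = np.linalg.eigvals(A)

# II) Sum of eigenvalues equals the trace
print(np.isclose(lam.sum(), np.trace(A)))                          # True

# III) Product of eigenvalues equals the determinant
print(np.isclose(lam.prod(), np.linalg.det(A)))                    # True

# IV) Eigenvalues of A^n are lambda_i^n (here n = 3)
lam_pow = np.linalg.eigvals(np.linalg.matrix_power(A, 3))
print(np.allclose(np.sort(lam_pow), np.sort(lam**3)))              # True

# V) Eigenvalues of A and A^T are the same
print(np.allclose(np.sort(np.linalg.eigvals(A.T)), np.sort(lam)))  # True
```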

  • Eigenvectors

Corresponding to each eigenvalue {\lambda}, there will be a vector, known as an eigenvector. This is obtained by solving the system {(A - \lambda I)X = 0}. As stated earlier, {AX = \lambda X}.

This is a homogeneous system of equations; solving it gives {X}.
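Continuing the illustrative {2 \times 2} example from above: for {\lambda_1 = 1}, the system {(A - I)X = 0} reduces to {x_1 + x_2 = 0}, giving the eigenvector {X_1 = (1, -1)^T}; for {\lambda_2 = 3}, {(A - 3I)X = 0} reduces to {x_1 = x_2}, giving {X_2 = (1, 1)^T}. Any non-zero scalar multiple of these is equally valid, as noted in the next section.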

  • Properties of Eigenvectors

Eigenvectors are the vectors whose direction does not change under the transformation {AX}; they are merely scaled by the factor {\lambda}.

I) For an eigenvalue {\lambda}, if {X} is an eigenvector, then {KX, K \ne 0} is also an eigenvector.

II) If the eigenvalues are distinct, the eigenvectors are linearly independent.

III) If {A} is symmetric, then the eigenvectors corresponding to two distinct eigenvalues are orthogonal.
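The example above illustrates property III: the matrix {A} was symmetric, its eigenvalues {1} and {3} were distinct, and the corresponding eigenvectors satisfy {X_1 \cdot X_2 = (1)(1) + (-1)(1) = 0}, i.e. they are orthogonal.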

  • Use of Eigenvalues and Eigenvectors

There are many applications of eigenvalues and eigenvectors. A typical mechanical engineering example is finding the natural frequencies of a system with multiple degrees of freedom. The equations of motion reduce to a system of the form {AX = \lambda X}. The eigenvalues of {A} determine its natural frequencies, and the corresponding eigenvectors give the mode shapes.
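As a minimal sketch of this application (the masses, stiffnesses and matrices below are illustrative assumptions, not from the original post), consider a two-degree-of-freedom spring-mass chain. Writing the equations of motion {M\ddot{x} + Kx = 0} and assuming harmonic motion leads to the eigenvalue problem {(M^{-1}K)X = \omega^2 X}, so the eigenvalues give the squared natural frequencies and the eigenvectors give the mode shapes:

```python
import numpy as np

# Illustrative 2-DOF system: two unit masses connected by three unit-stiffness springs
m, k = 1.0, 1.0
M = np.diag([m, m])                      # mass matrix
K = np.array([[2*k, -k],
              [-k, 2*k]])                # stiffness matrix

# (M^-1 K) X = omega^2 X  ->  standard eigenvalue problem
A = np.linalg.inv(M) @ K
omega_sq, modes = np.linalg.eig(A)

natural_frequencies = np.sqrt(omega_sq)  # rad/s; order of eigenvalues is not guaranteed
print(natural_frequencies)               # approx. [1.732, 1.0]
print(modes)                             # columns are the mode shapes
```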

  • Cayley Hamilton Theorem

It states that every square matrix satisfies its own characteristic equation, i.e. if the characteristic equation is {|A - \lambda I| = 0}, written as {f(\lambda) = 0}, then

{f(A)=0}

This theorem can be used to find higher powers of a matrix as well as its inverse.
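Continuing the illustrative {2 \times 2} example: the characteristic polynomial of {A = \begin{pmatrix} 2 & 1 \\ 1 & 2 \end{pmatrix}} was {\lambda^2 - 4\lambda + 3}, so by the theorem {A^2 - 4A + 3I = 0}. Rearranging gives {A^2 = 4A - 3I} (higher powers follow by repeated substitution), while multiplying through by {A^{-1}} gives {A^{-1} = \frac{1}{3}(4I - A)}, avoiding a direct inversion.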
