Diagonalization of Matrices

The document discusses various types of matrices, including symmetric, skew-symmetric, orthogonal, unitary, Hermitian, and skew Hermitian matrices, along with their properties. It also covers concepts such as similar matrices, the Cayley-Hamilton theorem, eigenvalues, eigenvectors, and diagonalization of matrices. Additionally, it provides proofs for several properties related to orthogonal and unitary matrices.

Types of Matrices and Diagonalization of Matrices

Symmetric matrix: A square matrix A is called a symmetric matrix if A = A^T,

i.e. a_ij = a_ji.

e.g. [1 2 3; 2 5 6; 3 6 7]

Skew-symmetric matrix: A square matrix A is called a skew-symmetric matrix if A = −A^T,

i.e. a_ij = −a_ji. The diagonal elements of a skew-symmetric matrix are zero, because a_ii = −a_ii forces a_ii = 0.

e.g. [0 −2 3; 2 0 6; −3 −6 0]

Orthogonal matrix: A square matrix A is said to be orthogonal if

A A^T = A^T A = I.

e.g. A = (1/3)[1 2 2; 2 1 −2; 2 −2 1]

Unitary matrix: A square matrix A is said to be unitary if

A^θ A = A A^θ = I,

where A^θ = (Ā)^T is the conjugate transpose of A.

e.g. A = (1/3)[1 2 2; 2 1 −2; 2 −2 1]

Note: Every orthogonal matrix is unitary.

Hermitian matrix: A square matrix A is said to be a Hermitian matrix if

A^θ = A, i.e. a_ij = conj(a_ji).

Diagonal elements of a Hermitian matrix are real numbers.

e.g. A = [1, 2+3i, 5−6i; 2−3i, 2, 9−6i; 5+6i, 9+6i, −11]
Skew-Hermitian matrix: A square matrix A is said to be a skew-Hermitian matrix if

A^θ = −A, i.e. a_ij = −conj(a_ji).

Diagonal elements of a skew-Hermitian matrix are either zero or purely imaginary numbers.

e.g. A = [i, 2+3i, −5−6i; −2+3i, 2i, −9+6i; 5−6i, 9+6i, −11i]

Similar matrices: A square matrix A is said to be similar to a square matrix B if there exists an invertible matrix P such that A = P^{-1}BP; P is called the similarity matrix. Similarity is an equivalence relation; in particular it is symmetric, since A = P^{-1}BP gives B = PAP^{-1} = (P^{-1})^{-1} A (P^{-1}).

Cayley Hamilton theorem: Every square matrix satisfies its own characteristic equation.

Eigenvalues and eigenvectors: Let A be a square matrix. The equation det(A − αI) = 0 is called the characteristic equation of A. The roots of the characteristic equation of A are called eigenvalues or latent roots of the matrix A.
A non-zero column vector X satisfying the equation AX = αX, i.e. (A − αI)X = 0, is called an eigenvector or latent vector of the matrix A corresponding to the eigenvalue α.
Diagonalizable matrix: A square matrix A is said to be diagonalizable if there exists an invertible matrix P such that

P^{-1}AP = D,

where D is a diagonal matrix; the diagonal elements of D are then the eigenvalues of A.

1. The characteristic equation of a matrix A is t^2 − t − 1 = 0. Determine A^{-1}.

Sol. By the Cayley-Hamilton theorem, every square matrix satisfies its characteristic equation.

Therefore A^2 − A − I = 0,

or A^2 − A = I.

Pre-multiplying both sides by A^{-1}, we get

A − I = A^{-1}.
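The same manipulation can be sketched numerically. The matrix below is a hypothetical example, not part of the original problem: it is chosen so that its characteristic equation is t^2 − t − 1 = 0 (trace 1, determinant −1).

```python
import numpy as np

# Hypothetical 2x2 matrix with characteristic polynomial t^2 - t - 1
# (trace 1, determinant -1); any matrix with that trace/det would do.
A = np.array([[0.0, 1.0],
              [1.0, 1.0]])

# np.poly of a square matrix returns the monic characteristic polynomial.
print(np.poly(A))                                     # ~ [1, -1, -1]

# Cayley-Hamilton: A^2 - A - I = 0, hence A^{-1} = A - I.
assert np.allclose(A @ A - A - np.eye(2), 0)
print(np.allclose(np.linalg.inv(A), A - np.eye(2)))   # True
```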

2. Prove that every eigenvalue of a Hermitian matrix is real.

Sol. Let A be a Hermitian matrix. Therefore A^θ = A ------(1)

Let α be an eigenvalue of A and X the corresponding non-zero eigenvector. Then

AX = αX → (AX)^θ = (αX)^θ → X^θ A^θ = ᾱ X^θ → X^θ A = ᾱ X^θ (using (1))

Post-multiplying both sides by X, we get

X^θ(AX) = ᾱ(X^θ X) → X^θ(αX) = ᾱ(X^θ X) → α(X^θ X) = ᾱ(X^θ X) → α = ᾱ

(the last step divides by X^θ X, which is positive because X is non-zero).

Hence α is a real number. Therefore every eigenvalue of a Hermitian matrix is real.
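A quick numerical illustration of this fact, reusing the Hermitian example matrix given earlier in these notes:

```python
import numpy as np

# The Hermitian example matrix from these notes.
A = np.array([[1,      2 + 3j,  5 - 6j],
              [2 - 3j, 2,       9 - 6j],
              [5 + 6j, 9 + 6j, -11   ]])
assert np.allclose(A, A.conj().T)            # A is Hermitian

eigenvalues = np.linalg.eigvals(A)
print(np.max(np.abs(eigenvalues.imag)))      # ~ 0: every eigenvalue is real
```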

3. If α is a non-zero eigenvalue of A and X a corresponding eigenvector, prove that |A|/α is an eigenvalue of adj(A), the eigenvector remaining the same.

Sol. Let α be a non-zero eigenvalue of the square matrix A and X the corresponding non-zero eigenvector. Then

AX = αX.

Pre-multiplying both sides by adj(A), we get

adj(A)(AX) = adj(A)αX → (adj(A)A)X = α(adj(A)X) → |A|X = α(adj(A)X) (since adj(A)A = |A|I)

→ adj(A)X = (|A|/α)X.

Hence |A|/α is an eigenvalue of adj(A) and X is a corresponding eigenvector.
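A numerical sketch of this result. It uses the identity adj(A) = |A|·A^{-1}, valid for invertible A; the 2 × 2 matrix is an arbitrary illustrative choice, not one from the notes.

```python
import numpy as np

# Arbitrary invertible example; adj(A) reconstructed as det(A) * inv(A).
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
adjA = np.linalg.det(A) * np.linalg.inv(A)

eigvals, eigvecs = np.linalg.eig(A)
for alpha, X in zip(eigvals, eigvecs.T):
    # adj(A) X = (|A| / alpha) X, with the same eigenvector X
    assert np.allclose(adjA @ X, (np.linalg.det(A) / alpha) * X)
print("verified")
```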

4. Prove that the product of two orthogonal matrices is an orthogonal matrix.

Sol. Let A and B be two orthogonal matrices. Therefore

A A^T = A^T A = I and B B^T = B^T B = I.

Now (AB)(AB)^T = A B B^T A^T = A I A^T = A A^T = I, and

(AB)^T(AB) = B^T A^T A B = B^T I B = B^T B = I.

Hence AB is an orthogonal matrix. Therefore the product of two orthogonal matrices is an orthogonal matrix.
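This can be spot-checked with rotation matrices, which are orthogonal; the two angles below are arbitrary illustrative choices.

```python
import numpy as np

def rotation(theta: float) -> np.ndarray:
    # 2x2 rotation matrix, an orthogonal matrix for any angle.
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s], [s, c]])

A, B = rotation(0.7), rotation(-1.3)
assert np.allclose(A @ A.T, np.eye(2)) and np.allclose(B @ B.T, np.eye(2))

AB = A @ B
print(np.allclose(AB @ AB.T, np.eye(2)))   # True: AB is orthogonal
```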

5. Prove that the transpose of an orthogonal matrix is an orthogonal matrix.

Sol. Let A be an orthogonal matrix. Therefore

A A^T = A^T A = I.

Now A^T (A^T)^T = A^T A = I, and

(A^T)^T A^T = A A^T = I.

Hence A^T is an orthogonal matrix.

Therefore the transpose of an orthogonal matrix is an orthogonal matrix.

6. Prove that the inverse of an orthogonal matrix is an orthogonal matrix.

Sol. Let A be an orthogonal matrix. Therefore

A A^T = A^T A = I.

Now A^{-1}(A^{-1})^T = A^{-1}(A^T)^{-1} = (A^T A)^{-1} = I^{-1} = I, and

(A^{-1})^T A^{-1} = (A^T)^{-1} A^{-1} = (A A^T)^{-1} = I^{-1} = I.

Hence A^{-1} is an orthogonal matrix.

Therefore the inverse of an orthogonal matrix is an orthogonal matrix.

7. Prove that the determinant of an orthogonal matrix is ±1.

Sol. Let A be an orthogonal matrix. Therefore

A A^T = A^T A = I.

Taking determinants on both sides,

|A A^T| = |I| → |A||A^T| = 1 → |A||A| = 1 → |A|^2 = 1 → |A| = ±1

(because |CD| = |C||D|, |I| = 1 and |A^T| = |A|).
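Checking this on the orthogonal example matrix from earlier in these notes:

```python
import numpy as np

# The (1/3)[1 2 2; 2 1 -2; 2 -2 1] example from these notes.
A = np.array([[1.0, 2.0, 2.0],
              [2.0, 1.0, -2.0],
              [2.0, -2.0, 1.0]]) / 3.0
assert np.allclose(A @ A.T, np.eye(3))   # A is orthogonal
print(round(np.linalg.det(A)))           # -1 for this A; always +1 or -1
```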

8. Prove that the inverse of a unitary matrix is a unitary matrix.

Sol. Let A be a unitary matrix. Therefore

A^θ A = A A^θ = I, where A^θ = (Ā)^T.

Now A^{-1}(A^{-1})^θ = A^{-1}(A^θ)^{-1} = (A^θ A)^{-1} = I^{-1} = I, and

(A^{-1})^θ A^{-1} = (A^θ)^{-1} A^{-1} = (A A^θ)^{-1} = I^{-1} = I.

Hence A^{-1} is a unitary matrix.

Therefore the inverse of a unitary matrix is a unitary matrix.

9. State and prove the Cayley-Hamilton theorem.

Sol. Statement: Every square matrix satisfies its own characteristic equation.

Proof: Let A be a square matrix of order n and let its characteristic equation be |A − λI| = 0,

i.e. (−1)^n λ^n + a_1 λ^{n−1} + a_2 λ^{n−2} + ……… + a_n = 0.

Required to prove: (−1)^n A^n + a_1 A^{n−1} + a_2 A^{n−2} + ……… + a_n I = 0.

Each entry of adj(A − λI) is a cofactor of A − λI, i.e. a determinant of order n − 1, and is therefore a polynomial in λ of degree at most n − 1. Hence we can write

adj(A − λI) = P_1 λ^{n−1} + P_2 λ^{n−2} + ……… + P_n,

where P_1, P_2, ………, P_n are square matrices of order n whose entries do not involve λ.

Also, M adj(M) = |M| I for every square matrix M, so taking M = A − λI:

(A − λI) adj(A − λI) = |A − λI| I

→ (A − λI)[P_1 λ^{n−1} + P_2 λ^{n−2} + ……… + P_n] = [(−1)^n λ^n + a_1 λ^{n−1} + a_2 λ^{n−2} + ……… + a_n] I.

Comparing coefficients of like powers of λ, we get

−P_1 = (−1)^n I

A P_1 − P_2 = a_1 I

A P_2 − P_3 = a_2 I

A P_3 − P_4 = a_3 I

……………………. (and so on)

A P_{n−1} − P_n = a_{n−1} I

A P_n = a_n I

Pre-multiplying these equations by A^n, A^{n−1}, A^{n−2}, ………, A, I respectively on both sides and adding (the left-hand side telescopes to zero), we get

0 = (−1)^n A^n + a_1 A^{n−1} + a_2 A^{n−2} + ……… + a_n I,

i.e. (−1)^n A^n + a_1 A^{n−1} + a_2 A^{n−2} + ……… + a_n I = 0.

(Hence proved.)

10. Find the characteristic equation of A = [1 0 −1; 1 2 1; 2 2 3].

Sol. A = [1 0 −1; 1 2 1; 2 2 3]

The characteristic equation of A is |A − αI| = 0 → det[1−α, 0, −1; 1, 2−α, 1; 2, 2, 3−α] = 0

→ α^3 − 6α^2 + 11α − 6 = 0.
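The coefficients can be cross-checked with NumPy, whose np.poly of a square matrix returns the monic characteristic polynomial:

```python
import numpy as np

A = np.array([[1, 0, -1],
              [1, 2, 1],
              [2, 2, 3]])
# Monic characteristic polynomial coefficients of A.
print(np.poly(A))   # ~ [1, -6, 11, -6], i.e. a^3 - 6a^2 + 11a - 6
```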

11. Is A = [1 0 0; 0 3 −1; 0 −1 3] diagonalizable?

Sol. A = [1 0 0; 0 3 −1; 0 −1 3]

The characteristic equation of A is |A − αI| = 0 → det[1−α, 0, 0; 0, 3−α, −1; 0, −1, 3−α] = 0

→ α^3 − 7α^2 + 14α − 8 = 0 → α = 1, 2, 4.

Since A has three distinct eigenvalues, it has three linearly independent eigenvectors. Hence A is diagonalizable.
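A numerical check of the conclusion: distinct eigenvalues give an invertible eigenvector matrix P, and P^{-1}AP is then diagonal.

```python
import numpy as np

A = np.array([[1.0, 0.0, 0.0],
              [0.0, 3.0, -1.0],
              [0.0, -1.0, 3.0]])
eigvals, P = np.linalg.eig(A)
print(np.sort(eigvals))                  # ~ [1, 2, 4]

D = np.linalg.inv(P) @ A @ P
print(np.allclose(D, np.diag(eigvals)))  # True: A is diagonalizable
```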

12. Verify the Cayley-Hamilton theorem for A = [1 4; 3 2]. Hence find A^{-1}. Also find the eigenvalues and eigenvectors of A.

Sol. A = [1 4; 3 2]

The characteristic equation of A is |A − αI| = 0 → det[1−α, 4; 3, 2−α] = 0

→ α^2 − 3α − 10 = 0 → α = −2, 5.

By the Cayley-Hamilton theorem, A^2 − 3A − 10I = 0. ………………………(*)

Now A^2 = [1 4; 3 2][1 4; 3 2] = [13 12; 9 16],

∴ A^2 − 3A − 10I = [13 12; 9 16] + [−3 −12; −9 −6] + [−10 0; 0 −10] = [0 0; 0 0].

∴ The Cayley-Hamilton theorem is verified for the given matrix A.

Multiplying both sides of (*) by A^{-1}, we get A − 3I = 10A^{-1} → A^{-1} = (1/10)[−2 4; 3 −1].

Let X1 = [x; y] be the eigenvector of A corresponding to the eigenvalue α = −2.

∴ [A − αI]X1 = 0 → [A − (−2)I]X1 = 0 → [3 4; 3 4][x; y] = [0; 0]

→ 3x + 4y = 0, 3x + 4y = 0 → x : y = −4 : 3.

∴ X1 = [−4; 3] is the eigenvector of A corresponding to the eigenvalue α = −2.

Let X2 = [x; y] be the eigenvector of A corresponding to the eigenvalue α = 5.

∴ [A − αI]X2 = 0 → [A − 5I]X2 = 0 → [−4 4; 3 −3][x; y] = [0; 0]

→ −4x + 4y = 0, 3x − 3y = 0 → x = y → x : y = 1 : 1.

∴ X2 = [1; 1] is the eigenvector of A corresponding to the eigenvalue α = 5.
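The inverse and both eigenpairs can be verified numerically:

```python
import numpy as np

A = np.array([[1.0, 4.0],
              [3.0, 2.0]])

# Cayley-Hamilton: A^2 - 3A - 10I = 0
assert np.allclose(A @ A - 3 * A - 10 * np.eye(2), 0)

# Hence A^{-1} = (A - 3I)/10
assert np.allclose(np.linalg.inv(A), (A - 3 * np.eye(2)) / 10)

# Eigenpairs: A X1 = -2 X1 and A X2 = 5 X2
X1, X2 = np.array([-4.0, 3.0]), np.array([1.0, 1.0])
print(np.allclose(A @ X1, -2 * X1), np.allclose(A @ X2, 5 * X2))  # True True
```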

13. Verify the Cayley-Hamilton theorem for A = [2 −1 1; −1 2 −1; 1 −1 2]. Hence find A^{-1}.

Sol. A = [2 −1 1; −1 2 −1; 1 −1 2]

The characteristic equation of A is |A − αI| = 0 → det[2−α, −1, 1; −1, 2−α, −1; 1, −1, 2−α] = 0

→ α^3 − 6α^2 + 9α − 4 = 0.

By the Cayley-Hamilton theorem, A^3 − 6A^2 + 9A − 4I = 0. …………………..(i)

L.H.S.: A^2 = A·A = [6 −5 5; −5 6 −5; 5 −5 6],

A^3 = A^2·A = [22 −21 21; −21 22 −21; 21 −21 22].

Hence A^3 − 6A^2 + 9A − 4I

= [22 −21 21; −21 22 −21; 21 −21 22] − 6[6 −5 5; −5 6 −5; 5 −5 6] + 9[2 −1 1; −1 2 −1; 1 −1 2] − 4[1 0 0; 0 1 0; 0 0 1]

= [22−36+18−4, −21+30−9, 21−30+9; −21+30−9, 22−36+18−4, −21+30−9; 21−30+9, −21+30−9, 22−36+18−4] = [0 0 0; 0 0 0; 0 0 0].

Hence the Cayley-Hamilton theorem is verified for the given matrix A.

From (i), 4I = A^3 − 6A^2 + 9A.

Multiplying both sides by A^{-1}, we get

A^{-1} = (1/4)[A^2 − 6A + 9I] = (1/4)[[6 −5 5; −5 6 −5; 5 −5 6] − 6[2 −1 1; −1 2 −1; 1 −1 2] + [9 0 0; 0 9 0; 0 0 9]]

= (1/4)[3 1 −1; 1 3 1; −1 1 3].
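A numerical check of both the verification and the inverse:

```python
import numpy as np

A = np.array([[2.0, -1.0, 1.0],
              [-1.0, 2.0, -1.0],
              [1.0, -1.0, 2.0]])

# Cayley-Hamilton: A^3 - 6A^2 + 9A - 4I = 0
A2 = A @ A
A3 = A2 @ A
assert np.allclose(A3 - 6 * A2 + 9 * A - 4 * np.eye(3), 0)

# Hence A^{-1} = (A^2 - 6A + 9I)/4
inv_via_ch = (A2 - 6 * A + 9 * np.eye(3)) / 4
print(np.allclose(inv_via_ch, np.linalg.inv(A)))   # True
```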

14. Find the eigenvalues and eigenvectors of A = [3 1 −1; −2 1 2; 0 1 2].

Sol. A = [3 1 −1; −2 1 2; 0 1 2]

The characteristic equation of A is |A − αI| = 0 → det[3−α, 1, −1; −2, 1−α, 2; 0, 1, 2−α] = 0

→ α^3 − 6α^2 + 11α − 6 = 0 → α = 1, 2, 3 are the eigenvalues of the given matrix.

Let X1 = [x; y; z] be the eigenvector of A corresponding to the eigenvalue α = 1.

∴ [A − αI]X1 = 0 → [A − (1)I]X1 = 0 → [2 1 −1; −2 0 2; 0 1 1][x; y; z] = [0; 0; 0]

→ 2x + y − z = 0, −2x + 2z = 0, y + z = 0.

From the first two equations, by cross-multiplication, x : y : z = 2 : −2 : 2 = 1 : −1 : 1.

∴ X1 = [1; −1; 1] is the eigenvector of A corresponding to the eigenvalue α = 1.

Let X2 = [x; y; z] be the eigenvector of A corresponding to the eigenvalue α = 2.

∴ [A − αI]X2 = 0 → [A − (2)I]X2 = 0 → [1 1 −1; −2 −1 2; 0 1 0][x; y; z] = [0; 0; 0]

→ x + y − z = 0, −2x − y + 2z = 0, y = 0.

From the first two equations, x : y : z = 1 : 0 : 1.

∴ X2 = [1; 0; 1] is the eigenvector of A corresponding to the eigenvalue α = 2.

Let X3 = [x; y; z] be the eigenvector of A corresponding to the eigenvalue α = 3.

∴ [A − αI]X3 = 0 → [A − (3)I]X3 = 0 → [0 1 −1; −2 −2 2; 0 1 −1][x; y; z] = [0; 0; 0]

→ y − z = 0, −2x − 2y + 2z = 0, y − z = 0 → y = z, x = 0

→ x : y : z = 0 : 1 : 1.

∴ X3 = [0; 1; 1] is the eigenvector of A corresponding to the eigenvalue α = 3.
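Checking all three eigenpairs at once:

```python
import numpy as np

A = np.array([[3.0, 1.0, -1.0],
              [-2.0, 1.0, 2.0],
              [0.0, 1.0, 2.0]])

# (eigenvalue, eigenvector) pairs found above.
pairs = [(1, [1, -1, 1]), (2, [1, 0, 1]), (3, [0, 1, 1])]
for alpha, X in pairs:
    X = np.array(X, dtype=float)
    assert np.allclose(A @ X, alpha * X)
print("all eigenpairs verified")
```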

15. Find the eigenvalues and eigenvectors of A = [1 1 0; 0 1 1; 0 0 1].

Sol. A = [1 1 0; 0 1 1; 0 0 1]

The characteristic equation of A is |A − αI| = 0 → det[1−α, 1, 0; 0, 1−α, 1; 0, 0, 1−α] = 0

→ (1 − α)^3 = 0 → α = 1, 1, 1 are the eigenvalues of the given matrix.

Let X1 = [x; y; z] be an eigenvector of A corresponding to the eigenvalue α = 1.

∴ [A − αI]X1 = 0 → [A − (1)I]X1 = 0 → [0 1 0; 0 0 1; 0 0 0][x; y; z] = [0; 0; 0]

→ y = 0, z = 0. Take x = 1.

∴ X1 = [1; 0; 0] is the eigenvector of A corresponding to the eigenvalue α = 1. (Every eigenvector is a scalar multiple of X1, so A has only one linearly independent eigenvector.)
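A numerical check, which also shows why this matrix has only one independent eigenvector (it is defective, hence not diagonalizable):

```python
import numpy as np

A = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0],
              [0.0, 0.0, 1.0]])
print(np.linalg.eigvals(A))                     # all 1: triple eigenvalue

# dim of the eigenspace = 3 - rank(A - I)
rank = np.linalg.matrix_rank(A - np.eye(3))
print(3 - rank)   # 1: only one independent eigenvector
```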

16. Examine whether the following matrix is diagonalizable. If so, obtain a matrix P such that P^{-1}AP is a diagonal matrix. A = [−2 2 −3; 2 1 −6; −1 −2 0]

Sol. A = [−2 2 −3; 2 1 −6; −1 −2 0]

The characteristic equation of A is |A − αI| = 0 → det[−2−α, 2, −3; 2, 1−α, −6; −1, −2, 0−α] = 0

→ −(α + 3)(α + 3)(α − 5) = 0 → α = −3, −3, 5 are the eigenvalues of the given matrix.

Let X = [x; y; z] be an eigenvector of A corresponding to the eigenvalue α = −3.

∴ [A − αI]X = 0 → [A − (−3)I]X = 0 → [1 2 −3; 2 4 −6; −1 −2 3][x; y; z] = [0; 0; 0]

(Operating R2 → R2 − 2R1, R3 → R3 + R1)

→ [1 2 −3; 0 0 0; 0 0 0][x; y; z] = [0; 0; 0] → x + 2y − 3z = 0.

Choose y = 0 → x − 3z = 0 → x : z = 3 : 1.

∴ X1 = [3; 0; 1] is a first eigenvector of A corresponding to the eigenvalue α = −3.

Choose z = 0 → x + 2y = 0 → x : y = −2 : 1.

∴ X2 = [−2; 1; 0] is another, linearly independent eigenvector of A corresponding to the eigenvalue α = −3.

Let X3 = [x; y; z] be the eigenvector of A corresponding to the eigenvalue α = 5.

∴ [A − αI]X3 = 0 → [A − (5)I]X3 = 0 → [−7 2 −3; 2 −4 −6; −1 −2 −5][x; y; z] = [0; 0; 0]

→ −7x + 2y − 3z = 0, 2x − 4y − 6z = 0, −x − 2y − 5z = 0.

From the first two equations, x : y : z = −24 : −48 : 24 = 1 : 2 : −1.

∴ X3 = [1; 2; −1] is the eigenvector of A corresponding to the eigenvalue α = 5.

∴ Modal matrix P = [−2 3 1; 1 0 2; 0 1 −1] (columns X2, X1, X3).

|P| = 8 ≠ 0. Hence the three eigenvectors are linearly independent and the given matrix is diagonalizable.

P^{-1} = adj(P)/|P| = (1/8)[−2 4 6; 1 2 5; 1 2 −3]

Diagonal matrix D = P^{-1}AP = (1/8)[−2 4 6; 1 2 5; 1 2 −3][−2 2 −3; 2 1 −6; −1 −2 0][−2 3 1; 1 0 2; 0 1 −1]

= (1/8)[−24 0 0; 0 −24 0; 0 0 40] = [−3 0 0; 0 −3 0; 0 0 5].
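The diagonalization can be confirmed directly:

```python
import numpy as np

A = np.array([[-2.0, 2.0, -3.0],
              [2.0, 1.0, -6.0],
              [-1.0, -2.0, 0.0]])
# Modal matrix with eigenvector columns X2, X1, X3 found above.
P = np.array([[-2.0, 3.0, 1.0],
              [1.0, 0.0, 2.0],
              [0.0, 1.0, -1.0]])

D = np.linalg.inv(P) @ A @ P
print(np.round(D))   # diag(-3, -3, 5)
assert np.allclose(D, np.diag([-3.0, -3.0, 5.0]))
```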

17. Let T be a linear transformation defined by

T[(1 1; 1 1)] = (1, 2, 3), T[(0 1; 1 1)] = (1, −2, 3), T[(0 0; 1 1)] = (1, −2, −3), T[(0 0; 0 1)] = (−1, 2, 3).

Find T[(4 5; 3 8)].

Sol. The matrices (1 1; 1 1), (0 1; 1 1), (0 0; 1 1), (0 0; 0 1) are linearly independent and hence form a basis of the space of 2 × 2 matrices. We write, for scalars α1, α2, α3, α4,

(4 5; 3 8) = α1(1 1; 1 1) + α2(0 1; 1 1) + α3(0 0; 1 1) + α4(0 0; 0 1) = (α1, α1+α2; α1+α2+α3, α1+α2+α3+α4).

Comparing the entries and solving the resulting system of equations, we get

α1 = 4, α2 = 1, α3 = −2, α4 = 5. Since T is a linear transformation,

∴ T[(4 5; 3 8)] = α1 T[(1 1; 1 1)] + α2 T[(0 1; 1 1)] + α3 T[(0 0; 1 1)] + α4 T[(0 0; 0 1)]

= 4(1, 2, 3) + 1(1, −2, 3) − 2(1, −2, −3) + 5(−1, 2, 3) = (−2, 20, 36).
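The same computation can be reproduced by solving for the coordinates of the target matrix in the chosen basis and then applying linearity:

```python
import numpy as np

# Basis matrices and their images under T, as given in the problem.
basis = [np.array([[1, 1], [1, 1]]),
         np.array([[0, 1], [1, 1]]),
         np.array([[0, 0], [1, 1]]),
         np.array([[0, 0], [0, 1]])]
images = [np.array([1, 2, 3]), np.array([1, -2, 3]),
          np.array([1, -2, -3]), np.array([-1, 2, 3])]

target = np.array([[4, 5], [3, 8]])

# Coordinates of `target` in the basis: flatten each matrix into a column.
M = np.column_stack([B.ravel() for B in basis])
coeffs = np.linalg.solve(M, target.ravel().astype(float))
print(coeffs)       # ~ [4, 1, -2, 5]

# Linearity: T(target) = sum of coeff * image.
T_target = sum(c * img for c, img in zip(coeffs, images))
print(T_target)     # ~ [-2, 20, 36]
```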
