Linear Algebra
● What is Linear Algebra?: Study of vectors, vector spaces, linear transformations, and
systems of linear equations.
● Applications: Physics, engineering, computer science (e.g., machine learning,
computer graphics), economics, and statistics.
3. Matrices
● Basis of a Vector Space: A set of linearly independent vectors that span the entire
space.
● Finding a Basis: Use Gaussian elimination; the pivot columns of the original matrix form a basis for its column space (see the sketch below).
● Dimension of a Vector Space: The number of vectors in any basis for the space.
○ Example: The dimension of ℝ² is 2, and the dimension of ℝ³ is 3.
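A minimal sketch of this procedure, assuming SymPy is available (the 3×3 matrix is just an illustrative example): rref() reports the pivot columns, and those columns of the original matrix form a basis for its column space.

```python
from sympy import Matrix

# Columns: the third column is the sum of the first two, so it is redundant.
A = Matrix([[1, 2, 3],
            [0, 1, 1],
            [1, 0, 1]])

# rref() returns the reduced row echelon form and the indices of the pivot columns.
rref_form, pivot_cols = A.rref()

# The pivot columns of the original matrix A form a basis for its column space.
basis = [A.col(i) for i in pivot_cols]

print(pivot_cols)    # (0, 1) -> the first two columns form a basis
print(len(basis))    # 2 = dimension of the column space
```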
● Definition:
○ Eigenvalues: Scalars λ such that multiplying the matrix by its eigenvector gives the same result as scaling that eigenvector by λ, i.e., Av = λv.
○ Eigenvectors: Non-zero vectors that change only in magnitude, not direction, when multiplied by the matrix.
● Finding Eigenvalues and Eigenvectors:
○ Solve the characteristic equation det(A − λI) = 0 for the eigenvalues λ.
○ For each eigenvalue λ, solve (A − λI)v = 0 for the corresponding eigenvectors v.
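The same two steps can be carried out symbolically; a short sketch assuming SymPy, with an arbitrary 2×2 example matrix:

```python
from sympy import Matrix, symbols, eye, solve

lam = symbols('lambda')
A = Matrix([[4, 1],
            [2, 3]])

# Step 1: solve the characteristic equation det(A - lambda*I) = 0.
char_poly = (A - lam * eye(2)).det()
eigenvalues = solve(char_poly, lam)          # eigenvalues 2 and 5

# Step 2: for each eigenvalue, solve (A - lambda*I) v = 0, i.e. find the null space.
for ev in eigenvalues:
    print(ev, (A - ev * eye(2)).nullspace())
```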
● Diagonalization: A matrix A is diagonalizable if it has a full set of linearly independent eigenvectors, allowing it to be written as A = PDP⁻¹, where D is a diagonal matrix of the eigenvalues and the columns of P are the corresponding eigenvectors.
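A quick numerical check of the factorization, assuming NumPy and reusing the 2×2 example above: np.linalg.eig returns the eigenvalues and a matrix of eigenvectors, which play the roles of D and P.

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

# eig returns the eigenvalues and a matrix P whose columns are eigenvectors.
eigenvalues, P = np.linalg.eig(A)
D = np.diag(eigenvalues)

# Check the factorization A = P D P^{-1} up to floating-point error.
print(np.allclose(A, P @ D @ np.linalg.inv(P)))    # True
```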
● Dot Product: For vectors u and v, u · v = Σᵢ uᵢvᵢ; if the dot product is zero, the vectors are orthogonal.
● Orthogonal Projections: Projecting a vector onto a subspace (or a single vector); the projection of b onto a is proj_a(b) = ((a · b)/(a · a)) a, and the residual b − proj_a(b) is orthogonal to a.
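A small sketch of the dot-product orthogonality test and of projection onto a single vector, assuming NumPy (the vectors are arbitrary examples):

```python
import numpy as np

a = np.array([1.0, 0.0, 1.0])
b = np.array([2.0, 3.0, 4.0])

# Orthogonality test: a zero dot product means the vectors are orthogonal.
print(np.dot(a, np.array([0.0, 5.0, 0.0])))    # 0.0, so this vector is orthogonal to a

# Orthogonal projection of b onto the line spanned by a.
proj = (np.dot(a, b) / np.dot(a, a)) * a
print(proj)                                    # [3. 0. 3.]

# The residual b - proj is orthogonal to a.
print(np.dot(a, b - proj))                     # 0.0
```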
● Orthonormal Bases: A basis where all vectors are orthogonal and of unit length.
● Gram-Schmidt Process: A method for converting a linearly independent set of vectors into an orthonormal basis for the same subspace (sketched below).
● Orthogonal Matrices: A matrix A is orthogonal if its inverse is equal to its transpose, i.e., A⁻¹ = Aᵀ (equivalently, AᵀA = I).
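A minimal sketch of the classical Gram-Schmidt variant, assuming NumPy (the modified variant is more numerically stable), followed by a check that the resulting matrix of orthonormal columns is orthogonal:

```python
import numpy as np

def gram_schmidt(vectors):
    """Classical Gram-Schmidt: return a matrix whose columns are an
    orthonormal basis for the span of the input vectors."""
    basis = []
    for v in vectors:
        # Subtract the projection of v onto each basis vector found so far.
        w = v - sum(np.dot(v, q) * q for q in basis)
        if not np.isclose(np.linalg.norm(w), 0.0):
            basis.append(w / np.linalg.norm(w))
    return np.column_stack(basis)

vectors = [np.array([1.0, 1.0, 0.0]),
           np.array([1.0, 0.0, 1.0]),
           np.array([0.0, 1.0, 1.0])]

Q = gram_schmidt(vectors)

# The columns of Q are orthonormal, so Q is an orthogonal matrix: Q^T Q = I.
print(np.allclose(Q.T @ Q, np.eye(3)))    # True
```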
● Definition (Linear Transformation): A mapping T between vector spaces that preserves vector addition and scalar multiplication: T(u + v) = T(u) + T(v) and T(cv) = cT(v).
● Matrix Representation of Linear Transformations: Every linear transformation can be
represented by a matrix.
● Kernel and Range:
○ Kernel (Null Space): The set of vectors that are mapped to the zero vector by
the transformation.
○ Range (Image): The set of all vectors that can be obtained by applying the
transformation to some vector.
● Rank-Nullity Theorem: For a linear transformation T: V → W, the rank (dimension of the range) plus the nullity (dimension of the kernel) equals the dimension of the domain: rank(T) + nullity(T) = dim(V).
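A quick check of the theorem, assuming SymPy, for a matrix representing a transformation T: ℝ⁴ → ℝ³ (an arbitrary example):

```python
from sympy import Matrix

# A represents a linear transformation T: R^4 -> R^3.
A = Matrix([[1, 2, 0, 1],
            [0, 1, 1, 0],
            [1, 3, 1, 1]])

rank = A.rank()                  # dimension of the range (column space): 2
nullity = len(A.nullspace())     # dimension of the kernel (null space): 2

print(rank + nullity == A.cols)  # True: rank + nullity = dim of the domain (4)
```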
11. Applications of Linear Algebra