KKKQ1223 Engineering Mathematics (Linear Algebra): Best Approximation; Least Squares; Least Squares Fitting to Data
ENGINEERING MATHEMATICS
(LINEAR ALGEBRA)
BEST APPROXIMATION

We often need to deal with a linear system Ax = b that cannot be solved exactly and for which an approximate solution is needed. The key fact is the Best Approximation Theorem: if W is a finite-dimensional subspace of an inner product space V and b is a vector in V, then $\mathrm{proj}_W b$ is the best approximation to b from W, in the sense that
$$\|b - \mathrm{proj}_W b\| < \|b - w\|$$
for every vector w in W that is different from $\mathrm{proj}_W b$.
THEOREM 6.4.2
For every linear system Ax = b, the associated normal system
$$A^T A x = A^T b$$
is consistent, and all solutions of the normal system are least squares solutions of Ax = b. Moreover, if W is the column space of A and x is any least squares solution of Ax = b, then the orthogonal projection of b on W is
$$\mathrm{proj}_W b = Ax$$
LEAST SQUARE SOLUTIONS OF LINEAR SYSTEMS
The equation $A^T A x = A^T b$ is called the normal equation (or the normal system) associated with Ax = b.

If a linear system is consistent, then its exact solutions are the same as its least squares solutions, in which case the least squares error $\|b - Ax\|$ is zero.
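As a quick numerical illustration of the normal equations, the minimal sketch below (in Python with NumPy; the matrix A and vector b are invented for illustration, not taken from the slides) forms $A^T A$ and $A^T b$, solves the normal system, and compares the result with NumPy's built-in least squares routine.

```python
import numpy as np

# Illustrative overdetermined system Ax = b (4 equations, 2 unknowns);
# the data here is made up purely for demonstration.
A = np.array([[ 1.0, -1.0],
              [ 3.0,  2.0],
              [-2.0,  4.0],
              [ 1.0,  1.0]])
b = np.array([4.0, 1.0, 3.0, 2.0])

# Normal system  A^T A x = A^T b  (Theorem 6.4.2: always consistent)
AtA = A.T @ A
Atb = A.T @ b
x_normal = np.linalg.solve(AtA, Atb)

# Compare with NumPy's built-in least squares solver
x_lstsq, *_ = np.linalg.lstsq(A, b, rcond=None)

print("normal-equation solution:", x_normal)
print("np.linalg.lstsq solution:", x_lstsq)
print("least squares error ||b - Ax|| =", np.linalg.norm(b - A @ x_normal))
```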
BEST APPROXIMATION
UNIQUENESS OF LEAST SQUARES SOLUTIONS
THEOREM 6.4.3
If A is an m x n matrix, then the following are equivalent:
(a) A has linearly independent column vectors.
(b) $A^T A$ is invertible.
THEOREM 6.4.4
If A is an m x n matrix with linearly independent column vectors, then for every m x 1 matrix b, the linear system
Ax = b has a unique least squares solution. This solution is given by
$$x = (A^T A)^{-1} A^T b$$
Moreover, if W is the column space of A, then the orthogonal projection of b on W is
$$\mathrm{proj}_W b = Ax = A (A^T A)^{-1} A^T b$$
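The projection formula can be checked numerically. The sketch below (again with invented data) computes $x = (A^T A)^{-1} A^T b$ by solving the normal system, forms $\mathrm{proj}_W b = Ax$, and verifies that the error vector $b - \mathrm{proj}_W b$ is orthogonal to every column of A.

```python
import numpy as np

# Columns of A span the subspace W; b is the vector being projected.
# The numbers are illustrative only.
A = np.array([[2.0,  1.0],
              [1.0, -1.0],
              [0.0,  0.0],
              [1.0,  2.0]])
b = np.array([1.0, 2.0, 3.0, 4.0])

# Unique least squares solution and the projection of b onto W = col(A)
x = np.linalg.solve(A.T @ A, A.T @ b)   # x = (A^T A)^{-1} A^T b
proj = A @ x                             # proj_W b = A x

# The error b - proj_W b should be orthogonal to the columns of A
error = b - proj
print("proj_W b =", proj)
print("A^T (b - proj) =", A.T @ error)   # approximately the zero vector
```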
EXAMPLE 1
[Worked numerical example: computing a least squares solution x of an inconsistent system Ax = b, the error vector b − Ax, and the least squares error $\|b - Ax\|$.]
EXAMPLE 2
Find the orthogonal projection of u = (1, 2, 3) on the subspace W of $R^3$ spanned by the vectors v1 = (2, 1, 0) and v2 = (1, −1, 0).

With A the matrix having v1 and v2 as column vectors, the system Ax = u is
$$\begin{bmatrix} 2 & 1 \\ 1 & -1 \\ 0 & 0 \end{bmatrix} \begin{bmatrix} x \\ y \end{bmatrix} = \begin{bmatrix} 1 \\ 2 \\ 3 \end{bmatrix}$$
so
$$A^T A = \begin{bmatrix} 2 & 1 & 0 \\ 1 & -1 & 0 \end{bmatrix} \begin{bmatrix} 2 & 1 \\ 1 & -1 \\ 0 & 0 \end{bmatrix} = \begin{bmatrix} 5 & 1 \\ 1 & 2 \end{bmatrix}, \qquad A^T u = \begin{bmatrix} 2 & 1 & 0 \\ 1 & -1 & 0 \end{bmatrix} \begin{bmatrix} 1 \\ 2 \\ 3 \end{bmatrix} = \begin{bmatrix} 4 \\ -1 \end{bmatrix}$$
The normal system $A^T A x = A^T u$ is
$$\begin{bmatrix} 5 & 1 \\ 1 & 2 \end{bmatrix} \begin{bmatrix} x \\ y \end{bmatrix} = \begin{bmatrix} 4 \\ -1 \end{bmatrix}$$
Solving this system yields
$$x = \begin{bmatrix} x \\ y \end{bmatrix} = \begin{bmatrix} 1 \\ -1 \end{bmatrix}$$
so
$$\mathrm{proj}_W u = Ax = \begin{bmatrix} 2 & 1 \\ 1 & -1 \\ 0 & 0 \end{bmatrix} \begin{bmatrix} 1 \\ -1 \end{bmatrix} = \begin{bmatrix} 1 \\ 2 \\ 0 \end{bmatrix}$$
or proj_W u = (1, 2, 0).
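A quick NumPy check of Example 2, using the vectors u, v1, and v2 as reconstructed above:

```python
import numpy as np

# Vectors from Example 2 (as reconstructed above): project u onto span{v1, v2}
v1 = np.array([2.0, 1.0, 0.0])
v2 = np.array([1.0, -1.0, 0.0])
u = np.array([1.0, 2.0, 3.0])

A = np.column_stack([v1, v2])            # columns of A span W
x = np.linalg.solve(A.T @ A, A.T @ u)    # solve the normal system A^T A x = A^T u
proj = A @ x

print("x =", x)            # expected: [ 1. -1.]
print("proj_W u =", proj)  # expected: [1. 2. 0.]
```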
THE ROLE OF QR-DECOMPOSITION IN LEAST SQUARES PROBLEMS
* Neither formula in Theorem 6.4.4 is well suited for numerical computation, since forming and inverting $A^T A$ is sensitive to roundoff error; in practice a QR-decomposition of A is used instead.
THEOREM 6.4.5
If A is an m x n matrix with linearly independent column vectors, and if A = QR is a QR-decomposition of A, then for every vector b in $R^m$ the system Ax = b has a unique least squares solution, given by
$$x = R^{-1} Q^T b$$
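A minimal sketch of the QR approach in NumPy (invented data; since R is upper triangular, R x = Q^T b is solved directly rather than inverting R):

```python
import numpy as np

# Illustrative overdetermined system
A = np.array([[1.0,  2.0],
              [1.0,  1.0],
              [1.0,  0.0],
              [1.0, -1.0]])
b = np.array([3.0, 2.0, 1.0, 1.0])

# Reduced QR-decomposition: A = QR, Q (m x n) with orthonormal columns
Q, R = np.linalg.qr(A)

# Least squares solution x = R^{-1} Q^T b, computed by solving R x = Q^T b
x = np.linalg.solve(R, Q.T @ b)

print("x =", x)
print("check against lstsq:", np.linalg.lstsq(A, b, rcond=None)[0])
```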
ORTHOGONAL PROJECTIONS ON SUBSPACE OF Rm
DEFINITION 1
If W is a subspace of $R^m$, then the transformation $P : R^m \to R^m$ that maps each vector x in $R^m$ into its orthogonal projection $\mathrm{proj}_W x$ is called the orthogonal projection of $R^m$ onto W.

• The standard matrix of the orthogonal projection P onto a subspace W of $R^m$ follows from the projection formula in Theorem 6.4.4:
$$P = A (A^T A)^{-1} A^T$$
where A is constructed using any basis for W as its column vectors.
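For instance, the sketch below builds the standard matrix P for the orthogonal projection onto a plane in $R^3$ (the basis vectors are chosen purely for illustration) and checks the characteristic properties $P^T = P$ and $P^2 = P$:

```python
import numpy as np

# Basis for W (a plane in R^3), chosen for illustration only
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])

# Standard matrix of the orthogonal projection onto W = col(A)
P = A @ np.linalg.inv(A.T @ A) @ A.T

print("P =\n", P)
print("symmetric:", np.allclose(P, P.T))      # P^T = P
print("idempotent:", np.allclose(P @ P, P))   # P^2 = P
```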
MORE ON THE EQUIVALENCE THEOREM
For an n x n matrix A, the statements "A is invertible", "A has linearly independent column vectors", and "$A^T A$ is invertible" are all equivalent, so Theorem 6.4.3 adds another entry to the list of equivalent characterizations of invertibility.
LEAST SQUARE FITTING TO DATA
LEAST SQUARE FIT OF A STRAIGHT LINE

Let (x1, y1), (x2, y2), …, (xn, yn) be a set of two or more data points, not all lying on a vertical line, and let
$$M = \begin{bmatrix} 1 & x_1 \\ 1 & x_2 \\ \vdots & \vdots \\ 1 & x_n \end{bmatrix}, \qquad y = \begin{bmatrix} y_1 \\ y_2 \\ \vdots \\ y_n \end{bmatrix}$$
Then there is a unique least squares straight line fit
$$y = a^* + b^* x$$
to the data points. Moreover,
$$v^* = \begin{bmatrix} a^* \\ b^* \end{bmatrix}$$
is given by the formula
$$v^* = (M^T M)^{-1} M^T y$$
which expresses the fact that v = v* is the unique solution of the normal equations
$$M^T M v = M^T y$$
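A minimal sketch of the straight line fit in NumPy (the data points are invented for illustration), solving the normal equations $M^T M v = M^T y$ and cross-checking with np.polyfit:

```python
import numpy as np

# Illustrative data points (x_i, y_i)
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.1, 2.9, 5.2, 6.8, 9.1])

# Design matrix M with a column of ones and a column of x-values
M = np.column_stack([np.ones_like(x), x])

# Solve the normal equations  M^T M v = M^T y  for v* = (a*, b*)
a_star, b_star = np.linalg.solve(M.T @ M, M.T @ y)
print("least squares line: y = %.4f + %.4f x" % (a_star, b_star))

# Cross-check with NumPy's polynomial fit (polyfit returns highest degree first)
print("np.polyfit check:", np.polyfit(x, y, 1)[::-1])
```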
LEAST SQUARE FIT OF A POLYNOMIAL
We wish to fit a polynomial of fixed degree m,
$$y = a_0 + a_1 x + \cdots + a_m x^m$$
to n points (x1, y1), (x2, y2), …, (xn, yn). Substituting these n values of x and y into the polynomial yields the n equations
$$y_1 = a_0 + a_1 x_1 + \cdots + a_m x_1^m$$
$$y_2 = a_0 + a_1 x_2 + \cdots + a_m x_2^m$$
$$\vdots$$
$$y_n = a_0 + a_1 x_n + \cdots + a_m x_n^m$$
or, in matrix form, y = Mv, where
$$y = \begin{bmatrix} y_1 \\ y_2 \\ \vdots \\ y_n \end{bmatrix}, \qquad M = \begin{bmatrix} 1 & x_1 & x_1^2 & \cdots & x_1^m \\ 1 & x_2 & x_2^2 & \cdots & x_2^m \\ \vdots & \vdots & \vdots & & \vdots \\ 1 & x_n & x_n^2 & \cdots & x_n^m \end{bmatrix}, \qquad v = \begin{bmatrix} a_0 \\ a_1 \\ a_2 \\ \vdots \\ a_m \end{bmatrix}$$
The solution of the normal equations
$$M^T M v = M^T y$$
determines the coefficients of the polynomial; the resulting vector v minimizes
$$\|y - Mv\|$$
If $M^T M$ is invertible, then the normal equations have a unique solution v = v*, which is given by
$$v^* = (M^T M)^{-1} M^T y$$
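As an illustration (the sample data is invented for the sketch), the following fits a quadratic (m = 2) by forming M and solving the normal equations:

```python
import numpy as np

# Illustrative data to be fitted by a quadratic y = a0 + a1*x + a2*x^2
x = np.array([-2.0, -1.0, 0.0, 1.0, 2.0, 3.0])
y = np.array([4.2, 1.1, 0.3, 0.9, 4.1, 9.2])

m = 2  # degree of the fitting polynomial

# M has columns 1, x, x^2, ..., x^m (np.vander with increasing=True)
M = np.vander(x, m + 1, increasing=True)

# v* = (M^T M)^{-1} M^T y, computed by solving the normal equations
v_star = np.linalg.solve(M.T @ M, M.T @ y)
print("coefficients a0, a1, a2:", v_star)

# residual that the least squares fit minimizes
print("||y - M v*|| =", np.linalg.norm(y - M @ v_star))
```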
EXAMPLE 3
EXAMPLE 4