Chebyshev Polynomials
Least Squares, redux
Outline

Chebyshev Polynomials
  - Orthogonal Polynomials
  - Chebyshev Polynomials, Intro & Definitions
  - Properties

Least Squares, redux
  - Examples
  - More than one variable? No problem!
Joe Mahaffy, <mahaffy@math.sdsu.edu>
Department of Mathematics
http://www-rohan.sdsu.edu/~jmahaffy
Spring 2010
Background

Several of the classical families of orthogonal polynomials arise as solutions of second-order differential equations. The Legendre polynomials, for example, solve Legendre's equation

\[ (1 - x^2)\,\frac{d^2y}{dx^2} - 2x\,\frac{dy}{dx} + \nu(\nu+1)\,y = 0, \]

or equivalently

\[ \frac{d}{dx}\!\left[(1 - x^2)\,\frac{dy}{dx}\right] + \nu(\nu+1)\,y = 0. \]
(For a catalog of these families, see http://mathworld.wolfram.com/OrthogonalPolynomials.html.)
Similarly, the Laguerre polynomials solve Laguerre's equation

\[ x\,\frac{d^2y}{dx^2} + (1 - x)\,\frac{dy}{dx} + n\,y = 0. \]
Each family is orthogonal on its own interval with respect to its own weight function $w(x)$:

  Polynomial                              Interval             w(x)
  Chebyshev, first kind  $T_n(x)$         $[-1,1]$             $(1-x^2)^{-1/2}$
  Chebyshev, second kind $U_n(x)$         $[-1,1]$             $(1-x^2)^{1/2}$
  Gegenbauer $C_n^{(\lambda)}(x)$         $[-1,1]$             $(1-x^2)^{\lambda-1/2}$
  Hermite $H_n(x)$                        $(-\infty,\infty)$   $e^{-x^2}$
  Jacobi $P_n^{(\alpha,\beta)}(x)$        $(-1,1)$             $(1-x)^{\alpha}(1+x)^{\beta}$
  Legendre $P_n(x)$                       $[-1,1]$             $1$
  Laguerre $L_n(x)$                       $[0,\infty)$         $e^{-x}$
  Associated Laguerre $L_n^k(x)$          $[0,\infty)$         $x^k e^{-x}$
The Chebyshev polynomials of the first kind can also be obtained from the contour integral

\[ T_n(z) = \frac{1}{4\pi i}\oint \frac{(1 - t^2)\,t^{-n-1}}{1 - 2tz + t^2}\,dt, \]

where the contour encloses the origin.
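As a quick numerical sanity check of the contour integral, one can approximate it by the trapezoidal rule on a circle of radius $r < 1$ (so that the origin is inside the contour while the poles of the integrand stay outside) and compare with $\cos(n \arccos z)$. The degree $n = 5$, the point $z = 0.3$, the radius $r = 0.5$, and the $N = 400$ sample points below are arbitrary choices.

matlab>>
n = 5;  z = 0.3;                      % degree and evaluation point (arbitrary)
N = 400;  r = 0.5;                    % trapezoidal rule on the circle |t| = r
phi = 2*pi*(0:N-1)/N;
t   = r*exp(1i*phi);
f   = (1 - t.^2).*t.^(-n-1)./(1 - 2*t*z + t.^2);
Tn  = real(sum(f.*t)/(2*N));          % approximates the contour integral
disp([Tn  cos(n*acos(z))])            % the two numbers agree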
Definition
For $x \in [-1,1]$, the Chebyshev polynomial of degree $n$ is

\[ T_n(x) = \cos(n \arccos x), \qquad n \ge 0. \]

In particular, $T_0(x) = \cos(0) = 1$ and $T_1(x) = \cos(\arccos x) = x$.

Substituting $\theta = \arccos(x)$ turns this into

\[ T_n(\theta(x)) = \cos(n\theta), \qquad \text{where } \theta \in [0,\pi]. \]

Note: From the identity $\cos((n+1)\theta) + \cos((n-1)\theta) = 2\cos\theta\,\cos(n\theta)$ we get the three-term recurrence $T_{n+1}(x) = 2x\,T_n(x) - T_{n-1}(x)$ for $n \ge 1$, which shows that $T_n$ is indeed a polynomial of degree $n$.
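A minimal Matlab sketch of this: build $T_0,\ldots,T_5$ on a grid using the three-term recurrence and compare the last one against the $\cos(n \arccos x)$ definition (the grid and the maximal degree are arbitrary choices).

matlab>>
x = linspace(-1,1,201)';              % evaluation grid on [-1,1] (arbitrary)
T = zeros(length(x),6);               % column n+1 will hold T_n
T(:,1) = 1;                           % T_0(x) = 1
T(:,2) = x;                           % T_1(x) = x
for n = 2:5
    T(:,n+1) = 2*x.*T(:,n) - T(:,n-1);    % T_{n+1} = 2x T_n - T_{n-1}
end
disp(max(abs(T(:,6) - cos(5*acos(x)))))   % agreement to rounding error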
Figure: The Chebyshev polynomials $T_1(x), \ldots, T_5(x)$ on $[-1,1]$.

The Chebyshev polynomials are orthogonal on $(-1,1)$ with respect to the weight $w(x) = (1-x^2)^{-1/2}$. Consider

\[ \int_{-1}^{1} \frac{T_n(x)\,T_m(x)}{\sqrt{1-x^2}}\,dx = \int_{-1}^{1} \frac{\cos(n\arccos x)\,\cos(m\arccos x)}{\sqrt{1-x^2}}\,dx. \]

With the substitution $\theta = \arccos x$, so that $d\theta = -\,dx/\sqrt{1-x^2}$, the integral becomes

\[ \int_{0}^{\pi} \cos(n\theta)\cos(m\theta)\,d\theta. \]

Using

\[ \cos(n\theta)\cos(m\theta) = \frac{\cos((n+m)\theta) + \cos((n-m)\theta)}{2}, \]

the integral splits into two elementary pieces.
If $m \ne n$, we get

\[ \left[\frac{1}{2(n+m)}\sin((n+m)\theta) + \frac{1}{2(n-m)}\sin((n-m)\theta)\right]_0^{\pi} = 0, \]

and if $m = n \ge 1$, we have

\[ \left[\frac{1}{2(n+m)}\sin((n+m)\theta) + \frac{\theta}{2}\right]_0^{\pi} = \frac{\pi}{2}. \]

Hence the $T_n$ are indeed orthogonal with respect to $w(x) = (1-x^2)^{-1/2}$.
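These relations are easy to confirm numerically, for instance using the substituted form of the integral and Matlab's quadgk (the degrees 3, 5, and 4 below are arbitrary):

matlab>>
I = @(n,m) quadgk(@(th) cos(n*th).*cos(m*th), 0, pi);   % int_0^pi cos(n t) cos(m t) dt
disp([I(3,5)  I(4,4)  pi/2])          % ~0 when m ~= n, ~pi/2 when m = n >= 1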
Theorem
The Chebyshev polynomial $T_n(x)$ of degree $n \ge 1$ has $n$ simple zeros in $[-1,1]$ at

\[ x_k = \cos\!\left(\frac{2k-1}{2n}\,\pi\right), \qquad k = 1,\ldots,n. \]

Moreover, $T_n(x)$ assumes its absolute extrema at

\[ x_k' = \cos\!\left(\frac{k\pi}{n}\right), \quad \text{with } T_n(x_k') = (-1)^k, \qquad k = 0,1,\ldots,n. \]

Indeed, at the points $x_k$,

\[ T_n(x_k) = \cos\!\Big(n\arccos\big(\cos\big(\tfrac{2k-1}{2n}\pi\big)\big)\Big) = \cos\!\left(\frac{2k-1}{2}\,\pi\right) = 0, \]

so the $x_k$ are $n$ distinct zeros of a polynomial of degree $n$; they must be simple, and there can be no others. For the extrema, differentiate:

\[ T_n'(x) = \frac{d}{dx}\big[\cos(n\arccos x)\big] = \frac{n\sin(n\arccos x)}{\sqrt{1-x^2}}, \]

so that

\[ T_n'(x_k') = \frac{n\,\sin\!\big(n\arccos\big(\cos\big(\tfrac{k\pi}{n}\big)\big)\big)}{\sqrt{1-\cos^2\big(\tfrac{k\pi}{n}\big)}} = \frac{n\sin(k\pi)}{\sin\big(\tfrac{k\pi}{n}\big)} = 0, \qquad k = 1,\ldots,n-1, \]

while

\[ T_n(x_k') = \cos\!\Big(n\arccos\big(\cos\big(\tfrac{k\pi}{n}\big)\big)\Big) = \cos(k\pi) = (-1)^k, \qquad k = 0,1,\ldots,n. \]

Payoff: No matter what the degree of the polynomial, the oscillations are kept under control!
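A quick check of the zero and extreme-point formulas, for instance with $n = 7$ (the degree is an arbitrary choice):

matlab>>
n  = 7;
xk = cos((2*(1:n) - 1)*pi/(2*n));     % the n simple zeros of T_n
xe = cos((0:n)*pi/n);                 % the extreme points x_k'
disp(max(abs(cos(n*acos(xk)))))       % T_n(x_k): zero to rounding error
disp(cos(n*acos(xe)))                 % T_n(x_k'): alternates +1, -1, ...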
Definition
The monic Chebyshev polynomials are

\[ \widetilde{T}_0(x) = 1, \qquad \widetilde{T}_n(x) = \frac{1}{2^{n-1}}\,T_n(x), \quad n \ge 1. \]

The recurrence for $T_n$ translates into

\[ \widetilde{T}_2(x) = x\,\widetilde{T}_1(x) - \tfrac{1}{2}\,\widetilde{T}_0(x), \qquad \widetilde{T}_{n+1}(x) = x\,\widetilde{T}_n(x) - \tfrac{1}{4}\,\widetilde{T}_{n-1}(x), \quad n \ge 2. \]
Definition
Let $\widetilde{\Pi}_n$ denote the set of all monic polynomials of degree $n$.

Figure: $T_1(x), \ldots, T_5(x)$ on $[-1,1]$.

Theorem (Min-Max)
The monic Chebyshev polynomials $\widetilde{T}_n(x)$ have the property that

\[ \frac{1}{2^{n-1}} = \max_{x\in[-1,1]} \big|\widetilde{T}_n(x)\big| \le \max_{x\in[-1,1]} |P_n(x)| \qquad \text{for all } P_n(x) \in \widetilde{\Pi}_n. \]

Moreover, equality can only occur if $P_n(x) \equiv \widetilde{T}_n(x)$.
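To see the Min-Max property in action, one can compare $\max|\widetilde{T}_n|$ with the size of two other natural monic polynomials of the same degree, say $x^n$ and the monic polynomial with equispaced roots. A minimal sketch; the degree $n = 6$ and the sampling grid are arbitrary choices.

matlab>>
n  = 6;
x  = linspace(-1,1,2001);
Tt = cos(n*acos(x))/2^(n-1);          % monic Chebyshev polynomial
P1 = x.^n;                            % monic, all roots at 0
r  = linspace(-1,1,n);                % monic, n equispaced roots
P2 = ones(size(x));
for j = 1:n
    P2 = P2.*(x - r(j));
end
disp([1/2^(n-1)  max(abs(Tt))  max(abs(P1))  max(abs(P2))])   % Chebyshev is smallest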
This is exactly what is needed to choose good interpolation nodes. If the nodes are taken to be the zeros of $T_{n+1}$,

\[ \widetilde{x}_{k+1} = \cos\!\left(\frac{2k+1}{2(n+1)}\,\pi\right), \qquad k = 0,\ldots,n, \]

then the node polynomial $\prod_{k=1}^{n+1}(x - \widetilde{x}_k) = \widetilde{T}_{n+1}(x)$ is as small as possible on $[-1,1]$, and the interpolating polynomial $P(x)$ of degree at most $n$ satisfies

\[ \max_{x\in[-1,1]} |f(x) - P(x)| \le \frac{1}{2^{n}(n+1)!}\,\max_{x\in[-1,1]} \big|f^{(n+1)}(x)\big|. \]

For a general interval $[a,b]$, the nodes are mapped by

\[ \widetilde{x} = \frac{1}{2}\big[(b-a)x + (a+b)\big]. \]
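For illustration, a sketch comparing interpolation on equispaced nodes with interpolation on the mapped Chebyshev nodes; the test function, the interval $[0,1]$, and the degree $n = 8$ are arbitrary choices, and polyfit/polyval stand in for any interpolation routine.

matlab>>
f = @(x) 1./(1 + 25*(x - 0.5).^2);    % test function (arbitrary choice)
a = 0;  b = 1;  n = 8;                % interval and degree (arbitrary)
xe = linspace(a,b,n+1);               % equispaced nodes
xc = cos((2*(0:n)+1)*pi/(2*(n+1)));   % zeros of T_{n+1} on [-1,1] ...
xc = ((b-a)*xc + (a+b))/2;            % ... mapped to [a,b]
xv = linspace(a,b,2001);
pe = polyfit(xe,f(xe),n);  pc = polyfit(xc,f(xc),n);
disp([max(abs(f(xv)-polyval(pe,xv)))  max(abs(f(xv)-polyval(pc,xv)))])  % worst-case errors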
Figure: $f(x)$ together with its degree-2 Taylor polynomial and the degree-2 Lagrange interpolants built on equispaced and on optimal (Chebyshev) nodes, along with the corresponding deviations. The remaining panels compare the interpolation error for equispaced vs. optimal nodes for degrees 2, 3, 4, and 5.
Least Squares, redux

Example #1: Warm-up

First we consider the problem of fitting 1st, 2nd, and 3rd degree polynomials to the following data:
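The data vectors and the construction of the matrices are not reproduced here; a minimal sketch of that setup, assuming the data is stored in column vectors x and y (the actual values appear on the original slide):

matlab>>
% x, y : column vectors holding the data (values omitted here)
A1 = [ones(size(x)) x];               % model  a + b*x
A2 = [A1 x.^2];                       % model  a + b*x + c*x^2
A3 = [A2 x.^3];                       % model  a + b*x + c*x^2 + d*x^3
pcoef1 = A1\y;  pcoef2 = A2\y;  pcoef3 = A3\y;   % least squares coefficients
xv = (min(x):0.01:max(x))';           % grid for plotting the fitted curves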
p1 = polyval(flipud(pcoef1),xv);
p2 = polyval(flipud(pcoef2),xv);
p3 = polyval(flipud(pcoef3),xv);
plot(xv,p3,'k-','linewidth',3); hold on;
plot(x,y,'ko','linewidth',3); hold off
Note: The matrices A1, A2, and A3 are tall and skinny. Normally we would compute $(A_n^T A_n)^{-1}(A_n^T y)$; however, when matlab encounters An\y it automatically gives us a solution in the least squares sense.
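For comparison, a small sketch showing that the backslash solution agrees with the explicit normal-equations solution, here using the degree-2 matrix from the setup above; in practice A\y is preferable, since forming $A^T A$ squares the condition number (see the table at the end of the 2D example).

matlab>>
pc_backslash = A2\y;                  % least squares via backslash (QR based)
pc_normal    = (A2'*A2)\(A2'*y);      % explicit normal equations
disp(max(abs(pc_backslash - pc_normal)))   % agree to rounding error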
Figure: The least squares fits of degree 1, 2, and 3 to the data. The LSQ errors are 0.087656, 0.069900, and 0.044729, respectively.
matlab>>
p1err = polyval(flipud(pcoef1),x) - y;
p2err = polyval(flipud(pcoef2),x) - y;
p3err = polyval(flipud(pcoef3),x) - y;
disp([sum(p1err.*p1err) sum(p2err.*p2err) sum(p3err.*p3err)])

which reports

P1Err = 0.0877,  P2Err = 0.0699,  P3Err = 0.0447

But let's find the best fit of the form $a + b\sqrt{x}$ to this data! Notice that this expression is linear in its parameters $a$, $b$, so we can solve the corresponding least squares problem!

matlab>>
A     = [ones(size(x)) sqrt(x)];
pcoef = A\y;
xv    = 1.0:0.01:2.1;
fv    = pcoef(1) + pcoef(2)*sqrt(xv);
plot(xv,fv,'k-','linewidth',3); hold on;
plot(x,y,'ko','linewidth',3); hold off;
Optimal $a + b\sqrt{x}$

Figure: The least squares fit of the form $a + b\sqrt{x}$ to the data. LSQ error: 0.074920.

Which gives us

matlab>>
ferr = pcoef(1) + pcoef(2)*sqrt(x) - y;
disp(sum(ferr.*ferr))

$\mathrm{Err}_{a+b\sqrt{x}} = 0.0749$

Figure: One more least squares fit to the same data set. LSQ error: 0.062771.

Getting Multi-Dimensional

The recipe stays the same for more complicated models: build the matrix A with one column per term in the model, then coef = A\y, etc.
Example: Going 2D
matlab>> x=1:0.25:5;
         y=1:0.25:5;
         [X,Y]=meshgrid(x,y);
         Fxy=1+sqrt(X)+Y.^3+0.05*randn(size(X));
         surf(x,y,Fxy)
Figure: 2D data set; the vertices on the surface are our data points.

We start with the linear model

\[ M(a,b,c) = a + bx + cy \]

matlab>>
sz   = size(X);
Bm   = reshape(X,prod(sz),1);
Cm   = reshape(Y,prod(sz),1);
Am   = ones(size(Bm));
RHS  = reshape(Fxy,prod(sz),1);
A    = [Am Bm Cm];
coef = A \ RHS;
fit      = coef(1) + coef(2)*X + coef(3)*Y;
fitError = Fxy - fit;
surf(x,y,fitError)
Figure: Fit (Model 1) and Error (Model 1). The optimal model fit, and the fitting error for the least squares best-fit in the model space $M(a,b,c) = a + bx + cy$. Here, the total LSQ-error is 42,282.

Next we add a mixed term:

\[ M(a,b,c,d) = a + bx + cy + dxy \]

matlab>>
sz   = size(X);
Bm   = reshape(X,prod(sz),1);
Cm   = reshape(Y,prod(sz),1);
Dm   = reshape(X.*Y,prod(sz),1);
Am   = ones(size(Bm));
RHS  = reshape(Fxy,prod(sz),1);
A    = [Am Bm Cm Dm];
coef = A \ RHS;
fit      = coef(1) + coef(2)*X + coef(3)*Y + coef(4)*X.*Y;
fitError = Fxy - fit;
surf(x,y,fitError)
Figure: Fit (Model 2) and Error (Model 2). The fitting error for the least squares best-fit in the model space $M(a,b,c,d) = a + bx + cy + dxy$. Still a pretty bad fit. Here, the total LSQ-error is still 42,282.

The mixed term does not help, so we try a $y^2$ term instead:

\[ M(a,b,c,d) = a + bx + cy + dy^2 \]

matlab>>
sz   = size(X);
Bm   = reshape(X,prod(sz),1);
Cm   = reshape(Y,prod(sz),1);
Dm   = reshape(Y.*Y,prod(sz),1);
Am   = ones(size(Bm));
RHS  = reshape(Fxy,prod(sz),1);
A    = [Am Bm Cm Dm];
coef = A \ RHS;
fit      = coef(1) + coef(2)*X + coef(3)*Y + coef(4)*Y.*Y;
fitError = Fxy - fit;
surf(x,y,fitError)

Figure: Fit (Model 3) and Error (Model 3). The fitting error for the least squares best-fit in the model space $M(a,b,c,d) = a + bx + cy + dy^2$. We see a significant drop in the error (one order of magnitude), and the total LSQ-error has dropped to 578.8.

Adding terms to the model also affects the conditioning of the least squares problem:

  Added term(s)      $\kappa(A)$     $\kappa(A^T A)$
  $xy$                   86.2            7,422
  $y^2$                 107.3           11,515
  Both                  170.5           29,066
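The pattern in the last column is no accident: in exact arithmetic $\kappa(A^T A) = \kappa(A)^2$, which is one more reason for letting backslash work with $A$ directly instead of forming the normal equations. A one-line check for the current matrix A:

matlab>>
disp([cond(A)  cond(A'*A)  cond(A)^2])    % cond(A'*A) equals cond(A)^2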
Finally we also add a $y^3$ term:

\[ M(a,b,c,d,e) = a + bx + cy + dy^2 + ey^3 \]

matlab>>
sz   = size(X);
Bm   = reshape(X,prod(sz),1);
Cm   = reshape(Y,prod(sz),1);
Dm   = reshape(Y.*Y,prod(sz),1);
Em   = reshape(Y.*Y.*Y,prod(sz),1);
Am   = ones(size(Bm));
RHS  = reshape(Fxy,prod(sz),1);
A    = [Am Bm Cm Dm Em];
coef = A \ RHS;
fit  = coef(1) + coef(2)*X + coef(3)*Y + coef(4)*Y.*Y + coef(5)*Y.^3;

Figure: Fit (Model 4) and Error (Model 4). The fitting error for the least squares best-fit in the model space $M(a,b,c,d,e) = a + bx + cy + dy^2 + ey^3$. We now have a pretty good fit. The LSQ-error is now down to 0.9864.

Summary of the models:

  Model                              LSQ-error      $\kappa(A^T A)$
  $a + bx + cy$                       42,282              278
  $a + bx + cy + dxy$                 42,282            7,422
  $a + bx + cy + dy^2$                   578.8          11,515
  $a + bx + cy + ey^3$                     2.695       107,204
  $a + bx + cy + dy^2 + ey^3$              0.9864    1,873,124
At this point we can state the Linear Least Squares fitting problem in any number of dimensions, and we can use exotic models if we want to.

In 3D we need 10 parameters to fit a model with all linear and second order terms:

\[ M(a,b,c,d,e,f,g,h,i,j) = a + bx + cy + dz + ex^2 + fy^2 + gz^2 + hxy + ixz + jyz \]

With $n_x$, $n_y$, and $n_z$ data points in the $x$-, $y$-, and $z$-directions (respectively) we end up with a matrix $A$ of dimension $(n_x n_y n_z) \times 10$.
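A sketch of what the corresponding setup could look like in Matlab; the grid, the test function Fxyz, and all variable names below are made up for illustration, but the matrix has exactly the advertised $(n_x n_y n_z) \times 10$ shape, built one column per term just as in the 2D example.

matlab>>
x = 1:0.5:4;  y = 1:0.5:4;  z = 1:0.5:4;              % hypothetical grid
[X,Y,Z] = ndgrid(x,y,z);
Fxyz = 1 + sqrt(X) + Y.^3 - Z + 0.05*randn(size(X));  % hypothetical data
c = @(V) reshape(V,numel(V),1);                       % helper: array -> column
A = [c(ones(size(X))) c(X) c(Y) c(Z) ...
     c(X.^2) c(Y.^2) c(Z.^2) c(X.*Y) c(X.*Z) c(Y.*Z)];
RHS  = c(Fxyz);
coef = A \ RHS;                                       % the 10 model parameters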