
Numerical Analysis and Computing
Lecture Notes #12: Approximation Theory
Chebyshev Polynomials & Least Squares, redux

Outline:
  Chebyshev Polynomials
    Orthogonal Polynomials
    Chebyshev Polynomials, Intro & Definitions
    Properties
  Least Squares, redux
    Examples
    More than one variable? No problem!

Joe Mahaffy, <mahaffy@math.sdsu.edu>
Department of Mathematics, Dynamical Systems Group
Computational Sciences Research Center
San Diego State University, San Diego, CA 92182-7720
http://www-rohan.sdsu.edu/~jmahaffy

Spring 2010
Orthogonal Polynomials: A Quick Summary

The ideas and techniques we developed (i.e. Gram-Schmidt orthogonalization with respect to a weight function over any interval) have applications far beyond least squares problems.

So far we have seen that the use of orthogonal polynomials can help us solve the normal equations which arise in discrete and continuous least squares problems, without the need for expensive and numerically difficult matrix inversions.

The Legendre Polynomials

The Legendre polynomials are solutions to the Legendre differential equation (which arises in numerous problems exhibiting spherical symmetry)

    $(1 - x^2)\,\frac{d^2y}{dx^2} - 2x\,\frac{dy}{dx} + \ell(\ell + 1)\,y = 0,$

or equivalently

    $\frac{d}{dx}\!\left[(1 - x^2)\,\frac{dy}{dx}\right] + \ell(\ell + 1)\,y = 0.$

Applications: Celestial Mechanics (Legendre's original application), Electrodynamics, etc...

The Legendre polynomials are orthogonal on the interval $[-1, 1]$ with respect to the weight function $w(x) = 1$. One curious property of the Legendre polynomials is that their roots (all real) yield the optimal node placement for Gaussian quadrature.


Other Orthogonal Polynomials

"Orthogonal polynomials have very useful properties in the solution of mathematical and physical problems. [... They] provide a natural way to solve, expand, and interpret solutions to many types of important differential equations. Orthogonal polynomials are especially easy to generate using Gram-Schmidt orthonormalization. The roots of orthogonal polynomials possess many rather surprising and useful properties."
(http://mathworld.wolfram.com/OrthogonalPolynomials.html)

The Laguerre Polynomials

The Laguerre polynomials are solutions to the Laguerre differential equation

    $x\,\frac{d^2y}{dx^2} + (1 - x)\,\frac{dy}{dx} + \lambda y = 0.$

They are associated with the radial solution to the Schrödinger equation for the Hydrogen atom's electron (Spherical Harmonics).
More Orthogonal Polynomials

    Polynomials        Interval             w(x)
    Chebyshev (1st)    [-1, 1]              (1 - x^2)^{-1/2}
    Chebyshev (2nd)    [-1, 1]              (1 - x^2)^{1/2}
    Gegenbauer         [-1, 1]              (1 - x^2)^{\alpha - 1/2}
    Hermite            (-\infty, \infty)    e^{-x^2}
    Jacobi             (-1, 1)              (1 - x)^\alpha (1 + x)^\beta
    Legendre           [-1, 1]              1
    Laguerre           [0, \infty)          e^{-x}
    Laguerre (assoc)   [0, \infty)          x^k e^{-x}

Note: these are the Hermite orthogonal polynomials, not to be confused with the Hermite interpolating polynomials...

Chebyshev Polynomials: The Sales Pitch.

Today we'll take a closer look at Chebyshev polynomials of the first kind, which can be defined, e.g., by the contour integral

    $T_n(z) = \frac{1}{2\pi i} \oint \frac{(1 - t^2)\, t^{-(n+1)}}{1 - 2tz + t^2}\, dt.$

Chebyshev polynomials are used to minimize approximation error. We will use them to solve the following problems:

[1] Find an optimal placement of the interpolating points $\{x_0, x_1, \ldots, x_n\}$ to minimize the error in Lagrange interpolation.

[2] Find a means of reducing the degree of an approximating polynomial with minimal loss of accuracy.

Chebyshev Polynomials: Definitions.

We could use the Gram-Schmidt orthogonalization process to find them, but it is easier to give the definition and then check the properties...

Definition (Chebyshev Polynomials)
For $x \in [-1, 1]$, define

    $T_n(x) = \cos(n \arccos x), \quad n \ge 0.$

The Chebyshev polynomials $\{T_n(x)\}$ are orthogonal on the interval $(-1, 1)$ with respect to the weight function $w(x) = 1/\sqrt{1 - x^2}$, i.e.

    $\langle T_i(x), T_j(x) \rangle_{w(x)} = \int_{-1}^{1} T_i(x)\, T_j(x)\, w(x)\, dx = \alpha_i\, \delta_{i,j}.$

We introduce the notation $\theta = \arccos x$, and get

    $T_n(\theta(x)) \equiv T_n(\theta) = \cos(n\theta), \quad \text{where } \theta \in [0, \pi].$

Note: $T_0(x) = \cos(0) = 1$ and $T_1(x) = x$.

Chebyshev Polynomials, $T_n(x)$, $n \ge 2$.

We can find a recurrence relation, using these observations:

    $T_{n+1}(\theta) = \cos((n+1)\theta) = \cos(n\theta)\cos(\theta) - \sin(n\theta)\sin(\theta)$
    $T_{n-1}(\theta) = \cos((n-1)\theta) = \cos(n\theta)\cos(\theta) + \sin(n\theta)\sin(\theta)$
    $T_{n+1}(\theta) + T_{n-1}(\theta) = 2\cos(n\theta)\cos(\theta).$

Returning to the original variable $x$, we have

    $T_{n+1}(x) = 2x \cos(n \arccos x) - T_{n-1}(x),$

or

    $T_{n+1}(x) = 2x\, T_n(x) - T_{n-1}(x).$
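The recurrence gives a convenient way to evaluate $T_n(x)$ without trigonometric calls. A minimal sketch (the function name chebT is ours), checked against the $\cos(n \arccos x)$ definition:

% chebT.m -- evaluate T_n at the points x via the
% three-term recurrence T_{n+1}(x) = 2x T_n(x) - T_{n-1}(x).
function T = chebT(n, x)
  Tm1 = ones(size(x));            % T_0(x) = 1
  if n == 0, T = Tm1; return; end
  T = x;                          % T_1(x) = x
  for k = 2:n
    Tnew = 2*x.*T - Tm1;          % recurrence step
    Tm1  = T;
    T    = Tnew;
  end
end

matlab>> x = linspace(-1, 1, 201);
         max(abs(chebT(5, x) - cos(5*acos(x))))   % agrees to ~1e-15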

The Chebyshev Polynomials

[Figure: The Chebyshev polynomials $T_1(x)$ through $T_5(x)$ on $[-1, 1]$; each oscillates between $-1$ and $1$.]

Orthogonality of the Chebyshev Polynomials, I

    $\int_{-1}^{1} \frac{T_n(x)\, T_m(x)}{\sqrt{1 - x^2}}\, dx = \int_{-1}^{1} \frac{\cos(n \arccos x)\cos(m \arccos x)}{\sqrt{1 - x^2}}\, dx.$

Reintroducing $\theta = \arccos x$ gives

    $d\theta = -\frac{dx}{\sqrt{1 - x^2}},$

and the integral becomes

    $-\int_{\pi}^{0} \cos(n\theta)\cos(m\theta)\, d\theta = \int_{0}^{\pi} \cos(n\theta)\cos(m\theta)\, d\theta.$

Now, we use the fact that

    $\cos(n\theta)\cos(m\theta) = \frac{\cos((n+m)\theta) + \cos((n-m)\theta)}{2} \ldots$

Orthogonality of the Chebyshev Polynomials, II

We have:

    $\int_{0}^{\pi} \cos(n\theta)\cos(m\theta)\, d\theta = \int_{0}^{\pi} \frac{\cos((n+m)\theta) + \cos((n-m)\theta)}{2}\, d\theta.$

If $m \ne n$, we get

    $\left[ \frac{1}{2(n+m)} \sin((n+m)\theta) + \frac{1}{2(n-m)} \sin((n-m)\theta) \right]_0^{\pi} = 0;$

if $m = n$, we have

    $\left[ \frac{1}{2(n+m)} \sin((n+m)\theta) + \frac{\theta}{2} \right]_0^{\pi} = \frac{\pi}{2}.$

Hence, the Chebyshev polynomials are orthogonal.
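A minimal numerical sanity check of these integrals (using quad; the specific n and m are arbitrary):

matlab>> n = 3; m = 5;
         quad(@(t) cos(n*t).*cos(m*t), 0, pi)   % ~0    for m ~= n
         quad(@(t) cos(n*t).^2,        0, pi)   % pi/2  for m = n >= 1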

Zeros and Extrema of Chebyshev Polynomials.

Theorem
The Chebyshev polynomial of degree $n \ge 1$ has $n$ simple zeros in $[-1, 1]$ at

    $\bar{x}_k = \cos\!\left( \frac{2k - 1}{2n}\,\pi \right), \quad k = 1, \ldots, n.$

Moreover, $T_n(x)$ assumes its absolute extrema at

    $\bar{x}'_k = \cos\!\left( \frac{k\pi}{n} \right), \quad \text{with } T_n(\bar{x}'_k) = (-1)^k, \quad k = 1, \ldots, n - 1.$

Proof.
Let $\bar{x}_k = \cos\!\left( \frac{2k - 1}{2n}\,\pi \right)$. Then

    $T_n(\bar{x}_k) = \cos(n \arccos(\bar{x}_k)) = \cos\!\left( n \arccos\!\left( \cos\!\left( \frac{2k-1}{2n}\,\pi \right) \right) \right) = \cos\!\left( \frac{2k-1}{2}\,\pi \right) = 0.$

For the extrema, differentiate:

    $T'_n(x) = \frac{d}{dx}\left[ \cos(n \arccos(x)) \right] = \frac{n \sin(n \arccos(x))}{\sqrt{1 - x^2}},$

so that, with $\bar{x}'_k = \cos\!\left( \frac{k\pi}{n} \right)$,

    $T'_n(\bar{x}'_k) = \frac{n \sin\!\left( n \arccos\!\left( \cos\!\left( \frac{k\pi}{n} \right) \right) \right)}{\sqrt{1 - \cos^2\!\left( \frac{k\pi}{n} \right)}} = \frac{n \sin(k\pi)}{\sin\!\left( \frac{k\pi}{n} \right)} = 0,$

and

    $T_n(\bar{x}'_k) = \cos\!\left( n \arccos\!\left( \cos\!\left( \frac{k\pi}{n} \right) \right) \right) = \cos(k\pi) = (-1)^k.$

Monic Chebyshev Polynomials, I

Definition (Monic Polynomial)
A monic polynomial is a polynomial with leading coefficient 1.

We get the monic Chebyshev polynomials $\tilde{T}_n(x)$ by dividing $T_n(x)$ by $2^{n-1}$, $n \ge 1$. Hence,

    $\tilde{T}_0(x) = 1, \qquad \tilde{T}_n(x) = \frac{1}{2^{n-1}}\, T_n(x), \quad n \ge 1.$

They satisfy the following recurrence relations:

    $\tilde{T}_2(x) = x \tilde{T}_1(x) - \tfrac{1}{2} \tilde{T}_0(x),$
    $\tilde{T}_{n+1}(x) = x \tilde{T}_n(x) - \tfrac{1}{4} \tilde{T}_{n-1}(x), \quad n \ge 2.$

Payoff: No matter what the degree of the polynomial, the oscillations are kept under control!!!

Monic Chebyshev Polynomials, II

The location of the zeros and extrema of $\tilde{T}_n(x)$ coincides with those of $T_n(x)$; however, the extreme values are

    $\tilde{T}_n(\bar{x}'_k) = \frac{(-1)^k}{2^{n-1}}, \quad k = 1, \ldots, n - 1.$

The Monic Chebyshev Polynomials

[Figure: The monic Chebyshev polynomials $\tilde{T}_1(x)$ through $\tilde{T}_5(x)$ on $[-1, 1]$.]

Definition
Let $\tilde{\mathcal{P}}_n$ denote the set of all monic polynomials of degree $n$.

Theorem (Min-Max)
The monic Chebyshev polynomials $\tilde{T}_n(x)$ have the property that

    $\frac{1}{2^{n-1}} = \max_{x \in [-1,1]} |\tilde{T}_n(x)| \le \max_{x \in [-1,1]} |\tilde{P}_n(x)|, \quad \forall \tilde{P}_n(x) \in \tilde{\mathcal{P}}_n.$

Moreover, equality can only occur if $\tilde{P}_n(x) \equiv \tilde{T}_n(x)$.
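As an illustration of the min-max property, we can compare the monic Chebyshev polynomial against, say, the monic polynomial with equi-spaced roots (a sketch; any other monic polynomial would do):

matlab>> n  = 5;
         xv = linspace(-1, 1, 1001);
         rC = cos((2*(1:n) - 1)*pi/(2*n));   % roots of T_n, hence of T~_n
         rE = linspace(-1, 1, n);            % equi-spaced roots, for comparison
         Tt = ones(size(xv)); Pn = ones(size(xv));
         for k = 1:n
           Tt = Tt .* (xv - rC(k));          % monic Chebyshev T~_n(x)
           Pn = Pn .* (xv - rE(k));          % competing monic polynomial
         end
         disp([max(abs(Tt)) 2^(1-n) max(abs(Pn))])
         % max|T~_5| matches 1/2^4 = 0.0625 and is smaller than max|P_5|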

Optimal Node Placement in Lagrange Interpolation, I

If $x_0, x_1, \ldots, x_n$ are distinct points in the interval $[-1, 1]$, $f \in C^{n+1}[-1, 1]$, and $P(x)$ is the $n$th degree interpolating Lagrange polynomial, then $\forall x \in [-1, 1]$ $\exists\, \xi(x) \in (-1, 1)$ so that

    $f(x) - P(x) = \frac{f^{(n+1)}(\xi(x))}{(n+1)!} \prod_{k=0}^{n} (x - x_k).$

We have no control over $f^{(n+1)}(\xi(x))$, but we can place the nodes in a clever way so as to minimize the maximum of $\prod_{k=0}^{n} (x - x_k)$. Since $\prod_{k=0}^{n} (x - x_k)$ is a monic polynomial of degree $(n+1)$, we know the min-max is obtained when the nodes are chosen so that

    $\prod_{k=0}^{n} (x - x_k) = \tilde{T}_{n+1}(x), \quad \text{i.e.} \quad \bar{x}_k = \cos\!\left( \frac{2k + 1}{2(n+1)}\,\pi \right).$

Optimal Node Placement in Lagrange Interpolation, II

Theorem
If $P(x)$ is the interpolating polynomial of degree at most $n$ with nodes at the roots of $T_{n+1}(x)$, then

    $\max_{x \in [-1,1]} |f(x) - P(x)| \le \frac{1}{2^n (n+1)!} \max_{x \in [-1,1]} |f^{(n+1)}(x)|, \quad \forall f \in C^{n+1}[-1, 1].$

Extending to any interval: The transformation

    $\tilde{x} = \frac{1}{2}\left[ (b - a)x + (a + b) \right]$

transforms the nodes $\bar{x}_k$ in $[-1, 1]$ into the corresponding nodes $\tilde{x}_k$ in $[a, b]$.
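In code, generating the $(n+1)$ optimal interpolation nodes on an arbitrary interval $[a, b]$ takes two lines (a sketch; the values of n, a, b are arbitrary):

matlab>> n = 4; a = 0; b = 1;
         xbar = cos((2*(0:n) + 1)*pi/(2*(n + 1)));   % roots of T_{n+1} on [-1, 1]
         xk   = ((b - a)*xbar + (a + b))/2;          % transformed nodes on [a, b]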

Example: Interpolating $f(x) = x^2 e^x$.

[Figure: $f(x)$, its degree-2 Taylor polynomial, and the degree-2 Lagrange interpolants with equi-spaced and optimal (Chebyshev) nodes.]

Example: Interpolating $f(x) = x^2 e^x$, The Error.

[Figure: The deviation from $f(x)$ for equi-spaced vs. optimal nodes, for interpolating polynomials of degree 2, 3, 4, and 5.]
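The comparison in these figures can be reproduced along the following lines (a sketch; we assume the interval [0, 1] and use polyfit/polyval for the interpolation):

matlab>> f  = @(x) x.^2 .* exp(x);
         n  = 2;  a = 0;  b = 1;
         xe = linspace(a, b, n+1);                               % equi-spaced nodes
         xc = ((b-a)*cos((2*(0:n)+1)*pi/(2*(n+1))) + (a+b))/2;   % optimal nodes
         xv = linspace(a, b, 501);
         pe = polyfit(xe, f(xe), n);                             % degree-n interpolants
         pc = polyfit(xc, f(xc), n);
         plot(xv, f(xv) - polyval(pe, xv), 'k--', ...
              xv, f(xv) - polyval(pc, xv), 'k-');
         legend('Deviation: EquiSpaced', 'Deviation: Optimal');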

Least Squares: Revisited

Introduction

Before we move on to new and exciting orthogonal polynomials with exotic names... let's take a moment (or two) and look at the usage of Least Squares Approximation.

This section is a how-to with quite a few applied examples of least squares approximation...

Example #1: Warm-up (1 of 3)

First we consider the problem of fitting 1st, 2nd, and 3rd degree polynomials to the following data:

matlab>> x = [1.0 1.1 1.3 1.5 1.9 2.1]';
         y = [1.84 1.90 2.31 2.65 2.74 3.18]';
         % First we define the matrices
         A1 = [ones(size(x)) x];
         A2 = [A1 x.*x];
         A3 = [A2 x.*x.*x];
         % Then we solve the Normal Equations
         pcoef1 = A1\y;
         pcoef2 = A2\y;
         pcoef3 = A3\y;

Note: The matrices A1, A2, and A3 are tall and skinny. Normally we would compute $(A_n^T A_n)^{-1} (A_n^T y)$; however, when matlab encounters An\y it automatically gives us a solution in the least squares sense.

Example #1: Warm-up (2 of 3)

We now have the coefficients for the polynomials; let's plot:

matlab>> xv = 1.0:0.01:2.1;
         p1 = polyval(flipud(pcoef1), xv);
         p2 = polyval(flipud(pcoef2), xv);
         p3 = polyval(flipud(pcoef3), xv);
         plot(xv, p3, 'k-', 'linewidth', 3); hold on;
         plot(x, y, 'ko', 'linewidth', 3); hold off

Figure: The least squares polynomials p1(x), p2(x), and p3(x): the optimal linear, quadratic, and cubic approximations, with LSQerror 0.087656, 0.069900, and 0.044729, respectively.
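For comparison, forming the Normal Equations explicitly gives the same coefficients, up to rounding (a sketch):

matlab>> pcoefNE = (A1'*A1) \ (A1'*y);      % explicit Normal Equations
         disp(max(abs(pcoefNE - pcoef1)))   % agreement to roundoff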


Example #1: Warm-up (3 of 3)

Finally, we compute the error

matlab>> p1err = polyval(flipud(pcoef1), x) - y;
         p2err = polyval(flipud(pcoef2), x) - y;
         p3err = polyval(flipud(pcoef3), x) - y;
         disp([sum(p1err.*p1err) sum(p2err.*p2err) sum(p3err.*p3err)])

Which gives us the fitting errors

    P1Err = 0.0877,   P2Err = 0.0699,   P3Err = 0.0447.

Example #2: Something More Exotic (1 of 2)

Consider the same data:

    x = [1.0 1.1 1.3 1.5 1.9 2.1]'
    y = [1.84 1.90 2.31 2.65 2.74 3.18]'

But let's find the best fit of the form $a + b\sqrt{x}$ to this data! Notice that this expression is linear in its parameters $a$, $b$, so we can solve the corresponding least squares problem!

matlab>> A = [ones(size(x)) sqrt(x)];
         pcoef = A\y;
         xv = 1.0:0.01:2.1;
         fv = pcoef(1) + pcoef(2)*sqrt(xv);
         plot(xv, fv, 'k-', 'linewidth', 3); hold on;
         plot(x, y, 'ko', 'linewidth', 3); hold off;

Example #2: Something More Exotic (2 of 2)

Figure: The best fit of the form $a + b\sqrt{x}$; LSQerror: 0.074920.

We compute the fitting error:

matlab>> ferr = pcoef(1) + pcoef(2)*sqrt(x) - y;
         disp(sum(ferr.*ferr))

Which gives us

    $\mathrm{Err}_{a + b\sqrt{x}} = 0.0749.$

Getting Even More Exotic? (1 of 2)

As long as the model is linear in its parameters, we can solve the least squares problem. (Non-linear dependence will have to wait until Math 693a.)

We can fit this model:

    $M_1(a, b, c, d) = a + b x^{3/2} + c/\sqrt{x} + d e^{\sin(x)}$

Just define the matrix

matlab>> A = [ones(size(x)) x.^(3/2) 1./sqrt(x) exp(sin(x))];

and compute the coefficients

matlab>> coef = A\y;

etc...
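The elided steps mirror Example #2; presumably something like the following sketch:

matlab>> xv = 1.0:0.01:2.1;
         fv = coef(1) + coef(2)*xv.^(3/2) + coef(3)./sqrt(xv) + coef(4)*exp(sin(xv));
         plot(xv, fv, 'k-', 'linewidth', 3); hold on;
         plot(x, y, 'ko', 'linewidth', 3); hold off;
         ferr = coef(1) + coef(2)*x.^(3/2) + coef(3)./sqrt(x) + coef(4)*exp(sin(x)) - y;
         disp(sum(ferr.*ferr))   % the total LSQ-error of the fit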


Getting Even More Exotic? (2 of 2)

Figure: The optimal "exotic" model fit; LSQerror: 0.062771.

The optimal approximation of this form is

    $M_1(x) \approx 16.4133 - \frac{11.0059}{\sqrt{x}} - 0.9970\, x^{3/2} - 1.1332\, e^{\sin(x)}.$

Getting Multi-Dimensional

It seems quite unlikely the model

    $M_1(a, b, c, d) = a + b x^{3/2} + c/\sqrt{x} + d e^{\sin(x)}$

will ever be useful.

However, we have forgotten about one important aspect of the problem: so far our models have depended on only one variable, $x$.

How do we go about fitting multi-dimensional data?

Example: Going 2D (1 of 10)

We generate a synthetic 2D data set:

matlab>> x = 1:0.25:5;
         y = 1:0.25:5;
         [X, Y] = meshgrid(x, y);
         Fxy = 1 + sqrt(X) + Y.^3 + 0.05*randn(size(X));
         surf(x, y, Fxy)

Figure: 2D-data set; the vertexes on the surface are our data points.

Example: Going 2D (2 of 10)

Let's try to fit a simple 3-parameter model to this data

    $M(a, b, c) = a + bx + cy$

matlab>> sz   = size(X);
         Bm   = reshape(X, prod(sz), 1);
         Cm   = reshape(Y, prod(sz), 1);
         Am   = ones(size(Bm));
         RHS  = reshape(Fxy, prod(sz), 1);
         A    = [Am Bm Cm];
         coef = A \ RHS;
         fit  = coef(1) + coef(2)*X + coef(3)*Y;
         fitError = Fxy - fit;
         surf(x, y, fitError)


Example: Going 2D (3 of 10)

Figure: The optimal model fit, and the fitting error for the least squares best-fit in the model space M(a, b, c) = a + bx + cy. Here, the total LSQ-error is 42,282.

Example: Going 2D (4 of 10)

Let's try to fit a simple 4-parameter (bi-linear) model to this data

    $M(a, b, c, d) = a + bx + cy + dxy$

matlab>> sz   = size(X);
         Bm   = reshape(X, prod(sz), 1);
         Cm   = reshape(Y, prod(sz), 1);
         Dm   = reshape(X.*Y, prod(sz), 1);
         Am   = ones(size(Bm));
         RHS  = reshape(Fxy, prod(sz), 1);
         A    = [Am Bm Cm Dm];
         coef = A \ RHS;
         fit  = coef(1) + coef(2)*X + coef(3)*Y + coef(4)*X.*Y;
         fitError = Fxy - fit;
         surf(x, y, fitError)

Example: Going 2D (5 of 10)

Figure: The model fit and the fitting error for the least squares best-fit in the model space M(a, b, c, d) = a + bx + cy + dxy. Still a pretty bad fit; here, the total LSQ-error is still 42,282.

Example: Going 2D (6 of 10)

Since the main problem is in the y-direction, we try a 4-parameter model with a quadratic term in y

    $M(a, b, c, d) = a + bx + cy + dy^2$

matlab>> sz   = size(X);
         Bm   = reshape(X, prod(sz), 1);
         Cm   = reshape(Y, prod(sz), 1);
         Dm   = reshape(Y.*Y, prod(sz), 1);
         Am   = ones(size(Bm));
         RHS  = reshape(Fxy, prod(sz), 1);
         A    = [Am Bm Cm Dm];
         coef = A \ RHS;
         fit  = coef(1) + coef(2)*X + coef(3)*Y + coef(4)*Y.*Y;
         fitError = Fxy - fit;
         surf(x, y, fitError)

Example: Going 2D (7 of 10)

Figure: The model fit and the fitting error for the least squares best-fit in the model space M(a, b, c, d) = a + bx + cy + dy^2. We see a significant drop in the error (one order of magnitude); the total LSQ-error has dropped to 578.8.

Example: Going 2D (7 1/2 of 10)

Why not always add both the xy and y^2 terms?

We notice something interesting: the addition of the xy-term to the model did not produce a drop in the LSQ-error. However, the y^2-term allowed us to capture a lot more of the action.

The change in the LSQ-error as a function of an added term is one way to decide what is a useful addition to the model.

    Added term    xy       y^2      Both
    κ(A)          86.2     107.3    170.5
    κ(A^T A)      7,422    11,515   29,066

Table: Condition numbers for the A-matrices (and associated Normal Equations) for the different models.
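The numbers in this table can be computed directly from the model matrices; e.g., for the y^2 model (a sketch, assuming the matrix A built above is still in the workspace):

matlab>> disp([cond(A) cond(A'*A)])
         % for the y^2 model this should give roughly [107.3 11515];
         % note cond(A'*A) = cond(A)^2, so forming the Normal Equations
         % explicitly squares the conditioning.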

Example: Going 2D (8 of 10)

We fit a 5-parameter model with quadratic and cubic terms in y

    $M(a, b, c, d, e) = a + bx + cy + dy^2 + ey^3$

matlab>> sz   = size(X);
         Bm   = reshape(X, prod(sz), 1);
         Cm   = reshape(Y, prod(sz), 1);
         Dm   = reshape(Y.*Y, prod(sz), 1);
         Em   = reshape(Y.*Y.*Y, prod(sz), 1);
         Am   = ones(size(Bm));
         RHS  = reshape(Fxy, prod(sz), 1);
         A    = [Am Bm Cm Dm Em];
         coef = A \ RHS;
         fit  = coef(1) + coef(2)*X + coef(3)*Y + coef(4)*Y.*Y + coef(5)*Y.^3;
         fitError = Fxy - fit;
         surf(x, y, fitError)

Example: Going 2D (9 of 10)

Figure: The model fit and the fitting error for the least squares best-fit in the model space M(a, b, c, d, e) = a + bx + cy + dy^2 + ey^3. We now have a pretty good fit; the LSQ-error is now down to 0.9864.

Example: Going 2D (10 of 10)

    Model                          LSQ-error   κ(A^T A)
    a + bx + cy                    42,282      278
    a + bx + cy + dxy              42,282      7,422
    a + bx + cy + dy^2             578.8       11,515
    a + bx + cy + ey^3             2.695       107,204
    a + bx + cy + dy^2 + ey^3      0.9864      1,873,124

Table: Summary of LSQ-error and conditioning of the Normal Equations for the various models. We notice that additional columns in the A-matrix (additional model parameters) have a severe effect on the conditioning of the Normal Equations.
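The LSQ-errors tabulated here are the total sums of squared residuals; for any of the fits above they can be computed with (a sketch):

matlab>> disp([sum(fitError(:).^2) cond(A'*A)])
         % for the 5-parameter model: ~0.9864 and ~1.87e6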


Moving to Even Higher Dimensions

At this point we can state the Linear Least Squares fitting problem in any number of dimensions, and we can use exotic models if we want to.

In 3D we need 10 parameters to fit a model with all linear and second order terms:

    $M(a, b, c, d, e, f, g, h, i, j) = a + bx + cy + dz + ex^2 + fy^2 + gz^2 + hxy + ixz + jyz$

With $n_x$, $n_y$, and $n_z$ data points in the x-, y-, and z-directions (respectively), we end up with a matrix A of dimension $(n_x n_y n_z) \times 10$.
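A sketch of how the corresponding A-matrix can be assembled in MATLAB (the grid and the data Fxyz are made-up stand-ins):

matlab>> [X, Y, Z] = meshgrid(1:0.25:5, 1:0.25:5, 1:0.25:5);
         Fxyz = 1 + sqrt(X) + Y.^3 + Z.^2;            % assumed test data
         c    = @(M) reshape(M, numel(M), 1);         % flatten to a column
         A    = [c(ones(size(X))) c(X) c(Y) c(Z) c(X.^2) c(Y.^2) c(Z.^2) ...
                 c(X.*Y) c(X.*Z) c(Y.*Z)];            % (nx*ny*nz) x 10
         coef = A \ c(Fxyz);                          % least squares fit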

Ill-conditioning of the Normal Equations

Needless(?) to say, the normal equations can be quite ill-conditioned in this case. The ill-conditioning can be eased by searching for a set of orthogonal functions with respect to the inner product

    $\langle f(x, y, z), g(x, y, z) \rangle = \int_{x_a}^{x_b} \int_{y_a}^{y_b} \int_{z_a}^{z_b} f(x, y, z)\, g(x, y, z)\, dx\, dy\, dz.$

That's *sometimes* possible, but we'll leave the details as an exercise for a dark and stormy night...
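In practice one can also sidestep the Normal Equations entirely: for rectangular systems, MATLAB's backslash operator solves the least squares problem via a QR factorization, so the relevant condition number is κ(A) rather than κ(A^T A). A minimal sketch of the equivalent explicit computation (for full-rank A):

matlab>> [Q, R] = qr(A, 0);       % economy-size QR factorization
         coef   = R \ (Q'*RHS);   % same least squares solution as A\RHS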
