Non-Linear Regression Analysis

The document summarizes three non-linear regression analyses: 1) Finding coefficients a and b for a model using least squares regression. 2) Comparing data to an exponential decay model using least squares regression. 3) Comparing data to an Arrhenius model using multiple linear regression in two different ways.


TP6 : NON-LINEAR REGRESSION

1. Find a and b of the model r = a/(b + t) using least squares regression. Taking the reciprocal linearizes the model as 1/r = b/a + (1/a)·t, so 1/r can be fitted as a straight line in t.

Data: t = (10, 20, 30, 40, 50) and r = (0.29, 0.22, 0.18, 0.15, 0.13)

Code

t=[10;20;30;40;50]
r=[0.29;0.22;0.18;0.15;0.13]
y=1./r                        % linearized response: y = 1/r
figure
plot(t,y,'*r');hold on
%least square regression
X=[ones(length(t),1),t]       % design matrix [1 t]
phi=inv(X'*X)*X'*y            % normal-equation solution [b/a; 1/a]
ymodel=X*phi
plot(t,ymodel)
err=sum((y-ymodel).^2)        % sum of squared residuals

Error

err =

0.0021

Coefficient

phi =

2.3989
0.1061

A0 = 2.3989 = b/a and A1 = 0.1061 = 1/a

So a = (0.1061)^-1 ≈ 9.43 and b = 22.61

Figure: data points (1/r vs t, red markers) with the fitted regression line
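The fitted coefficients can be turned back into the model parameters and checked against the raw data. The short sketch below is not part of the original script; it assumes t, r and phi are still in the workspace and uses MATLAB's built-in polyfit only as a cross-check:

a=1/phi(2)                    % a = 1/A1, approximately 9.43
b=phi(1)*a                    % b = A0*a, approximately 22.61
p=polyfit(t,1./r,1)           % cross-check: p = [1/a, b/a], i.e. phi in reversed order
figure
plot(t,r,'*r');hold on        % raw data r vs t
tt=linspace(10,50,100);
plot(tt,a./(b+tt))            % reconstructed model r = a/(b + t)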

2. Compare the data to the model qe = KL·Ce^(−n). Taking logarithms linearizes the model as ln(qe) = ln(KL) − n·ln(Ce).

Data: Ce = (0.09, 0.15, 0.55, 0.35, 0.97, 1.55, 1.78, 2.35, 3.18) and
qe = (4.18, 3.07, 1.85, 3.75, 2.95, 1.05, 3.89, 1.99, 1.01)

Code

c=[0.09;0.15;0.55;0.35;0.97;1.55;1.78;2.35;3.18]
q=[4.18;3.07;1.85;3.75;2.95;1.05;3.89;1.99;1.01]
x=log(c)                      % linearized predictor: ln(Ce)
y=log(q)                      % linearized response: ln(qe)
figure
plot(x,y,'*r');hold on
%least square regression
X=[ones(length(c),1),x]       % design matrix [1 ln(Ce)]
phi=inv(X'*X)*X'*y            % normal-equation solution [ln(KL); -n]
ymodel=X*phi
plot(x,ymodel)
err=sum((y-ymodel).^2)        % sum of squared residuals

Error

err =

1.4291

Coefficient

phi =

0.7625
-0.2752

Figure: data points (ln(qe) vs ln(Ce), red markers) with the fitted regression line
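From the linearized fit ln(qe) = ln(KL) − n·ln(Ce), the parameters follow as KL = exp(phi(1)) and n = −phi(2). The sketch below is an addition, not part of the original script; it assumes c, q and phi are still in the workspace and plots the recovered power-law model against the raw data:

KL=exp(phi(1))                % KL = exp(0.7625), approximately 2.14
n=-phi(2)                     % n = 0.2752
figure
plot(c,q,'*r');hold on        % raw data qe vs Ce
cc=linspace(min(c),max(c),100);
plot(cc,KL*cc.^(-n))          % model curve qe = KL*Ce^(-n)

The relatively large residual (err = 1.4291 on the log scale) indicates that this data set follows the power-law model only loosely.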

3. Compare the data to the Arrhenius-type model r = K0·e^(−E/RT)·C^n. Taking logarithms linearizes the model as ln(r) = ln(K0) − (E/R)·(1/T) + n·ln(C), which is linear in 1/T and ln(C) and can be fitted by multiple linear regression.

Data: reaction rate r measured at five temperatures and four concentrations

          400 K    450 K    500 K    550 K    600 K
1 M       1.48     1.67     1.86     1.96     2.16
2 M       2.35     2.79     3.07     3.37     3.62
3 M       3.28     3.78     4.24     4.48     5.00
4 M       4.12     4.64     5.15     5.76     6.08

Method 1

Code

c=[1;2;3;4;1;2;3;4;1;2;3;4;1;2;3;4;1;2;3;4]
t=[400;400;400;400;450;450;450;450;500;500;500;500;550;550;550;550;600;600;600;600]
r=[1.48;2.35;3.28;4.12;1.67;2.79;3.78;4.64;1.86;3.07;4.24;5.15;1.96;3.37;4.48;5.76;2.16;3.62;5;6.08]
y=log(r)                      % linearized response: ln(r)
x=1./t                        % predictor 1/T
u=log(c)                      % predictor ln(C)
hold on
%least square regression
X=[ones(length(c),1),x,u]     % design matrix [1 1/T ln(C)]
phi=inv(X'*X)*X'*y            % normal-equation solution [ln(K0); -E/R; n]
ymodel=X*phi
err=sum((y-ymodel).^2)        % sum of squared residuals

Error

err =

0.0039

Coefficient

phi =

1.5704
-480.1366
0.7479
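Since the linearized model is ln(r) = ln(K0) − (E/R)·(1/T) + n·ln(C), the physical parameters follow directly from phi. The sketch below is an addition to the original script, and the gas constant R = 8.314 J/(mol·K) is an assumption about the units of E:

R=8.314;                      % gas constant in J/(mol*K) -- assumed units
K0=exp(phi(1))                % K0 = exp(1.5704), approximately 4.81
E=-phi(2)*R                   % E = 480.1366*R, approximately 3.99e3 J/mol
n=phi(3)                      % n = 0.7479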

Method 2

Code

c=[1;2;3;4]                   % concentrations (M)
t=[400,450,500,550,600]       % temperatures (K)
r=[1.48,1.67,1.86,1.96,2.16;2.35,2.79,3.07,3.37,3.62;3.28,3.78,4.24,4.48,5.00;4.12,4.64,5.15,5.76,6.08]
c=repmat(c,5,1)               % replicate concentrations for each temperature
t=reshape(repmat(t,4,1),20,1) % replicate temperatures for each concentration
xdata=[c,t]
ydata=reshape(r,20,1)         % flatten the rate table column-wise
x=1./xdata(:,2)               % predictor 1/T
u=log(xdata(:,1))             % predictor ln(C)
y=log(ydata)                  % linearized response: ln(r)
X=[ones(length(ydata),1),x,u] % design matrix [1 1/T ln(C)]
phi=inv(X'*X)*X'*y            % normal-equation solution [ln(K0); -E/R; n]
ymodel=X*phi
err=sum((y-ymodel).^2)        % sum of squared residuals
d=[xdata,ydata]               % combined data table [C T r]

Error

err =

0.0039

Coefficient

phi =

1.5704
-480.1366
0.7479
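Both methods linearize the model before regressing, so they give identical coefficients. As an optional cross-check, the sketch below (not part of the original script; it assumes phi, xdata and ydata from the code above are in the workspace, that the Optimization Toolbox's lsqcurvefit is available, and that R = 8.314 J/(mol·K) is the intended unit convention) fits the Arrhenius model directly in its non-linear form, starting from the linearized estimates:

R=8.314;                                                  % gas constant, J/(mol*K) -- assumed units
model=@(p,X) p(1)*exp(-p(2)./(R*X(:,2))).*X(:,1).^p(3);   % X = [C T], p = [K0 E n]
p0=[exp(phi(1)),-phi(2)*R,phi(3)];                        % start from the linearized estimates
p=lsqcurvefit(model,p0,xdata,ydata)                       % direct non-linear least squares on r itself

If the toolbox is not available, minimising sum((ydata-model(p,xdata)).^2) with fminsearch serves the same purpose.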
