Polynomial Curve Fitting in Machine Learning

This material was prepared from Pattern Recognition and Machine Learning by Christopher M. Bishop.


Polynomial Curve Fitting
Session objectives
By the end of the session, students should be able to:
• Understand the need for higher-degree polynomial regression for non-linear data.
• Implement polynomial curve fitting using Python.
• Understand overfitting and underfitting.
• Use metrics such as MSE and R² to evaluate model performance.
What is Meant by Non-linear Relationships?
Why It Matters in Machine Learning
A relationship is non-linear when the target cannot be modelled as a straight-line function of the input; many real-world signals (for example, a sine wave) have this character, so a straight line systematically underfits them.
A visual comparison of different types of relationships:
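The contrast between the two kinds of relationship can be made concrete with a small sketch (the data and seed below are illustrative, not from the slides): a degree-1 least-squares line fits linear data essentially perfectly but leaves a large residual error on sinusoidal data.

```python
import numpy as np

# Hypothetical data: one linear and one non-linear (sinusoidal) relationship
x = np.linspace(0, 1, 50)
y_linear = 2 * x + 1                 # straight-line relationship
y_nonlinear = np.sin(2 * np.pi * x)  # non-linear relationship

def line_fit_mse(x, y):
    """Mean squared error of the best-fitting straight line."""
    coeffs = np.polyfit(x, y, deg=1)          # least-squares line
    residuals = y - np.polyval(coeffs, x)
    return np.mean(residuals ** 2)

mse_lin = line_fit_mse(x, y_linear)      # essentially zero
mse_non = line_fit_mse(x, y_nonlinear)   # large: the line cannot follow the curve
print(mse_lin, mse_non)
```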
Steps in Polynomial Curve Fitting
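The usual steps are: collect training pairs (x, t), choose a polynomial degree M, fit the M+1 coefficients by least squares, then predict and evaluate. A minimal sketch of these steps using NumPy's `polyfit` (the session's full example uses scikit-learn; the data and seed here are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)

# Step 1: collect (x, t) training data -- noisy samples of sin(2*pi*x)
x = np.sort(rng.random(30))
t = np.sin(2 * np.pi * x) + rng.normal(0, 0.1, x.shape)

# Step 2: choose a polynomial degree M
M = 3

# Step 3: fit the coefficients w by least squares
w = np.polyfit(x, t, deg=M)  # returns M + 1 coefficients, highest degree first

# Step 4: predict and evaluate with mean squared error
y = np.polyval(w, x)
mse = np.mean((y - t) ** 2)
print(len(w), mse)
```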
Commonly used error function
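The error function referred to here is Bishop's sum-of-squares error. For a degree-M polynomial fitted to training points (x_n, t_n), the fit minimizes

```latex
E(\mathbf{w}) = \frac{1}{2} \sum_{n=1}^{N} \bigl\{ y(x_n, \mathbf{w}) - t_n \bigr\}^2,
\qquad
y(x, \mathbf{w}) = \sum_{j=0}^{M} w_j x^j ,
```

and results are often reported as the root-mean-square error \(E_{\mathrm{RMS}} = \sqrt{2E(\mathbf{w}^{*})/N}\), which is comparable across data sets of different size.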
Python code for polynomial curve fitting
import numpy as np
import matplotlib.pyplot as plt
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_squared_error

# Generate data
np.random.seed(0)
x = np.sort(np.random.rand(30))
y = np.sin(2 * np.pi * x) + np.random.normal(0, 0.1, x.shape)
x = x[:, np.newaxis]
y = y[:, np.newaxis]
Python code for polynomial curve fitting (continued)

# Try different polynomial degrees
degrees = [1, 3, 9]
plt.figure(figsize=(12, 4))
for i, degree in enumerate(degrees, 1):
    poly = PolynomialFeatures(degree)
    x_poly = poly.fit_transform(x)

    model = LinearRegression()
    model.fit(x_poly, y)
    y_pred = model.predict(x_poly)

    plt.subplot(1, 3, i)
    plt.scatter(x, y, color='red', label='Original Data')
    x_plot = np.linspace(0, 1, 100)[:, np.newaxis]
    x_plot_poly = poly.transform(x_plot)
    plt.plot(x_plot, model.predict(x_plot_poly), label=f'Degree {degree}')
    plt.title(f'Degree {degree} Fit')
    plt.legend()
plt.tight_layout()
plt.show()

Results
• Root mean squared error on held-out test data grows once the degree is large enough to overfit the training points.
• The typical magnitude of the fitted coefficients grows dramatically with the polynomial degree.
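The original slides presumably showed Bishop's figures for these results. The trend can be sketched as follows, using a small noisy data set (N = 10, as in Bishop's example) and a held-out test grid; the seed and test points are illustrative:

```python
import numpy as np
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)
x = np.sort(rng.random(10))[:, np.newaxis]            # N = 10 training points
y = np.sin(2 * np.pi * x).ravel() + rng.normal(0, 0.1, 10)

# Held-out test points from the true curve expose overfitting
x_test = np.linspace(0.05, 0.95, 50)[:, np.newaxis]
y_test = np.sin(2 * np.pi * x_test).ravel()

rmse, max_coef = {}, {}
for degree in [1, 3, 9]:
    poly = PolynomialFeatures(degree)
    model = LinearRegression().fit(poly.fit_transform(x), y)
    pred = model.predict(poly.transform(x_test))
    rmse[degree] = np.sqrt(mean_squared_error(y_test, pred))  # test RMSE
    max_coef[degree] = np.abs(model.coef_).max()              # coefficient size
    print(degree, rmse[degree], max_coef[degree])
```

Degree 1 underfits (high RMSE), degree 3 tracks the sine well, and degree 9 drives the coefficient magnitudes up sharply as it chases the noise.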
What can be done to prevent OVERFITTING?
REGULARIZATION
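Regularization adds a penalty on coefficient magnitude to the error function, discouraging the wild coefficients seen in the degree-9 fit. A minimal sketch using scikit-learn's `Ridge` (L2 penalty); the data, seed, and `alpha` value are illustrative choices, not prescriptions:

```python
import numpy as np
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression, Ridge
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)
x = np.sort(rng.random(10))[:, np.newaxis]
y = np.sin(2 * np.pi * x).ravel() + rng.normal(0, 0.1, 10)

x_test = np.linspace(0.05, 0.95, 50)[:, np.newaxis]
y_test = np.sin(2 * np.pi * x_test).ravel()

# Degree-9 features: enough flexibility to overfit 10 points
poly = PolynomialFeatures(9)
X, X_test = poly.fit_transform(x), poly.transform(x_test)

plain = LinearRegression().fit(X, y)       # unpenalized least squares
ridge = Ridge(alpha=1e-3).fit(X, y)        # alpha plays the role of lambda

rmse_plain = np.sqrt(mean_squared_error(y_test, plain.predict(X_test)))
rmse_ridge = np.sqrt(mean_squared_error(y_test, ridge.predict(X_test)))
print(rmse_plain, rmse_ridge)
```

The penalty shrinks the coefficient norm, trading a little training error for a much smoother curve and better test performance.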
Reference
• Pattern Recognition and Machine Learning by Christopher M. Bishop
