Feature Scaling Techniques in ML

The document discusses various feature scaling techniques in machine learning, including Absolute Maximum Scaling, Min-Max Scaling, Standardization, and Robust Scaling. Each method is designed to transform data to improve algorithm performance, with specific formulas provided for each technique. The importance of scaling in handling different data distributions and outliers is emphasized.
Feature Scaling in Machine Learning

• Absolute Maximum Scaling: Absolute Maximum (MaxAbs) Scaling divides each feature
by its maximum absolute value, preserving the sign of the data. It is especially
useful when a feature contains both positive and negative values, since the scaled
values fall in the range [-1, 1].

• Formula: x_scaled = x / max(|x|)
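A minimal NumPy sketch of this formula, using an assumed toy feature whose largest magnitude is 50:

```python
import numpy as np

# Hypothetical feature with mixed signs; the maximum absolute value is 50.
x = np.array([-50.0, -10.0, 0.0, 25.0])

# MaxAbs scaling: divide by the maximum absolute value, preserving sign.
x_scaled = x / np.max(np.abs(x))

print(x_scaled)  # values: -1, -0.2, 0, 0.5 — all within [-1, 1]
```

Note that the sign and the relative spacing of the values are unchanged; only the magnitude is rescaled.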
• Min-Max Scaling: Min-Max Scaling is a feature scaling technique that transforms
the data into a specific range, typically [0, 1]. This is achieved by subtracting the
minimum value of the feature and dividing by the range (the difference between the
maximum and minimum values of the feature).

• Formula: x_scaled = (x - min(x)) / (max(x) - min(x))
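The same formula sketched in NumPy, on an assumed toy feature spanning 10 to 50:

```python
import numpy as np

# Hypothetical feature ranging from 10 to 50.
x = np.array([10.0, 20.0, 30.0, 50.0])

# Min-Max scaling: subtract the minimum, divide by the range.
x_scaled = (x - x.min()) / (x.max() - x.min())

print(x_scaled)  # values: 0, 0.25, 0.5, 1 — mapped onto [0, 1]
```

The minimum always maps to 0 and the maximum to 1, which is why a single extreme outlier can compress all other values into a narrow band.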
• Standardization: Standardization (also known as Z-score normalization) is a
technique used to scale features so that they have a mean of 0 and a standard deviation
of 1. This is done by subtracting the mean of the feature and dividing by the standard
deviation.

• Formula: z = (x - μ) / σ, where μ is the feature mean and σ its standard deviation
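A short NumPy sketch of standardization on an assumed toy feature, checking that the result has mean 0 and standard deviation 1:

```python
import numpy as np

# Hypothetical feature values.
x = np.array([2.0, 4.0, 6.0, 8.0])

# Standardization: subtract the mean, divide by the (population) std.
z = (x - x.mean()) / x.std()

print(z.mean(), z.std())  # approximately 0.0 and 1.0
```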
• Robust Scaling: Robust Scaling is a feature scaling technique that uses the median
and interquartile range (IQR) to scale the data. It is robust to outliers because it does
not rely on the mean and standard deviation, which can be significantly affected by
outliers. Instead, it scales features based on the median and the IQR, making it more
resistant to the influence of extreme values.

• Formula: x_scaled = (x - median(x)) / IQR(x), where IQR = Q3 - Q1 (the 75th minus the 25th percentile)
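A NumPy sketch of robust scaling, using an assumed toy feature that contains one extreme outlier (1000) to show that the median and IQR are barely affected by it:

```python
import numpy as np

# Hypothetical feature: four ordinary values plus one extreme outlier.
x = np.array([1.0, 2.0, 3.0, 4.0, 1000.0])

median = np.median(x)                # 3.0 — unaffected by the outlier
q1, q3 = np.percentile(x, [25, 75])  # 2.0 and 4.0

# Robust scaling: center on the median, divide by the IQR.
x_scaled = (x - median) / (q3 - q1)

print(x_scaled)  # ordinary values land near 0; the outlier stays far away
```

Had the mean and standard deviation been used instead, the outlier would have inflated both, squeezing the four ordinary values together.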
• Z-score normalization: Z-score normalization is another name for Standardization
above: features are transformed so that they have a mean of 0 and a standard
deviation of 1. This transformation is important when the features have different
scales or units, which can hurt the performance of many machine learning
algorithms, especially those based on distance metrics (like k-nearest neighbors)
or on gradient-based optimization.

• Formula: z = (x - μ) / σ (identical to the Standardization formula)
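To illustrate why this matters for distance-based algorithms, the sketch below uses an assumed two-feature dataset (income in currency units, age in years). Before scaling, Euclidean distances are dominated by the large-scale income column; after z-score normalization, both features contribute comparably:

```python
import numpy as np

# Hypothetical dataset: column 0 is income (scale ~30000), column 1 is age.
X = np.array([[30000.0, 25.0],
              [32000.0, 60.0],
              [31000.0, 26.0]])

# Euclidean distance between the first two rows on raw features:
# dominated almost entirely by the 2000-unit income gap.
d_raw = np.linalg.norm(X[0] - X[1])

# Z-score normalize each column, then recompute the distance.
Z = (X - X.mean(axis=0)) / X.std(axis=0)
d_scaled = np.linalg.norm(Z[0] - Z[1])

print(d_raw)     # on the order of 2000 — the age difference is invisible
print(d_scaled)  # a few units — both features now matter
```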
