
Optimization Methods - Detailed Notes

Section 1

Optimization in one variable involves finding the maximum or minimum value of a function f(x)

by analyzing its critical points and behavior across intervals. This is particularly useful in problems

where outcomes depend on a single input variable.

---

1. **Key Definitions**:

- **Critical Point**: A point x=c is a critical point if f'(c)=0 or f'(c) does not exist. Critical points are candidates

for local maxima, minima, or saddle points.

Example: For f(x) = x^3 - 3x^2 + 2, f'(x) = 3x^2 - 6x = 3x(x-2). Setting f'(x) = 0 gives critical points x=0 and x=2.

- **Local Minimum and Maximum**:

A point x=c is a **local minimum** if f(c) <= f(x) for all x near c.

Similarly, x=c is a **local maximum** if f(c) >= f(x) for all x near c.

Example: For f(x) = x^2, x=0 is a local minimum because f(0) = 0 <= x^2 = f(x) for all x near 0.

- **Global Minimum and Maximum**:

These are the lowest and highest values of f(x) across its entire domain.
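The critical-point definition above can be checked symbolically. A minimal sketch (assuming the sympy library is available) that finds the critical points of the cubic example:

```python
# Find the critical points of f(x) = x^3 - 3x^2 + 2 from the example above
# (a sketch; assumes sympy is installed).
import sympy as sp

x = sp.symbols('x')
f = x**3 - 3*x**2 + 2
fprime = sp.diff(f, x)                 # f'(x) = 3x^2 - 6x
critical_points = sp.solve(fprime, x)  # solve f'(x) = 0
print(critical_points)                 # [0, 2]
```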

---

2. **Finding Critical Points**:

Critical points are found by setting f'(x)=0 and solving for x (and by checking any points where f'(x) does not exist).


**Example**:

Let f(x) = x^3 - 3x^2 + 4.

1. First, compute f'(x) = 3x^2 - 6x.

2. Set f'(x) = 0: 3x(x-2) = 0. Critical points are x=0 and x=2.

Note: Simply finding critical points doesn't guarantee they're maxima or minima. We need tests.
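One such test in a single variable is the second-derivative test: if f''(c) > 0 the critical point is a local minimum, and if f''(c) < 0 it is a local maximum. A sketch applying it to the example above, assuming sympy is available:

```python
# Classify the critical points of f(x) = x^3 - 3x^2 + 4 with the
# one-variable second-derivative test (a sketch; assumes sympy).
import sympy as sp

x = sp.symbols('x')
f = x**3 - 3*x**2 + 4
f1 = sp.diff(f, x)      # f'(x) = 3x^2 - 6x
f2 = sp.diff(f, x, 2)   # f''(x) = 6x - 6

for c in sp.solve(f1, x):
    curvature = f2.subs(x, c)
    kind = ('local max' if curvature < 0
            else 'local min' if curvature > 0
            else 'inconclusive')
    print(c, kind)
# x=0: f''(0) = -6 < 0, a local maximum; x=2: f''(2) = 6 > 0, a local minimum.
```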

Section 2

Optimization in multiple variables is similar but involves partial derivatives and additional tools like the

Hessian matrix.

---

1. **Key Definitions**:

- A **critical point** (x, y) is where all partial derivatives are zero: ∂f/∂x = 0 and ∂f/∂y = 0.

- The **Hessian matrix** is the square matrix of second partial derivatives:

H = [
[∂^2f/∂x^2, ∂^2f/∂x∂y],
[∂^2f/∂y∂x, ∂^2f/∂y^2]
].

---

2. **Second Derivative Test**:

Use the determinant of the Hessian matrix (D):

- D = (∂^2f/∂x^2)(∂^2f/∂y^2) - (∂^2f/∂x∂y)^2, evaluated at the critical point.


Classification:

- If D > 0 and ∂^2f/∂x^2 > 0, it's a local minimum.

- If D > 0 and ∂^2f/∂x^2 < 0, it's a local maximum.

- If D < 0, it's a saddle point.

- If D = 0, the test is inconclusive.
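The classification rules above can be sketched as a small helper. The function name `classify` and its arguments are illustrative, and sympy is assumed for the derivatives:

```python
# Classify a critical point of f(x, y) using the Hessian determinant
# test described above (a sketch; names are illustrative, sympy assumed).
import sympy as sp

x, y = sp.symbols('x y')

def classify(f, point):
    fxx = sp.diff(f, x, 2).subs(point)   # ∂^2f/∂x^2 at the point
    fyy = sp.diff(f, y, 2).subs(point)   # ∂^2f/∂y^2 at the point
    fxy = sp.diff(f, x, y).subs(point)   # ∂^2f/∂x∂y at the point
    D = fxx * fyy - fxy**2
    if D > 0:
        return 'local minimum' if fxx > 0 else 'local maximum'
    if D < 0:
        return 'saddle point'
    return 'inconclusive'

print(classify(x**2 - y**2, {x: 0, y: 0}))   # saddle point
```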

---

3. **Example**:

Classify critical points of f(x, y) = x^2 + y^2 - 4x - 2y:

- Partial derivatives: ∂f/∂x = 2x - 4, ∂f/∂y = 2y - 2.

- Critical point: Solve ∂f/∂x = 0 and ∂f/∂y = 0 to get (x, y) = (2, 1).

- Hessian matrix:

H = [
[2, 0],
[0, 2]
].

- D = (2)(2) - (0)^2 = 4 > 0, and ∂^2f/∂x^2 = 2 > 0.

Therefore, (2, 1) is a local minimum.
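This worked example can be reproduced symbolically; a sketch assuming sympy is available:

```python
# Verify the worked example: f(x, y) = x^2 + y^2 - 4x - 2y
# (a sketch; assumes sympy is installed).
import sympy as sp

x, y = sp.symbols('x y')
f = x**2 + y**2 - 4*x - 2*y

# Solve ∂f/∂x = 0 and ∂f/∂y = 0 for the critical point.
crit = sp.solve([sp.diff(f, x), sp.diff(f, y)], [x, y])
print(crit)        # {x: 2, y: 1}

# Build the Hessian and its determinant D.
H = sp.hessian(f, (x, y))
D = H.det()
print(H, D)        # Matrix([[2, 0], [0, 2]]) and D = 4 > 0: local minimum
```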
