(DS&A) - Lecture 3 - Formal Approach To Analysis


Chapter Three

Formal Approach to Analysis


• In the previous examples we saw that analyzing loop statements step by step is tedious.
• It can be simplified by using a formal approach, in which case we can ignore initializations, loop controls, and updates.

Simple Loops: Formally

• A for loop can be translated into a summation.
• The index and bounds of the summation are the same as the index and bounds of the for loop.
• Suppose we count the number of additions that are done. There is 1 addition per iteration of the loop, hence N additions in total.

for (int i = 1; i <= N; i++) {
    sum = sum + i;
}

Σ_{i=1}^{N} 1 = N
Nested Loops: Formally

• Nested for loops translate into multiple summations, one for each for loop.

for (int i = 1; i <= N; i++) {
    for (int j = 1; j <= M; j++) {
        sum = sum + i + j;
    }
}

Σ_{i=1}^{N} Σ_{j=1}^{M} 2 = Σ_{i=1}^{N} 2M = 2MN
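Counting the two additions in `sum = sum + i + j` per inner iteration reproduces the 2MN total (an illustrative sketch; the class name and counter are ours):

```java
public class NestedLoops {
    // Count additions in the doubly nested loop:
    // 2 additions per inner iteration, N*M inner iterations.
    static long additions(int n, int m) {
        long sum = 0;
        long count = 0;
        for (int i = 1; i <= n; i++) {
            for (int j = 1; j <= m; j++) {
                sum = sum + i + j;  // two additions
                count += 2;
            }
        }
        return count;  // should equal 2*M*N
    }

    public static void main(String[] args) {
        System.out.println(additions(10, 20)); // 2*20*10 = 400
    }
}
```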
Consecutive Statements: Formally

• Add the running times of the separate blocks of your code.

for (int i = 1; i <= N; i++) {
    sum = sum + i;
}
for (int i = 1; i <= N; i++) {
    for (int j = 1; j <= N; j++) {
        sum = sum + i + j;
    }
}

Σ_{i=1}^{N} 1 + Σ_{i=1}^{N} Σ_{j=1}^{N} 2 = N + 2N²
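The sum-of-blocks rule can likewise be checked by counting: the first loop contributes N additions and the nested pair contributes 2N² (a sketch; the names are ours):

```java
public class ConsecutiveBlocks {
    // N additions from the first loop, 2*N*N from the second.
    static long additions(int n) {
        long sum = 0;
        long count = 0;
        for (int i = 1; i <= n; i++) {
            sum = sum + i;          // 1 addition
            count++;
        }
        for (int i = 1; i <= n; i++) {
            for (int j = 1; j <= n; j++) {
                sum = sum + i + j;  // 2 additions
                count += 2;
            }
        }
        return count;  // should equal N + 2*N^2
    }

    public static void main(String[] args) {
        System.out.println(additions(10)); // 10 + 200 = 210
    }
}
```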
Conditionals: (Formally, take the maximum)

Example:

if (test == 1) {
    for (int i = 1; i <= N; i++) {
        sum = sum + i;
    }
} else {
    for (int i = 1; i <= N; i++) {
        for (int j = 1; j <= N; j++) {
            sum = sum + i + j;
        }
    }
}

max(Σ_{i=1}^{N} 1, Σ_{i=1}^{N} Σ_{j=1}^{N} 2) = max(N, 2N²) = 2N²
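Under the maximum rule, the conditional's worst-case count is whichever branch is costlier; for N ≥ 1 that is the nested branch (a sketch; the helper names are ours):

```java
public class ConditionalMax {
    static long thenBranch(int n) { return n; }          // sum_{i=1}^{N} 1
    static long elseBranch(int n) { return 2L * n * n; } // sum_i sum_j 2

    // Worst-case cost of the if/else: the maximum of the two branch costs.
    static long worstCase(int n) {
        return Math.max(thenBranch(n), elseBranch(n));
    }

    public static void main(String[] args) {
        System.out.println(worstCase(10)); // max(10, 200) = 200
    }
}
```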
Categories of Algorithm Analysis

• Algorithms may be examined under different situations to correctly determine their efficiency for accurate comparison.

Best Case Analysis:
• Assumes the input data are arranged in the most advantageous order for the algorithm.
• Takes the smallest possible set of inputs.
• Causes execution of the fewest number of statements.
• Computes the lower bound of T(n), where T(n) is the complexity function.

Examples:
For a sorting algorithm:
• The list is already sorted (data are arranged in the required order).
For a searching algorithm:
• The desired item is located at the first accessed position.
Worst Case Analysis:
• Assumes the input data are arranged in the most disadvantageous order for the algorithm.
• Takes the worst possible set of inputs.
• Causes execution of the largest number of statements.
• Computes the upper bound of T(n), where T(n) is the complexity function.

Examples:
For sorting algorithms:
• The list is in the opposite order.
For searching algorithms:
• The desired item is located at the last position or is missing.
Worst Case Analysis:
• Worst case analysis is the most common analysis because:
  • It provides an upper bound for all inputs (even bad ones).
  • Average case analysis is often difficult to determine and define.
  • If inputs were always in their best-case arrangement, there would be little to analyze, since the data are already in the most favorable situation.
  • Best case analysis cannot be used to estimate complexity.
  • We are interested in the worst case time since it provides a bound for all inputs; this is called the "Big-Oh" estimate.
Average Case Analysis:
• Determines the average of the running time over all permutations of the input data.
• Takes an average set of inputs.
• It also assumes a random input arrangement.
• It causes an average number of executions.
• Computes the tight (optimal) bound of T(n), where T(n) is the complexity function.
• Sometimes average cases are as bad as worst cases and as good as best cases.

Examples:
For sorting algorithms:
• Sorting, considering any arrangement (order) of the input data.
For searching algorithms:
• Searching, when the desired item may be located at any position or be missing.
• The study of algorithms includes:
  • How to design algorithms (describing algorithms)
  • How to analyze algorithms (in terms of time and memory space)
  • How to validate algorithms (for any input)
  • How to express algorithms (using a programming language)
  • How to test a program (debugging and maintaining)
• But in this course, more focus will be given to the design and analysis of algorithms.
Order of Magnitude

• Refers to the rate at which the storage or time grows as a function of problem size.
• It is expressed in terms of its relationship to some known functions.
• This type of analysis is called asymptotic analysis.
Asymptotic Notations

• Asymptotic analysis is concerned with how the running time of an algorithm increases with the size of the input in the limit, as the size of the input increases without bound.
• Asymptotic analysis makes use of
  • O (Big-Oh),
  • Ω (Big-Omega),
  • Θ (Theta),
  • o (little-o), and
  • ω (little-omega)
  notations in performance analysis and in characterizing the complexity of an algorithm.

Types of Asymptotic Notations

1. Big-Oh Notation
• Definition: We say f(n) = O(g(n)) if there are positive constants n0 and c such that to the right of n0, the value of f(n) always lies on or below c·g(n), i.e., f(n) ≤ c·g(n) for all n ≥ n0.
• As n increases, f(n) grows no faster than g(n).
• It is only concerned with what happens for very large values of n.
• Describes the worst case analysis.
• Gives an upper bound for a function to within a constant factor.
• O-notation is used to represent the amount of time an algorithm takes on the worst possible set of inputs ("worst case").
Question 1:
Given f(n) = 10n + 5 and g(n) = n, show that f(n) is O(g(n)).

Solution: To show that f(n) is O(g(n)), we must show that there exist constants c and k such that f(n) ≤ c·g(n) for all n ≥ k:
10n + 5 ≤ c·n for all n ≥ k.
Let c = 15; then we must show that 10n + 5 ≤ 15n,
i.e., 5 ≤ 5n, or 1 ≤ n.
So f(n) = 10n + 5 ≤ 15·g(n) for all n ≥ 1.
(c = 15, k = 1): there exist two constants that satisfy the constraints.

Question 2:
Given f(n) = 3n² + 4n + 1, show that f(n) = O(n²).

Solution:
• 4n ≤ 4n² for all n ≥ 1, and
• 1 ≤ n² for all n ≥ 1, so
• 3n² + 4n + 1 ≤ 3n² + 4n² + n² = 8n² for all n ≥ 1.
So we have shown that f(n) ≤ 8n² for all n ≥ 1.
Therefore, f(n) is O(n²). (c = 8, k = 1): there exist two constants that satisfy the constraints.
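Both witnesses (c = 15, k = 1 and c = 8, k = 1) can be spot-checked over a range of n (a verification sketch, not a proof; the checker class is ours):

```java
public class BigOhWitness {
    // Check f(n) <= c*g(n) for all n in [k, limit].
    static boolean holds(java.util.function.LongUnaryOperator f,
                         java.util.function.LongUnaryOperator g,
                         long c, long k, long limit) {
        for (long n = k; n <= limit; n++) {
            if (f.applyAsLong(n) > c * g.applyAsLong(n)) return false;
        }
        return true;
    }

    public static void main(String[] args) {
        // Question 1: 10n + 5 <= 15n for all n >= 1
        System.out.println(holds(n -> 10 * n + 5, n -> n, 15, 1, 1000));
        // Question 2: 3n^2 + 4n + 1 <= 8n^2 for all n >= 1
        System.out.println(holds(n -> 3 * n * n + 4 * n + 1, n -> n * n, 8, 1, 1000));
    }
}
```

Note that c = 10 would fail for Question 1, since 10n + 5 > 10n for every n; the constant has to absorb the lower-order term.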
Big-O Theorems
For all the following theorems, assume that f(n) is a function of n and that k is an arbitrary constant.
• Theorem 1: k is O(1).
• Theorem 2: A polynomial is O(the term containing the highest power of n); if f(n) is a polynomial of degree d, then f(n) is O(n^d).
• Theorem 3: k·f(n) is O(f(n)).
• Theorem 4 (transitivity): If f(n) is O(g(n)) and g(n) is O(h(n)), then f(n) is O(h(n)).
• etc.
2. Big-Omega (Ω) Notation (Lower bound)
• Definition: We write f(n) = Ω(g(n)) if there are positive constants n0 and c such that to the right of n0, the value of f(n) always lies on or above c·g(n).
• As n increases, f(n) grows no slower than g(n).
• Describes the best case analysis.
• Used to represent the amount of time the algorithm takes on the smallest possible set of inputs ("best case").

Example:
Find g(n) such that f(n) = Ω(g(n)) for f(n) = n².
g(n) = n, c = 1, k = 1, since n² ≥ n for all n ≥ 1.
f(n) = n² = Ω(n).
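The Ω witness (c = 1, k = 1) can be spot-checked the same way (a sketch; the class is ours):

```java
public class BigOmegaWitness {
    // Check f(n) >= c*g(n) for all n in [k, limit],
    // with f(n) = n^2 and g(n) = n.
    static boolean holds(long c, long k, long limit) {
        for (long n = k; n <= limit; n++) {
            long f = n * n;
            long g = n;
            if (f < c * g) return false;
        }
        return true;
    }

    public static void main(String[] args) {
        System.out.println(holds(1, 1, 1000)); // n^2 >= n for all n >= 1
    }
}
```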
3. Theta Notation (Θ-Notation) (Optimal bound)
• Definition: We say f(n) = Θ(g(n)) if there exist positive constants n0, c1, and c2 such that to the right of n0, the value of f(n) always lies between c1·g(n) and c2·g(n) inclusive, i.e., c1·g(n) ≤ f(n) ≤ c2·g(n) for all n ≥ n0.
• As n increases, f(n) grows as fast as g(n).
• Describes the average case analysis.
• Used to represent the amount of time the algorithm takes on an average set of inputs ("average case").

Example: Find g(n) such that f(n) = Θ(g(n)) for f(n) = 2n² + 3.
Take g(n) = n²: n² ≤ 2n² + 3 ≤ 3n² for all n ≥ 2
(the upper bound needs n² ≥ 3, hence n0 = 2),
so c1 = 1, c2 = 3, and n0 = 2.
f(n) = Θ(n²).
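The two-sided Θ bound (c1 = 1, c2 = 3, n0 = 2) can be spot-checked; note that it fails at n = 1, which is why n0 = 2 (a sketch; the class is ours):

```java
public class ThetaWitness {
    // Check c1*g(n) <= f(n) <= c2*g(n) for all n in [n0, limit],
    // with f(n) = 2n^2 + 3 and g(n) = n^2.
    static boolean holds(long c1, long c2, long n0, long limit) {
        for (long n = n0; n <= limit; n++) {
            long f = 2 * n * n + 3;
            long g = n * n;
            if (f < c1 * g || f > c2 * g) return false;
        }
        return true;
    }

    public static void main(String[] args) {
        System.out.println(holds(1, 3, 2, 1000)); // true with n0 = 2
        System.out.println(holds(1, 3, 1, 1000)); // false: at n = 1, 5 > 3
    }
}
```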
4. Little-oh (small-oh) Notation
• Definition: We say f(n) = o(g(n)) if, for every positive constant c, there is a positive constant n0 such that to the right of n0, the value of f(n) lies strictly below c·g(n).
• As n increases, g(n) grows strictly faster than f(n).
• Describes the worst case analysis.
• Denotes an upper bound that is not asymptotically tight.
• Big-O notation, by contrast, denotes an upper bound that may or may not be asymptotically tight; in particular, n² = O(n²), but n² is not o(n²).

Example:
Find g(n) such that f(n) = o(g(n)) for f(n) = n².
n² < c·n³ whenever n > 1/c, so g(n) = n³ works: f(n) = o(n³).
Similarly, n² < c·n⁴ eventually, so g(n) = n⁴ also works: f(n) = o(n⁴).
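Little-o can be seen through the ratio f(n)/g(n) → 0: for any c, however small, n²/n³ = 1/n eventually falls below c (a sketch; the class and helper are ours):

```java
public class LittleOhDemo {
    // For f(n) = n^2 and g(n) = n^3, find the first n with f(n) < c*g(n).
    // Little-o means such an n exists for EVERY positive constant c.
    static long firstNBelow(double c) {
        long n = 1;
        while ((double) n * n >= c * (double) n * n * n) {
            n++;
        }
        return n;  // smallest n with n^2 < c*n^3, i.e. n > 1/c
    }

    public static void main(String[] args) {
        System.out.println(firstNBelow(0.5));   // first n beyond 1/0.5 = 2
        System.out.println(firstNBelow(0.01));  // first n beyond 1/0.01 = 100
    }
}
```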
5. Little-Omega (ω) Notation
• Definition: We write f(n) = ω(g(n)) if, for every positive constant c, there is a positive constant n0 such that to the right of n0, the value of f(n) always lies strictly above c·g(n).
• As n increases, f(n) grows strictly faster than g(n).
• Describes the best case analysis.
• Denotes a lower bound that is not asymptotically tight.
• Big-Omega (Ω) notation, by contrast, denotes a lower bound that may or may not be asymptotically tight.

Example: Find g(n) such that f(n) = ω(g(n)) for f(n) = n² + 3.
g(n) = n works: for any c, n² + 3 > c·n for all n ≥ c.
g(n) = √n can also be a solution, since n² + 3 > c·√n eventually.
Rules to Estimate the Big-Oh of a Given Function
• Pick the highest-order term.
• Ignore its coefficient.

Example:
1. T(n) = 3n + 5 → O(n)
2. T(n) = 3n² + 4n + 2 → O(n²)

• Some known functions encountered when analyzing algorithms (complexity categories for Big-Oh):
T(n)                    Complexity category F(n)    Big-O
c (c is a constant)     1                           T(n) = O(1)
10·log n + 5            log n                       T(n) = O(log n)
√n + 2                  √n                          T(n) = O(√n)
5n + 3                  n                           T(n) = O(n)
3n·log n + 5n + 2       n·log n                     T(n) = O(n·log n)
10n² + n·log n + 1      n²                          T(n) = O(n²)
5n³ + 2n² + 5           n³                          T(n) = O(n³)
2^n + n⁵ + n + 1        2^n                         T(n) = O(2^n)
7n! + 2^n + n² + 1      n!                          T(n) = O(n!)
8n^n + 2^n + n² + 3     n^n                         T(n) = O(n^n)
• Arrangement of common functions by growth rate (list of typical growth rates):

Function    Name
c           Constant
log N       Logarithmic
log² N      Log-squared
N           Linear
N·log N     Log-linear
N²          Quadratic
N³          Cubic
2^N         Exponential
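The ordering in this list can be observed numerically: at N = 100, each function already dominates the one above it. The ordering is asymptotic, though; at small N it can be violated (e.g., log² N > N at N = 10). A sketch (the class is ours, using doubles and base-2 logarithms):

```java
public class GrowthRates {
    static double log2(double n) { return Math.log(n) / Math.log(2); }

    // True if c < log N < log^2 N < N < N log N < N^2 < N^3 < 2^N at this N.
    static boolean ordered(double n) {
        double[] v = {
            1, log2(n), log2(n) * log2(n), n, n * log2(n),
            n * n, n * n * n, Math.pow(2, n)
        };
        for (int i = 1; i < v.length; i++) {
            if (v[i] <= v[i - 1]) return false;
        }
        return true;
    }

    public static void main(String[] args) {
        System.out.println(ordered(100)); // the table's order holds at N = 100
        System.out.println(ordered(10));  // but not yet at N = 10 (log^2 N > N)
    }
}
```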
• The order of the body statements of a given algorithm is very important in determining the Big-Oh of the algorithm.

Example: Find the Big-Oh of the following algorithms.

1. for (int i = 1; i <= n; i++)
       sum = sum + i;

   T(n) = 2·n = 2n = O(n).

2. for (int i = 1; i <= n; i++)
       for (int j = 1; j <= n; j++)
           k++;

   T(n) = 1·n·n = n² = O(n²).
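Instrumented versions of the two algorithms reproduce the counts T(n) = 2n and T(n) = n² (a sketch; the counters and class name are ours):

```java
public class BodyOrder {
    // Algorithm 1: one addition + one assignment per iteration -> 2n operations.
    static long algo1(int n) {
        long sum = 0, count = 0;
        for (int i = 1; i <= n; i++) {
            sum = sum + i;
            count += 2;  // addition + assignment
        }
        return count;    // T(n) = 2n = O(n)
    }

    // Algorithm 2: one increment per inner iteration -> n*n operations.
    static long algo2(int n) {
        long k = 0, count = 0;
        for (int i = 1; i <= n; i++) {
            for (int j = 1; j <= n; j++) {
                k++;
                count++;
            }
        }
        return count;    // T(n) = n^2 = O(n^2)
    }

    public static void main(String[] args) {
        System.out.println(algo1(50));  // 2*50 = 100
        System.out.println(algo2(50));  // 50*50 = 2500
    }
}
```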
