
Design & Analysis of Algorithms

Handout-02
Analysis of Algorithms
• Analysis in the context of algorithms is concerned with predicting the resources that an algorithm requires
– Time
– Memory/Space

• However, Time – generally measured in terms of the number of steps required to execute an algorithm – is the resource of most interest.
Analysis of Algorithms

• To analyze algorithms:
– First, we count the number of significant (basic/elementary) operations in a particular solution to assess its efficiency.
Basic (elementary) operations include:
– Assigning a value to a variable
– Performing an arithmetic operation (+, −, ×, /) on integers
– Performing a comparison, e.g. a < b
– Performing a Boolean operation
– Accessing an element of an array
– Then, we express the efficiency of algorithms using growth functions.
Terminologies in Analysis
• Execution Cost
• Running Time
• Complexity
• Efficiency

Complexity
• The complexity of an algorithm is simply the amount of work the algorithm performs to complete its task.

Complexity

• Complexity is a function T(n) which measures the time or space used by an algorithm with respect to the input size n.

• The running time of an algorithm on a particular input is determined by the number of “Elementary Operations” executed.
Things to Remember in Analysis
• Constants and low-order terms are ignored, e.g. if f(n) = 2n² then f(n) = O(n²).

• The parameter n, usually referring to the number of data items processed, affects running time most significantly.
Components of an Algorithm

An algorithm is made up of three basic components (building blocks): sequencing, selection, and iteration.
• Sequences
• Selections
• Repetitions
1. Sequence
• A sequence is a series of actions that is completed in a specific order. Action 1 is performed, then Action 2, then Action 3, etc., until all of the actions in the sequence have been carried out.
• A linear sequence of elementary operations is performed in constant time.
• If Action 1, Action 2, and Action 3 run sequentially in times t1, t2, and t3:
– the complexity will be t1 + t2 + t3
2. Selection
• if <test> then P1 else P2 structures are a little harder to analyze than sequences.
• The maximum rule can be applied here:
– max(t1, t2), assuming t1, t2 are the times of P1, P2
• However, the maximum rule may prove too conservative:
– if <test> is always true, the time is t1
– if <test> is always false, the time is t2

If/Else
if cond then S1
else S2
Block #1 takes time t1 and Block #2 takes time t2, so the cost is max(t1, t2).
3. Repetition (Loops)
For Loops
• The running time of a for-loop is at most the running time of the statements inside the for-loop times the number of iterations.

for (i = sum = 0; i < n; i++)
    sum += a[i];

• The for-loop iterates n times and executes 2 assignment statements each iteration ==> asymptotic complexity of O(n)
Nested For-Loops
• Analyze inside-out. The total running time is the running time of the statement multiplied by the product of the sizes of all the for-loops.

e.g.
for (i = 0; i < n; i++) {
    for (j = 0, sum = a[0]; j <= i; j++)
        sum += a[j];
    printf("sum for subarray 0 through %d is %d\n", i, sum);
}
Example
Algorithm:
1. n = read input from user
2. sum = 0
3. i = 0
4. while i < n
   {
5.    number = read input from user
6.    sum = sum + number
7.    i = i + 1
   }
8. mean = sum / n

Statement    Number of times executed
    1                  1
    2                  1
    3                  1
    4                 n+1
    5                  n
    6                  n
    7                  n
    8                  1

The computing time for this algorithm in terms of input size n is: T(n) = 4n + 5.
The Execution Time of Algorithms (cont.)

Example: Simple If-Statement
                         Cost    Times
if (n < 0)                c1       1
    absval = -n;          c2       1
else
    absval = n;           c3       1

Total Cost <= c1 + max(c2, c3)
The Execution Time of Algorithms
Example: Simple Loop
                         Cost    Times
i = 1;                    c1       1
sum = 0;                  c2       1
while (i <= n) {          c3      n+1
    i = i + 1;            c4       n
    sum = sum + i;        c5       n
}

Total Cost = c1 + c2 + (n+1)*c3 + n*c4 + n*c5
==> The time required for this algorithm is proportional to n
The Execution Time of Algorithms (cont.)

Example: Nested Loop
                         Cost    Times
i = 1;                    c1       1
sum = 0;                  c2       1
while (i <= n) {          c3      n+1
    j = 1;                c4       n
    while (j <= n) {      c5    n*(n+1)
        sum = sum + i;    c6      n*n
        j = j + 1;        c7      n*n
    }
    i = i + 1;            c8       n
}
Total Cost = c1 + c2 + (n+1)*c3 + n*c4 + n*(n+1)*c5 + n*n*c6 + n*n*c7 + n*c8
==> The time required for this algorithm is proportional to n²
Analysis of Algorithms

• We measure the complexity of an algorithm by identifying a basic operation and then counting how many times the algorithm performs that basic operation for an input size n.

Problem                     Input of size n         Basic operation
searching a list            list with n elements    comparison
sorting a list              list with n elements    comparison
multiplying two matrices    two n-by-n matrices     multiplication
traversing a tree           tree with n nodes       accessing a node
Algorithm Growth Rates
• We measure an algorithm’s time requirement as a function of the problem size.
– Problem size depends on the application: e.g., the number of elements in a list for a sorting algorithm, or the number of disks in the Towers of Hanoi problem.
• So, for instance, we say that (if the problem size is n)
– Algorithm A requires 5*n² time units to solve a problem of size n.
– Algorithm B requires 7*n time units to solve a problem of size n.
• The most important thing to learn is how quickly the algorithm’s time requirement grows as a function of the problem size.
– Algorithm A requires time proportional to n².
– Algorithm B requires time proportional to n.
• An algorithm’s proportional time requirement is known as its growth rate.
Analysis of Algorithms

• The growth rate of T(n) is referred to as the computational complexity of the algorithm.

• We can compare the efficiency of two algorithms by comparing their growth rates.

• The lower the growth rate, the faster the algorithm, at least for large values of n.

• For example, if T(n) = an² + bn + c, the growth rate is O(n²).

• The goal of the algorithm designer should be an algorithm with as low a growth rate of the running time function, T(n), as possible.
Algorithm Growth Rates (cont.)

[Figure: time requirements as a function of the problem size n]
Growth-Rate Functions

O(1)         Time requirement is constant, and it is independent of the problem’s size.
O(log2n)     Time requirement for a logarithmic algorithm increases slowly as the problem size increases.
O(n)         Time requirement for a linear algorithm increases directly with the size of the problem.
O(n*log2n)   Time requirement for an n*log2n algorithm increases more rapidly than for a linear algorithm.
O(n²)        Time requirement for a quadratic algorithm increases rapidly with the size of the problem.
O(n³)        Time requirement for a cubic algorithm increases more rapidly with the size of the problem than the time requirement for a quadratic algorithm.
O(2ⁿ)        As the size of the problem increases, the time requirement for an exponential algorithm increases too rapidly to be practical.
A Comparison of Growth-Rate Functions (cont.)

[Figure: the growth-rate functions above plotted against n]
Best, Worst and Average Cases

For the same algorithm, not all inputs take the same time to execute.

Best-Case Analysis –
• There are input values for which execution time is least (best cases).
• The minimum amount of time that an algorithm requires to solve a problem of size n.
• The best-case behavior of an algorithm is NOT so useful.

Average-Case Analysis –
• The average amount of time that an algorithm requires to solve a problem of size n.
• Sometimes it is difficult to find the average-case behavior of an algorithm: we have to look at all possible data organizations of a given size n, and the distribution probabilities of these organizations.
Best, Worst and Average Cases

Worst-Case Analysis –
• There are input values for which execution time is maximum (worst cases).
• The maximum amount of time that an algorithm requires to solve a problem of size n.
• This gives an upper bound for the time complexity of an algorithm.
• Normally, we try to find the worst-case behavior of an algorithm.
Example: Linear Search Algorithm
 Suppose a linear array DATA contains n
elements, and suppose a specific ITEM of
information is given. We want either to find the
location LOC of ITEM in the array DATA, or to
send some message, such as LOC = 0, to
indicate that ITEM does not appear in DATA.
 The linear search algorithm solves this problem
by comparing ITEM, one by one, with each
element in DATA. That is, we compare ITEM
with DATA[1], then DATA[2], and so on, until we
find LOC such that ITEM = DATA[LOC].
Best case
• Best Case: Clearly the best case occurs when ITEM is the first element in the array DATA.
• In this situation, only one comparison is made, so T(n) = 1.
Worst case
• Worst Case: Clearly the worst case occurs when ITEM is the last element in the array DATA or is not there at all.
• In either situation, we have T(n) = n. Accordingly, T(n) = n is the worst-case complexity of the linear search algorithm.
Average case
 Average Case: Here we assume that ITEM does appear in DATA, and that it is equally likely to occur at any position in the array. Accordingly, the number of comparisons can be any of the numbers 1, 2, 3, ..., n, and each number occurs with probability p = 1/n.
 Then T(n) = 1·(1/n) + 2·(1/n) + ... + n·(1/n) = (1 + 2 + ... + n)/n = (n+1)/2.
 This agrees with our intuitive feeling that the average number of comparisons needed to find the location of ITEM is approximately equal to half the number of elements in the DATA list.
END

