DAA Module 1
1. INTRODUCTION
2. PERFORMANCE ANALYSIS
5. RECURSION
6. Big O Notation
Every algorithm requires some amount of computer time to execute its instructions and
perform its task. This computer time required is called the time complexity.
Time complexity of an algorithm can be defined as follows:
The time complexity of an algorithm is the total amount of time required by the algorithm to
complete its execution.
For all values of n to the right of n₀, the value of f(n) always lies on or below C·g(n).
To understand O-Notation let us consider the following examples
Q) For the function defined by f(n) = 2n² + 3n + 1, show that
i. f(n) = O(n²)
ii. f(n) = O(n³)
Solution
(i) To show that f(n) = O(n²), we have to show that
f(n) ≤ C·g(n), for all n ≥ n₀
i.e., 2n² + 3n + 1 ≤ C·n²
Clearly this inequality is satisfied for C = 6 and for all n ≥ 1:
2n² + 3n + 1 ≤ 6n² for all n ≥ 1
Since we have found the required constants C = 6 and n₀ = 1, f(n) = O(n²).
The values of C and n₀ are not unique. For example, the above inequality is also
satisfied by C = 3 and n ≥ 4. So depending on the value of C, the value of n₀ also
changes. Thus any pair of C and n₀ which satisfies the given inequality is a valid
solution.
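The witness constants above can be checked numerically. The following sketch (the helper name `holds` is ours, not from the text; a finite range check is evidence, not a proof — the proof is the algebra above) verifies both witness pairs for f(n) = 2n² + 3n + 1:

```python
# f(n) = 2n^2 + 3n + 1, g(n) = n^2
def f(n):
    return 2 * n * n + 3 * n + 1

def holds(C, g, n0, n_max=10_000):
    # Check f(n) <= C * g(n) for every n in [n0, n_max).
    return all(f(n) <= C * g(n) for n in range(n0, n_max))

g2 = lambda n: n * n

print(holds(6, g2, 1))   # C = 6, n0 = 1: True
print(holds(3, g2, 4))   # C = 3, n0 = 4: True
print(holds(3, g2, 3))   # C = 3 fails at n = 3 (28 > 27): False
```

Choosing a larger C lets the bound kick in at a smaller n₀, which is exactly why the (C, n₀) pair is not unique.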
(ii) To show that f(n) = O(n³), we have to show that
f(n) ≤ C·g(n), for all n ≥ n₀
i.e., 2n² + 3n + 1 ≤ C·n³. Let C = 3.

n    2n² + 3n + 1    3n³
1          6           3
2         15          24
3         28          81

Note that for all n ≥ 2 the value of f(n) lies on or below C·g(n), so C = 3 and
n₀ = 2 give f(n) = O(n³).
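A quick numerical check of part (ii): with C = 3 the inequality fails at n = 1 (6 > 3) but holds from n = 2 onwards over any finite range we test:

```python
# Verify 2n^2 + 3n + 1 <= 3n^3 for all n >= 2 in a finite range,
# and show the inequality fails at n = 1.
f = lambda n: 2 * n * n + 3 * n + 1
g = lambda n: n ** 3

assert f(1) > 3 * g(1)                                  # 6 > 3: n = 1 is excluded
assert all(f(n) <= 3 * g(n) for n in range(2, 10_000))  # holds from n0 = 2
print("C = 3, n0 = 2 works for f(n) = O(n^3)")
```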
To understand Ω-Notation let us consider the following example
Q) For the functions defined by f(n) = 2n³+3n²+1 and g(n) = 2n²+3n, show that
i) f(n) = Ω(g(n))
ii) g(n) ≠ Ω(f(n))
Solution
i) To show that f(n) = Ω(g(n)) we have to show that f(n) ≥ C·g(n), for all n ≥ n₀
2n³ + 3n² + 1 ≥ C(2n² + 3n)
This inequality is satisfied for C = 1 and n₀ = 1 (at n = 1, 6 ≥ 5)
Hence f(n) = Ω(g(n))
ii) To show that g(n) ≠ Ω(f(n)), we would need some constant C > 0 with
2n² + 3n ≥ C(2n³ + 3n² + 1) for all n ≥ n₀. Dividing both sides by n³, the left-hand
side tends to 0 while the right-hand side tends to 2C > 0, so no such constant can
exist. Hence g(n) ≠ Ω(f(n)).
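Both claims can be checked numerically. The sketch below verifies the lower bound with C = 1 over a finite range, and shows the ratio g(n)/f(n) shrinking toward 0, which is why no constant C can make g(n) ≥ C·f(n) hold for all large n:

```python
# f(n) = 2n^3 + 3n^2 + 1, g(n) = 2n^2 + 3n
f = lambda n: 2 * n**3 + 3 * n**2 + 1
g = lambda n: 2 * n**2 + 3 * n

# Part (i): f(n) >= 1 * g(n) for all n >= 1 in the test range.
assert all(f(n) >= g(n) for n in range(1, 10_000))

# Part (ii): g(n)/f(n) keeps shrinking, so g(n) >= C*f(n) eventually
# fails for any fixed C > 0.
ratios = [g(n) / f(n) for n in (10, 100, 1000)]
print(ratios)   # strictly decreasing toward 0
assert ratios[0] > ratios[1] > ratios[2]
```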
3.3 The notation θ (Theta)
Θ-Notation provides simultaneously both an asymptotic lower bound and an asymptotic upper
bound for a given function. Let f(n) and g(n) be two positive functions, each from the set of
natural numbers (domain) to the positive real numbers. If
f(n) = O(g(n)) and f(n) = Ω(g(n)), then f(n) = θ(g(n)).
Note that for all values of n to the right of n₀, the value of f(n) lies at or above C₁·g(n)
and at or below C₂·g(n).
To understand θ-Notation let us consider the following example
Q) For the functions defined by f(n) = 2n³+3n²+1 and g(n) = 2n³+1, show that f(n) = θ(g(n)).
Solution
To show that f(n) = θ(g(n)) we have to show that
C₁·g(n) ≤ f(n) ≤ C₂·g(n) for all n ≥ n₀
C₁(2n³+1) ≤ 2n³+3n²+1 ≤ C₂(2n³+1) for all n ≥ n₀ …………………….(1)
To satisfy this inequality simultaneously, we have to find the values of C₁, C₂ and n₀
using the following inequalities:
C₁(2n³+1) ≤ 2n³+3n²+1 …………………..(2) and
2n³+3n²+1 ≤ C₂(2n³+1) ………………….(3)
Inequality (2) is satisfied for C₁ = 1 and n ≥ 1 (i.e., n₀ = 1)
Inequality (3) is satisfied for C₂ = 2 and n ≥ 1 (i.e., n₀ = 1)
Hence inequality (1) is simultaneously satisfied for C₁ = 1, C₂ = 2 and n ≥ 1
Hence f(n) = θ(g(n)).
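The two-sided bound can be checked the same way as before; with C₁ = 1 and C₂ = 2 the sandwich holds from n = 1 (at n = 1: 3 ≤ 6 ≤ 6):

```python
# f(n) = 2n^3 + 3n^2 + 1, g(n) = 2n^3 + 1
f = lambda n: 2 * n**3 + 3 * n**2 + 1
g = lambda n: 2 * n**3 + 1

# Check C1*g(n) <= f(n) <= C2*g(n) with C1 = 1, C2 = 2, n0 = 1.
assert all(g(n) <= f(n) <= 2 * g(n) for n in range(1, 10_000))
print("f(n) = Theta(g(n)), witnessed by C1 = 1, C2 = 2, n0 = 1")
```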
Suppose M is an algorithm, and suppose n is the size of the input data. The time and space
used by the algorithm M are the two main measures for the efficiency of M.
The time is measured by counting the number of key operations; for example, in the case
of sorting and searching algorithms, the number of comparisons is the number of key
operations. That is because key operations are so defined that the time for the other
operations is much less than or at most proportional to the time for the key
operations.
The space is measured by counting the maximum amount of memory needed by the algorithm.
The complexity of an algorithm M is the function f(n), which gives the running time and/or
storage space requirement of the algorithm M in terms of the size n of the input data.
Frequently, the storage space required by an algorithm is simply a multiple of the data size
n. In general, the term “complexity” used anywhere simply refers to the running time of the
algorithm. There are, in general, 3 cases to consider when finding the complexity function f(n):
1. Best case The minimum value of f(n) for any possible input.
2. Worst case The maximum value of f(n)for any possible input.
3. Average case The value of f(n) which lies between the maximum and the minimum for any
possible input. Generally, the average case implies the expected value of f(n).
4.1 Best, Worst and Average cases
To better understand all 3 cases, consider the example of an English dictionary used to
search for the meaning of a particular word.
Best Case: Suppose we open the dictionary and luckily get the meaning of the word we are
looking for on the first attempt. This requires only one step (the minimum possible) to get
the meaning of the word.
Worst case: Suppose you are searching for the meaning of a word and that word is either not
in the dictionary or takes the maximum possible number of steps to find (i.e., no left-hand
side or right-hand side page is left to look at).
Average Case: If you are searching for a word that requires neither the minimum (best case)
nor the maximum (worst case) number of steps, it is called the average case. In this case, we
definitely get the meaning of the word.
Example 1: Linear Search
To understand the Best, Worst and Average cases of an algorithm, consider a linear array
A[1…n], where the array A contains n elements. Suppose you want either to find the
location LOC of a given element (say ‘x’) in the given array A or to send some message
indicating that x does not appear in A.
The complexity of the search algorithm is given by the number C of comparisons between x
and the array elements A[K].
Best case: Clearly the best case occurs when ‘x’ is the first element in the array A. That is
x = A[LOC]. In this case C(n) = 1 = O(1).
Worst case: Clearly the worst case occurs when ‘x’ is the last element in the array A or ‘x’
is not present in the given array A (to ensure this we have to search the entire array A till
the last element). In this case, we have C(n) = n = O(n).
Average case: Here we assume that the searched element ‘x’ appears in array A, and that it
is equally likely to occur at any position in the array. Here the number of comparisons can
be any of the numbers 1, 2, 3, …, n, and each number occurs with probability p = 1/n. Then
C(n) = 1·(1/n) + 2·(1/n) + …… + n·(1/n)
= (1 + 2 + …… + n)·(1/n)
= (n(n+1)/2)·(1/n)
= (n+1)/2
It means the average number of comparisons needed to find the location of x is
approximately equal to half the number of elements in array A. From the above discussion, it
may be noted that the complexity of an algorithm in the average case is much more
complicated to analyze than in the other two cases.
Traces of BinSearch on the 14-element array for three values of x (each triple shows
low, high, mid at one probe):

x = -14:  (1, 14, 7), (1, 6, 3), (1, 2, 1), (2, 2, 2), then low = 2 > high = 1 → not found
x = 151:  (1, 14, 7), (8, 14, 11), (12, 14, 13), (14, 14, 14) → found
x = 9:    (1, 14, 7), (1, 6, 3), (4, 6, 5) → found
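The traces can be reproduced with a three-way binary search. This sketch (our own, written to match the text's 1-based indexing; the array is the 14-element sorted data set analyzed in these notes) records every (low, high, mid) triple it visits:

```python
def bin_search_trace(a, x):
    """Three-way binary search on 1-based sorted array a[1..n].
    Returns (index of x, or 0 if absent; list of (low, high, mid) probes)."""
    low, high = 1, len(a) - 1        # a[0] is an unused placeholder
    trace = []
    while low <= high:
        mid = (low + high) // 2
        trace.append((low, high, mid))
        if x < a[mid]:
            high = mid - 1
        elif x > a[mid]:
            low = mid + 1
        else:
            return mid, trace
    return 0, trace

a = [None, -15, -6, 0, 7, 9, 23, 54, 82, 101, 112, 125, 131, 142, 151]
print(bin_search_trace(a, 9))
# -> (5, [(1, 14, 7), (1, 6, 3), (4, 6, 5)])
print(bin_search_trace(a, -14))
# -> (0, [(1, 14, 7), (1, 6, 3), (1, 2, 1), (2, 2, 2)])
```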
To test all successful searches, x must take on the n values in a. To test all unsuccessful
searches, x must take on the n+1 values corresponding to the gaps around and between the
elements of a. Thus a complete test of BinSearch requires 2n+1 data sets for each n.
Now let us analyze the execution profile of BinSearch. The two relevant characteristics of
this profile are the frequency count and the space required. The space is that needed for
the n elements of the array plus the variables low, high, mid and x, or n+4 locations. As
for the time, there are three possibilities to consider: the best, average and worst cases.
Suppose we begin by determining the time for BinSearch on the previous data set. We observe
that the only operations in the algorithm are comparisons and some arithmetic and data
movements. We concentrate on comparisons between x and the elements in a[ ], recognizing
that the frequency count of all other operations is of the same order as that for these
comparisons. We assume that only one comparison is needed to determine which of the three
possibilities of the if statement holds. The number of element comparisons needed to find
each of the 14 elements is:
Index:        [1]  [2]  [3]  [4]  [5]  [6]  [7]  [8]  [9]  [10]  [11]  [12]  [13]  [14]
Element:      -15   -6    0    7    9   23   54   82  101  112   125   131   142   151
Comparisons:    3    4    2    4    3    4    1    4    3    4     2     4     3     4
No element requires more than 4 comparisons to be found. The average is obtained by
summing the comparisons needed to find all 14 items and dividing by 14. This yields 45/14, or
approximately 3.21, comparisons per successful search on the average. There are 15
possible ways that an unsuccessful search may terminate, depending on the value of x. If x <
a[1], the algorithm requires 3 element comparisons to determine that x is not present. For
all the remaining possibilities, BinSearch requires 4 element comparisons. Thus the average
number of element comparisons for an unsuccessful search is (3 + 14·4)/15 = 59/15
≈ 3.93.
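These counts can be verified by instrumenting the search. The sketch below (our own) counts one comparison per probe, matching the text's assumption of a single three-way comparison per if statement, and reproduces the totals 45 (successful) and 59 (unsuccessful):

```python
def comparisons(a, x):
    """Number of three-way comparisons BinSearch makes for x (1-based array)."""
    low, high, count = 1, len(a) - 1, 0
    while low <= high:
        mid = (low + high) // 2
        count += 1
        if x < a[mid]:
            high = mid - 1
        elif x > a[mid]:
            low = mid + 1
        else:
            break
    return count

a = [None, -15, -6, 0, 7, 9, 23, 54, 82, 101, 112, 125, 131, 142, 151]

# All 14 successful searches.
successful = sum(comparisons(a, x) for x in a[1:])

# 15 unsuccessful searches: one x below a[1], one in each of the 13 gaps
# between consecutive elements, and one above a[14].
gaps = [a[1] - 1] + [(a[i] + a[i + 1]) / 2 for i in range(1, 14)] + [a[14] + 1]
unsuccessful = sum(comparisons(a, x) for x in gaps)

print(successful, successful / 14)      # 45, ~3.21
print(unsuccessful, unsuccessful / 15)  # 59, ~3.93
```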
The analysis just done applies to any sorted sequence containing 14 elements. But the result
we would prefer is a formula for n elements. A good way to derive such a formula, plus a
better way to understand the algorithm, is to consider the sequence of values for mid that
are produced by BinSearch for all possible values of x. These values are nicely described
by a binary decision tree.
The first comparison is of x with a[7]. If x < a[7], then the next comparison is with a[3];
similarly, if x > a[7], then the next comparison is with a[11]. Each path through the tree
represents a sequence of comparisons in the binary search method. If x is present, then the
algorithm will end at one of the circular nodes, which lists the index into the array where
x was found. If x is not present, the algorithm will terminate at one of the square nodes.
Circular nodes are called internal nodes and square nodes are called external nodes.
Complexity
6. Big O Notation
Big O Notation (with a capital letter O, not a zero), also called Landau's symbol, is a
symbolism used in complexity theory, computer science, and mathematics to describe the
asymptotic behavior of functions. Basically, it tells you how fast a function grows or
declines.
Landau's symbol comes from the name of the German number theoretician Edmund Landau
who invented the notation. The letter O is used because the rate of growth of a function is
also called its order.
Here is a list of classes of functions that are commonly encountered when analyzing
algorithms. The slower growing functions are listed first. c is some arbitrary constant.
No  Notation    Name            No  Notation   Name
1.  O(1)        Constant        6.  O(nᶜ)      Polynomial
2.  O(log(n))   Logarithmic     7.  O(cⁿ)      Exponential
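The ordering of these classes can be illustrated numerically. The sketch below (our own, taking c = 2 for both the polynomial and exponential classes) evaluates each class at a single large n; slower-growing classes give smaller values:

```python
import math

n = 100
values = {
    "O(1)":      1,
    "O(log n)":  math.log2(n),   # ~6.64
    "O(n^c)":    n ** 2,         # polynomial with c = 2: 10000
    "O(c^n)":    2 ** n,         # exponential with c = 2: ~1.27e30
}

# Slower-growing classes are strictly smaller at this n.
assert values["O(1)"] < values["O(log n)"] < values["O(n^c)"] < values["O(c^n)"]
for name, v in values.items():
    print(name, v)
```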
1. What is an Algorithm?
6. What is order of growth? Describe the best, average and worst case efficiencies.
Selection sort: best case n², average case n², worst case n², stable: No