
Introduction to Algorithms and Complexity
Algorithm Analysis

• An important part of computational complexity theory; it provides theoretical estimates of the resources an algorithm requires to solve a specific computational problem
Complexity Theory

• Also called computational complexity
• A measure of the amount of computing resources (time and space) that a particular algorithm consumes when it runs
Time Complexity
- The amount of time an algorithm takes to complete
- Most commonly expressed using big O notation
- Measured as a function of the input length (number of steps)

Space Complexity
- The amount of working storage an algorithm needs (volume of memory)
Why do we need Algorithm Analysis
Knowing the efficiency of an algorithm is vital for mission-critical tasks.

Generally, there are multiple approaches/methods/algorithms to solve one problem statement; algorithm analysis is performed to figure out which is the better/optimum approach out of the options.
What does a BETTER Algorithm mean?

• Faster? (less execution time) – Time Complexity
• Less memory? – Space Complexity
• Easy to read?
• Fewer lines of code?
• Fewer hardware/software requirements?

Algorithm analysis does not give you exact values (time, space, etc.); rather, it gives estimates that can be used to study the behavior of the algorithm.
What is Space Complexity
The space complexity of an algorithm or a computer program is the amount of memory space required to solve an instance of the computational problem as a function of the size of the input.
It is the memory required by an algorithm to execute a program and produce output.

Similar to time complexity, it is often expressed asymptotically in big O notation, such as O(n), O(n log n), O(n²), etc., where n is the input size in units of bits needed to represent the input.
What is Space Complexity
For any algorithm, memory is required for the following purposes:
- to store program instructions
- to store constant values
- to store variable values
- for a few other things such as function calls, jump statements, etc.

Auxiliary Space: the temporary space (excluding the input) allocated by your algorithm to solve the problem, with respect to the input size

Space complexity includes both auxiliary space and the space used by the input
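As a hedged illustration (my own example, not from the slides): reversing an array in place needs only O(1) auxiliary space, while building a reversed copy needs O(n) auxiliary space; both have O(n) total space complexity because the input array itself occupies n integers.

#include <stdlib.h>

/* Reverses arr in place: O(1) auxiliary space, O(n) total space
   (the input array itself). Function names are illustrative. */
void reverse_in_place(int arr[], int n)
{
    for (int i = 0, j = n - 1; i < j; i++, j--)
    {
        int tmp = arr[i];           /* one temporary int, independent of n */
        arr[i] = arr[j];
        arr[j] = tmp;
    }
}

/* Returns a reversed copy: O(n) auxiliary space for the new array. */
int *reverse_copy(const int arr[], int n)
{
    int *out = malloc((size_t)n * sizeof(int));   /* n extra integers */
    if (out == NULL)
        return NULL;
    for (int i = 0; i < n; i++)
        out[i] = arr[n - 1 - i];
    return out;
}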
Space Complexity
An algorithm generally requires space for the following components:
1. Instruction Space
2. Data Space
3. Environment Space
Big O notation

“Big” for uppercase

“O” for the order of complexity of an algorithm
Space Complexity

Linear Space or O( n )

Constant Space or O( 1 )

Quadratic Space or O( n² )
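The slides do not show a quadratic-space case, so the following is an assumed sketch: an algorithm that builds an n × n table allocates n² cells, giving O(n²) space.

#include <stdlib.h>

/* Assumed sketch of quadratic space: an n x n multiplication table
   needs n * n integers, so its space grows as O(n^2). */
int *build_table(int n)
{
    int *table = malloc((size_t)n * (size_t)n * sizeof(int));  /* n*n cells */
    if (table == NULL)
        return NULL;
    for (int i = 0; i < n; i++)
        for (int j = 0; j < n; j++)
            table[i * n + j] = (i + 1) * (j + 1);
    return table;
}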
Example: constant space
n1  → 4 bytes
n2  → 4 bytes
sum → 4 bytes
aux → 4 bytes
Total → 16 bytes, constant space, O(1)

Example: linear space
arr → N × 4 bytes
sum → 4 bytes
i   → 4 bytes
aux → 4 bytes
Total → 4N + 12 bytes, Linear Space Complexity, O(N)
Example: n = 5
fact → 4 bytes
n    → 4 bytes
i    → 4 bytes
aux  → 4 bytes
Space → 12 bytes + 4 bytes, Constant Space Complexity, O(1)
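The byte counts above correspond to an iterative loop over the variables fact, n and i; the slides do not show the code itself, so this is an assumed reconstruction:

/* Assumed reconstruction: only fact, n and i are stored,
   so the space used does not grow with n (constant space, O(1)). */
int factorial(int n)                 /* n:    4 bytes */
{
    int fact = 1;                    /* fact: 4 bytes */
    for (int i = 1; i <= n; i++)     /* i:    4 bytes */
        fact = fact * i;
    return fact;                     /* + 4 bytes for the return value */
}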
{                                // variables a, b, c and z are all integer types;
    int z = a + b + c;           // they will take up 4 bytes each
    return(z);
}
// so the total memory requirement will be (4(4) + 4) = 20 bytes;
// the additional 4 bytes is for the return value.
// The space requirement is fixed: Constant Space Complexity, O(1).
// n is the length of array a[]
int sum(int a[], int n)
{
    int x = 0;                   // 4 bytes for x
    for (int i = 0; i < n; i++)  // 4 bytes for i
    {
        x = x + a[i];
    }
    return(x);
}
// 4*n bytes of space is required for the array a[] elements,
// plus 4 bytes each for x, n, i and the return value,
// so the total memory requirement will be (4n + 16),
// increasing linearly with the increase in the input value n:
// Linear Space Complexity, O(n).
Time Complexity

Linear Time or O( n )

Constant Time or O( 1 )

Quadratic Time or O( n² )
Constant Time or O( 1 )

statement;

Here we have a single statement. Its time complexity is constant: the running time of the statement does not change in relation to N.
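A small hedged example (mine, not from the slides): accessing a single array element is one such statement, and it takes the same time no matter how large the array is.

/* Illustrative O(1) operation: one array access, independent of the array size. */
int get_first(const int arr[])
{
    return arr[0];
}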
Linear Time or O( n )

for (i = 0; i < N; i++)
{
    statement;
}

The time complexity of the above loop is linear. The running time of the loop is directly proportional to N: when N doubles, so does the running time.
Quadratic Time or O( n² )

for (i = 0; i < N; i++)
{
    for (j = 0; j < N; j++)
    {
        statement;
    }
}

This time, the time complexity of the code above is quadratic. The running time of the two nested loops is proportional to the square of N: when N doubles, the running time increases fourfold.
O(log2n)

while (low <= high)
{
    mid = (low + high) / 2;
    if (target < list[mid])
        high = mid - 1;
    else if (target > list[mid])
        low = mid + 1;
    else
        break;
}

The algorithm breaks a set of numbers into halves to search for a particular field, so it has Logarithmic Time Complexity. The running time of the algorithm is proportional to the number of times N can be divided by 2 (N is high − low here). This is because the algorithm divides the working area in half with each iteration.
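For context, a self-contained version of the loop above could look like the sketch below (the function name and signature are my assumptions, not from the slides). Each iteration halves the range, so at most about log₂(n) + 1 iterations are needed.

/* Sketch of a complete binary search over a sorted array.
   Returns the index of target, or -1 if it is not present. */
int binary_search(const int list[], int n, int target)
{
    int low = 0, high = n - 1;
    while (low <= high)
    {
        int mid = low + (high - low) / 2;   /* avoids overflow of (low + high) */
        if (target < list[mid])
            high = mid - 1;
        else if (target > list[mid])
            low = mid + 1;
        else
            return mid;                     /* found */
    }
    return -1;                              /* not found */
}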
Comparing Growth Rate
Factors of Instruction Complexity

1. machine execution time
2. time to execute the instruction
3. the instruction set
4. translation (the use of compilers)
How to Calculate the Time Complexity
• Break your algorithm/function into individual operations
• Calculate the Big O of each operation
• Add up the Big O of each operation together
• Remove the constants
• Find the highest order term
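As a hedged walk-through of these steps (the function below is my own example, not from the slides), the costs can be counted line by line:

/* Illustrative example for applying the steps above. */
int demo(int arr[], int n)
{
    int count = 0;                      /* O(1) */
    for (int i = 0; i < n; i++)         /* O(n) */
        count += arr[i];
    for (int i = 0; i < n; i++)         /* O(n^2) in total */
        for (int j = 0; j < n; j++)
            if (arr[i] == arr[j])
                count++;
    return count;                       /* O(1) */
}
/* Adding up: O(1) + O(n) + O(n^2) + O(1).
   Removing constants and keeping the highest-order term gives O(n^2). */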
3 cases of Time Complexity

1. Best Case – when the algorithm is executed, the fewest number of instructions are executed.
2. Average Case – executing the algorithm produces path lengths that are, on average, the same.
3. Worst Case – executing the algorithm produces path lengths that are always a maximum.
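A hedged example (mine, not from the slides): for a linear search over n elements, the best case finds the target at the first position (1 comparison), the worst case scans all n elements, and the average case makes roughly n/2 comparisons.

/* Linear search used to illustrate the three cases.
   Best case:    target == arr[0]        -> 1 comparison
   Worst case:   target absent or last   -> n comparisons
   Average case: roughly n/2 comparisons -> still O(n) */
int linear_search(const int arr[], int n, int target)
{
    for (int i = 0; i < n; i++)
        if (arr[i] == target)
            return i;
    return -1;
}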
If it is good enough today, will it be good enough
tomorrow?
Find square of a number

/* Approach 1: we have to calculate the square of n */
square = 0
for i = 1 to n
    do square = square + n
// when the loop ends, square will hold n²
return square

/* Approach 2: we have to calculate the square of n */
return n*n
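In C, the two approaches might look like the sketch below (function names are mine; assuming n ≥ 0). The first runs in O(n) time, the second in O(1).

/* Approach 1: repeated addition, O(n) time (assuming n >= 0). */
int square_loop(int n)
{
    int result = 0;
    for (int i = 1; i <= n; i++)
        result = result + n;    /* add n, n times */
    return result;
}

/* Approach 2: single multiplication, O(1) time. */
int square_direct(int n)
{
    return n * n;
}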
Example 1: Time Complexity

arr_X = [ 1, 20, 5, 4, 9 ]          // loop trace: i = 0, 1, 2, 3, 4, 5

sum = 0;                            // 1
for( i = 0; i < n; i++ )            // 1 + (n+1) + n
{
    sum = sum + arr_X[ i ];         // n
}
                                    // Total: 3n + 3  →  O(n)
Example 1: Space Complexity

arr_X = [ 1, 20, 5, 4, 9 ]

sum = 0;                            // arr_X = n
for( i = 0; i < n; i++ )            // sum   = 1
{                                   // i     = 1
    sum = sum + arr_X[ i ];         // n     = 1
}                                   // Total: n + 3  →  O(n)
Example 2: Time Complexity

sum = 0;                            // 1
for( x = 0; x < n; x++ )            // 1 + (n+1) + n
{
    for( y = 0; y < n; y++ )        // n * n = n²
    {
        sum = sum + arr_X[ i ];     // n * n = n²
    }
}
                                    // Total: 2n² + n + 1  →  O(n²)
Example 2: Space Complexity

sum = 0;                            // arr_X = n
for( x = 0; x < n; x++ )            // sum   = 1
{                                   // x     = 1
    for( y = 0; y < n; y++ )        // y     = 1
    {                               // n     = 1
        sum = sum + arr_X[ i ];
    }
}                                   // Total: n + 4  →  O(n)
Example 3: Time Complexity

arr_X = [ 1, 20, 5, 4, 9, 10, 2, 3, 0, 7 ]

for( i = 0; i < n; i = i+2 )        // n/2
{
    sum = sum + arr_X[ i ];         // n/2
}
                                    // Total: 2 · (n/2) = n  →  O(n)
Example 4: Time Complexity

for( i = 1; i < n; i = i*2 )        // i = 1   (2⁰)
{                                   // i = 2   (2¹)
    sum = sum + arr_X[ i ];         // i = 4   (2²)
}                                   // i = 8   (2³)
                                    // i = 16, i = 32, …
                                    // the loop stops when 2ˣ ≥ n, i.e. n = 2ˣ,
                                    // so it runs x = log₂n times  →  O(log₂n)
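To make the count concrete, a small hedged check (my own snippet, not from the slides): for n = 64 the loop body runs for i = 1, 2, 4, 8, 16, 32, i.e. 6 = log₂(64) times.

#include <stdio.h>

/* Counts the iterations of the Example 4 loop for n = 64. */
int main(void)
{
    int n = 64, iterations = 0;
    for (int i = 1; i < n; i = i * 2)
        iterations++;
    printf("%d\n", iterations);   /* prints 6 */
    return 0;
}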
Types of Notations for Time Complexity

1. Big Oh (O) denotes "fewer than or the same as" <expression> iterations.
2. Big Omega (Ω) denotes "more than or the same as" <expression> iterations.
3. Big Theta (Θ) denotes "the same as" <expression> iterations.
4. Little Oh (o) denotes "fewer than" <expression> iterations.
5. Little Omega (ω) denotes "more than" <expression> iterations.
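For reference, these notations have standard formal definitions that the slides do not spell out; in LaTeX notation, the Big Oh case reads:

f(n) = O(g(n)) \iff \exists\, c > 0,\ n_0 \ge 1 \ \text{such that}\ f(n) \le c \cdot g(n) \ \text{for all}\ n \ge n_0

For example, 3n² + 5n + 7 ≤ 15n² for all n ≥ 1, so 3n² + 5n + 7 = O(n²). Big Omega uses ≥ instead of ≤, Big Theta requires both bounds, and Little Oh / Little Omega require the strict inequality to hold for every constant c > 0.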
