DSA Unit 1 Final
DATA STRUCTURES AND ALGORITHMS
A data structure is a way of organizing data items that considers not only the elements stored but
also their relationship to each other.
INTRODUCTION
Data structure affects the design of both structural &
functional aspects of a program.
Program=algorithm + Data Structure
You know that an algorithm is a step-by-step procedure to solve
a particular problem.
That means, algorithm is a set of instruction written to carry
out certain tasks & the data structure is the way of
organizing the data with their logical relationship retained.
To develop a program for an algorithm, we should select an
appropriate data structure for that algorithm.
CLASSIFICATION OF DATA STRUCTURE
Data structures are normally divided into two broad
categories:
Primitive Data Structure
Non-Primitive Data Structure
[Diagram: Data structure divides into Primitive DS and Non-Primitive DS]
for (i = 0; i <= 9; i++)
    printf("%d", arr[i]);
ARRAYS
If we are reading or writing a two-dimensional array, it
would require two loops. Similarly, an array of N
dimensions would require N loops.
Some common operations performed on
arrays are:
Creation of an array
Traversing an array
Insertion of a new element
Deletion of a required element
Modification of an element
Merging of arrays
LISTS
A list (linear linked list) can be defined as a collection of a
variable number of data items.
Lists are the most commonly used non-primitive data
structures.
An element of a list must contain at least two fields: one for
storing data or information, and the other for storing the address of
the next element.
Since this second field stores an address, it must be of pointer
type.
Technically each such element is referred to as a node;
therefore a list can be defined as a collection of nodes, as
shown below:
STACK
In a stack, the base remains fixed while the top of the stack changes.
Insertion of an element into the stack is called PUSH and
deletion of an element from the stack is called POP.
The figure below shows how these operations take place on
a stack:
[Figure: PUSH adds an element at the top of the stack; POP removes the element at the top]
The stack can be implemented in two ways:
Using arrays (static implementation)
Using pointers (dynamic implementation)
QUEUE
A queue is a first-in first-out (FIFO) type of data structure.
In a queue, new elements are added at one
end, called the REAR end, and elements are always removed
from the other end, called the FRONT end.
The people standing in a railway reservation row are an
example of a queue.
Each new person comes and stands at the end of the row,
and people whose reservations are confirmed leave
the row from the front end.
The figure below shows how these operations take place on
a queue:
[Figure: queue holding 10 20 30 40 50, with front at 10 and rear at 50]
The queue can be implemented in two ways:
Using arrays (static implementation)
Using pointers (dynamic implementation)
TREES
A tree can be defined as a finite set of data items (nodes).
A tree is a non-linear type of data structure in which data
items are arranged in a hierarchy relating the various elements.
In trees:
There is a special data item at the top of hierarchy called the
Root of the tree.
The remaining data items are partitioned into a number of
mutually exclusive subsets, each of which is itself a tree,
called a subtree.
In data structures the tree always grows in length towards the
bottom, unlike natural trees, which grow upwards.
The tree structure organizes the data into branches,
which relate the information.
[Figure: tree with root A; children B and C; leaves D, E, F and G]
GRAPH
A graph is a mathematical non-linear data structure
capable of representing many kinds of physical structures.
It has found applications in geography, chemistry and the
engineering sciences.
Definition: A graph G(V,E) is a set of vertices V and a
set of edges E.
An edge connects a pair of vertices and may have a
weight such as a length, a cost, or some other measure,
depending on the graph.
Vertices of the graph are shown as points or circles, and
edges are drawn as arcs or line segments.
[Figure: two weighted graphs on vertices v1–v5, with edge weights such as 6, 8, 9, 10, 11 and 15]
With static implementation, memory utilization may be inefficient; with dynamic implementation, memory utilization is efficient, as memory can be reused.
It is the inverse of the top-down method.
A bottom-up approach starts with designing the
most basic or primitive components and proceeds
to higher-level components.
Starting from the very bottom, operations that
provide a layer of abstraction are implemented.
The programmer may write code to perform
basic operations, then combine those into
modules, which are finally combined to form the
overall system structure.
TIME COMPLEXITY
The time complexity of a program / algorithm is the
amount of computer time that it needs to run
to completion.
While calculating time complexity, we
develop a frequency count for all key
statements, i.e. the statements that are important.
Consider the three algorithms given below:
Algorithm A:  a = a + 1
Algorithm B:  for x = 1 to n step 1
                  a = a + 1
              loop
Algorithm C:  for x = 1 to n step 1
                  for y = 1 to n step 1
                      a = a + 1
                  loop
              loop
The frequency count for algorithm A is 1, as the statement
a = a + 1 will execute only once.
The frequency count for algorithm B is n, as the key
statement a = a + 1 executes n times, because the loop runs
n times.
The frequency count for algorithm C is n², as the key
statement a = a + 1 executes n² times: the inner
loop runs n times each time the outer loop
runs, and the outer loop also runs n times.
SPACE COMPLEXITY
Space complexity of a program / algorithm is the amount of
memory that it needs to run to completion.
The space needed by the program is the sum of the following
components.
Fixed space requirements:-
Fixed space does not depend on the characteristics of the
inputs and outputs.
Fixed space consists of space for simple variables, fixed-size
variables, etc.
Variable space requirements:-
Variable space includes the space needed by variables
whose size depends upon the particular problem being
solved, referenced variables, and the stack space
required for recursion on a particular instance of the variables.
e.g. the additional space required when a function uses
recursion.
ALGORITHM ANALYSIS:
There are different ways of solving a problem, &
different algorithms can be
designed to solve a problem.
There is a difference between a problem & an
algorithm.
A problem has a single problem statement that
describes it in general terms.
However, there are different ways to solve a
problem, & some solutions may be more
efficient than others.
There are different types of time complexities
which can be analyzed for an algorithm:
Best Case Time Complexity:
Worst Case Time Complexity:
Average Case Time Complexity:
BEST CASE TIME COMPLEXITY:
It is a measure of the minimum time that the
algorithm will require for an input of size 'n'.
The running time of many algorithms varies not
only for inputs of different sizes but also for inputs
of the same size.
For example, the running time of some sorting
algorithms will depend on the ordering of the
input data. Therefore, if the input data of 'n' items
is presented in sorted order, the operations
performed by the algorithm will take the least time.
WORST CASE TIME COMPLEXITY:
It is a measure of the maximum time that the
algorithm will require for an input of size 'n'.
Therefore, if various algorithms for sorting are
taken into account & 'n' input data items
are supplied in reverse order for a sorting
algorithm, then the algorithm will require n²
operations to perform the sort, which
corresponds to the worst-case time complexity of the
algorithm.
AVERAGE CASE TIME
COMPLEXITY:
The time that an algorithm will require to
execute typical input data of size 'n' is
known as its average-case time complexity.
We can say that the value obtained by
averaging the running time of an algorithm over all
possible inputs of size 'n' determines its
average-case time complexity.
Big O notation provides an asymptotic upper
bound; Ω notation provides an asymptotic
lower bound.
f(n) is said to
be Θ(g(n)) if f(n) is O(g(n)) and f(n) is Ω(g(n)).
Big O: It is defined as an upper bound, and the upper bound on an algorithm is the most amount of time required (the worst-case performance). Mathematically, Big O is 0 <= f(n) <= C g(n) for all n >= n0.
Big Ω: It is defined as a lower bound, and the lower bound on an algorithm is the least amount of time required (the most efficient way possible, in other words the best case). Mathematically, Big Ω is 0 <= C g(n) <= f(n) for all n >= n0.
Big Θ: It is defined as the tightest bound, and the tightest bound is the best of all the worst-case times that the algorithm can take. Mathematically, Big Θ is 0 <= C2 g(n) <= f(n) <= C1 g(n) for all n >= n0.
1-D ARRAY ADDRESS CALCULATION
Address of A[I] = B + W * (I – LB)
I = Subscript of the element whose address is to be
found,
B = Base address,
W = Storage size of one element stored in the
array (in bytes),
LB = Lower Limit/Lower Bound of the subscript (if
not specified, assume zero).
Given the base address of an array A[1300 … 1900] as 1020 and the size of each element as 2 bytes in
the memory, find the address of A[1700].
Given: B = 1020, W = 2 bytes, LB = 1300, I = 1700.
Address of A[1700] = 1020 + 2 * (1700 – 1300) = 1020 + 2 * 400 = 1020 + 800 = 1820.
2-D ARRAY ADDRESS CALCULATION
The address of a location in the Row Major System is calculated using the following formula:
Address of A[ I ][ J ] = B + W * [ N * ( I – Lr ) + ( J – Lc ) ]
The address of a location in the Column Major System is calculated using the following formula:
Address of A[ I ][ J ] = B + W * [ ( I – Lr ) + M * ( J – Lc ) ]
Example: given an array A[-15 … 10][15 … 40] with base address B = 1500 and element size W = 1 byte, find the address of A[15][20].
Solution:
The number of rows and columns are not given directly in the question, so they are calculated as:
Number of rows, M = (Ur – Lr) + 1 = [10 – (–15)] + 1 = 26
Number of columns, N = (Uc – Lc) + 1 = [40 – 15] + 1 = 26
(i) Column Major calculation:
The given values are: B = 1500, W = 1 byte, I = 15, J = 20, Lr = –15, Lc = 15, M = 26.
Address of A[15][20] = 1500 + 1 * [ (15 – (–15)) + 26 * (20 – 15) ] = 1500 + [30 + 130] = 1660
Search − Searches for an element using the
given key.
Delete − Deletes an element using the
given key.