DS Algorithms 1

The document provides an overview of data structures, defining them as methods for organizing and storing data efficiently, with real-time applications like arrays, linked lists, and trees. It also explains algorithms, their characteristics, and the importance of algorithm analysis in evaluating performance based on time and space complexity. Additionally, it covers asymptotic notations for analyzing algorithms and includes examples of Java implementations for stack and queue data structures.

Introduction to Data Structures

What is a Data Structure?

• A data structure is a way of organizing and storing data so
that it can be accessed and modified efficiently.
• Example: like organizing books on a shelf for easier access.
Why Do We Need Data Structures?

• Imagine you have a pile of books scattered randomly. Finding a
particular book would take a lot of time. But if the books are
arranged on a shelf in alphabetical order, finding one becomes
much faster.
• This is exactly what data structures do with data in a
computer.

• Efficient storage and access of data
• Fast retrieval and updates
• Better memory and time management
• Helps solve complex problems easily
Real-Time Applications of Data Structures

• Array: Student marks, items list
• Linked List: Music playlist, image viewer
• Stack: Undo/Redo in editor
• Queue: Task scheduling, print queue
• Tree: File system, XML parsing
• Graph: Social networks, GPS routes
• Hash Table: Caching, fast lookup
Real-Time Applications of Data Structures

Data Structure    Real-Time Use Cases
Array             Storing lists of temperatures, marks, items
Linked List       Music playlist, image viewer navigation
Stack             Undo feature in editors, expression evaluation
Queue             Print queue, task scheduling, call center systems
Tree              File systems, organization charts, XML/HTML parsing
Graph             Social networks, GPS navigation, network routing
Hash Table        Caching, databases, dictionary implementations
Summary
• DS = Way to organize/store data
• Purpose = Efficiency, Optimization
• Used in apps, OS, databases, etc.
• Advantages = Speed, memory usage, scalability
Definition of Algorithm
➢ An algorithm is a step-by-step procedure to solve
a specific problem.
Or
➢ A procedure for solving a mathematical problem
in a finite number of steps.

Examples:
• A cooking recipe
• Instructions to solve a puzzle
• Binary Search to find an item (see the sketch below)
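To make the Binary Search example concrete, here is a minimal Java sketch (the class and array names are illustrative, not from the original notes); it assumes the input array is already sorted in ascending order.

// Iterative binary search: returns the index of key in a sorted array, or -1 if absent.
public class BinarySearchDemo
{
    static int binarySearch(int[] a, int key)
    {
        int low = 0, high = a.length - 1;
        while (low <= high)
        {
            int mid = low + (high - low) / 2;   // midpoint without integer overflow
            if (a[mid] == key)
                return mid;                     // found
            else if (a[mid] < key)
                low = mid + 1;                  // search the right half
            else
                high = mid - 1;                 // search the left half
        }
        return -1;                              // not present
    }

    public static void main(String[] args)
    {
        int[] sorted = {2, 5, 8, 12, 16, 23, 38};
        System.out.println(binarySearch(sorted, 16));   // prints 4
        System.out.println(binarySearch(sorted, 7));    // prints -1
    }
}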
Characteristics of algorithms
Finiteness
•The algorithm must always terminate after a finite
number of steps.
•It should not go into an infinite loop.

Definiteness (Clarity)
•Each step of the algorithm must be precisely and
unambiguously defined.
•No vague instructions like “do it fast”.

Input
•An algorithm should accept zero or more well-defined
inputs.
•These inputs should be specified before the algorithm
starts.
Output
•It should produce at least one output.
•The output is the expected result based on the inputs.

Effectiveness
•Each step must be basic enough to be carried out easily
and exactly.
•The algorithm should be feasible using basic operations.

Time and Space Efficiency
•The algorithm should solve the problem using minimal
time and memory.
•Good algorithms are optimized for performance.
Basic Algorithmic Analysis
➢ The analysis of an algorithm is a technique
that measures the performance of an algorithm.
➢ Evaluating the performance of an algorithm is based on the
following factors:
• Time complexity (execution time)
• Space complexity (memory usage)
• Analysis helps choose the most efficient algorithm.

➢ The goal of algorithm analysis is to compare different
algorithms that solve the same problem, and to evaluate which
method solves it more quickly and with less memory.

➢ Time complexity relates the size of the algorithm's input to
the number of steps taken to execute it.

➢ Space complexity is determined by the amount of storage the
algorithm needs in order to execute.
Types of Algorithm Analysis
There are three types of analysis of algorithms. They
are stated below :

➢ Best Case − Minimum time required for program execution.
➢ Average Case − Average time required for program execution.
➢ Worst Case − Maximum time required for program execution.
Type           Purpose                                             Common Notation
Best Case      Measures the minimum time or space an algorithm     Big Omega (Ω)
               takes on any input.
Average Case   Measures the expected time over all inputs of a     Big Theta (Θ)
               given size.
Worst Case     Measures the maximum time or space needed on any    Big O (O)
               input.
Asymptotic notations
➢ Asymptotic notations tell us how well an algorithm performs
compared to another algorithm.

➢ Asymptotic notation is used to describe the running time of
an algorithm, that is, how much time an algorithm takes for a
given input size n.

➢ For example, if we want to compare the runtimes of the bubble
sort and merge sort algorithms, we can use asymptotic notations
to make this comparison.

➢ The efficiency of an algorithm depends on the amount of time,
storage and other resources required to execute it.

➢ This efficiency is measured with the help of asymptotic
notations.

➢ An algorithm may not have the same performance for different
types of inputs. As the input size increases, the performance
will change.
Types of asymptotic notations
There are three types of asymptotic
notations that are most commonly used to
analyze time complexities of various
algorithms.

➢ Big O notation (O)
➢ Omega notation (Ω)
➢ Theta notation (θ)
The common types of asymptotic notations are:

Notation               Symbol  Meaning                       Description
Big O Notation         O       Upper bound                   Describes the maximum time or space an algorithm
                               (worst-case complexity)       takes; it gives an upper bound on the growth rate.
Big Omega Notation     Ω       Lower bound                   Describes the minimum time or space an algorithm
                               (best-case complexity)        takes; it gives a lower bound on the growth rate.
Big Theta Notation     Θ       Tight bound                   Describes an asymptotically tight bound, both upper
                               (average case or exact)       and lower, on the running time.
Little o Notation      o       Strict upper bound            Denotes growth strictly less than the given function
                               (non-tight upper bound)       (not equal to it), i.e., grows slower asymptotically.
Little omega Notation  ω       Strict lower bound            Denotes growth strictly greater than the given function
                               (non-tight lower bound)       (not equal to it), i.e., grows faster asymptotically.
Big O notation (O)
➢ Big-O notation represents the upper bound of the running
time of an algorithm.
➢ Thus, it gives the worst-case complexity of an algorithm.

Definition: Assume f(n) and g(n) are non-negative functions.
f(n) = O(g(n)) iff
        f(n) <= c.g(n)
for some constant c > 0 and for all values of n with n >= n0,
where c and n0 are constants.

Hence, function g(n) is an upper bound for function f(n),
since c.g(n) grows at least as fast as f(n).

Example: f(n) = 2n+2, g(n) = 3n^2
Check f(n) <= c.g(n) with c = 4:
for n = 1: f(1) = 4, c.g(1) = 4*3 = 12, and 4 <= 12    True
for n = 2: f(2) = 6, c.g(2) = 4*12 = 48, and 6 <= 48   True
For all n >= 1, f(n) <= c.g(n), so f(n) = O(g(n)).
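A quick way to see the inequality above is to tabulate both sides. The short Java check below is only an illustration (the class name BigOCheck is not from the notes); it prints f(n) and c.g(n) for the same f(n) = 2n+2, g(n) = 3n^2 and c = 4 used in the example.

// Numerically checks f(n) = 2n+2 against c.g(n) = 4 * 3n^2 for small n.
public class BigOCheck
{
    public static void main(String[] args)
    {
        int c = 4;
        for (int n = 1; n <= 5; n++)
        {
            long f  = 2L * n + 2;        // f(n) = 2n + 2
            long cg = c * 3L * n * n;    // c.g(n) = 4 * 3n^2
            System.out.println("n=" + n + "  f(n)=" + f + "  c.g(n)=" + cg
                               + "  f<=c.g: " + (f <= cg));
        }
    }
}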
Omega Notation ( Ω)
• Omega notation represents the lower bound of the running
time of an algorithm.
• Thus, it provides the best case complexity of an algorithm.

Definition: Assume f(n) and g(n) are non-negative functions.
f(n) = Ω(g(n)) iff
        f(n) >= c.g(n)
for some constant c > 0 and for all values of n with n >= n0,
where c and n0 are constants.

It means function g is a lower bound for function f; after a
certain value of n, f will never go below c.g(n).

Example: f(n) = 2n+2, g(n) = n
Check f(n) >= c.g(n) with c = 1:
for n = 1: f(1) = 4, c.g(1) = 1, and 4 >= 1   True
for n = 2: f(2) = 6, c.g(2) = 2, and 6 >= 2   True
For all n >= 1, f(n) >= c.g(n), so f(n) = Ω(g(n)).
Theta Notation ( θ)
➢ Theta notation encloses the function from above and
below.
➢ Since it represents the upper and the lower bound of the
running time of an algorithm, it is used for analysing the
average-case complexity of an algorithm.
Definition: Assume f(n) and g(n) are non-negative functions.
f(n) = θ(g(n)) iff
        c1.g(n) <= f(n) <= c2.g(n)
for some constants c1, c2 > 0 and for all values of n with
n >= n0, where c1, c2 and n0 are constants.

This means function g is a tight bound for function f.

Example: f(n) = 3n+2, g(n) = n
Check c1.g(n) <= f(n) <= c2.g(n) with c1 = 1 and c2 = 4:
for n = 1: 1*1 <= 3*1+2 <= 4*1  =>  1 <= 5 <= 4   False
for n = 2: 1*2 <= 3*2+2 <= 4*2  =>  2 <= 8 <= 8   True
For all n >= 2, c1.g(n) <= f(n) <= c2.g(n), so f(n) = θ(g(n)).
Little o Notation
Definition: Assume f(n) and g(n) are non-negative functions.
f(n) = o(g(n)) iff f(n) < c.g(n) for every constant c > 0 and
for all values of n with n >= n0, where n0 is a constant that
may depend on c.

Mathematical Relation of Little o Notation
Using limits, f(n) = o(g(n)) means
        lim (n -> infinity) f(n) / g(n) = 0
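For instance, taking f(n) = n and g(n) = n^2 (functions chosen here purely for illustration), the ratio f(n)/g(n) = 1/n shrinks toward 0 as n grows, so n = o(n^2). A small Java sketch printing that ratio:

// Shows f(n)/g(n) -> 0 for f(n) = n and g(n) = n^2, i.e. f(n) = o(g(n)).
public class LittleODemo
{
    public static void main(String[] args)
    {
        for (int n = 10; n <= 100000; n *= 10)
        {
            double ratio = (double) n / ((double) n * n);   // f(n)/g(n) = 1/n
            System.out.println("n=" + n + "  f(n)/g(n)=" + ratio);
        }
    }
}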
General rules to calculate Time Complexity
Rule 1: Loops
The running time of a for loop is the running time of the
statements inside the loop multiplied by the number of iterations.
EX:
for(i=0;i<n;i++)    ----- n+1
    s=s+i;          ----- n
                    ----------
                          2n+1
Time complexity = O(n)
Rule 2: Nested Loops
The total running time of a statement inside a group of nested
loops is the running time of the statement multiplied by the
product of the sizes of all the loops.
EX:
for(i=0;i<n;i++)                    ----- n+1
{   for(j=0;j<n;j++)                ----- n*(n+1) = n^2+n
    {   c[i][j]=a[i][j]+b[i][j];    ----- n*n = n^2
    }
}                                   -----------------
                                          2n^2+2n+1
Time complexity = O(n^2)
Rule 3: Consecutive Statements
Add the running times of the statements; the total is dominated
by the largest (maximum) term.
EX:
int i;                              ----- 1
for(i=0;i<n;i++)                    ----- n+1
    s=s+i;                          ----- n
for(i=0;i<n;i++)                    ----- n+1
{   for(j=0;j<n;j++)                ----- n*(n+1)
    {   c[i][j]=a[i][j]+b[i][j];    ----- n*n
    }
}                                   -----------------
                                          2n^2+4n+3
Time complexity = O(n^2)
Rule 4: if-else
The running time is the time of the condition test plus the
larger of the running times of the two branches (S1 and S2).
EX:
if(n<=0)                ----- 1
{   return n;           ----- 1
}
else                    ----- 1
{   for(i=0;i<n;i++)    ----- n+1
    {   s=s+i;          ----- n
    }
    return s;           ----- 1
}                       -----------
                              2n+5
Time complexity = O(n)
Dependent (triangular) nested loop:
for(i=0;i<n;i++)        ----- n+1
{   for(j=0;j<=i;j++)   ----- inner loop runs 1+2+3+...+n times in total
    {
        ...
    }
}
Total inner iterations = n(n+1)/2 = (n^2+n)/2
Time complexity = O(n^2)
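The 1 + 2 + ... + n count can also be confirmed by running the loop. The small Java check below (an illustration, not part of the notes) counts the inner-loop iterations and compares the total with n(n+1)/2.

// Counts iterations of the dependent inner loop and compares with n(n+1)/2.
public class TriangularLoopCount
{
    public static void main(String[] args)
    {
        int n = 10;
        long count = 0;
        for (int i = 0; i < n; i++)
        {
            for (int j = 0; j <= i; j++)
            {
                count++;                                    // stands in for the loop body
            }
        }
        System.out.println("counted  = " + count);              // 55
        System.out.println("n(n+1)/2 = " + (n * (n + 1) / 2));  // 55
    }
}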
Frequency Count or Step Count
➢ The step count method is one of the methods used to analyze
the time complexity of an algorithm. In this method, we count
the number of times each instruction is executed, and based on
that we calculate the time complexity.

➢ The step count method is also called the frequency count
method.

Rules:
1. For comments and declarations, step count = 0.
2. For return statements and assignment statements, step count = 1.
3. Ignore lower-order terms when higher-order terms are present.
4. Ignore constant multipliers.
Time complexity
sum(int a[], int n)
{   s=0;                ----- 1
    for(i=0;i<n;i++)    ----- n+1
        s=s+a[i];       ----- n
    return s;           ----- 1
}                       ----------
                              2n+3
Time complexity = O(n)
Space complexity
The amount of memory required by an algorithm to run to
completion is called its space complexity.

For sum(int a[], int n):
a  ----- n
n  ----- 1
s  ----- 1
i  ----- 1
   ----------
   n + 3        Space complexity = O(n)
void mat_mul(int a[][], int b[][])
{   for(i=0;i<n;i++)                              ----- n+1
    {   for(j=0;j<n;j++)                          ----- n*(n+1)
        {   c[i][j]=0;                            ----- n*n
            for(k=0;k<n;k++)                      ----- n*n*(n+1)
            {   c[i][j]=c[i][j]+a[i][k]*b[k][j];  ----- n*n*n
            }
        }
    }                                             ------------------
}                                                       2n^3+3n^2+2n+1
Time complexity = O(n^3)
Space complexity
a  ----- n^2
b  ----- n^2
c  ----- n^2
n  ----- 1
i  ----- 1
j  ----- 1
k  ----- 1
   ----------
   3n^2 + 4     Space complexity = O(n^2)
Growth rate
It is the rate at which the execution time of an algorithm
increases as the input size increases.
Order of growth rates (from smallest to largest):
O(1)       - Constant
O(log n)   - Logarithmic
O(n)       - Linear
O(n log n) - Linearithmic
O(n^2)     - Quadratic
O(n^3)     - Cubic
O(2^n)     - Exponential
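As a rough illustration of where some of these growth rates come from, the Java sketch below (illustrative only; it counts operations rather than measuring time) shows typical code shapes for a few of them.

// Code shapes corresponding to some common growth rates (operation counts, not timings).
public class GrowthRates
{
    public static void main(String[] args)
    {
        int n = 16;
        int[] a = new int[n];

        int constant = a[0];                       // O(1): single array access

        long linear = 0;
        for (int i = 0; i < n; i++) linear++;      // O(n): one pass over the input

        long quadratic = 0;
        for (int i = 0; i < n; i++)
            for (int j = 0; j < n; j++)
                quadratic++;                       // O(n^2): nested passes

        long logarithmic = 0;
        for (int i = n; i > 1; i /= 2)
            logarithmic++;                         // O(log n): input halved each step

        System.out.println(constant + " " + linear + " "
                           + quadratic + " " + logarithmic);
    }
}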
Space complexity formula
S(P) = C + Sp(I)
where C is the fixed part (space for the code, constants and simple
variables) and Sp(I) is the variable part (space that depends on the
instance characteristics I, such as the input size).
import java.util.*;
public class Stack
{
static int stack[] = new int[10];
static int top = 0;

// push: inserts a value at the top of the stack (array capacity 10)
public static void push(int value)
{
if(top>9)
System.out.println("Stack Overflow");
else
{
stack[top] = value;
top++;
System.out.println("Insertion Successful");
}
}

// pop: removes and prints the value at the top of the stack
public static void pop()
{
if(top==0)
System.out.println("Stack Underflow");
else
{
System.out.println("Deleted: "+ stack[(top-1)] );
top--;
}
}

// display: prints the stack contents from top to bottom
public static void display()
{
if(top==0)
System.out.println("Stack Underflow");
else
{
System.out.println("The intended stack is: ");
for(int i = (top-1); i>=0; i--)
{
System.out.println(stack[i]);
}
}
}

// main: menu-driven test driver for the stack operations
public static void main(String args[])
{
Scanner sc = new Scanner(System.in);
int value;
while(true)
{
System.out.println("1. Push\n2. Pop\n3. Display\n4. Exit");
System.out.println("Enter your choice");
int choice = sc.nextInt();
switch(choice)
{
case 1:
System.out.println("Enter the value");
value = sc.nextInt();
push(value);
break;
case 2:
pop();
break;
case 3:
display();
break;
case 4:
System.out.println("Thank You!");
System.exit(0);
default:
System.out.println("Invalid Input!!");
}
}
}
}
import java.util.*;
public class Queue
{
static int queue[] = new int[10];
static int front = 0;
static int rear = 0;

// enqueue: inserts a value at the rear of the queue (array capacity 10)
public static void enqueue(int value)
{
if(rear > 9)
System.out.println("Queue Overflow");
else
{
queue[rear] = value;
rear++;
System.out.println("Insertion Successful");
}
}

// dequeue: removes and prints the value at the front of the queue
public static void dequeue()
{
if(front == rear)
System.out.println("Queue Underflow");
else
{
System.out.println("Deleted: " + queue[front]);
front++;
}
}

// display: prints the queue contents from front to rear
public static void display()
{
if(front == rear)
System.out.println("Queue Underflow");
else
{
System.out.println("The intended queue is: ");
for(int i = front; i < rear; i++)
{
System.out.println(queue[i]);
}
}
}

// main: menu-driven test driver for the queue operations
public static void main(String args[])
{
Scanner sc = new Scanner(System.in);
int value;
while(true)
{
System.out.println("1. Enqueue\n2. Dequeue\n3. Display\n4. Exit");
System.out.println("Enter your choice");
int choice = sc.nextInt();
switch(choice)
{
case 1:
System.out.println("Enter the value");
value = sc.nextInt();
enqueue(value);
break;
case 2:
dequeue();
break;
case 3:
display();
break;
case 4:
System.out.println("Thank You!");
System.exit(0);
default:
System.out.println("Invalid Input!!");
}
}
}
}
class Node {
int data;
Node next;
Node(int data)
{
this.data = data;
}
}

public class LinkedList
{
    Node head;

    public void insertAtStart(int data)
    {
        Node newNode = new Node(data);
        newNode.next = head;
        head = newNode;
    }

    public void insertAtEnd(int data)
    {
        Node newNode = new Node(data);
        if (head == null)
        {
            head = newNode;
            return;
        }
        Node temp = head;
        while (temp.next != null)
            temp = temp.next;
        temp.next = newNode;
    }

    public void deleteAtStart()
    {
        if (head != null)
            head = head.next;
    }

    public void deleteAtEnd()
    {
        if (head == null)
            return;
        if (head.next == null)
        {
            head = null;
            return;
        }
        Node temp = head;
        while (temp.next.next != null)
            temp = temp.next;
        temp.next = null;
    }

    public void insertAtIndex(int data, int idx)
    {
        if (idx == 0)
        {
            insertAtStart(data);
            return;
        }
        Node newNode = new Node(data);
        Node temp = head;
        for (int i = 0; i < idx - 1 && temp != null; i++)
            temp = temp.next;
        if (temp == null)
            return;                 // index is past the end of the list
        newNode.next = temp.next;
        temp.next = newNode;
    }
}
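A short driver for the LinkedList class above. The LinkedListDemo class and its printList helper are hypothetical additions for illustration only (they are not part of the original notes), and they assume both classes are compiled together so the head field is visible.

// Hypothetical demo exercising the LinkedList operations defined above.
class LinkedListDemo
{
    static void printList(Node head)                // helper added only for this demo
    {
        for (Node t = head; t != null; t = t.next)
            System.out.print(t.data + " -> ");
        System.out.println("null");
    }

    public static void main(String[] args)
    {
        LinkedList list = new LinkedList();
        list.insertAtEnd(10);
        list.insertAtEnd(20);
        list.insertAtStart(5);
        printList(list.head);                       // 5 -> 10 -> 20 -> null

        list.deleteAtStart();
        list.deleteAtEnd();
        printList(list.head);                       // 10 -> null
    }
}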
