01-Introduction to ADS
Data Structures
Introduction
Overview
Stack and Queue / Slide 2
The Lecturer
Course structure
Topics
Assessment
Course Policies
Lecturer's Information
- Albert Osei Owusu
- Mobile:
- Email: alowusu@gctu.edu.gh
- The class will meet for three hours every week (see the timetable).
Content
- Introduction
- Complexity and efficiency: time and space complexity, Big-O notation, analysis of algorithms
- Algorithms: searching and sorting, recursion, divide-and-conquer strategies, greedy algorithms
- Data Structures: arrays, lists, trees, stacks, queues, heaps, self-balancing trees, graphs and hash tables
Essential Reading
- McGrath, M. (2011) C++ Programming in Easy Steps. 4th ed. Southam: In Easy Steps.
Recommended Reading
- Cormen, Thomas H. (2009) Introduction to Algorithms. 3rd ed. Cambridge, Mass.: MIT Press.
Course Policies
- Class Participation: preparation and engaged participation at all class sessions are expected of all students.
Problem → Solution → Computer Program
Data Structures + Algorithms = Programs
Data Structures
Array
Linked List
Queue
Tree
Stack
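As a first taste of two of the structures listed above, here is a minimal sketch using the C++ standard library's std::stack (last in, first out) and std::queue (first in, first out). The function names are illustrative, not from the slides.

```cpp
#include <initializer_list>
#include <queue>
#include <stack>

// Push 1, 2, 3 into each container and see which element comes out first.
int first_out_of_stack() {
    std::stack<int> s;                 // LIFO: last in, first out
    for (int x : {1, 2, 3}) s.push(x);
    return s.top();                    // the most recently pushed element
}

int first_out_of_queue() {
    std::queue<int> q;                 // FIFO: first in, first out
    for (int x : {1, 2, 3}) q.push(x);
    return q.front();                  // the earliest pushed element
}
```

The same three insertions give different removal orders: the stack yields 3 first, the queue yields 1 first.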
Data Structures
- Data can be organized into:
  - primitive types
  - composite types
What is an Algorithm?
- An algorithm is a definite procedure for solving a problem in a finite number of steps.
Algorithm
- An algorithm is a set of instructions to solve a problem.
  - There can be multiple solutions (more than one algorithm) for a given problem.
  - An algorithm can be implemented using different programming languages on different platforms.
- An algorithm must be correct: it should correctly solve the problem.
  - e.g. a sorting algorithm must produce correct output even if (1) the input is already sorted, or (2) it contains repeated elements.
- Once we have a correct algorithm for a problem, we have to determine the efficiency of that algorithm.
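The correctness requirement can be exercised on exactly those edge cases. A minimal sketch, using std::sort as the algorithm under test (the slides do not name a specific sorting algorithm):

```cpp
#include <algorithm>
#include <vector>

// Return a sorted copy, so the same routine can be tested on several inputs.
std::vector<int> sorted(std::vector<int> v) {
    std::sort(v.begin(), v.end());
    return v;
}
```

A correct implementation must leave an already-sorted input unchanged and keep every copy of a repeated element.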
Efficient Algorithms
- The resources an algorithm consumes are time and space.
Measuring Efficiency
- The efficiency of an algorithm is a measure of the amount of resources consumed in solving a problem of size n.
- Is measuring the running time on a particular machine a fair comparison? The measured time depends on:
- processor speed
- processor type
- programming language
- quality of compiler
- size and nature of input
- operating system
Theoretical Analysis
- Uses a high-level description of the algorithm instead of an implementation
Analyzing an Algorithm
- Finding the running time of an algorithm
Analysis of Algorithms
Suppose an algorithm performs 5N + 3 steps on an input of size N:
- N = 10 → 53 steps
- N = 100 → 503 steps
- N = 1,000 → 5,003 steps
- N = 1,000,000 → 5,000,003 steps
Asymptotic Complexity
- The 5N + 3 time bound is said to "grow asymptotically" like N.
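The step counts above can be reproduced with a hypothetical routine whose cost model is exactly 5N + 3; the split into setup, per-iteration, and wrap-up operations is illustrative.

```cpp
// Count the "steps" of an algorithm that does 2 operations of setup,
// 5 operations per loop iteration, and 1 operation to finish: 5N + 3 total.
long count_steps(long N) {
    long steps = 2;                 // setup before the loop
    for (long i = 0; i < N; ++i)
        steps += 5;                 // 5 primitive operations per iteration
    return steps + 1;               // wrap-up after the loop
}
```

For N = 10, 100, 1,000,000 this yields 53, 503, and 5,000,003, matching the table; the constant 3 and factor 5 quickly stop mattering next to the growth in N.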
Time Complexity
- In computer science, the time complexity of an algorithm quantifies the amount of time taken by an algorithm to run as a function of the size of the input to the problem.
- For example, if the time required by an algorithm on all inputs of size n is at most 5n³ + 3n, the asymptotic time complexity is O(n³).
Big O Notation
- Informally, saying an algorithm is O(N) means that for large enough input sizes its running time increases at most linearly with the size of the input.
Big-Oh Rules
- If f(n) is a polynomial of degree d, then f(n) is O(n^d), i.e.:
  1. Drop lower-order terms
  2. Drop constant factors
- Example: 2n + 10 is O(n)
  - 2n + 10 ≤ cn
  - (c − 2)n ≥ 10
  - n ≥ 10/(c − 2)
  - Pick c = 3 and n0 = 10
Big-Oh Example
- The function n² is not O(n):
  - we would need n² ≤ cn, i.e. n ≤ c, for all n ≥ n0
  - the above inequality cannot be satisfied, since c must be a constant
- 3n³ + 20n² + 5 is O(n³)
- Need c > 0 and n0 ≥ 1 such that 3n³ + 20n² + 5 ≤ c·n³ for n ≥ n0
- This is true for c = 4 and n0 = 21
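The claim that n0 = 21 works (and is in fact tight for c = 4) can be checked numerically; this small sketch is not from the slides.

```cpp
// Does 3n^3 + 20n^2 + 5 <= 4 * n^3 hold for a given n?
bool bound_holds(long n) {
    return 3 * n * n * n + 20 * n * n + 5 <= 4 * n * n * n;
}
```

The inequality first holds at n = 21 (at n = 20 the left side is 32,005 against 32,000 on the right) and stays true afterwards, since the extra n³ on the right eventually dominates 20n² + 5.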
Orders of common functions
Growth-Rate Functions
O(1)         Time requirement is constant, independent of the problem's size.
O(log₂n)     Time requirement for a logarithmic algorithm increases slowly as the problem size increases.
O(n)         Time requirement for a linear algorithm increases directly with the size of the problem.
O(n·log₂n)   Time requirement for an n·log₂n algorithm increases more rapidly than for a linear algorithm.
O(n²)        Time requirement for a quadratic algorithm increases rapidly with the size of the problem.
O(n³)        Time requirement for a cubic algorithm increases more rapidly with the size of the problem than the time requirement for a quadratic algorithm.
O(2ⁿ)        As the size of the problem increases, the time requirement for an exponential algorithm increases too rapidly to be practical.
Growth-Rate Functions
- If an algorithm takes 1 second to run with problem size 8, what is the time requirement (approximately) for that algorithm with problem size 16?
- If its order is:
  O(1)         T(n) = 1 second
  O(log₂n)     T(n) = (1·log₂16) / log₂8 = 4/3 seconds
  O(n)         T(n) = (1·16) / 8 = 2 seconds
  O(n·log₂n)   T(n) = (1·16·log₂16) / (8·log₂8) = 8/3 seconds
  O(n²)        T(n) = (1·16²) / 8² = 4 seconds
  O(n³)        T(n) = (1·16³) / 8³ = 8 seconds
  O(2ⁿ)        T(n) = (1·2¹⁶) / 2⁸ = 2⁸ seconds = 256 seconds
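Each answer above comes from scaling the known time by the growth-rate function: T(16) = T(8) · g(16)/g(8). A sketch of that calculation, with illustrative function names:

```cpp
#include <cmath>

// Predict the running time at n = 16 from a 1-second run at n = 8,
// for an algorithm whose growth-rate function is g.
double time_at_16(double (*g)(double)) {
    const double t8 = 1.0;              // measured: 1 second at n = 8
    return t8 * g(16) / g(8);
}

double lin(double n)   { return n; }
double quad(double n)  { return n * n; }
double cube(double n)  { return n * n * n; }
double expo(double n)  { return std::pow(2.0, n); }
double logn(double n)  { return std::log2(n); }
double nlogn(double n) { return n * std::log2(n); }
```

Passing lin, quad, cube, and expo reproduces the 2, 4, 8, and 256-second entries of the table, and logn and nlogn give 4/3 and 8/3 seconds.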
Any Questions?