Advanced Algorithms
Introduction to Algorithms
Dr. Rubi Quiñones
Computer Science Department
Southern Illinois University Edwardsville
Course Timeline
Algorithmic Analysis
• Introduction to algorithms
• Time complexity
• Median finding and order statistics
Greedy Strategies
• Activity selection problems
• Water connection problem, Egyptian fractions
• Huffman (de)coding
• Shelves, mice, and policemen problems
Divide and Conquer
• Max subarray sum and nearest neighbor
• Newton's and bisection algorithms
• Matrix multiplication, skyline, and Hanoi
Dynamic Programming
• Fibonacci and path counting
• Coin row and collecting problems
• Matrix chain multiplication and longest common subsequence
• Knapsack and optimal binary trees
• Floyd–Warshall algorithm and A*
Advanced Concepts
• Algorithm intractability
• Randomized algorithms
Introduction to Algorithms
• What is an algorithm?
• Misconceptions
• Why does it matter?
• Preliminary Knowledge
• Asymptotic notation
• Upper and lower bounds
• Graphs
• Algorithm Techniques
• Proofs
What is an Algorithm?
An algorithm is a sequence of instructions that one must perform in order to solve a well-formulated problem.
[Figure: a problem is given to an algorithm; a computer runs the algorithm, transforming input into output]
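As a concrete illustration of this definition, here is a short algorithm (my sketch in Python, not from the slides) that performs a fixed sequence of instructions to solve a well-formulated problem: finding the largest value in a list.

```python
def maximum(values):
    """Return the largest element of a non-empty list.

    Well-formulated problem: the input is a non-empty list of numbers;
    the output is an element of the list that is >= every element.
    """
    best = values[0]
    for v in values[1:]:   # examine each remaining element once
        if v > best:
            best = v
    return best

print(maximum([3, 1, 4, 1, 5, 9, 2, 6]))  # → 9
```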
Clearing up Misconceptions
What is NOT important:
1. Memorizing the detailed instructions of many algorithms
2. Coding algorithms at record speed or from scratch

What IS important:
1. Recognizing an algorithm's performance characteristics (time and space complexity)
2. Analyzing the algorithm's correctness
Clearing up Misconceptions
1. Programmers don't learn standard algorithms in order to re-implement them. Knowing some of the details of these algorithms allows you to use them more effectively, because you know their strengths and weaknesses.
2. The purpose of a programming-job interview is to see how well you think, not to test which details you know.
Why Does it Matter?
• Problem: NASA was receiving images at such a low resolution and slow rate that it affected timely decisions.
  • 1964 – the equipment could take a 700×832 image with 29 channels, but only 200×200 pixels could be sent at a time, at a rate of 8 1/3 bits/sec. How long did it take to send that one image?
• Solution: Robert Nathan designed an algorithm in the late 1960s that improved the image pipeline.
  • 1971 – the rate increased to 16,200 bits/sec. How long would it take to send the same image?
[Images: the first image of Mars (1964) and a 1976 image of the Valles Marineris canyon on Mars]
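To make those questions concrete, here is a back-of-the-envelope calculation (my sketch, not from the slides; it assumes 6 bits per pixel, which the slide does not state):

```python
# Transmission-time estimate for the NASA example.
# Assumption (not stated on the slide): 6 bits per pixel.
BITS_PER_PIXEL = 6

def transmit_seconds(width, height, rate_bps, bits_per_pixel=BITS_PER_PIXEL):
    """Seconds needed to send one width x height image at rate_bps."""
    return width * height * bits_per_pixel / rate_bps

slow = transmit_seconds(200, 200, 8 + 1/3)   # 1964 link
fast = transmit_seconds(200, 200, 16_200)    # 1971 link
print(f"1964: {slow / 3600:.1f} hours")      # → 1964: 8.0 hours
print(f"1971: {fast:.1f} seconds")           # → 1971: 14.8 seconds
```

Under that assumption, a single 200×200 frame drops from roughly eight hours to about fifteen seconds, which is why the algorithmic improvement mattered.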
Introduction to Algorithms
• What is an algorithm?
• Misconceptions
• Why does it matter?
• Preliminary Knowledge
• Asymptotic notation
• Upper and lower bounds
• Graphs
• Algorithm Techniques
Preliminary Knowledge
Approaches:
1. Theoretical analysis
2. Empirical analysis

First, you would need to identify:
1. How good is the algorithm?
   1. Time efficiency/complexity
   2. Space efficiency/complexity
   3. Correctness
2. Does there exist a better algorithm?
   1. Lower bounds
   2. Optimality
   3. Modularity
How Would Someone Approach Designing the Algorithm?
• Correctness
  • Does the algorithm do what it's supposed to do for all inputs? Sometimes you can test this empirically, but when there are infinitely many inputs, it becomes a theoretical question.
• Time and Space Complexity
  • Time complexity is the number of operations performed
  • Space complexity is the amount of memory used
• Time and Space Efficiency
  • An algorithm is efficient if its worst-case time and space complexity is O(n^c) for some constant c, where n is the input size
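Empirical correctness testing can only sample the input space, but it is still useful in practice. A small sketch (my illustration, not from the slides) that checks a hypothetical insertion_sort against Python's trusted built-in sorted on many random inputs:

```python
import random

def insertion_sort(a):
    """Algorithm under test: sort a list in place and return it."""
    for i in range(1, len(a)):
        key = a[i]
        j = i - 1
        while j >= 0 and a[j] > key:  # shift larger elements right
            a[j + 1] = a[j]
            j -= 1
        a[j + 1] = key
    return a

# Empirical check against a reference implementation on random inputs.
# This can expose bugs, but it cannot prove correctness for all inputs.
for _ in range(1000):
    xs = [random.randint(-50, 50) for _ in range(random.randint(0, 20))]
    assert insertion_sort(xs.copy()) == sorted(xs)
print("all random tests passed")
```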
Asymptotic Notation
• Growth of a function
  • Big-O: asymptotic upper bound (can very loosely and informally be thought of as a ≤ relation between functions)
  • Big-Ω: asymptotic lower bound (loosely, a ≥ relation)
  • Big-Θ: asymptotic tight bound (loosely, an = relation)
  • Little-o: upper bound, not asymptotically tight (loosely, a < relation)
  • Little-ω: lower bound, not asymptotically tight (loosely, a > relation)
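The informal relations above correspond to the standard formal definitions, stated here for completeness (these are the textbook definitions, not shown on the slides):

```latex
\begin{align*}
O(g(n))      &= \{\, f(n) : \exists\, c, n_0 > 0 \text{ s.t. } 0 \le f(n) \le c\,g(n) \ \forall n \ge n_0 \,\}\\
\Omega(g(n)) &= \{\, f(n) : \exists\, c, n_0 > 0 \text{ s.t. } 0 \le c\,g(n) \le f(n) \ \forall n \ge n_0 \,\}\\
\Theta(g(n)) &= O(g(n)) \cap \Omega(g(n))\\
o(g(n))      &= \{\, f(n) : \forall c > 0\ \exists\, n_0 > 0 \text{ s.t. } 0 \le f(n) < c\,g(n) \ \forall n \ge n_0 \,\}\\
\omega(g(n)) &= \{\, f(n) : \forall c > 0\ \exists\, n_0 > 0 \text{ s.t. } 0 \le c\,g(n) < f(n) \ \forall n \ge n_0 \,\}
\end{align*}
```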
Time and Space Complexity
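To build intuition for how sharply these complexity classes separate, a quick comparison of operation counts at a few input sizes (my illustration, not from the slides):

```python
import math

# Operation counts for common time complexities at increasing input sizes.
for n in (10, 100, 1000):
    print(f"n={n:>5}: "
          f"log n ≈ {math.log2(n):6.1f}, "
          f"n log n ≈ {n * math.log2(n):10.0f}, "
          f"n^2 = {n**2:8d}, "
          f"2^n has {len(str(2**n)):4d} digits")
```

Even at n = 1000, the polynomial counts are easily computable, while 2^n already has over 300 digits; this is the gap behind the O(n^c) efficiency criterion above.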
Upper and Lower Bounds
• These are time and/or space boundaries for specific algorithms or for general problems.
• If we say "upper bound of an algorithm":
  • An algorithm A has an upper bound of f(n) for input size n if there exists no input of size n such that A requires more than f(n) time
• If we say "upper bound of a problem":
  • A problem has an upper bound of f(n) if there exists at least one algorithm with an upper bound of f(n)
  • Since Mergesort has a worst-case time complexity of O(n log n), the sorting problem has an upper bound of O(n log n)
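The Mergesort mentioned above can be sketched in a few lines (a standard implementation in Python, not taken from the slides); it witnesses the O(n log n) upper bound for sorting:

```python
def merge_sort(a):
    """Sort a list with merge sort: O(n log n) worst-case time."""
    if len(a) <= 1:                 # base case: already sorted
        return a
    mid = len(a) // 2
    left = merge_sort(a[:mid])      # recursively sort each half
    right = merge_sort(a[mid:])
    # merge the two sorted halves in linear time
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged

print(merge_sort([5, 2, 9, 1, 5, 6]))  # → [1, 2, 5, 5, 6, 9]
```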
Upper and Lower Bounds
• If we say "lower bound of an algorithm":
  • An algorithm A has a lower bound of f(n) for input size n if there exists no input of size n such that A requires less than f(n) time
• If we say "lower bound of a problem":
  • A problem has a lower bound of f(n) if, for any algorithm A that solves the problem, there exists at least one input of size n that forces A to take at least f(n) time and/or space
  • Since every comparison-based sorting algorithm has an input of size n forcing Ω(n log n) steps, the sorting problem has a time complexity lower bound of Ω(n log n)
Graphs
[Figures: examples of a directed graph, an undirected graph, and a weighted graph]
You can represent a graph as:
1. An adjacency list
2. An adjacency matrix
Represent a Graph
Adjacency List
• For each vertex v ∈ V, store a list of the vertices adjacent to v
• For weighted graphs, add the weight information to each node
• How much space is required for storage?

Adjacency Matrix
• Use an n×n matrix M, where M[i, j] = 1 if (i, j) is an edge, and 0 otherwise
• If G is weighted, store the weights in the matrix, using ∞ for non-edges
• How much space is required for storage?
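Both representations can be built in a few lines; this sketch (my illustration, with a small made-up weighted directed graph) shows the space trade-off: the list uses O(V + E) space, the matrix O(V²).

```python
import math

# A small weighted, directed graph: vertices 0..3, edges (u, v, weight).
edges = [(0, 1, 5.0), (0, 2, 3.0), (1, 3, 2.0), (2, 3, 7.0)]
n = 4

# Adjacency list: for each vertex, a list of (neighbor, weight) pairs.
adj_list = [[] for _ in range(n)]
for u, v, w in edges:
    adj_list[u].append((v, w))

# Adjacency matrix: M[i][j] holds the edge weight, or infinity for non-edges.
adj_matrix = [[math.inf] * n for _ in range(n)]
for u, v, w in edges:
    adj_matrix[u][v] = w

print(adj_list[0])       # → [(1, 5.0), (2, 3.0)]
print(adj_matrix[0][1])  # → 5.0
print(adj_matrix[1][0])  # → inf (no edge in this direction)
```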
Introduction to Algorithms
• What is an algorithm?
• Misconceptions
• Why does it matter?
• Preliminary Knowledge
• Asymptotic notation
• Upper and lower bounds
• Graphs
• Algorithm Techniques
• Proofs
Algorithm Techniques
1. Greedy Algorithms
   1. Similar to DP in that we examine subproblems, exploiting the optimal substructure property
   2. Instead of considering all possible subproblems (as in DP), at each step we commit to one subproblem by making the greedy choice (the locally optimal choice)
2. Divide and Conquer
   1. Splits a problem into subproblems, solves each subproblem recursively, and combines the subsolutions into a final solution (not limited to optimization problems)
   2. Often analyzed via recurrence relations
3. Dynamic Programming (DP)
   1. Used for optimization problems, where we need to choose a "best" solution as evaluated by an objective function
   2. We decompose a problem into subproblems, optimally solve them recursively, and combine the solutions
   3. There is typically an exponential number of subproblems, but many overlap, so we can reuse solutions instead of re-solving them
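The overlapping-subproblem idea behind DP shows up already in the Fibonacci example from the course timeline; a minimal memoized sketch (my illustration, not from the slides):

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def fib(n):
    """nth Fibonacci number via top-down DP (memoization).

    Without the cache, the recursion re-solves an exponential number of
    overlapping subproblems; with it, each subproblem is solved once,
    giving O(n) time.
    """
    if n < 2:
        return n
    return fib(n - 1) + fib(n - 2)

print(fib(10))  # → 55
print(fib(50))  # → 12586269025 (instant, thanks to memoization)
```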
Pseudocode vs Code
Pseudocode matters when:
1. You're designing and/or planning a solution
2. You need to communicate ideas to team members
3. The solution will later be converted to a different language
You may use LaTeX's algpseudocode package to help write your pseudocode, or Word's pseudocode plugin.