Lecture 17 - Bellman-Ford and Arbitrage: Eric A. Autry

The Bellman-Ford algorithm finds shortest paths in a graph even when negative edge costs exist. It performs a sequence of safe distance updates, updating each vertex's distance based on its neighbors, for |V| − 1 iterations, where V is the set of vertices. This guarantees that every update needed for correct shortest paths eventually occurs. However, if a negative cost cycle exists in the graph, shortest paths are undefined, since the distance to vertices on the cycle can be driven to negative infinity by traversing the cycle repeatedly.


Lecture 17 - Bellman-Ford and Arbitrage

Eric A. Autry

1 / 31
Last time: Dijkstra’s Algorithm

This time: Bellman-Ford and Arbitrage

Next time: Floyd-Warshall (Shortest Path meets DP Tables)

HW 6 due this Thursday, the 8th.

Project 2 due Friday the 16th.

2 / 31
Dijkstra’s Algorithm Implementation
def dijkstra(graph, start):
    # Set initial dist and prev.
    for vertex in graph:
        vertex.dist = ∞
        vertex.prev = null
    start.dist = 0
    # Make the queue out of the distances.
    queue = makequeue(dists)
    # Loop while there are vertices left to visit.
    while not queue.isEmpty():
        # Find the next vertex to visit.
        u = deletemin(queue)
        # Check each neighbor of u.
        # Update predictions and previous vertex.
        for neighbor of u:
            if neighbor.dist > u.dist + length(u, neighbor):
                neighbor.dist = u.dist + length(u, neighbor)
                neighbor.prev = u
                decreasekey(queue, neighbor)
3 / 31
Dijkstra’s Algorithm

Main Idea:
▶ Make distance predictions.
▶ Iteratively find the next closest vertex.
▶ Then use that vertex to update the predictions.

Major Assumption:
▶ Every vertex on the shortest path from the start to vertex v is closer to the start than vertex v.

Problem:
▶ What happens if we have negative cost edges?

4 / 31
Negative Cost Edges
Problem:
▶ What happens if we have negative cost edges?

Our major assumption is wrong! Vertex v is closer than the vertices on the path...

5 / 31
Safe Updates

What can we do? We would still like to use the major idea of
Dijkstra’s algorithm if possible.

Important Note: in the algorithm, the dist values are either overestimates or exactly correct; they are never underestimates.

They start at infinity and are only ever updated along an edge from u to v following the rule:

v.dist = min( v.dist, u.dist+length(u,v) )

This is because the distance to v cannot possibly be more than the distance to u plus the length of the edge from u to v.
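In code, this safe update is the classic "relax" step. A minimal sketch (the function name and the dict-based distances are illustrative, not from the slides):

```python
import math

def relax(dist, u, v, length_uv):
    # Safe update: dist[v] can only decrease, and it never drops
    # below the true shortest-path distance.
    if dist[u] + length_uv < dist[v]:
        dist[v] = dist[u] + length_uv
        return True   # the prediction improved
    return False

# Distances start at infinity, except the source.
dist = {'s': 0, 'u': math.inf, 'v': math.inf}
relax(dist, 's', 'u', 4)    # dist['u'] becomes 4
relax(dist, 'u', 'v', -2)   # dist['v'] becomes 2: negative edges are fine here
```

Note that the update is a no-op when the new value is not better, which is what makes repeating it harmless.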

6 / 31
Safe Updates
Distances are updated following the rule:

v.dist = min( v.dist, u.dist+length(u,v) )

This rule has two important properties:

1. It gives the correct distance to v if we are in the case where u was the previous vertex on the shortest path to v.

2. It will never make v.dist too small, because the prediction can never drop below the true shortest path distance.

▶ This is important because it means we are making safe predictions.

▶ Repeatedly updating the predictions can never hurt us as long as we follow this rule.

7 / 31
Sequences of Safe Updates

This update step is very useful: it is harmless, and if used carefully it can give us the correct distances.

In fact, we can view Dijkstra’s Algorithm as a sequence of update operations performed in a specific order.

This sequence doesn’t get us correct distances when there are negative edges, but maybe another sequence of updates can.

8 / 31
Sequences of Safe Updates
Can we find a sequence of updates that will give us the correct answer?

Consider the shortest path from vertex s to vertex t that goes through vertices u1, u2, u3, ..., uk.

What order do the predictions need to be made in to give a correct distance for vertex t?

s → u1, u1 → u2, u2 → u3, ..., uk → t

Note: other updates can occur in between because updates are safe. Once each of these updates occurs, it will fix any overestimated distances.

So as long as these updates are all performed in this order (with potentially other updates in between), we will arrive at the correct distances.
9 / 31
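A tiny illustration of this ordering argument; the path s → u1 → u2 → t and its edge lengths are made up for the example. Performing the safe updates in path order fixes each distance in turn:

```python
import math

# Hypothetical path s -> u1 -> u2 -> t, including one negative edge.
edges = {('s', 'u1'): 2, ('u1', 'u2'): -3, ('u2', 't'): 4}

dist = {'s': 0, 'u1': math.inf, 'u2': math.inf, 't': math.inf}

# Apply the safe updates in path order: s->u1, then u1->u2, then u2->t.
for u, v in [('s', 'u1'), ('u1', 'u2'), ('u2', 't')]:
    dist[v] = min(dist[v], dist[u] + edges[(u, v)])

# dist['t'] is now 2 + (-3) + 4 = 3
```

Interleaving other safe updates between these three would not change the final answer, since no safe update can push a distance below its true value.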
Sequences of Safe Updates
But we don’t know the shortest path beforehand, so how can we be sure we make those updates in that specific order?

Note: how many total updates could be required? |V| − 1

Idea: update along all of the edges in the graph |V| − 1 times.

▶ All of the necessary updates occur each iteration.
▶ Since we iterate |V| − 1 times, we know that all of the necessary updates were eventually done in order.
▶ The first iteration correctly did update s → u1, the second iteration correctly did update u1 → u2, etc.

Repeating an update, or having other unnecessary updates, is fine because updates are safe. Once the correct update gets done, the value of the distance will be correct.
10 / 31
Bellman-Ford Implementation
def bellmanFord(graph, start):
    # Set initial dist and prev.
    for vertex in graph:
        vertex.dist = ∞
        vertex.prev = null
    start.dist = 0
    # Iterate |V| − 1 times.
    for iter in range(0, |V| − 1):
        # Look at each vertex.
        for u in graph:
            # Check each neighbor of u.
            # Update predictions and previous vertex.
            for neigh of u:
                # Only update if the new value is better!
                if neigh.dist > u.dist + length(u, neigh):
                    neigh.dist = u.dist + length(u, neigh)
                    neigh.prev = u

11 / 31
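A runnable version of the pseudocode above, sketched with an adjacency-dict graph. The edge weights below are reconstructed from the example slides that follow, so treat them as an assumption:

```python
import math

def bellman_ford(graph, start):
    # graph: {u: [(v, length), ...]}
    dist = {v: math.inf for v in graph}
    prev = {v: None for v in graph}
    dist[start] = 0
    for _ in range(len(graph) - 1):          # |V| - 1 iterations
        for u in graph:
            for v, length in graph[u]:
                if dist[u] + length < dist[v]:   # safe update
                    dist[v] = dist[u] + length
                    prev[v] = u
    return dist, prev

# Edge weights reconstructed from the example slides (assumption).
graph = {
    'S': [('A', 10), ('G', 8)],
    'A': [('E', 2)],
    'B': [('C', 1)],
    'C': [('D', 3)],
    'D': [('E', -1)],
    'E': [('B', -2)],
    'F': [('A', -4), ('E', -1)],
    'G': [('F', 1)],
}
dist, prev = bellman_ford(graph, 'S')
# dist matches the final slide: S:0, A:5, B:5, C:6, D:9, E:7, F:9, G:8
```

Note that `math.inf + length` is still `math.inf` in Python, so unreached vertices never trigger a spurious update.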
Bellman-Ford Example

12 / 31
Bellman-Ford Example

Cost/Prev:
S → (0, None), A → (∞, None), B → (∞, None),
C → (∞, None), D → (∞, None), E → (∞, None),
F → (∞, None), G → (∞, None)

13 / 31
Bellman-Ford Example

Cost/Prev:
S → (0, None), A → (10, S), B → (∞, None),
C → (∞, None), D → (∞, None), E → (∞, None),
F → (∞, None), G → (8, S)

14 / 31
Bellman-Ford Example

Cost/Prev:
S → (0, None), A → (10, S), B → (∞, None),
C → (∞, None), D → (∞, None), E → (12, A),
F → (9, G), G → (8, S)

15 / 31
Bellman-Ford Example

Cost/Prev:
S → (0, None), A → (5, F), B → (10, E),
C → (∞, None), D → (∞, None), E → (8, F),
F → (9, G), G → (8, S)

16 / 31
Bellman-Ford Example

Cost/Prev:
S → (0, None), A → (5, F), B → (6, E),
C → (11, B), D → (∞, None), E → (7, A),
F → (9, G), G → (8, S)

17 / 31
Bellman-Ford Example

Cost/Prev:
S → (0, None), A → (5, F), B → (5, E),
C → (7, B), D → (14, C), E → (7, A),
F → (9, G), G → (8, S)

18 / 31
Bellman-Ford Example

Cost/Prev:
S → (0, None), A → (5, F), B → (5, E),
C → (6, B), D → (10, C), E → (7, A),
F → (9, G), G → (8, S)

19 / 31
Bellman-Ford Example

Cost/Prev:
S → (0, None), A → (5, F), B → (5, E),
C → (6, B), D → (9, C), E → (7, A),
F → (9, G), G → (8, S)

20 / 31
Bellman-Ford Example

Cost/Prev:
S → (0, None), A → (5, F), B → (5, E),
C → (6, B), D → (9, C), E → (7, A),
F → (9, G), G → (8, S)

21 / 31
Bellman-Ford Example

Cost/Prev:
S → (0, None), A → (5, F), B → (5, E),
C → (6, B), D → (9, C), E → (7, A),
F → (9, G), G → (8, S)

22 / 31
Bellman-Ford Runtime

What is the runtime of Bellman-Ford?

Each edge is updated each iteration, with |V| − 1 iterations:

O(|V| · |E|)

Recall that in a dense graph with n vertices, |E| ∈ O(n²), so the worst case is O(n³).

This is a worse runtime than Dijkstra’s, but it can handle negative cost edges.

23 / 31
Negative Cost Cycles
What happens if we change the weight of the edge between E and B to −4?

The cost of going through the cycle E → B → C → D → E is −1.

What is the shortest path to E?
S → G → F → A → E = 7
But going around that cycle subtracts 1 each time!
24 / 31
Negative Cost Cycles
Shortest path algorithms are invalid when negative cost cycles exist, because there are no shortest paths!

Bellman-Ford can actually detect these negative cost cycles:

▶ Run for 1 extra iteration; if any values change, there is a negative cost cycle.
▶ Why?
▶ At the end of Bellman-Ford, all vertices have the correct shortest path distances.
▶ After |V| − 1 iterations, even the longest possible path has been accounted for.
▶ At this point, a distance could only change on a new update if a cycle is involved (i.e., the path had more than |V| vertices).
26 / 31
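The extra-iteration check can be sketched like this (the adjacency-dict format and the small test graph are illustrative):

```python
import math

def bellman_ford_detects_cycle(graph, start):
    # graph: {u: [(v, length), ...]}
    dist = {v: math.inf for v in graph}
    dist[start] = 0
    for _ in range(len(graph) - 1):          # the usual |V| - 1 iterations
        for u in graph:
            for v, length in graph[u]:
                if dist[u] + length < dist[v]:
                    dist[v] = dist[u] + length
    # One extra iteration: any further improvement means a negative cost cycle.
    for u in graph:
        for v, length in graph[u]:
            if dist[u] + length < dist[v]:
                return True
    return False

# Cycle B -> C -> D -> B has total cost 1 + 1 - 3 = -1.
graph = {'S': [('B', 4)], 'B': [('C', 1)], 'C': [('D', 1)], 'D': [('B', -3)]}
# bellman_ford_detects_cycle(graph, 'S') returns True
```

This only detects negative cycles reachable from the start vertex, which is all the slides need here.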
Negative Cost Cycles
Application: finance.

Let’s consider currency exchange rates.

Say that:
▶ 1 USD buys 0.82 Euro,
▶ 1 Euro buys 129.7 Yen,
▶ 1 Yen buys 12 Turkish Lira,
▶ 1 Turkish Lira buys 0.0008 USD.

Then 1 USD buys:

0.82 · 129.7 · 12 · 0.0008 ≈ 1.02 USD

We made a 2% profit!

This is called arbitrage.


27 / 31
Negative Cost Cycles
Let’s encode these exchange rates as a graph.

Say that we have exchange rates where one unit of currency ci buys R(ci, cj) units of currency cj.

Our goal is to find a cycle of currencies c1, c2, c3, ..., ck where:

R(c1, c2) · R(c2, c3) · · · · · R(ck, c1) > 1.

Note that this is equivalent to:

1/R(c1, c2) · 1/R(c2, c3) · · · · · 1/R(ck, c1) < 1.

Clever trick: take the log of both sides:

log(1/R(c1, c2)) + log(1/R(c2, c3)) + · · · + log(1/R(ck, c1)) < 0.
28 / 31
Negative Cost Cycles
We are looking for a cycle of currencies such that

log(1/R(c1, c2)) + log(1/R(c2, c3)) + · · · + log(1/R(ck, c1)) < 0.

So we will use these logarithmic values as our edge weights:

length(u, v) = log(1/R(u, v)) = −log R(u, v).

Once we have created our currency graph...

▶ If Bellman-Ford detects a negative cost cycle, there is an arbitrage opportunity.
▶ Consider the vertices that changed values in that final iteration and follow their prev values to find the cycle.

This will be project 3.


29 / 31
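Putting the pieces together, a sketch of the arbitrage check using the rates from the earlier slide (the dict layout and variable names are my own, not the project's API):

```python
import math

# Exchange rates from the slides: USD -> Euro -> Yen -> Lira -> USD.
rates = {
    ('USD', 'EUR'): 0.82,
    ('EUR', 'JPY'): 129.7,
    ('JPY', 'TRY'): 12.0,
    ('TRY', 'USD'): 0.0008,
}
currencies = ['USD', 'EUR', 'JPY', 'TRY']

# Edge weight: length(u, v) = -log R(u, v).
edges = [(u, v, -math.log(r)) for (u, v), r in rates.items()]

# Start every distance at 0 so a negative cycle is found wherever it sits.
dist = {c: 0.0 for c in currencies}
for _ in range(len(currencies) - 1):
    for u, v, w in edges:
        if dist[u] + w < dist[v]:
            dist[v] = dist[u] + w

# Extra iteration: an improvement means a negative cycle, i.e. arbitrage.
arbitrage = any(dist[u] + w < dist[v] for u, v, w in edges)
# arbitrage is True here, since the cycle multiplies out to about 1.02 > 1
```

Starting all distances at 0 (instead of adding a separate source vertex) is a common trick for finding a negative cycle anywhere in the graph.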
Shortest Paths between All Vertices
What do I do if I want to find the shortest paths between all pairs of vertices?

I could run Bellman-Ford |V| times, taking each vertex as the starting vertex.

This would have a runtime of O(|V|² · |E|), which could be O(n⁴) worst case.

It turns out we can do better.

When we re-run Bellman-Ford, we are throwing away a lot of information. It seems like there should be a way to store that info...

Idea: consider a use-it-or-lose-it approach and use dynamic programming!

This will give us Floyd-Warshall (next time).


30 / 31
Course Feedback
I would like you all to take a course feedback survey at:

http://duke.is/xdRkXT

I would like this feedback to give me a better sense of where you all stand with the class, how you feel about the pacing and difficulty, and what your backgrounds are.

I will use this feedback to potentially adjust the pace of the lectures going forward.

This survey will be completely anonymous, so please feel free to share negative feedback along with any positive feedback you have.

Please complete the survey by TODAY. It should only take a couple minutes.
31 / 31
