AIE455
ROBOT MAPPING
AND LOCALIZATION
DR. SAFA A. ELASKARY
FALL 2024-2025
Lecture Outline
01 • STATE ESTIMATION OVER TIME
02 • CONTROL ACTIONS
03 • SENSOR MODEL
04 • OUR BAYES' NET
STATE ESTIMATION OVER TIME
[Figure: Markov chain of states xt-1 → xt → xt+1]
• An example of a "causal" chain in a Bayes' Net.
• What is the state transition probability?
P(Xt | X0:t-1) = P(Xt | Xt-1)
• We are also assuming stationary dynamics here:
P(Xt | Xt-1) = P(Xt-1 | Xt-2)
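As a concrete sketch of these two assumptions (the two-state chain and its numbers are illustrative, not from the lecture), stationary dynamics mean one transition table P(Xt | Xt-1) is reused at every time step, and the Markov property means sampling the next state only needs the previous one:

```python
import random

# Hypothetical two-state chain; the probabilities are illustrative only.
# Stationarity: the SAME table P(X_t | X_{t-1}) applies at every step.
P_TRANS = {
    "open":   {"open": 0.9, "closed": 0.1},
    "closed": {"open": 0.3, "closed": 0.7},
}

def step(x_prev, rng):
    """Sample x_t ~ P(X_t | X_{t-1} = x_prev): only x_prev matters (Markov)."""
    r = rng.random()
    cum = 0.0
    for x_t, p in P_TRANS[x_prev].items():
        cum += p
        if r < cum:
            return x_t
    return x_t  # guard against floating-point round-off

rng = random.Random(0)       # seeded for reproducibility
x = "closed"
trajectory = [x]
for _ in range(5):
    x = step(x, rng)         # each step conditions only on the current x
    trajectory.append(x)
```

Each row of the table is a conditional distribution over the next state, so each row must sum to one.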
CONTROL ACTIONS
• Examples: robot motion, manipulating objects.
[Figure: states xt-1 → xt → xt+1 with controls ut-1, ut, ut+1]
• We assume the robot executes ut (an "action") each time step.
• We assume the following transition function dynamics:
State transition probability
P(Xt | X0:t-1, U0:t) = P(Xt | Xt-1, Ut)   (Markov property)
SENSOR MODEL
• We also assume the robot has sensors to estimate the state.
• Measurement data examples: camera images, ultrasonic sensor output.
[Figure: states xt-1, xt, xt+1 with controls ut-1, ut, ut+1 and measurements zt-1, zt, zt+1]
Measurement probability
P(Zt | X0:t, Z0:t-1, U0:t) = P(Zt | Xt)   (Markov property)
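A minimal sketch of such a measurement model (the state names and probabilities below are illustrative assumptions): the observation likelihood depends only on the current state.

```python
# Hypothetical measurement model P(Z_t | X_t): the observation depends only on
# the current state x_t, not on earlier states, measurements, or controls.
P_SENSE = {
    "open":   {"sense_open": 0.6, "sense_closed": 0.4},
    "closed": {"sense_open": 0.2, "sense_closed": 0.8},
}

def likelihood(z_t, x_t):
    """Return P(z_t | x_t)."""
    return P_SENSE[x_t][z_t]
```

Each row is a conditional distribution over observations given a state, so each row sums to one.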
OUR BAYES’ NET
• Our Bayesian network:
• Characterizes evolution of the controls (u), states (x), and measurements (z).
• Makes assumptions that simplify conditional independence calculations.
• Stationary dynamics:
P(Xt | Xt-1, Ut) = P(Xt-1 | Xt-2, Ut-1)
• Markov assumption on states (the prior state transition equation) and measurements:
P(Zt | X0:t, U0:t) = P(Zt | Xt)
BELIEFS
• A belief represents the robot’s internal knowledge about the environment
state (in contrast to true state).
• The state cannot be measured directly (e.g., position of the robot is technically
approximate, never “exactly” true).
• Assign probabilities to each hypothesis about the true state.
• Beliefs over state variables conditioned on the data are posterior probabilities:
bel(xt) = p(xt | z1:t, u1:t)   — the belief over the state xt.
bel̄(xt) = p(xt | z1:t-1, u1:t)   — the prediction (overbar): the belief over xt before
incorporating the measurement zt.
BAYES FILTERS
• Bayes Filter: a general algorithm for calculating beliefs.
• Recursive algorithm: bel(xt) is calculated from bel(xt-1):
bel̄(xt) = Σ_{xt-1} p(xt | ut, xt-1) bel(xt-1)   (prediction step)
bel(xt) = η p(zt | xt) bel̄(xt)   (measurement update)
• The above shows one iteration of the algorithm at time t.
• We will show that it correctly computes bel(xt) from bel(xt-1).
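One iteration of the filter can be sketched for a discrete state space (sums in place of the integrals used for continuous states; the dict representation and function names are my own):

```python
def bayes_filter_step(bel_prev, u_t, z_t, p_trans, p_sense, states):
    """One Bayes Filter iteration over a finite state space.

    bel_prev: dict mapping state -> bel(x_{t-1})
    p_trans(x_t, u_t, x_prev) = P(x_t | u_t, x_{t-1})
    p_sense(z_t, x_t) = P(z_t | x_t)
    """
    # Prediction: bel_bar(x_t) = sum over x_{t-1} of P(x_t | u_t, x_{t-1}) bel(x_{t-1})
    bel_bar = {x: sum(p_trans(x, u_t, xp) * bel_prev[xp] for xp in states)
               for x in states}
    # Measurement update: bel(x_t) = eta * P(z_t | x_t) * bel_bar(x_t)
    unnorm = {x: p_sense(z_t, x) * bel_bar[x] for x in states}
    eta = 1.0 / sum(unnorm.values())  # normalizing factor
    return {x: eta * v for x, v in unnorm.items()}
```

With a uniform prior, an action that leaves the state unchanged, and the door sensor model used later in the lecture, one step with observation "sense_open" yields bel(open) = 0.75, matching the worked example.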
Assumption: the state xt is complete (it summarizes everything relevant from the past).
Recall Bayes' Rule: P(A, B) = P(A | B) P(B), so P(A | B) = P(A, B) / P(B).

bel(xt) = p(xt | z1:t, u1:t)
= η p(zt | xt, z1:t-1, u1:t) p(xt | z1:t-1, u1:t)   (Bayes' Rule; η is the normalizing factor)
= η p(zt | xt) p(xt | z1:t-1, u1:t)   (1: assume xt complete)
= η p(zt | xt) Σ_{xt-1} p(xt | xt-1, z1:t-1, u1:t) p(xt-1 | z1:t-1, u1:t)   (2: total probability; shown as a sum, but it could be an integral)
= η p(zt | xt) Σ_{xt-1} p(xt | xt-1, ut) bel(xt-1)   (assume xt-1 complete; xt-1 doesn't depend on ut)
BAYES FILTERS
• Putting it all together, we have:
bel(xt) = η p(zt | xt) Σ_{xt-1} p(xt | ut, xt-1) bel(xt-1)
• Hence, the Bayes Filter computes the posterior correctly.
• Recall the pseudocode, with integrals in place of sums for continuous states.
BAYES FILTER EXAMPLE
• Assume a robot estimates the state of a door using its camera.
• Environment assumptions:
• States: {open, closed}
• Observations: {sense_open, sense_closed}
• Actions: {push, do_nothing}
• We also assume a noisy sensor model:
P(sense_open | open) = 0.6    P(sense_closed | open) = 0.4
P(sense_open | closed) = 0.2  P(sense_closed | closed) = 0.8
• Intuition: the sensor is more reliable at detecting a closed door (error = 0.2)
than an open one (error = 0.4).
BAYES FILTER EXAMPLE
The robot can use its manipulator to push the door:
P(open | push, open) = 1.0    P(closed | push, open) = 0.0
P(open | push, closed) = 0.8  P(closed | push, closed) = 0.2
Or the robot can do nothing, which leaves the state unchanged:
P(open | do_nothing, open) = 1.0    P(closed | do_nothing, closed) = 1.0
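The two action models can be written as conditional tables (the probabilities here are the standard values for this door example, consistent with the normalizers computed in the worked slides that follow):

```python
# Action models P(x_t | u_t, x_{t-1}) for the door example.
P_ACTION = {
    "push": {
        "open":   {"open": 1.0, "closed": 0.0},  # pushing an open door: stays open
        "closed": {"open": 0.8, "closed": 0.2},  # pushing a closed door: opens w.p. 0.8
    },
    "do_nothing": {
        "open":   {"open": 1.0, "closed": 0.0},  # state is unchanged
        "closed": {"open": 0.0, "closed": 1.0},
    },
}

# Sanity check: every conditional distribution over x_t sums to one.
for u, table in P_ACTION.items():
    for x_prev, dist in table.items():
        assert abs(sum(dist.values()) - 1.0) < 1e-9
```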
BAYES FILTER EXAMPLE (T=0)
• Suppose at time t=1 the robot takes no control action and senses an
open door. What is the resulting posterior belief bel(x1)?
• Before answering this, we need a prior belief.
• Let's assign equal prior probability to the two states:
bel(x0 = open) = bel(x0 = closed) = 0.5
BAYES FILTER EXAMPLE (T=1)
Suppose at time t=1 the robot takes no control action and senses an open door.
What is the resulting posterior belief bel(x1)?
Recall that the Bayes Filter had two major steps. This is the first, before
incorporating the measurement:
bel̄(x1) = Σ_{x0} p(x1 | u1, x0) bel(x0)
BAYES FILTER EXAMPLE (T=1)
One hypothesis: x1 is open.
bel̄(x1 = open) = p(open | do_nothing, open) bel(x0 = open) + p(open | do_nothing, closed) bel(x0 = closed)
= 1.0 · 0.5 + 0.0 · 0.5 = 0.5
Second hypothesis: x1 is closed.
bel̄(x1 = closed) = p(closed | do_nothing, open) bel(x0 = open) + p(closed | do_nothing, closed) bel(x0 = closed)
= 0.0 · 0.5 + 1.0 · 0.5 = 0.5
BAYES FILTER EXAMPLE (T=1)
After processing the control, we went from the prior to the updated beliefs:
Our updated belief equals the prior belief: the robot's action was to
do nothing, and we haven't incorporated the measurement yet.
BAYES FILTER EXAMPLE (T=1)
Next, let's incorporate the measurement z1 = sense_open:
bel(x1 = open) = η p(sense_open | open) bel̄(x1 = open) = η · 0.6 · 0.5 = η · 0.3
bel(x1 = closed) = η p(sense_open | closed) bel̄(x1 = closed) = η · 0.2 · 0.5 = η · 0.1
After normalizing:
η = (0.3 + 0.1)^-1 = 2.5
bel(x1 = open) = 0.75    bel(x1 = closed) = 0.25
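The t=1 arithmetic can be checked in a few lines (the variable names are mine; the prior and sensor numbers are the ones assumed on the previous slides):

```python
# Uniform prior, then the do_nothing action, which leaves the state unchanged.
bel0 = {"open": 0.5, "closed": 0.5}
bel_bar1 = dict(bel0)  # prediction step: do_nothing => bel_bar(x1) = bel(x0)

# Measurement update with z1 = sense_open:
# P(sense_open | open) = 0.6, P(sense_open | closed) = 0.2
unnorm = {"open": 0.6 * bel_bar1["open"], "closed": 0.2 * bel_bar1["closed"]}
eta = 1.0 / (unnorm["open"] + unnorm["closed"])  # (0.3 + 0.1)^-1 = 2.5
bel1 = {x: eta * v for x, v in unnorm.items()}   # open: 0.75, closed: 0.25
```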
BAYES FILTER EXAMPLE (T=2)
Suppose at time t=2 the robot pushes the door and senses an open door.
What is the resulting posterior belief bel(x2)?
(First, let's get the updated conditionals. Also, recall the Bayes filter
algorithm, with integrals in place of sums for continuous states.)
BAYES FILTER EXAMPLE (T=2)
Suppose at time t=2 the robot pushes the door and senses an open door. What is the
resulting posterior belief bel(x2)? Our updated beliefs before considering measurements:
bel̄(x2 = open) = p(open | push, open) bel(x1 = open) + p(open | push, closed) bel(x1 = closed)
= 1.0 · 0.75 + 0.8 · 0.25 = 0.95
bel̄(x2 = closed) = 0.0 · 0.75 + 0.2 · 0.25 = 0.05
BAYES FILTER EXAMPLE (T=2)
Measurement update
Next, let's incorporate the measurement z2 = sense_open:
bel(x2 = open) = η p(sense_open | open) bel̄(x2 = open) = η · 0.6 · 0.95 = η · 0.57
bel(x2 = closed) = η p(sense_open | closed) bel̄(x2 = closed) = η · 0.2 · 0.05 = η · 0.01
After normalizing:
η = (0.57 + 0.01)^-1 ≈ 1.72
bel(x2 = open) ≈ 0.983    bel(x2 = closed) ≈ 0.017
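Similarly, the t=2 step (push, then sense_open) can be verified numerically; starting from the t=1 posterior:

```python
# Belief after t=1 (from the previous slide).
bel1 = {"open": 0.75, "closed": 0.25}

# Prediction with u2 = push: P(open | push, open) = 1.0, P(open | push, closed) = 0.8.
bel_bar2 = {
    "open":   1.0 * bel1["open"] + 0.8 * bel1["closed"],  # 0.95
    "closed": 0.0 * bel1["open"] + 0.2 * bel1["closed"],  # 0.05
}

# Measurement update with z2 = sense_open.
unnorm = {"open": 0.6 * bel_bar2["open"], "closed": 0.2 * bel_bar2["closed"]}
eta = 1.0 / (unnorm["open"] + unnorm["closed"])  # (0.57 + 0.01)^-1, about 1.72
bel2 = {x: eta * v for x, v in unnorm.items()}   # open about 0.983, closed about 0.017
```

Pushing the door concentrates probability on "open" before the sensor is even consulted, which is why the posterior ends up so confident.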
SUMMARY
• Bayes' Rule lets us compute probabilities that are otherwise hard to assess.
• Under the Markov assumption, recursive Bayesian updating efficiently
combines evidence.
• Bayes Filters are a probabilistic tool for estimating the state of
dynamic systems.
• In practice, the exact Bayes Filter is hard to apply to all but the most trivial cases.