Vision-Aided Inertial Navigation For PPL
attitude, and velocity in real time. Two types of features are considered: mapped landmarks (MLs), i.e., features whose global 3D positions can be determined from a surface map, and opportunistic features (OFs), i.e., features that can be tracked in consecutive images but whose 3D positions are not known. Both types of features are processed in an extended Kalman filter (EKF) estimator and are optimally fused with measurements from an inertial measurement unit (IMU). Results from a sounding rocket test, covering the dynamic profile of typical planetary landing scenarios, show estimation errors of magnitude 0.16 m/s in velocity and 6.4 m in position at touchdown. These results vastly improve the current state of the art for non-vision-based EDL navigation and meet the requirements of future planetary exploration missions.

[Figure: 3D view of the sounding rocket trajectory, altitude (km) vs. longitude and latitude (deg), with Launch, Drogue Chute Deploy, Main Chute Deploy, and Touchdown marked.]
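The fusion scheme summarized above can be illustrated with a deliberately simplified sketch: a linear Kalman filter whose state is propagated with IMU accelerations and corrected with mapped-landmark position fixes. This is a generic one-axis illustration, not the paper's implementation (which estimates full pose, velocity, and attitude); all models and noise values below are assumptions.

```python
import numpy as np

dt = 0.01
F = np.array([[1.0, dt], [0.0, 1.0]])   # state transition for [position, velocity]
G = np.array([[0.5 * dt**2], [dt]])     # how acceleration enters the state
H = np.array([[1.0, 0.0]])              # an ML fix observes position directly
Q = 1e-4 * G @ G.T                      # IMU noise (assumed)
R = np.array([[0.25]])                  # ML fix noise, m^2 (assumed)

x = np.zeros((2, 1))                    # state estimate [position; velocity]
P = np.eye(2)                           # state covariance

def predict(x, P, accel):
    """Propagate state and covariance with an IMU acceleration sample."""
    x = F @ x + G * accel
    P = F @ P @ F.T + Q
    return x, P

def update(x, P, z):
    """Standard Kalman update with a mapped-landmark position fix z."""
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (z - H @ x)
    P = (np.eye(2) - K @ H) @ P
    return x, P

# open-loop IMU integration lets the uncertainty grow ...
for _ in range(100):
    x, P = predict(x, P, accel=-1.0)
# ... and a single ML update shrinks the position variance again
x, P = update(x, P, z=np.array([[-0.5]]))
print(float(P[0, 0]))
```

This mirrors the behavior discussed in the results: uncertainty grows during IMU-only integration and collapses when ML updates resume.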
TABLE I
DYNAMICS COMPARISON BETWEEN SOUNDING ROCKET AND MARS EDL.

Quantity                  Sounding Rocket   Mars EDL
Off-nadir angular rate    ≤ 19°/s           < 60°/s
Roll rate                 ≤ 360°/s          < 60°/s
off-line. We should note, however, that our algorithm is capable of real-time operation. The FFT correlation-based feature matching is predicted to run at 5-20 Hz in an FPGA-based implementation currently under development at JPL, and the current C++ implementation of the pose estimator runs at 20 Hz on a 2 GHz CPU, with the number of stored poses set to N = 20.

Fig. 3. Zoomed-in view of trajectory after main parachute deployment.

[Figure: number of detected MLs vs. altitude.]
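The FFT correlation-based matching mentioned above can be sketched as phase correlation, a standard FFT technique for estimating the translational offset between two image patches. This is a generic illustration of the technique, not JPL's implementation:

```python
import numpy as np

def phase_correlation(a, b):
    """Estimate the integer-pixel shift of patch a relative to patch b
    via FFT-based phase correlation."""
    A = np.fft.fft2(a)
    B = np.fft.fft2(b)
    R = A * np.conj(B)
    R /= np.abs(R) + 1e-12            # normalize to the unit circle
    corr = np.fft.ifft2(R).real       # sharp peak at the relative shift
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # wrap peak indices into signed shifts
    return tuple(int(p) if p <= s // 2 else int(p) - s
                 for p, s in zip(peak, a.shape))

# synthetic check: shift a random patch by a known amount
rng = np.random.default_rng(0)
img = rng.random((64, 64))
shifted = np.roll(img, (5, -3), axis=(0, 1))
print(phase_correlation(shifted, img))  # → (5, -3)
```

Because the work is two FFTs and one inverse FFT per patch pair, the method maps naturally onto the FPGA implementation described in the text.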
approximately ±4 m, and the velocity uncertainty to less than ±0.25 m/s along each axis (3σ). Notice that the actual error at the beginning of OF updates is smaller than at the end of ML processing ten seconds later. This can be attributed to the already decreasing number of detected MLs within the FOV at this altitude, due to the difference in resolution between the satellite and camera images (cf. Fig. 4). With the discontinuation of ML processing at 230 m altitude, the pose uncertainty begins to increase further, although still at a very low rate (cf. the zoomed-in view of the errors for the final 300 m in Figs. 6 and 8). As predicted, this is the result of the system becoming unobservable. Table II shows the final velocity and position error magnitudes at touchdown, which are approximately 6.4 m for position and 0.16 m/s for velocity.
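The 3σ bounds discussed above come directly from the diagonal of the EKF covariance matrix. A minimal sketch with assumed (not flight) numbers, including the filter-consistency notion that the actual errors should stay within the bounds at roughly the Gaussian rate:

```python
import numpy as np

# assumed per-axis velocity error covariance, (m/s)^2, chosen so the
# resulting bound is near the reported ±0.25 m/s
P_vel = np.diag([0.007, 0.007, 0.007])
vel_bound = 3.0 * np.sqrt(np.diag(P_vel))   # 3-sigma bound per axis

# consistency check: for Gaussian errors, ~99.7% of samples should fall
# within the 3-sigma bound
rng = np.random.default_rng(0)
err = rng.normal(0.0, np.sqrt(0.007), size=10_000)
frac = np.mean(np.abs(err) <= vel_bound[0])
print(vel_bound[0], frac)
```

If the plotted errors pierced the bounds much more often than this, the filter would be overconfident (inconsistent).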
Similar to position and velocity, the attitude uncertainty bounds were decreased to ±0.15° along each axis (3σ) during processing of ML updates, with a temporary increase to ±0.9° during open-loop IMU integration between 3100 m and 1600 m altitude. Due to the lack of ground truth, the attitude uncertainty was determined from the EKF estimates of the covariance matrix. The figures for the position and velocity errors (cf. Figs. 5-8) show that the filter estimates for these variables are consistent, indicating that the attitude estimates are also consistent. The filter attitude estimate was

Fig. 5. Velocity error expressed in NED frame (blue solid line) and corresponding 3σ-bounds (red dashed line). Note that x-y-z in the plots corresponds to N-E-D.

[Figure 6: zoomed-in view of the velocity errors over the final 300 m of descent.]

tal validation of a vision-aided inertial navigation algorithm for planetary landing applications. Results from a sounding rocket test, covering the dynamic profile of typical planetary EDL scenarios, showed estimation errors of magnitude 0.16 m/s in
Fig. 7. Position error expressed in NED frame (blue solid line) and corresponding 3σ-bounds (red dashed line).

[Figure 8: zoomed-in view of the position errors over the final 300 m of descent.]