Authors:
Ignacio Rocco Spremolla ¹; Michel Antunes ²; Djamila Aouada ² and Björn Ottersten ²
Affiliations:
¹ University of Luxembourg and Université Paris-Saclay, Luxembourg; ² University of Luxembourg, Luxembourg
Keyword(s):
Sensor Fusion, RGB-D, Thermal Sensing, Person Tracking.
Related Ontology Subjects/Areas/Topics:
Computer Vision, Visualization and Computer Graphics; Device Calibration, Characterization and Modeling; Image and Video Analysis; Image Formation and Preprocessing; Image Registration; Motion, Tracking and Stereo Vision; Tracking and Visual Navigation
Abstract:
Many systems combine RGB cameras with other sensor modalities, fusing visual data with complementary environmental information to achieve improved sensing capabilities. This article explores fusing a commodity RGB-D camera with a thermal sensor. We show that traditional methods suffice to accurately calibrate the complete system and register the three RGB-D-T data sources. We propose a simple person tracking algorithm based on particle filters and show how to combine the mapped pixel information from the RGB-D-T data. Furthermore, we use depth information to adaptively scale the tracked target area when the target moves radially with respect to the camera. Experimental results provide evidence that this yields a significant improvement in tracking performance under large radial displacements, compared to a tracker based on RGB or RGB-T data alone.
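The two ingredients named in the abstract — a bootstrap particle filter over the target position and a depth-driven rescaling of the target box — might be sketched as below. This is a minimal illustration, not the authors' implementation: the class name, the random-walk motion model, the Gaussian likelihood in the usage example, and all parameter values are assumptions; in the actual system the per-particle likelihood would fuse the registered RGB, depth, and thermal cues.

```python
import numpy as np

def adapt_scale(base_w, base_h, ref_depth, cur_depth):
    """Scale the tracked target box inversely with depth: under a pinhole
    model, a target twice as far from the camera appears half as large."""
    s = ref_depth / cur_depth
    return base_w * s, base_h * s

class ParticleTracker:
    """Minimal bootstrap particle filter over the target center (x, y).
    `likelihood(p)` scores a candidate position and is where fused
    RGB-D-T pixel information would enter."""
    def __init__(self, init_xy, n_particles=500, motion_std=5.0, seed=0):
        self.rng = np.random.default_rng(seed)
        self.particles = np.tile(np.asarray(init_xy, float), (n_particles, 1))
        self.motion_std = motion_std

    def step(self, likelihood):
        # Predict: random-walk motion model.
        self.particles += self.rng.normal(0.0, self.motion_std,
                                          self.particles.shape)
        # Update: weight each particle by the observation likelihood.
        w = np.array([likelihood(p) for p in self.particles])
        w /= w.sum()
        # Resample (multinomial) to counter weight degeneracy.
        idx = self.rng.choice(len(w), size=len(w), p=w)
        self.particles = self.particles[idx]
        # Return the weighted-mean state estimate.
        return self.particles.mean(axis=0)

# Illustrative usage: track a static target with a Gaussian likelihood.
target = np.array([50.0, 60.0])
lik = lambda p: np.exp(-np.sum((p - target) ** 2) / (2 * 10.0 ** 2))
tracker = ParticleTracker((40.0, 40.0))
for _ in range(20):
    est = tracker.step(lik)

# Depth-adaptive scaling: target box shrinks as the person moves away.
w, h = adapt_scale(40.0, 80.0, ref_depth=2.0, cur_depth=4.0)
```

The scale update is what distinguishes the RGB-D-T tracker from an RGB-only one: without a depth cue, radial motion changes the target's apparent size and the fixed-size appearance model drifts.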