Sonar-Based Simultaneous Localization and Mapping Using the Semi-Direct Method
<p><b>Figure 1.</b> Underwater sonar-based SLAM: (<b>a</b>) The poses obtained using both methods, with the point cloud from the previous frame projected to its position in the next frame: the top shows the result of optimizing the pose using the semi-direct method, and the bottom shows the result obtained using ICP alone. (<b>b</b>) Enlarged view of the dense point-cloud area in (<b>a</b>); the ICP algorithm is relatively accurate but still has minor errors. (<b>c</b>) Minor errors, accumulated over a large amount of data, eventually lead to an unacceptable level of drift. (<b>d</b>) ROV equipped with sonar.</p>
<p><b>Figure 2.</b> System framework diagram. The SLAM system is divided into three main components: first, raw sonar data processing, where windows of different shapes are chosen according to the level of reverberation noise to extract features and produce Cartesian-coordinate sonar images; second, the front end, which obtains initial pose estimates using scan matching and the semi-direct method; and third, the back end, which performs global optimization and final state estimation.</p>
<p><b>Figure 3.</b> Forward-looking sonar model: let P be a point in space, R the range from this point to the sonar origin, <math display="inline"><semantics> <mi>θ</mi> </semantics></math> the horizontal (bearing) angle, and <math display="inline"><semantics> <mi>φ</mi> </semantics></math> the pitch (elevation) angle. In sonar imaging, all information along the pitch angle is compressed onto the imaging plane.</p>
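The projection described in the caption above can be sketched numerically. The following Python is an illustrative stand-in (function and variable names are not from the paper): a 3D point in the sonar frame maps to the range/bearing pair that imaging retains, while the elevation angle is discarded.

```python
import numpy as np

def project_to_sonar(p):
    """Project a 3D point (sonar frame) to the polar image coordinates
    (R, theta). The elevation angle phi is lost in imaging: every point
    on the same elevation arc maps to the same (R, theta) pixel."""
    x, y, z = p
    r = np.sqrt(x**2 + y**2 + z**2)   # slant range to the sonar origin
    theta = np.arctan2(y, x)          # horizontal (bearing) angle
    phi = np.arcsin(z / r)            # elevation angle, discarded by imaging
    return r, theta, phi

# A point 5 m away in the imaging plane (z = 0): phi is zero, and only
# (r, theta) survive into the image.
r, theta, phi = project_to_sonar((3.0, 4.0, 0.0))
```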
<p><b>Figure 4.</b> (<b>a</b>) One-dimensional CFAR detection is prone to false alarms: red represents the test cells, blue the reference cells, and orange the guard cells. (<b>b</b>) Two-dimensional SO-CFAR window: the window includes the test cells (red), reference cells (blue), and guard cells (orange). Areas with no reflections are shown in white, and obstacles in gray.</p>
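As a rough illustration of the smallest-of scheme in the caption above, a one-dimensional SO-CFAR pass might look like this; window sizes and the threshold factor are arbitrary choices for the sketch, not the paper's parameters.

```python
import numpy as np

def so_cfar_1d(signal, n_ref=8, n_guard=2, alpha=3.0):
    """Smallest-of CFAR along one dimension: for each test cell, average
    the leading and lagging reference windows separately (skipping guard
    cells), take the smaller mean as the noise estimate, and declare a
    detection if the cell exceeds alpha times that estimate."""
    n = len(signal)
    detections = np.zeros(n, dtype=bool)
    for i in range(n_ref + n_guard, n - n_ref - n_guard):
        lead = signal[i - n_guard - n_ref : i - n_guard]
        lag = signal[i + n_guard + 1 : i + n_guard + 1 + n_ref]
        noise = min(lead.mean(), lag.mean())
        detections[i] = signal[i] > alpha * noise
    return detections

# A flat noise floor of 1.0 with a single strong target at index 20.
sig = np.ones(40)
sig[20] = 10.0
hits = np.flatnonzero(so_cfar_1d(sig))
```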
<p><b>Figure 5.</b> The principle of the direct method in sonar imaging. A forward-looking multibeam sonar emits sound waves from different poses, and the echoes generated by obstacles form the sonar image. If the pose transformation between the sonar frames were known, the photometric error would be minimal when the position of a pixel block from the previous sonar image is computed in the next sonar image via the sonar projection model. In reality, the pose transformation is unknown, so we iterate to find the transformation that minimizes the photometric error; this process is the direct method.</p>
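The iterate-until-the-photometric-error-is-minimal idea can be reduced to a toy one-dimensional version: search over candidate "poses" (here just integer shifts) for the one with the lowest intensity error. This is only a sketch of the objective; a real system iterates Gauss-Newton over a continuous 3-DoF pose through the sonar projection model.

```python
import numpy as np

def best_shift(prev_scan, next_scan, max_shift=5):
    """Direct-method toy: exhaustively try integer shifts (stand-ins for
    candidate poses) and return the one minimizing the mean squared
    photometric error over the overlapping samples."""
    n = len(prev_scan)
    best_s, best_err = 0, np.inf
    for s in range(-max_shift, max_shift + 1):
        i = np.arange(max(0, s), min(n, n + s))  # indices where both overlap
        err = np.mean((prev_scan[i - s] - next_scan[i]) ** 2)
        if err < best_err:
            best_s, best_err = s, err
    return best_s

# next_scan is prev_scan shifted right by 2 samples; the search should
# recover that shift as the error-minimizing "pose".
prev_scan = np.arange(10.0)
next_scan = np.concatenate([[0.0, 0.0], prev_scan[:-2]])
shift = best_shift(prev_scan, next_scan)
```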
<p><b>Figure 6.</b> Image processing diagram. The forward-looking multibeam sonar captures images in polar coordinates; the raw sonar images are transformed into Cartesian coordinates through interpolation. Pixel values along a given direction often change at high frequency, making the optimization prone to local minima; image gradients likewise change at a very high frequency.</p>
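The polar-to-Cartesian remapping mentioned above can be sketched by inverse mapping: each Cartesian pixel is projected back into the polar frame and sampled there. Nearest-neighbor sampling is used below for brevity where a production system would interpolate; the field of view and sizes are made-up values, not the sonar's specification.

```python
import numpy as np

def polar_to_cartesian(polar_img, fov_deg=60.0, out_size=64):
    """Remap a polar sonar frame (rows = range bins, cols = beams) onto a
    Cartesian grid by inverse mapping each output pixel to (range, bearing)
    and sampling the polar image nearest-neighbor."""
    n_r, n_b = polar_img.shape
    half_fov = np.radians(fov_deg) / 2.0
    cart = np.zeros((out_size, out_size))
    xs = np.linspace(-1.0, 1.0, out_size)   # across-track, in unit range
    ys = np.linspace(0.0, 1.0, out_size)    # along-track (boresight)
    for i, y in enumerate(ys):
        for j, x in enumerate(xs):
            r = np.hypot(x, y)              # normalized range
            th = np.arctan2(x, y)           # bearing from boresight
            if r <= 1.0 and abs(th) <= half_fov:
                ri = min(int(r * (n_r - 1)), n_r - 1)
                bi = int((th + half_fov) / (2 * half_fov) * (n_b - 1))
                cart[i, j] = polar_img[ri, bi]
    return cart

# A uniform polar image fills only the sonar fan in the Cartesian output;
# pixels outside the fan stay zero.
cart = polar_to_cartesian(np.ones((32, 32)))
```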
<p><b>Figure 7.</b> The change in pixel values along the straight line AB in the sonar image of <a href="#jmse-12-02234-f006" class="html-fig">Figure 6</a>: (<b>a</b>) before bilateral filtering; (<b>b</b>) after bilateral filtering.</p>
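A minimal one-dimensional bilateral filter, in the spirit of the before/after comparison along line AB, shows why the filter suits sonar intensity profiles: it averages only among similar intensities, so speckle on flat stretches is smoothed while the sharp echo edge survives. The parameters below are illustrative, not the paper's settings.

```python
import numpy as np

def bilateral_1d(signal, radius=3, sigma_s=2.0, sigma_r=20.0):
    """1D bilateral filter: each sample becomes a weighted mean of its
    neighbors, weighted by both spatial distance (sigma_s) and intensity
    difference (sigma_r), preserving edges while smoothing noise."""
    out = np.empty_like(signal, dtype=float)
    n = len(signal)
    for i in range(n):
        lo, hi = max(0, i - radius), min(n, i + radius + 1)
        idx = np.arange(lo, hi)
        w = (np.exp(-((idx - i) ** 2) / (2 * sigma_s**2))
             * np.exp(-((signal[idx] - signal[i]) ** 2) / (2 * sigma_r**2)))
        out[i] = np.sum(w * signal[idx]) / np.sum(w)
    return out

# A noisy step edge: smoothing reduces the noise on the flat segments,
# while the jump between the two intensity levels stays sharp.
rng = np.random.default_rng(0)
step = np.concatenate([np.full(50, 10.0), np.full(50, 200.0)])
noisy = step + rng.normal(0, 5, 100)
smooth = bilateral_1d(noisy)
```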
<p><b>Figure 8.</b> Feature extraction with different CFAR detection methods: (<b>a</b>) raw sonar image; (<b>b</b>) features extracted via one-dimensional SO-CFAR detection along the range dimension; (<b>c</b>) features extracted via GO-CFAR detection along the range dimension; (<b>d</b>) features extracted via two-dimensional SO-CFAR detection.</p>
<p><b>Figure 9.</b> Quantitative assessment: (<b>a</b>) Pixel coordinates in the imaging of point clouds transformed using poses <math display="inline"><semantics> <msub> <mi>T</mi> <mn>1</mn> </msub> </semantics></math> and <math display="inline"><semantics> <msub> <mi>T</mi> <mn>2</mn> </msub> </semantics></math>. (<b>b</b>) Enlarged sonar image (part) of (<b>a</b>); the red box marks the enlarged area. (<b>c</b>) Position of the point clouds in the sonar image. (<b>d</b>) Sonar image (part corresponding to the point clouds). (<b>e</b>) Statistical chart of point cloud pixel values.</p>
<p><b>Figure 10.</b> (<b>a</b>) The trajectory and map of the SLAM result using only the ICP algorithm in the front end. (<b>b</b>) The trajectory and map of the SLAM result using the semi-direct method on top of the ICP algorithm.</p>
Abstract
1. Introduction
- We propose a two-dimensional constant false alarm rate (CFAR) detection method that uses matrix prefix sums to accelerate the computation. Compared with the one-dimensional CFAR algorithm, the added time overhead is very low, and obstacles are detected more effectively in both the horizontal and vertical directions.
- We derive a sonar imaging model by analogy with the camera imaging model and further optimize the initial pose by minimizing photometric errors.
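The prefix-sum acceleration in the first contribution amounts to an integral image (summed-area table): once built, the sum of any rectangular block of reference cells costs four lookups regardless of window size. The following is a sketch under that reading, not the authors' implementation.

```python
import numpy as np

def integral_image(img):
    """Prefix sums over both axes with a zero-padded border, so any
    rectangular window sum becomes four table lookups. This is what makes
    a 2D CFAR sweep cheap: reference cells are never re-summed per test cell."""
    s = np.zeros((img.shape[0] + 1, img.shape[1] + 1))
    s[1:, 1:] = np.cumsum(np.cumsum(img, axis=0), axis=1)
    return s

def window_sum(s, r0, c0, r1, c1):
    """Sum of img[r0:r1, c0:c1] in O(1) using the integral image."""
    return s[r1, c1] - s[r0, c1] - s[r1, c0] + s[r0, c0]

img = np.arange(16.0).reshape(4, 4)
s = integral_image(img)
total = window_sum(s, 1, 1, 3, 3)   # sums img[1:3, 1:3] = [[5, 6], [9, 10]]
```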
2. Related Work
2.1. Sonar-Based SLAM
2.2. Optical Flow and Direct Methods
3. Method
3.1. Overview
3.2. Sonar-Based SLAM Problem Formulation
3.3. Smallest of Constant False Alarm Rate
3.4. Semi-Direct Method Applicable to the Field of Sonar
3.5. Image Processing
4. Experiments and Results
4.1. Hardware Overview
4.2. Feature Extraction
4.3. SLAM Performance
5. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
| Method | Mean |
| --- | --- |
| Semi-direct + 1D SO-CFAR | 13,290.7 |
| ICP + 1D SO-CFAR | 11,985.5 |
| Semi-direct + 2D SO-CFAR | 13,855.1 |
| ICP + 2D SO-CFAR | 12,557.1 |
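Assuming the Mean column reports the average sonar-image intensity at the pixels where the point cloud reprojects (consistent with the quantitative assessment described in the figure captions: a better pose lands more points on strong echoes, so higher is better), the metric could be computed as follows. The helper below is hypothetical, not the authors' code.

```python
import numpy as np

def mean_reprojection_intensity(sonar_img, points_px):
    """Score a candidate pose by the mean image intensity sampled at the
    (row, col) pixel locations of the reprojected point cloud; points
    falling outside the image are ignored."""
    rows, cols = points_px[:, 0], points_px[:, 1]
    h, w = sonar_img.shape
    valid = (rows >= 0) & (rows < h) & (cols >= 0) & (cols < w)
    return float(sonar_img[rows[valid], cols[valid]].mean())

# Toy image with two bright returns; the third point reprojects out of
# bounds and is ignored, so the score averages the two hit intensities.
img = np.zeros((4, 4))
img[1, 1], img[2, 2] = 100.0, 50.0
pts = np.array([[1, 1], [2, 2], [9, 9]])
score = mean_reprojection_intensity(img, pts)
```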
Share and Cite
Han, X.; Sun, J.; Zhang, S.; Dong, J.; Yu, H. Sonar-Based Simultaneous Localization and Mapping Using the Semi-Direct Method. J. Mar. Sci. Eng. 2024, 12, 2234. https://doi.org/10.3390/jmse12122234