Analytical Approach to Sampling Estimation of Underwater Tunnels Using Mechanical Profiling Sonars
Figure 1. An example of a profiling sensor with a single ray connected to a moving head.
Figure 2. An example of a tunnel cross-section with a base of length $a$, tunnel height $b$, and an arc-shaped ceiling of radius $c$. Vectors $\hat{u}$, $\hat{v}$, and $\hat{w}$ are the normalized basis vectors of the tunnel reference frame, where $\hat{u}$ is orthogonal to the cross-section at the tunnel entrance.
Figure 3. Analysis of floor samples in the two extreme cases: (a) when the sensor is centered and perpendicular to the floor; and (b) when the sensor hits the corner.
Figure 4. The impact of velocity on mapping with a single-point sensor when the robot moves with constant linear velocity $\boldsymbol{v} = (v, 0, 0)$.
Figure 5. The pair of sensor readings closest to the corner, at angular positions $m\theta$ and $(m-1)\theta$, is shown as blue vertices for the first sensor revolution, while the same samples at the next revolution are shown in red. Note that they form a parallelogram of sides $\mathcal{T}_x$ and $\mathcal{T}_{y_{max}}$, with the largest distance between samples along its longest diagonal, shown as red dashed lines.
Figure 6. The behavior of four identical sensors $s_0$, $s_1$, $s_2$, and $s_3$, with phase $\varphi(k)$ for each sensor $s_k$; that is, the phases are $0^\circ$, $90^\circ$, $180^\circ$, and $270^\circ$, respectively. Samples at the same head position are out of phase across sensors, and the distance between subsequent samples at the same head position is always $\mathcal{T}_x/s$, where $s$ is the number of sensors. For every sensor, the distance covered during one full sensor revolution is $\mathcal{T}_x$. A 4x faster sensor is shown as blue dashed lines. Since the slower sensors sample 4 points at the same time, sampling does not happen at exactly the same points as the faster sensor, but the distances between samples are the same; for example, the floor is sampled with the same spatial spacing, although not at the same points.
Figure 7. Impact of sampling with two sensors when the robot moves with constant velocity $\boldsymbol{v} = (v, 0, 0)$ and the distance between the sensors is $x$. When $s_1$ reaches the position of $s_0$, phases must be shifted by $180^\circ$.
Figure 8. Sector sampling of a flat region. Black dashed lines represent the readings of one sensor, while green dashed lines represent the readings of an adjacent sensor. The distance between readings at the same head position $\beta$, but at subsequent time steps, takes different values depending on the path taken by the sensor: the long path (blue dashed) yields a greater distance between samples than the shorter path (red dashed). Smaller and larger distances alternate, and their sum is $2\mathcal{T}_x/s$ (these spacing relations are sketched after the figure list).
Figure 9. Robot connected to one mechanical profiling sonar (MPS) with the sensor rotation axis aligned with the robot heading.
Figure 10. The tunnel scenario considered for simulations and real-world tests. The rock trap (the ditch shown in the figure) along with an auxiliary tunnel entrance.
Figure 11. Robots positioned at the entrance of the tunnel. In (a), the red vector depicts the robot heading, while the green vector is the sensor ray, which is always co-planar with the cross-section of the tunnel. Blue lines depict the rays of the four ping sensors used to keep the two robots centered in the cross-section of the tunnel as they move. In (b–d), VITA 2 is shown with several 881L configurations.
Figure 12. Comparison of the two sensor configurations with respect to phase alignment.
Figure 13. The problem with occlusions in the phase shift approach is shown in (a,b) when using sensors side by side, for the MEDIUM configuration of the 881L sensor, which makes the resulting sampling of both approaches easier to see. Note the reading gaps at the side walls. The space between sensors also produces gaps in the sector approach, but the gaps on the side walls are smaller than in the phase shift approach; see (c,d). Comparing (a) with (c) and (b) with (d) shows the resolution difference between the two approaches. The resulting meshes for the phase shift and sector approaches with three and four sensors are shown in (e,f) and (g,h), respectively. The phase shift approach with three and four sensors shows aliasing at the side walls near the occlusions, but little aliasing at the floor. Note the aliasing in (g) for the sector offset with three sensors, visible in the blue regions next to the floor, whereas the mesh reconstruction algorithm reduces the aliasing when four sensors are used in (h). Aliasing at the side walls also appears smaller for the sector offset than for the phase shift approach.
Figure 14. Low visibility with the low-light camera in the tunnel: centered at the cross-section (a) and close to the floor (b).
Figure 15. The behavior of projected distances on a surface as the range and frequency change.
Figure 16. Cross-section of the tunnel and the thin red line representing its skeletonization followed by pruning.
Figure 17. The reconstructed tunnel. Regions of interest are marked in red. Note that height differences in the ceiling can be detected, as well as some details of the end of the rock trap. Due to poor sampling along the tunnel length, we cannot recover many details of the side tunnel seen in previous experiments.
Figure A1. The upper part of a tunnel with an arbitrarily convex but symmetric ceiling. Note that $\mathcal{T}_{y_1}$ and $b/2$ always form a right angle, therefore $p + q = b/(2\cos\theta)$.
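The relation quoted in the Figure A1 caption follows from a single right-triangle step. The worked line below assumes only what the caption states, namely that $b/2$ is the side adjacent to the angle $\theta$ and that the segment of length $p+q$ is the hypotenuse:

```latex
% Right triangle in Figure A1: b/2 adjacent to theta, p + q as hypotenuse.
\cos\theta = \frac{b/2}{p+q}
\qquad\Longrightarrow\qquad
p + q = \frac{b}{2\cos\theta}
```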
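The sampling-period relations stated in the captions of Figures 4, 6, and 8 can also be collected in a short sketch. The helper below is illustrative only: the symbols `v`, `T`, and `s` (forward speed, time per sensor revolution, and number of sensors) and the example values are assumptions, not data from the paper.

```python
# Illustrative sketch of the along-tunnel sampling periods described in the
# captions of Figures 4, 6, and 8. Example values are arbitrary assumptions.

def sampling_periods(v: float, T: float, s: int):
    """Return (T_x, phase_shift_spacing, sector_offset_spacing_sum).

    v -- constant forward speed of the robot (m/s), motion along x
    T -- time for one full sensor revolution (s)
    s -- number of identical sensors
    """
    T_x = v * T                # distance covered during one full revolution (Figure 4)
    phase_spacing = T_x / s    # spacing between samples at the same head position
                               # when the sensor phases are evenly shifted (Figure 6)
    sector_sum = 2 * T_x / s   # in the sector approach, short and long spacings
                               # alternate and their sum is 2*T_x/s (Figure 8)
    return T_x, phase_spacing, sector_sum


if __name__ == "__main__":
    # Example: 0.1 m/s forward speed, 10 s per revolution, 4 sensors.
    T_x, phase_spacing, sector_sum = sampling_periods(v=0.1, T=10.0, s=4)
    print(f"T_x = {T_x:.3f} m, phase-shift spacing = {phase_spacing:.3f} m, "
          f"sector spacing sum = {sector_sum:.3f} m")
```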
Abstract
1. Introduction
2. Related Work
3. Formalization of the Mapping Problem
The Sampling Period over Cross-Sections
4. Considering Forward Motion
5. Multiple Sensors Strategy
5.1. Phase Shift
5.2. Sector Offset
Algorithm 1: Computing the angular position of the sensor head for the sector-based configuration.
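A minimal sketch of one way such a sector-based head schedule could be computed, assuming each of the $s$ sensors is assigned a sector of width 360°/s and steps its head by a fixed angle at every ping; the function name, its arguments, and the wrap-around behavior are illustrative assumptions rather than the authors' implementation:

```python
# Hedged sketch of a sector-based head-angle schedule: sensor k sweeps only
# its own sector of width 360/s degrees, stepping by `step_deg` each ping.
# Names and the wrap-around behavior are assumptions for illustration.

def sector_head_angle(k: int, s: int, ping_index: int, step_deg: float) -> float:
    """Angular position (degrees) of sensor k's head at a given ping."""
    sector_width = 360.0 / s            # each sensor covers one sector
    sector_start = k * sector_width     # offset of sensor k's sector
    steps_per_sector = round(sector_width / step_deg)
    offset = (ping_index % steps_per_sector) * step_deg  # wrap within the sector
    return (sector_start + offset) % 360.0

# Example: 4 sensors, 1.2 degree step, first few pings of sensor 2.
print([round(sector_head_angle(k=2, s=4, ping_index=i, step_deg=1.2), 1)
       for i in range(5)])   # -> [180.0, 181.2, 182.4, 183.6, 184.8]
```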
6. Non-Ideal MPSs
7. Materials and Methods
7.1. Sensor
7.2. Simulations
7.3. Simulated and Real World Environment
8. Experiments
8.1. Metrics
8.2. The Impact of Alignment on Tunnel Sampling
8.3. Three and Four Sensors—The Problem of Occlusion in Phase Shift
9. Real World Experiments
9.1. The Power Plant Environment
9.2. Mapping
10. Discussion
11. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
Abbreviations
3D | Three Dimensional
DVL | Doppler Velocity Log
MBS | Multi-Beam Sonar
MPS | Mechanical Profiling Sonar
SLAM | Simultaneous Localization and Mapping
ROS | Robot Operating System
UUV | Unmanned Underwater Vehicle
Appendix A
Appendix B
Appendix C
References
| Parameter | SLOW | MEDIUM | FAST | FASTER | FASTEST |
|---|---|---|---|---|---|
| (degrees) | | | | | |
| samples/rev | 1200 | 600 | 400 | 300 | 150 |
| T (s) | 21.6 | 10.8 | 8 | 6.6 | 4.8 |
| (samples/s) | 55.55 | 55.55 | 50 | 45.45 | 31.25 |
| @600 kHz | 8 | 4 | 2.67 | 2 | 1 |
| @1 MHz | 4.67 | 2.33 | 1.55 | 1.16 | 0 |
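As a quick cross-check of the configuration table above: the sample rate equals the samples per revolution divided by the revolution time T, and, assuming the blank (degrees) row is the angular step, that step would be 360° divided by the samples per revolution. A small illustrative script, with the table values hard-coded:

```python
# Consistency check of the 881L configuration table.
# samples/rev and T (s) are taken from the table; the step angle is assumed
# to be 360 / (samples per revolution), since that row is blank in the source.

modes = {
    "SLOW":    (1200, 21.6),
    "MEDIUM":  (600, 10.8),
    "FAST":    (400, 8.0),
    "FASTER":  (300, 6.6),
    "FASTEST": (150, 4.8),
}

for name, (samples_per_rev, T) in modes.items():
    step_deg = 360.0 / samples_per_rev   # assumed definition of the step angle
    rate = samples_per_rev / T           # samples per second
    print(f"{name:8s} step = {step_deg:.1f} deg, rate = {rate:.2f} samples/s")

# SLOW/MEDIUM give 55.56 samples/s, FAST 50, FASTER 45.45, FASTEST 31.25,
# matching the (samples/s) row of the table up to rounding.
```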
| 881L Mode | Experimental Data (m) | | | | | Baseline (m) | | | |
|---|---|---|---|---|---|---|---|---|---|
| | | | std | | | | | std | |
| SLOW | 4.200 | 0.032 | 0.009 | 0.016 | 0.099 | 4.255 | 0.029 | 0.013 | 0.014 |
| MEDIUM | 2.090 | 0.058 | 0.015 | 0.038 | 0.156 | 2.128 | 0.057 | 0.026 | 0.203 |
| FAST | 1.527 | 0.086 | 0.033 | 0.045 | 0.337 | 1.573 | 0.086 | 0.039 | 0.301 |
| FASTER | 1.283 | 0.114 | 0.049 | 0.051 | 0.417 | 1.300 | 0.115 | 0.053 | 0.396 |
| FASTEST | 0.947 | 0.226 | 0.079 | 0.075 | 0.552 | 0.946 | 0.230 | 0.100 | 0.752 |
| 881L Mode | Sensors | Experimental Data (m) | | | | | | | | | | Baseline (m) | | | | |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| | | Phase Shift | | | | | Sector Offset | | | | | | | | | |
| | | | | std | | | | | std | | | | | | | |
| FASTEST | 2 | 0.470 | 0.226 | 0.083 | 0.074 | 0.542 | 0.939 | 0.224 | 0.082 | 0.077 | 0.575 | 0.470 | 0.941 | 0.230 | 0.100 | 0.745 |
| FASTER | 2 | 0.647 | 0.114 | 0.051 | 0.052 | 0.414 | 1.300 | 0.113 | 0.050 | 0.056 | 0.418 | 0.647 | 1.294 | 0.115 | 0.05 | 0.391 |
| FAST | 2 | 0.788 | 0.086 | 0.034 | 0.046 | 0.345 | 1.572 | 0.086 | 0.034 | 0.049 | 0.362 | 0.784 | 1.568 | 0.086 | 0.041 | 0.297 |
| MEDIUM | 2 | 1.056 | 0.058 | 0.016 | 0.035 | 0.154 | 2.115 | 0.058 | 0.016 | 0.037 | 0.156 | 1.058 | 2.117 | 0.057 | 0.027 | 0.201 |
| SLOW | 2 | 2.098 | 0.032 | 0.009 | 0.017 | 0.104 | 4.174 | 0.032 | 0.009 | 0.016 | 0.098 | 2.116 | 4.234 | 0.029 | 0.014 | 0.102 |
| FASTEST | 3 | 0.320 | 0.226 | 0.083 | 0.072 | 0.597 | 0.635 | 0.221 | 0.083 | 0.076 | 0.567 | 0.314 | 0.627 | 0.23 | 0.109 | 0.759 |
| FASTER | 3 | 0.429 | 0.114 | 0.051 | 0.049 | 0.419 | 0.865 | 0.112 | 0.050 | 0.048 | 0.405 | 0.431 | 0.862 | 0.115 | 0.055 | 0.399 |
| FAST | 3 | 0.525 | 0.086 | 0.035 | 0.040 | 0.364 | 1.003 | 0.085 | 0.034 | 0.046 | 0.335 | 0.523 | 1.045 | 0.115 | 0.055 | 0.399 |
| MEDIUM | 3 | 0.709 | 0.058 | 0.016 | 0.029 | 0.154 | 1.416 | 0.058 | 0.016 | 0.036 | 0.168 | 0.706 | 1.411 | 0.057 | 0.027 | 0.205 |
| SLOW | 3 | 1.410 | 0.031 | 0.009 | 0.015 | 0.099 | 2.832 | 0.032 | 0.009 | 0.019 | 0.099 | 1.411 | 2.822 | 0.029 | 0.014 | 0.104 |
| FASTEST | 4 | 0.229 | 0.226 | 0.081 | 0.088 | 0.585 | 0.461 | 0.217 | 0.081 | 0.074 | 0.585 | 0.235 | 0.470 | 0.23 | 0.109 | 0.759 |
| FASTER | 4 | 0.323 | 0.114 | 0.050 | 0.057 | 0.491 | 0.641 | 0.111 | 0.049 | 0.058 | 0.401 | 0.323 | 0.647 | 0.115 | 0.055 | 0.399 |
| FAST | 4 | 0.394 | 0.086 | 0.047 | 0.043 | 0.359 | 0.790 | 0.084 | 0.033 | 0.044 | 0.340 | 0.392 | 0.784 | 0.115 | 0.055 | 0.399 |
| MEDIUM | 4 | 0.532 | 0.058 | 0.016 | 0.034 | 0.185 | 1.066 | 0.057 | 0.015 | 0.031 | 0.189 | 0.529 | 1.058 | 0.057 | 0.027 | 0.205 |
| SLOW | 4 | 1.056 | 0.032 | 0.010 | 0.015 | 0.101 | 2.100 | 0.031 | 0.009 | 0.001 | 0.101 | 1.058 | 2.117 | 0.029 | 0.014 | 0.104 |
| 881L Mode | Sensors | Phase Shift | | | Sector Offset | | |
|---|---|---|---|---|---|---|---|
| | | (m) | (m) | (m) | (m) | (m) | (m) |
| FASTEST | 2 | 0.522 | 0.476 | 0.717 | 0.965 | 0.942 | 1.101 |
| FASTER | 2 | 0.657 | 0.649 | 0.768 | 1.305 | 1.301 | 1.366 |
| FAST | 2 | 0.793 | 0.789 | 0.860 | 1.574 | 1.573 | 1.613 |
| MEDIUM | 2 | 1.058 | 1.057 | 1.067 | 2.116 | 2.115 | 2.121 |
| SLOW | 2 | 2.098 | 2.098 | 2.101 | 4.174 | 4.174 | 4.175 |
| FASTEST | 3 | 0.392 | 0.328 | 0.677 | 0.672 | 0.640 | 0.851 |
| FASTER | 3 | 0.444 | 0.432 | 0.600 | 0.872 | 0.866 | 0.955 |
| FAST | 3 | 0.532 | 0.527 | 0.639 | 1.007 | 1.004 | 1.057 |
| MEDIUM | 3 | 0.711 | 0.710 | 0.726 | 1.417 | 1.416 | 1.426 |
| SLOW | 3 | 1.410 | 1.410 | 1.413 | 2.832 | 2.832 | 2.834 |
| FASTEST | 4 | 0.322 | 0.245 | 0.628 | 0.510 | 0.467 | 0.745 |
| FASTER | 4 | 0.343 | 0.328 | 0.588 | 0.651 | 0.644 | 0.756 |
| FAST | 4 | 0.403 | 0.396 | 0.533 | 0.794 | 0.791 | 0.860 |
| MEDIUM | 4 | 0.535 | 0.533 | 0.563 | 1.068 | 1.066 | 1.083 |
| SLOW | 4 | 1.056 | 1.056 | 1.061 | 2.100 | 2.100 | 2.102 |
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
© 2021 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
Machado Jorge, V.A.; de Cerqueira Gava, P.D.; Belchior de França Silva, J.R.; Mancilha, T.M.; Vieira, W.; Adabo, G.J.; Nascimento, C.L., Jr. Analytical Approach to Sampling Estimation of Underwater Tunnels Using Mechanical Profiling Sonars. Sensors 2021, 21, 1900. https://doi.org/10.3390/s21051900