Guidance Point Generation-Based Cooperative UGV Teleoperation in Unstructured Environment
List of Figures

- Figure 1: System overview of input correction cooperative control.
- Figure 2: System overview of haptic interaction cooperative control.
- Figure 3: System overview of guidance interaction cooperative control.
- Figure 4: Occupied grid map.
- Figure 5: A binary image corresponding to a real environment.
- Figure 6: Obstacle description in a binary image.
- Figure 7: The neighbor pixel set of pixel P<sub>1</sub>.
- Figure 8: Non-single-pixel skeletons generated by the Zhang–Suen algorithm.
- Figure 9: The templates for removing redundant pixels.
- Figure 10: Skeletons of navigable regions.
- Figure 11: Spurious branches of the skeleton.
- Figure 12: Flow chart of the convex hull transform for the grid map.
- Figure 13: An example of convex hull vertex searching for an irregular region.
- Figure 14: Smooth skeleton extraction for a binary image. (a) The red closed curves represent the envelopes of the obstacles; (b) result of the convex hull transform; (c) extracted smooth skeletons after the convex hull transform of obstacles.
- Figure 15: The search result for the desired skeleton.
- Figure 16: The process of skeleton decomposition. (a) Fork points and endpoints of the skeleton; (b) the skeleton trunk; (c) the skeleton branch ahead of the UGV.
- Figure 17: An example of the occupied grid map (OGM) distance field.
- Figures 18 and 19: The result of skeleton shape optimization.
- Figure 20: Human–machine interaction for cooperative control.
- Figure 21: Superimposed candidate trajectories on the video image.
- Figure 22: The trajectory curves satisfying the differential constraint of motion in different scenes.
- Figure 23: Components of the navigation control system.
- Figure 24: Teleoperation system.
- Figure 25: Test environment and route.
- Figure 26: Driving speed and acceleration variation over the three test routes under remote control mode and human–machine cooperative control mode, from one subject.
Abstract
1. Introduction
2. Related Works
2.1. Input Correction Cooperative Control
2.2. Haptic Interaction Cooperative Control
2.3. Guidance Interaction Cooperative Control
- Not all regions have high-precision maps available for global path guidance. Even where a global map exists, it may not be updated in time: when the natural environment changes, an area that appears passable on the map may in fact be impassable, rendering the planned global path infeasible.
- Because of the limited autonomy of a UGV, its onboard navigation system alone can rarely complete a task in a complex unstructured environment.
- Without decision support, the operator must actively determine the position of each guidance point, which imposes a heavy workload. Moreover, if a guidance point is not entered in time, the vehicle speed fluctuates severely.
3. Guidance Point Generation Method
3.1. Generation of Occupied Grid Map (OGM)
3.1.1. Assumptions
- Assumption 1: The trajectory of the UGV lies on a two-dimensional plane, so that measurements taken at different times are in the same plane and the environmental information obtained by light detection and ranging (lidar) at adjacent times contains the same environmental objects.
- Assumption 2: The environment is static; that is, the positions of all or most objects or features in the environment do not change over time. This assumption, also known as the Markov hypothesis, states that if the current state is known, the future state is independent of the past states.
- Assumption 3: Under the assumption of planar motion, the 6-degree-of-freedom (DOF) spatial motion of the UGV can be simplified to 3-DOF planar motion, so its motion state is determined by its heading and 2-DOF position.
- Assumption 4: The UGV is treated as a rigid body during motion; the influence of suspension travel, tire deformation, and other factors on the vehicle body and on the pitch and roll angles of the lidar is ignored. This improves the real-time performance of the map generation algorithm and meets the requirements of high-speed driving.
3.1.2. OGM Update
- r is the distance between the center of the grid cell and the lidar;
- p<sub>free</sub> is the occupancy probability assigned to grid cells that the laser beam passes through, with 0 < p<sub>free</sub> < 0.5;
- p<sub>occ</sub> is the occupancy probability assigned to the grid cell in which the measurement point lies, with 0.5 < p<sub>occ</sub> < 1;
- p<sub>prior</sub> is the occupancy probability of grid cells outside the measurement range, with p<sub>prior</sub> = 0.5.
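The per-cell update above can be sketched as a standard log-odds Bayesian filter. The probability values below (p<sub>free</sub> = 0.3, p<sub>occ</sub> = 0.7) are illustrative assumptions, not the paper's calibrated parameters.

```python
import math

# Illustrative inverse sensor model values (assumptions, not the paper's).
P_FREE, P_OCC, P_PRIOR = 0.3, 0.7, 0.5

def logit(p):
    return math.log(p / (1.0 - p))

def update_cell(log_odds, measurement):
    """Accumulate one laser observation into a cell's log-odds value.

    measurement: 'free' if the beam passed through the cell,
                 'hit'  if the measurement point fell in the cell,
                 None   if the cell is outside the measurement range.
    """
    if measurement == 'free':
        inverse_model = P_FREE
    elif measurement == 'hit':
        inverse_model = P_OCC
    else:
        inverse_model = P_PRIOR      # no information: log-odds unchanged
    return log_odds + logit(inverse_model) - logit(P_PRIOR)

def occupancy(log_odds):
    """Recover the occupancy probability of a cell from its log-odds."""
    return 1.0 - 1.0 / (1.0 + math.exp(log_odds))

cell = 0.0                           # log-odds of the prior p = 0.5
for _ in range(3):                   # the beam endpoint lands here 3 times
    cell = update_cell(cell, 'hit')
```

Working in log-odds keeps the update a simple addition per observation, which suits the real-time requirement stated in Assumption 4.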
3.1.3. Binary Image Representation of OGM
3.2. Obstacle Description Based on Mathematical Morphology
3.2.1. Basic Operations of Mathematical Morphology
3.2.2. Expansion of Obstacles
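Expanding obstacles by morphological dilation lets the UGV be treated as a point in later planning steps. The following is a minimal sketch on a binary grid; the 3x3 square structuring element is an assumption, and in practice its radius would match the vehicle footprint.

```python
# Dilate a binary grid (1 = obstacle) with a 3x3 square structuring element.
def dilate(grid):
    rows, cols = len(grid), len(grid[0])
    out = [[0] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            if grid[r][c]:
                # Set every cell covered by the structuring element.
                for dr in (-1, 0, 1):
                    for dc in (-1, 0, 1):
                        rr, cc = r + dr, c + dc
                        if 0 <= rr < rows and 0 <= cc < cols:
                            out[rr][cc] = 1
    return out

obstacle = [[0, 0, 0, 0],
            [0, 1, 0, 0],
            [0, 0, 0, 0]]
expanded = dilate(obstacle)   # the single obstacle cell grows to a 3x3 block
```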
3.3. Generating Algorithm of Candidate Guidance Points
3.3.1. An Improved Zhang–Suen Topology Thinning Algorithm
- Define the foreground pixel value of a binary image to be 1 and the background pixel value to be 0. For a foreground pixel P<sub>1</sub>, the set of its eight neighbors is shown in Figure 7, where P<sub>2</sub>~P<sub>9</sub> are the pixels adjacent to P<sub>1</sub>;
- B(P<sub>1</sub>) = P<sub>2</sub> + P<sub>3</sub> + … + P<sub>9</sub> is the number of non-zero neighbors of P<sub>1</sub>;
- A(P<sub>1</sub>) is defined as the number of times the pixel value changes from 1 to 0 when traversing the neighbor pixels clockwise starting from P<sub>2</sub>;
- The two steps are repeated until no pixel is deleted in either step; the output is the thinned skeleton of the foreground of the binary image.
- (1) 2 ≤ B(P<sub>1</sub>) ≤ 6;
- (2) A(P<sub>1</sub>) = 1;
- (3) P<sub>2</sub> × P<sub>4</sub> × P<sub>6</sub> = 0;
- (4) P<sub>4</sub> × P<sub>6</sub> × P<sub>8</sub> = 0.
- (1) 2 ≤ B(P<sub>1</sub>) ≤ 6;
- (2) A(P<sub>1</sub>) = 1;
- (3) P<sub>2</sub> × P<sub>4</sub> × P<sub>8</sub> = 0;
- (4) P<sub>2</sub> × P<sub>6</sub> × P<sub>8</sub> = 0.
- (1)–(5) In the improved algorithm, pixels retained by the two steps above are additionally matched against the redundant-pixel removal templates (Figure 9); a pixel whose neighborhood matches any template is deleted, so that the resulting skeleton is single-pixel wide.
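The two sub-iterations can be sketched as follows. This is the standard Zhang–Suen thinning; the improved algorithm's template-based redundant-pixel removal is omitted here.

```python
def zhang_suen_thin(img):
    """Thin a binary image (lists of 0/1, zero border) to a skeleton."""
    img = [row[:] for row in img]
    rows, cols = len(img), len(img[0])

    def neighbours(r, c):
        # P2..P9, clockwise starting from the pixel above P1.
        return [img[r-1][c], img[r-1][c+1], img[r][c+1], img[r+1][c+1],
                img[r+1][c], img[r+1][c-1], img[r][c-1], img[r-1][c-1]]

    def transitions(n):
        # A(P1): transitions in the cyclic sequence P2, ..., P9, P2.
        return sum(1 for a, b in zip(n, n[1:] + n[:1]) if a == 0 and b == 1)

    changed = True
    while changed:
        changed = False
        for step in (0, 1):
            to_delete = []
            for r in range(1, rows - 1):
                for c in range(1, cols - 1):
                    if img[r][c] != 1:
                        continue
                    n = neighbours(r, c)
                    p2, p4, p6, p8 = n[0], n[2], n[4], n[6]
                    if not (2 <= sum(n) <= 6 and transitions(n) == 1):
                        continue
                    if step == 0 and p2 * p4 * p6 == 0 and p4 * p6 * p8 == 0:
                        to_delete.append((r, c))
                    if step == 1 and p2 * p4 * p8 == 0 and p2 * p6 * p8 == 0:
                        to_delete.append((r, c))
            for r, c in to_delete:   # apply each sub-iteration in parallel
                img[r][c] = 0
                changed = True
    return img

# A 3-pixel-thick bar thins down to a single-pixel-wide line.
bar = [[0] * 9 for _ in range(5)]
for r in range(1, 4):
    for c in range(1, 8):
        bar[r][c] = 1
skeleton = zhang_suen_thin(bar)
```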
3.3.2. Smooth Skeleton Generation Based on Convex Hull Transform
- (1) The leftmost and rightmost grids of each row in the obstacle area are marked as unremovable;
- (2) The grids at the top and bottom of each column in the obstacle area are marked as unremovable;
- (3) The remaining unmarked obstacle grids in the obstacle area are eliminated.
- (1) Traverse the leftmost grid G of each row. If the leftmost grids G<sub>u</sub> and G<sub>d</sub> of the rows above and below are both to the left of G, or one of them is to the left of G and the other is in the same column as G, then G is not a convex hull vertex;
- (2) Traverse the rightmost grid G of each row. If the rightmost grids G<sub>u</sub> and G<sub>d</sub> of the rows above and below are both to the right of G, or one of them is to the right of G and the other is in the same column as G, then G is not a convex hull vertex;
- (3) Traverse the top grid G of each column. If the top grids G<sub>l</sub> and G<sub>r</sub> of the columns to the left and right are both above G, or one of them is above G and the other is in the same row as G, then G is not a convex hull vertex;
- (4) Traverse the bottom grid G of each column. If the bottom grids G<sub>l</sub> and G<sub>r</sub> of the columns to the left and right are both below G, or one of them is below G and the other is in the same row as G, then G is not a convex hull vertex.
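The convex hull transform replaces each irregular obstacle region by its convex hull. As a stand-in for the row/column scanning search above, this sketch finds the hull vertices of a set of occupied cells with Andrew's monotone chain algorithm.

```python
def convex_hull(cells):
    """Return the convex hull vertices of (col, row) cells in CCW order."""
    pts = sorted(set(cells))
    if len(pts) <= 2:
        return pts

    def cross(o, a, b):
        # z-component of (a - o) x (b - o); > 0 means a left turn.
        return (a[0]-o[0])*(b[1]-o[1]) - (a[1]-o[1])*(b[0]-o[0])

    lower, upper = [], []
    for p in pts:                       # build the lower hull
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):             # build the upper hull
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    return lower[:-1] + upper[:-1]      # drop duplicated endpoints

# An L-shaped obstacle region: the concave corner (1, 1) is not a hull vertex.
region = [(0, 0), (1, 0), (2, 0), (0, 1), (1, 1), (0, 2)]
hull = convex_hull(region)
```

Filling the interior of the returned polygon (e.g., row by row) then yields the convexified obstacle region used for smooth skeleton extraction.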
3.3.3. Optimization of Skeleton Shape
- The definitions of the foreground pixel P<sub>1</sub>, its eight neighbors, and the neighbor count B(P<sub>1</sub>) in Section 3.3.1 are followed;
- Iterate through all the pixels in the foreground;
- Mark each pixel with B(P<sub>1</sub>) = 1 as an endpoint;
- Mark each pixel with B(P<sub>1</sub>) = 3 or B(P<sub>1</sub>) = 4 as a fork point.
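The endpoint/fork labelling can be sketched as below. An endpoint has exactly one foreground neighbor; for fork points this sketch uses the crossing number A(P<sub>1</sub>) ≥ 3 as a robust stand-in for the raw neighbor-count test, since diagonal neighbors next to a junction can inflate B(P<sub>1</sub>).

```python
def classify_skeleton(img):
    """Return (endpoints, forks) of a single-pixel-wide binary skeleton."""
    rows, cols = len(img), len(img[0])
    endpoints, forks = [], []
    for r in range(1, rows - 1):
        for c in range(1, cols - 1):
            if img[r][c] != 1:
                continue
            # P2..P9 clockwise, starting from the pixel above P1.
            n = [img[r-1][c], img[r-1][c+1], img[r][c+1], img[r+1][c+1],
                 img[r+1][c], img[r+1][c-1], img[r][c-1], img[r-1][c-1]]
            b = sum(n)                                        # B(P1)
            a = sum(1 for x, y in zip(n, n[1:] + n[:1])       # A(P1)
                    if x == 0 and y == 1)
            if b == 1:
                endpoints.append((r, c))
            elif a >= 3:
                forks.append((r, c))
    return endpoints, forks

# A '+'-shaped skeleton: four endpoints and one fork point at the centre.
sk = [[0] * 7 for _ in range(7)]
for i in range(1, 6):
    sk[3][i] = 1   # horizontal stroke
    sk[i][3] = 1   # vertical stroke
ends, forks = classify_skeleton(sk)
```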
3.3.4. Candidate Guidance Point Generation
4. Human–Machine Cooperation-Based UGV Teleoperation
4.1. Human–Machine Interaction Mechanism Design
- The autonomous control system generates candidate guidance points periodically and sends them to the teleoperation system.
- The teleoperation system overlays the candidate guidance points with the video images and displays them.
- The operator selects a candidate point as the guidance point of the UGV according to the control intention.
- The teleoperation system sends the selected guidance point back to the UGV, and the autonomous control system takes the guidance point as the input to generate the desired trajectory line based on the kinematics constraints of the UGV.
- During trajectory tracking, if the operator selects a new guidance point, the UGV immediately regenerates its trajectory based on the new guidance point.
- When the UGV is close to the current guidance point, if the operator has not yet provided a new guidance point, the autonomous control system will select a candidate guidance point as the tracking target point by itself according to the principle of minimum trajectory curvature.
- If no new guidance point is available, the UGV will stop automatically.
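The fallback rule above can be sketched as follows: when the operator has not supplied a new guidance point in time, the autonomous system picks the candidate reachable with the smallest trajectory curvature. The circular-arc curvature formula below is an illustrative stand-in for the paper's trajectory generator.

```python
import math

def arc_curvature(x, y, heading, gx, gy):
    """Curvature of the circular arc from pose (x, y, heading) to (gx, gy)."""
    dx, dy = gx - x, gy - y
    dist = math.hypot(dx, dy)
    # Angle between the current heading and the line of sight to the goal.
    alpha = math.atan2(dy, dx) - heading
    return abs(2.0 * math.sin(alpha) / dist)

def select_guidance_point(pose, candidates):
    """Pick the candidate guidance point with minimum trajectory curvature."""
    x, y, heading = pose
    return min(candidates, key=lambda g: arc_curvature(x, y, heading, *g))

pose = (0.0, 0.0, 0.0)                        # UGV at origin, facing +x
candidates = [(10.0, 5.0), (10.0, 0.5), (5.0, -8.0)]
chosen = select_guidance_point(pose, candidates)
```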
4.2. Trajectory Generation Based on Kinematic Constraints
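Trajectory generation under kinematic constraints can be sketched by rolling out a kinematic bicycle model with a bounded steering angle; the wheelbase, speed, and steering limit below are illustrative assumptions, not the test vehicle's parameters.

```python
import math

WHEELBASE = 2.5          # [m] (assumed)
MAX_STEER = 0.5          # [rad] (assumed)

def rollout(pose, steer, v=5.0, dt=0.1, steps=20):
    """Integrate the kinematic bicycle model for a fixed steering angle."""
    steer = max(-MAX_STEER, min(MAX_STEER, steer))
    x, y, th = pose
    traj = [(x, y)]
    for _ in range(steps):
        x += v * math.cos(th) * dt
        y += v * math.sin(th) * dt
        th += v * math.tan(steer) / WHEELBASE * dt
        traj.append((x, y))
    return traj

# A fan of candidate trajectories for several steering angles, like the
# candidate curves superimposed on the operator's video image.
fan = [rollout((0.0, 0.0, 0.0), s) for s in (-0.4, -0.2, 0.0, 0.2, 0.4)]
```

Because every rollout respects the steering bound, each candidate curve automatically satisfies the vehicle's minimum turning radius.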
5. Experiments
5.1. Test System
5.1.1. Wheeled Unmanned Ground Vehicle (UGV)
5.1.2. Teleoperation System
5.1.3. Wireless Communication System
5.2. Experimental Design
5.3. Experimental Results
5.3.1. Maneuvering Task Performance
5.3.2. Handling Stability
5.4. Experiment Analysis
6. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
Average speed [m/s] in the three test routes (R. C. = remote control; H. C. C. = human–machine cooperative control).

| Subject | Route 1, R. C. | Route 1, H. C. C. | Route 2, R. C. | Route 2, H. C. C. | Route 3, R. C. | Route 3, H. C. C. |
|---|---|---|---|---|---|---|
| 1 | 5.70 | 6.96 | 7.98 | 8.34 | 8.66 | 9.17 |
| 2 | 6.38 | 5.94 | 5.67 | 6.83 | 7.02 | 7.89 |
| 3 | 5.93 | 6.33 | 7.58 | 6.17 | 9.61 | 10.36 |
| 4 | 5.24 | 5.78 | 8.86 | 9.84 | 7.37 | 7.61 |
| Mean | 5.81 | 6.25 | 7.52 | 7.80 | 8.17 | 8.76 |
MAD of yaw rate [°/s] in the three test routes (R. C. = remote control; H. C. C. = human–machine cooperative control).

| Subject | Route 1, R. C. | Route 1, H. C. C. | Route 2, R. C. | Route 2, H. C. C. | Route 3, R. C. | Route 3, H. C. C. |
|---|---|---|---|---|---|---|
| 1 | 0.97 | 0.79 | 2.68 | 2.78 | 1.52 | 0.96 |
| 2 | 1.10 | 0.90 | 2.30 | 3.48 | 1.84 | 1.21 |
| 3 | 1.13 | 0.69 | 2.38 | 2.59 | 1.62 | 0.91 |
| 4 | 0.95 | 0.67 | 3.61 | 2.92 | 1.38 | 1.00 |
| Mean | 1.04 | 0.76 | 2.74 | 2.94 | 1.59 | 1.02 |
MAD of sideslip angle [°] in the three test routes (R. C. = remote control; H. C. C. = human–machine cooperative control).

| Subject | Route 1, R. C. | Route 1, H. C. C. | Route 2, R. C. | Route 2, H. C. C. | Route 3, R. C. | Route 3, H. C. C. |
|---|---|---|---|---|---|---|
| 1 | 0.25 | 0.16 | 0.45 | 0.37 | 0.33 | 0.21 |
| 2 | 0.24 | 0.21 | 0.51 | 0.40 | 0.41 | 0.19 |
| 3 | 0.20 | 0.14 | 0.54 | 0.36 | 0.29 | 0.26 |
| 4 | 0.18 | 0.15 | 0.33 | 0.43 | 0.37 | 0.24 |
| Mean | 0.22 | 0.17 | 0.46 | 0.42 | 0.35 | 0.23 |
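The handling-stability tables report the mean absolute deviation (MAD) of yaw rate and sideslip angle; assuming MAD denotes the mean absolute deviation about the mean, it is computed as follows (the sample values are illustrative, not experiment data).

```python
def mean_absolute_deviation(signal):
    """MAD of a signal: mean of |x_i - mean(x)| over all samples."""
    m = sum(signal) / len(signal)
    return sum(abs(v - m) for v in signal) / len(signal)

yaw_rate = [0.2, -0.5, 1.1, 0.9, -0.3]   # illustrative samples [deg/s]
mad = mean_absolute_deviation(yaw_rate)
```

A smaller MAD means the signal stays closer to its mean, i.e., smoother vehicle motion.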
© 2021 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
Share and Cite
Zhu, S.; Xiong, G.; Chen, H.; Gong, J. Guidance Point Generation-Based Cooperative UGV Teleoperation in Unstructured Environment. Sensors 2021, 21, 2323. https://doi.org/10.3390/s21072323