CN115031718A - Unmanned ship synchronous positioning and mapping (SLAM) method and system with multi-sensor fusion - Google Patents

Info

Publication number
CN115031718A
Authority
CN
China
Prior art keywords
slam
point cloud
unmanned ship
mapping
laser radar
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202210575283.7A
Other languages
Chinese (zh)
Other versions
CN115031718B (en)
Inventor
余道洋 (Yu Daoyang)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhongyu Future (Hefei) Water Technology Co.,Ltd.
Original Assignee
Anhui Zhongkeheding Technology Development Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Anhui Zhongkeheding Technology Development Co., Ltd.
Priority to CN202210575283.7A
Publication of CN115031718A
Application granted
Publication of CN115031718B
Legal status: Active

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/38 Electronic maps specially adapted for navigation; Updating thereof
    • G01C21/3804 Creation or updating of map data
    • G01C21/3833 Creation or updating of map data characterised by the source of data
    • G01C21/3841 Data obtained from two or more sources, e.g. probe vehicles
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88 Lidar systems specially adapted for specific applications
    • G01S17/89 Lidar systems specially adapted for specific applications for mapping or imaging
    • G01S17/894 3D imaging with simultaneous measurement of time-of-flight at a 2D array of receiver pixels, e.g. time-of-flight cameras or flash lidar
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02A TECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
    • Y02A90/00 Technologies having an indirect contribution to adaptation to climate change
    • Y02A90/30 Assessment of water resources

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Electromagnetism (AREA)
  • Navigation (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The invention discloses a multi-sensor fusion unmanned ship synchronous positioning and mapping (SLAM) method, which comprises the following steps: S1, detecting the water surface environment in front of the unmanned ship with a solid-state laser radar and generating first point cloud data; S2, extracting features of the first point cloud data, obtaining corner and surface features for SLAM key frame registration; S3, detecting the surroundings of the unmanned ship with a mechanical laser radar and generating second point cloud data; S4, extracting features of the second point cloud data and fusing them with the features extracted in S2 to form a key frame; and S5, the SLAM system receives the feature-fused key frame from S4 and processes it to perform synchronous positioning and map construction.

Description

Unmanned ship synchronous positioning and mapping method (SLAM) and system based on multi-sensor fusion
Technical Field
The invention relates to the technical field of autonomous cruising of unmanned ships, and in particular to a multi-sensor fusion unmanned ship synchronous positioning and mapping (SLAM) method and system.
Background
Fully autonomous cruising of an unmanned ship on an inland river is an internationally recognized difficult problem, owing to interference with the environment sensors from light reflected off the water, random wave disturbance, occlusion by riverbank buildings, fog and the like. When the unmanned ship's signal is interfered with or blocked, its position and state information is lost. Wei Wang et al. at the Massachusetts Institute of Technology developed the Roboat unmanned ship for urban water traffic in Amsterdam, but likewise encountered bridges and buildings blocking GPS reception, and the laser radar produced many erroneous observations under direct sunlight. A Chinese invention patent (application No. 202110783834.4) surveys the precise position of a bridge in advance; before crossing, the unmanned ship anchors in front of the bridge to fix its starting longitude, latitude and heading angle, and then crosses quickly without control along the fixed heading angle.
Existing technology at home and abroad therefore struggles to solve the positioning and navigation of an unmanned ship when water surface signals are limited, and cannot achieve fully autonomous cruising. Simultaneous localization and mapping (SLAM), commonly used by unmanned systems such as robots, drones and unmanned vehicles, solves navigation, positioning and map updating, but existing SLAM sensors adapt poorly to the complex environment of water surface scenes. The key to autonomous cruising of an unmanned ship in a complex water surface environment is therefore a SLAM technique specialized for the water surface.
Disclosure of Invention
The invention aims to provide a multi-sensor fusion unmanned ship synchronous positioning and mapping (SLAM) method and system, so as to solve the problems identified in the background art.
In order to achieve the purpose, the invention provides the following technical scheme:
A multi-sensor fusion unmanned ship synchronous positioning and mapping (SLAM) method, wherein the synchronous positioning and mapping process comprises the following steps:
S1, detecting the water surface environment in front of the unmanned ship with a solid-state laser radar and generating first point cloud data;
S2, extracting features of the first point cloud data, obtaining corner and surface features for SLAM key frame registration;
S3, detecting the surroundings of the unmanned ship with a mechanical laser radar and generating second point cloud data;
S4, extracting features of the second point cloud data and fusing them with the features extracted in S2 to form a key frame;
S5, the SLAM system receives the feature-fused key frame from S4 and processes it to perform synchronous positioning and map construction.
As a further scheme of the invention: the SLAM adopts a factor graph optimization method, estimating the ship state given the sensor measurements,

$\mathbf{x} = (d_x, d_y, \gamma)^T$,

where $d_x$ and $d_y$ denote the relative position between the unmanned ship and the currently observed landmark, and $\gamma$ denotes the unmanned ship's heading angle; the SLAM system factors comprise laser radar odometry factors, GPS factors and loop closure factors.
As a further scheme of the invention: the factor adds new nodes using heuristic methods
Figure BDA0003661891910000022
When the change in position or rotation of the vessel exceeds a user-defined threshold, a new node x is added, and the factor graph after the new node is inserted is optimized using the incremental smoothing and mapping of the bayesian tree.
As a further scheme of the invention: the construction of the laser radar odometry factor comprises the following steps:
T1, on the basis of the features extracted from the raw solid-state laser radar point cloud, further extracting the edge features of the water surface, the shore and other solid objects on the water in the point cloud image, using them as the selection criterion for point cloud frames, and then performing inter-frame feature matching;
T2, the inter-frame feature registration algorithm matches the features of two adjacent frames of laser scan data, establishes an objective function with the Euclidean distance as its metric, and solves a least-squares problem whose optimization variable is the transformation (translation and rotation) between the two scan point clouds, thereby obtaining the local pose transformation $T_{(t,t+1)}$ of the unmanned ship over the interval between frames $t$ and $t+1$;
T3, for the two adjacent laser radar point cloud frames $F_t$, $F_{t+1}$ obtained at times $t$ and $t+1$, the extracted feature corner point set and feature plane point set are $F_t^c$ and $F_t^p$ respectively; all features extracted at time $t$ form the laser radar point cloud frame $F_t$, where $F_t = \{F_t^c, F_t^p\}$;
T4, when the attitude change of the vessel exceeds a user-defined threshold, the laser radar frame $F_{t+1}$ is selected as a key frame; the selection rule is that the position change or rotation angle change of the vessel exceeds a set threshold;
T5, the laser radar frames between two key frames are discarded; when a laser radar key frame is selected, scan matching is used to compute the relative transformation between the new key frame and the preceding sub key frames, obtained by transforming the previous $n$ key frames into the global frame and merging them into a voxel map $M_t$; $M_t$ consists of two sub voxel maps corresponding to the two feature types, corner and plane, extracted in the previous step, denoted $M_t^c$ (corner feature voxel map) and $M_t^p$ (plane feature voxel map); the relationship between the feature sets and the voxel map can be expressed as

$M_t = \{M_t^c, M_t^p\}$,

where

$M_t^c = {}'F_t^c \cup {}'F_{t-1}^c \cup \cdots \cup {}'F_{t-n}^c$,
$M_t^p = {}'F_t^p \cup {}'F_{t-1}^p \cup \cdots \cup {}'F_{t-n}^p$,

and ${}'F_t^c$ and ${}'F_t^p$ are the corner and plane features transformed into the global frame; the new laser radar key frame $F_{t+1}$ is transformed into the global frame using an initial guess from the IMU, the transformed new key frame being denoted ${}'F_{t+1}$; scan matching between this key frame and the voxel map is performed by nearest-neighbor search, and the resulting relative transformation $\Delta T_{t,t+1}$ is added to the factor graph as the laser radar odometry factor.
As a further scheme of the invention: according to the GPS combined navigation factor construction method, in an SLAM system, an unmanned ship utilizes absolute measurement values of a GPS, Beidou and inertial navigation combined module, a combined navigation pose information source is merged into a point cloud frame of a laser radar odometer through a timestamp, and the point cloud frame is taken as a GPS factor and is included in a factor graph.
As a further scheme of the invention: loop detection of the loop factors in the SLAM system is used for calibration of laser radar odometer factors to eliminate time-accumulated pose drift, and the loop factor keyframe matching comprises the following steps:
l1, inputting the feature point cloud at the current moment;
l2, searching the most adjacent points and the secondary adjacent points of the feature point cloud at the current moment in the feature point cloud at the previous moment;
l3, calculating the distance between a point in the feature point cloud at the current moment and a neighboring edge line;
l4, stacking distance constraint, and iteratively solving relative change through an L-M algorithm;
l5, obtaining iteration times and judging whether the iteration times are equal to a set value and are smaller than the maximum value, if so, changing the feature point cloud at the current time through relative change and executing a step L1, otherwise, executing a step L6;
l6, judging whether the iteration times are equal to the maximum value and the distances are smaller than the threshold value, if not, executing a step L4, and if so, executing a step L7;
l7, output relative change solution.
As a further scheme of the invention: the SLAM map is formed by acquiring the accurate pose of each key frame and the pose of the three-dimensional point cloud corresponding to each key frame, converting the laser coordinate system of the laser three-dimensional point cloud coordinate at each moment into the global map coordinate through coordinate change, and splicing to produce the three-dimensional map.
As a further scheme of the invention: after the synchronous positioning and map construction, when satellite signals are normal, the unmanned ship uses the satellite and the SLAM to carry out combined navigation, carries out positioning and map updating on the unmanned ship, when the satellite signals are insufficient and the satellite navigation cannot be used, automatically switches to an inertial navigation mode and calculates the position and posture information of the unmanned ship, and the position and posture information is used as the position and posture calibration and the time stamp synchronization of the SLAM, and carries out navigation through an SLAM system
The unmanned ship synchronous positioning and mapping system comprises an unmanned ship on which a SLAM system is arranged, together with a solid-state laser radar, a low-beam panoramic mechanical laser radar, a satellite module and an inertial navigation module in signal communication with the SLAM system.
Compared with the prior art, the invention has the following beneficial effects: based on the SLAM algorithm, the invention deeply optimizes and improves the SLAM system's companion algorithms so that the unmanned ship synchronous positioning and map construction method suits complex water surface environments. It addresses the sensitivity of conventional fused vision and laser radar sensors to light interference and their large registration errors, enables the unmanned ship to automatically detect and identify obstacles, bridges and other targets in a complex water surface environment and cruise autonomously, and allows safe autonomous cruising via the SLAM technique even without satellite signals.
Drawings
FIG. 1 is a schematic diagram of SLAM principle of an inland river unmanned ship optimized by a factor graph in the embodiment;
FIGS. 2 and 3 are schematic diagrams of key feature pose transformation of the water surface bridge in the embodiment;
FIG. 4 is a schematic flow chart of the key frame matching algorithm in the embodiment;
FIG. 5 is a schematic diagram of map stitching of the three-dimensional point cloud in this embodiment.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Referring to FIGS. 1 to 5, in an embodiment of the present invention, an unmanned ship synchronous positioning and mapping system comprises an unmanned ship on which a SLAM system is arranged, together with a solid-state laser radar, a low-beam panoramic mechanical laser radar, a satellite module and an inertial navigation module in signal communication with the SLAM system. The solid-state laser radar detects the water surface environment within a certain field of view in front of the unmanned ship (e.g. 120 degrees), generates point cloud data, and further recognizes the point cloud image to identify the type of water surface target, such as a river bank, bridge, buoy, ship or other obstacle; the recognition result automatically switches the unmanned ship's cruising scene, for example bridge crossing or obstacle avoidance. The mechanical laser radar detects the 360-degree panoramic water surface environment around the unmanned ship and generates point cloud data.
A multi-sensor fusion unmanned ship synchronous positioning and mapping (SLAM) method performs synchronous positioning and mapping on the basis of SLAM (simultaneous localization and mapping), as commonly used by unmanned systems such as robots, drones and unmanned vehicles. The synchronous positioning and mapping in this embodiment is based on the SLAM system, and the process comprises the following steps (a sketch of the fusion step follows the list):
S1, detecting the water surface environment in front of the unmanned ship with the solid-state laser radar and generating first point cloud data;
S2, extracting features of the first point cloud data, obtaining corner and surface features for SLAM key frame registration;
S3, detecting the surroundings of the unmanned ship with a mechanical laser radar and generating second point cloud data;
S4, extracting features of the second point cloud data, fusing them with the features extracted in S2, and synchronizing timestamps of the fused data to form a key frame, which serves as the key frame for laser radar odometry and loop detection registration in the SLAM system; at the same time the point cloud data is used to compute the positions and distances of banks or obstacles on both sides of and behind the unmanned ship, for obstacle avoidance or bank-following cruising;
S5, the SLAM system receives the feature-fused key frame from S4 and processes it to perform synchronous positioning and map construction.
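The S1-S5 pipeline can be pictured with the following minimal sketch. It is not the patented implementation: the names (FeatureSet, extract_features), the smoothness stand-in, and the 0.05 s timestamp tolerance are assumptions for illustration only.

```python
# Hedged sketch of the S1-S5 fusion pipeline; all names and thresholds are
# illustrative assumptions, not taken from the patent.
from dataclasses import dataclass
from typing import Optional
import numpy as np

@dataclass
class FeatureSet:
    stamp: float          # acquisition time in seconds
    corners: np.ndarray   # (N, 3) corner (edge) feature points
    surfaces: np.ndarray  # (M, 3) surface (planar) feature points

def extract_features(cloud: np.ndarray, stamp: float) -> FeatureSet:
    """Stand-in for S2/S4 feature extraction: split points into corner and
    surface features by a crude smoothness score (real criterion omitted)."""
    score = np.linalg.norm(cloud - cloud.mean(axis=0), axis=1)
    score = score / (score.max() + 1e-9)
    return FeatureSet(stamp, corners=cloud[score > 0.7],
                      surfaces=cloud[score <= 0.7])

def fuse_into_keyframe(front: FeatureSet, pano: FeatureSet,
                       max_dt: float = 0.05) -> Optional[FeatureSet]:
    """S4: merge the forward solid-state lidar features with the 360-degree
    mechanical lidar features, keeping the pair only if timestamps agree."""
    if abs(front.stamp - pano.stamp) > max_dt:
        return None                      # timestamps too far apart to fuse
    return FeatureSet(stamp=front.stamp,
                      corners=np.vstack([front.corners, pano.corners]),
                      surfaces=np.vstack([front.surfaces, pano.surfaces]))
```

The fused FeatureSet would then be handed to the SLAM back end (S5) as the registration key frame.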
The SLAM system consists of a solid-state laser radar, a mechanical laser radar and an integrated navigation sensor, and adopts a factor graph optimization method. The factor-graph-optimized SLAM method estimates the ship state given the sensor measurements, $\mathbf{x} = (d_x, d_y, \gamma)^T$, where $d_x$ and $d_y$ denote the relative position between the unmanned ship and the currently observed landmark, and $\gamma$ denotes the unmanned ship's heading angle. To combine the measurements from the various sensors seamlessly, the SLAM system models them with a factor graph; its three factor types are the laser radar odometry factor, the GPS factor and the loop closure factor. To bound memory use and keep the positioning system efficient, new nodes $x$ are added to the graph heuristically: a new node is added only when the position or rotation change of the vessel exceeds a user-defined threshold, after which the factor graph with the newly inserted node is optimized using incremental smoothing and mapping over a Bayes tree (iSAM2). A minimal sketch of this node-insertion loop follows.
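The sketch below assumes the GTSAM Python bindings, whose ISAM2 class implements incremental smoothing and mapping over a Bayes tree; the thresholds and noise sigmas are illustrative assumptions, and the graph is assumed to already hold a prior factor on the first node.

```python
# Hedged sketch of heuristic node insertion + iSAM2 update (assumes gtsam).
import numpy as np
import gtsam

POS_THRESH = 1.0               # meters, user-defined threshold (assumed)
ROT_THRESH = np.deg2rad(10.0)  # radians, user-defined threshold (assumed)

isam = gtsam.ISAM2()
odom_noise = gtsam.noiseModel.Diagonal.Sigmas(np.array([0.1, 0.1, 0.02]))

def maybe_add_node(key: int, last_pose: gtsam.Pose2, new_pose: gtsam.Pose2):
    """Insert node x_{key+1} only when the pose change exceeds a threshold,
    then run one incremental Bayes-tree optimization step."""
    delta = last_pose.between(new_pose)
    if (np.hypot(delta.x(), delta.y()) < POS_THRESH
            and abs(delta.theta()) < ROT_THRESH):
        return key, last_pose                      # change too small: no node
    graph = gtsam.NonlinearFactorGraph()
    values = gtsam.Values()
    graph.add(gtsam.BetweenFactorPose2(key, key + 1, delta, odom_noise))
    values.insert(key + 1, new_pose)
    isam.update(graph, values)                     # incremental optimization
    return key + 1, isam.calculateEstimate().atPose2(key + 1)
```

A Pose2 state is used because it matches the $(d_x, d_y, \gamma)$ form above; a full 6-DOF implementation would use Pose3 instead.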
The construction of the laser radar odometry factor comprises the following steps (a sketch of the registration, key frame gate and voxel map merge follows the list):
T1, on the basis of the features extracted from the raw solid-state laser radar point cloud, further extracting the edge features of the water surface, the shore and other solid objects on the water in the point cloud image, using them as the selection criterion for point cloud frames, and then performing inter-frame feature matching;
T2, the inter-frame feature registration algorithm matches the features of two adjacent frames of laser scan data, establishes an objective function with the Euclidean distance as its metric, and solves a least-squares problem whose optimization variable is the transformation (translation and rotation) between the two scan point clouds, thereby obtaining the local pose transformation $T_{(t,t+1)}$ of the unmanned ship over the interval between frames $t$ and $t+1$;
T3, for the two adjacent laser radar point cloud frames $F_t$, $F_{t+1}$ obtained at times $t$ and $t+1$, the extracted feature corner point set and feature plane point set are $F_t^c$ and $F_t^p$ respectively; all features extracted at time $t$ form the laser radar point cloud frame $F_t$, where $F_t = \{F_t^c, F_t^p\}$;
T4, SLAM generally adopts key frame selection to improve computational efficiency and speed, a concept widely used in visual SLAM; when the attitude change of the vessel exceeds a user-defined threshold, the laser radar frame $F_{t+1}$ is selected as a key frame; the selection rule is that the position change or rotation angle change of the vessel exceeds a set threshold;
T5, the laser radar frames between two key frames are discarded; when a laser radar key frame is selected, scan matching is used to compute the relative transformation between the new key frame and the preceding sub key frames, obtained by transforming the previous $n$ key frames into the global frame and merging them into a voxel map $M_t$; $M_t$ consists of two sub voxel maps corresponding to the two feature types, corner and plane, extracted in the previous step, denoted $M_t^c$ (corner feature voxel map) and $M_t^p$ (plane feature voxel map); the relationship between the feature sets and the voxel map can be expressed as

$M_t = \{M_t^c, M_t^p\}$,

where

$M_t^c = {}'F_t^c \cup {}'F_{t-1}^c \cup \cdots \cup {}'F_{t-n}^c$,
$M_t^p = {}'F_t^p \cup {}'F_{t-1}^p \cup \cdots \cup {}'F_{t-n}^p$,

and ${}'F_t^c$ and ${}'F_t^p$ are the corner and plane features transformed into the global frame; the new laser radar key frame $F_{t+1}$ is transformed into the global frame using an initial guess from the IMU, the transformed new key frame being denoted ${}'F_{t+1}$; scan matching between this key frame and the voxel map is performed by nearest-neighbor search, and the resulting relative transformation $\Delta T_{t,t+1}$ is added to the factor graph as the laser radar odometry factor.
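Steps T2, T4 and T5 reduce to a least-squares alignment, a threshold gate, and a map merge. The sketch below shows each under simplifying assumptions (point correspondences given, illustrative thresholds, a uniform voxel grid); it is not the patent's exact procedure.

```python
# Hedged sketch of T2 (closed-form least-squares alignment via SVD/Kabsch),
# T4 (key-frame gate) and T5 (merging the last n key frames into voxel map M_t).
import numpy as np

def least_squares_transform(P: np.ndarray, Q: np.ndarray):
    """T2: solve min_{R,t} sum ||R p_i + t - q_i||^2 for matched points."""
    cp, cq = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cp).T @ (Q - cq)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:          # guard against a reflection solution
        Vt[-1] *= -1
        R = Vt.T @ U.T
    return R, cq - R @ cp

def is_keyframe(R, t, pos_thresh=1.0, rot_thresh=np.deg2rad(10)):
    """T4: accept F_{t+1} as a key frame when the pose change is large enough."""
    angle = np.arccos(np.clip((np.trace(R) - 1) / 2, -1.0, 1.0))
    return np.linalg.norm(t) > pos_thresh or angle > rot_thresh

def build_voxel_map(keyframes, poses, n=10, voxel=0.4):
    """T5: transform the last n key frames into the global frame and merge
    them into a voxel-downsampled map (one representative point per voxel)."""
    pts = np.vstack([(R @ F.T).T + t
                     for F, (R, t) in zip(keyframes[-n:], poses[-n:])])
    _, idx = np.unique(np.floor(pts / voxel).astype(int), axis=0,
                       return_index=True)
    return pts[idx]
```

In a real system the correspondences for least_squares_transform come from the nearest-neighbor search mentioned in T5, and the corner and plane features would be kept in separate sub voxel maps.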
For the construction of the GPS integrated navigation factor: in the SLAM system, the unmanned ship can achieve low-drift state estimation using the laser radar odometry factor alone, but the positioning system still drifts during long navigation tasks. Therefore, absolute measurements from the GPS, BeiDou and inertial navigation integrated module are used: the integrated navigation pose information is merged by timestamp into the point cloud frames of the laser radar odometry and incorporated into the factor graph as a GPS factor, as sketched below.
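A minimal timestamp association for the GPS factor might look like the following; the field names and the 0.1 s tolerance are assumptions for illustration.

```python
# Hedged sketch: pick the integrated-navigation fix nearest in time to a lidar
# odometry frame, to be added to the factor graph as a GPS factor.
import bisect

def attach_gps_factor(frame_stamp, gps_stamps, gps_fixes, max_dt=0.1):
    """Return the fix closest in time to the lidar frame, or None if no fix
    is close enough to serve as a GPS factor (gps_stamps must be sorted)."""
    i = bisect.bisect_left(gps_stamps, frame_stamp)
    candidates = [j for j in (i - 1, i) if 0 <= j < len(gps_stamps)]
    if not candidates:
        return None
    best = min(candidates, key=lambda j: abs(gps_stamps[j] - frame_stamp))
    if abs(gps_stamps[best] - frame_stamp) > max_dt:
        return None                      # no fix within the tolerance window
    return gps_fixes[best]
```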
For the construction of the loop closure factor: SLAM is mainly used for positioning and mapping in unknown environments without prior information, and an existing capability of SLAM systems is judging and recognizing historical scenes that have already been mapped, generally called loop closure detection. Since an unmanned ship operating on an inland river generally executes a sequential, non-repeating path, which can be understood as an infinite corridor, the system is called a sequential SLAM; loop detection in this system is mainly used to calibrate the laser radar odometry factor and eliminate pose drift accumulated over time. The specific algorithm principle is shown in FIG. 4; the loop closure key frame matching comprises the following steps (a control-flow skeleton follows the list):
L1, input the feature point cloud at the current moment;
L2, search the feature point cloud of the previous moment for the nearest and second-nearest neighbors of each point in the current feature point cloud;
L3, calculate the distance between each point in the current feature point cloud and its neighboring edge line;
L4, stack the distance constraints and iteratively solve for the relative change with the L-M (Levenberg-Marquardt) algorithm;
L5, obtain the iteration count and judge whether it equals the set value while remaining below the maximum; if so, update the current feature point cloud by the relative change and return to step L1, otherwise go to step L6;
L6, judge whether the iteration count equals the maximum and the distances are below the threshold; if not, go to step L4, and if so, go to step L7;
L7, output the solved relative change.
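The control flow of steps L1 to L7 can be skeletonized as below; lm_step() is a hypothetical helper standing in for one Levenberg-Marquardt update over the stacked point-to-edge distance constraints, and the iteration and distance limits are assumptions.

```python
# Control-flow skeleton of the L1-L7 matching loop; lm_step is hypothetical.
import numpy as np

def match_keyframe(curr_cloud, prev_cloud, lm_step,
                   max_iters=30, dist_thresh=0.05):
    """Iteratively estimate the relative transform between the current and
    previous feature point clouds, following the flow of steps L1-L7."""
    transform = np.eye(4)                  # L1: start from the identity
    for it in range(1, max_iters + 1):
        # L2/L3: nearest + second-nearest neighbors give point-to-edge
        # distances; L4: one L-M step over the stacked constraints.
        delta, mean_dist = lm_step(curr_cloud, prev_cloud, transform)
        transform = delta @ transform      # L5: apply the relative change
        if mean_dist < dist_thresh:        # L6: distances below threshold
            return transform               # L7: output the solved change
    return transform                       # max iterations reached (L6 -> L7)
```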
Because a factor graph is used, loop detection integrates seamlessly into the SLAM system. The factor graph is mainly built from the real-time pose estimates obtained by inter-frame registration and from the observation constraints; it consists of nodes and edges, where the nodes represent the poses of the unmanned ship at different moments and the edges represent the constraint factors between those poses. A simple but effective loop detection method based on Euclidean distance is adopted.
When a new state $x_{i+1}$ is added to the factor graph, as shown in FIG. 1, the graph is searched for prior states within a certain distance of $x_{i+1}$. For example, if $x_3$ is the returned candidate state, the sub key frames $\{F_{3-m}, \ldots, F_3, \ldots, F_{3+m}\}$ are extracted and merged into a local voxel map; if a successful match between $F_{i+1}$ and this voxel map can be found, the relative transformation $\Delta T_{3,i+1}$ is obtained and added to the graph as a loop closure factor. A minimal candidate search is sketched below.
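Under the stated Euclidean-distance rule, the candidate search can be as small as a radius query over historical node positions; the 15 m radius and the recency guard below are assumptions, not values from the patent.

```python
# Hedged sketch of Euclidean-distance loop candidate search with a k-d tree.
import numpy as np
from scipy.spatial import cKDTree

def find_loop_candidate(new_xy: np.ndarray, node_xy: np.ndarray,
                        radius: float = 15.0, min_gap: int = 50):
    """Return the index of a prior node within `radius` meters of the new
    state x_{i+1}, skipping recent nodes so near-consecutive poses do not
    trigger false loops."""
    if len(node_xy) <= min_gap:
        return None
    tree = cKDTree(node_xy[:-min_gap])       # only sufficiently old nodes
    hits = tree.query_ball_point(new_xy, r=radius)
    return min(hits) if hits else None       # earliest candidate, e.g. x_3
```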
For navigation map stitching based on SLAM: after the graph optimization process, the pose of every key frame is optimal, so the accurate position and attitude of each key frame and of its corresponding three-dimensional point cloud are known; through coordinate transformation, the laser three-dimensional point cloud coordinates at each moment are converted from the laser coordinate system into the global map coordinate system, completing the stitching of the three-dimensional map. The coordinate systems of the shipborne solid-state laser radar and the mechanical laser radar are calibrated, the purpose being to transform the mechanical laser radar's point cloud into the coordinate system of the forward-mounted solid-state laser radar. The coordinate transformation can be represented by a rotation matrix R and a translation vector T, and the rotation matrix and translation vector between the two coordinate systems are finally computed with the Levenberg-Marquardt algorithm. A sketch of the stitching step follows.
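The stitching transform itself is two rigid-body transforms per key frame: the fixed extrinsic (R, T) from the calibration above, then the key frame's optimized pose. Variable names in this sketch are illustrative.

```python
# Hedged sketch of map stitching: laser frame -> global map frame.
import numpy as np

def stitch_map(keyframe_clouds, keyframe_poses,
               R_extr=np.eye(3), t_extr=np.zeros(3)):
    """Transform every key frame cloud into the global frame and concatenate.
    keyframe_poses holds the optimized (R_k, t_k) of each key frame."""
    global_points = []
    for cloud, (R_k, t_k) in zip(keyframe_clouds, keyframe_poses):
        pts = (R_extr @ cloud.T).T + t_extr          # extrinsic calibration
        global_points.append((R_k @ pts.T).T + t_k)  # laser -> global map
    return np.vstack(global_points)
```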
The integrated navigation is used for navigation, positioning and map updating of the unmanned ship: when satellite signals are normal, the unmanned ship navigates with the satellite and SLAM combined, performing positioning and map updating; when satellite signals are insufficient and cannot be used for navigation, it automatically switches to an inertial navigation mode and computes the unmanned ship's position and attitude information, which serves as the SLAM pose calibration and timestamp synchronization, with navigation carried out through the SLAM system. A sketch of the mode switch follows.
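The switching logic amounts to a small state machine; the satellite-quality fields (satellite count, HDOP) and the thresholds below are assumptions, not values from the patent.

```python
# Hedged sketch of the navigation-mode switch: satellite + SLAM while fixes
# are healthy, inertial dead reckoning + SLAM otherwise.
from enum import Enum, auto

class NavMode(Enum):
    SAT_SLAM = auto()    # satellite + SLAM integrated navigation
    INS_SLAM = auto()    # inertial dead reckoning + SLAM

def select_mode(num_satellites: int, hdop: float,
                min_sats: int = 6, max_hdop: float = 2.5) -> NavMode:
    """Switch to inertial navigation when the satellite fix is unusable;
    the INS pose still feeds the SLAM system as a calibration prior with a
    synchronized timestamp."""
    if num_satellites >= min_sats and hdop <= max_hdop:
        return NavMode.SAT_SLAM
    return NavMode.INS_SLAM
```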
It will be evident to those skilled in the art that the invention is not limited to the details of the foregoing illustrative embodiments, and that the present invention may be embodied in other specific forms without departing from the spirit or essential attributes thereof. The present embodiments are therefore to be considered in all respects as illustrative and not restrictive, the scope of the invention being indicated by the appended claims rather than by the foregoing description, and all changes which come within the meaning and range of equivalency of the claims are therefore intended to be embraced therein. Any reference sign in a claim should not be construed as limiting the claim concerned.
Furthermore, it should be understood that although the present description refers to embodiments, not every embodiment may contain only a single embodiment, and such description is for clarity only, and those skilled in the art should integrate the description, and the embodiments may be combined as appropriate to form other embodiments understood by those skilled in the art.

Claims (9)

1. A multi-sensor fusion unmanned ship synchronous positioning and mapping (SLAM) method, characterized in that the synchronous positioning and mapping process comprises the following steps:
S1, detecting the water surface environment in front of the unmanned ship with a solid-state laser radar and generating first point cloud data;
S2, extracting features of the first point cloud data, obtaining corner and surface features for SLAM key frame registration;
S3, detecting the surroundings of the unmanned ship with a mechanical laser radar and generating second point cloud data;
S4, extracting features of the second point cloud data and fusing them with the features extracted in S2 to form a key frame;
S5, the SLAM system receives the feature-fused key frame from S4 and processes it to perform synchronous positioning and map construction.
2. The multi-sensor fusion unmanned ship synchronous positioning and mapping (SLAM) method of claim 1, characterized in that the SLAM adopts a factor graph optimization method, estimating the ship state given the sensor measurements, $\mathbf{x} = (d_x, d_y, \gamma)^T$, where $d_x$ and $d_y$ denote the relative position between the unmanned ship and the currently observed landmark, and $\gamma$ denotes the unmanned ship's heading angle; the SLAM system factors comprise laser radar odometry factors, GPS factors and loop closure factors.
3. The multi-sensor fusion unmanned ship synchronous positioning and mapping (SLAM) method of claim 2, characterized in that new nodes $x$ are added to the factor graph heuristically: a new node $x$ is added only when the position or rotation change of the vessel exceeds a user-defined threshold, and the factor graph with the newly inserted node is then optimized using incremental smoothing and mapping over a Bayes tree.
4. The multi-sensor fusion unmanned ship synchronous positioning and mapping (SLAM) method of claim 2, characterized in that the construction of the laser radar odometry factor comprises the following steps:
T1, on the basis of the features extracted from the raw solid-state laser radar point cloud, further extracting the edge features of the water surface, the shore and other solid objects on the water in the point cloud image, using them as the selection criterion for point cloud frames, and then performing inter-frame feature matching;
T2, the inter-frame feature registration algorithm matches the features of two adjacent frames of laser scan data, establishes an objective function with the Euclidean distance as its metric, and solves a least-squares problem whose optimization variable is the transformation (translation and rotation) between the two scan point clouds, thereby obtaining the local pose transformation $T_{(t,t+1)}$ of the unmanned ship over the interval between frames $t$ and $t+1$;
T3, for the two adjacent laser radar point cloud frames $F_t$, $F_{t+1}$ obtained at times $t$ and $t+1$, the extracted feature corner point set and feature plane point set are $F_t^c$ and $F_t^p$ respectively; all features extracted at time $t$ form the laser radar point cloud frame $F_t$, where $F_t = \{F_t^c, F_t^p\}$;
T4, when the attitude change of the vessel exceeds a user-defined threshold, the laser radar frame $F_{t+1}$ is selected as a key frame; the selection rule is that the position change or rotation angle change of the vessel exceeds a set threshold;
T5, the laser radar frames between two key frames are discarded; when a laser radar key frame is selected, scan matching is used to compute the relative transformation between the new key frame and the preceding sub key frames, obtained by transforming the previous $n$ key frames into the global frame and merging them into a voxel map $M_t$; $M_t$ consists of two sub voxel maps corresponding to the two feature types, corner and plane, extracted in the previous step, denoted $M_t^c$ (corner feature voxel map) and $M_t^p$ (plane feature voxel map); the relationship between the feature sets and the voxel map can be expressed as $M_t = \{M_t^c, M_t^p\}$, where $M_t^c = {}'F_t^c \cup {}'F_{t-1}^c \cup \cdots \cup {}'F_{t-n}^c$ and $M_t^p = {}'F_t^p \cup {}'F_{t-1}^p \cup \cdots \cup {}'F_{t-n}^p$, and ${}'F_t^c$ and ${}'F_t^p$ are the corner and plane features transformed into the global frame; the new laser radar key frame $F_{t+1}$ is transformed into the global frame using an initial guess from the IMU, the transformed new key frame being denoted ${}'F_{t+1}$; scan matching between this key frame and the voxel map is performed by nearest-neighbor search, and the resulting relative transformation $\Delta T_{t,t+1}$ is added to the factor graph as the laser radar odometry factor.
5. The multi-sensor fusion unmanned ship synchronous positioning and mapping (SLAM) method of claim 2, characterized in that in the SLAM system the unmanned ship uses absolute measurements from the GPS, BeiDou and inertial navigation integrated module, and the integrated navigation pose information is merged by timestamp into the point cloud frames of the laser radar odometry and incorporated into the factor graph as a GPS factor.
6. The multi-sensor fusion unmanned ship synchronous positioning and mapping (SLAM) method of claim 2, characterized in that loop detection of the loop closure factor in the SLAM system is used to calibrate the laser radar odometry factor and eliminate pose drift accumulated over time, and the loop closure key frame matching comprises the following steps:
L1, input the feature point cloud at the current moment;
L2, search the feature point cloud of the previous moment for the nearest and second-nearest neighbors of each point in the current feature point cloud;
L3, calculate the distance between each point in the current feature point cloud and its neighboring edge line;
L4, stack the distance constraints and iteratively solve for the relative change with the L-M (Levenberg-Marquardt) algorithm;
L5, obtain the iteration count and judge whether it equals the set value while remaining below the maximum; if so, update the current feature point cloud by the relative change and return to step L1, otherwise go to step L6;
L6, judge whether the iteration count equals the maximum and the distances are below the threshold; if not, go to step L4, and if so, go to step L7;
L7, output the solved relative change.
7. The multi-sensor fusion unmanned ship synchronous positioning and mapping (SLAM) method of claim 1, characterized in that the SLAM map is formed by obtaining the accurate pose of each key frame and the pose of the three-dimensional point cloud corresponding to each key frame, transforming the laser three-dimensional point cloud coordinates at each moment from the laser coordinate system into the global map coordinate system, and stitching them to produce the three-dimensional map.
8. The multi-sensor fusion unmanned ship synchronous positioning and mapping (SLAM) method of claim 1, characterized in that after synchronous positioning and map construction, when satellite signals are normal the unmanned ship navigates with the satellite and SLAM combined and performs positioning and map updating; when satellite signals are insufficient and satellite navigation cannot be used, it automatically switches to an inertial navigation mode and computes the unmanned ship's position and attitude information, which serves as the SLAM pose calibration and timestamp synchronization, with navigation carried out through the SLAM system.
9. A multi-sensor fusion unmanned ship synchronous positioning and mapping system according to the method of claim 1, comprising an unmanned ship on which a SLAM system is arranged, the unmanned ship being provided with a solid-state laser radar, a low-beam panoramic mechanical laser radar, a satellite module and an inertial navigation module in signal communication with the SLAM system.
CN202210575283.7A 2022-05-25 2022-05-25 Multi-sensor fused unmanned ship synchronous positioning and mapping method (SLAM) and system Active CN115031718B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210575283.7A CN115031718B (en) 2022-05-25 2022-05-25 Multi-sensor fused unmanned ship synchronous positioning and mapping method (SLAM) and system


Publications (2)

Publication Number Publication Date
CN115031718A 2022-09-09
CN115031718B 2023-10-31

Family

ID=83122050

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210575283.7A Active CN115031718B (en) 2022-05-25 2022-05-25 Multi-sensor fused unmanned ship synchronous positioning and mapping method (SLAM) and system

Country Status (1)

Country Link
CN (1) CN115031718B (en)


Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20180075643A1 (en) * 2015-04-10 2018-03-15 The European Atomic Energy Community (Euratom), Represented By The European Commission Method and device for real-time mapping and localization
CN114080625A (en) * 2020-06-19 2022-02-22 深圳市大疆创新科技有限公司 Absolute pose determination method, electronic equipment and movable platform
CN112862894A (en) * 2021-04-12 2021-05-28 中国科学技术大学 Robot three-dimensional point cloud map construction and expansion method
CN113340295A (en) * 2021-06-16 2021-09-03 广东工业大学 Unmanned ship near-shore real-time positioning and mapping method with multiple ranging sensors
CN113670316A (en) * 2021-08-27 2021-11-19 广州市工贸技师学院(广州市工贸高级技工学校) Path planning method and system based on double radars, storage medium and electronic equipment

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116736327A (en) * 2023-08-10 2023-09-12 长沙智能驾驶研究院有限公司 Positioning data optimization method, device, electronic equipment and readable storage medium
CN116736327B (en) * 2023-08-10 2023-10-24 长沙智能驾驶研究院有限公司 Positioning data optimization method, device, electronic equipment and readable storage medium
CN117215309A (en) * 2023-09-27 2023-12-12 河南工业大学 Positioning and mapping method and positioning and mapping device for unmanned vehicle grain depot cleaning system
CN118424300A (en) * 2024-07-05 2024-08-02 诺力智能装备股份有限公司 SLAM-based positioning method and device

Also Published As

Publication number Publication date
CN115031718B (en) 2023-10-31


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information
Address after: 230031 room 304-1, R & D building, Hefei Institute of technology innovation engineering, Chinese Academy of Sciences, 2666 Xiyou Road, high tech Zone, Hefei, Anhui Province
Applicant after: Anhui Zhongke Heding Intelligent Technology Development Co.,Ltd.
Address before: 230031 room 304-1, R & D building, Hefei Institute of technology innovation engineering, Chinese Academy of Sciences, 2666 Xiyou Road, high tech Zone, Hefei, Anhui Province
Applicant before: Anhui zhongkeheding Technology Development Co.,Ltd.
TA01 Transfer of patent application right
Effective date of registration: 20230809
Address after: Room 1157, Building A, J1, Phase II, Innovation Industrial Park, No. 2800, Chuangxin Avenue, High tech Zone, Hefei, Anhui 230088
Applicant after: Hefei Henghao Intelligent Technology Partnership (L.P.)
Address before: 230031 room 304-1, R & D building, Hefei Institute of technology innovation engineering, Chinese Academy of Sciences, 2666 Xiyou Road, high tech Zone, Hefei, Anhui Province
Applicant before: Anhui Zhongke Heding Intelligent Technology Development Co.,Ltd.
GR01 Patent grant
TR01 Transfer of patent right
Effective date of registration: 20240425
Address after: Room 766-3, 7th Floor, Building A3A4, Zhong'an Chuanggu Science and Technology Park, No. 900 Wangjiang West Road, High tech Zone, Hefei City, Anhui Province, 230088
Patentee after: Zhongyu Future (Hefei) Water Technology Co.,Ltd.
Country or region after: China
Address before: Room 1157, Building A, J1, Phase II, Innovation Industrial Park, No. 2800, Chuangxin Avenue, High tech Zone, Hefei, Anhui 230088
Patentee before: Hefei Henghao Intelligent Technology Partnership (L.P.)
Country or region before: China