
CN118999531B - Mobile robot synchronous positioning and mapping method for underground space - Google Patents


Info

Publication number
CN118999531B
CN118999531B (application CN202411489703.5A)
Authority
CN
China
Prior art keywords
coordinate system
speckle
structured light
measurement
light sensor
Prior art date
Legal status
Active
Application number
CN202411489703.5A
Other languages
Chinese (zh)
Other versions
CN118999531A (en)
Inventor
郑小兵
宋耀东
李健军
袁银麟
丁蕾
夏茂鹏
Current Assignee
Hefei Institutes of Physical Science of CAS
Original Assignee
Hefei Institutes of Physical Science of CAS
Priority date
Filing date
Publication date
Application filed by Hefei Institutes of Physical Science of CAS filed Critical Hefei Institutes of Physical Science of CAS
Priority to CN202411489703.5A
Publication of CN118999531A
Application granted
Publication of CN118999531B
Status: Active
Anticipated expiration


Classifications

    • G01C21/005: Navigation with correlation of navigation data from several sources, e.g. map or contour matching
    • G01C21/165: Dead reckoning by integrating acceleration or speed (inertial navigation), combined with non-inertial navigation instruments
    • G01C21/20: Instruments for performing navigational calculations
    • G01C21/3807: Creation or updating of map data characterised by the type of data
    • G01C21/3841: Creation or updating of map data from two or more sources, e.g. probe vehicles
    • H04W4/02: Services making use of location information

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention discloses a simultaneous localization and mapping (SLAM) method for a mobile robot in an underground space, belonging to the fields of mobile robot SLAM, unmanned systems, surveying, and underground engineering. Point clouds are acquired with a scanning mode that combines axial motion and circumferential motion to construct a three-dimensional point cloud; pre-integration of the circumferential-track IMU provides fine pose measurement, while a UWB base station supplies global corrections that eliminate accumulated error. Binocular speckle structured light enables point cloud capture in the low-illumination underground environment. Coarse inter-frame registration is obtained by fusing the poses from the circumferential-track IMU and the UWB base station, and an ICP iterative algorithm refines the registration, improving point cloud accuracy and yielding a global spatial map together with the robot's position. The invention is suitable for three-dimensional reconstruction and robot localization in long, narrow underground spaces such as karst caves, large utility tunnels, pipe galleries, and mine roadways, serving applications such as unmanned construction operations.

Description

Mobile robot synchronous positioning and mapping method for underground space
Technical Field
The invention belongs to the fields of mobile robot SLAM, unmanned systems, surveying and mapping, and underground engineering, and particularly relates to a simultaneous localization and mapping method for a mobile robot in an underground space, applicable to the intelligent construction and convergence-deformation inspection of pipe galleries and roadways.
Background
Simultaneous localization and mapping (SLAM) is a central problem in robotics. It can be described as follows: a robot equipped with sensors moves freely through space, localizes itself using the acquired sensor information, and incrementally builds a map based on that self-localization, thereby achieving localization and mapping at the same time.
Underground spaces have complex structures and harsh environments, so intelligent robots are needed to operate in place of people, and environment perception is a precondition for such intelligence. Environment perception and SLAM in underground spaces differ from indoor and outdoor environments on the surface: underground spaces are characterized by low illumination, the absence of GPS signals, and narrow geometry.
Existing underground three-dimensional mapping is mainly applied to as-built pipe galleries, to detect whether a gallery has undergone convergence deformation, using instruments such as ground-based three-dimensional laser scanners and total stations. A completed pipe gallery is open and unobstructed, free of heavy dust and water mist; the point clouds obtained by these instruments are sparse and uneven, and such methods cannot meet the needs of three-dimensional reconstruction of pipe galleries and roadways still under construction.
Disclosure of Invention
To solve the above technical problems and achieve mapping of long, narrow underground spaces, the invention provides a simultaneous localization and mapping method for a mobile robot in an underground space. A scanning mode combining axial motion and circumferential motion acquires the point cloud; a three-dimensional point cloud is constructed based on UWB (Ultra-Wideband), an IMU (Inertial Measurement Unit), and binocular speckle structured light; IMU pre-integration provides fine pose measurement, and UWB global correction eliminates accumulated error. Binocular speckle structured light captures point clouds in the low-illumination underground environment; inter-frame coarse registration is obtained by fusing the IMU and UWB poses; iterative optimization with the ICP (Iterative Closest Point) algorithm improves registration accuracy and yields a global spatial map and the robot's position.
Unlike conventional phase-shift or Gray-code encoding, the speckle-structured-light point cloud acquisition device of the invention can acquire a point cloud without multiple exposures. The speckle pattern contains horizontal and vertical structure lines that make it easy to align the left and right camera baselines and speed up speckle matching. The method offers wide applicability, dense point clouds, high three-dimensional mapping accuracy, and high reliability.
In order to achieve the above purpose, the invention adopts the following technical scheme:
a synchronous positioning and mapping method of a mobile robot in an underground space comprises the following steps:
Step 1, a mobile robot synchronous positioning and mapping system of an underground space is arranged in a pipe gallery, 1 UWB base station is arranged at the top of an inlet of the pipe gallery, and a world coordinate system is established;
Step 2, initializing the control and processing module, and aligning the timestamps of the circumferential-track measurement IMU, the globally measuring UWB base station, the first UWB receiver, the second UWB receiver, and the images captured by the speckle structured light sensor;
Step 3, starting the circumferential scanning motion; after completing one full circle, the scanning base advances 1 meter along the track and the circumferential scan starts again, while the point cloud data of the circumferential-track measurement IMU, the second UWB receiver, and the speckle structured light are collected synchronously during the motion;
Step 4, fusing the position and attitude data of the circumferential-track measurement IMU and the second UWB receiver by Kalman filtering, fitting the pose of the speckle structured light sensor in the world coordinate system during motion, and using the fitted pose to coarsely register the point clouds acquired at different times, obtaining a coarsely registered point cloud;
Step 5, performing frame-to-subgraph and subgraph-to-subgraph fine registration and map updating on the coarsely registered point cloud with an ICP iterative algorithm. The ICP algorithm solves for the transformation between a source point set P = {p_i, i = 1, 2, ..., N_p} and a target point set Q = {q_i, i = 1, 2, ..., N_q} that minimizes the distance between the point clouds, where i is the index of a point and N_p, N_q are the numbers of points in P and Q respectively; the rotation matrix R and translation vector t are computed by minimizing an error function E(R, t).
The beneficial effects are that:
1. The invention realizes sensor pose measurement based on the IMU and UWB, solving the problem that traditional GPS navigation has no signal and cannot work in underground spaces.
2. The invention investigates high-precision pose measurement of the end of a moving cantilever based on Kalman filtering. By combining dynamic IMU measurement with static UWB measurement, with UWB providing global pose correction and the IMU providing fine local pose acquisition, multi-sensor data fusion is achieved through multistage Kalman filtering. IMU drift is corrected by the UWB global pose, gaps in the data are filled by fine IMU integration, and a high-precision positioning solution is obtained, overcoming positioning errors caused by vibration during motion and improving measurement accuracy.
3. The scanning mechanism that collects the point cloud is hollow and sits close to the surface being measured, which effectively mitigates occlusion in the underground space.
4. The invention adopts a motion mode combining axial motion and rotary scanning; the resulting point cloud is denser, more uniform, and more refined.
Drawings
FIG. 1 is a schematic diagram of a mobile robot synchronous positioning and mapping system for implementing a mobile robot synchronous positioning and mapping method for an underground space according to the present invention;
FIG. 2 is a schematic diagram of a point cloud acquisition module;
FIG. 3 is a coded structured light pattern;
FIG. 4 is a flow chart of an algorithm of the algorithm module;
Fig. 5 is a three-dimensional point cloud image of a piping lane image obtained according to the present invention.
The device comprises a mobile robot motion guide rail 1, a scanning base 2, a first UWB receiver 3, a second UWB receiver 4, a circumferential track measurement IMU 5, a control and processing module 6, a speckle structure light sensor 7, a pipe gallery 8, a UWB base station 9, a left camera 10, a right camera 11, a laser light source 12, a speckle mask plate 13 and a zoom lens 14.
Detailed Description
The present invention will be described in further detail below with reference to the drawings and embodiments, in order to make its objects, technical solutions, and advantages clearer. It should be understood that the specific embodiments described here are for illustration only and are not intended to limit the scope of the invention. In addition, the technical features of the embodiments described below may be combined with one another as long as they do not conflict.
The mobile robot synchronous positioning and mapping system for realizing the mobile robot synchronous positioning and mapping method of the underground space comprises a data acquisition scanning structure, a pose measuring system, a point cloud acquisition module, a control and processing module and an algorithm module.
The data acquisition scanning structure comprises an axial motion structure and a circumferential motion structure. The axial motion structure may be, without limitation, track-based, wheeled, or quadruped; the circumferential motion track is circular, and the circumferential structure is hollow. In open, long and narrow spaces such as coal mine roadways and pipe galleries, wall scanning can be achieved by erecting a laser scanner in the middle; during construction, however, engineering equipment occupies the space, stacked devices turn the open space into a narrow one, and occlusions block the straight-line propagation of light, interfering with wall scanning.
To scan walls in a construction environment, axial and circumferential scanning systems are designed. Because axial and circumferential motion are independent, and to avoid mutual interference, the measurement proceeds stop-and-go: the platform advances a set distance axially and stops; once the whole system has been stationary for a period, the circumferential scanning motion runs and then rests for several seconds; the next measurement cycle then begins.
As shown in fig. 1, the data acquisition scanning structure comprises a mobile robot motion guide rail 1, a scanning base 2, a first UWB receiver 3 fixed on the scanning base 2, a second UWB receiver 4 and a circumferential-track measurement IMU 5 performing circumferential motion on the scanning base 2, and a UWB base station 9 fixed at the top of the pipe gallery 8. The UWB base station 9 defines the world coordinate system, the scanning base 2 defines the machine coordinate system, and the speckle structured light sensor 7 defines the sensor coordinate system.
The speckle structured light sensor 7 forms the point cloud acquisition module; the acquired data are three-dimensional target positions in the local coordinate system of the speckle structured light sensor 7.
Specifically, the robot motion guide rail 1 is mounted at the bottom of the pipe gallery 8 to support the scanning base 2 and provide it with a track for moving along the axial direction of the pipe gallery 8.
Specifically, the scanning base 2 is a ring mechanism mounted on the robot motion guide rail 1. The first UWB receiver 3, mounted at the bottom of the scanning base 2, receives signals from the UWB base station 9 fixed at the top of the pipe gallery 8 and measures the position and attitude of the scanning base 2 relative to it, that is, the pose of the scanning base 2 in the world coordinate system. The speckle structured light sensor 7, the circumferential-track measurement IMU 5, the control and processing module 6, and the second UWB receiver 4 are mounted on a motion axis on the upper portion of the scanning base 2 and revolve with that axis around the central axis of the scanning base 2.
Specifically, the second UWB receiver 4 is fixedly connected to the speckle structured light sensor 7 and measures the position and attitude of the sensor relative to the UWB base station 9, that is, the pose of the speckle structured light sensor 7 in the world coordinate system.
Specifically, the circumferential-track measurement IMU 5 is fixedly connected to the speckle structured light sensor 7 and the second UWB receiver 4 and moves with the sensor along the circumference. It contains 3 accelerometers and 3 gyroscopes, and the change in position and attitude of the speckle structured light sensor 7 during motion can be computed by integration.
Specifically, the control and processing module 6 is fixedly connected with the circumferential track measurement IMU 5, the speckle structure optical sensor 7 and the second UWB receiver 4, so as to control the movement of the scanning base 2 and process the received data of the circumferential track measurement IMU 5, the speckle structure optical sensor 7 and the second UWB receiver 4 in real time.
Specifically, the pipe gallery 8 is the target of measurement; the coordinates of the spatial points of its inner wall in the world coordinate system are to be obtained.
Specifically, the UWB base station 9 serves as the measurement reference. It is fixed at the top of the pipe gallery wall and defines the world coordinate system, and all point cloud coordinates are ultimately unified into the world coordinate system defined by the UWB base station 9.
The pose measurement system realizes pose measurement and comprises pose measurement of the scanning base 2 and pose measurement of the speckle structure light sensor 7.
The pose measurement of the scanning base 2 is realized by a measuring unit consisting of an ultra wideband UWB base station 9 and a first UWB receiver 3 on the scanning base 2, and the pose of a machine coordinate system (the coordinate system where the scanning base 2 is located) in a world coordinate system is obtained through space coordinate transformation.
The pose measurement of the optical sensor 7 with the speckle structure is realized through the UWB base station 9, the second UWB receiver 4 and the circumference track measurement IMU 5. The circumferential-track measurement IMU 5 comprises a 3-axis accelerometer and a 3-axis gyroscope, and can measure acceleration in three axial directions and angular velocities around the three axes of a sensor coordinate system fixedly connected with the circumferential-track measurement IMU 5 during movement.
The pose during motion is obtained from the global measurement of the UWB base station 9 with the second UWB receiver 4 and from the circumferential-track measurement IMU 5. First, the angular velocity and acceleration from the gyroscopes and accelerometers are filtered to remove noise, and Kalman filtering fuses the angle measured by the gyroscopes with the angle measured by the UWB base station 9 and second UWB receiver 4 to obtain a higher-precision angular position. The filtered acceleration in the sensor coordinate system is then transformed into the world coordinate system, gravity is subtracted in the vertical direction to obtain the motion acceleration along each axis, and the linear velocity of the circumferential motion is obtained by multiplying the angular velocity from the gyroscopes by the radius of the circular path. The predicted sensor position follows from the displacement formula S = S0 + v·t + a·t²/2, where S and S0 are the displacement and initial displacement of the speckle structured light sensor 7 along the three world coordinate axes, v is the initial velocity, a is the average acceleration over time t, and t is the motion time. Meanwhile, the second UWB receiver 4 measures the sensor position during the circumferential motion, and the discrete measurement points are fitted to a curve describing the sensor trajectory. The position of the speckle structured light sensor 7 is thus obtained in two ways, and Kalman filtering fuses the position computed from the displacement formula with the position measured by the second UWB receiver 4 to obtain a more accurate sensor position.
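The dead-reckoning step described above, integrating the gravity-compensated world-frame acceleration with S = S0 + v·t + a·t²/2, can be sketched per axis as follows. This is an illustrative sketch only; the function and variable names are assumptions, and the actual method fuses this prediction with UWB measurements via Kalman filtering.

```python
import numpy as np

def predict_position(s0, v0, a_world, dt):
    """Dead-reckoning update per world axis: s = s0 + v0*dt + 0.5*a*dt^2.

    s0, v0, a_world are 3-vectors in the world frame (gravity already
    removed from a_world); dt is the integration interval in seconds.
    Returns the predicted position and velocity.
    """
    s0 = np.asarray(s0, dtype=float)
    v0 = np.asarray(v0, dtype=float)
    a = np.asarray(a_world, dtype=float)
    s = s0 + v0 * dt + 0.5 * a * dt ** 2   # displacement formula from the text
    v = v0 + a * dt                        # velocity carried to the next step
    return s, v

# Example: start at rest at the origin, constant 0.2 m/s^2 along x for 1 s
s, v = predict_position([0, 0, 0], [0, 0, 0], [0.2, 0, 0], 1.0)
```

In practice this prediction is run at the IMU rate and corrected whenever a UWB fix arrives.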
The point cloud acquisition module uses active binocular speckle structured light and can measure in the low-illumination underground environment. The baseline of the speckle structured light sensor 7 is adjustable to the accuracy required at the working distance of the measured object, meeting the high-precision measurement needs of different scenes. The specially encoded speckle projector has an adjustable focal length, and the projected speckle pattern, as shown in fig. 3, also contains horizontal and vertical coding lines: first, second, and third horizontal coding lines and first, second, and third vertical coding lines, dividing the plane normal to the projection direction into 16 regions. These lines allow the images captured by the left and right cameras to be aligned horizontally and vertically, reducing computation and improving efficiency in the subsequent speckle matching.
Fig. 2 shows the components of the point cloud acquisition module: a left camera 10, a right camera 11, a laser light source 12, a speckle mask 13, and a zoom lens 14. The laser light source 12, speckle mask 13, and zoom lens 14 form the speckle projector, which projects speckle onto the surface of the measured object: the laser light source 12 provides illumination, the speckle mask 13 provides the pattern, and the zoom lens 14 changes the size of the projected pattern on the measured surface to suit objects of different sizes. The left camera 10 and right camera 11 form a binocular imaging system that images the speckle projected on the object. Because the two cameras occupy different spatial positions, the speckle on the object surface images differently in each camera; by comparing the two images, finding the image points of the same object point in the left and right cameras according to the object-image relationship, and computing the parallax, the three-dimensional position of the object point can be calculated. The distance between the two cameras is adjustable to meet the resolution requirements of measured objects at different scales; the arrows in fig. 2 indicate this adjustable baseline.
The control and processing module 6 is connected with the optical sensor 7 of the speckle structure through a serial port, and the control and processing module 6 controls the movement of each moving part and the data acquisition and time stamp alignment of each sensor. Specifically, the control and processing module 6 is an MPU.
As shown in fig. 4, the algorithm module takes the acceleration and angular velocity of the circumferential-track measurement IMU 5 and the positions measured by the first UWB receiver 3, the second UWB receiver 4, and the UWB base station 9, and fuses them into the position and attitude of the speckle structured light sensor 7 in the world coordinate system. Using this pose, the point cloud data measured by the speckle structured light sensor 7 are transformed into the world coordinate system to obtain a point cloud subgraph. As the sensor pose updates, each new single-frame point cloud is registered into the subgraph's point cloud, which is continuously updated and output, finally yielding the global map.
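The coordinate transformation performed by the algorithm module, mapping each measured point from the sensor frame into the world frame using the fused pose, can be sketched as follows (a minimal illustration; the function name and example pose are hypothetical):

```python
import numpy as np

def points_to_world(points_sensor, R_ws, t_ws):
    """Map an N x 3 point cloud from the sensor frame to the world frame.

    Applies p_world = R_ws @ p_sensor + t_ws to every row, where R_ws is
    the fused 3x3 rotation and t_ws the fused translation of the sensor
    in the world coordinate system.
    """
    pts = np.asarray(points_sensor, dtype=float)
    return pts @ np.asarray(R_ws, dtype=float).T + np.asarray(t_ws, dtype=float)

# Example pose: a 90-degree yaw plus a 1 m offset along x
R = np.array([[0, -1, 0], [1, 0, 0], [0, 0, 1]])
world = points_to_world([[1.0, 0.0, 0.0]], R, [1.0, 0.0, 0.0])
```

Each frame transformed this way lands in the common world frame, which is what makes the subsequent frame-to-subgraph registration possible.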
The invention relates to a method for synchronously positioning and mapping a mobile robot in an underground space, which comprises the following steps:
Step 1, arrange the simultaneous localization and mapping system of the mobile robot in the underground space: place the system in the pipe gallery 8, install 1 UWB base station 9 at the top of the entrance of the pipe gallery 8, and establish the world coordinate system.
Step 2, initialize the control and processing module 6 and align the timestamps of the circumferential-track measurement IMU 5, the globally measuring UWB base station 9, the first UWB receiver 3, the second UWB receiver 4, and the images captured by the speckle structured light sensor 7.
Step 3, start the circumferential scanning motion. After scanning one full circle, the scanning base 2 advances 1 meter along the track and the circumferential scan starts again; during the motion, the point cloud data of the circumferential-track measurement IMU 5, the second UWB receiver 4, and the speckle structured light are collected synchronously. The update frequency of the circumferential-track measurement IMU 5 and the second UWB receiver 4 is set to 200 Hz, and that of the speckle structured light to 10 Hz.
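The 200 Hz IMU/UWB streams and the 10 Hz structured-light frames must share a common timeline. One simple way to associate them, sketched below under the assumption of monotonically increasing timestamps, is to match each camera timestamp to the nearest high-rate sample (the function name is illustrative, not from the patent):

```python
import numpy as np

def nearest_timestamps(fast_ts, slow_ts):
    """For each low-rate timestamp, return the index of the nearest
    sample in the sorted high-rate timestamp array."""
    fast = np.asarray(fast_ts)
    slow = np.asarray(slow_ts)
    idx = np.searchsorted(fast, slow)          # insertion points
    idx = np.clip(idx, 1, len(fast) - 1)
    left = fast[idx - 1]
    right = fast[idx]
    idx -= slow - left < right - slow          # step back when left is closer
    return idx

imu_ts = np.arange(0.0, 1.0, 0.005)   # 200 Hz stream
cam_ts = np.arange(0.0, 1.0, 0.1)     # 10 Hz structured-light frames
matches = nearest_timestamps(imu_ts, cam_ts)
```

With a 200 Hz stream the nearest sample is at most 2.5 ms from the frame time, small enough for the coarse pose interpolation described here.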
Step 4, fuse the position and attitude data of the circumferential-track measurement IMU 5 and the second UWB receiver 4 by Kalman filtering, fit the pose of the speckle structured light sensor 7 in the world coordinate system during motion, and use the fitted pose to coarsely register the point clouds acquired at different times, obtaining a coarsely registered point cloud.
Step 5, perform frame-to-subgraph and subgraph-to-subgraph fine registration and map updating on the coarsely registered point cloud with an ICP iterative algorithm. The ICP algorithm minimizes the distance between the point clouds by solving for the transformation between a source point set P = {p_i, i = 1, 2, ..., N_p} and a target point set Q = {q_i, i = 1, 2, ..., N_q}, where i is the index of a point and N_p, N_q are the numbers of points in P and Q respectively. The rotation matrix R and translation vector t are computed by minimizing the error function E(R, t); the smaller its value, the better the registration.
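As an illustration of Step 5, the following sketch implements a minimal ICP loop: brute-force nearest-neighbour correspondences followed by the closed-form (SVD/Kabsch) estimate of R and t. It is a toy version under simplifying assumptions; the actual method's subgraph bookkeeping, outlier rejection, and convergence tests are not modeled, and the function names are illustrative.

```python
import numpy as np

def best_rigid_transform(P, Q):
    """Closed-form least-squares R, t aligning matched point sets P -> Q
    (the alignment step solved inside each ICP iteration)."""
    P = np.asarray(P, dtype=float)
    Q = np.asarray(Q, dtype=float)
    cp, cq = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cp).T @ (Q - cq)                  # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T                         # reflection-safe rotation
    t = cq - R @ cp
    return R, t

def icp(P, Q, iters=10):
    """Toy ICP: re-match by nearest neighbour, re-align, repeat."""
    P = np.asarray(P, dtype=float)
    Q = np.asarray(Q, dtype=float)
    for _ in range(iters):
        d = ((P[:, None, :] - Q[None, :, :]) ** 2).sum(-1)
        matched = Q[d.argmin(axis=1)]          # nearest target for each source point
        R, t = best_rigid_transform(P, matched)
        P = P @ R.T + t                        # apply the update to the source set
    return P

# Demo: recover a small yaw-plus-translation offset between two tiny clouds
Q = np.array([[0., 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1], [1, 1, 0], [1, 0, 1]])
c, s = np.cos(0.05), np.sin(0.05)
Rz = np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])
P = Q @ Rz.T + np.array([0.02, -0.01, 0.03])
aligned = icp(P, Q)
```

Because the coarse registration of Step 4 brings the clouds close together, the nearest-neighbour correspondences are mostly correct from the first iteration, which is what lets ICP converge.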
Specifically, the step 4 includes conversion of a spatial coordinate system, and the method includes:
The coordinate transformation sequence for the wall coordinates of the narrow space is: sensor coordinate system, then machine coordinate system, then world coordinate system. The conversion from the sensor coordinate system to the machine coordinate system is described below.
In the three-dimensional measurement of the pipe gallery there are several coordinate systems, as shown in fig. 1: the coordinate system fixed to the scanning base 2 is the machine coordinate system, the coordinate system fixed to the circumferential-track measurement IMU 5 is the sensor coordinate system, and the coordinate system of the UWB base station 9 is the world coordinate system.
From Euler's theorem on the general motion of a rigid body, the motion of a point in the absolute coordinate system can be decomposed into the vector sum of a translation with a base point and a rotation about that base point, namely:

u_i' = R·u_i + t

where u_i = (x_i, y_i, z_i)^T is the coordinate of the i-th point before transformation, u_i' is its coordinate after transformation, R is the rotation matrix, t is the translation vector, and the superscript T denotes the matrix transpose.
Combining the kinematic characteristics, the rotation matrix R is computed from the rotation about the Z axis (yaw angle α), the rotation about the Y axis (pitch angle β), and the rotation about the X axis (roll angle γ) in the sensor coordinate system. The rotation matrix R can thus be expressed as:

R = R_α R_β R_γ;

where R_α, R_β, R_γ are the rotation matrices about the Z, Y, and X axes, respectively.
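As an illustration of this composition, the following sketch (not part of the patent; function and variable names are assumptions) builds R from the yaw, pitch, and roll angles and applies the rigid transformation of a point set:

```python
import numpy as np

def rotation_matrix(alpha, beta, gamma):
    """Compose R = R_alpha(Z) @ R_beta(Y) @ R_gamma(X) from yaw, pitch, roll in radians."""
    ca, sa = np.cos(alpha), np.sin(alpha)
    cb, sb = np.cos(beta), np.sin(beta)
    cg, sg = np.cos(gamma), np.sin(gamma)
    Rz = np.array([[ca, -sa, 0.0], [sa, ca, 0.0], [0.0, 0.0, 1.0]])  # yaw about Z
    Ry = np.array([[cb, 0.0, sb], [0.0, 1.0, 0.0], [-sb, 0.0, cb]])  # pitch about Y
    Rx = np.array([[1.0, 0.0, 0.0], [0.0, cg, -sg], [0.0, sg, cg]])  # roll about X
    return Rz @ Ry @ Rx

def transform(points, R, t):
    """Apply the rigid-body transform to each row of an (N, 3) point array."""
    return points @ R.T + t
```

The resulting R is orthonormal by construction, so the inverse transform is simply R.T and t' = -R.T @ t.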
The method for fusing the pose data of the circumferential-trajectory measurement IMU 5 and the second UWB receiver 4 by Kalman filtering is as follows:
In the static state, at different rotation angles of the motion axis of the scanning base 2, the position of the speckle structured light sensor 7 in the world coordinate system and the angle value at the current moment are calibrated via the globally measured second UWB receiver 4, yielding the coordinate-angle relation of the positions at different moments in the machine coordinate system; from the converted discrete coordinates, the spatial trajectory equation of the rotation axis at different angles is obtained by curve fitting.
In the motion state, the time-varying output of the circumferential-trajectory measurement IMU 5 in the sensor coordinate system is acquired. The raw signal contains a large amount of random noise, so mean filtering is applied to improve the signal-to-noise ratio, and the filtered angular velocity is integrated to obtain the angle change at each moment.
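The mean filtering of the raw angular-velocity signal can be sketched as a simple moving average (a generic illustration; the window length and function name are assumptions, not taken from the patent):

```python
import numpy as np

def mean_filter(signal, window=5):
    """Moving-average filter: suppresses zero-mean random noise in the raw
    angular-velocity samples before they are integrated into angles."""
    kernel = np.ones(window) / window
    return np.convolve(signal, kernel, mode="same")
```

The filtered angular velocity is then integrated as described in the text, e.g. `np.cumsum(mean_filter(omega)) * dt`.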
The pose of the speckle structured light sensor on the motion axis of the scanning base 2 can be obtained by fitting the integrated angular velocity of the circumferential-trajectory measurement IMU 5 to the data of the globally measured second UWB receiver 4. Both measurements contain errors, and those errors follow a Gaussian distribution; to obtain higher-precision data, the two sources are fused by Kalman filtering, with the integration result of the circumferential-trajectory measurement IMU 5 serving as the prediction and the fitting result of the globally measured UWB base station 9 serving as the state measurement.
The initial angle θ_0 is computed from the components of the gravitational acceleration g in two directions, and the angle is then propagated over time:

θ_0 = arctan(a_y / a_z);
θ_k = θ_{k-1} + ω_{k-1} Δt + W;

where θ_k and θ_{k-1} are the angle values at times k and k-1, ω_k and ω_{k-1} are the angular velocity values at times k and k-1, Δt is the time between times k-1 and k, W is the noise, a_y is the acceleration of the speckle structured light sensor in the y direction, and a_z is its acceleration in the z direction.
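The initial-angle computation and the angle propagation described here can be sketched as follows (an illustration only: the arctangent form for the gravity components and plain forward-Euler integration are assumptions, and the process noise W is omitted):

```python
import numpy as np

def initial_angle(a_y, a_z):
    """Initial tilt angle from the gravity components a_y, a_z in the sensor frame."""
    return np.arctan2(a_y, a_z)

def propagate_angle(theta0, omega, dt):
    """theta_k = theta_{k-1} + omega_{k-1} * dt for a sequence of gyro samples."""
    theta = np.empty(len(omega) + 1)
    theta[0] = theta0
    theta[1:] = theta0 + np.cumsum(omega) * dt  # cumulative forward-Euler sum
    return theta
```

In practice the gyro samples would first pass through the mean filter described above, since integration accumulates any residual bias.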
After the fused attitude of the speckle structured light sensor is obtained from the integrated angular velocity of the circumferential-trajectory measurement IMU 5 and the fit of the globally measured second UWB receiver 4 data, the acceleration of the circumferential-trajectory measurement IMU 5 in the sensor coordinate system must be converted to the world coordinate system. The three-dimensional acceleration signal measured during motion contains the gravitational acceleration, which must be removed before the motion trajectory of the sensor is computed. The trajectory of the speckle structured light sensor 7 is solved from the circumferential-trajectory measurement IMU 5 as follows:

X_k = X_{k-1} + v_x^{k-1} δt + (1/2) a_x^{k-1} δt², and similarly for Y_k and Z_k;
v_x^k = v_x^{k-1} + a_x^{k-1} δt, and similarly for v_y^k and v_z^k;

where X_k, Y_k, Z_k are the positions of the speckle structured light sensor on each coordinate axis at time k in the world coordinate system, X_{k-1}, Y_{k-1}, Z_{k-1} are the positions at time k-1, v_x^k, v_y^k, v_z^k are the velocities on each coordinate axis at time k, v_x^{k-1}, v_y^{k-1}, v_z^{k-1} are the velocities at time k-1, a_x^{k-1}, a_y^{k-1}, a_z^{k-1} are the accelerations on each coordinate axis at time k-1, δt is the data-acquisition interval, a_x^k, ω_y^k, ω_z^k are the x-axis acceleration, y-axis angular velocity, and z-axis angular velocity at time k, and φ is the rotation angle about the x axis.
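The position and velocity recursion for the sensor trajectory amounts to standard dead reckoning; a minimal sketch (names are illustrative, and gravity is assumed to have already been removed from the world-frame accelerations, as the text requires):

```python
import numpy as np

def dead_reckon(p0, v0, acc, dt):
    """Integrate world-frame accelerations into velocity and position:
    v_k = v_{k-1} + a_{k-1} * dt
    X_k = X_{k-1} + v_{k-1} * dt + 0.5 * a_{k-1} * dt**2
    acc is an (N, 3) array of gravity-compensated accelerations."""
    n = len(acc)
    p = np.empty((n + 1, 3))
    v = np.empty((n + 1, 3))
    p[0], v[0] = p0, v0
    for k in range(1, n + 1):
        a = acc[k - 1]
        p[k] = p[k - 1] + v[k - 1] * dt + 0.5 * a * dt**2
        v[k] = v[k - 1] + a * dt
    return p, v
```

Because the acceleration is integrated twice, any uncorrected bias grows quadratically in time, which is why the text fuses this prediction with the UWB measurements.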
Finally, taking the calculation result of the circumferential-trajectory measurement IMU 5 as the prediction and the data measurement of the second UWB receiver 4 as the state measurement, Kalman fusion of the two different data types yields a more accurate position of the cantilever end.
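The predict/update pattern described here can be illustrated with a scalar Kalman filter (a textbook sketch under assumed process and measurement noise variances q and r, not the patent's tuned filter): the IMU-integrated increment drives the predict step, and each UWB fix supplies the measurement update.

```python
import numpy as np

def kalman_fuse(x0, p0, imu_increments, uwb_meas, q, r):
    """Scalar Kalman filter fusing IMU dead reckoning with UWB fixes.
    imu_increments: per-step displacement predicted from the IMU.
    uwb_meas: per-step absolute position measured by the UWB receiver."""
    x, p = x0, p0
    estimates = []
    for u, z in zip(imu_increments, uwb_meas):
        # Predict: propagate the state with the IMU increment, inflate covariance.
        x = x + u
        p = p + q
        # Update: correct toward the UWB measurement with the Kalman gain.
        k = p / (p + r)
        x = x + k * (z - x)
        p = (1.0 - k) * p
        estimates.append(x)
    return np.array(estimates)
```

With a small q (trusted IMU) the filter leans on the prediction between UWB fixes; with a small r it snaps to the UWB measurements.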
Specifically, step 5 includes:
The essence of the ICP iterative algorithm is to find the optimal solution (R, t) of the error function E(R, t), whose formula is:

E(R, t) = (1/n) Σ_{i=1}^{n} ‖q_i − (R p_i + t)‖²;

where p_i and q_i are the i-th point of the source point set P and of the target point set Q, respectively, n is the total number of matched points, and ‖·‖ denotes the norm of a vector, i.e., its length in the vector space.
The ICP iterative algorithm comprises the following steps:
Step 5.1: take a point p_i ∈ P from the source point set P, where i is the index;
Step 5.2: select a point q_i ∈ Q from the target point set Q and find the corresponding point p_i ∈ P in the source point set P;
Step 5.3: compute the rotation matrix R and the translation matrix t that minimize the error function E(R, t);
Step 5.4: apply the rotation matrix R and translation matrix t obtained in step 5.3 to the points p_i, yielding the new source point set P' = { p'_i = R p_i + t, p_i ∈ P }, where p'_i is a transformed point;
Step 5.5: compute the average distance d between the points p'_i of the source point set P' and their corresponding points q_i in the target point set Q, using d = Σ ‖q_i − p'_i‖² / n, where n is the number of matched point pairs;
Step 5.6: set a threshold; stop iterating when the average distance d is smaller than the threshold, otherwise return to step 5.2 until the threshold condition is met.
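Steps 5.1 through 5.6 can be sketched as follows (a compact illustration using brute-force nearest-neighbour matching and the SVD/Kabsch solution for the least-squares (R, t) of step 5.3; suitable only for small point sets and not the patent's production code):

```python
import numpy as np

def best_rigid_transform(P, Q):
    """Least-squares R, t aligning matched (N, 3) point sets P -> Q (Kabsch/SVD)."""
    cp, cq = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cp).T @ (Q - cq)                      # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # avoid reflection
    R = Vt.T @ D @ U.T
    t = cq - R @ cp
    return R, t

def icp(P, Q, max_iter=50, tol=1e-6):
    """Iterative closest point: match, solve, transform, repeat until converged."""
    Pc = P.copy()
    d = np.inf
    for _ in range(max_iter):
        # Step 5.2: brute-force nearest neighbour in Q for every source point.
        d2 = ((Pc[:, None, :] - Q[None, :, :]) ** 2).sum(-1)
        matched = Q[d2.argmin(axis=1)]
        # Steps 5.3-5.4: solve for R, t and transform the source set.
        R, t = best_rigid_transform(Pc, matched)
        Pc = Pc @ R.T + t
        # Steps 5.5-5.6: stop once the mean residual falls below the threshold.
        d = np.sqrt(((Pc - matched) ** 2).sum(-1)).mean()
        if d < tol:
            break
    return Pc, d
```

For real point clouds the O(N²) matching would be replaced by a k-d tree, but the loop structure mirrors the steps above.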
Fig. 5 is the three-dimensional point cloud image of a pipe gallery obtained according to the present invention: a set of three-dimensional points in the world coordinate system. The gray level encodes the distance between a point and the central axis (the darker the gray, the closer the point is to the central axis); blank portions indicate positions where no point cloud was collected in the current view, due to occlusion or similar causes.

Claims (10)

1. A method for synchronous positioning and mapping of a mobile robot in an underground space, characterized by comprising the following steps:
Step 1: place the mobile robot synchronous positioning and mapping system for the underground space in the pipe gallery, arrange one UWB base station at the top of the entrance of the pipe gallery, and establish the world coordinate system;
Step 2: initialize the control and processing module, and align the timestamps of the circumferential-trajectory measurement IMU, the globally measured UWB base station with the first UWB receiver, the second UWB receiver, and the images captured by the speckle structured light sensor;
Step 3: start the circular-motion scan; after scanning one full circle, the scanning base advances 1 meter along the track and the circular-motion scan is started again, synchronously collecting the circumferential-trajectory measurement IMU data, the second UWB receiver data, and the speckle structured light point cloud data during the motion;
Step 4: fuse the position and attitude data of the circumferential-trajectory measurement IMU and the second UWB receiver by Kalman filtering, fit the position and attitude of the speckle structured light sensor in the world coordinate system during the motion, and coarsely register the point clouds collected at different moments before and after the motion according to the fitted position and attitude data, obtaining coarsely registered point clouds;
Step 5: perform fine registration of frames against subgraphs and of subgraphs against the updated map on the coarsely registered point clouds via the ICP iterative algorithm, which minimizes the distance between point clouds by solving the transformation between the source point set P = {p_i, i = 1, 2, ..., N_p} and the target point set Q = {q_i, i = 1, 2, ..., N_q}, where i is the point index, p_i is a point of the source set, q_i is a point of the target set, and N_p, N_q are the maximum numbers of points of the source set P and the target set Q, respectively; the error function E(R, t) is obtained by computing the rotation matrix R and the translation matrix t; the ICP iterative algorithm is the iterative closest point algorithm.
2. The method for synchronous positioning and mapping of a mobile robot in an underground space according to claim 1, wherein in step 1 the mobile robot synchronous positioning and mapping system comprises a data-acquisition scanning structure, a pose measurement system, a point cloud acquisition module, a control and processing module, and an algorithm module.
3. The method according to claim 2, wherein the data-acquisition scanning structure comprises a mobile-robot motion guide rail, a scanning base, a first UWB receiver and a second UWB receiver fixed on the scanning base, a circumferential-trajectory measurement IMU performing circular motion on the scanning base, and a UWB base station fixed at the top of the pipe gallery; the UWB base station at the top of the pipe gallery defines the world coordinate system, and the scanning base defines the machine coordinate system; the robot motion guide rail is installed at the bottom of the pipe gallery to support the scanning base and provide a track along which the scanning base moves in the axial direction of the pipe gallery; the first UWB receiver, installed at the lower bottom of the scanning base, receives the signal of the UWB base station and measures the position and attitude of the scanning base relative to the UWB base station fixed at the top of the pipe gallery, i.e., the attitude and angle of the scanning base in the world coordinate system; the speckle structured light sensor, the circumferential-trajectory measurement IMU, the control and processing module, and the second UWB receiver are mounted on the motion axis at the upper part of the scanning base and, together with that motion axis, perform reciprocating circular motion about the central axis of the scanning base.
4. The method according to claim 3, wherein the speckle structured light sensor constitutes the point cloud acquisition module and defines the sensor coordinate system; the pose measurement of the speckle structured light sensor is realized by the UWB base station, the second UWB receiver, and the circumferential-trajectory measurement IMU; the circumferential-trajectory measurement IMU contains a 3-axis accelerometer and a 3-axis gyroscope, measuring, during motion, the accelerations along the three axes of the sensor coordinate system fixed to the IMU and the angular velocities about those three axes.
5. The method according to claim 4, wherein the position and attitude data during motion are obtained from the global measurement of the UWB base station and the second UWB receiver and from the circular-motion-trajectory measurement IMU; the obtained acceleration is filtered to remove noise, and the Kalman-filtered gyroscope angle is fused with the angle measured by the UWB base station and the second UWB receiver; the filtered acceleration in the sensor coordinate system is then converted by coordinate transformation to the world coordinate system, the gravity value is subtracted in the vertical direction to obtain the acceleration in each direction, the filtered angular velocity is multiplied by the diameter of the circular motion to obtain the linear velocities of the three circular motions, and the predicted sensor position is computed by the displacement formula; the second UWB receiver measures the position of the speckle structured light sensor during the circular motion, and the discrete measurement points are fitted into a curve to obtain the change of the motion trajectory and hence the position of the speckle structured light sensor; finally, the sensor position computed from the displacement formula and the position computed by the second UWB receiver are fused by Kalman filtering to obtain a more accurate position of the speckle structured light sensor.
6. The method according to claim 4, wherein the point cloud acquisition module is an active binocular speckle structure for measurement in underground low-illumination environments.
7. The method according to claim 4, wherein the point cloud acquisition module comprises a left camera, a right camera, a laser light source, a speckle mask, and a zoom lens; the laser light source, the speckle mask, and the zoom lens constitute a speckle projector that projects the speckle onto the surface of the measured object: the laser light source provides illumination, the speckle mask provides the pattern, and the zoom lens changes the size of the mask pattern on the measured surface to adapt to measured surfaces of different sizes; the left and right cameras constitute a binocular photography system that images the speckle projected on the measured object; by comparing the images of the left and right cameras, the different image points of the same object point in the two cameras are found according to the object-image relation, the parallax is computed, and thereby the three-dimensional spatial position of the object point is obtained.
8. The method according to claim 4, wherein the algorithm module performs the acceleration and angular-velocity measurement of the circumferential-trajectory measurement IMU and the position measurement of the first UWB receiver, the second UWB receiver, and the UWB base station to obtain the fused position and attitude of the speckle structured light sensor in the world coordinate system; the point cloud data measured in the speckle structured light sensor coordinate system are coordinate-transformed with this position and attitude so that their spatial positions are converted to the world coordinate system, yielding a point cloud subgraph; then, with the updated position of the speckle structured light sensor, the output is continuously updated and each new single-frame point cloud is registered to the point cloud of the subgraph, finally producing the global map.
9. The method according to claim 1, wherein step 4 comprises the conversion of spatial coordinate systems in the order: sensor coordinate system to machine coordinate system, then to world coordinate system; the Kalman-filter fusion of the position and attitude data of the circumferential-trajectory measurement IMU and the second UWB receiver is performed as follows:
in the static state, at different rotation angles of the motion axis of the scanning base, the position of the speckle structured light sensor in the world coordinate system and the angle value at the current moment are calibrated via the globally measured second UWB receiver, yielding the coordinate-angle relation of the positions at different moments in the machine coordinate system; from the converted discrete coordinates, the spatial trajectory equation of the rotation axis at different angles is obtained by curve fitting;
in the motion state, the time-varying output of the circumferential-trajectory measurement IMU in the sensor coordinate system is acquired, the raw signal is mean-filtered, and the filtered angular velocity is integrated to obtain the angle change at each moment;
the attitude of the speckle structured light sensor on the motion axis of the scanning base is obtained by fitting the integrated angular velocity of the circumferential-trajectory measurement IMU to the data of the globally measured second UWB receiver; the integration result of the circumferential-trajectory measurement IMU serves as the prediction, and the fitting result of the globally measured UWB base station serves as the state measurement;
the initial angle θ_0 is computed from the components of the gravitational acceleration g in two directions, and the angle is propagated over time:
θ_0 = arctan(a_y / a_z); θ_k = θ_{k-1} + ω_{k-1} Δt + W;
where θ_k and θ_{k-1} are the angle values at times k and k-1, ω_k and ω_{k-1} are the angular velocity values at times k and k-1, Δt is the time between times k-1 and k, W is the noise, a_y is the acceleration of the speckle structured light sensor in the y direction of the sensor coordinate system, and a_z is its acceleration in the z direction of the sensor coordinate system;
after the fused attitude of the speckle structured light is obtained from the integrated angular velocity of the circumferential-trajectory measurement IMU and the fit of the globally measured second UWB receiver data, the acceleration of the circumferential-trajectory measurement IMU in the sensor coordinate system is converted to the world coordinate system; the sensor trajectory of the speckle structured light sensor is solved from the circumferential-trajectory measurement IMU as follows:
X_k = X_{k-1} + v_x^{k-1} δt + (1/2) a_x^{k-1} δt², and similarly for Y_k and Z_k;
v_x^k = v_x^{k-1} + a_x^{k-1} δt, and similarly for v_y^k and v_z^k;
where X_k, Y_k, Z_k are the positions of the speckle structured light sensor on each coordinate axis at time k in the world coordinate system, X_{k-1}, Y_{k-1}, Z_{k-1} are the positions at time k-1, v_x^k, v_y^k, v_z^k are the velocities on each coordinate axis at time k, v_x^{k-1}, v_y^{k-1}, v_z^{k-1} are the velocities at time k-1, a_x^{k-1}, a_y^{k-1}, a_z^{k-1} are the accelerations on each coordinate axis at time k-1, δt is the data-acquisition interval, a_x^k, ω_y^k, ω_z^k are the x-axis acceleration, y-axis angular velocity, and z-axis angular velocity at time k, and φ is the rotation angle about the x axis;
finally, taking the calculation result of the circumferential-trajectory measurement IMU as the prediction and the data measurement of the globally measured second UWB receiver as the state measurement, the prediction and the state measurement are fused by Kalman filtering to obtain a more accurate position of the cantilever end.
10. The method according to claim 1, wherein step 5 comprises:
the ICP iterative algorithm finds the optimal solution (R, t) of the error function E(R, t), whose formula is:
E(R, t) = (1/n) Σ_{i=1}^{n} ‖q_i − (R p_i + t)‖²;
where p_i and q_i are the i-th point of the source point set P and of the target point set Q, respectively, n is the total number of matched points, and ‖·‖ denotes the norm of a vector, i.e., its length in the vector space;
the ICP iterative algorithm comprises the following steps:
step 5.1: take a point p_i ∈ P from the source point set P, where i is the index;
step 5.2: select a point q_i ∈ Q from the target point set Q and find the corresponding point p_i ∈ P in the source point set P;
step 5.3: compute the rotation matrix R and the translation matrix t that minimize the error function E(R, t);
step 5.4: apply the rotation matrix R and translation matrix t obtained in step 5.3 to the points p_i, yielding the new source point set P' = { p'_i = R p_i + t, p_i ∈ P }, where p'_i is a transformed point;
step 5.5: compute the average distance d between the points p'_i of the source point set P' and their corresponding points q_i in the target point set Q, using d = Σ ‖q_i − p'_i‖² / n, where n is the number of matched point pairs;
step 5.6: set a threshold; stop iterating when the average distance d is smaller than the threshold, otherwise return to step 5.2 until the threshold condition is met.
CN202411489703.5A 2024-10-24 2024-10-24 Mobile robot synchronous positioning and mapping method for underground space Active CN118999531B (en)


Publications (2)

Publication Number Publication Date
CN118999531A CN118999531A (en) 2024-11-22
CN118999531B true CN118999531B (en) 2024-12-27





Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant