CN118999531B - Mobile robot synchronous positioning and mapping method for underground space - Google Patents
- Publication number
- CN118999531B (application number CN202411489703.5A)
- Authority
- CN
- China
- Prior art keywords
- coordinate system
- speckle
- structured light
- measurement
- light sensor
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/005—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 with correlation of navigation data from several sources, e.g. map or contour matching
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/10—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
- G01C21/12—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
- G01C21/16—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
- G01C21/165—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/20—Instruments for performing navigational calculations
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/38—Electronic maps specially adapted for navigation; Updating thereof
- G01C21/3804—Creation or updating of map data
- G01C21/3807—Creation or updating of map data characterised by the type of data
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/38—Electronic maps specially adapted for navigation; Updating thereof
- G01C21/3804—Creation or updating of map data
- G01C21/3833—Creation or updating of map data characterised by the source of data
- G01C21/3841—Data obtained from two or more sources, e.g. probe vehicles
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W4/00—Services specially adapted for wireless communication networks; Facilities therefor
- H04W4/02—Services making use of location information
Landscapes
- Engineering & Computer Science (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Automation & Control Theory (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Computer Networks & Wireless Communication (AREA)
- Signal Processing (AREA)
- Length Measuring Devices By Optical Means (AREA)
Abstract
The invention discloses a simultaneous localization and mapping method for a mobile robot in an underground space, belonging to the fields of mobile robot SLAM, unmanned systems, surveying and underground engineering. A scanning mode combining axial motion and circumferential motion is adopted to acquire point clouds and construct a three-dimensional point cloud; IMU pre-integration along the circumferential track provides fine pose measurement, while a UWB base station provides global correction to eliminate accumulated error. Binocular speckle structured light enables point cloud capture in the low-illumination underground environment. Coarse inter-frame point cloud registration is obtained by fusing the poses from the circumferential-track measurement IMU and the UWB base station, and iterative optimization with an ICP iterative algorithm improves registration precision, yielding a global spatial map and robot localization. The invention is suited to three-dimensional reconstruction and robot localization in long, narrow underground spaces such as karst caves, large utility tunnels, pipe galleries and mine roadways, serving application scenarios such as unmanned construction operations.
Description
Technical Field
The invention belongs to the fields of mobile robot SLAM, unmanned systems, surveying and mapping, and underground engineering, and particularly relates to a simultaneous localization and mapping method for a mobile robot in an underground space, applicable to intelligent construction and convergence-deformation inspection in pipe gallery, roadway and utility tunnel environments.
Background
Simultaneous localization and mapping (SLAM) is an important problem in the robotics field. The SLAM problem can be described as follows: a robot equipped with sensors moves freely in space, localizes its own position using the acquired sensor information, and incrementally builds a map based on that self-localization, thereby achieving simultaneous localization and mapping.
The underground space has a complex structure and a severe environment, so intelligent robots are required to operate in place of people, and environment perception is a precondition for such intelligence. Environment perception and SLAM in underground space differ from indoor and outdoor environments on the surface: underground space is characterized by low illumination, absence of GPS signal, and narrow geometry.
Existing underground three-dimensional mapping is mainly applied to as-built underground pipe galleries, to detect whether a gallery has undergone convergence deformation; the means employed are ground-based three-dimensional laser scanners, total stations and the like. A completed pipe gallery is clear and unobstructed, free of heavy dust and water mist, yet the point clouds obtained are sparse and uneven and cannot meet the needs of three-dimensional reconstruction of pipe galleries, roadways and utility tunnels under construction.
Disclosure of Invention
To solve the above technical problems and achieve mapping of long, narrow underground spaces, the invention provides a method for simultaneous localization and mapping of a mobile robot in an underground space. It adopts a scanning mode combining axial motion and circumferential motion for point cloud acquisition, and constructs a three-dimensional point cloud based on UWB (Ultra Wide Band), IMU (Inertial Measurement Unit) and binocular speckle structured light; IMU pre-integration provides fine pose measurement, and UWB global correction eliminates accumulated error. Binocular speckle structured light enables point cloud capture in the low-illumination underground environment; the fused pose of the IMU and UWB provides coarse inter-frame point cloud registration; and iterative optimization with the ICP (Iterative Closest Point) algorithm improves registration precision, yielding a global spatial map and robot localization.
Unlike the traditional phase-shift or Gray-code encoding approaches, the speckle structured light point cloud acquisition device provided by the invention can capture a point cloud without photographing through multiple exposures. The speckle structured light contains horizontal and vertical structural lines, which simplify aligning the left and right camera baselines and speed up speckle matching. The method features wide applicability, dense point clouds, high three-dimensional mapping precision and high reliability.
In order to achieve the above purpose, the invention adopts the following technical scheme:
a synchronous positioning and mapping method of a mobile robot in an underground space comprises the following steps:
Step 1, a mobile robot synchronous positioning and mapping system of an underground space is arranged in a pipe gallery, 1 UWB base station is arranged at the top of an inlet of the pipe gallery, and a world coordinate system is established;
Step 2, initializing the control and processing module, and aligning the time stamps of the circumferential-track measurement IMU, the globally measuring UWB base station, the first UWB receiver, the second UWB receiver, and the images captured by the speckle structured light sensor;
Step 3, starting circumferential motion scanning; after scanning one full circle, the scanning base advances 1 meter along the track and circumferential scanning starts again; during the motion, the data of the circumferential-track measurement IMU, the second UWB receiver and the speckle structured light point cloud are collected synchronously;
Step 4, carrying out Kalman filtering fusion on position and posture data of the circumferential track measurement IMU and the second UWB receiver, fitting to obtain the position and posture of the speckle structure optical sensor under the world coordinate system in the motion process, and carrying out rough registration on point clouds acquired at different moments before and after the motion according to the fitted position and posture to obtain rough registered point clouds;
Step 5, performing fine registration of frames against subgraphs, fine registration between subgraphs, and map updating on the coarsely registered point cloud via an ICP iterative algorithm. The ICP iterative algorithm minimizes the distance between point clouds by solving for the transformation between a source point set P = {p_i, i = 1, 2, ..., N_p} and a target point set Q = {q_i, i = 1, 2, ..., N_q}, where i is the index of a point and N_p, N_q are the numbers of points in P and Q respectively; the rotation matrix R and translation matrix t are computed to obtain the error function E(R, t).
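As an illustration of step 5, one point-to-point ICP iteration can be sketched as below. This is a minimal numpy sketch, not the patent's implementation; the function name and the brute-force nearest-neighbour search are our assumptions for clarity.

```python
import numpy as np

def icp_step(P, Q):
    """One point-to-point ICP iteration: match each source point to its
    nearest target point, then solve for the rigid transform (R, t) that
    minimizes the summed squared distances in closed form via SVD."""
    # Nearest-neighbour correspondences (brute force for clarity).
    d = np.linalg.norm(P[:, None, :] - Q[None, :, :], axis=2)
    matched = Q[np.argmin(d, axis=1)]

    # Closed-form rigid alignment of P onto its matched points.
    mu_p, mu_q = P.mean(axis=0), matched.mean(axis=0)
    H = (P - mu_p).T @ (matched - mu_q)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:          # guard against a reflection solution
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = mu_q - R @ mu_p
    E = np.sum((matched - (P @ R.T + t)) ** 2)   # error function E(R, t)
    return R, t, E
```

In practice the step is repeated — re-matching and re-solving — until E(R, t) stops decreasing, which is the iterative optimization referred to in the text.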
The beneficial effects are that:
1. The invention realizes sensor pose measurement based on the IMU and UWB, addressing the problem that traditional GPS navigation has no signal and cannot work in underground space.
2. The invention investigates high-precision pose measurement of the end of the moving cantilever based on Kalman filtering. By combining IMU dynamic measurement with UWB static measurement — UWB global pose correction plus fine local pose acquisition by the IMU — multi-sensor data fusion is realized with multistage Kalman filtering. IMU drift is corrected by the global pose from the UWB base station, missing data are supplemented by fine IMU integration, and a high-precision positioning solution is achieved, overcoming positioning errors caused by vibration and the like during motion and improving measurement precision.
3. The scanning mechanism for collecting the point cloud is a hollow mechanism, and the scanning mechanism is close to the surface to be measured, so that the problem of shielding of an underground space can be effectively solved.
4. The invention adopts the motion mode of combining axial motion and rotary scanning, the obtained point cloud is more dense and uniform, and the degree of refinement is higher.
Drawings
FIG. 1 is a schematic diagram of a mobile robot synchronous positioning and mapping system for implementing a mobile robot synchronous positioning and mapping method for an underground space according to the present invention;
FIG. 2 is a schematic diagram of a point cloud acquisition module;
FIG. 3 is a coded structured light pattern;
FIG. 4 is a flow chart of an algorithm of the algorithm module;
Fig. 5 is a three-dimensional point cloud image of a piping lane image obtained according to the present invention.
The device comprises a mobile robot motion guide rail 1, a scanning base 2, a first UWB receiver 3, a second UWB receiver 4, a circumferential track measurement IMU 5, a control and processing module 6, a speckle structure light sensor 7, a pipe gallery 8, a UWB base station 9, a left camera 10, a right camera 11, a laser light source 12, a speckle mask plate 13 and a zoom lens 14.
Detailed Description
The present invention will be described in further detail with reference to the drawings and examples, in order to make the objects, technical solutions and advantages of the present invention more apparent. It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the scope of the invention. In addition, the technical features of the embodiments of the present invention described below may be combined with each other as long as they do not conflict with each other.
The mobile robot synchronous positioning and mapping system for realizing the mobile robot synchronous positioning and mapping method of the underground space comprises a data acquisition scanning structure, a pose measuring system, a point cloud acquisition module, a control and processing module and an algorithm module.
The data acquisition scanning structure comprises an axial motion structure and a circumferential motion structure. The axial motion structure includes, but is not limited to, track, wheeled and quadruped types; the circumferential motion track is circular, and the circumferential motion structure is hollow. In an open, long and narrow space such as a coal mine roadway or pipe gallery, wall scanning can be achieved by erecting a laser scanner in the middle; during construction operations, however, engineering equipment occupies the space, stacked devices turn the open space into a narrow one, and occlusion blocks the straight-line propagation of light, hindering wall acquisition and scanning.
To scan walls in a construction operating environment, an axial scanning system and a circumferential scanning system are designed. Because the axial and circumferential motions are independent, and to avoid interference between them, the measurement proceeds stop-and-go: after the axial motion covers a set distance, motion stops; once the whole assembly has been stationary for a period, the circumferential scanning motion starts; after it completes and the system has again been stationary for several seconds, the next measurement cycle begins.
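The stop-and-go cycle described above can be sketched as a simple schedule. This is illustrative only; the callables `advance`, `rotate` and `dwell` are hypothetical placeholders for the real motion and timing commands.

```python
def scan_cycle(n_cycles, advance, rotate, dwell, settle_s=2.0):
    """Stop-and-go schedule: advance axially, settle, run one full
    circumferential scan, settle again, then repeat. The axial and
    circumferential motions never overlap, so they cannot interfere."""
    log = []
    for k in range(n_cycles):
        advance()
        log.append(("advance", k))
        dwell(settle_s)                 # let vibrations die down
        log.append(("settle", k))
        rotate()                        # one full circumferential scan
        log.append(("scan", k))
        dwell(settle_s)
        log.append(("settle", k))
    return log
```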
As shown in fig. 1, the data acquisition scanning structure comprises a mobile robot motion guide rail 1, a scanning base 2, a first UWB receiver 3 fixed on the scanning base 2, a second UWB receiver 4, a circumferential-track measurement IMU 5 performing circular motion on the scanning base 2, and a UWB base station 9 fixed at the top of the pipe gallery 8. The UWB base station 9 at the top of the pipe gallery 8 defines the world coordinate system, the scanning base 2 defines the machine coordinate system, and the speckle structured light sensor 7 defines the sensor coordinate system.
The speckle structure light sensor 7 forms a point cloud acquisition module, and acquired data is a three-dimensional position of a target under a local coordinate system of the speckle structure light sensor 7.
Specifically, the robot motion guide rail 1 is mounted at the bottom of the pipe gallery 8 to support the scanning base 2 and provide it a track for moving along the axial direction of the pipe gallery 8.
Specifically, the scanning base 2 is a ring mechanism mounted on the robot motion guide rail 1. A first UWB receiver 3 is mounted at its lower bottom to receive signals from the UWB base station 9 fixed at the top of the pipe gallery 8 and measure the position and posture of the scanning base 2 relative to it, that is, the pose and angle of the scanning base 2 in the world coordinate system. A speckle structured light sensor 7, a circumferential-track measurement IMU 5, a control and processing module 6 and a second UWB receiver 4 are mounted on a motion axis at the upper portion of the scanning base 2, and revolve with that axis around the central axis of the scanning base 2.
Specifically, the second UWB receiver 4 is fixedly connected to the speckle structured light sensor 7 and measures its position and posture relative to the UWB base station 9, that is, the pose and angle of the speckle structured light sensor 7 in the world coordinate system.
Specifically, the circumferential-track measurement IMU 5 is fixedly connected with the speckle structured light sensor 7 and the second UWB receiver 4, and moves with the speckle structured light sensor 7 along the circumference. It contains 3 accelerometers and 3 gyroscopes, so the changes in position and posture of the speckle structured light sensor 7 during motion can be computed by integration.
Specifically, the control and processing module 6 is fixedly connected with the circumferential track measurement IMU 5, the speckle structure optical sensor 7 and the second UWB receiver 4, so as to control the movement of the scanning base 2 and process the received data of the circumferential track measurement IMU 5, the speckle structure optical sensor 7 and the second UWB receiver 4 in real time.
Specifically, the pipe gallery 8 is the target of measurement; the coordinates of the spatial points on its inner wall in the world coordinate system are to be obtained.
Specifically, the UWB base station 9 serves as the measurement reference, fixed at the top of the pipe gallery wall, and defines the world coordinate system; all point cloud coordinates are finally unified into the world coordinate system defined by the UWB base station 9.
The pose measurement system realizes pose measurement and comprises pose measurement of the scanning base 2 and pose measurement of the speckle structure light sensor 7.
The pose measurement of the scanning base 2 is realized by a measuring unit consisting of an ultra wideband UWB base station 9 and a first UWB receiver 3 on the scanning base 2, and the pose of a machine coordinate system (the coordinate system where the scanning base 2 is located) in a world coordinate system is obtained through space coordinate transformation.
The pose measurement of the optical sensor 7 with the speckle structure is realized through the UWB base station 9, the second UWB receiver 4 and the circumference track measurement IMU 5. The circumferential-track measurement IMU 5 comprises a 3-axis accelerometer and a 3-axis gyroscope, and can measure acceleration in three axial directions and angular velocities around the three axes of a sensor coordinate system fixedly connected with the circumferential-track measurement IMU 5 during movement.
The pose during motion is obtained from the global measurement of the UWB base station 9 with the second UWB receiver 4 and from the circumferential-track measurement IMU 5. First, the angular velocity and acceleration obtained from the gyroscope are filtered to remove noise, and Kalman filtering fuses the angle measured by the gyroscope with the angle measured by the UWB base station 9 and second UWB receiver 4, giving a more accurate angular position. The filtered acceleration values in the sensor coordinate system are then transformed into the world coordinate system, and the gravity value is subtracted in the vertical direction to obtain the motion acceleration along each axis. The linear velocity of the circumferential motion is obtained by multiplying the angular velocity from the gyroscope by the circular-motion diameter, and the predicted sensor position follows from the displacement formula S = S0 + v·t + a·t²/2, where S is the displacement of the speckle structured light sensor 7 along the three world axes, S0 the initial displacement, v the initial velocity, a the average acceleration over time t, and t the motion time. The second UWB receiver 4 measures the sensor position during the circumferential motion, and the discrete measurement points are fitted into a curve describing the sensor's trajectory. The position of the speckle structured light sensor 7 is thus obtained in two ways, and Kalman filtering fuses the position computed from the displacement formula with the position measured by the second UWB receiver 4 to yield a more accurate sensor position.
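A minimal per-axis sketch of this prediction/measurement fusion: the IMU dead-reckoned position serves as the prediction and the UWB position as the measurement. This is a simple scalar Kalman filter, not the patent's multistage implementation; the constant noise variances `q` and `r` are illustrative assumptions.

```python
import numpy as np

def kalman_fuse(imu_pos, uwb_pos, q=1e-3, r=1e-2):
    """Fuse an IMU dead-reckoned position track (prediction, drifts over
    time) with UWB position fixes (noisy but drift-free), per axis.
    q: process-noise variance per IMU step, r: UWB measurement variance."""
    x, p = uwb_pos[0], r          # initialise from the first UWB fix
    fused = [x]
    for k in range(1, len(imu_pos)):
        # Predict: move the state by the IMU-measured increment.
        x += imu_pos[k] - imu_pos[k - 1]
        p += q
        # Update: correct the prediction with the UWB measurement.
        K = p / (p + r)           # Kalman gain
        x += K * (uwb_pos[k] - x)
        p *= (1 - K)
        fused.append(x)
    return np.array(fused)
```

With a small `r` the filter trusts UWB and removes the IMU drift; with a large `r` it follows the smooth IMU track between UWB fixes.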
The point cloud acquisition module is an active binocular speckle structured light system that can measure in the low-illumination underground environment. The baseline distance of the speckle structured light sensor 7 can be adjusted to the accuracy required at the working distance of the measured object's center, meeting high-precision measurement needs in different scenes. A feature of the specially encoded speckle structured light sensor 7 is that the focal length of the speckle projection is adjustable. As shown in fig. 3, the projected speckle pattern also contains horizontally and vertically encoded lines — a first, second and third horizontal code line and a first, second and third vertical code line — dividing the normal plane of the projection direction into 16 regions of different sizes. These lines allow the left and right cameras to align their images horizontally or vertically at capture time, reducing the amount of computation in subsequent speckle pattern matching and improving matching efficiency.
Fig. 2 shows the components of the point cloud acquisition module: a left camera 10, a right camera 11, a laser light source 12, a speckle mask 13 and a zoom lens 14. The laser light source 12, speckle mask 13 and zoom lens 14 form a speckle projector that casts speckle onto the surface of the measured object: the laser light source 12 provides illumination, the speckle mask 13 provides the pattern, and the zoom lens 14 changes the size of the mask pattern on the measured surface, adapting to measured objects of different sizes. The left camera 10 and right camera 11 form a binocular imaging system that images the speckle projected on the object. Because the cameras occupy different spatial positions, the speckle on the object surface images differently in the two cameras; by comparing the two images, finding the image points of the same object point in the left and right cameras according to the object-image relationship, and computing the parallax, the three-dimensional spatial position of the object point can be calculated. The distance between the left and right cameras is adjustable to meet the measurement resolution requirements of objects at different scales; the arrows in fig. 2 indicate this adjustable baseline.
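For a rectified stereo pair, the parallax-to-depth step follows the standard triangulation relation Z = f·B/d. This is a generic sketch; the focal length and baseline values used in the example are assumptions, not the device's calibration.

```python
def stereo_depth(disparity_px, baseline_m, focal_px):
    """Depth of a matched speckle point from a rectified stereo pair:
    Z = f * B / d, where d is the horizontal disparity in pixels,
    B the camera baseline in meters and f the focal length in pixels.
    Rectification (aided here by the horizontal/vertical code lines)
    reduces matching to a 1-D search along the same image row."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return focal_px * baseline_m / disparity_px
```

Note that widening the baseline B increases the disparity at a given depth, which is why the adjustable camera spacing improves depth resolution for larger working distances.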
The control and processing module 6 is connected with the optical sensor 7 of the speckle structure through a serial port, and the control and processing module 6 controls the movement of each moving part and the data acquisition and time stamp alignment of each sensor. Specifically, the control and processing module 6 is an MPU.
As shown in fig. 4, the algorithm module takes the acceleration and angular velocity from the circumferential-track measurement IMU 5 and the positions from the first UWB receiver 3, the second UWB receiver 4 and the UWB base station 9, and fuses them into the position and posture of the speckle structured light sensor 7 in the world coordinate system. Using this pose, the point cloud data measured by the speckle structured light sensor 7 is coordinate-transformed into the world coordinate system, yielding a point cloud sub-graph. As the pose of the speckle structured light sensor 7 is continuously updated, each new single-frame point cloud is registered into the sub-graph's point cloud data, and a global map is finally obtained.
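The coordinate transformation at the heart of this pipeline can be sketched as below: sensor-frame points are mapped into the world frame by the fused pose and appended to the growing map. A minimal numpy sketch; function names are ours.

```python
import numpy as np

def to_world(points_sensor, R_ws, t_ws):
    """Transform sensor-frame points into the world frame defined by the
    UWB base station: p_w = R_ws @ p_s + t_ws, where (R_ws, t_ws) is the
    fused pose of the speckle structured light sensor."""
    return points_sensor @ R_ws.T + t_ws

def update_map(global_map, new_frame, pose):
    """Append a newly acquired frame, transformed by its pose, to the
    accumulated sub-map / global map (both (N, 3) arrays)."""
    R_ws, t_ws = pose
    return np.vstack([global_map, to_world(new_frame, R_ws, t_ws)])
```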
The invention relates to a method for synchronously positioning and mapping a mobile robot in an underground space, which comprises the following steps:
Step 1, arrange the positions of the mobile robot simultaneous localization and mapping system for the underground space: place the system in the pipe gallery 8, arrange 1 UWB base station 9 at the top of the entrance of the pipe gallery 8, and establish the world coordinate system.
Step 2, initializing the control and processing module 6, and aligning the circumference track measurement IMU 5 and the globally measured UWB base station 9 with the time stamps of the photographed images of the first UWB receiver 3, the second UWB receiver 4 and the speckle structure light sensor 7.
Step 3, start the circumferential motion scan; after one full circle of scanning, the scanning base 2 advances 1 meter along the track and circumferential scanning starts again. During the motion, the data of the circumferential-track measurement IMU 5, the second UWB receiver 4 and the speckle structured light point cloud are collected synchronously, with the update frequency of the circumferential-track measurement IMU 5 and the second UWB receiver 4 set to 200 Hz and that of the speckle structured light set to 10 Hz.
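Because the IMU/UWB run at 200 Hz while the structured light runs at 10 Hz, each frame must be paired with the nearest pose sample by timestamp. A minimal nearest-timestamp association sketch, assuming sorted timestamp arrays in seconds; this is illustrative, not the patent's synchronization code.

```python
import numpy as np

def nearest_sample(sensor_ts, frame_ts):
    """For each 10 Hz structured-light frame timestamp, return the index
    of the nearest 200 Hz IMU/UWB sample (both arrays sorted, in seconds)."""
    idx = np.searchsorted(sensor_ts, frame_ts)
    idx = np.clip(idx, 1, len(sensor_ts) - 1)
    left_closer = (frame_ts - sensor_ts[idx - 1]) < (sensor_ts[idx] - frame_ts)
    return np.where(left_closer, idx - 1, idx)
```

For higher accuracy the two neighbouring pose samples could instead be interpolated, but nearest-sample association already bounds the timing error to half the 5 ms IMU period.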
And 4, carrying out Kalman filtering fusion on position and posture data of the circumferential track measurement IMU 5 and the second UWB receiver 4, fitting to obtain the position and posture of the speckle structure optical sensor 7 under a world coordinate system in the motion process, and carrying out rough registration on point clouds acquired at different moments before and after the motion according to the fitted position and posture to obtain rough registered point clouds.
Step 5, perform fine registration of frames against subgraphs, fine registration between subgraphs, and map updating on the coarsely registered point cloud via the ICP iterative algorithm. The ICP iterative algorithm minimizes the distance between point clouds by solving for the transformation between a source point set P = {p_i, i = 1, 2, ..., Np} and a target point set Q = {q_i, i = 1, 2, ..., Nq}, where i is the point index and Np, Nq are the numbers of points in P and Q respectively; the rotation matrix R and translation matrix t are computed to obtain the error function E(R, t), whose smaller value is better.
Specifically, the step 4 includes conversion of a spatial coordinate system, and the method includes:
The coordinate conversion sequence for the wall coordinates of the narrow space is: sensor coordinate system → machine coordinate system → world coordinate system. The conversion from the sensor coordinate system to the machine coordinate system is described below.
In the piping lane three-dimensional measurement, there are a plurality of coordinate systems, as shown in fig. 1, the coordinate system fixedly connected to the scanning base 2 is a machine coordinate system, the coordinate system fixedly connected to the circumferential track measurement IMU 5 is a sensor coordinate system, and the coordinate system in which the UWB base station 9 is located is a world coordinate system.
From Euler's theorem on the general motion of a rigid body, the motion of a point in an absolute coordinate system can be decomposed into the vector sum of a translation with a base point and a rotation about that base point, i.e.:

u_i' = R·u_i + t;

where u_i = (x_i, y_i, z_i)^T is the coordinate vector of the i-th point, the superscript T denotes the matrix transpose, u_i is the position before the transformation, and u_i' is the position after it.
Combining the kinematic characteristics, the rotation matrix R is computed by considering, in the sensor coordinate system, first the rotation about the Z axis (yaw angle α), then about the Y axis (pitch angle β), then about the X axis (roll angle γ). The expression for the rotation matrix R is:

R = R_α·R_β·R_γ;

where R_α, R_β, R_γ are the rotation matrices about the Z, Y and X axes, respectively.
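The Z-Y-X composition above can be written out directly (a numpy sketch; angles in radians, function name ours):

```python
import numpy as np

def rotation_zyx(alpha, beta, gamma):
    """Compose R = R_alpha(Z) @ R_beta(Y) @ R_gamma(X) from yaw alpha,
    pitch beta and roll gamma, matching the Z-Y-X order in the text."""
    ca, sa = np.cos(alpha), np.sin(alpha)
    cb, sb = np.cos(beta), np.sin(beta)
    cg, sg = np.cos(gamma), np.sin(gamma)
    Rz = np.array([[ca, -sa, 0], [sa, ca, 0], [0, 0, 1]])
    Ry = np.array([[cb, 0, sb], [0, 1, 0], [-sb, 0, cb]])
    Rx = np.array([[1, 0, 0], [0, cg, -sg], [0, sg, cg]])
    return Rz @ Ry @ Rx
```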
The pose fusion method of the circular track measurement IMU 5 and the second UWB receiver 4 for carrying out Kalman filtering comprises the following steps:
in a static state, at different rotation angles of the motion axis of the scanning base 2, the position of the speckle structure optical sensor 7 under a world coordinate system and the angle value of the current moment are calibrated through the second UWB receiver 4 which is measured globally, the coordinate and angle relation of the position of the different moment under a machine coordinate system are obtained, and according to the converted discrete coordinates, a space track equation of the rotation axis at different angles is obtained through a fitting equation.
Under the motion state, the value of the circumferential track measurement IMU 5, which changes along with time under the sensor coordinate system, is acquired, the originally acquired signal contains a large amount of random noise, mean value filtering processing is needed to be carried out on the originally acquired signal to improve the signal to noise ratio, and the filtered angular velocity is integrated to obtain the angular change at different moments.
The pose of the speckle structured light sensor on the motion axis of the scanning base 2 can be obtained both by integrating the angular velocity of the circumferential-trajectory measurement IMU 5 and by fitting the data of the globally measured second UWB receiver 4. Both measurements contain errors, and these errors follow a Gaussian distribution; to obtain higher-precision data, the two sources are fused by Kalman filtering. The integration result of the circumferential-trajectory measurement IMU 5 serves as the prediction, and the fitting result of the globally measured UWB base station 9 serves as the state measurement.
The initial angle θ_0 is calculated from the components of the gravitational acceleration g in two directions:
θ_0 = arctan(a_y / a_z);
θ_k = θ_{k-1} + ω_{k-1}·Δt + W;
Wherein θ_k and θ_{k-1} are the angle values at time k and time k-1, ω_k and ω_{k-1} are the angular velocity values at time k and time k-1, Δt is the time between time k and time k-1, W is the noise, and a_y and a_z are the accelerations of the speckle structured light sensor in the y and z directions.
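The prediction/measurement fusion can be sketched as a one-state Kalman filter, with the gyro integration as the prediction and the UWB-derived angle as the measurement (the noise variances q and r are assumed values, not taken from the patent):

```python
def kalman_angle(omega, z_uwb, dt, q=1e-4, r=1e-2, theta0=0.0, p0=1.0):
    """One-state Kalman filter for the angle.

    omega: gyro angular velocities (prediction input)
    z_uwb: angles derived from the globally measured UWB fit (measurement)
    q, r:  assumed process- and measurement-noise variances
    """
    theta, p = theta0, p0
    out = []
    for w, z in zip(omega, z_uwb):
        # Predict: theta_k = theta_{k-1} + omega * dt (noise W folded into q).
        theta = theta + w * dt
        p = p + q
        # Update with the UWB measurement.
        k = p / (p + r)
        theta = theta + k * (z - theta)
        p = (1.0 - k) * p
        out.append(theta)
    return out
```

With a stationary gyro (omega = 0) and a constant UWB reading, the estimate converges to the UWB value, illustrating how the measurement corrects drift in the prediction.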
After the integrated attitude of the speckle structured light sensor is obtained by integrating the angular velocity of the circumferential-trajectory measurement IMU 5 and fitting the data of the globally measured second UWB receiver 4, the acceleration of the circumferential-trajectory measurement IMU 5 must be converted from the sensor coordinate system into the world coordinate system. The three-dimensional acceleration signal measured in motion contains the gravitational acceleration, which must be removed before resolving the motion trajectory of the sensor. The equations for resolving the trajectory of the speckle structured light sensor 7 from the circumferential-trajectory measurement IMU 5 are as follows:
X_k = X_{k-1} + v^x_{k-1}·δt + (1/2)·a^x_{k-1}·δt²;  Y_k = Y_{k-1} + v^y_{k-1}·δt + (1/2)·a^y_{k-1}·δt²;  Z_k = Z_{k-1} + v^z_{k-1}·δt + (1/2)·a^z_{k-1}·δt²;
v^x_k = v^x_{k-1} + a^x_{k-1}·δt;  v^y_k = v^y_{k-1} + a^y_{k-1}·δt;  v^z_k = v^z_{k-1} + a^z_{k-1}·δt;
Wherein X_k, Y_k, Z_k represent the position of the speckle structured light sensor on each coordinate axis at time k in the world coordinate system; X_{k-1}, Y_{k-1}, Z_{k-1} represent the position on each coordinate axis at time k-1; v^x_k, v^y_k, v^z_k represent the velocity on each coordinate axis at time k; v^x_{k-1}, v^y_{k-1}, v^z_{k-1} represent the velocity on each coordinate axis at time k-1; a^x_{k-1}, a^y_{k-1}, a^z_{k-1} represent the acceleration on each coordinate axis at time k-1; δt is the data-acquisition time interval; a^x_k, ω^y_k, ω^z_k are the x-axis acceleration, y-axis angular velocity and z-axis angular velocity at time k; and φ is the angle of rotation about the x-axis.
Finally, taking the calculation result of the circumferential-trajectory measurement IMU 5 as the prediction and the data measurement of the second UWB receiver 4 as the state measurement, Kalman fusion of the two different data types yields a more accurate cantilever end position.
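The position and velocity updates above can be sketched as dead reckoning on world-frame accelerations, assuming gravity acts along the world z axis with g = 9.81 m/s² (the axis convention is an assumption for illustration):

```python
def dead_reckon(acc_world, dt, p0=(0.0, 0.0, 0.0), v0=(0.0, 0.0, 0.0), g=9.81):
    """Integrate world-frame accelerations into a trajectory.

    Per step:  X_k = X_{k-1} + v_{k-1}*dt + 0.5*a_{k-1}*dt**2
               v_k = v_{k-1} + a_{k-1}*dt
    Gravity is removed from the z component before integrating.
    """
    p, v = list(p0), list(v0)
    traj = [tuple(p)]
    for a in acc_world:
        acc = (a[0], a[1], a[2] - g)  # remove gravity from the vertical axis
        for i in range(3):
            p[i] = p[i] + v[i] * dt + 0.5 * acc[i] * dt * dt
            v[i] = v[i] + acc[i] * dt
        traj.append(tuple(p))
    return traj
```

With a constant 1 m/s² x-acceleration (and the z channel reading only gravity), the x position follows 0.5·t² while z stays at zero, confirming the gravity removal.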
Specifically, step 5 comprises:
The essence of the ICP iterative algorithm is to solve for the optimal (R, t) that minimizes the error function E(R, t), which is formulated as:
E(R, t) = (1/n)·Σ_{i=1}^{n} ‖q_i − (R·p_i + t)‖²;
Wherein p_i and q_i are the i-th point in the source point set P and the i-th point in the target point set Q, respectively; n is the total number of matching points; ‖·‖ denotes the norm of a vector, i.e. its length in the vector space.
The ICP iterative algorithm comprises the following steps:
Step 5.1, take a point p_i ∈ P from the source point set P, where i is the index;
Step 5.2, find in the target point set Q the point q_i ∈ Q that corresponds to p_i ∈ P;
Step 5.3, calculate the rotation matrix R and translation matrix t such that the error function E(R, t) is minimized;
Step 5.4, apply the R and t obtained in Step 5.3 to the points p_i to obtain a new source point set P' = {p'_i = R·p_i + t, p_i ∈ P}, where p'_i is a transformed point;
Step 5.5, calculate the average distance d between the points p'_i of the source point set P' and their corresponding points q_i in the target point set Q, given by d = (1/n)·Σ‖q_i − p'_i‖², where n is the number of matched point pairs;
Step 5.6, set a threshold; stop iterating when the average distance d is smaller than the threshold, otherwise return to Step 5.2 until the threshold condition is met.
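The iteration of Steps 5.1-5.6 can be sketched in 2-D, using the closed-form Procrustes solution for Step 5.3 (a simplification of the general 3-D case; `icp_2d` is a hypothetical helper, not the patent's implementation):

```python
import math

def nearest(q_set, p):
    # Step 5.2: nearest-neighbour correspondence for p in the target set Q.
    return min(q_set, key=lambda q: (q[0] - p[0]) ** 2 + (q[1] - p[1]) ** 2)

def icp_2d(P, Q, max_iter=50, tol=1e-8):
    src = [tuple(p) for p in P]
    n = len(src)
    d = float("inf")
    for _ in range(max_iter):
        pairs = [(p, nearest(Q, p)) for p in src]
        # Centroids of the matched sets.
        cpx = sum(p[0] for p, _ in pairs) / n
        cpy = sum(p[1] for p, _ in pairs) / n
        cqx = sum(q[0] for _, q in pairs) / n
        cqy = sum(q[1] for _, q in pairs) / n
        sxx = sxy = syx = syy = 0.0
        for p, q in pairs:
            px, py = p[0] - cpx, p[1] - cpy
            qx, qy = q[0] - cqx, q[1] - cqy
            sxx += px * qx; sxy += px * qy
            syx += py * qx; syy += py * qy
        # Step 5.3: rotation angle and translation minimising E(R, t).
        theta = math.atan2(sxy - syx, sxx + syy)
        c, s = math.cos(theta), math.sin(theta)
        tx, ty = cqx - (c * cpx - s * cpy), cqy - (s * cpx + c * cpy)
        # Step 5.4: apply (R, t) to the source points.
        src = [(c * x - s * y + tx, s * x + c * y + ty) for x, y in src]
        # Step 5.5: mean squared distance to the current correspondences.
        d = sum((q[0] - p[0]) ** 2 + (q[1] - p[1]) ** 2
                for p, q in [(p, nearest(Q, p)) for p in src]) / n
        # Step 5.6: stop once d drops below the threshold.
        if d < tol:
            break
    return src, d
```

Aligning a point set against a slightly translated copy of itself converges in a single iteration, since the Procrustes fit recovers the translation exactly once the correspondences are correct.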
Fig. 5 is a three-dimensional point cloud image of a pipe gallery obtained according to the present invention: a set of three-dimensional points in the world coordinate system, in which the gray level represents the distance between a point and the central axis (the darker the color, the closer the point is to the central axis), and blank areas indicate positions where no point cloud was collected under the current view because of occlusion or similar causes.
Claims (10)
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN202411489703.5A CN118999531B (en) | 2024-10-24 | 2024-10-24 | Mobile robot synchronous positioning and mapping method for underground space |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| CN118999531A CN118999531A (en) | 2024-11-22 |
| CN118999531B true CN118999531B (en) | 2024-12-27 |
Family
ID=93487491
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| CN202411489703.5A Active CN118999531B (en) | 2024-10-24 | 2024-10-24 | Mobile robot synchronous positioning and mapping method for underground space |
Country Status (1)
| Country | Link |
|---|---|
| CN (1) | CN118999531B (en) |
Families Citing this family (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN120471451B (en) * | 2025-05-22 | 2025-10-31 | 四川旭信科技有限公司 | A mine safety monitoring system and method for confined spaces in mines |
Citations (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2021137750A1 (en) * | 2019-12-30 | 2021-07-08 | Singpilot Pte Ltd | Sequential mapping and localization (smal) for navigation |
| CN115639813A (en) * | 2022-09-21 | 2023-01-24 | 东软医疗系统股份有限公司 | A mobile DR control method and device, storage medium, and computer equipment |
Family Cites Families (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20210356279A1 (en) * | 2018-07-08 | 2021-11-18 | Nng Software Developing And Commercial Llc. | A Method and Apparatus for Optimal Navigation to Multiple Locations |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| PB01 | Publication | ||
| SE01 | Entry into force of request for substantive examination | ||
| GR01 | Patent grant | ||