
CN113252023A - Positioning method, device and equipment based on odometer - Google Patents

Positioning method, device and equipment based on odometer Download PDF

Info

Publication number
CN113252023A
CN113252023A (application CN202110452025.5A)
Authority
CN
China
Prior art keywords
attitude data
determining
point cloud
cloud image
odometer
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110452025.5A
Other languages
Chinese (zh)
Inventor
李强
毕艳飞
柴黎林
李贝
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Ubtech Technology Co ltd
Original Assignee
Shenzhen Ubtech Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Ubtech Technology Co ltd filed Critical Shenzhen Ubtech Technology Co ltd
Priority to CN202110452025.5A
Publication of CN113252023A
Legal status: Pending

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00: Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/005: Navigation with correlation of navigation data from several sources, e.g. map or contour matching
    • G01C21/20: Instruments for performing navigational calculations
    • G01C22/00: Measuring distance traversed on the ground by vehicles, persons, animals or other moving solid bodies, e.g. using odometers, using pedometers
    • G01C22/02: Measuring distance traversed on the ground by conversion into electric waveforms and subsequent integration, e.g. using tachometer generator
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00: Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/86: Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
    • G01S17/88: Lidar systems specially adapted for specific applications
    • G01S17/89: Lidar systems specially adapted for mapping or imaging

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Electromagnetism (AREA)
  • Navigation (AREA)

Abstract

The application belongs to the field of positioning, and provides a positioning method, device, and equipment based on an odometer. The method comprises the following steps: acquiring first attitude data of the device to be positioned through a wheeled odometer; collecting a point cloud image through a laser radar and determining second attitude data of the device to be positioned based on a normal distribution transformation method; and determining a nonlinear system corresponding to the device to be positioned according to the first attitude data and the second attitude data, fusing the first attitude data and the second attitude data based on an extended Kalman filtering method, and determining fused third attitude data for positioning. The fused attitude data can thus effectively reduce the influence of accumulated errors and the mileage deviation caused by road-surface factors, helping to improve the positioning accuracy and stability of the positioning equipment.

Description

Positioning method, device and equipment based on odometer
Technical Field
The application belongs to the field of positioning, and particularly relates to a positioning method, device, and equipment based on an odometer.
Background
At present, AMR (Autonomous Mobile Robot) navigation and positioning systems mainly use a wheel encoder as the trajectory-estimation hardware. The movement speed or acceleration of the wheels of the positioning equipment is acquired through the wheel encoder of the odometer, and the displacement of the positioning equipment is calculated by integrating over time.
When the wheel encoder is used to acquire the movement speed or acceleration of the positioning equipment, factors such as dimensional deviation of the positioning equipment, unsynchronized encoder pulses, or MCU delay cause the accumulated error of the odometer's time integration to grow gradually as time increases. Moreover, abnormal fluctuations in the encoder data or uneven road surfaces can also cause mileage deviation, which is unfavorable for ensuring the positioning accuracy and stability of the positioning equipment.
Disclosure of Invention
In view of this, embodiments of the present application provide a positioning method, apparatus, and device based on an odometer, to solve the problem in the prior art that positioning with an odometer produces mileage deviation, which is not conducive to ensuring positioning accuracy and stability.
A first aspect of an embodiment of the present application provides a method for odometer-based positioning, the method including:
acquiring first attitude data of equipment to be positioned through a wheel type odometer;
acquiring a point cloud image through a laser radar, and determining second attitude data of equipment to be positioned based on a normal distribution transformation method;
and determining a nonlinear system corresponding to the equipment to be positioned according to the first attitude data and the second attitude data, fusing the first attitude data and the second attitude data based on an extended Kalman filtering method, and determining fused third attitude data for positioning.
With reference to the first aspect, in a first possible implementation manner of the first aspect, acquiring a point cloud image by using a laser radar, and determining second posture data of a device to be positioned based on a normal distribution transformation method includes:
collecting a first point cloud image at a first moment through a laser radar;
collecting a second point cloud image at a second moment through a laser radar;
and determining second attitude data at a second moment by a normal distribution transformation method according to the first point cloud image and the second point cloud image.
With reference to the first possible implementation manner of the first aspect, in a second possible implementation manner of the first aspect, determining second posture data at a second time according to the first point cloud image and the second point cloud image by using a normal distribution transformation method includes:
dividing the space occupied by the first point cloud image into grids with preset sizes, and calculating the normal distribution parameter of each grid;
converting the second point cloud image into a grid of the first point cloud image through preset conversion parameters;
calculating the probability density of each conversion point according to the normal distribution parameters to obtain the probability density sum of the conversion points;
optimizing the transformation parameters such that the sum of probability densities is maximized;
and determining second attitude data of the second moment relative to the first moment according to the optimized transformation parameters.
With reference to the second possible implementation manner of the first aspect, in a third possible implementation manner of the first aspect, optimizing the transformation parameters so that a sum of probability densities is maximized includes:
and maximizing the sum of probability densities by optimizing, through a Newton optimization method, the objective function corresponding to the sum of probability densities.
With reference to the first aspect, in a fourth possible implementation manner of the first aspect, acquiring first posture data of the device to be positioned by using a wheeled odometer includes:
determining the moving distance of the equipment to be positioned through pulse encoders that acquire the rotating speeds of the left and right wheels;
determining the attitude angle of the equipment to be positioned through an azimuth sensor;
and determining the first attitude data according to the moving distance and the attitude angle.
With reference to the first aspect, in a fifth possible implementation manner of the first aspect, the determining, according to the first attitude data and the second attitude data, a nonlinear system corresponding to the device to be positioned includes:
determining a system prediction equation of the nonlinear system by taking the first attitude data as a prediction result;
determining a system observation equation of the nonlinear system by taking the second attitude data as an observation result;
and obtaining the nonlinear system according to the system prediction equation and the system observation equation.
With reference to the fifth possible implementation manner of the first aspect, in a sixth possible implementation manner of the first aspect, the fusing the nonlinear system by using an extended kalman filter method to obtain fused third attitude data includes:
and expanding the system prediction equation and the system observation equation, performing linear processing, and performing state updating on the linearly processed system prediction equation and system observation equation through a Kalman filtering method to obtain fused third attitude data.
A second aspect of embodiments of the present application provides an odometer-based positioning device, the device comprising:
the first attitude data acquisition unit is used for acquiring first attitude data of the equipment to be positioned through the wheel type odometer;
the second attitude data acquisition unit is used for acquiring a point cloud image through a laser radar and determining second attitude data of the equipment to be positioned based on a normal distribution transformation method;
and the state data fusion unit is used for determining a nonlinear system corresponding to the equipment to be positioned according to the first attitude data and the second attitude data, fusing the first attitude data and the second attitude data based on an extended Kalman filtering method, and determining fused third attitude data for positioning.
A third aspect of embodiments of the present application provides an odometer-based positioning device comprising a memory, a processor and a computer program stored in the memory and executable on the processor, the processor implementing the steps of the method according to any one of the first aspect when executing the computer program.
A fourth aspect of embodiments of the present application provides a computer-readable storage medium, in which a computer program is stored, which, when executed by a processor, performs the steps of the method according to any one of the first aspect.
Compared with the prior art, the embodiments of the present application have the following advantages: the application acquires the first attitude data through the wheeled odometer and the second attitude data through the laser odometer, and fuses the two through the extended Kalman filtering method, so that the fused attitude data can effectively reduce the influence of accumulated errors and the mileage deviation caused by road-surface factors, which is beneficial to improving the positioning accuracy and stability of the positioning equipment.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed to be used in the embodiments or the prior art descriptions will be briefly described below, and it is obvious that the drawings in the following description are only some embodiments of the present application, and it is obvious for those skilled in the art to obtain other drawings based on these drawings without inventive exercise.
Fig. 1 is a schematic flow chart illustrating an implementation of a location method based on an odometer according to an embodiment of the present application;
FIG. 2 is a schematic diagram of a first pose data calculation provided by an embodiment of the present application;
FIG. 3 is a flowchart illustrating an implementation of a method for determining second pose data according to an embodiment of the present application;
fig. 4 is a schematic flow chart illustrating an implementation of obtaining third posture data by fusion according to an embodiment of the present application;
FIG. 5 is a schematic diagram of an odometer-based locating device provided by an embodiment of the present application;
fig. 6 is a schematic diagram of an odometer-based positioning device provided in an embodiment of the present application.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system structures, techniques, etc. in order to provide a thorough understanding of the embodiments of the present application. It will be apparent, however, to one skilled in the art that the present application may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present application with unnecessary detail.
In order to explain the technical solution described in the present application, the following description will be given by way of specific examples.
For positioning devices such as automatic mobile robots that need to be navigated and positioned, the position of the moving positioning device needs to be determined during the process of automatically executing tasks. The positioning method based on the odometer has the characteristics of simple calculation and high response speed, so the positioning method based on the odometer is more common. However, the odometer-based positioning method also has a drawback in positioning accuracy.
For example, when the odometer is a wheeled odometer, the positioning accuracy may be affected by:
influencing factor 1, the factor of the positioning device itself. Since the kinematic model relies on the size of the positioning apparatus itself. When the size of the positioning equipment is deviated, or the encoder pulses are not synchronous, or the processor is delayed, the angle calculation deviation can be caused. Wheel odometers typically integrate the speed over an equal amount of time, with the cumulative error increasing with time. Moreover, abnormal fluctuation or sudden change of the encoder data can also cause instantaneous fluctuation of the odometer, so that the error accumulation at the next moment is increased.
Influencing factor 2: road-surface factors. When the positioning equipment moves on an uneven road surface, the undulation of the road causes the left and right wheels to fall out of sync, producing mileage deviation.
When the positioning equipment replaces the wheeled odometer with a laser odometer, its precision is higher than that of the wheeled odometer, but the laser odometer can produce a large positioning error in long-corridor scenes or other environments that lack distinguishing features.
Based on the above problem, an embodiment of the present application provides a positioning method based on an odometer, where the method is shown in fig. 1, and includes:
in S101, first attitude data of a device to be positioned is acquired by a wheel odometer.
The wheel type odometer can comprise pulse encoders arranged on a left wheel and a right wheel of the positioning equipment. The rotating speeds of the left wheel and the right wheel can be detected through the pulse signals output by the pulse encoder. According to the detected rotating speed, the moving distance of the left wheel and the right wheel of the positioning equipment can be obtained by combining the size parameters of the left wheel and the right wheel.
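As a minimal sketch of the pulse-to-distance step described above (the tick count, counts-per-revolution, and wheel radius below are illustrative assumptions, not values from the application):

```python
import math

def wheel_distance(ticks: int, ticks_per_rev: int, wheel_radius_m: float) -> float:
    """Linear distance travelled by one wheel, from its encoder pulse count."""
    revolutions = ticks / ticks_per_rev                  # fraction of full turns observed
    return revolutions * 2.0 * math.pi * wheel_radius_m  # arc length = angle * radius
```

Applying this to the left and right wheels separately gives the per-wheel moving distances that the paragraph above combines with the wheel size parameters.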
In a possible implementation, the attitude angle of the pointing device may be obtained by an orientation sensor. For example, the orientation sensor may be a sensing device such as a six-axis orientation sensor.
After the attitude angle and the movement distance of the positioning device are determined, first attitude data of the positioning device can be determined according to the attitude angle and the movement distance.
As shown in fig. 2, l is the distance between the left and right wheels of the positioning equipment, d is the difference between the distances moved by the left and right wheels in the predetermined period from time t1 to time t2, and r is the distance between the center point of the positioning equipment and the center of the arc while the equipment moves along the arc. Θ1 is the change in heading angle of the positioning equipment during that period. v_l is the moving speed of the left wheel and v_r is the moving speed of the right wheel.
Since the time interval from time t1 to time t2 is short, the first attitude data at time t2 can be calculated as (x1, y1, Θ) by combining it with the attitude data at time t1, which includes the attitude angle and coordinates at time t1.
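The short-interval pose update above can be sketched for a differential-drive base as follows (function and variable names are illustrative, not from the application):

```python
import math

def integrate_pose(x, y, theta, v_l, v_r, wheel_base, dt):
    """Propagate (x, y, theta) over a short interval dt from left/right wheel speeds."""
    v = 0.5 * (v_l + v_r)              # linear speed of the center point
    omega = (v_r - v_l) / wheel_base   # yaw rate; wheel_base is the wheel separation l
    x += v * math.cos(theta) * dt
    y += v * math.sin(theta) * dt
    theta += omega * dt                # accumulated heading change over the interval
    return x, y, theta
```

With equal wheel speeds the pose moves straight ahead; a speed difference between the wheels produces the heading change used for the attitude angle.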
In S102, a point cloud image is collected through a laser radar, and second attitude data of the equipment to be positioned is determined based on a normal distribution transformation method.
When the second attitude data acquired by the laser odometer is determined, the second attitude data of the positioning device can be determined in a point cloud matching mode.
The laser odometer collects a first point cloud image at a first moment as the reference point cloud image and collects a second point cloud image at a second moment; by performing feature registration between the second point cloud image and the first point cloud image, the orientation of the second point cloud image relative to the first can be determined, and thereby the second attitude data of the positioning equipment at the second moment. The first point cloud image may be a global point cloud image, or a point cloud image acquired at a moment before the second moment.
In a possible implementation manner, when the second posture data at the second time is determined according to the first point cloud image and the second point cloud image, the second posture data may be determined according to a normal distribution transformation manner.
When determining the second pose data according to the first point cloud image and the second point cloud image, the first point cloud image may be used as a reference image, the second point cloud image may be used as a registration image, and the transformation parameters of the second point cloud image and the first point cloud image are determined by calculating the probability density of the second point cloud image and the first point cloud image, so as to determine the second pose data, which may specifically be as shown in fig. 3, including:
in S301, the space occupied by the first cloud image is divided into grids of a predetermined size, and a normal distribution parameter of each grid is calculated.
The first point cloud image is used as a reference point cloud, and can be divided according to a specified size into grids or voxels with a specified size. The normal distribution parameters of the calculated grid may include a mean value q, a covariance matrix Σ, and the like.
For example, the mean q can be expressed as:
q = (1/n) Σ_{i=1..n} x_i
and the covariance matrix Σ can be expressed as:
Σ = (1/n) Σ_{i=1..n} (x_i - q)(x_i - q)^T
where n is the number of points falling in the grid and x_i are the coordinates of those points in the point cloud image.
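The per-grid statistics can be sketched with NumPy as follows (a minimal example on 2-D points; the sample coordinates are illustrative):

```python
import numpy as np

def ndt_cell_stats(points: np.ndarray):
    """Mean q and covariance of the points in one NDT grid cell (1/n normalization)."""
    q = points.mean(axis=0)          # cell mean
    diff = points - q                # centered coordinates
    sigma = diff.T @ diff / len(points)  # biased (1/n) covariance, matching the formulas above
    return q, sigma
```

Each cell of the reference point cloud is summarized this way once, before the registration iterations begin.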
In S302, the second point cloud image is transformed into a mesh of the first point cloud image according to a preset transformation parameter.
The predetermined transformation parameter P may be a zero value or another value calculated from odometry data. The transformation parameters need to be calculated iteratively until the transformation converges.
For the second point cloud image used for registration, the second point cloud image can be converted into a grid in the reference point cloud according to the set conversion parameters.
In S303, the probability density of each conversion point is calculated according to the normal distribution parameters, to obtain the sum of the probability densities of the conversion points.
According to the normal distribution parameters of each grid determined in step S301, the probability density of the conversion points corresponding to each grid can be calculated. Summing the probability densities of all conversion points yields the sum of probability densities over all grids.
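A sketch of this score evaluation, assuming a 2-D rigid transform (tx, ty, yaw) and, for brevity, a single cell distribution with mean q and inverse covariance sigma_inv (in practice each conversion point is scored against the distribution of the grid cell it falls in):

```python
import numpy as np

def ndt_score(points, pose, q, sigma_inv):
    """Sum of unnormalized Gaussian densities of the transformed (conversion) points."""
    tx, ty, yaw = pose
    c, s = np.cos(yaw), np.sin(yaw)
    rot = np.array([[c, -s], [s, c]])
    transformed = points @ rot.T + np.array([tx, ty])  # apply the candidate transform
    d = transformed - q
    # exp(-0.5 * d^T * Sigma^{-1} * d) for every point, then summed
    return np.exp(-0.5 * np.einsum('ni,ij,nj->n', d, sigma_inv, d)).sum()
```

Maximizing this score over the transformation parameters corresponds to the optimization performed in step S304.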
In S304, the transformation parameters are optimized such that the sum of probability densities is maximized.
The transformation parameters are optimized so that the sum of probability densities after the optimization processing is maximized.
Wherein, when the transformation parameters are optimized, a Newton optimization method can be adopted for optimization.
In S305, second posture data of the second time with respect to the first time is determined based on the optimized transformation parameter.
According to the transformation parameters after the optimization processing, the registration relation between the second point cloud image and the first point cloud image can be determined. According to the position of the second point cloud image relative to the first point cloud image, the position relation of the positioning device at the second moment relative to the first moment can be determined. When the first point cloud image is a global image, the orientation of the positioning device in the global image at the second moment can be determined.
In S103, a nonlinear system corresponding to the device to be positioned is determined according to the first attitude data and the second attitude data, the first attitude data and the second attitude data are fused based on an extended kalman filter method, and fused third attitude data used for positioning is determined.
The first attitude data acquired based on the wheel type odometer and the second attitude data determined in a point cloud image registration mode can be fused to obtain third attitude data with higher positioning accuracy. The fusion method includes, but is not limited to, mean fusion, fusion according to a specific weight method, and the like.
In a possible implementation manner, the first attitude data and the second attitude data may form a nonlinear system, and the nonlinear system is fused by an extended kalman filtering method to implement state updating, so as to obtain fused third attitude data. Specifically, as shown in fig. 4, the method includes:
in S401, a system prediction equation of the nonlinear system is determined using the first posture data as a prediction result.
Since the first attitude data is acquired by the wheel odometer, which calculates the change in pose by integrating speed or acceleration over time and determines the pose of the positioning equipment by accumulating those changes, a system prediction equation of the nonlinear system can be determined from the first attitude data. For example, the system prediction equation can be expressed as: x_t = g(x_{t-1}, u_t) + ε_t, where x_{t-1} is the true attitude data at time t-1, u_t is the control input, ε_t is the excitation noise, g is the prediction-equation function, and x_t is the predicted attitude data.
In S402, a system observation equation of the nonlinear system is determined using the second posture data as an observation result.
The second attitude data is determined from the registration result of the first point cloud image and the second point cloud image. In particular, when the first point cloud image is a global point cloud image, the positioning accuracy of the second attitude data is high, so a system observation equation can be determined from the second attitude data. For example, the system observation equation can be expressed as: z_t = h(x_t) + δ_t, where z_t is the observed attitude data, h is the observation-equation function, and δ_t is the observation noise.
In S403, the system prediction equation and the system observation equation are expanded and subjected to linear processing, and the state of the linearly processed system prediction equation and system observation equation is updated by a kalman filtering method, so as to obtain fused third attitude data.
The system prediction equation and the system observation equation of the nonlinear system are expanded, for example by Taylor expansion, and thereby linearized. Using the Kalman filtering method, the first attitude data serves as the predicted value of the system prediction equation and the second attitude data as the observed value of the system observation equation; the system state is then updated from the system state and system covariance matrix set at the initial moment, yielding the fused state data, namely the third attitude data.
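The prediction/correction cycle described above can be sketched as one generic EKF step; the model functions g, h and their Jacobians G, H are supplied by the caller, and all names are illustrative rather than taken from the application:

```python
import numpy as np

def ekf_step(x, P, u, z, Q, R, g, G, h, H):
    """One extended-Kalman-filter cycle: predict with the wheel-odometry motion
    model g (Jacobian G), correct with the lidar pose observation z via h (Jacobian H)."""
    # prediction: propagate the state, covariance grows by the process noise Q
    x_pred = g(x, u)
    P_pred = G @ P @ G.T + Q
    # correction: weigh the lidar observation against the prediction
    S = H @ P_pred @ H.T + R                 # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)      # Kalman gain
    x_new = x_pred + K @ (z - h(x_pred))     # fused (third) attitude data
    P_new = (np.eye(len(x)) - K @ H) @ P_pred
    return x_new, P_new
```

When the lidar observation agrees with the wheel-odometry prediction, the innovation is zero and the fused state equals the prediction; otherwise the gain K shifts the estimate toward the observation in proportion to the relative covariances.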
It should be understood that, the sequence numbers of the steps in the foregoing embodiments do not imply an execution sequence, and the execution sequence of each process should be determined by its function and inherent logic, and should not constitute any limitation to the implementation process of the embodiments of the present application.
Fig. 5 is a schematic diagram of an odometer-based positioning device according to an embodiment of the present application, as shown in fig. 5, the device includes:
a first posture data acquiring unit 501, configured to acquire first posture data of a device to be positioned through a wheel type odometer;
a second attitude data acquisition unit 502, configured to acquire a point cloud image through a laser radar, and determine second attitude data of a device to be positioned based on a normal distribution transformation method;
and a state data fusion unit 503, configured to determine a nonlinear system corresponding to the device to be positioned according to the first attitude data and the second attitude data, fuse the first attitude data and the second attitude data based on an extended kalman filter method, and determine fused third attitude data used for positioning.
The odometer-based positioning device shown in fig. 5 corresponds to the odometer-based positioning method shown in fig. 1.
Fig. 6 is a schematic diagram of an odometer-based positioning device provided in an embodiment of the present application. As shown in fig. 6, the odometer-based positioning apparatus 6 of this embodiment includes: a processor 60, a memory 61 and a computer program 62, such as an odometer-based positioning program, stored in said memory 61 and executable on said processor 60. The processor 60, when executing the computer program 62, implements the steps in the various odometer-based positioning method embodiments described above. Alternatively, the processor 60 implements the functions of the modules/units in the above-described device embodiments when executing the computer program 62.
Illustratively, the computer program 62 may be partitioned into one or more modules/units that are stored in the memory 61 and executed by the processor 60 to accomplish the present application. The one or more modules/units may be a series of computer program instruction segments capable of performing specific functions, which are used to describe the execution of the computer program 62 in the odometer-based positioning device 6.
The odometer-based positioning device 6 may be a robot or the like. The odometer-based positioning device may include, but is not limited to, a processor 60, a memory 61. Those skilled in the art will appreciate that fig. 6 is merely an example of an odometry-based positioning device 6, does not constitute a limitation of the odometry-based positioning device 6, may include more or fewer components than shown, or combine certain components, or different components, e.g., the odometry-based positioning device may also include input-output devices, network access devices, buses, etc.
The Processor 60 may be a Central Processing Unit (CPU), other general purpose Processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other Programmable logic device, discrete Gate or transistor logic, discrete hardware components, etc. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
The memory 61 may be an internal storage unit of the odometer-based positioning device 6, such as a hard disk or a memory of the odometer-based positioning device 6. The memory 61 may also be an external storage device of the odometer-based positioning device 6, such as a plug-in hard disk provided on the odometer-based positioning device 6, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash memory Card (Flash Card), and the like. Further, the memory 61 may also include both an internal storage unit and an external storage device of the odometer-based positioning device 6. The memory 61 is used for storing the computer program and other programs and data required by the odometer-based positioning device. The memory 61 may also be used to temporarily store data that has been output or is to be output.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-mentioned division of the functional units and modules is illustrated. In practical applications, the above functions may be allocated to different functional units and modules as needed; that is, the internal structure of the apparatus may be divided into different functional units or modules to perform all or part of the functions described above. The functional units and modules in the embodiments may be integrated in one processing unit, each unit may exist alone physically, or two or more units may be integrated in one unit, and the integrated unit may be implemented in the form of hardware or of a software functional unit. In addition, the specific names of the functional units and modules are only for convenience of distinguishing them from each other and are not used to limit the protection scope of the present application. For the specific working processes of the units and modules in the system, reference may be made to the corresponding processes in the foregoing method embodiments, which are not described herein again.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and reference may be made to the related descriptions of other embodiments for parts that are not described or illustrated in a certain embodiment.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus/terminal device and method may be implemented in other ways. For example, the above-described embodiments of the apparatus/terminal device are merely illustrative, and for example, the division of the modules or units is only one logical division, and there may be other divisions when actually implemented, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated modules/units, if implemented in the form of software functional units and sold or used as separate products, may be stored in a computer-readable storage medium. Based on such understanding, all or part of the processes in the methods of the embodiments described above may be implemented by a computer program instructing the relevant hardware. The computer program may be stored in a computer-readable storage medium, and when the computer program is executed by a processor, the steps of the methods described above may be implemented. The computer program comprises computer program code, which may be in the form of source code, object code, an executable file, some intermediate form, or the like. The computer-readable medium may include: any entity or device capable of carrying the computer program code, a recording medium, a USB disk, a removable hard disk, a magnetic disk, an optical disk, a computer memory, a Read-Only Memory (ROM), a Random Access Memory (RAM), an electrical carrier signal, a telecommunications signal, a software distribution medium, and the like. It should be noted that the content contained in the computer-readable medium may be appropriately increased or decreased as required by legislation and patent practice in the relevant jurisdictions; for example, in some jurisdictions, computer-readable media do not include electrical carrier signals and telecommunications signals in accordance with legislation and patent practice.
The above-mentioned embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not substantially depart from the spirit and scope of the embodiments of the present application and are intended to be included within the scope of the present application.

Claims (10)

1. A method of odometer-based positioning, the method comprising:
acquiring first attitude data of equipment to be positioned through a wheel type odometer;
acquiring a point cloud image through a laser radar, and determining second attitude data of the equipment to be positioned based on a normal distribution transformation method;
and determining a nonlinear system corresponding to the equipment to be positioned according to the first attitude data and the second attitude data, fusing the first attitude data and the second attitude data based on an extended Kalman filtering method, and determining fused third attitude data for positioning.
2. The method of claim 1, wherein collecting the point cloud image through the laser radar and determining the second attitude data of the equipment to be positioned based on the normal distribution transformation method comprises:
collecting a first point cloud image at a first moment through a laser radar;
collecting a second point cloud image at a second moment through a laser radar;
and determining second attitude data at a second moment by a normal distribution transformation method according to the first point cloud image and the second point cloud image.
3. The method of claim 2, wherein determining second pose data at a second time from the first point cloud image and the second point cloud image by a normal distribution transformation method comprises:
dividing the space occupied by the first point cloud image into grids of a preset size, and calculating the normal distribution parameters of each grid;
converting the second point cloud image into the grids of the first point cloud image through preset transformation parameters;
calculating the probability density of each converted point according to the normal distribution parameters to obtain the sum of the probability densities of the converted points;
optimizing the transformation parameters such that the sum of probability densities is maximized;
and determining the second attitude data of the second moment relative to the first moment according to the optimized transformation parameters.
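The grid-and-score procedure of claim 3 can be sketched as follows. This is a minimal 2D illustration only; the function names, the cell size, the covariance regularization term, and the use of unnormalized Gaussian densities are assumptions for the sketch and are not part of the disclosure:

```python
import numpy as np

def ndt_cell_params(points, cell_size):
    """Divide 2D points into grid cells and compute each cell's
    normal-distribution parameters (mean and inverse covariance)."""
    cells = {}
    for p in points:
        key = tuple((p // cell_size).astype(int))
        cells.setdefault(key, []).append(p)
    params = {}
    for key, pts in cells.items():
        pts = np.asarray(pts)
        if len(pts) < 3:  # too few points for a stable covariance
            continue
        mean = pts.mean(axis=0)
        cov = np.cov(pts.T) + 1e-6 * np.eye(2)  # regularize (assumed)
        params[key] = (mean, np.linalg.inv(cov))
    return params

def ndt_score(params, points, pose, cell_size):
    """Transform the second scan by pose = (x, y, theta) into the first
    scan's frame and sum the (unnormalized) Gaussian probability
    densities of the converted points under the matching cells."""
    x, y, theta = pose
    c, s = np.cos(theta), np.sin(theta)
    R = np.array([[c, -s], [s, c]])
    total = 0.0
    for p in points:
        q = R @ p + np.array([x, y])
        key = tuple((q // cell_size).astype(int))
        if key not in params:
            continue  # point fell outside the mapped grid
        mean, cov_inv = params[key]
        d = q - mean
        total += np.exp(-0.5 * d @ cov_inv @ d)
    return total
```

The transformation parameters that maximize this score (claim 4 proposes a Newton optimization for that step) then give the relative pose between the two scan times.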
4. The method of claim 3, wherein optimizing the transformation parameters such that a sum of probability densities is maximized comprises:
and constructing an objective function corresponding to the sum of probability densities, and optimizing the objective function by a Newton optimization method so that the sum of probability densities is maximized.
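A Newton iteration of the kind claim 4 recites can be sketched generically as below; the finite-difference gradient/Hessian estimates, step count, and tolerances are illustrative assumptions (a practical NDT implementation would use analytic derivatives of the score):

```python
import numpy as np

def newton_maximize(f, x0, iters=20, eps=1e-5):
    """Maximize a scalar function f of a parameter vector by Newton's
    method, with gradient and Hessian estimated by central differences."""
    x = np.asarray(x0, dtype=float)
    n = x.size
    for _ in range(iters):
        g = np.zeros(n)
        H = np.zeros((n, n))
        for i in range(n):
            e_i = np.zeros(n); e_i[i] = eps
            g[i] = (f(x + e_i) - f(x - e_i)) / (2 * eps)
            for j in range(n):
                e_j = np.zeros(n); e_j[j] = eps
                H[i, j] = (f(x + e_i + e_j) - f(x + e_i - e_j)
                           - f(x - e_i + e_j) + f(x - e_i - e_j)) / (4 * eps**2)
        step = np.linalg.solve(H, g)  # H is negative-definite near a maximum
        x = x - step
        if np.linalg.norm(step) < 1e-8:
            break
    return x
```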
5. The method of claim 1, wherein obtaining first pose data for a device to be positioned via a wheeled odometer comprises:
determining the moving distance of the equipment to be positioned through pulse encoders that acquire the rotating speeds of the left and right wheels;
determining the attitude angle of the equipment to be positioned through an azimuth sensor;
and determining the first attitude data according to the moving distance and the attitude angle.
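The dead-reckoning update behind claim 5 might be sketched as follows for a differential-drive base. The function name, the assumed wheel separation, and the mid-heading integration are illustrative choices, not from the claim; the `heading` argument stands in for the azimuth sensor's attitude angle:

```python
import math

def wheel_odometry_update(x, y, theta, d_left, d_right, heading=None):
    """Advance the pose (x, y, theta) by the distances travelled by the
    left and right wheels. If an absolute heading from an azimuth
    sensor is available, it replaces the integrated wheel heading."""
    d = (d_left + d_right) / 2.0  # distance moved by the wheel midpoint
    if heading is not None:
        theta_new = heading
    else:
        track = 0.5  # assumed wheel separation in metres (illustrative)
        theta_new = theta + (d_right - d_left) / track
    mid = (theta + theta_new) / 2.0  # integrate along the mean heading
    return x + d * math.cos(mid), y + d * math.sin(mid), theta_new
```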
6. The method of claim 1, wherein determining a non-linear system corresponding to the device to be positioned based on the first pose data and the second pose data comprises:
determining a system prediction equation of the nonlinear system by taking the first attitude data as a prediction result;
determining a system observation equation of the nonlinear system by taking the second attitude data as an observation result;
and obtaining the nonlinear system according to the system prediction equation and the system observation equation.
7. The method according to claim 6, wherein fusing the nonlinear system by an extended Kalman filtering method to obtain the fused third attitude data comprises:
expanding and linearizing the system prediction equation and the system observation equation, and performing a state update on the linearized system prediction equation and system observation equation through a Kalman filtering method to obtain the fused third attitude data.
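One predict/update cycle of the fusion in claims 6-7 can be sketched as below for a pose state [x, y, theta]: the wheel-odometer motion is the (nonlinear) prediction, linearized through its Jacobian, and the NDT pose is the observation, which here observes the state directly so its Jacobian is the identity. Noise covariances and the motion-model form are illustrative assumptions:

```python
import numpy as np

def ekf_fuse(x, P, u, z, Q, R):
    """One extended-Kalman-filter cycle: predict with the wheel-odometer
    input u = (distance, delta_theta), then update with the lidar/NDT
    pose measurement z. Returns the fused state and covariance."""
    d, dtheta = u
    theta = x[2]
    # --- predict: nonlinear motion model, linearized by its Jacobian F
    x_pred = np.array([x[0] + d * np.cos(theta),
                       x[1] + d * np.sin(theta),
                       theta + dtheta])
    F = np.array([[1.0, 0.0, -d * np.sin(theta)],
                  [0.0, 1.0,  d * np.cos(theta)],
                  [0.0, 0.0, 1.0]])
    P_pred = F @ P @ F.T + Q
    # --- update: H = I because the NDT result is a direct pose observation
    H = np.eye(3)
    S = H @ P_pred @ H.T + R            # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)  # Kalman gain
    x_new = x_pred + K @ (z - H @ x_pred)
    P_new = (np.eye(3) - K @ H) @ P_pred
    return x_new, P_new
```

When the measurement agrees with the prediction, the fused state equals the prediction and the covariance shrinks; when they disagree, the gain K weights them by their relative uncertainties.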
8. An odometer-based positioning device, the device comprising:
the first attitude data acquisition unit is used for acquiring first attitude data of the equipment to be positioned through the wheel type odometer;
the second attitude data acquisition unit is used for acquiring a point cloud image through a laser radar and determining second attitude data of the equipment to be positioned based on a normal distribution transformation method;
and the state data fusion unit is used for determining a nonlinear system corresponding to the equipment to be positioned according to the first attitude data and the second attitude data, fusing the first attitude data and the second attitude data based on an extended Kalman filtering method, and determining fused third attitude data for positioning.
9. An odometer-based positioning device comprising a memory, a processor and a computer program stored in the memory and executable on the processor, characterized in that the processor, when executing the computer program, carries out the steps of the method according to any one of claims 1 to 7.
10. A computer-readable storage medium, in which a computer program is stored which, when being executed by a processor, carries out the steps of the method according to any one of claims 1 to 7.
CN202110452025.5A 2021-04-26 2021-04-26 Positioning method, device and equipment based on odometer Pending CN113252023A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110452025.5A CN113252023A (en) 2021-04-26 2021-04-26 Positioning method, device and equipment based on odometer

Publications (1)

Publication Number Publication Date
CN113252023A true CN113252023A (en) 2021-08-13

Family

ID=77221648

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110452025.5A Pending CN113252023A (en) 2021-04-26 2021-04-26 Positioning method, device and equipment based on odometer

Country Status (1)

Country Link
CN (1) CN113252023A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115480579A (en) * 2022-10-13 2022-12-16 华侨大学 Crawler-type mobile machinery and method, device, and medium for tracking and controlling its established trajectory
CN116224349A (en) * 2022-12-12 2023-06-06 珠海创智科技有限公司 Robot positioning method, system and electronic device

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109186601A (en) * 2018-07-05 2019-01-11 南京理工大学 A kind of laser SLAM algorithm based on adaptive Unscented kalman filtering
CN110689576A (en) * 2019-09-29 2020-01-14 桂林电子科技大学 Automatic ware-based dynamic 3D point cloud normal distribution AGV positioning method
CN112444246A (en) * 2020-11-06 2021-03-05 北京易达恩能科技有限公司 Laser fusion positioning method in high-precision digital twin scene

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
JIANG, ZUOPENG; MEI, TIANCAN: "A single-line laser odometer based on PL-ICP and NDT point cloud matching", Laser Journal, no. 03, 25 March 2020 (2020-03-25) *
ZHANG, LICHUAN ET AL.: "Deep-Sea Engineering Equipment and High Technology Series: Navigation and Control Technology of Autonomous Underwater Vehicles", 31 October 2020, Shanghai Scientific & Technical Publishers, pages: 93 - 96 *
YANG, QIFENG; QU, DAOKUI; XU, FANG: "Research on a mobile robot localization algorithm based on 3D-NDT", Control Engineering of China, no. 04, 20 April 2020 (2020-04-20), pages 613 - 619 *
XIE, YONG; LIU, XIAORI; WANG, XIAOBO; WANG, BINRUI: "Odometer and lidar fusion localization for tunnel mobile robots", Bulletin of Science and Technology, no. 01, 31 January 2020 (2020-01-31), pages 93 - 98 *

Similar Documents

Publication Publication Date Title
CN111912417B (en) Map construction method, map construction device, map construction equipment and storage medium
US10852139B2 (en) Positioning method, positioning device, and robot
CN108253958B (en) Robot real-time positioning method in sparse environment
EP3875907B1 (en) Method, apparatus, computing device and computer-readable storage medium for positioning
CN109410735B (en) Reflection value map construction method and device
CN108279670B (en) Method, apparatus and computer readable medium for adjusting point cloud data acquisition trajectory
CN112179330A (en) Pose determination method and device of mobile equipment
CN110887493B (en) Track calculation method, medium, terminal and device based on local map matching
CN113933818A (en) Method, device, storage medium and program product for calibration of external parameters of lidar
CN113587934B (en) Robot, indoor positioning method and device and readable storage medium
CN104677361B (en) A kind of method of comprehensive location
CN110969649A (en) Matching evaluation method, medium, terminal and device of laser point cloud and map
CN111113422B (en) Robot positioning method and device, computer readable storage medium and robot
CN114593735B (en) A posture prediction method and device
CN113252023A (en) Positioning method, device and equipment based on odometer
CN114061573B (en) Ground unmanned vehicle formation positioning device and method
CN116380039A (en) A mobile robot navigation system based on solid-state lidar and point cloud map
CN112859110B (en) Positioning navigation method based on three-dimensional laser radar
CN118311561A (en) Fusion positioning method of laser radar and millimeter wave radar and related equipment
CN116399324A (en) Picture construction method and device, controller and unmanned vehicle
CN114111769B (en) Visual inertial positioning method and device and automatic driving device
CN112097772B (en) Robot and map construction method and device thereof
CN115457152A (en) External parameter calibration method and device, electronic equipment and storage medium
CN113538677A (en) Positioning method, robot and storage medium
CN116734840A (en) Mowing robot positioning method and device, electronic equipment and readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination