CN110470333B - Calibration method and device of sensor parameters, storage medium and electronic device - Google Patents
- Publication number
- CN110470333B (application number CN201910760696.0A)
- Authority
- CN
- China
- Prior art keywords
- sensors
- calibration
- preset number
- specified
- feature data
- Prior art date
- Legal status: Active (an assumption, not a legal conclusion)
Classifications
- G—PHYSICS
- G01—MEASURING; TESTING
- G01D—MEASURING NOT SPECIALLY ADAPTED FOR A SPECIFIC VARIABLE; ARRANGEMENTS FOR MEASURING TWO OR MORE VARIABLES NOT COVERED IN A SINGLE OTHER SUBCLASS; TARIFF METERING APPARATUS; MEASURING OR TESTING NOT OTHERWISE PROVIDED FOR
- G01D18/00—Testing or calibrating apparatus or arrangements provided for in groups G01D1/00 - G01D15/00
Abstract
The invention provides a calibration method and apparatus for sensor parameters, a storage medium and an electronic apparatus. The method includes the following steps: acquiring characteristic data of target equipment by a preset number of sensors, wherein the preset number of sensors are arranged on the target equipment; and calibrating the parameters of the preset number of sensors using the characteristic data. The invention solves the problem in the related art of poor online-calibration timeliness caused by the dependence of sensor-parameter calibration on external environment information.
Description
Technical Field
The invention relates to the field of communication, in particular to a calibration method and device of sensor parameters, a storage medium and an electronic device.
Background
Positioning and navigation, path planning and autonomous obstacle avoidance are hot topics in mobile-robot and unmanned-driving research, and are also key to bringing mobile robots and unmanned-driving technology to market. A mobile robot or unmanned vehicle senses the surrounding environment and its own state through multiple sensors, such as a camera, a wheel odometer, an ultrasonic radar, an IMU (Inertial Measurement Unit) and a Lidar (Light Detection And Ranging), and can thereby achieve autonomous obstacle avoidance, positioning and navigation in a dynamic environment with obstacles.
In the related art, information fusion between different types of sensors is a necessary means of improving the environment-perception capability of a robot or unmanned vehicle, and multi-sensor fusion technology provides the mobile robot or unmanned vehicle with more accurate environment information and higher safety performance. Multi-sensor calibration is a prerequisite for multi-sensor fusion: only with accurate internal and external parameters of the sensors can information from different types of sensors be fused well, giving stronger environment-sensing capability. However, the related art mostly performs offline calibration in a laboratory or at the factory assembly stage; a specific calibration scene (a specific calibration target, a specific motion trajectory) must be set up, and the calibration process is generally complex and consumes a large amount of manpower, material and financial resources. Moreover, the working scenes of mobile robots and unmanned vehicles are generally complex and changeable, and the relative positional relationships among the sensors are easily changed by external forces such as collisions and long-term mechanical vibration; a wrong positional relationship inevitably produces wrong environment-sensing information, so that the mobile robot or unmanned vehicle loses its correct environment-sensing capability.
In order to endow the mobile robot or unmanned vehicle with the ability to cope with the complex and changeable dynamic environments of different road conditions, and to improve its environmental adaptability, multi-sensor online calibration technology is an effective solution. Existing multi-sensor online calibration schemes mainly rely on feature extraction from the external environment and place certain requirements on the environment information. For example, to calibrate the relative position between a camera and a wheel odometer, the surrounding environment is generally required to be clear with strong texture information, and the wheels must travel a specific trajectory. For calibration of the relative positional relationship between a camera and a Lidar, the surrounding environment is likewise required to be spacious with strong texture information. The related art therefore depends excessively on external environment information, the application scenarios of such online calibration methods are limited, and the timeliness of completing the online calibration task cannot be guaranteed.
In view of the above problems in the related art, no effective solution exists at present.
Disclosure of Invention
The embodiment of the invention provides a calibration method and device of sensor parameters, a storage medium and an electronic device, which are used for at least solving the problem of low online calibration timeliness caused by the fact that external environment information is required to be relied on for calibrating the sensor parameters in the related technology.
According to an embodiment of the invention, a calibration method of sensor parameters is provided, the method comprising: acquiring characteristic data of target equipment by a preset number of sensors, wherein the preset number of sensors are arranged on the target equipment; and calibrating the parameters of the sensors in the preset number by using the characteristic data.
According to another embodiment of the present invention, there is provided a calibration apparatus for sensor parameters, including: the system comprises an acquisition module, a processing module and a control module, wherein the acquisition module is used for acquiring characteristic data of target equipment by a preset number of sensors, and the preset number of sensors are arranged on the target equipment; and the calibration module is used for calibrating the parameters of the preset number of sensors by using the characteristic data.
According to a further embodiment of the present invention, there is also provided a storage medium having a computer program stored therein, wherein the computer program is arranged to perform the steps of any of the above method embodiments when executed.
According to yet another embodiment of the present invention, there is also provided an electronic device, including a memory in which a computer program is stored and a processor configured to execute the computer program to perform the steps in any of the above method embodiments.
According to the invention, the characteristic data of the target equipment is acquired through the sensors in the preset number, and the parameter calibration of the sensors in the preset number is carried out by utilizing the characteristic data; that is to say, if the target device is a robot or an unmanned vehicle, the vehicle body characteristics of the robot or the unmanned vehicle can be collected through the sensor, and the parameters of the sensor are calibrated, so that the problem of low online calibration timeliness caused by the fact that external environment information is required to be relied on for calibrating the parameters of the sensor in the related art is solved, and the online calibration efficiency of the sensor is improved.
Drawings
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the invention and together with the description serve to explain the invention and do not constitute a limitation of the invention. In the drawings:
FIG. 1 is a flow chart of a method for calibration of sensor parameters according to an embodiment of the present invention;
FIG. 2 is an alternative flow diagram of a method for calibration of sensor parameters according to an embodiment of the present invention;
FIG. 3 is a block diagram of a calibration apparatus for sensor parameters according to an embodiment of the present invention;
fig. 4 is a schematic diagram of an alternative structure of a calibration device for sensor parameters according to an embodiment of the invention.
Detailed Description
The invention will be described in detail hereinafter with reference to the accompanying drawings in conjunction with embodiments. It should be noted that the embodiments and features of the embodiments in the present application may be combined with each other without conflict.
It should be noted that the terms "first," "second," and the like in the description and claims of the present invention and in the drawings described above are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order.
Example 1
In this embodiment, a calibration method for sensor parameters is provided, where an execution subject of the method may be a target device, the target device may be an electronic device with an environment sensing capability, such as a mobile robot, an unmanned vehicle, and the like, a preset number of sensors for sensing environment data are disposed on the target device, fig. 1 is a flowchart of the calibration method for sensor parameters according to an embodiment of the present invention, and as shown in fig. 1, the flowchart includes the following steps:
step S102, acquiring characteristic data of target equipment by a preset number of sensors, wherein the preset number of sensors are arranged on the target equipment;
In alternative embodiments of the present application, the types of sensors may include one or more sensors for sensing the surrounding environment, such as a camera, a wheel odometer, an ultrasonic radar, an IMU and a Lidar; the parameters of the sensors may include, but are not limited to, the following: RGB camera intrinsic parameters; extrinsic parameters between the RGB camera and a depth camera (TOF/structured light/binocular); extrinsic parameters between the RGB camera/depth camera and the Lidar; extrinsic parameters between the RGB camera/depth camera and the wheel odometer; extrinsic parameters between the RGB camera/depth camera and the IMU; and the like. It should be noted that the objects whose data are to be acquired differ between different types of sensors; if the target device is a robot, the vehicle body features corresponding to different types of sensors are different.
Step S104, calibrating parameters of a preset number of sensors by using the characteristic data;
the calibration of the parameters of the sensor to which the present application relates may include: at least one of an internal reference calibration of the sensor and an external reference calibration between the sensors. Specifically, the calibration of the parameters of the sensor may include: internal reference calibration of a camera type sensor, external reference calibration between a camera type sensor and any or each of the other types of sensors.
By way of example, consider a target device provided with a camera, an ultrasonic radar and an IMU. Using the characteristic data, the internal parameters of the camera can be calibrated, the external parameters between the camera and the ultrasonic radar can be calibrated, and the external parameters between the camera and the IMU can be calibrated.
Through the above steps S102 to S104, characteristic data of the target equipment are acquired by a preset number of sensors, and the parameters of the preset number of sensors are calibrated using the characteristic data; that is to say, if the target device is a robot or an unmanned vehicle, the vehicle body features of the robot or unmanned vehicle can be collected by the sensors and the parameters of the sensors calibrated, which solves the problem in the related art of poor online-calibration timeliness caused by reliance on external environment information, and improves the online calibration efficiency of the sensors.
It should be noted that, in a specific application scenario, feature data corresponding to vehicle body features are acquired according to specific calibration task requirements. For example, for calibrating the camera internal parameters, only the image data containing the vehicle body characteristics of the corresponding camera needs to be acquired; for the calibration of external parameters between the camera and the 3D Lidar, the data of the vehicle body characteristic picture and the data of the 3D Lidar need to be acquired simultaneously.
In addition, if the target device is, for example, a mobile robot or an unmanned vehicle, the preset number of sensors involved in the present application must be installed at appropriate positions on the robot or unmanned vehicle so that each sensor can acquire a specified feature of the vehicle body itself, where the number of specified features may be one or more. The specified features are vehicle body features of the robot or unmanned vehicle, including but not limited to the following: planar features such as the vehicle body plane; shape features such as straight lines and circular arcs in the vehicle body contour; geometric features such as line-segment lengths, angles between straight lines and arc radii in the vehicle body contour; and calibration targets commonly used in multi-sensor calibration, such as pasted checkerboards or two-dimensional codes.
It should be noted that, for some scenes in which the relative positional relationship between the camera and the vehicle body is fixed, more complex vehicle body features need to be constructed to complete the calibration task, for example, targets pasted on a plurality of non-coplanar vehicle body planes; that is, in a scene in which the relative positional relationship between two sensors is fixed, a plurality of features on the target device can serve as the specified features corresponding to the two sensors, and during the external-parameter calibration between the two sensors one group of specified feature data is obtained by a single simultaneous acquisition with the two sensors, the group including the feature data of the specified features corresponding to both sensors.
In order to improve the accuracy of parameter calibration, the characteristics of the used vehicle bodies can be different when different parameters are calibrated; for example, taking calibration of a relative positional relationship (external reference) between an RGB camera and a Lidar as an example for explanation, in order to facilitate quick and accurate calibration of internal reference of the RGB camera and quick and accurate determination of a relative positional relationship between a vehicle body feature and the camera, a calibration target with a known accurate size, similar to AprilTag, may be pasted on a vehicle body in a field of view of the RGB camera; for another example, in calibrating the relative positional relationship (external reference) between the RGB camera and the IMU, other vehicle body features other than the specified features corresponding to the RGB camera and the Lidar may be used.
Based on the above description, in an alternative embodiment of the present application, the manner of processing the feature data through the preset algorithm to calibrate the parameters of the preset number of sensors, referred to in step S104 of the present application, may further include:
step S104-11, extracting specified feature data corresponding to specified features of the target equipment from the feature data, wherein the specified feature data are acquired by any two sensors in a preset number of sensors;
optionally, in the process of calibrating the external parameters of the two sensors, the feature data of the specified features corresponding to the two sensors are extracted from the feature data to obtain specified feature data, and the extracted specified feature data is processed to calibrate the external parameters between the two sensors.
Step S104-12, determining, from the specified feature data, description information of the specified features, or of the planes in which the specified features lie, in the coordinate system of each of the two sensors;
among them, the above description information may be a plane equation in an alternative embodiment of the present application.
And S104-13, calibrating parameters of any two sensors by processing the description information through a preset algorithm.
Wherein the preset algorithm corresponds to the two sensors involved in the step S104-13; the preset algorithms adopted for external reference calibration between sensors of different combinations may be different, and the corresponding relationship is set by developers.
As can be seen from the foregoing steps S104-11 to S104-13, in the present application, calibration of parameters in a preset number of sets of sensors can be implemented based on a specified feature of a target device, and in a case that the target device is a mobile robot or an unmanned vehicle, the specified feature may be a target pasted on a vehicle body plane or a vehicle body plane, and of course, in other scenarios, other vehicle body features may also be implemented, such as: straight line features, circular hole features, etc. In the present application, the steps S104-11 to S104-13 are exemplified by taking two sensors as an RGB camera and a Lidar to perform external reference calibration of the RGB camera and the Lidar;
First, in order to facilitate quick and accurate calibration of the RGB camera internal parameters and quick and accurate calculation of the relative positional relationship between the vehicle body features and the camera, calibration targets of known accurate size, similar to AprilTag, can be pasted in advance on the vehicle body within the camera field of view, and the target feature points extracted with an AprilTag recognition algorithm; if the thickness of the target is ignored, the vehicle body plane is the target plane, the target being a component of the vehicle body features. From the feature points, the relative positional relationship between the target coordinate system and the camera coordinate system is solved using a PnP algorithm, from which the space plane equation of the vehicle body plane in the camera coordinate system is obtained:
A_c·x + B_c·y + C_c·z + D_c = 0
A space plane equation of the vehicle body plane in the Lidar coordinate system is likewise extracted using a PCL point-cloud plane-extraction algorithm:

A_l·x + B_l·y + C_l·z + D_l = 0
where A, B, C and D are the coefficients of the plane equation and x, y, z are the coordinates of a three-dimensional space point; the subscript c (A_c, B_c, C_c) denotes the representation of the plane equation in the camera coordinate system, and the subscript l (A_l, B_l, C_l) denotes its representation in the Lidar coordinate system.
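As a minimal sketch (not part of the patent; function and variable names are illustrative), converting the plane coefficients above into the unit normal and origin-to-plane distance used in the following derivation can be written with NumPy:

```python
import numpy as np

def plane_normal_and_distance(A, B, C, D):
    # Plane A*x + B*y + C*z + D = 0 -> unit normal n and the signed
    # distance d from the coordinate-system origin to the plane.
    # (The text works with the positive distance, i.e. abs(d).)
    n = np.array([A, B, C], dtype=float)
    scale = np.linalg.norm(n)
    return n / scale, -D / scale

# e.g. the plane z = 2 in some sensor frame
n, d = plane_normal_and_distance(0.0, 0.0, 1.0, -2.0)
```

The same conversion applies in both the camera and the Lidar coordinate systems.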
Each vehicle body plane forms a group of three-dimensional plane constraints in the camera coordinate system and the Lidar coordinate system respectively. The unit normal vector of the space plane and the distance from the coordinate-system origin to the plane are:

n_c = (A_c, B_c, C_c)^T / √(A_c² + B_c² + C_c²), d_c = |D_c| / √(A_c² + B_c² + C_c²)

n_l = (A_l, B_l, C_l)^T / √(A_l² + B_l² + C_l²), d_l = |D_l| / √(A_l² + B_l² + C_l²)

where n_c is the unit normal vector of the target plane in the camera coordinate system, n_l is the unit normal vector of the target plane in the Lidar coordinate system, d_c is the distance from the origin of the camera coordinate system to the plane, and d_l is the distance from the origin of the Lidar coordinate system to the plane.
Before the PnP algorithm is used to solve the relative positional relationship between the target plane and the camera coordinate system, the camera calibration algorithm must first be used to complete the calibration of the camera internal parameters, obtaining the camera intrinsic matrix:

K = | f_x  0    c_x |
    | 0    f_y  c_y |
    | 0    0    1   |

where f_x is the normalized focal length in the horizontal direction of the sensor and f_y is the normalized focal length in the vertical direction. (c_x, c_y) is the pixel coordinate of the principal point, the intersection of the camera optical axis with the image plane, in units of pixels.
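Assuming a PnP solver has already returned the target pose (a rotation R_ct and translation t_ct of the target frame expressed in the camera frame), the plane parameters n_c and d_c follow directly; this sketch and its names are illustrative, not taken from the patent:

```python
import numpy as np

def target_plane_in_camera(R_ct, t_ct):
    # The target plane is z = 0 in the target frame, so its normal in the
    # camera frame is the target z-axis, i.e. the third column of R_ct,
    # and the target origin t_ct is a point on the plane.
    n_c = R_ct[:, 2]
    d_c = float(n_c @ t_ct)
    return n_c, d_c

# trivial pose: target frame aligned with the camera, 1 m in front of it
n_c, d_c = target_plane_in_camera(np.eye(3), np.array([0.0, 0.0, 1.0]))
```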
As described above, the unit normal vector n_c of the target plane in the camera coordinate system, the distance d_c from the camera-coordinate-system origin to the plane, the unit normal vector n_l of the target plane in the Lidar coordinate system, and the distance d_l from the Lidar-coordinate-system origin to the plane have been obtained. The relative positional relationship between the RGB camera and the Lidar to be calibrated is represented by the transformation matrix T:

T = | R_CL  t_CL |
    | 0^T   1    |

where R_CL is the rotation matrix of the change from the camera coordinate system to the Lidar coordinate system, whose elements r_ij occupy 3 rows and 3 columns (0 <= i, j <= 2); t_CL is the translation vector of the change from the camera coordinate system to the Lidar coordinate system, whose elements t_i (0 <= i <= 2) occupy 3 rows and 1 column.
According to the rotation principle of a plane in three-dimensional space, the following correspondence can be obtained. From the plane normal vectors

n_l = (l_1, l_2, l_3)^T, n_c = (c_1, c_2, c_3)^T

where, in the three-dimensional space coordinate system, a plane normal vector is a three-dimensional vector with three parameters (l_1, l_2, l_3 are the three parameters of the plane normal vector in the Lidar coordinate system, and c_1, c_2, c_3 are the three parameters of the plane normal vector in the camera coordinate system), it follows that:

n_l = R_CL · n_c
the constraint relation of the normal vector of one target plane occupied by one group of sensors (camera + Lidar) under the camera coordinate system and the Lidar coordinate system is described above.
When a plurality of vehicle body features are arranged on N (N being an integer greater than 1) non-coplanar vehicle body planes, one constraint relationship can be obtained for each vehicle body plane by the method above, giving N groups of constraint relationships. Alternatively, when the relative positions between the two sensors and the specified feature of the target device are variable, the target device acquires feature data of one specified feature N times in succession with the two sensors while moving, giving N groups of specified feature data, each group acquired by the two sensors at the same time; processing each group by the method above again yields N groups of constraint relationships. In both cases, N sets of such constraint equations are obtained:

N_l = R_CL · N_c

where N_c = [n_c^1, n_c^2, ..., n_c^N] and N_l = [n_l^1, n_l^2, ..., n_l^N] are 3×N matrices whose columns are the plane unit normals in the camera and Lidar coordinate systems respectively.

R_CL is an orthogonal matrix and satisfies the orthogonal-matrix properties:

R^T · R = I_3 and det(R) = 1

From these orthogonal-matrix properties, the equivalent objective function can be obtained:

R_CL = argmin_R ‖N_l − R · N_c‖_F², subject to R^T · R = I_3 and det(R) = 1
based on the method, the rotation matrix R _ CL can be obtained according to the original Procrustes recipe solving algorithm.
In addition, from the correspondence of distances between points and planes, given the plane unit normal vector n_c and distance d_c before transformation and the transformation (R_CL, t_CL), the distance from the coordinate-system origin to the plane after transformation is:

d′_l = d_c + (R_CL · n_c)^T · t_CL

Theoretically, the following equation should hold:

d′_l = d_l

However, because errors exist in the actual measurement process, the theoretically calculated distance and the actually measured distance are not exactly equal, so the following objective optimization function can be constructed:

t_CL = argmin_t Σ_i ( d_l^i − d_c^i − (R_CL · n_c^i)^T · t )²

The translation vector t_CL in the transformation matrix can therefore be solved using the Levenberg-Marquardt algorithm.
Further, the manner of extracting the specified feature data corresponding to the specified feature of the target device from the feature data involved in the above step S104-11 includes:
in the method (1), under the condition that the relative position between any two sensors and the designated feature of the target device is variable, multiple groups of designated feature data which are acquired by any two sensors in sequence aiming at one vehicle body feature are acquired, wherein each group of designated feature data is acquired by any two sensors at the same time; further, the plurality of sets of specified characteristic data are used to determine a parameter between the two sensors.
In the method (2), under the condition that the relative positions of any two sensors and the designated features of the target device are fixed, a group of designated feature data acquired by any two sensors is acquired, wherein the group of designated feature data comprises feature data of a plurality of vehicle body features, and the plurality of vehicle body features are not coplanar; further, the external parameters between the two sensors are determined by using the plurality of sets of specified characteristic data.
In the case where the relative position between any two sensors and the designated feature of the target device is variable, the target device needs to change the position, for example, move or rotate, to acquire multiple sets of designated feature data.
It should be noted that, in a specific application scenario, that is, over the entire calibration algorithm, at least five groups of constraint relationships of target planes observed by the group of sensors (camera + Lidar) in the camera coordinate system and the Lidar coordinate system are required to complete the solution of the rotation matrix R_CL and the translation vector. Of course, because sensor measurement errors are inevitable in the real world, N should be greater than 5 to obtain higher joint calibration accuracy; taking calibration efficiency into account, N is generally taken as 20.
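A synthetic, noise-free sanity check of the rotation-then-translation procedure with N = 20 planes can be sketched as follows; the simulation and all names are illustrative, not part of the patent:

```python
import numpy as np

rng = np.random.default_rng(0)

# ground-truth camera-to-Lidar transform: random proper rotation + translation
Q, _ = np.linalg.qr(rng.normal(size=(3, 3)))
R_true = Q if np.linalg.det(Q) > 0 else -Q
t_true = rng.normal(size=3)

N = 20                                   # the working value suggested in the text
Nc = rng.normal(size=(3, N))
Nc /= np.linalg.norm(Nc, axis=0)         # unit plane normals, camera frame
dc = rng.uniform(0.5, 3.0, size=N)       # origin-to-plane distances, camera frame
Nl = R_true @ Nc                         # the same planes seen from the Lidar
dl = dc + Nl.T @ t_true

# rotation via orthogonal Procrustes (SVD), reflection excluded
U, _, Vt = np.linalg.svd(Nl @ Nc.T)
S = np.diag([1.0, 1.0, np.sign(np.linalg.det(U @ Vt))])
R_est = U @ S @ Vt

# translation via linear least squares on d_l - d_c = (R n_c)^T t
t_est, *_ = np.linalg.lstsq((R_est @ Nc).T, dl - dc, rcond=None)
```

With noise-free data the estimates match the ground truth to floating-point precision; with real measurements, more planes than the minimum improve accuracy, as the text notes.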
In addition, it should be noted that, for a scene in which the relative positional relationship between the preset number of sensors and the vehicle body features can change, for example when the preset number of sensors is installed on the handlebar of a scooter, the pose of the sensors relative to the vehicle body features can easily be changed by rotating the handlebar. In this scenario, the vehicle body features need only include one target plane. However, for a scene in which the relative positional relationship between the preset number of sensors and the vehicle body features is fixed, the vehicle body features must include at least five target planes, and to meet high-precision requirements more target-plane features must be included.
Therefore, through the mode of the application, the online calibration of multiple sensors can be completed only by utilizing the self characteristics of target equipment (such as a robot or an unmanned vehicle), the method does not depend on external environment information, the online calibration efficiency of the multiple sensors is improved, and meanwhile, the success rate and the timeliness of the completion of the online calibration task of the multiple sensors are ensured.
In an optional embodiment of the present application, the calibration of the parameters of the preset number of sensors using the characteristic data, referred to in step S104 of the present application, includes:

Step S104-21, extracting, from the feature data, specified feature data corresponding to specified features of the target device, where the specified feature data are acquired by a sensor of a preset type among the preset number of sensors;

The preset type of sensor to which the present application relates may be a camera-type sensor.

Step S104-22, performing internal-parameter calibration of the preset type of sensor according to the specified feature data.
It should be noted that when the external-parameter calibration between any two sensors is completed for the first time, the calibration result of the parameters is stored; likewise, when the internal-parameter calibration of any sensor is completed for the first time, the calibration result of the parameter is stored.
In yet another alternative embodiment of the present application, as shown in fig. 2, the method steps of the present application may further comprise:
step S106, detecting whether the relative positions among the sensors in the preset number of sensors are changed;
in a specific application scenario, the target device may be impacted, or the relative positions of the sensors in the preset number of sensors may change due to human factors, or the like.
It should be noted that: the target equipment can periodically self-check whether the relative positions among the sensors in the preset number of sensors change, and the detection period can be set by developers or can be customized by users; when the target device receives the calibration check instruction, it may perform detection on whether the relative positions between the sensors in the preset number of sensors change, where the calibration check instruction may be a signal generated when a detection button on the target device is operated, or may also be a calibration check instruction sent by a mobile terminal, such as a mobile phone, a tablet computer, and the like, and is not specifically limited in this application.
Step S108: after detecting that the relative positions of the sensors in the preset number of sensors have changed, triggering execution of step S102 and step S104 again.
The stored calibration result of the parameters is updated by executing step S102 and step S104 again.
As can be seen from steps S106 to S108, once the relative positions of the sensors have changed, the previous calibration result is no longer accurate, so calibration must be performed again; the parameter calibration of the sensors is therefore triggered anew.
Detecting whether the relative positions between the sensors in the preset number of sensors have changed, in step S106, may further include:
Step S106-11: acquiring specified feature data of specified features of the target device acquired by any two sensors among the preset number of sensors, and acquiring the parameter calibration results of the two sensors;
Step S106-12: determining a spatial representation of the specified feature, or of the plane in which the specified feature lies, in the coordinate system of each of the two sensors;
Step S106-13: performing a calibration check on the parameter calibration results of the two sensors according to the spatial representations.
Steps S106-11 to S106-13 can be illustrated with a specific application scenario, taking the check of the extrinsic parameters between an RGB camera and a Lidar as an example. After acquiring a spatial representation A (first spatial representation) of a vehicle-body feature in the camera coordinate system and a spatial representation B (second spatial representation) of the same feature in the Lidar coordinate system, representation A is converted into a spatial representation C (third spatial representation) in the Lidar coordinate system on the basis of the calibrated extrinsic parameters between the RGB camera and the Lidar.
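The conversion from representation A (camera frame) to representation C (Lidar frame) is a rigid-body transform with the calibrated extrinsics (rotation R, translation t). Below is a minimal numpy sketch; the particular R and t are illustrative assumptions, not values from the patent.

```python
import numpy as np

def to_lidar_frame(R, t, points_cam):
    """Map points expressed in the camera frame into the Lidar frame
    using the camera-to-Lidar extrinsics (rotation R, translation t)."""
    return (R @ points_cam.T).T + t

# Illustrative extrinsics: 90-degree yaw plus a 0.2 m offset along x
R = np.array([[0.0, -1.0, 0.0],
              [1.0,  0.0, 0.0],
              [0.0,  0.0, 1.0]])
t = np.array([0.2, 0.0, 0.0])

A = np.array([[1.0, 0.0, 0.0]])   # representation A in the camera frame
C = to_lidar_frame(R, t, A)       # representation C in the Lidar frame
```

The check then compares C against the directly measured representation B in the Lidar frame.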
Performing the calibration check on the parameter calibration results of the two sensors according to the spatial representations, in step S106-13, may further include:
Step S1: determining the distance between the second spatial representation and the third spatial representation;
Step S2: determining that the calibration results of the parameters of the two sensors are correct when the distance is less than or equal to a preset threshold; in this case, a message indicating that the calibration result is correct is reported;
Step S3: determining that the calibration results of the parameters of the two sensors are incorrect when the distance is greater than the preset threshold.
When the calibration result is incorrect, a message indicating this is reported, and/or recalibration of the parameters of the preset number of sensors is triggered, and/or recalibration of the parameters of the two sensors involved in step S106-13 is triggered. The recalibration procedure follows the sensor parameter calibration method described above and is not repeated here.
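The threshold decision of steps S1 to S3 can be sketched as a single comparison between representations B and C. This is an illustrative sketch under the assumption that the representations are points compared with a Euclidean distance; names are not from the patent.

```python
import numpy as np

def check_extrinsics(repr_b, repr_c, threshold):
    """Compare the directly measured representation B with the
    transformed representation C; the calibration is judged correct
    when their distance does not exceed the preset threshold."""
    distance = float(np.linalg.norm(np.asarray(repr_b) - np.asarray(repr_c)))
    return distance <= threshold, distance

ok, d = check_extrinsics([0.0, 1.0, 0.0], [0.2, 1.0, 0.0], threshold=0.5)
```

For plane representations, a comparable metric (e.g., angle between normals plus offset difference) would replace the point distance.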
Reporting the calibration result as successful indicates that the calibration is correct and multi-sensor recalibration is not needed. Reporting it as failed indicates that the calibration is wrong, the navigation task must be terminated immediately, and self-protection must be carried out; self-protection includes, but is not limited to, moving off a pedestrian path or moving to an unmanned area. In other words, while the target device (mobile robot/unmanned vehicle) is moving, it can respond quickly to a calibration check instruction and complete the multi-sensor online calibration check task. Once the multi-sensor calibration result is found to be inaccurate, the navigation task can be stopped, the device can move to a safe position, and a recalibration instruction can be issued. After recalibration, the relative position relationship of the multiple sensors on the vehicle body is updated in time, promptly restoring the mobile target device's (robot's/unmanned vehicle's) ability to perceive the environment accurately. This saves considerable manpower, material, and financial resources while improving the operational stability and safety of the mobile target device.
Through the above description of the embodiments, those skilled in the art can clearly understand that the method according to the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but the former is a better implementation mode in many cases. Based on such understanding, the technical solutions of the present invention may be embodied in the form of a software product, which is stored in a storage medium (e.g., ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal device (e.g., a mobile phone, a computer, a server, or a network device) to execute the method according to the embodiments of the present invention.
Example 2
In this embodiment, a calibration apparatus for sensor parameters is further provided; the apparatus is used to implement the foregoing embodiments and preferred implementations, and details that have already been described are not repeated. As used below, the term "module" may be a combination of software and/or hardware that implements a predetermined function. Although the apparatus described in the embodiments below is preferably implemented in software, an implementation in hardware or in a combination of software and hardware is also possible and contemplated.
Fig. 3 is a block diagram of a calibration apparatus for sensor parameters according to an embodiment of the present invention, as shown in fig. 3, the apparatus includes: an acquisition module 302, configured to acquire feature data of a target device through a preset number of sensors, where the preset number of sensors are disposed on the target device; and the calibration module 304 is coupled to the acquisition module 302, and configured to perform parameter calibration on a preset number of sensors by using the characteristic data.
Optionally, the calibration module 304 in the present application may further include: a first extraction unit, configured to extract specified feature data corresponding to specified features of the target device from the feature data, where the specified feature data are acquired by any two sensors among the preset number of sensors; a first determining unit, configured to determine, from the specified feature data, description information of the specified feature, or of the plane in which it lies, in the coordinate system corresponding to each of the two sensors; and a first calibration unit, configured to calibrate the parameters of the two sensors by processing the description information with a preset algorithm.
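One common way to obtain the description information of the plane in which a specified feature lies, in a sensor's coordinate system, is a least-squares plane fit over the feature's points. The sketch below uses an SVD-based fit as an illustrative example; the patent does not prescribe this particular algorithm.

```python
import numpy as np

def fit_plane(points):
    """Least-squares plane fit via SVD: returns (unit normal n, offset d)
    such that n . x + d = 0 describes the plane of the specified feature
    in the sensor's coordinate frame."""
    pts = np.asarray(points, dtype=float)
    centroid = pts.mean(axis=0)
    # The plane normal is the direction of least variance of the
    # centered points, i.e. the last right-singular vector.
    _, _, vt = np.linalg.svd(pts - centroid)
    n = vt[-1]
    return n, -float(n @ centroid)
```

The resulting (n, d) pairs from two sensors, expressed in their own frames, then serve as inputs to the extrinsic calibration algorithm.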
Optionally, the first extraction unit includes: a first acquisition subunit, configured to acquire a group of specified feature data acquired by the two sensors when the relative positions between the two sensors and the specified features of the target device are variable; and a second acquisition subunit, configured to acquire multiple groups of specified feature data acquired by the two sensors when those relative positions are fixed, where each group of specified feature data is acquired by the two sensors at the same time.
Optionally, the calibration module 304 in this application further includes: a second extraction unit, configured to extract specified feature data corresponding to specified features of the target device from the feature data, where the specified feature data are acquired by a sensor of a predetermined type among the preset number of sensors; and a second calibration unit, configured to perform intrinsic parameter calibration of the sensor of the predetermined type according to the specified feature data.
Fig. 4 is a first structural block diagram of an alternative calibration apparatus for sensor parameters according to an embodiment of the present invention, as shown in fig. 4, the apparatus may further include: a detecting module 402, configured to detect whether a relative position between sensors in a preset number of sensors changes; and the triggering module 404 is coupled to the detecting module 402, and configured to trigger the steps of acquiring feature data of the target device by using the preset number of sensors again after detecting that the relative positions of the sensors in the preset number of sensors change, and calibrating the parameters of the preset number of sensors by using the feature data.
Optionally, the detecting module 402 in this application may further include: an acquisition unit, configured to acquire specified feature data of specified features of the target device acquired by any two sensors among the preset number of sensors, and to acquire the parameter calibration results of the two sensors; a second determining unit, configured to determine a spatial representation of the specified feature, or of the plane in which it lies, in the coordinate system of each of the two sensors; and a checking unit, configured to perform a calibration check on the parameter calibration results of the two sensors according to the spatial representations.
Example 3
Embodiments of the present invention also provide a storage medium having a computer program stored therein, wherein the computer program is arranged to perform the steps of any of the above method embodiments when executed.
Alternatively, in the present embodiment, the storage medium may be configured to store a computer program for executing the steps of:
s1, acquiring characteristic data of the target equipment by a preset number of sensors, wherein the preset number of sensors are arranged on the target equipment;
and S2, calibrating the parameters of the sensors with the characteristic data.
Optionally, in this embodiment, the storage medium may include, but is not limited to: various media capable of storing computer programs, such as a usb disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a removable hard disk, a magnetic disk, or an optical disk.
Embodiments of the present invention also provide an electronic device comprising a memory having a computer program stored therein and a processor arranged to run the computer program to perform the steps of any of the above method embodiments.
Optionally, the electronic apparatus may further include a transmission device and an input/output device, wherein the transmission device is connected to the processor, and the input/output device is connected to the processor.
Optionally, in this embodiment, the processor may be configured to execute the following steps by a computer program:
S1, acquiring the characteristic data of the target device by a preset number of sensors, wherein the preset number of sensors are arranged on the target device;
And S2, calibrating the parameters of the preset number of sensors with the characteristic data.
Optionally, the specific examples in this embodiment may refer to the examples described in the above embodiments and optional implementation manners, and this embodiment is not described herein again.
It will be apparent to those skilled in the art that the modules or steps of the present invention described above may be implemented by a general purpose computing device, they may be centralized on a single computing device or distributed across a network of multiple computing devices, and alternatively, they may be implemented by program code executable by a computing device, such that they may be stored in a storage device and executed by a computing device, and in some cases, the steps shown or described may be performed in an order different than that described herein, or they may be separately fabricated into individual integrated circuit modules, or multiple ones of them may be fabricated into a single integrated circuit module. Thus, the present invention is not limited to any specific combination of hardware and software.
The above description is only a preferred embodiment of the present invention and is not intended to limit the present invention, and various modifications and changes may be made by those skilled in the art. Any modification, equivalent replacement, or improvement made within the principle of the present invention should be included in the protection scope of the present invention.
Claims (10)
1. A calibration method for sensor parameters is characterized by comprising the following steps:
acquiring characteristic data of target equipment by a preset number of sensors, wherein the preset number of sensors are arranged on the target equipment;
calibrating the parameters of the sensors in the preset number by using the characteristic data;
wherein the method further comprises:
detecting whether the relative positions among the sensors in the preset number of sensors are changed or not;
triggering to execute the steps of acquiring the characteristic data of the target equipment by the preset number of sensors again and calibrating the parameters of the preset number of sensors by using the characteristic data after detecting that the relative positions of the sensors in the preset number of sensors are changed;
wherein, whether the relative position that detects between the sensor among the sensor of the predetermined number changes includes:
acquiring specified feature data of specified features of the target device acquired by any two sensors in the preset number of sensors, and acquiring parameter calibration results of the any two sensors;
determining a spatial representation of the specified feature or a plane in which the specified feature lies in a sensor coordinate system of each of the two arbitrary sensors;
and carrying out calibration check on the parameter calibration results of any two sensors according to the spatial representation.
2. The method of claim 1, wherein the using the characteristic data for parameter calibration of a predetermined number of sensors comprises:
extracting specified feature data corresponding to specified features of the target device from the feature data, wherein the specified feature data are acquired by any two sensors in the preset number of sensors;
determining the designated feature or the plane where the designated feature is located according to the designated feature data, and describing information of a coordinate system corresponding to each sensor in the two arbitrary sensors;
and processing the description information through a preset algorithm to calibrate the parameters of any two sensors.
3. The method according to claim 2, wherein extracting, from the feature data, specified feature data corresponding to a specified feature of the target device includes:
acquiring a plurality of groups of specified characteristic data acquired by any two sensors under the condition that the relative positions between the specified characteristics of the any two sensors and the target equipment are variable;
and under the condition that the relative positions of the any two sensors and the specified features of the target equipment are fixed, acquiring a group of specified feature data acquired by the any two sensors, wherein the group of specified feature data is acquired by the any two sensors at the same time.
4. The method of claim 1, wherein the using the characteristic data for parameter calibration of a predetermined number of sensors comprises:
extracting specified feature data corresponding to specified features of the target device from the feature data, wherein the specified feature data are acquired by a sensor of a preset type in the preset number of sensors;
and performing intrinsic parameter calibration of the sensor of the preset type according to the specified feature data.
5. A calibration device for sensor parameters is characterized by comprising:
the system comprises an acquisition module, a processing module and a control module, wherein the acquisition module is used for acquiring characteristic data of target equipment by a preset number of sensors, and the preset number of sensors are arranged on the target equipment;
the calibration module is used for calibrating the parameters of the sensors with the preset number by using the characteristic data;
wherein the apparatus further comprises:
the detection module is used for detecting whether the relative positions among the sensors in the preset number of sensors are changed or not;
the triggering module is used for triggering the collection of the characteristic data of the target equipment through the sensors in the preset number again after detecting that the relative positions of the sensors in the preset number of sensors are changed, and carrying out the step of calibrating the parameters of the sensors in the preset number by using the characteristic data;
wherein the detection module comprises:
the acquisition unit is used for acquiring specified feature data of specified features of the target device acquired by any two sensors in the preset number of sensors, and for acquiring parameter calibration results of the any two sensors;
a second determination unit, configured to determine a spatial representation of the specified feature, or the plane in which the specified feature lies, in the coordinate system of each of the any two sensors;
and the checking unit is used for carrying out calibration check on the parameter calibration results of any two sensors according to the spatial representation.
6. The apparatus of claim 5, wherein the calibration module comprises:
a first extraction unit, configured to extract specified feature data corresponding to specified features of the target device from the feature data, where the specified feature data is acquired by any two sensors in the preset number of sensors;
a first determining unit, configured to determine, according to the specified feature data, the specified feature or a plane where the specified feature is located, and description information of a coordinate system corresponding to each of the two arbitrary sensors;
and the first calibration unit is used for calibrating the parameters of any two sensors by processing the description information through a preset algorithm.
7. The apparatus of claim 6, wherein the first extraction unit comprises:
the first acquisition subunit is used for acquiring a group of specified feature data acquired by any two sensors under the condition that the relative positions between the any two sensors and the specified features of the target equipment are variable;
and the second acquisition subunit is configured to acquire, when the relative positions between the arbitrary two sensors and the specified features of the target device are fixed, multiple sets of specified feature data acquired by the arbitrary two sensors, where the set of specified feature data is acquired by the arbitrary two sensors at the same time.
8. The apparatus of claim 5, wherein the calibration module further comprises:
a second extraction unit, configured to extract specified feature data corresponding to a specified feature of the target device from the feature data, where the specified feature data is acquired by a predetermined type of sensor among the preset number of sensors;
and the second calibration unit is used for performing intrinsic parameter calibration of the sensor of the preset type according to the specified feature data.
9. A storage medium, in which a computer program is stored, wherein the computer program is arranged to perform the method of any of claims 1 to 4 when executed.
10. An electronic device comprising a memory and a processor, wherein the memory has stored therein a computer program, and wherein the processor is arranged to execute the computer program to perform the method of any of claims 1 to 4.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910760696.0A CN110470333B (en) | 2019-08-16 | 2019-08-16 | Calibration method and device of sensor parameters, storage medium and electronic device |
Publications (2)
Publication Number | Publication Date
---|---
CN110470333A (en) | 2019-11-19
CN110470333B (en) | 2022-05-24
Family
ID=68511030
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201910760696.0A Active CN110470333B (en) | 2019-08-16 | 2019-08-16 | Calibration method and device of sensor parameters, storage medium and electronic device |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN110470333B (en) |
Families Citing this family (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113256727B (en) * | 2020-02-13 | 2024-10-01 | 纳恩博(北京)科技有限公司 | Mobile device and image sensing system parameter on-line calibration and inspection method and device |
CN113252066B (en) * | 2020-02-13 | 2024-04-09 | 纳恩博(北京)科技有限公司 | Calibration method and device for parameters of odometer equipment, storage medium and electronic device |
CN113256726B (en) * | 2020-02-13 | 2024-11-22 | 纳恩博(北京)科技有限公司 | Online calibration and inspection method of sensor system of mobile device, mobile device |
CN111307195A (en) * | 2020-03-11 | 2020-06-19 | 深圳市创维电器科技有限公司 | Universal sensor calibration method, device, equipment and computer readable storage medium |
CN111427028B (en) * | 2020-03-20 | 2022-03-25 | 新石器慧通(北京)科技有限公司 | Parameter monitoring method, device, equipment and storage medium |
CN113494927A (en) * | 2020-03-20 | 2021-10-12 | 郑州宇通客车股份有限公司 | Vehicle multi-sensor calibration method and device and vehicle |
CN111693968B (en) * | 2020-05-29 | 2022-10-28 | 江苏大学 | Systematic calibration method for external parameters of vehicle-mounted three-dimensional laser radar system |
CN112135125B (en) * | 2020-10-28 | 2024-07-30 | 歌尔光学科技有限公司 | Camera internal parameter testing method, device, equipment and computer readable storage medium |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103925879A (en) * | 2014-04-24 | 2014-07-16 | 中国科学院合肥物质科学研究院 | Indoor robot vision hand-eye relation calibration method based on 3D image sensor |
WO2016195915A1 (en) * | 2015-06-02 | 2016-12-08 | Empire Technology Development Llc | Sensor degradation compensation |
JP2018047896A (en) * | 2017-10-30 | 2018-03-29 | 株式会社Subaru | Steering device |
CN109211298A (en) * | 2017-07-04 | 2019-01-15 | 百度在线网络技术(北京)有限公司 | A kind of transducer calibration method and device |
CN109270534A (en) * | 2018-05-07 | 2019-01-25 | 西安交通大学 | A kind of intelligent vehicle laser sensor and camera online calibration method |
CN109658457A (en) * | 2018-11-02 | 2019-04-19 | 浙江大学 | A kind of scaling method of laser and any relative pose relationship of camera |
CN109949371A (en) * | 2019-03-18 | 2019-06-28 | 北京智行者科技有限公司 | A kind of scaling method for laser radar and camera data |
Also Published As
Publication number | Publication date |
---|---|
CN110470333A (en) | 2019-11-19 |
Legal Events
Date | Code | Title | Description
---|---|---|---
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |
2021-02-02 | TA01 | Transfer of patent application right | Address after: Floor 16 and 17, block a, building 3, chuangyangang, Changzhou science and Education City, No.18, middle Changwu Road, Wujin District, Changzhou City, Jiangsu Province, 213000; Applicant after: NINEBOT (BEIJING) TECH Co.,Ltd. Address before: 100086 No.161, 6 / F, block B, building 1, No.38, Zhongguancun Street, Haidian District, Beijing; Applicant before: BEIJING ZHIXING MUYUAN TECHNOLOGY Co.,Ltd.
| GR01 | Patent grant | |