
CN110501036A - Calibration checking method and device for sensor parameters - Google Patents

Calibration checking method and device for sensor parameters

Info

Publication number: CN110501036A
Application number: CN201910760738.0A
Authority: CN (China)
Prior art keywords: sensors, calibration, spatial representation, sensor, coordinate system
Legal status: Pending (the status listed is an assumption and is not a legal conclusion)
Other languages: Chinese (zh)
Inventor: not disclosed (不公告发明人)
Current assignee: Ninebot Changzhou Technology Co Ltd
Original assignee: Beijing Zhixing Muyuan Technology Co Ltd
Application filed by Beijing Zhixing Muyuan Technology Co Ltd
Priority to CN201910760738.0A
Publication of CN110501036A

Classifications

    • G01D18/00: Testing or calibrating apparatus or arrangements provided for in groups G01D1/00 - G01D15/00
    • G01S7/40: Means for monitoring or calibrating (details of radar systems according to group G01S13/00)
    • G01S7/497: Means for monitoring or calibrating (details of Lidar systems according to group G01S17/00)
    • G06T7/80: Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G06T2207/10028: Range image; depth image; 3D point clouds (indexing scheme for image analysis, image acquisition modality)

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Testing Or Calibration Of Command Recording Devices (AREA)

Abstract

The present invention provides a calibration checking method and device for sensor parameters. The method comprises: acquiring feature data of a specified feature of a target device through any two sensors among a preset number of sensors, where the preset number of sensors are arranged on the target device; determining, from the feature data, a spatial representation of the specified feature, or of the plane in which it lies, in the coordinate system of each of the two sensors; and performing a calibration check on the calibration result of the parameters of the two sensors according to the spatial representations. The invention thereby solves the problem in the related art that online calibration relying on external environment information easily causes false alarms in the results of multi-sensor online calibration checking schemes.

Description

Calibration checking method and device for sensor parameters
Technical Field
The invention relates to the field of robots, in particular to a calibration checking method and device for sensor parameters.
Background
Positioning and navigation, path planning, and autonomous obstacle avoidance are hot topics in mobile-robot and autonomous-driving research, and are also key to turning mobile robots and unmanned-driving technology into products. By sensing the surrounding environment and their own state through multiple sensors such as cameras, wheel odometers, ultrasonic radars, IMUs (Inertial Measurement Units), and Lidar (Light Detection and Ranging), mobile robots and unmanned vehicles can achieve autonomous obstacle avoidance, positioning, and navigation in dynamic environments containing obstacles.
In the related art, information fusion between different types of sensors is a necessary means of improving the environment-perception capability of robots and unmanned vehicles: multi-sensor fusion provides more accurate environment information and higher safety for the mobile robot or unmanned vehicle. Multi-sensor calibration is a prerequisite of multi-sensor fusion; only when accurate intrinsic and extrinsic parameters of the sensors are available can information from different sensor types be fused well, yielding strong environment-sensing capability. However, most related technologies perform offline calibration in a laboratory or at the factory assembly stage: a specific calibration scene (a specific calibration target, a specific motion trajectory) must be set up, and the calibration process is generally complex and consumes a large amount of manpower, material resources, and financial resources. Moreover, the working scenes of mobile robots and unmanned vehicles are generally complex and changeable; the relative position relationships among the sensors change easily under external forces such as collisions and long-term mechanical vibration, and a wrong position relationship inevitably produces wrong environment-sensing information, so that the mobile robot or unmanned vehicle loses its ability to perceive the environment correctly.
To let mobile robots and unmanned vehicles cope with the complex and changeable dynamic environments of different road conditions, and to improve their environmental adaptability, multi-sensor online calibration is an effective solution. Existing online calibration schemes, however, rely mainly on feature extraction from the external environment and therefore place certain requirements on the environment information. For example, calibrating the relative position between a camera and a wheel odometer generally requires clear surroundings with strong texture information, and the wheels must travel a specific trajectory; calibrating the relative position relationship between a camera and a Lidar likewise requires spacious surroundings and strong texture information. This excessive dependence on external environment information limits the application scenarios of the online calibration method, cannot guarantee that the online calibration task completes in time, and in turn degrades any multi-sensor online calibration checking scheme built on it, easily causing false alarms about the multi-sensor calibration results.
In view of the above problems in the related art, no effective solution exists at present.
Disclosure of Invention
Embodiments of the present invention provide a calibration checking method and a calibration checking device for sensor parameters, which at least solve the problem in the related art that online calibration relying on external environment information easily causes false alarms in the results of multi-sensor online calibration checking schemes.
According to one embodiment of the invention, a calibration checking method for sensor parameters is provided, comprising: collecting feature data of a specified feature of a target device through any two sensors among a preset number of sensors, where the preset number of sensors are arranged on the target device; determining, from the feature data, a spatial representation of the specified feature, or of the plane in which it lies, in the coordinate system of each of the two sensors; and performing a calibration check on the calibration results of the parameters of the two sensors according to the spatial representations.
According to another embodiment of the present invention, a calibration checking device for sensor parameters is provided, comprising: an acquisition module configured to acquire feature data of a specified feature of a target device through any two sensors among a preset number of sensors, where the preset number of sensors are disposed on the target device; a determination module configured to determine, according to the feature data, a spatial representation of the specified feature, or of the plane in which it lies, in the coordinate system of each of the two sensors; and a checking module configured to perform a calibration check on the calibration results of the parameters of the two sensors according to the spatial representation.
According to a further embodiment of the present invention, there is also provided a storage medium having a computer program stored therein, wherein the computer program is arranged to perform the steps of any of the above method embodiments when executed.
According to yet another embodiment of the present invention, there is also provided an electronic device, including a memory in which a computer program is stored and a processor configured to execute the computer program to perform the steps in any of the above method embodiments.
According to the invention, feature data of a specified feature of the target device is acquired by any two sensors among a preset number of sensors; the spatial representation of the specified feature, or of the plane in which it lies, in the coordinate system of each of the two sensors is determined from the feature data; and the calibration result of the parameters of the two sensors is checked against the spatial representations. The calibration check of the sensors' calibration results can thus be performed without depending on external environmental factors, which solves the problem in the related art that online calibration relying on external environment information easily causes false alarms in multi-sensor online calibration checking results, and improves the timeliness of online calibration checking.
Drawings
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the invention and together with the description serve to explain the invention without limiting the invention. In the drawings:
FIG. 1 is a flow chart of a method of calibration checking of sensor parameters according to an embodiment of the present invention;
FIG. 2 is a block diagram of a calibration checking apparatus for sensor parameters according to an embodiment of the present invention.
Detailed Description
The invention will be described in detail hereinafter with reference to the accompanying drawings in conjunction with embodiments. It should be noted that the embodiments and features of the embodiments in the present application may be combined with each other without conflict.
It should be noted that the terms "first," "second," and the like in the description and claims of the present invention and in the drawings described above are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order.
Example 1
In this embodiment, a calibration checking method for sensor parameters is provided. The execution subject of the method may be a target device, i.e. an electronic device with environment-sensing capability such as a mobile robot or an unmanned vehicle, on which a preset number of sensors for sensing environment data are disposed. FIG. 1 is a flowchart of a calibration checking method for sensor parameters according to an embodiment of the present invention; as shown in FIG. 1, the flow includes the following steps:
step S102, acquiring feature data of specified features of target equipment through any two sensors in a preset number of sensors, wherein the preset number of sensors are arranged on the target equipment;
in a specific application scenario, the target device in the present application may be a robot or an unmanned vehicle, and the feature data may include, but is not limited to, the following: the calibration method comprises the following steps of determining the shape characteristics of straight lines, arcs and the like in the vehicle body profile, determining the geometric characteristics of line segment lengths, straight line included angles, arc radii and the like in the vehicle body profile, and marking targets commonly used in multi-sensor calibration such as posted checkerboards or two-dimensional codes.
Furthermore, the sensor types in the present application may include one or more of a camera, a wheel odometer, an ultrasonic radar, an IMU, a Lidar, and other sensors for sensing the surrounding environment and/or the device's own state. The parameters of the sensors may include, but are not limited to: RGB camera intrinsic parameters; extrinsic parameters between an RGB camera and a depth camera (TOF/structured light/binocular); extrinsic parameters between an RGB/depth camera and a Lidar; extrinsic parameters between an RGB/depth camera and a wheel odometer; and extrinsic parameters between an RGB/depth camera and an IMU. Note that different sensor types acquire data of different objects; if the target device is a robot, the vehicle-body features corresponding to different sensor types differ accordingly.
Step S104, determining the space representation of the designated feature or the plane of the designated feature in each sensor coordinate system of any two sensors according to the feature data;
and step S106, carrying out calibration check on the calibration results of the parameters of any two sensors according to the spatial representation.
Through steps S102 to S106, feature data of the specified feature of the target device is acquired by any two sensors among the preset number of sensors; the spatial representation of the specified feature, or of the plane in which it lies, in the coordinate system of each of the two sensors is determined from the feature data; and the calibration result of the parameters of the two sensors is checked against the spatial representations. The calibration check can thus be performed without depending on external environmental factors, which solves the problem in the related art that online calibration relying on external environment information easily causes false alarms in the results of multi-sensor online calibration checking schemes, and improves the timeliness of online calibration checking.
In actual implementation, the target device may perform the calibration check periodically, executing steps S102 to S106, where the detection period may be set by developers or customized by the user. Alternatively, steps S102 to S106 are executed when the target device receives a calibration checking instruction, which may be a signal generated when a detection button on the target device is operated, or a calibration checking instruction sent by a mobile terminal such as a mobile phone or a tablet computer; this is not specifically limited in the present application.
In an alternative embodiment of the present application, the determination in step S104 of the spatial representation of the specified feature, or of the plane in which it lies, in the coordinate system of each of the two sensors according to the feature data may be implemented as follows:
step S104-11, determining the spatial representation of the specified feature in the coordinate system of the first sensor according to the feature data to obtain a first spatial representation;
step S104-12, converting the first spatial representation into a second sensor coordinate system according to the parameter calibration result to obtain a second spatial representation;
and step S104-13, determining the space representation of the specified feature in the coordinate system of the second sensor according to the feature data to obtain a third space representation, wherein the first sensor and the second sensor form any two sensors.
Alternatively, this is achieved by:
step S104-21, determining the spatial representation of the plane of the specified feature in the coordinate system of the first sensor according to the feature data to obtain a first spatial representation;
step S104-22, converting the first space representation into a second sensor coordinate system according to the parameter calibration result to obtain a second space representation;
and step S104-23, determining the spatial representation of the plane of the specified feature in the coordinate system of the second sensor according to the feature data to obtain a third spatial representation.
As for step S104, in a specific application scenario, take the calibration check of the extrinsic parameters between an RGB camera and a Lidar as an example: after the spatial representation A (first spatial representation) of the vehicle-body feature in the camera coordinate system and the spatial representation B (third spatial representation) in the Lidar coordinate system have been acquired, representation A is converted, on the basis of the camera-Lidar extrinsic calibration result, into spatial representation C (second spatial representation) in the Lidar coordinate system.
The first, second, and third spatial representations are described below, taking the calibration check of the relative position relationship (extrinsic parameters) between an RGB camera and a Lidar as an example:
in the calibration and inspection data acquisition process, image data containing vehicle body target feature points are acquired, and the target feature points are extracted by utilizing a mature AprilTag recognition algorithm. According to the characteristic points, the relative position relation between the target coordinate system and the camera coordinate system is solved by using a classical PnP algorithm, and then a space plane equation of the charging pile target plane under the camera coordinate system can be obtained:
Acx+Bcy+Ccz+Dc=0
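For illustration, a minimal Python sketch of this step is given below. It is not the patented implementation: it assumes the AprilTag corner detections and their known target-frame coordinates are already available, and it uses OpenCV's solvePnP to recover the target pose, from which the target plane in the camera frame follows directly (the target plane is z = 0 in the target frame).

```python
import cv2
import numpy as np

def camera_frame_plane(obj_pts, img_pts, K, dist):
    """Recover the target plane (A_c, B_c, C_c, D_c) in the camera frame.

    obj_pts: Nx3 target corner coordinates in the target frame (the target
             plane is z = 0 there).
    img_pts: Nx2 detected pixel coordinates of the same corners.
    K, dist: camera intrinsic matrix and distortion coefficients.
    """
    ok, rvec, tvec = cv2.solvePnP(obj_pts, img_pts, K, dist)
    if not ok:
        raise RuntimeError("PnP failed")
    R, _ = cv2.Rodrigues(rvec)        # target -> camera rotation
    normal = R[:, 2]                  # target z-axis is the plane normal
    t = tvec.ravel()                  # target origin lies on the plane
    return np.array([normal[0], normal[1], normal[2], -normal.dot(t)])
```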
The spatial plane equation of the charging-pile target plane in the Lidar coordinate system is extracted using the PCL point-cloud plane-extraction algorithm:

$$A_l x + B_l y + C_l z + D_l = 0$$
Three-dimensional point-cloud data on the plane is then selected according to the above vehicle-body plane equation in the Lidar coordinate system; the three-dimensional coordinates of the point cloud on the vehicle-body plane are expressed as:

$$P_l = \{p_{l,i}\} = \{(x_{l,i},\, y_{l,i},\, z_{l,i})\}$$
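The Lidar-side plane can be recovered by a least-squares fit to the segmented body-plane points. The patent names the PCL plane-extraction algorithm; the SVD-based fit below is only an equivalent numpy sketch and assumes the points P_l have already been segmented from the full scan.

```python
import numpy as np

def fit_plane(points):
    """Least-squares plane fit returning (A, B, C, D) with a unit normal.

    points: Nx3 array of body-plane points (here: in the Lidar frame).
    """
    centroid = points.mean(axis=0)
    # The right singular vector with the smallest singular value is the
    # direction of least variance, i.e. the plane normal.
    _, _, vt = np.linalg.svd(points - centroid)
    normal = vt[-1]
    return np.append(normal, -normal.dot(centroid))
```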
Now, the extrinsic matrix $T_{CL}$ between the RGB camera and the Lidar is known:

$$T_{CL} = \begin{bmatrix} R_{CL} & t_{CL} \\ 0^T & 1 \end{bmatrix}$$

With it, the vehicle-body plane point-cloud data in the Lidar coordinate system can be converted into the camera coordinate system:

$$p_{c,i} = R_{CL}\, p_{l,i} + t_{CL}$$

where $R_{CL}$ and $t_{CL}$ are the rotation and translation components of $T_{CL}$. This yields the three-dimensional point-cloud data on the vehicle-body target plane in the camera coordinate system, whose three-dimensional coordinate form is expressed as:

$$P_c = \{p_{c,i}\} = \{(x_{c,i},\, y_{c,i},\, z_{c,i})\}$$
According to the formula for the distance from a point to a plane,

$$d_i = \frac{\left| A_c x_{c,i} + B_c y_{c,i} + C_c z_{c,i} + D_c \right|}{\sqrt{A_c^2 + B_c^2 + C_c^2}}$$

the distance from each point of the three-dimensional point cloud $P_c$ on the body target plane to the charging-pile target plane in the camera coordinate system can be obtained.

The distance distribution of all the points is then analysed statistically, and the check of the camera-Lidar extrinsic parameters is completed with a distance threshold obtained through experiments. Here the first spatial representation is the three-dimensional point cloud on the vehicle-body plane in the Lidar coordinate system; the third spatial representation is the plane equation of the vehicle-body plane in the camera coordinate system; and converting the three-dimensional point cloud on the vehicle-body plane from the Lidar coordinate system into the camera coordinate system according to the camera-Lidar extrinsic matrix yields the second spatial representation of the vehicle-body feature, namely the three-dimensional point cloud $P_c$ on the vehicle-body target plane in the camera coordinate system.
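Putting the pieces together, a sketch of the check itself: transform the Lidar points into the camera frame with the extrinsic matrix under test, compute their distances to the camera-frame plane, and compare against the experimentally obtained threshold. The threshold value and the mean-distance statistic are illustrative assumptions; the patent only specifies that the distance distribution is compared against an experimental threshold.

```python
import numpy as np

def check_extrinsics(P_l, plane_c, T_CL, threshold=0.02):
    """Return True if the camera-Lidar extrinsic calibration passes.

    P_l:       Nx3 body-plane point cloud in the Lidar frame.
    plane_c:   (A_c, B_c, C_c, D_c), the body plane in the camera frame.
    T_CL:      4x4 extrinsic matrix under test (taken here as Lidar -> camera).
    threshold: point-to-plane distance gate in metres (assumed value).
    """
    P_h = np.hstack([P_l, np.ones((len(P_l), 1))])   # homogeneous coordinates
    P_c = (T_CL @ P_h.T).T[:, :3]                    # points in the camera frame
    n, d = np.asarray(plane_c[:3]), plane_c[3]
    dist = np.abs(P_c @ n + d) / np.linalg.norm(n)   # point-to-plane distances
    return dist.mean() <= threshold
```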
The calibration check of the parameter calibration results of the two sensors according to the spatial representation, referred to in step S106 of the present application, may further include:
step S106-11, determining the distance between the second spatial representation and the third spatial representation;
step S106-12, determining that the calibration results of the parameters of any two sensors are correct under the condition that the distance is smaller than or equal to a preset threshold value;
reporting a message for indicating that the calibration result is correct under the condition that the calibration result is correct;
and S106-13, under the condition that the distance is larger than the preset threshold value, determining that the calibration results of the parameters of any two sensors are incorrect.
And reporting a message for indicating that the calibration result is incorrect and/or triggering the recalibration of the parameters of the preset number of sensors or the recalibration of the parameters of any two sensors in the preset number of sensors under the condition that the calibration result is incorrect.
If the reported calibration check succeeds, the calibration result is correct and no multi-sensor recalibration is needed; if it fails, the calibration result is wrong, the navigation task must be stopped immediately for self-protection, and the multiple sensors must be recalibrated. That is, during movement of the target device (mobile robot/unmanned vehicle), the calibration checking instruction can be responded to quickly to complete the multi-sensor online calibration checking task. Once the multi-sensor calibration result is found to be inaccurate, the navigation task can be stopped and a recalibration instruction issued after the device has moved to a safe position. After recalibration, the relative position relationship of the sensors on the vehicle body is updated in time, restoring accurate environment perception to the mobile target device (robot/unmanned vehicle), saving a large amount of manpower, material resources, and financial resources, and improving the operational stability and safety of the device.
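The workflow just described can be condensed into a short control-flow sketch. Every name below (run_calibration_check, move_to_safe_position, and so on) is a hypothetical illustration, not an API defined by the patent:

```python
def on_check_instruction(robot):
    """Illustrative online-check workflow: check, and on failure stop
    navigation, move somewhere safe, recalibrate, then resume."""
    if robot.run_calibration_check():        # steps S102 to S106
        robot.report("calibration result correct")
        return
    robot.report("calibration result incorrect")
    robot.stop_navigation_task()             # immediate self-protection
    robot.move_to_safe_position()
    robot.recalibrate_sensors()              # steps S108 to S110
    robot.resume_navigation_task()
```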
As for steps S106-11 to S106-13, again taking the calibration check of the extrinsic parameters between the RGB camera and the Lidar as an example: after the spatial representation A (first spatial representation) of the vehicle-body feature in the camera coordinate system and the spatial representation B (third spatial representation) in the Lidar coordinate system have been obtained, representation A is converted, on the basis of the extrinsic calibration result between the RGB camera and the Lidar, into spatial representation C (second spatial representation) in the Lidar coordinate system. An appropriate metric threshold is then set and compared with the distance between representations B and C: if the distance is less than or equal to the threshold, the calibration result is correct; otherwise the calibration result is incorrect.
In the case that the calibration result is incorrect, the method for re-calibrating the parameters of the preset number of sensors includes the following steps:
step S108, acquiring characteristic data of the target equipment by a preset number of sensors;
Here too, in alternative embodiments of the present application, the sensor types may include one or more of a camera, a wheel odometer, an ultrasonic radar, an IMU, a Lidar, and other sensors for sensing the surrounding environment. The parameters of the sensors may include, but are not limited to: RGB camera intrinsic parameters; extrinsic parameters between an RGB camera and a depth camera (TOF/structured light/binocular); extrinsic parameters between an RGB/depth camera and a Lidar; extrinsic parameters between an RGB/depth camera and a wheel odometer; and extrinsic parameters between an RGB/depth camera and an IMU. Note that different sensor types acquire data of different objects; if the target device is a robot, the vehicle-body features corresponding to different sensor types differ accordingly.
And step S110, calibrating parameters of a preset number of sensors by using the characteristic data.
The calibration of the parameters of the sensor to which the present application relates may include: at least one of an internal reference calibration of the sensor and an external reference calibration between the sensors. Specifically, the calibration of the parameters of the sensor may include: internal reference calibration of a camera type sensor, external reference calibration between a camera type sensor and any or each of the other types of sensors.
By way of example, suppose a camera, an ultrasonic radar, and an IMU are arranged on the target device. The feature data can then be used for intrinsic calibration of the camera, extrinsic calibration between the camera and the ultrasonic radar, and extrinsic calibration between the camera and the IMU.
Through steps S108 to S110, feature data of the target device is acquired by the preset number of sensors, and the parameters of those sensors are calibrated with the feature data. That is, if the target device is a robot or an unmanned vehicle, the vehicle-body features of the robot or vehicle can be collected through the sensors and the sensor parameters calibrated from them, which solves the problem in the related art that sensor-parameter calibration must rely on external environment information and therefore has poor online timeliness, and improves the efficiency of online sensor calibration.
It should be noted that, in a specific application scenario, feature data corresponding to vehicle body features are collected according to specific calibration task requirements. For example, for calibrating the camera internal parameters, only the image data containing the vehicle body characteristics of the corresponding camera needs to be acquired; for the calibration of external parameters between the camera and the 3D Lidar, the data of the vehicle body feature picture and the 3D Lidar data need to be acquired simultaneously.
In addition, if the target device is, for example, a mobile robot or an unmanned vehicle, the preset number of sensors involved in the present application need to be installed at appropriate positions on the robot or vehicle so that each of them can observe a specified feature of the vehicle body itself, where the number of specified features may be one or more. For example, the specified features are vehicle-body features of the robot or unmanned vehicle, including but not limited to: planar features such as vehicle-body planes; shape features such as straight lines and arcs in the vehicle-body outline; geometric features such as segment lengths, angles between straight lines, and arc radii in the vehicle-body outline; and calibration targets commonly used in calibrating the preset number of sensors, such as pasted checkerboards or two-dimensional codes.
It should be noted that, for scenes in which the relative position relationship between the camera and the vehicle body is fixed, more complex vehicle-body features need to be constructed to complete the calibration task, for example targets pasted on several non-coplanar vehicle-body planes. That is, in a scene where the relative position relationship between two sensors is fixed, several features on the target device can serve as the specified features corresponding to the two sensors, and during extrinsic calibration between them a single simultaneous acquisition by both sensors yields one set of specified feature data, the set containing the feature data of the specified features corresponding to both sensors.
To improve the accuracy of parameter calibration, different vehicle-body features may be used when calibrating different parameters. For example, when calibrating the relative position relationship (extrinsic parameters) between an RGB camera and a Lidar, a calibration target of known, accurate size similar to AprilTag may be pasted on the vehicle body within the RGB camera's field of view, to facilitate quick and accurate calibration of the camera intrinsics and quick and accurate determination of the relative position between the vehicle-body feature and the camera. As another example, when calibrating the relative position relationship (extrinsic parameters) between an RGB camera and an IMU, vehicle-body features other than the specified features corresponding to the RGB camera and the Lidar may be used.
Wherein, step S110 can be further implemented by:
step S110-11, extracting specified feature data corresponding to specified features of the target device from the feature data, wherein the specified feature data are acquired by any two sensors in a preset number of sensors;
optionally, in the process of calibrating the external parameters of the two sensors, the feature data of the specified features corresponding to the two sensors are extracted from the feature data to obtain specified feature data, and the extracted specified feature data is processed to calibrate the external parameters between the two sensors.
Step S110-12, determining, from the specified feature data, description information of the specified feature, or of the plane in which it lies, in the coordinate system corresponding to each of the two sensors;
among them, the above description information may be a plane equation in an alternative embodiment of the present application.
And S110-13, calibrating parameters of any two sensors by processing the description information through a preset algorithm.
Wherein the preset algorithm corresponds to the two sensors involved in the step S104-13; the preset algorithms adopted for external reference calibration between sensors of different combinations may be different, and the corresponding relationship is set by developers.
As can be seen from steps S110-11 to S110-13, in the present application the calibration of parameters among the preset number of sensors can be implemented based on specified feature data of the target device. In the example below, the target device is a mobile robot or an unmanned vehicle and the specified feature is a vehicle-body plane (in other scenarios other vehicle-body features, such as straight-line features or circular-hole features, may be used); the two sensors are an RGB camera and a Lidar; and the parameter to be calibrated is their relative position relationship (extrinsic parameters). Steps S110-11 to S110-13 are illustrated as follows:
First, to facilitate quick and accurate calibration of the RGB camera intrinsics and quick and accurate computation of the relative position relationship between the vehicle-body feature and the camera, a calibration target of known, accurate size, similar to AprilTag, can be pasted on the vehicle body within the camera's field of view, and the target feature points extracted with the AprilTag recognition algorithm. If the thickness of the target is ignored, the vehicle-body plane is the target plane, the target being a component of the vehicle-body feature. From the feature points, the relative position relationship between the target coordinate system and the camera coordinate system is solved with the PnP algorithm, which yields the spatial plane equation of the vehicle-body plane in the camera coordinate system:

$$A_c x + B_c y + C_c z + D_c = 0$$
The spatial plane equation of the vehicle-body plane in the Lidar coordinate system is extracted with the PCL point-cloud plane-extraction algorithm:

$$A_l x + B_l y + C_l z + D_l = 0$$
where $A$, $B$, $C$, and $D$ are the coefficients of a plane equation and $x$, $y$, $z$ are the coordinate variables of a three-dimensional spatial point; the subscript $c$ (as in $A_c, B_c, C_c$) denotes the plane-equation representation in the camera coordinate system, and the subscript $l$ (as in $A_l, B_l, C_l$) denotes the plane-equation representation in the Lidar coordinate system.
Each vehicle-body plane forms a set of three-dimensional plane constraints in the camera coordinate system and the Lidar coordinate system respectively. Normalizing the plane equations gives the unit plane normals and the distances from the coordinate-system origins to the plane:

$$n_c^T p = d_c, \quad n_c = \frac{(A_c, B_c, C_c)^T}{\left\|(A_c, B_c, C_c)\right\|}, \quad d_c = \frac{-D_c}{\left\|(A_c, B_c, C_c)\right\|}$$

$$n_l^T p = d_l, \quad n_l = \frac{(A_l, B_l, C_l)^T}{\left\|(A_l, B_l, C_l)\right\|}, \quad d_l = \frac{-D_l}{\left\|(A_l, B_l, C_l)\right\|}$$

where $n_c$ is the unit normal vector of the target plane in the camera coordinate system, $n_l$ the unit normal vector of the target plane in the Lidar coordinate system, $d_c$ the distance from the camera-frame origin to the plane, and $d_l$ the distance from the Lidar-frame origin to the plane.
Before the PnP algorithm is used to solve the relative position relationship between the target plane and the camera coordinate system, the camera intrinsics must first be calibrated with a camera-calibration algorithm, giving the camera intrinsic matrix:

$$K = \begin{bmatrix} f_x & 0 & c_x \\ 0 & f_y & c_y \\ 0 & 0 & 1 \end{bmatrix}$$

where $f_x$ is the normalized focal length in the sensor's horizontal direction, $f_y$ is the normalized focal length in the vertical direction, and $(c_x, c_y)$ are the pixel coordinates of the principal point, i.e. the intersection of the camera's optical axis with the image plane, in units of pixels.
As derived above, we now have the unit normal vector $n_c$ of the target plane in the camera coordinate system and the distance $d_c$ from the camera-frame origin to the plane, together with the unit normal vector $n_l$ of the target plane in the Lidar coordinate system and the distance $d_l$ from the Lidar-frame origin to the plane. The relative position relationship between the RGB camera and the Lidar to be calibrated is represented by the transformation matrix $T$:

$$T = \begin{bmatrix} R_{CL} & t_{CL} \\ 0^T & 1 \end{bmatrix}$$

where $R_{CL}$ is the rotation matrix of the change from the camera coordinate system to the Lidar coordinate system, whose elements $r_{ij}$ ($0 \le i, j \le 2$) form 3 rows and 3 columns, and $t_{CL}$ is the translation vector of the same change, whose elements $t_i$ ($0 \le i \le 2$) form 3 rows and 1 column.
According to the rotation of planes in three-dimensional space, the following correspondence can be obtained:

$$n_l = R_{CL}\, n_c$$

Writing the plane normal vectors as

$$n_l = (l_1, l_2, l_3)^T, \quad n_c = (c_1, c_2, c_3)^T$$

(in a three-dimensional coordinate system a plane normal is a three-dimensional vector with three parameters; $l_1, l_2, l_3$ are the parameters of the plane normal in the Lidar coordinate system and $c_1, c_2, c_3$ those in the camera coordinate system), we obtain:

$$\begin{bmatrix} l_1 \\ l_2 \\ l_3 \end{bmatrix} = \begin{bmatrix} r_{00} & r_{01} & r_{02} \\ r_{10} & r_{11} & r_{12} \\ r_{20} & r_{21} & r_{22} \end{bmatrix} \begin{bmatrix} c_1 \\ c_2 \\ c_3 \end{bmatrix}$$

The above is the constraint relation, in the camera coordinate system and the Lidar coordinate system, on the normal vector of one target plane observed by one sensor pair (camera + Lidar).
When multiple vehicle-body features are arranged on N non-coplanar vehicle-body planes (N an integer greater than 1), one constraint relation can be obtained for each vehicle-body plane by the method above, yielding N sets of constraints. Alternatively, when the relative positions between the two sensors and the specified feature of the target device are variable, the target device acquires feature data of one specified feature N successive times with the two sensors during its movement, obtaining N sets of specified feature data (each set acquired by the two sensors simultaneously), and each set is processed by the method above to yield N sets of constraints. In both cases, N sets of such constraint equations are obtained:

$$n_{l,i} = R_{CL}\, n_{c,i}, \quad i = 1, \ldots, N$$

from which the following objective function is obtained:

$$\min_{R_{CL}} \sum_{i=1}^{N} \left\| n_{l,i} - R_{CL}\, n_{c,i} \right\|^2$$
r _ CL is an orthogonal matrix, and satisfies the following orthogonal matrix properties:
RTR=I3,amd det(R)=1
From the above orthogonal-matrix properties, the equivalent objective function can be obtained as follows:

$$\max_{R_{CL}} \sum_{i=1}^{N} n_{l,i}^T\, R_{CL}\, n_{c,i}$$

On this basis, the rotation matrix $R_{CL}$ can be obtained according to the solution algorithm of the orthogonal Procrustes problem.
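A compact sketch of this step, assuming the paired unit normals are stacked row-wise into N x 3 arrays. The SVD recipe below is the standard solution of the orthogonal Procrustes problem, shown only to illustrate the step the patent names, with the direction convention n_l = R n_c used above.

```python
import numpy as np

def solve_rotation(N_c, N_l):
    """Rotation R minimising sum_i ||n_l,i - R n_c,i||^2.

    N_c, N_l: N x 3 arrays of paired unit plane normals in the camera
    and Lidar frames respectively.
    """
    H = N_l.T @ N_c                                   # 3x3 correlation matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(U @ Vt))                # enforce det(R) = +1
    return U @ np.diag([1.0, 1.0, d]) @ Vt
```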
In addition, from the principle of distance correspondence between points and planes, given the pre-transformation unit plane normal $n_c$ and distance $d_c$, together with the transformation $(R_{CL}, t_{CL})$, the distance from the coordinate-system origin to the plane after the transformation is:

$$d_l' = d_c + n_l^T\, t_{CL}, \quad n_l = R_{CL}\, n_c$$

Theoretically the following equality should hold:

$$d_l' = d_l$$

However, because errors exist in the actual measurement process, the theoretically computed distance and the actually measured distance are not exactly equal, so the following objective optimization function can be constructed:

$$\min_{t_{CL}} \sum_{i=1}^{N} \left( d_{l,i} - d_{c,i} - n_{l,i}^T\, t_{CL} \right)^2$$

Therefore, the translation vector $t_{CL}$ in the transformation matrix can be solved with the Levenberg-Marquardt algorithm.
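Because the objective above is linear in t_CL it also has a closed-form least-squares solution; the sketch below follows the patent's stated choice of Levenberg-Marquardt via scipy's least_squares with method='lm'. The variable names and the residual convention follow the reconstruction above and are assumptions.

```python
import numpy as np
from scipy.optimize import least_squares

def solve_translation(N_l, d_l, d_c):
    """Translation t minimising sum_i (d_l,i - d_c,i - n_l,i . t)^2.

    N_l:      N x 3 array of transformed unit normals (rows n_l,i).
    d_l, d_c: length-N arrays of origin-to-plane distances.
    """
    residual = lambda t: d_l - d_c - N_l @ t
    return least_squares(residual, x0=np.zeros(3), method="lm").x
```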
Further, the manner of extracting the specified feature data corresponding to the specified feature of the target device from the feature data involved in the above step S110-11 includes:
in the method (1), under the condition that the relative position between any two sensors and the designated feature of the target device is variable, multiple groups of designated feature data which are acquired by any two sensors in sequence aiming at one vehicle body feature are acquired, wherein each group of designated feature data is acquired by any two sensors at the same time; further, the plurality of sets of specified characteristic data are used to determine a parameter between the two sensors.
In method (2), when the relative positions between the two sensors and the specified features of the target device are fixed, one set of specified feature data acquired by the two sensors is obtained, the set comprising feature data of several vehicle-body features that are not coplanar; this set of specified feature data is then used to determine the parameters between the two sensors.
Note that in method (1), to acquire the multiple sets of specified feature data, the target device needs to change pose, for example by moving or rotating, so that the relative positions between the two sensors and the specified feature differ between acquisitions.
It should be noted that, in a specific application scenario, i.e. in the entire calibration algorithm, at least five sets of constraint relations of the target plane observed by the sensor pair (camera + Lidar) in the camera coordinate system and the Lidar coordinate system are required to complete the solution of the rotation matrix $R_{CL}$ and the translation vector. Of course, because sensor measurement errors are inevitable in the real world, N should be greater than 5 to obtain higher joint-calibration accuracy; taking calibration efficiency into account, N is generally set to 20.
In addition, for scenes in which the relative position relationship between the preset number of sensors and the vehicle-body features can change, for example when the sensors are installed on the handlebar of a scooter, multiple observations of the sensors' pose relative to the feature can easily be obtained by rotating the handlebar; in such a scene the vehicle-body features need only include one target plane. For scenes in which the relative position relationship between the sensors and the vehicle-body features is fixed, however, the vehicle-body features must include at least five target planes, and more target-plane features are needed to meet high-precision requirements.
Therefore, through the above embodiments of the application, the online check of multi-sensor calibration results can be completed using only the features of the target device (such as a robot or an unmanned vehicle). The calibration results can be checked anytime and anywhere without depending on external environment information, which improves the efficiency of multi-sensor online calibration checking, guarantees the success rate and timeliness of the checking task, allows the correctness of the sensors' relative position relationships to be verified at any time, and effectively safeguards the subsequent positioning, navigation, and obstacle-avoidance functions.
Through the above description of the embodiments, those skilled in the art can clearly understand that the method according to the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but the former is a better implementation mode in many cases. Based on such understanding, the technical solutions of the present invention may be embodied in the form of a software product, which is stored in a storage medium (e.g., ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal device (e.g., a mobile phone, a computer, a server, or a network device) to execute the method according to the embodiments of the present invention.
Example 2
In this embodiment, a calibration checking device for sensor parameters is further provided, and the device is used to implement the foregoing embodiments and preferred embodiments, which have already been described and are not described again. As used below, the term "module" may be a combination of software and/or hardware that implements a predetermined function. Although the means described in the embodiments below are preferably implemented in software, an implementation in hardware, or a combination of software and hardware is also possible and contemplated.
FIG. 2 is a block diagram of a calibration checking apparatus for sensor parameters according to an embodiment of the present invention. As shown in FIG. 2, the apparatus includes: an acquisition module 22 configured to acquire feature data of a specified feature of the target device through any two sensors among a preset number of sensors, where the preset number of sensors are arranged on the target device; a determining module 24 configured to determine, according to the feature data, a spatial representation of the specified feature, or of the plane in which it lies, in the coordinate system of each of the two sensors; and a checking module 26 configured to perform a calibration check on the calibration results of the parameters of the two sensors according to the spatial representation.
Optionally, the determining module 24 in the present application may further include: the first determining unit is used for determining the spatial representation of the specified feature in the coordinate system of the first sensor according to the feature data to obtain a first spatial representation; the conversion unit is used for converting the first space representation into a second sensor coordinate system according to the parameter calibration result to obtain a second space representation; a second determination unit for determining a spatial representation of the specified feature in the coordinate system of the second sensor from the feature data to obtain a third spatial representation; or,
the first determining unit is used for determining the spatial representation of the plane where the specified feature is located in the coordinate system of the first sensor according to the feature data to obtain a first spatial representation; the conversion unit is used for converting the first space representation into a second sensor coordinate system according to the parameter calibration result to obtain a second space representation; the second determining unit is used for determining the spatial representation of the plane where the specified feature is located in the coordinate system of the second sensor according to the feature data to obtain a third spatial representation; wherein, the first sensor and the second sensor constitute any two sensors.
Optionally, the inspection module 26 in the present application may further include: a third determination unit for determining a distance between the second spatial representation and a third spatial representation; the fourth determining unit is used for determining that the calibration results of the parameters of any two sensors are correct under the condition that the distance is smaller than or equal to the preset threshold value; and the fifth determining unit is used for determining that the calibration results of the parameters of any two sensors are incorrect under the condition that the distance is greater than the preset threshold value.
Optionally, the apparatus in the present application may further include: the reporting module is used for reporting a message for indicating that the calibration result is correct under the condition that the calibration result is correct; and the processing module is used for reporting a message for indicating that the calibration result is incorrect and/or triggering the recalibration of the parameters of the preset number of sensors under the condition that the calibration result is incorrect.
Optionally, the processing module in this application may further include: the acquisition unit is used for acquiring the characteristic data of the target equipment by a preset number of sensors; and the calibration unit is used for calibrating the parameters of the sensors in the preset number by using the characteristic data.
Wherein, calibration unit includes: the extraction subunit is used for extracting specified feature data corresponding to specified features of the target device from the feature data, wherein the specified feature data are acquired by any two sensors in a preset number of sensors; the determining subunit is used for determining the designated features or the planes of the designated features according to the designated feature data, and describing information of a coordinate system corresponding to each sensor in any two sensors; and the calibration subunit is used for calibrating the parameters of any two sensors by processing the description information through a preset algorithm.
It should be noted that, the above modules may be implemented by software or hardware, and for the latter, the following may be implemented, but not limited to: the modules are all positioned in the same processor; alternatively, the modules are respectively located in different processors in any combination.
Example 3
Embodiments of the present invention also provide a storage medium having a computer program stored therein, wherein the computer program is arranged to perform the steps of any of the above method embodiments when executed.
Alternatively, in the present embodiment, the storage medium may be configured to store a computer program for executing the steps of:
s1, acquiring feature data of the designated features of the target equipment through any two sensors in a preset number of sensors, wherein the preset number of sensors are arranged on the target equipment;
s2, determining the space representation of the designated feature or the plane of the designated feature in the coordinate system of each sensor in any two sensors according to the feature data;
and S3, performing calibration check on the calibration results of the parameters of any two sensors according to the spatial representation.
Optionally, in this embodiment, the storage medium may include, but is not limited to, various media capable of storing a computer program, such as a USB flash disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a removable hard disk, a magnetic disk, or an optical disk.
Embodiments of the present invention also provide an electronic device comprising a memory having a computer program stored therein and a processor arranged to run the computer program to perform the steps of any of the above method embodiments.
Optionally, the electronic apparatus may further include a transmission device and an input/output device, wherein the transmission device is connected to the processor, and the input/output device is connected to the processor.
Optionally, in this embodiment, the processor may be configured to execute the following steps by a computer program:
s1, acquiring feature data of the designated features of the target equipment through any two sensors in a preset number of sensors, wherein the preset number of sensors are arranged on the target equipment;
s2, determining the space representation of the designated feature or the plane of the designated feature in the coordinate system of each sensor in any two sensors according to the feature data;
and S3, performing calibration check on the calibration results of the parameters of any two sensors according to the spatial representation.
Optionally, the specific examples in this embodiment may refer to the examples described in the above embodiments and optional implementation manners, and this embodiment is not described herein again.
It will be apparent to those skilled in the art that the modules or steps of the present invention described above may be implemented by a general purpose computing device, they may be centralized on a single computing device or distributed across a network of multiple computing devices, and alternatively, they may be implemented by program code executable by a computing device, such that they may be stored in a storage device and executed by a computing device, and in some cases, the steps shown or described may be performed in an order different than that described herein, or they may be separately fabricated into individual integrated circuit modules, or multiple ones of them may be fabricated into a single integrated circuit module. Thus, the present invention is not limited to any specific combination of hardware and software.
The above description is only a preferred embodiment of the present invention and is not intended to limit the present invention, and various modifications and changes may be made by those skilled in the art. Any modification, equivalent replacement, or improvement made within the principle of the present invention should be included in the protection scope of the present invention.

Claims (14)

1. A calibration checking method for sensor parameters is characterized by comprising the following steps:
acquiring feature data of specified features of target equipment through any two sensors in a preset number of sensors, wherein the preset number of sensors are arranged on the target equipment;
determining a spatial representation of the specified feature or a plane in which the specified feature is located in each of the two arbitrary sensors in a sensor coordinate system according to the feature data;
and carrying out calibration check on the calibration results of the parameters of any two sensors according to the spatial representation.
2. The method of claim 1, wherein said determining from said feature data a spatial representation of said specified feature or a plane in which said specified feature lies in a respective sensor coordinate system of said two arbitrary sensors comprises:
determining a spatial representation of the specified feature in a coordinate system of a first sensor according to the feature data to obtain a first spatial representation; converting the first spatial representation into a second sensor coordinate system according to the parameter calibration result to obtain a second spatial representation; determining a spatial representation of the specified feature in a coordinate system of a second sensor according to the feature data to obtain a third spatial representation; or,
determining the spatial representation of the plane of the specified feature in the coordinate system of the first sensor according to the feature data to obtain a first spatial representation; converting the first spatial representation into a second sensor coordinate system according to the parameter calibration result to obtain a second spatial representation; determining the spatial representation of the plane of the specified feature in the coordinate system of the second sensor according to the feature data to obtain a third spatial representation;
wherein the first sensor and the second sensor constitute the arbitrary two sensors.
3. The method according to claim 2, wherein performing the calibration check on the parameter calibration results of the two sensors according to the spatial representations comprises:
determining a distance between the second spatial representation and the third spatial representation;
determining that the parameter calibration results of the two sensors are correct when the distance is less than or equal to a preset threshold; and
determining that the parameter calibration results of the two sensors are incorrect when the distance is greater than the preset threshold.
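Claim 3 leaves the distance metric open; one plausible choice for plane representations compares the angle between the normals and the gap between the offsets, each against its own threshold. The threshold values below are placeholders, not values taken from the patent.

    import numpy as np

    def plane_distance(n2, d2, n3, d3):
        # (n, d) and (-n, -d) describe the same plane; align the signs first.
        if n2.dot(n3) < 0:
            n3, d3 = -n3, -d3
        angle = np.arccos(np.clip(n2.dot(n3), -1.0, 1.0))  # radians
        offset = abs(d2 - d3)                              # same unit as d
        return angle, offset

    def calibration_correct(n2, d2, n3, d3, max_angle=0.02, max_offset=0.05):
        angle, offset = plane_distance(n2, d2, n3, d3)
        return angle <= max_angle and offset <= max_offset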
4. The method according to claim 3, further comprising:
reporting a message indicating that the calibration results are correct when the calibration results are correct; and
when the calibration results are incorrect, reporting a message indicating that the calibration results are incorrect, and/or triggering recalibration of the parameters of the preset number of sensors, and/or triggering recalibration of the parameters of the two sensors.
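Claim 4 allows the failure actions to be combined freely; a trivial sketch of one such dispatch, with all callables (report, recalibrate_all, recalibrate_pair) supplied by the caller and purely illustrative:

    def handle_check_result(correct, report, recalibrate_all, recalibrate_pair):
        # On success only a message is reported; on failure this sketch
        # performs every permitted action, though any subset would satisfy
        # the "and/or" wording of the claim.
        if correct:
            report("calibration correct")
        else:
            report("calibration incorrect")
            recalibrate_all()    # recalibrate the preset number of sensors
            recalibrate_pair()   # recalibrate the two sensors that failed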
5. The method according to claim 4, wherein calibrating the parameters of the preset number of sensors comprises:
acquiring feature data of the target device through the preset number of sensors; and
calibrating the parameters of the preset number of sensors using the feature data.
6. The method according to claim 5, wherein using the feature data to calibrate the parameters of the preset number of sensors comprises:
extracting, from the feature data, specified feature data corresponding to the specified feature of the target device, wherein the specified feature data are acquired by any two sensors among the preset number of sensors;
determining, according to the specified feature data, description information of the specified feature, or of the plane in which the specified feature lies, in the coordinate system corresponding to each of the two sensors; and
processing the description information through a preset algorithm to calibrate the parameters of the two sensors.
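As one concrete instance of the "preset algorithm" of claim 6 (the claim does not fix the algorithm), the extrinsics between the two sensors can be re-estimated from several plane correspondences: the rotation from paired normals by a Kabsch-style SVD, then the translation from the plane offsets by least squares. At least three planes with non-parallel normals are needed, and all names are assumptions of this sketch.

    import numpy as np

    def calibrate_from_planes(planes_s1, planes_s2):
        # planes_s*: lists of (n, d) pairs for the same physical planes,
        # each pair expressed in its own sensor's coordinate system.
        N1 = np.array([n for n, _ in planes_s1])
        N2 = np.array([n for n, _ in planes_s2])
        # Rotation: minimise sum ||R n1_i - n2_i||^2 (Kabsch algorithm).
        U, _, Vt = np.linalg.svd(N1.T @ N2)
        R = (U @ Vt).T
        if np.linalg.det(R) < 0:   # enforce a proper rotation, not a reflection
            U[:, -1] *= -1
            R = (U @ Vt).T
        # Translation: d2_i = d1_i - (R n1_i).t  =>  (R n1_i).t = d1_i - d2_i.
        A = N1 @ R.T               # rows are R n1_i
        b = np.array([d1 - d2 for (_, d1), (_, d2) in zip(planes_s1, planes_s2)])
        t, *_ = np.linalg.lstsq(A, b, rcond=None)
        return R, t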
7. A calibration checking apparatus for sensor parameters, comprising:
an acquisition module, configured to acquire feature data of a specified feature of a target device through any two sensors among a preset number of sensors, wherein the preset number of sensors are arranged on the target device;
a determination module, configured to determine, according to the feature data, a spatial representation of the specified feature, or of a plane in which the specified feature lies, in the respective sensor coordinate system of each of the two sensors; and
a checking module, configured to perform a calibration check on the parameter calibration results of the two sensors according to the spatial representations.
8. The apparatus according to claim 7, wherein the determination module comprises:
a first determining unit, configured to determine a spatial representation of the specified feature in the coordinate system of a first sensor according to the feature data, to obtain a first spatial representation; a conversion unit, configured to convert the first spatial representation into the coordinate system of a second sensor according to the parameter calibration result, to obtain a second spatial representation; and a second determining unit, configured to determine a spatial representation of the specified feature in the coordinate system of the second sensor according to the feature data, to obtain a third spatial representation; or
the first determining unit is configured to determine a spatial representation of the plane in which the specified feature lies in the coordinate system of the first sensor according to the feature data, to obtain a first spatial representation; the conversion unit is configured to convert the first spatial representation into the coordinate system of the second sensor according to the parameter calibration result, to obtain a second spatial representation; and the second determining unit is configured to determine a spatial representation of the plane in which the specified feature lies in the coordinate system of the second sensor according to the feature data, to obtain a third spatial representation;
wherein the first sensor and the second sensor are the two sensors.
9. The apparatus according to claim 8, wherein the checking module comprises:
a third determining unit, configured to determine a distance between the second spatial representation and the third spatial representation;
a fourth determining unit, configured to determine that the parameter calibration results of the two sensors are correct when the distance is less than or equal to a preset threshold; and
a fifth determining unit, configured to determine that the parameter calibration results of the two sensors are incorrect when the distance is greater than the preset threshold.
10. The apparatus according to claim 9, further comprising:
a reporting module, configured to report a message indicating that the calibration results are correct when the calibration results are correct; and
a processing module, configured to, when the calibration results are incorrect, report a message indicating that the calibration results are incorrect and/or trigger recalibration of the parameters of the preset number of sensors.
11. The apparatus according to claim 10, wherein the processing module comprises:
an acquisition unit, configured to acquire feature data of the target device through the preset number of sensors; and
a calibration unit, configured to calibrate the parameters of the preset number of sensors using the feature data.
12. The apparatus according to claim 11, wherein the calibration unit comprises:
an extraction subunit, configured to extract, from the feature data, specified feature data corresponding to the specified feature of the target device, wherein the specified feature data are acquired by any two sensors among the preset number of sensors;
a determining subunit, configured to determine, according to the specified feature data, description information of the specified feature, or of the plane in which the specified feature lies, in the coordinate system corresponding to each of the two sensors; and
a calibration subunit, configured to calibrate the parameters of the two sensors by processing the description information through a preset algorithm.
13. A storage medium having a computer program stored therein, wherein the computer program is arranged to perform, when executed, the method according to any one of claims 1 to 6.
14. An electronic device, comprising a memory and a processor, wherein the memory stores a computer program and the processor is arranged to execute the computer program to perform the method according to any one of claims 1 to 6.
CN201910760738.0A 2019-08-16 2019-08-16 The calibration inspection method and device of sensor parameters Pending CN110501036A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910760738.0A CN110501036A (en) 2019-08-16 2019-08-16 The calibration inspection method and device of sensor parameters

Publications (1)

Publication Number Publication Date
CN110501036A true CN110501036A (en) 2019-11-26

Family ID: 68588168

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910760738.0A Pending CN110501036A (en) 2019-08-16 2019-08-16 The calibration inspection method and device of sensor parameters

Country Status (1)

Country Link
CN (1) CN110501036A (en)

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103925879A (en) * 2014-04-24 2014-07-16 中国科学院合肥物质科学研究院 Indoor robot vision hand-eye relation calibration method based on 3D image sensor
CN108020825A (en) * 2016-11-03 2018-05-11 岭纬公司 Laser radar, Laser video camera head, the fusion calibration system of video camera and method
CN106840242A (en) * 2017-01-23 2017-06-13 驭势科技(北京)有限公司 The sensor self-checking system and multi-sensor fusion system of a kind of intelligent driving automobile
CN108226883A (en) * 2017-11-28 2018-06-29 深圳市易成自动驾驶技术有限公司 Test the method, apparatus and computer readable storage medium of millimetre-wave radar performance
KR20190072734A (en) * 2017-12-18 2019-06-26 전자부품연구원 Calibration Method for Integrating a Single Three-dimensional Coordinate System of Multiple Sensors
CN109270534A (en) * 2018-05-07 2019-01-25 西安交通大学 A kind of intelligent vehicle laser sensor and camera online calibration method
CN109781163A (en) * 2018-12-18 2019-05-21 北京百度网讯科技有限公司 Calibrating parameters validity check method, apparatus, equipment and storage medium
CN109920011A (en) * 2019-05-16 2019-06-21 长沙智能驾驶研究院有限公司 External parameter calibration method, device and equipment for lidar and binocular camera

Cited By (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113252066B (en) * 2020-02-13 2024-04-09 纳恩博(北京)科技有限公司 Calibration method and device for parameters of odometer equipment, storage medium and electronic device
CN113256726B (en) * 2020-02-13 2024-11-22 纳恩博(北京)科技有限公司 Online calibration and inspection method of sensor system of mobile device, mobile device
CN113256726A (en) * 2020-02-13 2021-08-13 纳恩博(北京)科技有限公司 Online calibration and inspection method for sensing system of mobile device and mobile device
CN113256727A (en) * 2020-02-13 2021-08-13 纳恩博(北京)科技有限公司 Mobile device and method and device for online parameter calibration and inspection of image sensing system
CN113252066A (en) * 2020-02-13 2021-08-13 纳恩博(北京)科技有限公司 Method and device for calibrating parameters of odometer equipment, storage medium and electronic device
CN113256727B (en) * 2020-02-13 2024-10-01 纳恩博(北京)科技有限公司 Mobile device and image sensing system parameter on-line calibration and inspection method and device
CN111366911A (en) * 2020-03-05 2020-07-03 三一机器人科技有限公司 Method and device for calibrating positioning consistency of multiple AGV (automatic guided vehicle) and electronic terminal
CN111429521A (en) * 2020-03-05 2020-07-17 深圳市镭神智能系统有限公司 External parameter calibration method, device, medium and electronic equipment for camera and laser radar
CN111427028A (en) * 2020-03-20 2020-07-17 新石器慧通(北京)科技有限公司 Parameter monitoring method, device, equipment and storage medium
CN111427028B (en) * 2020-03-20 2022-03-25 新石器慧通(北京)科技有限公司 Parameter monitoring method, device, equipment and storage medium
CN113494927A (en) * 2020-03-20 2021-10-12 郑州宇通客车股份有限公司 Vehicle multi-sensor calibration method and device and vehicle
CN111443337B (en) * 2020-03-27 2022-03-08 北京航空航天大学 Radar-IMU calibration method based on hand-eye calibration
CN111443337A (en) * 2020-03-27 2020-07-24 北京航空航天大学 Radar-IMU calibration method based on hand-eye calibration
CN113532499A (en) * 2021-07-15 2021-10-22 中国科学院深圳先进技术研究院 Sensor security detection method, device and storage medium for unmanned system
WO2023283987A1 (en) * 2021-07-15 2023-01-19 中国科学院深圳先进技术研究院 Sensor security detection method and device for unmanned system, and storage medium
CN114415671A (en) * 2021-12-28 2022-04-29 上海擎朗智能科技有限公司 Method for detecting whether sensor of robot fails or not and robot
CN114623856A (en) * 2022-02-08 2022-06-14 武汉路特斯汽车有限公司 Multi-sensor off-line composite calibration system and method
CN117554937A (en) * 2024-01-08 2024-02-13 安徽中科星驰自动驾驶技术有限公司 Error-controllable laser radar and combined inertial navigation external parameter calibration method and system
CN117554937B (en) * 2024-01-08 2024-04-26 安徽中科星驰自动驾驶技术有限公司 Error-controllable laser radar and combined inertial navigation external parameter calibration method and system
CN118623748A (en) * 2024-08-09 2024-09-10 深圳市志奋领科技有限公司 Inductive sensor multi-distance sensing calibration method, system and related equipment

Similar Documents

Publication Publication Date Title
CN110501036A (en) The calibration inspection method and device of sensor parameters
CN110470333B (en) Calibration method and device of sensor parameters, storage medium and electronic device
CN109345596B (en) Multi-sensor calibration method, device, computer equipment, medium and vehicle
CN111121754A (en) Mobile robot positioning and navigation method, device, mobile robot and storage medium
CN107741234A A vision-based offline map construction and localization method
CN111856499B (en) Map construction method and device based on laser radar
CN110850859B (en) Robot and obstacle avoidance method and obstacle avoidance system thereof
CN112051591A (en) Detection method and related device for laser radar and inertial measurement unit
CN113137968B (en) Repositioning method and repositioning device based on multi-sensor fusion and electronic equipment
KR102694715B1 (en) Method for detecting obstacle, electronic device, roadside device and cloud control platform
CN110782531A (en) Method and computing device for processing three-dimensional point cloud data
WO2023283929A1 (en) Method and apparatus for calibrating external parameters of binocular camera
CN111709988A (en) Method and device for determining characteristic information of object, electronic equipment and storage medium
KR20200032776A (en) System for information fusion among multiple sensor platforms
WO2022217988A1 (en) Sensor configuration scheme determination method and apparatus, computer device, storage medium, and program
CN112597946B (en) Obstacle representation method, device, electronic device and readable storage medium
CN113093759A (en) Robot formation construction method and system based on multi-sensor information fusion
CN115100287B (en) External parameter calibration method and robot
CN116184430A (en) Pose estimation algorithm fused by laser radar, visible light camera and inertial measurement unit
WO2022088613A1 (en) Robot positioning method and apparatus, device and storage medium
US20210156710A1 (en) Map processing method, device, and computer-readable storage medium
CN112381873A (en) Data labeling method and device
CN111708046A (en) Method and device for processing plane data of obstacle, electronic equipment and storage medium
CN117687006A (en) External parameter calibration method from laser radar to inertial measurement unit and device thereof
CN116958452A (en) Three-dimensional reconstruction method and system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20210205

Address after: Floors 16 and 17, Block A, Building 3, Chuangyangang, Changzhou Science and Education City, No. 18 Middle Changwu Road, Wujin District, Changzhou City, Jiangsu Province, 213000

Applicant after: NINEBOT (CHANGZHOU) TECH Co.,Ltd.

Address before: No. 161, 6/F, Block B, Building 1, No. 38 Zhongguancun Street, Haidian District, Beijing 100086

Applicant before: BEIJING ZHIXING MUYUAN TECHNOLOGY Co.,Ltd.

RJ01 Rejection of invention patent application after publication

Application publication date: 20191126