
CN112541950A - Method and device for calibrating the external parameters of a depth camera

Info

Publication number: CN112541950A
Application number: CN201910892567.7A
Other versions: CN112541950B (granted)
Authority: CN (China)
Other languages: Chinese (zh)
Prior art keywords: plane, coordinate system, camera, calibration, points
Inventors: 李建禹, 龙学雄
Current assignee / original assignee: Hangzhou Hikrobot Technology Co Ltd
Legal status: Granted; Active
Application filed by Hangzhou Hikrobot Technology Co Ltd, with priority to CN201910892567.7A

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/80 Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G06T7/85 Stereo camera calibration


Abstract

The present application discloses a method for calibrating the external parameters of a depth camera. The method includes: acquiring depth image data, the depth image data including pixel coordinates and depth values; based on the acquired depth image data, converting the pixels of the depth image into spatial three-dimensional points in the camera coordinate system; based on the three-dimensional points, obtaining a fitted plane of the three-dimensional points; and, according to the current pose relationship between the camera coordinate system and a calibration plane that is parallel to or coincident with the fitted plane, obtaining the parameters between the depth camera and the calibration plane, where the calibration plane includes any current plane parallel or perpendicular to the bearing surface of the mobile robot body on which the depth camera is mounted. The calibration requires no external tools; a single captured image is enough to produce the calibration result, and naturally horizontal and vertical surfaces such as the ground and walls can be used, providing a robust, real-time basis for applications built on depth image data.

Description

A method and device for calibrating the external parameters of a depth camera

Technical field

The present invention relates to the field of machine vision, and in particular to a method for calibrating the external parameters of a depth camera.

Background

In image measurement and machine vision applications, determining the relationship between the three-dimensional geometric position of a point on the surface of an object in space and its corresponding point in the image requires a geometric model of camera imaging; the parameters of this geometric model are the camera parameters. Camera calibration is the process of computing these camera parameters by some method.

During image acquisition, the pose of the camera is not fixed, so the camera coordinate system is not a stable coordinate system: its origin and the directions of its axes change as the camera moves. This requires introducing a stable, invariant coordinate system, the world coordinate system, which is an absolute coordinate system. The transformation from the camera coordinate system to the world coordinate system consists of a rotation and a translation; this rotation and translation are the external parameters (extrinsics) of the camera.

A depth camera is a camera that can directly obtain depth image data. Each pixel of the obtained depth image encodes, at its pixel coordinates and through its gray value, the distance (depth value) of the corresponding spatial point. A pixel can be written as p(u, v, d), where u and v are the coordinates of the pixel in the image coordinate system and d is the depth value of the spatial three-dimensional point corresponding to that pixel.

By construction principle, depth cameras include passive binocular cameras, active binocular cameras, time-of-flight (TOF) cameras, monocular structured-light cameras, and so on. Current external-parameter calibration of depth cameras usually calibrates parameters internal to the depth camera; for example, for a depth camera built from a binocular pair, the relative rotation and translation between the two cameras are calibrated. Such calibration methods are not suitable for calibrating the extrinsic relationship between the depth camera and an external coordinate system.

Summary of the invention

The present invention provides a method for calibrating the external parameters of a depth camera, so as to calibrate, based on depth image data, the transformation relationship between the camera coordinate system and a calibration plane.

The method for calibrating the external parameters of a depth camera provided by the present invention comprises:

acquiring depth image data, the depth image data including pixel coordinates and depth values;

based on the acquired depth image data, converting the pixels in the depth image into spatial three-dimensional points in the camera coordinate system;

based on the three-dimensional points, obtaining a fitted plane of the three-dimensional points;

according to the current pose relationship between the camera coordinate system and a calibration plane that is parallel to or coincident with the fitted plane, obtaining the parameters between the depth camera and the calibration plane;

wherein the calibration plane includes any current plane parallel or perpendicular to the bearing surface of the mobile robot body on which the depth camera is mounted.

Preferably, the depth image data comprises as few as a single depth image, and converting the pixels in the depth image into spatial three-dimensional points in the camera coordinate system includes:

obtaining the three-dimensional point coordinates corresponding to the pixels according to the geometric model that maps spatial three-dimensional points in the camera coordinate system to the two-dimensional image in the image coordinate system.

Preferably, obtaining the three-dimensional point coordinates corresponding to the pixels according to the geometric mapping model between spatial three-dimensional points in the camera coordinate system and the two-dimensional image in the image coordinate system includes:

for any pixel in the depth image,

taking the depth value of the pixel as the z coordinate of the three-dimensional point;

obtaining a first difference between the pixel x coordinate and the x-direction offset, multiplying the first difference by the ratio of the z coordinate to the x-direction focal length in the camera intrinsics, and taking the result as the x coordinate of the three-dimensional point, where the x-direction offset is the offset of the camera coordinate system origin relative to the image coordinate system in the x direction;

obtaining a second difference between the pixel y coordinate and the y-direction offset, multiplying the second difference by the ratio of the z coordinate to the y-direction focal length in the camera intrinsics, and taking the result as the y coordinate of the three-dimensional point, where the y-direction offset is the offset of the camera coordinate system origin relative to the image coordinate system in the y direction;

where the x-direction and y-direction offsets of the camera coordinate system origin relative to the image coordinate system are obtained from the camera intrinsics; and

converting all pixels in the depth image into three-dimensional point coordinates to obtain a set of three-dimensional points.

Preferably, the method further includes screening the three-dimensional points in the set according to a screening strategy to obtain a screened set of three-dimensional points, where the screening strategy includes any one of the following conditions or any combination thereof:

(1) according to the camera orientation, removing three-dimensional points within a certain range at the top of the depth image;

(2) when an initial estimate of the external parameters is available, transforming the three-dimensional points from the camera coordinate system into the world coordinate system through the camera external parameters, and removing three-dimensional points whose height along the height direction exceeds a preset height threshold;

(3) for a binocular stereo vision depth camera, removing three-dimensional points whose depth value exceeds a preset depth threshold;

(4) for a time-of-flight (TOF) depth camera, removing three-dimensional points whose distance difference to neighboring pixels exceeds a preset distance threshold.

Preferably, obtaining the fitted plane of the three-dimensional points based on the three-dimensional points includes:

obtaining the fitted plane equation of the three-dimensional points from the screened set of three-dimensional points using the random sample consensus (RANSAC) algorithm, where the number of three-dimensional points is greater than or equal to 3.

Preferably, obtaining the fitted plane equation of the three-dimensional points using the RANSAC algorithm includes:

randomly selecting from the screened set of three-dimensional points to obtain a current subset of randomly selected three-dimensional points, the subset containing at least 3 three-dimensional points;

obtaining a fitted plane estimate of the current subset based on the three-dimensional points in the subset;

according to the obtained fitted plane estimate, obtaining the degree of fit of all three-dimensional points in the screened set with respect to the fitted plane estimate;

if the degree of fit is insufficient, returning to the step of randomly selecting from the screened set of three-dimensional points;

if the degree of fit is reached, solving the fitted plane equation using the inliers of the fitted plane estimate with the best degree of fit;

where the inliers include the three-dimensional points in the screened set whose distance to the best-fitting plane is less than a preset distance threshold.

Preferably, obtaining the fitted plane estimate of the subset based on the three-dimensional points in the current subset includes:

if the number of three-dimensional points in the current subset equals 3, substituting the point coordinates into the fitted plane equation and solving for the unknowns to obtain the fitted plane estimate of the current subset;

if the number of three-dimensional points in the current subset is greater than 3, substituting the point coordinates into the fitted plane equation and solving for the unknowns by the least squares method to obtain the fitted plane estimate of the current subset.

Preferably, obtaining, according to the obtained fitted plane estimate, the degree of fit of all three-dimensional points in the screened set with respect to the fitted plane estimate includes:

computing the distance from each three-dimensional point in the screened set to the fitted plane estimate,

taking the three-dimensional points whose computed distance is less than a set distance threshold as inliers,

counting the number of inliers,

computing the ratio of the number of inliers to the screened three-dimensional point cloud to obtain the inlier rate, and

determining the degree of fit according to the inlier rate;

solving the fitted plane equation using the inliers of the fitted plane estimate with the best degree of fit includes: taking the fitted plane estimate with the highest inlier rate as the best fitted plane estimate, and re-solving the unknowns in the fitted plane equation by the least squares method using the inliers of the best fitted plane estimate, to obtain the fitted plane equation.

Preferably, determining the degree of fit according to the inlier rate includes:

determining the degree of fit according to the number of iterations, where the number of iterations satisfies:

$$i \geq \frac{\log(1-\eta)}{\log\!\left(1-\varepsilon^{m}\right)}$$

where m is the number of three-dimensional points in the subset, η is the set confidence level, and ε is the inlier rate, taken as the worst-case proportion of inliers, or set to the worst-case proportion in the initial state and updated to the current maximum inlier rate as the iterations proceed.

Preferably, determining the degree of fit according to the inlier rate includes determining the degree of fit according to whether the probability that v subsets are all inliers satisfies a set confidence level, where the probability that v subsets are all inliers is:

$$p(v,\lambda) = \frac{\lambda^{v}}{v!}\,e^{-\lambda}$$

where λ is the expected number of selections in the current iterations whose subsets are all inliers.

Preferably, obtaining the parameters between the depth camera and the calibration plane according to the current pose relationship between the camera coordinate system and the calibration plane that is parallel to or coincident with the fitted plane includes:

obtaining the fitted plane normal vector from the fitted plane equation,

obtaining, from the fitted plane normal vector, the equation of a calibration plane that is parallel to or coincident with the fitted plane,

computing the distance from the origin of the camera coordinate system to the calibration plane to obtain the distance transformation of the camera coordinate system relative to the calibration plane,

computing the rotation of the camera coordinate system relative to a front view of the calibration plane to obtain the rotation transformation of the camera coordinate system relative to the calibration plane, and

taking the distance transformation and the rotation transformation as the parameters between the depth camera and the calibration plane.

Preferably, the depth image data includes image data of the bearing surface of the mobile robot body, and

obtaining, from the fitted plane normal vector, the equation of the calibration plane associated with the fitted plane includes:

obtaining, from the fitted plane normal vector, the equation of a calibration plane parallel to the fitted plane at a set distance, where the calibration plane is the ground bearing the mobile robot body in the world coordinate system;

computing the distance from the origin of the camera coordinate system to the calibration plane to obtain the distance transformation of the camera coordinate system relative to the calibration plane includes:

computing, from the equation parameters of the calibration plane, the distance from the camera coordinate system origin to the calibration plane to obtain the height transformation of the camera coordinate system relative to the calibration plane;

computing the rotation of the camera coordinate system relative to a front view of the calibration plane to obtain the rotation transformation of the camera coordinate system relative to the calibration plane includes:

computing, from the equation parameters of the calibration plane, the rotation of the camera coordinate system y-axis relative to the fitted plane normal vector to obtain the rotation transformation of the camera coordinate system relative to the calibration plane.

Preferably, the depth image data includes image data of a vertical surface perpendicular to the bearing surface of the mobile robot body, and

obtaining, from the fitted plane normal vector, the equation of the calibration plane associated with the fitted plane includes:

obtaining, from the normal vector of the fitted plane, the equation of a calibration plane parallel to the fitted plane at a set distance, where the calibration plane is perpendicular to the ground bearing the mobile robot body in the world coordinate system;

computing the distance from the origin of the camera coordinate system to the calibration plane to obtain the distance transformation of the camera coordinate system relative to the calibration plane includes:

computing, from the equation parameters of the calibration plane, the distance from the camera coordinate system origin to the calibration plane to obtain the distance transformation from the camera coordinate system to the calibration plane;

computing the rotation of the camera coordinate system relative to a front view of the calibration plane to obtain the rotation transformation of the camera coordinate system relative to the calibration plane includes:

computing, from the equation parameters of the calibration plane, the rotation of the camera coordinate system z-axis relative to the calibration plane normal vector to obtain the rotation transformation of the camera coordinate system relative to the calibration plane.

The present invention provides an electronic device for depth camera extrinsic calibration, the electronic device including a memory and a processor, wherein

the memory is used to store a computer program;

the processor is used to execute the program stored in the memory to implement any of the above methods for calibrating the external parameters of a depth camera.

The present invention further provides a computer-readable storage medium storing a computer program which, when executed by a processor, performs any of the above methods for calibrating the external parameters of a depth camera.

In the method for calibrating the external parameters of a depth camera provided by the present application, a plane is fitted to the spatial three-dimensional points in the camera coordinate system corresponding to the depth image data, and the distance transformation and rotation transformation between the camera coordinate system and the calibration plane, that is, the external parameters of the camera relative to the calibration plane, are obtained from the current relative pose between the camera coordinate system and the fitted plane. The calibration process requires no external tools in advance, and a single captured image is enough to produce the calibration result; thus, whenever an image is captured, calibration can be performed in real time against naturally horizontal or vertical surfaces such as the ground or a wall. The method is simple to use, real-time, and widely applicable, providing a robust, real-time basis for applications built on depth image data.

Description of the drawings

Fig. 1 is a schematic diagram of the principle of depth camera extrinsic calibration based on depth image data.

Fig. 2 is a schematic flowchart of depth camera extrinsic calibration according to an embodiment of the present application.

Fig. 3 is a schematic diagram of calibration using the fitted plane.

Fig. 4 is a schematic diagram of calibration using a second fitted plane (a wall) perpendicular to the fitted plane.

Fig. 5 is a schematic diagram of the data associations in the calibration method according to an embodiment of the present application.

Fig. 6 is a schematic diagram of the calibration device according to an embodiment of the present application.

Detailed description

In order to make the objectives, technical means, and advantages of the present application clearer, the present application is described in further detail below with reference to the accompanying drawings.

Referring to Fig. 1, Fig. 1 is a schematic diagram of the principle of depth camera extrinsic calibration based on depth image data. A depth camera mounted on a mobile robot body acquires depth image data; one rendering of the depth image is shown in the figure, where the depth image region corresponding to the physical ground (drawn with a thick solid line) is the dashed box in the image. From the depth image data of the ground, a fitted plane Ax+By+Cz+D=0 is solved; this fitted plane can be regarded as representing the plane in which the ground lies. Since the world coordinate system is usually referenced to the ground, the plane of its x and y axes is parallel to the ground plane, the z axis is perpendicular to that plane, and the translation trajectory of the camera (its x, y coordinate changes) lies within one plane. Therefore, calibrating the rotation and height (distance) of the camera relative to the ground is equivalent to solving the rotation and height (distance) of the camera relative to the fitted plane, which yields the rotation transformation and z-axis transformation of the camera relative to the world coordinate system, and hence the parameters of the imaging model between the ground depth image and the physical ground plane (horizontal plane). More generally, the imaging model parameters between the depth image and any desired physical plane parallel or perpendicular to the bearing surface on which the mobile robot body stands, that is, the imaging model parameters between the depth image and the calibration plane, reflect an extrinsic relationship between the depth camera and an external coordinate system; they can all be regarded as external parameters of the depth camera and are referred to in this application simply as the depth camera extrinsics.

Based on the above calibration principle, the depth camera extrinsic calibration of the present application converts a depth image captured by the depth camera into three-dimensional points in the camera coordinate system, fits a plane associated with the calibration plane to those points, and obtains the external parameters of the depth camera from the current pose relationship between the fitted plane and the depth camera.

Referring to Fig. 2, Fig. 2 is a schematic flowchart of depth camera extrinsic calibration according to an embodiment of the present application.

Step 201: acquire any depth image data output by the depth camera and project the depth image into a three-dimensional point cloud, that is, a set of three-dimensional points, using the camera intrinsics.

Since, according to the calibration principle, a plane associated with the calibration plane can be fitted from the three-dimensional points of a depth image and the external parameters of the depth camera can be obtained from the current pose relationship between the fitted plane and the depth camera, as few as one depth image output by the depth camera is needed. For any pixel p(u, v, d) on the depth image, u and v are the coordinates of the pixel in the image coordinate system and d is the gray value of the pixel, which is also the depth value of the corresponding three-dimensional point.

According to a geometric model mapping three-dimensional points to the two-dimensional image plane (in pixels), for example the pinhole model, the relation that converts a pixel in the depth image into a three-dimensional point pc(x, y, z) in the camera coordinate system is obtained:

Take the depth value of the pixel as the z coordinate of the three-dimensional point; obtain the first difference between the pixel x coordinate and the x-direction offset, where the x-direction offset is the offset of the camera coordinate system origin (the camera optical center) relative to the image coordinate system in the x direction; multiply the ratio of the z coordinate to the x-direction focal length in the camera intrinsics by the first difference, and take the result as the x coordinate of the three-dimensional point.

Obtain the second difference between the pixel y coordinate and the y-direction offset, where the y-direction offset is the offset of the camera coordinate system origin relative to the image coordinate system in the y direction; multiply the ratio of the z coordinate to the y-direction focal length in the camera intrinsics by the second difference, and take the result as the y coordinate of the three-dimensional point,

where the x-direction and y-direction offsets of the camera coordinate system origin relative to the image coordinate system are obtained from the camera intrinsics.

Expressed mathematically:

$$z = d,\qquad x = \frac{(u - c_x)\,z}{f_x},\qquad y = \frac{(v - c_y)\,z}{f_y}$$

where $f_x$ and $f_y$ are the camera focal lengths and $c_x$ and $c_y$ are the offsets of the camera coordinate system origin relative to the image coordinate system; these parameters correspond to the entries of the camera intrinsic matrix K:

$$K = \begin{bmatrix} f_x & 0 & c_x \\ 0 & f_y & c_y \\ 0 & 0 & 1 \end{bmatrix}$$

Following the above method, every pixel in the depth image can be converted into a three-dimensional point in the camera coordinate system; the resulting set of three-dimensional points is also called a point cloud.
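By way of illustration, the following is a minimal NumPy sketch of this conversion (function and variable names are illustrative assumptions, not from the patent); pixels with no depth reading simply map to the origin and can be discarded during the screening step below:

```python
import numpy as np

def depth_to_point_cloud(depth: np.ndarray, fx: float, fy: float,
                         cx: float, cy: float) -> np.ndarray:
    """Convert a depth image (H x W, metric depth values) into an (H*W, 3) array of
    points in the camera coordinate system, using the pinhole relations above."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))  # pixel coordinates
    z = depth.astype(np.float64)                    # z = d
    x = (u - cx) * z / fx                           # x = (u - cx) * z / fx
    y = (v - cy) * z / fy                           # y = (v - cy) * z / fy
    return np.stack([x, y, z], axis=-1).reshape(-1, 3)
```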

Step 202: screen the three-dimensional point cloud and remove three-dimensional points with large errors, so as to improve the accuracy and success rate of obtaining the fitted plane.

Taking the ground as an example, in this step the three-dimensional point cloud computed in step 201 is screened in order to remove, as a first pass, three-dimensional points that are obviously not on the ground and three-dimensional points with relatively large errors.

The point cloud screening strategy can add screening conditions and criteria for different usage scenarios. Several screening conditions are given here; each has its own applicable situations, and in practice one of the following conditions or any combination of them may be used (a combined sketch follows the list):

(1) Remove three-dimensional points within a certain range at the top of the depth image, according to the camera orientation.

For example, when the camera is close to level, the three-dimensional points of the upper half of the depth image cannot lie on the ground, so half the image height can be used as a threshold and the three-dimensional points converted from pixels in the upper half of the depth image can be removed.

(2) When an initial estimate of the external parameters is available, the three-dimensional points in the camera coordinate system can be transformed into the world coordinate system using the camera extrinsics, and points whose height along the height direction exceeds a preset height threshold can be removed.

(3) If the depth camera is based on binocular stereo vision, then, owing to the inherent characteristics of binocular measurement, depth accuracy decreases as distance increases. In other words, the depth accuracy of distant three-dimensional points deteriorates, so a distance threshold along the depth direction (a depth threshold) can be set and points beyond that threshold removed.

(4) If the depth camera is based on the TOF method, then, owing to the characteristics of TOF sensing, the camera is relatively susceptible to multipath reflections and the depth image may contain many flying pixels; points whose distance difference to neighboring pixels exceeds a preset distance threshold can then be removed.

Step 203: obtain the fitted plane from the screened three-dimensional point cloud.

In this step, in one embodiment, the random sample consensus (RANSAC) algorithm is used to obtain the fitted plane. Specifically, this includes:

Step 2031: randomly select from the screened three-dimensional point cloud to obtain a subset of randomly selected three-dimensional points, where the number of points equals the number of unknowns determining the fitted plane. The fitted plane equation can be written as Ax+By+Cz+D=0, which can be rearranged as ax+by+cz=1;

it therefore contains three unknowns a, b, c, so the subset contains at least 3 three-dimensional points.

Step 2032: generate the fitted plane estimate of the subset from its three-dimensional points: substitute the selected points into the plane equation and solve for the unknowns of the fitted plane estimate, obtaining the estimate of the fitted plane.

If the number m of three-dimensional points in the selected subset is greater than the number of unknowns, an overdetermined system of equations has to be solved; regression analysis can be used to solve for the unknowns of the fitted plane estimate, for example the linear least squares method, as follows:

Rearranging the fitted plane equation as ax+by+cz=1 gives:

$$\begin{bmatrix} x_1 & y_1 & z_1 \\ x_2 & y_2 & z_2 \\ \vdots & \vdots & \vdots \\ x_m & y_m & z_m \end{bmatrix} \begin{bmatrix} a \\ b \\ c \end{bmatrix} = \begin{bmatrix} 1 \\ 1 \\ \vdots \\ 1 \end{bmatrix}$$

When the coefficient matrix of this system has full rank, that is, when

$$A^{\mathsf{T}} A \quad\text{with}\quad A = \begin{bmatrix} x_1 & y_1 & z_1 \\ \vdots & \vdots & \vdots \\ x_m & y_m & z_m \end{bmatrix}$$

is non-singular, the unknowns a, b and c have a unique (least squares) solution, where $x_m$, $y_m$, $z_m$ are the coordinates of the three-dimensional points.
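A minimal NumPy sketch of this least squares solve (the function name is illustrative; the form ax + by + cz = 1 assumes the plane does not pass through the camera origin):

```python
def fit_plane_least_squares(points: np.ndarray) -> np.ndarray:
    """Solve a*x + b*y + c*z = 1 in the least squares sense for m >= 3 points.
    Returns the plane coefficients (a, b, c)."""
    rhs = np.ones(len(points))
    coeffs, *_ = np.linalg.lstsq(points, rhs, rcond=None)  # points is the m x 3 matrix above
    return coeffs
```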

Step 2033: based on the fitted plane estimate, obtain the degree of fit of all screened three-dimensional points with respect to that estimate.

One embodiment is to compute the inlier rate of the fitted plane estimate and determine the degree of fit from the inlier rate, specifically:

compute the distance from each point in the screened three-dimensional point cloud to the fitted plane estimate,

take the points whose computed distance is less than a set distance threshold as inliers,

count the number of inliers, and

compute the ratio of the number of inliers to the screened three-dimensional point cloud to obtain the inlier rate; the larger the ratio, that is, the higher the inlier rate, the better the degree of fit and the better the fitted plane estimate.

Step 2034: judge whether the degree of fit is satisfied; if so, go to step 2035; otherwise, return to step 2031 to randomly select a new subset for plane estimation, thereby running an estimate-and-verify loop.

In this step, in a first embodiment, judge whether the inlier rate satisfies a preset condition.

In a second embodiment, the goal is that, with confidence η, at least one random selection during the iterative loop picks m points that are all inliers, which makes it likely that the best fitted plane estimate is obtained at least once during the loop. Therefore, the number of iterations i should satisfy the following condition:

$$i \geq \frac{\log(1-\eta)}{\log\!\left(1-\varepsilon^{m}\right)}$$

where m is the size of the subset, that is, the number of three-dimensional points in it; the confidence is usually set in the range 0.95 to 0.99. ε is the inlier rate; in general ε is unknown, so the worst-case proportion of inliers can be used, or ε can be set to the worst-case proportion in the initial state and then continuously updated to the current maximum inlier rate as the iterations proceed.
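This iteration bound can be evaluated directly; the following is a small illustrative helper (names and default values are assumptions):

```python
import math

def ransac_iterations(inlier_rate: float, m: int = 3, confidence: float = 0.99) -> int:
    """Smallest i such that, with probability `confidence`, at least one of i random
    subsets of m points consists only of inliers."""
    if inlier_rate >= 1.0:
        return 1
    return math.ceil(math.log(1.0 - confidence) / math.log(1.0 - inlier_rate ** m))

# e.g. with a 60% inlier rate and subsets of 3 points, about 19 iterations suffice at 99% confidence
```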

In a third embodiment, judge whether the probability that subsets are all inliers satisfies the required confidence. Specifically, each selected subset is regarded as having one of two outcomes, "all inliers" or "not all inliers", following a binomial distribution, where the probability of the former is p = ε^m. When p is small enough, this can be treated as a Poisson distribution; therefore, over i loops, the probability that there are v subsets that are "all inliers" can be expressed as:

$$p(v,\lambda) = \frac{\lambda^{v}}{v!}\,e^{-\lambda}$$

where λ denotes the expected number of selections over the i loops whose subsets are all inliers.

For example, suppose the probability that none of the subsets selected over these i iterations is all inliers should be smaller than a certain confidence bound, that is, p(0, λ) = e^(-λ) < 1 - η. Taking a confidence of 95% as an example, λ is approximately equal to 3, meaning that at 95% confidence, on average 3 "good" subsets can be selected over the i loops.
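A one-line check of the λ value quoted above for a 95% confidence level:

$$p(0,\lambda) = e^{-\lambda} < 1-\eta = 0.05 \;\Longrightarrow\; \lambda > \ln 20 \approx 3.0$$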

Step 2035: solve the fitted plane equation using the inliers of the best fitted plane estimate;

specifically, based on the inliers of the fitted plane estimate with the highest inlier rate, re-solve the plane fit by the least squares method to obtain the final fitted plane equation.
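Putting steps 2031 to 2035 together, the following is a minimal RANSAC plane-fitting sketch that reuses the helper functions sketched earlier; the distance threshold, confidence and iteration cap are illustrative assumptions, not values from the patent:

```python
def ransac_plane(points: np.ndarray, dist_thresh: float = 0.02,
                 confidence: float = 0.99, max_iters: int = 1000) -> np.ndarray:
    """Fit a plane a*x + b*y + c*z = 1 to a screened point cloud with RANSAC."""
    rng = np.random.default_rng()
    best_rate, best_inliers, needed = -1.0, None, max_iters
    i = 0
    while i < min(needed, max_iters):
        subset = points[rng.choice(len(points), size=3, replace=False)]   # step 2031
        coeffs = fit_plane_least_squares(subset)                          # step 2032
        dist = np.abs(points @ coeffs - 1.0) / (np.linalg.norm(coeffs) + 1e-12)
        inliers = dist < dist_thresh                                      # step 2033
        rate = inliers.mean()
        if rate > best_rate:                                              # step 2034
            best_rate, best_inliers = rate, inliers
            needed = ransac_iterations(max(rate, 1e-3), m=3, confidence=confidence)
        i += 1
    return fit_plane_least_squares(points[best_inliers])                  # step 2035
```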

Step 204: from the fitted plane parameters, compute the distance from the camera coordinate system origin to the fitted plane and the rotation of the camera.

Referring to Fig. 3, Fig. 3 is a schematic diagram of calibration using the fitted plane as the calibration plane; in the figure the symbol × denotes the vector cross product, and the fitted plane is the ground bearing the mobile robot body in the world coordinate system. The distance from the camera coordinate system origin to the fitted plane is the height transformation of the camera, and the rotation of the camera coordinate system relative to a front view of the fitted plane corresponds to the rotation of the camera coordinate system y-axis relative to the fitted plane normal vector. Hence, from the fitted plane equation Ax+By+Cz+D=0:

the normal vector of the fitted plane (the ground) is n = (A, B, C);

the distance from the camera coordinate system to the fitted plane is

$$h = \frac{|D|}{\sqrt{A^{2}+B^{2}+C^{2}}}$$

the rotation transformation of the camera coordinate system is the rotation that takes the camera y-axis y = (0, 1, 0) onto the normal n, which can be written in axis-angle form with axis $\frac{y \times n}{\lVert y \times n\rVert}$ and angle $\arccos\!\left(\frac{y \cdot n}{\lVert y\rVert\,\lVert n\rVert}\right)$;

where the symbol × denotes the vector cross product.
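A sketch of how the height and rotation could be computed from the fitted ground-plane coefficients; the axis-angle (Rodrigues) construction is one standard way to realize the rotation between the camera y-axis and the plane normal described above, and the function name is an assumption. Note that the coefficients (a, b, c) of the ax + by + cz = 1 form used in the RANSAC sketch correspond to (A, B, C, D) = (a, b, c, -1):

```python
def extrinsics_from_ground_plane(plane: np.ndarray) -> tuple[float, np.ndarray]:
    """Given ground-plane coefficients (A, B, C, D) with Ax + By + Cz + D = 0 in the
    camera frame, return the camera height above the plane and a rotation matrix
    aligning the camera y-axis with the plane normal."""
    A, B, C, D = plane
    n = np.array([A, B, C], dtype=float)
    height = abs(D) / np.linalg.norm(n)            # distance from camera origin to plane
    n /= np.linalg.norm(n)

    y = np.array([0.0, 1.0, 0.0])                  # camera y-axis
    axis = np.cross(y, n)
    s, c = np.linalg.norm(axis), float(np.dot(y, n))
    if s < 1e-9:                                   # already aligned, or exactly opposite
        return height, (np.eye(3) if c > 0 else np.diag([1.0, -1.0, -1.0]))
    axis /= s
    K = np.array([[0, -axis[2], axis[1]],
                  [axis[2], 0, -axis[0]],
                  [-axis[1], axis[0], 0]])
    R = np.eye(3) + s * K + (1 - c) * (K @ K)      # Rodrigues' rotation formula
    return height, R
```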

The above embodiment can be extended: from the normal vector of the fitted plane, the equation of any calibration plane parallel to the fitted plane at a set distance from it can be obtained, and from that the distance transformation and rotation transformation of the camera coordinate system relative to the calibration plane, that is, the parameters between the depth camera and the calibration plane.

If the depth image relates to a vertical surface perpendicular to the bearing surface of the mobile robot body (for example, a wall), that is, a depth image of a wall has been acquired, calibration can be performed against that vertical surface. Referring to Fig. 4, Fig. 4 is a schematic diagram of calibration with the fitted plane being a vertical surface; in the figure the normal vector n points opposite to the direction in which the camera coordinate system faces the calibration plane, hence -n. The distance from the camera coordinate system origin to the wall is the distance transformation of the camera, and the rotation of the camera coordinate system to a front view of the first calibration plane corresponds to the rotation of the camera coordinate system z-axis relative to the wall normal vector. Hence, from the calibration plane equation A'x+B'y+C'z+D'=0:

the calibration plane normal vector is -n = (A', B', C');

the distance from the camera coordinate system to the calibration plane is

$$h' = \frac{|D'|}{\sqrt{A'^{2}+B'^{2}+C'^{2}}}$$

the rotation transformation of the camera coordinate system is the rotation that takes the camera z-axis z = (0, 0, 1) onto the normal -n, which can be written in axis-angle form with axis $\frac{z \times (-n)}{\lVert z \times (-n)\rVert}$ and angle $\arccos\!\left(\frac{z \cdot (-n)}{\lVert z\rVert\,\lVert n\rVert}\right)$;

where the symbol × denotes the vector cross product.

The first calibration plane equation can be obtained from the depth image through steps 201-203.

The above embodiment can likewise be extended: from the normal vector of the fitted plane, the equation of any calibration plane parallel to the fitted plane at a set distance from it can be obtained, and from that the distance transformation and rotation transformation of the camera coordinate system relative to the calibration plane, that is, the parameters between the depth camera and the calibration plane.

According to the depth camera extrinsic calibration method proposed in this application, and depending on the requirement, when the obtained depth image contains the bearing surface of the mobile robot body, calibration can be performed against the ground plane, and when the obtained depth image contains image data of a wall perpendicular to the bearing surface of the mobile robot body, calibration can be performed against that wall, without any external tools. There is no requirement on the working principle of the depth camera itself, so the method applies to extrinsic calibration of depth cameras based on binocular, structured-light, or TOF principles. The calibration method is simple to operate, accurate, and robust; a single depth image suffices, the computational complexity is low, and calibration can be carried out in real time. With the calibrated extrinsics, three-dimensional information in the camera coordinate system can be transformed into the world coordinate system, making operations such as ground removal and three-dimensional obstacle perception straightforward, which gives the method strong practical value.
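As a usage sketch of the ground-removal application mentioned above, points can be thresholded directly by their distance to the calibrated ground plane (equivalently, they could first be rotated into the ground-aligned frame with R and thresholded on the height coordinate); the tolerance is an illustrative assumption:

```python
def remove_ground(points_cam: np.ndarray, plane: np.ndarray,
                  ground_tol: float = 0.03) -> np.ndarray:
    """Keep only points farther than `ground_tol` (metres) from the calibrated
    ground plane (A, B, C, D), i.e. candidate obstacle points."""
    normal, d = plane[:3], plane[3]
    dist = np.abs(points_cam @ normal + d) / np.linalg.norm(normal)
    return points_cam[dist > ground_tol]
```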

Referring to Fig. 5, Fig. 5 is a schematic diagram of the data associations in the calibration method of an embodiment of the present application. From the depth image data, the pixels in the depth image are converted into spatial three-dimensional points in the camera coordinate system; to improve the accuracy and success rate of obtaining the fitted plane, the spatial three-dimensional points can be screened. When the RANSAC algorithm is used, three-dimensional points are randomly drawn as a subset and a fitted plane estimate is obtained from that subset; over the algorithm's many randomly drawn subsets and their fitted plane estimates, the estimate with the largest inlier rate is selected as the best fitted plane estimate, and the fitted plane is then recomputed from the inliers of the best estimate. If the fitted plane is used as the calibration plane, the height transformation and rotation transformation of the camera coordinate system relative to the fitted plane are computed from the fitted plane.

Referring to Fig. 6, Fig. 6 is a schematic diagram of the calibration device according to an embodiment of the present application. The device includes:

a depth image acquisition module, which acquires depth image data, the depth image data including pixel coordinates and depth values;

a conversion module, which, based on the acquired depth image data, converts the pixels in the depth image into spatial three-dimensional points in the camera coordinate system;

a fitted plane acquisition module, which obtains the fitted plane of the three-dimensional points based on the three-dimensional points; and

a transformation calculation module, which obtains the parameters between the depth camera and the calibration plane according to the current pose relationship between the camera coordinate system and a calibration plane parallel to or coincident with the fitted plane;

where the calibration plane includes any current plane parallel or perpendicular to the bearing surface of the mobile robot body on which the depth camera is mounted.

The device further includes a screening module, which screens the three-dimensional points in the set according to the screening strategy to obtain a screened set of three-dimensional points.

The fitted plane acquisition module includes:

a random selection module, which randomly selects from the screened set of three-dimensional points to obtain a current subset of randomly selected points, the subset containing at least 3 three-dimensional points;

a fitted plane estimation module, which obtains the fitted plane estimate of the current subset based on the three-dimensional points in the subset;

a degree-of-fit acquisition module, which, according to the obtained fitted plane estimate, obtains the degree of fit of all three-dimensional points in the screened set with respect to the fitted plane estimate;

if the degree of fit is insufficient, control returns to the random selection module;

if the degree of fit is reached, control passes to the re-solving module;

where the inliers include the three-dimensional points in the screened set whose distance to the best-fitting plane is less than a preset distance threshold; and

a re-solving module, which solves the fitted plane equation using the inliers of the fitted plane estimate with the best degree of fit.

The fitted plane estimation module is further configured such that, if the number of three-dimensional points in the current subset equals 3, the point coordinates are substituted into the fitted plane equation and the unknowns are solved to obtain the fitted plane estimate of the current subset;

and, if the number of three-dimensional points in the current subset is greater than 3, the point coordinates are substituted into the fitted plane equation and the unknowns are solved by the least squares method to obtain the fitted plane estimate of the current subset.

The degree-of-fit acquisition module is further configured to compute the distance from each three-dimensional point in the screened set to the fitted plane estimate,

take the three-dimensional points whose computed distance is less than a set distance threshold as inliers,

count the number of inliers,

compute the ratio of the number of inliers to the screened three-dimensional point cloud to obtain the inlier rate, and

determine the degree of fit according to the inlier rate;

or

determine the degree of fit according to the number of iterations, where the number of iterations satisfies:

$$i \geq \frac{\log(1-\eta)}{\log\!\left(1-\varepsilon^{m}\right)}$$

where m is the number of three-dimensional points in the subset, η is the set confidence level, and ε is the inlier rate, taken as the worst-case proportion of inliers, or set to the worst-case proportion in the initial state and updated to the current maximum inlier rate as the iterations proceed;

or

determine the degree of fit according to whether the probability that v subsets are all inliers satisfies the set confidence level, where the probability that v subsets are all inliers is:

$$p(v,\lambda) = \frac{\lambda^{v}}{v!}\,e^{-\lambda}$$

其中,λ为在当前迭代中子集全都是内点的选取次数的期望。where λ is the expectation of the number of times the subsets are all interior points in the current iteration.

所述重解模块还包括,根据内点率最高的拟合平面估计的内点,通过最小二乘的方法,重新求解拟合平面方程中的未知数,得到拟合平面方程。The resolving module further includes, according to the estimated interior point of the fitting plane with the highest interior point rate, by using the least squares method, resolving the unknowns in the fitting plane equation to obtain the fitting plane equation.

The transformation calculation module is configured to: obtain the fitting plane normal vector from the fitting plane equation;

obtain, from the fitting plane normal vector, the equation of the calibration plane that is parallel to or coincident with the fitting plane;

calculate the distance from the origin of the camera coordinate system to the calibration plane to obtain the distance transformation of the camera coordinate system relative to the calibration plane;

calculate the rotation that brings the camera coordinate system to face the calibration plane squarely, obtaining the rotation transformation of the camera coordinate system relative to the calibration plane;

and take the distance transformation and the rotation transformation as the parameters between the depth camera and the calibration plane.
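As a sketch of this distance-plus-rotation decomposition (again assuming the fitted plane n·x + d = 0 with unit normal; rotation_aligning and camera_plane_params are illustrative names, not the patent's own), one possible implementation is:

```python
def rotation_aligning(a, b):
    # Proper rotation matrix that takes unit vector a onto unit vector b (Rodrigues' formula).
    a = a / np.linalg.norm(a)
    b = b / np.linalg.norm(b)
    v = np.cross(a, b)
    c = float(np.dot(a, b))
    s2 = float(np.dot(v, v))
    if s2 < 1e-12:
        if c > 0.0:
            return np.eye(3)                          # already aligned
        axis = np.cross(a, np.array([1.0, 0.0, 0.0])) # opposite: rotate 180 degrees about
        if np.linalg.norm(axis) < 1e-6:               # any axis perpendicular to a
            axis = np.cross(a, np.array([0.0, 1.0, 0.0]))
        axis = axis / np.linalg.norm(axis)
        return 2.0 * np.outer(axis, axis) - np.eye(3)
    vx = np.array([[0.0, -v[2], v[1]],
                   [v[2], 0.0, -v[0]],
                   [-v[1], v[0], 0.0]])
    return np.eye(3) + vx + vx @ vx * ((1.0 - c) / s2)

def camera_plane_params(plane, camera_axis):
    # Distance from the camera origin to the plane n . x + d = 0 (unit normal), plus the
    # rotation turning the chosen camera axis onto the plane normal ("facing" the plane).
    n, d = plane[:3], plane[3]
    distance = abs(float(d))
    return distance, rotation_aligning(np.asarray(camera_axis, dtype=float), n)
```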

The depth image data includes image data of the bearing surface of the mobile robot body, and the transformation calculation module is further configured to:

obtain, from the fitting plane normal vector, the equation of a calibration plane that is parallel to the fitting plane and at a set distance from it, wherein the calibration plane is the ground bearing the mobile robot body in the world coordinate system;

calculate, from the equation parameters of the calibration plane, the distance from the origin of the camera coordinate system to the fitting plane to obtain the height transformation of the camera coordinate system relative to the calibration plane; calculate the rotation of the y-axis of the camera coordinate system relative to the fitting plane normal vector to obtain the rotation transformation of the camera coordinate system relative to the calibration plane;

and take the height transformation and the rotation transformation as the parameters between the depth camera and the calibration plane.
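For the ground-plane case, a sketch under the assumption that the camera y-axis points toward the floor (flip the sign test if the camera frame uses the opposite convention) could be:

```python
def ground_extrinsics(plane):
    # Ground case: camera height above the floor plus the rotation aligning the camera
    # y-axis with the ground normal (assumes the camera y-axis points toward the floor).
    n, d = np.asarray(plane[:3], dtype=float), float(plane[3])
    if n[1] < 0.0:               # orient the normal toward the camera's +y direction
        n, d = -n, -d
    height = abs(d)              # origin-to-ground distance
    return height, rotation_aligning(np.array([0.0, 1.0, 0.0]), n)
```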

The depth image data includes elevation image data of a surface perpendicular to the bearing surface of the mobile robot body, and the transformation calculation module is further configured to: obtain, from the normal vector of the fitting plane, the equation of a calibration plane that is parallel to the fitting plane and at a set distance from it, wherein the calibration plane is perpendicular to the ground bearing the mobile robot body in the world coordinate system;

calculate, from the equation parameters of the calibration plane, the distance from the origin of the camera coordinate system to the second fitting plane to obtain the distance transformation of the camera coordinate system relative to the calibration plane; calculate the rotation of the z-axis of the camera coordinate system relative to the normal vector of the second fitting plane to obtain the rotation transformation of the camera coordinate system relative to the calibration plane;

and take the distance transformation and the rotation transformation as the parameters between the depth camera and the calibration plane.
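For the vertical-surface case, the same pattern applies with the camera z-axis (optical axis), assuming the camera looks toward the wall:

```python
def wall_extrinsics(plane):
    # Wall case: camera-to-wall distance plus the rotation aligning the camera z-axis
    # (optical axis) with the wall normal (assumes the camera looks toward the wall).
    n, d = np.asarray(plane[:3], dtype=float), float(plane[3])
    if n[2] < 0.0:               # orient the normal along the viewing direction
        n, d = -n, -d
    return abs(d), rotation_aligning(np.array([0.0, 0.0, 1.0]), n)
```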

The present application also provides an electronic device for calibrating the external parameters of a depth camera. The electronic device includes a memory and a processor, wherein

the memory is configured to store a computer program;

the processor is configured to execute the program stored in the memory to implement the steps of any of the depth camera external parameter calibration methods described above.

The memory may include random access memory (RAM), and may also include non-volatile memory (NVM), for example at least one disk storage. Optionally, the memory may also be at least one storage device located remotely from the aforementioned processor.

The processor may be a general-purpose processor, including a central processing unit (CPU), a network processor (NP), and the like; it may also be a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component.

An embodiment of the present invention further provides a computer-readable storage medium storing a computer program which, when executed by a processor, implements the following steps:

acquiring depth image data, the depth image data including pixel coordinates and depth values;

converting, based on the acquired depth image data, the pixels in the depth image into three-dimensional spatial points in the camera coordinate system (a back-projection sketch is given after this list);

obtaining, based on the three-dimensional points, a fitting plane associated with the calibration plane;

obtaining the parameters between the depth camera and the calibration plane according to the current pose relationship between the fitting plane and the camera coordinate system;

wherein the calibration plane includes any current plane parallel or perpendicular to the bearing surface of the mobile robot body on which the depth camera is located.
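A sketch of the pixel-to-3D back-projection step referenced above, assuming a standard pinhole model with intrinsics fx, fy, cx, cy and a depth map already in metric units (the function name is illustrative):

```python
def depth_to_points(depth, fx, fy, cx, cy):
    # Back-project a depth image into camera-frame 3D points with the pinhole model:
    # z = depth(u, v), x = (u - cx) * z / fx, y = (v - cy) * z / fy.
    v, u = np.indices(depth.shape)
    z = depth.astype(float)
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    points = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    return points[points[:, 2] > 0]   # keep only pixels with a valid (positive) depth
```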

As the apparatus, network-side device, and storage medium embodiments are essentially similar to the method embodiments, their description is relatively brief; for related details, refer to the relevant parts of the description of the method embodiments.

It should be noted that the embodiments of the depth camera external parameter calibration method provided by the present application are not limited to the implementations described above. For example, when obtaining the fitting plane, fitting algorithms other than the RANSAC algorithm may be used; when solving the overdetermined equations, solution methods other than least squares, such as regression computation, may also be used. In addition, to reduce the computational complexity of the overdetermined equations, reliable three-dimensional points may be selected in a targeted manner, for example with the aid of image feature points.

In this document, relational terms such as first and second are used only to distinguish one entity or operation from another, and do not necessarily require or imply any actual relationship or order between these entities or operations. Moreover, the terms "comprise", "include", or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or device that includes a list of elements includes not only those elements but also other elements not expressly listed, or elements inherent to such a process, method, article, or device. Without further limitation, an element qualified by the phrase "comprising a ..." does not exclude the presence of additional identical elements in the process, method, article, or device that includes the element.

The above descriptions are only preferred embodiments of the present invention and are not intended to limit it; any modification, equivalent replacement, improvement, and the like made within the spirit and principles of the present invention shall fall within the scope of protection of the present invention.

Claims (15)

1. A method for calibrating external parameters of a depth camera, characterized in that the method comprises: acquiring depth image data, the depth image data including pixel coordinates and depth values; converting, based on the acquired depth image data, the pixels in the depth image into three-dimensional spatial points in the camera coordinate system; obtaining, based on the three-dimensional points, a fitting plane of the three-dimensional points; and obtaining parameters between the depth camera and a calibration plane according to the current pose relationship between the camera coordinate system and the calibration plane that is parallel to or coincident with the fitting plane; wherein the calibration plane includes any current plane parallel or perpendicular to the bearing surface of the mobile robot body on which the depth camera is located.

2. The method according to claim 1, characterized in that the depth image data comprises as few as a single depth image, and converting the pixels in the depth image into three-dimensional spatial points in the camera coordinate system comprises: obtaining the three-dimensional point coordinates corresponding to the pixels according to the geometric mapping model between three-dimensional spatial points in the camera coordinate system and the two-dimensional image in the image coordinate system.

3. The method according to claim 2, characterized in that obtaining the three-dimensional point coordinates corresponding to the pixels according to the geometric mapping model between three-dimensional spatial points in the camera coordinate system and the two-dimensional image in the image coordinate system comprises: for any pixel in the depth image, taking the depth value of the pixel as the z-coordinate of the three-dimensional point; obtaining a first difference between the x-coordinate of the pixel and an x-direction offset, and multiplying the ratio of the z-coordinate to the x-direction focal length of the camera intrinsic parameters by the first difference, the result being taken as the x-coordinate of the three-dimensional point, the x-direction offset being the offset of the origin of the camera coordinate system relative to the image coordinate system in the x direction; obtaining a second difference between the y-coordinate of the pixel and a y-direction offset, and multiplying the ratio of the z-coordinate to the y-direction focal length of the camera intrinsic parameters by the second difference, the result being taken as the y-coordinate of the three-dimensional point, the y-direction offset being the offset of the origin of the camera coordinate system relative to the image coordinate system in the y direction; wherein the x-direction offset and the y-direction offset of the origin of the camera coordinate system relative to the image coordinate system are obtained from the camera intrinsic parameters; and converting all pixels in the depth image into three-dimensional point coordinates to obtain a three-dimensional point set.

4. The method according to claim 3, characterized in that the method further comprises: filtering the three-dimensional points in the three-dimensional point set according to a filtering strategy to obtain a filtered three-dimensional point set, the filtering strategy including any one of the following conditions or any combination thereof: (1) removing, according to the camera orientation, three-dimensional points within a certain range of the upper part of the depth image; (2) when an initial estimate of the external parameters is available, transforming the three-dimensional points in the camera coordinate system into the world coordinate system using the camera external parameters, and removing three-dimensional points whose height in the height direction is greater than a preset height threshold; (3) for a binocular stereo vision depth camera, removing three-dimensional points whose depth value is greater than a preset depth threshold; (4) for a time-of-flight (TOF) depth camera, removing, according to the distance difference between a pixel and its neighboring pixels, three-dimensional points whose distance difference is greater than a preset distance threshold.

5. The method according to claim 4, characterized in that obtaining the fitting plane of the three-dimensional points based on the three-dimensional points comprises: obtaining the fitting plane equation of the three-dimensional points from the filtered three-dimensional point set using the random sample consensus (RANSAC) algorithm, wherein the number of three-dimensional points is greater than or equal to 3.

6. The method according to claim 5, characterized in that obtaining the fitting plane equation of the three-dimensional points using the RANSAC algorithm comprises: performing random selection on the filtered three-dimensional point set to obtain a current subset composed of the randomly selected three-dimensional points, the number of three-dimensional points in the subset being at least 3; obtaining, based on the three-dimensional points in the current subset, a fitting plane estimate of the subset; obtaining, according to the obtained fitting plane estimate, the degree of conformity of all three-dimensional points in the filtered three-dimensional point set with respect to the fitting plane estimate; if the degree of conformity is insufficient, returning to the step of performing random selection on the filtered three-dimensional point set; if the degree of conformity is reached, solving the fitting plane equation using the inliers of the fitting plane estimate with the best degree of conformity; wherein the inliers include the three-dimensional points in the filtered three-dimensional point set whose distance to the fitting plane with the best degree of conformity is less than a preset distance threshold.

7. The method according to claim 6, characterized in that obtaining the fitting plane estimate of the subset based on the three-dimensional points in the current subset comprises: if the number of three-dimensional points in the current subset equals 3, substituting the three-dimensional point coordinates into the fitting plane equation and solving the unknowns in the fitting plane equation to obtain the fitting plane estimate of the current subset; if the number of three-dimensional points in the current subset is greater than 3, substituting the three-dimensional point coordinates into the fitting plane equation and solving the unknowns in the fitting plane equation by the least-squares method to obtain the fitting plane estimate of the current subset.

8. The method according to claim 6, characterized in that obtaining the degree of conformity of all three-dimensional points in the filtered three-dimensional point set with respect to the fitting plane estimate comprises: calculating the distance from each three-dimensional point in the filtered three-dimensional point set to the fitting plane estimate; taking the three-dimensional points whose computed distance is less than a set distance threshold as inliers; counting the number of inliers; computing the ratio of the number of inliers to the filtered three-dimensional point set to obtain the inlier rate; and determining the degree of conformity according to the inlier rate; and in that solving the fitting plane equation using the inliers of the fitting plane estimate with the best degree of conformity comprises: taking the fitting plane estimate with the highest inlier rate as the best fitting plane estimate, and re-solving the unknowns in the fitting plane equation by the least-squares method using the inliers of the best fitting plane estimate to obtain the fitting plane equation.

9. The method according to claim 8, characterized in that determining the degree of conformity according to the inlier rate comprises: determining the degree of conformity according to the number of iterations, wherein the number of iterations k satisfies:
k ≥ log(1 - η) / log(1 - ε^m)
where m is the number of three-dimensional points in a subset, η is the set confidence level, and ε is the inlier rate, taken as the worst-case proportion of inliers, or initialized to the worst-case proportion and then updated to the largest inlier rate found so far as the iterations proceed.
10. The method according to claim 8, characterized in that determining the degree of conformity according to the inlier rate comprises: determining the degree of conformity according to whether the probability that v sampled subsets consist entirely of inliers satisfies the set confidence level, wherein the probability that v subsets are all inliers is:
P(v) = (λ^v / v!) · e^(-λ)
where λ is the expected number of all-inlier subset selections in the current iterations.
11. The method according to any one of claims 1 to 10, characterized in that obtaining the parameters between the depth camera and the calibration plane according to the current pose relationship between the camera coordinate system and the calibration plane that is parallel to or coincident with the fitting plane comprises: obtaining the fitting plane normal vector from the fitting plane equation; obtaining, from the fitting plane normal vector, the equation of the calibration plane that is parallel to or coincident with the fitting plane; calculating the distance from the origin of the camera coordinate system to the calibration plane to obtain the distance transformation of the camera coordinate system relative to the calibration plane; calculating the rotation that brings the camera coordinate system to face the calibration plane squarely to obtain the rotation transformation of the camera coordinate system relative to the calibration plane; and taking the distance transformation and the rotation transformation as the parameters between the depth camera and the calibration plane.

12. The method according to claim 11, characterized in that the depth image data includes image data of the bearing surface of the mobile robot body; obtaining the equation of the calibration plane associated with the fitting plane from the fitting plane normal vector comprises: obtaining, from the fitting plane normal vector, the equation of a calibration plane that is parallel to the fitting plane and at a set distance from it, wherein the calibration plane is the ground bearing the mobile robot body in the world coordinate system; calculating the distance from the origin of the camera coordinate system to the calibration plane to obtain the distance transformation of the camera coordinate system relative to the calibration plane comprises: calculating, from the equation parameters of the calibration plane, the distance from the origin of the camera coordinate system to the calibration plane to obtain the height transformation of the camera coordinate system relative to the calibration plane; and calculating the rotation that brings the camera coordinate system to face the calibration plane squarely to obtain the rotation transformation of the camera coordinate system relative to the calibration plane comprises: calculating, from the equation parameters of the calibration plane, the rotation of the y-axis of the camera coordinate system relative to the fitting plane normal vector to obtain the rotation transformation of the camera coordinate system relative to the calibration plane.

13. The method according to claim 11, characterized in that the depth image data includes elevation image data of a surface perpendicular to the bearing surface of the mobile robot body; obtaining the equation of the calibration plane associated with the fitting plane from the fitting plane normal vector comprises: obtaining, from the normal vector of the fitting plane, the equation of a calibration plane that is parallel to the fitting plane and at a set distance from it, wherein the calibration plane is perpendicular to the ground bearing the mobile robot body in the world coordinate system; calculating the distance from the origin of the camera coordinate system to the calibration plane to obtain the distance transformation of the camera coordinate system relative to the calibration plane comprises: calculating, from the equation parameters of the calibration plane, the distance from the origin of the camera coordinate system to the calibration plane to obtain the distance transformation from the camera coordinate system to the calibration plane; and calculating the rotation that brings the camera coordinate system to face the calibration plane squarely to obtain the rotation transformation of the camera coordinate system relative to the calibration plane comprises: calculating, from the equation parameters of the calibration plane, the rotation of the z-axis of the camera coordinate system relative to the calibration plane normal vector to obtain the rotation transformation of the camera coordinate system relative to the calibration plane.

14. An electronic device for calibrating external parameters of a depth camera, characterized in that the electronic device comprises a memory and a processor, wherein the memory is configured to store a computer program, and the processor is configured to execute the program stored in the memory to implement the method for calibrating external parameters of a depth camera according to any one of claims 1 to 13.

15. A computer-readable storage medium, characterized in that the storage medium stores a computer program which, when executed by a processor, implements the method for calibrating external parameters of a depth camera according to any one of claims 1 to 13.
CN201910892567.7A 2019-09-20 2019-09-20 A method and device for calibrating external parameters of a depth camera Active CN112541950B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910892567.7A CN112541950B (en) 2019-09-20 2019-09-20 A method and device for calibrating external parameters of a depth camera


Publications (2)

Publication Number Publication Date
CN112541950A (en) 2021-03-23
CN112541950B CN112541950B (en) 2025-01-03

Family

ID=75012324

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910892567.7A Active CN112541950B (en) 2019-09-20 2019-09-20 A method and device for calibrating external parameters of a depth camera

Country Status (1)

Country Link
CN (1) CN112541950B (en)

Patent Citations (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103646389A (en) * 2013-03-26 2014-03-19 中国科学院电子学研究所 SAR slant range image match automatic extraction method based on geometric model
CN104156972A (en) * 2014-08-25 2014-11-19 西北工业大学 Perspective imaging method based on laser scanning distance measuring instrument and multiple cameras
CN104376558A (en) * 2014-11-13 2015-02-25 浙江大学 Cuboid-based intrinsic parameter calibration method for Kinect depth camera
JP2017118396A (en) * 2015-12-25 2017-06-29 Kddi株式会社 Program, apparatus and method for calculating internal parameters of depth camera
CN107945234A (en) * 2016-10-12 2018-04-20 杭州海康威视数字技术股份有限公司 A kind of definite method and device of stereo camera external parameter
CN107146256A (en) * 2017-04-10 2017-09-08 中国人民解放军国防科学技术大学 Camera Calibration Method Based on Differential GPS System under Large Field of View Condition
CN107633536A (en) * 2017-08-09 2018-01-26 武汉科技大学 A kind of camera calibration method and system based on two-dimensional planar template
CN107590836A (en) * 2017-09-14 2018-01-16 斯坦德机器人(深圳)有限公司 A kind of charging pile Dynamic Recognition based on Kinect and localization method and system
CN108280853A (en) * 2018-01-11 2018-07-13 深圳市易成自动驾驶技术有限公司 Vehicle-mounted vision positioning method, device and computer readable storage medium
CN108416791A (en) * 2018-03-01 2018-08-17 燕山大学 A Binocular Vision-Based Pose Monitoring and Tracking Method for Parallel Mechanism Maneuvering Platform
CN108629756A (en) * 2018-04-28 2018-10-09 东北大学 A kind of Kinect v2 depth images Null Spot restorative procedure
CN109029284A (en) * 2018-06-14 2018-12-18 大连理工大学 Three-dimensional laser scanner and camera calibration method based on geometric constraint
CN109544677A (en) * 2018-10-30 2019-03-29 山东大学 Indoor scene main structure method for reconstructing and system based on depth image key frame
CN110111248A (en) * 2019-03-15 2019-08-09 西安电子科技大学 A kind of image split-joint method based on characteristic point, virtual reality system, camera

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
SUN Shijie et al.: "Automatic extrinsic calibration of an RGB-D camera based on ground-plane detection in point clouds", Journal of Image and Graphics, vol. 23, no. 6, pages 866 - 873 *

Cited By (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115131763A (en) * 2021-03-25 2022-09-30 光宝科技股份有限公司 Image processing method and electronic device using the same
CN112967347A (en) * 2021-03-30 2021-06-15 深圳市优必选科技股份有限公司 Pose calibration method and device, robot and computer readable storage medium
US12046008B2 (en) 2021-03-30 2024-07-23 Ubtech Robotics Corp Ltd Pose calibration method, robot and computer readable storage medium
CN112967347B (en) * 2021-03-30 2023-12-15 深圳市优必选科技股份有限公司 Pose calibration method, pose calibration device, robot and computer readable storage medium
WO2022205845A1 (en) * 2021-03-30 2022-10-06 深圳市优必选科技股份有限公司 Pose calibration method and apparatus, and robot and computer-readable storage medium
CN115239803A (en) * 2021-04-23 2022-10-25 阿里巴巴新加坡控股有限公司 Data processing method and device
CN113340310A (en) * 2021-07-08 2021-09-03 深圳市人工智能与机器人研究院 Step terrain identification and positioning method for mobile robot and related device
CN113340310B (en) * 2021-07-08 2024-03-15 深圳市人工智能与机器人研究院 Step terrain identification and positioning method and relevant device for mobile robot
CN113284197A (en) * 2021-07-22 2021-08-20 浙江华睿科技股份有限公司 TOF camera external reference calibration method and device for AGV, and electronic equipment
CN113284197B (en) * 2021-07-22 2021-11-23 浙江华睿科技股份有限公司 TOF camera external reference calibration method and device for AGV, and electronic equipment
CN113689391A (en) * 2021-08-16 2021-11-23 炬佑智能科技(苏州)有限公司 ToF device installation parameter acquisition method and system and ToF device
CN113689391B (en) * 2021-08-16 2024-07-19 炬佑智能科技(苏州)有限公司 ToF equipment installation parameter acquisition method and system and ToF equipment
CN114066981A (en) * 2021-11-11 2022-02-18 国网辽宁省电力有限公司沈阳供电公司 Unmanned aerial vehicle ground target positioning method
CN114359400A (en) * 2021-12-08 2022-04-15 深圳市优必选科技股份有限公司 A kind of external parameter calibration method, device, computer readable storage medium and robot
CN114972539A (en) * 2022-06-01 2022-08-30 广州铁路职业技术学院(广州铁路机械学校) On-line calibration method, system, computer equipment and medium for camera plane in computer room
CN114972539B (en) * 2022-06-01 2025-04-18 广州铁路职业技术学院(广州铁路机械学校) Computer room camera plane online calibration method, system, computer equipment and medium
CN115187645A (en) * 2022-06-30 2022-10-14 山东新一代信息产业技术研究院有限公司 Robot anti-falling method based on depth image
CN115267814A (en) * 2022-08-05 2022-11-01 深圳大方智能科技有限公司 A Fast Surface Fitting Method for Multi-sensor Fusion
CN115937331A (en) * 2023-01-13 2023-04-07 安徽绿舟科技有限公司 Deep camera external parameter calibration method based on heavy truck battery automatic battery replacement system

Also Published As

Publication number Publication date
CN112541950B (en) 2025-01-03


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: 310051 room 304, B / F, building 2, 399 Danfeng Road, Binjiang District, Hangzhou City, Zhejiang Province

Applicant after: Hangzhou Hikvision Robot Co.,Ltd.

Address before: 310052 5 / F, building 1, building 2, no.700 Dongliu Road, Binjiang District, Hangzhou City, Zhejiang Province

Applicant before: HANGZHOU HIKROBOT TECHNOLOGY Co.,Ltd.

GR01 Patent grant
GR01 Patent grant