
CN116934867A - Camera calibration method - Google Patents

Camera calibration method

Info

Publication number
CN116934867A
Authority
CN
China
Prior art keywords
calibration
robot
camera
plate
point
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310830592.9A
Other languages
Chinese (zh)
Inventor
赖钦伟
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhuhai Amicro Semiconductor Co Ltd
Original Assignee
Zhuhai Amicro Semiconductor Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhuhai Amicro Semiconductor Co Ltd
Priority to CN202310830592.9A
Publication of CN116934867A
Legal status: Pending (current)

Links

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/80 Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/30 Subject of image; Context of image processing
    • G06T 2207/30244 Camera pose

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Manipulator (AREA)

Abstract

The application discloses a camera calibration method. When a robot calibrates the internal and external parameters of its camera, it first optimizes the camera's internal parameters using images of the calibration plates, and then calibrates the external parameters using the pose of the camera relative to each calibration plate (obtained from the calibration-plate images) together with the pose of each calibration plate of the calibration jig relative to the jig's center point. The robot can therefore complete both internal- and external-parameter calibration in a single camera calibration operation, which shortens calibration time and improves the mass-production efficiency of the robot. Because the robot obtains the pose of each calibration plate relative to the jig's center point before camera calibration, the external parameters of robots with different external parameters can be computed from these plate poses as long as each robot's center point is made to coincide with the jig's center point.

Description

Camera calibration method
Technical Field
The application relates to the technical field of cameras, in particular to a camera calibration method.
Background
In the fields of computer vision and graphics, cameras used in applications such as security monitoring, interactive motion-sensing games, autonomous driving and 3D environment modeling must have their external parameters calibrated so that the positional relationship between real physical space and the camera's field of view can be mapped. For a robot, for example, the camera is an important device for acquiring external information; for the robot to obtain the accurate position of that information in its own coordinate system, the camera's internal and external parameters must be calibrated. At present, the common industry practice is to calibrate the internal parameters and the external parameters of a robot's camera separately, which requires two calibration operations, takes more time and reduces mass-production efficiency.
Disclosure of Invention
To solve the above problems, the application provides a camera calibration method. The specific technical scheme of the application is as follows:
A camera calibration method, comprising the following steps: S1: the robot acquires an image of the calibration plates of a calibration jig through its camera, and then identifies calibration points in the image of the calibration plates; S2: the robot calibrates the internal parameters of the camera and obtains the pose of the camera relative to each calibration plate according to the positions of the calibration points on the calibration plates and the pixel positions of the calibration points in the image; S3: the robot checks the internal-parameter calibration result of the camera; if the result is correct, the method proceeds to step S4, and if it is incorrect, the internal parameters of the camera are calibrated again; S4: the robot calibrates the external parameters of the camera using the pose of each calibration plate of the calibration jig relative to the center point of the calibration jig and the pose of the camera relative to each calibration plate.
Further, in step S1, the robot acquires an image containing a plurality of calibration plates through the camera, comprising the following steps: the robot moves into a calibration jig formed by a plurality of calibration plates and makes its center point coincide with the center point of the calibration jig; the robot orients the camera vertically upward and then acquires an image containing all calibration plates of the calibration jig through the camera.
Further, in step S1, the robot identifies calibration points in the image of the calibration plate, comprising the following steps: the robot detects the quadrilateral black regions in the image of the calibration plate; the robot connects the detected quadrilateral black regions in clockwise order; the robot obtains the inner quadrilateral enclosed by the quadrilateral black regions; the robot takes the four vertices of the inner quadrilateral as the calibration points of the calibration plate through corner detection. The calibration plate is a checkerboard calibration plate, and each of the four sides of the inner quadrilateral adjoins one of the quadrilateral black regions.
Further, in step S2, the robot calibrates the internal parameters of the camera and obtains the pose of the camera relative to each calibration plate according to the positions of the calibration points on the calibration plates and the pixel positions of the calibration points in the image, comprising the following steps: the robot obtains the physical coordinates of the calibration points in the physical coordinate system of the calibration plate and the pixel coordinates of the calibration points in the image; the robot projects the calibration points of each calibration plate into the image according to the initial pose of that plate relative to the robot's center point and the camera's initial internal parameters, obtaining the projection coordinates of the calibration points of the calibration plates; the robot computes, by the Euclidean distance formula, the Euclidean distance between the projection coordinates of each calibration point and the pixel coordinates of the corresponding calibration point in the image; the robot then optimizes the internal parameters of the camera from the obtained Euclidean distances through the Gauss-Newton iterative formula, obtaining the optimal internal parameters of the camera and the pose of the camera relative to each calibration plate.
Further, when the robot optimizes the internal parameters of the camera through the Gauss-Newton iterative formula and the minimum Euclidean distance is reached during the computation, the pose of each calibration plate relative to the robot's center point and the internal parameters of the camera corresponding to that minimum are taken as the optimal internal parameters of the camera and the pose of the camera relative to each calibration plate.
Further, the robot obtains the initial pose of a calibration plate relative to the robot's center point, comprising the following steps: the robot draws, through the upper-left corner point of the calibration plate, a line perpendicular to the calibration plate; the robot takes the angle between this perpendicular line and the robot's forward direction as the angle between the calibration plate and the robot; the robot obtains the distance between its center point and the upper-left corner point of the calibration plate; the robot then combines this angle and this distance to obtain the initial pose of the calibration plate relative to the robot's center point.
Further, the robot obtains the physical coordinates of the calibration points in the physical coordinate system of the calibration plate, comprising the following steps: the robot takes the upper-left corner point of each calibration plate as the origin and the two edges of the plate connected to that corner as the X axis and the Y axis, constructing a physical coordinate system; the robot then obtains the coordinates of each calibration point in this physical coordinate system according to its position on the calibration plate.
Further, in step S3, the robot checks the internal-parameter calibration result of the camera, comprising the following steps: the robot classifies the calibration points in the image and determines which calibration plate each identified calibration point belongs to; the robot converts the pixel coordinates of the identified calibration points belonging to the same calibration plate into physical coordinates in that plate's physical coordinate system, using the camera's internal parameters and the camera's pose relative to the plate, thereby obtaining the physical coordinates of the calibration points of the image;
the robot computes the Euclidean distance between the physical coordinates of each calibration point of the image and the physical coordinates of the corresponding calibration point on the plate, and if the mean of a set number of these distances is larger than a first set threshold, the plate's calibration is judged incorrect, otherwise it is correct; the robot fits the calibration points of each correctly calibrated plate into several calibration-point lines and compares the perpendicular distance between adjacent parallel lines, and if the distance between any two such lines is larger than a second set threshold, the plate's calibration is judged incorrect, otherwise it is correct; the robot deletes the incorrectly calibrated plates and their identified calibration points, and then calibrates the camera's internal parameters again.
Further, in step S2, the robot classifies the identified calibration points and determines which calibration plate they belong to, comprising the following steps: A1: the robot randomly selects as cluster centers a number of calibration points equal to the number of calibration plates; A2: the robot computes the cluster distance between each calibration point and each cluster center, and assigns the point to the cluster with the smallest distance; A3: the robot computes the mean of the image coordinates of the calibration points in each cluster and takes it as the new cluster center; A4: the robot repeats steps A2 and A3 until the mean image coordinates of each cluster no longer change, and then determines, from each cluster center, which calibration plate the points of that cluster belong to.
Further, in step S4, the robot calibrates the external parameters of the camera using the pose of each calibration plate of the calibration jig relative to the jig's center point and the pose of the camera relative to each calibration plate, comprising the following steps: the robot obtains the rotation matrix of the pose of a calibration plate of the jig relative to the jig's center point; the robot obtains the rotation matrix of the pose of the camera relative to that calibration plate; the robot then converts these two rotation matrices according to the Euler formula to obtain the external-parameter matrix of the camera.
Further, before the camera calibration, the robot obtains the pose of the calibration plates of the calibration jig relative to the jig's center point, comprising the following steps: the robot keeps the calibration plates within the shooting range of the camera, then moves to different preset positions to acquire images of a plurality of calibration plates, and acquires IMU data during the movement; the robot identifies calibration points in the acquired images of the calibration plates, and then computes, with a nonlinear optimization algorithm, the pose of the camera relative to the calibration plates at the different preset positions according to the identified calibration points; the robot computes, with a nonlinear optimization algorithm, the pose of the camera relative to the robot's center point according to the IMU data and the poses of the camera relative to the calibration plates at the different preset positions; the robot moves into a calibration jig formed by a plurality of calibration plates, and then sets the current position of the robot's center point as the center point of the calibration jig; the robot orients the camera vertically upward, acquires an image containing all calibration plates of the jig through the camera, and identifies the calibration points in the images of the calibration plates; the robot obtains the pose of the camera relative to each calibration plate according to the positions of the calibration points on the plates and the pixel positions of the calibration points in the image; and the robot obtains the pose of the calibration plates of the jig relative to the jig's center point by matrix transformation, from the pose of the camera relative to each calibration plate and the pose of the camera relative to the robot's center point.
Compared with the prior art, the application has the following advantages: when the robot calibrates the internal and external parameters of the camera, the internal parameters are first optimized through the images of the calibration plates, improving the camera's accuracy; the external parameters are then calibrated from the pose of the camera relative to each calibration plate (obtained from the calibration-plate images) and the pose of each calibration plate of the jig relative to the jig's center point, so the robot completes the calibration of both internal and external parameters in a single calibration operation, which shortens calibration time and improves the mass-production efficiency of the robot; because the pose of each calibration plate relative to the jig's center point is obtained before camera calibration, the external parameters of robots with different external parameters can be computed from these plate poses as long as each robot's center point coincides with the jig's center point, which gives high flexibility; and after calibrating the internal parameters from the calibration-plate image, the robot verifies them, improving the accuracy of both the internal-parameter calibration result and the external-parameter calibration result obtained from the same data.
Drawings
Fig. 1 is a flowchart of a camera calibration method according to an embodiment of the application.
Detailed Description
Embodiments of the present application are described in detail below, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to like or similar elements or elements having like or similar functions throughout.
The technical scheme and beneficial effects of the application are made clearer by further describing specific embodiments with reference to the drawings in the specification. The embodiments described below are exemplary, intended to illustrate the application with reference to the drawings, and are not to be construed as limiting it.
As shown in fig. 1, a camera calibration method includes the following steps:
S1: the robot acquires an image of the calibration plates of the calibration jig through the camera, and then identifies calibration points in the image of the calibration plates. The calibration jig is composed of a plurality of calibration plates; the number and size of the plates are adjusted according to the requirements of the robot and are not limited here. The calibration plate may be a checkerboard calibration plate, an ArUco calibration plate (ArUco is an open-source library for estimating camera pose from preset black-and-white markers), a ChArUco calibration plate (a combination of a checkerboard calibration plate and an ArUco calibration plate), etc., and the calibration points may accordingly be corner points, dots, ChArUco corners, etc. The robot can identify the calibration points in the image of the calibration plate through a corner-detection algorithm. S2: the robot calibrates the internal parameters of the camera and obtains the pose of the camera relative to each calibration plate according to the positions of the calibration points on the calibration plates and the pixel positions of the calibration points in the image. The calibration points on a calibration plate are real points on the plates of the jig; their positions are determined by the type and size of the plate, differ between plates of different types and sizes, and can be measured with auxiliary tools before the camera is calibrated. The calibration points in the image are the points identified in the calibration-plate image; their positions are pixel coordinates, obtained from the pixels occupied by each point when the robot identifies it. S3: the robot checks the internal-parameter calibration result of the camera; if the result is correct, the method proceeds to step S4, and if it is incorrect, the calibration points are processed and the internal parameters are calibrated again through step S2. After determining the internal parameters, the robot can check the calibration result to improve its accuracy. The internal parameters calibrated by the robot generally form an intrinsic matrix containing the focal length and the image center. For a wide-angle camera, the internal parameters are a projection function rather than an intrinsic matrix, and the projection function includes the focal length, the center and the camera distortion parameters. S4: the robot calibrates the external parameters of the camera using the pose of each calibration plate of the calibration jig relative to the jig's center point and the pose of the camera relative to each calibration plate. The external parameters of the camera are the pose of the camera relative to the robot's center point.
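For illustration, assuming a simple pinhole model (the focal length, image center and plate pose values below are placeholders rather than values from the disclosure), the role of the intrinsic matrix and the camera pose in mapping a calibration point into pixel coordinates can be sketched as follows:

```python
import numpy as np

def project_point(K, R, t, p_world):
    """Pinhole projection of a 3D point into pixel coordinates:
    p_cam = R @ p_world + t, then u = fx*x/z + cx, v = fy*y/z + cy."""
    p_cam = R @ p_world + t
    u = K[0, 0] * p_cam[0] / p_cam[2] + K[0, 2]
    v = K[1, 1] * p_cam[1] / p_cam[2] + K[1, 2]
    return np.array([u, v])

# Illustrative intrinsic matrix: fx, fy focal lengths, (cx, cy) image center.
K = np.array([[600.0,   0.0, 320.0],
              [  0.0, 600.0, 240.0],
              [  0.0,   0.0,   1.0]])
R, t = np.eye(3), np.array([0.0, 0.0, 1.0])   # plate 1 m in front of the camera
print(project_point(K, R, t, np.array([0.05, 0.02, 0.0])))
```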
In this application, when determining the pose of the camera relative to each calibration plate, the pose of a calibration plate of the jig relative to the jig's center point, and the pose of the camera relative to the robot's center point, the positive direction of the camera is its forward shooting direction, the positive direction of the robot is its forward travel direction, and the positive direction of a calibration plate of the jig is determined by the plate's physical coordinate system. When the robot calibrates the internal and external parameters of the camera, the internal parameters are first optimized through the images of the calibration plates, improving the camera's accuracy. The external parameters are then calibrated from the pose of the camera relative to each calibration plate (obtained from the calibration-plate images) and the pose of each calibration plate of the jig relative to the jig's center point, so the robot completes the calibration of both internal and external parameters in a single calibration operation, which shortens calibration time and improves the mass-production efficiency of the robot. Because the pose of each calibration plate relative to the jig's center point is obtained before camera calibration, the external parameters of robots with different external parameters can be computed from these plate poses as long as each robot's center point coincides with the jig's center point, which gives high flexibility. After calibrating the camera's internal parameters from the calibration-plate image, the robot verifies them, improving the accuracy of both the internal-parameter result and the external-parameter result obtained from the same data.
As one embodiment, in step S1 the robot acquires an image of the calibration plates of the calibration jig through the camera as follows: the robot moves into a calibration jig formed by a plurality of calibration plates and makes its center point coincide with the center point of the jig; the two center points may coincide in the horizontal plane. The calibration plates of the jig are generally arranged around the robot. When acquiring the images, the robot orients the camera vertically upward and then acquires an image containing all calibration plates of the jig through the camera; to ensure the camera captures all plates, the robot can rotate in place so that every plate of the jig falls within the camera's field of view, and any plate that does not fall within the field of view can be removed. In this application, the robot's center point may be the center of its horizontal cross-section, and the jig's center point is defined from the robot's center point when the robot obtains the pose of the jig's calibration plates relative to the jig's center point; this is done before calibration begins and is not changed afterwards. Acquiring the images with the camera pointing vertically upward helps the robot capture all calibration plates of the jig and improves calibration efficiency.
As one embodiment, in step S1 the robot identifies calibration points in the image of the calibration plate as follows: the robot detects the quadrilateral black regions in the image; the robot connects the detected quadrilateral black regions in clockwise order; the robot obtains the inner quadrilateral enclosed by the black regions; the robot takes the four vertices of the inner quadrilateral as the calibration points of the plate through corner detection. The calibration plate is a checkerboard calibration plate: the quadrilateral black regions are the black squares of the checkerboard, the inner quadrilateral is a white square of the checkerboard, and it is surrounded by four black squares. By first identifying the inner quadrilateral on the checkerboard and then taking its four vertices as calibration points, the robot effectively discards the corner points that touch the four outer edges of the checkerboard and improves the accuracy of the calibration result.
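As an illustrative sketch only (the OpenCV functions, the Otsu thresholding, the area limit and the sub-pixel refinement parameters below are assumptions and not the disclosed detection procedure), the detection of dark quadrilateral regions and the refinement of their corners could be approximated as:

```python
import cv2
import numpy as np

def find_quad_corners(gray):
    """Detect dark quadrilateral regions in a grayscale calibration-plate
    image and refine their corner points to sub-pixel accuracy (sketch)."""
    _, binary = cv2.threshold(gray, 0, 255,
                              cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(binary, cv2.RETR_LIST,
                                   cv2.CHAIN_APPROX_SIMPLE)
    quads = []
    for c in contours:
        approx = cv2.approxPolyDP(c, 0.02 * cv2.arcLength(c, True), True)
        if len(approx) == 4 and cv2.contourArea(approx) > 100:  # area limit is a placeholder
            quads.append(approx.reshape(4, 2).astype(np.float32))
    if not quads:
        return [], np.empty((0, 2), np.float32)
    corners = np.concatenate(quads).reshape(-1, 1, 2)
    term = (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 0.01)
    cv2.cornerSubPix(gray, corners, (5, 5), (-1, -1), term)
    return quads, corners.reshape(-1, 2)
```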
As one embodiment, in step S2 the robot calibrates the internal parameters of the camera and obtains the pose of the camera relative to each calibration plate according to the positions of the calibration points on the calibration plates and the pixel positions of the calibration points in the image, as follows: the robot obtains the physical coordinates of the calibration points in the physical coordinate system of each plate and the pixel coordinates of the calibration points in the image. The robot projects the calibration points of each plate into the image of that plate according to the initial pose of the plate relative to the robot's center point (which can be computed from a manual measurement of the distance between the plate and the robot's center point) and the camera's initial internal parameters (estimated from the image or obtained from the camera manufacturer), obtaining the projection coordinates of the plate's calibration points. The robot computes, by the Euclidean distance formula, the distance between the projection coordinates of each calibration point and the pixel coordinates of the corresponding point in the image. The robot then optimizes the internal parameters of the camera from these distances through the Gauss-Newton iterative formula, obtaining the optimal internal parameters and the pose of the camera relative to each calibration plate. Assuming the physical coordinates of a calibration point in the plate's coordinate system are (x0, y0), the projection coordinates obtained from them are (x1, y1), and the pixel coordinates of the corresponding point in the image are (x2, y2), the Euclidean distance between the projection coordinates and the pixel coordinates is d = sqrt((x1 - x2)^2 + (y1 - y2)^2). The Gauss-Newton update is θn+1 = θn - (JᵀJ)⁻¹ Jᵀ d, where θn+1 is the updated parameter vector (internal or external parameters), θn is the parameter vector before the update, d is the computed Euclidean distance, and J is the Jacobian matrix. The robot can also optimize the pose of each calibration plate relative to the robot's center point and the camera's internal parameters with the Levenberg-Marquardt algorithm (the LM algorithm, the most widely used nonlinear least-squares algorithm, uses gradients to seek maximum or minimum values).
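A minimal numerical sketch of the Gauss-Newton update θn+1 = θn - (JᵀJ)⁻¹ Jᵀ d is given below; the toy residual (a one-dimensional projection with a focal length and center), the numerically estimated Jacobian and all values are illustrative assumptions rather than the disclosed parameterization:

```python
import numpy as np

def gauss_newton(residual_fn, theta, iters=20, eps=1e-6):
    """Generic Gauss-Newton loop: theta <- theta - (J^T J)^-1 J^T d, where d
    is the stacked error vector and J is a numerically estimated Jacobian."""
    for _ in range(iters):
        d = residual_fn(theta)
        J = np.zeros((d.size, theta.size))
        for k in range(theta.size):
            step = np.zeros_like(theta)
            step[k] = eps
            J[:, k] = (residual_fn(theta + step) - residual_fn(theta - step)) / (2 * eps)
        delta = np.linalg.solve(J.T @ J, J.T @ d)
        theta = theta - delta
        if np.linalg.norm(delta) < 1e-10:
            break
    return theta

# Toy residual: recover a focal length f and center cx from 1D projections
# u = f * X / Z + cx (hypothetical values, for illustration only).
pts = np.array([[0.10, 1.0], [0.20, 1.5], [-0.10, 0.8], [0.05, 1.2]])  # (X, Z)
f_true, cx_true = 600.0, 320.0
obs = f_true * pts[:, 0] / pts[:, 1] + cx_true
residuals = lambda th: th[0] * pts[:, 0] / pts[:, 1] + th[1] - obs
print(gauss_newton(residuals, np.array([500.0, 300.0])))  # converges to ~[600, 320]
```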
After obtaining the initial pose of each calibration plate relative to the robot's center point and the camera's initial internal parameters, the robot keeps iteratively optimizing these plate poses and internal parameters with the Gauss-Newton method, so that the plate poses and internal parameters it finally obtains are the optimal values, improving the accuracy of the computation result.
As one embodiment, when the robot optimizes the internal parameters of the camera through the Gauss-Newton iterative formula and the minimum Euclidean distance is reached during the computation, the pose of each calibration plate relative to the robot's center point and the internal parameters corresponding to that minimum are taken as the optimal internal parameters of the camera and the pose of the camera relative to each calibration plate. The Euclidean distance here is the reprojection error between the calibration points of the plate and the calibration points in the image; obtaining the plate poses and internal parameters at the minimum of the reprojection error makes the method highly practical.
As one embodiment, the robot obtains the initial pose of a calibration plate relative to its center point as follows: the robot draws, through the upper-left corner point of the plate, a line perpendicular to the plate; the angle between this perpendicular line and the robot's forward direction is taken as the angle between the plate and the robot; the robot obtains the distance between its center point and the plate's upper-left corner point; the robot then combines this angle and this distance to obtain the initial pose of the plate relative to its center point. The robot takes its center point as the origin of the robot coordinate system, its forward direction as the X axis and its wheel axis as the Y axis to construct the robot coordinate system; the initial pose of each plate relative to the robot's center point can then be derived from the angle between the perpendicular line and the forward direction, the distance to the plate's upper-left corner point, and the height of that corner. Computing the plate pose from this angle and distance increases the computation speed.
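For illustration, under an assumed planar simplification (the measured distance is treated as horizontal, and the frame conventions and function name are hypothetical), the initial plate pose could be composed as:

```python
import math

def initial_plate_pose(angle_rad, dist_m, corner_height_m):
    """Rough initial pose of a plate's upper-left corner in the robot frame
    (X forward, Y along the wheel axis, Z up), from the measured angle and
    distance; the measured distance is treated as horizontal for simplicity."""
    x = dist_m * math.cos(angle_rad)   # forward offset of the corner
    y = dist_m * math.sin(angle_rad)   # lateral offset of the corner
    yaw = angle_rad + math.pi          # plate roughly faces back toward the robot
    return x, y, corner_height_m, yaw

print(initial_plate_pose(math.radians(30), 0.8, 0.5))
```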
As one embodiment, the robot obtains the physical coordinates of the calibration points in the physical coordinate system of the calibration plate as follows: the robot takes the upper-left corner point of each plate as the origin and the two edges of the plate connected to that corner as the X axis and the Y axis, constructing a physical coordinate system; the robot then obtains the coordinates of each calibration point in this coordinate system according to its position on the plate. Constructing the physical coordinate system from the edges of each plate improves the accuracy of the computation.
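A short sketch of generating the physical coordinates of checkerboard calibration points in such a plate coordinate system (the grid size and square size are placeholder values):

```python
import numpy as np

def plate_point_coords(rows, cols, square_size_m):
    """Physical coordinates of the calibration points of one plate: origin at
    the upper-left corner, X and Y along the two edges meeting at that corner,
    Z = 0 on the plate plane; square_size_m is the checkerboard square size."""
    xs, ys = np.meshgrid(np.arange(cols), np.arange(rows))
    pts = np.stack([xs.ravel(), ys.ravel(), np.zeros(rows * cols)], axis=1)
    return pts * square_size_m

print(plate_point_coords(3, 4, 0.025)[:5])  # first few points of a 3x4 grid
```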
As one embodiment, in step S3 the robot checks the internal-parameter calibration result of the camera as follows: the robot classifies the calibration points in the image and determines which calibration plate each identified point belongs to. The robot converts the pixel coordinates of the identified points belonging to the same plate into physical coordinates in that plate's coordinate system, using the optimal internal parameters and the camera's pose relative to the plate, thereby obtaining the physical coordinates of the calibration points of the image. The robot computes the Euclidean distance between the physical coordinates of each image calibration point and the physical coordinates of the corresponding point on the plate; if the mean of a set number of these distances is larger than a first set threshold, the plate's calibration is judged incorrect, otherwise it is correct. The robot can also convert the physical coordinates of the plate's calibration points into pixel coordinates in the image and compute the Euclidean distance to the corresponding image points instead. The robot fits the calibration points of each correctly calibrated plate into several calibration-point lines and compares the perpendicular distance between adjacent parallel lines; if the distance between any two adjacent parallel lines is larger than a second set threshold, the plate's calibration is judged incorrect, otherwise it is correct. The robot deletes the incorrectly calibrated plates and their identified calibration points, and then calibrates the internal parameters again through step S2. If all plates are judged incorrectly calibrated, the robot rotates by a certain angle and acquires images of the jig's calibration plates again to calibrate the internal and external parameters of the camera. Verifying the internal parameters after calibrating them from the calibration-plate image improves the accuracy of both the internal-parameter result and the external-parameter result obtained from the same data.
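A hedged sketch of the two checks described above; the thresholds, the data layout and the line parameterization (which assumes the grid lines are not vertical in pixel coordinates) are illustrative assumptions:

```python
import numpy as np

def plate_calibration_ok(proj_pts, plate_pts, line_groups, thr_mean, thr_spacing):
    """Return True if one plate passes both checks: mean back-projection error
    below thr_mean, and the perpendicular spacing between adjacent parallel
    calibration-point lines not exceeding thr_spacing."""
    # Check 1: mean Euclidean distance between back-projected and true points.
    if np.mean(np.linalg.norm(proj_pts - plate_pts, axis=1)) > thr_mean:
        return False
    # Check 2: least-squares fit y = a*x + b per point group, then compare the
    # perpendicular offsets of the fitted (nominally parallel) lines.
    offsets = []
    for pts in line_groups:
        a, b = np.polyfit(pts[:, 0], pts[:, 1], 1)
        offsets.append(b / np.hypot(a, 1.0))   # signed perpendicular offset
    spacings = np.diff(np.sort(np.array(offsets)))
    return bool(np.all(spacings <= thr_spacing))
```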
As one embodiment, in step S2 the robot classifies the identified calibration points and determines which calibration plate they belong to as follows: A1: the robot randomly selects as cluster centers a number of calibration points equal to the number of calibration plates (a cluster is a set of points). A2: the robot computes the cluster distance between each calibration point and each cluster center, and assigns the point to the cluster with the smallest distance. A3: the robot computes the mean of the image coordinates of the points in each cluster and takes it as the new cluster center. A4: the robot repeats steps A2 and A3 until the mean image coordinates of each cluster no longer change, and then determines, from each cluster center, which calibration plate the points of that cluster belong to. Classifying the calibration points with a clustering algorithm gives high accuracy.
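Steps A1 to A4 amount to a k-means clustering of the calibration-point pixel coordinates with k equal to the number of calibration plates; a minimal sketch (the random seed and the assumption that no cluster becomes empty are illustrative):

```python
import numpy as np

def assign_points_to_plates(points, n_plates, max_iters=100, seed=0):
    """K-means over calibration-point pixel coordinates (steps A1 to A4):
    pick n_plates random points as initial cluster centers, assign each point
    to its nearest center, recompute centers as cluster means, and repeat
    until the centers stop changing; assumes no cluster becomes empty."""
    rng = np.random.default_rng(seed)
    centers = points[rng.choice(len(points), n_plates, replace=False)]
    labels = np.zeros(len(points), dtype=int)
    for _ in range(max_iters):
        dists = np.linalg.norm(points[:, None, :] - centers[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        new_centers = np.array([points[labels == k].mean(axis=0)
                                for k in range(n_plates)])
        if np.allclose(new_centers, centers):
            break
        centers = new_centers
    return labels, centers
```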
As one embodiment, in step S4 the robot calibrates the external parameters of the camera using the preset pose of a calibration plate relative to the robot's center (which coincides with the jig's center point) and the pose of the camera relative to each calibration plate, as follows: the robot obtains the rotation matrix of the preset pose of the plate relative to the robot's center; the robot obtains the rotation matrix of the camera's pose relative to that plate; the robot then converts these two rotation matrices according to the Euler formula to obtain the external-parameter matrix of the camera. Obtaining the external parameters of the camera through matrix transformation makes the computation fast.
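For illustration, assuming homogeneous-transform conventions (the frame labels "center<-board" and "board<-camera" are hypothetical names) and a Z-Y-X Euler angle convention, the composition could be sketched as:

```python
import numpy as np

def to_homogeneous(R, t):
    """Pack a rotation matrix and translation vector into a 4x4 transform."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def camera_extrinsic(R_center_board, t_center_board, R_board_cam, t_board_cam):
    """Camera pose relative to the robot center (taken to coincide with the
    jig center): T_center<-camera = T_center<-board @ T_board<-camera."""
    T = to_homogeneous(R_center_board, t_center_board) @ \
        to_homogeneous(R_board_cam, t_board_cam)
    R = T[:3, :3]
    yaw = np.arctan2(R[1, 0], R[0, 0])      # Z-Y-X Euler angles recovered
    pitch = np.arcsin(-R[2, 0])             # from the combined rotation
    roll = np.arctan2(R[2, 1], R[2, 2])
    return T, (yaw, pitch, roll)
```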
As one embodiment, before camera calibration the robot obtains the pose of each calibration plate of the calibration jig relative to the jig's center point as follows: the robot keeps the calibration plates within the camera's shooting range, moves to different preset positions to acquire images of the plates, and records IMU data during the movement. The robot identifies calibration points in the acquired images and then computes, with a nonlinear optimization algorithm, the pose of the camera relative to the plates at each preset position according to the identified calibration points. From the IMU data and these camera poses, the robot computes the pose of the camera relative to the robot's center point with a nonlinear optimization algorithm. In detail, the robot derives the camera's initial internal parameters and the initial pose of each plate relative to the robot's center point at the different preset positions from the IMU data; it identifies the calibration points in each plate image and obtains their physical coordinates in the plate's coordinate system; it projects the physical coordinates of the plate's calibration points onto the corresponding images to obtain projection coordinates; it computes the Euclidean distance between each projection coordinate and the pixel coordinate of the corresponding image calibration point; it optimizes the camera's internal parameters from these distances through the Gauss-Newton iterative formula and obtains the optimal pose of the camera relative to each plate; and it obtains the pose of the camera relative to the robot's center point through the Gauss-Newton iterative formula, from the initial plate poses relative to the robot's center point and the camera poses relative to each plate. The robot then moves into the calibration jig formed by the plates and sets its current center-point position as the jig's center point; once set, the jig's center point is not changed when the jig is later used to calibrate the cameras of other robots. The robot orients the camera vertically upward, acquires an image containing all calibration plates of the jig through the camera, and identifies the calibration points in the image. The robot obtains the pose of the camera relative to each plate from the positions of the calibration points on the plates and their pixel positions in the image.
The robot then obtains the pose of each calibration plate of the jig relative to the jig's center point by matrix transformation, from the pose of the camera relative to each plate and the pose of the camera relative to the robot's center point. Because the robot obtains these plate poses before camera calibration, the external parameters of any robot with different external parameters can be computed from them as long as that robot's center point is made to coincide with the jig's center point, which gives high flexibility.
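Under the same assumed frame conventions as above, the plate pose relative to the jig center follows from chaining the two camera poses; a minimal sketch (the frame labels are hypothetical names, not terms from the disclosure):

```python
import numpy as np

def board_pose_in_jig(T_robot_cam, T_board_cam):
    """Pose of one calibration plate relative to the jig center point, given
    the camera pose in the robot frame (from the IMU-assisted pre-calibration)
    and the camera pose in the plate frame, with the robot center defined as
    the jig center:  T_jig<-board = T_robot<-camera @ inv(T_board<-camera)."""
    return T_robot_cam @ np.linalg.inv(T_board_cam)

# Sanity check: if the camera sits at the robot center and the plate frame
# equals the camera frame, the plate pose in the jig frame is the identity.
print(board_pose_in_jig(np.eye(4), np.eye(4)))
```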
Compared with the prior art, the application has the following advantages: when the robot calibrates the internal and external parameters of the camera, the internal parameters are first optimized through the images of the calibration plates, improving the camera's accuracy. The external parameters are then calibrated from the pose of the camera relative to each calibration plate (obtained from the calibration-plate images) and the pose of each calibration plate of the jig relative to the jig's center point, so the robot completes the calibration of both internal and external parameters in a single calibration operation, which shortens calibration time and improves the mass-production efficiency of the robot. Because the plate poses relative to the jig's center point are obtained before camera calibration, the external parameters of robots with different external parameters can be computed from these poses as long as each robot's center point coincides with the jig's center point, which gives high flexibility.
In the description of the present application, references to the terms "one embodiment", "preferred", "example", "specific example" or "some examples" mean that a particular feature, structure, material or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the application; such references in this specification do not necessarily refer to the same embodiment or example. Furthermore, the particular features, structures, materials or characteristics described may be combined in any suitable manner in any one or more embodiments or examples. The connection modes described in the specification have obvious effects and practical effectiveness.
From the above description of the structure and principles, those skilled in the art should understand that the present application is not limited to the above-described embodiments; modifications and substitutions based on the application using techniques known in the art fall within the scope of protection of the application, which is defined by the appended claims.

Claims (11)

1. A camera calibration method, comprising the steps of:
S1: the robot acquires an image of the calibration plates of a calibration jig through a camera, and then identifies calibration points in the image of the calibration plates;
S2: the robot calibrates internal parameters of the camera and obtains the pose of the camera relative to each calibration plate according to the positions of the calibration points on the calibration plates and the pixel positions of the calibration points in the image;
S3: the robot checks the internal-parameter calibration result of the camera; if the result is correct, the method proceeds to step S4, and if it is incorrect, the internal parameters of the camera are calibrated again;
S4: the robot calibrates the external parameters of the camera using the pose of each calibration plate of the calibration jig relative to the center point of the calibration jig and the pose of the camera relative to each calibration plate.
2. The camera calibration method according to claim 1, wherein in step S1 the robot acquires an image containing a plurality of calibration plates through the camera, comprising the following steps:
the robot moves into a calibration jig formed by a plurality of calibration plates, and then makes its center point coincide with the center point of the calibration jig;
the robot orients the camera vertically upward, and then acquires an image containing all calibration plates of the calibration jig through the camera.
3. The camera calibration method according to claim 2, wherein in step S1 the robot identifies calibration points in the image of the calibration plate, comprising the following steps:
the robot detects the quadrilateral black regions in the image of the calibration plate;
the robot connects the detected quadrilateral black regions in clockwise order;
the robot obtains the inner quadrilateral enclosed by the quadrilateral black regions;
the robot takes the four vertices of the inner quadrilateral as the calibration points of the calibration plate through corner detection;
the calibration plate is a checkerboard calibration plate, and each of the four sides of the inner quadrilateral adjoins one of the quadrilateral black regions.
4. The camera calibration method according to claim 3, wherein in step S2 the robot calibrates internal parameters of the camera and obtains the pose of the camera relative to each calibration plate according to the positions of the calibration points on the calibration plates and the pixel positions of the calibration points in the image, comprising the following steps:
the robot obtains the physical coordinates of the calibration points in the physical coordinate system of the calibration plate and the pixel coordinates of the calibration points in the image;
the robot projects the calibration points of each calibration plate into the image according to the initial pose of each calibration plate relative to the center point of the robot and the initial internal parameters of the camera, to obtain the projection coordinates of the calibration points of the calibration plates;
the robot computes, by the Euclidean distance formula, the Euclidean distance between the projection coordinates of each calibration point of the calibration plate and the pixel coordinates of the corresponding calibration point in the image;
the robot optimizes the internal parameters of the camera from the obtained Euclidean distances through the Gauss-Newton iterative formula, to obtain the optimal internal parameters of the camera and the pose of the camera relative to each calibration plate.
5. The camera calibration method according to claim 4, wherein, when the robot optimizes the internal parameters of the camera through the Gauss-Newton iterative formula and the minimum Euclidean distance is obtained during the computation, the pose of each calibration plate relative to the center point of the robot and the internal parameters of the camera corresponding to that minimum Euclidean distance are taken as the optimal internal parameters of the camera and the pose of the camera relative to each calibration plate.
6. The camera calibration method according to claim 4, wherein the robot obtains an initial pose of the calibration plate relative to the center point of the robot, comprising the following steps:
the robot draws, through the corner point of the upper left corner of the calibration plate, a line perpendicular to the calibration plate;
the robot takes the angle between the line perpendicular to the calibration plate and the forward direction of the robot as the angle between the calibration plate and the robot;
the robot obtains the distance between the center point of the robot and the corner point of the upper left corner of the calibration plate;
the robot combines the angle between the calibration plate and the robot with the distance between the center point of the robot and the corner point of the upper left corner of the calibration plate, to obtain the initial pose of the calibration plate relative to the center point of the robot.
7. The camera calibration method according to claim 4, wherein the robot acquires the physical coordinates of the calibration points in the physical coordinate system of the calibration plate, comprising the following steps:
the robot takes the corner point of the upper left corner of each calibration plate as the origin and the two sides of the calibration plate connected to that corner point as the X axis and the Y axis, to construct a physical coordinate system;
the robot acquires the coordinates of each calibration point in the physical coordinate system according to its position on the calibration plate.
8. The camera calibration method according to claim 7, wherein in step S3 the robot checks the internal-parameter calibration result of the camera, comprising the following steps:
the robot classifies the calibration points in the image and determines the calibration plate to which each identified calibration point belongs;
the robot converts the pixel coordinates of the identified calibration points belonging to the same calibration plate into physical coordinates in the physical coordinate system of that calibration plate, using the internal parameters of the camera and the pose of the camera relative to the calibration plate, to obtain the physical coordinates of the calibration points of the image;
the robot acquires the Euclidean distance between the physical coordinates of each calibration point of the image and the physical coordinates of the corresponding calibration point of the calibration plate; if the mean of a set number of these Euclidean distances is larger than a first set threshold, the calibration of the calibration plate is judged incorrect, otherwise it is correct;
the robot fits the calibration points of the same correctly calibrated calibration plate into a plurality of calibration-point lines and then compares the perpendicular distance between adjacent parallel calibration-point lines; if the distance between any two such lines is larger than a second set threshold, the calibration of the calibration plate is judged incorrect, otherwise it is correct;
the robot deletes the incorrectly calibrated calibration plates and the identified calibration points belonging to them, and then calibrates the internal parameters of the camera again.
9. The camera calibration method according to claim 8, wherein in step S2 the robot classifies the identified calibration points and determines the calibration plate to which they belong, comprising the following steps:
A1: the robot randomly selects, as cluster centers, a number of calibration points equal to the number of calibration plates;
A2: the robot obtains the cluster distance between each calibration point and each cluster center, and then assigns the calibration point to the cluster with the minimum cluster distance;
A3: the robot obtains the mean of the image coordinates of the calibration points of each cluster, and then takes the mean as the new cluster center;
A4: the robot repeats steps A2 and A3 until the mean of the image coordinates of the calibration points of each cluster no longer changes, and then determines, from the center of each cluster, the calibration plate to which the calibration points of that cluster belong.
10. The camera calibration method according to claim 8, wherein in step S4 the robot calibrates the external parameters of the camera using the pose of the calibration plate of the calibration jig relative to the center point of the calibration jig and the pose of the camera relative to each calibration plate, comprising the following steps:
the robot obtains the rotation matrix of the pose of a calibration plate of the calibration jig relative to the center point of the calibration jig;
the robot obtains the rotation matrix of the pose of the camera relative to that calibration plate;
the robot converts the rotation matrix of the pose of the calibration plate of the calibration jig relative to the center point of the calibration jig and the rotation matrix of the pose of the camera relative to that calibration plate according to the Euler formula, to obtain the external-parameter matrix of the camera.
11. The camera calibration method according to claim 10, wherein the robot obtains the pose of the calibration plates of the calibration jig relative to the center point of the calibration jig before the camera calibration, comprising the following steps:
the robot keeps the calibration plates within the shooting range of the camera, then moves to different preset positions to acquire images of a plurality of calibration plates, and acquires IMU data during the movement;
the robot identifies calibration points in the acquired images of the calibration plates, and then computes, with a nonlinear optimization algorithm, the pose of the camera relative to the calibration plates at the different preset positions according to the identified calibration points;
the robot computes, with a nonlinear optimization algorithm, the pose of the camera relative to the center point of the robot according to the IMU data and the poses of the camera relative to the calibration plates at the different preset positions;
the robot moves into a calibration jig formed by a plurality of calibration plates, and then sets the current position of the center point of the robot as the center point of the calibration jig;
the robot orients the camera vertically upward, then acquires an image containing all calibration plates of the calibration jig through the camera, and identifies the calibration points in the images of the calibration plates;
the robot obtains the pose of the camera relative to each calibration plate according to the positions of the calibration points on the calibration plates and the pixel positions of the calibration points in the image;
the robot acquires the pose of the calibration plates of the calibration jig relative to the center point of the calibration jig by matrix transformation, according to the pose of the camera relative to each calibration plate and the pose of the camera relative to the center point of the robot.
CN202310830592.9A 2023-07-07 2023-07-07 Camera calibration method Pending CN116934867A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310830592.9A CN116934867A (en) 2023-07-07 2023-07-07 Camera calibration method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310830592.9A CN116934867A (en) 2023-07-07 2023-07-07 Camera calibration method

Publications (1)

Publication Number Publication Date
CN116934867A 2023-10-24

Family

ID=88388805

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310830592.9A Pending CN116934867A (en) 2023-07-07 2023-07-07 Camera calibration method

Country Status (1)

Country Link
CN (1) CN116934867A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN118887301A (en) * 2024-10-08 2024-11-01 名商科技有限公司 A calibration system and method for installing a vehicle-mounted camera sensor
CN118887301B (en) * 2024-10-08 2024-12-20 名商科技有限公司 A calibration system and method for installing a vehicle-mounted camera sensor


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination