
CN113997295A - Hand-eye calibration method and device for mechanical arm, electronic equipment and storage medium

Info

Publication number
CN113997295A
Authority
CN
China
Prior art keywords: hand, eye, parameters, camera, pose
Prior art date
Legal status: Granted
Application number
CN202111639830.5A
Other languages
Chinese (zh)
Other versions
CN113997295B (en)
Inventor
陈万春
邹远兵
周深宁
杨宇
Current Assignee
Hunan Shibite Robot Co Ltd
Original Assignee
Hunan Shibite Robot Co Ltd
Priority date
Filing date
Publication date
Application filed by Hunan Shibite Robot Co Ltd filed Critical Hunan Shibite Robot Co Ltd
Priority to CN202111639830.5A
Publication of CN113997295A
Application granted
Publication of CN113997295B
Current legal status: Active

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B25: HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J: MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00: Programme-controlled manipulators
    • B25J9/16: Programme controls
    • B25J9/1628: Programme controls characterised by the control loop
    • B25J9/1653: Programme controls characterised by the control loop parameters identification, estimation, stiffness, accuracy, error analysis
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B25: HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J: MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00: Programme-controlled manipulators
    • B25J9/16: Programme controls
    • B25J9/1602: Programme controls characterised by the control system, structure, architecture
    • B25J9/161: Hardware, e.g. neural networks, fuzzy logic, interfaces, processor

Landscapes

  • Engineering & Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Physics & Mathematics (AREA)
  • Artificial Intelligence (AREA)
  • Evolutionary Computation (AREA)
  • Fuzzy Systems (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Manipulator (AREA)

Abstract

The invention relates to the technical field of robot vision and provides a hand-eye calibration method and device for a mechanical arm, an electronic device, and a storage medium. The method comprises the following steps: in response to a calibration request, acquiring initialization parameters for hand-eye calibration; starting the mechanical arm according to the initialization parameters and acquiring initial data through a camera arranged at the end of the mechanical arm to obtain initial hand-eye calibration data; performing hand-eye calibration based on the initial hand-eye calibration data to obtain hand-eye parameters; sampling the camera pose to obtain candidate camera poses; and iteratively optimizing the hand-eye parameters based on the initial hand-eye calibration data, the hand-eye parameters, and the candidate camera poses to obtain optimized hand-eye parameters. The method fully automates the hand-eye calibration process of the mechanical arm, reducing labor and time costs; at the same time, the initial hand-eye parameters calibrated from the automatically acquired data are iteratively optimized, so that hand-eye parameters with higher precision are obtained and the calibration accuracy is improved.

Description

Hand-eye calibration method and device for mechanical arm, electronic equipment and storage medium
Technical Field
The present invention relates to the field of robot vision technologies, and in particular to a hand-eye calibration method and apparatus for a robot arm, an electronic device, and a computer-readable storage medium.
Background
Hand-eye calibration is a fairly fundamental and important task in robot vision systems. Hand-eye calibration refers to determining the positional relationship between the robot and the vision system, so that the robot and the vision sensor are unified into one standard coordinate system; it can also be understood as determining the coordinate transformation between the pixel coordinate system and the coordinate system of the manipulator in space. Hand-eye calibration is mainly used for robot positioning, robot grasping and similar work, and its precision directly affects the subsequent work.
Various well-known methods exist for solving the hand-eye calibration problem; however, most related technologies are semi-automatic: multiple groups of photographs must be taken manually before the robot hand-eye calibration is performed. Because of the manual steps involved, each calibration takes a long time and requires several people to cooperate to complete the corresponding tasks, so the labor cost is high; moreover, the calibration accuracy depends heavily on manual experience, and it is difficult for inexperienced operators to obtain hand-eye parameters with high accuracy.
Disclosure of Invention
In view of this, embodiments of the present invention provide a method and an apparatus for calibrating a hand-eye of a robot arm, an electronic device, and a computer-readable storage medium, so as to solve the problem of high time cost caused by semi-automation of hand-eye calibration in the prior art.
In a first aspect of the embodiments of the present invention, a method for calibrating a hand and an eye of a robot arm is provided, including:
responding to the calibration request, and acquiring initialization parameters for hand-eye calibration;
starting the mechanical arm according to the initialization parameters, and acquiring initial data through a camera arranged at the tail end of the mechanical arm to obtain initial hand-eye calibration data;
performing hand-eye calibration based on the initial hand-eye calibration data to obtain hand-eye parameters;
carrying out attitude sampling on the camera to obtain candidate poses of the camera;
and performing iterative optimization on the hand-eye parameters based on the initial hand-eye calibration data, the hand-eye parameters and the candidate camera poses to obtain optimized hand-eye parameters.
In a second aspect of the embodiments of the present invention, there is provided a hand-eye calibration apparatus for a robot arm, including:
the parameter acquisition module is used for responding to the calibration request and acquiring initialization parameters for hand-eye calibration;
the calibration data acquisition module is used for starting the mechanical arm according to the initialization parameters and acquiring initial data through a camera arranged at the tail end of the mechanical arm to obtain initial hand-eye calibration data;
the calibration module is used for carrying out hand-eye calibration based on the initial hand-eye calibration data to obtain hand-eye parameters;
the camera sampling module is used for sampling the postures of the cameras to obtain candidate poses of the cameras;
and the optimization module is used for performing iterative optimization on the hand-eye parameters based on the initial hand-eye calibration data, the hand-eye parameters and the candidate camera poses to obtain optimized hand-eye parameters.
In a third aspect of the embodiments of the present invention, there is provided an electronic device, including a memory, a processor, and a computer program stored in the memory and executable on the processor, the processor implementing the steps of the method when executing the computer program.
In a fourth aspect of the embodiments of the present invention, a computer-readable storage medium is provided, which stores a computer program, and the computer program realizes the steps of the above method when being executed by a processor.
Compared with the prior art, the embodiments of the present invention have the following beneficial effects: when hand-eye calibration is performed on the mechanical arm, the mechanical arm is started according to the acquired initialization parameters in response to a calibration request; data are acquired by a camera arranged at the end of the mechanical arm to obtain initial hand-eye calibration data, and hand-eye calibration is performed based on the initial hand-eye calibration data to obtain hand-eye parameters; candidate camera poses are then sampled, and the hand-eye parameters obtained from the preliminary calibration are iteratively optimized in combination with the candidate camera poses to obtain optimized hand-eye parameters. The method fully automates the hand-eye calibration process of the mechanical arm, reducing labor and time costs; at the same time, the initial hand-eye parameters calibrated from the automatically acquired data are iteratively optimized, so that hand-eye parameters with higher precision are obtained and the calibration accuracy is improved.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present invention, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description are only some embodiments of the present invention; for those skilled in the art, other drawings can be obtained from them without creative effort.
Fig. 1 is a scene schematic diagram of an application scenario provided in an embodiment of the present invention;
fig. 2 is a schematic flowchart of a hand-eye calibration method for a robot provided in an embodiment of the present invention;
FIG. 3 is a schematic diagram of a robot arm for data acquisition according to an embodiment of the present invention;
fig. 4 is a schematic diagram of a camera candidate pose acquisition point provided by an embodiment of the invention;
fig. 5 is a schematic flowchart of a procedure of performing iterative optimization on hand-eye parameters based on initial hand-eye calibration data, the hand-eye parameters and candidate camera poses to obtain optimized hand-eye parameters according to an embodiment of the present invention;
FIG. 6 is a schematic flow chart illustrating a method for calibrating a hand-eye of a robotic arm according to an embodiment of the present invention;
FIG. 7 is a diagram illustrating a transformation relationship between coordinate systems according to an embodiment of the present invention;
fig. 8 is a schematic structural diagram of a hand-eye calibration device of a robot provided in an embodiment of the present invention;
FIG. 9 is a schematic structural diagram of an alternative hand-eye calibration apparatus for a robotic arm according to an embodiment of the present invention;
fig. 10 is a schematic structural diagram of an electronic device according to an embodiment of the present invention.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system structures, techniques, etc. in order to provide a thorough understanding of the embodiments of the invention. It will be apparent, however, to one skilled in the art that the present invention may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present invention with unnecessary detail.
The following describes a hand-eye calibration method and device for a robot arm according to an embodiment of the present invention in detail with reference to the accompanying drawings.
Fig. 1 is a scene diagram of an application scenario according to an embodiment of the present invention. The application scenario may include the terminal device 1, the robot arm 2, the camera 3 arranged at the end of the robot arm, and the network 4.
The terminal device 1 may be hardware or software. When the terminal device 1 is hardware, it may be various electronic devices having a display screen and supporting communication with the robot arm 2, the camera 3, including but not limited to a smart phone, a tablet computer, a laptop portable computer, a desktop computer, and the like; when the terminal device 1 is software, it can be installed in the electronic device as above. The terminal device 1 may be implemented as a plurality of software or software modules, or may be implemented as a single software or software module, which is not limited in this embodiment of the present invention. Further, various applications, such as a data processing application, an instant messaging tool, social platform software, a search-type application, a shopping-type application, and the like, may be installed on the terminal device 1.
The mechanical arm may be any mechanical arm, and the camera may also be any device having a photographing and/or shooting function, which is not limited in this embodiment of the present invention.
The network 4 may be a wired network connected by coaxial cable, twisted pair and optical fiber, or may be a wireless network that interconnects various communication devices without wiring, for example Bluetooth, Near Field Communication (NFC), Infrared, and the like, which is not limited in this embodiment of the present invention.
The user can establish a communication connection with the robot arm 2 and the camera 3 via the network 4 through the terminal device 1 to receive or transmit information and the like. Specifically, the user initiates a calibration request through the terminal device 1, and control commands are sent to the mechanical arm 2 and the camera 3 through the network 4, so that the mechanical arm moves according to the acquired initialization parameters while the camera 3 is controlled to acquire data, obtaining the initial calibration data; the acquired initial calibration data are returned to the terminal device 1 for hand-eye calibration to obtain hand-eye parameters; the camera is then controlled to take a number of poses to obtain candidate camera poses, and the hand-eye parameters are iteratively optimized based on the initial hand-eye calibration data, the hand-eye parameters and the candidate camera poses to obtain optimized hand-eye parameters.
It should be noted that specific types, numbers, and combinations of the terminal device 1, the mechanical arm 2, the camera 3, and the network 4 may be adjusted according to actual requirements of an application scenario, which is not limited in this embodiment of the present invention.
Fig. 2 is a schematic flow chart of a hand-eye calibration method for a robot arm according to an embodiment of the present invention. The hand-eye calibration method of the robot arm of fig. 2 may be performed by the terminal device of fig. 1. As shown in fig. 2, the hand-eye calibration method for the robot arm includes steps S201 to S205:
s201, in response to the calibration request, acquiring initialization parameters for hand-eye calibration.
The calibration request is used for requesting to start hand-eye calibration of the mechanical arm. In one embodiment, the calibration request may be initiated by a person.
The initialization parameters are used for initially controlling the mechanical arm to move, so that the mechanical arm moves to a corresponding position based on the initialization parameters to collect corresponding data, and then hand-eye calibration is carried out according to the collected corresponding data to obtain hand-eye parameters. In an embodiment, the initialization parameter may be set in advance according to an actual situation, and stored in the designated path, and when a calibration request is received, the corresponding initialization parameter is obtained from the designated path.
Further, in one embodiment, the initialization parameters include an end initial pose of the end of the robot arm, the end initial pose corresponding to a point at which the camera at the end of the robot arm can photograph the calibration plate. In the initialization stage of hand-eye calibration, an end pose of the robot arm needs to be specified so that the calibration plate is visible once the end of the robot arm (where the camera is mounted) reaches that pose.
In another embodiment, the initialization parameters further include interval data when the robot arm moves with the initial pose of the end of the robot arm as a starting point, and a plurality of sets of initial hand-eye calibration data can be acquired by moving with the initial pose of the end and the interval data.
S202, starting the mechanical arm according to the initialization parameters, and acquiring initial data through a camera arranged at the tail end of the mechanical arm to obtain initial hand-eye calibration data.
The mechanical arm is a main body which needs to be calibrated by hands and eyes in the embodiment, and the camera is arranged at the tail end of the mechanical arm.
In one embodiment, starting the mechanical arm according to the initialization parameters, and performing initial data acquisition by a camera arranged at the tail end of the mechanical arm to obtain initial hand-eye calibration data, includes: acquiring a tail end initial pose corresponding to the tail end of the mechanical arm; and taking the initial pose at the tail end as a starting point, taking the coordinate axis of a preset coordinate system as a central axis to rotate, and collecting a group of initial hand-eye calibration data at intervals of a preset angle.
In this embodiment, the mechanical arm is started according to the initialization parameters, and is first controlled to move to the point corresponding to the end initial pose to acquire the first group of initial calibration data. Further, the preset angle is the rotation angle required to move the robot arm to the next pose point for data acquisition after the data acquisition at the end initial pose is completed; in this embodiment, the arm is rotated by the preset angle about the coordinate axes of the preset coordinate system. The preset coordinate system may be set according to the actual situation; in a specific embodiment, as shown in fig. 3, the arm is rotated by the preset angle about the abscissa axis and the ordinate axis of the preset coordinate system, so that the robot arm 2 and the end-mounted camera 3 are moved and the calibration data are acquired.
In another embodiment, the mechanical arm is started according to the initialization parameters, and the hand-eye calibration data acquisition by the camera at the end of the mechanical arm can also be realized in other manners, for example, setting the poses of multiple mechanical arm ends, and sequentially controlling the mechanical arm ends to move to the corresponding poses for data acquisition to obtain initial hand-eye calibration data.
By the method, the initialization stage of the mechanical arm can be automated, and initial data collection is not required to be carried out by manually setting point positions.
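As an illustration of this automated acquisition pattern, the following sketch generates end poses by rotating a given end initial pose about the x and y axes at a fixed angular step; the pose representation (4 x 4 homogeneous matrices), the 10-degree step and the angular range are assumptions chosen for illustration rather than values prescribed by the invention.

```python
# Sketch: generate end poses for the initial data acquisition by rotating the
# end initial pose about the x and y axes. All names and numeric values here
# (initial_pose, step_deg, max_deg) are illustrative assumptions.
import numpy as np
from scipy.spatial.transform import Rotation as R

def rotation_about_axis(axis, deg):
    """4x4 homogeneous transform rotating by `deg` degrees about a single axis."""
    T = np.eye(4)
    T[:3, :3] = R.from_euler(axis, deg, degrees=True).as_matrix()
    return T

def initial_acquisition_poses(initial_pose, step_deg=10.0, max_deg=20.0):
    """Start from the end initial pose, then rotate about x and y in +/- steps."""
    poses = [initial_pose]
    for axis in ("x", "y"):
        for deg in np.arange(step_deg, max_deg + 1e-9, step_deg):
            for sign in (+1.0, -1.0):
                poses.append(initial_pose @ rotation_about_axis(axis, sign * deg))
    return poses
```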
And S203, performing hand-eye calibration based on the initial hand-eye calibration data to obtain hand-eye parameters.
The hand-eye calibration is to determine the position relationship between the robot and the vision system, so as to unify the robot and the vision sensor into a standard coordinate system. In the present embodiment, the hand-eye parameters represent the conversion relationship from the camera coordinate system to the robot arm end coordinate system.
After the initial hand-eye calibration data are collected, the hand-eye calibration based on the initial hand-eye calibration data can be realized in any suitable way. For example, the hand-eye calibration problem can be posed as solving an equation of the form AX = XB, where A and B denote the change of the robot arm pose and of the camera pose between two groups of data, and the unknown X is the transformation between the end of the mechanical arm and the camera, i.e. the hand-eye parameter in this embodiment. As another example, the problem can be written in the form AX = YB, where the unknowns are X and Y: X is the transformation between the end of the mechanical arm and the camera, Y is the transformation between the calibration plate and the mechanical arm base, and A and B are the robot arm pose and the camera pose in each group of data; solving AX = YB directly yields the two transformations X and Y. For another example, the hand-eye parameters can also be solved by a "two-step method" in which the rotation part is solved first and then the translation part, and so on.
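As a hedged illustration of the AX = XB route, the sketch below uses OpenCV's calibrateHandEye (here with the Tsai two-step method) to recover the hand-eye transform from the initial hand-eye calibration data; it assumes each group of data supplies the end pose in the base frame and the calibration-board pose in the camera frame as 4 x 4 matrices, and the function and variable names are illustrative rather than the invention's own.

```python
# Sketch: solve the eye-in-hand AX = XB problem with OpenCV's calibrateHandEye.
# end_poses_in_base and board_poses_in_cam are assumed lists of 4x4 transforms
# taken from the same shots; names are illustrative.
import cv2
import numpy as np

def solve_hand_eye(end_poses_in_base, board_poses_in_cam):
    R_g2b = [T[:3, :3] for T in end_poses_in_base]
    t_g2b = [T[:3, 3] for T in end_poses_in_base]
    R_t2c = [T[:3, :3] for T in board_poses_in_cam]
    t_t2c = [T[:3, 3] for T in board_poses_in_cam]
    R_c2g, t_c2g = cv2.calibrateHandEye(R_g2b, t_g2b, R_t2c, t_t2c,
                                        method=cv2.CALIB_HAND_EYE_TSAI)
    X = np.eye(4)                       # camera -> end transform, i.e. the hand-eye parameter
    X[:3, :3], X[:3, 3] = R_c2g, t_c2g.ravel()
    return X
```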
In one embodiment, since the accuracy of the hand-eye parameters obtained by the initial calibration is low, after the hand-eye parameters are obtained by performing the hand-eye calibration based on the initial hand-eye calibration data, the optimization of the hand-eye parameters is further included. In a specific embodiment, the hand-eye parameters may be optimized by a nonlinear minimization method, and in this embodiment, the hand-eye parameters obtained by the optimization method are recorded as intermediate hand-eye parameters.
Further, in a specific embodiment, the hand-eye parameters are optimized by nonlinear minimization of the reprojection error to obtain the intermediate hand-eye parameters. The intermediate hand-eye parameter is represented as the 4 x 4 transformation matrix formed from a quaternion and a translation vector, and it is chosen so as to minimize the distance between each point on the calibration plate and the corresponding point in the photograph after projection; the pose of the camera with respect to the calibration plate used in this projection can be calculated with the EPnP method (proposed by Lepetit) in OpenCV.
It will be appreciated that after the preliminary optimization of the hand-eye parameters, further optimization will be subsequently performed on the basis of the intermediate hand-eye parameters obtained by the preliminary optimization. In the embodiment, the hand-eye parameters are preliminarily optimized in the above manner, so that the accuracy of the hand-eye calibration parameters is improved.
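The nonlinear refinement described above can be instantiated in several ways; the sketch below is one possible version under stated assumptions, jointly refining the hand-eye transform and the calibration-board pose in the base frame by Levenberg-Marquardt minimization of the reprojection error with scipy. The parameterisation and all names are assumptions, not the exact formulation of the patent.

```python
# Sketch: refine the hand-eye transform by minimising the reprojection error of
# the calibration-board points over all groups of data. The 12-vector x packs a
# rotation-vector/translation pair for the hand-eye transform and one for the
# board pose in the base frame; all names are illustrative assumptions.
import cv2
import numpy as np
from scipy.optimize import least_squares

def _to_T(rvec, tvec):
    T = np.eye(4)
    T[:3, :3], _ = cv2.Rodrigues(np.asarray(rvec, dtype=float))
    T[:3, 3] = tvec
    return T

def refine_hand_eye(x0, end_poses_in_base, board_pts_3d, image_pts, K, D):
    """x0: initial guess [rvec_X, t_X, rvec_board, t_board] as a 12-vector."""
    def residuals(x):
        T_ec = _to_T(x[0:3], x[3:6])      # hand-eye: camera pose in the end frame
        T_bt = _to_T(x[6:9], x[9:12])     # calibration board pose in the base frame
        res = []
        for T_be, pts2d in zip(end_poses_in_base, image_pts):
            T_ct = np.linalg.inv(T_be @ T_ec) @ T_bt   # board pose in the camera frame
            rvec, _ = cv2.Rodrigues(T_ct[:3, :3])
            proj, _ = cv2.projectPoints(board_pts_3d, rvec, T_ct[:3, 3], K, D)
            res.append((proj.reshape(-1, 2) - pts2d).ravel())
        return np.concatenate(res)
    sol = least_squares(residuals, x0, method="lm")   # Levenberg-Marquardt
    return _to_T(sol.x[0:3], sol.x[3:6])              # refined hand-eye transform
```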
And S204, carrying out attitude sampling on the camera to obtain a candidate pose of the camera.
Pose sampling of the camera to obtain candidate camera poses means collecting a set of camera positions and orientations as candidates; a better subset of these candidate poses is subsequently selected by calculation for data acquisition, and the hand-eye parameters are iteratively optimized according to the collected data, thereby improving the accuracy of the hand-eye parameters.
In one embodiment, sampling the pose of the camera to obtain candidate poses of the camera includes: and taking the coordinate axis origin of the coordinate system where the calibration plate is located as a starting point, moving and sampling the camera pose in each sampling plane at preset intervals by a preset rotation angle to obtain the candidate pose of the camera.
The calibration plate is the board used when performing the hand-eye calibration. In one particular embodiment, an AprilTag calibration board may be used: the camera pose can be estimated with the EPnP algorithm by detecting corner points in the photographs and matching them to the corresponding corner points on the calibration board. Since an AprilTag board does not require all points to appear in the camera field of view, and partial occlusion does not affect the overall calibration, its applicability is wider.
The sampling planes can be set according to the actual situation. Fig. 4 is a schematic diagram of the camera candidate pose acquisition points in one embodiment. In a specific embodiment, the sampling rule for camera pose sampling is as follows: taking the origin of the coordinate axes of the calibration plate as the starting point, a series of planes above the calibration plate is defined, whose z values are separated by a certain distance (for example, 0.1 m) and whose x and y values range over a preset region. The dots shown in the figure are the sampled displacement values, obtained by stepping in the x and y directions at a preset interval (for example, 5 cm). In addition to the displacement values, the rotation part is also sampled: if the same rotation vector were used everywhere, the accuracy of the hand-eye calibration would be directly affected, so several rotation vectors are sampled; in a specific embodiment, five Euler angles can be sampled on each of the x, y and z axes.
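The sampling rule can be illustrated with the sketch below, which enumerates positions on a few planes above the calibration board and combines each with a small set of single-axis Euler-angle rotations; the plane heights, grid spacing and angles are example values only.

```python
# Sketch: enumerate candidate camera poses expressed in the calibration-board
# frame. All numeric values (z_values, xy_range, xy_step, euler_deg) are
# illustrative placeholders, not figures prescribed by the patent.
import numpy as np
from scipy.spatial.transform import Rotation as R

def sample_candidate_camera_poses(z_values=(0.3, 0.4, 0.5),
                                  xy_range=0.15, xy_step=0.05,
                                  euler_deg=(-20.0, -10.0, 0.0, 10.0, 20.0)):
    # Rotation samples: the identity plus a few Euler angles about each single axis.
    rotations = [R.identity()]
    for axis in "xyz":
        rotations.extend(R.from_euler(axis, deg, degrees=True) for deg in euler_deg)
    grid = np.arange(-xy_range, xy_range + 1e-9, xy_step)
    poses = []
    for z in z_values:                 # planes above the board, spaced in z
        for x in grid:
            for y in grid:
                for rot in rotations:  # combine each position with each rotation sample
                    T = np.eye(4)
                    T[:3, :3] = rot.as_matrix()
                    T[:3, 3] = (x, y, z)
                    poses.append(T)
    return poses
```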
Further, in one embodiment, after sampling the camera pose, the method further includes optimizing the camera pose. The optimization can be expressed as:

    T* = argmin_(q, t) Σ_i || p_i - proj(K, D, T(q, t), P_i) ||^2    (1)

where P_i is a point on the calibration plate and p_i is the corresponding point in the photograph; K denotes the camera intrinsic parameters and D the camera distortion coefficients; T(q, t) is the 4 x 4 transformation matrix formed from the quaternion q and the translation vector t; and proj(·) projects the 3D points of the calibration plate (z = 0) onto the picture.
In this embodiment, the correspondences between the points P_i on the calibration plate and the points p_i in the photograph are first used to solve the pose of the camera with respect to the calibration board with the EPnP method in OpenCV; Levenberg-Marquardt nonlinear optimization is then used to minimize the reprojection error. With the technical scheme in the embodiment of the invention, a more accurate camera pose is obtained through this nonlinear optimization.
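A minimal sketch of this two-step camera pose estimation with OpenCV follows: an initial pose from EPnP, then Levenberg-Marquardt refinement of the reprojection error. K and dist stand for the camera intrinsic matrix and distortion coefficients; the function name and defaults are illustrative.

```python
# Sketch: camera pose with respect to the calibration board, estimated with EPnP
# and refined with Levenberg-Marquardt reprojection-error minimisation in OpenCV.
import cv2
import numpy as np

def board_camera_pose(board_pts_3d, image_pts, K, dist):
    """board_pts_3d: Nx3 board points (z = 0); image_pts: Nx2 detected corners."""
    ok, rvec, tvec = cv2.solvePnP(board_pts_3d, image_pts, K, dist,
                                  flags=cv2.SOLVEPNP_EPNP)
    if not ok:
        raise RuntimeError("EPnP failed")
    rvec, tvec = cv2.solvePnPRefineLM(board_pts_3d, image_pts, K, dist, rvec, tvec)
    T = np.eye(4)
    T[:3, :3], _ = cv2.Rodrigues(rvec)
    T[:3, 3] = tvec.ravel()
    return T   # pose of the calibration board in the camera frame
```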
S205, iterative optimization is carried out on the hand-eye parameters based on the initial hand-eye calibration data, the hand-eye parameters and the candidate camera poses, and optimized hand-eye parameters are obtained.
In one embodiment, when the hand-eye parameters are iteratively optimized, the iteration can be stopped when the iteration termination condition is reached. Further, the iteration termination condition may be set according to actual conditions, for example, the iteration termination condition may be set to stop the iteration when a threshold number of iterations is reached, and the like. In this embodiment, the hand-eye parameters after the iterative optimization are recorded as optimized hand-eye parameters.
The specific process of iteratively optimizing the hand-eye parameters will be described in detail in the subsequent embodiments.
According to the technical scheme provided by the embodiment of the invention, when hand-eye calibration is performed on the mechanical arm, the mechanical arm is started according to the acquired initialization parameters in response to a calibration request; data are acquired by the camera arranged at the end of the mechanical arm to obtain initial hand-eye calibration data, and hand-eye calibration is performed based on the initial hand-eye calibration data to obtain hand-eye parameters; candidate camera poses are then sampled, and the hand-eye parameters obtained from the preliminary calibration are iteratively optimized in combination with the candidate camera poses to obtain optimized hand-eye parameters. The method fully automates the hand-eye calibration process of the mechanical arm, reducing labor and time costs; at the same time, the initial hand-eye parameters calibrated from the automatically acquired data are iteratively optimized, so that hand-eye parameters with higher precision are obtained and the calibration accuracy is improved.
In some embodiments, as shown in fig. 5, the iterative optimization of the hand-eye parameters based on the initial hand-eye calibration data, the hand-eye parameters, and the candidate camera poses to obtain optimized hand-eye parameters includes steps S501 to S507:
s501, obtaining a predicted end pose of the tail end of the mechanical arm according to the initial hand-eye calibration data, the hand-eye parameters and the candidate camera poses.
In this embodiment, the pose of the end of the mechanical arm is predicted by the initial hand-eye calibration data, the hand-eye parameters, and the candidate poses of the camera.
In one embodiment, the predicted end pose of the end of the mechanical arm is obtained by composing the available transformations. For the i-th group of data, the initial hand-eye calibration data provide the pose of the end of the mechanical arm in the base coordinate system of the mechanical arm, and the external parameters of the camera give the pose of the camera with respect to the calibration plate; together with the hand-eye parameters, these determine the pose of the calibration plate in the base coordinate system. Combining that pose with a candidate camera pose and the hand-eye parameters then yields the predicted end pose determined from the i-th group of initial hand-eye calibration data, the hand-eye parameters and the candidate camera pose.
S502, calculating uncertainty parameters based on the predicted end pose, and determining generalization parameters based on the predicted end pose.
In the present embodiment, an uncertainty parameter is defined to describe the uncertainty of the hand-eye parameters. In one specific embodiment, the uncertainty parameter is calculated from the predicted end poses as follows: for every pair among the n groups of initial hand-eye calibration data, the predicted end poses determined from the i-th and j-th groups are compared by taking the moduli of the differences between the quaternions corresponding to their rotation parts and between their translation vectors; these pairwise differences are weighted by preset parameters and accumulated over all pairs into a variance-like measure of how much the predicted end poses disagree, which is taken as the uncertainty parameter. The preset parameters can be adjusted according to the actual situation.
In another embodiment, a generalization parameter is further defined so that several different groups of hand-eye calibration data are acquired and the hand-eye parameters calibrated from these data generalize better. In one embodiment, the generalization parameter is determined from the predicted end poses as follows: each predicted end pose is compared with the end poses of the mechanical arm in the base coordinate system already recorded in the initial hand-eye calibration data, and whenever the quaternion difference and the displacement-vector difference both fall below their respective thresholds, a penalty is added. The uncertainty parameter is thus defined according to the variance, while the generalization parameter is a loss function.
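The sketch below shows one possible computation of the two quantities under the assumptions stated above: a variance-like spread of the predicted end poses for the uncertainty parameter, and a threshold-based penalty against already-acquired end poses for the generalization parameter; the weight, thresholds and penalty value are placeholders.

```python
# Sketch: a variance-like uncertainty measure over pairs of predicted end poses,
# and a threshold-based generalization penalty against already-acquired poses.
# w_rot, q_thresh, t_thresh and penalty are illustrative placeholders.
import numpy as np
from scipy.spatial.transform import Rotation as R

def _quat_trans(T):
    return R.from_matrix(T[:3, :3]).as_quat(), T[:3, 3]

def uncertainty(predicted_poses, w_rot=1.0):
    n = len(predicted_poses)
    qt = [_quat_trans(T) for T in predicted_poses]
    acc = 0.0
    for i in range(n):
        for j in range(n):
            dq = np.linalg.norm(qt[i][0] - qt[j][0])   # quaternion difference (modulus)
            dt = np.linalg.norm(qt[i][1] - qt[j][1])   # translation difference (modulus)
            acc += w_rot * dq ** 2 + dt ** 2
    return acc / (n * n)

def generalization_penalty(predicted_pose, acquired_end_poses,
                           q_thresh=0.05, t_thresh=0.05, penalty=1.0):
    q_p, t_p = _quat_trans(predicted_pose)
    loss = 0.0
    for T in acquired_end_poses:
        q_a, t_a = _quat_trans(T)
        if np.linalg.norm(q_p - q_a) < q_thresh and np.linalg.norm(t_p - t_a) < t_thresh:
            loss += penalty   # predicted pose too close to an existing acquisition point
    return loss
```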
And S503, scoring the candidate poses of each camera based on the initial hand-eye calibration data, the hand-eye parameters, the uncertainty parameters and the generalization parameters to obtain a scoring result.
Multiple predicted end poses can be computed from the multiple groups of initial hand-eye calibration data; since there are errors in the calibration of the hand-eye parameters and of the camera external parameters, the calculated predicted end poses also contain errors.
Hand-eye calibration needs to acquire a number of different data so that, with sufficient data, the hand-eye parameter estimation generalizes well. To ensure that the acquired hand-eye calibration data have a certain generalization, in this embodiment a penalty is applied whenever the distance between a predicted end pose and an actually acquired pose of the end of the mechanical arm in the base coordinate system is smaller than a certain threshold; this penalty is exactly the generalization parameter described above. The distance may be calculated in any manner, such as the L1 distance or the L2 distance.
Since many candidate camera poses are sampled in the early stage, performing data acquisition for every candidate camera pose would make the calibration calculation slow due to the excessive amount of data. Therefore, in this embodiment a score is calculated for each candidate camera pose, and the candidate poses are accepted or rejected based on the scoring result, so that the calibration efficiency is improved as much as possible while the hand-eye calibration accuracy is ensured.
In one embodiment, each candidate camera pose is scored based on the initial hand-eye calibration data, the hand-eye parameters, the uncertainty parameter and the generalization parameter. The relevant coordinate systems are the base coordinate system of the mechanical arm, the end coordinate system of the mechanical arm, the camera coordinate system and the calibration board coordinate system. For each candidate camera pose, the predicted end poses are computed from the poses of the end of the mechanical arm in the base coordinate system contained in the initial hand-eye calibration data, the external parameters of the camera, the hand-eye parameters and the candidate camera pose; the scoring result is then obtained by combining the uncertainty parameter, weighted by a first adjustment parameter, with the generalization parameter, weighted by a second adjustment parameter.
The first adjustment parameter and the second adjustment parameter can be set according to the actual situation.
According to the technical scheme provided by the embodiment of the invention, introducing the penalty term of the generalization parameter reduces the chance that the mechanical arm stays at the same position while hand-eye calibration data are collected, thereby increasing the generalization of the hand-eye calibration.
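Putting the pieces together, the sketch below scores every candidate camera pose and picks the next acquisition point. It reuses the predicted_end_pose, uncertainty and generalization_penalty helpers sketched earlier, and the particular combination rule (uncertainty weighted by a first parameter minus the penalty weighted by a second parameter) is an assumption about the scoring formula, not a formula taken from the patent.

```python
# Sketch: score candidate camera poses and choose the next acquisition point.
# Assumes predicted_end_pose, uncertainty and generalization_penalty from the
# earlier sketches are available; alpha and beta play the role of the first and
# second adjustment parameters.
import numpy as np

def score_candidates(candidates, calib_sets, hand_eye, alpha=1.0, beta=1.0):
    """calib_sets: list of (T_be_i, T_ct_i) pairs from the initial calibration data."""
    scores = []
    for T_ct_cand in candidates:
        preds = [predicted_end_pose(T_be, T_ct, hand_eye, T_ct_cand)
                 for T_be, T_ct in calib_sets]
        u = uncertainty(preds)
        g = sum(generalization_penalty(p, [T_be for T_be, _ in calib_sets])
                for p in preds)
        scores.append(alpha * u - beta * g)   # high uncertainty rewarded, closeness penalised
    return np.asarray(scores)

def pick_target_pose(candidates, scores):
    return candidates[int(np.argmax(scores))]  # highest-scoring pose is sampled next
```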
And S504, determining the target camera pose from the candidate camera poses according to the grading result.
In one embodiment, selecting the target camera pose according to the scoring result includes taking the candidate camera pose with the highest score as the next data acquisition pose point, i.e. the target camera pose. In another embodiment, the several candidate camera poses with the highest scores can be selected from the scoring results as target camera poses, serving as the next several data acquisition pose points.
And S505, controlling the mechanical arm to move according to the pose of the target camera in sequence, and acquiring data to obtain second hand-eye calibration data corresponding to the pose of the target camera.
After the candidate camera poses have been scored and the target camera pose has been selected from them according to the scoring result, the mechanical arm is controlled to move to the position corresponding to the target camera pose for data acquisition; in this embodiment, the data acquired at the target camera pose are recorded as the second hand-eye calibration data. It should be noted that the terms "first" and "second" in this embodiment are used only for distinction and naming, and do not indicate any actual meaning.
And S506, optimizing the hand-eye parameters based on the second hand-eye calibration data to obtain intermediate optimized hand-eye parameters.
After the candidate camera poses have been screened and the mechanical arm has been controlled to move to the selected target camera pose for data acquisition, the hand-eye parameters are estimated again according to the second hand-eye calibration data acquired this time, which yields the intermediate optimized hand-eye parameters. It should be noted that the process of re-estimating the hand-eye parameters from the second hand-eye calibration data is similar to the process of calibrating the hand-eye parameters from the initial hand-eye calibration data, and is not repeated here.
And S507, returning to the step of obtaining the predicted end pose of the end of the mechanical arm according to the initial hand-eye calibration data, the hand-eye parameters and the candidate camera poses, until the iteration termination condition is met, and obtaining the optimized hand-eye parameters.
Furthermore, on the basis of the intermediate optimized hand-eye parameters, the still-unselected candidate camera poses can be screened according to these parameters to obtain the next target camera pose; data acquisition and estimation of the hand-eye parameters are then performed again, and the hand-eye parameters are continuously updated and iterated until the iteration stops, at which point the resulting hand-eye parameters are recorded as the optimized hand-eye parameters.
In one embodiment, the iteration termination condition for the iteration termination may be set according to actual conditions, for example, the iteration termination condition may be set to stop the iteration when a threshold number of iterations is reached, and the like.
According to the technical scheme provided by the embodiment of the invention, after the initial hand-eye parameters have been calibrated from the initial hand-eye calibration data, the uncertainty parameter and the generalization parameter are calculated from the initial hand-eye parameters, the initial hand-eye calibration data and other information. Each time, one target camera pose is screened from the sampled candidate camera poses as the next data acquisition point; the data collected there are used to re-estimate intermediate optimized hand-eye parameters, and the next target camera pose is then screened according to the intermediate optimized hand-eye parameters for further data acquisition. The hand-eye parameters are iteratively optimized in the same way until the iteration termination condition is met and the final optimized hand-eye parameters are obtained. The uncertainty and generalization parameters guide the selection of the next data acquisition point and screen out part of the candidate camera poses, which reduces the time required for calibration and improves the efficiency of the calibration calculation, while the continuous iterative optimization yields hand-eye parameters with higher precision.
All the above optional technical solutions may be combined arbitrarily to form optional embodiments of the present application, and are not described herein again.
In an embodiment, the method for calibrating the hand-eye of the mechanical arm is described in an embodiment, as shown in fig. 6, and includes two stages, where the first stage includes initial hand-eye calibration and camera candidate pose sampling, and the second stage includes acquisition of data by walking the mechanical arm and iterative optimization of hand-eye parameters.
In the first stage, when the initial hand-eye calibration is performed, the end of the mechanical arm is controlled, according to the acquired initialization parameters, to reach the end initial pose corresponding to those parameters and a group of data is acquired; the mechanical arm is then rotated about the x axis and the y axis of the preset coordinate system according to the initialization parameters, and a group of data is recorded every 10 degrees of rotation. In one embodiment, in order not to affect the accuracy of the initial hand-eye calibration, only two groups of data are taken about the x axis and the y axis respectively, namely the data at the maximum positive and negative rotation angles at which the calibration plate can still be seen; together with the initial data, 5 groups of data are collected in total, i.e. the initial hand-eye calibration data in the above embodiment. Initial hand-eye calibration is then performed according to the initial hand-eye calibration data. FIG. 7 is a schematic diagram of the transformations among the base coordinate system of the mechanical arm, the end coordinate system of the mechanical arm, the camera coordinate system and the calibration board coordinate system.
After the initial hand-eye calibration, the accuracy of the initially calibrated hand-eye parameters is not high because there are only 5 groups of data. Because the space of possible camera poses is large, the camera pose is sampled next: camera pose sampling selects a series of camera poses in that space as candidates, i.e. the candidate camera poses in this embodiment. For the specific camera pose sampling rule, please refer to the foregoing embodiments, which is not repeated here.
After sampling is completed, the camera poses need to be screened so that the camera can see the calibration plate (AprilTag board): points on the edge of the calibration plate are projected onto the picture to check whether they exceed the image border; to ensure the accuracy of the camera external parameter estimation, at least half of the tags (targets) must appear in the camera view.
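One possible form of this visibility screening is sketched below: the calibration-board points are projected into a candidate camera pose with cv2.projectPoints and the pose is kept only when a sufficient fraction of them lands inside the image. The image size and the 50% threshold follow the "half of the tags must appear" remark but are otherwise illustrative.

```python
# Sketch: keep a candidate camera pose only if enough calibration-board points
# project inside the image. T_ct_cand is the board pose in the candidate camera
# frame; K and dist are the camera intrinsics and distortion coefficients.
import cv2
import numpy as np

def board_visible(T_ct_cand, board_pts_3d, K, dist,
                  image_size=(1920, 1080), min_fraction=0.5):
    rvec, _ = cv2.Rodrigues(T_ct_cand[:3, :3])
    proj, _ = cv2.projectPoints(board_pts_3d, rvec, T_ct_cand[:3, 3], K, dist)
    uv = proj.reshape(-1, 2)
    w, h = image_size
    inside = (uv[:, 0] >= 0) & (uv[:, 0] < w) & (uv[:, 1] >= 0) & (uv[:, 1] < h)
    return inside.mean() >= min_fraction
```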
The second stage: after initialization is completed, low-precision hand-eye parameters already exist, and the operations of moving the camera, collecting data and updating the hand-eye parameters are then performed iteratively. When moving the camera iteratively, the camera walking points are selected by scoring each of the sampled candidate camera poses, i.e. the target camera pose in the above embodiment: the candidate pose with the highest score is the next point at which to acquire data, and a higher score means higher uncertainty in the corresponding region, so the next selection of the end pose of the mechanical arm is biased towards that region. The pose corresponding to the end of the robot arm is then calculated, the robot arm is driven to the target camera pose using inverse kinematics, data are acquired (the second hand-eye calibration data in the above embodiment), and the optimized hand-eye parameters are calculated. In this embodiment, initial hand-eye parameters are estimated from the initially acquired data; the candidate camera poses are screened according to the initial hand-eye parameters (by calculating the score of each candidate camera pose) to obtain a target camera pose; moving the mechanical arm to the target camera pose and collecting data once allows the hand-eye parameters to be updated once; the next target camera pose is then screened, the mechanical arm is moved again to collect data, the hand-eye parameters are updated again from the collected data, and so on, until the iteration terminates and the final optimized hand-eye parameters are obtained. By scoring the candidate camera poses, better candidate camera poses can be selected for data acquisition, the hand-eye parameters can be updated better and faster, and hand-eye parameters with higher precision are obtained.
The specific process of scoring the candidate poses of the camera can be referred to in the description of the foregoing embodiments.
Through the technical scheme in this embodiment, a fully automatic hand-eye calibration program for the mechanical arm is provided: once the initial sampling parameters are specified and the interfaces between the robot and the camera are completed, the hand-eye parameters can be obtained automatically by letting the robot move, photograph and calculate on its own. Camera poses are sampled in a specified space, and the walking points are chosen, via scoring, in the way that most reduces the uncertainty of the hand-eye parameters, which improves the precision of the hand-eye calibration.
The following are embodiments of the apparatus of the present invention that may be used to perform embodiments of the method of the present invention. For details which are not disclosed in the embodiments of the apparatus of the present invention, reference is made to the embodiments of the method of the present invention.
Fig. 8 is a schematic view of a hand-eye calibration device of a robot provided in an embodiment of the present invention. As shown in fig. 8, the hand-eye calibration apparatus of the robot arm includes:
a parameter obtaining module 801, configured to obtain an initialization parameter for hand-eye calibration in response to a calibration request;
a calibration data acquisition module 802, configured to start the mechanical arm according to the initialization parameter, and acquire initial data through a camera arranged at the end of the mechanical arm to obtain initial hand-eye calibration data;
a calibration module 803, configured to perform hand-eye calibration based on the initial hand-eye calibration data to obtain hand-eye parameters;
the camera sampling module 804 is used for sampling the gesture of the camera to obtain candidate poses of the camera;
and the optimization module 805 is configured to perform iterative optimization on the hand-eye parameters based on the initial hand-eye calibration data, the hand-eye parameters, and the candidate camera poses to obtain optimized hand-eye parameters.
According to the technical scheme provided by the embodiment of the invention, when hand-eye calibration is performed on the mechanical arm, the mechanical arm is started according to the acquired initialization parameters in response to a calibration request; data are acquired by the camera arranged at the end of the mechanical arm to obtain initial hand-eye calibration data, and hand-eye calibration is performed based on the initial hand-eye calibration data to obtain hand-eye parameters; candidate camera poses are then sampled, and the hand-eye parameters obtained from the preliminary calibration are iteratively optimized in combination with the candidate camera poses to obtain optimized hand-eye parameters. The hand-eye calibration process of the whole mechanical arm is thus fully automated, reducing labor and time costs; at the same time, the initial hand-eye parameters calibrated from the automatically acquired data are iteratively optimized, so that hand-eye parameters with higher precision are obtained and the calibration accuracy is improved.
In some embodiments, as shown in fig. 9, the optimization module of the above apparatus comprises:
the predicting unit 901 is configured to obtain a predicted end pose of the end of the mechanical arm according to the initial hand-eye calibration data, the hand-eye parameters, and the candidate camera poses;
a parameter determination unit 902 for calculating uncertainty parameters based on the predicted end pose, and determining generalization parameters based on the predicted end pose;
the scoring unit 903 is used for scoring the candidate poses of each camera based on the initial hand-eye calibration data, the hand-eye parameters, the uncertainty parameters and the generalization parameters to obtain a scoring result;
a selecting unit 904 for determining a target camera pose from the camera candidate poses according to the scoring result;
the data acquisition unit 905 is used for controlling the mechanical arm to move in sequence according to the pose of the target camera, and acquiring data to obtain second hand-eye calibration data corresponding to the pose of the target camera;
and the optimizing unit 906 is configured to optimize the hand-eye data based on the second hand-eye calibration data to obtain an intermediate optimized hand-eye parameter.
And a judging unit 907, when the iteration termination condition is met, stopping iteration to obtain optimized hand-eye parameters.
It should be understood that, the sequence numbers of the steps in the foregoing embodiments do not imply an execution sequence, and the execution sequence of each process should be determined by its function and inherent logic, and should not constitute any limitation to the implementation process of the embodiments of the present invention.
Fig. 10 is a schematic diagram of an electronic device 10 according to an embodiment of the present invention. As shown in fig. 10, the electronic apparatus 10 of this embodiment includes: a processor 1001, a memory 1002, and a computer program 1003 stored in the memory 1002 and executable on the processor 1001. The steps in the various method embodiments described above are implemented when the processor 1001 executes the computer program 1003. Alternatively, the processor 1001 realizes the functions of each module/unit in each device embodiment described above when executing the computer program 1003.
Illustratively, the computer program 1003 may be divided into one or more modules/units, which are stored in the memory 1002 and executed by the processor 1001 to implement the present invention. One or more of the modules/units may be a series of computer program instruction segments capable of performing certain functions, which are used to describe the execution of the computer program 1003 in the electronic device 10.
The electronic device 10 may be a desktop computer, a notebook, a palm top computer, a cloud server, or other electronic devices. The electronic device 10 may include, but is not limited to, a processor 1001 and a memory 1002. Those skilled in the art will appreciate that fig. 10 is merely an example of an electronic device 10 and does not constitute a limitation of the electronic device 10 and may include more or fewer components than shown, or some components may be combined, or different components, e.g., the electronic device may also include input-output devices, network access devices, buses, etc.
The Processor 1001 may be a Central Processing Unit (CPU), other general purpose Processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other Programmable logic device, discrete Gate or transistor logic device, discrete hardware component, or the like. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
The memory 1002 may be an internal storage unit of the electronic device 10, for example, a hard disk or a memory of the electronic device 10. The memory 1002 may also be an external storage device of the electronic device 10, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash memory Card (Flash Card), and the like provided on the electronic device 10. Further, the memory 1002 may also include both the internal storage units and the external storage devices of the electronic device 10. The memory 1002 is used for storing computer programs and other programs and data required by the electronic device. The memory 1002 may also be used to temporarily store data that has been output or is to be output.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-mentioned division of the functional units and modules is illustrated, and in practical applications, the above-mentioned function distribution may be performed by different functional units and modules according to needs, that is, the internal structure of the apparatus is divided into different functional units or modules, so as to perform all or part of the functions described above. Each functional unit and module in the embodiments may be integrated in one processing unit, or each unit may exist alone physically, or two or more units are integrated in one unit, and the integrated unit may be implemented in a form of hardware, or in a form of software functional unit. In addition, specific names of the functional units and modules are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the present application. The specific working processes of the units and modules in the system may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the above embodiments, the description of each embodiment has its own emphasis; for parts that are not described or illustrated in detail in one embodiment, reference may be made to the related descriptions of other embodiments.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
In the embodiments provided by the present invention, it should be understood that the disclosed apparatus/electronic device and method may be implemented in other ways. For example, the apparatus/electronic device embodiments described above are merely illustrative; the division into modules or units is only a division by logical function, and other divisions are possible in actual implementation, for example multiple units or components may be combined or integrated into another system, or some features may be omitted or not implemented. In addition, the mutual coupling, direct coupling or communication connection shown or discussed may be an indirect coupling or communication connection through some interfaces, devices or units, and may be electrical, mechanical or in other forms.
Units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, the functional units in the embodiments of the present invention may be integrated into one processing unit, each unit may exist physically alone, or two or more units may be integrated into one unit. The integrated unit may be implemented in the form of hardware or in the form of a software functional unit.
The integrated modules/units, if implemented in the form of software functional units and sold or used as independent products, may be stored in a computer-readable storage medium. Based on such understanding, the present invention implements all or part of the processes in the methods of the above embodiments by means of a computer program instructing related hardware; the computer program may be stored in a computer-readable storage medium, and when the computer program is executed by a processor, the steps of the method embodiments described above are implemented. The computer program may comprise computer program code, which may be in the form of source code, object code, an executable file, some intermediate form, or the like. The computer-readable medium may include any entity or device capable of carrying the computer program code, a recording medium, a USB flash drive, a removable hard disk, a magnetic disk, an optical disk, a computer memory, a Read-Only Memory (ROM), a Random Access Memory (RAM), an electrical carrier signal, a telecommunication signal, a software distribution medium, and the like. It should be noted that the content contained in the computer-readable medium may be appropriately increased or decreased according to the requirements of legislation and patent practice in the jurisdiction; for example, in some jurisdictions, according to legislation and patent practice, computer-readable media do not include electrical carrier signals and telecommunication signals.
The above embodiments are only intended to illustrate the technical solutions of the present invention, not to limit them. Although the present invention has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art will understand that the technical solutions described in the foregoing embodiments may still be modified, or some of their technical features may be equivalently replaced; such modifications and replacements do not cause the essence of the corresponding technical solutions to depart from the spirit and scope of the technical solutions of the embodiments of the present invention, and are intended to be included within the protection scope of the present invention.

Claims (9)

1. A hand-eye calibration method of a mechanical arm is characterized by comprising the following steps:
responding to the calibration request, and acquiring initialization parameters for hand-eye calibration;
starting the mechanical arm according to the initialization parameters, and acquiring initial data through a camera arranged at the tail end of the mechanical arm to obtain initial hand-eye calibration data;
performing hand-eye calibration based on the initial hand-eye calibration data to obtain hand-eye parameters;
carrying out attitude sampling on the camera to obtain candidate poses of the camera;
performing iterative optimization on the hand-eye parameters based on the initial hand-eye calibration data, the hand-eye parameters and the candidate camera poses to obtain optimized hand-eye parameters;
wherein the obtaining of the optimized hand-eye parameters comprises:
obtaining a predicted end pose of the tail end of the mechanical arm according to the initial hand-eye calibration data, the hand-eye parameters and the candidate camera poses;
calculating uncertainty parameters based on the predicted end pose, determining generalization parameters based on the predicted end pose;
based on the initial hand-eye calibration data, the hand-eye parameters, the uncertainty parameters and the generalization parameters, scoring each camera candidate pose to obtain a scoring result;
determining a target camera pose from the camera candidate poses according to the grading result;
controlling the mechanical arm to move in sequence according to the target camera pose, and acquiring data to obtain second hand-eye calibration data corresponding to the target camera pose;
optimizing the hand-eye parameters based on the second hand-eye calibration data to obtain intermediate optimized hand-eye parameters;
and returning to the step of obtaining the predicted end pose of the tail end of the mechanical arm according to the initial hand-eye calibration data, the hand-eye parameters and the candidate camera poses until an iteration termination condition is met, and obtaining the optimized hand-eye parameters.
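As an informal illustration of the iteration recited in claim 1, the following Python sketch outlines one possible structure of the loop: predict end poses for each candidate camera pose, score the candidates, acquire new calibration data at the best-scoring pose, and re-optimize the hand-eye parameters. The function names (predict_end_poses, score_candidate, collect_calibration_data, optimize_hand_eye), the fixed iteration count used as the termination condition, and the use of the highest score as the selection rule are illustrative assumptions, not part of the claimed method.

```python
import numpy as np


def iterative_hand_eye_optimization(initial_data, hand_eye, candidate_poses,
                                    predict_end_poses, score_candidate,
                                    collect_calibration_data, optimize_hand_eye,
                                    max_iters=10):
    """Illustrative outline of the iterative refinement recited in claim 1.

    All callables are user-supplied placeholders (assumed interfaces):
      predict_end_poses(data, hand_eye, cand)  -> predicted end poses for one candidate
      score_candidate(data, hand_eye, preds)   -> scalar score (higher is better)
      collect_calibration_data(cand)           -> one new group of hand-eye calibration data
      optimize_hand_eye(hand_eye, data)        -> refined hand-eye parameters
    """
    data = list(initial_data)
    for _ in range(max_iters):  # a fixed count stands in for the termination condition
        # Score every candidate camera pose with the current hand-eye estimate.
        scores = []
        for cand in candidate_poses:
            preds = predict_end_poses(data, hand_eye, cand)
            scores.append(score_candidate(data, hand_eye, preds))
        # Move to the best-scoring candidate and acquire new calibration data there.
        target = candidate_poses[int(np.argmax(scores))]
        data.append(collect_calibration_data(target))
        # Re-optimize the hand-eye parameters with the enlarged data set.
        hand_eye = optimize_hand_eye(hand_eye, data)
    return hand_eye
```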
2. The hand-eye calibration method for a mechanical arm according to claim 1, wherein the scoring of each candidate pose of the camera based on the initial hand-eye calibration data, the hand-eye parameters, the uncertainty parameters and the generalization parameters to obtain a scoring result is expressed by a scoring formula and an auxiliary formula that are published only as images in the original document and are not reproduced here; in these formulas, the symbols respectively represent: the base coordinate system of the mechanical arm, the coordinate system of the tail end of the mechanical arm, the camera coordinate system and the calibration plate coordinate system; the scoring result; the pose, in the initial hand-eye calibration data, of the tail end of the mechanical arm in the base coordinate system of the mechanical arm; the external parameters of the camera; the hand-eye parameters; the candidate pose of the camera; the predicted end pose determined from the i-th group of initial hand-eye calibration data; a first adjustment parameter; a second adjustment parameter; the uncertainty parameter; and the generalization parameter.
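Because the scoring formula of claim 2 is published only as an image, the exact expression cannot be reproduced; the sketch below shows one plausible reading in which the score is a weighted combination of the uncertainty parameter and the generalization parameter, with the first and second adjustment parameters acting as weights. The weighted-sum form and the default weights are assumptions.

```python
def score_candidate_pose(uncertainty, generalization, alpha=1.0, beta=1.0):
    """Hypothetical score for one candidate camera pose (cf. claim 2).

    uncertainty    -- uncertainty parameter computed from the predicted end poses
    generalization -- generalization parameter computed from the predicted end poses
    alpha, beta    -- stand-ins for the first and second adjustment parameters

    The weighted-sum form is an assumption; the claimed formula is available
    only as an image in the original publication.
    """
    return alpha * uncertainty + beta * generalization
```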
3. The hand-eye calibration method for a mechanical arm according to claim 2, wherein the calculating of the uncertainty parameter based on the predicted end pose is expressed by a formula that is published only as an image in the original document and is not reproduced here; in this formula, the symbols respectively represent: the uncertainty parameter; the predicted end poses determined from the i-th and j-th groups of initial hand-eye calibration data; the quaternion corresponding to the rotation part of the corresponding transformation and the translation vector of that transformation; n, the number of groups of initial hand-eye calibration data; a preset parameter; and the modulus operation.
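The following sketch illustrates, under stated assumptions, one way an uncertainty-style quantity could be computed from the predicted end poses of claim 3: pairwise relative transformations are formed, and the deviation of each rotation part (as a quaternion) and translation vector is averaged. The averaging form, the identity-quaternion reference and the single weight standing in for the preset parameter are assumptions; the claimed formula itself appears only as an image.

```python
import itertools

import numpy as np
from scipy.spatial.transform import Rotation


def uncertainty_parameter(predicted_end_poses, preset_weight=1.0):
    """Hypothetical uncertainty measure in the spirit of claim 3.

    predicted_end_poses -- list of 4x4 homogeneous end poses, one per group of
                           initial hand-eye calibration data
    preset_weight       -- stand-in for the preset parameter of the claim
    """
    identity_q = np.array([0.0, 0.0, 0.0, 1.0])
    total, count = 0.0, 0
    for T_i, T_j in itertools.combinations(predicted_end_poses, 2):
        rel = np.linalg.inv(T_i) @ T_j                     # relative transformation
        q = Rotation.from_matrix(rel[:3, :3]).as_quat()    # quaternion of rotation part
        t = rel[:3, 3]                                     # translation vector
        # Distance of the rotation from identity (sign-invariant) plus weighted translation norm.
        q_dev = min(np.linalg.norm(q - identity_q), np.linalg.norm(q + identity_q))
        total += q_dev + preset_weight * np.linalg.norm(t)
        count += 1
    return total / max(count, 1)
```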
4. The hand-eye calibration method for a mechanical arm according to claim 2, wherein the determining of the generalization parameter based on the predicted end pose is expressed by a formula that is published only as an image in the original document and is not reproduced here; in this formula, the symbols respectively represent: the generalization parameter; the predicted end poses determined from the i-th and j-th groups of initial hand-eye calibration data; the quaternion corresponding to the rotation part of the corresponding transformation and the translation vector of that transformation; and the thresholds of the quaternion and of the displacement vector, respectively, the thresholds being defined in terms of quantities in the initial hand-eye calibration data whose definitions likewise appear only as images in the original document.
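Analogously to the previous sketch, the code below shows one hedged interpretation of the generalization parameter of claim 4: pose pairs whose relative rotation and translation both exceed the quaternion and displacement-vector thresholds are counted, so that well-spread predicted end poses yield a larger value. The counting form and the default thresholds are assumptions; the claimed formula is published only as an image.

```python
import itertools

import numpy as np
from scipy.spatial.transform import Rotation


def generalization_parameter(predicted_end_poses, q_threshold=0.05, t_threshold=10.0):
    """Hypothetical generalization measure in the spirit of claim 4."""
    identity_q = np.array([0.0, 0.0, 0.0, 1.0])
    hits, count = 0, 0
    for T_i, T_j in itertools.combinations(predicted_end_poses, 2):
        rel = np.linalg.inv(T_i) @ T_j                     # relative transformation
        q = Rotation.from_matrix(rel[:3, :3]).as_quat()    # quaternion of rotation part
        t = rel[:3, 3]                                     # translation vector
        q_dev = min(np.linalg.norm(q - identity_q), np.linalg.norm(q + identity_q))
        # Count pairs that differ sufficiently in both rotation and translation.
        if q_dev > q_threshold and np.linalg.norm(t) > t_threshold:
            hits += 1
        count += 1
    return hits / max(count, 1)
```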
5. The hand-eye calibration method for the mechanical arm according to any one of claims 1 to 4, wherein the carrying out attitude sampling on the camera to obtain candidate poses of the camera comprises:
taking the origin of the coordinate axes of the coordinate system in which the calibration plate is located as a starting point, moving within each sampling plane at preset intervals and sampling the camera pose at a preset rotation angle, so as to obtain the candidate poses of the camera.
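To make the sampling rule of claim 5 concrete, the sketch below generates candidate camera poses on a grid in each sampling plane above the calibration plate, starting from the plate origin, and adds a rotation about the optical axis in fixed angular steps. The grid spacing, plane extent, rotation step, and the downward-looking camera convention are assumed values for illustration only.

```python
import numpy as np


def sample_candidate_camera_poses(plane_heights, grid_step=50.0, grid_extent=200.0,
                                  rotation_step_deg=30.0):
    """Illustrative sampling of candidate camera poses (cf. claim 5).

    Returns 4x4 camera poses expressed in the calibration-plate frame, sampled
    on a grid within each sampling plane (one plane per entry of plane_heights)
    and rotated about the optical axis in steps of rotation_step_deg.
    """
    flip = np.diag([1.0, -1.0, -1.0])          # camera z-axis pointing toward the plate (assumed)
    offsets = np.arange(-grid_extent, grid_extent + grid_step, grid_step)
    poses = []
    for height in plane_heights:
        for x in offsets:
            for y in offsets:
                for yaw in np.deg2rad(np.arange(0.0, 360.0, rotation_step_deg)):
                    rot_z = np.array([[np.cos(yaw), -np.sin(yaw), 0.0],
                                      [np.sin(yaw),  np.cos(yaw), 0.0],
                                      [0.0,          0.0,         1.0]])
                    pose = np.eye(4)
                    pose[:3, :3] = rot_z @ flip
                    pose[:3, 3] = [x, y, height]
                    poses.append(pose)
    return poses
```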
6. A hand-eye calibration method for a mechanical arm according to any one of claims 1 to 4, wherein the starting the mechanical arm according to the initialization parameters and performing initial data acquisition by a camera arranged at the tail end of the mechanical arm to obtain initial hand-eye calibration data comprises:
acquiring a tail end initial pose corresponding to the tail end of the mechanical arm;
rotating about a coordinate axis of a preset coordinate system as the central axis, with the tail end initial pose as the starting point, and collecting a group of initial hand-eye calibration data at every preset angle.
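The sketch below illustrates one way the initial acquisition of claim 6 could be realized: starting from the tail end initial pose, the pose is rotated about a coordinate axis of a preset coordinate system (assumed here to be the z-axis of the tool frame), and one pose is produced every preset angle, at which a group of calibration data would be collected. The axis choice, angular step and total sweep are assumptions, and the camera triggering itself is left to the caller.

```python
import numpy as np


def initial_acquisition_poses(end_initial_pose, angle_step_deg=15.0, total_sweep_deg=90.0):
    """Illustrative generation of the initial acquisition poses (cf. claim 6)."""
    poses = []
    for angle in np.deg2rad(np.arange(0.0, total_sweep_deg + angle_step_deg, angle_step_deg)):
        rot_z = np.array([[np.cos(angle), -np.sin(angle), 0.0, 0.0],
                          [np.sin(angle),  np.cos(angle), 0.0, 0.0],
                          [0.0,            0.0,           1.0, 0.0],
                          [0.0,            0.0,           0.0, 1.0]])
        poses.append(end_initial_pose @ rot_z)   # rotate about the tool z-axis (assumed axis)
    return poses
```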
7. A hand-eye calibration device of a mechanical arm is characterized by comprising:
the parameter acquisition module is used for responding to the calibration request and acquiring initialization parameters for hand-eye calibration;
the calibration data acquisition module is used for starting the mechanical arm according to the initialization parameters and acquiring initial data through a camera arranged at the tail end of the mechanical arm to obtain initial hand-eye calibration data;
the calibration module is used for carrying out hand-eye calibration based on the initial hand-eye calibration data to obtain hand-eye parameters;
the camera sampling module is used for carrying out attitude sampling on the camera to obtain candidate poses of the camera;
the optimization module is used for performing iterative optimization on the hand-eye parameters based on the initial hand-eye calibration data, the hand-eye parameters and the candidate camera poses to obtain optimized hand-eye parameters;
wherein the optimization module performs the steps of:
obtaining a predicted end pose of the tail end of the mechanical arm according to the initial hand-eye calibration data, the hand-eye parameters and the candidate camera poses;
calculating uncertainty parameters based on the predicted end pose, determining generalization parameters based on the predicted end pose;
based on the initial hand-eye calibration data, the hand-eye parameters, the uncertainty parameters and the generalization parameters, scoring each camera candidate pose to obtain a scoring result;
determining a target camera pose from the camera candidate poses according to the grading result;
controlling the mechanical arm to move in sequence according to the target camera pose, and acquiring data to obtain second hand-eye calibration data corresponding to the target camera pose;
optimizing the hand-eye parameters based on the second hand-eye calibration data to obtain intermediate optimized hand-eye parameters;
and returning to the step of obtaining the predicted end pose of the tail end of the mechanical arm according to the initial hand-eye calibration data, the hand-eye parameters and the candidate camera poses until an iteration termination condition is met, and obtaining the optimized hand-eye parameters.
8. An electronic device comprising a memory, a processor and a computer program stored in the memory and executable on the processor, characterized in that the processor, when executing the computer program, carries out the steps of a hand-eye calibration method for a robot arm according to any of claims 1 to 6.
9. A computer-readable storage medium, in which a computer program is stored which, when being executed by a processor, carries out the steps of a method for hand-eye calibration of a robot arm according to any one of claims 1 to 6.
CN202111639830.5A 2021-12-30 2021-12-30 Hand-eye calibration method and device for mechanical arm, electronic equipment and storage medium Active CN113997295B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111639830.5A CN113997295B (en) 2021-12-30 2021-12-30 Hand-eye calibration method and device for mechanical arm, electronic equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111639830.5A CN113997295B (en) 2021-12-30 2021-12-30 Hand-eye calibration method and device for mechanical arm, electronic equipment and storage medium

Publications (2)

Publication Number Publication Date
CN113997295A true CN113997295A (en) 2022-02-01
CN113997295B CN113997295B (en) 2022-04-12

Family

ID=79932257

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111639830.5A Active CN113997295B (en) 2021-12-30 2021-12-30 Hand-eye calibration method and device for mechanical arm, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN113997295B (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114407018A (en) * 2022-02-11 2022-04-29 天津科技大学 Robot hand-eye calibration method, device, electronic device, storage medium and product
CN114872039A (en) * 2022-04-19 2022-08-09 汕头大学 Mechanical arm hand-eye calibration method and system based on improved SVD algorithm
CN115008454A (en) * 2022-05-31 2022-09-06 深圳大学 An online hand-eye calibration method for robots based on multi-frame pseudo-label data enhancement
CN115781698A (en) * 2023-02-06 2023-03-14 广东省科学院智能制造研究所 Method, system, equipment and medium for automatically generating motion pose of layered hand-eye calibration robot
CN119002273A (en) * 2024-08-09 2024-11-22 超节点创新科技(深圳)有限公司 Self-adaptive method and system of robot wearable equipment

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105014667A (en) * 2015-08-06 2015-11-04 浙江大学 Camera and robot relative pose calibration method based on pixel space optimization
CN105773609A (en) * 2016-03-16 2016-07-20 南京工业大学 Robot kinematics calibration method based on vision measurement and distance error model
US20160214255A1 (en) * 2015-01-22 2016-07-28 GM Global Technology Operations LLC Method for calibrating an articulated end effector employing a remote digital camera
CN109483516A (en) * 2018-10-16 2019-03-19 浙江大学 A kind of mechanical arm hand and eye calibrating method based on space length and epipolar-line constraint
CN109658460A (en) * 2018-12-11 2019-04-19 北京无线电测量研究所 A kind of mechanical arm tail end camera hand and eye calibrating method and system
CN110238845A (en) * 2019-05-22 2019-09-17 湖南视比特机器人有限公司 Optimal Calibration point chooses and the automatic hand and eye calibrating method and device of error measurement
CN112752091A (en) * 2019-10-29 2021-05-04 牧今科技 Method and system for determining pose of camera calibration
CN113052907A (en) * 2021-04-12 2021-06-29 深圳大学 Positioning method of mobile robot in dynamic environment
CN113787522A (en) * 2021-10-12 2021-12-14 华侨大学 Hand-eye calibration method for eliminating accumulated errors of mechanical arm
CN113814987A (en) * 2021-11-24 2021-12-21 季华实验室 Multi-camera robot hand-eye calibration method and device, electronic equipment and storage medium

Patent Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160214255A1 (en) * 2015-01-22 2016-07-28 GM Global Technology Operations LLC Method for calibrating an articulated end effector employing a remote digital camera
CN105818167A (en) * 2015-01-22 2016-08-03 通用汽车环球科技运作有限责任公司 Method for calibrating an articulated end effector employing a remote digital camera
CN105014667A (en) * 2015-08-06 2015-11-04 浙江大学 Camera and robot relative pose calibration method based on pixel space optimization
CN105773609A (en) * 2016-03-16 2016-07-20 南京工业大学 Robot kinematics calibration method based on vision measurement and distance error model
CN109483516A (en) * 2018-10-16 2019-03-19 浙江大学 A kind of mechanical arm hand and eye calibrating method based on space length and epipolar-line constraint
CN109658460A (en) * 2018-12-11 2019-04-19 北京无线电测量研究所 A kind of mechanical arm tail end camera hand and eye calibrating method and system
CN110238845A (en) * 2019-05-22 2019-09-17 湖南视比特机器人有限公司 Optimal Calibration point chooses and the automatic hand and eye calibrating method and device of error measurement
CN112752091A (en) * 2019-10-29 2021-05-04 牧今科技 Method and system for determining pose of camera calibration
CN113052907A (en) * 2021-04-12 2021-06-29 深圳大学 Positioning method of mobile robot in dynamic environment
CN113787522A (en) * 2021-10-12 2021-12-14 华侨大学 Hand-eye calibration method for eliminating accumulated errors of mechanical arm
CN113814987A (en) * 2021-11-24 2021-12-21 季华实验室 Multi-camera robot hand-eye calibration method and device, electronic equipment and storage medium

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
HAIYAN WU et al.: "Hand-Eye Calibration and Inverse Kinematics of Robot Arm using Neural Network", DTU Library *
LIU Xiaoming et al.: "Optical flow control for terrain following and automatic landing", Journal of Beijing University of Aeronautics and Astronautics *

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114407018A (en) * 2022-02-11 2022-04-29 天津科技大学 Robot hand-eye calibration method, device, electronic device, storage medium and product
CN114407018B (en) * 2022-02-11 2023-09-22 天津科技大学 Robot hand-eye calibration methods, devices, electronic equipment, storage media and products
CN114872039A (en) * 2022-04-19 2022-08-09 汕头大学 Mechanical arm hand-eye calibration method and system based on improved SVD algorithm
CN114872039B (en) * 2022-04-19 2023-06-27 汕头大学 Mechanical arm hand-eye calibration method and system based on improved SVD algorithm
CN115008454A (en) * 2022-05-31 2022-09-06 深圳大学 An online hand-eye calibration method for robots based on multi-frame pseudo-label data enhancement
CN115008454B (en) * 2022-05-31 2024-12-17 深圳大学 Robot online hand-eye calibration method based on multi-frame pseudo tag data enhancement
CN115781698A (en) * 2023-02-06 2023-03-14 广东省科学院智能制造研究所 Method, system, equipment and medium for automatically generating motion pose of layered hand-eye calibration robot
CN115781698B (en) * 2023-02-06 2023-04-04 广东省科学院智能制造研究所 Method, system, equipment and medium for automatically generating motion pose of layered hand-eye calibration robot
CN119002273A (en) * 2024-08-09 2024-11-22 超节点创新科技(深圳)有限公司 Self-adaptive method and system of robot wearable equipment
CN119002273B (en) * 2024-08-09 2025-06-20 超节点创新科技(深圳)有限公司 Adaptive method and system for wearable robot devices

Also Published As

Publication number Publication date
CN113997295B (en) 2022-04-12

Similar Documents

Publication Publication Date Title
CN113997295B (en) Hand-eye calibration method and device for mechanical arm, electronic equipment and storage medium
CN111015655B (en) Mechanical arm grabbing method and device, computer readable storage medium and robot
CN114012731B (en) Hand-eye calibration method and device, computer equipment and storage medium
WO2020168770A1 (en) Object pose estimation method and apparatus
CN111208783B (en) Action simulation method, device, terminal and computer storage medium
RU2700246C1 (en) Method and system for capturing an object using a robot device
WO2019114339A1 (en) Method and device for correcting motion of robotic arm
CN112936301A (en) Robot hand-eye calibration method and device, readable storage medium and robot
CN107953329B (en) Object recognition and attitude estimation method, device and robotic arm grasping system
TWI748409B (en) Data processing method, processor, electronic device and computer readable medium
CN110193849A (en) A kind of method and device of Robotic Hand-Eye Calibration
CN109702738B (en) Mechanical arm hand-eye calibration method and device based on three-dimensional object recognition
KR20180080630A (en) Robot and electronic device for performing hand-eye calibration
CN113787522B (en) Hand-eye calibration method for eliminating accumulated errors of mechanical arm
CN113172636B (en) Automatic hand-eye calibration method and device and storage medium
CN112818898B (en) Model training method and device and electronic equipment
CN109559341B (en) Method and device for generating mechanical arm grabbing scheme
CN112686950A (en) Pose estimation method and device, terminal equipment and computer readable storage medium
CN115213896A (en) Object grasping method, system, device and storage medium based on robotic arm
CN113119104B (en) Mechanical arm control method, mechanical arm control device, computing equipment and system
CN111098306A (en) Robot calibration method, device, robot and storage medium
CN116352711A (en) Hand-eye calibration method and equipment for robot, robot and computer storage medium
CN113298870A (en) Object posture tracking method and device, terminal equipment and storage medium
WO2023207186A1 (en) Target positioning method and apparatus, electronic device, and storage medium
CN115031635A (en) Measuring method and device, electronic device and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant