Detailed Description
The application is further described in detail below by means of the figures and examples. The features and advantages of the present application will become more apparent from the description.
The word "exemplary" is used herein to mean "serving as an example, embodiment, or illustration. Any embodiment described herein as "exemplary" is not necessarily to be construed as preferred or advantageous over other embodiments. Although various aspects of the embodiments are illustrated in the accompanying drawings, the drawings are not necessarily drawn to scale unless specifically indicated.
In addition, the technical features described below in the different embodiments of the present application may be combined with each other as long as they do not conflict with each other.
An embodiment of the invention provides a hand-eye calibration method for a robot and a laser sensor that reduces calibration errors and simplifies the calibration procedure. As shown in figure 1, the method comprises the following steps:
step 101, establishing a user coordinate system of the robot by using a three-point method based on a calibration plate, wherein the calibration plate is placed within the reachable range of the robot;
step 102, moving the robot according to the user coordinate system, and acquiring a first point position of the end of the robot tool when the laser near field of the laser sensor coincides with a specific edge of the calibration plate;
step 103, moving the robot according to the user coordinate system, and acquiring a second point position of the end of the robot tool when the laser far field of the laser sensor coincides with the specific edge of the calibration plate;
step 104, keeping the posture of the robot unchanged, moving the position of the robot, and acquiring a third point position of the end of the robot tool when the tool center point (TCP) of the robot is aligned with the space identification point on the calibration plate;
step 105, constructing a hand-eye calibration mathematical model of the robot and the laser sensor;
step 106, determining a calibration matrix between the robot and the laser sensor based on the first point position, the second point position, the third point position and the hand-eye calibration mathematical model.
In a specific embodiment, a user coordinate system of the robot is established by using a three-point method based on a calibration plate. The robot is moved according to the user coordinate system, and a first point position of the end of the robot tool is acquired when the laser near field of the laser sensor coincides with a specific edge of the calibration plate. The robot is again moved according to the user coordinate system, and a second point position of the end of the robot tool is acquired when the laser far field of the laser sensor coincides with the specific edge of the calibration plate. The posture of the robot is then kept unchanged, the position of the robot is moved, and a third point position of the end of the robot tool is acquired when the tool center point (TCP) of the robot is aligned with the space identification point on the calibration plate. A hand-eye calibration mathematical model of the robot and the laser sensor is constructed, and a calibration matrix between the robot and the laser sensor is determined based on the first point position, the second point position, the third point position and the hand-eye calibration mathematical model. Calibration can thus be completed by determining only three point positions and constructing a hand-eye calibration mathematical model, which avoids the errors caused in the prior art by manually teaching the laser to irradiate the identification points many times, simplifies the calibration procedure, and improves the calibration precision.
In the embodiment of the invention, as shown in fig. 2, the robot is fixedly installed on the floor or on a traveling axis, and the laser sensor is fixedly installed on the sixth joint axis of the robot or at a preset position such as a gripper or a welding gun; the laser sensor is a line laser sensor. After the robot, the line laser sensor and the calibration plate are installed, a local area network formed by the sensor, the industrial personal computer and the robot is constructed for communication. The line laser sensor and the industrial personal computer communicate through a protocol agreed by the sensor, the industrial personal computer and the robot communicate through a protocol agreed by the robot, and the line laser sensor and the robot do not communicate directly; the three parts together form the local area network, and the industrial personal computer acquires the position information of the other two parts and performs the algorithm processing. In the figure, the base coordinate system of the robot is the coordinate system denoted by X_R, Y_R, Z_R; the coordinate system denoted by X_L, Y_L, Z_L is the laser sensor coordinate system; and the coordinate system denoted by X_T, Y_T, Z_T is the tool coordinate system of the robot.
Since the laser emitted by the line laser sensor is linear, its field of view is limited, and a workpiece outside this field of view cannot be identified. A first preset range closer to the workpiece is defined as the laser near field, and a second preset range farther from the workpiece is defined as the laser far field. For example, if the field of view of the laser is 300-500 mm, distances from the laser to the workpiece close to 300 mm can be defined as the near field range, and distances close to 500 mm as the far field range.
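The near/far field classification described above can be sketched as a small helper. This is an illustrative sketch only, not part of the patented method; the function name, the 300-500 mm working range, and the 50 mm width of the near/far zones are assumptions for the example:

```python
def classify_field(distance_mm, fov=(300.0, 500.0), band=50.0):
    """Classify a laser-to-workpiece distance into near/far field.

    `fov` is the sensor's working range (min, max) in mm, and `band`
    is an assumed width of the near/far zones at each end of the range.
    """
    lo, hi = fov
    if not (lo <= distance_mm <= hi):
        return "out of view"      # workpiece cannot be identified here
    if distance_mm <= lo + band:
        return "near field"       # close to the 300 mm end
    if distance_mm >= hi - band:
        return "far field"        # close to the 500 mm end
    return "mid field"
```

For a 300-500 mm field of view, a reading of 320 mm falls in the near field and 480 mm in the far field, matching the example in the text.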
Before calibration, preparatory operations are needed, including calibration of the robot's TCP (Tool Center Point). Since robot TCP calibration is a mature technology, it is not described in detail herein; in the embodiment of the invention, the calibration modes provided by each robot brand are used as the standard. For example, a standard gun-calibration ruler or gun-calibration block can preferentially be used for calibration; if neither is available, calibration can be performed in a non-L1 mode, that is, the X, Y and Z distances from the tool end to the center of the flange are measured in sequence, and the angle parameters provided by the welding-gun manufacturer are filled into the robot tool parameters.
Based on the calibration plate, a user coordinate system of the robot is established by using a three-point method, with the calibration plate placed within the robot's reachable range, i.e. the range that the robot arm can reach through operations such as movement or rotation. The specific implementation comprises adjusting the gun pose of the robot's welding gun so that the included angle between the laser beam of the laser sensor and the plane of the calibration plate is within a preset range, generally vertical or approximately vertical, and the laser beam of the laser sensor coincides with or is parallel to a specific edge of the calibration plate.
The calibration plate is generally placed on a platform within the reachable range of the robot so that the space identification point on the calibration plate can be recognized. In a specific embodiment the calibration plate is generally rectangular; to facilitate accurate capture of the space identification point, as shown in fig. 3, the calibration plate is formed by welding two plates, one large and one small, and the space identification point is located at the lap joint of the two plates.
The robot user coordinate system is a rectangular coordinate system that the user defines for each working space, used for teaching, executing position registers, executing position compensation instructions and the like. The method of establishing a user coordinate system is relatively simple and is generally realized by teaching 3 points: the first teaching point is the origin of the user coordinate system; the second teaching point lies on the X axis, so the line from the first teaching point to the second teaching point is the X axis and its direction is the positive X direction; the third teaching point lies in the positive region of the Y axis; and the Z axis is determined by the right-hand rule. In this embodiment, as shown in fig. 4, P_0 is the first taught point and is also the origin of the user coordinate system; P_X is the second taught point, and the direction from P_0 to P_X is the X direction of the user coordinate system (the direction denoted X_s in the figure); P_Y is the third taught point, used for calculating the Y direction of the user coordinate system (the direction denoted Y_s in the figure); and the Z direction of the user coordinate system is determined by the right-hand rule. To facilitate subsequent use, the gun pose of the robot's welding gun is adjusted so that the laser beam of the laser sensor is as nearly perpendicular to the plane of the calibration plate as possible, and parallel to or coincident with a specific edge of the calibration plate (an edge parallel to the Y_s direction, namely the edge where edge 1 is located in fig. 4).
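The three-point construction described above can be sketched numerically as follows. This is a minimal NumPy sketch under the stated conventions (origin at P_0, X along P_0 to P_X, Y toward P_Y, Z by the right-hand rule); the function name and the 4x4 homogeneous-transform return format are illustrative assumptions:

```python
import numpy as np

def user_frame_from_three_points(p0, px, py):
    """Build a user coordinate frame from three taught points.

    p0: origin; px: a point on the +X axis; py: a point in the +Y
    half-plane.  Returns a 4x4 homogeneous transform whose columns are
    the frame axes expressed in the robot base frame, with Z given by
    the right-hand rule.
    """
    p0, px, py = (np.asarray(p, dtype=float) for p in (p0, px, py))
    x = px - p0
    x /= np.linalg.norm(x)             # unit X axis
    y = py - p0
    y -= x * np.dot(y, x)              # remove the component along X
    y /= np.linalg.norm(y)             # unit Y axis
    z = np.cross(x, y)                 # right-hand rule
    t = np.eye(4)
    t[:3, 0], t[:3, 1], t[:3, 2], t[:3, 3] = x, y, z, p0
    return t
```

For example, taught points (0,0,0), (2,0,0) and (1,3,0) yield a frame aligned with the base axes, and the determinant of the rotation part is +1, confirming a right-handed frame.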
After the user coordinate system of the robot is established, the robot is moved according to the user coordinate system, and a first point position of the end of the robot tool is acquired when the laser near field of the laser sensor coincides with the specific edge of the calibration plate. The specific implementation comprises the following steps:
keeping the gun pose of the welding gun unchanged, and moving the robot according to the user coordinate system so that the center of the laser near field of the laser sensor can identify the identification point of the calibration plate;
and moving the robot along the X_s direction of the user coordinate system so that the laser beam of the laser sensor coincides with the specific edge of the calibration plate, and recording the position of the end of the robot tool at that moment as the first point position P_1.
Then the robot is moved according to the user coordinate system, and a second point position of the end of the robot tool is acquired when the laser far field of the laser sensor coincides with the specific edge of the calibration plate. The specific implementation comprises the following steps:
keeping the gun pose of the welding gun unchanged, and moving the robot according to the user coordinate system so that the center of the laser far field of the laser sensor can identify the identification point of the calibration plate;
and moving the robot along the X_s direction of the user coordinate system so that the laser beam of the laser sensor coincides with the specific edge of the calibration plate, and recording the position of the end of the robot tool at that moment as the second point position P_2.
After P_2 is obtained, the posture of the robot is kept unchanged, the position of the robot is moved, and a third point position of the end of the robot tool is acquired when the tool center point (TCP) of the robot is aligned with the space identification point on the calibration plate. The specific process comprises keeping the gun pose of the welding gun unchanged, moving the position of the robot so that the TCP of the robot is aligned with the space identification point on the calibration plate, and recording the position of the end of the robot tool at that moment as the third point position P_3.
Then, a hand-eye calibration mathematical model of the robot and the laser sensor is constructed. The equations in the original were lost in extraction; the following is a reconstruction consistent with the surrounding definitions. The model is:

P = R_BT^i · (R_TL · P_Laser^i + M_TL) + P_i

wherein P represents the position of the space identification point in the robot base coordinate system (known from the third point position P_3, where the TCP is aligned with the identification point); P_i represents the coordinate of the tool end in the robot base coordinate system when the laser sensor irradiates the space identification point; R_BT^i represents the rotation matrix from the robot tool coordinate system to the base coordinate system, which can be obtained by Euler-angle transformation of the robot pose; P_Laser^i represents the position of the space identification point in the laser sensor coordinate system; R_TL represents the rotation part of the transformation from the laser sensor coordinate system to the tool coordinate system, an unknown quantity; and M_TL represents the translation component, also an unknown quantity.

Rearranging the above gives:

R_TL · P_Laser^i + M_TL = (R_BT^i)^(-1) · (P − P_i)

Let

C_b^i = P_Laser^i,  C_a^i = (R_BT^i)^(-1) · (P − P_i)

Then:

R_TL · C_b^i + M_TL = C_a^i

In the above equation, each scalar equation has 4 unknowns (three entries of one row of R_TL and one component of M_TL), so collecting 4 groups of coordinate data yields three systems of four linear equations. Taking the first (X-row) equation as an example:

n_11 · X_bi + n_12 · Y_bi + n_13 · Z_bi + M_x = X_ai,  i = 1, 2, 3, 4

wherein X_bi, Y_bi, Z_bi represent the X, Y, Z components of the coordinate of the space identification point in the laser sensor coordinate system for the i-th (i = 1, 2, 3, 4) group, and X_ai represents the X component of C_a^i calculated from the P_i coordinate of the i-th group in the robot base coordinate system.

The coefficients of the calibration matrix are obtained by solving:

| X_b1  Y_b1  Z_b1  1 |   | n_11 |   | X_a1 |
| X_b2  Y_b2  Z_b2  1 | · | n_12 | = | X_a2 |
| X_b3  Y_b3  Z_b3  1 |   | n_13 |   | X_a3 |
| X_b4  Y_b4  Z_b4  1 |   | M_x  |   | X_a4 |

Similarly, the second and third equations are transformed and solved, finally determining R_TL and M_TL.
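The three four-unknown linear systems above can be solved directly with NumPy. The sketch below is illustrative only (function and variable names are assumptions, not from the original); it assumes the four points C_b^i are not coplanar, so that each 4x4 system has a unique solution:

```python
import numpy as np

def solve_calibration(Cb, Ca):
    """Solve R_TL (rotation, laser -> tool) and M_TL (translation)
    from four point correspondences.

    Cb: 4x3 array of identification-point coordinates in the laser
        sensor frame (the C_b^i).
    Ca: 4x3 array of the same points mapped into the tool frame,
        C_a^i = inv(R_BT^i) @ (P - P_i).
    Each coordinate row of the unknown transform gives one linear
    system with 4 unknowns, so 4 data groups determine it exactly.
    """
    Cb, Ca = np.asarray(Cb, float), np.asarray(Ca, float)
    A = np.hstack([Cb, np.ones((4, 1))])      # rows [X_b, Y_b, Z_b, 1]
    R = np.zeros((3, 3))
    M = np.zeros(3)
    for row in range(3):                      # X, Y, Z equations in turn
        sol = np.linalg.solve(A, Ca[:, row])  # [n_r1, n_r2, n_r3, M_r]
        R[row, :], M[row] = sol[:3], sol[3]
    return R, M
```

A quick self-check: generating C_a from a known rotation (90 degrees about Z) and translation, then solving, recovers the same rotation and translation.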
Therefore, after the hand-eye calibration mathematical model of the robot and the laser sensor is built, a calibration matrix between the robot and the laser sensor is determined based on the first point position, the second point position, the third point position and the hand-eye calibration mathematical model. In a specific implementation, offset distance parameters are set, the first point position P_1 and the second point position P_2 are transformed according to the offset distance parameters to obtain four identification points, the four identification points and the third point position P_3 are input into the hand-eye calibration mathematical model, and the calibration matrix between the robot and the laser sensor is obtained by calculation.
Specifically, the calculation is performed by the industrial personal computer. The offset distance parameters are set in an automatic calibration program of the robot; the automatic calibration process is a pre-written teaching program of the robot, and although the instructions of each robot manufacturer differ, they basically include instructions for obtaining the current position of the robot, performing an offset according to specified parameters, and the like. The industrial personal computer first receives the offset distance parameters set by the user and runs the automatic calibration program of the robot. During operation, P_1 and P_2 are transformed according to the offset distance parameters, yielding four identification points on the left and right sides of the far field and the left and right sides of the near field. The industrial personal computer collects the four identification points, the robot pose corresponding to P_3, and the coordinates in the laser sensor coordinate system, inputs them into the hand-eye calibration mathematical model, and calculates the calibration matrix between the robot and the laser sensor according to the solving and derivation process of the model.
The offset distance parameters include S_X and S_Y. S_X is the distance from the space identification point on the calibration plate to the specific edge of the calibration plate, and the value of S_Y is related to the length of the laser beam of the laser sensor, generally half of the identifiable length of the laser beam in the near field.
As shown in fig. 5, after the three points P_1, P_2 and P_3 are acquired in sequence, a left-right offset S_X and a forward offset S_Y are set.
In the user coordinate system of the robot, offsetting the point P_1 by S_X in the X_s direction and by S_Y in the Y_s direction gives identification point S1; offsetting P_1 by S_X in the X_s direction and by −S_Y in the Y_s direction gives identification point S2; offsetting P_2 by S_X in the X_s direction and by S_Y in the Y_s direction gives identification point S3; and offsetting P_2 by S_X in the X_s direction and by −S_Y in the Y_s direction gives identification point S4.
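The derivation of S1 through S4 from P_1 and P_2 can be sketched as below. The function name and the representation of the user frame as a 3x3 matrix of axis columns are assumptions for illustration, not part of the original disclosure:

```python
import numpy as np

def offset_identification_points(p1, p2, sx, sy, user_frame_R):
    """Derive the four identification points S1..S4 from P_1 and P_2.

    p1, p2: tool-end positions (base frame) at the near-field and
    far-field point positions.  sx, sy: offset distances along the
    user frame's X_s and Y_s axes.  user_frame_R: 3x3 matrix whose
    columns are the user frame axes X_s, Y_s, Z_s in the base frame.
    """
    p1, p2 = np.asarray(p1, float), np.asarray(p2, float)
    xs, ys = user_frame_R[:, 0], user_frame_R[:, 1]
    s1 = p1 + sx * xs + sy * ys   # near field, +Y_s side
    s2 = p1 + sx * xs - sy * ys   # near field, -Y_s side
    s3 = p2 + sx * xs + sy * ys   # far field, +Y_s side
    s4 = p2 + sx * xs - sy * ys   # far field, -Y_s side
    return s1, s2, s3, s4
```

With a user frame aligned with the base frame, P_1 at the origin, P_2 at (0, 0, 10), S_X = 5 and S_Y = 2, this gives S1 = (5, 2, 0), S2 = (5, -2, 0), S3 = (5, 2, 10), S4 = (5, -2, 10).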
The industrial personal computer collects the robot poses corresponding to the points P_3, S1, S2, S3 and S4, together with the positions of S1, S2, S3 and S4 in the laser sensor coordinate system, and substitutes them into the constructed hand-eye calibration mathematical model to determine the calibration matrix between the robot and the laser sensor.
Based on the same inventive concept, the embodiment of the invention also provides a robot and laser sensor hand-eye calibration device. Since the principle by which it solves the problem is similar to that of the method, repeated description is omitted. Its specific structure is shown in fig. 6, and the device comprises:
a user coordinate system establishing module 601, configured to establish a user coordinate system of the robot by using a three-point method based on a calibration plate, wherein the calibration plate is placed within the reachable range of the robot;
a first point position obtaining module 602, configured to move the robot according to the user coordinate system, and obtain a first point position of the end of the robot tool when the laser near field of the laser sensor coincides with a specific edge of the calibration plate;
a second point position obtaining module 603, configured to move the robot according to the user coordinate system, and obtain a second point position of the end of the robot tool when the laser far field of the laser sensor coincides with the specific edge of the calibration plate;
a third point position obtaining module 604, configured to keep the posture of the robot unchanged, move the position of the robot, and obtain a third point position of the end of the robot tool when the tool center point (TCP) of the robot is aligned with the space identification point on the calibration plate;
a mathematical model construction module 605, configured to construct a hand-eye calibration mathematical model of the robot and the laser sensor;
a calibration matrix determining module 606, configured to determine a calibration matrix between the robot and the laser sensor based on the first point position, the second point position, the third point position and the hand-eye calibration mathematical model;
wherein the laser sensor is mounted on the robot and is provided as a line laser sensor.
In a specific embodiment, in addition to the structure of fig. 6, the robot and laser sensor hand-eye calibration device further comprises a TCP calibration module, configured to perform TCP calibration of the robot.
In a specific implementation, the user coordinate system establishing module 601 is specifically configured to adjust the gun pose of the robot's welding gun so that the included angle between the laser beam of the laser sensor and the plane of the calibration plate is within a preset range, and the laser beam of the laser sensor coincides with or is parallel to a specific edge of the calibration plate.
In a specific embodiment, the first point position obtaining module 602 is specifically configured to:
keep the gun pose of the welding gun unchanged, and move the robot according to the user coordinate system so that the center of the laser near field of the laser sensor can identify the identification point of the calibration plate;
and move the robot along the X_s direction of the user coordinate system so that the laser beam of the laser sensor coincides with the specific edge of the calibration plate, and record the position of the end of the robot tool at that moment as the first point position.
In a specific embodiment, the second point position obtaining module 603 is specifically configured to:
keep the gun pose of the welding gun unchanged, and move the robot according to the user coordinate system so that the center of the laser far field of the laser sensor can identify the identification point of the calibration plate;
and move the robot along the X_s direction of the user coordinate system so that the laser beam of the laser sensor coincides with the specific edge of the calibration plate, and record the position of the end of the robot tool at that moment as the second point position.
In a specific embodiment, the third point position obtaining module 604 is specifically configured to:
keep the gun pose of the welding gun unchanged, move the position of the robot so that the TCP of the robot is aligned with the space identification point on the calibration plate, and record the position of the end of the robot tool at that moment as the third point position.
In a specific embodiment, the mathematical model building module 605 is specifically configured to:
construct the hand-eye calibration mathematical model of the robot and the laser sensor (reconstructed from the definitions below, the original equation having been lost in extraction) as:

P = R_BT^i · (R_TL · P_Laser^i + M_TL) + P_i

wherein P represents the position of the space identification point in the robot base coordinate system; P_i represents the coordinate of the tool end in the robot base coordinate system when the laser sensor irradiates the space identification point; R_BT^i represents the rotation matrix from the robot tool coordinate system to the base coordinate system; P_Laser^i represents the position of the space identification point in the laser sensor coordinate system; R_TL represents the rotation part of the transformation from the laser sensor coordinate system to the tool coordinate system; and M_TL represents the translation component.
In a specific embodiment, the calibration matrix determining module 606 is specifically configured to:
set offset distance parameters;
and transform the first point position and the second point position according to the offset distance parameters to obtain four identification points, input the four identification points and the third point position into the hand-eye calibration mathematical model, and obtain the calibration matrix between the robot and the laser sensor by calculation.
In summary, the robot and laser sensor hand-eye calibration method and device provided by the embodiment have the following advantages:
A user coordinate system of the robot is established by using a three-point method based on a calibration plate; the robot is moved according to the user coordinate system, and a first point position of the end of the robot tool is obtained when the laser near field of the laser sensor coincides with a specific edge of the calibration plate; the robot is moved according to the user coordinate system, and a second point position of the end of the robot tool is obtained when the laser far field of the laser sensor coincides with the specific edge of the calibration plate; the posture of the robot is kept unchanged, the position of the robot is moved, and a third point position of the end of the robot tool is obtained when the tool center point (TCP) of the robot is aligned with the space identification point on the calibration plate; a hand-eye calibration mathematical model of the robot and the laser sensor is constructed; and a calibration matrix between the robot and the laser sensor is determined based on the first point position, the second point position, the third point position and the hand-eye calibration mathematical model. Calibration can be completed by determining only three point positions and constructing a hand-eye calibration mathematical model, which avoids the errors caused in the prior art by manually teaching the laser to irradiate the identification points many times, simplifies the calibration procedure, and improves the calibration precision. Since repeated manual teaching is not needed, on-site debugging and maintenance are convenient, and the time cost of debugging is reduced.
Although the invention presents method operation steps as described in the embodiments or flowcharts, more or fewer operation steps may be included on the basis of conventional or non-inventive labor. The order of steps recited in the embodiments is merely one of many possible execution orders and does not represent the only order of execution. When an actual device or client product is implemented, the steps may be executed sequentially or in parallel (for example, in a parallel processor or multi-threaded processing environment) according to the methods shown in the embodiments or figures.
It will be appreciated by those skilled in the art that embodiments of the present description may be provided as a method, apparatus (system) or computer program product. Accordingly, the present specification embodiments may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present invention is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flowchart illustrations and/or block diagrams, and combinations of flows and/or blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
In this specification, each embodiment is described in a progressive manner; identical and similar parts of the embodiments may be referred to each other, and each embodiment mainly describes its differences from the other embodiments. In particular, since the system embodiments are substantially similar to the method embodiments, their description is relatively brief, and the relevant parts may be found in the description of the method embodiments. In this document, relational terms such as first and second may be used solely to distinguish one entity or action from another without necessarily requiring or implying any actual such relationship or order between such entities or actions.
It should be noted that, without conflict, the embodiments of the present invention and features of the embodiments may be combined with each other. The present invention is not limited to any single aspect, nor to any single embodiment, nor to any combination and/or permutation of these aspects and/or embodiments. Moreover, each aspect and/or embodiment of the invention may be used alone or in combination with one or more other aspects and/or embodiments.
It should be noted that the above embodiments are only used to illustrate the technical solution of the present invention, but not to limit the technical solution of the present invention, and although the detailed description of the present invention is given with reference to the above embodiments, it should be understood by those skilled in the art that the technical solution described in the above embodiments may be modified or some or all technical features may be equivalently replaced, and these modifications or substitutions do not make the essence of the corresponding technical solution deviate from the scope of the technical solution of the embodiments of the present invention, and all the modifications or substitutions are included in the scope of the claims and the specification of the present invention.