
CN115319754B - Robot and laser sensor hand-eye calibration method and device - Google Patents


Info

Publication number
CN115319754B
Authority
CN
China
Prior art keywords
robot
point
laser sensor
calibration
coordinate system
Prior art date
Legal status
Active
Application number
CN202211129307.2A
Other languages
Chinese (zh)
Other versions
CN115319754A (en)
Inventor
柴宗兴
Current Assignee
Panasonic Welding Systems Tangshan Co Ltd
Original Assignee
Panasonic Welding Systems Tangshan Co Ltd
Priority date
Filing date
Publication date
Application filed by Panasonic Welding Systems Tangshan Co Ltd filed Critical Panasonic Welding Systems Tangshan Co Ltd
Priority to CN202211129307.2A
Publication of CN115319754A
Application granted
Publication of CN115319754B
Legal status: Active


Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B25: HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J: MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00: Programme-controlled manipulators
    • B25J9/16: Programme controls
    • B25J9/1656: Programme controls characterised by programming, planning systems for manipulators
    • B25J9/1661: Programme controls characterised by programming, planning systems for manipulators characterised by task planning, object-oriented languages
    • B25J9/1628: Programme controls characterised by the control loop
    • B25J9/1653: Programme controls characterised by the control loop parameters identification, estimation, stiffness, accuracy, error analysis
    • B25J9/1679: Programme controls characterised by the tasks executed
    • B25J9/1692: Calibration of manipulator
    • B25J19/00: Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
    • B25J19/02: Sensing devices
    • B25J19/021: Optical sensing devices
    • B25J19/022: Optical sensing devices using lasers

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Manipulator (AREA)

Abstract


The present invention provides a robot and laser sensor hand-eye calibration method and device, the method comprising: establishing a user coordinate system of the robot based on a calibration plate; moving the robot according to the user coordinate system, obtaining a first point position of the robot tool end when the laser near field of view of the sensor coincides with a specific edge of the calibration plate; obtaining a second point position of the robot tool end when the laser far field of the sensor coincides with a specific edge of the calibration plate; keeping the posture unchanged, moving the robot position, obtaining a third point position of the robot tool end when the robot TCP is aligned with a spatial recognition point; constructing a hand-eye calibration mathematical model for the robot and the laser sensor; determining a calibration matrix between the robot and the laser sensor based on the first point position, the second point position, the third point position and the mathematical model. Only three points need to be determined and a hand-eye calibration mathematical model needs to be constructed to calibrate, thus avoiding errors caused by multiple manual teachings, simplifying the calibration method, and improving the calibration accuracy.

Description

Robot and laser sensor hand-eye calibration method and device
Technical Field
The invention relates to the technical field of industrial robots, and in particular to a hand-eye calibration method and device for a robot and a laser sensor.
Background
With the vigorous development of manufacturing, industrial robots have become irreplaceable automated equipment in the industry by virtue of their high flexibility and good accessibility. Combining a laser sensor with an industrial robot is equivalent to installing eyes on the robot, enabling it to sense the external environment. For the robot to move according to the position identified by the laser sensor, a conversion relationship between the sensor coordinate system and the robot coordinate system must be established.
Existing hand-eye calibration methods involve a complex conversion relationship and require manually teaching the robot to align with identification points many times, so the calibration error is large and the calibration procedure is cumbersome.
Disclosure of Invention
The invention aims to provide a robot and laser sensor hand-eye calibration method and device capable of reducing calibration errors and simplifying the calibration procedure.
In order to achieve the above object, the present invention provides a robot and laser sensor hand-eye calibration method, comprising:
establishing a user coordinate system of the robot by using a three-point method based on a calibration plate, wherein the calibration plate is arranged within the reachable range of the robot;
moving the robot according to the user coordinate system, and acquiring a first point position of the tail end of the robot tool when the laser near field of view of the laser sensor coincides with a specific edge of the calibration plate;
moving the robot according to the user coordinate system, and acquiring a second point position of the tail end of the robot tool when the laser far field of view of the laser sensor coincides with the specific edge of the calibration plate;
keeping the posture of the robot unchanged, moving the position of the robot, and acquiring a third point position of the tail end of the robot tool when the robot tool center point (TCP) is aligned with the spatial identification point on the calibration plate;
constructing a hand-eye calibration mathematical model of the robot and the laser sensor;
and determining a calibration matrix between the robot and the laser sensor based on the first point position, the second point position, the third point position and the hand-eye calibration mathematical model.
The invention provides a robot and laser sensor hand-eye calibration device for reducing calibration errors and simplifying the calibration procedure, comprising:
The user coordinate system establishing module is used for establishing a user coordinate system of the robot by using a three-point method based on a calibration plate, wherein the calibration plate is arranged in the reachable range of the robot;
The first point position acquisition module is used for moving the robot according to the user coordinate system and acquiring a first point position of the tail end of the robot tool when the laser near field of view of the laser sensor coincides with the specific edge of the calibration plate;
The second point position acquisition module is used for moving the robot according to the user coordinate system and acquiring a second point position of the tail end of the robot tool when the laser far field of the laser sensor coincides with the specific edge of the calibration plate;
the third point position acquisition module is used for keeping the posture of the robot unchanged, moving the position of the robot, and acquiring a third point position of the tail end of the robot tool when the robot tool center point TCP is aligned with the space identification point on the calibration plate;
the mathematical model construction module is used for constructing a hand-eye calibration mathematical model of the robot and the laser sensor;
The calibration matrix determining module is used for determining a calibration matrix between the robot and the laser sensor based on the first point location, the second point location, the third point location and the hand-eye calibration mathematical model;
wherein the laser sensor is mounted on the robot.
The embodiment of the invention establishes a user coordinate system of a robot by using a three-point method based on a calibration plate, moves the robot according to the user coordinate system to obtain a first point position of the tail end of a robot tool when the near-field position of laser of a laser sensor coincides with a specific edge of the calibration plate, moves the robot according to the user coordinate system to obtain a second point position of the tail end of the robot tool when the far-field position of laser of the laser sensor coincides with the specific edge of the calibration plate, keeps the posture of the robot unchanged, moves the position of the robot to obtain a third point position of the tail end of the robot tool when a central point TCP of the robot tool is aligned with a space identification point on the calibration plate, constructs a hand-eye calibration mathematical model of the robot and the laser sensor, and determines a calibration matrix between the robot and the laser sensor based on the first point position, the second point position, the third point position and the hand-eye calibration mathematical model. Only three point positions are determined and a hand-eye calibration mathematical model is constructed, so that calibration can be completed, errors caused by the fact that laser irradiates the identification points for many times through manual teaching in the prior art are avoided, the calibration mode is simplified, and the calibration precision is improved.
Drawings
The following drawings are only for purposes of illustration and explanation of the present invention and are not intended to limit the scope of the invention. Wherein:
FIG. 1 is a schematic diagram of an implementation process of a robot and laser sensor hand-eye calibration method according to an embodiment of the present invention;
FIG. 2 is a schematic diagram of the installation and operation of a robot and laser sensor in an embodiment of the present invention;
FIG. 3 is a schematic side view of a calibration plate in an embodiment of the invention;
FIG. 4 is a schematic diagram of the positional relationship between a user coordinate system of a robot and a calibration plate in an example of the invention;
FIG. 5 is a schematic diagram of a process flow of an automatic calibration procedure of a robot in an embodiment of the invention;
fig. 6 is a schematic structural diagram of a robot and a laser sensor hand-eye calibration device according to an embodiment of the invention.
Detailed Description
The application is further described in detail below by means of the figures and examples. The features and advantages of the present application will become more apparent from the description.
The word "exemplary" is used herein to mean "serving as an example, embodiment, or illustration." Any embodiment described herein as "exemplary" is not necessarily to be construed as preferred or advantageous over other embodiments. Although various aspects of the embodiments are illustrated in the accompanying drawings, the drawings are not necessarily drawn to scale unless specifically indicated.
In addition, the technical features described below in the different embodiments of the present application may be combined with each other as long as they do not collide with each other.
The embodiment of the invention provides a robot and laser sensor hand-eye calibration method for reducing calibration errors and simplifying calibration modes, which is shown in figure 1 and comprises the following steps:
Step 101, establishing a user coordinate system of the robot by using a three-point method based on a calibration plate, wherein the calibration plate is arranged in the reachable range of the robot;
Step 102, moving the robot according to the user coordinate system, and acquiring a first point position of the tail end of the robot tool when the laser near field of view of the laser sensor coincides with a specific edge of the calibration plate;
Step 103, moving the robot according to the user coordinate system, and acquiring a second point position of the tail end of the robot tool when the laser far field of view of the laser sensor coincides with the specific edge of the calibration plate;
Step 104, keeping the posture of the robot unchanged, moving the position of the robot, and acquiring a third point position of the tail end of the robot tool when the robot tool center point TCP is aligned with the space identification point on the calibration plate;
Step 105, constructing a hand-eye calibration mathematical model of the robot and the laser sensor;
Step 106, determining a calibration matrix between the robot and the laser sensor based on the first point position, the second point position, the third point position and the hand-eye calibration mathematical model.
In the specific embodiment, a user coordinate system of the robot is established by using a three-point method based on a calibration plate, the robot is moved according to the user coordinate system, a first point position of the tail end of the robot tool when the near-field position of laser light of the laser sensor coincides with a specific edge of the calibration plate is obtained, a second point position of the tail end of the robot tool when the far-field position of laser light of the laser sensor coincides with the specific edge of the calibration plate is obtained according to the user coordinate system, the posture of the robot is kept unchanged, the position of the robot is moved, a third point position of the robot tool when a central point TCP of the robot is aligned with a space identification point on the calibration plate is obtained, a hand-eye calibration mathematical model of the robot and the laser sensor is constructed, and a calibration matrix between the robot and the laser sensor is determined based on the first point position, the second point position, the third point position and the hand-eye calibration mathematical model. Only three point positions are determined and a hand-eye calibration mathematical model is constructed, so that calibration can be completed, errors caused by the fact that laser irradiates the identification points for many times through manual teaching in the prior art are avoided, the calibration mode is simplified, and the calibration precision is improved.
In the embodiment of the invention, as shown in fig. 2, the robot is fixedly installed on the floor or another traveling axis, and the laser sensor is fixedly installed on the sixth joint axis of the robot or at a preset position such as a gripper or a welding gun; the laser sensor is a line laser sensor. After the robot, the line laser sensor and the calibration plate are installed, a local area network formed by the sensor, the industrial personal computer and the robot is constructed for communication: the line laser sensor and the industrial personal computer communicate through a protocol agreed by the sensor, and the industrial personal computer and the robot communicate through a protocol agreed by the robot; the line laser sensor and the robot do not communicate directly. The three parts together form the local area network, and the industrial personal computer is used for acquiring the position information of the other two parts and carrying out the algorithm processing. In the figure, the base coordinate system of the robot is the coordinate system denoted X_R, Y_R, Z_R; the coordinate system denoted X_L, Y_L, Z_L is the laser sensor coordinate system; and the coordinate system denoted X_T, Y_T, Z_T is the tool coordinate system of the robot.
Since the laser emitted by a line laser sensor is linear, its field of view is limited, and the workpiece cannot be identified outside that range. A first preset range closer to the workpiece is defined as the laser near field of view, and a second preset range farther from the workpiece is defined as the laser far field of view. For example, if the field of view of the laser is 300-500 mm, the workpiece cannot be identified when the distance from the laser to the workpiece is less than 300 mm or more than 500 mm; distances close to 300 mm can be defined as the near field range, and distances close to 500 mm as the far field range.
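As an illustration of the near/far field split described above, the sketch below classifies a laser-to-workpiece distance against the example 300-500 mm field of view. The 50 mm margin, the function name, and the labels are illustrative assumptions, not taken from the patent.

```python
# Illustrative sketch (not from the patent): classify a laser-to-workpiece
# distance against an assumed 300-500 mm measurable field of view.
NEAR_LIMIT_MM = 300.0   # assumed lower bound of the field of view
FAR_LIMIT_MM = 500.0    # assumed upper bound of the field of view

def classify_field_of_view(distance_mm: float, margin_mm: float = 50.0) -> str:
    """Label a distance as out of view, near field, far field, or mid field."""
    if distance_mm < NEAR_LIMIT_MM or distance_mm > FAR_LIMIT_MM:
        return "out of view"          # the workpiece cannot be identified here
    if distance_mm <= NEAR_LIMIT_MM + margin_mm:
        return "near field"           # close to the 300 mm end
    if distance_mm >= FAR_LIMIT_MM - margin_mm:
        return "far field"            # close to the 500 mm end
    return "mid field"

print(classify_field_of_view(320))  # near field
print(classify_field_of_view(480))  # far field
print(classify_field_of_view(250))  # out of view
```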
Before calibration, preparatory work is needed, including calibration of the robot TCP (Tool Center Point). Since robot TCP calibration is a mature technique, it is not described in detail here; in the embodiment of the invention, the calibration modes provided by each robot brand are used as the standard. For example, a standard gun-calibration ruler or gun-calibration block can preferably be used for calibration; if neither is available, calibration can be performed in a non-L1 mode, that is, the X, Y and Z distances from the end of the tool to the center of the flange disc are measured in turn, the angle parameters are provided by the welding gun manufacturer, and the values are then filled into the robot tool parameters.
Based on the calibration plate, a user coordinate system of the robot is established by the three-point method, with the calibration plate placed within the robot's reachable range, i.e. the range that the robot arm can reach through operations such as movement or rotation. The specific implementation comprises adjusting the gun pose of the robot's welding gun so that the angle between the laser beam of the laser sensor and the plane of the calibration plate is within a preset range (generally vertical or approximately vertical) and the laser beam coincides with or is parallel to a specific edge of the calibration plate.
The calibration plate is generally placed on a platform within the range the robot can reach, so that the spatial recognition point on the calibration plate can be recognized. In a specific embodiment, the calibration plate is generally rectangular; to make the spatial recognition point easy to capture accurately, as shown in fig. 3, the calibration plate is formed by welding two plates, one large and one small, and the spatial recognition point is located at the lap joint of the two plates.
The robot user coordinate system is a rectangular coordinate system customized by the user for each working space, used for teaching, executing position registers, executing position compensation instructions, and so on. Establishing a user coordinate system is relatively simple and is generally realized by teaching three points: the first teaching point is the origin of the user coordinate system; the line from the first teaching point to the second teaching point is the X axis, pointing in the positive X direction; the third teaching point lies in the positive Y half-plane; and the Z axis is determined by the right-hand rule. In this embodiment, as shown in fig. 4, P_0 is the first taught point and also the origin of the user coordinate system; P_X is the second taught point, and the direction from P_0 to P_X is the X direction of the user coordinate system (the direction denoted X_s in the figure); P_Y is the third taught point, used to calculate the Y direction of the user coordinate system (the direction denoted Y_s in the figure); and the Z direction of the user coordinate system is determined by the right-hand rule. For convenience of subsequent use, the gun pose of the robot's welding gun is adjusted so that the laser beam of the laser sensor is as nearly perpendicular to the plane of the calibration plate as possible and is parallel to or coincident with a specific edge of the calibration plate (the edge parallel to the Y_s direction, i.e. the edge where edge 1 is located in fig. 4).
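The three-point construction described above (first point as origin, first-to-second point fixing +X, third point in the +Y half-plane, Z by the right-hand rule) can be sketched as follows. This is a minimal illustration, not the patent's implementation; the function name and the Gram-Schmidt orthogonalization step are assumptions.

```python
import numpy as np

# Hedged sketch of the three-point method: P0 is the origin, P0->PX fixes +X,
# PY lies in the +Y half-plane, and Z follows the right-hand rule.
def user_frame_from_three_points(p0, px, py):
    """Return (origin, 3x3 rotation) of the user frame in base coordinates."""
    p0, px, py = (np.asarray(p, dtype=float) for p in (p0, px, py))
    x_axis = px - p0
    x_axis = x_axis / np.linalg.norm(x_axis)
    y_raw = py - p0
    # Remove the component along X so Y is exactly orthogonal to X.
    y_axis = y_raw - np.dot(y_raw, x_axis) * x_axis
    y_axis = y_axis / np.linalg.norm(y_axis)
    z_axis = np.cross(x_axis, y_axis)   # right-hand rule
    return p0, np.column_stack([x_axis, y_axis, z_axis])

origin, rot = user_frame_from_three_points([1, 0, 0], [2, 0, 0], [1, 3, 0])
print(rot)   # identity: the taught points already lie along base X and Y
```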
After the user coordinate system of the robot is established, the robot is moved according to the user coordinate system, and the first point position of the tail end of the robot tool when the laser near field of view of the laser sensor coincides with the specific edge of the calibration plate is acquired. The specific implementation process comprises the following steps:
keeping the gun pose of the welding gun unchanged, moving the robot according to the user coordinate system so that the center of the laser near field of view of the laser sensor can identify the identification point of the calibration plate;
and moving the robot in the X_s direction of the user coordinate system so that the laser beam of the laser sensor coincides with the specific edge of the calibration plate, and recording the position of the tail end of the robot tool at this moment as the first point position P_1.
Then the robot is moved according to the user coordinate system, and the second point position of the tail end of the robot tool when the laser far field of view of the laser sensor coincides with the specific edge of the calibration plate is acquired. The specific implementation process comprises the following steps:
keeping the gun pose of the welding gun unchanged, moving the robot according to the user coordinate system so that the center of the laser far field of view of the laser sensor can identify the identification point of the calibration plate;
and moving the robot in the X_s direction of the user coordinate system so that the laser beam of the laser sensor coincides with the specific edge of the calibration plate, and recording the position of the tail end of the robot tool at this moment as the second point position P_2.
After P_2 is obtained, the posture of the robot is kept unchanged, the position of the robot is moved, and the third point position of the tail end of the robot tool when the robot tool center point TCP is aligned with the space identification point on the calibration plate is acquired. The specific process comprises: keeping the gun pose of the welding gun unchanged, moving the position of the robot so that the robot TCP is aligned with the space identification point on the calibration plate, and recording the position of the tail end of the robot tool at this moment as the third point position P_3.
Then, a hand-eye calibration mathematical model of the robot and the laser sensor is constructed. The specific mathematical model is:

P = R_T^B · (R_L^T · P_Laser + M_TL) + P_i

wherein P represents the position of the space recognition point in the robot base coordinate system; P_i represents the coordinate of the tail end of the robot tool in the robot base coordinate system when the laser sensor irradiates the space recognition point; R_T^B represents the rotation matrix from the robot tool coordinate system to the base coordinate system, which can be obtained from the Euler angle transformation of the robot; P_Laser represents the position of the space recognition point in the laser sensor coordinate system; R_L^T represents the transformation matrix from the laser sensor coordinate system to the tool coordinate system, an unknown quantity; and M_TL represents the translation component, also an unknown quantity.

After the above point positions are obtained, rearranging the model gives:

(R_T^B)^(-1) · (P - P_i) = R_L^T · P_Laser + M_TL

Let

C = (R_T^B)^(-1) · (P - P_i), with components (X_a, Y_a, Z_a)

Then:

X_a = r_11·X_b + r_12·Y_b + r_13·Z_b + M_x
Y_a = r_21·X_b + r_22·Y_b + r_23·Z_b + M_y
Z_a = r_31·X_b + r_32·Y_b + r_33·Z_b + M_z

Each of these equations has 4 unknowns, so collecting 4 groups of coordinate data yields three solvable systems of equations. Taking the first equation as an example:

X_ai = r_11·X_bi + r_12·Y_bi + r_13·Z_bi + M_x  (i = 1, 2, 3, 4)

wherein X_bi, Y_bi and Z_bi represent the components in the X, Y and Z directions of the coordinate of the space recognition point in the laser sensor coordinate system for the i-th (i = 1, 2, 3, 4) data group, and X_ai represents the X component of C calculated from the P_i coordinate in the robot base coordinate system for the i-th data group.

The coefficients of the calibration matrix therefore satisfy:

| X_b1 Y_b1 Z_b1 1 |   | r_11 |   | X_a1 |
| X_b2 Y_b2 Z_b2 1 | · | r_12 | = | X_a2 |
| X_b3 Y_b3 Z_b3 1 |   | r_13 |   | X_a3 |
| X_b4 Y_b4 Z_b4 1 |   | M_x  |   | X_a4 |

Similarly, the second and third equations are transformed and solved, and finally R_L^T and M_TL can be determined.
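The solving step described above (stacking four data groups into one 4-unknown linear system per row of the calibration matrix) can be sketched as follows. This is a hedged illustration: the function name, the use of `numpy.linalg.solve`, and the round-trip test data are assumptions, not the patent's implementation.

```python
import numpy as np

# Hedged sketch: for each row k of the rotation we stack four equations
#   X_ai = r_k1*X_bi + r_k2*Y_bi + r_k3*Z_bi + M_k   (i = 1..4)
# and solve the resulting 4x4 linear system.
def solve_hand_eye(points_laser, points_c):
    """points_laser: 4x3 coords of the recognition point in the laser frame.
    points_c: 4x3 known left-hand sides C_i, one row per data group.
    Returns (3x3 rotation estimate, translation 3-vector)."""
    A = np.hstack([np.asarray(points_laser, float), np.ones((4, 1))])
    R = np.zeros((3, 3))
    M = np.zeros(3)
    for k in range(3):                       # one row of the matrix at a time
        sol = np.linalg.solve(A, np.asarray(points_c, float)[:, k])
        R[k, :] = sol[:3]
        M[k] = sol[3]
    return R, M

# Round-trip check with a known transform (90 degree rotation about Z).
R_true = np.array([[0., -1., 0.], [1., 0., 0.], [0., 0., 1.]])
M_true = np.array([10., 20., 30.])
pts = np.array([[0., 0., 0.], [1., 0., 0.], [0., 1., 0.], [0., 0., 1.]])
cs = (R_true @ pts.T).T + M_true             # synthetic measurements
R_est, M_est = solve_hand_eye(pts, cs)
print(np.allclose(R_est, R_true) and np.allclose(M_est, M_true))  # True
```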
Therefore, after the hand-eye calibration mathematical model of the robot and the laser sensor is built, the calibration matrix between the robot and the laser sensor is determined based on the first point position, the second point position, the third point position and the hand-eye calibration mathematical model. In a specific implementation, an offset distance parameter is set; the first point position P_1 and the second point position P_2 are transformed according to the offset distance parameter to obtain four identification points; and the four identification points and the third point position P_3 are input into the hand-eye calibration mathematical model to calculate the calibration matrix between the robot and the laser sensor.

Specifically, the calculation process is realized by the industrial personal computer. The offset distance parameters are set in an automatic calibration program of the robot; the automatic calibration program is a written robot teaching program, and although the instructions of each robot manufacturer differ, they basically include instructions for obtaining the current position of the robot, offsetting by specified parameters, and so on. The industrial personal computer first receives the offset distance parameters set by the user and runs the automatic calibration program of the robot. During operation, P_1 and P_2 are transformed according to the offset distance parameters, so that four identification points on the left and right sides of the far field of view and the left and right sides of the near field of view are obtained. The industrial personal computer collects the four identification points, the robot pose corresponding to P_3, and the coordinates in the laser sensor coordinate system, inputs them into the hand-eye calibration mathematical model, and calculates the calibration matrix between the robot and the laser sensor according to the solving and derivation process of the model.
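The known side of the calibration equations is formed from the robot data: the tool-to-base rotation comes from the robot's Euler angles and is applied to the difference between the recognition point and the tool-end position. The sketch below assumes a ZYX (yaw-pitch-roll) Euler convention; the actual convention is brand-specific, and all names here are illustrative.

```python
import numpy as np

# Hedged sketch of forming C_i = inv(R) @ (P - P_i), where R is the
# tool-to-base rotation reconstructed from Euler angles (ZYX assumed).
def rot_from_euler_zyx(yaw, pitch, roll):
    """Rotation matrix from ZYX Euler angles in radians."""
    cz, sz = np.cos(yaw), np.sin(yaw)
    cy, sy = np.cos(pitch), np.sin(pitch)
    cx, sx = np.cos(roll), np.sin(roll)
    Rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    Rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
    return Rz @ Ry @ Rx

def lhs_vector(p_point, p_tool, euler):
    """Known side of the calibration equations for one data group.
    Uses R.T as the inverse, since rotation matrices are orthogonal."""
    R = rot_from_euler_zyx(*euler)
    return R.T @ (np.asarray(p_point, float) - np.asarray(p_tool, float))

c = lhs_vector([100, 0, 0], [0, 0, 0], (np.pi / 2, 0, 0))  # 90 deg yaw
print(np.round(c))  # a pure +X offset in base maps to -Y in the tool frame
```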
The offset distance parameters include S_X and S_Y. S_X is the distance from the space identification point on the calibration plate to the specific edge of the calibration plate, and the value of S_Y is related to the length of the laser beam of the laser sensor; it is generally taken as half of the identifiable length of the laser near field of view.
As shown in fig. 5, after the three points P_1, P_2 and P_3 are acquired in sequence, the left-right offset S_X and the forward offset S_Y are set.
In the user coordinate system of the robot, point P_1 is shifted by S_X in the X_S direction and by S_Y in the Y_S direction to obtain identification point S1; point P_1 is shifted by S_X in the X_S direction and by -S_Y in the Y_S direction to obtain identification point S2; point P_2 is shifted by S_X in the X_S direction and by S_Y in the Y_S direction to obtain identification point S3; and point P_2 is shifted by S_X in the X_S direction and by -S_Y in the Y_S direction to obtain identification point S4.
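The derivation of the four identification points S1-S4 from P_1 and P_2 can be computed as in the following sketch. The offsets are applied in the user coordinate system as the text describes; the function name and the numeric example are illustrative assumptions.

```python
import numpy as np

# Hedged sketch: P1 (near field) and P2 (far field) are shifted by S_X along
# Xs and by +/- S_Y along Ys, in the user frame, producing points S1..S4.
def offset_identification_points(p1, p2, s_x, s_y):
    """Return S1..S4 as rows of a 4x3 array, expressed in the user frame."""
    p1, p2 = np.asarray(p1, float), np.asarray(p2, float)
    dx = np.array([s_x, 0.0, 0.0])     # shift along the user Xs axis
    dy = np.array([0.0, s_y, 0.0])     # shift along the user Ys axis
    s1 = p1 + dx + dy                  # near field, +Ys side
    s2 = p1 + dx - dy                  # near field, -Ys side
    s3 = p2 + dx + dy                  # far field, +Ys side
    s4 = p2 + dx - dy                  # far field, -Ys side
    return np.vstack([s1, s2, s3, s4])

pts = offset_identification_points([0, 0, 100], [0, 0, 300], s_x=15, s_y=25)
print(pts[0])  # [ 15.  25. 100.]
```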
The industrial personal computer collects the robot poses in the base coordinate system corresponding to points P_3, S1, S2, S3 and S4, and the positions in the laser sensor coordinate system corresponding to points S1, S2, S3 and S4; substituting these into the constructed hand-eye calibration mathematical model determines the calibration matrix between the robot and the laser sensor.
Based on the same inventive concept, an embodiment of the invention also provides a robot and laser sensor hand-eye calibration device. Since the principle by which the device solves the problem is similar to that of the method, repeated description is omitted. The specific structure is shown in fig. 6, and the device comprises:
The user coordinate system establishing module 601 is used for establishing a user coordinate system of the robot by using a three-point method based on a calibration plate, wherein the calibration plate is arranged in the reachable range of the robot;
a first point position obtaining module 602, configured to move the robot according to a user coordinate system, and obtain a first point position of the end of the robot tool when the near-field position of the laser sensor coincides with a specific edge of the calibration plate;
A second point position obtaining module 603, configured to move the robot according to a user coordinate system, and obtain a second point position of the end of the robot tool when the laser far field of view of the laser sensor coincides with a specific edge of the calibration plate;
The third point position obtaining module 604 is configured to keep the posture of the robot unchanged, move the position of the robot, and obtain a third point position of the end of the robot tool when the center point TCP of the robot tool is aligned with the space identification point on the calibration board;
The mathematical model construction module 605 is used for constructing a hand-eye calibration mathematical model of the robot and the laser sensor;
A calibration matrix determining module 606, configured to determine a calibration matrix between the robot and the laser sensor based on the first point location, the second point location, the third point location, and the hand-eye calibration mathematical model;
Wherein the laser sensor is mounted on the robot and is provided as a line laser sensor.
The robot and laser sensor hand-eye calibration device in the specific embodiment further comprises a TCP calibration module on the basis of FIG. 6, wherein the TCP calibration module is used for carrying out TCP calibration on the robot.
In specific implementation, the user coordinate system establishing module 601 is specifically configured to adjust the gun pose of the robot's welding gun so that the angle between the laser beam of the laser sensor and the plane of the calibration plate is within a preset range, and the laser beam of the laser sensor coincides with or is parallel to a specific edge of the calibration plate.
In a specific embodiment, the first point position obtaining module 602 is specifically configured to:
keep the gun posture of the welding gun unchanged, and move the robot according to the user coordinate system so that the center of the laser near field of view of the laser sensor can identify an identification point of the calibration plate;
and move along the Xs direction of the user coordinate system so that the laser beam of the laser sensor coincides with the specific edge of the calibration plate, and record the position of the robot tool end at this time as the first point position.
In a specific embodiment, the second point location obtaining module 603 is specifically configured to:
keep the gun posture of the welding gun unchanged, and move the robot according to the user coordinate system so that the center of the laser far field of view of the laser sensor can identify an identification point of the calibration plate;
and move along the Xs direction of the user coordinate system so that the laser beam of the laser sensor coincides with the specific edge of the calibration plate, and record the position of the robot tool end at this time as the second point position.
In a specific embodiment, the third point location obtaining module 604 is specifically configured to:
keep the gun posture of the welding gun unchanged, move the position of the robot so that the robot TCP is aligned with the spatial identification point on the calibration plate, and record the position of the robot tool end at this time as the third point position.
In a specific embodiment, the mathematical model building module 605 is specifically configured to:
the hand-eye calibration mathematical model of the robot and the laser sensor is constructed as follows:
Wherein P represents the position of the spatial identification point in the robot base coordinate system; P_i represents the coordinates in the robot base coordinate system when the laser sensor illuminates the spatial identification point; a rotation matrix maps the robot tool coordinate system to the base coordinate system; P_Laser represents the position of the spatial identification point in the laser sensor coordinate system; a transformation matrix maps the laser sensor coordinate system to the tool coordinate system; and M_TL represents its translation component.
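Taken together, these symbol definitions suggest the standard hand-eye transformation chain. A plausible reconstruction of the model (an assumption based on those definitions, not the patent's verbatim formula; $R_{T}^{B}$ and $R_{TL}$ are names introduced here for the rotation parts described above) is:

```latex
P \;=\; P_i \;+\; R_{T}^{B}\left( R_{TL}\, P_{Laser} + M_{TL} \right)
```

where $R_{T}^{B}$ denotes the rotation from the tool coordinate system to the base coordinate system, and $[R_{TL} \mid M_{TL}]$ denotes the transformation from the laser sensor coordinate system to the tool coordinate system, of which $M_{TL}$ is the translation component.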
In a specific embodiment, the calibration matrix determining module 606 is specifically configured to:
set an offset distance parameter;
and transform the first point position and the second point position according to the offset distance parameter to obtain four identification point positions, input the four identification point positions and the third point position into the hand-eye calibration mathematical model, and compute the calibration matrix between the robot and the laser sensor.
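One conventional way to compute such a calibration from corresponding point sets (the identification point positions expressed in the sensor frame versus the base frame) is a least-squares rigid fit. The sketch below uses the SVD-based Kabsch method; it is an illustrative assumption, not necessarily the solver the patent intends:

```python
import numpy as np

def fit_rigid_transform(src, dst):
    """Least-squares rigid transform (R, t) such that dst ≈ R @ src + t,
    computed with the SVD-based Kabsch method."""
    src = np.asarray(src, dtype=float)
    dst = np.asarray(dst, dtype=float)
    src_c = src - src.mean(axis=0)        # center both point sets
    dst_c = dst - dst.mean(axis=0)
    H = src_c.T @ dst_c                   # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against reflection
    D = np.diag([1.0, 1.0, d])
    R = Vt.T @ D @ U.T
    t = dst.mean(axis=0) - R @ src.mean(axis=0)
    return R, t
```

Given at least three non-collinear correspondences, `R` and `t` recover the rotation and translation components of the sensor-to-tool transform in the least-squares sense.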
In summary, the robot and laser sensor hand-eye calibration method and device provided by this embodiment have the following advantages:
A user coordinate system of the robot is established by a three-point method based on the calibration plate. The robot is moved according to the user coordinate system to obtain a first point position of the robot tool end when the laser near field of view of the laser sensor coincides with a specific edge of the calibration plate, and is moved again to obtain a second point position of the robot tool end when the laser far field of view coincides with the same edge. With the posture of the robot kept unchanged, the position of the robot is moved to obtain a third point position of the robot tool end when the tool center point (TCP) of the robot is aligned with the spatial identification point on the calibration plate. A hand-eye calibration mathematical model of the robot and the laser sensor is then constructed, and the calibration matrix between the robot and the laser sensor is determined from the first, second, and third point positions and the model. Calibration is thus completed by determining only three point positions and constructing a hand-eye calibration mathematical model, which avoids the errors introduced in the prior art by repeatedly teaching the laser onto the identification points manually, simplifies the calibration procedure, and improves calibration accuracy. Because repeated manual teaching is unnecessary, on-site debugging and maintenance are convenient, and the time cost of debugging is reduced.
Although the invention provides method operational steps as described in the examples or flowcharts, more or fewer operational steps may be included based on conventional or non-inventive labor. The order of steps recited in the embodiments is merely one way of performing the order of steps and does not represent a unique order of execution. When implemented by an actual device or client product, the instructions may be executed sequentially or in parallel (e.g., in a parallel processor or multi-threaded processing environment) as shown in the embodiments or figures.
It will be appreciated by those skilled in the art that embodiments of the present description may be provided as a method, apparatus (system) or computer program product. Accordingly, the present specification embodiments may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present invention is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flowchart illustrations and/or block diagrams, and combinations of flows and/or blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
In this specification, each embodiment is described in a progressive manner; identical and similar parts of the embodiments may be referred to each other, and each embodiment mainly describes its differences from the others. In particular, the system embodiments are described relatively simply because they are substantially similar to the method embodiments; for relevant details, reference may be made to the corresponding parts of the description of the method embodiments. In this document, relational terms such as "first" and "second" may be used solely to distinguish one entity or action from another, without necessarily requiring or implying any actual such relationship or order between such entities or actions.
It should be noted that, without conflict, the embodiments of the present invention and features of the embodiments may be combined with each other. The present invention is not limited to any single aspect, nor to any single embodiment, nor to any combination and/or permutation of these aspects and/or embodiments. Moreover, each aspect and/or embodiment of the invention may be used alone or in combination with one or more other aspects and/or embodiments.
It should be noted that the above embodiments are only used to illustrate the technical solution of the present invention, but not to limit the technical solution of the present invention, and although the detailed description of the present invention is given with reference to the above embodiments, it should be understood by those skilled in the art that the technical solution described in the above embodiments may be modified or some or all technical features may be equivalently replaced, and these modifications or substitutions do not make the essence of the corresponding technical solution deviate from the scope of the technical solution of the embodiments of the present invention, and all the modifications or substitutions are included in the scope of the claims and the specification of the present invention.

Claims (9)

1. A robot and laser sensor hand-eye calibration method, wherein the laser sensor is mounted on the robot, the method comprising:
establishing a user coordinate system of the robot by a three-point method based on a calibration plate, the calibration plate being placed within the reachable range of the robot;
moving the robot according to the user coordinate system, and obtaining a first point position of the robot tool end when the laser near field of view of the laser sensor coincides with a specific edge of the calibration plate;
moving the robot according to the user coordinate system, and obtaining a second point position of the robot tool end when the laser far field of view of the laser sensor coincides with the specific edge of the calibration plate;
keeping the posture of the robot unchanged, moving the position of the robot, and obtaining a third point position of the robot tool end when the tool center point (TCP) of the robot is aligned with a spatial identification point on the calibration plate;
constructing a hand-eye calibration mathematical model of the robot and the laser sensor; and
determining a calibration matrix between the robot and the laser sensor based on the first point position, the second point position, the third point position, and the hand-eye calibration mathematical model;
wherein, in the hand-eye calibration mathematical model, P represents the position of the spatial identification point in the robot base coordinate system; P_i represents the coordinates in the robot base coordinate system when the laser sensor illuminates the spatial identification point; a rotation matrix maps the robot tool coordinate system to the base coordinate system; P_Laser represents the position of the spatial identification point in the laser sensor coordinate system; a transformation matrix maps the laser sensor coordinate system to the tool coordinate system; and M_TL represents the translation component.
2. The robot and laser sensor hand-eye calibration method according to claim 1, wherein the laser sensor is a single-line laser sensor.
3. The robot and laser sensor hand-eye calibration method according to claim 1, further comprising: performing TCP calibration of the robot.
4. The robot and laser sensor hand-eye calibration method according to claim 1, wherein establishing the user coordinate system of the robot by the three-point method based on the calibration plate comprises: adjusting the gun posture of the welding gun of the robot so that the angle between the laser beam of the laser sensor and the plane of the calibration plate is within a preset range, and the laser beam of the laser sensor coincides with or is parallel to the specific edge of the calibration plate.
5. The robot and laser sensor hand-eye calibration method according to claim 4, wherein moving the robot according to the user coordinate system and obtaining the first point position of the robot tool end when the laser near field of view of the laser sensor coincides with the specific edge of the calibration plate comprises:
keeping the gun posture of the welding gun unchanged, and moving the robot according to the user coordinate system so that the center of the laser near field of view of the laser sensor can identify an identification point of the calibration plate; and
moving along the Xs direction of the user coordinate system so that the laser beam of the laser sensor coincides with the specific edge of the calibration plate, and recording the position of the robot tool end at this time as the first point position.
6. The robot and laser sensor hand-eye calibration method according to claim 4, wherein moving the robot according to the user coordinate system and obtaining the second point position of the robot tool end when the laser far field of view of the laser sensor coincides with the specific edge of the calibration plate comprises:
keeping the gun posture of the welding gun unchanged, and moving the robot according to the user coordinate system so that the center of the laser far field of view of the laser sensor can identify an identification point of the calibration plate; and
moving along the Xs direction of the user coordinate system so that the laser beam of the laser sensor coincides with the specific edge of the calibration plate, and recording the position of the robot tool end at this time as the second point position.
7. The robot and laser sensor hand-eye calibration method according to claim 4, wherein keeping the posture of the robot unchanged, moving the position of the robot, and obtaining the third point position of the robot tool end when the robot TCP is aligned with the spatial identification point on the calibration plate comprises:
keeping the gun posture of the welding gun unchanged, moving the position of the robot so that the robot TCP is aligned with the spatial identification point on the calibration plate, and recording the position of the robot tool end at this time as the third point position.
8. The robot and laser sensor hand-eye calibration method according to claim 1, wherein determining the calibration matrix between the robot and the laser sensor based on the first point position, the second point position, the third point position, and the hand-eye calibration mathematical model comprises:
setting an offset distance parameter; and
transforming the first point position and the second point position according to the offset distance parameter to obtain four identification point positions, inputting the four identification point positions and the third point position into the hand-eye calibration mathematical model, and computing the calibration matrix between the robot and the laser sensor.
9. A robot and laser sensor hand-eye calibration device, comprising:
a user coordinate system establishing module, configured to establish a user coordinate system of the robot by a three-point method based on a calibration plate, the calibration plate being placed within the reachable range of the robot;
a first point position obtaining module, configured to move the robot according to the user coordinate system and obtain a first point position of the robot tool end when the laser near field of view of the laser sensor coincides with a specific edge of the calibration plate;
a second point position obtaining module, configured to move the robot according to the user coordinate system and obtain a second point position of the robot tool end when the laser far field of view of the laser sensor coincides with the specific edge of the calibration plate;
a third point position obtaining module, configured to keep the posture of the robot unchanged, move the position of the robot, and obtain a third point position of the robot tool end when the robot tool center point (TCP) is aligned with the spatial identification point on the calibration plate;
a mathematical model construction module, configured to construct a hand-eye calibration mathematical model of the robot and the laser sensor; and
a calibration matrix determining module, configured to determine a calibration matrix between the robot and the laser sensor based on the first point position, the second point position, the third point position, and the hand-eye calibration mathematical model;
wherein the laser sensor is mounted on the robot; and
wherein, in the hand-eye calibration mathematical model, P represents the position of the spatial identification point in the robot base coordinate system; P_i represents the coordinates in the robot base coordinate system when the laser sensor illuminates the spatial identification point; a rotation matrix maps the robot tool coordinate system to the base coordinate system; P_Laser represents the position of the spatial identification point in the laser sensor coordinate system; a transformation matrix maps the laser sensor coordinate system to the tool coordinate system; and M_TL represents the translation component.
CN202211129307.2A 2022-09-16 2022-09-16 Robot and laser sensor hand-eye calibration method and device Active CN115319754B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211129307.2A CN115319754B (en) 2022-09-16 2022-09-16 Robot and laser sensor hand-eye calibration method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211129307.2A CN115319754B (en) 2022-09-16 2022-09-16 Robot and laser sensor hand-eye calibration method and device

Publications (2)

Publication Number Publication Date
CN115319754A CN115319754A (en) 2022-11-11
CN115319754B true CN115319754B (en) 2025-03-04

Family

ID=83930485

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211129307.2A Active CN115319754B (en) 2022-09-16 2022-09-16 Robot and laser sensor hand-eye calibration method and device

Country Status (1)

Country Link
CN (1) CN115319754B (en)

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110009685A (en) * 2018-12-29 2019-07-12 南京衍构科技有限公司 A kind of laser camera hand and eye calibrating method increasing material applied to electric arc

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2018214147A1 (en) * 2017-05-26 2018-11-29 深圳配天智能技术研究院有限公司 Robot calibration method and system, robot and storage medium
CN107907048A (en) * 2017-06-30 2018-04-13 长沙湘计海盾科技有限公司 A kind of binocular stereo vision method for three-dimensional measurement based on line-structured light scanning
CN111735390B (en) * 2020-08-28 2020-12-11 中国计量大学 A calibration block and hand-eye calibration method for line laser sensor
CN112621711B (en) * 2020-11-19 2022-11-29 深圳众为兴技术股份有限公司 Robot, hand-eye calibration method for fixing camera of robot on frame and storage medium
CN114670203A (en) * 2022-04-20 2022-06-28 无锡信捷电气股份有限公司 Automatic welding hand-eye calibration method for laser vision guided robot


Also Published As

Publication number Publication date
CN115319754A (en) 2022-11-11

Similar Documents

Publication Publication Date Title
CN108748159B (en) Self-calibration method for tool coordinate system of mechanical arm
JP7207851B2 (en) Control method, robot system, article manufacturing method, program and recording medium
CN107738254B (en) A method and system for converting and calibrating a manipulator coordinate system
EP1936458B1 (en) Device, method, program and recording medium for robot offline programming
CN107214692B (en) Automatic Calibration Method of Robot System
CN106777656B (en) A PMPSD-based Absolute Precision Calibration Method for Industrial Robots
CN112833792B (en) An accuracy calibration and verification method for a six-degree-of-freedom manipulator
JP2005201824A (en) Measuring device
JP2005300230A (en) Measuring instrument
CN110900610B (en) Industrial robot calibration method based on LM algorithm and particle filter algorithm optimization
CN103153553A (en) Vision-guided alignment system and method
CN109848989B (en) A ruby probe-based automatic calibration and detection method of robot execution end
CN107053216A (en) The automatic calibration method and system of robot and end effector
JP3349652B2 (en) Offline teaching method
TW202128378A (en) Calibrating method and calibrating system
CN113211445A (en) Robot parameter calibration method, device, equipment and storage medium
CN112907682A (en) Hand-eye calibration method and device for five-axis motion platform and related equipment
CN113352345A (en) System, method and device for replacing quick-change device, electronic equipment and storage medium
WO2024207703A1 (en) Hand-eye calibration method and system without kinematics involvement
CN113211436B (en) Six-degree-of-freedom series robot error calibration method based on genetic algorithm
CN115319754B (en) Robot and laser sensor hand-eye calibration method and device
Chen et al. Object-based terminal positioning solution within task-boosted global constraint for improving mobile robotic stacking accuracy
CN114131607A (en) Method and system for calibrating kinematics of generalized kinematics error of industrial robot
CN113240753A (en) Sphere fitting method for calibrating base coordinate system of robot and double-shaft deflection mechanism
CN108592838B (en) Calibration method and device of tool coordinate system and computer storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant