
CN119036517B - End effector guiding system and method based on 2D camera and laser range finder - Google Patents


Info

Publication number
CN119036517B
CN119036517B · Application CN202411524161.0A
Authority
CN
China
Prior art keywords
end effector
camera
plane
range finder
laser range
Prior art date
Legal status
Active
Application number
CN202411524161.0A
Other languages
Chinese (zh)
Other versions
CN119036517A (en)
Inventor
莫威
黄雅阁
葛成
蔡相宇
Current Assignee
Shanghai Smartstate Technology Co ltd
Original Assignee
Shanghai Smartstate Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Shanghai Smartstate Technology Co ltd filed Critical Shanghai Smartstate Technology Co ltd
Priority to CN202411524161.0A
Publication of CN119036517A
Application granted
Publication of CN119036517B
Active legal status
Anticipated expiration legal status


Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J19/00Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
    • B25J19/0095Means or methods for testing manipulators
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J19/00Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Manipulator (AREA)

Abstract


The present invention provides an end effector guidance system and method based on a 2D camera and a laser rangefinder, comprising a robot, a measuring system and an end effector; the end effector and the measuring system are both installed at the end of the robot; the measuring system obtains spatial information of a working plane, and the robot adjusts the posture of the end effector according to the position information, so that the feeding direction of the end effector is perpendicular to the working plane, and moves the end effector to the working plane to complete the working instruction. The system for guiding the end effector based on a 2D camera and a laser rangefinder provided by the present invention can measure depth information, and can perform end effector operations on planes of different heights and different inclinations.

Description

End effector guiding system and method based on 2D camera and laser range finder
Technical Field
The invention relates to the technical field of robot vision guidance, in particular to an end effector guiding system and method based on a 2D camera and a laser range finder.
Background
In the field of automated vision guidance, end effector operations generally fall into two categories: repetitive operations on fixed workpieces and sensor-guided operations on non-fixed workpieces. In the first category, a path is preset for the execution module to achieve stable end effector operation; its accuracy depends on the repeated positioning accuracy of the execution mechanism, but whenever the workpiece is replaced, the path must be re-planned to suit the requirements of the new product. In the second category, a sensor recognizes the features of the work site and guides the end effector accordingly; its accuracy is determined jointly by the execution mechanism and the sensor, and it can operate on different workpieces without changing the program.
For sensor-guided operation on non-fixed workpieces, two schemes may be used depending on the sensor: a 2D camera or a 3D camera. If a 2D camera is used, the distance between the workpiece and the camera's shooting point must remain constant during operation, and the operated surface may rotate only about its own normal direction; otherwise the operating accuracy of the end effector cannot be guaranteed. Moreover, the height of the workpiece must be fixed, since a monocular camera cannot measure height information. If a 3D camera is used, the workpiece may be placed in any position, but a 3D camera accurate enough for high-precision end effector operation is considerably more expensive.
Disclosure of Invention
In view of the drawbacks of the prior art, an object of the present invention is to provide an end effector guiding system and method based on a 2D camera and a laser rangefinder.
The end effector guiding system based on the 2D camera and the laser range finder comprises a robot, a measuring system and an end effector, wherein the end effector and the measuring system are arranged at the tail end of the robot;
the measuring system acquires the space information of the working plane, and the robot adjusts the pose of the end effector according to the position information, so that the feeding direction of the end effector is perpendicular to the working plane, and moves the end effector to the working plane to finish the working instruction.
Preferably, the measuring system comprises a 2D camera and a laser range finder, wherein the method for acquiring the space information of the operation plane comprises the steps of acquiring the two-dimensional plane information of three points which are not collinear in the operation plane through the 2D camera, acquiring the height information of the three points through the laser range finder, obtaining the three-dimensional coordinate information of the three points, and obtaining the space information of the operation plane according to the three-dimensional coordinate information.
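The three-point plane measurement described above can be sketched as follows (an illustrative Python sketch, not code from the patent; the function name and sample coordinates are ours). Each point combines the camera's 2D coordinates with a rangefinder height, and the plane normal falls out of a cross product:

```python
import math

def plane_normal(p1, p2, p3):
    """Unit normal of the plane through three non-collinear 3D points."""
    v1 = tuple(b - a for a, b in zip(p1, p2))
    v2 = tuple(b - a for a, b in zip(p1, p3))
    # Cross product v1 x v2 is perpendicular to both in-plane vectors.
    n = (v1[1] * v2[2] - v1[2] * v2[1],
         v1[2] * v2[0] - v1[0] * v2[2],
         v1[0] * v2[1] - v1[1] * v2[0])
    length = math.sqrt(sum(c * c for c in n))
    if length == 0:
        raise ValueError("points are collinear")
    return tuple(c / length for c in n)

# Example: a plane tilted 45 degrees about the x-axis (z rises with y).
normal = plane_normal((0, 0, 0), (1, 0, 0), (0, 1, 1))
```

For a flat plane (all three heights equal) the normal reduces to the z-axis, matching the initial feed direction.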
According to the end effector guiding method based on the 2D camera and the laser range finder, which is provided by the invention, the end effector guiding method comprises the following steps:
step S1, determining a reference plane of the end effector, and obtaining a feed direction vector OA of the end effector compared with the reference plane;
step S2, determining a working plane of the end effector, acquiring coordinates of any three points in the working plane, and calculating a normal vector OA' of the working plane;
step S3, calculating the included angle between the normal vector OA' of the working plane and the feed direction vector OA of the end effector ;
Step S4, according to the included angleAdjusting the feeding direction of the end effector to obtain a feeding direction vector of the end effector after adjustment;
Step S5, measuring the shooting height of the 2D camera, if the shooting height exceeds the reference height range, adjusting the shooting height of the 2D camera, and repeatedly executing the steps S2-S4 until the reference height requirement is met;
the reference height is set in a self-defining mode according to the operation content;
and S6, determining characteristic points on the working plane, acquiring position information of the characteristic points, controlling the end effector to move to the characteristic points, and executing a working instruction.
Preferably, the reference plane is an initial calibration plane of the end effector, and in an original state, a feeding direction of the end effector is perpendicular to the reference plane.
Preferably, the coordinate acquisition mode of any three points in the working plane is that the working plane is shot by a 2D camera, two-dimensional coordinate information (x 1, y 1), (x 2, y 2) and (x 3, y 3) of the three points are determined, and coordinate values Z1, Z2 and Z3 of the three points in the Z direction are obtained according to the distance from the laser range finder to any one reference point on the reference plane and the distance from the laser range finder to the three points in the working plane.
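One plausible reading of the Z-coordinate recovery above is heights taken relative to a reference point on the reference plane; the sign convention (points closer to the rangefinder lie higher) is our assumption, not stated in the patent:

```python
# Sketch of Z recovery: Z of each point is the reference distance minus
# the measured distance to that point (sign convention is our assumption).
def z_from_distances(d_ref, distances):
    """Z value of each point relative to the reference plane."""
    return [d_ref - d for d in distances]

# Reference point at 300 mm; three points on the tilted working plane.
z1, z2, z3 = z_from_distances(300.0, [300.0, 290.0, 310.0])
```

Here a point level with the reference gets Z = 0, a closer point a positive Z, and a farther point a negative Z.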
Preferably, the method for adjusting the feed direction of the end effector according to the included angle θ includes:
representing the rotation of the end effector by a quaternion, namely q = cos(θ/2) + (a·i + b·j + c·k)·sin(θ/2), wherein q is the rotation quaternion, θ is the angle between OA' and OA, (a, b, c) are the components of the unit vector orthogonal to OA' and OA, and i, j, k are imaginary units.
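As an illustrative sketch (not code from the patent; the function and variable names are ours), the rotation quaternion can be constructed from OA and OA' in the standard axis-angle form cos(θ/2) + sin(θ/2)·(a·i + b·j + c·k):

```python
import math

def rotation_quaternion(oa, oa_prime):
    """Quaternion (w, x, y, z) rotating feed direction OA onto normal OA'."""
    def dot(u, v): return sum(a * b for a, b in zip(u, v))
    def norm(u): return math.sqrt(dot(u, u))
    # Angle between the two vectors (clamped against rounding error).
    cos_theta = max(-1.0, min(1.0, dot(oa, oa_prime) / (norm(oa) * norm(oa_prime))))
    theta = math.acos(cos_theta)
    # Rotation axis (a, b, c): unit vector orthogonal to both, via cross product.
    axis = (oa[1] * oa_prime[2] - oa[2] * oa_prime[1],
            oa[2] * oa_prime[0] - oa[0] * oa_prime[2],
            oa[0] * oa_prime[1] - oa[1] * oa_prime[0])
    n = norm(axis)
    if n == 0:  # vectors already parallel: identity rotation
        return (1.0, 0.0, 0.0, 0.0)
    a, b, c = (v / n for v in axis)
    half = theta / 2
    return (math.cos(half), a * math.sin(half), b * math.sin(half), c * math.sin(half))

# OA = (0, 0, 1); working plane tilted so its normal is OA' = (0, 1, 1).
q = rotation_quaternion((0, 0, 1), (0, 1, 1))
```

The resulting q is a unit quaternion encoding a 45-degree rotation about the x-axis for this example.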
Preferably, step S4 further comprises setting a range of angles over which end effector rotation is not required.
Preferably, the method for measuring the shooting height of the 2D camera comprises the steps of selecting any point in a reference plane as a reference point, measuring the distance from the reference point through a laser range finder, and obtaining the distance from the 2D camera to the reference point through conversion based on the position relation between the laser range finder and the 2D camera, wherein the distance is regarded as the distance from the 2D camera to a working plane.
Compared with the prior art, the invention has the following beneficial effects:
1. Compared with existing 2D camera schemes, the system for guiding the end effector based on a 2D camera and a laser range finder provided by the invention can measure depth information and can operate the end effector on planes of different heights and inclinations.
2. Compared with current 3D camera schemes, the system provided by the invention can greatly reduce equipment acquisition cost at the same precision requirement.
3. The method for guiding the end effector based on a 2D camera and a laser range finder provided by the invention can adapt to planes of different heights and inclinations so that the end effector operates accurately, perpendicular to the working plane.
Drawings
Other features, objects and advantages of the present invention will become more apparent upon reading of the detailed description of non-limiting embodiments, given with reference to the accompanying drawings in which:
FIG. 1 is a schematic frame diagram of a control system of the present invention;
FIG. 2 is a schematic diagram of the control principle of the control system of the present invention;
FIG. 3 is a flow chart of the control method of the present invention.
Reference numerals illustrate:
Detailed Description
The present invention will be described in detail with reference to specific examples. The following examples will assist those skilled in the art in further understanding the present invention, but are not intended to limit the invention in any way. It should be noted that variations and modifications could be made by those skilled in the art without departing from the inventive concept. These are all within the scope of the present invention.
The invention provides an end effector 300 guiding system based on a 2D camera 201 and a laser range finder 202, which is shown with reference to figures 1 and 2 and comprises a robot 100, a measuring system 200 and an end effector 300. Wherein the end effector 300 and the measurement system 200 are both mounted at the end of the robot 100. The measurement system 200 comprises a 2D camera 201 and a laser range finder 202, three-dimensional coordinate information of a target point is obtained through measurement and calculation of the 2D camera 201 and the laser range finder 202, and the robot 100 controls the end effector 300 to finish a set operation instruction according to the three-dimensional coordinate information.
In one particular embodiment, the end effector 300 includes, but is not limited to, a dispenser, a welder, a hole maker, and the like.
The 2D camera 201 serves as a server and communicates with the robot 100 over TCP/IP: the robot 100 sends a program execution instruction to the 2D camera 201, and the 2D camera 201 recognizes the feature point and then sends its two-dimensional coordinates back to the robot 100.
The laser rangefinder 202 serves as a server and the robot 100 serves as a client for network communication via TCP/IP. The robot 100 transmits a program execution instruction to the laser rangefinder 202, and the laser rangefinder 202 transmits the depth data at that time to the robot 100.
The end effector 300 performs a final execution operation, the end effector 300 is controlled by the robot 100 through the DI/DO module, and after the robot 100 transmits a program execution signal, the end effector 300 starts a corresponding operation.
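The TCP/IP exchanges above can be sketched as follows. This is purely illustrative: the patent specifies only that the camera and rangefinder act as servers and the robot as client; the "TRIGGER" command, the reply format, and the port number are all invented for the sketch.

```python
import socket
import threading

def fake_camera_server(port, reply=b"12.5,34.0\n"):
    """Stand-in for the 2D camera: answer one trigger with coordinates."""
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    srv.bind(("127.0.0.1", port))
    srv.listen(1)
    def serve():
        conn, _ = srv.accept()
        if conn.recv(64).strip() == b"TRIGGER":
            conn.sendall(reply)  # feature-point coordinates as "x,y"
        conn.close()
        srv.close()
    threading.Thread(target=serve, daemon=True).start()

def request_feature_point(port):
    """Robot-side client: trigger the camera and parse the 2D coordinates."""
    with socket.create_connection(("127.0.0.1", port), timeout=5) as sock:
        sock.sendall(b"TRIGGER\n")
        x, y = sock.makefile().readline().strip().split(",")
    return float(x), float(y)

fake_camera_server(50007)
point = request_feature_point(50007)
```

The rangefinder link would follow the same client/server pattern, returning a single depth value instead of a coordinate pair.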
The end effector 300 guidance system disclosed by the invention operates on the following principle:
First, the 2D camera 201 recognizes a feature point, which may be a point in the operation plane, or a center point of the work area, or a work start point of the end effector 300, and transmits two-dimensional plane information thereof to the robot 100. The two-dimensional plane information is two-dimensional coordinate information of the feature points in a reference plane, which is an initial calibration plane of the robot 100, and in an initial stage, a feeding direction of the robot 100 is perpendicular to the reference plane.
Next, the TCP (tool center point) of the robot 100 is changed according to the inclination of the operation plane so that the feeding direction of the robot 100 is perpendicular to the operation plane; in a specific embodiment, the TCP of the robot 100 is the working end point of the end effector 300. The height of the robot 100 is then changed according to the height of the operation plane relative to the TCP.
finally, the robot 100 controls the end effector 300 to move to the vicinity of the feature point based on the information transmitted from the 2D camera 201 and the laser range finder 202, and performs the operation of the end effector 300. The method provided by the invention can adapt to the requirements of high-precision positioning of the end effector 300 on different planes. Wherein the operating plane is the plane in which the end effector 300 operates, and is at an angle and height difference from the reference plane.
The invention also provides a guiding method of the end effector 300 based on the 2D camera 201 and the laser range finder 202, which is based on the above guiding system and is used for guiding the end effector 300 to execute operation instructions. Specifically, referring to fig. 3, the guiding method includes the following steps:
step S1, determining a reference plane of the end effector 300, and obtaining a feed direction vector OA of the end effector 300 compared with the reference plane.
Specifically, the reference plane is the initial calibration plane of the end effector 300, and in the original state the feeding direction of the end effector 300 is perpendicular to the reference plane. In this initial state the feed direction vector of the end effector 300 relative to the reference plane is OA = (0, 0, 1).
Step S2, determining a working plane of the end effector 300, acquiring coordinates of any three points in the working plane, and calculating a normal vector OA' of the working plane;
In a specific embodiment, the coordinate acquisition mode of any three points in the working plane is that the working plane is shot by the 2D camera 201, two-dimensional coordinate information (x 1, y 1), (x 2, y 2) and (x 3, y 3) of the three points are determined, and coordinate values Z1, Z2 and Z3 of the three points in the Z direction are obtained according to the distance from the laser range finder 202 to any one reference point on the reference plane and the distance from the laser range finder 202 to the three points in the working plane.
Step S3, calculating the included angle θ between the normal vector OA' of the working plane and the feed direction vector OA of the end effector 300;
Step S4, adjusting the feed direction of the end effector 300 according to the included angle θ to obtain the adjusted feed direction vector of the end effector 300.
The specific adjusting method expresses the rotation with a quaternion, namely q = cos(θ/2) + (a·i + b·j + c·k)·sin(θ/2). The robot 100 can be rotated to the new end effector feed direction vector by multiplying the current quaternion by the rotation quaternion q, where θ is the angle between OA and OA', (a, b, c) are the components of the unit vector orthogonal to OA and OA', and i, j, k are imaginary units.
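Applying such a rotation quaternion to a direction vector can be sketched as below (an illustrative Python sketch, not the patent's implementation; the Hamilton-product convention and the q·(0, v)·q⁻¹ rotation formula are standard but assumed here):

```python
import math

def quat_mul(p, q):
    """Hamilton product of two quaternions given as (w, x, y, z)."""
    pw, px, py, pz = p
    qw, qx, qy, qz = q
    return (pw * qw - px * qx - py * qy - pz * qz,
            pw * qx + px * qw + py * qz - pz * qy,
            pw * qy - px * qz + py * qw + pz * qx,
            pw * qz + px * qy - py * qx + pz * qw)

def rotate(v, q):
    """Rotate 3D vector v by unit quaternion q via q * (0, v) * conj(q)."""
    qc = (q[0], -q[1], -q[2], -q[3])
    w = quat_mul(quat_mul(q, (0.0, *v)), qc)
    return w[1:]

# 90-degree rotation about the x-axis: q = cos(45°) + sin(45°)·i.
half = math.pi / 4
q = (math.cos(half), math.sin(half), 0.0, 0.0)
new_feed = rotate((0.0, 0.0, 1.0), q)  # rotate the feed direction OA
```

Vectors along the rotation axis are left unchanged, which is a quick sanity check for any quaternion convention.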
In a preferred embodiment, an angular range is set within which the end effector 300 need not be rotated. The smaller this range, the better the perpendicularity of the end effector 300 to the plane, but the more rotations the end effector 300 must make; the larger the range, the worse the perpendicularity, but the fewer the rotations.
Step S5, the laser range finder 202 measures the shooting height of the 2D camera 201; if the shooting height exceeds the reference height range, the height of the 2D camera 201 is adjusted, and steps S2 to S4 are repeated until the included angle θ between the normal vector OA' of the working plane and the feed direction vector OA of the end effector 300 meets the requirement and the adjusted height is within the reference height range.
The feature point may be a point in the work plane, such as the start point of the end effector's operation or the center point of the work area. The reference height range is set manually according to actual working conditions.
The imaging field of view of the 2D camera 201 differs at different photographing heights, and according to the perspective principle the scale of the captured image changes accordingly. Therefore, after the posture of the end effector 300 is adjusted, the distance from the 2D camera 201 to the working plane must be measured; if this distance exceeds the reference height range, the posture of the end effector 300 is judged to deviate from the final ideal posture. In that case, after the distance from the 2D camera 201 to the working plane has been adjusted, the normal vector of the working plane is measured again, further calibrating the posture of the end effector 300.
In a specific embodiment, the distance value between the 2D camera 201 and the working plane is measured by selecting any point in the reference plane as a reference point, measuring the distance between the reference point by the laser range finder 202, and converting the distance between the 2D camera 201 and the reference point based on the position relation between the laser range finder 202 and the 2D camera 201 to be regarded as the distance between the 2D camera 201 and the working plane.
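The distance conversion and range check can be sketched in one dimension as follows. The 12.0 mm offset and the (180, 220) mm reference band are invented values; the real conversion depends on the calibrated mounting relation between the rangefinder and the camera:

```python
# Minimal 1D sketch: convert a laser reading into the camera-to-plane
# distance via a fixed calibrated offset, then check the reference band.
LASER_TO_CAMERA_OFFSET_MM = 12.0          # invented calibration value
REF_HEIGHT_RANGE_MM = (180.0, 220.0)      # invented reference band

def camera_height(laser_distance_mm):
    """Camera-to-plane distance inferred from the rangefinder reading."""
    return laser_distance_mm + LASER_TO_CAMERA_OFFSET_MM

def height_ok(laser_distance_mm):
    """True if the converted camera height lies in the reference band."""
    lo, hi = REF_HEIGHT_RANGE_MM
    return lo <= camera_height(laser_distance_mm) <= hi

h = camera_height(193.0)
ok = height_ok(193.0)
too_far = height_ok(250.0)
```

If `height_ok` fails, the guiding loop of steps S2 to S4 would run again after repositioning the camera.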
Step S6, coordinate information and height information of the feature points are acquired, the end effector 300 is controlled to move to the feature points, and the working instruction is executed.
Those skilled in the art will appreciate that, in addition to being implemented as pure computer-readable program code, the system and its devices, modules, and units provided by the invention can be implemented entirely by logic programming of the method steps, in the form of logic gates, switches, application-specific integrated circuits, programmable logic controllers, embedded microcontrollers, and the like. The system and its devices, modules, and units can therefore be regarded as a hardware component; the devices, modules, and units realizing its various functions can be regarded either as structures within that hardware component or as software modules realizing the method.
In the description of the present application, it should be understood that the terms "upper", "lower", "front", "rear", "left", "right", "vertical", "horizontal", "top", "bottom", "inner", "outer", etc. indicate orientations or positional relationships based on the orientations or positional relationships shown in the drawings, are merely for convenience in describing the present application and simplifying the description, and do not indicate or imply that the devices or elements referred to must have a specific orientation, be configured and operated in a specific orientation, and thus should not be construed as limiting the present application.
The foregoing describes specific embodiments of the present application. It is to be understood that the application is not limited to the particular embodiments described above, and that various changes or modifications may be made by those skilled in the art within the scope of the appended claims without affecting the spirit of the application. The embodiments of the application and the features of the embodiments may be combined with each other arbitrarily without conflict.

Claims (7)

1. An end effector guiding method based on a 2D camera and a laser range finder, comprising:
step S1, determining a reference plane of the end effector (300) to obtain a feed direction vector OA of the end effector (300) compared with the reference plane;
s2, determining a working plane of the end effector (300), acquiring coordinates of any three points in the working plane, and calculating a normal vector OA' of the working plane;
The space information of the operation plane is determined by acquiring three-dimensional coordinates of non-collinear three points in the operation plane through a 2D camera (201) and a laser range finder (202);
Step S3, calculating the included angle θ between the normal vector OA' of the working plane and the feed direction vector OA of the end effector (300);
Step S4, adjusting the feed direction of the end effector (300) according to the included angle θ to obtain the adjusted feed direction vector of the end effector (300);
Step S5, measuring the shooting height of the 2D camera (201), if the shooting height exceeds a reference height range, adjusting the shooting height of the 2D camera (201), and repeatedly executing the steps S2-S4 until the reference height requirement is met;
the reference height is set in a self-defining mode according to the operation content;
step S6, determining characteristic points on the working plane, acquiring position information of the characteristic points, controlling the end effector (300) to move to the characteristic points, and executing a working instruction;
The coordinate acquisition mode of any three points in the working plane comprises the steps of shooting the working plane through a 2D camera (201), determining two-dimensional coordinate information (x 1, y 1), (x 2, y 2) and (x 3, y 3) of the three points, and obtaining coordinate values Z1, Z2 and Z3 of the three points in the Z direction according to the distance from the laser range finder (202) to any one reference point on the reference plane and the distance from the laser range finder (202) to the three points in the working plane.
2. The end effector guiding method based on a 2D camera and a laser range finder according to claim 1, wherein the reference plane is an initial calibration plane of the end effector (300), and in an original state, a feeding direction of the end effector (300) is perpendicular to the reference plane.
3. The end effector guiding method based on a 2D camera and a laser range finder according to claim 1, wherein the method of adjusting the feed direction of the end effector (300) according to the included angle θ includes:
representing the rotation of the end effector (300) by a quaternion, namely q = cos(θ/2) + (a·i + b·j + c·k)·sin(θ/2), wherein q is the rotation quaternion, θ is the angle between OA' and OA, (a, b, c) are the components of the unit vector orthogonal to OA' and OA, and i, j, k are imaginary units.
4. The end effector guiding method based on a 2D camera and a laser range finder of claim 1, further comprising setting an angular range in which rotation of the end effector (300) is unnecessary in step S4.
5. The end effector guiding method based on the 2D camera and the laser range finder according to claim 1, wherein the measuring method of the photographing height of the 2D camera (201) includes selecting an arbitrary point in a reference plane as a reference point, measuring a distance to the reference point by the laser range finder (202), and converting a distance from the 2D camera (201) to the reference point based on a positional relationship between the laser range finder (202) and the 2D camera (201), and considering the distance from the 2D camera (201) to a working plane.
6. An end effector guiding system based on a 2D camera and a laser range finder, adopting the end effector guiding method based on a 2D camera and a laser range finder according to any one of claims 1 to 5, characterized by comprising a robot (100), a measuring system (200) and an end effector (300), wherein the end effector (300) and the measuring system (200) are both installed at the end of the robot (100);
The measuring system (200) acquires spatial information of a working plane, the robot (100) adjusts the pose of the end effector (300) according to the position information, enables the feeding direction of the end effector (300) to be perpendicular to the working plane, and moves the end effector (300) to the working plane to finish a working instruction.
7. The end effector guiding system based on the 2D camera and the laser range finder according to claim 6, wherein the measuring system (200) comprises the 2D camera (201) and the laser range finder (202), and the method for acquiring the spatial information of the working plane is that two-dimensional plane information of three points which are not collinear in the working plane is acquired through the 2D camera (201), the height information of the three points is acquired through the laser range finder (202), three-dimensional coordinate information of the three points is obtained, and the spatial information of the working plane is obtained according to the three-dimensional coordinate information.
CN202411524161.0A 2024-10-30 2024-10-30 End effector guiding system and method based on 2D camera and laser range finder Active CN119036517B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202411524161.0A CN119036517B (en) 2024-10-30 2024-10-30 End effector guiding system and method based on 2D camera and laser range finder

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202411524161.0A CN119036517B (en) 2024-10-30 2024-10-30 End effector guiding system and method based on 2D camera and laser range finder

Publications (2)

Publication Number Publication Date
CN119036517A (en) 2024-11-29
CN119036517B (en) 2025-03-11

Family

ID=93580055

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202411524161.0A Active CN119036517B (en) 2024-10-30 2024-10-30 End effector guiding system and method based on 2D camera and laser range finder

Country Status (1)

Country Link
CN (1) CN119036517B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN119304561B (en) * 2024-12-17 2025-03-18 潍坊雷腾动力机械有限公司 Diesel engine assembly system and control method thereof

Citations (2)

Publication number Priority date Publication date Assignee Title
CN116766196A (en) * 2023-07-11 2023-09-19 华东理工大学 Outer hexagon bolt assembly control method, system, equipment and storage medium
CN117381789A (en) * 2023-11-13 2024-01-12 江苏徐工工程机械研究院有限公司 Breaking and disassembling robot, control method thereof, controller, control system and storage medium

Family Cites Families (7)

Publication number Priority date Publication date Assignee Title
US5340962A (en) * 1992-08-14 1994-08-23 Lumonics Corporation Automatic control of laser beam tool positioning
CN104816307B (en) * 2015-03-25 2016-08-17 西北工业大学 The four-point method of the accurate drilling of industrial robot is to leveling method
CN108177143B (en) * 2017-12-05 2021-08-10 上海工程技术大学 Robot positioning and grabbing method and system based on laser vision guidance
CN108680101B (en) * 2018-04-11 2019-11-01 上海交通大学 Mechanical arm tail end space repetitive positioning accuracy measuring device and method
CN112025722B (en) * 2020-08-19 2022-04-29 上海拓璞数控科技股份有限公司 C-shaped automatic drilling and riveting equipment and workpiece normal measurement and adjustment method
CN112070133B (en) * 2020-08-27 2023-02-03 武汉华工激光工程有限责任公司 Three-dimensional space point positioning method based on distance measuring instrument and machine vision
CN115731285A (en) * 2021-08-27 2023-03-03 中国科学院自动化研究所 Robot terminal control method, device, electronic equipment and storage medium

Patent Citations (2)

Publication number Priority date Publication date Assignee Title
CN116766196A (en) * 2023-07-11 2023-09-19 华东理工大学 Outer hexagon bolt assembly control method, system, equipment and storage medium
CN117381789A (en) * 2023-11-13 2024-01-12 江苏徐工工程机械研究院有限公司 Breaking and disassembling robot, control method thereof, controller, control system and storage medium

Also Published As

Publication number Publication date
CN119036517A (en) 2024-11-29


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant