
CN116423495A - PNP-based 3D positioning tracking method and system - Google Patents

PNP-based 3D positioning tracking method and system

Info

Publication number
CN116423495A
CN116423495A (application number CN202310162817.8A)
Authority
CN
China
Prior art keywords
coordinate system
camera
pnp
target
manipulator
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310162817.8A
Other languages
Chinese (zh)
Inventor
侯梦华
陈凯
梅文宝
刘俊锋
彭家豪
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Qiling Image Technology Co ltd
Original Assignee
Shenzhen Qiling Image Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Qiling Image Technology Co ltd
Priority to CN202310162817.8A, Critical
Publication of CN116423495A
Pending legal-status Critical Current

Classifications

    • B25J 9/1664 Programme controls characterised by programming, planning systems for manipulators, characterised by motion, path, trajectory planning
    • B25J 9/1697 Vision controlled systems
    • G06T 1/0014 Image feed-back for automatic industrial control, e.g. robot with camera
    • G06T 7/73 Determining position or orientation of objects or cameras using feature-based methods
    • G06T 7/80 Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G06T 2207/10016 Video; Image sequence
    • Y02P 90/02 Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Mechanical Engineering (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention discloses a PNP-based 3D positioning and tracking method and system. In an off-line state, the camera is first calibrated to obtain the camera intrinsics required by the PNP positioning model, a standard position relationship between the target coordinate system and the camera coordinate system is taught based on the PNP positioning method, and hand-eye calibration is then performed to determine the conversion relationship between the camera coordinate system and the manipulator tool coordinate system. When the target moves irregularly, the camera photographs the target in real time to obtain a target image; 3D positioning of the target is performed by the PNP method, and the pose of the target coordinate system in the camera coordinate system at the photographing moment is calculated. The pose of the manipulator tool coordinate system in the base coordinate system that keeps the taught camera-target relation unchanged, i.e. the position to which the manipulator needs to move, is then calculated, and the manipulator is controlled to move according to this pose. Because the PNP method is used for 3D positioning, high-precision positioning can be achieved, little data is needed for positioning, and the positioning speed is high.

Description

PNP-based 3D positioning tracking method and system
Technical Field
The invention relates to the technical field of target tracking, in particular to a 3D positioning tracking method and system based on PNP.
Background
With the development of industrial automation technology, manipulators have been widely used in many fields of industry, mainly multi-degree-of-freedom manipulators performing simple assembly and manufacturing operations. Most industrial manipulators work on production lines, executing monotonically repeated operations according to a given program, and therefore lack autonomy and decision-making ability. In the prior art, 3D imaging is used to acquire data for locating a target object and is combined with manipulator control so that the manipulator can move to the position of the target object and perform the corresponding operation. 3D imaging yields rich data, but its acquisition speed is relatively slow. Target tracking must be fast and does not require much data for positioning, so the conventional 3D imaging approach is not suited to the requirements of fast 3D positioning and tracking.
Disclosure of Invention
Aiming at the technical problems, the invention provides a 3D positioning tracking method and system based on PNP, which have high positioning speed and high precision.
The embodiment of the invention provides a PNP-based 3D positioning and tracking method, which comprises the following steps: in an off-line state, first calibrating the camera to obtain the camera intrinsics required by the PNP positioning model, teaching a standard position relationship between the target coordinate system and the camera coordinate system based on the PNP positioning method, and then performing hand-eye calibration to determine the conversion relationship between the camera coordinate system and the manipulator tool coordinate system; when the target moves irregularly, photographing the target in real time with the camera to obtain a target image; performing 3D positioning of the target based on the PNP method, and calculating the pose of the target coordinate system in the camera coordinate system at the photographing moment; calculating, with the taught positions of the camera and the target kept unchanged, the pose of the manipulator tool coordinate system in the manipulator base coordinate system, i.e. the position to which the manipulator needs to move; and controlling the manipulator to move according to the pose.
Optionally, the step of calculating, with the taught positions of the camera and the target kept unchanged, the pose of the manipulator tool coordinate system in the manipulator base coordinate system, i.e. the position to which the manipulator needs to move, includes: calculating the pose of the target coordinate system in the camera coordinate system by the PNP method; calculating the pose of the real-time target coordinate system in the manipulator base coordinate system from the position relationship between the camera coordinate system and the manipulator tool coordinate system and the pose of the manipulator tool coordinate system in the manipulator base coordinate system; and calculating the pose of the manipulator tool coordinate system in the manipulator base coordinate system while the positions of the camera coordinate system and the manipulator tool coordinate system are kept unchanged.
Optionally, the camera calibration is a single-camera calibration of the camera intrinsics. A monocular-camera PNP positioning model is established, and the conversion relationship between any point (Xw, Yw, Zw) in space and its pixel coordinates (u, v) is

Z_c \begin{bmatrix} u \\ v \\ 1 \end{bmatrix} =
\begin{bmatrix} f/S_x & r & u_0 & 0 \\ 0 & f/S_y & v_0 & 0 \\ 0 & 0 & 1 & 0 \end{bmatrix}
\begin{bmatrix} R & T \\ 0^{\mathrm{T}} & 1 \end{bmatrix}
\begin{bmatrix} X_w \\ Y_w \\ Z_w \\ 1 \end{bmatrix}

where f is the focal length, r is the distortion (skew) factor, S_x and S_y are the pixel sizes, and u_0 and v_0 are the pixel offsets of the principal point; f, r, S_x, S_y, u_0 and v_0 are the camera intrinsics.
Preferably, the transformation relationship between any two coordinate systems among the target coordinate system, the camera coordinate system, the manipulator tool coordinate system and the manipulator base coordinate system is

\begin{bmatrix} X' \\ Y' \\ Z' \\ 1 \end{bmatrix} =
\begin{bmatrix} R_{3\times3} & T_{3\times1} \\ 0^{\mathrm{T}} & 1 \end{bmatrix}
\begin{bmatrix} X \\ Y \\ Z \\ 1 \end{bmatrix}

where R_{3×3} is a rotation matrix and T_{3×1} is a translation matrix.
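As a minimal illustrative sketch (not part of the disclosure), such a transformation can be packed into a 4 x 4 homogeneous matrix so that poses are composed and inverted with ordinary matrix operations; the helper names below are assumptions for the example.

```python
import numpy as np

def make_pose(R, T):
    """Pack a 3x3 rotation R and a 3-vector translation T into a 4x4 homogeneous transform."""
    pose = np.eye(4)
    pose[:3, :3] = R
    pose[:3, 3] = np.asarray(T).reshape(3)
    return pose

def invert_pose(pose):
    """Invert a rigid transform: [R T; 0 1]^-1 = [R^T, -R^T T; 0 1]."""
    R, T = pose[:3, :3], pose[:3, 3]
    inv = np.eye(4)
    inv[:3, :3] = R.T
    inv[:3, 3] = -R.T @ T
    return inv
```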
Preferably, the camera is fixed on the manipulator and moves along with the manipulator.
Preferably, the manipulator moves in an axial mode.
Preferably, the camera is a high frame rate camera.
Preferably, the camera intrinsic includes a focal length of the camera.
The invention further provides a 3D positioning and tracking system based on PNP, which comprises a control device and a six-degree-of-freedom manipulator, wherein a camera is arranged on the six-degree-of-freedom manipulator, and the control device is used for executing the 3D positioning and tracking method based on PNP.
In the technical scheme provided by the embodiment of the invention, in an off-line state the camera is first calibrated to obtain the camera intrinsics required by the PNP positioning model, a standard position relationship between the target coordinate system and the camera coordinate system is taught based on the PNP positioning method, and the conversion relationship between the camera coordinate system and the manipulator tool coordinate system is determined through hand-eye calibration; when the target moves irregularly, the camera photographs the target in real time to obtain a target image; 3D positioning of the target is performed based on the PNP method, and the pose of the target coordinate system in the camera coordinate system at the photographing moment is calculated; with the taught positions of the camera and the target kept unchanged, the pose of the manipulator tool coordinate system in the manipulator base coordinate system, i.e. the position to which the manipulator needs to move, is calculated. Compared with the prior art, the PNP method is used for 3D positioning; the target is purpose-made and can be manufactured with very high precision, so high-precision positioning can be achieved, little data is needed for positioning, and the positioning speed is high.
Drawings
FIG. 1 is a schematic flow chart of a PNP-based 3D positioning and tracking method of the present invention;
FIG. 2 is a schematic diagram of four coordinate systems of the present invention;
FIG. 3 is a flow chart of another embodiment of a PNP-based 3D positioning and tracking method of the present invention;
FIG. 4 is a perspective projection model of pinhole imaging in accordance with the present invention;
FIG. 5 is a schematic diagram of PNP positioning according to the present invention;
FIG. 6 is a schematic diagram of the hand-eye calibration of the present invention.
Detailed Description
The following description of the embodiments of the present invention will be made clearly and completely with reference to the accompanying drawings, in which it is apparent that the embodiments described are only some embodiments of the present invention, but not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to fall within the scope of the invention.
The invention provides a system for tracking a target in a motion state. The system comprises a rotary table and a control device; a six-degree-of-freedom manipulator is mounted on the rotary table and is driven to rotate by it, a camera is mounted on the six-degree-of-freedom manipulator, and the camera is moved by the manipulator so that 3D tracking is performed through the camera. The control device adopts the PNP-based 3D positioning and tracking method.
Referring to fig. 1, the present invention provides a PNP-based 3D positioning and tracking method, which includes the following steps:
Step S10: in an off-line state, first calibrate the camera to obtain the camera intrinsics required by the PNP positioning model, teach a standard position relationship between the target coordinate system and the camera coordinate system based on the PNP positioning method, and then perform hand-eye calibration to determine the conversion relationship between the camera coordinate system and the manipulator tool coordinate system.
The model of the target tracking device includes four coordinate systems, see fig. 2: the manipulator base coordinate system, the manipulator tool coordinate system, the camera coordinate system and the target coordinate system. During target tracking the camera keeps photographing the target, i.e. the target is positioned while it moves irregularly, and the pose the manipulator needs to move to so that the relative position of the camera and the target stays unchanged is then calculated.
In this embodiment, the camera intrinsics required by the PNP positioning model are obtained by calibrating the single camera, and the pose of the target coordinate system in the camera coordinate system, i.e. the conversion relationship between the target coordinate system and the camera coordinate system, is then taught. Specifically, the intrinsic and extrinsic parameters of the monocular camera are obtained, and the pose of the target coordinate system in the camera coordinate system is calculated based on the PNP method from the camera intrinsics and the coordinate system defined on the purpose-made target; this is the taught position required for subsequent real-time tracking. The intrinsics include the pixel focal length and optical-axis offset along the horizontal direction of the pixel array, the pixel focal length and optical-axis offset along the vertical direction of the pixel array, and the radial and tangential distortion parameters of the monocular camera imaging process. The extrinsics include rotation and translation parameters.
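For illustration only, the offline intrinsic calibration can be sketched with OpenCV's standard routines as below; the checkerboard pattern, square size and function name are assumptions for the example, not details fixed by the disclosure. The taught reference pose itself is obtained with the same PnP call sketched under step S30 below.

```python
import cv2
import numpy as np

def calibrate_intrinsics(image_paths, pattern=(9, 6), square_mm=10.0):
    """Single-camera intrinsic calibration from checkerboard photos (illustrative sketch)."""
    board = np.zeros((pattern[0] * pattern[1], 3), np.float32)
    board[:, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2) * square_mm
    obj_pts, img_pts, size = [], [], None
    for path in image_paths:
        gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
        size = gray.shape[::-1]
        found, corners = cv2.findChessboardCorners(gray, pattern)
        if found:
            obj_pts.append(board)
            img_pts.append(corners)
    # K holds f/Sx, f/Sy, u0, v0; dist holds the radial and tangential distortion coefficients
    _, K, dist, _, _ = cv2.calibrateCamera(obj_pts, img_pts, size, None, None)
    return K, dist
```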
The pose of the manipulator tool coordinate system in the manipulator base coordinate system must be acquired from the manipulator at every shot, i.e. at the position and instant of each photograph, because the position of the camera on the manipulator changes in real time. This pose is obtained mainly by having the manipulator program notify the software, or by the software actively querying the manipulator.
During hand-eye calibration, the pose of the target coordinate system in the base coordinate system is fixed and can be regarded as constant; the pose of the camera coordinate system in the tool coordinate system is also fixed, and it is this pose that the calibration must determine.
Step S20: when the target moves irregularly, the camera photographs the target in real time to obtain a target image.
Specifically, the target is purpose-made; the custom target can be manufactured with very high precision, so high-precision positioning can be achieved. The camera is mounted on the manipulator and is used to position the manipulator in real time. The captured image may be a single photograph or an image frame taken from a video stream. In this embodiment a high-frame-rate camera is used, so photographing is faster and stability is higher. A traditional camera usually photographs from a fixed position and has no adaptive rotation capability of its own, whereas in this application the camera is mounted on the manipulator and rotates with it. The manipulator moves in a continuous pass-through mode, without starting, acceleration or deceleration actions during dynamic real-time tracking, which eliminates dynamic pauses and their time cost; the dynamic positioning and tracking frequency can reach more than 30 Hz.
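A minimal sketch of the real-time acquisition loop, assuming an OpenCV-compatible camera; the camera index, requested frame rate and the locate_and_move callback are placeholders, not parts of the disclosure.

```python
import cv2

def run_tracking(locate_and_move, camera_index=0, fps=60):
    """Grab frames from a high-frame-rate camera and hand each one to the localization step."""
    cap = cv2.VideoCapture(camera_index)
    cap.set(cv2.CAP_PROP_FPS, fps)   # request a high frame rate (honoured only if the driver allows)
    try:
        while True:
            ok, frame = cap.read()   # each frame is one "photograph" of the moving target
            if not ok:
                break
            locate_and_move(frame)   # PnP localization + manipulator command (defined elsewhere)
    finally:
        cap.release()
```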
Step S30: perform 3D positioning of the target based on the PNP method, and calculate the pose of the target coordinate system in the camera coordinate system at the photographing moment.
Step S40: with the taught relative position of the camera and the target kept unchanged, calculate the pose of the manipulator tool coordinate system in the manipulator base coordinate system, i.e. the position to which the manipulator needs to move;
in one embodiment of the present invention, please refer to fig. 3, the step S40 specifically includes the following steps:
step S41, calculating the pose of the target coordinate system in the camera coordinate system by a PNP method;
step S42, obtaining the position relation between a camera coordinate system and a tool coordinate system through hand-eye calibration, obtaining the pose of the tool coordinate system in a base coordinate system during photographing, and calculating the pose of a real-time target coordinate system in the manipulator base coordinate system;
and step S43, calculating the pose of the manipulator tool coordinate system in the base coordinate system when the positions of the camera coordinate system and the target coordinate system obtained through teaching remain unchanged.
This embodiment uses a monocular-camera PNP positioning algorithm to calculate the pose of the target coordinate system in the camera coordinate system. Specifically, the target coordinate information, the world coordinate information and the calibration parameters of the camera are input into a preset monocular-camera PNP positioning model, which outputs the pose of the target coordinate system in the camera coordinate system.
Fig. 4 shows the perspective projection model of pinhole imaging: feature points are imaged through the pinhole, exposed and projected onto the camera chip, and converted into a picture through photoelectric conversion. With the intrinsics obtained by the single-camera calibration, the conversion relationship between any point (Xw, Yw, Zw) in space and its pixel coordinates (u, v) is

Z_c \begin{bmatrix} u \\ v \\ 1 \end{bmatrix} =
\begin{bmatrix} f/S_x & r & u_0 & 0 \\ 0 & f/S_y & v_0 & 0 \\ 0 & 0 & 1 & 0 \end{bmatrix}
\begin{bmatrix} R & T \\ 0^{\mathrm{T}} & 1 \end{bmatrix}
\begin{bmatrix} X_w \\ Y_w \\ Z_w \\ 1 \end{bmatrix}

where f is the focal length, r is the distortion (skew) factor, S_x and S_y are the pixel sizes, and u_0 and v_0 are the pixel offsets of the principal point; these are the camera intrinsics and are obtained through camera calibration. A point (X_w, Y_w, Z_w) in space is thus mapped to its pixel coordinates (u, v) by the above conversion.
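The forward projection can be checked numerically with a short sketch; the intrinsic values and the world point below are placeholders for illustration, not calibration results of this embodiment.

```python
import numpy as np

# Illustrative intrinsic matrix: fx = f/Sx, fy = f/Sy, skew r = 0, principal point (u0, v0)
K = np.array([[1200.0,    0.0, 640.0],
              [   0.0, 1200.0, 480.0],
              [   0.0,    0.0,   1.0]])
R = np.eye(3)                       # placeholder rotation of the world frame in the camera frame
t = np.array([0.0, 0.0, 500.0])     # placeholder translation (target about 500 mm in front)

Pw = np.array([10.0, -5.0, 0.0])    # a point (Xw, Yw, Zw) on the target, in world coordinates
Pc = R @ Pw + t                     # camera-frame coordinates
u, v = (K @ Pc)[:2] / Pc[2]         # perspective division yields the pixel coordinates (u, v)
print(u, v)
```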
Fig. 5 shows a schematic diagram of PNP positioning. If the three-dimensional structure of the scene is known, the absolute pose relationship between the camera coordinate system and the world coordinate system describing the scene, i.e. the absolute translation vector t and the rotation matrix R, can be solved from the coordinates of several control points in the three-dimensional scene and their perspective projection coordinates in the image. Such solving methods are collectively referred to as Perspective-n-Point pose solving (the PNP problem). Control points here are points whose three-dimensional space coordinates are known accurately and whose corresponding image-plane coordinates are also known. For perspective projection, at least three sets of control points are required for the PNP problem to have a deterministic solution.
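A hedged sketch of the PnP solve itself, using OpenCV's cv2.solvePnP; the four coplanar control points and their pixel coordinates are invented for the example, and the intrinsics are the placeholder values from the previous sketch.

```python
import cv2
import numpy as np

# Control points on the custom target, in the target coordinate system (mm) -- assumed layout
object_pts = np.array([[ 0.0,  0.0, 0.0],
                       [40.0,  0.0, 0.0],
                       [40.0, 40.0, 0.0],
                       [ 0.0, 40.0, 0.0]])
# Their detected pixel coordinates in the current image -- placeholder values
image_pts = np.array([[612.0, 455.0],
                      [705.0, 452.0],
                      [708.0, 548.0],
                      [615.0, 551.0]])

K = np.array([[1200.0, 0.0, 640.0], [0.0, 1200.0, 480.0], [0.0, 0.0, 1.0]])
dist = np.zeros(5)                      # assume lens distortion has already been corrected

ok, rvec, tvec = cv2.solvePnP(object_pts, image_pts, K, dist)
R_target_in_cam, _ = cv2.Rodrigues(rvec)
# (R_target_in_cam, tvec) is the pose of the target coordinate system in the camera coordinate system
```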
According to the invention, the relation between the camera coordinate system and the manipulator tool coordinate system is calibrated through the hand-eye calibration method, and the calibration is performed in an off-line state.
Fig. 6 shows a schematic diagram of the hand-eye calibration. In this embodiment the camera is fixed on the manipulator, so the camera on the manipulator must be calibrated. As shown in fig. 6, during calibration the target is stationary and the camera is moved to more than 5 poses to photograph it. Four coordinate systems exist in the calibration system: (1) the target coordinate system, (2) the camera coordinate system, (3) the manipulator tool coordinate system and (4) the manipulator base coordinate system, and the transformation relationship between any two of them is

\begin{bmatrix} X' \\ Y' \\ Z' \\ 1 \end{bmatrix} =
\begin{bmatrix} R_{3\times3} & T_{3\times1} \\ 0^{\mathrm{T}} & 1 \end{bmatrix}
\begin{bmatrix} X \\ Y \\ Z \\ 1 \end{bmatrix}

where R_{3×3} is a rotation matrix and T_{3×1} is a translation matrix. This transformation matrix is obtained by the hand-eye calibration method. The rotation-translation matrix is the conversion relationship between two coordinate systems, and the four coordinate systems form a closed loop. The coordinate-system relationships in the closed loop have the following characteristics:
Pose of the target coordinate system in the camera coordinate system: obtained at each shot by the PNP positioning calculation. Pose of the camera coordinate system in the tool coordinate system: fixed, and needs to be calibrated in advance. Pose of the manipulator tool coordinate system in the base coordinate system: known at each shot by querying the manipulator. Pose of the target coordinate system in the manipulator base coordinate system: fixed, and can be regarded as constant.
If the pose of the target coordinate system in the base coordinate system is treated as a constant, the rotation-translation matrix between the manipulator tool coordinate system and the camera coordinate system is the unknown (12 unknowns in total). After 5 shots and positionings, 4 groups of equations can be formed, each group containing 3 equations, giving 12 equations in total for the 12 unknowns, so the rotation-translation matrix can be solved and the calibration is completed.
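Modern OpenCV (4.1 and later) packages this hand-eye solve; the sketch below is an assumption-laden illustration of an eye-in-hand calibration call, not necessarily the solver used in this embodiment.

```python
import cv2

def hand_eye_calibrate(R_tool2base, t_tool2base, R_target2cam, t_target2cam):
    """Eye-in-hand calibration from >= 5 stations:
    R_tool2base / t_tool2base   -- tool pose in the base frame at each shot (from the manipulator)
    R_target2cam / t_target2cam -- target pose in the camera frame at each shot (from PnP)
    Returns the fixed pose of the camera in the tool frame."""
    R_cam2tool, t_cam2tool = cv2.calibrateHandEye(
        R_tool2base, t_tool2base,
        R_target2cam, t_target2cam,
        method=cv2.CALIB_HAND_EYE_TSAI)
    return R_cam2tool, t_cam2tool
```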
The reference position of the camera and the target is taught; this is the standard position, i.e. each time the target is tracked, the standard relative position between the camera and the target must be kept unchanged. This step is completed offline.
Camera positioning: the camera acquires a picture of the target in real time, and the pose of the target coordinate system in the camera coordinate system, i.e. their relative relationship, is calculated by the PNP method. The pose of the real-time target coordinate system in the base coordinate system is then calculated from the position relationship between the camera coordinate system and the manipulator tool coordinate system and from the pose of the manipulator tool coordinate system in the base coordinate system.
With the standard pose between the camera coordinate system and the target coordinate system kept as taught, the pose of the manipulator tool coordinate system in the manipulator base coordinate system, i.e. the pose to which the manipulator needs to move, is calculated (the relative position of the camera coordinate system and the tool coordinate system is fixed, so what must be computed is the position in the base coordinate system to which the tool coordinate system should be moved).
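Under the assumption that all poses are expressed as 4 x 4 homogeneous transforms (see the earlier sketch), this pose chain reduces to three matrix products; the function and argument names below are illustrative, not taken from the disclosure.

```python
import numpy as np

def tool_goal_pose(T_base_tool, T_tool_cam, T_cam_target_now, T_cam_target_ref):
    """T_base_tool      -- tool pose in the base frame at the photographing moment (from the manipulator)
       T_tool_cam       -- camera pose in the tool frame (hand-eye calibration, fixed)
       T_cam_target_now -- target pose in the camera frame from PnP on the current image
       T_cam_target_ref -- taught standard pose of the target in the camera frame
       Returns the tool pose in the base frame that restores the taught camera-target relation."""
    T_base_target = T_base_tool @ T_tool_cam @ T_cam_target_now   # real-time target pose in the base frame
    return T_base_target @ np.linalg.inv(T_cam_target_ref) @ np.linalg.inv(T_tool_cam)
```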
Step S50: control the manipulator to move according to the pose.
In an off-line state, the camera intrinsics required by the PNP positioning model are obtained through camera calibration, the standard position relationship between the target coordinate system and the camera coordinate system is taught based on the PNP positioning method, and the conversion relationship between the camera coordinate system and the manipulator tool coordinate system is determined through hand-eye calibration. When the target moves irregularly, the camera photographs the target in real time to obtain a target image; 3D positioning of the target is performed based on the PNP method, and the pose of the target coordinate system in the camera coordinate system at the photographing moment is calculated; with the taught positions of the camera and the target kept unchanged, the pose of the manipulator tool coordinate system in the manipulator base coordinate system, i.e. the position to which the manipulator needs to move, is calculated; and the manipulator is controlled to move according to this pose. Compared with the prior art, the PNP method is used for 3D positioning; the target is purpose-made and can be manufactured with very high precision, so high-precision positioning can be achieved, little data is needed for positioning, and the positioning speed is high.
The hardware of the invention can be selected flexibly: imaging hardware of different precision and fields of view, together with a correspondingly customized target, can be chosen according to the precision and speed requirements. The invention improves target tracking speed and precision, has a wide tracking range, and can perform positioning and tracking in real time in cooperation with the manipulator motion control.
The above embodiments are only for illustrating the technical solution of the present invention, and are not limiting; although the invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical scheme described in the foregoing embodiments can be modified or some technical features thereof can be replaced by equivalents; such modifications and substitutions do not depart from the spirit and scope of the technical solutions of the embodiments of the present invention.

Claims (9)

1. A PNP-based 3D location tracking method, the method comprising:
in an off-line state, firstly calibrating a camera to obtain camera internal parameters required by a PNP positioning model, teaching a standard position relationship between a target coordinate system and the camera coordinate system based on the PNP positioning method, and then performing hand-eye calibration to determine a conversion relationship between the camera coordinate system and a manipulator tool coordinate system;
when the target does irregular motion, the camera shoots the target in real time to obtain a target image;
performing target 3D positioning based on a PNP method, and calculating the pose of a target coordinate system in a camera coordinate system at the photographing moment;
calculating the pose of the manipulator tool coordinate system in the manipulator base coordinate system when the teaching positions of the camera and the target are kept unchanged, namely the position to which the manipulator needs to move;
and controlling the manipulator to move according to the pose.
2. The PNP-based 3D positioning tracking method according to claim 1, wherein the step of calculating, with the taught positions of the camera and the target kept unchanged, the pose of the manipulator tool coordinate system in the manipulator base coordinate system, that is, the position to which the manipulator needs to move, comprises:
calculating the pose of the target coordinate system in the camera coordinate system by a PNP method;
calculating the pose of the real-time target coordinate system in the manipulator base coordinate system through the position relation between the camera coordinate system and the manipulator tool coordinate system and the pose of the manipulator tool coordinate system in the manipulator base coordinate system;
and when the positions of the camera coordinate system and the manipulator tool coordinate system are kept unchanged, calculating the pose of the manipulator tool coordinate system in the manipulator base coordinate system.
3. The PNP-based 3D location tracking method of claim 1, wherein said camera calibration is a single-camera calibration of the camera intrinsics, a monocular camera PNP location model is built, and the conversion relationship between any point (Xw, Yw, Zw) in space and pixel coordinates (u, v) is

Z_c \begin{bmatrix} u \\ v \\ 1 \end{bmatrix} =
\begin{bmatrix} f/S_x & r & u_0 & 0 \\ 0 & f/S_y & v_0 & 0 \\ 0 & 0 & 1 & 0 \end{bmatrix}
\begin{bmatrix} R & T \\ 0^{\mathrm{T}} & 1 \end{bmatrix}
\begin{bmatrix} X_w \\ Y_w \\ Z_w \\ 1 \end{bmatrix}

where f is the focal length, r is the distortion factor, S_x and S_y are the pixel sizes, and u_0 and v_0 are the pixel offsets; these are the camera intrinsics.
4. The PNP-based 3D positioning tracking method of claim 2, wherein said target coordinate system, camera coordinate system, manipulator tool coordinate system, and manipulator base coordinate system are in a conversion relationship of
\begin{bmatrix} X' \\ Y' \\ Z' \\ 1 \end{bmatrix} =
\begin{bmatrix} R_{3\times3} & T_{3\times1} \\ 0^{\mathrm{T}} & 1 \end{bmatrix}
\begin{bmatrix} X \\ Y \\ Z \\ 1 \end{bmatrix}

where R_{3×3} is a rotation matrix and T_{3×1} is a translation matrix.
5. The PNP based 3D positioning tracking method of claim 1, wherein said camera is fixed to a robot arm, following the robot arm movement.
6. The PNP-based 3D positioning tracking method of claim 1, wherein said robot arm moves in an axial mode.
7. The PNP based 3D location tracking method of claim 1, wherein said camera is a high frame rate camera.
8. The PNP based 3D positioning tracking method of claim 1, wherein said camera intrinsic comprises a focal length of a camera.
9. A PNP-based 3D positioning and tracking system, characterized in that the system comprises a control device and a six degree of freedom manipulator, on which a camera is arranged, the control device being adapted to execute the PNP-based 3D positioning and tracking method according to any of claims 1-8.
CN202310162817.8A 2023-02-24 2023-02-24 PNP-based 3D positioning tracking method and system Pending CN116423495A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310162817.8A CN116423495A (en) 2023-02-24 2023-02-24 PNP-based 3D positioning tracking method and system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310162817.8A CN116423495A (en) 2023-02-24 2023-02-24 PNP-based 3D positioning tracking method and system

Publications (1)

Publication Number Publication Date
CN116423495A true CN116423495A (en) 2023-07-14

Family

ID=87086250

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310162817.8A Pending CN116423495A (en) 2023-02-24 2023-02-24 PNP-based 3D positioning tracking method and system

Country Status (1)

Country Link
CN (1) CN116423495A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN118485729A (en) * 2024-05-24 2024-08-13 北京新航地拓科技有限公司 Position locating method, device, equipment, medium and product based on monocular camera



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination