
CN113506344A - High-precision three-dimensional positioning device and method for nuclear radiation environment robot - Google Patents

High-precision three-dimensional positioning device and method for nuclear radiation environment robot

Info

Publication number
CN113506344A
Authority
CN
China
Prior art keywords
dimensional
structured light
positioning
precision
robot
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110768801.2A
Other languages
Chinese (zh)
Inventor
徐锋
李瑾
张文凯
陈妍洁
陈国栋
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Southwest University of Science and Technology
Original Assignee
Southwest University of Science and Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Southwest University of Science and Technology filed Critical Southwest University of Science and Technology
Priority to CN202110768801.2A
Publication of CN113506344A
Legal status: Pending

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/70Determining position or orientation of objects or cameras
    • G06T7/73Determining position or orientation of objects or cameras using feature-based methods
    • G06T7/74Determining position or orientation of objects or cameras using feature-based methods involving reference images or patches
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J18/00Arms
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J19/00Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
    • B25J19/02Sensing devices
    • B25J19/021Optical sensing devices
    • B25J19/023Optical sensing devices including video camera means
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/10Programme-controlled manipulators characterised by positioning means for manipulator elements
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1656Programme controls characterised by programming, planning systems for manipulators
    • B25J9/1664Programme controls characterised by programming, planning systems for manipulators characterised by motion, path, trajectory planning
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1694Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
    • B25J9/1697Vision controlled systems
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/02Systems using the reflection of electromagnetic waves other than radio waves
    • G01S17/06Systems determining position data of a target
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/22Matching criteria, e.g. proximity measures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T17/00Three dimensional [3D] modelling, e.g. data description of 3D objects
    • G06T17/05Geographic models
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T17/00Three dimensional [3D] modelling, e.g. data description of 3D objects
    • G06T17/20Finite element generation, e.g. wire-frame surface description, tesselation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/80Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G06T7/85Stereo camera calibration
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10004Still image; Photographic image

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Mechanical Engineering (AREA)
  • Robotics (AREA)
  • Geometry (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Software Systems (AREA)
  • Computer Graphics (AREA)
  • Electromagnetism (AREA)
  • Data Mining & Analysis (AREA)
  • Remote Sensing (AREA)
  • Artificial Intelligence (AREA)
  • Evolutionary Biology (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • General Engineering & Computer Science (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Multimedia (AREA)
  • Evolutionary Computation (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract



The invention discloses a high-precision three-dimensional positioning device and method for a robot in a nuclear radiation environment, relating to the technical field of three-dimensional reconstruction and comprising the following steps: S1, a mobile robot obtains a three-dimensional environment point cloud map; S2, based on the point cloud map information, the robot completes the preliminary positioning of the work target; S3, the mobile robot drives the mechanical arm so that the work target enters the positioning range of the robot's structured light camera; S4, the mobile robot calculates the three-dimensional data information within the positioning range of the structured light camera; S5, high-precision three-dimensional positioning of the work target is performed according to that three-dimensional data information. By combining preliminary positioning with high-precision three-dimensional positioning, the millimeter-level spatial position of the target is obtained.


Description

High-precision three-dimensional positioning device and method for nuclear radiation environment robot
Technical Field
The invention relates to the technical field of three-dimensional reconstruction, in particular to a high-precision three-dimensional positioning device and method for a nuclear radiation environment robot.
Background
Although nuclear power plants are designed to prevent and mitigate possible accidents, serious accidents may still occur due to equipment aging, misoperation, and similar causes. Nuclear power plant equipment therefore requires strict inspection and maintenance: aged or faulty equipment must be discovered and replaced in time, and decommissioned nuclear devices must be disassembled, transported, sorted, and shipped. Because of radiation and other hazards, people cannot enter the target site directly for long periods of work. Traditionally, workers wearing thick, heavy protective clothing perform these operations in shifts, a mode that is not only inefficient but also highly risky for the workers. With technological progress, remotely controlled robots carrying sensors on a mechanical arm have come into wide use: a camera collects environment images during operation, and workers teleoperate the robot and the arm. However, the arm's motion control carries a certain error and the transmission of remote images a certain delay, so precise operations on the target cannot be completed reliably.
With the development of sensor technology, many sensors are available on the market for acquiring the spatial position of a target. Ultrasonic ranging is simple in principle and convenient to operate, but the sound waves interfere with one another in a closed space and its precision is low. Millimeter-wave radar offers a long measuring distance and high ranging precision, but it is easily disturbed by electromagnetic waves and the products are relatively expensive. 3D lidar measurement is widely used for three-dimensional sensing and positioning, but its accuracy is only at the centimeter level. Structured-light vision ranging can reach millimeter-level precision after accurate calibration with a high-resolution industrial camera, but because the lens focal length is fixed during measurement, a clear image can be obtained only within a certain working distance. A nuclear power station is a scene with complex radiation, a large spatial range, and few distinctive features; to carry out high-precision operations on a target there, no single sensor suffices, so a new technical scheme is needed.
Disclosure of Invention
The invention aims to overcome the defects of the prior art and provides a high-precision three-dimensional positioning device and method for a nuclear radiation environment robot.
The purpose of the invention is realized by the following technical scheme:
a high-precision three-dimensional positioning method for a nuclear radiation environment robot comprises the following steps:
S1, the mobile robot obtains the three-dimensional environment point cloud map of the work target, and step S2 is executed;
S2, based on the three-dimensional environment point cloud map information, the robot completes the preliminary positioning of the work target, and step S3 is executed;
S3, the mobile robot drives the mechanical arm to move so that the work target enters the positioning range of the robot's structured light camera, and step S4 is executed;
S4, the mobile robot calculates the three-dimensional data information within the positioning range of the structured light camera, and step S5 is executed;
S5, high-precision three-dimensional positioning of the work target is performed according to the three-dimensional data information within the positioning range of the structured light camera.
Further, in step S1, the three-dimensional environment point cloud map is obtained by performing point cloud fusion through the movement of the mobile robot in combination with the laser radar scanning.
Further, in step S2, the preliminary positioning is performed by combining the movement of the robot and the mechanical arm with the laser radar scanning.
Further, in step S2, the preliminary positioning includes point cloud acquisition, point cloud data preprocessing, similarity measurement, and repositioning.
Further, in step S3, the mobile robot drives the mechanical arm based on the relative pose relationship between the laser radar and the structured light camera, which is obtained by calibrating the two sensors; the path of the mechanical arm is then planned according to the pose relationship between the work target and the laser radar.
Further, in step S4, the three-dimensional data information within the positioning range of the structured light camera is obtained by a structured light decoding method: the structured light camera projects structured light, collects the structured light pattern encoded by the work object, decodes the collected encoded pattern, and obtains from the decoded information the three-dimensional data of the work object within the camera's positioning range.
Further, in step S5, the high-precision three-dimensional positioning comprises the following: first, the structured light camera and the end of the mechanical arm are calibrated to obtain their relative pose relationship; then a positioning point is selected according to the three-dimensional data of the work target within the structured light camera's measurement range; the path of the mechanical arm is then planned according to the calibrated pose relationship; finally, the mechanical arm is driven so that its end reaches the positioning point.
A high-precision three-dimensional positioning device for a nuclear radiation environment robot comprises a mobile robot, a laser radar, and a structured light camera. The mobile robot comprises a mobile platform and a mechanical arm; the mechanical arm is fixedly connected to the mobile platform, the laser radar is fixed at the end of the mechanical arm that joins the mobile platform, and the structured light camera is fixed at the other end of the arm. The mobile platform may be wheeled or rail-mounted according to the actual scene. The laser radar is a 3D laser radar located on the mechanical arm; its exact position can be chosen according to the actual nuclear radiation scene, and once fixed it is calibrated against the structured light camera. The projection light source of the structured light camera may be white light, LED light, laser, or infrared light; the projection mode may be point-, line-, or area-structured projection; and the acquisition mode may be monocular, binocular, or multi-view.
The invention has the beneficial effects that:
The working scene is accurately perceived through the preliminary positioning of the 3D laser radar, the three-dimensional information of the target is then obtained more accurately by the structured-light vision method, and the combination of preliminary positioning and high-precision three-dimensional positioning yields the millimeter-level spatial position of the target object.
Drawings
FIG. 1 is a flow chart of a three-dimensional positioning method of a nuclear radiation environment robot according to the present invention;
FIG. 2 is a schematic structural diagram of a three-dimensional positioning device of a nuclear radiation environment robot according to the present invention;
FIG. 3 is a schematic view of the calibration of a binocular camera according to the present invention;
FIG. 4 is a schematic view of the hand-eye calibration of the present invention;
FIG. 5 is a flow chart of the 3D lidar preliminary positioning of the present invention;
FIG. 6 is a flow chart of the precise positioning of the binocular structured light vision system of the present invention.
Reference numerals: 1 - mobile platform; 2 - mechanical arm; 3 - laser radar; 4 - structured light camera.
Detailed Description
The technical solutions in the embodiments of the present invention are described clearly and completely below with reference to FIGS. 1 to 6. The described embodiments are only a part of the embodiments of the present invention, not all of them; based on these embodiments, all other embodiments obtained by a person skilled in the art without creative effort fall within the protection scope of the present invention.
In the description of the present invention, it is to be understood that the terms "counterclockwise", "clockwise", "longitudinal", "lateral", "up", "down", "front", "back", "left", "right", "vertical", "horizontal", "top", "bottom", "inner", "outer", etc., indicate orientations or positional relationships based on those shown in the drawings, are used only for convenience of description, and do not indicate or imply that the referenced devices or elements must have a particular orientation or be constructed and operated in a particular orientation; they are therefore not to be considered limiting.
The binocular structured light system is preferred in the present invention only for convenience of description, and does not indicate or imply that the invention can use only a binocular structured light system.
A high-precision three-dimensional positioning method for a nuclear radiation environment robot comprises the following steps:
S1, the mobile robot obtains the three-dimensional environment point cloud map of the work target, and step S2 is executed;
S2, based on the three-dimensional environment point cloud map information, the robot completes the preliminary positioning of the work target, and step S3 is executed;
S3, the mobile robot drives the mechanical arm to move so that the work target enters the positioning range of the robot's structured light camera, and step S4 is executed;
S4, the mobile robot calculates the three-dimensional data information within the positioning range of the structured light camera, and step S5 is executed;
S5, high-precision three-dimensional positioning of the work target is performed according to the three-dimensional data information within the positioning range of the structured light camera.
The working principle of the scheme is briefly described as follows:
In the invention, the 3D laser radar emits laser beams to achieve initial positioning within the working scene; a structured light camera is mounted at the end of the mechanical arm, and the structured-light camera system precisely positions the target using a structured-light vision measurement algorithm, thereby guiding the mechanical arm through the specified operation;
According to FIG. 1, the calibration of the system is carried out first; next, the path of the mobile robot is planned in combination with a given global three-dimensional map; the mobile robot then brings the 3D laser radar into the operation scene for preliminary positioning; precise positioning is then performed by combining the target position with the structured light camera to obtain the three-dimensional coordinates of the work target in the structured-light camera coordinate system; the position of the target relative to the end of the mechanical arm is then computed from the extrinsic parameters calibrated between the camera and the arm end, a path is planned, and finally the mechanical arm is controlled to carry out the specified operation. Three key technologies are involved: system calibration, which determines the pose relationships among the structured light camera, the 3D laser radar, and the robot; preliminary sensing and positioning based on the 3D laser radar; and precise sensing and positioning based on structured-light vision measurement. The three are described below in turn:
the system calibration mainly completes the determination of the pose relationship among the 3D laser radar, the structured light camera and the mobile robot, and is used for conveying the tail end of the mechanical arm to a position needing to be operated through the motion of the robot after identifying an operation target. Mainly comprises the calibration of a structured light camera; calibrating between the mechanical arm and the structured light camera and calibrating between the laser radar and the structured light camera;
in the method, images are collected by adopting a binocular camera, the two cameras simultaneously shoot calibration plates at different positions in space, then the images are calibrated according to the shot sequences, and the two cameras respectively carry out monocular camera calibration to obtain internal parameters and external parameters of the cameras. The extrinsic parameters here refer to the rotational-translational relationship between the camera coordinate system and the world coordinate system established in each step calibration. And then unifying the pose relations of the two cameras to a camera coordinate (generally to a left camera coordinate system) by combining the polar constraint, the consistency constraint and the like in the calibration plate images and the binocular vision in the left camera and the right camera. The calculated binocular calibration parameters need to be further optimized to obtain more accurate calibration parameters. Considering that the calibration plate has non-negligible geometric errors in the manufacturing process, the calibration parameters need to be optimized for the second time after the binocular camera is calibrated (the first optimization is the optimization for solving the distortion parameters of the camera);
before the mechanical arm performs target designation operation, the position of a target object relative to the tail end of the mechanical arm needs to be obtained, and the pose relationship between a target and a binocular camera is determined through a binocular vision measurement system, so that the relative pose between the mechanical arm (hand) and the binocular camera (eye) needs to be calibrated, and the coordinates of the mechanical arm (hand) and the binocular camera (eye) are unified under the same world coordinate system;
the hand-eye calibration is to obtain a conversion relationship between a coordinate system of a camera mounted on the robot arm and a coordinate system of the robot arm base so that the robot arm can use information acquired by the camera. For ease of operation, the coordinate system is normalized to the robot arm base point. The conversion matrix between the coordinate systems of the two tools before and after the mechanical arm moves can be calculated through the parameters of the mechanical arm sub-band. In order to solve the rotation and translation matrix, the mechanical arm needs to be moved for multiple times in the experiment, and the position coordinates of three mechanical arm gripping tools are obtained, so that multiple groups of equations for solving the hand-eye relationship are obtained. And (4) carrying out simultaneous solution on the equations to obtain the parameters of the hand-eye calibration. The target coordinates of the grabbing position can be converted to the mechanical arm base coordinates through parameters calibrated by hands and eyes, so that the mechanical arm can operate the target;
after the 3D laser radar-based initial positioning is completed, the pose relationship between the laser radar and the operation target is determined, in the process from the initial positioning to the accurate positioning, the 3D laser radar and the binocular camera are needed to be calibrated to acquire the pose relationship between the laser radar and the camera, path planning is further carried out through the pose relationship between the laser radar and the camera, and the binocular structure light vision measuring system is moved to the optimal position for measuring the operation target. The calibration process comprises the following steps: a calibration plate is placed in front of a binocular camera and images are collected, meanwhile, the 3D laser radar scans the direction of the calibration plate, point clouds of all poses and images of corresponding cameras are intercepted, the point clouds on the calibration plate are selected from the point clouds, and the poses of the calibration plate in laser radar detection data and a plane normal vector matrix are estimated. And calculating pixel coordinates of the inner corner points in the calibration picture by using methods such as corner point detection and the like, and calculating the position and the posture of the calibration plate in a camera coordinate system and a direction matrix of a normal vector of the calibration plate according to the coordinates of the corresponding corner points. Calculating a rotation matrix through normal vector matrixes in two coordinate systems, and optimizing a translation vector through minimizing the distance from the point cloud to a plane to finish external parameter calibration of the laser radar and the camera;
3D laser radar positioning process: first, the working scene is scanned with the 3D laser radar to obtain point cloud data. Second, the scanned point cloud is preprocessed: unreasonable outliers are filtered out to keep the subsequent algorithms accurate, and the amount of point cloud data is reduced, lowering the computational load and improving efficiency. Preprocessing consists mainly of three parts: outlier removal, removal of points on the mechanical arm, and point cloud downsampling. Similarity measurement is then performed against the provided global three-dimensional map: geometric correspondences between the two point sets are established, the two candidate sets to be compared are constructed according to the number of elements in each set, the histograms of the sets are compared by cosine similarity, and the candidate scene with the highest similarity is selected, at which point the scene match is considered complete. Finally, relocation of the laser radar is achieved with algorithms such as 3D-NDT. Once matching is complete, the initial position of the laser radar on the global three-dimensional map is obtained, giving its coordinates on the map; combined with the three-dimensional coordinates of the work target in the global map, this guides the motion planning of the robot. When a given global three-dimensional map is used, the corresponding initial scanning position, movement track, point-cloud index information, and the like should be provided along with the map, to increase the success of relocation and reduce the amount of computation;
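A minimal Open3D sketch of the preprocessing and the cosine-similarity scene matching; the patent does not fix a particular scene descriptor, so a plain height histogram stands in here as an illustrative choice:

```python
import numpy as np
import open3d as o3d

def preprocess(pcd, voxel=0.05):
    # Outlier removal followed by voxel downsampling, as in the
    # preprocessing step (arm points would be cropped out separately).
    pcd, _ = pcd.remove_statistical_outlier(nb_neighbors=20, std_ratio=2.0)
    return pcd.voxel_down_sample(voxel_size=voxel)

def descriptor(pcd, bins=32, z_range=(-2.0, 4.0)):
    # Hypothetical per-scene descriptor: a unit-normalized histogram of
    # point heights.
    z = np.asarray(pcd.points)[:, 2]
    h, _ = np.histogram(z, bins=bins, range=z_range)
    h = h.astype(float)
    return h / (np.linalg.norm(h) + 1e-12)

def best_match(scan_desc, candidate_descs):
    # Cosine similarity against each candidate sub-map; the highest score
    # is taken as the scene match that seeds 3D-NDT relocation.
    sims = [float(scan_desc @ d) for d in candidate_descs]
    return int(np.argmax(sims)), max(sims)
```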
accurate positioning process of a binocular structured light vision system: the binocular structured light camera system is composed of two cameras and a projector, and the projector is placed between the two cameras and used for projecting structured light grating images. Firstly, projecting a structured light image with a certain code on a working target by a projector fixed on a mechanical arm, and then acquiring a projection image of a target object by a binocular camera; then, expanding the phase of the captured projection image according to a decoding algorithm corresponding to the structured light code to obtain a continuous phase diagram of the operation target; then obtaining a depth map of the working target through a binocular stereo matching algorithm, calculating the distance, the relative position and the direction between the position of the working target and a binocular camera based on the obtained depth information and by combining internal and external parameters calibrated by the two eyes, and performing three-dimensional reconstruction of the target; and finally, estimating and positioning the pose of the operation target. Since the work object is approximately regular in shape, the spatial location for the object may be equated by the mass points of the target point cloud. And performing mass point calculation on the target point cloud obtained according to the structured light algorithm to obtain mass point coordinates and normal information, and then combining with a rotational translation matrix obtained by calibrating hands and eyes to transform the mass point coordinates into a coordinate system where the mechanical arm is located, so as to control the mechanical arm to operate the target.
Further, in step S1, the three-dimensional environment point cloud map is obtained by performing point cloud fusion through the movement of the mobile robot in combination with the laser radar scanning.
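A minimal sketch of such point cloud fusion, assuming each scan arrives with the robot pose (4x4, world-from-lidar) at which it was captured:

```python
import open3d as o3d

def fuse_scans(scans, poses, voxel=0.05):
    # Transform every lidar scan by its capture pose, accumulate, and
    # voxel-downsample into one environment point cloud map.
    fused = o3d.geometry.PointCloud()
    for pcd, T in zip(scans, poses):
        fused += o3d.geometry.PointCloud(pcd).transform(T)
    return fused.voxel_down_sample(voxel_size=voxel)
```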
Further, in step S2, the preliminary positioning is performed by combining the movement of the robot and the mechanical arm with the laser radar scanning.
Further, in step S2, the preliminary positioning includes point cloud acquisition, point cloud data preprocessing, similarity measurement, and repositioning.
Further, in step S3, the mobile robot drives the mechanical arm based on the relative pose relationship between the laser radar and the structured light camera, which is obtained by calibrating the two sensors; the path of the mechanical arm is then planned according to the pose relationship between the work target and the laser radar.
Further, in step S4, the three-dimensional data information within the positioning range of the structured light camera is obtained by a structured light decoding method: the structured light camera projects structured light, collects the structured light pattern encoded by the work object, decodes the collected encoded pattern, and obtains from the decoded information the three-dimensional data of the work object within the camera's positioning range.
Further, in step S5, the high-precision three-dimensional positioning comprises the following: first, the structured light camera and the end of the mechanical arm are calibrated to obtain their relative pose relationship; then a positioning point is selected according to the three-dimensional data of the work target within the structured light camera's measurement range; the path of the mechanical arm is then planned according to the calibrated pose relationship; finally, the mechanical arm is driven so that its end reaches the positioning point.
A high-precision three-dimensional positioning device for a nuclear radiation environment robot comprises a mobile robot, a laser radar 3, and a structured light camera 4. The mobile robot comprises a mobile platform 1 and a mechanical arm 2; the mechanical arm 2 is fixedly connected to the mobile platform 1, the laser radar 3 is fixed at the end of the mechanical arm 2 that joins the mobile platform 1, and the structured light camera 4 is fixed at the other end of the mechanical arm 2. The mobile platform 1 may be wheeled or rail-mounted according to the actual scene. The laser radar 3 is a 3D laser radar located on the mechanical arm 2; its exact position can be chosen according to the actual nuclear radiation scene, and once fixed it is calibrated against the structured light camera 4. The projection light source of the structured light camera 4 may be white light, LED light, laser, or infrared light; the projection mode may be point-, line-, or area-structured projection; and the acquisition mode may be monocular, binocular, or multi-view.
The foregoing is merely a preferred embodiment of the invention; it should be understood that the described embodiments are a part of the invention, not all of it. All other embodiments obtained by a person skilled in the art from the embodiments given herein without creative effort fall within the protection scope of the present invention. The invention is not limited to the forms disclosed herein, but is to be accorded the widest scope consistent with the principles and novel features disclosed herein; modifications and variations may be effected by those skilled in the art without departing from the spirit and scope of the invention as defined by the appended claims.

Claims (8)

1. A high-precision three-dimensional positioning method for a nuclear radiation environment robot, characterized by comprising the following steps:
S1, the mobile robot obtains a three-dimensional environment point cloud map of the work target, and step S2 is executed;
S2, based on the three-dimensional environment point cloud map information, the robot completes the preliminary positioning of the work target, and step S3 is executed;
S3, the mobile robot drives the mechanical arm to move so that the work target enters the positioning range of the robot's structured light camera, and step S4 is executed;
S4, the mobile robot calculates the three-dimensional data information within the positioning range of the structured light camera, and step S5 is executed;
S5, high-precision three-dimensional positioning of the work target is performed according to the three-dimensional data information within the positioning range of the structured light camera.

2. The high-precision three-dimensional positioning method for a nuclear radiation environment robot according to claim 1, characterized in that in step S1, the three-dimensional environment point cloud map is obtained by point cloud fusion through the movement of the mobile robot combined with laser radar scanning.

3. The high-precision three-dimensional positioning method for a nuclear radiation environment robot according to claim 1, characterized in that in step S2, the preliminary positioning is completed by the movement of the robot and the mechanical arm combined with laser radar scanning.

4. The high-precision three-dimensional positioning method for a nuclear radiation environment robot according to claim 2, characterized in that in step S2, the preliminary positioning comprises point cloud acquisition, point cloud data preprocessing, similarity measurement, and relocation.

5. The high-precision three-dimensional positioning method for a nuclear radiation environment robot according to claim 1, characterized in that in step S3, the mobile robot drives the mechanical arm based on the relative pose relationship between the laser radar and the structured light camera.

6. The high-precision three-dimensional positioning method for a nuclear radiation environment robot according to claim 1, characterized in that in step S4, the three-dimensional data information within the positioning range of the structured light camera is obtained by a structured light decoding method.

7. The high-precision three-dimensional positioning method for a nuclear radiation environment robot according to claim 1, characterized in that in step S5, the high-precision three-dimensional positioning comprises: the relative pose relationship between the structured light camera and the end of the mechanical arm, and the travel path of the mechanical arm.

8. A high-precision three-dimensional positioning device for a nuclear radiation environment robot, characterized by comprising a mobile robot, a laser radar (3), and a structured light camera (4), wherein the mobile robot comprises a mobile platform (1) and a mechanical arm (2), the mechanical arm (2) is fixedly connected to the mobile platform (1), the laser radar (3) is fixed at the end of the mechanical arm (2) connected to the mobile platform (1), and the structured light camera (4) is fixed at the other end of the mechanical arm (2).
CN202110768801.2A 2021-07-07 2021-07-07 High-precision three-dimensional positioning device and method for nuclear radiation environment robot Pending CN113506344A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110768801.2A CN113506344A (en) 2021-07-07 2021-07-07 High-precision three-dimensional positioning device and method for nuclear radiation environment robot

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110768801.2A CN113506344A (en) 2021-07-07 2021-07-07 High-precision three-dimensional positioning device and method for nuclear radiation environment robot

Publications (1)

Publication Number Publication Date
CN113506344A 2021-10-15

Family

ID=78011574

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110768801.2A Pending CN113506344A (en) 2021-07-07 2021-07-07 High-precision three-dimensional positioning device and method for nuclear radiation environment robot

Country Status (1)

Country Link
CN (1) CN113506344A (en)

Patent Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160016312A1 (en) * 2013-03-15 2016-01-21 Carnegie Mellon University A Supervised Autonomous Robotic System for Complex Surface Inspection and Processing
CN104460671A (en) * 2014-11-12 2015-03-25 西南科技大学 Cross positioning method and system for radioactive source in three-dimensional space
WO2018103694A1 (en) * 2016-12-07 2018-06-14 苏州笛卡测试技术有限公司 Robotic three-dimensional scanning device and method
CN107091643A (en) * 2017-06-07 2017-08-25 旗瀚科技有限公司 A kind of indoor navigation method based on many 3D structure lights camera splicings
WO2020155616A1 (en) * 2019-01-29 2020-08-06 浙江省北大信息技术高等研究院 Digital retina-based photographing device positioning method
CN110497373A (en) * 2019-08-07 2019-11-26 大连理工大学 A joint calibration method between 3D lidar and manipulator of mobile robot
CN110599546A (en) * 2019-08-28 2019-12-20 贝壳技术有限公司 Method, system, device and storage medium for acquiring three-dimensional space data
CN110842940A (en) * 2019-11-19 2020-02-28 广东博智林机器人有限公司 Building surveying robot multi-sensor fusion three-dimensional modeling method and system
CN111123911A (en) * 2019-11-22 2020-05-08 北京空间飞行器总体设计部 A sensing system of a legged intelligent star catalogue detection robot and its working method
CN111459166A (en) * 2020-04-22 2020-07-28 北京工业大学 A method for constructing a scenario map with location information of trapped persons in a post-disaster rescue environment
CN112132894A (en) * 2020-09-08 2020-12-25 大连理工大学 A real-time tracking method of robotic arm based on binocular vision guidance
CN112650255A (en) * 2020-12-29 2021-04-13 杭州电子科技大学 Robot indoor and outdoor positioning navigation system method based on vision and laser radar information fusion

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
张伟伟 et al., "Localization and mapping method fusing laser and visual point cloud information", Computer Applications and Software, no. 07, pp. 114-119 *
王刚 et al., "Reconstruction and localization of radioactive regions fusing Kinect and γ-camera images", Journal of Applied Optics, no. 05, pp. 965-972 *
韩明瑞 et al., "Lidar-based three-dimensional localization and mapping for outdoor mobile robots", Journal of Huazhong University of Science and Technology (Natural Science Edition), no. 1, pp. 315-318 *

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114693762A (en) * 2022-04-15 2022-07-01 西南科技大学 A nuclear radiation scene space and radiation information three-dimensional fusion reconstruction device and method
FR3150027A1 (en) * 2023-06-19 2024-12-20 Endel Sra Method and device for checking the conformity of an anchoring device of equipment in a nuclear power plant
CN116690582A (en) * 2023-07-21 2023-09-05 山东新一代信息产业技术研究院有限公司 Automatic server equipment ageing line loading device and method
CN119237339A (en) * 2024-12-05 2025-01-03 西南科技大学 A radioactive waste classification method based on multi-robot collaboration

Similar Documents

Publication Publication Date Title
CN110728715B (en) A method for self-adaptive adjustment of the camera angle of an intelligent inspection robot
CN113506344A (en) High-precision three-dimensional positioning device and method for nuclear radiation environment robot
CN106056587B (en) Full view line laser structured light three-dimensional imaging caliberating device and method
JP6323993B2 (en) Information processing apparatus, information processing method, and computer program
CN109927036A (en) A kind of method and system of 3D vision guidance manipulator crawl
CN104315995B (en) TOF depth camera three-dimensional coordinate calibration device and method based on virtual multi-cube standard target
CN106338245A (en) Non-contact movement measuring method for workpiece
CN106041937A (en) Control method of manipulator grabbing control system based on binocular stereoscopic vision
CN101504275A (en) Hand-hold line laser three-dimensional measuring system based on spacing wireless location
CN105574812B (en) Multi-angle three-dimensional data method for registering and device
CN110017852B (en) Navigation positioning error measuring method
WO2006120759A1 (en) 3-dimensional shape measuring method and device thereof
JP2014169990A (en) Position/posture measuring apparatus and method
CN112257536B (en) Space and object three-dimensional information acquisition and matching equipment and method
JPWO2018043524A1 (en) Robot system, robot system control apparatus, and robot system control method
CN108180834A (en) A kind of industrial robot is the same as three-dimensional imaging instrument position orientation relation scene real-time calibration method
CN115170625A (en) Three-dimensional model reconstruction method based on curved surface segmentation and ICP (inductively coupled plasma) registration algorithm
CN114140534A (en) Combined calibration method for laser radar and camera
CN111780715A (en) Visual ranging method
CN109900251A (en) A kind of robotic positioning device and method of view-based access control model technology
CN112001945A (en) Multi-robot monitoring method suitable for production line operation
CN112253913A (en) Intelligent visual 3D information acquisition equipment deviating from rotation center
CN115019167B (en) Fusion positioning method, system, equipment and storage medium based on mobile terminal
CN112304250B (en) Three-dimensional matching equipment and method between moving objects
CN112257535B (en) Three-dimensional matching equipment and method for avoiding object

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20211015