
CN113253735B - Method, device, robot and computer readable storage medium for following target - Google Patents


Info

Publication number
CN113253735B
CN113253735B
Authority
CN
China
Prior art keywords
target
coordinate system
laser radar
coordinate
coordinates
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110658055.1A
Other languages
Chinese (zh)
Other versions
CN113253735A (en)
Inventor
崔锦
王虹
彭志
张宇哲
黄玲
杜珊珊
张可欣
乔光辉
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nuctech Co Ltd
Original Assignee
Nuctech Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nuctech Co Ltd filed Critical Nuctech Co Ltd
Priority to CN202110658055.1A priority Critical patent/CN113253735B/en
Publication of CN113253735A publication Critical patent/CN113253735A/en
Application granted granted Critical
Publication of CN113253735B publication Critical patent/CN113253735B/en
Priority to PCT/CN2022/096935 priority patent/WO2022262594A1/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02 Control of position or course in two dimensions
    • G05D1/021 Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0257 Control of position or course in two dimensions specially adapted to land vehicles using a radar

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Optical Radar Systems And Details Thereof (AREA)

Abstract

The present disclosure provides a method and an apparatus for following a target, a robot, and a non-transitory computer-readable storage medium, and relates to the field of robotics. The method for following a target comprises: acquiring, by a lidar, coordinates of each candidate target in a lidar coordinate system; acquiring, by an auxiliary positioning device, auxiliary coordinates of the target in the lidar coordinate system, the candidate targets including the target; determining the coordinates of the target in the lidar coordinate system according to the coordinates of each candidate target in the lidar coordinate system and the auxiliary coordinates of the target in the lidar coordinate system; and moving to follow the target according to the coordinates of the target in the lidar coordinate system. The robot can thus automatically and accurately follow the target.

Description

Method, device, robot and computer readable storage medium for following target
Technical Field
The present disclosure relates to the field of robotics, and in particular, to a method and an apparatus for following a target, a robot, and a non-transitory computer-readable storage medium.
Background
Robots offer clear operational advantages in tasks such as carrying goods.
To enable a robot to work in cooperation with a worker, two approaches are generally used. One is to move the robot to a designated position by manual remote control; the other is to teach the robot a route and build a map for it.
Disclosure of Invention
The technical problem addressed by the present disclosure is how to make a robot automatically and accurately follow a moving target.
According to a first aspect of the present disclosure, there is provided a method of following a target, comprising: acquiring, by a lidar, coordinates of each candidate target in a lidar coordinate system; acquiring, by an auxiliary positioning device, auxiliary coordinates of the target in the lidar coordinate system, wherein the candidate targets include the target; determining the coordinates of the target in the lidar coordinate system according to the coordinates of each candidate target in the lidar coordinate system and the auxiliary coordinates of the target in the lidar coordinate system; and moving to follow the target according to the coordinates of the target in the lidar coordinate system.
In some embodiments, acquiring, by the lidar, the coordinates of each candidate target in the lidar coordinate system comprises: acquiring, by the lidar, first coordinates of each candidate target in a first scan frame in the lidar coordinate system and second coordinates of each candidate target in a second scan frame in the lidar coordinate system. Acquiring, by the auxiliary positioning device, the auxiliary coordinates of the target in the lidar coordinate system comprises: acquiring, by the auxiliary positioning device, a first auxiliary coordinate of the target in the lidar coordinate system at the scanning time of the first scan frame and a second auxiliary coordinate of the target in the lidar coordinate system at the scanning time of the second scan frame. Determining the coordinates of the target in the lidar coordinate system comprises: determining a first coordinate of the target in the lidar coordinate system according to the first coordinates and the first auxiliary coordinate; and determining a second coordinate of the target in the lidar coordinate system according to the second coordinates and the second auxiliary coordinate.
In some embodiments, moving to follow the target according to the coordinates of the target in the lidar coordinate system comprises: determining a moving direction and a moving distance according to the coordinate difference between the second coordinate of the target in the lidar coordinate system and the first coordinate of the target in the lidar coordinate system; and determining a moving speed according to that coordinate difference and the scanning interval between the first and second scan frames.
In some embodiments, acquiring, by the lidar, the first coordinates of each candidate target in the first scan frame in the lidar coordinate system comprises: clustering, according to their coordinates in the lidar coordinate system, the laser point cloud generated by the lidar in the first scan frame to obtain a plurality of laser point clusters; performing curve fitting on each laser point cluster; determining a minimum circumscribed rotated rectangle for each fitted curve whose length is within a preset length range; and taking the coordinates of the center point of each minimum circumscribed rotated rectangle in the lidar coordinate system as the first coordinates of each candidate target in the first scan frame in the lidar coordinate system.
In some embodiments, determining the first coordinate of the target in the lidar coordinate system from the first coordinates and the first auxiliary coordinate comprises: determining the width direction of each minimum circumscribed rotated rectangle; screening out the minimum circumscribed rotated rectangles whose width direction makes an angle smaller than a first threshold with the advancing direction of the robot; selecting, among the center points of the screened rectangles, the center point closest to the origin of the lidar coordinate system; and taking that closest center point as the first coordinate of the target in the lidar coordinate system when its distance to the first auxiliary coordinate is smaller than a second threshold.
In some embodiments, acquiring, by the lidar, the first coordinates of each candidate target in the first scan frame in the lidar coordinate system further comprises: before clustering the laser point cloud generated by the lidar in the first scan frame, deleting the t-th scan point in the point cloud, where the distance between the t-th scan point and the (t-1)-th scan point is greater than a third threshold, the distance between the t-th scan point and the (t+1)-th scan point is greater than the third threshold, and t is an integer greater than 1.
In some embodiments, acquiring, by the lidar, the second coordinates of each candidate target in the second scan frame in the lidar coordinate system comprises: clustering, according to their coordinates in the lidar coordinate system, the laser point cloud generated by the lidar in the second scan frame to obtain a plurality of laser point clusters; performing curve fitting on each laser point cluster; determining a minimum circumscribed rotated rectangle for each fitted curve whose length is within a preset length range; and taking the coordinates of the center point of each minimum circumscribed rotated rectangle in the lidar coordinate system as the second coordinates of each candidate target in the second scan frame in the lidar coordinate system.
In some embodiments, determining the second coordinate of the target in the lidar coordinate system from the second coordinates and the second auxiliary coordinate comprises: inputting the first coordinate of the target in the lidar coordinate system into a Kalman filter to obtain a predicted coordinate of the target in the lidar coordinate system; selecting, among the second coordinates of the candidate targets in the second scan frame in the lidar coordinate system, the second coordinate closest to the predicted coordinate; and taking that closest second coordinate as the second coordinate of the target in the lidar coordinate system when its distance to the second auxiliary coordinate is smaller than the second threshold.
In some embodiments, acquiring, by the lidar, the second coordinates of each candidate target in the second scan frame in the lidar coordinate system further comprises: before clustering the laser point cloud generated by the lidar in the second scan frame, deleting the t-th scan point in the point cloud, where the distance between the t-th scan point and the (t-1)-th scan point is greater than a third threshold, the distance between the t-th scan point and the (t+1)-th scan point is greater than the third threshold, and t is an integer greater than 1.
In some embodiments, the auxiliary locating device comprises an ultra-wideband locating device or a visual camera.
According to a second aspect of the present disclosure, there is provided an apparatus for following a target, comprising: a memory; and a processor coupled to the memory, the processor configured to perform the foregoing method of following a target based on instructions stored in the memory.
According to a third aspect of the present disclosure, there is provided a robot comprising a lidar, an auxiliary positioning device, and the aforementioned device for following a target.
According to a fourth aspect of the present disclosure, there is provided a non-transitory computer-readable storage medium storing computer instructions which, when executed by a processor, implement the foregoing method of following a target.
The robot can automatically and accurately move along with the target.
Other features of the present disclosure and advantages thereof will become apparent from the following detailed description of exemplary embodiments thereof, which proceeds with reference to the accompanying drawings.
Drawings
To illustrate the embodiments of the present disclosure or the related art more clearly, the drawings needed for describing the embodiments are briefly introduced below. The drawings described below show only some embodiments of the present disclosure; those skilled in the art can derive other drawings from them without inventive effort.
FIG. 1 illustrates a flow diagram of a method of following a target of some embodiments of the present disclosure.
Fig. 2 shows a schematic diagram of the coordinates of the respective candidate targets in the lidar coordinate system.
FIG. 3 shows a flow diagram of a method of following a target in accordance with further embodiments of the present disclosure.
FIG. 4 shows a schematic of a plurality of laser point clusters, their fitted curves, and their minimum circumscribed rotated rectangles.
FIG. 5 illustrates a schematic structural diagram of a target-following device according to some embodiments of the present disclosure.
Fig. 6 shows a schematic structural view of a robot according to some embodiments of the present disclosure.
Detailed Description
The technical solutions in the embodiments of the present disclosure will be described clearly and completely below with reference to the accompanying drawings. The described embodiments are only some of the embodiments of the present disclosure, not all of them, and the following description of at least one exemplary embodiment is merely illustrative and in no way limits the disclosure, its application, or its uses. All other embodiments obtained by those skilled in the art from the disclosed embodiments without creative effort fall within the protection scope of the present disclosure.
Research shows that moving a robot to a designated position by manual remote control requires an operator familiar with the remote control, and that the operator must repeatedly drive the robot back and forth and left and right during the operation; the operation is therefore inflexible, time-consuming, and unpleasant. Teaching the robot and building a map for it requires the robot's moving route to be specified in advance, so the route cannot be changed flexibly. In these conventional approaches to human-robot cooperation, the robot cannot automatically and accurately follow a target (such as a worker).
In view of the above, the present disclosure provides a method of following a target. Some embodiments of the method are first described with reference to FIG. 1.
FIG. 1 illustrates a flow diagram of a method of following a target of some embodiments of the present disclosure. As shown in fig. 1, the method includes steps S101 to S104.
In step S101, coordinates of each candidate target in a lidar coordinate system are acquired by the lidar.
By mounting a lidar (e.g., a single-line or multi-line lidar), the robot can obtain laser point cloud data. From this data, the coordinates of each candidate target in the lidar coordinate system can be obtained; the candidate targets include the target the robot is to follow. Fig. 2 shows a schematic diagram of the coordinates of the candidate targets in the lidar coordinate system. As shown in Fig. 2, the y-axis of the lidar coordinate system points in the advancing direction of the robot (which is also the main scanning direction of the lidar), and the x-axis points to the right of the robot.
In step S102, auxiliary coordinates of the target in the lidar coordinate system are acquired by the auxiliary positioning device.
The auxiliary positioning device may be an ultra-wideband (UWB) positioning device or a visual camera with target detection capability, such as a depth camera or a binocular camera.
In step S103, the coordinates of the target in the lidar coordinate system are determined according to the coordinates of each candidate target in the lidar coordinate system and the auxiliary coordinates of the target in the lidar coordinate system.
In one implementation, among the coordinates of the candidate targets in the lidar coordinate system, the coordinate closest to the auxiliary coordinate of the target is selected and used as the coordinate of the target in the lidar coordinate system, as in the sketch below.
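As an illustration only (not code from the patent), the following Python sketch selects the candidate coordinate nearest to the auxiliary coordinate; the function name and the (x, y) tuple layout are assumptions.

```python
import math

def select_target_coordinate(candidate_coords, auxiliary_coord):
    """Pick the candidate coordinate closest to the auxiliary coordinate.

    candidate_coords: list of (x, y) candidate positions in the lidar frame.
    auxiliary_coord:  (x, y) position of the target from the auxiliary
                      positioning device, already mapped into the lidar frame.
    """
    return min(
        candidate_coords,
        key=lambda c: math.hypot(c[0] - auxiliary_coord[0],
                                 c[1] - auxiliary_coord[1]),
    )
```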
In step S104, the robot moves to follow the target according to the coordinates of the target in the lidar coordinate system.
After the coordinates of the target in the lidar coordinate system are obtained, the moving direction and moving distance of the robot can be determined from them, so that the robot moves along with the target.
The auxiliary coordinates provided by the auxiliary positioning device have low accuracy, while the coordinates provided by the lidar have high accuracy but cannot by themselves reliably identify which candidate is the target. In the above embodiment, the coordinates of the target in the lidar coordinate system can be determined more accurately by combining the coordinates provided by the lidar with the auxiliary coordinates provided by the auxiliary positioning device, so that the robot can automatically and accurately follow the target according to those coordinates. The robot can therefore automatically follow a worker and work cooperatively without manual remote control, without teaching the robot, and without building a map for it. The approach is also robust to factors such as lighting, occlusion, target speed, and target deformation, and thus has wide applicability.
Further embodiments of the method of following a target of the present disclosure are described below with reference to FIG. 3.
FIG. 3 shows a flow diagram of a method of following a target in accordance with further embodiments of the present disclosure. As shown in FIG. 3, these embodiments include steps S301 to S308. Steps S301 to S303 obtain the coordinates of the target in the lidar coordinate system from the first scan frame, steps S304 to S306 obtain them from the second scan frame, and steps S307 to S308 move along with the target according to the coordinates of the target in the two scan frames.
In step S301, first coordinates of each candidate target in the first scanning frame in a lidar coordinate system are obtained through the lidar. The specific implementation process of step S301 may include steps S3011 to S3014.
(S3011) Cluster the laser point cloud generated by the lidar in the first scan frame according to the point coordinates in the lidar coordinate system, obtaining a plurality of laser point clusters. The clustering algorithm may be, for example, DBSCAN (Density-Based Spatial Clustering of Applications with Noise) or a nearest-neighbor clustering algorithm. Each laser point cluster obtained by clustering represents one candidate target.
(S3012) Perform curve fitting on each laser point cluster.
(S3013) For each fitted curve whose length is within a preset length range, determine its minimum circumscribed rotated rectangle. For example, fitted curves whose length is close to the waist width of a worker are selected, and the minimum circumscribed rotated rectangle is determined for each.
(S3014) Take the coordinates of the center point of each minimum circumscribed rotated rectangle in the lidar coordinate system as the first coordinates of each candidate target in the first scan frame in the lidar coordinate system.
Several laser point clusters, their fitted curves, and their minimum circumscribed rotated rectangles are shown in Fig. 4. A sketch of this per-frame candidate extraction is given below.
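As an illustration of steps S3011 to S3014, the following sketch clusters a 2D scan with scikit-learn's DBSCAN, keeps clusters whose extent lies within a preset length range, and computes the minimum-area rotated rectangle of each remaining cluster with OpenCV. The libraries, parameter values, and the shortcut of using the cluster extent in place of the fitted-curve length are illustrative assumptions, not requirements of the patent.

```python
import numpy as np
import cv2
from sklearn.cluster import DBSCAN

def extract_candidates(points, eps=0.1, min_samples=5,
                       min_len=0.2, max_len=0.6):
    """Extract candidate-target rectangles from one scan frame.

    points: N x 2 array of (x, y) scan points in the lidar frame.
    Returns a list of ((cx, cy), (w, h), angle) rotated rectangles whose
    longer side is within [min_len, max_len] metres (e.g. roughly a
    person's waist width); the centers (cx, cy) serve as the candidate
    coordinates of steps S3011-S3014.
    """
    labels = DBSCAN(eps=eps, min_samples=min_samples).fit_predict(points)
    candidates = []
    for k in set(labels):
        if k == -1:                          # DBSCAN noise label
            continue
        cluster = points[labels == k].astype(np.float32)
        (cx, cy), (w, h), angle = cv2.minAreaRect(cluster)
        if min_len <= max(w, h) <= max_len:  # preset length range
            candidates.append(((cx, cy), (w, h), angle))
    return candidates
```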
In step S302, a first auxiliary coordinate of the target in the lidar coordinate system at the scanning time of the first scanning frame is obtained by the auxiliary positioning device.
After the position information acquired by the auxiliary positioning device is mapped into the lidar coordinate system, the first auxiliary coordinate of the target in the lidar coordinate system is obtained. The first auxiliary coordinate may be a single coordinate point, or an auxiliary coordinate region with an auxiliary center coordinate. A sketch of such a mapping is given below.
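One possible way to perform this mapping is a planar rigid transform, sketched below under the assumption that the translation (tx, ty) and yaw of the auxiliary device relative to the lidar are known from an offline calibration; the function and parameter names are illustrative.

```python
import math

def to_lidar_frame(p_aux, tx, ty, yaw):
    """Map a point (x, y) from the auxiliary-device frame into the lidar frame.

    (tx, ty, yaw) are the planar extrinsics of the auxiliary device relative
    to the lidar, assumed known from an offline calibration.
    """
    x, y = p_aux
    c, s = math.cos(yaw), math.sin(yaw)
    return (c * x - s * y + tx, s * x + c * y + ty)
```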
In step S303, a first coordinate of the target in the lidar coordinate system is determined according to the first coordinate and the first auxiliary coordinate. The specific implementation process of step S303 may include steps S3031 to S3034.
(S3031) Determine the width direction of each minimum circumscribed rotated rectangle. The width direction of the minimum circumscribed rotated rectangle represents the front-to-back direction of the candidate target.
(S3032) Screen out the minimum circumscribed rotated rectangles whose width direction makes an angle smaller than a first threshold with the advancing direction of the robot, thereby screening out the candidate targets whose backs face the robot.
(S3033) From the center points of the screened minimum circumscribed rotated rectangles, select the center point closest to the origin of the lidar coordinate system, thereby selecting the candidate target closest to the robot.
(S3034) When the distance between the closest center point and the first auxiliary coordinate is smaller than the second threshold, take the closest center point as the first coordinate of the target in the lidar coordinate system. A sketch of this selection logic is given below.
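A minimal sketch of this selection logic follows, assuming each candidate rectangle is represented by its center and the angle between its width direction and the robot's advancing direction; the data layout and threshold parameters are illustrative.

```python
import math

def pick_first_coordinate(rects, aux_coord, angle_thresh_deg, dist_thresh):
    """Select the first coordinate of the target among candidate rectangles.

    rects: list of (center, width_dir_deg), where width_dir_deg is the angle
           between the rectangle's width direction and the robot's advancing
           direction (the lidar y-axis), in degrees.
    Returns the chosen center, or None if the auxiliary check fails.
    """
    # S3032: keep rectangles whose width direction is nearly aligned with
    # the advancing direction (candidates whose back faces the robot).
    aligned = [(c, a) for c, a in rects if abs(a) < angle_thresh_deg]
    if not aligned:
        return None
    # S3033: among those, take the center closest to the lidar origin.
    center, _ = min(aligned, key=lambda r: math.hypot(r[0][0], r[0][1]))
    # S3034: accept only if it also agrees with the first auxiliary coordinate.
    d = math.hypot(center[0] - aux_coord[0], center[1] - aux_coord[1])
    return center if d < dist_thresh else None
```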
If the distance between the closest center point and the first auxiliary coordinate is not smaller than the second threshold, steps S301 to S303 are repeated. If the first coordinate of the target in the lidar coordinate system still cannot be determined after the number of repetitions reaches a preset upper limit, an alarm is raised to indicate that the target to be followed cannot be identified.
In step S304, second coordinates of each candidate target in the second scanning frame in the lidar coordinate system are obtained through the lidar. The specific implementation process of step S304 may include steps S3041 to S3044, which is similar to the implementation process of S301.
(S3041) Cluster the laser point cloud generated by the lidar in the second scan frame according to the point coordinates in the lidar coordinate system, obtaining a plurality of laser point clusters.
(S3042) Perform curve fitting on each laser point cluster.
(S3043) For each fitted curve whose length is within a preset length range, determine its minimum circumscribed rotated rectangle.
(S3044) Take the coordinates of the center point of each minimum circumscribed rotated rectangle in the lidar coordinate system as the second coordinates of each candidate target in the second scan frame in the lidar coordinate system.
In step S305, a second auxiliary coordinate of the target in the lidar coordinate system at the scanning time of the second scanning frame is obtained by the auxiliary positioning device. Step S305 is similar to the implementation of step S302.
In step S306, the second coordinate of the target in the lidar coordinate system is determined according to the second coordinates and the second auxiliary coordinate. The specific implementation of step S306 may include steps S3061 to S3063.
(S3061) Input the first coordinate of the target in the lidar coordinate system into a Kalman filter to obtain the predicted coordinate of the target in the lidar coordinate system.
(S3062) From the second coordinates of the candidate targets in the second scan frame in the lidar coordinate system, select the second coordinate closest to the predicted coordinate.
(S3063) If the distance between that closest second coordinate and the second auxiliary coordinate is smaller than the second threshold, take it as the second coordinate of the target in the lidar coordinate system. A sketch of this prediction-and-matching step is given below.
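A minimal sketch of this step follows, using a hand-rolled constant-velocity Kalman predict step in NumPy; the state layout, process-noise value, and matching helper are illustrative assumptions rather than the patent's exact filter design.

```python
import numpy as np

def kf_predict(x, P, dt, q=0.1):
    """One predict step of a constant-velocity Kalman filter.

    x: state [px, py, vx, vy]; P: 4x4 covariance; dt: scan interval.
    q is an illustrative process-noise scale.
    """
    F = np.array([[1, 0, dt, 0],
                  [0, 1, 0, dt],
                  [0, 0, 1, 0],
                  [0, 0, 0, 1]], dtype=float)
    x_pred = F @ x
    P_pred = F @ P @ F.T + q * np.eye(4)
    return x_pred, P_pred

def match_second_coordinate(x_pred, second_coords, aux_coord, dist_thresh):
    """Pick the second coordinate closest to the predicted position and
    verify it against the second auxiliary coordinate (steps S3062-S3063)."""
    pred_xy = x_pred[:2]
    best = min(second_coords,
               key=lambda c: np.linalg.norm(np.asarray(c) - pred_xy))
    ok = np.linalg.norm(np.asarray(best) - np.asarray(aux_coord)) < dist_thresh
    return best if ok else None
```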
If the distance between the closest second coordinate and the second auxiliary coordinate is not smaller than the second threshold, steps S304 to S306 are repeated. If the second coordinate of the target in the lidar coordinate system still cannot be determined after the number of repetitions reaches the preset upper limit, an alarm is raised to indicate that the target being followed has been lost.
In step S307, the moving direction and the moving distance are determined according to the coordinate difference between the second coordinate of the target in the lidar coordinate system and the first coordinate of the target in the lidar coordinate system.
For example, the coordinate difference corresponds to a unique vector. The direction of this vector is taken as the moving direction of the robot, and the length of the vector multiplied by a preset proportion gives the moving distance of the robot.
In step S308, the movement rate is determined according to the scanning interval between the first scanning frame and the second scanning frame and the coordinate difference.
For example, the ratio of the length of the vector to the scanning interval may be used as the moving speed of the robot.
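Combining steps S307 and S308, the following sketch derives a moving direction, distance, and speed from two successive target coordinates; the scale parameter stands in for the preset proportion and, like the angle convention, is an illustrative assumption.

```python
import math

def follow_command(first_coord, second_coord, scan_interval, scale=1.0):
    """Derive a follow command from two successive target coordinates.

    Returns (heading_rad, distance, speed): heading is the direction of the
    displacement vector in the lidar frame, distance is its length times a
    preset proportion (scale), and speed is the length divided by the scan
    interval.
    """
    dx = second_coord[0] - first_coord[0]
    dy = second_coord[1] - first_coord[1]
    length = math.hypot(dx, dy)
    heading = math.atan2(dy, dx)  # angle convention is illustrative
    return heading, length * scale, length / scan_interval
```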
In some embodiments, the specific implementation process of step S301 may further include step S3010.
(S3010) Before clustering the laser point cloud generated by the lidar in the first scan frame, delete the t-th scan point in the point cloud, where the distance between the t-th scan point and the (t-1)-th scan point is greater than a third threshold, the distance between the t-th scan point and the (t+1)-th scan point is greater than the third threshold, and t is an integer greater than 1.
In some embodiments, the specific implementation process of step S304 may further include step S3040.
(S3040) Before clustering the laser point cloud generated by the lidar in the second scan frame, delete the t-th scan point in the point cloud, where the distance between the t-th scan point and the (t-1)-th scan point is greater than a third threshold, the distance between the t-th scan point and the (t+1)-th scan point is greater than the third threshold, and t is an integer greater than 1.
Steps S3010 and S3040 remove noise points from the laser point cloud to prevent interference caused by reflections and the like, further improving the accuracy of following the target. A sketch of this filtering is given below.
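A minimal sketch of this isolated-point removal follows, assuming an ordered list of 2D scan points; the helper name and the handling of the first and last points are illustrative.

```python
import math

def remove_isolated_points(points, third_threshold):
    """Drop scan points that are far from both neighbours (steps S3010/S3040).

    points: ordered list of (x, y) scan points from one frame. A point is
    deleted when its distances to the previous and to the next point both
    exceed third_threshold.
    """
    if len(points) < 3:
        return list(points)

    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])

    kept = [points[0]]
    for t in range(1, len(points) - 1):
        if dist(points[t], points[t - 1]) > third_threshold and \
           dist(points[t], points[t + 1]) > third_threshold:
            continue  # isolated point, likely a spurious reflection
        kept.append(points[t])
    kept.append(points[-1])
    return kept
```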
In these embodiments as well, the coordinates of the target in the lidar coordinate system can be determined more accurately by combining the coordinates provided by the lidar with the auxiliary coordinates provided by the auxiliary positioning device, so that the robot can automatically and accurately follow the target according to those coordinates. The robot can therefore automatically follow a worker and work cooperatively without manual remote control, without teaching the robot, and without building a map for it. The approach is also robust to factors such as lighting, occlusion, target speed, and target deformation, and thus has wide applicability.
Some embodiments of the apparatus of the present disclosure that follow the target are described below in conjunction with fig. 5.
FIG. 5 illustrates a schematic structural diagram of a target-following device according to some embodiments of the present disclosure. As shown in fig. 5, the apparatus 50 for following a target includes: a memory 510 and a processor 520 coupled to the memory 510, the processor 520 configured to perform a method of following a target in any of the foregoing embodiments based on instructions stored in the memory 510.
Memory 510 may include, for example, system memory, fixed non-volatile storage media, and the like. The system memory stores, for example, an operating system, an application program, a Boot Loader (Boot Loader), and other programs.
The target-following device 50 may further include an input/output interface 530, a network interface 540, a storage interface 550, and the like. These interfaces 530, 540, 550, the memory 510, and the processor 520 may be connected, for example, via a bus 560. The input/output interface 530 provides a connection interface for input/output devices such as a display, a mouse, a keyboard, and a touch screen. The network interface 540 provides a connection interface for various networking devices. The storage interface 550 provides a connection interface for external storage devices such as an SD card or a USB flash drive.
Some embodiments of the disclosed robot are described below in conjunction with fig. 6.
Fig. 6 shows a schematic structural view of a robot according to some embodiments of the present disclosure. As shown in fig. 6, the robot 60 includes: lidar 601, an auxiliary positioning device 602, and the aforementioned target-following device 50.
The present disclosure also includes a non-transitory computer readable storage medium having stored thereon computer instructions that, when executed by a processor, implement a method of following a target in any of the foregoing embodiments.
The aforementioned integrated units, if implemented in the form of software functional units and sold or used as separate products, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present invention. And the aforementioned storage medium includes: a U-disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a removable hard disk, a magnetic or optical disk, and other various media capable of storing program codes.
The present disclosure is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the disclosure. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
It should be noted that the terms "first," "second," and the like in the description, the claims, and the drawings of this application are used to distinguish similar elements and do not necessarily describe a particular sequence or chronological order. It should be understood that data so labeled may be interchanged where appropriate, so that the embodiments described herein can be implemented in orders other than those illustrated or described. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
It should also be noted that the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element.
The above description is only exemplary of the present disclosure and is not intended to limit the present disclosure, so that any modification, equivalent replacement, or improvement made within the spirit and principle of the present disclosure should be included in the scope of the present disclosure.

Claims (9)

1. A method of following a target, comprising:
acquiring, by a lidar, coordinates of each candidate target in a lidar coordinate system;
acquiring, by an auxiliary positioning device, auxiliary coordinates of a target in the lidar coordinate system, wherein the candidate targets comprise the target;
determining coordinates of the target in the lidar coordinate system according to the coordinates of each candidate target in the lidar coordinate system and the auxiliary coordinates of the target in the lidar coordinate system;
moving to follow the target according to the coordinates of the target in the lidar coordinate system;
wherein acquiring, by the lidar, the coordinates of each candidate target in the lidar coordinate system comprises: acquiring, by the lidar, first coordinates of each candidate target in a first scan frame in the lidar coordinate system and second coordinates of each candidate target in a second scan frame in the lidar coordinate system;
wherein acquiring, by the auxiliary positioning device, the auxiliary coordinates of the target in the lidar coordinate system comprises: acquiring, by the auxiliary positioning device, a first auxiliary coordinate of the target in the lidar coordinate system at a scanning time of the first scan frame and a second auxiliary coordinate of the target in the lidar coordinate system at a scanning time of the second scan frame;
wherein determining the coordinates of the target in the lidar coordinate system comprises: determining a first coordinate of the target in the lidar coordinate system according to the first coordinates of each candidate target in the lidar coordinate system and the first auxiliary coordinate; and determining a second coordinate of the target in the lidar coordinate system according to the second coordinates of each candidate target in the lidar coordinate system and the second auxiliary coordinate;
the obtaining, by the lidar, first coordinates of each candidate target in the first scanning frame in a lidar coordinate system includes:
clustering laser point clouds generated by the laser radar through the first scanning frame according to the coordinates of all the alternative targets in the first scanning frame under a laser radar coordinate system to obtain a plurality of laser point clusters;
respectively carrying out curve fitting on each laser spot cluster;
respectively determining a minimum external rotation rectangle for each fitting curve with the length within a preset length range;
taking the coordinates of the central point of each minimum external rotation rectangle under a laser radar coordinate system as first coordinates of each alternative target in a first scanning frame under the laser radar coordinate system;
wherein determining the first coordinate of the target in the lidar coordinate system according to the first coordinates of each candidate target in the lidar coordinate system and the first auxiliary coordinate comprises:
determining a width direction of each minimum circumscribed rotated rectangle;
screening out the minimum circumscribed rotated rectangles whose width direction makes an angle smaller than a first threshold with an advancing direction of the robot;
selecting, from center points of the screened minimum circumscribed rotated rectangles, a center point closest to an origin of the lidar coordinate system;
taking the closest center point as the first coordinate of the target in the lidar coordinate system when a distance between the closest center point and the first auxiliary coordinate is smaller than a second threshold;
wherein determining the second coordinate of the target in the lidar coordinate system according to the second coordinates of each candidate target in the lidar coordinate system and the second auxiliary coordinate comprises:
inputting the first coordinate of the target in the lidar coordinate system into a Kalman filter to obtain a predicted coordinate of the target in the lidar coordinate system;
selecting, from the second coordinates of each candidate target in the second scan frame in the lidar coordinate system, a second coordinate closest to the predicted coordinate;
and taking the closest second coordinate as the second coordinate of the target in the lidar coordinate system when a distance between the closest second coordinate and the second auxiliary coordinate is smaller than the second threshold.
2. The method of claim 1, wherein moving to follow the target according to the coordinates of the target in the lidar coordinate system comprises:
determining a moving direction and a moving distance according to a coordinate difference between the second coordinate of the target in the lidar coordinate system and the first coordinate of the target in the lidar coordinate system;
and determining a moving speed according to the coordinate difference and a scanning interval between the first scan frame and the second scan frame.
3. The method of claim 2, wherein acquiring, by the lidar, the first coordinates of each candidate target in the first scan frame in the lidar coordinate system further comprises:
before the clustering, deleting a t-th scan point in the laser point cloud generated by the lidar in the first scan frame, wherein a distance between the t-th scan point and a (t-1)-th scan point is greater than a third threshold, a distance between the t-th scan point and a (t+1)-th scan point is greater than the third threshold, and t is an integer greater than 1.
4. The method of claim 1, wherein acquiring, by the lidar, the second coordinates of each candidate target in the second scan frame in the lidar coordinate system comprises:
clustering, according to their coordinates in the lidar coordinate system, the laser point cloud generated by the lidar in the second scan frame to obtain a plurality of laser point clusters;
performing curve fitting on each laser point cluster;
determining a minimum circumscribed rotated rectangle for each fitted curve whose length is within a preset length range;
and taking coordinates of a center point of each minimum circumscribed rotated rectangle in the lidar coordinate system as the second coordinates of each candidate target in the second scan frame in the lidar coordinate system.
5. The method of claim 4, wherein acquiring, by the lidar, the second coordinates of each candidate target in the second scan frame in the lidar coordinate system further comprises:
before the clustering, deleting a t-th scan point in the laser point cloud generated by the lidar in the second scan frame, wherein a distance between the t-th scan point and a (t-1)-th scan point is greater than a third threshold, a distance between the t-th scan point and a (t+1)-th scan point is greater than the third threshold, and t is an integer greater than 1.
6. The method of any one of claims 1 to 5, wherein the auxiliary positioning device comprises an ultra-wideband positioning device or a visual camera.
7. An apparatus for following a target, comprising:
a memory; and
a processor coupled to the memory, the processor configured to perform the method of following a target of any one of claims 1 to 6 based on instructions stored in the memory.
8. A robot comprising a lidar, an auxiliary positioning device, and a device for following a target as claimed in claim 7.
9. A non-transitory computer readable storage medium, wherein the non-transitory computer readable storage medium stores computer instructions that, when executed by a processor, implement a method of following a target as recited in any one of claims 1-6.
CN202110658055.1A 2021-06-15 2021-06-15 Method, device, robot and computer readable storage medium for following target Active CN113253735B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202110658055.1A CN113253735B (en) 2021-06-15 2021-06-15 Method, device, robot and computer readable storage medium for following target
PCT/CN2022/096935 WO2022262594A1 (en) 2021-06-15 2022-06-02 Method and apparatus for following target, robot, and computer-readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110658055.1A CN113253735B (en) 2021-06-15 2021-06-15 Method, device, robot and computer readable storage medium for following target

Publications (2)

Publication Number Publication Date
CN113253735A CN113253735A (en) 2021-08-13
CN113253735B (en) 2021-10-08

Family

ID=77188103

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110658055.1A Active CN113253735B (en) 2021-06-15 2021-06-15 Method, device, robot and computer readable storage medium for following target

Country Status (2)

Country Link
CN (1) CN113253735B (en)
WO (1) WO2022262594A1 (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113253735B (en) * 2021-06-15 2021-10-08 同方威视技术股份有限公司 Method, device, robot and computer readable storage medium for following target
CN116642903A (en) * 2023-04-17 2023-08-25 国能锅炉压力容器检验有限公司 Device and method for obtaining continuous metallographic structure on site based on laser scanning
CN117549939B (en) * 2023-11-13 2024-04-12 沈阳奇辉机器人应用技术有限公司 Method and equipment for co-speed of rail robot and carriage
CN118628779B (en) * 2024-08-08 2024-10-15 浙江中控信息产业股份有限公司 A target recognition method based on adaptive clustering of minimum bounding rectangle
CN119087397B (en) * 2024-11-08 2025-02-14 深圳市普渡科技有限公司 Method and computer equipment for detecting light-transmitting materials

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110942474A (en) * 2019-11-27 2020-03-31 炬星科技(深圳)有限公司 Robot target tracking method, device and storage medium
WO2020085965A1 (en) * 2018-10-25 2020-04-30 Ireality Ab Method and controller for tracking moving objects
CN111986232A (en) * 2020-08-13 2020-11-24 上海高仙自动化科技发展有限公司 Target object detection method, target object detection device, robot and storage medium
CN112154444A (en) * 2019-10-17 2020-12-29 深圳市大疆创新科技有限公司 Target detection and tracking method, system, movable platform, camera and medium
CN112731371A (en) * 2020-12-18 2021-04-30 重庆邮电大学 Laser radar and vision fused integrated target tracking system and method

Family Cites Families (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107765220B (en) * 2017-09-20 2020-10-23 武汉木神机器人有限责任公司 Pedestrian following system and method based on UWB and laser radar hybrid positioning
CN112543877B (en) * 2019-04-03 2022-01-11 华为技术有限公司 Positioning method and positioning device
CN110246159B (en) * 2019-06-14 2023-03-28 湖南大学 3D target motion analysis method based on vision and radar information fusion
CN111079607A (en) * 2019-12-05 2020-04-28 灵动科技(北京)有限公司 Autonomous Driving System with Tracking
CN111198496A (en) * 2020-01-03 2020-05-26 浙江大学 A target following robot and following method
CN113916213B (en) * 2020-07-08 2024-07-23 北京猎户星空科技有限公司 Positioning method, positioning device, electronic equipment and computer readable storage medium
CN112487919A (en) * 2020-11-25 2021-03-12 吉林大学 3D target detection and tracking method based on camera and laser radar
CN112462782B (en) * 2020-11-30 2022-10-28 北京航天光华电子技术有限公司 Multifunctional intelligent following trolley system
CN112561841A (en) * 2020-12-04 2021-03-26 深兰人工智能(深圳)有限公司 Point cloud data fusion method and device for laser radar and camera
CN113066100B (en) * 2021-03-25 2024-09-10 东软睿驰汽车技术(沈阳)有限公司 Target tracking method, device, equipment and storage medium
CN113253735B (en) * 2021-06-15 2021-10-08 同方威视技术股份有限公司 Method, device, robot and computer readable storage medium for following target
CN113985445B (en) * 2021-08-24 2024-08-09 中国北方车辆研究所 3D target detection algorithm based on camera and laser radar data fusion
CN114155557B (en) * 2021-12-07 2022-12-23 美的集团(上海)有限公司 Positioning method, positioning device, robot and computer-readable storage medium

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2020085965A1 (en) * 2018-10-25 2020-04-30 Ireality Ab Method and controller for tracking moving objects
CN112154444A (en) * 2019-10-17 2020-12-29 深圳市大疆创新科技有限公司 Target detection and tracking method, system, movable platform, camera and medium
CN110942474A (en) * 2019-11-27 2020-03-31 炬星科技(深圳)有限公司 Robot target tracking method, device and storage medium
CN111986232A (en) * 2020-08-13 2020-11-24 上海高仙自动化科技发展有限公司 Target object detection method, target object detection device, robot and storage medium
CN112731371A (en) * 2020-12-18 2021-04-30 重庆邮电大学 Laser radar and vision fused integrated target tracking system and method

Also Published As

Publication number Publication date
WO2022262594A1 (en) 2022-12-22
CN113253735A (en) 2021-08-13

Similar Documents

Publication Publication Date Title
CN113253735B (en) Method, device, robot and computer readable storage medium for following target
CN106774345B (en) Method and equipment for multi-robot cooperation
CN110176078B (en) Method and device for labeling training set data
CN109683175B (en) Laser radar configuration method, device, equipment and storage medium
CN107742304B (en) Method and device for determining movement track, mobile robot and storage medium
MY195922A (en) Systems and Methods for Configurable Operation of a Robot Based on Area Classification
JP6665056B2 (en) Work support device, work support method, and program
US11253997B2 (en) Method for tracking multiple target objects, device, and computer program for implementing the tracking of multiple target objects for the case of moving objects
CN115170580A (en) Plate processing control method and device, computer equipment and storage medium
Balabokhin et al. Iso-scallop tool path building algorithm “based on tool performance metric” for generalized cutter and arbitrary milling zones in 3-axis CNC milling of free-form triangular meshed surfaces
CN110271006A (en) Mechanical arm visual guide method and device
CN114815851A (en) Robot following method, robot following device, electronic device, and storage medium
Horváth et al. Point cloud based robot cell calibration
RU2018121964A (en) OPTIMIZATION OF USER INTERACTIONS IN SEGMENTATION
CN111524165B (en) Target tracking method and device
Yu et al. Collaborative SLAM and AR-guided navigation for floor layout inspection
CN113741474B (en) Method and device for identifying parking area of unmanned mine car
CN113311836B (en) Control method, device, equipment and storage medium
CN113177980B (en) Target object speed determining method and device for automatic driving and electronic equipment
CN115139303A (en) Grid well lid detection method, device, equipment and storage medium
EP2806398A1 (en) Method of defining a region of interest
CN110852138B (en) Method and device for labeling objects in image data
CN115493581B (en) Robot movement map generation method, device, equipment and storage medium
Kim et al. Construction and inspection management system using mobile augmented reality
CN113702931B (en) External parameter calibration method and device for vehicle-mounted radar and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant