
CN114102574B - Positioning error evaluation system and method - Google Patents


Info

Publication number
CN114102574B
CN114102574B (granted from application CN202010882935.2A)
Authority
CN
China
Prior art keywords
pose information
coordinate system
positioning
pose
information
Prior art date
Legal status
Active
Application number
CN202010882935.2A
Other languages
Chinese (zh)
Other versions
CN114102574A (en)
Inventor
姜伟
俞毓锋
Current Assignee
Beijing Jizhijia Technology Co Ltd
Original Assignee
Beijing Jizhijia Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing Jizhijia Technology Co Ltd filed Critical Beijing Jizhijia Technology Co Ltd
Priority to CN202010882935.2A
Publication of CN114102574A
Application granted
Publication of CN114102574B


Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B25: HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J: MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00: Programme-controlled manipulators
    • B25J9/16: Programme controls
    • B25J9/1628: Programme controls characterised by the control loop
    • B25J9/1653: Programme controls characterised by the control loop parameters identification, estimation, stiffness, accuracy, error analysis
    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P: CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00: Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/02: Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
  • Manipulator (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

An embodiment of the application provides a positioning error evaluation system and method. The positioning error evaluation system comprises a first pose measuring device, a second pose measuring device and a control server, and at least one positioning mark is arranged on a robot. The first pose measuring device is used for acquiring at least one piece of first pose information of the positioning mark; the second pose measuring device is used for acquiring at least one piece of second pose information of the robot; and the control server is used for determining positioning error information for positioning the robot based on the at least one piece of first pose information and the at least one piece of second pose information. By combining the first pose information of the positioning mark with the second pose information of the robot, the embodiment can overcome the limitations of the navigation and positioning technology used by the pose measuring device and achieve a comprehensive and accurate evaluation of the positioning performance of the robot.

Description

Positioning error evaluation system and method
Technical Field
The application relates to the technical field of robots and positioning, in particular to a positioning error evaluation system and a positioning error evaluation method.
Background
With the development of automation technology, robots have become increasingly common in daily life and work; they can receive and execute tasks dispatched by humans and thereby provide services.
While a robot executes a task, its positioning accuracy determines, to a certain extent, the quality with which the task is performed. To guarantee that quality, the robot's positioning error needs to be evaluated. Existing evaluation methods typically compute the repeated-positioning error of either two-dimensional-code navigation or SLAM (simultaneous localization and mapping) navigation in isolation. Because a robot may mix two-dimensional codes and a SLAM map during navigation, evaluating the repeated-positioning error of only one of the two makes it difficult to assess the robot's positioning performance comprehensively.
Disclosure of Invention
An embodiment of the application provides at least a positioning error evaluation system, so as to improve the accuracy of robot positioning-performance evaluation.
In a first aspect, an embodiment of the present application provides a positioning error evaluation system including: at least one robot, wherein at least one positioning mark is arranged on the robot;
the first pose measuring device is used for acquiring at least one piece of first pose information of the positioning mark;
the second pose measuring device is used for acquiring at least one piece of second pose information of the robot;
and a control server configured to determine positioning error information for positioning the robot based on the at least one first pose information and the at least one second pose information.
In a possible implementation manner, the first pose information is true pose information of the positioning identifier under a true coordinate system;
the second pose measuring device is arranged on the robot; the second pose information is measured pose information of the robot under a measurement coordinate system.
In a possible implementation manner, the control server is configured to, when determining the positioning error information:
determining second pose information which is matched with each first pose information in time to obtain at least one error evaluation data pair; each error evaluation data pair comprises first pose information and second pose information matched with the first pose information;
performing correction processing on each piece of second pose information based on a coordinate system transformation relation between a positioning identification coordinate system corresponding to the positioning identification and a body coordinate system corresponding to the robot to obtain third pose information, wherein the third pose information is measured pose information after correction processing of the positioning identification under a measurement coordinate system;
converting each third pose information into a true value coordinate system based on a coordinate system transformation relation between the measurement coordinate system and the true value coordinate system to obtain fourth pose information, wherein the fourth pose information is measured pose information after deviation correction processing of the positioning mark in the true value coordinate system;
for each error evaluation data pair, determining error information corresponding to the error evaluation data pair based on the fourth pose information and the first pose information of the error evaluation data pair;
and determining and outputting the positioning error information based on the error information corresponding to each error evaluation data pair.
In a possible embodiment, the control server is further configured to, before converting each third pose information into the true coordinate system:
a coordinate system transformation relationship between the measurement coordinate system and the truth coordinate system is determined.
In one possible embodiment, the control server, when determining the coordinate system transformation relationship between the measurement coordinate system and the true coordinate system, is configured to:
acquiring a plurality of first sample pose information of the positioning mark in the true value coordinate system;
acquiring a plurality of second sample pose information of the robot in the measurement coordinate system;
performing correction processing on each piece of second sample pose information based on the coordinate system transformation relation between the positioning identification coordinate system and the body coordinate system and the plurality of pieces of second sample pose information to obtain corrected sample pose information of the positioning identification under a measurement coordinate system;
and determining a coordinate system transformation relation between the measurement coordinate system and the true value coordinate system based on the plurality of first sample pose information and the plurality of second sample measurement pose information subjected to deviation correction processing.
In one possible embodiment, the control server, when determining the second pose information temporally matched to each of the first pose information, is configured to:
acquiring the generation time of each first pose information and the generation time of each second pose information;
and screening second pose information which is closest to the generation time of the first pose information and is not matched with other first pose information aiming at each first pose information, and taking the screened second pose information as second pose information which is matched with the first pose information in time.
In one possible implementation, the first pose measurement device includes a plurality of cameras with different shooting orientations; the true value coordinate system is a camera coordinate system;
the measurement coordinate system is a map coordinate system.
In a second aspect, the present application provides a positioning error evaluation method, applied to an error evaluation system, for determining positioning error information generated by a pose measurement system for measuring pose information of a robot, where at least one positioning identifier is set on the robot; comprising the following steps:
receiving at least one piece of first pose information of the positioning mark measured by a first pose measuring device;
receiving at least one piece of second pose information of the robot measured by a second pose measuring device; and
determining positioning error information for positioning the robot based on the at least one piece of first pose information and the at least one piece of second pose information.
In a possible implementation manner, the first pose information is true pose information of the positioning identifier under a true coordinate system;
the second pose measuring device is arranged on the robot; the second pose information is measured pose information of the robot under a measurement coordinate system.
In a possible implementation manner, the determining positioning error information for positioning the robot based on the at least one first pose information and the at least one second pose information includes:
determining second pose information which is matched with each first pose information in time to obtain at least one error evaluation data pair; each error evaluation data pair comprises first pose information and second pose information matched with the first pose information;
performing correction processing on each piece of second pose information based on a coordinate system transformation relation between a positioning identification coordinate system corresponding to the positioning identification and a body coordinate system corresponding to the robot to obtain third pose information, wherein the third pose information is measured pose information after correction processing of the positioning identification under a measurement coordinate system;
converting each third pose information into a true value coordinate system based on a coordinate system transformation relation between the measurement coordinate system and the true value coordinate system to obtain fourth pose information, wherein the fourth pose information is measured pose information after deviation correction processing of the positioning mark in the true value coordinate system;
for each error evaluation data pair, determining error information corresponding to the error evaluation data pair based on the fourth pose information and the first pose information of the error evaluation data pair;
and determining and outputting the positioning error information based on the error information corresponding to each error evaluation data pair.
The positioning error evaluation system comprises a first pose measuring device, a second pose measuring device and a control server, wherein at least one positioning mark is arranged on a robot. The first pose measuring device is used for acquiring at least one piece of first pose information of the positioning mark; the second pose measuring device is used for acquiring at least one piece of second pose information of the robot; the control server is arranged to determine positioning error information for positioning the robot based on the at least one first pose information and the at least one second pose information. According to the embodiment of the application, the at least one first pose information of the positioning mark and the at least one second pose information of the robot are combined, so that the limitation of navigation and positioning technologies used by the pose measuring device can be broken through, and the positioning performance of the robot can be comprehensively and accurately evaluated.
In order to make the above objects, features and advantages of the present application more comprehensible, preferred embodiments accompanied with figures are described in detail below.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are necessary for use in the embodiments are briefly described below, which drawings are incorporated in and form a part of the present description, these drawings illustrate embodiments consistent with the present application and together with the description serve to explain the technical solutions of the present application. It is to be understood that the following drawings illustrate only certain embodiments of the present application and are therefore not to be considered limiting of its scope, for the person of ordinary skill in the art may derive other relevant drawings from the drawings without inventive effort.
Fig. 1 is a schematic structural diagram of a positioning error evaluation system according to an embodiment of the present disclosure;
fig. 2A is a schematic diagram of pose information without deviation correction in an embodiment of the present application;
fig. 2B is a schematic diagram of pose information after correction in the embodiment of the present application;
FIG. 3A is a schematic diagram of a lateral error of a robot in an embodiment of the present application;
FIG. 3B is a schematic view of the angle error of the robot according to the embodiment of the present application;
fig. 4 is a schematic diagram of a positioning error evaluation method in an embodiment of the present application.
Detailed Description
For the purposes of making the objects, technical solutions and advantages of the embodiments of the present application more clear, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is apparent that the described embodiments are only some embodiments of the present application, but not all embodiments. The components of the embodiments of the present application, which are generally described and illustrated in the figures herein, may be arranged and designed in a wide variety of different configurations. Thus, the following detailed description of the embodiments of the present application, as provided in the accompanying drawings, is not intended to limit the scope of the application, as claimed, but is merely representative of selected embodiments of the application. All other embodiments, which can be made by those skilled in the art based on the embodiments of the present application without making any inventive effort, are intended to be within the scope of the present application.
It should be noted that: like reference numerals and letters denote like items in the following figures, and thus once an item is defined in one figure, no further definition or explanation thereof is necessary in the following figures.
The term "and/or" herein merely describes an association relationship between associated objects, indicating that three relationships may exist; for example, "A and/or B" may mean: A exists alone, both A and B exist, or B exists alone. In addition, the term "at least one" herein means any one of a plurality of items or any combination of at least two of them; for example, "including at least one of A, B and C" may mean including any one or more elements selected from the set formed by A, B and C.
While executing tasks, an autonomous mobile robot may mix two-dimensional-code navigation with SLAM navigation. If, when the robot's positioning error is evaluated, repeated-positioning-error evaluation is carried out on the two-dimensional-code navigation or the SLAM navigation alone, the positioning performance of the robot cannot be evaluated comprehensively, which reduces the accuracy of the evaluation. The embodiment of the application combines the first pose information of the positioning mark with the second pose information of the robot, can overcome the limitations of the navigation and positioning technology used by the robot, and achieves a comprehensive and accurate assessment of the robot's positioning performance.
The autonomous mobile robot comprises a robot body and a robot positioning system; the robot positioning system measures the pose of the robot body, i.e., it positions the robot body. The second pose measuring device corresponds to this robot positioning system: the robot positioning system is one concrete instance of the second pose measuring device. The robot body combined with the second pose measuring device constitutes the autonomous mobile robot.
The positioning error evaluation system provided in the present application is described in detail below. As shown in fig. 1, the positioning error evaluation system provided in the embodiment of the present application includes a first pose measurement device 11, a second pose measurement device 12, and a control server 13. The control server is used for determining positioning error information generated by the pose information measurement of the robot by the second pose measurement device, wherein at least one positioning mark is arranged on the robot.
The first pose measurement device may include a plurality of cameras with different shooting orientations, and the second pose measurement device may comprise the robot positioning system. For example, the first pose measuring device may be a 6-DOF pose measurement setup composed of 12 cameras covering an area 14 m long and 9 m wide; its pose measurement error is at the sub-millimetre level, so the pose information it measures can be treated as the true value. The second pose measuring device can perform both two-dimensional-code positioning and SLAM positioning while the robot moves in a mixed two-dimensional-code/SLAM field: when the robot runs in the two-dimensional-code field, the second pose measuring device determines the robot's pose information by recognizing two-dimensional codes; when the robot moves from the two-dimensional-code field into the SLAM field, the second pose measuring device determines the pose information through SLAM.
In a specific implementation, the first pose measurement device is configured to obtain at least one piece of first pose information of the positioning mark. The positioning mark is arranged on the robot, and the first pose measuring device does not detect the robot's pose directly; instead it directly detects the pose of the positioning mark, so the pose information of the robot can be determined by tracking the pose information of the mark. The positioning mark may be arranged at the robot's centre of mass so that its pose is as close as possible to the robot's pose. The first pose information is true pose information of the positioning mark in the true-value coordinate system, i.e., the camera coordinate system.
The second pose measurement device is configured to obtain at least one second pose information of the robot. Here, the detected second pose information may include pose information obtained by two-dimensional code recognition or pose information obtained by SLAM. Here, the second pose measurement device is provided on the robot; the second pose information is measured pose information of the robot in a measured coordinate system, namely a map coordinate system.
The control server is arranged to determine positioning error information for positioning the robot based on the at least one first pose information and the at least one second pose information.
The control server is specifically configured to perform the following steps when determining the positioning error information based on the first pose information and the second pose information:
step one, determining second pose information which is matched with each piece of first pose information in time to obtain at least one error evaluation data pair; each error evaluation data pair comprises first pose information and second pose information matched with the first pose information.
And secondly, carrying out correction processing on each piece of second pose information based on a coordinate system transformation relation between a positioning identification coordinate system corresponding to the positioning identification and a body coordinate system corresponding to the robot to obtain third pose information, wherein the third pose information is measured pose information after correction processing of the positioning identification under a measurement coordinate system.
The second pose information indicates the pose of the robot in the map coordinate system. Using the coordinate-system transformation between the positioning-mark coordinate system and the robot's body coordinate system, that pose can be converted into the pose of the positioning mark in the map coordinate system, i.e., the third pose information. The third pose information obtained by this transformation is the deviation-corrected version of the directly measured second pose information, i.e., the corrected measured value.
Specifically, the third pose information may be calculated using the following formula:
P_corrected = P_estimate × T_marker  (1)
where P_corrected denotes the third pose information, P_estimate denotes the second pose information, and T_marker denotes the coordinate-system transformation between the positioning-mark coordinate system and the body coordinate system.
And thirdly, converting each piece of third pose information into a true value coordinate system based on a coordinate system transformation relation between the measurement coordinate system and the true value coordinate system to obtain fourth pose information, wherein the fourth pose information is measured pose information after correction processing of the positioning mark in the true value coordinate system.
Based on the coordinate system transformation relation between the measurement coordinate system and the true value coordinate system, the pose of the positioning mark in the map coordinate system, namely the third pose information, can be transformed into the pose of the positioning mark in the camera coordinate system, and the fourth pose information is obtained. The fourth pose information obtained by transformation is shown as a square box 21 in fig. 2B, the second pose information obtained by direct measurement is shown as a square box 22 in fig. 2A, and the first pose information obtained by direct measurement is shown as a square box 23 in fig. 2A and a square box 24 in fig. 2B. As can be seen from fig. 2A and 2B, after the coordinate transformation, the measured value, i.e. the fourth pose information 21, is closer or similar to the true value, i.e. the first pose information 24. In practical application, if the robot positioning error is smaller, the fourth pose information 21 is very close to the first pose information 24, and the difference between the fourth pose information and the first pose information is smaller.
Specifically, the fourth pose information may be calculated using the following formula:
P_translated = T_translation × P_corrected  (2)
where P_translated denotes the fourth pose information and T_translation denotes the coordinate-system transformation between the measurement coordinate system and the true-value coordinate system.
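As a concrete illustration, formulas (1) and (2) are ordinary products of homogeneous transforms and can be chained directly. The sketch below is a minimal Python/NumPy example for planar (SE(2)) poses; the helper name `se2` and all numeric values are hypothetical, not taken from the patent.

```python
import numpy as np

def se2(x, y, theta):
    """Build a 3x3 homogeneous transform for a planar pose (x, y, theta)."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, x],
                     [s,  c, y],
                     [0., 0., 1.]])

# Hypothetical inputs: the robot pose estimate in the map frame (second pose
# information) and the fixed marker-to-body transform T_marker.
P_estimate = se2(2.0, 1.0, np.pi / 2)   # robot pose in the map frame
T_marker = se2(0.1, 0.0, 0.0)           # marker mounted 0.1 m ahead of the body origin

# Formula (1): marker pose in the map frame (third pose information).
P_corrected = P_estimate @ T_marker

# Formula (2): marker pose in the true-value (camera) frame (fourth pose
# information), assuming the map-to-camera transform T_translation is known.
T_translation = se2(0.5, -0.3, 0.0)
P_translated = T_translation @ P_corrected
```

With these values the marker lands at (2.0, 1.1) in the map frame and at (2.5, 0.8) in the camera frame; in practice T_translation would come from the calibration procedure described further on.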
And step four, determining error information corresponding to each error evaluation data pair based on the fourth pose information and the first pose information in the error evaluation data pair.
Thus, the measured value of the pose of the positioning mark in the camera coordinate system, namely the fourth pose information, is obtained, and the positioning error information generated by the pose information measurement of the robot by the second pose measuring device can be determined by combining the true value of the pose of the positioning mark in the camera coordinate system, namely the first pose information.
In a specific implementation, the positioning error information may be determined by calculating a difference between the fourth pose information and the first pose information. The pose information comprises angle information, and the angle in the difference value can be further normalized.
Specifically, the positioning error information may be calculated using the following formula:
P_error = P_truth − P_translated  (3)
where P_error denotes the positioning error information and P_truth denotes the first pose information.
The angle θ in the positioning error information may be normalized using the following formula:
θ_error = atan2(sin θ, cos θ)  (4)
where θ_error denotes the result of normalizing the angle θ of the positioning error information into the interval (−π, π].
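Formulas (3) and (4) amount to a componentwise pose difference whose angular term is wrapped back into (−π, π]. A minimal Python sketch, with poses represented as hypothetical (x, y, θ) triples:

```python
import numpy as np

def normalize_angle(theta):
    """Wrap an angle into (-pi, pi], as in formula (4)."""
    return float(np.arctan2(np.sin(theta), np.cos(theta)))

def pose_error(p_truth, p_translated):
    """Formula (3): P_error = P_truth - P_translated, with the angle
    component of the difference re-normalized afterwards."""
    err = np.asarray(p_truth, dtype=float) - np.asarray(p_translated, dtype=float)
    err[2] = normalize_angle(err[2])
    return err

# Example: a raw angle difference of 6.0 rad is reported as 6.0 - 2*pi rad,
# i.e. a small negative angle rather than almost a full turn.
err = pose_error([1.0, 2.0, 3.0], [0.9, 2.1, -3.0])
```

The normalization matters because a raw subtraction of two angles near ±π can report an error of nearly 2π for two almost-identical headings.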
And fifthly, determining and outputting the positioning error information based on the error information corresponding to each error evaluation data pair.
In some embodiments, the control server is further configured to determine a coordinate system transformation relationship between the measurement coordinate system and the true coordinate system, and in implementation, the method may be implemented by using the following steps:
step one, acquiring a plurality of first sample pose information of the positioning mark in the true value coordinate system.
And step two, acquiring a plurality of pieces of second sample pose information of the robot in the map coordinate system.
And thirdly, carrying out correction processing on each piece of second sample pose information based on the coordinate system transformation relation between the positioning identification coordinate system and the body coordinate system and the plurality of pieces of second sample pose information, and obtaining the sample pose information after correction processing of the positioning identification under the measurement coordinate system.
Specifically, the sample pose information after the correction processing can be calculated by using the following formula:
Q_corrected^i = Q_estimate^i × T_marker  (5)
where Q_corrected^i denotes the i-th deviation-corrected sample pose information and Q_estimate^i denotes the i-th second sample pose information.
And step four, determining a coordinate system transformation relation between the measurement coordinate system and the true value coordinate system based on the plurality of pieces of first sample pose information and the plurality of pieces of second sample pose information subjected to deviation correction processing.
The method can be realized by the following substeps:
and step one, determining centroid pose information and centroid measurement information based on the plurality of first sample pose information and the plurality of second sample pose information subjected to deviation correction processing.
Here, the average value of the plurality of pieces of first sample pose information may be used as centroid pose information, and the average value of the second sample pose information after the correction processing may be used as centroid measurement information.
And secondly, determining a coordinate system transformation relation between the measurement coordinate system and the true value coordinate system based on the first sample pose information, the second sample pose information after the correction processing, the centroid pose information and the centroid measurement information.
Specifically, the coordinate system transformation relationship between the measurement coordinate system and the true coordinate system is determined by using the following formula:
q_truth^i = Q_truth^i − Q_truth  (6)
q_corrected^i = Q_corrected^i − Q_corrected  (7)
W = Σ_i q_truth^i × (q_corrected^i)^T  (8)
Performing SVD decomposition on W gives:
W = U Σ V^T  (9)
R = U V^T  (10)
t = Q_truth − R × Q_corrected  (11)
T_translation = [R, t; 0, 1]  (12)
where Q_truth^i denotes the i-th first sample pose information, Q_truth (without index) denotes the centroid pose information, Q_corrected (without index) denotes the centroid measurement information, and T_translation in formula (12) is the homogeneous transform assembled from the rotation R and the translation t.
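Formulas (6) through (11) are the classic SVD-based (Kabsch-style) rigid alignment: centre both point sets, build the cross-covariance matrix W, and read R and t off its SVD. The sketch below is a minimal Python implementation for the 2D position part only (the function name and sample data are hypothetical; the angular component of the poses would be handled separately):

```python
import numpy as np

def align_frames(Q_truth, Q_corrected):
    """Estimate the rigid transform (R, t) mapping the measurement frame to
    the truth frame from N matched 2D points, per formulas (6)-(11).
    Q_truth, Q_corrected: (N, 2) arrays of matched marker positions."""
    Q_truth = np.asarray(Q_truth, dtype=float)
    Q_corrected = np.asarray(Q_corrected, dtype=float)
    c_truth = Q_truth.mean(axis=0)        # centroid pose information
    c_corr = Q_corrected.mean(axis=0)     # centroid measurement information
    q_t = Q_truth - c_truth               # formula (6)
    q_c = Q_corrected - c_corr            # formula (7)
    W = q_t.T @ q_c                       # formula (8): sum of outer products
    U, _, Vt = np.linalg.svd(W)           # formula (9)
    R = U @ Vt                            # formula (10)
    if np.linalg.det(R) < 0:              # guard against a reflection solution
        U[:, -1] *= -1
        R = U @ Vt
    t = c_truth - R @ c_corr              # formula (11)
    return R, t
```

Given point pairs related by a rigid motion, `align_frames` recovers the rotation and translation exactly in the noise-free case; with noisy samples it returns the least-squares fit.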
Four coordinate systems appear above: camera coordinate system, map coordinate system, positioning identification coordinate system and body coordinate system, and T is used for transforming relationship between camera coordinate system and map coordinate system translation Representing the transformation relation between the positioning identification coordinate system and the body coordinate system by T marker Representing the transformation relation between the body coordinate system and the map coordinate system by T estimate Representing the transformation relation between the positioning identification coordinate system and the camera coordinate system by T truth The relationship between the four transformation relationships can be expressed by the following formula:
T truth =T translation ×T estimate ×T marker (13)
The coordinate system transformations in the above embodiments can be realized based on the above transformation relations.
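Formula (13) is an ordinary chain of homogeneous transformations. A small sketch, assuming 2-D poses represented as 3×3 homogeneous matrices; all numeric values and the helper name are hypothetical:

```python
import numpy as np

def se2(theta, x, y):
    """Homogeneous 2-D pose: rotation by theta plus translation (x, y)."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, x],
                     [s,  c, y],
                     [0,  0, 1]])

# Hypothetical example transforms for the four coordinate systems:
T_translation = se2(0.0, 5.0, 2.0)   # map -> camera (camera/map relation)
T_estimate    = se2(0.1, 1.0, 1.5)   # body -> map (robot pose estimate)
T_marker      = se2(0.0, 0.2, 0.0)   # marker -> body (marker mounting offset)

# Formula (13): chaining gives the marker pose in the camera frame.
T_truth = T_translation @ T_estimate @ T_marker
```

Applying the chained matrix to a point is equivalent to applying the three transforms in sequence, which is what makes the composition in formula (13) well defined.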
In some embodiments, the first pose measurement device is further configured to acquire a plurality of pieces of first pose information in time order, thereby obtaining a first pose information sequence.
The second pose measurement device is further configured to acquire a plurality of pieces of second pose information in time order, thereby obtaining a second pose information sequence.
The control server is further configured to: for each piece of first pose information, based on the generation time of that first pose information and the generation time of each piece of second pose information, screen out the second pose information that is closest in generation time to the first pose information and has not been matched with any other first pose information, and take the screened second pose information as the second pose information temporally matched with that first pose information.
When determining the second pose information matched with a piece of first pose information, the control server specifically takes the second pose information whose generation time is closest to that of the first pose information as the matched second pose information. In implementation, a callback function can be used to receive the first pose information sequence and the second pose information sequence, and two buffers can be defined in the callback function to match the pose information.
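The matching rule described above, nearest generation time while skipping second pose information already claimed by another first pose, can be sketched as a greedy search. The function name and the (timestamp, pose) tuple layout are illustrative assumptions:

```python
def match_by_timestamp(first_seq, second_seq):
    """For each first-pose sample, pick the not-yet-matched second-pose
    sample whose generation time is nearest.

    first_seq, second_seq: lists of (timestamp, pose) tuples.
    Returns a list of (first_pose, second_pose) error-evaluation data pairs.
    """
    used = set()
    pairs = []
    for t1, pose1 in first_seq:
        best_j, best_dt = None, float("inf")
        for j, (t2, _) in enumerate(second_seq):
            dt = abs(t1 - t2)
            if j not in used and dt < best_dt:
                best_j, best_dt = j, dt
        if best_j is not None:
            used.add(best_j)  # each second pose matches at most one first pose
            pairs.append((pose1, second_seq[best_j][1]))
    return pairs
```

In practice the two buffers mentioned above would hold the incoming sequences, and matched entries would be popped once paired.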
The calculated positioning error information can be represented by a histogram together with a fitted normal distribution curve, as shown in fig. 3A and fig. 3B, wherein fig. 3A shows the lateral error of the robot and fig. 3B shows the angle error of the robot. In the figures, μ represents the mean value of the normal distribution, and σ represents the variance of the normal distribution.
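The μ and σ reported in the figures can be obtained directly from the error samples. A small NumPy sketch, in which σ is taken as the standard deviation of the fitted normal distribution (function names are illustrative):

```python
import numpy as np

def error_statistics(errors):
    """Mean and spread of the error samples: the mu and sigma of the
    normal distribution overlaid on the histograms."""
    errors = np.asarray(errors, dtype=float)
    mu = errors.mean()
    sigma = errors.std()  # population std; pass ddof=1 for the sample std
    return mu, sigma

def normal_pdf(x, mu, sigma):
    """Normal density used to draw the fitted curve over the histogram."""
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2.0 * np.pi))
```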
Corresponding to the positioning error evaluation system, the present application also discloses a positioning error evaluation method, which is applied to the control server and is used for determining the positioning error information generated when the second pose measurement device measures the pose information of the robot, wherein at least one positioning mark is arranged on the robot. Specifically, as shown in fig. 4, the positioning error evaluation method includes:
s410, at least one piece of first pose information of the positioning mark measured by the first pose measuring device is received.
S420, receiving at least one piece of second pose information of the robot measured by a second pose measuring device.
S430, determining positioning error information for positioning the robot based on the at least one first pose information and the at least one second pose information.
In some embodiments, the determining positioning error information for positioning the robot based on the at least one first pose information and the at least one second pose information includes:
determining second pose information which is matched with each first pose information in time to obtain at least one error evaluation data pair; each error evaluation data pair comprises first pose information and second pose information matched with the first pose information;
performing correction processing on each piece of second pose information based on a coordinate system transformation relation between a positioning identification coordinate system corresponding to the positioning identification and a body coordinate system corresponding to the robot to obtain third pose information, wherein the third pose information is measured pose information after correction processing of the positioning identification under a measurement coordinate system;
converting each third pose information into a true value coordinate system based on a coordinate system transformation relation between the measurement coordinate system and the true value coordinate system to obtain fourth pose information, wherein the fourth pose information is measured pose information after deviation correction processing of the positioning mark in the true value coordinate system;
for each error evaluation data pair, determining error information corresponding to the error evaluation data pair based on the fourth pose information and the first pose information of the error evaluation data pair;
and determining and outputting the positioning error information based on the error information corresponding to each error evaluation data pair.
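The steps above can be sketched end to end as follows, assuming 2-D poses represented as 3×3 homogeneous matrices. The function name, parameter layout, and the decomposition of each residual into lateral and angle components are illustrative assumptions, not the original implementation:

```python
import numpy as np

def evaluate_positioning_error(pairs, T_marker, R, t):
    """Sketch of the evaluation pipeline described above.

    pairs    : list of (first_pose, second_pose) 3x3 homogeneous matrices,
               i.e. the error-evaluation data pairs
    T_marker : transform between the positioning identification coordinate
               system and the body coordinate system (3x3)
    R, t     : rotation and translation mapping the measurement coordinate
               system into the true-value coordinate system
    """
    # Assemble the measurement-to-truth transform from R and t
    T_mt = np.eye(3)
    T_mt[:2, :2] = R
    T_mt[:2, 2] = t

    errors = []
    for first, second in pairs:
        third = second @ T_marker             # correction: marker pose in the measurement frame
        fourth = T_mt @ third                 # deviation correction into the true-value frame
        diff = np.linalg.inv(first) @ fourth  # residual pose of this data pair
        lateral = np.hypot(diff[0, 2], diff[1, 2])
        angle = np.arctan2(diff[1, 0], diff[0, 0])
        errors.append((lateral, angle))
    return errors
```

The returned per-pair residuals are what would be aggregated into the histograms of fig. 3A and fig. 3B.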
The present application also provides a computer-readable storage medium, on which a computer program is stored, which, when executed by a processor, performs the steps of the positioning error evaluation method described in the above method embodiments. The storage medium may be a volatile or non-volatile computer-readable storage medium.
The computer program product of the positioning error evaluation method provided in the embodiments of the present application includes a computer-readable storage medium storing program code; the instructions included in the program code may be used to execute the steps of the positioning error evaluation method described in the above method embodiments. For details, reference may be made to the above method embodiments, which are not repeated herein.
The present application also provides a computer program which, when executed by a processor, implements any of the methods of the previous embodiments. The computer program product may be realized in particular by means of hardware, software or a combination thereof. In an alternative embodiment, the computer program product is embodied as a computer storage medium, and in another alternative embodiment, the computer program product is embodied as a software product, such as a software development kit (Software Development Kit, SDK), or the like.
It will be clear to those skilled in the art that, for convenience and brevity of description, the specific working procedures of the system and apparatus described above may refer to the corresponding procedures in the foregoing method embodiments and are not repeated herein. In the several embodiments provided in this application, it should be understood that the disclosed systems, devices, and methods may be implemented in other manners. The apparatus embodiments described above are merely illustrative; for example, the division of the units is merely a logical function division, and there may be other division manners in actual implementation; likewise, multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. In addition, the couplings, direct couplings, or communication connections shown or discussed may be indirect couplings or communication connections through some communication interfaces, devices, or units, and may be in electrical, mechanical, or other forms.
The units described as separate units may or may not be physically separate, and units shown as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in each embodiment of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a non-volatile computer readable storage medium executable by a processor. Based on such understanding, the technical solution of the present application may be embodied essentially or in a part contributing to the prior art or in a part of the technical solution, in the form of a software product stored in a storage medium, including several instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) to perform all or part of the steps of the methods described in the embodiments of the present application. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a random access Memory (Random Access Memory, RAM), a magnetic disk, or an optical disk, or other various media capable of storing program codes.
Finally, it should be noted that the foregoing examples are merely specific embodiments of the present application, intended to illustrate rather than limit its technical solutions, and the protection scope of the present application is not limited thereto. Although the present application has been described in detail with reference to the foregoing embodiments, those skilled in the art should understand that any person familiar with the technical field may still modify the technical solutions described in the foregoing embodiments, easily conceive of changes, or make equivalent substitutions for some of the technical features within the technical scope disclosed in the present application; such modifications, changes, or substitutions do not cause the corresponding technical solutions to depart from the spirit and scope of the technical solutions of the embodiments of the present application, and shall all be covered within the protection scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (8)

1. A positioning error assessment system, comprising: at least one robot, wherein at least one positioning mark is arranged on the robot;
the first pose measuring device is used for acquiring at least one piece of first pose information of the positioning mark;
the second pose measuring device is used for acquiring at least one piece of second pose information of the robot;
a control server configured to determine positioning error information for positioning the robot based on the at least one first pose information and the at least one second pose information;
the first pose information is true-value pose information of the positioning mark under a true-value coordinate system;
the second pose measuring device is arranged on the robot; the second pose information is measured pose information of the robot under a measurement coordinate system;
the control server, when determining the positioning error information, is configured to:
determining second pose information which is matched with each first pose information in time to obtain at least one error evaluation data pair; each error evaluation data pair comprises first pose information and second pose information matched with the first pose information;
determining and outputting positioning error information based on the error information corresponding to each error evaluation data pair;
the true value coordinate system is a camera coordinate system;
the measurement coordinate system is a map coordinate system.
2. The positioning error assessment system according to claim 1, wherein the control server, when determining the corresponding error information for each error assessment data pair, is arranged to:
performing correction processing on each piece of second pose information based on a coordinate system transformation relation between a positioning identification coordinate system corresponding to the positioning identification and a body coordinate system corresponding to the robot to obtain third pose information, wherein the third pose information is measured pose information after correction processing of the positioning identification under a measurement coordinate system;
converting each third pose information into a true value coordinate system based on a coordinate system transformation relation between the measurement coordinate system and the true value coordinate system to obtain fourth pose information, wherein the fourth pose information is measured pose information after deviation correction processing of the positioning mark in the true value coordinate system;
for each error evaluation data pair, determining error information corresponding to the error evaluation data pair based on the fourth pose information and the first pose information of the error evaluation data pair.
3. The positioning error assessment system according to claim 2, wherein the control server, prior to converting each third pose information into a true coordinate system, is further configured to:
a coordinate system transformation relationship between the measurement coordinate system and the truth coordinate system is determined.
4. A positioning error assessment system according to claim 3, wherein said control server, when determining a coordinate system transformation relationship between the measurement coordinate system and the true coordinate system, is arranged to:
acquiring a plurality of first sample pose information of the positioning mark in the true value coordinate system;
acquiring a plurality of second sample pose information of the robot in the measurement coordinate system;
performing correction processing on each piece of second sample pose information based on the coordinate system transformation relation between the positioning identification coordinate system and the body coordinate system and the plurality of pieces of second sample pose information to obtain corrected sample pose information of the positioning identification under a measurement coordinate system;
and determining a coordinate system transformation relation between the measurement coordinate system and the true value coordinate system based on the plurality of first sample pose information and the plurality of second sample pose information subjected to deviation correction processing.
5. A positioning error assessment system according to claim 3, wherein said control server, when determining second pose information temporally matching each first pose information, is arranged to:
acquiring the generation time of each first pose information and the generation time of each second pose information;
and screening second pose information which is closest to the generation time of the first pose information and is not matched with other first pose information aiming at each first pose information, and taking the screened second pose information as second pose information which is matched with the first pose information in time.
6. The positioning error evaluation system according to any one of claims 1 to 5, wherein the first pose measurement means includes a plurality of cameras having different shooting orientations.
7. The positioning error evaluation method is applied to an error evaluation system and is used for determining positioning error information generated by measuring pose information of a robot by a pose measurement system, wherein at least one positioning mark is arranged on the robot; characterized by comprising the following steps:
receiving at least one first pose information of the positioning mark measured by a first pose measuring device;
receiving at least one piece of second pose information of the robot measured by a second pose measuring device;
determining positioning error information for positioning the robot based on the at least one first pose information and the at least one second pose information;
the first pose information is true-value pose information of the positioning mark under a true-value coordinate system;
the second pose measuring device is arranged on the robot; the second pose information is measured pose information of the robot under a measurement coordinate system;
the determining positioning error information for positioning the robot based on the at least one first pose information and the at least one second pose information includes:
determining second pose information which is matched with each first pose information in time to obtain at least one error evaluation data pair; each error evaluation data pair comprises first pose information and second pose information matched with the first pose information;
determining and outputting positioning error information based on the error information corresponding to each error evaluation data pair;
the true value coordinate system is a camera coordinate system;
the measurement coordinate system is a map coordinate system.
8. The positioning error assessment method according to claim 7, wherein determining the corresponding error information for each error assessment data pair comprises:
performing correction processing on each piece of second pose information based on a coordinate system transformation relation between a positioning identification coordinate system corresponding to the positioning identification and a body coordinate system corresponding to the robot to obtain third pose information, wherein the third pose information is measured pose information after correction processing of the positioning identification under a measurement coordinate system;
converting each third pose information into a true value coordinate system based on a coordinate system transformation relation between the measurement coordinate system and the true value coordinate system to obtain fourth pose information, wherein the fourth pose information is measured pose information after deviation correction processing of the positioning mark in the true value coordinate system;
for each error evaluation data pair, determining error information corresponding to the error evaluation data pair based on the fourth pose information and the first pose information of the error evaluation data pair.
CN202010882935.2A 2020-08-28 2020-08-28 Positioning error evaluation system and method Active CN114102574B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010882935.2A CN114102574B (en) 2020-08-28 2020-08-28 Positioning error evaluation system and method

Publications (2)

Publication Number Publication Date
CN114102574A CN114102574A (en) 2022-03-01
CN114102574B true CN114102574B (en) 2023-05-30

Family

ID=80374674


Country Status (1)

Country Link
CN (1) CN114102574B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115464647B (en) * 2022-09-19 2024-11-19 南方科技大学 Master-slave control software robot system and control method

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2013071190A1 (en) * 2011-11-11 2013-05-16 Evolution Robotics, Inc. Scaling vector field slam to large environments
CN105953798A (en) * 2016-04-19 2016-09-21 深圳市神州云海智能科技有限公司 Determination method and apparatus for poses of mobile robot
CN107782304A (en) * 2017-10-26 2018-03-09 广州视源电子科技股份有限公司 Mobile robot positioning method and device, mobile robot and storage medium
CN109443392A (en) * 2018-12-10 2019-03-08 北京艾瑞思机器人技术有限公司 Navigation error determines method and device, navigation control method, device and equipment
CN110163025A (en) * 2019-04-29 2019-08-23 达泊(东莞)智能科技有限公司 Two dimensional code localization method and device
CN110385720A (en) * 2019-07-26 2019-10-29 南京航空航天大学 A kind of robot localization error compensating method based on deep neural network
CN110794434A (en) * 2019-11-29 2020-02-14 广州视源电子科技股份有限公司 Pose determination method, device, equipment and storage medium
DE102018124595A1 (en) * 2018-10-05 2020-04-09 Carl Zeiss Industrielle Messtechnik Gmbh Device for detecting a position and location of a robot end effector
CN111070199A (en) * 2018-10-18 2020-04-28 杭州海康威视数字技术股份有限公司 Hand-eye calibration assessment method and robot
CN111238496A (en) * 2020-01-14 2020-06-05 深圳市锐曼智能装备有限公司 Robot posture confirming method, device, computer equipment and storage medium


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Design of an intelligent robot warehousing and logistics system; Du Yu et al.; Modular Machine Tool & Automatic Manufacturing Technique (No. 05); full text *



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant