CN115147796A - Method and device for evaluating target recognition algorithm, storage medium and vehicle
- Publication number: CN115147796A
- Application number: CN202210827749.8A
- Authority: CN (China)
- Prior art keywords: target, real, perception, determining, confidence
- Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/86—Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
- G01S13/867—Combination of radar systems with cameras
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F17/00—Digital computing or data processing equipment or methods, specially adapted for specific functions
- G06F17/10—Complex mathematical operations
- G06F17/16—Matrix or vector computation, e.g. matrix-matrix or matrix-vector multiplication, matrix factorization
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V2201/00—Indexing scheme relating to image or video recognition or understanding
- G06V2201/07—Target detection
Abstract
The disclosure relates to the technical field of automatic driving, in particular to a method and a device for evaluating a target recognition algorithm, a storage medium and a vehicle. The method for evaluating the target recognition algorithm comprises the following steps: acquiring point cloud data obtained by sensing a target scene by a millimeter wave radar sensor; performing target recognition on the point cloud data according to a target recognition algorithm to be evaluated to obtain a recognized perception target; performing association processing on a real target in the target scene and the perception target to obtain an association result; and determining an evaluation result of the target recognition algorithm to be evaluated according to the association result. By adopting the method, the accuracy of the target recognition algorithm to be evaluated can be determined.
Description
Technical Field
The present disclosure relates to the technical field of millimeter wave radar sensors in the automatic driving technology, and in particular, to a method, an apparatus, a storage medium, and a vehicle for evaluating a target recognition algorithm.
Background
An automatic driving system senses the environment around the vehicle through a sensing system and makes driving decisions based on that environment to control the vehicle to drive automatically. While the sensing system senses the environment around the vehicle, it performs entity/target recognition (such as obstacle recognition) on the sensing data acquired by the sensors, and the automatic driving system makes a driving decision adapted to the current environment based on the entity/target recognition result.
In the related art, the quality of a target recognition algorithm determines the accuracy of the target recognition result, and the accuracy of the target recognition result in turn strongly affects whether the automatic driving system makes a safe driving decision. Therefore, to ensure that an autonomous vehicle runs safely, the quality of the target recognition algorithm needs to be evaluated.
Disclosure of Invention
To overcome the problems in the related art, the present disclosure provides a method, an apparatus, a storage medium, and a vehicle for evaluating a target recognition algorithm.
According to a first aspect of the embodiments of the present disclosure, there is provided a method of evaluating a target recognition algorithm, the method comprising:
acquiring point cloud data obtained by sensing a target scene by a millimeter wave radar sensor;
performing target recognition on the point cloud data according to a target recognition algorithm to be evaluated to obtain a recognized perception target;
performing association processing on a real target in the target scene and the perception target to obtain an association result;
and determining an evaluation result of the target recognition algorithm to be evaluated according to the association result.
Optionally, the number of the sensing targets is multiple, the number of the real targets is multiple, and the associating the real targets in the target scene with the sensing targets to obtain an association result includes:
for each perception target, calculating a confidence degree of the perception target and each real target to obtain a confidence degree matrix, wherein rows of the confidence degree matrix correspond to one of the perception target and the real target, columns of the confidence degree matrix correspond to the other of the perception target and the real target, and the confidence degree represents the probability that the perception target and the real target are the same target;
determining an associated perception target associated with the real target and an associated real target associated with the perception target based on the magnitude of each confidence in the confidence matrix, wherein the association result comprises the associated perception target and the associated real target.
Optionally, in a case that a row of the confidence matrix corresponds to the sensing target and a column of the confidence matrix corresponds to the real target, the determining, based on a magnitude of each confidence in the confidence matrix, an associated sensing target associated with the real target and an associated real target associated with the sensing target includes:
determining a first target element which is larger than a first preset threshold value in the confidence coefficient matrix;
determining the perception target corresponding to the first target row where the first target element is located as the associated perception target;
and determining the real target corresponding to the first target column in which the first target element is positioned as the associated real target.
Optionally, the determining, based on the magnitude of each confidence in the confidence matrix, an associated sensing target associated with the real target and an associated real target associated with the sensing target further includes:
for a second target row except the first target row, determining a second target element with the maximum confidence level value in the second target row;
determining a second target column in which the second target element is located;
if the second target element is the maximum value in the second target column, determining the perception target corresponding to the second target row as the associated perception target;
and determining the real target corresponding to the second target column as the associated real target.
Optionally, the determining, based on the magnitude of each confidence in the confidence matrix, an associated sensing target associated with the real target and an associated real target associated with the sensing target further includes:
for each real target, calculating a first included angle between the real target and an X axis under a target coordinate system according to the position information of the real target;
determining a first included angle interval according to the first included angle corresponding to each real target;
for the candidate perception targets except the associated perception target, calculating a second included angle between the candidate perception target and the X axis in the target coordinate system according to the position information of the candidate perception target;
and if the second included angle is within the first included angle interval and a third target element larger than a second preset threshold exists in a third target row where the candidate perception target is located, determining the candidate perception target as the associated perception target.
Optionally, the determining, based on the magnitude of each confidence in the confidence matrix, an associated sensing target associated with the real target and an associated real target associated with the sensing target further includes:
if the number of the third target elements is larger than 1, determining candidate third target columns corresponding to all the third target elements;
determining a target candidate real target closest to the candidate perception target from the candidate real targets corresponding to all the candidate third target columns;
determining the target candidate real target as the associated real target.
Optionally, when a row of the confidence matrix corresponds to the perceptual target and a column of the confidence matrix corresponds to the real target, determining, according to the association result, an evaluation result of the target recognition algorithm to be evaluated includes:
determining the accuracy rate of the target recognition algorithm to be evaluated according to the number of the associated sensing targets and the total row number of the confidence coefficient matrix;
determining the recall rate of the target recognition algorithm to be evaluated according to the number of the associated real targets and the total column number of the confidence coefficient matrix;
wherein the evaluation result comprises the accuracy rate and the recall rate.
Optionally, the determining, according to the association result, an evaluation result of the target recognition algorithm to be evaluated includes:
determining associated pairs based on the association result, each associated pair comprising the associated perception target and the associated real target associated with the associated perception target;
calculating an error between the associated perceptual target and the associated real target in the associated pair;
and counting an error mean value corresponding to each target category according to the errors corresponding to the associated pairs, wherein the target category of the associated pairs is determined according to the category of the associated real target in the associated pairs, and the evaluation result comprises the error mean value.
According to a second aspect of the embodiments of the present disclosure, there is provided an apparatus for evaluating a target recognition algorithm, the apparatus comprising:
the acquisition module is configured to acquire point cloud data obtained by sensing a target scene by the millimeter wave radar sensor;
the identification module is configured to perform target identification on the point cloud data according to a target identification algorithm to be evaluated to obtain an identified perception target;
the association module is configured to perform association processing on a real target in the target scene and the perception target to obtain an association result;
and the execution module is configured to determine an evaluation result of the target recognition algorithm to be evaluated according to the association result.
Optionally, the number of the sensing targets is multiple, the number of the real targets is multiple, and the associating module includes:
a first calculation sub-module configured to calculate, for each of the perception targets, a confidence of the perception target and each of the real targets, resulting in a confidence matrix, rows of the confidence matrix corresponding to one of the perception target and the real target, columns of the confidence matrix corresponding to the other of the perception target and the real target, the confidence characterizing a probability that the perception target and the real target are the same target;
a first determining sub-module configured to determine, based on the magnitude of each of the confidences in the confidence matrix, an associated sensing target associated with the real target and an associated real target associated with the sensing target, the association result including the associated sensing target and the associated real target.
Optionally, in a case that a row of the confidence matrix corresponds to the perceptual target and a column of the confidence matrix corresponds to the real target, the first determining sub-module includes:
a second determination submodule configured to determine a first target element of the confidence matrix that is greater than a first preset threshold;
a third determining submodule configured to determine the perception target corresponding to the first target row in which the first target element is located as the associated perception target;
a fourth determining submodule configured to determine the real target corresponding to the first target column in which the first target element is located as the associated real target.
Optionally, the first determining sub-module further includes:
a fifth determination submodule configured to determine, for a second target row other than the first target row, a second target element in which a value of confidence in the second target row is largest;
a sixth determining submodule configured to determine a second target column in which the second target element is located;
a seventh determining sub-module, configured to determine, if the second target element is a maximum value in the second target column, the sensing target corresponding to the second target row as the associated sensing target; and determining the real target corresponding to the second target column as the associated real target.
Optionally, the first determining sub-module further includes:
the second calculation submodule is configured to calculate, for each real target, a first included angle between the real target and the X axis in the target coordinate system according to the position information of the real target;
an eighth determining submodule configured to determine a first included angle interval according to the first included angle corresponding to each of the real targets;
a third calculation submodule configured to calculate, for candidate perceptual targets other than the associated perceptual target, a second included angle between the candidate perceptual target and the X axis in the target coordinate system according to the position information of the candidate perceptual target;
a ninth determining sub-module, configured to determine the candidate sensing target as the associated sensing target if the second included angle is within the first included angle interval and a third target element larger than a second preset threshold exists in a third target row where the candidate sensing target is located.
Optionally, the first determining sub-module further includes:
a tenth determining submodule configured to determine candidate third target columns corresponding to all the third target elements if the number of the third target elements is greater than 1; determining a target candidate real target closest to the candidate perception target from the candidate real targets corresponding to all the candidate third target columns; determining the target candidate real target as the associated real target.
Optionally, in a case that a row of the confidence matrix corresponds to the perception target and a column of the confidence matrix corresponds to the real target, the executing module includes:
the first execution submodule is configured to determine the accuracy of the target recognition algorithm to be evaluated according to the number of the associated perception targets and the total row number of the confidence coefficient matrix;
the second execution submodule is configured to determine a recall rate of the target recognition algorithm to be evaluated according to the number of the associated real targets and the total column number of the confidence coefficient matrix; wherein the evaluation result comprises the accuracy rate and the recall rate.
Optionally, the execution module includes:
a third execution submodule configured to determine association pairs based on the association result, each association pair including the associated sensing target and the associated real target associated with the associated sensing target; calculating an error between the associated perceptual target and the associated real target in the associated pair; and counting an error mean value corresponding to each target category according to the errors corresponding to the associated pairs, wherein the target category of the associated pairs is determined according to the category of the associated real target in the associated pairs, and the evaluation result comprises the error mean value.
According to a third aspect of embodiments of the present disclosure, there is provided a computer-readable storage medium having stored thereon computer program instructions, which when executed by a processor, implement the steps of the method of evaluating a target recognition algorithm provided by the first aspect of the present disclosure.
According to a fourth aspect of the embodiments of the present disclosure, there is provided a vehicle including:
a memory having a computer program stored thereon;
a processor for executing the computer program in the memory to implement the steps of the method of evaluating an object recognition algorithm provided by the first aspect of the present disclosure.
The technical scheme provided by the embodiment of the disclosure can have the following beneficial effects:
and point cloud data obtained by sensing the target scene by the millimeter wave radar sensor are obtained. And carrying out target recognition on the point cloud data according to a target recognition algorithm to be evaluated to obtain a recognized perception target. And performing association processing according to the real target and the perception target in the target scene to obtain an association result. And determining an evaluation result of the target recognition algorithm to be evaluated according to the correlation result. By adopting the method, the evaluation result of the target recognition algorithm to be evaluated can be calculated according to the correlation result of the real target and the perception target, so that the quality of the target recognition algorithm to be evaluated is determined. The quality of the target recognition algorithm to be evaluated can be represented by the indexes of the accuracy (accuracy rate), the recall rate and the like of the target recognition algorithm to be evaluated.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and together with the description, serve to explain the principles of the disclosure.
FIG. 1 is a flow diagram illustrating a method of evaluating an object recognition algorithm in accordance with an exemplary embodiment.
FIG. 2 is a block diagram illustrating an apparatus for evaluating an object recognition algorithm, according to an exemplary embodiment.
FIG. 3 is a functional block diagram of a vehicle shown in accordance with an exemplary embodiment.
Detailed Description
Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. The implementations described in the exemplary embodiments below are not intended to represent all implementations consistent with the present disclosure. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the present disclosure, as detailed in the appended claims.
It should be noted that all the actions of acquiring signals, information or data in the present application are performed under the premise of complying with the corresponding data protection regulation policy of the country of the location and obtaining the authorization given by the owner of the corresponding device.
Fig. 1 is a flowchart illustrating a method of evaluating a target recognition algorithm according to an exemplary embodiment. As shown in fig. 1, the method of evaluating a target recognition algorithm is applied to a terminal device and may include the following steps.
In step S11, point cloud data obtained by sensing the target scene by the millimeter wave radar sensor is obtained.
It should be explained that a point cloud represents a data set comprising a plurality of points, and each point may carry information such as geometrical coordinates (X, Y), a time stamp, an intensity value, a velocity value, an RCS value (radar cross section), etc. The intensity value refers to the strength of the signal received after the signal emitted by the sensor encounters an object and is reflected back. When these points are combined together, a point cloud is formed.
Millimeter wave radar is inexpensive, has a long detection range, can acquire speed information and has strong anti-interference capability, which makes it an important sensor for autonomous vehicles. However, the point cloud acquired by millimeter wave radar is sparse and of low resolution, and millimeter wave radar cannot acquire the height information of an obstacle. Therefore, an algorithm for identifying targets in the point cloud data acquired by a millimeter wave radar sensor is difficult to design, and the evaluation of such an algorithm is also difficult.
In step S12, target recognition is performed on the point cloud data according to a target recognition algorithm to be evaluated, so as to obtain a recognized perception target.
In this implementation, target recognition is performed on the point cloud data according to the target recognition algorithm to be evaluated, and recognized perception targets are obtained. That is, the target type/ID (i.e., entity type/ID) corresponding to each point in the point cloud can be identified.
Target identification is a technique for identifying a remote target by using a radar and a computer. It estimates the size, shape, weight and surface physical characteristic parameters of the target through various mathematical multidimensional space transformations by analyzing target characteristic information such as amplitude, phase, frequency spectrum and polarization in the radar echo, and finally performs identification judgment in a classifier according to an identification function determined from a large number of training samples. Modern radars (including thermal radars and laser radars) are not only tools for detecting and locating remote targets, but can also measure parameters related to the physical and surface characteristics of the targets, thereby classifying and identifying them.
It should be noted that the target identification algorithm to be evaluated in the present disclosure refers to an algorithm for performing target identification on point cloud data acquired by a millimeter wave radar sensor.
In step S13, the real target in the target scene is associated with the sensing target to obtain an association result.
The real target in the target scene may be determined according to information obtained by manually performing entity/target labeling on target point cloud data, where the target point cloud data may be the point cloud data obtained by sensing the target scene by the millimeter wave radar sensor in step S11. In some embodiments, if the method for evaluating a target recognition algorithm of the present disclosure is applied to a vehicle on which the millimeter wave radar sensor for collecting point cloud data is installed, and a real target corresponds to a plurality of points in the point cloud data, then the point closest to the vehicle may be determined from the plurality of points to characterize the real target, i.e., the point closest to the vehicle is taken as the true-value point of the real target.
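A minimal sketch of this truth-point selection is shown below; the array layout and the helper name true_value_point are illustrative assumptions, not part of the disclosure.

```python
import numpy as np

def true_value_point(labeled_points: np.ndarray) -> np.ndarray:
    """Pick the true-value point of a real target from its labeled points.

    labeled_points: (N, 2) array of (x, y) coordinates in the vehicle body
    frame, whose origin is the ego vehicle; the point closest to the vehicle
    (smallest Euclidean norm) is taken as the true-value point.
    """
    distances = np.linalg.norm(labeled_points, axis=1)
    return labeled_points[int(np.argmin(distances))]
```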
In addition, the target point cloud data may also be point cloud data obtained by sensing a target scene by a laser radar sensor. Or, the target point cloud data may also be data obtained by sensing a target scene by a camera. Or, the target point cloud data may also be data obtained by manually measuring and drawing a target scene.
In step S14, an evaluation result of the target recognition algorithm to be evaluated is determined according to the association result.
In some embodiments, a real target in the target scene may be associated with a perception target, resulting in a real target associated with the perception target. Based on the magnitude of the error between the associated perception target and the real target, the quality of the target recognition algorithm may be determined, for example its precision (accuracy rate), recall rate, degree of fit, and the like.
With this method, point cloud data obtained by sensing the target scene by the millimeter wave radar sensor is acquired. Target recognition is performed on the point cloud data according to a target recognition algorithm to be evaluated to obtain a recognized perception target. Association processing is performed on the real target and the perception target in the target scene to obtain an association result. An evaluation result of the target recognition algorithm to be evaluated is determined according to the association result, so that the quality of the target recognition algorithm to be evaluated can be determined from the association result of the real target and the perception target.
And moreover, the evaluation precision of the evaluation result determined based on the correlation result is favorably improved based on the way of performing correlation processing on the real target and the perception target in the target scene.
Optionally, the number of the sensing targets is multiple, the number of the real targets is multiple, and the associating the real targets in the target scene with the sensing targets to obtain an association result includes:
for each perception target, calculating a confidence degree of the perception target and each real target to obtain a confidence degree matrix, wherein rows of the confidence degree matrix correspond to one of the perception target and the real target, columns of the confidence degree matrix correspond to the other of the perception target and the real target, and the confidence degree represents the probability that the perception target and the real target are the same target; determining an associated perception target associated with the real target and an associated real target associated with the perception target based on the magnitude of each confidence in the confidence matrix, wherein the association result comprises the associated perception target and the associated real target.
For example, assume that there are two perception targets A and B and three real targets C, D and E. Then, for perception target A, the confidence X(A,C) of perception target A and real target C is calculated, the confidence X(A,D) of perception target A and real target D is calculated, and the confidence X(A,E) of perception target A and real target E is calculated. Likewise, for perception target B, the confidences X(B,C), X(B,D) and X(B,E) are calculated. The resulting confidence matrix is

[X(A,C) X(A,D) X(A,E)]
[X(B,C) X(B,D) X(B,E)]

or its transpose.
The confidence characterizes the probability that the perception target is the same target (entity) as the real target. The confidence of the perception target and the real target can be calculated by calculating the Mahalanobis distance between them. In addition, in the process of calculating the Mahalanobis distance between the perception target and the real target, considering the Gaussian distribution characteristic of points in the point cloud acquired by the millimeter wave radar sensor, the position error of the perception target can be corrected according to a millimeter wave radar position-error model. It should be noted that the perception target may be one point or multiple points acquired by the millimeter wave radar sensor; the embodiments of the present disclosure take the perception target as one point by way of example.
Here, the Mahalanobis distance represents a covariance-based distance between data. It is a distance measure that can be regarded as a correction of the Euclidean distance, fixing the problem that the dimensions in the Euclidean distance have inconsistent scales and are correlated. It is an effective method for calculating the similarity of two unknown sample sets.
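As a minimal sketch of this step, the snippet below builds such a confidence matrix from 2-D target positions. The disclosure does not fix how a Mahalanobis distance is mapped to a confidence in (0, 1], so the Gaussian kernel exp(-d²/2) and the single shared covariance (standing in for the millimeter wave radar position-error model) are assumptions for illustration.

```python
import numpy as np

def mahalanobis(p: np.ndarray, q: np.ndarray, cov_inv: np.ndarray) -> float:
    """Mahalanobis distance between a perception point p and a true-value point q."""
    d = p - q
    return float(np.sqrt(d @ cov_inv @ d))

def confidence_matrix(perceived: np.ndarray, real: np.ndarray,
                      cov: np.ndarray) -> np.ndarray:
    """Build a confidence matrix whose rows correspond to perception targets
    and whose columns correspond to real targets.

    perceived: (P, 2) perception target positions; real: (R, 2) true-value
    positions; cov: 2x2 position-error covariance (assumed to come from a
    millimeter wave radar position-error model).
    """
    cov_inv = np.linalg.inv(cov)
    conf = np.empty((len(perceived), len(real)))
    for i, p in enumerate(perceived):
        for j, q in enumerate(real):
            # Smaller distance -> higher probability of being the same target.
            conf[i, j] = np.exp(-0.5 * mahalanobis(p, q, cov_inv) ** 2)
    return conf
```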
After the confidence matrix is obtained through calculation, the associated sensing target associated with the real target and the associated real target associated with the sensing target can be determined based on the magnitude of each confidence in the confidence matrix. Correspondingly, according to the total row number and the total column number of the confidence coefficient matrix, unassociated sensing targets which are not associated with real targets and unassociated real targets which are not associated with sensing targets can be further determined.
In one embodiment, in a case where a row of the confidence matrix corresponds to the perception target and a column of the confidence matrix corresponds to the real target, the determining, based on a magnitude of each of the confidences in the confidence matrix, an associated perception target associated with the real target and an associated real target associated with the perception target includes:
determining a first target element which is larger than a first preset threshold value in the confidence coefficient matrix; determining the perception target corresponding to the first target row where the first target element is located as the associated perception target; and determining the real target corresponding to the first target column in which the first target element is positioned as the associated real target.
The first preset threshold is a preset prior value and can be adaptively set based on requirements.
By way of example, assume the confidence matrix is

[X(A,C) X(A,D) X(A,E)]
[X(B,C) X(B,D) X(B,E)].

If the element X(A,D) in the first row and second column is greater than the first preset threshold, the perception target A corresponding to the first row, in which the first target element X(A,D) is located, is determined as the associated perception target, and the real target D corresponding to the second column, in which the first target element X(A,D) is located, is determined as the associated real target.
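A sketch of this first association rule follows, under the row/column convention above; the function name and the set-based bookkeeping are illustrative assumptions.

```python
import numpy as np

def associate_by_threshold(conf: np.ndarray, t1: float):
    """Rule 1: every confidence greater than the first preset threshold t1
    marks its row's perception target and its column's real target as associated."""
    rows, cols = np.nonzero(conf > t1)   # positions of the first target elements
    return set(rows.tolist()), set(cols.tolist())
```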
Optionally, the determining, based on the magnitude of each confidence in the confidence matrix, an associated sensing target associated with the real target and an associated real target associated with the sensing target further includes:
for a second target row except the first target row, determining a second target element with the maximum confidence level value in the second target row; determining a second target column in which the second target element is located; if the second target element is the maximum value in the second target column, determining the perception target corresponding to the second target row as the associated perception target; and determining the real target corresponding to the second target column as the associated real target.
Continuing the foregoing example with the confidence matrix

[X(A,C) X(A,D) X(A,E)]
[X(B,C) X(B,D) X(B,E)],

after perception target A has been determined as an associated perception target and real target D as an associated real target, consider the second target row other than the first target row (i.e., the second row, in which perception target B is located) and determine the second target element with the largest confidence in that row; assume it is X(B,E). The second target column in which the second target element X(B,E) is located (i.e., the third column of the confidence matrix) is determined. In the first case, if the second target element X(B,E) is the maximum value in the second target column (i.e., among X(A,E) and X(B,E)), that is, X(B,E) is greater than X(A,E), the perception target B corresponding to the second target row (the second row of the confidence matrix) is determined as the associated perception target, and the real target E corresponding to the second target column (the third column of the confidence matrix) is determined as the associated real target.
In the second case, if the second target element X(B,E) is not the maximum value in the second target column, that is, X(A,E) is greater than X(B,E), it cannot yet be determined whether the perception target corresponding to the row in which X(B,E) is located is an associated perception target, nor whether the real target corresponding to the column in which X(B,E) is located is an associated real target.

In a third case, if the second target element X(B,E) is not the maximum value in the second target column, that is, X(A,E) is greater than X(B,E), the perception target corresponding to the row in which X(B,E) is located is determined to be an unassociated perception target, and the real target corresponding to the column in which X(B,E) is located is determined to be an unassociated real target.
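A sketch of this second rule is given below; it continues the sets produced by the rule-1 sketch and, as an assumption, leaves rows whose row maximum is not also a column maximum undecided (the second case above).

```python
import numpy as np

def associate_by_row_max(conf: np.ndarray, assoc_rows: set, assoc_cols: set):
    """Rule 2: for each not-yet-associated row, take its largest element;
    if that element is also the maximum of its column, associate the row's
    perception target with the column's real target."""
    for i in range(conf.shape[0]):
        if i in assoc_rows:
            continue
        j = int(np.argmax(conf[i]))          # second target element's column
        if conf[i, j] >= conf[:, j].max():   # also the maximum of its column
            assoc_rows.add(i)
            assoc_cols.add(j)
    return assoc_rows, assoc_cols
```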
Optionally, the determining, based on the magnitude of each confidence in the confidence matrix, an associated sensing target associated with the real target and an associated real target associated with the sensing target further includes:
for each real target, calculating a first included angle between the real target and the X axis in a target coordinate system according to the position information of the real target; determining a first included angle interval according to the first included angle corresponding to each real target; for each candidate perception target other than the associated perception targets, calculating a second included angle between the candidate perception target and the X axis in the target coordinate system according to the position information of the candidate perception target; and if the second included angle is within the first included angle interval and a third target element larger than a second preset threshold exists in the third target row in which the candidate perception target is located, determining the candidate perception target as the associated perception target.
The second preset threshold is smaller than the first preset threshold.

The first included angle between the real target and the X axis in the target coordinate system refers to the included angle between the X axis and a first connecting line from the origin of the target coordinate system to the coordinates of the real target.

Correspondingly, the second included angle between the candidate perception target and the X axis in the target coordinate system refers to the included angle between the X axis and a second connecting line from the origin of the target coordinate system to the coordinates of the candidate perception target.
The target coordinate system may be a body coordinate system corresponding to the vehicle on which the millimeter wave radar sensor is mounted.
Taking the confidence matrix

[X(A,C) X(A,D) X(A,E)]
[X(B,C) X(B,D) X(B,E)]

and the associated perception target A as an example: for real target C, a first included angle a1 between real target C and the X axis in the target coordinate system is calculated according to the position information of real target C. For real target D, a first included angle a2 is calculated according to the position information of real target D, and for real target E, a first included angle a3 is calculated according to the position information of real target E. According to the first included angles a1, a2 and a3 corresponding to the real targets C, D and E, a first included angle interval is determined, assumed to be [a1, a3]. For the candidate perception target B other than the associated perception target A, a second included angle b between candidate perception target B and the X axis in the target coordinate system is calculated according to the position information of candidate perception target B. In the first case, if the second included angle b is within the first included angle interval [a1, a3], i.e., a1 ≤ b ≤ a3, and a third target element greater than the second preset threshold exists in the third target row in which candidate perception target B is located (i.e., the second row of the confidence matrix), candidate perception target B is determined as the associated perception target.
The following explains how to determine an associated real target associated with an associated perceptual target B in a case where a candidate perceptual target B is determined as an associated perceptual target.
In one embodiment, if the only third target element is X(B,E), the real target E corresponding to the column in which the third target element X(B,E) is located is directly determined as the associated real target associated with the associated perception target B.

In another embodiment, if the number of third target elements is greater than 1, for example if the third target elements are X(B,C), X(B,D) and X(B,E), the candidate third target columns corresponding to all the third target elements (i.e., the first, second and third columns of the confidence matrix) are determined. From the candidate real targets C, D and E corresponding to all the candidate third target columns, the target candidate real target closest to the candidate perception target B (assumed to be C) is determined. The target candidate real target C is determined as the associated real target associated with the associated perception target B.
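A sketch of this third rule follows. The bearing is computed with atan2 in the vehicle body frame, and the nearest-candidate tie-break uses Euclidean distance; both are assumptions consistent with, but not mandated by, the text above.

```python
import numpy as np

def associate_by_angle(conf: np.ndarray, perceived: np.ndarray, real: np.ndarray,
                       assoc_rows: set, assoc_cols: set, t2: float):
    """Rule 3: a leftover perception target whose bearing falls inside the
    interval spanned by the real targets' bearings, and whose row still holds
    an element above the second preset threshold t2, is associated; among
    several candidate columns, the nearest real target wins."""
    angles = np.arctan2(real[:, 1], real[:, 0])           # first included angles
    lo, hi = angles.min(), angles.max()                   # first included angle interval
    for i in range(conf.shape[0]):
        if i in assoc_rows:
            continue
        b = np.arctan2(perceived[i, 1], perceived[i, 0])  # second included angle
        cand = np.nonzero(conf[i] > t2)[0]                # third target elements
        if lo <= b <= hi and cand.size > 0:
            # Tie-break: candidate real target closest to the perception target.
            j = cand[np.argmin(np.linalg.norm(real[cand] - perceived[i], axis=1))]
            assoc_rows.add(i)
            assoc_cols.add(int(j))
    return assoc_rows, assoc_cols
```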
Optionally, when a row of the confidence matrix corresponds to the sensing target and a column of the confidence matrix corresponds to the real target, the determining, according to the association result, an evaluation result of the target recognition algorithm to be evaluated includes:
determining the accuracy rate of the target recognition algorithm to be evaluated according to the number of the associated perception targets and the total row number of the confidence coefficient matrix; determining the recall rate of the target identification algorithm to be evaluated according to the number of the associated real targets and the total column number of the confidence coefficient matrix; wherein the evaluation result comprises the accuracy rate and the recall rate.
By way of example, assume the confidence matrix is

[X(A,C) X(A,D) X(A,E)]
[X(B,C) X(B,D) X(B,E)],

where A is an associated perception target and C and D are associated real targets. Then, from the number of associated perception targets (1) and the total number of rows of the confidence matrix (2), the accuracy rate of the target recognition algorithm to be evaluated can be determined to be 50%. From the number of associated real targets (2) and the total number of columns of the confidence matrix (3), the recall rate of the target recognition algorithm to be evaluated can be determined to be 66.67%.
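Under the same row/column convention, this computation reduces to two ratios over the association sets; a minimal sketch:

```python
def precision_recall(assoc_rows: set, assoc_cols: set, n_rows: int, n_cols: int):
    """Accuracy rate = associated perception targets / all rows;
    recall rate = associated real targets / all columns.
    For the example above: 1/2 = 50% and 2/3 = 66.67%."""
    return len(assoc_rows) / n_rows, len(assoc_cols) / n_cols
```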
Optionally, the determining, according to the association result, an evaluation result of the target recognition algorithm to be evaluated includes:
determining association pairs based on the association result, wherein each association pair comprises the associated perception target and the associated real target associated with the associated perception target; calculating an error between the associated perceptual target and the associated real target in the associated pair; and counting an error mean value corresponding to each target category according to the errors corresponding to the associated pairs, wherein the target category of the associated pairs is determined according to the category of the associated real target in the associated pairs, and the evaluation result comprises the error mean value.
By way of example, assume the confidence matrix is

[X(A,C) X(A,D) X(A,E)]
[X(B,C) X(B,D) X(B,E)],

and assume that A is associated with C and B is associated with E. Then there are two association pairs: (A, C) and (B, E). The error M between the associated perception target A and the associated real target C in the association pair (A, C) is calculated, for example the coordinate error/root mean square error of the position coordinates x and/or y of A and C (since the millimeter wave radar cannot acquire the height information of a target, the z-coordinate error representing the target height is not calculated), and/or the speed error/root mean square error of A and C.

Likewise, the error N between the associated perception target B and the associated real target E in the association pair (B, E) is calculated.

According to the error corresponding to each association pair, the error mean value corresponding to each target category is counted. Assuming that the category of the pair (A, C) is car and the category of the pair (B, E) is also car, the error mean value under the car category is (M + N)/2.
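A compact sketch of this per-category averaging; the pair tuple layout and helper name are illustrative assumptions:

```python
from collections import defaultdict
import numpy as np

def category_error_means(pairs):
    """pairs: iterable of (perceived_xy, real_xy, category) association pairs,
    with the category taken from the associated real target. Returns the mean
    x/y position error per target category."""
    sums, counts = defaultdict(float), defaultdict(int)
    for p, q, cat in pairs:
        err = float(np.linalg.norm(np.asarray(p) - np.asarray(q)))  # x/y error only
        sums[cat] += err
        counts[cat] += 1
    return {cat: sums[cat] / counts[cat] for cat in sums}
```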
The target categories may include cars, pedestrians, birds, trees, billboards, and the like.
Optionally, the determining an evaluation result of the target recognition algorithm to be evaluated according to the association result may further include:
and determining association pairs based on the association result, wherein each association pair comprises an associated perception target and an associated real target associated with the associated perception target. A confidence measure is determined for each associated pair. And sequencing and drawing all the associated pairs according to the confidence degree to obtain a PR curve map. The evaluation result of the target recognition algorithm to be evaluated comprises a PR curve map.
The area under the PR curve can represent the Average Precision (AP), and the PR curve is an index for evaluating the quality of the algorithm.
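A sketch of tracing such a PR curve is below. The disclosure only says the pairs are sorted by confidence; scoring every perception target by its best matrix entry and treating unassociated ones as false positives is an added assumption for illustration.

```python
import numpy as np

def pr_curve(scores, is_tp, n_real):
    """scores: one confidence per perception target (e.g., its best matrix
    entry); is_tp: True where the perception target was associated with a
    real target; n_real: total number of real targets.

    Returns precision/recall traced in order of decreasing confidence and
    the area under the curve, which approximates the average precision (AP)."""
    order = np.argsort(scores)[::-1]
    hits = np.asarray(is_tp, dtype=bool)[order]
    tp = np.cumsum(hits)
    fp = np.cumsum(~hits)
    precision = tp / (tp + fp)
    recall = tp / n_real
    ap = float(np.trapz(precision, recall))  # area under the PR curve
    return precision, recall, ap
```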
It should be noted that the foregoing embodiment is exemplified by the case where the rows of the confidence matrix correspond to the perception targets, and the columns of the confidence matrix correspond to the real targets.
In the same principle, when the columns of the confidence matrix correspond to the sensing targets and the rows of the confidence matrix correspond to the real targets, the implementation manner is substantially the same as that of the foregoing embodiment, and details are not repeated here.
FIG. 2 is a block diagram illustrating an apparatus for evaluating an object recognition algorithm, according to an exemplary embodiment. Referring to fig. 2, the apparatus 200 for evaluating an object recognition algorithm includes:
the acquisition module 210 is configured to acquire point cloud data obtained by sensing a target scene by a millimeter wave radar sensor;
the identification module 220 is configured to perform target identification on the point cloud data according to a target identification algorithm to be evaluated to obtain an identified perception target;
the association module 230 is configured to perform association processing on a real target in the target scene and the perception target to obtain an association result;
and the execution module 240 is configured to determine an evaluation result of the target recognition algorithm to be evaluated according to the association result.
Optionally, the number of the sensing targets is multiple, the number of the real targets is multiple, and the associating module 230 includes:
a first calculation sub-module configured to calculate, for each of the perception targets, a confidence of the perception target and each of the real targets, resulting in a confidence matrix, rows of the confidence matrix corresponding to one of the perception target and the real target, columns of the confidence matrix corresponding to the other of the perception target and the real target, the confidence characterizing a probability that the perception target and the real target are the same target;
a first determining sub-module configured to determine, based on the magnitude of each of the confidences in the confidence matrix, an associated sensing target associated with the real target and an associated real target associated with the sensing target, the association result including the associated sensing target and the associated real target.
Optionally, in a case that a row of the confidence matrix corresponds to the perception target and a column of the confidence matrix corresponds to the real target, the first determining submodule includes:
a second determining submodule configured to determine a first target element of the confidence matrix that is greater than a first preset threshold;
a third determining submodule configured to determine the perception target corresponding to the first target row in which the first target element is located as the associated perception target;
a fourth determining sub-module, configured to determine the real target corresponding to the first target column in which the first target element is located as the associated real target.
Optionally, the first determining sub-module further includes:
a fifth determination submodule configured to determine, for a second target row other than the first target row, a second target element in which the value of confidence in the second target row is largest;
a sixth determining submodule configured to determine a second target column in which the second target element is located;
a seventh determining sub-module, configured to determine, if the second target element is a maximum value in the second target column, the sensing target corresponding to the second target row as the associated sensing target; and determining the real target corresponding to the second target column as the associated real target.
Optionally, the first determining sub-module further includes:
the second calculation submodule is configured to calculate, for each real target, a first included angle between the real target and the X axis in the target coordinate system according to the position information of the real target;
the eighth determining submodule is configured to determine a first included angle interval according to the first included angle corresponding to each real target;
a third calculation submodule configured to calculate, for candidate perceptual targets other than the associated perceptual target, a second included angle between the candidate perceptual target and the X axis in the target coordinate system according to the position information of the candidate perceptual target;
a ninth determining sub-module, configured to determine the candidate perceptual target as the associated perceptual target if the second angle is within the first angle interval and a third target element greater than a second preset threshold exists in a third target row in which the candidate perceptual target is located.
Optionally, the first determining sub-module further includes:
a tenth determining submodule configured to determine candidate third target columns corresponding to all the third target elements if the number of the third target elements is greater than 1; determining a target candidate real target closest to the candidate sensing target from the candidate real targets corresponding to all the candidate third target columns; determining the target candidate real target as the associated real target.
Optionally, in a case that a row of the confidence matrix corresponds to the perception target and a column of the confidence matrix corresponds to the real target, the executing module 240 includes:
the first execution submodule is configured to determine the accuracy of the target recognition algorithm to be evaluated according to the number of the associated sensing targets and the total row number of the confidence coefficient matrix;
the second execution submodule is configured to determine a recall rate of the target recognition algorithm to be evaluated according to the number of the associated real targets and the total column number of the confidence coefficient matrix; wherein the evaluation result comprises the accuracy rate and the recall rate.
Optionally, the executing module 240 includes:
a third execution submodule configured to determine associated pairs based on the association result, each of the associated pairs including the associated perception target and the associated real target associated with the associated perception target; calculating an error between the associated perceptual target and the associated real target in the associated pair; and counting an error mean value corresponding to each target category according to the errors corresponding to the associated pairs, wherein the target category of the associated pairs is determined according to the category of the associated real target in the associated pairs, and the evaluation result comprises the error mean value.
With this apparatus, point cloud data obtained by sensing a target scene by the millimeter wave radar sensor is acquired. Target recognition is performed on the point cloud data according to a target recognition algorithm to be evaluated to obtain a recognized perception target. Association processing is performed on the real target and the perception target in the target scene to obtain an association result. An evaluation result of the target recognition algorithm to be evaluated is determined according to the association result, so that the quality of the target recognition algorithm to be evaluated can be determined from the association result of the real target and the perception target.
With regard to the apparatus in the above-described embodiment, the specific manner in which each module performs the operation has been described in detail in the embodiment related to the method, and will not be elaborated here.
The present disclosure also provides a computer-readable storage medium having stored thereon computer program instructions which, when executed by a processor, implement the steps of the method of evaluating a target recognition algorithm provided by the present disclosure.
Referring to fig. 3, fig. 3 is a functional block diagram of a vehicle 600 according to an exemplary embodiment. The vehicle 600 may be configured in a fully or partially autonomous driving mode. For example, the vehicle 600 may acquire information about its surrounding environment through the sensing system 620 and derive an automatic driving strategy based on an analysis of that information to achieve fully automatic driving, or present the analysis results to the user to achieve partially automatic driving.
Vehicle 600 may include various subsystems, such as an infotainment system 610, a sensing system 620, a decision control system 630, a drive system 640, and a computing platform 650. Alternatively, vehicle 600 may include more or fewer subsystems, and each subsystem may include multiple components. In addition, the subsystems and components of the vehicle 600 may be interconnected by wire or wirelessly.
In some embodiments, the infotainment system 610 may include a communication system 611, an entertainment system 612, and a navigation system 613.
The communication system 611 may comprise a wireless communication system that may communicate wirelessly with one or more devices, either directly or via a communication network. For example, the wireless communication system may use 3G cellular communication (such as CDMA, EVDO, or GSM/GPRS), 4G cellular communication (such as LTE), or 5G cellular communication. The wireless communication system may communicate with a wireless local area network (WLAN) using WiFi. In some embodiments, the wireless communication system may communicate directly with a device using an infrared link, Bluetooth, or ZigBee. Other wireless protocols, such as various vehicular communication systems, may also be used; for example, the wireless communication system may include one or more Dedicated Short Range Communications (DSRC) devices for public and/or private data communication between vehicles and/or roadside stations.
The entertainment system 612 may include a display device, a microphone, and a speaker. Based on the entertainment system, a user may listen to the radio or play music in the vehicle; alternatively, a mobile phone may communicate with the vehicle to project its screen onto the display device. The display device may be a touch screen, and the user may operate it by touching the screen.
In some cases, the user's voice signal may be acquired through the microphone, and certain control of the vehicle 600 by the user, such as adjusting the in-vehicle temperature, may be implemented according to the analysis of that voice signal. In other cases, music may be played to the user through the speaker.
The navigation system 613 may include a map service provided by a map provider to provide route navigation for the vehicle 600, and the navigation system 613 may be used in conjunction with the global positioning system 621 and the inertial measurement unit 622 of the vehicle. The map service provided by the map provider may be a two-dimensional map or a high-precision map.
The sensing system 620 may include several types of sensors that sense information about the environment surrounding the vehicle 600. For example, the sensing system 620 may include a global positioning system 621 (which may be a GPS system, the BeiDou system, or another positioning system), an inertial measurement unit (IMU) 622, a lidar 623, a millimeter wave radar 624, an ultrasonic radar 625, and a camera 626. The sensing system 620 may also include sensors that monitor internal systems of the vehicle 600 (e.g., an in-vehicle air quality monitor, a fuel gauge, an oil temperature gauge, etc.). Sensor data from one or more of these sensors may be used to detect objects and their corresponding characteristics (position, shape, orientation, velocity, etc.). Such detection and identification is a critical function for the safe operation of the vehicle 600.
The inertial measurement unit 622 is used to sense a pose change of the vehicle 600 based on the inertial acceleration. In some embodiments, inertial measurement unit 622 may be a combination of accelerometers and gyroscopes.
Lidar 623 utilizes laser light to sense objects in the environment in which vehicle 600 is located. In some embodiments, lidar 623 may include one or more laser sources, laser scanners, and one or more detectors, among other system components.
The millimeter-wave radar 624 utilizes radio signals to sense objects within the surrounding environment of the vehicle 600. In some embodiments, in addition to sensing objects, the millimeter-wave radar 624 may also be used to sense the speed and/or heading of objects.
The ultrasonic radar 625 may sense objects around the vehicle 600 using ultrasonic signals.
The camera 626 is used to capture image information of the surroundings of the vehicle 600. The camera 626 may include a monocular camera, a binocular camera, a structured light camera, a panoramic camera, and the like, and the image information acquired by the camera 626 may include still images or video stream information.
The decision control system 630 includes a computing system 631 that makes analytical decisions based on information acquired by the sensing system 620. The decision control system 630 further includes a vehicle control unit 632 that controls the powertrain of the vehicle 600, as well as a steering system 633, a throttle 634, and a brake system 635 for controlling the vehicle 600.
The computing system 631 may operate to process and analyze the various information acquired by the sensing system 620 in order to identify targets, objects, and/or features in the environment surrounding the vehicle 600. The targets may comprise pedestrians or animals, and the objects and/or features may comprise traffic signals, road boundaries, and obstacles. The computing system 631 may use object recognition algorithms, Structure from Motion (SFM) algorithms, video tracking, and the like. In some embodiments, the computing system 631 may be used to map the environment, track objects, estimate the speed of objects, and so on. The computing system 631 may analyze the various information obtained and derive a control strategy for the vehicle.
The vehicle control unit 632 may be used to perform coordinated control on the power battery and the engine 641 of the vehicle to improve the power performance of the vehicle 600.
The steering system 633 is operable to adjust the heading of the vehicle 600. For example, in one embodiment, the steering system 633 may be a steering wheel system.
The throttle 634 is used to control the operating speed of the engine 641 and, in turn, the speed of the vehicle 600.
The brake system 635 is used to control the deceleration of the vehicle 600. The brake system 635 may use friction to slow the wheels 644. In some embodiments, the brake system 635 may convert the kinetic energy of the wheels 644 into electric current. The brake system 635 may also take other forms to slow the rotational speed of the wheels 644 so as to control the speed of the vehicle 600.
The drive system 640 may include components that provide powered motion to the vehicle 600. In one embodiment, the drive system 640 may include an engine 641, an energy source 642, a transmission 643, and wheels 644. The engine 641 may be an internal combustion engine, an electric motor, an air compression engine, or another type of engine or engine combination, such as a hybrid engine consisting of a gasoline engine and an electric motor, or a hybrid engine consisting of an internal combustion engine and an air compression engine. The engine 641 converts the energy source 642 into mechanical energy.
Examples of energy source 642 include gasoline, diesel, other petroleum-based fuels, propane, other compressed gas-based fuels, ethanol, solar panels, batteries, and other sources of electrical power. The energy source 642 may also provide energy to other systems of the vehicle 600.
The transmission 643 may transmit mechanical power from the engine 641 to the wheels 644. The transmission 643 may include a gearbox, a differential, and a drive shaft. In one embodiment, the transmission 643 may also include other components, such as clutches. Wherein the drive shaft may include one or more axles that may be coupled to one or more wheels 644.
Some or all of the functionality of the vehicle 600 is controlled by the computing platform 650. Computing platform 650 can include at least one processor 651, which processor 651 can execute instructions 653 stored in a non-transitory computer-readable medium, such as memory 652. In some embodiments, the computing platform 650 may also be a plurality of computing devices that control individual components or subsystems of the vehicle 600 in a distributed manner.
The processor 651 may be any conventional processor, such as a commercially available CPU. Alternatively, the processor 651 may also include a processor such as a Graphics Processor (GPU), a Field Programmable Gate Array (FPGA), a System On Chip (SOC), an Application Specific Integrated Circuit (ASIC), or a combination thereof. Although fig. 3 functionally illustrates a processor, memory, and other elements of a computer in the same block, those skilled in the art will appreciate that the processor, computer, or memory may actually comprise multiple processors, computers, or memories that may or may not be stored within the same physical housing. For example, the memory may be a hard drive or other storage medium located in a different enclosure than the computer. Thus, references to a processor or computer are to be understood as including references to a collection of processors or computers or memories which may or may not operate in parallel. Rather than using a single processor to perform the steps described herein, some components, such as the steering component and the retarding component, may each have their own processor that performs only computations related to the component-specific functions.
In the disclosed embodiment, processor 651 may perform the above-described method of evaluating a target recognition algorithm.
In various aspects described herein, the processor 651 can be located remotely from the vehicle and in wireless communication with the vehicle. In other aspects, some of the processes described herein are executed on a processor disposed within the vehicle and others are executed by a remote processor, including taking the steps necessary to perform a single maneuver.
In some embodiments, the memory 652 may contain instructions 653 (e.g., program logic), which instructions 653 may be executed by the processor 651 to perform various functions of the vehicle 600. The memory 652 may also contain additional instructions, including instructions to send data to, receive data from, interact with, and/or control one or more of the infotainment system 610, the sensing system 620, the decision control system 630, and the drive system 640.
In addition to instructions 653, memory 652 may store data such as road maps, route information, the location, direction, speed of the vehicle, and other such vehicle data, as well as other information. Such information may be used by the vehicle 600 and the computing platform 650 during operation of the vehicle 600 in autonomous, semi-autonomous, and/or manual modes.
Computing platform 650 may control functions of the vehicle 600 based on inputs received from various subsystems (e.g., the drive system 640, the sensing system 620, and the decision control system 630). For example, computing platform 650 may utilize input from the decision control system 630 in order to control the steering system 633 to avoid obstacles detected by the sensing system 620. In some embodiments, the computing platform 650 is operable to provide control over many aspects of the vehicle 600 and its subsystems.
Optionally, one or more of these components described above may be mounted or associated separately from the vehicle 600. For example, the memory 652 may exist partially or completely separate from the vehicle 600. The above components may be communicatively coupled together in a wired and/or wireless manner.
Optionally, the above components are only an example; in practical applications, components in the above modules may be added or deleted according to actual needs, and fig. 3 should not be construed as limiting the embodiments of the present disclosure.
An autonomous automobile traveling on a roadway, such as the vehicle 600 above, may identify objects within its surrounding environment in order to determine an adjustment to its current speed. The objects may be other vehicles, traffic control devices, or other types of objects. In some examples, each identified object may be considered independently, and the object's respective characteristics, such as its current speed, acceleration, and separation from the vehicle, may be used to determine the speed to which the autonomous vehicle is to adjust.
Optionally, the vehicle 600, or a sensing and computing device associated with the vehicle 600 (e.g., the computing system 631 or the computing platform 650), may predict the behavior of an identified object based on the characteristics of that object and the state of the surrounding environment (e.g., traffic, rain, ice on the road, etc.). Optionally, since the identified objects may depend on one another's behavior, all of the identified objects may also be considered together to predict the behavior of a single identified object. The vehicle 600 is able to adjust its speed based on the predicted behavior of the identified objects. In other words, the autonomous vehicle is able to determine what steady state the vehicle needs to adjust to (e.g., accelerate, decelerate, or stop) based on the predicted behavior of the objects. In this process, other factors may also be considered to determine the speed of the vehicle 600, such as the lateral position of the vehicle 600 on the road being traveled, the curvature of the road, and the proximity of static and dynamic objects.
In addition to providing instructions to adjust the speed of the autonomous vehicle, the computing device may also provide instructions to modify the steering angle of the vehicle 600 to cause the autonomous vehicle to follow a given trajectory and/or maintain a safe lateral and longitudinal distance from objects in the vicinity of the autonomous vehicle (e.g., vehicles in adjacent lanes on the road).
The vehicle 600 may be any type of vehicle, such as a car, a truck, a motorcycle, a bus, a boat, an airplane, a helicopter, a recreational vehicle, a train, etc., and the disclosed embodiment is not particularly limited.
In another exemplary embodiment, a computer program product is also provided, which comprises a computer program executable by a programmable apparatus, the computer program having code portions for performing the above-described method of evaluating an object recognition algorithm when executed by the programmable apparatus.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure. This application is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It will be understood that the present disclosure is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the present disclosure is limited only by the appended claims.
Claims (11)
1. A method of evaluating a target recognition algorithm, the method comprising:
acquiring point cloud data obtained by sensing a target scene by a millimeter wave radar sensor;
performing target recognition on the point cloud data according to a target recognition algorithm to be evaluated to obtain a recognized perception target;
performing association processing on a real target in the target scene and the perception target to obtain an association result;
and determining an evaluation result of the target recognition algorithm to be evaluated according to the association result.
2. The method according to claim 1, wherein the number of perception targets is plural and the number of real targets is plural, and the performing association processing on the real target in the target scene and the perception target to obtain an association result comprises:
for each perception target, calculating a confidence between the perception target and each real target to obtain a confidence matrix, wherein rows of the confidence matrix correspond to one of the perception targets and the real targets, columns of the confidence matrix correspond to the other of the perception targets and the real targets, and the confidence represents the probability that the perception target and the real target are the same target;
determining an associated perception target associated with the real target and an associated real target associated with the perception target based on the magnitude of each confidence in the confidence matrix, wherein the association result comprises the associated perception target and the associated real target.
3. The method of claim 2, wherein in a case where a row of the confidence matrix corresponds to the perception target and a column of the confidence matrix corresponds to the real target, the determining the associated perception target associated with the real target and the associated real target associated with the perception target based on the magnitude of each of the confidences in the confidence matrix comprises:
determining a first target element which is larger than a first preset threshold in the confidence matrix;
determining the perception target corresponding to the first target row where the first target element is located as the associated perception target;
and determining the real target corresponding to the first target column in which the first target element is positioned as the associated real target.
4. The method of claim 3, wherein the determining the associated perception target associated with the real target and the associated real target associated with the perception target based on the magnitude of each confidence in the confidence matrix further comprises:
for a second target row except the first target row, determining a second target element with the maximum confidence level value in the second target row;
determining a second target column in which the second target element is located;
if the second target element is the maximum value in the second target column, determining the perception target corresponding to the second target row as the associated perception target;
and determining the real target corresponding to the second target column as the associated real target.
5. The method according to claim 3 or 4, wherein the determining the associated perception target associated with the real target and the associated real target associated with the perception target based on the magnitude of each confidence in the confidence matrix further comprises:
for each real target, calculating a first included angle between the real target and the X axis in a target coordinate system according to position information of the real target;
determining a first included angle interval according to the first included angles corresponding to the real targets;
for each candidate perception target other than the associated perception target, calculating a second included angle between the candidate perception target and the X axis in the target coordinate system according to position information of the candidate perception target;
and if the second included angle is within the first included angle interval and a third target element larger than a second preset threshold exists in the third target row in which the candidate perception target is located, determining the candidate perception target as the associated perception target.
6. The method of claim 5, wherein the determining the associated perception target associated with the real target and the associated real target associated with the perception target based on the magnitude of each confidence in the confidence matrix further comprises:
if the number of the third target elements is larger than 1, determining candidate third target columns corresponding to all the third target elements;
determining a target candidate real target closest to the candidate perception target from the candidate real targets corresponding to all the candidate third target columns;
determining the target candidate real target as the associated real target.
7. The method according to claim 2, wherein in a case where a row of the confidence matrix corresponds to the perception target and a column of the confidence matrix corresponds to the real target, the determining an evaluation result of the target recognition algorithm to be evaluated according to the association result comprises:
determining the accuracy rate of the target recognition algorithm to be evaluated according to the number of associated perception targets and the total number of rows of the confidence matrix;
determining the recall rate of the target recognition algorithm to be evaluated according to the number of associated real targets and the total number of columns of the confidence matrix;
wherein the evaluation result comprises the accuracy rate and the recall rate.
8. The method according to claim 2, wherein the determining an evaluation result of the target recognition algorithm to be evaluated according to the association result comprises:
determining associated pairs based on the association result, wherein each associated pair comprises an associated perception target and the associated real target associated with it;
calculating an error between the associated perception target and the associated real target in the associated pair;
and computing an error mean value corresponding to each target category according to the errors corresponding to the associated pairs, wherein the target category of an associated pair is determined according to the category of the associated real target in that pair, and the evaluation result comprises the error mean value.
9. An apparatus for evaluating a target recognition algorithm, the apparatus comprising:
the acquisition module is configured to acquire point cloud data obtained by sensing a target scene by the millimeter wave radar sensor;
the identification module is configured to perform target identification on the point cloud data according to a target identification algorithm to be evaluated to obtain an identified perception target;
the association module is configured to perform association processing on a real target in the target scene and the perception target to obtain an association result;
and the execution module is configured to determine an evaluation result of the target recognition algorithm to be evaluated according to the association result.
10. A computer-readable storage medium on which computer program instructions are stored, which program instructions, when executed by a processor, implement the steps of the method according to any one of claims 1 to 8.
11. A vehicle, characterized by comprising:
a memory having a computer program stored thereon;
a processor for executing the computer program in the memory to implement the steps of the method of any one of claims 1 to 8.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202210827749.8A CN115147796A (en) | 2022-07-14 | 2022-07-14 | Method and device for evaluating target recognition algorithm, storage medium and vehicle |
Publications (1)
Publication Number | Publication Date |
---|---|
CN115147796A true CN115147796A (en) | 2022-10-04 |
Family
ID=83412644
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202210827749.8A Pending CN115147796A (en) | 2022-07-14 | 2022-07-14 | Method and device for evaluating target recognition algorithm, storage medium and vehicle |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN115147796A (en) |
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2022037387A1 (en) * | 2020-08-20 | 2022-02-24 | 魔门塔(苏州)科技有限公司 | Visual perception algorithm evaluation method and device |
CN112307955A (en) * | 2020-10-29 | 2021-02-02 | 广西科技大学 | Optimization method based on SSD infrared image pedestrian detection |
CN112329892A (en) * | 2020-12-03 | 2021-02-05 | 中国第一汽车股份有限公司 | Target detection algorithm evaluation method, device, equipment and storage medium |
CN113239609A (en) * | 2020-12-24 | 2021-08-10 | 天津职业技术师范大学(中国职业培训指导教师进修中心) | Test system and detection method for target identification and monitoring of monocular camera of intelligent vehicle |
CN114120246A (en) * | 2021-10-12 | 2022-03-01 | 吉林大学 | A front vehicle detection algorithm based on complex environment |
Cited By (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN115639532A (en) * | 2022-10-20 | 2023-01-24 | 湖南交通工程学院 | Method and device for comprehensively evaluating cognitive radar target recognition effect on line |
CN115907566A (en) * | 2023-02-17 | 2023-04-04 | 小米汽车科技有限公司 | Evaluation method and device for automatic driving perception detection capability and electronic equipment |
CN115907566B (en) * | 2023-02-17 | 2023-05-30 | 小米汽车科技有限公司 | Evaluation method and device for automatic driving perception detection capability and electronic equipment |
CN116091553A (en) * | 2023-04-04 | 2023-05-09 | 小米汽车科技有限公司 | Track determination method, track determination device, electronic equipment, vehicle and storage medium |
CN116500565A (en) * | 2023-06-28 | 2023-07-28 | 小米汽车科技有限公司 | Method, device and equipment for evaluating automatic driving perception detection capability |
CN116500565B (en) * | 2023-06-28 | 2023-10-13 | 小米汽车科技有限公司 | Method, device and equipment for evaluating automatic driving perception detection capability |
CN118094173A (en) * | 2023-11-17 | 2024-05-28 | 北京理工大学 | Method for autonomously identifying and analyzing reconnaissance load target and evaluating reconnaissance load target through algorithm |
CN118035939A (en) * | 2024-02-26 | 2024-05-14 | 安徽蔚来智驾科技有限公司 | Confidence acquisition method for perceived targets and autonomous driving planning and control method |
Legal Events
Date | Code | Title | Description
---|---|---|---
PB01 | Publication | |
SE01 | Entry into force of request for substantive examination | |