
CN113470091B - Hub point cloud registration method and device, electronic equipment and storage medium - Google Patents

Hub point cloud registration method and device, electronic equipment and storage medium

Info

Publication number
CN113470091B
CN113470091B (application CN202111025763.8A)
Authority
CN
China
Prior art keywords
point cloud
hub
sampling point
projection image
main shaft
Prior art date
Legal status
Active
Application number
CN202111025763.8A
Other languages
Chinese (zh)
Other versions
CN113470091A (en)
Inventor
赵佳南
黄雪峰
杨超
胡亘谦
刘云备
蔡恩祥
吴志浩
Current Assignee
Shenzhen Xinrun Fulian Digital Technology Co Ltd
Original Assignee
Shenzhen Xinrun Fulian Digital Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Shenzhen Xinrun Fulian Digital Technology Co Ltd filed Critical Shenzhen Xinrun Fulian Digital Technology Co Ltd
Priority to CN202111025763.8A priority Critical patent/CN113470091B/en
Publication of CN113470091A publication Critical patent/CN113470091A/en
Application granted granted Critical
Publication of CN113470091B publication Critical patent/CN113470091B/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/30Determination of transform parameters for the alignment of images, i.e. image registration
    • G06T7/33Determination of transform parameters for the alignment of images, i.e. image registration using feature-based methods
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/22Matching criteria, e.g. proximity measures

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Artificial Intelligence (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • General Engineering & Computer Science (AREA)
  • Image Processing (AREA)

Abstract

The invention discloses a hub point cloud registration method and device, electronic equipment and a storage medium. The method includes: acquiring a hub sampling point cloud and a preset hub standard point cloud; performing inclination correction on the hub sampling point cloud so that the main shaft direction of the hub sampling point cloud coincides with the main shaft direction of the hub standard point cloud; projecting the corrected hub sampling point cloud and hub standard point cloud onto a reference plane to respectively obtain a sampling point cloud projection image and a standard point cloud projection image; performing image matching on the sampling point cloud projection image and the standard point cloud projection image to obtain the rotation amount and translation amount by which the sampling point cloud projection image is moved to overlap with the standard point cloud projection image; determining a main shaft translation amount between the hub sampling point cloud and the hub standard point cloud based on the moved sampling point cloud projection image and the standard point cloud projection image; and moving the hub sampling point cloud according to the rotation amount, the translation amount and the main shaft translation amount. The scheme provided by the invention can improve the registration precision of the hub point cloud and reduce the registration error.

Description

Hub point cloud registration method and device, electronic equipment and storage medium
Technical Field
The invention relates to the technical field of point cloud registration, in particular to a hub point cloud registration method and device, electronic equipment and a storage medium.
Background
Point cloud registration is the process of first acquiring the relative pose between a point cloud to be measured and a standard point cloud, and then applying translation and rotation transformations to the point cloud to be measured so that its pose becomes identical to that of the standard point cloud. Currently, the Iterative Closest Point (ICP) method is widely used for point cloud registration and achieves very high registration accuracy. However, to avoid falling into a local optimum when matching point pairs, in practical applications the point clouds are first coarsely registered so that the relative poses of the two point clouds do not differ too much, thereby avoiding a local optimum during matching.
However, in existing coarse registration processes, when the rotation matrix and translation matrix of a point cloud are obtained, the registration accuracy is usually good in only one direction, while the registration errors along the second and third principal directions are large, so the coarse registration effect is poor. Moreover, current coarse registration processes cannot properly register a local sampling point cloud to the standard point cloud.
Disclosure of Invention
In order to solve the technical problem of poor registration effect in the rough point cloud registration process, the embodiment of the invention provides a hub point cloud registration method and device, electronic equipment and a storage medium.
The technical scheme of the embodiment of the invention is realized as follows:
the embodiment of the invention provides a hub point cloud registration method, which comprises the following steps:
acquiring a hub sampling point cloud obtained by shooting a hub and a preset hub standard point cloud;
performing inclination correction on the hub sampling point cloud to enable the main shaft direction of the hub sampling point cloud to be coincident with the main shaft direction of the hub standard point cloud; wherein, the main shaft direction is the direction of the hub rotating shaft;
projecting the corrected hub sampling point cloud and the corrected hub standard point cloud to a reference plane, and respectively obtaining a sampling point cloud projection image and a standard point cloud projection image; wherein the reference plane is a plane perpendicular to the direction of the main axis;
performing image matching on the sampling point cloud projection image and the standard point cloud projection image to obtain the rotation amount and the translation amount of the sampling point cloud projection image which is moved to be overlapped with the standard point cloud projection image;
after the rotation amount and the translation amount are used for moving the sampling point cloud projection image, determining the main shaft translation amount between the hub sampling point cloud and the hub standard point cloud based on the moved sampling point cloud projection image and the standard point cloud projection image;
and moving the hub sampling point cloud according to the rotation amount, the translation amount and the main shaft translation amount so as to complete the registration of the hub sampling point cloud.
In the above scheme, the inclination correction of the hub sampling point cloud is performed, so that the coincidence of the main axis direction of the hub sampling point cloud and the main axis direction of the hub standard point cloud includes:
determining the direction of a main shaft of the hub sampling point cloud;
obtaining a main shaft direction vector of the hub sampling point cloud according to the main shaft direction;
and performing gradient correction on the hub sampling point cloud based on the main shaft direction vector, so that the main shaft direction of the hub sampling point cloud is superposed with the main shaft direction of the hub standard point cloud.
In the above scheme, the determining the main axis direction of the hub sampling point cloud includes:
processing the hub sampling point cloud by using a principal component analysis method to obtain a covariance matrix;
singular value decomposition is carried out on the covariance matrix to obtain a singular value matrix and a singular vector matrix;
and taking the direction of the first column of vectors of the left singular vector matrix in the singular vector matrix as the main shaft direction of the hub sampling point cloud.
In the above scheme, the obtaining of the main axis direction vector of the hub sampling point cloud according to the main axis direction includes:
determining the coordinates of a point farthest from the main shaft direction in the hub sampling point cloud;
determining coordinates of a center of gravity point of the hub sampling point cloud;
and determining a main shaft direction vector of the hub sampling point cloud according to the coordinate of the farthest point, the coordinate of the gravity center point and the main shaft direction.
In the foregoing solution, the performing gradient correction on the hub sampled point cloud based on the main axis direction vector to make the main axis direction of the hub sampled point cloud coincide with the main axis direction of the hub standard point cloud includes:
obtaining a rotation transformation matrix by utilizing a Rodrigues rotation formula according to the main shaft direction vector;
and transforming the hub sampling point cloud by using the rotation transformation matrix so that the main shaft direction of the hub sampling point cloud is superposed with the main shaft direction of the hub standard point cloud.
In the above scheme, projecting the corrected hub sampling point cloud and the corrected hub standard point cloud to a reference plane, and respectively obtaining a sampling point cloud projection image and a standard point cloud projection image includes:
obtaining scale factors on two mutually perpendicular coordinate axes on a reference plane;
and projecting the corrected hub sampling point cloud and the hub standard point cloud to a reference plane by using the scale factor to respectively obtain a sampling point cloud projection image and a standard point cloud projection image.
In the above scheme, the image matching the sampled point cloud projection image and the standard point cloud projection image to obtain the rotation amount and the translation amount for moving the sampled point cloud projection image to overlap with the standard point cloud projection image includes:
acquiring a preset image processing template matching operator;
carrying out image matching on the sampling point cloud projection image and the standard point cloud projection image by using the image processing template matching operator to obtain a rigid body conversion matrix for moving the sampling point cloud projection image to be overlapped with the standard point cloud projection image; and the rigid body conversion matrix comprises a rotation angle for moving the sampling point cloud projection image to be overlapped with the standard point cloud projection image and translation amounts on two coordinate axes which are vertical to each other.
In the above scheme, determining the main shaft translation amount between the hub sampling point cloud and the hub standard point cloud based on the moved sampling point cloud projection image and the standard point cloud projection image includes:
determining a first overlapping area in an overlapping area between the moved sampling point cloud projection image and the standard point cloud projection image; the range of the first overlapping area is a preset range;
determining a sampling three-dimensional point set and a standard three-dimensional point set corresponding to the first overlapping area in the hub sampling point cloud and the hub standard point cloud respectively;
and determining the main shaft translation amount between the hub sampling point cloud and the hub standard point cloud according to the sampling three-dimensional point set and the standard three-dimensional point set.
The embodiment of the invention also provides a hub point cloud registration device, which comprises:
the acquisition module is used for acquiring a hub sampling point cloud obtained by shooting a hub and a preset hub standard point cloud;
the correction module is used for carrying out inclination correction on the hub sampling point cloud so that the main shaft direction of the hub sampling point cloud is superposed with the main shaft direction of the hub standard point cloud; wherein, the main shaft direction is the direction of the hub rotating shaft;
the projection module is used for projecting the corrected hub sampling point cloud and the corrected hub standard point cloud to a reference plane to respectively obtain a sampling point cloud projection image and a standard point cloud projection image; wherein the reference plane is a plane perpendicular to the direction of the main axis;
the matching module is used for carrying out image matching on the sampling point cloud projection image and the standard point cloud projection image to obtain the rotation amount and the translation amount of the sampling point cloud projection image which is moved to be overlapped with the standard point cloud projection image;
the determining module is used for determining the main shaft translation amount between the hub sampling point cloud and the hub standard point cloud based on the moved sampling point cloud projection image and the standard point cloud projection image after the sampling point cloud projection image is moved by the rotation amount and the translation amount;
and the moving module is used for moving the hub sampling point cloud according to the rotation amount, the translation amount and the main shaft translation amount so as to complete the registration of the hub sampling point cloud.
An embodiment of the present invention further provides an electronic device, including: a processor and a memory for storing a computer program capable of running on the processor; wherein,
the processor is adapted to perform the steps of any of the methods described above when running the computer program.
The embodiment of the invention also provides a storage medium, wherein a computer program is stored in the storage medium, and when the computer program is executed by a processor, the steps of any one of the methods are realized.
According to the hub point cloud registration method, the hub point cloud registration device, the electronic equipment and the storage medium, the hub sampling point cloud obtained by shooting the hub and the preset hub standard point cloud are obtained; performing inclination correction on the hub sampling point cloud to enable the main shaft direction of the hub sampling point cloud to be coincident with the main shaft direction of the hub standard point cloud; wherein, the main shaft direction is the direction of the hub rotating shaft; projecting the corrected hub sampling point cloud and the corrected hub standard point cloud to a reference plane, and respectively obtaining a sampling point cloud projection image and a standard point cloud projection image; wherein the reference plane is a plane perpendicular to the direction of the main axis; performing image matching on the sampling point cloud projection image and the standard point cloud projection image to obtain the rotation amount and the translation amount of the sampling point cloud projection image which is moved to be overlapped with the standard point cloud projection image; after the rotation amount and the translation amount are used for moving the sampling point cloud projection image, determining the main shaft translation amount between the hub sampling point cloud and the hub standard point cloud based on the moved sampling point cloud projection image and the standard point cloud projection image; and moving the hub sampling point cloud according to the rotation amount, the translation amount and the main shaft translation amount so as to complete the registration of the hub sampling point cloud. By adopting the scheme provided by the invention, the registration precision of the hub point cloud can be improved, and the registration error can be reduced.
Drawings
FIG. 1 is a schematic flow chart of a hub point cloud registration method according to an embodiment of the present invention;
FIG. 2 is a schematic structural diagram of a hub point cloud registration device according to an embodiment of the present invention;
fig. 3 is an internal structural diagram of a computer device according to an embodiment of the present invention.
Detailed Description
The present invention will be described in further detail with reference to the accompanying drawings and examples.
The embodiment of the invention provides a hub point cloud registration method, as shown in fig. 1, the method comprises the following steps:
step 101: acquiring a hub sampling point cloud obtained by shooting a hub and a preset hub standard point cloud;
step 102: performing inclination correction on the hub sampling point cloud to enable the main shaft direction of the hub sampling point cloud to be coincident with the main shaft direction of the hub standard point cloud; wherein, the main shaft direction is the direction of the hub rotating shaft;
step 103: projecting the corrected hub sampling point cloud and the corrected hub standard point cloud to a reference plane, and respectively obtaining a sampling point cloud projection image and a standard point cloud projection image; wherein the reference plane is a plane perpendicular to the direction of the main axis;
step 104: performing image matching on the sampling point cloud projection image and the standard point cloud projection image to obtain the rotation amount and the translation amount of the sampling point cloud projection image which is moved to be overlapped with the standard point cloud projection image;
step 105: after the rotation amount and the translation amount are used for moving the sampling point cloud projection image, determining the main shaft translation amount between the hub sampling point cloud and the hub standard point cloud based on the moved sampling point cloud projection image and the standard point cloud projection image;
step 106: and moving the hub sampling point cloud according to the rotation amount, the translation amount and the main shaft translation amount so as to complete the registration of the hub sampling point cloud.
Specifically, the method of the embodiment can be applied to a point cloud rough registration process, and the registration accuracy of the point cloud rough registration process is improved.
In practical application, the hub standard point cloud in this embodiment may be a standard point cloud obtained through a hub digital-to-analog file provided by a manufacturer in a hub production and manufacturing process. And in actual application, producing and manufacturing the hub according to the digital-analog file corresponding to the hub standard point cloud.
Specifically, this embodiment adopts a step-by-step registration strategy. First, the main axis direction of the hub sampling point cloud is calculated, and inclination correction is applied to the hub sampling point cloud according to the main axis direction, so that the main axis of the hub sampling point cloud coincides in direction with the main axis of the hub standard point cloud (namely the z-axis of the hub standard point cloud coordinate system). Then, following a dimensionality-reduction idea, the aligned hub sampling point cloud and hub standard point cloud are projected onto the main plane (the plane perpendicular to the main axis, namely the plane spanned by the x-axis and y-axis of the hub standard point cloud coordinate system) and converted into corresponding two-dimensional maps (namely the sampling point cloud projection image and the standard point cloud projection image). At this point, the point cloud matching problem between the hub sampling point cloud and the hub standard point cloud has been converted into an image template matching problem between the sampling point cloud projection image and the standard point cloud projection image. The rotation angle and translation amount of the hub sampling point cloud relative to the hub standard point cloud in the main plane are then obtained from the sampling point cloud projection image and the standard point cloud projection image. Finally, an overlapping area of the sampling point cloud projection image and the standard point cloud projection image is selected according to the image matching result, the translation amount of the hub sampling point cloud relative to the hub standard point cloud in the z-axis direction is obtained through mean value calculation, the spatial rotation-translation matrix from the hub sampling point cloud to the hub standard point cloud is thereby obtained, and the registration of the hub point cloud is completed.
Further, in an embodiment, the inclination correcting the hub sample point cloud so that the main axis direction of the hub sample point cloud coincides with the main axis direction of the hub standard point cloud includes:
determining the direction of a main shaft of the hub sampling point cloud;
obtaining a main shaft direction vector of the hub sampling point cloud according to the main shaft direction;
and performing gradient correction on the hub sampling point cloud based on the main shaft direction vector, so that the main shaft direction of the hub sampling point cloud is superposed with the main shaft direction of the hub standard point cloud.
Specifically, the main axis direction is the direction in which the hub rotation axis is located. The hub is symmetrical based on the hub rotation axis. In practical application, the main shaft direction can be further defined as the main shaft positive direction from inside to outside according to the hub assembling direction. Here, it should be noted that, in the coordinate system of the hub standard point cloud, the main axis is generally located at the z-axis, so that the pose of the hub sampling point cloud can be converted to the coordinate system of the hub standard point cloud based on the coordinate system of the hub standard point cloud, thereby achieving the registration between the hub sampling point cloud and the hub standard point cloud.
Further, in an embodiment, the determining the main axis direction of the hub sampling point cloud comprises:
processing the hub sampling point cloud by using a principal component analysis method to obtain a covariance matrix;
singular value decomposition is carried out on the covariance matrix to obtain a singular value matrix and a singular vector matrix;
and taking the direction of the first column of vectors of the left singular vector matrix in the singular vector matrix as the main shaft direction of the hub sampling point cloud.
Principal Component Analysis (PCA) is a method of analyzing multivariate statistical distribution by using feature vectors, and characterizes the main distribution direction inside data. Here, the point cloud processing by PCA may be regarded as calculating a linear projection of the point cloud data with a direction of maximum variance of the data as a main axis direction of the point cloud, and the magnitude of the variance may be defined by a corresponding feature value.
Specifically, the hub sampling point cloud is processed by a principal component analysis method according to the following formula (1), and a covariance matrix is obtained:
$$C = \frac{1}{n} X X^{T} \qquad \text{formula (1)}$$
where $C$ denotes the covariance matrix; $X = [\,q_{1}-\bar{q},\; q_{2}-\bar{q},\; \dots,\; q_{n}-\bar{q}\,]$ is the decentralized point cloud; $q_{1}, q_{2}, \dots, q_{n}$ are the n three-dimensional points forming the hub sampling point cloud Q and $\bar{q}$ is their mean; T is an operator representing the transpose of the matrix.
Specifically, the singular value decomposition may be performed on the covariance matrix using the following formula (2), to obtain a singular value matrix and a singular vector matrix:
$$C = U \Sigma V^{T} \qquad \text{formula (2)}$$
where $C$ denotes the covariance matrix; the matrices $U$ and $V$ are 3 × 3 orthogonal matrices, namely the left singular vector matrix and the right singular vector matrix of the covariance matrix, respectively; $\Sigma$ is a diagonal matrix, namely the singular value matrix of the covariance matrix; and T is an operation symbol representing the transpose of the matrix.
Here, any of the m × n matrices can be decomposed into the product of an m × m orthogonal matrix, an m × n diagonal matrix, and an n × n orthogonal matrix. This decomposition process is called matrix singular value decomposition. Singular value decomposition can be applied to all matrices, as opposed to eigenvalue decomposition which can only be limited to square matrices.
Here, the first column vector $u_{1} = (u_{11}, u_{21}, u_{31})^{T}$ of the 3 × 3 left singular vector matrix $U$ defines the direction of maximum variance. Therefore, in this embodiment, the direction of $u_{1}$ is taken as the main axis direction of the hub sampling point cloud. Here, $u_{11}$, $u_{21}$ and $u_{31}$ are the three element values of the three-dimensional column vector $u_{1}$.
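As a non-authoritative illustration of formulas (1) and (2), the following Python sketch (using NumPy; the function name main_axis_direction and the array name points are introduced here for illustration, not taken from the patent) builds the covariance matrix of a sampled cloud and takes the first left singular vector as the main axis direction.

```python
import numpy as np

def main_axis_direction(points: np.ndarray) -> np.ndarray:
    """Estimate the main axis direction of an (n, 3) point cloud via PCA/SVD.

    Minimal sketch of formulas (1)-(2): decentralize the points, build the
    3 x 3 covariance matrix, and take the first column of the left singular
    vector matrix U as the direction of maximum variance.
    """
    q_mean = points.mean(axis=0)        # gravity center of the cloud
    x = (points - q_mean).T             # 3 x n decentralized point cloud
    c = x @ x.T / points.shape[0]       # covariance matrix, formula (1)
    u, s, vt = np.linalg.svd(c)         # C = U diag(s) V^T, formula (2)
    return u[:, 0]                      # first left singular vector u1

if __name__ == "__main__":
    # Synthetic cloud whose largest spread is along z, so u1 should be close to +-z.
    rng = np.random.default_rng(0)
    pts = rng.normal(size=(5000, 3)) * np.array([3.0, 3.0, 30.0])
    print(main_axis_direction(pts))
```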
PCA can only determine the direction of the main axis; it cannot determine its sign (equivalently, it cannot distinguish the front and back of the point cloud). Therefore, in this embodiment, the characteristics of the hub workpiece are analyzed, and it is found that the points on the hub outer contour lie on the hub bottom surface and have the largest distance from the hub central axis (which can also be understood as the main axis). The sign of the main axis is therefore fixed by the vector between the point of the hub sampling point cloud farthest from the main axis and the hub gravity center point, which can be understood as determining the main axis direction vector of the hub sampling point cloud.
Specifically, in an embodiment, the obtaining a main axis direction vector of the hub sampling point cloud according to the main axis direction includes:
determining the coordinates of a point farthest from the main shaft direction in the hub sampling point cloud;
determining coordinates of a center of gravity point of the hub sampling point cloud;
and determining a main shaft direction vector of the hub sampling point cloud according to the coordinate of the farthest point, the coordinate of the gravity center point and the main shaft direction.
Here, the gravity center point of the hub sampling point cloud Q may be denoted $g$; the spatial straight line passing through $g$ with $u_{1}$ as its direction vector is denoted $l$; and the point of the hub sampling point cloud Q farthest from the line $l$ is denoted $p$. Then the main axis direction vector of the hub sampling point cloud Q can be determined by the following formula (3):
$$N = \operatorname{sign}\!\left(u_{1}^{T} v\right)\, u_{1} \qquad \text{formula (3)}$$
where $N$ denotes the main axis direction vector; $u_{1}$ denotes the vector in the principal axis direction obtained above; $u_{1}^{T}$ is the transpose of $u_{1}$; and $v = p - g$ denotes the direction vector from the gravity center point $g$ to the farthest point $p$.
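A minimal sketch of the sign disambiguation reconstructed in formula (3) follows (the function name oriented_main_axis is introduced here; points and u1 follow the previous sketch; the exact sign convention is an assumption based on the definitions above, not a detail fixed by the patent):

```python
import numpy as np

def oriented_main_axis(points: np.ndarray, u1: np.ndarray) -> np.ndarray:
    """Fix the sign of the PCA axis u1 using the point farthest from the axis.

    Sketch of formula (3): compute the gravity center g, find the point p of the
    cloud farthest from the line through g with direction u1, and flip u1
    according to the sign of its projection onto v = p - g.
    """
    u1 = u1 / np.linalg.norm(u1)
    g = points.mean(axis=0)                                 # gravity center point
    d = points - g
    along = d @ u1                                          # signed distance along the axis
    radial = d - np.outer(along, u1)                        # component perpendicular to line l
    p = points[np.argmax(np.linalg.norm(radial, axis=1))]   # point farthest from line l
    v = p - g                                               # gravity center -> farthest point
    return u1 if float(u1 @ v) >= 0.0 else -u1              # N = sign(u1 . v) * u1
```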
Further, in an embodiment, the performing inclination correction on the hub sample point cloud based on the main axis direction vector so that the main axis direction of the hub sample point cloud coincides with the main axis direction of the hub standard point cloud includes:
obtaining a rotation transformation matrix by utilizing a Rodrigues rotation formula according to the main shaft direction vector;
and transforming the hub sampling point cloud by using the rotation transformation matrix so that the main shaft direction of the hub sampling point cloud is superposed with the main shaft direction of the hub standard point cloud.
The inclination correction of the hub sampling point cloud is the process of transforming the hub sampling point cloud so that its main axis direction becomes consistent with the z-axis direction of the hub standard point cloud coordinate system.
Specifically, the transformation of the hub sampling point cloud can be performed by the following formula (4):
$$Q' = R\, Q \qquad \text{formula (4)}$$
where $Q'$ denotes the transformed hub sampling point cloud, $R$ denotes the rotation transformation matrix, and $Q$ denotes the hub sampling point cloud before transformation. Here, after being transformed by the rotation transformation matrix, the main axis direction vector $N$ becomes the unit vector $z = (0, 0, 1)^{T}$ along the z-axis.
In practical applications, the rotation transformation matrix $R$ can be obtained according to the Rodrigues rotation formula by the following formula (5):
$$R = I + \sin\theta\, K + (1 - \cos\theta)\, K^{2} \qquad \text{formula (5)}$$
where $R$ denotes the rotation transformation matrix; $I$ is the 3 × 3 identity matrix; $n_{x}$, $n_{y}$, $n_{z}$ are the three parameters of the main axis direction vector $N = (n_{x}, n_{y}, n_{z})^{T}$; $\theta = \arccos(N^{T} z)$ is the angle between the normalized vector $N$ and the target vector $z = (0, 0, 1)^{T}$, $z^{T}$ being the transpose of $z$; and $K$ is the skew-symmetric matrix of the unit rotation axis $k = (N \times z) / \lVert N \times z \rVert$, namely
$$K = \begin{pmatrix} 0 & -k_{3} & k_{2} \\ k_{3} & 0 & -k_{1} \\ -k_{2} & k_{1} & 0 \end{pmatrix}.$$
After the rotation transformation matrix $R$ is obtained by the Rodrigues rotation formula, the hub sampling point cloud can be transformed by $R$ so that the main axis direction of the hub sampling point cloud coincides with the main axis direction of the hub standard point cloud.
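The following NumPy sketch illustrates formulas (4) and (5) under the reconstruction above (the function names rotation_to_z and correct_inclination are introduced here for illustration; the explicit handling of the degenerate case where N is already parallel or anti-parallel to z is an added assumption):

```python
import numpy as np

def rotation_to_z(n: np.ndarray) -> np.ndarray:
    """Rodrigues rotation matrix mapping the unit main-axis vector n onto z = (0, 0, 1)."""
    n = n / np.linalg.norm(n)
    z = np.array([0.0, 0.0, 1.0])
    axis = np.cross(n, z)
    s = np.linalg.norm(axis)                   # sin(theta)
    c = float(n @ z)                           # cos(theta)
    if s < 1e-12:                              # n already (anti)parallel to z
        return np.eye(3) if c > 0.0 else np.diag([1.0, -1.0, -1.0])
    k = axis / s                               # unit rotation axis
    kx = np.array([[0.0, -k[2], k[1]],
                   [k[2], 0.0, -k[0]],
                   [-k[1], k[0], 0.0]])        # skew-symmetric matrix K
    return np.eye(3) + s * kx + (1.0 - c) * (kx @ kx)   # formula (5)

def correct_inclination(points: np.ndarray, n: np.ndarray):
    """Apply formula (4): rotate the sampled cloud so its main axis aligns with z."""
    r = rotation_to_z(n)
    return points @ r.T, r                     # rotated (n, 3) cloud and the matrix R
```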
Further, in an embodiment, the projecting the corrected hub sample point cloud and hub standard point cloud to a reference plane, and respectively obtaining a sample point cloud projection image and a standard point cloud projection image includes:
obtaining scale factors on two mutually perpendicular coordinate axes on a reference plane;
and projecting the corrected hub sampling point cloud and the hub standard point cloud to a reference plane by using the scale factor to respectively obtain a sampling point cloud projection image and a standard point cloud projection image.
Specifically, any point $q = (x, y, z)$ in the point cloud (including the hub sampling point cloud and the hub standard point cloud) is projection-converted by the following formula (6):
$$u = s_{x}\, x, \qquad v = s_{y}\, y \qquad \text{formula (6)}$$
where $x$, $y$, $z$ are the coordinates of the point $q$ in the coordinate system of the point cloud before the projection transformation; $s_{x}$ and $s_{y}$ are the scale factors of the point cloud projection image (including the sampling point cloud projection image and the standard point cloud projection image) in the x direction and the y direction, respectively; $u$ and $v$ are the pixel coordinates corresponding to each point in the point cloud projection image; and $I(u, v)$ is the gray value of the point $q$ in the point cloud projection image (including the sampling point cloud projection image and the standard point cloud projection image).
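A small sketch of the projection in formula (6) follows; the image size, the centering offsets, and the choice of encoding each point's z-coordinate as the gray value are assumptions introduced for illustration (the function name project_to_image is likewise illustrative):

```python
import numpy as np

def project_to_image(points: np.ndarray, sx: float, sy: float,
                     width: int, height: int) -> np.ndarray:
    """Project an (n, 3) cloud (main axis already aligned with z) onto the x-y plane.

    Sketch of formula (6): pixel coordinates are the scaled x/y coordinates; each
    pixel stores a gray value derived here (as an assumption) from the point's
    z-coordinate, keeping the highest point that falls into the pixel.
    """
    img = np.zeros((height, width), dtype=np.float32)
    u = np.clip((points[:, 0] * sx).astype(int) + width // 2, 0, width - 1)
    v = np.clip((points[:, 1] * sy).astype(int) + height // 2, 0, height - 1)
    z = points[:, 2]
    gray = ((z - z.min()) / (z.max() - z.min() + 1e-9)).astype(np.float32)
    np.maximum.at(img, (v, u), gray)    # highest point per pixel wins
    return img
```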
Further, in an embodiment, the image matching the sampled point cloud projection image and the standard point cloud projection image, and the obtaining the amount of rotation and the amount of translation for moving the sampled point cloud projection image to overlap with the standard point cloud projection image comprises:
acquiring a preset image processing template matching operator;
carrying out image matching on the sampling point cloud projection image and the standard point cloud projection image by using the image processing template matching operator to obtain a rigid body conversion matrix for moving the sampling point cloud projection image to be overlapped with the standard point cloud projection image; and the rigid body conversion matrix comprises a rotation angle for moving the sampling point cloud projection image to be overlapped with the standard point cloud projection image and translation amounts on two coordinate axes which are vertical to each other.
Here, the image processing template matching operator and image matching based on such an operator are prior art and are not described again here. Specifically, the OpenCV 2D image processing template matching operator CV_TM_CCOEFF (an operator implemented in OpenCV that performs correlation matching) may be used to perform image matching on the sampling point cloud projection image and the standard point cloud projection image, so as to obtain a rigid body conversion matrix that moves the sampling point cloud projection image to overlap with the standard point cloud projection image.
Specifically, the rigid body conversion matrix can be represented by the following formula (7):
$$M = \begin{pmatrix} \cos\varphi & -\sin\varphi & t_{x} \\ \sin\varphi & \cos\varphi & t_{y} \\ 0 & 0 & 1 \end{pmatrix} \qquad \text{formula (7)}$$
where $M$ denotes the rigid body conversion matrix; $\varphi$ denotes the rotation angle by which the sampling point cloud projection image is moved to overlap with the standard point cloud projection image; and $t_{x}$, $t_{y}$ denote the translation amounts, on the two mutually perpendicular coordinate axes, by which the sampling point cloud projection image is moved to overlap with the standard point cloud projection image.
Here, after the two-dimensional image matching is completed, the translation amount along the main axis between the hub sampling point cloud and the hub standard point cloud can be further obtained from the matched images, that is, from the standard point cloud projection image together with the sampling point cloud projection image after it has been moved by the rigid body conversion matrix so that the two images overlap.
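The image matching step can be sketched with OpenCV as follows. The patent specifies only the correlation operator CV_TM_CCOEFF; the brute-force search over candidate rotation angles, the function name match_projection, and the assumption that both projections are float32 images with the standard image at least as large as the sampling image are illustrative additions.

```python
import cv2
import numpy as np

def match_projection(sample_img: np.ndarray, standard_img: np.ndarray,
                     angle_step: float = 1.0):
    """Estimate (rotation angle, tx, ty) moving the sampling projection onto the standard one.

    Illustrative use of cv2.matchTemplate with cv2.TM_CCOEFF: each candidate
    rotation of the sampling image is matched against the (larger) standard
    image, and the best correlation peak gives the rotation and translation.
    """
    h, w = sample_img.shape[:2]
    center = (w / 2.0, h / 2.0)
    best = (-np.inf, 0.0, (0, 0))
    for angle in np.arange(0.0, 360.0, angle_step):
        rot = cv2.getRotationMatrix2D(center, angle, 1.0)
        rotated = cv2.warpAffine(sample_img, rot, (w, h))
        res = cv2.matchTemplate(standard_img, rotated, cv2.TM_CCOEFF)
        _, max_val, _, max_loc = cv2.minMaxLoc(res)
        if max_val > best[0]:
            best = (max_val, angle, max_loc)
    _, phi, (tx, ty) = best
    return phi, tx, ty                  # rotation angle in degrees, pixel translation
```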
Further, in an embodiment, the determining the main axis translation amount between the hub sampling point cloud and the hub standard point cloud based on the moved sampling point cloud projection image and the standard point cloud projection image includes:
determining a first overlapping area in an overlapping area between the moved sampling point cloud projection image and the standard point cloud projection image; the range of the first overlapping area is a preset range;
determining a sampling three-dimensional point set and a standard three-dimensional point set corresponding to the first overlapping area in the hub sampling point cloud and the hub standard point cloud respectively;
and determining the main shaft translation amount between the hub sampling point cloud and the hub standard point cloud according to the sampling three-dimensional point set and the standard three-dimensional point set.
In practical application, any pixel point in the overlapping area can be taken as a central point, sampling is carried out in the overlapping area by taking the preset size as a radius, and a first overlapping area is obtained. Here, the preset size may be set according to the width and height of the actual pixel of the image, but it is required to ensure that the first overlapping area after sampling is located in the overlapping area between the sampling point cloud projection image and the standard point cloud projection image.
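As an illustration of selecting the first overlapping area, the sketch below assumes that both projections are gray images whose non-zero pixels mark projected points and that the overlap is non-empty; the function name first_overlap_mask and the choice of the first overlap pixel as the center are assumptions added here.

```python
import numpy as np

def first_overlap_mask(moved_sample_img: np.ndarray, standard_img: np.ndarray,
                       radius: int) -> np.ndarray:
    """Pick a circular first overlapping area inside the overlap of two projections.

    The overlap is where both projection images contain points (non-zero gray
    values); one overlap pixel is used as the center and a disc of the preset
    radius around it is kept, as described above.
    """
    overlap = (moved_sample_img > 0) & (standard_img > 0)
    ys, xs = np.nonzero(overlap)
    cy, cx = ys[0], xs[0]                                   # any overlap pixel as center
    yy, xx = np.mgrid[:overlap.shape[0], :overlap.shape[1]]
    disc = (yy - cy) ** 2 + (xx - cx) ** 2 <= radius ** 2   # preset radius
    return overlap & disc                                   # first overlapping area mask
```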
Here, after the first overlapping area is determined, the sampling three-dimensional point set $A = \{a_{1}, a_{2}, \dots, a_{n}\}$ and the standard three-dimensional point set $B = \{b_{1}, b_{2}, \dots, b_{m}\}$ corresponding to the first overlapping area in the hub sampling point cloud and the hub standard point cloud, respectively, may be obtained by taking all the information in the first overlapping area as features. Here, $b_{1}, b_{2}, \dots, b_{m}$ are the three-dimensional points of the standard three-dimensional point set $B$, and $a_{1}, a_{2}, \dots, a_{n}$ are the three-dimensional points of the sampling three-dimensional point set $A$.
Specifically, the translation amount of the hub sampling point cloud and the hub standard point cloud in the main axis direction can be determined by the following formula (8):
$$t_{z} = \frac{1}{m} \sum_{i=1}^{m} z_{b_{i}} - \frac{1}{n} \sum_{j=1}^{n} z_{a_{j}} \qquad \text{formula (8)}$$
where $t_{z}$ denotes the translation amount of the hub sampling point cloud and the hub standard point cloud in the main axis direction; $m$ is the number of three-dimensional points in the standard three-dimensional point set and $n$ is the number of three-dimensional points in the sampling three-dimensional point set; $z_{b_{i}}$ is the z-axis coordinate of the three-dimensional point $b_{i}$, and $z_{a_{j}}$ is the z-axis coordinate of the three-dimensional point $a_{j}$.
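A minimal sketch of formula (8) follows (the function name axis_translation and the boolean-mask interface are assumptions; the masks mark the points whose projections fall inside the first overlapping area, for example obtained from the earlier first_overlap_mask sketch mapped back to points):

```python
import numpy as np

def axis_translation(sample_pts: np.ndarray, standard_pts: np.ndarray,
                     sample_sel: np.ndarray, standard_sel: np.ndarray) -> float:
    """Mean z difference between the points projecting into the first overlapping area.

    sample_sel / standard_sel are boolean arrays (one flag per point) marking the
    points of each cloud whose projections fall inside the first overlapping area.
    """
    a = sample_pts[sample_sel]        # sampling three-dimensional point set A
    b = standard_pts[standard_sel]    # standard three-dimensional point set B
    return float(b[:, 2].mean() - a[:, 2].mean())   # formula (8): tz = mean z(B) - mean z(A)
```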
Thus, a first rigid body transformation matrix for moving the corrected hub sampling point cloud to the hub standard point cloud can be obtained by the following formula (9):
$$T_{1} = \begin{pmatrix} \cos\varphi & -\sin\varphi & 0 & t_{x} \\ \sin\varphi & \cos\varphi & 0 & t_{y} \\ 0 & 0 & 1 & t_{z} \\ 0 & 0 & 0 & 1 \end{pmatrix} \qquad \text{formula (9)}$$
where $T_{1}$ denotes the first rigid body transformation matrix for moving the corrected hub sampling point cloud to the hub standard point cloud; $\varphi$ denotes the rotation angle by which the sampling point cloud projection image is moved to overlap with the standard point cloud projection image; $t_{x}$, $t_{y}$ denote the translation amounts, on the two mutually perpendicular coordinate axes, by which the sampling point cloud projection image is moved to overlap with the standard point cloud projection image; and $t_{z}$ denotes the translation amount of the hub sampling point cloud and the hub standard point cloud in the main axis direction.
By means of the first rigid body transformation matrix $T_{1}$, a second rigid body conversion matrix (which can be understood as the spatial rotation and translation matrix from the hub sampling point cloud to the hub standard point cloud) for registering the uncorrected hub sampling point cloud to the hub standard point cloud can be obtained by the following formula (10):
$$T_{2} = T_{1}\, \tilde{R} \qquad \text{formula (10)}$$
where $T_{2}$ denotes the second rigid body conversion matrix for registering the uncorrected hub sampling point cloud to the hub standard point cloud; $T_{1}$ denotes the first rigid body transformation matrix for moving the corrected hub sampling point cloud to the hub standard point cloud; and $\tilde{R}$ denotes the rotation transformation matrix of formula (5), written as a 4 × 4 homogeneous matrix.
The registration from the hub sampling point cloud to the hub standard point cloud can be realized through the second rigid body conversion matrix.
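Composing the pieces as in formulas (9) and (10) can be sketched as follows (a non-authoritative illustration; the function names are introduced here, and tx, ty are assumed to have already been converted from pixel offsets back to point-cloud units using the scale factors of formula (6)):

```python
import numpy as np

def compose_registration(phi_deg: float, tx: float, ty: float, tz: float,
                         r_tilt: np.ndarray) -> np.ndarray:
    """Build T2 = T1 * R~: the transform from the raw sampled cloud to the standard cloud.

    T1 encodes the in-plane rotation/translation plus the main-axis translation
    (formula (9)); r_tilt is the 3x3 inclination-correction matrix of formula (5),
    lifted here to 4x4 homogeneous form (formula (10)).
    """
    phi = np.deg2rad(phi_deg)
    t1 = np.array([[np.cos(phi), -np.sin(phi), 0.0, tx],
                   [np.sin(phi),  np.cos(phi), 0.0, ty],
                   [0.0,          0.0,         1.0, tz],
                   [0.0,          0.0,         0.0, 1.0]])
    r4 = np.eye(4)
    r4[:3, :3] = r_tilt
    return t1 @ r4

def apply_transform(points: np.ndarray, t: np.ndarray) -> np.ndarray:
    """Apply a 4x4 homogeneous transform to an (n, 3) point cloud."""
    homo = np.hstack([points, np.ones((points.shape[0], 1))])
    return (homo @ t.T)[:, :3]
```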
According to the embodiment, the spatial rotation translation matrix from the hub sampling point cloud to the hub standard point cloud is obtained by utilizing the appearance characteristics of the hub, so that the registration from the hub sampling point cloud to the hub standard point cloud is realized, the calculated amount in the registration process is reduced, and the reliability of the registration result is improved. In addition, in the embodiment, the registration accuracy of coordinate axes other than the main axis, namely the z axis, in the point cloud registration process is improved by obtaining the movement amounts of the hub sampling point cloud to the hub standard point cloud in the three directions of the x axis, the y axis and the z axis, and the registration of the hub local sampling point cloud to the standard point cloud can be realized.
The hub point cloud registration method provided by the embodiment of the invention comprises the steps of obtaining a hub sampling point cloud obtained by shooting a hub and a preset hub standard point cloud; performing inclination correction on the hub sampling point cloud to enable the main shaft direction of the hub sampling point cloud to be coincident with the main shaft direction of the hub standard point cloud; wherein, the main shaft direction is the direction of the hub rotating shaft; projecting the corrected hub sampling point cloud and the corrected hub standard point cloud to a reference plane, and respectively obtaining a sampling point cloud projection image and a standard point cloud projection image; wherein the reference plane is a plane perpendicular to the direction of the main axis; performing image matching on the sampling point cloud projection image and the standard point cloud projection image to obtain the rotation amount and the translation amount of the sampling point cloud projection image which is moved to be overlapped with the standard point cloud projection image; after the rotation amount and the translation amount are used for moving the sampling point cloud projection image, determining the main shaft translation amount between the hub sampling point cloud and the hub standard point cloud based on the moved sampling point cloud projection image and the standard point cloud projection image; and moving the hub sampling point cloud according to the rotation amount, the translation amount and the main shaft translation amount so as to complete the registration of the hub sampling point cloud. By adopting the scheme provided by the invention, the registration precision of the hub point cloud can be improved, and the registration error can be reduced.
In order to implement the method according to the embodiment of the present invention, an embodiment of the present invention further provides a hub point cloud registration apparatus, as shown in fig. 2, the hub point cloud registration apparatus 200 includes: an acquisition module 201, a correction module 202, a projection module 203, a matching module 204, a determination module 205 and a movement module 206; wherein,
an obtaining module 201, configured to obtain a hub sampling point cloud obtained by shooting a hub and a preset hub standard point cloud;
the correction module 202 is configured to perform inclination correction on the hub sampling point cloud so that a main axis direction of the hub sampling point cloud coincides with a main axis direction of the hub standard point cloud; wherein, the main shaft direction is the direction of the hub rotating shaft;
the projection module 203 is used for projecting the corrected hub sampling point cloud and the hub standard point cloud to a reference plane to respectively obtain a sampling point cloud projection image and a standard point cloud projection image; wherein the reference plane is a plane perpendicular to the direction of the main axis;
a matching module 204, configured to perform image matching on the sampling point cloud projection image and the standard point cloud projection image, and obtain a rotation amount and a translation amount by which the sampling point cloud projection image is moved to overlap with the standard point cloud projection image;
a determining module 205, configured to determine a main shaft translation amount between the hub sampling point cloud and the hub standard point cloud based on the moved sampling point cloud projection image and the standard point cloud projection image after moving the sampling point cloud projection image by using the rotation amount and the translation amount;
a moving module 206, configured to move the hub sampling point cloud according to the rotation amount, the translation amount, and the main shaft translation amount, so as to complete registration of the hub sampling point cloud.
In practical applications, the obtaining module 201, the correcting module 202, the projecting module 203, the matching module 204, the determining module 205 and the moving module 206 may be implemented by a processor in the hub point cloud registration apparatus.
It should be noted that: the above-mentioned apparatus provided in the above-mentioned embodiment is only exemplified by the division of the above-mentioned program modules when executing, and in practical application, the above-mentioned processing may be distributed to be completed by different program modules according to needs, that is, the internal structure of the terminal is divided into different program modules to complete all or part of the above-mentioned processing. In addition, the apparatus provided by the above embodiment and the method embodiment belong to the same concept, and the specific implementation process thereof is described in the method embodiment and is not described herein again.
To implement the method of the embodiments of the present invention, the embodiments of the present invention also provide a computer program product, which includes computer instructions stored in a computer readable storage medium. A processor of the computer device reads the computer instructions from the computer-readable storage medium, and the processor executes the computer instructions to cause the computer device to perform the steps of the above-described method.
Based on the hardware implementation of the program module, in order to implement the method according to the embodiment of the present invention, an electronic device (computer device) is also provided in the embodiment of the present invention. Specifically, in one embodiment, the computer device may be a terminal, and its internal structure diagram may be as shown in fig. 3. The computer apparatus includes a processor a01, a network interface a02, a display screen a04, an input device a05, and a memory (not shown in the figure) connected through a system bus. Wherein processor a01 of the computer device is used to provide computing and control capabilities. The memory of the computer device comprises an internal memory a03 and a non-volatile storage medium a 06. The nonvolatile storage medium a06 stores an operating system B01 and a computer program B02. The internal memory a03 provides an environment for the operation of the operating system B01 and the computer program B02 in the nonvolatile storage medium a 06. The network interface a02 of the computer device is used for communication with an external terminal through a network connection. The computer program is executed by the processor a01 to implement the method of any of the above embodiments. The display screen a04 of the computer device may be a liquid crystal display screen or an electronic ink display screen, and the input device a05 of the computer device may be a touch layer covered on the display screen, a button, a trackball or a touch pad arranged on a casing of the computer device, or an external keyboard, a touch pad or a mouse.
Those skilled in the art will appreciate that the architecture shown in fig. 3 is merely a block diagram of some of the structures associated with the disclosed aspects and is not intended to limit the computing devices to which the disclosed aspects apply, as particular computing devices may include more or less components than those shown, or may combine certain components, or have a different arrangement of components.
The device provided by the embodiment of the present invention includes a processor, a memory, and a program stored in the memory and capable of running on the processor, and when the processor executes the program, the method according to any one of the embodiments described above is implemented.
As will be appreciated by one skilled in the art, embodiments of the present application may be provided as a method, a system, or a computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present application has been described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
In a typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
The memory may include forms of volatile memory in a computer readable medium, Random Access Memory (RAM) and/or non-volatile memory, such as Read Only Memory (ROM) or flash memory (flash RAM). The memory is an example of a computer-readable medium.
Computer-readable media, including both permanent and non-permanent, removable and non-removable media, may implement information storage by any method or technology. The information may be computer readable instructions, data structures, modules of a program, or other data. Examples of computer storage media include, but are not limited to, phase change memory (PRAM), Static Random Access Memory (SRAM), Dynamic Random Access Memory (DRAM), other types of Random Access Memory (RAM), Read Only Memory (ROM), Electrically Erasable Programmable Read Only Memory (EEPROM), flash memory or other memory technology, compact disc read only memory (CD-ROM), Digital Versatile Discs (DVD) or other optical storage, magnetic cassettes, magnetic tape or magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information that can be accessed by a computing device. As defined herein, computer readable media does not include transitory computer readable media (transitory media) such as modulated data signals and carrier waves.
It will be appreciated that the memory of embodiments of the invention may be either volatile memory or nonvolatile memory, or may include both volatile and nonvolatile memory. Among them, the nonvolatile memory may be a Read Only Memory (ROM), a Programmable Read Only Memory (PROM), an Erasable Programmable Read-Only Memory (EPROM), an Electrically Erasable Programmable Read-Only Memory (EEPROM), a ferromagnetic random access Memory (FRAM), a Flash Memory, a magnetic surface memory, an optical disk, or a Compact Disc Read-Only Memory (CD-ROM); the magnetic surface memory may be disk storage or tape storage. Volatile memory can be Random Access Memory (RAM), which acts as external cache memory. By way of illustration and not limitation, many forms of RAM are available, such as Static Random Access Memory (SRAM), Synchronous Static Random Access Memory (SSRAM), Dynamic Random Access Memory (DRAM), Synchronous Dynamic Random Access Memory (SDRAM), Double Data Rate Synchronous Dynamic Random Access Memory (DDRSDRAM), Enhanced Synchronous Dynamic Random Access Memory (ESDRAM), SyncLink Dynamic Random Access Memory (SLDRAM), and Direct Rambus Random Access Memory (DRRAM). The memory described for embodiments of the present invention is intended to comprise, without being limited to, these and any other suitable types of memory.
It should also be noted that the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element.
The above are merely examples of the present application and are not intended to limit the present application. Various modifications and changes may occur to those skilled in the art. Any modification, equivalent replacement, improvement, etc. made within the spirit and principle of the present application should be included in the scope of the claims of the present application.

Claims (10)

1. A hub point cloud registration method, the method comprising:
acquiring a hub sampling point cloud obtained by shooting a hub and a preset hub standard point cloud;
performing inclination correction on the hub sampling point cloud to enable the main shaft direction of the hub sampling point cloud to be coincident with the main shaft direction of the hub standard point cloud; wherein, the main shaft direction is the direction of the hub rotating shaft;
projecting the corrected hub sampling point cloud and the corrected hub standard point cloud to a reference plane, and respectively obtaining a sampling point cloud projection image and a standard point cloud projection image; wherein the reference plane is a plane perpendicular to the direction of the main axis;
performing image matching on the sampling point cloud projection image and the standard point cloud projection image to obtain the rotation amount and the translation amount of the sampling point cloud projection image which is moved to be overlapped with the standard point cloud projection image;
after the rotation amount and the translation amount are used for moving the sampling point cloud projection image, determining the main shaft translation amount between the hub sampling point cloud and the hub standard point cloud based on the moved sampling point cloud projection image and the standard point cloud projection image;
moving the hub sampling point cloud according to the rotation amount, the translation amount and the main shaft translation amount to complete the registration of the hub sampling point cloud; wherein,
the performing tilt correction on the hub sampling point cloud so that the main shaft direction of the hub sampling point cloud coincides with the main shaft direction of the hub standard point cloud comprises:
determining the direction of a main shaft of the hub sampling point cloud;
obtaining a main shaft direction vector of the hub sampling point cloud according to the main shaft direction;
and performing tilt correction on the hub sampling point cloud based on the main shaft direction vector, so that the main shaft direction of the hub sampling point cloud coincides with the main shaft direction of the hub standard point cloud.
2. The method of claim 1, wherein the determining the direction of the main shaft of the hub sampling point cloud comprises:
processing the hub sampling point cloud by using a principal component analysis method to obtain a covariance matrix;
performing singular value decomposition on the covariance matrix to obtain a singular value matrix and a singular vector matrix;
and taking the direction of the first column vector of the left singular vector matrix as the main shaft direction of the hub sampling point cloud.
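For illustration only (this sketch is not part of the claims), the principal-axis computation of claim 2 may be written in Python with NumPy, assuming the hub sampling point cloud is an N×3 array; the function name principal_axis is an editorial choice, not taken from the patent.
import numpy as np

def principal_axis(points):
    # points: (N, 3) array holding the hub sampling point cloud
    centered = points - points.mean(axis=0)       # remove the centroid before PCA
    cov = np.cov(centered, rowvar=False)          # 3x3 covariance matrix
    u, s, vt = np.linalg.svd(cov)                 # singular value and singular vector matrices
    return u[:, 0]                                # first column of the left singular vector matrix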
3. The method of claim 1, wherein the obtaining a main shaft direction vector of the hub sampling point cloud according to the main shaft direction comprises:
determining the coordinates of a point farthest from the main shaft direction in the hub sampling point cloud;
determining coordinates of a center of gravity point of the hub sampling point cloud;
and determining a main shaft direction vector of the hub sampling point cloud according to the coordinate of the farthest point, the coordinate of the gravity center point and the main shaft direction.
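Claim 3 leaves the construction abstract; one possible reading, sketched below with invented names, is that the farthest point and the center-of-gravity point fix the sign of the axis returned by principal component analysis, which is otherwise determined only up to sign.
import numpy as np

def oriented_axis(points, axis):
    # points: (N, 3) cloud; axis: unit vector from the PCA step above
    centroid = points.mean(axis=0)                  # center of gravity of the cloud
    offsets = (points - centroid) @ axis            # signed extent of each point along the axis
    farthest = points[np.argmax(np.abs(offsets))]   # point with the largest extent
    sign = 1.0 if (farthest - centroid) @ axis >= 0 else -1.0
    return sign * axis                              # main shaft direction vector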
4. The method of claim 1, wherein the performing tilt correction on the hub sampling point cloud based on the main shaft direction vector so that the main shaft direction of the hub sampling point cloud coincides with the main shaft direction of the hub standard point cloud comprises:
obtaining a rotation transformation matrix from the main shaft direction vector by using the Rodrigues rotation formula;
and transforming the hub sampling point cloud by using the rotation transformation matrix so that the main shaft direction of the hub sampling point cloud coincides with the main shaft direction of the hub standard point cloud.
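A minimal sketch of the Rodrigues step in claim 4, assuming the main shaft of the hub standard point cloud lies along the unit z-axis; the helper below is an illustration under that assumption, not the patent's implementation.
import numpy as np

def rotation_aligning(src_axis, dst_axis=np.array([0.0, 0.0, 1.0])):
    # Rodrigues' rotation formula: rotation matrix turning src_axis onto dst_axis
    a = src_axis / np.linalg.norm(src_axis)
    b = dst_axis / np.linalg.norm(dst_axis)
    k = np.cross(a, b)
    s, c = np.linalg.norm(k), float(np.dot(a, b))   # sine and cosine of the rotation angle
    if s < 1e-12 and c > 0:                         # axes already aligned
        return np.eye(3)
    if s < 1e-12:                                   # antiparallel: rotate 180 degrees
        perp = np.eye(3)[int(np.argmin(np.abs(a)))] # any direction not parallel to a
        k = np.cross(a, perp)
    khat = k / np.linalg.norm(k)
    theta = np.pi if s < 1e-12 else np.arctan2(s, c)
    K = np.array([[0.0, -khat[2], khat[1]],
                  [khat[2], 0.0, -khat[0]],
                  [-khat[1], khat[0], 0.0]])
    return np.eye(3) + np.sin(theta) * K + (1.0 - np.cos(theta)) * (K @ K)

# Usage sketch: corrected_points = sample_points @ rotation_aligning(main_axis).T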
5. The method of claim 1, wherein the projecting the corrected hub sampling point cloud and the hub standard point cloud to a reference plane, respectively obtaining a sampling point cloud projection image and a standard point cloud projection image comprises:
obtaining scale factors on two mutually perpendicular coordinate axes on a reference plane;
and projecting the corrected hub sampling point cloud and the hub standard point cloud to a reference plane by using the scale factor to respectively obtain a sampling point cloud projection image and a standard point cloud projection image.
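As an illustrative sketch of claim 5: after tilt correction the main shaft lies along z, so the reference plane is the x-y plane and the scale factors convert metric x and y coordinates into pixel indices. The image size and the binary occupancy rendering are assumptions made for this example.
import numpy as np

def project_to_plane(points, scale_x, scale_y, width=512, height=512):
    # points: (N, 3) tilt-corrected cloud; returns a binary occupancy image
    img = np.zeros((height, width), dtype=np.uint8)
    px = np.round((points[:, 0] - points[:, 0].min()) * scale_x).astype(int)
    py = np.round((points[:, 1] - points[:, 1].min()) * scale_y).astype(int)
    keep = (px >= 0) & (px < width) & (py >= 0) & (py < height)
    img[py[keep], px[keep]] = 255                   # mark pixels hit by projected points
    return img, px, py                              # per-point pixel coordinates reused later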
6. The method of claim 1, wherein the performing image matching on the sampling point cloud projection image and the standard point cloud projection image to obtain the rotation amount and the translation amount of the sampling point cloud projection image which is moved to be overlapped with the standard point cloud projection image comprises:
acquiring a preset image processing template matching operator;
performing image matching on the sampling point cloud projection image and the standard point cloud projection image by using the image processing template matching operator to obtain a rigid body transformation matrix for moving the sampling point cloud projection image to be overlapped with the standard point cloud projection image; wherein the rigid body transformation matrix comprises a rotation angle for moving the sampling point cloud projection image to be overlapped with the standard point cloud projection image and translation amounts on two mutually perpendicular coordinate axes.
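The claim does not name the image processing template matching operator. As a stand-in sketch only, the example below brute-forces the rotation and recovers the translation with OpenCV's translation-only matchTemplate; the angle step, padding, and scoring metric are arbitrary assumptions.
import cv2
import numpy as np

def rigid_match(sample_img, standard_img, angle_step=1.0, pad=64):
    # Returns (rotation angle in degrees, (tx, ty) in pixels) that best
    # overlays the sampled projection onto the standard projection.
    padded = cv2.copyMakeBorder(standard_img, pad, pad, pad, pad,
                                cv2.BORDER_CONSTANT, value=0)
    h, w = sample_img.shape[:2]
    best_score, best_angle, best_shift = -1.0, 0.0, (0, 0)
    for angle in np.arange(0.0, 360.0, angle_step):
        rot = cv2.getRotationMatrix2D((w / 2.0, h / 2.0), angle, 1.0)
        rotated = cv2.warpAffine(sample_img, rot, (w, h))
        res = cv2.matchTemplate(padded, rotated, cv2.TM_CCOEFF_NORMED)
        _, score, _, loc = cv2.minMaxLoc(res)
        if score > best_score:
            best_score = score
            best_angle = angle
            best_shift = (loc[0] - pad, loc[1] - pad)
    return best_angle, best_shift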
7. The method of claim 1, wherein the determining the main shaft translation amount between the hub sampling point cloud and the hub standard point cloud based on the moved sampling point cloud projection image and the standard point cloud projection image comprises:
determining a first overlapping area in an overlapping area between the moved sampling point cloud projection image and the standard point cloud projection image; the range of the first overlapping area is a preset range;
determining a sampling three-dimensional point set and a standard three-dimensional point set corresponding to the first overlapping area in the hub sampling point cloud and the hub standard point cloud respectively;
and determining the main shaft translation amount between the hub sampling point cloud and the hub standard point cloud according to the sampling three-dimensional point set and the standard three-dimensional point set.
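A sketch of claim 7 under the same assumptions as the earlier examples (main shaft along z, binary projection images, per-point pixel coordinates kept from the projection step); the inter-quartile window standing in for the preset range and the mean-z difference are editorial choices, not the patent's.
import numpy as np

def main_axis_translation(sample_pts, sample_px, sample_py,
                          standard_pts, standard_px, standard_py,
                          moved_sample_img, standard_img):
    # sample_px, sample_py: pixel coordinates of the sample points AFTER the
    # 2-D rotation and translation obtained in the matching step of claim 6.
    ys, xs = np.nonzero((moved_sample_img > 0) & (standard_img > 0))  # overlapping pixels
    if ys.size == 0:
        return 0.0
    # "First overlapping area": a preset window, here the inter-quartile box
    x0, x1 = np.percentile(xs, [25, 75])
    y0, y1 = np.percentile(ys, [25, 75])

    def inside(px, py):
        return (px >= x0) & (px <= x1) & (py >= y0) & (py <= y1)

    sel_s = inside(sample_px, sample_py)            # sampling three-dimensional point set
    sel_t = inside(standard_px, standard_py)        # standard three-dimensional point set
    if not sel_s.any() or not sel_t.any():
        return 0.0
    return float(standard_pts[sel_t, 2].mean() - sample_pts[sel_s, 2].mean())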
8. A hub point cloud registration apparatus, comprising:
the acquisition module is used for acquiring a hub sampling point cloud obtained by shooting a hub and a preset hub standard point cloud;
the correction module is used for performing tilt correction on the hub sampling point cloud so that the main shaft direction of the hub sampling point cloud coincides with the main shaft direction of the hub standard point cloud; wherein the main shaft direction is the direction of the hub rotating shaft;
the projection module is used for projecting the corrected hub sampling point cloud and the corrected hub standard point cloud to a reference plane to respectively obtain a sampling point cloud projection image and a standard point cloud projection image; wherein the reference plane is a plane perpendicular to the direction of the main axis;
the matching module is used for carrying out image matching on the sampling point cloud projection image and the standard point cloud projection image to obtain the rotation amount and the translation amount of the sampling point cloud projection image which is moved to be overlapped with the standard point cloud projection image;
the determining module is used for determining the main shaft translation amount between the hub sampling point cloud and the hub standard point cloud based on the moved sampling point cloud projection image and the standard point cloud projection image after the sampling point cloud projection image is moved by the rotation amount and the translation amount;
the moving module is used for moving the hub sampling point cloud according to the rotation amount, the translation amount and the main shaft translation amount so as to complete the registration of the hub sampling point cloud; wherein,
the correction module is also used for determining the main shaft direction of the hub sampling point cloud; obtaining a main shaft direction vector of the hub sampling point cloud according to the main shaft direction; and performing tilt correction on the hub sampling point cloud based on the main shaft direction vector, so that the main shaft direction of the hub sampling point cloud coincides with the main shaft direction of the hub standard point cloud.
9. An electronic device, comprising: a processor and a memory for storing a computer program capable of running on the processor; wherein,
the processor is adapted to perform the steps of the method of any one of claims 1 to 7 when running the computer program.
10. A storage medium having a computer program stored thereon, wherein the computer program, when executed by a processor, performs the steps of the method of any one of claims 1 to 7.
CN202111025763.8A 2021-09-02 2021-09-02 Hub point cloud registration method and device, electronic equipment and storage medium Active CN113470091B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111025763.8A CN113470091B (en) 2021-09-02 2021-09-02 Hub point cloud registration method and device, electronic equipment and storage medium

Publications (2)

Publication Number Publication Date
CN113470091A CN113470091A (en) 2021-10-01
CN113470091B (en) 2021-11-30

Family

ID=77867405

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111025763.8A Active CN113470091B (en) 2021-09-02 2021-09-02 Hub point cloud registration method and device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN113470091B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113674425B (en) * 2021-10-25 2022-02-15 深圳市信润富联数字科技有限公司 Point cloud sampling method, device, equipment and computer readable storage medium
CN114529652B (en) * 2022-04-24 2022-07-19 深圳思谋信息科技有限公司 Point cloud compensation method, device, equipment and storage medium
CN118010000A (en) * 2024-04-09 2024-05-10 江苏兴力工程管理有限公司 A method for detecting verticality of high-voltage towers based on laser point cloud
CN119339004A (en) * 2024-12-19 2025-01-21 国网江苏省电力有限公司建设分公司 A point cloud replacement method for substation equipment based on principal component analysis

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101645170B (en) * 2009-09-03 2011-07-20 北京信息科技大学 Precise registration method of multilook point cloud
US10482196B2 (en) * 2016-02-26 2019-11-19 Nvidia Corporation Modeling point cloud data using hierarchies of Gaussian mixture models
GB2559157A (en) * 2017-01-27 2018-08-01 Ucl Business Plc Apparatus, method and system for alignment of 3D datasets
US10671082B2 (en) * 2017-07-03 2020-06-02 Baidu Usa Llc High resolution 3D point clouds generation based on CNN and CRF models
CN111819601A (en) * 2018-02-26 2020-10-23 英特尔公司 Method and system for point cloud registration for image processing
CN112581457B (en) * 2020-12-23 2023-12-12 武汉理工大学 Pipeline inner surface detection method and device based on three-dimensional point cloud
CN112700537B (en) * 2020-12-31 2024-12-03 广东美的白色家电技术创新中心有限公司 Tire point cloud construction method, assembly method, control device, and storage medium

Also Published As

Publication number Publication date
CN113470091A (en) 2021-10-01

Similar Documents

Publication Publication Date Title
CN113470091B (en) Hub point cloud registration method and device, electronic equipment and storage medium
CN110780285B (en) Pose calibration method, system and medium for laser radar and combined inertial navigation
CN113052905B (en) Round target pose measurement method and device based on binocular inverse projection transformation
CN111415387A (en) Camera pose determining method and device, electronic equipment and storage medium
CN111400830B (en) Machining calibration method and device for three-dimensional blank workpiece
CN113012226B (en) Method and device for estimating pose of camera, electronic equipment and computer storage medium
CN111754579A (en) Method and device for determining external parameters of multi-view camera
US20230025058A1 (en) Image rectification method and device, and electronic system
CN116109686A (en) Point cloud registration method, equipment and medium
CN109685841B (en) Registration method and system of three-dimensional model and point cloud
CN117788529B (en) Three-dimensional plane point cloud coarse registration method, system, medium and equipment
Sun et al. An orthogonal iteration pose estimation algorithm based on an incident ray tracking model
CN112631200A (en) Machine tool axis measuring method and device
CN117788580A (en) Target distance and azimuth measuring method and device based on monocular camera
CN114119684A (en) Marker registration method based on tetrahedral structure
CN108253931B (en) Binocular stereo vision ranging method and ranging device thereof
CN113362328B (en) Point cloud picture generation method and device, electronic equipment and storage medium
CN113570659B (en) Shooting device pose estimation method, device, computer equipment and storage medium
CN116385557A (en) Camera calibration method, camera calibration device, computer equipment and storage medium
CN115049813A (en) Coarse registration method, device and system based on first-order spherical harmonics
CN115294280A (en) Three-dimensional reconstruction method, apparatus, device, storage medium, and program product
JP2019067004A (en) IMAGE PROCESSING METHOD, IMAGE PROCESSING DEVICE, AND IMAGE PROCESSING PROGRAM
CN112907669A (en) Camera pose measuring method and device based on coplanar feature points
CN119357176B (en) Method and system for correcting deviation of geographic entity position
Zhang et al. Camera parameter calibration based on the correction of center positioning deviation in machine vision

Legal Events

Code Title
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant