
CN111145232A - An automatic registration method of 3D point cloud based on the change degree of feature information - Google Patents


Info

Publication number
CN111145232A
CN111145232A
Authority
CN
China
Prior art keywords
point
point cloud
feature
points
matching
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201911299529.7A
Other languages
Chinese (zh)
Inventor
达飞鹏
黄源
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Southeast University
Original Assignee
Southeast University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Southeast University
Priority to CN201911299529.7A
Publication of CN111145232A
Pending legal-status Critical Current

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00: Image analysis
    • G06T 7/30: Determination of transform parameters for the alignment of images, i.e. image registration
    • G06T 7/35: Determination of transform parameters for the alignment of images, i.e. image registration, using statistical methods
    • G06T 7/33: Determination of transform parameters for the alignment of images, i.e. image registration, using feature-based methods
    • G06T 7/344: Determination of transform parameters for the alignment of images, i.e. image registration, using feature-based methods involving models
    • G06T 2207/00: Indexing scheme for image analysis or image enhancement
    • G06T 2207/10: Image acquisition modality
    • G06T 2207/10028: Range image; Depth image; 3D point clouds

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Evolutionary Biology (AREA)
  • Probability & Statistics with Applications (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses a three-dimensional point cloud automatic registration method based on the degree of change of feature information. The objects processed are two or more three-dimensional point clouds with mutually overlapping parts. The processing steps are: (1) selecting feature points according to the degree of change of the local normal vectors of the point cloud; (2) designing three feature metrics to describe each feature point obtained; (3) comparing the feature descriptions of the feature points under threshold constraints to obtain initial matching point pairs; (4) using a rigid distance constraint to obtain accurate matching point pairs, and calculating the initial registration parameters with the four-element method; (5) accurately registering the point clouds with an improved ICP (iterative closest point) algorithm. Following these steps, the point clouds can be registered automatically; the feature description proposed by the invention is simple and highly distinctive, has high robustness, and improves the registration accuracy and speed to a certain extent.

Description

Three-dimensional point cloud automatic registration method based on characteristic information change degree
Technical Field
The invention relates to a three-dimensional point cloud automatic registration method based on characteristic information change degree, and belongs to the field of three-dimensional information reconstruction.
Background
Three-dimensional reconstruction of object surfaces has long been an important research topic in machine vision. An optical scanner can quickly acquire point cloud data of the surface of a measured object, but because light travels in straight lines, complete surface data must be collected through multiple measurements from multiple viewing angles. The data acquired in this way are not expressed in the same coordinate system, so in order to obtain a complete model of the object, the data acquired from each viewing angle must undergo a coordinate transformation and finally be merged into a unified coordinate system; this process is point cloud registration. Point cloud registration is widely applied in robot navigation, reverse engineering, surface shape inspection, virtual reality and other fields.
Depending on the input and output of the depth maps to be registered, existing depth map registration methods can be roughly divided into two categories: coarse registration and fine registration. Coarse registration finds an approximate coordinate transformation without any prior knowledge and unifies the depth maps of two viewpoints into the same coordinate system. Since no information about the relative position of the depth maps is available before registration, coarse registration algorithms usually revolve around how to establish correspondences between elements of the two depth maps that lie at the same position on the model, and many scholars have proposed different methods for this. According to the corresponding elements, these methods can be divided into point-to-point, line-to-line, plane-to-plane and volume-to-volume correspondences; according to the matching criterion, into relations based on Euclidean distance and relations based on descriptors; and according to the search range of the matching, into non-feature matching and feature matching. However, because measured models are diverse and complex, coarse registration algorithms often depend on specific applications, and further research aims to improve their accuracy, efficiency, robustness and applicability. The result of coarse registration usually provides only an approximation of the exact coordinate transformation between the depth maps, so the overlapping regions of the two depth maps fit only to a certain degree. Because the position transformation is approximate, the overlapping regions are difficult to fit accurately, and interleaving and layering phenomena often remain, which hinders the subsequent fusion of the depth map data; the position of the depth map therefore needs to be further adjusted to improve the registration accuracy, and this process is called fine registration. Fine registration starts from the approximate coordinate transformation obtained by coarse registration and achieves a more accurate registration by iteratively minimizing the distance between corresponding points on the two depth maps; the representative algorithm is the classic ICP algorithm.
Disclosure of Invention
The invention provides an automatic registration method based on the degree of change of feature information, aiming at the problems of the distinctiveness of feature descriptions, registration accuracy and robustness in three-dimensional point cloud automatic registration. On the basis of the computed normal vectors, the point cloud data is simplified by introducing the Sine of Normal (SiN) to extract a feature point set, which reduces the subsequent amount of computation; corresponding point pairs are then sought through three geometric features; effective, accurate matching point pairs are obtained by combining a rigid distance constraint with the RANSAC algorithm and are used to compute the initial registration parameters; and finally a second, fine alignment is performed with an improved ICP algorithm. The method improves the accuracy and speed of registration and has high robustness.
The invention adopts the following technical scheme for solving the technical problems:
the invention provides a three-dimensional point cloud automatic registration method based on characteristic information change degree, which comprises the following steps:
step 1, acquiring feature point sets Pt and Qt of two point cloud data P and Q to be registered based on normal vector information of different neighborhood radii, wherein Q is reference point cloud;
step 2, establishing the feature description of each feature point in Pt and Qt;
step 3, based on the characteristic description of Pt and Qt obtained in step 2, finding the most similar point for each point in Pt in Qt as a matching point to obtain an initial matching point pair set;
step 4, combining the initial matching point pair set obtained in the step 3 with the distance constraint condition between points, and using a self-adaptive RANSAC algorithm to obtain an accurate matching point pair set;
step 5, calculating an initial registration parameter by using a four-element method;
step 6, performing point cloud registration by adopting an improved iterative closest point (ICP) algorithm.
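By way of illustration only, the six steps above could be chained as in the following Python/NumPy sketch. The helper names and signatures are assumptions introduced for readability and are not part of the claimed method; each helper is sketched in the detailed description further below.

```python
import numpy as np

def register_point_clouds(P, Q, r1, r2, gamma, eps1=0.1, eps_f=(0.01, 0.01, 0.01), eps5=0.02):
    """Hypothetical end-to-end driver for steps 1-6; P, Q are (m,3)/(n,3) arrays,
    Q is the reference point cloud. All helper functions are illustrative sketches."""
    # step 1: feature point sets selected from the change of normals under two radii
    Pt_idx, Pn, Pf = extract_feature_points(P, r1, r2, eps1)
    Qt_idx, Qn, Qf = extract_feature_points(Q, r1, r2, eps1)
    # step 2: three geometric feature quantities per feature point
    fP = describe_features(P, Pt_idx, Pn, Pf, gamma)
    fQ = describe_features(Q, Qt_idx, Qn, Qf, gamma)
    # step 3: threshold-constrained initial matching point pairs (Matchdots)
    matchdots = [(Pt_idx[g], Qt_idx[j]) for g, j in initial_matches(fP, fQ, eps_f)]
    # step 4: rigid distance constraint + adaptive RANSAC -> accurate pairs (Matchdots1)
    num_g = distance_constraint_scores(P, Q, matchdots, eps5)
    exact = ransac_exact_pairs(P, Q, matchdots, num_g)
    # step 5: four-element (quaternion) method -> initial parameters R0, T0
    R0, T0 = quaternion_registration(P[exact[:, 0]], Q[exact[:, 1]])
    # step 6: improved ICP refinement starting from (R0, T0)
    return improved_icp(P, Q, R0, T0)
```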
As a further technical solution of the present invention, step 1 specifically comprises:
step 1.1, the feature degree of a point p in the point cloud data is f(p) = |sin θ|, where θ is the angle between the normal vectors of p estimated under two different neighborhood radii r1 and r2, with r1 ≠ r2;
step 1.2, points of the point cloud data whose feature degree is greater than a set threshold ε1 are extracted as the feature points of the point cloud data;
step 1.3, feature points are extracted from P and Q by the methods of step 1.1 and step 1.2, respectively, giving the feature point set of P as Pt = {pt1, pt2, pt3, …, ptm'} and the feature point set of Q as Qt = {qt1, qt2, qt3, …, qtn'}, where m' and n' are the numbers of feature points of P and Q, respectively.
As a further technical solution of the present invention, step 2 specifically is:
step 2.1, for the i'-th point pt_i' in Pt, the feature degree of the point is taken as the first feature quantity f1(pt_i') at that point;
step 2.2, for pt_i', the points of P lying in the spherical region centered at pt_i' with radius γ are taken as the neighborhood of pt_i'; the neighborhood center of gravity O(pt_i') is computed, and the distance from pt_i' to O(pt_i') is taken as the second feature quantity of the point, f2(pt_i') = ||O(pt_i') − pt_i'||;
step 2.3, for pt_i', the angle between the line joining pt_i' and O(pt_i') and the normal vector n(pt_i'), obtained as an arccosine, is taken as the third feature quantity of the point:
f3(pt_i') = arccos( (O(pt_i') − pt_i') · n(pt_i') / ( ||O(pt_i') − pt_i'|| · ||n(pt_i')|| ) )
step 2.4, for pt_i', if there is a j'-th point qt_j' in Qt that satisfies the following three conditions, then qt_j' is the corresponding point of pt_i':
|f1(pt_i') − f1(qt_j')| / (f1(pt_i') + f1(qt_j')) ≤ ε2
|f2(pt_i') − f2(qt_j')| / (f2(pt_i') + f2(qt_j')) ≤ ε3
|f3(pt_i') − f3(qt_j')| / (f3(pt_i') + f3(qt_j')) ≤ ε4
where ε2, ε3 and ε4 are all set thresholds.
As a further technical solution of the present invention, step 3 specifically is:
based on the information obtained in step 2, a corresponding point is found in Qt for each point in Pt as its initial matching point; the initial matching point pair set is
Matchdots = {(p_g, q_g) | p_g ∈ Pt, q_g ∈ Qt, g = 1, 2, 3, …, num(Matchdots)}
where num(Matchdots) is the number of initial matching point pairs.
As a further technical solution of the present invention, step 4 specifically is:
step 4.1: for point pairs (p_g, q_g) and (p_h, q_h) in Matchdots, if
|dist(p_g, p_h) − dist(q_g, q_h)| ≤ ε5
is satisfied, then (p_g, q_g) and (p_h, q_h) meet the inter-point distance constraint; for each pair (p_g, q_g), the number num_g of matching point pairs in Matchdots that meet the inter-point distance constraint with (p_g, q_g) is calculated, where h = 1, 2, 3, …, num(Matchdots), dist(·,·) denotes the distance between two points, and ε5 is a set threshold;
step 4.2: the matching point pairs corresponding to the N largest values in the calculation result of step 4.1 are selected, and the RANSAC algorithm is used to check whether these N matching point pairs are correct matching point pairs; the specific checking method is as follows:
1) randomly selecting 3 matching point pairs from the N matching point pairs as a sample;
2) taking the 3 matching point pairs selected in 1) as correct matching point pairs, and calculating the rigid body transformation matrix U;
3) judging whether the remaining N−3 matching point pairs are correctly matched under the rigid body transformation matrix U of 2): for any pair (s1, s2) among the remaining N−3 matching point pairs, if ||U(s1) − s2|| is less than the predefined threshold, (s1, s2) is considered a correct matching point pair and is recorded as an inlier; otherwise (s1, s2) is a wrong matching point pair and is recorded as an outlier; all inliers form the consensus set of this sampling; here U(s1) denotes the new coordinate point obtained from s1 through the rigid body transformation matrix;
4) repeating steps 1) to 3) until the number of iterations reaches the upper limit of the sampling times; finally, the rigid body transformation matrix corresponding to the consensus set with the largest number of inliers obtained by the iterations is taken as the correct rigid body transformation matrix, and according to this correct rigid body transformation matrix the inliers in Matchdots are obtained as the final accurate matching point pairs; the accurate matching point pair set is:
Matchdots1 = {(m_e, m'_e) | m_e ∈ P, m'_e ∈ Q, e = 1, 2, 3, …, t}
where (m_e, m'_e) is an accurate matching point pair and t is the number of accurate matching point pairs.
As a further technical solution of the present invention, the calculation of the initial registration parameter in step 5 specifically includes:
1): separately compute a set of points { me|meE.g. P, e 1, 2.e|m'eE.g., centroid of Q, e 1, 2.
Figure BDA0002321510800000041
Wherein μ, μ' are each point me,m'eThe center of mass of;
2): respectively set points { me|meE.g. P, e 1, 2.e|m'eE Q, e 1,2, t, do a translation relative to the centroid:
Figure BDA0002321510800000042
wherein s ise,s'eAre respectively a point me,m'eNew feature points after the relative centroid translation;
3): according to the point set seAnd { s'eComputing a correlation matrix K:
Figure BDA0002321510800000043
4): by the elements K in the correlation matrix KbcConstructing a four-dimensional symmetric matrix Ka
Figure BDA0002321510800000044
Wherein b, c is 1,2,3, 4;
5): calculating KaThe unit feature vector q corresponding to the maximum feature root:
q=[q0q1q2q3]T
6): calculating a rotation matrix R0
Figure BDA0002321510800000051
7): solving translation vector T0
T0=μ'-Rμ。
As a further technical solution of the present invention, step 6 specifically is:
step 6.1, using the initial registration parameters R0 and T0 obtained in step 5, each point in the target point cloud P is transformed into the coordinate system of the reference point cloud Q; the initial target point cloud set of the ICP algorithm obtained after the transformation is P0 = {p01, p02, …, p0m}, where p0i = R0·pi + T0, pi is the i-th point in the target point cloud P, and i = 1, 2, …, m;
step 6.2, for any point in P0, the center of gravity of the tetrahedron formed by its four nearest points on Q in terms of Euclidean distance is obtained through the polyhedron center-of-gravity formula; the i-th point of P0 and its corresponding center of gravity form the initial closest point pair (p0i, q0i) of the ICP algorithm;
step 6.3, iterative calculation is performed according to the iterative formula P_{s+1} = R_{s+1}·P_s + T_{s+1} until the set convergence condition d_s − d_{s+1} < E is satisfied, and the final registration parameters obtained are used to register the target point cloud P, giving an accurate registration result; here E is a set threshold,
d_s = (1/m) Σ_{i=1}^{m} ||p_si − q_si||
(p_(s−1)i, q_(s−1)i) and (p_si, q_si) are respectively the nearest-neighbor point pairs obtained, by the method of steps 6.1 and 6.2, from the registration parameters of the (s−1)-th and the s-th iterations; R_s and R_{s+1} are the rotation matrices obtained after the s-th and the (s+1)-th iterations; T_s and T_{s+1} are the translation vectors obtained after the s-th and the (s+1)-th iterations; P_s and P_{s+1} are the point cloud sets of the target point cloud P after s and s+1 iterations, respectively; and d_s and d_{s+1} are the average distances from P_s and P_{s+1} to Q, respectively.
Beneficial effects:
Compared with other feature-based registration algorithms that use only a single feature value (such as curvature or the distance to the center of gravity) to describe the point cloud, the invention describes the point cloud with several geometric features, which improves the distinctiveness of the points and the noise resistance of the method, reduces the influence of the point cloud topological structure, and improves the registration accuracy and speed to a certain extent. Compared with other feature-based registration techniques, the present invention has the following advantages:
Firstly, the method selects the feature point set based on the degree of change of the normal vectors under different neighborhood radii instead of using all points for point pair matching, which saves processing time, removes points with relatively indistinct features, improves the accuracy of point pair matching, and avoids wrong matching point pairs caused by too many similar candidate matches.
Secondly, several geometric features are designed for acquiring feature point pairs, which improves the distinctiveness of the feature description; the features are resistant to noise and little affected by the point cloud topological structure, so the accuracy of point pair matching is effectively improved.
Finally, accurate matching point pairs are obtained by combining the rigid distance constraint with the RANSAC algorithm, and an accurate registration result is then solved iteratively with the improved ICP algorithm; the algorithm is fast and effective.
In conclusion, the method achieves automatic registration of partially overlapping point cloud data; the point clouds do not need any prior information, no manual intervention is needed during registration, the initial registration effect is good, the second (fine) registration result is more accurate, and the registration requirements of multi-view point cloud data are met.
Drawings
FIG. 1 is a flow chart of the overall process of the present invention.
FIG. 2 is a schematic diagram of the bunny point cloud with normal information in whole and in part.
FIG. 3 is a schematic diagram of bunny point cloud feature point extraction.
FIG. 4 is a schematic diagram of the connecting lines of the initial matching point pairs between two viewing angles of the bunny point cloud.
FIG. 5 is a schematic diagram of the connecting lines of the matching point pairs between two viewing angles of the bunny point cloud after the erroneous point pairs are removed.
Fig. 6 is the initial registration result of two view angles of bunny point cloud.
Fig. 7 is a comparison diagram of the initial registration and the accurate registration result of two viewing angles of the bunny point cloud.
Detailed Description
The embodiments of the present invention are further described below with reference to the drawings. Visual Studio 2017 is used as the programming tool under the Windows operating system, and the multi-view point cloud data acquired by the three-dimensional measuring equipment is registered. In this example, the bunny point cloud data of Stanford University is used, and a relatively accurate registration result is finally obtained.
FIG. 1 is a flow chart of the overall process of the present invention.
The invention provides a point cloud automatic registration algorithm based on the degree of change of feature information, aiming at the problems of low feature distinctiveness, sensitivity to noise and high requirements on the point cloud topological structure in existing feature-based automatic registration algorithms. On the basis of the computed normal vectors, the point cloud data is simplified by introducing the Sine of Normal (SiN) to extract a feature point set, which reduces the subsequent amount of computation; corresponding point pairs are sought through three geometric features; accurate matching point pairs are then obtained by applying the RANSAC algorithm under a rigid distance constraint; the initial registration parameters are calculated with the four-element method; and finally the point cloud is accurately registered with an improved ICP algorithm.
The method comprises the following specific implementation steps:
Step 1: the feature point sets of the two point clouds to be matched are obtained based on normal vector information under different neighborhood radii, as follows:
step 1.1: an existing three-dimensional scanner is used to obtain multi-view point cloud data with normal vector information, and the point clouds obtained from adjacent views overlap. For a point p in the point cloud, its normal vectors under two different neighborhood radii r1 and r2 (r1 ≠ r2) can therefore be computed, and the feature degree f of the point p is defined as the absolute value of the sine of the angle θ between the two normal vectors:
f(p) = |sin θ|
step 1.2: a suitable threshold ε1 is selected, the relatively flat parts of the point cloud are removed, and the points with f > ε1 are retained.
step 1.3: let the two point cloud data to be registered be P and Q respectively, where Q is the reference point cloud; the feature point extraction method given in step 1.1 and step 1.2 is applied to each point cloud, giving the feature point set of P as Pt = {pt1, pt2, pt3, …, ptm'} and the feature point set of Q as Qt = {qt1, qt2, qt3, …, qtn'}, where m' and n' are the numbers of feature points of P and Q, respectively.
FIG. 2 is a schematic diagram of the bunny point cloud with normal information in whole and in part.
FIG. 3 is a schematic diagram of bunny point cloud feature point extraction.
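By way of illustration, step 1 could be implemented as in the following Python/NumPy sketch. The PCA-based normal estimation and the SciPy k-d tree are assumptions of this sketch; the invention itself only requires normal vectors estimated under two different neighborhood radii and the |sin θ| test.

```python
import numpy as np
from scipy.spatial import cKDTree

def _pca_normal(neighbors):
    # normal direction = right singular vector of the smallest singular value
    centered = neighbors - neighbors.mean(axis=0)
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return vt[-1]

def extract_feature_points(cloud, r1, r2, eps1):
    """Sketch of steps 1.1-1.3: keep points whose feature degree |sin θ| exceeds ε1."""
    tree = cKDTree(cloud)
    idx, normals, degrees = [], [], []
    for i, p in enumerate(cloud):
        nb1 = tree.query_ball_point(p, r1)
        nb2 = tree.query_ball_point(p, r2)
        if len(nb1) < 3 or len(nb2) < 3:              # too few neighbors for a stable normal
            continue
        n1, n2 = _pca_normal(cloud[nb1]), _pca_normal(cloud[nb2])
        sin_theta = np.linalg.norm(np.cross(n1, n2))  # |sin θ| for unit normals
        if sin_theta > eps1:
            idx.append(i)
            normals.append(n1)
            degrees.append(sin_theta)
    return np.array(idx), np.array(normals), np.array(degrees)
```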
Step 2: establishing the feature description of the feature point set, wherein the method comprises the following steps:
Step 2.1: for the i'-th point pt_i' in the point set Pt, the feature degree of the point is taken as the first feature quantity f1(pt_i') at that point;
Step 2.2: for the i'-th point pt_i' in the point set Pt, the points of P lying in the spherical region centered at pt_i' with radius γ are taken as the neighboring points of pt_i'; the neighborhood center of gravity O(pt_i') is computed, and the distance from pt_i' to O(pt_i') is taken as the second feature quantity of the point, recorded as:
f2(pt_i') = ||O(pt_i') − pt_i'||
Step 2.3: for the i'-th point pt_i' in the point set Pt, the angle between the line joining pt_i' and O(pt_i') and the normal vector n(pt_i'), obtained as an arccosine, is taken as the third feature quantity of the point, recorded as:
f3(pt_i') = arccos( (O(pt_i') − pt_i') · n(pt_i') / ( ||O(pt_i') − pt_i'|| · ||n(pt_i')|| ) )
Step 2.4: for the i'-th point pt_i' in the point set Pt, a corresponding point is searched for in the point set Qt. If qt_j' is the corresponding point of pt_i', the three geometric features proposed above should be the same or approximately the same for the two points; therefore pt_i' in Pt and qt_j' in Qt are considered to correspond as long as the following three conditions are satisfied:
|f1(pt_i') − f1(qt_j')| / (f1(pt_i') + f1(qt_j')) ≤ ε2
|f2(pt_i') − f2(qt_j')| / (f2(pt_i') + f2(qt_j')) ≤ ε3
|f3(pt_i') − f3(qt_j')| / (f3(pt_i') + f3(qt_j')) ≤ ε4
where ε2 = ε3 = ε4 = 0.01; the point correspondences are initially established by filtering with these three relations.
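By way of illustration, the three feature quantities of step 2 could be computed as in the following sketch; reusing the step-1 feature degree as f1 and the array layout are implementation assumptions.

```python
import numpy as np
from scipy.spatial import cKDTree

def describe_features(cloud, feat_idx, feat_normals, feat_degrees, gamma):
    """Sketch of steps 2.1-2.3: (f1, f2, f3) for every feature point; feat_degrees
    are the |sin θ| values already computed in step 1."""
    tree = cKDTree(cloud)
    descriptors = np.zeros((len(feat_idx), 3))
    for k, i in enumerate(feat_idx):
        p, n = cloud[i], feat_normals[k]
        nbrs = cloud[tree.query_ball_point(p, gamma)]   # points in the γ-sphere around p
        o = nbrs.mean(axis=0)                           # neighborhood center of gravity O
        d = o - p
        f1 = feat_degrees[k]                            # f1: feature degree of the point
        f2 = np.linalg.norm(d)                          # f2: distance from p to O
        cos_ang = np.dot(d, n) / (f2 * np.linalg.norm(n) + 1e-12)
        f3 = np.arccos(np.clip(cos_ang, -1.0, 1.0))     # f3: angle between (O - p) and the normal
        descriptors[k] = (f1, f2, f3)
    return descriptors
```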
Step 3: based on the information obtained in step 2, a corresponding point is found in Qt for each point in Pt as its initial matching point; the initial matching point pair set is recorded as
Matchdots = {(p_g, q_g) | p_g ∈ Pt, q_g ∈ Qt, g = 1, 2, 3, …, num(Matchdots)}
where num(Matchdots) is the number of initial matching point pairs.
FIG. 4 is a schematic diagram of two pairwise view-angle initial matching point-to-connecting lines of the bunny point cloud.
Step 4: combining the inter-point distance constraint, an adaptive RANSAC algorithm is used to obtain accurate matching point pairs, as follows:
Step 4.1: if any two point pairs (p_g, q_g) and (p_h, q_h) in Matchdots are both correct matching point pairs, then according to the inter-point distance constraint it is easy to obtain
dist(p_g, p_h) = dist(q_g, q_h)
Since it is difficult to find exactly corresponding point pairs in two discrete point clouds, a correct matching point pair here is only an approximately corresponding point pair, i.e. it only needs to satisfy
|dist(p_g, p_h) − dist(q_g, q_h)| ≤ ε5
An appropriate ε5 > 0 is selected (ε5 is taken as 0.02); for point pairs (p_g, q_g) and (p_h, q_h) in Matchdots, if |dist(p_g, p_h) − dist(q_g, q_h)| ≤ ε5 is satisfied, then (p_g, q_g) and (p_h, q_h) meet the inter-point distance constraint.
For each pair (p_g, q_g), the number num_g of point pairs in Matchdots, other than (p_g, q_g) itself, that meet the inter-point distance constraint with it is calculated.
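The num_g statistic of step 4.1 can be computed directly from the pairwise distance matrices of the matched points, as in the following sketch.

```python
import numpy as np

def distance_constraint_scores(P, Q, matchdots, eps5=0.02):
    """Sketch of step 4.1: num_g for every pair in Matchdots."""
    pairs = np.asarray(matchdots)
    pP = P[pairs[:, 0]]            # matched points in the target cloud
    pQ = Q[pairs[:, 1]]            # matched points in the reference cloud
    dP = np.linalg.norm(pP[:, None, :] - pP[None, :, :], axis=-1)   # pairwise distances in P
    dQ = np.linalg.norm(pQ[:, None, :] - pQ[None, :, :], axis=-1)   # pairwise distances in Q
    consistent = np.abs(dP - dQ) <= eps5
    np.fill_diagonal(consistent, False)    # exclude the pair itself
    return consistent.sum(axis=1)          # num_g for every pair g
```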
Step 4.2: because a certain number of correct matching point pairs exist in the matching point pairs obtained after the initial matching, in general, the number num of the point pairs conforming to the distance constraint is obtained by the correct matching point pairsgIs relatively large; the num proposed above is calculated for each point pairgValue, then in numgSorting Matchdots from large to small, and selecting the first N point pairs; the calculation of the rigid body transformation matrix requires at least 3 pairs of matching points which are not collinear in the three-dimensional space.
The RANSAC algorithm is used herein to check whether the N point pairs are correct matching point pairs, where N is generally 50 in the experiment, and the specific checking method is as follows:
1) randomly selecting 3 point pairs from the N point pairs as a sample;
2) considering the 3 point pairs as correct matching point pairs, and calculating a rigid body transformation matrix U;
3) judging whether the rest N-3 point pairs are correctly matched under the rigid body transformation matrix U: for any point(s) in the remaining N-3 matching points1,s2) If | | | U(s)1),s2If | is less than a predefined threshold (this threshold is typically set to 2 times the average distance between points of the point cloud), then(s) is considered1,s2) If the matching point pair is correct, marking as an inner point; otherwise,(s)1,s2) And the matching point pairs are wrong and are marked as outer points, and all the inner points form a consistent set of the sampling. Wherein, U(s)1) Denotes s1And calculating a new coordinate point through the rigid body transformation matrix.
4) Repeating the steps 3 until the cycle times reach the upper limit of the sampling times, and finally taking the rigid body transformation matrix corresponding to the consistent set with the maximum number of the internal points as a correct rigid body transformation matrix. And obtaining inner points in Matchdots as final accurate matching point pairs according to the transformation matrix, wherein the accurate matching point pairs are recorded as:
Matchdots1={(me,m'e)|me∈P,m'e∈Q,e=1,2,3…t}
wherein t is the number of precisely matched point pairs;
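By way of illustration, the verification of step 4.2 might be implemented as follows; estimating the rigid body transformation U from the 3 sampled pairs by SVD and estimating the inlier threshold from the average point spacing are assumptions of this sketch (the text only states that U is calculated and that the threshold is about twice the average inter-point distance).

```python
import numpy as np
from scipy.spatial import cKDTree

def rigid_from_pairs(src, dst):
    # Least-squares rigid transform via SVD (Kabsch); an implementation choice.
    cs, cd = src.mean(axis=0), dst.mean(axis=0)
    H = (src - cs).T @ (dst - cd)
    U_, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U_.T
    if np.linalg.det(R) < 0:               # correct an improper rotation (reflection)
        Vt[-1] *= -1
        R = Vt.T @ U_.T
    return R, cd - R @ cs

def ransac_exact_pairs(P, Q, matchdots, num_g, n_best=50, thresh=None, iters=200, seed=0):
    """Sketch of step 4.2: verify the N best-scoring pairs and return the inlier pairs."""
    rng = np.random.default_rng(seed)
    order = np.argsort(num_g)[::-1][:n_best]         # the N pairs with the largest num_g
    cand = np.asarray(matchdots)[order]
    src, dst = P[cand[:, 0]], Q[cand[:, 1]]
    if thresh is None:
        d_nn, _ = cKDTree(P).query(P, k=2)
        thresh = 2.0 * d_nn[:, 1].mean()             # ~2x the average point spacing
    best = np.zeros(len(cand), dtype=bool)
    for _ in range(iters):                           # upper limit of the sampling times
        sample = rng.choice(len(cand), size=3, replace=False)
        R, t = rigid_from_pairs(src[sample], dst[sample])
        inliers = np.linalg.norm(src @ R.T + t - dst, axis=1) < thresh
        if inliers.sum() > best.sum():
            best = inliers
    return cand[best]                                # Matchdots1: the accurate matching pairs
```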
fig. 5 is a schematic diagram of the effect of the matching point pairs obtained after distance constraint of two pairwise viewing angle matching point pairs of bunny.
Step 5: the initial registration parameters are calculated from the accurate matching point pairs obtained in step 4 by the four-element (quaternion) method, as follows:
1): compute the centroids of the point sets {m_e | m_e ∈ P, e = 1, 2, …, t} and {m'_e | m'_e ∈ Q, e = 1, 2, …, t}:
μ = (1/t) Σ_{e=1}^{t} m_e,  μ' = (1/t) Σ_{e=1}^{t} m'_e
where μ and μ' are the centroids of the points m_e and m'_e, respectively;
2): translate the point sets {m_e | m_e ∈ P, e = 1, 2, …, t} and {m'_e | m'_e ∈ Q, e = 1, 2, …, t} relative to their centroids:
s_e = m_e − μ,  s'_e = m'_e − μ'
where s_e and s'_e are the new feature points obtained from m_e and m'_e after translation relative to the centroids;
3): compute the correlation matrix K from the point sets {s_e} and {s'_e};
4): construct the four-dimensional symmetric matrix Ka from the elements Kbc of the correlation matrix K, where b, c = 1, 2, 3, 4;
5): compute the unit eigenvector q corresponding to the largest eigenvalue of Ka:
q = [q0 q1 q2 q3]^T
6): compute the rotation matrix R0 from the unit eigenvector q;
7): solve for the translation vector T0:
T0 = μ' − R0·μ.
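By way of illustration, the four-element (quaternion) computation of step 5 might be implemented as follows. The explicit constructions of Ka and R0 in the sketch follow the standard unit-quaternion formulation, which is assumed here because the corresponding formulas appear only as images in the original text.

```python
import numpy as np

def quaternion_registration(M_src, M_dst):
    """Sketch of step 5: initial (R0, T0) from the accurate pairs (m_e, m'_e)."""
    mu, mu_p = M_src.mean(axis=0), M_dst.mean(axis=0)   # 1) centroids
    S, Sp = M_src - mu, M_dst - mu_p                    # 2) translation relative to the centroids
    K = S.T @ Sp / len(M_src)                           # 3) correlation matrix K
    # 4) four-dimensional symmetric matrix Ka built from the elements of K
    #    (standard construction, assumed)
    A = K - K.T
    delta = np.array([A[1, 2], A[2, 0], A[0, 1]])
    Ka = np.zeros((4, 4))
    Ka[0, 0] = np.trace(K)
    Ka[0, 1:] = Ka[1:, 0] = delta
    Ka[1:, 1:] = K + K.T - np.trace(K) * np.eye(3)
    # 5) unit eigenvector of the largest eigenvalue of Ka
    w, V = np.linalg.eigh(Ka)
    q0, q1, q2, q3 = V[:, np.argmax(w)]
    # 6) rotation matrix R0 from the unit quaternion q (standard formula, assumed)
    R0 = np.array([
        [q0*q0 + q1*q1 - q2*q2 - q3*q3, 2*(q1*q2 - q0*q3),             2*(q1*q3 + q0*q2)],
        [2*(q1*q2 + q0*q3),             q0*q0 - q1*q1 + q2*q2 - q3*q3, 2*(q2*q3 - q0*q1)],
        [2*(q1*q3 - q0*q2),             2*(q2*q3 + q0*q1),             q0*q0 - q1*q1 - q2*q2 + q3*q3],
    ])
    T0 = mu_p - R0 @ mu                                 # 7) translation vector T0 = μ' - R0·μ
    return R0, T0
```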
Fig. 6 is a result diagram obtained after initial registration of the bunny four-view point cloud image, and it can be seen from the result diagram that a satisfactory result can be obtained by the initial registration of the method.
Step 6: the method adopts an improved ICP algorithm to accurately register the point cloud, and comprises the following specific steps:
step 6.1, using the initial registration parameters R0 and T0 obtained in step 5, each point in the target point cloud P is transformed into the coordinate system of the reference point cloud Q; the initial target point cloud set of the ICP algorithm obtained after the transformation is P0 = {p01, p02, …, p0m}, where p0i = R0·pi + T0, pi is the i-th point in the target point cloud P, and i = 1, 2, …, m;
step 6.2, for any point in P0, the center of gravity of the tetrahedron formed by its four nearest points on Q in terms of Euclidean distance is obtained through the polyhedron center-of-gravity formula; the i-th point of P0 and its corresponding center of gravity form the initial closest point pair (p0i, q0i) of the ICP algorithm;
step 6.3, iterative calculation is performed according to the iterative formula P_{s+1} = R_{s+1}·P_s + T_{s+1} until the set convergence condition d_s − d_{s+1} < E is satisfied, and the final registration parameters obtained are used to register the target point cloud P, giving an accurate registration result; here E is a set threshold (E is taken as 0.001 in the present invention),
d_s = (1/m) Σ_{i=1}^{m} ||p_si − q_si||
(p_(s−1)i, q_(s−1)i) and (p_si, q_si) are respectively the nearest-neighbor point pairs obtained, by the method of steps 6.1 and 6.2, from the registration parameters of the (s−1)-th and the s-th iterations; R_s and R_{s+1} are the rotation matrices obtained after the s-th and the (s+1)-th iterations; T_s and T_{s+1} are the translation vectors obtained after the s-th and the (s+1)-th iterations; P_s and P_{s+1} are the point cloud sets of the target point cloud P after s and s+1 iterations, respectively; and d_s and d_{s+1} are the average distances from P_s and P_{s+1} to Q, respectively.
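By way of illustration, the improved ICP of step 6 might be sketched as follows; it reuses the quaternion routine sketched under step 5 to estimate the per-iteration rigid transform, which is an implementation assumption.

```python
import numpy as np
from scipy.spatial import cKDTree

def improved_icp(P, Q, R0, T0, E=1e-3, max_iter=50):
    """Sketch of steps 6.1-6.3: ICP whose correspondences are the centroids of the
    tetrahedra formed by the 4 Euclidean-nearest points of Q."""
    tree = cKDTree(Q)
    R, T = R0, T0
    Ps = P @ R.T + T                       # step 6.1: initial target cloud P0
    d_prev = np.inf
    for _ in range(max_iter):
        _, idx = tree.query(Ps, k=4)       # step 6.2: 4 nearest points of Q for each point
        Qs = Q[idx].mean(axis=1)           # centroid of a tetrahedron = mean of its 4 vertices
        dR, dT = quaternion_registration(Ps, Qs)   # incremental rigid transform
        Ps = Ps @ dR.T + dT                # step 6.3: P_{s+1} = R_{s+1}·P_s + T_{s+1}
        R, T = dR @ R, dR @ T + dT         # accumulate the overall registration parameters
        d = np.linalg.norm(Ps - Qs, axis=1).mean()   # average distance d_s
        if d_prev - d < E:                 # convergence condition d_s - d_{s+1} < E
            break
        d_prev = d
    return R, T
```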
Fig. 7 is a comparison graph of results obtained after initial and accurate registration of bunny two-view point cloud images, and the effectiveness and good registration effect of the method can be seen from the comparison graph.
Based on the normal vector information of the point cloud, the invention selects points with a large change of the local normal vector as feature points, uses three kinds of geometric information as the feature description, compares the similarity of the feature points under set thresholds to obtain initial matching point pairs, obtains accurate matching point pairs by combining the rigid distance constraint with the RANSAC algorithm, obtains the initial registration parameters with the four-element method, and finally obtains an accurate registration result through iteration of the improved ICP algorithm.
The above description is only an embodiment of the present invention, but the scope of the present invention is not limited thereto, and any person skilled in the art can understand that the modifications or substitutions within the technical scope of the present invention are included in the scope of the present invention, and therefore, the scope of the present invention should be subject to the protection scope of the claims.

Claims (7)

1. A three-dimensional point cloud automatic registration method based on the degree of change of feature information, wherein the point cloud data to be registered overlap one another, characterized in that the method comprises the following steps:
step 1, based on normal vector information under different neighborhood radii, obtaining the feature point sets Pt and Qt of the two point cloud data P and Q to be registered, wherein Q is the reference point cloud;
step 2, establishing the feature description of each feature point in Pt and Qt;
step 3, based on the feature descriptions of Pt and Qt obtained in step 2, finding in Qt, for each point in Pt, the most similar point as its matching point, to obtain an initial matching point pair set;
step 4, for the initial matching point pair set obtained in step 3, combining the inter-point distance constraint and using an adaptive RANSAC algorithm to obtain an accurate matching point pair set;
step 5, calculating the initial registration parameters using the four-element method;
step 6, performing point cloud registration using an improved iterative closest point (ICP) algorithm.

2. The three-dimensional point cloud automatic registration method based on the degree of change of feature information according to claim 1, characterized in that step 1 is specifically:
step 1.1, the feature degree of a point p in the point cloud data is f(p) = |sin θ|, where θ is the angle between the normal vectors of p under different neighborhood radii r1 and r2, with r1 ≠ r2;
step 1.2, extracting the points of the point cloud data whose feature degree is greater than a set threshold ε1 as the feature points of the point cloud data;
step 1.3, using the methods of step 1.1 and step 1.2, extracting feature points from P and Q respectively, obtaining the feature point set of P as Pt = {pt1, pt2, pt3, …, ptm'} and the feature point set of Q as Qt = {qt1, qt2, qt3, …, qtn'}, where m' and n' are the numbers of feature points of P and Q, respectively.

3. The three-dimensional point cloud automatic registration method based on the degree of change of feature information according to claim 2, characterized in that step 2 is specifically:
step 2.1, for the i-th point pt_i in Pt, taking the feature degree of the point as the first feature quantity f1(pt_i) at that point;
step 2.2, for pt_i, taking the points of P in the spherical region centered at pt_i with radius γ as the neighboring points of pt_i, computing the neighborhood center of gravity O(pt_i), and taking the distance from pt_i to O(pt_i) as the second feature quantity of the point, f2(pt_i) = ||O(pt_i) − pt_i||;
step 2.3, for pt_i, taking the arccosine of the angle between the line joining pt_i and O(pt_i) and the normal vector of pt_i as the third feature quantity f3(pt_i) of the point;
step 2.4, for pt_i, if there is a j-th point qt_j in Qt that satisfies the following three conditions, then qt_j is the corresponding point of pt_i:
|f1(pt_i) − f1(qt_j)| / (f1(pt_i) + f1(qt_j)) ≤ ε2
|f2(pt_i) − f2(qt_j)| / (f2(pt_i) + f2(qt_j)) ≤ ε3
|f3(pt_i) − f3(qt_j)| / (f3(pt_i) + f3(qt_j)) ≤ ε4
where ε2, ε3 and ε4 are all set thresholds.

4. The three-dimensional point cloud automatic registration method based on the degree of change of feature information according to claim 3, characterized in that step 3 is specifically:
based on the information obtained in step 2, finding in Qt, for each point in Pt, a corresponding point as its initial matching point, the initial matching point pair set being recorded as
Matchdots = {(p_g, q_g) | p_g ∈ Pt, q_g ∈ Qt, g = 1, 2, 3, …, num(Matchdots)}
where num(Matchdots) is the number of initial matching point pairs.

5. The three-dimensional point cloud automatic registration method based on the degree of change of feature information according to claim 4, characterized in that step 4 is specifically:
step 4.1: for point pairs (p_g, q_g) and (p_h, q_h) in Matchdots, if |dist(p_g, p_h) − dist(q_g, q_h)| ≤ ε5 is satisfied, then (p_g, q_g) and (p_h, q_h) meet the inter-point distance constraint; calculating the number num_g of matching point pairs in Matchdots that meet the inter-point distance constraint with (p_g, q_g), where h = 1, 2, 3, …, num(Matchdots), dist(·,·) denotes the distance between two points, and ε5 is a set threshold;
step 4.2: selecting the matching point pairs corresponding to the N largest values in the calculation result of step 4.1, and using the RANSAC algorithm to check whether these N matching point pairs are correct matching point pairs, the specific checking method being:
1) randomly selecting 3 matching point pairs from the N matching point pairs as a sample;
2) taking the 3 matching point pairs selected in 1) as correct matching point pairs, and calculating the rigid body transformation matrix U;
3) judging whether the remaining N−3 matching point pairs are correctly matched under the rigid body transformation matrix U of 2): for any pair (s1, s2) among the remaining N−3 matching point pairs, if ||U(s1) − s2|| is less than a predefined threshold, (s1, s2) is considered a correct matching point pair and is recorded as an inlier; otherwise (s1, s2) is a wrong matching point pair and is recorded as an outlier; all inliers form the consensus set of this sampling; here U(s1) denotes the new coordinate point obtained from s1 through the rigid body transformation matrix;
4) repeating 1) to 3) above until the number of iterations reaches the upper limit of the sampling times; finally, taking the rigid body transformation matrix corresponding to the consensus set with the largest number of inliers obtained by the iterations as the correct rigid body transformation matrix, and obtaining the inliers in Matchdots as the final accurate matching point pairs according to this correct rigid body transformation matrix, the accurate matching point pair set being:
Matchdots1 = {(m_e, m'_e) | m_e ∈ P, m'_e ∈ Q, e = 1, 2, 3, …, t}
where (m_e, m'_e) is an accurate matching point pair and t is the number of accurate matching point pairs.

6. The three-dimensional point cloud automatic registration method based on the degree of change of feature information according to claim 5, characterized in that the calculation of the initial registration parameters in step 5 is specifically:
1): calculating the centroids of the point sets {m_e | m_e ∈ P, e = 1, 2, …, t} and {m'_e | m'_e ∈ Q, e = 1, 2, …, t}:
μ = (1/t) Σ_{e=1}^{t} m_e,  μ' = (1/t) Σ_{e=1}^{t} m'_e
where μ and μ' are the centroids of the points m_e and m'_e, respectively;
2): translating the point sets {m_e | m_e ∈ P, e = 1, 2, …, t} and {m'_e | m'_e ∈ Q, e = 1, 2, …, t} relative to their centroids:
s_e = m_e − μ,  s'_e = m'_e − μ'
where s_e and s'_e are the new feature points obtained from m_e and m'_e after translation relative to the centroids;
3): calculating the correlation matrix K from the point sets {s_e} and {s'_e};
4): constructing the four-dimensional symmetric matrix Ka from the elements Kbc of the correlation matrix K, where b, c = 1, 2, 3, 4;
5): calculating the unit eigenvector q corresponding to the largest eigenvalue of Ka:
q = [q0 q1 q2 q3]^T
6): calculating the rotation matrix R0 from the unit eigenvector q;
7): solving for the translation vector T0:
T0 = μ' − R0·μ.

7. The three-dimensional point cloud automatic registration method based on the degree of change of feature information according to claim 6, characterized in that step 6 is specifically:
step 6.1, using the initial registration parameters R0 and T0 obtained in step 5, transforming each point in the target point cloud P into the coordinate system of the reference point cloud Q, the initial target point cloud set of the ICP algorithm obtained after the transformation being P0 = {p01, p02, …, p0m}, where p0i = R0·pi + T0, pi is the i-th point in the target point cloud P, and i = 1, 2, …, m;
step 6.2, for any point in P0, obtaining, through the polyhedron center-of-gravity formula, the center of gravity of the tetrahedron formed by its four nearest points on Q in terms of Euclidean distance, the i-th point of P0 and its corresponding center of gravity forming the initial closest point pair (p0i, q0i) of the ICP algorithm;
step 6.3, performing iterative calculation according to the iterative formula P_{s+1} = R_{s+1}·P_s + T_{s+1} until the set convergence condition d_s − d_{s+1} < E is satisfied, and using the final registration parameters obtained to register the target point cloud P to obtain an accurate registration result; where E is a set threshold, d_s = (1/m) Σ_{i=1}^{m} ||p_si − q_si||, (p_(s−1)i, q_(s−1)i) and (p_si, q_si) are respectively the nearest-neighbor point pairs obtained, by the method of steps 6.1 and 6.2, from the registration parameters of the (s−1)-th and the s-th iterations, R_s and R_{s+1} are the rotation matrices obtained after the s-th and the (s+1)-th iterations, T_s and T_{s+1} are the translation vectors obtained after the s-th and the (s+1)-th iterations, P_s and P_{s+1} are the point cloud sets of the target point cloud P after s and s+1 iterations, and d_s and d_{s+1} are the average distances from P_s and P_{s+1} to Q, respectively.
CN201911299529.7A 2019-12-17 2019-12-17 An automatic registration method of 3D point cloud based on the change degree of feature information Pending CN111145232A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911299529.7A CN111145232A (en) 2019-12-17 2019-12-17 An automatic registration method of 3D point cloud based on the change degree of feature information

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911299529.7A CN111145232A (en) 2019-12-17 2019-12-17 An automatic registration method of 3D point cloud based on the change degree of feature information

Publications (1)

Publication Number Publication Date
CN111145232A true CN111145232A (en) 2020-05-12

Family

ID=70518551

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911299529.7A Pending CN111145232A (en) 2019-12-17 2019-12-17 An automatic registration method of 3D point cloud based on the change degree of feature information

Country Status (1)

Country Link
CN (1) CN111145232A (en)

Cited By (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111681282A (en) * 2020-06-18 2020-09-18 浙江大华技术股份有限公司 A kind of pallet identification processing method and device
CN111862173A (en) * 2020-07-03 2020-10-30 广州大学 An online fitting method based on point cloud registration
CN112017220A (en) * 2020-08-27 2020-12-01 南京工业大学 A Point Cloud Accurate Registration Method Based on Robust Constrained Least Squares Algorithm
CN112085793A (en) * 2020-09-04 2020-12-15 上海理工大学 Three-dimensional imaging scanning system based on combined lens group and point cloud registration method
CN112116638A (en) * 2020-09-04 2020-12-22 季华实验室 A three-dimensional point cloud matching method, device, electronic device and storage medium
CN113421290A (en) * 2021-07-05 2021-09-21 山西大学 Power plant boiler interior three-dimensional reconstruction method based on unmanned aerial vehicle image acquisition
CN113643270A (en) * 2021-08-24 2021-11-12 凌云光技术股份有限公司 Image registration method and device based on point cloud data
CN114037745A (en) * 2021-11-17 2022-02-11 北京容积视觉科技有限公司 A Coarse Registration Method for Multi-view 3D Point Cloud Data Based on Branch and Bound
CN114648445A (en) * 2022-03-03 2022-06-21 电子科技大学 Multi-view high-resolution point cloud splicing method based on feature point extraction and fine registration optimization
CN114972456A (en) * 2022-04-28 2022-08-30 西安交通大学 Rigid body registration method, system, terminal equipment and storage medium based on dynamic mixed characteristics
CN115018983A (en) * 2022-05-31 2022-09-06 广东电网有限责任公司 Phase-shifting transformer site selection method, device, electronic equipment and storage medium
CN111768490B (en) * 2020-05-14 2023-06-27 华南农业大学 Plant three-dimensional modeling method and system based on iteration closest point and manual intervention
CN117495926A (en) * 2023-10-31 2024-02-02 哈尔滨工业大学 A three-dimensional point cloud data registration method for large-aperture aspheric surfaces

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103236064A (en) * 2013-05-06 2013-08-07 东南大学 Point cloud automatic registration method based on normal vector
CN109919984A (en) * 2019-04-15 2019-06-21 武汉惟景三维科技有限公司 A kind of point cloud autoegistration method based on local feature description's

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103236064A (en) * 2013-05-06 2013-08-07 东南大学 Point cloud automatic registration method based on normal vector
CN109919984A (en) * 2019-04-15 2019-06-21 武汉惟景三维科技有限公司 A kind of point cloud autoegistration method based on local feature description's

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
HUANG Yuan et al.: "A point cloud automatic registration algorithm based on feature extraction" (一种基于特征提取的点云自动配准算法) *

Cited By (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111768490B (en) * 2020-05-14 2023-06-27 华南农业大学 Plant three-dimensional modeling method and system based on iteration closest point and manual intervention
CN111681282A (en) * 2020-06-18 2020-09-18 浙江大华技术股份有限公司 A kind of pallet identification processing method and device
CN111862173A (en) * 2020-07-03 2020-10-30 广州大学 An online fitting method based on point cloud registration
CN112017220A (en) * 2020-08-27 2020-12-01 南京工业大学 A Point Cloud Accurate Registration Method Based on Robust Constrained Least Squares Algorithm
CN112017220B (en) * 2020-08-27 2023-07-28 南京工业大学 A Point Cloud Accurate Registration Method Based on Robust Constrained Least Squares Algorithm
CN112085793A (en) * 2020-09-04 2020-12-15 上海理工大学 Three-dimensional imaging scanning system based on combined lens group and point cloud registration method
CN112116638A (en) * 2020-09-04 2020-12-22 季华实验室 A three-dimensional point cloud matching method, device, electronic device and storage medium
CN112085793B (en) * 2020-09-04 2022-07-05 上海理工大学 A three-dimensional imaging scanning system and point cloud registration method based on combined lens group
CN113421290A (en) * 2021-07-05 2021-09-21 山西大学 Power plant boiler interior three-dimensional reconstruction method based on unmanned aerial vehicle image acquisition
CN113643270A (en) * 2021-08-24 2021-11-12 凌云光技术股份有限公司 Image registration method and device based on point cloud data
CN113643270B (en) * 2021-08-24 2024-04-26 凌云光技术股份有限公司 Image registration method and device based on point cloud data
CN114037745A (en) * 2021-11-17 2022-02-11 北京容积视觉科技有限公司 A Coarse Registration Method for Multi-view 3D Point Cloud Data Based on Branch and Bound
CN114648445A (en) * 2022-03-03 2022-06-21 电子科技大学 Multi-view high-resolution point cloud splicing method based on feature point extraction and fine registration optimization
CN114648445B (en) * 2022-03-03 2023-05-30 电子科技大学 A multi-view high-resolution point cloud stitching method based on feature point extraction and fine registration optimization
CN114972456A (en) * 2022-04-28 2022-08-30 西安交通大学 Rigid body registration method, system, terminal equipment and storage medium based on dynamic mixed characteristics
CN115018983A (en) * 2022-05-31 2022-09-06 广东电网有限责任公司 Phase-shifting transformer site selection method, device, electronic equipment and storage medium
CN117495926A (en) * 2023-10-31 2024-02-02 哈尔滨工业大学 A three-dimensional point cloud data registration method for large-aperture aspheric surfaces

Similar Documents

Publication Publication Date Title
CN111145232A (en) An automatic registration method of 3D point cloud based on the change degree of feature information
CN103236064B (en) A kind of some cloud autoegistration method based on normal vector
Liu et al. Relative pose estimation for cylinder-shaped spacecrafts using single image
JP6216508B2 (en) Method for recognition and pose determination of 3D objects in 3D scenes
Jiang et al. Registration for 3-D point cloud using angular-invariant feature
CN104899918B (en) The three-dimensional environment modeling method and system of a kind of unmanned plane
CN110009680B (en) Monocular Image Position and Attitude Measurement Method Based on Circle Feature and Different Surface Feature Points
CN101377812B (en) Method for recognizing position and attitude of space plane object
CN101216895A (en) An Automatic Extraction Method of Ellipse Image Feature in Complicated Background Image
CN113393524B (en) Target pose estimation method combining deep learning and contour point cloud reconstruction
CN107358629A (en) Figure and localization method are built in a kind of interior based on target identification
CN116129037B (en) Visual touch sensor, three-dimensional reconstruction method, system, equipment and storage medium thereof
CN110322492A (en) A kind of extraterrestrial target three-dimensional point cloud method for registering based on global optimization
Pan et al. Medical image registration using modified iterative closest points
CN114972539B (en) Computer room camera plane online calibration method, system, computer equipment and medium
CN116630423A (en) ORB (object oriented analysis) feature-based multi-target binocular positioning method and system for micro robot
CN115131433B (en) Non-cooperative target pose processing method and device and electronic equipment
CN108447084B (en) Stereo matching compensation method based on ORB feature
Sakai et al. Phase-based window matching with geometric correction for multi-view stereo
CN112767457A (en) Principal component analysis-based plane point cloud matching method and device
Li et al. Industrial robot hand–eye calibration combining data augmentation and actor-critic network
CN116188545B (en) Online registration method of infrared and visible light sensors based on IMU and odometer
CN110246192A (en) Binocular crag deforms intelligent identification Method
Xie et al. Real-time Reconstruction of unstructured scenes based on binocular vision depth
CN112614166A (en) Point cloud matching method and device based on CNN-KNN

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication (Application publication date: 20200512)