
CN101957916B - Method for extracting affine invariant feature of image by using M-band wavelet - Google Patents


Info

Publication number: CN101957916B (application CN2010101092568A)
Authority: CN (China)
Prior art keywords: feature, wavelet, point, scale, image
Legal status: Active
Other versions: CN101957916A (Chinese)
Inventors: 张茂军, 徐玮, 周韬, 王炜, 熊志辉
Original Assignee: National University of Defense Technology
Current Assignee: Hunan Vision Splend Photoelectric Technology Co., Ltd.
Application filed by National University of Defense Technology; priority to CN2010101092568A; published as CN101957916A and, upon grant, as CN101957916B


Landscapes

  • Image Analysis (AREA)

Abstract

The present invention proposes a method for extracting affine invariant features of images using M-band wavelets. First, a multi-scale space of the image is built with the M-band wavelet transform, which determines the positions and scales of the candidate feature points, i.e. the local extremum points. The unstable, low-contrast candidate feature points are then removed, and the remaining stable feature points serve as the centers of the feature regions, completing feature-region localization. Next, the main direction of the feature descriptor is determined from the gradient-orientation distribution of the pixels around each feature point, and the region around the point is rotated to that direction. Finally, an affine invariant feature descriptor is constructed from the gradient information of the region around the feature point. The digital image features extracted by the invention are fully affine invariant.

Description

A Method of Extracting Affine Invariant Features of an Image Using M-band Wavelets

Technical Field

The invention relates to the extraction of invariant features of digital images, and in particular to a method for extracting affine invariant features of an image using M-band wavelets.

Background Art

In many intelligent image-processing fields such as image-based object recognition, geometric correction of remote-sensing images, and image retrieval, the same features often must be extracted from multiple images acquired from different viewpoints and then used as the basis for subsequent processing. Because the relationship between images acquired under most viewpoint changes can be approximated by an affine transformation, extracting affine invariant features has become a common problem across many technical fields. It has important applications in natural-resource analysis, weather forecasting, environmental research, change detection, medical lesion analysis, character recognition, fingerprint recognition, and many other areas, and it underlies image-analysis techniques such as civil navigation, map and terrain matching, stereo vision, motion analysis, and data fusion.

Research on affine invariant features currently proceeds in two directions: global and local affine invariant feature extraction. Compared with global features, local affine invariant features require no preprocessing such as segmenting the object or extracting its contour; they use only local information about the object, are better suited to feature extraction against complex backgrounds or under partial occlusion, and therefore have broader application prospects. In the International Journal of Computer Vision, Lowe described a scale-invariant keypoint detection method that takes points attaining extrema simultaneously in the scale domain and the spatial domain as keypoints and uses each keypoint as the center of a feature region; the scale at which the keypoint lies also determines the size of the region. This approach solves feature-region localization and size selection at the same time. Within each feature region, Lowe constructed a feature vector called SIFT (Scale Invariant Feature Transform) from histograms of gradient orientations to describe the region. SIFT is highly invariant to image rotation, scale change, and brightness change, and has been applied in many fields such as panorama stitching, object recognition, 3D reconstruction, image retrieval, and autonomous robot navigation. However, SIFT is not in fact fully affine invariant; in particular, when the viewing angle changes substantially, it is difficult to extract affine invariant features of the image with SIFT.

Summary of the Invention

In view of the above defects in the prior art, the present invention aims to provide a method for extracting affine invariant image features using M-band wavelets that solves the two core problems of affine invariant feature extraction: first, locating the feature region, i.e. where to extract features; second, describing the information inside the region, i.e. which feature quantity to use. The affine invariant features extracted by this method also lay a good foundation for subsequent image processing.

The technical solution adopted by the present invention is a method for extracting affine invariant image features using the M-band wavelet transform, with the following specific steps:

1) First, build the multi-scale space of the image with the M-band wavelet transform, and determine the positions and scales of the candidate feature points, i.e. the local extremum points, by detecting local modulus maxima of the wavelet coefficients in that space. The specific steps are:

Apply the M-band wavelet transform to the image f(x, y) along the x and y directions at increasing scales $M^j$, where M is a real number with 1 < M < 2 and j is an increasing positive integer. The wavelet function ψ(x, y) is defined as the derivative of a smoothing function θ(x, y) along the x and y directions. This yields the multi-scale space of the image, composed of the smoothed images $f * \theta_{M^j}(x, y)$ and the wavelet coefficients $W_{M^j}^1 f(x, y)$ and $W_{M^j}^2 f(x, y)$ at the increasing scales $M^j$. The wavelet coefficient at each pixel of the multi-scale space is then compared with the 8 neighboring wavelet coefficients at the same scale and with the 18 wavelet coefficients at the scales directly above and below; a point where the modulus of the coefficient is the maximum is a local extremum point and is taken as a candidate feature point, written X = (x, y, M^j), where (x, y) is the position of the candidate feature point and M^j is its scale.

2) Remove the unstable low-contrast candidate feature points (local extremum points) and take the remaining stable feature points as the centers of the feature regions, completing feature-region localization:

Expand the smoothed image of each scale space, D(x, y, σ), with the Taylor formula about the local extremum point X. Differentiating the Taylor expansion and setting the derivative to zero yields an offset X̂; adding X̂ to X gives the refined position of the local extremum point. Substituting X̂ back into the Taylor expansion, if the absolute value of the result is less than an absolute-value threshold, the local extremum point is considered unstable and is discarded; all remaining local extremum points are the feature points.

3) Determine the main direction of the feature descriptor and rotate the region around the feature point to that direction:

Centered on the feature point, select a circular region whose radius is proportional to the scale of the feature point. Within this region, compute the Haar wavelet responses in the horizontal and vertical directions, denoted h_x and h_y, and apply Gaussian weighting to both; the weighted values are the direction components in the horizontal and vertical directions, denoted W_hx and W_hy. Divide the circular region into several equal-sized sectors and sum W_hx and W_hy within each sector, giving ΣW_hx and ΣW_hy. The direction of the sector in which ΣW_hx and ΣW_hy are largest is taken as the main direction of the feature point, and its angle in degrees is obtained from the arctangent of ΣW_hy / ΣW_hx. Finally, rotate the circular region to the main direction obtained above.

4) Finally, construct a multi-dimensional feature descriptor from the gradient information of the region around the feature point:

Based on the main direction determined in the previous step, divide the circular region around the feature point into four sectors (up, down, left, right) and simultaneously partition it with several concentric circles, yielding multiple sub-regions of the feature point's neighborhood. From the way each sub-region's gradient varies relative to the mean gradient of its neighboring sub-regions, and from the correlation of the gradient distributions between neighboring sub-regions, construct for each feature point an affine invariant multi-dimensional feature descriptor; this is the affine invariant feature corresponding to that feature point.

In a preferred embodiment, the M-band wavelet transform in step 1 is the 3/2-band wavelet transform in lifting form. In step 2, the absolute-value threshold is 0.03. In step 4, the variation of each sub-region's gradient relative to the mean gradient of its neighboring sub-regions is described by Harris autocorrelation, and the correlation of the gradient distributions between neighboring sub-regions is described by Harris cross-correlation.

The design principle of the present invention is described in detail as follows:

The M-band wavelet transform can analyze an image at different resolutions, i.e. at multiple scales; the smoothed images and corresponding wavelet coefficients it produces at the different scale levels form a multi-scale space. This matches the coarse-to-fine way the human visual system perceives image information, so the M-band wavelet transform is well suited to extracting image features in a multi-scale space. Building on the M-band wavelet transform, the present invention proposes a method for extracting affine invariant features. It first constructs the multi-scale space of the image with the M-band wavelet transform and, by detecting local modulus maxima of the wavelet coefficients in that space, determines the positions and scales of the candidate feature points, i.e. the local extremum points. It then removes the unstable low-contrast candidate points by precise feature-point localization and takes the remaining stable feature points as the centers of the feature regions, completing feature-region localization. Next, the main direction of the feature descriptor is determined from the gradient-orientation distribution of the pixels around each feature point, and the region around the point is rotated to that direction. Finally, a multi-dimensional feature descriptor is constructed from the gradient information of the region around the feature point, using the Harris autocorrelation and cross-correlation of that region.

In summary, the present invention identifies the image regions (pixel coordinates) from which features can be extracted and determines which operator describes the feature information inside those regions. The digital image features it extracts remain invariant to image rotation, scale change, brightness change, viewpoint change, and noise, i.e. they are fully affine invariant. Moreover, the M-band wavelet used in the invention can be implemented in lifting form, which reduces the computational complexity of the transform, increases its speed, and enables integer-to-integer transforms and computation, which is very valuable for hardware implementation.

Description of the Drawings

Figure 1 is the overall flowchart of the method for extracting affine invariant image features using M-band wavelets.

Figure 2 is a schematic diagram of the neighborhood division of feature point P in the embodiment.

Detailed Description of the Embodiments

The design principle of the present invention is described in detail below with reference to the accompanying drawings and embodiments:

As shown in Figure 1, this embodiment provides a method for extracting affine invariant image features using M-band wavelets; specifically, the 3/2-band wavelet transform is used, i.e. M = 3/2. The method includes the following steps:

Step 1: build the multi-scale space of the image with the M-band wavelet transform, and determine the positions and scales of the local extremum points by detecting local modulus maxima of the wavelet coefficients in that space:

(1) Choose the wavelet decomposition scale J; this embodiment preferably takes J = 10.

(2) Apply the two-dimensional M-band wavelet transform with increasing j, 0 < j < J, to each row of f(x, y).

(3) Find the zero-crossing points of the wavelet transform.

(4) Compute the modulus of the wavelet transform, $MO_{M^j} f(x, y) = |W_{M^j}^1 f(x, y)| + |W_{M^j}^2 f(x, y)|$, and find the points where it attains a modulus maximum along the gradient direction within the n × n neighborhood of pixel (x, y).

(5) Remove the points whose modulus increases as the scale decreases; these points are considered noise.

(6) Repeat steps (2) through (5) for each column of the image.

(7) The points found as extrema in both passes are taken as the local extremum points.

The M-band wavelet transform is defined as follows:

A function θ(x) satisfying $\int_{-\infty}^{\infty} \theta(x)\,dx = 1$ and $\lim_{x \to \infty} \theta(x) = 0$ is called a smoothing function. When θ(x, y) is a two-dimensional smoothing function, convolving the image f(x, y) with the smoothing function θ_a(x, y) at different scales a smooths the image. The two-dimensional wavelet functions are defined as:

$$\psi^1(x, y) = \frac{\partial \theta(x, y)}{\partial x}, \qquad \psi^2(x, y) = \frac{\partial \theta(x, y)}{\partial y}$$

When ψ¹(x, y) and ψ²(x, y) satisfy the completeness and stability conditions of two-dimensional wavelets, they can serve as the mother wavelet basis functions of the two-dimensional wavelet transform. Write:

$$\psi_a^1(x, y) = \frac{1}{a^2}\,\psi^1\!\left(\frac{x}{a}, \frac{y}{a}\right), \qquad \psi_a^2(x, y) = \frac{1}{a^2}\,\psi^2\!\left(\frac{x}{a}, \frac{y}{a}\right)$$

Then the wavelet transform of the function f(x, y) is:

$$W_a^1 f(x, y) = f(x, y) * \psi_a^1, \qquad W_a^2 f(x, y) = f(x, y) * \psi_a^2$$

Expressed as an M-band wavelet transform, i.e. taking a = M^j, this becomes:

$$\begin{pmatrix} W_{M^j}^1 f(x, y) \\ W_{M^j}^2 f(x, y) \end{pmatrix} = M^j \begin{pmatrix} \dfrac{\partial}{\partial x}\,(f * \theta_{M^j})(x, y) \\ \dfrac{\partial}{\partial y}\,(f * \theta_{M^j})(x, y) \end{pmatrix} = M^j\, \nabla (f * \theta_{M^j})(x, y)$$

The M-band wavelet transforms $W_{M^j}^1 f(x, y)$ and $W_{M^j}^2 f(x, y)$ are the partial derivatives, along the horizontal and vertical directions, of the image f(x, y) smoothed at scale $M^j$; together they define the modulus and argument of the gradient vector of the smoothed image, written:

$$MO_{M^j} f(x, y) = \bigl|W_{M^j}^1 f(x, y)\bigr| + \bigl|W_{M^j}^2 f(x, y)\bigr|$$

$$A_{M^j} f(x, y) = \arg\bigl(W_{M^j}^1 f(x, y) + i\,W_{M^j}^2 f(x, y)\bigr)$$
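At a single pixel the modulus and argument reduce to two lines of code; this trivial sketch is a direct transcription of the formulas, using the L1-style modulus given in the text:

```python
import math

def modulus_and_angle(w1, w2):
    """Modulus MO = |W1| + |W2| and argument A = arg(W1 + i*W2) of the
    smoothed-gradient vector at one pixel, from the two wavelet
    responses w1 (horizontal) and w2 (vertical)."""
    return abs(w1) + abs(w2), math.atan2(w2, w1)
```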

Step 2: remove the unstable low-contrast feature points by precise feature-point localization. Because low-contrast points are sensitive to noise, the position, scale, and curvature of each feature point can be used to remove them, strengthening matching stability and noise resistance. After the low-contrast points are removed, the number of feature points drops sharply, which greatly lowers the false-match rate.

Because the scales of two adjacent layers differ considerably, the scale space must be interpolated to localize the feature points precisely. The precise localization uses the second-order Taylor expansion, as follows.

To remove low-contrast feature points, the quadratic Taylor expansion is used to determine the position and scale of each feature point precisely. First write the smoothed image of the multi-scale space as D(x, y, σ), where σ = M^j, and expand D(x, y, σ) about the local extremum point (x₀, y₀, σ) up to the quadratic term:

$$D(x, y, \sigma) = D(x_0, y_0, \sigma) + \frac{\partial D^{T}}{\partial X} X + \frac{1}{2} X^{T} \frac{\partial^{2} D}{\partial X^{2}} X \quad (1)$$

where X = (x, y, σ)^T. Differentiating this function and setting the derivative to zero gives:

$$\frac{\partial D^{T}}{\partial X} + \frac{\partial^{2} D}{\partial X^{2}}\hat{X} = 0$$

from which the offset of X is obtained:

$$\hat{X} = -\left(\frac{\partial^{2} D}{\partial X^{2}}\right)^{-1} \frac{\partial D}{\partial X} \quad (2)$$

Approximating the derivatives by differences of nearby points reduces the computation. If the offset X̂ of a feature point is greater than 0.5 in any dimension, the extremum actually lies closer to a different sample point; in that case the sample point is replaced by interpolation, and the offset X̂ is added to the sample point to obtain an interpolated estimate of the feature point's location.

The offset X̂ is also useful for removing unstable low-contrast feature points. Combining equations (1) and (2) gives

$$D(\hat{X}) = D + \frac{1}{2}\frac{\partial D^{T}}{\partial X}\hat{X}$$

If $|D(\hat{X})|$ is less than 0.03, the feature point is removed.
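The refinement and contrast test can be illustrated in one dimension, a simplified analogue of the three-variable (x, y, σ) case; the parabola fit through three samples via central differences is an assumption of this sketch, not the patent's implementation:

```python
def refine_and_test_1d(d_prev, d0, d_next, contrast_thresh=0.03):
    """Fit a parabola through three samples of D, solve dD/dx = 0 for
    the sub-sample offset x_hat, evaluate D(x_hat) = D + (1/2) g x_hat,
    and apply the contrast test |D(x_hat)| >= 0.03.
    Returns (offset, refined_value, keep) or None if D has no curvature."""
    g = (d_next - d_prev) / 2.0        # first derivative, central difference
    h = d_next - 2.0 * d0 + d_prev     # second derivative
    if h == 0:
        return None
    offset = -g / h                    # x_hat = -(d2D/dx2)^-1 dD/dx
    value = d0 + 0.5 * g * offset      # D(x_hat)
    keep = abs(value) >= contrast_thresh
    return offset, value, keep
```

A strong symmetric peak is kept; a shallow peak below the 0.03 threshold is rejected as unstable.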

Step 3: determine the main direction of the feature descriptor from the distribution of the pixels around the feature point, and rotate the region around the feature point to that direction. To make the descriptor rotation invariant, it is assigned a direction value, called the main direction; the descriptor's region around the feature point is first rotated to this main direction, and the descriptor is then computed.

When computing the main direction of a feature point, a circular region of radius 6s centered on the feature point is selected, where s is the scale of the feature point. Within this region, the Haar wavelet responses in the horizontal and vertical directions are computed, denoted h_x and h_y.

After the horizontal and vertical Haar wavelet responses are computed, both values are Gaussian-weighted with factor σ = 2s; the weighted values are the direction components in the horizontal and vertical directions, denoted W_hx and W_hy.

To find the main direction, W_hx and W_hy are accumulated in a histogram: 360° is divided into 72 bins of 5° each. A 60° sector of the circular region (0°–60°, then 5°–65°, and so on) is slid around the circle, and W_hx and W_hy are summed within each sector, giving ΣW_hx and ΣW_hy; the gradient magnitude of each sector is computed at the same time. The direction of the sector with the largest magnitude is the main direction of the feature point, and its angle in degrees is obtained from the arctangent of ΣW_hy / ΣW_hx.
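A minimal sketch of the sliding 60-degree-sector search described above. The input format (per-pixel angle plus weighted Haar responses) and the use of the squared summed-response magnitude to rank sectors are simplifying assumptions of this sketch:

```python
import math

def dominant_orientation(points):
    """points: list of (angle_deg, w_hx, w_hy) -- each pixel's angular
    position in the circular neighbourhood and its Gaussian-weighted
    Haar responses.  Slides a 60-degree sector in 5-degree steps, sums
    the responses inside, and returns the orientation (degrees) of the
    sector whose summed response is largest, via atan2(sum_hy, sum_hx)."""
    best_mag, best_dir = -1.0, 0.0
    for start in range(0, 360, 5):
        sx = sy = 0.0
        for ang, hx, hy in points:
            if (ang - start) % 360.0 < 60.0:   # pixel falls in this sector
                sx += hx
                sy += hy
        mag = sx * sx + sy * sy
        if mag > best_mag:
            best_mag = mag
            best_dir = math.degrees(math.atan2(sy, sx)) % 360.0
    return best_dir
```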

Step 4: construct the feature descriptor using the Harris autocorrelation and cross-correlation of the region around the feature point. The constructed descriptor integrates the gradient information of the sub-regions, making it more robust to noise and illumination changes.

The feature descriptor is generated as follows:

Write the gradient of image point X as ∇f(X) = [f_x(X), f_y(X)]^T. Let G be a sub-region of image f; the Harris autocorrelation matrix of G is defined as:

$$SC(G) = \sum_{X \in G} \begin{pmatrix} (f_x(X) - M_x)^2 & (f_x(X) - M_x)(f_y(X) - M_y) \\ (f_x(X) - M_x)(f_y(X) - M_y) & (f_y(X) - M_y)^2 \end{pmatrix}$$

where

$$M = (M_x, M_y)^{T} = \frac{1}{\#G} \sum_{X \in G} \nabla f(X)$$

is the mean gradient of the image points in region G, and #G is the number of image pixels contained in G. Since SC(G) is positive semi-definite, its determinant and trace are both non-negative. The Harris autocorrelation of region G is then defined as

$$Hsc(G) = \bigl[\,\operatorname{tr}(SC(G)),\; \det(SC(G))^{\frac{1}{2}}\,\bigr]$$

where tr(SC(G)) and det(SC(G)) are the trace and determinant of the Harris autocorrelation matrix SC(G). Harris autocorrelation describes how the image gradient varies within a specific region.
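For one sub-region the autocorrelation descriptor follows directly from the definition; this pure-Python sketch takes the gradients as a list of (f_x, f_y) pairs:

```python
import math

def harris_autocorrelation(grads):
    """Hsc(G) = [tr(SC), sqrt(det(SC))] for sub-region G, where SC is
    the mean-centred second-moment matrix of the gradients in G."""
    n = len(grads)
    mx = sum(g[0] for g in grads) / n          # mean gradient M_x
    my = sum(g[1] for g in grads) / n          # mean gradient M_y
    a = sum((g[0] - mx) ** 2 for g in grads)
    b = sum((g[0] - mx) * (g[1] - my) for g in grads)
    c = sum((g[1] - my) ** 2 for g in grads)
    tr = a + c
    det = a * c - b * b
    return [tr, math.sqrt(max(det, 0.0))]      # det >= 0 up to rounding
```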

Harris cross-correlation describes the correlation between the gradient distributions of two regions. Let G and H be two sub-regions of image f; the Harris cross-correlation matrix between them is defined as

$$MC(G, H) = \sum_{X \in G} \begin{pmatrix} (f_x(X) - N_x^H)^2 & (f_x(X) - N_x^H)(f_y(X) - N_y^H) \\ (f_x(X) - N_x^H)(f_y(X) - N_y^H) & (f_y(X) - N_y^H)^2 \end{pmatrix}$$

where $N^H = (N_x^H, N_y^H)^{T}$ is the mean gradient of region H.

Since the matrix MC(G, H) is also positive semi-definite, its trace and determinant are likewise non-negative. The cross-correlation of regions G and H is defined as

$$Hmc(G, H) = \bigl[\,\operatorname{tr}(MC(G, H)),\; \det(MC(G, H))^{\frac{1}{2}}\,\bigr]$$

Harris cross-correlation describes how the gradient of a specific region varies relative to the mean gradient of a neighboring region, and the correlation of the gradient distributions between neighboring regions.

Take the circular neighborhood Ω of radius r centered on the feature point as its support region, where r is the scale of the feature point. Based on the main direction of the support region, divide Ω into 4 equal sectors, and simultaneously partition Ω with 4 concentric circles, yielding 17 sub-regions of the feature point's neighborhood, as shown in Figure 2.
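A hypothetical partition function for the 17-sub-region layout. The patent fixes only the structure (a central region G00 plus ring/sector regions G_ij, i, j = 1..4); the inner-disc radius of r/5 and the equal ring widths used here are assumptions of this sketch:

```python
import math

def subregion(x, y, r, theta_deg=0.0):
    """Map a point (x, y), relative to the feature point, to one of 17
    sub-regions: (0, 0) for an assumed central disc of radius r/5, else
    (ring i, sector j) with i, j in 1..4.  Sectors are measured from
    the main orientation theta_deg; returns None outside radius r."""
    rho = math.hypot(x, y)
    if rho > r:
        return None
    if rho <= r / 5.0:
        return (0, 0)
    ring_width = (r - r / 5.0) / 4.0               # 4 equal rings (assumed)
    i = min(int((rho - r / 5.0) / ring_width) + 1, 4)
    ang = (math.degrees(math.atan2(y, x)) - theta_deg) % 360.0
    j = int(ang // 90.0) + 1                       # 4 quadrant sectors
    return (i, j)
```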

To introduce the cross-correlation of neighboring sub-regions and make it symmetric, define

$$\overline{Hmc}(G, H) = \frac{Hmc(G, H) + Hmc(H, G)}{2}$$

where G and H are neighboring sub-regions of the feature-point neighborhood Ω. Neighboring sub-regions are a pair of regions sharing a common boundary or a common point.
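The cross-correlation and its symmetrised form can be sketched the same way as the autocorrelation, again in pure Python with gradients as (f_x, f_y) pairs; the component-wise averaging of the two descriptor vectors is how this sketch reads the symmetrisation formula:

```python
import math

def mean_grad(grads):
    """Mean gradient (N_x, N_y) of a sub-region."""
    n = len(grads)
    return (sum(g[0] for g in grads) / n, sum(g[1] for g in grads) / n)

def harris_cross(grads_g, mean_h):
    """Hmc(G, H) = [tr(MC), sqrt(det(MC))]: gradients of G centred on
    the gradient mean of the neighbouring region H."""
    nx, ny = mean_h
    a = sum((fx - nx) ** 2 for fx, fy in grads_g)
    b = sum((fx - nx) * (fy - ny) for fx, fy in grads_g)
    c = sum((fy - ny) ** 2 for fx, fy in grads_g)
    det = a * c - b * b
    return [a + c, math.sqrt(max(det, 0.0))]

def hmc_bar(grads_g, grads_h):
    """Symmetrised cross-correlation (Hmc(G,H) + Hmc(H,G)) / 2."""
    gh = harris_cross(grads_g, mean_grad(grads_h))
    hg = harris_cross(grads_h, mean_grad(grads_g))
    return [(u + v) / 2.0 for u, v in zip(gh, hg)]
```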

Points at different positions within the neighborhood Ω contribute differently to the description of the feature point: the closer a point lies to the feature point, the larger its contribution, and vice versa. To reflect this, when computing the feature-point descriptor, the gradients of the points in Ω are weighted with a Gaussian function, whose scale is taken as

Figure GSA00000027670700104
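The weighting step can be sketched as below. The patent's exact Gaussian scale is given in a formula image that is not recoverable here, so σ = r/2 is used purely as an illustrative default:

```python
import numpy as np

def gaussian_weighted_gradients(fx, fy, cx, cy, r, sigma=None):
    """Weight the gradient components in the neighborhood Omega by a Gaussian
    centered on the feature point, so nearer pixels contribute more.
    sigma = r/2 is an illustrative assumption, not the patent's value."""
    if sigma is None:
        sigma = r / 2.0
    y, x = np.mgrid[0:fx.shape[0], 0:fx.shape[1]]
    d2 = (x - cx) ** 2 + (y - cy) ** 2
    w = np.exp(-d2 / (2.0 * sigma ** 2))
    w[d2 > r * r] = 0.0                  # restrict the weights to Omega
    return fx * w, fy * w
```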

According to the Harris autocorrelation defined earlier and the partition strategy for the feature-point neighborhood, a 34-dimensional vector is obtained:

$$HS = [\,Hsc(G_{00}),\; HS_1\,]$$

where

$$HS_1 = [\,Hsc(G_{11}),\, Hsc(G_{12}),\, \ldots,\, Hsc(G_{i,j}),\, \ldots\,], \quad i = 1,2,3,4;\; j = 1,2,3,4$$

From the Harris cross-correlation defined earlier, two 24-dimensional vectors HM<sub>1</sub> and HM<sub>2</sub> and two 8-dimensional vectors HM<sub>3</sub> and HM<sub>4</sub> are obtained:

$$HM_1 = [\,\overline{Hmc}(G_{11},G_{12}),\, \ldots,\, \overline{Hmc}(G_{ij},G_{ik}),\, \ldots\,], \quad i=1,2,3,4;\; j=1,2,3;\; k=j+1$$

$$HM_2 = [\,\overline{Hmc}(G_{11},G_{21}),\, \ldots,\, \overline{Hmc}(G_{ij},G_{kj}),\, \ldots\,], \quad i=1,2,3;\; j=1,2,3,4;\; k=i+1$$

$$HM_3 = [\,\overline{Hmc}(G_{00},G_{11}),\; \overline{Hmc}(G_{00},G_{12}),\; \overline{Hmc}(G_{00},G_{13}),\; \overline{Hmc}(G_{00},G_{14})\,]$$

$$HM_4 = [\,\overline{Hmc}(G_{11},G_{14}),\; \overline{Hmc}(G_{21},G_{24}),\; \overline{Hmc}(G_{31},G_{34}),\; \overline{Hmc}(G_{41},G_{44})\,]$$

This establishes a 98-dimensional Harris correlation descriptor for the feature point P:

HCD(P)=[HS,HM1,HM2,HM3,HM4HCD(P)=[HS, HM 1 , HM 2 , HM 3 , HM 4 ]

To make the descriptor invariant to linear changes in image brightness, HCD(P) is finally normalized:

$$NHCD(P) = \frac{HCD(P)}{\left\| HCD(P) \right\|}$$

The normalized descriptor NHCD(P) effectively suppresses the influence of illumination changes, noise, and similar factors, and exhibits strong scale and rotation invariance; this descriptor therefore describes the affine-invariant features of a digital image well.
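Normalization removes any common scale factor from the descriptor. Since a linear brightness change scales the gradients uniformly, both the trace and the square root of the determinant of each MC matrix scale quadratically, so the whole raw descriptor is multiplied by one constant, which normalization cancels. A quick illustrative check:

```python
import numpy as np

def normalize(hcd):
    """NHCD(P) = HCD(P) / ||HCD(P)||."""
    return hcd / np.linalg.norm(hcd)

# A uniform gradient scaling multiplies every component of HCD(P) by the
# same constant; normalization makes the result identical.
hcd = np.arange(1.0, 99.0)            # stand-in for a 98-dim HCD(P)
assert np.allclose(normalize(hcd), normalize(3.7 * hcd))
```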

Claims (4)

1. A method for extracting affine-invariant features of an image using an M-band wavelet, characterized in that it comprises the following steps:

1) First, build a multi-scale space of the image through the M-band wavelet transform, and determine the positions and scales of the candidate feature points, i.e. the local extremum points, by detecting the local modulus maxima of the wavelet coefficients in the multi-scale space: apply the M-band wavelet transform to the image f(x, y) along the x and y directions at the increasing scales M^j, where M is a real number greater than 1 and less than 2 and j is an increasing positive integer, the wavelet function ψ(x, y) being defined as the derivative of a smoothing function along the x and y directions; this yields a multi-scale image space composed of the smoothed images at the increasing scales M^j and the corresponding wavelet coefficients; compare the wavelet coefficient at each pixel of the multi-scale space with the 8 adjacent wavelet coefficients at the same scale and the 18 wavelet coefficients at the previous and next scales; a point at which the modulus of the wavelet coefficient is a maximum is a local extremum point and is taken as a candidate feature point, denoted X = (x, y, M^j), where (x, y) is the position of the candidate feature point and M^j is the scale at which it lies;

2) Remove the unstable, low-contrast candidate feature points and take the remaining stable feature points as the centers of the feature regions, completing the localization of the feature regions: expand the smoothed images of the different scale spaces with the Taylor formula at the local extremum point X; differentiate the Taylor expansion and set the derivative to zero to obtain an offset; add the offset to X to obtain the position of the local extremum point; then substitute the offset into the Taylor expansion, and if the computed absolute value is smaller than an absolute-value threshold, the local extremum point is considered unstable and is removed; all remaining local extremum points are the feature points;

3) Determine the main direction of the feature descriptor and rotate the region around the feature point to the main direction: centered on the feature point, select a circular region whose radius is proportional to the scale of the feature point, and within this region compute the Haar-wavelet responses in the horizontal and vertical directions, denoted h_x and h_y; apply Gaussian weighting to the two values, the weighted values representing the direction components in the horizontal and vertical directions, denoted W_hx and W_hy; divide the circular region into several equally sized sector regions and accumulate W_hx and W_hy within each sector, denoted ΣW_hx and ΣW_hy; the direction of the sector in which ΣW_hx and ΣW_hy are largest is taken as the main direction of the feature point, and the angle of the main direction can be obtained from the arctangent of ΣW_hx and ΣW_hy; finally, rotate the circular region to the main direction of the feature point obtained above;

4) Finally, construct a multi-dimensional feature descriptor from the gradient information of the region around the feature point: based on the main direction of the feature point determined in the previous step, divide the circular region containing the feature point into 4 equal sectors (up, down, left and right), and at the same time partition the circular region with several concentric circles, obtaining several sub-regions of the feature-point neighborhood; from the variation of the gradient of each sub-region relative to the mean gradient of its adjacent sub-regions and the correlation of the gradient distributions between adjacent sub-regions, construct an affine-invariant multi-dimensional feature descriptor for each feature point, which is the affine-invariant feature corresponding to that feature point.

2. The method for extracting affine-invariant features of an image using an M-band wavelet according to claim 1, characterized in that the M-band wavelet transform in step 1 is specifically the 3/2-band wavelet transform in the lifting scheme.

3. The method for extracting affine-invariant features of an image using an M-band wavelet according to claim 1 or 2, characterized in that the absolute-value threshold in step 2 is 0.03.

4. The method for extracting affine-invariant features of an image using an M-band wavelet according to claim 1 or 2, characterized in that, in step 4, the variation of the gradient of each sub-region relative to the mean gradient of its adjacent sub-regions is described by the Harris autocorrelation, and the correlation of the gradient distributions between adjacent sub-regions is described by the Harris cross-correlation.
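The extremum test of claim 1, step 1 (8 neighbours at the same scale plus 9 + 9 at the adjacent scales) can be sketched over a precomputed stack of wavelet-coefficient moduli; the 3-D array layout and function name are assumptions for illustration:

```python
import numpy as np

def local_modulus_maxima(modulus):
    """Return (scale, y, x) indices of points whose wavelet-coefficient
    modulus exceeds its 8 same-scale neighbours and the 9 + 9 neighbours
    at the previous and next scales (26 comparisons in total).
    modulus : 3-D array, axis 0 = scale index j."""
    S, H, W = modulus.shape
    maxima = []
    for s in range(1, S - 1):
        for y in range(1, H - 1):
            for x in range(1, W - 1):
                cube = modulus[s - 1:s + 2, y - 1:y + 2, x - 1:x + 2]
                c = modulus[s, y, x]
                # strictly greater than all 26 neighbours: only the center
                # of the 3x3x3 cube may reach the value c
                if c > 0 and np.sum(cube >= c) == 1:
                    maxima.append((s, y, x))
    return maxima
```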
CN2010101092568A 2010-02-11 2010-02-11 Method for extracting affine invariant feature of image by using M-band wavelet Active CN101957916B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN2010101092568A CN101957916B (en) 2010-02-11 2010-02-11 Method for extracting affine invariant feature of image by using M-band wavelet

Publications (2)

Publication Number Publication Date
CN101957916A CN101957916A (en) 2011-01-26
CN101957916B true CN101957916B (en) 2012-06-27

Family

ID=43485239

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103455817A (en) * 2013-04-28 2013-12-18 南京理工大学 Method for extracting human body features of robust time-space domain

Families Citing this family (8)

Publication number Priority date Publication date Assignee Title
CN102222228B (en) * 2011-05-26 2013-09-04 北京建筑工程学院 Method for extracting feature points of images
CN103093226B (en) * 2012-12-20 2016-01-20 华南理工大学 A kind of building method of the RATMIC descriptor for characteristics of image process
JP6448767B2 (en) * 2014-04-24 2019-01-09 ナント・ホールデイングス・アイ・ピー・エル・エル・シー Robust feature identification in image object recognition
CN104156723B (en) * 2014-09-01 2016-03-02 中国人民解放军国防科学技术大学 A kind of extracting method with the most stable extremal region of scale invariability
CN104881877A (en) * 2015-06-12 2015-09-02 哈尔滨工业大学 Method for detecting image key point based on convolution and time sequence optimization of FPGA
CN106296719A (en) * 2016-11-01 2017-01-04 山东省科学院情报研究所 The intelligent safety check instrument of blending algorithm based on a kind of local invariant features and safety inspection method
CN109711416B (en) * 2018-11-23 2021-08-06 西安天和防务技术股份有限公司 Target identification method and device, computer equipment and storage medium
CN110969145B (en) * 2019-12-19 2020-08-28 珠海大横琴科技发展有限公司 Remote sensing image matching optimization method and device, electronic equipment and storage medium

Family Cites Families (1)

Publication number Priority date Publication date Assignee Title
CN101470805B (en) * 2007-12-28 2012-01-04 北大方正集团有限公司 Characteristics information extraction method and device for static image target

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
ASS Succession or assignment of patent right

Owner name: SHANXI GREEN ELECTRO-OPTIC INDUSTRY TECHNOLOGY INS

Free format text: FORMER OWNER: DEFENSIVE SCIENTIFIC AND TECHNOLOGICAL UNIV., PLA

Effective date: 20130514

C41 Transfer of patent application or patent right or utility model
COR Change of bibliographic data

Free format text: CORRECT: ADDRESS; FROM: 410073 CHANGSHA, HUNAN PROVINCE TO: 033300 LVLIANG, SHAANXI PROVINCE

TR01 Transfer of patent right

Effective date of registration: 20130514

Address after: 033300 Shanxi city of Lvliang province Liulin County Li Jia Wan Xiang Ge duo Cun Bei River No. 1

Patentee after: SHANXI GREEN OPTOELECTRONIC INDUSTRY SCIENCE AND TECHNOLOGY RESEARCH INSTITUTE (CO., LTD.)

Address before: 410073 Hunan province Changsha Kaifu District, Deya Road No. 109

Patentee before: National University of Defense Technology of People's Liberation Army of China

ASS Succession or assignment of patent right

Owner name: HUNAN VISIONSPLEND OPTOELECTRONIC TECHNOLOGY CO.,

Free format text: FORMER OWNER: SHANXI GREEN ELECTRO-OPTIC INDUSTRY TECHNOLOGY INSTITUTE (CO., LTD.)

Effective date: 20140110

C41 Transfer of patent application or patent right or utility model
COR Change of bibliographic data

Free format text: CORRECT: ADDRESS; FROM: 033300 LVLIANG, SHAANXI PROVINCE TO: 410073 CHANGSHA, HUNAN PROVINCE

TR01 Transfer of patent right

Effective date of registration: 20140110

Address after: 410073 Hunan province Changsha Kaifu District, 31 Road No. 303 Building 5 floor A Di Shang Yong

Patentee after: HUNAN VISION SPLEND PHOTOELECTRIC TECHNOLOGY Co.,Ltd.

Address before: 033300 Shanxi city of Lvliang province Liulin County Li Jia Wan Xiang Ge duo Cun Bei River No. 1

Patentee before: SHANXI GREEN OPTOELECTRONIC INDUSTRY SCIENCE AND TECHNOLOGY RESEARCH INSTITUTE (CO., LTD.)