
CN107909018A - A robust multimodal remote sensing image matching method and system - Google Patents


Info

Publication number
CN107909018A
CN107909018A (application CN201711078617.5A; granted publication CN107909018B)
Authority
CN
China
Prior art keywords
feature
matching
map
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201711078617.5A
Other languages
Chinese (zh)
Other versions
CN107909018B (en)
Inventor
叶沅鑫
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Southwest Jiaotong University
Original Assignee
Southwest Jiaotong University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Southwest Jiaotong University filed Critical Southwest Jiaotong University
Priority to CN201711078617.5A priority Critical patent/CN107909018B/en
Publication of CN107909018A publication Critical patent/CN107909018A/en
Application granted granted Critical
Publication of CN107909018B publication Critical patent/CN107909018B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 20/00: Scenes; Scene-specific elements
    • G06V 20/10: Terrestrial scenes
    • G06V 20/13: Satellite images
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00: Arrangements for image or video recognition or understanding
    • G06V 10/40: Extraction of image or video features
    • G06V 10/44: Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
    • G06V 10/443: Local feature extraction by matching or filtering

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Astronomy & Astrophysics (AREA)
  • Remote Sensing (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses a robust multimodal remote sensing image matching method and system that can integrate a variety of local feature descriptors for the automatic matching of multimodal remote sensing images. First, dense grid sampling is performed on the images and a local feature descriptor is extracted at every grid point, forming a three-dimensional dense feature representation map. A fast matching similarity measure is then established in the frequency domain on this representation using the three-dimensional Fourier transform, and template matching is finally applied to identify corresponding points. In addition, a new dense feature representation, the Channel Feature of Orientated Gradient (CFOG), is proposed for the matching method; it outperforms existing dense representations based on HOG, LSS, SIFT, SURF, and similar descriptors in both matching performance and computational efficiency. The invention can effectively overcome the nonlinear radiometric differences among multimodal images such as visible, infrared, LiDAR, synthetic aperture radar, and map data, quickly and accurately identify corresponding points between images, and thus achieve automatic image matching.

Description

A Robust Multimodal Remote Sensing Image Matching Method and System

Technical Field

The invention relates to the field of satellite image processing, in particular to a multimodal remote sensing image matching method and system for the automatic matching of multimodal images such as visible, infrared, LiDAR, synthetic aperture radar, and map data.

Background

Image matching is the process of identifying corresponding points between two or more images. It is a basic preprocessing step for many remote sensing analyses such as image fusion, change detection, and image mosaicking, and its accuracy strongly affects subsequent analysis. Modern remote sensing satellites carry global positioning and inertial navigation systems that allow direct georeferencing and coarse matching of imagery, eliminating obvious rotation and scale differences so that only a bounded translation (for example, within a few tens of pixels) remains between images. Nevertheless, because of their different imaging mechanisms, multimodal remote sensing images (visible, infrared, LiDAR, synthetic aperture radar, and so on) exhibit significant nonlinear radiometric differences, which makes the automatic matching of corresponding points very challenging.

Current matching methods for multimodal remote sensing images fall mainly into two categories: feature-based matching and area-based matching. Feature-based matching aligns images through the similarity of salient features; commonly used features include points, lines, and regions. Recently, locally invariant features such as the Scale-Invariant Feature Transform (SIFT) and shape context have also seen some use in remote sensing image matching. However, these methods require features that can be extracted with a high repetition rate in both images; for multimodal imagery with significant radiometric differences the repetition rate is often low, so such methods remain of limited use for automatic multimodal matching.

Area-based methods mainly adopt a template matching strategy and identify corresponding points between images using some matching similarity measure as the criterion. The choice of similarity measure is crucial and directly affects the matching accuracy. Common measures include the sum of squared intensity differences, the normalized correlation coefficient, and mutual information. All of these rely on the intensity similarity between images; since intensity information differs substantially across modalities, they are poorly suited to the automatic matching of multimodal remote sensing images. Compared with intensity, the structural and shape attributes of images are much more similar across modalities, and related work has used local descriptors such as the Histogram of Oriented Gradients (HOG) and Local Self-Similarity (LSS) to extract structural and shape features and build similarity measures upon them, improving matching performance. Nevertheless, HOG and LSS construct features only from local image regions, or compute features from the neighborhoods of extracted feature points; this sparse feature representation has difficulty capturing the attributes shared across modalities, and its computation is relatively inefficient.

In view of this, the present invention constructs a robust multimodal remote sensing image matching method that can integrate a variety of local feature descriptors for automatic multimodal matching. The method first performs dense grid sampling on the image and extracts a local feature descriptor such as HOG, LSS, SIFT, or Speeded-Up Robust Features (SURF) at every grid point, forming a three-dimensional dense feature representation map that reflects the structural and shape attributes shared between the images. A matching similarity measure is then established on this representation, and a template matching strategy is used to identify corresponding points. In addition, a dense descriptor based on orientated gradient features, named the Channel Feature of Orientated Gradient (CFOG), is constructed for the method; it outperforms dense HOG, LSS, SIFT, and SURF representations in both matching performance and computational efficiency.

Summary of the Invention

The purpose of the present invention is to overcome the shortcomings of traditional matching methods and provide a robust multimodal remote sensing image matching method. The method uses a dense feature representation to extract the structural and shape features shared between images and builds a fast matching similarity measure on top of it, so that a large number of evenly distributed corresponding points can be obtained quickly and accurately between multimodal remote sensing images. In addition, a novel dense feature representation, named the Channel Feature of Orientated Gradient (CFOG), is constructed for the method.

In one aspect, the present invention provides a robust multimodal remote sensing image matching method comprising the following steps:

A. Check the resolutions of the reference image and the input image. If the two images have the same resolution, proceed with subsequent processing; otherwise, resample the two images to a common resolution.

B. Using a block-partition strategy, detect a series of evenly distributed feature points on the reference image, denoted P1i (i = 1, 2, 3, ..., N), and select a template region AreaW1i centered on each point P1i.

C. Using the geographic coordinate information carried by the remote sensing imagery, predict for the point set P1i (i = 1, 2, 3, ..., N) the corresponding matching region AreaW2i on the input image.

D. Perform dense grid sampling within the matching region and construct a three-dimensional dense feature representation map.

E. On the basis of the three-dimensional dense feature representation map, establish a similarity measure and match corresponding points.

F. For each obtained correspondence, fit a local extremum to its similarity map to solve for the sub-pixel position of the matching point.

G. Repeat steps (C)-(F) for every point of P1i (i = 1, 2, 3, ..., N) to obtain corresponding point pairs {PD1i(x,y), PD2i(x,y)} (i = 1, 2, 3, ..., N) with sub-pixel accuracy.

H. Remove the pairs in {PD1i(x,y), PD2i(x,y)} (i = 1, 2, 3, ..., N) with large errors to obtain the final corresponding point pairs {PID1i(x,y), PID2i(x,y)} (i = 1, 2, 3, ..., S).
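The patent does not prescribe how the large-error pairs of step H are screened. One minimal sketch, assuming the coarse georeferencing of step C has reduced the geometric difference between the images to a translation, rejects pairs whose displacement deviates from the median displacement; the function name and the tolerance value are illustrative:

```python
import numpy as np

def reject_outliers(p1, p2, thresh=1.5):
    """Keep pairs whose displacement lies within `thresh` pixels of the
    median displacement over all pairs; `thresh` is an illustrative value."""
    d = np.asarray(p2, dtype=float) - np.asarray(p1, dtype=float)
    med = np.median(d, axis=0)                    # robust estimate of the shift
    return np.linalg.norm(d - med, axis=1) <= thresh
```

A least-squares affine fit with iterative residual screening, or RANSAC, would serve the same purpose when a residual rotation or scale remains.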

Step D comprises: performing dense grid sampling on the image data within the matching region and computing a local feature descriptor such as HOG, LSS, SIFT, or SURF at every grid point to form a three-dimensional dense feature representation map.

Further, step D may instead construct the Channel Feature of Orientated Gradient (CFOG) within the matching region, comprising the following steps:

D1. Perform dense grid sampling on the image data within the matching region and compute the gradient of every grid point in each quantized orientation, forming a three-dimensional orientated gradient map.

D2. Convolve the three-dimensional orientated gradient map with a Gaussian or triangular filter in the horizontal (X) and vertical (Y) directions to obtain a feature map gs, and then convolve gs with the one-dimensional filter [1, 2, 1] in the Z direction to obtain a feature map gc.

D3. Normalize the feature map gc to obtain the final orientated gradient channel feature map.
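Steps D1-D3 can be sketched in NumPy as follows. The function and parameter names (`cfog`, `n_orient`, `sigma`) are illustrative, plain separable convolutions stand in for optimized filtering, and the positive-part clipping described in the text is subsumed by the absolute value here; this is a sketch of the described pipeline, not the patented implementation:

```python
import numpy as np

def cfog(img, n_orient=9, sigma=0.8, eps=1e-5):
    """Sketch of the CFOG construction (steps D1-D3) on a 2-D image."""
    img = img.astype(np.float64)
    # Step D1: X/Y gradients with the [-1, 0, 1] filters (central differences).
    gx = np.zeros_like(img); gy = np.zeros_like(img)
    gx[:, 1:-1] = img[:, 2:] - img[:, :-2]
    gy[1:-1, :] = img[2:, :] - img[:-2, :]
    # Orientation channels: absolute gradient projection per quantized direction.
    thetas = np.arange(n_orient) * (2.0 * np.pi / n_orient)
    chans = [np.abs(np.cos(t) * gx + np.sin(t) * gy) for t in thetas]
    g = np.stack(chans, axis=-1)                  # H x W x n_orient gradient map
    # Step D2: separable Gaussian smoothing in X and Y, then [1, 2, 1] along Z.
    r = max(1, int(3 * sigma)); xs = np.arange(-r, r + 1)
    k = np.exp(-xs**2 / (2.0 * sigma**2)); k /= k.sum()
    g = np.apply_along_axis(lambda v: np.convolve(v, k, mode="same"), 0, g)
    g = np.apply_along_axis(lambda v: np.convolve(v, k, mode="same"), 1, g)
    g = np.apply_along_axis(lambda v: np.convolve(v, [1, 2, 1], mode="same"), 2, g)
    # Step D3: per-pixel L2 normalization of the Z-direction feature vector.
    norm = np.sqrt((g ** 2).sum(axis=-1, keepdims=True)) + eps
    return g / norm
```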

In step D, the CFOG within the matching region is computed as follows:

For all grid points in the region, compute the horizontal (X-direction) and vertical (Y-direction) gradients gx and gy using the one-dimensional filters [-1, 0, 1] and [-1, 0, 1]T, respectively.

Using gx and gy, compute their gradient value gθ in each quantized orientation as:

    gθ = ⌊ abs(cos θ · gx + sin θ · gy) ⌋        (1)

where θ denotes the quantized gradient orientation, abs denotes the absolute value, and the operator ⌊·⌋ returns its argument when the value is positive and 0 otherwise.

Stack the gθ of all orientations to form a three-dimensional orientated gradient map go; then convolve go in the X and Y directions with a two-dimensional Gaussian or triangular filter of standard deviation σ to obtain the feature map gs, and convolve gs with the one-dimensional filter [1, 2, 1] in the Z direction to obtain the feature map gc.

Every pixel of the feature map gc corresponds to a feature vector vi along the Z direction. Traverse every pixel and normalize its feature vector vi to obtain the final orientated gradient channel feature map; the normalization is computed as:

    v̄i = vi / (‖vi‖ + ε)        (2)

where ε is a small number that avoids division by zero.

The gradient orientation is divided evenly into n bins over the 360° range, each bin spanning (360/n)°.

Further, step E establishes a similarity measure on the basis of the three-dimensional dense feature representation map to obtain the matching position of the image.

Step E specifically comprises the following computations:

After step D, the three-dimensional dense feature representation maps D1 and D2 of the regions AreaW1i and AreaW2i are obtained. D1 is slid over D2 as a template, and matching is performed using the similarity between D1 and D2; the similarity measure may be the sum of squared differences, the Euclidean distance, normalized cross-correlation, phase correlation, and so on.

The sum of squared differences is computed as:

    Si(v) = Σc [ D1(c) - D2(c - v) ]²        (3)

The Euclidean distance is computed as:

    Si(v) = √( Σc [ D1(c) - D2(c - v) ]² )        (4)

The normalized cross-correlation is computed as:

    Si(v) = Σc [D1(c) - μ1][D2(c - v) - μ2] / ( √(Σc [D1(c) - μ1]²) · √(Σc [D2(c - v) - μ2]²) )        (5)

where μ1 and μ2 denote the mean values of D1 and of the window of D2 under comparison. The phase correlation is computed as:

    Si = F⁻¹( F(D1) · F*(D2) / |F(D1) · F*(D2)| )        (6)

In formulas (3)-(6), c denotes the coordinates in the three-dimensional dense feature representation map, v denotes the offset between D1 and D2, Si denotes the similarity measure between D1 and D2, F denotes the forward fast Fourier transform, and F* denotes the complex conjugate of F. For the sum of squared differences and the Euclidean distance, the offset vi between D1 and D2 is obtained where Si attains its minimum; for normalized cross-correlation and phase correlation, it is obtained where Si attains its maximum.
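A direct spatial-domain realization of the template search with the sum-of-squared-differences measure of formula (3) could look like the following brute-force sketch (the other measures of formulas (4)-(6) plug into the same sliding loop); it is written for clarity, not efficiency, and the function name is illustrative:

```python
import numpy as np

def ssd_match(D1, D2):
    """Slide template feature map D1 (h x w x z) over search map D2 (H x W x z)
    and return the top-left offset (dy, dx) minimizing the SSD of formula (3)."""
    th, tw, _ = D1.shape
    H, W, _ = D2.shape
    best_s, best_v = np.inf, (0, 0)
    for dy in range(H - th + 1):
        for dx in range(W - tw + 1):
            s = np.sum((D1 - D2[dy:dy + th, dx:dx + tw]) ** 2)
            if s < best_s:
                best_s, best_v = s, (dy, dx)
    return best_v
```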

In addition, for fast matching, the invention constructs a similarity measure based on the fast Fourier transform. The measure originates from the sum of squared differences: as noted above, the matching position vi is obtained where the sum of squared differences is smallest, i.e.

    vi = arg min_v Σc [ D1(c) - D2(c - v) ]²        (7)

Expanding formula (7) gives:

    Σc D1(c)² + Σc D2(c - v)² - 2 Σc D1(c) · D2(c - v)        (8)

In formula (8), the values of the first and second terms are close to constant, so formula (7) attains its minimum when the value of the third term is largest. The similarity function can therefore be redefined as:

    Si(v) = Σc D1(c) · D2(c - v)        (9)

where the sum over c is a convolution-type (correlation) operation.

Since pointwise multiplication in the frequency domain is equivalent to convolution in the spatial domain, the fast Fourier transform is used in the frequency domain to improve computational efficiency, giving the Fourier-transform-based similarity function:

    Si(v) = F⁻¹( F(D1) · F*(D2) ),  vi = arg max_v Si(v)        (10)

where F and F⁻¹ denote the forward and inverse fast Fourier transforms, respectively, and F* denotes the complex conjugate of F; F may be a two-dimensional or three-dimensional Fourier transform.
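Formula (10) can be sketched with NumPy's n-dimensional FFT. The sign convention of the recovered offset depends on the correlation direction; in this sketch the returned value is the circular shift that maps D2 back onto D1, an implementation choice not fixed by the patent text:

```python
import numpy as np

def fft_similarity(D1, D2):
    """S = F^{-1}( F(D1) . F*(D2) ) of formula (10), via n-dimensional FFTs.

    Returns the similarity map and the signed circular shift at its peak;
    np.roll(D2, shift, axis=...) realigns D2 with D1.
    """
    F1 = np.fft.fftn(D1)
    F2 = np.fft.fftn(D2)
    S = np.fft.ifftn(F1 * np.conj(F2)).real       # circular cross-correlation
    peak = np.unravel_index(np.argmax(S), S.shape)
    # Map peak indices to signed shifts (indices past N/2 wrap to negative).
    shift = tuple(p if p <= n // 2 else p - n for p, n in zip(peak, S.shape))
    return S, shift
```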

In another aspect, the present invention also provides a robust multimodal remote sensing image matching system comprising the following units:

a preprocessing unit for checking the resolutions of the reference image and the input image: if the two images have the same resolution, the subsequent units are executed; otherwise, the two images are resampled to a common resolution;

a template region selection unit for detecting, with a block-partition strategy, a series of evenly distributed feature points on the reference image, denoted P1i (i = 1, 2, 3, ..., N), and selecting a template region AreaW1i centered on each point P1i;

a matching region selection unit for predicting, from the geographic coordinate information carried by the remote sensing imagery, the matching region AreaW2i corresponding to the point set P1i (i = 1, 2, 3, ..., N) on the input image;

a feature extraction unit for constructing a three-dimensional dense feature representation map within the matching region;

an initial matching unit which, on the basis of the three-dimensional dense feature representation map, establishes a similarity measure and matches corresponding points; for each obtained correspondence, it fits a local extremum to the similarity map to solve for the sub-pixel position of the matching point; repeating these operations for every point of P1i (i = 1, 2, 3, ..., N) yields corresponding point pairs {PD1i(x,y), PD2i(x,y)} (i = 1, 2, 3, ..., N) with sub-pixel accuracy;

a matching point screening unit for removing the pairs in {PD1i(x,y), PD2i(x,y)} (i = 1, 2, 3, ..., N) with large errors to obtain the final corresponding point pairs {PID1i(x,y), PID2i(x,y)} (i = 1, 2, 3, ..., S).

Further, the feature extraction unit performs dense grid sampling on the image data within the matching region and computes a local feature descriptor at every grid point to form the three-dimensional dense feature representation map.

Further, the initial matching unit establishes the image matching similarity measure on the basis of the three-dimensional dense feature representation map, computes the similarity map, and takes the extremum position of the similarity map as the matching position of the image.
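The local extremum fitting used by the initial matching unit (step F) is not pinned to a particular function in the text. One minimal sketch is a one-dimensional quadratic (parabola) fit through the three similarity values around the integer peak, applied separately along each axis of the similarity map; the function name and three-sample form are illustrative assumptions:

```python
def subpixel_peak(s_m1, s_0, s_p1):
    """Vertex of the parabola through (-1, s_m1), (0, s_0), (+1, s_p1).

    Returns the sub-pixel offset of the extremum relative to the integer
    peak position; call once per axis of the similarity map.
    """
    denom = s_m1 - 2.0 * s_0 + s_p1
    if denom == 0:
        return 0.0  # flat neighborhood: keep the integer position
    return 0.5 * (s_m1 - s_p1) / denom
```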

In summary, owing to the above technical scheme, the beneficial effects of the present invention are:

(1) The invention constructs a robust multimodal remote sensing image matching method. By densely grid-sampling the image and extracting a local feature descriptor (such as HOG, LSS, SIFT, or SURF) at every grid point, a three-dimensional dense feature representation map is formed that reflects the structural and shape attributes shared among multimodal remote sensing images, and a matching similarity measure is established on that representation. The method can quickly, accurately, and automatically obtain a large number of evenly distributed corresponding points between multimodal images, which effectively improves production efficiency and meets the needs of operational use. Moreover, it is a general method that can integrate a variety of local feature descriptors for image matching (not limited to HOG, LSS, SIFT, or SURF).

(2) In the constructed method, the three-dimensional dense feature representation is formed by densely grid-sampling the image and computing a local feature description such as HOG, LSS, SIFT, or SURF at every grid point; it is a dense representation. This differs from traditional descriptors such as HOG, LSS, SIFT, or SURF, which construct features only from local image regions or from the neighborhoods of extracted feature points, i.e., a sparse representation. In contrast, the dense three-dimensional representation of the invention reflects the shared structural and shape attributes of multimodal remote sensing images more fully and more precisely, with more robust matching performance, and together with the proposed similarity measure over this representation it enables fast and accurate matching between multimodal images.

(3) For the constructed method, the invention uses the orientated gradient information of the image to build a novel dense feature descriptor, the Channel Feature of Orientated Gradient (CFOG), which outperforms dense HOG, LSS, SIFT, and SURF representations in both matching efficiency and accuracy.

(4) On the basis of the three-dimensional dense feature representation, the sum of squared differences, the Euclidean distance, normalized cross-correlation, phase correlation, and other measures (not limited to these) can be used as the similarity measure for identifying corresponding points. In addition, the invention establishes a fast Fourier-transform-based similarity measure with higher computational efficiency and better results.

(5) Extensive experimental results show that the overall matching accuracy is within 1 pixel for images of flat areas and within 2.5 pixels for images of mountainous and urban areas. For large remote sensing images (more than 20000 × 20000 pixels), matching completes within 30 seconds. Compared with currently popular commercial remote sensing software (ENVI and ERDAS), the invented method has advantages in both matching accuracy and computational efficiency.

Brief Description of the Drawings

Fig. 1 is the overall flowchart of the invention.

Fig. 2 is a schematic diagram of the three-dimensional dense feature representation of the invention.

Fig. 3 shows the construction process of the Channel Feature of Orientated Gradient (CFOG) in the invention.

Detailed Description

To help those skilled in the art better understand the technical scheme of the invention, the scheme is described clearly and completely below with reference to the accompanying drawings. Based on the embodiments in this application, all other similar embodiments obtained by those of ordinary skill in the art without creative work shall fall within the scope of protection of this application.

As shown in Fig. 1, a robust multimodal remote sensing image matching method comprises the following steps.

Step A: from the resolution information of the reference image and the input image, determine whether the two resolutions agree; if they do, proceed with subsequent processing, and if not, resample the two images to the same resolution.

Step B: using a block-partition strategy, extract a large number of evenly distributed feature points on the reference image with the Harris or Forstner operator, specifically:

Divide the reference image into n × n non-overlapping blocks. In each block, compute the Harris or Forstner response of every pixel, sort the responses in descending order, and keep the k pixels with the largest responses as feature points. In this way k feature points are detected in each block and the whole image yields n × n × k feature points; the values of n and k can be set according to actual needs. Denote the feature point set detected on the reference image by P1i (i = 1, 2, 3, ..., N).
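Given a precomputed Harris or Forstner response map, the block-partition selection of step B can be sketched as follows (computing the corner response itself is omitted, and the function name is illustrative):

```python
import numpy as np

def blockwise_topk(response, n_blocks, k):
    """Partition a corner-response map into n_blocks x n_blocks blocks and
    keep the k strongest points per block, as (row, col) tuples."""
    H, W = response.shape
    pts = []
    for bi in range(n_blocks):
        for bj in range(n_blocks):
            y0, y1 = bi * H // n_blocks, (bi + 1) * H // n_blocks
            x0, x1 = bj * W // n_blocks, (bj + 1) * W // n_blocks
            block = response[y0:y1, x0:x1]
            # Indices of the k largest responses within the block.
            for f in np.argsort(block.ravel())[::-1][:k]:
                dy, dx = divmod(int(f), block.shape[1])
                pts.append((y0 + dy, x0 + dx))
    return pts
```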

In other embodiments, other operators may also be used for feature extraction, which the invention does not limit.

Step C: using the geographic coordinate information carried by the remote sensing imagery, predict the matching search region corresponding to the point set P1i (i = 1, 2, 3, ..., N) on the input image, specifically:

(1) Take a point P1i(x, y) of P1i (i = 1, 2, 3, ..., N), where x and y are the image coordinates of P1i; select a template region AreaW1i of size M × M centered on P1i(x, y), and obtain the geographic coordinate Geoi corresponding to the point.

(2) From the geographic coordinate Geoi and the geographic coordinate information of the input image, compute the corresponding image coordinate P2i(x, y) on the input image, and take a square window of size M1 × M1 centered on P2i(x, y) as the matching search region AreaW2i.
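The prediction of step C amounts to mapping the geographic coordinate Geoi through the input image's georeferencing and clipping the M1 × M1 window to the image bounds. A minimal sketch, assuming a GDAL-style affine geotransform with no rotation terms (the patent only requires that the imagery carries geographic coordinate information):

```python
def geo_to_pixel(geo_x, geo_y, gt):
    """Invert gt = (origin_x, pixel_w, 0, origin_y, 0, -pixel_h),
    the GDAL geotransform convention, ignoring the rotation terms."""
    col = (geo_x - gt[0]) / gt[1]
    row = (geo_y - gt[3]) / gt[5]
    return int(round(row)), int(round(col))

def search_window(center_rc, m1, shape):
    """Clip an m1 x m1 window centered on (row, col) to the image bounds;
    returns (row_start, row_stop, col_start, col_stop)."""
    r, c = center_rc
    h = m1 // 2
    r0, c0 = max(r - h, 0), max(c - h, 0)
    r1, c1 = min(r + h + 1, shape[0]), min(c + h + 1, shape[1])
    return r0, r1, c0, c1
```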

Step D: construct three-dimensional dense feature representation maps within the regions AreaW1i and AreaW2i; see Fig. 2.

In one embodiment, step D comprises the following steps:

Sample the pixels of the region on a dense, evenly spaced grid, compute a local feature descriptor such as HOG, LSS, SIFT, or SURF for each grid point, and arrange the feature vector of each grid point along the Z direction to form the three-dimensional dense feature representation map. The grid sampling interval is η ∈ [0.5, 4] pixels; the smaller the interval, the higher the matching accuracy but the greater the computational cost. With an interval of 1 pixel, i.e. per-pixel sampling, the resulting dense feature map is a per-pixel feature representation. If the interval is fractional, the pixel values in the neighborhood of each grid point must also be interpolated to compute the local feature descriptors. The specific local feature descriptor used is not limited in this embodiment.

在另一个实施例中,步骤D为在区域AreaW1i和AreaW2i内构建方向梯度通道特征(Channel Feature of Orientated Gradient,CFOG),参见图3,具体包括以下步骤:In another embodiment, step D is to construct a directional gradient channel feature (Channel Feature of Orientated Gradient, CFOG) in the areas AreaW 1i and AreaW 2i , referring to Figure 3, specifically comprising the following steps:

(1) For all pixels in the area, compute their gradients gx and gy in the horizontal (X) direction and the vertical (Y) direction using the one-dimensional filters [-1,0,1] and [-1,0,1]T respectively.

(2) Use gx and gy to compute the gradient value gθ in each quantized direction:

gθ = ⌊abs(gx·cosθ + gy·sinθ)⌋   (1)

where θ denotes the quantized gradient direction: the gradient directions are divided evenly into n parts over the 360° range, each with an angular interval of (360/n)°; abs denotes the absolute value, and the symbol ⌊·⌋ keeps its argument when it is positive and returns 0 otherwise.

(3) Stack the gθ of all directions together to form a three-dimensional orientation-gradient map go. Then convolve go in the X and Y directions with a two-dimensional Gaussian filter of standard deviation σ, or with a triangular filter, to obtain an intermediate feature map; then convolve that feature map along the Z direction with the one-dimensional filter [1,2,1] to obtain the final feature map.

(4) Each pixel of the resulting feature map corresponds to a feature vector fi along the Z direction. Traverse every pixel and normalize its feature vector fi to further suppress the influence of illumination changes, yielding the final CFOG feature map. The normalization is computed as:

fi ← fi / sqrt(||fi||² + ε)   (2)

where ε is a number that avoids division by zero.
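The CFOG construction of steps (1)–(4) can be sketched as follows. The number of orientations, σ, ε and the simplified boundary handling (zero padding, non-cyclic Z smoothing) are assumed values, so this is an illustration rather than the exact implementation:

```python
import numpy as np

def conv1d(arr, kernel, axis):
    """Convolve with a 1-D kernel along one axis (zero padding at borders)."""
    return np.apply_along_axis(lambda m: np.convolve(m, kernel, mode="same"), axis, arr)

def build_cfog(img, n_orient=9, sigma=0.8, eps=1e-5):
    img = img.astype(float)
    # step (1): gradients with the 1-D filters [-1,0,1] and its transpose
    # (the sign convention is immaterial after the abs below)
    gx = conv1d(img, np.array([-1.0, 0.0, 1.0]), axis=1)
    gy = conv1d(img, np.array([-1.0, 0.0, 1.0]), axis=0)
    # step (2): orientation channels g_theta = abs(gx*cos(t) + gy*sin(t))
    thetas = np.arange(n_orient) * (2 * np.pi / n_orient)
    g = np.stack([np.abs(gx * np.cos(t) + gy * np.sin(t)) for t in thetas], axis=-1)
    # step (3): X/Y smoothing with a Gaussian of standard deviation sigma,
    # then Z smoothing with the 1-D filter [1,2,1]
    r = max(1, int(3 * sigma))
    x = np.arange(-r, r + 1)
    gauss = np.exp(-x ** 2 / (2 * sigma ** 2))
    gauss /= gauss.sum()
    g = conv1d(conv1d(g, gauss, axis=0), gauss, axis=1)
    g = conv1d(g, np.array([1.0, 2.0, 1.0]), axis=2)
    # step (4): per-pixel normalisation f_i / sqrt(||f_i||^2 + eps)
    return g / np.sqrt(np.sum(g ** 2, axis=-1, keepdims=True) + eps)
```

The output is the three-dimensional pixel-by-pixel feature representation used by the matching step: one normalized orientation vector per pixel along the Z axis.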

步骤E,在三维稠密特征表达图的基础上,建立相似性测度进行同名点匹配,具体包括以下步骤:Step E, on the basis of the three-dimensional dense feature expression graph, establish a similarity measure to match the points with the same name, specifically including the following steps:

(1)经过步骤D后,可以分别得到区域AreaW1i和AreaW2i对应的稠密特征表达图D1和D2,将D1作为模板在D2进行滑动,利用D1和D2之间的相似性进行匹配,相似性测度可为差平方和、欧式距离、归一化互相关、相位相关等,另外对于使用何种具体的相似性测度,本实施例中对此不做限定。(1) After step D, the dense feature expression graphs D 1 and D 2 corresponding to areas AreaW 1i and AreaW 2i can be obtained respectively, and D 1 is used as a template to slide on D 2 , using the similarity between D 1 and D 2 The similarity measure can be sum of squared difference, Euclidean distance, normalized cross-correlation, phase correlation, etc. In addition, the specific similarity measure used is not limited in this embodiment.

The sum of squared differences is computed as:

S_i^1(v) = Σ_c [D1(c) − D2(c − v)]²   (3)

The Euclidean distance is computed as:

S_i^2(v) = sqrt( Σ_c [D1(c) − D2(c − v)]² )   (4)

The normalized cross-correlation is computed as:

S_i^3(v) = Σ_c (D1(c) − D̄1)(D2(c − v) − D̄2(c − v)) / sqrt( Σ_c (D1(c) − D̄1)² · Σ_c (D2(c − v) − D̄2(c − v))² )   (5)

The phase correlation is computed as:

S_i^4(v) = F⁻¹{ F(D1(c))·F*(D2(c − v)) / |F(D1(c))·F*(D2(c − v))| }   (6)

In equations (3)–(6), c denotes coordinates in the three-dimensional dense feature representation map, v the offset between D1 and D2, S_i the similarity measure between D1 and D2, F the forward fast Fourier transform, and F* the complex conjugate of F. For the sum of squared differences and the Euclidean distance, the offset v_i between D1 and D2 is obtained where S_i attains its minimum; for the normalized cross-correlation and the phase correlation, it is obtained where S_i attains its maximum.
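Sliding the template feature map D1 over the search feature map D2 with the sum of squared differences can be sketched as follows (an exhaustive spatial search, illustrative only; the other measures plug into the same loop):

```python
import numpy as np

def match_ssd(D1, D2):
    """Return the (dx, dy) offset of template map D1 inside search map D2
    that minimises the sum of squared differences."""
    th, tw, _ = D1.shape
    sh, sw, _ = D2.shape
    best, best_off = np.inf, (0, 0)
    for dy in range(sh - th + 1):
        for dx in range(sw - tw + 1):
            # SSD over all feature channels at this candidate offset
            ssd = np.sum((D1 - D2[dy:dy + th, dx:dx + tw]) ** 2)
            if ssd < best:
                best, best_off = ssd, (dx, dy)
    return best_off
```

This brute-force search costs O(M1²·M²·n) per point, which is what motivates the frequency-domain formulation derived next in the text.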

In addition, the present invention constructs a fast similarity measure based on the fast Fourier transform. This measure originates from the sum of squared differences: as stated above, the matching position v_i is obtained when the sum of squared differences is minimal:

v_i = argmin_v { Σ_c [D1(c) − D2(c − v)]² }   (7)

Expanding equation (7) gives:

v_i = argmin_v { Σ_c D1²(c) + Σ_c D2²(c − v) − 2 Σ_c D1(c)·D2(c − v) }   (8)

In equation (8), the values of the first two terms are approximately constant, so equation (7) attains its minimum when the third term attains its maximum. The similarity function can therefore be redefined as:

v_i = argmax_v { Σ_c D1(c)·D2(c − v) }   (9)

where the sum in equation (9) is a convolution operation.

Since a point-wise multiplication in the frequency domain is equivalent to a convolution in the spatial domain, the fast Fourier transform is used in the frequency domain to improve computational efficiency, giving the Fourier-transform-based similarity measure:

v_i = argmax_v { F⁻¹[ F(D1(c))·F*(D2(c − v)) ] }   (10)

where F and F⁻¹ denote the forward and inverse fast Fourier transforms respectively and F* the complex conjugate of F; F may be a two- or three-dimensional Fourier transform. Matching with equation (10) proceeds as follows: first apply the fast Fourier transform to D1 to obtain F(D1), apply the Fourier transform to D2 and take the complex conjugate to obtain F*(D2), then multiply F(D1) and F*(D2) point-wise and apply the inverse Fourier transform to the result to obtain the similarity map between D1 and D2. The position of the maximum of the similarity map corresponds to the offset v_i between D1 and D2, i.e. the offset between point P1i(x,y) and point P2i(x,y).
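The matching procedure of equation (10) can be sketched as follows. Here the complex conjugate is placed on the zero-padded template so that the correlation peak lands directly at the template's offset inside the search window; this mirrors equation (10) up to the sign convention of v:

```python
import numpy as np

def match_fft(D1, D2):
    """FFT-based correlation of template feature map D1 against search map D2,
    summed over feature channels; returns the (dx, dy) peak position."""
    sh, sw, nc = D2.shape
    th, tw, _ = D1.shape
    corr = np.zeros((sh, sw))
    for k in range(nc):
        t = np.zeros((sh, sw))
        t[:th, :tw] = D1[:, :, k]          # zero-pad template to search size
        # ifft2(conj(F(t)) * F(s)) is the circular cross-correlation of t and s
        corr += np.real(np.fft.ifft2(np.conj(np.fft.fft2(t)) * np.fft.fft2(D2[:, :, k])))
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    return dx, dy
```

The per-channel 2-D transforms used here are one way to realise the measure; as the text notes, F may equally be applied as a single three-dimensional transform over the whole feature stack.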

Denote the X- and Y-direction offsets of v_i obtained from the above measure as (Δx, Δy). The point with the same name corresponding to point P1i(x,y) is then P2i(x−Δx, y−Δy), denoted P*2i(x,y), and the obtained point pair with the same name is {P1i(x,y), P*2i(x,y)}.

步骤F,对以上的同名点对{P1i(x,y),P*2i(x,y)},利用二元二次多项式进行局部插值获取亚像素精度,具体包括以下步骤:Step F, for the above point pairs with the same name {P 1i (x, y), P* 2i (x, y)}, use binary quadratic polynomials to perform local interpolation to obtain sub-pixel accuracy, specifically including the following steps:

(1) Select a small local window of 3×3 pixels centred on point P*2i(x,y) and collect the similarity values of all pixels in the window; then, following the principle of least squares, fit a bivariate quadratic polynomial that establishes the correspondence between similarity values and pixel positions;

(2) Take the partial derivatives of the bivariate quadratic polynomial and solve for the position where the partial derivatives are 0, obtaining the point pair with the same name with sub-pixel accuracy {PD1i(x,y), PD2i(x,y)};
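Steps (1)–(2) amount to a least-squares fit of a bivariate quadratic to the 3×3 similarity values followed by solving for the zero of its gradient, which can be sketched as:

```python
import numpy as np

def subpixel_peak(sim3x3):
    """Fit s(x,y) = a + bx + cy + dx^2 + exy + fy^2 to a 3x3 neighbourhood of
    similarity values and return the stationary point relative to the centre."""
    ys, xs = np.mgrid[-1:2, -1:2]
    x, y, s = xs.ravel(), ys.ravel(), np.asarray(sim3x3, float).ravel()
    A = np.column_stack([np.ones(9), x, y, x * x, x * y, y * y])
    a0, b, c, d, e, f = np.linalg.lstsq(A, s, rcond=None)[0]
    # stationary point: ds/dx = b + 2dx + ey = 0, ds/dy = c + ex + 2fy = 0
    H = np.array([[2 * d, e], [e, 2 * f]])
    dx, dy = np.linalg.solve(H, [-b, -c])
    return dx, dy
```

The returned (dx, dy) is added to the integer peak position to give the sub-pixel matching location; for a true local maximum the fitted Hessian H is negative definite and the offset stays within the 3×3 window.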

Repeat steps C–F, traversing every point in P1i (i=1,2,3,…,N), to obtain point pairs with the same name with sub-pixel accuracy {PD1i(x,y), PD2i(x,y)} (i=1,2,3,…,N).

步骤G,剔除{PD1i(x,y),PD2i(x,y)}(i=1,2,3,….,N)中误差较大的同名点对,获取最终的同名点对,具体包括以下步骤:Step G: Eliminate {PD 1i (x, y), PD 2i (x, y)} (i=1,2,3,….,N) with the same name point pair with large error to obtain the final same name point pair , including the following steps:

(1)基于最小二乘的原理,利用{PD1i(x,y),PD2i(x,y)}(i=1,2,3,….,N)的坐标建立投影变换模型。(1) Based on the principle of least squares, use the coordinates of {PD 1i (x, y), PD 2i (x, y)} (i=1,2,3,...,N) to establish a projection transformation model.

(2)计算同名点对的均方根误差和残差,并剔除残差最大的同名点对。(2) Calculate the root mean square error and residual error of the point pairs with the same name, and remove the point pair with the same name with the largest residual error.

(3)重复以上两个步骤,直到均方根误差小于1.5个像素,得到最终的具有亚像素精度的同名点对{PID1i(x,y),PID2i(x,y)}(i=1,2,3,…,S)。(3) Repeat the above two steps until the root mean square error is less than 1.5 pixels, and obtain the final point pair {PID 1i (x, y), PID 2i (x, y)} (i= 1,2,3,...,S).
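The iterative elimination of steps (1)–(3) can be sketched as follows. An affine model is used here for brevity, whereas the text fits a projective transformation model in the same least-squares fashion:

```python
import numpy as np

def trim_matches(src, dst, rmse_thresh=1.5):
    """Iteratively fit a transform to matched point pairs, drop the pair with
    the largest residual, and stop once the RMSE falls below the threshold.
    Returns the indices of the retained pairs."""
    src, dst = np.asarray(src, float), np.asarray(dst, float)
    keep = np.arange(len(src))
    while len(keep) > 3:
        A = np.column_stack([src[keep], np.ones(len(keep))])   # N x 3
        coef, *_ = np.linalg.lstsq(A, dst[keep], rcond=None)   # 3 x 2 affine
        res = np.linalg.norm(A @ coef - dst[keep], axis=1)     # per-pair residual
        rmse = np.sqrt(np.mean(res ** 2))
        if rmse < rmse_thresh:
            break
        keep = np.delete(keep, np.argmax(res))                 # drop worst pair
    return keep
```

Refitting after every removal is important: a gross outlier biases the model, so residuals must be recomputed before the next elimination.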

步骤H,剔除{PD1i(x,y),PD2i(x,y)}(i=1,2,3,….,N)中误差较大的同名点对,获取最终的同名点对{PID1i(x,y),PID2i(x,y)}(i=1,2,3,…,S),并用于进行匹配。Step H, remove {PD 1i(x,y) ,PD 2i(x,y) }(i=1,2,3,….,N) with the same name point pair with large error to obtain the final same name point pair {PID 1i(x,y) , PID 2i(x,y) }(i=1,2,3,...,S), and used for matching.

在另一个实施例中,本发明还提供了一种稳健的多模态遥感影像匹配系统,该系统包括下列单元:In another embodiment, the present invention also provides a robust multimodal remote sensing image matching system, which includes the following units:

预处理单元,用于判断参考影像和输入影像的分辨率信息,如果两幅影像具有相同的分辨率,则执行后续单元,如果分辨率不同,则将两幅影像采样为同样的分辨率;The preprocessing unit is used to judge the resolution information of the reference image and the input image. If the two images have the same resolution, execute the subsequent unit. If the resolutions are different, sample the two images to the same resolution;

a template area selection unit for detecting, using a block strategy, a series of evenly distributed feature points on the reference image, denoted P1i (i=1,2,3,…,N), and selecting a template area AreaW1i centred on each point P1i;

匹配区域选取单元,用于根据遥感影像自身提供的地理坐标信息,预测点集P1i(i=1,2,3,…,N)在输入影像上所对应的匹配区域AreaW2iThe matching area selection unit is used to predict the matching area AreaW 2i corresponding to the point set P 1i (i=1,2,3,...,N) on the input image according to the geographical coordinate information provided by the remote sensing image itself;

a feature extraction unit for constructing a three-dimensional dense feature representation map within the matching area;

初匹配单元,用于在三维稠密特征表达图的基础上,建立相似性测度进行同名点匹配;对于获得的同名点,对其相似性图进行局部极值拟合,求解出匹配点的亚像素位置;重复上述单元的操作,遍历P1i(i=1,2,3,…,N)的每一个点,得到具有亚像素精度的同名点对{PD1i(x,y),PD2i(x,y)}(i=1,2,3,…,N);The initial matching unit is used to establish a similarity measure on the basis of the three-dimensional dense feature expression map to match the points with the same name; for the obtained points with the same name, perform local extremum fitting on the similarity map to solve the sub-pixels of the matching points Position; repeat the operation of the above unit, traverse each point of P 1i (i=1,2,3,...,N), and obtain the same-named point pair {PD 1i(x,y) with sub-pixel precision, PD 2i( x,y) }(i=1,2,3,...,N);

匹配点筛选单元,用于剔除{PD1i(x,y),PD2i(x,y)}(i=1,2,3,…,N)中误差较大的同名点对,获取最终的同名点对{PID1i(x,y),PID2i(x,y)}(i=1,2,3,…,S)。The matching point screening unit is used to eliminate the same-name point pairs with large errors in {PD 1i(x,y) , PD 2i(x,y) }(i=1,2,3,...,N) to obtain the final Point pairs with the same name {PID 1i(x,y) , PID 2i(x,y) } (i=1, 2, 3, . . . , S).

进一步的,所述特征提取单元用于对匹配区域内的影像数据进行密集格网采样,计算每个格网点的局部特征描述符,形成三维稠密特征表达图。Further, the feature extraction unit is used to perform dense grid sampling on the image data in the matching area, calculate the local feature descriptor of each grid point, and form a three-dimensional dense feature expression map.

Further, the initial matching unit establishes a similarity measure on the basis of the three-dimensional dense feature representation, performs a correlation operation to obtain a similarity map, and takes the position of the extremum of the similarity map as the matching position of the images.

The above describes specific embodiments of the present invention. With the matching method constructed in the present invention, various local feature descriptors (such as HOG, LSS, SIFT or SURF) can be used for effective three-dimensional dense feature representation, which effectively reflects the structural and shape features shared between images; on this basis a similarity measure for image matching is established that is computationally more efficient than intensity-based similarity measures such as the normalized correlation coefficient and mutual information. In addition, the channel feature of orientated gradients (CFOG) constructed in the present invention is a novel three-dimensional dense feature representation that outperforms dense feature representations based on descriptors such as HOG, LSS, SIFT and SURF in both matching efficiency and accuracy. The technical solution of the present invention compensates for the sensitivity of traditional matching methods to nonlinear radiometric differences between multimodal images, and can effectively solve the matching problem of multimodal remote sensing data such as visible light, infrared, LiDAR, synthetic aperture radar and maps.

The technical solution proposed by the present invention is a general technical method: various local feature descriptors (not limited to CFOG, HOG, LSS, SIFT and SURF) can be integrated for three-dimensional dense feature representation, and various similarity measures (not limited to the sum of squared differences, Euclidean distance, normalized cross-correlation, phase correlation, and the fast-Fourier-transform-based similarity measure constructed in the present invention) can be applied on the feature representation map for image matching.

本发明并不局限于前述的具体实施方式。本发明扩展到任何在本说明书中披露的新特征或任何新的组合,以及披露的任一新的方法或过程的步骤或任何新的组合。The present invention is not limited to the foregoing specific embodiments. The present invention extends to any new feature or any new combination disclosed in this specification, and any new method or process step or any new combination disclosed.

Claims (10)

1. A robust multimodal remote sensing image matching method, characterized by comprising the following steps:
A. Determine the resolution information of the reference image and the input image; if the two images have the same resolution, proceed to step B; if the resolutions differ, resample the two images to the same resolution;
B. Using a block strategy, detect a series of evenly distributed feature points on the reference image, denoted P1i (i=1,2,3,…,N), and select a template area AreaW1i centred on each point P1i;
C. According to the geographic coordinate information provided by the remote sensing images themselves, predict the matching area AreaW2i corresponding to the point set P1i (i=1,2,3,…,N) on the input image;
D. Construct a three-dimensional dense feature representation map within the matching area;
E. On the basis of the pixel-by-pixel feature representation map, establish a similarity measure to match points with the same name;
F. For each obtained point with the same name, perform local extremum fitting on its similarity map to solve the sub-pixel position of the matching point;
G. Repeat steps C–F, traversing every point of P1i (i=1,2,3,…,N), to obtain point pairs with the same name with sub-pixel accuracy {PD1i(x,y), PD2i(x,y)} (i=1,2,3,…,N);
H. Eliminate point pairs with large errors from {PD1i(x,y), PD2i(x,y)} (i=1,2,3,…,N) to obtain the final point pairs with the same name {PID1i(x,y), PID2i(x,y)} (i=1,2,3,…,S).

2. The multimodal remote sensing image matching method of claim 1, characterized in that step D comprises the following steps: perform dense grid sampling on the image data within the matching area, compute a local feature descriptor for each grid point, and arrange the corresponding feature vectors along the Z direction to form a three-dimensional dense feature representation map; when the grid sampling interval is 1 pixel, i.e. pixel-by-pixel sampling, the resulting three-dimensional dense feature representation map is a three-dimensional pixel-by-pixel feature representation map.

3. The multimodal remote sensing image matching method of claim 2, characterized in that the local feature descriptor is one of HOG, LSS, SURF or Scale Invariant Feature Transform (SIFT).

4. The multimodal remote sensing image matching method of claim 1, characterized in that step D constructs channel features of orientated gradients (CFOG) within the matching area, specifically comprising the following steps:
D1. Perform dense grid sampling on the image data within the matching area and compute the gradient information of each grid point in every direction to form a three-dimensional orientation-gradient map;
D2. Convolve the three-dimensional orientation-gradient map with a Gaussian filter or a triangular filter in the horizontal X and vertical Y directions to obtain an intermediate feature map, then convolve that feature map along the Z direction with the one-dimensional filter [1,2,1];
D3. Normalize the resulting feature map to obtain the final channel feature map of orientated gradients.

5. The multimodal remote sensing image matching method of claim 4, characterized in that constructing the channel features of orientated gradients (CFOG) in step D specifically comprises the following computation steps:
for all pixels in the area, compute their gradients gx and gy in the horizontal (X) direction and the vertical (Y) direction using the one-dimensional filters [-1,0,1] and [-1,0,1]T respectively;
use gx and gy to compute the gradient value gθ in each direction:
gθ = ⌊abs(gx·cosθ + gy·sinθ)⌋   (1)
where θ denotes the quantized gradient direction, abs denotes the absolute value, and the symbol ⌊·⌋ keeps its argument when it is positive and returns 0 otherwise;
stack the gθ of all directions together to form the three-dimensional orientation-gradient map go, then convolve go in the X and Y directions with a two-dimensional Gaussian filter of standard deviation σ to obtain an intermediate feature map, and convolve that feature map along the Z direction with the one-dimensional filter [1,2,1];
each pixel of the resulting feature map corresponds to a feature vector fi along the Z direction; traverse every pixel and normalize its feature vector fi to obtain the final channel feature map of orientated gradients (CFOG), the normalization being computed as:
fi ← fi / sqrt(||fi||² + ε)   (2)
where ε is a number that avoids division by zero.

6. The multimodal remote sensing image matching method of claim 1, characterized in that step E, on the basis of the three-dimensional dense feature representation map, performs matching of points with the same name using one of the sum of squared differences, Euclidean distance, normalized cross-correlation and phase correlation as the similarity measure, or using a fast Fourier-transform-based similarity measure.

7. The multimodal remote sensing image matching method of claim 6, characterized in that step E specifically comprises the following computation steps:
after step D, the dense three-dimensional feature representation maps D1 and D2 of AreaW1i and AreaW2i are obtained respectively; D1 is slid over D2 as a template, and matching is performed using the similarity between D1 and D2;
the sum of squared differences is computed as:
S_i^1(v) = Σ_c [D1(c) − D2(c − v)]²   (3)
the Euclidean distance is computed as:
S_i^2(v) = sqrt( Σ_c [D1(c) − D2(c − v)]² )   (4)
the normalized cross-correlation is computed as:
S_i^3(v) = Σ_c (D1(c) − D̄1)(D2(c − v) − D̄2(c − v)) / sqrt( Σ_c (D1(c) − D̄1)² · Σ_c (D2(c − v) − D̄2(c − v))² )   (5)
the phase correlation is computed as:
S_i^4(v) = F⁻¹{ F(D1(c))·F*(D2(c − v)) / |F(D1(c))·F*(D2(c − v))| }   (6)
in equations (3)–(6), c denotes coordinates in the three-dimensional dense feature representation map, v the offset between D1 and D2, S_i the similarity measure between D1 and D2, F the forward fast Fourier transform and F* the complex conjugate of F; when the similarity measure is the sum of squared differences or the Euclidean distance, the offset v_i between D1 and D2 is obtained where S_i attains its minimum; when the similarity measure is the normalized cross-correlation or the phase correlation, the offset v_i is obtained where S_i attains its maximum;
the fast Fourier-transform-based similarity measure obtains the matching position v_i when the sum of squared differences is minimal:
v_i = argmin_v { Σ_c [D1(c) − D2(c − v)]² }   (7)
expanding equation (7) gives:
v_i = argmin_v { Σ_c D1²(c) + Σ_c D2²(c − v) − 2 Σ_c D1(c)·D2(c − v) }   (8)
in equation (8), the values of the first two terms are approximately constant, so equation (7) attains its minimum when the third term attains its maximum; the similarity function can therefore be redefined as:
v_i = argmax_v { Σ_c D1(c)·D2(c − v) }   (9)
where the sum in equation (9) is a convolution operation;
since a point-wise multiplication in the frequency domain is equivalent to a convolution in the spatial domain, the fast Fourier transform is used in the frequency domain to improve computational efficiency, giving the Fourier-transform-based similarity measure:
v_i = argmax_v { F⁻¹[ F(D1(c))·F*(D2(c − v)) ] }   (10)
where F and F⁻¹ denote the forward and inverse fast Fourier transforms respectively and F* the complex conjugate of F, F being a two- or three-dimensional Fourier transform.

8. A robust multimodal remote sensing image matching system, characterized by comprising the following units:
a preprocessing unit for determining the resolution information of the reference image and the input image; if the two images have the same resolution, the subsequent units are executed; if the resolutions differ, the two images are resampled to the same resolution;
If the resolutions are different, sample the two images to the same resolution; 模板区域选取单元,用于采用分块的策略,在参考影像上检测出一系列分布均匀的特征点,记为P1i(i=1,2,3,…,N),以点P1i为中心选取模板区域AreaW1iThe template area selection unit is used to detect a series of evenly distributed feature points on the reference image by adopting the block strategy, which is denoted as P 1i (i=1,2,3,...,N), and the point P 1i is Select the template area AreaW 1i in the center; 匹配区域选取单元,用于根据遥感影像自身提供的地理坐标信息,预测点集P1i(i=1,2,3,…,N)在输入影像上所对应的匹配区域AreaW2iThe matching area selection unit is used to predict the matching area AreaW 2i corresponding to the point set P 1i (i=1,2,3,...,N) on the input image according to the geographical coordinate information provided by the remote sensing image itself; 特征提取单元,用于在匹配区域内构建三维稠密特征表达图;A feature extraction unit is used to construct a three-dimensional dense feature expression map in the matching area; 初匹配单元,用于在三维稠密特征表达图的基础上,建立相似测度进行同名点匹配;对于获得的同名点,对其相似性图进行局部极值拟合,求解出匹配点的亚像素位置;重复上述单元的操作,遍历P1i(i=1,2,3,…,N)的每一个点,得到具有亚像素精度的同名点对{PD1i(x,y),PD2i(x,y)}(i=1,2,3,…,N);The initial matching unit is used to establish a similarity measure to match points with the same name on the basis of the three-dimensional dense feature expression map; for the obtained points with the same name, perform local extremum fitting on the similarity map to solve the sub-pixel position of the matching point ;Repeat the operation of the above unit, traverse each point of P 1i (i=1,2,3,...,N), and obtain the same-name point pair {PD 1i(x,y) with sub-pixel precision, PD 2i(x ,y) }(i=1,2,3,...,N); 匹配点筛选单元,用于剔除{PD1i(x,y),PD2i(x,y)}(i=1,2,3,…,N)中误差较大的同名点对,获取最终的同名点对{PID1i(x,y),PID2i(x,y)}(i=1,2,3,…,S)。The matching point screening unit is used to eliminate the same-name point pairs with large errors in {PD 1i(x,y) , PD 2i(x,y) }(i=1,2,3,...,N) to obtain the final Point pairs with the same name {PID 1i(x,y) , PID 2i(x,y) } (i=1, 2, 3, . . . , S). 9.如权利要求8所述的多模态遥感影像匹配系统,其特征在于,所述特征提取单元用于对匹配区域内的影像数据进行密集的格网采样,计算每个格网点的局部特征描述符,形成三维的稠密特征表达图。9. 
The multimodal remote sensing image matching system as claimed in claim 8, wherein the feature extraction unit is used to perform dense grid sampling on the image data in the matching area, and calculate the local features of each grid point Descriptors form a three-dimensional dense feature representation map. 10.如权利要求8所述的多模态遥感影像匹配方法,其特征在于,所述初匹配单元是在三维稠密的特征表达图基础上,建立影像匹配的相似测度进行运算获得相似性图,取相似性图极值位置为影像的匹配位置。10. The multimodal remote sensing image matching method according to claim 8, wherein the initial matching unit is based on a three-dimensional dense feature expression map, and establishes a similarity measure for image matching to obtain a similarity map, Take the extreme value position of the similarity map as the matching position of the image.
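The derivation in formulas (7)–(10) — replacing the SSD minimum by a correlation maximum and evaluating that correlation in the frequency domain — can be sketched in NumPy. This is a minimal illustration under stated assumptions, not the patented implementation: the helper names `correlation_surface`, `fft_match_offset`, and `subpixel_refine` are hypothetical, the correlation is circular (which assumes the template and matching region have the same size), and summing per-channel correlation surfaces for a multi-channel dense feature map is one plausible reading of applying a "two- or three-dimensional" FFT.

```python
import numpy as np

def correlation_surface(d1, d2):
    """Correlation surface of Eq. (10): F^-1[ F(D1) . conj(F(D2)) ].

    d1, d2: feature maps of identical shape, (H, W) or (H, W, C).
    For (H, W, C) dense feature maps, the per-channel correlation
    surfaces are summed (an assumption for the 3-D case).
    """
    d1 = np.atleast_3d(np.asarray(d1, dtype=float))
    d2 = np.atleast_3d(np.asarray(d2, dtype=float))
    corr = np.zeros(d1.shape[:2])
    for ch in range(d1.shape[2]):
        f1 = np.fft.fft2(d1[:, :, ch])
        f2 = np.fft.fft2(d2[:, :, ch])
        corr += np.fft.ifft2(f1 * np.conj(f2)).real
    return corr

def fft_match_offset(d1, d2):
    """Signed circular shift v = (dy, dx) maximizing Eq. (9)/(10),
    i.e. np.roll(d2, v, axis=(0, 1)) best aligns with d1."""
    corr = correlation_surface(d1, d2)
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # array indices live in [0, n); fold the upper half into negative shifts
    return tuple(int(i - n) if i > n // 2 else int(i)
                 for i, n in zip(peak, corr.shape))

def subpixel_refine(corr, peak):
    """1-D quadratic (parabola) fit around the peak along each axis,
    mirroring the local extremum fitting used for sub-pixel positions."""
    y1 = corr[peak]
    offsets = []
    for ax in range(2):
        lo, hi = list(peak), list(peak)
        lo[ax] = (peak[ax] - 1) % corr.shape[ax]   # circular neighbors
        hi[ax] = (peak[ax] + 1) % corr.shape[ax]
        y0, y2 = corr[tuple(lo)], corr[tuple(hi)]
        denom = y0 - 2.0 * y1 + y2
        offsets.append(0.0 if denom == 0 else 0.5 * (y0 - y2) / denom)
    return offsets
```

Because the first two terms of formula (8) are nearly constant, the argmax of this correlation surface coincides with the argmin of the SSD in formula (7), while the FFT evaluation reduces the cost of scanning all offsets of an N×N window from roughly O(N^4) to O(N^2 log N).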
CN201711078617.5A 2017-11-06 2017-11-06 A Robust Multimodal Remote Sensing Image Matching Method and System Active CN107909018B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201711078617.5A CN107909018B (en) 2017-11-06 2017-11-06 A Robust Multimodal Remote Sensing Image Matching Method and System


Publications (2)

Publication Number Publication Date
CN107909018A true CN107909018A (en) 2018-04-13
CN107909018B CN107909018B (en) 2019-12-06

Family

ID=61842576

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201711078617.5A Active CN107909018B (en) 2017-11-06 2017-11-06 A Robust Multimodal Remote Sensing Image Matching Method and System

Country Status (1)

Country Link
CN (1) CN107909018B (en)


Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103020945A (en) * 2011-09-21 2013-04-03 中国科学院电子学研究所 Remote sensing image registration method of multi-source sensor
CN103017739A (en) * 2012-11-20 2013-04-03 武汉大学 Manufacturing method of true digital ortho map (TDOM) based on light detection and ranging (LiDAR) point cloud and aerial image
CN103489191A (en) * 2013-09-24 2014-01-01 中国科学院自动化研究所 Method for detecting changes of remarkable target of remote sensing image
CN104021556A (en) * 2014-06-13 2014-09-03 西南交通大学 Heterological remote-sensing image registration method based on geometric structure similarity
CN105261014A (en) * 2015-09-30 2016-01-20 西南交通大学 Multi-sensor remote sensing image matching method
CN105427298A (en) * 2015-11-12 2016-03-23 西安电子科技大学 Remote sensing image registration method based on anisotropic gradient dimension space


Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2019042232A1 (en) * 2017-08-31 2019-03-07 西南交通大学 Fast and robust multimodal remote sensing image matching method and system
US11244197B2 (en) 2017-08-31 2022-02-08 Southwest Jiaotong University Fast and robust multimodal remote sensing image matching method and system
CN109598750A (en) * 2018-12-07 2019-04-09 中国地质大学(武汉) One kind being based on the pyramidal large scale differential image characteristic point matching method of deformation space
CN109598750B (en) * 2018-12-07 2023-05-23 中国地质大学(武汉) Large-scale difference image feature point matching method based on deformation space pyramid
CN110288034A * 2019-06-28 2019-09-27 广州虎牙科技有限公司 Image matching method and apparatus, electronic device, and readable storage medium
CN113034556A (en) * 2021-03-19 2021-06-25 南京天巡遥感技术研究院有限公司 Frequency domain related semi-dense remote sensing image matching method
CN113034556B (en) * 2021-03-19 2024-04-16 南京天巡遥感技术研究院有限公司 Frequency domain correlation semi-dense remote sensing image matching method
CN113514812A (en) * 2021-09-14 2021-10-19 北京海兰信数据科技股份有限公司 Clutter suppression processing method and system for shore-based radar
CN113514812B (en) * 2021-09-14 2021-12-14 北京海兰信数据科技股份有限公司 Clutter suppression processing method and system for shore-based radar
CN117761695A * 2024-02-22 2024-03-26 中国科学院空天信息创新研究院 Multi-angle SAR three-dimensional imaging method based on self-adaptive partition SIFT
CN117761695B (en) * 2024-02-22 2024-04-30 中国科学院空天信息创新研究院 Multi-angle SAR 3D imaging method based on adaptive partitioned SIFT

Also Published As

Publication number Publication date
CN107909018B (en) 2019-12-06

Similar Documents

Publication Publication Date Title
CN107563438B (en) A Fast and Robust Multimodal Remote Sensing Image Matching Method and System
Ye et al. A robust multimodal remote sensing image registration method and system using steerable filters with first- and second-order gradients
CN111028277B (en) SAR and optical remote sensing image registration method based on pseudo-twin convolution neural network
CN107909018B (en) A Robust Multimodal Remote Sensing Image Matching Method and System
WO2022188094A1 (en) Point cloud matching method and apparatus, navigation method and device, positioning method, and laser radar
Zeng et al. 3dmatch: Learning local geometric descriptors from rgb-d reconstructions
Zhou et al. Canny-vo: Visual odometry with rgb-d cameras based on geometric 3-d–2-d edge alignment
CN104574421B (en) A method and device for high-precision multi-spectral image registration in large format and small overlapping areas
US9761002B2 (en) Stereo-motion method of three-dimensional (3-D) structure information extraction from a video for fusion with 3-D point cloud data
Zhang et al. Semi-automatic road tracking by template matching and distance transformation in urban areas
Misra et al. Feature based remote sensing image registration techniques: A comprehensive and comparative review
CN104867126A (en) Method for registering synthetic aperture radar image with change area based on point pair constraint and Delaunay
CN104200461A (en) Mutual information image selected block and sift (scale-invariant feature transform) characteristic based remote sensing image registration method
CN104077782A (en) Satellite-borne remote sense image matching method
CN108230375A (en) Visible images and SAR image registration method based on structural similarity fast robust
CN106296587A (en) The joining method of tire-mold image
Gao et al. Multi-scale PIIFD for registration of multi-source remote sensing images
Choi et al. Regression with residual neural network for vanishing point detection
Ye et al. A coarse-to-fine visual geo-localization method for GNSS-denied UAV with oblique-view imagery
Li Stereo vision and Lidar based dynamic occupancy grid mapping: Application to scenes analysis for intelligent vehicles
CN106529548A (en) Sub-pixel level multi-scale Harris corner detection algorithm
Serati et al. Digital surface model generation from high-resolution satellite stereo imagery based on structural similarity
Bae et al. Fast and scalable 3D cyber-physical modeling for high-precision mobile augmented reality systems
Yang et al. Object detection and localization algorithm in agricultural scenes based on YOLOv5
CN106355576A (en) SAR image registration method based on MRF image segmentation algorithm

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant