
CN114677518A - An image feature point detection method, system and computer storage medium


Info

Publication number
CN114677518A
CN114677518A (application CN202210417636.0A)
Authority
CN
China
Prior art keywords
image
points
processing
feature
feature point
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210417636.0A
Other languages
Chinese (zh)
Inventor
段程鹏
胡炳樑
刘伟
宋洁
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Xi'an Zhongkelide Infrared Technology Co ltd
Original Assignee
Xi'an Zhongkelide Infrared Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Xi'an Zhongkelide Infrared Technology Co ltd filed Critical Xi'an Zhongkelide Infrared Technology Co ltd
Priority to CN202210417636.0A
Publication of CN114677518A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 5/00 Image enhancement or restoration
    • G06T 5/50 Image enhancement or restoration using two or more images, e.g. averaging or subtraction
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 5/00 Image enhancement or restoration
    • G06T 5/70 Denoising; Smoothing
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/20 Special algorithmic details
    • G06T 2207/20024 Filtering details
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/20 Special algorithmic details
    • G06T 2207/20068 Projection on vertical or horizontal image axis

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Image Processing (AREA)

Abstract

The invention discloses an image feature point detection method, system and computer storage medium. The method comprises: performing image enhancement on an image to be detected to obtain an enhanced image; applying guided filtering to the enhanced image to obtain a filtered image; and performing SURF feature point detection on the filtered image to obtain target feature points. For feature point detection in low signal-to-noise-ratio images, the method effectively raises the signal-to-noise ratio while suppressing image noise, detects feature points quickly and accurately, runs in real time, and is well suited to engineering implementation.

Description

An image feature point detection method, system and computer storage medium

Technical Field

The present invention relates to the technical field of image processing, and in particular to an image feature point detection method, system and computer storage medium.

Background

In long-range infrared and visible-light imaging registration and fusion systems, the target occupies only a small portion of the image and its energy is weak, so the detection accuracy of target feature points in the image is low.

Summary of the Invention

Embodiments of the present invention provide an image feature point detection method, system and computer storage medium to address the low feature point detection accuracy caused in the prior art by the target's small proportion of the image and weak energy.

In one aspect, an embodiment of the present invention provides an image feature point detection method, comprising:

performing image enhancement processing on an image to be detected to obtain an enhanced image;

performing guided filtering processing on the enhanced image to obtain a filtered image;

performing SURF feature point detection on the filtered image to obtain target feature points.

In another aspect, an embodiment of the present invention provides an image feature point detection system, comprising:

an image enhancement module, configured to perform image enhancement processing on an image to be detected to obtain an enhanced image;

an image filtering module, configured to perform guided filtering processing on the enhanced image to obtain a filtered image;

a feature point detection module, configured to perform SURF feature point detection on the filtered image to obtain target feature points.

In another aspect, an embodiment of the present invention provides a computer storage medium storing a plurality of computer instructions, the computer instructions being used to cause a computer to execute the above method.

The image feature point detection method, system and computer storage medium of the present invention have the following advantages:

For feature point detection in low signal-to-noise-ratio images, the method effectively improves the signal-to-noise ratio while suppressing image noise, detects feature points quickly and accurately, runs in real time, and is well suited to engineering implementation.

Brief Description of the Drawings

To explain the embodiments of the present invention or the technical solutions in the prior art more clearly, the drawings required for describing the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present invention; those of ordinary skill in the art can obtain other drawings from these drawings without creative effort.

FIG. 1 is a flowchart of an image feature point detection method according to an embodiment of the present invention;

FIG. 2 is a schematic diagram of determining the main direction of a feature point according to an embodiment of the present invention;

FIG. 3 is a schematic diagram of determining Haar wavelet features according to an embodiment of the present invention.

Detailed Description

The technical solutions in the embodiments of the present invention are described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained by those of ordinary skill in the art based on the embodiments of the present invention without creative effort fall within the protection scope of the present invention.

FIG. 1 is a flowchart of an image feature point detection method according to an embodiment of the present invention. The embodiment provides an image feature point detection method, comprising:

S100: performing image enhancement processing on the image to be detected to obtain an enhanced image.

Exemplarily, S100 specifically comprises: S101, filtering the image to be detected to obtain a low-frequency image; S102, subtracting the low-frequency image from the image to be detected to obtain a high-frequency image; S103, performing adaptive hybrid dimming processing on the low-frequency image to obtain a dimmed image; and S104, superimposing the dimmed image and the high-frequency image to obtain the enhanced image.

In S101, a mean filter may be used to filter the image to be detected to obtain the low-frequency image. S103 may specifically comprise: S105, performing histogram statistics on the low-frequency image to obtain a corresponding histogram mapping output; S106, performing linear dimming processing on the low-frequency image to obtain a corresponding dimming output; and S107, computing a weighted sum of the histogram mapping output and the dimming output to obtain the dimmed image.

Specifically, in S105, let the gray levels of the image range from 0 to L, let r_k denote the k-th gray level and n(r_k) the number of pixels with gray level r_k. Traversing the whole image gives the counts of all gray levels, i.e., the histogram statistics of the image. The cumulative histogram is then computed from the histogram statistics as

c(r_k) = Σ_{j=l}^{k} n(r_j)    (1)

where n(r_j) is the pixel count of gray level r_j, and r_l and r_h denote the smallest and the largest gray level present in the image, respectively.

The histogram mapping output is computed from the cumulative histogram obtained with formula (1) as

f_hist(x, y) = L · c(r(x, y)) / c(r_h)    (2)

where c(·) is the cumulative histogram of formula (1) and r(x, y) is the gray level of the pixel at image coordinates (x, y).
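As an illustration of S105, the mapping of formulas (1) and (2) can be sketched in a few lines of NumPy; the function name histogram_mapping and the assumption that the low-frequency image is an 8-bit integer array are illustrative choices, not part of the patent.

    import numpy as np

    def histogram_mapping(low_freq, levels=256):
        """Sketch of S105: histogram statistics and cumulative-histogram mapping."""
        hist = np.bincount(low_freq.ravel(), minlength=levels)   # n(r_k): pixel count per gray level
        cdf = np.cumsum(hist).astype(np.float64)                 # c(r_k) as in formula (1)
        cdf /= cdf[-1]                                           # normalize by the count at r_h
        mapped = (levels - 1) * cdf[low_freq]                    # mapping output as in formula (2)
        return mapped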

In S106, the linear dimming output is computed as

f_lin(x, y) = (h_max - h_min) · (r(x, y) - d_min) / (d_max - d_min) + h_min    (3)

where d_min and d_max are the minimum and maximum values of the detail-layer image, h_min and h_max are the minimum and maximum values of the histogram mapping output, and r(x, y) is the gray value of the current pixel at image coordinates (x, y).

In S107, the final weighted-sum output is computed as

f_dim(x, y) = w1 · f_lin(x, y) + w2 · f_hist(x, y)    (4)

where w1 and w2 are the weights of the linear dimming output and of the histogram-corrected output, respectively, with w1 + w2 = 1.

In S104, the dimmed low-frequency image is superimposed on the high-frequency image, with dynamically configurable weight coefficients for the low-frequency and high-frequency components, which yields the image with enhanced signal-to-noise ratio, i.e., the enhanced image:

f_enh(x, y) = α · f_dim(x, y) + β · f_high(x, y)    (5)

where α and β are the configurable weights and f_high is the high-frequency image obtained in S102.
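For concreteness, steps S100 through S104 can be sketched in Python with OpenCV and NumPy as follows; the weights w_lin, w_hist, alpha and beta, the 9x9 mean-filter window, and the use of the low-frequency layer's own minimum and maximum in the linear dimming step are assumptions made for illustration, not values prescribed by the patent. histogram_mapping is the sketch given after formula (2).

    import cv2
    import numpy as np

    def enhance(image, ksize=9, w_lin=0.5, w_hist=0.5, alpha=1.0, beta=1.5):
        """Sketch of S101-S104: layer split, hybrid dimming, weighted recombination."""
        img = image.astype(np.float32)
        low = cv2.blur(img, (ksize, ksize))                 # S101: mean-filtered low-frequency layer
        high = img - low                                    # S102: high-frequency (detail) layer

        low_u8 = np.clip(low, 0, 255).astype(np.uint8)
        hist_out = histogram_mapping(low_u8)                # S105: histogram mapping output, formula (2)
        h_lo, h_hi = hist_out.min(), hist_out.max()
        d_lo, d_hi = low.min(), low.max()
        lin_out = (h_hi - h_lo) * (low - d_lo) / max(d_hi - d_lo, 1e-6) + h_lo   # S106, formula (3)
        dimmed = w_lin * lin_out + w_hist * hist_out        # S107: formula (4), w_lin + w_hist = 1

        enhanced = alpha * dimmed + beta * high             # S104: formula (5), configurable weights
        return np.clip(enhanced, 0, 255).astype(np.uint8)

Calling enhance on a raw low-signal-to-noise frame produces the enhanced image that is then passed to the guided-filtering stage S110.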

S110: performing guided filtering processing on the enhanced image to obtain a filtered image.

Exemplarily, guided filtering is an adaptive-weight filtering method that smooths the image while preserving edges. As a linear filter, the guided filter can be defined simply as

Q_i = Σ_j W_ij(I) · P_j    (6)

where I is the guide image, P is the input image to be filtered, Q is the filtered output image, and W is the weight determined from the guide image I.

The weight W can be expressed as

W_ij(I) = (1 / |ω|²) · Σ_{k: (i,j)∈ω_k} [ 1 + (I_i - μ_k)(I_j - μ_k) / (σ_k² + ε) ]    (7)

where μ_k is the mean of the pixels in window ω_k, I_i and I_j are the values of two neighboring pixels, σ_k² is the variance of the pixels in the window, and ε is a penalty value. The behavior of the weight follows from this expression: when I_i and I_j lie on opposite sides of an edge, (I_i - μ_k) and (I_j - μ_k) have opposite signs; otherwise they have the same sign. The weight obtained for opposite signs is far smaller than the weight for equal signs, so pixels in flat regions receive large weights and are smoothed strongly, while pixels on the two sides of an edge receive small weights and are smoothed only weakly, which preserves the edges.

The penalty value ε also has a large influence on the filtering result: when ε is small, the filter behaves as described above; when ε is large, the weight expression approaches that of a mean filter and the smoothing effect becomes more pronounced.

The adaptive-weight behavior of guided filtering can also be understood from its local linear model, formulated as

Q_i = a_k · I_i + b_k,  for all i ∈ ω_k    (8)

a_k = [ (1/|ω|) · Σ_{i∈ω_k} I_i · P_i - μ_k · μ_P,k ] / (σ_k² + ε)    (9)

b_k = μ_P,k - a_k · μ_k    (10)

where the two coefficients a_k and b_k are determined jointly by the guide image I and the input image P, μ_k and μ_P,k are the means of I and of P in window ω_k, respectively. The values of a and b determine the relative weight given to gradient information and to smoothing.

Inspecting the formulas for a and b: the numerator of a is the covariance of I and P in the window, and its denominator is the variance of I plus the penalty value ε; b is the mean of P minus a times the mean of I. When a is very small, b is approximately the mean of the pixels in the window and the output is close to mean filtering; when a is large, the output is dominated by the a · I term and the gradient information is preserved.

The guided-filter computation proceeds as follows. First, the mean images of the guide image I and of the input image P are computed:

mean_I = f_mean(I)    (11)

mean_P = f_mean(P)    (12)

where f_mean denotes the window-mean (box) filter.

Next, the mean images of I·I and I·P are computed:

corr_I = f_mean(I · I)    (13)

corr_IP = f_mean(I · P)    (14)

Then the variance image of I and the covariance image of I and P are computed:

var_I = corr_I - mean_I · mean_I    (15)

cov_IP = corr_IP - mean_I · mean_P    (16)

From the above expressions for a and b, the coefficient images are obtained as

a = cov_IP / (var_I + ε)    (17)

b = mean_P - a · mean_I    (18)

Finally, a and b are averaged over the windows and the averaged coefficients are combined to give the final result:

mean_a = f_mean(a)    (19)

mean_b = f_mean(b)    (20)

Q = mean_a · I + mean_b    (21)
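The computation of formulas (11) through (21) maps directly onto box filtering; the sketch below is a minimal NumPy/OpenCV rendering of those steps, with the window radius and the penalty value eps chosen arbitrarily for illustration.

    import cv2
    import numpy as np

    def guided_filter(I, P, radius=8, eps=1e-2):
        """Sketch of formulas (11)-(21): guided filtering of P with guide image I."""
        I = I.astype(np.float32) / 255.0
        P = P.astype(np.float32) / 255.0
        ksize = (2 * radius + 1, 2 * radius + 1)
        mean = lambda x: cv2.blur(x, ksize)                 # window mean via box filter

        mean_I, mean_P = mean(I), mean(P)                   # formulas (11), (12)
        corr_I, corr_IP = mean(I * I), mean(I * P)          # formulas (13), (14)
        var_I = corr_I - mean_I * mean_I                    # formula (15)
        cov_IP = corr_IP - mean_I * mean_P                  # formula (16)

        a = cov_IP / (var_I + eps)                          # formula (17)
        b = mean_P - a * mean_I                             # formula (18)

        mean_a, mean_b = mean(a), mean(b)                   # formulas (19), (20)
        return mean_a * I + mean_b                          # formula (21): filtered output Q

When the enhanced image itself is used as the guide (I = P), the filter acts as the edge-preserving smoother described above, which is one way S110 can be realized.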

S120: performing SURF feature point detection on the filtered image to obtain target feature points.

Exemplarily, S120 specifically comprises: S121, constructing a Hessian matrix and processing the filtered image with the Hessian matrix to obtain the corresponding pixel responses and the two-dimensional image space; S122, constructing a scale space; S123, comparing each pixel obtained from the Hessian-matrix processing with the points in its two-dimensional image-space and scale-space neighborhood to determine feature points; S124, determining the main direction of each feature point; S125, determining the corresponding feature point descriptor according to the main direction; and S126, matching the feature points according to the feature point descriptors and discarding unmatched feature points.
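In practice the S120 pipeline corresponds closely to the SURF implementation shipped in the OpenCV contrib modules; the snippet below is a usage sketch rather than the patent's own implementation, and it assumes an opencv-contrib build compiled with the non-free algorithms enabled (the hessianThreshold value is an arbitrary choice).

    import cv2

    surf = cv2.xfeatures2d.SURF_create(hessianThreshold=400)              # Fast-Hessian detector, 64-dim descriptors
    keypoints, descriptors = surf.detectAndCompute(filtered_image, None)  # filtered_image: output of S110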

In S121, constructing the Hessian matrix produces stable edge points. Before the Hessian matrix is constructed, the guided-filtered image first needs to be Gaussian filtered; after Gaussian filtering, the Hessian matrix is expressed as

H(x, σ) = [ L_xx(x, σ)  L_xy(x, σ) ; L_xy(x, σ)  L_yy(x, σ) ]    (22)

where L_xx, L_xy and L_yy are the second-order derivatives of the smoothed image at point x and scale σ.

When the discriminant of the Hessian matrix reaches a local maximum, the current point is judged to be brighter or darker than the other points in its neighborhood, which locates the position of a key point. At the same time, to increase the processing speed, the present invention uses the guided filter as an approximate substitute for the Gaussian filter. The discriminant of the Hessian matrix is expressed as

det(H) = D_xx · D_yy - (0.9 · D_xy)²    (23)

where D_xx, D_yy and D_xy are the approximated second-order derivative responses.
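A single-scale sketch of the response of formula (23) is given below using explicit Gaussian derivatives from SciPy; SURF itself replaces these with box-filter approximations over an integral image, and the patent further substitutes the guided filter for the Gaussian, so this is an illustrative approximation only.

    import numpy as np
    from scipy.ndimage import gaussian_filter

    def hessian_response(img, sigma=1.2):
        """Sketch of formula (23): det(H) ~ Dxx*Dyy - (0.9*Dxy)^2 at one scale."""
        f = img.astype(np.float64)
        Dxx = gaussian_filter(f, sigma, order=(0, 2))       # second derivative along x (columns)
        Dyy = gaussian_filter(f, sigma, order=(2, 0))       # second derivative along y (rows)
        Dxy = gaussian_filter(f, sigma, order=(1, 1))       # mixed second derivative
        return Dxx * Dyy - (0.9 * Dxy) ** 2                 # local maxima indicate candidate key points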

In S122, as with SIFT, the scale space of SURF consists of octaves (O) and layers (L). For SURF feature points, the image size is the same across the different octaves, and the template size of the guided filter used increases from octave to octave; within one octave, the different layers use guided filters of the same template size, but the blur coefficient of the filter increases gradually.

S123 may specifically comprise: S127, comparing each pixel obtained from the Hessian-matrix processing with the points in its two-dimensional image-space and scale-space neighborhood to determine key points; and S128, filtering out key points with weak energy and wrongly located key points to obtain the final feature points. Specifically, each pixel processed by the Hessian matrix can be compared with the 26 points in its two-dimensional image-space and scale-space neighborhood to preliminarily locate the key points.

S124 may specifically comprise: S129, within the circular neighborhood of a feature point, summing the horizontal and vertical Haar wavelet responses of all points inside a sector of a given angle; S130, rotating the sector by a fixed angular step and again summing the horizontal and vertical Haar wavelet responses of all points inside the rotated sector; and S131, taking the direction of the sector whose summed horizontal and vertical Haar wavelet responses are largest as the main direction of the feature point. Specifically, the SURF feature point detection algorithm uses the Haar wavelet responses in the circular neighborhood of a feature point: the horizontal and vertical Haar wavelet responses of all points inside a 60-degree sector are summed, the sector is then rotated in steps of 0.2 radian and the Haar responses in the region are summed again, and finally the direction of the sector with the largest value is taken as the main direction of the feature point, as shown in FIG. 2.
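The sector vote of S129 through S131 can be written compactly as below; dx and dy stand for the precomputed horizontal and vertical Haar responses of the points in the feature point's circular neighborhood, the 60-degree window and 0.2-radian step follow the description above, and the function name and interface are illustrative.

    import numpy as np

    def dominant_orientation(dx, dy, step=0.2, window=np.pi / 3):
        """Sketch of S129-S131: sliding 60-degree sector vote for the main direction."""
        angles = np.arctan2(dy, dx)
        best_dir, best_norm = 0.0, -1.0
        for start in np.arange(0.0, 2 * np.pi, step):          # rotate the sector in 0.2-radian steps
            in_sector = (angles - start) % (2 * np.pi) < window
            sx, sy = dx[in_sector].sum(), dy[in_sector].sum()  # summed horizontal/vertical responses
            norm = sx * sx + sy * sy
            if norm > best_norm:                               # keep the sector with the largest response
                best_norm, best_dir = norm, float(np.arctan2(sy, sx))
        return best_dir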

In S125: in the SIFT algorithm, a 4x4 grid of blocks around the feature point is taken, 8 gradient directions are accumulated in each block, and the resulting 4x4x8 = 128-dimensional vector is used as the SIFT descriptor. In the SURF algorithm, a 4x4 grid of rectangular sub-regions is likewise taken around the feature point, but the rectangular region is oriented along the main direction of the feature point. In each sub-region, the horizontal and vertical Haar wavelet responses of 25 pixels are accumulated, where horizontal and vertical are defined relative to the main direction. The Haar wavelet feature consists of four values: the sum of the horizontal responses, the sum of the vertical responses, the sum of the absolute horizontal responses and the sum of the absolute vertical responses, as shown in FIG. 3. These four values form the feature vector of each sub-region, so a 4x4x4 = 64-dimensional vector is used as the SURF descriptor, half the dimension of the SIFT descriptor.

S126 may specifically comprise: S132, computing the Euclidean distance between two feature points and determining the degree of match between them from that distance; S133, for two feature points whose degree of match exceeds a threshold, comparing the traces of their Hessian matrices; and S134, discarding the pair if the signs of the two Hessian matrix traces differ. Specifically, the shorter the Euclidean distance, the better the two feature points match. If the Hessian traces of two feature points have the same sign, the two points have contrast changes in the same direction; if the signs differ, the contrast changes of the two points are in opposite directions and the pair must be discarded even if their Euclidean distance is zero.
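A minimal matching sketch for S132 through S134 is given below; desc1 and desc2 are the 64-dimensional SURF descriptors of the two images, traces1 and traces2 hold the Hessian trace (Laplacian sign) stored with each key point, and the distance threshold is an assumed tuning value.

    import numpy as np

    def match_features(desc1, desc2, traces1, traces2, max_dist=0.3):
        """Sketch of S132-S134: nearest-neighbor matching with Hessian-trace sign check."""
        matches = []
        for i, d in enumerate(desc1):
            dists = np.linalg.norm(desc2 - d, axis=1)        # S132: Euclidean distances to all candidates
            j = int(np.argmin(dists))
            if dists[j] < max_dist and np.sign(traces1[i]) == np.sign(traces2[j]):
                matches.append((i, j))                       # S133/S134: keep only close, same-sign pairs
        return matches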

An embodiment of the present invention also provides an image feature point detection system, comprising:

an image enhancement module, configured to perform image enhancement processing on an image to be detected to obtain an enhanced image;

an image filtering module, configured to perform guided filtering processing on the enhanced image to obtain a filtered image;

a feature point detection module, configured to perform SURF feature point detection on the filtered image to obtain target feature points.

An embodiment of the present invention further provides a computer storage medium storing a plurality of computer instructions, the computer instructions being used to cause a computer to execute the above method.

Although preferred embodiments of the present invention have been described, those skilled in the art can make further changes and modifications to these embodiments once they learn of the basic inventive concept. Therefore, the appended claims are intended to be construed as including the preferred embodiments and all changes and modifications that fall within the scope of the present invention.

Obviously, those skilled in the art can make various changes and variations to the present invention without departing from its spirit and scope. Thus, if these modifications and variations fall within the scope of the claims of the present invention and their equivalents, the present invention is also intended to include them.

Claims (9)

1. An image feature point detection method, comprising:
carrying out image enhancement processing on an image to be detected to obtain an enhanced image;
performing guided filtering processing on the enhanced image to obtain a filtered image;
performing SURF feature point detection on the filtered image to obtain target feature points;
the image enhancement processing is performed on the image to be detected to obtain an enhanced image, and the image enhancement processing comprises the following steps:
filtering an image to be detected to obtain a low-frequency image;
making a difference between the image to be detected and the low-frequency image to obtain a high-frequency image;
performing self-adaptive mixed dimming processing on the low-frequency image to obtain a dimming image;
and superposing the dimming image and the high-frequency image to obtain the enhanced image.
2. The method according to claim 1, wherein the filtering the image to be detected to obtain a low-frequency image comprises:
and filtering the image to be detected by adopting an average filter to obtain the low-frequency image.
3. The method according to claim 1, wherein the performing adaptive hybrid dimming processing on the low-frequency image to obtain a dimming image comprises:
performing histogram statistics on the low-frequency image to obtain corresponding histogram mapping output;
performing linear dimming processing on the low-frequency image to obtain corresponding dimming output;
and carrying out weighted summation on the histogram mapping output and the dimming output to obtain the dimming image.
4. The method according to claim 1, wherein the performing SURF feature point detection on the filtered image to obtain target feature points comprises:
constructing a Hessian matrix, and processing the filtered image by using the Hessian matrix to obtain corresponding pixel points and a two-dimensional image space;
constructing a scale space;
comparing the pixel points obtained by the Hessian matrix processing with the two-dimensional image space and the points in the neighborhood of the scale space to determine characteristic points;
determining a main direction according to the characteristic points;
determining a corresponding feature point descriptor according to the main direction;
and matching the feature points according to the feature point descriptors, and removing unmatched feature points.
5. The method as claimed in claim 4, wherein the step of comparing the pixel points obtained by the Hessian matrix processing with the points in the two-dimensional image space and the neighborhood of the scale space to determine the feature points comprises:
comparing the pixel points obtained through the Hessian matrix processing with the two-dimensional image space and the points in the neighborhood of the scale space to determine key points;
and filtering key points with weak energy and key points with wrong positioning, and screening out final feature points.
6. The image feature point detection method according to claim 4, wherein the determining a principal direction from the feature points includes:
in the circular neighborhood of the characteristic points, counting the sum of the horizontal and vertical Haar wavelet characteristic values of all the points in a certain angle sector;
rotating the sector at intervals of a certain radian, and counting the sum of horizontal and vertical Haar wavelet characteristic values of all points in the rotated sector area;
and taking the direction of the sector with the maximum sum of the horizontal and vertical Haar wavelet characteristic values of each point in the region as the main direction of the characteristic point.
7. The method according to claim 4, wherein the matching the feature points according to the feature point descriptors and eliminating unmatched feature points comprises:
determining the Euclidean distance between the two feature points, and determining the matching degree between the feature points according to the Euclidean distance;
determining a Hessian matrix trace between the two characteristic points with the matching degree higher than a threshold value;
and if the signs of Hessian matrix traces between the two characteristic points are different, removing the corresponding characteristic points.
8. An image feature point detection system, comprising:
the image enhancement module is used for carrying out image enhancement processing on the image to be detected to obtain an enhanced image;
the image filtering module is used for conducting guided filtering processing on the enhanced image to obtain a filtered image;
and the characteristic point detection module is used for performing SURF characteristic point detection on the filtered image to obtain target characteristic points.
9. A computer storage medium having stored thereon a plurality of computer instructions for causing a computer to perform the method of any one of claims 1-8.
CN202210417636.0A 2022-04-21 2022-04-21 An image feature point detection method, system and computer storage medium Pending CN114677518A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210417636.0A CN114677518A (en) 2022-04-21 2022-04-21 An image feature point detection method, system and computer storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210417636.0A CN114677518A (en) 2022-04-21 2022-04-21 An image feature point detection method, system and computer storage medium

Publications (1)

Publication Number Publication Date
CN114677518A 2022-06-28

Family

ID=82078679

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210417636.0A Pending CN114677518A (en) 2022-04-21 2022-04-21 An image feature point detection method, system and computer storage medium

Country Status (1)

Country Link
CN (1) CN114677518A (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110135438A (en) * 2019-05-09 2019-08-16 哈尔滨工程大学 An Improved SURF Algorithm Based on Gradient Amplitude Precomputing
CN110390338A (en) * 2019-07-10 2019-10-29 武汉大学 A High Precision Matching Method for SAR Based on Nonlinear Guided Filtering and Ratio Gradient
KR102173244B1 (en) * 2019-12-10 2020-11-03 (주)인펙비전 Video stabilization system based on SURF
CN113673515A (en) * 2021-08-20 2021-11-19 国网上海市电力公司 A computer vision target detection algorithm
WO2021238655A1 (en) * 2020-05-29 2021-12-02 展讯通信(上海)有限公司 Image processing method and apparatus, storage medium and terminal
CN114332081A (en) * 2022-03-07 2022-04-12 泗水县亿佳纺织厂 Textile surface abnormity determination method based on image processing


Similar Documents

Publication Publication Date Title
CN114399522B (en) An edge detection method based on Canny operator with high and low thresholds
CN107203973B (en) Sub-pixel positioning method for center line laser of three-dimensional laser scanning system
CN111080661B (en) Image-based straight line detection method and device and electronic equipment
CN111080529A (en) A Robust UAV Aerial Image Mosaic Method
CN110992263B (en) Image stitching method and system
CN108416789A (en) Method for detecting image edge and system
CN117853510A (en) Canny edge detection method based on bilateral filtering and self-adaptive threshold
CN104933434A (en) Image matching method combining length between perpendiculars (LBP) feature extraction method and surf feature extraction method
Li et al. Road lane detection with gabor filters
CN112017223B (en) Heterologous image registration method based on improved SIFT-Delaunay
CN107169979A (en) A kind of method for detecting image edge of improvement Canny operators
CN109559273B (en) Quick splicing method for vehicle bottom images
CN111223063A (en) Finger vein image NLM denoising method based on texture feature and dual kernel function
CN115994870B (en) Image processing method for enhancing denoising
CN114119437A (en) GMS-based image stitching method for improving moving object distortion
CN108416801B (en) A Har-SURF-RAN Feature Point Matching Method for Stereo Vision 3D Reconstruction
CN115147613B (en) A method for infrared small target detection based on multi-directional fusion
CN108550165A (en) A kind of image matching method based on local invariant feature
Kumar et al. A novel method of edge detection using cellular automata
CN113160332A (en) Multi-target identification and positioning method based on binocular vision
CN110929598A (en) Contour feature-based matching method for unmanned aerial vehicle SAR images
CN117934863A (en) Adaptive FAST corner detection optimization algorithm based on grayscale mean
CN115661110A (en) A method for identifying and locating transparent workpieces
CN113793372A (en) Optimal registration method and system for different-source images
CN113205540B (en) Multi-scale automatic anisotropic morphological direction derivative edge detection method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination