
CN106156699B - Image processing apparatus and image matching method - Google Patents

Image processing apparatus and image matching method

Info

Publication number
CN106156699B
Authority
CN
China
Prior art keywords
search
correlation coefficient
first feature
parameter
intermediate features
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201510147484.7A
Other languages
Chinese (zh)
Other versions
CN106156699A
Inventor
杨安荣
孙成昆
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Lenovo Beijing Ltd
Original Assignee
Lenovo Beijing Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Lenovo Beijing Ltd
Priority to CN201510147484.7A
Publication of CN106156699A
Application granted
Publication of CN106156699B
Legal status: Active (current)
Anticipated expiration

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/12 Fingerprints or palmprints
    • G06V40/1365 Matching; Classification
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/74 Image or video pattern matching; Proximity measures in feature spaces
    • G06V10/75 Organisation of the matching processes, e.g. simultaneous or sequential comparisons of image or video features; Coarse-fine approaches, e.g. multi-scale approaches; using context analysis; Selection of dictionaries
    • G06V10/751 Comparing pixel values or logical combinations thereof, or feature values having positional relevance, e.g. template matching
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16 Human faces, e.g. facial parts, sketches or expressions
    • G06V40/161 Detection; Localisation; Normalisation

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Artificial Intelligence (AREA)
  • Computing Systems (AREA)
  • Databases & Information Systems (AREA)
  • Evolutionary Computation (AREA)
  • Medical Informatics (AREA)
  • Software Systems (AREA)
  • Image Processing (AREA)
  • Image Analysis (AREA)

Abstract

The present invention provides an image processing apparatus and an image matching method that reduce the computation and storage required for image matching while guaranteeing matching accuracy. The image processing apparatus includes a converter unit, a first search unit, a second search unit and a judging unit. The first search unit sets a first search parameter to each of a plurality of predetermined parameters in turn, generates a first intermediate feature map based on the set first search parameter and a first feature map, and calculates a first correlation coefficient between the generated first intermediate feature map and a template image. The second search unit determines the value range of a second search parameter based on the plurality of first correlation coefficients corresponding to the plurality of predetermined parameters, sets the second search parameter to values within that range, generates a second intermediate feature map based on the set second search parameter and a second feature map, and calculates a second correlation coefficient between the generated second intermediate feature map and the template image.

Description

Image processing apparatus and image matching method
Technical field
The present invention relates to an image processing apparatus and an image matching method.
Background art
In prior-art image matching methods such as fingerprint matching and face matching, a correlation coefficient between an input image and a template image is calculated, and the input image can be judged to match the template image when, for example, the correlation coefficient exceeds a threshold. Moreover, to cope with matching errors caused by positional shifts and direction rotations, the correlation coefficient has to be calculated for every position offset and every rotation, which further increases the amount of computation and storage.
Specifically, to reduce computation and storage when calculating the correlation coefficient between the input image and the template image, the input image is transformed (scaled) into an intermediate image with a relatively small amount of data, and the correlation coefficient between the intermediate image and the template image is calculated. However, as the data volume of the intermediate image decreases, matching accuracy also declines.
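As a rough illustration of the burden described above, the following Python sketch scores every combination of position offset and rotation against the template; the function names, the use of scipy.ndimage for warping, and the Pearson correlation are illustrative assumptions rather than anything prescribed by this patent.

```python
# Illustrative brute-force matching: one full-image correlation per
# (row offset, column offset, rotation) combination.
import numpy as np
from scipy import ndimage  # assumed available for the warping steps

def brute_force_match(input_img, template, offsets, angles):
    """Return the best correlation over all offset/rotation combinations."""
    best = -1.0
    for angle in angles:
        rotated = ndimage.rotate(input_img, angle, reshape=False)
        for dr, dc in offsets:
            shifted = ndimage.shift(rotated, (dr, dc))
            corr = np.corrcoef(shifted.ravel(), template.ravel())[0, 1]
            best = max(best, corr)
    return best

# With, say, 15*15 offsets and 9 rotations this is 2025 full-resolution
# correlations per template, which is the cost the scheme below avoids.
```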
Summary of the invention
The present invention has been made in view of the above problems, and its object is to provide an image processing apparatus and an image matching method that reduce the computation and storage required for image matching while guaranteeing matching accuracy.
According to one aspect of the present invention, an image processing apparatus is provided. The image processing apparatus includes: a converter unit that transforms an input image to generate a first feature map and a second feature map whose data volume is larger than that of the first feature map; a first search unit that sets a first search parameter to each of a plurality of predetermined parameters in turn, generates a first intermediate feature map based on the set first search parameter and the first feature map, and calculates a first correlation coefficient between the generated first intermediate feature map and a template image, each first correlation coefficient corresponding to one predetermined parameter; a second search unit that determines a value range of a second search parameter based on the plurality of first correlation coefficients corresponding to the plurality of predetermined parameters, sets the second search parameter to values within that range, generates a second intermediate feature map based on the set second search parameter and the second feature map, and calculates a second correlation coefficient between the generated second intermediate feature map and the template image; and a judging unit that determines that the input image matches the template image when a calculated second correlation coefficient satisfies a predetermined condition, and determines that the input image does not match the template image when the second correlation coefficients between the template image and all second intermediate feature maps, generated from the second feature map and all second search parameters set within the value range of the second search parameter, fail to satisfy the predetermined condition.
According to another aspect of the present invention, an image matching method is provided. The image matching method includes: transforming an input image to generate a first feature map and a second feature map whose data volume is larger than that of the first feature map; setting a first search parameter to each of a plurality of predetermined parameters in turn, generating a first intermediate feature map based on the set first search parameter and the first feature map, and calculating a first correlation coefficient between the generated first intermediate feature map and a template image, each first correlation coefficient corresponding to one predetermined parameter; determining a value range of a second search parameter based on the plurality of first correlation coefficients corresponding to the plurality of predetermined parameters; setting the second search parameter to values within the value range of the second search parameter, generating a second intermediate feature map based on the set second search parameter and the second feature map, and calculating a second correlation coefficient between the generated second intermediate feature map and the template image; determining that the input image matches the template image when a calculated second correlation coefficient satisfies a predetermined condition; and determining that the input image does not match the template image when the second correlation coefficients between the template image and all second intermediate feature maps, generated from the second feature map and all second search parameters set within the value range, fail to satisfy the predetermined condition.
With the image processing apparatus and image matching method according to the present invention, the first search calculates correlation coefficients between the template image and the first feature map, whose data volume is relatively small, and the calculated correlation coefficients are used to determine the value range of the search parameter for the second search. In the second search, correlation coefficients between the template image and the second feature map, whose data volume is relatively large, need to be calculated only within that range. The accuracy of image matching can therefore be kept at the level achieved when matching with the second feature map, while the amount of computation is reduced.
Brief description of the drawings
Fig. 1 is a functional block diagram of an image processing apparatus according to an embodiment of the present invention.
Fig. 2 is a flow chart of an image matching method according to an embodiment of the present invention.
Description of embodiments
Embodiments of the present invention are described below with reference to the drawings. The description with reference to the drawings is provided to aid understanding of the example embodiments of the invention defined by the appended claims and their equivalents. It includes various specific details to assist that understanding, but they are to be regarded as merely illustrative. Those skilled in the art will therefore recognize that various changes and modifications can be made to the embodiments described herein without departing from the scope and spirit of the present invention. Moreover, detailed descriptions of well-known functions and constructions are omitted to keep the specification clear and concise.
An image processing apparatus according to an embodiment of the present invention is described below with reference to Fig. 1, which is a functional block diagram of the apparatus.
As shown in Fig. 1, the image processing apparatus 1 includes a converter unit 11, a first search unit 12, a second search unit 13 and a judging unit 14. The image processing apparatus 1 may be, for example, a smartphone, a tablet computer, a laptop computer, a fingerprint recognition device, a face recognition device or any other apparatus capable of processing image data.
The converter unit 11 transforms the input image to generate a first feature map and a second feature map whose data volume is larger than that of the first feature map.
The input image may be acquired by an acquisition module of the image processing apparatus 1 itself, or received from another device. Its content depends on the application: if the image processing apparatus 1 performs fingerprint recognition, the input image is a fingerprint image; if it performs face recognition, the input image is a face image.
Specifically, the converter unit 11 applies, for example, a wavelet transform and a downscaling transform to the input image to generate the second feature map, and then applies a wavelet transform and a downscaling transform again to generate the first feature map. The downscaling used to generate the first feature map differs from that used to generate the second feature map, so the data volume of the second feature map is larger than that of the first feature map. For example, the converter unit 11 applies a wavelet transform and downscaling to a 1024*1024-pixel input image to generate a 64*64-pixel second feature map, and applies a wavelet transform and downscaling to the 1024*1024-pixel input image to generate a 32*32-pixel first feature map. When generating the first feature map, the converter unit 11 may also apply the wavelet transform and downscaling to the already generated second feature map instead of the input image.
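A minimal sketch of this two-resolution decomposition, assuming the PyWavelets package and a Haar wavelet (neither is specified in the patent), with repeated 2-D DWT approximation bands serving as the downscaling transform:

```python
# Hypothetical construction of the two feature maps by repeated 2-D wavelet
# approximation; wavelet choice and library are assumptions.
import numpy as np
import pywt

def wavelet_downscale(image, levels):
    """Halve the image 'levels' times by keeping only the approximation band."""
    approx = np.asarray(image, dtype=float)
    for _ in range(levels):
        approx, _details = pywt.dwt2(approx, 'haar')
    return approx

def make_feature_maps(input_img):
    # 1024x1024 -> 64x64 second feature map (4 halvings), then one more
    # halving of the second feature map gives the 32x32 first feature map.
    second_map = wavelet_downscale(input_img, levels=4)
    first_map = wavelet_downscale(second_map, levels=1)
    return first_map, second_map
```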
Preferably, the converter unit 11 transforms the input image or the second feature map with different transformation parameters, thereby generating at least two first feature maps corresponding to the different transformation parameters. Specifically, the converter unit applies a wavelet transform and downscaling to the input image or the second feature map to generate a first feature map; for example, window functions of different angles are used in the wavelet transform, followed by downscaling, so that multiple first feature maps are generated, one per angle. In a concrete example, the converter unit 11 applies wavelet transforms to the input image or the second feature map with window functions at 0, 30, 60, 90, 120 and 150 degrees and then downscales, generating six first feature maps corresponding to those angles. Using these first feature maps in the subsequent first search improves the accuracy of the first search, which in turn allows the value range of the second search parameter to be set more accurately.
Further preferably, the converter unit 11 filters the input image to generate a filtered input image, and transforms the filtered input image to generate the first feature map and the second feature map whose data volume is larger than that of the first feature map. Specifically, before generating the feature maps, the converter unit 11 preprocesses the input image, for example by wavelet filtering, to remove noise from the original input image. Removing noise in advance eliminates its interference with the matching process and improves matching accuracy. The converter unit 11 then generates the first and second feature maps from the filtered input image.
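As one possible form of the wavelet-filtering preprocessing (the patent does not fix a particular filter), a soft-threshold denoising sketch, again assuming PyWavelets; the threshold value and decomposition level are arbitrary:

```python
# Hypothetical wavelet denoising of the input image before feature-map generation.
import numpy as np
import pywt

def wavelet_denoise(image, wavelet='haar', level=2, threshold=10.0):
    coeffs = pywt.wavedec2(np.asarray(image, dtype=float), wavelet, level=level)
    # Soft-threshold the detail bands; leave the approximation band untouched.
    cleaned = [coeffs[0]] + [
        tuple(pywt.threshold(band, threshold, mode='soft') for band in detail)
        for detail in coeffs[1:]
    ]
    return pywt.waverec2(cleaned, wavelet)
```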
Although the above transformation and filtering have been described using the wavelet transform and wavelet filtering as examples, the present invention is not limited thereto; other processing such as a DCT transform or mean filtering may also be used, as long as the resulting first and second feature maps represent the features of the input image well.
The first search unit 12 sets the first search parameter to each of a plurality of predetermined parameters in turn, generates a first intermediate feature map based on the set first search parameter and the first feature map, and calculates a first correlation coefficient between the generated first intermediate feature map and the template image. Each first correlation coefficient corresponds to one predetermined parameter.
A predetermined parameter may, for example, represent a position offset. In the first search performed by the first search unit 12, the first search parameter is set in turn to each of several different position offsets. Specifically, a position offset is expressed as the row and column in the first feature map that correspond to the center point of the first intermediate feature map; the offsets may be predefined as (row 1, column 1), (row 1, column 3), (row 1, column 5), ..., (row 15, column 15), and so on. These predefined offsets are only an example, and other offsets may be set as needed. A predetermined parameter may also represent a rotation amount, in which case the first search parameter is set in turn to each of several rotation amounts, which may be predefined as -40 degrees, -30 degrees, -20 degrees, ..., 40 degrees, and so on; again, other rotation amounts may be set as needed. A predetermined parameter may also represent a position offset and a rotation amount simultaneously, or other parameters as needed.
In the following, the description proceeds with the case in which a predetermined parameter represents both a position offset and a rotation amount, so that the first search parameter is set in turn to each combination of the above position offsets and rotation amounts. A sketch of this parameter grid is given below.
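A small sketch of the coarse parameter grid implied by the example values above; the exact step sizes are assumptions:

```python
# Hypothetical coarse grid for the first search: every combination of a
# position offset (row, column) and a rotation amount in degrees.
coarse_offsets = [(r, c) for r in range(1, 16, 2) for c in range(1, 16, 2)]
coarse_angles = list(range(-40, 41, 10))

first_search_grid = [(offset, angle)
                     for offset in coarse_offsets
                     for angle in coarse_angles]
# 8*8 offsets times 9 angles = 576 coarse candidates, evaluated on the small
# first feature map rather than on the full-resolution input image.
```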
Specifically, after the first search parameter has been set to a particular position offset and a particular rotation amount, a first intermediate feature map is generated based on the set first search parameter and the first feature map, and the first correlation coefficient between the generated first intermediate feature map and the template image is calculated. This first correlation coefficient is associated with that position offset and rotation amount. The first search parameter is then set to another combination of position offset and rotation amount and the above processing is repeated, to calculate the first correlation coefficient associated with that combination. The processing is repeated until every combination of position offset and rotation amount has been set as the first search parameter once, so that a first correlation coefficient corresponding to each combination is obtained.
After the first search parameter has been set to the position offset (row a, column b) and the rotation amount z degrees, the first search unit 12 generates the first intermediate feature map from the first feature map based on the set first search parameter, that is, the position offset (row a, column b) and the rotation amount z degrees. In the concrete processing below, a 16*16-pixel first feature map is represented by a matrix of 16*16 elements.
Specifically, the relationship between the elements of the matrix B representing the generated first intermediate feature map and the elements of the matrix A representing the first feature map is as follows. The element in row 1, column 1 of B is the element in row x, column y of A, where, when z is less than or equal to 0, x = a - (cos z° - sin z°) * d/2 and y = b - (cos z° - sin z°) * d/2, and when z is greater than 0, x = a + (cos z° + sin z°) * d/2 and y = b + (cos z° - sin z°) * d/2; for a 16*16-pixel first feature map, d = 16. The element in row 1, column 2 of B is the element in row (x - sin z°), column (y + cos z°) of A. The element in row 1, column 3 of B is the element in row (x - 2*sin z°), column (y + 2*cos z°) of A. The element in row 2, column 1 of B is the element in row (x + cos z°), column (y + sin z°) of A. The element in row 3, column 1 of B is the element in row (x + cos z° + cos z°), column (y + sin z° + sin z°) of A. The element in row 2, column 2 of B is the element in row (x + cos z° - sin z°), column (y + sin z° + cos z°) of A. In general, the element in row n, column m of B is the element in row (x - (m-1)*sin z° + (n-1)*cos z°), column (y + (m-1)*cos z° + (n-1)*sin z°) of A. In this way, the values of all elements of B can be calculated.
When a computed row or column index, x - (m-1)*sin z° + (n-1)*cos z° or y + (m-1)*cos z° + (n-1)*sin z°, is not an integer, it is, for example, rounded to the nearest integer to obtain integer row and column numbers.
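The index relation above amounts to sampling the first feature map along a grid rotated by z degrees and anchored near (a, b). A sketch of that sampling, with nearest-integer rounding and with out-of-range samples left at zero (the zero filling is described in the following paragraphs); the function name and 1-based indexing convention are assumptions:

```python
# Sketch of generating the intermediate feature map B from feature map A for a
# position offset (a, b) (1-based row/column) and a rotation of z degrees.
import numpy as np

def intermediate_map(A, a, b, z_deg):
    d = A.shape[0]                      # e.g. 16 for a 16*16 first feature map
    z = np.deg2rad(z_deg)
    cz, sz = np.cos(z), np.sin(z)
    # Anchor element (x, y) for B[1, 1], following the sign convention above.
    if z_deg <= 0:
        x = a - (cz - sz) * d / 2
        y = b - (cz - sz) * d / 2
    else:
        x = a + (cz + sz) * d / 2
        y = b + (cz - sz) * d / 2
    B = np.zeros((d, d))
    for n in range(1, d + 1):           # row of B
        for m in range(1, d + 1):       # column of B
            row = int(round(x - (m - 1) * sz + (n - 1) * cz))
            col = int(round(y + (m - 1) * cz + (n - 1) * sz))
            if 1 <= row <= d and 1 <= col <= d:
                B[n - 1, m - 1] = A[row - 1, col - 1]
            # else: out-of-range samples stay zero, as described below
    return B
```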
Preferably, the first search unit 12 determines, based on the set first search parameter, which pixels of the first feature map are needed, and generates the first intermediate feature map using only the values of those pixels.
Specifically, when the first search unit 12 generates the first intermediate feature map from the first feature map based on the set first search parameter (the position offset (row a, column b) and the rotation amount z degrees), the elements of the matrix B representing the generated first intermediate feature map are computed from the elements of the matrix A representing the first feature map, as described above. A computed row or column index may fall outside matrix A; for example, when A is a 16*16 matrix and a computed index exceeds 16 or is less than 1, the value of the corresponding element of B is simply set to zero. Accordingly, when the first search parameter is set to a particular predetermined parameter, some elements of the matrix of the first feature map are never used in generating B. In the concrete processing, such elements are not read and are not used in the correlation calculation, which reduces the amount of computation. Since the elements of the matrix of the first feature map correspond to pixels, the first search unit 12 determines the needed pixels of the first feature map from the set first search parameter and generates the first intermediate feature map using only their values.
The above generation of the first intermediate feature map from the first feature map based on the set first search parameter (position offset (row a, column b) and rotation amount z degrees) is only an example; the first intermediate feature map may be generated by other methods, and appropriate transformation processing may also be applied, for example to improve the accuracy of the first correlation coefficient computed against the template image.
Furthermore, as described above, when the converter unit 11 transforms the input image or the second feature map with different transformation parameters and generates at least two first feature maps corresponding to the different transformation parameters, the first search unit 12 generates the first intermediate feature map using those first feature maps. For example, suppose six first feature maps are generated and represented by matrices A1 to A6. When generating the first intermediate feature map based on the set first search parameter (position offset (row a, column b) and rotation amount z degrees), the row and column indices into the first feature map are determined as described above, the elements at the corresponding row and column are read from each of the six matrices, and the six read values are combined, for example by a weighted average, to obtain the value of the corresponding element of the matrix B representing the first intermediate feature map. For instance, the element in row 1, column 2 of B is the weighted average of the elements in row (x - sin z°), column (y + cos z°) of A1 through A6. Computing the first correlation coefficient between a first intermediate feature map generated in this way and the template image improves the accuracy of the first correlation coefficient, which in turn allows the value range of the second search parameter to be set more accurately. A sketch follows.
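A brief sketch of that per-element combination over the six orientation-specific first feature maps; the patent does not specify the weights, so uniform weights are assumed here:

```python
# Hypothetical fusion of six orientation-specific feature maps A1..A6: the
# sample used at (row, col) is a weighted average across the maps.
import numpy as np

def fused_sample(feature_maps, row, col, weights=None):
    """feature_maps: list of 2-D arrays A1..A6; row and col are 1-based."""
    values = np.array([A[row - 1, col - 1] for A in feature_maps])
    if weights is None:
        weights = np.full(len(feature_maps), 1.0 / len(feature_maps))
    return float(np.dot(weights, values))
```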
After generating the first intermediate feature map, the first search unit 12 calculates the first correlation coefficient between the generated first intermediate feature map and the template image. This correlation coefficient can be computed by prior-art methods, which are not elaborated here.
The template image used to compute the first correlation coefficient with the first intermediate feature map preferably has the same pixel dimensions as the first intermediate feature map, which reduces the computation needed for the first correlation coefficient. It is further preferable that this template image be generated by applying to a registered image the same transformation processing that is used to generate the first feature map, which improves the reliability of the calculated first correlation coefficient.
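One standard choice for such a correlation measure is the Pearson correlation coefficient between the two maps treated as vectors; the patent leaves the exact formula to the prior art, so the sketch below is an assumption:

```python
# Assumed correlation measure: Pearson correlation between an intermediate
# feature map and a template image of the same size, both flattened.
import numpy as np

def correlation_coefficient(intermediate, template):
    x = np.asarray(intermediate, dtype=float).ravel()
    y = np.asarray(template, dtype=float).ravel()
    x -= x.mean()
    y -= y.mean()
    denom = np.linalg.norm(x) * np.linalg.norm(y)
    return float(np.dot(x, y) / denom) if denom else 0.0
```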
The processing of generating the first intermediate feature map and the processing of calculating the first correlation coefficient may also be executed in parallel. Specifically, once an element of the matrix representing the first intermediate feature map has been generated, it can immediately be used in the correlation calculation. This saves time needed to calculate the first correlation coefficient and improves the efficiency of image matching.
Through the above processing, the first search unit 12 calculates the plurality of first correlation coefficients corresponding to the plurality of predetermined parameters, that is, a first correlation coefficient for every combination of position offset and rotation amount.
The second search unit 13 determines the value range of the second search parameter based on the plurality of first correlation coefficients calculated by the first search unit.
Specifically, the second search unit 13 compares the magnitudes of the first correlation coefficients and determines the value range of the second search parameter for the second search accordingly. The quantities represented by the second search parameter are the same as those represented by the first search parameter; for example, if the predetermined parameters set for the first search parameter represent a position offset and a rotation amount, the values set for the second search parameter also represent a position offset and a rotation amount.
Preferably, among the first correlation coefficients corresponding to the plurality of predetermined parameters, the predetermined parameter corresponding to the largest first correlation coefficient is identified, and the value range of the second search parameter is determined based on that predetermined parameter.
For example, if, among the first correlation coefficients calculated by the first search unit 12, the one corresponding to the position offset (row 3, column 4) and the rotation amount 30 degrees is the largest, the value range of the second search parameter is determined from that predetermined parameter (row 3, column 4, rotation amount 30 degrees); for instance, the value range of the second search parameter may be set to rows 2-4, columns 3-5 and rotation amounts 21-39 degrees.
The value range of the second search parameter may also be determined by other methods. For example, the predetermined parameters corresponding to the largest and the second-largest first correlation coefficients may both be identified, and the value range of the second search parameter determined from both of them. Although this increases the computation of the second search, it correspondingly improves the accuracy of image matching. A sketch of the range derivation follows.
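A sketch of deriving the refined range from the best coarse candidate, matching the example above (row 3, column 4, 30 degrees giving rows 2-4, columns 3-5, 21-39 degrees); the margin sizes are taken from that example and are otherwise assumptions:

```python
# Hypothetical range refinement: take the coarse parameter with the largest
# first correlation coefficient and search a small neighbourhood around it.
def second_search_range(first_results, row_margin=1, col_margin=1, angle_margin=9):
    """first_results maps (row, col, angle) -> first correlation coefficient."""
    best_row, best_col, best_angle = max(first_results, key=first_results.get)
    rows = range(best_row - row_margin, best_row + row_margin + 1)
    cols = range(best_col - col_margin, best_col + col_margin + 1)
    angles = range(best_angle - angle_margin, best_angle + angle_margin + 1)
    return rows, cols, angles

# e.g. best coarse candidate (3, 4, 30) -> rows 2..4, columns 3..5, angles 21..39
```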
Once the value range of the second search parameter has been determined, the second search unit 13 sets the second search parameter to values within that range, generates a second intermediate feature map based on the set second search parameter and the second feature map, and calculates a second correlation coefficient between the generated second intermediate feature map and the template image.
Specifically, the second search unit 13 sets the second search parameter to a position offset and a rotation amount within the value range, generates the second intermediate feature map based on the set second search parameter and the second feature map, and calculates the second correlation coefficient between the generated second intermediate feature map and the template image. If the judging unit 14 described below determines that the calculated second correlation coefficient does not satisfy the predetermined condition, the second search parameter is set to another combination of position offset and rotation amount (still within the value range), and the above processing is repeated to calculate another second correlation coefficient.
In the second search performed by the second search unit 13, the processing of generating the second intermediate feature map based on the set second search parameter and the second feature map and of calculating the second correlation coefficient between the generated second intermediate feature map and the template image is the same as in the first search described above, so its description is not repeated. As in the first search, the second search unit 13 determines, based on the set second search parameter, which pixels of the second feature map are needed, and generates the second intermediate feature map using only the values of those pixels. Also as in the first search, the second intermediate feature map may be generated by other methods, and appropriate transformation processing may be applied to improve the accuracy of the correlation coefficient computed against the template image.
The template image used to compute the second correlation coefficient with the second intermediate feature map preferably has the same pixel dimensions as the second intermediate feature map, which reduces the computation needed for the second correlation coefficient. It is further preferable that this template image be generated by applying to a registered image the same transformation processing that is used to generate the second feature map, which improves the reliability of the calculated second correlation coefficient.
The processing of generating the second intermediate feature map and the processing of calculating the second correlation coefficient may also be executed in parallel: once an element of the matrix representing the second intermediate feature map has been generated, it can immediately be used in the correlation calculation, which saves time and improves the efficiency of image matching.
The judging unit 14 determines that the input image matches the template image when a calculated second correlation coefficient satisfies the predetermined condition, and determines that the input image does not match the template image when the second correlation coefficients between the template image and all second intermediate feature maps, generated from the second feature map and all second search parameters set within the value range of the second search parameter, fail to satisfy the predetermined condition.
Specifically, after the second search unit 13 calculates the second correlation coefficient for some second search parameter, the judging unit 14 judges whether the calculated second correlation coefficient satisfies the predetermined condition, for example whether it exceeds a threshold. If it does, the judging unit 14 determines that the input image matches the template image and the matching process ends. If the judging unit 14 determines that the second correlation coefficient calculated by the second search unit 13 does not satisfy the predetermined condition, the second search unit 13 resets the second search parameter within its value range, as described above, and the processing is repeated. When every value within the value range of the second search parameter has been set as the second search parameter once and no second correlation coefficient calculated by the second search unit 13 satisfies the predetermined condition, the judging unit 14 determines that the input image does not match the template image.
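A sketch of this fine search with early termination, reusing the hypothetical helpers from the earlier sketches (intermediate_map and correlation_coefficient); the threshold value is an assumption:

```python
# Hypothetical fine search: evaluate the refined range on the larger second
# feature map and stop as soon as one correlation exceeds the threshold.
def second_search(second_map, template, rows, cols, angles, threshold=0.9):
    for r in rows:
        for c in cols:
            for z in angles:
                candidate = intermediate_map(second_map, r, c, z)
                if correlation_coefficient(candidate, template) > threshold:
                    return True        # input image matches the template image
    return False                       # no parameter in the range satisfied it
```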
With the image processing apparatus 1 according to the embodiment of the present invention, the first search over the first feature map, whose data volume is relatively small, determines the range of the second search, so that the second search over the second feature map, whose data volume is relatively large, is carried out only within that determined range. The amount of computation in the overall matching process is therefore reduced, while the matching precision is kept at the level that would be obtained by searching the full range with the larger second feature map.
An image matching method according to an embodiment of the present invention is described below with reference to Fig. 2, which is a flow chart of the method.
The image matching method shown in Fig. 2 can be applied to the image processing apparatus shown in Fig. 1. As shown in Fig. 1, the image processing apparatus 1 includes the converter unit 11, the first search unit 12, the second search unit 13 and the judging unit 14.
In step S1, the input image is transformed to generate a first feature map and a second feature map whose data volume is larger than that of the first feature map.
The input image may be acquired by an acquisition module of the image processing apparatus 1 itself, or received from another device. Its content depends on the application: if the image processing apparatus 1 performs fingerprint recognition, the input image is a fingerprint image; if it performs face recognition, the input image is a face image.
Specifically, the converter unit 11 applies, for example, a wavelet transform and a downscaling transform to the input image to generate the second feature map, and then applies a wavelet transform and a downscaling transform again to generate the first feature map. The downscaling used to generate the first feature map differs from that used to generate the second feature map, so the data volume of the second feature map is larger than that of the first feature map. For example, the converter unit 11 applies a wavelet transform and downscaling to a 1024*1024-pixel input image to generate a 64*64-pixel second feature map, and applies a wavelet transform and downscaling to the 1024*1024-pixel input image to generate a 32*32-pixel first feature map. When generating the first feature map, the converter unit 11 may also apply the wavelet transform and downscaling to the already generated second feature map instead of the input image.
Preferably, in step S1, the input image is filtered to generate a filtered input image, and the filtered input image is transformed to generate the first feature map and the second feature map whose data volume is larger than that of the first feature map. Specifically, before generating the feature maps, the converter unit 11 preprocesses the input image, for example by wavelet filtering, to remove noise from the original input image; removing noise in advance eliminates its interference with the matching process and improves matching accuracy. The converter unit 11 then generates the first and second feature maps from the filtered input image.
Further preferably, in step S1, the input image or the second feature map is transformed with different transformation parameters, thereby generating at least two first feature maps corresponding to the different transformation parameters. Specifically, the converter unit 11 applies a wavelet transform and downscaling to the input image or the second feature map to generate a first feature map; for example, window functions of different angles are used in the wavelet transform, followed by downscaling, so that multiple first feature maps are generated, one per angle. In a concrete example, the converter unit 11 applies wavelet transforms with window functions at 0, 30, 60, 90, 120 and 150 degrees and then downscales, generating six first feature maps corresponding to those angles. Using these first feature maps in the subsequent first search improves the accuracy of the first search, which in turn allows the value range of the second search parameter to be set more accurately.
Although the above transformation and filtering have been described using the wavelet transform and wavelet filtering as examples, the present invention is not limited thereto; other processing such as a DCT transform or mean filtering may also be used, as long as the resulting first and second feature maps represent the features of the input image well.
In step S2, the first search parameter is set to each of a plurality of predetermined parameters in turn, a first intermediate feature map is generated based on the set first search parameter and the first feature map, and a first correlation coefficient between the generated first intermediate feature map and the template image is calculated, each first correlation coefficient corresponding to one predetermined parameter.
A predetermined parameter represents, for example, a position offset and/or a rotation amount. In the first search performed by the first search unit 12, the first search parameter is set in turn to combinations of several different position offsets and/or several different rotation amounts. A position offset is expressed as the row and column in the first feature map that correspond to the center point of the first intermediate feature map; the offsets may be predefined as (row 1, column 1), (row 1, column 3), (row 1, column 5), ..., (row 15, column 15), and the rotation amounts as -40 degrees, -30 degrees, -20 degrees, ..., 40 degrees. These predefined offsets and rotation amounts are only examples, and other values may be set as needed. A predetermined parameter may also represent a position offset and a rotation amount simultaneously, or other parameters as needed.
Specifically, with the first search parameter set in turn to the combinations of the above position offsets and rotation amounts, after the first search parameter has been set to a particular position offset and rotation amount, the first intermediate feature map is generated based on the set first search parameter and the first feature map, and the first correlation coefficient between the generated first intermediate feature map and the template image is calculated; this coefficient is associated with that position offset and rotation amount. The first search parameter is then set to another combination of position offset and rotation amount and the above processing is repeated, until every combination has been set as the first search parameter once, so that a first correlation coefficient corresponding to each combination is obtained.
After the first search parameter has been set to the position offset (row a, column b) and the rotation amount z degrees, the first search unit 12 generates the first intermediate feature map from the first feature map based on the set first search parameter, with a 16*16-pixel first feature map again represented by a matrix of 16*16 elements.
The relationship between the elements of the matrix B representing the generated first intermediate feature map and the elements of the matrix A representing the first feature map is the same as described above: the element in row 1, column 1 of B is the element in row x, column y of A, where x = a - (cos z° - sin z°) * d/2 and y = b - (cos z° - sin z°) * d/2 when z is less than or equal to 0, and x = a + (cos z° + sin z°) * d/2 and y = b + (cos z° - sin z°) * d/2 when z is greater than 0, with d = 16 for a 16*16-pixel first feature map; in general, the element in row n, column m of B is the element in row (x - (m-1)*sin z° + (n-1)*cos z°), column (y + (m-1)*cos z° + (n-1)*sin z°) of A. In this way the values of all elements of B can be calculated.
When a computed row or column index is not an integer, it is, for example, rounded to the nearest integer to obtain integer row and column numbers.
Furthermore, as described above, when in step S1 the input image or the second feature map is transformed with different transformation parameters so that at least two first feature maps corresponding to the different transformation parameters are generated, the first intermediate feature map is generated in step S2 using those first feature maps. For example, if six first feature maps represented by matrices A1 to A6 are generated, then when generating the first intermediate feature map for the set first search parameter (position offset (row a, column b) and rotation amount z degrees), the row and column indices into the first feature map are determined as described above, the elements at the corresponding row and column are read from each of the six matrices, and the six read values are combined, for example by a weighted average, to obtain the value of the corresponding element of the matrix B representing the first intermediate feature map; for instance, the element in row 1, column 2 of B is the weighted average of the elements in row (x - sin z°), column (y + cos z°) of A1 through A6. Computing the first correlation coefficient between a first intermediate feature map generated in this way and the template image improves the accuracy of the first correlation coefficient, which in turn allows the value range of the second search parameter to be set more accurately.
The above generation of the first intermediate feature map from the first feature map based on the set first search parameter is only an example; the first intermediate feature map may be generated by other methods, and appropriate transformation processing may also be applied, for example to improve the accuracy of the first correlation coefficient computed against the template image.
Preferably, in step S2, the pixels of the first feature map that are needed are determined based on the set first search parameter, and the first intermediate feature map is generated using only the values of those pixels.
Specifically, when the first intermediate feature map is generated in step S2 from the first feature map based on the set first search parameter (position offset (row a, column b) and rotation amount z degrees), the elements of the matrix B are computed from the elements of the matrix A as described above, and a computed row or column index may fall outside matrix A; for example, when A is a 16*16 matrix and a computed index exceeds 16 or is less than 1, the value of the corresponding element of B is simply set to zero. Accordingly, for a given predetermined parameter some elements of the matrix of the first feature map are never used in generating B; such elements are not read and are not used in the correlation calculation, which reduces the amount of computation. Since the elements of the matrix of the first feature map correspond to pixels, step S2 determines the needed pixels of the first feature map from the set first search parameter and generates the first intermediate feature map using only their values.
In step S2, after the first intermediate feature map has been generated, the first correlation coefficient between the generated first intermediate feature map and the template image is calculated; this calculation can use prior-art methods and is not elaborated here. The template image used for this calculation preferably has the same pixel dimensions as the first intermediate feature map, which reduces the computation needed for the first correlation coefficient, and is preferably generated by applying to a registered image the same transformation processing that is used to generate the first feature map, which improves the reliability of the calculated first correlation coefficient.
The processing of generating the first intermediate feature map and the processing of calculating the first correlation coefficient may also be executed in parallel: once an element of the matrix representing the first intermediate feature map has been generated, it can immediately be used in the correlation calculation, which saves time and improves the efficiency of image matching.
Through the above processing, step S2 calculates the plurality of first correlation coefficients corresponding to the plurality of predetermined parameters, that is, a first correlation coefficient for every combination of position offset and rotation amount.
In step S3, the value range of the second search parameter is determined based on the plurality of first correlation coefficients corresponding to the plurality of predetermined parameters.
Specifically, in step S3 the magnitudes of the first correlation coefficients calculated in step S2 are compared, and the value range of the second search parameter for the second search is determined accordingly. The quantities represented by the second search parameter are the same as those represented by the first search parameter; for example, if the predetermined parameters set for the first search parameter represent a position offset and a rotation amount, the values set for the second search parameter also represent a position offset and a rotation amount.
Preferably, in step S3, among the first correlation coefficients corresponding to the plurality of predetermined parameters, the predetermined parameter corresponding to the largest first correlation coefficient is identified, and the value range of the second search parameter is determined based on that predetermined parameter.
For example, if, among the first correlation coefficients calculated in step S2, the one corresponding to the position offset (row 3, column 4) and the rotation amount 30 degrees is the largest, the value range of the second search parameter is determined from that predetermined parameter (row 3, column 4, rotation amount 30 degrees); for instance, the value range of the second search parameter may be set to rows 2-4, columns 3-5 and rotation amounts 21-39 degrees.
The value range of the second search parameter may also be determined by other methods. For example, the predetermined parameters corresponding to the largest and the second-largest first correlation coefficients may both be identified, and the value range of the second search parameter determined from both of them. Although this increases the computation of the second search, it correspondingly improves the accuracy of image matching.
In step s 4, second search parameter is set separately in the value range of the second search parameter, is based on institute The second search parameter and second feature figure of setting generates the second intermediate features figure, calculates the second intermediate features figure generated The second related coefficient between the template image.
Specifically, in step s 4, some position offset being successively set as the second search parameter in value range After some direction rotation amount, the second intermediate features are generated based on set the second search parameter and second feature figure Figure, and calculate the second related coefficient between the second intermediate features figure generated and the template image.If by aftermentioned Step S5 processing and be judged as that calculated second related coefficient is unsatisfactory for predetermined condition, then the second search parameter is set again It is set to other combinations (setting in value range certainly) of position offset and direction rotation amount, repeats above-mentioned processing, To calculate the second related coefficient.
In the second search process of step S4, the processing of generating the second intermediate feature map based on the set second search parameter and the second feature map, and of calculating the second correlation coefficient between the generated second intermediate feature map and the template image, is the same as in the first search process of step S2 described above, so its description is not repeated. As in the first search process, in step S4 pixels in the second feature map are determined based on the set second search parameter, and the second intermediate feature map is generated using only the values of the determined pixels. Also as in the first search process, the second intermediate feature map may be generated by other methods; for example, an appropriate conversion process may be applied in order to improve the calculation accuracy of the correlation coefficient with the template image.
The template image used to generate the second correlation coefficient with the second intermediate feature map is preferably a template image with the same number of pixels as the second intermediate feature map, which reduces the amount of calculation when computing the second correlation coefficient. Furthermore, this template image is preferably generated by applying to the registered image the same conversion process that is used to generate the second feature map, which improves the confidence of the calculated second correlation coefficient.
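The patent does not specify the exact correlation measure. Purely as an assumption, the sketch below uses a zero-mean normalized cross-correlation between two maps of equal size, which is one common choice for this kind of template comparison; it is the correlation() helper referred to in the sketches above.

```python
# Illustrative correlation between two equal-size maps (assumed definition:
# zero-mean normalized cross-correlation; the patent does not fix the formula).
import numpy as np

def correlation(map_a, map_b):
    a = np.asarray(map_a, dtype=np.float64).ravel()
    b = np.asarray(map_b, dtype=np.float64).ravel()
    a = a - a.mean()
    b = b - b.mean()
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    return float(a @ b) / denom if denom else 0.0
```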
In addition, the processing of generating the second intermediate feature map and the processing of calculating the second correlation coefficient described above may be executed in parallel. Specifically, after each element of the matrix representing the second intermediate feature map is generated, that element is used immediately in the calculation of the second correlation coefficient. This saves time in calculating the second correlation coefficient and improves the efficiency of the image matching.
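One way to realize this interleaving, shown only as an assumption, is to keep running sums that are updated as each element of the second intermediate feature map is produced and to finalize a Pearson-style coefficient once the last element has been seen. Here element_pairs is a hypothetical iterable yielding (intermediate value, template value) pairs in generation order; the accumulation scheme is not taken from the patent.

```python
# Sketch of interleaved generation and correlation (illustrative assumption):
# every newly generated element immediately updates the running sums, so no
# separate pass over the finished second intermediate feature map is needed.
import math

def streaming_correlation(element_pairs):
    n = s_a = s_b = s_ab = s_aa = s_bb = 0.0
    for a, b in element_pairs:      # a: intermediate-map element, b: template element
        n += 1
        s_a += a
        s_b += b
        s_ab += a * b
        s_aa += a * a
        s_bb += b * b
    if n == 0:
        return 0.0
    cov = s_ab - s_a * s_b / n
    denom = math.sqrt(max((s_aa - s_a * s_a / n) * (s_bb - s_b * s_b / n), 0.0))
    return cov / denom if denom else 0.0
```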
In step S5, when the calculated second correlation coefficient satisfies the predetermined condition, it is determined that the input image matches the template image. In step S6, when none of the second correlation coefficients between the template image and all the second intermediate feature maps generated from the second feature map and all the second search parameters set within the value range of the second search parameter satisfies the predetermined condition, it is determined that the input image does not match the template image.
Specifically, after a second correlation coefficient has been calculated in step S4 under some second search parameter, the judging unit 14 judges whether the second correlation coefficient calculated in step S4 satisfies the predetermined condition (for example, whether it is greater than a threshold). When the second correlation coefficient satisfies the predetermined condition, the judging unit 14 determines that the input image matches the template image, and the image matching processing ends. When the judging unit 14 determines that the second correlation coefficient calculated in step S4 does not satisfy the predetermined condition, the second search parameter is, as described above, reset within its value range in step S4 and the above processing is repeated. When, through this repetition, all values in the value range of the second search parameter have been set as the second search parameter in step S4 and none of the calculated second correlation coefficients satisfies the predetermined condition, the judging unit 14 can determine that the input image does not match the template image.
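Putting the pieces together, a rough end-to-end sketch of the decision flow of steps S2 through S6 might look as follows. The transform() helper, generate_intermediate_map(), the two templates, and the threshold are all assumptions carried over from the sketches above, not elements defined by the patent.

```python
# End-to-end sketch (illustrative assumption) of the coarse-to-fine matching:
# coarse first search on the small first feature map, range determination,
# then fine second search on the large second feature map and the decision.
def match(input_image, template_small, template_large, predefined_params, threshold):
    fmap_1, fmap_2 = transform(input_image)        # hypothetical: first and second feature maps
    first_coeffs = {                               # step S2: first search
        p: correlation(generate_intermediate_map(fmap_1, p), template_small)
        for p in predefined_params
    }
    value_range = determine_second_range(first_coeffs)                           # step S3
    matched, _ = second_search(fmap_2, template_large, value_range, threshold)   # steps S4-S6
    return matched                                 # True: match, False: mismatch
```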
In the image matching method according to the embodiment of the present invention, the range of the second search process is determined by the first search process, which uses the first feature map with the relatively small data volume; the second search process, which uses the second feature map with the relatively large data volume, therefore performs calculations only within the determined range. This reduces the amount of calculation in the overall image matching processing while keeping the accuracy of the image matching at the level that would be obtained by performing the calculation over the full range with the second feature map.
Those of ordinary skill in the art will appreciate that the units and steps described in connection with the embodiments of the present invention can be implemented in electronic hardware, computer software, or a combination of the two, and that software modules may be placed in a computer storage medium of any form. To clearly illustrate the interchangeability of hardware and software, the composition and steps of each example have been described above generally in terms of their functions. Whether these functions are implemented in hardware or in software depends on the specific application and the design constraints of the technical solution. Skilled artisans may implement the described functions in different ways for each particular application, but such implementations should not be considered beyond the scope of the present invention.
The embodiments of the present invention have been described in detail above. However, it should be appreciated by those skilled in the art that, without departing from the principle and spirit of the invention, various modifications, combinations, or sub-combinations may be made to these embodiments, and such modifications should fall within the scope of the present invention.

Claims (10)

1. An image matching method, comprising:
converting an input image to generate a first feature map and a second feature map whose data volume is larger than that of the first feature map;
setting a first search parameter to each of multiple predefined parameters, generating a first intermediate feature map based on the set first search parameter and the first feature map, and calculating a first correlation coefficient between the generated first intermediate feature map and a template image, wherein the first correlation coefficient corresponds to the predefined parameter;
determining a value range of a second search parameter based on the multiple first correlation coefficients corresponding to the multiple predefined parameters;
setting the second search parameter, in turn, within the value range of the second search parameter, generating a second intermediate feature map based on the set second search parameter and the second feature map, and calculating a second correlation coefficient between the generated second intermediate feature map and the template image;
determining that the input image matches the template image when a calculated second correlation coefficient satisfies a predetermined condition;
determining that the input image does not match the template image when the second correlation coefficients between the template image and all the second intermediate feature maps, generated from the second feature map and all the second search parameters set within the value range of the second search parameter, all fail to satisfy the predetermined condition.
2. The image matching method according to claim 1, wherein
the step of converting the input image to generate the first feature map and the second feature map whose data volume is larger than that of the first feature map comprises:
filtering the input image to generate a filtered input image;
converting the filtered input image to generate the first feature map and the second feature map whose data volume is larger than that of the first feature map.
3. The image matching method according to claim 2, wherein
in the step of converting the filtered input image to generate the first feature map and the second feature map whose data volume is larger than that of the first feature map,
the filtered input image or the second feature map is converted with different transformation parameters, thereby generating at least two first feature maps corresponding to the different transformation parameters.
4. The image matching method according to claim 2, wherein
the step of determining the value range of the second search parameter based on the multiple first correlation coefficients corresponding to the multiple predefined parameters comprises:
determining, among the multiple first correlation coefficients corresponding to the multiple predefined parameters, the predefined parameter corresponding to the first correlation coefficient with the largest value;
determining the value range of the second search parameter based on the predefined parameter corresponding to the first correlation coefficient with the largest value.
5. The image matching method according to claim 1, wherein
in the step of generating the first intermediate feature map based on the set first search parameter and the first feature map and calculating the first correlation coefficient between the generated first intermediate feature map and the template image,
pixels in the first feature map are determined based on the set first search parameter, and the first intermediate feature map is generated using only the values of the determined pixels.
6. An image processing apparatus, comprising:
a converter unit that converts an input image to generate a first feature map and a second feature map whose data volume is larger than that of the first feature map;
a first search unit that sets a first search parameter to each of multiple predefined parameters, generates a first intermediate feature map based on the set first search parameter and the first feature map, and calculates a first correlation coefficient between the generated first intermediate feature map and a template image, wherein the first correlation coefficient corresponds to the predefined parameter;
a second search unit that determines a value range of a second search parameter based on the multiple first correlation coefficients corresponding to the multiple predefined parameters, sets the second search parameter, in turn, within the value range of the second search parameter, generates a second intermediate feature map based on the set second search parameter and the second feature map, and calculates a second correlation coefficient between the generated second intermediate feature map and the template image;
a judging unit that determines that the input image matches the template image when a calculated second correlation coefficient satisfies a predetermined condition, and determines that the input image does not match the template image when the second correlation coefficients between the template image and all the second intermediate feature maps, generated from the second feature map and all the second search parameters set within the value range of the second search parameter, all fail to satisfy the predetermined condition.
7. The image processing apparatus according to claim 6, wherein
the converter unit filters the input image to generate a filtered input image, and converts the filtered input image to generate the first feature map and the second feature map whose data volume is larger than that of the first feature map.
8. The image processing apparatus according to claim 7, wherein
the converter unit converts the filtered input image or the second feature map with different transformation parameters, thereby generating at least two first feature maps corresponding to the different transformation parameters.
9. The image processing apparatus according to claim 7, wherein
the second search unit determines, among the first correlation coefficients corresponding to the multiple predefined parameters, the predefined parameter corresponding to the first correlation coefficient with the largest value, and determines the value range of the second search parameter based on that predefined parameter.
10. The image processing apparatus according to claim 6, wherein
the first search unit determines pixels in the first feature map based on the set first search parameter, and generates the first intermediate feature map using only the values of the determined pixels.
CN201510147484.7A 2015-03-31 2015-03-31 Image processing apparatus and image matching method Active CN106156699B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201510147484.7A CN106156699B (en) 2015-03-31 2015-03-31 Image processing apparatus and image matching method

Publications (2)

Publication Number Publication Date
CN106156699A CN106156699A (en) 2016-11-23
CN106156699B (en) 2019-06-25

Family

ID=57337240

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201510147484.7A Active CN106156699B (en) 2015-03-31 2015-03-31 Image processing apparatus and image matching method

Country Status (1)

Country Link
CN (1) CN106156699B (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101276411A (en) * 2008-05-12 2008-10-01 北京理工大学 Fingerprint identification method
CN101515286A (en) * 2009-04-03 2009-08-26 东南大学 Image matching method based on image feature multi-level filtration
CN102087710A (en) * 2009-12-03 2011-06-08 索尼公司 Learning device and method, recognition device and method, and program
CN102292745A (en) * 2009-01-23 2011-12-21 日本电气株式会社 image signature extraction device
CN103714159A (en) * 2013-12-27 2014-04-09 中国人民公安大学 Coarse-to-fine fingerprint identification method fusing second-level and third-level features
CN104268880A (en) * 2014-09-29 2015-01-07 沈阳工业大学 Depth information obtaining method based on combination of features and region matching

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Fast multi-level hybrid fingerprint matching method; Cao Guo et al.; Pattern Recognition and Artificial Intelligence; 2009-10-31; pp. 787-793

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant