CN106097306B - Method for obtaining an image edge detection operator, image edge detection method, and devices
Abstract
The invention discloses a method for obtaining an image edge detection operator, an image edge detection method, and corresponding devices. The method for obtaining the image edge detection operator includes: in a target image, dividing out a four-neighborhood that contains a first target pixel; performing bilinear interpolation on the target image, selecting one pixel inserted into the four-neighborhood as a second target pixel, and establishing the positional relationship between the first target pixel and the second target pixel within the four-neighborhood; determining an image edge detection function relation according to the positional relationship and the bilinear interpolation formula; and determining the image edge detection operator according to the image edge detection function relation. When applied to edge detection of digital images, the operator obtained by this method reduces the amount of computation while improving noise resistance, and therefore has better applicability.
Description
Technical field
The present invention relates to the technical field of image edge processing, and in particular to a method for obtaining an image edge detection operator, an image edge detection method, and corresponding devices.
Background art
The edge of an image belongs to its high-frequency components and is one of the basic features of an image. After edge detection is applied to a target image, an edge-detected image of the same size is obtained; in this edge-detected image, the positions corresponding to the edges of the target image are highlighted while the remaining positions are essentially black, so the edges can be used to effectively distinguish the target image from other objects (such as other images or the background). Finding the edges of an image is therefore essential for many application fields of image edge processing, for example image segmentation, image sharpening, and image analysis and recognition, all of which must first determine the edge positions of the image. In the prior art, image edge detection methods are generally used to determine these edge positions.
In a digital image, the high-frequency components include the parts where the gray value changes sharply, and the positions of sharp gray-value change are usually defined as the edges of the digital image. Edge detection of a digital image is generally performed with various edge detection operators (such as differential operators, the Sobel operator, the Canny operator, and so on). Within a digital image, the gray values of pixels lying on the same edge differ little and change gently, whereas the gray values of a pixel on an edge and a nearby pixel off the edge differ greatly and change sharply. For this reason, the first and second derivatives of the image gray values are commonly used as edge detection operators for digital images, that is, differential operators, including first-order and second-order differential operators.

When a first-order differential operator is used for edge detection of a digital image, the detection result is easily affected by noise. When a second-order differential operator is used, noise has less influence on the result, but a larger data window is required and the amount of computation is large. Therefore, for edge detection of digital images, the applicability of both first-order and second-order differential operators is poor; they cannot simultaneously resist noise and reduce the amount of computation.

In summary, in the existing edge detection technology for digital images, differential operators have poor applicability and cannot solve the problems of large computation and strong noise at the same time.
Summary of the invention
Embodiments of the present invention aim to provide a method for obtaining an image edge detection operator, an image edge detection method, and corresponding devices, so as to solve the problem of poor applicability of differential operators in the edge detection of digital images. With the method of the invention, a more widely applicable image edge detection operator can be obtained; when this operator is applied to edge detection of digital images, it offers better noise robustness while requiring a smaller data window, i.e. less computation, and thus has better applicability.

To solve the above technical problem, the embodiments of the invention disclose the following technical solutions:
In a first aspect, an embodiment of the invention provides a method for obtaining an image edge detection operator, the method comprising:

in a target image, dividing out a four-neighborhood that contains a first target pixel, the first target pixel being the pixel with the smallest row index and column index in the four-neighborhood;

performing bilinear interpolation on the target image, selecting one pixel inserted into the four-neighborhood as a second target pixel, and establishing the positional relationship between the first target pixel and the second target pixel within the four-neighborhood;

determining an image edge detection function relation according to the positional relationship and the bilinear interpolation formula, the image edge detection function relation being used to calculate the difference between the gray value of the first target pixel and the gray value of the second target pixel;

determining the image edge detection operator according to the image edge detection function relation.
With reference to the first aspect, in a first possible implementation of the first aspect, performing bilinear interpolation on the target image and selecting one pixel inserted into the four-neighborhood as the second target pixel specifically includes:

performing bilinear interpolation on the target image with a preset interpolation factor to obtain an interpolated image of the target image;

in the interpolated image, selecting as the second target pixel the inserted pixel located at the intersection of the second-to-last row and the second-to-last column of pixels in the four-neighborhood.
With reference to the first possible implementation of the first aspect, in a second possible implementation of the first aspect, determining the image edge detection function relation according to the positional relationship and the bilinear interpolation formula specifically includes:

according to the positional relationship and the bilinear interpolation formula, determining the following first function relation between the gray value of the first target pixel and the gray value of the second target pixel:

D(u, v) = (1-u)(1-v)*G(i, j) + u(1-v)*G(i, j+1) + (1-u)v*G(i+1, j) + uv*G(i+1, j+1);

determining, when each pixel of the target image serves as the first target pixel, the corresponding second target pixel in the interpolated image;

extracting all second target pixels from the interpolated image, and rearranging them according to the arrangement rule of the first target pixels in the target image to form a reference image;

according to the position coordinates of the second target pixels in the reference image and the first function relation, determining the following second function relation between the gray value of the first target pixel and the gray value of the second target pixel:

D(i, j) = (1-u)(1-v)*G(i, j) + u(1-v)*G(i, j+1) + (1-u)v*G(i+1, j) + uv*G(i+1, j+1);

according to the second function relation, determining the image edge detection function relation;

where G(i, j) is the gray value of the pixel in row i, column j of the target image; D(i, j) is the gray value of the pixel in row i, column j of the reference image; G(i, j+1) is the gray value of the pixel in row i, column j+1 of the target image; G(i+1, j) is the gray value of the pixel in row i+1, column j of the target image; G(i+1, j+1) is the gray value of the pixel in row i+1, column j+1 of the target image; u is the horizontal distance, in the interpolated image, from the second target pixel to the corresponding first target pixel; and v is the vertical distance, in the interpolated image, from the second target pixel to the corresponding first target pixel.
With reference to the second possible implementation of the first aspect, in a third possible implementation of the first aspect, determining the image edge detection operator according to the image edge detection function relation specifically includes:

transforming the image edge detection function relation into an image edge detection function relation in convolution form;

determining the image edge detection operator according to the image edge detection function relation in convolution form;

where G(i, j), D(i, j), G(i, j+1), G(i+1, j), G(i+1, j+1), u, and v are as defined above.
In a second aspect, an embodiment of the invention provides a method for performing image edge detection on an image to be detected using the above image edge detection operator, the method comprising:

determining a specific image edge detection operator for the image to be detected according to a preset interpolation factor for the image to be detected and the image edge detection operator;

adding one row of pixels below the last row of the acquired image to be detected and one column of pixels to the right of its last column to obtain an actual detection image, the added row of pixels being a copy of the second-to-last row of the image to be detected and the added column of pixels being a copy of its second-to-last column;

performing image edge detection on the actual detection image with the specific image edge detection operator to obtain the gray value information of the edge-detected image of the image to be detected.
With reference to the second aspect, in a first possible implementation of the second aspect, determining the specific image edge detection operator for the image to be detected according to the preset interpolation factor and the image edge detection operator specifically includes:

obtaining the preset interpolation factor M for the image to be detected;

determining the values of u and v according to the preset interpolation factor M;

determining the specific image edge detection operator according to the values of u and v and the image edge detection operator.
With reference to the first possible implementation of the second aspect, in a second possible implementation of the second aspect, performing image edge detection on the actual detection image with the specific image edge detection operator to obtain the gray value information of the edge-detected image of the image to be detected specifically includes:

obtaining the gray value data of the image to be detected, and determining the gray value data of the actual detection image according to the gray value data of the image to be detected;

convolving the specific image edge detection operator with the gray value data of the actual detection image according to the formula D = ⌊|A ∗ I′|⌋ to obtain the gray value information of the edge-detected image of the image to be detected;

where D is the gray value of a pixel in the edge-detected image of the image to be detected, I′ is the gray value data of a 2 × 2 data window in the actual detection image, A is the specific image edge detection operator, ∗ denotes the convolution of the operator with the data window, |·| denotes the absolute value operation, and ⌊·⌋ denotes rounding down.
In a third aspect, an embodiment of the invention provides a device for obtaining an image edge detection operator, the device comprising:

a region division module, configured to divide out, in a target image, a four-neighborhood containing a first target pixel, the first target pixel being the pixel with the smallest row index and column index in the four-neighborhood;

an interpolation processing module, configured to perform bilinear interpolation on the target image, select one pixel inserted into the four-neighborhood as a second target pixel, and establish the positional relationship between the first target pixel and the second target pixel within the four-neighborhood;

a first determining module, configured to determine an image edge detection function relation according to the positional relationship and the bilinear interpolation formula, the image edge detection function relation being used to calculate the difference between the gray value of the first target pixel and the gray value of the second target pixel;

a second determining module, configured to determine the image edge detection operator according to the image edge detection function relation.
With reference to the third aspect, in a first possible implementation of the third aspect, the interpolation processing module includes:

an interpolation processing unit, configured to perform bilinear interpolation on the target image with a preset interpolation factor to obtain an interpolated image of the target image;

a pixel selection unit, configured to select, in the interpolated image, as the second target pixel the inserted pixel located at the intersection of the second-to-last row and the second-to-last column of pixels in the four-neighborhood.
With reference to the first possible implementation of the third aspect, in a second possible implementation of the third aspect, the first determining module is specifically configured to:

according to the positional relationship and the bilinear interpolation formula, determine the following first function relation between the gray value of the first target pixel and the gray value of the second target pixel:

D(u, v) = (1-u)(1-v)*G(i, j) + u(1-v)*G(i, j+1) + (1-u)v*G(i+1, j) + uv*G(i+1, j+1);

determine, when each pixel of the target image serves as the first target pixel, the corresponding second target pixel in the interpolated image;

extract all second target pixels from the interpolated image, and rearrange them according to the arrangement rule of the first target pixels in the target image to form a reference image;

according to the position coordinates of the second target pixels in the reference image and the first function relation, determine the following second function relation between the gray value of the first target pixel and the gray value of the second target pixel:

D(i, j) = (1-u)(1-v)*G(i, j) + u(1-v)*G(i, j+1) + (1-u)v*G(i+1, j) + uv*G(i+1, j+1);

according to the second function relation, determine the image edge detection function relation;

where G(i, j), D(i, j), G(i, j+1), G(i+1, j), G(i+1, j+1), u, and v are as defined above.
With reference to the second possible implementation of the third aspect, in a third possible implementation of the third aspect, the second determining module is specifically configured to:

transform the image edge detection function relation into an image edge detection function relation in convolution form;

determine the image edge detection operator according to the image edge detection function relation in convolution form;

where G(i, j), D(i, j), G(i, j+1), G(i+1, j), G(i+1, j+1), u, and v are as defined above.
In a fourth aspect, an embodiment of the invention provides a device for performing image edge detection on an image to be detected using the above image edge detection operator, the device comprising:

a third determining module, configured to determine a specific image edge detection operator for the image to be detected according to a preset interpolation factor for the image to be detected and the image edge detection operator;

an image processing module, configured to add one row of pixels below the last row of the acquired image to be detected and one column of pixels to the right of its last column to obtain an actual detection image, the added row of pixels being a copy of the second-to-last row of the image to be detected and the added column of pixels being a copy of its second-to-last column;

an image edge detection module, configured to perform image edge detection on the actual detection image with the specific image edge detection operator to obtain the gray value information of the edge-detected image of the image to be detected.
With reference to the fourth aspect, in a first possible implementation of the fourth aspect, the third determining module is specifically configured to:

obtain the preset interpolation factor M for the image to be detected;

determine the values of u and v according to the preset interpolation factor M;

determine the specific image edge detection operator according to the values of u and v and the image edge detection operator.
With reference to the first possible implementation of the fourth aspect, in a second possible implementation of the fourth aspect, the image edge detection module includes:

a gray value data acquiring unit, configured to obtain the gray value data of the image to be detected and determine the gray value data of the actual detection image according to it;

an image edge detection unit, configured to convolve the specific image edge detection operator with the gray value data of the actual detection image to obtain the gray value information of the edge-detected image of the image to be detected;

where D is the gray value of a pixel in the edge-detected image of the image to be detected, I′ is the gray value data of a 2 × 2 data window in the actual detection image, A is the specific image edge detection operator, |·| denotes the absolute value operation, and ⌊·⌋ denotes rounding down.
The technical solutions provided by the embodiments of the present invention may have the following beneficial effects: the image edge detection operator obtained by the disclosed method, when applied to edge detection of digital images, combines the effects of window-data weighted averaging (the bilinear interpolation processing of the image) and a slight offset of the target pixel (the down-sampling processing of the image), which greatly reduces noise; the obtained edge information is richer and the contours of the image are clearer. At the same time, the required data window is smaller, i.e. the amount of computation is smaller, so the disclosed image edge detection operator has better applicability in the edge detection of digital images.

It should be understood that the above general description and the following detailed description are merely exemplary and explanatory and do not limit the present disclosure.
Description of the drawings

The drawings herein are incorporated into and form part of this specification; they show embodiments consistent with the invention and, together with the specification, serve to explain the principles of the invention.

To explain the embodiments of the invention or the technical solutions in the prior art more clearly, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, a person of ordinary skill in the art can obtain other drawings from these drawings without any creative effort.
Fig. 1 shows the gray value data of a target image provided in an embodiment of the present invention;

Fig. 2 shows the gray values of the interpolated pixels corresponding to each pixel of the target image in Fig. 1;

Fig. 3 shows the gray value information of the edge-detected image of the target image in Fig. 1;

Fig. 4 is a flow diagram of a method for obtaining an image edge detection operator provided in an embodiment of the present invention;

Fig. 5 is a flow diagram of another method for obtaining an image edge detection operator provided in an embodiment of the present invention;

Fig. 6 is a flow diagram of a method, provided in an embodiment of the present invention, for performing image edge detection on an image to be detected using the above image edge detection operator;

Fig. 7 is a flow diagram of another method, provided in an embodiment of the present invention, for performing image edge detection on an image to be detected using the above image edge detection operator;

Fig. 8 is a structural block diagram of a device for obtaining an image edge detection operator provided in an embodiment of the present invention;

Fig. 9 is a structural block diagram of another device for obtaining an image edge detection operator provided in an embodiment of the present invention;

Fig. 10 is a structural block diagram of a device, provided in an embodiment of the present invention, for performing image edge detection on an image to be detected using the above image edge detection operator;

Fig. 11 is a structural block diagram of another device, provided in an embodiment of the present invention, for performing image edge detection on an image to be detected using the above image edge detection operator.
Specific embodiment
To enable those skilled in the art to better understand the technical solutions of the present invention, the technical solutions in the embodiments of the invention are described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are only a part of the embodiments of the invention, not all of them. Based on the embodiments of the present invention, all other embodiments obtained by a person of ordinary skill in the art without creative effort shall fall within the protection scope of the present invention.
Before the embodiments of the invention are described in detail, some technical principles relevant to the disclosed technical solutions are first introduced. After a target image is enlarged by bilinear interpolation, the interpolated image of the target image is obtained; the interpolated image is then reduced back to the size of the target image by down-sampling, which yields the reference image of the target image. The information (including the gray value information) of the reference image is essentially the same as that of the target image; there is only a sub-pixel-level offset between the two. Taking, for each pixel in the target image, the absolute value of the difference between its gray value and the gray value of the pixel at the corresponding position (same row index and column index) in the reference image gives the gray value information of the edge-detected image of the target image, i.e. the edge detection result of the target image characterized by gray value information, from which the edge-detected image can be drawn. Here, the image drawn from the gray value information obtained by edge detection of the target image is defined as the edge-detected image. In the edge-detected image the contour of the target image can be clearly seen: the positions corresponding to the edges of the target image are highlighted, while the other positions of the target image are black, so only the contour of the target image is visible in the edge-detected image, and this contour is the edge of the target image.
From the above principle, it follows that after bilinear interpolation of the target image determines an interpolated pixel for each pixel of the target image, computing the difference between the gray value of each pixel and the gray value of its corresponding interpolated pixel and taking the absolute value yields the gray value information of the edge-detected image of the target image. For example, Fig. 1 shows the gray value data of a target image, i.e. the gray value of each pixel in the target image; Fig. 2 shows the gray value of the interpolated pixel corresponding to each pixel of the target image, where the position of an interpolated pixel in Fig. 2 corresponds to the position of its pixel in Fig. 1 (same row index and column index). After the gray value of each pixel in Fig. 1 is subtracted from the gray value of the pixel at the corresponding position in Fig. 2 and the absolute value of the difference is taken, the gray value information of the edge-detected image of the target image, shown in Fig. 3, is obtained; from this information the edge-detected image can be drawn, in which the highlighted positions are the edge positions of the target image.

In the specific calculation, each pixel of the target image can be taken in turn as the target pixel, and a four-neighborhood containing that target pixel is constructed (in a digital image, a region formed by four mutually adjacent pixels is defined as a four-neighborhood). After bilinear interpolation of the target image, one of the pixels inserted into the four-neighborhood serves as the interpolated pixel of that target pixel. Performing the same processing for every pixel of the target image gives the interpolated pixel corresponding to each pixel; then computing the difference between the gray value of each pixel and the gray value of its corresponding interpolated pixel and taking the absolute value of the result yields the gray value information of the edge-detected image of the target image.
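A minimal sketch of this principle, assuming the target image and its reference image (the gray values of the corresponding interpolated pixels, arranged at the same row and column indices) are available as equally sized NumPy arrays; Python and NumPy are an assumption, as the patent prescribes no implementation:

```python
import numpy as np

def edge_map_from_reference(target, reference):
    """Gray value information of the edge-detected image: |target - reference|.

    target    : 2-D array of gray values of the target image.
    reference : 2-D array of the corresponding interpolated-pixel gray values,
                with the same shape (same row/column indices) as `target`.
    """
    diff = target.astype(np.float64) - reference
    return np.abs(diff)   # large values mark the highlighted edge positions
```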
Based on this, the present invention provides a method for obtaining an image edge detection operator. The image edge detection operator obtained by this method needs only a 2 × 2 data window for detection, so the amount of computation is very small. Moreover, because the operator combines the effects of window-data weighted averaging (the bilinear interpolation processing of the image) and a slight offset of the target pixel (the down-sampling processing of the image), its noise robustness is greatly improved and its applicability is better.
Specific embodiments of the present invention are described in detail below with reference to the accompanying drawings.
As shown in Fig. 4, which is a flow chart of a method for obtaining an image edge detection operator disclosed by the invention, the method includes:
Step 11: in the target image, divide out a four-neighborhood containing a first target pixel; the first target pixel is the pixel with the smallest row index and column index in the four-neighborhood.

In the present invention, any digital image may be chosen as the target image. After the target image is obtained, a pixel in it is arbitrarily chosen as the first target pixel, and a four-neighborhood containing the first target pixel is divided out in the target image; within this four-neighborhood the first target pixel is located at the upper-left corner, i.e. its row index and column index are the smallest in the four-neighborhood. Here, the row index and column index indicate the row and column at which a pixel is located in the digital image.
Step 12: perform bilinear interpolation on the target image, select one pixel inserted into the four-neighborhood as a second target pixel, and establish the positional relationship between the first target pixel and the second target pixel within the four-neighborhood.

After bilinear interpolation is performed on the target image with a preset interpolation factor, the interpolated image of the target image is obtained. In the interpolated image, any pixel inserted into the four-neighborhood may be chosen as the second target pixel; preferably, for a better subsequent edge detection effect, the pixel located at the intersection of the second-to-last row and the second-to-last column of pixels in the four-neighborhood is chosen as the second target pixel. The interpolation factor is the factor by which the image is scaled when interpolation is performed; here this scaling factor is simply called the interpolation factor, and the preset interpolation factor can be set arbitrarily.
Step 13: determine the image edge detection function relation according to the positional relationship and the bilinear interpolation formula; the image edge detection function relation is used to calculate the difference between the gray value of the first target pixel and the gray value of the second target pixel.

After the target image is obtained, its gray value data can be obtained; here, the gray value data of an image comprises the gray value of each pixel in the image.

Once the positional relationship between the first target pixel and the second target pixel within the four-neighborhood is determined in the interpolated image, the gray value of the second target pixel can be calculated from the gray value of the first target pixel and the gray values of the other pixels in the four-neighborhood according to the bilinear interpolation formula; that is, a gray value function relation between the gray value of the first target pixel and the gray value of the second target pixel can be determined. Transforming this gray value function relation yields a difference function relation for calculating the difference between the gray value of the first target pixel and the gray value of the second target pixel.
Taking each pixel of the target image in turn as the first target pixel and determining the second target pixel corresponding to each first target pixel, the difference between the gray value of each pair of first and second target pixels is calculated with the above difference function relation, and taking the absolute value of each difference then gives the gray value information of the edge-detected image of the target image. The difference function relation is therefore defined in the present invention as the image edge detection function relation: after the gray values of every pair of first and second target pixels are processed with this relation and the absolute value of the result is taken, the gray value information of the edge-detected image of the target image, i.e. the edge detection result of the target image characterized by gray values, is obtained. From this gray value information the edge-detected image of the target image can be drawn, and the highlighted positions in the edge-detected image are the edge positions of the target image.
Step 14: determine the image edge detection operator according to the image edge detection function relation.

With any digital image as the target image, the gray value information of its edge-detected image can be obtained using the method of steps 11 to 13, but the amount of computation of this process is large. To simplify the calculation and reduce the computation, the image edge detection function relation can be transformed into convolution form, and a general image edge detection operator can then be extracted from the convolution-form image edge detection function relation; this operator can be applied to the edge detection of any digital image.
With the method provided in this embodiment, a general image edge detection operator can be obtained. When this operator is applied to the edge detection of an image to be detected, only a 2 × 2 data window is needed, so the amount of computation is very small. At the same time, because edge detection of the image to be detected with this operator combines the effects of window-data weighted averaging (the bilinear interpolation processing of the image) and a slight offset of the target pixel (the down-sampling processing of the image), the detection result is less affected by noise; after detection, the edges of the image to be detected are clearer in the edge-detected image drawn from the gray value information of the obtained edge-detected image.
As shown in Fig. 5, which is a flow chart of another method for obtaining an image edge detection operator disclosed by the invention, the method includes:

Step 21: in the target image, divide out a four-neighborhood containing a first target pixel; the first target pixel is the pixel with the smallest row index and column index in the four-neighborhood.

In a specific implementation, the gray value of the first target pixel is denoted G(i, j), and the gray values of the other pixels in the four-neighborhood are denoted G(i, j+1), G(i+1, j), and G(i+1, j+1), respectively.
Step 22: perform bilinear interpolation on the target image, select one pixel inserted into the four-neighborhood as a second target pixel, and establish the positional relationship between the first target pixel and the second target pixel within the four-neighborhood.

Step 23: according to the positional relationship and the bilinear interpolation formula, determine the first gray value function relation between the gray value of the first target pixel and the gray value of the second target pixel.

In this embodiment, the first gray value function relation is as follows:

D(u, v) = (1-u)(1-v)*G(i, j) + u(1-v)*G(i, j+1) + (1-u)v*G(i+1, j) + uv*G(i+1, j+1).
After bilinear interpolation of the target image, the interpolated image of the target image is obtained. The first gray value function relation expresses that, once the positional relationship of the first target pixel and the second target pixel within the four-neighborhood is specified in the interpolated image, the gray value of the second target pixel can be calculated from the gray value of the first target pixel and the gray values of the other pixels in the four-neighborhood using the bilinear interpolation formula; that is, the first gray value function relation between the gray value of the first target pixel and the gray value of the second target pixel is obtained from the positional relationship of the two pixels in the interpolated image.
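The first gray value function relation can be written directly as a small helper; the sketch below is a straightforward transcription of the formula, with (u, v) the horizontal and vertical offsets of the second target pixel from the first target pixel:

```python
def bilinear_value(g00, g01, g10, g11, u, v):
    """D(u, v): gray value of the second target pixel in one four-neighborhood.

    g00 = G(i, j),   g01 = G(i, j+1),
    g10 = G(i+1, j), g11 = G(i+1, j+1);
    u and v are the horizontal and vertical offsets of the second target pixel
    from the first target pixel.
    """
    return ((1 - u) * (1 - v) * g00 + u * (1 - v) * g01
            + (1 - u) * v * g10 + u * v * g11)
```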
Step 24: determine, when each pixel of the target image serves as the first target pixel, the corresponding second target pixel in the interpolated image.

For each pixel of the target image, when it serves as the first target pixel there is a corresponding second target pixel in the interpolated image; the second target pixel corresponding to each first target pixel is determined in the interpolated image.

Step 25: extract all second target pixels from the interpolated image, and rearrange them according to the arrangement rule of the first target pixels in the target image to form a reference image.

All second target pixels in the interpolated image are extracted separately and rearranged according to the arrangement rule of the corresponding first target pixels in the target image, forming a new image; here, this new image is defined as the reference image.
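A sketch of how the reference image could be assembled once u and v are known; rather than materializing the full interpolated image, it evaluates the bilinear relation on each pixel's four-neighborhood, and it replicates the second-to-last row and column so that the last row and column also have a four-neighborhood (an assumption that mirrors the padding described later for the image to be detected):

```python
import numpy as np

def build_reference_image(target, u, v):
    """Reference image: for each pixel of `target`, the gray value of its
    second target pixel, computed with D(u, v) on its four-neighborhood."""
    # Pad with copies of the second-to-last row and column so every pixel,
    # including those in the last row/column, has a four-neighborhood.
    padded = np.vstack([target, target[-2:-1, :]])
    padded = np.hstack([padded, padded[:, -2:-1]])
    g00 = padded[:-1, :-1]   # G(i, j)
    g01 = padded[:-1, 1:]    # G(i, j+1)
    g10 = padded[1:, :-1]    # G(i+1, j)
    g11 = padded[1:, 1:]     # G(i+1, j+1)
    return ((1 - u) * (1 - v) * g00 + u * (1 - v) * g01
            + (1 - u) * v * g10 + u * v * g11)
```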
Step 26: according to the position coordinates of the second target pixels in the reference image and the first gray value function relation, determine the second gray value function relation between the gray value of the first target pixel and the gray value of the second target pixel.

In this embodiment, the second gray value function relation is as follows:

D(i, j) = (1-u)(1-v)*G(i, j) + u(1-v)*G(i, j+1) + (1-u)v*G(i+1, j) + uv*G(i+1, j+1).

For a pair of mutually corresponding first and second target pixels, the position coordinates of the second target pixel in the reference image correspond to the position coordinates of the first target pixel in the target image: the two have the same row index and column index in their respective images. Based on this, combined with the positional correspondence of the first and second target pixels in the interpolated image, the first gray value function relation can be rewritten as the second gray value function relation.
Step 27: determine the image edge detection function relation according to the second gray value function relation.

For the target image and the reference image, the number of pixels in the reference image is the same as in the target image, and the arrangement rule of the pixels is also the same, so the size of the reference image is the same as that of the target image; the information of the reference image and the target image is essentially identical, with only a sub-pixel offset between the two. Therefore, by computing the difference between the gray value of each first target pixel in the target image and the gray value of the corresponding second target pixel in the reference image, and then taking the absolute value of the result of the difference operation, the gray value information of the edge-detected image of the target image can be obtained.

Based on this, a difference function relation for calculating the difference between the gray value of the first target pixel and the gray value of the second target pixel is obtained by transforming the second gray value function relation. Here, this difference function relation is defined as the image edge detection function relation. After the corresponding operation is performed on every pixel of the target image with the image edge detection function relation, the gray value information of the edge-detected image of the target image is obtained, i.e. the edge detection result of the target image characterized by gray values; from this gray value information the edge-detected image of the target image can be drawn, and the highlighted positions in the edge-detected image are the edge positions of the target image.
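The relation itself appears only as a formula image in the published document; a reconstruction from the definitions above (an assumption, not a quotation of the published formula) is

G(i, j) - D(i, j) = [1 - (1-u)(1-v)]*G(i, j) - u(1-v)*G(i, j+1) - (1-u)v*G(i+1, j) - uv*G(i+1, j+1),

with the edge gray value taken as the absolute value of this difference.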
Step 28: transform the image edge detection function relation into an image edge detection function relation in convolution form.

In this embodiment, the image edge detection function relation is rewritten as the convolution of a 2 × 2 operator with a 2 × 2 window of gray values.

Step 29: determine the image edge detection operator according to the convolution-form image edge detection function relation.

In this embodiment the image edge detection operator obtained in this way is a 2 × 2 operator whose elements depend on u and v, where G(i, j) is the gray value of the pixel in row i, column j of the target image; D(i, j) is the gray value of the pixel in row i, column j of the reference image; G(i, j+1), G(i+1, j), and G(i+1, j+1) are the gray values of the pixels in row i column j+1, row i+1 column j, and row i+1 column j+1 of the target image, respectively; u is the horizontal distance, in the interpolated image, from the second target pixel to the corresponding first target pixel; and v is the vertical distance, in the interpolated image, from the second target pixel to the corresponding first target pixel.
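Like the detection function relation, the 2 × 2 operator is published only as a formula image; the kernel below is reconstructed from the difference relation G(i, j) - D(i, j) given above, so its exact coefficients are an assumption rather than a quotation of the patent's operator.

```python
import numpy as np

def edge_operator(u, v):
    """Reconstructed 2x2 image edge detection operator for offsets (u, v).

    Multiplying it element-wise with the window
    [[G(i, j), G(i, j+1)], [G(i+1, j), G(i+1, j+1)]] and summing
    yields G(i, j) - D(i, j).
    """
    return np.array([[1 - (1 - u) * (1 - v), -u * (1 - v)],
                     [-(1 - u) * v,          -u * v]])
```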
When edge detection is actually performed on a digital image to be detected, the values of u and v are calculated from the preset interpolation factor M obtained for that image according to the corresponding formula (the value of N ranges from 1 to M, and the values of M and N can be set arbitrarily), and the corresponding specific image edge detection operator is obtained. Using the specific image edge detection operator, edge detection can be performed on the digital image to be detected without repeating steps 21 to 28, and the gray value information of the edge-detected image of the digital image to be detected is obtained; the image edge detection operator can therefore be applied to the edge detection of any digital image.

Setting u = 0 in the image edge detection operator gives a horizontal image edge detection operator for detecting the horizontal edges of a digital image; similarly, setting v = 0 gives a vertical image edge detection operator for detecting the vertical edges of a digital image.
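Under the same reconstruction, the two special cases reduce to simple difference masks (the values of u and v below are illustrative; in the patent they are derived from the interpolation factor M):

```python
import numpy as np

v = 0.5
# u = 0: the kernel reduces to v * [[1, 0], [-1, 0]] (a vertical difference),
# responding to horizontal (transverse) edges.
A_horizontal = np.array([[v, 0.0], [-v, 0.0]])

u = 0.5
# v = 0: the kernel reduces to u * [[1, -1], [0, 0]] (a horizontal difference),
# responding to vertical (longitudinal) edges.
A_vertical = np.array([[u, -u], [0.0, 0.0]])
```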
With the method provided in this embodiment, after bilinear interpolation and down-sampling of the target image, the difference between the gray values of the target image and the gray values of the doubly processed image is taken to obtain the gray value information of the edge-detected image of the target image, and a general image edge detection operator is then derived. When this operator is used for edge detection of an image to be detected, the detection combines the effects of window-data weighted averaging (the bilinear interpolation processing of the image) and a slight offset of the target pixel (the down-sampling processing of the image), so the detection result is less affected by noise, and the contours of the image to be detected are clearer in the edge-detected image drawn from the gray value information obtained after detection. Moreover, a horizontal image edge detection operator can be derived from the image edge detection operator to perform horizontal edge detection on the image to be detected, and likewise a vertical image edge detection operator to perform vertical edge detection, so the application is more flexible and the applicability is better.
As shown in Fig. 6, which is a flow chart of a method disclosed by the invention for performing image edge detection on an image to be detected using the above image edge detection operator, the method includes:

Step 31: determine the specific image edge detection operator for the image to be detected according to the preset interpolation factor for the image to be detected and the image edge detection operator.

After the image to be detected is determined, the preset interpolation factor M associated with it is obtained, and the values of u and v in the image edge detection operator are calculated from M (the value of N ranges from 1 to M, and the values of M and N can be set arbitrarily); the specific image edge detection operator is thereby determined, for later use in the edge detection of the image to be detected.
Step 32: add one row of pixels below the last row of the acquired image to be detected and one column of pixels to the right of its last column to obtain the actual detection image; the added row of pixels is a copy of the second-to-last row of the image to be detected, and the added column of pixels is a copy of its second-to-last column.

The specific image edge detection operator is a 2 × 2 operator, and edge detection with it requires a 2 × 2 data window, so the gray value information of the last row and the last column of the image to be detected cannot be detected with the operator alone. Therefore, a row of pixels must be added below the last row of the image to be detected so that the gray value information of the last row can be detected with the operator, and likewise a column of pixels must be added to the right of the last column of the image to be detected so that the gray value information of the last column can be detected with the operator.

To ensure that the added row and column do not unduly affect the gray value information of the image to be detected, they are obtained as follows: the added row of pixels is a copy of the second-to-last row of the image to be detected, and the added column of pixels is a copy of the second-to-last column of the image to be detected.
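A sketch of this padding step, assuming the image to be detected is a 2-D NumPy array of gray values:

```python
import numpy as np

def pad_for_detection(image):
    """Build the actual detection image: one extra row below and one extra
    column on the right, copied from the second-to-last row and column."""
    padded = np.vstack([image, image[-2:-1, :]])     # copy of second-to-last row
    padded = np.hstack([padded, padded[:, -2:-1]])   # copy of second-to-last column
    return padded
```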
Step 33: perform image edge detection on the actual detection image using the specific image edge detection operator to obtain the gray value information of the edge-detected image of the image to be detected.

After a row and a column of pixels have been added to the image to be detected to obtain the actual detection image, 2 × 2 data windows are taken from the actual detection image, starting from the pixel with the first row index and the first column index, and their gray value data are successively convolved with the specific image edge detection operator. After the whole actual detection image has been traversed, the gray value information of the edge-detected image of the image to be detected is obtained, i.e. the edge detection result of the image to be detected characterized by gray values; from this gray value information the edge-detected image of the image to be detected can be drawn, and the highlighted positions in the edge-detected image are the edge positions of the image to be detected.
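Putting the pieces together, a sketch of the whole detection pass for a given 2 × 2 operator A (again a Python/NumPy rendering under the same assumptions as the sketches above):

```python
import numpy as np

def detect_edges(image, A):
    """Gray value information of the edge-detected image of `image`.

    For every pixel, the 2x2 data window I' of the actual detection image is
    multiplied element-wise with the 2x2 operator A and summed; the absolute
    value is taken and the result rounded down, i.e. D = floor(|A conv I'|).
    """
    # Actual detection image: append copies of the second-to-last row and column.
    actual = np.vstack([image, image[-2:-1, :]])
    actual = np.hstack([actual, actual[:, -2:-1]])

    rows, cols = image.shape
    edge = np.empty((rows, cols), dtype=np.int64)   # same size as the input image
    for i in range(rows):
        for j in range(cols):
            window = actual[i:i + 2, j:j + 2]       # 2x2 data window I'
            edge[i, j] = int(np.floor(abs(np.sum(A * window))))
    return edge
```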
With the method provided in this embodiment, after only slight processing of the image to be detected, the image edge detection operator provided by the invention can be used to perform edge detection on the processed image and obtain the gray value information of the edge-detected image of the image to be detected; the amount of computation of the whole process is very small. Moreover, because the image edge detection operator provided by the invention is used, the detection result is less affected by noise, and after detection, the edges of the image to be detected are clearer in the edge-detected image drawn from the obtained gray value information.
As shown in Fig. 7, which is a flow chart of another method disclosed by the invention for performing image edge detection on an image to be detected using the above image edge detection operator, the method includes:

Step 41: obtain the preset interpolation factor for the image to be detected.

In a specific implementation, the preset interpolation factor is denoted M.

Step 42: determine the values of u and v according to the preset interpolation factor.

In this embodiment, the values of u and v are determined from M according to the corresponding formula.
In the above process of obtaining the image edge detection operator, if each second target pixel is located at the intersection of the second-to-last row and the second-to-last column of pixels in the corresponding four-neighborhood, the offset between the resulting reference image and the target image is largest, and when the image edge detection operator obtained in this way is applied to edge detection of an image to be detected, the edges of the image to be detected are clearest in the resulting edge-detected image. In this case, the corresponding values of u and v are determined from M by the corresponding formula.
Step 43: determine the specific image edge detection operator according to the values of u and v and the image edge detection operator.

In this embodiment, the specific image edge detection operator is denoted A. Substituting the values of u and v into the image edge detection operator yields the specific image edge detection operator A.
Step 44: add one row of pixels below the last row of the acquired image to be detected and one column of pixels to the right of its last column to obtain the actual detection image; the added row of pixels is a copy of the second-to-last row of the image to be detected, and the added column of pixels is a copy of its second-to-last column.

Step 45: obtain the gray value data of the image to be detected, and determine the gray value data of the actual detection image according to the gray value data of the image to be detected.

Appending the gray value data of the second-to-last row of the image to be detected below the gray value data of its last row, and appending the gray value data of the second-to-last column of the image to be detected to the right of the gray value data of its last column, gives the gray value data of the actual detection image.
Step 46: convolve the specific image edge detection operator with the gray value data of the actual detection image to obtain the gray value information of the edge-detected image of the image to be detected.

In a specific implementation, the specific image edge detection operator is convolved with the gray value data of the actual detection image according to the formula D = ⌊|A ∗ I′|⌋ to obtain the gray value information of the edge-detected image of the image to be detected.

Here, D is the gray value of a pixel in the edge-detected image of the image to be detected, I′ is the gray value data of a 2 × 2 data window in the actual detection image, A is the specific image edge detection operator, ∗ denotes the convolution of the operator with the data window, |·| denotes the absolute value operation, and ⌊·⌋ denotes rounding down.
After the specific image edge detection operator is convolved with the gray value data of the actual detection image, the convolution result may be a floating-point number. Considering that the gray values of a digital image are integers, a rounding operation is performed on the convolution result; rounding here means taking the largest integer not greater than the floating-point number as the result, i.e. rounding down.
After the gray value information of the edge-detected image of image to be detected is calculated according to above-mentioned formula, that is, obtain with ash
It after the edge detection results of image to be detected of angle value information representation, is drawn, can be obtained according to obtained gray value information
To the edge-detected image of image to be detected, in edge-detected image, the marginal position of image to be detected is to be highlighted.
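A sketch of step 46, reusing pad_for_detection from above. The convolution formula is not reproduced in the translated text; from the symbol definitions (absolute value, rounding down, 2 × 2 windows I′), it is assumed here to take the form D = ⌊|Σ A · I′|⌋ per pixel.

```python
import numpy as np

def edge_detect(gray, A):
    """Slide the 2x2 specific operator A over the padded gray value data and take the
    floored absolute value of each response (assumed form of the convolution formula)."""
    g = np.asarray(gray, dtype=np.float64)
    padded = pad_for_detection(g)               # from the sketch in steps 44-45
    h, w = g.shape
    D = np.zeros((h, w), dtype=np.int64)
    for i in range(h):
        for j in range(w):
            window = padded[i:i + 2, j:j + 2]   # 2x2 data window I'
            D[i, j] = int(np.floor(abs(np.sum(A * window))))
    return D
```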
With the method provided in this embodiment, the values of u and v used in determining the specific image edge detection operator are obtained from the formula. After edge detection of the image to be detected with a specific image edge detection operator determined in this way, the edges of the image to be detected are clearest in the edge-detected image drawn from the resulting gray value information.
Corresponding to the above method for obtaining an image edge detection operator and the above method for performing image edge detection on an image to be detected using that operator, embodiments of the invention also disclose a device for obtaining an image edge detection operator and a device for performing image edge detection on an image to be detected using the image edge detection operator.
As shown in Fig. 8, which is a structural block diagram of a device for obtaining an image edge detection operator disclosed by the invention, the device includes:
a region division module 51, configured to divide, in the target image, a four-neighborhood including a first target pixel, the first target pixel being the pixel in the four-neighborhood with the smallest row index and column index;
an interpolation processing module 52, configured to perform bilinear interpolation processing on the target image and select any one pixel inserted into the four-neighborhood as a second target pixel, establishing the positional relationship between the first target pixel and the second target pixel within the four-neighborhood;
a first determining module 53, configured to determine an image edge detection functional relation according to the positional relationship and the bilinear interpolation formula, the image edge detection functional relation being used to calculate the difference between the gray value of the first target pixel and the gray value of the second target pixel;
a second determining module 54, configured to determine the image edge detection operator according to the image edge detection functional relation.
With the device provided in this embodiment, a general image edge detection operator can be obtained. Applying this image edge detection operator to edge detection of an image to be detected reduces the amount of computation while greatly improving noise resistance, so that after detection the contours of the image to be detected are clearer in the edge-detected image drawn from the detected gray value information.
As shown in Fig. 9, which is a structural block diagram of another device for obtaining an image edge detection operator disclosed by the invention, the device includes a region division module 51, an interpolation processing module 52, a first determining module 53 and a second determining module 54.
The interpolation processing module 52 includes:
an interpolation processing unit 521, configured to perform bilinear interpolation processing on the target image at a preset interpolation multiplying power, obtaining an interpolation image of the target image;
a pixel selection unit 522, configured to select, in the interpolation image, the second target pixel inserted into the four-neighborhood, the second target pixel being located at the intersection of the second-to-last row of pixels and the second-to-last column of pixels in the four-neighborhood.
The first determining module 53 is specifically configured to:
determine, according to the positional relationship and the bilinear interpolation formula, the following first functional relation between the gray value of the first target pixel and the gray value of the second target pixel:
D(u, v) = (1-u)(1-v)*G(i, j) + u(1-v)*G(i, j+1) + (1-u)v*G(i+1, j) + uv*G(i+1, j+1);
determine, for each pixel of the target image taken as the first target pixel, the corresponding second target pixel in the interpolation image;
extract all second target pixels in the interpolation image and rearrange them according to the arrangement rule of the first target pixels in the target image, forming a reference image (a sketch of this construction follows the symbol definitions below);
determine, according to the position coordinates of the second target pixels in the reference image and the first functional relation, the following second functional relation between the gray value of the first target pixel and the gray value of the second target pixel:
D(i, j) = (1-u)(1-v)*G(i, j) + u(1-v)*G(i, j+1) + (1-u)v*G(i+1, j) + uv*G(i+1, j+1);
determine the image edge detection functional relation according to the second functional relation.
The second determining module 54 is specifically configured to:
transform the image edge detection functional relation into an image edge detection functional relation in convolution form;
determine the image edge detection operator according to the image edge detection functional relation in convolution form.
Here G(i, j) represents the gray value of the pixel in row i and column j of the target image; D(i, j) represents the gray value of the pixel in row i and column j of the reference image; G(i, j+1) represents the gray value of the pixel in row i and column j+1 of the target image; G(i+1, j) represents the gray value of the pixel in row i+1 and column j of the target image; G(i+1, j+1) represents the gray value of the pixel in row i+1 and column j+1 of the target image; u represents, in the interpolation image, the lateral distance from the second target pixel to the corresponding first target pixel; and v represents, in the interpolation image, the longitudinal distance from the second target pixel to the corresponding first target pixel.
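As an illustration of how the first determining module's two functional relations could be realized, the sketch below builds the reference image D(i, j) from the target image with the bilinear weights given above and forms the difference used by the image edge detection functional relation. Border handling by edge replication is an assumption of this sketch, not something stated in the text.

```python
import numpy as np

def reference_image(gray, u, v):
    """For each first target pixel G(i, j), compute the bilinearly interpolated second
    target pixel at offset (u, v) in its four-neighborhood, arranged in the same order
    as the first target pixels (the second functional relation)."""
    g = np.asarray(gray, dtype=np.float64)
    gr = np.vstack([g, g[-1:, :]])    # replicate last row so G(i+1, j) exists at the border
    gr = np.hstack([gr, gr[:, -1:]])  # replicate last column so G(i, j+1) exists at the border
    return ((1 - u) * (1 - v) * gr[:-1, :-1]   # G(i, j)
            + u * (1 - v) * gr[:-1, 1:]        # G(i, j+1)
            + (1 - u) * v * gr[1:, :-1]        # G(i+1, j)
            + u * v * gr[1:, 1:])              # G(i+1, j+1)

def edge_relation(gray, u, v):
    """Image edge detection functional relation: per-pixel difference between the gray
    value of the first target pixel and that of the second target pixel."""
    g = np.asarray(gray, dtype=np.float64)
    return g - reference_image(g, u, v)
```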
Because the image edge detection operator obtained by the device provided in this embodiment combines the effects of weighted smoothing of the window data (the bilinear interpolation processing of the image) and of a small shift of the target pixel (the down-sampling of the image), detection results obtained by applying it to edge detection of an image to be detected are less affected by noise, and the contours of the image to be detected are clearer in the edge-detected image drawn from the gray value information obtained after detection.
As shown in Fig. 10, which illustrates a device disclosed by the invention for performing image edge detection on an image to be detected using the above image edge detection operator, the device includes:
a third determining module 61, configured to determine the specific image edge detection operator for the image to be detected according to the preset interpolation multiplying power for the image to be detected and the image edge detection operator;
an image processing module 62, configured to add one row of pixels below the last row of the acquired image to be detected and one column of pixels to the right of its last column, obtaining the actually detected image, the added row of pixels being obtained by duplicating the second-to-last row of the image to be detected and the added column of pixels being obtained by duplicating its second-to-last column;
an image edge detection module 63, configured to perform image edge detection on the actually detected image using the specific image edge detection operator, to obtain the gray value information of the edge-detected image of the image to be detected.
With the device provided in this embodiment, the result of image edge detection on an image to be detected is less affected by noise, and after detection the edges of the image to be detected are clearer in the edge-detected image drawn from the obtained gray value information.
As shown in Fig. 11, which illustrates another device disclosed by the invention for performing image edge detection on an image to be detected using the above image edge detection operator, the device includes a third determining module 61, an image processing module 62 and an image edge detection module 63.
The third determining module 61 is specifically configured to:
obtain the preset interpolation multiplying power M for the image to be detected;
determine the values of u and v from the preset interpolation multiplying power M according to the formula;
determine the specific image edge detection operator according to the values of u and v and the image edge detection operator.
The image edge detection module 63 includes:
a gray value data acquiring unit 631, configured to obtain the gray value data of the image to be detected and determine the gray value data of the actually detected image from the gray value data of the image to be detected;
an image edge detection unit 632, configured to convolve the specific image edge detection operator with the gray value data of the actually detected image according to the formula, to obtain the gray value information of the edge-detected image of the image to be detected.
Here D represents the gray value of a pixel in the edge-detected image of the image to be detected, I′ represents the gray value data of a 2 × 2 data window in the actually detected image, A represents the specific image edge detection operator, | | denotes the absolute value operation, and ⌊ ⌋ denotes rounding down.
With the device provided in this embodiment, since the values of u and v used in determining the specific image edge detection operator are obtained from the formula, the edges of the image to be detected are clearest in the edge-detected image drawn from the gray value information obtained after detection.
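Tying the sketches together, a minimal end-to-end example on a synthetic image with a single vertical step edge; all function names and the multiplying power M = 4 are assumptions carried over from the sketches above.

```python
import numpy as np

image = np.zeros((6, 6), dtype=np.float64)
image[:, 3:] = 255.0              # vertical step edge between columns 2 and 3

A = specific_edge_operator(M=4)   # assumed specific operator for M = 4
D = edge_detect(image, A)         # gray values of the edge-detected image
print(D)                          # strong responses along the step edge, near zero in flat regions
```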
The embodiments in this specification are described in a progressive manner; identical or similar parts of the embodiments may be referred to one another, and each embodiment focuses on its differences from the others. In particular, the device and system embodiments are described relatively briefly because they are substantially similar to the method embodiments, and reference may be made to the corresponding descriptions of the method embodiments. The device and system embodiments described above are merely illustrative: units described as separate components may or may not be physically separate, and components shown as units may or may not be physical units, i.e., they may be located in one place or distributed over multiple network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of the embodiment, and those of ordinary skill in the art can understand and implement them without creative effort.
It should be noted that, in this document, relational terms such as "first" and "second" are used only to distinguish one entity or operation from another and do not necessarily require or imply any actual relationship or order between these entities or operations. Moreover, the terms "include", "comprise" or any other variant thereof are intended to cover a non-exclusive inclusion, so that a process, method, article or device that includes a list of elements includes not only those elements but also other elements not expressly listed, or elements inherent to such a process, method, article or device. In the absence of further limitation, an element defined by the phrase "including a ..." does not exclude the presence of other identical elements in the process, method, article or device that includes the element.
The above is only a specific embodiment of the invention. It should be noted that those skilled in the art may make various improvements and modifications without departing from the principle of the invention, and these improvements and modifications shall also be regarded as falling within the protection scope of the invention.
Claims (14)
1. A method for obtaining an image edge detection operator, characterized by comprising:
in a target image, dividing off a four-neighborhood including a first target pixel, the first target pixel being the pixel in the four-neighborhood with the smallest row index and column index;
performing bilinear interpolation processing on the target image and selecting any one pixel inserted into the four-neighborhood as a second target pixel, establishing the positional relationship between the first target pixel and the second target pixel within the four-neighborhood;
determining an image edge detection functional relation according to the positional relationship and the bilinear interpolation formula, the image edge detection functional relation being used to calculate the difference between the gray value of the first target pixel and the gray value of the second target pixel; and
determining the image edge detection operator according to the image edge detection functional relation.
2. The method according to claim 1, characterized in that the process of performing bilinear interpolation processing on the target image and selecting any one pixel inserted into the four-neighborhood as the second target pixel specifically comprises:
performing bilinear interpolation processing on the target image at a preset interpolation multiplying power, obtaining an interpolation image of the target image; and
selecting, in the interpolation image, the second target pixel inserted into the four-neighborhood, the second target pixel being located at the intersection of the second-to-last row of pixels and the second-to-last column of pixels in the four-neighborhood.
3. The method according to claim 2, characterized in that the process of determining the image edge detection functional relation according to the positional relationship and the bilinear interpolation formula specifically comprises:
determining, according to the positional relationship and the bilinear interpolation formula, the following first functional relation between the gray value of the first target pixel and the gray value of the second target pixel:
D(u, v) = (1-u)(1-v)*G(i, j) + u(1-v)*G(i, j+1) + (1-u)v*G(i+1, j) + uv*G(i+1, j+1),
wherein D(u, v) represents the gray value of the second target pixel in the interpolation image, u represents the lateral distance in the interpolation image from the second target pixel to the corresponding first target pixel, and v represents the longitudinal distance in the interpolation image from the second target pixel to the corresponding first target pixel;
determining, for each pixel of the target image taken as the first target pixel, the corresponding second target pixel in the interpolation image;
extracting all second target pixels in the interpolation image and rearranging them according to the arrangement rule of the first target pixels in the target image, forming a reference image;
determining, according to the position coordinates of the second target pixels in the reference image and the first functional relation, the following second functional relation between the gray value of the first target pixel and the gray value of the second target pixel:
D(i, j) = (1-u)(1-v)*G(i, j) + u(1-v)*G(i, j+1) + (1-u)v*G(i+1, j) + uv*G(i+1, j+1); and
determining the image edge detection functional relation according to the second functional relation;
wherein G(i, j) represents the gray value of the pixel in row i and column j of the target image, D(i, j) represents the gray value of the pixel in row i and column j of the reference image, G(i, j+1) represents the gray value of the pixel in row i and column j+1 of the target image, G(i+1, j) represents the gray value of the pixel in row i+1 and column j of the target image, and G(i+1, j+1) represents the gray value of the pixel in row i+1 and column j+1 of the target image.
4. The method according to claim 3, characterized in that the process of determining the image edge detection operator according to the image edge detection functional relation specifically comprises:
transforming the image edge detection functional relation into an image edge detection functional relation in convolution form; and
determining the image edge detection operator according to the image edge detection functional relation in convolution form;
wherein G(i, j) represents the gray value of the pixel in row i and column j of the target image, D(i, j) represents the gray value of the pixel in row i and column j of the reference image, G(i, j+1) represents the gray value of the pixel in row i and column j+1 of the target image, G(i+1, j) represents the gray value of the pixel in row i+1 and column j of the target image, G(i+1, j+1) represents the gray value of the pixel in row i+1 and column j+1 of the target image, u represents the lateral distance in the interpolation image from the second target pixel to the corresponding first target pixel, and v represents the longitudinal distance in the interpolation image from the second target pixel to the corresponding first target pixel.
5. A method for performing image edge detection on an image to be detected using the image edge detection operator according to any one of claims 1 to 4, characterized by comprising:
determining a specific image edge detection operator for the image to be detected according to a preset interpolation multiplying power for the image to be detected and the image edge detection operator;
adding one row of pixels below the last row of the acquired image to be detected and one column of pixels to the right of its last column, obtaining an actually detected image, the added row of pixels being obtained by duplicating the second-to-last row of the image to be detected and the added column of pixels being obtained by duplicating its second-to-last column; and
performing image edge detection on the actually detected image using the specific image edge detection operator, to obtain gray value information of an edge-detected image of the image to be detected.
6. The method according to claim 5, characterized in that the process of determining the specific image edge detection operator for the image to be detected according to the preset interpolation multiplying power for the image to be detected and the image edge detection operator specifically comprises:
obtaining the preset interpolation multiplying power M for the image to be detected;
determining the values of u and v from the preset interpolation multiplying power M according to the formula,
wherein u represents the lateral distance in the interpolation image from the second target pixel to the corresponding first target pixel, and v represents the longitudinal distance in the interpolation image from the second target pixel to the corresponding first target pixel; and
determining the specific image edge detection operator according to the values of u and v and the image edge detection operator.
7. The method according to claim 6, characterized in that the process of performing image edge detection on the actually detected image using the specific image edge detection operator to obtain the gray value information of the edge-detected image of the image to be detected specifically comprises:
obtaining the gray value data of the image to be detected, and determining the gray value data of the actually detected image from the gray value data of the image to be detected; and
convolving the specific image edge detection operator with the gray value data of the actually detected image according to the formula, to obtain the gray value information of the edge-detected image of the image to be detected;
wherein D represents the gray value of a pixel in the edge-detected image of the image to be detected, I′ represents the gray value data of a 2 × 2 data window in the actually detected image, A represents the specific image edge detection operator, | | denotes the absolute value operation, and ⌊ ⌋ denotes rounding down.
8. A device for obtaining an image edge detection operator, characterized by comprising:
a region division module, configured to divide, in a target image, a four-neighborhood including a first target pixel, the first target pixel being the pixel in the four-neighborhood with the smallest row index and column index;
an interpolation processing module, configured to perform bilinear interpolation processing on the target image and select any one pixel inserted into the four-neighborhood as a second target pixel, establishing the positional relationship between the first target pixel and the second target pixel within the four-neighborhood;
a first determining module, configured to determine an image edge detection functional relation according to the positional relationship and the bilinear interpolation formula, the image edge detection functional relation being used to calculate the difference between the gray value of the first target pixel and the gray value of the second target pixel; and
a second determining module, configured to determine the image edge detection operator according to the image edge detection functional relation.
9. The device according to claim 8, characterized in that the interpolation processing module comprises:
an interpolation processing unit, configured to perform bilinear interpolation processing on the target image at a preset interpolation multiplying power, obtaining an interpolation image of the target image; and
a pixel selection unit, configured to select, in the interpolation image, the second target pixel inserted into the four-neighborhood, the second target pixel being located at the intersection of the second-to-last row of pixels and the second-to-last column of pixels in the four-neighborhood.
10. The device according to claim 9, characterized in that the first determining module is specifically configured to:
determine, according to the positional relationship and the bilinear interpolation formula, the following first functional relation between the gray value of the first target pixel and the gray value of the second target pixel:
D(u, v) = (1-u)(1-v)*G(i, j) + u(1-v)*G(i, j+1) + (1-u)v*G(i+1, j) + uv*G(i+1, j+1),
wherein D(u, v) represents the gray value of the second target pixel in the interpolation image, u represents the lateral distance in the interpolation image from the second target pixel to the corresponding first target pixel, and v represents the longitudinal distance in the interpolation image from the second target pixel to the corresponding first target pixel;
determine, for each pixel of the target image taken as the first target pixel, the corresponding second target pixel in the interpolation image;
extract all second target pixels in the interpolation image and rearrange them according to the arrangement rule of the first target pixels in the target image, forming a reference image;
determine, according to the position coordinates of the second target pixels in the reference image and the first functional relation, the following second functional relation between the gray value of the first target pixel and the gray value of the second target pixel:
D(i, j) = (1-u)(1-v)*G(i, j) + u(1-v)*G(i, j+1) + (1-u)v*G(i+1, j) + uv*G(i+1, j+1); and
determine the image edge detection functional relation according to the second functional relation;
wherein G(i, j) represents the gray value of the pixel in row i and column j of the target image, D(i, j) represents the gray value of the pixel in row i and column j of the reference image, G(i, j+1) represents the gray value of the pixel in row i and column j+1 of the target image, G(i+1, j) represents the gray value of the pixel in row i+1 and column j of the target image, and G(i+1, j+1) represents the gray value of the pixel in row i+1 and column j+1 of the target image.
11. The device according to claim 10, characterized in that the second determining module is specifically configured to:
transform the image edge detection functional relation into an image edge detection functional relation in convolution form; and
determine the image edge detection operator according to the image edge detection functional relation in convolution form;
wherein G(i, j) represents the gray value of the pixel in row i and column j of the target image, D(i, j) represents the gray value of the pixel in row i and column j of the reference image, G(i, j+1) represents the gray value of the pixel in row i and column j+1 of the target image, G(i+1, j) represents the gray value of the pixel in row i+1 and column j of the target image, G(i+1, j+1) represents the gray value of the pixel in row i+1 and column j+1 of the target image, u represents the lateral distance in the interpolation image from the second target pixel to the corresponding first target pixel, and v represents the longitudinal distance in the interpolation image from the second target pixel to the corresponding first target pixel.
12. A device for performing image edge detection on an image to be detected using the image edge detection operator according to any one of claims 1 to 4, characterized by comprising:
a third determining module, configured to determine a specific image edge detection operator for the image to be detected according to a preset interpolation multiplying power for the image to be detected and the image edge detection operator;
an image processing module, configured to add one row of pixels below the last row of the acquired image to be detected and one column of pixels to the right of its last column, obtaining an actually detected image, the added row of pixels being obtained by duplicating the second-to-last row of the image to be detected and the added column of pixels being obtained by duplicating its second-to-last column; and
an image edge detection module, configured to perform image edge detection on the actually detected image using the specific image edge detection operator, to obtain gray value information of an edge-detected image of the image to be detected.
13. The device according to claim 12, characterized in that the third determining module is specifically configured to:
obtain the preset interpolation multiplying power M for the image to be detected;
determine the values of u and v from the preset interpolation multiplying power M according to the formula,
wherein u represents the lateral distance in the interpolation image from the second target pixel to the corresponding first target pixel, and v represents the longitudinal distance in the interpolation image from the second target pixel to the corresponding first target pixel; and
determine the specific image edge detection operator according to the values of u and v and the image edge detection operator.
14. The device according to claim 13, characterized in that the image edge detection module comprises:
a gray value data acquiring unit, configured to obtain the gray value data of the image to be detected and determine the gray value data of the actually detected image from the gray value data of the image to be detected; and
an image edge detection unit, configured to convolve the specific image edge detection operator with the gray value data of the actually detected image according to the formula, to obtain the gray value information of the edge-detected image of the image to be detected;
wherein D represents the gray value of a pixel in the edge-detected image of the image to be detected, I′ represents the gray value data of a 2 × 2 data window in the actually detected image, A represents the specific image edge detection operator, | | denotes the absolute value operation, and ⌊ ⌋ denotes rounding down.