
CN114972084A - Image focusing accuracy evaluation method and system - Google Patents


Info

Publication number
CN114972084A
CN114972084A
Authority
CN
China
Prior art keywords
image
edge
detection frame
sampling
evaluated
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210522635.2A
Other languages
Chinese (zh)
Inventor
许允迪
陈安
周才健
应再恩
周柔刚
盛锦华
周卫华
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Jiaxing Yijunchen Technology Co ltd
Hangzhou Huicui Intelligent Technology Co ltd
Original Assignee
Jiaxing Yijunchen Technology Co ltd
Hangzhou Huicui Intelligent Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Jiaxing Yijunchen Technology Co ltd, Hangzhou Huicui Intelligent Technology Co ltd filed Critical Jiaxing Yijunchen Technology Co ltd
Priority to CN202210522635.2A
Publication of CN114972084A
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/13Edge detection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00Image enhancement or restoration
    • G06T5/70Denoising; Smoothing
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00Image enhancement or restoration
    • G06T5/73Deblurring; Sharpening
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/20Image preprocessing
    • G06V10/22Image preprocessing by selection of a specific region containing or referencing a pattern; Locating or processing of specific regions to guide the detection or recognition
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/67Focus control based on electronic image sensor signals

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses an image focusing accuracy evaluation method and system, which comprises the following steps: s1, acquiring an image to be evaluated and preprocessing the image to be evaluated to obtain a gray level image to be evaluated; s2, extracting the edge line of the gray level image to be evaluated and the gradient information of each pixel on the edge line; s3, setting a sampling ratio or sampling point number, and uniformly sampling the edge line to obtain a sampling point; s4, acquiring the detection frame of each sampling point according to the gradient information; s5, calculating the edge definition of each sampling point detection frame; and S6, calculating the edge definition of the whole image according to the edge definition of the detection frame of each sampling point, and evaluating the focusing effect according to the edge definition of the whole image. The invention realizes the rapid and accurate detection and evaluation of the image focusing accuracy, is suitable for various detection scenes and can guide the focusing process in real time.

Description

Image focusing accuracy evaluation method and system
Technical Field
The invention relates to the field of machine vision, in particular to an image focusing accuracy evaluation method and system.
Background
In machine vision-aided industrial part measurement, the captured image must be sufficiently clear. For the object under observation, its imaging surface does not necessarily coincide with the photosensitive surface of the camera; to obtain a clear image, the position of the imaging surface needs to be adjusted to coincide with the photosensitive surface, that is, focusing is performed. The better the focusing, the clearer the edges of the target object in the image, the better their consistency with the actual object edges, and the more accurate the results when the image is further used for measurement, edge detection and other applications; otherwise, the edges of the imaged object will be blurred to some degree and cannot be accurately located, degrading the machine vision detection precision.
In existing image detection techniques, focusing accuracy is mostly judged manually by eye after magnifying edge regions of the image, so quantitative data are lacking.
Disclosure of Invention
The present invention is directed to solving the above problems of the prior art. In one aspect, embodiments of the present invention provide an image focusing accuracy evaluation method; in another aspect, embodiments of the present invention further provide an image focusing accuracy evaluation system.
the invention has the beneficial effects that: the image focusing accuracy evaluation method provided by the invention calculates the definition of the image to be detected based on edge detection, gradient change and gray change sharpness, and improves the accuracy of image evaluation; by means of edge detection and uniform sampling on edge lines, the total calculation amount in the detection process can be reduced, the calculation speed is increased, and the focusing process is guided in real time; a certain range is selected around the sampling point according to the gradient frame to serve as a detection frame, the statistic of the gray level in the detection frame is calculated, the interference of noise points can be effectively eliminated, the detection result is more reliable, and the method can be suitable for detection under various scenes.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art are briefly described below. It is obvious that the drawings in the following description show only some embodiments of the present invention, and that those skilled in the art can obtain other drawings from them without creative effort. In the drawings:
fig. 1 is a schematic overall flow chart of an image focusing accuracy evaluation method according to an embodiment of the present invention;
FIG. 2 is a schematic diagram of edge point sampling locations according to an embodiment of the present invention;
FIG. 3 is a schematic diagram of the distribution of the gray level of the pixels in the detection frame projected in the gradient direction according to the embodiment of the present invention;
FIG. 4 is a schematic diagram illustrating a distribution of gray level projection of pixels in a detection frame in a gradient direction under an ideal condition in the embodiment of the present invention;
fig. 5 is a schematic diagram of an overall structure of the image focusing accuracy evaluation system according to the embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present invention more apparent, the embodiments of the present invention are further described in detail below with reference to the accompanying drawings. The exemplary embodiments and descriptions of the present invention are provided to explain the present invention, but not to limit the present invention.
In the description of the present specification, the terms "comprising," "including," "having," "containing," and the like are used in an open-ended fashion, i.e., to mean including, but not limited to. Reference to the description of the terms "one embodiment," "a particular embodiment," "some embodiments," "for example," etc., means that a particular feature, structure, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the application. In this specification, the schematic representations of the terms used above do not necessarily refer to the same embodiment or example. Furthermore, the particular features, structures, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples. The sequence of steps involved in the embodiments is for illustrative purposes to illustrate the implementation of the present application, and the sequence of steps is not limited and can be adjusted as needed.
The embodiment of the invention provides an image focusing accuracy evaluation method, fig. 1 is an overall flow schematic diagram of the image focusing accuracy evaluation method, and as shown in fig. 1, the method specifically comprises the following steps:
s1, acquiring an image to be evaluated of the target object and preprocessing the image to be evaluated to obtain a gray level image to be evaluated;
the image to be evaluated is obtained by shooting a target object through a camera, and if the shot image to be evaluated is a color image, the color image is converted into a gray-scale image through preprocessing.
Specifically, the camera used in this embodiment is a CCD camera; the preprocessing method includes a weighting method and an averaging method, and the weighting method is preferably used in this embodiment.
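The weighting method of preprocessing mentioned above can be sketched as follows. This is a minimal illustration only; the 0.299/0.587/0.114 luma weights are the common ITU-R BT.601 coefficients and are an assumption, since the embodiment does not state which weights it uses.

```python
import numpy as np

def to_gray_weighted(img_bgr: np.ndarray) -> np.ndarray:
    """Convert a color image to a gray-scale image with the weighting method.

    Assumes BGR channel order (as produced e.g. by OpenCV's imread) and
    the common BT.601 luma weights -- both assumptions, not stated in
    the embodiment.
    """
    b, g, r = img_bgr[..., 0], img_bgr[..., 1], img_bgr[..., 2]
    gray = 0.299 * r + 0.587 * g + 0.114 * b
    return np.clip(gray, 0, 255).astype(np.uint8)
```

The averaging method mentioned as an alternative would simply replace the weighted sum with the mean of the three channels.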
S2, extracting the edge line of the gray level image to be evaluated and the gradient information of each pixel on the edge line;
Whether image focusing is accurate is mainly reflected at the edges of the object image: the gray level changes sharply at edges, so edges best reflect image definition, while the gray level in flat surface regions of the image changes smoothly and is of little reference value for definition evaluation. Therefore, the invention adopts an edge detection algorithm to extract the edge information of the image to be evaluated; the result greatly reduces the amount of image data to be processed, filtering out a large amount of unnecessary information and reducing the amount of calculation in the accuracy detection process.
Specifically, S2 further includes the following steps:
s2-1, extracting the edge line of the gray level image to be evaluated obtained in the S1 based on the edge detection operator;
in this embodiment, the edge detection operator used for extracting the edge line of the gray scale image to be evaluated is a Canny operator.
S2-2, carrying out noise screening treatment on the edge line to obtain the edge of the target object;
According to the characteristics of the target object, the edges of the target object are screened out from the edges obtained in step S2-1; for example, the closed contours in the whole image are obtained through connectivity analysis and noise points are screened out. The noise screening treatment includes breakpoint connection and length-based filtering.
S2-3, calculating change gradients DX and DY of each pixel of the gray image to be evaluated in the horizontal direction and the vertical direction based on an edge detection operator;
In this embodiment, the edge detection operator used for calculating the change gradients of each pixel of the grayscale image to be evaluated is the Sobel operator: two 3 × 3 masks, one for the horizontal and one for the vertical direction, are each convolved with the image to obtain the change gradients in the horizontal and vertical directions.
And S2-4, calculating the gradient direction of each pixel on the edge line by combining the edge of the target object obtained by S2-2 and the change gradients DX and DY obtained by S2-3.
Specifically, the gradient direction is perpendicular to the tangent of the edge line, and is calculated as:

G_i = arctan(DY_i / DX_i)

wherein G_i denotes the gradient direction of the i-th pixel, and DX_i and DY_i denote the change gradients of the i-th pixel in the horizontal and vertical directions, respectively.
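The gradient computation of S2-3 and S2-4 can be sketched in plain NumPy as follows. This is an illustrative implementation of the two 3 × 3 Sobel masks; in practice a library routine such as OpenCV's Sobel filter would typically be used, and the quadrant-aware arctan2 is used here in place of a bare arctan.

```python
import numpy as np

def gradient_directions(gray: np.ndarray) -> np.ndarray:
    """Per-pixel gradient direction arctan2(DY, DX) in radians, computed
    with the standard 3x3 Sobel masks described in the text (a slow but
    explicit sketch; edge pixels use replicated-border padding)."""
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)  # horizontal mask
    ky = kx.T                                                         # vertical mask
    pad = np.pad(gray.astype(float), 1, mode="edge")
    h, w = gray.shape
    dx = np.zeros((h, w))
    dy = np.zeros((h, w))
    for i in range(h):
        for j in range(w):
            win = pad[i:i + 3, j:j + 3]
            dx[i, j] = np.sum(win * kx)  # change gradient DX
            dy[i, j] = np.sum(win * ky)  # change gradient DY
    return np.arctan2(dy, dx)
```

For a vertical dark-to-light edge, the returned direction is 0 (pointing in the +x direction, from the dark side to the bright side), matching the convention in fig. 3.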
S3, setting a sampling ratio or sampling point number, and uniformly sampling the edge line to obtain a sampling point;
specifically, the edge line of the gray level image to be evaluated obtained in S2-1 is sampled according to a set sampling ratio or the number of sampling points, so as to obtain N sampling points.
In this embodiment, a sampling ratio or the number of sampling points can be selected and set, and the value range of the sampling ratio is 0.01 to 0.3, preferably 0.05; the number of sampling points satisfies: the number of sampling points is equal to the sampling ratio multiplied by the number of edge points, and the number of sampling points is more than 3.
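The uniform sampling of S3 can be sketched as follows. The `(row, col)` point-list input format, ordered along the edge line, is a hypothetical convention not fixed by the text; the constraints (number of sampling points = sampling ratio × number of edge points, and more than 3) follow the embodiment.

```python
import numpy as np

def uniform_sample(edge_points: np.ndarray, ratio: float = 0.05) -> np.ndarray:
    """Uniformly sample points from an edge line.

    `edge_points` is an (M, 2) array of (row, col) coordinates ordered
    along the edge line (an assumed input format). The preferred ratio
    0.05 and the lower bound of more than 3 points come from the text.
    """
    m = len(edge_points)
    n = max(int(m * ratio), 4)  # at least 4 sampling points (> 3)
    idx = np.linspace(0, m - 1, n).round().astype(int)
    return edge_points[idx]
```
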
S4, acquiring the detection frame of each sampling point according to the gradient information;
S4 further includes: according to the edge line and its gradient direction obtained in S2, extending L pixels from each sampling point along the gradient direction (toward both the inside and the outside of the edge) and W pixels to both sides of the gradient line perpendicular to the gradient direction, so as to construct a detection frame of W × L pixels. The gradient line is the line segment of length L pixels along the gradient direction whose midpoint is the edge-line sampling point.
Specifically, referring to the illustration of fig. 3, the gradient direction is perpendicular to the tangential direction of the edge line, and is directed from dark to light, i.e., from the side where the gradation is small to the side where the gradation is large.
It should be understood by those skilled in the art that the size of the constructed detection frame may be adjusted according to the size of the image: the length L should be greater than 3 and less than 1/4 of the length of the short side of the image; the width W should be an odd number greater than 3 and less than 1/(sampling ratio).
Further, the length L of the pixel is preferably 12, the width W is preferably 5, and fig. 3 is a detection frame formed by 12 × 5 pixels when the length L of the pixel is 12 and the width W is 5.
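The detection frame construction of S4 can be sketched as follows, under one plausible reading: the frame is gathered by nearest-neighbor sampling of the image along unit vectors parallel and perpendicular to the gradient direction, centered on the sampling point. The text does not fix the resampling scheme, so this is an assumption.

```python
import numpy as np

def detection_frame(gray, point, direction, L=12, W=5):
    """Collect the W x L grid of gray values around an edge sampling point.

    The frame extends L pixels along the gradient direction `direction`
    (radians, as returned by arctan2(DY, DX)) and W pixels across it.
    Nearest-neighbor sampling with border clamping is an assumed detail.
    Returns a (W, L) array.
    """
    r0, c0 = point
    # unit vectors along and across the gradient, in (row, col) form
    ug = np.array([np.sin(direction), np.cos(direction)])
    un = np.array([-ug[1], ug[0]])
    frame = np.zeros((W, L))
    for wi in range(W):
        for li in range(L):
            offs = (li - (L - 1) / 2) * ug + (wi - (W - 1) / 2) * un
            r = int(round(r0 + offs[0]))
            c = int(round(c0 + offs[1]))
            r = min(max(r, 0), gray.shape[0] - 1)  # clamp to image borders
            c = min(max(c, 0), gray.shape[1] - 1)
            frame[wi, li] = gray[r, c]
    return frame
```

With the preferred values L = 12 and W = 5 this yields the 12 × 5 frame of fig. 3, spanning the dark and light sides of the edge.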
S5, calculating the edge definition of each sampling point detection frame;
specifically, S5 further includes the following steps:
S5-1, obtaining the maximum value V_max and the minimum value V_min of the pixel gray levels in the detection frame, and calculating the gray difference H of the detection frame, H = V_max − V_min;
S5-2, calculating the gray mid-value in the detection frame:

V_mid = (V_max + V_min) / 2 = V_min + H/2;

S5-3, calculating the offset ΔV_j between the gray level of each pixel in the detection frame and the mid-value, the calculation formula being:

ΔV_j = |V_j − V_mid|

wherein ΔV_j denotes the offset of the j-th pixel and V_j denotes the gray value of the j-th pixel;
S5-4, combining the offsets and normalizing to obtain the edge definition D of each detection frame. Under the ideal definition condition, that is, when D equals 1, the offset of every pixel equals H/2, so in this embodiment H/2 is used for normalization, and the calculation formula of the edge definition D is:

D = (1 / (W · L)) · Σ_{j=1}^{W·L} ΔV_j / (H/2)
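The per-frame definition computation of S5 can be sketched directly from the quantities above (the zero-span guard for a flat frame is an added assumption, since the text only treats frames containing an edge):

```python
import numpy as np

def edge_sharpness(frame: np.ndarray) -> float:
    """Edge definition D of one detection frame, per S5: the mean offset
    of each gray value from the mid-value, normalized by H/2 so that an
    ideal step edge gives D = 1."""
    v_max, v_min = frame.max(), frame.min()
    h = v_max - v_min                 # gray difference H = V_max - V_min
    if h == 0:
        return 0.0                    # flat frame: no edge content (assumed)
    v_mid = (v_max + v_min) / 2.0     # gray mid-value
    offsets = np.abs(frame - v_mid)   # |V_j - V_mid|
    return float(offsets.mean() / (h / 2.0))
```

An ideal two-level step yields D = 1, while a gradual ramp across the frame yields a value strictly between 0 and 1, reflecting a blurrier edge.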
s6, calculating the edge definition of the whole image according to the edge definition of each sampling point detection frame, and evaluating the focusing effect according to the edge definition of the whole image;
Specifically, the step of calculating the full-image edge definition in S6 comprises: calculating the average of the edge definitions of the detection frames of the sampling points as the full-image edge definition, the calculation formula being:

D̄ = (1/N) · Σ_{i=1}^{N} D_i

wherein D̄ is the full-image edge definition, N is the number of sampling points, and D_i denotes the edge definition of the detection frame corresponding to the i-th sampling point.
Further, the calculated full-image edge definition D̄ lies in the range [0, 1], and the focusing effect is evaluated according to its magnitude; in this embodiment, the closer D̄ is to 1, the clearer the captured image and the better the focusing effect.
In particular, when D̄ is greater than 0.8, the image is considered clear and the focusing effect good; when D̄ is between 0.5 and 0.8, the image definition and focusing effect are considered mediocre, and the image should be collected again; when D̄ is less than 0.5, the detection process is considered erroneous and should be performed again.
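The full-image evaluation of S6, including the thresholds given above, can be sketched as follows (the verdict strings are illustrative labels, not wording from the text):

```python
import numpy as np

def evaluate_focus(frame_scores):
    """Average the per-frame edge definitions into the full-image score
    and map it to the three verdicts of the embodiment:
    > 0.8 good focus, 0.5-0.8 re-acquire the image, < 0.5 detection error."""
    d_bar = float(np.mean(frame_scores))
    if d_bar > 0.8:
        verdict = "well focused"
    elif d_bar >= 0.5:
        verdict = "mediocre focus: re-acquire the image"
    else:
        verdict = "detection error: repeat the detection"
    return d_bar, verdict
```
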
As shown in fig. 5, the present invention further provides an image focusing accuracy evaluation system, through which the image focusing accuracy evaluation method can be implemented, specifically, the system includes an acquisition module, a gradient extraction module, a sampling module, a detection frame acquisition module, a calculation module, and an evaluation module;
Wherein, the acquisition module is used for acquiring an image of the target object and preprocessing it to obtain the gray level image to be evaluated; the gradient extraction module is used for acquiring the gradient direction of each pixel on the edge line of the image to be evaluated; the sampling module samples the edge line to obtain sampling points; the detection frame acquisition module combines the sampling points with the gradient information to obtain the detection frame of each sampling point; the computing module is used for computing the edge definition of each sampling point detection frame and the edge definition of the whole image; and the evaluation module judges the focusing effect of the collected image according to the edge definition of the whole image.
As will be appreciated by one skilled in the art, embodiments of the present invention may be provided as a method, system, or computer program product. Accordingly, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present invention is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
The above-mentioned embodiments are intended to illustrate the objects, technical solutions and advantages of the present invention in further detail, and it should be understood that the above-mentioned embodiments are only exemplary embodiments of the present invention, and are not intended to limit the scope of the present invention, and any modifications, equivalent substitutions, improvements and the like made within the spirit and principle of the present invention should be included in the scope of the present invention.

Claims (10)

1. An image focusing accuracy evaluation method is characterized by comprising the following steps:
s1, acquiring an image to be evaluated and preprocessing the image to be evaluated to obtain a gray level image to be evaluated;
s2, extracting the edge line of the gray level image to be evaluated and the gradient information of each pixel on the edge line;
s3, setting a sampling ratio or sampling point number, and uniformly sampling the edge line to obtain a sampling point;
s4, acquiring the detection frame of each sampling point according to the gradient information;
s5, calculating the edge definition of each sampling point detection frame;
and S6, calculating the edge definition of the whole image according to the edge definition of the detection frame of each sampling point, and evaluating the focusing effect according to the edge definition of the whole image.
2. The method for evaluating the image focusing accuracy of claim 1, wherein in S1, the image to be evaluated is obtained by photographing the object with a camera, and if the photographed image to be evaluated is a color image, the color image is converted into a gray scale image by preprocessing.
3. The image focusing accuracy evaluation method according to claim 1 or 2, wherein said S2 further comprises the steps of:
s2-1, extracting the edge line of the gray level image to be evaluated obtained in the S1 based on the edge detection operator;
s2-2, carrying out noise screening treatment on the edge line to obtain the edge of the target object;
s2-3, calculating change gradients DX and DY of each pixel of the gray image to be evaluated in the horizontal direction and the vertical direction based on an edge detection operator;
and S2-4, calculating the gradient direction of each pixel on the edge line by combining the edge of the object obtained in S2-2 and the change gradients DX and DY obtained in S2-3.
4. The image focusing accuracy evaluation method according to claim 3, wherein in S2-1, the edge detection operator used is a Canny operator; in the step S2-3, the adopted edge detection operator is a Sobel operator.
5. The method for evaluating the image focusing accuracy of claim 4, wherein in the step S3, the sampling ratio ranges from 0.01 to 0.3; the number of sampling points satisfies: the number of sampling points is equal to the sampling ratio multiplied by the number of edge points, and the number of sampling points is more than 3.
6. The image focusing accuracy evaluation method according to claim 5, wherein said S4 further comprises: according to the edge line and the gradient direction thereof obtained in the step S2, L pixel points are extended inward and outward of the sampling point along the gradient direction, and W pixel points are extended to both sides of the gradient line in the direction perpendicular to the gradient direction, so as to construct a detection frame composed of W × L pixel points, wherein the length of the detection frame is L, and the width of the detection frame is W.
7. The method for evaluating image focusing accuracy according to claim 6, wherein the length L is 3 or more and is 1/4 or less of the length of the image short side; and the width W is greater than 3 and less than 1/(sampling ratio).
8. The image focusing accuracy evaluation method according to claim 6 or 7, wherein said S5 further comprises the steps of:
S5-1, obtaining the maximum value V_max and the minimum value V_min of the gray levels of the pixels in the detection frame, and calculating the gray difference H of the detection frame, H = V_max − V_min;
S5-2, calculating the gray mid-value in the detection frame, V_mid = (V_max + V_min) / 2;
S5-3, calculating the offset ΔV_j between the gray level of each pixel in the detection frame and the mid-value, the calculation formula being: ΔV_j = |V_j − V_mid|, wherein ΔV_j denotes the offset of the j-th pixel and V_j denotes the gray value of the j-th pixel;
S5-4, combining the offsets and normalizing to obtain the edge definition D of each detection frame, the calculation formula being: D = (1 / (W · L)) · Σ_{j=1}^{W·L} ΔV_j / (H/2).
9. The image focusing accuracy evaluation method according to claim 8, wherein calculating the full-image edge definition in S6 comprises: calculating the average of the edge definitions of the detection frames of the sampling points as the full-image edge definition, the calculation formula being: D̄ = (1/N) · Σ_{i=1}^{N} D_i, wherein D̄ is the full-image edge definition, N is the number of sampling points, and D_i denotes the edge definition of the detection frame corresponding to the i-th sampling point.
10. An image focusing accuracy evaluation system is characterized by comprising an acquisition module, a gradient extraction module, a sampling module, a detection frame acquisition module, a calculation module and an evaluation module;
Wherein, the acquisition module is used for acquiring an image of the target object and preprocessing it to obtain the gray level image to be evaluated; the gradient extraction module is used for acquiring the gradient direction of each pixel on the edge line of the image to be evaluated; the sampling module samples the edge line to obtain sampling points; the detection frame acquisition module combines the sampling points with the gradient information to obtain the detection frame of each sampling point; the computing module is used for computing the edge definition of each sampling point detection frame and the edge definition of the whole image; and the evaluation module judges the focusing effect of the collected image according to the edge definition of the whole image.
CN202210522635.2A 2022-05-13 2022-05-13 Image focusing accuracy evaluation method and system Pending CN114972084A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210522635.2A CN114972084A (en) 2022-05-13 2022-05-13 Image focusing accuracy evaluation method and system

Publications (1)

Publication Number Publication Date
CN114972084A (en) 2022-08-30

Family

ID=82982843

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210522635.2A Pending CN114972084A (en) 2022-05-13 2022-05-13 Image focusing accuracy evaluation method and system

Country Status (1)

Country Link
CN (1) CN114972084A (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100142819A1 (en) * 2008-12-04 2010-06-10 Tomohisa Suzuki Image evaluation device and image evaluation method
CN103793918A (en) * 2014-03-07 2014-05-14 深圳市辰卓科技有限公司 Image definition detecting method and device
CN104732525A (en) * 2015-02-10 2015-06-24 宁波永新光学股份有限公司 Microscopic image definition evaluation method by combining pixel spacing method visual significance
WO2016020209A2 (en) * 2014-08-05 2016-02-11 Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. Device, method and computer program for determining a sharpness measure for an image
CN110717922A (en) * 2018-07-11 2020-01-21 普天信息技术有限公司 Image definition evaluation method and device
CN114332038A (en) * 2021-12-31 2022-04-12 杭州海康威视数字技术股份有限公司 Image sharpness evaluation method


Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115937324A (en) * 2022-09-09 2023-04-07 郑州思昆生物工程有限公司 Assembly quality evaluation method, device, equipment and storage medium
CN115937324B (en) * 2022-09-09 2024-03-26 郑州思昆生物工程有限公司 Assembly quality evaluation method, device, equipment and storage medium
CN119027371A (en) * 2024-07-23 2024-11-26 南京航空航天大学 A focus evaluation method, system, device and medium for wafer alignment mark


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination