GB2468380A - Blur estimation in eye images via cosine amplitude and phase matching - Google Patents
- Publication number
- GB2468380A (application GB0922046A; also published as GB0922046D0, GB2468380B)
- Authority
- GB
- United Kingdom
- Prior art keywords
- image
- data
- blur
- amplitude
- phase
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
-
- G06K9/0061—
-
- G06T5/001—
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
- G06T7/0012—Biomedical image inspection
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
- G06T7/12—Edge-based segmentation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/18—Eye characteristics, e.g. of the iris
- G06V40/193—Preprocessing; Feature extraction
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30004—Biomedical image processing
- G06T2207/30041—Eye; Retina; Ophthalmic
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30168—Image quality inspection
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Health & Medical Sciences (AREA)
- General Health & Medical Sciences (AREA)
- Quality & Reliability (AREA)
- Multimedia (AREA)
- Human Computer Interaction (AREA)
- Medical Informatics (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Radiology & Medical Imaging (AREA)
- Ophthalmology & Optometry (AREA)
- Image Processing (AREA)
Abstract
A method for blur estimation in eye or iris images comprises: locating an image region containing an eye 402 and identifying a centre of a pupil region 408; re-sampling the image along lines of various orientations 410, each line passing through the pupil centre region, generating data for an image model; and determining an amplitude and phase of a cosine that most closely matches the data 416, the amplitude and phase respectively corresponding to blur extent and direction 418. A system and computer medium for such a method are also independently claimed. The method may further comprise: extracting multiple image regions 406 if the image contains multiple eyes 404; taking account of nearly circular pupils 412; estimating blurred edge widths 414; and finding the cosine amplitude and phase by least squares curve fitting 416. Model data may comprise blur width. The blur estimation can produce de-blurred images that are more useful for biometric identification. Iris and pupil region edges, or the shutter motion pattern or a characterization of the optical system, can be utilized. In a video stream, eye position before and after capture can predict eye motion.
Description
A FEATURE-BASED METHOD AND SYSTEM FOR BLUR ESTIMATION IN EYE IMAGES
CROSS-REFERENCE TO U.S. PROVISIONAL APPLICATION
[0001] This application claims the benefit of priority based on U.S. Provisional Patent Application Serial No. 61/156,759 filed March 2, 2009, entitled "A Feature-Based Method and System for Blur Estimation in Eye Images." The above-referenced provisional patent application is hereby incorporated by reference herein in its entirety.
STATEMENT OF GOVERNMENT RIGHTS
[0002] The invention disclosed in this application was made with Government support under Contract Number W91CRB-09-C-0013 awarded by the U.S. Army Research Office. The U.S. Government has certain rights in the invention.
TECHNICAL FIELD
[0003] Embodiments are generally related to the field of security systems and biometric identification. Embodiments are also related to iris-based biometrics. Embodiments are additionally related to flutter shutter technology. Embodiments are additionally related to image-processing techniques, devices, and systems.
BACKGROUND OF THE INVENTION
[0004] Acquiring sharply-focused images of moving people or objects is a fundamental and challenging problem in several surveillance applications, particularly iris-based biometrics and face recognition. Biometric identification or verification of individuals can be performed on a number of input modalities. Several modalities such as iris and face recognition depend on image capture that, in different environments, may have degraded image quality. Image blur, due to subject motion or optical defocus, is an important type of degradation that is commonly encountered in biometric systems. When blur cannot be mitigated by changing the image capture technique, it becomes necessary to process blurred images in order to produce sharply-focused images that can be used for biometric identification.
[0005] Some deblurring approaches have been implemented. For example, a non-limiting example of a deblurring technology is disclosed in U.S. Patent Application Publication No. US2007/0258707A1, entitled "Method and Apparatus for Deblurring Images," which published to Ramesh Raskar on November 8, 2007, and is incorporated herein by reference. Another deblurring approach is disclosed in U.S. Patent Application Publication No. US2007/0258706A1, entitled "Method for Deblurring Images Using Optimized Temporal Coding Patterns," which published to Ramesh Raskar, et al., on November 8, 2007, and is incorporated herein by reference.
[0006] While there are many techniques for performing deblurring, all such approaches require an estimate of the blur magnitude. Methods that remove motion blur additionally require estimates of the motion direction/path and the shape of the blur, which is related to the camera's shutter pattern and illumination characteristics. Methods that remove optical blur require estimates of both the magnitude and the shape of the blur, the shape being related to optical characteristics. In its full generality, the blur estimation problem is known to be ill-posed on a single image.
BRIEF SUMMARY
[0007] The following summary is provided to facilitate an understanding of some of the innovative features unique to the embodiments disclosed and is not intended to be a full description. A full appreciation of the various aspects of the embodiments can be gained by taking the entire specification, claims, drawings, and abstract as a whole.
[0008] It is, therefore, one aspect of the present invention to provide for a method and system for enhanced biometric identification.
[0009] It is another aspect of the present invention to provide for a feature-based method and system for blur estimation in eye images.
[0010] The aforementioned aspects and other objectives and advantages can now be achieved as described herein. A feature-based method and system for blur estimation in eye images is disclosed. A blur estimation can be performed from eye/iris images in order to produce de-blurred images that are more useful for biometric identification. The invention has been developed to exploit additional information that, while not available in the general case, is available when the blur estimation method/system is a component in a larger biometric system. Whereas the most general form of the blur estimation problem is ill-posed, three sources of information can be utilized, in addition to the image, to estimate blur in eye/iris images captured by our biometric systems.
[0011] First, the features of the eye/iris region, in particular the edge between the iris and pupil regions, can be utilized. Second, the pattern of shutter motion (for motion blur) or a characterization of the optical system (for defocus blur) can be utilized. Third, additional image information can be utilized. By capturing a burst of images, or a video stream, one can use eye position in the images before and after a given capture to predict the motion of the eye within that capture. Because the before/after image frames need only contain the information necessary to locate the eye, and need not contain sufficient information to perform matching, the capture of these images can be accomplished with a wider range of settings that result in a lower image quality.
BRIEF DESCRIPTION OF THE DRAWINGS
[0012] FIG. 1 illustrates a schematic view of a data-processing system in which the present invention may be embodied, including the use of a biometric reader, which may incorporate the use of, for example, a camera;
[0013] FIG. 2 illustrates a schematic view of a software system including an operating system, application software, and a user interface for carrying out the present invention;
[0014] FIG. 3 illustrates a graphical representation of a network of data-processing systems in which aspects of the present invention may be implemented; and
[0015] FIG. 4 illustrates a high-level flow chart of operations depicting logical operational steps of a method for blur estimation in eye images, in accordance with a preferred embodiment.
DETAILED DESCRIPTION
[0016] FIGS. 1-3 are provided as exemplary diagrams of data-processing environments in which embodiments of the present invention may be implemented. It should be appreciated that FIGS. 1-3 are only exemplary and are not intended to assert or imply any limitation with regard to the environments in which aspects or embodiments of the present invention may be implemented. Many modifications to the depicted environments may be made without departing from the spirit and scope of the present invention.
[0017] As depicted in FIG. 1, the present invention may be embodied in the context of a data-processing system 100 comprising, for example, a central processor 101, a main memory 102, an input/output controller 103, a keyboard 104, a pointing device 105 (e.g., mouse, track ball, pen device, or the like), a display device 106, and a mass storage component 107 (e.g., hard disk). A camera 108 may be employed, which is capable of communicating with the data-processing system 100. Camera 108 may be implemented as, for example, a flutter shutter camera utilized in the context of a biometric reader, such as an iris scanner or facial recognition device. The flutter shutter camera can be configured as a camera capable of capturing moving objects at an exposure time of, for example, over 50 milliseconds, much like a high-speed motion camera. Using a coded exposure sequence, the flutter shutter camera can recover, for example, text from a speeding car and sharpen images. As illustrated, the various components of the data-processing system communicate through a system bus 110 or similar architecture.
[0018] FIG. 2 illustrates a computer software system 150 for directing the operation of the data-processing system 100 depicted in FIG. 1. Software system 150, which is stored in system memory 102 and on disk memory 107, can include a kernel or operating system 151 and a shell or interface 153. One or more application programs, such as application software 152, may be "loaded" (i.e., transferred from storage 107 into memory 102) for execution by the data-processing system 100. The data-processing system 100 receives user commands and data through user interface 153; these inputs may then be acted upon by the data-processing system 100 in accordance with instructions from operating module 151 and/or application module 152.
[0019] The interface 153, which is preferably a graphical user interface (GUI), can also serve to display results, whereupon the user may supply additional inputs or terminate a given session. In one possible embodiment, operating system 151 and interface 153 can be implemented in the context of a "Windows" system. It can be appreciated, of course, that other types of systems are possible. For example, rather than a traditional "Windows" system, other operating systems such as, for example, Linux may also be employed with respect to the operating system 151 and interface 153. Application module 152, on the other hand, can include instructions, such as the various operations described herein with respect to the various components and modules described herein such as, for example, the method 400 depicted in FIG. 4.
[0020] FIG. 3 illustrates a graphical representation of a network of data processing systems in which aspects of the present invention may be implemented. Network data processing system 300 can be provided as a network of computers in which embodiments of the present invention may be implemented. Network data processing system 300 contains network 302, which can be utilized as a medium for providing communications links between various devices and computers connected together within network data processing system 300. Network 302 may include connections such as wired, wireless communication links, fiber optic cables, USB cables, Ethernet connections, and so forth.
[0021] In the depicted example, server 304 and server 306 connect to network 302 along with storage unit 308. In addition, clients 310, 312, and 314 connect to network 302.
These clients 310, 312, and 314 may be, for example, personal computers or network computers. Data-processing system 100 depicted in FIG. 1 can be, for example, a client such as client 310, 312, and/or 314. Alternatively, data-processing system 100 can be implemented as a server, such as servers 304 and/or 306, depending upon design considerations.
[0022] In the depicted example, server 304 provides data, such as boot files, operating system images, and applications to clients 310, 312, and 314. Clients 310, 312, and 314 are clients to server 304 in this example. Network data processing system 300 may include additional servers, clients, and other devices not shown. Specifically, clients may connect to any member of a network of servers which provide equivalent content.
[0023] In some embodiments, network data processing system 300 may be the Internet with network 302 representing a worldwide collection of networks and gateways that use the Transmission Control Protocol/Internet Protocol (TCP/IP) suite of protocols to communicate with one another. At the heart of the Internet is a backbone of high-speed data communication lines between major nodes or host computers, consisting of thousands of commercial, government, educational, and other computer systems that route data and messages. Of course, network data processing system 300 also may be implemented as a number of different types of networks such as, for example, a secure intranet, a local area network (LAN), or a wide area network (WAN). FIG. 3 is intended as an example and not as an architectural limitation for different embodiments of the present invention.
[0024] The following description is presented with respect to embodiments of the present invention, which can be embodied in the context of a data-processing system such as data-processing system 100, computer software system 150, network data processing system 300, and network 302 depicted respectively in FIGS. 1-3. The present invention, however, is not limited to any particular application or any particular environment. Instead, those skilled in the art will find that the system and methods of the present invention may be advantageously applied to a variety of system and application software, including database management systems, word processors, and the like. Moreover, the present invention may be embodied on a variety of different platforms, including Macintosh, UNIX, LINUX, and the like. Therefore, the description of the exemplary embodiments, which follows, is for purposes of illustration and not considered a limitation.
[0025] The disclosed embodiments generally describe a feature-based method and system for blur estimation in eye images. A blur estimation can be performed from eye/iris images in order to produce de-blurred images that are more useful for biometric identification. The disclosed embodiments have been developed to exploit additional information that, while not available in the general case, is available when the blur estimation method/system is a component in a larger biometric system.
[0026] Whereas the most general form of the blur estimation problem is ill-posed, three sources of information can be utilized, in addition to the image, to estimate blur in eye/iris images captured by our biometric systems. First, the features of the eye/iris region, in particular the edge between the iris and pupil regions, can be utilized. Second, the pattern of shutter motion (for motion blur) or a characterization of the optical system (for defocus blur) can be utilized. Third, additional image information can be utilized. By capturing a burst of images, or a video stream, one can use eye position in the images before and after a given capture to predict the motion of the eye within that capture. Because the before/after image frames need only contain the information necessary to locate the eye, and need not contain sufficient information to perform matching, the capture of these images can be accomplished with a wider range of settings that result in a lower image quality.
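The third information source — eye positions located in frames captured before and after the blurred capture — can be exploited under a constant-velocity assumption. The sketch below is an illustrative reading of that idea; the function name, parameters, and linear-motion model are assumptions for exposition, not taken from the patent.

```python
def predict_eye_motion(pos_before, pos_after, t_before, t_after, t_open, t_close):
    """Predict the eye's displacement (dx, dy) during the exposure window
    [t_open, t_close], given eye centers (x, y) located in a frame at time
    t_before and a frame at time t_after. Assumes roughly constant velocity
    between the two bracketing frames."""
    dt = t_after - t_before
    # Estimated velocity between the before/after frames (pixels per unit time).
    vx = (pos_after[0] - pos_before[0]) / dt
    vy = (pos_after[1] - pos_before[1]) / dt
    exposure = t_close - t_open
    # Displacement accumulated while the shutter was open.
    return (vx * exposure, vy * exposure)
```

Because the bracketing frames only need to localize the eye (not support iris matching), they can be captured with shorter exposures or lower resolution, as the description notes.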
[0027] FIG. 4 illustrates a high-level flow chart of operations depicting logical operational steps of a method 400 for blur estimation in eye images, in accordance with a preferred embodiment. Note that method 400 and other methodologies disclosed herein can be implemented in the context of a computer-useable medium that contains a program product. Programs defining functions on the present invention can be delivered to a data storage system or a computer system via a variety of signal-bearing media, which include, without limitation, non-writable storage media (e.g., CD-ROM), writable storage media (e.g., hard disk drive, read/write CD ROM, optical media), system memory such as, but not limited to, Random Access Memory (RAM), and communication media, such as computer and telephone networks including Ethernet, the Internet, wireless networks, and like network systems.
[0028] It should be understood, therefore, that such signal-bearing media, when carrying or encoding computer readable instructions that direct method functions in the present invention, represent alternative embodiments of the present invention. Furthermore, it is understood that the present invention may be implemented by a system having means in the form of hardware, software, or a combination of software and hardware as described herein or their equivalent. Thus, the method 400, for example, described herein can be deployed as process software in the context of a computer system or data-processing system as that depicted in FIGS. 1-3.
[0029] The method 400 is based on the fact that the edge between the pupil and iris is a very useful feature for blur estimation because it provides information about all orientations over the range [0,360] degrees. Under motion blur, this edge will appear sharp in the directions orthogonal to motion and most blurred in the motion direction. By estimating the degree of blur at each orientation, it is possible to estimate both the motion direction and the severity of the blur. Simplistically, the blur direction can be estimated as the direction along which the blur width is maximal and the extent can be estimated as that maximal blur width. Due to sensor noise, errors in the estimate of the pupil center, and other confounding factors, this simplistic method is likely to perform erratically.
[0030] In order to improve robustness to such factors, our more sophisticated scheme uses the pairs of blur width and edge orientation as data that can be fit to a model.
Because the blur width is expected to vary with the cosine of the difference between an orientation and the true direction of motion, we can fit the blur width data to a two-parameter model of amplitude and phase, the amplitude corresponding to the blur extent, and the phase corresponding to the blur direction. Method 400 thus constitutes a number of processing steps. As indicated at block 402, an operation can be processed for locating the region of the image containing an eye. If the image contains multiple eyes, as indicated at block 404, then multiple regions can be extracted, depicted at block 406; otherwise, the operation illustrated at block 408 can be processed.
[0031] With respect to each such region, the center of the pupil region in the image can be located, as described at block 408. Next, as illustrated at block 410, the image can be re-sampled along lines of various orientations, all of which pass through the pupil center.
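The re-sampling operation at block 410 can be sketched as a bilinear sampler along a line through the pupil center. This is only an illustrative implementation (the function name `sample_line` and its parameters are hypothetical), not the patented one:

```python
import math

def sample_line(img, cx, cy, theta, half_len, n):
    """Bilinearly sample `img` (a list of rows of gray values) along the line
    through (cx, cy) at orientation `theta` (radians), returning `n` samples
    spanning t in [-half_len, +half_len] along that line."""
    h, w = len(img), len(img[0])
    dx, dy = math.cos(theta), math.sin(theta)
    profile = []
    for i in range(n):
        t = -half_len + 2.0 * half_len * i / (n - 1)
        x, y = cx + t * dx, cy + t * dy
        # Clamp so the 2x2 bilinear neighborhood stays inside the image.
        x0 = min(max(int(math.floor(x)), 0), w - 2)
        y0 = min(max(int(math.floor(y)), 0), h - 2)
        fx, fy = x - x0, y - y0
        v = (img[y0][x0] * (1 - fx) * (1 - fy)
             + img[y0][x0 + 1] * fx * (1 - fy)
             + img[y0 + 1][x0] * (1 - fx) * fy
             + img[y0 + 1][x0 + 1] * fx * fy)
        profile.append(v)
    return profile
```

Calling this for a set of orientations (e.g., every 15 degrees over [0, 360)) yields one intensity profile per orientation, each crossing the pupil/iris edge twice.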
Following the processing of the operation depicted at block 410, a test can be performed, as illustrated at block 412, to determine if the pupil is nearly circular. If so, then the operation depicted at block 414 is processed and the operation illustrated at block 413 is skipped. If not, then the operation depicted at block 413 is processed, followed by the operation depicted at block 414.
[0032] As indicated at block 413, the edge orientation in the lines of re-sampling can be estimated. The width of the blurred edges can also be estimated along these lines of re-sampling, as depicted at block 414. Note that the step illustrated at block 413 can be skipped when the pupil is nearly circular, as the orientation of the blurred edge will be orthogonal to the orientation of the corresponding line of re-sampling. Following the processing of the operation described at block 414, the operation depicted at block 416 can be processed. At this point, a collection of (edge orientation, blur width) pairs are generated that can be fit to a model.
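One common way to quantify the width of a blurred edge along a re-sampled profile, which is one plausible reading of block 414 (the patent does not specify the estimator), is the 10%-90% rise distance. The sketch below assumes a single, monotonic dark-to-bright transition such as the pupil-to-iris edge:

```python
def edge_width(profile, step=1.0):
    """Estimate the width of a blurred edge as the distance (in samples times
    `step`) over which `profile` rises from 10% to 90% of its total range.
    Assumes a single, monotonically increasing dark-to-bright transition."""
    lo, hi = min(profile), max(profile)
    t10 = lo + 0.1 * (hi - lo)
    t90 = lo + 0.9 * (hi - lo)

    def first_crossing(level):
        # Linearly interpolate between the two samples bracketing `level`.
        for i in range(len(profile) - 1):
            a, b = profile[i], profile[i + 1]
            if a <= level <= b and b > a:
                return i + (level - a) / (b - a)
        raise ValueError("no crossing found")

    return (first_crossing(t90) - first_crossing(t10)) * step
```

A sharp edge yields a small width; motion blur along the profile's orientation widens the transition, which is exactly the signal the cosine model consumes.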
[0033] As illustrated at block 416, a least squares curve fitting approach (or similar methodology) can be employed to determine the amplitude and phase of a cosine that most closely matches the data. Finally, as indicated at block 418, the method 400 outputs the amplitude and phase, which correspond to the blur extent and direction, respectively.
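The fit at block 416 can be sketched as follows. When the orientations uniformly cover [0, 2*pi), the least squares normal equations decouple and the solution reduces to first-harmonic projections, which is what this illustrative sketch (the name `fit_cosine` is not from the patent) assumes. It follows the document's plain-cosine model of blur width versus orientation; since blur at an orientation and at that orientation plus 180 degrees is physically identical, a practical variant might instead fit a cosine of twice the angle difference.

```python
import math

def fit_cosine(thetas, widths):
    """Least-squares fit of widths[i] ~ m + a*cos(thetas[i]) + b*sin(thetas[i]),
    assuming the orientations `thetas` uniformly cover [0, 2*pi), so the fit
    reduces to first-harmonic projections. Returns (amplitude, phase), which
    in the method's model correspond to blur extent and blur direction."""
    n = len(thetas)
    a = 2.0 / n * sum(w * math.cos(t) for t, w in zip(thetas, widths))
    b = 2.0 / n * sum(w * math.sin(t) for t, w in zip(thetas, widths))
    return math.hypot(a, b), math.atan2(b, a)
```

For non-uniform orientation samples one would solve the full 3x3 normal equations instead of using the projection shortcut.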
[0034] While the present invention has been particularly shown and described with reference to a preferred embodiment, it will be understood by those skilled in the art that various changes in form and detail may be made therein without departing from the spirit and scope of the invention. Furthermore, as used in the specification and the appended claims, the term "computer" or "system" or "computer system" or "computing device" or "data-processing system" includes any data-processing apparatus including, but not limited to, personal computers, servers, workstations, network computers, main frame computers, routers, switches, Personal Digital Assistants (PDA's), telephones, and any other system capable of processing, transmitting, receiving, capturing and/or storing data.
[0035] It will be appreciated that variations of the above-disclosed and other features and functions, or alternatives thereof, may be desirably combined into many other different systems or applications. Also, that various presently unforeseen or unanticipated alternatives, modifications, variations or improvements therein may be subsequently made by those skilled in the art which are also intended to be encompassed by the following claims.
Claims (10)
- CLAIMS
What is claimed is:
1. A method for blur estimation in eye images, said method comprising: locating a region of an image containing an eye and identifying a center of a pupil region in said image; re-sampling said image along lines of various orientations, wherein each line thereof passes through the center of said pupil region, in order to generate data indicative of a model of said image; and determining an amplitude and a phase of a cosine that most closely matches said data, wherein said amplitude and said phase correspond, respectively, to a blur extent and a blur direction with respect to said image.
- 2. The method of claim 1 further comprising estimating a width of blurred edges along said lines of various orientations.
- 3. The method of claim 1 further comprising extracting multiple regions from said image if said image contains multiple eyes.
- 4. The method of claim 1 further comprising utilizing a least squares curve fitting to determine said amplitude and said phase of said cosine that most closely matches said data.
- 5. The method of claim 1 wherein said data indicative of said model of said image comprises blur width data.
- 6. A system for blur estimation in eye images, said system comprising: a processor; a data bus coupled to the processor; and a computer-usable medium embodying computer code, the computer-usable medium being coupled to the data bus, the computer program code comprising instructions executable by the processor and configured for: locating a region of an image containing an eye and identifying a center of a pupil region in said image; re-sampling said image along lines of various orientations, wherein each line thereof passes through the center of said pupil region, in order to generate data indicative of a model of said image; and determining an amplitude and a phase of a cosine that most closely matches said data, wherein said amplitude and said phase correspond, respectively, to a blur extent and a blur direction with respect to said image.
- 7. The system of claim 6 wherein said instructions are further configured for: estimating a width of blurred edges along said lines of various orientations; extracting multiple regions from said image if said image contains multiple eyes; and utilizing a least squares curve fitting to determine said amplitude and said phase of said cosine that most closely matches said data.
- 8. The system of claim 6 wherein said data indicative of said model of said image comprises blur width data.
- 9. A computer-usable medium for blur estimation in eye images, said computer-usable medium embodying computer program code, said computer program code comprising computer executable instructions configured for: locating a region of an image containing an eye and identifying a center of a pupil region in said image; re-sampling said image along lines of various orientations, wherein each line thereof passes through the center of said pupil region, in order to generate data indicative of a model of said image; and determining an amplitude and a phase of a cosine that most closely matches said data, wherein said amplitude and said phase correspond, respectively, to a blur extent and a blur direction with respect to said image.
- 10. The computer-usable medium of claim 9 wherein: said embodied computer code further comprises computer executable instructions configured for estimating a width of blurred edges along said lines of various orientations; said embodied computer code further comprises computer executable instructions configured for extracting multiple regions from said image if said image contains multiple eyes; and said embodied computer code further comprises computer executable instructions configured for utilizing a least squares curve fitting to determine said amplitude and said phase of said cosine that most closely matches said data.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15675909P | 2009-03-02 | 2009-03-02 | |
US12/421,263 US8873810B2 (en) | 2009-03-02 | 2009-04-09 | Feature-based method and system for blur estimation in eye images |
Publications (3)
Publication Number | Publication Date |
---|---|
GB0922046D0 GB0922046D0 (en) | 2010-02-03 |
GB2468380A true GB2468380A (en) | 2010-09-08 |
GB2468380B GB2468380B (en) | 2011-05-04 |
Family
ID=41717100
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
GB0922046A Active GB2468380B (en) | 2009-03-02 | 2009-12-17 | A feature-based method and system for blur estimation in eye images |
Country Status (1)
Country | Link |
---|---|
GB (1) | GB2468380B (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9338354B2 (en) | 2011-10-03 | 2016-05-10 | Nikon Corporation | Motion blur estimation and restoration using light trails |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20050249429A1 (en) * | 2004-04-22 | 2005-11-10 | Fuji Photo Film Co., Ltd. | Method, apparatus, and program for image processing |
US20070122049A1 (en) * | 2003-03-31 | 2007-05-31 | Cdm Optics, Inc. | Systems and methods for minimizing aberrating effects in imaging systems |
WO2007129762A2 (en) * | 2006-05-08 | 2007-11-15 | Mitsubishi Electric Corporation | Method for reducing blur in an image of a scene and apparatus for deblurring an image of a scene |
GB2450027A (en) * | 2006-03-03 | 2008-12-10 | Honeywell Int Inc | Invariant radial iris segmentation |
- 2009-12-17: GB application GB0922046A granted as patent GB2468380B (status: active)
Also Published As
Publication number | Publication date |
---|---|
GB0922046D0 (en) | 2010-02-03 |
GB2468380B (en) | 2011-05-04 |
Legal Events
Code | Title | Description
---|---|---
732E | Amendments to the register in respect of changes of name or changes affecting rights (sect. 32/1977) | Free format text: REGISTERED BETWEEN 20180816 AND 20180822