US10554904B2 - Automated extended depth of field imaging apparatus and method - Google Patents
Automated extended depth of field imaging apparatus and method
- Publication number
- US10554904B2 US16/205,047 US201816205047A
- Authority
- US
- United States
- Prior art keywords
- distances
- range
- stack
- image
- focus
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Links
Images
Classifications
-
- H04N5/2356—
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/70—Circuitry for compensating brightness variation in the scene
- H04N23/743—Bracketing, i.e. taking a series of images with varying exposure conditions
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B13/00—Viewfinders; Focusing aids for cameras; Means for focusing for cameras; Autofocus systems for cameras
- G03B13/32—Means for focusing
- G03B13/34—Power focusing
- G03B13/36—Autofocus systems
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/62—Control of parameters via user interfaces
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/64—Computer-aided capture of images, e.g. transfer from script file into camera, check of taken image quality, advice or proposal for image composition or decision on when to take image
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/95—Computational photography systems, e.g. light-field imaging systems
- H04N23/951—Computational photography systems, e.g. light-field imaging systems by using two or more images to influence resolution, frame rate or aspect ratio
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/95—Computational photography systems, e.g. light-field imaging systems
- H04N23/958—Computational photography systems, e.g. light-field imaging systems for extended depth of field imaging
- H04N23/959—Computational photography systems, e.g. light-field imaging systems for extended depth of field imaging by adjusting depth of field during image capture, e.g. maximising or setting range based on scene characteristics
-
- H04N5/23212—
-
- H04N5/23216—
-
- H04N5/23222—
-
- H04N5/23232—
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/222—Studio circuitry; Studio devices; Studio equipment
- H04N5/262—Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects
- H04N5/2625—Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects for obtaining an image which is composed of images from a temporal image sequence, e.g. for a stroboscopic effect
Definitions
- the invention relates to extending the depth of field in images. More particularly, the invention relates to an imaging apparatus having an automated extended depth of field mode, and an associated method.
- EDOF extended depth of field
- Post-processing software then combines the stack images into a single composite image by selecting from each stack image the in-focus portion of the image.
- the resulting composite image thus provides both high resolution and a large depth of field.
- this technique requires the photographer to determine the depth of field provided by the lens and aperture, and to then manually advance the focus distance of the lens by the appropriate amount between acquisition of each stack image. This is both tedious and prone to error.
- bracketing is the general technique of taking several shots of the same subject using different camera settings.
- a camera having an automated focus bracketing capability takes a series of shots of a subject, automatically changing the focal distance after each shot.
- the photographer may choose from a number of settings that specify the number of shots.
- bracketing features do not embody automated intelligence for calculating the spacing of images in a stack for extended depth of field (EDOF) photography based on photographer-selected near and far focus limits.
- An imaging apparatus and method automates and simplifies the process of creating extended depth of field images.
- An embodiment automates the acquisition of an image stack and stores metadata at the time of image acquisition that facilitates production of a composite image having an extended depth of field from at least a portion of the images in the acquired stack.
- An embodiment allows a user to specify, either at the time of image capture or at the time the composite image is created, a range of distances that the user wishes to have in focus within the composite image.
- An embodiment provides an on-board capability to produce a composite, extended depth of field image from the image stack.
- An embodiment allows the user to import the image stack into an image-processing software application that produces the composite image.
- FIG. 1 provides a block diagram of a digital camera having an automated, extended depth-of-field mode
- FIG. 2 provides a flow diagram of a method for acquiring a stack of images having depths of field that span a predetermined distance range for production of extended depth-of-field (EDOF) images;
- EDOF extended depth-of-field
- FIG. 3 provides a flow diagram of an alternate embodiment of a method for acquiring a stack of images having depths of field that span a predetermined distance range for production of EDOF images.
- FIG. 1 shows a diagrammatic representation of a digital camera 100 having an automated extended depth-of-field (EDOF) mode, according to the invention.
- EDOF automated extended depth-of-field
- the digital camera 100 includes a housing 102 , a lens shutter arrangement system 104 having a focusable lens 106 and a shutter 108 , and an image-sensing device 110 , such as a CCD (charge-coupled device).
- One embodiment may include a camera setting adjustment system 114 .
- the camera setting adjustment system 114 may include a stepper motor 112 for incrementally adjusting the focus distance of the lens 106 and the size of the aperture opening.
- the camera setting adjustment system may include an embedded system 116 , the embedded system including at least one processor and a motor driver for facilitating incremental changes in focus distance and aperture via the stepper motor 112 .
- the digital camera may include both fixed focus and auto-focus modes. Selection of focus mode may be accomplished by means of a switch (not shown) or by activating a focus mode-selection component in a user interface displayed on the LCD display 120 . As shown in FIG. 1 , the display is communicatively coupled to the embedded system 116 by means of leads 122 .
- An embodiment allows the user to specify the range of distances at the time of image capture. This may be done manually, for example, by selecting the defining distances via a pair of manual dials (not shown) or by selecting numerical values defining the range on the LCD display 120, or in a camera-assisted manner by using the camera's autofocus feature to specify the near and far limits of the range, for example by pointing a spot-metering range finder (within the auto-focus system of the camera) at two or more points of interest, as sketched below. Additional embodiments may employ, for example, weighted average metering or forms of object-class detection such as face detection to specify near and far limits of the range.
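In the camera-assisted case, deriving the range from the tagged points can be as simple as taking the nearest and farthest of the reported focus distances. The sketch below is illustrative only; the function and the way distances are obtained from the autofocus system are assumptions, not the camera's actual interface.

```python
def range_from_tagged_points(tagged_distances_m):
    """Derive the near and far limits (in metres) of the desired in-focus
    range from the focus distances reported by the autofocus system for the
    points of interest tagged by the user."""
    if len(tagged_distances_m) < 2:
        raise ValueError("tag at least two points of interest")
    return min(tagged_distances_m), max(tagged_distances_m)

# e.g. spot-metered distances of 1.2 m and 4.8 m give the range (1.2, 4.8)
```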
- the camera acquires a stack of images at spaced intervals sufficient to provide in-focus coverage across the desired range, wherein the processor of the embedded system is programmed to control the process of acquiring an image stack for creation of one or more EDOF images.
- the embedded system 116 includes at least one processor. Additionally, the embedded system includes a read-only memory having stored therein one or more firmware modules for programming the at least one processor to execute the steps of a process for producing EDOF images as described herein.
- the embedded system may be composed of an ASIC (application specific integrated circuit) or one or more CMOSs (complementary metal oxide semiconductors).
- the logic component of the embedded system may be a microcontroller programmed as below, or an FPGA (field programmable gate array) having logic circuits for performing the steps of the procedure below.
- hyperfocal distance is the closest distance at which the lens can be focused while keeping objects at infinity acceptably sharp.
- the maximum permissible circle of confusion is the largest circle of confusion considered to provide acceptably sharp focus.
- the focus distances s_i for the stack images are computed iteratively based on approximations for the near and far limits of the depth of field, D_n and D_f respectively:
- the corresponding near depth of field limit is found using Equation 2, namely:
- the procedure is repeated with the near depth of field limit applied as the far limit of the depth of field in the next iteration. Specifically,
- the result of the procedure is a stack of N images having abutting depths of field, covering the specified range of distances.
- the final image will be at a focus distance s_N, with a near depth of field limit D_N^n corresponding to the near end of the specified range of distances, r_n.
- the final image will provide in-focus content nearer than r n .
- An embodiment therefore preferably allows the user to specify whether he desires to have “at least” the specified range of distances in focus or “exactly” the specified range of distances in focus. Based on this specification 206 , the final focus distance may be adjusted. In the former case, no adjustments need be made to the final focus distance.
- the result is a set of focus distances for a stack of images covering at least the specified range of distances.
- Based on this set of focus distances, the camera then obtains 208 a stack of N images having abutting depths of field, covering at least the specified range of distances. In the latter case, the final focus distance is adjusted such that the near depth of field limit corresponds to the near end of the specified range of distances 210, that is,
- the result is a set of focus distances for a stack of images having abutting depths of field covering exactly the specified range of distances. Based on this set of focus distances, the camera then obtains 212 a stack of N images having abutting depths of field, covering exactly the specified range of distances.
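To make the iteration concrete, the following is a minimal Python sketch of the inward-iterating procedure described above, including the "at least"/"exactly" option. The function name, its signature, and the simplified depth of field approximations it uses (valid for focus distances much larger than the focal length) are illustrative assumptions; the patent's Equations 1 through 3 may differ in form.

```python
import math

def plan_focus_stack(r_near, r_far, f, N, c, exact=False):
    """Return focus distances (metres) for a stack whose abutting depths of
    field cover the range [r_near, r_far], iterating inward from the far end.

    Simplified approximations used here (all distances in metres):
        H   = f**2 / (N * c) + f        # hyperfocal distance
        D_n = H * s / (H + s)           # near depth of field limit
        D_f = H * s / (H - s)           # far limit; infinite once s >= H
    """
    H = f ** 2 / (N * c) + f
    focus_distances = []
    far_target = r_far                   # far DOF limit required of the next image

    while True:
        if math.isinf(far_target) or far_target >= H:
            s = H                        # focusing at H pushes the far limit to infinity
        else:
            s = H * far_target / (H + far_target)   # solve D_f(s) = far_target
        d_near = H * s / (H + s)
        focus_distances.append(s)
        if d_near <= r_near:
            break
        far_target = d_near              # the next depth of field abuts this one

    if exact and r_near < H:
        # "Exactly" option: re-place the final image so its near limit lands
        # precisely on r_near (solve D_n(s) = r_near).
        focus_distances[-1] = H * r_near / (H - r_near)

    return focus_distances

# Example: cover 1 m to 5 m with a 35 mm lens at f/4 and c = 6 micrometres.
# plan_focus_stack(1.0, 5.0, f=0.035, N=4, c=6e-6)
```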
- the images within the image stack may be acquired after each focus distance is calculated.
- the adjustment step at block 210 may be performed by acquiring an additional image to replace the final image of the stack already acquired.
- the camera images using a CCD sensor characterized by a circle of confusion of c = 6 μm.
- the resulting hyperfocal distance is H = 51.04 m.
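The focal length and f-number for this example are not shown in the excerpt above. Assuming, purely for illustration, a 35 mm lens set to f/4, the quoted figure is consistent with the usual hyperfocal approximation (the additional focal-length term is negligible at this scale): H ≈ f^2/(N·c) = (0.035 m)^2 / (4 × 6×10^-6 m) ≈ 51.04 m.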
- the camera obtains 308 a stack of images covering at least the specified range. If the user specifies that the range must be exact 306, the focus distance of the final image is adjusted 310 so that the far depth of field limit corresponds to the far end of the specified range of distances, and the camera obtains 312 a stack of images covering exactly the specified range.
- the images within the image stack may be acquired after each focus distance is calculated, and the adjustment step at block 310 may be performed by acquiring an additional image to replace the final image of the stack already acquired.
- both the number and spacing of the stack images are dependent on the f-number N.
- the f-number is determined automatically by the camera using existing methods that consider, for example, the capabilities of the lens and the light level.
- the invention can also be used in conjunction with an “aperture priority” mode in which the user may specify a specific f-number at which the stack images are to be acquired. For example, a user willing to tolerate reduced sharpness in the composite image could reduce the number of stack images required by specifying a relatively small f-number.
- the system may also allow the user to specify specific f-numbers for the individual stack images, thus allowing customization of the stack image spacing.
- the number and spacing of stack images is dependent on the maximum permissible circle of confusion c.
- the value of c is selected automatically by the camera, based on the inherent limitations (i.e., the optical quality) of the lens, the capabilities (i.e., the resolution) of the imaging format (e.g. 35 mm film, CCD), and the amount of diffraction associated with the chosen f-number.
- the circle of confusion may be specified by the user as desired. For example, a user willing to tolerate reduced sharpness in the composite image in order to reduce the number of stack images required could specify a relatively large maximum permissible circle of confusion.
- the user can specify an "overlap fraction," α, of adjacent depths of field in the image stack.
- hyperfocal distance and focus distances are calculated.
- the user specifies an overlap fraction.
- the procedures described herein above are modified such that the focus distance of each image within the image stack is adjusted to place one depth of field limit within the previous depth of field. For example, in the inwardly-iterating procedure, the focus distance (Equation 6) is adjusted to
- a still further embodiment extends to multiple ranges of distance that the user wishes to have in focus.
- each range can be handled separately, as described above.
- the user can specify one or more ranges of distances that he wishes to be out of focus.
- the out of focus ranges can be converted to one or more complementary in focus ranges, each handled as described above.
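A minimal sketch of that conversion, assuming the overall near and far bounds of interest are given; the function and its argument names are illustrative, not taken from the patent.

```python
def complementary_in_focus_ranges(out_of_focus, overall_near, overall_far):
    """Given ranges (near, far) the user wants out of focus, return the
    complementary in-focus ranges within [overall_near, overall_far] that the
    image stack must cover.  All distances in metres."""
    in_focus = []
    cursor = overall_near
    for near, far in sorted(out_of_focus):
        if near > cursor:
            in_focus.append((cursor, near))   # region before this blur zone stays sharp
        cursor = max(cursor, far)
    if cursor < overall_far:
        in_focus.append((cursor, overall_far))
    return in_focus

# Example: keep 0.5-10 m sharp except 2-3 m:
# complementary_in_focus_ranges([(2.0, 3.0)], 0.5, 10.0) -> [(0.5, 2.0), (3.0, 10.0)]
```

Each returned range could then be planned and acquired separately, as described above.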
- the camera automatically acquires a stack of images providing coverage from the near field to infinity at an extremely small f-number. Because each image within the stack offers a limited depth of field, it is then possible for the photographer to select, in post-processing software, what distances he would like to have in focus and what regions he would like to have out of focus. Based on the metadata saved by the camera for each image within the stack, the post-processing software assembles the composite image from the appropriate images within the stack. This approach does require the capture and storage of more stack images than may ultimately be utilized, but it does provide the photographer with greater artistic freedom later in the production process.
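As an illustration of how post-processing software might use that metadata, the hypothetical function below (not from the patent) returns the indices of the stack images whose recorded depth of field overlaps a range the photographer later chooses to render sharp; the metadata field names match the metadata sketch given further below.

```python
def select_images_for_range(stack_metadata, want_near, want_far):
    """stack_metadata: one dict per stack image, carrying the near and far
    depth of field limits (metres) recorded at capture time.  Returns the
    indices of the images needed to keep [want_near, want_far] in focus."""
    return [
        i for i, meta in enumerate(stack_metadata)
        if meta["dof_far_m"] >= want_near and meta["dof_near_m"] <= want_far
    ]
```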
- Equation 1 is only one possible approximation of the hyperfocal distance
- Equations 2 and 3 are only two possible approximations of the limits of the depth of field.
- these approximations are valid only for relatively large focus distances, for which s ≫ f.
- the depth of field may be calculated by any number of well known alternative approximations, and different approximations may be used at different focus distances—in other words, different images within the stack.
- the above procedures are preferably invoked by the user via selection of an “auto-EDOF mode”, analogous to “shutter priority” or “program” modes found on many cameras.
- the user may select the mode on a rotary dial atop the camera.
- the stack of images is preferably acquired as rapidly as possible in response to a single press of the shutter button.
- SLR single lens reflex
- the reflex mirror may be held in a retracted position while the images are acquired.
- the images may be written to a temporary memory cache offering faster write times than the permanent memory, enabling faster image acquisition.
- the invention may be combined with any of several well-known techniques for digital image stabilization that are capable of backing out any motion of the subject or camera that occurs between successive images within the stack; for example the “image stack” feature within PHOTOSHOP (ADOBE SYSTEMS, INC., San Jose, Calif.) or such image-processing software as HELICON FOCUS (HELICON SOFTWARE, LTD., Kharkov, Ukraine).
- PHOTOSHOP ADOBE SYSTEMS, INC., San Jose, Calif.
- HELICON FOCUS HELICON SOFTWARE, LTD., Kharkov, Ukraine
- the stack of images is combined into a single composite image during post-acquisition processing.
- the processing is performed off-board the camera by a computational device programmed with an image processing software application for creating extended depth of field images.
- examples of such an image processing software application are COMBINEZ5 and COMBINEZM, both open-source programs obtainable on the Internet, and both originally developed by Alan Hadley, a resident of the United Kingdom.
- the camera saves metadata with each image in the stack, preferably indicating the inner and outer boundaries of the depth of field for the image and the specified range of distances.
- the camera stores information from which these quantities can be determined, for example the f-number, the lens focal length, and the maximum permissible circle of confusion.
- the information is preferably stored in a standardized set of metadata tags, such as those within the exchangeable image file format (EXIF).
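The specific EXIF tags are not enumerated in this excerpt. Purely as an illustration of the kind of per-image record described above, a sketch follows; the field names are invented for the example and are not standardized EXIF tags.

```python
def stack_image_metadata(focus_distance, dof_near, dof_far,
                         f_number, focal_length, max_coc,
                         range_near, range_far):
    """Assemble the per-image metadata saved alongside each stack image: the
    image's depth of field boundaries, the user-specified range, and the
    quantities from which those boundaries can be recomputed."""
    return {
        "focus_distance_m": focus_distance,
        "dof_near_m": dof_near,
        "dof_far_m": dof_far,
        "f_number": f_number,
        "focal_length_m": focal_length,
        "max_circle_of_confusion_m": max_coc,
        "edof_range_near_m": range_near,
        "edof_range_far_m": range_far,
    }
```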
- the images may be combined into a single composite image onboard the camera.
- the stack images may be deleted after composition to increase available memory space.
- the methods and systems herein described provide a large number of unexpected benefits which render them a great advance over the conventional manner of producing extended depth of field images.
- the foremost advantage provided by present methods and systems is that they provide an integrated solution to the challenge of producing extended depth of field (EDOF) images.
- EDOF extended depth of field
- practitioners in this art must first acquire the images largely manually. The practitioner determines the depth of field boundaries and the corresponding depths of field at which it is necessary to acquire images in order to cover the desired range of distances. After acquiring the image stack, the practitioner must then export the image stack to a third-party software application with which the composite, extended depth of field image is created.
- the present solution greatly simplifies the step of defining the focus range for the EDOF image, replacing the cumbersome manual procedure now generally used with a simple, intuitive procedure in which the user defines the focus range by tagging successive focus points using the autofocus feature of a digital camera.
- the present solution also greatly simplifies the acquisition of the image stack.
- the system intelligently calculates the required depths of field to cover the specified range of distances and automatically acquires the image stack, with little or no additional user input.
- One embodiment integrates the production of the EDOF image, eliminating the need for yet another software application to produce the image, and greatly reducing the storage and transfer bandwidth requirements involved in EDOF image production.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Computing Systems (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Human Computer Interaction (AREA)
- Studio Devices (AREA)
- Exposure Control For Cameras (AREA)
Abstract
Description
The hyperfocal distance is calculated based on the f-number, N, the lens focal length, f, and the maximum permissible circle of confusion, c. The practitioner of ordinary skill will appreciate that, in the fields of optics and photography, hyperfocal distance is the closest distance at which the lens can be focused while keeping objects at infinity acceptably sharp. The ordinarily-skilled practitioner will understand that the maximum permissible circle of confusion is the largest circle of confusion considered to provide acceptably sharp focus.
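Equation 1 itself is not reproduced in this excerpt. The standard expression for the hyperfocal distance in terms of these three quantities, which is presumably the form intended, is H = f^2/(N·c) + f.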
with D_0^n = r_f. Substituting, the procedure can be summarized as
with termination upon
and optional adjustment of the final image to
as previously described with respect to
where the far depth of field limit is within the previous depth of field as specified by α, namely,
D_i^f = D_{i-1}^n + α(D_{i-1}^f − D_{i-1}^n).  (17)
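In terms of the planning loop sketched earlier, Equation 17 amounts to changing how the far limit required of the next image is chosen: rather than abutting it against the previous near limit, it is pulled back inside the previous depth of field by the fraction α. A minimal illustration (the helper name is an assumption, not from the patent):

```python
def next_far_target(d_prev_near, d_prev_far, alpha):
    """Far depth of field limit required of the next stack image when an
    overlap fraction alpha of the previous depth of field is desired;
    alpha = 0 reproduces the abutting (non-overlapping) case."""
    return d_prev_near + alpha * (d_prev_far - d_prev_near)
```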
Claims (18)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US16/205,047 US10554904B2 (en) | 2008-03-05 | 2018-11-29 | Automated extended depth of field imaging apparatus and method |
Applications Claiming Priority (6)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US3408808P | 2008-03-05 | 2008-03-05 | |
US12/398,034 US8154647B2 (en) | 2008-03-05 | 2009-03-04 | Automated extended depth of field imaging apparatus and method |
US13/420,434 US8913174B2 (en) | 2008-03-05 | 2012-03-14 | Automated extended depth of field imaging apparatus and method |
US14/570,594 US9313417B2 (en) | 2008-03-05 | 2014-12-15 | Automated extended depth of field imaging apparatus and method |
US15/095,597 US10154203B2 (en) | 2008-03-05 | 2016-04-11 | Automated extended depth of field imaging apparatus and method |
US16/205,047 US10554904B2 (en) | 2008-03-05 | 2018-11-29 | Automated extended depth of field imaging apparatus and method |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/095,597 Continuation US10154203B2 (en) | 2008-03-05 | 2016-04-11 | Automated extended depth of field imaging apparatus and method |
Publications (2)
Publication Number | Publication Date |
---|---|
US20190098197A1 US20190098197A1 (en) | 2019-03-28 |
US10554904B2 true US10554904B2 (en) | 2020-02-04 |
Family
ID=41053196
Family Applications (5)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/398,034 Active 2030-06-12 US8154647B2 (en) | 2008-03-05 | 2009-03-04 | Automated extended depth of field imaging apparatus and method |
US13/420,434 Active 2029-09-25 US8913174B2 (en) | 2008-03-05 | 2012-03-14 | Automated extended depth of field imaging apparatus and method |
US14/570,594 Active US9313417B2 (en) | 2008-03-05 | 2014-12-15 | Automated extended depth of field imaging apparatus and method |
US15/095,597 Active 2029-06-12 US10154203B2 (en) | 2008-03-05 | 2016-04-11 | Automated extended depth of field imaging apparatus and method |
US16/205,047 Active US10554904B2 (en) | 2008-03-05 | 2018-11-29 | Automated extended depth of field imaging apparatus and method |
Family Applications Before (4)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/398,034 Active 2030-06-12 US8154647B2 (en) | 2008-03-05 | 2009-03-04 | Automated extended depth of field imaging apparatus and method |
US13/420,434 Active 2029-09-25 US8913174B2 (en) | 2008-03-05 | 2012-03-14 | Automated extended depth of field imaging apparatus and method |
US14/570,594 Active US9313417B2 (en) | 2008-03-05 | 2014-12-15 | Automated extended depth of field imaging apparatus and method |
US15/095,597 Active 2029-06-12 US10154203B2 (en) | 2008-03-05 | 2016-04-11 | Automated extended depth of field imaging apparatus and method |
Country Status (1)
Country | Link |
---|---|
US (5) | US8154647B2 (en) |
Families Citing this family (31)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP2026567B1 (en) * | 2007-07-31 | 2010-10-06 | Ricoh Company, Ltd. | Imaging device and imaging method |
JP5048614B2 (en) * | 2008-09-03 | 2012-10-17 | 富士フイルム株式会社 | Imaging apparatus and method |
JP5497386B2 (en) * | 2009-09-11 | 2014-05-21 | 浜松ホトニクス株式会社 | Image acquisition device |
US8798388B2 (en) * | 2009-12-03 | 2014-08-05 | Qualcomm Incorporated | Digital image combining to produce optical effects |
CN102262331B (en) * | 2010-05-25 | 2014-10-15 | 鸿富锦精密工业(深圳)有限公司 | Image acquisition module and image acquisition method thereof |
US10200671B2 (en) * | 2010-12-27 | 2019-02-05 | 3Dmedia Corporation | Primary and auxiliary image capture devices for image processing and related methods |
JP5882898B2 (en) * | 2011-03-14 | 2016-03-09 | パナソニック株式会社 | Imaging apparatus, imaging method, integrated circuit, computer program |
US8810712B2 (en) * | 2012-01-20 | 2014-08-19 | Htc Corporation | Camera system and auto focus method |
US8830380B2 (en) | 2012-06-28 | 2014-09-09 | International Business Machines Corporation | Depth of focus in digital imaging systems |
US9488819B2 (en) * | 2012-08-31 | 2016-11-08 | Nanotronics Imaging, Inc. | Automatic microscopic focus system and method for analysis of transparent or low contrast specimens |
US8983176B2 (en) | 2013-01-02 | 2015-03-17 | International Business Machines Corporation | Image selection and masking using imported depth information |
KR102022892B1 (en) * | 2013-01-23 | 2019-11-04 | 삼성전자 주식회사 | Apparatus and method for processing image of mobile terminal comprising camera |
US9832390B2 (en) * | 2014-02-20 | 2017-11-28 | Sharp Kabushiki Kaisha | Image capturing device |
US9196027B2 (en) | 2014-03-31 | 2015-11-24 | International Business Machines Corporation | Automatic focus stacking of captured images |
US9449234B2 (en) | 2014-03-31 | 2016-09-20 | International Business Machines Corporation | Displaying relative motion of objects in an image |
US9538065B2 (en) | 2014-04-03 | 2017-01-03 | Qualcomm Incorporated | System and method for multi-focus imaging |
US9300857B2 (en) | 2014-04-09 | 2016-03-29 | International Business Machines Corporation | Real-time sharpening of raw digital images |
JP6317635B2 (en) * | 2014-06-30 | 2018-04-25 | 株式会社東芝 | Image processing apparatus, image processing method, and image processing program |
US9749532B1 (en) * | 2014-08-12 | 2017-08-29 | Amazon Technologies, Inc. | Pixel readout of a charge coupled device having a variable aperture |
US10542204B2 (en) | 2015-08-05 | 2020-01-21 | Microsoft Technology Licensing, Llc | Methods and apparatuses for capturing multiple digital image frames |
JP6838994B2 (en) * | 2017-02-22 | 2021-03-03 | キヤノン株式会社 | Imaging device, control method and program of imaging device |
CN106937056B (en) * | 2017-03-31 | 2020-10-09 | 努比亚技术有限公司 | Focusing processing method and device for double cameras and mobile terminal |
JP6891071B2 (en) * | 2017-08-07 | 2021-06-18 | キヤノン株式会社 | Information processing equipment, imaging system, imaging method and program |
US10247910B1 (en) | 2018-03-14 | 2019-04-02 | Nanotronics Imaging, Inc. | Systems, devices and methods for automatic microscopic focus |
US10146041B1 (en) | 2018-05-01 | 2018-12-04 | Nanotronics Imaging, Inc. | Systems, devices and methods for automatic microscope focus |
CN112930676A (en) * | 2018-11-06 | 2021-06-08 | 奥林巴斯株式会社 | Imaging device, endoscope device, and method for operating imaging device |
WO2020205003A1 (en) | 2019-04-01 | 2020-10-08 | Google Llc | Techniques to capture and edit dynamic depth images |
US10984513B1 (en) * | 2019-09-30 | 2021-04-20 | Google Llc | Automatic generation of all-in-focus images with a mobile camera |
CN113841376B (en) * | 2020-09-22 | 2023-05-16 | 深圳市大疆创新科技有限公司 | Shooting control method and device |
CN112822402B (en) * | 2021-01-08 | 2023-04-18 | 重庆创通联智物联网有限公司 | Image shooting method and device, electronic equipment and readable storage medium |
US11570351B2 (en) * | 2021-06-30 | 2023-01-31 | Zebra Technologies Corporation | Method of detecting and correcting focus drift of variable focus lens for fixed focus applications |
Citations (16)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4764787A (en) | 1986-01-21 | 1988-08-16 | Minolta Camera Kabushiki Kaisha | Automatic focus adjusting device |
US20010002216A1 (en) * | 1999-11-30 | 2001-05-31 | Dynacolor, Inc. | Imaging method and apparatus for generating a combined output image having image components taken at different focusing distances |
US6320979B1 (en) | 1998-10-06 | 2001-11-20 | Canon Kabushiki Kaisha | Depth of field enhancement |
US20030023443A1 (en) | 2001-07-03 | 2003-01-30 | Utaha Shizuka | Information processing apparatus and method, recording medium, and program |
US20030117511A1 (en) | 2001-12-21 | 2003-06-26 | Eastman Kodak Company | Method and camera system for blurring portions of a verification image to show out of focus areas in a captured archival image |
US20040145808A1 (en) * | 1995-02-03 | 2004-07-29 | Cathey Wade Thomas | Extended depth of field optical systems |
US20050179808A1 (en) * | 2004-02-02 | 2005-08-18 | Lite-On Technology Corporation | Image capturing device and method with negative out of focus module |
US20060197005A1 (en) | 2004-12-10 | 2006-09-07 | Li Jiang | System and method for automatic focusing of images |
US20060245752A1 (en) | 2005-04-27 | 2006-11-02 | Koji Kawaguchi | Diaphragm device, lens assembly including same and surveillance camera |
US20070212056A1 (en) | 2006-03-08 | 2007-09-13 | Hiroshi Nagata | Single-lens reflex camera |
US20070248330A1 (en) * | 2006-04-06 | 2007-10-25 | Pillman Bruce H | Varying camera self-determination based on subject motion |
US7307653B2 (en) | 2001-10-19 | 2007-12-11 | Nokia Corporation | Image stabilizer for a microcamera module of a handheld device, and method for stabilizing a microcamera module of a handheld device |
US20080013941A1 (en) | 2006-07-14 | 2008-01-17 | Micron Technology, Inc. | Method and apparatus for increasing depth of field for an imager |
US20090167923A1 (en) * | 2007-12-27 | 2009-07-02 | Ati Technologies Ulc | Method and apparatus with depth map generation |
US7565074B2 (en) | 2006-05-31 | 2009-07-21 | Hoya Corporation | Camera having a focus adjusting system |
US7656460B2 (en) | 2007-08-21 | 2010-02-02 | Sony Ericsson Mobile Communications Ab | Autofocus assembly that adjusts a lens in the optical axis direction by alignment of holes in a spacing ring that receive ball bearings |
-
2009
- 2009-03-04 US US12/398,034 patent/US8154647B2/en active Active
-
2012
- 2012-03-14 US US13/420,434 patent/US8913174B2/en active Active
-
2014
- 2014-12-15 US US14/570,594 patent/US9313417B2/en active Active
-
2016
- 2016-04-11 US US15/095,597 patent/US10154203B2/en active Active
-
2018
- 2018-11-29 US US16/205,047 patent/US10554904B2/en active Active
Patent Citations (16)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4764787A (en) | 1986-01-21 | 1988-08-16 | Minolta Camera Kabushiki Kaisha | Automatic focus adjusting device |
US20040145808A1 (en) * | 1995-02-03 | 2004-07-29 | Cathey Wade Thomas | Extended depth of field optical systems |
US6320979B1 (en) | 1998-10-06 | 2001-11-20 | Canon Kabushiki Kaisha | Depth of field enhancement |
US20010002216A1 (en) * | 1999-11-30 | 2001-05-31 | Dynacolor, Inc. | Imaging method and apparatus for generating a combined output image having image components taken at different focusing distances |
US20030023443A1 (en) | 2001-07-03 | 2003-01-30 | Utaha Shizuka | Information processing apparatus and method, recording medium, and program |
US7307653B2 (en) | 2001-10-19 | 2007-12-11 | Nokia Corporation | Image stabilizer for a microcamera module of a handheld device, and method for stabilizing a microcamera module of a handheld device |
US20030117511A1 (en) | 2001-12-21 | 2003-06-26 | Eastman Kodak Company | Method and camera system for blurring portions of a verification image to show out of focus areas in a captured archival image |
US20050179808A1 (en) * | 2004-02-02 | 2005-08-18 | Lite-On Technology Corporation | Image capturing device and method with negative out of focus module |
US20060197005A1 (en) | 2004-12-10 | 2006-09-07 | Li Jiang | System and method for automatic focusing of images |
US20060245752A1 (en) | 2005-04-27 | 2006-11-02 | Koji Kawaguchi | Diaphragm device, lens assembly including same and surveillance camera |
US20070212056A1 (en) | 2006-03-08 | 2007-09-13 | Hiroshi Nagata | Single-lens reflex camera |
US20070248330A1 (en) * | 2006-04-06 | 2007-10-25 | Pillman Bruce H | Varying camera self-determination based on subject motion |
US7565074B2 (en) | 2006-05-31 | 2009-07-21 | Hoya Corporation | Camera having a focus adjusting system |
US20080013941A1 (en) | 2006-07-14 | 2008-01-17 | Micron Technology, Inc. | Method and apparatus for increasing depth of field for an imager |
US7656460B2 (en) | 2007-08-21 | 2010-02-02 | Sony Ericsson Mobile Communications Ab | Autofocus assembly that adjusts a lens in the optical axis direction by alignment of holes in a spacing ring that receive ball bearings |
US20090167923A1 (en) * | 2007-12-27 | 2009-07-02 | Ati Technologies Ulc | Method and apparatus with depth map generation |
Non-Patent Citations (4)
Title |
---|
"Close-up Photography and Photomacrography", Kodak; Kodak Publication N-12., 1977, 78. |
Hadley, A., "CombineZ5", retrieved online from website: http://www.hadleyweb.pwp.blueyonder.co.uk/CZ5/combinez5.htm, Jun. 2005. |
Littlefield, R., "Extended Depth of Field Photography of Insects et al", retrieved online from website: http://www.janrik.net/insects/ExtendedDOF, Aug. 7, 2005. |
Wikipedia, "Depth of Field", Publication date not provided; viewed online Feb. 2008; retrieved from website http://en.wikipedia.org/wiki/Depth_of_field. |
Also Published As
Publication number | Publication date |
---|---|
US20150097987A1 (en) | 2015-04-09 |
US20190098197A1 (en) | 2019-03-28 |
US20120169849A1 (en) | 2012-07-05 |
US9313417B2 (en) | 2016-04-12 |
US8913174B2 (en) | 2014-12-16 |
US20090225199A1 (en) | 2009-09-10 |
US8154647B2 (en) | 2012-04-10 |
US20160227094A1 (en) | 2016-08-04 |
US10154203B2 (en) | 2018-12-11 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10554904B2 (en) | Automated extended depth of field imaging apparatus and method | |
US9215381B2 (en) | Image blurring method and apparatus, and electronic devices | |
KR101756839B1 (en) | Digital photographing apparatus and control method thereof | |
US7711259B2 (en) | Method and apparatus for increasing depth of field for an imager | |
CN103780840B (en) | Two camera shooting image forming apparatus of a kind of high-quality imaging and method thereof | |
US9781334B2 (en) | Control method, camera device and electronic equipment | |
US9049363B2 (en) | Digital photographing apparatus, method of controlling the same, and computer-readable storage medium | |
US8466989B2 (en) | Camera having image correction function, apparatus and image correction method | |
US8937677B2 (en) | Digital photographing apparatus, method of controlling the same, and computer-readable medium | |
KR102336447B1 (en) | Image capturing apparatus and method for the same | |
WO2017045558A1 (en) | Depth-of-field adjustment method and apparatus, and terminal | |
US10175451B2 (en) | Imaging apparatus and focus adjustment method | |
US9307137B2 (en) | Imaging apparatus and imaging method which perform focus adjustment while performing live-view display | |
US7920180B2 (en) | Imaging device with burst zoom mode | |
CN112019734B (en) | Image acquisition method and device, electronic equipment and computer readable storage medium | |
US10972660B2 (en) | Imaging device and imaging method | |
JP2003322789A (en) | Focusing device, camera, and focusing position detecting method | |
JP6218385B2 (en) | Interchangeable lens and imaging device | |
JP2016032180A (en) | Imaging apparatus, control method and program | |
JP2006033519A (en) | Image pickup device | |
JP4356585B2 (en) | Digital camera | |
JP2005136654A (en) | Camera | |
JP2022013076A (en) | Imaging apparatus, control method, and program | |
JP2013090240A (en) | Imaging device and control program of the same | |
JP2007114414A (en) | Imaging apparatus |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
FEPP | Fee payment procedure |
Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY |
|
FEPP | Fee payment procedure |
Free format text: ENTITY STATUS SET TO SMALL (ORIGINAL EVENT CODE: SMAL); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY |
|
AS | Assignment |
Owner name: APPLIED MINDS, LLC, CALIFORNIA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:FERREN, BRAN;REEL/FRAME:047922/0419 Effective date: 20150123 |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT VERIFIED |
|
STCF | Information on status: patent grant |
Free format text: PATENTED CASE |
|
MAFP | Maintenance fee payment |
Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YR, SMALL ENTITY (ORIGINAL EVENT CODE: M2551); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY Year of fee payment: 4 |