CN102713513B - Imaging device, imaging method, program, and integrated circuit
- Publication number: CN102713513B (application CN201180006620A)
- Authority: CN (China)
- Legal status (assumed, not a legal conclusion): Expired - Fee Related
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C3/00—Measuring distances in line of sight; Optical rangefinders
- G01C3/02—Details
- G01C3/06—Use of electric means to obtain final indication
- G01C3/08—Use of electric radiation detectors
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01B—MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
- G01B11/00—Measuring arrangements characterised by the use of optical techniques
- G01B11/02—Measuring arrangements characterised by the use of optical techniques for measuring length, width or thickness
- G01B11/026—Measuring arrangements characterised by the use of optical techniques for measuring length, width or thickness by measuring distance between sensor and object
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B7/00—Mountings, adjusting means, or light-tight connections, for optical elements
- G02B7/28—Systems for automatic generation of focusing signals
- G02B7/36—Systems for automatic generation of focusing signals using image sharpness techniques, e.g. image processing techniques for generating autofocus signals
- G02B7/365—Systems for automatic generation of focusing signals using image sharpness techniques, e.g. image processing techniques for generating autofocus signals by analysis of the spatial frequency components of the image
Description
Technical Field
The present invention relates to an imaging device that measures the depth of a scene using a plurality of images captured from a single viewpoint.
Background Art
Conventionally, various methods have been proposed for measuring, without contact, the depth of a three-dimensional scene, that is, the distance from an imaging device to each subject (hereinafter, "subject distance"). These can be roughly divided into active methods, which irradiate infrared light, ultrasound, a laser, or the like and compute the subject distance from the time until the reflected wave returns, the angle of the reflected wave, and so on, and passive methods, which compute the subject distance from the image of the subject. In particular, for imaging devices such as cameras, passive methods, which require no device for irradiating infrared light or the like, are widely used.
Many passive methods have been proposed; one of them is called Depth from Defocus (hereinafter, "DFD"). In DFD, the subject distance is measured from blur whose size and shape change with the subject distance. DFD has the advantages of requiring no multiple cameras and of being able to measure distance from a small number of images.
The principle of DFD is briefly described below.
A captured image containing blur (hereinafter, "blurred image") can be regarded as an all-in-focus image, representing a state with no lens blur, convolved with a point spread function (PSF: Point Spread Function) that is a function of the subject distance. Since the point spread function is a function of the subject distance, DFD can obtain the subject distance by detecting the blur in a blurred image. At this point, however, both the all-in-focus image and the subject distance are unknowns. One blurred image yields one equation relating the blurred image, the all-in-focus image, and the subject distance; therefore, a further blurred image with a different focus position is captured to obtain a new equation. That is, multiple such equations are obtained for multiple blurred images with different focus positions, and the subject distance is computed by solving them. Various DFD techniques, including those of Patent Literature 1 and Non-Patent Literature 1, have been proposed for how the equations are obtained and solved.
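The forward model and the equation-solving step above can be sketched numerically. The following toy (the Gaussian blur model, the linear depth-to-blur mapping, and all names are illustrative assumptions, not the patent's algorithm) blurs a known 1-D all-in-focus signal with a depth-dependent PSF at two focus positions, then recovers the depth by searching for the candidate whose PSF pair best reproduces both observations; the second focus position is what makes the answer unique.

```python
import numpy as np

def gaussian_psf(sigma, radius=8):
    """Discrete 1-D Gaussian PSF; sigma stands in for depth-dependent blur."""
    x = np.arange(-radius, radius + 1)
    h = np.exp(-x**2 / (2 * sigma**2))
    return h / h.sum()

def blur(img, sigma):
    """Blurred image = all-in-focus image convolved with the PSF."""
    return np.convolve(img, gaussian_psf(sigma), mode="same")

def sigma_at(depth, focus):
    """Toy blur-radius model: blur grows with distance from the focus plane."""
    return 0.5 + abs(depth - focus)

rng = np.random.default_rng(0)
scene = rng.random(256)            # stand-in for the all-in-focus image
true_depth = 3.0
f1, f2 = 1.0, 5.0                  # two different focus positions
obs1 = blur(scene, sigma_at(true_depth, f1))
obs2 = blur(scene, sigma_at(true_depth, f2))

# Depth from Defocus: pick the depth whose PSF pair explains both images.
candidates = np.linspace(0.0, 6.0, 61)
errors = [np.sum((blur(scene, sigma_at(d, f1)) - obs1) ** 2)
          + np.sum((blur(scene, sigma_at(d, f2)) - obs2) ** 2)
          for d in candidates]
estimated = candidates[int(np.argmin(errors))]
```

Here the all-in-focus `scene` is treated as known, which matches the reference-image approach described later in this document; in the general DFD formulation it is one of the unknowns eliminated by the extra equations.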
DFD is thus a method of obtaining the subject distance by applying a point spread function to the blur contained in a blurred image. A problem with DFD, however, is that the shapes of the point spread functions just in front of and just behind the image point corresponding to the subject distance are similar. Under the influence of noise contained in the image it therefore becomes unclear, and hence difficult to determine, whether a given blur arose from the point spread function behind the image point or from the one in front of it.
To address this problem, the accuracy of subject-distance estimation can be improved by, for example, further increasing the number of images with different focus positions. Alternatively, as in Non-Patent Literature 1, using an aperture whose overall shape is not point-symmetric resolves the ambiguity between the point spread function shapes in front of and behind the image point corresponding to the subject distance.
(Prior Art Literature)
(Patent Literature)
Patent Literature 1: Japanese Patent No. 2963990
(Non-Patent Literature)
Non-Patent Literature 1: C. Zhou, S. Lin and S. Nayar, "Coded Aperture Pairs for Depth from Defocus", International Conference on Computer Vision, 2009
Non-Patent Literature 2: H. Nagahara, S. Kuthirummal, C. Zhou and S. K. Nayar, "Flexible Depth of Field Photography", European Conference on Computer Vision, 2008
Summary of the Invention
Problems to Be Solved by the Invention
Two solutions to the ambiguity in distinguishing the point spread function shapes in front of and behind the image point corresponding to the subject distance were given above: increasing the number of images with different focus positions, and using an aperture whose overall shape is not point-symmetric. The former, however, has the problem that capture time increases with the number of images. The latter has the problem that, because part of the aperture is blocked, the amount of light decreases and the accuracy of subject-distance estimation falls.
Disclosure of the Invention
In view of the above, an object of the present invention is to provide an imaging device that resolves the ambiguity in distinguishing the point spread function shapes in front of and behind the image point corresponding to the subject distance without reducing the amount of light at exposure, and that estimates the subject distance from a small number of captured images.
Means for Solving the Problems
To achieve the above object, an imaging device according to one aspect of the present invention includes: an imaging element that captures an image; an optical system for forming a subject image on the imaging element; an optical element having a birefringence effect; and a distance measurement unit that measures the distance from the imaging element to the subject based on the captured image and on a point spread function that is changed by the optical element in front of and behind the image point corresponding to the subject distance.
With this configuration, because the optical element having a birefringence effect acts on the light, the point spread function shapes in front of and behind the image point corresponding to the subject distance can be made different. This resolves the ambiguity in distinguishing the point spread functions in front of and behind that image point, and the subject distance can be estimated from a small number of captured images. Moreover, unlike the method using a non-point-symmetric aperture, a birefringent substance need not block light, so a reduction in light amount can be suppressed.
Furthermore, unlike other optical elements, an optical element having a birefringence effect (in particular, when the birefringent substance is a parallel plate and the optical system is telecentric) mainly affects only astigmatism. Therefore, even if the point spread function shapes in front of and behind the image point corresponding to the subject distance are made different, the influence on other aberrations is small, and the optical system need not be redesigned. That is, the invention can be implemented merely by inserting the element into an existing device and adding a unit that processes the point spread function.
Here, it is preferable that the direction of the optic axis of the optical element is not parallel to the optical axis of the optical system.
Here, it is preferable that the optical element is disposed between the imaging element and the optical system on the optical axis of the optical system, and that the planes of the optical element that intersect the optical axis of the optical system are perpendicular to that optical axis.
Here, it is preferable that the imaging device further includes an optical element moving unit that inserts the optical element into, or retracts it from, the optical axis of the optical system, thereby enabling or disabling the birefringence effect on the optical axis, and that the distance measurement unit measures the distance from the imaging element to the subject using an image captured by the imaging element in a state without the birefringence effect of the optical element and an image captured in a state in which the optical element is positioned on the optical axis of the optical system.
Here, it is preferable that the optical element can enable or disable the birefringence effect electrically or magnetically, and that the distance measurement unit measures the distance to the subject using an image captured by the imaging element in a state without the birefringence effect of the optical element and an image captured in a state in which the optical element is positioned on the optical axis of the optical system.
Here, it is preferable that the imaging device further includes a reference image generation unit that generates a reference image from an image captured by the imaging element in a state without the birefringence effect of the optical element, and that the distance measurement unit estimates the point spread function using the image captured through the optical element and the reference image, and thereby measures the distance to the subject.
Here, it is preferable that the reference image generation unit generates, as the reference image, an all-in-focus image from images captured by the imaging element in a state without the birefringence effect of the optical element.
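One common way to build such an all-in-focus reference image from captures with different focus positions is focus stacking: for each pixel, keep the value from whichever capture is locally sharpest. The sketch below is a 1-D toy with a Laplacian-based sharpness measure; the method and all names are illustrative assumptions, not the patent's specified procedure.

```python
import numpy as np

def blur1d(signal, sigma, radius=6):
    """Gaussian blur of a 1-D signal (stand-in for defocus)."""
    x = np.arange(-radius, radius + 1)
    h = np.exp(-x**2 / (2 * sigma**2))
    return np.convolve(signal, h / h.sum(), mode="same")

def local_sharpness(signal, radius=2):
    """Sharpness proxy: locally summed magnitude of the Laplacian."""
    lap = np.abs(np.convolve(signal, [1.0, -2.0, 1.0], mode="same"))
    return np.convolve(lap, np.ones(2 * radius + 1), mode="same")

def focus_stack(images):
    """Per sample, keep the value from the sharpest capture."""
    stack = np.stack(images)
    sharp = np.stack([local_sharpness(im) for im in images])
    best = np.argmax(sharp, axis=0)
    return stack[best, np.arange(stack.shape[1])]

rng = np.random.default_rng(1)
scene = rng.random(128)                 # the unknown all-in-focus signal
near = scene.copy()
near[64:] = blur1d(scene, 3.0)[64:]     # right half out of focus
far = scene.copy()
far[:64] = blur1d(scene, 3.0)[:64]      # left half out of focus
reference = focus_stack([near, far])
```

The fused `reference` is closer to the true signal than either capture, which is what makes it usable as the blur-free side of the DFD equations.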
Here, it is preferable that the optical system has the optical property of image-side telecentricity.
Here, the imaging device may further include a beam splitting unit that splits light into a plurality of optical paths; there may be a plurality of imaging elements, each corresponding to one of the optical paths split by the beam splitting unit and capturing the subject; and the optical element may be disposed on at least one of the optical paths split by the beam splitting unit.
Furthermore, the present invention can be implemented not only as such an imaging device but also as an imaging method whose steps are the operations of the characteristic components of the imaging device. It can also be implemented as a program that causes a computer to execute the imaging method; such a program can be distributed via a storage medium such as a CD-ROM or a transmission medium such as the Internet. The present invention can also be implemented as an integrated circuit that performs the processing of each processing unit.
Effects of the Invention
With the imaging device according to the present invention, the subject distance can be obtained stably and with high accuracy by computing, from at least two images, the shape of the point spread function contained in the images.
Brief Description of the Drawings
FIG. 1 is a block diagram showing the configuration of an imaging device according to Embodiment 1 of the present invention.
FIG. 2 is a diagram showing how a light ray passes through a birefringent substance.
FIG. 3 is a diagram showing the arrangement of the components of the imaging device according to Embodiment 1 of the present invention.
FIG. 4 is a diagram showing how a birefringent substance changes the point spread function shapes in front of and behind the image point corresponding to the subject distance.
In FIG. 5, (a-1) shows the shape of the point spread function of the extraordinary ray at position (a) in FIG. 4 when a birefringent substance is used; (b-1) shows the point spread function of the extraordinary ray at position (b) in FIG. 4 when a birefringent substance is used; (a-2) shows the point spread function at position (a) in FIG. 4 when no birefringent substance is used; and (b-2) shows the point spread function at position (b) in FIG. 4 when no birefringent substance is used.
FIG. 6 is a diagram showing the point spread functions corresponding to different subject positions when a birefringent substance is used.
FIG. 7 is a diagram showing the point spread functions corresponding to different subject positions when no birefringent substance is used.
FIG. 8 is a diagram showing the shape of a cubic phase plate.
FIG. 9 is a diagram showing the flow of operations of the imaging device according to Embodiment 1 of the present invention.
FIG. 10 is a block diagram showing the configuration of an imaging device according to Embodiment 2 of the present invention.
FIG. 11 is a diagram showing the arrangement of the components of the imaging device according to Embodiment 2 of the present invention.
Description of Embodiments
Embodiments of the present invention are described below with reference to the drawings. Each embodiment described below shows a preferred specific example of the present invention. The components, their arrangement and connections, the steps, and the order of the steps shown in the following embodiments are examples and do not limit the present invention; the present invention is limited only by the claims. Accordingly, among the components of the following embodiments, those not recited in the independent claims, which represent the broadest concept of the present invention, are not necessarily required to solve the problems addressed by the present invention, but are described as components of more preferable forms.
(Embodiment 1)
FIG. 1 is a block diagram showing the configuration of the imaging device according to Embodiment 1 of the present invention.
The imaging device 10 includes an optical system 11, a birefringent substance 12, an actuator 13, a focus range control unit 14, an imaging element 15, an image acquisition unit 16, a reference image generation unit 17, and a distance measurement unit 18.
In FIG. 1, the optical system 11 forms a subject image on the imaging element 15. On the optical path between the imaging element 15 and the optical system 11, a birefringent substance 12 is provided as an optical element having a birefringence effect. In particular, it changes the shape of the point spread function of the extraordinary rays among the rays passing through it, making the point spread function shapes in front of and behind the image point corresponding to the subject distance different. The actuator 13 inserts the birefringent substance 12 into, and retracts it from, the optical path; because of this, the imaging device 10 can obtain both an image of the subject formed through the birefringent substance 12 and an image formed without it. The focus range control unit 14 moves at least one of the optical system 11 and the imaging element 15 to control the focus position and the depth of field; specifically, it performs this control by operating the optical system 11 in a specific mode, switching a specific optical element, or the like. The imaging element 15, composed of a CCD, a CMOS sensor, or the like, converts the light received on its imaging surface into an electrical signal for each pixel and outputs it. The image acquisition unit 16 obtains a plurality of images from the imaging element 15 and holds each image. The reference image generation unit 17 generates, from a plurality of images with different focus positions and depths of field obtained through the action of the focus range control unit 14, a reference image (an all-in-focus image) that estimates a state without blur caused by the optical system. The distance measurement unit 18 performs distance measurement by the DFD method using a blurred image focused at an arbitrary distance and the reference image obtained by the reference image generation unit 17.
Next, a method by which the birefringent substance 12 makes the point spread function shapes in front of and behind the image point corresponding to the subject distance different is described.
The birefringent substance 12 is an optically anisotropic substance and has the property of splitting a light ray entering it into an ordinary ray and an extraordinary ray according to the ray's polarization direction. Which ray is ordinary and which is extraordinary depends on the direction of the optic axis inherent to the birefringent substance 12: the ordinary ray has an electric field that vibrates perpendicular to the plane formed by the optic axis and the incident ray, and the extraordinary ray has an electric field that vibrates within that plane. The direction and number of optic axes differ with the type of substance; a substance with one optic axis is called uniaxial, and one with two, biaxial. In Embodiment 1, calcite, a uniaxial crystal, is used as the birefringent substance 12.
The difference between the ordinary ray and the extraordinary ray is that, when passing through the birefringent substance 12, the speed of the ordinary ray is constant regardless of the propagation direction, whereas the speed of the extraordinary ray varies with the propagation direction. Furthermore, the refractive index no for the ordinary ray differs from the refractive index ne for the extraordinary ray. Because no and ne differ, and because the speed of the extraordinary ray depends on the propagation direction, a difference in traveling direction arises between the ordinary and extraordinary rays when light enters the birefringent substance 12, as shown in FIG. 2; the incident ray is thus split inside the birefringent substance 12 into an ordinary ray and an extraordinary ray. In FIG. 2, light enters from the left, perpendicular to the surface of the birefringent substance 12.
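The direction dependence of the extraordinary ray's speed can be made concrete with the standard index-ellipsoid relation for a uniaxial crystal, 1/n(theta)^2 = cos^2(theta)/no^2 + sin^2(theta)/ne^2, where theta is the angle between the wave normal and the optic axis. The sketch below uses this textbook formula with handbook values for calcite near 589 nm; it is background optics, not a formula taken from the patent.

```python
import math

N_O, N_E = 1.658, 1.486   # calcite near 589 nm (handbook values)

def n_extraordinary(theta):
    """Effective refractive index seen by the extraordinary ray propagating
    at angle theta (radians) from the optic axis of a uniaxial crystal."""
    inv_sq = (math.cos(theta) ** 2) / N_O**2 + (math.sin(theta) ** 2) / N_E**2
    return 1.0 / math.sqrt(inv_sq)
```

Along the optic axis the extraordinary ray sees no (no birefringence), perpendicular to it ne, and in between an intermediate value; this angular variation is what bends the extraordinary ray differently in the y-z and x-z planes of FIG. 4.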
In the present invention, the extraordinary ray in particular is used to make the point spread function shapes in front of and behind the image point corresponding to the subject distance different.
As shown in FIG. 3, the positional relationship among the optical system 11, the birefringent substance 12, and the imaging element 15 is such that the birefringent substance 12 is disposed between the optical system 11 (lens) and the imaging element 15; that is, the three are arranged on the optical axis in the order optical system 11, birefringent substance 12, imaging element 15. The birefringent substance 12 is a parallel plate shaped and arranged such that all of its planes intersecting the optical axis are perpendicular to the optical axis ("perpendicular" here need not be strict). The birefringent substance 12 is uniaxial, and in FIG. 3 the direction of its optic axis is the y direction. The refractive indices for the ordinary and extraordinary rays satisfy no > ne. Although the positional relationship among the optical system 11, the birefringent substance 12, and the imaging element 15 has been described here, the birefringent substance 12 is preferably a parallel plate in order to obtain the effect of this arrangement: a parallel plate can have the property of mainly affecting only astigmatism. A parallel plate here means an element whose first surface, on the light-entry side, and second surface, on the light-exit side, are parallel to each other; the angles and shapes of surfaces other than the first and second surfaces are not limited.
With the configuration of FIG. 3, a difference arises in the point spread function shape of the extraordinary ray in front of and behind the image point corresponding to the subject distance: in front of the image point the shape is elongated in the y direction, and behind it the shape is elongated in the x direction. FIG. 4 shows the behavior of the ordinary and extraordinary rays in the y-z and x-z planes for the configuration of FIG. 3. Because the speed of the extraordinary ray depends on the direction of the optic axis and the propagation direction, in the y-z plane the extraordinary ray is refracted more strongly than the ordinary ray, and its image point corresponding to the subject distance lies farther away than that of the ordinary ray. In the x-z plane, by contrast, the refraction angle of the extraordinary ray is smaller than that of the ordinary ray, and its image point lies closer. Considering only the extraordinary rays, the image points for rays in the x-z plane and rays in the y-z plane therefore lie at different positions. Consequently, at position (a) in FIG. 4, in front of the image point corresponding to the subject distance, the blur in the y direction is larger than in the x direction, so the point spread function of the extraordinary ray is elongated in the y direction, as shown in FIG. 5(a-1). At position (b) in FIG. 4, behind the image point, the blur in the x direction is larger than in the y direction, so the shape is elongated in the x direction, as shown in FIG. 5(b-1). FIGS. 5(a-2) and (b-2) show the point spread functions of the ordinary ray at positions (a) and (b) of FIG. 4; these can also be regarded as the point spread functions of the rays when the birefringent substance 12 is absent. That is, it can be confirmed that, without the birefringent substance 12, the shapes in front of and behind the image point corresponding to the subject distance are similar (here, circular).
Because the point spread function shapes in front of and behind the image point corresponding to the subject distance differ, the ambiguity in distance discrimination is resolved, and the subject distance can be estimated uniquely. The effectiveness of using a birefringent substance for subject-distance estimation is explained below with reference to FIGS. 6 and 7. FIG. 6 shows the point spread functions corresponding to different subject positions when the birefringent substance 12 is used, and FIG. 7 shows them when no birefringent substance is used. Although the "subject distance" was defined above as the distance from the imaging device to the subject, it may instead be the distance from the optical system 11 to the subject, or the distance from the imaging element 15 to the subject.
When the birefringent substance 12 is present as in Fig. 6, consider the point spread functions corresponding to subject positions (a) and (b). Since the shapes of the point spread function differ in front of and behind the image point corresponding to the subject distance, the shape formed on the imaging element 15 for position (a) differs from that for position (b). The subject distance can therefore be estimated uniquely from the point spread function obtained by the imaging element 15. In Fig. 7, by contrast, there is no birefringent substance 12, so the point spread functions for positions (a) and (b) have similar shapes; under noise it becomes unclear whether an observed point spread function corresponds to position (a) or to position (b), and the subject distance is difficult to estimate uniquely. In other words, when the birefringent substance 12 is present as in Fig. 6, it is clear, unlike the case of Fig. 7 without the birefringent substance 12, on which side of the image point corresponding to the subject distance the point spread function obtained by the imaging element 15 lies, so the subject distance can easily be estimated uniquely, which is effective. The point spread functions in Figs. 5 to 7 were calculated with the optical simulation software "ZEMAX" (product name) from ZEMAX Development Corporation.
In practice, with the configuration of Fig. 3, the ordinary ray and the extraordinary ray are detected simultaneously. Nevertheless, because the extraordinary ray is included, the shapes of the point spread function still differ in front of and behind the image point corresponding to the subject distance.
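To make this front/back asymmetry concrete, the following Python sketch models the point spread functions as anisotropic Gaussians. This is an illustrative assumption only; the actual shapes in Figs. 5 to 7 come from optical simulation, and the function name is hypothetical. With birefringence, the PSFs in front of and behind the image point are elongated along different axes and therefore distinguishable; without it, they are circular and identical, leaving the distance ambiguous.

```python
import numpy as np

def gaussian_psf(size, sigma_x, sigma_y):
    """2-D Gaussian used as a stand-in for a point spread function."""
    ax = np.arange(size) - size // 2
    xx, yy = np.meshgrid(ax, ax)  # xx varies along columns (x), yy along rows (y)
    psf = np.exp(-(xx**2 / (2 * sigma_x**2) + yy**2 / (2 * sigma_y**2)))
    return psf / psf.sum()

# With the birefringent substance: blur elongated in y in front of the
# image point (Fig. 5(a-1)) and in x behind it (Fig. 5(b-1)).
front = gaussian_psf(21, sigma_x=1.0, sigma_y=3.0)
back = gaussian_psf(21, sigma_x=3.0, sigma_y=1.0)

# Without it: both PSFs are circular and identical (the ambiguity of Fig. 7).
front_o = gaussian_psf(21, 2.0, 2.0)
back_o = gaussian_psf(21, 2.0, 2.0)

assert not np.allclose(front, back)   # front/back distinguishable
assert np.allclose(front_o, back_o)   # front/back ambiguous
```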
Next, a method of obtaining the reference image is described.
The imaging device 10 of Embodiment 1 uses a reference image (an all-in-focus image) free of blur caused by the optical system 11. An image without blur from the optical system 11 can be regarded as an image with a deep depth of field. A deep depth of field is easily achieved by narrowing the aperture of the optical system, but with that method the amount of light received by the imaging element 15 decreases. Several methods have therefore been proposed for deepening the depth of field without narrowing the aperture. One of them is known as Extended Depth of Field (hereinafter, EDoF). Specific EDoF methods are described below.
The simplest EDoF method captures multiple images while gradually shifting the in-focus position slightly, then extracts the in-focus parts of those images and merges them.
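A minimal sketch of this simplest approach follows; the helper name is hypothetical, and real focus-stacking pipelines additionally align the frames and blend seams. For each pixel, the frame of the focus bracket with the highest local sharpness, measured here by smoothed Laplacian energy, is kept.

```python
import numpy as np
from scipy.ndimage import laplace, uniform_filter

def focus_stack(images):
    """Merge a focus bracket by keeping, per pixel, the frame
    with the highest local sharpness (Laplacian energy)."""
    stack = np.stack(images).astype(float)
    # Per-frame sharpness map: squared Laplacian, locally averaged.
    sharp = np.stack([uniform_filter(laplace(im) ** 2, size=9) for im in stack])
    best = sharp.argmax(axis=0)  # index of the sharpest frame per pixel
    rows, cols = np.indices(best.shape)
    return stack[best, rows, cols]
```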
In contrast, Non-Patent Document 2 discloses a method of generating a blur-free image by varying the in-focus position during exposure.
Specifically, if the imaging element or the lens is moved along the optical axis during exposure, the point spread function becomes roughly constant regardless of the subject distance, yielding a uniformly blurred image. Deconvolving the resulting blurred image with this invariant point spread function, which is unaffected by the subject distance, produces an image free of blur over its entirety.
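The deconvolution step can be sketched as follows. A standard Wiener filter is used here as the regularized inverse; the document does not specify a particular deconvolution algorithm, so this is an assumption, and the function name is illustrative.

```python
import numpy as np

def wiener_deconvolve(blurred, psf, k=1e-4):
    """Invert a known, depth-invariant PSF in the frequency domain.
    psf is assumed to be the same size as the image and centered;
    k regularizes frequencies where the PSF response is weak."""
    H = np.fft.fft2(np.fft.ifftshift(psf))
    G = np.fft.fft2(blurred)
    F = np.conj(H) * G / (np.abs(H) ** 2 + k)
    return np.real(np.fft.ifft2(F))
```

A small kernel would need to be zero-padded to the image size (centered) before this is applied.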
EDoF methods using special optical elements have also been proposed, for example a method using an optical element called a cubic phase mask. An example of the shape of a cubic phase mask is shown in Fig. 8. Mounting an optical element of this shape near the aperture of the optical system yields an image whose blur is roughly constant regardless of the subject distance. As in Non-Patent Document 1, deconvolution with the invariant, distance-independent point spread function then produces an image free of blur over its entirety. Besides these, methods using a multifocal lens and the like can also be cited.
In the description that follows, it is assumed that the method of varying the in-focus position during the exposure time is used as the depth-of-field extension for obtaining the reference image.
Next, the flow of the processing for calculating the subject distance is described. Fig. 9 is a flowchart showing an example of this processing, which determines, among n predetermined candidate subject distances d1, d2, ..., dn, the one closest to the distance of the photographed subject from the captured image.
First, an image I of the subject transmitted through the birefringent substance and a reference image I' are captured and obtained (steps S101, S102). Steps S101 and S102 may be performed in either order. The reference image obtained here, however, is captured without the light passing through the birefringent substance 12.
Here, the relationship shown in Equation 1 below holds between the image I and the reference image I'.
[Equation 1]

I(x, y) = I'(x, y) * h(x, y, d(x, y))    ... (Equation 1)
Here, h denotes the point spread function at position (x, y) in the image, and d(x, y) denotes the subject distance at position (x, y). The symbol * in the equation denotes convolution. Since the point spread function differs with the subject distance, when multiple subjects exist at different subject distances, the image I is obtained by convolving the blur-free image with a point spread function that differs at each image position according to the subject distance there.
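This spatially varying forward model of Equation 1 can be sketched with a two-depth toy scene; the function and variable names here are illustrative, not part of the patent.

```python
import numpy as np
from scipy.signal import fftconvolve

def blur_with_depth(ref, psfs, depth_map):
    """Equation 1 as a sketch: the captured image equals the
    all-in-focus reference convolved, at each pixel, with the
    PSF of that pixel's depth index d(x, y)."""
    out = np.zeros_like(ref, dtype=float)
    for i, psf in enumerate(psfs):
        blurred = fftconvolve(ref, psf, mode='same')
        out[depth_map == i] = blurred[depth_map == i]
    return out
```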
Next, the counter i is initialized to 1 (step S103), and for each pixel of the image the error function C(x, y, di) for the i-th candidate subject distance is computed (step S104). The error function is given by Equation 2 below.
[Equation 2]

C(x, y, di) = |I(x, y) - I'(x, y) * h(x, y, di)|   (i = 1, 2, ..., n)    ... (Equation 2)
Here, h(x, y, di) denotes the point spread function corresponding to the subject distance di. The point spread functions corresponding to the distances di (i = 1 to n, where n is a natural number of 2 or more) are stored in advance, for example in a memory of the imaging device 10. Equation 2 corresponds to taking the difference between the actually captured image I and the image obtained by convolving the blur-free reference image I' with the point spread function h(x, y, di) of the i-th candidate distance di. When the photographed subject actually lies at the i-th candidate distance, the error function C(x, y, di), which is that difference, becomes minimal.
In Equation 2, the error function C(x, y, di) is the per-pixel absolute value of the difference between the actually captured image I and the blur-free image convolved with the point spread function h(x, y, di) of the i-th candidate distance, but the error function may be defined in any form expressing a distance, such as the L2 norm.
After the error function is computed, it is judged whether the counter i has reached n (step S105); if not, i is incremented by 1 (step S106), and the computation is repeated until i reaches n.
After the error functions for all candidates, from the first through the n-th, have been computed, the subject distance is calculated (step S107). The subject distance d(x, y) at position (x, y) is given by Equation 3 below.
[Equation 3]

d(x, y) = argmin over di of C(x, y, di)    ... (Equation 3)
In practice, to reduce the influence of noise contained in the captured image I, more stable distance measurement can be performed through processing such as dividing the image into blocks, summing the error function within each block, and taking the distance that minimizes the summed error as the subject distance of the subject captured in that block as a whole.
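Putting steps S101 to S107 together, the following sketch evaluates Equation 2 for each candidate distance and applies the block-wise minimization of Equation 3. The names are hypothetical, and the PSF bank is assumed to be precomputed (in the device it is stored in memory).

```python
import numpy as np
from scipy.signal import fftconvolve

def estimate_depth_index(I, I_ref, psf_bank, block=8):
    """For each candidate distance d_i, compute the error function
    C(x, y, d_i) = |I - I' * h(., ., d_i)| (Equation 2), sum it over
    blocks to average out noise, and return per block the index i
    minimizing the summed error (Equation 3, block-wise)."""
    n = len(psf_bank)
    H, W = I.shape
    cost = np.empty((n, H, W))
    for i, h in enumerate(psf_bank):
        cost[i] = np.abs(I - fftconvolve(I_ref, h, mode='same'))
    # Sum the error function over block x block tiles for robustness.
    bh, bw = H // block, W // block
    c = cost[:, :bh * block, :bw * block]
    c = c.reshape(n, bh, block, bw, block).sum(axis=(2, 4))
    return c.argmin(axis=0)  # (bh, bw) map of candidate-distance indices
```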
With this configuration, since the shapes of the point spread function differ in front of and behind the image point corresponding to the subject distance, the subject distance can be estimated uniquely.
In this embodiment, the optic axis of the birefringent substance 12 in Fig. 3 is directed upward, but the optic axis is not limited to this orientation and may point in any direction; in any orientation, the shapes of the point spread function in front of and behind the image point corresponding to the subject distance can be made to differ. Changing the direction of the optic axis also changes the shape of the resulting point spread function. Whichever direction the optic axis points, the image point of the ordinary ray and that of the extraordinary ray for the same subject distance differ in position; however, only when the optic axis is parallel to the optical axis of the system does the difference between the point spread function shapes in front of and behind the image point become small. It is therefore preferable to arrange the birefringent substance 12 so that its optic axis is not parallel to the optical axis.
In the description of this embodiment, calcite, a uniaxial crystal, is used as the birefringent substance 12, but other substances exhibiting birefringence may also be used. Besides the direction of the optic axis, the number of optic axes can serve as a factor controlling the shape of the point spread function; the effect can be obtained not only with uniaxial but also with biaxial birefringent substances. Arranging multiple birefringent substances, whether uniaxial, biaxial, or both, can further widen the range of variation. Moreover, depending on the thickness and kind of the birefringent substance, the shape of the resulting point spread function can likewise be varied in front of and behind the image point corresponding to the subject distance.
In this embodiment, the image of the subject transmitted through the birefringent substance 12 and the image not transmitted through it are obtained by moving the birefringent substance with an actuator, but other methods also exist. In short, this can be achieved either by physically moving the birefringent substance itself into and out of the optical path, or by using an optical element capable of controlling the birefringence effect.
For the former, examples include linear movement by an actuator and rotation of a birefringent plate held perpendicular to the optical axis, so that the birefringent substance is alternately present in and absent from the optical path. For the latter, examples include elements that can be controlled electrically, such as via the electro-optic effect, and elements that can be controlled magnetically. In such cases, the presence or absence of the birefringence effect can be switched by applying or removing a voltage or a magnetic field. Instead of a birefringent crystal, a material whose birefringence effect can be controlled electrically or magnetically, such as a liquid crystal, may also be used.
As for the position of the birefringent substance, the point spread function shapes in front of and behind the image point corresponding to the subject distance can be made to differ with the substance at any position, not only that of Fig. 3. However, placing it immediately in front of the imaging element as in Fig. 3 yields a large effect and is therefore preferable.
The optical system 11 is preferably one in which the point spread function has the same shape at all image heights, and an image-space telecentric optical system is particularly preferable. An image-space telecentric optical system is one in which, on the image side, the chief rays at all field angles are parallel to the optical axis. In the configuration of Fig. 3, when the optical system 11 is image-space telecentric, the point spread function keeps the same shape at all image heights even with the birefringent substance 12 placed in the optical path. That is, when the optical system has the property that the point spread function is identical at all image heights, this property is preserved even if the birefringent substance 12 is inserted in the optical path; in that case no redesign of the optical system to account for the birefringent substance is required. Moreover, when the point spread function is identical at all image heights, a single point spread function suffices for the ranging computation, which suppresses the computational cost.
(Embodiment 2)
The imaging device 19 according to Embodiment 2 of the present invention has a configuration that separates the ordinary ray and the extraordinary ray and obtains an image of each ray individually. Fig. 10 is a block diagram showing the configuration of the imaging device 19 according to Embodiment 2 of the present invention. In Fig. 10, constituent elements identical to those of the imaging device 10 of Fig. 1 bear the same reference signs, and part of their description is omitted. The imaging device 19 comprises an optical system 11, a light separation unit 20, a birefringent substance 12, a focus range control unit 14, an imaging element A21, an imaging element B22, an image obtaining unit A23, an image obtaining unit B24, a reference image generation unit 17, and a distance measurement unit 18.
In Fig. 10, the optical system 11 forms the subject image on the imaging elements A21 and B22. The light separation unit 20 spatially separates the light at an arbitrary light-quantity ratio. The imaging elements A21 and B22, composed of a CCD, CMOS, or the like, convert the light received on their imaging surfaces into an electrical signal for each pixel and output it. One of the beams separated by the light separation unit 20 undergoes the change in point spread function shape through the birefringent substance 12 and is received by the imaging element A21. The imaging element B22 receives the other separated beam, which does not pass through the birefringent substance 12 and is unaffected by it. The image obtaining units A23 and B24 obtain images from the imaging elements A21 and B22, respectively, and store the obtained images.
Specifically, with a configuration like that of Fig. 11, a blurred image whose point spread function shapes differ in front of and behind the image point corresponding to the subject distance, owing to the birefringent substance 12, is obtained by the imaging element A21. For the light that does not pass through the birefringent substance 12, as in Embodiment 1, the in-focus position and depth of field are controlled by the focus range control unit 14, and the image is captured by the imaging element B22. The reference image generation unit 17 then generates a reference image from the image obtained by the imaging element B22. The blurred image from the imaging element A21 and the reference image generated from the image captured by the imaging element B22 are processed as in Fig. 9 and used for the calculation of the subject distance. In Embodiment 2, the blurred image from the imaging element A21 and the reference image from the reference image generation unit 17 correspond to the image I and the reference image I' of Fig. 9, respectively. The subject distance can then be calculated by the same operations as Equations 1 to 3.
Examples of the light separation unit 20 used for separating the light include a non-polarizing beam splitter and a polarizing beam splitter. With a non-polarizing beam splitter, the obtained image I contains both the extraordinary and the ordinary ray, as in Embodiment 1. With a polarizing beam splitter, the direction of the polarization split off relative to the optic axis of the birefringent substance can be controlled so that the image I contains only the extraordinary ray. By restricting the rays in the image I to the extraordinary ray, an image free of noise from the ordinary ray can be captured, yielding a more accurate image for deriving the subject distance. When a polarizing beam splitter is used, the birefringent substance may also be placed between the polarizing beam splitter and the optical system; in that case, the polarization direction must be chosen so that only the ordinary ray reaches the imaging element B22.
An image containing only the extraordinary ray can also be obtained, albeit with reduced light quantity, by using an optical element that passes only a specific polarization, such as a polarizer, to transmit only the extraordinary ray.
With this configuration, the image I and the reference image I' are obtained at the same instant, so no difference other than blur arises between the two images, and the subject distance can be determined more accurately. In Embodiment 1, because the image I and the reference image I' are not obtained simultaneously, motion of the subject or of the imaging device itself changes the relative position of the subject with respect to the device, differences other than blur arise between the two images, and the accuracy of distance measurement tends to fall. On the other hand, for equal capture time per image, Embodiment 1, which does not split the light, delivers more light to its single imaging element and thus achieves a higher signal-to-noise (S/N) ratio.
In Embodiment 1, the reference image I' and the image I are obtained by time division, whereas in Embodiment 2 it can be said that they are obtained by spatial division. In Embodiment 2, splitting the light reduces the light quantity reaching each of the image I and the reference image I', but the combined light quantity of the two images is undiminished; there is no loss. When the time needed to obtain both images is the same, the total light quantity is the same in Embodiments 1 and 2.
In Embodiments 1 and 2 above, an all-in-focus image is used as the reference image for obtaining the subject distance, but the reference image is not limited to this; an image with uniform blur may also be used as the reference image to derive the subject distance.
The functional blocks in the block diagrams of Embodiments 1 and 2 (Fig. 1, Fig. 10, etc.), namely the control unit of the actuator 13 serving as the birefringence-effect imparting unit, the image obtaining unit 16 serving as the imaging unit, and the distance measurement unit 18, are typically realized as an LSI, an integrated circuit. They may be made into individual chips, or some or all of them may be integrated into a single chip. For example, the functional blocks other than the memory may be integrated into a single chip.
The term LSI is used here, but depending on the degree of integration it may also be called an IC, system LSI, super LSI, or ultra LSI.
The method of circuit integration is not limited to LSI; it may be realized with a dedicated circuit or a general-purpose processor. An FPGA (Field Programmable Gate Array) that can be programmed after LSI manufacture, or a reconfigurable processor in which the connections and settings of circuit cells inside the LSI can be reconfigured, may also be used.
Furthermore, if circuit-integration technology replacing LSI emerges from advances in semiconductor technology or from derivative technologies, the functional blocks may of course be integrated using that technology. Application of biotechnology or the like is one possibility.
In each functional block, only the unit that stores the data to be processed may be configured separately rather than integrated into the single chip.
Industrial Applicability
The imaging device according to the present invention can perform distance measurement from an image captured from a single viewpoint and is therefore widely applicable to imaging equipment.
Symbol Description
10 imaging device
11 optical system
12 birefringent substance
13 actuator
14 focus range control unit
15 imaging element
16 image obtaining unit
17 reference image generation unit
18 distance measurement unit
19 imaging device
20 light separation unit
21 imaging element A
22 imaging element B
23 image obtaining unit A
24 image obtaining unit B
Claims (12)
Applications Claiming Priority (3)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2010260859 | 2010-11-24 | ||
| JP2010-260859 | 2010-11-24 | ||
| PCT/JP2011/006420 WO2012070208A1 (en) | 2010-11-24 | 2011-11-18 | Image capturing device, image capturing method, program and integrated circuit |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| CN102713513A CN102713513A (en) | 2012-10-03 |
| CN102713513B true CN102713513B (en) | 2015-08-12 |
Family
ID=46145579
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| CN201180006620.0A Expired - Fee Related CN102713513B (en) | 2010-11-24 | 2011-11-18 | Camera head, image capture method, program and integrated circuit |
Country Status (4)
| Country | Link |
|---|---|
| US (1) | US20120314061A1 (en) |
| JP (1) | JP5873430B2 (en) |
| CN (1) | CN102713513B (en) |
| WO (1) | WO2012070208A1 (en) |
Families Citing this family (12)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP5848177B2 (en) * | 2012-03-27 | 2016-01-27 | 日本放送協会 | Multi-focus camera |
| JP6157249B2 (en) | 2012-07-31 | 2017-07-05 | キヤノン株式会社 | Imaging apparatus, distance information acquisition method, and program |
| JP6112862B2 (en) * | 2012-12-28 | 2017-04-12 | キヤノン株式会社 | Imaging device |
| CN104102068B (en) * | 2013-04-11 | 2017-06-30 | 聚晶半导体股份有限公司 | Autofocus method and autofocus device |
| US9989623B2 (en) * | 2013-06-13 | 2018-06-05 | Basf Se | Detector for determining a longitudinal coordinate of an object via an intensity distribution of illuminated pixels |
| JP2015046777A (en) * | 2013-08-28 | 2015-03-12 | キヤノン株式会社 | Imaging apparatus and control method of imaging apparatus |
| CN108107571B (en) * | 2013-10-30 | 2021-06-01 | 株式会社摩如富 | Image processing apparatus and method, and non-transitory computer-readable recording medium |
| US9404742B2 (en) * | 2013-12-10 | 2016-08-02 | GM Global Technology Operations LLC | Distance determination system for a vehicle using holographic techniques |
| WO2017012986A1 (en) | 2015-07-17 | 2017-01-26 | Trinamix Gmbh | Detector for optically detecting at least one object |
| JP6699898B2 (en) * | 2016-11-11 | 2020-05-27 | 株式会社東芝 | Processing device, imaging device, and automatic control system |
| JP2020162628A (en) * | 2019-03-28 | 2020-10-08 | ソニー株式会社 | Optical systems, endoscopes, and medical image processing systems |
| JP7524728B2 (en) * | 2020-11-20 | 2024-07-30 | ソニーグループ株式会社 | Signal processing device, signal processing method and program |
Citations (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2963990B1 (en) * | 1998-05-25 | 1999-10-18 | 京都大学長 | Distance measuring device and method, image restoring device and method |
| CN101076705A (en) * | 2004-03-11 | 2007-11-21 | Icos视检系统有限公司 | Methods and apparatus for wavefront manipulations and improved three-dimension measurements |
Family Cites Families (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JPH0814887A (en) * | 1994-06-27 | 1996-01-19 | Matsushita Electric Works Ltd | Optical displacement gauge |
| JP2001074422A (en) * | 1999-08-31 | 2001-03-23 | Hitachi Ltd | Three-dimensional shape detection device, inspection device with solder, and methods thereof |
| JP4008398B2 (en) * | 2003-09-04 | 2007-11-14 | アオイ電子株式会社 | Position and orientation measurement apparatus and position and orientation measurement method |
| WO2007122615A2 (en) * | 2006-04-20 | 2007-11-01 | Xceed Imaging Ltd. | All optical system and method for providing extended depth of focus of imaging |
| EP2227711A4 (en) * | 2008-01-02 | 2014-01-22 | Univ California | TELEMICROSCOPY APPARATUS WITH HIGH DIGITAL OPENING |
| US8305485B2 (en) * | 2010-04-30 | 2012-11-06 | Eastman Kodak Company | Digital camera with coded aperture rangefinder |
- 2011-11-18 CN CN201180006620.0A patent/CN102713513B/en not_active Expired - Fee Related
- 2011-11-18 WO PCT/JP2011/006420 patent/WO2012070208A1/en not_active Ceased
- 2011-11-18 JP JP2012510474A patent/JP5873430B2/en not_active Expired - Fee Related
- 2011-11-18 US US13/574,079 patent/US20120314061A1/en not_active Abandoned
Also Published As
| Publication number | Publication date |
|---|---|
| JP5873430B2 (en) | 2016-03-01 |
| WO2012070208A1 (en) | 2012-05-31 |
| CN102713513A (en) | 2012-10-03 |
| JPWO2012070208A1 (en) | 2014-05-19 |
| US20120314061A1 (en) | 2012-12-13 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| CN102713513B (en) | Camera head, image capture method, program and integrated circuit | |
| Zhou et al. | Computational cameras: convergence of optics and processing | |
| US11032533B2 (en) | Image processing apparatus, image capturing apparatus, image processing method, and storage medium | |
| JP2012123296A (en) | Electronic device | |
| US11499824B2 (en) | Distance measuring camera | |
| CN102265627A (en) | Image data obtaining method and image data obtaining device | |
| JP2018538709A (en) | Method and apparatus for generating data representing a pixel beam | |
| JP6034197B2 (en) | Image processing apparatus, three-dimensional imaging apparatus, image processing method, and image processing program | |
| CN111465885A (en) | Apparatus and process for simultaneous capture of standard and plenoptic images | |
| US11436746B2 (en) | Distance measuring camera | |
| WO2015128908A1 (en) | Depth position detection device, image pickup element, and depth position detection method | |
| Pistellato et al. | A geometric model for polarization imaging on projective cameras | |
| US10043275B2 (en) | Image processing apparatus, imaging apparatus, and image processing method | |
| JP2009216600A (en) | Distance measuring device | |
| JP7288226B2 (en) | ranging camera | |
| WO2013069279A1 (en) | Image capturing device | |
| JP7328589B2 (en) | ranging camera | |
| KR20180066479A (en) | Automatic object separation method and apparatus using plenoptic refocus | |
| JP7075892B2 (en) | Devices and methods for generating data representing pixel beams | |
| JP7256368B2 (en) | ranging camera | |
| JP7559015B2 (en) | IMAGE PROCESSING APPARATUS, IMAGING APPARATUS, IMAGE PROCESSING METHOD, AND COMPUTER PROGRAM | |
| JP5866779B2 (en) | Ranging device and imaging device | |
| JP2022128518A (en) | ranging camera | |
| CN103515275A (en) | Multi-focal-length optical alignment device and alignment method for multi-chip stacked substrates |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| C06 | Publication | ||
| PB01 | Publication | ||
| C10 | Entry into substantive examination | ||
| SE01 | Entry into force of request for substantive examination | ||
| C14 | Grant of patent or utility model | ||
| GR01 | Patent grant | ||
| CF01 | Termination of patent right due to non-payment of annual fee | Granted publication date: 20150812 ||