CN103188499A - 3D imaging module and 3D imaging method - Google Patents
- Publication number
- CN103188499A CN103188499A CN2011104441052A CN201110444105A CN103188499A CN 103188499 A CN103188499 A CN 103188499A CN 2011104441052 A CN2011104441052 A CN 2011104441052A CN 201110444105 A CN201110444105 A CN 201110444105A CN 103188499 A CN103188499 A CN 103188499A
- Authority
- CN
- China
- Prior art keywords
- image
- object distance
- computing module
- imaging
- unit
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Landscapes
- Studio Devices (AREA)
- Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)
Abstract
Description
Technical Field
The present invention relates to an imaging module, and in particular to a 3D imaging module with an autofocus function and to a 3D imaging method.
Background Art
With the advancement of technology, 3D (three-dimensional) imaging modules are used in more and more fields, and to achieve better imaging results they are also expected to provide an autofocus function.
Autofocus technology for imaging modules falls into mechanical autofocus and digital autofocus. Mechanical autofocus uses a mechanical structure to move the lens elements inside the lens barrel to focus. Because mechanical focusing requires a complicated mechanical structure, an autofocus lens built this way is costly and bulky.
Digital autofocus processes the image sensed by the image sensor through software simulation and computation, so that an image blurred by defocus at the sensor's pixels becomes sharp. For example, Extended Depth of Field (EDoF) technology exploits the fact that the three primary colors of light (red, green, blue) each reach their best MTF curve at a different distance: for an object at a given distance, the primary color that is sharpest at that distance is used to digitally simulate the other two primaries, so that a sharp full-frame image is obtained. The drawback of digital focusing, however, is its limited close-range capability; generally, if the object distance is within 40 cm, the focusing result of digital autofocus is often unsatisfactory.
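The EDoF principle described above can be illustrated with a short sketch. This is a minimal illustration and not the patent's algorithm: the choice of `sharpest_channel`, the box-filter size, and the detail-transfer rule are all assumptions.

```python
# Minimal sketch of the EDoF idea: the primary color that is sharpest at the
# current object distance lends its high-frequency detail to the other two channels.
import numpy as np
from scipy.ndimage import uniform_filter

def edof_restore(rgb, sharpest_channel, blur_size=5):
    """rgb: float array of shape (H, W, 3) in [0, 1]; sharpest_channel: 0, 1, or 2."""
    sharp = rgb[..., sharpest_channel]
    detail = sharp - uniform_filter(sharp, blur_size)   # high frequencies of the sharp channel
    out = rgb.copy()
    for c in range(3):
        if c != sharpest_channel:
            # keep this channel's low frequencies, borrow detail from the sharp channel
            out[..., c] = uniform_filter(rgb[..., c], blur_size) + detail
    return np.clip(out, 0.0, 1.0)
```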
Summary of the Invention
In view of this, it is necessary to provide a 3D imaging module and a 3D imaging method that avoid the above problems.
A 3D imaging module includes a first imaging unit and a second imaging unit, a memory connected to the first image sensor and the second image sensor, a color separation unit connected to the memory, a processor connected to the color separation unit, an image processing unit connected to the processor, a first driver and a second driver each connected to the processor, and an image synthesis unit. The first imaging unit and the second imaging unit capture images of the same scene at different angles at the same time. The memory stores the images captured by the first imaging unit and the second imaging unit. The color separation unit represents the images captured by the first imaging unit and the second imaging unit in the three primary colors red, green, and blue. The processor performs MTF calculations on the images captured by the first imaging unit and the second imaging unit, determines the current shooting mode from the calculation results, and, according to the determined shooting mode, selects and controls either the image processing unit or the first and second drivers. The image processing unit corrects defocus blur in the images captured by the first imaging unit and the second imaging unit by image processing. The first driver and the second driver focus the first imaging unit and the second imaging unit, respectively. The image synthesis unit synthesizes the images captured by the first imaging unit and the second imaging unit, after image processing or focusing, into a 3D image.
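The connections enumerated above can be summarized in a small data structure as a reading aid. The dictionary keys are descriptive names chosen here, not reference numerals or terms from the patent.

```python
# Schematic wiring of the components: each entry maps a component to the
# components it feeds (illustrative labels only).
MODULE_WIRING = {
    "first_imaging_unit":    ["memory"],
    "second_imaging_unit":   ["memory"],
    "memory":                ["color_separation_unit", "image_synthesis_unit"],
    "color_separation_unit": ["processor"],
    "processor":             ["image_processing_unit", "first_driver", "second_driver"],
    "image_processing_unit": ["memory"],              # writes blur-corrected images back
    "first_driver":          ["first_imaging_unit"],  # refocuses the first imaging lens
    "second_driver":         ["second_imaging_unit"], # refocuses the second imaging lens
    "image_synthesis_unit":  [],                      # outputs the final 3D image
}
```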
A 3D imaging method includes the following steps:
capturing the same scene simultaneously with two imaging units at different angles;
performing color separation on the images sensed by the image sensors of the two imaging units;
performing an MTF calculation on the image region sensed by each pixel unit of the image sensors of the two imaging units to obtain the MTF value corresponding to the image region sensed by each pixel unit;
determining the object distance of the image sensed by each pixel unit from the MTF value of the image sensed by that pixel unit;
determining the current shooting mode from the object distances of the images sensed by the pixel units;
if the current shooting mode is the near-focus mode, performing the following steps:
determining, from the object distances of the images sensed by the pixel units, the best focus positions of the imaging lenses of the two imaging units;
determining the focus drive amounts of the imaging lenses of the two imaging units from the best focus positions;
driving the imaging lenses of the two imaging units to their best focus positions according to the focus drive amounts;
synthesizing the focused images captured by the two imaging units into a 3D image;
if the current shooting mode is the far-focus mode, performing the following steps:
determining, from the MTF value corresponding to the image region sensed by each pixel unit, the blur amount of the image sensed by that pixel unit;
determining, from the blur amount of the image sensed by each pixel unit, the blur correction amount for the image sensed by that pixel unit;
performing blur correction on the image sensed by each pixel unit according to the blur correction amounts;
synthesizing the blur-corrected images captured by the two imaging units into a 3D image.
Compared with the prior art, the 3D imaging module and the 3D imaging method determine the object distance of the subject by software calculation and simulation, determine the current shooting mode from that object distance, and, depending on the shooting mode, focus either by software calculation or by driving the imaging lenses. A sharply focused image can therefore be obtained in both the near-focus and the far-focus shooting modes, and because the images of the same scene captured simultaneously by the two imaging units are synthesized, a clearer 3D image is obtained.
Brief Description of the Drawings
FIG. 1 is a schematic diagram of a 3D imaging module according to an embodiment of the present invention.
FIG. 2 is a flowchart of a 3D imaging method according to an embodiment of the present invention.
Description of Main Component Symbols
The following specific embodiments further describe the present invention with reference to the above drawings.
Detailed Description of the Embodiments
The present invention is described in detail below with reference to the accompanying drawings.
Referring to FIG. 1, a schematic diagram of a 3D imaging module 100 according to an embodiment of the present invention is shown. The 3D imaging module 100 includes a first imaging unit A and a second imaging unit B arranged side by side with the first imaging unit A. The first imaging unit A and the second imaging unit B shoot the same scene from different angles at the same time.
The first imaging unit A includes a first imaging lens 11 and a first image sensor 12 aligned with the optical axis of the first imaging lens 11. The second imaging unit B includes a second imaging lens 13 and a second image sensor 14 aligned with the optical axis of the second imaging lens 13.
The first imaging lens 11 and the second imaging lens 13 capture images of objects and focus the captured images onto the sensing areas of the first image sensor 12 and the second image sensor 14, respectively. The first imaging lens 11 and the second imaging lens 13 each include at least one lens element 111, 131 with positive refractive power, and the lens elements 111, 131 are aspherical lenses.
The first image sensor 12 and the second image sensor 14 sense the images captured by the first imaging lens 11 and the second imaging lens 13, respectively. Each of the first image sensor 12 and the second image sensor 14 includes a plurality of pixel units (not shown) arranged as an array over the effective sensing area of the corresponding image sensor. Each pixel unit includes pixels of the three primary colors (red, green, blue). Preferably, the first image sensor 12 and the second image sensor 14 each include at least 2048×1536 pixel units. In this embodiment, the first image sensor 12 and the second image sensor 14 may be charge-coupled device (CCD) sensors or complementary metal oxide semiconductor (CMOS) sensors.
The 3D imaging module 100 further includes a memory 20, a color separation unit 30, a processor 40, an image processing unit 50, a first driver 60 corresponding to the first imaging lens 11, and a second driver 70 corresponding to the second imaging lens 13. The memory 20 is connected to the first image sensor 12 and the second image sensor 14, the color separation unit 30 is connected to the memory 20, the processor 40 is connected to the color separation unit 30, and the image processing unit 50, the first driver 60, and the second driver 70 are each connected to the processor 40. The first driver 60 and the second driver 70 are also connected to the first imaging lens 11 and the second imaging lens 13, respectively.
The memory 20 stores the images sensed by the first image sensor 12 and the second image sensor 14.
The color separation unit 30 separates the images sensed by the first image sensor 12 and the second image sensor 14 into images represented in the three primary colors.
The processor 40 includes a modulation transfer function (MTF) calculation module 41, an object distance calculation module 42, an object distance judging module 43, a blur amount calculation module 44, a blur correction amount calculation module 45, a focus position calculation module 46, and a drive amount calculation module 47. The MTF calculation module 41 is connected to the color separation unit 30, the object distance calculation module 42 is connected to the MTF calculation module 41, the object distance judging module 43 is connected to the object distance calculation module 42, and the focus position calculation module 46 and the blur amount calculation module 44 are each connected to the object distance judging module 43. The drive amount calculation module 47 is connected to the focus position calculation module 46, the first driver 60, and the second driver 70; the blur correction amount calculation module 45 is connected to the blur amount calculation module 44 and to the image processing unit 50.
The MTF calculation module 41 performs an MTF calculation on the image region sensed by each pixel unit of the first image sensor 12 and the second image sensor 14 to obtain the MTF value of the corresponding region. In this embodiment, the MTF calculation module 41 performs the MTF calculation separately on the three primary-color images corresponding to each pixel unit.
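The patent does not disclose how the per-region MTF value is computed; the sketch below uses a simple gradient-energy measure as a stand-in for that per-pixel-unit sharpness value, computed for one color channel at a time.

```python
import numpy as np

def region_sharpness(channel, block=16):
    """Gradient-energy stand-in for the per-region MTF value of one color channel.
    channel: 2-D float array; returns an (H // block, W // block) sharpness map,
    one value per block of pixels (the block plays the role of a pixel unit)."""
    gy, gx = np.gradient(channel)
    energy = gx ** 2 + gy ** 2
    h, w = energy.shape
    h, w = h - h % block, w - w % block           # crop to a whole number of blocks
    blocks = energy[:h, :w].reshape(h // block, block, w // block, block)
    return blocks.mean(axis=(1, 3))
```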
The object distance calculation module 42 determines the object distance of the image sensed by each pixel unit from the calculation results of the MTF calculation module 41.
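One plausible realization of this module is a calibration table of MTF values measured at known object distances, against which the measured per-channel values are matched. The distances and MTF values below are placeholders, not data from the patent.

```python
import numpy as np

# Hypothetical per-channel calibration: MTF of the R, G, B channels at known
# object distances (placeholder values, not patent data).
CAL_DISTANCES_CM = np.array([10, 20, 40, 80, 160, 320])
CAL_MTF_RGB = np.array([
    [0.30, 0.45, 0.62],
    [0.42, 0.58, 0.55],
    [0.55, 0.63, 0.48],
    [0.63, 0.57, 0.41],
    [0.58, 0.49, 0.36],
    [0.50, 0.42, 0.33],
])

def estimate_object_distance(mtf_rgb):
    """Return the calibrated distance whose R/G/B MTF triple best matches the measurement."""
    errors = np.linalg.norm(CAL_MTF_RGB - np.asarray(mtf_rgb, dtype=float), axis=1)
    return int(CAL_DISTANCES_CM[int(np.argmin(errors))])
```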
The object distance judging module 43 determines the current shooting mode from the calculation results of the object distance calculation module 42. Specifically, the object distance judging module 43 performs a combined calculation on the results of the object distance calculation module, compares the result of this combined calculation with a preset standard value, and determines the current shooting mode from the comparison. In this embodiment, the combined calculation samples the object distances obtained by the object distance calculation module 42 for the images sensed by the pixel units and, from the sampled data, computes a representative object distance characterizing the distance of the main subject currently being photographed. The preset standard value distinguishes whether the current shooting mode is the near-focus mode or the far-focus mode. In this embodiment the standard value is 40 cm: if the representative object distance is greater than 40 cm, the current shooting mode is the far-focus mode; if the representative object distance is less than or equal to 40 cm, the current shooting mode is the near-focus mode.
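A sketch of this mode decision follows. The 40 cm threshold comes from the description above; the sampling step and the use of the median as the representative object distance are assumptions, since the patent only states that the sampled distances are combined.

```python
import numpy as np

NEAR_FAR_THRESHOLD_CM = 40  # standard value stated in the description

def decide_shooting_mode(distance_map_cm, sample_step=8):
    """Sample the per-pixel-unit object distances, form a representative distance
    (median of the samples -- the aggregation rule is an assumption), and compare
    it with the 40 cm threshold to pick the shooting mode."""
    samples = np.asarray(distance_map_cm, dtype=float)[::sample_step, ::sample_step]
    representative = float(np.median(samples))
    mode = "near_focus" if representative <= NEAR_FAR_THRESHOLD_CM else "far_focus"
    return mode, representative
```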
The blur amount calculation module 44 determines, from the calculation results of the MTF calculation module 41, the difference between the MTF value calculated for each pixel unit and the standard MTF value for the corresponding object distance, and determines the blur amount of the image sensed by each pixel unit from this difference. The standard MTF value is the MTF value of the sharpest image region that the pixel unit can sense at the corresponding object distance; the difference between the MTF value calculated by the MTF calculation module 41 for a pixel unit and the corresponding standard MTF value therefore characterizes the blur amount of the image sensed by that pixel unit. In this embodiment, the blur amount calculation module 44 calculates the blur amount separately for the three primary-color images of each pixel unit. Whether the blur amount calculation module 44 is enabled depends on the shooting mode determined by the object distance judging module 43: when the object distance judging module 43 determines that the current shooting mode is the far-focus mode, the blur amount calculation module 44 is enabled; when it determines that the current shooting mode is the near-focus mode, the blur amount calculation module 44 is disabled.
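The blur amount described here is simply the shortfall of the measured MTF against the standard MTF at the same object distance; a short per-region, per-channel sketch is given below. The source of the standard MTF values is not specified in the patent, so they are passed in as an argument.

```python
import numpy as np

def blur_amount(measured_mtf, standard_mtf):
    """Blur amount as the shortfall of the measured MTF relative to the standard
    (best-achievable) MTF at the same object distance; both arguments may be
    per-region, per-channel maps of identical shape."""
    return np.clip(np.asarray(standard_mtf, dtype=float)
                   - np.asarray(measured_mtf, dtype=float), 0.0, None)
```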
The blur correction amount calculation module 45 determines, from the blur amount obtained by the blur amount calculation module 44, the correction amount for blur correction of the image sensed by each pixel unit. In this embodiment, the blur correction amount calculation module 45 calculates the blur correction amount separately for the three primary colors of the image of each pixel unit.
The focus position calculation module 46 determines the best focus positions of the first imaging lens 11 and the second imaging lens 13 from the calculation results of the object distance calculation module 42. Whether the focus position calculation module 46 is enabled depends on the shooting mode determined by the object distance judging module 43: when the object distance judging module 43 determines that the current shooting mode is the near-focus mode, the focus position calculation module 46 is enabled; when it determines that the current shooting mode is the far-focus mode, the focus position calculation module 46 is disabled.
The drive amount calculation module 47 determines the focus drive amounts of the first imaging lens 11 and the second imaging lens 13 from the best focus positions obtained by the focus position calculation module 46.
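The patent does not give the mapping from best focus position to drive amount. As an illustration only, the thin-lens relation 1/f = 1/d_o + 1/d_i gives the lens extension needed for an object at distance d_o, and the drive amount is the difference between that target extension and the current one.

```python
def focus_drive_amount(object_distance_mm, focal_length_mm, current_extension_mm):
    """Thin-lens illustration (an assumption, not the patent's method): the lens must
    sit at image distance d_i = f * d_o / (d_o - f) from the sensor, i.e. be extended
    by d_i - f beyond the infinity-focus position; the drive amount is the difference
    between that target extension and the current extension."""
    d_i = focal_length_mm * object_distance_mm / (object_distance_mm - focal_length_mm)
    target_extension = d_i - focal_length_mm
    return target_extension - current_extension_mm

# Example: a 4 mm lens focused on an object 400 mm away, starting from infinity focus,
# needs roughly focus_drive_amount(400, 4, 0.0) ≈ 0.04 mm of extension.
```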
The image processing unit 50 performs blur correction on the image sensed by each pixel unit according to the correction amount obtained by the blur correction amount calculation module 45, so as to obtain a sharp image. In this embodiment, the image processing unit 50 corrects the three primary colors of the image of each pixel unit. The blur-corrected images sensed by the first image sensor 12 and the second image sensor 14 are stored in the memory 20.
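The patent states only that a correction amount is applied per pixel unit and per primary color; the sketch below uses an unsharp mask whose gain varies per region according to that correction map. This is one possible realization, not the disclosed one.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def apply_blur_correction(channel, correction_map, sigma=1.5):
    """Sharpen one color channel with an unsharp mask whose gain varies per region
    according to correction_map (one value per pixel-unit region). The unsharp-mask
    form and sigma are assumptions; the patent does not disclose the filter."""
    h, w = channel.shape
    rows = (np.arange(h) * correction_map.shape[0]) // h   # nearest-neighbour upsample
    cols = (np.arange(w) * correction_map.shape[1]) // w
    gain = correction_map[np.ix_(rows, cols)]
    detail = channel - gaussian_filter(channel, sigma)
    return np.clip(channel + gain * detail, 0.0, 1.0)
```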
The first driver 60 and the second driver 70 drive the first imaging lens 11 and the second imaging lens 13 to the best focus positions according to the focus drive amounts of the first imaging lens 11 and the second imaging lens 13 obtained by the drive amount calculation module 47. In this embodiment, the first driver 60 and the second driver 70 are piezoelectric motors; of course, the first driver 60 and the second driver 70 may also be other types of drive components, such as voice coil motors. After the first imaging lens 11 and the second imaging lens 13 have been driven to the best focus positions by the first driver 60 and the second driver 70, the captured images are stored in the memory 20.
The 3D imaging module 100 includes an image synthesis unit 80. The image synthesis unit 80 reads from the memory 20 the images focused by means of the first driver 60 and the second driver 70, or the images blur-corrected by the image processing unit 50, and synthesizes them into a 3D image. Specifically, each time the image synthesis unit 80 reads the focused or blur-corrected images of the same scene captured simultaneously at different angles by the first imaging unit A and the second imaging unit B, and recovers the depth and distance information of the objects in the scene from the different shooting angles of the first imaging unit and the second imaging unit, obtaining an image with visual depth and distance.
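The description leaves the output format of the synthesized 3D image open. As one simple illustration, the two focused or corrected views can be merged into a red-cyan anaglyph, which preserves the two viewpoints for a viewer wearing matching glasses.

```python
import numpy as np

def synthesize_anaglyph(left_rgb, right_rgb):
    """Merge the two views into a single red-cyan anaglyph image (one possible output
    format only; the patent states merely that the two views are synthesized into a
    3D image that conveys depth)."""
    out = np.empty_like(left_rgb)
    out[..., 0] = left_rgb[..., 0]      # red channel from the left view
    out[..., 1:] = right_rgb[..., 1:]   # green and blue channels from the right view
    return out
```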
Referring to FIG. 2, a 3D imaging method according to an embodiment of the present invention uses the 3D imaging module of the embodiment described above; the 3D imaging method includes the following steps (a consolidated sketch of this flow is given after the step list):
capturing the same scene simultaneously with two imaging units at different angles;
performing color separation on the images sensed by the image sensors of the two imaging units, the images being represented as red, green, and blue primary-color images;
performing an MTF calculation on the image region sensed by each pixel unit of the image sensors of the two imaging units to obtain the MTF value corresponding to the image region sensed by each pixel unit;
determining the object distance of the image sensed by each pixel unit from the MTF value of the image sensed by that pixel unit;
determining the current shooting mode from the object distances of the images sensed by the pixel units;
if the current shooting mode is the near-focus mode, performing the following steps:
determining, from the object distances of the images sensed by the pixel units, the best focus positions of the imaging lenses of the two imaging units;
determining the focus drive amounts of the imaging lenses of the two imaging units from the best focus positions;
driving the imaging lenses of the two imaging units to their best focus positions according to the focus drive amounts;
synthesizing the focused images captured by the two imaging units into a 3D image;
if the current shooting mode is the far-focus mode, performing the following steps:
determining, from the MTF value corresponding to the image region sensed by each pixel unit, the blur amount of the image sensed by that pixel unit;
determining, from the blur amount of the image sensed by each pixel unit, the blur correction amount for the image sensed by that pixel unit;
performing blur correction on the image sensed by each pixel unit according to the blur correction amounts;
synthesizing the blur-corrected images captured by the two imaging units into a 3D image.
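The following sketch strings the steps above together, reusing the illustrative helper functions sketched in the detailed description (region_sharpness, estimate_object_distance, decide_shooting_mode, blur_amount, focus_drive_amount, apply_blur_correction, synthesize_anaglyph). All of those helpers are assumptions about details the patent leaves unspecified, so this is a schematic of the flow rather than an implementation of the claimed method.

```python
import numpy as np

def capture_3d(left_rgb, right_rgb, focal_length_mm=4.0, current_extension_mm=0.0):
    """Schematic end-to-end flow for one stereo capture; left_rgb and right_rgb are
    float (H, W, 3) arrays in [0, 1]."""
    views = []
    for rgb in (left_rgb, right_rgb):
        # per-region sharpness of the green channel stands in for the MTF map
        mtf_map = region_sharpness(rgb[..., 1])
        norm_mtf = mtf_map / (mtf_map.max() + 1e-6)          # scale into calibration range
        distances = np.vectorize(lambda m: estimate_object_distance([m, m, m]))(norm_mtf)
        mode, representative_cm = decide_shooting_mode(distances)
        if mode == "near_focus":
            # near focus: compute the drive amount and (conceptually) refocus the lens
            drive_mm = focus_drive_amount(representative_cm * 10.0, focal_length_mm,
                                          current_extension_mm)
            print(f"near-focus view: drive lens by {drive_mm:.3f} mm and re-capture")
            views.append(rgb)                                # re-captured frame would go here
        else:
            # far focus: correct defocus blur in software, per channel
            gain = blur_amount(norm_mtf, np.ones_like(norm_mtf))
            views.append(np.dstack([apply_blur_correction(rgb[..., c], gain)
                                    for c in range(3)]))
    return synthesize_anaglyph(views[0], views[1])
```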
The 3D imaging module and the 3D imaging method determine the object distance of the subject by software calculation and simulation, determine the current shooting mode from that object distance, and, depending on the shooting mode, focus either by software calculation or by driving the imaging lenses. A sharply focused image can therefore be obtained in both the near-focus and the far-focus shooting modes, and because the images of the same scene captured simultaneously by the two imaging units are synthesized, a clearer 3D image is obtained.
In addition, those skilled in the art may make other changes within the spirit of the present invention; such changes made according to the spirit of the present invention shall, of course, fall within the scope claimed by the present invention.
Claims (10)
Priority Applications (1)

| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN201110444105.2A (CN103188499B) | 2011-12-27 | 2011-12-27 | 3D imaging modules and 3D formation method |
Publications (2)

| Publication Number | Publication Date |
|---|---|
| CN103188499A | 2013-07-03 |
| CN103188499B | 2016-05-04 |
Family
ID=48679428
Family Applications (1)

| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| CN201110444105.2A (CN103188499B, Expired - Fee Related) | 3D imaging modules and 3D formation method | 2011-12-27 | 2011-12-27 |
Country Status (1)

| Country | Link |
|---|---|
| CN (1) | CN103188499B (en) |
Citations (6)

| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN1690833A | 2004-04-20 | 2005-11-02 | 鸿富锦精密工业(深圳)有限公司 | Auto focus method of digital camera |
| CN1910614A | 2004-01-15 | 2007-02-07 | 松下电器产业株式会社 | Measuring method for optical transfer function, image restoring method, and digital imaging device |
| US20080158346A1 | 2006-12-27 | 2008-07-03 | Fujifilm Corporation | Compound eye digital camera |
| US7433586B2 | 2004-08-18 | 2008-10-07 | Casio Computer Co., Ltd. | Camera with an auto-focus function |
| CN102103320A | 2009-12-22 | 2011-06-22 | 鸿富锦精密工业(深圳)有限公司 | Stereo imaging camera module |
| US20110199514A1 | 2008-12-15 | 2011-08-18 | Canon Kabushiki Kaisha | Image processing apparatus and image processing method |
Cited By (3)

| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN106162149A | 2016-09-29 | 2016-11-23 | 宇龙计算机通信科技(深圳)有限公司 | A kind of method shooting 3D photo and mobile terminal |
| CN106162149B | 2016-09-29 | 2019-06-11 | 宇龙计算机通信科技(深圳)有限公司 | A method and mobile terminal for taking 3D photos |
| CN110891533A | 2017-05-29 | 2020-03-17 | 爱威愿景有限公司 | Eye projection system and method with focus management |
Also Published As

| Publication number | Publication date |
|---|---|
| CN103188499B | 2016-05-04 |
Legal Events

| Date | Code | Title | Description |
|---|---|---|---|
| | C06 | Publication | |
| | PB01 | Publication | |
| | C10 | Entry into substantive examination | |
| | SE01 | Entry into force of request for substantive examination | |
| | C14 | Grant of patent or utility model | |
| | GR01 | Patent grant | |
| | CF01 | Termination of patent right due to non-payment of annual fee | Granted publication date: 20160504; Termination date: 20171227 |