JPH07270674A - Focusing detecting device - Google Patents
Focusing detecting device
- Publication number
- JPH07270674A, JP6082374A, JP8237494A
- Authority
- JP
- Japan
- Prior art keywords
- focus
- focusing
- image
- image pickup
- sharpness
- Prior art date
- Legal status
- Pending
Landscapes
- Automatic Focus Adjustment (AREA)
- Focusing (AREA)
Abstract
Description
[0001]
BACKGROUND OF THE INVENTION
1. Field of the Invention
The present invention relates to a focus detection device, and more particularly to a focus detection device that performs autofocus using an image pickup element such as a CCD and is suitable for, for example, a still camera or a video camera.
[0002]
2. Description of the Related Art
A focus detection method called the hill-climbing method has long been known as an autofocus method for photographing apparatuses such as still cameras and video cameras. The hill-climbing method directly evaluates the sharpness of the image (subject image): while the focus lens (focusing lens) is driven to change the focus, the point at which the image sharpness becomes maximum is searched for, and that point is taken as the in-focus position to obtain the in-focus state of the photographing system.
[0003] A focus detection apparatus using such an image-sharpness evaluation method is proposed in, for example, Japanese Patent Laid-Open No. 62-103616.
[0004] That publication discloses a focus detection apparatus comprising detection means for detecting the width of an edge portion of a subject image (object image) and determination means for determining the in-focus state on the basis of that width. The detection means detects the edge width by detecting the ratio between the gradient of the luminance change at the edge portion and the luminance difference across that edge portion, and the in-focus state of the photographing system is obtained by using the fact that the smaller the edge width, the sharper the image.
[0005] FIG. 8 is an explanatory diagram showing an example of a graph of the evaluation value in the hill-climbing method. The horizontal axis is the lens position of the focus lens and the vertical axis is the sharpness evaluation value.
[0006] As shown in the figure, when the lens position of the focus lens is moved from the infinity (∞) end toward the near (N) end as indicated by arrow 7-3, the sharpness evaluation value traces a hill-shaped waveform A. At the apex of waveform A, that is, at the in-focus position 7-1, the evaluation value falls whichever way the focus lens is moved. At position 7-2 on waveform A, the evaluation value increases toward the N end and decreases toward the ∞ end, so the direction of the in-focus position 7-1 can be determined.
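The hill-climbing search just described can be summarized in a short sketch. The following Python fragment is only an illustration and not part of the patent; read_sharpness() and move_lens() are hypothetical stand-ins for the camera's sharpness evaluation and lens drive, and the step counts are arbitrary.

```python
def hill_climb_af(read_sharpness, move_lens, step, drop_steps=3, max_steps=300):
    """Scan the focus lens in one direction while tracking the sharpness
    evaluation value; once the value has fallen for several consecutive steps
    the peak (the apex of waveform A) has been passed, so step back to it."""
    best = read_sharpness()
    steps_past_best = 0
    for _ in range(max_steps):
        move_lens(step)                 # one step toward the near (N) end
        value = read_sharpness()
        if value > best:
            best = value
            steps_past_best = 0
        else:
            steps_past_best += 1
            if steps_past_best >= drop_steps:
                move_lens(-steps_past_best * step)   # return to the peak position
                return best
    return best
```

Because every probe costs at least one sampled frame, the loop above also illustrates why this method is slow: the lens must be driven and the image re-sampled many times before the peak is confirmed.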
[0007] With this focus detection method, however, the sharpness evaluation value must be obtained by sampling many frames while driving the focus lens before the in-focus position is reached, so focus detection takes time and it is also difficult to predict the in-focus position.
[0008] A focus detection apparatus using an image-sharpness evaluation method has also been proposed in, for example, U.S. Patent No. 418591. In that patent, in a scheme that obtains focus information by comparing the sharpness of a plurality of images formed from light fluxes passing through different portions of the taking lens, two subject images based on light fluxes that have passed through two different regions on the pupil plane of the taking lens are reflected by a half mirror or the like onto the surface of an image pickup element (sensor), and the phase difference between them is measured to obtain the in-focus state of the photographing system.
[0009] With this focus detection method, however, an optical member such as a half mirror is used inside the apparatus, so the apparatus as a whole becomes complicated and large. Moreover, since the focus-detection image pickup element and the imaging plane of the image pickup means must be kept under identical conditions, manufacturing cost is incurred to achieve the required positional accuracy, and the reliability of that positional accuracy against changes over time and with temperature is also a problem.
[0010] To solve these problems, it has therefore been practiced to divide the pupil of the photographing system into a plurality of regions (pupil regions) using a stop having a plurality of apertures, to form a plurality of pieces of image information based on the light fluxes passing through the divided regions on the surface of the image pickup means, and to detect the in-focus state of the photographing system by performing a correlation calculation using the image signal (image information) obtained from the image pickup means. Focus detection is thereby performed without using a special image pickup element dedicated to focus detection.
[0011]
PROBLEMS TO BE SOLVED BY THE INVENTION
The hill-climbing method used for autofocus in still cameras, video cameras and the like directly evaluates the video signal (image information) from the image pickup element, and therefore has the advantages of high focusing accuracy and of requiring no special sensor. On the other hand, as described above, many frames must be sampled to obtain the sharpness evaluation value before the in-focus position is reached, so focusing takes time and it is also difficult to predict the in-focus position.
[0012] In the conventional focus detection method in which focus information is obtained from the result of a correlation calculation on the image signal using a stop having a plurality of apertures, there is the advantage that only one frame needs to be sampled. On the other hand, when the defocus amount is small it is difficult to detect the peak corresponding to the in-focus position, so fine adjustment is needed to achieve the required detection accuracy, and it is also difficult to keep the system in focus while following a moving subject.
[0013] An object of the present invention is to provide a focus detection device that detects the in-focus state of the photographing system with a plurality of focusing means (focus detection methods) using the image signal from the image pickup means, and can thereby perform accurate autofocus quickly and with high precision without using a special image pickup element for focus detection.
[0014]
MEANS FOR SOLVING THE PROBLEMS
In the focus detection device of the present invention, the pupil of the photographing system is divided into a plurality of regions by a stop having a plurality of apertures, a plurality of pieces of image information based on the light fluxes passing through the divided regions are formed on the surface of the image pickup means, and the in-focus state of the photographing system is detected by a plurality of focusing means using the image signal from the image pickup means. Of the plurality of focusing means, a first focusing means first detects the in-focus position of the photographing system; a focus detection method selection means then selects whether focusing is performed with a second focusing means different from the first focusing means or with the first focusing means, and the in-focus state of the photographing system is detected accordingly.
[0015] In particular, the first focusing means detects the in-focus position by a correlation calculation on the signals of the plurality of subject images formed on the surface of the image pickup means by the stop, and the second focusing means detects the in-focus position by extracting, as an evaluation value, the sharpness of the subject image formed on the surface of the image pickup means.
[0016]
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS
FIG. 1 shows the configuration of the essential parts of Embodiment 1 of the present invention.
[0017] In FIG. 1, reference numeral 1-1 denotes a photographing system (taking lens). Reference numeral 1-2 denotes a stop which, as shown in FIG. 2(A) described later, has two apertures 1-6 and 1-7 of mutually different areas and divides the pupil (exit pupil) of the taking lens 1-1 into two regions. In this embodiment the aperture 1-7 is formed with the larger area. Reference numeral 3 denotes an image pickup means comprising an image pickup element (sensor) such as a CCD.
[0018] Reference numeral 1-3 indicates the position of the image pickup means 3 when the photographing system 1-1 is in focus, 1-4 indicates its position when the photographing system 1-1 is in a rear-focus state, and 1-5 indicates its position when the photographing system 1-1 is in a front-focus state.
[0019] Reference numeral 1-13 denotes a driving means which performs focusing by driving the focus lens (focusing lens) of the photographing system 1-1 on the basis of a focus signal from the focus detection method selection means 1-16 described later.
[0020] Reference numerals 1-14 and 1-15 denote first and second focusing means (focus information detection means), respectively, each of which detects the in-focus state of the photographing system using the image signal (image information) from the image pickup means 3.
[0021] In this embodiment, the first focusing means 1-14 detects the in-focus position by a correlation calculation on the signals of the plurality of subject images (pictures) formed on the image pickup surface of the image pickup means 3 by the stop 1-2. The second focusing means 1-15 detects the in-focus position by extracting the sharpness of the subject image as an evaluation value from the signal of the subject image formed on the surface of the image pickup means 3.
[0022] Reference numeral 1-16 denotes a focus detection method selection means which judges the reliability of the signals from the first and second focusing means 1-14 and 1-15 and, on the basis of that judgment, selects the focusing means (autofocus mode) used to perform focus detection for the photographing system.
[0023] Each of the means 1-14, 1-15 and 1-16 may be implemented in software, and the selection of the focusing means is not limited to selecting only one of them; both may be selected simultaneously.
[0024] In this embodiment, when the photographing system 1-1 is in focus (when the image pickup means 3 is at position 1-3), the two pieces of image information based on the light fluxes passing through the two apertures 1-6 and 1-7 of the stop 1-2 are imaged at the same position 1-8 on the surface of the image pickup means 3. When the photographing system 1-1 is not in focus, for example in a rear-focus state (when the image pickup means 3 is at position 1-4), the light flux passing through aperture 1-6 forms a slightly blurred image at position 1-9 on the surface of the image pickup means 3, and the light flux passing through aperture 1-7 forms a slightly blurred image at position 1-10.
[0025] When the photographing system 1-1 is in a front-focus state (when the image pickup means 3 is at position 1-5), the light flux passing through aperture 1-6 forms a slightly blurred image at position 1-12 on the surface of the image pickup means 3, and the light flux passing through aperture 1-7 forms a slightly blurred image at position 1-11.
[0026] FIG. 2(A) is an explanatory diagram showing the sizes and shapes of the two apertures 1-6 and 1-7 of the stop 1-2 shown in FIG. 1, and FIG. 2(B) is an explanatory diagram showing an example of the image obtained in the rear-focus state when this stop 1-2 is used. In FIG. 2(B), 2-4 is the image formed by the light passing through aperture 1-6 and 2-5 is the image formed by the light passing through aperture 1-7.
[0027] As for the brightness of the images on the imaging plane in the out-of-focus state, since the aperture 1-7 has a larger area than the aperture 1-6 as described above, the images at positions 1-10 and 1-11 on the surface of the image pickup means 3 are brighter, more blurred images than the images at positions 1-9 and 1-12, respectively.
[0028] By evaluating, in the focus detection means, the brightness of the two images 2-4 and 2-5 or their degree of blur, the direction of the in-focus position of the photographing system can be determined. Both the brightness and the degree of blur of the images may be used for this determination.
[0029] Thus, in the out-of-focus state, the image formed on the image pickup surface is a blurred, multiply overlapped image. By computing the autocorrelation of the image signal from the image pickup means 3 in this state, the amount of shift between the two overlapping images can be obtained, and information on the defocus amount can be obtained from that shift amount.
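As a rough illustration of why the shift between the two overlapping images carries defocus information, a similar-triangles approximation can be used. It is not taken from the patent, and the baseline, pupil distance and pixel pitch below are made-up example values: two aperture centres separated by a baseline b in the exit pupil, located a distance L in front of the sensor, produce sub-images displaced by roughly s = b·d/L for a defocus d.

```python
# Illustrative numbers only (assumptions, not values from the patent).
b = 4.0e-3       # baseline between the two aperture centres [m]
L = 50.0e-3      # distance from the exit pupil to the sensor [m]
pixel = 7.0e-6   # pixel pitch of the image pickup element [m]

def defocus_from_shift(shift_pixels):
    """Estimate the defocus amount [m] from the measured sub-image shift in
    pixels, using s = b * d / L, i.e. d = s * L / b."""
    return shift_pixels * pixel * L / b

print(defocus_from_shift(10))   # a 10-pixel shift gives about 0.875 mm of defocus here
```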
[0030] Next, the first focusing means (first autofocus mode), which uses this autocorrelation calculation, will be described.
[0031] The signal based on the subject image formed on the surface of the image pickup means 3 is output as an electric signal with the time axis t as its variable. Denoting this signal (the original signal) by f(t), the autocorrelation function C(τ) is expressed by the following equation.
[0032]
[Equation 1]
C(τ) = ∫_T f(t)·f(t+τ) dt
Here, T indicates the range of data used for distance measurement.
[0033] Since the actual calculation is performed on the discrete data output from the image pickup means 3, the original signal f(t) can be expressed as X(i, j) using coordinates on the image pickup surface, where i is the horizontal coordinate and j is the vertical coordinate. Using the coordinates X(i, j), the autocorrelation function C(m, n) is expressed by the following equation.
[0034]
[Equation 2]
C(m, n) = Σ_{i=1..I} Σ_{j=1..J} X(i, j)·X(i+m, j+n)
In equation (2), I is the number of pixels of the distance-measuring frame in the horizontal direction and J is the number of pixels of the distance-measuring frame in the vertical direction. The relationship between the coordinates X(i, j) and X(i+m, j+n) is shown in FIG. 3. When m and n are in a fixed relationship and can both be expressed as functions of a parameter k, C(m, n) can be written as a function of the single variable k, namely C(k).
[0035] When n is zero, the function C(k) obtained in this way is equal to equation (1) with the continuous variable τ replaced by the discrete variable k.
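A minimal numerical sketch of equation (2) is given below, written in Python with NumPy purely as an illustration (it is not the patent's implementation; the array layout and the step direction (dm, dn) are assumptions). With (dm, dn) = (1, 0) it reduces to the single-variable C(k) of paragraph [0035].

```python
import numpy as np

def autocorrelation(X, k_max, dm=1, dn=0):
    """C(k) = sum over the distance-measuring frame of X(i, j) * X(i + k*dm, j + k*dn).

    X        2-D luminance array of the frame, indexed X[j, i] (j vertical, i horizontal).
    (dm, dn) unit step (non-negative here) along the direction in which the two
             sub-images are displaced; (1, 0) gives the C(k) of paragraph [0035].
    k_max    largest shift examined; it must stay smaller than the frame size.
    """
    X = np.asarray(X, dtype=float)
    J, I = X.shape
    C = np.zeros(k_max + 1)
    for k in range(k_max + 1):
        di, dj = k * dm, k * dn
        C[k] = np.sum(X[:J - dj, :I - di] * X[dj:, di:])
    return C
```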
[0036] In general, for the doubled (two-line blurred) image shown in FIG. 2(B), the correlation function C(m) obtained from the image signal has the form shown in FIG. 4(A). When m equals the shift amount between the overlapped images, a peak 4-1 appears as shown in FIG. 4(A). Accordingly, the value mp at which the function C(m) peaks for m other than 0 gives the separation (pixel spacing) between the shifted images on the surface of the image pickup means.
[0037] The defocus amount can be obtained from this value. The defocus direction can be obtained, using the value mp that gives the autocorrelation peak, from whether the average of the per-pixel luminance differences between the two images shifted by that number of pixels is positive or negative, that is, from which of the two images is the brighter.
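Continuing the sketch above, the peak search and the direction test of paragraphs [0036] and [0037] might look as follows (illustrative Python only; the simple local-maximum test and the use of the mean luminance difference are assumptions based on the description, not the patent's circuitry):

```python
import numpy as np

def find_shift_and_direction(X, C, dm=1, dn=0):
    """Return (mp, sign): mp is the first local maximum of C(k) for k > 0,
    i.e. the pixel separation of the doubled image, and sign is the sign of
    the mean luminance difference between the image and its copy shifted by
    mp, used here as the defocus-direction indicator of paragraph [0037]."""
    mp = None
    for k in range(1, len(C) - 1):
        if C[k] >= C[k - 1] and C[k] >= C[k + 1]:
            mp = k
            break
    if mp is None:
        return None, 0                          # no secondary peak was found
    X = np.asarray(X, dtype=float)
    J, I = X.shape
    di, dj = mp * dm, mp * dn
    diff = X[:J - dj, :I - di] - X[dj:, di:]    # image minus shifted image
    return mp, int(np.sign(diff.mean()))
```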
[0038] In this focus detection method, the image shift occurs in the direction parallel to the straight line 2-3 shown in FIG. 2(A) (the line connecting the centers of the two apertures 1-6 and 1-7). Therefore, by mounting the stop so that the direction in which the apertures 1-6 and 1-7 are arranged is not parallel to the scanning-line direction of the image pickup means, focus detection can be performed satisfactorily even for a subject with a horizontal stripe pattern, which conventional focus detection methods handle poorly.
[0039] The number of apertures of the stop may be increased to three or more so that the pupil is divided into more regions and three or more pieces of image information are obtained. In this case as well, the plurality of apertures should be arranged so that their direction of alignment is not parallel to the scanning-line direction of the image pickup means, as described above.
[0040] In this embodiment, as can also be seen from FIG. 4(A), the value of the autocorrelation function is largest when m is 0, and the peak corresponding to the focus shift is small by comparison; this is particularly so when the shift amount mp is close to 0.
[0041] That is, when the defocus amount is very small, the correlation value of the second peak is hidden by the peak at m = 0 and becomes difficult to detect. Furthermore, for subjects such as periodic patterns, the correlation data may have a waveform with a plurality of peaks at values of m other than 0, as shown by peaks 4-2, 4-3, 4-4 and 4-5 in FIG. 4(B). In such cases, accurate focus determination becomes difficult.
[0042] In this embodiment, therefore, in such cases an approximate in-focus position is first obtained from one frame of the image signal by the correlation calculation, as described later, and the focus detection method selection means then switches to the second focusing means (second autofocus mode), which uses the more precise sharpness evaluation.
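One possible way to recognize these two unreliable cases (no usable secondary peak, or several comparable peaks from a periodic subject) is sketched below. This is illustrative Python only; the patent does not specify the test, and the relative-height threshold is an assumption.

```python
import numpy as np

def correlation_result_reliable(C, rel_height=0.3):
    """Return False when the C(k) curve should not be trusted: either no local
    maximum of sufficient height exists for k > 0 (defocus too small, the peak
    is buried under the k = 0 maximum), or several comparable peaks exist
    (periodic subject, FIG. 4(B)).  In both cases the sharpness mode is used."""
    C = np.asarray(C, dtype=float)
    peaks = [k for k in range(1, len(C) - 1)
             if C[k] >= C[k - 1] and C[k] >= C[k + 1] and C[k] >= rel_height * C[0]]
    return len(peaks) == 1
```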
[0043] Next, the second focusing means (second autofocus mode), which evaluates the sharpness of the image, will be described.
[0044] Even when the optical path is divided by a stop having a plurality of apertures, the degree of blur is smallest at the in-focus position and the overlapping images become more blurred out of focus, as is clear from FIG. 1 described above. By quantifying this as a sharpness evaluation value with the conventional focus detection method described earlier, hill-climbing focus detection can be performed in the same way as before.
[0045] Even when a stop having a plurality of apertures is used, exactly as with a stop having a single aperture, the sharpness evaluation value is largest in the in-focus state and, in the out-of-focus state, becomes smaller the farther the lens is from the in-focus position.
[0046] As an example of a subject that this image-sharpness evaluation method handles poorly, when a high-luminance subject is present, the hill-climbing method that evaluates sharpness produces a spurious peak where the high-luminance subject is, so the correct in-focus position cannot be obtained.
[0047] When there is a subject brighter than a certain threshold, however, the in-focus position can be obtained by the autocorrelation calculation; since the in-focus position is then found from the image shift amount, malfunction can be prevented even for a high-luminance subject.
[0048] In the focus detection method that evaluates image sharpness, particularly when the processing is performed by an analog circuit, only edges perpendicular to the scanning lines are evaluated, so focus detection by sharpness evaluation becomes difficult for a horizontal-stripe subject that has only edges parallel to the scanning-line direction.
[0049] In this embodiment, therefore, as described later, the plurality of focus detection methods described above are used selectively according to the focus information obtained from the respective focus detection means, so that the subjects each method handles poorly are covered by the other. Accurate autofocus with few malfunctions is thereby performed.
[0050] FIG. 7 shows a specific block diagram of Embodiment 1 of the present invention.
[0051] In this embodiment, when the autocorrelation calculation is performed, the luminance signal separation circuit 6-8 first extracts the luminance signal from the signal from the image pickup means (CCD) 6-3, the A/D converter 6-9 converts it into a digital signal, and the signal X(i, j) of each pixel is stored in the memory 6-10.
[0052] The signal X(i, j) is then read by the microprocessor 6-7, the autocorrelation calculation is performed while the value of m in equation (2) above is changed, and, on the basis of the result, the focus lens of the photographing system 6-1 is driven by the motor 6-4 via the motor driver 6-8 to perform focusing.
[0053] In this embodiment, the defocus direction need only be checked once, after the peak has been found. The time required to obtain the in-focus position information after one frame has been sampled depends on the processing capability of the microprocessor 6-7.
[0054] Next, the calculation method based on evaluation of the image sharpness will be described. In this embodiment the ES method, which evaluates the edge width of image contours, is used as one sharpness evaluation method.
[0055] That is, using the luminance signal from the luminance signal separation circuit 6-8, the luminance difference and the differential value of each edge portion are obtained by the difference circuit 6-11 and the differentiation circuit 6-12, the reciprocal of the edge width is obtained from the ratio of these values by the division circuit 6-13, and the result is integrated over a predetermined distance-measuring range by the integration circuit 6-14. The integrated value is converted into a digital value by the A/D converter 6-15 and read by the microprocessor 6-7, and focusing is performed by sampling while the focus lens is driven by the motor 6-4 via the motor driver 6-8 toward the position giving the maximum value.
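A digital counterpart of this analog ES pipeline might look like the following. This is an illustrative Python sketch only, not the patent's circuit: the edge detection by a gradient threshold, the run-based edge grouping and the threshold value are all assumptions; only the idea of accumulating the reciprocal edge width (gradient divided by luminance difference) comes from the description above.

```python
import numpy as np

def es_evaluation(Y, grad_thresh=4.0):
    """ES-style sharpness score: for each monotonic edge run on each scanning
    line, the edge width is estimated as (luminance difference across the run)
    divided by (the steepest gradient in the run); the reciprocal widths are
    accumulated over the distance-measuring area.  Narrow edges, i.e. a sharp
    image, give a large score."""
    Y = np.asarray(Y, dtype=float)
    score = 0.0
    for row in Y:                              # one scanning line at a time
        grad = np.diff(row)                    # 'differentiation circuit'
        i = 0
        while i < len(grad):
            if abs(grad[i]) > grad_thresh:
                sign = np.sign(grad[i])
                j = i
                while (j < len(grad) and np.sign(grad[j]) == sign
                       and abs(grad[j]) > grad_thresh):
                    j += 1                     # extend over the monotonic edge run
                diff = abs(row[j] - row[i])    # 'difference circuit'
                gmax = np.max(np.abs(grad[i:j]))
                if diff > 0:
                    score += gmax / diff       # 'division circuit', then 'integration'
                i = j
            else:
                i += 1
    return score
```

With these assumptions a perfectly sharp step contributes roughly 1 to the score, while an edge blurred over w pixels contributes roughly 1/w, so the score peaks at the in-focus position just like the analog evaluation value.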
[0056] FIGS. 5 and 6 are flowcharts of Embodiment 1 of the present invention. The operation of this embodiment will now be described with reference to these flowcharts.
[0057] First, when the switch is turned ON in step 5-1, the first autofocus mode based on the autocorrelation calculation is entered. Next, in step 5-2, one screen (one frame) of image data is loaded into the memory, and in step 5-3 the autocorrelation calculation is performed to determine the amount and direction of movement of the focus lens.
[0058] Next, in step 5-4, it is determined whether the subject is one that the autocorrelation method handles poorly. If it is judged to be such a subject, or if, as described above, the system is near the in-focus point and no peak can be found, the process proceeds to step 5-9 and moves to the second autofocus mode based on sharpness evaluation. If the subject is not a poor one for autocorrelation, the focus lens is driven in step 5-5 by the movement amount obtained in step 5-3.
[0059] Next, in step 5-6, it is determined whether the subject is one that sharpness evaluation handles poorly. If accurate peak detection is judged to be difficult, for example because the subject includes high-luminance parts, the focus lens is stopped at the in-focus point obtained from the autocorrelation and the process proceeds to step 5-7; otherwise the process proceeds to step 5-9 and switches to the second autofocus mode based on sharpness evaluation to search for the exact in-focus point.
[0060] As the determination method in step 5-6, in the case of high-luminance subjects a threshold is determined in advance from data obtained by photographing high-luminance subjects, and software or the like checks whether the one frame of data read into the memory contains any value exceeding that threshold.
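In digital form the check of step 5-6 can be as small as the snippet below (illustrative Python; the threshold of 240 is a made-up example, the patent only states that it is derived beforehand from photographs of high-luminance subjects).

```python
import numpy as np

def contains_high_luminance(frame, threshold=240):
    """Return True when any pixel of the frame read into memory exceeds the
    predetermined threshold; sharpness (hill-climbing) evaluation is then
    treated as unreliable and the autocorrelation result is kept instead."""
    return bool(np.any(np.asarray(frame) > threshold))
```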
[0061] Next, in step 5-7, one frame is sampled again and the same processing as in steps 5-2 and 5-3 above is performed to determine whether the peak at the correlation value mp has really disappeared. If it has disappeared, the process proceeds to step 5-8 and the system is regarded as being in focus; if it has not disappeared, the process returns to step 5-2 and the above steps are repeated.
[0062] From step 5-9 onward, on the other hand, the second autofocus mode based on evaluation of the image sharpness is entered. The focus detection method that evaluates image sharpness takes time because many frames must be sampled while the lens is driven; in this embodiment, however, the in-focus point is already known approximately from the autocorrelation calculation, so the sharpness need be evaluated only around this in-focus point.
[0063] In FIG. 6, ES denotes the sharpness evaluation value obtained using the edge width of the image, and is hereinafter referred to as the "evaluation value ES".
[0064] First, in step 5-9, calculation of the evaluation value ES is started; then, in step 5-10, sampling of the evaluation value ES is continued while the focus lens is driven. As for the driving direction, the lens is first reciprocated by a predetermined amount about that point. Next, if a peak occurs during this driving, in step 5-11 the value of that peak, ESp, is stored and the focus lens is moved to that point to obtain the in-focus state.
[0065] If there is no peak, the evaluation value ES increases on one side and decreases on the other, as at position 7-2 on waveform A shown in FIG. 8, so the direction of the peak can be determined from this and the peak can then be found.
[0066] Then, after the in-focus state has been reached in steps 5-12 and 5-13, the change in the evaluation value ES is monitored continuously, repeating until the evaluation value ES becomes smaller than the peak value ESp.
[0067] If the in-focus state is reached without sharpness evaluation, steps 5-7 and 5-8 described above may be repeated at fixed time intervals to load one frame of data into the memory and monitor the result of the autocorrelation calculation.
[0068] Alternatively, by using a microprocessor capable of parallel processing or the like, changes in the in-focus state of the photographing system may be monitored with the two focus detection methods described above simultaneously.
[0069] Next, in step 5-14, if the in-focus state of the photographing system has changed, autofocus is restarted. The threshold ESo used at this point is set arbitrarily according to the conditions at the time, for example to half the value of the peak ESp obtained at the time of focusing.
[0070] Next, in step 5-15, the evaluation value ES is compared with the threshold ESo. If the changed evaluation value ES is smaller than the threshold ESo, it is judged that the in-focus position has changed greatly, and the process proceeds to step 5-16: the calculation of the sharpness evaluation value ES is stopped, the process returns to step 5-2, that is, to the first autofocus mode based on the autocorrelation calculation, and the above steps are repeated.
[0071] Conversely, if the evaluation value ES is larger than the threshold ESo, it is judged that the in-focus position has moved only slightly; the process remains in the second autofocus mode based on sharpness evaluation, returns to step 5-10, and repeats the above steps to search for the in-focus position.
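Putting the flowchart of FIGS. 5 and 6 together, the mode-switching logic can be summarized roughly as follows. This is a condensed Python sketch of the control flow only: every helper named here (grab_frame, correlation_af, poor_for_correlation, poor_for_sharpness, drive_lens, sharpness_af, read_es) is a hypothetical placeholder, some branches are simplified, and the restart threshold ESo = ESp/2 is just the example value mentioned in [0069].

```python
def af_control_loop(grab_frame, correlation_af, poor_for_correlation,
                    poor_for_sharpness, drive_lens, sharpness_af, read_es):
    """Condensed two-mode autofocus control: coarse one-frame correlation AF,
    then (when the subject allows it) fine sharpness AF, then monitoring with
    a restart threshold ESo."""
    while True:
        # First autofocus mode (steps 5-1..5-5): one frame, autocorrelation.
        frame = grab_frame()
        move, reliable = correlation_af(frame)       # lens movement and validity
        if reliable and not poor_for_correlation(frame):
            drive_lens(move)                         # step 5-5
            if poor_for_sharpness(frame):            # e.g. high-luminance subject:
                continue                             # keep relying on correlation AF
        # Second autofocus mode (steps 5-9..5-13): ES hill climb near that point.
        es_peak = sharpness_af()
        es_o = es_peak / 2                           # restart threshold ESo (step 5-14)
        # Monitoring (steps 5-14..5-16).
        while True:
            es = read_es()
            if es < es_o:                            # scene changed a lot: mode 1 again
                break
            if es < es_peak:                         # drifted slightly: re-climb locally
                es_peak = sharpness_af()
                es_o = es_peak / 2
```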
[0072] Thus, in this embodiment, by controlling each means appropriately so as to maintain the in-focus state as described above, the number of sampled frames can be reduced compared with conventional focus detection methods, and focus detection is performed quickly and accurately because the subjects that each focus detection method handles poorly, and which would otherwise cause malfunctions, are covered by the other method.
[0073] When recording of a high-definition image signal is desired, as in still-image shooting, a stop 6-2 whose number of apertures can be switched and a stop switching unit 6-6 may be used as shown in FIG. 7 so that, only at the time of shooting, the stop is switched to one having a single aperture. This prevents the background and the like from being recorded as overlapping, shifted images.
[0074] Although the ES method is used in this embodiment as one method of evaluating image sharpness, the present invention can be applied in the same manner as in the above embodiment even when focus detection is performed using another evaluation method.
[0075]
EFFECTS OF THE INVENTION
According to the present invention, as described above, after the in-focus position is obtained from one frame of the video signal by the correlation calculation, the system can switch, according to the focus information, to the autofocus mode based on the more precise sharpness evaluation method. The number of frames to be sampled can therefore be reduced compared with conventional focus detection methods and the in-focus position can be reached quickly. Moreover, since the two focus detection methods (autofocus modes) use the same stop having a plurality of apertures, they can be used without switching the stop, so the focus detection method can be switched instantaneously, or the correlation calculation and the sharpness evaluation value calculation of the two focus detection methods can even be performed simultaneously. A focus detection device having these properties can thus be achieved.
[0076] Further, according to the present invention, by using the plurality of focus detection methods selectively according to the focus information, the subjects that each method handles poorly can be covered by the other, and a focus detection device capable of accurate autofocus with few malfunctions can thereby be achieved.
FIG. 1 is a configuration diagram of the essential parts of Embodiment 1 of the present invention.
FIG. 2 is an explanatory diagram showing the shape of the stop of Embodiment 1 and an example of the image obtained when it is used.
FIG. 3 is an explanatory diagram showing the relationship between pixels on the surface of the image pickup means in Embodiment 1.
FIG. 4 is an explanatory diagram showing, as graphs, calculation results of the autocorrelation function in Embodiment 1.
FIG. 5 is a flowchart of Embodiment 1 of the present invention.
FIG. 6 is a flowchart of Embodiment 1 of the present invention.
FIG. 7 is a block diagram of the essential parts of Embodiment 1 of the present invention.
FIG. 8 is an explanatory diagram showing, as a graph, the sharpness evaluation value in Embodiment 1 of the present invention.
1-1 Photographing system; 1-2 Stop; 3 Image pickup means; 1-13 Driving means; 1-14 Focus information detection means; 1-15 Focus information detection means; 1-16 Focus detection method selection means; 2-1, 2-2 Apertures
Front page continuation: (72) Inventor: Toshinobu Yamaguchi, 3-30-2 Shimomaruko, Ota-ku, Tokyo, within Canon Inc.; (72) Inventor: Hiroshi Saruwatari, 3-30-2 Shimomaruko, Ota-ku, Tokyo, within Canon Inc.
Claims (2)
1. A focus detection device in which the pupil of a photographing system is divided into a plurality of regions by a stop having a plurality of apertures, a plurality of pieces of image information based on the light fluxes passing through the divided regions are formed on the surface of an image pickup means, and the in-focus state of the photographing system is detected by a plurality of focusing means using the image signal from the image pickup means, characterized in that, of the plurality of focusing means, a first focusing means first detects the in-focus position of the photographing system, and a focus detection method selection means then selects whether focusing is performed using a second focusing means different from the first focusing means or using the first focusing means, whereby the in-focus state of the photographing system is detected.
2. The focus detection device according to claim 1, characterized in that the first focusing means detects the in-focus position by a correlation calculation on the signals of the plurality of subject images formed on the surface of the image pickup means by the stop, and the second focusing means detects the in-focus position by extracting, as an evaluation value, the sharpness of the subject image formed on the surface of the image pickup means.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP6082374A JPH07270674A (en) | 1994-03-29 | 1994-03-29 | Focusing detecting device |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP6082374A JPH07270674A (en) | 1994-03-29 | 1994-03-29 | Focusing detecting device |
Publications (1)
Publication Number | Publication Date |
---|---|
JPH07270674A true JPH07270674A (en) | 1995-10-20 |
Family
ID=13772819
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
JP6082374A Pending JPH07270674A (en) | 1994-03-29 | 1994-03-29 | Focusing detecting device |
Country Status (1)
Country | Link |
---|---|
JP (1) | JPH07270674A (en) |
Cited By (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6144805A (en) * | 1994-07-26 | 2000-11-07 | Canon Kabushiki Kaisha | Optical apparatus for correcting focus detection caused by environmental variation |
US6268885B1 (en) | 1996-01-31 | 2001-07-31 | Canon Kabushiki Kaisha | Optical apparatus for correcting focus based on temperature and humidity |
US6822688B2 (en) | 1996-01-31 | 2004-11-23 | Canon Kabushiki Kaisha | Movable-lens position control apparatus |
JP2001174696A (en) * | 1999-12-15 | 2001-06-29 | Olympus Optical Co Ltd | Color image pickup unit |
JP2001305415A (en) * | 2000-04-19 | 2001-10-31 | Canon Inc | Focus detector |
JP2011154385A (en) * | 2004-07-12 | 2011-08-11 | Canon Inc | Optical apparatus |
JP2009267523A (en) * | 2008-04-22 | 2009-11-12 | Nikon Corp | Image restoration apparatus and imaging apparatus |
JP2011091602A (en) * | 2009-10-22 | 2011-05-06 | Canon Inc | Image display device, imaging apparatus and method for displaying image |
JP2012203314A (en) * | 2011-03-28 | 2012-10-22 | Fujifilm Corp | Imaging apparatus and control method for imaging apparatus |
JP2012226247A (en) * | 2011-04-22 | 2012-11-15 | Nikon Corp | Focus detector and imaging apparatus |
JP2015014802A (en) * | 2014-09-01 | 2015-01-22 | キヤノン株式会社 | Automatic focusing device |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US7778539B2 (en) | Optical apparatus | |
US7095443B2 (en) | Focusing state determining apparatus | |
US8254773B2 (en) | Image tracking apparatus and tracking evaluation method | |
US20040223073A1 (en) | Focal length detecting method and focusing device | |
JPH02135311A (en) | Focus detecting device | |
JP2001330769A (en) | Image pickup device and its control method | |
JP2003029135A (en) | Camera, camera system and photographic lens device | |
JPH07270674A (en) | Focusing detecting device | |
US7570298B2 (en) | Image-taking apparatus with first focus control such that in-focus position is searched for based on first signal and second focus control such that one of in-focus position and drive amount is determined based on second signal | |
JP2001174696A (en) | Color image pickup unit | |
JPH07298120A (en) | Automatic focusing adjustment device | |
JP4011738B2 (en) | Optical device | |
JPH07287162A (en) | Image input device with automatic focus adjusting means | |
JP4085720B2 (en) | Digital camera | |
JP2007133301A (en) | Autofocus camera | |
JP3412915B2 (en) | Automatic focusing device | |
JPH07143388A (en) | Video camera | |
JP3382321B2 (en) | Optical equipment | |
JP2868834B2 (en) | Auto focus device | |
JP2004258085A (en) | Autofocus system | |
JP3302132B2 (en) | Imaging device | |
JP2810403B2 (en) | Automatic focusing device | |
JPH07159685A (en) | Automatic focusing device | |
JPH07318787A (en) | Automatic zoom device | |
JPH0769513B2 (en) | Focus control signal forming device |