JPH1185981A - Distance measurement origin recognition device for moving objects - Google Patents
Distance measurement origin recognition device for moving objects
- Publication number
- JPH1185981A (application JP24731997A)
- Authority
- JP
- Japan
- Prior art keywords
- distance
- mark
- image
- pattern
- line segment
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Landscapes
- Measurement Of Optical Distance (AREA)
- Image Processing (AREA)
- Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
- Image Analysis (AREA)
Abstract
(57) [Abstract]
[Problem] To enable accurate detection of the absolute position and absolute azimuth in three-dimensional space when measuring the self-position from an origin.
[Solution] A landmark whose pattern has no identical pixel pattern on the same scanning line and has multiple intersections of line segments is placed at the distance measurement origin and imaged by a stereo camera 10, and a stereo processing unit performs stereo matching to generate a distance image. A recognition processing unit 50 then differentiates the luminance of the original mark image to extract line segments and, using the distance image, fixes them as line segments in three-dimensional space, thereby detecting the absolute position and absolute azimuth.
Description
[0001]
[Technical Field of the Invention] The present invention relates to a distance measurement origin recognition device that recognizes a mark placed at a distance measurement origin and detects the position and azimuth at that origin.
[0002]
[Prior Art] Various technologies, such as movement control, path detection, route detection, and location detection, have been developed for bodies that move autonomously, such as unmanned robots, autonomous work vehicles, and unmanned helicopters. Among these, self-position recognition is one of the most important.
[0003] As examples of self-position recognition techniques, a moving body that travels autonomously on the ground, such as an autonomous work vehicle, detects two-dimensional angular velocity with a vibration gyro or optical gyro, detects translational speed with a ground-speed sensor, and computes the amount of movement from a reference position to determine its own position; a flying body such as an unmanned helicopter uses an inertial navigation system to detect gravitational acceleration, derive the acceleration of the craft, and integrate it to obtain the amount of movement.
[0004] Furthermore, techniques that apply image processing to recognize the self-position of a moving body have been developed in recent years. In particular, in a technique where a camera mounted on the moving body images the surrounding environment, the optical flow between two images captured at different times is used to detect the motion of the body and compute its velocity components, and the navigation trajectory from the distance measurement start point (origin) is accumulated to recognize the self-position, the vast amount of information contained in the images makes it possible to analyze the surrounding environment, distinguish complex terrain, and realize accurate autonomous navigation.
[0005]
[Problems to be Solved by the Invention] With self-position recognition based on image processing as described above, however, some additional device for recognizing the distance measurement origin is required whenever the absolute position and absolute azimuth must be known.
[0006] Measures such as separately mounting a geomagnetic sensor and detecting the attitude angle as an absolute azimuth are therefore conceivable, but they suffer from insufficient accuracy and reliability, from equipment too bulky to mount on a moving body, and from increased cost.
[0007] The present invention has been made in view of the above circumstances, and its object is to provide a distance measurement origin recognition device that can accurately detect the absolute position and absolute azimuth in three-dimensional space when measuring distance from the origin.
[0008]
[Means for Solving the Problems] The invention according to claim 1 is characterized by comprising: a pair of stereo cameras that image a mark placed at the distance measurement origin; a stereo processing unit that searches for corresponding positions in the pair of images captured by the stereo cameras, obtains the pixel displacement arising according to the distance to the object, and generates a distance image in which the depth information derived from that pixel displacement is expressed numerically; and a recognition processing unit that differentiates the luminance of the captured mark image to extract line-segment shapes and recognizes the three-dimensional position and azimuth relative to the distance measurement origin on the basis of the line-segment shapes and the distance image.
[0009] The invention according to claim 2 is characterized in that, in the invention of claim 1, the line-segment shapes are extracted on the basis of the spacing and frequency of the peaks of a histogram taken over the angle of the luminance differential vector.
[0010] The invention according to claim 3 is characterized in that, in the invention of claim 1, the mark has a pattern that contains no identical pixel pattern on the same scanning line of the captured image and in which multiple intersections of line segments can be detected in the luminance differential image.
[0011] The invention according to claim 4 is characterized in that, in the invention of claim 1, the mark has a pattern in which regions of different luminance are formed by combining triangles.
[0012] The invention according to claim 5 is characterized in that, in the invention of claim 1, the mark has a pattern in which a region whose luminance differs from the surroundings is formed in an X shape.
[0013] That is, in the distance measurement origin recognition device according to the present invention, the mark placed at the distance measurement origin is imaged by the stereo cameras; corresponding positions are searched for in the pair of captured images to obtain the pixel displacement arising according to the distance to the object, and a distance image is generated in which the depth information derived from that displacement is expressed numerically. The captured mark image is then differentiated in luminance to extract line-segment shapes, and the three-dimensional position and azimuth relative to the distance measurement origin are recognized from the line-segment shapes and the distance image. The line-segment shapes can be extracted on the basis of the spacing and frequency of the peaks of a histogram taken over the angle of the luminance differential vector.
[0014] The mark placed at the distance measurement origin should have a pattern that contains, on the same scanning line of the captured image, no identical pixel pattern that could cause stereo ranging errors, and in which multiple intersections of line segments can be detected in the luminance differential image. Such a mark can be realized by a pattern in which regions of different luminance are formed by combining triangles, or by a pattern in which a region whose luminance differs from the surroundings is formed in an X shape.
[0015]
[Embodiments of the Invention] Embodiments of the present invention will now be described with reference to the drawings, which relate to one embodiment of the invention: FIG. 1 is a basic configuration diagram of the distance measurement origin recognition device; FIG. 2 is a flowchart of the position/azimuth detection routine; FIG. 3 is an explanatory diagram showing a landmark with a simple grid pattern; FIG. 4 is an explanatory diagram showing landmark pattern example 1, which does not cause mismatching; FIG. 5 is an explanatory diagram showing landmark pattern example 2, which does not cause mismatching; FIG. 6 is an explanatory diagram showing landmark pattern example 3, which does not cause mismatching; FIG. 7 is an explanatory diagram showing the extraction of small regions for computing the luminance derivative; FIG. 8 is an explanatory diagram showing an example of luminance differential vectors; and FIG. 9 is an explanatory diagram showing the angle histogram of the luminance differential vectors.
[0016] In FIG. 1, reference numeral 1 denotes a distance measurement origin recognition device that, when an autonomous moving body recognizes its own position, recognizes a mark placed at the distance measurement start point (origin) and detects the absolute position and absolute azimuth of the origin. Its basic configuration comprises: a stereo camera 10 consisting of a pair of cameras 10a and 10b for imaging the mark; a stereo processing unit 20 that stereo-matches the pair of images captured by the stereo camera 10 to generate a distance image; a distance image memory 30 that stores the distance image generated by the stereo processing unit 20; an original image memory 40 that stores the original images captured by the stereo camera 10; and a recognition processing unit 50 that recognizes the mark from its original image and the distance image and detects its position and azimuth.
[0017] The distance measurement origin recognition device 1 can be configured as part of a system that recognizes the self-position of a moving body by image processing. For example, consider an autonomous flying body such as an unmanned helicopter that carries one pair of stereo cameras for imaging the forward (distant) scenery and another pair for imaging the downward scenery (ground surface). The optical flow between time-series images of the distant scenery and the optical flow between time-series images of the downward scenery are each converted into real-space movement using the corresponding distance images to obtain velocity components; the rotational velocity component due to the motion of the distant image is removed from the velocity component due to the motion of the downward image to obtain the pure translational velocity component, which is then transformed into the translational velocity seen from the distance measurement start point (origin) and accumulated to obtain the navigation trajectory in three-dimensional space. In such a system, it is necessary, when starting distance measurement, to recognize an origin-detection landmark placed on the ground and thereby determine the absolute position and absolute azimuth of the origin. In this case, the stereo camera 10 can double as the ground-surface imaging stereo camera, and the stereo processing unit 20 and recognition processing unit 50 can be incorporated as part of the system.
[0018] To ensure the accuracy of position and azimuth detection, the landmark should be a pattern with many detection points and many line segments; at the same time, it must be a pattern that does not adversely affect the stereo matching performed in the stereo processing unit 20.
[0019] That is, for the pair of images captured by the two stereo cameras 10, the stereo processing unit 20 computes the city-block distance for each small region of the images and finds the mutual correlation to identify corresponding regions, thereby obtaining the pixel displacement (parallax) that arises according to the distance to the object. Since the device acquires three-dimensional image information (a distance image) in which the depth information derived from this displacement is expressed numerically, parallax will be detected incorrectly (mismatching) if several similar regions exist along the parallax detection direction (horizontal scanning direction). The process of generating a distance image from stereo camera images is described in detail in Japanese Patent Application Laid-Open No. 5-114099 by the present applicant.
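By way of illustration, the city-block block matching described above can be sketched as follows in Python; the block size, disparity search range, choice of the left image as reference, and the left-minus-right search direction are assumptions for the example, not values taken from the patent.

```python
import numpy as np

def disparity_by_city_block(left, right, block=4, max_disp=64):
    """For each small block of the (assumed) left reference image, find the
    horizontal shift of the right image that minimizes the city-block
    (sum of absolute differences) distance; that shift is the disparity."""
    h, w = left.shape
    disp = np.zeros((h // block, w // block), dtype=np.float32)
    for by in range(h // block):
        for bx in range(w // block):
            y, x = by * block, bx * block
            ref = left[y:y + block, x:x + block].astype(np.int32)
            best_cost, best_d = None, 0
            for d in range(max_disp):
                if x - d < 0:
                    break
                cand = right[y:y + block, x - d:x - d + block].astype(np.int32)
                cost = int(np.abs(ref - cand).sum())   # city-block distance
                if best_cost is None or cost < best_cost:
                    best_cost, best_d = cost, d
            disp[by, bx] = best_d      # larger disparity means a nearer object
    return disp
```

If two blocks on the same scan line look alike, the cost minimum is ambiguous; this is exactly the mismatching problem that the landmark patterns discussed below are designed to avoid.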
[0020] For example, the landmark 59 with the pattern shown in FIG. 3 is a simple grid pattern; among the small regions a, b, c, d, e and f containing the luminance edges extracted for the distance image, the regions a, c, e and the regions b, d, f become mutually similar along the parallax detection direction A-A', so mismatching occurs during stereo processing and correct distance data cannot be obtained.
[0021] Therefore, the present invention adopts a landmark whose pattern contains, on the same scanning line, no identical pixel pattern that would cause a stereo ranging error, and which has multiple intersections of line segments; that is, a pattern in which no similar small regions exist along the parallax detection direction during stereo matching, no matter in which direction the parallax is detected (no matter how the pattern appears tilted in the image).
[0022] Examples of such landmark patterns are shown in FIGS. 4, 5 and 6. The landmark 60A in FIG. 4 and the landmark 60B in FIG. 5 each have a pattern in which regions of different luminance are formed by combining triangles; none of the small regions a, b, c, d, e, f, g are similar along the parallax detection direction A-A', so no mismatching occurs. The landmark 60C in FIG. 6 is a pattern in which a region whose luminance differs from the surroundings is formed in an X shape; likewise, no similar small regions exist on the same straight line, so none of the small regions a, b, c, d, e, f are similar along the parallax detection direction A-A', and resolution can be secured by ensuring a sufficient imaged length of the line segments.
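The design rule of paragraphs [0021] and [0022] can also be checked numerically; the sketch below only illustrates that rule (the block size and the two thresholds are arbitrary assumptions), flagging a pattern in which two edge-containing small regions in the same scan-line band look alike.

```python
import numpy as np

def scanline_ambiguous(pattern, block=4, min_edge=20, tol=8.0):
    """Return True if any two edge-containing blocks in the same scan-line
    band of the landmark image are nearly identical, i.e. if the pattern
    could cause stereo mismatching along the parallax detection direction."""
    h, w = pattern.shape
    for y in range(0, h - block + 1, block):
        edge_blocks = []
        for x in range(0, w - block + 1, block):
            blk = pattern[y:y + block, x:x + block].astype(np.int32)
            if np.ptp(blk) >= min_edge:        # keep blocks that contain an edge
                edge_blocks.append(blk)
        for a in range(len(edge_blocks)):
            for b in range(a + 1, len(edge_blocks)):
                if np.abs(edge_blocks[a] - edge_blocks[b]).mean() < tol:
                    return True                # two similar regions on one line
    return False
```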
[0023] For the landmarks 60A (60B, 60C) described above, the recognition processing unit 50 executes the position/azimuth detection routine of FIG. 2 to detect the position and azimuth.
[0024] In this position/azimuth detection routine, first, in step S101, the search region preset for detecting the mark in its original image is divided into small regions, and a luminance differential vector ΔP is obtained from the luminance of each small region. For example, as shown in FIG. 7, the mark search region is divided into small regions of 2×2 pixels; when luminance data P0, P1 are obtained in the i direction and P2, P3 in the j direction, the luminance differential vector ΔP can be given by the following equation (1).
[0025] Rewritten in terms of its angle α and length (magnitude) L, the luminance differential vector ΔP can be expressed by the following equations (2) and (3):
α = tan⁻¹(−ΔPi/ΔPj) …(2)
L = (ΔPi² + ΔPj²)^(1/2) …(3)
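For reference, the per-region computation behind step S101 and equations (2) and (3) might look like the sketch below; since equation (1) itself is not reproduced in this text, a simple difference form for ΔPi and ΔPj over the 2×2 region is assumed here.

```python
import math

def luminance_differential(p0, p1, p2, p3):
    """Assumed form of equation (1): differences over the i-direction pair
    (P0, P1) and the j-direction pair (P2, P3) of a 2x2 small region."""
    dpi = float(p1) - float(p0)
    dpj = float(p3) - float(p2)
    if dpj == 0.0:
        alpha = 90.0 if dpi != 0.0 else 0.0          # avoid division by zero
    else:
        alpha = math.degrees(math.atan(-dpi / dpj))  # equation (2), -90..+90 deg
    length = math.hypot(dpi, dpj)                    # equation (3)
    return alpha, length
```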
[0026] Next, the routine proceeds to step S102, where it is checked whether the length L of the luminance differential vector ΔP exceeds a threshold predetermined for noise removal. If the vector length L does not exceed the threshold, the routine jumps to step S104. If it does exceed the threshold, then in step S103 the frequency is added to the histogram h[α] over the vector angle α (where α = −90 to +90 deg), the coordinate data i[n], j[n], the distance data d[n] obtained from the distance image, and the angle data K[n] are stored in memory, and the routine proceeds to step S104.
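Steps S102 and S103 amount to a thresholded voting pass; a possible sketch follows, in which the record layout, the noise threshold, and the one-degree histogram bins are assumptions made for the example.

```python
import numpy as np

def accumulate_angle_histogram(regions, length_threshold=10.0):
    """regions: iterable of (i, j, d, alpha, length) per small region, where d
    is the distance-image value. Vectors at or below the noise threshold are
    skipped (S102); the rest vote into h[alpha], and their coordinate,
    distance and angle data are kept for later segment classification (S103)."""
    h = np.zeros(181, dtype=np.int32)          # one bin per degree, -90..+90
    kept = []
    for i, j, d, alpha, length in regions:
        if length <= length_threshold:
            continue
        h[int(round(alpha)) + 90] += 1
        kept.append((i, j, d, alpha))
    return h, kept
```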
[0027] In step S104, it is checked whether processing of the search range has finished. If not, the routine returns to step S101 and repeats the above processing. When processing of the search range has finished and the luminance differential vector ΔP has been obtained over the entire search region, the routine proceeds to step S105, where the angle histogram is filtered. Then, in step S106, it is determined from the spacing and frequency of the peaks of the angle histogram which group in the histogram corresponds to the detection result of which line segment in the pattern; the line segments are classified accordingly and their slopes and intercepts are computed.
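The filtering and peak grouping of steps S105 and S106 could be sketched as below; the smoothing kernel, the frequency floor, and the minimum peak separation are illustrative choices rather than values from the specification.

```python
import numpy as np

def angle_histogram_peaks(h, min_count=30, min_separation=10):
    """Smooth the angle histogram (S105), then keep local maxima that are
    frequent enough and far enough apart that each peak can be assigned to
    one family of parallel line segments in the mark pattern (S106)."""
    kernel = np.array([1.0, 2.0, 3.0, 2.0, 1.0])
    smoothed = np.convolve(h, kernel / kernel.sum(), mode="same")
    peaks = []
    for a in range(1, len(smoothed) - 1):
        is_max = smoothed[a] >= smoothed[a - 1] and smoothed[a] > smoothed[a + 1]
        if is_max and smoothed[a] >= min_count:
            if all(abs(a - p) >= min_separation for p in peaks):
                peaks.append(a)
    return [p - 90 for p in peaks]             # peak angles in degrees
```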
[0028] For example, with the landmark 60A of the pattern shown in FIG. 4, when the mark appears horizontal in the image, an angle histogram such as that shown in FIG. 9 is obtained. Plotting on the image the coordinates of the small regions whose luminance differential vectors contributed to this angle histogram yields the line segments LA, LB, LC, LD and LE shown in FIG. 8, corresponding to the pattern. The line segments LF, LG, LH and LI arise from luminance differential vectors at the boundary with the background.
[0029] In this case, the pattern of FIG. 4 is known to produce luminance differential vectors at the combination of 0 deg, ±60 deg and ±90 deg, and from the relationship between the angles of the line segments in FIG. 8 and the histogram groups in FIG. 9 it can be determined that, among the groups H1 to H6, line segments LB and LE correspond to group H1, line segments LF and LH to group H2, line segments LA and LD to group H3, and line segments LC, LG and LI to group H4.
[0030] Then, by averaging group H1, the slope θBE of line segments LB and LE is obtained. Next, to separate line segment LB from line segment LE, the coordinates belonging to LB and those belonging to LE are extracted from the angle data K[n] stored in memory, and the intercept MB of line segment LB and the intercept ME of line segment LE are obtained by the Hough transform.
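With the slope already fixed by the averaged histogram group, the Hough search for the intercepts reduces to a one-dimensional vote; the sketch below assumes the j = m·i + b parameterization (near-vertical segments would need the transposed form) and an arbitrary intercept range and bin count.

```python
import numpy as np

def intercept_votes(points, slope_deg, n_bins=240, intercept_range=(-240.0, 240.0)):
    """Each point belonging to the group votes for the intercept of a line of
    the known slope passing through it; well-separated peaks in the vote
    array distinguish parallel segments such as LB and LE."""
    m = np.tan(np.radians(slope_deg))
    lo, hi = intercept_range
    votes = np.zeros(n_bins, dtype=np.int32)
    for i, j in points:                  # image coordinates (i across, j down)
        b = j - m * i                    # intercept of j = m*i + b
        k = int((b - lo) / (hi - lo) * (n_bins - 1))
        if 0 <= k < n_bins:
            votes[k] += 1
    return votes
```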
[0031] By performing similar processing on the other groups, all the slopes and intercepts of the line segments LA to LI are obtained, after which the routine proceeds from step S106 to step S107, where the coordinates and angles of the intersections of the line segments are computed to recognize the two-dimensional position and azimuth of the mark. Further, areas are set from the slope and intercept of each line segment, and the segments are fixed as line segments in three-dimensional space using the extracted coordinates i[n], j[n] and the distance data d[n], whereby the absolute position and absolute azimuth in three-dimensional space are detected.
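Fixing a segment in three-dimensional space from the stored coordinates i[n], j[n] and distance data d[n] could proceed roughly as follows; the pinhole/stereo back-projection model, its parameters (focal length f, baseline, image center), and the least-squares line fit are assumptions, since the patent does not spell these formulas out.

```python
import numpy as np

def back_project(i, j, d, f, baseline, ci, cj):
    """Assumed stereo camera model (d = disparity, taken to be > 0):
    depth from disparity, then lateral coordinates from the pixel offset
    scaled by depth over focal length."""
    z = f * baseline / d
    return np.array([(i - ci) * z / f, (j - cj) * z / f, z])

def fit_segment_3d(points_3d):
    """Least-squares 3-D line: mean point plus principal direction (SVD)."""
    pts = np.asarray(points_3d, dtype=float)
    centroid = pts.mean(axis=0)
    _, _, vt = np.linalg.svd(pts - centroid)
    return centroid, vt[0]    # a point on the segment and its unit direction
```

Intersecting the fitted segments then locates the mark's reference points in space, and the segment directions give the orientation of the mark, and hence of the origin, relative to the camera.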
[0032] As a result, the self-position of the moving body in three-dimensional space can be recognized accurately as an absolute position and absolute azimuth, and this can be achieved with a small, inexpensive device that can be mounted on the moving body.
[0033] In the case of the patterns of FIGS. 5 and 6, the angle combination becomes 0 deg, ±45 deg and ±90 deg, and the three-dimensional position and azimuth can be recognized by the same processing.
[0034]
[Effects of the Invention] As described above, according to the present invention, the mark placed at the distance measurement origin is imaged by a stereo camera to generate a distance image in which depth information is expressed numerically, the captured mark image is differentiated in luminance to extract line-segment shapes, and the three-dimensional position and azimuth relative to the distance measurement origin are recognized from the line-segment shapes and the distance image. The invention therefore provides the excellent effect that, with a small, inexpensive device mountable on a moving body or the like, the absolute position and absolute azimuth of the distance measurement start point in three-dimensional space can be detected with high accuracy.
[Brief Description of the Drawings]
[FIG. 1] Basic configuration diagram of the distance measurement origin recognition device.
[FIG. 2] Flowchart of the position/azimuth detection routine.
[FIG. 3] Explanatory diagram showing a landmark with a simple grid pattern.
[FIG. 4] Explanatory diagram showing landmark pattern example 1, which does not cause mismatching.
[FIG. 5] Explanatory diagram showing landmark pattern example 2, which does not cause mismatching.
[FIG. 6] Explanatory diagram showing landmark pattern example 3, which does not cause mismatching.
[FIG. 7] Explanatory diagram showing the extraction of small regions for computing the luminance derivative.
[FIG. 8] Explanatory diagram showing an example of luminance differential vectors.
[FIG. 9] Explanatory diagram showing the angle histogram of the luminance differential vectors.
[Description of Reference Numerals] 1: distance measurement origin recognition device; 10: stereo camera; 20: stereo processing unit; 50: recognition processing unit; 60A, 60B, 60C: landmarks
Claims (5)
1. A distance measurement origin recognition device comprising: a pair of stereo cameras that image a mark placed at a distance measurement origin; a stereo processing unit that searches for corresponding positions in the pair of images captured by the stereo cameras, obtains the pixel displacement arising according to the distance to an object, and generates a distance image in which the depth information derived from the pixel displacement is expressed numerically; and a recognition processing unit that differentiates the luminance of the captured mark image to extract line-segment shapes and recognizes the three-dimensional position and azimuth relative to the distance measurement origin on the basis of the line-segment shapes and the distance image.
2. The distance measurement origin recognition device according to claim 1, wherein the line-segment shapes are extracted on the basis of the spacing and frequency of the peaks of a histogram taken over the angle of the luminance differential vector.
3. The distance measurement origin recognition device according to claim 1, wherein the mark has a pattern that contains no identical pixel pattern on the same scanning line of the captured image and in which multiple intersections of line segments can be detected in the luminance differential image.
4. The distance measurement origin recognition device according to claim 1, wherein the mark has a pattern in which regions of different luminance are formed by combining triangles.
5. The distance measurement origin recognition device according to claim 1, wherein the mark has a pattern in which a region whose luminance differs from the surroundings is formed in an X shape.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP24731997A JP4116116B2 (en) | 1997-09-11 | 1997-09-11 | Ranging origin recognition device for moving objects |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP24731997A JP4116116B2 (en) | 1997-09-11 | 1997-09-11 | Ranging origin recognition device for moving objects |
Publications (2)
Publication Number | Publication Date |
---|---|
JPH1185981A true JPH1185981A (en) | 1999-03-30 |
JP4116116B2 JP4116116B2 (en) | 2008-07-09 |
Family
ID=17161637
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
JP24731997A Expired - Lifetime JP4116116B2 (en) | 1997-09-11 | 1997-09-11 | Ranging origin recognition device for moving objects |
Country Status (1)
Country | Link |
---|---|
JP (1) | JP4116116B2 (en) |
Cited By (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2001351200A (en) * | 2000-06-09 | 2001-12-21 | Nissan Motor Co Ltd | Onboard object detecting device |
KR100394276B1 (en) * | 1999-07-12 | 2003-08-09 | 한국전자통신연구원 | Method and Embodiment of the Initial Hand-Region Detection Using Stereo Matching Technique For a Hand Gesture Recognition |
US7403669B2 (en) | 2003-12-01 | 2008-07-22 | Honda Motor Co., Ltd. | Land mark, land mark detecting apparatus, land mark detection method and computer program of the same |
JP2008305255A (en) * | 2007-06-08 | 2008-12-18 | Panasonic Electric Works Co Ltd | Map information generation unit, and autonomous moving unit having the same |
JP2011511736A (en) * | 2008-02-13 | 2011-04-14 | パルロ | A method of maneuvering a rotary wing drone with automatic hovering flight stabilization |
WO2015022681A1 (en) * | 2013-08-15 | 2015-02-19 | Rafael Advanced Defense Systems Ltd | Missile system with navigation capability based on image processing |
JP2017531259A (en) * | 2014-10-31 | 2017-10-19 | エスゼット ディージェイアイ テクノロジー カンパニー リミテッドSz Dji Technology Co.,Ltd | Position-based control method, apparatus, movable device, and robot |
JP2019023865A (en) * | 2018-07-12 | 2019-02-14 | エスゼット ディージェイアイ テクノロジー カンパニー リミテッドSz Dji Technology Co.,Ltd | Method, system, and program for performing error recovery |
JP2019095267A (en) * | 2017-11-21 | 2019-06-20 | 株式会社ダイヘン | Output device, position measurement system, program and method for measuring coordinate of fixed station position |
US10565732B2 (en) | 2015-05-23 | 2020-02-18 | SZ DJI Technology Co., Ltd. | Sensor fusion using inertial and image sensors |
WO2020230410A1 (en) * | 2019-05-16 | 2020-11-19 | 株式会社日立製作所 | Mobile object |
Cited By (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR100394276B1 (en) * | 1999-07-12 | 2003-08-09 | 한국전자통신연구원 | Method and Embodiment of the Initial Hand-Region Detection Using Stereo Matching Technique For a Hand Gesture Recognition |
JP2001351200A (en) * | 2000-06-09 | 2001-12-21 | Nissan Motor Co Ltd | Onboard object detecting device |
US7403669B2 (en) | 2003-12-01 | 2008-07-22 | Honda Motor Co., Ltd. | Land mark, land mark detecting apparatus, land mark detection method and computer program of the same |
JP2008305255A (en) * | 2007-06-08 | 2008-12-18 | Panasonic Electric Works Co Ltd | Map information generation unit, and autonomous moving unit having the same |
JP2011511736A (en) * | 2008-02-13 | 2011-04-14 | パルロ | A method of maneuvering a rotary wing drone with automatic hovering flight stabilization |
US10078339B2 (en) | 2013-08-15 | 2018-09-18 | Rafael Advanced Defense Systems Ltd | Missile system with navigation capability based on image processing |
WO2015022681A1 (en) * | 2013-08-15 | 2015-02-19 | Rafael Advanced Defense Systems Ltd | Missile system with navigation capability based on image processing |
JP2017531259A (en) * | 2014-10-31 | 2017-10-19 | エスゼット ディージェイアイ テクノロジー カンパニー リミテッドSz Dji Technology Co.,Ltd | Position-based control method, apparatus, movable device, and robot |
US10627829B2 (en) | 2014-10-31 | 2020-04-21 | SZ DJI Technology Co., Ltd. | Location-based control method and apparatus, movable machine and robot |
US10565732B2 (en) | 2015-05-23 | 2020-02-18 | SZ DJI Technology Co., Ltd. | Sensor fusion using inertial and image sensors |
JP2019095267A (en) * | 2017-11-21 | 2019-06-20 | 株式会社ダイヘン | Output device, position measurement system, program and method for measuring coordinate of fixed station position |
JP2019023865A (en) * | 2018-07-12 | 2019-02-14 | エスゼット ディージェイアイ テクノロジー カンパニー リミテッドSz Dji Technology Co.,Ltd | Method, system, and program for performing error recovery |
WO2020230410A1 (en) * | 2019-05-16 | 2020-11-19 | 株式会社日立製作所 | Mobile object |
JP2020187664A (en) * | 2019-05-16 | 2020-11-19 | 株式会社日立製作所 | Mobile |
Also Published As
Publication number | Publication date |
---|---|
JP4116116B2 (en) | 2008-07-09 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN109313031B (en) | Vehicle-mounted processing device | |
Rose et al. | An integrated vehicle navigation system utilizing lane-detection and lateral position estimation systems in difficult environments for GPS | |
JP7082545B2 (en) | Information processing methods, information processing equipment and programs | |
JP4600357B2 (en) | Positioning device | |
JP5157067B2 (en) | Automatic travel map creation device and automatic travel device. | |
JP3833786B2 (en) | 3D self-position recognition device for moving objects | |
CN105841687B (en) | indoor positioning method and system | |
CN107422730A (en) | The AGV transportation systems of view-based access control model guiding and its driving control method | |
KR101880185B1 (en) | Electronic apparatus for estimating pose of moving object and method thereof | |
US20120308114A1 (en) | Voting strategy for visual ego-motion from stereo | |
EP3842751B1 (en) | System and method of generating high-definition map based on camera | |
JP2001266160A (en) | Peripheral recognition method and peripheral recognition device | |
JP2018124787A (en) | Information processing device, data managing device, data managing system, method, and program | |
JP2001331787A (en) | Road shape estimation device | |
JP2002511614A (en) | Tracking and detection of object position | |
JP2012127896A (en) | Mobile object position measurement device | |
JP2006250917A (en) | High-precision cv arithmetic unit, and cv-system three-dimensional map forming device and cv-system navigation device provided with the high-precision cv arithmetic unit | |
Fiala et al. | Visual odometry using 3-dimensional video input | |
CN109997052B (en) | Method and system for generating environment model and positioning by using cross sensor feature point reference | |
CN208323361U (en) | A kind of positioning device and robot based on deep vision | |
KR101883188B1 (en) | Ship Positioning Method and System | |
JP4116116B2 (en) | Ranging origin recognition device for moving objects | |
JP6834401B2 (en) | Self-position estimation method and self-position estimation device | |
KR102174729B1 (en) | Method and system for recognizing lane using landmark | |
Bazin et al. | UAV attitude estimation by vanishing points in catadioptric images |
Legal Events
Code | Title | Description |
---|---|---|
A621 | Written request for application examination | Free format text: JAPANESE INTERMEDIATE CODE: A621 Effective date: 20040830 |
A977 | Report on retrieval | Free format text: JAPANESE INTERMEDIATE CODE: A971007 Effective date: 20070718 |
A131 | Notification of reasons for refusal | Free format text: JAPANESE INTERMEDIATE CODE: A131 Effective date: 20071002 |
A521 | Request for written amendment filed | Free format text: JAPANESE INTERMEDIATE CODE: A523 Effective date: 20071115 |
TRDD | Decision of grant or rejection written | |
A01 | Written decision to grant a patent or to grant a registration (utility model) | Free format text: JAPANESE INTERMEDIATE CODE: A01 Effective date: 20080415 |
A61 | First payment of annual fees (during grant procedure) | Free format text: JAPANESE INTERMEDIATE CODE: A61 Effective date: 20080417 |
FPAY | Renewal fee payment (event date is renewal date of database) | Free format text: PAYMENT UNTIL: 20110425 Year of fee payment: 3 |
R150 | Certificate of patent or registration of utility model | Free format text: JAPANESE INTERMEDIATE CODE: R150 |
FPAY | Renewal fee payment (event date is renewal date of database) | Free format text: PAYMENT UNTIL: 20110425 Year of fee payment: 3 |
FPAY | Renewal fee payment (event date is renewal date of database) | Free format text: PAYMENT UNTIL: 20120425 Year of fee payment: 4 |
FPAY | Renewal fee payment (event date is renewal date of database) | Free format text: PAYMENT UNTIL: 20130425 Year of fee payment: 5 |
FPAY | Renewal fee payment (event date is renewal date of database) | Free format text: PAYMENT UNTIL: 20130425 Year of fee payment: 5 |
FPAY | Renewal fee payment (event date is renewal date of database) | Free format text: PAYMENT UNTIL: 20140425 Year of fee payment: 6 |
R250 | Receipt of annual fees | Free format text: JAPANESE INTERMEDIATE CODE: R250 |
S531 | Written request for registration of change of domicile | Free format text: JAPANESE INTERMEDIATE CODE: R313531 |
R350 | Written notification of registration of transfer | Free format text: JAPANESE INTERMEDIATE CODE: R350 |
R250 | Receipt of annual fees | Free format text: JAPANESE INTERMEDIATE CODE: R250 |
R250 | Receipt of annual fees | Free format text: JAPANESE INTERMEDIATE CODE: R250 |
R250 | Receipt of annual fees | Free format text: JAPANESE INTERMEDIATE CODE: R250 |
EXPY | Cancellation because of completion of term | |