
JPS6081681A - Picture processor for assembling robot - Google Patents

Picture processor for assembling robot

Info

Publication number
JPS6081681A
JPS6081681A JP58189004A JP18900483A JPS6081681A JP S6081681 A JPS6081681 A JP S6081681A JP 58189004 A JP58189004 A JP 58189004A JP 18900483 A JP18900483 A JP 18900483A JP S6081681 A JPS6081681 A JP S6081681A
Authority
JP
Japan
Prior art keywords
robot
coordinates
sensor
point
area sensor
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
JP58189004A
Other languages
Japanese (ja)
Inventor
Yoshifumi Asano
浅野 好文
Tetsuya Kawamoto
哲也 川本
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Seiko Instruments Inc
Original Assignee
Seiko Instruments Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Seiko Instruments Inc filed Critical Seiko Instruments Inc
Priority to JP58189004A priority Critical patent/JPS6081681A/en
Publication of JPS6081681A publication Critical patent/JPS6081681A/en
Pending legal-status Critical Current

Links

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T1/00General purpose image data processing
    • G06T1/0014Image feed-back for automatic industrial control, e.g. robot with camera

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Image Processing (AREA)
  • Image Analysis (AREA)
  • Manipulator (AREA)

Abstract

PURPOSE: To match coordinates with high accuracy by reconciling the respectively different coordinates of a robot and an area sensor into the robot coordinates alone, on the basis of the robot's chucking-position X, Y coordinate data and the picture information data. CONSTITUTION: When a minute object is located at position H, its coordinates in the area sensor 3 are (x1, y1) and its robot coordinates are (XH, YH). In the figure, point F is the origin of the sensor 3 coordinates, point C is the current position of the robot chuck, XO, YO are the offsets between the sensor origin and the robot chuck, and XR, YR are the robot's displacement when picking up the minute object. In addition, theta is the rotation angle between the two coordinate systems. Formulas are formed from these relations. In these formulas, the values XR, YR, x, y are obtained from the robot and the sensor 3, and the five parameters XO, YO, SX, SY, theta are the unknown quantities (SX and SY are the scale ratios of the robot coordinates to the sensor 3 coordinates in the X and Y directions). If point H is moved to three arbitrary points, XR1-XR3 and YR1-YR3 are obtained, giving six equations from which the five unknowns can be determined.
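The abstract refers to "the formulas" without reproducing them. Assuming they are the relations given in the detailed description below, they presumably take the form

X_R = X_O + (x \cdot S_X)\cos\theta - (y \cdot S_Y)\sin\theta
Y_R = Y_O + (x \cdot S_X)\sin\theta + (y \cdot S_Y)\cos\theta

so that moving point H to three arbitrary positions yields six such equations for the five unknowns X_O, Y_O, S_X, S_Y and θ.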

Description

DETAILED DESCRIPTION OF THE INVENTION

The present invention relates to image processing for an assembly robot equipped with an area sensor. In particular, it concerns calculating the offset between the origin of the sensor coordinates and the origin of the robot coordinates, the scale ratios in the X and Y directions, and the rotation angle between the coordinates, bringing the two coordinate systems into agreement so that image processing can be carried out in the robot coordinate system alone. Its object is to give an assembly robot a visual function easily and to improve the robot's productivity and versatility.

Conventionally, aligning the coordinates of an area sensor with those of a robot required preparing a gauge and using methods such as cursor-line alignment, which consumed a great deal of time and labor. Such methods have the following drawbacks: (1) the operating rate of the robot decreases;

(2) the user cannot easily change over the setup;

(3) correction during operation is difficult.

These factors made for poor workability.

The present invention was made to eliminate the conventional drawbacks described above. Its object is to provide an image processing device that shortens processing time, enables highly accurate coordinate alignment, and allows area-sensor data to be processed as robot coordinate data.

The present invention is described in detail below with reference to the example shown in the drawings.

FIG. 1 is a block diagram of the image processing device. In the figure, 1 is a clock oscillation circuit; 2 is a drive-signal generation circuit which receives the clock signal and drives the area sensor; 3 is the area sensor, which takes in the image signal; 4 is a digital image memory which stores that information and is connected through a comparator 6; 5 is a memory control circuit which controls the digital image memory; and 7 is a CPU circuit which performs numerical analysis of the digital image memory and controls the memory control circuit.
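As a rough software sketch of this signal path (the patent describes dedicated hardware; the threshold value, array sizes and function names below are assumptions made for illustration), the comparator's role can be modelled as thresholding each sensor frame into a binary image before it is placed in the digital image memory:

import numpy as np

def binarize_frame(frame, threshold):
    """Model of the comparator 6: turn an analog area-sensor frame into a
    binary image (1 = object, 0 = background) for the digital image memory 4."""
    return (frame >= threshold).astype(np.uint8)

# Hypothetical 64x64 sensor frame containing one bright "minute object".
frame = np.zeros((64, 64))
frame[20:24, 30:34] = 1.0
image_memory = binarize_frame(frame, threshold=0.5)

# The CPU circuit 7 would then analyse the stored image, e.g. locate the
# object's centroid in sensor coordinates (x, y).
ys, xs = np.nonzero(image_memory)
x_obj, y_obj = xs.mean(), ys.mean()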

FIG. 2 shows the coordinate relationship in which the area-sensor coordinates xy lie within the robot coordinates XY. For example, when a minute object is at position H, its coordinates in the area sensor are (x_1, y_1) and its robot coordinates are (X_H, Y_H). Point F is the origin of the area-sensor coordinates, and point C indicates the current position of the robot chuck. X_O, Y_O are the offsets between the area-sensor origin and the robot chuck, and X_R, Y_R are the distances the robot must move to pick up the minute object at point H. θ is the rotation angle between the two coordinate systems.

Therefore, from these relationships the following two equations hold.

X_R = X_O + (x \cdot S_X)\cos\theta - (y \cdot S_Y)\sin\theta
Y_R = Y_O + (x \cdot S_X)\sin\theta + (y \cdot S_Y)\cos\theta
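As a worked numerical example with hypothetical values (not taken from the patent): if X_O = 100 mm, Y_O = 50 mm, S_X = S_Y = 0.2 mm per pixel, θ = 5°, and the sensor reports the object at (x, y) = (120, 80) pixels, then

X_R = 100 + (120 \cdot 0.2)\cos 5^\circ - (80 \cdot 0.2)\sin 5^\circ \approx 100 + 23.91 - 1.39 \approx 122.5
Y_R = 50 + (120 \cdot 0.2)\sin 5^\circ + (80 \cdot 0.2)\cos 5^\circ \approx 50 + 2.09 + 15.94 \approx 68.0

so the robot chuck would be commanded to move roughly 122.5 mm in X and 68.0 mm in Y.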

In the above equations, (X_R, Y_R) and (x, y) are values obtained from the robot and from the area sensor 3, respectively, and the unknowns are the five quantities X_O, Y_O, S_X, S_Y and θ (S_X and S_Y are the scale ratios of the robot coordinates to the area-sensor coordinates in the X and Y directions). Accordingly, if point H is moved to three arbitrary points, the following six equations hold:

X_{R1} = X_O + (x_1 S_X)\cos\theta - (y_1 S_Y)\sin\theta
X_{R2} = X_O + (x_2 S_X)\cos\theta - (y_2 S_Y)\sin\theta
X_{R3} = X_O + (x_3 S_X)\cos\theta - (y_3 S_Y)\sin\theta
Y_{R1} = Y_O + (x_1 S_X)\sin\theta + (y_1 S_Y)\cos\theta
Y_{R2} = Y_O + (x_2 S_X)\sin\theta + (y_2 S_Y)\cos\theta
Y_{R3} = Y_O + (x_3 S_X)\sin\theta + (y_3 S_Y)\cos\theta
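One convenient way to solve this system (a sketch of one possible procedure; the flowchart of FIG. 3 is not reproduced in the text, so the patent's exact method may differ) is to substitute a = S_X\cos\theta, b = S_Y\sin\theta, c = S_X\sin\theta, d = S_Y\cos\theta, which makes the six equations linear in the six quantities X_O, Y_O, a, b, c, d:

X_{Ri} = X_O + a x_i - b y_i, \qquad Y_{Ri} = Y_O + c x_i + d y_i \qquad (i = 1, 2, 3)

This linear system is solvable whenever the three sensor points are not collinear. The original five unknowns are then recovered as \theta = \operatorname{atan2}(c, a), S_X = \sqrt{a^2 + c^2}, S_Y = \sqrt{b^2 + d^2}, and the agreement between \operatorname{atan2}(c, a) and \operatorname{atan2}(b, d) can serve as a consistency check analogous to the verification step described next.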

FIG. 3 is a flowchart for obtaining the above equations from the relational expressions and calculating the five unknowns. Since the five unknowns are computed from six equations, one equation can be used as a check; if the check result is not good, the procedure is repeated from the data-input step. The same applies in unfavorable cases, for example when the three data points entered are too close together.
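A minimal numerical sketch of this calibration and verification procedure (an illustration under the linearization assumption above, not the patent's implementation; the function names and the check tolerance are invented):

import numpy as np

def calibrate(sensor_pts, robot_pts, tol=1e-3):
    """Solve for X0, Y0, SX, SY, theta from three point correspondences.
    sensor_pts: 3 area-sensor points (x, y); robot_pts: 3 robot points (XR, YR)."""
    sensor_pts = np.asarray(sensor_pts, dtype=float)
    robot_pts = np.asarray(robot_pts, dtype=float)
    (x1, y1), (x2, y2), (x3, y3) = sensor_pts
    # Linearize with a = SX*cos(t), b = SY*sin(t), c = SX*sin(t), d = SY*cos(t):
    #   XR = X0 + a*x - b*y,  YR = Y0 + c*x + d*y
    A = np.array([
        [1, 0, x1, -y1, 0, 0],
        [1, 0, x2, -y2, 0, 0],
        [1, 0, x3, -y3, 0, 0],
        [0, 1, 0, 0, x1, y1],
        [0, 1, 0, 0, x2, y2],
        [0, 1, 0, 0, x3, y3],
    ], dtype=float)
    rhs = np.concatenate([robot_pts[:, 0], robot_pts[:, 1]])
    X0, Y0, a, b, c, d = np.linalg.solve(A, rhs)  # singular if the 3 points are collinear
    theta = np.arctan2(c, a)
    SX, SY = np.hypot(a, c), np.hypot(b, d)
    # Verification: the two independent angle estimates must agree; otherwise
    # the three taught points should be re-entered (cf. the flowchart of FIG. 3).
    if abs(np.arctan2(b, d) - theta) > tol:
        raise ValueError("calibration check failed - re-enter the three points")
    return X0, Y0, SX, SY, theta

def sensor_to_robot(x, y, X0, Y0, SX, SY, theta):
    """Map an area-sensor point (x, y) into robot coordinates (XR, YR)."""
    XR = X0 + SX * x * np.cos(theta) - SY * y * np.sin(theta)
    YR = Y0 + SX * x * np.sin(theta) + SY * y * np.cos(theta)
    return XR, YR

Once the parameters have been determined, every object position detected by the area sensor can afterwards be converted directly into a robot move.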

Once the above processing has been carried out, the unknowns are determined, so the image information from the area sensor can be obtained in robot coordinates. The time required is greatly shortened and, moreover, highly accurate information is obtained, so various advantages are gained in terms of robot productivity and the reliability of its accuracy; the industrial value is therefore very large.

BRIEF DESCRIPTION OF THE DRAWINGS

FIG. 1 is a block diagram showing the configuration of the image processing device. FIG. 2 shows the relationship between the robot coordinates and the area-sensor coordinates, and FIG. 3 is a flowchart for solving the unknowns of the relational expressions obtained from FIG. 2. Applicant: Seiko Electronic Industries Co., Ltd.; Agent: Patent Attorney Mogami

Claims (1)

SCOPE OF CLAIMS

An image processing device for an assembly robot, the image processing device being composed of an area sensor, a drive-signal generation circuit, a clock oscillation circuit, a comparator, a memory control circuit, an image memory, and a CPU circuit, characterized in that, on the basis of the chuck-position XY coordinate data of an XY orthogonal-coordinate-system robot and the image information data obtained from the image processing device at the same time, the mutually different coordinates of the robot and of the area sensor are matched to the robot coordinates alone.
JP58189004A 1983-10-07 1983-10-07 Picture processor for assembling robot Pending JPS6081681A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP58189004A JPS6081681A (en) 1983-10-07 1983-10-07 Picture processor for assembling robot

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP58189004A JPS6081681A (en) 1983-10-07 1983-10-07 Picture processor for assembling robot

Publications (1)

Publication Number Publication Date
JPS6081681A true JPS6081681A (en) 1985-05-09

Family

ID=16233685

Family Applications (1)

Application Number Title Priority Date Filing Date
JP58189004A Pending JPS6081681A (en) 1983-10-07 1983-10-07 Picture processor for assembling robot

Country Status (1)

Country Link
JP (1) JPS6081681A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS62168287A (en) * 1986-01-21 1987-07-24 Nachi Fujikoshi Corp Calibrating method for vision sensor

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS6015780A (en) * 1983-07-08 1985-01-26 Hitachi Ltd robot control device

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS6015780A (en) * 1983-07-08 1985-01-26 Hitachi Ltd robot control device

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS62168287A (en) * 1986-01-21 1987-07-24 Nachi Fujikoshi Corp Calibrating method for vision sensor

Similar Documents

Publication Publication Date Title
JP3004279B2 (en) Image processing system for optical seam tracker
JPH02243284A (en) Automatic seam follow-up method and apparatus
JP2004351570A (en) Robot system
JP2007286976A (en) Robot simulation apparatus
JPS6081681A (en) Picture processor for assembling robot
JPH08257955A (en) Method for controlling manipulator in copying curved surface
CN207788040U (en) The device of welder is calibrated by zero compensation
US4402051A (en) Method for carrying out a cutting work in machining using numerical data
JPS6010309A (en) Method for interpolating path of robot hand
JPH03136125A (en) Graphic input device
JPS62152645A (en) Machine tool coordinate correction device
JPH05315389A (en) Method and device for input of bonding coordinate data of wire bonding device
JPH044406A (en) Automatic teaching system for robot
JPH04119405A (en) Position and rotation angle detection device, its indicator, and robot motion teaching device using the same
JPS61250705A (en) Coordinate transformation method for industrial robots
JPS6257004A (en) Visual sensor for robot
JPS6246206A (en) Area measuring device
JPS62287987A (en) Robot
JPH0244405A (en) Sensor robot
JPS60157609A (en) Teaching playback robot
JPH09204212A (en) Method for correcting machining data in NC robot device
JP2005250667A (en) Pointing device controller and computer system
JPS6329805A (en) Control device for posture of robot hand
JPS5884375A (en) Drawing setting method for digitizer analysis equipment
JPH04182721A (en) Three dimension coordinate position designating method