
TWI375136B - Google Patents

Info

Publication number
TWI375136B
TWI375136B (application TW94101592A)
Authority
TW
Taiwan
Prior art keywords
camera
image
fisheye
projection
eye camera
Prior art date
Application number
TW94101592A
Other languages
Chinese (zh)
Other versions
TW200528945A (en)
Inventor
Chuang Jan Chang
Priority date
Filing date
Publication date
Application filed
Priority to TW94101592A
Publication of TW200528945A
Application granted
Publication of TWI375136B

Landscapes

  • Image Processing (AREA)

Description

IX. Description of the Invention:

【Technical Field of the Invention】
The present invention relates to a three-dimensional (3D) visual metrology system that uses fisheye cameras as its vision sensors, and to a method of erecting such a system. In particular, it is able to deduce the internal and external optical parameters of a fisheye camera as the basis for constructing a computer vision model, so that the system keeps metrological accuracy at viewing angles under which conventional systems cannot work, while being composed of comparatively inexpensive components.

【Prior Art】
Building a 3D metrology system with a fisheye camera as the vision sensor involves two main topics: a computer vision model of the fisheye camera, and a technique for deciding the position and orientation of that model in physical space. The former lets the computer analyze the projection logic of fisheye images; the latter is required to construct the 3D metrology system. Both are non-contact measurement techniques. The metrology systems of the prior art cannot accept images that depart as strongly from pinhole projection as fisheye images do, so neither topic has been handled for fisheye images by conventional techniques. The current technical background, achievements and shortcomings of the two fields are introduced below.

〈Lenses mounted on conventional cameras〉
Pinhole imaging is an elementary optical law, and designing camera lenses by reference to this imaging mechanism is common in optical physics. The projection is called rectilinear (perspective) projection, herein "pinhole projection". Because its projective geometry has a very complete algebraic treatment, it is the most studied and most thoroughly understood camera model in computer vision. By contrast, lenses of nonlinear projection are rarely used in machine-vision metrology systems, and 3D visual metrology systems even less so; the fisheye lens is such a case.

〈Present situation of fisheye lenses and wide-angle machine vision〉
A camera fitted with a fisheye lens captures sharp images of essentially infinite depth of field, with a view angle that can even exceed π. However, the optical projection geometry of a fisheye lens differs greatly from the pinhole model, chiefly in the severe barrel distortion of the image, so that conventional rectilinear-perspective techniques offer almost no way to process fisheye camera images ("fisheye images" for short).

To break the limitation of the small working view angle of conventional machine vision, omnidirectional imaging (using a single lens plus digital processing to capture images over a solid view angle approaching, or exceeding, a hemisphere) has developed vigorously in recent years. Yet because no effective computer vision model has been available for fisheye lenses, digital image processing for them has hardly developed at all. In omnidirectional imaging, catadioptric composite sensors combine a lens with one or several external reflective geometric elements or compound prisms, able to collect rays over more than a hemispherical field and relay them into an ordinary camera; see, for example, Taiwan patent 378454 and US patent 6,118,474. Such devices are complex and expensive; the added reflective parts degrade the image, and placing the mirror in front of the lens causes unavoidable blind spots, which is inconvenient on many occasions.

Another approach rotates a camera to sample successive views and stitches them into a full-field image (e.g., US patent 6,256,058 B1). But a rotating camera cannot capture all directions at the same instant, which is problematic for moving targets; the weight of the camera body and rotation mechanism is a burden; and sampling and stitching raise many further difficulties.

〈Existing techniques for processing fisheye camera images〉
One widely adopted simplification, despite its shortcomings, makes fisheye images tractable in a digital computer. Its hypothesis is that, within a hemispheric (π) field of view, all the light paths from physical space obey one "dedicated projection function" when projected to form the fisheye image. The geometry of the planar image can then be inverted into space view-angle information as the basis of image processing or remapping. Referring to Figures 1A and 1B: Figure 1A illustrates the circular imaged area 1 that a computer system can frame, and Figure 1B illustrates the spatial light paths from which the feature points and lines projected into Figure 1A originate. Both figures mark the points with their corresponding physical-space zenithal (off-axis) angle α and azimuthal angle β; the two terms are explained next.

To present the projection geometry in which incident rays converge at the projection center, Figure 1B arranges a virtual small sphere to indicate the solid-angle position, at the sphere center (which is the projection center), of the incident light from each endpoint. The hemisphere can be imagined as the northern half of a globe: the azimuth β = 0 lies on the prime meridian 13', and the α = 0 reference is the polar light path C'. On this convention, the projection angles of light path A' are α = α0 and β = β0, written R[A'] = (α0, β0). Likewise the light paths in the figure can be defined as: R[B'] = (π/2, π/2); R[C'] = (0, b), b arbitrary; R[D'] = (π/8, 3π/2); R[E'] = (π/4, 3π/2); R[F'] = (3π/8, 3π/2); and R[H'] = (π/2, π).

Under the hypothesis that fisheye imaging follows one dedicated projection function, the mapping from view lines to the image plane has the following boundary conditions:
(1) The imaged area of the fisheye image is a resolvable circle or ellipse, so the intersection c of the major axis 11 and minor axis 12 (or of two diameters) in Figure 1A, written I[c], is the imaged position of R[C']. Further, R[C'] is the camera's optical axis, and I[c] is the distortion center of the image plane (why it is named the distortion center is discussed in detail in later paragraphs).
(2) The image border is mapped from the horizontal rays incident along the equator; hence the mappings of R[B'] to I[B] and of R[H'] to I[H].
(3) α is exactly linearly proportional to the principal distance ρ (hereinafter denoted ρ), where the image height ρ is defined as the distance between an image point and the center; whence the mappings of the solid light paths D', E' and F' onto the image points D, E and F.

This projection rule is perfect in that no auxiliary technique would be needed. USPTO patent 5,185,667 develops image-processing methods from it, applied in embodiments such as endoscopy, surveillance and remote control (US 5,313,306; 5,359,363; 5,384,588). It is worth noting that this series of patents never concretely demonstrates that a general fisheye lens satisfies the hypothesis, which has drawn challenges; in present practice, system manufacturers require a restricted lens to be combined with a specific camera body and supply dedicated firmware, so that the patented method (US 5,185,667) has practical value.

〈Merits and deficiencies of patent 5,185,667 for fisheye image processing〉
Deriving the solid angle of a light path directly from the image, as US 5,185,667 does, is questionable, because it violates the following basic optics.

First, the assumed linear α-ρ relation is what the optical field calls equidistant projection (EDP, ρ = f·α), and this projection logic is not the only fisheye style. Referring to Figure 2, which shows three typical fisheye-lens projection curves, the target projection specification of a lens may instead be stereographic projection (SGP, ρ = 2f·tan(α/2)) or orthographic projection (OGP, ρ = f·sin(α)).

Second, even when the target specification is EDP, the solid view angle of a mounted lens is not invariably π; it can be larger or smaller. Whether the view angle truly is π cannot be judged from the image alone, because whatever the lens's field of view, the imaged area 1 always presents a circular (or elliptical) shape. Figure 2 further shows that the differences among the three projection mechanisms grow markedly as the incident angle α increases, so assuming every fisheye lens has a π field of view when remapping images may introduce distortion.

Third, the border of the image circle cannot be grasped accurately. Even if the lens is designed with a view angle of π, the radiometric response representing image intensity attenuates radially; this is universal in ordinary lenses, worst in simple ones and in the larger-view-angle zone, so image intensity falls sharply near the border of imaged area 1 and the image boundary is very hard to extract precisely (note: consider also the diffraction of light at edges).

Fourth, a fisheye image may not present the (complete) image border at all. If the camera's sensing area does not cover the whole image circle, the border cannot appear, and a method premised on a resolvable circular imaged area has nothing to work with; the usable optical elements are thus tightly constrained, which limits practice in part.

Clearly, this prior technique never addresses the internal parameters of the imaging logic of a camera in practice, let alone carries the fisheye lens a step further into more precise application domains. The present invention, developed around these topics, does not depend on the stringent image-presentation requirements described above, and can precisely determine the optical parameters of the target fisheye camera; fisheye images can then be remapped according to the measured optical parameters into images possessing metrological precision, which further serve to develop 3D visual metrology. Erecting a 3D metrology system with so strongly nonlinear a device as the fisheye camera is where this invention contributes most.

〈Machine vision imaging technology today〉
Using cameras as the vision sensors of robotic devices, like human eyes, is machine vision, or visual servoing. The concept of a robot obtaining workpiece position information by visual servoing took shape decades ago and divides broadly into two classes: position-based visual servoing (PBVS) and image-based visual servoing (IBVS). PBVS needs the target's 3D coordinates and precise position; the computation is large and complex, and unsuited to real-time control in robot visual-servoing and scene-tracking systems. IBVS processes 2D feature images of the target to obtain target information as the feedback signal guiding robot motion or camera control, for purposes such as robot positioning or image tracking. IBVS is therefore easier to realize, but the camera model employed must be accurate and mature, and among existing camera models only the pinhole projection lens has mature applications.

Arranging two (or more) pinhole cameras at different places in physical space yields a 3D visual metrology system; this is a well-known technique. With two pinhole cameras, the difference in camera positions makes a workpiece point produce a disparity displacement between its two imaged positions (called parallax or optical flow). If the relative orientations of the two cameras on the robot device can be determined, trigonometric equations give the spatial position of the physical point relative to the robot: given the corresponding imaged positions of an object point in the two cameras, and referring to the two cameras' projection centers (view reference points), triangulation computes the 3D spatial position. Many computer vision textbooks discuss this.

〈Computer model of the pinhole camera〉
Rectilinear projection is the ideal model of machine vision. In practice, the closer the chosen lens approaches the pinhole mechanism, the more accurate the system can be; by the nature of optical components, only large, precisely made lenses can approach the ideal pinhole. High-precision, high-resolution systems therefore have limited working view angles, and large, expensive camera bodies are common. Machine vision nonetheless flourishes because it helps greatly in particular fields: with no electrical or mechanical signal connection between the measured object and the equipment, productivity and quality reliability improve markedly.

Solving the parameters of a camera model, used to rectify images, and determining the camera's orientation are two important themes of machine vision. Tsai's camera model is frequently cited in this regard: he describes a general camera with eleven parameters, of which the recoverable entries appear in the "First Table" below.
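The three closed-form fisheye projection functions named above, equidistant (EDP, ρ = f·α), stereographic (SGP, ρ = 2f·tan(α/2)) and orthographic (OGP, ρ = f·sin(α)), can be compared numerically. The following sketch (Python; not part of the patent, and the focal value is only the nominal figure quoted later in the text) shows that the three image heights nearly coincide at small off-axis angles and diverge strongly at large ones, which is why wide-angle samples discriminate the modes best.

```python
import math

def image_height(alpha, f, model):
    """Image height (principal distance) rho for off-axis angle alpha
    (radians) under the three closed-form fisheye projection modes."""
    if model == "EDP":    # equidistant projection
        return f * alpha
    if model == "SGP":    # stereographic projection
        return 2.0 * f * math.tan(alpha / 2.0)
    if model == "OGP":    # orthographic projection
        return f * math.sin(alpha)
    raise ValueError("unknown projection model: " + model)

f = 1.78  # mm, nominal focal constant of the lens used later in the text
for deg in (10, 45, 80):
    a = math.radians(deg)
    print(deg, [round(image_height(a, f, m), 3) for m in ("EDP", "SGP", "OGP")])
```

At 10 degrees the three values agree to within about one percent; at 80 degrees the SGP height exceeds the OGP height by well over half, so a calibration target should cover the outer field.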
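Tsai's camera model just introduced uses intrinsic parameters such as the distortion center (u0, v0) and a horizontal scale ratio Sx. A minimal sketch of their role follows (Python; the function name, pixel pitches and values are illustrative assumptions, not taken from Tsai's paper or the patent): it converts a pixel coordinate I(u, v) into image-plane lengths measured from the distortion center.

```python
def pixel_to_image(u, v, u0, v0, sx, dpx, dpy):
    """Map pixel coordinates I(u, v) to image-plane coordinates C(x, y)
    relative to the distortion center (u0, v0): dpx/dpy are the pixel
    pitches (length per pixel) and sx is the horizontal scale ratio
    that makes a square object display as a square again."""
    x = (u - u0) * dpx / sx
    y = (v - v0) * dpy
    return x, y

# a point 10 pixels right of the distortion center, 0.01 mm pixel pitch
print(pixel_to_image(110.0, 100.0, 100.0, 100.0, 1.0, 0.01, 0.01))  # (0.1, 0.0)
```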
First Table: parameters of Tsai's camera model (recoverable entries)
  Internal parameters:
    u0, v0: the coordinate position of the distortion center (in the digital coordinate system of the computer screen)
    Sx: scale ratio (the "ratio" by which a square physical object is displayed as a non-square image)
  External (extrinsic) parameters:
    the displacement (translation and rotation) relating the camera to the origin of space

The internal parameters describe the camera's projection geometry, while the external parameters relate the camera to the datum on which a machine-vision coordinate system is constructed; with these, images are converted into projection light paths in physical space. Tsai's methodology is quickly found on the web or in textbooks.

The parameters used by camera models are not fixed: some models are identical to Tsai's, and many references describe their imaging systems with more or fewer parameters. Going a step further, the parameters of Tsai's model cannot cope with fisheye images. The present invention deduces a methodology capable of describing the internal and external parameters of a fisheye camera, and verifies it with a working system.

【Summary of the Invention】
The object of the present invention is to provide, for cameras whose mechanism departs from rectilinear perspective projection, a method of resolving the optical parameters from fisheye-like images, together with a 3D visual metrology system, so that the working view angle of a 3D vision system can be enlarged without loss of system specification.

To state the object of the invention: a fisheye-camera 3D visual metrology system and its erection method are provided, whose technical content extends ROC patent applications 90123681, 91116790, 92109160 and 92109…, part of the content concerning the 3D visual metrology system having been disclosed in the doctoral dissertation of Mr. Chuang-Jan Chang at the Graduate Institute of Electrical Engineering, National Taiwan University.

The invention deduces the machine vision model of the fisheye camera from optical hypotheses and verifies it with a purpose-built measurement system. Relying on image symmetry, a centrally symmetric planar pattern target is used to emulate the "multi-collimator calibration method" (a precision mechanical combination of the Second World War era in which multiple sets of precision mechanisms establish determined solid light paths, used ever since to certify the projection geometry possessed by the convex lenses of very large aerial cameras), and the size correspondence between the pattern and the image it presents is used to deduce the fisheye camera's vision model. The derivation is outlined as follows:

1. Exploiting the property that the fisheye image attenuates monotonically in the radial direction yet is symmetric in gain about the distortion center, an adjustment stage with six pose variables (three translations and three rotations) actively controls the relative pose between the camera and the pattern target until the featured image points of the target image share a common center. The quantities so determined, the direction of the optical axis and the position of the distortion center, belong to the camera's external parameters.

2. For the focal constant and the projection center, two solution algorithms are proposed that test candidate positions along the optical axis: they locate the viewpoint position, identify the projection model, and obtain the focal constant. Even for a lens whose projection model is completely unfamiliar, the approach offers an advantageous solution.

3. The parameters so determined suffice to describe the fisheye camera's vision model, and the fisheye image can then be remapped into various image formats, for example remapped into a pinhole image.

The invention, beyond obtaining camera parameters, extends to erecting a 3D visual metrology system, mainly by determining the individual spatial pose of each fisheye camera and the relative pose between cameras. With the external parameters of the single-camera model known, two fisheye cameras are arranged as the "left eye" and "right eye" of the system: the pose of the first camera is obtained first; then, with the direction and depth components fixed on the six-pose adjustment stage, the second camera is aligned and a rectification arrangement is achieved. The system can thus work at very wide view angles, greatly enhancing the related applications.

To make the above and other objects, features and advantages of the invention more evident, a preferred embodiment is particularly described below in conjunction with the accompanying drawings.

【Embodiments】
The technique disclosed in the invention extends ROC patent applications 90123681, 91116790, 92109160 and 92109…, and part of the 3D visual metrology content was disclosed in the doctoral dissertation of Mr. Chuang-Jan Chang at the Graduate Institute of Electrical Engineering, National Taiwan University.

The detailed embodiments below cover two major parts. The first constructs the fisheye-camera computer vision model (the algorithms that obtain the camera's internal and external optical parameters) upon two projection hypotheses. Projection hypothesis 1: "within a partial view-angle range, the projection model of the fisheye camera conforms to a typical 'circular function'"; on it the invention proposes two algorithms (named the "ε-algorithm" and the "σ-algorithm") to deduce the camera's optical parameters. Projection hypothesis 2: each view line maps exclusively, one-to-one, onto its corresponding image point; on it the invention develops a more general algorithm applicable to projection models that are not a specific "circular function". The second part discloses the erection method and device elements of the 3D visual metrology system; that part presupposes the camera's optical parameters have been correctly obtained.

〈Vision model I: camera parameters referred to a specific projection mode〉
The fisheye lens is a nonlinear projection lens: after a sight-ray in space passes through such a lens, its projection behavior cannot be explained by the well-known linear perspective mechanism of the pinhole model. Compared with lenses designed to linear-projection references, the fisheye lens has the advantage of a much larger view angle, but the image suffers severe barrel deformation: referred to the pinhole model, the fisheye image is centrally symmetric about a point (called the distortion center, or principal point), and the deformation grows along the radial direction.

The camera's imaging projection mechanism may be described thus: incident rays originating from objects within the field of view (hereinafter FOV) converge logically (note: not physically, since an ordinary lens possesses no such physical projection center point) at a unique point in space, the projection center or viewpoint (VP), and are then (generally after multi-stage refraction) imaged on the image plane. Those skilled in the imaging art know the basic theory: appropriate analysis techniques exist for the projection geometry of various lenses, but camera models have been developed only on the pinhole projection model, and no analytic treatment has been available for radial (barrel) distortion.

Consider a geometric arrangement symmetric about the optical axis. Figure 3 shows a planar pattern target 22 bearing a centrally symmetric pattern 220 (physical central-symmetry pattern, hereinafter PCP). The relative pose between target 22 and the camera is adjusted until a centrally symmetric image 230 (imaged central-symmetry pattern, hereinafter ICP) is obtained on the image plane 23; Figure 4 shows the familiar fisheye projection model. When the symmetric image is obtained, the optical axis 21 passes orthogonally through the image center 235 and the pattern center 225, and the front cardinal point (FCP) 242 and back cardinal point (BCP) 243 likewise lie on optical axis 21. Since the pattern on target 22 can be deliberately placed at a known absolute pose, it serves as the reference coordinate position for deciding the pose of optical axis 21 in space. How to manipulate the pose of PCP 220 or camera 60 so as to obtain an ICP 230 is therefore a core procedure of the invention.

Referring to Figure 4, the FCP 242 on optical axis 21 is the reference point for constructing the light paths of the view space, and the BCP 243 is the datum point for constructing the light paths inside the camera. Because this is a logically equivalent diagram of the fisheye lens, the physical distance between the two points is unimportant. The PCP 220 of Figure 3 may be regarded as an optical arrangement emulating the arc arrangement of the conventional multicollimator. Multicollimator devices have long been used to calibrate the projection functions of the convex lens surfaces of large aerial cameras: many point sources, precisely arranged on an arc, produce beams collimating to a specific concentration point whose absolute spatial position is known when the calibration rig is erected. The pose of the test camera is adjusted until the sharpest imaging is reached, whereupon the test camera's projection center VP is taken to coincide with the pre-arranged beam concentration point. The multicollimator thus simulates, with point sources mounted at precise geometric positions, the light paths of incident light from infinity at known off-axis angles (α); and since the position of the image point mapped from each incident ray can be measured precisely, a lens's projection profile of off-axis angle against image height is obtained by direct measurement.

In operating principle, the multicollimator can measure any imaging element, module or system possessing a projection light path symmetric about an optical axis, and yields the device's projection mode; the projection modes it can certify are not limited to closed circular-function projections. (Supplementary note: the projection profiles presented in Figure 2 can all be described by closed circular functions.) It therefore also applies to certifying fisheye lenses. But the precise arc-shaped mechanics of a multicollimator are difficult to realize in an ordinary setting, whence the proposal to emulate the multi-collimation space indirectly with a planar pattern.

Figure 3 expresses one concrete embodiment of a PCP 220 designed on the foregoing basis: it contains a physical circle center and a plurality of concentric, symmetric geometric patterns (concentric circles in the drawing). The mechanism of multicollimator measurement is borrowed below to assist in describing the theoretical basis of the method. Referring again to Figure 4, which represents the planar pattern target in the three-dimensional space of the instrument system together with the projection light paths it generates in the fisheye FOV projection space: the fisheye lens 24 and image plane 23 equivalently represent the camera. If the projection behavior of the fisheye camera conforms completely to some known circular projection function (supplementary note: i.e., the camera's projection function is the product of that known circular function and the focal length), then the incident light from PCP 220 intrinsically achieves a collimating mechanism: all incident rays converge at the front cardinal point (FCP 242) before entering the camera's interior, then diverge from the back cardinal point (BCP 243) according to the projection function and are imaged on image plane 23. FCP 242 and BCP 243 are the two datum points describing the projection behavior of fisheye lens 24, delimiting the projection spaces inside and outside the fisheye camera (supplementary note: to the inventor's knowledge, no literature before this invention discusses how these two cardinal points are to be introduced in a computer imaging system). In analyzing the fisheye camera's projection mechanism, FCP 242 serves as reference for the view lines and BCP 243 for the image plane 23; the distance between the two nodes is not a camera parameter and may be set to any value, so the invention merges FCP 242 and BCP 243 into a single VP 241, as shown in Figure 5A, to unify the circular-function logic of imaging. Figure 5A may be viewed as the light-path diagram of the meridional plane (the plane containing optical axis 21) of the solid model of Figure 4. In the figure, α' is derived inversely from the image height ρ and the focal constant f; the logical relation between α and α' is decided by the native projection mode of the lens under test (supplementary note: this much resembles a refraction phenomenon).

〈Coordinate systems〉
To facilitate describing the implementation, the coordinate systems referred to are defined first:
(1) System W(X, Y, Z): defined by the three base axes of the control platform, X' base axis 51, Y' base axis 52 and Z' base axis 53, of the rigid test rig; it is the controllable and observable datum of the test system. The unit of the three components of W(X, Y, Z) is mm.
(2) Target coordinate system T(X, Y, Z): defined on the arrangement of target 22, with the pattern center as origin and the target normal taken as the Z-axis reference direction. Target 22 is displaceable; the unit of the three components of T(X, Y, Z) is mm.
(3) Camera projection-space coordinate system E(α, β, h): with the projection center as origin, where α, β and h define the three basic coordinates. This private system borrows the conventions of geodesy for marking landmarks: β plays the role of longitude counted from a prime meridian, α that of the angle off the polar axis, and h the signed distance off the equatorial plane, negative in the lower hemisphere; the projection thereby divides into two hemispheres (inside and outside). The units of α and β are angular; h is a length.
(4) View-line notation R(α, β): the half-lines with the projection center as endpoint, denoting incident view lines (the notation used for the light paths above).
(5) Image-plane coordinate system C(x, y) or P(ρ, β): with point 235 as origin, expressing image plane 23 in rectangular or polar coordinates. x, y and ρ are lengths and β is an angle; this system cannot be observed directly.
(6) Pixel coordinate system I(u, v): the system directly displayable on the computer display interface, in pixel units; the position where the distortion center 235 images on the display is defined as I(uc, vc). Basically, the camera's mapping onto the image plane is scaled into the I(u, v) system, and any image point may also be expressed, taking I(uc, vc) as origin, in rectangular coordinates C(u, v) or polar coordinates P(ρ, β).

After the calibration system is constructed, the two independent systems E(α, β, h) and T(X, Y, Z) are related in direction and position. The operational goal of the experimental system developed in the invention is that the Z axis of the coordinate system defined by target 22 coincide with the optical axis 21 of the camera-defined E(α, β, h).

〈Two-stage projection fisheye vision model〉
The inventor found that the mapping logic of map making (cartography) resembles the common fisheye imaging logic. The mature terminology of that discipline is borrowed below to assist in describing the image transformation of the fisheye vision model.

A cartographic mapping logic is embedded in Figure 5B. Since the actual size of the Earth is incommensurate with a drawing, the first step of map making is to project the surface, sighted at the Earth's center, onto a sphere of very small radius (the "small sphere" 30, a term of map making), so that the spherical surface of small sphere 30 can then be mapped onto a drawing of finite area. In Figure 5B, point 302 on the small sphere is a position mapped from the surface; when an equidistant-projection map is drawn, point 302 maps to point 231 of map 23. In this way the surface geographic information of the small sphere can be displayed within a planar circular area.

The fisheye imaging mechanism can then be interpreted as a two-stage mapping: a pinhole-like projection onto a spherical image surface, followed by a cartographic drawing of that sphere onto the plane. Taking an EDP fisheye lens as illustration: a view line (defined by segment 221-241), after passing the viewpoint VP 241, undergoes the nonlinear refraction (defined by segment 241-231) prescribed by the projection function; toward the small sphere, however, the ray projects in a straight line, as through a pinhole lens, to a point 301 on the sphere surface, and is then imaged at point 231 by the cartographic mapping logic.

This decomposition makes the fisheye imaging mechanism convenient for computer processing, and the computer analysis techniques of map making become a potential resource: hooking the two disciplines together benefits fisheye imagery. The deduction serves not only the equidistant-projection lens above; it can also be applied to quite unusual lenses, for example the stereographic and orthographic fisheye lenses additionally indicated in Figure 2.

〈Mechanism by which the target generates collimated incident light〉
Figure 5A describes a further step, emulating the collimator's incident light with target 22. It contains: (1) the small-sphere 30 portion of the fisheye-camera imaging scheme represented in Figure 5B, its radius further marked as the focal constant f; (2) the point-source structure of the suitably placed PCP 220 emulating the multicollimator; and (3) the big sphere defined by reference to small sphere 30 and a circular trace on PCP 220. Taking the outermost circular trace of PCP 220 as example: the radius of the big sphere defined by that circular trace is the line from feature point 221 to the VP, drawn as big sphere 40; big sphere 40 is cut by the outermost concentric circle of planar target 22, so that circular trace is a small circle on the surface of big sphere 40. Any circular trace on PCP 220 can be described separately by this same mechanism.

〈The target generates conic light paths〉
A sight ray projected from any object point 221 on PCP 220 inherently crosses the small-sphere surface orthogonally at the incidence point 301 and converges toward the sphere center (that is, VP 241); in this way, each concentric circle on PCP 220 defines a determinate bundle of incident rays.
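The two-stage interpretation above, a straight, pinhole-like projection onto the small sphere followed by a cartographic drawing onto the plane, can be sketched for the equidistant case as follows (Python; a minimal illustration of the decomposition under the stated assumptions, not the patent's implementation):

```python
import math

def edp_two_stage(alpha, beta, f):
    """Map a view line R(alpha, beta) to the image plane in two stages:
    (1) straight projection toward the sphere center (the VP) gives the
    incidence point on the small sphere of radius f;
    (2) the cartographic equidistant rule rho = f * alpha draws that
    sphere point onto the plane, preserving the azimuth beta."""
    sphere_pt = (f * math.sin(alpha) * math.cos(beta),
                 f * math.sin(alpha) * math.sin(beta),
                 f * math.cos(alpha))
    rho = f * alpha
    image_pt = (rho * math.cos(beta), rho * math.sin(beta))
    return sphere_pt, image_pt

# an equatorial ray (alpha = pi/2) lands at image height rho = f * pi/2
print(edp_two_stage(math.pi / 2, 0.0, 1.0)[1])
```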
The rays belonging to one concentric circle of PCP 220 together construct, outside the camera, exactly a cone of light, which is refracted to image plane 23 according to the projection function; as in Figure 4, a single light path can thus be described by its feature point 221, the VP, and its image point.

〈Method of obtaining the ICP〉
Only when optical axis 21 is orthogonal to the target and centered on it will the mapped corresponding image also present itself as concentric and symmetric: the ICP 230, whose geometric symmetry center is the distortion center. The relative pose between target 22 and the test camera is adjusted until symmetry reaches the set precision. The featured coordinate of the image point mapped from pattern center 225, the principal point, is then taken as the origin c(0, 0) of image plane 23, its pixel coordinate written I(uc, vc); and the space view line through the distortion center 235 and pattern center 225 passes perpendicularly through PCP 220, which yields the pose of optical axis 21. The above procedure accomplishes the determination of the pose of axis 21, part of the process of fixing the camera's external parameters.

The pattern of test pattern 220 is not limited to planar concentric circles: a PCP 220 composed of any number of concentric, symmetric circles, squares, triangles, or arbitrary concentric polygons is a practicable embodiment of pattern 220. Symmetric objects surrounding the optical axis are of the same essence, though they may not yield equally simple processing.

A concrete embodiment realizes, with PCP 220 as the reference position, the locating of the size and position of the small sphere 30 representing the camera, the direction of optical axis 21, and the distortion center 235. For the actual experiments of the invention, the PCP 220 shown in Figure 6 was designed and printed by laser printer on A3-size paper as one embodiment of target 22. Considering that the degree of fisheye distortion increases sharply, radially outward, the radial differences between the concentric circles of PCP 220 are designed to widen progressively from inside to outside, reflecting this optical behavior. To decide the radii, a target image such as Figure 3 is first used to obtain the object-image correspondence at a suitable measurement datum position, and the widths of the physical circular traces are adjusted so that the system can display clearly and simultaneously the images of the middle region and the border range. A PCP designed on this principle is shown in Figure 6. In addition, alternating black and white makes the edges of the concentric circles clearly distinguishable, to the benefit of subsequent image processing.

Referring to Figure 7, the finished target 22 is fixed on the adjustment platform, and target 22 and camera 60 are brought as close together as possible, so that PCP 220 spans the entire FOV of fisheye lens 24; the image so mapped then spans most of the displayable range. This arrangement samples image information at large view angles, where the imaging best distinguishes the specific projection that fisheye lens 24 follows: as Figure 2 shows, the larger the view angle, the more evident the differences among the modes.

〈Camera and lens〉
The test camera body 60 is a Mechademic CV-M50E monochrome CCD camera body from Japan, mounted with a Korean-made DW9813 fisheye lens. The principal specifications supplied by the respective manufacturers are a lens focal length of 1.78 mm and a diagonal view angle of 170 degrees, the camera body carrying a CCD device whose pixel dimensions follow the maker's data sheet.

〈Adjustment platform〉
To simplify description, the three base axes of the platform define the absolute coordinate system W(X, Y, Z), and the direction from the test camera toward target 22 is set as the positive Z direction of the absolute system. By operating the platform, the pose of the target coordinate system T(X, Y, Z) is controlled and the entire target can be moved.

〈Alignment procedure between the camera system and the target system〉
At the start of system assembly, the camera's optical axis must be aligned orthogonally; three translational and three rotational variables are involved. The direction of camera 60 is first aligned visually, as orthogonally as possible, toward the pattern center; then, consulting the displayed image and its symmetry indices on the computer screen, a computer program guides fine adjustment of the absolute position of target 22 along the X' base axis 51 and Y' base axis 52, while the direction of camera 60 is finely adjusted by the gimbal head 71 at the base of camera stand 70, iterating until symmetry is optimal. Under such a hardware setting, ideally the symmetric direction coincides with the Z' base axis 53; it would be more ideal still if gimbal head 71 could be extended to digital control by software.

〈Methods of judging image symmetry〉
To decide the distortion center 235, the invention proposes two methods of judging image symmetry in order to qualify E(α, β, h); other methods of judging symmetry should not thereby be excluded from the protection scope of the invention.

〈First symmetry index for qualifying the ICP〉
Referring to Figure 8A, with the presently estimated center of the image as origin, sampling lines are extracted radially outward in eight directions: east, west, south, north, northeast, southwest, northwest and southeast. Along each line the edge values (the distances between each detected edge and the circle center) are accumulated, giving eight "distance sums": SS, NN, EE, WW, NE, SW, NW and SE. If ICP 230 attains ideal symmetry, then, with the featured coordinate of the pattern center as origin, the differences of the distance sums of opposite radial directions should vanish; that is, the four differences diff_1 = NN - SS, diff_2 = EE - WW, diff_3 = NE - SW and diff_4 = NW - SE should each approach zero, while the sums of the distance sums of opposite directions, sum_1 through sum_4, should reach equality with one another. The four differences and the four sums are consulted together to judge whether the relative pose is appropriate, and target 22 and camera 60 are finely adjusted accordingly toward optimal symmetry. The above is the first symmetry index proposed by the invention.

〈Some processing techniques for fisheye images〉
The invention develops extraction techniques suited to the peculiarities of fisheye image signals. Edge extraction is an important topic of image processing, and the attainable spectra differ with the imaging technique; experimental analysis shows that fisheye image quality varies greatly across the frame. An algorithm recognizing image edges was therefore devised to extract the edges of the image traces.

Figure 8B shows, schematically, the view signals and processing results of two inclined sampling-line regions of Figure 8A. Four signal groups are displayed from top to bottom along the vertical axis, the horizontal axis extending radially from the center point toward the image border. In each group an irregular, progressively attenuating, nearly square-wave signal curve can be discerned. The solid portions are the raw radiometric response, which attenuates severely and progressively in the radial direction, making feature signal positions hard to recognize in the peripheral image region. The invention develops an unsharp-mask processing procedure: first a histogram equalizing process raises the signal strength of the peripheral image, giving the signals drawn dashed in the figure; the equalized profile curve is then passed through a non-causal low-pass filter to produce dynamic threshold values (the near-horizontal curves shown); the crossings between the profile curve and the curve composed of these dynamic thresholds are the feature positions of the trace edges, marked at the bottom of each signal group as square-toothed waveforms.

〈Second symmetry index for qualifying the ICP〉
The second symmetry index proposed by the invention judges the symmetry of the fisheye image with reference to the distortion center 235. The method re-plots the polar image P(ρ, β) on rectangular axes, β along one axis and ρ along the other, transforming the image of Figure 8A into that of Figure 9. The straightness of the stripes serves as the second symmetry evaluation index: upon any misalignment the stripes at once become curves of evident curvature, so the judgment is easy whether made directly by eye or by mathematical fitting on a computer. Both symmetry indices are well suited to judging the symmetry of ICP 230, and other circularly symmetric targets admit the same deduction.

〈The camera-posing problem〉
The orthogonal axis at pattern center 225 passes through the image distortion center 235 and is perpendicular to the target plane. Once the ICP is achieved, the optical axis coincides with this normal; and since the absolute placement of PCP 220 is known, the pose of the optical axis in space is determined absolutely (supplementary note: because the concentric-circle pattern fixes the normal direction).

〈Finding the optimal VP position and the focal constant under a specific projection-mode hypothesis〉
Using the radius lengths (ri, ρi) of the concentric circles on PCP 220 and ICP 230 as numerical constraints, the invention tests, by trial-and-error, hypothesized fixed points along optical axis 21 as the projection center, and infers the length of the focal constant f. Referring again to Figure 5A: once the optical axis 21 of fisheye lens 24 has been located, the VP 241 must lie at some point on optical axis 21, which greatly narrows the search range.

The detailed steps are as follows. After the coordinates of the distortion center on the image plane are obtained, the correspondence between the physical radii of the PCP 220 circles (pre-set values) and the corresponding image radii (measured values) is available as conjugate coordinate pairs (ri, ρi), ordered in i. Suppose the distance between a candidate VP 241 on optical axis 21 and the pattern center 225 of PCP 220 is D. The off-axis angle of each concentric circle of PCP 220 is then αi = tan⁻¹(ri/D), while from the image plane the image radius (principal distance) ρi is measured. If the EDP projection formula (ρ = f·α) is the test function, dividing each ρi by its αi should yield the constant f. If the test camera does not conform fully to the EDP mode, the stereographic (SGP, ρ = 2f·tan(α/2)) or orthographic (OGP, ρ = f·sin(α)) function is tried instead, until a satisfactory fit is reached. The EDP mode is taken as the example in what follows; the method can be implemented for the other projection modes.

Let the origin E(0, 0, 0) of the E(α, β, h) coordinate system lie on the Z axis T(0, 0, z) of the physical coordinates defined by target 22, z real. The (unknown) distance between VP 241 and PCP 220 is written D0. Each concentric circle on PCP 220 has radius ri with corresponding image height ρi; since both ρi and αi depend on the magnitude of D0, write ρi(D0) and αi(D0), where i = 1…N and N is the total number of traces of ICP 230. Linking the outermost circle with each of the others, manipulation gives:

ρi(D0)/ρN(D0) = αi(D0)/αN(D0) ------------------------------------(1)

In fact D0 is an unknown value, but the VP certainly falls on the optical axis; taking a dynamic point (0, 0, z) on the Z axis gives the error expression:

εi(z) = ρi(D0)/ρN(D0) - αi(z)/αN(z) ------------------------------(2)

where αi depends on the tested z, that is, αi(z) = tan⁻¹(ri/z), while the values ρi were fixed when the experiment was made (they depend invariably on D0 and do not vary with the hypothesized z). Hence, once at least two conjugate coordinate pairs (ri, ρi) are measured (information of mutually corresponding object points 221 and image points 231), the value of εi can be obtained. Every point on optical axis 21 is searched by trial and error; where the error reaches its minimum, the position of VP 241 is found.

Because the projection function of the measured camera is not clear, the judgment computed from the conjugate coordinate pairs must rest on multiple image traces spanning the angular range. To weigh fairly the contribution of each image trace to the judgment of the camera's projection mode, a weighting function is assigned to the annulus each circular trace occupies:

wi(D0) = (ρi(D0) - ρi-1(D0)) / ρN(D0) ----------------------------(3)

where ρ0 may be taken as the radius of the distortion center 235 (zero). Therefore, in the fitting process of searching optical axis 21 for the projection center 241, the error function used in the working example is:

ε(z) = Σi abs(εi(z)) × wi(D0) ------------------------------------(4)

The point that makes ε(z) minimal, or approach 0, is the VP 241 of the fisheye camera. The mathematical form of equation (4) is built on the EDP hypothesis; if the hypothesized projection mode is instead SGP (ρ = 2f·tan(α/2)) or OGP (ρ = f·sin(α)), equations (1) to (4) must be re-derived from the SGP or OGP projection function. In any case, inference on the above concept is called the "ε-algorithm".

As for the focal constant f, on the basis of the measured ρi(D) and their corresponding αi(D), it is computed by:

f(D0) = Σi fi(D0) × wi(D0) ---------------------------------------(5)

where fi(D0) = ρi(D0)/αi(D0). Likewise, if the hypothesis is changed to SGP, then fi(D0) equals (1/2)·ρi(D0)/tan(αi(D0)/2); or, with the projection function set to OGP, fi(D0) equals ρi(D0)/sin(αi(D0)). If the lens conforms fully to the assumed projection mode and the measurement is free of error, the values will be very accurate, and f(D0) should equal every single fi(D0), which is then the lens's focal constant f.

In fact, the statistical standard deviation computed from the obtained fi(D) data can further be exploited to estimate the accuracy of the hypothesized projection mode; that is, the following expression serves as the index of the degree of fit with the set projection mode, and is called the "σ-algorithm":

σ(D0)² = Σi (fi(D0) - f(D0))² / (N - 1) --------------------------(6)

To further verify the reliability of the experimental results (including the pose of optical axis 21 and the fitted camera projection mode), referring again to Figure 7, the invention takes the absolute coordinate position of target 22 after the initial collimation of optical axis 21 as datum and moves target 22 twice along the positive Z direction, each time adding a 5 mm displacement; throughout the two displacements, the pose of camera 60 and the X'-axis-51 and Y'-axis-52 coordinates of target 22 remain unchanged. Including the first experiment, the three experiments are denoted Test1, Test2 and Test3.
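The ε-algorithm search of equations (1)-(4) and the σ-algorithm of equations (5)-(6) can be sketched as follows; a minimal Python illustration under the EDP hypothesis (αi(z) = tan⁻¹(ri/z), the ρi fixed by measurement), run here on synthetic data rather than the patent's measurements.

```python
import math

def alphas(radii, z):
    """Off-axis angles of the target circles for a candidate VP distance z."""
    return [math.atan2(r, z) for r in radii]

def weights(rhos):
    """Eq. (3): wi = (rho_i - rho_{i-1}) / rho_N, with rho_0 = 0."""
    prev, w = 0.0, []
    for rho in rhos:
        w.append((rho - prev) / rhos[-1])
        prev = rho
    return w

def epsilon(radii, rhos, z):
    """Eq. (4): weighted EDP ratio error sum_i |rho_i/rho_N - a_i/a_N| * w_i."""
    a, w = alphas(radii, z), weights(rhos)
    return sum(abs(rhos[i] / rhos[-1] - a[i] / a[-1]) * w[i]
               for i in range(len(radii)))

def sigma_and_f(radii, rhos, z):
    """Eqs. (5)-(6): weighted focal constant and the spread of f_i = rho_i/a_i."""
    a, w = alphas(radii, z), weights(rhos)
    fi = [rhos[i] / a[i] for i in range(len(radii))]
    f = sum(fi[i] * w[i] for i in range(len(fi)))
    var = sum((x - f) ** 2 for x in fi) / (len(fi) - 1)
    return f, math.sqrt(var)

# synthetic EDP camera: f = 1.78 mm, true VP distance D0 = 30 mm
f_true, d0 = 1.78, 30.0
radii = [10.0, 25.0, 45.0, 80.0]                      # target circle radii (mm)
rhos = [f_true * math.atan2(r, d0) for r in radii]    # "measured" image heights
best = min(range(10, 60), key=lambda z: epsilon(radii, rhos, float(z)))
print(best)  # the trial-and-error minimum falls at the true distance, 30
```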
24.9 3.10 --—-二 |υ· υυ兮 U. U! 矣有,位外’其他數值的單位皆是· ^ 上,並分別利用ε-演算法^演算法推導得到的m i = 與ε值/σ值。對照表1最左端的參考Testl位移量,實驗的 結果以Di K現神確度為指標可⑽論 EDPJI員型’因為無論,演算法或σ_演算法,推|得^^ 值文化’ ^可非常忠貫地反映各次實驗遞增的5麵位移量;但 是同-實驗’兩種演算法推算得_ D值 EDP所推算得到的隹距當赵Μ δ9 71 oc 、 堂7« W 麵/丨·85nffll)也較接近規格 f中祕的L 78m,其中的差距可能是因為手動組裝鏡頭的誤 差^目反地’ 0GP與SGP的實驗結果與已知的絕對位移量盘隹 相差甚多。最後—列相當小的ε值及。值,顯示ϋ 明揭=的此,_算法具料目#的财纽可紐。 二參照第1 〇圖」’以Testl為例,橫轴繪示以假 I ε σ ° 測f條件下(二種投射核式仁種演算法),無論ε 很:顯!最小值,該單一最小值的存在代表心 =所,位置,如此亦酬了本發财法的可雜。然而,同一 鏡頭在不同的參考投射函數可以得到不同νρ 241位置 的焦距,這表示不能細單—實驗隸斷 _ 生投射函數。另外一提、實務上亦難以 == 數完全描述-_的投贿為。 ^ 以上所揭露實施例之「ε_演算法」與「σ_演算 已知的技射函數為目標鏡頭❺成像邏輯,來演繹所對應的相機 29 1375136 101年3月28日替換頁 麥數。而由得到的結杲驗證所提的方法學^可行的。 =提出之第-種主題校正相機内外部光學參數#成1 電腦視覺模型的演算法。 〈視覺模型二、一種通用的相機參數演繹法〉 本發明更進一步發展一通用的相機參數演繹方法學,不須 ς酌任何既有賴閉形式投射函數的假說,直接由校正點的絕 if ί和其顯現的影像位置之映射關健繹相機的光學投射 機制並it相機參數,是為一種更通用的相機參數演算法。 ,、此演算法係根據習知的鏡頭投射現象所發展出來,設定的 ,說為··(丨)、若且為若在相機視野空間中同一特定視野線上 的所存在的物體點,恆映射在影像平面上唯一的同一特定影像 ^二(2)、所有視野線將匯聚於空間中一唯一的投影中心或 Ϊίΐί中心,以卯此,簡稱為VP),再根據投射函數來 =射成像於影像平面上。此投射機制等效示意在「第丄i圖」。 在同一視野線上的特性點W333[r] 333、W323[q] 像平面 ^PL 313/、WC 242、而 1313…333[P,Q,r] 9卜影 濟办門j上單一影像訊息(如圖中之影像點91)無法分辨實 靶二二中同一視野線8〇執跡上的相異物體點(如圖中所示圖 嗜動於P、q、Γ三位置時,三校正點313、323、333的 是在鉉 iLW313[p]、W323[q]、W333W);反過來說,若 嗜體二間中至少二相異物體點映射至同一影像位置,則由 目異物11關實體㈣絕對座標可以較其投射之 U 80,該視野線8〇與光學軸21的交點即為Fcp M2,或 ^复ίί!中心(VP)。:因此可以利用找出實體空間所有視野線 形^ * t點的關係來得到相_投射成影機制。而如何以此圖 二福命二有可以代表魚眼相機投影邏輯的方法的實施内容已 ΐιΐ中華民國第__號專利申請案中,以下便不再 ^ 方法學可以用來處理任何一種魚眼影像的投射函數。 …、眼視影立體度量系統〉 30 !375136 .,相 . 
(remapping)為針孔影像丄習知“:二^術將影像重摺 式。另-影響系統品y_ _想影像格 到魚眼相機的安置方貝向與===題所以得 〈魚眼影像轉換到針孔影像的技術〉 為其=射已轉換 針孔影像是該專利技術的實施 '直 。,〜像轉換成 〈決定魚眼相機方位的技術&gt; 巧明在前部分公開相機參數求法時,財考p 方位來表4似贼巾峨實齡 以,,3 的方位為基準,延較位相機的方位== 差,此為二維度罝的重要參考基準。 .精太確^到pcp是絕對方向及位置必須有精密的六 貫例子是為印證發明方法學的可行性故以較 間早的方式來貫施’但可達到的相機布置並不限制於此。 〈具有相機整流布置的三維電腦視影度量系統〉 兩相機架設成為影像平面在為同一平 =_(如斜轴線)落於同一線上的程== 而);是三維視覺度量佈置的一特例。經此調整 :3位的代數運算;缺點是於一般針孔相機(由於 視野角度較小)將降低能夠操作的視野角产。「第1 示經過整流後的針&amp;維視覺度量系統;物件丽落在^的 1375136 1 101年3月28日替換頁 視野内可以順利執行三維視影度量;另一物件M,N,因為不在針 孔視野因此無法被度量。本發明針對魚眼影像的電腦視覺模 型]單獨相機地的視角接近180。·,組成系統可達到的工作角 可以達到很大的操魏野肖度,非針孔相機可以相的。用來 印證魚眼視影三維度量裝置的實施例示意圖請參考「第丄3 圖」所顯示。 ^正圖乾延伸實體線段整流像素軸線〉 整流兩部魚眼相機的水平軸線可以經由在PCP 220延伸一 f ^基^線來達成。請參照「第1 4圖」此陳又名深度圖乾, 何尺寸與「第6圖」相似。調整相機軸線的方向 马傻吻影像通過失真中心後,以旋轉相機,使直線 〜像勿S在在像素平面的垂直軸線上。 ^個步驟可以合併在執行相機參數測量階段一齊實施。缺 任何鏡頭的影像恆有__的性質,故此程序只有在 這種布置例並未限制其他布置4:而私有此需要。 〈架設魚眼相機三維視影度量系統〉 建構完成之三維視影度量系統實施例,包含 33與「右眼」城,其顧提供的規格為本發=(一部 二2相機參數階段所使用_機)以及—相機架 β =的該纽減龍纽域gj ^ ^ 必確定的距離位移,且可以自由調整方向。先 目視方式讓其光學轴2!跟「第;l 4圖」圖= 作時,相機和深度圖起22可以依據需要調整。,,,持父,操 請參照「第1 5圖」,本發明提出之牟机 影度量系統的步騾為: m之木叹魚眼相機三維視 c 右眼相機之電腦視覺模型(步驟932);旋轉“深度g 32 101年3月28日替換頁 I以丄丁 ^ 乙〇 a督換貝 22使#助水平及垂直線段」的影像剛好吻合⑽的 伸及垂直延伸線(步雜933);依據陳22之設計移動圖乾中 心玉ίί學軸(步驟934);紀錄共輕實體與影像資料,演繹 電腦視覺模型’並以此右眼相機之平台座標為系統 i考缠1驟935),保持深度酿22的方向及平台z軸的位 置,以讓其平台座標上運動保持影像平面和深度聽平 考面(步驟936);在適當位置綠左眼相機的 大概位置(步驟937);水平及垂直移動深度㈣22,得到盆 圓形對稱影像,紀駭_實體與影料料,^ ;調整相機的方向使深度圖』= 輔助水+及垂直線段的影像剛好吻合ICL的水平延 延伸線(步驟939);再次紀錄共輛實體與影像資料,左 的電腦視覺模型(步驟以紀__的在Ϊ 像ί料的平台座標位置來演算相機於圖乾平面^ 移,及凟繹得到的值分別定義視參點距離(步驟% 〈魚眼視影度量學的三維演算法&gt; 系統架設完成的左右眼相機的實測結果請參 ^「第三表」。表列參考等距離投射、立翻投射、及 =為根據的模型;來執行Sigma及Epsil〇n測試的沾果&quot;。 „明各欄位的意義。第三到四列表示以不同的三種投射握 =測試相機的參數。z(印)表示以Epsil〇n測試演算出來的 =座距離,單位是咖,㈣印)為最佳位2 ::此值:又有早位’❿FL(ep)則為演算得到的焦距常數,’ 咖。,二_是S聊測試的結果,其中Err( 早位疋mm,其餘兩個跟Epsilon測試的單位相同。佥二的 像在電腦營幕顯現的是一個長短轴相差擴圓二 :失真中心的單位是像素,而相機座標原 機時的圖靶座標原點為基準。 仅止右眼相 根據相機製造商提供的規格鏡頭的焦距是i為。參考測 33 1375136 101年3月28日替換頁 試結果,相機明顯的是比較像是屬於等距離投射。故以此組參 數為s十鼻二維視影的基準。對於立體圖投射、正交圖投射的演 算結果就不採用了。 經由這些已經得到的左右眼的「魚眼影像電腦視覺模 型」、相機於外界視覺參考點(投影中心)的位置與相機的方 向’及能夠將影像轉換成邏輯上符合於針孔成像的線框性透視 景夕像。因而既有的電腦視覺技術可以直接引用這些經轉換成的 
rectilinear perspective (a.k.a. pinhole) images to perform three-dimensional vision metrology.

Table 3. "Computer vision model" parameters of the metrology system (fragmentary; legible entries include: equidistant projection, stereographic projection, 31.4, 0.005, 0.008, 1/784, 2.577, FL(sig) 0.008, 0.13, 1.464, 89.06, 219.50, TR(0, 0, 40), Z(ep) 35.4, 54.3, 27.7, 0.92, TR(-266, -94, 35))

<Physical measurements>
Using the first (equidistant) parameter set as the pinhole-equivalent basis, three-dimensional measurements were performed on objects in the laboratory where the vision metrology system is located; the comparison of the measured dimensions is shown in Table 4.

Table 4. Results of physical measurements (unit: mm)
Item / actual size / measured size / error
Glass door height: 2730 / 2851.09 / -4.44%
Glass window width (upper left): 1000 / 997.74 / 0.23%
Glass window width (lower right): 1000 / 989.59 / 1.04%
Glass window height: 530 / 527.23 / 0.52%
Bookcase face: 1200 / 1162.02 / 3.17%
Bookcase length: 980 / 1011.98 / -3.26%
Desk length: 600 / 595.23 / 0.79%

The results show that for the portions where a clear image can be obtained the measurement error can be within about 1%, and for the remaining portions within 5%. Factors influencing the accuracy include the chosen viewpoints, whether the axes are exactly parallel, the position of the optical center, and so on; most importantly, the working view angle can exceed 100 degrees, which is why the fisheye camera is used as the basis for architecting three-dimensional vision metrology. With each camera's parameters determined by the disclosed methodology, a stereo vision-metrology system can be built; more than two perceivers can also be used to architect the system and produce multiple redundant solutions for better reliability. All such combinations are possible without departing from the spirit of this invention.

In addition, the disclosed method has the following advantages: (1) the optical parameters needed to convert fisheye images, such as those of the EDP model, are obtained, so that the computation logic for converting fisheye images becomes very simple; (2) the converted images retain good fidelity; (3) the invention's method of parameterizing the camera is not restricted to the specific EDP mode and applies to various projection mechanisms; (4) the model's unique projection center (VP) is found to serve as the image-space reference. The accuracy and working angle of existing vision metrology will be extended by this invention. The embodiment disclosed above is not intended to limit the invention; the scope of protection shall be determined by the appended claims and their equivalents.

[Brief description of the drawings]
Figure 1A: schematic of space projection based on a planar image according to ideal projection models, with the image analysis of the imaging method and its corresponding spatial projection;
Figure 2: projection-function curves of typical fisheye lenses;
Figure 3: an embodiment of a concentric-circle target designed according to the spirit of the invention;
Figure 4: schematic of the stereo projection light path formed by the fisheye image;
Figure 5A: schematic of using the centrally symmetric pattern (PCP) to emulate multi-collimated incident light paths and of the projection behavior of the fisheye lens (equidistant projection taken as the example);
Figure 5B: the stereo light path of the small-sphere and image-plane portion of Figure 5A;
Figure 6: the centrally symmetric pattern (PCP) applied in the actual experiments of the invention;
Figure 7: realizing the relative orientation between the target and the camera;
Figure 8A: schematic of the image of the target mapped onto the image plane;
Figure 8B: signal-intensity curves of the Figure 8A image in the four directions north-south, east-west, northwest-southeast and northeast-southwest;
Figure 9: schematic of the image after transforming Figure 8A to polar coordinates with the distortion center as the origin;
Figure 10: convergence curves for locating the projection center under different projection functions in the invention's actual tests;
Figure 11: theoretical schematic of an embodiment of the general camera-parameter algorithm, showing the light paths by which distinct calibration points image onto the same image point when the target moves to different absolute positions;
Figure 12: ideal light-path schematic of the camera-rectified pinhole-image three-dimensional vision metrology system;
Figure 13: schematic of an embodiment of the fisheye three-dimensional vision metrology system;
Figure 14: schematic of the centrally symmetric pattern (PCP) of Figure 6 extended with the camera-direction-adjustment function; and
Figure 15: flow chart of the steps for erecting the fisheye vision metrology system.

[Description of the main element symbols]
11 imaging region; 12 long axis; 13 short axis; 13', 13'' prime meridian and its mapping; 21 optical axis; 22 target; 220 centrally symmetric pattern (PCP); 221 object point; 225 pattern center; 23 image plane; 230 centrally symmetric image (ICP); 231 image point; 235 distortion center; 24 lens; 241 projection center (VP); 242 front cardinal point (FCP); 243 back cardinal point (BCP); 30 small sphere; 301 incident point; 302 normalized image point; 31 equatorial plane; 38 center calibration point; 313, 323, 333 calibration points; 40 large sphere; 50 adjustment platform; 51 X' base axis; 52 Y' base axis; 53 Z' base axis; 60 camera; 70 camera mount; 71 universal pan-tilt head; 80 sight line
91 image point

Distortion center: its coordinate position (in the digital system on the computer screen). Aspect ratio: the ratio that makes a "square entity" display as a non-square image. Extrinsic parameters: the displacement relationship to the space origin. The internal parameters describe the projection geometry of the camera, while the external parameters establish the camera's pose with respect to the reference of the visual coordinate system; from these the projection light path from an image into physical space is constructed. Tsai's methodology can be found on the net or in textbooks. The parameters used in camera models are not fixed: not all models are the same as Tsai's, and the various documents use more or fewer parameters to describe their respective imaging systems. The parameters of the Tsai model cannot describe fisheye images. The present invention will demonstrate a methodology that can describe the fisheye camera's internal and external parameters, and verify it with a real system.

[Summary of the Invention]
This invention targets cameras whose projection mechanism is not the rectilinear perspective projection, providing a camera-parameter analysis method for fisheye-like imaging, so that the working view angle of a three-dimensional vision system can be expanded without losing system accuracy. The purpose of the present invention is to provide a fisheye-camera three-dimensional vision metrology system and its erection method. The technical content is part of ROC patent applications 90123681, 91116790, 92109160 and 92109161, and of the three-dimensional vision metrology system disclosed by Mr. Chuang-Jan Chang in his doctoral dissertation at the Institute of Electrical Engineering, National Taiwan University. The invention adopts optical hypotheses to derive the fisheye camera's parameters and erects a metrology system to verify them.
The system uses the image symmetry of a centrally symmetric planar target to emulate the multi-collimator calibration of the World War era: precision machinery that used multiple precisely configured sets to fix the stereoscopic optical paths, and thereby identified the projection geometry mechanism of very large airborne convex lenses. The combination of the target and its presented image, across the two model scales, plays out the fisheye visual model. The interpretation process is summarized as follows. First, based on the fisheye image's characteristics of radial attenuation and of symmetry gained in a single direction, the six-degree azimuth variables (that is, the relative direction and position between the camera and the target, actively controlled by the adjustment stage) are determined from the common image of the image points. Next, the focal-length constant and the projection center are obtained for the camera, solving the fisheye parameters: position, projection model and focal-length constant, by which the camera's image formation is described; for a lens whose projection model is completely unknown, the model is solved in the same spirit. These derived parameters suffice to describe the fisheye camera's visual imaging. The invention then uses the obtained camera parameters in a vision metrology system, mainly to determine the orientations of the installation: the three spatial orientations and the relative orientation between the cameras.
Two fisheye cameras are erected in this arrangement as the "left eye" and "right eye" of the system. First the computer vision models of the two cameras are obtained; then, with the six-degree azimuth adjustment of the platform, the cameras' directions are rectified, by which the related applications of stereo vision are greatly improved and a wide-perspective working range is achieved.

For the above and other purposes of the present invention, a preferred embodiment is described below in conjunction with the drawings.

DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENT
The techniques disclosed in the present invention derive from ROC patent applications 90123681, 91116790, 92109160 and 92109161, and from the doctoral dissertation of Chuang-Jan Chang at the Institute of Electrical Engineering, National Taiwan University. The detailed embodiment covers two parts. The first part constructs the fisheye-camera computer vision model (that is, the algorithm for obtaining the camera's internal and external optical parameters) according to the two projection hypotheses mentioned above. Projection hypothesis one: within a partial viewing-angle range, the projection model of the fisheye camera conforms to a typical "circular function"; on this basis the invention proposes two algorithms (named the "ε-algorithm" and the "σ-algorithm") to interpret the optical parameters of the camera. Projection hypothesis two: sight lines correspond exclusively, one to one, with their "projection points"; on this basis the invention develops a more versatile algorithm that can be applied to projection models that are not specific "circular functions". The second part discloses the method of erecting the three-dimensional vision metrology system and the
The device component must be correctly mentioned in the camera optical parameter. &quot; Vision model - camera parameters referenced to a specific projection mode > The fisheye lens is a non-linear projection lens, meaning that the sight-ray in space cannot pass the well-known pinhole after passing through such a lens. The linear perspective projection mechanism of the pinhole model is used to explain the projection behavior of the lens compared to other reference linear projection designs. The fisheye lens has a large visual advantage, but the image has a serious barrel-shaped straightforward deformation. The fisheye image reference pinhole mode of the face is centrally symmetrical (this center point is called the distortion center / ρΓιης ^ ^ thousand and the amount of text increases along the radial direction.) The imaging projection mechanism of the camera can be described as: (field〇fy hereinafter referred to as F0V) The incident light (and the illuminating light) from the object in the field of view will be logical (note: not practical; because the lens does not have this central projection center point) - only - f or called projection center, or viewp〇int, abbreviated as νρ), followed by a number of = ^ through a multi-stage refraction phenomenon) imaging art in the field of art know-how Theoretical basis ίίΐ; Ϊί ίϊίΐ with various types of projection lens geometry appropriate analytical techniques, and only stop, (i.e., a pinhole projection model) with respect to the development of the foundation yarn distortion (i.e., barrel distortion) analytically yet available. 
Negative 仏 arrangement ====/ test geometrical symmetry of the optical axis - "3rd figure; dry in the center of the symmetry of the heart +1" does not have a central symmetry pattern 22G (physical below The abbreviated PCP) plane &amp; image plane 23 22 and camera relative orientation, so that the upper center to a central symmetric image 230 (imaged 1375136 March 28, 101 replacement page cen_tral-spiraetry pattern 'hereinafter referred to as "fourth Figure ~~ = (This figure is a conventional fisheye projection model). Or if or when the symmetry ίί ' also indicates that the wire axis 21 _ orthogonally passes through the image center 235 and the heart 225 ' and the front base point ( Front cardinal point (abbreviated as FCP) 242 and back cardinal point (BCP) 243; these two points are also on the optical axis 21. Since the pattern on the target 22 can be artificially placed in the known The absolute position can be used as the reference coordinate position to determine the orientation of the optical axis 21 in the space. Therefore, how to control the orientation of the PCp 22〇 or the camera 6〇 to know that an ICP 230 is a core program of the present invention. Figure 4" on the optical axis 21 The front base point 242 is the reference point for constructing the optical path of the visual field space; the rear base point 243 is the reference point for constructing the internal optical path of the camera. Since this is a logical equivalent diagram of the fisheye lens, the physical distance between the two points is not important. The PCP 220 in Figure 3 can be viewed as an optical arrangement that simulates the circular arrangement of a multicoliimator. For a long time, multiple collimator devices - New Zealand to correct the large lion's convex lens surface A projection function that uses a plurality of point sources arranged in a precise circular arc to produce a beam that is concentrated until a specific point is concentrated. 
The absolute position of the space at this point is the known position when the correction device is set. The orientation of the camera is tested to achieve the clearest image, at which point the projection center VP, which is the test camera, has overlapped with the pre-arranged beam concentration points. The multi-collimator simulates the point source installed at the precise geometric position. The optical path of incident light from an infinity known to the off-axis angle (α). And because the position of the image point mapped by each incident light can be accurately measured, it can be directly The measured data obtains the projection profile of the off-axis angle of the lens to the image. The principle of operation 'multi-collimator device can be used to measure any image component, module or system with spatially symmetric optical path of the optical axis, and The device can be used to launch the nucleus. Therefore, the projection mode with the force identification is not limited to a closed circular function projection. (Additional note: the projection profile presented in "Fig. 2" can be closed. The circular function is described. ) Therefore also applies to the identification of fisheye lens 1375136 丨 (8) March 28 曰 replacement page. However, the precision arc-shaped mechanical structure of the multi-collimator is difficult to indirectly simulate the multi-collimation space with a planar pattern. ^3, which expresses the design of the pcp 22〇 based on the above-mentioned basis. It consists of a solid center and a plurality of concentric and symmetrical geometric figures (concentric circles shown in Figure t). In the following, the multi-collimator measurement (or = Zhao) "method of the method is used to assist in describing the theoretical basis of the method of the present invention. Please refer to the ^4 picture again" to indicate the target circle of the plan view in the three-dimensional space constituting the instrument system. 
Tuya's projection light path generated by the fisheye _F〇V projection space can be used to represent the fisheye lens 24 and the image plane 23 to represent the projection behavior of the fisheye phase machine. The shape projection function is a supplementary explanation...that is, the projection function of the camera is the product of the known circular function fUnCti〇n) and the focal length, and then the alignment mechanism is substantially achieved from the incidence of the PCP 220 (c. Out of matingmeehanism), that is, all the rays in the front of the human phase will be concentrated in the front base point (10) - point, referred to as Fcp), and then a later base point Cardlnal P〇int 'referred to as BCP) according to the projection function Divergence, emission and imaging on image plane 23. Fcp 242 and Bcp 243 are two reference points describing the projection behavior of the two eyes and the head 24, which are used to define the fisheye camera = and the outer two projections (supplementary system: in addition to the inventor's cognition except the invention) There is currently no literature on how to introduce two points in the computer imaging system. When parsing the projection mechanism of the fisheye camera, Fcp 242 is used for the reference line parameter = 243 county image plane 23 reference, and the distance between the two nodes is not the camera: the number can be set to any value, so the present invention will be 242 and BCp For example, the VP 241 is a single-single-like one, as shown in "Fig. 5a", to harmonize the table: the circular function logic of imaging. "5A" can be regarded as the "figure 4" vertical plate type including the meridian plane of the optical axis 21 (meridiQnal ρ1_ light, the image. α in the figure, derived from the image height p and the focal length constant, deduced The logical relationship between α and α' is determined by the native projection mode of the tested lens. 
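The "known circular function times the focal length" logic described above can be made concrete. A minimal sketch of the three closed-form models the text repeatedly names (equidistant EDP, stereographic SGP, orthographic OGP), mapping the off-axis angle α of a sight line to the image height ρ; the function name is illustrative, not the patent's notation:

```python
import math

def image_radius(alpha, f, model="EDP"):
    """Image height rho for a sight line at off-axis angle alpha
    (radians), under the three circular-function projection models."""
    if model == "EDP":   # equidistant:   rho = f * alpha
        return f * alpha
    if model == "SGP":   # stereographic: rho = 2 * f * tan(alpha / 2)
        return 2 * f * math.tan(alpha / 2)
    if model == "OGP":   # orthographic:  rho = f * sin(alpha)
        return f * math.sin(alpha)
    raise ValueError("unknown projection model: " + model)
```

For small α the three models coincide (each reduces to ρ ≈ f·α) and they separate as α grows, which is why the target must span as much of the field of view as possible, as the curves of Figure 2 suggest.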
(Additional note: this is much like a refraction phenomenon.) The coordinate systems used to describe the implementation method of the present invention are defined as follows:
(1) the platform coordinate system W(X, Y, Z): the controllable rigid-body coordinate system defined by the three base axes of the adjustment platform;
(2) the target coordinate system T(x, y, z): its z axis is defined by the arrangement of target 22 as the reference direction; the target 22 is movable in displacement, and the units of the three components of T(x, y, z) are length;
(3) the camera projection-space coordinate system E(α, β, h): with the VP as the origin, where α, β and h define the three base quantities; longitude/latitude angles in the manner of geodesy describe the direction of a sight line, and h the distance along it;
(4) the image-plane coordinate system p(x, y) or p'(ρ, β): with the distortion center 235 as the origin, the image plane is expressed in rectangular or polar coordinates; the units of x, y and ρ are length, while β cannot be directly observed on this system;
(5) the pixel coordinate system I(u, v): the direct system of the computer display interface, in which u and v are in pixels; the distortion center 235 images at position I(uc, vc) of the computer display. Basically, the camera maps scene points onto the image plane, which the computer presents in the I(u, v) coordinate system; and any image point in the image can also be expressed, taking I(uc, vc) as the origin, in the coordinates C(u, v) or the polar coordinates p'(ρ, β).
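The passage from the pixel system I(u, v) through centered coordinates to the polar form p'(ρ, β) about the distortion center can be sketched as follows; the function name and the aspect-ratio handling are illustrative assumptions, not the patent's notation:

```python
import math

def pixel_to_polar(u, v, uc, vc, aspect=1.0):
    """Express pixel I(u, v) in centered coordinates C(u, v) and in the
    polar form p'(rho, beta) about the distortion center I(uc, vc).
    `aspect` compensates the pixel aspect ratio that makes a square
    entity display non-square (the default is illustrative)."""
    cu, cv = u - uc, (v - vc) * aspect
    rho = math.hypot(cu, cv)       # radial distance from the center
    beta = math.atan2(cv, cu)      # azimuth of the image point
    return (cu, cv), (rho, beta)
```

This conversion is what the later symmetry tests operate on: distances and angles are always taken relative to the distortion center, never the raw pixel origin.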
In the replacement page on March 28, 2010, the two calibrations are completed after the calibration system is completed.糸 巧 《, Ρ, Τ Τ Τ χ χ γ γ γ 。 。 。 。 。 。 。 。 。 。 。 。 。 。 。 。 。 。 。 。 。 。 。 。 。 。 。 。 。 。 。 。 。 。 。 。 。 。 。 。 The optical axis 2'1 of E(a, p, h) coincides. <Two-stage projection fisheye image visual model> Since the famous person discovered the mapping logic in the part of cartography (cart〇graphy) and common The fisheye is reflected in the picture. The following ^=^ refers to the subject terminology of the developed wolf in the field to assist in describing the image conversion method of the fisheye visual model of the present invention. There is a mapping logic for mapping the map already embedded in In the "5th b". From the = ground _ actual size and the earth sphere is not unified, the first step of the production site @ Ϊ = = = surface with the Earth * screen to the camera - a small radius of the ball can be = (small sphere is the term for map making; into one) 'so, can put the ball 30 The face is displayed on a limited area. The point 302 of the small ball is a position for the surface map. When the map is not intended to draw an equidistant projected map, it is reflected by the point 302 = map 23 - Point 23 can be used to display the surface geographic information of the small sphere with a radius of the unit ball k. The fisheye imaging mechanism can be interpreted as "the pinhole with the spherical image" 19 March, 30, 101 曰 Replacement of the dan / insect, from the Gan dynasty d Ζ ϋ ϋ ϋ 只 only: 4κ (χ β) draw "map". Let's take another EDP fisheye lens to illustrate, point VP24i is defined by the line segment 221, 241 line). After the visual parameter (the line segment 24U31 is set to ^^^^^^^^^^^^^^^^^^^^^^^^^疋The pinhole lens is projected in a straight line to the small ball 231. And then the mapping logic drawn by the map is used to make a point. 
The shooting of the fisheye image can facilitate the computer to count ί :2::Image' And the potential of computer analysis technology for map making is :t.. The thief frame is in the beginning, and the fisheye image of the touch is m. Not only can the above-mentioned materials be able to ride the distance from the projection mirror Ιϋ 绎: ^ The secret can be turned over in a very hybrid lens For example, the second picture has a different stereoscopic image and an orthogonal graphic fisheye lens. <The mechanism by which the target produces collimated incident light> 嫉 ΐ cr cr cr cr cr cr cr cr cr cr em em em em em em em em The PI Figure 5A contains: (1) The image of the fisheye camera shown in Figure 5B is part of the small ball 30, which is the part of the small ball 30; the parallel-step two radius is the focal length constant /; 2) pCp22〇 properly placed multi-collimation = point source structure; and (3) reference ball 30 Ppc 220 on the round = the defined big ball. Now the new PCP 22G outermost circle (four) obstruction j: the radius of the big ball defined by the circular trajectory is "feature point 221 to VP line: expressed as The big ball on the picture is 4〇; this big ball 4〇 is the outermost circle of the plan 22, and the concentric circle is secant, so the circle is a small circle on the surface of the big ball 40. Any two circles on the PCP 220 The shape can be used to describe a large ball by this mechanism alone. <The stem produces a conical light path> ^The line of sight (sight ray) projected from any object point 221 on the PCP 220 will be essentially at the incident point 3 () 1" traverses the surface of the ball 3 正交 orthogonally and concentrates toward the center of the ball (that is, VP 241), so that pCp 22〇 is the same as 20 1375136, and the threshold is replaced by the March 28th replacement page. 
The beam, this suspension is outside the camera, just like the conical diaphragm; the "three-dimensional light path" in "4" is refracted to the image plane 23 by the most function. For example, "The 4th first road is based on the technical VP, Fine can be described - strip 1 road such as "characteristics point 22",, <method of obtaining ICP> quasi-picture shadow empty __ sex, only in the optical axis 2] The corresponding image mapped by the crossover is also expected to be concentric and ^^^35. The geometric symmetry center of the iCP 230, iCP 230 is the distortion rise <above #, two: t adjust the graph dry 22 and the test camera The relative orientation is set to the accuracy. At this time, the center of the pattern is reflected by the center of the pattern 225. The position of the characteristic point (principal p〇int) is the origin c of the loss =, (o, 〇) "PY0 R, the position of the product is like a thousand faces 23 I (uc, vc); and through the distortion = 235 and 23 pixel coordinates as the line of sight will also pass vertically through the PCP 22 (), two 22f space, the road L π ^ f first learn the orientation of the axis 21. The above program realizes = correcting the orientation of the axis 21, which is the process of setting the external parameters of the camera. The pattern of the test pattern 220 is not limited to the plane form concentric circles, and PCp 220 is composed of β ώ ; It is the same 220 embodiment of the two-door, two-sided, or 同 思 concentric polygons of the smoke, the circle of the heart and the symmetry of the circle, the square, the three-to-one, and the number of the mouth. 
Of course, symmetry and surrounds $study='=== also has f-same: quality 'but it may be simple (four) processing ^ object 1375136 101 March 28 曰 replacement page below with a specific embodiment, can be achieved The PCP 220 is used as a reference position to locate the size and position of the ball 3's of the camera, the direction of the optical axis 21, and the center of distortion 235. In the actual experiment, the pep 220 is designed as shown in Fig. 6 and can be printed on A3 size paper by a laser printer as a specific embodiment of the figure 22. Considering that the degree of distortion of the fisheye lens increases sharply outward in a radial direction, the radius difference between concentric circles of the design PCP 220 is gradually enlarged from the inside to the outside to reflect this optical phenomenon of the fisheye lens; PCP 22 is concentric. For the determination of the radius of the circle, the image of the target image and the image can be obtained by using the image of the target image as shown in "Fig. 3" to adjust the width of the circumscribed circle of the pcp 22 ' The image of the middle area and the image range of the edge can be clearly displayed at the same time. The PCP external reference designed according to this principle is “Fig. 6. In addition, the black and white phase can clearly distinguish the concentric edge. For the subsequent image, please refer to “Fig. 7”, and fix the prepared target 22 to an adjustment platform. On the 'and let the map 22 and the camera 60 as close as possible', so that the PCP ^ can:, 亘 the FOV of the entire fisheye lens 24, the image thus mapped will be horizontal and large, and can be divided into the image display range. In this way, the image information of a larger viewing angle can be sampled, and the image reflection of the fisheye lens 24 can be best distinguished by the image reflection of the partial image, that is, as shown in the "Fig. 
2", the larger the angle of view, the difference The difference between the equations is more obvious. η又町" <Camera and Lens> The test camera body 60 is a Japanese Mechademic CV-M50E black and white CCD camera body, and the mounting lens is a DW9813 fisheye lens produced by Korea, such as ^=tlcai. This is a main specification provided by the respective manufacturers: lens focal length is h丄, j angle line of view angle is 170 degrees; and camera device CCD components, each &lt;Adjustment platform> 22 Replacement page on March 28, 101, in order to simplify the description of 'seats on a platform with three base axes--to the seat system w(x, Y, z), and set the absolute direction of the test camera ^ to absolute The positive Z direction of the coordinate system. The orientation of the square coordinate system Ttt Y, Z) by the drop of the map 22 is. The entire image can be moved. <Alignment procedure between the camera coordinate system and the target coordinate system> At the beginning of the assembly system, the camera optical axis is positive, and the entire three derivative variables are compared with the three conversion variables. ^ Must be 2 6 〇 in the direction of the eye as far as possible to orthogonally align with the image of the heart and then the phase, ,, page of the image and its symmetry index, through the computer program 2 test screen on the base axis 52 to fine-tune the absolute seat of the target 22 The direction of the universal pan head 71 and the Y' portion of the gimbal 71 is adjusted to the direction of the camera 60. The gf is the best. According to such a hardware setting, the phase i: the direction of symmetry should be the same as z, the wire 53 is the same as the mm, and the optical (4) is more ideally controlled by the software program. Wan Wanyun. 
71 can extend the direction adjustment as required.

<Method of judging image symmetry>
The invention provides two indexes for judging the image symmetry about the distortion center 235, in order to identify E(α, β, h) against the optical axis 21; grossly asymmetric images should be excluded. The first symmetry index of the ICP is described with reference to Figure 8A. Taking the image center as the origin, sampling lines are extended radially in the eight directions north, south, east, west, northeast, northwest, southeast and southwest, and the white and black sampling points along each line are recorded. The distances from the origin to the same concentric edge in the eight directions are denoted NN, SS, EE, WW, NE, SW, NW and SE respectively. If the image is an ideally symmetric ICP 230, the distance difference of each pair of opposite directions should vanish; that is, diff_1 = NN - SS, diff_2 = EE - WW, diff_3 = NE - SW and diff_4 = NW - SE should all approach zero, or, put relatively, the sum of their absolute values should reach a minimum. These four differences together serve as one reference index, by which the orientations of target 22 and camera 60 are fine-tuned toward the best symmetry. The above covers some processing techniques of the fisheye image.

<Edge-detection method>
The invention further devises an edge-recognition method according to the particularity of the fisheye image signal. Edge cutting is an important subject in the image-processing field, and similar imaging techniques can be used for the analysis; but fisheye image quality varies greatly with position, so the invention performs its own edge-recognition method to capture the image edges. Figure 8B shows the visual signals of the two inclined sampling-line regions of Figure 8A.
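The first symmetry index just described can be sketched in a few lines, assuming the eight radial distances to one concentric edge have already been measured (the names follow the text's direction labels):

```python
def symmetry_indexes(NN, SS, EE, WW, NE, SW, NW, SE):
    """First symmetry index of the ICP: distances from the assumed
    center to the same concentric edge, sampled in eight radial
    directions.  For a perfectly centered ICP every opposite pair is
    equal, so all four differences vanish."""
    diffs = (NN - SS, EE - WW, NE - SW, NW - SE)
    return diffs, sum(abs(d) for d in diffs)
```

Fine-tuning the target and camera orientation then amounts to driving the returned sum toward its minimum.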
It is a schematic of the visual signals of the sampled portions and of the processing results. Four groups of visual signals are displayed from top to bottom along the vertical axis, the horizontal axis extending radially from the center point to the image edge. In each group an irregular, progressively attenuated, nearly square-wave signal curve can be identified. The solid-line portion, the raw intensity response, exhibits severe radial progressive attenuation, so that characteristic signal positions are difficult to identify in the peripheral region of the image. The invention therefore develops an unsharp-mask-like processing program: the peripheral image intensity is first enhanced by a histogram-equalizing process, after which the signal is indicated by the dotted line in the figure; a non-causal low-pass filter applied to the equalized curve then generates a dynamic threshold (the near-horizontal curve shown in the figure), and the intersection points of the dynamic threshold with the equalized line are the edge points of the image trajectory. The identification results are shown as the waveform at the bottom of each signal group.

The second symmetry index of the ICP is used to judge the fisheye image with reference to the distortion center 235. The method is to re-express each image point, given in rectangular coordinates C(u, v), in the polar form p'(ρ, β); that is, to unwrap the image into polar coordinates about the distortion center, as in Figure 9. Under this transform the concentric-circle traces immediately become straight lines, and the linearity of these traces serves as the second symmetry evaluation.
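The equalize-then-threshold edge recipe can be sketched on a one-dimensional radial scan. The CDF-based equalization, the window size and the box low-pass below are illustrative stand-ins for the patent's unspecified filters:

```python
import numpy as np

def radial_edges(signal, win=15, eps=1e-9):
    """Locate edge points along one radial sampling line of the target
    image: boost the attenuated periphery by histogram equalization,
    low-pass the equalized profile into a dynamic threshold, and take
    the crossings of profile and threshold as edge points."""
    s = np.asarray(signal, dtype=float)
    # histogram equalization: map each intensity to its empirical CDF
    _, inv, counts = np.unique(s, return_inverse=True, return_counts=True)
    eq = (np.cumsum(counts) / s.size)[inv]
    # non-causal (zero-phase) box low-pass as the dynamic threshold
    thr = np.convolve(eq, np.ones(win) / win, mode="same")
    above = eq > thr + eps
    # indices where the profile crosses its dynamic threshold
    return np.nonzero(above[1:] != above[:-1])[0]
```

Because the threshold tracks the local signal level, the same crossing rule works both in the bright central region and in the attenuated periphery of the fisheye image.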
It can be judged directly by the naked eye or evaluated mathematically by computer; this symmetry index is very suitable for judging the ICP 230.

<The camera posing problem>
The axis orthogonal to the pattern center 225 passes through the image distortion center 235, so the orientation of the optical axis in space can be determined from the absolute position of the PCP 220.

<Determining the optimum VP position and the focal length constant under a specific projection mode>
The present invention uses the radius lengths (ri, pi) of the concentric circles on the PCP 220 and the ICP 230 as the numerical constraints and, by trial and error along the optical axis 21, tests hypothetical fixed points as the projection center while estimating the focal length constant f. Referring again to Figure 5A, after the optical axis 21 of the fisheye lens 24 has been located, the VP 241 of the fisheye lens 24 must lie at some point on the optical axis 21, which greatly reduces the search range. The detailed steps are as follows. After the coordinates of the distortion center have been obtained in the image plane, the physical radius ri of each circle of the PCP 220 (a preset value) and the corresponding image radius pi (a measured value) are collected as ordered conjugate coordinate pairs (ri, pi). Assume the distance between the VP 241 on the optical axis 21 and the pattern plane of the PCP 220 is D; accordingly the off-axis angle of the i-th circle is αi = tan⁻¹(ri/D), and its image radius on the image plane 23 is pi. Substituting the EDP projection function (ρ = f×α) as the test function, the ratios pi/αi should converge on a single f value.
If the camera under test fully complied with a closed-form model, whether EDP, stereographic projection (SGP, p = 2f·tan(α/2)) or orthographic projection (OGP, p = f·sin(α)), this test would immediately give a satisfactory result; the same method of testing the EDP mode can be implemented for the different projection functions in turn.

Set up the coordinate system with the assumed VP as the origin E(0, 0, 0) and the optical axis as the Z axis, so that the pattern center of the PCP 220 lies at T(0, 0, z). Let the distance between the VP 241 and the PCP 220 be D0. For the i-th concentric circle of the PCP 220, the physical radius is ri and the corresponding image radius is pi. Since both pi and αi depend on the size of D0, the measurements obey the following mathematical pattern: pi = pi(D0) and αi = αi(D0) = tan⁻¹(ri/D0), where i = 1..N and N is the total number of circles in the ICP 230. Taking the outermost circle as the reference and forming ratios gives, under the EDP premise,
pi(D0) / pN(D0) = αi(D0) / αN(D0) ------------------------(1)
In fact D0 is an unknown value, but it is certain that the VP falls on the optical axis; so if a trial point (0, 0, z) on the Z axis is taken, an error expression can be formed:
ei(z) = pi(D0) / pN(D0) − αi(z) / αN(z) ------------------------(2)
where αi depends on the trial z, i.e. αi(z) = tan⁻¹(ri/z), while the value of pi was fixed at the time of the experiment (its value does not change with the assumed z). Therefore, as long as at least two conjugate coordinate pairs (ri, pi) are measured (conjugated coordinates, each representing a mutually corresponding object point 221 and image point 231), the error terms ei(z) can be evaluated.
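The ratio test above can be written down compactly. The sketch below is an illustration under stated assumptions: the dictionary of candidate projection functions and the function name are ours, and the error term generalizes ei(z) = pi/pN − αi(z)/αN(z) by replacing α with the corresponding projection expression, so that the focal length constant f cancels in the ratio.

```python
import math

# Candidate fisheye projection functions p = f * g(alpha); the focal
# length constant f cancels in the ratio test below.
PROJECTIONS = {
    'EDP': lambda a: a,                    # equidistant:   p = f*a
    'SGP': lambda a: 2 * math.tan(a / 2),  # stereographic: p = 2f*tan(a/2)
    'OGP': lambda a: math.sin(a),          # orthographic:  p = f*sin(a)
}

def error_terms(r, p, z, model='EDP'):
    """Per-circle error ei(z) = pi/pN - g(ai(z))/g(aN(z)).

    r : physical radii ri of the PCP circles (target design values)
    p : measured image radii pi of the ICP circles
    z : trial distance of the viewpoint VP from the target plane
    """
    g = PROJECTIONS[model]
    a = [math.atan(ri / z) for ri in r]        # view angles ai(z)
    return [pi / p[-1] - g(ai) / g(a[-1]) for pi, ai in zip(p, a)]

# Synthetic equidistant data (f = 1, true VP distance 100):
r = [10, 20, 30, 40, 50, 60]
p = [math.atan(ri / 100.0) for ri in r]
errs_true = error_terms(r, p, 100.0)       # essentially zero
errs_off = error_terms(r, p, 50.0)         # clearly nonzero
```

At the true distance the errors vanish; at a wrong trial distance they do not, which is what the trial-and-error search exploits.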
Searching by trial and error over the points z on the optical axis 21, the point where the total error is minimum gives the position of VP 241. Because the projection function of the camera under test is not known in advance and the measured coordinate pairs (ri, pi) are noisy, the image trajectory must be traced over multiple circles and the result judged statistically. To treat the contribution of each circular trajectory fairly, a weighting function is introduced:
wi(D0) = (pi(D0) − pi−1(D0)) / pN(D0) ------------------------(3)
where pN is the image radius of the outermost circle measured from the distortion center 235. The error function used to find the projection center 241 is then implemented as
ε(z) = Σi |ei(z)| × wi(D0) ------------------------(4)
The point z that makes ε(z) smallest, ideally approaching 0, is taken as the VP. The mathematical forms above are built on the EDP hypothesis; if the premise is changed to, for example, SGP (p = 2f·tan(α/2)) or OGP (p = f·sin(α)), formulas (1) to (4) must be derived again according to the projection function of SGP or OGP. The inference built on this concept is called the "ε-algorithm".

As for the focal length constant f, based on the measured pi(D0) and the corresponding αi(D0), the following formula is used:
f(D0) = Σi fi(D0) × wi(D0) ------------------------(5)
where fi(D0) = pi(D0)/αi(D0) under the EDP premise. Similarly, if the premise is changed to SGP, then fi(D0) equals (1/2)·pi(D0)/tan(αi(D0)/2); or if the setting function is OGP, then fi(D0) equals pi(D0)/sin(αi(D0)). If the projection mode assumed for the lens element is correct and the measurement is error-free, f(D0) will equal every individual fi(D0), which is then the focal length constant f of the lens.
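A minimal sketch of the ε-algorithm as described, under the equidistant premise: weights wi = (pi − pi−1)/pN, total error ε(z) = Σ|ei(z)|·wi minimized over trial distances z, and the focal length constant recovered as the weighted mean of fi = pi/αi. The grid search, the function name, and the synthetic example values are assumptions.

```python
import math

def epsilon_algorithm(r, p, z_grid):
    """Locate the viewpoint VP on the optical axis by trial and error
    under the equidistant-projection (EDP) premise:

        ei(z)  = pi/pN - ai(z)/aN(z),  ai(z) = atan(ri/z)
        wi     = (pi - p_{i-1}) / pN   (p_0 taken as 0, so sum(wi) = 1)
        eps(z) = sum_i |ei(z)| * wi

    Returns (best_z, f): the grid point minimizing eps, and the focal
    length constant as the weighted mean of fi = pi/ai at that point.
    """
    w = [(pi - pj) / p[-1] for pi, pj in zip(p, [0.0] + list(p[:-1]))]

    def eps(z):
        a = [math.atan(ri / z) for ri in r]
        return sum(abs(pi / p[-1] - ai / a[-1]) * wi
                   for pi, ai, wi in zip(p, a, w))

    best_z = min(z_grid, key=eps)
    a = [math.atan(ri / best_z) for ri in r]
    f = sum((pi / ai) * wi for pi, ai, wi in zip(p, a, w))
    return best_z, f

# Synthetic EDP camera: f = 320 (pixels/radian), VP 100 mm from target.
r = [10, 20, 30, 40, 50, 60]
p = [320.0 * math.atan(ri / 100.0) for ri in r]
best_z, f_est = epsilon_algorithm(r, p, [float(z) for z in range(80, 121)])
```

On this ideal data the search recovers the true viewpoint distance and focal length; for the SGP or OGP premise the error and fi expressions would be swapped as described above.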
In fact, the statistical standard deviation of the obtained fi(D0) data can be used to estimate the accuracy of the hypothetical projection mode. That is to say, the following formula can be used as the index of how well the assumed projection mode matches, called the "σ-algorithm":
σ(D0) = ( Σi (fi(D0) − f(D0))² ) / (N − 1) ------------------------(6)
To further verify the reliability of the experimental results (including the orientation of the optical axis 21 and the best-matching camera projection mode), refer again to the seventh figure. The present invention first collimates the optical axis 21, takes the absolute coordinate position of the target 22 as a reference, and then moves the target 22 twice in the positive Z direction, each time adding a displacement of 5 mm; during the two displacements, the orientation of the camera 60 and the coordinates of the target 22 on the X' base axis 51 and Y' base axis 52 remain unchanged. Including the first experiment, these three experiments are denoted Test1, Test2 and Test3.

Table 1 (all values in mm). Rows: displacement 0 (Test1), 5 (Test2), 10 (Test3); columns: ε-algorithm Di(EDP), Di(OGP), Di(SGP) and σ-algorithm Di(EDP), Di(OGP), Di(SGP). Legible entries: 24.4, 26.4, 33.1, 19.7, 26.2, 29.4, 40.1, 1.82, 2.44, 1.85, 39.4, 2.42, 20.7, 24.9, 3.10.

Referring to the displacement results of Table 1 beginning with Test1, the experimental accuracy is clearly best under the EDP model: whichever algorithm is used, the derived Di values faithfully reflect the incremental 5 mm displacement of each experiment. Within the same experiment, the D values derived by the two algorithms under the EDP premise also agree with each other, and the derived focal length constant (about 1.85 mm) is closer to the 1.78 mm stated in the lens specification.
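The σ-algorithm's goodness-of-fit index can be sketched in the same way: for an assumed projection mode and viewpoint distance, compute the per-circle focal-length estimates fi and take their variance; a small value indicates a good match. The function name and the dispatch over the three projection modes are our assumptions.

```python
import math

def sigma_index(r, p, z, model='EDP'):
    """Variance of the per-circle focal-length estimates fi under an
    assumed projection mode and viewpoint distance z; a small value
    means the assumed model fits the measurements well (eq. (6))."""
    a = [math.atan(ri / z) for ri in r]
    if model == 'EDP':
        fi = [pi / ai for pi, ai in zip(p, a)]
    elif model == 'SGP':
        fi = [pi / (2.0 * math.tan(ai / 2.0)) for pi, ai in zip(p, a)]
    elif model == 'OGP':
        fi = [pi / math.sin(ai) for pi, ai in zip(p, a)]
    else:
        raise ValueError('unknown model: %s' % model)
    mean = sum(fi) / len(fi)
    return sum((x - mean) ** 2 for x in fi) / (len(fi) - 1)

# Data generated by an ideal equidistant camera (f = 320, z = 100):
r = [10, 20, 30, 40, 50, 60]
p = [320.0 * math.atan(ri / 100.0) for ri in r]
s_edp = sigma_index(r, p, 100.0, 'EDP')   # near zero: correct premise
s_ogp = sigma_index(r, p, 100.0, 'OGP')   # large: wrong premise
```

Comparing the index across the three premises reproduces, in miniature, the model selection reported in Table 1.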
This difference may be due to errors in manually assembling the lens. The experimental results under OGP and SGP, by contrast, deviate considerably from the known absolute displacements. The final columns, with their small ε and σ values under EDP, show that the ε-algorithm and the σ-algorithm agree with each other. Next, taking Test1 as an example and plotting ε and σ against the trial distance z, both curves display a single distinct minimum regardless of the assumed projection kernel; the existence of that single minimum is what determines the VP position and the focal length value. However, the same lens obtains different VP 241 positions and focal lengths under different reference projection functions, which means the experiment cannot simply single out one closed-form projection function; in practice it is also difficult to find a closed-form function that fully describes an arbitrary lens. The "ε-algorithm" and "σ-algorithm" disclosed in the above embodiment interpret the imaging logic of the lens under test by deduction, and the obtained results verify that the proposed method is feasible. The subject so far has been the calibration of the camera's internal and external optical parameters, i.e. the computer vision model algorithm.

&lt;Vision model II: a general camera parameter interpretation&gt;
The present invention further develops a general camera parameter deduction method which, without assuming any closed-form projection function, deduces the optical projection mechanism and the camera parameters directly from the mapping between the correction points and their image positions. This is a more general camera parameter algorithm, developed according to the physical projection behavior of real lenses.
It is a basic property of real lenses that if several object points lie on the same specific line of sight in the camera's field of view, they are uniquely mapped to the same image point on the image plane; and for a camera with a central projection, all such lines of sight converge at a single point, the projection center or viewpoint (VP). This mechanism is illustrated schematically in Fig. 11. The characteristic points W313[p], W323[q], W333[r] lie on the same line of sight 80, so a single image message on the screen (image point 91) cannot distinguish the different object points of the real target 22 on that sight line (the figure shows the three correction points 313, 323, 333 when the target is at positions p, q and r). Conversely, if at least two different object points, recorded at different absolute target positions, are found to map to the same image position, the line connecting their absolute physical coordinates can be identified as a sight line 80, and the intersection of the sight line 80 with the optical axis 21 is the FCP 242, i.e. the effective projection center (VP). Therefore, the camera's projection imaging mechanism can be found by establishing the geometry of all the sight lines in physical space. The method of implementing this deduction of the fisheye camera projection logic has already been published in a Republic of China patent application and is not repeated here; the methodology can handle the projection function of any kind of fisheye image.

&lt;Fisheye visual three-dimensional metrology system&gt;
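The sight-line reasoning above admits a small numeric check: if two object points at different absolute target positions are known to map to the same image point, the line through them is a line of sight, and for a central projection whose optical axis is the Z-axis that line meets the axis at the viewpoint. A sketch, in which the function name and the least-squares formulation are assumptions:

```python
import numpy as np

def z_axis_crossing(P, Q):
    """Given two object points P and Q known to map to the same image
    point, the line through them is a line of sight; for a central
    projection whose optical axis is the Z-axis, that line passes
    through the viewpoint VP on the axis.  Returns the z-coordinate
    where the P-Q line meets (in least squares, passes closest to)
    the Z-axis."""
    P, Q = np.asarray(P, float), np.asarray(Q, float)
    d = Q - P
    # Solve P_xy + t * d_xy = 0 for the scalar line parameter t.
    t, *_ = np.linalg.lstsq(d[:2].reshape(2, 1), -P[:2], rcond=None)
    return float((P + t[0] * d)[2])

# Two target positions on one sight line through an assumed VP at
# (0, 0, -30):
vp = np.array([0.0, 0.0, -30.0])
direction = np.array([1.0, 2.0, 5.0])
P = vp + 2.0 * direction
Q = vp + 5.0 * direction
z_vp = z_axis_crossing(P, Q)
```

Repeating this over many image points and intersecting the resulting sight lines is, in essence, how the general (model-free) deduction recovers the projection mechanism.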
&lt;Remapping fisheye images into pinhole images&gt;
Converting (remapping) a fisheye image into a pinhole-equivalent image resolves the placement problem of the camera, so the technique of converting the fisheye image to the pinhole image is part of the patented technology: after conversion, straight lines in space are imaged as straight lines.

&lt;The technique of fisheye camera orientation&gt;
When the camera parameter method was disclosed in the previous part, the position of the target served as the absolute reference; the orientation of the camera relative to the PCP, whose absolute direction and position must be precisely known, is an important reference for three-dimensional metrology. The example given is intended to prove the feasibility of the method of the invention, and the achievable camera arrangements are not limited to it.

&lt;Three-dimensional computer vision measurement system with camera rectification arrangement&gt;
Two cameras can be set up so that corresponding epipolar lines fall on the same rows of the two image planes (the rectified arrangement); this is a special case of 3D visual metrology. Its advantage is that the three-dimensional algebraic operations become simple; its disadvantage is that a general pinhole camera, whose field-of-view angle is small, restricts the operational field of view. Fig. 12 shows the rectified pinhole three-dimensional visual measurement system: 3D visual metrology can be performed successfully only within the overlapping field of view, and other objects M, N cannot be measured because they are not inside the pinhole field of view. With the fisheye image computer vision model of the present invention, the view angle of a single camera is close to 180°, so the working angle achievable by the system is very large, and non-pinhole cameras can be rectified in the same way. A schematic diagram of the embodiment of the fisheye three-dimensional measuring device is given next.
Refer to Fig. 13 for details.

&lt;Rectifying the pixel axes of the two cameras&gt;
Rectification of the two fisheye cameras can be achieved by extending a baseline on the PCP 220; please refer to Fig. 14, in which the target is of the same design as that of Fig. 6. The direction of each camera axis is adjusted by rotating the camera until the image of the baseline passes through the distortion center and falls on the vertical pixel axis of the image plane. These steps can be combined into the camera parameter measurement phase. Since any lens image always possesses this radially symmetric property, the program imposes no restrictions on other arrangements.

&lt;Erecting the fisheye camera three-dimensional visual measurement system&gt;
The construction of the three-dimensional visual measurement system embodiment includes a "right eye" and a "left eye" camera (two cameras of the kind used in the camera parameter stage) and a camera frame, whose baseline displacement must be determined and whose directions can be freely adjusted. First, the optical axis 21 is aligned visually with the aid of the image, and the camera and the target 22 are adjusted as needed.
With the aid of its image, the target 22 is rotated and moved so that its pattern center lies on the optical axis (step 934); the conjugate physical and image data are recorded, the computer vision model of the right-eye camera is deduced, and the platform coordinates of the right-eye camera are taken as the system reference point (step 935); the direction of the target 22 and the Z-axis position of the platform are maintained, allowing motion of the platform coordinates only in directions that keep the image plane parallel to the target plane (step 936); the left-eye camera is placed at its approximate position (step 937); the target 22 is moved horizontally and vertically until its circularly symmetric image is obtained, and the conjugate physical and image data are recorded (step 938); the direction of the camera is adjusted so that the images of the target's auxiliary horizontal and vertical lines exactly coincide with the horizontal and vertical extensions of the ICL (step 939); the conjugate physical and image data are recorded again, and the left-eye computer vision model is deduced (step 940).

&lt;3D algorithm of the fisheye visual metrology&gt;
The measurement results for the left- and right-eye cameras completed on the system are listed in the third table, which gives the models based on equidistant projection, stereographic projection and orthographic projection together with the Sigma and Epsilon tests. The meaning of each field is as follows: the third to fourth columns give the parameters of the camera under the three different test projections, and Z(ep) denotes the value calculated with the Epsilon test.
Z(ep) is the calculated viewpoint distance, in mm, and FL(ep) is the focal length constant obtained by the same calculation; Err(sig) is the Sigma test result (Err in mm, the other columns in the same units as the Epsilon test). The distortion center appearing in the computer screen image is given in pixels, and the origin of the camera coordinates takes the camera's target coordinates as reference. For the right-eye camera the lens focal length provided by the camera manufacturer is also listed. Referring to the test results, the camera clearly behaves like an equidistant projection; therefore this group of parameters is used as the benchmark for the three-dimensional vision calculation, and the results under stereographic and orthographic projection are not used. The "fisheye image computer vision model" of the left and right eyes gives the position of each camera's external visual reference point (projection center) and the camera orientation, together with the ability to convert the images into wire-frame perspective images that are logically equivalent to pinhole imaging. Existing computer vision technology can therefore directly use these converted wire-frame perspective (i.e. pinhole-equivalent) images to perform 3D visual metrology.

Third table (partial legible values): 31.4, 0.005, 0.008, 1/784, 2.577, FL(sig) 0.008, origin 0.13, 1.464, 89.06, 219.50, TR(0, 0, 40), Z(ep) 35.4, 54.3, 27.7, 0.92, TR(−266, −94, 35, ...).

Measurement: after the system is erected, the first group of measured dimensions is compared against known reference dimensions,
taking as the standard the known dimensions of the laboratory where the measurement system is located; the comparison is shown in the fourth table.

Fourth table (units: mm):
Physical quantity | Actual size | Measured size | Error
Glass door height | 2730 | 2851.09 | −4.44%
Glass window width (top left) | 1000 | 997.74 | 0.23%
Glass window width (bottom right) | 1000 | 989.59 | 1.04%
Glass window height | 530 | 527.23 | 0.52%
Book pivot | 1200 | 1162.02 | 3.17%
Bookcase length | 980 | 1011.98 | −3.26%
Length | 600 | 595.23 | 0.79%

For items whose images are captured clearly, the measurement error can be kept low: the error of every measured component is less than 5%, even though the viewpoints of the two cameras are not perfectly constrained (whether the optical axes are completely parallel, the optical center positions, and so on). The working angle achievable by the system exceeds 100 degrees. With the camera parameters of each camera and the method of erecting the stereoscopic visual measurement system deduced above, the fisheye perceptron can be used to structure a three-dimensional metrology device, and various combinations can be made without violating the spirit of this creation. The method of the present invention has the following advantages: the calculation logic required to convert the fisheye image becomes very simple; the converted fisheye image has good fidelity; and the calibration of the camera parameters is not limited to a specific EDP mode, since various technical mechanisms may be used to find the image center and the projection center (VP) of the model. The present invention is not limited to the applications and refinements illustrated; the scope of protection of the present invention is defined by the appended claims.

[Brief description of the drawings]
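Once both fisheye images are converted into pinhole-equivalent (wire-frame perspective) form and the cameras are rectified along a common baseline, the 3D computation reduces to standard disparity triangulation, Z = f·B/d. The sketch below uses assumed example numbers (focal length in pixels, baseline in mm); it is not the patent's own algorithm, only the textbook rectified-stereo step it relies on.

```python
def triangulate_rectified(xl, xr, y, f, baseline):
    """Depth from a rectified stereo pair of pinhole-equivalent images.

    xl, xr   : column coordinates of the same point in the left and
               right images (pixels, relative to each distortion center)
    y        : shared row coordinate (rectified, so equal in both images)
    f        : focal length constant in pixels
    baseline : camera separation (same unit as the returned coordinates)
    Returns (X, Y, Z) in the left-camera frame.
    """
    disparity = xl - xr
    Z = f * baseline / disparity
    return (xl * Z / f, y * Z / f, Z)

# Assumed example: f = 320 px, baseline = 120 mm, a point seen at
# column 64 (left) and 48 (right) on row 32.
X, Y, Z = triangulate_rectified(64.0, 48.0, 32.0, 320.0, 120.0)
# disparity = 16 px  ->  Z = 320 * 120 / 16 = 2400 mm
```

Because the fisheye field of view is close to 180°, this same computation remains valid over a far larger working angle than a pinhole pair would allow.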
Fig. 1A shows an image projection diagram based on the ideal projection model, together with the corresponding image analysis diagram; Fig. 2 shows a typical fisheye lens projection function graph; Fig. 3 shows a schematic diagram of a concentric-circle target embodiment of the invention; Fig. 5 shows a schematic diagram of the stereographic projection optical path formed by the image (with Fig. 5A detailing the small sphere and the image-plane portion of the stereographic optical path of Fig. 3A); Fig. 6 shows a schematic diagram of simulating multiple collimated incident light paths with the symmetry pattern (PCP), taking the equidistant projection as an example; Fig. 7 shows the relative positions of the fisheye camera and the target in the verification experiments; Fig. 8 shows the approach curves in the four directions (east, south, west, north) obtained with the polar-coordinate conversion method of the invention, taking the distortion center as the origin; Figs. 9 and 10 show, for the actual test of the invention, the approach curves of the distortion center obtained under the same projection function; Fig. 11 shows a schematic diagram of the theoretical mode of an embodiment of the general camera parameter algorithm of the invention, illustrating the difference of the optical path to the same image point when the target is moved to different absolute positions; Fig. 12 shows a schematic optical path diagram of pinhole imaging through camera rectification; Fig. 13 shows a schematic diagram of an embodiment of the fisheye three-dimensional visual metrology system; Fig.
14 is a schematic view showing the adjustment of the camera direction with the target of Fig. 6 (the PCP); and Fig. 15 is a flow chart showing the steps of erecting the fisheye image visual measurement system.

[Description of main component symbols]
11 formation area; 12 long axis; 13, 13', 13'' short axis and meridians of the prime meridian mapping; 21 optical axis; 22 target; 220 centrally symmetric pattern (PCP); 221 object point; 225 pattern center; 23 image plane; 230 circularly symmetric image (ICP); 231 image point; 235 distortion center; 24 lens; 241 projection center (VP); 242 front cardinal point (FCP); 243 back cardinal point (BCP); 30 small sphere; 301 incident point; 302 normalized image point; 31 equatorial plane; 313, 323, 333 correction points; 40 large sphere; 50 adjustment platform; 51 X' base axis; 52 Y' base axis; 53 Z' base axis; 60 camera; 70 camera stand; 71 universal gimbal; 80 line of sight; 91 image point


Claims (1)

1. A fisheye-camera three-dimensional visual metrology system, comprising:
a right-eye camera fitted with a non-linear projection lens, the internal and external optical parameters of the right-eye camera being known;
a left-eye camera fitted with a non-linear projection lens, the internal and external optical parameters of the left-eye camera being known; and
a camera frame for fixing the relative orientation of the right-eye camera and the left-eye camera;
wherein the right-eye camera and the left-eye camera have their internal and external optical parameters calibrated with a target bearing a centrally symmetric pattern: the target is first used to calibrate the internal and external optical parameters of the right-eye camera; the target is then moved and, taking the coordinate position of the right-eye camera as the system reference point, the internal and external optical parameters of the left-eye camera are calibrated; once the relative orientation and the related parameters of the two cameras are known, the fisheye-camera three-dimensional visual metrology system is formed.
2. The fisheye-camera three-dimensional visual metrology system of claim 1, wherein the non-linear projection lens is a fisheye lens.
3. The fisheye-camera three-dimensional visual metrology system of claim 1, wherein the target bearing a centrally symmetric pattern is a concentric-circle target with auxiliary horizontal and vertical line segments.
4. The fisheye-camera three-dimensional visual metrology system of claim 1, wherein a universal gimbal is further mounted on each of the left- and right-eye camera frames so that the direction of each camera can be changed.
5. A method of erecting a fisheye-camera three-dimensional visual metrology system, comprising:
using a target bearing a centrally symmetric pattern to derive the circularly symmetric image (ICP) of a right-eye camera;
deriving the computer vision model of the right-eye camera with reference to the conjugate physical and image radius data;
moving the target, according to its design, so that its center is orthogonal to the optical axis of the right-eye camera;
recording the conjugate physical and image data, deriving the computer vision model of the right-eye camera, and taking the platform coordinates of the right-eye camera as the system reference point;
keeping the direction of the target and the Z-axis position of the platform unchanged, and moving the platform coordinates so that the image plane remains parallel to the target plane;
placing a left-eye camera at its approximate position;
moving the target horizontally and vertically to obtain its circularly symmetric image, and recording the conjugate physical and image data for deriving the computer vision model of the left-eye camera;
deriving the computer vision model of the left-eye camera from the conjugate physical and image data; and
computing, from the recorded target positions in the conjugate physical and image data, the displacement of the camera in the target plane and the derived viewpoint distance.
6. The method of erecting a fisheye-camera three-dimensional visual metrology system of claim 5, further comprising, after deriving the computer vision model of the right-eye camera: rotating and moving the target so that the images of the auxiliary horizontal and vertical line segments coincide with the horizontal and vertical extension lines of the ICR.
7. The method of erecting a fisheye-camera three-dimensional visual metrology system of claim 5, further comprising, after deriving the computer vision model of the right-eye camera: adjusting the direction of the left-eye camera so that the images of the target's auxiliary horizontal and vertical line segments exactly coincide with the horizontal and vertical extension lines of the ICL.
TW94101592A 2004-01-20 2005-01-19 3D visual measurement system using fish-eye cameras as visual detectors and method for constructing same TW200528945A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
TW94101592A TW200528945A (en) 2004-01-20 2005-01-19 3D visual measurement system using fish-eye cameras as visual detectors and method for constructing same

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
TW93101557 2004-01-20
TW94101592A TW200528945A (en) 2004-01-20 2005-01-19 3D visual measurement system using fish-eye cameras as visual detectors and method for constructing same

Publications (2)

Publication Number Publication Date
TW200528945A TW200528945A (en) 2005-09-01
TWI375136B true TWI375136B (en) 2012-10-21

Family

ID=48093266

Family Applications (1)

Application Number Title Priority Date Filing Date
TW94101592A TW200528945A (en) 2004-01-20 2005-01-19 3D visual measurement system using fish-eye cameras as visual detectors and method for constructing same

Country Status (1)

Country Link
TW (1) TW200528945A (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI555379B (en) * 2015-11-06 2016-10-21 輿圖行動股份有限公司 An image calibrating, composing and depth rebuilding method of a panoramic fish-eye camera and a system thereof
TWI555378B (en) * 2015-10-28 2016-10-21 輿圖行動股份有限公司 An image calibration, composing and depth rebuilding method of a panoramic fish-eye camera and a system thereof
CN108632504A (en) * 2017-03-15 2018-10-09 致伸科技股份有限公司 multi-lens optical device
TWI661392B (en) * 2017-12-27 2019-06-01 聚星電子股份有限公司 Image stitching method and device

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104160700B (en) 2012-06-28 2016-09-14 日本电气株式会社 Camera position attitude assessment equipment and camera position attitude appraisal procedure
TWI660328B (en) * 2017-02-23 2019-05-21 鈺立微電子股份有限公司 Image device utilizing non-planar projection images to generate a depth map and related method thereof
TWI606421B (en) * 2017-03-13 2017-11-21 國立交通大學 Method and device for fisheye camera automatic calibration
TWI646506B (en) * 2017-10-24 2019-01-01 華晶科技股份有限公司 Method and image pick-up apparatus for calculating coordinates of object being captured using fisheye images
US10762658B2 (en) 2017-10-24 2020-09-01 Altek Corporation Method and image pick-up apparatus for calculating coordinates of object being captured using fisheye images
US11741584B2 (en) 2018-11-13 2023-08-29 Genesys Logic, Inc. Method for correcting an image and device thereof
CN110610520B (en) 2019-08-29 2022-03-29 中德(珠海)人工智能研究院有限公司 Visual positioning method and system based on double-dome camera
CN113873223B (en) * 2021-09-03 2023-07-21 大连中科创达软件有限公司 Method, device, equipment and storage medium for determining definition of camera


Also Published As

Publication number Publication date
TW200528945A (en) 2005-09-01

Similar Documents

Publication Publication Date Title
CN101308018B (en) Stereo vision measuring apparatus based on binocular omnidirectional visual sense sensor
Peleg et al. Omnistereo: Panoramic stereo imaging
TWI375136B (en)
US10354136B2 (en) Head mounted eye tracking device and method for providing drift free eye tracking through a lens system
CN101487703B (en) Fast Panoramic Stereo Camera Measuring Device
US20140168378A1 (en) Calibration and registration of camera arrays using a single circular grid optical target
CN109255844A (en) For using the Graphics overlay layer of the size of video inspection device measurement feature
US20040046888A1 (en) Method for presenting fisheye-camera images
TW565735B (en) Method for determining the optical parameters of a camera
CN107786808B (en) Method and apparatus for generating data representative of a shot associated with light field data
TWI752905B (en) Image processing device and image processing method
Chen et al. A novel mirrored binocular vision sensor based on spherical catadioptric mirrors
CN113052921B (en) A system calibration method for a three-dimensional line of sight tracking system
Orghidan et al. Omnidirectional depth computation from a single image
TW200422755A (en) Method for determining the optical parameters of a camera
CN110515214A (en) An integrated imaging 3D display device with high depth of field
RU2729698C2 (en) Apparatus and method for encoding an image captured by an optical system for acquiring data
CN107464278B (en) Full-view sphere light field rendering method
TW594453B (en) Method for presenting fisheye-camera images
CN103077518B (en) Based on Camera Self-Calibration method and the device of circular point
CN116105633A (en) Inspection method of free-form surface optical lens
CN108353120B (en) Apparatus and method for generating data representing a pixel beam
KR20190026014A (en) Apparatus and method for generating data representing a pixel beam
Bakstein et al. Non-central cameras for 3D reconstruction
CN113989105A (en) A single camera spherical mirror reflection imaging projection device