201236439 VI. Description of the Invention:
[Technical Field]
The present invention relates to a 3D image sensor calibration method, and more particularly to a method that accurately finds the horizontal and vertical displacement between two images captured by a 3D image sensor and calibrates the 3D image sensor accordingly.
[Prior Art]
Stereoscopic 3D content has developed rapidly in recent years, and the quality of the 3D images largely determines how comfortably a user can view them. In general, 3D images are captured by a 3D camera having left and right lens-and-image-sensor modules, simulating the 3D visual effect of a scene as seen by the left and right human eyes. If, during manufacture, the position or angle of a lens relative to its image sensing module is off, the captured 3D images will be of poor quality and cannot be viewed comfortably. How to ensure that the positions and angles of the lenses and image sensing modules are set quickly and accurately during manufacture is therefore a subject that 3D camera makers are committed to researching.
SUMMARY OF THE INVENTION
The main object of the present invention is to provide a 3D image sensor calibration method that measures the position difference between the two images captured by the two sets of lens-and-image-sensor modules of a 3D camera module, and adjusts the module accordingly, so that the 3D camera module can capture 3D images of good quality.
To achieve the above object, the present invention discloses a 3D image sensor calibration method comprising the following steps:
Step (A): mounting and positioning a 3D camera module, the 3D camera module including at least two sets of lens-and-image-sensor modules;
Step (B): capturing at least two images of the outside world with the at least two sets of lens-and-image-sensor modules of the 3D camera module;
Step (C): calculating, with a control device, the position difference value of the captured images;
Step (D): determining whether the position difference value lies within a preset range; if it does, executing step (F), and if it does not, executing step (E);
Step (E): adjusting, according to the position difference value, the position of at least one of the at least two sets of lens-and-image-sensor modules, and then returning to step (B); and
Step (F): stopping.
In a preferred embodiment, the at least two images include a first image and a second image, and the first and second images are both P*Q image data, each having a width of P pixels and a height of Q pixels.
The method for calculating the position difference value in step (C) includes the following steps:
Step (C1): determining a search block in the second image, the search block lying within the P*Q image data range of the second image;
Step (C2): establishing a search condition from the first image, the search condition being built from the data of a condition block within the P*Q image data range of the first image, the width and height of the condition block in pixels being smaller than the width and height of the search block, respectively;
Step (C3): finding, within the search block of the second image, the position of a corresponding block that satisfies the search condition; and
Step (C4): calculating the position difference value between the condition block and the corresponding block from the difference between their positions within the P*Q image data range.
In a preferred embodiment, the width and height of the condition block are both W pixels, and the width and height of the search block are both W+2t pixels, where (W+2t)&lt;P and (W+2t)&lt;Q.
In a preferred embodiment, t is an integer multiple of W, and the method of finding a corresponding block that satisfies the search condition in step (C3) first computes at least one condition value from the data of the condition block by at least one mathematical function, and partitions the search block into a plurality of sub-search blocks, each likewise W pixels in width and height, with the edges of adjacent sub-search blocks touching. The value of each sub-search block is then computed in turn with the same mathematical function and compared with the condition value; a sub-search block whose value matches is the corresponding block that satisfies the search condition.
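A minimal sketch of steps (C1) through (C4) in Python, assuming grayscale images held in NumPy arrays and using the sum of absolute differences as the "at least one mathematical function" (the patent leaves the function open, so the function choice and all names here are illustrative only):

```python
import numpy as np

def find_offset(left, right, cx, cy, W, t):
    """Steps (C1)-(C4): match a WxW condition block of the left image
    against a (W+2t)x(W+2t) search block in the right image, comparing
    W-aligned sub-search blocks (stride W, edge-adjacent tiling)."""
    cond = left[cy:cy + W, cx:cx + W]              # condition block, step (C2)
    best, best_off = None, (0, 0)
    for dy in range(-t, t + 1, W):                 # sub-search blocks tile the
        for dx in range(-t, t + 1, W):             # search block edge to edge
            cand = right[cy + dy:cy + dy + W, cx + dx:cx + dx + W]
            score = np.abs(cand.astype(int) - cond.astype(int)).sum()
            if best is None or score < best:       # closest value wins, step (C3)
                best, best_off = score, (dx, dy)
    return best_off                                # position difference, step (C4)
```

A perfect match drives the score to zero; otherwise the sub-search block with the closest value is reported, matching the fallback described for step (C3).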
In a preferred embodiment, the search for a corresponding block that satisfies the search condition in step (C3) is performed as a multi-stage search.
In a preferred embodiment, the multi-stage search first takes a relatively large value W1 as the width and height of a first condition block for a first-stage search, and searches a first search block of width and height W1+2t pixels for a first corresponding block of width and height W1 that satisfies the search condition. Once the first corresponding block of width and height W1 is found, it is taken as a second search block for a second-stage search, and a relatively small value W2 is taken as the width and height of a second condition block; a second corresponding block of width and height W2 that satisfies the search condition is then sought within that second search block.
In a preferred embodiment, in step (A) the 3D camera module is positioned by a positioning device, and the positioning device includes a mechanism that can adjust the position of at least one of the at least two sets of lens-and-image-sensor modules.
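As a sketch of how steps (A) through (F) fit together as a control loop; the `camera`, `measure_offset`, and `adjust` callables are hypothetical stand-ins for the fixture hardware and analysis software, and the retry cap is an assumption, not part of the disclosure:

```python
def calibrate(camera, measure_offset, adjust, tolerance, max_rounds=20):
    """Steps (A)-(F): capture two images, measure their position
    difference, and adjust the module until the difference falls
    inside the preset range."""
    for _ in range(max_rounds):                       # guard against non-convergence
        img_left, img_right = camera()                # step (B): capture two images
        dx, dy = measure_offset(img_left, img_right)  # step (C): position difference
        if abs(dx) <= tolerance and abs(dy) <= tolerance:
            return dx, dy                             # step (D) passed -> step (F): stop
        adjust(dx, dy)                                # step (E): move a module, retry (B)
    raise RuntimeError("module did not converge; fixture may need inspection")
```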
[Embodiment]
To describe the 3D image sensor calibration method proposed by the present invention more clearly, it is explained in detail below with reference to the drawings.
The 3D image sensor calibration method of the present invention is mainly intended to improve the capture quality of 3D cameras. To this end, an alignment correction detection system for 3D image sensors was developed, in which a precision hardware fixture is combined with assisting software to correct the horizontal and vertical positions of the lenses and image sensing modules, so that the 3D camera can capture 3D images of the best quality. In the present invention, the displacement between the two adjacent left and right images captured by the 3D camera is determined by applying various comparison methods to the correspondences between them. The invention mainly uses motion estimation to calculate the horizontal and vertical displacement between the two images. First, to obtain an accurate displacement, a block with stable content is chosen as the detection area; second, because motion estimation consumes most of the processing time, a fast search method is used to find the best matching position, yielding the displacement of the whole image and making effective use of the alignment correction detection system for the 3D image sensor.
Please refer to FIG.
1, which is a schematic diagram of an embodiment of the alignment correction detection system for the 3D image sensor of the present invention. The system includes a 3D camera module 10, a positioning device 20, a control device 22, and a test pattern 23.
The 3D camera module 10 includes at least two sets of lens-and-image-sensor modules 11, 12. In this embodiment, these are a first lens-and-image-sensor module 11 on the left and a second lens-and-image-sensor module 12 on the right, whose captured left and right images simulate the images seen by the left and right eyes, respectively, when viewing the same scene. Each lens-and-image-sensor module 11, 12 includes a lens set and an image sensor; the lens set projects light from the outside scene onto the sensor, which converts it into electrical signals corresponding to the image. The lens set may be a fixed-focus or a zoom lens set, and the sensor may be a CCD or a CMOS sensor. In this embodiment, the 3D camera module 10 preferably consists only of a circuit board, the two sets of lens-and-image-sensor modules 11, 12 on the circuit board, and a positioning structure for positioning the two lens-and-image-sensor modules, without other elements of a 3D camera such as an LCD display screen or operating buttons; however, the 3D camera module 10 may also include all or some of the other elements of a 3D camera.
The positioning device 20 is used to position the 3D camera module 10 so that every lens-and-image-sensor module 11, 12 on the 3D camera module 10 can capture a specific area or a specific pattern 24 of the test pattern 23. The positioning device 20 is provided with an adjustment mechanism (not shown) that can precisely adjust the position, at least in the width and height directions, and the tilt angle of at least one of the at least two sets of lens-and-image-sensor modules 11, 12.
The control device 22 is connected to the circuit board of the 3D camera module 10 by a transmission line 21, and receives, detects, and analyzes the electrical signals from the 3D camera module 10. In this embodiment, the control device 22 may be computer equipment including a display, an operating keyboard, and the like, running specific analysis software that processes the aforementioned electrical signals and carries out the various detection, computation, and analysis steps of the 3D image sensor calibration method of the present invention.
Please refer to FIG. 2, a flowchart of an embodiment of the 3D image sensor calibration method of the present invention, which includes the following steps.
Step 31: positioning a 3D camera module 10 on the positioning device 20, the 3D camera module including at least two sets of lens-and-image-sensor modules 11, 12 (for example, but not limited to, the first and second lens-and-image-sensor modules 11, 12).
Step 32: capturing at least two images of the specific area pattern 24 of the test pattern 23 with the at least two sets of lens-and-image-sensor modules 11, 12 of the 3D camera module 10. As shown in FIG. 4, the at least two images include a first image 41 (left image) captured by the first lens-and-image-sensor module and a second image 42 (right image) captured by the second lens-and-image-sensor module, the first and second images 41, 42 both being P*Q image data of width P pixels and height Q pixels. In this embodiment, the test pattern 23 may contain a specific pattern 24 intended for image test analysis; in another embodiment, however, the test pattern 23 may also be an ordinary scene.
Step 33: receiving the electrical signals of the two images 41, 42 with the control device 22 and calculating the position difference value of the captured at least two images 41, 42.
Step 34: determining, with the control device 22, whether the position difference value lies within the preset range. If it does, the positions and angles of the two lens-and-image-sensor modules 11, 12 fall within the allowable range, the quality of the 3D images captured by the 3D camera module 10 is acceptable, and step 35 is executed to stop the 3D image sensor calibration method of the present invention. Conversely, if the determination shows that the position difference value does not lie within the preset range, the 3D images captured by the 3D camera module 10 are of poor quality and cannot be viewed comfortably, and step 36 is executed.
Step 36: operating the positioning device 20 according to the position difference value to adjust the position of at least one of the two lens-and-image-sensor modules 11, 12, and then returning to the capturing step 32 to execute again.
Please refer to FIG. 3 and FIG. 4, in which FIG. 3 is a flowchart of the method of calculating the position difference value in step 33 shown in FIG. 2, and FIG. 4 is a schematic diagram illustrating the block ranges on the first and second images 41, 42. In the present invention, the calculation of step 33 further includes the following steps.
Step 331: determining a search block 422 in the second image captured by the second lens-and-image-sensor module 12, the search block 422 lying within the P*Q image data range of the second image 42. In this embodiment, the search block 422 is the search range for the later search, and its width and height are both W+2t pixels, where t is the largest image offset that can be expected from position deviations of the lens and image sensing modules, the value of W is explained later, (W+2t)&lt;P, and (W+2t)&lt;Q.
Step 332: establishing a search condition from the first image 41 captured by the first lens-and-image-sensor module 11.
The search condition is built from the data of a condition block 411 within the P*Q image data range of the first image 41, the width and height of the condition block 411 both being W pixels; in other words, the width and height of the condition block 411 in pixels are smaller than those of the search block 422, respectively. In this embodiment, the analysis and computation are block-based, so t is preferably an integer multiple of W; in another embodiment, however, t need not be an integer multiple of W.
Step 333: finding, within the search block 422 of the second image 42, the position of a corresponding block 421 that satisfies the search condition.
Step 334: calculating, from the difference between the positions of the condition block 411 and the corresponding block 421 within the P*Q image data range, the position difference value between the condition block 411 and the corresponding block 421, which is the position difference value between the first and second images 41, 42.
Please refer to FIG. 5, a schematic diagram of searching for a corresponding block that satisfies the search condition in the 3D image sensor calibration method of the present invention. In the present invention, the search of step 333 first computes at least one condition value from the data of the condition block on the first image by at least one mathematical function, and partitions the search block on the second image into a plurality of sub-search blocks, each likewise W pixels in width and height, with the edges of adjacent sub-search blocks touching. In the example of FIG. 5, the search block on the second image is divided into 15 equal parts in both width and height, giving a total of 225 sub-search blocks.
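The 15 by 15 grid follows directly from the block geometry: with edge-adjacent W-pixel sub-search blocks tiling a search block of W+2t pixels per side, there are (W+2t)/W = 2(t/W)+1 sub-search blocks per side, which is 15 when t = 7W as in FIG. 5. A small helper illustrating the arithmetic (assuming, as in the preferred embodiment, that t is an integer multiple of W):

```python
def subblock_grid(W, t):
    """Sub-search blocks per side and in total when W-pixel sub-blocks
    tile a (W + 2t)-pixel-wide search block edge to edge."""
    if t % W != 0:
        raise ValueError("the preferred embodiment takes t as a multiple of W")
    per_side = (W + 2 * t) // W   # equals 2 * (t // W) + 1
    return per_side, per_side * per_side
```

For the FIG. 5 example, `subblock_grid(W, 7 * W)` gives (15, 225) for any W.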
Then, block by block, the value of each sub-search block is computed in turn with the same mathematical function and compared with the condition value. A sub-search block whose value matches is the corresponding block that satisfies the search condition; if no value matches exactly, the sub-search block with the closest value is taken as the corresponding block that satisfies the search condition. As shown in FIG. 5, suppose the condition block lies at the exact center of the first image, at coordinates (0, 0), and the search finds that the corresponding block with the same image content as the condition block (that is, satisfying the search condition) lies at the coordinates (-1, -5) marked with an X on the second image; the position difference value between the first and second images is then known to be a width-direction offset of -1W pixels and a height-direction offset of -5W pixels.
In a preferred embodiment of the present invention, the search of step 333 for a corresponding block that satisfies the search condition can further be performed as a multi-stage search. Taking a three-stage search as an example, a relatively large value W1 is first taken as the width and height of a first condition block for a first-stage search, and a first search block of width and height W1+2t pixels is searched for a first corresponding block of width and height W1 that satisfies the search condition. Once the first corresponding block of width and height W1 is found, it is taken as a second search block for a second-stage search, and a relatively small value W2 is taken as the width and height of a second condition block; the second search block is then searched for a second corresponding block of width and height W2 that satisfies the search condition.
Next, the second corresponding block is taken as a third search block for a third-stage search, and a smallest value W3 is taken as the width and height of a third condition block; the third search block of width and height W2 is then searched for a third corresponding block of width and height W3 that satisfies the search condition. In this way, the efficiency of the search can be increased. In the example shown in FIG. 6, during the first-stage large-block search, the width and height of the first condition block are both W1, and W1 equals 7 times W3; that is, a first corresponding block a1 of width and height 7 W3-units each (width numbers -7 to -1, height numbers 1 to 7) must be found within a first search block whose width and height are each divided into at least 15 units numbered -7 to 7, each unit being W3 pixels. Once the first corresponding block a1 is found, the second-stage middle-block search proceeds: the width and height of the second condition block are both W2, and W2 equals 3 times W3; that is, a second corresponding block a2 of width and height 3 W3-units each (width numbers -3 to -1, height numbers 5 to 7) must be found within the second search block of width numbers -7 to -1 and height numbers 1 to 7. Then, once the second corresponding block a2 is found, the third-stage small-block search proceeds, with the width and height of the third condition block both W3; that is, a third corresponding block a3 of width and height W3 pixels must be found within the third search block of width numbers -3 to -1 and height numbers 5 to 7. Once the third corresponding block a3, located at width number -1 and height number 5, is found, the position difference value between the first and second images is known to be a width-direction offset of -1 W3 pixels and a height-direction offset of -5 W3 pixels.
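The coarse-to-fine idea above can be sketched in Python with NumPy arrays, again using the sum of absolute differences as one possible matching function; the staged block widths W1 > W2 > W3 become the `widths` sequence, and details such as the exact unit-numbering scheme of FIG. 6 are simplified away, so this is an illustrative sketch rather than the patent's exact procedure:

```python
import numpy as np

def sad(a, b):
    """Sum of absolute differences, one possible matching function."""
    return np.abs(a.astype(int) - b.astype(int)).sum()

def multistage_offset(left, right, cx, cy, widths, t):
    """Each stage matches a block of the next smaller width, searching
    only around the offset located by the previous (coarser) stage."""
    off_x, off_y, span = 0, 0, t
    for W in widths:                       # e.g. (W1, W2, W3), decreasing
        best = None
        for dy in range(off_y - span, off_y + span + 1, W):
            for dx in range(off_x - span, off_x + span + 1, W):
                cand = right[cy + dy:cy + dy + W, cx + dx:cx + dx + W]
                if cand.shape != (W, W):   # candidate falls outside the image
                    continue
                score = sad(cand, left[cy:cy + W, cx:cx + W])
                if best is None or score < best:
                    best, found = score, (dx, dy)
        off_x, off_y = found               # narrow the next stage to this block
        span = W
    return off_x, off_y
```

Each stage shrinks the search span to the block just found, so far fewer candidates are scored than in a single fine-grained exhaustive search, which is the efficiency gain the three-stage example describes.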
The embodiments described above are not intended to limit the applicable scope of the present invention; the scope of protection of the present invention is defined primarily by the technical spirit of the claims and the range covered by their equivalent variations. All equivalent changes and modifications made within the scope of the claims of the present invention remain within the essence of the invention and do not depart from its spirit and scope, and should therefore be regarded as further embodiments of the present invention.
BRIEF DESCRIPTION OF THE DRAWINGS
FIG. 1 is a schematic diagram of an embodiment of the alignment correction detection system for the 3D image sensor of the present invention.
FIG. 2 is a flowchart of an embodiment of the 3D image sensor calibration method of the present invention.
FIG. 3 is a flowchart of the method of calculating the position difference value in step 33 shown in FIG. 2.
FIG. 4 is a schematic diagram illustrating the block ranges on the first and second images 41, 42.
FIG. 5 is a schematic diagram of an embodiment of searching for a corresponding block that satisfies the search condition in the 3D image sensor calibration method of the present invention.
FIG. 6 is a schematic diagram of another embodiment of searching for a corresponding block that satisfies the search condition in the 3D image sensor calibration method of the present invention.
[Description of Main Component Symbols]
10: 3D camera module
11, 12: lens-and-image-sensor modules
20: positioning device
21: transmission line
22: control device
23: test pattern
24: pattern
31-36, 331-334: steps