
TWI836116B - Determination device, substrate processing apparatus, and method of manufacturing an article - Google Patents

Determination device, substrate processing apparatus, and method of manufacturing an article

Info

Publication number
TWI836116B
TWI836116B (application TW109123521A)
Authority
TW
Taiwan
Prior art keywords
substrate
image
image data
substrate processing
classification
Prior art date
Application number
TW109123521A
Other languages
Chinese (zh)
Other versions
TW202107410A (en)
Inventor
藤原広鏡
Original Assignee
Canon Inc. (日商佳能股份有限公司)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Canon Inc. (日商佳能股份有限公司)
Publication of TW202107410A publication Critical patent/TW202107410A/en
Application granted granted Critical
Publication of TWI836116B publication Critical patent/TWI836116B/en

Links

Images

Classifications

    • G PHYSICS
    • G03 PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03F PHOTOMECHANICAL PRODUCTION OF TEXTURED OR PATTERNED SURFACES, e.g. FOR PRINTING, FOR PROCESSING OF SEMICONDUCTOR DEVICES; MATERIALS THEREFOR; ORIGINALS THEREFOR; APPARATUS SPECIALLY ADAPTED THEREFOR
    • G03F9/00 Registration or positioning of originals, masks, frames, photographic sheets or textured or patterned surfaces, e.g. automatically
    • G03F9/70 Registration or positioning of originals, masks, frames, photographic sheets or textured or patterned surfaces, e.g. automatically for microlithography
    • G03F9/7088 Alignment mark detection, e.g. TTR, TTL, off-axis detection, array detector, video detection
    • H ELECTRICITY
    • H01 ELECTRIC ELEMENTS
    • H01L SEMICONDUCTOR DEVICES NOT COVERED BY CLASS H10
    • H01L21/00 Processes or apparatus adapted for the manufacture or treatment of semiconductor or solid state devices or of parts thereof
    • H01L21/67 Apparatus specially adapted for handling semiconductor or electric solid state devices during manufacture or treatment thereof; Apparatus specially adapted for handling wafers during manufacture or treatment of semiconductor or electric solid state devices or components; Apparatus not specifically provided for elsewhere
    • H01L21/68 Apparatus specially adapted for handling semiconductor or electric solid state devices during manufacture or treatment thereof; Apparatus specially adapted for handling wafers during manufacture or treatment of semiconductor or electric solid state devices or components; Apparatus not specifically provided for elsewhere for positioning, orientation or alignment
    • H01L21/681 Apparatus specially adapted for handling semiconductor or electric solid state devices during manufacture or treatment thereof; Apparatus specially adapted for handling wafers during manufacture or treatment of semiconductor or electric solid state devices or components; Apparatus not specifically provided for elsewhere for positioning, orientation or alignment using optical controlling means
    • G PHYSICS
    • G03 PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03F PHOTOMECHANICAL PRODUCTION OF TEXTURED OR PATTERNED SURFACES, e.g. FOR PRINTING, FOR PROCESSING OF SEMICONDUCTOR DEVICES; MATERIALS THEREFOR; ORIGINALS THEREFOR; APPARATUS SPECIALLY ADAPTED THEREFOR
    • G03F9/00 Registration or positioning of originals, masks, frames, photographic sheets or textured or patterned surfaces, e.g. automatically
    • G03F9/70 Registration or positioning of originals, masks, frames, photographic sheets or textured or patterned surfaces, e.g. automatically for microlithography
    • G03F9/7003 Alignment type or strategy, e.g. leveling, global alignment
    • G03F9/7023 Aligning or positioning in direction perpendicular to substrate surface
    • G PHYSICS
    • G03 PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03F PHOTOMECHANICAL PRODUCTION OF TEXTURED OR PATTERNED SURFACES, e.g. FOR PRINTING, FOR PROCESSING OF SEMICONDUCTOR DEVICES; MATERIALS THEREFOR; ORIGINALS THEREFOR; APPARATUS SPECIALLY ADAPTED THEREFOR
    • G03F9/00 Registration or positioning of originals, masks, frames, photographic sheets or textured or patterned surfaces, e.g. automatically
    • G03F9/70 Registration or positioning of originals, masks, frames, photographic sheets or textured or patterned surfaces, e.g. automatically for microlithography
    • G03F9/7092 Signal processing
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N20/00 Machine learning
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/10 Segmentation; Edge detection
    • G06T7/12 Edge-based segmentation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/70 Determining position or orientation of objects or cameras
    • G06T7/73 Determining position or orientation of objects or cameras using feature-based methods
    • G06T7/74 Determining position or orientation of objects or cameras using feature-based methods involving reference images or patches
    • H ELECTRICITY
    • H01 ELECTRIC ELEMENTS
    • H01L SEMICONDUCTOR DEVICES NOT COVERED BY CLASS H10
    • H01L21/00 Processes or apparatus adapted for the manufacture or treatment of semiconductor or solid state devices or of parts thereof
    • H01L21/02 Manufacture or treatment of semiconductor devices or of parts thereof
    • H01L21/027 Making masks on semiconductor bodies for further photolithographic processing not provided for in group H01L21/18 or H01L21/34
    • H ELECTRICITY
    • H01 ELECTRIC ELEMENTS
    • H01L SEMICONDUCTOR DEVICES NOT COVERED BY CLASS H10
    • H01L21/00 Processes or apparatus adapted for the manufacture or treatment of semiconductor or solid state devices or of parts thereof
    • H01L21/67 Apparatus specially adapted for handling semiconductor or electric solid state devices during manufacture or treatment thereof; Apparatus specially adapted for handling wafers during manufacture or treatment of semiconductor or electric solid state devices or components; Apparatus not specifically provided for elsewhere
    • H01L21/67005 Apparatus not specifically provided for elsewhere
    • H01L21/67242 Apparatus for monitoring, sorting or marking
    • H01L21/67259 Position monitoring, e.g. misposition detection or presence detection
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30108 Industrial image inspection
    • G06T2207/30148 Semiconductor; IC; Wafer

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Power Engineering (AREA)
  • Condensed Matter Physics & Semiconductors (AREA)
  • Manufacturing & Machinery (AREA)
  • Computer Hardware Design (AREA)
  • Microelectronics & Electronic Packaging (AREA)
  • Software Systems (AREA)
  • Signal Processing (AREA)
  • Artificial Intelligence (AREA)
  • Data Mining & Analysis (AREA)
  • Evolutionary Computation (AREA)
  • Medical Informatics (AREA)
  • Multimedia (AREA)
  • Computing Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Mathematical Physics (AREA)
  • Exposure And Positioning Against Photoresist Photosensitive Materials (AREA)

Abstract

The present invention relates to a determination device, a substrate processing apparatus, and a method of manufacturing an article. The determination device according to the present invention is characterized in that image data of a mark on a substrate captured in the substrate processing apparatus is classified with respect to image evaluation, and the alignment accuracy of the substrate is determined based on the classification result.

Description

Determination device, substrate processing apparatus, and method of manufacturing an article

The present invention relates to a determination device, a substrate processing apparatus, and a method of manufacturing an article.

In recent years, with the miniaturization of electronic devices and growing demand, it has become necessary to balance the miniaturization and the productivity of semiconductor elements such as memories and MPUs. Accordingly, in substrate processing apparatuses that process substrates used for manufacturing semiconductor elements, the alignment that positions the substrate also needs to be highly accurate. For substrate alignment, a method is often used in which an image of a mark formed on the substrate is captured and pattern matching is performed on the obtained image data to determine the position of the substrate. Japanese Patent Laid-Open No. 2000-260699 discloses an exposure apparatus that detects a mark with high accuracy by simultaneously extracting the edges of the mark and their directions and performing edge-oriented pattern matching for each edge direction. However, with pattern matching such as that of Japanese Patent Laid-Open No. 2000-260699, it is difficult to detect the mark in image data that exhibits low contrast, noise, or mark distortion, and the alignment accuracy of the substrate may decrease. The purpose of the present invention is therefore to provide a determination device capable of determining the alignment accuracy of a substrate, a substrate processing apparatus, and a method of manufacturing an article.

The determination device according to the present invention is characterized in that image data of a mark on a substrate captured in the substrate processing apparatus is classified with respect to image evaluation, and the alignment accuracy of the substrate is determined based on the classification result. Further features of the present invention will become apparent from the following description of exemplary embodiments with reference to the attached drawings.

The determination device according to the present embodiment will be described in detail below with reference to the drawings. The embodiments shown below are merely specific examples of implementation, and the present embodiment is not limited to them. Moreover, not all combinations of the features described in the embodiments below are essential to solving the problem addressed by the present embodiment. In the drawings referred to below, elements may be drawn at a scale different from reality to make the present embodiment easier to understand.

[First Embodiment] FIG. 1 is a block diagram showing the configuration of a substrate processing system 50 including a determination device according to the first embodiment. The determination device according to the present embodiment may be provided in a substrate processing apparatus 10 of the substrate processing system 50 as described below, but it is not limited to this and may instead be provided in a host computer 11, a management device 12, or the like. The substrate processing system 50 includes at least one semiconductor manufacturing line 1. Each semiconductor manufacturing line 1 includes a plurality of substrate processing apparatuses 10 (semiconductor manufacturing apparatuses) that process substrates and a host computer 11 (host control apparatus) that controls the operation of the plurality of substrate processing apparatuses 10. Examples of the substrate processing apparatus 10 include lithography apparatuses (exposure apparatuses, imprint apparatuses, charged particle beam drawing apparatuses, and the like), film forming apparatuses (CVD apparatuses and the like), processing apparatuses (laser processing apparatuses and the like), and inspection apparatuses (overlay inspection apparatuses and the like). The substrate processing apparatus 10 may also include a coater/developer that applies a resist material (adhesion material) to the substrate as pre-processing for the lithography process and performs development as post-processing for the lithography process. In an exposure apparatus, a photoresist supplied onto the substrate is exposed through an original (reticle, mask), forming on the photoresist a latent image corresponding to the pattern of the original. In an imprint apparatus, an imprint material supplied onto the substrate is cured while an original (mold, template) is in contact with it, thereby forming a pattern on the substrate. In a charged particle beam drawing apparatus, a pattern is drawn with a charged particle beam on a photoresist supplied onto the substrate, forming a latent image on the photoresist. As shown in FIG. 1, the plurality of substrate processing apparatuses 10 in each semiconductor manufacturing line 1 are connected to a management device 12 that manages maintenance. The management device 12 can thus individually manage the plurality of substrate processing apparatuses 10 in each semiconductor manufacturing line 1. The management device 12 may also function as a maintenance determination device that collects and analyzes the operation information of each substrate processing apparatus 10 to detect an abnormality, or a sign of one, occurring in each apparatus and determine whether maintenance processing (repair processing) is required. In the substrate processing system 50, the connections between the substrate processing apparatuses 10 and the host computer 11, and between the substrate processing apparatuses 10 and the management device 12, may be either wired or wireless.

Next, a specific example in which each substrate processing apparatus 10 in the substrate processing system 50 is configured as an exposure apparatus will be described. FIG. 2A is a block diagram showing the configuration of the exposure apparatus 10 provided in the substrate processing system 50, and FIG. 2B is a schematic diagram showing the configuration of a substrate alignment optical system 190 included in the exposure apparatus 10. The exposure apparatus 10 is a lithography apparatus that forms patterns on substrates for the manufacture of articles such as semiconductor elements, liquid crystal display elements, and thin-film magnetic heads. The exposure apparatus 10 exposes the substrate by a step-and-scan or step-and-repeat method. As shown in FIG. 2A, the exposure apparatus 10 has a main control unit 100, a light source control unit 110, a light source 120, an image processing unit 130, a stage control unit 140, and an interferometer 150. The exposure apparatus 10 also has an original alignment optical system 160, an original stage 171, a projection optical system 180, the substrate alignment optical system 190, and a substrate stage 200. The original stage 171 holds and moves an original 170 illuminated by an illumination optical system (not shown). The pattern to be transferred to a substrate 210 is drawn on the original 170. The projection optical system 180 projects the pattern of the original 170 onto the substrate 210. The substrate stage 200 can hold and move the substrate 210. The original alignment optical system 160 is used for alignment of the original 170; it may include, for example, an imaging element 161 composed of a storage-type photoelectric conversion element and an optical system 162 that guides light from a mark provided on the original 170 to the imaging element 161. The substrate alignment optical system 190 is used for alignment of the substrate 210; in the present embodiment, it is an off-axis optical system that detects a mark 211 provided on the substrate 210. The main control unit 100 includes a CPU, memory, and the like, controls each part of the exposure apparatus 10, and performs exposure processing of the substrate 210 and processing related to it. In the substrate processing system 50, the main control unit 100 controls the position of the substrate stage 200 based on the position of the mark formed on the original 170 and the position of the mark 211 formed on the substrate 210. In other words, the main control unit 100 aligns the original 170 and the substrate 210 with each other, for example by global alignment. The light source 120 includes a halogen lamp or the like and illuminates the mark 211 formed on the substrate 210. The light source control unit 110 controls the light from the light source 120, that is, the illumination intensity of the light used to illuminate the mark 211. The image processing unit 130 performs image processing on image signals (detection signals) from the imaging element 161 of the original alignment optical system 160 and the imaging element of the substrate alignment optical system 190, and obtains the positions of the marks.
In the substrate processing system 50, the image processing unit 130 and the substrate alignment optical system 190 function as a measurement device that measures the position of the mark 211 formed on the substrate 210. The interferometer 150 measures the position of the substrate stage 200 by irradiating a mirror 212 provided on the substrate stage 200 with light and detecting the light reflected by the mirror 212. The stage control unit 140 moves the substrate stage 200 to an arbitrary position (drive control) based on the position of the substrate stage 200 measured by the interferometer 150. In the exposure apparatus 10, light (exposure light) from the illumination optical system (not shown) passes through the original 170 held on the original stage 171 and enters the projection optical system 180. Since the original 170 and the substrate 210 are arranged in an optically conjugate positional relationship, the pattern of the original 170 is imaged through the projection optical system 180 onto the substrate 210 held on the substrate stage 200 and thereby transferred. The substrate alignment optical system 190 functions as a detection unit that detects the mark 211 formed on the substrate 210 and generates a detection signal (an image signal in the present embodiment). As shown in FIG. 2B, the substrate alignment optical system 190 includes imaging elements 191A and 191B, imaging optical systems 192A and 192B, and a half mirror 193, as well as an illumination optical system 194, a polarization beam splitter 195, a relay lens 196, a λ/4 plate 197, and an objective lens 198. In the exposure apparatus 10, light from the light source 120 is guided to the substrate alignment optical system 190 through an optical fiber (not shown) or the like. The light guided to the substrate alignment optical system 190 enters the polarization beam splitter 195 through the illumination optical system 194, as shown in FIG. 2B. The light reflected by the polarization beam splitter 195 passes through the relay lens 196, the λ/4 plate 197, and the objective lens 198 and illuminates the mark 211 formed on the substrate 210. The light reflected by the mark 211 passes through the objective lens 198, the λ/4 plate 197, the relay lens 196, and the polarization beam splitter 195 and enters the half mirror 193, where it is split into two beams at an appropriate intensity ratio; the two beams are guided to the imaging optical systems 192A and 192B, which have mutually different imaging magnifications. The imaging optical systems 192A and 192B form images of the mark 211 on the imaging surfaces of the imaging elements 191A and 191B, respectively. The imaging elements 191A and 191B each include an imaging surface that captures a region containing the mark 211 and generate an image signal corresponding to the captured region. The image signals generated by the imaging elements 191A and 191B are read out by the image processing unit 130. In the present embodiment, the image processing unit 130 obtains position information of the mark 211 on the imaging surfaces of the imaging elements 191A and 191B by performing pattern matching, as image processing, on the read-out image signals. Pattern matching is generally divided into two broad types. One binarizes the (grayscale) image, matches it against a previously prepared template, and takes the position of highest correlation as the position of the mark 211. The other keeps the grayscale image as it is and determines the position of the mark 211 by a correlation operation with a template that contains grayscale information. The image processing performed by the image processing unit 130 is not limited to pattern matching; any processing that can obtain the position information of the mark 211, such as edge detection, may be used. As alignment schemes, there are a movement measurement scheme and an image processing scheme. In the movement measurement scheme, the mark 211 provided on the substrate 210 is irradiated with light (a laser) while the substrate stage 200 is moved, and the position of the mark 211 is determined by measuring, in parallel, the change in intensity of the light reflected from the mark 211 and the position of the substrate stage 200. In the image processing scheme, the mark 211 provided on the substrate 210 is irradiated with white light while the substrate stage 200 is stationary, and the position of the mark 211 is determined by detecting the light reflected from the mark 211 with a storage-type photoelectric conversion element and performing image processing. In the exposure apparatus 10, two kinds of alignment, pre-alignment and fine alignment, are performed using the obtained position information of the mark 211. Pre-alignment here means detecting the positional deviation of the substrate 210 delivered to the substrate stage 200 from a substrate transport system (not shown) and roughly positioning the substrate 210 so that fine alignment can begin. Fine alignment here means measuring the position of the substrate 210 held on the substrate stage 200 with high accuracy and positioning the substrate 210 precisely so that its alignment error falls within a tolerance. Specifically, when fine alignment of the substrate 210 is performed by image processing, for example, the four marks 211a to 211d on the substrate 210 shown in FIG. 3 are each imaged, and the position of the substrate 210 is calculated from the obtained position information. In fine alignment, the mark 211 sometimes cannot be detected, and even when it can be detected, the image processing sometimes fails to obtain its position for some reason. For example, the mark 211 may be indistinct because of the processing steps the substrate 210 has undergone, or may not be seen clearly because of aberrations of the substrate alignment optical system 190. The position of the mark 211 may also deviate from the field of view of the imaging surfaces of the imaging elements 191A and 191B. When a clear image of the mark 211 is obtained within the field of view of the imaging surface of the imaging element 191A or 191B, the position of the mark 211 can be measured correctly by image processing.
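As an illustration of the grayscale correlation variant of pattern matching described above, the sketch below slides a small template over an image and scores each placement by normalized cross-correlation, taking the best-scoring position as the mark position. This is a minimal sketch for intuition only, not the implementation of the image processing unit 130; the array sizes, the brute-force scan, and the synthetic cross-shaped mark are assumptions.

```python
import numpy as np

def find_mark(image, template):
    """Brute-force template matching by normalized cross-correlation (NCC).

    Returns the (row, col) of the template placement with the highest
    NCC score, together with that score (1.0 means a perfect match).
    """
    ih, iw = image.shape
    th, tw = template.shape
    t = template - template.mean()
    t_norm = np.sqrt((t * t).sum())
    best_score, best_pos = -1.0, (0, 0)
    for r in range(ih - th + 1):
        for c in range(iw - tw + 1):
            w = image[r:r + th, c:c + tw]
            w = w - w.mean()
            denom = np.sqrt((w * w).sum()) * t_norm
            score = float((w * t).sum() / denom) if denom > 0 else 0.0
            if score > best_score:
                best_score, best_pos = score, (r, c)
    return best_pos, best_score

# Synthetic example: a cross-shaped "mark" placed at (4, 5) in a 10x10 image.
template = np.array([[0., 1., 0.],
                     [1., 1., 1.],
                     [0., 1., 0.]])
image = np.zeros((10, 10))
image[4:7, 5:8] = template
pos, score = find_mark(image, template)  # pos == (4, 5), score ~= 1.0
```

Production systems would normally use a prefiltered, vectorized correlation rather than this double loop, but the score surface it computes is the same: a clear, undistorted mark yields a sharp correlation peak, while the low-contrast or distorted images discussed below flatten or shift that peak.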
However, when the contrast of the image is low or the image is distorted by the influence of aberrations, the position of the mark 211 sometimes cannot be measured correctly. Causes of the mark 211 deviating from the field of view of the imaging surface of the imaging element 191A or 191B that originate in the apparatus include mismeasurement during pre-alignment and positional deviation during the transport processing before measurement. Causes that originate in the processing steps of the substrate 210, such as variation in the transfer position of the mark 211, are also conceivable. If measurement of the mark 211 fails, the substrate 210 cannot be positioned normally, and maintenance processing (repair processing) for restoring normal positioning is executed. Maintenance processing includes, for example, changing which of the marks 211 is used, enlarging the search range for the mark image, and changing the imaging conditions. If the alignment processing for the substrate 210 fails, sufficient alignment accuracy cannot be achieved even if exposure processing is subsequently performed on the substrate 210. In that case, an error is normally issued, processing of the substrate 210 is stopped, and work to identify and eliminate the cause of the failure is carried out. On the other hand, if the alignment processing for the substrate 210 succeeds, exposure processing of the substrate 210 follows; yet even then, sufficient alignment accuracy may not be achieved in the exposure processing. One cause of this is that mismeasurement of the position of the mark 211 makes the calculation used to position the substrate 210 incorrect. For example, in a mark image obtained by imaging a region containing the mark 211, an erroneous image signal may be generated because of adhering dust or other conditions at the time of imaging, causing the position of the mark 211 to be mismeasured. When the position of the mark 211 is mismeasured, a wrong value is used in the calculation for positioning the substrate 210. As a result, even if the positioning error of the substrate 210 converges within the tolerance and the alignment processing succeeds, the alignment accuracy at the time of exposure processing is still degraded. FIG. 4 shows an example of a mark image obtained by the alignment processing for the mark 211 formed on the substrate 210. Here, the mark size of the mark 211 is 50 μm × 50 μm, and the detection field of view 211x in the alignment processing is 200 to 400 μm × 200 to 400 μm. In this case, as shown in FIG. 4, a 100 μm × 100 μm mark image 211y is acquired within the detection field of view 211x, which contains the center position of the mark 211 roughly estimated by pre-alignment. Pattern matching is then performed on the obtained mark image 211y. If the mark image 211y is sufficiently clear, the position of the mark 211 can be measured correctly; however, if the light-dark contrast of the mark image 211y is low or the image is distorted, the position of the mark 211 may not be measured correctly. FIGS. 5A and 5B are, respectively, a block diagram and a processing flowchart of the configuration for predicting a decrease in alignment accuracy in the substrate processing system 50. First, an image processing unit 300 performs image processing on the substrate 210 and acquires a mark image 301 (image data) (step 401). After the image processing unit 300 determines from the mark image 301 that the positioning error of the substrate 210 is within the tolerance and the alignment has succeeded, it sends an execution command for exposure processing 310 to the exposure apparatus 10. At the same time as the execution command for the exposure processing 310 is sent, the mark image 301, with context data 320 from the exposure apparatus 10 attached, is handed over to an image classification unit 400 (step 402). Here, the context includes information identifying the configuration, such as the machine model, machine number, hardware configuration, software configuration, and installation line of the exposure apparatus 10; it also includes information such as the lot, the substrate 210, the original 170, the recipe, the environmental conditions, and the processing date and time. The mark images 301 handed over to the image classification unit 400 may include any mark image 301 for which the image processing unit 300 determined that alignment succeeded or failed, but this is not restrictive: to improve throughput, the mark images 301 handed over may include only those determined by the image processing unit 300 to have aligned successfully. The number of marks 211 measured in step 401 may be one or more, and the number of mark images 301 handed over to the image classification unit 400 may likewise be one or more. The handover of the mark images 301 to the image classification unit 400 may be performed sequentially each time the image processing of one mark 211 finishes, or, alternatively, collectively after the image processing of all the marks 211 on the substrate 210 has finished. The data of the mark image 301 handed over to the image classification unit 400 may include, in addition to the image signal of the mark 211, feature data such as the light quantity of the light source 120 illuminating the mark 211. Next, the image classification unit 400 classifies the received mark image 301 into one of a plurality of preset classes related to image evaluation (step 403). As a concrete method of classifying the mark images 301 in the substrate processing system 50, machine learning is used as described below. One prediction method that uses machine learning is supervised learning, in which learning data are created and learned. Supervised learning requires learning data (supervision data) consisting of input data and output data that represent the correct answers corresponding to the input data.
In the substrate processing system 50, a learning model is used that was obtained in the image classification unit 400 by machine learning using, as learning data 305, a plurality of mark images 301 to which the class numbers of their classifications have been input. Machine learning can be performed here using, for example, a neural network, that is, a model with a multilayer network structure consisting of an input layer, intermediate layers, and an output layer. A learning model can be obtained by using learning data expressing the relationship between input data and output data and optimizing the internal variables of the network with an algorithm such as backpropagation. Although an example of obtaining the learning model with a neural network is described here, the method is not limited to neural networks; other models and algorithms, such as support vector machines and decision trees, may also be used. The image classification unit 400 then inputs a mark image 301 into the obtained learning model and outputs, as output data, classification information 302 containing the class number corresponding to the mark image 301. Next, the concrete creation of the learning data in the substrate processing system 50 according to the present embodiment is described. First, using the results of alignment processing previously performed on substrates 210, learning data 305 are created by taking the mark images 301 as input data and the classification information 302 corresponding to the classification into each class number as output data. For example, the class numbers 0 to 3 shown in Table 1 below can be set as the classification information 302.

[Table 1]
  Class number | Classification
  0            | Normal
  1            | Defocus
  2            | Low contrast
  3            | Mark distortion

Specifically, class number 0 corresponds to the classification of normal mark images, class number 1 to mark images exhibiting defocus, class number 2 to mark images exhibiting low contrast, and class number 3 to mark images exhibiting mark distortion. These classes are one example; other classes may be set. Defocus here means that the contour of the mark image 301 is blurred and indistinct, causing the pattern matching or edge detection to shift. Low contrast here means that the contour of the mark image 301 has low clarity relative to its surroundings, causing the pattern matching or edge detection to shift. Mark distortion here means that the shape of the contour of the mark image 301 is distorted, causing the pattern matching or edge detection to shift. To create the classification information 302 as output data, the mark images 301 serving as input data are classified. Specifically, as a method of classification into defocus, for example, the thickness of the linear parts composing the mark image 301 is measured, and when the obtained thickness exceeds a predetermined threshold, the image is deemed defocused. As a method of classification into low contrast, for example, the brightness difference between the mark image 301 and its surroundings is measured, and when the obtained value is below a predetermined threshold, the image is deemed low contrast. As a method of classification into mark distortion, for example, multiple positions of predetermined parts of the mark image 301 (for example, its four corners) are measured, and when the differences between them exceed a predetermined threshold, the image is deemed distorted. By implementing these determinations in the image classification unit 400, the mark images 301 can be classified into class numbers 0 to 3 and the classification information 302 obtained. In this way, even when alignment processing succeeded for a mark image 301, the classification can reveal mismeasurement of the position of a mark 211 for which sufficient alignment accuracy would not be achieved in the exposure processing. When a mark image 301 matches several of the above classes (defocus, low contrast, mark distortion, and so on) in combination, it may be classified into the class it matches most strongly; alternatively, it may be classified with weights according to the degree to which it matches each class. The above classes may also be further subdivided according to the attached context. In the manner described above, learning data 305 can be created by taking the mark images 301 as input data and the classification information 302 corresponding to the classification into each class number as output data, and inference logic can be created by training on the plurality of mark images 301 to which class numbers have been attached. In the above, the classification of the mark images 301 for creating the learning data is executed by the image classification unit 400, but this is not restrictive: to create the learning data 305 needed to obtain the learning model, an operator can inspect the mark images 301 and input the class numbers. To raise the correct-answer rate of the classification information 302 output from the learning model, learning data 305 must be created for a large number of mark images 301. As shown in FIG. 5A, a display device 206 displays information needed to operate the exposure apparatus 10, information about the operation of the exposure apparatus 10, and so on. FIG. 6 is a diagram illustrating a screen 900 displayed on the display device 206. On an input device 205, the operator inputs information needed to operate the exposure apparatus 10, information needed to make the display device 206 display screens, and so on. By making the display device 206 display the information needed for inputting the class numbers of the classification, the operator can input the class number information for the classification through the input device 205. A CPU (not shown) executes the processing of a display unit 800 that makes the display device 206 display information and an input unit 810 that makes the input device 205 input information, as well as the processing of a determination unit 820 that determines whether display on the display device 206 and enabling of input on the input device 205 are permitted. A storage device 204 stores uncreated data 801, to which no class number has been input, and created data 802, to which class numbers have been input. The uncreated data 801 are mark images 301 acquired in alignment processing and serve as material for creating the learning data.
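The three rule-based checks described above (line thickness for defocus, mark-to-surroundings brightness difference for low contrast, and the spread of corner positions for mark distortion) can be sketched as a simple classifier that returns the class numbers of Table 1. The measurement inputs and all threshold values below are illustrative assumptions, not values taken from the embodiment.

```python
def classify_mark_image(contrast, line_width, corner_spread,
                        contrast_min=0.3, width_max=5.0, spread_max=2.0):
    """Assign a Table 1 class number from simple measurements of a mark image.

    contrast      -- brightness difference between the mark and its surroundings
    line_width    -- measured thickness (pixels) of the mark's linear parts
    corner_spread -- largest positional difference (pixels) among the corners
    All thresholds are illustrative placeholders.
    Returns (class_number, label).
    """
    if line_width > width_max:
        return 1, "defocus"          # contour blurred beyond the width threshold
    if contrast < contrast_min:
        return 2, "low contrast"     # mark barely brighter than its surroundings
    if corner_spread > spread_max:
        return 3, "mark distortion"  # corners displaced from their nominal spots
    return 0, "normal"

# A sharp, high-contrast, undistorted mark image classifies as normal.
result = classify_mark_image(contrast=0.8, line_width=3.0, corner_spread=1.0)
```

A rule set like this is enough to label the rough classes of Table 1 and thus to bootstrap the supervised learning data, after which the learned model can take over the classification.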
The created data 802 are the uncreated data 801 with class numbers attached, and become the learning data 305 input to the image classification unit 400. The display device 206, the input device 205, and the storage device 204 may be provided in the exposure apparatus 10, but are not limited to this; they may be provided in an external information processing apparatus such as the host computer 11 or the management device 12. The display unit 800 and the input unit 810 can be realized by a software program executed in at least one of the main control unit 100 of the exposure apparatus 10, the management device 12, and the host computer 11, and the same applies to the determination unit 820 and the image classification unit 400. The display unit 800 makes the display device 206 display the information needed to create the learning data 305. FIG. 7 is a diagram illustrating a creation screen for the learning data 305. As shown in FIG. 7, a screen 910 displays information associated with the data contained in the uncreated data 801. For example, the screen 910 displays the mark image 301 of a mark 211 captured by the substrate alignment optical system 190 and related information 912 including the position and speed of the substrate stage 200 when the mark 211 was imaged. The related information 912 is not limited to the position and speed of the substrate stage 200 and may include any information useful to the operator in creating the learning data 305. For example, the related information 912 may include information about the substrate transport system at the time the substrate 210 was handed over to the substrate stage 200, the light quantity setting of the light source, and context such as the period of use of the light source. The related information 912 may also include context such as the machine model and machine number of the apparatus, its hardware and software configurations, its installation line, the substrate being processed, the lot containing that substrate, the original used in processing, the processing recipe, the environmental conditions, and the processing date and time. The screen 910 also displays classification information 913 consisting of options indicating the class numbers and a selection state indicating whether a class number has been selected. The selection state of the classification information 913 can be set or cleared using the input device 205. When a confirm button 914 is pressed with one of the class-number options selected, the information of the selected class number is input for the displayed uncreated data 801; when a cancel button 915 is pressed, the creation of the learning data 305 is aborted. The display unit 800 may also display multiple items of uncreated data 801, multiple items of related information 912, and multiple items of classification information 913 on the screen 910 so that class numbers can be input for multiple items of uncreated data 801. The input unit 810 acquires the class number information input from the input device 205, associates it with the uncreated data 801 stored in the storage device 204, and stores the result in the storage device 204 as created data 802. The determination unit 820 determines, according to predetermined conditions, the start and end of the creation of the learning data 305. That is, the determination unit 820 makes the display unit 800 display the information for inputting class numbers on the display device 206 and determines whether to start the processing of having the input unit 810 input the class number information; it also determines whether to end the display of that information by the display unit 800 and the input of class number information by the input unit 810. When the created data 802 reach a predetermined count, the image classification unit 400 may additionally learn the created data 802 as learning data 305 and delete the created data 802 from the storage device 204. The storage device 204 may store the counts of the uncreated data 801 and the created data 802, which are updated by the input unit 810 and the image classification unit 400. The storage device 204 may also store the respective counts of already-learned data (data added to the learning data 305) and not-yet-learned data (data not added to the learning data 305) among the created data 802, and these counts may likewise be updated by the input unit 810 and the image classification unit 400. The display unit 800 may display these counts on the display device 206. Next, the processing for creating the learning data 305 is described. FIG. 8 is a flowchart showing the processing for creating the learning data 305. In S110, the determination unit 820 determines, based on the conditions for starting creation of the learning data 305, whether to start the creation. If the determination unit 820 determines not to start, the flow returns to S110 after a predetermined period, and the determination is made again. If the determination unit 820 determines to start, the flow proceeds to S111, and the creation of the learning data 305 begins. In S111, the class number information of the classification is attached, in the manner described above, to a mark image 301 contained in the uncreated data 801. In S112, the mark image 301 with the class number attached is deleted from the uncreated data 801 and added to the created data 802. In S113, the determination unit 820 determines, based on the conditions for ending the creation of the learning data 305, whether to end the creation. If the determination unit 820 determines not to end, the flow returns to S111, and the next item of uncreated data 801 is displayed on the display device 206. If the determination unit 820 determines to end, the display of the screen 910 is closed, and the processing for creating the learning data 305 ends.
The display unit 800 may also let the operator decide whether to display the uncreated data 801 on the display device 206. FIG. 9 is a diagram illustrating a button for displaying the creation screen for the learning data 305. A button 901 lets the operator decide whether to display on the display device 206 the screen for classifying the uncreated data 801. When the display unit 800 displays the button 901 on the screen 900 and the operator presses it, the screen for classifying the uncreated data 801 is displayed on the display device 206. The display unit 800 may also display, together with the button 901, a message 902 indicating the count of the uncreated data 801. By displaying the message 902, the operator can decide from the count of the uncreated data 801 whether to start creating the learning data 305. Furthermore, by providing the image classification unit 400 in a device external to the exposure apparatus 10, mark images 301 can be received from a plurality of exposure apparatuses 10. The image classification unit 400 may store all or some of the received mark images 301 up to a preset period or a preset number. It may also be possible to select an arbitrary class from a displayed list of classes, recall the stored mark images 301 classified into the selected class, and display them on screen; to total the number of mark images 301 contained in the selected class and display the result; and to separate the mark images 301 in the selected class by the attached context, total them, and display the result. The image classification unit 400 then hands the classification results of the mark images 301 to a prediction unit 420 as the classification information 302 (step 404). The classification information 302 is the classification result, output by the inference logic of the image classification unit 400, of each mark image 301 acquired in the image processing of the substrate 210. The classification information 302 may be handed to the prediction unit 420 sequentially each time the classification of one mark image 301 finishes, or collectively after the classification of all the mark images 301 has finished. The prediction unit 420 can be realized by a software program executed in at least one of the main control unit 100 of the exposure apparatus 10, the management device 12, and the host computer 11. Based on the classification information 302 received from the image classification unit 400, the prediction unit 420 calculates an evaluation value Ep with reference to evaluation coefficients 430 predetermined for each class number in the classification information 302 (step 405). An evaluation coefficient 430 is a coefficient expressing the degree to which each class in the classification information 302 contributes to a decrease in the alignment accuracy of the substrate 210, and the evaluation value Ep is a value expressing the degree of decrease in the alignment accuracy of the substrate 210 associated with the classified mark images 301. The evaluation coefficients 430 are electronic information referenced by the prediction unit 420; they can, for example, be preset by the operator and stored in a storage unit (not shown). In the above, for simplicity of explanation, the classification into classes was shown as class numbers 0 to 3 of Table 1, but the classes can also be subdivided further, as in class numbers 0 to 8 of Table 2 below.

[Table 2]
  Class number | Classification
  0            | Normal
  1            | Defocus - focus mismeasurement
  2            | Low contrast - process cause
  3            | Low contrast - scope vibration
  4            | Low contrast - air fluctuation inside the scope
  5            | Low contrast - scope flare
  6            | Mark distortion - process cause
  7            | Mark distortion - scope aberration
  8            | Mark distortion - uneven scope illumination

Specifically, class number 0 corresponds to the classification of normal mark images, and class number 1 to mark images exhibiting defocus accompanying focus mismeasurement. Class number 2 corresponds to mark images exhibiting low contrast due to process causes, class number 3 to low contrast accompanying vibration of the scope, class number 4 to low contrast accompanying air fluctuation inside the scope, and class number 5 to low contrast accompanying flare of the scope. Class number 6 corresponds to mark images exhibiting mark distortion due to process causes, class number 7 to mark distortion accompanying scope aberration, and class number 8 to mark distortion accompanying uneven illumination of the scope. These classes are one example; other classes may be set. The evaluation coefficients 430 can be set, for example, as shown in Table 3 below.

[Table 3]
  Class number | Classification                                  | Evaluation coefficient
  0            | Normal                                          | 0
  1            | Defocus - focus mismeasurement                  | 20
  2            | Low contrast - process cause                    | 15
  3            | Low contrast - scope vibration                  | 8
  4            | Low contrast - air fluctuation inside the scope | 5
  5            | Low contrast - scope flare                      | 8
  6            | Mark distortion - process cause                 | 25
  7            | Mark distortion - scope aberration              | 12
  8            | Mark distortion - uneven scope illumination     | 18

The values of the evaluation coefficients are not limited to those shown in Table 3; for example, values normalized so that the largest coefficient equals 1 may also be used. The prediction unit 420 then executes an action 303 based on the calculated evaluation value Ep (step 406). Actions 303 based on the evaluation value Ep include, for example, notifying the outside with a warning or interrupting the exposure processing 310 in the exposure apparatus 10 after determining that the evaluation value Ep exceeds a threshold. Interrupting the exposure processing 310 in the exposure apparatus 10 includes retrying the alignment processing of the substrate 210, removing that substrate 210, or halting the operation of the exposure apparatus 10 itself.
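The evaluation-value calculation of step 405 and the threshold action of step 406 can be sketched in a few lines using the coefficients of Table 3. The four-mark classification and the threshold value below are illustrative assumptions; the embodiment does not fix a concrete threshold.

```python
# Evaluation coefficients 430 per class number, as in Table 3.
COEFFICIENTS = {0: 0, 1: 20, 2: 15, 3: 8, 4: 5, 5: 8, 6: 25, 7: 12, 8: 18}

def evaluation_value(class_numbers):
    """Ep as the sum of the per-mark evaluation coefficients."""
    return sum(COEFFICIENTS[c] for c in class_numbers)

def action(ep, threshold=10):
    """Pick an action 303 from Ep; the threshold value is a placeholder."""
    return "interrupt exposure" if ep > threshold else "continue"

# Hypothetical wafer with four marks: three classified normal (0) and one as
# "low contrast - process cause" (2), giving Ep = 0 + 0 + 0 + 15.
ep = evaluation_value([0, 0, 0, 2])
decision = action(ep)
```

Because normal marks contribute a coefficient of 0, Ep grows only with degraded mark images, so a single badly degraded mark can already push the wafer over the interruption threshold.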
Next, a concrete example of the calculation of the evaluation value Ep by the prediction unit 420 is shown. Consider the case where a mark image 301 is acquired for each mark in the alignment processing of a substrate 210 having the four marks 211a to 211d shown in FIG. 3. Suppose the prediction unit 420 uses the following formula (1) as the calculation formula for the evaluation value Ep:

  Ep = Σi Ki   (1)

where Ki is each evaluation coefficient obtained from the classification of the marks 211a to 211d of the substrate 210. When a mark image 301 matches several classes in combination, weighting according to the degree of match may be applied and an average calculated. Suppose also that the prediction unit 420 decides to interrupt the exposure processing 310 in the exposure apparatus 10 when the calculated evaluation value Ep satisfies the following conditional expression (2):

  Ep > threshold   (2)

First, the mark images 301 of the four marks 211a to 211d are handed from the image processing unit 300 to the image classification unit 400 at predetermined times; a predetermined time here is, for example, each time the image processing of one mark 211 finishes. Next, suppose the image classification unit 400 classifies the mark images 301 of the four marks 211a to 211d according to the preset classes of Table 3, as shown in Table 4 below.

[Table 4]
  Mark image | Class number (classification)
  211a       | 0 (Normal)
  211b       | 0 (Normal)
  211c       | 0 (Normal)
  211d       | 2 (Low contrast - process cause)

The classification results shown in Table 4 are handed to the prediction unit 420 as the classification information 302. The prediction unit 420 then refers to the preset evaluation coefficients 430 of Table 3 and assigns an evaluation coefficient 430 to each mark image 301, as shown in Table 5 below.

[Table 5]
  Mark image | Class number (classification)    | Evaluation coefficient
  211a       | 0 (Normal)                       | 0
  211b       | 0 (Normal)                       | 0
  211c       | 0 (Normal)                       | 0
  211d       | 2 (Low contrast - process cause) | 15

The prediction unit 420 calculates the evaluation value Ep from the evaluation coefficients 430 assigned as in Table 5. Here, the evaluation value Ep is calculated as the sum of the evaluation coefficients 430 of the mark images 301, as in the following formula (3):

  Ep = 0 + 0 + 0 + 15 = 15   (3)

The prediction unit 420 determines that the calculated evaluation value Ep = 15 exceeds the predetermined threshold so as to satisfy conditional expression (2), and executes the action of interrupting the exposure processing 310 for the exposure apparatus 10. As described above, the determination device according to the present embodiment can determine the alignment accuracy with high accuracy by classifying the mark images 301 into the classes and calculating the evaluation value Ep. As noted above, in the substrate processing system 50, the image processing and the image classification can also be interpreted together as the alignment processing. This is not restrictive, however: to improve throughput, only the mark images 301 determined by the image processing unit 300 to have aligned successfully may be handed to the image classification unit 400 for classification, in which case the image processing and the image classification can be interpreted as mutually independent processes. The substrate processing system 50 was described with an example of classifying the mark images 301 using machine learning, but this is not restrictive: when classifying the mark images 301 into the rough classes of defocus, low contrast, and mark distortion shown in Table 1, the classification methods described above can be used instead of machine learning. Likewise, an example was shown of classifying each mark image 301 individually into the classes of Tables 1 and 2, but the mark images may instead be classified as a group of mark images 301 based on correlations, such as the relative positions and relative angles, among the mark images 301 of the plurality of marks 211 shown in FIG. 3.

[Second Embodiment] FIG. 10 is a block diagram showing the configuration for predicting a decrease in alignment accuracy in a substrate processing system 50 including the determination device according to the second embodiment. The determination device according to the present embodiment is the same as that of the first embodiment except that it newly includes a decision unit 450, so the same reference numerals are attached to the same components and their description is omitted. The decision unit 450 is a software program for setting the evaluation coefficients 430 and can be realized by a setting unit (not shown). Specifically, the decision unit 450 first collects classification information 302 including the results of past classifications of the mark images 301 of substrates 210 by the image classification unit 400. The decision unit 450 also collects measurement information 441 including the results of alignment accuracy measurements performed on the substrates 210 by an external measuring instrument 440. The decision unit 450 then compares the obtained classification information 302 and measurement information 441. The relationship between the classification of the mark images 301 by the image classification unit 400 and the alignment accuracy measured by the external measuring instrument 440 is thereby expressed as coefficients, and the evaluation coefficients 430 can be determined. The decision unit 450 can then set the evaluation coefficients 430 in the prediction unit 420. The conversion of the relationship between the classification of the mark images 301 and the alignment accuracy into coefficients may be set by an operator entering values on a GUI screen based on experience, or automatically by separately provided inference logic obtained by machine learning. The evaluation coefficients 430 can be displayed for confirmation, together with the classification classes, as a list on a GUI screen on the console of the exposure apparatus 10 or on a display device connected to the device that has the image classification unit 400. As described above, the determination device according to the present embodiment can determine the alignment accuracy with high accuracy by classifying the mark images 301 into the classes, calculating the evaluation value Ep, and setting the evaluation coefficients 430.

[Article Manufacturing Method]
The article manufacturing method according to the present embodiment is suitable, for example, for manufacturing articles such as devices (semiconductor elements, magnetic storage media, liquid crystal display elements, and the like). The article manufacturing method according to the present embodiment includes a step of exposing a substrate coated with a photosensitive agent using the exposure apparatus 10 (forming a pattern on the substrate) and a step of developing the exposed substrate using a developing apparatus (not shown) (processing the substrate). The manufacturing method according to the present embodiment may also include other well-known steps (oxidation, film formation, vapor deposition, doping, planarization, etching, resist stripping, dicing, bonding, packaging, and the like). The article manufacturing method according to the present embodiment is advantageous over conventional methods in at least one of the performance, quality, productivity, and production cost of the article. Preferred embodiments have been described above, but the invention is of course not limited to these embodiments, and various modifications and changes are possible within the scope of its gist. An exposure apparatus was described as an example of the substrate processing apparatus 10, but the apparatus is not limited to this. For example, the substrate processing apparatus 10 may be an imprint apparatus that forms a pattern of imprint material on a substrate using a mold, or a drawing apparatus that forms a pattern on a substrate by drawing on it with a charged particle beam (an electron beam, an ion beam, or the like) through a charged particle optical system. The substrate processing apparatus 10 may also include manufacturing apparatuses that perform steps other than those performed by the apparatuses described above in the manufacture of articles such as devices, for example a coating apparatus that applies a photosensitive medium to the surface of a substrate and a developing apparatus that develops a substrate on which a pattern has been formed. Methods for implementing the embodiments described above, programs, and computer-readable recording media on which such programs are recorded are also included in the scope of the present embodiment. According to the present invention, it is possible to provide a determination device capable of determining the alignment accuracy of a substrate, a substrate processing apparatus, and a method of manufacturing an article. While the present invention has been described with reference to exemplary embodiments, it is to be understood that the invention is not limited to the disclosed exemplary embodiments. The scope of the following claims is to be accorded the broadest interpretation so as to encompass all such modifications and equivalent structures and functions.
Furthermore, each semiconductor manufacturing line 1 includes a plurality of substrate processing apparatuses 10 (semiconductor manufacturing apparatuses) that process substrates, and a host computer 11 (host control apparatus) that controls operations of the plurality of substrate processing apparatuses 10 . Examples of the substrate processing apparatus 10 include photolithography apparatuses (exposure apparatuses, imprint apparatuses, charged particle beam drawing apparatuses, etc.), film forming apparatuses (CVD apparatuses, etc.), processing apparatuses (laser processing apparatuses, etc.), Inspection device (overlay inspection device, etc.). In addition, the substrate processing apparatus 10 may further include coating and development in which a resist material (adhesive material) is applied to the substrate as a pre-process of the photolithography process, and a development process is performed as a post-process of the photolithography process. Device (coater/developer). In addition, in the exposure device, the photoresist supplied to the substrate is exposed through the original plate (reticle, mask), and a latent pattern corresponding to the pattern of the original plate is formed on the photoresist on the substrate. picture. In the imprinting device, the imprint material supplied to the substrate is hardened while the original plate (mold, template) is in contact with the imprint material supplied to the substrate, thereby forming a pattern on the substrate. In the charged particle beam drawing device, a latent image is formed on the photoresist on the substrate by drawing a pattern on the photoresist supplied to the substrate using the charged particle beam. As shown in FIG. 1 , a plurality of substrate processing devices 10 installed in each semiconductor manufacturing line 1 are respectively connected to a management device 12 for management and maintenance. 
Thereby, the management device 12 can individually manage the plurality of substrate processing devices 10 installed in each semiconductor manufacturing line 1 . In addition, the management device 12 may function as a maintenance determination device that collects and analyzes operation information of each of the plurality of substrate processing devices 10 to detect an abnormality or a sign thereof that occurs in each substrate processing device 10 and determines whether Maintenance treatment (repair treatment) is required. In addition, in the substrate processing system 50 , the connection between the plurality of substrate processing devices 10 and the host computer 11 and the connection between the plurality of substrate processing devices 10 and the management device 12 may be either wired connection or wireless connection. Next, a specific example in which each substrate processing device 10 is configured as an exposure device in the substrate processing system 50 will be described. FIG. 2A is a block diagram showing the structure of the exposure device 10 provided in the substrate processing system 50 . In addition, FIG. 2B is a schematic diagram showing the structure of the substrate alignment optical system 190 provided in the exposure apparatus 10 . The exposure apparatus 10 is a photolithography apparatus used to manufacture devices such as semiconductor elements, liquid crystal display elements, and thin film magnetic heads as articles, and to form patterns on substrates. In addition, the exposure device 10 exposes the substrate in a step-and-scan method or a step-and-repeat method. As shown in FIG. 2A , the exposure device 10 includes a main control unit 100 , a light source control unit 110 , a light source 120 , an image processing unit 130 , a stage control unit 140 and an interferometer 150 . 
In addition, the exposure apparatus 10 has a master alignment optical system 160, a master mounting base 171, a projection optical system 180, a substrate alignment optical system 190, and a substrate mounting base 200. The master plate 171 holds and moves the master plate 170 illuminated by an illumination optical system (not shown). The pattern that should be transferred to the substrate 210 is drawn on the original plate 170 . The projection optical system 180 projects the pattern of the original plate 170 onto the substrate 210 . The substrate mounting table 200 can hold and move the substrate 210 . The master plate alignment optical system 160 is used for the alignment of the master plate 170 . For example, the original plate alignment optical system 160 may include an imaging element 161 composed of an accumulation-type photoelectric conversion element, and an optical system 162 that guides light from a mark provided on the original plate 170 to the imaging element 161 . The substrate alignment optical system 190 is used for alignment of the substrate 210 . In this embodiment, the substrate alignment optical system 190 is an off-axis optical system that detects the mark 211 provided on the substrate 210 . The main control unit 100 includes a CPU, a memory, etc., controls each part of the exposure device 10, and performs exposure processing for exposing the substrate 210 and processing related thereto. In the substrate processing system 50 , the main control unit 100 controls the position of the substrate mounting table 200 based on the position of the mark formed on the original plate 170 and the position of the mark 211 formed on the substrate 210 . In other words, the main control unit 100 performs position alignment between the original plate 170 and the substrate 210, such as global alignment. The light source 120 includes a halogen lamp or the like, and illuminates the mark 211 formed on the substrate 210 . 
The light source control unit 110 controls the illumination intensity of the light from the light source 120, that is, the light used to illuminate the mark 211. The image processing unit 130 performs image processing on the image signals (detection signals) from the imaging element 161 in the original plate alignment optical system 160 and the imaging elements in the substrate alignment optical system 190, and obtains the position of the mark. In the substrate processing system 50, the image processing unit 130 and the substrate alignment optical system 190 function as a measurement device that measures the position of the mark 211 formed on the substrate 210. The interferometer 150 measures the position of the substrate mounting table 200 by irradiating light onto the reflecting mirror 212 provided on the substrate mounting table 200 and detecting the light reflected by the reflecting mirror 212. The stage control unit 140 moves the substrate mounting table 200 to an arbitrary position (drive control) based on the position of the substrate mounting table 200 measured by the interferometer 150. In the exposure device 10, light (exposure light) from an illumination optical system (not shown) passes through the original plate 170 held on the original plate mounting table 171 and enters the projection optical system 180. Furthermore, since the original plate 170 and the substrate 210 are arranged in an optically conjugate positional relationship with each other, the pattern of the original plate 170 is imaged on, and transferred to, the substrate 210 held on the substrate mounting table 200 via the projection optical system 180. The substrate alignment optical system 190 functions as a detection unit that detects the mark 211 formed on the substrate 210 and generates a detection signal (an image signal in this embodiment). As shown in FIG.
2B , the substrate alignment optical system 190 includes imaging elements 191A and 191B, imaging optical systems 192A and 192B, and a half mirror 193 . In addition, the substrate alignment optical system 190 includes an illumination optical system 194, a polarization beam splitter 195, a relay lens 196, a λ/4 plate 197, and an objective lens 198. In the exposure apparatus 10, light from the light source 120 is guided to the substrate alignment optical system 190 via an optical fiber (not shown) or the like. Then, the light guided to the substrate alignment optical system 190 is incident on the polarization beam splitter 195 via the illumination optical system 194 as shown in FIG. 2B. Then, the light reflected by the polarization beam splitter 195 passes through the relay lens 196, the λ/4 plate 197, and the objective lens 198 to illuminate the mark 211 formed on the substrate 210. The light reflected by the mark 211 passes through the objective lens 198, the λ/4 plate 197, the relay lens 196, and the polarization beam splitter 195, and enters the half mirror 193. Then, the light incident on the half-mirror 193 is split into two lights with an appropriate intensity ratio by the half-mirror 193, and is then guided to imaging optical systems 192A and 192B having mutually different imaging magnifications. The imaging optical systems 192A and 192B form images of the mark 211 on the imaging surfaces of the imaging elements 191A and 191B, respectively. The imaging elements 191A and 191B each include an imaging surface for imaging an area including the mark 211 , and generate an image signal corresponding to the area captured on the imaging surface. Then, the image signals generated by the imaging elements 191A and 191B are read out by the image processing unit 130 . 
In this embodiment, the image processing unit 130 obtains the position information of the mark 211 in the imaging plane of the imaging elements 191A and 191B by performing pattern matching processing as image processing on the read image signal. Pattern matching processing is generally divided into the following two types. One method is to binarize an image (grayscale image), match it with a template prepared in advance, and use the most relevant position as the position of the mark 211 . Another method is to obtain the position of the mark 211 by maintaining the grayscale image as it is and performing a correlation operation with a template including grayscale information. In addition, the image processing performed by the image processing unit 130 is not limited to pattern matching processing, and may be other processing such as edge detection processing as long as the position information of the mark 211 can be obtained. In addition, as alignment methods, there are movement measurement methods and image processing methods. In the moving measurement method, the mark 211 provided on the substrate 210 is irradiated with light (laser) while the substrate mounting table 200 is moved. Then, the position of the mark 211 is obtained by measuring the change in intensity of the light reflected from the mark 211 and the position of the substrate mounting table 200 in parallel. In the image processing method, the mark 211 provided on the substrate 210 is irradiated with white light while the substrate mounting table 200 is kept stationary. Then, the position of the mark 211 is obtained by detecting the light reflected from the mark 211 using an accumulation-type photoelectric conversion element and performing image processing. In the exposure device 10, two types of alignment, pre-alignment and fine alignment, are performed using the obtained position information of the mark 211. 
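The second, grayscale pattern-matching approach described above can be sketched as a normalized cross-correlation search over the image. The function below is an illustrative implementation, not the apparatus's actual code: it slides a grayscale template over the image and returns the offset with the highest correlation as the measured mark position.

```python
import numpy as np

def match_template_ncc(image, template):
    """Slide the template over the image and return the (row, col) offset
    with the best normalized cross-correlation score, i.e. the estimated
    mark position, together with that score."""
    ih, iw = image.shape
    th, tw = template.shape
    t = template - template.mean()
    t_norm = np.sqrt((t * t).sum())
    best_score, best_pos = -1.0, (0, 0)
    for r in range(ih - th + 1):
        for c in range(iw - tw + 1):
            w = image[r:r + th, c:c + tw]
            wz = w - w.mean()
            denom = np.sqrt((wz * wz).sum()) * t_norm
            if denom == 0:
                continue  # flat window: correlation undefined, skip
            score = (wz * t).sum() / denom
            if score > best_score:
                best_score, best_pos = score, (r, c)
    return best_pos, best_score
```

A production implementation would use an FFT-based or library routine rather than this brute-force loop; the sketch only shows how "the most relevant position" is taken as the mark position.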
The pre-alignment referred to here means detecting the positional deviation of the substrate 210 delivered to the substrate mounting table 200 from the substrate transport system (not shown), and roughly aligning (positioning) the substrate 210 so that fine alignment can be started. In addition, the fine alignment referred to here means measuring the position of the substrate 210 held by the substrate mounting table 200 with high precision, and precisely aligning (positioning) the substrate 210 so that the positional alignment error of the substrate 210 falls within the allowable range. Specifically, when the fine alignment processing of the substrate 210 is performed by image processing, for example, the four marks 211a to 211d on the substrate 210 shown in FIG. 3 are photographed respectively. Then, measurement is performed by calculating the position of the substrate 210 based on the obtained position information. In the fine alignment process, there are cases where the mark 211 cannot be detected. Even if the mark 211 can be detected, the image processing may fail to obtain its position for some reason. For example, the mark 211 may not be clearly visible due to the influence of the processing steps applied to the substrate 210, or due to the influence of aberration of the substrate alignment optical system 190. It is also possible that the position of the mark 211 deviates from the field of view of the imaging surfaces of the imaging elements 191A and 191B. When a clear image of the mark 211 is obtained within the field of view of the imaging surface of the imaging element 191A or 191B, the position of the mark 211 can be accurately measured by image processing. However, when the contrast of the image is low, or when the image is distorted due to the influence of aberration, the position of the mark 211 may not be measured correctly.
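Calculating the position of the substrate from the measured mark positions is typically done by a least-squares fit of shift, rotation, and magnification. The sketch below is a generic illustration of such a global-alignment fit, not the patent's actual formula; the linearized model assumes small rotation and magnification.

```python
import numpy as np

def fit_global_alignment(design, measured):
    """Least-squares fit of shift (sx, sy), rotation theta (rad), and
    magnification m from design vs. measured mark positions, using the
    small-angle linearized model:
      mx = (1+m)*dx - theta*dy + sx
      my = theta*dx + (1+m)*dy + sy
    """
    A, b = [], []
    for (dx, dy), (mx, my) in zip(design, measured):
        A.append([dx, -dy, 1.0, 0.0])  # row for the x equation
        A.append([dy,  dx, 0.0, 1.0])  # row for the y equation
        b.extend([mx, my])
    p, *_ = np.linalg.lstsq(np.asarray(A), np.asarray(b), rcond=None)
    scale, theta, sx, sy = p           # scale = 1 + m
    return scale - 1.0, theta, sx, sy
```

With four marks the system is overdetermined (8 equations, 4 unknowns), which is what lets an erroneous single-mark measurement silently distort the fitted result, as discussed below.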
In addition, possible reasons why the mark 211 deviates from the field of view of the imaging surface of the imaging element 191A or 191B include a measurement error during pre-alignment and a positional shift during a transport process before the measurement. Another possible reason is that the transfer position of the mark 211 changes due to the processing steps applied to the substrate 210. If the measurement of the mark 211 fails, the positional alignment of the substrate 210 cannot be performed normally. Furthermore, when the positioning of the substrate 210 cannot be performed normally, maintenance processing (repair processing) for restoring normal positioning is performed. The maintenance processing includes, for example, changing which of the plurality of marks 211 are used, expanding the search range for the mark image, and changing the imaging conditions. If the alignment process fails for the substrate 210, sufficient alignment accuracy cannot be achieved even if the substrate 210 is subsequently exposed. In this case, an error is normally issued, processing of the substrate 210 is stopped, and operations for identifying and eliminating the cause of the failure are performed. On the other hand, if the alignment process for the substrate 210 succeeds, the exposure process for the substrate 210 is then performed. However, even if the alignment process succeeds, there is a possibility that sufficient alignment accuracy is not achieved in the exposure process. One possible cause is that the calculation result for the positional alignment of the substrate 210 becomes incorrect due to an erroneous measurement of the position of the mark 211.
For example, in a mark image obtained by photographing an area including the mark 211, an erroneous image signal may be generated due to the adhesion of dust or the influence of other conditions at the time of photographing, resulting in an erroneous measurement of the position of the mark 211. When an erroneous measurement of the position of the mark 211 occurs, an incorrect value is used in the calculation of the positional alignment of the substrate 210. Therefore, even if, as a result of the calculation, the positional alignment error of the substrate 210 converges within the allowable range and the alignment process succeeds, the alignment accuracy is reduced at the time of the exposure process. FIG. 4 shows an example of a mark image obtained by the alignment process for the mark 211 formed on the substrate 210. Here, it is assumed that the mark size of the mark 211 is 50 μm×50 μm, and the detection field of view 211x in the alignment process is 200 to 400 μm×200 to 400 μm. At this time, as shown in FIG. 4, in this embodiment, a 100 μm×100 μm mark image 211y is acquired within the detection field of view 211x, including the center position of the mark 211 roughly estimated by pre-alignment. Then, pattern matching processing is performed on the obtained mark image 211y. If the mark image 211y is sufficiently clear, the position of the mark 211 can be measured correctly. However, when the contrast between light and dark in the mark image 211y is low or the image is distorted, there is a possibility that the position of the mark 211 cannot be accurately measured. FIGS. 5A and 5B are, respectively, a block diagram and a process flow chart showing a structure for predicting a decrease in alignment accuracy in the substrate processing system 50. First, the image processing unit 300 performs image processing on the substrate 210 to obtain the mark image 301 (image data) (step 401).
Then, after the image processing unit 300 determines, based on the mark image 301, that the positional alignment error of the substrate 210 is within the allowable range and that the alignment has succeeded, an execution instruction for the exposure process 310 is sent to the exposure device 10. In addition, simultaneously with the transmission of the execution instruction for the exposure process 310, the mark image 301, with the context data 320 from the exposure device 10 appended, is handed over to the image classification unit 400 (step 402). Here, the context includes information that identifies the structure of the exposure device 10, such as its model, machine number, hardware configuration, software configuration, and installation line. The context also includes information that identifies the lot, the substrate 210, the original plate 170, the recipe, the environmental conditions, and the processing date and time. Here, the mark images 301 handed over to the image classification unit 400 include mark images 301 for which the image processing unit 300 determined the alignment to be either successful or failed, but the present invention is not limited to this. In order to increase the transmission throughput, the mark images 301 handed over to the image classification unit 400 may include only the mark images 301 for which the image processing unit 300 determined the alignment to be successful. In addition, the number of marks 211 measured in step 401 may be one or more, and the number of mark images 301 handed over to the image classification unit 400 may likewise be one or more. In addition, the transfer of the mark images 301 to the image classification unit 400 may be performed sequentially each time the image processing for one mark 211 is completed. The present invention is not limited to this; the transfer may instead be performed collectively after the image processing has been completed for all the marks 211 on the substrate 210.
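The hand-off of a mark image with its appended context data can be pictured as a simple record type. The field names below are illustrative assumptions, not the patent's actual data format; they merely bundle the image signal with the context items listed above.

```python
from dataclasses import dataclass, field
import datetime

@dataclass
class MarkImageRecord:
    """A mark image bundled with context data for the classification unit.
    All field names are hypothetical; the patent only specifies the kinds
    of context (model, machine number, lot, recipe, date and time, ...)."""
    pixels: list            # image signal read from the imaging element
    model: str              # device model
    machine_no: str         # machine number
    lot_id: str             # lot being processed
    substrate_id: str       # identifies the substrate 210
    recipe: str             # processing recipe
    processed_at: str = field(
        default_factory=lambda: datetime.datetime.now().isoformat())
```

Keeping the context attached to each image is what later allows the categories to be subdivided per context, as described further below in the embodiment.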
In addition, the data of the mark image 301 handed over to the image classification unit 400 may include, in addition to the image signal of the mark 211, feature quantity data such as the light amount of the light source 120 that illuminates the mark 211. Next, the image classification unit 400 classifies the received mark image 301 into one of a plurality of preset categories related to image evaluation (step 403). As a specific classification method for the mark image 301 in the substrate processing system 50, machine learning is used as follows. As a prediction method using machine learning, there is supervised learning, in which learning data are created and machine learning is performed on them. In supervised learning, it is necessary to create learning data (supervised data) consisting of input data and the output data that are the correct answers corresponding to the input data. In the substrate processing system 50, a learning model is used that is obtained by machine learning in the image classification unit 400 using, as the learning data 305, a plurality of mark images 301 to which classified category numbers have been input. Here, for example, a neural network can be used for the machine learning. A neural network is a model having a multi-layer network structure consisting of an input layer, intermediate layers, and an output layer. A learning model can be obtained by using learning data that represent the relationship between input data and output data, and optimizing the internal parameters of the network with an algorithm such as the error backpropagation method. Here, an example in which a neural network is used to obtain the learning model has been described, but the method is not limited to neural networks. For example, other models and algorithms, such as support vector machines and decision trees, can also be used.
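The supervised-learning step can be illustrated with a deliberately small stand-in for the neural network: a softmax regression trained by gradient descent on per-image feature vectors (e.g. contrast, line thickness, distortion). This is a sketch of the technique, not the patent's model; the feature choices and hyperparameters are assumptions.

```python
import numpy as np

def train_softmax_classifier(X, y, n_classes, lr=0.5, epochs=500):
    """Minimal supervised learner: softmax (multinomial logistic) regression
    trained by gradient descent on cross-entropy loss, standing in for the
    embodiment's neural-network learning model."""
    rng = np.random.default_rng(0)
    W = rng.normal(0, 0.01, (X.shape[1], n_classes))
    b = np.zeros(n_classes)
    Y = np.eye(n_classes)[y]                      # one-hot targets
    for _ in range(epochs):
        z = X @ W + b
        z -= z.max(axis=1, keepdims=True)         # numerical stability
        p = np.exp(z); p /= p.sum(axis=1, keepdims=True)
        g = (p - Y) / len(X)                      # cross-entropy gradient
        W -= lr * X.T @ g
        b -= lr * g.sum(axis=0)
    return W, b

def predict(W, b, X):
    """Return the category number with the highest score for each input."""
    return np.argmax(X @ W + b, axis=1)
```

The gradient step here plays the role that error backpropagation plays for a multi-layer network: parameters are adjusted to reduce the mismatch between the model's output and the correct category numbers in the learning data.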
Then, by inputting the mark image 301 into the obtained learning model, the image classification unit 400 outputs, as output data, classification information 302 including the category number corresponding to the mark image 301. Next, the specific preparation of the learning data in the substrate processing system 50 according to this embodiment will be described. First, using the results of alignment processes previously performed on substrates 210, the learning data 305 are created with the mark image 301 as the input data and the classification information 302 corresponding to the classification for each category number as the output data. For example, as the classification information 302, category numbers 0 to 3 can be set as shown in Table 1 below.

[Table 1]
  Category number | Classification
  0               | Normal
  1               | Defocus
  2               | Low contrast
  3               | Mark distortion

Specifically, category number 0 corresponds to the classification of a normal mark image, and category number 1 corresponds to the classification of a mark image including defocus. In addition, category number 2 corresponds to the classification of a mark image with low contrast, and category number 3 corresponds to the classification of a mark image including mark distortion. The above categories are examples, and categories other than these may be set. The defocus referred to here means that the outline of the mark image 301 is blurred and unclear, causing an offset in the pattern matching process or the edge detection process. The low contrast referred to here means that the pattern matching process or the edge detection process is offset because the outline of the mark image 301 has low definition relative to its surroundings. The mark distortion referred to here means that the pattern matching process or the edge detection process is offset due to shape distortion of the outline of the mark image 301.
Then, in order to create the classification information 302 as output data, the mark images 301 serving as input data are classified. Specifically, as a method of classifying defocus, for example, the thickness of the linear portions constituting the mark image 301 is measured, and when the obtained thickness exceeds a predetermined threshold, the image is determined to be defocused. As a method of classifying low contrast, for example, the difference in brightness between the mark image 301 and its surroundings is measured, and when the obtained value is lower than a predetermined threshold, the image is determined to have low contrast. As a method of classifying mark distortion, for example, the positions of predetermined portions of the mark image 301 (for example, its four corners) are measured, and when the differences between them exceed a predetermined threshold, the image is determined to have mark distortion. Therefore, by performing the above determinations in the image classification unit 400, the mark images 301 can be classified into the category numbers 0 to 3, and the classification information 302 can be obtained. In this way, even when the alignment process for the mark image 301 succeeded, an erroneous measurement of the position of the mark 211 that prevented sufficient alignment accuracy during the exposure process can be clarified by the classification. In addition, when the mark image 301 matches several of the above categories, such as defocus, low contrast, and mark distortion, in combination, it can be classified into the category with the greatest degree of matching. The present invention is not limited to this; when the mark image 301 matches several of the above categories in combination, it may be weighted and classified according to its degree of matching with each category. In addition, the above categories can also be further subdivided according to the given context.
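The rule-based labeling just described can be sketched as a small function. The thresholds and the tie-breaking rule (classify into the category exceeded by the largest relative margin) are illustrative assumptions; the patent specifies only that each determination compares a measured quantity against a predetermined threshold.

```python
def classify_mark_image(line_thickness, contrast, corner_spread,
                        thickness_thresh=3.0, contrast_thresh=0.2,
                        spread_thresh=1.5):
    """Rule-based labeling for building learning data. Category numbers
    follow Table 1: 0 = normal, 1 = defocus, 2 = low contrast,
    3 = mark distortion. All threshold values are hypothetical."""
    # Ratio > 1.0 means the corresponding rule fires; when several fire,
    # pick the category with the greatest degree of matching.
    scores = {
        1: line_thickness / thickness_thresh,       # defocus: thick lines
        2: contrast_thresh / max(contrast, 1e-9),   # low contrast
        3: corner_spread / spread_thresh,           # mark distortion
    }
    worst = max(scores, key=scores.get)
    return worst if scores[worst] > 1.0 else 0
```

A weighted variant, as mentioned above, would return the whole `scores` dictionary instead of a single category number, so that the degree of matching with each category is preserved.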
In this way, the learning material 305 can be created by using the mark image 301 as the input data and using the classification information 302 corresponding to the classification numbered for each category as the output data. Then, by learning a plurality of labeled images 301 to which category numbers are attached, inference logic can be created. Furthermore, in the above, the classification of the marked image 301 for creating learning materials is performed by the image classification unit 400, but it is not limited to this. For example, in order to create the learning materials 305 necessary to obtain a learning model, the operator can check a plurality of marked images 301 and input a category number. In addition, in order to increase the correct answer rate of the classification information 302 output from the learning model, it is necessary to create learning materials 305 for a large number of labeled images 301 . As shown in FIG. 5A , the display device 206 displays information necessary for operating the exposure device 10 , information related to the operation of the exposure device 10 , and the like. FIG. 6 is a diagram illustrating a screen 900 displayed on the display device 206 . In addition, in the input device 205, the operator inputs information required to operate the exposure device 10, information required to cause the display device 206 to display a screen, and the like. Furthermore, by causing the display device 206 to display the information necessary for inputting the category number for classification, the operator can input the information for the category number for classification via the input device 205 . In addition, the CPU (not shown) executes processing by the display unit 800 that causes the display device 206 to display information, and the input unit 810 that causes the input device 205 to input information. 
In addition, the CPU (not shown) executes processing by the determination unit 820, which determines whether display on the display device 206 and input on the input device 205 are valid. In addition, the storage device 204 stores unproduced data 801 for which a category number has not yet been input and created data 802 for which a category number has been input. The unproduced data 801 are the mark images 301 obtained in the alignment process, and are used to create the learning data. The created data 802 are the unproduced data 801 to which category numbers have been added, and become the learning data 305 input to the image classification unit 400. Here, the display device 206, the input device 205, and the storage device 204 may be provided in the exposure device 10, but are not limited thereto, and may also be provided in an external information processing device such as the host computer 11 or the management device 12. In addition, the display unit 800 and the input unit 810 can be realized by a software program executed in at least one of the main control unit 100 of the exposure device 10, the management device 12, and the host computer 11. Likewise, the determination unit 820 and the image classification unit 400 can be implemented by a software program executed in at least one of the main control unit 100 of the exposure device 10, the management device 12, and the host computer 11. The display unit 800 causes the display device 206 to display the information required for creating the learning data 305. FIG. 7 is a diagram illustrating a creation screen for the learning data 305. As shown in FIG. 7, information related to the data included in the unproduced data 801 is displayed on the screen 910.
For example, the screen 910 displays a mark image 301 of the mark 211 photographed by the substrate alignment optical system 190 and related information 912 including the position and speed of the substrate mounting table 200 when the mark 211 is photographed. In addition, the related information 912 is not limited to the position and speed of the substrate mounting table 200 , and may include information useful to the operator in creating the learning materials 305 . For example, the related information 912 may include context such as information on the substrate transport system when the substrate 210 is transferred to the substrate mounting table 200, information on the light intensity setting of the light source, and the usage period of the light source. In addition, the related information 912 may include, for example, the device model, machine number, hardware structure, software structure, installation line of the device, substrate of the processing object, batch of substrates including the processing object, original board used in the processing, processing recipe, Context such as environmental conditions, processing date and time. In addition, on the screen 910, an option indicating the category number of the classification and classification information 913 of a selection state indicating whether the category number is selected are displayed. Regarding the selection status of the classification information 913, selection or non-selection can be input using the input device 205. When the OK button 914 is pressed with the option of a certain category number selected, the information of the selected category number is input to the displayed unproduced data 801 . In addition, when the stop button 915 is pressed, the creation of the learning materials 305 is stopped. 
In addition, the display unit 800 may also cause the screen 910 to display a plurality of unproduced materials 801, a plurality of related information 912, and a plurality of classification information 913, so that category numbers are input to the plurality of unproduced materials 801. The input unit 810 obtains the category number information input from the input device 205 . Then, the input unit 810 associates the category number information with the unproduced data 801 stored in the storage device 204 and stores it in the storage device 204 as the produced data 802 . The determination unit 820 determines the start and end of the production of the learning materials 305 based on predetermined conditions. That is, the determination unit 820 causes the display unit 800 to display the information for inputting the category number on the display device 206 and determines whether to start the process of causing the input unit 810 to input the information of the category number. In addition, the determination unit 820 determines whether to end the display of the information for inputting the category number by the display unit 800 on the display device 206 and the input of the information of the category number by the input unit 810 . When the number of created materials 802 reaches a predetermined number, the image classification unit 400 may additionally learn the created materials 802 as the learning materials 305 and delete the created materials 802 from the storage device 204 . In addition, in the storage device 204, the number of pieces of unproduced data 801 and produced data 802 can be stored, and the number of pieces of these data can be updated through the input unit 810 and the image classification unit 400. In addition, the storage device 204 can store the respective numbers of learned materials (materials added to the learning materials 305) and unlearned data (materials not added to the learning materials 305) in the created materials 802. 
Moreover, the number of pieces of these data can be updated through the input unit 810 and the image classification unit 400. In addition, the display unit 800 can display the number of pieces of these materials on the display device 206 . Next, the process of creating learning materials 305 will be described. FIG. 8 is a flowchart showing the process of creating learning materials 305. In S110, the determination unit 820 determines whether to start the production of the learning materials 305 based on the conditions for starting the production of the learning materials 305. When the determination unit 820 determines that the production of the learning materials 305 is not to be started, the process returns to S110 after a predetermined period has elapsed and determines again whether to start the production of the learning materials 305 . On the other hand, if the determination unit 820 determines that the creation of the learning materials 305 is to be started, the process proceeds to S111 to start the creation of the learning materials 305 . Then, in S111, the information of the category number of the classification is added to the mark image 301 included in the unproduced data 801 in the above manner. Then, in S112, the mark image 301 to which the category number is added is deleted from the unproduced data 801 and added to the created data 802. Then, in S113, the determination unit 820 determines whether to end the production of the learning materials 305 based on the conditions for ending the production of the learning materials 305. When the determination unit 820 determines that the creation of the learning material 305 is not completed, the process returns to S111 and the next unproduced material 801 is displayed on the display device 206 . 
On the other hand, when the determination unit 820 determines that the creation of the learning materials 305 is completed, the display of the screen 910 is ended, and the process of creating the learning materials 305 is ended. In addition, the display unit 800 may allow the operator to determine whether to display the uncreated data 801 on the display device 206 . FIG. 9 is an exemplary diagram showing a button for displaying the creation screen of the learning material 305. Button 901 is a button for causing the operator to determine whether to display a screen for classifying uncreated data 801 on the display device 206 . When the display unit 800 displays the button 901 on the screen 900 and the operator presses the button, a screen for classifying the unproduced material 801 is displayed on the display device 206 . In addition, the display unit 800 may display a message 902 indicating the number of unproduced data 801 together with the button 901 . By displaying message 902, the operator can determine whether to start production of learning materials 305 based on the number of unproduced materials 801. In addition, by providing the image classification unit 400 in a device provided outside the exposure device 10 , the mark images 301 can be received from a plurality of exposure devices 10 . In addition, the image classification unit 400 may store all or a part of the received mark images 301 until a preset period or an upper limit of the number of cases. Furthermore, an arbitrary category may be selected from a plurality of categories displayed in a list, and the mark image 301 classified and stored in the selected category may be called and displayed on the screen. In addition, the number of mark images 301 included in the selected category may be totaled and the result may be displayed. In addition, it is also possible to distinguish the mark images 301 included in the selected category using the given context, to sum them up, and to display the result. 
Then, the image classification unit 400 hands over the classification result of the labeled image 301 to the prediction unit 420 as classification information 302 (step 404). The classification information 302 is the classification result of each marked image 301 obtained in the image processing of the substrate 210 , which is derived by the inference logic of the image classification unit 400 . In addition, the time point at which the classification information 302 is transferred to the prediction unit 420 may be performed sequentially whenever the classification of one labeled image 301 is completed, or may be performed simultaneously after the classification of all labeled images 301 is completed. The prediction unit 420 can be implemented by a software program executed in at least one of the main control unit 100 of the exposure device 10 , the management device 12 , and the host computer 11 . Furthermore, the prediction unit 420 calculates the evaluation value Ep based on the classification information 302 received from the image classification unit 400 with reference to the evaluation coefficient 430 predetermined for each category number in the classification information 302 (step 405). The evaluation coefficient 430 is a coefficient indicating the degree to which each category classified in the classification information 302 contributes to the decrease in the alignment accuracy of the substrate 210 . In addition, the evaluation value Ep is a value indicating the degree of degradation in the alignment accuracy of the substrate 210 with respect to the classified mark image 301 . The evaluation coefficient 430 is electronic information that the prediction unit 420 refers to, and can be set in advance by an operator, for example, and can be stored in a storage unit (not shown). In addition, in the above description, in order to simplify the description, the category numbers 0 to 3 shown in Table 1 are shown as the classification into each category. 
However, the categories may be subdivided further, for example into the category numbers 0 to 8 shown in Table 2 below.

[Table 2]
Category number  Classification
0                Normal
1                Defocus - focus mismeasurement
2                Low contrast - process cause
3                Low contrast - scope vibration
4                Low contrast - air fluctuation within the scope
5                Low contrast - scope flare
6                Mark distortion - process cause
7                Mark distortion - scope aberration
8                Mark distortion - uneven scope illumination

Specifically, category number 0 corresponds to the class of normal mark images, and category number 1 to the class of mark images with defocus accompanying a focus mismeasurement. Category number 2 corresponds to the class of low-contrast mark images caused by the process, and category number 3 to the class of low-contrast mark images accompanying scope vibration. Category number 4 corresponds to the class of low-contrast mark images accompanying air fluctuation within the scope, and category number 5 to the class of low-contrast mark images accompanying scope flare. Category number 6 corresponds to the class of mark images with mark distortion caused by the process, and category number 7 to the class of mark images with mark distortion accompanying scope aberration. Category number 8 corresponds to the class of mark images with mark distortion accompanying uneven scope illumination. These categories are examples, and categories other than these may be set. Furthermore, the evaluation coefficient 430 can be set, for example, as shown in Table 3 below.
[Table 3]
Category number  Classification                                    Evaluation coefficient
0                Normal                                            0
1                Defocus - focus mismeasurement                    20
2                Low contrast - process cause                      15
3                Low contrast - scope vibration                    8
4                Low contrast - air fluctuation within the scope   5
5                Low contrast - scope flare                        8
6                Mark distortion - process cause                   25
7                Mark distortion - scope aberration                12
8                Mark distortion - uneven scope illumination       18

The values of the evaluation coefficients are not limited to those shown in Table 3. For example, values standardized so that the maximum of all evaluation coefficients becomes 1 may be used. Then, the prediction unit 420 executes an action 303 based on the calculated evaluation value Ep (step 406). Here, the action 303 based on the evaluation value Ep includes, for example, issuing a warning to the outside or interrupting the exposure process 310 in the exposure device 10 after determining that the evaluation value Ep exceeds a threshold value. Interrupting the exposure process 310 in the exposure device 10 includes retrying the alignment process for the substrate 210, unloading the substrate 210, or suspending the operation of the exposure device 10 itself.

Next, a specific example of the calculation of the evaluation value Ep by the prediction unit 420 is shown. Consider a case where a mark image 301 is acquired for each mark in the alignment process for the substrate 210 having the four marks 211a to 211d shown in FIG. 3. Here, it is assumed that the prediction unit 420 uses equation (1) as the calculation formula of the evaluation value Ep, where Ki is the evaluation coefficient obtained from the classification for each of the marks 211a to 211d on the substrate 210. When a mark image 301 matches a plurality of categories in combination, a weighted average may be calculated based on the degree of matching with each category.
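The weighted-average handling for a mark image that matches several categories can be sketched as below. The coefficient values follow Table 3, but the representation of the match scores is an illustrative assumption (e.g. softmax-like scores from the classifier), not the patent's stated format.

```python
# Per-category evaluation coefficients, following Table 3.
EVALUATION_COEFFICIENTS = {0: 0, 1: 20, 2: 15, 3: 8, 4: 5, 5: 8, 6: 25, 7: 12, 8: 18}

def composite_coefficient(match_scores, coeffs=EVALUATION_COEFFICIENTS):
    """Weighted average of category coefficients by degree of matching.

    match_scores: {category_number: degree_of_matching}. This layout is a
    hypothetical illustration of the composite-match case described above.
    """
    total = sum(match_scores.values())
    return sum(coeffs[c] * s for c, s in match_scores.items()) / total
```

For instance, an image matching category 2 (coefficient 15) with degree 0.75 and category 3 (coefficient 8) with degree 0.25 receives the averaged coefficient 13.25.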
In addition, it is assumed that the prediction unit 420 decides to interrupt the exposure process 310 in the exposure device 10 when the calculated evaluation value Ep satisfies conditional expression (2). First, the mark image 301 of each of the four marks 211a to 211d is handed over from the image processing unit 300 to the image classification unit 400 at a predetermined time point, for example, each time the image processing for one mark 211 is completed. Next, it is assumed that the mark images 301 of the four marks 211a to 211d are classified by the image classification unit 400, according to the preset categories shown in Table 3, as shown in Table 4 below.

[Table 4]
Mark image  Category number (classification)
211a        0 (normal)
211b        0 (normal)
211c        0 (normal)
211d        2 (low contrast - process cause)

The classification results shown in Table 4 are handed over to the prediction unit 420 as classification information 302. The prediction unit 420 then refers to the preset evaluation coefficients 430 shown in Table 3 and assigns an evaluation coefficient 430 to each mark image 301 as shown in Table 5 below.

[Table 5]
Mark image  Category number (classification)   Evaluation coefficient
211a        0 (normal)                         0
211b        0 (normal)                         0
211c        0 (normal)                         0
211d        2 (low contrast - process cause)   15

The prediction unit 420 then calculates the evaluation value Ep from the evaluation coefficients 430 assigned as shown in Table 5. Here, it is assumed that the evaluation value Ep is calculated as the sum of the evaluation coefficients 430 of the mark images 301, as shown in equation (3). The prediction unit 420 then determines that the calculated evaluation value Ep = 15 exceeds the predetermined threshold and thus satisfies conditional expression (2), and executes the action of interrupting the exposure process 310 of the exposure device 10.
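The worked example above (Tables 4 and 5, equation (3) as a plain sum, and conditional expression (2) as a threshold test) can be sketched end-to-end. The coefficient values and the classification results come from Tables 3 and 4; the threshold value is an assumption, since the patent does not state it.

```python
# Per-category evaluation coefficients, following Table 3.
EVALUATION_COEFFICIENTS = {0: 0, 1: 20, 2: 15, 3: 8, 4: 5, 5: 8, 6: 25, 7: 12, 8: 18}

# Table 4: category assigned to the mark image of each of marks 211a-211d.
classification_info = {"211a": 0, "211b": 0, "211c": 0, "211d": 2}

def evaluation_value(info, coeffs=EVALUATION_COEFFICIENTS):
    """Equation (3): Ep as the sum of the coefficients per mark image."""
    return sum(coeffs[c] for c in info.values())

def should_interrupt(ep, threshold=10):
    """Conditional expression (2); the threshold value 10 is a made-up example."""
    return ep > threshold
```

Running this reproduces the example: Ep = 0 + 0 + 0 + 15 = 15, which exceeds the assumed threshold, so the interrupt action would be taken.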
As described above, in the determination device according to this embodiment, the alignment accuracy can be determined with high accuracy by classifying the mark images 301 into categories and calculating the evaluation value Ep. As described above, in the substrate processing system 50, image processing and image classification can be collectively interpreted as alignment processing. However, the present invention is not limited to this. To increase throughput, only the mark images 301 for which alignment is determined to have succeeded by the image processing unit 300 may be transferred to the image classification unit 400 for classification; in that case, image processing and image classification can be interpreted as mutually independent processes. In addition, although the substrate processing system 50 was described with an example in which the mark images 301 are classified using machine learning, the classification is not limited to this. For example, when classifying the mark images 301 into the coarse categories of defocus, low contrast, and mark distortion shown in Table 1, the classification may be performed by the classification method shown above without using machine learning. Furthermore, although an example was shown in which each mark image 301 is classified into the categories shown in Tables 1 and 2, the invention is not limited to this. For example, the plurality of mark images 301 obtained from the plurality of marks 211 shown in FIG. 3 may be classified into groups based on correlations between them, such as their relative positions and relative angles.

[Second Embodiment]
FIG. 10 is a block diagram illustrating a structure for predicting a decrease in alignment accuracy in a substrate processing system 50 equipped with a determination device according to a second embodiment.
The determination device according to this embodiment has the same structure as that of the first embodiment except that it additionally includes the decision unit 450; the same structures are therefore assigned the same reference numerals and their descriptions are omitted. The decision unit 450 is a software program for setting the evaluation coefficient 430, and can be implemented by a setting unit (not shown). Specifically, the decision unit 450 first collects the classification information 302 including the results of past classification by the image classification unit 400 of the mark images 301 of the substrate 210. The decision unit 450 also collects the measurement information 441 including the measurement results of the alignment accuracy of the substrate 210 obtained by the external measuring device 440. The decision unit 450 then compares the obtained classification information 302 and measurement information 441. In this way, the relationship between the classification of the mark images 301 by the image classification unit 400 and the alignment accuracy measured by the external measuring device 440 is converted into coefficients, and the evaluation coefficient 430 can be determined. The decision unit 450 can then set the evaluation coefficient 430 in the prediction unit 420. The conversion of this relationship into coefficients may be set by the operator on a GUI screen based on experience, or may be set automatically by separately preparing inference logic obtained by machine learning. In addition, the evaluation coefficient 430 can be displayed for confirmation, together with the classified categories, as a list on a GUI screen on the console of the exposure device 10 or on a display device connected to the device having the image classification unit 400.
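One conceivable way to convert the relationship between past classification results and externally measured alignment accuracy into coefficients is an ordinary least-squares fit of per-category counts against the measured degradation. This is a sketch under that assumption; the patent does not specify least squares, and the data layout is illustrative.

```python
def fit_coefficients(category_counts, measured_degradation):
    """Least-squares fit: degradation ~ sum over k of coeff[k] * count[k].

    category_counts: one count vector per substrate (one entry per category).
    measured_degradation: the degradation measured by the external measuring
    device for each substrate. Solves the normal equations by Gaussian
    elimination without pivoting (toy code, assumes a well-posed system).
    """
    n = len(category_counts[0])
    # Build the normal equations: (A^T A) x = A^T b.
    ata = [[sum(row[i] * row[j] for row in category_counts) for j in range(n)]
           for i in range(n)]
    atb = [sum(row[i] * y for row, y in zip(category_counts, measured_degradation))
           for i in range(n)]
    # Forward elimination.
    for col in range(n):
        pivot = ata[col][col]
        for r in range(col + 1, n):
            f = ata[r][col] / pivot
            ata[r] = [a - f * b for a, b in zip(ata[r], ata[col])]
            atb[r] -= f * atb[col]
    # Back substitution.
    coeffs = [0.0] * n
    for i in reversed(range(n)):
        coeffs[i] = (atb[i] - sum(ata[i][j] * coeffs[j]
                                  for j in range(i + 1, n))) / ata[i][i]
    return coeffs
```

For example, if substrates with one category-1 defect degrade by 20, one category-2 defect by 15, and one of each by 35, the fit recovers the coefficients 20 and 15.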
As described above, in the determination device according to this embodiment, the alignment accuracy can be determined with high accuracy by classifying the mark images 301 into categories, calculating the evaluation value Ep, and setting the evaluation coefficient 430.

[Method for Manufacturing Articles]
The method for manufacturing articles according to the present embodiment is suitable for manufacturing articles such as devices (semiconductor elements, magnetic storage media, liquid crystal display elements, etc.). The manufacturing method includes a step of exposing a substrate coated with a photosensitive agent using the exposure device 10 (forming a pattern on the substrate) and a step of developing the exposed substrate using a developing device (not shown) (processing the substrate). The manufacturing method may further include other known steps (oxidation, film formation, vapor deposition, doping, planarization, etching, resist stripping, dicing, bonding, packaging, etc.). The method of manufacturing an article according to this embodiment is advantageous over conventional methods in at least one of the performance, quality, productivity, and production cost of the article.

Preferred embodiments have been described above, but the invention is of course not limited to these embodiments, and various modifications and changes can be made within the scope of its gist. Moreover, although an exposure apparatus was described as an example of the substrate processing apparatus 10, the apparatus is not limited to this. For example, the substrate processing apparatus 10 may be an imprint apparatus that forms a pattern of an imprint material on a substrate using a mold.
The substrate processing apparatus 10 may also be a drawing apparatus that forms a pattern on a substrate by drawing with a charged particle beam (an electron beam, an ion beam, etc.) via a charged particle optical system. Furthermore, the substrate processing apparatus 10 may be manufacturing equipment that performs, in the manufacture of devices and other articles, steps other than those performed by apparatuses such as the above exposure apparatus, for example a coating device that applies a photosensitive agent to the surface of a substrate or a developing device that develops a patterned substrate. In addition, a method and a program for implementing the above embodiments, and a computer-readable recording medium on which the program is recorded, are also included in the scope of the embodiments. According to the present invention, it is possible to provide a determination device, a substrate processing device, and an article manufacturing method capable of determining the alignment accuracy of a substrate. While the present invention has been described with reference to exemplary embodiments, it is to be understood that the invention is not limited to the disclosed exemplary embodiments. The scope of the following claims is to be accorded the broadest interpretation so as to encompass all such modifications and equivalent structures and functions.
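The determination flow that such a program would cause a computer to execute (classify each mark image, look up per-category coefficients, sum them into Ep, and judge the alignment accuracy) can be sketched end-to-end. The classifier stub and the threshold value here are illustrative assumptions standing in for the inference logic and the predetermined threshold.

```python
# Per-category evaluation coefficients, following Table 3.
EVALUATION_COEFFICIENTS = {0: 0, 1: 20, 2: 15, 3: 8, 4: 5, 5: 8, 6: 25, 7: 12, 8: 18}

def classify_stub(mark_image):
    # Stand-in for the learned classifier: a mark image with a small
    # intensity range is flagged as category 2 (low contrast), else normal.
    return 2 if max(mark_image) - min(mark_image) < 10 else 0

def determine_alignment(mark_images, threshold=10):
    """Classify, compute Ep, and judge against an assumed threshold."""
    categories = [classify_stub(img) for img in mark_images]
    ep = sum(EVALUATION_COEFFICIENTS[c] for c in categories)
    return {"categories": categories, "Ep": ep, "ok": ep <= threshold}
```

With one sharp image and two low-contrast images, Ep = 0 + 15 + 15 = 30, so the judgment would be that the alignment accuracy is degraded.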

1: semiconductor manufacturing line 10: substrate processing apparatus (exposure apparatus) 11: host computer 12: management device 50: substrate processing system 100: main control unit 110: light source control unit 120: light source 130: image processing unit 140: stage control unit 150: interferometer 160: original plate alignment optical system 161: image pickup element 162: optical system 170: original plate 171: original plate stage 180: projection optical system 190: substrate alignment optical system 191A, 191B: image pickup elements 192A, 192B: imaging optical systems 193: half mirror 194: illumination optical system 195: polarization beam splitter 196: relay lens 197: λ/4 plate 198: objective lens 200: substrate stage 204: memory device 205: input device 206: display device 210: substrate 211: mark 300: image processing unit 301: mark image 302: classification information 303: action 305: learning data 310: exposure process 320: context data 400: image classification unit 420: prediction unit 430: evaluation coefficient 440: external measuring device 441: measurement information 450: decision unit 800: display unit 801: uncreated data 802: created data 810: input unit 820: determination unit 900: screen 901: button 902: message 910: screen 912: related information 913: classification information 914: OK button 915: cancel button Ep: evaluation value

[FIG. 1] A block diagram showing the structure of the substrate processing system according to the first embodiment.
[FIG. 2A] A schematic diagram showing the structure of the exposure device included in the substrate processing system according to the first embodiment.
[FIG. 2B] A schematic diagram showing the structure of the exposure device included in the substrate processing system according to the first embodiment.
[FIG. 3] A schematic plan view of a substrate on which marks are formed.
[FIG. 4] A diagram showing an example of an image obtained by alignment processing for a mark.
[FIG. 5A] A diagram showing a structure for predicting a decrease in alignment accuracy in the substrate processing system according to the first embodiment.
[FIG. 5B] A diagram showing a structure for predicting a decrease in alignment accuracy in the substrate processing system according to the first embodiment.
[FIG. 6] A diagram exemplifying a screen displayed on the display device in the substrate processing system according to the first embodiment.
[FIG. 7] A diagram exemplifying a creation screen for learning data in the substrate processing system according to the first embodiment.
[FIG. 8] A flowchart showing the process of creating learning data in the substrate processing system according to the first embodiment.
[FIG. 9] A diagram exemplifying a button for displaying the learning data creation screen in the substrate processing system according to the first embodiment.
[FIG. 10] A block diagram showing a structure for predicting a decrease in alignment accuracy in the substrate processing system according to the second embodiment.

10: exposure device
204: memory device
205: input device
206: display device
300: image processing unit
301: mark image
302: classification information
303: action
305: learning data
310: exposure process
320: context data
400: image classification unit
420: prediction unit
430: evaluation coefficient
800: display unit
801: uncreated data
802: created data
810: input unit
820: determination unit

Claims (17)

1. A determination device characterized by: performing classification related to image evaluation on image data of marks on a substrate captured in a substrate processing apparatus; calculating, from an evaluation coefficient predetermined for each category into which the image data is classified, an evaluation value indicating the degree of degradation of alignment accuracy related to the image data; and determining the alignment accuracy of the substrate based on the evaluation value.

2. The determination device according to claim 1, wherein the determination device performs the classification on the image data using a learning model obtained by machine learning.

3. The determination device according to claim 1, wherein the determination device classifies the image data into any one of the categories of normal, defocus, low contrast, and mark distortion.

4. The determination device according to claim 1, wherein, when the evaluation value exceeds a predetermined threshold, the determination device executes an action of interrupting processing in the substrate processing apparatus.

5. The determination device according to claim 1, wherein the determination device sets the evaluation coefficient based on a measurement result of the alignment accuracy of the substrate obtained by an external measuring device.

6. The determination device according to claim 5, wherein the determination device sets the evaluation coefficient using a learning model obtained by machine learning.

7. The determination device according to claim 1, wherein the determination device performs the classification on the image data after attaching a context related to the substrate processing apparatus to the image data.

8. The determination device according to claim 1, wherein the determination device performs the classification only on the image data, among the image data, for which alignment processing has succeeded.

9. The determination device according to claim 1, wherein the substrate processing apparatus is an exposure apparatus that exposes the substrate so as to transfer a pattern formed on an original plate onto the substrate using exposure light.

10. A determination device characterized by: after attaching a context related to a substrate processing apparatus to image data of a mark on a substrate captured in the substrate processing apparatus, performing classification related to image evaluation on the image data, and determining the alignment accuracy of the substrate based on a result of the classification.

11. A substrate processing apparatus that processes a substrate, characterized by comprising the determination device according to any one of claims 1 to 10.

12. A method of manufacturing an article, characterized by comprising a step of processing a substrate using the substrate processing apparatus according to claim 11, and manufacturing an article from the processed substrate.

13. A substrate processing system characterized by comprising: the determination device according to any one of claims 1 to 10; a plurality of substrate processing apparatuses that process substrates; a host computer that controls the operation of the plurality of substrate processing apparatuses; and a management device that manages maintenance of the plurality of substrate processing apparatuses.

14. A method of determining alignment accuracy, characterized by comprising: a step of performing classification related to image evaluation on image data of marks on a substrate captured in a substrate processing apparatus; a step of calculating, from an evaluation coefficient predetermined for each category into which the image data is classified, an evaluation value indicating the degree of degradation of alignment accuracy related to the image data; and a step of determining the alignment accuracy of the substrate based on the evaluation value.

15. A method of determining alignment accuracy, characterized by comprising: a step of attaching a context related to a substrate processing apparatus to image data of a mark on a substrate captured in the substrate processing apparatus; a step of performing classification related to image evaluation on the image data; and a step of determining the alignment accuracy of the substrate from classification information obtained by performing the classification step.

16. A computer-readable recording medium on which is recorded a program for causing a computer to determine alignment accuracy, the program causing the computer to execute: a step of performing classification related to image evaluation on image data of marks on a substrate captured in a substrate processing apparatus; a step of calculating, from an evaluation coefficient predetermined for each category into which the image data is classified, an evaluation value indicating the degree of degradation of alignment accuracy related to the image data; and a step of determining the alignment accuracy of the substrate based on the evaluation value.

17. A computer-readable recording medium on which is recorded a program for causing a computer to determine alignment accuracy, the program causing the computer to execute: a step of attaching a context related to a substrate processing apparatus to image data of a mark on a substrate captured in the substrate processing apparatus; a step of performing classification related to image evaluation on the image data; and a step of determining the alignment accuracy of the substrate from classification information obtained by performing the classification step.
TW109123521A 2019-07-31 2020-07-13 Judgment device, substrate processing device and manufacturing method of article TWI836116B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2019-140683 2019-07-31
JP2019140683A JP7366626B2 (en) 2019-07-31 2019-07-31 judgment device

Publications (2)

Publication Number Publication Date
TW202107410A TW202107410A (en) 2021-02-16
TWI836116B true TWI836116B (en) 2024-03-21

Family

ID=74483347

Family Applications (1)

Application Number Title Priority Date Filing Date
TW109123521A TWI836116B (en) 2019-07-31 2020-07-13 Judgment device, substrate processing device and manufacturing method of article

Country Status (4)

Country Link
JP (1) JP7366626B2 (en)
KR (1) KR20210015656A (en)
CN (1) CN112309909B (en)
TW (1) TWI836116B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7373340B2 (en) * 2019-09-25 2023-11-02 キヤノン株式会社 judgment device
KR102560241B1 (en) 2022-11-14 2023-07-28 (주)오로스테크놀로지 System for centering position of overlay key based on deep learning and method thereof

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020168097A1 (en) * 2001-03-28 2002-11-14 Claus Neubauer System and method for recognizing markers on printed circuit boards
US7103210B2 (en) * 1999-03-09 2006-09-05 Canon Kabushiki Kaisha Position detection apparatus and exposure apparatus
TWI461853B (en) * 2006-01-30 2014-11-21 尼康股份有限公司 The image processing method and apparatus, the measurement apparatus and apparatus, the processing apparatus, the measuring apparatus and the exposure apparatus, the substrate processing system, and the computer-readable information recording medium
TWI637329B (en) * 2015-10-27 2018-10-01 日商斯庫林集團股份有限公司 Displacement detecting apparatus, displacement detecting method, and substrate processing apparatus

Family Cites Families (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2897330B2 (en) * 1990-04-06 1999-05-31 キヤノン株式会社 Mark detection device and exposure device
JPH10294267A (en) * 1997-04-22 1998-11-04 Sony Corp Equipment and method for detecting alignment mark
US6603882B2 (en) 2001-04-12 2003-08-05 Seho Oh Automatic template generation and searching method
JP2004006527A (en) * 2002-05-31 2004-01-08 Canon Inc Position detection device and position detection method, exposure device, and device manufacturing method and substrate
JP2005322721A (en) 2004-05-07 2005-11-17 Nikon Corp Information preserving method and information using method
CN106463434B (en) 2014-06-10 2020-12-22 Asml荷兰有限公司 Computational wafer inspection
CN105988305B (en) * 2015-02-28 2018-03-02 上海微电子装备(集团)股份有限公司 Wafer pre-alignment method
CN106933069B (en) * 2015-12-30 2018-07-20 上海微电子装备(集团)股份有限公司 A kind of wafer pre-alignment method
CN107168018B (en) 2016-02-29 2018-12-14 上海微电子装备(集团)股份有限公司 A kind of focusing alignment device and alignment methods
JP6623851B2 (en) 2016-03-08 2019-12-25 富士通株式会社 Learning method, information processing device and learning program
JP6979312B2 (en) * 2017-09-08 2021-12-08 株式会社ディスコ How to set the alignment pattern
CN109556509B (en) 2018-01-04 2020-07-03 奥特斯(中国)有限公司 Edge sharpness evaluation of alignment marks


Also Published As

Publication number Publication date
TW202107410A (en) 2021-02-16
KR20210015656A (en) 2021-02-10
JP7366626B2 (en) 2023-10-23
CN112309909A (en) 2021-02-02
JP2021026019A (en) 2021-02-22
CN112309909B (en) 2024-01-12
