
WO2024208534A1 - Method and system for context aware overlay adjustment - Google Patents

Method and system for context aware overlay adjustment

Info

Publication number
WO2024208534A1
Authority
WO
WIPO (PCT)
Prior art keywords
overlay
image
features
context area
medium
Prior art date
Application number
PCT/EP2024/056110
Other languages
French (fr)
Inventor
Jiyou Fu
Chenyu Zhang
Original Assignee
Asml Netherlands B.V.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Asml Netherlands B.V.
Publication of WO2024208534A1


Classifications

    • G PHYSICS
    • G03 PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03F PHOTOMECHANICAL PRODUCTION OF TEXTURED OR PATTERNED SURFACES, e.g. FOR PRINTING, FOR PROCESSING OF SEMICONDUCTOR DEVICES; MATERIALS THEREFOR; ORIGINALS THEREFOR; APPARATUS SPECIALLY ADAPTED THEREFOR
    • G03F7/00 Photomechanical, e.g. photolithographic, production of textured or patterned surfaces, e.g. printing surfaces; Materials therefor, e.g. comprising photoresists; Apparatus specially adapted therefor
    • G03F7/70 Microphotolithographic exposure; Apparatus therefor
    • G03F7/70483 Information management; Active and passive control; Testing; Wafer monitoring, e.g. pattern monitoring
    • G03F7/70605 Workpiece metrology
    • G03F7/70616 Monitoring the printed patterns
    • G03F7/70633 Overlay, i.e. relative alignment between patterns printed by separate exposures in different layers, or in the same layer in multiple exposures or stitching
    • G03F7/706835 Metrology information management or control
    • G03F7/706837 Data analysis, e.g. filtering, weighting, flyer removal, fingerprints or root cause analysis

Definitions

  • a lithographic apparatus is a machine that applies a desired pattern onto a target portion of a substrate.
  • the lithographic apparatus can be used, for example, in the manufacture of integrated circuits (ICs).
  • an IC chip in a smart phone can be as small as a person’s thumbnail, and may include over 2 billion transistors.
  • Making an IC is a complex and time-consuming process, with circuit components in different layers and including hundreds of individual steps. Errors in even one step have the potential to result in problems with the final IC and can cause device failure. High process yield and high wafer throughput can be impacted by the presence of defects. [0004] Metrology processes are used at various steps during a patterning process to monitor and/or control the process.
  • metrology processes are used to measure one or more characteristics of a substrate, such as a relative location (e.g., registration, overlay, alignment, etc.) or dimension (e.g., line width, critical dimension (CD), thickness, etc.) of features formed on the substrate during the patterning process or stochastic variation, such that, for example, the performance of the patterning process can be determined from the one or more characteristics. If the one or more characteristics are unacceptable (e.g., out of a predetermined range for the characteristic(s)), one or more variables of the patterning process may be designed or altered, e.g., based on the measurements of the one or more characteristics, such that substrates manufactured by the patterning process have an acceptable characteristic(s).
  • the techniques described herein relate to a method for determining overlay, the method including: obtaining a representation of an image having a plurality of features; determining a context area of the image that does not contain target features defined in a corresponding design layout; determining a characteristic of the context area; and determining an overlay of the plurality of features based on the characteristic.
  • the techniques described herein relate to a method for determining overlay, the method including: obtaining a representation of an image having a plurality of features; obtaining a first overlay of the plurality of features by executing a first process; obtaining a second overlay of the plurality of features, wherein the second overlay is determined based on a context area of the image that does not contain target features defined in a corresponding design layout; and adjusting parameters of the first process to determine an adjusted first overlay based on the second overlay.
  • a non-transitory computer readable medium having instructions that, when executed by a computer, cause the computer to execute a method of any of the above embodiments.
  • an apparatus includes a memory storing a set of instructions and a processor configured to execute the set of instructions to cause the apparatus to perform a method of any of the above embodiments.
  • Figure 1 is a schematic diagram illustrating an exemplary electron beam inspection (EBI) system, according to an embodiment.
  • Figure 2 is a schematic diagram of an exemplary electron beam tool, according to an embodiment.
  • Figure 3 depicts a schematic representation of holistic lithography, representing a cooperation between three technologies to optimize semiconductor manufacturing, according to an embodiment.
  • Figure 4 shows an example of image representations of features of a pattern with different overlays, context areas and unit cell portion of the context area of the images, consistent with various embodiments.
  • Figure 5 is a block diagram of an exemplary system for adjusting a first overlay based on a second overlay determined using a context area of an image representative of a pattern, consistent with various embodiments.
  • Figure 6 is a flow diagram of an exemplary method for determining a second overlay using a context area of an image representative of a pattern, consistent with various embodiments.
  • Figure 7 is a flow diagram of an exemplary method for adjusting a first overlay based on a second overlay determined using a context area of an image representative of a pattern, consistent with various embodiments.
  • Figure 8 shows examples of template matching process-based adjusting of a first overlay using the second overlay, consistent with various embodiments.
  • Figure 9 is a block diagram of an example computer system, according to an embodiment.
  • a lithographic apparatus is a machine that applies a desired pattern onto a target portion of a substrate. This process of transferring the desired pattern to the substrate is called a patterning process.
  • the patterning process can include a patterning step to transfer a pattern from a patterning device (such as a mask) to the substrate.
  • Various variations can potentially limit lithography implementation for semiconductor high volume manufacturing (HVM).
  • High resolution images of a substrate, such as images obtained using a scanning electron microscope (SEM), may be inspected for determining any defects in the patterning process. For example, the images may be inspected for determining an overlay between features of a pattern.
  • to determine the overlay, a position of the features may have to be determined. For example, to determine the overlay of a feature, such as a via (e.g., a hole), with another feature, such as a bar, the positions of both features are determined, and the overlay is then determined from those positions.
  • Conventional techniques employ various methods for determining a position of a feature in a pattern. For example, image processing algorithms such as segmentation, contour finding, and template matching (TM) may be employed to determine the positions of the features and, therefore, the overlay of the features.
  • the context area of the image is an area which is devoid of (e.g., does not have) any intended target features of a design layout, or is an area between adjacent features.
  • an image representation of the pattern may be classified into two types of regions: one region that contains the features of the pattern whose metrology value (CD, overlay values, EPE, or any metrology parameter) is the measurement of interest, and a second region, outside the features, which is the context area.
  • for example, if the image of the pattern includes two features, such as a via and a line, and the measurement of interest is the overlay between the via and the line, then the prescribed features are the via and the line, and the remaining portion of the image is the context area.
  • because the context area is complementary to the prescribed feature area, the context-based overlay should match the prescribed-feature-based overlay: Overlay (Prescribed features) = Overlay (Context area).
  • the context area is not an area in image focus.
  • the image is processed to extract the context area, and an overlay of the features is determined based on one or more characteristics (e.g., key performance indicators (KPIs)) of the context area (a minimal thresholding sketch for extracting the context area follows this list).
  • the characteristic of the context area may include a geometrical parameter such as at least one of a height, context y gap, centroid, major axis length, minor axis length, area size, bounding box, extent, or eccentricity of the context area.
  • an estimated overlay can be determined based on a predetermined correlation between the context area KPI and overlay, e.g., in the form of a KPI map, a lookup table, a model, or a function (a lookup-table sketch follows this list).
  • the model can be an empirical model, a machine learning model, etc.
  • the correlation may output an overlay value for a given KPI or characteristic.
  • the estimated overlay is then compared with an initial overlay, which is determined based on a non-context area of the image using any of a number of known methods (e.g., template matching process), and the initial overlay is then adjusted based on the comparison result.
  • the parameters of the template matching process may be adjusted to “fine tune” (e.g., adjust) the initial overlay to determine a revised overlay.
  • the template matching process may be given a search range (e.g., determined based on the comparison result) proximate to the region of the initial overlay in a similarity score map to determine the revised overlay (a search-range sketch follows this list).
  • EBI system 100 includes a main chamber 110, a load-lock chamber 120, an electron beam tool 140, and an equipment front end module (EFEM) 130. Electron beam tool 140 is located within main chamber 110.
  • the exemplary EBI system 100 may be a single or multi-beam system. While the description and drawings are directed to an electron beam, it is appreciated that the embodiments are not used to limit the present disclosure to specific charged particles.
  • EFEM 130 includes a first loading port 130a and a second loading port 130b. EFEM 130 may include additional loading port(s).
  • First loading port 130a and second loading port 130b receive wafer front opening unified pods (FOUPs) that contain wafers (e.g., semiconductor wafers or wafers made of other material(s)) or samples to be inspected (wafers and samples are collectively referred to as “wafers” hereafter).
  • One or more robot arms (not shown) in EFEM 130 transport the wafers to load-lock chamber 120.
  • Load-lock chamber 120 is connected to a load/lock vacuum pump system (not shown), which removes gas molecules in load-lock chamber 120 to reach a first pressure below the atmospheric pressure. After reaching the first pressure, one or more robot arms (not shown) transport the wafer from load-lock chamber 120 to main chamber 110.
  • Main chamber 110 is connected to a main chamber vacuum pump system (not shown), which removes gas molecules in main chamber 110 to reach a second pressure below the first pressure. After reaching the second pressure, the wafer is subject to inspection by electron beam tool 140.
  • electron beam tool 140 may comprise a single-beam inspection tool.
  • Controller 150 may be electronically connected to electron beam tool 140 and may be electronically connected to other components as well. Controller 150 may be a computer configured to execute various controls of EBI system 100. Controller 150 may also include processing circuitry configured to execute various signal and image processing functions. While controller 150 is shown in Figure 1 as being outside of the structure that includes main chamber 110, load-lock chamber 120, and EFEM 130, it is appreciated that controller 150 can be part of the structure.
  • FIG. 2 illustrates a schematic diagram of an exemplary imaging system 200 according to embodiments of the present disclosure.
  • Electron beam tool 140 of FIG. 2 may be configured for use in EBI system 100.
  • Electron beam tool 140 may be a single beam apparatus or a multi-beam apparatus.
  • electron beam tool 140 includes a motorized sample stage 201, and a wafer holder 202 supported by motorized sample stage 201 to hold a wafer 203 to be inspected.
  • Electron beam tool 140 further includes an objective lens assembly 204, an electron detector 206 (which includes electron sensor surfaces 206a and 206b), an objective aperture 208, a condenser lens 210, a beam limit aperture 212, a gun aperture 214, an anode 216, and a cathode 218.
  • Objective lens assembly 204 may include a modified swing objective retarding immersion lens (SORIL), which includes a pole piece 204a, a control electrode 204b, a deflector 204c, and an exciting coil 204d.
  • Electron beam tool 140 may additionally include an Energy Dispersive X-ray Spectrometer (EDS) detector (not shown) to characterize the materials on wafer 203.
  • a primary electron beam 220 is emitted from cathode 218 by applying a voltage between anode 216 and cathode 218.
  • Primary electron beam 220 passes through gun aperture 214 and beam limit aperture 212, both of which may determine the size of electron beam entering condenser lens 210, which resides below beam limit aperture 212.
  • Condenser lens 210 focuses primary electron beam 220 before the beam enters objective aperture 208 to set the size of the electron beam before entering objective lens assembly 204.
  • Deflector 204c deflects primary electron beam 220 to facilitate beam scanning on the wafer.
  • deflector 204c may be controlled to deflect primary electron beam 220 sequentially onto different locations of top surface of wafer 203 at different time points, to provide data for image reconstruction for different parts of wafer 203. Moreover, deflector 204c may also be controlled to deflect primary electron beam 220 onto different sides of wafer 203 at a particular location, at different time points, to provide data for stereo image reconstruction of the wafer structure at that location.
  • a secondary electron beam 222 may be emitted from the part of wafer 203 upon receiving primary electron beam 220. Secondary electron beam 222 may form a beam spot on sensor surfaces 206a and 206b of electron detector 206. Electron detector 206 may generate a signal (e.g., a voltage, a current, etc.) that represents an intensity of the beam spot, and provide the signal to an image processing system 250.
  • the intensity of secondary electron beam 222, and the resultant beam spot, may vary according to the external or internal structure of wafer 203.
  • primary electron beam 220 may be projected onto different locations of the top surface of the wafer or different sides of the wafer at a particular location, to generate secondary electron beams 222 (and the resultant beam spot) of different intensities. Therefore, by mapping the intensities of the beam spots with the locations of wafer 203, the processing system may reconstruct an image that reflects the internal or surface structures of wafer 203.
  • [0032] Imaging system 200 may be used for inspecting a wafer 203 on sample stage 201, and comprises an electron beam tool 140, as discussed above.
  • Imaging system 200 may also comprise an image processing system 250 that includes an image acquirer 260, storage 270, and controller 150.
  • Image acquirer 260 may comprise one or more processors.
  • image acquirer 260 may comprise a computer, server, mainframe host, terminals, personal computer, any kind of mobile computing devices, and the like, or a combination thereof.
  • Image acquirer 260 may connect with a detector 206 of electron beam tool 140 through a medium such as an electrical conductor, optical fiber cable, portable storage media, IR, Bluetooth, internet, wireless network, wireless radio, or a combination thereof.
  • Image acquirer 260 may receive a signal from detector 206 and may construct an image. Image acquirer 260 may thus acquire images of wafer 203.
  • Image acquirer 260 may also perform various post-processing functions, such as generating contours, superimposing indicators on an acquired image, and the like. Image acquirer 260 may be configured to perform adjustments of brightness and contrast, etc. of acquired images.
  • Storage 270 may be a storage medium such as a hard disk, cloud storage, random access memory (RAM), other types of computer readable memory, and the like. Storage 270 may be coupled with image acquirer 260 and may be used for saving scanned raw image data as original images, and post-processed images.
  • Image acquirer 260 and storage 270 may be connected to controller 150. In some embodiments, image acquirer 260, storage 270, and controller 150 may be integrated together as one control unit.
  • image acquirer 260 may acquire one or more images of a sample based on an imaging signal received from detector 206.
  • An imaging signal may correspond to a scanning operation for conducting charged particle imaging.
  • An acquired image may be a single image comprising a plurality of imaging areas.
  • the single image may be stored in storage 270.
  • the single image may be an original image that may be divided into a plurality of regions. Each of the regions may comprise one imaging area containing a feature of wafer 203.
  • Figure 3 depicts a schematic representation of holistic lithography, representing a cooperation between three technologies to optimize semiconductor manufacturing.
  • the patterning process in a lithographic apparatus LA is one of the most critical steps in the processing, which requires high accuracy of dimensioning and placement of structures on the substrate W (Figure 1).
  • the three systems in this example may be combined in a so-called “holistic” control environment as schematically depicted in Figure 3.
  • One of these systems is the lithographic apparatus LA which is (virtually) connected to a metrology apparatus (e.g., a metrology tool) MT (a second system), and to a computer system CL (a third system).
  • a “holistic” environment may be configured to optimize the cooperation between these three systems to enhance the overall process window and provide tight control loops to ensure that the patterning performed by the lithographic apparatus LA stays within a process window.
  • the process window defines a range of process parameters (e.g., dose, focus, overlay) within which a specific manufacturing process yields a defined result (e.g., a functional semiconductor device), typically the range within which the process parameters in the lithographic process or patterning process are allowed to vary.
  • the computer system CL may use (part of) the design layout to be patterned to predict which resolution enhancement techniques to use and to perform computational lithography simulations and calculations to determine which mask layout and lithographic apparatus settings achieve the largest overall process window of the patterning process (depicted in Figure 3 by the double arrow in the first scale SC1).
  • the resolution enhancement techniques are arranged to match the patterning possibilities of the lithographic apparatus LA.
  • the computer system CL may also be used to detect where within the process window the lithographic apparatus LA is currently operating (e.g., using input from the metrology tool MT) to predict whether defects may be present due to, for example, sub-optimal processing (depicted in Figure 3 by the arrow pointing “0” in the second scale SC2).
  • the metrology apparatus (tool) MT may provide input to the computer system CL to enable accurate simulations and predictions, and may provide feedback to the lithographic apparatus LA to identify possible drifts, e.g., in a calibration status of the lithographic apparatus LA (depicted in Figure 3 by the multiple arrows in the third scale SC3).
  • the pattern can have a number of first features and a number of second features as shown in the first image 401.
  • the first image 401 is a measured image of the pattern printed on the substrate, and may be obtained using a metrology tool such as a scanning electron microscope (SEM), an optical-based metrology tool, or a scatterometer.
  • the first image 401 may be a simulated image of the pattern, which may be simulated using any of a number of methods.
  • the first image 401 is representative of features with a negative overlay
  • a second image 411 is representative of features with zero overlay
  • a third image 421 is representative of features with a positive overlay.
  • the KPI 615 may be a height 509 of the context area 516.
  • a KPI may be calculated from the pixel intensity values.
  • the centroid can be calculated as an intensity-weighted centroid.
  • the context area can have pixels with varied levels (e.g., 1 to n levels) of intensity. For example, if the context area pixels have just one gray level, the KPIs are calculated without considering the intensity level. If the context area pixels have two or more gray levels (e.g., n > 1), some of the geometry parameters are calculated with the pixel intensity values as the weights (see the KPI sketch after this list).
  • the KPI evaluation component 515 may determine an estimated overlay of the features based on the KPI 615 of the context area 516.
  • an overlay adjustment component 520 obtains an initial overlay determined using a first process, and an estimated overlay determined using a context area of an image representative of a pattern.
  • an initial overlay 504 may be obtained from a template matching component 510 that facilitates determination of the initial overlay 504 (e.g., depicted as left branch of Figure 5) based on a non-context area of the image using a template matching process.
  • the template matching component 510 processes the non-context area, e.g., the features of the pattern in the image, to determine a similarity score that is representative of a similarity between the features 512 and 514 and the corresponding templates of the features 512 and 514, and determines the initial overlay 504 based on the similarity score.
  • the similarity score map 802 illustrates the initial overlay 504 determined based on the similarity score determined using the template matching process.
  • the overlay adjustment component 520 may obtain the estimated overlay determined using the context area of the image of the pattern from the KPI evaluation component 515 (e.g., as described at least with reference to Figure 6 above).
  • the overlay adjustment component 520 may obtain the estimated overlay 511 determined using the context area 516 of the image 503 of the pattern. [0053] At process P710, the overlay adjustment component 520 compares the initial overlay with the estimated overlay and generates a comparison result 710. For example, the overlay adjustment component 520 compares the initial overlay 504 with the estimated overlay 511 and generates a difference between the two overlays as the comparison result 710.
  • the initial overlay 504 is x units and the estimated overlay is y units, where y ≠ x.
  • the similarity score map 802 shows the estimated overlay 511 relative to the initial overlay 504.
  • the overlay adjustment component 520 determines whether the comparison result satisfies a criterion. For example, the overlay adjustment component 520 determines whether a difference between the initial overlay 504 and the estimated overlay 511 (e.g., x - y) is within a specified threshold. If the difference is within the specified threshold, the overlay adjustment component 520 determines that the comparison result satisfies the criterion and outputs the initial overlay 504 as an overlay 715 of the features 512 and 514. In some embodiments, the overlay adjustment component 520 may also assign a confidence score to the overlay 715. Various methods may be used to determine the confidence score.
  • the confidence score may be determined at least based on the comparison result 710.
  • the overlay adjustment component 520 determines parameters of the first process (e.g., template matching process) to be adjusted for adjusting the initial overlay 504. In some embodiments, the overlay adjustment component 520 determines search parameters of an area in the similarity score map 802 to be searched by the template matching process to determine the adjusted overlay.
  • the overlay adjustment component 520 may determine the area 806 of the similarity score map 802 that is proximate the estimated overlay 511 (e.g., area 806 of the similarity score map defined by “y – m” to “y + m” along x-axis of the similarity score map, where m is a user specified value).
  • the graph 804 shows a plot of the similarity score, where the y-axis is the similarity score, and the x-axis is the length of the image 503 in the x-direction.
  • the overlay adjustment component 520 may execute the template matching process based on the search parameters (e.g., co-ordinates of the area 806 determined in process P720) to determine the adjusted overlay 715.
  • the template matching component 510 may search the similarity score map 802 in the area 806 to determine a local maximum (e.g., the position within the area 806 where the similarity score is the highest), and then revise the initial overlay 504 to the adjusted overlay 517 based on the coordinates of the local maximum.
  • the overlay adjustment component 520 may output the adjusted overlay 517 as the overlay 715 of the features 512 and 514.
  • the methods 600 and 700 of Figures 6 and 7 may be performed for each unit cell image of the image 502 of the pattern independently to determine the overlay for each unit cell image.
  • One of the advantages of determining the overlay value for each unit cell image independently is that the method is tolerant to variations (e.g., in image intensity) across the image, which eliminates the need to normalize the image, reduces the consumption of time and computing resources, and thereby makes defect detection faster.
  • Another advantage is the ability to assign a confidence level to each unit cell overlay value. For example, when the unit cell overlay determined by the first process is similar to the overlay determined using the context area, a higher confidence level is assigned to this unit cell overlay.
  • the confidence level may be used as a weight to report the weighted mean overlay over all unit cells within one image (a weighted-mean sketch follows this list). This approach may further improve overlay result accuracy and robustness.
  • the method 600 may be used as a standalone method to determine the overlay of the features of a pattern. That is, the estimated overlay measured using the context area may be used as an independent overlay measurement without the need for it being used to adjust the initial overlay.
  • the method 600 may be supplemented by another process.
  • an overlay determined based on the context area may be used as an initial overlay, which may further be adjusted using an overlay determined using another method (e.g., template matching process).
  • the defect detection process helps in improving a patterning process by minimizing defects in patterning a target layout on a substrate.
  • a parameter of a patterning process or a lithographic apparatus used to print a pattern on a substrate may be adjusted to minimize defects in patterning a target layout on the substrate.
  • the patterning process may be performed using the lithographic apparatus to print patterns corresponding to the target layout on the substrate.
  • Figure 9 is a block diagram that illustrates a computer system 900 which can assist in implementing the various methods and systems disclosed herein.
  • the computer system 900 may be used to implement any of the entities, components, modules, or services depicted in the examples of the figures (and any other entities, components, modules, or services described in this specification).
  • the computer system 900 may be programmed to execute computer program instructions to perform functions, methods, flows, or services (e.g., of any of the entities, components, or modules) described herein.
  • the computer system 900 may be programmed to execute computer program instructions by at least one of software, hardware, or firmware.
  • Computer system 900 includes a bus 902 or other communication mechanism for communicating information, and a processor 904 (or multiple processors 904 and 905) coupled with bus 902 for processing information.
  • Computer system 900 also includes a main memory 906, such as a random access memory (RAM) or other dynamic storage device, coupled to bus 902 for storing information and instructions to be executed by processor 904.
  • Main memory 906 also may be used for storing temporary variables or other intermediate information during execution of instructions to be executed by processor 904.
  • Computer system 900 further includes a read only memory (ROM) 908 or other static storage device coupled to bus 902 for storing static information and instructions for processor 904.
  • a storage device 910 such as a magnetic disk or optical disk, is provided and coupled to bus 902 for storing information and instructions.
  • Computer system 900 may be coupled via bus 902 to a display 912, such as a cathode ray tube (CRT) or flat panel or touch panel display for displaying information to a computer user.
  • An input device 914 is coupled to bus 902 for communicating information and command selections to processor 904.
  • cursor control 916 such as a mouse, a trackball, or cursor direction keys for communicating direction information and command selections to processor 904 and for controlling cursor movement on display 912.
  • This input device typically has two degrees of freedom in two axes, a first axis (e.g., x) and a second axis (e.g., y), that allows the device to specify positions in a plane.
  • a touch panel (screen) display may also be used as an input device.
  • portions of one or more methods described herein may be performed by computer system 900 in response to processor 904 executing one or more sequences of one or more instructions contained in main memory 906. Such instructions may be read into main memory 906 from another computer-readable medium, such as storage device 910. Execution of the sequences of instructions contained in main memory 906 causes processor 904 to perform the process steps described herein.
  • processors in a multi-processing arrangement may also be employed to execute the sequences of instructions contained in main memory 906.
  • hard-wired circuitry may be used in place of or in combination with software instructions.
  • description herein is not limited to any specific combination of hardware circuitry and software.
  • computer-readable medium refers to any medium that participates in providing instructions to processor 904 for execution. Such a medium may take many forms, including but not limited to, non-volatile media, volatile media, and transmission media.
  • Non-volatile media include, for example, optical or magnetic disks, such as storage device 910.
  • Volatile media include dynamic memory, such as main memory 906.
  • Transmission media include coaxial cables, copper wire and fiber optics, including the wires that comprise bus 902. Transmission media can also take the form of acoustic or light waves, such as those generated during radio frequency (RF) and infrared (IR) data communications.
  • Common forms of computer-readable media include, for example, a floppy disk, a flexible disk, hard disk, magnetic tape, any other magnetic medium, a CD-ROM, DVD, any other optical medium, punch cards, paper tape, any other physical medium with patterns of holes, a RAM, a PROM, an EPROM, a FLASH-EPROM, any other memory chip or cartridge, a carrier wave as described hereinafter, or any other medium from which a computer can read.
  • Various forms of computer readable media may be involved in carrying one or more sequences of one or more instructions to processor 904 for execution.
  • the instructions may initially be borne on a magnetic disk of a remote computer.
  • the remote computer can load the instructions into its dynamic memory and send the instructions over a telephone line using a modem.
  • A modem local to computer system 900 can receive the data on the telephone line and use an infrared transmitter to convert the data to an infrared signal.
  • An infrared detector coupled to bus 902 can receive the data carried in the infrared signal and place the data on bus 902.
  • Bus 902 carries the data to main memory 906, from which processor 904 retrieves and executes the instructions.
  • Computer system 900 also preferably includes a communication interface 918 coupled to bus 902.
  • Communication interface 918 provides a two-way data communication coupling to a network link 920 that is connected to a local network 922.
  • communication interface 918 may be an integrated services digital network (ISDN) card or a modem to provide a data communication connection to a corresponding type of telephone line.
  • communication interface 918 may be a local area network (LAN) card to provide a data communication connection to a compatible LAN.
  • Network link 920 typically provides data communication through one or more networks to other data devices.
  • network link 920 may provide a connection through local network 922 to a host computer 924 or to data equipment operated by an Internet Service Provider (ISP) 926.
  • ISP 926 in turn provides data communication services through the worldwide packet data communication network, now commonly referred to as the “Internet” 928.
  • Internet 928 uses electrical, electromagnetic, or optical signals that carry digital data streams.
  • Computer system 900 can send messages and receive data, including program code, through the network(s), network link 920, and communication interface 918.
  • a server 930 might transmit a requested code for an application program through Internet 928, ISP 926, local network 922 and communication interface 918.
  • One such downloaded application may provide for the illumination optimization of the embodiment, for example.
  • the received code may be executed by processor 904 as it is received, or stored in storage device 910, or other non-volatile storage for later execution.
  • computer system 900 may obtain application code in the form of a carrier wave.
  • while the concepts disclosed herein may be used for imaging on a substrate such as a silicon wafer, it shall be understood that the disclosed concepts may be used with any type of lithographic imaging systems, e.g., those used for imaging on substrates other than silicon wafers.
  • the terms “optimizing” and “optimization” as used herein refer to or mean adjusting a patterning apparatus (e.g., a lithography apparatus), a patterning process, etc. such that results and/or processes have more desirable characteristics, such as higher accuracy of projection of a design pattern on a substrate, a larger process window, etc.
  • optimization refers to or means a process that identifies one or more values for one or more parameters that provide an improvement, e.g., a local optimum, in at least one relevant metric, compared to an initial set of one or more values for those one or more parameters. "Optimum” and other related terms should be construed accordingly. In an embodiment, optimization steps can be applied iteratively to provide further improvements in one or more metrics. [0072] Aspects of the invention can be implemented in any convenient form.
  • an embodiment may be implemented by one or more appropriate computer programs which may be carried on an appropriate carrier medium which may be a tangible carrier medium (e.g., a disk) or an intangible carrier medium (e.g., a communications signal).
  • Embodiments of the invention may be implemented using suitable apparatus which may specifically take the form of a programmable computer running a computer program arranged to implement a method as described herein.
  • embodiments of the disclosure may be implemented in hardware, firmware, software, or any combination thereof.
  • Embodiments of the disclosure may also be implemented as instructions stored on a machine-readable medium, which may be read and executed by one or more processors.
  • a machine-readable medium may include any mechanism for storing or transmitting information in a form readable by a machine (e.g., a computing device).
  • a machine-readable medium may include read only memory (ROM); random access memory (RAM); magnetic disk storage media; optical storage media; flash memory devices; electrical, optical, acoustical, or other forms of propagated signals (e.g., carrier waves, infrared signals, digital signals, etc.), and others.
  • firmware, software, routines, instructions may be described herein as performing certain actions. However, it should be appreciated that such descriptions are merely for convenience and that such actions in fact result from computing devices, processors, controllers, or other devices executing the firmware, software, routines, instructions, etc.
  • illustrated components are depicted as discrete functional blocks, but embodiments are not limited to systems in which the functionality described herein is organized as illustrated.
  • the functionality provided by each of the components may be provided by software or hardware modules that are differently organized than is presently depicted, for example such software or hardware may be intermingled, conjoined, replicated, broken up, distributed (e.g., within a data center or geographically), or otherwise differently organized.
  • the functionality described herein may be provided by one or more processors of one or more computers executing code stored on a tangible, non-transitory, machine-readable medium.
  • third party content delivery networks may host some or all of the information conveyed over networks, in which case, to the extent information (e.g., content) is said to be supplied or otherwise provided, the information may be provided by sending instructions to retrieve that information from a content delivery network.
  • a method for determining overlay comprising: obtaining a representation of an image having a plurality of features; determining a context area of the image, wherein the context area corresponds to an area of a corresponding design layout that does not contain target features; determining a characteristic of the context area; and determining an overlay of the plurality of features based on the characteristic.
  • the context area is located between the features.
  • the characteristic is a geometrical parameter of the context area.
  • the geometrical parameter includes a height of the context area.
  • the geometrical parameter includes at least one of a context y gap, centroid, major axis length, minor axis length, area size, bounding box, or eccentricity of the context area.
  • the context area includes pixels of different intensity values, and wherein the geometrical parameter is determined based on the intensity values.
  • determining the overlay includes: obtaining a correlation between the characteristic and an overlay of a set of features; and determining the overlay based on the correlation.
  • the method of clause 1 further comprising: comparing the overlay with a first overlay of the plurality of features, wherein the first overlay is determined using a first process based on a non-context area of the image; and adjusting the first process to determine an adjusted overlay based on the overlay.
  • adjusting the first process includes: adjusting the first process based on a determination that a difference between the first overlay and the overlay exceeds a specified threshold.
  • the first process includes a template matching process.
  • the first process includes an edge searching process that generates a contour of the plurality of features.
  • adjusting the first process includes: adjusting the first process based on a determination that a difference between the first overlay and the second overlay exceeds a specified threshold.
  • the context area is determined by thresholding the image.
  • thresholding the image includes: generating a first histogram that is representative of a first set of the plurality of features in a first layer of the design layout; generating a second histogram that is representative of a second set of the plurality of features in a second layer of the design layout, the second layer being below the first layer; and generating a third histogram that is representative of an area of the image that does not contain target features.
  • a component may include A, or B, or A and B.
  • the component may include A, or B, or C, or A and B, or A and C, or B and C, or A and B and C.
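
As a minimal sketch of how a context area might be extracted by thresholding the image (see the thresholding clauses in the list above), the following assumes an SEM-style grayscale image and two hand-chosen gray-level thresholds, t_low and t_high; the thresholds, the toy image, and the class assignments are illustrative assumptions, not values or an algorithm taken from this publication.

```python
import numpy as np

def split_context_area(image: np.ndarray, t_low: float, t_high: float):
    """Split a grayscale image into two feature masks and a context-area mask.

    Assumption (not from the patent text): pixels at or above t_high belong to
    upper-layer features, pixels between t_low and t_high to lower-layer
    features, and everything darker than t_low is treated as the context area,
    i.e. the region containing no target features of the design layout.
    """
    upper_features = image >= t_high
    lower_features = (image >= t_low) & (image < t_high)
    context_area = ~(upper_features | lower_features)
    return upper_features, lower_features, context_area

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Toy "SEM" image: dark background (context) with two brighter feature regions.
    img = rng.normal(20.0, 2.0, size=(64, 64))
    img[20:30, :] += 60.0        # lower-layer line
    img[24:28, 30:38] += 120.0   # upper-layer via printed over the line
    upper, lower, context = split_context_area(img, t_low=50.0, t_high=120.0)
    # One gray-level histogram per class, echoing the three-histogram clause.
    for name, mask in (("upper", upper), ("lower", lower), ("context", context)):
        hist, _ = np.histogram(img[mask], bins=8)
        print(name, int(mask.sum()), hist)
```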
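
The geometrical KPIs of the context area named above (height, centroid, bounding box, area size) and the intensity-weighted centroid can be computed directly from a context-area mask. The formulas below are illustrative assumptions; the publication does not define the exact KPI implementations (for example, the precise meaning of the context y gap or of height 509).

```python
import numpy as np

def context_kpis(context_mask: np.ndarray, image: np.ndarray) -> dict:
    """Compute a few illustrative KPIs of a context-area mask.

    context_mask : boolean array marking context pixels.
    image        : gray-level image; its values are used as weights when the
                   context pixels have more than one gray level.
    """
    rows, cols = np.nonzero(context_mask)
    if rows.size == 0:
        raise ValueError("empty context area")
    weights = image[rows, cols].astype(float)

    kpis = {
        "area_size": int(rows.size),
        # Vertical extent of the context area, a "height"-like KPI.
        "height": int(rows.max() - rows.min() + 1),
        # Plain (unweighted) centroid as (row, col).
        "centroid": (float(rows.mean()), float(cols.mean())),
        # Bounding box as (min_row, min_col, max_row, max_col).
        "bounding_box": (int(rows.min()), int(cols.min()),
                         int(rows.max()), int(cols.max())),
    }
    # Intensity-weighted centroid, used when the context has n > 1 gray levels.
    if np.unique(weights).size > 1:
        kpis["weighted_centroid"] = (float(np.average(rows, weights=weights)),
                                     float(np.average(cols, weights=weights)))
    return kpis
```

With the masks from the previous sketch, `context_kpis(context, img)` would return these quantities for the toy image.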
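
For the predetermined correlation between a context-area KPI and overlay, one hedged possibility among those listed (KPI map, lookup table, model, or function) is a calibration lookup table interpolated at the measured KPI. The calibration numbers below are invented placeholders, not data from this publication.

```python
import numpy as np

# Hypothetical calibration table: context-area height KPI (pixels) versus the
# overlay (nm) of reference structures with known, programmed overlay.
CAL_KPI = np.array([10.0, 12.0, 14.0, 16.0, 18.0])
CAL_OVERLAY = np.array([-4.0, -2.0, 0.0, 2.0, 4.0])

def estimated_overlay_from_kpi(kpi: float) -> float:
    """Map a measured context-area KPI to an estimated overlay by linear
    interpolation of the calibration table (CAL_KPI must be ascending)."""
    return float(np.interp(kpi, CAL_KPI, CAL_OVERLAY))

print(estimated_overlay_from_kpi(13.0))  # -> -1.0 with the placeholder table
```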
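
The adjustment flow of Figures 6 to 8, comparing the initial overlay from a first process with the context-based estimate and, when they disagree by more than a threshold, restricting the template matching search to a window around the estimate, might look roughly as sketched below. The sketch assumes a one-dimensional similarity-score profile indexed by candidate overlay position, which is a simplification of the similarity score map 802; the threshold and half-width values are arbitrary.

```python
import numpy as np

def adjust_overlay(scores: np.ndarray,
                   initial_overlay: int,
                   estimated_overlay: int,
                   threshold: int = 2,
                   m: int = 3) -> int:
    """Return the (possibly adjusted) overlay position.

    scores            : 1-D similarity-score profile along the overlay axis.
    initial_overlay   : argmax of the full profile (first process, e.g. template matching).
    estimated_overlay : overlay estimated from the context-area KPI.
    threshold         : maximum tolerated disagreement before re-searching.
    m                 : half-width of the restricted search window (user choice).
    """
    if abs(initial_overlay - estimated_overlay) <= threshold:
        return initial_overlay  # comparison result satisfies the criterion
    # Otherwise restrict the search to [y - m, y + m] around the estimate and
    # take the local maximum of the similarity score within that window.
    lo = max(estimated_overlay - m, 0)
    hi = min(estimated_overlay + m + 1, scores.size)
    return lo + int(np.argmax(scores[lo:hi]))

if __name__ == "__main__":
    scores = np.array([0.1, 0.2, 0.9, 0.3, 0.2, 0.6, 0.7, 0.4])
    x0 = int(np.argmax(scores))           # initial overlay at position 2
    y = 6                                 # context-based estimate
    print(adjust_overlay(scores, x0, y))  # -> 6 (local maximum near the estimate)
```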
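
For the per-unit-cell confidence level and the confidence-weighted mean overlay described above, the publication does not specify how the confidence is derived from the comparison result; the sketch below assumes, purely for illustration, a confidence that decays with the disagreement between the two overlay estimates.

```python
import numpy as np

def unit_cell_confidence(initial: float, estimated: float, scale: float = 1.0) -> float:
    """Assumed confidence model: 1 when the two overlays agree, decaying with their gap."""
    return float(np.exp(-abs(initial - estimated) / scale))

def weighted_mean_overlay(unit_cell_overlays, confidences) -> float:
    """Confidence-weighted mean overlay over all unit cells within one image."""
    return float(np.average(np.asarray(unit_cell_overlays),
                            weights=np.asarray(confidences)))

# Example: three unit cells, the second disagreeing strongly with its context estimate.
overlays  = [1.9, 3.5, 2.1]
estimates = [2.0, 2.0, 2.0]
conf = [unit_cell_confidence(o, e) for o, e in zip(overlays, estimates)]
print(weighted_mean_overlay(overlays, conf))
```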

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Testing Or Measuring Of Semiconductors Or The Like (AREA)

Abstract

Described herein are systems and methods for determining overlay between features of a pattern. The method includes obtaining a representation of an image having multiple features. A context area of the image that does not contain any target features defined in a corresponding design layout is determined and an overlay of the features is determined based on a characteristic of the context area. The overlay is compared with an initial overlay determined using another process based on a non-context area of the image. The initial overlay is then adjusted or "fine-tuned" by adjusting parameters of the other process based on the comparison result.

Description

METHOD AND SYSTEM FOR CONTEXT AWARE OVERLAY ADJUSTMENT CROSS-REFERENCE TO RELATED APPLICATION [0001] This application claims priority of US application 63/457,754 which was filed on 6 April 2023, and which is incorporated herein in its entirety by reference. TECHNICAL FIELD [0002] The embodiments provided herein relate to semiconductor manufacturing, and more particularly to semiconductor metrology and inspection. BACKGROUND [0003] A lithographic apparatus is a machine that applies a desired pattern onto a target portion of a substrate. The lithographic apparatus can be used, for example, in the manufacture of integrated circuits (ICs). For example, an IC chip in a smart phone, can be as small as a person’s thumbnail, and may include over 2 billion transistors. Making an IC is a complex and time-consuming process, with circuit components in different layers and including hundreds of individual steps. Errors in even one step have the potential to result in problems with the final IC and can cause device failure. High process yield and high wafer throughput can be impacted by the presence of defects. [0004] Metrology processes are used at various steps during a patterning process to monitor and/or control the process. For example, metrology processes are used to measure one or more characteristics of a substrate, such as a relative location (e.g., registration, overlay, alignment, etc.) or dimension (e.g., line width, critical dimension (CD), thickness, etc.) of features formed on the substrate during the patterning process or stochastic variation, such that, for example, the performance of the patterning process can be determined from the one or more characteristics. If the one or more characteristics are unacceptable (e.g., out of a predetermined range for the characteristic(s)), one or more variables of the patterning process may be designed or altered, e.g., based on the measurements of the one or more characteristics, such that substrates manufactured by the patterning process have an acceptable characteristic(s). BRIEF SUMMARY [0005] In some embodiments, the techniques described herein relate to a method for determining overlay, the method including: obtaining a representation of an image having a plurality of features; determining a context area of the image that does not contain target features defined in a corresponding design layout; determining a characteristic of the context area; and determining an overlay of the plurality of features based on the characteristic. [0006] In some embodiments, the techniques described herein relate to a method for determining overlay, the method including: obtaining a representation of an image having a plurality of features; Confidential obtaining a first overlay of the plurality of features by executing a first process; obtaining a second overlay of the plurality of features, wherein the second overlay is determined based on a context area of the image that does not contain target features defined in a corresponding design layout; and adjusting parameters of the first method to determine an adjusted first overlay based on the second overlay. [0007] In some embodiments, there is provided a non-transitory computer readable medium having instructions that, when executed by a computer, cause the computer to execute a method of any of the above embodiments. 
[0008] In some embodiments, there is provided an apparatus includes a memory storing a set of instructions and a processor configured to execute the set of instructions to cause the apparatus to perform a method of any of the above embodiments. BRIEF DESCRIPTION OF THE DRAWINGS [0009] Embodiments will now be described, by way of example only, with reference to the accompanying drawings in which: [0010] Figure 1 is a schematic diagram illustrating an exemplary electron beam inspection (EBI) system, according to an embodiment. [0011] Figure 2 is a schematic diagram of an exemplary electron beam tool, according to an embodiment. [0012] Figure 3 depicts a schematic representation of holistic lithography, representing a cooperation between three technologies to optimize semiconductor manufacturing, according to an embodiment. [0013] Figure 4 shows an example of image representations of features of a pattern with different overlays, context areas and unit cell portion of the context area of the images, consistent with various embodiments. [0014] Figure 5 is a block diagram of an exemplary system for adjusting a first overlay based on a second overlay determined using a context area of an image representative of a pattern, consistent with various embodiments. [0015] Figure 6 is a flow diagram of an exemplary method for determining a second overlay using a context area of an image representative of a pattern, consistent with various embodiments. [0016] Figure 7 is a flow diagram of an exemplary method for adjusting a first overlay based on a second overlay determined using a context area of an image representative of a pattern, consistent with various embodiments. [0017] Figure 8 shows examples of template matching process-based adjusting of a first overlay using the second overlay, consistent with various embodiments. [0018] Figure 9 is a block diagram of an example computer system, according to an embodiment. [0019] Embodiments will now be described in detail with reference to the drawings, which are provided as illustrative examples so as to enable those skilled in the art to practice the embodiments. Confidential Notably, the figures and examples below are not meant to limit the scope to a single embodiment, but other embodiments are possible by way of interchange of some or all of the described or illustrated elements. Wherever convenient, the same reference numbers will be used throughout the drawings to refer to same or like parts. Where certain elements of these embodiments can be partially or fully implemented using known components, only those portions of such known components that are necessary for an understanding of the embodiments will be described, and detailed descriptions of other portions of such known components will be omitted so as not to obscure the description of the embodiments. In the present specification, an embodiment showing a singular component should not be considered limiting; rather, the scope is intended to encompass other embodiments including a plurality of the same component, and vice-versa, unless explicitly stated otherwise herein. Moreover, applicants do not intend for any term in the specification or claims to be ascribed an uncommon or special meaning unless explicitly set forth as such. Further, the scope encompasses present and future known equivalents to the components referred to herein by way of illustration. DETAILED DESCRIPTION [0020] A lithographic apparatus is a machine that applies a desired pattern onto a target portion of a substrate. 
This process of transferring the desired pattern to the substrate is called a patterning process. The patterning process can include a patterning step to transfer a pattern from a patterning device (such as a mask) to the substrate. Various variations (e.g., variations in the patterning process or the lithographic apparatus) can potentially limit lithography implementation for semiconductor high volume manufacturing (HVM). High resolution images of a substrate, such as images obtained using a scanning electron microscope (SEM), may be inspected for determining any defects in the patterning process. For example, the images may be inspected for determining an overlay between features of a pattern. To determine the overlay, a position of the features may have to be determined. For example, to determine the overlay of a feature, such as a via (e.g., a hole), with another feature, such as a bar, the positions of both the features are determined based on which the overlay is determined. [0021] Conventional techniques employ various methods for determining a position of a feature in a pattern. For example, image processing algorithms of segmentation, contour finding and template matching (TM) process, etc. may be employed for determining the position of the features, and therefore, overlay of the features. However, conventional techniques focus on utilizing explicit image information of the prescribed features to be measured, and do not consider the remainder of the image, or the “context.” In the template matching process, low image contrast of a blocked feature may make it difficult to calculate similarity between the feature and a template, leading to inaccurate feature location and consequently inaccurate overlay measurement. These and other drawbacks exist. [0022] Disclosed are embodiments for determining an overlay of features of a pattern using a “context” area of an image representation of the pattern. The context area of the image is an area Confidential which is devoid of (e.g., does not have) any intended target features of a design layout, or is an area between adjacent features. In some embodiments, an image representation of the pattern may be classified into two types of regions, one region that contains the features of a pattern whose metrology value (CD, overlay values, EPE, or any metrology parameter) is the measurement of interest, and a second region that is outside the features is the context area. For example, if the image of pattern includes two features such as a via and a line, the measurement of interest is an overlay between the via and the line, then the prescribed features are the via and the line, and the remaining portion of the image is the context area. The image may be represented as: Pattern image = Prescribed features of the pattern + Context area Context area is complementary of the prescribed features area, and therefore context-based overlay shall match prescribed feature-base overlay. Overlay (Prescribed features) = Overlay (Context area) [0023] Typically, in capturing a metrology image, the context area is not an area in image focus. The image is processed to extract the context area and an overlay of the features is determined based on one or more characteristics (e.g., key performance indicators (KPIs)) of the context area. 
The characteristic of the context area may include a geometrical parameter such as at least one of a height, context y gap, centroid, major axis length, minor axis length, area size, bounding box, extent, or eccentricity of the context area. After determining the KPI of the context area, an estimated overlay can be determined based on a predetermined correlation between the context area KPI and overlay, e.g., in the form of a KPI map, a lookup table, a model or a function. For example, the model can be an empirical model, a machine learning model, etc. Regardless of how the correlation is implemented, the correlation may output an overlay value for a given KPI or characteristic. The estimated overlay is then compared with an initial overlay, which is determined based on a non- context area of the image using any of a number of known methods (e.g., template matching process), and the initial overlay is then adjusted based on the comparison result. For example, if the difference between the estimated overlay and the initial overlay exceeds a specified threshold, the parameters of the template matching process may be adjusted to “fine tune” (e.g., adjust) the initial overlay to determine a revised overlay. For example, the template matching process may be given a search range (e.g., determined based on the comparison result) proximate to the region of the initial overlay in a similarity score map to determine the revised overlay. By using the context area to determine the estimated overlay and then fine tuning the initial overlay determined using the known the methods, any inaccuracies in determining the position of the features is eliminated or minimized, thereby resulting in a more accurate determination of the position of the features, and thus, the overlay. Confidential [0024] Reference is now made to Figure 1, which illustrates an exemplary electron beam inspection (EBI) system 100 consistent with embodiments of the present disclosure. As shown in Figure 1, EBI system 100 includes a main chamber 110, a load-lock chamber 120, an electron beam tool 140, and an equipment front end module (EFEM) 130. Electron beam tool 140 is located within main chamber 110. The exemplary EBI system 100 may be a single or multi-beam system. While the description and drawings are directed to an electron beam, it is appreciated that the embodiments are not used to limit the present disclosure to specific charged particles. [0025] EFEM 130 includes a first loading port 130a and a second loading port 130b. EFEM 130 may include additional loading port(s). First loading port 130a and second loading port 130b receive wafer front opening unified pods (FOUPs) that contain wafers (e.g., semiconductor wafers or wafers made of other material(s)) or samples to be inspected (wafers and samples are collectively referred to as “wafers” hereafter). One or more robot arms (not shown) in EFEM 130 transport the wafers to load- lock chamber 120. [0026] Load-lock chamber 120 is connected to a load/lock vacuum pump system (not shown), which removes gas molecules in load-lock chamber 120 to reach a first pressure below the atmospheric pressure. After reaching the first pressure, one or more robot arms (not shown) transport the wafer from load-lock chamber 120 to main chamber 110. Main chamber 110 is connected to a main chamber vacuum pump system (not shown), which removes gas molecules in main chamber 110 to reach a second pressure below the first pressure. 
After reaching the second pressure, the wafer is subject to inspection by electron beam tool 140. In some embodiments, electron beam tool 140 may comprise a single-beam inspection tool. [0027] Controller 150 may be electronically connected to electron beam tool 140 and may be electronically connected to other components as well. Controller 150 may be a computer configured to execute various controls of EBI system 100. Controller 150 may also include processing circuitry configured to execute various signal and image processing functions. While controller 150 is shown in Figure 1 as being outside of the structure that includes main chamber 110, load-lock chamber 120, and EFEM 130, it is appreciated that controller 150 can be part of the structure. [0028] FIG. 2 illustrates a schematic diagram of an exemplary imaging system 200 according to embodiments of the present disclosure. Electron beam tool 140 of FIG. 2 may be configured for use in EBI system 100. Electron beam tool 140 may be a single beam apparatus or a multi-beam apparatus. As shown in FIG. 2, electron beam tool 140 includes a motorized sample stage 201, and a wafer holder 202 supported by motorized sample stage 201 to hold a wafer 203 to be inspected. Electron beam tool 140 further includes an objective lens assembly 204, an electron detector 206 (which includes electron sensor surfaces 206a and 206b), an objective aperture 208, a condenser lens 210, a beam limit aperture 212, a gun aperture 214, an anode 216, and a cathode 218. Objective lens assembly 204, in some embodiments, may include a modified swing objective retarding immersion lens (SORIL), which includes a pole piece 204a, a control electrode 204b, a deflector 204c, and an exciting coil 204d. Electron beam tool 140 may additionally include an Energy Dispersive X-ray Spectrometer (EDS) detector (not shown) to characterize the materials on wafer 203. [0029] A primary electron beam 220 is emitted from cathode 218 by applying a voltage between anode 216 and cathode 218. Primary electron beam 220 passes through gun aperture 214 and beam limit aperture 212, both of which may determine the size of the electron beam entering condenser lens 210, which resides below beam limit aperture 212. Condenser lens 210 focuses primary electron beam 220 before the beam enters objective aperture 208 to set the size of the electron beam before entering objective lens assembly 204. Deflector 204c deflects primary electron beam 220 to facilitate beam scanning on the wafer. For example, in a scanning process, deflector 204c may be controlled to deflect primary electron beam 220 sequentially onto different locations of the top surface of wafer 203 at different time points, to provide data for image reconstruction for different parts of wafer 203. Moreover, deflector 204c may also be controlled to deflect primary electron beam 220 onto different sides of wafer 203 at a particular location, at different time points, to provide data for stereo image reconstruction of the wafer structure at that location. Further, in some embodiments, anode 216 and cathode 218 may be configured to generate multiple primary electron beams 220, and electron beam tool 140 may include a plurality of deflectors 204c to project the multiple primary electron beams 220 to different parts/sides of the wafer at the same time, to provide data for image reconstruction for different parts of wafer 203.
[0030] Exciting coil 204d and pole piece 204a generate a magnetic field that begins at one end of pole piece 204a and terminates at the other end of pole piece 204a. A part of wafer 203 being scanned by primary electron beam 220 may be immersed in the magnetic field and may be electrically charged, which, in turn, creates an electric field. The electric field reduces the energy of impinging primary electron beam 220 near the surface of wafer 203 before it collides with wafer 203. Control electrode 204b, being electrically isolated from pole piece 204a, controls an electric field on wafer 203 to prevent micro-arcing of wafer 203 and to ensure proper beam focus. [0031] A secondary electron beam 222 may be emitted from the part of wafer 203 upon receiving primary electron beam 220. Secondary electron beam 222 may form a beam spot on sensor surfaces 206a and 206b of electron detector 206. Electron detector 206 may generate a signal (e.g., a voltage, a current, etc.) that represents an intensity of the beam spot, and provide the signal to an image processing system 250. The intensity of secondary electron beam 222, and the resultant beam spot, may vary according to the external or internal structure of wafer 203. Moreover, as discussed above, primary electron beam 220 may be projected onto different locations of the top surface of the wafer or different sides of the wafer at a particular location, to generate secondary electron beams 222 (and the resultant beam spot) of different intensities. Therefore, by mapping the intensities of the beam spots with the locations of wafer 203, the processing system may reconstruct an image that reflects the internal or surface structures of wafer 203. [0032] Imaging system 200 may be used for inspecting a wafer 203 on sample stage 201, and comprises an electron beam tool 140, as discussed above. Imaging system 200 may also comprise an image processing system 250 that includes an image acquirer 260, storage 270, and controller 150. Image acquirer 260 may comprise one or more processors. For example, image acquirer 260 may comprise a computer, server, mainframe host, terminals, personal computer, any kind of mobile computing device, and the like, or a combination thereof. Image acquirer 260 may connect with detector 206 of electron beam tool 140 through a medium such as an electrical conductor, optical fiber cable, portable storage media, IR, Bluetooth, internet, wireless network, wireless radio, or a combination thereof. Image acquirer 260 may receive a signal from detector 206 and may construct an image. Image acquirer 260 may thus acquire images of wafer 203. Image acquirer 260 may also perform various post-processing functions, such as generating contours, superimposing indicators on an acquired image, and the like. Image acquirer 260 may be configured to perform adjustments of brightness and contrast, etc. of acquired images. Storage 270 may be a storage medium such as a hard disk, cloud storage, random access memory (RAM), other types of computer readable memory, and the like. Storage 270 may be coupled with image acquirer 260 and may be used for saving scanned raw image data as original images, and post-processed images. Image acquirer 260 and storage 270 may be connected to controller 150. In some embodiments, image acquirer 260, storage 270, and controller 150 may be integrated together as one control unit.
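By way of illustration, the following minimal sketch shows how such a mapping from beam-spot intensities to wafer locations can be assembled into an image array. It assumes a simple raster scan in which each detector reading already carries a (row, column) scan index; the function name, its arguments, and the use of NumPy are illustrative assumptions and do not describe the actual pipeline of image processing system 250.

# Minimal sketch (assumption: raster scan with one detector reading per known (row, column) index).
import numpy as np

def assemble_image(rows, cols, intensities, shape):
    """Map per-location detector intensities onto a 2-D image grid."""
    image = np.zeros(shape, dtype=float)
    image[np.asarray(rows), np.asarray(cols)] = np.asarray(intensities, dtype=float)
    return image

# Example usage: a 4x4 raster scan with one reading per pixel.
# rows, cols = np.divmod(np.arange(16), 4)
# image = assemble_image(rows, cols, np.random.rand(16), (4, 4))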
[0033] In some embodiments, image acquirer 260 may acquire one or more images of a sample based on an imaging signal received from detector 206. An imaging signal may correspond to a scanning operation for conducting charged particle imaging. An acquired image may be a single image comprising a plurality of imaging areas. The single image may be stored in storage 270. The single image may be an original image that may be divided into a plurality of regions. Each of the regions may comprise one imaging area containing a feature of wafer 203. [0034] Figure 3 depicts a schematic representation of holistic lithography, representing a cooperation between three technologies to optimize semiconductor manufacturing. Typically, the patterning process in a lithographic apparatus LA is one of the most critical steps in the processing, which requires high accuracy of dimensioning and placement of structures on the substrate W (Figure 1). To ensure this high accuracy, three systems (in this example) may be combined in a so-called "holistic" control environment as schematically depicted in Figure 3. One of these systems is the lithographic apparatus LA which is (virtually) connected to a metrology apparatus (e.g., a metrology tool) MT (a second system), and to a computer system CL (a third system). A "holistic" environment may be configured to optimize the cooperation between these three systems to enhance the overall process window and provide tight control loops to ensure that the patterning performed by the lithographic apparatus LA stays within a process window. The process window defines a range of process parameters (e.g., dose, focus, overlay) within which a specific manufacturing process yields a defined result (e.g., a functional semiconductor device) – typically within which the process parameters in the lithographic process or patterning process are allowed to vary. [0035] The computer system CL may use (part of) the design layout to be patterned to predict which resolution enhancement techniques to use and to perform computational lithography simulations and calculations to determine which mask layout and lithographic apparatus settings achieve the largest overall process window of the patterning process (depicted in Figure 3 by the double arrow in the first scale SC1). Typically, the resolution enhancement techniques are arranged to match the patterning possibilities of the lithographic apparatus LA. The computer system CL may also be used to detect where within the process window the lithographic apparatus LA is currently operating (e.g., using input from the metrology tool MT) to predict whether defects may be present due to, for example, sub-optimal processing (depicted in Figure 3 by the arrow pointing "0" in the second scale SC2). [0036] The metrology apparatus (tool) MT may provide input to the computer system CL to enable accurate simulations and predictions, and may provide feedback to the lithographic apparatus LA to identify possible drifts, e.g., in a calibration status of the lithographic apparatus LA (depicted in Figure 3 by the multiple arrows in the third scale SC3). [0037] The following paragraphs describe a system and a method for tuning an initial overlay resulting from a first overlay determination process (e.g., template matching process) based on an estimated overlay determined using a "context" area of an image representation of a pattern.
Note that while the initial overlay is described as determined using a template matching process, any of a number of other processes, such as image processing algorithms for segmentation or contour finding, edge-based algorithms, region-based algorithms, machine learning-based algorithms, etc., may be used. Further, in embodiments described in detail herein, while the estimated overlay is described as determined using a KPI such as the height of the context area, other KPIs such as context y gap (e.g., distance between boundaries of two features), centroid (e.g., center of mass of the context area), major axis length (e.g., length of a major axis of the context area fitted to an ellipse), minor axis length (e.g., length of a minor axis of the context area fitted to an ellipse), area size (e.g., area of the context area), bounding box (e.g., a position and size of the smallest box containing the context area), extent (e.g., area of the context area divided by the area of the bounding box), or eccentricity (e.g., ratio of the distance between the foci of the ellipse and its major axis length) of the context area may be used in determining the estimated overlay. [0038] Figure 4 shows an example 400 of image representations of features of a pattern with different overlays, context areas, and unit cell portions of the context areas of the images, consistent with various embodiments. A first image 401 shows a pattern with multiple features such as a first feature 402 (e.g., a bar) and a second feature 403 (e.g., a hole representing a via). In some embodiments, the first feature 402 corresponds to a target feature that is part of a first layer of a design layout and the second feature 403 corresponds to a target feature that is part of a second layer of the design layout, in which the first layer is printed above the second layer on a substrate. The pattern can have a number of first features and a number of second features as shown in the first image 401. In some embodiments, the first image 401 is a measured image of the pattern printed on the substrate, and may be obtained using a metrology tool such as a scanning electron microscope (SEM), an optical-based metrology tool, or a scatterometer. In some embodiments, the first image 401 may be a simulated image of the pattern, which may be simulated using any of a number of methods. [0039] In the example 400, the first image 401 is representative of features with a negative overlay, a second image 411 is representative of features with zero overlay, and a third image 421 is representative of features with a positive overlay. The fourth image 405 is an image representative of a context area of the first image 401. In some embodiments, a context area of the image is an area that is devoid of (or does not have any) target patterns defined in the design layout. That is, it is an area of the image that does not have any intended features of the pattern. For example, all the portions of the first image 401 filled with black color that do not have any features are referred to as the context area, and are shown as black portions (e.g., portions 406) of a first context image 405. Similarly, the black portions of the second image 411 that do not have any features are referred to as the context area, and are shown as black portions (e.g., portions 416) of a second context image 415.
Similarly, the black portions of the third image 421 that do not have any features are referred to as the context area, and are shown as black portions (e.g., portions 426) of a third context image 425. The context images 407, 417 and 427 are unit-cell-sized portions of the context images 405, 415 and 425, respectively. In some embodiments, for performing a metrology inspection on an image, the image is typically divided into, or considered as, a number of unit cells and each unit cell is inspected separately. The context area may be determined for a single unit cell image or multiple unit cell images. In some embodiments, a unit cell boundary in the y direction may start from the via center and end at the next via center, and the unit cell boundary in the x direction shall start from the via boundary's leftmost point and end at the via boundary's rightmost point. Any of the layers may be chosen as a reference layer to determine the overlay between two layers. In the example of Figure 4, the via is chosen as the reference layer, and the overlay is an offset from a line passing through a center of the bar along the y-direction to a via center. If the bar is chosen as the reference layer, then the overlay is the offset from the via center to the line passing through a center of the bar along the x-direction. [0040] In some embodiments, an overlay may be estimated using a KPI or characteristic of the context area geometry. For example, an overlay of the features 402 and 403 in the first image 401 may be estimated using a height 408 of the context area 406 depicted in a first unit cell image 407. For example, the estimated overlay may be determined based on the height of the context area using a KPI map, which is indicative of a correlation between KPIs and overlay. In some embodiments, the KPI map 450 is indicative of a correlation between a height of the context area and a corresponding overlay. The graph 451 represents a height of the context area and the corresponding overlay. In some embodiments, the y-axis of the KPI map 450 represents a height of the context area and the x-axis represents an overlay from -m to +m. In some embodiments, the graph 451 is obtained using a curve fitting algorithm, such as a Gaussian curve fit. While the correlation is a Gaussian distribution in some cases (e.g., the height-to-overlay correlation), it may not be a Gaussian distribution in correlations of the overlay with some other KPIs. The overlay of the features in the first image 401 may be determined by determining the height 408 of the context area and then determining the overlay corresponding to that height in the KPI map 450. In some embodiments, the height 408 is determined at a center of the context area. Similarly, an overlay of the features 402 and 403 in the second and third images 411 and 421, respectively, may be estimated using the KPI map 450 based on a height of the context areas 416 and 426 in a second unit cell image 417 and a third unit cell image 427, respectively. [0041] The KPI map 450 may be determined in a number of ways. For example, the KPI map 450 may be determined by determining the overlay for a number of sample images using any of a number of known methods (e.g., a template matching process), extracting the context area for each sample image and determining the KPIs of the context area for each of the overlays determined, and establishing a correlation between the KPIs and overlay. In another example, the KPI map 450 may be determined from simulated metrology images (e.g., SEM images) with known overlay offsets (e.g., covering the -m to +m overlay range).
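To make the calibration concrete, the sketch below fits a Gaussian curve to calibration data consisting of known overlay offsets and the context-area heights measured for them, which is one way to realize the KPI map 450. The Gaussian form follows the height example above; other KPIs may need a different model or a simple lookup table. The function and variable names are illustrative, and the availability of SciPy is an assumption.

# Sketch: fitting a KPI map (context-area height as a function of overlay) from
# calibration unit cells with known overlay offsets. Illustrative only.
import numpy as np
from scipy.optimize import curve_fit

def gaussian(overlay, amplitude, center, sigma, baseline):
    """Model: the height peaks at some overlay value and falls off symmetrically."""
    return amplitude * np.exp(-((overlay - center) ** 2) / (2.0 * sigma ** 2)) + baseline

def fit_kpi_map(known_overlays, measured_heights):
    """Return the fitted (amplitude, center, sigma, baseline) of the height-overlay curve."""
    known_overlays = np.asarray(known_overlays, dtype=float)
    measured_heights = np.asarray(measured_heights, dtype=float)
    p0 = [measured_heights.max() - measured_heights.min(),               # amplitude guess
          known_overlays[np.argmax(measured_heights)],                   # center guess
          0.5 * max(known_overlays.max() - known_overlays.min(), 1e-6),  # width guess
          measured_heights.min()]                                        # baseline guess
    params, _ = curve_fit(gaussian, known_overlays, measured_heights, p0=p0)
    return params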
[0042] The context area may be extracted from an image representative of the features using a number of methods. For example, the context area can be defined using a segmentation method such as threshold-based, edge-based, region-based, cluster-based, or watershed segmentation. An example of the threshold-based segmentation method to identify the context area is as follows. In some embodiments, the context area 406 in the first context image 405 may be extracted by thresholding the first image 401 using a specified threshold 434 that is determined based on histograms of pixel intensity values associated with different areas of the first image 401. In some embodiments, a first histogram 431 is generated for the first feature 402, a second histogram 432 is generated for the second feature 403, and a third histogram 433 is generated for the context area 406. Typically, the first histogram 431 is of greater intensity than the second histogram 432, since the second feature 403 is in the buried layer, which is printed below the layer of the first feature 402 in the design layout. The third histogram 433 contains the lowest intensity values among the three histograms, as it corresponds to a portion of the image that does not have any features. An intensity threshold 434 proximate the intersection of the second histogram 432 and the third histogram 433 is selected for thresholding the first image 401 to extract the context area. By thresholding the first image 401 based on the intensity threshold 434, the first context image 405 having the context area 406 may be generated. The first context image 405 may be a binary image generated based on the pixel values of the first image 401 and the intensity threshold 434. For example, all those pixels with a value below the intensity threshold 434 may be assigned a "0" value (which corresponds to the black region - portions 406) and those equal to or above the intensity threshold 434 may be assigned a "1" value (which corresponds to the white region having the features). In some embodiments, the context area may correspond to areas of the image with pixel intensity values (a) higher than a threshold, or (b) in a specified threshold range, i.e., between a minimum threshold and a maximum threshold. [0043] In some embodiments, the context area may be defined from two or more layers of a design layout. For example, an image of the pattern such as the first image 401 may be an image of five patterning layers, in which an x overlay between the features in the first and the third layer may be measured by the template matching process, the y overlay between the features in the second layer and the third layer may be measured using an edge-based searching algorithm, and the context area may be the remainder of the image area after removing the first, second and third layer features. [0044] The following paragraphs describe adjusting a first overlay (e.g., an initial overlay determined using a first process) based on a second overlay (e.g., an estimated overlay) determined using a context area of an image representative of a pattern, at least with reference to Figures 5-8.
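Before turning to Figures 5-8, a minimal sketch of the threshold-based extraction described above is given below. It assumes that sample pixel populations for the context area and the buried (second) feature are available so that their histograms can be intersected, and it marks context pixels as those below the resulting threshold, mirroring the binary first context image 405; the bin count, the function names, and the use of NumPy are illustrative choices rather than part of the disclosure.

# Sketch: histogram-intersection threshold and binary context-area extraction.
import numpy as np

def threshold_from_histograms(context_pixels, buried_feature_pixels, bins=256):
    """Pick an intensity near the crossing of the context and buried-feature histograms."""
    lo = float(min(context_pixels.min(), buried_feature_pixels.min()))
    hi = float(max(context_pixels.max(), buried_feature_pixels.max()))
    edges = np.linspace(lo, hi, bins + 1)
    h_ctx, _ = np.histogram(context_pixels, bins=edges)
    h_buried, _ = np.histogram(buried_feature_pixels, bins=edges)
    # Restrict the search to the interval between the two histogram peaks, then take
    # the bin where the two counts are closest (the "intersection" of the histograms).
    left, right = sorted((int(np.argmax(h_ctx)), int(np.argmax(h_buried))))
    window = slice(left, right + 1)
    crossing = left + int(np.argmin(np.abs(h_ctx[window] - h_buried[window])))
    return 0.5 * (edges[crossing] + edges[crossing + 1])

def extract_context_mask(unit_cell, intensity_threshold):
    """Binary mask: True for context pixels (below the threshold), False for feature pixels."""
    return unit_cell < intensity_threshold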
[0045] Figure 5 is a block diagram of an exemplary system 500 for adjusting an initial overlay based on an estimated overlay determined using a context area of an image representative of a pattern, consistent with various embodiments. Figure 6 is a flow diagram of an exemplary method 600 for determining an estimated overlay using a context area of an image representative of a pattern, consistent with various embodiments. [0046] At process P605, a context area component 505 obtains an image representation of a pattern. The image representation may be an image 502 of the pattern having multiple features such as a hole 512 and a vertical bar 514. The method 600 (e.g., depicted as the right branch of Figure 5) may facilitate determination of an estimated overlay with respect to the features 512 and 514. The image 502 may include multiple holes and multiple vertical bars. In some embodiments, the image 502 may be similar to the third image 421 of Figure 4. The image 503 may be a SEM image or other metrology image of the pattern. [0047] At process P610, the context area component 505 generates the context image 507 showing the context area, such as context area 516. As described above, the context area may be an area of the image that does not contain any target patterns defined in the corresponding design layout. The context area component 505 may generate the context area 516 in a number of ways, for example, by thresholding the unit cell image 503 (e.g., as described at least with reference to Figure 4 above). In some embodiments, the context area component 505 may segment the image 502 into a number of smaller portions, where each portion (e.g., unit cell image 503) may be referred to as a unit cell, and the context area component 505 may generate a context area 516 corresponding to the unit cell image 503. [0048] At process P615, a KPI evaluation component 515 may determine a characteristic or KPI 615 of the context area 516. The KPI 615 may be a geometrical parameter of the context area 516 such as a height, a context y gap, a centroid, a major axis length, a minor axis length, an area size, a bounding box, an extent, or an eccentricity of the context area 516. In some embodiments, the KPI 615 may be a height 509 of the context area 516. In some embodiments, a KPI may be calculated from the pixel intensity values. For example, the centroid can be calculated as an intensity-weighted centroid. The context area can have pixels with varied levels (e.g., 1~n levels) of intensity. For example, if the context area pixels have just one gray level, the KPIs are calculated without considering the intensity level. If the context area pixels have two or more gray levels (e.g., n > 1), some of the geometry parameters are calculated with pixel intensity values as the weight. [0049] At process P620, the KPI evaluation component 515 may determine an estimated overlay of the features based on the KPI 615 of the context area 516. For example, the KPI evaluation component 515 obtains a KPI map that has a correlation of a KPI with overlay, such as the KPI map 450, and determines an estimated overlay 511 based on the height 509 of the context area 516.
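Processes P615 and P620 can be sketched as follows for the height KPI. Because a symmetric (e.g., Gaussian) height-overlay correlation maps two overlay values to the same height, the sketch simply returns the candidate overlay whose predicted height is closest to the measurement over a dense grid; resolving that sign ambiguity, and computing other KPIs such as centroid or eccentricity (available, for example, from region-property routines in common image processing libraries), are left out for brevity. All names, and the callable used to represent the KPI map, are illustrative assumptions.

# Sketch: height KPI of the context area and KPI-map lookup for the estimated overlay.
import numpy as np

def context_height_at_center(context_mask):
    """Height (in pixels) of the context area along the unit cell's center column."""
    center_column = context_mask[:, context_mask.shape[1] // 2]
    return int(np.count_nonzero(center_column))

def estimate_overlay(measured_height, kpi_curve, overlay_range, samples=2001):
    """Return the overlay in [-m, +m] whose predicted height best matches the measured KPI.

    `kpi_curve` maps an array of candidate overlays to predicted heights, e.g. the
    Gaussian fitted in the calibration sketch above or an interpolated lookup table.
    """
    m = float(overlay_range)
    grid = np.linspace(-m, m, samples)
    predicted = np.asarray(kpi_curve(grid), dtype=float)
    return float(grid[np.argmin(np.abs(predicted - measured_height))])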
Figure 8 shows examples of template matching process-based adjusting of an initial overlay using the estimated overlay, consistent with various embodiments. [0051] In some embodiments, the method 700 may be performed using the overlay adjustment component 520. At process P705, an overlay adjustment component 520 obtains an initial overlay determined using a first process, and an estimated overlay determined using a context area of an image representative of a pattern. In some embodiments, an initial overlay 504 may be obtained from a template matching component 510 that facilitates determination of the initial overlay 504 (e.g., depicted as left branch of Figure 5) based on a non-context area of the image using a template matching process. In some embodiments, the template matching component 510 processes the non- context area, e.g., the features of the pattern in the image, to determine a similarity score that is representative of a similarity between the features 512 and 514 and the corresponding templates of the features 512 and 514, and determines the initial overlay 504 based on the similarity score. In Figure 8, the similarity score map 802 illustrates the initial overlay 504 determined based on the similarity score determined using the template matching process. [0052] In some embodiments, the overlay adjustment component 520 may obtain the estimated overlay determined using the context area of the image of the pattern from the KPI evaluation component 515 (e.g., as described at least with reference to Figure 6 above). For example, the overlay adjustment component 520 may obtain the estimated overlay 511 determined using the context area 516 of the image 503 of the pattern. [0053] At process P710, the overlay adjustment component 520 compares the initial overlay with the estimated overlay and generates a comparison result 710. For example, the overlay adjustment component 520 compares the initial overlay 504 with the estimated overlay 511 and generates a difference between the two overlays as the comparison result 710. Consider that the initial overlay 504 is x units and the estimated overlay is y units, where y < x. In the example of Figure 8, the similarity score map 802 shows the estimated overlay 511 relative to the initial overlay 504. Confidential [0054] At a determination process P715, the overlay adjustment component 520 determines whether the comparison result satisfies a criterion. For example, the overlay adjustment component 520 determines whether a difference between the initial overlay 504 and the estimated overlay 511 (e.g., x- y) is within a specified threshold. If the difference is within the specified threshold, the overlay adjustment component 520 determines that the comparison result satisfies the criterion and outputs the initial overlay 504 as an overlay 715 of the features 512 and 514. In some embodiments, the overlay adjustment component 520 may also assign a confidence score to the overlay 715. Various methods may be used to determine the confidence score. In some embodiments, the confidence score may be determined at least based on the comparison result 710. [0055] Responsive to a determination that the comparison result does not satisfy the criterion (e.g., the difference is beyond the specified threshold), at process P720, the overlay adjustment component 520 determines parameters of the first process (e.g., template matching process) to be adjusted for adjusting the initial overlay 504. 
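The acceptance check of process P715 and one possible confidence score can be sketched as follows. The exponential form of the confidence score is an assumption chosen purely for illustration; the disclosure only requires that the score be based on the comparison result 710.

# Sketch: compare the initial and estimated overlays against a threshold (process P715).
import math

def check_overlay(initial_overlay, estimated_overlay, threshold):
    """Return (overlay, confidence, needs_refinement)."""
    difference = abs(initial_overlay - estimated_overlay)
    if difference <= threshold:
        # A smaller disagreement between the two processes yields a higher confidence.
        confidence = math.exp(-difference / max(threshold, 1e-9))
        return initial_overlay, confidence, False
    return None, 0.0, True   # hand off to processes P720/P725 for refinement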
[0055] Responsive to a determination that the comparison result does not satisfy the criterion (e.g., the difference is beyond the specified threshold), at process P720, the overlay adjustment component 520 determines parameters of the first process (e.g., the template matching process) to be adjusted for adjusting the initial overlay 504. In some embodiments, the overlay adjustment component 520 determines search parameters of an area in the similarity score map 802 to be searched by the template matching process to determine the adjusted overlay. For example, the overlay adjustment component 520 may determine the area 806 of the similarity score map 802 that is proximate the estimated overlay 511 (e.g., the area 806 of the similarity score map defined by "y – m" to "y + m" along the x-axis of the similarity score map, where m is a user-specified value). In some embodiments, the graph 804 shows a plot of the similarity score, where the y-axis is the similarity score, and the x-axis is the length of the image 503 in the x-direction. [0056] At process P725, the overlay adjustment component 520 (e.g., in coordination with the template matching component 510) may execute the template matching process based on the search parameters (e.g., coordinates of the area 806 determined in process P720) to determine the adjusted overlay 715. For example, the template matching component 510 may search the similarity score map 802 in the area 806 to determine a local maximum (e.g., the position within the area 806 where the similarity score is the highest), and then revise the initial overlay 504 to the adjusted overlay 517 based on the coordinates of the local maximum. The overlay adjustment component 520 may output the adjusted overlay 517 as the overlay 715 of the features 512 and 514. [0057] In some embodiments, the methods 600 and 700 of Figures 6 and 7 may be performed for each unit cell image of the image 502 of the pattern independently to determine the overlay for each unit cell image. One of the advantages of determining the overlay value for each unit cell image independently is that the method is tolerant to variations (e.g., in image intensity) across the image, which eliminates the need to normalize the image, minimizes the consumption of time and computing resources, and thereby makes the defect detection faster. Another advantage is the ability to assign a confidence level to each unit cell overlay value. For example, when the unit cell overlay determined by the first process is similar to the overlay determined using the context area, a higher confidence level is assigned to this unit cell overlay. That is, the lower the difference between the two overlay values determined using different processes for the unit cell, the higher the confidence value of the determined overlay value. After assigning the confidence level to each unit cell's overlay value, the confidence level may be used as a weight to report the weighted mean overlay over all unit cells within one image. This approach may further improve overlay result accuracy and robustness. [0058] While the foregoing paragraphs describe using the method of determining an estimated overlay of the features using the context area (e.g., method 600 of Figure 6) in combination with (e.g., as a supplemental process to) another process (e.g., a template matching process) to adjust an initial overlay determined using the other process, it should be noted that the method 600 may be used as a standalone method to determine the overlay of the features of a pattern. That is, the estimated overlay measured using the context area may be used as an independent overlay measurement without the need for it being used to adjust the initial overlay.
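Processes P720 and P725, together with the confidence-weighted averaging over unit cells described above, can be sketched as follows. The similarity profile is assumed to be a one-dimensional array of template-matching scores indexed by candidate position, with the estimated overlay already converted to an index; these representational choices, like the function names, are illustrative.

# Sketch: restricted-search refinement around the estimated overlay, and a
# confidence-weighted mean of the per-unit-cell overlay values.
import numpy as np

def refine_overlay(similarity_profile, estimated_index, m):
    """Index of the similarity-score maximum within [estimated_index - m, estimated_index + m]."""
    lo = max(0, estimated_index - m)
    hi = min(len(similarity_profile), estimated_index + m + 1)
    return lo + int(np.argmax(similarity_profile[lo:hi]))

def weighted_mean_overlay(unit_cell_overlays, confidences):
    """Confidence-weighted mean overlay over all unit cells of one image."""
    values = np.asarray(unit_cell_overlays, dtype=float)
    weights = np.asarray(confidences, dtype=float)
    return float(np.average(values, weights=weights))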
[0059] Further, in some embodiments, the method 600 may be supplemented by another process. For example, an overlay determined based on the context area (e.g., using method 600) may be used as an initial overlay, which may further be adjusted using an overlay determined using another method (e.g., a template matching process). [0060] In some embodiments, the defect detection process helps in improving a patterning process by minimizing defects in patterning a target layout on a substrate. For example, based on the determined overlay error, a parameter of a patterning process or a lithographic apparatus used to print a pattern on a substrate may be adjusted to minimize defects in patterning a target layout on the substrate. After adjusting the parameter, the patterning process may be performed using the lithographic apparatus to print patterns corresponding to the target layout on the substrate. [0061] Figure 9 is a block diagram that illustrates a computer system 900 which can assist in implementing the various methods and systems disclosed herein. The computer system 900 may be used to implement any of the entities, components, modules, or services depicted in the examples of the figures (and any other entities, components, modules, or services described in this specification). The computer system 900 may be programmed to execute computer program instructions to perform functions, methods, flows, or services (e.g., of any of the entities, components, or modules) described herein. The computer system 900 may be programmed to execute computer program instructions by at least one of software, hardware, or firmware. [0062] Computer system 900 includes a bus 902 or other communication mechanism for communicating information, and a processor 904 (or multiple processors 904 and 905) coupled with bus 902 for processing information. Computer system 900 also includes a main memory 906, such as a random access memory (RAM) or other dynamic storage device, coupled to bus 902 for storing information and instructions to be executed by processor 904. Main memory 906 also may be used for storing temporary variables or other intermediate information during execution of instructions to be executed by processor 904. Computer system 900 further includes a read only memory (ROM) 908 or other static storage device coupled to bus 902 for storing static information and instructions for processor 904. A storage device 910, such as a magnetic disk or optical disk, is provided and coupled to bus 902 for storing information and instructions. [0063] Computer system 900 may be coupled via bus 902 to a display 912, such as a cathode ray tube (CRT) or flat panel or touch panel display, for displaying information to a computer user. An input device 914, including alphanumeric and other keys, is coupled to bus 902 for communicating information and command selections to processor 904. Another type of user input device is cursor control 916, such as a mouse, a trackball, or cursor direction keys for communicating direction information and command selections to processor 904 and for controlling cursor movement on display 912. This input device typically has two degrees of freedom in two axes, a first axis (e.g., x) and a second axis (e.g., y), that allows the device to specify positions in a plane. A touch panel (screen) display may also be used as an input device. [0064] According to one embodiment, portions of one or more methods described herein may be performed by computer system 900 in response to processor 904 executing one or more sequences of one or more instructions contained in main memory 906.
Such instructions may be read into main memory 906 from another computer-readable medium, such as storage device 910. Execution of the sequences of instructions contained in main memory 906 causes processor 904 to perform the process steps described herein. One or more processors in a multi-processing arrangement may also be employed to execute the sequences of instructions contained in main memory 906. In an alternative embodiment, hard-wired circuitry may be used in place of or in combination with software instructions. Thus, the description herein is not limited to any specific combination of hardware circuitry and software. [0065] The term "computer-readable medium" as used herein refers to any medium that participates in providing instructions to processor 904 for execution. Such a medium may take many forms, including, but not limited to, non-volatile media, volatile media, and transmission media. Non-volatile media include, for example, optical or magnetic disks, such as storage device 910. Volatile media include dynamic memory, such as main memory 906. Transmission media include coaxial cables, copper wire and fiber optics, including the wires that comprise bus 902. Transmission media can also take the form of acoustic or light waves, such as those generated during radio frequency (RF) and infrared (IR) data communications. Common forms of computer-readable media include, for example, a floppy disk, a flexible disk, hard disk, magnetic tape, any other magnetic medium, a CD-ROM, DVD, any other optical medium, punch cards, paper tape, any other physical medium with patterns of holes, a RAM, a PROM, an EPROM, a FLASH-EPROM, any other memory chip or cartridge, a carrier wave as described hereinafter, or any other medium from which a computer can read. [0066] Various forms of computer-readable media may be involved in carrying one or more sequences of one or more instructions to processor 904 for execution. For example, the instructions may initially be borne on a magnetic disk of a remote computer. The remote computer can load the instructions into its dynamic memory and send the instructions over a telephone line using a modem. A modem local to computer system 900 can receive the data on the telephone line and use an infrared transmitter to convert the data to an infrared signal. An infrared detector coupled to bus 902 can receive the data carried in the infrared signal and place the data on bus 902. Bus 902 carries the data to main memory 906, from which processor 904 retrieves and executes the instructions. The instructions received by main memory 906 may optionally be stored on storage device 910 either before or after execution by processor 904. [0067] Computer system 900 also preferably includes a communication interface 918 coupled to bus 902. Communication interface 918 provides a two-way data communication coupling to a network link 920 that is connected to a local network 922. For example, communication interface 918 may be an integrated services digital network (ISDN) card or a modem to provide a data communication connection to a corresponding type of telephone line. As another example, communication interface 918 may be a local area network (LAN) card to provide a data communication connection to a compatible LAN. Wireless links may also be implemented. In any such implementation, communication interface 918 sends and receives electrical, electromagnetic, or optical signals that carry digital data streams representing various types of information.
[0068] Network link 920 typically provides data communication through one or more networks to other data devices. For example, network link 920 may provide a connection through local network 922 to a host computer 924 or to data equipment operated by an Internet Service Provider (ISP) 926. ISP 926 in turn provides data communication services through the worldwide packet data communication network, now commonly referred to as the "Internet" 928. Local network 922 and Internet 928 both use electrical, electromagnetic, or optical signals that carry digital data streams. The signals through the various networks and the signals on network link 920 and through communication interface 918, which carry the digital data to and from computer system 900, are exemplary forms of carrier waves transporting the information. [0069] Computer system 900 can send messages and receive data, including program code, through the network(s), network link 920, and communication interface 918. In the Internet example, a server 930 might transmit a requested code for an application program through Internet 928, ISP 926, local network 922 and communication interface 918. One such downloaded application may provide for the illumination optimization of the embodiment, for example. The received code may be executed by processor 904 as it is received, or stored in storage device 910, or other non-volatile storage for later execution. In this manner, computer system 900 may obtain application code in the form of a carrier wave. [0070] While the concepts disclosed herein may be used for imaging on a substrate such as a silicon wafer, it shall be understood that the disclosed concepts may be used with any type of lithographic imaging system, e.g., those used for imaging on substrates other than silicon wafers. [0071] The terms "optimizing" and "optimization" as used herein refer to or mean adjusting a patterning apparatus (e.g., a lithography apparatus), a patterning process, etc. such that results and/or processes have more desirable characteristics, such as higher accuracy of projection of a design pattern on a substrate, a larger process window, etc. Thus, the terms "optimizing" and "optimization" as used herein refer to or mean a process that identifies one or more values for one or more parameters that provide an improvement, e.g., a local optimum, in at least one relevant metric, compared to an initial set of one or more values for those one or more parameters. "Optimum" and other related terms should be construed accordingly. In an embodiment, optimization steps can be applied iteratively to provide further improvements in one or more metrics. [0072] Aspects of the invention can be implemented in any convenient form. For example, an embodiment may be implemented by one or more appropriate computer programs which may be carried on an appropriate carrier medium which may be a tangible carrier medium (e.g., a disk) or an intangible carrier medium (e.g., a communications signal). Embodiments of the invention may be implemented using suitable apparatus which may specifically take the form of a programmable computer running a computer program arranged to implement a method as described herein. Thus, embodiments of the disclosure may be implemented in hardware, firmware, software, or any combination thereof. Embodiments of the disclosure may also be implemented as instructions stored on a machine-readable medium, which may be read and executed by one or more processors.
A machine-readable medium may include any mechanism for storing or transmitting information in a form readable by a machine (e.g., a computing device). For example, a machine-readable medium may include read only memory (ROM); random access memory (RAM); magnetic disk storage media; optical storage media; flash memory devices; electrical, optical, acoustical, or other forms of propagated signals (e.g., carrier waves, infrared signals, digital signals, etc.), and others. Further, firmware, software, routines, and instructions may be described herein as performing certain actions. However, it should be appreciated that such descriptions are merely for convenience and that such actions in fact result from computing devices, processors, controllers, or other devices executing the firmware, software, routines, instructions, etc. [0073] In block diagrams, illustrated components are depicted as discrete functional blocks, but embodiments are not limited to systems in which the functionality described herein is organized as illustrated. The functionality provided by each of the components may be provided by software or hardware modules that are differently organized than is presently depicted; for example, such software or hardware may be intermingled, conjoined, replicated, broken up, distributed (e.g., within a data center or geographically), or otherwise differently organized. The functionality described herein may be provided by one or more processors of one or more computers executing code stored on a tangible, non-transitory, machine-readable medium. In some cases, third party content delivery networks may host some or all of the information conveyed over networks, in which case, to the extent information (e.g., content) is said to be supplied or otherwise provided, the information may be provided by sending instructions to retrieve that information from a content delivery network. [0074] Unless specifically stated otherwise, as apparent from the discussion, it is appreciated that throughout this specification discussions utilizing terms such as "processing," "computing," "calculating," "determining" or the like refer to actions or processes of a specific apparatus, such as a special purpose computer or a similar special purpose electronic processing/computing device. [0075] Embodiments of the present disclosure can be further described by the following statements.
1. A method for determining overlay, the method comprising: obtaining a representation of an image having a plurality of features; determining a context area of the image, wherein the context area corresponds to an area of a corresponding design layout that does not contain target features; determining a characteristic of the context area; and determining an overlay of the plurality of features based on the characteristic.
2. The method of clause 1, wherein the context area is located between the features.
3. The method of clause 1, wherein the characteristic is a geometrical parameter of the context area.
4. The method of clause 3, wherein the geometrical parameter includes a height of the context area.
5. The method of clause 3, wherein the geometrical parameter includes at least one of a context y gap, centroid, major axis length, minor axis length, area size, bounding box, or eccentricity of the context area.
6. The method of clause 3, wherein the context area includes pixels of different intensity values, and wherein the geometrical parameter is determined based on the intensity values.
7. The method of clause 1, wherein determining the overlay includes: obtaining a correlation between the characteristic and an overlay of a set of features; and determining the overlay based on the correlation.
8. The method of clause 7, wherein obtaining the correlation includes: obtaining multiple images, each image having a set of features; determining, using each image, a specified overlay for the set of features and a specified context area of the corresponding image; and correlating, based on each image, a characteristic of the specified context area with the specified overlay of the set of features in that image.
9. The method of clause 7, wherein the correlation is a Gaussian distribution.
10. The method of clause 7, wherein the correlation is generated as a characteristic map, a lookup table, a function, or a model.
11. The method of clause 1 further comprising: comparing the overlay with a first overlay of the plurality of features, wherein the first overlay is determined using a first process based on a non-context area of the image; and adjusting the first process to determine an adjusted overlay based on the overlay.
12. The method of clause 11, wherein adjusting the first process includes: adjusting the first process based on a determination that a difference between the first overlay and the overlay exceeds a specified threshold.
13. The method of clause 11, wherein the first process includes a template matching process.
14. The method of clause 11, wherein the first process includes an edge searching process that generates a contour of the plurality of features.
15. The method of clause 11, wherein comparing the overlay with the first overlay includes assigning a confidence level to the overlay based on a result of comparison between the overlay and the first overlay.
16. The method of clause 15, wherein assigning the confidence level includes assigning a higher confidence level to the overlay the lesser the difference between the overlay and the first overlay.
17. The method of clause 1, wherein determining the context area includes: determining the context area by thresholding the image.
18. The method of clause 1, wherein determining the context area includes: generating a first histogram that is representative of a first set of the plurality of features in a first layer of the design layout; generating a second histogram that is representative of a second set of the plurality of features in a second layer of the design layout, the second layer being below the first layer; and generating a third histogram that is representative of an area of the image that does not contain target features.
19. The method of clause 18 further comprising: selecting an intensity level in an intersection of the second histogram and the third histogram as a threshold for generating the context area; and thresholding the image using the threshold to determine the context area.
20. The method of clause 1, wherein the image is a scanning electron microscope (SEM) image of the plurality of features printed on a substrate.
21. A method for determining overlay, the method comprising: obtaining a representation of an image having a plurality of features; obtaining a first overlay of the plurality of features by executing a first process; obtaining a second overlay of the plurality of features, wherein the second overlay is determined based on a context area of the image that does not contain target features defined in a corresponding design layout; and adjusting parameters of the first process to determine an adjusted first overlay based on the second overlay.
22. The method of clause 21, wherein obtaining the second overlay includes: determining a characteristic of the context area; and determining the second overlay based on the characteristic.
23. The method of clause 22, wherein the characteristic includes a height of the context area.
24. The method of clause 22, wherein the characteristic includes at least one of a context y gap, centroid, major axis length, minor axis length, area size, bounding box, or eccentricity of the context area.
25. The method of clause 22, wherein determining the second overlay includes: obtaining a correlation between the characteristic and an overlay of a set of features; and determining the second overlay based on the correlation.
26. The method of clause 25, wherein obtaining the correlation includes: obtaining multiple images, each image having a set of features; determining, using each image, a specified overlay for the set of features and a specified context area of the corresponding image; and correlating, based on each image, a characteristic of the specified context area with the specified overlay of the set of features in that image.
27. The method of clause 26, wherein the correlation is a Gaussian distribution.
28. The method of clause 21, wherein the first overlay is determined using the first process based on a non-context area of the image.
29. The method of clause 21 further comprising: comparing the second overlay with the first overlay of the plurality of features; and adjusting the first process to determine an adjusted overlay based on the second overlay.
30. The method of clause 29, wherein adjusting the first process includes: adjusting the first process based on a determination that a difference between the first overlay and the second overlay exceeds a specified threshold.
31. The method of clause 21, wherein the context area is determined by thresholding the image.
32. The method of clause 31, wherein thresholding the image includes: generating a first histogram that is representative of a first set of the plurality of features in a first layer of the design layout; generating a second histogram that is representative of a second set of the plurality of features in a second layer of the design layout, the second layer being below the first layer; and generating a third histogram that is representative of an area of the image that does not contain target features.
33. The method of clause 32 further comprising: selecting an intensity level in an intersection of the second histogram and the third histogram as a threshold for generating the context area; and thresholding the image using the threshold to determine the context area.
34. The method of clause 21, wherein the image is a SEM image of the plurality of features printed on a substrate.
35. An apparatus, the apparatus comprising: a memory storing a set of instructions; and a processor configured to execute the set of instructions to cause the apparatus to perform a method of any of the above clauses.
36. A non-transitory computer-readable medium having instructions recorded thereon, the instructions when executed by a computer implementing the method of any of the above clauses.
[0076] The reader should appreciate that the present application describes several inventions. Rather than separating those inventions into multiple isolated patent applications, these inventions have been grouped into a single document because their related subject matter lends itself to economies in the application process. But the distinct advantages and aspects of such inventions should not be conflated. In some cases, embodiments address all of the deficiencies noted herein, but it should be understood that the inventions are independently useful, and some embodiments address only a subset of such problems or offer other, unmentioned benefits that will be apparent to those of skill in the art reviewing the present disclosure. Due to cost constraints, some inventions disclosed herein may not be presently claimed and may be claimed in later filings, such as continuation applications or by amending the present claims. Similarly, due to space constraints, neither the Abstract nor the Summary sections of the present document should be taken as containing a comprehensive listing of all such inventions or all aspects of such inventions. [0077] It should be understood that the description and the drawings are not intended to limit the present disclosure to the particular form disclosed, but to the contrary, the intention is to cover all modifications, equivalents, and alternatives falling within the spirit and scope of the inventions as defined by the appended claims. [0078] Modifications and alternative embodiments of various aspects of the inventions will be apparent to those skilled in the art in view of this description. Accordingly, this description and the drawings are to be construed as illustrative only and are for the purpose of teaching those skilled in the art the general manner of carrying out the inventions. It is to be understood that the forms of the inventions shown and described herein are to be taken as examples of embodiments. Elements and materials may be substituted for those illustrated and described herein, parts and processes may be reversed or omitted, certain features may be utilized independently, and embodiments or features of embodiments may be combined, all as would be apparent to one skilled in the art after having the benefit of this description. Changes may be made in the elements described herein without departing from the spirit and scope of the invention as described in the following claims. Headings used herein are for organizational purposes only and are not meant to be used to limit the scope of the description. [0079] As used herein, unless specifically stated otherwise, the term "or" encompasses all possible combinations, except where infeasible. For example, if it is stated that a component includes A or B, then, unless specifically stated otherwise or infeasible, the component may include A, or B, or A and B. As a second example, if it is stated that a component includes A, B, or C, then, unless specifically stated otherwise or infeasible, the component may include A, or B, or C, or A and B, or A and C, or B and C, or A and B and C.
Expressions such as "at least one of" do not necessarily modify an entirety of a following list and do not necessarily modify each member of the list, such that "at least one of A, B, and C" should be understood as including only one of A, only one of B, only one of C, or any combination of A, B, and C. The phrase "one of A and B" or "any one of A and B" shall be interpreted in the broadest sense to include one of A, or one of B. [0080] The descriptions herein are intended to be illustrative, not limiting. Thus, it will be apparent to one skilled in the art that modifications may be made as described without departing from the scope of the claims set out below.

Claims

CLAIMS
1. A non-transitory computer-readable medium having instructions recorded thereon, the instructions when executed by a computer implementing a method for determining overlay, the method comprising: obtaining a representation of an image having a plurality of features; determining a context area of the image, wherein the context area corresponds to an area of a corresponding design layout that does not contain target features; determining a characteristic of the context area; and determining an overlay of the plurality of features based on the characteristic.
2. The medium of claim 1, wherein the context area is located between the plurality of features, and wherein the characteristic is a geometrical parameter of the context area.
3. The medium of claim 2, wherein the geometrical parameter includes at least one of height, context y gap, centroid, major axis length, minor axis length, area size, bounding box, or eccentricity of the context area.
4. The medium of claim 2, wherein the context area includes pixels of different intensity values, and wherein the geometrical parameter is determined based on the intensity values.
5. The medium of claim 1, wherein determining the overlay includes: obtaining a correlation between the characteristic and an overlay of a set of features; and determining the overlay based on the correlation.
6. The medium of claim 5, wherein obtaining the correlation includes: obtaining multiple images, each image having a set of features; determining, using each image, a specified overlay for the set of features and a specified context area of the corresponding image; and correlating, based on each image, a characteristic of the specified context area with the specified overlay of the set of features in that image.
7. The medium of claim 5, wherein the correlation is generated as a characteristic map, a lookup table, a function, or a model.
8. The medium of claim 1, wherein the method further comprises: comparing the overlay with a first overlay of the plurality of features, wherein the first overlay is determined using a first process based on a non-context area of the image; and adjusting the first process to determine an adjusted overlay based on the overlay.
9. The medium of claim 8, wherein adjusting the first process includes: adjusting the first process based on a determination that a difference between the first overlay and the overlay exceeds a specified threshold.
10. The medium of claim 8, wherein the first process includes at least one of a template matching process, or an edge searching process that generates a contour of the plurality of features.
11. The medium of claim 8, wherein comparing the overlay with the first overlay includes assigning a confidence level to the overlay based on a result of comparison between the overlay and the first overlay.
12. The medium of claim 1, wherein determining the context area includes: determining the context area by thresholding pixel values of the image.
13. The medium of claim 1, wherein determining the context area includes: generating a first histogram that is representative of a first set of the plurality of features in a first layer of the design layout; generating a second histogram that is representative of a second set of the plurality of features in a second layer of the design layout, the second layer being below the first layer; and generating a third histogram that is representative of an area of the image that does not contain target features.
The medium of claim 13, wherein the method further comprises: selecting an intensity level in an intersection of the second histogram and the third histogram as a threshold for generating the context area; and thresholding the image using the threshold to determine the context area. 15. The medium of claim 1, wherein the image is a scanning electron microscope (SEM) image of the plurality of features printed on a substrate. Confidential
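The flow recited in the claims above (thresholding a context area, measuring its geometry, correlating that geometry with overlay, and cross-checking an overlay obtained by another process such as template matching) can be illustrated with a short Python sketch. This is a minimal illustration under stated assumptions, not the implementation disclosed in the application: all function names, the histogram-intersection heuristic, the linear correlation model, and the numeric threshold are assumptions made for the example, and NumPy/scikit-image are used only because they are common choices for this kind of image processing.

    # Illustrative sketch only -- not code from the application.
    # Assumes a grayscale SEM image as a 2-D NumPy array with 8-bit intensities,
    # plus rough masks for the lower-layer features and a known context region.
    import numpy as np
    from skimage import measure


    def pick_context_threshold(image, lower_layer_mask, context_mask, bins=256):
        """Select an intensity where the lower-layer and context histograms
        intersect (cf. claims 13-14): the bin where the two normalized
        histograms are closest."""
        hist_lower, edges = np.histogram(image[lower_layer_mask], bins=bins,
                                         range=(0, 255), density=True)
        hist_context, _ = np.histogram(image[context_mask], bins=bins,
                                       range=(0, 255), density=True)
        idx = int(np.argmin(np.abs(hist_lower - hist_context)))
        return 0.5 * (edges[idx] + edges[idx + 1])


    def context_characteristics(image, threshold):
        """Threshold the image (cf. claim 12) and return geometrical parameters
        of the largest context region (cf. claim 3)."""
        context = image < threshold  # assumption: context pixels are darker than features
        labels = measure.label(context)
        regions = measure.regionprops(labels, intensity_image=image)
        largest = max(regions, key=lambda r: r.area)
        return {
            "centroid": largest.centroid,
            "area": largest.area,
            "major_axis_length": largest.major_axis_length,
            "minor_axis_length": largest.minor_axis_length,
            "bounding_box": largest.bbox,
            "eccentricity": largest.eccentricity,
        }


    def fit_characteristic_to_overlay(characteristics, overlays):
        """Correlate a scalar context characteristic with measured overlay over a
        set of calibration images (cf. claims 5-6); a simple linear fit stands in
        for the lookup table, function, or model of claim 7."""
        slope, intercept = np.polyfit(np.asarray(characteristics, dtype=float),
                                      np.asarray(overlays, dtype=float), deg=1)
        return slope, intercept


    def check_first_overlay(first_overlay, context_overlay, threshold=1.0):
        """Compare an overlay from a first process (e.g. template matching) with
        the context-based estimate, flag it for adjustment when the difference
        exceeds a specified threshold (cf. claims 8-9), and assign a simple
        confidence level (cf. claim 11)."""
        difference = abs(first_overlay - context_overlay)
        needs_adjustment = difference > threshold
        confidence = max(0.0, 1.0 - difference / threshold)
        return needs_adjustment, confidence

In such a sketch, the characteristic fed to the fit would be whichever geometrical parameter correlates best with overlay for the layout at hand, and the correlation would be calibrated on images with known overlay before being applied at inspection time.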
PCT/EP2024/056110 2023-04-06 2024-03-07 Method and system for context aware overlay adjustment WO2024208534A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US202363457754P 2023-04-06 2023-04-06
US63/457,754 2023-04-06

Publications (1)

Publication Number Publication Date
WO2024208534A1 (en) 2024-10-10

Family

ID=90364138

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2024/056110 WO2024208534A1 (en) 2023-04-06 2024-03-07 Method and system for context aware overlay adjustment

Country Status (1)

Country Link
WO (1) WO2024208534A1 (en)

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2021043936A1 (en) * 2019-09-05 2021-03-11 Asml Netherlands B.V. Method for determining defectiveness of pattern based on after development image

Similar Documents

Publication Publication Date Title
US7824828B2 (en) Method and system for improvement of dose correction for particle beam writers
US10096100B2 (en) Inspection device, inspection method, and image processing program
US11842420B2 (en) Method and apparatus for adaptive alignment
US20230401694A1 (en) Active learning-based defect location identification
US20240069450A1 (en) Training machine learning models based on partial datasets for defect location identification
TWI844777B (en) Image alignment setup for specimens with intra- and inter-specimen variations using unsupervised learning and adaptive database generation methods
US20240212131A1 (en) Improved charged particle image inspection
WO2024208534A1 (en) Method and system for context aware overlay adjustment
US11650576B2 (en) Knowledge recommendation for defect review
US20250044710A1 (en) Overlay metrology based on template matching with adaptive weighting
US20240037890A1 (en) Topology-based image rendering in charged-particle beam inspection systems
WO2024170211A1 (en) Method and system for identifying a center of a pattern using automatic thresholding
WO2024083437A1 (en) Defect map based d2d alignment of images for machine learning training data preparation
WO2024088665A1 (en) Training a machine learning model to predict images representative of defects on a substrate
WO2024099686A1 (en) Systems, methods, and software for overlay model building and application
US20240212317A1 (en) Hierarchical clustering of fourier transform based layout patterns
WO2024213339A1 (en) Method for efficient dynamic sampling plan generation and accurate probe die loss projection
WO2025036991A1 (en) Systems and methods for hybrid sampling plan generation and accurate die loss projection
WO2024068426A1 (en) Scanning electron microscopy (sem) back-scattering electron (bse) focused target and method
CN118401901A (en) Overlap measurement based on template matching with adaptive weighting

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 24710709

Country of ref document: EP

Kind code of ref document: A1