SYSTEM AND METHOD FOR TARGET CENTERING DETECTION IN OVERLAY METROLOGY
20260064014 · 2026-03-05
Inventors
- Ofer Manos (Migdal HaEmek, IL)
- Sveta Grechin (Migdal HaEmek, IL)
- Ran Trifon (Migdal HaEmek, IL)
- Yang Yu (Migdal HaEmek, IL)
- Mohamed Hegaze (Migdal HaEmek, IL)
- Avner Safrani (Misgav, IL)
- Ohad Bachar (Timrat, IL)
CPC classification
- G06V10/774 (Physics)
- G03F7/70525 (Physics)
- G03F7/706837 (Physics)
- G03F7/706845 (Physics)
- G03F7/70633 (Physics)
- G03F7/70775 (Physics)
International classification
- G03F7/00 (Physics)
Abstract
A system for target centering detection may be configured to receive one or more acquisition images of a sample from an overlay metrology sub-system and determine, using a machine learning-based centering model, one or more stage correctables based on the received one or more acquisition images. The system may be configured to cause a sample stage of the overlay metrology sub-system to adjust a stage position based on the determined one or more stage correctables and receive one or more measurement images of the sample from the overlay metrology sub-system based on the adjusted stage position of the sample stage. The system may then be configured to determine one or more overlay measurements based on the received one or more measurement images.
Claims
1. A system, the system comprising: a controller including one or more processors configured to execute a set of program instructions stored in memory, the set of program instructions configured to cause the one or more processors to: receive one or more acquisition images of a sample from an overlay metrology sub-system; determine, using a machine learning-based centering model, one or more stage correctables based on the received one or more acquisition images; determine, using the machine learning-based centering model, a model output confidence score, wherein the model output confidence score indicates a level of confidence associated with a respective stage correctable determined using the machine learning-based centering model; generate one or more control signals configured to cause a sample stage of the overlay metrology sub-system to adjust a stage position based on the determined one or more stage correctables; receive one or more measurement images of the sample from the overlay metrology sub-system, wherein the one or more measurement images are acquired by the overlay metrology sub-system based on the adjusted stage position of the sample stage; and determine one or more overlay measurements based on the received one or more measurement images.
2. The system of claim 1, wherein the set of program instructions are further configured to cause the one or more processors to: receive a plurality of training images, wherein the plurality of training images include a plurality of through-focus images; and generate the machine learning-based centering model based on the received plurality of training images.
3. The system of claim 2, wherein the generated machine learning-based centering model is stored in the memory as a measurement recipe of the overlay metrology sub-system.
4. The system of claim 2, wherein each through-focus image of the plurality of through-focus images is labeled with a corresponding offset based on a best contrast focus position.
5. The system of claim 2, wherein the machine learning-based centering model is trained to perform target centering detection based on binary classification of one or more image regions of the plurality of through-focus images.
6. The system of claim 5, wherein a detector part of the machine learning-based centering model is trained based on the one or more image regions of the plurality of through-focus images determined based on the binary classification.
7. The system of claim 1, wherein the one or more acquisition images include one or more defocused images.
8. The system of claim 1, wherein the set of program instructions are further configured to cause the one or more processors to: provide the determined model output confidence score to a machine learning-based focus model; and simultaneously adjust a focus of the overlay metrology sub-system while adjusting the stage position of the overlay metrology sub-system.
9. The system of claim 1, wherein the set of program instructions are further configured to cause the one or more processors to: compare the determined model output confidence score to a predetermined threshold.
10. The system of claim 9, wherein the set of program instructions are further configured to cause the one or more processors to: upon determining the determined model output confidence score is below the predetermined threshold, adjust one or more system parameters of the overlay metrology sub-system; and direct the overlay metrology sub-system to capture one or more additional acquisition images based on the adjusted one or more system parameters.
11. A system, the system comprising: an overlay metrology sub-system configured to acquire one or more images of a sample; and a controller communicatively coupled to the overlay metrology sub-system, the controller including one or more processors configured to execute a set of program instructions stored in memory, the set of program instructions configured to cause the one or more processors to: receive one or more acquisition images of the sample from the overlay metrology sub-system; determine, using a machine learning-based centering model, one or more stage correctables based on the received one or more acquisition images; determine, using the machine learning-based centering model, a model output confidence score, wherein the model output confidence score indicates a level of confidence associated with a respective stage correctable determined using the machine learning-based centering model; generate one or more control signals configured to cause a sample stage of the overlay metrology sub-system to adjust a stage position based on the determined one or more stage correctables; receive one or more measurement images of the sample from the overlay metrology sub-system, wherein the one or more measurement images are acquired by the overlay metrology sub-system based on the adjusted stage position of the sample stage; and determine one or more overlay measurements based on the received one or more measurement images.
12. The system of claim 11, wherein the set of program instructions are further configured to cause the one or more processors to: receive a plurality of training images, wherein the plurality of training images include a plurality of through-focus images; and generate the machine learning-based centering model based on the received plurality of training images.
13. The system of claim 12, wherein the generated machine learning-based centering model is stored in the memory as a measurement recipe of the overlay metrology sub-system.
14. The system of claim 12, wherein each through-focus image of the plurality of through-focus images is labeled with a corresponding offset based on a best contrast focus position.
15. The system of claim 12, wherein the machine learning-based centering model is trained to perform target centering detection based on binary classification of one or more image regions of the plurality of through-focus images.
16. The system of claim 15, wherein a detector part of the machine learning-based centering model is trained based on the one or more image regions of the plurality of through-focus images determined based on the binary classification.
17. The system of claim 11, wherein the one or more acquisition images include one or more defocused images.
18. The system of claim 11, wherein the set of program instructions are further configured to cause the one or more processors to: provide the determined model output confidence score to a machine learning-based focus model; and simultaneously adjust a focus of the overlay metrology sub-system while adjusting the stage position of the overlay metrology sub-system.
19. The system of claim 11, wherein the set of program instructions are further configured to cause the one or more processors to: compare the determined model output confidence score to a predetermined threshold.
20. The system of claim 19, wherein the set of program instructions are further configured to cause the one or more processors to: upon determining the determined model output confidence score is below the predetermined threshold, adjust one or more system parameters of the overlay metrology sub-system; and direct the overlay metrology sub-system to capture one or more additional acquisition images.
21. The system of claim 11, wherein the overlay metrology sub-system comprises an image-based overlay metrology sub-system.
22. A method comprising: receiving one or more acquisition images of a sample from an overlay metrology sub-system; determining, using a machine learning-based centering model, one or more stage correctables based on the received one or more acquisition images; determining, using the machine learning-based centering model, a model output confidence score, wherein the model output confidence score indicates a level of confidence associated with a respective stage correctable determined using the machine learning-based centering model; generating one or more control signals configured to cause a sample stage of the overlay metrology sub-system to adjust a stage position based on the determined one or more stage correctables; receiving one or more measurement images of the sample from the overlay metrology sub-system, wherein the one or more measurement images are acquired by the overlay metrology sub-system based on the adjusted stage position of the sample stage; and determining one or more overlay measurements based on the received one or more measurement images.
Description
BRIEF DESCRIPTION OF DRAWINGS
[0010] The numerous advantages of the disclosure may be better understood by those skilled in the art by reference to the accompanying figures in which:
DETAILED DESCRIPTION
[0019] Reference will now be made in detail to the subject matter disclosed, which is illustrated in the accompanying drawings. The present disclosure has been particularly shown and described with respect to certain embodiments and specific features thereof. The embodiments set forth herein are taken to be illustrative rather than limiting. It should be readily apparent to those of ordinary skill in the art that various changes and modifications in form and detail may be made without departing from the spirit and scope of the disclosure.
[0020] Embodiments of the present disclosure are directed to a system and method for target centering detection in overlay metrology. For example, the system and method may use a machine learning-based centering model configured to generate one or more stage correctables (e.g., Δx, Δy, and the like) and adjust a stage of the overlay metrology sub-system accordingly. In this regard, the overlay metrology target is centered and in-focus when the measurement image is captured by the overlay metrology sub-system. Additionally, the machine learning-based centering model may be configured to output a model confidence score.
[0021] It is contemplated herein that the machine learning-based centering model of the present disclosure may eliminate the second-move z-focus step required by existing techniques. For example, the system and method of the present disclosure may perform correction of the x-, y-, and z-stages simultaneously using the on-tool processor, where the measurement image is captured following such correction. In this regard, the move-acquire-measure (MAM) time is reduced (e.g., by more than 10 ms) and the accuracy of the overlay measurement is improved since each site is measured at the best contrast focus position.
[0024] The one or more processors 106 of controller 104 may be configured to execute a target centering model 111 configured to perform target centering detection of the one or more overlay targets 109 on the sample 110. For example, the target centering model 111 may be stored in memory 108. It is contemplated herein that the target centering model 111 may include any type of machine learning algorithm/classifier and/or deep learning technique or classifier known in the art including, but not limited to, a conditional generative adversarial network (CGAN), a convolutional neural network (CNN) (e.g., GoogLeNet, AlexNet, and the like), an ensemble learning classifier, a random forest classifier, an artificial neural network (ANN), and the like.
[0025] In embodiments, the one or more processors 106 may be configured to execute a set of program instructions maintained in the memory 108. For example, the one or more processors 106 may be configured to receive one or more training images 115 of the sample 110. By way of another example, the one or more processors 106 may be configured to generate the target centering model 111 based on the received one or more training images 115 of the sample 110. By way of another example, the one or more processors 106 may be configured to receive one or more acquisition images 117 of the sample 110 from the overlay metrology sub-system 102. By way of another example, the one or more processors 106 may be configured to determine, using the target centering model 111, one or more stage correctables based on the received one or more acquisition images 117. By way of another example, the one or more processors 106 may be configured to determine, using the target centering model 111, a model output confidence score. By way of another example, the one or more processors 106 may be configured to generate one or more control signals configured to cause the sample stage 112 of the overlay metrology sub-system 102 to adjust a stage position based on the determined one or more stage correctables. By way of another example, the one or more processors 106 may be configured to receive one or more measurement images 119 of the sample from the overlay metrology sub-system. By way of another example, the one or more processors 106 may be configured to determine one or more overlay measurements based on the received one or more measurement images.
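A non-limiting sketch of this acquisition-to-measurement flow, expressed in Python, is provided below. The object and function names (e.g., metrology, centering_model, focus_model, stage, compute_overlay) are hypothetical placeholders used for illustration only and do not form part of the disclosure.

```python
# Minimal sketch of the acquire -> center -> measure flow described above.
# All object/function names are hypothetical placeholders.

def measure_site(metrology, centering_model, focus_model, stage,
                 compute_overlay, threshold=0.7):
    # Receive one or more acquisition images of the sample.
    acq_image = metrology.grab_acquisition_image()

    # Determine stage correctables and a model output confidence score.
    dx, dy, confidence = centering_model.predict(acq_image)

    # Reject the correctables if the confidence score is below threshold.
    if confidence < threshold:
        raise RuntimeError("centering confidence below threshold; re-acquire")

    # Provide the xy correctables to the focus model (ROI placement).
    dz = focus_model.predict(acq_image, roi_offset=(dx, dy))

    # Adjust the xy- and z-stages simultaneously, then grab the
    # measurement image at the corrected, in-focus position.
    stage.move_relative(dx=dx, dy=dy, dz=dz)
    meas_image = metrology.grab_measurement_image()

    # Determine overlay based on the received measurement image.
    return compute_overlay(meas_image)
```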
[0026] In embodiments, the system 100 includes a user interface 114 communicatively coupled to the controller 104. The user interface 114 may include a user input device 116 and a display 118. The user input device 116 of the user interface 114 may be configured to receive one or more input commands from a user, the one or more input commands configured to input data into the system 100 and/or adjust one or more characteristics of the system 100. For example, the user input device 116 may be configured to receive user labels for the training images 115 from the user. For instance, the user may label the plurality of training images 115 with corresponding offsets based on a best contrast focus position, where the labeled training images may be used to train the target centering model 111. In this regard, as will be discussed further herein, the target centering model 111 may be a supervised deep learning model (i.e., trained using user-labeled training images). The display 118 of the user interface 114 may be configured to display data of the system 100 to a user.
[0028] In embodiments, the method 200 includes a step 202 of receiving a plurality of training images. For example, the one or more processors 106 may be configured to receive a plurality of training images 115. In one instance, the one or more processors 106 may be configured to receive the plurality of training images 115 from the overlay metrology sub-system 102. In another instance, the one or more processors 106 may be configured to receive the plurality of training images 115 from a different sub-system communicatively coupled to the one or more processors 106.
[0029] The plurality of training images 115 may include a plurality of through-focus images. For example, each through-focus image of the plurality of through-focus images may be labeled with a corresponding offset based on the best contrast focus position. For instance, the initial training images may include through-focus images of a single sample with uniformly spread sites, where for each training site, multiple through-focus images around the best contrast focus are captured. At the end of the through-focus sequence, the best contrast image may be grabbed and set as ground truth for the target centering model 111 (as well as a focusing model, as discussed further herein). In this regard, the target centering model 111 is able to achieve a desired target centering position per site on the sample, thus increasing overall measurement accuracy.
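As a non-limiting illustration, the labeling of a through-focus stack may be sketched as follows. The contrast metric (intensity standard deviation) is an assumption made for illustration; the disclosure does not prescribe a particular contrast metric.

```python
import numpy as np

def label_through_focus_stack(images, z_positions):
    """Label each through-focus image with its offset from the best
    contrast focus position (intensity std as an assumed contrast metric)."""
    contrast = [float(np.std(img)) for img in images]
    best = int(np.argmax(contrast))
    # The best contrast image is the ground truth; every other image is
    # labeled with its focus offset relative to that position.
    offsets = [z - z_positions[best] for z in z_positions]
    return best, offsets

# Example: a 7-frame synthetic stack around best contrast focus.
z_grid = np.linspace(-1.2, 1.2, 7)
rng = np.random.default_rng(0)
stack = [rng.normal(scale=1.0 + 2.0 * np.exp(-z ** 2), size=(64, 64))
         for z in z_grid]
best_idx, offsets = label_through_focus_stack(stack, list(z_grid))
```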
[0030] In embodiments, the plurality of training images 115 may be adjusted prior to training the target centering model 111. For example, the plurality of training images 115 may be adjusted based on one or more parameters (or settings) prior to training the target centering model 111 (in step 204).
[0031] In embodiments, the method 200 includes a step 204 of generating the machine learning-based centering model based on the received plurality of training images. For example, the one or more processors 106 may be configured to generate the target centering model 111 based on the received plurality of training images 115 (in step 202).
[0032] In embodiments, the target centering model 111 may be trained in at least two stages: coarse training and fine training. For example, during coarse training, the target centering model 111 may be trained to perform object detection with binary classification on image regions. By way of another example, during fine centering training, only the detector part of the model may be trained, using only the region found during the coarse centering step.
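A non-limiting sketch of this two-stage procedure is provided below. The coarse (binary classifier) and fine (detector part) objects and their fit_step/score methods are hypothetical stand-ins, not part of the disclosure.

```python
# Sketch of the coarse-then-fine training described above.

def split_regions(image, size=120):
    """Tile an image into non-overlapping size x size regions."""
    h, w = image.shape
    return [((r, c), image[r:r + size, c:c + size])
            for r in range(0, h - size + 1, size)
            for c in range(0, w - size + 1, size)]

def contains_center(origin, center, size=120):
    """True if the labeled target center falls inside this region."""
    r, c = origin
    return r <= center[0] < r + size and c <= center[1] < c + size

def train_centering_model(coarse, fine, labeled_images, size=120):
    # Coarse centering: binary classification per image region -- does
    # this region contain the target center?
    for image, center in labeled_images:
        for origin, region in split_regions(image, size):
            coarse.fit_step(region, int(contains_center(origin, center, size)))

    # Fine centering: train only the detector part, and only on the
    # region found during the coarse centering step.
    for image, center in labeled_images:
        origin, region = max(split_regions(image, size),
                             key=lambda item: coarse.score(item[1]))
        local = (center[0] - origin[0], center[1] - origin[1])
        fine.fit_step(region, local)
```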
[0033] In embodiments, the generated machine learning-based centering model is stored in the memory as a measurement recipe of the overlay metrology sub-system 102. For example, the target centering model 111 may be stored in the measurement recipe and used during the move-focus-grab (MFG) sequence of the overlay metrology sub-system 102.
[0034] In embodiments, the method 200 includes a step 206 of receiving one or more acquisition images of a sample from an overlay metrology sub-system. For example, the one or more processors 106 may be configured to receive one or more acquisition images 117 of the sample 110 from the overlay metrology sub-system 102.
[0035] The one or more acquisition images 117 may include one or more defocused images of the sample 110. For example, the one or more processors 106 may be configured to receive one or more defocused images of the sample 110. In a non-limiting instance, the defocus of the one or more defocused images may be up to 1.2 μm around the best contrast focus position. In this regard, as discussed further herein, the target centering model 111 may be configured to determine a target centering position of the sample 110 based on one or more defocused images of the sample 110. As mentioned previously herein, the system 100 is thus able to improve the move-acquire-measure (MAM) time, while also improving overlay measurement accuracy.
[0036] In embodiments, the method 200 includes a step 208 of determining, using the machine learning-based centering model, one or more stage correctables based on the received one or more acquisition images 117. For example, the one or more processors 106 may be configured to determine one or more stage correctables based on the received one or more acquisition images 117. The one or more stage correctables may include one or more correctables for the xy-stage. For example, the one or more stage correctables may include at least one of an x correctable and a y correctable.
[0037] In embodiments, the method 200 includes a step 210 of determining, using the machine learning-based centering model, a model output confidence score. For example, the one or more processors 106 may be configured to generate a model output confidence score, where the model output confidence score indicates a level of confidence associated with a respective stage correctable determined using the machine learning-based centering model.
[0038] The model output confidence score may be a number between 0 and 1, where a score closer to 1 indicates higher confidence in the model output and a score closer to 0 indicates lower confidence in the model output. For example, a model confidence score may be calculated during binary classification to indicate if the image center is within a predetermined search area (e.g., a 120×120 pixel sub-image).
[0040] In embodiments, binary classification may be performed to calculate the model output confidence score, where only the section containing the target center will be associated with a high score. For example, the acquisition images 400 may be separated into one or more image regions 402, where the target centering model 111 may be configured to perform object detection on the respective image regions 402 using binary classification to identify a target center. The target centering model 111 may then perform fine centering on the respective image region where the binary classification identifies the respective target center.
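A non-limiting sketch of this confidence calculation, reusing the hypothetical split_regions helper and coarse classifier from the training sketch above, may take the following form.

```python
def centering_confidence(image, coarse, size=120):
    """Score every size x size sub-image with the binary classifier;
    only the section containing the target center should score high."""
    scores = [(coarse.score(region), origin)
              for origin, region in split_regions(image, size)]
    confidence, origin = max(scores)
    # Fine centering then runs on the winning region; the maximum score
    # serves as the model output confidence score (0 to 1).
    return confidence, origin
```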
[0042] In embodiments, the method 200 includes an optional step 212 of comparing the model output confidence score to one or more predetermined thresholds (e.g., user-defined thresholds, or the like). For example, the one or more processors 106 may be configured to compare the model output confidence score to the one or more predetermined thresholds. For instance, the one or more processors 106 may be configured to receive one or more predetermined thresholds from one or more users and compare the model output confidence score to the user-defined threshold.
[0043] In a non-limiting example, the predetermined threshold may be 0.7, where a model output confidence score above 0.7 indicates the image is good and a model output confidence score below 0.7 indicates the image is bad (or rejected).
[0044] If the model output confidence score is below the predetermined threshold, the one or more processors 106 may be configured to fail the measurement in an optional step 214. For example, the one or more processors 106 may be configured to reject the determined stage corrections (from step 210).
[0045] It is contemplated that a low model output confidence score may be associated with the image center not being within the search area due to incorrect stage navigation. In this case, the model output confidence score may be close to 0. Further, it is contemplated that a low model confidence score may be associated with the input image being strongly defocused (e.g., above 1.5-2 μm). In this case, the model output confidence score may be below 0.7.
[0046] Upon failing the measurement, in embodiments, the method 200 may include an optional step 216 of adjusting one or more system parameters of the overlay metrology sub-system 102 and capturing one or more additional acquisition images based on the adjusted one or more system parameters. For example, the one or more processors 106 may be configured to adjust one or more system parameters of the overlay metrology sub-system 102 (or other sub-system) and thereafter cause the overlay metrology sub-system 102 to re-capture the acquisition images. In this regard, the model output confidence score may be used to identify system hardware issues and/or issues with the sample itself. In some instances, the model output confidence score may be used to re-train and/or adjust the target centering model 111.
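The compare-fail-retry logic of steps 212-216 may be sketched as follows; the adjust_parameters method is a hypothetical stand-in for adjusting one or more system parameters of the overlay metrology sub-system.

```python
def acquire_with_confidence(metrology, centering_model,
                            threshold=0.7, max_retries=2):
    """Compare the confidence score to a predetermined threshold and,
    on failure, adjust system parameters and re-capture (a sketch)."""
    for attempt in range(max_retries + 1):
        image = metrology.grab_acquisition_image()
        dx, dy, confidence = centering_model.predict(image)
        if confidence >= threshold:
            return dx, dy
        # Low confidence: reject the correctables, adjust one or more
        # system parameters, and re-acquire additional images.
        metrology.adjust_parameters(attempt)
    raise RuntimeError("measurement failed: confidence below threshold")
```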
[0047] If the model output confidence score is above (or equal to) the predetermined threshold, the one or more processors 106 may be configured to provide the one or more stage correctables (determined in step 208) to a machine learning-based focus model in a step 218. For instance, the x correctable and the y correctable may be provided to the machine learning-based focus model, where the x, y correctables may be used to determine a region of interest (ROI) placement. In this regard, the ROI placement may be used to generate input signals for the machine learning-based focus model.
[0048] The machine learning-based focus model is generally described in U.S. Pat. No. 11,556,738, issued on Jan. 17, 2023, which is incorporated herein by reference in its entirety.
[0049] In embodiments, the method 200 includes a step 220 of generating one or more control signals configured to cause a sample stage of the overlay metrology sub-system to adjust a stage position based on the determined one or more stage correctables. For example, the one or more processors 106 may be configured to generate one or more control signals configured to cause the overlay metrology sub-system 102 to adjust the stage position of the sample stage 112 based on the one or more stage correctables (e.g., Δx, Δy). In this regard, the xy-stage may be adjusted by Δx, Δy, such that a correct target centering position is achieved prior to capture of the measurement image (received in step 222).
[0050] Where the focus model receives the output of the target centering model 111 from the optional step 218, the method 200 may include simultaneously adjusting a focus of the overlay metrology sub-system while adjusting the stage position of the overlay metrology sub-system (in step 220). For example, the one or more processors 106 may be configured to generate one or more control signals configured to cause the overlay metrology sub-system 102 to simultaneously adjust the xy-stage and z-stage based on the focus correctable (e.g., Δz) and stage correctables (e.g., Δx, Δy). In this regard, the xy-stage may be adjusted by Δx, Δy and the z-stage may be adjusted by Δz, such that a correct target centering position and focus is achieved prior to capture of the measurement image (received in step 222).
[0051] In embodiments, the method 200 includes a step 222 of receiving one or more measurement images of the sample from the overlay metrology sub-system. For example, the one or more processors 106 may be configured to receive one or more measurement images 119 of the sample 110 from the overlay metrology sub-system 102, where the measurement images are acquired based on the adjusted stage position (and in some instances, focus position) of the sample stage 112.
[0052] In an optional step 224, validation may be performed on the received one or more measurement images. For example, the one or more processors 106 may be configured to perform legacy validation on the received one or more measurement images from the overlay metrology sub-system 102, where the measurement images were captured based on the determined stage correctables.
[0054] In embodiments, the measurement images acquired using the system 100 may be compared against results of a legacy acquisition algorithm. For example, validation may be performed using a standard legacy acquisition algorithm, where a legacy center is determined.
[0055] For example, a Euclidean off-center distance between the measurement image FOV center in the image 500 (i.e., the target center based on the target centering model 111) and a center calculated on the measurement image 504 with the legacy acquisition algorithm may be determined. Denoting the model-based center as (x_model, y_model) and the legacy center as (x_legacy, y_legacy), the off-center distance may be expressed as d = √((x_model − x_legacy)² + (y_model − y_legacy)²).
[0056] If the Euclidean off-center distance d exceeds a predefined threshold (e.g., 3 pixels), overlay calculations for the respective site should be skipped and the site remeasured later, since the overlay calculations may not be reliable.
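A non-limiting sketch of this validation check is provided below.

```python
import math

def validate_against_legacy(model_center, legacy_center, threshold_px=3.0):
    """Euclidean off-center distance between the model-based target
    center and the legacy-algorithm center; sites exceeding the
    threshold are skipped and remeasured later."""
    dx = model_center[0] - legacy_center[0]
    dy = model_center[1] - legacy_center[1]
    distance = math.hypot(dx, dy)
    return distance <= threshold_px, distance

# Example: an offset of about 2.2 pixels passes the 3-pixel threshold.
ok, d = validate_against_legacy((256.0, 256.0), (257.5, 257.6))
```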
[0057] It is contemplated that validation may be done after the measurement image is captured (or grabbed), during the next site measurement, to avoid validation affecting MAM time.
[0058] In embodiments, the method 200 includes a step 226 of determining one or more overlay measurements based on the received one or more measurement images. For example, the one or more processors 106 may be configured to determine overlay based on the received measurement images from step 222.
[0060] It is noted herein that for the purposes of the present disclosure, the term overlay is generally used to describe relative positions of features on a sample fabricated by two or more lithographic patterning steps, where the term overlay error describes a deviation of the features from a nominal arrangement. In this context, an overlay measurement may be expressed as either a measurement of the relative positions or of an overlay error associated with these relative positions. For example, a multi-layered device may include features patterned on multiple sample layers using different lithography steps for each layer, where the alignment of features between layers must typically be tightly controlled to ensure proper performance of the resulting device. Accordingly, an overlay measurement may characterize the relative positions of features on two or more of the sample layers. By way of another example, multiple lithography steps may be used to fabricate features on a single sample layer. Such techniques, commonly called double-patterning or multiple-patterning techniques, may facilitate the fabrication of highly dense features near the resolution of the lithography system. An overlay measurement in this context may characterize the relative positions of the features from the different lithographic steps on this single layer. It is to be understood that examples and illustrations throughout the present disclosure relating to a particular application of overlay metrology are provided for illustrative purposes only and should not be interpreted as limiting the disclosure.
[0061] As used throughout the present disclosure, the term sample generally refers to a substrate formed of a semiconductor or non-semiconductor material (e.g., a wafer, or the like). For example, a semiconductor or non-semiconductor material may include, but is not limited to, monocrystalline silicon, gallium arsenide, and indium phosphide. A sample may include one or more layers. For example, such layers may include, but are not limited to, a resist, a dielectric material, a conductive material, and a semiconductive material. Many different types of such layers are known in the art, and the term sample as used herein is intended to encompass a sample on which all types of such layers may be formed. One or more layers formed on a sample may be patterned or unpatterned. For example, a sample may include a plurality of dies, each having repeatable patterned features. Formation and processing of such layers of material may ultimately result in completed devices. Many different types of devices may be formed on a sample, and the term sample as used herein is intended to encompass a sample on which any type of device known in the art is being fabricated. Further, for the purposes of the present disclosure, the term sample and wafer should be interpreted as interchangeable. In addition, for the purposes of the present disclosure, the terms patterning device, mask and reticle should be interpreted as interchangeable.
[0062] The overlay target 109 may generally include any overlay target known in the art including, but not limited to, an advanced imaging metrology (AIM) target, a robust AIM target, or a triple AIM target.
[0063] In embodiments, the system 100 includes the overlay metrology sub-system 102 configured to acquire images of the sample 110 based on any number of overlay recipes. For example, the overlay metrology sub-system 102 may direct illumination to the sample 110 and may further collect light or other radiation emanating from the sample 110 to generate an overlay signal suitable for the determination of overlay of two or more sample layers. The overlay metrology sub-system 102 may be any type of overlay metrology sub-system known in the art suitable for generating overlay signals suitable for determining overlay associated with overlay targets 109 on the sample 110. For example, the overlay metrology sub-system 102 may include an image-based overlay metrology sub-system. The overlay metrology sub-system 102 may selectively operate in an imaging mode or a non-imaging mode.
[0064] The overlay metrology sub-system 102 may be configurable to generate overlay signals based on any number of recipes defining measurement parameters for acquiring an overlay signal suitable for determining overlay of an overlay target 109. For example, a recipe of the overlay metrology sub-system 102 may include, but is not limited to, an illumination wavelength, a detected wavelength of light emanating from the sample 110, a spot size or shape of illumination on the sample 110, an angle of incident illumination, a polarization of incident illumination, a polarization of collected light, a position of a beam of incident illumination on an overlay target 109, a center position of an overlay target 109 in the focal volume of the overlay metrology sub-system 102, or the like.
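As a non-limiting illustration, such a recipe may be represented as a simple parameter container; the field names and units below are assumptions made for illustration only.

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass
class OverlayRecipe:
    """Hypothetical container for the measurement-recipe parameters
    listed above; not part of the disclosure."""
    illumination_wavelength_nm: float
    detection_wavelength_nm: float
    spot_size_um: float
    incidence_angle_deg: float
    illumination_polarization: str
    collection_polarization: str
    beam_position: Tuple[float, float]          # on the overlay target
    target_center: Tuple[float, float, float]   # in the focal volume
```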
[0065] In embodiments, the overlay metrology sub-system 102 includes an illumination sub-system including an illumination source 614 configured to generate at least one illumination beam 616 and one or more illumination optics 622. For example, the illumination sub-system may include one or more broadband illumination sources 614 configured to generate one or more broadband illumination beams 616. In this regard, the overlay metrology sub-system 102 may include one or more apertures at an illumination pupil plane to divide illumination from the illumination source 614 into one or more illumination beams 616 or illumination lobes. In this regard, the overlay metrology sub-system 102 may provide dipole illumination, quadrupole illumination, or the like. Further, the spatial profile of the one or more illumination beams 616 on the sample 110 may be controlled by a field-plane stop to have any selected spatial profile.
[0066] The illumination source 614 may include any type of illumination source suitable for providing at least one broadband illumination beam 616. In embodiments, the illumination source 614 is a laser source. For example, the illumination source 614 may include a broadband laser source.
[0067] In embodiments, the overlay metrology sub-system 102 directs the illumination beam 616 to the sample 110 via an illumination pathway 618. The illumination pathway 618 may include one or more optical components suitable for modifying and/or conditioning the illumination beam 616 as well as directing the illumination beam 616 to the sample 110. In embodiments, the illumination pathway 618 includes one or more illumination-pathway lenses 620 (e.g., to collimate the illumination beam 616, to relay pupil and/or field planes, or the like). In embodiments, the illumination pathway 618 includes one or more illumination-pathway optics 622 to shape or otherwise control the illumination beam 616. For example, the illumination-pathway optics 622 may include, but are not limited to, one or more field stops, one or more pupil stops, one or more polarizers, one or more filters, one or more beam splitters, one or more diffusers, one or more homogenizers, one or more apodizers, one or more beam shapers, or one or more mirrors (e.g., static mirrors, translatable mirrors, scanning mirrors, or the like).
[0068] In embodiments, the overlay metrology sub-system 102 includes an objective lens 624 to focus the illumination beam 616 onto the sample 110 (e.g., an overlay target 109 with overlay target features located on two or more layers of the sample 110). In embodiments, the sample 110 is disposed on a sample stage 112 suitable for securing the sample 110 and further configured to position the sample 110 with respect to the illumination beam 616.
[0069] In embodiments, the overlay metrology sub-system 102 includes one or more detectors 628 configured to capture light emanating from the sample 110 (e.g., an overlay target 109 on the sample 110) (e.g., collected light 630) through a collection pathway 632. The collection pathway 632 may include one or more optical elements suitable for modifying and/or conditioning the collected light 630 from the sample 110. In embodiments, the collection pathway 632 includes one or more collection-pathway lenses 634 (e.g., to collimate the collected light 630, to relay pupil and/or field planes, or the like), which may include, but is not required to include, the objective lens 624. In embodiments, the collection pathway 632 includes one or more collection-pathway optics 636 to shape or otherwise control the collected light 630. For example, the collection-pathway optics 636 may include, but are not limited to, one or more field stops, one or more pupil stops, one or more polarizers, one or more filters, one or more beam splitters, one or more diffusers, one or more homogenizers, one or more apodizers, one or more beam shapers, or one or more mirrors (e.g., static mirrors, translatable mirrors, scanning mirrors, or the like).
[0070] The overlay metrology sub-system 102 may generally include any number or type of detectors 628 suitable for capturing light from the sample 110 indicative of overlay. In embodiments, the detector 628 includes one or more detectors 628 suitable for characterizing a static sample. In this regard, the overlay metrology sub-system 102 may operate in a static mode in which the sample 110 is static during a measurement. For example, a detector 628 may include a two-dimensional pixel array such as, but not limited to, a charge-coupled device (CCD) or a complementary metal-oxide semiconductor (CMOS) device. In this regard, the detector 628 may generate a two-dimensional image in a single measurement.
[0071] In embodiments, the detector 628 includes one or more detectors 628 suitable for characterizing a moving sample 110 (e.g., a scanned sample). In this regard, the overlay metrology sub-system 102 may operate in a scanning mode in which the sample 110 is scanned with respect to a measurement field during a measurement. For example, the detector 628 may include a 2D pixel array with a capture time and/or a refresh rate sufficient to capture one or more images during a scan within selected image tolerances (e.g., image blur, contrast, sharpness, or the like). By way of another example, the detector 628 may include a line-scan detector to continuously generate an image one line of pixels at a time. By way of another example, the detector 628 may include a time-delay integration (TDI) detector. A TDI detector may generate a continuous image of the sample 110 when the motion of the sample 110 is synchronized to charge-transfer clock signals in the TDI detector. In particular, a TDI detector acquires charge from light exposure on columns of pixels and includes clock pulses to transfer charge between adjacent columns of pixels along a scan direction. When the motion of the sample 110 along the scan direction is synchronized to the charge transfer in the TDI detector, charge continuously accumulates during the scan. This process continues until the charge reaches a final column of pixels and is subsequently read out of the detector. In this way, images of the object are accumulated over a longer time frame than would be possible with a simple line scan camera. This relatively longer acquisition time decreases the photon noise level in the image. Further, synchronous motion of the image and charge prevents blurring in the recorded image.
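A non-limiting toy model of TDI charge accumulation is sketched below; it illustrates only the synchronization of charge transfer with sample motion described above and is not a description of any particular detector.

```python
import numpy as np

def tdi_scan(scene, n_stages=8):
    """Toy TDI model: each clock pulse transfers charge one stage along
    the scan direction, in sync with the sample motion, so every scene
    column accumulates n_stages exposures before readout."""
    height, width = scene.shape
    charge = np.zeros((height, n_stages))
    columns = []
    for s in range(width + n_stages - 1):
        charge = np.roll(charge, 1, axis=1)  # clock pulse: shift charge
        charge[:, 0] = 0.0                   # fresh pixels enter at stage 0
        for k in range(n_stages):            # stage k sees scene column s - k
            if 0 <= s - k < width:
                charge[:, k] += scene[:, s - k]
        if s >= n_stages - 1:                # final stage is read out
            columns.append(charge[:, -1].copy())
    return np.stack(columns, axis=1)

# For an ideal synchronized scan, the output is n_stages times the scene,
# i.e., a longer effective exposure with lower photon noise.
scene = np.random.default_rng(0).random((16, 32))
assert np.allclose(tdi_scan(scene), 8 * scene)
```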
[0072] In embodiments, the overlay metrology sub-system 102 includes a scanning sub-system to scan the sample 110 with respect to the measurement field during a metrology measurement. For example, the sample stage 112 may position and orient the sample 110 within a focal volume of the objective lens 624. In embodiments, the sample stage 112 includes one or more adjustable stages such as, but not limited to, a linear translation stage, a rotational stage, or a tip/tilt stage. In embodiments, though not shown, the scanning sub-system includes one or more beam-scanning optics (e.g., rotatable mirrors, galvanometers, or the like) to scan the illumination beams 616 with respect to the sample 110).
[0073] The illumination pathway 618 and the collection pathway 632 of the overlay metrology sub-system 102 may be oriented in a wide range of configurations suitable for illuminating the sample 110 with the illumination beams 616 and collecting light emanating from the sample 110 in response to the incident illumination beams 616. For example, the overlay metrology sub-system 102 may include a beamsplitter 638 oriented such that a common objective lens 624 may simultaneously direct the illumination beams 616 to the sample 110 and collect light from the sample 110. By way of another example, the illumination pathway 618 and the collection pathway 632 may contain non-overlapping optical paths.
[0074] In embodiments, the overlay metrology sub-system 102 may provide overlay data to one or more process sub-systems. Overlay data from an overlay metrology sub-system may generally include any output of an overlay metrology sub-system having sufficient information to determine overlay (or overlay errors) associated with various lithography steps. For example, overlay data may include, but is not required to include, one or more datasets, one or more images, one or more detector readings, or the like. This overlay data may then be used for various purposes including, but not limited to, diagnostic information of the lithography sub-systems or for the generation of process-control correctables. For instance, overlay data for samples in a lot may be used to generate feedback correctables for controlling the lithographic exposure of subsequent samples in the same lot. In another instance, overlay data for samples in a lot may be used to generate feed-forward correctables for controlling lithographic exposures for the same or similar samples in subsequent lithography steps to account for any deviations in the current exposure.
[0076] The one or more processors 106 of the controller 104 may generally include any processor or processing element known in the art. For the purposes of the present disclosure, the term processor or processing element may be broadly defined to encompass any device having one or more processing or logic elements (e.g., one or more micro-processor devices, one or more application specific integrated circuit (ASIC) devices, one or more field programmable gate arrays (FPGAs), or one or more digital signal processors (DSPs)). In this sense, the one or more processors 106 may include any device configured to execute algorithms and/or instructions (e.g., program instructions stored in memory). In one embodiment, the one or more processors 106 may be embodied as a desktop computer, mainframe computer system, workstation, image computer, parallel processor, networked computer, or any other computer system configured to execute a program configured to operate or operate in conjunction with the system 100, as described throughout the present disclosure. Moreover, different subsystems of the system 100 may include a processor or logic elements suitable for carrying out at least a portion of the steps described in the present disclosure. Therefore, the above description should not be interpreted as a limitation on the embodiments of the present disclosure but merely as an illustration. Further, the steps described throughout the present disclosure may be carried out by a single controller or, alternatively, multiple controllers. Additionally, the controller 104 may include one or more controllers housed in a common housing or within multiple housings. In this way, any controller or combination of controllers may be separately packaged as a module suitable for integration into metrology system 100. Further, the controller 104 may analyze or otherwise process data received from the overlay metrology sub-system 102 and feed the data to additional components within the system 100 or external to the system 100.
[0077] Further, the memory device 108 may include any storage medium known in the art suitable for storing program instructions executable by the associated one or more processors 106. For example, the memory device 108 may include a non-transitory memory medium. As an additional example, the memory device 108 may include, but is not limited to, a read-only memory, a random-access memory, a magnetic or optical memory device (e.g., disk), a magnetic tape, a solid-state drive and the like. It is further noted that memory device 108 may be housed in a common controller housing with the one or more processors 106.
[0078] In this regard, the controller 104 may execute any of various processing steps associated with metrology and/or inspection. For example, the controller 104 may be configured to generate control signals to direct or otherwise control the overlay metrology sub-system 102, or any components thereof. For instance, the controller 104 may be configured to direct the stage 112 to translate the sample 110 along one or more measurement paths or swaths. By way of another example, the controller 104 may be configured to receive images from the overlay metrology sub-system 102. By way of another example, the controller 104 may generate correctables for one or more additional fabrication sub-systems as feedback and/or feed-forward control of the one or more additional fabrication tools (e.g., lithography tool) based on measurements from the overlay metrology sub-system 102.
[0079] One skilled in the art will recognize that the herein described components (e.g., operations), devices, objects, and the discussion accompanying them are used as examples for the sake of conceptual clarity and that various configuration modifications are contemplated. Consequently, as used herein, the specific exemplars set forth and the accompanying discussion are intended to be representative of their more general classes. In general, use of any specific exemplar is intended to be representative of its class, and the non-inclusion of specific components (e.g., operations), devices, and objects should not be taken as limiting.
[0080] Those having skill in the art will appreciate that there are various vehicles by which processes and/or systems and/or other technologies described herein can be implemented (e.g., hardware, software, and/or firmware), and that the preferred vehicle will vary with the context in which the processes and/or systems and/or other technologies are deployed. For example, if an implementer determines that speed and accuracy are paramount, the implementer may opt for a mainly hardware and/or firmware vehicle; alternatively, if flexibility is paramount, the implementer may opt for a mainly software implementation; or, yet again alternatively, the implementer may opt for some combination of hardware, software, and/or firmware. Hence, there are several possible vehicles by which the processes and/or devices and/or other technologies described herein may be effected, none of which is inherently superior to the other in that any vehicle to be utilized is a choice dependent upon the context in which the vehicle will be deployed and the specific concerns (e.g., speed, flexibility, or predictability) of the implementer, any of which may vary.
[0081] The previous description is presented to enable one of ordinary skill in the art to make and use the invention as provided in the context of a particular application and its requirements. As used herein, directional terms such as top, bottom, over, under, upper, upward, lower, down, and downward are intended to provide relative positions for purposes of description, and are not intended to designate an absolute frame of reference. Various modifications to the described embodiments will be apparent to those with skill in the art, and the general principles defined herein may be applied to other embodiments. Therefore, the present invention is not intended to be limited to the particular embodiments shown and described, but is to be accorded the widest scope consistent with the principles and novel features herein disclosed.
[0082] With respect to the use of substantially any plural and/or singular terms herein, those having skill in the art can translate from the plural to the singular and/or from the singular to the plural as is appropriate to the context and/or application. The various singular/plural permutations are not expressly set forth herein for sake of clarity.
[0083] All of the methods described herein may include storing results of one or more steps of the method embodiments in memory. The results may include any of the results described herein and may be stored in any manner known in the art. The memory may include any memory described herein or any other suitable storage medium known in the art. After the results have been stored, the results can be accessed in the memory and used by any of the method or system embodiments described herein, formatted for display to a user, used by another software module, method, or system, and the like. Furthermore, the results may be stored permanently, semi-permanently, temporarily, or for some period of time. For example, the memory may be random access memory (RAM), and the results may not necessarily persist indefinitely in the memory.
[0084] It is further contemplated that each of the embodiments of the method described above may include any other step(s) of any other method(s) described herein. In addition, each of the embodiments of the method described above may be performed by any of the systems described herein.
[0085] The herein described subject matter sometimes illustrates different components contained within, or connected with, other components. It is to be understood that such depicted architectures are merely exemplary, and that in fact many other architectures can be implemented which achieve the same functionality. In a conceptual sense, any arrangement of components to achieve the same functionality is effectively associated such that the desired functionality is achieved. Hence, any two components herein combined to achieve a particular functionality can be seen as associated with each other such that the desired functionality is achieved, irrespective of architectures or intermedial components. Likewise, any two components so associated can also be viewed as being connected, or coupled, to each other to achieve the desired functionality, and any two components capable of being so associated can also be viewed as being couplable to each other to achieve the desired functionality. Specific examples of couplable include but are not limited to physically mateable and/or physically interacting components and/or wirelessly interactable and/or wirelessly interacting components and/or logically interacting and/or logically interactable components.
[0086] Furthermore, it is to be understood that the invention is defined by the appended claims. It will be understood by those within the art that, in general, terms used herein, and especially in the appended claims (e.g., bodies of the appended claims) are generally intended as open terms (e.g., the term including should be interpreted as including but not limited to, the term having should be interpreted as having at least, the term includes should be interpreted as includes but is not limited to, and the like). It will be further understood by those within the art that if a specific number of an introduced claim recitation is intended, such an intent will be explicitly recited in the claim, and in the absence of such recitation no such intent is present. For example, as an aid to understanding, the following appended claims may contain usage of the introductory phrases at least one and one or more to introduce claim recitations. However, the use of such phrases should not be construed to imply that the introduction of a claim recitation by the indefinite articles a or an limits any particular claim containing such introduced claim recitation to inventions containing only one such recitation, even when the same claim includes the introductory phrases one or more or at least one and indefinite articles such as a or an (e.g., a and/or an should typically be interpreted to mean at least one or one or more); the same holds true for the use of definite articles used to introduce claim recitations. In addition, even if a specific number of an introduced claim recitation is explicitly recited, those skilled in the art will recognize that such recitation should typically be interpreted to mean at least the recited number (e.g., the bare recitation of two recitations, without other modifiers, typically means at least two recitations, or two or more recitations). Furthermore, in those instances where a convention analogous to at least one of A, B, and C, and the like is used, in general such a construction is intended in the sense one having skill in the art would understand the convention (e.g., a system having at least one of A, B, and C would include but not be limited to systems that have A alone, B alone, C alone, A and B together, A and C together, B and C together, and/or A, B, and C together, and the like). In those instances where a convention analogous to at least one of A, B, or C, and the like is used, in general such a construction is intended in the sense one having skill in the art would understand the convention (e.g., a system having at least one of A, B, or C would include but not be limited to systems that have A alone, B alone, C alone, A and B together, A and C together, B and C together, and/or A, B, and C together, and the like). It will be further understood by those within the art that virtually any disjunctive word and/or phrase presenting two or more alternative terms, whether in the description, claims, or drawings, should be understood to contemplate the possibilities of including one of the terms, either of the terms, or both terms. For example, the phrase A or B will be understood to include the possibilities of A or B or A and B.
[0087] It is believed that the present disclosure and many of its attendant advantages will be understood by the foregoing description, and it will be apparent that various changes may be made in the form, construction and arrangement of the components without departing from the disclosed subject matter or without sacrificing all of its material advantages. The form described is merely explanatory, and it is the intention of the following claims to encompass and include such changes. Furthermore, it is to be understood that the invention is defined by the appended claims.