PROCESSOR SYSTEM, SEMICONDUCTOR INSPECTION SYSTEM, AND PROGRAM
20230230886 · 2023-07-20
Inventors
- Kenji YASUI (Tokyo, JP)
- Mayuka OSAKI (Tokyo, JP)
- Hitoshi NAMAI (Tokyo, JP)
- Yuki OJIMA (Tokyo, JP)
- Wataru NAGATOMO (Tokyo, JP)
- Masami IKOTA (Tokyo, JP)
- Maki KIMURA (Tokyo, JP)
CPC classification
H01L22/12
ELECTRICITY
Abstract
To provide a technique capable of quantitatively grasping a change in three-dimensional shape including a cross-sectional shape of a pattern within a surface of a wafer or between wafers in a non-destructive manner before cross-sectional observation. A processor system of a semiconductor inspection system acquires images captured by an electron microscope (SEM) for a sample (S102), calculates, for a reference region defined on a surface of the sample, first feature data corresponding to each of a plurality of locations in the reference region from the captured image (S103A), calculates a first statistical value based on the first feature data at the plurality of locations (S103B), calculates, for each of a plurality of evaluation regions defined as points or regions on the surface of the sample in correspondence with the reference region, second feature data corresponding to each of one or more locations in the evaluation region from the captured image, as feature data of the same type as the first feature data (S104A), and converts the second feature data using the first statistical value to obtain second feature data after conversion (S105).
Claims
1. A processor system for evaluating a three-dimensional shape including a cross-sectional shape of a pattern of a semiconductor which is a sample, the processor system comprising: at least one processor; and at least one memory resource, wherein the processor is configured to: acquire one or more images captured by an electron microscope for each of one or more samples, calculate, for a reference region defined on each of surfaces of the one or more samples, first feature data corresponding to each of a plurality of locations in the reference region from the captured image, calculate a first statistical value based on the first feature data at the plurality of locations, calculate, for each of a plurality of evaluation regions defined on each of the surfaces of the one or more samples in correspondence with the reference region, second feature data corresponding to each of one or more locations in the evaluation region from the captured image, as feature data of the same type as the first feature data, and convert the second feature data by using the first statistical value to obtain second feature data after conversion.
2. The processor system according to claim 1, wherein the processor is configured to: calculate, as the first statistical value, a fluctuation amount and an average value of the first feature data with respect to a local shape variation of a pattern in the reference region, calculate, as a second statistical value, an average value of the second feature data with respect to a local shape variation of a pattern in the evaluation region, and normalize a difference between the average value of the second feature data and the average value of the first feature data with the fluctuation amount of the first feature data.
3. The processor system according to claim 1, wherein each of the first feature data and the second feature data is at least one type of feature data among a line width, a white band peak, a bottom signal value, a top signal value, and an inclination that are calculated based on a signal waveform of the captured image, or a value that is calculated from the captured image by calculation.
4. The processor system according to claim 1, wherein the processor is configured to quantify and evaluate a change in cross-sectional shape of the pattern of the semiconductor in a surface of a target sample or between target samples by using the second feature data after conversion as an index.
5. The processor system according to claim 1, wherein the processor is configured to display, on a display screen, the second feature data after conversion for the evaluation region of a target sample.
6. The processor system according to claim 1, wherein the processor is configured to select a cross-sectional observation position for cross-sectional observation based on the second feature data after conversion.
7. The processor system according to claim 6, wherein the processor is configured to cause a cross-sectional observation device to perform cross-sectional observation and to measure a cross-sectional shape dimension based on the cross-sectional observation position.
8. The processor system according to claim 7, wherein the processor is configured to store, into the memory resource, the second feature data, the second feature data after conversion, and the cross-sectional shape dimension in association with one another as data for the same region of a sample.
9. The processor system according to claim 8, wherein the processor is configured to: acquire an image captured by the electron microscope for an estimation target sample, calculate feature data from the captured image, and estimate a cross-sectional shape dimension of a pattern of the estimation target sample based on the calculated feature data according to a relation indicated by the associated data.
10. The processor system according to claim 1, wherein the processor is configured to store, into the memory resource, the second feature data, the second feature data after conversion, and a manufacturing parameter for a sample in association with one another as data for the same region of the sample.
11. The processor system according to claim 10, wherein the processor is configured to: acquire an image captured by the electron microscope for an adjustment target sample, calculate feature data from the captured image, and adjust the manufacturing parameter based on the calculated feature data according to a relation indicated by the associated data such that uniformity of a change in cross-sectional shape of the pattern of the semiconductor in a surface of the adjustment target sample or between the adjustment target samples is higher than before.
12. The processor system according to claim 1, wherein the processor is configured to store, into the memory resource, the second feature data, the second feature data after conversion, a cross-sectional shape dimension as a result of cross-sectional observation, and a manufacturing parameter for a sample in association with one another as data for the same region of the sample.
13. A semiconductor inspection system for inspecting a three-dimensional shape including a cross-sectional shape of a pattern of a semiconductor which is a sample, the semiconductor inspection system comprising: an electron microscope; and a processor system including at least one processor and at least one memory resource, wherein the processor is configured to: acquire one or more images captured by the electron microscope for each of one or more samples, calculate, for a reference region defined on each of surfaces of the one or more samples, first feature data corresponding to each of a plurality of locations in the reference region from the captured image, calculate a first statistical value based on the first feature data at the plurality of locations, calculate, for each of a plurality of evaluation regions defined on each of the surfaces of the one or more samples in correspondence with the reference region, second feature data corresponding to each of one or more locations in the evaluation region from the captured image, as feature data of the same type as the first feature data, and convert the second feature data by using the first statistical value to obtain second feature data after conversion.
14. A program for causing a processor system for evaluating a three-dimensional shape including a cross-sectional shape of a pattern of a semiconductor which is a sample, to perform processing, wherein a processor of the processor system is caused to perform the following processing: acquiring one or more images captured by an electron microscope for each of one or more samples; calculating, for a reference region defined on each of surfaces of the one or more samples, first feature data corresponding to each of a plurality of locations in the reference region from the captured image; calculating a first statistical value based on the first feature data at the plurality of locations; calculating, for each of a plurality of evaluation regions defined on each of the surfaces of the one or more samples in correspondence with the reference region, second feature data corresponding to each of one or more locations in the evaluation region from the captured image, as feature data of the same type as the first feature data; and converting the second feature data by using the first statistical value to obtain second feature data after conversion.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
DESCRIPTION OF EMBODIMENTS
[0031] Hereinafter, embodiments of the present disclosure will be described in detail with reference to the drawings. In all the drawings, the same parts are denoted by the same reference numerals in principle, and a repeated description thereof will be omitted. To facilitate understanding, the representation of a component in the drawings may not reflect its actual position, size, shape, range, and the like.
[0032] For the purpose of description, when processing performed by a program is described, the description may take the program, a function, a processing unit, or the like as its subject. However, the hardware subject that executes the program, the function, the processing unit, or the like is a processor, or a controller, a device, a computer, a system, or the like that includes the processor. The computer appropriately uses resources such as a memory and a communication interface, and the processor performs processing corresponding to a program read onto the memory. Accordingly, predetermined functions, processing units, and the like are implemented. The processor is implemented by, for example, a semiconductor device such as a CPU or a GPU. The processor includes a device or a circuit capable of performing predetermined calculations. The processing is not limited to processing performed by a software program, and can also be performed by a dedicated circuit. An FPGA, an ASIC, a CPLD, or the like can be applied as the dedicated circuit.
[0033] The program may be installed as data in a target computer in advance, or may be distributed as data to the target computer from a program source. The program source may be a program distribution server on a communication network or a non-transitory computer-readable storage medium (for example, a memory card). The program may include a plurality of modules. A computer system may include a plurality of devices. The computer system may be implemented by a client-server system, a cloud computing system, an IoT system, or the like. Various types of data and information may have, for example, structures such as tables and lists, but are not limited thereto.
[0034] Identification information may be replaced with an identifier, an ID, a name, a number, or the like.
First Embodiment
[0035] A semiconductor inspection system and the like according to a first embodiment of the present disclosure will be described with reference to
[0036] The semiconductor inspection system according to the first embodiment, in particular, the processor system has a function of evaluating a three-dimensional shape including a cross-sectional shape of a semiconductor pattern. The system has a function of calculating and outputting an index for evaluation (in other words, feature data after conversion) based on feature data (in other words, image feature data) calculated from an SEM image. In the first embodiment, an example is described in which a change in cross-sectional shape of a pattern on a surface of a semiconductor wafer is calculated and detected as an index using a Top-view SEM image and displayed for a user.
[0037] The semiconductor inspection system according to the first embodiment quantitatively grasps a change in cross-sectional shape within a surface of the wafer based on the feature data calculated from the Top-view SEM image. Since a change in image feature data occurs due to the change in cross-sectional shape, the possibility of the change in cross-sectional shape can be grasped by evaluating the change in image feature data on the surface of the wafer. On the other hand, the change in cross-sectional shape includes various shape changes such as a change in line width of a pattern, a change in rounding of corners, a change in tailing, a change in tilt angle of a side wall, and a change in height. It is difficult to uniquely determine the relation between these changes in cross-sectional shape and the image feature data. Therefore, it is difficult to measure a dimension (for example, a line width) of a specific cross-sectional shape based on the image feature data. In other words, in the related art, it is difficult to grasp how large an influence a given change in image feature data has on the change in cross-sectional shape.
[0038] Therefore, the semiconductor inspection system according to the first embodiment converts a change amount of the image feature data on the surface of the wafer into an index (also referred to as a cross-sectional shape index) obtained by normalizing the image feature data based on fluctuation of the feature data caused by a local shape variation of the semiconductor pattern. A specific example of the normalization is represented by an equation to be described later. The semiconductor inspection system according to the first embodiment calculates, from the SEM image, a statistical value (referred to as a first statistical value) of image feature data (referred to as first feature data) at a plurality of locations in a region serving as a reference, and calculates image feature data (referred to as second feature data) at a plurality of locations in a region serving as an evaluation target. Then, the semiconductor inspection system according to the first embodiment converts the second feature data in the region serving as the evaluation target by using the first statistical value of the region serving as the reference, thereby obtaining second feature data after conversion as the cross-sectional shape index. The cross-sectional shape index can be used as an index which is related to uniformity of a pattern within the surface of the wafer or between wafers and which is used for quantitatively evaluating a change in cross-sectional shape.
[System including Semiconductor Inspection System]
[0039]
[0040] The semiconductor inspection system according to the first embodiment mainly includes the scanning electron microscope (SEM) 1. The SEM 1 includes a main unit to be described later (
[0041] The semiconductor inspection system according to the first embodiment is not limited to a configuration example in
[0042] As will be described later, the cross-sectional observation device 2 is a device having a function of performing cross-sectional processing on a semiconductor device (particularly, a wafer) and observing a cross-sectional shape thereof; a FIB-SEM is applied in the example of the first embodiment. The cross-sectional processing is processing that destroys a part of the semiconductor device such that a structure of a cross-section of the semiconductor device is exposed and observable. The cross-sectional observation is observation of a cross-section formed by the cross-sectional processing. The cross-sectional processing can be performed not only by the FIB-SEM but also by other types of devices. The cross-sectional observation can be performed not only by using the FIB-SEM but also by using other types of devices, for example, a cross-sectional SEM or a scanning transmission electron microscope (STEM).
[0043] The cross-sectional shape estimation system 3, which will be described later, is a system having a function of estimating a cross-sectional shape of a semiconductor device (particularly, a wafer). As the cross-sectional shape estimation system 3, for example, a system described in PTL 1 can also be applied. The cross-sectional shape estimation system 3 stores data and information for estimation, for example, data and information representing a correspondence relation between image feature data and a cross-sectional shape dimension, into a database (DB) 3a, for example, based on learning. The cross-sectional shape estimation system 3 uses the data and information stored in the DB 3a to estimate a cross-sectional shape dimension of the semiconductor device based on the image feature data.
[0044] The manufacturing parameter control system 4, which will be described later, is a system that performs control so as to adjust a parameter (for example, an etching parameter) related to manufacturing (for example, an etching process) of the semiconductor device performed by the semiconductor device manufacturing apparatus 5 (for example, an etching device). The manufacturing parameter control system 4 may be a part of the MES 6. The MES 6 is a system that manages semiconductor device manufacturing execution using the semiconductor device manufacturing apparatus 5. The MES 6 and the like have design data and manufacturing execution management information related to the semiconductor device serving as a target.
[0045] The client terminal 7 is an information processing terminal device having a function of accessing each system (particularly, a server function among the functions) such as the SEM 1 via the communication network 9. A general PC or the like can be applied as the client terminal 7, and an input device for external input and an output device for display or the like are built in the client terminal 7, or these devices are externally connected to the client terminal 7. A user such as an operator may use each system such as the SEM 1 from the client terminal 7.
[0046] As a modification, for example, the processor system 100 or the like and the client terminal 7 may be connected via a communication network such as the Internet. For example, functions of the processor system 100 or the like may be implemented by a cloud computing system or the like.
[0047] The communication network 9 may be provided with a program distribution server (not shown). The program distribution server distributes data such as the program according to the first embodiment to, for example, the processor system 100 of the SEM 1.
[Processor System]
[0048]
[0049] The processor 201 is implemented by a semiconductor device such as a CPU, an MPU, or a GPU. The processor 201 includes a ROM, a RAM, various peripheral functions, and the like. The processor 201 performs processing corresponding to a control program 211 stored in the memory 202. Accordingly, an SEM control function 221, a semiconductor inspection function 222, and the like are implemented. The SEM control function 221 is a function of controlling the SEM 1 as a controller, but can be omitted. The semiconductor inspection function 222 is a main function of the semiconductor inspection system according to the first embodiment, and includes a function of calculating a cross-sectional shape index.
[0050] The memory 202 stores the control program 211, setting information 212, image data 213, processing data 214, inspection result data 215, and the like. By the control program 211, the semiconductor inspection function 222 and the like are implemented. The setting information 212 is system setting information or user setting information of the semiconductor inspection function 222. The image data 213 is data of a captured image acquired by the SEM 1. The processing data 214 is data generated during processing using the semiconductor inspection function 222 or the like. The inspection result data 215 is data including feature data, a cross-sectional shape index, an evaluation result, or the like that are obtained as processing results using the semiconductor inspection function 222 or the like.
[0051] The communication interface device 203 is a device including a communication interface for the SEM 1, the communication network 9, and the like. The input and output interface device 204 is a device including an input and output interface, and an input device 205 and an output device 206 are externally connected to the input and output interface device 204. Examples of the input device 205 include a keyboard and a mouse. Examples of the output device 206 include a display and a printer. The processor system 100 may include the input device 205 and the output device 206. A user, such as an operator, may use the processor system 100 through operation of the input device 205 or screen display of the output device 206. The user may use the processor system 100 by accessing the processor system 100 from the client terminal 7 in
[0052] An external storage device (for example, a memory card or a disk) may be connected to the processor system 100, and input and output data of the processor system 100 may be stored in the external storage device.
[0053] When a function is used in a client-server communication form between a system such as the processor system 100 in
[SEM]
[0054]
[0055] The image-capturing unit 101 includes an electron gun 108, an acceleration electrode 110, a converging lens 111, a deflection lens 112, an objective lens 113, a stage 115, a detector 117, and the like as components mounted on a lens barrel (in other words, a housing). The electron gun 108 emits an electron beam 109. The acceleration electrode 110 accelerates the electron beam 109 emitted from the electron gun 108. The converging lens 111 converges the electron beam 109. The deflection lens 112 deflects a trajectory of the electron beam 109. The objective lens 113 controls a height at which the electron beam 109 is converged.
[0056] The stage 115 is a sample stage on which a sample 300 (in other words, a semiconductor device or a wafer) whose image is to be captured is placed. Since the stage 115 is, for example, a mechanism capable of moving in an X direction and a Y direction shown in the drawing, a field of view for capturing an image can be set.
[0057] The detector 117 detects particles such as secondary electrons 116 generated from the sample 300 irradiated with the electron beam 109.
[0058] The overall control unit 102 corresponds to a controller of the SEM 1, and controls the entire image-capturing unit 101 and each of the units. The overall control unit 102 gives, to each unit, an instruction such as drive control. The overall control unit 102 can be implemented by a computer system or a dedicated circuit. At least one of the SEM control function 221 in
[0059] The signal processing unit 103 converts a signal detected by the detector 117 into image data based on analog/digital conversion or the like according to the instruction from the overall control unit 102, and stores the image data into the storage unit 105. The signal processing unit 103 can be implemented by a computer system or a dedicated circuit. The storage unit 105 can be implemented by, for example, a nonvolatile storage device.
[0060] The external input unit 104 is a unit that inputs an instruction or the like to the overall control unit 102 based on an input operation of an operator, and can be implemented by a computer system, an input device, or the like. The display unit 107 is a unit that is connected to the overall control unit 102 and displays information from the overall control unit 102 for the operator, and can be implemented by a computer system, an output device, or the like.
[0061] The processor system 100 acquires image data from the storage unit 105. The image calculation unit 106 of the processor system 100 is a unit that performs processing corresponding to the semiconductor inspection function 222 in
[0062] The SEM is not limited to a configuration example in
[Image Feature Data]
[0063] Next, the image feature data will be described with reference to
[0064]
[0065]
[0066] In general, a signal amount of a signal waveform such as the signal waveform 407 changes with high sensitivity with respect to a tilt angle of a measurement target, and the signal amount at a side wall portion (for example, the side wall portion 405) of the pattern is larger than the signal amount at a flat portion (for example, the upper surface portion 406) of the pattern. As a result, the signal amount at the side wall portion of the pattern is large, and a region called a white band (also referred to as WB) 403 appears in the signal waveform 407. Thus, the signal amount of the signal waveform 407 changes according to the cross-sectional shape of the pattern.
[0067]
[0068] In general, feature data indicating the signal amount of the signal waveform (for example, the WB peak 501) is used to grasp a change in the height direction of the pattern (corresponding cross-section), feature data indicating a width of the signal waveform (for example, the width 506) is used to grasp a change in the width direction such as the line width of the cross-sectional shape, and feature data indicating an inclination of the signal waveform (for example, the inclination 507) is used to grasp changes in the rounding and tailing of corners of the pattern, the tilt angle of the side wall, and the like.
[0069] The semiconductor inspection system according to the first embodiment uses the feature data of one or more types to calculate a cross-sectional shape index for each type of feature data.
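As a rough illustration of how such feature data might be derived, the following Python sketch computes a WB peak, a line width, and an inclination from a single one-dimensional signal waveform. The function name `extract_features`, the peak-to-peak width definition, and the synthetic waveform are illustrative assumptions, not part of the disclosure.

```python
def extract_features(profile, pixel_nm=1.0):
    """Toy feature extraction from a 1-D top-view SEM signal waveform.

    Illustrative only: WB peak is taken as the brightest sample, line width
    as the distance between the two white-band peaks (one per side wall),
    and inclination as the steepest finite-difference slope.
    """
    n = len(profile)
    mid = n // 2
    # White-band peaks: brightest sample on each side of the pattern centre.
    left_peak = max(range(mid), key=lambda i: profile[i])
    right_peak = max(range(mid, n), key=lambda i: profile[i])
    wb_peak = max(profile[left_peak], profile[right_peak])
    line_width = (right_peak - left_peak) * pixel_nm
    # Inclination: largest step between adjacent samples.
    inclination = max(abs(profile[i + 1] - profile[i]) for i in range(n - 1))
    return {"wb_peak": wb_peak, "line_width": line_width, "inclination": inclination}

# Synthetic waveform: dark bottom, bright side walls (white bands), darker top.
profile = [10, 10, 80, 200, 90, 60, 60, 90, 200, 80, 10, 10]
print(extract_features(profile))
```

In practice each feature type would be calculated by the signal-waveform analysis described above; this sketch only fixes the idea that one scalar per feature type is obtained per location.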
[Processing Flow (1-1)]
[0070] Next, a main processing flow of the semiconductor inspection system according to the first embodiment will be described with reference to
[0071] In step S101, the processor system 100 sets a reference region and evaluation regions on a surface of a target wafer based on the external input unit 104 (or the input device 205 in
[Reference Region and Evaluation Region]
[0072]
[0073] As shown in (A) of
[0074] The evaluation region 72 and the reference region 71 are set as two-dimensional regions of a set size in order to take a plurality of locations as samples in one region. The evaluation region 72 or the reference region 71 may be set as a region having a size that includes a pattern shape (for example, at least one line pattern) such that feature data in the region can be calculated. As a modification, the evaluation region 72 is not limited to a two-dimensional region, and may be set as one point according to a calculation formula of the cross-sectional shape index.
[0075] The reference region 71 is set as a reference (in other words, a reference for normalization) for comparison with the image feature data of each evaluation region 72. From this viewpoint, a region in which a pattern relatively close to an ideal shape is assumed to be formed is selected as the reference region 71. In this example, the reference region 71 is set at a center of the wafer 701. The reference region 71 is not limited thereto, and may be set at any position other than the center.
[0076] As in the example of (A) of
[0077] In the example of (B) of
[0078] As shown in the example of the SEM images 710, 721, and 722 in (B) of
[0079] (C) of
[0080] In the first embodiment, a change in image feature data within the surface of the wafer or between wafers is quantified based on the fluctuation of the image feature data caused by the local shape variation of the pattern shape of the wafer. Therefore, a range of the reference region 71 (in other words, an image size) is set such that the local shape variation can be statistically evaluated with sufficient accuracy. In the specific example, as shown in (B) of
[Processing Flow (1-2)]
[0081] With reference to
[0082] Step S103 includes steps S103A and S103B. First, in step S103A, the processor system 100 calculates feature data (also referred to as first feature data) from the SEM image of the reference region 71. In the first embodiment, the feature data is one type of feature data defined in advance. An example of the feature data is the above-described line width (the line width 506 in
[0083] In step S103B, the processor system 100 calculates a predetermined statistical value (also referred to as a first statistical value) based on the feature data Fs. The statistical value (the first statistical value) of the reference region 71 is defined as a statistical value Ss.
[Calculation of Feature Data and the Like]
[0084]
[0085] As shown in
[0086] In
[0087] As shown in
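The statistical computation of steps S103A and S103B can be sketched as follows, assuming the feature data Fs is available as a list of per-location values. The function name and the line-width numbers are hypothetical; the disclosure does not fix a specific implementation.

```python
from statistics import mean, pstdev

def reference_statistics(feature_values):
    """First statistical value Ss for the reference region.

    Returns the average (mu_s) and the fluctuation (sigma_s) of the
    feature data Fs measured at a plurality of locations; sigma_s
    captures the local shape variation of the pattern.
    """
    mu_s = mean(feature_values)
    sigma_s = pstdev(feature_values)  # population standard deviation
    return mu_s, sigma_s

# Hypothetical line widths (nm) at several locations in the reference region.
fs = [24.8, 25.1, 25.0, 24.9, 25.2]
mu_s, sigma_s = reference_statistics(fs)
```

The choice of population versus sample standard deviation here is an illustrative assumption; either convention works as long as it is applied consistently.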
[Processing Flow (1-3)]
[0088] With reference to
[0089] The processor system 100 calculates a statistical value (also referred to as a second statistical value) based on a plurality of pieces of image feature data Fe in the evaluation region 72. In the example of the first embodiment, the processor system 100 calculates, for example, an average value (μe) as the statistical value (the second statistical value).
[0090] In step S105, the processor system 100 calculates a cross-sectional shape index (Ie) for each evaluation region 72. The processor system 100 converts a change amount of the average value μe of the image feature data Fe calculated in the evaluation region 72 into a normalized cross-sectional shape index Ie based on the statistical value Ss calculated in the reference region 71. In other words, the processor system 100 converts a statistical value (e.g., μe) of the image feature data Fe in the evaluation region 72 by using the statistical value Ss (e.g., μs and σs) of the image feature data Fs in the reference region 71, thereby calculating the cross-sectional shape index Ie as feature data after conversion.
[0091] The conversion is represented by the following Formula 1, for example: Ie = (μe − μs)/(α × σs). In Formula 1, Ie is calculated by dividing the difference between μe and μs by α × σs, and α is a parameter for adjusting the magnitude of the local shape variation of the reference region 71.
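A direct transcription of Formula 1 as described in the text, under the assumption that the per-location feature data of one evaluation region is available as a list (the numeric values are invented for illustration):

```python
def cross_section_index(fe_values, mu_s, sigma_s, alpha=1.0):
    """Formula 1: Ie = (mu_e - mu_s) / (alpha * sigma_s).

    mu_e is the average (second statistical value) of the feature data Fe
    in one evaluation region; mu_s and sigma_s come from the reference
    region; alpha adjusts the assumed magnitude of the local shape variation.
    """
    mu_e = sum(fe_values) / len(fe_values)
    return (mu_e - mu_s) / (alpha * sigma_s)

# Evaluation-region line widths shifted well above the reference average.
ie = cross_section_index([25.4, 25.6, 25.5], mu_s=25.0, sigma_s=0.15)
```

With these hypothetical numbers, Ie comes out a little above 3, i.e. the shift exceeds the local shape variation by about a factor of three.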
[Distribution of Feature Data and Index]
[0092]
[0093] The change in image feature data represents a change in cross-sectional shape of the semiconductor pattern, but since a unit of the image feature data is not associated with the cross-sectional shape dimension, the magnitude of the change in cross-sectional shape cannot be evaluated based on the change in image feature data. In other words, it is not possible to specify what kind of change in cross-sectional shape (for example, the line width, the WB, the inclination of the side wall, the rounding, the tailing, or the like) a change in certain feature data specifically represents, and it is not possible to quantitatively evaluate the change in cross-sectional shape only based on the image feature data.
[0094] On the other hand, the semiconductor inspection system according to the first embodiment obtains the cross-sectional shape index Ie, which is an index obtained by normalizing the change amount of the image feature data based on the fluctuation of the image feature data caused by the local shape variation, for the image feature data (for example, the feature data A and the feature data B) by the processing as shown in
[0095] In
[0096] Examples of the results of two types of cross-sectional shape indices, that is, the cross-sectional shape index A and the cross-sectional shape index B in
[0097] On the other hand, the wafer distribution 904 of the cross-sectional shape index B of the image feature data B indicates that only a change equal to or smaller than the local shape variation occurs. In this example, the entire surface of the wafer consists of regions in which the index B ranges from −1 to +1. It can be quantitatively determined from the wafer distribution 904 that the change in cross-sectional shape dimension correlated with the image feature data B remains within the local variation over the surface of the wafer.
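One plausible way to act on such a distribution is to flag evaluation regions whose index falls outside the local-variation band of −1 to +1 as candidates for cross-sectional observation. The function name and the threshold of 1 are illustrative choices, not mandated by the disclosure.

```python
def flag_regions(indices, threshold=1.0):
    """Return positions of evaluation regions whose |Ie| exceeds the
    local-variation band; these are candidate cross-sectional
    observation positions. Regions within the band are treated as
    showing only local shape variation."""
    return [i for i, ie in enumerate(indices) if abs(ie) > threshold]

# Hypothetical indices for evaluation regions from wafer centre to edge.
ie_values = [0.1, -0.4, 0.8, 1.6, 3.2]
print(flag_regions(ie_values))  # prints [3, 4]
```

This matches the use described later of selecting cross-sectional observation positions based on the second feature data after conversion.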
[Effects (1)]
[0098] According to the first embodiment, it is possible to use the image feature data and the cross-sectional shape index acquired based on a Top-view SEM image to quantitatively evaluate a change in cross-sectional shape of a semiconductor pattern in a non-destructive manner before the cross-sectional observation. For example, by displaying the cross-sectional shape index on the display screen for a user to view, it is possible to determine the locations on the surface of the wafer suitable for cross-sectional processing for the cross-sectional observation. Accordingly, it is possible to perform efficient cross-sectional observation, inspection, and the like at low cost.
[0099] As shown in
[0100] The semiconductor inspection system according to the first embodiment may display information including the cross-sectional shape index obtained in this manner together with the GUI on, for example, a display screen of a display device of the processor system 100 (or client terminal 7 or the like). Contents on the display screen may be the same as those in
[0101] The processor system 100 stores, in a storage unit (a memory in the processor system 100, an external DB, or the like), various types of data and information used during the processing shown in
[0102]
[0103] The invention is not limited to the configuration example in the first embodiment, and various modifications can be made. For example, instead of the SEM 1, other types of electron microscopes, charged particle beam devices, or the like that can capture images may be applied.
[0104] In the first embodiment, an example has been described in which the image feature data is calculated based on the signal waveforms calculated from the SEM images as shown in
[0105] In the first embodiment, an example has been described in which the reference region 71 and the evaluation regions 72 are set on the same wafer to evaluate the uniformity and variation of the cross-sectional shape within the surface of the wafer, but the present invention is not limited thereto. It is also possible to set the reference region 71 and the evaluation region 72 between different wafers to evaluate the uniformity and variation of the cross-sectional shape between the wafers. For example, the reference region 71 is set within a surface of a first wafer, and the evaluation region 72 is set within a surface of a second wafer. A plurality of evaluation regions may be selected from a plurality of wafers. In these cases, the processing of the first embodiment can be similarly applied.
[0106] In the first embodiment, an example has been described in which a target semiconductor pattern is a line pattern, but the present invention is not limited thereto. For example, the processing of the first embodiment can be similarly applied when the target semiconductor pattern is a pattern such as a hole. For example, when an SEM image includes an elliptical shape such as a hole, the feature data may be calculated by taking a plurality of locations (samples) from the hole in a direction in which cross-sectional observation can be performed.
[0107] In the first embodiment, an example has been described in which one line pattern is included in the SEM image, but the present invention is not limited thereto. For example, when a plurality of line patterns or the like such as periodic patterns are included in an image (in other words, in the field of view for capturing an image), the processing of the first embodiment can also be similarly applied. For example, the feature data may be calculated from a plurality of locations (samples) of the plurality of line patterns within the region.
[0108] In the first embodiment, an example has been described in which a change in cross-sectional shape of the semiconductor pattern (for example, a change in cross-section in the Z direction) is evaluated. In addition to the change in this direction, a change in shape in directions parallel to the surface of the wafer and the semiconductor pattern (for example, the X and Y directions) may be grasped from the two-dimensionally captured SEM image. A known technique may be applied to the change in shape in the parallel directions. Accordingly, a change in three-dimensional shape of the semiconductor pattern can be evaluated.
Second Embodiment
[0109] A semiconductor inspection system according to a second embodiment will be described with reference to FIG. 11 and subsequent drawings. A basic configuration of the second embodiment is the same as or similar to that of the first embodiment. Hereinafter, configuration portions in the second embodiment different from those of the first embodiment will be mainly described.
[0110] In the semiconductor inspection system according to the second embodiment, an observation position of a cross-sectional shape of a sample (a cross-sectional observation region corresponding to the observation position) is selected based on a cross-sectional shape index obtained as in the first embodiment. Then, in the second embodiment, the selected observation position is automatically observed by the cross-sectional observation device 2 in
[0111] In the second embodiment, suitable positions and regions in which a change in cross-sectional shape is assumed are selected based on the cross-sectional shape index. Accordingly, it is possible to reduce overlooking of changes in cross-sectional shape and duplicate observation of similar changes. In the related art, a plurality of positions and regions on a surface of the wafer are examined comprehensively in order or in a random manner; in the second embodiment, by contrast, the cross-sectional observation can be performed sequentially from the suitable observation positions. Therefore, according to the second embodiment, the number of locations subjected to cross-sectional processing and the number of times the processing is performed can be reduced, and it is possible to perform efficient evaluation and inspection.
[Semiconductor Inspection System]
[0112] A system similar to that shown in
[0113] The cross-sectional observation device 2 is, for example, an FIB-SEM, and is a device that can be used for performing the cross-sectional processing and the cross-sectional observation. The cross-sectional observation device 2 is used to perform the cross-sectional observation on the cross-sectional observation region selected by the processor system 100.
[0114] As a modification, a functional part that performs processing for selecting the cross-sectional observation region may be mounted on a device other than the processor system 100 of the SEM 1 shown in
[Processing Flow (2-1)]
[0115]
[0116] In step S201, the processor system 100 receives cross-sectional observation conditions through the external input unit based on an operation of the user. The cross-sectional observation conditions may be set in advance. In this example, as the cross-sectional observation conditions, a cross-sectional observation candidate region on the surface of the target wafer, the number of patterns (m) for performing cross-sectional observation in the region, and a reference observation region are input.
[Cross-sectional Observation Candidate Region and Reference Observation Region]
[0117]
[0118] Each cross-sectional observation candidate region 1202 is a candidate region for evaluating a global change in cross-sectional shape on the surface of the target wafer 1200. The cross-sectional observation candidate region 1202 is set for each chip, for example. The reference observation region 1201 is selected as an observation location that the user definitely wants to acquire regardless of the distribution of the cross-sectional shape index. Each cross-sectional observation candidate region 1202 is compared with the reference observation region 1201. A range (in other words, an image size) of the reference observation region 1201 is set so as to cover the local shape variation of the semiconductor pattern. In a specific example, this range is set as a region including at least one line pattern, for example, as described above. In general, since processing conditions for a semiconductor wafer are determined based on the center of the wafer, cross-sectional observation is performed at the center of the wafer. Therefore, in this example, the center of the wafer is set as the reference observation region 1201.
[Processing Flow (2-2)]
[0119] With reference to
[0120] In step S203, the processor system 100 calculates, from the SEM image captured in the reference observation region 1201, image feature data Fs corresponding to signal waveforms at a plurality of locations (samples) in the image (step S203A). The processor system 100 calculates, for example, an average value μs and a standard deviation σs as a statistical value Ss based on the image feature data Fs (step S203B).
[0121] In step S204, similarly, the processor system 100 calculates, as feature data of the same type as the image feature data Fs calculated from the reference observation region 1201, image feature data Fe corresponding to the signal waveforms at a plurality of locations (samples) in the Top-view SEM image captured for each of the cross-sectional observation candidate regions 1202 (step S204A). The processor system 100 calculates, for example, an average value μe as a statistical value based on the image feature data Fe (step S204B).
[0122] In step S205, assuming that the cross-sectional observation is performed on m patterns under the conditions input in step S201, the processor system 100 calculates a standard deviation σ's of a sample average distribution corresponding to the m patterns with respect to the standard deviation σs calculated in the reference observation region 1201.
[0123] The standard deviation σ's of the sample average distribution corresponding to the m patterns is calculated by the following Formula 2 according to a central limit theorem as shown in
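By the central limit theorem, the standard deviation of the sample-mean distribution over m patterns is σs divided by the square root of m, which is presumably what Formula 2 expresses; a minimal sketch (the function name is hypothetical):

```python
import math

def sample_mean_std(sigma_s, m):
    """Standard deviation of the sample-mean distribution when averaging
    over m patterns, per the central limit theorem (sigma_s / sqrt(m))."""
    return sigma_s / math.sqrt(m)
```

Averaging over m = 4 patterns, for example, halves the local shape variation reflected in the comparison.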
[0124]
[0125] With reference to
[The Number of Patterns]
[0126]
[Cross-Sectional Shape Index]
[0127]
[0128]
[0129] With reference to
[0130] A first viewpoint is as follows. The cross-sectional shape index Ie represents a change in cross-sectional shape. Therefore, a first condition is to select a cross-sectional observation region from the plurality of cross-sectional observation candidate regions 1202 so as to cover a range of this change. Accordingly, it is possible to avoid overlooking a change in cross-sectional shape occurring on the surface of the wafer.
[0131] A second viewpoint is as follows. A difference in the change in cross-sectional shape equal to or smaller than the local shape variation (for example, the example of the wafer distribution 904 of the index B in
[Selection of Cross-Sectional Observation Region]
[0132]
[0133]
[0134] A detailed processing example when the processor system 100 automatically selects the above four cross-sectional observation regions will be described below. The processor first selects the region C1 in which the index value v1 is 0. The region C1 is the same as the reference observation region. Next, the processor selects the region C2 corresponding to the index value v2, which is the minimum value within the range of the index A, and the region C4 corresponding to the index value v4, which is the maximum value within the range of the index A. Next, the processor selects the region C3 corresponding to the index value v3, that is, a region whose index value differs from that of the region C1 by +1 or more. A difference between index values of 1 or less indicates that there is only a change equal to or smaller than the local shape variation; a difference of 1 or more indicates that there is a change equal to or greater than the local shape variation.
[Processing Flow (2-3)]
[0135] With reference to
[0136] In step S208, the cross-sectional observation device 2 performs cross-sectional processing such that a cross-section appears at the input position coordinates of the cross-sectional observation region, and acquires a cross-sectional SEM image of the processed cross-section of the cross-sectional observation region. At this time, the cross-sectional observation device 2 acquires a cross-sectional SEM image including a plurality of (m) patterns or a plurality of (m) cross-sectional SEM images divided for each pattern according to the number of patterns m input in step S201. The cross-sectional SEM image is an image similar to the signal waveform in
[0137] In step S209, the cross-sectional observation device 2 measures a cross-sectional shape dimension using the cross-sectional SEM image.
[0138] The cross-sectional observation device 2 or the processor system 100 stores, into a memory, a database, or the like, various types of data and information obtained during the processing shown in
[Effects (2)]
[0139] As described above, according to the second embodiment, since the suitable cross-sectional observation region is selected based on the cross-sectional shape index, the cross-sectional observation can be performed by using the cross-sectional observation device 2 efficiently without overlooking a change in cross-sectional shape on the surface of the wafer and without waste.
[0140] As a modification of the processing in
[0141] The cross-sectional shape index Ie in Formula 4 represents a statistical degree of separation between a frequency distribution of image feature data in a cross-sectional observation candidate region and a frequency distribution of image feature data in a reference observation region. In the case of this index, since a local shape variation of the cross-sectional observation candidate region is also taken into consideration, the index can be applied even when the shape variation is greatly different between the reference observation region and the cross-sectional observation candidate region. Formula 4 according to this modification is similarly applicable to the first embodiment (step S104).
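Formula 4 itself is not reproduced in this excerpt. One common normalization with the property described in [0141] — taking the variation of both the reference region and the candidate region into account — divides the mean difference by the combined standard deviation. The sketch below is therefore an assumption-based illustration, not the patent's actual Formula 4:

```python
import math
import statistics

def separation_index(ref_features, eval_features):
    """Illustrative degree-of-separation index between the feature-data
    distributions of a reference region and a candidate region.  The
    combined-variance denominator is an assumed form, not Formula 4 itself."""
    mu_s = statistics.mean(ref_features)
    sigma_s = statistics.stdev(ref_features)
    mu_e = statistics.mean(eval_features)
    sigma_e = statistics.stdev(eval_features)
    # Normalize the mean shift by the pooled spread of both regions.
    return (mu_e - mu_s) / math.sqrt(sigma_s ** 2 + sigma_e ** 2)
```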
[0142] In the second embodiment, an example has been described in which the processor automatically calculates and selects the cross-sectional observation region using the cross-sectional shape index in step S207 of calculating and selecting the cross-sectional observation region in
[Interactive Cross-Sectional Observation]
[0143] For example, the processor system 100 calculates, by using the cross-sectional shape index Ie shown in
[0144] The auxiliary map 1700 in
[0145] When the cross-sectional observation candidate region (the first type candidate region) having a change equal to or greater than the local shape variation is observed as a cross-sectional observation region, a difference from the reference observation region can be evaluated. Therefore, in the modification, the processor system 100 displays such an auxiliary map 1700 together with the GUI on the display screen. The user views the auxiliary map 1700 and selects, for example, one cross-sectional observation region from the cross-sectional observation candidate regions (first type candidate regions) each having a change equal to or greater than the local shape variation.
[0146] Next, similarly, the processor system 100 calculates, by using the cross-sectional shape index Ie, a cross-sectional observation candidate region (a first type candidate region) having a change equal to or greater than the local shape variation and a cross-sectional observation candidate region (a second type candidate region) only having a change equal to or smaller than the local shape variation with respect to two regions including the cross-sectional observation region selected by the user and the reference observation region. Then, the processor system 100 similarly displays an auxiliary map including these regions. The user can select a next cross-sectional observation region by viewing the updated auxiliary map. With a configuration in which such processing and operations are sequentially repeated, it is possible to select a suitable cross-sectional observation region so as to satisfy the above two viewpoints.
[0147] The above modification may be performed as follows. The processor system 100 selects cross-sectional observation regions one by one. The processor system 100 or the user first selects, for example, one cross-sectional observation region (referred to as a first region) based on the cross-sectional shape index. The processor system 100 causes the cross-sectional observation device 2 to perform cross-sectional processing on the first region for cross-sectional observation. The user observes a cross-section of the first region. Next, when the user desires to observe a cross-section of another region, the user causes the processor system 100 to select a next cross-sectional observation region (referred to as a second region). In that case, the initial first region can also be set as the reference observation region. The processor system 100 or the user selects the second region, and causes the cross-sectional observation device 2 to perform cross-sectional processing on the second region for cross-sectional observation. The user observes a cross-section of the second region. Similarly, a cross-sectional observation region is sequentially selected as necessary for the next cross-sectional observation. Accordingly, it is possible to perform the cross-sectional observation while minimizing the cross-sectional processing that requires destruction of a part of the sample.
[0148] In the second embodiment, a case in which one cross-sectional shape index (for example, the index A) is used has been described as an example. The invention is not limited thereto, and it is also possible to use a plurality of cross-sectional shape indices corresponding to a plurality of types of feature data. When a change in the plurality of independent cross-sectional shape indices occurs on the surface of the wafer, a cross-sectional observation region may be selected so as to cover a range of the change in each cross-sectional shape index. As a processing example, in the flow of
[0149] In other modifications, when there are a plurality of cross-sectional shape indices that independently change on the surface of the wafer, the processor may compare these indices, automatically select an index in which a change within the surface of the wafer is larger than the local shape variation, and select a cross-sectional observation region for the selected index. Since the plurality of indices are normalized, such comparison is also possible.
[0150] In the second embodiment, an example of a system has been described in which automatic cross-sectional observation and cross-sectional shape dimension measurement can be performed by connecting the SEM 1 and the FIB-SEM which is the cross-sectional observation device 2, as shown in
[0151] In the second embodiment, it is possible to associate a result obtained by measuring the cross-sectional shape dimension of the cross-sectional SEM image by the cross-sectional observation device 2 in step S209 in
[0152] The data and information obtained as results according to the second embodiment can also be used in the cross-sectional shape estimation system 3 shown in
[0153] Therefore, in the modification, the data and information which are the results according to the second embodiment and in which the cross-sectional shape dimension is associated with the cross-sectional shape index are added and registered in the database 3a of the cross-sectional shape estimation system 3 (data and information in which the cross-sectional shape is associated with the image feature data). In other words, the database 3a stores data and information in which the image feature data, the cross-sectional shape index, and the cross-sectional shape dimension are associated with one another. Accordingly, in the cross-sectional shape estimation system 3, it is possible to efficiently create the database 3a covering a change in cross-sectional shape of the semiconductor pattern. In other words, based on the function in the second embodiment, it is possible to efficiently create information of the database 3a used for learning for estimation in the cross-sectional shape estimation system 3. Accordingly, accuracy of cross-sectional shape estimation based on the database 3a can be improved.
[0154] At this time, in step S207 shown in
Third Embodiment
[0155] A semiconductor inspection system according to a third embodiment will be described with reference to
[0156] The cross-sectional shape index Ie which is obtained by converting the image feature data and described in the first embodiment is an index indicating a global change on a surface of the wafer with respect to a local shape variation, and is an index quantitatively indicating the uniformity of a cross-sectional shape on the surface of the wafer. In the third embodiment, the manufacturing parameter is controlled using the cross-sectional shape index Ie.
[Manufacturing Parameter Control System]
[0157] The system shown in
[0158] The third embodiment can be seen as having a configuration in which a processor system is provided in the manufacturing parameter control system 4 shown in
[Processing Flow]
[0159]
[0160] Step S302 is processing for inputting the cross-sectional shape index Ie to the manufacturing parameter control system 4 through the input and output unit. In other words, the manufacturing parameter control system 4 acquires the cross-sectional shape index Ie and stores the cross-sectional shape index Ie into a memory.
[0161] In step S303, the user inputs, on a display screen provided by a system (for example, the manufacturing parameter control system 4), a target value for controlling the uniformity of a cross-sectional shape through the input and output unit. The manufacturing parameter control system 4 acquires information on the input target value and stores the information into the memory.
[0162] In step S302, the manufacturing parameter control system 4 refers to image feature data before conversion for the input cross-sectional shape index. In the first embodiment, since various types of data and information including the image feature data and the cross-sectional shape index are stored in association with one another (for example,
[0163] In a database 4a of the manufacturing parameter control system 4, information such as a manufacturing parameter related to the manufacturing of a semiconductor device in the semiconductor device manufacturing apparatus 5 is stored in advance. As the information such as the manufacturing parameter, information managed by the MES 6 may be used. Examples of the manufacturing parameter include an etching parameter when the semiconductor device manufacturing apparatus 5 is an etching device. Examples of the etching parameter include gas pressure and bias power in the case of dry etching.
[0164] In step S304, the manufacturing parameter control system 4 associates the manufacturing parameter (for example, the etching parameter) stored in the database 4a in advance with the image feature data.
[0165] In step S305, the manufacturing parameter control system 4 calculates a manufacturing parameter, that is, a manufacturing parameter after adjustment according to the relation between the manufacturing parameter and the image feature data, so as to satisfy a condition that the uniformity of the selected image feature data (for example, the uniformity in the surface of the wafer) is better than the input target value. At this time, the manufacturing parameter control system 4 quantitatively evaluates the uniformity based on the cross-sectional shape index. The manufacturing parameter control system 4 stores the calculated manufacturing parameter after adjustment into the memory or the database 4a. The adjustment of the manufacturing parameter may be performed by multiplying an original parameter value by an adjustment coefficient, but is not limited thereto.
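The multiplicative adjustment mentioned in [0165] can be sketched minimally as follows. The function and parameter names are hypothetical; in practice the coefficients would be derived from the stored relation between the manufacturing parameter and the image feature data so that the uniformity target is satisfied:

```python
def adjust_parameters(params, coefficients):
    """Apply per-parameter multiplicative adjustment coefficients.

    params: current manufacturing parameters, e.g. etching parameters
            such as {"gas_pressure": 2.0, "bias_power": 150.0}
    coefficients: adjustment coefficients, e.g. {"gas_pressure": 1.5};
                  parameters without a coefficient are left unchanged
    """
    return {name: value * coefficients.get(name, 1.0)
            for name, value in params.items()}
```

As noted in [0165], multiplication by a coefficient is only one possible adjustment rule.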
[0166] In step S306, the manufacturing parameter control system 4 inputs the calculated manufacturing parameter after adjustment to the semiconductor device manufacturing apparatus 5 through the input and output unit. In other words, the semiconductor device manufacturing apparatus 5 receives the manufacturing parameter after adjustment, and the manufacturing parameter after adjustment is set in the semiconductor device manufacturing apparatus 5. Thereafter, the semiconductor device manufacturing apparatus 5 performs a manufacturing process (for example, an etching process) according to the manufacturing parameter after adjustment. The manufacturing parameter after adjustment may also be input and set in the MES 6. The adjustment of the manufacturing parameter as described above can be repeated as appropriate.
[Effects (3)]
[0167] According to the third embodiment, the manufacturing parameter can be suitably adjusted based on the image feature data and the cross-sectional shape index, and the uniformity of the cross-sectional shape on the surface of the wafer can be improved.
[0168] In the third embodiment, the manufacturing parameter is adjusted based on the cross-sectional shape index obtained by normalizing a change amount of the image feature data with the fluctuation of the image feature data caused by the local shape variation. The invention is not limited thereto; the same applies to the cross-sectional shape estimation system 3 according to the second embodiment shown in
[0169]
[0170] A configuration example of the system shown in
[0171] In the example of
[0172] Although the embodiments of the present disclosure have been specifically described, the present disclosure is not limited to the embodiments described above and can be variously modified without departing from the scope of the present disclosure. Except for essential components, the components of the embodiments may be added, deleted, replaced, or the like. Unless otherwise limited, each component may be singular or plural. In addition, an embodiment combining the respective embodiments is also possible.