METHODS AND APPARATUS FOR USE IN THE SPATIAL REGISTRATION OF OBJECTS
20230252672 · 2023-08-10
CPC classification
H01L2224/8313
ELECTRICITY
H01L2224/83132
ELECTRICITY
H01L2924/00014
ELECTRICITY
H01L2224/8385
ELECTRICITY
G03F9/7088
PHYSICS
Abstract
A method for use in the spatial registration of first and second objects comprises fixing the first and second objects to the same motion control stage in an unknown spatial relationship, using an imaging system to acquire an image of the first object, determining a position and orientation of the first object in a frame of reference of the motion control stage based at least in part on the acquired image of the first object, using the imaging system to acquire an image of the second object, and determining a position and orientation of the second object in the frame of reference of the motion control stage based at least in part on the acquired image of the second object. The method may be used in the spatial registration of first and second objects and, in particular though not exclusively, for use in the spatial registration of optical or electronic components relative to one another, or for use in the alignment of a first object such as an optical or electronic component relative to a second object such as a feature, a structure, a target area or a target region defined on a substrate or a wafer.
Claims
1. A method for use in the spatial registration of first and second objects, the method comprising: fixing the first and second objects to the same motion control stage in an unknown spatial relationship; using an imaging system to acquire an image of the first object or to acquire an image of a first marker provided with the first object, wherein the first marker and the first object have a known spatial relationship; determining a position and orientation of the first object in a frame of reference of the motion control stage based at least in part on the acquired image of the first object or based at least in part on the acquired image of the first marker and the known spatial relationship between the first marker and the first object; using the imaging system to acquire an image of the second object or to acquire an image of a second marker provided with the second object, wherein the second marker and the second object have a known spatial relationship; and determining a position and orientation of the second object in the frame of reference of the motion control stage based at least in part on the acquired image of the second object or based at least in part on the acquired image of the second marker and the known spatial relationship between the second marker and the second object.
2. The method of claim 1, wherein the first marker is rotationally asymmetric and/or aperiodic in one or two dimensions, for example wherein the first marker comprises, or takes the form of, a grid which is rotationally asymmetric and/or aperiodic in one or two dimensions.
3. The method of claim 1, comprising: determining the position and orientation of the first marker in the frame of reference of the motion control stage based at least in part on the acquired image of the first marker; and using the determined position and orientation of the first marker in the frame of reference of the motion control stage and the known spatial relationship between the first marker and the first object to determine the position and orientation of the first object in the frame of reference of the motion control stage.
4. The method of claim 3, comprising: (i) measuring a relative position and orientation of the motion control stage corresponding to the acquired image of the first marker; (ii) determining a degree of similarity between the acquired image of the first marker and a virtual image of the first marker, which virtual image of the first marker has the same size and shape as the first marker, and responsive to determining that the degree of similarity between the acquired image of the first marker and the virtual image of the first marker does not comply with a predetermined criterion, translating and/or rotating the virtual image of the first marker with respect to a field of view (FOV) of the imaging system until the degree of similarity between the acquired image of the first marker and the virtual image of the first marker complies with the predetermined criterion; and (iii) determining the position and orientation of the first marker in the frame of reference of the motion control stage based on the measured relative position and orientation of the motion control stage corresponding to the acquired image of the first marker and the relative position and orientation of the virtual image of the first marker with respect to the FOV of the imaging system when the degree of similarity between the virtual image of the first marker and the acquired image of the first marker complies with the predetermined criterion.
5. The method of claim 4, wherein determining the degree of similarity between the acquired image of the first marker and the virtual image of the first marker comprises evaluating a cross-correlation value between the acquired image of the first marker and the virtual image of the first marker and wherein the degree of similarity between the acquired image of the first marker and the virtual image of the first marker complies with the predetermined criterion when the cross-correlation value is greater than a predetermined threshold value or has a maximum value.
6. The method of claim 3, comprising: (i) measuring a relative position and orientation of the motion control stage corresponding to the acquired image of the first marker; (ii) determining a degree of similarity between the acquired image of the first marker and a virtual image of the first marker, which virtual image of the first marker has the same size and shape as the first marker, and responsive to determining that the degree of similarity between the acquired image of the first marker and the virtual image of the first marker does not comply with a predetermined criterion, translating the virtual image of the first marker with respect to a FOV of the imaging system until the degree of similarity between the acquired image of the first marker and the virtual image of the first marker complies with the predetermined criterion; (iii) translating the motion control stage along a linear translation axis of the motion control stage by a distance equal to a known separation between the first marker and a further first marker which is also provided with the first object so that the further first marker is in the FOV of the imaging system; (iv) using the imaging system to acquire an image of the further first marker; (v) measuring a relative position and orientation of the motion control stage corresponding to an acquired image of the further first marker; (vi) determining a degree of similarity between the acquired image of the further first marker and a virtual image of the further first marker, which virtual image of the further first marker has the same size and shape as the further first marker, and translating the virtual image of the further first marker with respect to the FOV of the imaging system until the degree of similarity between the acquired image of the further first marker and the virtual image of the further first marker complies with a predetermined criterion; and (vii) determining the position and orientation of the first marker in the frame of 
reference of the motion control stage based on: (a) the measured relative position of the motion control stage corresponding to the acquired image of the first marker; (b) the relative position of the virtual image of the first marker with respect to the FOV of the imaging system when the degree of similarity between the virtual image of the first marker and the acquired image of the first marker complies with the predetermined criterion; (c) the measured relative position of the motion control stage corresponding to the acquired image of the further first marker; and (d) the relative position of the virtual image of the further first marker with respect to the FOV of the imaging system when the degree of similarity between the virtual image of the further first marker and the acquired image of the further first marker complies with the predetermined criterion.
7. The method of claim 6, wherein determining the degree of similarity between the acquired image of the first marker and the virtual image of the first marker comprises evaluating a cross-correlation value between the acquired image of the first marker and the virtual image of the first marker and wherein the degree of similarity between the acquired image of the first marker and the virtual image of the first marker complies with the predetermined criterion when the cross-correlation value is greater than a predetermined threshold value or has a maximum value; and wherein determining the degree of similarity between the acquired image of the further first marker and the virtual image of the further first marker comprises evaluating a cross-correlation value between the acquired image of the further first marker and the virtual image of the further first marker and wherein the degree of similarity between the acquired image of the further first marker and the virtual image of the further first marker complies with the predetermined criterion when the cross-correlation value is greater than a predetermined threshold value or has a maximum value.
8. The method of claim 1, wherein the second marker is rotationally asymmetric and/or aperiodic in one or two dimensions, for example wherein the second marker comprises, or takes the form of, a grid which is rotationally asymmetric and/or aperiodic in one or two dimensions.
9. The method of claim 1, comprising: determining the position and orientation of the second marker in the frame of reference of the motion control stage based at least in part on the acquired image of the second marker; and using the determined position and orientation of the second marker in the frame of reference of the motion control stage and the known spatial relationship between the second marker and the second object to determine the position and orientation of the second object in the frame of reference of the motion control stage.
10. The method of claim 9, comprising: (i) measuring a relative position and orientation of the motion control stage corresponding to the acquired image of the second marker; (ii) determining a degree of similarity between the acquired image of the second marker and a virtual image of the second marker, which virtual image of the second marker has the same size and shape as the second marker, and responsive to determining that the degree of similarity between the acquired image of the second marker and the virtual image of the second marker does not comply with a predetermined criterion, translating and/or rotating the virtual image of the second marker with respect to a FOV of the imaging system until the degree of similarity between the acquired image of the second marker and the virtual image of the second marker complies with the predetermined criterion; and (iii) determining the position and orientation of the second marker in the frame of reference of the motion control stage based on the measured relative position and orientation of the motion control stage corresponding to the acquired image of the second marker and the relative position and orientation of the virtual image of the second marker with respect to the FOV of the imaging system when the degree of similarity between the virtual image of the second marker and the acquired image of the second marker complies with the predetermined criterion.
11. The method of claim 10, wherein determining the degree of similarity between the acquired image of the second marker and the virtual image of the second marker comprises evaluating a cross-correlation value between the acquired image of the second marker and the virtual image of the second marker and wherein the degree of similarity between the acquired image of the second marker and the virtual image of the second marker complies with the predetermined criterion when the cross-correlation value is greater than a predetermined threshold value or has a maximum value.
12. The method of claim 9, comprising: (i) measuring a relative position and orientation of the motion control stage corresponding to the acquired image of the second marker; (ii) determining a degree of similarity between the acquired image of the second marker and a virtual image of the second marker, which virtual image of the second marker has the same size and shape as the second marker, and responsive to determining that the degree of similarity between the acquired image of the second marker and the virtual image of the second marker does not comply with a predetermined criterion, translating the virtual image of the second marker with respect to a FOV of the imaging system until the degree of similarity between the acquired image of the second marker and the virtual image of the second marker complies with the predetermined criterion; (iii) translating the motion control stage along a linear translation axis of the motion control stage by a distance equal to a known separation between the second marker and a further second marker which is also provided with the second object so that the further second marker is in the FOV of the imaging system; (iv) using the imaging system to acquire an image of the further second marker; (v) measuring a relative position and orientation of the motion control stage corresponding to an acquired image of the further second marker; (vi) determining a degree of similarity between the acquired image of the further second marker and a virtual image of the further second marker, which virtual image of the further second marker has the same size and shape as the further second marker, and translating the virtual image of the further second marker with respect to the FOV of the imaging system until the degree of similarity between the acquired image of the further second marker and the virtual image of the further second marker complies with a predetermined criterion; and (vii) determining the position and orientation of the second 
marker in the frame of reference of the motion control stage based on: (a) the measured relative position of the motion control stage corresponding to the acquired image of the second marker; (b) the relative position of the virtual image of the second marker with respect to the FOV of the imaging system when the degree of similarity between the virtual image of the second marker and the acquired image of the second marker complies with the predetermined criterion; (c) the measured relative position of the motion control stage corresponding to the acquired image of the further second marker; and (d) the relative position of the virtual image of the further second marker with respect to the FOV of the imaging system when the degree of similarity between the virtual image of the further second marker and the acquired image of the further second marker complies with the predetermined criterion.
13. The method of claim 12, wherein determining the degree of similarity between the acquired image of the second marker and the virtual image of the second marker comprises evaluating a cross-correlation value between the acquired image of the second marker and the virtual image of the second marker and wherein the degree of similarity between the acquired image of the second marker and the virtual image of the second marker complies with the predetermined criterion when the cross-correlation value is greater than a predetermined threshold value or has a maximum value; and wherein determining the degree of similarity between the acquired image of the further second marker and the virtual image of the further second marker comprises evaluating a cross-correlation value between the acquired image of the further second marker and the virtual image of the further second marker and wherein the degree of similarity between the acquired image of the further second marker and the virtual image of the further second marker complies with the predetermined criterion when the cross-correlation value is greater than a predetermined threshold value or has a maximum value.
14. The method of claim 1, comprising: (i) measuring a relative position and orientation of the motion control stage corresponding to an acquired image of the first object; (ii) determining a degree of similarity between the acquired image of the first object and a virtual image of the first object, which virtual image of the first object has the same size and shape as the first object, and responsive to determining that the degree of similarity between the acquired image of the first object and the virtual image of the first object does not comply with a predetermined criterion, translating and/or rotating the virtual image of the first object with respect to a FOV of the imaging system until the degree of similarity between the acquired image of the first object and the virtual image of the first object complies with the predetermined criterion; and (iii) determining the position and orientation of the first object in the frame of reference of the motion control stage based on the measured relative position and orientation of the motion control stage corresponding to the acquired image of the first object and the relative position and orientation of the virtual image of the first object with respect to the FOV of the imaging system when the degree of similarity between the virtual image of the first object and the acquired image of the first object complies with the predetermined criterion.
15. The method of claim 14, wherein determining the degree of similarity between the acquired image of the first object and the virtual image of the first object comprises evaluating a cross-correlation value between the acquired image of the first object and the virtual image of the first object and wherein the degree of similarity between the acquired image of the first object and the virtual image of the first object complies with the predetermined criterion when the cross-correlation value is greater than a predetermined threshold value or has a maximum value.
16. The method of claim 1, comprising: (i) measuring a relative position and orientation of the motion control stage corresponding to an acquired image of the second object; (ii) determining a degree of similarity between the acquired image of the second object and a virtual image of the second object, which virtual image of the second object has the same size and shape as the second object and responsive to determining that the degree of similarity between the acquired image of the second object and the virtual image of the second object does not comply with a predetermined criterion, translating and/or rotating the virtual image of the second object with respect to a FOV of the imaging system until the degree of similarity between the acquired image of the second object and the virtual image of the second object complies with the predetermined criterion; and (iii) determining the position and orientation of the second object in the frame of reference of the motion control stage based on the measured relative position and orientation of the motion control stage corresponding to the acquired image of the second object and the relative position and orientation of the virtual image of the second object with respect to the FOV of the imaging system when the degree of similarity between the virtual image of the second object and the acquired image of the second object complies with the predetermined criterion.
17. The method of claim 16, wherein determining the degree of similarity between the acquired image of the second object and the virtual image of the second object comprises evaluating a cross-correlation value between the acquired image of the second object and the virtual image of the second object and wherein the degree of similarity between the acquired image of the second object and the virtual image of the second object complies with the predetermined criterion when the cross-correlation value is greater than a predetermined threshold value or has a maximum value.
18. The method of claim 1, wherein the first object is detachably attached to the motion control stage or wherein the first object is detachably attached to a first substrate or wafer and the first substrate or wafer is fixed to the motion control stage.
19. The method of claim 1, wherein the second object is detachably attached to the motion control stage or wherein the second object comprises a feature, a structure, a target area, or a target region defined on a second substrate or wafer, and the second substrate or wafer is fixed to the motion control stage.
20. The method of claim 1, comprising determining a spatial relationship between the first and second objects in the frame of reference of the motion control stage based on the determined position and orientation of the first object in the frame of reference of the motion control stage and the determined position and orientation of the second object in the frame of reference of the motion control stage.
21. The method of claim 20, comprising spatially registering the first and second objects based on the determined spatial relationship between the first and second objects in the frame of reference of the motion control stage, for example by holding the first object, moving the first object and the motion control stage apart, using the motion control stage to move the second object relative to the first object based on the determined spatial relationship between the first and second objects in the frame of reference of the motion control stage until the first and second objects are in alignment, and then bringing the first and second objects together until the first and second objects are aligned and in engagement.
22. A method for use in the spatial registration of first and second objects, the second object being fixed or attached to a surface and the surface having one or more regions adjacent to the second object, which surface regions have a different reflectivity to the second object, and the method comprising: locating the first object between a light source and the second object; directing light from the light source onto the first object, the second object, and one or more of the surface regions of the surface adjacent to the second object; using single-pixel detection to measure the optical power of at least a portion of the light that is reflected from the first and second objects and the one or more surface regions adjacent to the second object while the first and second objects are aligned relative to one another; and aligning the first and second objects relative to one another until the measured optical power is maximised or minimised.
23. The method of claim 22, wherein using single-pixel detection to measure the optical power of at least a portion of the light that is reflected from the first and second objects and the one or more surface regions adjacent to the second object comprises: using a single-pixel detector to measure the optical power of at least a portion of the light that is reflected from the first and second objects and the one or more surface regions adjacent to the second object; or using a multi-pixel detector to measure the total integrated optical power of at least a portion of the light that is reflected from the first and second objects and the one or more surface regions adjacent to the second object and that is incident across a plurality of the pixels of the multi-pixel detector.
24. The method of claim 22, wherein the first object is detachably attached to a motion control stage or the first object is detachably attached to a first substrate or wafer, and the first substrate or wafer is fixed to the motion control stage.
25. The method of claim 22, wherein the surface to which the second object is fixed or attached is a surface of a motion control stage or a surface of a second substrate or wafer.
26. The method of claim 22, comprising holding the first object, moving the first object and the motion control stage apart, using the motion control stage to move the second object relative to the first object so as to align the first and second objects relative to one another until the measured optical power is maximised or minimised, and then bringing the first and second objects together until the first and second objects are aligned and in engagement.
27. The method of claim 21, comprising attaching the first and second objects while the first and second objects are aligned, for example using at least one of a differential adhesion method, a capillary bonding method, or a soldering method, or by bonding the first and second objects together using an intermediate adhesive material or agent such as an intermediate adhesion layer, to attach the first and second objects while the first and second objects are aligned.
28. The method of claim 1, wherein at least one of the first and second objects comprises a component such as an optical component or an electronic component or wherein at least one of the first and second objects comprises a portion, piece or chip of material.
29. The method of claim 1, wherein one of the first and second objects comprises a lithographic mask and the other of the first and second objects comprises a work-piece such as a substrate or a wafer.
30. The method of claim 26, comprising attaching the first and second objects while the first and second objects are aligned, for example using at least one of a differential adhesion method, a capillary bonding method, or a soldering method, or by bonding the first and second objects together using an intermediate adhesive material or agent such as an intermediate adhesion layer, to attach the first and second objects while the first and second objects are aligned.
31. The method of claim 22, wherein at least one of the first and second objects comprises a component such as an optical component or an electronic component or wherein at least one of the first and second objects comprises a portion, piece or chip of material.
32. The method of claim 22, wherein one of the first and second objects comprises a lithographic mask and the other of the first and second objects comprises a work-piece such as a substrate or a wafer.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0227] Various apparatus and methods for use in spatially registering first and second objects will now be described by way of non-limiting example only with reference to the accompanying drawings.
DETAILED DESCRIPTION OF THE DRAWINGS
[0262] Referring initially to
[0263] Although not shown explicitly in
[0264] The system 1 further includes an imaging system 30 mounted above the upper surface 23 of the table 22 of the motion control stage 20 for acquiring images of one or more objects located on the upper surface 23. The imaging system 30 has a fixed spatial relationship relative to the base 21 of the motion control stage 20. The imaging system includes a microscope and a camera arranged so that the camera can acquire images of one or more objects located on the upper surface 23 of the table 22 of the motion control stage 20 through the microscope.
[0265] The system 1 further includes a “pick-and-place” tool 36 mounted above the upper surface 23 of the table 22 of the motion control stage 20. The pick-and-place tool 36 includes a head portion in the form of a polydimethylsiloxane (PDMS) stamp 37 for engaging and holding an object such as a component. As will be described in more detail below, the tool 36 is configured to pick a first object, to hold the first object, and to release the first object once the first object is in engagement with a second object. The system 1 further includes a controller in the form of a computing resource 40. As indicated by the dashed lines in
[0266] Referring to
[0267] As will be described in more detail below, the system 1 is capable of detaching the component 4 from the first substrate 6 and subsequently transferring the component 4 to the target area 8 on the second substrate 10.
[0268] The first substrate 6 has an upper surface 52 defining a first marker 50. The component 4 has a known position and orientation relative to the first marker 50. The second substrate 10 has an upper surface 56 defining a second marker 54. The target area 8 has a known position and orientation relative to the second marker 54. The target area 8 has the same size and shape as the component 4.
[0269] A method for use in spatially registering first and second objects will now be described with reference to
[0270] Referring now to
[0271] The computing resource 40 determines the position and orientation of the first marker 50 in the frame of reference of the motion control stage 20 based at least in part on one or more acquired images of the first marker 50 with the aid of a fixed virtual image 62 of the first marker 50. The fixed virtual image 62 is stored in a memory of the computing resource 40 and has a fixed spatial relationship relative to the FOV 60 of the imaging system 30. In the example method illustrated in
[0272] The computing resource 40 determines the position and orientation of the first marker 50 in the frame of reference of the motion control stage 20 by:
[0273] (i) causing the imaging system 30 to acquire an image of the first marker 50 when the first marker 50 is in the FOV 60 of the imaging system 30;
[0274] (ii) determining a degree of similarity between the acquired image of the first marker 50 and the fixed virtual image 62 of the first marker 50;
[0275] (iii) responsive to determining that the degree of similarity between the acquired image of the first marker 50 and the fixed virtual image 62 of the first marker 50 does not comply with a predetermined criterion, the computing resource 40 controls the actuators of the motion control stage 20 to translate and/or rotate the table 22 of the motion control stage 20 in the x-y plane with respect to the FOV 60 of the imaging system 30 while maintaining the first marker 50 in the FOV 60 of the imaging system 30 and repeats steps (i) and (ii) until the degree of similarity between the acquired image of the first marker 50 and the fixed virtual image 62 of the first marker 50 complies with the predetermined criterion;
[0276] (iv) using the position sensors 24 to measure a corresponding position of the table 22 of the motion control stage 20 relative to the base 21 of the motion control stage 20 and using the orientation sensors 26 to measure a corresponding orientation of the table 22 of the motion control stage 20 relative to the base 21 of the motion control stage 20 when the degree of similarity between the acquired image of the first marker 50 and the fixed virtual image 62 of the first marker 50 complies with the predetermined criterion; and
[0277] (v) determining the position and orientation of the first marker 50 in the frame of reference of the motion control stage 20 based on the measured position and orientation of the table 22 of the motion control stage 20 relative to the base 21 of the motion control stage 20 when the degree of similarity between the acquired image of the first marker 50 and the fixed virtual image 62 of the first marker 50 complies with the predetermined criterion.
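Steps (i) to (v) above form a closed feedback loop: acquire an image, compare it against the fixed virtual image, actuate the stage, and finally read the stage sensors at the matched condition. The loop can be sketched as follows in a deliberately simplified one-dimensional form; the class and function names are hypothetical illustrations and do not appear in the source:

```python
class SimulatedStage:
    """Hypothetical 1-D stand-in for the motion control stage 20: the
    'acquired image' is reduced to a single scalar offset between the
    marker and the fixed virtual image in the camera's field of view."""
    def __init__(self, offset):
        self.offset = offset          # current table position error
    def acquire_image(self):          # step (i): camera fixed to the base
        return self.offset
    def actuate(self, error):         # step (iii): move the table to reduce the error
        self.offset -= error
    def measure_pose(self):           # steps (iv)-(v): read the position sensors
        return self.offset

def register_marker(stage, virtual=0.0, tol=1e-6):
    # Repeat steps (i)-(iii) until the acquired image matches the
    # fixed virtual image to within the predetermined criterion.
    while abs(stage.acquire_image() - virtual) > tol:
        stage.actuate(stage.acquire_image() - virtual)
    # Steps (iv)-(v): the stage pose at the matched condition gives the
    # marker pose in the frame of reference of the stage.
    return stage.measure_pose()
```

In the actual method the "error" is not a scalar but a translation and rotation recovered from the image comparison; the loop structure, however, is the same.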
[0278] The computing resource 40 determines the degree of similarity by evaluating a cross-correlation value between the acquired image of the first marker 50 and the fixed virtual image 62 of the first marker 50. The computing resource 40 determines that the degree of similarity complies with the predetermined criterion when the cross-correlation value has a maximum value.
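By way of illustration only, the similarity evaluation described above may be sketched in Python; the function names, the normalised form of the cross-correlation, and the 0.99 threshold are assumptions made for the sketch rather than details taken from the method itself:

```python
import numpy as np

def similarity(acquired, virtual):
    """Normalised cross-correlation between an acquired image and a
    fixed virtual (template) image of the same shape.  Returns a value
    in [-1, 1]; 1 indicates a perfect match."""
    a = acquired - acquired.mean()
    v = virtual - virtual.mean()
    denom = np.sqrt((a * a).sum() * (v * v).sum())
    return float((a * v).sum() / denom)

def criterion_met(acquired, virtual, threshold=0.99):
    """One possible 'predetermined criterion': the cross-correlation
    value exceeds a threshold close to its maximum possible value."""
    return similarity(acquired, virtual) >= threshold
```

In the method itself the criterion is that the cross-correlation value has a maximum value; in a practical sketch a threshold, or peak detection over successive acquisitions, may stand in for that maximum.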
[0279] The computing resource 40 then determines the position and orientation of the component 4 in the frame of reference of the motion control stage 20 from the determined position and orientation of the first marker 50 in the frame of reference of the motion control stage 20 and the known position and orientation of the component 4 relative to the first marker 50.
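This composition of the determined marker pose with the known marker-to-component relationship can be expressed with homogeneous 2-D transforms; a minimal sketch with illustrative (assumed) values:

```python
import numpy as np

def pose(x, y, theta):
    """Homogeneous 2-D pose: position (x, y) and orientation theta in radians."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, x],
                     [s,  c, y],
                     [0.0, 0.0, 1.0]])

# Illustrative values: the marker's determined pose in the stage frame,
# and the component's known pose relative to the marker.
stage_T_marker = pose(10.0, 5.0, np.deg2rad(90.0))
marker_T_component = pose(2.0, 0.0, 0.0)

# Pose of the component in the stage frame is the composition of the two.
stage_T_component = stage_T_marker @ marker_T_component
```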
[0280] The accuracy with which the position of the component 4 is determined depends on factors including the size of the features, the pixel density of the imaging system, and the wavelength of light used, but may be in a range from 1 nm to 1 μm. The accuracy with which the orientation of the component 4 is determined depends on the same factors and may be in the range 0.001 mrad to 1 mrad.
[0281] Referring now to
[0282] The computing resource 40 determines the position and orientation of the second marker 54 in the frame of reference of the motion control stage 20 based at least in part on one or more acquired images of the second marker 54 with the aid of a fixed virtual image 64 of the second marker 54. The fixed virtual image 64 is stored in a memory of the computing resource 40 and has a fixed spatial relationship relative to the FOV 60 of the imaging system 30. In the example method illustrated in
[0283] The computing resource 40 determines the position and orientation of the second marker 54 in the frame of reference of the motion control stage 20 by:
[0284] (i) causing the imaging system 30 to acquire an image of the second marker 54 when the second marker 54 is in the FOV 60 of the imaging system 30;
[0285] (ii) determining a degree of similarity between the acquired image of the second marker 54 and the fixed virtual image 64 of the second marker 54;
[0286] (iii) responsive to determining that the degree of similarity between the acquired image of the second marker 54 and the fixed virtual image 64 of the second marker 54 does not comply with a predetermined criterion, the computing resource 40 controls the actuators of the motion control stage 20 to translate and/or rotate the table 22 of the motion control stage 20 in the x-y plane with respect to the FOV 60 of the imaging system 30 while maintaining the second marker 54 in the FOV 60 of the imaging system 30 and repeats steps (i) and (ii) until the degree of similarity between the acquired image of the second marker 54 and the fixed virtual image 64 of the second marker 54 complies with the predetermined criterion;
[0287] (iv) using the position sensors 24 to measure a corresponding position of the table 22 of the motion control stage 20 relative to the body 21 of the motion control stage 20 and the orientation sensors 26 to measure a corresponding relative orientation of the table 22 of the motion control stage 20 relative to the body 21 of the motion control stage 20 when the degree of similarity between the acquired image of the second marker 54 and the fixed virtual image 64 of the second marker 54 complies with the predetermined criterion; and
[0288] (v) determining the position and orientation of the second marker 54 in the frame of reference of the motion control stage 20 based on the measured position and orientation of the table 22 of the motion control stage 20 relative to the body 21 of the motion control stage 20 when the degree of similarity between the acquired image of the second marker 54 and the fixed virtual image 64 of the second marker 54 complies with the predetermined criterion.
[0289] The computing resource 40 determines the degree of similarity by evaluating a cross-correlation value between the acquired image of the second marker 54 and the fixed virtual image 64 of the second marker 54. The computing resource 40 determines that the degree of similarity complies with the predetermined criterion when the cross-correlation value has a maximum value.
[0290] The computing resource 40 then determines the position and orientation of the target area 8 in the frame of reference of the motion control stage 20 from the determined position and orientation of the second marker 54 in the frame of reference of the motion control stage 20 and the known position and orientation of the target area 8 relative to the second marker 54.
[0291] The accuracy with which the position of the target area 8 is determined depends on factors including the size of the features, the pixel density of the imaging system, and the wavelength of light used, but may be in a range from 1 nm to 1 μm. The accuracy with which the orientation of the target area 8 is determined depends on the same factors and may be in the range 0.001 mrad to 1 mrad.
[0292] As will be described in more detail below, the method for use in spatially registering first and second objects continues with the computing resource 40 determining the spatial relationship between the component 4 and the target area 8 in the frame of reference of the motion control stage 20 based on the determined position and orientation of the component 4 in the frame of reference of the motion control stage 20 and the determined position and orientation of the target area 8 in the frame of reference of the motion control stage 20.
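A minimal sketch of this step, assuming the two determined poses are held as homogeneous 2-D transforms (the helper names are hypothetical):

```python
import numpy as np

def pose(x, y, theta):
    """Homogeneous 2-D pose: position (x, y) and orientation theta in radians."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, x],
                     [s,  c, y],
                     [0.0, 0.0, 1.0]])

def relative_pose(stage_T_component, stage_T_target):
    """Pose of the target area expressed in the component's own frame:
    the translation and rotation still separating the two, which the
    motion control stage must make up to bring them into alignment."""
    return np.linalg.inv(stage_T_component) @ stage_T_target
```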
[0293] The pick and place tool 36 is locked in position relative to the body 21 of the motion control stage 20 such that the PDMS stamp 37 of the pick and place tool 36 is positioned in the FOV 60 of the imaging system 30 at a position in z above a z-level of an upper surface of the component 4. If required, the motion control stage 20 is used to align the component 4 in x-y relative to the PDMS stamp 37 of the pick and place tool 36 in the FOV 60 of the imaging system 30. One of ordinary skill in the art will understand that the PDMS stamp 37 has reversible adhesion properties that may be used to pick up the component 4 and place the component 4 at the target area 8 of the second substrate 10 in a highly controllable manner. Specifically, once the component 4 is aligned in x-y relative to the PDMS stamp 37, the table 22 of the motion control stage 20 is moved along the z-axis towards the PDMS stamp 37 until the component 4 and the PDMS stamp 37 come into engagement. The table 22 of the motion control stage 20 is then moved along the z-axis away from the PDMS stamp 37. The adhesion properties of the PDMS stamp 37 cause the PDMS stamp 37 to hold the component 4 and to detach the component 4 from the first substrate 6 so that the PDMS stamp 37 holds the component 4 clear of the first and second substrates 6, 10.
The computing resource 40 then controls the actuators of the motion control stage 20 so as to translate and/or rotate the table 22 of the motion control stage 20 in x-y relative to the component 4 (thereby also translating and/or rotating the first and second substrates 6, 10 in x-y relative to the component 4) based on the determined position and orientation of the target area 8 on the second substrate 10 in the frame of reference of the motion control stage 20 relative to the position and orientation of the component 4 when attached to the first substrate 6 in the frame of reference of the motion control stage 20 until the component 4 and the target area 8 on the second substrate 10 are aligned in translation and rotation in x-y, but spaced apart in z. The computing resource 40 then controls the actuators of the motion control stage 20 so as to move the table 22 (and therefore also the first and second substrates 6, 10) in z until the component 4 and the target area 8 on the second substrate 10 are in engagement. Engagement of the component 4 and the target area 8 on the second substrate 10 results in the PDMS stamp 37 releasing the component 4 and attachment of the component 4 to the second substrate 10 at the target area 8 as a consequence of differential adhesion or capillary bonding between the component 4 and the second substrate 10.
[0294] A first alternative method for use in spatially registering first and second objects will now be described with reference to
[0295] Referring now to
[0296] The computing resource 40 determines the position and orientation of the component 104 in the frame of reference of the motion control stage 20 based at least in part on one or more acquired images of the component 104 with the aid of a fixed virtual image 162 of the component 104. The fixed virtual image 162 of the component 104 is stored in a memory of the computing resource 40 and has a fixed spatial relationship with respect to the FOV 60 of the imaging system 30. In the example method illustrated in
[0297] The first alternative spatial registration method begins with the computing resource 40 determining the position and orientation of the component 104 in the frame of reference of the motion control stage 20 based at least in part on the one or more acquired images of the component 104. Specifically, the computing resource 40:
[0298] (i) causes the imaging system 30 to acquire an image of the component 104 when the component 104 is in the FOV 60 of the imaging system 30;
[0299] (ii) determines a degree of similarity between the acquired image of the component 104 and the fixed virtual image 162 of the component 104;
[0300] (iii) responsive to determining that the degree of similarity between the acquired image of the component 104 and the fixed virtual image 162 of the component 104 does not comply with a predetermined criterion, the computing resource 40 controls the actuators of the motion control stage 20 so as to translate and/or rotate the table 22 of the motion control stage 20 in the x-y plane with respect to the FOV 60 of the imaging system 30 while maintaining the component 104 in the FOV 60 of the imaging system 30 and repeats steps (i) and (ii) until the degree of similarity between the acquired image of the component 104 and the fixed virtual image 162 of the component 104 complies with the predetermined criterion;
[0301] (iv) uses the position sensors 24 to measure a corresponding relative position of the table 22 of the motion control stage 20 relative to the body 21 of the motion control stage 20 and the orientation sensors 26 to measure a corresponding relative orientation of the table 22 of the motion control stage 20 relative to the body 21 of the motion control stage 20 when the degree of similarity between the acquired image of the component 104 and the fixed virtual image 162 of the component 104 complies with the predetermined criterion; and
[0302] (v) determines the position and orientation of the component 104 in the frame of reference of the motion control stage 20 based on the measured position and orientation of the table 22 of the motion control stage 20 relative to the body 21 of the motion control stage 20 when the degree of similarity between the acquired image of the component 104 and the fixed virtual image 162 of the component 104 complies with the predetermined criterion.
[0303] The computing resource 40 determines the degree of similarity by evaluating a cross-correlation value between the acquired image of the component 104 and the fixed virtual image 162 of the component 104. The computing resource 40 determines that the degree of similarity complies with the predetermined criterion when the cross-correlation value has a maximum value.
[0304] Referring now to
[0305] One of ordinary skill in the art will understand that the first alternative spatial registration method further includes steps of detaching the component 104 from the first substrate 106, moving the table 22 (and therefore also the first and second substrates 106, 110) relative to the component 104, and attaching the component 104 to the target area 108 on the second substrate 110, which steps are identical to the corresponding steps of the spatial registration method described with reference to
[0306] Imaging the component 104 according to the first alternative spatial registration method instead of imaging a first marker like the first marker 50 according to the spatial registration method described with reference to
[0307] A second alternative method for use in spatially registering first and second objects will now be described with reference to
[0308] Referring now to
[0309] Referring now to
[0310] As will be described in more detail below, the computing resource 40 determines the position and orientation of the target area 208 in the frame of reference of the motion control stage 20 based at least in part on one or more acquired images of the target area 208 with the aid of a fixed virtual image 264 of the target area 208.
[0311] The fixed virtual image 264 of the target area 208 is stored in a memory of the computing resource 40 and has a fixed spatial relationship with respect to the FOV 60 of the imaging system 30. In the example method illustrated in
[0312] The second alternative spatial registration method continues with the computing resource 40 determining the position and orientation of the target area 208 in the frame of reference of the motion control stage 20 based at least in part on the one or more acquired images of the target area 208. Specifically, the computing resource 40:
[0313] (i) causes the imaging system 30 to acquire an image of the target area 208 when the target area 208 is in the FOV 60 of the imaging system 30;
[0314] (ii) determines a degree of similarity between the acquired image of the target area 208 and the fixed virtual image 264 of the target area 208;
[0315] (iii) responsive to determining that the degree of similarity between the acquired image of the target area 208 and the fixed virtual image 264 of the target area 208 does not comply with a predetermined criterion, the computing resource 40 controls the actuators of the motion control stage 20 to translate and/or rotate the table 22 of the motion control stage 20 in the x-y plane with respect to the FOV 60 of the imaging system 30 while maintaining the target area 208 in the FOV 60 of the imaging system 30 and repeats steps (i) and (ii) until the degree of similarity between the acquired image of the target area 208 and the fixed virtual image 264 of the target area 208 complies with the predetermined criterion;
[0316] (iv) uses the position sensors 24 to measure a corresponding relative position of the table 22 of the motion control stage 20 relative to the body 21 of the motion control stage 20 and the orientation sensors 26 to measure a corresponding relative orientation of the table 22 of the motion control stage 20 relative to the body 21 of the motion control stage 20 when the degree of similarity between the acquired image of the target area 208 and the fixed virtual image 264 of the target area 208 complies with the predetermined criterion; and
[0317] (v) determines the position and orientation of the target area 208 in the frame of reference of the motion control stage 20 based on the measured position and orientation of the table 22 of the motion control stage 20 relative to the body 21 of the motion control stage 20 when the degree of similarity between the acquired image of the target area 208 and the fixed virtual image 264 of the target area 208 complies with the predetermined criterion.
[0318] The computing resource 40 determines the degree of similarity by evaluating a cross-correlation value between the acquired image of the target area 208 and the fixed virtual image 264 of the target area 208. The computing resource 40 determines that the degree of similarity complies with the predetermined criterion when the cross-correlation value has a maximum value.
[0319] One of ordinary skill in the art will understand that the second alternative spatial registration method further includes steps of detaching the component 204 from the first substrate 206, moving the table 22 (and therefore also the first and second substrates 206, 210) relative to the component 204, and attaching the component 204 to the target area 208 on the second substrate 210, which steps are identical to the corresponding steps of the spatial registration method described with reference to
[0320] Imaging the target area 208 according to the second alternative spatial registration method instead of imaging a second marker like the second marker 54 according to the spatial registration method described with reference to
[0321] A third alternative method for use in spatially registering first and second objects will now be described with reference to
[0322] Referring to
[0323] As illustrated in
[0324] As illustrated in
[0325] One of ordinary skill in the art will understand that the third alternative spatial registration method further includes steps of detaching the component 304 from the first substrate 306, moving the table 22 (and therefore also the first and second substrates 306, 310) relative to the component 304, and attaching the component 304 to the target area 308 on the second substrate 310, which steps are identical to the corresponding steps of the spatial registration method described with reference to
[0326] A fourth alternative method for use in spatially registering first and second objects will now be described with reference to
[0327] As illustrated in
[0328] As illustrated in
[0329] One of ordinary skill in the art will understand that the fourth alternative spatial registration method further includes steps of detaching the component 404 from the surface 23 of the table 22 of the motion control stage 20, moving the table 22 (and therefore also the second substrate 410) relative to the component 404, and attaching the component 404 to the target area 408 on the second substrate 410, which steps are essentially identical to the corresponding steps of the spatial registration method described with reference to
[0330] A fifth alternative method for use in spatially registering first and second objects will now be described with reference to
[0331] As illustrated in
[0332] As illustrated in
[0333] One of ordinary skill in the art will understand that the fifth alternative spatial registration method further includes steps of detaching the component 504 from the surface 23 of the table 22 of the motion control stage 20, moving the table 22 (and therefore also the second substrate 510) relative to the component 504, and attaching the component 504 to the target area 508 on the second substrate 510, which steps are essentially identical to the corresponding steps of the spatial registration method described with reference to
[0334] A sixth alternative method for use in spatially registering first and second objects will now be described with reference to
[0335] As illustrated in
[0336] As illustrated in
[0337] One of ordinary skill in the art will understand that the sixth alternative spatial registration method further includes steps of detaching the component 604 from the surface 23 of the table 22 of the motion control stage 20, moving the table 22 (and therefore also the second substrate 610) relative to the component 604, and attaching the component 604 to the target area 608 on the second substrate 610, which steps are essentially identical to the corresponding steps of the spatial registration method described with reference to
[0338] One of ordinary skill in the art will understand that various modifications are possible to the apparatus and methods described above. For example, each of the spatial registration methods described above with reference to
[0339] A variant of each of the spatial registration methods described above with reference to
[0340] (i) causing the imaging system 30 to acquire an image of the first marker 50;
[0341] (ii) using the position sensors 24 to measure a position of the table 22 of the motion control stage 20 relative to the body 21 of the motion control stage 20 corresponding to the acquired image of the first marker 50 and using the orientation sensors 26 to measure an orientation of the table 22 of the motion control stage 20 relative to the body 21 of the motion control stage 20 corresponding to the acquired image of the first marker 50;
[0342] (iii) determining a degree of similarity between the acquired image of the first marker 50 and the virtual image 62 of the first marker 50;
[0343] (iv) responsive to determining that the degree of similarity between the acquired image of the first marker 50 and the virtual image 62 of the first marker 50 does not comply with a predetermined criterion, translating and/or rotating the virtual image 62 of the first marker 50 with respect to the FOV 60 of the imaging system 30 and repeating step (iii) until the degree of similarity between the acquired image of the first marker 50 and the virtual image 62 of the first marker 50 complies with the predetermined criterion;
[0344] (v) determining a corresponding relative position and orientation of the virtual image 62 of the first marker 50 with respect to the FOV 60 of the imaging system 30 when the degree of similarity between the virtual image 62 of the first marker 50 and the acquired image of the first marker 50 complies with the predetermined criterion; and
[0345] (vi) determining the position and orientation of the first marker 50 in the frame of reference of the motion control stage 20 based on the measured relative position and orientation of the table 22 of the motion control stage 20 relative to the body 21 of the motion control stage 20 corresponding to the acquired image of the first marker 50 and the determined relative position and orientation of the virtual image 62 of the first marker 50 with respect to the FOV 60 of the imaging system 30 when the degree of similarity between the virtual image 62 of the first marker 50 and the acquired image of the first marker 50 complies with the predetermined criterion.
[0346] The computing resource 40 determines the degree of similarity by evaluating a cross-correlation value between the acquired image of the first marker 50 and the virtual image 62 of the first marker 50. The computing resource 40 determines that the degree of similarity complies with the predetermined criterion when the cross-correlation value has a maximum value.
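A numpy-only sketch of this variant, searching integer-pixel translations of the virtual image against a single acquired image (the rotation search is omitted for brevity and would add an angle loop; all names are assumptions):

```python
import numpy as np

def ncc(a, b):
    """Normalised cross-correlation between two equal-shape images."""
    a = a - a.mean()
    b = b - b.mean()
    return float((a * b).sum() / np.sqrt((a * a).sum() * (b * b).sum()))

def find_offset(acquired, virtual, max_shift=5):
    """Translate the virtual image over a window of candidate
    integer-pixel offsets (wrap-around via np.roll, adequate for a
    sketch) and return the offset maximising the cross-correlation."""
    best_score, best_offset = -np.inf, (0, 0)
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            candidate = np.roll(virtual, (dy, dx), axis=(0, 1))
            score = ncc(acquired, candidate)
            if score > best_score:
                best_score, best_offset = score, (dy, dx)
    return best_offset, best_score
```

The stage is not moved during this search; only the stored virtual image is shifted, which is the source of the speed advantage of these variant methods.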
[0347] Similarly, with reference to
[0348] (i) causing the imaging system 30 to acquire an image of the second marker 54;
[0349] (ii) using the position sensors 24 to measure a position of the table 22 of the motion control stage 20 relative to the body 21 of the motion control stage 20 corresponding to the acquired image of the second marker 54 and using the orientation sensors 26 to measure an orientation of the table 22 of the motion control stage 20 relative to the body 21 of the motion control stage 20 corresponding to the acquired image of the second marker 54;
[0350] (iii) determining a degree of similarity between the acquired image of the second marker 54 and the virtual image 64 of the second marker 54;
[0351] (iv) responsive to determining that the degree of similarity between the acquired image of the second marker 54 and the virtual image 64 of the second marker 54 does not comply with a predetermined criterion, translating and/or rotating the virtual image 64 of the second marker 54 with respect to the FOV 60 of the imaging system 30 and repeating step (iii) until the degree of similarity between the acquired image of the second marker 54 and the virtual image 64 of the second marker 54 complies with the predetermined criterion;
[0352] (v) determining a corresponding relative position and orientation of the virtual image 64 of the second marker 54 with respect to the FOV 60 of the imaging system 30 when the degree of similarity between the virtual image 64 of the second marker 54 and the acquired image of the second marker 54 complies with the predetermined criterion; and
[0353] (vi) determining the position and orientation of the second marker 54 in the frame of reference of the motion control stage 20 based on the measured relative position and orientation of the table 22 of the motion control stage 20 relative to the body 21 of the motion control stage 20 corresponding to the acquired image of the second marker 54 and the determined relative position and orientation of the virtual image 64 of the second marker 54 with respect to the FOV 60 of the imaging system 30 when the degree of similarity between the virtual image 64 of the second marker 54 and the acquired image of the second marker 54 complies with the predetermined criterion.
[0354] The computing resource 40 determines the degree of similarity by evaluating a cross-correlation value between the acquired image of the second marker 54 and the virtual image 64 of the second marker 54. The computing resource 40 determines that the degree of similarity complies with the predetermined criterion when the cross-correlation value has a maximum value.
[0355] As another example, with reference to
[0356] (i) causing the imaging system 30 to acquire an image of the component 304;
[0357] (ii) using the position sensors 24 to measure a position of the table 22 of the motion control stage 20 relative to the body 21 of the motion control stage 20 corresponding to the acquired image of the component 304 and using the orientation sensors 26 to measure an orientation of the table 22 of the motion control stage 20 relative to the body 21 of the motion control stage 20 corresponding to the acquired image of the component 304;
[0358] (iii) determining a degree of similarity between the acquired image of the component 304 and the virtual image 362 of the component 304;
[0359] (iv) responsive to determining that the degree of similarity between the acquired image of the component 304 and the virtual image 362 of the component 304 does not comply with a predetermined criterion, translating and/or rotating the virtual image 362 of the component 304 with respect to the FOV 60 of the imaging system 30 and repeating step (iii) until the degree of similarity between the acquired image of the component 304 and the virtual image 362 of the component 304 complies with the predetermined criterion;
[0360] (v) determining a corresponding relative position and orientation of the virtual image 362 of the component 304 with respect to the FOV 60 of the imaging system 30 when the degree of similarity between the virtual image 362 of the component 304 and the acquired image of the component 304 complies with the predetermined criterion; and
[0361] (vi) determining the position and orientation of the component 304 in the frame of reference of the motion control stage 20 based on the measured relative position and orientation of the table 22 of the motion control stage 20 relative to the body 21 of the motion control stage 20 corresponding to the acquired image of the component 304 and the determined relative position and orientation of the virtual image 362 of the component 304 with respect to the FOV 60 of the imaging system 30 when the degree of similarity between the virtual image 362 of the component 304 and the acquired image of the component 304 complies with the predetermined criterion.
[0362] The computing resource 40 determines the degree of similarity by evaluating a cross-correlation value between the acquired image of the component 304 and the virtual image 362 of the component 304. The computing resource 40 determines that the degree of similarity complies with the predetermined criterion when the cross-correlation value has a maximum value.
[0363] Similarly, with reference to
[0364] (i) causing the imaging system 30 to acquire an image of the target area 308;
[0365] (ii) using the position sensors 24 to measure a position of the table 22 of the motion control stage 20 relative to the body 21 of the motion control stage 20 corresponding to the acquired image of the target area 308 and using the orientation sensors 26 to measure an orientation of the table 22 of the motion control stage 20 relative to the body 21 of the motion control stage 20 corresponding to the acquired image of the target area 308;
[0366] (iii) determining a degree of similarity between the acquired image of the target area 308 and the virtual image 364 of the target area 308;
[0367] (iv) responsive to determining that the degree of similarity between the acquired image of the target area 308 and the virtual image 364 of the target area 308 does not comply with a predetermined criterion, translating and/or rotating the virtual image 364 of the target area 308 with respect to the FOV 60 of the imaging system 30 and repeating step (iii) until the degree of similarity between the acquired image of the target area 308 and the virtual image 364 of the target area 308 complies with the predetermined criterion;
[0368] (v) determining a corresponding relative position and orientation of the virtual image 364 of the target area 308 with respect to the FOV 60 of the imaging system 30 when the degree of similarity between the virtual image 364 of the target area 308 and the acquired image of the target area 308 complies with the predetermined criterion; and
[0369] (vi) determining the position and orientation of the target area 308 in the frame of reference of the motion control stage 20 based on the measured relative position and orientation of the table 22 of the motion control stage 20 relative to the body 21 of the motion control stage 20 corresponding to the acquired image of the target area 308 and the determined relative position and orientation of the virtual image 364 of the target area 308 with respect to the FOV 60 of the imaging system 30 when the degree of similarity between the virtual image 364 of the target area 308 and the acquired image of the target area 308 complies with the predetermined criterion.
[0370] The computing resource 40 determines the degree of similarity by evaluating a cross-correlation value between the acquired image of the target area 308 and the virtual image 364 of the target area 308. The computing resource 40 determines that the degree of similarity complies with the predetermined criterion when the cross-correlation value has a maximum value.
[0371] One of ordinary skill in the art will understand that such variant methods for determining a position and orientation of an object, such as a component, in a frame of reference of the motion control stage 20, in which the degree of similarity is determined sequentially between a single acquired image of the object or marker and each virtual image of a plurality of virtual images of the object or marker, each virtual image corresponding to a different position and/or orientation of the virtual image of the object or marker in the FOV 60 of the imaging system 30, do not require any movement of the table 22 of the motion control stage 20 relative to the body 21 of the motion control stage 20 once the single image of the object or marker is acquired. Consequently, not only are such variant methods for determining a position and orientation of an object in a frame of reference of the motion control stage 20 faster than the spatial registration methods described above with reference to
[0372] A further alternative method for use in spatially registering first and second objects will now be described with reference to
[0373] Referring to
[0374] The first substrate 706 has an upper surface 752 defining a first marker 750a and an identical further first marker 750b. The first marker 750a and the further first marker 750b have a known separation, for example because the first marker 750a and the further first marker 750b are defined simultaneously on the first substrate 706 using the same lithographic process. The component 704 has a known position and orientation relative to the first marker 750a and the further first marker 750b.
[0375] The second substrate 710 has an upper surface 756 defining a second marker 754a and a further second marker 754b. The second marker 754a and the further second marker 754b have a known separation, for example because the second marker 754a and the further second marker 754b are defined simultaneously on the second substrate 710 using the same lithographic process. The target area 708 has a known position and orientation relative to the second marker 754a and the further second marker 754b. The target area 708 has the same size and shape as the component 704.
[0376] It should be understood that the first and second substrates 706 and 710 are generally misaligned with the x- and y-axes and that the misalignment of the first and second substrates 706 and 710 with respect to the x- and y-axes has been exaggerated in
[0377] The further alternative method for use in spatially registering first and second objects will now be described with reference to
[0378] Referring to
[0379] (i) causes the imaging system 30 to acquire an image of the first marker 750a when the first marker 750a is in the FOV 60 of the imaging system 30 as shown at inset I in
[0380] (ii) uses the position sensors 24 to measure a corresponding position of the table 22 of the motion control stage 20 relative to the body 21 of the motion control stage 20 and the orientation sensors 26 to measure a corresponding relative orientation of the table 22 of the motion control stage 20 relative to the body 21 of the motion control stage 20; and
[0381] (iii) determines a degree of similarity between the acquired image of the first marker 750a and a virtual image 762 of the first marker 750a, which virtual image 762 of the first marker 750a has the same size and shape as the first marker 750a, and responsive to determining that the degree of similarity between the acquired image of the first marker 750a and the virtual image 762 of the first marker 750a does not comply with a predetermined criterion, translates the virtual image 762 of the first marker 750a in x and y with respect to the FOV 60 of the imaging system 30 until the degree of similarity between the acquired image of the first marker 750a and the virtual image 762 of the first marker 750a complies with the predetermined criterion.
[0382] The computing resource 40 determines the degree of similarity by evaluating a cross-correlation value between the acquired image of the first marker 750a and the virtual image 762 of the first marker 750a. The computing resource 40 determines that the degree of similarity complies with the predetermined criterion when the cross-correlation value has a maximum value.
[0383] The computing resource 40 then:
[0384] (iv) controls the actuators of the motion control stage 20 so as to translate the table 22 of the motion control stage 20 relative to the body 21 of the motion control stage 20 along a linear translation axis of the motion control stage 20 by a distance equal to the known separation between the first marker 750a and the further first marker 750b so that the further first marker 750b is in the FOV 60 of the imaging system 30 as shown at inset II in
[0385] Specifically, the computing resource 40 controls the actuators of the motion control stage 20 so as to translate the table 22 of the motion control stage 20 relative to the body 21 of the motion control stage 20 along the y-axis of the motion control stage 20 by a distance equal to the known separation between the first marker 750a and the further first marker 750b so that the further first marker 750b is in the FOV 60 of the imaging system 30 as shown at inset II in
[0386] The computing resource 40 then:
[0387] (v) causes the imaging system 30 to acquire an image of the further first marker 750b when the further first marker 750b is in the FOV 60 of the imaging system 30 as shown at inset II in
[0388] (vi) uses the position sensors 24 to measure a corresponding position of the table 22 of the motion control stage 20 relative to the body 21 of the motion control stage 20 and the orientation sensors 26 to measure a corresponding relative orientation of the table 22 of the motion control stage 20 relative to the body 21 of the motion control stage 20 when the further first marker 750b is in the FOV 60 of the imaging system 30 as shown at inset II in
[0389] (vii) determines a degree of similarity between the acquired image of the further first marker 750b and the virtual image 762 of the further first marker 750b, and responsive to determining that the degree of similarity between the acquired image of the further first marker 750b and the virtual image 762 of the further first marker 750b does not comply with a predetermined criterion, translates the virtual image 762 of the further first marker 750b in x and y with respect to the FOV 60 of the imaging system 30 until the degree of similarity between the acquired image of the further first marker 750b and the virtual image 762 of the further first marker 750b complies with the predetermined criterion.
[0390] The computing resource 40 determines the degree of similarity by evaluating a cross-correlation value between the acquired image of the further first marker 750b and the virtual image 762 of the further first marker 750b. The computing resource 40 determines that the degree of similarity complies with the predetermined criterion when the cross-correlation value has a maximum value.
[0391] The computing resource 40 then:
[0392] (viii) determines the position and orientation of the first marker 750a in the frame of reference of the motion control stage 20 based on: [0393] (a) the measured relative position of the table 22 of the motion control stage 20 relative to the body 21 of the motion control stage 20 corresponding to the acquired image of the first marker 750a; [0394] (b) the relative position of the virtual image 762 of the first marker 750a with respect to the FOV 60 of the imaging system 30 when the degree of similarity between the virtual image 762 of the first marker 750a and the acquired image of the first marker 750a complies with the predetermined criterion; [0395] (c) the measured relative position of the table 22 of the motion control stage 20 relative to the body 21 of the motion control stage 20 corresponding to the acquired image of the further first marker 750b; and [0396] (d) the relative position of the virtual image 762 of the further first marker 750b with respect to the FOV 60 of the imaging system 30 when the degree of similarity between the virtual image 762 of the further first marker 750b and the acquired image of the further first marker 750b complies with the predetermined criterion.
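The combination of the quantities in (a)-(d) can be sketched as follows. `marker_pose` and the coordinate tuples are illustrative names; the sketch assumes, as in the described layout, that the two markers are nominally separated along the y-axis, so the orientation is recovered as the rotation of the marker-to-marker vector away from +y:

```python
import math

def marker_pose(stage_a, offset_a, stage_b, offset_b):
    """Combine each measured stage position with the corresponding
    virtual-image offset to locate the two markers in stage coordinates,
    then derive the orientation of the marker pair from the vector
    joining them (measured relative to the +y axis)."""
    ax, ay = stage_a[0] + offset_a[0], stage_a[1] + offset_a[1]
    bx, by = stage_b[0] + offset_b[0], stage_b[1] + offset_b[1]
    theta = math.atan2(bx - ax, by - ay)  # misalignment angle about z
    return (ax, ay), theta

# Example: stage readings 100 units apart in y, with small virtual-image
# offsets; the x-offsets differ by 1 unit, giving a small rotation.
pos, theta = marker_pose((10.0, 20.0), (0.5, -0.2),
                         (10.0, 120.0), (1.5, -0.2))
```

The returned position is that of the first marker 750a in the frame of reference of the motion control stage 20, and `theta` is the in-plane orientation of the marker pair.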
[0397] One of skill in the art will understand that the method of determining the position and orientation of the first marker 750a in the frame of reference of the motion control stage 20 described above with reference to
[0398] The computing resource 40 then determines the position and orientation of the component 704 in the frame of reference of the motion control stage 20 from the determined position and orientation of the first marker 750a in the frame of reference of the motion control stage 20 and the known position and orientation of the component 704 relative to the first marker 750a.
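The step above is a rigid-pose composition: the component's known pose relative to the marker is rotated into, and offset by, the marker's pose in the stage frame. `compose_pose` and its arguments are illustrative names for this sketch:

```python
import math

def compose_pose(marker_xy, marker_theta, rel_xy, rel_theta):
    """Express the component's pose in the stage frame: rotate the
    known marker-to-component offset by the marker's orientation,
    translate by the marker's position, and sum the angles."""
    c, s = math.cos(marker_theta), math.sin(marker_theta)
    x = marker_xy[0] + c * rel_xy[0] - s * rel_xy[1]
    y = marker_xy[1] + s * rel_xy[0] + c * rel_xy[1]
    return (x, y), marker_theta + rel_theta

# Marker at (10, 20) rotated 90 degrees; component 5 units along the
# marker's local x-axis ends up 5 units along the stage y-axis.
xy, th = compose_pose((10.0, 20.0), math.pi / 2, (5.0, 0.0), 0.0)
```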
[0399] The accuracy with which the position of the component 704 is determined in the frame of reference of the motion control stage 20 depends on factors including the size of the features, the pixel density of the imaging system, and the wavelength of light used, but may be in a range from 1 μm to 1 nm. The accuracy with which the orientation of the component 704 is determined in the frame of reference of the motion control stage 20 depends on the same factors and may be in the range 0.001-1 mrad.
[0400] Referring to
[0401] (i) causes the imaging system 30 to acquire an image of the second marker 754a when the second marker 754a is in the FOV 60 of the imaging system 30 as shown at inset III in
[0402] (ii) uses the position sensors 24 to measure a corresponding position of the table 22 of the motion control stage 20 relative to the body 21 of the motion control stage 20 and the orientation sensors 26 to measure a corresponding relative orientation of the table 22 of the motion control stage 20 relative to the body 21 of the motion control stage 20; and
[0403] (iii) determines a degree of similarity between the acquired image of the second marker 754a and a virtual image 764 of the second marker 754a, which virtual image 764 of the second marker 754a has the same size and shape as the second marker 754a, and responsive to determining that the degree of similarity between the acquired image of the second marker 754a and the virtual image 764 of the second marker 754a does not comply with a predetermined criterion, translates the virtual image 764 of the second marker 754a in x and y with respect to the FOV 60 of the imaging system 30 until the degree of similarity between the acquired image of the second marker 754a and the virtual image 764 of the second marker 754a complies with the predetermined criterion.
[0404] The computing resource 40 determines the degree of similarity by evaluating a cross-correlation value between the acquired image of the second marker 754a and the virtual image 764 of the second marker 754a. The computing resource 40 determines that the degree of similarity complies with the predetermined criterion when the cross-correlation value has a maximum value.
[0405] The computing resource 40 then:
[0406] (iv) controls the actuators of the motion control stage 20 so as to translate the table 22 of the motion control stage 20 relative to the body 21 of the motion control stage 20 along a linear translation axis of the motion control stage 20 by a distance equal to the known separation between the second marker 754a and the further second marker 754b so that the further second marker 754b is in the FOV 60 of the imaging system 30 as shown at inset IV in
[0407] Specifically, the computing resource 40 controls the actuators of the motion control stage 20 so as to translate the table 22 of the motion control stage 20 relative to the body 21 of the motion control stage 20 along the y-axis of the motion control stage 20 by a distance equal to the known separation between the second marker 754a and the further second marker 754b so that the further second marker 754b is in the FOV 60 of the imaging system 30 as shown at inset IV in
[0408] The computing resource 40 then:
[0409] (v) causes the imaging system 30 to acquire an image of the further second marker 754b when the further second marker 754b is in the FOV 60 of the imaging system 30 as shown at inset IV in
[0410] (vi) uses the position sensors 24 to measure a corresponding position of the table 22 of the motion control stage 20 relative to the body 21 of the motion control stage 20 and the orientation sensors 26 to measure a corresponding relative orientation of the table 22 of the motion control stage 20 relative to the body 21 of the motion control stage 20 when the further second marker 754b is in the FOV 60 of the imaging system 30 as shown at inset IV in
[0411] (vii) determines a degree of similarity between the acquired image of the further second marker 754b and the virtual image 764 of the further second marker 754b, and responsive to determining that the degree of similarity between the acquired image of the further second marker 754b and the virtual image 764 of the further second marker 754b does not comply with a predetermined criterion, translates the virtual image 764 of the further second marker 754b in x and y with respect to the FOV 60 of the imaging system 30 until the degree of similarity between the acquired image of the further second marker 754b and the virtual image 764 of the further second marker 754b complies with the predetermined criterion.
[0412] The computing resource 40 determines the degree of similarity by evaluating a cross-correlation value between the acquired image of the further second marker 754b and the virtual image 764 of the further second marker 754b. The computing resource 40 determines that the degree of similarity complies with the predetermined criterion when the cross-correlation value has a maximum value.
[0413] The computing resource 40 then:
[0414] (viii) determines the position and orientation of the second marker 754a in the frame of reference of the motion control stage 20 based on: [0415] (a) the measured relative position of the table 22 of the motion control stage 20 relative to the body 21 of the motion control stage 20 corresponding to the acquired image of the second marker 754a; [0416] (b) the relative position of the virtual image 764 of the second marker 754a with respect to the FOV 60 of the imaging system 30 when the degree of similarity between the virtual image 764 of the second marker 754a and the acquired image of the second marker 754a complies with the predetermined criterion; [0417] (c) the measured relative position of the table 22 of the motion control stage 20 relative to the body 21 of the motion control stage 20 corresponding to the acquired image of the further second marker 754b; and [0418] (d) the relative position of the virtual image 764 of the further second marker 754b with respect to the FOV 60 of the imaging system 30 when the degree of similarity between the virtual image 764 of the further second marker 754b and the acquired image of the further second marker 754b complies with the predetermined criterion.
[0419] One of skill in the art will understand that the method of determining the position and orientation of the second marker 754a in the frame of reference of the motion control stage 20 described above with reference to
[0420] The computing resource 40 then determines the position and orientation of the target area 708 in the frame of reference of the motion control stage 20 from the determined position and orientation of the second marker 754a in the frame of reference of the motion control stage 20 and the known position and orientation of the target area 708 relative to the second marker 754a.
[0421] The accuracy with which the position of the target area 708 is determined in the frame of reference of the motion control stage 20 depends on factors including the size of the features, the pixel density of the imaging system, and the wavelength of light used, but may be in a range from 1 μm to 1 nm. The accuracy with which the orientation of the target area 708 is determined in the frame of reference of the motion control stage 20 depends on the same factors and may be in the range 0.001-1 mrad.
[0422] The method for use in spatially registering first and second objects continues with the computing resource 40 determining the spatial relationship between the component 704 and the target area 708 in the frame of reference of the motion control stage 20 based on the determined position and orientation of the component 704 in the frame of reference of the motion control stage 20 and the determined position and orientation of the target area 708 in the frame of reference of the motion control stage 20.
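Once both poses are known in the same frame, the spatial relationship reduces to a translation and a rotation between them. `registration_error` is an illustrative name for a sketch of this final step:

```python
import math

def registration_error(comp_xy, comp_theta, target_xy, target_theta):
    """Return the in-plane translation and rotation, in the stage frame,
    that would carry the component onto the target area."""
    dx = target_xy[0] - comp_xy[0]
    dy = target_xy[1] - comp_xy[1]
    dtheta = target_theta - comp_theta
    # Normalise the rotation to the interval (-pi, pi].
    dtheta = math.atan2(math.sin(dtheta), math.cos(dtheta))
    return (dx, dy), dtheta

# Component at (1, 2) rotated +0.1 rad; target at (4, 6) rotated -0.1 rad.
(dx, dy), dth = registration_error((1.0, 2.0), 0.1, (4.0, 6.0), -0.1)
```

The resulting translation and rotation are exactly the quantities the motion control stage 20 would subsequently be commanded to apply in order to align the component 704 with the target area 708.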
[0423] In a variant of the further alternative method for use in spatially registering first and second objects described with reference to
[0424] The size of the PDMS stamp 37 of the pick and place tool 36 that engages any of the components 4, 104, 204, 304, 404, 504, 604, 704 may be larger or smaller than the component. A calibration step may be performed before the PDMS stamp 37 of the pick and place tool 36 engages any of the components to determine the spatial relationship between the PDMS stamp 37 of the pick and place tool 36 and the FOV 60 of the imaging system 30. The spatial relationship between the PDMS stamp 37 of the pick and place tool 36 and the FOV 60 of the imaging system 30 may be used to align the PDMS stamp 37 of the pick and place tool 36 with the centre of any of the components.
[0425] Rather than using the motion control stage to move the table 22 towards the PDMS stamp 37 of the pick and place tool 36 until one of the components 4, 104, 204, 304, 404, 504, 604, 704 engages the PDMS stamp 37 of the pick and place tool 36, the PDMS stamp 37 of the pick and place tool 36 may be movable towards one of the components 4, 104, 204, 304, 404, 504, 604, 704 until the PDMS stamp 37 of the pick and place tool 36 engages one of the components 4, 104, 204, 304, 404, 504, 604, 704. Similarly, rather than using the motion control stage to move the table 22 away from the PDMS stamp 37 of the pick and place tool 36, the PDMS stamp 37 of the pick and place tool 36 may be movable away from the table 22.
[0426] In the spatial registration methods described above with reference to
[0427] Furthermore, although the spatial registration methods described above with reference to
[0428] Although the spatial registration methods described above with reference to
[0429] Once detached from the first substrate or the table 22 of the motion control stage 20, the first object may be flipped before it is attached to the second object.
[0430] Although the spatial registration methods described above with reference to
[0431] Any of the spatial registration methods described above with reference to
[0432] Features of any one of the spatial registration methods described above with reference to
[0433] A method for use in the spatial registration of first and second objects may comprise spatially registering different first objects with the same second object.
[0434] A method for use in the spatial registration of first and second objects may comprise spatially registering different first objects to different target areas on the same second substrate.
[0435] A method for use in the spatial registration of first and second objects may comprise transferring different components defined on, or attached to, different first substrates to different target areas on the same second substrate.
[0436] A method for use in the spatial registration of first and second objects may comprise spatially registering different first objects with different second objects.
[0437] A method for use in the spatial registration of first and second objects may comprise spatially registering different first objects to different target areas on different second substrates.
[0438] A method for use in the spatial registration of first and second objects may comprise transferring different components defined on, or attached to, different first substrates to different target areas on different second substrates.
[0439] Referring now to
[0440] As will be described in more detail below, in use, first and second objects (not shown in
[0441] Although not shown explicitly in
[0442] The system 801 further includes an optical power measurement system 841 mounted above the upper surface 823 of the table 822 of the motion control stage 820 for measuring the optical power of at least a portion of light reflected from one or more objects located on the upper surface 823 of the table 822 of the motion control stage 820. The optical power measurement system 841 has a fixed spatial relationship relative to the base 821 of the motion control stage 820. The optical power measurement system 841 includes a single pixel detector (not shown) arranged so as to measure the optical power of at least a portion of light reflected from one or more objects located on the upper surface 823 of the table 822 of the motion control stage 820. The system 801 further includes a white light source 842 for illuminating one or more objects located on the upper surface 823 of the table 822 of the motion control stage 820 and a partially reflecting mirror arrangement 844 for reflecting at least some of the light from the white light source 842 so as to illuminate the one or more objects located on the upper surface 823 of the table 822 of the motion control stage 820 and so as to direct at least a portion of the incident light reflected from the one or more objects to the optical power measurement system 841.
[0443] The system 801 further includes a “pick-and-place” tool 836 mounted above the upper surface 823 of the table 822 of the motion control stage 820. The pick-and-place tool 836 includes a transparent pick-and-place head portion in the form of a PDMS stamp 837 for engaging and holding an object such as a component (not shown in
[0444] As will now be described with reference to
[0445] The method includes directing light from the white light source 842 onto the first object 804, the second object 808, and the one or more regions of the surface of the substrate or wafer 810 adjacent to the second object 808 at the same time and using the optical power measurement system 841 to measure the optical power of at least a portion of the incident light that is reflected from the first and second objects 804, 808 and the one or more regions of the surface of the substrate or wafer 810 adjacent to the second object 808 while the first and second objects 804, 808 are aligned relative to one another. Specifically, the optical power measurement system 841 measures the optical power of at least a portion of the light that is reflected from the first and second objects 804, 808 and the one or more regions of the surface of the substrate or wafer 810 adjacent to the second object 808 while the PDMS stamp 837 of the tool 836 holds the first object 804 above the surface of the substrate or wafer 810 and the computing resource 840 controls the actuators of the motion control stage 820 so as to translate the table 822 of the motion control stage 820 (and therefore also the second object 808 and the substrate or wafer 810) in x-y relative to the base 821 of the motion control stage 820 and/or so as to rotate the table 822 of the motion control stage 820 (and therefore also the second object 808 and the substrate or wafer 810) about the z-axis relative to the base 821 of the motion control stage 820 until the measured optical power is minimised.
[0446] Alternatively, as will now be described with reference to
[0447] The method includes directing light from the white light source 842 onto the first object 904, the second object 908, and the one or more regions of the surface of the substrate or wafer 910 adjacent to the second object 908 at the same time and using the optical power measurement system 841 to measure the optical power of at least a portion of the incident light that is reflected from the first and second objects 904, 908 and the one or more regions of the surface of the substrate or wafer 910 adjacent to the second object 908 while the first and second objects 904, 908 are aligned relative to one another. Specifically, the optical power measurement system 841 measures the optical power of at least a portion of the incident light that is reflected from the first and second objects 904, 908 and the one or more regions of the surface of the substrate or wafer 910 adjacent to the second object 908 while the PDMS stamp 837 of the tool 836 holds the first object 904 above the surface of the substrate or wafer 910 and the computing resource 840 controls the actuators of the motion control stage 820 so as to translate the table 822 of the motion control stage 820 (and therefore also the second object 908 and the substrate or wafer 910) in x-y relative to the base 821 of the motion control stage 820 and/or so as to rotate the table 822 of the motion control stage 820 (and therefore also the second object 908 and the substrate or wafer 910) about the z-axis relative to the base 821 of the motion control stage 820 until the measured optical power is maximised.
[0448] Such a method may enable the spatial registration of first and second objects to a resolution or accuracy of less than 1 μm, less than 100 nm, less than 10 nm, or of the order of 1 nm. Such a method may enable the spatial registration of the first and second objects where the first and second objects have a size or scale of less than 1 μm, less than 100 nm, less than 10 nm, or of the order of 1 nm.
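The power-driven alignment described in the two alternatives above can be sketched as a simple coordinate-descent loop. `align_by_power`, `measure_power`, and `move_stage` are hypothetical callables standing in for the optical power measurement system 841 and the stage actuators; a rotation channel could be added in the same way, and a real system might instead use a gradient-based or model-based optimiser:

```python
def align_by_power(measure_power, move_stage,
                   step=1.0, min_step=0.01, maximise=True):
    """Nudge the stage in +/-x and +/-y, keep any move that improves the
    measured optical power (maximised or minimised as requested), and
    halve the step size whenever no move helps, until the step is small."""
    sign = 1.0 if maximise else -1.0
    best = sign * measure_power()
    while step >= min_step:
        improved = False
        for dx, dy in ((step, 0), (-step, 0), (0, step), (0, -step)):
            move_stage(dx, dy)
            p = sign * measure_power()
            if p > best:
                best, improved = p, True
            else:
                move_stage(-dx, -dy)  # undo the unhelpful move
        if not improved:
            step /= 2.0
    return sign * best

# Simulated stage and power landscape with its maximum at (3, -2).
pos = [0.0, 0.0]
def move_stage(dx, dy):
    pos[0] += dx
    pos[1] += dy
def measure_power():
    return 10.0 - (pos[0] - 3.0) ** 2 - (pos[1] + 2.0) ** 2

final = align_by_power(measure_power, move_stage)
```

With `maximise=False` the same loop drives the power towards a minimum, matching the first alternative in which alignment corresponds to minimised reflected power.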
[0449] One of ordinary skill in the art will understand that various modifications are possible to the apparatus and methods described above with reference to
[0450] The surface to which the second object 808, 908 is fixed or attached may be the upper surface 823 of the table 822 of the motion control stage 820.
[0451] The first component 804, 904 may be detachably attached to the table 822 of the motion control stage 820.
[0452] The first component 804, 904 may be detachably attached to a first substrate or wafer (not shown).
[0453] The second object 808, 908 may comprise a feature, a structure, a target area, a target region or a second component 808, 908 defined on a second substrate or wafer 810, 910, wherein the second substrate or wafer 810, 910 is fixed to the motion control stage 820.
[0454] Such a method may enable the alignment of a first component 804, 904 relative to a feature, a structure, a target area, a target region or a second component 808, 908 defined on a substrate or a wafer 810, 910.
[0455] The method may comprise using a multi-pixel detector to measure the total integrated optical power of at least a portion of the light that is reflected from the first and second objects 804 and 808 or 904 and 908 and the one or more regions of the surface of the substrate or wafer 810, 910 adjacent to the second object 808, 908 and that is incident across a plurality of the pixels of the multi-pixel detector.
[0456] The light may comprise light other than white light. For example, the light may comprise coherent light. The light may comprise visible or infrared light.
[0457] The method may comprise detaching the first object 804, 904 from the table 822 of the motion control stage 820.
[0458] The method may comprise detaching the first object 804, 904 from the first substrate (not shown).
[0459] The method may comprise holding the first object 804, 904.
[0460] The method may comprise moving the first object 804, 904 and the motion control stage 820 apart and holding the first object 804, 904 spaced apart from the motion control stage 820 and the second object 808, 908 to permit the motion control stage to move the second object 808, 908 relative to the first object 804, 904.
[0461] The method may comprise aligning the tool, head, stamp, probe or holder 836, 837 with respect to the first object 804, 904.
[0462] The method may comprise engaging the first object 804, 904 with the PDMS stamp 837.
[0463] The method may comprise using the PDMS stamp 837 to hold the first object 804, 904.
[0464] The method may comprise using the PDMS stamp 837 to detach the first object 804, 904 from the motion control stage 820.
[0465] The method may comprise moving the motion control stage 820 away from the first object 804, 904.
[0466] The method may comprise using the tool, head, stamp, probe or holder 836 to move the first object 804, 904 away from the motion control stage.
[0467] The method may comprise using the motion control stage 820 to move the second object 808, 908 relative to the first object 804, 904 so as to align the first and second objects 804 and 808 or 904 and 908 relative to one another until the measured optical power is maximised or minimised.
[0468] The method may comprise bringing the first and second objects 804 and 808 or 904 and 908 together until the first and second objects 804 and 808 or 904 and 908 are aligned and in engagement.
[0469] The method may comprise:
[0470] using the PDMS stamp 837 to hold the first object 804, 904 until the first and second objects 804 and 808 or 904 and 908 are aligned and in engagement; and then
[0471] using the PDMS stamp 837 to release the first object 804, 904 to permit attachment of the first and second objects 804 and 808 or 904 and 908.
[0472] The method may comprise using the motion control stage 820 to move the second object 808, 908 towards the first object 804, 904 until the first and second objects 804 and 808 or 904 and 908 are aligned and in engagement.
[0473] The method may comprise using the PDMS stamp 837 to move the first object 804, 904 towards the second object 808, 908 until the first and second objects 804 and 808 or 904 and 908 are aligned and in engagement.
[0474] As an alternative to, or in addition to, the PDMS stamp 837 of the tool 836 being transparent to the light used to illuminate the first object 804, 904 and the second object 808, 908, the head 837 of the tool 836 may be smaller than the first object 804, 904 to be picked. For example, the head 837 of the tool 836 may comprise a very fine tip or needle which is smaller than the first object 804, 904 to be picked.
[0475] The method may comprise attaching the first and second objects 804 and 808 or 904 and 908 while the first and second objects 804 and 808 or 904 and 908 are aligned.
[0476] Such a method may be used for the micro-assembly of the first and second objects 804 and 808 or 904 and 908, for example for transfer printing the first object 804, 904 onto the second object 808, 908.
[0477] Attaching the first and second objects 804 and 808 or 904 and 908 may comprise using a differential adhesion method and/or capillary bonding to attach the first and second objects 804 and 808 or 904 and 908 together.
[0478] Attaching the first and second objects 804 and 808 or 904 and 908 may comprise bonding the first and second objects 804 and 808 or 904 and 908 using an intermediate adhesive material or agent such as an intermediate adhesion layer. Attaching the first and second objects 804 and 808 or 904 and 908 may comprise soldering the first and second objects 804 and 808 or 904 and 908.
[0479] The method may comprise flipping the first object 804, 904 over before attaching the first and second objects 804 and 808 or 904 and 908.
[0480] The first object 804, 904 may comprise a lithographic mask and the second object 808, 908 may comprise a work-piece e.g. a substrate or a wafer. The work-piece may comprise photoresist to be exposed to visible and/or UV light through the lithographic mask.
[0481] One of ordinary skill in the art will understand that one or more of the features of the embodiments of the present disclosure described above with reference to the drawings may produce effects or provide advantages when used in isolation from one or more of the other features of the embodiments of the present disclosure and that different combinations of the features are possible other than the specific combinations of the features of the embodiments of the present disclosure described above.