Spectrally-resolved three-dimensional shape measurement device and spectrally-resolved three-dimensional shape measurement method

10638062 · 2020-04-28

Abstract

An apparatus includes: an interferometer configured to produce white light fringes with measuring light reflected or scattered by an object; an image sensor configured to generate an image signal for each pixel; and a controller. The interferometer splits the measuring light into two luminous fluxes and reflects them on reflecting mirrors having different curvatures. A white light fringe signal is obtained by varying the optical path difference between the two luminous fluxes. The controller is configured to perform frequency conversion on the white light fringe signal, with respect to the optical path difference, to determine a cross spectral density representing spectral information of each point on the object. The controller is configured to perform back-propagation computation based on Fresnel diffraction integral on the cross spectral density to determine a wavefront of light from each point on the object.

Claims

1. An apparatus for obtaining a three-dimensional shape and spectral information of an object, the apparatus comprising: an interferometer configured to produce white light fringes based on measuring light reflected or scattered by the object illuminated with light having wavelengths; an image sensor configured to generate an image signal for each pixel, the image signal representing an intensity distribution of light outputted from the interferometer; and a controller connected to the image sensor and configured to determine the three-dimensional shape and the spectral information of the object from the image signal, the interferometer including: a beam splitting element configured to split the measuring light into a first luminous flux and a second luminous flux; a first reflecting mirror having a first reflection surface arranged for reflecting the first luminous flux to the beam splitting element, the first reflection surface having a first curvature; a second reflecting mirror having a second reflection surface arranged for reflecting the second luminous flux to the beam splitting element, the second reflection surface having a second curvature different from the first curvature; and a movable stage configured to move the second reflecting mirror in a first direction parallel to a straight line that connects the beam splitting element and the second reflecting mirror to each other, so as to change a distance between the beam splitting element and the second reflecting mirror, thereby producing an optical path difference between the first luminous flux and the second luminous flux, the controller being configured to: acquire a white light fringe signal represented by a plurality of the image signals generated by the image sensor, while controlling the movable stage to move the second reflecting mirror in the first direction; perform frequency conversion on the white light fringe signal, with respect to the optical path difference between the first luminous flux 
and the second luminous flux, so as to determine a cross spectral density representing spectral information of each point on the object; and perform back-propagation computation based on Fresnel diffraction integral on the cross spectral density, so as to determine a wavefront of light reflected or scattered at each point on the object.

2. The apparatus according to claim 1, wherein both of the first reflecting mirror and the second reflecting mirror are concave mirrors.

3. A method adapted for an apparatus, the apparatus comprising: an interferometer configured to produce white light fringes with measuring light reflected or scattered by an object illuminated with light having wavelengths; and an image sensor configured to generate an image signal for each pixel, the image signal representing an intensity distribution of light outputted from the interferometer, the interferometer including: a beam splitting element configured to split the measuring light into a first luminous flux and a second luminous flux; a first reflecting mirror having a reflection surface arranged for reflecting the first luminous flux to the beam splitting element, the reflection surface having a first curvature; a second reflecting mirror having a reflection surface arranged for reflecting the second luminous flux to the beam splitting element, the reflection surface having a second curvature different from the first curvature; and a movable stage configured to move the second reflecting mirror in a first direction parallel to a straight line that connects the beam splitting element and the second reflecting mirror to each other, so as to change a distance between the beam splitting element and the second reflecting mirror, thereby producing an optical path difference between the first luminous flux and the second luminous flux, the method comprising: acquiring a white light fringe signal represented by a plurality of the image signals generated by the image sensor, while controlling the movable stage to move the second reflecting mirror in the first direction; performing frequency conversion on the white light fringe signal, with respect to the optical path difference between the first luminous flux and the second luminous flux, so as to determine a cross spectral density representing spectral information of each point on the object; and performing back-propagation computation based on Fresnel diffraction integral on the cross spectral density, so as to 
determine a wavefront of light reflected or scattered at each point on the object.

Description

BRIEF DESCRIPTION OF THE DRAWINGS

(1) FIG. 1 is a schematic configuration diagram of a spectrally-resolved three-dimensional shape measurement device according to one embodiment of the present invention.

(2) FIG. 2 is a functional block diagram of a controller.

(3) FIG. 3 is a schematic view of white light fringes obtained by a spectrally-resolved three-dimensional shape measurement device according to one embodiment of the present invention.

(4) FIG. 4 is an operation flowchart of a spectrally-resolved three-dimensional shape measurement device according to one embodiment of the present invention.

DESCRIPTION OF THE PREFERRED EMBODIMENTS

(5) A spectrally-resolved three-dimensional shape measurement device according to an embodiment of the present invention will now be described with reference to the drawings.

(6) A spectrally-resolved three-dimensional shape measurement device according to one embodiment of the present invention splits light reflected or scattered by an object illuminated with white light, into two luminous fluxes, by a beam splitting element of an interferometer. The spectrally-resolved three-dimensional shape measurement device produces a difference in wavefront between the two luminous fluxes by reflecting the two luminous fluxes on two reflecting mirrors separately disposed and having different curvatures. The spectrally-resolved three-dimensional shape measurement device then combines the two luminous fluxes into one luminous flux again. During the measuring operation, the spectrally-resolved three-dimensional shape measurement device measures the intensity distribution of the combined luminous flux by an image sensor while moving one of the reflecting mirrors in such a direction as to change the optical path difference between the two luminous fluxes. The spectrally-resolved three-dimensional shape measurement device thus obtains white light fringes including both the three-dimensional shape information and the spectral information of the object. Since the spectrally-resolved three-dimensional shape measurement device obtains white light fringes by moving only one of the reflecting mirrors in one direction, the device can measure the three-dimensional shape and the spectral information of the object in a reduced time.

(7) FIG. 1 schematically shows a configuration of a spectrally-resolved three-dimensional shape measurement device 1 according to one embodiment of the present invention. Spectrally-resolved three-dimensional shape measurement device 1 includes an interferometer 2, an image sensor 3, and a controller 4. A condensing optical system (not shown) may further be disposed between interferometer 2 and image sensor 3 to condense a luminous flux traveling from interferometer 2 to image sensor 3. Alternatively, a condensing optical system may be disposed between an object 10 and interferometer 2 to condense a luminous flux traveling from object 10 to interferometer 2.

(8) Interferometer 2 includes a beam splitting element 21, a first reflecting mirror 22, a second reflecting mirror 23, and a fine movement stage 24. Beam splitting element 21 is formed by a beam splitter or a half-silvered mirror, for example. Beam splitting element 21 splits measuring light reflected or scattered by object 10 illuminated with white light, into a first luminous flux B1 that travels to first reflecting mirror 22 and a second luminous flux B2 that travels to second reflecting mirror 23. Beam splitting element 21 combines first luminous flux B1 reflected by first reflecting mirror 22 with second luminous flux B2 reflected by second reflecting mirror 23 into one luminous flux, and outputs the combined luminous flux to image sensor 3.

(9) The light source for illuminating object 10 may be of any type and may be placed in any way. Various types of light sources that emit light having two or more wavelengths may be used, such as natural light, a white LED, and a plurality of monochromatic LEDs with different wavelengths. It is preferable that a surface of object 10 be illuminated with as uniform an illuminance as possible.

(10) First reflecting mirror 22 is disposed so that its reflection surface faces beam splitting element 21 and so that the distance between beam splitting element 21 and first reflecting mirror 22 is constant. In the present embodiment, first reflecting mirror 22 is disposed opposite to object 10 across beam splitting element 21.

(11) Second reflecting mirror 23 is disposed so that its reflection surface faces beam splitting element 21 in the direction orthogonal to the line that connects object 10 and first reflecting mirror 22 to each other. Second reflecting mirror 23 is disposed on fine movement stage 24 movably in the direction away from beam splitting element 21.

(12) First reflecting mirror 22 and second reflecting mirror 23 have different curvatures. For example, first reflecting mirror 22 may be a plane mirror, and second reflecting mirror 23 may be a concave mirror. Both of the two reflecting mirrors may be concave mirrors. This enables a shorter distance between interferometer 2 and image sensor 3 and can downsize spectrally-resolved three-dimensional shape measurement device 1. Alternatively, at least one of the reflecting mirrors may be a convex mirror. Thus, first reflecting mirror 22 and second reflecting mirror 23 have different reflection surface curvatures. Therefore, first reflecting mirror 22 and second reflecting mirror 23 differently modulate the wavefronts of luminous flux B1 and luminous flux B2, respectively, when reflecting them. When the two wavefronts are superimposed, white light interference occurs according to the difference in wavefront between luminous flux B1 and luminous flux B2 and according to the optical path difference between luminous flux B1 and luminous flux B2. Preferably first reflecting mirror 22 and second reflecting mirror 23 are disposed so that beam splitting element 21 turns the optical axis of first reflecting mirror 22 by 90 degrees to coincide with the optical axis of second reflecting mirror 23. This reduces the amount of computation for determining the three-dimensional shape information and the spectral information of object 10 from white light fringes represented by image signals generated by image sensor 3.

(13) Fine movement stage 24 is formed by a piezo stage, for example. On fine movement stage 24, second reflecting mirror 23 is placed. Fine movement stage 24 is connected to controller 4. In response to a control signal from controller 4, fine movement stage 24 causes second reflecting mirror 23 placed on fine movement stage 24 to move in the direction away from beam splitting element 21, i.e., in the optical axis direction of second reflecting mirror 23. Movement of fine movement stage 24 enables adjustment of the distance between second reflecting mirror 23 and beam splitting element 21. As described above, beam splitting element 21 and first reflecting mirror 22 are fixedly disposed at a constant distance. By adjusting the distance between second reflecting mirror 23 and beam splitting element 21, interferometer 2 can produce a predetermined optical path difference between luminous flux B1 and luminous flux B2.

(14) Hereinafter, a coordinate system is defined for convenience in which: the direction parallel to the plane where beam splitting element 21, first reflecting mirror 22, and second reflecting mirror 23 are disposed and along which luminous flux B2 travels to second reflecting mirror 23 is an x-axis; a direction orthogonal to the plane is a y-axis; and the direction parallel to the plane and along which luminous flux B1 travels to first reflecting mirror 22 is a z-axis. In this coordinate system, the origin (0,0,0) is set on the optical axis of first reflecting mirror 22 near object 10.

(15) Image sensor 3 includes two-dimensionally arrayed solid state image sensing devices, such as CCD or CMOS devices. During the measuring operation in spectrally-resolved three-dimensional shape measurement device 1, image sensor 3 receives a luminous flux obtained by combining luminous flux B1 reflected by first reflecting mirror 22 and luminous flux B2 reflected by second reflecting mirror 23. Image sensor 3 generates an image signal indicating a pixel value of each pixel according to the intensity of the received light at the pixel. Specifically, during the measuring operation, image sensor 3 generates a plurality of image signals representing the intensity distribution of a luminous flux obtained by combining luminous flux B1 and luminous flux B2. Different images correspond to different optical path differences between luminous flux B1 and luminous flux B2. Image sensor 3 is connected to controller 4. Each time image sensor 3 generates an image signal, image sensor 3 outputs the image signal to controller 4. For example, image sensor 3 generates about 100 to 1000 images during one measuring operation.

(16) Controller 4 is formed by a so-called computer. Controller 4 controls the overall spectrally-resolved three-dimensional shape measurement device 1. Based on images obtained from image sensor 3, controller 4 regenerates the wavefront from each point on object 10, namely, the spectral information of each point on object 10 and the three-dimensional shape information of object 10.

(17) FIG. 2 shows a functional block diagram of controller 4. As shown in FIG. 2, controller 4 includes: a storage unit 41 including at least one of a nonvolatile or volatile semiconductor memory, a magnetic disk, an optical disk, and a reader for such media; a communication unit 42 including an interface circuit configured in accordance with communication standards (e.g. RS232C, Universal Serial Bus, and Ethernet (registered trademark)), and software (e.g. a device driver); and a control unit 43 including one or more processors, such as a CPU and a numerical computation processor, and a peripheral circuit of the processors.

(18) Control unit 43 controls fine movement stage 24 by sending a control signal via communication unit 42, so as to move second reflecting mirror 23 in the x-axis direction and change the optical path difference between luminous flux B1 and luminous flux B2. During the measuring operation, control unit 43 changes the optical path difference by an amount such that white light fringes can be obtained over the range of wavelength for which regeneration of spectral information of object 10 is to be performed. For example, control unit 43 changes the optical path difference within a range corresponding to two or three times the upper limit of the range of wavelength and including the optical path difference of 0.
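The scan plan in this paragraph can be illustrated with a small calculation. The wavelength band, oversampling factor, and resulting frame count below are assumed example figures, not values from the embodiment; the oversampling factor is chosen so the frame count lands in the roughly 100 to 1000 images cited later for one measuring operation.

```python
# Rough scan-plan arithmetic for the optical-path-difference sweep.
# All numbers are illustrative assumptions, not values from the embodiment.
lambda_max = 800e-9          # assumed upper wavelength limit of the band
lambda_min = 400e-9          # assumed lower wavelength limit of the band

z_range = 3 * lambda_max     # sweep 3x the upper limit, centered on zero
z_step = lambda_min / 20     # oversample the finest fringe period in Z
n_frames = round(z_range / z_step) + 1
stage_travel = z_range / 2   # the mirror moves Z/2 for a path difference Z
print(n_frames, stage_travel)
```

With these assumed figures the sweep takes 121 frames and about 1.2 µm of stage travel, consistent in order of magnitude with the frame counts cited for image sensor 3.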

(19) Control unit 43 determines the three-dimensional shape information and the spectral information of object 10 from white light fringes represented by a plurality of image signals received from image sensor 3.

(20) Hereinafter, the principle for measuring the three-dimensional shape and the spectral information of object 10 by spectrally-resolved three-dimensional shape measurement device 1 is described.

(21) Function R (x,y) representing the phase modulation on a light wave caused by a spherical mirror having focal length f is expressed by the following equation:

(22) R(x, y) = \exp\left[-\frac{ik}{2f}(x^2 + y^2)\right] = Q_{xy}\!\left(-\frac{1}{f}\right) \quad (1)
where k denotes a wave number, and (x, y) denotes a position on the reflection surface of the spherical mirror in the x-axis and y-axis directions. If the spherical mirror has a positive power (i.e., the spherical mirror is a concave mirror), focal length f is a positive value.
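Equation (1) can be evaluated numerically. The sketch below (plain NumPy; the grid size, aperture, wavelength, and focal length are illustrative assumptions) builds the quadratic phase factor R(x, y) for a concave mirror; a plane mirror corresponds to the limit f → ∞, where R is 1 everywhere.

```python
import numpy as np

# Sketch of equation (1): the quadratic phase imprinted by a spherical mirror
# of focal length f. Grid size, aperture, wavelength, and f are illustrative.
def mirror_phase(f, half_width=1e-3, n=256, wavelength=500e-9):
    """Return R(x, y) = exp[-i k (x^2 + y^2) / (2 f)] on an n x n grid."""
    k = 2 * np.pi / wavelength
    coords = np.linspace(-half_width, half_width, n)
    x, y = np.meshgrid(coords, coords)
    return np.exp(-1j * k * (x**2 + y**2) / (2 * f))

# A concave mirror (f > 0) and a plane mirror (R = 1) modulate the two arms
# differently; their superposition carries the white light fringes.
r_concave = mirror_phase(f=0.1)
```

Because the modulation is a pure phase, the array has unit modulus everywhere, and its symmetry in x and y reflects the rotational symmetry of the mirror.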

(23) For example, the distance between a point on the optical axis of first reflecting mirror 22 and the origin is denoted by d_0, and the distance between a point on the optical axis of first reflecting mirror 22 and image sensor 3 is denoted by d_1. It is assumed that second reflecting mirror 23 moves by distance Z/2 away from beam splitting element 21 in the x direction, from the position at which the optical path difference between luminous flux B1 and luminous flux B2 is 0 to the position at which the optical path difference is Z. In this case, luminous flux B1 emitted from point r_s = (x_s, y_s, z_s) on object 10 propagates by distance (d_0 − z_s) to be modulated by first reflecting mirror 22, and further propagates by distance d_1 to be sensed at image sensor 3. Luminous flux B2 emitted from point r_s propagates by distance (d_0 − z_s + Z/2) to be modulated by second reflecting mirror 23, and further propagates by distance (d_1 + Z/2) to be sensed at image sensor 3. Accordingly, the light wave corresponding to luminous flux B1 and the light wave corresponding to luminous flux B2 on image sensor 3 are expressed by the following equations, with no consideration for a quadratic phase factor for the position of point r_s that is not related to regeneration of a three-dimensional image:

(24) U_1(\rho, \nu) = \frac{A f_1}{d_1 f_1 + (d_0 - z_s) f_1 - (d_0 - z_s) d_1} \exp[ik(d_0 + d_1 - z_s)] \exp\left[\frac{ik}{2}\,\frac{f_1 - (d_0 - z_s)}{d_1 f_1 + (d_0 - z_s) f_1 - (d_0 - z_s) d_1}\left\{\left(X - \frac{f_1}{f_1 - (d_0 - z_s)} x_s\right)^2 + \left(Y - \frac{f_1}{f_1 - (d_0 - z_s)} y_s\right)^2\right\}\right] \quad (2)

U_2(\rho_{is}, \nu) = \frac{A f_2}{d_1 f_2 + (d_0 - z_s) f_2 - (d_0 - z_s) d_1} \exp[ik(d_0 + d_1 - z_s)] \exp\left[\frac{ik}{2}\,\frac{f_2 - (d_0 - z_s)}{d_1 f_2 + (d_0 - z_s) f_2 - (d_0 - z_s) d_1}\left\{\left(X - \frac{f_2}{f_2 - (d_0 - z_s)} x_s\right)^2 + \left(Y - \frac{f_2}{f_2 - (d_0 - z_s)} y_s\right)^2\right\}\right] \quad (3)
where ρ_is = (X, Y) denotes coordinates in a real space on image sensor 3; ρ = (X, Y, Z) = (ρ_is, Z) denotes a position vector representing the position after a movement by distance Z in the x direction from the sensing surface (Z = 0) of image sensor 3; f_1 and f_2 respectively denote the focal length of first reflecting mirror 22 and the focal length of second reflecting mirror 23; and ν denotes a frequency according to wavelength λ of the light wave. That is, ν = ck = c/λ holds, where c denotes the velocity of light and k denotes a wave number.

(25) Interference fringes (volume interferogram) I_p(ρ, ν) recorded by the light emitted from a monochromatic light source at position r_s and propagating via interferometer 2 are expressed by the following equation.

(26) I_p(\rho, \nu) = \left\langle \left| U_1(\rho, \nu) + U_2(\rho_{is}, \nu) \right|^2 \right\rangle = \left\langle |U_1(\rho, \nu)|^2 \right\rangle + \left\langle |U_2(\rho_{is}, \nu)|^2 \right\rangle + \left\langle U_1^*(\rho, \nu)\, U_2(\rho_{is}, \nu) \right\rangle + \left\langle U_1(\rho, \nu)\, U_2^*(\rho_{is}, \nu) \right\rangle \quad (4)

(27) Accordingly, cross spectral density W_s(ρ, r_s, ν) of the optical field of interference fringes recorded by the light emitted from the monochromatic light source at position r_s and propagating via interferometer 2 is defined, based on equations (2) to (4), as the following equations:

(28) W_s(\rho, r_s, \nu) = \left\langle U_1^*(\rho, \nu)\, U_2(\rho_{is}, \nu) \right\rangle = \exp(ikZ) \exp\left\{\frac{ik}{2\alpha(z_s)}\left[(X - m x_s)^2 + (Y - m y_s)^2\right]\right\} \quad (5)

where α(z_s) denotes a curvature radius of a spherical wave phase recorded as interference fringes, and m denotes a lateral magnification of a three-dimensional image of the light source:

\alpha(z_s) = \frac{\{f_1 f_2 - (f_1 + f_2) d_1 + d_1^2\}(d_0 - z_s)^2 + \{2 f_1 f_2 d_1 - (f_1 + f_2) d_1^2\}(d_0 - z_s) + f_1 f_2 d_1^2}{(d_0 - z_s)^2 (f_2 - f_1)}, \qquad m = -\frac{d_1}{d_0 - z_s}

Solving curvature radius α(z_s) for z_s provides the following equation.

(29) z_s = d_0 - \frac{2 f_1 f_2 d_1 - d_1^2 (f_1 + f_2) + \sqrt{d_1^4 (f_1 + f_2)^2 + 4 f_1 f_2 d_1^2 \left\{\alpha(z_s)(f_2 - f_1) - d_1^2\right\}}}{2\left\{\alpha(z_s)(f_2 - f_1) - f_1 f_2 + (f_1 + f_2) d_1 - d_1^2\right\}} \quad (6)
Thus, from the value of curvature radius α(z_s), the depth of the object is determined in a reference coordinate system, namely, the coordinate system of point r_s on object 10.
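The round trip from equation (5) to equation (6) can be checked numerically. In the sketch below, f1, f2, d0, and d1 are assumed example values, chosen so that the positive square-root branch written in equation (6) is the physical root (for other parameter choices the other root of the underlying quadratic in d_0 − z_s may apply); alpha names the curvature radius of equations (5) and (6).

```python
import math

# Numerical sanity check, with assumed example parameters (not values from the
# embodiment), that equation (6) inverts the curvature radius of equation (5).
f1, f2 = 0.10, 0.15     # assumed focal lengths of the two mirrors, in meters
d0, d1 = 0.30, 0.12     # assumed distances; d1 lies between f1 and f2

def alpha(z_s):
    """Curvature radius of the recorded spherical wave phase, equation (5)."""
    a = d0 - z_s
    num = ((f1 * f2 - (f1 + f2) * d1 + d1 ** 2) * a ** 2
           + (2 * f1 * f2 * d1 - (f1 + f2) * d1 ** 2) * a
           + f1 * f2 * d1 ** 2)
    return num / (a ** 2 * (f2 - f1))

def z_from_alpha(al):
    """Depth z_s recovered from the curvature radius, equation (6)."""
    num = (2 * f1 * f2 * d1 - d1 ** 2 * (f1 + f2)
           + math.sqrt(d1 ** 4 * (f1 + f2) ** 2
                       + 4 * f1 * f2 * d1 ** 2 * (al * (f2 - f1) - d1 ** 2)))
    den = 2 * (al * (f2 - f1) - f1 * f2 + (f1 + f2) * d1 - d1 ** 2)
    return d0 - num / den

z_rec = z_from_alpha(alpha(0.02))   # round trip recovers the assumed depth
```

The round trip recovers the assumed depth to machine precision, which is a quick regression test when porting equations (5) and (6) into measurement software.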

(30) In general, light from each point on object 10 is incoherent with light from every other point. That is, light from each point interferes only with itself but does not interfere with light from the other points. A hologram, which is a record of white light fringes obtained on image sensor 3, is therefore an additive superposition of the intensities of the white light fringes of light from the individual points on object 10. Accordingly, when the distribution of light from each point on object 10 is denoted by S(r_s, ν), white light fringes I(ρ) recorded by image signals generated by image sensor 3 are expressed as the integral of the product of S(r_s, ν) and interference fringes I_p(ρ, r_s, ν) generated by the monochromatic point light source, expressed by equation (4), with respect to point r_s and frequency ν, by the following equation:

(31) I(\rho) = \int d^3 r_s \int d\nu \, S(r_s, \nu) \, I_p(\rho, r_s, \nu) = \Gamma_1(\rho) + \Gamma_2(\rho) + \Gamma_{12}(\rho) + \Gamma_{12}^*(\rho) \quad (7)
where Γ_1(ρ) and Γ_2(ρ) denote light intensities of luminous flux B1 and luminous flux B2, respectively, as independent light waves. Spatial coherence function Γ_12(ρ) is expressed by the following equation:

(32) \Gamma_{12}(\rho) = \int_0^{\infty} d\nu \, \exp(ikZ) \int d^3 r_s \, S(r_s, \nu) \exp\left\{\frac{ik}{2\alpha(z_s)}\left[(X - m x_s)^2 + (Y - m y_s)^2\right]\right\} = c \int dk \, \exp(ikZ) \, W(\rho_{is}, r_s, \nu)

W(\rho_{is}, r_s, \nu) = \int d^3 r_s \, S(r_s, \nu) \exp\left\{\frac{ik}{2\alpha(z_s)}\left[(X - m x_s)^2 + (Y - m y_s)^2\right]\right\} \quad (8)
where W(ρ_is, r_s, ν) denotes a complex hologram representing the spectral information of each point on object 10. Performing Fourier transformation on spatial coherence function Γ_12(ρ) with respect to optical path difference Z provides a set of cross spectral densities W(ρ_is, r_s, ν), one for each wavelength.
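The Fourier-transform step of equation (8) can be sketched on synthetic data: a fringe trace for one pixel containing two assumed spectral lines is transformed with respect to optical path difference Z, and the resulting spectrum peaks at the two wavenumbers. The scan length, sample count, and line wavelengths are illustrative assumptions chosen so each line completes a whole number of fringe periods over the sweep.

```python
import numpy as np

# Sketch of the frequency-conversion step: a synthetic fringe trace for one
# pixel, built from two assumed spectral lines, is Fourier transformed with
# respect to optical path difference Z; the spectrum resolves the two lines.
n = 1024
z = np.linspace(0.0, 20e-6, n, endpoint=False)   # swept optical path difference
dz = z[1] - z[0]
k1 = 2 * np.pi / 625e-9                          # assumed line at 625 nm
k2 = 2 * np.pi / 500e-9                          # assumed line at 500 nm

# The constant 2.0 plays the role of the Gamma_1 + Gamma_2 background of (7).
fringe = 2.0 + np.cos(k1 * z) + 0.5 * np.cos(k2 * z)

spectrum = np.abs(np.fft.rfft(fringe - fringe.mean()))
k_axis = 2 * np.pi * np.fft.rfftfreq(n, d=dz)
peaks = np.sort(k_axis[np.argsort(spectrum)[-2:]])   # two strongest components
```

Subtracting the mean removes the non-interfering background terms Γ_1 + Γ_2 before the transform, leaving only the oscillatory part that carries the spectral information.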

(33) Since ρ_is = (X, Y, 0) holds, performing back-propagation computation based on Fresnel diffraction integral on W(ρ_is, r_s, ν) provides regenerated image O(x, y, z, ν) of object 10 at regeneration distance z for each wavelength, as expressed by the following equation.

(34) O(x, y, z, \nu) = \iint dX \, dY \, W(X, Y, r_s, \nu) \exp\left(-\frac{ik}{2z}\left[(X - x)^2 + (Y - y)^2\right]\right) \quad (9)

(35) If regeneration distance z is equal to curvature radius α(z_s) included in cross spectral density W(ρ_is, r_s, ν), a focused image of object 10 is obtained at coordinates (x, y), which are obtained by enlarging coordinates (x_s, y_s) at position z_s by magnification m.
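The refocusing condition z = α(z_s) can be demonstrated with a discrete version of equation (9). In the sketch below, a cross spectral density carrying only the spherical-wave phase of equation (5) (unit amplitude; the curvature radius, wavelength, grid, and magnified source position m·x_s = 0.5 mm, m·y_s = −0.25 mm are all assumed values) is back-propagated at z equal to the recorded curvature radius, and the intensity peaks at the magnified source position.

```python
import numpy as np

# Sketch of equation (9): back-propagating a spherical-wave hologram at
# z = alpha refocuses it to a point at (m*x_s, m*y_s). All values illustrative.
wavelength = 500e-9
k = 2 * np.pi / wavelength
n, half_width = 256, 2e-3
coords = np.linspace(-half_width, half_width, n)
X, Y = np.meshgrid(coords, coords)

alpha = 0.5                     # recorded curvature radius (assumed), meters
mx, my = 0.5e-3, -0.25e-3       # magnified source position (assumed)
W = np.exp(1j * k / (2 * alpha) * ((X - mx) ** 2 + (Y - my) ** 2))

def backpropagate(W, z):
    """Discrete Fresnel integral of equation (9); the quadratic kernel
    separates in X and Y, so it reduces to two matrix products."""
    kern = np.exp(-1j * k / (2 * z) * (coords[:, None] - coords[None, :]) ** 2)
    return kern @ W @ kern      # O[y, x] = sum_Y sum_X kern(y,Y) W(Y,X) kern(X,x)

image = np.abs(backpropagate(W, alpha))
iy, ix = np.unravel_index(np.argmax(image), image.shape)
```

Exploiting the separability of the kernel turns the four-dimensional double integral into two n x n matrix products, which is the standard trick for evaluating the Fresnel integral on modest grids without FFT-based propagators.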

(36) FIG. 3 shows a schematic view of a white light fringe signal acquired by spectrally-resolved three-dimensional shape measurement device 1. For example, if object 10 is a point light source on the optical axis of first reflecting mirror 22, white light fringes 300 are three-dimensional interference fringes in a Cartesian coordinate system as shown in FIG. 3, where the axes representing two directions X and Y are orthogonal to each other on image sensor 3 and where the axis representing optical path difference Z between luminous flux B1 and luminous flux B2 is orthogonal to each of the above two axes. Image sensor 3 generates an image signal representing white light fringes in an XY-plane for each value of optical path difference Z. Therefore, white light fringes 300 are obtained by stacking the image signal for each Z value. White light fringes 300 form concentric interference fringes in each XY-plane, and the pattern of the concentric fringes varies with the length of optical path difference Z between luminous fluxes B1 and B2.

(37) FIG. 4 shows an operation flowchart of spectrally-resolved three-dimensional shape measurement device 1 in measuring the three-dimensional shape and the color of object 10. The operation described hereinafter is controlled by controller 4.

(38) When the measurement starts, control unit 43 of controller 4 acquires an image representing the light intensity distribution generated by image sensor 3, by sending a control signal to fine movement stage 24 via communication unit 42 to vary the optical path difference between luminous flux B1 and luminous flux B2 (step S101). Control unit 43 stores, in storage unit 41 of controller 4, an image signal corresponding to the light intensity distribution received from image sensor 3, with the image signal being associated with the optical path difference between luminous flux B1 and luminous flux B2 (step S102). Control unit 43 determines whether or not the image signal has been obtained over the entire range of a predetermined optical path difference (step S103). When the image signal has not been obtained over the entire range of the optical path difference, control unit 43 returns the control to step S101, controls fine movement stage 24 to change the optical path difference, and repeats steps S101 to S103.

(39) When the image signal has been obtained over the entire range of the predetermined optical path difference at step S103, control unit 43 performs Fourier transformation, for each corresponding pixel, on a white light fringe signal represented by the plurality of image signals stored in storage unit 41, with respect to the optical path difference between luminous fluxes B1 and B2, in accordance with the above equations (7) and (8), thus determining cross spectral density W(ρ_is, ν) (step S104). Control unit 43 performs back-propagation computation based on Fresnel diffraction integral on the determined cross spectral density W(ρ_is, ν) in accordance with equation (9), thus determining a regenerated wavefront from each point on object 10 (step S105). Controller 4 then ends the process.
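Steps S101 to S105 can be strung together on synthetic data as follows. The stack below imitates acquisition for a single assumed spectral line with a random per-pixel fringe phase; the per-pixel Fourier transform of step S104 then yields a complex hologram at that line's wavenumber bin, whose phase reproduces the per-pixel fringe phase. Step S105 (the Fresnel back-propagation of equation (9)) is indicated but not repeated here. All names and sizes are illustrative assumptions.

```python
import numpy as np

# Minimal end-to-end sketch of steps S101-S105 on synthetic data: a stack of
# fringe images I(Z, Y, X) is Fourier transformed along the Z (optical path
# difference) axis for each pixel, yielding a complex hologram per wavenumber.
nz, ny, nx = 256, 8, 8
z = np.linspace(0.0, 12.8e-6, nz, endpoint=False)
k0 = 2 * np.pi / 640e-9                       # single assumed spectral line
phase = np.random.default_rng(0).uniform(0, 2 * np.pi, (ny, nx))

# S101-S102: acquire and store one frame per optical path difference value.
stack = 1.0 + np.cos(k0 * z[:, None, None] + phase)

# S104: per-pixel Fourier transform with respect to Z (background removed).
spectra = np.fft.rfft(stack - stack.mean(axis=0), axis=0)
k_axis = 2 * np.pi * np.fft.rfftfreq(nz, d=z[1] - z[0])
k_bin = np.abs(spectra).sum(axis=(1, 2)).argmax()
hologram = spectra[k_bin]     # complex hologram W at the dominant line
# S105 would back-propagate `hologram` with the Fresnel integral of eq. (9).
```

The recovered hologram's complex phase matches the per-pixel fringe phase, which is exactly the wavefront information that step S105 propagates back to the object.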

(40) As described above, a spectrally-resolved three-dimensional shape measurement device according to one embodiment of the present invention splits light reflected or scattered by an object illuminated with white light into two luminous fluxes, reflects the two luminous fluxes on reflecting mirrors having different curvatures, and combines the reflected two luminous fluxes again. The spectrally-resolved three-dimensional shape measurement device records, by using an image sensor, wavefronts of the combined luminous flux, by varying the optical path difference between the two luminous fluxes, thus obtaining white light fringes. The spectrally-resolved three-dimensional shape measurement device performs, for each pixel, Fourier transformation with respect to the optical path difference to determine a cross spectral density representing the spectral information of each point on the object. The spectrally-resolved three-dimensional shape measurement device further performs back-propagation computation based on Fresnel diffraction integral on the cross spectral density to determine a regenerated wavefront from each point on the object representing the three-dimensional information of the object. Thus, the spectrally-resolved three-dimensional shape measurement device is simply required to move one of the reflecting mirrors along one axis for obtaining white light fringes. Therefore, the time required for measuring the three-dimensional shape and the spectral information of the object is reduced.

(41) The scope of the present invention is not limited to the above embodiment. For example, the positions of the two reflecting mirrors included in interferometer 2 may be interchanged with each other.

(42) In order to change the optical path difference between luminous fluxes B1 and B2, first reflecting mirror 22 may be moved to change the distance between first reflecting mirror 22 and beam splitting element 21, instead of moving second reflecting mirror 23.

(43) At least one of the two reflecting mirrors may be formed by a reflective spatial light modulator, such as an LCOS. In this case, controller 4 can change the curvature of the reflecting mirror by adjusting the voltage to be applied to a liquid crystal layer concentrically around the optical axis of the reflecting mirror. By adjusting the voltage to be applied to the liquid crystal layer of the spatial light modulator that forms the reflecting mirror, controller 4 can control the curvature of the reflecting mirror to provide a good signal-to-noise ratio of white light fringes.

(44) A spectrally-resolved three-dimensional shape measurement device according to the above embodiment or its variation may be used for various devices that require the spectral information and the three-dimensional shape information of an object. For example, a spectrally-resolved three-dimensional shape measurement device according to the above embodiment or its variation may be used in combination with a microscope. In this case, light from an object may be guided into an interferometer via an observation optical system of the microscope including an objective lens, for example.

(45) As described above, a person skilled in the art can make various modifications in accordance with an embodiment within the scope of the present invention.

(46) It should be construed that the embodiments of the present invention disclosed herein are by way of example in every respect, not by way of limitation. The scope of the present invention is defined by the terms of the claims and is intended to include any modification within the meaning and scope equivalent to the terms of the claims.