Method and optical system for acquiring the tomographical distribution of wave fronts of electromagnetic fields
11015981 · 2021-05-25
Assignee
Inventors
- Juan José Fernández Valdivia (Santa Cruz de Tenerife, ES)
- Juan Manuel TRUJILLO SEVILLA (Santa Cruz de Tenerife, ES)
- Óscar Gómez Cárdenes (Santa Cruz de Tenerife, ES)
Cpc classification
G01J9/00
PHYSICS
International classification
Abstract
The invention relates to a method for the two-dimensional reconstruction of wave fronts (104) of light for use in an optical system (100) comprising: measuring the distribution function of the light intensity in at least two images at different optical planes (101, 102) having an optical path difference. In particular, this method is suitable for probing the tomographical distribution of wave fronts of electromagnetic fields with an image detector, e.g. any standard two-dimensional camera.
Claims
1. A method for performing a two-dimensional reconstruction of wave fronts of light for use in an optical system comprising: measuring a distribution function of light intensity in at least two pixelated images, the at least two pixelated images captured at two or more optical planes, the two or more optical planes having an optical path difference; wherein measuring the distribution function of the light intensity comprises determining a plurality of one-dimensional cumulative distribution functions of the light intensity in each plane over a range of different angles within each plane; matching the determined plurality of one-dimensional cumulative distribution functions across the two or more optical planes to derive two-dimensional wave front slope estimates in a midway plane, wherein the midway plane is located between the two or more optical planes; and integrating the two-dimensional wave front slope estimates to reconstruct a two-dimensional shape of the wave front in the midway plane, wherein the distribution function of light intensity is associated with a wave front.
2. The method of claim 1, wherein one of the at least two pixelated images is taken in a pupil plane of the optical system.
3. The method of claim 1, wherein one pixelated image of the at least two pixelated images is taken intra-focally and at least one image of the at least two pixelated images is taken extra-focally.
4. The method of claim 1, further comprising reconstructing a plurality of two-dimensional wave front shapes in a plurality of optical planes based on the at least two pixelated images.
5. The method of claim 1, further comprising: dividing each pixelated image of the at least two pixelated images into sections; and for each section, reconstructing a two-dimensional shape of the wave front.
6. The method of claim 1, wherein propagation of a recovered wave front is determined according to Rayleigh-Sommerfeld diffraction.
7. The method of claim 1, wherein one of the at least two images is a computed image that is fully characterized by theoretical or empirical models.
8. A system, comprising: a processor; and a non-transitory memory containing instructions, which when executed by the processor, cause the system to: measure a distribution function of light intensity in at least two pixelated images, the at least two pixelated images captured at two or more optical planes, the two or more optical planes having an optical path difference, wherein measuring the distribution function of the light intensity comprises determining a plurality of one-dimensional cumulative distribution functions of the light intensity in each plane over a range of different angles within each plane, match the determined plurality of one-dimensional cumulative distribution functions across the two or more optical planes to derive two-dimensional wave front slope estimates in a midway plane, wherein the midway plane is located between the two or more optical planes, and integrate the two-dimensional wave front slope estimates to reconstruct a two-dimensional shape of the wave front in the midway plane, wherein the distribution function of the light intensity is associated with a wave front.
9. The system of claim 8, wherein the processor is a graphical processor unit (GPU).
10. The system of claim 8, further comprising an image detector, wherein the at least two pixelated images are captured by the image detector.
11. The system of claim 10, wherein the memory contains instructions, which when executed by the processor, cause the system to perform a tomography of a volumetric distribution of the wave front based on the at least two pixelated images.
12. The system of claim 11, further comprising a wave front sensor, wherein the memory further contains instructions, which when executed by the processor, cause the system to perform a wave-front reconstruction based on data from the wave front sensor.
13. The system of claim 12, wherein the memory contains instructions, which when executed by the processor, cause the system to perform a tomography of a three-dimensional distribution of the wave front based on data from the wave front sensor.
14. The system of claim 12, wherein the wave-front sensor is one or more of a curvature sensor or an optical acquisition system.
15. A non-transitory, computer-readable medium comprising program code, which when executed by a processor, causes a system to: measure a distribution function of light intensity in at least two pixelated images, the at least two pixelated images captured at two or more optical planes, the two or more optical planes having an optical path difference, wherein measuring the distribution function of the light intensity comprises determining a plurality of one-dimensional cumulative distribution functions of the light intensity in each plane over a range of different angles within each plane, match the determined plurality of one-dimensional cumulative distribution functions across the two or more optical planes to derive two-dimensional wave front slope estimates in a midway plane, wherein the midway plane is located between the two or more optical planes, and integrate the two-dimensional wave front slope estimates to reconstruct a two-dimensional shape of the wave front in the midway plane, wherein the distribution function of the light intensity is associated with a wave front.
16. The non-transitory, computer-readable medium of claim 15, further comprising instructions, which when executed by the processor, cause the system to perform a tomography of a volumetric distribution of the wave front based on the at least two pixelated images.
17. The non-transitory, computer-readable medium of claim 15, further comprising instructions, which when executed by the processor, cause the system to perform a wave-front reconstruction based on data from a wave front sensor.
18. The non-transitory, computer-readable medium of claim 15, further comprising instructions, which when executed by the processor, cause the system to perform a tomography of a three-dimensional distribution of the wave front based on data from a wave front sensor.
Description
(1) The following figures illustrate exemplary embodiments:
(6) The distribution of the exemplary photons 106a, 106b, 107a, 107b, 108a, 108b in their respective planes 101, 102 can thus be interpreted as representing light intensity distributions, and the optical planes 101, 102 can be interpreted as different images having an optical path difference.
(7) The reference numeral 109 denotes an exemplary orientation of the optical system, with the z-axis being identical or parallel to the optical axis (not shown) of the optical system 100.
(8) Assuming that photons 106a, 106b, 107a, 107b, 108a, 108b travel in straight lines 105a, 105b, 105c between the image planes or optical planes 102 and 101, and assuming that the direction of propagation of the photons is perpendicular to their corresponding local wave front, the displacement of the photons along the x-axis is given by the (local) wave-front slope times the distance between the two optical planes 101, 102.
(9) Hence the local wave-front slopes 104a, 104b, 104c of the wave front 104 can be estimated or reconstructed at an optical plane 103 halfway between the position of the photons or halfway between the optical planes 102, 101, respectively, by matching the photons 106b, 107b, 108b of one plane 101 onto the photons 106a, 107a, 108a of the other plane 102.
(10) The optical planes 102,101 in which the distribution of photons, i.e. the light intensity distribution, is measured can be located at any place along the optical path of the optical system 100. Therefore, also the optical plane 103 in which the wave front is to be reconstructed can be located at any place along the optical path of the optical system 100. Stated differently, the optical plane 103 in which the wave front is to be reconstructed does not need to coincide with any specific optical plane, e.g. aperture plane or pupil plane, of the optical system 100.
(11) As previously mentioned, it is conceivable that the images taken at different optical planes having an optical path difference, i.e. the images wherein the distribution function of the light intensity is measured, can be located either before or after an aperture or pupil plane of the optical system 100, such that the optical plane 103 in which the wave front is to be reconstructed also can be located before or after an aperture or pupil plane of the optical system 100.
(12) It is therefore possible that the images taken at different optical planes having an optical path difference, i.e. the images wherein the distribution function of the light intensity is measured, can be located at different distances with respect to a possible aperture plane or pupil plane.
(13) The method according to the invention described above now allows recovering the shape of the wave front also for the more complex case in which the wave-front and intensity variations occur over two dimensions and along different directions.
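In the two-dimensional case, a one-dimensional cumulative distribution function can be formed along any in-plane direction by sorting the pixels by their coordinate along that direction. The following sketch is a hypothetical NumPy-based illustration of such a directional CDF; it is not code from the patent:

```python
import numpy as np

def directional_cdf(img, theta):
    """1-D cumulative distribution of the intensity of a 2-D image,
    projected onto the in-plane direction given by angle theta."""
    h, w = img.shape
    yy, xx = np.mgrid[0:h, 0:w]
    # Signed coordinate of every pixel along the direction theta.
    s = (xx * np.cos(theta) + yy * np.sin(theta)).ravel()
    order = np.argsort(s, kind="stable")
    inten = img.ravel()[order]
    return s[order], np.cumsum(inten) / inten.sum()
```

Matching such CDFs across the two planes for a range of angles theta yields, per angle, the slope component along that direction; combining the components over the range of angles and integrating then gives the two-dimensional wave-front shape, in line with the procedure recited in claim 1.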
(15) As mentioned above, the relation between the slope(s) of the wave-front and the spatial displacement(s) of a photon or photon ray propagating perpendicular to the wave-front can be assumed to follow a linear relationship.
(16) The better the displacement(s) of ray positions or of measured local light intensity positions can be measured, the better the original wave-front shape or wave-front phase can be recovered.
(17) An estimate for the achievable wave-front resolution can be given by
(18)
wherein d represents the distance, i.e. the optical path difference, in m between two images on which the method presented here is carried out, and p is the pixel size in object space.
(19) From the exemplary estimate above it can further be seen that the achievable wave-front resolution can increase for increasing optical path difference, since a longer distance can magnify the shift(s) or displacement(s) for a given photon ray angle, e.g. the angle of propagation of the photon ray with respect to an optical axis.
(20) In the present exemplary case shown, the error for displacements or shifts of less than 0.5 pixel is rather small, e.g. less than 10%, implying that, for example, for an optical path difference of a few cm, wave-front resolutions down to the picometer regime can be obtained.
(21) For completeness we note that the minimum measurable angle or wave-front slope or wave-front phase slope can be estimated by atan(p/d), wherein d represents the distance, i.e. the optical path difference, in m between two images on which the method presented here is carried out, and p is the pixel size in the image plane or image sensor.
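As a brief numeric illustration of this estimate (the values of p and d below are assumptions chosen for illustration, not values from the patent):

```python
import math

p = 5e-6   # assumed pixel size in the image plane, in m
d = 5e-2   # assumed optical path difference, in m

# Minimum measurable wave-front slope, in rad; for small p/d this is
# essentially p/d itself, here about 1e-4 rad.
min_slope = math.atan(p / d)
```

As the paragraph above notes, increasing d (or decreasing p) lowers this minimum measurable slope and thereby refines the achievable wave-front resolution.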
(23) Furthermore, the exemplary optical system 300 comprises an optional optical element 304, e.g. a lens, that can, for example, focus the light rays or light ray bundles 309, 310 propagating from the exemplary objects 307, 308 to be observed onto an exemplary focal plane 303. Said exemplary objects 307, 308 can either be two distinct objects (as shown) located at different distances from the optical axis 315 or they can be different parts of a single object.
(24) As exemplarily shown, the light rays or light ray bundles 309, 310 can hit the exemplary optical planes 316, 317 at different locations on each plane 316, 317, i.e. at different locations of the measured or determined images 301, 303.
(25) As described above, the measured or determined images 301, 303 can be partitioned or divided or sectioned into different sections or regions, wherein the regions or sections can be overlapping or can be separated. For example, image 301 can be partitioned into two regions 311, 314 and image 302 can be partitioned into two regions 312, 313.
(26) Other partitioning schemes are conceivable too. The simple partitioning shown here is just illustrative. As exemplarily shown, the light rays of object 307 hit the image 301 at the region 311 of optical plane 316 and hit the image 302 at the region 312 of the optical plane 317, whereas the light rays of object 308 hit the image 301 at the region 314 of the optical plane 316 and hit the image 302 at the region 313 of the optical plane 317.
(27) Instead of applying the herein described method for two-dimensional reconstruction of wave fronts or wave-front phases over the entire size of the images 301, 302 or the entire size of an image sensor, e.g. a charge-coupled device (CCD), the method reconstructing the wave front(s) or wave-front phase(s) can be applied only on the sections or regions into which each image 301, 302 or measurement plane 316, 317 or image sensor (not shown) is divided.
(28) In other words, the wave-front phase is recovered not over the entire image 301, 302, but the wave-front phase(s) for each section 311, 314, 312, 313 or region of each image 301, 302 is/are recovered.
(29) To be more precise, the wave-front shapes or wave-front phases of sections in a plane located between corresponding sections 311, 314, 312, 313 or regions of the images 301, 302, i.e. between the optical planes 316, 317, can be recovered.
(30) Assuming, for example, an at least partially transparent object or target media volume 318, said at least partially transparent object or target media volume 318 can be modeled as a set of different discrete phase screens 305, 306, wherein a phase screen, as mentioned above, can be modeled by a matrix wherein the different matrix elements represent different phase change values for phase changes imparted to a wave-front propagating through said at least partially transparent object or target media volume 318 by different regions of the object or target media volume.
(31) By dividing the measured or determined images 301, 302 into a plurality of sections or regions 311, 314, 312, 313, said plurality of sections or regions can, for example, be understood as capturing a projection (or line integral) of the at least partially transparent object or target volume 318 at a certain angle.
(32) Hence a certain section or region 311, 314, 312, 313 of the measured or determined images 301, 302 can correspond to a projection (or line integral) of a phase screen 305, 306 at a certain angle.
(33) The partitioning of the measured or determined images 301, 302 into a plurality of sections or regions 311, 314, 312, 313 can then, as mentioned above, form the basis to define an equation system from which a plurality of phase screens 305, 306 can be computed or restored.
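The equation system mentioned above can be illustrated by a minimal least-squares sketch. The geometry, matrix entries and phase values below are invented for illustration and are not from the patent: each image section contributes one projection, i.e. a weighted sum of the phase values of the discrete phase screens 305, 306 crossed by the corresponding ray bundle, and stacking one equation per section gives a linear system A x = b that can be solved for the phase-screen values x.

```python
import numpy as np

# Hypothetical geometry: 2 phase screens with 3 phase cells each -> 6 unknowns,
# stacked into one vector (screen 1 cells, then screen 2 cells).
x_true = np.array([0.1, 0.3, 0.2, 0.0, 0.4, 0.1])

# Each row marks which screen cells one section's ray bundle passes through;
# oblique views cross different combinations of cells, which is what makes
# the system solvable for all screens.
A = np.array([
    [1, 0, 0, 1, 0, 0],
    [0, 1, 0, 0, 1, 0],
    [0, 0, 1, 0, 0, 1],
    [1, 1, 0, 0, 0, 1],
    [0, 1, 1, 1, 0, 0],
    [1, 0, 1, 0, 1, 0],
    [0, 0, 1, 1, 1, 0],
], dtype=float)

b = A @ x_true                                  # measured sectional projections
x_rec, *_ = np.linalg.lstsq(A, b, rcond=None)   # restored phase-screen values
```

In practice the projections b would come from the sectional wave-front phases recovered from the images 301, 302, and the rows of A from the known ray geometry between the sections 311, 314, 312, 313 and the phase screens 305, 306.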
(34) The set of computed or restored phase screens 305, 306 then can inter alia allow performing wave-front phase tomography of, for example at least partially transparent object(s) or target media 318 to be observed, for different parts of an object or different objects under different angles of view and/or different depths.
(35) List of reference signs used in the figures:
(36) 100 exemplary optical system
(37) 101 exemplary (first) image plane or (first) optical plane at a (first) optical path position having a (first) light intensity distribution
(38) 102 exemplary (second) image plane or (second) optical plane at a (second) optical path position having a (second) light intensity distribution
(39) 103 exemplary optical plane, between said first and second optical planes, wherein the wave front is to be reconstructed, for example, an aperture plane of the optical system
(40) 104 exemplary wave front to be reconstructed
(41) 104a exemplary local wave-front segment having a (first) local slope
(42) 104b exemplary local wave-front segment having a (second) local slope
(43) 104c exemplary local wave-front segment having a (third) local slope
(44) 105a exemplary photon propagation trajectory/photon propagation direction/local wave-front propagation direction
(45) 105b exemplary photon propagation trajectory/photon propagation direction/local wave-front propagation direction
(46) 105c exemplary photon propagation trajectory/photon propagation direction/local wave-front propagation direction
(47) 106a exemplary photon representing a local light intensity in the optical plane 102
(48) 106b exemplary photon representing a local light intensity in the optical plane 101
(49) 107a exemplary photon representing a local light intensity in the optical plane 102
(50) 107b exemplary photon representing a local light intensity in the optical plane 101
(51) 108a exemplary photon representing a local light intensity in the optical plane 102
(52) 108b exemplary photon representing a local light intensity in the optical plane 101
(53) 109 exemplary orientation of the optical system, with the z-axis being identical or parallel to the optical axis (not shown) of the optical system
(54) 200 exemplary plot of error of ray shift measurements
(55) 201 exemplary error curve
(56) 202 exemplary abscissa axis, e.g. x-axis, e.g. ray shift in pixels
(57) 203 exemplary ordinate axis, e.g. y-axis, e.g. error of ray shift measurement with normalized scale from 0 to 1
(58) 300 exemplary optical system
(59) 301 exemplary (first) image at a (first) optical path position
(60) 302 exemplary (second) image at a (second) optical path position
(61) 303 exemplary possible focal plane
(62) 304 exemplary optical element, e.g. lens, of optical system
(63) 305 exemplary (first) phase screen
(64) 306 exemplary (second) phase screen
(65) 307 exemplary (first) object to be observed
(66) 308 exemplary (second) object to be observed
(67) 309 exemplary light rays (light ray bundle) emanating from (first) object
(68) 310 exemplary light rays (light ray bundle) emanating from (second) object
(69) 311 exemplary (first) section or region of (first) image 301
(70) 312 exemplary (first) section or region of (second) image 302
(71) 313 exemplary (second) section or region of (second) image 302
(72) 314 exemplary (second) section or region of (first) image 301
(73) 315 exemplary optical axis
(74) 316 exemplary (first) optical plane or (first) measurement plane
(75) 317 exemplary (second) optical plane or (second) measurement plane
(76) 318 exemplary at least partially transparent target (media) volume