Method for recording gabor hologram
09599959 · 2017-03-21
Assignee
Inventors
CPC classification
G03H2001/005
PHYSICS
G03H1/0443
PHYSICS
G03H2001/0471
PHYSICS
G02B6/06
PHYSICS
G03H1/02
PHYSICS
International classification
G03H1/26
PHYSICS
G03H1/00
PHYSICS
G03H1/02
PHYSICS
Abstract
The present invention relates to a holographic probe device for recording a Gabor hologram, comprising: a coherent optical fiber bundle comprising a distal end and a proximal end; a recording medium optically coupled to the proximal end of the coherent optical fiber bundle; and a light source producing, in use, a single light beam illuminating the distal end of the coherent optical fiber bundle and the object to be observed.
Claims
1. A holographic probe device for recording a Gabor hologram comprising: a coherent optical fiber bundle comprising a distal end and a proximal end; a recording medium optically coupled to the proximal end of the coherent optical fiber bundle; a light source adapted to produce a single light beam, wherein the single light beam is adapted to illuminate the distal end of the coherent optical fiber bundle and an object to be observed.
2. The holographic probe device according to claim 1, wherein the light source is selected from the group consisting of LED, laser, and gas discharge tubes.
3. The holographic probe device according to claim 1, wherein the light source comprises an optical fiber to guide the single light beam.
4. The holographic probe device according to claim 1, wherein the recording medium is a CMOS or a CCD image sensor.
5. The holographic probe device according to claim 1, wherein the recording medium is optically coupled to the proximal end of the coherent optical fiber bundle by either direct contact or by means of a lens; and wherein an image is formed on the recording medium by the proximal end of the coherent optical fiber bundle.
6. An optical device comprising: a first and second holographic probe device, each holographic probe device according to claim 1, wherein the first holographic probe device is situated at a first angle relative to an object to be observed and the second holographic probe device is situated at a second angle relative to the object to be observed.
7. The optical device according to claim 6, wherein the distal end of the coherent optical fiber bundle of the first holographic probe device is perpendicular to the distal end of the coherent optical fiber bundle of the second holographic probe device.
8. The optical device according to claim 6, wherein the first holographic probe device is parallel with the second holographic probe device; and wherein the distal end of the coherent optical fiber bundle of the first holographic probe device is aligned with the proximal end of the coherent optical fiber bundle of the second holographic probe device.
9. The optical device according to claim 8, wherein the proximal end of the coherent optical fiber bundle of the first holographic probe device is optically coupled to the light source of a recording device of the second holographic probe device; and the proximal end of the coherent optical fiber bundle of the second holographic probe device is optically coupled to the light source of a recording device of the first holographic probe device.
10. The optical device according to claim 6, wherein no lens is situated between the first holographic probe device and the object to be observed; and wherein a lens is situated between the second holographic probe device and the object to be observed.
11. A method for recording a Gabor hologram comprising the steps of: providing a coherent optical fiber bundle comprising a distal end and a proximal end, the proximal end being optically coupled to a recording medium; lighting the distal end of the coherent optical fiber bundle and an object to be observed by means of a single at least partially coherent light beam, thereby producing a Gabor hologram on the distal end of the coherent bundle; recording the Gabor hologram transmitted through the coherent optical fiber bundle on the recording medium.
12. The method according to claim 11 wherein more than one Gabor hologram is recorded by using more than one light beam, each light beam illuminating a separate coherent optical fiber bundle.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
FIGURE KEYS
(10)
1, 101, 201: front recording plane (distal end of the fibre bundle)
2: object
3: diffracted light
4: interference
5: incoming light beam
6, 106, 206: coherent optical fibre bundle (endoscope or light conduit)
7, 107, 207: recording medium
8, 108, 208: light source
9: focal distance
10: light source lens
11: L: distance from distal end of the fibre bundle to the light source
12: recording lens
13: holographic probe device
114, 214: beam splitter
s1-3: point-like sources having specific wavelength bandwidths
f: focal length of the collimating lens
Dp1-3: extension of the diffraction patterns created, respectively, by the sources s1-3
DETAILED DESCRIPTION OF THE INVENTION
(11) The basic concept consists in using a coherent optical fiber bundle or endoscope 6, also called an image conduit, to record an in-line digital hologram. A coherent optical fiber bundle 6 is constituted by an organized bundle of optical fibres, arranged in such a way that the image formed on one side of the bundle (input plane 1 at the distal end of the bundle) is transmitted to the other side of the fibre bundle (output plane at the proximal end of the bundle), where it is available for observation and is recorded by a 2D electronic sensor 7 (camera).
(12) As the light intensity distribution at the input plane 1 of the image conduit is transmitted up to the sensor 7, with some resolution constraints, we will consider that this light distribution is also the recorded one.
(13) The initial configuration to record an in-line lens-free hologram with a fibre endoscope 6 or image conduit is shown by the
(14) The object 2 (for example a 3D distribution of particles) is illuminated by a directional optical beam 5 that can be generated by a laser or by a Light Emitting Diode illumination. The object 2 diffracts the illumination beam 5 in such a way that the diffracted pattern interferes with the un-diffracted illumination beam at the input window 1 of the fibre endoscope 6.
(15) The interference pattern, which results in a spatial intensity distribution, is transmitted to the output plane, where it is detected by a two-dimensional sensor 7 that records the intensity distribution. This recorded light intensity distribution can be handled by the usual processing of in-line holography for 3D imaging. Note that a miniature optical system such as a lens can also be placed in front of the input plane of the fibre endoscope to adjust the magnification of the experimental volume. But, in that case, in order to observe interference fringes, the object 2 should be out of focus with regard to the input plane 1.
(16) It has to be observed that the coherent fibre bundle 6 makes it possible to record the intensity distribution in a very simple way at locations that are difficult to reach with other optical configurations. In this way, it is possible to directly place the input window 1 of the fibre endoscope 6 in contact with a fluid in a microfluidic device, which is complex to perform directly with an image sensor.
(17) The coherent fibre bundle 6 or image conduit has a typical window width of a few millimeters, with individual fibres of a few micrometers. Image conduits with typically 50,000 individual optical fibres are commercially available. That gives a limited resolution, but it is a technology under development that can be improved.
(18) The illumination can be obtained by collimated or diverging beams 5 as shown respectively by
(19) For spatially incoherent sources, the spatial coherence characteristics are expressed by conditions on the diameter s of the source 8, as shown by the
(20) Similarly, for a diverging beam, we have s=Le/d, where L is the distance 11 between the source 8 and the recording plane 1. In practical situations, the typical size of s is a few hundred micrometers. Therefore, depending on the depth of reconstruction, it is not necessary to have full spatial coherence of the illumination, although fully coherent illumination can be used.
(21) The constraints on the temporal coherence are estimated by considering a scattering object 2 located at a distance d from the recording plane 1. Assuming that the numerical aperture of the fibres is NA, it can be shown that the maximum optical path difference is p = NA^2·d/8, which has to be smaller than the coherence length expressed by λ^2/Δλ. It results that Δλ < 8λ^2/(NA^2·d), which typically corresponds to about 10 nm. This spectral width is compatible with that of light emitting diodes, possibly filtered by an interference filter.
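As a numerical illustration of this temporal coherence constraint, the following sketch evaluates Δλ < 8λ^2/(NA^2·d). The wavelength, numerical aperture and distance values are illustrative assumptions, not values specified in the text.

```python
import math

def max_spectral_width(wavelength_m, na, distance_m):
    """Largest source spectral width (delta-lambda) such that the maximum
    optical path difference p = NA^2 * d / 8 stays below the coherence
    length lambda^2 / delta-lambda."""
    return 8.0 * wavelength_m**2 / (na**2 * distance_m)

# Illustrative values (assumptions): 630 nm source, NA = 0.3, d = 5 mm
dl = max_spectral_width(630e-9, 0.3, 5e-3)
print(f"maximum spectral width: {dl * 1e9:.1f} nm")
```

With these assumed values the result is on the order of 10 nm, consistent with the LED-plus-interference-filter illumination discussed above.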
(22) It results that a light emitting diode (LED) filtered by a small aperture is convenient. It is also possible to work with a laser beam, which can be of reduced temporal coherence. Advantageously, a compact illumination scheme consists in illuminating the sample with an optical fibre optically coupled to a laser or an LED.
(23) The light distribution transmitted by the image conduit 6 reaches the output plane, where it is recorded by the sensor 7. This coupling can be realized by placing the output plane very close to the sensor or by adding an imaging lens 12 as shown in
(24) A significant advantage of the invention is its compactness, such that several devices can advantageously be implemented to perform multiple measurements of the same experimental volume. In the following examples, we will call each system as described here above an elementary device 13, a holographic probe device 13, or simply a probe 13. There are several interesting configurations to combine elementary devices 13.
(25) When more than one elementary device 13 is used, the light originating from the diffusion on the object 2 of the light from one elementary device 13 can be filtered out of the illumination of the other elementary device 13 by an appropriate combination of polarizers and/or wave plates, by the use of different wavelengths for the different elementary devices combined with barrier filters placed in front of the sensors in order to keep only the right wavelength range on each sensor, or by using temporally separated light pulses.
(26) The
(27) This configuration with two elementary devices 13 has a high potential in the field of applications for 3D measurements such as 3D velocimetry. Indeed, when digital holography is used to perform 3D velocimetry, there is an issue with the resolution along the optical axis, which is lower than the resolution in the transversal directions. By combining the holograms in both directions, the proposed configuration overcomes this limitation by providing the highest resolution in every direction. Due to the compactness of the elementary devices 13, it is possible to increase the number of elementary devices to increase the accuracy of the measurements.
(28) Advantageously, the two or more fibre endoscopes 6 could be connected to a single camera.
(29) Several elementary devices 13 can be placed side by side in order to increase the field of view and the resolution by an effective increase of the numerical aperture. The individual recorded images are combined into a single larger hologram for the processing. The recording configuration is shown by
(30) The image conduits 6 can be connected to the same sensor or to different ones.
(31) Thanks to the compactness of the elementary devices 13, it is also possible to observe an experimental volume with two opposite elementary devices as shown by the
(32) In the opposite elementary devices configuration, two in-line holograms of the sample are recorded from opposite directions with two elementary devices. The illumination of the hologram recorded by the first elementary device (comprising the elements 101,107,112,114,106,108 and 110 in
(33) A first hologram of the object reaches the recording plane 101 of the image conduit 106 and is transmitted up to the sensor 107. Alternatively, other illumination schemes can be achieved, by adding, for example, external illumination by optical fibers.
(34) The symmetric configuration allows a second hologram to be recorded by the sensor 207. If necessary, the back reflection of the illumination of the object 2 by the second (first) light source 208 (108) onto the input plane 101 (201) of the first (second) image conduit 106 (206), transmitted towards the sensor 107 (207), can be eliminated by an appropriate combination of polarizers and/or wave plates, or by the use of different wavelengths for the first and second light sources 108, 208 combined with barrier filters placed in front of the sensors 107, 207 in order to keep only the right wavelength range on each sensor 107, 207.
(35) The configuration of
(36) Consider that an object of complex transmittance t is placed at a distance d1 from the recording plane 101 of the first image conduit. The two recording planes 101, 201 of the image conduits are separated by a distance d. For the sake of simplicity, we assume the wavelengths of both light sources 108, 208 are identical. The generalization to two different wavelengths is straightforward, as shown in the following.
(37) It can be shown that the amplitude distributions i1 and i2 on, respectively, the input planes of the image conduits 106, 206, are expressed by:
(38)
(39) Where we use the operator algebra defined by Joseph Shamir, in Optical Systems and Processes, SPIE PRESS 1999, and we omitted to indicate the spatial dependency to simplify the reading. The superscript * denotes the complex conjugate operation.
(40) We have:
(41) The quadratic phase factor is defined by
(42) Q[b]g(x,y) = exp[jkb(x^2+y^2)/2]·g(x,y)
where k is the wave number defined by k = 2π/λ and j = √(−1). V[b] is the scaling operator defined by V[b]g(x,y) = g(bx,by).
(43)
where F and F^−1 are the direct and inverse Fourier transform operations, F being defined by:
(44) Fg(x,y) = ∫∫ g(x,y)·exp[−2πj(ux+vy)]·dx·dy
By applying to i1 and i2, respectively, the scaling operators V[d1/d] and V[d2/d], we obtain:
(45)
where U′ = V[1/d]U. By decomposing the real and imaginary parts of Eq. (4), we obtain a system of equations that can be solved by an adequate choice of d1 and d2. Therefore U′ is determined, and we can go back to the real and imaginary parts of t by a scaling, a Fourier transformation operation and a multiplication by a quadratic phase factor. The optical phase of t results by computing φ = arctan2(timaginary, treal).
(46) The holographic recording with an image conduit can advantageously be performed at several discrete wavelengths, and in particular in the red, green and blue wavelengths (or cyan, magenta and yellow, or any suitable wavelength combination). We assume that an object separated by a distance d with respect to the recording plane is illuminated by a two-wavelength collimated beam. The illumination by a diverging beam slightly changes the following derivation but does not modify the principle.
(47) For the sake of simplicity, we assume that the object is illuminated by a red beam (630 nm) and a green beam (530 nm).
(48) The two-wavelength light distributions are recorded separately by the sensor. By using the in-line holography expression, we obtain the two following holographic signals:
i1 = u1 + u1*
i2 = u2 + u2*  (5)
where u1 and u2 are the complex amplitudes of the object t propagated from the object plane up to the recording plane by the Kirchhoff-Fresnel propagation at the wavelengths λ1 and λ2.
(49) Using the previous notations, they can be expressed by:
un = F^−1·Q[−λn^2·d]·F·t  (6)
where n = 1, 2.
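The propagation in Eq. (6), a quadratic phase factor applied in the Fourier domain, can be sketched numerically with FFTs. Assuming the standard operator convention Q[b] = exp[jkb(x^2+y^2)/2], the factor applied at spatial frequencies (u,v) for wavelength λ and distance d is exp(−jπλd(u^2+v^2)); the grid size, object and pixel pitch below are illustrative assumptions.

```python
import numpy as np

def propagate(t, wavelength, d, pitch):
    """Apply u = F^-1 Q[-lambda^2 d] F t: Fresnel propagation of the
    complex field t over a distance d, with the quadratic phase factor
    evaluated in the spatial-frequency domain."""
    n = t.shape[0]
    f = np.fft.fftfreq(n, d=pitch)          # spatial frequencies (1/m)
    uu, vv = np.meshgrid(f, f)
    q = np.exp(-1j * np.pi * wavelength * d * (uu**2 + vv**2))
    return np.fft.ifft2(q * np.fft.fft2(t))

# Illustrative use: propagate forward, then backward to refocus
t = np.ones((64, 64), dtype=complex)
t[28:36, 28:36] = 0.2                        # simple absorbing object
u = propagate(t, 630e-9, 5e-3, 5e-6)         # hologram-plane field
t_back = propagate(u, 630e-9, -5e-3, 5e-6)   # numerical refocusing
```

Back-propagation with a negative distance is the usual digital holographic refocusing step mentioned in paragraph (15).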
(50) The diffraction pattern with the wavelength λ2 can alternatively be seen as obtained at the wavelength λ1 with a distance d′ = dλ2/λ1. Therefore, the holographic signals of Eq. (5) can be considered as obtained for a single wavelength λ1 but for two propagation distances d and d′. Using this fact and Eq. (6), Eq. (5) is rewritten as:
i1 = F^−1·Q[−λ1^2·d]·F·t + F^−1·Q[λ1^2·d]·F·t*
i2 = F^−1·Q[−λ1^2·d′]·F·t + F^−1·Q[λ1^2·d′]·F·t*  (7)
(51) We separate t into its real and imaginary parts, t = tr + j·tj. We compute the Fourier transformation of the two Eqs. (7) to obtain:
I1 = Q[−λ1^2·d](Tr + jTj) + Q[λ1^2·d](Tr − jTj)
I2 = Q[−λ1^2·d′](Tr + jTj) + Q[λ1^2·d′](Tr − jTj)  (8)
where Tr and Tj are the Fourier transformations of tr and tj. The Eqs. (8) constitute a set of two equations with the unknowns Tr and Tj that are easy to solve provided that q = sin{πλ1(u^2+v^2)(d−d′)} ≠ 0, where (u,v) are the spatial frequencies. The quantity q = 0 when (u,v) = (0,0) and when πλ1(u^2+v^2)(d−d′) = mπ. The first singularity corresponds to a constant value over the field of view, which is not significant, and the second singularity can be removed by using a three-wavelength illumination.
(52) Therefore, by computing the inverse Fourier transformation of T = Tr + jTj, we obtain the complex amplitude t that provides the optical phase of the object.
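At each spatial frequency, Eq. (8) reduces to a 2×2 linear system for Tr and Tj. The sketch below solves it explicitly under the assumed quadratic-phase convention, in which each hologram spectrum takes the form I = 2·Tr·cos(θ) + 2·Tj·sin(θ) with θ = πλd(u^2+v^2); the wavelengths, distance and sampling are illustrative assumptions, and the singular frequencies where q vanishes are simply masked out rather than handled by a third wavelength.

```python
import numpy as np

def solve_two_wavelengths(I1, I2, lam1, lam2, d, pitch):
    """Recover T_r and T_j from two hologram spectra I1, I2 (Eq. 8 form),
    seen as single-wavelength recordings at distances d and d' = d*lam2/lam1.
    At (u, v), with theta = pi*lam*d*(u^2+v^2):
        I = 2*T_r*cos(theta) + 2*T_j*sin(theta)."""
    n = I1.shape[0]
    f = np.fft.fftfreq(n, d=pitch)
    uu, vv = np.meshgrid(f, f)
    rho2 = uu**2 + vv**2
    th1 = np.pi * lam1 * d * rho2        # distance d at lam1
    th2 = np.pi * lam2 * d * rho2        # equivalent distance d' at lam1
    det = 2.0 * np.sin(th2 - th1)        # proportional to the quantity q
    ok = np.abs(det) > 1e-3              # mask the singular frequencies
    Tr, Tj = np.zeros_like(I1), np.zeros_like(I1)
    Tr[ok] = (I1[ok] * np.sin(th2[ok]) - I2[ok] * np.sin(th1[ok])) / det[ok]
    Tj[ok] = (I2[ok] * np.cos(th1[ok]) - I1[ok] * np.cos(th2[ok])) / det[ok]
    return Tr, Tj, ok

# Synthetic check (illustrative parameters): build Eq.-(8)-type spectra from
# a known object spectrum, then recover it.
n, pitch, lam1, lam2, d = 32, 5e-6, 630e-9, 530e-9, 5e-3
rng = np.random.default_rng(0)
Tr0, Tj0 = rng.normal(size=(n, n)), rng.normal(size=(n, n))
f = np.fft.fftfreq(n, d=pitch)
uu, vv = np.meshgrid(f, f)
th1 = np.pi * lam1 * d * (uu**2 + vv**2)
th2 = np.pi * lam2 * d * (uu**2 + vv**2)
I1 = 2 * Tr0 * np.cos(th1) + 2 * Tj0 * np.sin(th1)
I2 = 2 * Tr0 * np.cos(th2) + 2 * Tj0 * np.sin(th2)
Tr, Tj, ok = solve_two_wavelengths(I1, I2, lam1, lam2, d, pitch)
```

The mask makes visible where a third wavelength would be needed: the DC term and the frequencies where the determinant q vanishes.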
(53) The multidirectional measurement of the optical phase of a sample provides a tool for the 3D measurement of the optical phase information. This is called optical tomography. The multi-wavelength approach to perform multiple measurements of a sample can enable such measurements.
(54) Spatially Separated Illumination
(55) Advantageously, several optical sources having different wavelengths illuminate the sample from different directions. This is illustrated by the
(56) In the
(57) The several collimated beams illuminate the object, placed between the lens and the input plane of the image conduit. Each illumination beam, diffracted by the object, gives rise to spatially shifted diffraction patterns at the input plane of the image conduit. The wavelength bandwidths of the different sources are selected in such a way that it is possible to separate the contributions when they are detected by the sensor. As there are shifts between the recorded diffraction patterns, the digital holographic refocusing provides spatially separated reconstructed images of the object. The separation amounts are directly proportional to the distance between the object and the sensor. It results that the separations can be used to measure the distance between the object and the sensor. This measurement capability improves the accuracy beyond that obtained by digital holographic refocusing alone.
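The proportionality between separation and distance can be sketched under a simple small-angle geometry, which is an assumption of this illustration rather than a relation stated in the text: a beam tilted by an angle θ shifts the reconstructed image laterally by z·tan(θ), so two beams at known angles give the object-to-sensor distance z from the measured separation. The angles and separation below are illustrative.

```python
import math

def object_distance(separation_m, theta1_rad, theta2_rad):
    """Object-to-sensor distance from the measured lateral separation of
    two reconstructed images, assuming each tilted illumination shifts
    its reconstruction by z * tan(theta)."""
    return separation_m / abs(math.tan(theta1_rad) - math.tan(theta2_rad))

# Illustrative: 120 um measured separation, sources at +3 and -3 degrees
z = object_distance(120e-6, math.radians(3.0), math.radians(-3.0))
print(f"estimated object distance: {z * 1e3:.2f} mm")
```

In practice this estimate would be combined with the digital holographic refocusing distance, as discussed above, to improve the axial accuracy.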
(58) This type of multisource illumination can also be used in the case of the spherical illumination. It can also be coupled with the multi-image-conduit configurations described above.