Process and apparatus for the capture of plenoptic images between arbitrary planes

11523097 · 2022-12-06

Abstract

A process and an apparatus for the plenoptic capture of photographic or cinematographic images of an object or a 3D scene (10) of interest are based on a correlated light emitting source and correlation measurement, along the lines of "Correlation Plenoptic Imaging" (CPI). A first image sensor (Da) and a second image sensor (Db) detect images along a path of a first light beam (a) and a second light beam (b), respectively. A processing unit (100), which processes the intensities detected by the synchronized image sensors (Da, Db), is configured to retrieve the propagation direction of light by measuring spatio-temporal correlations between light intensities detected in the image planes of at least two arbitrary planes (P′, P″; D′_b, D″_a) chosen in the vicinity of the object or within the 3D scene (10).

Claims

1. A process for the plenoptic capture of photographic, cinematographic, microscopic or stereoscopic images of an object or a 3D scene of interest, comprising the steps of: providing a light emitting source; generating a first light beam (a) and a second light beam (b) coming from said light emitting source; directing said first light beam (a) towards a first image sensor (Da) and said second light beam (b) towards a second image sensor (Db), said second light beam (b) being adapted to be either reflected by said object or 3D scene (10) or transmitted through said object or 3D scene (10); retrieving, via the second image sensor, a focused image of a first arbitrary plane chosen in a vicinity of the object or within the 3D scene (10), and retrieving a non-focused image of a second arbitrary plane chosen in the vicinity of the object or within the 3D scene (10), wherein the first arbitrary plane and the second arbitrary plane are planes other than a focusing element plane or a light source plane; retrieving, via the first image sensor, one of a focused image or a ghost image of the second arbitrary plane; and retrieving the propagation direction of light by measuring spatio-temporal correlations between the light intensities detected by said first and second image sensors (Da, Db) in the image planes of said first and second arbitrary planes (P′, P″; D′_b, D″_a).

2. The process according to claim 1, wherein said light emitting source is selected from: an object or 3D scene (10) which reflects or transmits illuminating chaotic light, and an object or 3D scene (10) which itself emits chaotic light.

3. The process according to claim 1, wherein the information about the propagation direction of light is obtained by measuring the correlation between the intensity fluctuations retrieved at points ρ_a and ρ_b on said image sensors (D_a, D_b) according to the correlation function

$$G(\rho_a,\rho_b)=\langle \Delta I_a(\rho_a)\,\Delta I_b(\rho_b)\rangle$$

where i = a, b and ΔI_i(ρ_i) = I_i(ρ_i) − ⟨I_i(ρ_i)⟩ are the intensity fluctuations at points ρ_i on the image sensor D_i, the symbol ⟨…⟩ denoting the statistical average over the emitted light.

4. The process according to claim 1, wherein the depth of field (DOF) of the original image is enhanced by reconstructing the direction of light between said two arbitrary planes (P′, P″; D′_b, D″_a) and by retracing the light paths to obtain refocused images.

5. The process according to claim 4, wherein the image of the object with enhanced depth of field or the 3D image of the scene is reconstructed by stacking said refocused images.

6. The process according to claim 1, wherein a primary beam (5) of chaotic light coming from said object or 3D scene (10) is collected by a main lens (L_f), and wherein said first light beam (a) and said second light beam (b) are generated by splitting said primary light beam (5) by means of a splitting element (20).

7. The process according to claim 6, wherein said first image sensor (D_a) is placed at a distance (z′_a) from the back principal plane of said main lens (L_f) and retrieves the focused image of a first plane (P′) at a distance (z_a) from the front principal plane of said main lens (L_f), and wherein said second image sensor (D_b) is placed at a distance (z′_b) from the back principal plane of said main lens (L_f) and retrieves the focused image of a second plane (P″) at a distance (z_b) from the front principal plane of said main lens (L_f).

8. The process according to claim 1, wherein said first light beam (a) and said second light beam (b) are quantum-entangled beams generated by a source (30) of entangled photons or beams.

9. The process according to claim 8, wherein two identical lenses (L_2) of focal length f_2 are placed in such a way that their back principal planes are at distances (z′_a) and (z′_b) from the image sensors (D_a) and (D_b), respectively, and define two conjugate planes (D′_a) and (D′_b) at distances z_a = (1/f_2 − 1/z′_a)⁻¹ and z_b = (1/f_2 − 1/z′_b)⁻¹, respectively, from the front principal planes of said lenses (L_2).

10. The process according to claim 8, wherein an additional lens (L_1) with focal length (f_1) collects the two correlated beams emitted by said entangled photon or beam source (30).

11. The process according to claim 8, wherein said object or 3D scene (10) is placed in the optical path of one of said light beams (a, b).

12. The process according to claim 8, wherein a plane (D″_a) parallel to said planes (D′_a, D′_b) is defined along the optical path of said second light beam (b) at a distance (z_a) from the relevant lens (L_2), and wherein a "ghost image" of the plane (D″_a) is reproduced in the plane (D′_a) when correlation or coincidence measurements are performed between said image sensors (D_a) and (D_b).

13. An apparatus for the plenoptic capture of photographic, cinematographic, microscopic or stereoscopic images of an object or a 3D scene (10) of interest, comprising: a first image sensor (D_a) to detect images coming from said object or 3D scene (10) along a path of a first light beam (a); a second image sensor (D_b) to detect images coming from said object or 3D scene (10) along a path of a second light beam (b); a processing unit (100) of the intensities detected by said image sensors (D_a, D_b); wherein said processing unit is configured to: retrieve, via the second image sensor, a focused image of a first arbitrary plane chosen in a vicinity of the object or within the 3D scene, and retrieve a non-focused image of a second arbitrary plane chosen in the vicinity of the object or within the 3D scene, wherein the first arbitrary plane and the second arbitrary plane are planes other than a focusing element plane or a light source plane; retrieve, via the first image sensor, one of a focused image or a ghost image of the second arbitrary plane; and retrieve the propagation direction of light by measuring spatio-temporal correlations between the light intensities detected by said first and second image sensors (D_a, D_b) in the image planes of the first and second arbitrary planes (P′, P″; D′_b, D″_a).

14. The apparatus according to claim 13, further including a main lens (L_f), wherein the focused images of said at least two arbitrary planes (P′, P″) are retrieved at different distances (z_a) and (z_b) from the front principal plane of said main lens (L_f).

15. The apparatus according to claim 13, wherein said first image sensor (D_a) is placed at a distance (z′_a) from the back principal plane of said main lens (L_f) and said second image sensor (D_b) is placed at a distance (z′_b) from the back principal plane of said main lens (L_f).

16. The apparatus according to claim 13, further including a splitting element (20) placed between said main lens (L_f) and said image sensors (D_a, D_b) so as to generate said first light beam (a) and said second light beam (b) from a primary light beam (5) coming from said object or 3D scene (10) through said main lens (L_f).

17. The apparatus according to claim 13, further including a light emitting source, wherein said light emitting source is either an object or 3D scene (10) which reflects or transmits the light from a chaotic source, or an object or 3D scene (10) which itself emits the chaotic light.

18. The apparatus according to claim 13, which is configured to detect said first light beam (a) and said second light beam (b) from a light source (30) of entangled photons or beams.

19. The apparatus according to claim 13, wherein two identical lenses (L_2) of focal length (f_2) are placed in such a way that their back principal planes are at distances (z′_a) and (z′_b) from the image sensors (D_a) and (D_b), respectively, along the respective paths of said first light beam (a) and said second light beam (b), and define two conjugate planes (D′_a) and (D′_b) at distances z_a = (1/f_2 − 1/z′_a)⁻¹ and z_b = (1/f_2 − 1/z′_b)⁻¹, respectively, from the front principal planes of said lenses (L_2).

20. The apparatus according to claim 13, wherein the focused images of said at least two arbitrary planes (D′_b, D″_a) are retrieved at different distances (z′_a) and (z′_b) from the back principal planes of said lenses (L_2).

21. The apparatus according to claim 13, further including an additional lens (L_1) with focal length (f_1) placed between said entangled photon or beam source (30) and said two identical lenses (L_2).

22. The apparatus according to claim 13, wherein said first image sensor (D_a) and said second image sensor (D_b) are distinct, synchronized image sensor devices.

23. The apparatus according to claim 13, wherein said first image sensor and said second image sensor are two disjoint parts of the same image sensor device.

Description

BRIEF DESCRIPTION OF THE DRAWINGS

(1) Further characteristics and advantages of the present invention will become more evident from the following description, given for illustrative purposes with reference to the attached figures, in which:

(2) FIG. 1 is a schematic view of an apparatus according to an embodiment of the present invention;

(3) FIGS. 2A, 2B and 2C are comparison images to show the depth of field enhancement obtained with a process according to the present invention;

(4) FIG. 3 is a schematic view of an apparatus according to another embodiment of the present invention; and

(5) FIGS. 4A, 4B and 4C are comparison images to show the refocusing obtained with a process according to the present invention.

DETAILED DESCRIPTION

(6) In the schematic view of the apparatus shown in FIG. 1, the object 10 is considered as an emitter of chaotic light of wavelength λ.

(7) A primary light beam 5 coming from the object 10 is collected by a lens L_f, supposed to be a thin lens in the figure, with positive focal length f and diameter D; the latter can also be the diameter of any other entrance pupil. After the lens L_f, a beam splitter 20 separates the collected light into two beams, denoted a and b, impinging on the image sensors D_a and D_b, respectively. A processing unit 100 is connected to the image sensors D_a, D_b to process the intensities detected by the synchronized image sensors.

(8) The image sensors D_a and D_b, placed at distances z′_a and z′_b, respectively, from the thin lens L_f (on the image side), retrieve the focused images of the planes P′ and P″ at distances z_a and z_b from the thin lens L_f (on the object side), according to the thin-lens equation

$$\frac{1}{z_j}+\frac{1}{z'_j}=\frac{1}{f},$$

with magnification

$$M_j=-\frac{z'_j}{z_j},$$

with image resolution Δx_j on the two planes

$$\Delta x_j = 0.61\,\frac{\lambda z_j}{D},$$

and with depth of field Δz_j

$$\Delta z_j = 1.22\,\lambda\left(\frac{z_j}{D}\right)^{2},$$

where j = a, b in all the equations above.
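By way of illustration only (this sketch and its numerical values are assumptions of the present description, not part of the patented apparatus), the four relations of paragraph (8) can be evaluated for one detector arm as follows:

```python
# Illustrative sketch of the thin-lens relations of paragraph (8).
# All lengths are in millimetres; the wavelength and sensor distance
# below are assumed values chosen for the example.

def arm_geometry(z_prime, f, wavelength, D):
    """For a sensor at distance z_prime behind a thin lens of focal length f
    and aperture diameter D, return the conjugate object distance z_j, the
    magnification M_j, the resolution Dx_j and the natural depth of field Dz_j."""
    z = 1.0 / (1.0 / f - 1.0 / z_prime)        # from 1/z_j + 1/z'_j = 1/f
    M = -z_prime / z                           # M_j = -z'_j / z_j
    dx = 0.61 * wavelength * z / D             # Delta x_j = 0.61 lambda z_j / D
    dz = 1.22 * wavelength * (z / D) ** 2      # Delta z_j = 1.22 lambda (z_j/D)^2
    return z, M, dx, dz

f = 28.84                    # focal length taken from paragraph (25)
D = 2 * f * 0.1              # aperture diameter implied by NA = 0.1 (approximation)
lam = 532e-6                 # assumed wavelength: 532 nm, expressed in mm
print(arm_geometry(z_prime=150.0, f=f, wavelength=lam, D=D))
```

Running the example with an assumed sensor distance z′ = 150 mm places the conjugate plane near z ≈ 35.7 mm, of the same order as the object distances used in paragraph (25).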

(9) In the cases of interest, in which the distance between the two planes P′ and P″ is larger than this natural depth of field, the intensities at the two image sensors D_a and D_b do not encode any relevant information about the volume enclosed by the two planes. To perform "Correlation Plenoptic Imaging between Arbitrary Planes" (CPI-AP), the correlation is measured between the intensity fluctuations retrieved at points ρ_a and ρ_b on the two synchronized image sensors D_a and D_b, respectively; the correlation function is thus

$$G(\rho_a,\rho_b)=\langle \Delta I_a(\rho_a)\,\Delta I_b(\rho_b)\rangle \tag{1}$$

where i = a, b and ΔI_i(ρ_i) = I_i(ρ_i) − ⟨I_i(ρ_i)⟩ are the intensity fluctuations at points ρ_i on the image sensor D_i, and the symbol ⟨…⟩ denotes the statistical average over the emitted light. For light emitted by a stationary and ergodic source, the statistical average can be replaced by a time average over a sequence of frames.
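For a stationary, ergodic source, the replacement of the statistical average with a time average suggests the following minimal estimator of Equation (1) (a sketch under the stated assumptions; the array shapes and function name are illustrative, not taken from the patent):

```python
import numpy as np

def correlation_function(frames_a, frames_b):
    """Estimate G(rho_a, rho_b) of Eq. (1) from N synchronized frame pairs.
    frames_a: (N, Ha, Wa) intensities from D_a; frames_b: (N, Hb, Wb) from D_b.
    Returns G as a (Ha*Wa, Hb*Wb) array of fluctuation correlations."""
    N = frames_a.shape[0]
    dIa = (frames_a - frames_a.mean(axis=0)).reshape(N, -1)  # Delta I_a
    dIb = (frames_b - frames_b.mean(axis=0)).reshape(N, -1)  # Delta I_b
    return dIa.T @ dIb / N     # time average of Delta I_a * Delta I_b
```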

(10) By propagating the electromagnetic field in the setup described above, and denoting by A(ρ₀) the aperture function of the object, placed at a distance z from the front principal plane of the lens L_f, and by P(ρ_l) the lens pupil function, one gets

(11)
$$G(\rho_a,\rho_b)=\left|\int d^2\rho_0\,|A(\rho_0)|^2\,p_a^{*}(\rho_0,\rho_a)\,p_b(\rho_0,\rho_b)\right|^{2}\tag{2}$$

where

$$p_j(\rho_0,\rho_j)=\int d^2\rho_l\,P(\rho_l)\exp\left\{\frac{2\pi i}{\lambda}\left[\left(\frac{1}{z}-\frac{1}{z_j}\right)\frac{\rho_l^{2}}{2}-\left(\frac{\rho_0}{z}-\frac{\rho_j}{M_j z_j}\right)\cdot\rho_l\right]\right\}\tag{3}$$

with j = a, b, where p_a* is the complex conjugate of p_a, ρ₀ is the coordinate on the object plane and ρ_l the coordinate on the lens plane. By integrating G(ρ_a, ρ_b) over one of the two image sensor coordinates, say ρ_a (or ρ_b), one obtains the focused incoherent image of the plane placed at a distance z_b (or z_a) from the front principal plane of the lens L_f. In particular, if the object is placed exactly at z = z_b (or z = z_a), this integral gives an image of the object (known as a "ghost image") characterized by the same resolution, depth of field and magnification as the image directly retrieved by D_b (or D_a) through intensity measurements. In other words, the "ghost image" has the same characteristics as the standard image.

(12) If the object is placed at z ≠ z_b (or z ≠ z_a), this integral gives an out-of-focus image of the object.
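A short illustration of the marginalization described in paragraphs (11) and (12) (an assumed helper, operating on the G array produced by the previous sketch):

```python
import numpy as np

def marginal_images(G, shape_a, shape_b):
    """Sum G over one sensor coordinate to obtain the incoherent image of the
    plane conjugate to the other sensor: a focused (or ghost) image if the
    object lies in that plane, an out-of-focus image otherwise."""
    image_b = G.sum(axis=0).reshape(shape_b)   # integrate over rho_a
    image_a = G.sum(axis=1).reshape(shape_a)   # integrate over rho_b
    return image_a, image_b
```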

(13) However, the dependence of G(ρ_a, ρ_b) on both planar coordinates ρ_a and ρ_b is much more informative: it makes it possible to reconstruct the direction of light between the two planes, and beyond them, and thus to refocus the object aperture function independently of its location within the setup. This can easily be seen in the geometrical optics limit (λ → 0), where the effects of diffraction are negligible:

(14)
$$G(\rho_a,\rho_b)\approx C\left|A\!\left(\frac{1}{z_b-z_a}\left(\frac{z-z_a}{M_b}\,\rho_b-\frac{z-z_b}{M_a}\,\rho_a\right)\right)\right|^{4}\left|P\!\left(\frac{1}{z_b-z_a}\left(\frac{z_b}{M_a}\,\rho_a-\frac{z_a}{M_b}\,\rho_b\right)\right)\right|^{4}\tag{4}$$

where C is an irrelevant constant. A proper parametrization of the correlation function can be implemented to decouple the object aperture function A from the lens pupil function P, thus leading to several refocused images, one for each value of ρ_s:

(15)
$$G_{\mathrm{ref}}(\rho_r,\rho_s)=G\!\left(\frac{M_a}{z}\left(z_a\rho_r+(z-z_a)\rho_s\right),\ \frac{M_b}{z}\left(z_b\rho_r+(z-z_b)\rho_s\right)\right)\approx C\,|A(\rho_r)|^{4}\,|P(\rho_s)|^{4}\tag{5}$$

where ρ_r and ρ_s are the points on the object and on the lens plane, respectively, that give the most relevant contribution to the correlation function. After the refocusing operation (5), applied to the measured correlation function, integration over the planar coordinate ρ_s provides the refocused image of the object aperture, independent of its original position, i.e., of its displacements z − z_a and z − z_b from the planes conjugate to the two image sensor planes:

$$\Sigma_{\mathrm{ref}}(\rho_r)=\int d^2\rho_s\,G_{\mathrm{ref}}(\rho_r,\rho_s)\approx C'\,|A(\rho_r)|^{4}\tag{6}$$

with C′ another irrelevant constant.
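A minimal one-dimensional sketch of the refocusing of Equations (5) and (6), using nearest-neighbour lookup in place of proper interpolation (an assumption made for brevity; all grids and names are illustrative, not the patent's own code):

```python
import numpy as np

def refocus_1d(G, rho_a_grid, rho_b_grid, rho_r_grid, rho_s_grid,
               z, z_a, z_b, M_a, M_b):
    """Apply Eq. (5): remap the measured G onto (rho_r, rho_s); then Eq. (6):
    sum over rho_s. G[i, j] is sampled at rho_a_grid[i], rho_b_grid[j]."""
    image = np.zeros(len(rho_r_grid))
    for i, r in enumerate(rho_r_grid):
        for s in rho_s_grid:
            ra = (M_a / z) * (z_a * r + (z - z_a) * s)   # first argument of G
            rb = (M_b / z) * (z_b * r + (z - z_b) * s)   # second argument of G
            ia = np.abs(rho_a_grid - ra).argmin()        # nearest pixel on D_a
            ib = np.abs(rho_b_grid - rb).argmin()        # nearest pixel on D_b
            image[i] += G[ia, ib]                        # accumulate over rho_s
    return image   # proportional to |A(rho_r)|^4, Eq. (6)
```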

(16) The limits to the refocusing operation do not appear in the geometrical optics regime; such limits can be obtained from the exact expression of the correlation function in Equations (2) and (3), which include the effects of interference and diffraction, as determined by the wave nature of light.

(17) FIGS. 2A, 2B and 2C report density plots of the visibility of images of a double-slit mask, with center-to-center distance d and slit width d/2, as a function of the slit separation d (representing the resolution) and of the distance z − z_m between the object plane and the farther refocusing plane (representing the maximum achievable depth of field). These plots visualize the depth of field enhancement entailed by the refocusing procedure as a function of resolution and maximum achievable depth of field.

(18) FIGS. 2A and 2B ("standard" images) show the visibility of the image of the double-slit mask as obtained by standard imaging, as retrieved by the image sensors D_a and D_b, respectively; FIG. 2C reports the visibility of the image of the double-slit mask obtained by correlation plenoptic imaging according to the principle of the present invention. In prior patent application EP3220185A1, the density plot of the visibility is identical to the one in FIG. 2C only for values z − z_m > 0; no refocusing is possible for z − z_m < 0.

(19) In the standard images (FIGS. 2A and 2B), the DOF increases linearly with decreasing resolution, and the two slits can only be distinguished in a narrow region around the focusing distance. In the CPI-AP technique of the present invention, the DOF of the refocused image Σ_ref(ρ_r) is highly enhanced (FIG. 2C) with respect to the mere combination of the DOFs associated with the images of FIGS. 2A and 2B retrieved by D_a and D_b separately. In the proposed simulation, for example, the image of a double slit with d = 18 μm has a DOF of 0.35 mm in standard imaging (FIGS. 2A and 2B), and a DOF of 1.4 mm with the CPI-AP technique (FIG. 2C). Hence, the depth of field is enhanced by a factor of 4 with respect to standard imaging, and by a factor of 2 with respect to the prior art.

(20) FIG. 3 shows another embodiment of an apparatus according to the present invention, designed for the case in which the illuminating source emits entangled photons, e.g., by spontaneous parametric down-conversion (SPDC). A processing unit 100 is connected to the image sensors D_a and D_b to process the intensities detected by the synchronized image sensors.

(21) The apparatus of FIG. 3 includes two identical lenses L_2 (here supposed to be thin lenses for simplicity) of focal length f > 0 that are placed at distances z′_a and z′_b from the image sensors D_a and D_b and define two conjugate planes D′_a and D′_b, at distances z_a = (1/f − 1/z′_a)⁻¹ and z_b = (1/f − 1/z′_b)⁻¹, respectively.

(22) A transmissive object 10 is placed in the optical path labelled b. The planes D′_a and D″_a, both at a distance z_a from the lenses L_2 (on the object side), are in the focal plane of an additional lens L_1 (also supposed to be a thin lens for simplicity) with focal length f_1 > 0. The lens L_1 collects the two correlated beams emitted by the SPDC source 30. Due to the correlation between the two beams, a "ghost image" of the plane D″_a is reproduced in the plane D′_a when correlation (or coincidence) measurements are performed between D_a and D_b. Hence, in the optical path a, the lens L_2 serves to focus the "ghost image" of the plane D″_a on the image sensor D_a. The refocused image is still given by Equation (5), upon replacing M_a = −z′_a/z_a with −M_a. However, due to the finite aperture of the lenses L_2, characterized by the pupil function P_2, the refocused image is slightly complicated by the presence of an envelope function, namely:

(23)
$$\Sigma_{\mathrm{ref}}(\rho_r)=\int d^2\rho_s\,G_{\mathrm{ref}}(\rho_r,\rho_s)\approx C\,\eta(\rho_r)\,|A(\rho_r)|^{4}\tag{6.1}$$

with

$$\eta(\rho_r)=\int d^2\rho_s\left|P_2(\rho_s)\,P_2\!\left(-\rho_s+\frac{2 z_a}{z}\left(\rho_r-\rho_s\right)\right)\right|^{2}\tag{7}$$

(24) The envelope function is due to the asymmetry between the two light paths: in fact, unlike the chaotic-light setup previously disclosed for the embodiment of FIG. 1, here the object 10 is not common to the two paths a and b. However, if the object size is significantly smaller than the lens size, the envelope is, to a good approximation, constant with respect to ρ_r.
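As a numerical check (an assumption of this description: one dimension and an idealized top-hat pupil of radius R, neither taken from the patent), the envelope of Equation (7) can be evaluated as follows to verify that it is nearly constant for objects much smaller than the lens:

```python
import numpy as np

def envelope_1d(rho_r, rho_s_grid, z, z_a, R):
    """Numerically evaluate eta(rho_r) of Eq. (7) in 1D, with P2 modelled
    as a top-hat pupil of radius R (assumed, idealized pupil function)."""
    P2 = lambda x: (np.abs(x) <= R).astype(float)
    integrand = np.abs(P2(rho_s_grid)
                       * P2(-rho_s_grid + (2 * z_a / z) * (rho_r - rho_s_grid))) ** 2
    return np.trapz(integrand, rho_s_grid)   # integral over rho_s
```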

(25) FIGS. 4A, 4B and 4C show the refocusing obtained with a process according to the present invention. The figures represent three different images of a vertical double slit, with center-to-center distance d = 21 μm, placed at the midpoint between two object planes at z_a = 35.55 mm and z_b = 36.55 mm from a lens L_f or L_1 with focal length f = 28.84 mm and numerical aperture NA = 0.1.

(26) The refocused image of FIG. 4A is obtained with the algorithm defined in Equation (6) and can be compared with the out-of-focus image of FIG. 4B detected on D_a and with the refocused image of FIG. 4C obtained by plenoptic imaging of the prior art with N_u = 3.

(27) Various modifications can be made to the embodiments depicted herein for illustrative purposes, without departing from the scope of the present invention as defined by the attached claims. For example, the two beams along the light paths a and b can be either naturally diverging from each other or made divergent by the insertion of additional optical elements (beam splitters and mirrors). Each lens in all the embodiments can always be replaced by a more complex imaging system. In the embodiment of FIG. 3, both L_1 and L_2 can either be a single wide lens or two separate lenses, one for the light path a and one for the light path b. In the embodiment of FIG. 3, the SPDC source can be replaced by any emitting source of entangled photons or beams. Moreover, a light emitting source can be employed to illuminate the object or the 3D scene in the scheme of FIG. 1; the source must be chaotic for generic reflective/transmissive/scattering objects, and can be a laser for objects emitting chaotic light (e.g., fluorescent objects).