Light Field Imaging Device and Method for 3D Sensing
20220221733 · 2022-07-14
Inventors
CPC classification
G02B27/4205 (PHYSICS)
G02B5/1866 (PHYSICS)
H04N25/11 (ELECTRICITY)
G02B27/0075 (PHYSICS)
International classification
Abstract
A light field imaging device may include a diffraction grating assembly configured to receive an optical wavefront from a scene and including one or more diffraction gratings. Each diffraction grating has a refractive index modulation pattern with a grating period along a grating axis and is configured to generate a diffracted wavefront. The device may also include a pixel array configured to detect the diffracted wavefront in a near-field region. The pixel array includes light-sensitive pixels and a pixel pitch along the grating axis that is equal to or larger than the grating period. Each pixel samples a portion of the diffracted wavefront and generates a pixel response. The pixels include groups or pairs of adjacent pixels, where the adjacent pixels in each group or pair have different pixel responses as a function of the angle of incidence of the optical wavefront. Light field imaging methods are also disclosed.
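As an illustrative aid (not part of the patent disclosure), the behaviour summarized in the abstract can be sketched with a toy numerical model: a sinusoidal near-field fringe of period T shifts laterally with the angle of incidence, and pixels whose pitch is 3T/2 (the (2n+1)/2 ratio with n=1 recited in the claims) integrate the fringe with alternating phase. The shift rate and all numeric values are assumptions chosen for illustration.

```python
import numpy as np

# Toy model (illustrative only; not taken from the patent text): the
# near-field diffracted intensity is approximated as a sinusoidal fringe of
# period T whose lateral position shifts linearly with the angle of incidence.
T = 1.0                   # grating period (arbitrary units)
PITCH = 1.5 * T           # pixel pitch = (2n+1)/2 * T with n = 1
SHIFT_PER_DEG = 0.05 * T  # assumed fringe shift per degree (hypothetical)

def pixel_responses(theta_deg, n_pixels=4, samples=2001):
    """Integrate the laterally shifted fringe over each pixel aperture."""
    shift = SHIFT_PER_DEG * theta_deg
    responses = []
    for k in range(n_pixels):
        x = np.linspace(k * PITCH, (k + 1) * PITCH, samples)
        intensity = 1.0 + np.cos(2 * np.pi * (x - shift) / T)
        responses.append(intensity.mean() * PITCH)  # approximate integral
    return responses

r0 = pixel_responses(0.0)  # normal incidence: adjacent pixels respond equally
r5 = pixel_responses(5.0)  # oblique incidence: pair responses diverge,
                           # while the pair-wise sum stays constant
```

With pitch/T = 3/2, adjacent pixels sample the fringe half a cycle apart, so their responses move in opposite directions as the angle of incidence changes while their pair-wise sum remains angle-independent, which is what makes the summed/differential decoding of the claims possible.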
Claims
1. A light field imaging device, comprising: a diffraction grating assembly configured to receive an optical wavefront incident from a scene, the diffraction grating assembly comprising a diffraction grating having a grating axis and a refractive index modulation pattern with a grating period along the grating axis, the diffraction grating being configured to generate, in a near-field region, a diffracted wavefront having an intensity pattern that is spatially modulated according to the grating period and that shifts laterally along the grating axis as a function of an angle of incidence of the optical wavefront; and a pixel array configured to detect the diffracted wavefront in the near-field region, the pixel array having a plurality of light-sensitive pixels and a pixel pitch along the grating axis that is equal to or larger than the grating period, the light-sensitive pixels being configured to sample respective portions of the diffracted wavefront and generate therefrom corresponding pixel responses, the plurality of light-sensitive pixels comprising groups of adjacent pixels, the adjacent pixels in each group having different pixel responses as a function of the angle of incidence.
2. The light field imaging device of claim 1, wherein a ratio of the pixel pitch to the grating period is different from a positive integer.
3. The light field imaging device of claim 2, wherein the groups of adjacent pixels are pairs of adjacent pixels.
4. The light field imaging device of claim 3, wherein the ratio of the pixel pitch to the grating period is equal to (2n+1)/2, where n is a positive integer.
5. The light field imaging device of claim 4, wherein n=1.
6. The light field imaging device of claim 4, wherein n=2.
7. The light field imaging device of any one of claims 3 to 6, further comprising a processor configured to: compute a plurality of summed pixel responses, each summed pixel response being based on a sum of the pixel responses of a respective one of the pairs of adjacent pixels, and generate a 2D image of the scene from the plurality of summed pixel responses; and/or compute a plurality of differential pixel responses, each differential pixel response being based on a difference between the pixel responses of a respective one of the pairs of adjacent pixels, and generate a depth image of the scene from the plurality of differential pixel responses.
8. The light field imaging device of claim 2, wherein the ratio of the pixel pitch to the grating period is equal to n/m, where n and m are positive integers larger than two, and n is larger than m.
9. The light field imaging device of claim 8, wherein m=3 and n=4.
10. The light field imaging device of any one of claims 2 to 9, wherein the adjacent pixels in each group have identical pixel dimensions along the grating axis.
11. The light field imaging device of claim 1, wherein a ratio of the pixel pitch to the grating period is equal to one and the adjacent pixels in each group do not all have identical pixel dimensions along the grating axis.
12. The light field imaging device of claim 11, wherein the groups of adjacent pixels are pairs of adjacent pixels.
13. The light field imaging device of any one of claims 1 to 12, wherein the diffraction grating is a phase grating.
14. The light field imaging device of claim 13, wherein the diffraction grating is a binary phase grating.
15. The light field imaging device of claim 14, wherein the refractive index modulation pattern comprises a series of ridges periodically spaced-apart at the grating period, interleaved with a series of grooves periodically spaced-apart at the grating period.
16. The light field imaging device of claim 15, wherein each group of adjacent pixels with a chief ray angle of zero is positioned in alignment with a center of a corresponding one of the ridges, a center of a corresponding one of the grooves, or a transition between a corresponding one of the ridges and a corresponding one of the grooves.
17. The light field imaging device of claim 15 or 16, wherein a degree of vertical alignment between the ridges and the grooves and the underlying light-sensitive pixels changes as a function of position within the pixel array.
18. The light field imaging device of any one of claims 14 to 17, wherein the diffraction grating has a duty cycle of about 50%.
19. The light field imaging device of any one of claims 14 to 17, wherein the diffraction grating has a duty cycle different from 50%.
20. The light field imaging device of any one of claims 1 to 19, wherein the grating period ranges from 0.1 micrometer to 10 micrometers.
21. The light field imaging device of any one of claims 1 to 20, wherein the pixel pitch ranges from 0.7 micrometer to 10 micrometers.
22. The light field imaging device of any one of claims 1 to 21, wherein a separation distance between the refractive index modulation pattern of the diffraction grating and a light-receiving surface of the pixel array ranges from 0.2 micrometer to 20 micrometers.
23. The light field imaging device of any one of claims 1 to 21, wherein a separation distance between the refractive index modulation pattern of the diffraction grating and a light-receiving surface of the pixel array is less than about twenty times a center wavelength of the optical wavefront.
24. The light field imaging device of any one of claims 1 to 23, further comprising a color filter array disposed over the pixel array and comprising a plurality of color filters arranged in a mosaic color pattern, the color filter array being configured to filter the diffracted wavefront according to the mosaic color pattern prior to detection of the diffracted wavefront by the pixel array.
25. The light field imaging device of claim 24, wherein the mosaic color pattern is a Bayer pattern.
26. The light field imaging device of claim 24 or 25, wherein the adjacent pixels in each group are disposed under identical color filters.
27. The light field imaging device of claim 26, wherein the identical color filters are green filters.
28. The light field imaging device of claim 24, wherein each color filter is a red filter, a green filter, a blue filter, a yellow filter, a cyan filter, a magenta filter, a clear filter, or an infrared filter.
29. The light field imaging device of any one of claims 1 to 28, further comprising a microlens array disposed over the pixel array and below the diffraction grating assembly, the microlens array comprising a plurality of microlenses, each microlens being optically coupled to a corresponding one of the light-sensitive pixels.
30. The light field imaging device of any one of claims 1 to 29, further comprising pixel array circuitry disposed either under the pixel array, in a backside illumination configuration, or between the diffraction grating assembly and the pixel array, in a frontside illumination configuration.
31. The light field imaging device of any one of claims 1 to 30, wherein the diffraction grating is a single diffraction grating of the diffraction grating assembly.
32. The light field imaging device of any one of claims 1 to 30, wherein the diffraction grating is one of a plurality of diffraction gratings of the diffraction grating assembly.
33. The light field imaging device of claim 32, wherein the plurality of diffraction gratings is arranged in a two-dimensional grating array disposed over the pixel array.
34. The light field imaging device of claim 32 or 33, wherein the diffraction gratings are not all identical.
35. The light field imaging device of any one of claims 32 to 34, wherein the plurality of diffraction gratings comprises multiple sets of diffraction gratings, the grating axes of the diffraction gratings of different ones of the sets having different orientations.
36. The light field imaging device of claim 35, wherein the multiple sets of diffraction gratings comprise a first set of diffraction gratings and a second set of diffraction gratings, the grating axes of the diffraction gratings of the first set extending substantially perpendicularly to the grating axes of the diffraction gratings of the second set.
37. The light field imaging device of any one of claims 1 to 23, wherein the diffraction grating is one of a plurality of diffraction gratings of the diffraction grating assembly, each diffraction grating comprises a grating substrate including a top surface having the refractive index modulation pattern formed thereon, and the grating substrate comprises a spectral filter material or region configured to filter the diffracted wavefront prior to detection of the diffracted wavefront by the plurality of light-sensitive pixels, the plurality of diffraction gratings thus forming a color filter array.
38. The light field imaging device of claim 37, wherein the grating substrate of each diffraction grating acts as a red filter, a green filter, a blue filter, a yellow filter, a cyan filter, a magenta filter, a clear filter, or an infrared filter.
39. The light field imaging device of claim 37, wherein the color filter array is arranged in a Bayer pattern.
40. A light field imaging device, comprising: a diffraction grating assembly configured to receive an optical wavefront incident from a scene, the diffraction grating assembly comprising a diffraction grating having a grating axis and a refractive index modulation pattern with a grating period along the grating axis, the diffraction grating being configured to diffract the optical wavefront to generate a diffracted wavefront; and a pixel array having a plurality of light-sensitive pixels disposed under the diffraction grating assembly and configured to detect the diffracted wavefront in a near-field region, the pixel array having a pixel pitch along the grating axis that is larger than the grating period, a ratio of the pixel pitch to the grating period being different from a positive integer.
41. The light field imaging device of claim 40, wherein the ratio of the pixel pitch to the grating period is equal to (2n+1)/2, where n is a positive integer.
42. The light field imaging device of claim 41, wherein n=1.
43. The light field imaging device of claim 41, wherein n=2.
44. The light field imaging device of claim 40, wherein the ratio of the pixel pitch to the grating period is equal to n/m, where n and m are positive integers larger than two, and n is larger than m.
45. The light field imaging device of claim 44, wherein m=3 and n=4.
46. The light field imaging device of any one of claims 40 to 43, wherein the light-sensitive pixels are configured to sample respective portions of the diffracted wavefront and generate therefrom corresponding pixel responses, the plurality of light-sensitive pixels comprising pairs of adjacent pixels, the adjacent pixels in each pair having different pixel responses as a function of the angle of incidence, the light field imaging device further comprising a processor configured to: compute a plurality of summed pixel responses, each summed pixel response being based on a sum of the pixel responses of a respective one of the pairs of adjacent pixels, and generate a 2D image of the scene from the plurality of summed pixel responses; and/or compute a plurality of differential pixel responses, each differential pixel response being based on a difference between the pixel responses of a respective one of the pairs of adjacent pixels, and generate a depth image of the scene from the plurality of differential pixel responses.
47. The light field imaging device of any one of claims 40 to 46, wherein the pixels have identical pixel dimensions along the grating axis.
48. The light field imaging device of any one of claims 40 to 46, wherein the pixels do not all have identical pixel dimensions along the grating axis.
49. The light field imaging device of any one of claims 40 to 48, wherein the diffraction grating is a phase grating.
50. The light field imaging device of claim 49, wherein the diffraction grating is a binary phase grating.
51. The light field imaging device of claim 50, wherein the refractive index modulation pattern comprises a series of ridges periodically spaced-apart at the grating period, interleaved with a series of grooves periodically spaced-apart at the grating period.
52. The light field imaging device of claim 51, wherein each pixel with a chief ray angle of zero is positioned in alignment with a center of a corresponding one of the ridges, a center of a corresponding one of the grooves, or a transition between a corresponding one of the ridges and a corresponding one of the grooves.
53. The light field imaging device of claim 51 or 52, wherein a degree of vertical alignment between the ridges and the grooves and the underlying light-sensitive pixels changes as a function of position within the pixel array.
54. The light field imaging device of any one of claims 50 to 53, wherein the diffraction grating has a duty cycle of about 50%.
55. The light field imaging device of any one of claims 50 to 53, wherein the diffraction grating has a duty cycle different from 50%.
56. The light field imaging device of any one of claims 40 to 55, wherein the grating period ranges from 0.1 micrometer to 10 micrometers.
57. The light field imaging device of any one of claims 40 to 56, wherein the pixel pitch ranges from 0.7 micrometer to 10 micrometers.
58. The light field imaging device of any one of claims 40 to 57, wherein a separation distance between the refractive index modulation pattern of the diffraction grating and a light-receiving surface of the pixel array ranges from 0.2 micrometer to 20 micrometers.
59. The light field imaging device of any one of claims 40 to 58, wherein a separation distance between the refractive index modulation pattern of the diffraction grating and a light-receiving surface of the pixel array is less than about twenty times a center wavelength of the optical wavefront.
60. The light field imaging device of any one of claims 40 to 59, further comprising a color filter array disposed over the pixel array and comprising a plurality of color filters arranged in a mosaic color pattern, the color filter array filtering the diffracted wavefront according to the mosaic color pattern prior to detection of the diffracted wavefront by the pixel array.
61. The light field imaging device of any one of claims 40 to 60, further comprising a microlens array disposed over the pixel array and below the diffraction grating assembly, the microlens array comprising a plurality of microlenses, each microlens being optically coupled to a corresponding one of the light-sensitive pixels.
62. The light field imaging device of any one of claims 40 to 61, further comprising pixel array circuitry disposed either under the pixel array, in a backside illumination configuration, or between the diffraction grating assembly and the pixel array, in a frontside illumination configuration.
63. The light field imaging device of any one of claims 40 to 62, wherein the diffraction grating assembly comprises a single grating orientation.
64. The light field imaging device of any one of claims 40 to 63, wherein the diffraction grating is one of a plurality of diffraction gratings of the diffraction grating assembly.
65. The light field imaging device of claim 64, wherein the plurality of diffraction gratings is arranged in a two-dimensional grating array disposed over the pixel array.
66. The light field imaging device of claim 64 or 65, wherein the diffraction gratings are not all identical.
67. The light field imaging device of any one of claims 64 to 66, wherein the plurality of diffraction gratings comprises multiple sets of diffraction gratings, the grating axes of the diffraction gratings of different ones of the sets having different orientations.
68. The light field imaging device of claim 67, wherein the multiple sets of diffraction gratings comprise a first set of diffraction gratings and a second set of diffraction gratings, the grating axes of the diffraction gratings of the first set extending substantially perpendicularly to the grating axes of the diffraction gratings of the second set.
69. A diffraction grating assembly for use with an image sensor, the image sensor comprising a pixel array having a plurality of light-sensitive pixels, the diffraction grating assembly comprising a diffraction grating having a grating axis and a refractive index modulation pattern with a grating period along the grating axis, the grating period being equal to or smaller than a pixel pitch of the pixel array along the grating axis, the diffraction grating being configured to diffract an incident optical wavefront and generate, in a near-field region, a diffracted wavefront having an intensity pattern that is spatially modulated according to the grating period and that shifts laterally along the grating axis as a function of an angle of incidence of the optical wavefront, the diffraction grating assembly being configured to be disposed over the pixel array with the light-sensitive pixels located in the near-field region and comprising laterally adjacent pixels configured to generate different pixel responses as a function of the angle of incidence.
70. The diffraction grating assembly of claim 69, wherein the diffraction grating assembly is configured to be disposed over a color filter array of the image sensor, the color filter array being disposed over the pixel array and configured to filter the diffracted wavefront prior to detection of the diffracted wavefront by the plurality of light-sensitive pixels.
71. The diffraction grating assembly of claim 69 or 70, wherein the diffraction grating is a binary phase grating.
72. The diffraction grating assembly of claim 71, wherein the refractive index modulation pattern comprises a series of ridges periodically spaced-apart at the grating period, interleaved with a series of grooves periodically spaced-apart at the grating period.
73. The diffraction grating assembly of any one of claims 69 to 72, comprising a single grating orientation.
74. The diffraction grating assembly of any one of claims 69 to 73, wherein the diffraction grating is one of a plurality of diffraction gratings of the diffraction grating assembly, the plurality of diffraction gratings being arranged in a two-dimensional grating array disposed over the pixel array.
75. The diffraction grating assembly of claim 74, comprising two orthogonal grating orientations.
76. The diffraction grating assembly of any one of claims 69 to 75, wherein the grating period ranges from 0.1 micrometer to 20 micrometers.
77. A method of capturing light field image data about a scene, the method comprising: diffracting an optical wavefront originating from the scene with a diffraction grating having a grating axis and a refractive index modulation pattern with a grating period along the grating axis to generate a diffracted wavefront having, in a near-field diffraction region, an intensity pattern that is spatially modulated according to the grating period and that shifts laterally along the grating axis as a function of an angle of incidence of the optical wavefront; and detecting, as the light field image data, the diffracted wavefront with a pixel array positioned in the near-field diffraction region, the pixel array having a plurality of light-sensitive pixels and a pixel pitch along the grating axis that is equal to or larger than the grating period, said detecting comprising sampling, by the light-sensitive pixels, respective portions of the diffracted wavefront to generate corresponding pixel responses, the plurality of light-sensitive pixels comprising groups of adjacent pixels, the adjacent pixels in each group having different pixel responses as a function of the angle of incidence.
78. The method of claim 77, further comprising setting a ratio of the pixel pitch to the grating period to be different from a positive integer.
79. The method of claim 78, wherein the groups of adjacent pixels are pairs of adjacent pixels.
80. The method of claim 79, further comprising setting the ratio of the pixel pitch to the grating period to be equal to (2n+1)/2, where n is a positive integer.
81. The method of claim 80, further comprising setting n equal to one.
82. The method of claim 80, further comprising setting n equal to two.
83. The method of any one of claims 79 to 82, further comprising: computing a plurality of summed pixel responses, each summed pixel response being based on a sum of the pixel responses of a respective one of the pairs of adjacent pixels; and generating a 2D image of the scene from the plurality of summed pixel responses.
84. The method of any one of claims 79 to 83, further comprising: computing a plurality of differential pixel responses, each differential pixel response being based on a difference between the pixel responses of a respective one of the pairs of adjacent pixels; and generating a depth image of the scene from the plurality of differential pixel responses.
85. The method of claim 78, further comprising setting the ratio of the pixel pitch to the grating period to be equal to n/m, where n and m are positive integers larger than two, and n is larger than m.
86. The method of claim 85, further comprising setting m equal to three and n equal to four.
87. The method of any one of claims 78 to 86, further comprising providing the adjacent pixels in each group with identical pixel dimensions along the grating axis.
88. The method of any one of claims 77 to 86, further comprising providing the adjacent pixels in each group with pixel dimensions along the grating axis that are not all identical.
89. The method of any one of claims 77 to 88, further comprising providing the diffraction grating as a binary phase grating comprising a series of ridges periodically spaced-apart at the grating period, interleaved with a series of grooves periodically spaced-apart at the grating period.
90. The method of claim 89, further comprising providing the diffraction grating with a duty cycle of about 50% and positioning each group of adjacent pixels having a chief ray angle of zero in alignment with a center of a corresponding one of the ridges, a center of a corresponding one of the grooves, or a transition between a corresponding one of the ridges and a corresponding one of the grooves.
91. The method of claim 89 or 90, further comprising providing a degree of vertical alignment between the ridges and the grooves and the underlying light-sensitive pixels that changes as a function of position within the pixel array.
92. The method of any one of claims 77 to 91, further comprising setting a separation distance between the refractive index modulation pattern of the diffraction grating and a light-receiving surface of the pixel array to be less than about twenty times a center wavelength of the optical wavefront.
93. The method of any one of claims 77 to 91, further comprising filtering the diffracted wavefront with a color filter array prior to detecting the diffracted wavefront with the plurality of light-sensitive pixels.
94. A method of providing light field imaging capabilities to an image sensor comprising a pixel array having a plurality of light-sensitive pixels and a pixel pitch along a pixel axis, the method comprising: providing a diffraction grating assembly comprising a diffraction grating having a grating axis and a refractive index modulation pattern with a grating period along the grating axis, the grating period being equal to or smaller than the pixel pitch, the diffraction grating being configured to diffract an incident optical wavefront into a diffracted wavefront having, in a near-field diffraction region, an intensity pattern that is spatially modulated according to the grating period and that shifts laterally along the grating axis as a function of an angle of incidence of the optical wavefront; and disposing the diffraction grating assembly in front of the image sensor with the grating axis parallel to the pixel axis and the light-sensitive pixels located in the near-field diffraction region for detection of the diffracted wavefront, the light-sensitive pixels comprising laterally adjacent pixels configured to generate different pixel responses as a function of the angle of incidence.
95. The method of claim 94, further comprising setting a ratio of the pixel pitch to the grating period to be different from a positive integer.
96. The method of claim 95, further comprising setting the ratio of the pixel pitch to the grating period to be equal to (2n+1)/2, where n is a positive integer.
97. The method of claim 96, further comprising setting n equal to one.
98. The method of claim 96, further comprising setting n equal to two.
99. The method of claim 95, further comprising setting the ratio of the pixel pitch to the grating period to be equal to n/m, where n and m are positive integers larger than two, and n is larger than m.
100. The method of claim 99, further comprising setting m equal to three and n equal to four.
101. The method of any one of claims 96 to 100, further comprising providing the pixels with identical pixel dimensions along the grating axis.
102. The method of any one of claims 95 to 100, further comprising providing the pixels with pixel dimensions along the grating axis that are not all identical.
103. The method of any one of claims 95 to 102, wherein disposing the diffraction grating assembly in front of the image sensor comprises positioning the diffraction grating assembly at a separation distance from the pixel array selected such that an optical path length of the diffracted wavefront prior to being detected by the light-sensitive pixels is less than about twenty times a center wavelength of the optical wavefront.
104. The method of any one of claims 95 to 102, wherein disposing the diffraction grating assembly in front of the image sensor comprises positioning the diffraction grating assembly at a separation distance from the pixel array that ranges from 0.2 micrometer to 20 micrometers.
105. The method of any one of claims 95 to 104, wherein providing the diffraction grating assembly comprises providing the diffraction grating as a binary phase grating comprising a series of ridges periodically spaced-apart at the grating period, interleaved with a series of grooves periodically spaced-apart at the grating period.
106. The method of any one of claims 95 to 105, wherein providing the diffraction grating assembly comprises providing the diffraction grating assembly with a plurality of diffraction gratings, the plurality of diffraction gratings comprising multiple sets of diffraction gratings having different orientations.
107. The method of any one of claims 95 to 106, wherein providing the diffraction grating assembly comprises providing the diffraction grating assembly with a single grating orientation.
108. A light field imaging device, comprising: a diffraction grating assembly configured to receive an optical wavefront incident from a scene, the diffraction grating assembly comprising a phase diffraction grating having a grating axis and a refractive index modulation pattern with a grating period along the grating axis, the refractive index modulation pattern comprising a series of ridges periodically spaced-apart at the grating period, interleaved with a series of grooves periodically spaced-apart at the grating period, the diffraction grating being configured to generate, in a near-field region, a diffracted wavefront having an intensity pattern that is spatially modulated according to the grating period and that shifts laterally along the grating axis as a function of an angle of incidence of the optical wavefront; and a pixel array configured to detect the diffracted wavefront in the near-field region, the pixel array having a plurality of light-sensitive pixels and a pixel pitch along the grating axis, a ratio of the pixel pitch to the grating period being equal to (2n+1)/2, where n is a positive integer, the light-sensitive pixels sampling respective portions of the diffracted wavefront and generating therefrom corresponding pixel responses, the plurality of light-sensitive pixels comprising pairs of adjacent pixels, the adjacent pixels in each pair having different pixel responses as a function of the angle of incidence.
109. The light field imaging device of claim 108, wherein n=1 or n=2.
110. A light field imaging device, comprising: a diffraction grating assembly configured to receive an optical wavefront incident from a scene, the diffraction grating assembly comprising a phase diffraction grating having a grating axis and a refractive index modulation pattern with a grating period along the grating axis, the refractive index modulation pattern comprising a series of ridges periodically spaced-apart at the grating period, interleaved with a series of grooves periodically spaced-apart at the grating period, the diffraction grating being configured to diffract the optical wavefront to generate a diffracted wavefront; and a pixel array having a plurality of light-sensitive pixels disposed under the diffraction grating assembly and configured to detect the diffracted wavefront in a near-field region, the pixel array having a pixel pitch along the grating axis, a ratio of the pixel pitch to the grating period being equal to (2n+1)/2, where n is a positive integer.
111. The light field imaging device of claim 110, wherein n=1 or n=2.
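The processor recited in claims 7, 46, 83, and 84 sums and differences the responses of each pair of adjacent pixels. A minimal sketch of that decoding step, assuming (hypothetically; the claims specify no particular layout) that the responses arrive as a 2D array whose columns along the grating axis form the adjacent pairs. The function name `decode_pairs` and the array layout are illustrative assumptions:

```python
import numpy as np

def decode_pairs(raw):
    """Split columns into adjacent pairs; sum -> 2D image, difference -> depth cue.

    `raw` is assumed to be a 2D response array whose columns along the
    grating axis form the pairs of adjacent pixels (an illustrative layout,
    not specified in the claims).
    """
    left = raw[:, 0::2].astype(float)   # first pixel of each pair
    right = raw[:, 1::2].astype(float)  # second pixel of each pair
    image_2d = left + right   # summed pixel responses -> 2D image
    depth_cue = left - right  # differential pixel responses -> depth image input
    return image_2d, depth_cue

raw = np.array([[10, 12, 7, 7],
                [9, 5, 8, 4]])
img, depth = decode_pairs(raw)
# img   -> [[22., 14.], [14., 12.]]
# depth -> [[-2.,  0.], [ 4.,  4.]]
```

Because the pair-wise sum is insensitive to the lateral fringe shift, `image_2d` behaves like a conventional intensity image, while `depth_cue` isolates the angle-of-incidence information from which a depth image can be generated.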
Description
BRIEF DESCRIPTION OF THE DRAWINGS
DETAILED DESCRIPTION
[0131] In the present description, similar features in the drawings have been given similar reference numerals. To avoid cluttering certain figures, some elements may not be indicated if they were already identified in a preceding figure. It is also appreciated that the elements of the drawings are not necessarily depicted to scale, since emphasis is placed on clearly illustrating the elements and structures of the present embodiments. Furthermore, positional descriptors indicating the location and/or orientation of one element with respect to another element are used herein for ease and clarity of description. Unless otherwise indicated, these positional descriptors should be taken in the context of the figures and should not be considered limiting. It will be appreciated that such spatially relative terms are intended to encompass different orientations in the use or operation of the present embodiments, in addition to the orientations exemplified in the figures. In particular, terms such as “on”, “over”, “under”, “above”, and “below”, used in specifying the relative spatial relationship of two elements denote that the two elements can be either in direct contact with each other or separated from each other by one or more intervening elements.
[0132] The terms “connected” and “coupled”, and derivatives and variants thereof, are intended to refer herein to any structural or functional connection or coupling, either direct or indirect, between two or more elements. The connection or coupling between the elements may be, for example, mechanical, optical, thermal, electrical, magnetic, chemical, logical, operational, or any combination thereof.
[0133] The terms “a”, “an”, and “one” are defined herein to mean “at least one”, that is, these terms do not exclude a plural number of elements, unless stated otherwise.
[0134] Terms such as “substantially”, “generally”, and “about”, that modify a value, a condition, or a characteristic of a feature of an exemplary embodiment, should be understood to mean that the value, condition, or characteristic is defined within tolerances that are acceptable for the proper operation of this exemplary embodiment for its intended application or that fall within an acceptable range of experimental error. In particular, the term “about” can refer to a range of numbers that one skilled in the art would consider equivalent to the stated value (e.g., having the same or equivalent function or result). In some instances, the term “about” means a variation of ±10 percent of the stated value. It is noted that all numeric values used herein are assumed to be modified by the term “about”, unless stated otherwise.
[0135] The terms “match”, “matching”, and “matched” are intended to refer herein to a condition in which two elements are either the same or within some predetermined tolerance of each other. That is, these terms are meant to encompass not only “exactly” or “identically” matching the two elements, but also “substantially”, “approximately”, “subjectively”, or “sufficiently” matching the two elements, as well as providing a higher or best match among a plurality of matching possibilities.
[0136] The present description generally relates to light field imaging techniques for acquiring light field information or image data about an optical wavefront emanating from a scene. In accordance with various aspects, the present description relates to a light field imaging device for capturing light field image data about a scene; a diffraction grating assembly for use with an image sensor to obtain light field image data about a scene; a method of capturing light field image data about a scene; and a method of providing 3D or light field imaging capabilities to an image sensor array viewing a scene.
[0137] In some implementations, the present techniques enable the specific manipulation and comparison of the chromatic dependence of diffraction by means of one or more diffractive optical elements paired with an appropriate chromatic encoding mechanism, as well as its use in 3D imaging. In some implementations, the light field imaging devices and methods disclosed herein are sensitive to not only the intensity and angle of incidence of an optical wavefront originating from an observable scene, but also the wavelength, through a specific spatio-spectral subsampling of a generated interference pattern allowing for the direct measurement of the chromatic dependence of diffraction.
[0138] The acquired light field information can include information about not only the intensity of an incident optical wavefront, but also other light field parameters including, without limitation, the angle of incidence, the phase, the wavelength, and the polarization of the optical wavefront. Therefore, light field imaging devices, for example, depth cameras, may acquire more information than traditional cameras, which typically record only light intensity. The image data captured by light field imaging devices may be used or processed in a variety of ways to provide multiple functions including, but not limited to, 3D depth map extraction, 3D surface reconstruction, image refocusing, and the like. Depending on the application, the light field image data of an observable scene may be acquired as one or more still images or as a video stream.
[0139] The present techniques may be used in imaging applications that require or may benefit from enhanced depth sensing and other 3D imaging capabilities, for example, to allow a user to change the focus, the point of view, and/or the depth of field of a captured image of a scene. The present techniques may be applied to or implemented in various types of 3D imaging systems and methods including, without limitation, light field imaging applications using plenoptic descriptions, ranging applications through the comparative analysis of the chromatic dependence of diffraction, and single-sensor single-image depth acquisition applications.
[0140] Non-limiting fields of application include, to name a few, consumer electronics (e.g., mobile phones, tablets, and notebooks, gaming, virtual and augmented reality, photography, etc.), automotive applications (e.g., advanced driver assistance systems, in-cabin monitoring, etc.), industrial applications (e.g., inspection, robot guidance, object identification and tracking, etc.), and security and surveillance (e.g., facial recognition and biometrics, motion tracking, traffic monitoring, drones, agricultural inspection with aerial and ground-based drones, etc.).
[0141] Non-exhaustive advantages and benefits of certain implementations of the present techniques may include: compatibility with passive sensing modalities that employ less power to perform their functions; compatibility with single-sensor architectures having reduced footprints; enablement of depth mapping functions while preserving 2D performance; simple and low-cost integration into existing image sensor hardware and manufacturing processes; compatibility with conventional CMOS and CCD image sensors; extension of the capabilities of other 3D sensing apparatuses, for example, by extending the range of stereo vision devices in the near field or when images are too blurry to perform standard stereo image pairs registration; use in recalibration of other 3D sensing apparatuses, for example, by recalibrating misaligned stereo vision systems; and elimination of the need for multiple components, such as dual cameras or cameras equipped with active lighting systems for depth detection.
[0142] In the present description, the terms “light” and “optical”, and variants and derivatives thereof, are intended to refer to radiation in any appropriate region of the electromagnetic spectrum. In particular, the terms “light” and “optical” are not limited to visible light, but may also include invisible regions of the electromagnetic spectrum including, without limitation, the terahertz (THz), infrared (IR), and ultraviolet (UV) spectral bands. In some implementations, the terms “light” and “optical” may encompass electromagnetic radiation having a wavelength ranging from about 175 nanometers (nm) in the deep ultraviolet to about 300 micrometers (μm) in the terahertz range, for example, from about 400 nm at the blue end of the visible spectrum to about 1550 nm at telecommunication wavelengths, or between about 400 nm and about 650 nm to match the spectral range of typical red-green-blue (RGB) color filters. Those skilled in the art will understand, however, that these wavelength ranges are provided for illustrative purposes only and that the present techniques may operate beyond this range.
[0143] In the present description, the terms “color” and “chromatic”, and variants and derivatives thereof, are used not only in their usual context of human perception of visible electromagnetic radiation (e.g., red, green, and blue), but also, and more broadly, to describe spectral characteristics (e.g., diffraction, transmission, reflection, dispersion, absorption) over any appropriate region of the electromagnetic spectrum. In this context, and unless otherwise specified, the terms “color” and “chromatic”, and their variants and derivatives, may be used interchangeably with the term “spectral” and its variants and derivatives.
[0144] Various implementations of the present techniques are described below with reference to the figures.
Light Field Imaging Device Implementations
[0145] Referring to
[0146] In the present description, the term “light field imaging device” broadly refers to an image capture device capable of acquiring an image representing a light field or wavefront emanating from a scene, where the acquired light field image contains information about not only light intensity at the image plane, but also other light field parameters such as, for example, the direction from which light rays enter the device (i.e., the angle of incidence), the spectrum of the light field, its phase, and its polarization. In some instances, the term “light field imaging device” may be used interchangeably with terms such as “light field camera”, “light field imager”, “light field image capture device”, “depth image capture device”, “3D image capture device”, “plenoptic camera”, and the like.
[0147] The term “scene” is meant to denote any region, space, volume, area, surface, environment, target, feature, or information of interest which may be imaged according to the present techniques. Depending on the application, the observable scene can be an indoor scene or an outdoor scene.
[0148] The light field imaging device 20 depicted in
[0149] The light field imaging device 20 also includes a pixel array 38 including a plurality of light-sensitive pixels 40 disposed under the diffraction grating assembly 24 and configured to detect, in a near-field region, the diffracted wavefront 36 as light field image data about the scene 22. In color implementations, the light field imaging device 20 may also include a color filter array 42 disposed over the pixel array 38. The color filter array 42 may include a plurality of color filters 44 arranged in a mosaic color pattern. Each color filter 44 may be configured to filter incident light according to wavelength to capture color information at a respective location of the color filter array 42. The color filter array 42 may be configured to spatially and spectrally filter the diffracted wavefront 36 according to the mosaic color pattern prior to detection of the diffracted wavefront 36 by the plurality of light-sensitive pixels 40.
[0150] As noted above, by providing a color filter array to perform a direct spatio-chromatic subsampling of the diffracted wavefront generated by the diffraction grating assembly prior to its detection by the pixel array, the light field imaging device may be sensitive to not only the intensity and the angle of incidence of an input optical wavefront, but also its spectral content.
[0151] It is appreciated that a color filter array need not be provided in some applications, for example, monochrome imaging. It is also appreciated that, for simplicity, the wavefront detected by the light-sensitive pixels will be generally referred to as a “diffracted wavefront” in both monochrome and color implementations, although in the latter case, the terms “filtered wavefront” or “filtered diffracted wavefront” may, in some instances, be used to denote the fact that the diffracted wavefront generated by the diffraction grating assembly is both spatially and spectrally filtered by the color filter array prior to detection by the underlying pixel array. Furthermore, in some implementations where a color filter array is not provided, it may be envisioned that the diffraction grating itself could act as a color filter. For example, the diffraction grating could include a grating substrate with a top surface having the refractive index modulation pattern formed thereon, the grating substrate including a spectral filter material or region configured to spectrally filter the diffracted wavefront according to wavelength prior to detection of the diffracted wavefront by the plurality of light-sensitive pixels. For example, and without limitation, the spectral filter material or region could act as one of a red filter, a green filter, a blue filter, a yellow filter, a cyan filter, a magenta filter, a clear or white filter, or an infrared filter (e.g., at around 850 nm or 940 nm). Of course, various other types of color filters may be used in other variants.
[0152] Depending on the application or use, the light field imaging device may be implemented using various image sensor architectures and pixel array configurations. In some embodiments, the light field imaging device may be implemented simply by adding or coupling a diffraction grating assembly on top of an already existing image sensor including a pixel array and, in color-based applications, a color filter array. For example, the existing image sensor may be a conventional 2D CMOS or CCD imager. However, in other implementations, the light field imaging device may be implemented and integrally packaged as a separate, dedicated, and/or custom-designed device incorporating therein all or most of its components (e.g., diffraction grating assembly, pixel array, color filter array, microlens array, etc.).
[0153] More details regarding the structure, configuration, and operation of the components introduced in the preceding paragraphs as well as other possible components of the light field imaging device will be described below.
[0154] In the embodiment of
[0155] Diffraction occurs when a wavefront, whether electromagnetic or otherwise, encounters a physical object or a refractive-index perturbation. The wavefront tends to bend around the edges of the object. Should a wavefront encounter multiple objects, whether periodic or otherwise, the corresponding wavelets may interfere some distance away from the initial encounter as demonstrated by Young's double slit experiment. This interference creates a distinct pattern, referred to as a “diffraction pattern” or an “interference pattern”, as a function of distance from the original encounter, which is sensitive to the incidence angle and the spectral content of the wavefront, and the general size, shape, and relative spatial relationships of the encountered objects. This interference may be described through the evolving relative front of each corresponding wavelet, as described by the Huygens-Fresnel principle.
[0156] In the present description, the term “diffraction grating”, or simply “grating”, refers to a periodic or aperiodic optical structure having spatially modulated optical properties (e.g., a refractive index modulation pattern, defining a grating profile) and being configured to modulate the amplitude and/or the phase of an incident optical wavefront. A diffraction grating may include a periodic arrangement of diffracting elements (e.g., alternating ridges and grooves) whose spatial period—the grating period—is nearly equal to or slightly longer than the wavelength of light incident on the grating.
[0157] An optical wavefront containing a range of wavelengths incident on a diffraction grating will, upon diffraction, have its amplitude and/or phase modified. As a result, a space- and time-dependent diffracted wavefront is produced. In general, a diffraction grating is spectrally dispersive such that each wavelength of an input optical wavefront will be outputted along a different direction. However, diffraction gratings exhibiting a substantially achromatic response over an operating spectral range exist and may be used in some implementations. For example, in some implementations, the diffraction grating may be substantially achromatic in a spectral range of interest and be designed for the center wavelength of the spectral range of interest. In particular, in an embodiment using a Bayer patterned color filter array, the diffraction grating may be optimized for the green channel, that is, around a center wavelength of about 532 nm. It is to be noted that when the diffraction grating is substantially achromatic over the operating spectral range, it is the color filter array that may provide a chromatic sub-sampling of the diffracted wavefront.
[0158] Depending on whether the diffracting elements forming the diffraction grating are transmitting or reflective, the diffraction grating may be referred to as a “transmission grating” or a “reflection grating”. It is noted that while several embodiments described herein may use transmission gratings, the use of reflection gratings in other embodiments is not excluded.
[0159] Diffraction gratings may also be classified as “amplitude gratings” or “phase gratings”, depending on the nature of the diffracting elements. In amplitude gratings, the perturbations to the initial wavefront caused by the grating are the result of a direct amplitude modulation, while in phase gratings, these perturbations are the result of a modulation of the relative group velocity of light caused by a spatial variation of the refractive index of the grating material. In several embodiments disclosed in the present description, the diffraction gratings are phase gratings, which generally absorb less light than amplitude gratings, although amplitude gratings may be used in other embodiments.
[0160] In the embodiment of
[0161] Depending on the application, the diffraction grating 28 may have a duty cycle substantially equal to 50% or different from 50%. The duty cycle is defined herein as the ratio of the ridge width to the grating period 34.
[0162] Another parameter of the diffraction grating 28 is the step height 56, which is the difference in level between the ridges 52 and the grooves 54. For example, in some implementations, the step height 56 may range from about 0.1 μm to about 1 μm. It is to be noted that in some implementations, the step height 56 may be selected such that the diffraction grating 28 causes a predetermined optical path difference between adjacent ridges 52 and grooves 54. For example, the step height 56 may be controlled, along with the refractive index difference between the ridges 52 and the grooves 54, to provide, at a given wavelength and angle of incidence of the optical wavefront (e.g., its center wavelength), a half-wave optical path difference between the ridges and the grooves. Of course, other optical path difference values may be used in other implementations.
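The half-wave condition above can be illustrated with a short calculation. Assuming, for illustration only, that the optical path difference at normal incidence equals the step height times the refractive index difference between the ridge and groove materials, the required step height is λ/(2Δn). The material indices below are assumed, not taken from the description:

```python
def half_wave_step_height(wavelength_um, n_ridge, n_groove):
    """Step height giving a half-wave optical path difference at normal
    incidence, assuming OPD = h * (n_ridge - n_groove) = wavelength / 2."""
    return wavelength_um / (2.0 * (n_ridge - n_groove))

# Illustrative values: 532 nm design wavelength, ridge index ~2.0,
# groove fill index ~1.46 (assumed materials, not from the description).
h = half_wave_step_height(0.532, 2.0, 1.46)
print(f"step height: {h:.3f} um")  # ~0.493 um
```

For these assumed indices, the result falls within the 0.1 μm to 1 μm step height range mentioned above.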
[0163] It is to be noted that while the diffraction grating 28 in the embodiment of
[0164] Referring still to
[0165] The term “pixel array” refers generally to a sensor array made up of a plurality of photosensors, referred to herein as “light-sensitive pixels”, or simply “pixels”, which are configured to detect electromagnetic radiation incident thereonto from an observable scene and to convert the detected radiation into electrical data, which may be processed to generate an image of the scene. In the present techniques, the electromagnetic radiation that is detected by the light-sensitive pixels as light field image data corresponds to an optical wavefront incident from the scene, which has been diffracted, and possibly spatio-chromatically filtered, prior to reaching the pixel array.
[0166] The pixel array 38 may be embodied by a CMOS or a CCD image sensor, but other types of photodetector arrays (e.g., charge injection devices or photodiode arrays) or devices (e.g., an event camera) could alternatively be used. As mentioned above, the pixel array 38 may be configured to detect electromagnetic radiation in any appropriate region of the spectrum.
[0167] Depending on the application, the pixel array 38 may be configured according to a rolling or global shutter readout design. The pixel array 38 may further be part of a stacked, backside, or frontside illumination sensor architecture, as described in greater detail below. The pixel array 38 may be of any standard or non-standard optical format, for example, but not limited to, 4/3″, 1″, 2/3″, 1/1.8″, 1/2″, 1/2.7″, 1/3″, 1/3.2″, 1/3.6″, 35 mm, and the like. The pixel array 38 may also include a contrast or a phase-detection autofocus mechanism, along with their respective pixel architectures. Unless stated otherwise, the term “pixel array” may be used herein interchangeably with terms such as “photodetector array”, “photosensor array”, “imager array”, and the like.
[0168] A light-sensitive pixel 40 of the pixel array 38 may convert the spatial portion of the diffracted wavefront 36 incident upon it into accumulated charge, the amount of which is proportional to the amount of light collected and recorded by the pixel 40. Each light-sensitive pixel 40 may include a light-sensitive surface and associated pixel circuitry for processing signals at the pixel level and communicating with other electronics, such as a readout unit. Those skilled in the art will appreciate that various other components may be integrated into the pixel circuitry. In general, the light-sensitive pixels 40 may be individually addressed and read out.
[0169] Referring still to
[0170] In the embodiment of
[0171] The pixel array 38 may also be characterized by a pixel pitch 62. In the present description, the term “pixel pitch” generally refers to the spacing between individual pixels 40 and is typically defined as the center-to-center distance between nearest-neighbor pixels 40. Depending on the physical arrangement of the pixel array 38, the pixel pitch 62 along the two orthogonal pixel axes 58, 60 may or may not be the same. It is appreciated that a pixel pitch may also be defined along an arbitrary axis, for example, along a diagonal axis oriented at 45° with respect to the two orthogonal pixel axes 58, 60. As described in greater detail below, a relevant pixel pitch 62 is the one along the grating axis 30. As also described in greater detail below, in the present techniques, the pixel pitch 62 of the pixel array 38 along the grating axis 30 is equal to or larger than the grating period 34. For example, in some implementations the pixel pitch 62 along the grating axis 30 may range from 0.7 μm or less to 10 μm, although different pixel pitch values may be used in other implementations.
[0172] In the present description, the term “pixel data” refers to the image information captured by each individual pixel and can include intensity data indicative of the total amount of optical energy absorbed by each individual pixel over an integration period. Combining the pixel data from all the pixels 40 yields light field image data about the scene 22. In the present techniques, because the optical wavefront 26 incident from the scene 22 is diffracted and, possibly, spatially and spectrally filtered prior to detection, the light field image data can provide information about not only the intensity of the incident wavefront 26, but also other light field parameters such as its angle of incidence, phase and spectral content. In particular, as described in greater detail below, the present techniques may allow depth information to be retrieved from angle-of-incidence-dependent information encoded into the intensity-based diffraction pattern produced by the diffraction grating 28 and recorded by the pixel array 38.
[0173] Referring still to
[0174] As mentioned above regarding the terms “color” and “chromatic”, terms such as “color filter” and “color filtering” are to be understood as being equivalent to “spectral filter” and “spectral filtering” in any appropriate spectral range of the electromagnetic spectrum, and not only within the visible range. Depending on the application, the color filters may achieve spectral filtering through absorption of unwanted spectral components, for example, using dye-based color filters. However, other filtering principles may be used without departing from the scope of the present techniques.
[0175] Returning to
[0176] Referring now to
[0177] Referring to
[0178] Referring to
[0179]
[0180] In
[0181] In the present techniques, the diffraction grating 28 and the pixel array 38 are disposed relative to each other such that the light-receiving surface 68 of the pixel array 38 is positioned in the near-field diffraction region of the diffraction grating 28. In a near-field diffraction regime, the Fresnel diffraction theory can be used to calculate the diffraction pattern of waves passing through a diffraction grating. Unlike the far-field Fraunhofer diffraction theory, Fresnel diffraction accounts for the wavefront curvature, which allows calculation of the relative phase of interfering waves. Similarly, when detecting the diffracted irradiance pattern within a few integer multiples of the wavelength with a photosensor or another imaging device of the same dimensional order as the grating, higher-order diffractive effects tend to be limited simply by spatial sampling.
[0182] To detect the diffracted wavefront 36 in the near field, the present techniques may involve maintaining a sufficiently small separation distance 72, or pedestal height, between the top surface 48 of the diffraction grating 28, where refractive index modulation pattern 32 is formed and diffraction occurs, and the light-receiving surface 68 of the underlying pixel array 38, where the diffracted wavefront 36 is detected. In some implementations, this involves selecting the separation distance 72 to be less than about twenty times a center wavelength of the optical wavefront 26. In some implementations, the separation distance 72 may range between about 0.2 μm and about 20 μm, for example, between 0.5 μm and about 8 μm if the center wavelength of the optical wavefront lies in the visible range.
[0183] The Talbot effect is a near-field diffraction effect in which plane waves incident on a periodic structure, such as a diffraction grating, produce self-images, called Talbot images, of the periodic structure at regular distances behind the periodic structure. The regular distance at which self-images of the periodic structure are observed due to interference is called the Talbot length z.sub.T. In the case of a diffraction grating having a grating period g, the Talbot length z.sub.T may be expressed as follows:

z.sub.T=λ/[1−√(1−λ.sup.2/g.sup.2)]

where λ is the wavelength of the incident light. This expression simplifies to the following expression when the grating period g is much larger than the wavelength λ:

z.sub.T≈2g.sup.2/λ
Other self-images are observed at integer multiples of the half Talbot length (nz.sub.T/2). These additional self-images are either in phase or out of phase by half of the grating period (i.e., by g/2) with respect to the self-image observed at z.sub.T, depending on whether n is even or odd. Further sub-images can also be observed at smaller fractional values of the Talbot length.
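The Talbot length and its large-period approximation can be checked numerically. The sketch below uses illustrative values (g = 2 μm, λ = 532 nm) that are not tied to any particular embodiment:

```python
import math

def talbot_length_exact(wavelength_um, period_um):
    """Exact Talbot length: z_T = lambda / (1 - sqrt(1 - lambda^2 / g^2))."""
    return wavelength_um / (1.0 - math.sqrt(1.0 - (wavelength_um / period_um) ** 2))

def talbot_length_approx(wavelength_um, period_um):
    """Approximation z_T ~ 2 g^2 / lambda, valid when g >> lambda."""
    return 2.0 * period_um ** 2 / wavelength_um

lam, g = 0.532, 2.0  # um; illustrative values
print(talbot_length_exact(lam, g))   # ~14.8 um
print(talbot_length_approx(lam, g))  # ~15.0 um
```

Even at g ≈ 4λ, the simplified expression lies within a few percent of the exact value, which is why the 2g²/λ scaling is a useful design rule in the visible range.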
[0184] It is to be noted that these Talbot self-images are observed in the case of amplitude gratings. In the case of phase gratings, such as in
[0185] Returning to
[0186] As noted above, the term “match” and its derivatives are meant to encompass not only an exact or identical match or concordance between the profile and the period of the intensity pattern 70 of the detected diffracted wavefront 36 and the profile and the period of the refractive index modulation pattern 32 of the diffraction grating 28, but also but also a substantial, approximate, sufficient, or subjective match. It is also to be noted that the expression “spatially modulated according to the grating period” is meant to encompass both “at the grating period”, to describe implementations where the spatial period 74 of the intensity pattern 70 is substantially equal to the grating period 34, as in
[0187] Another feature of near-field diffraction by a periodic diffraction grating 28, is that upon varying the angle of incidence 76 of the optical wavefront 26 impinging on the diffraction grating 28, the intensity pattern 70 of the diffracted wavefront 36 shifts laterally (i.e., along the grating axis 30), but substantially retains its period 74 and shape, as may be seen from the comparison between the solid and dashed diffraction patterns in
[0188] In some implementations, the separation distance 72 between the diffraction grating 28 and the pixel array 38 may be selected to ensure that the lateral shift experienced by the intensity pattern 70 of the diffracted wavefront 36 remains less than the grating period 34 as the angle of incidence 76 of the optical wavefront 26 is varied across the range of possible angles of incidence on the pixel array 38 as defined by the numerical aperture of the light field imaging device 20. Otherwise, the depth sensitivity of the light field imaging device 20 may be degraded or otherwise adversely affected.
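The condition stated above can be turned into a rough numerical-aperture bound. The sketch below uses a simplified geometric model in which the lateral shift at the detection plane is approximated as d·tan θ, ignoring refraction within the grating substrate; the numerical values are assumed for illustration:

```python
import math

def max_numerical_aperture(separation_um, grating_period_um):
    """Largest NA for which the geometric lateral shift d * tan(theta)
    of the near-field intensity pattern stays below one grating period.
    Simplified estimate: refraction inside the grating substrate and
    diffraction-order walk-off are ignored."""
    theta_max = math.atan(grating_period_um / separation_um)
    return math.sin(theta_max)

# Illustrative: 3.76 um separation distance and 2 um grating period.
na = max_numerical_aperture(3.76, 2.0)
print(f"maximum NA ~ {na:.2f}")  # ~0.47
```

Under this crude model, doubling the separation distance at a fixed grating period roughly halves the usable numerical aperture, consistent with the trade-off discussed later in the description.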
[0189] It is appreciated that it is generally not possible to determine with certainty whether variations in a single pixel response are due to changes in intensity only, changes in angle of incidence only, or changes in both intensity and angle of incidence. Thus, the angle-dependent response of a single pixel cannot, in general, be used to unambiguously recover both the intensity and the angle of incidence of the incoming wavefront. Rather, the difference between the pixel responses from at least a pair or group of pixels configured to sample different portions of the intensity pattern of the diffracted wavefront may be used to resolve the ambiguity between intensity and angle of incidence. Accordingly, in the present techniques, the plurality of light-sensitive pixels includes groups or pairs of adjacent pixels, where the adjacent pixels in each group or pair have different pixel responses as a function of the angle of incidence. In the present description, the term “adjacent pixels” generally refers to two or more pixels whose separation along a line parallel to the grating axis of the overlying diffraction grating is equal to the pixel pitch along the grating axis. It is to be noted, however, that adjacent pixels may, but need not, be arranged along a same line parallel to the grating axis. For example, two nearest-neighbor green pixels in a Bayer pattern may be considered to form a pair of adjacent pixels as defined herein (see, e.g., green pixels 40a and 40b in
[0190] It is to be noted that upon being optically coupled to an underlying pixel array 38, the diffraction grating 28 convolves phase-dependent information with a standard 2D image such that the intensity pattern 70 of the detected diffracted wavefront 36 may generally be expressed as a modulated function I˜I.sub.mod(depth info)×I.sub.base(2D image) including a modulating component I.sub.mod and a base component I.sub.base. The base component I.sub.base represents the non-phase-dependent optical wavefront that would be detected by the pixel array 38 in the absence of the diffraction grating 28. That is, detecting the base component I.sub.base alone would allow a conventional 2D image of the scene 22 to be obtained. Meanwhile, the modulating component I.sub.mod is a direct result of the phase of the incident optical wavefront 26 such that any edge or slight difference in incidence angle will manifest itself as a periodic electrical response spatially sampled across the pixel array 38. The amplitude of the modulating component I.sub.mod is generally, but not necessarily, small compared to the base component I.sub.base (e.g., the ratio of I.sub.mod to I.sub.base may typically range from about 0.1 to about 0.3). It is appreciated that the sensitivity to the angle of incidence 76 of the optical wavefront 26, and therefore the angular resolution of the light field imaging device 20, will generally depend on the specific design of the diffraction grating 28.
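The modulated-function form described above can be illustrated with a toy one-dimensional model. In the sketch below, two adjacent pixels are assumed to sample the modulating component in antiphase (the complementary-sampling limit); the base level, modulation depth, and sinusoidal form are all illustrative assumptions rather than properties of any particular embodiment:

```python
import math

def pixel_pair_responses(i_base, mod_depth, phase):
    """Toy model of two adjacent pixel responses: each pixel sees the base
    component scaled by a modulating term, sampled half a period apart."""
    p0 = i_base * (1.0 + mod_depth * math.cos(phase))
    p1 = i_base * (1.0 + mod_depth * math.cos(phase + math.pi))
    return p0, p1

# A modulation depth of 0.2 sits in the 0.1-0.3 ratio range cited above.
p0, p1 = pixel_pair_responses(i_base=100.0, mod_depth=0.2, phase=0.7)
base_estimate = (p0 + p1) / 2.0    # recovers I_base (the 2D image component)
angle_cue = (p0 - p1) / (p0 + p1)  # tracks mod_depth * cos(phase), an angle-dependent cue
print(base_estimate, angle_cue)
```

In this toy model, summing the pair cancels the modulation and yields the conventional 2D image, while the normalized difference isolates the angle-dependent component.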
[0191] Grating designs in which the grating period is twice as large as the pixel pitch are disclosed in co-assigned international patent application PCT/CA2017/050686, published as WO 2017/210781 A1, the contents of which are incorporated herein by reference in their entirety. In these designs, adjacent pixels of the pixel array are configured to sample complementary portions of the diffracted wavefront that are phase-shifted by half a grating period relative to each other. In such a configuration, the differential response between the pixel responses of two adjacent pixels may achieve, in principle, a maximum modulation depth of substantially 100% between a first angle of incidence, where the maximum of the diffracted intensity pattern is centered on one pixel and the minimum of the diffracted intensity pattern is centered on the other pixel (peak modulation level), and a second angle of incidence, where either the maximum or the minimum of the diffracted intensity pattern is centered on the transition between the pixels (unmodulated level).
[0192] In contrast, the present techniques relate to grating designs in which the grating period is equal to or smaller than the pixel pitch along the grating axis. For example, in the embodiments depicted in
[0193] As noted above, phase gratings may be used to generate near-field intensity patterns similar to Talbot self-images, which may be referred to as Lohmann images, at observation planes that are located at distances that scale as g.sup.2/λ. This means that in
[0194] Beyond a certain height, manufacturing a grating substrate can become challenging. In some implementations, the challenge is related to the fact that manufacturing a grating substrate involves a layering process in which the higher the number of layers (i.e., the thicker the grating substrate), the harder it is to maintain flatness uniformity and/or to avoid or at least control void formation, delamination, and other factors affecting yield and reliability.
[0195] Moreover, when the separation distance 72 between the diffraction grating 28 and the pixel array 38 increases, so generally does the lateral shift experienced by the intensity pattern 70 of the diffracted wavefront 36 for a given variation of angle of incidence 76 of the optical wavefront 26. As noted above, it may be desirable or required in some implementations to achieve a condition that this lateral shift remain less than the grating period 34 across the range of possible angles of incidence defined by the numerical aperture of the light field imaging device 20. Because the separation distance 72 scales up with the square of the grating period 34, the range of possible angles of incidence for which this condition may be satisfied, which in turn defines the largest achievable value for the numerical aperture of the light field imaging device 20, generally becomes smaller as the separation distance 72 increases. It is appreciated that a limited numerical aperture may be undesirable or disadvantageous in some applications.
[0196] Thus, in some implementations, for example, when the pixel pitch exceeds a certain value, it may be desirable, advantageous, or required to limit the separation distance 72 between the diffraction grating 28 and the pixel array 38. In the present techniques, it has been found that one way of achieving this is by using designs where the pixel pitch is equal to or greater than the grating period. For example, returning to the case above where p=3 μm, if one uses g=2 μm instead of 6 μm, which corresponds to p/g=3/2>1 rather than p/g=1/2<1, then the separation distance 72 may be kept at 3.76 μm, as for the case where p=1 μm, g=2 μm, and p/g=1/2.
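Since the near-field observation planes scale as g.sup.2/λ, the trade-off described above can be sketched numerically. The wavelength below (0.532 μm) and the choice of the quarter-Talbot plane are illustrative assumptions, not values from the disclosure; with those assumptions, z.sub.T/4 for g=2 μm happens to reproduce the 3.76 μm separation distance quoted above:

```python
def talbot_distance_um(g_um, wavelength_um):
    """Talbot distance z_T = 2*g^2/lambda; the near-field observation
    planes discussed above scale as g^2/lambda (e.g., z_T/4, 3*z_T/4)."""
    return 2.0 * g_um**2 / wavelength_um

wavelength = 0.532  # assumed illustrative (green) wavelength, in um
for g in (2.0, 6.0):
    z_t = talbot_distance_um(g, wavelength)
    print(f"g = {g} um: z_T = {z_t:.2f} um, z_T/4 = {z_t / 4:.2f} um")
```

Tripling the grating period multiplies the candidate separation distances by nine, which is why keeping g small while letting p exceed g helps limit the grating-to-pixel distance.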
[0197] In designs where p≥g, not all values of p/g necessarily fulfill the condition that the pixel array includes angle-sensitive groups or pairs of adjacent pixels where the pixels in each group or pair have different pixel responses as a function of the angle of incidence. For example, when the adjacent pixels in a given pixel group or pair all have identical pixel dimensions along the grating axis, values of p/g=n, where n is a positive integer, generally do not fulfill this condition, since in this case the intensity measured by each adjacent pixel does not vary with the angle of incidence of the incoming wavefront because a whole number of grating periods of the diffraction grating extend above each pixel. Thus, as the intensity pattern of the diffracted wavefront shifts laterally above a particular pixel as a function of angle of incidence, any amount of light that is lost to its neighboring pixel on one side is recovered in the same amount from its other neighboring pixel on the other side. In contrast, values of p/g≥1 that are different from a positive integer, as is the case in
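The integer-ratio cancellation described in this paragraph can be checked numerically. Assuming, for illustration only, a sinusoidal g-periodic intensity pattern, integrating it over a pixel spanning a whole number of periods yields a signal that is independent of the lateral shift, whereas p/g=3/2 does not:

```python
import numpy as np

def pixel_signal(p, g, shift, x0=0.0, m=0.2, n=100_000):
    """Midpoint-rule integral over one pixel of width p (starting at x0)
    of an assumed g-periodic intensity 1 + m*cos(2*pi*(x - shift)/g)."""
    x = x0 + (np.arange(n) + 0.5) * (p / n)
    return (1.0 + m * np.cos(2.0 * np.pi * (x - shift) / g)).mean() * p

g = 2.0
shifts = (0.0, 0.5, 1.0)  # lateral pattern shifts standing in for angle changes
# p/g = 2 (a positive integer): the pixel signal does not vary with the shift.
s_int = [pixel_signal(4.0, g, s) for s in shifts]
# p/g = 3/2: the same shifts change the pixel signal, i.e., angle sensitivity.
s_frac = [pixel_signal(3.0, g, s) for s in shifts]
print(max(s_int) - min(s_int), max(s_frac) - min(s_frac))
```

The first spread is numerically zero (whole periods integrate to the same value regardless of shift), while the second is on the order of the modulation amplitude.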
[0198] Referring now to
[0199] Referring still to
[0200] Returning to
[0201] Depending on the application, the processor 96 may include a single processing entity or a plurality of processing entities. Such processing entities may be physically located within the same device, or the processor 96 may represent processing functionality of a plurality of devices operating in coordination. Accordingly, the processor 96 may include or be part of one or more of a computer; a microprocessor; a microcontroller; a coprocessor; a central processing unit (CPU); an image signal processor (ISP); a digital signal processor (DSP) running on a system on a chip (SoC); a dedicated graphics processing unit (GPU); a special-purpose programmable logic device embodied in a hardware device such as, for example, a field-programmable gate array (FPGA) or an application-specific integrated circuit (ASIC); a digital processor; an analog processor; a digital circuit designed to process information; an analog circuit designed to process information; a state machine; and/or other mechanisms configured to process information and operate collectively as a processor. In particular, the term “processor” should not be construed as being limited to a single processor or a single controller, and accordingly, any known processor or controller architecture may be used. Furthermore, depending on the application, the acquisition and processing of the light field image data may be performed by the same device or by separate devices.
[0202] In some implementations, the pixel responses belonging to the same pixel bank or to several pixel banks may be combined, for example, using appropriate weighting factors, prior to computing the summed and differential pixel responses described above. In the present description, the term “pixel bank” refers to a group of light-sensitive pixels of the pixel array that are arranged along a line which is perpendicular to the grating axis of the overlying diffraction grating. That is, two nearest-neighbor pixel banks are separated from each other by a distance corresponding to the pixel pitch along the grating axis. For example, in
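The bank-wise combination described in this paragraph can be sketched as a weighted column sum. The array shape and the weighting factors below are hypothetical, for illustration only:

```python
import numpy as np

def combine_bank_responses(responses, weights=None):
    """Combine the pixel responses within each pixel bank (a line of
    pixels perpendicular to the grating axis) into a single response per
    bank, optionally applying weighting factors along the bank.
    responses: array of shape (pixels_per_bank, n_banks)."""
    r = np.asarray(responses, dtype=float)
    if weights is None:
        return r.sum(axis=0)
    w = np.asarray(weights, dtype=float)
    return (w[:, np.newaxis] * r).sum(axis=0)

# Hypothetical 3-pixel banks for 4 banks along the grating axis:
resp = np.array([[1.0, 2.0, 3.0, 4.0],
                 [1.0, 2.0, 3.0, 4.0],
                 [1.0, 2.0, 3.0, 4.0]])
print(combine_bank_responses(resp).tolist())                          # -> [3.0, 6.0, 9.0, 12.0]
print(combine_bank_responses(resp, weights=[0.25, 0.5, 0.25]).tolist())  # -> [1.0, 2.0, 3.0, 4.0]
```

The combined per-bank responses can then feed the summed and differential computations described above.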
[0203] It is appreciated that depending on the application, the p/g ratio of the pixel pitch 62 of the pixel array 38 along the grating axis 30 to the grating period 34 of the diffraction grating 28 may take several values and fulfill different conditions. Non-limiting examples of such conditions include, to name a few, p/g>1 but different from a positive integer; p/g=(2n+1)/2, where n is a positive integer; p/g=n/m, where n and m are positive integers larger than two, and n is larger than m. Some further non-limiting embodiments will now be described with respect to
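The non-integer-ratio condition for equal-width adjacent pixels can be captured in a small helper; the predicate and its name are hypothetical, for illustration only:

```python
from fractions import Fraction

def is_angle_sensitive_ratio(p, g):
    """True when p/g >= 1 but is not a positive integer, i.e., when an
    equal-width adjacent-pixel group may retain angle sensitivity; an
    integer ratio places a whole number of grating periods above each
    pixel and cancels the angular dependence."""
    ratio = (Fraction(p).limit_denominator(10**6)
             / Fraction(g).limit_denominator(10**6))
    return ratio >= 1 and ratio.denominator != 1

# For g = 2: p/g = 3/2 and 5/2 qualify; p/g = 1, 2, 3 (integers) do not.
print([is_angle_sensitive_ratio(p, 2.0) for p in (3.0, 5.0, 2.0, 4.0, 6.0)])
# -> [True, True, False, False, False]
```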
[0204] Referring to
[0205] Referring to
[0206] Referring to
[0207] Referring to
[0208] Referring to
[0209] In some implementations, for example, in architectures with high chief ray angle optical systems, the diffraction grating may be designed to follow the designed chief ray angle offset of an overlying microlens array relative to the pixel array such that each corresponding chief ray will pass through the center of the intended grating feature and its subsequent microlens. Such a configuration may ensure appropriate phase offsets for highly constrained optical systems. This means that, in some embodiments, the degree of vertical alignment between the features of the diffraction grating (e.g., its ridges and grooves) and the underlying light-sensitive pixels may change as a function of position within the pixel array, for example, as one goes from the center to the edge of the pixel array, to accommodate a predetermined chief-ray-angle offset. For example, depending on its position within the pixel array, a given light-sensitive pixel may be positioned under and in vertical alignment with a center of a ridge, a center of a groove, a transition between a ridge and a groove, or some intermediate position of a corresponding overlying diffraction grating. This is illustrated in the embodiment of
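One common geometric model for such a chief-ray-angle-tracking shift (an assumption here, not taken from the disclosure) offsets each grating feature laterally by roughly the grating-to-pixel height times the tangent of the local chief ray angle:

```python
import math

def grating_feature_offset_um(cra_deg, height_um):
    """Simple geometric model (an assumption, not from the disclosure):
    to keep a chief ray arriving at angle cra_deg passing through the
    center of its grating feature, shift that feature laterally by about
    height * tan(CRA), where height is the grating-to-pixel distance."""
    return height_um * math.tan(math.radians(cra_deg))

# The required offset grows from the array center (CRA ~ 0) to the edge;
# the 3.76 um height is the illustrative separation distance used earlier.
for cra in (0.0, 10.0, 25.0):
    print(f"CRA {cra:4.1f} deg -> offset {grating_feature_offset_um(cra, 3.76):.2f} um")
```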
[0210] Referring to
[0211]
[0212] As in
I.sub.R˜I.sub.mod,R(depth info)×I.sub.base,R(2D image), (5)
I.sub.G˜I.sub.mod,G(depth info)×I.sub.base,G(2D image), (6)
I.sub.B˜I.sub.mod,B(depth info)×I.sub.base,B(2D image), (7)
In
[0213] As in
[0214] In the embodiments described so far, the diffraction grating assembly was depicted as including a single diffraction grating. Referring now to
[0215] The plurality of diffraction gratings 28 may include multiple sets 80a, 80b of diffraction gratings 28a, 28b, where the grating axes 30a, 30b of the diffraction gratings 28a, 28b of different ones of the sets 80a, 80b have different orientations. For example, in the embodiment of
[0216] It is to be noted that besides having orthogonal grating axis orientations, the two sets 80a, 80b of diffraction gratings 28a, 28b depicted in the embodiment of
[0217] It is appreciated that providing a diffraction grating assembly with diffraction gratings having different grating axis orientations may be advantageous or required in some implementations, since diffraction occurs along the grating axis of an individual diffraction grating. This means that when the diffraction grating assembly includes a single grating orientation, light coming from objects of the scene that extend perpendicularly to this single grating orientation will generally not produce a diffracted wavefront from which angle-dependent depth information may be extracted. In some implementations, providing two sets of orthogonally oriented gratings (e.g., horizontally and vertically oriented gratings, such as in
[0218] It is also appreciated that in the embodiment depicted in
[0219] Referring to
[0220] Referring to
[0221] In
[0222] In some implementations, the light field imaging device may include wavefront conditioning optics in front of the diffraction grating. The wavefront conditioning optics may be configured to collect, direct, transmit, reflect, refract, disperse, diffract, collimate, focus or otherwise act on the optical wavefront incident from the scene prior to it reaching the diffraction grating assembly. The wavefront conditioning optics may include lenses, mirrors, filters, optical fibers, and any other suitable reflective, refractive and/or diffractive optical components, and the like. In some implementations, the wavefront conditioning optics may include focusing optics positioned and configured to modify the incident wavefront in such a manner that it may be sampled by the light field imaging device.
[0223] Referring to
[0224] For exemplary purposes, the incident optical wavefront 26 in
[0225] In the case of a monochromatic plane optical wavefront impinging on a focusing lens such as shown in
[0226] Referring now to
[0227] Referring to
[0228] Referring to
[0229] In both
[0230] Referring still to both
[0231] It is to be noted that the diffraction grating assembly 24, the pixel array 38, the color filter array 42 and the microlens array 64 of the frontside-illuminated light field imaging device 20 of
Diffraction Grating Assembly Implementations
[0232] Referring to
[0233] The diffraction grating 28 may be a binary phase grating and the refractive index modulation pattern 32 may include alternating ridges 52 and grooves 54. The diffraction grating 28 is configured to diffract an optical wavefront 26 incident from the scene 22 and generate, in a near-field diffraction region or plane 98, a diffracted wavefront 36 having an intensity pattern that is spatially modulated according to the grating period 34 and that shifts laterally along the grating axis 30 as a function of an angle of incidence of the optical wavefront 26. The diffraction grating assembly 24 is configured to be disposed over the pixel array 38 with the light-sensitive pixels 40 positioned at the near-field diffraction region or plane 98 for detection of the diffracted wavefront 36 as light field image data. In order to enable the image sensor 94 to become sensitive to the angle of incidence of the optical wavefront 26 by disposing thereon the diffraction grating assembly 24 described herein, the light-sensitive pixels 40 include laterally adjacent pixels that generate different angle-dependent pixel responses as a function of angle of incidence.
[0234] In color imaging applications, the diffraction grating assembly 24 may be configured to be disposed over a color filter array 42 of the image sensor 94. As described above, the color filter array 42 is disposed over the pixel array 38 and configured to spatially and spectrally filter the diffracted wavefront 36 prior to its detection by the plurality of light-sensitive pixels 40.
[0235] Depending on the application, the diffraction grating assembly 24 may include a single diffraction grating 28 or a plurality of diffraction gratings 28 arranged in a 2D grating array or tile disposed over the pixel array 38, and, optionally, the color filter array 42.
Method Implementations
[0236] In accordance with another aspect, the present description also relates to various light field imaging methods, including a method of capturing light field image data about a scene and a method of providing 3D or light field imaging capabilities to a conventional 2D image sensor. These methods may be performed with light field imaging devices and diffraction grating assemblies such as those described above, or with other similar devices and assemblies.
[0237] Referring to
[0238] The method 200 may also include a step 204 of spatio-spectrally filtering the diffracted wavefront with a color filter array to produce a filtered wavefront. It is to be noted that this step 204 is optional and may be omitted in some implementations, for example, in monochrome imaging applications.
[0239] The method 200 may further include a step 206 of detecting, as the light field image data, the spatio-spectrally filtered diffracted wavefront using a pixel array positioned under the color filter array, in the near-field diffraction plane. The pixel array may include a plurality of light-sensitive pixels and a pixel pitch along the grating axis that is equal to or larger than the grating period. In some implementations, the ratio of the pixel pitch to the grating period may be equal to (2n+1)/2, where n is a positive integer. For example, the ratio of the pixel pitch to the grating period may be equal to 3/2 (for n=1) or 5/2 (for n=2). Of course, the ratio of the pixel pitch to the grating period may take other values, as discussed in detail above. It may be appreciated that when the spatio-spectral filtering step 204 is omitted, no color filter array is disposed between the diffraction grating assembly and the pixel array, and the detecting step 206 involves the direct detection of the diffracted wavefront with the plurality of light-sensitive pixels. The detecting step 206 may include sampling, by each light-sensitive pixel, a respective portion of the diffracted wavefront to generate a corresponding pixel response, where the light-sensitive pixels include groups or pairs of adjacent pixels. The adjacent pixels in each group or pair may have different pixel responses as a function of the angle of incidence. In order to detect the diffracted wavefront in a near-field diffraction plane, the method 200 may include a step of setting a separation distance between the refractive index modulation pattern of the diffraction grating and a light-receiving surface of the pixel array to be less than about twenty times a center wavelength of the optical wavefront or to range between about 0.2 μm and about 20 μm.
In some embodiments, the separation distance may correspond to non-integer multiples of the half-Talbot distance, z.sub.T/2, for example, z.sub.T/4 and 3z.sub.T/4.
[0240] In some implementations, the method 200 may include a step of computing, for each pair of adjacent pixels, a sum of the pixel responses of the adjacent pixels, thereby obtaining a plurality of summed pixel responses, and a step of generating a 2D image of the scene from the plurality of summed pixel responses. Additionally, or alternatively, the method may include a step of computing, for each pair of adjacent pixels, a difference between the pixel responses of the adjacent pixels, thereby obtaining a plurality of differential pixel responses, and a step of generating a depth image of the scene from the plurality of differential pixel responses.
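The summed and differential computations in this step can be sketched on synthetic pair responses (the numerical values below are illustrative assumptions, not measured data):

```python
import numpy as np

def sum_and_difference(first_pixels, second_pixels):
    """Per adjacent-pixel pair: the sum approximates the conventional 2D
    image (the angle-dependent modulation largely cancels), while the
    difference retains the modulation used for the depth image."""
    a = np.asarray(first_pixels, dtype=float)
    b = np.asarray(second_pixels, dtype=float)
    return a + b, a - b

base = np.array([10.0, 12.0, 8.0])   # illustrative base (2D-image) levels
mod = np.array([0.5, -0.25, 0.25])   # illustrative angle-dependent modulation
summed, diff = sum_and_difference(base + mod, base - mod)
print(summed.tolist())  # -> [20.0, 24.0, 16.0]
print(diff.tolist())    # -> [1.0, -0.5, 0.5]
```

The sum recovers twice the base levels with the modulation cancelled, while the difference isolates twice the modulation, consistent with the complementary 2D-image and depth-image outputs described above.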
[0241] Referring now to
[0242] The method 300 may also include a step 304 of disposing the diffraction grating assembly in front of the image sensor with the grating axis parallel to the pixel axis and the light-sensitive pixels located at the near-field diffraction plane for detection of the diffracted wavefront and including laterally adjacent pixels configured to generate different pixel responses as a function of the angle of incidence, as described above. In some implementations, the disposing step 304 may include orienting the grating axis either parallel to one of two orthogonal pixel axes of the pixel array or oblique (e.g., at 45°) to the pixel axes. In order for the pixel array to detect the diffracted wavefront in a near-field diffraction region, the disposing step 304 may include positioning the diffraction grating assembly at a separation distance from the pixel array that ranges from about 0.2 μm to about 20 μm, or positioning the diffraction grating assembly at a separation distance from the pixel array selected such that an optical path length of the diffracted wavefront prior to being detected with the light-sensitive pixels is less than about twenty times a center wavelength of the optical wavefront. In some embodiments, the separation distance may correspond to non-integer multiples of the half-Talbot distance, z.sub.T/2, for example, z.sub.T/4 and 3z.sub.T/4.
[0243] Of course, numerous modifications could be made to the embodiments described above without departing from the scope of the appended claims.