SYSTEM FOR IMAGING A SCENE
20230266600 · 2023-08-24
Assignee
Inventors
CPC classification: G02B27/4272 (PHYSICS); G02B5/1814 (PHYSICS); H04N23/45 (ELECTRICITY); H04N23/741 (ELECTRICITY); G02B6/00 (PHYSICS)
International classification
Abstract
A system for imaging a scene includes a capturing unit to capture two-dimensional and/or three-dimensional information of the scene, the information including light waves from the scene; a first diffractive optical element to receive the light waves from the capturing unit and to couple the light waves into an optical waveguide; the optical waveguide to forward the light waves coupled in by the first diffractive optical element; and a second diffractive optical element to couple the light waves forwarded by the optical waveguide out of the optical waveguide. The system additionally includes a first image sensor and at least one second image sensor to capture the light waves coupled out of the optical waveguide and to generate first image data and second image data therefrom. The first image sensor and the second image sensor are in a region associated with the second diffractive optical element.
Claims
1-13. (canceled)
14. A system for imaging a scene, the system comprising: a capturing unit configured to capture two-dimensional and/or three-dimensional information of the scene, the information comprising light waves from the scene; a first diffractive optical element configured to receive the light waves from the capturing unit; an optical waveguide configured to forward the light waves received from the first diffractive optical element and to couple the light waves into the optical waveguide; a second diffractive optical element configured to couple the light waves forwarded by the optical waveguide out of the optical waveguide; and a first image sensor and at least one second image sensor, which are configured to capture the light waves coupled out and to generate first image data and second image data therefrom, the first image sensor and the second image sensor being in a region associated with the second diffractive optical element.
15. The system according to claim 14, wherein the region associated with the second diffractive optical element comprises an area into which the second diffractive optical element couples the light waves out, and the first image sensor and the at least one second image sensor are within the area.
16. The system according to claim 14, wherein the size of the second diffractive optical element determines the size of the area.
17. The system according to claim 15, wherein the size of the second diffractive optical element determines the size of the area.
18. The system according to claim 14, wherein the first diffractive optical element comprises a first holographic optical element and the second diffractive optical element comprises a second holographic optical element.
19. The system according to claim 15, wherein the first diffractive optical element comprises a first holographic optical element and the second diffractive optical element comprises a second holographic optical element.
20. The system according to claim 16, wherein the first diffractive optical element comprises a first holographic optical element and the second diffractive optical element comprises a second holographic optical element.
21. The system according to claim 17, wherein the first diffractive optical element comprises a first holographic optical element and the second diffractive optical element comprises a second holographic optical element.
22. The system according to claim 18, wherein the first holographic optical element and the second holographic optical element comprise volume holograms which couple the light waves into and out of the waveguide, respectively, corresponding to their wavelengths.
23. The system according to claim 18, wherein the second holographic optical element comprises further optical functions for image correction.
24. The system according to claim 22, wherein the second holographic optical element comprises further optical functions for image correction.
25. The system according to claim 18, wherein the first holographic optical element and the second holographic optical element comprise a photosensitive material including photopolymer.
26. The system according to claim 22, wherein the first holographic optical element and the second holographic optical element comprise a photosensitive material including photopolymer.
27. The system according to claim 23, wherein the first holographic optical element and the second holographic optical element comprise a photosensitive material including photopolymer.
28. The system according to claim 14, wherein the waveguide comprises a prism.
29. The system according to claim 14, wherein the first image sensor has a first sensitivity and/or a first exposure time, and the second image sensor has a second sensitivity different from the first image sensor and/or a second exposure time different from the first image sensor.
30. The system according to claim 14, wherein the first image sensor and the second image sensor convert the photons associated with the light waves into electrical signals to generate the first image data and the second image data therefrom.
31. The system according to claim 30, wherein the first image sensor and the second image sensor comprise a CMOS sensor and/or a CCD sensor.
32. The system according to claim 14, further comprising a processing unit to process the image data generated by the first image sensor and the second image sensor and to generate an image of the scene (AS) therefrom.
33. A holographic camera comprising: a system for imaging a scene, the system comprising: a capturing unit configured to capture two-dimensional and/or three-dimensional information of the scene, the information comprising light waves from the scene; a first diffractive optical element configured to receive the light waves from the capturing unit; an optical waveguide configured to forward the light waves received from the first diffractive optical element and to couple the light waves into the optical waveguide; a second diffractive optical element configured to couple the light waves forwarded by the optical waveguide out of the optical waveguide; and a first image sensor and at least one second image sensor, which are configured to capture the light waves coupled out and to generate first image data and second image data therefrom, the first image sensor and the second image sensor being in a region associated with the second diffractive optical element.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0025] These and other aspects and advantages will become more apparent and more readily appreciated from the following description of the exemplary embodiments, taken in conjunction with the accompanying drawings of which:
[0026]
[0027]
[0028]
[0029]
DETAILED DESCRIPTION
[0030] Reference will now be made in detail to the embodiments, examples of which are illustrated in the accompanying drawings, wherein like reference numerals refer to like elements throughout.
[0031] In
[0032] The capturing unit 12, 22, 32 captures two-dimensional and/or three-dimensional information of the scene S, wherein the information includes light waves LW from the scene S. The capturing unit 12, 22, 32, as illustrated in
[0033] The first diffractive optical element 14, 24, 34 receives the light waves LW from the capturing unit 12, 22, 32 and couples them into the waveguide 16, 26, 36. The waveguide 16, 26, 36 forwards the light waves LW received from the first diffractive optical element 14, 24, 34 to the second diffractive optical element 14, 24, 34 by total internal reflection. The second diffractive optical element 14, 24, 34 then couples the light waves LW forwarded by the optical waveguide 16, 26, 36 out of the optical waveguide 16, 26, 36.
[0034] This is illustrated by the corresponding arrows in
[0035] Various arrangements of the first and the second diffractive optical element 14, 24, 34 in the system 10, 20, 30 are possible. The diffractive optical elements 14, 24, 34 can for example be arranged within the waveguide 16, 36, which is illustrated in
[0036] The light waves LW coupled out are then captured by the first image sensor 18, 28, 38 and the at least one second image sensor 18, 28, 38 and first image data BD and second image data BD are generated therefrom. This is described in detail with reference to
[0037] The arrangement in the region EB allows the image sensors 18, 28, 38 to capture the light waves LW from the scene S from the same perspective. Thus, the generated image data can be combined with each other or averaged to obtain a picture of the scene AS that is reduced in noise.
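The noise benefit of capturing the same perspective with several sensors can be sketched numerically. The following is a minimal illustration, not part of the described system; the array size and noise level are assumed values chosen only for demonstration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative ground-truth scene and per-sensor readout noise (assumed values).
scene = rng.uniform(0.0, 1.0, size=(64, 64))
sigma = 0.05  # standard deviation of the sensor noise

# Two sensors capture the scene from the same perspective (same signal,
# independent noise), as enabled by the shared outcoupling region EB.
frame_1 = scene + rng.normal(0.0, sigma, scene.shape)
frame_2 = scene + rng.normal(0.0, sigma, scene.shape)

# Averaging N independent readouts reduces the noise by roughly sqrt(N).
averaged = (frame_1 + frame_2) / 2.0

noise_single = np.std(frame_1 - scene)
noise_avg = np.std(averaged - scene)
print(noise_single, noise_avg)  # the averaged noise is roughly sigma / sqrt(2)
```

Because both sensors see the scene from the same perspective, the frames can be averaged pixel by pixel without registration.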
[0038] In an example, the region EB associated with the second diffractive optical element 14, 24, 34 can have an area, into which the second diffractive optical element 14, 24, 34 couples the light waves LW out, wherein the first image sensor 18, 28, 38 and the at least one second image sensor 18, 28, 38 are arranged within the area.
[0039] Herein, the size of the second diffractive optical element 14, 24, 34 determines the size of the area, i.e. the size of the outcoupling region, also referred to as “eyebox”. By suitable choice of the size of the second diffractive optical element 14, 24, 34, the size of the area can be set such that further image sensors 18, 28, 38 can be arranged within the area, which is for example illustrated in the example shown in
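The relation between eyebox size and the number of sensors that can share it can be sketched as a simple capacity calculation. The dimensions below are hypothetical examples, not values from the description:

```python
import math

def sensors_fitting(eyebox_width_mm: float, sensor_width_mm: float) -> int:
    """Number of image sensors that fit side by side within the
    outcoupling area ("eyebox") of the second diffractive optical element."""
    return math.floor(eyebox_width_mm / sensor_width_mm)

# Hypothetical dimensions: a 30 mm wide eyebox and 12 mm wide sensor packages.
print(sensors_fitting(30.0, 12.0))  # -> 2
```

Enlarging the second diffractive optical element enlarges the eyebox and so raises the number of sensors that can be arranged within it.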
[0040] The first diffractive optical element 14, 24, 34 can comprise a first holographic optical element and the second diffractive optical element 14, 24, 34 can comprise a second holographic optical element. Herein, the holographic optical elements have diffraction gratings, which are produced by holographic methods. These diffraction gratings are for example written or recorded into volume holograms.
[0041] The first holographic optical element and the second holographic optical element can comprise these volume holograms, which couple the light waves LW into and out of the waveguide 16, 26, 36, respectively, corresponding to their wavelengths. More precisely, the light waves LW are coupled in and out, respectively, according to the Bragg condition, i.e. the light waves LW have to have the correct wavelength (color) and the correct shape (beam direction, wavefront profile). A distinction is made between volume holograms with reflection gratings and volume holograms with transmission gratings. In transmission gratings, a part of the incident light waves LW passes through the grating and a part is diffracted. In reflection gratings, the light waves LW are diffracted at certain angles and wavelengths such that constructive interference arises. In
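The wavelength and angle selectivity invoked by the Bragg condition mentioned above can be stated explicitly; the symbols here are standard notation, not reference signs of this document:

\[
m\,\lambda = 2\,n\,\Lambda\,\sin\theta_B, \qquad m = 1, 2, \dots
\]

where $\lambda$ is the vacuum wavelength, $n$ the refractive index of the grating material, $\Lambda$ the period of the recorded grating and $\theta_B$ the Bragg angle measured from the grating planes. Only light waves that satisfy this condition for the recorded grating are diffracted efficiently and thus coupled into or out of the waveguide; other spectral and angular components pass the volume hologram largely unaffected.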
[0042] In
[0043] In
[0044] As mentioned above, the examples illustrated in
[0045] The second holographic optical element can additionally comprise further optical functions for image correction. These can for example be written into the volume hologram and reduce disturbances such as distortions upon coupling the light waves LW out.
[0046] The first holographic optical element and the second holographic optical element can comprise a photosensitive material, preferably photopolymer. Photopolymer has good diffraction efficiency and the advantage that it does not require additional chemical processing. Materials like dichromated gelatin, silver halide and/or the like are also possible.
[0047] The waveguide 16, 26, 36 can comprise a prism. An optical fiber is also possible.
[0048] The first image sensor 18, 28, 38 can have a first sensitivity and/or a first exposure time, and the second image sensor 18, 28, 38 can have a second sensitivity different from the first image sensor 18, 28, 38 and/or a second exposure time different from the first image sensor. Through the different sensitivities and/or exposure times, different regions of a scene S can be captured and combined with varying accuracy. Thus, the scene S can be imaged across its full brightness range. For example, the sensitivity and/or exposure time can be adjusted such that the first image sensor 18, 28, 38 captures bright regions of the scene S without overexposure and the second image sensor 18, 28, 38 captures dark regions of the scene S without underexposure. By combining the generated image data BD, for example by superposing image data captured with different exposure times, a high-contrast image of the scene S can be generated. For example, a color image with extended dynamic range can also be generated with the polychromatic arrangement shown in
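The combination of differently exposed captures into one high-contrast result can be sketched as a simple merge. The exposure times, the saturation threshold and the clipping model below are illustrative assumptions, not parameters from the description:

```python
import numpy as np

def merge_exposures(short_exp, long_exp, t_short, t_long, saturation=1.0):
    """Merge two captures of the same scene taken with different exposure
    times into one radiance estimate with extended dynamic range.

    Each capture is normalized by its exposure time; pixels that saturate
    in the long exposure are taken from the short exposure instead.
    """
    r_short = short_exp / t_short      # radiance estimate from short exposure
    r_long = long_exp / t_long         # radiance estimate from long exposure
    use_short = long_exp >= saturation # where the long exposure clipped
    return np.where(use_short, r_short, r_long)

# Illustrative scene radiance spanning a large brightness range.
radiance = np.array([0.02, 0.2, 2.0, 20.0])
t_short, t_long = 0.01, 0.5
short_exp = np.clip(radiance * t_short, 0.0, 1.0)  # dark regions underexposed
long_exp = np.clip(radiance * t_long, 0.0, 1.0)    # bright regions clipped
merged = merge_exposures(short_exp, long_exp, t_short, t_long)
print(merged)  # recovers the full radiance range
```

The long exposure supplies low-noise values for the dark regions, while the short exposure supplies unclipped values for the bright regions, so the merged result covers the whole brightness range of the scene.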
[0049] The first image sensor 18, 28, 38 and the second image sensor 18, 28, 38 can convert the photons associated with the light waves LW into electrical signals to generate the first image data BD and the second image data BD therefrom. This occurs by the photoelectric effect, wherein, presented in a simplified manner, photons are absorbed by the image sensors and electrons or charges are induced.
[0050] Herein, the first image sensor 18, 28, 38 and the second image sensor 18, 28, 38 can comprise a CMOS sensor and/or a CCD sensor. In CMOS (complementary metal-oxide-semiconductor) sensors, the charge in a pixel is converted into a voltage, which is amplified, quantized and output as a digital value. CCD (charge-coupled device) sensors are composed of a plurality of light-sensitive semiconductor elements arranged over an area. Each semiconductor element represents a photodetector, which converts the incident photons into electrons.
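The conversion chain described in the two paragraphs above (photons → electrons → voltage → digital value) can be modeled in a few lines. The quantum efficiency, conversion gain, full-scale voltage and ADC depth below are assumed, typical-order values, not specifications of the described sensors:

```python
def pixel_digital_value(photons: int,
                        quantum_efficiency: float = 0.6,
                        gain_uv_per_e: float = 50.0,
                        full_scale_uv: float = 500_000.0,
                        adc_bits: int = 12) -> int:
    """Simplified model of a single CMOS pixel readout.

    Photons are converted to electrons via the quantum efficiency
    (photoelectric effect), the accumulated charge is converted to a
    voltage by the conversion gain, and the voltage is quantized by
    the ADC into a digital value.
    """
    electrons = photons * quantum_efficiency  # photoelectric conversion
    voltage_uv = electrons * gain_uv_per_e    # charge-to-voltage conversion
    levels = 2 ** adc_bits - 1
    code = round(min(voltage_uv, full_scale_uv) / full_scale_uv * levels)
    return code

print(pixel_digital_value(5000))
```

The clamp to the full-scale voltage models saturation: once a pixel's charge capacity or voltage range is exhausted, brighter input no longer changes the output code, which is exactly the overexposure that the dual-sensor arrangement above mitigates.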
[0051] The system 10, 20, 30 can further comprise a processing unit VB, which processes the image data BD generated by the first image sensor 18, 28, 38 and the second image sensor 18, 28, 38 and generates a picture of the scene AS therefrom. In this context,
[0052] The picture of the scene AS is generated from the processed image data BD by suitable combination.
[0053] Further, a holographic camera can be equipped with the above described system 10, 20, 30 for imaging a scene S.
[0054] A description has been provided with particular reference to preferred embodiments thereof and examples, but it will be understood that variations and modifications can be effected within the spirit and scope of the claims which may include the phrase “at least one of A, B and C” as an alternative expression that means one or more of A, B and C may be used, contrary to the holding in Superguide v. DIRECTV, 358 F3d 870, 69 USPQ2d 1865 (Fed. Cir. 2004).