DEPTH SENSOR MODULE AND DEPTH SENSING METHOD
20170343675 · 2017-11-30
Inventors
CPC classification
G01S17/48
PHYSICS
G01S17/42
PHYSICS
G01S7/4913
PHYSICS
G01S17/894
PHYSICS
G01S17/36
PHYSICS
International classification
G01S17/48
PHYSICS
G01S17/02
PHYSICS
Abstract
The invention relates to a depth sensor module and a depth sensing method. The depth sensor module includes a light detector part and a light emitting part with at least two light sources spatially offset in the direction of the triangulation baseline. In some embodiments, the pixel field of the image sensor in the light detector part consists of time-of-flight pixels. Depth measurements derived by triangulation can be used to calibrate depth maps generated by the time-of-flight measurements.
Claims
1. A depth sensor module (2) comprising a light emitting part (30) for illuminating objects and a light detector part (20), the light emitting part (30) and the light detector part (20) being spatially offset in the direction of a triangulation baseline (15), wherein the light emitting part (30) comprises at least two light sources (32a, 32b) spatially offset in the direction of the triangulation baseline (15), wherein the light detector part (20) is configured to acquire light and to provide along the direction of the triangulation baseline (15) an intensity distribution of the acquired light.
2. The depth sensor module (2) according to claim 1, wherein the depth sensor module (2) is configured to perform a triangulation evaluation using the intensity distribution of the acquired light.
3. The depth sensor module (2) according to claim 1 or 2, wherein the depth sensor module (2) is configured to enable a triangulation evaluation by determining a zero-crossing point of a difference between two intensity distributions of acquired light originating from two of the at least two light sources (32a, 32b) of the light emitting part (30).
4. The depth sensor module (2) according to one of claims 1 to 3, wherein the at least two light sources (32a, 32b) are configured to be controlled individually.
5. The depth sensor module (2) according to one of claims 1 to 4, wherein the at least two light sources (32a, 32b) are comprised in a single die (32).
6. The depth sensor module (2) according to one of claims 1 to 5, wherein a first and a second of the at least two light sources (32a, 32b) are operable to emit a first light beam having a first light intensity distribution and a second light beam having a second light intensity distribution, respectively, wherein the first and second light intensity distributions are mutually symmetric with respect to a plane aligned perpendicularly to the triangulation baseline.
7. The depth sensor module (2) according to one of claims 1 to 6, wherein the light emitting part (30) is operable to alternatingly illuminate objects with a first one (32a) of the at least two light sources while not illuminating the objects with a second one (32b) of the at least two light sources; and illuminate objects with a second one (32b) of the at least two light sources while not illuminating the objects with a first one (32a) of the at least two light sources.
8. The depth sensor module (2) according to one of claims 1 to 7, wherein the light detector part (20) includes an image sensor (210) configured to acquire light.
9. The depth sensor module (2) according to claim 8, wherein the image sensor (210) comprises a time-of-flight image sensor.
10. The depth sensor module (2) according to one of claims 1 to 9, wherein the depth sensor module (2) is configured to enable time-of-flight measurements.
11. The depth sensor module (2) according to one of claims 1 to 10, wherein the light emitting part (30) includes at least one optical feedback pixel (212).
12. The depth sensor module (2) according to one of claims 1 to 11, wherein the depth sensor module (2) is a proximity sensor.
13. A depth sensing method using a depth sensor module (2) having a light emitting part (30) for illuminating objects and a light detector part (20), the light emitting part (30) and the light detector part (20) being spatially offset in the direction of a triangulation baseline (15), wherein the light emitting part (30) includes at least two light sources (32a, 32b) spatially offset in the direction of the triangulation baseline (15), wherein the method comprises: emitting light using the light emitting part (30), acquiring light using the light detector part (20), and obtaining an intensity distribution of the acquired light along the direction of the triangulation baseline (15).
14. The depth sensing method according to claim 13, comprising performing a triangulation evaluation using the intensity distribution of the acquired light.
15. The depth sensing method according to claim 14, wherein the performing the triangulation evaluation comprises determining a zero-crossing point (39) of a difference between a first and a second intensity distribution of acquired light originating from a first and a second of the at least two light sources (32a, 32b) of the light emitting part (30), respectively.
16. The depth sensing method according to one of claims 13 to 15, wherein a first and a second of the at least two light sources (32a, 32b) are operated to emit a first light beam having a first light intensity distribution and a second light beam having a second light intensity distribution, respectively, wherein the first and second light intensity distributions are mutually symmetric with respect to a plane aligned perpendicularly to the triangulation baseline.
17. The depth sensing method according to one of claims 13 to 16, comprising alternatingly illuminating objects with a first one (32a) of the at least two light sources while not illuminating the objects with a second one (32b) of the at least two light sources; and illuminating objects with a second one (32b) of the at least two light sources while not illuminating the objects with a first one (32a) of the at least two light sources.
18. The depth sensing method according to one of claims 13 to 17, comprising controlling the at least two light sources (32a, 32b) individually.
19. The depth sensing method according to one of claims 13 to 18, wherein the light detector part (20) comprises an image sensor (210), and wherein light is acquired using the image sensor (210).
20. The depth sensing method according to one of claims 13 to 19, comprising performing a time-of-flight measurement.
21. The depth sensing method according to one of claims 13 to 20, wherein the light emitting part (30) comprises at least one optical feedback pixel (212), and wherein the method comprises performing an optical-feedback measurement using the at least one optical feedback pixel (212).
22. The depth sensing method according to one of claims 13 to 21, wherein the depth sensor module (2) is a proximity sensor, and wherein the method comprises carrying out proximity measurements.
Description
[0037] The apparatuses and methods described herein will be more fully understood from the detailed description given below and the accompanying drawings, which should not be considered as limiting the invention described in the appended claims.
LIST OF REFERENCE SYMBOLS
[0048] 1 Object
[0049] 2 Depth sensor module
[0050] 15 Direction of triangulation baseline
[0051] 20 Light detector part
[0052] 210 Image sensor
[0053] 211 Pixel field
[0054] 212 Optical feedback pixel(s)
[0055] 23 Imaging optics
[0056] 30 Light emitting part
[0057] 32 Light emitting die
[0058] 32a First light source
[0059] 32b Second light source
[0060] 32c Third light source
[0061] 32d Fourth light source
[0062] 32a2 Pad of the first light source
[0063] 32b2 Pad of the second light source
[0064] 32c2 Pad of the third light source
[0065] 32d2 Pad of the fourth light source
[0066] 33 Projecting optics
[0067] 35a First light beam
[0068] 35b Second light beam
[0069] 38 Zero-crossing axis
[0070] 39 Zero-crossing point
[0071] 40 Cover glass
[0072] Prior art proximity sensor modules include a light source and a photo detector. Light is emitted by the light source and detected by the photo detector. If some of the emitted light is reflected back to the proximity sensor and detected by the photo detector, it is assumed that an object is in front of the proximity sensor. Typically, simple signal thresholding is performed to decide whether or not there is an object in close range.
[0073] Recent developments towards the integration of actual depth sensors into proximity sensors promise more reliable proximity detections. However, those depth sensors strongly suffer from stray light caused by dirt and/or dust.
[0074] Triangulation is the process of determining the location of (or distance to) a point in a scene. At either end of a baseline, the angles between the baseline and the lines of sight to the point are measured or known. Using trigonometric formulas, the distance between the triangulation baseline and the point can be calculated.
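By way of a hypothetical numerical illustration (not part of the claimed subject matter; the function name and angle convention below are assumptions), the trigonometric step can be sketched as d = b / (cot α + cot β), where b is the baseline length and α, β are the two angles between the baseline and the lines of sight:

```python
import math

def triangulate_distance(baseline: float, alpha: float, beta: float) -> float:
    """Perpendicular distance from the triangulation baseline to a point,
    given the baseline length and the two angles (in radians) between the
    baseline and the lines of sight at either end of the baseline."""
    # d = b / (cot(alpha) + cot(beta))
    return baseline / (1.0 / math.tan(alpha) + 1.0 / math.tan(beta))

# Example: with both angles at 60 degrees and a unit baseline, the point
# sits at the apex of an equilateral triangle, at height sqrt(3)/2 ~ 0.866.
d = triangulate_distance(1.0, math.radians(60.0), math.radians(60.0))
```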
[0076] It is also possible to simply have two separate light sources 32a and 32b in the light emitting part 30 that are not on the same die, e.g. two separate LEDs, VCSELs or laser diodes. In either case, the two light sources 32a and 32b are spatially offset along the direction of the triangulation baseline 15. The light generated by the first light source 32a and the second light source 32b is projected into the scene onto the object 1.
[0078] In order to obtain the differential signal 35a′-35b′, the first light source 32a and the second light source 32b may be switched on and off in an alternating sequence.
[0080] The zero-crossing axis 38 represents the positions where the power of the first light beam 35a is equal to the power of the second light beam 35b. The projecting optics 33 may consist of several optical elements such as lenses and/or diffractive optical elements, may be built from a single lens element, or may even consist of a simple glass plate.
[0081] By imaging the back-reflection from the object 1 of each of the two emitted light beams 35a and 35b through the imaging optics 23 onto the pixel field 211 of the image sensor 210, and subtracting the signals generated while the second light source 32b is on from the signals captured while the first light source 32a is on, the resulting differential signal 35a′-35b′ will show the zero-crossing point 39.
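As a minimal sketch of this differential evaluation (assuming, hypothetically, Gaussian back-reflected intensity profiles along the baseline direction; the names below are illustrative and not taken from the disclosure), the zero-crossing point of the difference signal can be located by linear interpolation between the two samples where the sign of the difference changes:

```python
import numpy as np

def zero_crossing(positions, signal_a, signal_b):
    """Locate the zero-crossing point of the differential signal a - b
    along the baseline direction, interpolating linearly between the two
    samples bracketing the first sign change."""
    diff = np.asarray(signal_a, float) - np.asarray(signal_b, float)
    idx = np.where(np.diff(np.sign(diff)) != 0)[0]
    if idx.size == 0:
        return None  # no zero-crossing visible on the pixel field
    i = idx[0]
    x0, x1 = positions[i], positions[i + 1]
    y0, y1 = diff[i], diff[i + 1]
    return x0 - y0 * (x1 - x0) / (y1 - y0)

# Two mutually symmetric (hypothetical Gaussian) back-reflected profiles,
# offset along the baseline: their difference crosses zero at x = 0.
x = np.linspace(-1.0, 1.0, 201)
a = np.exp(-(x - 0.2) ** 2 / 0.1)  # back-reflection of the first beam
b = np.exp(-(x + 0.2) ** 2 / 0.1)  # back-reflection of the second beam
```

Scaling one of the two profiles before subtraction shifts the located zero-crossing along the baseline, which corresponds to tilting the zero-crossing axis by varying the power ratio of the two light sources.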
[0082] Varying the power ratio of the two light sources 32a and 32b makes it possible to tilt the zero-crossing axis 38 back and forth along the direction of the triangulation baseline 15. This gives a certain flexibility in steering the zero-crossing axis.
[0083] Compared to current time-of-flight depth sensing technologies, the proposed zero-crossing triangulation method is significantly more stable with respect to stray light originating, e.g., from reflections from dirt or dust particles appearing in front of the depth sensor module 2. In case the pixel field 211 on the image sensor 210 includes or even consists of time-of-flight pixels, the signals of the time-of-flight image sensor can be used to first detect and localize the zero-crossing point, and thus to reliably measure a first distance. Subsequently, the stable zero-crossing triangulation-based distance measurement can be used to calibrate the depth map generated by the time-of-flight measurements of each of the pixels of the time-of-flight image sensor. The zero-crossing triangulation-based calibration of the time-of-flight depth map can be done on-the-fly as long as the zero-crossing point 39 on the object 1 is imaged at least partly onto the pixel field 211 by the imaging optics 23. In case the object 1 is not always imaged onto the pixel field 211 by the imaging optics 23, the calibration can at least be updated whenever the zero-crossing point becomes visible on the pixel field 211. The detection of the position of the zero-crossing point 39 makes it possible to correct the time-of-flight depth measurements for stray light issues. In case the pixel field 211 includes or even consists of demodulation pixels such as those used in indirect time-of-flight sensors, each of the pixels typically includes two storage nodes. Such demodulation pixel architectures have been presented in U.S. Pat. No. 5,856,667, EP1009984B1, EP1513202B1 and U.S. Pat. No. 7,884,310B2.
[0084] Therefore, the photo-generated charges accumulated while the first light source 32a is turned on can be stored in the first storage nodes of the pixels, and the photo-generated charges accumulated while the second light source 32b is turned on can be stored in the second storage nodes of the pixels. The first and second light sources 32a, 32b can be alternately turned on and off during an exposure, synchronized with the switches in the demodulation pixels that steer the photo-generated charges to either the first or the second storage nodes of the pixels, respectively.
[0085] Further, having background removal circuitry on the demodulation pixels makes it possible to discard the charges generated by background light. Such background removal circuitries for demodulation pixels have been presented in PCT publication WO2009135952A2 and in U.S. Pat. Nos. 7,574,190B2 and 7,897,928B2. The removal of a common-mode signal level on the different storage nodes of each pixel can drastically increase the dynamic range of the depth sensor module 2.
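A minimal sketch of such a common-mode (background) subtraction, assuming two hypothetical storage-node readings per pixel that are dominated by a shared background level (the function name and values are illustrative assumptions):

```python
import numpy as np

def remove_common_mode(node_a, node_b):
    """Subtract the common-mode level (background charge shared by both
    storage nodes of each demodulation pixel) from both nodes, keeping
    only the differential part that carries the zero-crossing signal."""
    node_a = np.asarray(node_a, float)
    node_b = np.asarray(node_b, float)
    common = np.minimum(node_a, node_b)
    return node_a - common, node_b - common

# Hypothetical readings for two pixels: a large shared background plus a
# small differential signal. The difference a - b is preserved exactly,
# while the stored levels no longer consume dynamic range.
a, b = remove_common_mode([900.0, 905.0], [905.0, 880.0])
```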
[0087] (Drawing description: the situation at a short distance of the object 1 from the depth sensor module 2.)
[0088] (Drawing description: the situation at a medium distance of the object 1 from the depth sensor module 2.)
[0089] The zero-crossing point 39 in the difference of the images captured when the first light source 32a is turned on and when the second light source 32b is turned on can be used for triangulation purposes. The image difference can be generated, e.g., by sequentially capturing two images, namely one image with the first light source 32a turned on and one with the second light source 32b turned on, and subtracting one from the other.
[0090] Another possibility is to integrate so-called demodulation pixels as light sensing pixels in the pixel field 211. Such demodulation pixels have been presented in U.S. Pat. No. 5,856,667, EP1009984B1, EP1513202B1 and U.S. Pat. No. 7,884,310B2. Demodulation pixels can have two storage sites on each pixel. The steering of the photo-generated charges from the photo-sensitive area to one of the two charge storages can be done in synchronization with the alternate control of the first light source 32a and the second light source 32b. The result is that the first storage nodes of each pixel of the pixel field 211 store the photo-generated charges generated while the first light source 32a is turned on, and the second storage nodes of each pixel of the pixel field 211 store the photo-generated charges generated while the second light source 32b is turned on. More sophisticated demodulation pixels already include subtracting circuitry in each pixel. This enables better removal of the background light signal and therefore a more robust system. In case the pixels in the pixel field 211 of the image sensor 210 are time-of-flight pixels, the raw sampling data—possibly in combination with the on-pixel background removal circuitry of the time-of-flight pixels—can be used to find the zero-crossing point 39 and to determine depths based on the localization of the zero-crossing point on the pixel field 211. The travel times of the emitted light from the depth sensor module 2 to the object 1 and back, measured by the time-of-flight pixels in the pixel field 211 on the image sensor 210, further make it possible to build a full two-dimensional depth map. The depth measurements derived by the zero-crossing triangulation approach can be used to calibrate the two-dimensional depth map derived from the time-of-flight based depth measurements.
The calibration performed using the zero-crossing triangulation approach improves the stability and robustness of the depth sensor module with respect to stray light, thermal drifts and many other artefacts. Time-of-flight pixels integrated in the image sensor 210 may be demodulation pixels as used in indirect (phase measurement based) time-of-flight measurement systems, or pixels as used in direct (single photon avalanche detection based) time-of-flight measurement systems. Depending on the need and application, the calibration evaluation based on the triangulation approach can be carried out with every single depth measurement, from time to time, or simply whenever required.
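A simplified sketch of this calibration step, under the assumption (not stated in the disclosure, where richer correction models are equally possible) that stray light or drift shifts all time-of-flight depths by a common offset; the triangulation-based distance at the zero-crossing pixel then fixes that offset:

```python
import numpy as np

def calibrate_tof_map(tof_depth_map, zc_pixel, zc_distance):
    """Calibrate a time-of-flight depth map against the triangulation-based
    distance measured at the zero-crossing pixel. A single global offset is
    assumed here: any error that shifts all ToF depths equally (e.g. a
    stray-light or thermal-drift bias) is removed by this correction."""
    tof = np.asarray(tof_depth_map, float)
    offset = zc_distance - tof[zc_pixel]
    return tof + offset

# Hypothetical 3x3 ToF depth map with a +0.05 m stray-light bias; the
# triangulation measurement at the center pixel reports the true 1.00 m.
tof = np.full((3, 3), 1.05)
calibrated = calibrate_tof_map(tof, (1, 1), 1.00)
```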
[0092] The optical feedback pixels 212 can be used, e.g., to adjust emitted light intensities, calibrate for intensities, adjust phase delays, calibrate for phase delays, or detect dirt/dust. Fast intensity variations may be caused by temperature changes or may indicate deposition/removal of dirt/dust, while slow intensity variations may be a consequence of aging.
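As a hypothetical sketch of such a fast/slow discrimination (the function, the threshold value, and the sampling model are illustrative assumptions, not part of the disclosure):

```python
def classify_intensity_drift(samples, dt, fast_threshold):
    """Classify the drift of optical-feedback-pixel readings taken at a
    fixed interval dt (seconds). A rate of change above fast_threshold
    (intensity units per second) suggests a temperature change or
    dirt/dust deposition/removal; a slower drift is more consistent with
    aging of the light source. The threshold is application-dependent."""
    rate = abs(samples[-1] - samples[0]) / (dt * (len(samples) - 1))
    if rate > fast_threshold:
        return "fast (temperature change or dirt/dust event)"
    return "slow (aging)"

# Hypothetical readings: a 10%/s drop vs. a 0.1%/s drift.
fast = classify_intensity_drift([100.0, 90.0], 1.0, 1.0)
slow = classify_intensity_drift([100.0, 99.9], 1.0, 1.0)
```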
[0095] The following embodiments are furthermore disclosed:
[0096] Depth Sensor Module Embodiments:
[0097] E1. A depth sensor module (2) including a light emitting part (30) for illuminating objects and a light detector part (20), the light emitting part (30) and the light detector part (20) being spatially offset in the direction of a triangulation baseline (15), characterized in that the light emitting part (30) includes at least two light sources (32a, 32b) spatially offset in the direction of the triangulation baseline (15), wherein the light detector part (20) is configured to acquire light and to provide in the direction of the triangulation baseline (15) an intensity distribution of the acquired light.
[0098] E2. The depth sensor module (2) according to embodiment E1, characterized in that the depth sensor module (2) is configured to perform a triangulation evaluation using the intensity distribution of the acquired light.
[0099] E3. The depth sensor module (2) according to embodiment E1 or E2, characterized in that the depth sensor module (2) is configured to enable a triangulation evaluation by determining a zero-crossing point of the difference between two intensity distributions of acquired light originating from two of the at least two light sources (32a, 32b) of the light emitting part (30).
[0100] E4. The depth sensor module (2) according to one of embodiments E1 to E3, characterized in that the at least two light sources (32a, 32b) are configured to be controlled individually.
[0101] E5. The depth sensor module (2) according to one of embodiments E1 to E4, characterized in that the at least two light sources (32a, 32b) are arranged on a single die (32).
[0102] E6. The depth sensor module (2) according to one of embodiments E1 to E5, characterized in that the light detector part (20) includes an image sensor (210) configured to acquire light.
[0103] E7. The depth sensor module (2) according to embodiment E6, characterized in that the image sensor (210) is a time-of-flight image sensor.
[0104] E8. The depth sensor module (2) according to one of embodiments E1 to E7, characterized in that the depth sensor module (2) is configured to enable a time-of-flight measurement.
[0105] E9. The depth sensor module (2) according to one of embodiments E1 to E8, characterized in that the light emitting part (30) includes at least one optical feedback pixel (212).
[0106] E10. The depth sensor module (2) according to one of embodiments E1 to E9, characterized in that the depth sensor module (2) is configured as a proximity sensor.
[0107] Depth Sensing Method Embodiments:
[0108] E11. A depth sensing method using a depth sensor module (2) having a light emitting part (30) for illuminating objects and a light detector part (20), the light emitting part (30) and the light detector part (20) being spatially offset in the direction of a triangulation baseline (15), characterized in that the light emitting part (30) includes at least two light sources (32a, 32b) spatially offset in the direction of the triangulation baseline (15), wherein the depth sensing method comprises the steps of: emitting light using the light emitting part (30), acquiring light using the light detector part (20), and providing in the direction of the triangulation baseline (15) an intensity distribution of the acquired light.
[0109] E12. The depth sensing method according to embodiment E11, characterized in that a triangulation evaluation using the intensity distribution of the acquired light is performed.
[0110] E13. The depth sensing method according to embodiment E11 or E12, characterized in that a triangulation evaluation is performed by determining a zero-crossing point (39) of the difference between two intensity distributions of acquired light originating from two of the at least two light sources (32a, 32b) of the light emitting part (30).
[0111] E14. The depth sensing method according to one of embodiments E11 to E13, characterized in that the at least two light sources (32a, 32b) are controlled individually.
[0112] E15. The depth sensing method according to one of embodiments E11 to E14, characterized in that the light is acquired using an image sensor (210) included in the detector part (20).
[0113] E16. The depth sensing method according to one of embodiments E11 to E15, characterized in that a time-of-flight measurement is performed.
[0114] E17. The depth sensing method according to one of embodiments E11 to E16, characterized in that an optical-feedback measurement is performed using at least one optical feedback pixel (212) in the light emitting part (30).
[0115] E18. The depth sensing method according to one of embodiments E11 to E17, characterized in that the depth sensor module (2) is used as a proximity sensor.