Depth map generator
11076145 · 2021-07-27
Assignee
Inventors
CPC classification
G01S17/48
PHYSICS
H04N13/239
ELECTRICITY
H04N23/45
ELECTRICITY
G06V10/145
PHYSICS
H04N13/254
ELECTRICITY
H04N13/271
ELECTRICITY
International classification
H04N13/271
ELECTRICITY
H04N13/239
ELECTRICITY
G01S17/48
PHYSICS
Abstract
The invention describes a depth map generator comprising an array comprising a plurality of individually addressable array elements, wherein an array element comprises a semiconductor emitter or a reflector arranged to reflect light emitted by a semiconductor emitter; a driver realised to switch an array element according to a predefined illumination pattern; a number of image sensors, wherein an image sensor is arranged to detect light reflected from a scene illuminated by the illumination pattern; and a processing unit realised to compute a depth map of the scene on the basis of a light pattern detected by an image sensor. The invention further describes an illumination arrangement for use in such a depth map generator; a method of generating a depth map of a scene; and a device comprising such a depth map generator.
Claims
1. A depth map generator comprising: an array comprising a plurality of individually addressable array elements, each of the plurality of individually addressable array elements comprising at least one of a semiconductor emitter or a reflector arranged to reflect light emitted by a semiconductor emitter; a driver configured to switch the plurality of individually addressable array elements on or off according to a predefined illumination pattern, the predefined illumination pattern comprising a plurality of array elements that are switched on each bounded by at least two array elements that are switched off; at least one image sensor configured to detect a light pattern formed by light reflected from a scene illuminated by the predefined illumination pattern; and a processor configured to compute a depth map of the scene from the light pattern based on an offset between an expected position of a detected pixel in the light pattern and an actual position of the detected pixel in the light pattern.
2. The depth map generator according to claim 1, wherein the semiconductor emitter is an infrared LED.
3. The depth map generator according to claim 1, wherein the individually addressable array elements comprise semiconductor emitters and a primary optical element arranged to shape the light emitted by the semiconductor emitters of the array.
4. The depth map generator according to claim 1, wherein the individually addressable array elements comprise reflectors, each of the reflectors being an individually addressable micro-mirror.
5. The depth map generator according to claim 1, wherein the array comprises at least 100 individually addressable array elements.
6. The depth map generator according to claim 1, wherein the image sensor comprises an array of photosensors sensitive to a wavelength of light originating from the array comprising the plurality of individually addressable array elements.
7. An illumination arrangement for use in a depth map generator, the illumination arrangement comprising: an array comprising a plurality of individually addressable array elements, each of the individually addressable array elements comprising at least one of a semiconductor emitter or a reflector arranged to reflect light emitted by a semiconductor emitter; a driver configured to switch the plurality of individually addressable array elements on or off according to a predefined illumination pattern, the predefined illumination pattern comprising a plurality of array elements that are switched on each bounded by at least two array elements that are switched off; at least one image sensor array configured to detect a light pattern formed by light reflected from a scene illuminated by the predefined illumination pattern; a processor configured to compute a depth map of the scene from the light pattern based on an offset between an expected position of a detected pixel in the light pattern and an actual position of the detected pixel in the light pattern; a first interface configured to connect the driver to the depth map generator; and a second interface configured to connect the processor to the depth map generator.
8. A method of generating a depth map of a scene, the method comprising: arranging a depth map generator in front of the scene, the depth map generator including an array comprising a plurality of individually addressable array elements, each of the plurality of individually addressable array elements comprising at least one of a semiconductor emitter or a reflector arranged to reflect light emitted by a semiconductor emitter; choosing an illumination pattern to illuminate the scene; switching, via a driver, the plurality of individually addressable array elements of the array on or off according to the illumination pattern; forming a light pattern from light reflected from the scene; detecting, with at least one image sensor, the light pattern; and computing a depth map of the scene from the light pattern based on an offset between an expected position of a detected pixel in the light pattern and an actual position of the detected pixel in the light pattern.
9. The method according to claim 8, wherein the switching comprises switching the plurality of individually addressable array elements on or off according to a sequence of the illumination patterns.
10. The method according to claim 9, wherein choosing the illumination pattern comprises choosing the sequence of the illumination patterns to successively illuminate the scene in at least one of a vertical direction and a horizontal direction.
11. The method according to claim 9, wherein an illumination pattern in the sequence of the illumination patterns is the inverse of a preceding illumination pattern.
12. The method according to claim 8, further comprising adjusting the illumination pattern on the basis of a computed depth map.
13. The depth map generator according to claim 1, wherein the at least one image sensor comprises a single image sensor.
14. The illumination arrangement of claim 7, wherein the at least one image sensor comprises a single image sensor.
15. The method of claim 8, wherein the illumination pattern comprises a plurality of array elements that are switched on each bounded by at least two array elements that are switched off.
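As an illustrative sketch only (not part of the claims, and the claims do not prescribe any particular geometry), a binary illumination pattern in which every element that is switched on is bounded by at least two elements that are switched off can be generated as alternating vertical stripes, together with its inverse as used in a sequence of alternating patterns; the function names are assumptions:

```python
def stripe_pattern(rows, cols):
    """Binary illumination pattern of vertical stripes in which every
    'on' element (1) has 'off' neighbours (0) on both sides.
    Illustrative only; stripes are just one pattern satisfying this."""
    return [[1 if (j % 2 == 1 and j < cols - 1) else 0
             for j in range(cols)]
            for _ in range(rows)]

def invert(pattern):
    """Inverse pattern, e.g. for a sequence of alternating patterns."""
    return [[1 - v for v in row] for row in pattern]
```

For a 6-column array this yields rows of the form 0 1 0 1 0 0, so each lit element has unlit neighbours on both sides; the inverse pattern lights the complementary elements.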
Description
BRIEF DESCRIPTION OF THE DRAWINGS
(7) In the drawings, like numbers refer to like objects throughout. Objects in the diagrams are not necessarily drawn to scale.
DETAILED DESCRIPTION OF THE EMBODIMENTS
(9) The array of emitters irradiates a scene in its field of view, i.e. the emitter array casts an irradiation pattern into its field of view. The image sensor captures an image of the irradiated scene. The diagram shows the path of a light ray L from an emitter pixel E.sub.P to point P in the scene, from which it is reflected as light ray L′ and detected by image sensor pixel(s) S.sub.P. In simplified ray optics, rays L, L′ pass unchanged through the centres of the respective lenses F.sub.E, F.sub.S. Therefore, each pixel in the irradiated scene originates from a pixel of the emitter array, so that an emitter pixel E.sub.P subtends an angle ϕ2 to its corresponding scene pixel. The scene pixels are imaged by the image sensor S, so that a scene pixel P subtends an angle ϕ1 to its imaged pixel S.sub.P. A right angle is subtended between the illuminated pixel P in the scene and a point V in the plane of the lenses F.sub.E, F.sub.S. Recognising that
(10) B=H/tan ϕ1+H/tan ϕ2
allows H to be expressed as:
(11) H=B·tan ϕ1·tan ϕ2/(tan ϕ1+tan ϕ2)
The principle of similar triangles can then be applied in determining the distance to point P as follows: the position of the emitting pixel E.sub.P is known (from its position in the irradiation pattern), so that
(12) tan ϕ2=H.sub.E/B.sub.E
in which H.sub.E is the distance to lens F.sub.E, and B.sub.E is the distance from the array centre to the emitting pixel E.sub.P. Similarly, the position B.sub.S of the image sensor pixel S.sub.P can be determined (by comparing the sensed image to the irradiation pattern), so that
(13) tan ϕ1=H.sub.S/B.sub.S
(14) These values can be substituted into the expression for H above to solve for H. These computations are repeated for each pixel of the irradiation pattern.
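The similar-triangles computation described above can be sketched numerically as follows. This is an illustrative sketch only; the function name and the numeric values are assumptions, while the variable names follow the description (B is the array separation, H.sub.E/B.sub.E the emitter-side distances, H.sub.S/B.sub.S the sensor-side distances):

```python
def depth_from_triangulation(B, H_E, B_E, H_S, B_S):
    """Distance H to the illuminated point P by similar triangles.

    B   : separation between emitter array and image sensor (baseline)
    H_E : distance from the emitter array to its lens F_E
    B_E : distance from the array centre to the emitting pixel E_P
    H_S : distance from the image sensor to its lens F_S
    B_S : distance from the sensor centre to the detecting pixel S_P
    """
    tan_phi2 = H_E / B_E  # angle subtended on the emitter side
    tan_phi1 = H_S / B_S  # angle subtended on the sensor side
    # From B = H/tan(phi1) + H/tan(phi2), solve for H:
    return B * tan_phi1 * tan_phi2 / (tan_phi1 + tan_phi2)
```

The returned depth H is in the same unit as the baseline B.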
(15) As explained above, the invention is based on the insight that the distance to an illuminated point on a scene can be established by identifying the position of the corresponding detecting photosensor(s) in a sensor array. An offset between the expected position of a detected pixel and the actual position of the detected pixel is used to calculate the distance to the illuminated point. This works best with a “black & white” irradiation pattern, in which white represents emitters that are “on” and black represents emitters that are “off”.
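In the commonly used simplified rectified pinhole model (an assumption for illustration; the text does not restrict the computation to this form), the depth is inversely proportional to that offset:

```python
def depth_from_offset(focal_px, baseline, expected_x, actual_x):
    """Depth from the offset (disparity) between the expected and the
    actual pixel position, in a simplified rectified pinhole model.

    focal_px   : focal length expressed in pixel units (assumed)
    baseline   : emitter-to-sensor separation, e.g. in metres (assumed)
    expected_x : column where the pattern pixel is expected
    actual_x   : column where it was actually detected
    """
    disparity = expected_x - actual_x
    if disparity == 0:
        return float("inf")  # no offset: point effectively at infinity
    return focal_px * baseline / disparity
```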
(17) It shall be noted that the diagram is not to scale. Generally, the distance between emitter array E and image sensor S may be relatively small, for example from a few millimetres to a few centimetres, while the distance to a scene can be in the order of 0.25 m to 3 m for consumer products such as mobile devices, and may exceed 3 m in applications such as automotive imaging systems.
(18) To correctly interpret an imaged irradiation pattern, the system carries out a calibration procedure to establish a relationship between the emitter array and any image sensor. A calibration procedure can be used to compensate for unavoidable inaccuracies in the arrangement of emitter array and image sensors.
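Such a calibration could, for example, record per-pixel offsets against a target at a known distance and subtract them from later measurements. This is a minimal sketch under that assumption; the patent does not specify the procedure, and the function names are invented for illustration:

```python
def calibration_offsets(expected_positions, measured_positions):
    """Fixed per-pixel correction offsets from one calibration capture
    of a target at a known depth: offset = measured - expected."""
    return [m - e for e, m in zip(expected_positions, measured_positions)]

def corrected_position(measured, offset):
    """Apply the stored calibration offset to a later measurement,
    compensating fixed inaccuracies in the emitter/sensor arrangement."""
    return measured - offset
```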
(20) The distance H to the illuminated point (i.e. the depth of point P in the scene 3) can be calculated as explained above, recognising that a scene pixel P subtends an angle □1 to its imaged pixel S1P in the first image sensor S1, and the same scene pixel P subtends an angle □2 to its imaged pixel S2P in the second image sensor S2.
(23) Although the present invention has been disclosed in the form of preferred embodiments and variations thereon, it will be understood that numerous additional modifications and variations could be made thereto without departing from the scope of the invention. For example, embodiments of the inventive depth map generator may comprise two emitter arrays and a single camera; an assembly in which two cameras are arranged in line and with an emitter array perpendicularly offset from that line (this arrangement permits the use of alternating irradiation patterns of vertical lines and horizontal lines); an active emitter array with an active lens to allow sub emitter-lens-pitch shifts of the irradiation pattern to increase depth map resolution; two emitters with different emission wavelengths; a visible line pattern; etc.
(24) For the sake of clarity, it is to be understood that the use of “a” or “an” throughout this application does not exclude a plurality, and “comprising” does not exclude other steps or elements. The mention of a “unit” or a “module” does not preclude the use of more than one unit or module.
(25) TABLE-US-00001 REFERENCE SIGNS:
depth map generator 1
driver 11
driver interface 110
processing unit 12
processor interface 120
scene 3
emitter array E, R
imaging lens F.sub.E, F.sub.S
semiconductor emitter E.sub.P
semiconductor emitter E.sub.X
micro-mirror R.sub.P
active array region E.sub.ON
inactive array region E.sub.OFF
image sensor array S, S1, S2
photosensor S.sub.P
point P
illumination pattern X1, . . . , X4, XC
emitted light L
reflected light L′
distance to illuminated point H
distance to lens H.sub.E, H.sub.S, H.sub.S1, H.sub.S2
array separation B
distance to pixel B.sub.E, B.sub.S
angle ϕ1, ϕ2
depth map DM