DETECTION DEVICE WITH AT LEAST ONE SENSOR DEVICE, AN ANALYSIS DEVICE, A LIGHT SOURCE, AND A CARRIER MEDIUM
20220229188 · 2022-07-21
Assignee
Inventors
- Markus Klug (Ingolstadt, DE)
- Tobias Moll (Ingolstadt, DE)
- Johannes Scheuchenpflug (Baar-Ebenhausen, DE)
CPC classification
- G06F3/017 (PHYSICS)
- G06F3/0421 (PHYSICS)
- G06F2203/04101 (PHYSICS)
- G01S17/87 (PHYSICS)
- G01S17/894 (PHYSICS)
- G06T7/521 (PHYSICS)
- G06F2203/04109 (PHYSICS)
International classification
- G01S17/894 (PHYSICS)
- G01S7/481 (PHYSICS)
- G06T7/521 (PHYSICS)
Abstract
A carrier medium is designed as a waveguide on which a coupling region and a decoupling region are provided. The coupling region is designed to couple light, which has been emitted from a light source and reflected by an object in the surroundings, into the carrier medium. The coupled-in reflected light is then transmitted to the decoupling region by internal reflection, where it is decoupled again and transmitted to at least one sensor device designed as a time-of-flight camera device. The sensor device provides the detected light in the form of sensor data, which describe the propagation time of the light reflected by the object, to an analysis device that derives object data describing the position of the object.
Claims
1-10. (canceled)
11. A capturing apparatus for detecting an object, comprising: a light source, a two-dimensional carrier medium embodied as a light guide having a first coupling-in region and a first coupling-out region, the first coupling-in region embodied as a first holographic element with a first deflection structure to couple light emitted by the light source and reflected by the object in a surrounding area of the capturing apparatus, into the two-dimensional carrier medium which transmits the light by internal reflection from the first coupling-in region to the first coupling-out region, and the first coupling-out region embodied as a second holographic element with a second deflection structure to couple the light incident on the second deflection structure out of the two-dimensional carrier medium; at least one time-of-flight camera embodied to capture the light from the first coupling-out region and to provide sensor data correlated with the captured light to describe a time of flight of the light emitted by the light source and reflected by the object; and an evaluation device embodied to provide, based on the sensor data, object data that describe a relative location of the object with respect to a predetermined reference point.
12. The capturing apparatus as claimed in claim 11, wherein the first coupling-in region and the first coupling-out region each have at least one of a volume holographic grating and a surface holographic grating.
13. The capturing apparatus as claimed in claim 12, wherein a second coupling-in region and a second coupling-out region are provided at the carrier medium, the second coupling-in region embodied as a third holographic element with a third deflection structure to couple light, emitted by the light source and incident on the third deflection structure, into the two-dimensional carrier medium, wherein the two-dimensional carrier medium is embodied to transmit the light by internal reflection from the second coupling-in region to the second coupling-out region, and wherein the second coupling-out region is embodied as a fourth holographic element with a fourth deflection structure to couple the transmitted light incident on the fourth deflection structure out of the two-dimensional carrier medium towards the object in the surrounding area.
14. The capturing apparatus as claimed in claim 13, wherein the object data indicate a distance of the object from the capturing apparatus.
15. The capturing apparatus as claimed in claim 14, wherein the capturing apparatus comprises at least two spatially separated time-of-flight cameras and the evaluation device, based on the sensor data of the at least two time-of-flight cameras, establishes coordinates of the object relative to the capturing apparatus as the object data.
16. The capturing apparatus as claimed in claim 15, wherein the capturing apparatus comprises at least three spatially separated time-of-flight cameras and the evaluation device, based on the sensor data of the at least three time-of-flight cameras, establishes a spatial arrangement of the object in the surrounding area in relation to the capturing apparatus as the object data.
17. The capturing apparatus as claimed in claim 16, wherein the first coupling-in region and the first coupling-out region are formed in one piece with the two-dimensional carrier medium.
18. The capturing apparatus as claimed in claim 16, wherein the two-dimensional carrier medium is formed as a separate element from the first coupling-in region and the first coupling-out region.
19. The capturing apparatus as claimed in claim 16, wherein the light source is embodied to emit pulsed light.
20. The capturing apparatus as claimed in claim 18, wherein the capturing apparatus is embodied as an image capturing device to capture the light coupled out of the first coupling-out region and to provide image data based on the light captured.
21. The capturing apparatus as claimed in claim 20, wherein the evaluation device is embodied to evaluate the image data, taking into account the object data and a specified interpretation criterion, and to provide an interpretation signal describing the image data.
22. The capturing apparatus as claimed in claim 11, wherein a second coupling-in region and a second coupling-out region are provided at the carrier medium, the second coupling-in region embodied as a third holographic element with a third deflection structure to couple light, emitted by the light source and incident on the third deflection structure, into the two-dimensional carrier medium, wherein the two-dimensional carrier medium is embodied to transmit the light by internal reflection from the second coupling-in region to the second coupling-out region, and wherein the second coupling-out region is embodied as a fourth holographic element with a fourth deflection structure to couple the transmitted light incident on the fourth deflection structure out of the two-dimensional carrier medium towards the object in the surrounding area.
23. The capturing apparatus as claimed in claim 11, wherein the object data indicate a distance of the object from the capturing apparatus.
24. The capturing apparatus as claimed in claim 11, wherein the capturing apparatus comprises at least two spatially separated time-of-flight cameras and the evaluation device, based on the sensor data of the at least two time-of-flight cameras, establishes coordinates of the object relative to the capturing apparatus as the object data.
25. The capturing apparatus as claimed in claim 11, wherein the capturing apparatus comprises at least three spatially separated time-of-flight cameras and the evaluation device, based on the sensor data of the at least three time-of-flight cameras, establishes a spatial arrangement of the object in the surrounding area in relation to the capturing apparatus as the object data.
26. The capturing apparatus as claimed in claim 11, wherein the first coupling-in region and the first coupling-out region are formed in one piece with the two-dimensional carrier medium.
27. The capturing apparatus as claimed in claim 11, wherein the two-dimensional carrier medium is formed as a separate element from the first coupling-in region and the first coupling-out region.
28. The capturing apparatus as claimed in claim 11, wherein the light source is embodied to emit pulsed light.
29. The capturing apparatus as claimed in claim 11, wherein the capturing apparatus is embodied as an image capturing device to capture the light coupled out of the first coupling-out region and to provide image data based on the light captured.
30. The capturing apparatus as claimed in claim 29, wherein the evaluation device is embodied to evaluate the image data, taking into account the object data and a specified interpretation criterion, and to provide an interpretation signal describing the image data.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0038] Exemplary embodiments will be described below in conjunction with the accompanying drawings.
DETAILED DESCRIPTION
[0043] In the exemplary embodiments discussed below, the described components of the embodiments each represent individual features that should be considered independently of one another and also develop the invention in each case independently of one another. The disclosure is therefore also intended to include combinations of the features of the embodiments other than those illustrated. Furthermore, the described embodiments may also be supplemented by further features that have already been described.
[0044] In the figures, identical reference signs each denote elements of identical function.
[0045] A capturing apparatus 10 is shown in the figures.
[0046] The coupling-in region 16 is embodied in the form of a holographic element 14 with a first deflection structure 20. The first deflection structure 20 is designed to couple the reflected light 100′, which was emitted as light 100 by a light source 30 and reflected at the object 40 in the surroundings of the capturing apparatus 10, into the carrier medium 12. The carrier medium 12 is in turn embodied to transmit the coupled-in reflected light 100′ with internal reflection from the coupling-in region 16 to the coupling-out region 18. The coupling-out region 18 is embodied in the form of a holographic element 14 with a second deflection structure 22. The second deflection structure 22 is designed to couple the transmitted reflected light 100′ that is incident on the second deflection structure 22 out of the carrier medium 12.
[0047] The sensor device 11 is embodied in the form of a time-of-flight camera device. The sensor device 11 is designed to capture the reflected light 100′ coupled out in the coupling-out region 18 and to provide it in the form of sensor data. The sensor data describe a time of flight of the light 100′, which was reflected at the object 40 and was captured by the sensor device 11. The evaluation device 32 is embodied to provide object data relating to the object 40, taking into account the sensor data. These object data describe a relative location of the object 40 with respect to a reference point of the capturing apparatus 10. The reference point of the capturing apparatus 10 is, for example, a specified point on the surface of the carrier medium 12, with the result that a relative location of the object 40 with respect to the capturing apparatus 10 can be indicated or described by the distance 42.
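The relationship between the measured time of flight and the distance 42 described above can be illustrated with a minimal sketch. This is not part of the patent; the function name and the numeric values are illustrative, and it assumes the sensor reports the round-trip time of a reflected light pulse:

```python
# Illustrative sketch (not from the patent): converting a measured round-trip
# time of flight of a reflected light pulse into the distance of the object.

SPEED_OF_LIGHT = 299_792_458.0  # metres per second

def distance_from_time_of_flight(round_trip_seconds: float) -> float:
    """The pulse travels to the object and back, so halve the path length."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0

# A pulse returning after about 3.34 nanoseconds corresponds to roughly 0.5 m,
# which is the scale of a gesture performed in front of the carrier medium.
distance_m = distance_from_time_of_flight(3.336e-9)
```

The halving step reflects that the measured time covers the path from the apparatus to the object and back, while the distance 42 is the one-way separation.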
[0048] The coupling-in region 16 and the coupling-out region 18 have at least one optical grating as a respective deflection structure 20, 22, which grating is embodied in particular in the form of a volume holographic grating or a surface holographic grating. The coupling-in region 16 and the coupling-out region 18 are either formed in one piece with the carrier medium 12, or, alternatively, the carrier medium 12 can be formed as a separate element from the coupling-in region 16 and the coupling-out region 18. The light source 30 is embodied to emit pulsed light 100.
[0052] The capturing apparatus 10 can additionally include an image capturing device, which is embodied to capture the light 100′ that is coupled out of the coupling-out region 18 and to provide it in the form of image data that correlate with the captured light 100′. Optical imaging of the object 40 is therefore also possible. The evaluation device 32 can here be embodied to evaluate the image data, taking into account the object data and a specified interpretation criterion, and to provide an interpretation signal describing the interpreted image data. In this way, for example, image recognition and gesture or action recognition can be carried out. For example, a movement of the object 40 in relation to the capturing apparatus 10 can be captured and also interpreted, such that, for example, the interpretation signal indicates that the object 40, for example an apple, has shifted a total of 5 cm to the left and has also been rotated through an angle of 30°.
[0053] Overall, the examples show how to provide touch and gesture detection by a time-of-flight camera device and by a holographic optical element (HOE). Here, the carrier medium 12, which is transparent on at least one side and is embodied in the form of a HOE, is supplemented by the at least one sensor device 11, which is embodied in the form of a time-of-flight camera device for capturing three-dimensional objects. The object 40, which is captured hereby, can be located in front of or behind the carrier medium 12. As a result, large-area capturing of the surroundings of the capturing apparatus 10 is ultimately possible. At the same time, a precise measurement takes place in the surroundings or on the surface of the capturing apparatus 10, the carrier medium 12 of which with the coupling-in region 16 and the coupling-out region 18 is embodied, for example, in the form of a transparent, two-dimensional cover plate or cover film for a touch-sensitive screen of a device, such as a mobile device (smartphone). In addition, a measurement over a previously unusable transparent surface, for example the screen of the device on which the two-dimensional cover plate or cover film is attached, and also the use of the capturing apparatus 10 on both sides in the case of a carrier medium 12 with opposite coupling-in regions 16 become possible.
[0054] For this purpose, pulsed light 100 from the light source 30 is ultimately coupled into the carrier medium 12 via a holographic function, that is to say via the coupling-in region 16, is distributed within the carrier medium 12, and is coupled out again via the coupling-out region 18. If this light 100 is incident on the object 40, it is reflected back to the carrier medium 12, is coupled in there, and is transmitted to the at least one sensor device 11. Since the time of flight of the light 100 from the light source 30 to the respective sensor device 11 is known, the time until the arrival of the reflected light 100′ can be measured, and the time-of-flight difference between the reflected light 100′ and the direct light 100 can thus be ascertained. Using this time-of-flight difference and the known positions of the light source 30 and of the individual sensor devices 11, the exact position of the captured item in space, that is to say of the object 40 in the surroundings, can be calculated. As a result, a distance 42, coordinates 43, and/or a spatial arrangement of the object 40 in the surroundings become capturable. In principle, each dimension to be captured requires one sensor device 11: one sensor device 11 is consequently sufficient for a straight-line measurement in a plane, and two sensor devices 11 are sufficient for a measurement on a surface.
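The position calculation from several sensor devices 11 can be sketched as follows. This is a simplified illustration rather than the patent's implementation: it assumes each time-of-flight camera sits close enough to the light source 30 that its measured time of flight reduces to a plain distance, and it solves for a position in a plane from three such distances (the sensor layout and coordinates are invented for the example):

```python
# Simplified, illustrative multilateration: locate an object in a plane from
# the distances reported by three spatially separated time-of-flight sensors.
# Subtracting the three circle equations (x - xi)^2 + (y - yi)^2 = ri^2
# pairwise eliminates the quadratic terms and leaves a 2x2 linear system.
from math import hypot

def trilaterate_2d(sensors, distances):
    (x1, y1), (x2, y2), (x3, y3) = sensors
    r1, r2, r3 = distances
    # Linear system A * (x, y) = b from the pairwise-subtracted circles.
    a11, a12 = 2.0 * (x2 - x1), 2.0 * (y2 - y1)
    a21, a22 = 2.0 * (x3 - x1), 2.0 * (y3 - y1)
    b1 = r1**2 - r2**2 + x2**2 - x1**2 + y2**2 - y1**2
    b2 = r1**2 - r3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a11 * a22 - a12 * a21  # non-zero if the sensors are not collinear
    return ((b1 * a22 - b2 * a12) / det, (a11 * b2 - a21 * b1) / det)

# Example: sensors at three corners of a rectangular carrier medium,
# object actually located at (1.0, 2.0) in the same coordinate system.
sensors = [(0.0, 0.0), (4.0, 0.0), (0.0, 3.0)]
dists = [hypot(1.0 - sx, 2.0 - sy) for (sx, sy) in sensors]
x, y = trilaterate_2d(sensors, dists)  # recovers approximately (1.0, 2.0)
```

The need for three non-collinear sensors in the sketch mirrors the rule stated above that each dimension to be captured requires one additional sensor device 11.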
[0055] A description has been provided with particular reference to preferred embodiments thereof and examples, but it will be understood that variations and modifications can be effected within the spirit and scope of the claims which may include the phrase “at least one of A, B and C” as an alternative expression that means one or more of A, B and C may be used, contrary to the holding in Superguide v. DIRECTV, 358 F3d 870, 69 USPQ2d 1865 (Fed. Cir. 2004).