OPTICAL DETECTOR
20170363465 · 2017-12-21
Assignee
Inventors
- Robert Send (Karlsruhe, DE)
- Ingmar Bruder (Neuleiningen, DE)
- Sebastian Valouch (Lampertheim, DE)
- Stephan Irle (Siegen, DE)
- Erwin Thiel (Siegen, DE)
CPC classification
G01J1/4228
PHYSICS
G01J1/08
PHYSICS
International classification
G01J1/08
PHYSICS
Abstract
An optical detector (110) is disclosed, comprising: at least one optical sensor (122) adapted to detect a light beam (120) and to generate at least one sensor signal, wherein the optical sensor (122) has at least one sensor region (124), wherein the sensor signal of the optical sensor (122) exhibits a non-linear dependency on an illumination of the sensor region (124) by the light beam (120) with respect to a total power of the illumination; at least one image sensor (128) being a pixelated sensor comprising a pixel matrix (174) of image pixels (176), wherein the image pixels (176) are adapted to detect the light beam (120) and to generate at least one image signal, wherein the image signal exhibits a linear dependency on the illumination of the image pixels (176) by the light beam (120) with respect to the total power of the illumination; and at least one evaluation device (132), the evaluation device (132) being adapted to evaluate the sensor signal and the image signal. In a particularly preferred embodiment, the non-linear dependency of the sensor signal on the total power of the illumination of the optical sensor (122) is expressible by a non-linear function comprising a linear part and a non-linear part, wherein the evaluation device (132) is adapted to determine the linear part and/or the non-linear part of the non-linear function by evaluating both the sensor signal and the image signal. Herein, the evaluation device (132), preferably, comprises a processing circuit (136) being adapted to provide a difference between the sensor signal and the image signal for determining the non-linear part of the non-linear function.
Claims
1. An optical detector, comprising: at least one optical sensor adapted to detect a light beam and to generate at least one sensor signal, wherein the optical sensor has at least one sensor region, wherein the sensor signal of the optical sensor exhibits a non-linear dependency on an illumination of the sensor region by the light beam with respect to a total power of the illumination; at least one image sensor, being a pixelated sensor comprising a pixel matrix of image pixels, wherein the image pixels are adapted to detect the light beam and to generate at least one image signal, wherein the image signal exhibits a linear dependency on the illumination of the image pixels by the light beam with respect to the total power of the illumination; and at least one evaluation device, the evaluation device being adapted to evaluate the sensor signal and the image signal.
2. The optical detector according to claim 1, wherein the non-linear dependency of the sensor signal on the total power of the illumination of the optical sensor is expressible by a non-linear function comprising a linear part and a non-linear part, wherein the evaluation device is adapted to determine the linear part, the non-linear part, or both, of the non-linear function by evaluating both the sensor signal and the image signal.
3. The optical detector according to claim 2, wherein the evaluation device comprises a processing circuit being adapted to provide a difference between the sensor signal and the image signal for determining the non-linear part of the non-linear function.
4. The optical detector according to claim 1, comprising at least one hybrid sensor, wherein the hybrid sensor comprises at least one of the optical sensors and at least one of the image sensors.
5. The optical detector according to claim 1, wherein the optical sensor is located in a direct vicinity of the image sensor.
6. The optical detector according to claim 5, wherein the optical sensor and the image sensor at least partially touch each other.
7. The optical detector according to claim 1, wherein the optical sensor and the image sensor are arranged in a manner that the light beam first impinges on the optical sensor.
8. The optical detector according to claim 1, wherein the image sensor is an inorganic image sensor.
9. The optical detector according to claim 1, wherein the optical sensor is a large-area optical sensor or a pixelated optical sensor.
10. The optical detector according to claim 9, wherein the optical sensor is a pixelated optical sensor comprising a pixel array of sensor pixels.
11. The optical detector according to claim 10, wherein at least one electronic element is placed in a vicinity of the sensor pixel on a surface on which both the at least one electronic element and the sensor pixel are located, wherein the at least one electronic element may be adapted to contribute to an evaluation of the signal provided by the sensor pixel, wherein the at least one electronic element preferably comprises one or more of: a connector, a capacitor, a diode, a transistor.
12. The optical detector according to claim 10, wherein at least two pixelated optical sensors are arranged on top of each other, wherein a location of the at least two pixelated optical sensors is shifted by an extent with respect to each other.
13. The optical detector according to claim 10, wherein the sensor pixel is electrically connected to a top contact provided by the image pixel of the image sensor.
14. The optical detector according to claim 10, wherein the image sensor has a first pixel resolution, wherein the pixelated optical sensor has a second pixel resolution, wherein the first pixel resolution equals or exceeds the second pixel resolution.
15. The optical detector according to claim 14, wherein the sensor pixel comprises a pixel array of at least 4×4 image pixels.
16. The optical detector according to claim 1, wherein the optical sensor comprises at least one first electrode, at least one second electrode and at least one photovoltaic material sandwiched in between the first electrode and the second electrode, wherein either the first electrode or the second electrode is a pixelated electrode.
17. The optical detector according to claim 1, further comprising at least one transversal optical sensor, the transversal optical sensor being adapted to determine one or more of a transversal position of the light beam, a transversal position of an object from which the light beam propagates towards the optical detector or a transversal position of a light spot generated by the light beam, the transversal position being a position in at least one dimension perpendicular to an optical axis of the optical detector, the transversal optical sensor being adapted to generate at least one transversal sensor signal.
18. The optical detector according to claim 1, further comprising at least one imaging device being adapted to record an image.
19. The optical detector according to claim 18, wherein a hybrid sensor is used as the imaging device.
20. A detector system for determining a position of at least one object, the detector system comprising at least one optical detector according to claim 1, the detector system further comprising at least one beacon device adapted to direct at least one light beam towards the optical detector, wherein the beacon device is at least one of attachable to the object, holdable by the object and integratable into the object.
21. A human-machine interface for exchanging at least one item of information between a user and a machine, the human-machine interface comprising at least one optical detector according to claim 1.
22. An entertainment device for carrying out at least one entertainment function, wherein the entertainment device comprises at least one human-machine interface according to claim 21, wherein the entertainment device is designed to enable at least one item of information to be input by a player by means of the human-machine interface, wherein the entertainment device is designed to vary the entertainment function in accordance with the information.
23. A tracking system for tracking a position of at least one movable object, the tracking system comprising at least one optical detector according to claim 1, the tracking system further comprising at least one track controller, wherein the track controller is adapted to track a series of positions of the object at specific points in time.
24. A camera for imaging at least one object, the camera comprising at least one optical detector according to claim 1.
25. A method of optical detection, the method comprising: detecting at least one light beam with at least one optical sensor and at least one image sensor, wherein the optical sensor has at least one sensor region, wherein the image sensor is a pixelated sensor comprising a pixel matrix of image pixels; generating at least one sensor signal and at least one image signal, wherein the sensor signal of the optical sensor exhibits a non-linear dependency on an illumination of the sensor region by the light beam with respect to a total power of the illumination, and wherein the image signal of the image sensor exhibits a linear dependency on the illumination of the image pixels by the light beam with respect to the total power of the illumination; and evaluating the sensor signal and the image signal by using at least one evaluation device.
26. The method according to claim 25, wherein the non-linear dependency of the sensor signal on the total power of the illumination of the optical sensor is expressed by a non-linear function comprising a linear part and a non-linear part, wherein the linear part and/or the non-linear part of the non-linear function are determined by evaluating both the sensor signal and the image signal.
27. The method according to claim 26, wherein a difference between the sensor signal and the image signal is determined for providing the non-linear part of the non-linear function, in particular by using a processing circuit being adapted to provide a difference between the sensor signal and the image signal.
28. An article, comprising the optical detector according to claim 1, wherein the article is adapted to function as an article for an application selected from the group consisting of: a position measurement in traffic technology; an entertainment application; a security application; a human-machine interface application; a tracking application; a photography application; an imaging application or camera application; a mapping application for generating maps of at least one space; a mobile application; a webcam; a computer peripheral device; a gaming application; an audio application; a camera or video application; a surveillance application; an automotive application; a transport application; a medical application; an agricultural application; an application connected to breeding plants or animals; a crop protection application; a sports application; a machine vision application; a vehicle application; an airplane application; a ship application; a spacecraft application; a building application; a construction application; a cartography application; a manufacturing application; a quality control application; a use in combination with at least one time-of-flight detector; an application in a local positioning system; an application in a global positioning system; an application in a landmark-based positioning system; an application in an indoor navigation system; an application in an outdoor navigation system; an application in a household application; a robot application; an application in an automatic door opener; and an application in a light communication system.
Description
BRIEF DESCRIPTION OF THE FIGURES
[0483] Further optional details and features of the invention are evident from the description of preferred exemplary embodiments which follows in conjunction with the dependent claims. In this context, the particular features may be implemented alone or in any reasonable combination. The invention is not restricted to the exemplary embodiments. The exemplary embodiments are shown schematically in the figures. Identical reference numerals in the individual figures refer to identical elements or elements with identical function, or elements which correspond to one another with regard to their functions.
[0484] In the Figures:
EXEMPLARY EMBODIMENTS
[0490] In
[0491] The optical detector 110 comprises at least one optical sensor 122, which is embodied as a FiP sensor, i.e. the optical sensor 122 has a sensor region 124 which may be illuminated by the light beam 120, thereby creating a light spot 126 in the sensor region 124. The FiP sensor 122 is further adapted to generate at least one sensor signal, wherein the sensor signal, given the same total power of illumination, is dependent on the width of the light beam 120, such as on the diameter or the equivalent diameter of the light spot 126, in the sensor region 124. Thus, the sensor signal of the optical sensor 122 exhibits a non-linear dependency on an illumination of the sensor region 124 by the light beam 120 with respect to a total power of the illumination.
[0492] For further details regarding potential setups of the FiP sensor 122, reference may be made to e.g. WO 2012/110924 A1 or US 2012/0206336 A1, e.g. to the embodiment shown in
[0493] The optical detector 110 further comprises at least one image sensor 128 which may, preferably, be located in a beam path 130 in which the optical sensor 122 might also be located. According to the present invention, the image sensor 128 is an inorganic pixelated sensor which comprises a pixel matrix of image pixels within its sensor region 124, which will be illustrated in more detail, for example, in
[0494] The optical detector 110 further comprises at least one evaluation device 132. The evaluation device 132 may, preferably, be connected by at least one connector 134 to the at least one optical sensor 122 in order to receive the sensor signals from the at least one optical sensor 122. As described above, the sensor signals as received from the optical sensor 122 comprise longitudinal optical sensor signals but may, depending on the setup of the optical sensor 122, further comprise transversal sensor signals. In a similar manner, the evaluation device 132 may, preferably, further be connected by at least one further connector 134 to the at least one image sensor 128 in order to receive the image signals from the at least one image sensor 128. Herein, the signal transmission to the evaluation device 132 may take place in a wire-bound or even in a wireless fashion. As an example, the evaluation device 132 may comprise one or more computers, such as one or more processors, and/or one or more application-specific integrated circuits (ASICs).
[0495] According to the present invention, the evaluation device 132 is adapted to evaluate both the sensor signal and the image signal. As outlined above, the sensor signal of the optical sensor 122 exhibits a non-linear dependency on an illumination of the sensor region 124 by the light beam 120 with respect to a total power of the illumination, whereas the image signal exhibits a linear dependency on the illumination of the sensor region 124 comprising the image pixels by the light beam 120 with respect to the total power of the illumination. Accordingly, the sensor signal may, thus, exhibit a dependency on the total power of the illumination and, as a consequence of the above described FiP effect, on the geometry of the illumination. Therefore, in a first respect, the sensor signal as generated by the optical sensor 122 exhibits, in the same manner as the image sensor 128, a linear dependency on the power of the illumination, which may, however, be superimposed, in a second respect, by the additional non-linear dependency on the geometry of the illumination of the optical sensor 122.
[0496] As used in the example as depicted in
[0497] Therefore, the evaluation device 132 may, preferably, comprise a processing circuit 136 which may be adapted to provide a difference between the sensor signal and the image signal at its output 138. As mentioned above, the purely non-linear part as derived from the sensor signal of the FiP sensor may typically exhibit, for low intensities of the incident light beam 120, a strong contribution which might be dominant, whereas the purely non-linear part of the sensor signal of the optical sensor 122 may, for increasing intensities of the incident light beam 120, decrease. In this regard, the linear part of the non-linear function may be considered as a kind of asymptotic background which could, preferably, be subtracted from the desired signal, i.e. the purely non-linear part which may directly be related to the above-described FiP effect. In order to be able to provide the purely non-linear part of the non-linear function at the output 138 of the processing circuit 136, a first input 140 of the processing circuit 136 may be adapted to receive the total non-linear function by acquiring the sensor signal from the optical sensor 122, while a second input 142 may be adapted to receive the linear part of the non-linear function by acquiring the image signal from the image sensor 128.
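The subtraction performed by the processing circuit described above can be illustrated with a short numerical sketch. This is a simplified model, not taken from the disclosure: the function names, the 1/(1 + d²) form of the geometry-dependent term and all parameter values are illustrative assumptions.

```python
def sensor_signal(power, spot_diameter, k_lin=1.0, k_fip=0.5):
    """Model of the FiP sensor signal: a linear part proportional to
    the total power of the illumination, superimposed by a non-linear,
    geometry-dependent part that grows as the light spot narrows
    (illustrative model only, not the disclosed transfer function)."""
    linear_part = k_lin * power
    nonlinear_part = k_fip * power / (1.0 + spot_diameter ** 2)
    return linear_part + nonlinear_part


def image_signal(power, k_lin=1.0):
    """Model of the image sensor signal: strictly linear in the total
    power of the illumination, independent of the spot geometry."""
    return k_lin * power


def fip_part(power, spot_diameter):
    """Role of the processing circuit: the difference of the two
    signals subtracts the linear 'asymptotic background' and isolates
    the purely non-linear (FiP) contribution."""
    return sensor_signal(power, spot_diameter) - image_signal(power)
```

For a very wide spot the difference vanishes, reflecting that the linear part alone remains; for a narrow spot the difference approaches the full geometry-dependent contribution.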
[0498] As schematically depicted in
[0499] In this particular example, the optical sensor 122 which exhibits the above-described FiP effect may be developed in different manners. In a first alternative, the sensor region 124 of the optical sensor 122 may, preferably, be a uniform sensor surface such that the optical sensor 122 may also be denoted a large-area optical sensor. In general, as disclosed e.g. in one or more of WO 2012/110924 A1, US 2012/0206336 A1, WO 2014/097181 A1 or US 2014/0291480 A1, the setup as shown in
[0500] In addition, the optical detector 110 may further comprise at least one lens 148 which may be located in the beam path 130 of the light beam 120, such that, preferably, the light beam 120 may pass the lens 148 before reaching the at least one optical sensor 122 and, preferably subsequently, the at least one image sensor 128. This kind of arrangement may particularly be preferred in an embodiment in which the optical sensor 122 may be at least partially transparent while the image sensor 128 might be transparent or, alternatively, intransparent. The latter may, thus, allow using an intransparent image sensor 128 as known from the state of the art. Herein, the lens 148 may, preferably, be a focus-tunable lens 150 which may be adapted to modify a focal position of the light beam 120, in particular, since it may be adapted to change its own focal length, in a controlled fashion. As an example, at least one commercially available focus-tunable lens may, thus, be used, such as at least one electrically tunable lens. It shall be noted, however, that other types of lenses may be used in addition or alternatively.
[0501] Further, the image sensor 128 may be used as an imaging device 152 which may be adapted to record an image as captured by the optical detector 110. Generally, the imaging device 152 may relate to an arbitrary device which may comprise at least one light-sensitive element which may be temporally and/or spatially resolving and, thus, adapted to record spatially resolved optical information, in one, two, or three dimensions.
[0502] The setup of the optical detector 110 as shown in
[0503] In
[0504] As schematically depicted in
[0505] As shown in
[0506] The organic optical sensor 122 in the hybrid sensor 156 may, still, be a large-area optical sensor having a uniform sensor surface which comprises the sensor region 124 in the same or a similar manner as the optical sensors 122 in the exemplary setups as illustrated in
[0507] On the other hand, the image sensor 128 as further used within the hybrid sensor 156 may be an inorganic image sensor 128 and, thus, comprise at least one CCD device or at least one CMOS device. In particular, the image sensor 128 may also be employed as a transversal optical sensor, which may be adapted to determine one or more transversal components of the at least one object 118 within the scene 114 in the surroundings 116 of the optical detector 110. Herein, the image sensor 128 may, generally, be shaped in the form of a pixel matrix 174 of separate image pixels 176. Similar to the optical sensor 122, the image sensor 128 may comprise an arbitrary number of image pixels 176, such as a number which may especially be suitable or required for the intended purposes. Further, the matrix 174 of image pixels 176 in the image sensor 128 may, generally, comprise the same number of pixels or, preferably as shown in
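The resolution relationship described above — the image sensor having a first pixel resolution that equals or exceeds the second pixel resolution of the pixelated optical sensor, with each sensor pixel registered to a block of finer image pixels — can be sketched as follows. The 4×4 default block size and the use of block averaging to register the two signals are illustrative assumptions:

```python
def image_to_sensor_grid(image_matrix, block=4):
    """Aggregate the fine image-pixel matrix (first pixel resolution)
    into the coarser sensor-pixel grid (second pixel resolution) by
    averaging each block x block group of image pixels. Assumes the
    two resolutions are commensurate (illustrative registration)."""
    h, w = len(image_matrix), len(image_matrix[0])
    assert h % block == 0 and w % block == 0, "resolutions must be commensurate"
    coarse = []
    for bi in range(h // block):
        row = []
        for bj in range(w // block):
            total = sum(image_matrix[bi * block + i][bj * block + j]
                        for i in range(block) for j in range(block))
            row.append(total / block ** 2)
        coarse.append(row)
    return coarse

# Example: an 8x8 image-pixel matrix maps onto a 2x2 sensor-pixel grid.
img = [[8 * r + c for c in range(8)] for r in range(8)]
coarse = image_to_sensor_grid(img)
```

Such a registration would let the evaluation device compare each sensor signal with the summed linear image signal of the image pixels lying underneath the corresponding sensor pixel.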
[0508] As already mentioned above, the pixelated optical sensor 162 comprises the marginal sensor pixels 168 located at the periphery 170 of the pixelated optical sensor 122 and the non-marginal sensor pixels 172 located apart from the periphery 170 within the pixel array 164. However, since it may be preferable to directly place the pixelated optical sensor 162 on top of the image sensor 128, wherein the term “on top” may be interpreted with respect to the z-coordinate in the coordinate system 146, a problem may occur concerning the provision of electrical contacts to the non-marginal sensor pixels 172 within the pixel array 164. Whereas electrical contacts may directly be attached to each of the easily accessible marginal sensor pixels 168 of the pixelated optical sensor 162, the problem relating to the at least one non-marginal sensor pixel 172, i.e. the sensor pixel 172 which is not located at the readily accessible periphery 170 of the pixelated optical sensor 162, may be solved, according to the present invention, by using an image sensor 128 which may comprise one or more of the top contacts (not depicted here).
[0509] Accordingly, as shown in
[0510] The optical detector 110 as schematically depicted in
[0511] Further, information as generated by the processing circuit 136 may be combined with other information as generated by the evaluation device 132, such as the depth information as derived from the sensor signal provided by the pixelated optical sensor 162 or image information as derived from the image signal by the image sensor 128 and, subsequently, evaluated in an image evaluation device 182, which may be part of the evaluation device 132 and/or of the image sensor 128. However, other arrangements are feasible.
[0512] The optical detector 110 may further comprise at least one focus-modulation device 184 which can be connected to the at least one focus-tunable lens 150. The at least one focus-modulation device 184 may, thus, be adapted to provide at least one focus-modulating signal to the at least one focus-tunable lens 150. Herein, the focus-modulation device 184 may be an individual unit being separated from the focus-tunable lens 150 and/or may be fully or partially integrated into the focus-tunable lens 150. As depicted in
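A focus-modulating signal of the kind the focus-modulation device may provide to the focus-tunable lens can be sketched as a periodic variation of the focal length. The sinusoidal waveform, the modulation frequency and the amplitude below are illustrative assumptions only, not parameters taken from the disclosure:

```python
import math


def focus_modulating_signal(t, f_mod=300.0, f0=0.1, amplitude=0.02):
    """Sketch of a periodic focus-modulating signal: the focal length
    (in metres) of the focus-tunable lens oscillates around a centre
    value f0 at modulation frequency f_mod (Hz). All values are
    illustrative; a real electrically tunable lens would be driven
    within its own specified focal range and bandwidth."""
    return f0 + amplitude * math.sin(2.0 * math.pi * f_mod * t)
```

Sweeping the focal position in this way would periodically vary the width of the light spot on the FiP sensor, and hence the non-linear part of its sensor signal, while the linear image signal tracks only the total power.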
[0513]
[0514] As schematically depicted in
[0515]
[0516] In a first embodiment, the at least one optical sensor 122 may, as schematically depicted in
[0517] As a further embodiment,
[0518] In this regard, it may be mentioned that the sensor pixels 166 within the pixelated optical sensor 162 may be one of the marginal sensor pixels 168 at the periphery 170 of the pixelated optical sensor 162 or, in the case where the pixel array 164 comprises at least 3×3 sensor pixels 166, one of the non-marginal sensor pixels 172 which are located apart from the periphery 170 of the pixel array 164.
[0519] As a further embodiment,
[0520] However, in a specific embodiment, at least one electronic element (not depicted here) may be placed in a vicinity of, in particular each of, the sensor pixels 166 on the same surface as the sensor pixels 166. Herein, the electronic elements may be adapted to contribute to an evaluation of the signal as provided by the corresponding sensor pixel 166 and might, thus, comprise one or more of: a connector, a capacitor, a diode, a transistor. However, since the electronic elements are not sensitive to the illumination by the incident light beam, in the sense described above, and thus do not contribute to the sensor signal of the pixelated sensor 162, 162′, the area on the surface of the respective pixelated sensor 162, 162′ may only partially contribute to the sensor signal as the sensor region 124. In addition, two adjoining sensor pixels 166 may be separated from each other by a separating strip, wherein the strip may comprise an electrically non-conducting material, such as a photoresist, which may, particularly, be adapted to avoid a cross-talk between the two adjacent sensor pixels 166, so that the strip may also not be able to contribute to the sensor signal.
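The reduction of the photosensitive area described above — area lost to the per-pixel electronic elements and to the separating strips — can be estimated with a simple geometric fill-factor sketch. All dimensions, and the assumption that each separating strip is shared between two neighbouring pixels, are illustrative and not taken from the disclosure:

```python
def fill_factor(pixel_pitch_um, electronics_area_um2, strip_width_um):
    """Fraction of a sensor pixel's footprint that remains
    photosensitive after subtracting the separating strip along its
    borders (shared with the neighbouring pixels, so each pixel loses
    one strip width of side length) and the area occupied by the
    per-pixel electronic elements (illustrative geometry)."""
    active_side = pixel_pitch_um - strip_width_um
    active_area = active_side ** 2 - electronics_area_um2
    return active_area / pixel_pitch_um ** 2

# Example: 100 um pitch, 1000 um^2 of electronics, 10 um strip -> 0.71
ff = fill_factor(100.0, 1000.0, 10.0)
```

The remaining fraction is the part of the surface that acts as the sensor region 124 in the sense described above.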
[0521] However, the embodiment as presented in
[0522] As outlined above, the optical detector 110 and the camera 154 may be used in various devices or systems.
[0523] With regard to the optical detector 110, reference may be made to the disclosure given above or given in further detail below. Basically, all potential embodiments of the detector 110 may also be embodied in the embodiment shown in
[0524] In the exemplary embodiment shown in
[0525] The optical detector 110 may be adapted to determine at least one item of information on a longitudinal position of one or more of the beacon devices 208 and, optionally, at least one item of information regarding a transversal position thereof, and/or at least one other item of information regarding the longitudinal position of the object 118 and, optionally, at least one item of information regarding a transversal position of the object 118. Additionally, the optical detector 110 may be adapted for identifying colors and/or for imaging the object 118. An opening 210 in the housing 202, which, preferably, may be located concentrically with regard to the optical axis 112 of the detector 110, preferably defines a direction of view 212 of the optical detector 110.
[0526] The optical detector 110 may be adapted for determining a position of the at least one object 118. Additionally, the optical detector 110, specifically in an embodiment including the camera 154, may be adapted for acquiring at least one image of the object 118, preferably a 3D-image. As outlined above, the determination of a position of the object 118 and/or a part thereof within the scene 114 using the optical detector 110 and/or the detector system 194 may be used for providing a human-machine interface 196, in order to provide at least one item of information to a machine 214. In the embodiments schematically depicted in
[0527] Similarly, as outlined above, the human-machine interface 196 may form part of the entertainment device 198. Thus, by means of the user 206 functioning as the object 118 and/or by means of the user 206 handling the object 118 and/or the control element 204 functioning as the object 118, the user 206 may input at least one item of information, such as at least one control command, into the machine 214, particularly the computer, thereby varying the entertainment function, such as controlling the course of a computer game.
[0528] As outlined above, the optical detector 110 may have a beam path 130, wherein the beam path 130 may be a straight beam path or a tilted beam path, an angulated beam path, a branched beam path, a deflected or split beam path or other types of beam paths. Further, the light beam 120 may propagate along each beam path 130 or partial beam path once or repeatedly, unidirectionally or bidirectionally. Thereby, the components listed above or the optional further components listed in further detail below may fully or partially be located in front of the at least one hybrid sensor 156 and/or behind the at least one hybrid sensor 156 as depicted in
LIST OF REFERENCE NUMBERS
[0529] 110 Optical detector
[0530] 112 Optical axis
[0531] 114 Scene
[0532] 116 Surroundings
[0533] 118 Object
[0534] 120 Light beam
[0535] 122 Optical sensor, FiP sensor
[0536] 124 Sensor region
[0537] 126 Light spot
[0538] 128 Image sensor
[0539] 130 Beam path
[0540] 132 Evaluation device
[0541] 134 Connector
[0542] 136 Processing circuit
[0543] 138 Output of processing circuit
[0544] 140 First input of processing circuit
[0545] 142 Second input of processing circuit
[0546] 144 Operational amplifier
[0547] 146 Coordinate system
[0548] 148 Lens
[0549] 150 Focus-tunable lens
[0550] 152 Imaging device
[0551] 154 Camera
[0552] 156 Hybrid sensor
[0553] 158 Volume
[0554] 160 Distance
[0555] 162, 162′ Pixelated optical sensor
[0556] 164 Pixel array
[0557] 166 Sensor pixel
[0558] 168 Marginal sensor pixel
[0559] 170 Periphery
[0560] 172 Non-marginal sensor pixel
[0561] 174 Pixel matrix
[0562] 176 Image pixel
[0563] 178 Matrix
[0564] 180 Bond contact
[0565] 182 Image evaluation device
[0566] 184 Focus-modulation device
[0567] 185, 185′ Top contact
[0568] 186, 186′ Transparent contact
[0569] 188 Large-area optical sensor
[0570] 190 Extent of shift
[0571] 192 Length of side edge
[0572] 194 Detector system
[0573] 196 Human-machine interface
[0574] 198 Entertainment device
[0575] 200 Tracking system
[0576] 202 Housing
[0577] 204 Control element
[0578] 206 User
[0579] 208 Beacon device
[0580] 210 Opening
[0581] 212 Direction of view
[0582] 214 Machine
[0583] 216 Track controller