OPTICAL DETECTOR

20170363465 · 2017-12-21

Assignee

Inventors

CPC classification

International classification

Abstract

An optical detector (110) is disclosed, comprising: at least one optical sensor (122) adapted to detect a light beam (120) and to generate at least one sensor signal, wherein the optical sensor (122) has at least one sensor region (124), wherein the sensor signal of the optical sensor (122) exhibits a non-linear dependency on an illumination of the sensor region (124) by the light beam (120) with respect to a total power of the illumination; at least one image sensor (128) being a pixelated sensor comprising a pixel matrix (174) of image pixels (176), wherein the image pixels (176) are adapted to detect the light beam (120) and to generate at least one image signal, wherein the image signal exhibits a linear dependency on the illumination of the image pixels (176) by the light beam (120) with respect to the total power of the illumination; and at least one evaluation device (132), the evaluation device (132) being adapted to evaluate the sensor signal and the image signal. In a particularly preferred embodiment, the non-linear dependency of the sensor signal on the total power of the illumination of the optical sensor (122) is expressible by a non-linear function comprising a linear part and a non-linear part, wherein the evaluation device (132) is adapted to determine the linear part and/or the non-linear part of the non-linear function by evaluating both the sensor signal and the image signal. Herein, the evaluation device (132), preferably, comprises a processing circuit (136) being adapted to provide a difference between the sensor signal and the image signal for determining the non-linear part of the non-linear function.

Claims

1. An optical detector, comprising: at least one optical sensor adapted to detect a light beam and to generate at least one sensor signal, wherein the optical sensor has at least one sensor region, wherein the sensor signal of the optical sensor exhibits a non-linear dependency on an illumination of the sensor region by the light beam with respect to a total power of the illumination; at least one image sensor, being a pixelated sensor comprising a pixel matrix of image pixels, wherein the image pixels are adapted to detect the light beam and to generate at least one image signal, wherein the image signal exhibits a linear dependency on the illumination of the image pixels by the light beam with respect to the total power of the illumination; and at least one evaluation device, the evaluation device being adapted to evaluate the sensor signal and the image signal.

2. The optical detector according to claim 1, wherein the non-linear dependency of the sensor signal on the total power of the illumination of the optical sensor is expressible by a non-linear function comprising a linear part and a non-linear part, wherein the evaluation device is adapted to determine the linear part, the non-linear part, or both, of the non-linear function by evaluating both the sensor signal and the image signal.

3. The optical detector according to claim 2, wherein the evaluation device comprises a processing circuit being adapted to provide a difference between the sensor signal and the image signal for determining the non-linear part of the non-linear function.

4. The optical detector according to claim 1, comprising at least one hybrid sensor, wherein the hybrid sensor comprises at least one of the optical sensors and at least one of the image sensors.

5. The optical detector according to claim 1, wherein the optical sensor is located in a direct vicinity of the image sensor.

6. The optical detector according to claim 5, wherein the optical sensor and the image sensor at least partially touch each other.

7. The optical detector according to claim 1, wherein the optical sensor and the image sensor are arranged in a manner that the light beam first impinges on the optical sensor.

8. The optical detector according to claim 1, wherein the image sensor is an inorganic image sensor.

9. The optical detector according to claim 1, wherein the optical sensor is a large-area optical sensor or a pixelated optical sensor.

10. The optical detector according to claim 9, wherein the optical sensor is a pixelated optical sensor comprising a pixel array of sensor pixels.

11. The optical detector according to claim 10, wherein at least one electronic element is placed in a vicinity of the sensor pixel on a surface on which both the at least one electronic element and the sensor pixel are located, wherein the at least one electronic element may be adapted to contribute to an evaluation of the signal provided by the sensor pixel, wherein the at least one electronic element preferably comprises one or more of: a connector, a capacitor, a diode, a transistor.

12. The optical detector according to claim 10, wherein at least two pixelated optical sensors are arranged on top of each other, wherein a location of the at least two pixelated optical sensors is shifted by an extent with respect to each other.

13. The optical detector according to claim 10, wherein the sensor pixel is electrically connected to a top contact provided by the image pixel of the image sensor.

14. The optical detector according to claim 10, wherein the image sensor has a first pixel resolution, wherein the pixelated optical sensor has a second pixel resolution, wherein the first pixel resolution equals or exceeds the second pixel resolution.

15. The optical detector according to claim 14, wherein the sensor pixel comprises a pixel array of at least 4×4 display pixels.

16. The optical detector according to claim 1, wherein the optical sensor comprises at least one first electrode, at least one second electrode and at least one photovoltaic material sandwiched in between the first electrode and the second electrode, wherein either the first electrode or the second electrode is a pixelated electrode.

17. The optical detector according to claim 1, further comprising at least one transversal optical sensor, the transversal optical sensor being adapted to determine one or more of a transversal position of the light beam, a transversal position of an object from which the light beam propagates towards the optical detector or a transversal position of a light spot generated by the light beam, the transversal position being a position in at least one dimension perpendicular to an optical axis of the optical detector, the transversal optical sensor being adapted to generate at least one transversal sensor signal.

18. The optical detector according to claim 1, further comprising at least one imaging device being adapted to record an image.

19. The optical detector according to claim 18, wherein a hybrid sensor is used as the imaging device.

20. A detector system for determining a position of at least one object, the detector system comprising at least one optical detector according to claim 1, the detector system further comprising at least one beacon device adapted to direct at least one light beam towards the optical detector, wherein the beacon device is at least one of attachable to the object, holdable by the object and integratable into the object.

21. A human-machine interface for exchanging at least one item of information between a user and a machine, the human-machine interface comprising at least one optical detector according to claim 1.

22. An entertainment device for carrying out at least one entertainment function, wherein the entertainment device comprises at least one human-machine interface according to claim 21, wherein the entertainment device is designed to enable at least one item of information to be input by a player by means of the human-machine interface, wherein the entertainment device is designed to vary the entertainment function in accordance with the information.

23. A tracking system for tracking a position of at least one movable object, the tracking system comprising at least one optical detector according to claim 1, the tracking system further comprising at least one track controller, wherein the track controller is adapted to track a series of positions of the object at specific points in time.

24. A camera for imaging at least one object, the camera comprising at least one optical detector according to claim 1.

25. A method of optical detection, the method comprising: detecting at least one light beam with at least one optical sensor and at least one image sensor, wherein the optical sensor has at least one sensor region, wherein the image sensor is a pixelated sensor comprising a pixel matrix of image pixels; generating at least one sensor signal and at least one image signal, wherein the sensor signal of the optical sensor exhibits a non-linear dependency on an illumination of the sensor region by the light beam with respect to a total power of the illumination, and wherein the image signal of the image sensor exhibits a linear dependency on the illumination of the image pixels by the light beam with respect to the total power of the illumination; and evaluating the sensor signal and the image signal by using at least one evaluation device.

26. The method according to claim 25, wherein the non-linear dependency of the sensor signal on the total power of the illumination of the optical sensor is expressed by a non-linear function comprising a linear part and a non-linear part, wherein the linear part and/or the non-linear part of the non-linear function are determined by evaluating both the sensor signal and the image signal.

27. The method according to claim 26, wherein a difference between the sensor signal and the image signal is determined for providing the non-linear part of the non-linear function, in particular by using a processing circuit being adapted to provide a difference between the sensor signal and the image signal.

28. An article, comprising the optical detector according to claim 1, wherein the article is adapted to function as an article for an application selected from the group consisting of: a position measurement in traffic technology; an entertainment application; a security application; a human-machine interface application; a tracking application; a photography application; an imaging application or camera application; a mapping application for generating maps of at least one space; a mobile application; a webcam; a computer peripheral device; a gaming application; an audio application; a camera or video application; a security application; a surveillance application; an automotive application; a transport application; a medical application; an agricultural application; an application connected to breeding plants or animals; a crop protection application; a sports application; a machine vision application; a vehicle application; an airplane application; a ship application; a spacecraft application; a building application; a construction application; a cartography application; a manufacturing application; a quality control application; a use in combination with at least one time-of-flight detector; an application in a local positioning system; an application in a global positioning system; an application in a landmark-based positioning system; an application in an indoor navigation system; an application in an outdoor navigation system; an application in a household application; a robot application; an application in an automatic door opener; and an application in a light communication system.

Description

BRIEF DESCRIPTION OF THE FIGURES

[0483] Further optional details and features of the invention are evident from the description of preferred exemplary embodiments which follows in conjunction with the dependent claims. In this context, the particular features may be implemented alone or in any reasonable combination. The invention is not restricted to the exemplary embodiments. The exemplary embodiments are shown schematically in the figures. Identical reference numerals in the individual figures refer to identical elements or elements with identical function, or elements which correspond to one another with regard to their functions.

[0484] In the Figures:

[0485] FIG. 1 shows a first embodiment of an optical detector according to the present invention, comprising an optical sensor, a separate image sensor and a specifically adapted evaluation device;

[0486] FIG. 2 shows a further embodiment of an optical detector according to the present invention, wherein the optical sensor and the image sensor constitute a hybrid sensor;

[0487] FIG. 3 shows a particular embodiment according to the present invention, wherein an electrical connection to a sensor pixel of the optical sensor is provided by a top contact of an image pixel of the image sensor;

[0488] FIG. 4 shows three exemplary embodiments of the optical sensor, i.e. a large-area optical sensor (FIG. 4A), a pixelated optical sensor (FIG. 4B), and an arrangement of two pixelated optical sensors shifted with respect to each other (FIG. 4C); and

[0489] FIG. 5 shows an exemplary embodiment of the optical detector, a detector system, a human-machine interface, an entertainment device, a tracking system, and a camera according to the present invention.

EXEMPLARY EMBODIMENTS

[0490] In FIG. 1, a first exemplary embodiment of an optical detector 110 according to the present invention is shown in a highly schematic cross-sectional view, in a plane parallel to an optical axis 112 of the optical detector 110. The optical detector 110 may be used for detecting a scene 114 or a part thereof, wherein the scene 114 refers to the surroundings 116 of the optical detector 110, wherein an image of the scene 114 or the part thereof may be taken. The at least one image of the scene 114 or the part thereof may comprise a single image or a progressive sequence of images, such as a video or video clip. In this particular example, the scene simply comprises an object 118. The object 118 may be adapted for emitting and/or for reflecting one or more light beams 120 towards the optical detector 110.

[0491] The optical detector 110 comprises at least one optical sensor 122, which is embodied as a FiP sensor, i.e. as an optical sensor 122 having a sensor region 124 which may be illuminated by the light beam 120, thereby creating a light spot 126 in the sensor region 124. The FiP sensor 122 is further adapted to generate at least one sensor signal, wherein the sensor signal, given the same total power of illumination, is dependent on the width of the light beam 120, such as on the diameter or the equivalent diameter of the light spot 126, in the sensor region 124. Thus, the sensor signal of the optical sensor 122 exhibits a non-linear dependency on an illumination of the sensor region 124 by the light beam 120 with respect to a total power of the illumination.

[0492] For further details regarding potential setups of the FiP sensor 122, reference may be made to e.g. WO 2012/110924 A1 or US 2012/0206336 A1, e.g. to the embodiment shown in FIG. 2 and the corresponding description, and/or to WO 2014/097181 A1 or US 2014/0291480 A1, e.g. the longitudinal optical sensor shown in FIGS. 4A to 4C and the corresponding description. It shall be noted, however, that other embodiments of the optical sensor 122, specifically the FiP sensor, are feasible, such as by using one or more of the embodiments as described in detail above.

[0493] The optical detector 110 further comprises at least one image sensor 128 which may, preferably, be located in a beam path 130 in which the optical sensor 122 might also be located. According to the present invention, the image sensor 128 is an inorganic pixelated sensor which comprises a pixel matrix of image pixels within its sensor region 124, which will be illustrated in more detail, for example, in FIG. 2. For this purpose, the sensor region of the image sensor 128 may, preferably, comprise a CCD device or a CMOS device as already mentioned above. However, embodiments wherein the image sensor 128 may be an organic pixelated sensor comprising a pixel matrix of image pixels within its sensor region 124 may also be feasible. Herein, the image pixels in the sensor region 124 of the image sensor 128 are adapted to detect the light beam 120 and to generate at least one image signal. In contrast to the sensor signal as generated by the optical sensor 122, the image signal exhibits a linear dependency on the illumination of the image pixels by the light beam 120 with respect to the total power of the illumination of the sensor region 124 of the image sensor 128.

[0494] The optical detector 110 further comprises at least one evaluation device 132. The evaluation device 132 may, preferably, be connected by at least one connector 134 to the at least one optical sensor 122 in order to receive the sensor signals from the at least one optical sensor 122. As described above, the sensor signals as received from the optical sensor 122 comprise longitudinal optical sensor signals but may, depending on the setup of the optical sensor 122, further comprise transversal sensor signals. In a similar manner, the evaluation device 132 may, preferably, further be connected by at least one further connector 134 to the at least one image sensor 128 in order to receive the image signals from the at least one image sensor 128. Herein, the signal transmission to the evaluation device 132 may take place in a wire-bound or even in a wireless fashion. As an example, the evaluation device 132 may comprise one or more computers, such as one or more processors, and/or one or more application-specific integrated circuits (ASICs).

[0495] According to the present invention, the evaluation device 132 is adapted to evaluate both the sensor signal and the image signal. As outlined above, the sensor signal of the optical sensor 122 exhibits a non-linear dependency on an illumination of the sensor region 124 by the light beam 120 with respect to a total power of the illumination, whereas the image signal exhibits a linear dependency on the illumination of the sensor region 124 comprising the image pixels by the light beam 120 with respect to the total power of the illumination. Accordingly, the sensor signal may, thus, exhibit a dependency on the total power of the illumination and, as a consequence of the above described FiP effect, on the geometry of the illumination. Therefore, in a first respect, the sensor signal as generated by the optical sensor 122 exhibits, in the same manner as the image sensor 128, a linear dependency on the power of the illumination, which may, however, be superimposed, in a second respect, by the additional non-linear dependency on the geometry of the illumination of the optical sensor 122.

[0496] As used in the example as depicted in FIG. 1, the non-linear dependency of the sensor signal on the total power of the illumination of the optical sensor may be expressed by a non-linear function comprising both a linear part and a non-linear part, wherein the sum of both parts may, apart from further effects, describe the non-linear behavior of the sensor signal with respect to the illumination of the sensor region 124. In a similar manner, the image signal may be expressed solely by the linear part of the mentioned non-linear function since the image signal exhibits a linear dependency on the illumination of the image pixels by the light beam 120.
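The decomposition described in this paragraph can be sketched numerically. The function names, the linear coefficient and the particular form chosen for the non-linear (FiP) part below are illustrative assumptions, not values or models taken from the disclosure:

```python
# Illustrative model of the two signals described above. The linear
# coefficient `a` and the assumed form of the non-linear (FiP)
# contribution are hypothetical placeholders, not device physics.

def image_signal(power, a=1.0):
    """Image sensor: strictly linear in the total illumination power."""
    return a * power

def sensor_signal(power, spot_diameter, a=1.0, b=0.5):
    """FiP-type optical sensor: the same linear part plus a
    geometry-dependent non-linear part that grows as the spot narrows."""
    linear_part = a * power
    nonlinear_part = b * power / (1.0 + spot_diameter ** 2)  # assumed form
    return linear_part + nonlinear_part

# Subtracting the image signal isolates the purely non-linear part:
p, d = 2.0, 1.0
fip_part = sensor_signal(p, d) - image_signal(p)
```

Under these assumed parameters the linear part cancels exactly, leaving only the geometry-dependent term, which is the quantity the evaluation device 132 is said to determine.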

[0497] Therefore, the evaluation device 132 may, preferentially, comprise a processing circuit 136 which may be adapted to provide a difference between the sensor signal and the image signal at its output 138. As mentioned above, the purely non-linear part as derived from the sensor signal of the FiP sensor may typically exhibit, for low intensities of the incident light beam 120, a strong contribution which might be dominant, whereas the purely non-linear part as part of the sensor signal of the optical sensor 122 may, for increasing intensities of the incident light beam 120, decrease. Within this regard, the linear part of the non-linear function may be considered as a kind of asymptotic background which could, preferably, be subtracted from the desired signal, i.e. the purely non-linear part which may directly be related to the above-described FiP effect. In order to be able to provide the purely non-linear part of the non-linear function at the output 138 of the processing circuit 136, a first input 140 of the processing circuit 136 may be adapted to receive the total non-linear function by acquiring the sensor signal from the optical sensor 122, while a second input 142 may be adapted to receive the linear part of the non-linear function by acquiring the image signal from the image sensor 128.

[0498] As schematically depicted in FIG. 1, the processing circuit 136 which may, preferably, be a part of the evaluation device 132 may, thus, comprise one or more operational amplifiers 144 which may, in a known arrangement, be configured to provide the difference between the sensor signal and the image signal at its output 138. As a result, by providing the difference between the sensor signal and the image signal, the purely non-linear part of a corresponding physical quantity, such as a sensor current or a sensor voltage, may, thus, be provided at the output 138 of the processing circuit 136. Therefore, the embodiment as illustrated in FIG. 1 may, thus, be useful for determining the non-linear contribution provided by the FiP effect, particularly at low intensities of the incident light beam 120. Advantageously, it may, thus, be possible to increase the signal quality of the sensor signal, such as the signal-to-noise ratio, in this manner, in particular for low intensities. However, other devices for providing the mentioned difference may also be employed, such as other electronic devices (not depicted here) or, alternatively or in addition, a piece of software adapted to perform the same task, wherein the software may be executable within or outside the evaluation device 132.
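As a minimal sketch of the software alternative mentioned above, the difference between the two signals may be formed sample by sample; the function name and the numeric traces below are invented for illustration only:

```python
# Minimal software counterpart to the differencing circuit described
# above: subtracting the (linear) image signal from the sensor signal,
# sample by sample, leaves only the non-linear FiP contribution.
# The numeric values are made-up sample readings, not device data.

def nonlinear_part(sensor_samples, image_samples):
    """Element-wise difference between sensor and image signal traces."""
    if len(sensor_samples) != len(image_samples):
        raise ValueError("sample traces must have equal length")
    return [s - i for s, i in zip(sensor_samples, image_samples)]

sensor_trace = [1.30, 2.45, 3.52, 4.55]   # hypothetical sensor readings
image_trace  = [1.00, 2.00, 3.00, 4.00]   # hypothetical linear readings
fip_trace = nonlinear_part(sensor_trace, image_trace)
```

Whether this runs inside or outside the evaluation device 132 is, as the text notes, an implementation choice.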

[0499] In this particular example, the optical sensor 122 which exhibits the above-described FiP effect may be developed in different manners. In a first alternative, the sensor region 124 of the optical sensor 122 may, preferably, be a uniform sensor surface such that the optical sensor 122 may also be denominated a large-area optical sensor. In general, as disclosed e.g. in one or more of WO 2012/110924 A1, US 2012/0206336 A1, WO 2014/097181 A1 or US 2014/0291480 A1, by using the setup as shown in FIG. 1, at least one item of information on a longitudinal position of the scene 114 or a part thereof may be determined. By evaluating the sensor signals of the at least one optical sensor 122, a longitudinal coordinate of the scene 114, such as a z-coordinate, which is schematically shown in a coordinate system 146, may be determined. For this purpose, a known or determinable relationship between the at least one sensor signal and the z-coordinate may be used. For exemplary embodiments, reference may be made to the above-mentioned prior art documents. Further, by employing more than one optical sensor 122 in the form of a stack, ambiguities in the evaluation of the sensor signals may be resolved.
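One way such a known or determinable relationship might be applied in practice is via a calibration table of (sensor signal, z-coordinate) pairs that is interpolated to recover the longitudinal coordinate. The table values and the monotonically decreasing signal model below are assumptions for illustration only, not calibration data from the disclosure:

```python
# Hypothetical sketch: recover the longitudinal (z) coordinate from the
# sensor signal via a calibrated, monotonic relationship. The table is
# invented; a real device would be calibrated against known distances.
import bisect

# (sensor_signal, z_coordinate) pairs; signal assumed to fall with z.
calibration = [(0.90, 10.0), (0.70, 20.0), (0.50, 30.0), (0.30, 40.0)]

def z_from_signal(signal):
    """Piecewise-linear interpolation of z from a sensor signal value."""
    sigs_asc = [s for s, _ in reversed(calibration)]
    zs_asc = [z for _, z in reversed(calibration)]
    if signal <= sigs_asc[0]:
        return zs_asc[0]          # beyond farthest calibrated point
    if signal >= sigs_asc[-1]:
        return zs_asc[-1]         # nearer than closest calibrated point
    i = bisect.bisect_left(sigs_asc, signal)
    s0, s1 = sigs_asc[i - 1], sigs_asc[i]
    z0, z1 = zs_asc[i - 1], zs_asc[i]
    return z0 + (z1 - z0) * (signal - s0) / (s1 - s0)
```

A stack of optical sensors, as mentioned above, would supply several such signals per measurement, which is what allows ambiguities in a single curve to be resolved.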

[0500] In addition, the optical detector 110 may further comprise at least one lens 148 which may be located in the beam path 130 of the light beam 120, such that, preferably, the light beam 120 may pass the lens 148 before reaching the at least one optical sensor 122 and, preferably subsequently, the at least one image sensor 128. This kind of arrangement may particularly be preferred in an embodiment in which the optical sensor 122 may be at least partially transparent while the image sensor 128 might be transparent or, alternatively, intransparent. The latter may, thus, allow using an intransparent image sensor 128 as known from the state of the art. Herein, the lens 148 may, preferably, be a focus-tunable lens 150 which may be adapted to modify a focal position of the light beam 120, in particular, since it may be adapted to change its own focal length in a controlled fashion. As an example, at least one commercially available focus-tunable lens may, thus, be used, such as at least one electrically tunable lens. It shall be noted, however, that other types of lenses may be used in addition or alternatively.

[0501] Further, the image sensor 128 may be used as an imaging device 152 which may be adapted to record an image as captured by the optical detector 110. Generally, the imaging device 152 may relate to an arbitrary device which may comprise at least one light-sensitive element which may be time- and/or spatially resolving and, thus, adapted to record spatially resolved optical information, in one, two, or three dimensions.

[0502] The setup of the optical detector 110 as shown in FIG. 1 may be modified and/or improved in various ways. Thus, the components of the optical detector 110 may fully or partially be integrated into one or more housings which are not shown in FIG. 1. As an example, the at least one optical sensor 122 and the one or more image sensors 128 may be integrated into a tubular housing. Further, the lens 148, in particular the focus-tunable lens 150, and/or the evaluation device 132 may also fully or partially be integrated into the same or a different housing. Further, as outlined above, the at least one optical detector 110 may comprise additional optical components and/or may, additionally, comprise optical sensors which may or may not exhibit the above-mentioned FiP effect. Various other modifications which do not deviate from the general principle shown in FIG. 1 are feasible. By way of example, the optical detector 110 as shown in FIG. 1 may be embodied as a camera 154 or may be part of a camera 154. Thus, the camera 154 may be used specifically for 3D imaging, and may be made for acquiring still images and/or image sequences, such as digital video clips.

[0503] In FIG. 2, a further embodiment of the optical detector 110, which may also be used as the camera 154, is shown. Herein, the optical detector 110 comprises a modified setup which comprises a number of modifications with respect to the embodiment of FIG. 1, which may be realized in an isolated fashion or in combination. Accordingly, the optical sensor 122 and the image sensor 128 constitute a hybrid sensor 156, wherein the hybrid sensor 156 might, particularly, represent an assembly which may simultaneously comprise one or more optical sensors 122, in particular one or more FiP sensors as described above, and one or more image sensors 128, preferably one or more inorganic image sensors 128, in particular one or more CCD devices or one or more CMOS devices. Thus, the optical sensor 122 may be used for the purpose as described above, in particular in order to determine the depth of the object 118, while the image sensor 128 may be employed as the imaging device 152.

[0504] As schematically depicted in FIG. 2, the hybrid sensor 156 may comprise a spatial arrangement wherein the optical sensor 122 might be located in a direct vicinity of the image sensor 128, i.e. no further optical element may be placed in a volume 158 which may emerge between the optical sensor 122 and the image sensor 128, which are located at a distance 160 from each other. For the sake of clarity, the distance 160 between the optical sensor 122 and the image sensor 128 as shown in FIG. 2 and, thus, the volume 158 between the two different types of sensors 122, 128 is depicted in an exaggerated manner while, in practice, the distance 160 and, thus, the volume 158 may be kept rather small, particularly in order to keep effort and expenses for providing contacts between the optical sensor 122 and the image sensor 128 low. Further, keeping the distance 160 between the optical sensor 122 and the image sensor 128 low may, advantageously, result in a feature that both constituents of the hybrid sensor 156 may still be located within a tolerance range with respect to the focus of the light beam 120. Consequently, the distance 160 between the optical sensor 122, which may be in focus during a specific time interval, and the image sensor 128, which may be slightly out of focus during the same time interval, may still be tolerated with respect to acquiring an acceptably sharp image of the object 118 in the scene 114.

[0505] As shown in FIG. 2, the optical sensor 122 and the image sensor 128 in the hybrid sensor 156 are arranged in a stacked manner. Consequently, the incident light beam 120 first impinges on the optical sensor 122 before it reaches the image sensor 128. Herein, the sensor region 124 as comprised by both the optical sensor 122 and the image sensor 128 is arranged in a manner perpendicular to the optical axis 112 of the optical detector 110. In order to provide a maximum illumination intensity in the sensor region 124 of the image sensor 128 within this particular setup of the hybrid sensor 156, the optical sensor 122 may be fully or at least partially transparent, thus allowing a maximum transmission of the illumination of the incident light beam 120 through the optical sensor 122. Such a restriction with respect to the transmission of the illumination may, however, not equally be imposed on the image sensor 128. By way of example, a single image sensor 128 as used within the hybrid sensor 156 or a last image sensor 128 in a stack of image sensors 128 as employed within the hybrid sensor 156 may, still, be intransparent. This feature may be advantageous since it may allow using a large range of materials within the respective image sensor 128.

[0506] The organic optical sensor 122 in the hybrid sensor 156 may, still, be a large-area optical sensor having a uniform sensor surface which comprises the sensor region 124 in the same or a similar manner as the optical sensors 122 in the exemplary setup as illustrated in FIG. 1. However, it may rather be preferred to employ a partitioned or pixelated optical sensor 162 in the hybrid sensor 156, wherein the sensor region 124 of the pixelated optical sensor 162 may be established completely or at least partially by a pixel array 164 of separate sensor pixels 166. As schematically depicted in the simplified optical detector 110 according to FIG. 2, the pixel array 164 of the pixelated optical sensor 162 comprises 3×3 sensor pixels 166. As already described above, the optical sensors 122 may comprise any arbitrary number of sensor pixels 166 which may be suitable or required for the respective purposes. Within this regard, it may be mentioned that the pixelated optical sensor 162 comprises marginal sensor pixels 168 at the periphery 170 of the pixelated optical sensor 162 and, in a case where the pixel array 164 may comprise at least 3×3 sensor pixels 166, at least one non-marginal sensor pixel 172 which is located apart from the periphery 170 within the pixel array 164. In order to distinguish the at least one non-marginal sensor pixel 172 from the marginal sensor pixels 168, the non-marginal sensor pixel 172 is depicted in FIG. 2 in a hatched manner.
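The distinction between marginal and non-marginal sensor pixels drawn above can be illustrated for a generic pixel array; the helper functions below are a sketch for illustration and not part of the disclosed device:

```python
# Indices of marginal (periphery) vs. non-marginal (interior) pixels in
# a rows x cols pixel array, as described for the pixelated optical
# sensor 162. Row/column indexing is a convention chosen here.

def marginal_pixels(rows, cols):
    """(row, col) indices lying on the periphery of the array."""
    return [(r, c) for r in range(rows) for c in range(cols)
            if r in (0, rows - 1) or c in (0, cols - 1)]

def non_marginal_pixels(rows, cols):
    """Interior indices, whose contacts cannot reach the periphery."""
    return [(r, c) for r in range(1, rows - 1) for c in range(1, cols - 1)]

# For the 3x3 array of FIG. 2: eight marginal pixels, one interior pixel.
```

For the 3×3 array of FIG. 2 this yields exactly one interior pixel, matching the single hatched non-marginal sensor pixel 172.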

[0507] On the other hand, the image sensor 128 as further used within the hybrid sensor 156 may be an inorganic image sensor 128 and, thus, comprise at least one CCD device or at least one CMOS device. In particular, the image sensor 128 may also be employed as a transversal optical sensor, which may be adapted to determine one or more transversal components of the at least one object 118 within the scene 114 in the surroundings 116 of the optical detector 110. Herein, the image sensor 128 may, generally, be shaped in form of a pixel matrix 174 of separate image pixels 176. Similar to the optical sensor 122, the image sensor 128 may comprise an arbitrary number of image pixels 176, such as a number which may especially be suitable or required for the intended purposes. Further, the matrix 174 of image pixels 176 in the image sensor 128 may, generally, comprise the same number of pixels or, preferably as shown in FIG. 2, a higher number of pixels compared to the number of pixels within the pixel array 164 of sensor pixels 166 in the pixelated optical sensor 162. By way of example, for each sensor pixel 166 in the optical sensor 162, the pixel matrix 174 of the adjoining image sensor 128 exhibits a matrix 178 of 4×4 image pixels. However, other numbers are possible, such as 16×16 image pixels, 64×64 image pixels or more. This feature is further illustrated by a hatching of the matrix 178 in the image sensor 128, wherein the matrix 178 comprises those image pixels 176 which are located in the direct vicinity of the non-marginal sensor pixel 172 which is equally depicted in the same hatched manner in FIG. 2. For purposes of comparison, a first pixel resolution may, thus, be attributed to the image sensor 128, while a second pixel resolution may be attributed to the pixelated optical sensor 162. As can be derived from the exemplary setup in FIG. 2, the first pixel resolution, accordingly, exceeds the second pixel resolution.
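The resolution relationship described above, where each sensor pixel of the pixelated optical sensor corresponds to a k×k block of image pixels (k = 4 in FIG. 2), might be expressed as follows; the function name and indexing convention are assumptions for illustration:

```python
# Sketch of the resolution relationship described above: sensor pixel
# (sr, sc) of the pixelated optical sensor 162 is taken to cover a
# k x k block of image pixels in the adjoining image sensor 128.

def image_pixels_for_sensor_pixel(sr, sc, k=4):
    """Image-pixel (row, col) indices covered by sensor pixel (sr, sc)."""
    return [(sr * k + r, sc * k + c) for r in range(k) for c in range(k)]

# With k = 4, the 3x3 sensor array maps onto a 12x12 image-pixel matrix.
block = image_pixels_for_sensor_pixel(1, 1)   # the non-marginal pixel
```

Under this convention the hatched matrix 178 of FIG. 2 would correspond to the 16 image pixels returned for the central sensor pixel.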

[0508] As already mentioned above, the pixelated optical sensor 162 comprises the marginal sensor pixels 168 located at the periphery 170 of the pixelated optical sensor 162 and the non-marginal sensor pixels 172 located apart from the periphery 170 within the pixel array 164. However, since it may be preferable to directly place the pixelated optical sensor 162 on top of the image sensor 128, wherein the term “on top” may be interpreted with respect to the z-coordinate in the coordinate system 146, a problem may occur concerning the provision of electrical contacts to the non-marginal sensor pixels 172 within the pixel array 164. Whereas electrical contacts may directly be attached to each of the easily accessible marginal sensor pixels 168 of the pixelated optical sensor 162, the problem relating to the at least one non-marginal sensor pixel 172, i.e. the sensor pixel 172 which is not located at the readily accessible periphery 170 of the pixelated optical sensor 162, may be solved, according to the present invention, by using an image sensor 128 which may comprise one or more top contacts (not depicted here).

[0509] Accordingly, as shown in FIG. 2, the non-marginal sensor pixel 172 of the pixelated optical sensor 162 may be electrically connected to the top contact as provided by at least one of the image pixels 176 within the matrix 178 of the image sensor 128, which is located in the vicinity of the respective non-marginal sensor pixel 172. Herein, the electrical connection is, preferably, provided by using a well-known bonding technique, such as wire bonding, direct bonding, ball bonding, or adhesive bonding. However, other kinds of bonding techniques may be employed. Accordingly, the bonding technique here generates a bond contact 180 between the respective top contact as provided by one or more of the image pixels 176 as comprised within the image sensor 128 and the adjoining non-marginal sensor pixel 172 within the pixelated optical sensor 162.

[0510] The optical detector 110 as schematically depicted in FIG. 2 further comprises the at least one evaluation device 132 as already known from the embodiment as depicted in FIG. 1. Herein, the at least two constituents of the hybrid sensor 156, i.e. the pixelated optical sensor 162 and the image sensor 128, may be connected to the evaluation device 132 by the connector 134. In this particular example, again, the evaluation device 132 comprises the processing circuit 136 which is adapted to provide a difference between the sensor signal and the image signal as the purely non-linear part of the non-linear function at the output 138. Here, the processing circuit 136 might, preferably, be a part of the evaluation device 132 and exhibit the same setup as schematically illustrated in FIG. 1. However, other devices for providing the mentioned difference may be employed here as well, such as other electronic devices (not depicted here) or, alternatively or in addition, a piece of software adapted to perform the same task, wherein the software may be executable within or outside the evaluation device 132.
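The operation of the processing circuit 136 may be summarized numerically: since the image signal depends linearly on the total power of the illumination, a suitably scaled image signal estimates the linear part of the non-linear sensor signal, and subtracting it leaves the purely non-linear part as provided at the output 138. A minimal sketch, in which the calibration factor `responsivity` is a hypothetical quantity introduced here only for illustration:

```python
def nonlinear_part(sensor_signal, image_signal, responsivity=1.0):
    """Isolate the non-linear part of the FiP sensor signal.

    The image signal is linear in the total power of the illumination;
    scaled by `responsivity` (a hypothetical calibration factor matching
    the scales of the two signals) it serves as an estimate of the
    linear part of the sensor signal. The difference corresponds to the
    purely non-linear part of the non-linear function."""
    linear_part = responsivity * image_signal
    return sensor_signal - linear_part

residual = nonlinear_part(sensor_signal=5.0, image_signal=3.0)  # 2.0
```

In hardware, the same subtraction may be realized by the operational amplifier 144 of the processing circuit 136; the sketch above merely restates that difference in software form.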

[0511] Further, information as generated by the processing circuit 136 may be combined with other information as generated by the evaluation device 132, such as the depth information as derived from the sensor signal provided by the pixelated optical sensor 162 or image information as derived from the image signal by the image sensor 128 and, subsequently, evaluated in an image evaluation device 182, which may be part of the evaluation device 132 and/or of the image sensor 128. However, other arrangements are feasible.

[0512] The optical detector 110 may further comprise at least one focus-modulation device 184 which can be connected to the at least one focus-tunable lens 150. The at least one focus-modulation device 184 may, thus, be adapted to provide at least one focus-modulating signal to the at least one focus-tunable lens 150. Herein, the focus-modulation device 184 may be an individual unit being separated from the focus-tunable lens 150 and/or may be fully or partially integrated into the focus-tunable lens 150. As depicted in FIG. 2, the evaluation device 132 may, additionally, be connected to the at least one focus-modulation device 184, which may be fully or partially integrated into the evaluation device 132. As an example, the focus-modulating signal, which preferably may be an electric signal, may be a periodic signal, more preferably a sinusoidal, a square, or a triangular periodic signal. The signal transmission to the focus-tunable lens 150 may take place in a wire-bound or in a wireless fashion. As an example, the focus-modulation device 184 may be or may comprise a signal generator, such as an electronic oscillator generating an electronic signal, such as a periodic signal. In addition, one or more amplifiers may be present in order to amplify the focus-modulating signal.
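The preferred periodic waveforms named above may be sketched, purely illustratively, as a sampled signal generator (the function and parameter names are assumptions; an actual focus-modulation device 184 would typically be an analog or digital electronic oscillator):

```python
import math

def focus_modulation_sample(t, frequency, amplitude, shape="sine"):
    """One sample of a periodic focus-modulating signal at time t.

    The supported shapes follow the preferred waveforms named above:
    sinusoidal, square, or triangular, each with the given amplitude
    and frequency (in periods per unit of t)."""
    phase = (t * frequency) % 1.0  # position within the current period
    if shape == "sine":
        return amplitude * math.sin(2.0 * math.pi * phase)
    if shape == "square":
        return amplitude if phase < 0.5 else -amplitude
    if shape == "triangle":
        # ramps linearly from +amplitude down to -amplitude and back
        return amplitude * (4.0 * abs(phase - 0.5) - 1.0)
    raise ValueError("unknown waveform shape: " + shape)

sample = focus_modulation_sample(t=0.25, frequency=1.0, amplitude=2.0)
```

Sampling such a waveform at a fixed rate and feeding it, after optional amplification, to the focus-tunable lens 150 would periodically sweep the focal position, as exploited elsewhere in the disclosure.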

[0513] FIG. 3 shows a particular embodiment, wherein the sensor pixels 166 of the pixelated optical sensor 162 may be electrically connected to a top contact 185 as provided by one of the image pixels 176 of the image sensor 128, wherein the pixelated optical sensor 162 and the image sensor 128 are comprised within the hybrid sensor 156. In this regard, it may be preferred that the top contact 185 may provide an electrical connection between one of the non-marginal sensor pixels 172 and one of the image pixels 176 as comprised within the matrix 178. However, it may, equally, be feasible to provide the electrical connection to the marginal sensor pixels 168 of the pixelated optical sensor 162 in the same manner.

[0514] As schematically depicted in FIG. 3, the exemplarily illustrated image pixel 176 of the image sensor 128 may, in this particular embodiment, comprise two individual top contacts 185, 185′ which might each be located at a side of the image pixel 176, respectively. Directly on top of the image pixel 176, with respect to a direction of the incident light beam 120, a transparent contact 186 might be placed. In this preferred example, the transparent contact 186 may constitute one of the connecting means of the exemplarily illustrated sensor pixel 166 of the pixelated optical sensor 162, while another transparent contact 186′ may be placed on top of the sensor pixel 166. By way of example, the two transparent contacts 186, 186′ as displayed here may each be connected to one of the transparent electrodes of the sensor pixel 166 which may, preferably, be located on the top and the bottom of the respective sensor pixel 166. However, other embodiments within this respect may be feasible. As shown here, each of the transparent contacts 186, 186′ may be electrically connected to one of the individual top contacts 185, 185′, wherein the contacts 185, 185′ may be arranged to provide further leads to other connectors, such as to the connectors 134 between the hybrid sensor 156 and the evaluation device 132.

[0515] FIG. 4 schematically shows three different embodiments of the optical sensor 122 which exhibits the FiP-effect and which may, according to the present invention, thus be employed in the optical detector 110 as presented in FIGS. 1, 2, 3 and 5.

[0516] In a first embodiment, the at least one optical sensor 122 may, as schematically depicted in FIG. 4A, be a large-area optical sensor 188. Herein, the large-area optical sensor 188 exhibits a uniform sensor surface which may, thus, constitute the sensor region 124 of the corresponding optical sensor 122.

[0517] As a further embodiment, FIG. 4B, again, illustrates the pixelated optical sensor 162, wherein the pixelated optical sensor 162 may be established at least partially by the pixel array 164 which comprises the separate sensor pixels 166 that, thus, constitute the sensor region 124. As already described above, the pixelated optical sensor 162 may comprise any arbitrary number of sensor pixels 166 which may be suitable or required for the respective purposes.

[0518] In this regard, it may be mentioned that each of the sensor pixels 166 within the pixelated optical sensor 162 may be one of the marginal sensor pixels 168 at the periphery 170 of the pixelated optical sensor 162 or, in the case where the pixel array 164 comprises at least 3×3 sensor pixels 166, one of the non-marginal sensor pixels 172 which are located apart from the periphery 170 within the pixel array 164.

[0519] As a further embodiment, FIG. 4C schematically shows two individual pixelated optical sensors 162, 162′, wherein each of the pixelated optical sensors 162, 162′ may, as depicted in FIG. 4B, be established at least partially by the pixel array 164 comprising a number of individual sensor pixels 166. In the particular embodiment as depicted in FIG. 4C, each of the two individual pixelated optical sensors 162, 162′ comprises the same kind of pixel array 164 which exhibits the same number of sensor pixels 166. However, other embodiments may be feasible, such as an arrangement in which one of the two individual pixelated optical sensors 162 comprises a number of sensor pixels 166 which may be a multiple of the number of sensor pixels 166 comprised by the other of the two separate pixelated optical sensors 162′.

[0520] However, in a specific embodiment, at least one electronic element (not depicted here) may be placed in a vicinity of, in particular each of, the sensor pixels 166 on the same surface as the sensor pixels 166. Herein, the electronic elements may be adapted to contribute to an evaluation of the signal as provided by the corresponding sensor pixel 166 and might, thus, comprise one or more of: a connector, a capacitor, a diode, a transistor. However, since the electronic elements are not sensitive to the illumination by the incident light beam, in the sense described above that they do not contribute to the sensor signal of the pixelated sensor 162, 162′, only a part of the surface area of the respective pixelated sensor 162, 162′ may be able to contribute to the sensor signal as the sensor region 124. In addition, two adjoining sensor pixels 166 may be separated from each other by a separating strip, wherein the strip may comprise an electrically non-conducting material, such as a photoresist, which may, particularly, be adapted to avoid cross-talk between the two adjacent sensor pixels 166, so that the strip may also not be able to contribute to the sensor signal.

[0521] However, the embodiment as presented in FIG. 4C may provide a solution to this particular problem. Accordingly, the at least two individual pixelated optical sensors 162, 162′ are arranged in the xy-plane according to the coordinate system 146 in a manner that the two pixelated optical sensors 162, 162′ are, in particular directly, placed on top of each other. Further, the respective locations of the two pixelated optical sensors 162, 162′ may be shifted by an extent 190 with respect to each other, preferably in both the x- and the y-direction. Herein, the extent 190 by which the two pixelated optical sensors 162, 162′ are shifted with respect to each other may, preferably, exhibit a smaller value than the respective length 192 of a side edge of the corresponding pixelated optical sensor 162, 162′. Thus, the two pixelated optical sensors 162, 162′ may be shifted with respect to each other in a manner that one of the two pixelated optical sensors 162, which might, preferably, be transparent and which might first be impinged on by the incident light beam 120, may cover the area on the other of the two pixelated optical sensors 162′ which comprises the electronic elements as described above. As a result, as seen from the direction of the impinging light beam 120, the sensor region 124 in the optical sensor 122 according to FIG. 4C may, thus, be increased in comparison to the sensor region 124 in the single pixelated optical sensor 162 as shown in FIG. 4B.
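The gain in sensitive area achieved by the shifted stacking may be estimated with a simple one-dimensional model (purely illustrative; the pixel pitch, the sensitive width, and the shift used below are hypothetical parameters, not values taken from the disclosure): within each pixel period, only a fraction of the width is light-sensitive, and the second, shifted layer covers part of the insensitive remainder.

```python
def combined_fill_factor(pitch, sensitive, shift, samples=100_000):
    """Fraction of one pixel period covered by at least one of two
    stacked pixel layers, the second layer shifted by `shift`.

    1-D model: in each layer, a period of length `pitch` is sensitive
    over a stretch of length `sensitive`; the remainder carries the
    electronic elements and separating strips and does not contribute
    to the sensor signal. Coverage is estimated by midpoint sampling."""
    covered = 0
    for i in range(samples):
        x = (i + 0.5) * pitch / samples
        in_layer_1 = x < sensitive
        in_layer_2 = (x - shift) % pitch < sensitive
        if in_layer_1 or in_layer_2:
            covered += 1
    return covered / samples

single = combined_fill_factor(1.0, 0.7, 0.0)   # unshifted: 70 % sensitive
stacked = combined_fill_factor(1.0, 0.7, 0.3)  # shifted stack: fully covered
```

With a shift matching the insensitive stretch, the union of the two layers covers the whole period, which is the effect described above for the stacked sensors 162, 162′ as seen from the direction of the impinging light beam 120.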

[0522] As outlined above, the optical detector 110 and the camera 154 may be used in various devices or systems. FIG. 5, as a further example, shows a detector system 194, comprising at least one optical detector 110, such as the optical detector 110 as disclosed in one or more of the embodiments shown in FIG. 1 or 2. In this regard, specifically with regard to potential embodiments, reference may be made to the disclosure given above in further detail. As an exemplary embodiment, a detector setup similar to the setup shown in FIG. 1 is depicted in FIG. 5. FIG. 5 further shows an exemplary embodiment of a human-machine interface 196, which comprises the at least one detector 110 and/or the at least one detector system 194, and, further, an exemplary embodiment of an entertainment device 198 comprising the human-machine interface 196. FIG. 5 further shows an embodiment of a tracking system 200 adapted for tracking a position of at least one object 118 within the scene 114 in the surroundings 116 of the optical detector 110 and/or the detector system 194.

[0523] With regard to the optical detector 110, reference may be made to the disclosure given above or given in further detail below. Basically, all potential embodiments of the detector 110 may also be embodied in the embodiment shown in FIG. 1 or 2. The evaluation device 132 may be connected to the at least one hybrid sensor 156, which may comprise the at least one optical sensor 122, specifically the at least one pixelated sensor 162, which is located such that the focal position of the incident light beam 120 may be modified by the focus-tunable lens 150 in a manner that the position of the optical sensor 122 may coincide with the focal position, and the at least one image sensor 128 which may be employed as the at least one imaging device 152. Further, at least one focus-modulation device 184 may be provided, wherein, optionally, the at least one focus-modulation device 184 may be adapted for modulating the at least one focus-tunable lens 150 and may, thus, fully or partially be integrated into the evaluation device 132, as shown in FIG. 5. For connecting the above-mentioned devices, i.e. the at least one pixelated sensor 162, the at least one image sensor 128, and, optionally, the at least one focus-tunable lens 150, to the at least one evaluation device 132, as an example, the at least one connector 134 and/or one or more interfaces may be provided, which may be wireless interfaces and/or wire-bound interfaces. Further, the connector 134 may comprise one or more drivers and/or one or more measurement devices for generating sensor signals and/or for modifying sensor signals. Further, the evaluation device 132 may fully or partially be integrated into the hybrid sensor 156 and/or into other components of the optical detector 110. The optical detector 110 may further comprise at least one housing 202 which, as an example, may encase one or more of the components 122 or 128. The evaluation device 132 may also be enclosed within the housing 202 and/or within a separate housing.

[0524] In the exemplary embodiment shown in FIG. 5, the object 118 to be detected, as an example, may be designed as an article of sports equipment and/or may form a control element 204, the position and/or orientation of which may be manipulated by a user 206. Thus, generally, in the embodiment shown in FIG. 5 or in any other embodiment of the detector system 194, the human-machine interface 196, the entertainment device 198 or the tracking system 200, the object 118 itself may be part of the named devices and, specifically, may comprise at least one control element 204, specifically at least one control element 204 having one or more beacon devices 208, wherein a position and/or orientation of the control element 204 preferably may be manipulated by the user 206. As an example, the object 118 may be or may comprise one or more of a bat, a racket, a club or any other article of sports equipment and/or fake sports equipment. Other types of objects 118 are possible. Further, the user 206 may be considered as the object 118, the position of which shall be detected. As an example, the user 206 may carry one or more of the beacon devices 208 attached directly or indirectly to his or her body.

[0525] The optical detector 110 may be adapted to determine at least one item of information on a longitudinal position of one or more of the beacon devices 208 and, optionally, at least one item of information regarding a transversal position thereof, and/or at least one other item of information regarding the longitudinal position of the object 118 and, optionally, at least one item of information regarding a transversal position of the object 118. Additionally, the optical detector 110 may be adapted for identifying colors and/or for imaging the object 118. An opening 210 in the housing 202, which, preferably, may be located concentrically with regard to the optical axis 112 of the detector 110, preferably defines a direction of view 212 of the optical detector 110.

[0526] The optical detector 110 may be adapted for determining a position of the at least one object 118. Additionally, the optical detector 110, specifically in an embodiment including the camera 154, may be adapted for acquiring at least one image of the object 118, preferably a 3D-image. As outlined above, the determination of a position of the object 118 and/or a part thereof within the scene 114 using the optical detector 110 and/or the detector system 194 may be used for providing a human-machine interface 196, in order to provide at least one item of information to a machine 214. In the embodiment schematically depicted in FIG. 5, the machine 214 may be or may comprise at least one computer and/or a computer system. Other embodiments are feasible. The evaluation device 132 may be a computer and/or may comprise a computer and/or may fully or partially be embodied as a separate device and/or may fully or partially be integrated into the machine 214, particularly the computer. The same holds true for a track controller 216 of the tracking system 200, which may fully or partially form a part of the evaluation device 132 and/or the machine 214.

[0527] Similarly, as outlined above, the human-machine interface 196 may form part of the entertainment device 198. Thus, by means of the user 206 functioning as the object 118 and/or by means of the user 206 handling the object 118 and/or the control element 204 functioning as the object 118, the user 206 may input at least one item of information, such as at least one control command, into the machine 214, particularly the computer, thereby varying the entertainment function, such as controlling the course of a computer game.

[0528] As outlined above, the optical detector 110 may have a beam path 130, wherein the beam path 130 may be a straight beam path or a tilted beam path, an angulated beam path, a branched beam path, a deflected or split beam path or other types of beam paths. Further, the light beam 120 may propagate along each beam path 130 or partial beam path once or repeatedly, unidirectionally or bidirectionally. Thereby, the components listed above or the optional further components listed in further detail below may fully or partially be located in front of the at least one hybrid sensor 156 and/or behind the at least one hybrid sensor 156 as depicted in FIG. 2.

LIST OF REFERENCE NUMBERS

[0529] 110 Optical detector
[0530] 112 Optical axis
[0531] 114 Scene
[0532] 116 Surroundings
[0533] 118 Object
[0534] 120 Light beam
[0535] 122 Optical sensor, FiP sensor
[0536] 124 Sensor region
[0537] 126 Light spot
[0538] 128 Image sensor
[0539] 130 Beam path
[0540] 132 Evaluation device
[0541] 134 Connector
[0542] 136 Processing circuit
[0543] 138 Output of processing circuit
[0544] 140 First input of processing circuit
[0545] 142 Second input of processing circuit
[0546] 144 Operational amplifier
[0547] 146 Coordinate system
[0548] 148 Lens
[0549] 150 Focus-tunable lens
[0550] 152 Imaging device
[0551] 154 Camera
[0552] 156 Hybrid sensor
[0553] 158 Volume
[0554] 160 Distance
[0555] 162, 162′ Pixelated optical sensor
[0556] 164 Pixel array
[0557] 166 Sensor pixel
[0558] 168 Marginal sensor pixel
[0559] 170 Periphery
[0560] 172 Non-marginal sensor pixel
[0561] 174 Pixel matrix
[0562] 176 Image pixel
[0563] 178 Matrix
[0564] 180 Bond contact
[0565] 182 Image evaluation device
[0566] 184 Focus-modulation device
[0567] 185, 185′ Top contact
[0568] 186, 186′ Transparent contact
[0569] 188 Large-area optical sensor
[0570] 190 Extent of shift
[0571] 192 Length of side edge
[0572] 194 Detector system
[0573] 196 Human-machine interface
[0574] 198 Entertainment device
[0575] 200 Tracking system
[0576] 202 Housing
[0577] 204 Control element
[0578] 206 User
[0579] 208 Beacon device
[0580] 210 Opening
[0581] 212 Direction of view
[0582] 214 Machine
[0583] 216 Track controller