OPTICAL DETECTOR

20180007343 · 2018-01-04

Abstract

An optical detector (110) is disclosed, comprising: at least one optical sensor (122) adapted to detect a light beam (116) and to generate at least one sensor signal, wherein the optical sensor (122) has at least one sensor region (126), wherein the sensor signal of the optical sensor (122) is dependent on an illumination of the sensor region (126) by the light beam (116), wherein the sensor signal, given the same total power of the illumination, is dependent on a width of the light beam (116) in the sensor region (126); at least one focus-tunable lens (130) located in at least one beam path (132) of the light beam (116), the focus-tunable lens (130) being adapted to modify a focal position of the light beam (116) in a controlled fashion; at least one focus-modulation device (136) adapted to provide at least one focus-modulating signal (138) to the focus-tunable lens (130), thereby modulating the focal position; and at least one evaluation device (140), the evaluation device (140) being adapted to evaluate the sensor signal.

Claims

1. An optical detector, comprising: at least one optical sensor adapted to detect a light beam and to generate at least one sensor signal, wherein the optical sensor has at least one sensor region, wherein the sensor signal of the optical sensor is dependent on an illumination of the sensor region by the light beam, wherein the sensor signal, given the same total power of the illumination, is dependent on a width of the light beam in the sensor region; at least one focus-tunable lens located in at least one beam path of the light beam, the focus-tunable lens being adapted to modify a focal position of the light beam in a controlled fashion; at least one focus-modulation device adapted to provide at least one focus-modulating signal to the focus-tunable lens, thereby modulating the focal position; and at least one evaluation device, the evaluation device being adapted to evaluate the sensor signal.

2. The optical detector according to claim 1, wherein the sensor signal of the optical sensor is further dependent on a modulation frequency of the light beam.

3. The optical detector according to claim 1, wherein the evaluation device is adapted to detect one or both of local maxima or local minima in the sensor signal, wherein the evaluation device is adapted to derive at least one item of information on a longitudinal position of at least one object from which the light beam propagates towards the optical detector by evaluating one or both of the local maxima or local minima.

4. The optical detector according to claim 1, wherein the evaluation device is adapted to perform a phase-sensitive evaluation of the sensor signal.

5. The optical detector according to claim 1, wherein the optical detector further comprises at least one transversal optical sensor, the transversal optical sensor being adapted to determine one or more of a transversal position of the light beam, a transversal position of an object from which the light beam propagates towards the optical detector or a transversal position of a light spot generated by the light beam, the transversal position being a position in at least one dimension perpendicular to an optical axis of the optical detector, the transversal optical sensor being adapted to generate at least one transversal sensor signal.

6. The optical detector according to claim 1, wherein the optical detector further comprises at least one imaging device.

7. The optical detector according to claim 1, wherein the optical detector further comprises: at least one spatial light modulator being adapted to modify at least one property of the light beam in a spatially resolved fashion, having a matrix of pixels, each pixel being controllable to individually modify the at least one optical property of a portion of the light beam passing the pixel before the light beam reaches the at least one optical sensor; and at least one modulator device adapted for periodically controlling at least two of the pixels with different modulation frequencies; wherein the evaluation device is adapted for performing a frequency analysis in order to determine signal components of the sensor signal for the modulation frequencies.

8. The optical detector according to claim 7, wherein the modulator device is adapted such that each of the pixels is individually controllable.

9. The optical detector according to claim 7, wherein the evaluation device is adapted for performing the frequency analysis by demodulating the sensor signal with the different modulation frequencies.

10. The optical detector according to claim 7, wherein the evaluation device is adapted to assign each of the signal components to one or more pixels of the matrix.

11. The optical detector according to claim 7, wherein the evaluation device is adapted to determine which pixels of the matrix are illuminated by the light beam by evaluating the signal components.

12. The optical detector according to claim 7, wherein the focus-tunable lens is fully or partially part of the spatial light modulator, wherein the pixels of the spatial light modulator have micro-lenses, wherein the micro-lenses are focus-tunable lenses.

13. The optical detector according to claim 12, wherein each pixel has an individual micro-lens.

14. The optical detector according to claim 7, the optical detector further having at least one imaging device, the imaging device being capable of acquiring at least one image of a scene captured by the optical detector, wherein the evaluation device is adapted to assign the pixels of the spatial light modulator to image pixels of the image, wherein the evaluation device is further adapted to determine a depth information for the image pixels by evaluating the signal components, wherein the evaluation device is adapted to combine a depth information of the image pixels with the image in order to generate at least one three-dimensional image.

15. A detector system for determining a position of at least one object, the detector system comprising at least one optical detector according to claim 1, the detector system further comprising at least one beacon device adapted to direct at least one light beam towards the optical detector, wherein the beacon device is at least one of attachable to the object, holdable by the object and integratable into the object.

16. A human-machine interface for exchanging at least one item of information between a user and a machine, the human-machine interface comprising at least one optical detector according to claim 1.

17. An entertainment device for carrying out at least one entertainment function, wherein the entertainment device comprises at least one human-machine interface according to claim 16, wherein the entertainment device is designed to enable at least one item of information to be input by a player by means of the human-machine interface, wherein the entertainment device is designed to vary the entertainment function in accordance with the information.

18. A tracking system for tracking a position of at least one movable object, the tracking system comprising at least one optical detector according to claim 1 and/or at least one detector system according to claim 15, the tracking system further comprising at least one track controller, wherein the track controller is adapted to track a series of positions of the object at specific points in time.

19. A camera for imaging at least one object, the camera comprising at least one optical detector according to claim 1.

20. A method of optical detection, the method comprising: detecting at least one light beam by using at least one optical sensor and generating at least one sensor signal, wherein the optical sensor has at least one sensor region, wherein the sensor signal of the optical sensor is dependent on an illumination of the sensor region by the light beam, wherein the sensor signal, given the same total power of the illumination, is dependent on a width of the light beam in the sensor region; modifying a focal position of the light beam in a controlled fashion by using at least one focus-tunable lens located in at least one beam path of the light beam; providing at least one focus-modulating signal to the focus-tunable lens by using at least one focus-modulation device, thereby modulating the focal position; and evaluating the sensor signal by using at least one evaluation device.

21. The method according to claim 20, further comprising: modifying at least one property of the light beam in a spatially resolved fashion by using at least one spatial light modulator, the spatial light modulator having a matrix of pixels, each pixel being controllable to individually modify the at least one optical property of a portion of the light beam passing the pixel before the light beam reaches the at least one optical sensor; and periodically controlling at least two of the pixels with different modulation frequencies by using at least one modulator device; wherein evaluating the sensor signal comprises performing a frequency analysis in order to determine signal components of the sensor signal for the modulation frequencies.

22. An article, comprising the optical detector of claim 1, wherein the article is adapted to function as an article for performing at least one application, selected from the group consisting of: a position measurement in traffic technology; an entertainment application; a security application; a human-machine interface application; a tracking application; a photography application; an imaging application or camera application; a mapping application for generating maps of at least one space; a mobile application; a webcam; a computer peripheral device; a gaming application; a camera or video application; a surveillance application; an automotive application; a transport application; a medical application; a sports application; a machine vision application; a vehicle application; an airplane application; a ship application; a spacecraft application; a building application; a construction application; a cartography application; a manufacturing application; a use in combination with at least one time-of-flight detector; an application in a local positioning system; an application in a global positioning system; an application in a landmark-based positioning system; an application in an indoor navigation system; an application in an outdoor navigation system; an application in a household application; a robot application; an application in an automatic door opener; and an application in a light communication system.

Description

BRIEF DESCRIPTION OF THE FIGURES

[0623] Further optional details and features of the invention are evident from the description of preferred exemplary embodiments which follows in conjunction with the dependent claims. In this context, the particular features may be implemented alone or in any reasonable combination. The invention is not restricted to the exemplary embodiments. The exemplary embodiments are shown schematically in the figures. Identical reference numerals in the individual figures refer to identical elements or elements with identical function, or elements which correspond to one another with regard to their functions.

[0624] In the figures:

[0625] FIG. 1 shows a first embodiment of an optical detector according to the present invention, comprising a focus-tunable lens and one or more optical sensors;

[0626] FIG. 2 shows an exemplary embodiment of a modulation of a focal length of the focus-tunable lens and a corresponding sensor signal of one of the optical sensors in the embodiment shown in FIG. 1;

[0627] FIG. 3 shows a further embodiment of an optical detector and a camera according to the present invention;

[0628] FIG. 4 shows an exemplary embodiment of an optical detector, a detector system, a human-machine interface, an entertainment device, a tracking system and a camera according to the present invention;

[0629] FIG. 5 shows a further embodiment of an optical detector according to the present invention, further having at least one spatial light modulator;

[0630] FIGS. 6 and 7 show schematic explanations of a measurement using the setup of FIG. 5 using the spatial light modulator;

[0631] FIG. 8 shows an alternative embodiment of an optical detector having at least one spatial light modulator and a branched beam path;

[0632] FIG. 9 shows an embodiment of an optical detector having a spatial light modulator with a micro-lens array having focus-tunable lenses; and

[0633] FIG. 10 shows an embodiment of controlling micro-lenses of the micro-lens array in the embodiment shown in FIG. 9.

EXEMPLARY EMBODIMENTS

[0634] In FIG. 1, a first exemplary embodiment of an optical detector 110 according to the present invention is shown in a highly schematic cross sectional view, in a plane parallel to an optical axis 112 of the optical detector 110. The optical detector 110 may be used for detecting an object 114 or a part thereof. The object 114 may be adapted for emitting and/or reflecting one or more light beams 116 towards the optical detector 110. For this purpose, the object 114, as an example, may be embodied as a light source and/or one or more beacon devices 118 may be one or more of integrated into the object 114, held by the object 114 or attached to the object 114. The beacon devices 118 may comprise one or more illumination sources and/or reflective elements. In case one or more reflective elements are used, the setup of the optical detector 110 may further comprise one or more illumination sources for illuminating the beacon devices 118, which are not depicted in the exemplary embodiment of FIG. 1. For potential embodiments of the beacon devices 118, reference may be made e.g. to the disclosure of the beacon devices in WO 2014/097181 A1 and/or in US 2014/0291480 A1. Other embodiments, however, are feasible. It shall be noted that the combination of the optical detector 110 and the at least one beacon device 118 may be referred to as a detector system 120. Consequently, the exemplary embodiment shown in FIG. 1 also shows an exemplary embodiment of a detector system 120.

[0635] The optical detector 110 comprises at least one optical sensor 122. In the exemplary embodiment shown in FIG. 1, a stack 124 of optical sensors 122 is shown, having, as an example, four optical sensors 122, wherein at least some of the optical sensors 122 are fully or partially transparent. The last optical sensor 122, i.e. the optical sensor 122 on a side of the stack 124 facing away from the object 114, may be an opaque optical sensor 122, without transmissive properties.

[0636] The optical sensors 122 each are embodied as FiP sensors, i.e. as optical sensors 122 each having a sensor region 126 which may be illuminated by the light beam 116, thereby creating a light spot 128 in the sensor region 126. The FiP sensors 122 are further adapted to generate at least one sensor signal, wherein the sensor signal, given the same total power of illumination, is dependent on the width of the light beam 116, such as on the diameter or the equivalent diameter of the light spot 128, in the sensor region 126.

[0637] For further details regarding potential setups of the FiP sensors 122, reference may be made to e.g. WO 2012/110924 A1 or US 2012/0206336 A1, e.g. to the embodiment shown in FIG. 2 and the corresponding description, and/or to WO 2014/097181 A1 or US 2014/0291480 A1, e.g. the longitudinal optical sensor shown in FIGS. 4A to 4C and the corresponding description. It shall be noted, however, that other embodiments of the optical sensor 122, specifically the FiP sensor, are feasible, such as by using one or more of the embodiments described in detail above.

[0638] The optical detector 110 further comprises at least one focus-tunable lens 130, also referred to as an FTL, located in a beam path 132 of the light beam 116, such that, preferably, the light beam 116 passes the focus-tunable lens 130 before reaching the at least one optical sensor 122. The focus-tunable lens 130 is adapted to modify a focal position of the light beam 116, i.e. is adapted to change its own focal length, in a controlled fashion. The focal length modulation, in the exemplary embodiment shown in FIG. 1, is symbolically depicted by reference number 134. As an example, at least one commercially available focus-tunable lens 130 may be used, such as at least one electrically tunable lens. As an example, focus-tunable lenses of the series IL-6-18, IL-10-30, IL-10-30-C or IL-10-42-LP, commercially available from Optotune AG, 8953 Dietikon, Switzerland, may be used. Additionally or alternatively, one or more variable focus liquid lenses may be used, such as models Arctic 316 or Arctic 39N0, available from Varioptic, 69007 Lyon, France. It shall be noted, however, that other types of focus-tunable lenses 130 may be used in addition or alternatively.

[0639] The optical detector 110 further comprises at least one focus-modulation device 136 connected to the at least one focus-tunable lens 130. The at least one focus-modulation device 136 is adapted to provide at least one focus-modulating signal, in FIG. 1 symbolically depicted by reference number 138, to the at least one focus-tunable lens 130. The focus-modulation device 136 may be separate from the focus-tunable lens 130 and/or may fully or partially be integrated into the focus-tunable lens 130. As an example, the focus-modulating signal 138, which preferably may be an electric signal, may be a periodic signal, more preferably a sinusoidal or rectangular periodic signal. The signal transmission to the focus-tunable lens 130 may take place in a wire-bound or even in a wireless fashion. As an example, the focus-modulation device 136 may be or may comprise a signal generator, such as an electronic oscillator generating an electronic signal, such as a periodic signal. In addition, one or more amplifiers may be present in order to amplify the focus-modulating signal 138.
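The periodic focus-modulating signal 138 described above may be illustrated by the following Python sketch, which generates a sinusoidal target focal length. This is an illustrative assumption, not part of the disclosure: the function name, the modulation frequency and the focal-length limits (borrowed from the 3.50 mm to 5.50 mm sweep shown in FIG. 2) are invented for the example.

```python
import math

def focus_modulating_signal(t, f_min=3.50, f_max=5.50, freq_hz=10.0):
    """Illustrative sinusoidal focus-modulating signal: returns the
    target focal length (in mm) at time t (in s), sweeping between
    f_min and f_max. All parameter values are assumptions, not taken
    from the disclosure."""
    offset = 0.5 * (f_min + f_max)      # mean focal length (mm)
    amplitude = 0.5 * (f_max - f_min)   # sweep amplitude (mm)
    return offset + amplitude * math.sin(2.0 * math.pi * freq_hz * t)

# At t = 0 the signal starts at the mean focal length:
print(focus_modulating_signal(0.0))  # 4.5
```

A rectangular periodic signal, as also mentioned above, could be substituted by replacing the sine term with its sign.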

[0640] The optical detector 110 further comprises at least one evaluation device 140. The evaluation device 140, as an example, may be connected to the at least one optical sensor 122, in order to receive sensor signals from the at least one optical sensor 122. Further, as depicted in FIG. 1, the evaluation device 140 may be connected to the at least one focus-modulation device 136 and/or the focus-modulation device 136 may even fully or partially be integrated into the evaluation device 140. As an example, the evaluation device 140 may comprise one or more computers, such as one or more processors, and/or one or more application-specific integrated circuits (ASICs).

[0641] In general, as disclosed e.g. in one or more of WO 2012/110924 A1, US 2012/0206336 A1, WO 2014/097181 A1 or US 2014/0291480 A1, with the setup shown in FIG. 1, at least one item of information on a longitudinal position of the object 114 or a part thereof may be determined. Thus, for example, a coordinate system 142 may be used, as symbolically depicted in FIG. 1, with a z-axis parallel to the optical axis 112 of the optical detector 110. By evaluating the sensor signals of the at least one optical sensor 122, a longitudinal coordinate of the object 114, such as a z-coordinate, may be determined. For this purpose, a known or determinable relationship between the at least one sensor signal and the z-coordinate may be used. For exemplary embodiments, reference may be made to the above-mentioned prior art documents. By using the stack 124 of optical sensors 122, ambiguities in the evaluation of the sensor signals may be resolved.

[0642] Still, this setup known from the above-mentioned prior art documents imposes some technical challenges, specifically with regard to the setup of the optical design and with regard to the evaluation of the sensor signals. Specifically, the precision of the evaluation of the z-coordinate of the object 114 and/or a part thereof, such as of the beacon devices 118, may be improved.

[0643] By modulating the focal length of the at least one focus-tunable lens 130, a significant improvement in the precision of the measurement and a significant reduction of the complexity of the optical setup of the optical detector 110 may be achieved. Thus, as outlined e.g. in one or more of the above-mentioned prior art documents WO 2012/110924 A1, US 2012/0206336 A1, WO 2014/097181 A1 or US 2014/0291480 A1, a FiP sensor can inherently determine whether an object is in focus or not. When changing the focal length of the FTL 130, a FiP sensor shows a local maximum and/or a local minimum in the FiP current whenever an object is in focus. This effect is shown in FIG. 2. Therein, on the horizontal axis, the time is given in seconds. On the left vertical axis, the focal length f of the at least one focus-tunable lens 130 is given in millimeters, wherein the graph of the focal length is denoted by reference number 144. On the right vertical axis, an exemplary sensor signal of one of the optical sensors 122 in the setup of FIG. 1 is shown, denoted by I, given in arbitrary units (a.u.). The corresponding curve is denoted by reference number 146. The focal length 144 oscillates periodically, so that the focus is changed from a minimum focal length (in this exemplary embodiment 3.50 mm; other minimum focal lengths may be used) to a maximum focal length (in this exemplary embodiment 5.50 mm; other maximum focal lengths may be used) and back. As an example, a sinusoidal change of the focal length may be used, which has turned out to be an efficient type of signal for modulating the focal length. It shall be noted, however, that other types of signals, preferably periodic signals, may be used for modulating the focal length. By changing the amplitude and the offset of the focus, different focus levels can be analyzed. For example, an object in the front of a scene captured by the optical detector 110 can be analyzed in detail using short focal lengths, while an object in the back of the scene may be analyzed using longer focal lengths, even simultaneously.
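The detection of sharp signal maxima during such a focal sweep can be sketched with a short, self-contained Python simulation. Everything in it is an illustrative assumption: the FiP-like signal shape, the sweep parameters and the in-focus focal length f′ = 4.2 mm are invented for the example and are not taken from the disclosure.

```python
import math

def detect_local_maxima(samples, threshold=0.5):
    """Return the indices at which the sampled sensor signal has a
    local maximum above `threshold` (simple three-point comparison)."""
    return [i for i in range(1, len(samples) - 1)
            if samples[i] > threshold
            and samples[i] > samples[i - 1]
            and samples[i] >= samples[i + 1]]

# Synthetic example: focal length swept sinusoidally between 3.5 mm
# and 5.5 mm; the simulated FiP-like signal peaks sharply whenever
# the swept focal length crosses the assumed in-focus value f_prime.
f_prime = 4.2  # mm, illustrative
times = [i / 1000.0 for i in range(1000)]                       # 1 s at 1 kHz
focal = [4.5 + 1.0 * math.sin(2 * math.pi * 2.0 * t) for t in times]
signal = [math.exp(-((f - f_prime) / 0.02) ** 2) for f in focal]

peaks = detect_local_maxima(signal)
# Each detected peak occurs where the swept focal length is close to f_prime:
print([round(focal[i], 2) for i in peaks])
```

In a real evaluation device 140, the peak positions in time would be mapped back to the focal length via the known focus-modulating signal 138.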

[0644] As can be seen in the curves in FIG. 2, sensor signal 146 may exhibit a sharp maximum 148 whenever the object 114, a part thereof or a beacon device 118 from which the light beam 116 emerges is in focus with the FiP sensor 122 generating the sensor signal 146. These sharp maxima 148 always occur at a specific focal length which, in FIG. 2, is denoted by reference number 150, indicating an object-in-focus-line.

[0645] Consequently, the modulation shown in FIG. 2 provides a fast and efficient way of determining the maxima 148 in the sensor signal 146. By analyzing the sensor signal 146, the position of the maxima 148 (or, in a similar setup, of corresponding minima) may be determined. Thus, by determining the object-in-focus-line 150 and/or by determining the focal length f′ at which the object 114 (or, correspondingly, the beacon device 118) is in focus, in FIG. 2 denoted by f′, all parameters for determining the longitudinal position z of the object 114 are known. Thus, as an example, the simple lens equation may be used:


1/z = 1/f′ − 1/d,

[0646] wherein z may be the longitudinal coordinate of the object 114, f′ may be the focal length at which the maxima 148 occur, and wherein d may be the distance between the focus-tunable lens 130 and the optical sensor 122 generating the sensor signal 146. Consequently, the evaluation device 140 may be adapted to determine at least one longitudinal coordinate of the object 114 or at least one part thereof. It shall be noted, however, that other correlations between the sensor signal 146 and the at least one item of information regarding the longitudinal coordinate of the object 114 may be used. Summarizing, however, the at least one optical sensor 122 may function as a longitudinal optical sensor, and may be used for determining at least one item of information on a longitudinal position of the object 114.
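Using the simple lens equation 1/z = 1/f′ − 1/d, the longitudinal coordinate z follows directly from the measured in-focus focal length. The helper below is a hypothetical sketch; the function name and the numeric values are illustrative assumptions, not from the disclosure.

```python
def longitudinal_coordinate(f_prime_mm, d_mm):
    """Illustrative thin-lens estimate of the object's longitudinal
    coordinate z from the in-focus focal length f' and the distance d
    between the focus-tunable lens and the optical sensor:
        1/z = 1/f' - 1/d
    Units: millimetres in, millimetres out."""
    if f_prime_mm >= d_mm:
        raise ValueError("f' must be smaller than d for a real image")
    return 1.0 / (1.0 / f_prime_mm - 1.0 / d_mm)

# Example (assumed values): maxima observed at f' = 4.2 mm, with the
# sensor located d = 50 mm behind the focus-tunable lens:
z = longitudinal_coordinate(4.2, 50.0)
print(round(z, 2))  # 4.59
```

In practice the evaluation device 140 could apply this, or any other known or determinable relationship, to each detected maximum.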

[0647] The advantages of the setup shown in FIG. 1 as compared to setups using lenses having a fixed focal length are evident. Thus, as can be seen in the curves in FIG. 2, the maxima in the sensor signal 146 are rather sharp. Consequently, when using a stack 124 of optical sensors 122, the distance between the optical sensors 122 has to be rather low in order to achieve a high resolution and in order to provide a sufficient resolution of the distance measurement. With the modulating setup shown in FIG. 1, contrarily, these technical constraints are lowered, and the optical sensors 122 may be spaced further apart. Further, even a single optical sensor 122 is sufficient, since, by using the focus-tunable lens 130, the optical sensor 122 can always be brought into focus during the focus-modulation, at least within a certain range of distances of the object 114. Consequently, the at least one focus-tunable lens 130, which may be a single focus-tunable lens or at least one focus-tunable lens being comprised in a more complex setup of optical lenses, may significantly reduce the complexity of the optical system of the optical detector 110.

[0648] The setup of the optical detector 110 shown in FIG. 1 may be modified and/or improved in various ways. Thus, the components of the optical detector 110 may fully or partially be integrated into one or more housings which are not shown in FIG. 1. As an example, the at least one focus-tunable lens 130 and the one or more optical sensors 122 may be integrated into a tubular housing. Further, the components 136 and/or 140 may also fully or partially be integrated into the same or a different housing. Further, as outlined above, the at least one optical detector 110 may comprise additional optical components and/or may comprise additional optical sensors which may or may not exhibit the above-mentioned FiP effect. As will be outlined in further detail below, one or more imaging devices may be integrated, such as one or more CCD and/or CMOS devices. Further, the setup shown in FIG. 1 is a linear setup of the beam path 132. It shall be noted, however, that other setups are feasible, such as setups with a bent optical path 132, comprising one or more reflective elements and/or setups in which the beam path 132 is split into two or more partial beam paths, such as by using one or more beam-splitting elements. Various other modifications which do not deviate from the general principle shown in FIG. 1 are feasible.

[0649] In FIG. 3, an embodiment of an optical detector 110 is shown in a similar view as in FIG. 1, wherein the optical detector 110 comprises a setup with several modifications of the embodiment in FIG. 1, which may be realized in an isolated fashion or in combination. The optical detector 110 may be embodied as a camera 152, as in the embodiment shown in FIG. 1, or may be part of a camera 152. For most of the details of the optical detector 110 as well as of a detector system 120 comprising the optical detector 110, reference may be made to FIG. 1 and the corresponding description.

[0650] Again, as in FIG. 1, the optical detector 110 comprises at least one optical sensor 122 exhibiting the above-mentioned FiP effect, wherein the at least one optical sensor 122, as in FIG. 1, may be used as at least one longitudinal optical sensor, denoted by z in FIG. 3. Again, a single optical sensor 122 or a plurality of optical sensors 122 may be used, such as a stack 124 of longitudinal optical sensors 122.

[0651] In addition, the optical detector 110 may comprise at least one transversal optical sensor 154, denoted by xy in FIG. 3. The at least one transversal optical sensor 154 may be separate from the at least one optical sensor 122 and/or may fully or partially be integrated into the at least one longitudinal optical sensor 122. The transversal optical sensor 154 is adapted to determine at least one transversal position of the light beam 116, wherein the transversal position is a position in at least one dimension, such as at least one plane perpendicular to the optical axis 112 of the optical detector 110. Thus, as in FIG. 1, a coordinate system 142 may be used, comprising a z-axis parallel to the optical axis 112, and one or more coordinates in a dimension perpendicular to the optical axis 112, such as Cartesian coordinates x, y. For potential setups of the at least one transversal optical sensor 154, as well as for the combination of the at least one transversal optical sensor 154 and the at least one longitudinal optical sensor 122, reference may be made, as an example, to US 2014/0291480 A1 or WO 2014/097181 A1. Specifically, for a potential sensor setup of the at least one transversal optical sensor, reference may be made to FIGS. 2A and 2B of these documents, as well as to the corresponding description. Further, with regard to potential setups of the at least one longitudinal optical sensor 122, reference may be made to FIGS. 4A to 4C of these documents, as well as to the corresponding description. Similarly, with regard to measurement principles and/or setups of the optical sensors 154, 122, reference may be made to one or more of FIGS. 1A, 1B or 1C of US 2014/0291480 A1 or WO 2014/097181 A1, as well as the corresponding description, wherein, in these setups, at least one focus-tunable lens may be added. It shall be noted, however, that other setups are feasible.

[0652] In the embodiment shown in FIG. 3, the evaluation device 140 may comprise, besides at least one z-evaluation device 156 for determining at least one item of information on a longitudinal position of the object 114, at least one xy-evaluation device 158, wherein the xy-evaluation device 158 may be adapted for generating at least one item of information on a transversal position of the object by evaluating the transversal sensor signal of the at least one transversal optical sensor 154. The devices 156, 158 may also be combined into a single device and/or may be embodied as software components, having software-encoded method steps adapted for performing the above-mentioned evaluation when run on a computer or computer device. For evaluation of the longitudinal optical sensor signal by the z-evaluation device 156, reference may be made to the method disclosed e.g. in FIG. 2, i.e. the detection of the maxima 148 and the corresponding algorithm described above. For the xy-evaluation device 158, reference may be made e.g. to the disclosure of US 2014/0291480 A1 and WO 2014/097181 A1 and the xy-detection disclosed therein. The information generated by devices 156, 158 may be combined, such as in an optional 3D-evaluation device 160, in order to generate three-dimensional information regarding the object 114. Again, the device 160 may fully or partially be combined with one or both of devices 156, 158 and/or may fully or partially be embodied as a software component.
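The combination of the transversal and longitudinal results, as performed by the optional 3D-evaluation device 160, can be sketched as follows. The data type and function names are hypothetical, chosen only to illustrate the data flow.

```python
from typing import NamedTuple

class Position3D(NamedTuple):
    """Illustrative 3D position in the detector coordinate system 142."""
    x_mm: float
    y_mm: float
    z_mm: float

def combine_evaluations(xy_mm, z_mm):
    """Merge the transversal (x, y) result of the xy-evaluation device
    with the longitudinal z result of the z-evaluation device into a
    single 3D position. Hypothetical sketch, not from the disclosure."""
    x_mm, y_mm = xy_mm
    return Position3D(x_mm, y_mm, z_mm)

# Example with assumed values: transversal position (1.5, -2.0) mm,
# longitudinal coordinate 350.0 mm along the optical axis:
p = combine_evaluations((1.5, -2.0), 350.0)
print(p)  # Position3D(x_mm=1.5, y_mm=-2.0, z_mm=350.0)
```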

[0653] In addition or as an alternative to the transversal optical sensor 154, the optical detector 110 in the embodiment shown in FIG. 3 may comprise one or more imaging devices 162. As an example, as shown in FIG. 3, the at least one imaging device 162 may be or may comprise at least one CCD and/or at least one CMOS chip. In the embodiment shown in FIG. 3, preferably, the optical sensors 122 as well as the transversal optical sensor 154 are fully or partially transparent, in order for the light beam 116 to fully or partially reach imaging device 162. Additionally or alternatively, however, as mentioned above, a branched setup may be used, by dividing the beam path 132 into two or more partial beam paths, wherein the imaging device 162 may also be located in a partial beam path. The imaging device 162 may generate one or more images or even a sequence of images, such as a video clip, of a scene captured by the optical detector 110. The image may, as an example, be evaluated by at least one optional image evaluation device 164, which may be part of the evaluation device 140 or, alternatively, may be embodied as a separate device. The image evaluation device 164, as an example, may comprise a storage device for storing images generated by the imaging device 162. Additionally or alternatively, however, image evaluation device 164 may also be embodied to perform an image analysis and/or an image processing, such as a filtering and/or a detection of certain features within the image. Thus, as an example, a pattern recognition algorithm and/or any type of object recognition may be embodied in the image evaluation device 164. Image evaluation device 164 may, again, be fully or partially integrated with one or more of devices 156, 158 or 160 and/or may fully or partially be embodied as a software component, having one or more software-encoded processing steps.
The information generated by the image evaluation device 164 may be combined with the information generated by the 3D-evaluation device 160.

[0654] As outlined above, the optical detector 110, the detector system 120 and the camera 152 may be used in various devices or systems. Thus, the camera 152 may be used specifically for 3D imaging, and may be adapted for acquiring still images and/or image sequences, such as digital video clips. FIG. 4, as an example, shows a detector system 120, comprising at least one optical detector 110, such as the optical detector 110 as disclosed in one or more of the embodiments shown in FIG. 1 or 3 or as shown in one or more of the embodiments shown in further detail below. In this regard, specifically with regard to potential embodiments, reference may be made to the disclosure given above or given in further detail below. As an exemplary embodiment, a detector setup similar to the setup shown in FIG. 3 is depicted in FIG. 4. FIG. 4 further shows an exemplary embodiment of a human-machine interface 166, which comprises the at least one detector 110 and/or the at least one detector system 120, and, further, an exemplary embodiment of an entertainment device 168 comprising the human-machine interface 166. FIG. 4 further shows an embodiment of a tracking system 170 adapted for tracking a position of at least one object 114, which comprises the detector 110 and/or the detector system 120.

[0655] With regard to the optical detector 110 and the detector system 120, reference may be made to the disclosure given above or given in further detail below. Basically, all potential embodiments of the detector 110 may also be embodied in the embodiment shown in FIG. 4. The evaluation device 140 may be connected to the at least one optical sensor 122, specifically the at least one FiP sensor 122. The evaluation device 140 may further be connected to the at least one optional transversal optical sensor 154 and/or the at least one optional imaging device 162. Further, again, at least one focus-modulation device 136 and at least one focus-tunable lens 130 are provided, wherein, optionally, the at least one focus-modulation device 136 may fully or partially be integrated into the evaluation device 140, as shown in FIG. 4. For connecting the above-mentioned devices 122, 154, 162 and 130 to the at least one evaluation device 140, as an example, at least one connector 172 may be provided and/or one or more interfaces, which may be wireless interfaces and/or wire-bound interfaces. Further, connector 172 may comprise one or more drivers and/or one or more measurement devices for generating sensor signals and/or for modifying sensor signals. Further, the evaluation device 140 may fully or partially be integrated into the optical sensors 122 and/or into other components of the optical detector 110. The optical detector 110 may further comprise at least one housing 174 which, as an example, may encase one or more of components 122, 154, 162 or 130. The evaluation device 140 may also be enclosed in housing 174 and/or in a separate housing.

[0656] In the exemplary embodiment shown in FIG. 4, the object 114 to be detected, as an example, may be designed as an article of sports equipment and/or may form a control element 176, the position and/or orientation of which may be manipulated by a user 178. Thus, generally, in the embodiment shown in FIG. 4 or in any other embodiment of the detector system 120, the human-machine interface 166, the entertainment device 168 or the tracking system 170, the object 114 itself may be part of the named devices and, specifically, may comprise at least one control element 176, specifically at least one control element 176 having one or more beacon devices 118, wherein a position and/or orientation of the control element 176 preferably may be manipulated by user 178. As an example, the object 114 may be or may comprise one or more of a bat, a racket, a club or any other article of sports equipment and/or fake sports equipment. Other types of objects 114 are possible. Further, the user 178 himself or herself may be considered as the object 114, the position of which shall be detected. As an example, the user 178 may carry one or more of the beacon devices 118 attached directly or indirectly to his or her body.

[0657] The optical detector 110 may be adapted to determine at least one item of information on a longitudinal position of one or more of the beacon devices 118 and, optionally, at least one item of information regarding a transversal position thereof, and/or at least one other item of information regarding the longitudinal position of the object 114 and, optionally, at least one item of information regarding a transversal position of the object 114. Additionally, the optical detector 110 may be adapted for identifying colors and/or for imaging the object 114. An opening 180 in the housing 174, which, preferably, may be located concentrically with regard to the optical axis 112 of the detector 110, preferably defines a direction of view 182 of the optical detector 110.

[0658] The optical detector 110 may be adapted for determining a position of the at least one object 114. Additionally, the optical detector 110, specifically in an embodiment including the camera 152, may be adapted for acquiring at least one image of the object 114, preferably a 3D-image. As outlined above, the determination of a position of the object 114 and/or a part thereof by using the optical detector 110 and/or the detector system 120 may be used for providing a human-machine interface 166, in order to provide at least one item of information to a machine 184. In the embodiments schematically depicted in FIG. 4, the machine 184 may be or may comprise at least one computer and/or a computer system. Other embodiments are feasible. The evaluation device 140 may be a computer and/or may comprise a computer and/or may fully or partially be embodied as a separate device and/or may fully or partially be integrated into the machine 184, particularly the computer. The same holds true for a track controller 186 of the tracking system 170, which may fully or partially form a part of the evaluation device 140 and/or the machine 184.

[0659] Similarly, as outlined above, the human-machine interface 166 may form part of the entertainment device 168. Thus, by means of the user 178 functioning as the object 114 and/or by means of the user 178 handling the object 114 and/or the control element 176 functioning as the object 114, the user 178 may input at least one item of information, such as at least one control command, into the machine 184, particularly the computer, thereby varying the entertainment function, such as controlling the course of a computer game.

[0660] As outlined above, the optical detector 110 may have a straight beam path or a tilted beam path, an angulated beam path, a branched beam path, a deflected or split beam path or other types of beam paths. Further, the light beam 116 may propagate along each beam path or partial beam path once or repeatedly, unidirectionally or bidirectionally. Thereby, the components listed above or the optional further components listed in further detail below may fully or partially be located in front of the at least one optical sensor 122 and/or behind the at least one optical sensor 122.

[0661] The optical detector 110 according to the present invention may further comprise additional elements. Thus, as an example, the optical detector 110 may comprise at least one spatial light modulator (SLM) 188, as schematically depicted in an embodiment shown in FIG. 5. The embodiment of the optical detector 110 shown therein widely corresponds to the embodiment shown in FIG. 1, with, optionally, at least one imaging device 162. Consequently, for most details of the embodiment, reference may be made to one or more of FIGS. 1 and 3, specifically with regard to the elements shown therein. Thus, again, the optical detector 110 comprises at least one focus-tunable lens 130 and one or more optical sensors 122 embodied as FiP sensors, which may act as longitudinal optical sensors. Further, as outlined above, optionally, at least one imaging device 162 may be provided. Additionally, the optical detector 110 comprises at least one spatial light modulator 188 adapted to modify at least one property of the light beam 116 in a spatially resolved fashion. The spatial light modulator 188 comprises a matrix 190 of pixels 192, each pixel 192 being controllable to individually modify the at least one optical property of a portion of the light beam 116 passing the pixel 192. The optical detector 110 further comprises at least one modulator device 194 adapted for periodically controlling at least two of the pixels 192 with different modulation frequencies. The evaluation device 140 is adapted for performing a frequency analysis in order to determine signal components of the sensor signal for the modulation frequencies.

[0662] For the functionality of the detector 110 including the at least one spatial light modulator 188, widely, reference may be made to one or more of U.S. provisional applications No. 61/867,180 dated Aug. 19, 2013, 61/906,430 dated Nov. 20, 2013, and 61/914,402 dated Dec. 11, 2013 as well as unpublished German patent application number 10 2014 006 279.1 dated Mar. 6, 2014, unpublished European patent application number 14171759.5 dated Jun. 10, 2014 and international patent application number PCT/EP2014/067466 as well as U.S. patent application Ser. No. 14/460,540, both dated Aug. 15, 2014, the full content of all of which is herewith included by reference. The functionality of the setup in FIG. 5 will be explained, with regard to its most important features, with reference to FIGS. 6 and 7.

[0663] Thus, FIG. 6 shows, in part, the setup of the embodiment of the optical detector 110 as depicted in FIG. 5, with the focus-tunable lens 130, the spatial light modulator 188 and, in this schematic view, two optical sensors 122. It shall be noted, however, that the setup may comprise additional elements, such as in one or more of the aforementioned embodiments of the optical detector and/or as in one or more of the embodiments to follow. In principle, a single optical sensor 122 is sufficient. However, a plurality of optical sensors 122 may increase the precision of the measurements. Further, in the schematic explanation of the functionality of the spatial light modulator as depicted in FIG. 6, the focus-modulation device 136 as well as the evaluation using signals generated by the focus-modulation device 136, corresponding to the functionality shown e.g. in FIGS. 1 and 3, are not depicted, for simplification purposes.

[0664] As outlined above, the optical detector 110 comprises at least one spatial light modulator 188, at least one optical sensor 122, and, further, at least one modulator device 194 and at least one evaluation device 140. The detector system 120, besides the at least one optical detector 110, may comprise at least one beacon device 118 which is at least one of attachable to an object 114, integratable into the object 114 or holdable by the object 114.

[0665] The optical detector 110, in this embodiment or other embodiments, may furthermore comprise one or more transfer devices 196, such as one or more lenses, preferably one or more camera lenses. The at least one focus-tunable lens 130 may be part of the at least one transfer device 196.

[0666] In the exemplary embodiment shown in FIG. 6, the spatial light modulator 188, the optical sensor 122 and the transfer device 196 are arranged along an optical axis 112 in a stacked fashion. The optical axis 112 defines a longitudinal axis or a z-axis, wherein a plane perpendicular to the optical axis 112 defines an xy-plane. Thus, in FIG. 6, a coordinate system 142 is shown, which may be a coordinate system of the optical detector 110 and in which, fully or partially, at least one item of information regarding a position and/or orientation of the object 114 may be determined. It shall be noted, however, that other coordinate systems may be used, such as coordinate systems of the object 114 and/or coordinate systems of a surrounding in which the optical detector 110 and/or the object 114 may freely move.

[0667] The spatial light modulator 188 in the exemplary embodiment shown in FIG. 6 may be a transparent spatial light modulator, as shown, or may be an intransparent spatial light modulator, such as a reflective spatial light modulator 188. For further details, reference may be made to the potential embodiments discussed above. The spatial light modulator 188 comprises a matrix 190 of pixels 192 which preferably are individually controllable to individually modify at least one property of a portion of a light beam 116 passing the respective pixel 192. In the exemplary and schematic embodiment shown in FIG. 6, the light beam is denoted by reference number 116 and may be one or more of emitted and/or reflected by the one or more beacon devices 118. As an example, the pixels 192 may be switched between a transparent state and an intransparent state and/or a transmission of the pixels may be switched between two or more transparent states and/or between a transparent state and an intransparent state. In case a reflective and/or any other type of spatial light modulator 188 is used, other types of optical properties may be switched. In the embodiment shown in FIG. 6, four pixels 192 are illuminated, such that the light beam 116 may be split into four portions, each of the portions passing through a different pixel 192. Thus, the optical property of the portions of the light beam 116 may be controlled individually by controlling the state of the respective pixels 192.

[0668] The modulator device 194 is adapted to individually control the pixels 192, preferably all of the pixels 192, of the matrix 190. Thus, as shown in the exemplary embodiment of FIG. 6, the pixels 192 may be controlled at different modulation frequencies, which, for the sake of simplicity, are denoted by the position of the respective pixel 192 in the matrix 190. Thus, for example, modulation frequencies f.sub.11 to f.sub.mn are provided for an m×n matrix 190. As outlined above, the term “modulation frequency” may refer to the fact that one or more of the actual frequency and the phase of the modulation may be controlled.
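The assignment of one dedicated modulation frequency per pixel may be sketched as follows in Python; the frequency plan (a base frequency plus a fixed spacing) and all numerical values are illustrative assumptions, not taken from the disclosure:

```python
# Sketch: assign a distinct, well-separated modulation frequency to each
# pixel (i, j) of an m×n spatial-light-modulator matrix. The frequency
# plan (base frequency plus fixed spacing) is an illustrative assumption.

def frequency_map(m, n, f_base=100.0, f_step=10.0):
    """Return {(i, j): f_ij} with a distinct frequency per pixel (Hz)."""
    return {
        (i, j): f_base + ((i - 1) * n + (j - 1)) * f_step
        for i in range(1, m + 1)
        for j in range(1, n + 1)
    }

fmap = frequency_map(2, 4)   # f_11 = 100 Hz, ..., f_24 = 170 Hz
```

Any plan works in principle, as long as the frequencies (and/or phases) are distinct and resolvable by the subsequent frequency analysis.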

[0669] Having passed the spatial light modulator 188, the light beam 116, now being influenced by the spatial light modulator 188, reaches the one or more optical sensors 122. Preferably, the at least one optical sensor 122 may be or may comprise a large-area optical sensor having a single and uniform sensor region 126. Due to the beam propagation properties, a beam width w will vary when the light beam 116 propagates along the optical axis 112.

[0670] The at least one optical sensor 122 generates at least one sensor signal S, which, in the embodiment shown in FIG. 6, is denoted by S.sub.1 and S.sub.2. At least one of the sensor signals (in the embodiment shown in FIG. 6 the sensor signal S.sub.1) is provided to the evaluation device 140 and, therein, to a demodulation device 198. The demodulation device 198, which, as an example, may contain one or more frequency mixers and/or one or more frequency filters, such as a low pass filter, may be adapted to perform a frequency analysis. As an example, the demodulation device 198 may contain a lock-in device and/or a Fourier analyzer. The modulator device 194 and/or a common frequency generator may further provide the modulation frequencies to the demodulation device 198. As a result, a frequency analysis may be provided which contains signal components of the at least one sensor signal for the modulation frequencies. In FIG. 6, the result of the frequency analysis symbolically is denoted by reference number 200. As an example, the result of the frequency analysis 200 may contain a histogram, in two or more dimensions, indicating signal components for each of the modulation frequencies, i.e. for each of the frequencies and/or phases of the modulation.
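As a minimal sketch of such a frequency analysis, the following Python fragment realizes the Fourier-analyzer variant: a simulated sensor signal, composed of portions modulated at four frequencies, is decomposed into per-frequency signal components. Sample rate, duration, frequencies and amplitudes are illustrative assumptions, not values from the disclosure.

```python
import numpy as np

# Sketch: Fourier-analyzer style frequency analysis. A simulated sensor
# signal is the superposition of four pixel portions, each modulated at
# its own frequency; an FFT separates the per-frequency components.

fs, T = 1000.0, 1.0                       # sample rate (Hz), duration (s)
t = np.arange(int(fs * T)) / fs
freqs = {(1, 3): 100.0, (1, 4): 110.0, (2, 3): 120.0, (2, 4): 130.0}
amps  = {(1, 3): 0.9,   (1, 4): 0.4,   (2, 3): 0.5,   (2, 4): 0.2}

# Simulated sensor signal S1: sum of the modulated pixel portions.
s = sum(a * np.cos(2 * np.pi * freqs[p] * t) for p, a in amps.items())

# Frequency analysis: amplitude at each modulation frequency's FFT bin.
spectrum = np.fft.rfft(s)
components = {p: 2.0 * abs(spectrum[int(round(f * T))]) / len(t)
              for p, f in freqs.items()}
# components reproduces amps: each frequency tags "its" pixel portion
```

The resulting `components` dictionary corresponds to the histogram-like result 200 of the frequency analysis.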

[0671] The evaluation device 140, which may contain one or more data processing devices 202 and/or one or more data memories 204, may further be adapted to assign the signal components of the result 200 of the frequency analysis to their respective pixels 192, such as by a unique relationship between the respective modulation frequency and the pixels 192. Consequently, for each of the signal components, the respective pixel 192 may be determined, and the portion of the light beam 116 passing through the respective pixel 192 may be derived.

[0672] Thus, even though a large-area optical sensor 122 may be used, various types of information may be derived from the frequency analysis, using the preferred unique relationship between the modulation of the pixels 192 and the signal components.

[0673] Thus, as a first example, information on a lateral position of an illuminated area or light spot 206 on the spatial light modulator 188 may be determined (x-y-position). Thus, as symbolically shown in FIG. 6, significant signal components arise for modulation frequencies f.sub.23, f.sub.14, f.sub.13 and f.sub.24. This exemplary embodiment allows for determining the positions of the illuminated pixels and the degree of illumination. In this embodiment, pixels (1,3), (1,4), (2,3) and (2,4) are illuminated. Since the position of the pixels 192 in the matrix 190 generally is known, it may be derived that the center of illumination is located somewhere in between these pixels, mainly within pixel (1,3). A more thorough analysis of the illumination may be performed, specifically if (which usually is the case) a larger number of pixels 192 is illuminated. Thus, by identifying the signal components having the highest amplitude, the center of illumination and/or a radius of the illumination and/or a spot-size or spot-shape of the light spot 206 may be determined. This option of determining the transversal coordinates is generally denoted by x, y in FIG. 6. Thus, the spatial light modulator 188 in the optical detector 110, in conjunction with an analysis of one or more sensor signals of the at least one optical sensor 122, may replace the function of the at least one optional transversal optical sensor 154 as depicted e.g. in the embodiments of FIGS. 3 and 4. Therefore, symbolically, in the evaluation device 140 shown in FIG. 5, an xy-evaluation device 158 is depicted as a part of the evaluation device 140, wherein the xy-evaluation device 158 is connected to the modulator device 194 and to the at least one optical sensor 122, in order to receive modulation information and sensor signals. It shall be noted, however, that other types of transversal optical sensors 154 may be used in addition, such as the ones described above in conjunction with FIGS. 1 and 3.
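A minimal sketch of such a center-of-illumination analysis, assuming the per-pixel signal components have already been extracted by the frequency analysis; the component values (four illuminated pixels, with pixel (1,3) carrying most of the power) are illustrative:

```python
# Sketch: determine the center of illumination on the SLM as the
# amplitude-weighted centroid of the per-pixel signal components.

def illumination_center(components):
    """components: {(i, j): amplitude} -> centroid (i_c, j_c)."""
    total = sum(components.values())
    i_c = sum(i * a for (i, j), a in components.items()) / total
    j_c = sum(j * a for (i, j), a in components.items()) / total
    return i_c, j_c

center = illumination_center(
    {(1, 3): 0.9, (1, 4): 0.4, (2, 3): 0.5, (2, 4): 0.2})
# the centroid lies between the four pixels, pulled towards pixel (1, 3)
```

With more illuminated pixels, the same weighting extends naturally to a more thorough analysis of the spot shape.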

[0674] By evaluating the illuminated pixels 192, i.e. by determining significant components in the sensor signal and assigning these components to respective pixels 192 of the spatial light modulator 188, a size of the light spot 206 may further be determined and evaluated. Therefrom, as described e.g. in U.S. provisional patent applications No. 61/867,180 dated Aug. 19, 2013, 61/906,430 dated Nov. 20, 2013, and 61/914,402 dated Dec. 11, 2013 as well as unpublished German patent application number 10 2014 006 279.1 dated Mar. 6, 2014, unpublished European patent application number 14171759.5 dated Jun. 10, 2014 and international patent application number PCT/EP2014/067466 as well as U.S. patent application Ser. No. 14/460,540, both dated Aug. 15, 2014, a further possibility of generating at least one item of information regarding a longitudinal position of the object 114 and/or a part thereof, and/or of the at least one beacon device 118, arises, since the width of the light beam 116 may be correlated to the longitudinal position of the object 114, as explained e.g. in one or more of WO 2012/110924 A1, US 2012/0206336 A1, US 2014/0291480 A1 or WO 2014/097181 A1. In FIG. 6, the option of determining a width of the light spot 206 on the spatial light modulator 188 is symbolically depicted by w.sub.0.
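A coarse estimate of the spot size may, for instance, be obtained from the number of pixels whose signal component exceeds a threshold; the pixel pitch, the threshold and the circular-spot assumption in this sketch are illustrative assumptions:

```python
import math

# Sketch: coarse spot-size estimate w0 on the SLM from the number of
# pixels whose signal component exceeds a threshold, converting the
# illuminated pixel area into an equivalent circular radius.

def spot_radius(components, pixel_pitch, threshold):
    """Equivalent circular radius of the above-threshold pixel area."""
    n_lit = sum(1 for a in components.values() if a >= threshold)
    area = n_lit * pixel_pitch ** 2          # illuminated area (m^2)
    return math.sqrt(area / math.pi)

w0 = spot_radius({(1, 3): 0.9, (1, 4): 0.4, (2, 3): 0.5, (2, 4): 0.2},
                 pixel_pitch=50e-6, threshold=0.1)   # 4 lit pixels
```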

[0675] By determining a transversal or lateral position of the light spot 206 on the spatial light modulator 188, using known imaging properties of the transfer device 196, a transversal coordinate of the object 114 and/or of the at least one beacon device 118 may be determined. Thus, at least one item of information regarding a transversal position of the object 114 may be generated.

[0676] Further, at least if the beam properties of the light beam 116 are known or may be determined (such as by using one or more beacon devices 118 emitting light beams 116 having well-defined propagation properties), the beam width w.sub.0 may further be used, alone or in conjunction with beam widths w.sub.1 and/or w.sub.2 determined by using the optical sensors 122, in order to determine a longitudinal coordinate (z-coordinate) of the object 114 and/or the at least one beacon device 118, as disclosed e.g. in WO 2012/110924 A1, US 2012/0206336 A1, US 2014/0291480 A1 or WO 2014/097181 A1.

[0677] In addition or alternatively to the option of determining one or both of at least one transversal coordinate x, y and/or determining at least one longitudinal coordinate z, the information derived by the frequency analysis may further be used for deriving color information. Thus, as will be outlined in further detail below, the pixels 192 may have differing spectral properties, specifically different colors. Thus, as an example, the spatial light modulator 188 may be a multi-color or even full-color spatial light modulator 188. Thus, as an example, at least two, preferably at least three different types of pixels 192 may be provided, wherein each type of pixels 192 has a specific filter characteristic, having a high transmission e.g. in the red, the green or the blue spectral range. As used herein, the term red spectral range refers to a spectral range of 600 to 780 nm, the green spectral range refers to a range of 490 to 600 nm, and the blue spectral range refers to a range of 380 nm to 490 nm. Other embodiments, such as embodiments using different spectral ranges, may be feasible.

[0678] By identifying the respective pixels 192 and assigning each of the signal components to a specific pixel 192, the color components of the light beam 116 may be determined. Thus, specifically by analyzing signal components of neighboring pixels 192 having different transmission spectra, assuming that the intensity of the light beam 116 on these neighboring pixels is more or less identical, the color components of the light beam 116 may be determined. Thus, generally, the evaluation device 140, in this embodiment or other embodiments, may be adapted to derive at least one item of color information regarding the light beam 116, such as by providing at least one wavelength and/or by providing color coordinates of the light beam 116, such as CIE-coordinates.
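Assuming locally uniform intensity across neighboring pixels, such a color evaluation may be sketched as follows; the filter layout and all numerical values are illustrative assumptions:

```python
# Sketch: estimate color components from signal components of
# neighboring pixels carrying different color filters, assuming roughly
# uniform intensity over the neighborhood.

def color_components(components, filter_of):
    """Average the signal components per filter type ('R', 'G', 'B')."""
    sums, counts = {}, {}
    for pixel, a in components.items():
        c = filter_of[pixel]
        sums[c] = sums.get(c, 0.0) + a
        counts[c] = counts.get(c, 0) + 1
    return {c: sums[c] / counts[c] for c in sums}

rgb = color_components(
    {(1, 3): 0.9, (1, 4): 0.3, (2, 3): 0.3, (2, 4): 0.9},
    {(1, 3): 'R', (1, 4): 'G', (2, 3): 'G', (2, 4): 'R'})
# rgb -> {'R': 0.9, 'G': 0.3}: the spot is predominantly red here
```

The averaged per-filter components could then be converted into a wavelength estimate or CIE-coordinates by a calibrated color model.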

[0679] As outlined above, for determining at least one longitudinal coordinate of the object 114 and/or the at least one beacon device 118, a relationship between the width w of the beam and a longitudinal coordinate may be used, such as the relationship of a Gaussian light beam as disclosed in formula (3) above. The formula assumes a focus of the light beam 116 at position z=0. From a shift of the focus, i.e. from a coordinate transformation along the z-axis, a longitudinal position of the object 114 may be derived.

[0680] In addition or alternatively to using the beam width w.sub.0 at the position of the spatial light modulator 188, a beam width w at the position of the at least one optical sensor 122 may be derived and/or used for determining the longitudinal position of the object 114 and/or the beacon device 118. Thus, as outlined above, the at least one optical sensor 122 is a FiP-sensor, as discussed above and as discussed in further detail e.g. in WO 2012/110924 A1, US 2012/0206336 A1, US 2014/0291480 A1 or WO 2014/097181 A1. Thus, given the same total power of illumination, the signal S depends on the beam width w of the respective light spot 206 on the sensor region 126 of the optical sensor 122. This effect may be pronounced by modulating the light beam 116, such as by the spatial light modulator 188, by the focus-tunable lens 130 and/or by any other modulation device. The modulation may be the same modulation as provided by the modulator device 194 and/or may be a different modulation, such as a modulation at higher or lower frequencies. Thus, as an example, the emission and/or reflection of the at least one light beam 116 by the at least one beacon device 118 may take place in a modulated way. Thus, as an example, the at least one beacon device 118 may comprise at least one illumination source which may be modulated individually.

[0681] As outlined above with reference to FIG. 2 and as explained in great detail in one or more of WO 2012/110924 A1, US 2012/0206336 A1, US 2014/0291480 A1 or WO 2014/097181 A1, due to the FiP-effect, the signal S.sub.1 and/or S.sub.2 depend on a beam width w.sub.1 or w.sub.2, respectively. Thus, e.g. by using equation (3) given above, beam parameters of the light beam 116 may be derived, such as z.sub.0 and/or the origin of the z-axis (z=0). From these parameters, the longitudinal coordinate z of the object 114 and/or of one or more of the beacon devices 118 may be derived.
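A minimal sketch of this evaluation, assuming known beam parameters (waist w0 and Rayleigh length zR, e.g. for a beacon device emitting a well-defined beam) and the Gaussian width relation of formula (3); the sensor positions and all values are illustrative, and the sign heuristic assumes the focus does not lie between the two sensors:

```python
import math

# Sketch: derive the focal position z_f from beam widths w1, w2 measured
# by two sensors at known positions z1 < z2, using the Gaussian relation
#     w(z) = w0 * sqrt(1 + ((z - z_f) / zR) ** 2)
# with assumed-known waist w0 and Rayleigh length zR.

def focal_position(w1, z1, w2, z2, w0, zR):
    """Each width fixes |z - z_f|; comparing w1 and w2 fixes the sign."""
    d1 = zR * math.sqrt((w1 / w0) ** 2 - 1.0)  # sensor-1 distance to focus
    # Beam converging towards sensor 2 (w1 > w2): focus lies beyond z1;
    # beam diverging (w1 < w2): focus lies before z1.
    return z1 + d1 if w1 >= w2 else z1 - d1

# Synthetic check: focus at z_f = 0.02 m, sensors at 0.05 m and 0.08 m.
w0, zR, zf = 1e-3, 0.01, 0.02
w = lambda z: w0 * math.sqrt(1.0 + ((z - zf) / zR) ** 2)
zf_est = focal_position(w(0.05), 0.05, w(0.08), 0.08, w0, zR)
# zf_est recovers the assumed focal position of 0.02 m
```

From the recovered focal shift, the longitudinal coordinate z of the object or beacon device follows via the imaging properties of the transfer device.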

[0682] Thus, the setup using the at least one spatial light modulator 188 may simply be used for generating xy-information regarding the object 114 and/or at least one part thereof, such as of one or more of the beacon devices 118. Depth information, i.e. z-information, regarding the object 114 and/or at least one part thereof, such as of the at least one beacon device 118, may be generated by evaluating the at least one sensor signal of the at least one optical sensor 122 exhibiting the FiP effect. It shall be noted, however, that the spatial light modulator 188 may further be used for generating pixelated images with depth information for each pixel since, for each part or at least some parts of an image captured by the optical detector 110 and/or a camera 152 comprising the optical detector 110, depth information may be evaluated for each pixel 192, for some of the pixels 192 or for groups of pixels 192 such as for superpixels comprising a plurality of pixels 192. Further, one or more imaging devices 162 may be used for image generation, such as in the setups shown in FIGS. 3 and 5, and depth information for the pixels or at least some of the pixels of one or more images generated by the at least one optional imaging device 162 may be generated.

[0683] In FIG. 7, a setup of the modulator device 194 and of a demodulation device 198 is disclosed in a symbolic fashion, which allows for separating signal components (indicated by S.sub.11 to S.sub.mn) for the pixels 192 of the m×n matrix 190. Thus, the modulator device 194 may be adapted for generating a set of modulation frequencies f.sub.11 to f.sub.mn, for the entire matrix 190 and/or for a part thereof, such as for one or more superpixels comprising a plurality of pixels 192. As outlined above, each of the modulation frequencies f.sub.11 to f.sub.mn may include a respective frequency and/or a respective phase for the pixel 192 indicated by the indices i, j, with i=1 . . . m and j=1 . . . n. The set of frequencies f.sub.11 to f.sub.mn is both provided to the spatial light modulator 188, for modulating the pixels 192, and to the demodulation device 198. In the demodulation device 198, simultaneously or subsequently, the modulation frequencies f.sub.11 to f.sub.mn are mixed with the respective signal S to be analyzed, such as by using one or more frequency mixers 208. The mixed signal, subsequently, may be filtered by one or more frequency filters, such as one or more low pass filters 210, preferably with well-defined cutoff frequencies. The setup comprising the one or more frequency mixers 208 and the one or more low pass filters 210 generally is used in lock-in analyzers and is generally known to the skilled person.
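The mixer-plus-low-pass scheme of FIG. 7 may be sketched as follows in Python, with a plain mean over an integer number of periods standing in for the low pass filter 210; all numerical values are illustrative assumptions:

```python
import numpy as np

# Sketch: lock-in style demodulation as in FIG. 7. The sensor signal is
# mixed with each modulation frequency f_ij (frequency mixer 208) and
# averaged over an integer number of periods (low pass filter 210).

fs, T = 1000.0, 1.0
t = np.arange(int(fs * T)) / fs
freqs = {(1, 1): 50.0, (1, 2): 60.0, (2, 1): 70.0, (2, 2): 80.0}
amps  = {(1, 1): 0.8,  (1, 2): 0.1,  (2, 1): 0.3,  (2, 2): 0.6}

# Simulated sensor signal: superposition of the modulated pixel portions.
s = sum(a * np.cos(2 * np.pi * freqs[p] * t) for p, a in amps.items())

def lock_in(signal, f_ref):
    """Mixer + low pass: recovers the in-phase amplitude at f_ref."""
    mixed = signal * np.cos(2 * np.pi * f_ref * t)
    return 2.0 * mixed.mean()

components = {p: lock_in(s, f) for p, f in freqs.items()}
# components reproduces amps, one signal component S_ij per pixel
```

In practice, one and the same mixer and filter may be reused sequentially for the different channels, as noted for FIG. 7.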

[0684] By using the demodulation device 198, signal components S.sub.11 to S.sub.mn may be derived, wherein each signal component is assigned to a specific pixel 192, according to its index. It shall be noted, however, that other types of frequency analyzers may be used, such as Fourier analyzers, and/or that one or more of the components shown in FIG. 7 may be combined, such as by subsequently using one and the same frequency mixer 208 and/or one and the same low pass filter 210 for the different channels.

[0685] As outlined above, various setups of the optical detector 110 are possible. Thus, as an example, the optical detector 110 as e.g. shown in FIG. 1, 3, 4 or 5 may comprise one or more optical sensors 122. These optical sensors 122 may be identical or different. Thus, as an example, one or more large-area optical sensors 122 may be used, providing a single sensor region 126. Additionally or alternatively, one or more pixelated optical sensors 122 may be used. Further, besides one or more optical sensors 122 exhibiting the above-mentioned FiP effect, one or more further optical sensors may be included which do not necessarily have to show the FiP effect. Further, in case a plurality of optical sensors 122 is provided, the optical sensors 122 may provide identical or different spectral properties, such as identical or different absorption spectra. Further, in case a plurality of optical sensors 122 is provided, one or more of the optical sensors 122 may be organic and/or one or more of the optical sensors 122 may be inorganic. A combination of organic and inorganic optical sensors 122 may be used.

[0686] Further optional modifications of the setup of the optical detector 110 refer to the design of the beam path 132. Thus, as outlined above, the beam path 132 along which the at least one light beam 116 propagates within the optical detector 110 may be a single beam path 132 or may be split into a plurality of partial beam paths. Further, the beam path 132 may be a straight beam path or may be bent, tilted, back-reflected or the like, as the skilled person will recognize. An exemplary embodiment of an optical detector 110 having a split beam path is shown in FIG. 8. In FIG. 8, the light beam 116 enters the optical detector 110 from the left, by passing at least one transfer device 196, which, again, may include the at least one focus-tunable lens 130. The light beam 116 propagates along an optical axis 112 and/or a beam path 132. Subsequently, by one or more beam splitting elements 212 such as one or more prisms, one or more semi-transparent mirrors or one or more dichroic mirrors, the light beam 116 is split into a first partial light beam 214 travelling along a first partial beam path 216, and a second partial light beam 218, propagating along a second partial beam path 220. A spatial light modulator 188 may be located in the first partial beam path 216. In this embodiment, the spatial light modulator 188 is depicted as a reflective spatial light modulator, deflecting the first partial light beam 214 towards a stack 124 of optical sensors 122. Alternatively, other setups are feasible. Thus, as an example, a transparent spatial light modulator 188 may be used, such as by using a spatial light modulator 188 based on liquid crystals, thereby rendering the first partial beam path 216 straight.

[0687] In one or both of the partial beam paths 216, 220, at least one intransparent optical sensor element may be located, such as at least one imaging device 162. In the setup shown in FIG. 8, the imaging device 162 is located in the second partial beam path 220, whereas the stack of optical sensors 122 is located in the first partial beam path 216. Again, as an example, the at least one imaging device 162 may be or may comprise at least one CCD- and/or CMOS-chip, more preferably a full-color or RGB CCD- or CMOS chip. Thus, as in the setup of FIG. 8, the second partial beam path 220 may be dedicated to imaging and/or determining x- and/or y-coordinates, whereas the first partial beam path 216 may be dedicated to determining a z-coordinate, wherein, still, in this embodiment or other embodiments, an x-y-detector may be present in the first partial beam path 216. One or more individual additional optical elements 222, 224 may be present within the partial beam paths 216, 220, such as one or more lenses, filters, diaphragms or other optical elements.

[0688] It shall further be noted that the spatial light modulator 188 in the setup shown in FIG. 8 may be separate from the beam-splitting element 212. Additionally or alternatively, however, in case a reflective spatial light modulator 188 is used, the spatial light modulator 188 may also be part of the beam-splitting element 212.

[0689] In the exemplary embodiments shown in FIGS. 5 and 8, the at least one optional spatial light modulator 188 is separate from the at least one focus-tunable lens 130. It is, however, also possible to fully or partially integrate the at least one focus-tunable lens 130 with the spatial light modulator 188 or vice versa. An exemplary embodiment of this type is shown in FIG. 9. It shall be noted that the setup shown in FIG. 9 may be combined with other embodiments of the optical detector 110, such as with more complex beam paths 132, e.g. with split beam paths and/or with one or more beam-splitting elements. Thus, FIG. 9 simply shows an example of an integration of the at least one focus-tunable lens 130 into the spatial light modulator 188, without restricting further embodiments of the optical detector 110.

[0690] Thus, the embodiment shown in FIG. 9 may widely correspond to the embodiment of the optical detector and/or the camera 152 shown in FIG. 5. Consequently, with regard to most components of the optical detector 110, reference may be made to the description of FIG. 5 above. In this embodiment, however, the at least one focus-tunable lens 130 is integrated with the spatial light modulator 188, by using a spatial light modulator 188 having a micro-lens array 226 with a matrix of pixels 192, wherein each pixel 192 preferably has at least one micro-lens 228 embodied as a focus-tunable lens 130. For potential embodiments and setups, reference may be made to the setup of lens arrays as disclosed e.g. in C. U. Murade et al., Optics Express, Vol. 20, No. 16, 18180-18187 (2012). It shall be noted, however, that other embodiments of the micro-lens array 226 and/or of the focus-tunable micro-lenses 228, 130 are feasible.

[0691] In case the at least one focus-tunable lens 130 is combined with the at least one spatial light modulator 188, the at least one property of the partial light beams which is modified by the spatial light modulator 188 specifically may be a focal position of the light beam 116 and/or of the partial light beam passing the respective pixel 192. Consequently, the light beam 116 may be split into a plurality of partial light beams, according to the micro-lenses 228 through which these portions of the light beam 116 pass, wherein beam properties such as focal positions and/or Gaussian beam properties of each partial light beam may be modulated and/or modified by the micro-lenses 228. Consequently, the at least one focus-modulation device 136, in this embodiment or other embodiments in which the spatial light modulator 188 and the at least one focus-tunable lens 130 are fully or partially combined, may fully or partially be combined with the at least one modulator device 194 of the spatial light modulator 188. Consequently, the at least one focus-modulating signal 138 generated by the focus-modulation device 136 may fully or partially be identical to the at least one modulation signal generated by the modulator device 194 of the spatial light modulator 188. Therein, preferably, each pixel 192, i.e. each micro-lens 228, may be individually controlled by corresponding focus-modulating signals 138. For providing focus-modulating signals 138 to each pixel 192, appropriate multiplexing schemes may be used, as known from passive-matrix liquid crystal devices, and/or focus-modulating signals 138 may be provided simultaneously to all pixels 192 and/or to a plurality of pixels 192, as known e.g. from active-matrix display devices.

[0692] In the setups shown e.g. in FIG. 5 or 8, in which the at least one focus-tunable lens 130 is fully or partially separate from the at least one optional spatial light modulator 188, the evaluation of the sensor signals, as shown e.g. in the context of FIG. 2 above, may be separate from the functionality of the spatial light modulator 188. Consequently, a focus-modulation may take place for all pixels 192 of the spatial light modulator 188. In the setup in which the spatial light modulator 188 is fully or partially integrated with the at least one focus-tunable lens 130, such as by using the micro-lens array 226, an individual evaluation of the partial light beams passing through the pixels 192 is possible. Thus, each pixel 192 or one or more groups of pixels 192, such as superpixels having a plurality of individual pixels 192, may be controlled at a unique, common modulation frequency, thereby allowing for using the evaluation scheme as disclosed e.g. in the context of FIG. 2 above for each of these pixels, groups of pixels or superpixels, in order to evaluate and determine depth information for these pixels. In order to assign groups of pixels or superpixels to specific elements within a scene captured by the optical detector 110, the at least one imaging device 162 may be used. Thus, by using e.g. conventional image recognition algorithms, such as algorithms adapted for detecting specific elements or objects within an image captured by the imaging device 162, areas within the image may be identified, and corresponding superpixels within the matrix 190 may be defined.
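
As a simplified, purely illustrative sketch of the superpixel assignment described above (not part of the claimed disclosure), the following Python function maps a bounding box, as detected by an image recognition algorithm in the image of the imaging device 162, onto a set of pixel indices of the spatial light modulator matrix 190. The function name, the rectangular superpixel shape and the assumption of geometrically aligned image and modulator coordinates are illustrative choices.

```python
def superpixel_from_bbox(bbox, image_shape, slm_shape):
    """Return the set of modulator pixel indices (row, col) covered by a bounding box.

    bbox        -- (x0, y0, x1, y1) in image coordinates, x1/y1 exclusive
    image_shape -- (width, height) of the image from the imaging device
    slm_shape   -- (columns, rows) of the spatial light modulator matrix
    """
    iw, ih = image_shape
    sw, sh = slm_shape
    x0, y0, x1, y1 = bbox
    # Scale image coordinates to modulator coordinates (assumes aligned optics).
    c0 = int(x0 * sw / iw)
    c1 = max(c0 + 1, int(x1 * sw / iw))
    r0 = int(y0 * sh / ih)
    r1 = max(r0 + 1, int(y1 * sh / ih))
    return {(r, c) for r in range(r0, r1) for c in range(c0, c1)}
```

For example, with a 100x100 pixel image and a 10x10 modulator matrix, the bounding box (0, 0, 50, 50) would be assigned to the 25 modulator pixels of one quadrant of the matrix.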

[0693] An exemplary and simplified embodiment of this evaluation scheme is shown in FIG. 10. FIG. 10 shows a top view onto the matrix 190 of pixels 192 of the micro-lens array 226. Each pixel 192 comprises a focus-tunable lens 130 embodied as a micro-lens 228.

[0694] Within the matrix 190 of pixels 192, in this simplified embodiment, two superpixels 230, 230′ are defined, each having a plurality of pixels 232, 232′ assigned to the superpixels 230, 230′, respectively. The definition of the at least one superpixel 230, 230′ may, as an example, be made in accordance with results of an evaluation of one or more images generated by the imaging device 162. Thus, as an example, each superpixel 230, 230′ may correspond to an object and/or a pattern detected within the at least one image. Further, in case a sequence of images is generated, such as in a video clip, the definition of the at least one superpixel 230, 230′ may be fixed or may vary, such as from image to image of the image sequence. Thereby, as an example, one or more objects 114 within a scene captured by the optical detector 110 may be tracked. In each image, the at least one object 114 or the image thereof may be identified, and, correspondingly, one or more superpixels 230, 230′ may be defined on the spatial light modulator 188, wherein the pixels 232, 232′ assigned to the superpixels 230, 230′ are pixels through which partial light beams propagating from the at least one object 114 towards the optical detector 110 actually pass.

[0695] The pixels 232, 232′ assigned to the one or more superpixels 230, 230′ may be controlled at a common modulation frequency, such as by periodically modulating the micro-lenses 228 of these pixels 232, 232′. In case more than one superpixel 230, 230′ is defined, the superpixels 230, 230′ may be assigned different modulation frequencies, such as a first modulation frequency f₁ for the pixels 232 of the first superpixel 230 and a second modulation frequency f₂ for the pixels 232′ of the second superpixel 230′, with f₁≠f₂. The remaining pixels 234 of the matrix 190, which are not assigned to the at least one superpixel 230, 230′, may remain unmodulated or may be modulated at a modulation frequency different from the modulation frequency of the pixels 232, 232′ assigned to the one or more superpixels 230, 230′, such as a third modulation frequency f₃, with f₃≠f₁ and f₃≠f₂.
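
As an illustrative sketch (not taken from the disclosure) of driving the pixels assigned to different superpixels at distinct modulation frequencies while leaving the remaining pixels unmodulated, the following Python code computes a sinusoidal focal-length value for every modulator pixel at a given time. The function names, the sinusoidal waveform and all numeric values are assumptions.

```python
import math

def focal_length(t, freq, f_min=0.01, f_max=0.05):
    """Sinusoidal focal-length modulation (in metres) at time t (in seconds)."""
    mid = (f_max + f_min) / 2.0
    amp = (f_max - f_min) / 2.0
    return mid + amp * math.sin(2.0 * math.pi * freq * t)

def drive_matrix(t, superpixels, frequencies, shape, f_rest=0.03):
    """Focal length of every modulator pixel at time t.

    superpixels -- list of sets of (row, col) indices, one set per superpixel
    frequencies -- one modulation frequency per superpixel (pairwise distinct)
    shape       -- (rows, cols) of the modulator matrix
    f_rest      -- fixed focal length of the remaining, unmodulated pixels
    """
    rows, cols = shape
    out = [[f_rest] * cols for _ in range(rows)]
    for pixels, freq in zip(superpixels, frequencies):
        for r, c in pixels:
            out[r][c] = focal_length(t, freq)
    return out
```

Evaluating `drive_matrix` at successive time steps yields one focus-modulating signal per pixel, periodic at the frequency of the superpixel the pixel belongs to.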

[0696] By using the evaluation scheme shown e.g. in FIG. 2, depth information regarding the at least one object 114 or a part thereof, corresponding to the at least one superpixel 230, 230′, may be generated. Thus, in the simplified example shown in FIG. 10, the object 114 may be a schematic human being, which is identified by image evaluation of the image generated by the imaging device 162. By periodically modulating the pixels 232 assigned to the superpixel 230 of the object 114, signals generated by light beams 116 propagating from this object 114 to the optical detector 110 may be separated from background signals and, additionally, depth information regarding the object 114 may be generated, using e.g. the evaluation scheme discussed above in the context of FIG. 2. Consequently, the focal length signal 144 in FIG. 2 may be the focal length curve having the modulation frequency of the pixels 232 assigned to the superpixel 230, and, consequently, the maxima 148 may be assigned to the object 114. By locating these maxima and by determining the focal length f at which these maxima occur, at least one item of information on a longitudinal position of the object 114 may be generated. In case more than one superpixel is defined, such as the superpixels 230, 230′ in FIG. 10, the pixels 232, 232′ may, as outlined above, be modulated at different modulation frequencies f₁, f₂. By the frequency analysis as shown e.g. in FIG. 2, the maxima 148 (and/or, analogously, minima) may be separated and assigned to the respective frequencies. Thus, as an example, a first type of maxima 148 may occur in curve 146 at a periodicity corresponding to the first modulation frequency f₁, and a second type of maxima 148 may occur in curve 146 at a periodicity corresponding to the second modulation frequency f₂. By frequency separation, such as by electronic filtering and/or by analysis of curve 146, these maxima 148 may be separated and, for each modulation frequency, the focal length may be determined at which the object 114 corresponding to the respective superpixel 230, 230′ is in focus. Therefrom, such as by using the above-mentioned lens equation, at least one item of longitudinal information on each of the objects 114 may be generated.

[0697] Thus, as outlined above, the evaluation scheme disclosed in the context of FIG. 2 may generally also be applied to a plurality of objects 114. Thus, as can be seen in FIG. 2, the maxima 148 occur at a specific frequency of modulation, corresponding to the frequency of the focal length curve 144. In case a plurality of superpixels 230, 230′ having different modulation frequencies is used, a frequency separation may be performed, such as by using hardware filters and/or electronic filters and/or by generating histograms similar to the frequency analysis shown in FIG. 6. Thereby, signals and maxima 148 may be separated according to their modulation frequencies and, thus, maxima 148 and/or minima may be assigned to the corresponding superpixels 230, 230′.
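
The frequency separation outlined in the two preceding paragraphs may be sketched, purely for illustration, as a single-bin discrete Fourier transform (a software lock-in) evaluated at each superpixel's modulation frequency. The sampling rate, the frequencies and the amplitudes below are assumptions, not values from the disclosure.

```python
import cmath
import math

def dft_amplitude(samples, sample_rate, freq):
    """Amplitude of the Fourier component of `samples` at `freq` (Hz)."""
    n = len(samples)
    acc = sum(s * cmath.exp(-2j * math.pi * freq * k / sample_rate)
              for k, s in enumerate(samples))
    return 2.0 * abs(acc) / n

# Synthetic sensor signal: a component at 80 Hz (first superpixel), a
# component at 130 Hz (second superpixel) and a constant background.
rate = 1000.0
signal = [1.0 * math.sin(2.0 * math.pi * 80.0 * k / rate)
          + 0.5 * math.sin(2.0 * math.pi * 130.0 * k / rate)
          + 2.0
          for k in range(1000)]

a1 = dft_amplitude(signal, rate, 80.0)   # component assigned to the first superpixel
a2 = dft_amplitude(signal, rate, 130.0)  # component assigned to the second superpixel
```

Because the two modulation frequencies fall on distinct, orthogonal frequency bins, each amplitude isolates the contribution of one superpixel, and the constant background does not leak into either bin.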

[0698] By using evaluation schemes of this type, depth information may be generated for specific pixels 192 of the spatial light modulator 188 and/or of one or more images generated by the imaging device 162, for more than one pixel 192, for groups of pixels 192 or superpixels 230, 230′, or even for all of the pixels of an image generated by the imaging device 162. By combining the image generated by the imaging device 162 with the depth information generated by using the optical detector 110, three-dimensional images or at least images having depth information for one or more regions within the image may be generated.

[0699] A setup of the optical detector 110 in which the at least one focus-tunable lens 130 and the at least one optional spatial light modulator 188 are combined may be used for designing a camera 152 which shows all or at least some of the objects within a scene captured by the optical detector 110 in focus and which can also determine depth. Thus, a camera lens may be replaced fully or partially by the at least one focus-tunable lens array having the micro-lens array 226 of focus-tunable micro-lenses 228, 130. The lens focus of these micro-lenses may oscillate periodically, such as for one or more selected areas of the matrix 190, such as for one or more superpixels 230, 230′. Thus, for these modulated micro-lenses 228, the focus may be changed from a minimum to a maximum focal length and back. By changing the amplitude and/or the offset of the focus, different focus levels may be analyzed. For example, an object 114 in the front can be analyzed in detail, using a short focal length of the corresponding superpixel 230, 230′ or array of micro-lenses, while an object 114 in the back of the scene can be analyzed, such as simultaneously, by using a longer focal length. In order to distinguish the different focus levels, the micro-lenses 228 may be oscillated at different frequencies, which makes a separation possible, such as by using a fast Fourier transform (FFT) and/or other means of frequency selection.

[0700] While the focus oscillates, the at least one sensor signal of the at least one optical sensor 122 being embodied as a FiP sensor will show local minima and/or maxima whenever an object is in focus on the corresponding optical sensor 122. The imaging device 162, such as the CCD chip and/or the CMOS chip, having a plurality of imaging pixels, may record an image at the focal length at which the FiP curve shows a minimum or maximum. Thus, a simple scheme may be obtained for generating an image that has all objects or at least some objects in focus.

[0701] The focal length at which a specific optical sensor 122 being embodied as a FiP sensor detects an object in focus may be used to calculate a relative or absolute depth of the corresponding object 114. In connection with image analysis and/or filters, a 3D-image may be calculated.
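
Purely as an illustration of the depth calculation mentioned above, the following Python sketch applies the thin-lens equation 1/f = 1/b + 1/g to convert the in-focus focal length f into an object distance g, assuming a fixed, known image distance b. The function name and the numeric values are assumptions.

```python
def object_distance(f, b):
    """Object distance g from the thin-lens equation 1/f = 1/b + 1/g.

    f -- focal length at which the FiP signal shows its extremum (metres)
    b -- image distance between lens and optical sensor (metres, assumed known)
    """
    if f >= b:
        raise ValueError("no real object distance: f must be smaller than b")
    return 1.0 / (1.0 / f - 1.0 / b)

# Example: an in-focus focal length of f = 50 mm with an image distance of
# b = 60 mm places the object at g = 0.3 m in front of the lens.
g = object_distance(0.05, 0.06)
```

Repeating this calculation per superpixel yields a relative or absolute depth per detected object, which may then be merged with the image for a 3D reconstruction.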

[0702] The use of spatial light modulators 188 having a micro-lens array 226 composed of a plurality of focus-tunable lenses 130 provides advantages over other types of spatial light modulators, such as spatial light modulators based on micro-mirror systems. Thus, as an advantage, it may be emphasized that, typically, background light may still be transmitted regardless of the focus of the micro-lens and, therefore, may be present as a background signal such as a DC signal in the sensor signal of the optical sensor 122. This background signal, however, may easily be subtracted from the actual modulated signal, such as by using a high pass filter. In case a reflective spatial light modulator 188 is used, such as a micro-mirror array, the signal of the object in focus and the signal of the background light are typically both modulated at the same frequency, which makes a separation of the desired signal of the object and the background signal difficult.
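
The subtraction of the unmodulated background component mentioned above may be sketched as follows. Subtracting the signal mean is used here as a crude software stand-in for the high-pass filter, and the signal model and all numeric values are assumptions.

```python
import math

def remove_dc(samples):
    """Subtract the mean so that only the modulated signal component remains."""
    mean = sum(samples) / len(samples)
    return [s - mean for s in samples]

# Synthetic FiP sensor signal: a 50 Hz modulated component of amplitude 1 on
# top of a constant background of 2 (transmitted, unmodulated light).
rate = 1000.0
raw = [2.0 + math.sin(2.0 * math.pi * 50.0 * k / rate) for k in range(1000)]
ac = remove_dc(raw)  # background removed, modulated component preserved
```

Because the background transmitted by the micro-lenses is unmodulated, it appears only in the DC term and is removed cleanly, whereas with a reflective modulator the background would share the modulation frequency and could not be separated this way.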

[0703] A further advantage, regarding the construction of the camera 152, may be that a linear setup, as shown e.g. in FIG. 9, is possible, as opposed to the folded setups required when using reflective spatial light modulators. Further, in setups of the optical detector 110 using reflective spatial light modulators, a near-focus image is typically required both on the spatial light modulator and on the optical sensor. This requirement imposes severe constraints on the optical construction and renders the optical design of the optical detector demanding. In setups using spatial light modulators 188 having at least one micro-lens array 226, due to the typically short focal lengths of the micro-lenses 228 used therein and due to the fact that the lenses are typically operated in an oscillating fashion, only a near-focus image on the micro-lens array 226 is necessary. The micro-lenses 228 will then typically refocus the partial image onto the optical sensor 122. Consequently, no additional optical elements between the micro-lens array 226 and the at least one optical sensor 122 are required, even though such additional optical elements may still be present for various purposes.

LIST OF REFERENCE NUMBERS

[0704] 110 Optical detector
[0705] 112 Optical axis
[0706] 114 Object
[0707] 116 Light beam
[0708] 118 Beacon device
[0709] 120 Detector system
[0710] 122 Optical sensor
[0711] 124 Stack
[0712] 126 Sensor region
[0713] 128 Light spot
[0714] 130 Focus-tunable lens
[0715] 132 Beam path
[0716] 134 Focal length modulation
[0717] 136 Focus-modulation device
[0718] 138 Focus-modulating signal
[0719] 140 Evaluation device
[0720] 142 Coordinate system
[0721] 144 Focal length
[0722] 146 Sensor signal
[0723] 148 Maximum
[0724] 150 Object-in-focus-line
[0725] 152 Camera
[0726] 154 Transversal optical sensor
[0727] 156 z-evaluation device
[0728] 158 xy-evaluation device
[0729] 160 3D-evaluation device
[0730] 162 Imaging device
[0731] 164 Imaging evaluation device
[0732] 166 Human-machine device
[0733] 168 Entertainment device
[0734] 170 Tracking system
[0735] 172 Connector
[0736] 174 Housing
[0737] 176 Control element
[0738] 178 User
[0739] 180 Opening
[0740] 182 Direction of view
[0741] 184 Machine
[0742] 186 Track controller
[0743] 188 Spatial light modulator
[0744] 190 Matrix
[0745] 192 Pixel
[0746] 194 Modulator device
[0747] 196 Transfer device
[0748] 198 Demodulation device
[0749] 200 Result of frequency analysis
[0750] 202 Data processing device
[0751] 204 Data memory
[0752] 206 Light spot
[0753] 208 Frequency mixers
[0754] 210 Low pass filter
[0755] 212 Beam-splitting element
[0756] 214 First partial light beam
[0757] 216 First partial beam path
[0758] 218 Second partial light beam
[0759] 220 Second partial beam path
[0760] 222 Additional optical element
[0761] 224 Additional optical element
[0762] 226 Micro-lens array
[0763] 228 Micro-lens
[0764] 230, 230′ Superpixel
[0765] 232, 232′ Pixels assigned to superpixels 230, 230′, respectively
[0766] 234 Remaining pixels