Detector for determining a position of at least one object

11067692 · 2021-07-20

Assignee

Inventors

CPC classification

International classification

Abstract

A method for adjusting a detector (110) for determining a position of at least one object (112) within a range of measurement (114) is disclosed. The detector (110) comprises at least two longitudinal optical sensors (116) and at least one transfer device (118) for imaging the object (112) into an image plane. The transfer device (118) has a focal plane. The transfer device (118) is positioned in between the longitudinal optical sensors (116) and the object (112). Each of the longitudinal optical sensors (116) has at least one sensor region (120). Each of the longitudinal optical sensors (116) is designed to generate at least one longitudinal sensor signal in a manner dependent on an illumination of the respective sensor region (120) by at least one light beam (178) propagating from the object (112) to the detector (110), wherein the longitudinal sensor signal, given the same total power of the illumination, is dependent on a beam cross-section of the light beam (178) in the sensor region (120). The detector (110) further comprises at least one evaluation device (124). The method comprises the following steps: (i) subsequently moving the object (112) longitudinally to at least two different calibration positions (134, 136) having at least two different longitudinal coordinates within the range of measurement (114); (ii) recording, for each of the calibration positions (134, 136), at least one first longitudinal sensor signal generated by a first longitudinal optical sensor (126) and at least one second longitudinal sensor signal generated by a second longitudinal optical sensor (128); (iii) forming, for each of the calibration positions (134, 136), at least one calibration signal using the first and second longitudinal sensor signals; (iv) generating a calibration function using the calibration signals, the calibration function defining a relationship between the longitudinal coordinate of the object (112) and the first and second longitudinal sensor signals.

Claims

1. A method of adjusting a detector, wherein the detector is suitable for determining a position of at least one object within a range of measurement, wherein the detector constitutes a coordinate system in which an optical axis forms a z-axis, a direction parallel or antiparallel to the z-axis being a longitudinal direction, and a coordinate along the z-axis being a longitudinal coordinate, the detector comprising at least two longitudinal optical sensors and at least one transfer device suitable for imaging the at least one object into an image plane, the at least one transfer device having a focal plane and being positioned between the at least two longitudinal optical sensors and the at least one object, each of the at least two longitudinal optical sensors having at least one sensor region and being designed to generate at least one longitudinal sensor signal in a manner dependent on an illumination of the respective sensor region by at least one light beam propagating from the at least one object to the detector, wherein the at least one longitudinal sensor signal, given the same total power of the illumination, is dependent on a beam cross-section of the at least one light beam in the respective sensor region, the detector further comprising at least one evaluation device, the method comprising: (i) subsequently moving the at least one object longitudinally to at least two different calibration positions having at least two different longitudinal coordinates within the range of measurement; (ii) recording, for each of the at least two different calibration positions, at least one first longitudinal sensor signal generated by a first longitudinal optical sensor and at least one second longitudinal sensor signal generated by a second longitudinal optical sensor; (iii) forming, for each of the at least two different calibration positions, at least one calibration signal using the at least one first longitudinal sensor signal and the at least one second 
longitudinal sensor signal; and (iv) generating a calibration function using the at least one calibration signal, the calibration function defining a relationship between a longitudinal coordinate of the at least one object and the at least one first longitudinal sensor signal and the at least one second longitudinal sensor signal; wherein the method further comprises at least one adjustment step for positioning the first longitudinal optical sensor and the second longitudinal optical sensor, the adjustment step comprising the following substeps: a) positioning the object in at least one outermost position within the measurement range, the outermost position having a maximum longitudinal coordinate; b) positioning the first longitudinal optical sensor at a longitudinal coordinate of a focused image plane; c) positioning the object in at least one closest position within the measurement range, the closest position having a minimum longitudinal coordinate; and d) positioning the second longitudinal optical sensor at a longitudinal coordinate of a focused image plane.

2. The method of claim 1, further comprising making at least one measurement, wherein the longitudinal coordinate of the at least one object is determined by using the calibration function.

3. The method of claim 1, wherein the at least one adjustment step is performed before (i).

4. The method of claim 1, further comprising (v) a positioning step for positioning the first longitudinal optical sensor and the second longitudinal optical sensor, the positioning step comprising: A) positioning the at least one object in at least one outermost position within the range of measurement, the at least one outermost position having a maximum longitudinal coordinate, and positioning the first longitudinal optical sensor at a longitudinal position between the at least one transfer device and the focal plane of the at least one transfer device; and B) positioning the second longitudinal optical sensor at a longitudinal coordinate of a focused image plane.

5. The method of claim 4, wherein A) comprises: A1) defining a sensor threshold for the at least one first longitudinal sensor signal; A2) moving the first longitudinal optical sensor towards the focal plane of the at least one transfer device and comparing the at least one first longitudinal sensor signal with the sensor threshold; and A3) positioning the first longitudinal optical sensor at a position at which the at least one first longitudinal sensor signal equals the sensor threshold.

6. The method of claim 4, wherein (v) is performed before (i).

7. A detector, suitable for determining a position of at least one object, wherein the detector constitutes a coordinate system in which an optical axis forms a z-axis, a direction parallel or antiparallel to the z-axis being a longitudinal direction, and a coordinate along the z-axis being a longitudinal coordinate, wherein the detector comprises: at least one transfer device, suitable for imaging the at least one object into an image plane, and having a focal plane; at least two longitudinal optical sensors, wherein each of the at least two longitudinal optical sensors has at least one sensor region and is designed to generate at least one longitudinal sensor signal in a manner dependent on an illumination of the respective sensor region by at least one light beam propagating from the at least one object to the detector, wherein the at least one longitudinal sensor signal, given the same total power of the illumination, is dependent on a beam cross-section of the at least one light beam in the respective sensor region; and at least one evaluation device, wherein the detector is adapted to move the at least one object subsequently to at least two different calibration positions having at least two different longitudinal coordinates within a range of measurement, wherein the at least one evaluation device is adapted to record, for each of the at least two different calibration positions, at least one first longitudinal sensor signal generated by a first longitudinal optical sensor and at least one second longitudinal sensor signal generated by a second longitudinal optical sensor, wherein the at least one evaluation device is adapted to form, for each of the at least two different calibration positions, at least one calibration signal using the at least one first longitudinal sensor signal and the at least one second longitudinal sensor signal, and wherein the at least one evaluation device is designed to generate a calibration function using the at least one 
calibration signal, the calibration function defining a relationship between a longitudinal coordinate of the at least one object and the at least one first longitudinal sensor signal and the at least one second longitudinal sensor signal; wherein the detector is adapted to perform at least one adjustment step for positioning the first longitudinal optical sensor and the second longitudinal optical sensor, the adjustment step comprising the following substeps: a) positioning the object in at least one outermost position within the measurement range, the outermost position having a maximum longitudinal coordinate; b) positioning the first longitudinal optical sensor at a longitudinal coordinate of a focused image plane; c) positioning the object in at least one closest position within the measurement range, the closest position having a minimum longitudinal coordinate; and d) positioning the second longitudinal optical sensor at a longitudinal coordinate of a focused image plane.

8. The detector of claim 7, wherein the at least one evaluation device is designed to generate at least one item of information on a longitudinal position of the at least one object by evaluating at least one longitudinal sensor signal.

9. The detector of claim 7, wherein the detector is adapted to perform a positioning step for positioning the first longitudinal optical sensor and the second longitudinal optical sensor, the positioning step comprising: A) positioning the first longitudinal optical sensor at a longitudinal position between the at least one transfer device and a focal plane of the at least one transfer device; and B) positioning the second longitudinal optical sensor at a longitudinal coordinate of a focused image plane.

10. The detector of claim 9, wherein A) comprises: A1) defining a sensor threshold for the at least one first longitudinal sensor signal; A2) moving the first longitudinal optical sensor toward the focal plane of the at least one transfer device and comparing the at least one first longitudinal sensor signal with the sensor threshold; and A3) positioning the first longitudinal optical sensor at a position at which the at least one first longitudinal sensor signal equals the sensor threshold.

11. The detector of claim 7, wherein at least one of the at least two longitudinal optical sensors is at least partially transparent.

12. The detector of claim 11, wherein the detector comprises at least one imaging device and is adapted to image the at least one object through the at least two longitudinal optical sensors.

13. A detector system, suitable for determining a position of at least one object, wherein the detector system comprises: at least one detector of claim 7, and at least one beacon device adapted to direct at least one light beam towards the at least one detector, wherein the at least one beacon device is at least one of attachable to the at least one object, holdable by the at least one object and integratable into the at least one object.

14. A human-machine interface, suitable for exchanging at least one item of information between a user and a machine, wherein the human-machine interface comprises at least one detector of claim 7, wherein the human-machine interface is designed to generate at least one item of geometrical information of a user with the at least one detector, and wherein the human-machine interface is designed to assign to the at least one item of geometrical information at least one item of information.

15. An entertainment device, suitable for carrying out at least one entertainment function, wherein the entertainment device comprises at least one human-machine interface of claim 14, wherein the entertainment device is designed to enable at least one item of information to be input by a player using the human-machine interface and wherein the entertainment device is designed to vary the at least one entertainment function in accordance with the at least one item of information.

16. A tracking system, comprising at least one detector of claim 7 and being suitable for tracking a position of the at least one object, the tracking system further comprising at least one track controller, wherein the at least one track controller is adapted to track a series of positions of the at least one object at specific points in time.

17. A camera, comprising at least one detector of claim 7 and being suitable for imaging the at least one object.

18. A method of detecting, the method comprising illuminating the detector of claim 7 with the at least one light beam.

Description

BRIEF DESCRIPTION OF THE FIGURES

(1) Further optional details and features of the invention are evident from the description of preferred exemplary embodiments which follows in conjunction with the dependent claims. In this context, the particular features may be implemented alone or in any reasonable combination. The invention is not restricted to the exemplary embodiments. The exemplary embodiments are shown schematically in the figures. Identical reference numerals in the individual figures refer to identical elements or elements with identical function, or elements which correspond to one another with regard to their functions.

(2) In the figures:

(3) FIGS. 1A to 1C show an exemplary embodiment of a method according to the present invention;

(4) FIGS. 2A and 2B show the normalized photocurrent as a function of the distance between the first longitudinal optical sensor and the transfer device (FIG. 2A) and between the second longitudinal optical sensor and the transfer device (FIG. 2B);

(5) FIGS. 3A and 3B show quotient of first longitudinal sensor signal and second longitudinal sensor signal as a function of object distance;

(6) FIGS. 4A to 4C show a further exemplary embodiment of the method according to the present invention;

(7) FIGS. 5A and 5B show first and second longitudinal sensor signals as a function of object distance and quotient of first longitudinal sensor signal and second longitudinal sensor signal as a function of object distance; and

(8) FIG. 6 shows an exemplary embodiment of a detector, detector system, human-machine interface, tracking system and camera.

EXEMPLARY EMBODIMENTS

(9) In FIGS. 1A to 1C, an exemplary embodiment of the method for adjusting a detector 110 for determining a position of at least one object 112 within a range of measurement 114 according to the present invention is shown. The detector 110 comprises at least two longitudinal optical sensors 116 and at least one transfer device 118 for imaging the object 112 into an image plane. The transfer device 118 has a focal plane. The transfer device 118 is positioned in between the longitudinal optical sensors 116 and the object 112. The transfer device 118 may comprise at least one element selected from the group consisting of: a lens, in particular a focusing and/or a defocusing lens; a focusing mirror; a defocusing mirror.

(10) Each of the longitudinal optical sensors 116 has at least one sensor region 120. Each of the longitudinal optical sensors 116 is designed to generate at least one longitudinal sensor signal in a manner dependent on an illumination of the respective sensor region 120 by at least one light beam propagating from the object 112 to the detector 110. The longitudinal sensor signal, given the same total power of the illumination, is dependent on a beam cross-section of the light beam in the sensor region 120. For potential setups of the longitudinal optical sensors, reference may be made to WO 2012/110924 A1 and/or WO2014/097181 A1 and/or WO 2016/005893 A1. Still, other embodiments are feasible. The longitudinal optical sensors 116 may be arranged in a stack.

(11) At least one of the longitudinal optical sensors 116 may be at least partially transparent. Thus, generally, the longitudinal optical sensors 116 may comprise at least one at least partially transparent optical sensor such that the light beam may at least partially pass through the longitudinal optical sensor 116. As an example, the transparent longitudinal optical sensor may have a transparency of at least 10%, preferably at least 20%, at least 40%, at least 50% or at least 70%. In order to provide a sensory effect, the longitudinal optical sensor typically has to provide some sort of interaction between the light beam and the longitudinal optical sensor, which typically results in a loss of transparency. The transparency of the longitudinal optical sensor may be dependent on a wavelength of the light beam, resulting in a spectral profile of a sensitivity, an absorption or a transparency of the longitudinal optical sensor. Preferably, all longitudinal optical sensors of the plurality and/or the stack are transparent.

(12) The transfer device 118 may comprise at least one optical axis 122. The transfer device 118 may be positioned such that light originating from the object 112, firstly, is transferred by the transfer device 118 and subsequently impinges on the longitudinal optical sensors 116. The object 112, longitudinal optical sensors 116 and the transfer device 118 may be arranged on the optical axis 122 such that the transfer device 118 is positioned in between the longitudinal optical sensors 116 and the object 112. However, embodiments are feasible wherein the transfer device 118 and the longitudinal optical sensors 116 are arranged in different beam paths.

(13) The detector 110 further comprises at least one evaluation device 124.

(14) In the embodiment shown in FIGS. 1A to 1C, the method comprises at least one adjustment step for positioning at least one first longitudinal optical sensor 126 and at least one second longitudinal optical sensor 128. The adjustment step may comprise the following substeps:

(15) a) positioning the object 112 in at least one outermost position 130 within the measurement range 114, the outermost position 130 having a maximum longitudinal coordinate;

(16) b) positioning the first longitudinal optical sensor 126 at a longitudinal coordinate of the focused image plane 139;

(17) c) positioning the object 112 in at least one closest position 132 within the measurement range 114, the closest position 132 having a minimum longitudinal coordinate; and

(18) d) positioning the second longitudinal optical sensor 128 at a longitudinal coordinate of the focused image plane 139.

(19) In particular, in step b), the first longitudinal optical sensor 126 may be positioned such that at least one first longitudinal sensor signal generated by the first longitudinal optical sensor 126 is maximized. In particular, in step d), the second longitudinal optical sensor 128 may be positioned such that at least one second longitudinal sensor signal generated by the second longitudinal optical sensor 128 is maximized in the case of a positive FiP effect or minimized in the case of a negative FiP effect.

(20) Substeps a) and b) are shown in FIG. 1A. Substeps c) and d) are depicted in FIG. 1B. The substeps may be performed in the given order or in a different order. Further, two or more or even all of the method steps may be performed simultaneously and/or overlapping in time. Further, one, two or more or even all of the method steps may be performed repeatedly. The method may further comprise additional method steps. The adjustment step may be performed before method step (i).

(21) The first longitudinal sensor signal may exhibit a global maximum for this object distance and luminance at a longitudinal coordinate of the focal plane at which collected light originating from the object 112 in the outermost position 130 is focused by the transfer device 118. For example, the first longitudinal sensor signal may be maximized by, firstly, positioning the first longitudinal optical sensor 126 at an arbitrary distance to the transfer device 118, in particular on a side of the transfer device 118 opposite to the object 112, and, subsequently, by moving the first longitudinal optical sensor 126 stepwise or continuously longitudinally away from or toward the transfer device 118.

(22) The closest position 132 may be defined by the design of the transfer device 118, in particular by its longitudinal extension. The second longitudinal sensor signal may exhibit the maximum for this object distance and radiant power at a longitudinal coordinate of the focal plane at which collected light originating from the object 112 in the closest position 132 is focused by the transfer device 118. For example, the second longitudinal sensor signal may be maximized by, firstly, positioning the second longitudinal optical sensor 128 at an arbitrary distance to the first longitudinal optical sensor 126, in particular on a side of the first longitudinal optical sensor 126 opposite to the transfer device 118 such that the first longitudinal optical sensor 126 is positioned in between the transfer device 118 and the second longitudinal optical sensor 128, and, subsequently, by moving the second longitudinal optical sensor 128 stepwise or continuously longitudinally away from or toward the first longitudinal optical sensor 126.
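Substeps b) and d) above amount to a one-dimensional search for the sensor coordinate at which the longitudinal sensor signal becomes extremal. The following Python sketch illustrates such a stepwise scan; the function names and the peaked signal model are purely hypothetical stand-ins for the hardware readout, not part of the disclosed detector:

```python
def find_extremal_position(read_signal, z_start, z_stop, step, maximize=True):
    """Scan sensor coordinates stepwise and return the one with the extremal signal."""
    best_z, best_s = z_start, read_signal(z_start)
    z = z_start + step
    while z <= z_stop:
        s = read_signal(z)
        if (s > best_s) if maximize else (s < best_s):
            best_z, best_s = z, s
        z += step
    return best_z

# Toy signal model: the response peaks where the image of the object is focused
# (here assumed at z = 50.0 mm behind the transfer device).
def toy_signal(z_sensor, z_focus=50.0):
    return 1.0 / (1.0 + (z_sensor - z_focus) ** 2)

# Substep b): fix the first sensor where its signal is maximal.
z_first = find_extremal_position(toy_signal, 40.0, 60.0, 0.5)   # -> 50.0
```

For a negative FiP effect, the same scan with maximize=False would locate the signal minimum instead, mirroring the behavior described for step d) in paragraph (19).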

(23) The adjusted positions of the first longitudinal optical sensor 126 and the second longitudinal optical sensor 128 may differ. The adjusted position of the first longitudinal optical sensor 126 may be closer to the transfer device 118 than the adjusted position of the second longitudinal optical sensor 128. Adjusting the positions of the first longitudinal optical sensor 126 and the second longitudinal optical sensor 128 using the proposed method ensures that the change of the quotient over the measurement range is maximized. This allows the best resolution for distinguishing different longitudinal coordinates of the object 112.

(24) As shown in FIG. 1C, the method comprises the following steps:

(25) (i) subsequently moving the object 112 longitudinally to at least two different calibration positions 134, 136 having at least two different longitudinal coordinates within the range of measurement 114;

(26) (ii) recording, for each of the calibration positions 134, 136, at least one first longitudinal sensor signal generated by the first longitudinal optical sensor 126 and at least one second longitudinal sensor signal generated by the second longitudinal optical sensor 128;

(27) (iii) forming, for each of the calibration positions 134, 136, at least one calibration signal using the first and second longitudinal sensor signals;

(28) (iv) generating a calibration function using the calibration signals, the calibration function defining a relationship between the longitudinal coordinate of the object 112 and the first and second longitudinal sensor signals.

(29) As outlined above, in method step (i), the object 112 is moved subsequently longitudinally to at least two different calibration positions 134, 136 having at least two different longitudinal coordinates within the range of measurement 114. Preferably, the object 112 may be moved through the entire measurement range 114, in particular with a pre-defined or selected step size. The evaluation device 124 may be adapted to record the first and second longitudinal sensor signals. As outlined above, in method step (iii), for each of the calibration positions 134, 136, at least one calibration signal is formed using the first and second longitudinal sensor signals. The evaluation device 124 may be adapted to form the calibration signals. In particular, at each position of the object 112, one of the first longitudinal sensor signal and the second longitudinal sensor signal may be divided by the other one of the first longitudinal sensor signal and the second longitudinal sensor signal. In particular, for each position of the object 112, a quotient of the first longitudinal sensor signal and the second longitudinal sensor signal may be formed. As outlined above, in method step (iv), a calibration function is generated using the calibration signals. The calibration function defines a relationship between the longitudinal coordinate of the object 112 and the first and second longitudinal sensor signals. In particular, the calibration function refers to the relationship between the calibration signal and the longitudinal coordinate of the object 112. Particularly preferably, the relationship comprises at least one calibration curve, at least one set of calibration curves, at least one function or a combination of the possibilities mentioned. One or a plurality of calibration curves can be stored, for example, in the form of a set of values and the associated function values thereof, for example in a data storage device and/or a table.
Alternatively or additionally, however, the at least one calibration curve can also be stored for example in parameterized form and/or as a functional equation. Various possibilities are conceivable and can also be combined.
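Method steps (i) to (iv) can be sketched in a few lines of Python; the linear signal models, the millimetre values and all names below are illustrative assumptions only, not the actual response of the longitudinal optical sensors:

```python
# Hypothetical sensor responses: the two signals vary oppositely with the
# object distance z_obj (in mm), so their quotient changes monotonically.
def record_signals(z_obj):
    s1 = 1.0 + 0.002 * z_obj   # first longitudinal sensor signal
    s2 = 3.0 - 0.001 * z_obj   # second longitudinal sensor signal
    return s1, s2

# Step (i): move the object to calibration positions across the range.
calibration_positions = range(500, 1600, 100)   # mm

# Steps (ii)-(iv): record both signals at each position, form the quotient as
# the calibration signal, and store the curve as a table of (quotient, z) pairs.
calibration_curve = []
for z in calibration_positions:
    s1, s2 = record_signals(z)
    calibration_curve.append((s1 / s2, float(z)))
```

As described above, the calibration curve could equally be stored in parameterized form, e.g. as fitted coefficients, instead of a value table.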

(30) The method may further comprise at least one measurement step, not shown here. In the measurement step, the longitudinal coordinate of the object 112 and/or another object may be determined within the measurement range. In particular, the longitudinal coordinate of the object 112 may be determined by recording the first sensor signal and the second sensor signal for this position of the object 112 and by forming the combined sensor signal, in particular a quotient. The longitudinal coordinate may be determined by using the calibration function. Preferably, the measurement step may be performed after performing method steps (i) to (iv).
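The measurement step then reduces to inverting a stored calibration curve: the quotient measured for an unknown object position is mapped back to a longitudinal coordinate. A sketch using piecewise linear interpolation on a small hypothetical table (the values only loosely mirror the quotient range reported for FIGS. 3A and 3B):

```python
# Hypothetical calibration table of (quotient, z in mm) pairs.
calibration_table = [(1.0, 500.0), (4.0, 1000.0), (14.0, 1500.0)]

def z_from_quotient(q, table):
    """Invert the calibration curve by piecewise linear interpolation."""
    quotients = [p[0] for p in table]
    coords = [p[1] for p in table]
    if q <= quotients[0]:
        return coords[0]     # clamp below the measurement range
    if q >= quotients[-1]:
        return coords[-1]    # clamp above the measurement range
    for i in range(len(quotients) - 1):
        if quotients[i] <= q <= quotients[i + 1]:
            t = (q - quotients[i]) / (quotients[i + 1] - quotients[i])
            return coords[i] + t * (coords[i + 1] - coords[i])

z_measured = z_from_quotient(2.5, calibration_table)   # -> 750.0 mm
```

Clamping at the table ends reflects that the calibration function is only defined within the range of measurement.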

(31) FIGS. 2A and 2B show experimental results. As object 112, a 530 nm LED having a modulation frequency of 475 Hz was used. As transfer device 118, a Nikkor 50 mm f/1.2 camera lens focused at infinity was used. As first and second longitudinal optical sensors, sDSCs were used. The distance of the object 112 from the transfer device 118 was varied in 0.2 m steps between 0.393 m and 1.593 m. For each object distance, a longitudinal signal curve was recorded by moving the first longitudinal optical sensor 126 and the second longitudinal optical sensor 128 with a step size of 500 μm. FIG. 2A shows an array of curves of the normalized photocurrent I.sub.norm,1 as a function of the distance z.sub.sensor,1 between the first longitudinal optical sensor 126 and the transfer device 118. The curves are normalized to their maxima. The arrow shown in FIG. 2A denotes the increasing distance of the object 112 from the transfer device 118 for the respective longitudinal sensor curves of the array. FIG. 2B shows an array of curves of the normalized photocurrent I.sub.norm,2 as a function of the distance z.sub.sensor,2 between the second longitudinal optical sensor 128 and the transfer device 118. The curves are normalized to their maxima. The arrow shown in FIG. 2B denotes the increasing distance of the object 112 from the transfer device 118 for the respective longitudinal sensor curves of the array.

(32) FIGS. 3A and 3B show experimental results for the quotient of the first longitudinal sensor signal I.sub.1 and the second longitudinal sensor signal I.sub.2 as a function of the object distance z.sub.obj in mm. FIG. 3A shows a range of object distances from 0 to 2000 mm, whereas FIG. 3B shows a zoomed-in detail. Actual measuring points are shown, demonstrating the possibility of sub-mm resolution. A 530 nm LED having a modulation frequency of 375 Hz was used. As transfer device 118, a Nikkor 50 mm f/1.2 camera lens focused at infinity was used. As first and second longitudinal optical sensors, sDSCs were used. The first longitudinal optical sensor 126 was placed at a distance of 38 mm from the transfer device 118 and the second longitudinal optical sensor 128 was placed at a distance of 43 mm from the transfer device 118. The quotient between an object distance of 500 mm and 1500 mm can be used for accurate distance determination. Within this range the quotient changes from ˜1 to ˜14.

(33) FIGS. 4A to 4C show a further exemplary embodiment of the method according to the present invention. In this embodiment, the method further may comprise at least one positioning step for positioning the first longitudinal optical sensor 126 and the second longitudinal optical sensor 128. The positioning step may comprise the following substeps:

(34) A) positioning the object 112 in the at least one outermost position 130 within the measurement range 114, the outermost position 130 having the maximum longitudinal coordinate, and positioning the first longitudinal optical sensor 126 at a longitudinal position in between the transfer device 118 and the focal plane of the transfer device 118;

(35) A1) defining a sensor threshold for the first longitudinal sensor signal;

(36) A2) moving the first longitudinal optical sensor 126 towards the focal plane and comparing the first longitudinal sensor signal with the sensor threshold;

(37) A3) positioning the first longitudinal optical sensor 126 at a position 138 at which the first longitudinal sensor signal equals the sensor threshold; and

(38) B) positioning the second longitudinal optical sensor 128 at a longitudinal coordinate of a focused image plane 139.

(39) The substeps may be performed in the given order or in a different order. Further, two or more or even all of the method steps may be performed simultaneously and/or overlapping in time. Further, one, two or more or even all of the method steps may be performed repeatedly. The method may further comprise additional method steps. Preferably, the positioning step may be performed before method step (i).

(40) Substeps A) to A2) are shown in FIG. 4A. Preferably, the first longitudinal optical sensor 126 may be positioned in between the transfer device 118 and a point or range wherein all normalized signals of the first longitudinal optical sensor intersect. With respect to the point or range of intersection, reference is made to WO 2016/005893 A1. Although the first longitudinal optical sensor 126 may be positioned in an arbitrary position between the transfer device 118 and the intersection point of the normalized longitudinal optical sensor currents, the first longitudinal optical sensor 126 may preferably be placed sufficiently far from the transfer device 118 in order to generate a longitudinal sensor signal distinguishable from a response of a noise-image. The sensor threshold may be defined such that the first longitudinal sensor signal can be used for distance measurements, in particular such that the measurement signal is distinguishable from the noise-image and/or baseline. The first longitudinal optical sensor 126 is positioned at the position 138 at which the first longitudinal sensor signal equals the sensor threshold. However, preferably, the change of the sensor signal may be in a range from 2× to 1000× the noise value, more preferably in a range from 5× to 100× the noise value and most preferably below 100× the noise value. In particular, the first longitudinal optical sensor may be positioned at the position 138 at which the first longitudinal sensor signal equals the sensor threshold within tolerances of ±10%, preferably ±5%, more preferably ±1%. The movement of the first longitudinal optical sensor 126 is depicted by arrow 140.
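Substeps A1) to A3) can be sketched as follows; the step size, tolerance and linear signal model are hypothetical, chosen only to illustrate stopping where the signal equals the threshold within a tolerance as described above:

```python
def position_at_threshold(read_signal, z_start, z_focal, step, threshold, tol=0.05):
    """Move the sensor stepwise toward the focal plane (substep A2) and stop
    where the signal equals the threshold within a relative tolerance (A3)."""
    z = z_start
    while z < z_focal:
        if abs(read_signal(z) - threshold) <= tol * threshold:
            return z
        z += step
    return z_focal   # threshold never reached before the focal plane

# Toy model: the signal grows as the sensor approaches the focal plane.
def toy_signal(z_sensor):
    return 0.1 * z_sensor

# A1) define the threshold; A2)/A3) scan from 30 mm toward a focal plane at 50 mm.
z_first = position_at_threshold(toy_signal, 30.0, 50.0, 0.5, threshold=4.0)
```

The default tol=0.05 corresponds to the ±5% tolerance mentioned in paragraph (40); tightening it to 0.01 would mirror the preferred ±1%.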

(41) Substeps A3) and B) are shown in FIG. 4B. In particular, the second longitudinal optical sensor 128 may be positioned at the focused image plane 139. The object 112 may still be positioned at the outermost position. The longitudinal coordinate of the focused image plane 139 may be different from the longitudinal coordinate corresponding to the focal plane at the focal length f. In particular, the second longitudinal optical sensor 128 may be positioned at the focused image plane, in particular at a position different from the focal plane. In particular, a distance between the transfer device 118 and the focused image plane 139 may be greater than a distance between the transfer device 118 and the longitudinal coordinate corresponding to the focal length f of the transfer device 118. In particular, the longitudinal coordinate corresponding to the focal length f may be in between the transfer device 118 and the focused image plane 139. Preferably, the first longitudinal optical sensor 126 may be positioned in between the transfer device 118 and the point or range of intersection, which coincides with or is very close to f. For example, the first longitudinal optical sensor 126 and the second longitudinal optical sensor 128 may be arranged such that the point or range of intersection is located between the first longitudinal optical sensor 126 and the second longitudinal optical sensor 128. However, the distance from the point or range of intersection to the first longitudinal optical sensor 126 and the distance from the point or range of intersection to the second longitudinal optical sensor 128 may be different.

(42) Furthermore, in the embodiment shown in FIGS. 4A to 4C, the method comprises the following steps, depicted in FIG. 4C:

(43) (i) subsequently moving the object 112 longitudinally to the at least two different calibration positions 134, 136 having at least two different longitudinal coordinates within the range of measurement 114;

(44) (ii) recording, for each of the calibration positions 134, 136, the at least one first longitudinal sensor signal generated by the first longitudinal optical sensor 126 and the at least one second longitudinal sensor signal generated by the second longitudinal optical sensor 128;

(45) (iii) forming, for each of the calibration positions 134, 136, the at least one calibration signal using the first and second longitudinal sensor signals;

(46) (iv) generating the calibration function using the calibration signals, the calibration function defining a relationship between the longitudinal coordinate of the object 112 and the first and second longitudinal sensor signals.
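Steps (iii) and (iv) above can be sketched as follows. The numerical calibration values are invented for illustration; the choice of the quotient as calibration signal follows the experimental example of FIG. 5B, and the interpolation-based calibration function is one possible form among those the disclosure leaves open.

```python
import numpy as np

# Hypothetical calibration data: the object is moved to five calibration
# positions z (step (i)); the first and second longitudinal sensor signals
# I1, I2 are recorded at each position (step (ii)).
z_cal = np.array([0.2, 0.5, 1.0, 1.5, 1.8])              # longitudinal coordinates in m
I1 = np.array([6.0e-8, 9.0e-8, 1.3e-7, 1.7e-7, 2.0e-7])  # photocurrents in A
I2 = np.array([1.2e-7, 1.0e-7, 9.0e-8, 8.5e-8, 8.0e-8])

# Step (iii): form one calibration signal per calibration position,
# here the quotient of the two longitudinal sensor signals.
q_cal = I1 / I2

# Step (iv): generate the calibration function relating the longitudinal
# coordinate of the object to the sensor signals, here by linear
# interpolation between the calibration points.
def calibration_function(q):
    return np.interp(q, q_cal, z_cal)

# The calibration function reproduces the recorded calibration positions.
print(calibration_function(q_cal[2]))
```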

(47) FIGS. 5A and 5B show further experimental results. A 530 nm LED with a modulation frequency of 475 Hz was used. A Nikkor 50 mm f/1.2 camera lens focused at infinity was used as transfer device 118. An sDSC was used as each of the first and second longitudinal optical sensors. The first longitudinal optical sensor 126 was placed at a distance of 33.2 mm from the transfer device 118 and the second longitudinal optical sensor 128 at a distance of 38.2 mm from the transfer device 118. The object distance was varied in steps of 0.01 m. FIG. 5A shows the determined photocurrent I in A of the first longitudinal sensor signal (curve 142) and of the second longitudinal sensor signal (curve 144) as a function of the object distance z.sub.obj in cm. FIG. 5B shows the determined quotient of the first longitudinal sensor signal I.sub.1 and the second longitudinal sensor signal I.sub.2 as a function of the object distance z.sub.obj in cm. Between 0.2 m and 1.80 m the quotient changes between ˜0.5 and ˜2.5. The quotient neither levels off nor changes slope within the measurement range; a monotonic increase of the quotient is observed. Thus, measurement of the object distance within a wide measurement range is possible.
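The observation that the quotient neither levels off nor changes slope matters because strict monotonicity is what makes each quotient value correspond to exactly one object distance. A sketch of such a check; the helper name and the sample values (chosen to mimic the ~0.5 to ~2.5 rise of FIG. 5B) are illustrative assumptions.

```python
def is_strictly_monotonic(values):
    """Return True if the quotient signal is strictly increasing or strictly
    decreasing, so that every quotient value maps to a unique object
    distance within the measurement range."""
    diffs = [b - a for a, b in zip(values, values[1:])]
    return all(d > 0 for d in diffs) or all(d < 0 for d in diffs)

# A quotient rising from ~0.5 to ~2.5 over the measurement range, as in FIG. 5B.
print(is_strictly_monotonic([0.5, 0.9, 1.4, 2.0, 2.5]))  # True

# A curve that levels off or changes slope would make the distance ambiguous.
print(is_strictly_monotonic([0.5, 1.0, 1.0, 0.8]))  # False
```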

(48) FIG. 6 shows an exemplary embodiment of a detector system 142, comprising at least one detector 110. Herein, the detector 110 may be employed as a camera 144, specifically for 3D imaging, which may be adapted for acquiring images and/or image sequences, such as digital video clips. Further, FIG. 6 shows an exemplary embodiment of a human-machine interface 146, which comprises the at least one detector 110 and/or the at least one detector system 142, and, further, an exemplary embodiment of an entertainment device 148 comprising the human-machine interface 146. FIG. 6 further shows an embodiment of a tracking system 150 adapted for tracking a position of at least one object 112, which comprises the detector 110 and/or the detector system 142.

(49) With regard to the detector 110 and to the detector system 142, reference may be made to the full disclosure of this application. Basically, all potential embodiments of the detector 110 may also be embodied in the embodiment shown in FIG. 6. The evaluation device 124 may be connected to each of the at least two longitudinal optical sensors 116, in particular by the signal leads 152. By way of example, the signal leads 152 and/or one or more interfaces, which may be wireless interfaces and/or wire-bound interfaces, may be provided. Further, the signal leads 152 may comprise one or more drivers and/or one or more measurement devices for generating sensor signals and/or for modifying sensor signals.

(50) As described above, the detector 110 may comprise at least two longitudinal optical sensors 116, particularly in combination with one or more transversal optical sensors 154. As an example, one or more at least partially transparent transversal optical sensors 154 may be located on a side of the stack of longitudinal optical sensors 116 facing towards the object 112. Alternatively or additionally, one or more transversal optical sensors 154 may be located on a side of the stack of longitudinal optical sensors 116 facing away from the object 112. In this case, the last of the transversal optical sensors 154 may be intransparent. Thus, in a case in which determining the x- and/or y-coordinate of the object in addition to the z-coordinate is desired, it may be advantageous to employ, in addition to the at least one longitudinal optical sensor 116, at least one transversal optical sensor 154 which may provide at least one transversal sensor signal. For potential embodiments of the transversal optical sensor, reference may be made to WO 2014/097181 A1. The at least one optional transversal optical sensor 154 may further be connected to the evaluation device 124, in particular by the signal leads 152.

(51) Further, the at least one transfer device 118 may be provided. The detector 110 may further comprise the at least one housing 156 which, as an example, may encase one or more of the components 116, 154.

(52) Further, the evaluation device 124 may fully or partially be integrated into the optical sensors 116, 154 and/or into other components of the detector 110. The evaluation device 124 may also be enclosed in the housing 156 and/or in a separate housing. The evaluation device 124 may comprise one or more electronic devices and/or one or more software components for evaluating the sensor signals, which are symbolically denoted by the longitudinal evaluation unit 158 (denoted by “z”) and the transversal evaluation unit 160 (denoted by “xy”). By combining results derived by these evaluation units 158, 160, position information 162, preferably three-dimensional position information, may be generated (denoted by “x, y, z”). An example of a coordinate system is shown with reference number 164.
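The combination of the results of the longitudinal (“z”) and transversal (“xy”) evaluation units into three-dimensional position information 162 can be sketched as follows; the class and function names are hypothetical and serve only to illustrate the data flow.

```python
from dataclasses import dataclass

@dataclass
class PositionInformation:
    """Three-dimensional position information ("x, y, z") obtained by
    combining the transversal ("xy") and longitudinal ("z") results."""
    x: float
    y: float
    z: float

def combine_results(xy, z):
    # xy: output of the transversal evaluation unit (x- and y-coordinates);
    # z: output of the longitudinal evaluation unit (z-coordinate).
    return PositionInformation(x=xy[0], y=xy[1], z=z)

pos = combine_results((0.12, -0.05), 1.30)
print(pos.x, pos.y, pos.z)
```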

(53) Further, the detector 110 and/or the detector system 142 may comprise an imaging device 166 which may be configured in various ways. Thus, as depicted in FIG. 6, the imaging device 166 can, for example, be part of the detector 110 within the detector housing 156. Herein, the imaging device signal may be transmitted by one or more signal leads 152 to the evaluation device 124. Alternatively, the imaging device 166 may be located separately outside the detector housing 156. The imaging device 166 may be fully or partially transparent or intransparent. The imaging device 166 may be or may comprise an organic imaging device or an inorganic imaging device. Preferably, the imaging device 166 may comprise at least one matrix of pixels, wherein the matrix of pixels may particularly be selected from the group consisting of: an inorganic semiconductor sensor device such as a CCD chip and/or a CMOS chip; an organic semiconductor sensor device.

(54) In the exemplary embodiment shown in FIG. 6, the object 112 to be detected, as an example, may be designed as an article of sports equipment and/or may form a control element 168, the position and/or orientation of which may be manipulated by a user 170. Thus, generally, in the embodiment shown in FIG. 6 or in any other embodiment of the detector system 142, the human-machine interface 146, the entertainment device 148 or the tracking system 150, the object 112 itself may be part of the named devices and, specifically, may comprise the at least one control element 168, specifically wherein the at least one control element 168 has one or more beacon devices 172, wherein a position and/or orientation of the control element 168 preferably may be manipulated by the user 170. As an example, the object 112 may be or may comprise one or more of a bat, a racket, a club or any other article of sports equipment and/or fake sports equipment. Other types of objects 112 are possible. Further, the user 170 may be considered as the object 112, the position of which shall be detected. As an example, the user 170 may carry one or more of the beacon devices 172 attached directly or indirectly to his or her body.

(55) The detector 110 may be adapted to determine at least one item of information on a longitudinal position of one or more of the beacon devices 172 and, optionally, at least one item of information regarding a transversal position thereof, and/or at least one other item of information regarding the longitudinal position of the object 112 and, optionally, at least one item of information regarding a transversal position of the object 112. Particularly, the detector 110 may be adapted for identifying colors and/or for imaging the object 112, such as different colors of the object 112, more particularly the colors of the beacon devices 172, which might comprise different colors.

(56) The longitudinal optical sensors 116 may be arranged along the optical axis 122. Specifically, the optical axis 122 may be an axis of symmetry and/or rotation of the setup of the optical sensors 116. The longitudinal optical sensors 116 may be located inside the housing 156. An opening 174 in the housing 156, which may particularly be located concentrically with regard to the optical axis 122, preferably defines a direction of view 176 of the detector 110. The light beam originating from the object 112 is denoted with reference number 178.

(57) The detector 110 may be adapted for determining the position of the at least one object 112. Additionally, the detector 110, specifically an embodiment including the camera 144, may be adapted for acquiring at least one image of the object 112, preferably a 3D-image. As outlined above, the determination of a position of the object 112 and/or a part thereof by using the detector 110 and/or the detector system 142 may be used for providing a human-machine interface 146, in order to provide at least one item of information to a machine 180. In the embodiments schematically depicted in FIG. 6, the machine 180 may be or may comprise at least one computer and/or a computer system comprising a data processing device 182. Other embodiments are feasible. The evaluation device 124 may be a computer and/or may comprise a computer and/or may fully or partially be embodied as a separate device and/or may fully or partially be integrated into the machine 180, particularly the computer. The same holds true for a track controller 184 of the tracking system 150, which may fully or partially form a part of the evaluation device 124 and/or the machine 180.

(58) Similarly, as outlined above, the human-machine interface 146 may form part of the entertainment device 148. Thus, by means of the user 170 functioning as the object 112 and/or by means of the user 170 handling the object 112 and/or the control element 168 functioning as the object 112, the user 170 may input at least one item of information, such as at least one control command, into the machine 180, particularly the computer, thereby varying the entertainment function, such as controlling the course of a computer game.

LIST OF REFERENCE NUMBERS

(59) 110 detector
112 object
114 range of measurement
116 longitudinal optical sensor
118 transfer device
120 sensor region
122 optical axis
124 evaluation device
126 first longitudinal optical sensor
128 second longitudinal optical sensor
130 outermost position
132 closest position
134 calibration position
136 calibration position
138 position
139 focused image plane
140 arrow
142 detector system
144 camera
146 human-machine interface
148 entertainment device
150 tracking system
152 signal leads
154 transversal optical sensor
156 housing
158 longitudinal evaluation unit
160 transversal evaluation unit
162 position information
164 coordinate system
166 imaging device
168 control element
170 user
172 beacon device
174 opening
176 direction of view
178 light beam
180 machine
182 data processing device
184 track controller