Method for detecting a touch-and-hold touch event and corresponding device
09760215 · 2017-09-12
Assignee
Inventors
CPC classification
G06F7/388
PHYSICS
International classification
Abstract
Methods for determining a touch-and-hold touch event on a touch sensitive interaction surface of a touch sensing device are provided and comprise the steps of: a) determining a touch location of the touch event based on vibrations, such as bending waves, propagating through the interaction surface and b) determining whether the touch event comprises a hold touch event based on a sensed airborne signal. A device configured to carry out such methods is also provided.
Claims
1. An electronic device comprising a touch sensitive interaction surface, an acoustic signal sensor for sensing vibrations configured to propagate through the interaction surface, an airborne signal emitter, at least one airborne signal sensor, and a processor configured to: a) determine a touch location of a touch event based on vibrations propagating through the touch sensitive interaction surface; and b) determine whether the touch event comprises a hold touch event based on a sensed airborne signal, wherein the sensed airborne signal is emitted in response to detection of vibrations propagating through the interaction surface and a determination corresponding to an acoustic delay time in response to determining a state of non-motion.
2. The electronic device of claim 1, wherein the processor configured to determine whether the touch event comprises the hold touch event includes the processor configured to: control emission of the sensed airborne signal configured to propagate at least one of above or over the interaction surface; and sense properties of the emitted airborne signal.
3. The electronic device of claim 2, wherein the sensed airborne signal is emitted during step a).
4. The electronic device of claim 1, wherein the sensed airborne signal is an ultrasonic sound wave.
5. The electronic device of claim 1, wherein the processor configured to determine whether the touch event comprises the hold touch event includes the processor being configured to determine presence or absence of motion based on an evolution of the sensed airborne signal.
6. The electronic device of claim 5, wherein the processor configured to determine whether the touch event comprises the hold touch event includes the processor being configured to detect the touch location in step a) and at least one of an absence of motion or an absence of the touch location.
7. The electronic device of claim 1, wherein determining whether the touch event comprises the hold touch event is based on at least one of time of flight, echo delay, or amplitude and/or phase distribution as a function of frequency.
8. The electronic device of claim 1, wherein the airborne signal emitter is configured to emit an ultrasonic wave configured to propagate at least one of above or over the touch sensitive interaction surface.
9. The electronic device of claim 1, wherein the electronic device further comprises at least one of a telephone, a camera, a speaker, and a microphone, and wherein the speaker is configured to emit an airborne signal and the microphone is configured to sense the airborne signal.
10. The electronic device of claim 1 further comprising a first element and a second element configured to form an angle (α), wherein the first element comprises at least one of the airborne signal emitter or the airborne signal sensor for determining the hold touch event on the touch sensitive interaction surface and the second element comprises the touch sensitive interaction surface.
11. The electronic device of claim 10, wherein the airborne signal emitter and the airborne signal sensor are on or in the first element.
12. The electronic device of claim 11, wherein the processor is configured to identify the hold touch event based on an airborne signal reflected off the second element comprising the touch sensitive interaction surface and sensed by the at least one airborne signal sensor provided in or on the first element.
13. The electronic device of claim 10, wherein the first element and the second element are linked to each other by a hinge portion forming the angle (α) between the first element and the second element.
14. The electronic device of claim 10, wherein the processor is configured to identify the angle (α) based on an airborne signal reflected off the second element and sensed by the airborne signal sensor provided at least one of in or on the first element.
15. The electronic device of claim 10, wherein the angle (α) is less than 180°.
16. The electronic device of claim 10, wherein the angle (α) is less than 160°.
17. The electronic device of claim 10, wherein the angle (α) is between 45° and 160°.
18. The electronic device of claim 1 comprising a first element with a first surface and a second element with a second surface, the first and second surface forming an angle (α) with respect to each other, wherein at least one of the airborne signal emitter or the airborne signal sensor is provided on or in the first element, and wherein the processor is configured to determine a position of at least one object relative to the second surface based on sensed airborne signals.
19. The electronic device of claim 18, wherein the processor is configured to determine the position of the at least one object based on at least one of first airborne signals directly reflected off the object or second airborne signals that were reflected off the at least one object and the second surface.
20. The electronic device of claim 18, wherein the processor is configured to determine the position of the object in at least one of a plane of the second surface or a direction perpendicular to the plane of the second surface.
21. The electronic device of claim 18, wherein the second element further comprises the touch sensitive interaction surface and the processor is configured to determine the location of the object relative to the touch sensitive interaction surface in an area outside the touch sensitive interaction surface.
22. The electronic device of claim 10, wherein the first element, the second element, and a hinge portion form a clamshell housing.
23. A method for determining a hold touch event on a touch sensitive interaction surface of a touch sensing device comprising the steps of: a) determining a touch location of a touch event based on vibrations propagating through the interaction surface; and b) determining whether the touch event comprises a hold touch event based on a sensed airborne signal, wherein the sensed airborne signal is emitted by an airborne signal emitter of the touch sensing device and in response to detection of vibrations propagating through the interaction surface and a determination corresponding to an acoustic delay time in response to determining a state of non-motion.
24. The method of claim 23, wherein determining whether the touch event comprises a hold touch event comprises emitting an airborne signal configured to propagate at least one of above or over the interaction surface and sensing properties of the emitted airborne signal.
25. The method of claim 23, wherein determining whether the touch event comprises the hold touch event includes determining presence or absence of motion based on an evolution of the sensed airborne signal.
26. The method of claim 23, wherein determining whether the touch event comprises the hold touch event includes detecting the touch location in step a) and at least one of an absence of motion or an absence of the touch location.
27. A computer program product including computer executable instructions stored thereon for determining a hold touch event on a touch sensitive interaction surface of a touch sensing device, the computer program product comprising at least one non-transitory computer-readable storage medium having computer-executable program code instructions stored therein, the computer-executable program code instructions comprising program code instructions to: a) determine a touch location of a touch event based on vibrations propagating through the interaction surface; and b) determine whether the touch event comprises a hold touch event based on a sensed airborne signal, wherein the sensed airborne signal is emitted by an airborne signal emitter of the touch sensing device and in response to detection of vibrations propagating through the interaction surface and a determination corresponding to an acoustic delay time in response to determining a state of non-motion.
Description
BRIEF DESCRIPTION OF THE SEVERAL VIEWS OF THE DRAWING(S)
(1) Specific embodiments of the invention will be described in detail with respect to the enclosed figures.
DETAILED DESCRIPTION OF THE INVENTION
(16) The user interaction surface 3 can for instance be part of a touch screen, but may belong to other parts of the device 1. The device further comprises an acoustic signal sensing means 5 and an analyzing unit 7. In this embodiment only one acoustic signal sensing means 5 is illustrated, however, more than one acoustic signal sensing means can be part of the device 1.
(17) The acoustic signal sensing means 5 is configured such that, for a touch event during which the user 9 just taps at position 11 on the interaction surface 3, the location of the impact is identified and the action attributed to the location is carried out by the touch sensitive device 1.
(18) The acoustic signal sensing means 5 is a transducer transforming the vibrations of, e.g., a bending wave travelling inside the user interaction surface 3 into electrical signals. According to specific embodiments, the acoustic signal sensing means 5 can be any one of a piezoelectric transducer, a magnetostrictive transducer, an electromagnetic transducer, an acoustic velocimeter, an accelerometer, an optical sensor, a microelectromechanical systems (MEMS) sensor, or the like.
(19) When a user touches the interaction surface 3 with an object such as his hand 10 at location 11 of the interaction surface 3, vibrations such as bending waves are injected in the interaction surface 3 and will be sensed by the acoustic signal sensing means 5. The sensed signal is then transferred to the analyzing unit 7 which is configured to determine the position 11 of the touch event.
(20) The inventive touch sensitive device 1 according to a specific embodiment of the invention is, however, configured to discriminate between a simple touch event like a tap by a finger of the user or a stylus and a touch-and-hold event during which the user keeps his finger or the stylus in contact with the interaction surface at the same location 11. Typically, two different actions are carried out by the touch sensitive device in response to the two types of inputs.
(21) Immediately following a tap at position 11 on the interaction surface 3, the vibrations fade away, so the analysis that was used to determine the location of an impact may not be applied to discriminate between the touch event and the touch-and-hold event.
(22) According to the invention, the touch sensitive device 1 is configured to analyze the properties of an airborne signal to decide whether the touch event is accompanied by a hold event.
(23) In this embodiment, the touch sensitive device 1 comprises an ultrasound signal emitting means 13 and two ultrasound sensing means 15a and 15b. The ultrasound signal emitting means 13 is configured to generate ultrasonic waves 19 that travel over the surface 17 of the touch sensitive device 1. The arrangement of emitters 13 and sensing means 15a and 15b in
(24) The ultrasonic wave 19 travelling over the interaction surface 3 is sensed by the sensing means 15a, 15b and the signal is forwarded to the analyzing unit 7 for further processing.
(25) As the sensing of vibrations travelling inside the interaction surface 3 by the acoustic signal sensing means 5 and the sensing of ultrasonic waves 19 travelling above and/or over the interaction surface 3 by the ultrasound sensing means 15a/15b relate to two different physical phenomena, the inventive device uses two distinct sensing means.
(27) First of all, the presence of the hand 10 and finger 21 can lead to reflections/echoes 23 from the hand 10 and/or finger 21. Furthermore, the propagation of the airborne ultrasonic wave 19 can be blocked by the presence of the finger 21, thereby leading to a shadowing effect 25. As a third mode, diffraction effects may occur, leading to diffracted beams 27.
(30) Thus by analyzing the evolution of the reflected airborne signal 33/33′ using the analyzing unit 7, the system can easily and reliably discriminate between a state of no motion—flat lines like in
(31) According to a specific embodiment of the invention, the decision whether the touch event is accompanied by a hold action is based on identifying these two states in combination with the fact of recognizing touch localization based on the vibrations sensed by the sensing means 5.
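The motion/no-motion decision described in paragraphs (30) and (31) can be sketched as a frame-to-frame comparison of successive snapshots of the sensed airborne signal. The following is a minimal illustration, assuming the signal is available as sampled amplitude frames; the threshold and the synthetic signal model are illustrative, not taken from this document:

```python
import numpy as np

def is_motion(frames, threshold=0.05):
    """Decide motion vs. no motion from successive echo amplitude frames.

    frames: array of shape (n_frames, n_samples) holding successive
    snapshots of the sensed airborne signal. A held finger yields nearly
    identical frames (a flat evolution); a moving finger changes the
    echo pattern from frame to frame.
    """
    frames = np.asarray(frames, dtype=float)
    diffs = np.diff(frames, axis=0)               # frame-to-frame changes
    score = float(np.sqrt(np.mean(diffs ** 2)))   # RMS of the changes
    return score > threshold

rng = np.random.default_rng(0)
# Held finger: a constant echo pattern plus a little sensor noise.
static = np.tile(np.sin(np.linspace(0, 6, 64)), (10, 1))
static += 0.001 * rng.standard_normal(static.shape)
# Moving finger: the echo pattern drifts from frame to frame.
moving = np.array([np.sin(np.linspace(0, 6, 64) + 0.3 * k) for k in range(10)])
```

Here `is_motion(static)` is False and `is_motion(moving)` is True; a real implementation would tune the threshold to the transducer and the ambient noise.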
(33) Here the ultrasound signal emitting means 13 and at least one ultrasound sensing means 15 are illustrated as one ultrasonic transducer. Of course there could also be two distinct devices or more than just one ultrasound sensing means 15 or more than just one ultrasound emitting device 13. Typically the ultrasound signal emitting means 13 and at least one ultrasound sensing means 15 are arranged such that the ultrasonic waves are provided over the entire width (perpendicular to the drawing plane of
(34) The first and second elements 43 and 45 are linked to each other and form an angle α with respect to each other. In this embodiment the first and second element are linked by a hinge portion 47 so that the device 41 can be opened and closed, but according to a variant the angle α could also be fixed. The electronic device 41 can for instance be a clam-shell laptop computer or any other foldable portable device like a mobile phone or a playing console. In the case of such devices, the space around the display in the second element 45 is typically rather restricted, so that the analysis of the properties of the sensed airborne signal to decide whether the touch event is accompanied by a hold event or not can still be implemented without having to enlarge the surface of the second element 45.
(35) The ultrasound emitting means 13 as well as the at least one ultrasound sensing means 15 are arranged in the edge region 48 of the first element 43 which is adjacent the second element 45, more precisely at a predetermined distance from the hinge portion 47.
(36) In this embodiment ultrasonic waves are emitted by the ultrasound emitting means 13 and are reflected or diffracted back to the at least one sensing means 15. In this embodiment, the possible acoustic paths for airborne reflections do however include more than one reflection path. In addition to the direct echo path 49 from the user's hand 10 or finger 21, the ultrasound sensing means 15 in the first element 43 of the electronic device 41 also supports acoustic paths in which the emitted ultrasound wave is reflected off the surface of the second element 45 via reflection path 51 before echoing off the user's hand 10 or finger 21, and again reflects off the second element 45 before being sensed by the ultrasound sensing means 15 (either the same as or different from the emitting transducer) in or on the first element 43. Also of interest are signals from echo paths involving one reflection off the second element 45, such as when ultrasonic waves emitted from the emitting means 13 travel directly to finger 21, are reflected from finger 21 and reflected again off the second element 45 before detection by ultrasound sensing means 15. Similarly, the ultrasonic waves may be reflected off the second element 45 on the way to the finger 21 and then take a direct path to the ultrasound sensing means 15 on the way back from the finger 21.
(37) As illustrated in
(38) Delay times, comparable to the ones illustrated in
(39) The device according to
(40) The angle between the first surface 55 of the first element 43 and the second surface 57 of the second element 45 is not necessarily 90° as shown in
(41) The direct echo path 49 is associated with an echo delay time T_direct and the distance between the finger 21 and the emitting and sensing means 13 and 15 is L_direct. Likewise the reflection path 51 is associated with an echo delay time T_reflection and the distance between the finger 21 and the virtual location 53 is L_reflection. These echo delay times and corresponding distances are related by the following formulas where V is the velocity of ultrasound in air.
L_direct = V · T_direct / 2
L_reflection = V · T_reflection / 2
(42) The echo delay time corresponding to a path with only one reflection off the second element 45, as already mentioned above, is the average of the direct and reflected echo delay times, namely (T_direct + T_reflection)/2, and provides redundant information to improve measurements of T_direct and T_reflection. From measurements of T_direct and T_reflection, the distances L_direct and L_reflection may be determined.
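In code, these delay-to-distance relations are direct to evaluate. A minimal sketch, assuming a room-temperature speed of sound of about 343 m/s (this text does not fix a value):

```python
V_SOUND = 343.0  # assumed speed of ultrasound in air, m/s

def echo_distance(delay_s, v=V_SOUND):
    # A round-trip echo covers the path twice, hence the division by 2.
    return v * delay_s / 2.0

def single_bounce_delay(t_direct, t_reflection):
    # Delay of the path with one reflection off the second element:
    # the average of the direct and fully reflected echo delay times.
    return (t_direct + t_reflection) / 2.0
```

For example, a 1 ms direct echo delay corresponds to L_direct of roughly 0.17 m.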
(43) Assuming for the moment that the finger 21, the emitting and sensing means 13 and 15, and the virtual location 53 are all in the same X/Y plane, the (x,y) coordinates of the finger 21 may be determined from the intersection of a circle 58 of radius L_direct centered on the emitting and sensing means 13 and 15 and of a second circle 59 of radius L_reflection centered on the virtual location 53. In terms of algebraic formulas, this corresponds to finding the coordinates (x,y) that solve the following two simultaneous equations.
L_direct² = (x + a)² + y²
L_reflection² = (x + a·cos(2α))² + (y − a·sin(2α))²
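These two simultaneous equations describe two circles, so they can be solved with a standard two-circle intersection. A sketch, assuming the real transducer sits at (−a, 0) and the virtual mirror-image location 53 at (−a·cos 2α, a·sin 2α), consistent with the equations above; up to two candidate positions result:

```python
import math

def circle_intersections(c1, r1, c2, r2):
    """Intersection points of two circles (assumes they do intersect)."""
    dx, dy = c2[0] - c1[0], c2[1] - c1[1]
    d = math.hypot(dx, dy)
    # Distance from c1 to the midpoint of the intersection chord.
    m = (r1**2 - r2**2 + d**2) / (2 * d)
    h = math.sqrt(max(r1**2 - m**2, 0.0))  # half the chord length
    xm, ym = c1[0] + m * dx / d, c1[1] + m * dy / d
    return [(xm + h * dy / d, ym - h * dx / d),
            (xm - h * dy / d, ym + h * dx / d)]

def finger_position_2d(L_direct, L_reflection, a, alpha):
    c_real = (-a, 0.0)                                               # transducer
    c_virtual = (-a * math.cos(2 * alpha), a * math.sin(2 * alpha))  # location 53
    return circle_intersections(c_real, L_direct, c_virtual, L_reflection)
```

The ambiguity between the two intersection points would be resolved by the device geometry, since the finger lies on the known side of the surface.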
(44) As shown in
(45) While the measurement of two quantities such as T_direct and T_reflection is sufficient to determine the position of a point in two dimensions, it is not sufficient to determine a point in three dimensions. Continuing to consider the case that the means for emitting 13 and the means for sensing 15 are either co-located or one and the same, but no longer assuming the finger 21 to be in the same X/Y plane as the virtual location 53 and the emitting and sensing means 13 and 15, the circles in
L_direct² = (x + a)² + y² + z²
L_reflection² = (x + a·cos(2α))² + (y − a·sin(2α))² + z²
(46) Additional emitting means 13 and/or additional sensing means 15 may provide the additional measurements that are needed to determine the three-dimensional coordinates of finger 21.
(47) The device 61, which can be a laptop, comprises a first co-located emitting and sensing means 13 and 15 to the right of a keyboard 63 in the first element 43 and a second co-located emitting and sensing means 65, 67 to the left of the keyboard 63 in the first element 43. Thus, compared to the situation described above, there will be direct and reflected paths (reflected off the second element 45) to both the first and second co-located emitting and sensing means 13, 15 and 65, 67, which provides four equations with three unknowns (x,y,z). This is sufficient to determine the position of finger 21 in three dimensions and also provides some redundant information which could be used for noise rejection.
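The resulting overdetermined sphere system can be linearized by subtracting one equation from the others and then solved in the least-squares sense, which also exploits the redundant measurement for noise rejection. A sketch with illustrative names:

```python
import numpy as np

def trilaterate(centers, dists):
    """Least-squares position from four or more sphere equations.

    centers: (n, 3) positions of real and virtual (mirror-image)
    emitting/sensing means; dists: (n,) measured echo distances.
    Subtracting the first sphere equation from the rest removes the
    quadratic terms, leaving a linear system in (x, y, z).
    """
    c = np.asarray(centers, float)
    d = np.asarray(dists, float)
    A = 2.0 * (c[1:] - c[0])
    b = (d[0]**2 - d[1:]**2) + np.sum(c[1:]**2, axis=1) - np.sum(c[0]**2)
    sol, *_ = np.linalg.lstsq(A, b, rcond=None)
    return sol
```

With four centers (two real transducers and their two virtual images) this yields the (x, y, z) coordinates described above.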
(48) Without echo signals involving reflections off the second element 45, and hence no measurement of T_reflection and determination of L_reflection, a first and second co-located emitting and sensing means would provide only two equations for three unknowns (x,y,z) and would be incapable of determining the (x,y,z) coordinates of finger 21. Hence the reflections off the second element 45 are not considered undesired background but are used as essential information with which to determine the finger 21 position in up to three dimensions.
(49) While it is an option for emitting means 13 and sensing means 15 to be co-located, this is not a requirement. The underlying principles for the case of separated emitting means 13 and sensing means 15 essentially remain the same and the equations as described above can be adapted accordingly.
(50) Additional signals involving reflections off the second element 45 provide useful additional information from which to determine the positions of one or even more objects, e.g. one or more fingers 21 of a user's hand. This includes, in the scenario discussed above of left and right co-located emitting and sensing means, the case in which the left sensing means 67 detects signals resulting from the right emitting means 13 and vice versa. In addition, more than two pairs of emitting and sensing means can be used to resolve more complex interaction schemes. Thus more complex interaction schemes like multi-touch and/or dragging gestures can be identified by the device 61 according to the invention. To be able to discriminate between signals emitted from the left and right emitting means, the left and right emitted signals can have a different frequency spectrum, different timing, or the like.
(51) The above analysis assumes that the angle α between the first element 43 and the second element 45 is known or can be determined. One option to determine the angle α is that a means for measuring the angle α is incorporated into the device 41 or 61, in particular in the hinge 47. The measured value is then communicated to an appropriate processing means, e.g. a microprocessor, to determine the position of the object 21. Another possibility is that in the absence of a touch or finger 21, the sensing means 15 is able to detect echoes of ultrasonic waves from the emitting means 13 off the second element 45, and from such signals the processing means determines α. This can for instance be achieved by comparing live no-touch signals with a database of no-touch signals for a range of values of the angle α. According to a further possibility, advantage is taken of the plurality of sensed signals. In cases such as discussed above, when the number of measurements used to determine a finger 21 position (x,y,z) exceeds three, the parameter α in the above equations can be considered not as a predetermined constant but as a fourth unknown to be determined along with (x,y,z) from the set of simultaneous equations that need to be satisfied. As the hinge angle parameter α is likely to vary much less often than the finger 21 coordinates (x,y,z), the measurement of α does not need to be fully real-time and can be regarded as something to initially calibrate and periodically check and recalibrate, perhaps using statistical methods to improve precision.
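The database-comparison option for determining α can be sketched as a nearest-signature lookup over candidate hinge angles. The signature model below is a made-up stand-in for real pre-recorded no-touch echo signals, and all names are illustrative:

```python
import numpy as np

def estimate_hinge_angle(live_signal, reference_db):
    """Return the candidate angle whose pre-recorded no-touch echo
    signature best matches the live no-touch signal (smallest MSE)."""
    live = np.asarray(live_signal, float)
    best_angle, best_err = None, float("inf")
    for angle, ref in reference_db.items():
        err = float(np.mean((live - np.asarray(ref, float)) ** 2))
        if err < best_err:
            best_angle, best_err = angle, err
    return best_angle

def signature(angle_deg, n=32):
    # Synthetic no-touch echo signature that varies with the hinge angle.
    t = np.linspace(0.0, 1.0, n)
    return np.sin(2 * np.pi * t * (1.0 + angle_deg / 90.0))

db = {a: signature(a) for a in (90, 110, 130, 150)}
```

In practice the database would hold measured signals, and statistical averaging over repeated estimates could refine α, as noted above.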
(52) Knowing the x, y, z coordinates of the object 10, 21 and the angle α, it is furthermore possible to determine the position of the object 10, 21 relative to any position on the second element 45. In particular, it becomes possible to determine a projection of the object 10, 21 onto the second element 45 and the coordinate system y′-z attached to the second element 45. This is illustrated by reference numeral 71 in
(53) The device according to the second and third embodiment is typically used for opening angles α which are less than 180°, in particular less than 160°, further preferably in a range between 45° and 160°.
(54) The airborne signals in the electronic device 41 according to the third embodiment and its variant 61 are used to replace a touch based user interaction surface by a non touch based user interaction. As mentioned above, the position of the object 10, 21 can be determined in three dimensions within the coordinate system x-y-z and/or as a projection onto the second element in a coordinate system y′-z. It is furthermore also possible to combine the non touch based interaction based on airborne signals with a touch based user interaction means on the second element 45. This touch based user interaction means could e.g. correspond to the touch based user interaction surface 3 as described in the second embodiment or any other touch based user interaction means, e.g. based on a capacitive, an inductive or an acoustic technology. In this case a user interaction with the device based on up to five dimensions (three non touch based and two touch based) can be realized.
(56) During step S1, the acoustic signal sensing means 5 senses signals corresponding to vibrations such as bending waves propagating inside the interaction surface 3. In many applications the interaction surface 3 takes the form of a more or less uniform plate for which propagating vibrations take the form of A_0-order Lamb waves, commonly referred to as bending waves. In the context of this document the term “bending waves” is to be interpreted generally as vibration propagation from the touch point to acoustic signal sensing means 5 even if the interaction surface deviates in geometry and structure from a uniform plate.
(57) The sensed signal is forwarded to the analyzing unit 7. Based on the properties of the sensed signal, the analyzing unit 7 determines whether a user 9 has touched the interaction surface 3 at a certain location, here touch location 11, and, if so, may output the coordinates of the touch location to a further processing unit. If no touch event is detected, step S1 is repeated.
(58) Upon detection of a touch interaction by the user 9 with the touch sensitive surface 3, the analyzing unit 7 instructs the ultrasound signal emitting means 13 to emit an airborne ultrasonic wave 19 above and/or over the surface 17 of the device 1 (step S3).
(59) The ultrasound sensing means 15a/15b capture the emitted airborne ultrasonic wave having travelled above and/or over the surface 17 and forward the sensed signal to the analyzing unit 7. Based on the properties of the sensed airborne ultrasonic signal, as illustrated in
(60) If a movement of the user's finger or stylus has been identified, the process proceeds with step S5, during which the analyzing unit 7 checks whether, based on the signal sensed by the acoustic signal sensing means 5, a new touch location on the interaction surface 3 can be identified. If a new touch location can be identified (step S6), the analyzing unit 7 will identify a drag over the interaction surface and will provide the corresponding output to the further processing means. If no new touch location can be identified (step S7), the analyzing unit 7 can identify that the interaction between the user 9 and the device 1 relates to a simple tap, thus a touch event without hold.
(61) If during step S4 the analysis of the airborne ultrasonic signal leads to the decision that no movement occurred by the user 9 after the touch localization in step S2, the analyzing unit 7 determines that the interaction relates to a touch-and-hold event (step S8) and provides the corresponding output to the further processing unit(s) of the electronic device 1.
(62) During step S9, the analyzing unit continues to check whether a motion of the user 9 can be identified based on the airborne signal properties as illustrated in
(63) This may relate to a lift-off of the user's finger 21 from the interaction surface. The lift-off event can, in addition or as an alternative, also be identified using the signal sensed by the acoustic signal sensing means 5, as the lift-off action may also lead to the formation of vibrations such as a bending wave travelling inside the interaction surface 3. If a lift-off is detected in step S11, the user interaction is terminated (S12).
(64) If after the detection of motion in step S9, indicating the end of a touch-and-hold event, no lift-off is detected but a new bending wave is detected, the process restarts with step S1. A new touch location can be identified by the analyzing unit 7 based on the signal sensed by the acoustic signal sensing means 5. If the touch location is different from the one identified at the beginning of the interaction event, the analyzing unit 7 may decide that directly after the touch-and-hold event a drag event takes place, during which the user 9 first touches the interaction surface for a longer time and then keeps touching the interaction surface but starts moving over the interaction surface 3 to a different location.
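The decision flow of steps S1 through S8 can be condensed into a small classifier. A sketch with illustrative names and inputs; in the real device these flags would be derived from the vibration and airborne measurements:

```python
def classify_interaction(touch_detected, motion_after_touch, new_touch_location):
    """Condensed decision logic of steps S1-S8:
    - no localized touch (S1/S2): nothing to report;
    - touch, then no motion in the airborne signal (S4): touch-and-hold (S8);
    - touch, motion, and a new localized touch (S5/S6): drag;
    - touch, motion, but no new touch location (S7): simple tap.
    """
    if not touch_detected:
        return "none"
    if not motion_after_touch:
        return "touch-and-hold"
    return "drag" if new_touch_location is not None else "tap"
```

For example, a detected touch followed by an airborne signal showing no motion is classified as "touch-and-hold".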
(65) Instead of looking at the time dependency, it is also possible to analyze the sensed airborne signal in the frequency domain, as illustrated in
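Analyzing the sensed airborne signal in the frequency domain amounts to extracting its amplitude and phase distribution as a function of frequency, for instance with an FFT. A minimal sketch, with an illustrative test tone:

```python
import numpy as np

def spectral_signature(signal, sample_rate):
    """Amplitude and phase of the sensed airborne signal per frequency.

    A finger resting on the interaction surface perturbs this pattern
    (echoes, shadowing, diffraction), so comparing signatures over time
    or against references can reveal a hold.
    """
    x = np.asarray(signal, float)
    spectrum = np.fft.rfft(x)
    freqs = np.fft.rfftfreq(x.size, d=1.0 / sample_rate)
    return freqs, np.abs(spectrum), np.angle(spectrum)

# A pure 1 kHz tone sampled at 8 kHz peaks at the 1 kHz bin.
tone = np.sin(2 * np.pi * 1000 * np.arange(8000) / 8000)
```

A real device would use ultrasonic frequencies; the audible tone here only keeps the example short.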
(66) According to a specific embodiment,
(69) These patterns could also be linked to a particular position of the object on the interaction surface 3. Therefore, a localization of the user's hand 10 or finger (or stylus held in his hand) 21 on the interaction surface 3 could be obtained by comparing the obtained pattern with a set of pre-recorded patterns at known locations. By comparing this information with the localization determined based on the vibrations travelling inside the interaction surface 3, it becomes possible to check that the hold signal is indeed based on the user's hand or finger which led to the detection of the touch.
(70) As in the case described with respect to
(71) With the inventive device and the inventive methods according to specific embodiments and their variants, touch-and-hold touch events can be reliably discriminated from simple touch events without a continued interaction.
(72) Unlike in the prior art, this is achieved not only using acoustic signals travelling inside the user interaction surface 3 but also using airborne signals travelling above and/or over the user interaction surface 3. By using airborne signals, the method to detect hold events becomes independent of the materials used, their distribution inside the device and their geometry.
(73) Thus, compared to the prior art, a further way of detecting touch-and-hold is provided by the invention, which can be integrated into the electronic device using hardware, such as emitters and sensors, that might already be present, e.g. in telephones and/or cameras.
(75) Step S21 consists of sensing airborne ultrasonic signals using the one or more sensing means 15 (or 67) provided in the first element of device 41 (or 61). The sensed signals relate to reflected signals originally emitted from the one or more emitting means 13 (or 65) and reflected off the object 10, 21 above the first element 43.
(76) Subsequently, during step S22, the sensed airborne ultrasonic signals are analyzed to identify signal contributions that relate to reflected signals that were directly reflected from the object 10, 21; these kinds of signals carry the reference number 49 in
(77) Based on the various types of sensed signals identified during step S22, the coordinates of object 10, 21 can then be determined in step S23, based e.g. on the equations established above. Since the localization determination based on the reflected signals of type 51, leading to the virtual sensing and emitting means 53, depends on the value of the angle α, the method further either uses the value of the angle α, known from a means for measuring the angle α or from previous measurements in the absence of an object, or determines the angle α from the sensed signals in case more equations than unknown parameters can be established, as explained in detail above.
(78) The coordinates x, y and z, together with the angular value α, can then be used to input instructions corresponding to the position of the object relative to the second element 45.
(79) Depending on the number of measured signals, it is furthermore possible to identify the position of not only one object relative to the second element 45 but also of more than one. In addition, the change of position as a function of time can also be determined. Thereby it becomes possible to identify more complex interaction patterns, like multiple simultaneous non touching gestures.
(80) According to a variant, illustrated in step S24, the method may furthermore comprise a step of determining a projection of the object 10, 21 onto the second element 45. This projection, illustrated by reference numeral 71, provides the position of the object in the coordinate system y′-z of the second element and can be used just like a touch based user interaction on the second element 45.
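The projection of step S24 can be sketched as follows, under an assumed geometry that this text does not spell out: the hinge lies along the z-axis, the first element spans the x-z plane, and the second element spans the plane rotated by α from it. All names are illustrative:

```python
import math

def project_onto_second_element(p, alpha):
    """Project object coordinates p = (x, y, z), given in the first
    element's frame, onto the second element's y'-z frame.

    Returns ((y', z), height): the in-plane position on the second
    element and the signed perpendicular distance from its surface.
    """
    x, y, z = p
    d = (math.cos(alpha), math.sin(alpha))    # in-plane direction away from the hinge
    n = (-math.sin(alpha), math.cos(alpha))   # unit normal of the second element
    y_prime = x * d[0] + y * d[1]
    height = x * n[0] + y * n[1]
    return (y_prime, z), height
```

For α = 90°, an object at (0, 0.5, 0.1) in the first element's frame projects to (y′, z) = (0.5, 0.1) with zero height, i.e. onto the second surface itself.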
(81) With the inventive device and the inventive methods according to specific embodiments and their variants, the position of an object relative to a device can be determined in up to three dimensions. According to the invention advantage is taken from reflected signals reflected off the second element which can be attributed to additional “virtual” signal emitting means.
(82) The features of various embodiments and their variants can be freely combined individually or in combination to obtain further realizations of the invention.