DEVICE COMPRISING TOUCHSCREEN AND CAMERA
20170228128 · 2017-08-10
Assignee
Inventors
CPC classification
G06F3/0425 (Physics)
G06F3/0488 (Physics)
G06F2203/04106 (Physics)
International classification
Abstract
A device comprising a touchscreen and a camera for imaging a reflection of the touchscreen by a cornea of a user operating the device is provided. The device is configured for displaying a user-interface element on the touchscreen and detecting an interaction by a finger of a hand with the user-interface element. The device is further configured for, in response to detecting the interaction, acquiring an image of the reflection of the touchscreen from the camera, determining which finger of the hand is used for interacting with the user-interface element, and performing an action dependent on the finger used for the interaction. By also assigning a meaning to the finger which is being used for interacting with touchscreen-based devices, such that different actions are performed dependent on the finger used for the interaction, embodiments of the invention support simpler, faster, and more intuitive, user interaction.
Claims
1. A device comprising: a touchscreen; and a camera configured for imaging a reflection of the touchscreen by a cornea of a user operating the device, the device being configured for: displaying a user-interface element on the touchscreen; detecting an interaction by a finger of a hand with the user-interface element; and in response to detecting the interaction by the finger with the user-interface element: acquiring an image of the reflection of the touchscreen from the camera; determining, by analyzing the image, which finger of the hand is used for interacting with the user-interface element; and performing an action dependent on the finger used for interacting with the user-interface element.
2. The device according to claim 1, being configured for detecting an interaction by the finger with the user-interface element by detecting that the finger touches or is about to touch a surface area of the touchscreen associated with the user-interface element.
3. The device according to claim 2, being configured for, in response to detecting that the finger is about to touch the surface area of the touchscreen associated with the user-interface element, modifying the displayed user-interface element or displaying a further user-interface element.
4. The device according to claim 1, being configured for performing an action dependent on a finger used for interacting with the user-interface element by: performing a copy action if a first finger of the hand is used for interacting with the user-interface element; and/or performing a paste action if a second finger of the hand is used for interacting with the user-interface element.
5. The device according to claim 1, wherein the user-interface element is a virtual button and the device is further configured for: displaying a user-interface element on the touchscreen by displaying a virtual keyboard comprising a plurality of virtual buttons; and performing an action dependent on a finger used for interacting with the virtual button by entering a character associated with the virtual button, wherein a plurality of characters is associated with each virtual button, each character being associated with a respective finger of the hand, the device being configured for performing an action dependent on a finger used for interacting with the virtual button by entering the character associated with the virtual button and the finger used for interacting with the virtual button.
6.-8. (canceled)
9. The device according to claim 5, wherein: a lower case letter is associated with a first finger of the hand; and/or an upper case letter is associated with a second finger of the hand; and/or a number is associated with a third finger of the hand.
10. The device according to claim 1, being configured for performing an action dependent on a finger used for interacting with the user-interface element by: performing a left-click type of action if a first finger of the hand is used for interacting with the user-interface element; and/or performing a right-click type of action if a second finger of the hand is used for interacting with the user-interface element.
11. The device according to claim 10, wherein the right-click type of action is opening a contextual menu associated with the user-interface element.
12. (canceled)
13. The device according to claim 1, wherein the device is any one of a display, a mobile terminal, or a tablet.
14. A method of a device comprising: a touchscreen; and a camera configured for imaging a reflection of the touchscreen by a cornea of a user operating the device, the method comprising: displaying a user-interface element on the touchscreen; detecting an interaction by a finger of a hand with the user-interface element; and in response to detecting the interaction by the finger with the user-interface element: acquiring an image of the reflection of the touchscreen from the camera; determining, by analyzing the image, which finger of the hand is used for interacting with the user-interface element; and performing an action dependent on the finger used for interacting with the user-interface element.
15. The method according to claim 14, wherein the detecting an interaction by a finger with the user-interface element comprises detecting that the finger touches or is about to touch a surface area of the touchscreen associated with the user-interface element.
16. The method according to claim 15, wherein the performing an action dependent on the finger used for interacting with the user-interface element comprises, in response to detecting that the finger is about to touch the surface area of the touchscreen associated with the user-interface element, modifying the displayed user-interface element or displaying a further user-interface element.
17. The method according to claim 14, wherein the performing an action dependent on the finger used for interacting with the user-interface element comprises: performing a copy action if a first finger of the hand is used for interacting with the user-interface element; and/or performing a paste action if a second finger of the hand is used for interacting with the user-interface element.
18. The method according to claim 14, wherein the user-interface element is a virtual button; wherein the displaying a virtual button on the touchscreen comprises displaying a virtual keyboard comprising a plurality of virtual buttons; wherein the performing an action dependent on a finger used for interacting with the virtual button comprises entering a character associated with the virtual button; and wherein a plurality of characters is associated with each virtual button, each character being associated with a respective finger of the hand, and the performing an action dependent on a finger used for interacting with the virtual button comprises entering the character associated with the virtual button and the finger used for interacting with the virtual button.
19.-21. (canceled)
22. The method according to claim 14, wherein: a lower case letter is associated with a first finger of the hand; and/or an upper case letter is associated with a second finger of the hand; and/or a number is associated with a third finger of the hand.
23. The method according to claim 14, wherein the performing an action dependent on the finger used for interacting with the user-interface element comprises: performing a left-click type of action if a first finger of the hand is used for interacting with the user-interface element; and/or performing a right-click type of action if a second finger of the hand is used for interacting with the user-interface element.
24. The method according to claim 23, wherein the right-click type of action is opening a contextual menu associated with the user-interface element.
25. The method according to claim 14, wherein the camera is a front-facing camera.
26. The method according to claim 14, wherein the device is any one of a display, a mobile terminal, or a tablet.
27. A computer program comprising computer-executable instructions for causing the device to perform the method according to claim 14, when the computer-executable instructions are executed on a processing unit comprised in the device.
28. A computer program product comprising a non-transitory computer-readable storage medium, the computer-readable storage medium having a computer program comprising computer-executable instructions for causing the device to perform the method according to claim 14, when the computer-executable instructions are executed on a processing unit comprised in the device.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0023] The above, as well as additional objects, features and advantages of the invention, will be better understood through the following illustrative and non-limiting detailed description of embodiments of the invention, with reference to the appended drawings, in which:
[0031] All the figures are schematic, not necessarily to scale, and generally only show parts which are necessary in order to elucidate the invention, wherein other parts may be omitted or merely suggested.
DETAILED DESCRIPTION
[0032] The invention will now be described more fully hereinafter with reference to the accompanying drawings, in which certain embodiments of the invention are shown. This invention may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein. Rather, these embodiments are provided by way of example so that this disclosure will be thorough and complete, and will fully convey the scope of the invention to those skilled in the art.
[0033] In
[0034] Touchscreen 110 is configured for displaying a user-interface element 111, i.e., a graphical object such as a (virtual) button, text, a field for entering text, a picture, an icon, a URL, or the like. Device 100 is configured for, e.g., by virtue of touchscreen 110, detecting an interaction by a finger 151 of a hand 150, in
[0035] Camera 120 has a field of view which is directed in the same direction as the viewing direction of touchscreen 110. Camera 120 and touchscreen 110 are typically provided on the same face of device 100, i.e., camera 120 is a front-facing camera 120. Optionally, device 100 may comprise multiple front-facing cameras and also a rear-facing camera. Camera 120 is configured for imaging a reflection 163 of touchscreen 110 by a cornea 162 of an eye 160 of user 130 operating device 100, as is illustrated in
[0036] It will be appreciated that reflection 163 may optionally arise from a contact lens placed on the surface of eye 160, or even from eyeglasses or spectacles worn in front of eye 160 (not shown in
[0037] Even though device 100 is in
[0038] Device 100 is configured for, in response to detecting the interaction by finger 151 with user-interface element 111, acquiring an image of reflection 163 of touchscreen 110 from camera 120. The interaction by finger 151 with touchscreen 110, i.e., finger 151 touching a surface of touchscreen 110, is detected by touchscreen 110 together with a location of the interaction. Different types of touchscreens are known in the art, e.g., resistive and capacitive touchscreens. The location of the interaction, i.e., the location where finger 151 touches touchscreen 110, is used to determine which of one or more displayed user-interface elements finger 151 interacts with. This may, e.g., be achieved by associating a surface area of touchscreen 110 with each displayed user-interface element, such as the area defined by a border of a virtual button or picture, or a rectangular area coinciding with a text field or URL link. If the location of the detected touch is within a surface area associated with a user-interface element, it is inferred that the associated user-interface element is touched.
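The hit-testing described above, i.e., mapping a detected touch location to the displayed user-interface element whose associated surface area contains it, can be sketched as follows. This is a minimal illustrative sketch, not the patent's implementation; all class and function names are assumptions.

```python
# Hypothetical sketch of the hit-testing step: each displayed user-interface
# element is associated with a rectangular surface area of the touchscreen
# (e.g., the area defined by a virtual button's border), and a detected touch
# location is mapped to the element whose area contains it.

from dataclasses import dataclass
from typing import Optional

@dataclass
class UIElement:
    name: str
    x: int        # left edge of the associated surface area, in pixels
    y: int        # top edge
    width: int
    height: int

    def contains(self, touch_x: int, touch_y: int) -> bool:
        # The touch lies inside the element's surface area if it falls
        # within the rectangle defined by the element's border.
        return (self.x <= touch_x < self.x + self.width and
                self.y <= touch_y < self.y + self.height)

def hit_test(elements: list[UIElement], touch_x: int, touch_y: int) -> Optional[UIElement]:
    # Return the first displayed element whose surface area contains the
    # touch, or None if the touch landed outside every element.
    for element in elements:
        if element.contains(touch_x, touch_y):
            return element
    return None
```

If the touch location lies within no element's surface area, no interaction with a user-interface element is inferred.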
[0039] Acquiring an image of reflection 163 of touchscreen 110 from camera 120 may, e.g., be accomplished by requesting camera 120 to capture an image, i.e., a still image. Alternatively, camera 120 may continuously capture images, i.e., video footage, while finger 151 is touching touchscreen 110, e.g., because user 130 is involved in a video call. In this case, device 100 may be configured for selecting, from a sequence of images received from camera 120, an image which has captured the interaction. Device 100 is further configured for determining which finger 151 of hand 150 is used for interacting with user-interface element 111. This is achieved by analyzing the acquired image, i.e., by image processing, as is known in the art. Typically, a number of biometric points related to the geometry of the human hand are used to perform measurements and identify one or more fingers and optionally other parts of hand 150. Device 100 is further configured for, subsequent to determining which finger 151 is used for touching user-interface element 111, performing an action which is dependent on finger 151 used for interacting with user-interface element 111. Different actions are performed for the different fingers of hand 150, as is described further below.
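When the camera captures video footage continuously, the device selects from the sequence an image that has captured the interaction. One plausible criterion, sketched below under the assumption that each frame carries a capture timestamp, is to pick the frame closest in time to the touch event reported by the touchscreen; the patent only states that a selection is performed, not how.

```python
# Illustrative sketch of selecting, from a sequence of continuously captured
# images (e.g., during a video call), the image that best captures the moment
# of the interaction. Frame and timestamp handling are assumptions.

from dataclasses import dataclass

@dataclass
class Frame:
    timestamp: float   # capture time in seconds
    pixels: bytes      # raw image data (placeholder)

def select_interaction_frame(frames: list[Frame], touch_time: float) -> Frame:
    # Pick the captured frame whose timestamp is closest to the instant
    # the touchscreen reported the touch event.
    return min(frames, key=lambda f: abs(f.timestamp - touch_time))
```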
[0040] In the following, the determining which finger 151 of hand 150 is used for interacting with user-interface element 111 is described in more detail. First, an image is acquired from camera 120, either by requesting camera 120 to capture an image or by selecting an image from a sequence of images received from camera 120. Then, by means of image processing, an eye 160 of user 130 is detected in the acquired image, and cornea 162 is identified. Further, reflection 163 of touchscreen 110 is detected, e.g., based on the shape and the visual appearance of touchscreen 110, i.e., the number and arrangement of the displayed user-interface elements, which are known to device 100. Then, the acquired image, or at least a part of the acquired image showing at least finger 151 touching user-interface element 111, is analyzed in order to determine which finger 151 of hand 150 is used for the interaction.
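The steps of the paragraph above can be arranged as a pipeline skeleton. Every function below is a hypothetical placeholder: the patent relies on image-processing techniques "known in the art" (e.g., eye detection and hand-geometry biometrics) rather than prescribing specific algorithms, so only the order of the steps is taken from the text.

```python
# High-level sketch of the analysis pipeline in the order described: detect
# the eye, identify the cornea, locate the reflection of the touchscreen,
# then classify the finger. The placeholder bodies stand in for real
# image-processing algorithms.

from typing import Any

FINGERS = ("thumb", "index", "middle", "ring", "little")

def detect_eye(image: Any) -> Any:
    # Placeholder: a real eye detector (e.g., a trained cascade) goes here.
    return image

def locate_cornea(eye_region: Any) -> Any:
    # Placeholder: identify the cornea within the detected eye region.
    return eye_region

def find_screen_reflection(cornea: Any, screen_layout: Any) -> Any:
    # Placeholder: match the known shape and visual appearance of the
    # touchscreen (the arrangement of displayed user-interface elements,
    # given by screen_layout) against the corneal reflection.
    return cornea

def classify_finger(reflection: Any) -> str:
    # Placeholder: measure biometric points of the hand geometry in the
    # reflection and return the identified finger.
    return "index"

def identify_finger(image: Any, screen_layout: Any) -> str:
    eye_region = detect_eye(image)
    cornea = locate_cornea(eye_region)
    reflection = find_screen_reflection(cornea, screen_layout)
    return classify_finger(reflection)
```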
[0041] Subsequent to determining which finger 151 of hand 150 is used for interacting with a user-interface element 111 displayed on touchscreen 110, an action dependent on the finger used for interacting with the user-interface element is performed, as is described hereinafter in more detail with reference to
[0042] It will be appreciated that the performed action may also be dependent on the user-interface element or a type of the user-interface element, as is known from traditional computers. That is, different actions, e.g., different default applications, may be associated with virtual buttons, pictures, text fields, icons, and so forth.
[0043] In
[0044] The embodiment described with reference to
[0045] In
[0046] With reference to
[0047] As a further example with reference to
[0048] It will be appreciated that embodiments of the invention may comprise different means for implementing the features described hereinbefore, and these features may in some cases be implemented according to a number of alternatives. For instance, displaying a user-interface element and detecting an interaction by a finger of a hand with the user-interface element may, e.g., be performed by processing unit 101, presumably executing an operating system of devices 100, 200, 300, or 400, in cooperation with touchscreen 110. Further, acquiring an image of the reflection of touchscreen 110 from camera 120 may, e.g., be performed by processing unit 101 in cooperation with camera 120. Finally, performing an action dependent on the finger used for interacting with the user-interface element is preferably performed by processing unit 101.
[0049] In
[0050] In
[0051] In
[0052] The person skilled in the art realizes that the invention by no means is limited to the embodiments described above. On the contrary, many modifications and variations are possible within the scope of the appended claims. In particular, embodiments of the invention are not limited to the specific choices of user-interface elements, fingers, and actions, used for exemplifying embodiments of the invention. Rather, one may easily envisage embodiments of the invention involving any kind of user-interface element and corresponding actions, whereby different fingers of the hand are associated with at least some of the actions for the purpose of improving user interaction with touchscreen-based devices.