COMPUTER-IMPLEMENTED SYSTEM AND METHOD FOR ASSISTING INPUT TO A VIRTUAL KEYPAD OR KEYBOARD ON AN ELECTRONIC DEVICE
20220269309 · 2022-08-25
Inventors
CPC classification
H04M2250/22
ELECTRICITY
G06F2200/1634
PHYSICS
G06F3/04886
PHYSICS
G06F1/1626
PHYSICS
G06F1/1643
PHYSICS
G06F2203/04809
PHYSICS
H04M2250/70
ELECTRICITY
International classification
Abstract
The invention provides systems, devices and methods for improved guidance or assistance of data input into a touch screen enabled device such as a mobile phone, tablet, payment terminal etc. The invention is particularly beneficial for use in situations where visual observation of the screen is impaired or not possible, either because of a user's disability or environmental factors. One aspect of the disclosure provides a data input assistance device comprising: a body for placement adjacent to a touch screen of an electronic device; at least one input zone provided in the body and arranged to facilitate a user to operate an area of the touch screen via contact with a surface of the screen; and at least one location indicator arranged to communicate, to a software component associated with the electronic device, the location of the input assistance device relative to the touch screen. The invention enables the provision of a virtual input arrangement such as a virtual pinpad, keyboard or keypad, at a location specified by a user. It also enables movement of the virtual input arrangement during use.
Claims
1. A data input assistance device comprising: a body for placement adjacent to a touch screen of an electronic device; at least one input zone provided in or on the body and arranged to facilitate or enable a user to operate an area of the touch screen by contact with the screen; and at least one location indicator arranged to communicate, to a software component associated with the electronic device, the location of a portion of the input assistance device relative to the touch screen.
2. A device according to claim 1, wherein the area of the touch screen is, provides and/or functions as a key of a virtual input component, preferably wherein the virtual input component is a virtual keypad, a virtual pinpad or a virtual keyboard.
3. A device according to claim 1, wherein at least a portion of the body, at least one location indicator and/or at least one input zone comprises an electrically-conductive material.
4. A device according to claim 1, wherein the at least one input zone: i) comprises at least one aperture arranged to expose the screen and enable contact with the screen by a user; and/or ii) comprises an area which is arranged to conduct an electrical signal to the touch screen for detection by a sensor; and/or iii) does not comprise a barrier between the surface of the touch screen and the user; and/or iv) is spaced from the surface of the touch screen.
5. A device according to claim 1, wherein the body comprises: i) at least one layer of a material through which a signal or energy derived from a user's body can be transmitted or conducted to the touch screen, and/or ii) a reference marker for tactile or audible communication of a location on the body to a user via touch or sound; and/or iii) a rear surface which: in use is spaced from the surface of the touch screen, by the at least one location indicator; and/or in use faces the surface of the touch screen; and/or comprises a layer, portion or coating of a material or substance which prohibits or impedes transmission of a signal from the user's body to the surface of the touch screen.
6. A device according to claim 5, wherein the reference marker comprises a raised, indented or grooved portion relative to the body, and/or an audible signal upon detection of contact by a user.
7. A device according to claim 1, wherein the device: i) is not fixed, adhered or maintained in position relative to the electronic device during or before or after use; and/or ii) comprises a plurality of location indicators; and/or iii) comprises an offset marker arranged to adjust or alter a data value received into/by, or interpreted as input by, the software component by a specified value.
8. A device according to claim 1, wherein the at least one location indicator: i) comprises a protrusion which projects from the body; and/or ii) is arranged to hold or position the body or a portion thereof away from, spaced from or adjacent to the touch screen; and/or iii) is provided on the body separately from the at least one input zone.
9. A device according to claim 1, wherein the body further comprises a shield arranged to hide or obscure operation of the electronic device from view by an observer.
10. A data input assistance device comprising: a body for placement adjacent to a touch screen of an electronic device, the body comprising at least one input zone and a rear surface; wherein the at least one input zone is arranged to facilitate or enable a user to operate an area of the touch screen by transmission of a signal or energy from the user's body as a result of contact with the touch screen; and wherein the rear surface is arranged to face a surface of the touch screen in use and comprises: i) a layer, portion or coating of a material or substance which prohibits or impedes transmission of a signal or energy from the user's body to the surface of the touch screen; and ii) at least one location indicator which enables or facilitates transmission of a signal or energy from the user's body to the surface of the touch screen for communication, to a software component associated with the electronic device, of the location of a portion of the input assistance device relative to the touch screen.
11. A data input assistance system comprising: a data input assistance device according to claim 1; and a software component arranged for execution on an electronic device associated with a touch screen, wherein the software component is arranged to provide a virtual input component on the device at a location based on the position of the location indicator.
12. A data input assistance method comprising: providing a software component operative to provide a virtual input component at a location on a touch screen of and/or associated with an electronic device, the location being specified or influenced by a location indicator of a data input assistance device of claim 1.
13. A method according to claim 12, further comprising: providing the data input assistance device of claim 1; and/or bringing the data input assistance device of claim 1 in proximity to the touch screen of the electronic device; and/or operating a key of the virtual input component through at least one input zone of the data input assistance device of claim 1; and/or providing the virtual input component at a different location on the touch screen of the electronic device in response to movement of the data input assistance device of claim 1.
14. A computer-implemented system comprising: an electronic device comprising a processor and associated memory and associated touch screen; and a data input assistance device according to claim 1.
15. A system according to claim 14, wherein the memory includes executable instructions that, as a result of execution by the processor, cause the system to provide a virtual input component at a location on the touch screen, the location being specified or influenced by a location indicator of the data input assistance device according to claim 1.
16. A non-transitory computer-readable storage medium having stored thereon executable instructions that, as a result of being executed by a processor of a computer system, cause the computer system to provide a virtual input component at a location on a touch screen associated with an electronic device, the location being specified or influenced by a location indicator of a data input assistance device according to claim 1.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
DESCRIPTION OF ILLUSTRATIVE EMBODIMENTS
[0056] The disclosure provides systems, methods and devices for enabling a user to input data into an electronic device. In the following disclosure, we provide an example use case wherein a user employs the invention to enter secret or sensitive data (which we may refer to as a “PIN”) for the purpose of verifying their identity before gaining access to a controlled resource e.g. bank account, building, vehicle etc. It should be noted that this use case is provided for illustration only and the invention is not limited with regard to this example. The data may be any form of data: numeric, alphanumeric, symbol, letter, picture etc.
[0057] Embodiments of the disclosure provide solutions that enable a virtual input mechanism/component/arrangement e.g. a virtual keypad, pin pad, keyboard etc. (referred to hereafter as a “pin pad” or “virtual pin pad” for convenience) to be provided on a device and/or used in an improved manner. Typically, a software component on a phone, which may be called an application, will provide the pin pad on the screen at the same location and in the same orientation. Moreover, the location of the keypad will remain static during use. However, in some situations, it is desirable to be able to provide the keypad at a location dictated by or dependent upon some variable. This variable could be dependent upon the user, or supplied by the user. For example, a blind user cannot visually determine the location on the screen of a displayed keypad. Therefore, they do not know where to touch the screen in order to enter their data. Similarly, sighted users may not be able to determine the location of the keypad in visually challenging environments e.g. where there is poor lighting. In other situations, it may be desirable for security reasons to be able to alter the location and/or orientation of the keypad, so that malware on a compromised device cannot determine the user's input.
[0058] Another challenge is that the location and/or orientation of the keypad may need to change during use rather than remaining static. For example, a user with physical impairment may have difficulty keeping their hand in one location. Shaky or jerky motions may make input via a static keypad more challenging, and incorrect keys may be pressed as a result. This may render the data input process frustrating for the user, and also less efficient because it takes longer to correct the mistake and then enter the correct input. In other examples, environmental factors may make input difficult via a static keypad e.g. if the user is in a moving vehicle or performing an activity which makes it difficult for them to keep their finger over a static keypad. In such cases, it would be advantageous to allow the keypad to move around in the display zone as required or influenced by the user's needs.
[0059] Turning to
[0060] The phone 11 is arranged and configured to be able to read input data provided by a user via (i.e. “using” and/or “through”) the touch screen. The means for achieving this may comprise one or more hardware components, e.g. sensors, and/or one or more software components, e.g. event listeners or software arranged to input a predetermined symbol (key indicia) into the phone's memory based on the user's contact with the screen 12.
[0061] The phone comprises software operative to provide a virtual keypad within a display zone of the touch screen 12, as can be seen in
[0062] The advantage of this is that it protects from “over the shoulder” surfing. For example, a user may not be aware that a third party is watching their actions, and that third party may thus be able to observe the potentially sensitive input data. This is especially a concern, for example, in respect of blind users. Therefore, the invention provides enhanced security and privacy.
[0063] The illustrative embodiment comprises an input device 1 comprising a body 4 which, in use, is placed such that a portion is adjacent to the screen. The body 4 is arranged such that it can slide or otherwise move relative to the surface of the screen. The body 4 is made of, or comprises at least in part, an electrically conductive material such as a conductive plastic. The body 4 may be made entirely of this material, substantially of it, or only in part. However, enough electrically conductive material is provided at the required location(s) to enable contact with the device to be communicated to the screen via the location indicators. This enables the software on the electronic device to know where the virtual keypad is to be provided within the display zone, as discussed in more detail below. In some embodiments the device 1 may be smaller than the screen of the electronic device and thus the device 1 may be moveable in relation to the screen. In other embodiments, the device 1 may be arranged to correspond (exactly or substantially) with the size of the screen, and movement of the device 1 in use may be impaired or prohibited due to its size relative to the screen.
[0064] The body 4 is provided with location indicating means 3 for communicating the virtual keypad's desired position, size and/or orientation on the screen to the device. Thus, the location, size and/or orientation of where to generate the virtual keypad can be communicated by a location indication element 3 (leg) that is separate and distinct from the input zone(s) in terms of its position on the device body. Energy or a signal derived from the user's body is transmitted by the leg(s) to the screen for detection thereon, and then used in the generation of the virtual keypad.
[0065] In one embodiment, the entire body 4, including the legs, is made of or comprises a means for communicating a signal derived from the user's body to the touchscreen e.g. electroconductive plastic. In another embodiment, only part of the body 4 comprises such a means. In another, only one or more location indicators 3 may comprise such a means. The location indicators 3 may be raised or projecting portions which protrude or extend from the body 4 of the device. For convenience they are referred to herein as “legs”. In such an embodiment, the leg(s) 3 hold the rest of the body 4 of the device away from the screen so that the remainder of the body does not contact the screen directly. There is a gap provided between the surface of the screen and the body of the device other than the legs, which touch the screen. This is advantageous, especially in embodiments where the entire body is made of conductive material, because it prevents other touches and handling of the device body by the user from being communicated to the touch screen as extraneous signals and thus obfuscating communication of the desired location to the keypad generation software. Such embodiments can thus be made of a single layer of material, which reduces complexity of design, manufacturing time and costs, and allows for a lighter, easier and more efficient device that requires fewer resources for storage and transportation.
[0066] In an alternative embodiment, the rear surface of the device may be coated or provided with a non-conductive layer or substance which prohibits or impedes transmission of the signal from the user's body to the screen, other than at the legs. The front of the body may comprise a conductive substance or layer that transmits the signal or energy from the user during handling of the device in use to the legs, but the non-conductive portion of the rear face of the body prevents transmission to the screen elsewhere. In such embodiments, the location indicators do not need to be raised or protruding portions, but can be substantially flat or in-line with the rest of the rear surface of the device. They may be conductive elements which are embedded in the body of the device and coupled to the conductive top layer or face of the body such that the signal derived from the user's body can be communicated through the legs to the screen.
[0067] In a preferred embodiment, at least three legs 3 are provided on the rear 2b of the body, as shown in
[0068] The relative arrangement of the legs 3 on the body 4 may define an input detection zone on the screen. For example, the triangular portion of the screen covered by the triangle formed between the legs of the device shown in
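The input detection zone described above can be sketched in code. The following is a minimal point-in-triangle test, assuming (hypothetically) that the three leg contact points are available as screen coordinates; touches falling outside the triangle could then be ignored by the software. All names and coordinates are illustrative, not taken from the patent.

```python
def sign(p, a, b):
    """Cross-product sign: which side of the segment a->b the point p lies on."""
    return (p[0] - b[0]) * (a[1] - b[1]) - (a[0] - b[0]) * (p[1] - b[1])

def in_detection_zone(touch, legs):
    """Return True if a touch point lies inside the triangle formed by
    the three leg contact points (hypothetical pixel coordinates)."""
    a, b, c = legs
    d1, d2, d3 = sign(touch, a, b), sign(touch, b, c), sign(touch, c, a)
    has_neg = d1 < 0 or d2 < 0 or d3 < 0
    has_pos = d1 > 0 or d2 > 0 or d3 > 0
    return not (has_neg and has_pos)  # all on the same side -> inside

# Example: legs at (0,0), (100,0), (50,80)
legs = [(0.0, 0.0), (100.0, 0.0), (50.0, 80.0)]
print(in_detection_zone((50.0, 30.0), legs))   # inside the triangle
print(in_detection_zone((200.0, 30.0), legs))  # outside the triangle
```

A software component could run such a test on each touch event and discard contacts outside the zone bounded by the legs.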
[0069] In a preferred embodiment, the location indicators indicate the location of the device (and thus of the virtual pinpad) to the software using a signal derived from the user's body e.g. electrical energy. This provides a low-cost and simple arrangement. In one or more alternative embodiments, the location indicators 3 are powered or energised in some way other than using electrical energy from the user's body. For example, the legs 3 can be powered using a battery. In all embodiments, however, the location indicators 3 are powered or energised in some way to enable the desired location, size, configuration and/or orientation of the virtual pin pad to be detected by the hardware/software on the phone as a result of contact between the screen and the location indicator(s) rather than by contact between the screen and the user. Thus, the user does not communicate data or signals relating to the size, configuration and/or orientation of the virtual input device directly to the software; this information goes via the location indicators 3. The conductive location indicators 3 transmit a signal that is detectable by the touchscreen and transmitted to the software to derive data relating to the virtual keypad. That data can then be used by the software to generate and/or re-locate the virtual keypad in a particular location relative to the display zone of the touchscreen, and/or with a particular size and/or configuration.
[0070] The centre of the touch radius of a leg 3 can be used in the calculation of where to locate the virtual keypad relative to the display zone of the screen, and also its required size, dimensions and orientation.
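One way the touch-radius centres might feed into that calculation can be sketched as follows. This is an illustrative interpretation, not the patent's specified algorithm: it assumes three leg touch centres, one of which is designated the "top" leg, and derives a centre, orientation angle and scale for the virtual keypad.

```python
import math

def keypad_placement(leg_points, top_index=0):
    """Derive a hypothetical keypad placement from three leg touch centres.

    leg_points: [(x, y), ...] centres of the three detected touch radii.
    top_index:  which point is the single 'top' leg (an assumption; the
                description says one leg marks the top of the keypad).
    Returns (centre, angle_deg, scale) for the virtual keypad.
    """
    top = leg_points[top_index]
    base = [p for i, p in enumerate(leg_points) if i != top_index]
    base_mid = ((base[0][0] + base[1][0]) / 2, (base[0][1] + base[1][1]) / 2)
    # Keypad centre: centroid of the three contact points.
    cx = sum(p[0] for p in leg_points) / 3
    cy = sum(p[1] for p in leg_points) / 3
    # Orientation: angle of the vector from the base midpoint to the top leg.
    angle = math.degrees(math.atan2(top[1] - base_mid[1], top[0] - base_mid[0]))
    # Size: distance between the two base legs sets the keypad's scale.
    scale = math.dist(base[0], base[1])
    return (cx, cy), angle, scale

centre, angle, scale = keypad_placement([(50, 0), (0, 80), (100, 80)])
print(centre, angle, scale)  # centroid, upward-pointing angle, base width
```

The software could then pass the returned centre, angle and scale to whatever routine draws the virtual keypad within the display zone.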
[0071] Stabilizers 5 may be provided for improving stability of the body during use in certain embodiments. Examples are shown in
[0072] The body 4 of the input aid 1 is provided with one or more input zones 6 as shown in
[0073] Preferably, the input zone(s) provide unfettered, unimpeded or uninterrupted access to the surface of the touch screen by the user so that the user's body or a suitably arranged (conductive) input device may make contact with the touchscreen surface without intervention or impediment by any part of the device, and/or without the need for translation or communication of the user's signal to the electronic device or touchscreen by a part of the device 1. Thus, the windows 6 may be cut outs, apertures or openings which have no barrier between the surface of the screen and the user. In other embodiments, perhaps for use in situations where dirt, moisture or other environmental factors are relevant, a membrane or some sort of material may be provided within the window to cover the screen. In such embodiments, the membrane would need to allow communication of the user's selection to the device, e.g. via conductive means.
[0074] The user is able to detect the location of the window(s) via non-visual means such as tactile feedback. For example, the user would be able to feel the edges of the windows shown in the figures. Additional features may be added to enhance detection e.g. the window(s) may be provided with a lip or ridge around at least part of the perimeter. In some embodiments, the body may be completely or substantially flat. The location and symbol of the key corresponding to a given window may be communicated in some way to the user when the user's touch is located within that area of the screen e.g. via vibration and/or sound. The sound may be communicated via wired/connected means (e.g. headphones that are plugged in to the electronic device) or wireless means (e.g. via Bluetooth, NFC, WiFi or other wireless connectivity).
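The key-location feedback described above implies a mapping from a touch coordinate to the window (and hence key symbol) beneath it. A minimal sketch follows, with an entirely hypothetical window layout; in practice the layout would be derived from the keypad placement.

```python
# Hypothetical window layout: each key window is an axis-aligned rectangle
# (x, y, width, height) in screen coordinates, paired with its key symbol.
WINDOWS = {
    "1": (0, 0, 30, 30),   "2": (35, 0, 30, 30),  "3": (70, 0, 30, 30),
    "4": (0, 35, 30, 30),  "5": (35, 35, 30, 30), "6": (70, 35, 30, 30),
}

def key_under_touch(x, y, windows=WINDOWS):
    """Return the key symbol whose window contains the touch, or None.
    The software could then announce the key via vibration or sound."""
    for symbol, (wx, wy, w, h) in windows.items():
        if wx <= x <= wx + w and wy <= y <= wy + h:
            return symbol
    return None

print(key_under_touch(40, 40))  # "5"
print(key_under_touch(33, 10))  # None: touch falls in a gap between windows
```

On a positive match the application might trigger a vibration or play the key's name through wired or wireless audio, as the paragraph above describes.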
[0076] The dot 7 is shown in the figures as a small raised bar below the “5” key. In other embodiments, however, the marker 7 could be provided elsewhere on the device, and could take any shape, size, orientation etc. In some embodiments, it may not be a raised portion, or may not provide a tactile means of location communication. For example, in some embodiments the software may detect the user's finger on a given location and communicate that via an audible means such as a beep or other sound.
[0080] As shown in
[0081] Additionally, or alternatively, other techniques may be used to adjust or scramble the configuration of the keys on the virtual keyboard or influence the interpretation of the user's input by the software into the electronic device. These techniques may include, for example, the use of biometric data derived from the user to influence the virtual keypad configuration, or random number generation. The software that generates the underlying virtual keypad may utilise any known method for scrambling a keypad layout, including the techniques disclosed in WO2014/013252, WO2016/189325 the contents of which are incorporated herein in their entirety.
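The offset and scrambling ideas above can be illustrated in code. The sketch below assumes an offset marker that shifts each interpreted digit by a user-specified value (wrapping within 0-9), and a session-scrambled layout produced by random permutation; the specific functions and the modular-wrap behaviour are illustrative assumptions, not the patented techniques.

```python
import random

def apply_offset(digit, offset):
    """Offset-marker behaviour (a sketch): shift the interpreted digit
    by a specified value, wrapping within the range 0-9 (an assumption)."""
    return (digit + offset) % 10

def scrambled_layout(keys="1234567890", seed=None):
    """One possible scrambling technique: a random permutation of the
    key indicia, regenerated for each input session."""
    rng = random.Random(seed)
    layout = list(keys)
    rng.shuffle(layout)
    return "".join(layout)

print(apply_offset(7, 5))  # 2: 7 + 5 wraps past 9
layout = scrambled_layout()
print(sorted(layout) == sorted("1234567890"))  # same keys, new order
```

Either mechanism means the raw touch locations observed by malware would not reveal the actual values entered.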
[0082] Example: An Illustrative Embodiment in Use
[0083] In use, the input device 1 is arranged to communicate with hardware and at least one software component provided on the user's mobile phone 11. The user may download and install the software onto the phone 11 prior to use. The software is arranged to interpret the signals from the contact points on the device 1 as parameters to be used in the generation, placement and/or display of the virtual keypad. When the user wishes to enter data in accordance with the present invention, the user may execute the software component e.g. by selecting an icon on their phone 11.
[0084] Upon execution, the software application “knows” that data is to be entered and a virtual keypad is to be generated and provided in the correct location as indicated by the user via the device body. The user places the (electroconductive) input device 1 against the screen such that the legs 3 and stabilizers 5 are in contact with the display zone 12 of the phone. The electrical signals derived from the user's body are detected by the screen via the legs. The single leg 3 at the top of the body 4, 2b indicates to the software where it needs to place the top of the virtual keypad relative to the display zone, and the orientation of the virtual keypad. The configuration and relative position of the three legs 3 provide the necessary information relating to the desired size of the pin pad. The software uses this information to generate the virtual keypad. This may be achieved, at least in part, using a procedure call supplied with the phone. Thus, in some embodiments, the virtual keypad generation may be performed by a subroutine contained in the library supplied by the manufacturer of an electronic device 11 and called by a software component downloaded and installed by a user. In other embodiments, the entire method(s) in accordance with the invention may be performed by proprietary software downloaded and installed onto the device. In yet other embodiments, the necessary software may be supplied with the device 11, and thus not require any download or installation by a user.
[0085] In some embodiments, the software causes the screen 12 to go blank or turn a solid colour e.g. black in order to prevent the virtual keypad from being observed. As above, in other embodiments, the virtual keypad may not be displayed at all or may be displayed so as to blend with the background of the screen and thus be invisible to an observer. This is shown in
[0086] The software then provides the virtual keypad beneath the device 1. Thus, in use, the input device 1 functions as a removable and re-positionable overlay that tells the software how and where the keypad is to be provided and tracks it if it moves relative to the screen. The keypad is drawn to scale under the windows 6 of the device 1 as shown in
[0087] The user locates the raised dot 7 in the centre of the input device 1 thus knowing where the “5” key is located and, by reference, the other keys. The user moves his/her finger to the window 6 for the first digit they wish to enter, making contact with the surface of the screen through the desired window 6 for a predetermined length of time e.g. 2 seconds. This predetermined length of time prevents accidental touches being interpreted as intended keystrokes as the user moves their finger around the windows. In embodiments which utilise a Touch3D device the user alters the pressure of their touch to indicate data input. To confirm a successful entry of the input, the phone may beep, vibrate or otherwise indicate entry to the user. The user then moves to the desired window 6 for the key of the next digit they wish to enter. This is repeated for all desired characters that the user wishes to enter. The software stores each input and constructs a string which represents the user's collective keystrokes. This could be, for example, a PIN or password or other sensitive or secret data. Any accidental keystrokes may be indicated to the software via a predetermined signal such as pressing a “delete” key. This may be performed in a secure portion of memory or secure environment. The string may be sent to a location and used in an authentication process for validation of the user's identity.
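The dwell-time entry loop described above can be sketched as follows. This is a minimal illustration, assuming touch events arrive as (key, press time, release time) tuples and that "DEL" marks the delete key; the event format, the 2-second threshold and the function names are assumptions for illustration only.

```python
def process_touches(events, dwell=2.0):
    """Interpret a sequence of (key, press_time, release_time) touch events.

    A touch counts as a keystroke only if held for at least `dwell`
    seconds; "DEL" removes the last accepted character, mirroring the
    correction mechanism described above.
    """
    entered = []
    for key, pressed, released in events:
        if released - pressed < dwell:
            continue  # brief contact while locating a window: ignored
        if key == "DEL":
            if entered:
                entered.pop()
        else:
            entered.append(key)
            # here the phone might beep or vibrate to confirm the entry
    return "".join(entered)

events = [
    ("5", 0.0, 0.5),    # too short: finger passing over the window
    ("1", 1.0, 3.5),    # held 2.5 s: accepted
    ("2", 4.0, 6.5),    # accepted
    ("DEL", 7.0, 9.5),  # accepted: removes "2"
    ("9", 10.0, 12.5),  # accepted
]
print(process_touches(events))  # "19"
```

The resulting string could then be held in secure memory and forwarded for authentication, as the paragraph above describes.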
[0088] Turning now to
[0089] The processor(s) 2602 can also communicate with one or more user interface input devices 2612, one or more user interface output devices 2614, and a network interface subsystem 2616. A bus subsystem 2604 may provide a mechanism for enabling the various components and subsystems of computing device 2600 to communicate with each other as intended. Although the bus subsystem 2604 is shown schematically as a single bus, alternative embodiments of the bus subsystem may utilize multiple busses.
[0090] The network interface subsystem 2616 may provide an interface to other computing devices and networks. The network interface subsystem 2616 may serve as an interface for receiving data from, and transmitting data to, other systems from the computing device 2600. For example, the network interface subsystem 2616 may enable a data technician to connect the device to a network such that the data technician may be able to transmit data to the device and receive data from the device while in a remote location, such as a data centre.
[0091] The user interface input devices 2612 may include one or more user input devices such as a keyboard; pointing devices such as an integrated mouse, trackball, touchpad, or graphics tablet; a scanner; a barcode scanner; a touch screen incorporated into the display; audio input devices such as voice recognition systems, microphones; and other types of input devices. In general, use of the term “input device” is intended to include all possible types of devices and mechanisms for inputting information to the computing device 2600.
[0092] The one or more user interface output devices 2614 may include a display subsystem, a printer, or non-visual displays such as audio output devices, etc. The display subsystem may be a cathode ray tube (CRT), a flat-panel device such as a liquid crystal display (LCD), light emitting diode (LED) display, or a projection or other display device. In general, use of the term “output device” is intended to include all possible types of devices and mechanisms for outputting information from the computing device 2600. The one or more user interface output devices 2614 may be used, for example, to present user interfaces to facilitate user interaction with applications performing processes described and variations therein, when such interaction may be appropriate.

The storage subsystem 2606 may provide a computer-readable storage medium for storing the basic programming and data constructs that may provide the functionality of at least one embodiment of the present disclosure. The applications (programs, code modules, instructions), when executed by one or more processors, may provide the functionality of one or more embodiments of the present disclosure, and may be stored in the storage subsystem 2606. These application modules or instructions may be executed by the one or more processors 2602. The storage subsystem 2606 may additionally provide a repository for storing data used in accordance with the present disclosure. For example, the main memory 2608 and cache memory 2602 can provide volatile storage for programs and data. The persistent storage 2610 can provide persistent (non-volatile) storage for programs and data and may include flash memory, one or more solid state drives, one or more magnetic hard disk drives, one or more floppy disk drives with associated removable media, one or more optical drives (e.g. CD-ROM, DVD or Blu-ray) with associated removable media, and other like storage media.
Such programs and data can include programs for carrying out the steps of one or more embodiments as described in the present disclosure, as well as data associated with transactions and blocks as described in the present disclosure.
[0093] The computing device 2600 may be of various types, including a portable computer device, tablet computer, a workstation, or any other device described below. Additionally, the computing device 2600 may include another device that may be connected to the computing device 2600 through one or more ports (e.g., USB, a headphone jack, Lightning connector, etc.). The device that may be connected to the computing device 2600 may include a plurality of ports configured to accept fibre-optic connectors. Accordingly, this device may be configured to convert optical signals to electrical signals that may be transmitted through the port connecting the device to the computing device 2600 for processing. Due to the ever-changing nature of computers and networks, the description of the computing device 2600 depicted in
[0094] Terminology
[0095] Herein the terms “keypad”, “pin pad” and “keyboard” may be used interchangeably and synonymously, and all of these terms are intended to cover a virtual input device, component or mechanism for entering data into a computer-based resource such as, but not limited to, a laptop, personal computer, mobile phone, tablet or any other form of processor-based computing device. The computer-based resource may comprise an electronic device and/or a software application running on such a device. The terms “virtual keypad/keyboard/pin pad” are intended to include a software-implemented version or representation which models the functionality of a mechanical keypad/keyboard, as known in the art. The virtual keypad/keyboard/pin pad provides keys, each associated with a symbol or indicia such that operation of a key by a user causes the relevant symbol associated with the selected key to be entered into (potentially secure) memory within the device for use by a software application.
[0096] The computer-based resource may be any type of electronic, processor-based device comprising an operating system. This includes but is not limited to: servers, mobile devices, personal computers, PoS, payment entry/processing systems and devices, card reading devices, dedicated computing arrangements e.g. ATM machines and networks comprising any combination of such devices. The computer-based resource may comprise at least one software application arranged for execution on a processor associated with the device. The computer-based resource may be arranged and/or operative to generate a virtual input device.
[0097] The terms “authentication”, “verification” and “validation” are used interchangeably herein. The term “virtual” is used interchangeably herein with “electronic” in relation to keypad/pinpad/keyboard.
[0098] The term “configuration” when used in relation to a keypad, whether virtual or physical, may be used herein to include the layout and/or arrangement of keys.
[0099] The term “data” as used herein can be interpreted as meaning any signal that is capable of being entered as input into an electronic computing device. The input/data could be a signal interpreted by the device as any type of data, including (for example) but not limited to a portion of text, a character, a string, an integer or floating point value, a symbol, an instruction, a procedure/function/method call, a representation of a static or moving image etc.
[0100] It should be noted that the above-mentioned embodiments illustrate rather than limit the invention, and that those skilled in the art will be capable of designing many alternative embodiments without departing from the scope of the invention as defined by the appended claims. In the claims, any reference signs placed in parentheses shall not be construed as limiting the claims. The words “comprising” and “comprises”, and the like, do not exclude the presence of elements or steps other than those listed in any claim or the specification as a whole. In the present specification, “comprises” means “includes or consists of” and “comprising” means “including or consisting of”. The singular reference of an element does not exclude the plural reference of such elements and vice-versa. The invention may be implemented by means of hardware comprising several distinct elements, and by means of a suitably programmed computer. In a device claim enumerating several means, several of these means may be embodied by one and the same item of hardware. The mere fact that certain measures are recited in mutually different dependent claims does not indicate that a combination of these measures cannot be used to advantage.