UNIVERSAL KEYBOARD
20210405783 · 2021-12-30
Inventors
CPC classification
G06F3/0202
PHYSICS
G06F21/32
PHYSICS
G06F3/0213
PHYSICS
G06F3/04886
PHYSICS
A61F4/00
HUMAN NECESSITIES
International classification
G06F21/32
PHYSICS
G06F3/0488
PHYSICS
Abstract
A keyboard for physically handicapped persons, including a translucent surface, a capacitive layer underneath the translucent surface, enabling detection of touch location and pressure on the translucent surface, a projection system dynamically projecting a plurality of visual layouts of keys of a keyboard on the translucent surface, wherein each visual layout includes ASCII character keys or graphical buttons, and a dynamic keyboard layout generator configured to receive user input in conformance with a currently projected layout of keys from a physically handicapped user, to generate therefrom a time series of ASCII characters or button selections for input to a computing device, to dynamically adjust pressure sensitivity of the keyboard to avoid spurious user input, and to dynamically adjust key sizes and positions in a current virtual layout of keys, to reduce the amount of hand motion required by the user and the amount of discomfort experienced by the user.
Claims
1. A keyboard for a physically handicapped person, comprising: a blank translucent surface for use as an input device; a capacitive layer mounted underneath said blank translucent surface, enabling detection of touch location and pressure on said blank translucent surface; a projection system dynamically projecting a plurality of visual layouts of keys of a keyboard on said blank translucent surface, wherein each visual layout comprises ASCII character keys or graphical buttons; and an accessibility module, coupled with said capacitive layer, with said projection system, and with a computing device, configured (i) to receive user input in conformance with a currently projected layout of keys from a physically handicapped user, and to generate therefrom a time series of ASCII characters or button selections for input to the computing device, and (ii) to dynamically adapt to the user's style of typing, comprising dynamically adjusting pressure sensitivity of the keyboard to avoid spurious user input, and dynamically adjusting key sizes and positions in a current virtual layout of keys, to reduce the amount of hand motion required by the user and the amount of discomfort experienced by the user.
2. The keyboard of claim 1, wherein said accessibility module monitors keyboard typing errors.
3. The keyboard of claim 1 wherein said accessibility module identifies an appendage of a user used to provide input, the appendage comprising a finger, a palm or a knuckle.
4. The keyboard of claim 3 wherein said accessibility module identifies position of the user's appendage on said blank translucent surface.
5. The keyboard of claim 3 wherein said accessibility module identifies pressure applied by the user's appendage to said blank translucent surface.
6. The keyboard of claim 1 wherein said accessibility module infers a user's intent during typing, comprising a key the user intended to press, a word the user intended to type, or a shortcut action the user intended to perform within an application.
7. The keyboard of claim 6 wherein said accessibility module infers a user's intent by an application context, natural language constraints, and the user's typing history.
8. The keyboard of claim 1 wherein said accessibility module provides shortcuts to quickly invoke a user's intent and adjusts the visually projected keyboard layout so as to minimize typing errors, to decrease hand and finger motion, and to avoid uncomfortable or impossible hand positions.
9. The keyboard of claim 1 wherein said accessibility module recognizes symbols hand-drawn by a user on said blank translucent surface.
10. The keyboard of claim 1 wherein said accessibility module maps gestures performed on said blank translucent surface to key sequences.
11. The keyboard of claim 1 wherein a gesture comprises sliding a closed fist in a rightward or leftward direction.
12. A secure keyboard, comprising: a blank translucent surface for use as an input device; a capacitive layer mounted underneath said blank translucent surface, enabling detection of touch location and pressure on said blank translucent surface; a projection system projecting a visual layout of keys of a keyboard on said blank translucent surface, the visual layout comprising ASCII character keys and/or graphical buttons; a handprint generator coupled with said capacitive layer that, upon a user placing his hand on said blank translucent surface, generates a user template describing the user's hand, the user template comprising a list of keyboard surface coordinates and corresponding pressures, and stores the user template; and a handprint analyzer, coupled with said capacitive layer, with said handprint generator, and with a computing device, that authenticates an unknown user who asserts an identity by matching the unknown user's template, currently generated by said handprint generator, to the stored user template for the identity asserted by the unknown user, wherein: if no match is found, indicates that the unknown user is not authorized to use the keyboard or not previously enrolled for the keyboard; and if a match is found, receives user input from the unknown user in conformance with the projected layout of keys, and generates therefrom a time series of ASCII characters or button selections for input to the computing device.
13. The secure keyboard of claim 12 wherein the user template generated by said handprint generator further comprises a time series comprising times of interaction of the user's hand with said blank translucent surface, locations of the interaction at each time of interaction, and the amount of pressure applied to said blank translucent surface at each location and time of interaction.
14. The secure keyboard of claim 13 wherein the user template generated by said handprint generator comprises physiological measurements extracted from the time series.
15. The secure keyboard of claim 14 wherein the physiological measurements comprise one or more of (i) finger lengths, (ii) palm surface area, and (iii) hand arches.
16. The secure keyboard of claim 13 wherein the user template generated by said handprint generator comprises behavioral measurements extracted from the time series.
17. The keyboard of claim 16 wherein the behavioral measurements comprise one or more of (i) a pressure heatmap formed by the user's hand on said blank translucent surface, (ii) the first and last parts of the user's hand to make contact with said blank translucent surface, and (iii) vibrations experienced by the keyboard while the user's hand is held on said blank translucent surface.
18. The secure keyboard of claim 12 wherein said keyboard further comprises an inertial measurement unit (IMU) sensor sensing an acceleration experienced by the keyboard, and wherein the user template generated by said handprint generator further comprises the acceleration experienced by the keyboard at each time of interaction.
19. The secure keyboard of claim 12 wherein said handprint analyzer authenticates the unknown user in response to receiving an authentication request from an application that runs on the computing device, which the unknown user is attempting to use.
20. A keyboard, comprising: a blank translucent surface for use as an input device; a capacitive layer mounted underneath said blank translucent surface, enabling detection of touch location and pressure on said blank translucent surface; a projection system projecting a visual layout of keys of a keyboard on said blank translucent surface, the visual layout comprising ASCII character keys and/or graphical buttons; a handprint generator coupled with said capacitive layer, that, upon a user placing his hand on said blank translucent surface, generates a user template describing the user's hand, the template comprising a list of keyboard surface coordinates and corresponding pressures, and stores the user template; and a handprint analyzer, coupled with said capacitive layer and with said handprint generator, that identifies an unknown user by comparing the unknown user's template, currently generated by said handprint generator, to a plurality of stored user templates, wherein if a match is not found then the unknown user is not identified.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0021] The present invention will be more fully understood and appreciated from the following detailed description, taken in conjunction with the drawings in which:
[0041] For reference to the figures, the following index of elements and their numerals is provided. Similarly numbered elements represent elements of the same type, but they need not be identical elements.
TABLE-US-00001
Table of elements in the figures

Element  Description
100      keyboard
110      blank translucent surface
120      capacitive surface
140      projection system
150      controller
160      speaker
165      microphone
170      webcam
175      accelerometer
180      USB connector
185      wireless module
190      biometric generator
200      keyboard
210      blank translucent surface
220      capacitive surface
240      projection system
250      controller
260      speaker
265      microphone
270      webcam
275      accelerometer
280      USB connector
285      wireless module
290      dynamic keyboard layout generator
300      keyboard
310      finished acrylic material
320      alloy bond metal cover
330      microprocessor
340      lithium ion battery
350      micro-USB charging port
360      LED
410      silicone layer
420      touch sensor layer
430      acrylic layer
440      LED layer
450      acrylic blocks
500      layout of keys
510      character keys
520      space bar
530      cancel key
540      special character key
550      copy key
560      paste key
570      touch pad
580      sensitivity scroll bar
600      layout of keys
610      character keys
620      space bar
630      special keys
640      keys for language selection
650      key for adding a language
680      sensitivity scroll bar
700      layout of keys
710      directional keys
720      special key
730      special key
740      special key
750      special key
800      method
810      flowchart operation
820      flowchart operation
830      flowchart operation
900      method
910      flowchart operation
920      flowchart operation
930      flowchart operation
1000     keyboard driver
1100     mouse driver
1200     keyboard embodiment using micro-LED array projection
1210     silicone layer
1220     capacitive layer
1230     acrylic layer
1240     micro LED layer
1300     keyboard embodiment using projection bar
1310     acrylic keyboard
1340     projection bar
1400     keyboard embodiment using projection underneath keyboard
1410     acrylic layer
1420     support layer
1440     projection device
1500     keyboard embodiment using touchscreen
1510     touchscreen
1600     biometric analyzer
1610     biometric identifier
1620     biometric authenticator
1630     biometric behavioral analyzer
1640     biometric learning machine
1700     keyboard to accommodate users with limited metacarpophalangeal (digits/palm joint) or intercarpal (palm/wrist joint) articulation in the left hand
1800     keyboard to accommodate users that experience tremors in the hand, such as users suffering from Parkinson's disease, and users with limited motor accuracy, such as users recovering from a stroke
1900     method
1910     flowchart operation
1920     flowchart operation
1930     flowchart operation
1940     flowchart operation
1950     flowchart operation
1960     flowchart operation
1970     flowchart operation
1980     flowchart operation
1990     flowchart operation
2000     obfuscator module
2100     handprint generator
2150     handprint analyzer
2200     accessibility module
2300     cryptographic module
DETAILED DESCRIPTION
[0042] Embodiments of the present invention relate to a universal keyboard, referred to herein as the “ONE KEYBOARD”, which provides a universal solution for all keyed input to a computer, and is the first fully-compatible biometric keyboard.
[0043] The keyboard consists of a blank translucent surface, a capacitive array touch screen that transmits touch to the computer, and a projection system that projects the keyboard template or layout that is needed for each application. As described below, there are inter alia four types of projection systems: (1) micro-LED array applied to the under-surface of the keyboard, (2) projection system applied to a bar across the keyboard, (3) projection system that projects onto the surface from underneath the keyboard, and (4) touchscreen system.
[0044] The ONE KEYBOARD comes in a variety of sizes and in a software-only version. A user has a choice of silicone pads that adhere to the surface of the ONE KEYBOARD, in order to find the preferred “touch”. The ONE KEYBOARD is supported on two legs that are adjustable to the user's preferred angle. A small speaker in the keyboard allows the user to choose from a variety of pre-recorded sounds to simulate the “click” of a key and provide minimal haptic feedback. The user can choose a binary sound, which makes a sound at a single decibel level, or variable sounds that are louder or softer depending on the pressure applied to the key. The ONE KEYBOARD comes in wired or wireless (e.g., BLUETOOTH®) models. Both models display the internal circuitry through the acrylic, for aesthetic purposes. The touch sensors may be set to whatever threshold the user prefers, and the threshold may vary based on the application being used. There are optional accessories, including inter alia a mouse, microphone, webcam and speaker.
[0045] Reference is made to
[0046] Keyboard 100 includes a biometric generator 190 operative to receive user input in conformance with the projected layout of keys, and to generate therefrom a time series of touch location and touch pressure data, for use as data by a keystroke biometric analyzer 1600. Biometric analyzer 1600 is described below with reference to
[0047] Keyboard 100 also includes a handprint generator 2100 and a handprint analyzer 2150 described below. Keyboard 100 also includes an accessibility module 2200 for users who have difficulty typing on a standard keyboard. Accessibility module 2200 is described below with reference to
[0048] Reference is made to
[0049] Keyboard 200 includes a dynamic keyboard layout generator 290 operative to dynamically control projection system 240 to project different layouts of keys on translucent surface 210 in response to user activity on a computing device, to receive user input in conformance with a currently projected layout of keys, and to generate therefrom a time series of ASCII characters or button selections for input to the computing device.
[0050] It will be appreciated by those skilled in the art that the embodiments shown in
[0051] Reference is made to
[0052] Although element 310 is indicated as being an acrylic material, this is not necessary for practice of the invention, and element 310 may alternatively be comprised of glass, plexi-glass or such other translucent material, or a combination of such materials.
[0053] Reference is made to
[0054] Reference is made to
[0055] Reference is made to
[0056] Reference is made to
[0057] It will be appreciated by those skilled in the art that the layouts 500, 600 and 700 of respective
[0058] It may thus be appreciated by those skilled in the art that the ONE KEYBOARD supports international and emoji keyboard layouts. Users choose from a list of standard locales, or customize and create a new locale. Thus a bilingual user creates a keyboard layout with keys from both an English QWERTY layout and a French AZERTY layout, or a layout that switches between the two. Embodiments of the present invention include a method that chooses among standard international keyboard layouts, that customizes a standard layout, or that creates a new layout. The size and arrangement of keys on the keyboard surface may be customized by the user through a layout customization program that runs on the ONE KEYBOARD.
[0059] Reference is made to
[0060] Reference is made to
[0061] It will be appreciated by those skilled in the art that the methods shown in
[0062] Reference is made to
[0083] Data output from the keyboard via a device driver may use a file format and communications protocol that conform to an existing or future standard. The lowest-level output from the ONE KEYBOARD is the continuous data stream (X, Y, T_D, T_R, P, A). In addition, the driver estimates the user's finger size, S, using an edge-detection algorithm, where S is the estimated two-dimensional area of the user's finger, estimated by detecting the diameter, d, of the finger while the key is depressed (e.g., S = ¼πd²). The raw pixels covered by the finger are also made available.
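By way of non-limiting illustration, the finger-size estimate described above may be sketched as follows. The boolean contact-mask representation and the function name are illustrative assumptions, not part of the disclosed driver:

```python
import math

def estimate_finger_size(contact_mask):
    """Estimate finger contact area from a 2D grid of touched sensor cells.

    contact_mask: list of rows of 0/1 values from the capacitive layer
    (an illustrative stand-in for the edge-detected raw pixels).
    Returns (d, S): the estimated diameter d and area S = (1/4) * pi * d**2.
    """
    # Count the raw pixels covered by the finger.
    pixels = sum(sum(row) for row in contact_mask)
    if pixels == 0:
        return 0.0, 0.0
    # Treat the contact patch as a circle: pixels ~ pi * (d/2)**2,
    # so d = 2 * sqrt(pixels / pi).
    d = 2.0 * math.sqrt(pixels / math.pi)
    # Apply the formula S = (1/4) * pi * d**2 from the description.
    S = 0.25 * math.pi * d * d
    return d, S
```

In this sketch the area S recovers the pixel count by construction; a production driver would additionally calibrate sensor cells to physical units.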
[0084] When appropriate, touch data is converted (i.e., mapped) onto ASCII keystrokes, and, when needed, the data is converted to graphical data. For example, if the user presses the keyboard where the “J” key is located, the ASCII output for J is sent with the associated pressure measurement; if the user creates a signature on a touchpad, the signature is mapped to a bitmap file, with a corresponding “user pressure matrix”, which is a 3D matrix containing the 2D pressure applied along a time axis. The physiology and motor control exhibited by a person's hand is relatively unique and may be used as a form of authentication or identification.
[0085] The surface of the ONE KEYBOARD is capable of sensing shape and pressure. As a form of authentication or identification, a user can place his hand on the keyboard surface. Embodiments of the present invention include handprint generator 2100 (
[0086] Handprint analyzer 2150 (
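By way of illustration, matching an unknown user's handprint template against a stored template, where each template is a list of surface coordinates and corresponding pressures as described above, may be sketched as follows. The point-by-point pairing and the threshold value are illustrative assumptions:

```python
def match_handprint(candidate, stored, max_mean_error=0.15):
    """Compare two handprint templates, each a list of
    (x, y, pressure) tuples as described for handprint generator 2100.

    Returns True if the mean per-point distance is below a threshold;
    a mismatch indicates the unknown user is not enrolled/authorized.
    """
    if len(candidate) != len(stored):
        return False
    total = 0.0
    for (x1, y1, p1), (x2, y2, p2) in zip(candidate, stored):
        # Euclidean distance in the combined coordinate/pressure space.
        total += ((x1 - x2) ** 2 + (y1 - y2) ** 2 + (p1 - p2) ** 2) ** 0.5
    return total / len(candidate) <= max_mean_error
```

A production handprint analyzer would align the two templates before comparison and would match against all stored templates when identifying (rather than authenticating) a user.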
[0087] Keyboard device driver 1000 may be implemented in software, firmware, hardware, or a combination of software, firmware and hardware.
[0088] Reference is made to
[0089] Mouse device driver 1100 may be implemented in software, firmware, hardware, or a combination of software, firmware and hardware.
[0090] The ONE KEYBOARD employs a projection system to dynamically adapt a layout of keys to the user's application. If the user is typing a document, the projected layout of keys conforms to a standard keyboard, and switches between languages, mathematical symbols and graphics, as needed. For a user who uses more than one language, the projected layout of keys includes keys in any language, and further includes a “Translate” button that enables the user to type in one language and have it translated to another language. There are hundreds of keyboard layouts being used throughout the world today, any of which may be projected on the ONE KEYBOARD, and the projected keys may be a single color, or may be color-coded, or may be any other design. When the user is working on a photo-book, for example, with a website such as SHUTTERFLY, owned by Shutterfly, Inc. of Redwood City, Calif., the ONE KEYBOARD projects a section that shows inter alia a “Page Layout” button, a “Background” button, an “Add-a-Page” button, a “Theme” button, and a “Color” button. When the user adds photos to the book, the ONE KEYBOARD projects a section that shows inter alia an “Add Photos from Computer” button, an “Add Photos from Shutterfly” button, and an “Add Photos from Instagram” button. There are icons of photos, text and other objects that the user may drag into his book. The user may use gestures to edit a photo, resize the photo, change the contrast, brightness, hue, saturation, and make other adjustments. When the user switches between applications, such as working on a document, and then opening an Internet browser to look something up for the document, the keyboard switches between modes optimized for each application, and produces a set of custom buttons such as “Copy”, “Paste” and “Create Hyperlink”, to facilitate the interaction between applications. 
The keyboard works in a highly synchronized fashion with the user, creating the correct keys and icons for each application, and eliminating the need for hundreds of mouse clicks. If authentication is needed, the collected biometric data is used to verify the identity of the user using an external biometric analyzer.
[0091] As described below, there are inter alia four embodiments of projection systems: (1) micro-LED array applied to the under-surface of the keyboard, (2) projection system applied to a bar across the keyboard, (3) projection system that projects onto the surface from underneath the keyboard, and (4) touchscreen system.
[0092] Reference is made to
[0093] A pattern of keys is projected onto silicone surface 1210 by a micro LED array 1240, underneath acrylic layer 1230.
[0094] A controller (not shown) receives user input in conformance with the projected layout of keys, and generates a time series of touch location and touch pressure data therefrom. The touch location and pressure data may be used inter alia by a keystroke biometric analyzer, as explained below.
[0095] Reference is made to
[0096] Reference is made to
[0097] Reference is made to
[0098] Two or more of the projection systems of
[0099] One of the central features of the ONE KEYBOARD is that it uses behavioral biometrics to learn the touch patterns of every individual, via a process termed “keystroke dynamics”. This is a security feature to supplement conventional means of authentication, such as username and password, and may also be used as a form of error correction. At the basic level, the device registers the behavioral data associated with each time the user touches the keyboard. Over a short period of time, the data gathered creates a “behavioral profile” for the user. The behavioral profile is the set of numeric and categorical features that describe the keyboard usage history, including key press and release timings, pressure, acoustics, keyboard motion, and the shape of the finger mapped onto a two-dimensional space. From the behavioral profile, biometric information can be extracted to create a “biometric template”. The biometric template is a reduced set of features that are highly reproducible and specific for each individual. E.g., the pressure and finger shape may be used to uniquely identify a particular user with high probability. Some of the variations in pressure and finger shape between different users can be attributed to (a) physical traits of the user, such as finger size and strength, (b) the distance from the center of the keyboard, and (c) the user's typing proficiency. Once created, the template is a valuable part of the system. The biometric template may be used to grant and restrict access to a system, identify the keyboard owner, and generate a unique encryption key that can be used to digitally sign documents.
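By way of illustration, reducing a behavioral profile to a compact biometric template may be sketched as follows. The event field names and the particular three features chosen are illustrative assumptions; a real template would include many more features, such as key-release timings, acoustics, and keyboard motion:

```python
from statistics import mean

def biometric_template(events):
    """Reduce a behavioral profile to a small biometric template.

    events: list of dicts with 't_down', 't_up', 'pressure', and
    'finger_area' keys (illustrative field names). Dwell time,
    pressure, and finger shape are among the features named in the
    description as highly reproducible and user-specific.
    """
    dwell = [e["t_up"] - e["t_down"] for e in events]
    return {
        "mean_dwell": mean(dwell),
        "mean_pressure": mean(e["pressure"] for e in events),
        "mean_finger_area": mean(e["finger_area"] for e in events),
    }
```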
[0100] Reference is made to
[0101] Biometric identifier 1610 generates and stores the user's biometric template, which represents the unique and identifiable behavior attributes of the user. As the user's behavior changes over time, such as due to increased typing proficiency, typing impairments, and changes in physiology, the biometric template is updated. This is necessary so as to mitigate the “template aging effect”, a phenomenon encountered in biometrics in which the user's biometric template becomes less effective over time. Biometric learning machine 1640 implements an online learning mechanism to adapt to these changes in the user's behavior and to respond robustly to changes in the environment, such as keyboard positioning and ambient noises or vibrations.
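The online template update that mitigates the template aging effect may be sketched as an exponential moving average over template features. The learning rate and the dict-of-features representation are illustrative assumptions, not the disclosed learning mechanism:

```python
def update_template(template, new_sample, alpha=0.1):
    """Drift each stored template feature toward the newest measurement,
    so the template tracks gradual changes in the user's behavior.

    template, new_sample: dicts mapping feature names to values.
    alpha: illustrative learning rate (0 = never adapt, 1 = replace).
    """
    return {k: (1 - alpha) * v + alpha * new_sample[k]
            for k, v in template.items()}
```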
[0102] Biometric authenticator 1620 operates in two modes: static and continuous. In static mode, an authentication or identification decision is made at discrete points in time using all available information up to that point, such as at the beginning of a session or when logging into a website. In continuous mode, an authentication or identification decision is made continuously as the user interacts with the keyboard; for authentication decisions in this mode, the biometric authenticator chooses either to allow the session to continue, deeming the user genuine, or to block the user from the session, deeming the user an impostor. For identification decisions in this mode, the keyboard continuously recognizes the identity of the active user, such as in a shared multi-user environment.
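Continuous-mode authentication may be sketched as follows, assuming an enrolled template with a mean dwell-time feature. The single-feature check, tolerance, and strike count are illustrative simplifications of a real biometric decision process:

```python
def continuous_authenticate(template, events, tolerance=0.25, max_strikes=3):
    """Sketch of continuous-mode authentication: each keystroke's dwell
    time is checked against the enrolled mean; several consecutive
    outliers deem the user an impostor and block the session.

    template: dict with a 'mean_dwell' feature (illustrative).
    events: list of dicts with 't_down' and 't_up' timestamps.
    Returns "genuine" or "impostor".
    """
    strikes = 0
    for e in events:
        dwell = e["t_up"] - e["t_down"]
        expected = template["mean_dwell"]
        if abs(dwell - expected) > tolerance * expected:
            strikes += 1
            if strikes >= max_strikes:
                return "impostor"   # block the session
        else:
            strikes = 0             # a well-matched keystroke resets the count
    return "genuine"
```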
[0103] “Affective computing” is a field of study that aims to build systems capable of detecting and responding to the user's affect, or emotional state. In a desktop or laptop environment, the ability to detect the user's affective state can enable a more integrated and productive environment. Biometric behavioral analyzer 1630 recognizes the affective state of the user from the recorded behavior profile in order to provide a more robust and dependable computing environment. Affective states and possible responses include inter alia:
TABLE-US-00002

Affective State  Response
stress           dynamically adjusting the keyboard interface to decrease user workload and increase productivity
frustration      dynamically optimizing the keyboard interface to reduce errors
confusion        dynamically adjusting the keyboard interface to provide assistance and additional relevant information
[0104] With the user's behavioral profile, the ONE KEYBOARD improves workflow. Based on patterns gleaned from the user's experience, the ONE KEYBOARD corrects common mistakes made repeatedly by the user, and suggests moving the position or layout of various aspects of the keyboard for improved comfort. By determining the size of a user's fingers, and the type and number of errors made by the user, the ONE KEYBOARD suggests changes in the layout of keys that can improve the user's experience. E.g., a larger set of keys may be more efficient for certain users. On a wider scale, a company may utilize aggregated behavioral profile data to identify patterns among large numbers of employees that might be slowing productivity. A cloud-based system, when applied to the user profile data, determines ways to improve workflow in a widely used program, for example, such as PHOTOSHOP®, developed and marketed by Adobe Systems of San Jose, Calif. Software developers may desire the ability to study aggregated behavioral data in order to improve the development of their next generation of applications.
[0105] The ONE KEYBOARD is designed to accommodate users who have difficulty typing on a standard keyboard. This includes users who suffer from Parkinson's disease, traumatic brain injury (TBI), or another physical injury that makes standard keyboards inaccessible. These ailments may result in uncontrollable movement of the hand, restricted motor capability in the hand or wrist, or inability to strike certain keys due to loss of appendages. Embodiments of the present invention include a method that adapts and optimizes the layout of the keyboard based on the user's physical condition. The sensitivity of the keyboard is automatically or manually adjusted to avoid spurious input and to adapt to the user's style of typing. Key sizes and positions are adjusted automatically or manually to reduce the amount of hand motion required and the amount of discomfort experienced by the user, as well as to reduce typing errors.
[0106] The automatic function is performed by an accessibility module that either runs on the computing device of the ONE KEYBOARD or is remotely accessible through the ONE KEYBOARD driver software. The accessibility module continuously monitors keyboard usage, including typing errors, identification of the appendage (finger, palm, knuckle) used to provide input, position of the appendage on the keyboard surface, and pressure applied to the keyboard surface. The accessibility module infers the user's intent during typing, such as the key the user intended to press, the word the user intended to type, or the shortcut action the user intended to perform within an application. These inferences are made based on: the application context, for example the likelihood of a copy action (typically invoked by Ctrl-C on QWERTY keyboard layouts) occurring following the selection of text with the mouse pointer; natural language constraints, including word spelling, grammar rules, sentence structure, and word frequency; and the user's typing history, including the user's style of writing and the frequent typing of personal information, such as name, address, phone number, email address, and commonly typed phrases. The accessibility module provides shortcuts to quickly invoke the user's intent and adjusts the keyboard layout so as to minimize typing errors, decrease hand and finger motion, and avoid uncomfortable or impossible hand positions. For some users, pressing the keyboard surface may itself be a difficult action to perform. For these users, the ONE KEYBOARD supports alternate input methods that eliminate such an action, such as the ability to recognize hand-drawn symbols drawn on the keyboard surface or the mapping of gestures, such as sliding a closed fist in a leftward direction, to keys or key sequences.
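The intent inference described above may be sketched as scoring each candidate key by its distance from the touch point, weighted by a language prior. The scoring weights and data structures are illustrative assumptions, not the disclosed inference mechanism:

```python
def infer_intended_key(touch_xy, key_centers, language_prior):
    """Sketch of intent inference: combine spatial evidence (how close
    the touch is to each key's center) with natural-language evidence
    (how likely each character is given the words typed so far).

    key_centers: dict mapping key labels to (x, y) centers.
    language_prior: dict mapping key labels to likelihoods.
    Returns the most plausible intended key.
    """
    best_key, best_score = None, float("-inf")
    for key, (kx, ky) in key_centers.items():
        dist = ((touch_xy[0] - kx) ** 2 + (touch_xy[1] - ky) ** 2) ** 0.5
        # Illustrative linear trade-off between prior and distance.
        score = language_prior.get(key, 0.01) - 0.5 * dist
        if score > best_score:
            best_key, best_score = key, score
    return best_key
```

For an ambiguous touch midway between two keys, the language prior breaks the tie, which is how the module can correct a press that landed between the intended key and its neighbor.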
[0107] Reference is made to
[0108] Reference is made to
[0109] Security is another major component of the ONE KEYBOARD. With the user's biometric template, the ONE KEYBOARD quickly detects when someone other than an authorized user is trying to access the system. Within a few lines of typing, the biometric template of the typist discriminates between an authorized user and an intruder. Companies interested in network security use this as a means of ensuring that only the correct user accesses each device. Common examples of this are on-line courses and test-taking, on-line e-commerce, and social networking companies who wish to prevent on-line bullying by “anonymous” users. Once the ONE KEYBOARD is attached to a computer, the driver can prevent anyone from detaching it and attempting to access the computer with a non-biometric keyboard. Currently, many behavioral biometric programs not only reject intruders, but they actually identify the intruder by their own behavioral biometric template. The time series generated by biometric analyzer 1600 is unique to each user and may be utilized as a digital identity.
[0110] The ONE KEYBOARD includes a cryptographic module 2300 (
[0113] Protecting the user's privacy is another major feature of the ONE KEYBOARD. Keystroke dynamics is a technique that can be used to legitimately identify and authenticate a user by a trusted application, such as when logging into a secure banking website or during an online course. However, there are numerous scenarios in which a user's keystroke dynamics are exposed to an untrusted application. This dilemma is often encountered in web applications, whereby a single webpage may load dozens of third party modules that provide functionality through an external application programming interface (API). Given the lack of special permissions required to capture keyboard events in modern web browsers, an untrusted web application or third-party module can passively record the user's key press and release timings and use this information to track the user's identity. This presents a privacy concern since a malicious application can perform user identification and verification remotely via keystroke dynamics without the user's cooperation or knowledge. Since this type of attack relies only on the user's typing behavior, the user's identity may be compromised even when accessing the web application through an anonymizing network, such as The Onion Router (TOR). From this perspective, keystroke dynamics represents a form of “behavioral tracking”, the process by which an advertiser or other third party is able to track user identity and demographics based on his online activity.
[0114] Obfuscation module 2000 (
[0115] There are numerous legitimate uses of keystroke dynamics employed by trusted applications, and the ONE KEYBOARD may preserve the intended functionality of these applications. Such behavioral biometric services are provided by companies including TypingDNA, Behaviosec, and KeyTrac. The ONE KEYBOARD is compatible with all of these applications, provided they are trusted by the user. This functionality is provided through an application-specific permissions mechanism, whereby the user may choose to trust certain applications, granting them access to the user's un-obfuscated keystroke timings, while allowing other untrusted applications access only to the obfuscated keystroke timings.
[0116] Using the ONE KEYBOARD and its associated methodology, on-line learning sites such as Coursera of Mountain View, Calif., and Khan Academy of New York, N.Y., testing companies such as The College Board of New York, N.Y., and ACT of Iowa City, Iowa, and any company seeking to verify/authenticate users who are accessing their systems via a remote connection, will increase the security of their systems dramatically.
[0117] Reference is made to the flowchart of operations 1950-1990, described below.
[0118] At operation 1950 obfuscation module 2000 obfuscates the user's key press and release timings, to prevent suspicious applications from recording the user's key press and release timings and tracking the user's identity. Obfuscation of key press and release timings may be performed inter alia by temporarily buffering the key press and release events, thereby introducing buffer duration errors into the timings.
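The buffering approach of operation 1950 can be sketched as follows: events are held back and released only on fixed tick boundaries, so an observer sees quantized timings that carry buffer-duration error rather than the user's true rhythm. The function name and tick size are illustrative assumptions.

```python
def buffer_obfuscate(event_times, tick_ms=50):
    """Sketch of buffer-based timing obfuscation: delay each key
    event to the next multiple of tick_ms. Characters still arrive
    in order, and only slightly late, but inter-key intervals are
    quantized and no longer identify the user."""
    released = []
    for t in event_times:
        # Smallest tick boundary strictly after t.
        released.append(((t // tick_ms) + 1) * tick_ms)
    return released

# Raw presses at 12, 95 and 130 ms are released at 50, 100, 150 ms.
```

The tick size trades privacy against latency: a coarser tick destroys more of the timing signal but adds more perceptible input lag.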
[0119] At operation 1960 biometric authenticator 1620 grants trusted applications access to the user's un-obfuscated key press and release timings. At operation 1970 biometric authenticator 1620 makes continuous decisions as to the identity and authenticity of the user, based on his biometric template. At operation 1980 biometric behavioral analyzer 1630 responds to the user's affective state by instructing projection system 140 to update the layout of keyboard 100. At operation 1990 biometric learning machine 1640 adaptively updates the user's biometric template to compensate for template aging and changes in the environment.
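One simple way to realize the continuous verification of operation 1970 is to compare a sliding window of recent dwell times against the enrolled template. The z-score test, template format, and threshold below are illustrative assumptions, not the disclosed matching algorithm.

```python
import statistics

def matches_template(sample, template_mean, template_std, threshold=2.0):
    """Sketch of continuous verification: accept the user if the
    mean dwell time of the recent sample lies within `threshold`
    standard deviations of the enrolled template."""
    if template_std == 0:
        return statistics.mean(sample) == template_mean
    z = abs(statistics.mean(sample) - template_mean) / template_std
    return z <= threshold

# An enrolled user who holds keys for ~90 ms (std 10 ms) matches a
# fresh sample of 85-95 ms dwell times, but not one near 155 ms.
```

Running such a check continuously, rather than only at login, is what lets the system detect a different person taking over the keyboard mid-session.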
[0120] An important component of the ONE KEYBOARD is software device driver 1000 for the keyboard, shown in the accompanying drawings.
[0121] Biometric learning machine 1640 uses a biometric learning algorithm. This algorithm collects data from the keyboard and then utilizes that data to "learn" from each user's experiences. Typing mistakes tend to be repetitive, such as touching a certain key too lightly, or misspelling specific words by inverting letters. If a user misspells a word repeatedly, the algorithm determines whether the error is due to incomplete activation of a key, or due to another error such as inversion of letters. It then maintains a file of these learned experiences for each user and compensates for them, so that the user experiences an error-free interaction. Preferably, the learning algorithm is separate from the ONE KEYBOARD. At present, there are numerous commercial entities utilizing biometric data. The ONE KEYBOARD is compatible with all of these applications.
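The error-classification step can be sketched as follows: given the intended word and what was actually produced, decide whether the mistake looks like a dropped key (incomplete activation) or an inversion of adjacent letters. The two-category model is an illustrative simplification of the learning algorithm's diagnosis.

```python
def classify_error(intended, typed):
    """Sketch: classify a repeated misspelling as a dropped key,
    an adjacent-letter inversion, or some other error."""
    if len(typed) == len(intended) - 1:
        # One character missing: likely a key that did not register.
        for i in range(len(intended)):
            if intended[:i] + intended[i + 1:] == typed:
                return ("dropped_key", intended[i])
    if len(typed) == len(intended):
        # Same length: look for exactly one adjacent swap.
        diffs = [i for i in range(len(typed)) if typed[i] != intended[i]]
        if (len(diffs) == 2 and diffs[1] == diffs[0] + 1
                and typed[diffs[0]] == intended[diffs[1]]
                and typed[diffs[1]] == intended[diffs[0]]):
            return ("inversion", intended[diffs[0]:diffs[0] + 2])
    return ("other", None)

# classify_error("keyboard", "keybard")  -> ("dropped_key", "o")
# classify_error("keyboard", "kyeboard") -> ("inversion", "ey")
```

A dropped-key diagnosis would feed back into the pressure-sensitivity adjustment, while an inversion diagnosis would feed the spelling-compensation file.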
[0122] Over time, biometric learning machine 1640 determines which applications a user uses most of the time. The universal keyboard suggests optimal keyboard layouts, based on the applications used most of the time, which enable a user to decrease his number of keystrokes, and improve his efficiency and experience.
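The layout-suggestion step reduces, at its simplest, to tallying application usage and mapping the dominant application to a layout. The application-to-layout table below is a hypothetical example, not part of the disclosure.

```python
from collections import Counter

# Hypothetical mapping from application category to keyboard layout.
LAYOUTS = {
    "spreadsheet": "numeric-pad",
    "code-editor": "programmer",
    "chat": "emoji",
}

def suggest_layout(app_history):
    """Sketch: suggest the layout for the most frequently used
    application, falling back to a standard layout."""
    top_app, _ = Counter(app_history).most_common(1)[0]
    return LAYOUTS.get(top_app, "standard")
```

A user who spends most sessions in a spreadsheet would thus be offered a numeric-pad-centric layout, cutting the keystrokes needed for data entry.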
[0123] The ONE KEYBOARD comes with a device driver. In addition, there is a small program that allows the user to choose from standard keyboard layouts, or design his own custom layout, using a simple graphical interface. There is an error-correcting program that corrects typing errors, similar to SWIFTKEY®, developed and manufactured by TouchType Limited of London, UK. There is an optional cloud-based service that includes enhanced "learning" from the user's experiences, and security systems that ensure that each user matches his biometric security profile.
[0124] The ONE KEYBOARD is the most innovative change to human-computer interaction (HCI, http://en.wikipedia.org/wiki/Human%E2%80%93computer_interaction) with desktop and laptop computers in the past decade, and is the last keyboard anyone will ever need to buy.
[0125] One having the benefit of the subject disclosure will appreciate that there are many variations of the keyboard of the subject invention. The present invention may be embodied in applications for cellular phones, including inter alia the IPHONE® and IPAD® manufactured by Apple Inc. of Cupertino, Calif., and the ANDROID™ phones manufactured by Samsung Electronics Co., Ltd. of Korea, using built-in technology of the phones to collect biometric data.
[0126] Furthermore, add-on components to the ONE KEYBOARD device driver make use of the behavioral data collected during operation. These components inter alia detect fatigue and stress, detect mental states and/or moods, and diagnose physical ailments such as arthritis and Parkinson's disease. As such, the ONE KEYBOARD may be used by qualified medical professionals. Alternatively, or additionally, such information may be used to determine when a person may be more likely persuaded by a particular type of advertisement.
[0127] In the foregoing specification, the invention has been described with reference to specific exemplary embodiments thereof. It will, however, be evident that various modifications and changes may be made to the specific exemplary embodiments without departing from the broader spirit and scope of the invention as set forth in the appended claims. Accordingly, the specification and drawings are to be regarded in an illustrative rather than a restrictive sense.