G09B21/04

Ensuring that computer programs are accessible to users with disabilities, such as for use with mobile phones
11550702 · 2023-01-10

Disclosed here is a system to enable interaction between a user with a disability and a computer program. The system can obtain a representation of a user interface to present to a user. The system can determine an element associated with the user interface, where the element is configured to provide information to the user but the user interface presentation of the element at least partially fails to do so. Based on the element, the system can determine an appropriate test to perform. The appropriate test indicates at least two of: a test to perform with a keyboard, a gesture test to perform with a mobile screen reader, and an audio test to perform with a screen reader. The system can generate an indication of the appropriate test. The system can provide the indication of the appropriate test prior to releasing the user interface to the user.
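The test-selection step in this abstract can be sketched as a small mapping from element properties to the three test kinds it names. The element attributes (`interactive`, `missing_label`, `touch_target`) and the test labels below are illustrative assumptions, not taken from the patent:

```python
# Hypothetical sketch: choosing accessibility tests for a UI element whose
# presentation fails to convey its information. Attribute names are assumed.

KEYBOARD_TEST = "keyboard navigation test"
GESTURE_TEST = "mobile screen reader gesture test"
AUDIO_TEST = "screen reader audio test"

def determine_appropriate_tests(element: dict) -> list:
    """Return the tests indicated for one failing element."""
    tests = []
    if element.get("interactive"):
        tests.append(KEYBOARD_TEST)   # element must be reachable and operable without a mouse
    if element.get("missing_label"):
        tests.append(AUDIO_TEST)      # a screen reader has nothing to announce
    if element.get("touch_target"):
        tests.append(GESTURE_TEST)    # swipe navigation must be able to reach the element
    if len(tests) < 2:                # the abstract requires indicating at least two tests
        tests = [KEYBOARD_TEST, AUDIO_TEST]
    return tests

button = {"interactive": True, "missing_label": True, "touch_target": False}
print(determine_appropriate_tests(button))
```

The indication returned here would be surfaced to developers before the interface is released, matching the final step the abstract describes.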

Self-centering user interface for inputting information

Techniques described herein are directed to, among other things, utilizing a self-centering user interface to receive information associated with a transaction. For instance, a computing device may receive a first input at a first location of a display. The computing device may then determine a positioning for the user interface, where the user interface may be substantially centered about the first location. In some instances, the computing device may display the user interface using the positioning. The computing device may then receive a second input corresponding to a swipe from the first location of the display to a second location of the display. The computing device may then determine a symbol included in the user interface based at least in part on the second input. In some instances, the user interface includes a keypad for entering a personal identification number associated with a payment instrument.
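The centering and swipe-to-symbol steps can be sketched as coordinate arithmetic: lay the keypad grid out around the first touch point, then map the swipe's end point to a grid cell. The 3×4 layout and 60-pixel key size are assumptions for illustration, not details from the patent:

```python
# Hypothetical sketch of the self-centering keypad. The keypad is centered
# on the first touch; a swipe end point selects a symbol.

KEY_SIZE = 60  # pixels per key cell (assumed)
LAYOUT = [["1", "2", "3"],
          ["4", "5", "6"],
          ["7", "8", "9"],
          ["",  "0", ""]]

def symbol_from_swipe(first, second):
    """first: touch that centers the keypad; second: end point of the swipe."""
    cx, cy = first
    # center the 3-column, 4-row grid on the first touch point
    origin_x = cx - 1.5 * KEY_SIZE
    origin_y = cy - 2.0 * KEY_SIZE
    col = int((second[0] - origin_x) // KEY_SIZE)
    row = int((second[1] - origin_y) // KEY_SIZE)
    if 0 <= row < 4 and 0 <= col < 3:
        return LAYOUT[row][col] or None
    return None

# a swipe from the first touch up and to the left lands on "1"
print(symbol_from_swipe((200, 300), (130, 190)))  # → 1
```

Because the grid is positioned relative to the first touch rather than the screen, a user who cannot see the display can enter a PIN with the same relative swipes regardless of where their finger first lands.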

SMART SEAMLESS SIGN LANGUAGE CONVERSATION DEVICE

An approach is disclosed that is performed by a pair of smart glasses worn by a user, the smart glasses including an information handling system with a processor and a memory. The approach receives input cues at input components of the smart glasses. The input components include a digital camera that is included in the smart glasses and accessible by the processor, and a microphone that is likewise included in the smart glasses and accessible by the processor. The received cues are analyzed, resulting in one or more output cues focused on assisting the user. These output cues are transmitted through one or more output components of the smart glasses. One of the output components is a display on the inside of a lens included in the smart glasses.
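The cue pipeline described above can be sketched as a routing function from input cues to output cues. All class names, cue contents, and routing rules here are assumptions made for illustration only:

```python
# Hypothetical sketch: input cues from the camera and microphone are
# analyzed into output cues routed to output components such as the
# in-lens display.

from dataclasses import dataclass

@dataclass
class Cue:
    source: str   # "camera" or "microphone" (assumed)
    content: str

def analyze(cues):
    """Turn raw input cues into output cues focused on assisting the user."""
    outputs = []
    for cue in cues:
        if cue.source == "camera" and cue.content == "sign_language":
            # a signed phrase seen by the camera becomes text on the lens
            outputs.append(("display", "translated text of the signed phrase"))
        elif cue.source == "microphone":
            # speech heard by the microphone becomes a caption on the lens
            outputs.append(("display", "caption: " + cue.content))
    return outputs

cues = [Cue("camera", "sign_language"), Cue("microphone", "hello there")]
for component, payload in analyze(cues):
    print(component, "->", payload)
```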

AUTOMATIC TRANSLATION BETWEEN SIGN LANGUAGE AND SPOKEN LANGUAGE

Methods, apparatus, systems, and articles of manufacture to translate between sign language and spoken language are disclosed. An example apparatus includes processor circuitry to at least one of instantiate or execute machine readable instructions to identify a plurality of candidate signs across different frames in video; associate a respective gloss to respective ones of the candidate signs; associate a respective confidence score with the respective glosses; identify overlapping frames of the candidate signs; select one or more of the candidate signs as performed signs based on the respective confidence scores and overlapping frames; and convert the performed signs to audio data.
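The selection step in this abstract can be sketched as a greedy pick over candidates: where two candidates' frame ranges overlap, keep the one with the higher confidence score. The data shapes and scores below are assumptions for illustration:

```python
# Hypothetical sketch of sign selection: each candidate carries a gloss,
# a confidence score, and a (start, end) frame range.

def frames_overlap(a, b):
    return a[0] <= b[1] and b[0] <= a[1]

def select_performed_signs(candidates):
    """Keep the highest-confidence candidate wherever frame ranges overlap."""
    selected = []
    # consider the strongest candidates first
    for cand in sorted(candidates, key=lambda c: c["score"], reverse=True):
        if not any(frames_overlap(cand["frames"], s["frames"]) for s in selected):
            selected.append(cand)
    # return in temporal order for downstream conversion to audio
    return sorted(selected, key=lambda c: c["frames"][0])

candidates = [
    {"gloss": "HELLO", "score": 0.91, "frames": (0, 14)},
    {"gloss": "HOW",   "score": 0.55, "frames": (10, 22)},  # overlaps HELLO, lower score
    {"gloss": "YOU",   "score": 0.80, "frames": (23, 35)},
]
print([c["gloss"] for c in select_performed_signs(candidates)])  # → ['HELLO', 'YOU']
```

The surviving glosses would then be handed to a text-to-speech stage to produce the audio data the abstract mentions.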

Method and device for reading, writing, and communication by deafblind users
11475793 · 2022-10-18

A method and device for reading, writing, and communication by deafblind users is provided to enable such exemplary functions as word processing, text messaging, Internet access, and telephonic communication. By combining a chordic keyboard for user input with a self-scrolling Braille pad for reading Braille, embodiments of the invention enable the user's hands to stay in place on a user console rather than having to constantly switch back and forth between typing messages versus reading or checking for messages. This in turn enables duplex communication because the user can read or acknowledge incoming messages even while typing. It also reduces the dynamic complexity experienced in reading Braille because a body part used for reading Braille can remain constantly available for receiving messages simply by resting in place on the self-scrolling Braille pad without any swiping, thereby eliminating swiping gestures and the problem of timing them with the receipt of messages.
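The chordic-input half of this design can be sketched as a lookup from a set of simultaneously pressed keys to a character. The key names and chord table below are illustrative assumptions, not the patent's actual layout:

```python
# Hypothetical sketch of chordic input: a chord is the set of keys pressed
# together, and each chord maps to one character.

CHORD_TABLE = {
    frozenset({"index"}):                   "e",
    frozenset({"middle"}):                  "t",
    frozenset({"index", "middle"}):         "a",
    frozenset({"index", "ring"}):           "o",
    frozenset({"index", "middle", "ring"}): "n",
}

def decode_chord(pressed_keys):
    """Translate one simultaneous key press (a chord) into a character."""
    return CHORD_TABLE.get(frozenset(pressed_keys))

# typing with chords keeps the hands on the console, leaving the reading
# fingers free to rest on the self-scrolling Braille pad
word = [decode_chord(c) for c in [{"middle"}, {"index", "middle"}]]
print("".join(word))  # → "ta"
```

Because chords are decoded without the fingers leaving the console, incoming Braille can be read concurrently, which is the duplex-communication property the abstract emphasizes.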
