G06F9/453

ASSISTIVE TECHNOLOGY NOTIFICATIONS FOR RELEVANT METADATA CHANGES IN A DOCUMENT

User interface information related to relevant events of interest is provided. Events can occur anywhere in a document, and may or may not be relevant to a user utilizing an assistive technology (AT) application, such as a screen reader. A provider-side signaling system component determines whether raised events are relevant to the user. In some examples, when an application makes a plurality of attribute changes in a document at once, the signaling provider batches the related events as a single transaction, and generates a generalized annotation describing the changes. The signaling provider further packages the event notification, and sends the event notification to a client-side signaling system component. The signaling client receives the notification, and determines whether to alert the user of the event(s) based on verbosity settings. The AT application is enabled to interpret the event notification and alert the user in a meaningful way.
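The batching and verbosity-filtering behavior described in this abstract can be sketched in a few lines. This is a minimal illustration, not the patented implementation; the names `EventNotification`, `batch_events`, and `should_alert`, and the 0/1/2 verbosity scale, are assumptions made for the example.

```python
from dataclasses import dataclass, field

@dataclass
class EventNotification:
    """A batch of related attribute-change events sent as one transaction."""
    annotation: str                    # generalized description of the changes
    events: list = field(default_factory=list)

def batch_events(events, annotate):
    """Provider side: package related events into a single notification."""
    return EventNotification(annotation=annotate(events), events=list(events))

def should_alert(notification, verbosity):
    """Client side: decide whether to alert the user, per verbosity settings.

    Assumed scale: 0 = silent, 1 = batched summaries only, 2 = everything.
    """
    if verbosity == 0:
        return False
    if verbosity == 1:
        return len(notification.events) > 1   # only summarize bulk changes
    return True

changes = [("bold", True), ("italic", True), ("color", "red")]
note = batch_events(changes, lambda evs: f"{len(evs)} formatting changes")
print(note.annotation)          # "3 formatting changes"
print(should_alert(note, 1))    # True
```

The key point the abstract makes is that the AT application receives one annotated transaction rather than three raw attribute-change events.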

USER ASSISTANCE SYSTEM OF A REPROCESSING APPARATUS
20180011722 · 2018-01-11


A user assistance system of a reprocessing apparatus for cleaning and disinfecting at least one surgical instrument arranged in a cleaning basket, the user assistance system including: at least one electronically controlled display device; and a controller coupled with the at least one electronically controlled display device, the at least one electronically controlled display device being integrated in a loading station and being configured for arranging the cleaning basket on one side of the display device during the loading of the cleaning basket, the controller being configured to control the display device such that image information is displayed in an actual size and position which indicates a predetermined arrangement within the cleaning basket of the surgical instrument that is to be reprocessed.

System for facilitating advanced coding to individuals with limited dexterity

A method and system for improving the accessibility of advanced coding for individuals with limited dexterity, using a personalized screen touch option, matched to the user's functional mobility ability, as a replacement for standard typing. The system and method involve a touchscreen coding platform and a personalized screen touch option corresponding to the user's functional mobility ability, which allows the user to write in their preferred coding language at a professional level. All actions on the coding platform may be performed with finger taps, allowing individuals with limited dexterity to perform complex coding.

VOICE COMMAND-DRIVEN DATABASE
20180011685 · 2018-01-11

A voice command-driven system and computer-implemented method are disclosed for selecting a data item in a list of text-based data items stored in a database using a simple affirmative voice command input without utilizing a connection to a network. The text-based data items in the list are converted to speech using an embedded text-to-speech engine and an audio output of a first converted data item is provided. A listening state is entered into for a predefined pause time to await receipt of the simple affirmative voice command input. If the simple affirmative voice command input is received during the predefined pause time, the first converted data item is selected for processing. If the simple affirmative voice command input is not received during the predefined pause time, an audio output of a next converted data item in the list is provided.
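The speak-pause-select loop in this abstract can be sketched as follows. This is a simplified illustration under stated assumptions: `heard(timeout)` stands in for the embedded speech recognizer, `speak` for the embedded text-to-speech engine, and "yes" for the simple affirmative command; none of these names come from the patent.

```python
def select_item(items, heard, pause_time=2.0, speak=print):
    """Speak each list item, then enter a listening state for up to
    `pause_time` seconds.  If the affirmative command arrives during
    the pause, select that item; otherwise move on to the next one.
    """
    for item in items:
        speak(item)                       # audio output via text-to-speech
        if heard(pause_time) == "yes":    # affirmative received in time
            return item                   # selected for processing
    return None                           # list exhausted, nothing selected

# Simulated recognizer: user stays silent twice, then says "yes".
replies = iter([None, None, "yes"])
picked = select_item(["alpha", "bravo", "charlie", "delta"],
                     heard=lambda timeout: next(replies))
print(picked)  # "charlie"
```

Because only a single affirmative command must be recognized, the whole loop can run on an embedded recognizer with no network connection, which is the point of the claim.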

INFORMATION PROCESSING APPARATUS, POSITION INFORMATION GENERATION METHOD, AND INFORMATION PROCESSING SYSTEM
20180011543 · 2018-01-11

An information processing apparatus includes a memory storing a program and at least one processor that executes the program to implement processes of detecting a speed of motion of a user based on motion information relating to a motion of the user that is detected by a detection device, and generating position information of a position indication display information item, which is displayed on a display device and indicates a position designated by the user, based on the motion information relating to the motion of the user. The position information of the position indication display information item is generated by restricting a moving direction of the position indication display information item to a predetermined direction when the detected speed of motion of the user does not meet a predetermined speed condition.
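The speed-gated direction restriction described above can be sketched briefly. The threshold value, the choice of the x axis as the predetermined direction, and the function name are all assumptions for illustration; the patent only specifies that movement is restricted to a predetermined direction when the speed condition is not met.

```python
import math

def next_pointer_position(pos, motion, speed_threshold=0.5,
                          restricted_axis="x"):
    """Move the position-indication display item by the detected motion.

    When the detected speed does not meet the threshold (the
    "predetermined speed condition"), movement is restricted to one
    predetermined direction, which steadies slow, deliberate pointing.
    """
    dx, dy = motion
    speed = math.hypot(dx, dy)
    if speed < speed_threshold:
        if restricted_axis == "x":
            dy = 0.0      # suppress vertical drift
        else:
            dx = 0.0      # suppress horizontal drift
    return (pos[0] + dx, pos[1] + dy)

print(next_pointer_position((0, 0), (0.1, 0.1)))  # slow motion: (0.1, 0.0)
print(next_pointer_position((0, 0), (2.0, 1.0)))  # fast motion: (2.0, 1.0)
```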

GESTURE-BASED USER INTERFACE
20180011544 · 2018-01-11

A computer-implemented method for enabling gesture-based interactions between a computer program and a user is disclosed. According to certain embodiments, the method may include initiating the computer program. The method may also include detecting that a condition has occurred. The method may also include activating a gesture-based operation mode of the computer program. The method may also include receiving gesture data generated by a sensor, the gesture data representing a gesture performed by the user. The method may further include performing a task based on the gesture data.
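The sequence of steps in this abstract (initiate, detect condition, activate mode, receive gesture data, perform task) maps naturally onto a small state machine. The class and method names below, and the mapping of gesture names to callables, are assumptions made for the sketch.

```python
class GestureController:
    """Minimal sketch: gesture-based operation is inert until a
    triggering condition activates it; afterwards, sensor-reported
    gestures are dispatched to tasks."""

    def __init__(self, tasks):
        self.tasks = tasks          # gesture name -> task callable
        self.gesture_mode = False

    def on_condition(self):
        """The detected condition activates the gesture-based mode."""
        self.gesture_mode = True

    def on_gesture_data(self, gesture):
        """Perform the task associated with the reported gesture,
        but only while gesture mode is active."""
        if not self.gesture_mode:
            return None
        task = self.tasks.get(gesture)
        return task() if task else None

ctl = GestureController({"swipe_left": lambda: "previous page"})
print(ctl.on_gesture_data("swipe_left"))  # None: mode not yet active
ctl.on_condition()
print(ctl.on_gesture_data("swipe_left"))  # "previous page"
```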

Machine learning analysis of user interface design

Techniques and solutions are described for improving user interfaces, such as by analyzing user interactions with a user interface with a machine learning component. The machine learning component can be trained with user interaction data that includes an interaction identifier and a timestamp. The identifiers and timestamps can be used to determine the duration of an interaction with a user interface element, as well as patterns of interactions. Training data can be used to establish baseline or threshold values or ranges for particular user interface elements or types of user interface elements. Test data can be obtained that includes identifiers and timestamps. The time taken to complete an interaction with a user interface element, and optionally an interaction pattern, can be analyzed. If the machine learning component determines that an interaction time or pattern is abnormal, various actions can be taken, such as providing a report or user interface guidance.
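The identifier-plus-timestamp analysis described above can be illustrated without a full machine learning component: pair start and end records by interaction identifier to get durations, then compare a test duration against a baseline established from training data. The event-tuple layout and the two-standard-deviation threshold are assumptions for the sketch; the patent's ML component could learn far richer patterns.

```python
from statistics import mean, stdev

def interaction_durations(events):
    """Pair start/end records by interaction identifier.
    Each event is (identifier, phase, timestamp)."""
    starts, durations = {}, {}
    for ident, phase, ts in events:
        if phase == "start":
            starts[ident] = ts
        elif phase == "end" and ident in starts:
            durations[ident] = ts - starts.pop(ident)
    return durations

def is_abnormal(duration, baseline, k=2.0):
    """Flag a test duration more than k sample standard deviations
    above the mean of the baseline training durations."""
    return duration > mean(baseline) + k * stdev(baseline)

training = [1.1, 0.9, 1.0, 1.2, 0.8]     # seconds per interaction (training)
d = interaction_durations([("field1", "start", 10.0), ("field1", "end", 14.5)])
print(d["field1"])                        # 4.5
print(is_abnormal(d["field1"], training)) # True -> e.g. offer UI guidance
```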

User interface development assistance device, user interface development assistance method, and non-transitory computer-readable recording medium
11709584 · 2023-07-25

A UI development assistance device (10) comprising a UI editing unit (11), an operation input unit (12), and a display unit (14). The UI editing unit (11) executes a UI editing process and generates a UI editing screen (140). The display unit (14) displays the UI editing screen (140). The operation input unit (12) receives operations pertaining to UI editing. When a plurality of overlapping UI objects are present on the UI editing screen (140), the UI editing unit (11) displays a list of the plurality of overlapping UI objects on the UI editing screen (140) in accordance with a prescribed operation by means of the operation input unit (12).

Software user assistance through image processing
11709691 · 2023-07-25

Software User Assistance (UA) is afforded from captured User Interface (UI) screen images, with reference to persisted Machine Learning (ML) models. The captured screen images are processed—e.g., using rasterization, Optical Character Recognition (OCR), and/or establishment of a coordinate system—with individual UI elements being determined therefrom. Referencing the persisted ML models, the software application/application state for the captured image is identified. UA data relevant to that application/application state is generated from the model, and then provided to the user (e.g., in a text box overlying the UI screen). Through the capture and processing of UI screen images, embodiments afford a homogeneous UA experience for installation, maintenance, and/or upgrade of heterogeneous members of a larger overall landscape, over software lifecycles. Embodiments may be deployed locally on a frontend computer, in order to avoid exporting UI images due to privacy and/or security concerns.
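The capture-recognize-assist pipeline in this abstract can be sketched as three composed stages. Real OCR and ML components are out of scope here, so `ocr`, `classify`, and `ua_lookup` are stand-in callables supplied by the caller; the function name and the returned tuple shape are likewise assumptions for the sketch.

```python
def assist_from_screenshot(image, ocr, classify, ua_lookup):
    """Sketch of the pipeline: extract UI elements (text plus
    coordinates) from the captured screen image, identify the
    application/application state via a persisted ML model, and
    return the relevant user-assistance text for that state."""
    elements = ocr(image)                 # [(text, (x, y)), ...]
    app_state = classify(elements)        # e.g. "installer:license-page"
    return app_state, ua_lookup(app_state)

state, help_text = assist_from_screenshot(
    image=b"...raw pixels...",
    ocr=lambda img: [("Next", (400, 560)), ("License", (40, 20))],
    classify=lambda els: "installer:license-page",
    ua_lookup={"installer:license-page":
               "Review the license, then press Next."}.get)
print(state, "->", help_text)
```

Running all three stages locally, as the abstract notes, means the raw screen image never has to leave the frontend computer.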

GENERATION OF AN INSTRUCTION GUIDE BASED ON A CURRENT HARDWARE CONFIGURATION OF A SYSTEM
20180011721 · 2018-01-11

Information identifying a current hardware configuration of a system may be received. Furthermore, information of a new hardware component that has not been installed may be received. A graphical user interface (GUI) may be provided with an option to install the new hardware component with the system. In response to a selection from the GUI of the option to install the new hardware component with the system, a plurality of actions to install the new hardware component with the current hardware configuration of the system may be determined. A guide may be generated based on the determined plurality of actions.
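The determine-actions-then-generate-guide step can be sketched as filtering a rule set against the current configuration. The rule table, the `done` field, and the numbered-step rendering are assumptions made for the example; the patent does not specify how the plurality of actions is represented.

```python
def plan_installation(current_config, new_component, rules):
    """Determine the actions needed to install the new component with
    the current hardware configuration, then render them as a numbered
    guide.  Actions already satisfied by the current configuration
    (listed under "done") are skipped."""
    needed = rules.get(new_component["type"], [])
    actions = [a for a in needed if a not in current_config["done"]]
    return "\n".join(f"{i}. {a}" for i, a in enumerate(actions, 1))

guide = plan_installation(
    current_config={"slots": ["PCIe x16"], "done": ["power off system"]},
    new_component={"type": "gpu"},
    rules={"gpu": ["power off system",
                   "open chassis",
                   "seat card in PCIe x16 slot",
                   "connect auxiliary power"]})
print(guide)
```

Because the guide is derived from the reported current configuration, two systems installing the same component can receive different step lists, which is the abstract's central idea.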