G09B21/006

NARRATIVE TEXT AND VOCAL COMPUTER GAME USER INTERFACE
20230196943 · 2023-06-22 ·

A narrative engine receives, from a user application providing a user interface via input and output devices of a computing device, object metadata descriptive of the content of the user interface. An augmented description of the user interface is generated, the augmented description including a description of surroundings in the user interface and a listing of actions that can be performed on the user interface. The augmented description is presented using the output devices. User input requesting one of the actions is processed, and the augmented description is updated based on the user input.
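The receive/describe/act loop in this abstract can be sketched as a small class. All names here (`NarrativeEngine`, `ObjectMetadata`, the description strings) are illustrative assumptions, not taken from the filing:

```python
from dataclasses import dataclass

@dataclass
class ObjectMetadata:
    """Metadata describing one UI element (hypothetical schema)."""
    name: str
    actions: list  # actions the user may request on this element

class NarrativeEngine:
    """Illustrative sketch of the receive/describe/act loop."""

    def __init__(self):
        self.objects = []

    def receive(self, metadata):
        """Accept object metadata from the user application."""
        self.objects = list(metadata)

    def augmented_description(self):
        """Describe the surroundings and list the available actions."""
        surroundings = ", ".join(o.name for o in self.objects)
        actions = [a for o in self.objects for a in o.actions]
        return f"You see: {surroundings}. Available actions: {', '.join(actions)}."

    def handle_input(self, requested_action):
        """Process a requested action, then return the updated description."""
        for obj in self.objects:
            if requested_action in obj.actions:
                obj.actions.remove(requested_action)
                return self.augmented_description()
        return "That action is not available."
```

In a real system the description would be rendered to speech through the device's output hardware; here it is returned as a string for clarity.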

Communication support device, communication support method, and computer-readable storage medium including program

A communication support device comprises an imaging unit, a counterpart detector, a distance measuring unit, an expression determination unit, a motion determination unit, and a voice output unit. The imaging unit captures an image of a surrounding environment of a user. The counterpart detector detects a predetermined counterpart in the captured image. The distance measuring unit measures a distance between the counterpart and the imaging unit based on the captured image. The expression determination unit determines a facial expression of the counterpart based on the captured image. The motion determination unit determines a motion of the counterpart based on the captured image. The voice output unit notifies the user, by voice, of identification information for identifying the counterpart when the distance measured by the distance measuring unit is an interaction distance of a first threshold or less. The voice output unit notifies the user, by voice, of the identification information together with at least one of facial expression information related to the facial expression determined by the expression determination unit and motion information related to the motion determined by the motion determination unit when the distance measured by the distance measuring unit is longer than the first threshold.
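The distance-dependent announcement reads as a simple threshold rule. A minimal sketch, with function name, units, and announcement strings assumed for illustration rather than taken from the filing:

```python
def compose_announcement(distance_m, first_threshold_m, identity,
                         expression=None, motion=None):
    """Choose what the voice output unit speaks, based on measured distance.

    At the interaction distance (<= first threshold): identity only.
    Beyond the first threshold: identity plus whichever of the
    expression/motion determinations are available.
    """
    if distance_m <= first_threshold_m:
        return identity
    extras = [info for info in (expression, motion) if info]
    return ", ".join([identity] + extras)
```

A text-to-speech stage would consume the returned string; the comma-joined format is purely a placeholder.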

Wearable device enablement for visually impaired user

A wearable device determines an objective based on analyzing a voice of a user, where the wearable device comprises one or more wearable sensors and one or more wearable actuators. The wearable device identifies objects by the one or more wearable sensors. The wearable device determines an action based on the objective and the identified objects and guides the user to achieve the objective by the one or more wearable actuators.
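The objective-to-action step can be sketched as a matching function. This is purely illustrative: a real device would use speech recognition and sensor fusion rather than string matching, and the field names are assumptions:

```python
def plan_action(objective, detected_objects):
    """Pick an action by matching the spoken objective against detected objects.

    detected_objects: list of {"label": str, "bearing_deg": float} dicts,
    a stand-in for whatever the wearable sensors actually report.
    """
    for obj in detected_objects:
        if obj["label"] in objective:
            # Guide the user toward the matching object via the actuators.
            return {"action": "guide_to", "target": obj["label"],
                    "bearing_deg": obj["bearing_deg"]}
    # No matching object in view: keep scanning.
    return {"action": "search", "target": None, "bearing_deg": None}
```

The returned bearing would drive the wearable actuators (e.g. vibration on the corresponding side) to steer the user.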

NAVIGATION SYSTEM
20170345338 · 2017-11-30 ·

The technology disclosed herein includes a navigation system for a visually impaired person to navigate a public restroom. The navigation system may include a plurality of fixtures, a plurality of installations, wherein each installation is electronically connected to a fixture, and an electronic device, the electronic device configured to receive auditory or tactile signals from each installation and produce a signal indicative of a fixture location. The plurality of fixtures may include a toilet, a urinal, a sink, a soap dispenser, and a hand drying apparatus.

Spatial weather map for the visually impaired

A method, computer system, and computer program product for providing spatial weather map data to the visually impaired are provided. The embodiments may include receiving a weather map image that contains weather information and a subject of interest, wherein the subject of interest comprises a user current location, a business location, or an asset location. The embodiments may also include receiving a request from a user for current-time, historical, or forecast data depicted in the weather map and a request for a geo-location of the subject of interest. The embodiments may further include generating sounds corresponding with the requested weather map data, wherein the generated sounds appear to the user to originate in a particular direction and at a particular distance from the subject of interest, and wherein the apparent location of the sounds corresponds with a weather map feature including the location of approaching or surrounding weather.
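The direction-and-distance rendering can be sketched as a mapping from a feature's position relative to the subject of interest to a stereo pan and gain. The flat coordinate system and linear attenuation model are assumptions for illustration only:

```python
import math

def spatialize(subject_xy, feature_xy, max_range):
    """Map a weather feature's position relative to the subject of interest
    to a stereo pan (-1 left .. +1 right) and a gain (1 near .. 0 far).

    subject_xy, feature_xy: (east, north) coordinates in the same units
    as max_range.
    """
    dx = feature_xy[0] - subject_xy[0]
    dy = feature_xy[1] - subject_xy[1]
    distance = math.hypot(dx, dy)
    azimuth = math.atan2(dx, dy)   # 0 rad = straight ahead (north)
    pan = math.sin(azimuth)        # left/right placement of the sound
    gain = max(0.0, 1.0 - distance / max_range)  # fade out with distance
    return pan, gain
```

A production system would feed these values into a binaural or HRTF-based renderer rather than simple stereo panning.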

SMART EYEGLASSES FOR SPECIAL NEEDS CHILDREN AND ADULTS
20230169791 · 2023-06-01 ·

A system that detects whether a user interacting with a featured activity is wearing glasses is described. The system verifies that the user is wearing the glasses, and the system prompts the user and a caregiver and may blur, stop, or otherwise interrupt a user experience of a featured activity, such as a video game or film, when the system determines that the user is not wearing the glasses. A glasses module may be positioned at a frame of the glasses on a head of the user to detect that the user is wearing the glasses. Optical facial processing may detect a face and glasses on the face. Also disclosed is a hearing aid that may be integrated with such a system. A glasses module that aids in depth perception by reporting distance ahead, and a system that trains eye contact with another person wearing glasses, are also disclosed.

TOUCH SCREEN DEVICE FOR ENTERING DATA USING AUDITORY SIGNALS

A device includes a display and one or more processors that cause a speaker associated with the device to serially recite a plurality of auditory cues. Each of the plurality of auditory cues corresponds to one of a set of characters, and each recitation of the plurality of auditory cues occurs at a predetermined time period. The one or more processors also receive a touch input on the display from a user during the recitation of the plurality of auditory cues, determine a character from the set of characters that corresponds to the touch input based on the predetermined time period, and store the character as a value in a sequence of user-specific information.
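The timing scheme above implies a simple mapping: if cues are recited in order at a fixed period, a touch timestamp resolves to the character being spoken at that instant. A sketch under that assumption (the uniform, repeating schedule is a simplification; names are illustrative):

```python
def character_from_touch(touch_time_s, start_time_s, period_s, characters):
    """Resolve a touch to the auditory cue being recited at that moment.

    Cues are assumed to be spoken in order, one per fixed period,
    starting at start_time_s and wrapping around after the last one.
    """
    elapsed = touch_time_s - start_time_s
    index = int(elapsed // period_s) % len(characters)
    return characters[index]
```

Each resolved character would then be appended to the stored sequence of user-specific information (e.g. a PIN entered eyes-free).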

DEVICE AND METHOD OF PROVIDING AUDIOVISUAL CONTENT FOR THE DISABLED
20230169834 · 2023-06-01 ·

An electronic device is provided. The electronic device includes at least one processor, and at least one memory configured to store instructions executable by the at least one processor, wherein, when the instructions are executed by the at least one processor, the at least one processor is configured to perform: sensing an occurrence of a trigger event based on an input to the electronic device and an input to at least one Internet of Things (IoT) device connected to the electronic device; collecting weather information in response to the occurrence of the trigger event; estimating an emotion of a user based on at least one of context information stored for the user and the weather information; receiving, from the user, an audiovisual impairment state of the user or estimating the audiovisual impairment state based on the context information; determining audiovisual content for the disabled to output, based on at least one of the estimated emotion and the audiovisual impairment state; determining, among the at least one IoT device, a target IoT device to output the determined audiovisual content for the disabled; and outputting the determined audiovisual content for the disabled through the target IoT device.

Intelligent glasses for the visually impaired

An approach for communicating navigation information on a physical environment to a user. The approach includes a computer receiving digital images of a physical environment of the user captured by digital video devices and converting the digital images into a three-dimensional image. The approach includes the computer analyzing the three-dimensional image using object analysis to generate output data, wherein the output data corresponds to the physical environment, and determining at least one device associated with the user. The approach includes the computer formatting the output data for use with the at least one device, wherein the at least one device is capable of providing to the user a spatial map of the physical environment created by an electrical stimulation pad, and receiving a touch from the user on a surface of the electrical stimulation pad corresponding to an object represented in the spatial map of the physical environment.

BRAILLE LEARNING APPARATUS AND BRAILLE LEARNING METHOD USING THE SAME
20170309203 · 2017-10-26 ·

A braille learning apparatus includes a plurality of slave blocks for receiving braille from a learner, and a master station configured to: receive braille learning information from an external braille learning information terminal; when slave blocks corresponding to a specific braille inputted by the learner are mounted, determine whether the specific braille inputted to the slave blocks matches a learning word; and transmit a determination result to the external braille learning information terminal.
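The master station's matching step can be sketched as a per-block comparison of entered dot patterns against the target word. The dot-set encoding and result schema below are illustrative assumptions, not from the filing:

```python
# Hypothetical 6-dot cell patterns (dots numbered 1-6, standard braille layout).
BRAILLE = {"c": {1, 4}, "a": {1}, "t": {2, 3, 4, 5}}

def check_input(mounted_cells, learning_word):
    """Compare cells entered on the slave blocks against the learning word.

    mounted_cells: one set of raised-dot numbers per mounted slave block.
    Returns an overall match flag plus per-block results, as the master
    station would report back to the learning-information terminal.
    """
    expected = [BRAILLE[ch] for ch in learning_word]
    if len(mounted_cells) != len(expected):
        return {"match": False, "per_block": []}
    per_block = [cell == exp for cell, exp in zip(mounted_cells, expected)]
    return {"match": all(per_block), "per_block": per_block}
```

Per-block results would let the terminal highlight exactly which slave block holds an incorrect cell.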