G09B21/007

Networked Sensory Enhanced Navigation System

A networked sensory enhanced navigation system directs visually impaired users through complex environments. At least one user peripheral, disposed in networked communication, determines a current user location, potential future user locations, and any selected destination relative to dynamic landscape data describing the proximal environment. Landscape data is populated by access to a Geographic Data Store (“GDS”) in which previously determined landscape data is stored and accessed. Landscape data is verifiable by local capture through third-party peripherals connected over the network. The user is directed through the landscape along a designated path, or prevented from collision, by signal alarms that communicate instructions to the user.

INTEGRATED ACCESSIBLE PEDESTRIAN SYSTEM
20220044549 · 2022-02-10 ·

An integrated pedestrian access system comprising wireless push buttons adapted to transmit and receive wireless signals, and receivers connected to pedestrian crosswalk signal systems, wherein the one or more receivers are configured to: communicate with the wireless push buttons through wireless signals; communicate among the receivers; determine whether the source device from which a request for registering a pedestrian signal is received is a wireless push button or a receiver; determine the status of the pedestrian signal based on signals received from the pedestrian signal system; register a request for a pedestrian crosswalk signal if the walk signal is not on in the desired direction per the pedestrian signal status; and provide an acknowledgement corresponding to the status of the pedestrian signal through the source device.
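The receiver-side decision logic in the claim above can be sketched as follows. The function name, message strings, and two-phase signal model are illustrative assumptions, not taken from the patent:

```python
# Hypothetical sketch of a receiver handling a crossing request from
# either a wireless push button or a peer receiver.

WALK, DONT_WALK = "WALK", "DONT_WALK"

def register_request(source, desired_direction, signal_status):
    """Decide whether to register a pedestrian crossing request.

    `signal_status` maps each crossing direction to its current phase,
    as read from the pedestrian signal system. Returns a (registered,
    acknowledgement) pair sent back through the source device.
    """
    if source not in ("push_button", "receiver"):
        raise ValueError("unknown source device")
    phase = signal_status.get(desired_direction, DONT_WALK)
    if phase == WALK:
        # Walk signal already on in the desired direction:
        # acknowledge without registering a new request.
        return (False, f"ack:{source}:walk_on")
    # Otherwise register the request and acknowledge it as pending.
    return (True, f"ack:{source}:registered")
```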

INFORMATION PROCESSING DEVICE AND INFORMATION PROCESSING METHOD
20220233392 · 2022-07-28 ·

An information processing device calculates an occupancy rate of a pedestrian crossing in an image obtained by capturing an image of a traveling direction of a target person, determines a crossing status of the target person for the pedestrian crossing based on the calculated occupancy rate, and generates support information for supporting the target person crossing the pedestrian crossing based on the crossing status.
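The occupancy-rate calculation and status determination described above can be sketched as follows; the binary segmentation mask and the threshold values are assumptions for illustration, not from the patent:

```python
def occupancy_rate(mask):
    """Fraction of pedestrian-crossing pixels in the image of the
    target person's traveling direction. `mask` is a 2-D list of
    0/1 values, 1 marking crossing pixels (an illustrative stand-in
    for a real segmentation result)."""
    total = sum(len(row) for row in mask)
    marked = sum(sum(row) for row in mask)
    return marked / total if total else 0.0

def crossing_status(rate, start=0.3, finish=0.05):
    """Map the occupancy rate to a coarse crossing status.
    Threshold values are assumed, not taken from the patent."""
    if rate >= start:
        return "crossing"
    if rate <= finish:
        return "off_crossing"
    return "near_edge"
```

Support information for the target person would then be generated from the returned status.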

ROAD GUIDANCE SYSTEM AND MOBILE INFORMATION TERMINAL USED IN SAME

In road guidance using Braille blocks, a user's walking direction cannot be determined from a single Braille block alone. To solve this problem, a road guidance system includes a mobile information terminal and a Braille block to which a QR code is affixed. The mobile information terminal includes a photographing processing unit, a QR code analysis processing unit that analyzes a QR code, a voice output processing unit, and a control unit. The control unit analyzes the QR code on the Braille block photographed by the photographing processing unit, using the QR code analysis processing unit, to obtain QR code information; generates road guidance information based on the QR code information and the walking direction of the user holding the mobile information terminal; and outputs the road guidance information from the voice output processing unit as voice.
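The step from decoded QR information plus walking direction to spoken guidance can be sketched as follows. The `name;bearing_deg` payload format and the turn thresholds are assumptions for illustration; the patent does not specify a payload format:

```python
def guidance_from_qr(payload, heading_deg):
    """Turn a decoded QR payload and the user's walking direction
    (compass heading in degrees) into a guidance phrase for voice
    output. Payload format 'name;bearing_deg' is hypothetical."""
    name, bearing = payload.split(";")
    delta = (float(bearing) - heading_deg) % 360
    if delta < 45 or delta > 315:
        turn = "straight ahead"
    elif delta < 180:
        turn = "to your right"
    else:
        turn = "to your left"
    return f"{name} is {turn}"
```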

Method of interactive reading for users of self-scrolling Braille
20210398452 · 2021-12-23 ·

Electronically displayed Braille dots are laterally propagated against a stationary finger resting on a stationary base for reading Braille. The lateral propagation takes the form of a transverse wave of pins which are raised and lowered in sequence. The reading can be synchronized with other processes or events under computer controls. A method of interactive reading is provided whereby the reading of Braille from the display is computer-synchronized with other events and processes to help users learn Braille, to monitor physiological responses to reading, and to enhance a user's reading experience.
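The transverse wave of sequentially raised and lowered pins can be sketched as successive display frames; the band-of-raised-pins model and frame representation are illustrative assumptions:

```python
def wave_frames(num_pins, width):
    """Generate successive frames of a transverse wave: a band of
    `width` raised pins (1) sweeping laterally across `num_pins`
    positions, so the dots propagate under a stationary finger."""
    frames = []
    for start in range(num_pins - width + 1):
        frame = [0] * num_pins
        for i in range(start, start + width):
            frame[i] = 1  # raise the pins inside the moving band
        frames.append(frame)
    return frames
```

Emitting these frames on a timer synchronized with external events would give the computer-controlled reading described above.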

Cross-platform remote user experience accessibility

In non-limiting examples of the present disclosure, systems, methods and devices for assisting with cross-platform user experience accessibility are provided. A real-time connection between a remote device and a host device may be established. The remote device may apply a plurality of rules to event metadata that it generates. The filtered event metadata corresponding to a plurality of user experience events occurring on the remote device may be received by the host device. One or more transforms may be applied to the filtered event metadata on the host device. A native accessibility experience corresponding to each of the plurality of user experience events may be provided by the host device based on the transformed event metadata.
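The remote-side rule filtering and host-side transform pipeline can be sketched as follows; the predicate/transform representation and event shapes are assumptions for illustration, not the disclosure's actual API:

```python
def filter_events(events, rules):
    """Remote device: keep only event metadata that passes every
    rule (each rule is a predicate over one event dict)."""
    return [e for e in events if all(rule(e) for rule in rules)]

def apply_transforms(events, transforms):
    """Host device: map filtered event metadata into the host's
    native accessibility representation, one transform at a time."""
    out = events
    for t in transforms:
        out = [t(e) for e in out]
    return out
```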

ASSISTIVE COMMUNICATION DEVICE, METHOD, AND APPARATUS
20210390881 · 2021-12-16 ·

Communication devices, apparatuses, and methods to assist an individual to develop communicative ability are disclosed. A communication device includes a communication module to generate a user interface to display word tiles for selection, compile a sentence upon selection of word tiles, and output the sentence. A first word tile is associated with a word and at least two stimuli of the word, the at least two stimuli including at least two of a text transcription of the word, a visual depiction of the word, and a vocal expression of the word. The communication device includes a training module to collect interaction data, the interaction data including indications of interactions of a user account with the communication module over a plurality of trials, and to progressively diminish at least one of the at least two stimuli of the word of the first word tile based on the interaction data.
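The training module's progressive diminishing of stimuli can be sketched as follows; the streak-based schedule and fade-out order are assumed policies for illustration, not from the patent:

```python
def diminish_stimuli(stimuli, success_streak, step=3):
    """Drop one stimulus modality of a word tile for every `step`
    consecutive correct selections in the interaction data, always
    keeping at least one stimulus active."""
    fade_order = ["image", "text", "audio"]  # assumed fade-out order
    active = dict(stimuli)
    to_drop = success_streak // step
    for name in fade_order:
        if to_drop <= 0 or len(active) <= 1:
            break
        if name in active:
            del active[name]
            to_drop -= 1
    return active
```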

COMMUNICATION SUPPORT DEVICE, COMMUNICATION SUPPORT METHOD, AND COMPUTER-READABLE STORAGE MEDIUM INCLUDING PROGRAM
20210390333 · 2021-12-16 ·

A communication support device comprises an imaging unit, a counterpart detector, a distance measuring unit, an expression determination unit, a motion determination unit, and a voice output unit. The imaging unit captures an image of the surrounding environment of a user. The counterpart detector detects a predetermined counterpart in the captured image. The distance measuring unit measures the distance between the counterpart and the imaging unit based on the captured image. The expression determination unit determines a facial expression of the counterpart based on the captured image. The motion determination unit determines a motion of the counterpart based on the captured image. When the measured distance is an interaction distance of a first threshold or less, the voice output unit notifies the user by voice of identification information for identifying the counterpart. When the measured distance is longer than the first threshold, the voice output unit notifies the user by voice of the identification information together with at least one of facial expression information related to the determined facial expression and motion information related to the determined motion.
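The distance-dependent notification rule can be sketched as message composition; the function name and string format are illustrative assumptions:

```python
def voice_message(name, distance, threshold, expression=None, motion=None):
    """Compose the spoken notification per the distance rule above:
    within the interaction distance (first threshold), announce
    identity only; beyond it, append the detected expression and/or
    motion when available."""
    if distance <= threshold:
        return name
    extras = [x for x in (expression, motion) if x]
    return ", ".join([name] + extras) if extras else name
```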

Tactile communication tool
11200815 · 2021-12-14 ·

A tactile communication tool for transforming a user input command into an audio output signal is provided. The tool presents a user with a housing, a plurality of buttons, a sound processing unit with a speaker, and an overlay set with a plurality of tactually discernable non-Braille characters or tactually discernable non-Braille images embossed upon it. The plurality of tactually discernable non-Braille characters or images can be positioned over the plurality of buttons such that when one of the buttons is pressed by a user, the user input command is processed and an audio output signal is generated. The tool could also be used for providing input to a computer system. The overlay set could also be a plurality of individual tactile overlay keys.

Textual annotation of acoustic effects

Accommodation for color or visual impairments may be implemented by selective color substitution. A color accommodation module receives an image frame from a host system and generates a color-adapted version of the image frame. The color accommodation module may include a rule based filter that substitutes one or more colors within the image frame with one or more corresponding alternative colors.
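The rule-based filter's selective color substitution can be sketched as follows; representing the frame as a 2-D list of RGB tuples and the rules as a color-to-color mapping is an assumption for illustration:

```python
def substitute_colors(frame, rules):
    """Generate a color-adapted version of an image frame.
    `frame` is a 2-D list of (r, g, b) tuples; `rules` maps a source
    color to its alternative. Pixels matching no rule pass through
    unchanged."""
    return [[rules.get(px, px) for px in row] for row in frame]
```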