Patent classifications
G09B21/009
APPARATUS FOR BI-DIRECTIONAL SIGN LANGUAGE/SPEECH TRANSLATION IN REAL TIME AND METHOD
Provided are an apparatus and method for bi-directional sign language/speech translation in real time that may automatically translate a sign into speech or speech into a sign in real time by separately performing an operation of recognizing speech made externally through a microphone and outputting a sign corresponding to the speech, and an operation of recognizing a sign sensed through a camera and outputting speech corresponding to the sign.
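The two independent translation paths described in the abstract can be sketched as follows. This is an illustrative outline only; every function name is a placeholder stand-in for the recognizer, renderer, and synthesizer components, none of which are specified by the patent.

```python
def recognize_speech(audio_frames):
    """Stand-in for a microphone speech recognizer (returns text)."""
    return " ".join(audio_frames)  # placeholder: treat frames as words

def render_sign(text):
    """Stand-in for rendering sign-language output for recognized speech."""
    return [f"SIGN<{word}>" for word in text.split()]

def recognize_sign(camera_frames):
    """Stand-in for a camera-based sign recognizer (returns text)."""
    return " ".join(camera_frames)

def synthesize_speech(text):
    """Stand-in for text-to-speech output for a recognized sign."""
    return f"AUDIO<{text}>"

def translate(direction, frames):
    # The two directions are handled as separate operations, mirroring the
    # apparatus's independent speech->sign and sign->speech paths.
    if direction == "speech_to_sign":
        return render_sign(recognize_speech(frames))
    if direction == "sign_to_speech":
        return synthesize_speech(recognize_sign(frames))
    raise ValueError(direction)
```

For example, `translate("speech_to_sign", ["hello", "world"])` would yield one sign token per recognized word, while the reverse direction yields a single synthesized audio output.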
A METHOD AND A DEVICE FOR PROVIDING A PERFORMANCE INDICATION TO A HEARING AND SPEECH IMPAIRED PERSON LEARNING SPEAKING SKILLS
The present invention describes a technique for providing a performance indication to a hearing and speech impaired person learning speaking skills. The technique comprises selecting a phoneme from a plurality of phonemes displayed on a display device; receiving a phoneme produced by the hearing and speech impaired person on a microphone; creating a first mathematical representation for the selected phoneme; creating a second mathematical representation for the received phoneme; generating a first visual equivalent representing the selected phoneme based on the first mathematical representation; generating a second visual equivalent representing the received phoneme based on the second mathematical representation; displaying the first visual equivalent and the second visual equivalent on the display device for the hearing and speech impaired person to compare; comparing the first mathematical representation and the second mathematical representation; and generating a performance indication based on a result of the comparison of the first mathematical representation and the second mathematical representation.
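A minimal sketch of the compare-and-score step, assuming a banded amplitude vector as the "mathematical representation" and cosine similarity as the comparison. The patent does not specify either choice (MFCCs or formant features would be typical in practice), so both are illustrative assumptions.

```python
import math

def feature_vector(samples, bands=4):
    # Illustrative "mathematical representation": mean absolute amplitude
    # in a few equal-width bands of the sample window.
    n = len(samples) // bands
    return [sum(abs(s) for s in samples[i * n:(i + 1) * n]) / n
            for i in range(bands)]

def similarity(a, b):
    # Cosine similarity between the two representations.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

def performance_indication(selected, produced, threshold=0.9):
    # Compare the representation of the selected phoneme against the one
    # the learner produced, and report a pass/fail indication.
    score = similarity(feature_vector(selected), feature_vector(produced))
    return {"score": round(score, 3), "pass": score >= threshold}
```

An identical produced signal scores 1.0 and passes; a signal with its energy in different bands scores near 0 and fails, which is the basis for the displayed indication.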
Accessibility for Web Sites
A system, method, and computer-readable medium are disclosed for performing a web site accessibility operation. The web site accessibility operation automatically enables web site accessibility features for a differently abled user. In certain embodiments, the accessibility features are customized to ease accessibility barriers. In certain embodiments, the web site accessibility operation intelligently detects a disability of a user and automatically enables web site accessibility features based upon the detected disability. When a user accesses a web site during a web site accessibility operation, an analyzer module determines whether the user has a disability and, if so, what type of disability. Based upon the determination, the analyzer module automatically modifies the web site to optimize its accessibility for the identified disability.
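The analyzer-module flow can be sketched as a detection step followed by a feature lookup. The disability labels, feature flags, and the profile-based detection here are invented for illustration; a real analyzer might instead use declared preferences, assistive-technology hints, or interaction patterns.

```python
# Hypothetical mapping from a detected disability class to the web site
# accessibility features enabled for it.
ACCESSIBILITY_FEATURES = {
    "low_vision": ["high_contrast", "large_text", "screen_reader_landmarks"],
    "hearing_impaired": ["captions", "visual_alerts"],
    "motor_impaired": ["keyboard_navigation", "large_click_targets"],
}

def detect_disability(user_profile):
    # Stand-in for the analyzer module's detection step.
    return user_profile.get("disability")

def modify_site(user_profile):
    # Determine whether the user has a disability and, if so, which
    # features to enable for the identified disability type.
    disability = detect_disability(user_profile)
    if disability is None:
        return []  # no modification needed
    return ACCESSIBILITY_FEATURES.get(disability, [])
```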
SIGN LANGUAGE COMMUNICATION WITH COMMUNICATION DEVICES
Implementations enable conversations between operators of communication devices who use sign language and other operators who do not. A method may include receiving images of first sign language gestures captured by a camera of a first communication device, converting the first sign language gestures into first text, transmitting the first text to a second communication device, receiving second text from the second communication device, and converting the second text into images of second sign language gestures made by an avatar. The method may also include operating the camera to capture the images of the first sign language gestures and presenting the images of the second sign language gestures on a display of the first communication device. The method may further include receiving first speech captured at the second communication device, converting the first speech into third text, and then into images of third sign language gestures made by the avatar.
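The incoming side of that method can be sketched as a single handler: both text and speech arriving from the far-end device are converted into avatar sign frames for the first device's display. All names and message shapes here are illustrative placeholders.

```python
def gestures_to_text(gesture_images):
    # Stand-in recognizer for outgoing gestures captured by the camera.
    return " ".join(img["label"] for img in gesture_images)

def text_to_avatar_signs(text):
    # Stand-in for converting text into avatar sign-language frames.
    return [{"avatar_sign": word} for word in text.split()]

def speech_to_text(audio):
    # Stand-in speech recognizer for audio captured at the far end.
    return audio["transcript"]

def handle_incoming(message):
    # Both text and speech from the second device are presented to the
    # first device's operator as sign gestures made by the avatar.
    if message["kind"] == "text":
        return text_to_avatar_signs(message["body"])
    if message["kind"] == "speech":
        return text_to_avatar_signs(speech_to_text(message["body"]))
    raise ValueError(message["kind"])
```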
Glove for Use in Collecting Data for Sign Language Recognition
A glove for use in collecting data for sign language recognition comprises: multiple azimuth sensors arranged on the glove at positions corresponding to the phalanges and metacarpal bones of the hand and used for sensing postures of the hand. The azimuth sensors are only arranged on the glove at positions corresponding to the phalanges of the hand other than the distal phalange in proximity to the fingertip of at least one finger among the middle finger, the index finger, the ring finger, and the little finger. The glove reduces the number of the azimuth sensors arranged on the glove at positions corresponding to the phalanges of the hand, thus reducing costs while not affecting detection performance.
MUSIC LEARNING APPARATUS AND MUSIC LEARNING METHOD USING TACTILE SENSATION
A tactile music learning apparatus converts sound data of a user's voice corresponding to original music into first tactile data including tactile information, generates a synchronized tactile pattern by synchronizing the first tactile data with second tactile data including tactile information corresponding to sound data of the original music, and transfers the synchronized tactile pattern to a tactile reproducing apparatus to allow the tactile reproducing apparatus to reproduce the synchronized tactile pattern.
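A rough sketch of the conversion-and-synchronization idea: quantize each sound stream into tactile intensity levels, then pair the user's tactile data with the original's frame by frame so the reproducing apparatus can play both in step. The quantization scheme and the frame-wise pairing are assumptions for illustration; the patent does not specify the tactile encoding.

```python
def to_tactile(sound_samples, levels=8):
    # Quantize amplitude into a small number of vibration intensity
    # levels (0 .. levels-1), normalized to the stream's peak.
    peak = max(abs(s) for s in sound_samples) or 1.0
    return [round(abs(s) / peak * (levels - 1)) for s in sound_samples]

def synchronized_pattern(user_voice, original_music):
    # Pair the user's tactile data (first) with the original music's
    # tactile data (second), one intensity pair per frame.
    first = to_tactile(user_voice)
    second = to_tactile(original_music)
    return list(zip(first, second))
```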
Visual feedback system
A visual feedback system can include a display panel, an interface unit, and at least one visual feedback device. The at least one visual feedback device can be configured to provide cues for audio generated within a virtual environment.
Synchronized accessibility for client devices in an online conference collaboration
Techniques and systems for synchronized accessibility for client devices in an online conference are described. For example, a conferencing system receives presentation content and audio content as part of the online conference from a client device. The conferencing system generates sign language content by converting audio in the audio content to sign language. The conferencing system then synchronizes display of the sign language content with the presentation content in a user interface based on differences in durations of segments of the audio content from durations of corresponding segments of the sign language content. Then, the conferencing system outputs the sign language content as synchronized with the presentation content, such as to a viewer client device that requested the sign language content, or to storage for later access by viewers that request sign language content.
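The scheduling consequence of those duration differences can be sketched briefly: when a sign language segment runs longer than the audio segment it translates, the next presentation segment must wait until both tracks have finished. The scheduling function below is an illustrative sketch, not the patent's algorithm; durations are in seconds.

```python
def schedule(audio_durations, sign_durations):
    """Return a start time for each presentation segment so the slides
    stay aligned with the (possibly slower) sign language track."""
    start, starts = 0.0, []
    for audio_d, sign_d in zip(audio_durations, sign_durations):
        starts.append(start)
        # Each segment occupies the longer of the two durations, so the
        # next segment never begins before either track has finished.
        start += max(audio_d, sign_d)
    return starts
```

For instance, if the first audio segment lasts 10 s but its sign language rendering takes 12 s, the second presentation segment is delayed to start at 12 s rather than 10 s.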
HAPTIC AND VISUAL COMMUNICATION SYSTEM FOR THE HEARING IMPAIRED
A communication method for hearing impaired communication comprising: providing a speech training device to a hearing impaired user, the speech training device configured to teach the hearing impaired user how to recognize non-speech sounds. The method further includes providing a haptic output device to the hearing impaired user, where the haptic output device is configured to be releasably coupled to the hearing impaired user. The haptic output device receives a sound input signal comprising a non-speech sound and provides a haptic output signal to an actuator in electrical communication with the haptic output device. The actuator actuates in response to the haptic output signal and provides a haptic sensation to the hearing impaired user.
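The mapping from a detected non-speech sound to an actuator drive signal could look like the sketch below. The sound classes and on/off vibration patterns are invented for illustration; the patent does not enumerate them.

```python
# Hypothetical haptic patterns: 1 = actuator on for one frame, 0 = off.
HAPTIC_PATTERNS = {
    "doorbell": [1, 0, 1, 0],
    "siren": [1, 1, 1, 0, 1, 1, 1, 0],
    "alarm_clock": [1, 0, 0, 1, 0, 0],
}

def haptic_output_signal(sound_class):
    # Unknown sounds produce a generic single pulse rather than silence,
    # so the wearer is still alerted to the non-speech sound.
    return HAPTIC_PATTERNS.get(sound_class, [1, 0])
```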
ELECTRONIC DEVICE-TO-CHARGER SET, AND COMMUNICATION SYSTEM
An electronic device-charger set is provided including an electronic device, and a charger with which the electronic device is configured to interlock. The electronic device includes a body including a battery and a connector electrically connected to the battery, a clip extending along the body, and a magnet provided at the clip. The charger includes a charging connector disposed at a position connecting with the connector in a state in which the electronic device is interlocked with the charger, a recess formed at a position to house the clip in a state in which the electronic device is interlocked with the charger, and an attracting magnet provided at a portion of the recess opposing the magnet and configured to generate an attraction force between the attracting magnet and the magnet.