Patent classifications
A61H3/061
VISION-ASSIST DEVICES AND METHODS OF CALIBRATING IMAGE DATA OF A VISION-ASSIST DEVICE
Vision-assist devices and methods are disclosed. In one embodiment, a vision-assist device includes an image sensor for generating image data corresponding to a scene, a processor, and an inertial measurement unit. The inertial measurement unit is configured to measure forces acting on the image sensor and the orientation of the image sensor. The processor is configured to receive the image data from the image sensor and to process the measured force and orientation so as to determine a tilt of the image sensor. The processor is further configured to process the image data based on at least the tilt of the image sensor so as to generate corrected image data, wherein the corrected image data does not include the tilt.
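The tilt-determination and correction steps described in the abstract can be sketched as follows. This is an illustrative assumption, not the patented implementation: the function names, the axis convention, and the use of a static accelerometer reading to estimate roll are all hypothetical.

```python
import math

def estimate_tilt(ax: float, ay: float, az: float) -> float:
    """Estimate the camera roll angle (radians) from a static
    accelerometer reading, assuming gravity is the only force.
    Hypothetical axis convention: x right, y down, z forward."""
    return math.atan2(ax, ay)

def derotate(x: float, y: float, tilt: float) -> tuple:
    """Map a pixel offset (x, y) from the image centre into the
    tilt-corrected frame by rotating it through -tilt."""
    c, s = math.cos(-tilt), math.sin(-tilt)
    return (c * x - s * y, s * x + c * y)

# A camera rolled 90 degrees: gravity appears along the sensor x axis.
tilt = estimate_tilt(ax=9.81, ay=0.0, az=0.0)
print(round(math.degrees(tilt), 1))  # 90.0
```

In practice the correction would be applied to every pixel coordinate (or via a single image-warp operation) rather than point by point.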
Method and navigation system for assisting a visually impaired user to safely walk around obstructions and impediments
A computer-implemented method and a navigation system are described for guiding a visually impaired user to avoid obstructions and impediments while walking. The user may wear a plurality of subassemblies of the system. The tilt and rotation of the user's head may be monitored using one of the subassemblies worn on the user's head. Based at least in part on the tilt and rotation of the user's head, vertical and horizontal firing angles used by a distance measuring unit in each of the subassemblies may be calculated to transmit and receive laser signals to perform measurements. Based on the measurements, the user is then provided with navigation instructions and alarms when an obstruction or impediment is detected closer than a predetermined distance while the user is walking.
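The firing-angle calculation described above might look like the following sketch. All names, the sensor geometry, and the compensation scheme are illustrative assumptions; the abstract does not disclose the actual formula.

```python
import math

def firing_angles(head_pitch_deg: float, head_yaw_deg: float,
                  sensor_height_m: float, lookahead_m: float) -> tuple:
    """Hypothetical: aim a head-mounted distance sensor at a point
    `lookahead_m` ahead on the ground, compensating for head pose.
    Returns (vertical, horizontal) firing angles in degrees."""
    # Downward angle from horizontal needed to hit the ground
    # at the look-ahead point.
    ground_angle = math.degrees(math.atan2(sensor_height_m, lookahead_m))
    vertical = ground_angle - head_pitch_deg   # undo head tilt
    horizontal = -head_yaw_deg                 # keep beam on the walking line
    return vertical, horizontal
```

With the head level and facing forward, a sensor 1 m above the ground aimed 2 m ahead would fire about 26.6 degrees below horizontal.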
Medical device for improving environmental perception for blind or visually-impaired users
A device for improving environmental perception for blind or visually impaired users, including a set of mechanical actuators intended to be in contact with the skin of a user; at least one digital camera designed to acquire a current digital image of the environment facing the user; a processing circuit connected to the camera for receiving pixel signals from the acquired digital image and converting at least one portion of the pixel signals into control signals, each of which powers a mechanical actuator of the set of actuators; and an eye-tracking module for tracking each eye of the user to identify a gaze direction of the user. The processing circuit then selects, in the environment filmed by the camera, an area of the acquired current image as a function of the gaze direction and converts the pixel signals of said area into control signals, each of which powers an actuator of the set to stimulate the user's skin.
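The gaze-directed selection step can be sketched as below. The window size, the brightness-to-drive mapping, and the list-of-lists image representation are illustrative assumptions only; the patent does not specify how pixels map to actuator power.

```python
def select_gaze_region(image, gaze_x: int, gaze_y: int, size: int = 4):
    """Hypothetical sketch: crop a size x size window centred on the
    gaze point and map each pixel's brightness (0-255) to an actuator
    drive level (0.0-1.0), one level per actuator in the array."""
    half = size // 2
    region = [row[gaze_x - half:gaze_x + half]
              for row in image[gaze_y - half:gaze_y + half]]
    return [[px / 255.0 for px in row] for row in region]

# A uniformly bright 6x6 frame drives a 2x2 actuator patch at full power.
frame = [[255] * 6 for _ in range(6)]
levels = select_gaze_region(frame, gaze_x=3, gaze_y=3, size=2)
print(levels)  # [[1.0, 1.0], [1.0, 1.0]]
```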
Human-interface device and a guiding apparatus for a visually impaired user including such human-interface device
A human-interface device and a guiding apparatus for a visually impaired user including such human-interface device. The human-interface device includes a tactile module arranged to provide a set of haptic signals to a user, wherein the tactile module has a plurality of tactile units, each arranged to provide at least a first haptic signal and operable to cooperate with one or more of the other tactile units to provide different haptic signals.
A DEVICE FOR ASSISTING NAVIGATION
A device for assisting navigation including a body defining a middle portion having a first end and a second end, a handle portion at one of the first or second ends, and a holder removably mounted at the other of the first or second ends of the body.
Communication support device, communication support method, and computer-readable storage medium including program
A communication support device comprises an imaging unit, a counterpart detector, a distance measuring unit, an expression determination unit, a motion determination unit, and a voice output unit. The imaging unit captures an image of a surrounding environment of a user. The counterpart detector detects a predetermined counterpart in the captured image. The distance measuring unit measures a distance between the counterpart and the imaging unit based on the captured image. The expression determination unit determines a facial expression of the counterpart based on the captured image. The motion determination unit determines a motion of the counterpart based on the captured image. The voice output unit notifies the user, by voice, of identification information for identifying the counterpart when the distance measured by the distance measuring unit is equal to or less than a first threshold (an interaction distance). The voice output unit notifies the user, by voice, of the identification information together with at least one of facial expression information related to the determined facial expression and motion information related to the determined motion when the measured distance is longer than the first threshold.
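The two-range voice policy described above can be sketched as follows. The threshold value, the message wording, and the function name are hypothetical; only the branching logic (identity alone inside the interaction distance, identity plus expression and motion beyond it) comes from the abstract.

```python
def compose_notification(distance_m: float, name: str,
                         expression: str, motion: str,
                         threshold_m: float = 1.5) -> str:
    """Hypothetical sketch of the abstract's voice-output policy.
    Within the interaction distance only identity is spoken; beyond
    it, expression and motion cues are added."""
    if distance_m <= threshold_m:
        return f"{name} is nearby."
    return f"{name} is ahead, looking {expression} and {motion}."

print(compose_notification(1.0, "Alice", "happy", "waving"))
print(compose_notification(3.0, "Alice", "happy", "waving"))
```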
NAVIGATION SYSTEM
The technology disclosed herein includes a navigation system for a visually impaired person to navigate a public restroom. The navigation system may include a plurality of fixtures, a plurality of installations, wherein each installation is electronically connected to a fixture, and an electronic device, the electronic device configured to receive auditory or tactile signals from each installation and produce a signal indicative of a fixture location. The plurality of fixtures may include a toilet, a urinal, a sink, a soap dispenser, and a hand drying apparatus.
Positional Variance Based Distance Sensing Apparatus
A positional variance based distance sensing apparatus has a physical marker that changes its position (either rotational or linear) in accordance with the distance of an object from the sensor. This positional change allows a person to sense the distance by touch. The positional variance can also be used to control triggers such as switches, levers, and steering-based controllers.
Information processing device and information processing method
An information processing device calculates an occupancy rate of a pedestrian crossing in an image captured in the traveling direction of a target person, determines a crossing status of the target person for the pedestrian crossing based on the calculated occupancy rate, and generates support information for assisting the target person in crossing the pedestrian crossing based on the crossing status.
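The occupancy-rate classification might look like the sketch below. The threshold values and status labels are illustrative assumptions; the abstract does not disclose how the crossing status is derived from the occupancy rate.

```python
def crossing_status(crosswalk_pixels: int, total_pixels: int,
                    enter_ratio: float = 0.3,
                    exit_ratio: float = 0.05) -> str:
    """Hypothetical: classify crossing status from the fraction of
    the forward image occupied by the detected pedestrian crossing."""
    occupancy = crosswalk_pixels / total_pixels
    if occupancy >= enter_ratio:
        return "crossing"       # crosswalk fills the view: user is on it
    if occupancy >= exit_ratio:
        return "approaching"    # crosswalk visible ahead
    return "off-crosswalk"

print(crossing_status(40, 100))  # crossing
```

Support information (e.g. "keep straight" versus "crosswalk ahead") would then be generated per status.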
DETECTION METHOD
A detection system 100 of the present invention includes a position detection means 121 for detecting position information representing a position of a predetermined part of a person and a position of an accessory of a specific shape held by the person, and a separation detection means 122 for detecting, based on the position information, that the accessory has become separated from the person.
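A minimal sketch of the separation check, assuming 2D tracked positions and a fixed distance threshold; the threshold, the coordinate representation, and the cane example are assumptions not stated in the abstract.

```python
import math

def is_separated(hand_xy: tuple, accessory_xy: tuple,
                 threshold_m: float = 0.5) -> bool:
    """Hypothetical: flag the accessory (e.g. a white cane) as
    separated when its tracked position is farther than a threshold
    from the tracked position of the person's hand."""
    dx = hand_xy[0] - accessory_xy[0]
    dy = hand_xy[1] - accessory_xy[1]
    return math.hypot(dx, dy) > threshold_m

print(is_separated((0.0, 0.0), (1.0, 0.0)))  # True
```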