Non-Visual Virtual-Reality System for Navigation Training and Assessment
20220398940 · 2022-12-15
Inventors
- Lora T. Likova (Greenbrae, CA, US)
- Christopher W. Tyler (San Francisco, CA, US)
- Zlatko K. Minev (New York, NY, US)
CPC classification
- G09B21/003 (PHYSICS)
- G06F3/011 (PHYSICS)
- G06F3/0346 (PHYSICS)
- G06F3/167 (PHYSICS)
- G06F3/016 (PHYSICS)
- A61H2201/5048 (HUMAN NECESSITIES)
International classification
- G06F3/0346 (PHYSICS)
Abstract
A navigation system for guiding a blind individual through a virtual environment includes a transponder adapted to be carried by the individual, a tracking system for determining the position of said transponder within the virtual environment, and a feedback system coupled to said transponder to generate a non-visual sensory signal indicative of said position of said transponder. The transponder has a primary sensor direction that is aimed by the individual, and the non-visual sensory signal provides indications of virtual objects aligned with said primary sensor direction, whereby the individual may be guided to find or avoid virtual objects in said virtual environment.
Claims
1. A navigation system for a blind individual that uses non-visual information to guide the individual through a virtual environment, including: a transponder adapted to be carried by the individual; a tracking system for determining the position and directional orientation of said transponder within the virtual environment; a feedback system coupled to said transponder to generate a non-visual sensory signal indicative of said position of said transponder; said transponder having a primary sensor direction that is aimed by the individual; said non-visual sensory signal providing indications of virtual objects or walls aligned with said primary sensor direction, whereby the individual may be guided to find or avoid virtual objects or walls in said virtual environment.
2. The navigation system of claim 1, wherein said non-visual sensory signal comprises an audio signal.
3. The navigation system of claim 2, wherein said audio signal includes variable parameters selected from a group including frequency, amplitude, pulse and rhythms.
4. The navigation system of claim 1, wherein said non-visual signal further indicates the relative size and motion of virtual objects relative to the individual in the primary sensor direction.
5. The navigation system of claim 4, wherein said audio signal includes variable parameters selected from a group including frequency, amplitude, pulse and rhythms, said variable parameters having established correspondences with said relative size and motion of the virtual objects.
6. The navigation system of claim 1, wherein said non-visual sensory signal comprises a haptic signal.
7. The navigation system of claim 6, wherein said haptic signal includes variable parameters selected from a group including frequency, amplitude, pulse and rhythms, said variable parameters having established correspondences with physical characteristics of the virtual objects.
8. The navigation system of claim 7, wherein said physical characteristics of the virtual objects include the relative size and motion of virtual objects relative to the individual in the primary sensor direction.
9. The navigation system of claim 8, wherein said non-visual sensory signal includes verbal information.
10. The navigation system of claim 1, wherein said transponder is a smartphone or other self-tracking device that can generate said information about the position and directional orientation of said transponder within the virtual environment.
11. A method for directing an individual to navigate within a virtual environment, including the steps of: providing a transponder adapted to be carried by the individual, said transponder having a primary sensor direction that is aimed by the individual; providing a tracking system for determining the position and directional orientation of said transponder within the virtual environment; providing a feedback system coupled to said transponder to generate a non-visual sensory signal indicative of said position of said transponder; encoding said non-visual sensory signal to provide indications of virtual objects or walls aligned with said primary sensor direction, whereby the individual may be guided to find or avoid virtual objects or walls in said virtual environment.
12. The method of claim 11, wherein said non-visual sensory signal comprises an audio signal, and said audio signal includes variable parameters selected from a group including frequency, amplitude, pulse and rhythms.
13. The method of claim 12, wherein said variable parameters have established correspondences with the size of the virtual objects and their motion with respect to the individual.
14. The method of claim 11, wherein said transponder is a smartphone or other self-tracking device that can generate said information about the position and directional orientation of said transponder within the virtual environment.
15. A method for training an individual to create a mental map of a virtual space, including the steps of: defining a virtual space having one or more virtual objects or walls; providing the individual with a transponder adapted to be carried by the individual, said transponder having a primary sensor direction that is aimed by the individual; providing a tracking system for determining the position of said transponder within the virtual environment; providing a feedback system coupled to said transponder to generate a non-visual sensory signal indicative of said position and directional orientation of said transponder; encoding said non-visual sensory signal to provide indications of virtual objects or walls aligned with said primary sensor direction, whereby the individual may be guided to find or avoid virtual objects or walls in said virtual environment.
Description
DETAILED DESCRIPTION OF THE INVENTION
[0024] The invention generally comprises a navigation system for the blind that uses auditory or other non-visual information, such as vibration, force feedback, etc., to generate a virtual environment in empty real space (hereinafter VEERS) for entirely non-visual navigation.
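Purely by way of illustration, a VEERS layout of this kind could be represented as virtual walls and objects placed at coordinates within an otherwise empty room; the names below (VirtualObject, VeersEnvironment) and all dimensions are illustrative assumptions rather than terms of the disclosure.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class VirtualObject:
    """A virtual obstacle or target placed in the empty real space (illustrative)."""
    name: str
    x: float        # lower-left corner of the footprint, metres (room coordinates)
    y: float
    width: float    # extent along x, metres
    depth: float    # extent along y, metres
    height: float   # vertical size, metres (used for the size readout)

@dataclass
class VeersEnvironment:
    """A virtual environment defined over an otherwise empty real room (illustrative)."""
    room_size: Tuple[float, float]                     # (width, depth) of the room, metres
    objects: List[VirtualObject] = field(default_factory=list)

# Example layout: a 6 m x 8 m empty room containing two virtual obstacles.
environment = VeersEnvironment(
    room_size=(6.0, 8.0),
    objects=[
        VirtualObject("table",   x=2.0, y=3.0, width=1.2, depth=0.8, height=0.8),
        VirtualObject("doorway", x=4.5, y=7.5, width=0.9, depth=0.3, height=2.0),
    ],
)
```

The real room itself contains no physical obstacles; only the virtual objects in such a data structure trigger the non-visual feedback.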
[0028] The directional distance signal to the virtual objects can be the pitch of a tone, with a low tone for distant objects, increasing in frequency (and urgency) as the (virtual) object or obstacle is approached. Other forms of obstacle readout could be utilized, or included together with the distance encoding, such as knocking sounds when a wall is encountered. The transponder could indicate the distance of obstacles in the direction of the transponder's orientation by a tactile readout such as the intensity of its vibration, or by an auditory verbal readout of the distance to the object. Auditory parameters such as frequency, amplitude, pulse and rhythms may be used to convey details about the virtual environment, and each parameter may have an established correspondence with a respective physical aspect such as the size and relative motion of the virtual objects. The user scans the virtual transponder beam across the space to obtain a wide-angle sense of the arrangement of objects in the virtual scene. (Note that this system does not require recreating the ambient sounds of a real environment, which is the approach taken by prior auditory virtual reality systems, or overlaying auditory identifiers for specific target locations, which is an approach taken by prior augmented reality systems.)
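As a rough, non-limiting sketch of such a distance-to-pitch readout, assuming the illustrative VirtualObject and VeersEnvironment structures above and an arbitrary 200-2000 Hz tone range that is not specified in the disclosure:

```python
import math

def beam_distance(pos, heading_rad, env, max_range=10.0, step=0.05):
    """March the virtual 'transponder beam' from the user's position along the
    primary sensor direction and return the distance to the first virtual
    object footprint it hits, or max_range if nothing is encountered.
    (Simple ray-marching; the disclosure does not prescribe an algorithm.)"""
    x, y = pos
    dx, dy = math.cos(heading_rad), math.sin(heading_rad)
    d = 0.0
    while d < max_range:
        px, py = x + d * dx, y + d * dy
        for obj in env.objects:
            if obj.x <= px <= obj.x + obj.width and obj.y <= py <= obj.y + obj.depth:
                return d
        d += step
    return max_range

def distance_to_pitch(distance, max_range=10.0, f_far=200.0, f_near=2000.0):
    """Low tone for distant objects, rising in frequency (and urgency) as the
    virtual object or obstacle is approached."""
    closeness = 1.0 - min(distance, max_range) / max_range
    return f_far + closeness * (f_near - f_far)

# Example: a user standing at (1.0, 1.0) aims the beam roughly toward the virtual table.
tone_hz = distance_to_pitch(beam_distance((1.0, 1.0), math.radians(60), environment))
```

Sweeping heading_rad across the room and re-rendering the tone at each heading gives the wide-angle scan described above.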
[0029] The system may encode aspects of the structure of the object, such as its height and width, and include that information in the non-visual (auditory, tactile or verbal) readout. In a further embodiment, the size of objects could be encoded as a second variable such as sound or vibration intensity, or given as a verbal specification of the width and height of the obstacle.
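One way such a size readout could be added, again purely illustratively and reusing the VirtualObject fields assumed above (ref_area is an arbitrary normalisation constant, not a value from the disclosure):

```python
def size_to_intensity(obj, i_min=0.2, i_max=1.0, ref_area=4.0):
    """Encode object size as a second variable: larger virtual objects are
    rendered at higher sound or vibration intensity."""
    frontal_area = obj.width * obj.height
    return i_min + (i_max - i_min) * min(frontal_area / ref_area, 1.0)

def verbal_size_readout(obj):
    """Alternative verbal specification of the obstacle's width and height."""
    return f"{obj.name}: {obj.width:.1f} m wide, {obj.height:.1f} m high"
```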
[0030] The basic training function of the invention is to allow the user to build up and verify a mental map of the auditory virtual reality space by navigating through it guided by the auditory distance cue. The accuracy of the mental map can be improved by practice in minimizing errors and maximizing the speed of navigation through the virtual paths defined in the map. This enhanced mental-mapping capability, in blind users or deployed non-visually by sighted users, can be of general use when navigating real physical environments guided by the usual means of the long cane and general auditory cues. It may be appreciated that the training activities take place in an open space that is generally free of obstacles and obstructions, so that the training routines avoid collisions and impacts with objects and are generally safe.
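Navigation error and speed over a trial could be quantified, for instance, along the following lines; this is an illustrative metric, as the disclosure does not specify how accuracy or speed are scored:

```python
import math

def navigation_metrics(logged_path, planned_path, elapsed_seconds):
    """Score one training trial: mean deviation of the walked path from the
    planned virtual path, and average walking speed.
    Both paths are lists of (x, y) samples in metres."""
    def dist(p, q):
        return math.hypot(p[0] - q[0], p[1] - q[1])

    # Mean distance from each logged sample to its nearest planned waypoint.
    mean_error = sum(min(dist(p, w) for w in planned_path)
                     for p in logged_path) / len(logged_path)

    # Total distance travelled divided by trial duration.
    travelled = sum(dist(logged_path[i], logged_path[i + 1])
                    for i in range(len(logged_path) - 1))
    return mean_error, travelled / elapsed_seconds
```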
[0031] The training procedure utilizing the VEERS system may include initial training on tactile maps through hand exploration and memorization of the navigational layout, such as the Likova Cognitive-Kinesthetic Training (see U.S. Pat. No. 10,722,150, issued Jul. 28, 2020), which is then transferred to the full-scale VEERS navigational layout for extended training and assessment of non-visual navigational capabilities. Because the VEERS system can serve both assessment and navigational training, it has the potential to become a useful tool in O&M instructors' practice.
[0035] The auditory virtual reality can be further extended with coding of the signals by other modalities such as tactile vibration or verbal instructions.
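A minimal dispatch of the same distance reading to different modalities might look like the following; this is illustrative only and reuses distance_to_pitch from the earlier sketch:

```python
def render_feedback(distance, modality="audio", max_range=10.0):
    """Render one distance reading as an audio tone, a vibration level,
    or a verbal instruction."""
    if modality == "audio":
        return ("tone_hz", distance_to_pitch(distance, max_range))
    if modality == "haptic":
        # Vibration intensity rises as the virtual obstacle gets closer.
        return ("vibration", 1.0 - min(distance, max_range) / max_range)
    if modality == "verbal":
        return ("speech", f"Obstacle {distance:.1f} metres ahead")
    raise ValueError(f"unknown modality: {modality}")
```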