Non-Visual Virtual-Reality System for Navigation Training and Assessment

20220398940 · 2022-12-15


    Abstract

    A navigation system for a blind individual to guide the individual through a virtual environment includes a transponder adapted to be carried by the individual, a tracking system for determining the position of said transponder within the virtual environment, and a feedback system coupled to said transponder to generate a non-visual sensory signal indicative of said position of said transponder. The transponder has a primary sensor direction that is aimed by the individual, and the non-visual sensory signal provides indications of virtual objects aligned with said primary sensor direction, whereby the individual may be guided to find or avoid virtual objects in said virtual environment.

    Claims

    1. A navigation system for a blind individual that uses non-visual information to guide the individual through a virtual environment, including: a transponder adapted to be carried by the individual; a tracking system for determining the position and directional orientation of said transponder within the virtual environment; a feedback system coupled to said transponder to generate a non-visual sensory signal indicative of said position of said transponder; said transponder having a primary sensor direction that is aimed by the individual; said non-visual sensory signal providing indications of virtual objects or walls aligned with said primary sensor direction, whereby the individual may be guided to find or avoid virtual objects or walls in said virtual environment.

    2. The navigation system of claim 1, wherein said non-visual sensory signal comprises an audio signal.

    3. The navigation system of claim 2, wherein said audio signal includes variable parameters selected from a group including frequency, amplitude, pulse and rhythms.

    4. The navigation system of claim 1, wherein said non-visual signal further indicates the relative size and motion of virtual objects relative to the individual in the primary sensor direction.

    5. The navigation system of claim 4, wherein said non-visual signal comprises an audio signal, said audio signal including variable parameters selected from a group including frequency, amplitude, pulse and rhythms, said variable parameters having established correspondences with said relative size and motion of the virtual objects.

    6. The navigation system of claim 1, wherein said non-visual sensory signal comprises a haptic signal.

    7. The navigation system of claim 6, wherein said haptic signal includes variable parameters selected from a group including frequency, amplitude, pulse and rhythms, said variable parameters having established correspondences with physical characteristics of the virtual objects.

    8. The navigation system of claim 7, wherein said physical characteristics of the virtual objects include the relative size and motion of virtual objects relative to the individual in the primary sensor direction.

    9. The navigation system of claim 8, wherein said non-visual sensory signal includes verbal information.

    10. The navigation system of claim 1, wherein said transponder is a smartphone or other self-tracking device that can generate said information about the position and directional orientation of said transponder within the virtual environment.

    11. A method for directing an individual to navigate within a virtual environment, including the steps of: providing a transponder adapted to be carried by the individual, said transponder having a primary sensor direction that is aimed by the individual; providing a tracking system for determining the position and directional orientation of said transponder within the virtual environment; providing a feedback system coupled to said transponder to generate a non-visual sensory signal indicative of said position of said transponder; encoding said non-visual sensory signal to provide indications of virtual objects or walls aligned with said primary sensor direction, whereby the individual may be guided to find or avoid virtual objects or walls in said virtual environment.

    12. The method of claim 11, wherein said non-visual sensory signal comprises an audio signal, and said audio signal includes variable parameters selected from a group including frequency, amplitude, pulse and rhythms.

    13. The method of claim 12, wherein said variable parameters have established correspondences with the size of the virtual objects and their motion with respect to the individual.

    14. The method of claim 11, wherein said transponder is a smartphone or other self-tracking device that can generate said information about the position and directional orientation of said transponder within the virtual environment.

    15. A method for training an individual to create a mental map of a virtual space, including the steps of: defining a virtual space having one or more virtual objects or walls; providing the individual with a transponder adapted to be carried by the individual, said transponder having a primary sensor direction that is aimed by the individual; providing a tracking system for determining the position of said transponder within the virtual environment; providing a feedback system coupled to said transponder to generate a non-visual sensory signal indicative of said position and directional orientation of said transponder; encoding said non-visual sensory signal to provide indications of virtual objects or walls aligned with said primary sensor direction, whereby the individual may be guided to find or avoid virtual objects or walls in said virtual environment.

    Description

    BRIEF DESCRIPTION OF THE DRAWINGS

    [0017] FIG. 1 depicts a blindfolded user tracking a trajectory through a virtual street layout within an empty indoor space of the VEERS system of the invention.

    [0018] FIG. 2 is a plan layout showing one embodiment that employs multiple positional anchors and a handheld transponder to establish the virtual empty environment and tracking capabilities.

    [0019] FIG. 3 is a functional block diagram depicting the operation of the VEERS system of the invention.

    [0020] FIG. 4 depicts a further embodiment of the VEERS invention in which the individual is tracked by multiple cameras within the virtual layout.

    [0021] FIG. 5 depicts another embodiment of the VEERS system in which a hand-held real (non-virtual) ranging device provides information about the distances of real objects within the virtual space.

    [0022] FIG. 6 depicts another embodiment of the VEERS system in which a hand-held real (non-virtual) ranging device provides information about the distances of real objects within the virtual space, using a narrow directional beam to detect objects.

    [0023] FIG. 7 depicts a further embodiment of the VEERS system in which the direction of movement of the finger on the screen of the device indicates the distance to virtual objects or scene layout features in a miniaturized version of the VEERS concept.

    DETAILED DESCRIPTION OF THE INVENTION

    [0024] The invention generally comprises a navigation system for the blind that uses auditory or other non-visual information, such as vibration, force feedback, etc., to generate a virtual environment in empty real space (hereinafter VEERS) for entirely non-visual navigation. With regard to FIG. 1, the system provides an empty space 11 that may comprise an indoor room or an outdoor space that includes a floor or surface for an individual 1 to stand and move about without encountering real obstacles. A virtual layout 2 is defined by the system software, and a handheld transponder 3 conveys the information about the virtual layout 2 to the user 1 by an auditory or other signal coding the distance to the nearest obstacle in the direction that the transponder is pointing. Here the auditory signal is conveyed by a cable 4 to an earpiece in the user's ear, although wireless audio devices may alternatively be used. The user 1 might be blind or have their sight occluded by a blindfold 5. The virtual-reality layout defines a desired path 6 for the individual 1 to follow, and the actual path 7 is detected by the system.
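    As an illustrative sketch only (the patent does not specify a data structure or algorithm), the directional distance readout described above can be modeled by storing the virtual layout as 2-D wall segments and ray-casting from the user's position along the transponder's pointing direction; all function names and parameters below are hypothetical:

```python
import math

def ray_segment_distance(origin, heading_deg, seg):
    """Distance from origin along heading to a wall segment, or None if missed."""
    ox, oy = origin
    dx, dy = math.cos(math.radians(heading_deg)), math.sin(math.radians(heading_deg))
    (x1, y1), (x2, y2) = seg
    ex, ey = x2 - x1, y2 - y1
    denom = dx * ey - dy * ex
    if abs(denom) < 1e-12:          # ray parallel to the wall segment
        return None
    # Solve origin + t*d = p1 + u*e for t (ray parameter) and u (segment parameter)
    t = ((x1 - ox) * ey - (y1 - oy) * ex) / denom
    u = ((x1 - ox) * dy - (y1 - oy) * dx) / denom
    if t >= 0 and 0 <= u <= 1:
        return t
    return None

def nearest_obstacle_distance(origin, heading_deg, walls):
    """Distance to the nearest virtual wall in the pointing direction, or None."""
    hits = [d for w in walls
            if (d := ray_segment_distance(origin, heading_deg, w)) is not None]
    return min(hits) if hits else None
```

    The returned distance would then be encoded into the auditory or haptic signal described below.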

    [0025] With regard to FIG. 2, one embodiment of the VEERS design utilizes a Pozyx UWB (ultra-wide band) wireless positioning system (Pozyx BVBA, Belgium). It includes an advanced modular architecture consisting of a system of at least 4 positional anchors 12, and the handheld transponder 3 is wirelessly connected to the UWB anchors. The anchors 12 provide the signals for tracking the user position in the empty space of the VEERS non-visual tracking system. The transponder operates as an extended virtual cane to provide auditory information about the distance to virtual objects in any direction the user 1 is pointing it in the empty VEERS space. In this way, the real navigational space can be populated by a virtual auditory layout with which to prompt and test the user's ‘cognitive map’ by navigating correctly. The virtual auditory layout can be a continuous arrangement of virtual walls defining navigable streets or other spaces between them, or isolated virtual objects arranged in the navigation space.
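    The anchor ranges must be converted into a user position. A standard least-squares multilateration sketch in two dimensions (illustrative only; not necessarily how the Pozyx system computes position internally, and the function name is hypothetical) is:

```python
def trilaterate_2d(anchors, dists):
    """Least-squares position from >=3 anchor (x, y) points and measured ranges.

    Linearizes the circle equations by subtracting the first anchor's equation,
    then solves the resulting 2x2 normal equations.
    """
    (x0, y0), r0 = anchors[0], dists[0]
    rows, rhs = [], []
    for (xi, yi), ri in zip(anchors[1:], dists[1:]):
        rows.append((2 * (xi - x0), 2 * (yi - y0)))
        rhs.append(xi**2 + yi**2 - x0**2 - y0**2 - ri**2 + r0**2)
    # Normal equations (A^T A) p = A^T b for the 2x2 system
    a11 = sum(r[0] * r[0] for r in rows)
    a12 = sum(r[0] * r[1] for r in rows)
    a22 = sum(r[1] * r[1] for r in rows)
    b1 = sum(r[0] * v for r, v in zip(rows, rhs))
    b2 = sum(r[1] * v for r, v in zip(rows, rhs))
    det = a11 * a22 - a12 * a12
    x = (a22 * b1 - a12 * b2) / det
    y = (a11 * b2 - a12 * b1) / det
    return x, y
```

    With four anchors at the corners of the space, as in FIG. 2, the overdetermined system averages out individual ranging errors.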

    [0026] With regard to FIG. 4, a further embodiment is an implementation of the VEERS system based on other means of tracking the position and orientation of the transponder. A pair (or more) of optical imaging sensors 21 provide optical imaging of the transponder 3 from above, and track the individual 1 as well as the orientation of the transponder 3, so that directional prompting may be provided under software control through the transponder 3. Again, this embodiment populates the real navigational space with a virtual auditory layout with which to test the user's ‘cognitive map’ by navigating correctly along a trajectory 6 conveyed by auditory or other non-visual sensory cues. In a similar way, multi-camera optical tracking, or any of the array of optical tracking techniques reviewed in Welch, Bishop, Vicci, Brumback, Keller & Colucci (2001, High-Performance Wide-Area Optical Tracking), may be employed.

    [0027] With regard to FIGS. 3 and 7, a further embodiment is an implementation of the VEERS system based on replacing the transponder with a smartphone 51 or other self-tracking device. The smartphone 51 tracks the individual 1 as well as the direction in which the smartphone 51 is pointing, so that directional prompting may be provided under software control from within the smartphone 51. Again, in this embodiment the real navigational space is populated by a virtual auditory layout with which to test the user's ‘cognitive map’ by navigating correctly along a trajectory 6 conveyed by auditory or other non-visual sensory cues. The positional and directional information provided by the smartphone, encoding the directional distance to virtual obstacles, may be employed in a similar way.

    [0028] The directional distance signal to the virtual objects can be the pitch of a tone, with a low tone for distant objects, increasing in frequency (and urgency) as the (virtual) object or obstacle is approached. Other forms of obstacle readout could be utilized, or included together with the distance encoding, such as knocking sounds when a wall is encountered. The transponder could indicate the distance of obstacles in the direction of the transponder's orientation by a tactile readout such as the intensity of its vibration, or by an auditory verbal readout of the distance to the object. Auditory parameters such as frequency, amplitude, pulse and rhythms may be used to convey details about the virtual environment, and each parameter may have an established correspondence with a respective physical aspect such as the size and relative motion of the virtual objects. The user scans the virtual transponder beam across the space to obtain a wide-angle sense of the arrangement of objects in the virtual scene. (Note that this system does not require recreating the ambient sounds of a real environment, which is the approach taken by prior auditory virtual-reality systems, or overlaying auditory identifiers for specific target locations, which is an approach taken by prior augmented-reality systems.)
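    One possible distance-to-pitch encoding along the lines described above (a minimal linear sketch; the frequency range, maximum range, and mapping curve are design choices not fixed by the patent, and the names are hypothetical) is:

```python
def distance_to_pitch(distance_m, near_hz=880.0, far_hz=220.0, max_range_m=10.0):
    """Map distance to tone frequency: low pitch when far, rising as the obstacle nears."""
    d = min(max(distance_m, 0.0), max_range_m)   # clamp to the usable range
    # Linear interpolation between the far and near frequencies
    return far_hz + (near_hz - far_hz) * (1.0 - d / max_range_m)
```

    A logarithmic or exponential curve could equally be used to give finer resolution at close range.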

    [0029] The system may encode aspects of the structure of the object, such as its height and width, and include that information in the non-visual readout, whether auditory, tactile or verbal. A further embodiment could encode the size of objects as a second variable, such as sound or vibrational intensity, or provide a verbal specification of the width and height of the obstacle.

    [0030] The basic training function of the invention is to allow the user to build up and verify a mental map of the auditory virtual-reality space by navigating through it guided by the auditory distance cue. The accuracy of the mental map can be improved by practice in minimizing the errors and maximizing the speed of navigation through the virtual paths defined in the map. This enhanced mental-mapping capability, whether deployed by blind users or non-visually by sighted users, can be of general use when navigating real physical environments guided by the usual means of the long cane and general auditory cues. It may be appreciated that the training activities take place in an open space that is generally free of obstacles and obstructions, so that the training routines avoid collisions and impacts with objects and are generally safe.

    [0031] The training procedure utilizing the VEERS system may include initial training on tactile maps through hand exploration and memorizing of the navigational layout, such as the Likova Cognitive-Kinesthetic Training (see U.S. Pat. No. 10,722,150, issued Jul. 28, 2020), further transferred to the full-scale VEERS navigational layout for extended training and assessment of non-visual navigational capabilities. As the VEERS system can serve both assessment and navigational training, it has the potential to become a useful tool in the O&M instructors' practice.

    [0032] With regard to FIG. 5, a further embodiment of the transponder element of the system is a hand-held real (non-virtual) laser or radar ranging device 31 that radiates a directional signal to detect objects by receiving reflections therefrom. A plurality of real objects 32 may be placed within the layout 2 to act as obstacles or markers within the layout. This system uses the various feedback modes of the VEERS system to provide the user with information about the distances of the objects 32 in the direction of the laser or radar signal in the real world. These feedback modes include audio pitch, audio pulse frequency, tactile vibration frequency or verbal specification of the distance. As in the VEERS system, the user would scan the beam across the physical scene to obtain a wide-angle sense of the arrangement of objects in the real scene.

    [0033] With regard to FIG. 6, another embodiment of the real-world ranging system includes a handheld device 41 in which the laser or radar ranging beam has a vertical spread or rapid vertical scan beam 42, so as to capture distance information about real-world objects with a narrow vertical extent, such as a tree-branch, that might be missed by a point-wise beam. The feedback signal would be processed to specify the nearest part of the objects or walls encountered by the vertical beam, to allow the user to treat them as avoidable obstacles.
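    The nearest-part processing for the vertical fan beam can be sketched as a minimum over the beam's range samples (illustrative only; the sample format and maximum range are assumptions, not details from the patent):

```python
def fan_beam_distance(range_samples_m, max_range_m=15.0):
    """Nearest valid return across the vertical fan of range samples; None if no hit.

    Samples of 0 (no return) or beyond the device's range are discarded, so a
    thin obstacle such as a tree branch registers at its closest point.
    """
    hits = [r for r in range_samples_m if 0.0 < r <= max_range_m]
    return min(hits) if hits else None
```
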

    [0034] With regard to FIG. 7, another embodiment of the invention is a handheld device 51 such as a tablet computer or smartphone to both train and test navigational skills by means of audio-haptic feedback. However, instead of coding position on the screen of the device, as in U.S. Pat. No. 10,722,150 to L. Likova, the direction of movement of the subject's finger 52 on the screen 53 of the device 51 may be used to code the distance to virtual objects or scene layout features in a miniaturized version of the VEERS concept. As the finger moves, the angle of the most recent segment of the trajectory defines the direction for coding the distance to the first intersection with elements of the virtual scene layout, such as buildings or other objects encoded for any particular application. This distance is then conveyed by an auditory signal of choice, such as the pitch of a tone. The user can then use their finger movements to navigate through the virtual scene layout to practice or test their memory of the scene structure of any virtual layout that is encoded on the screen.
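    The finger-trajectory coding can be sketched as follows: the heading of the most recent trajectory segment is taken, and the distance to the first intersection with a virtual building (modeled here, hypothetically, as an axis-aligned rectangle via the slab method) is computed; that distance would then drive the auditory pitch as in the full-scale system. All names below are illustrative assumptions:

```python
import math

def finger_heading_deg(trail):
    """Heading of the most recent segment of the finger trajectory (screen points)."""
    (x1, y1), (x2, y2) = trail[-2], trail[-1]
    return math.degrees(math.atan2(y2 - y1, x2 - x1))

def ray_rect_distance(origin, heading_deg, rect):
    """Distance along heading to an axis-aligned rectangle (xmin, ymin, xmax, ymax)."""
    ox, oy = origin
    dx = math.cos(math.radians(heading_deg))
    dy = math.sin(math.radians(heading_deg))
    xmin, ymin, xmax, ymax = rect
    tmin, tmax = 0.0, float("inf")
    # Intersect the ray with the x and y slabs of the rectangle
    for o, d, lo, hi in ((ox, dx, xmin, xmax), (oy, dy, ymin, ymax)):
        if abs(d) < 1e-12:
            if o < lo or o > hi:        # ray parallel to slab and outside it
                return None
        else:
            t1, t2 = (lo - o) / d, (hi - o) / d
            if t1 > t2:
                t1, t2 = t2, t1
            tmin, tmax = max(tmin, t1), min(tmax, t2)
            if tmin > tmax:
                return None
    return tmin
```

    In use, the heading from the latest finger positions would be fed to the rectangle intersection for each building in the encoded layout, and the minimum distance converted to a tone.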

    [0035] The auditory virtual reality can be further extended with coding of the signals by other modalities such as tactile vibration or verbal instructions.