Networked Sensory Enhanced Navigation System
20170270827 · 2017-09-21
Inventors
- Sumanth Channabasappa (Broomfield, CO, US)
- Michael Lynn Hess (Lakewood, CO, US)
- Channabasappa Gurukar Mallikarjuna (Bengaluru, IN)
- Kishore Nayak (Broomfield, CO, US)
Abstract
A networked sensory enhanced navigation system enables direction of visually impaired users through complex environments. At least one user peripheral is disposed in networked communication to determine a current user location, potential future user locations, and any selected destination, relative to dynamic landscape data informing a proximal environment. Landscape data is populated by access to a Geographic Data Store (“GDS”) wherein previously determined landscape data is storable and accessible. Landscape data is verifiable by local capture effected through third-party peripherals connected over a network. The user is directed through the landscape along a designated path, or prevented from collisions, by issuance of signal alarms communicative of instructions to the user.
Claims
1. A networked sensory enhanced navigation system operable in communication with at least one peripheral device, said networked sensory enhanced navigation system comprising: continuous determination of a user location; determination of a proximal environment relative to said user location, said proximal environment generable by landscape data comprising known or sensed objects determined to be present in said proximal environment; and issuance of signal alarms perceptible to the user, said signal alarms communicative to direct the user through the proximal environment sensible of the known or sensed objects determined present in the proximal environment; wherein a visually impaired user is enabled comprehension of said user's surroundings and therefore unobstructed passage through the proximal environment.
2. The networked sensory enhanced navigation system of claim 1 wherein the user location is determined by a Global Positioning System and the proximal environment is generable from a Geographic Data Store comprising landscape data uploaded to define landscape features captured previously and pertinent to said user location, said landscape data generable by maps, plans, blueprints, transportation schedules, traffic signals, and other available data accessible via network and pertinent to the proximal environment.
3. The networked sensory enhanced navigation system of claim 2 wherein local capture of the proximal environment verifies the landscape data and updates the Geographic Data Store to include determination and position of novel and moveable objects sensed within the proximal environment.
4. The networked sensory enhanced navigation system of claim 3 wherein local capture of the proximal environment is reinforced by multiple sensors networked through the proximal environment, said multiple sensors including at least one of: a camera disposed upon the user, sonar issued from the user, infrared issued from the user, static cameras disposed in situ such as traffic cameras and security cameras, cameras disposed upon third party objects and transports, radars, radio frequency identification chips disposed upon moveable and immoveable objects, traffic signal changes, third-party peripherals determined in use in the proximal environment, drones, satellites; wherein said multiple sensors are accessible over network to determine local positions of objects sensible within the proximal environment.
5. The networked sensory enhanced navigation system of claim 4 wherein local capture of the proximal environment is effected in real time.
6. The networked sensory enhanced navigation system of claim 5 wherein ambient temperature data and local weather data are accessible over network whereby environmental conditions at specific locations are determinable in the proximal environment.
7. The networked sensory enhanced navigation system of claim 6 wherein a designated path is computable relative landscape data through the proximal environment towards a selected destination.
8. The networked sensory enhanced navigation system of claim 7 wherein computations of user velocity, destination, user location, and landscape data enable predictions of a future user location relative the designated path, whereby arrival times are calculable and future designated paths are calculable relative said future user location.
9. The networked sensory enhanced navigation system of claim 8 wherein the signal alarm is effected as sequential issuances perceptible to the user, said sequential issuances uniquely expressed to communicate proximity to various objects, to effect warning alerts, and to provide directions along the designated path.
10. The networked sensory enhanced navigation system of claim 9 wherein the signal alarm is communicated audibly.
11. The networked sensory enhanced navigation system of claim 10 wherein the signal alarm is communicated haptically.
12. The networked sensory enhanced navigation system of claim 11 wherein the signal alarm is communicated by action of at least one vibration motor portable by the user.
13. The networked sensory enhanced navigation system of claim 12 wherein the at least one vibration motor is disposed in an item of apparel.
14. The networked sensory enhanced navigation system of claim 13 wherein the at least one vibration motor is disposed within at least one shoe.
15. The networked sensory enhanced navigation system of claim 14 wherein a user is enabled to verbally interact with the system by action effective through the user peripheral device.
16. The networked sensory enhanced navigation system of claim 15 wherein non-visually impaired users may effect capture of local landscape data and update the Geographic Data Store to reflect present conditions in the proximal environment.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
DETAILED DESCRIPTION OF THE DRAWINGS
[0033] The present networked sensory enhanced navigation system 10 has been devised to assist visually impaired users in navigating in real time through a proximal environment 30. The present networked sensory enhanced navigation system 10 enables generation of landscape data 26 relative to a current user location 28 to determine the scope of a proximal environment 30 through which said user is traveling. Issuance of signal alarms 32 communicable to the user is effective to assist in directing the user through the proximal environment 30, maintaining awareness of obstacles and objects that may otherwise impede travel. Users may therefore reach a desired destination 34 without having to visually interact with the proximal environment 30.
[0034] The present networked sensory enhanced navigation system 10, therefore, is operable in communication with at least one peripheral device 20 worn or carried by a visually impaired user. The term “visually impaired”, as used herein throughout, is taken to include users who are not presently capable of viewing the proximal environment 30 relative to a user location 28 and is not necessarily limited to blind users, or partially blind users, but may also include users who are visually occupied, as, for example, users wearing headsets that obstruct view.
[0036] A Geographic Data Store (“GDS”) 24 is accessible over a network, wherein landscape data 26 is storable and retrievable and continuous determination of a user location 28 is effected relative to the proximal environment 30. The scope of said proximal environment 30 is generated from the relative situation of objects known to be present in the proximal environment 30, as assessed by maps, blueprints, building plans, and other spatial data sets 38 and landscape data 26 as may be accessed through the GDS 24, as will be described subsequently (such as upload of data from participating peripheral devices not in hand or in use by the user in question).
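By way of illustration only, the following Python sketch models the GDS 24 as a simple in-memory store queried by radius about a user location; the class names, coordinates, and 50-metre radius are hypothetical placeholders, not the patented implementation.

```python
import math
from dataclasses import dataclass, field

@dataclass
class LandscapeObject:
    """A known object in the landscape, with a fixed position."""
    label: str
    lat: float
    lon: float

@dataclass
class GeographicDataStore:
    """Minimal in-memory stand-in for the networked GDS 24."""
    objects: list = field(default_factory=list)

    def upload(self, obj: LandscapeObject) -> None:
        """Store previously determined landscape data for later retrieval."""
        self.objects.append(obj)

    def proximal(self, lat: float, lon: float, radius_m: float = 50.0) -> list:
        """Return objects within radius_m of (lat, lon) -- the proximal environment."""
        def dist_m(o: LandscapeObject) -> float:
            # Equirectangular approximation; adequate at street scale.
            dx = math.radians(o.lon - lon) * math.cos(math.radians(lat))
            dy = math.radians(o.lat - lat)
            return 6371000.0 * math.hypot(dx, dy)
        return [o for o in self.objects if dist_m(o) <= radius_m]

gds = GeographicDataStore()
gds.upload(LandscapeObject("mailbox", 39.92050, -105.08670))
gds.upload(LandscapeObject("bus stop", 39.92060, -105.08650))
print([o.label for o in gds.proximal(39.92050, -105.08660)])  # both within ~20 m
```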
[0037] Signal alarms 32 perceptible to the user are issued by at least one of the at least one peripheral device 20. These signal alarms 32 direct the user through the proximal environment 30 and render the user sensible of the known or sensed objects determined present in the proximal environment 30, whereby avoidance of objects is maintained and unobstructed passage through the proximal environment 30 is enabled.
[0039] Thus, accessing the GDS 24 enables computation of a virtual representation of the field of view surrounding the user and the current user location 28, and prognostication of future user locations 36 based on the user location 28 and a sensed user velocity.
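As a minimal sketch of such prognostication, assuming a simple dead-reckoning model (the speed, heading, and time horizon below are illustrative values, not taken from the disclosure), a future user location 36 may be projected as follows:

```python
import math

EARTH_RADIUS_M = 6371000.0

def predict_location(lat: float, lon: float, speed_mps: float,
                     heading_deg: float, dt_s: float) -> tuple:
    """Dead-reckon a future user location from the current fix and sensed velocity."""
    d = speed_mps * dt_s                      # metres covered over the horizon dt_s
    brg = math.radians(heading_deg)           # heading, clockwise from true north
    dlat = (d * math.cos(brg)) / EARTH_RADIUS_M
    dlon = (d * math.sin(brg)) / (EARTH_RADIUS_M * math.cos(math.radians(lat)))
    return lat + math.degrees(dlat), lon + math.degrees(dlon)

# A walker heading due east at 1.4 m/s, projected 10 seconds ahead.
print(predict_location(39.92050, -105.08660, 1.4, 90.0, 10.0))
```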
[0040] Capture of the proximal environment 30 is contemplated as effective by visual capture (as by a camera, for example), sonar capture (by monitoring of ultrasonic emissions, for example), infrared (by monitoring of electromagnetic emissions in the infrared spectrum, for example), or other waveform emission (such as radar, for example), as known in the art. Capture of the proximal environment 30 may be effected by at least one of the at least one peripheral device 20 in use by the user in question, or may be effected by additional peripheral devices 40, 44 found operating in the proximal environment 30, such as, for example, third-party peripherals 40 or “internet-of-things” objects 44 discovered in operation in the proximal environment 30. Thus, three-dimensional spatial analysis and positioning of objects is effected between at least two cooperating peripheral devices found operating in the proximal environment 30.
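One way to realize such positioning between two cooperating devices, offered purely as an assumed sketch (a planar two-bearing fix rather than full three-dimensional analysis), is to intersect bearing rays reported by two sensors at known positions:

```python
import math

def triangulate(p1: tuple, brg1_deg: float, p2: tuple, brg2_deg: float) -> tuple:
    """Intersect two bearing rays (x east, y north, in metres) from two sensors."""
    (x1, y1), (x2, y2) = p1, p2
    # Direction vectors; bearings are measured clockwise from north.
    d1 = (math.sin(math.radians(brg1_deg)), math.cos(math.radians(brg1_deg)))
    d2 = (math.sin(math.radians(brg2_deg)), math.cos(math.radians(brg2_deg)))
    denom = d1[0] * d2[1] - d1[1] * d2[0]
    if abs(denom) < 1e-9:
        raise ValueError("bearings are parallel; no unique fix")
    t = ((x2 - x1) * d2[1] - (y2 - y1) * d2[0]) / denom
    return (x1 + t * d1[0], y1 + t * d1[1])

# Sensor A at the origin sees an object at 45 degrees; sensor B, ten metres
# east, sees the same object at 315 degrees: the object sits near (5, 5).
print(triangulate((0.0, 0.0), 45.0, (10.0, 0.0), 315.0))
```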
[0041] Third-party peripherals 40 include traffic cameras, security cameras, radio frequency identification chips as may be employed in objects as part of the present system or the “internet-of-things”, cameras or emitters in vehicles sensible by the present system, or handheld and other peripheral computing devices as may be in operation by third-party users active in the proximal environment 30. Thus, the scope of the proximal environment 30 may be effected by capture between a plurality of sensors 42 disposed operating in the proximal environment 30, and the GDS 24, accessed by the user's at least one peripheral device 20, is updateable by real-time acquisition of local landscape data 26. The situation of objects may be repeatedly verified by multiple views effected by more than one third-party peripheral 40, whereby object positions are repeatedly confirmed.
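A hypothetical fusion routine, sketched below under the assumption that each third-party view yields a planar fix in metres, averages repeated sightings and flags the object position as verified only when the views agree within a tolerance:

```python
import math
from statistics import mean

def verify_position(observations: list, tolerance_m: float = 2.0) -> tuple:
    """Fuse repeated sightings of one object from multiple third-party views.

    observations: list of (x, y) fixes in metres. Returns the mean fix and a
    verified flag, set when every sighting lies within tolerance_m of that mean.
    """
    cx = mean(x for x, _ in observations)
    cy = mean(y for _, y in observations)
    worst = max(math.hypot(x - cx, y - cy) for x, y in observations)
    return (cx, cy), worst <= tolerance_m

# Three camera views of the same object, agreeing to within a few decimetres.
fixes = [(5.1, 4.9), (4.8, 5.2), (5.0, 5.0)]
print(verify_position(fixes))  # -> ((4.96..., 5.03...), True)
```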
[0042] The present networked sensory enhanced navigation system 10 may make use of the “internet-of-things”, wherein everyday objects 44 are networked to the internet and able to communicate local environmental data 46, such as temperature, pressure, sounds, and sights, captured by said everyday objects 44. Further, the “internet-of-things” will enable unique identifiers of such networked everyday objects 44, whereby location data of such objects 44 is determinable over network and computable by the present system 10 to generate locally accurate and updateable landscape data 26 informative of the GDS 24. Near-field communication (“NFC”) between the at least one user peripheral device 20 and said everyday objects 44 further assists in continuously monitoring and updating the user location 28.
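The following sketch, assuming hypothetical device records rather than any particular internet-of-things protocol, shows how uniquely identified everyday objects 44 might be polled for location-stamped environmental data 46:

```python
from dataclasses import dataclass

@dataclass
class NetworkedObject:
    """An "internet-of-things" everyday object 44 with a unique identifier."""
    uid: str
    lat: float
    lon: float
    readings: dict  # e.g. {"temperature_c": 21.5, "sound_db": 44.0}

def poll_environment(objects: list, wanted: tuple = ("temperature_c",)) -> dict:
    """Collect location-stamped environmental data 46 from networked objects."""
    report = {}
    for obj in objects:
        values = {k: v for k, v in obj.readings.items() if k in wanted}
        if values:
            report[obj.uid] = {"position": (obj.lat, obj.lon), **values}
    return report

things = [
    NetworkedObject("lamp-post-17", 39.92050, -105.08660, {"temperature_c": 3.5}),
    NetworkedObject("kiosk-02", 39.92070, -105.08640, {"sound_db": 61.0}),
]
print(poll_environment(things))  # only the lamp post reports temperature
```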
[0043] Real-time generation of landscape data 26, therefore, is contemplated whereby, for example, traffic patterns along streets or other conduits are determinable relative to a current user location 28. Further, traffic signals, such as pedestrian walkways and traffic lights, may be networked with the present system 10 to enable communication to the user of safe passage across a roadway, street, or other vehicular causeway, for example.
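A minimal gate for such a crossing instruction, assuming a hypothetical signal-phase feed (the phase names and ten-second safety margin below are illustrative assumptions), might read:

```python
# Hypothetical phase table for one networked pedestrian crossing:
# phase name -> whether the user may be directed to begin crossing.
CROSSING_PHASES = {"walk": True, "flashing_dont_walk": False, "dont_walk": False}

def crossing_instruction(phase: str, seconds_remaining: float) -> str:
    """Translate a networked traffic-signal state into a user instruction."""
    if CROSSING_PHASES.get(phase, False):
        if seconds_remaining >= 10.0:
            return "cross now"
        return "wait for the next walk phase"  # too little time left to cross
    return "do not cross"

print(crossing_instruction("walk", 14.0))       # -> cross now
print(crossing_instruction("walk", 4.0))        # -> wait for the next walk phase
print(crossing_instruction("dont_walk", 30.0))  # -> do not cross
```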
[0044] Additionally, local weather conditions are contemplated as determinable by the present system 10 whereby local conditions in the proximal environment 30 may be predicted. For example, at least one of the at least one peripheral device 20 in use by the user may sense ambient temperature, barometric pressure, relative humidity, and other metrics, and thereby determine environmental data 46 as may be useful to the user. The at least one peripheral device 20 may likewise determine such environmental data 46 via network, communicating with weather stations 39, for example, to determine precipitation at the user location, or at a future user location 36 towards which said user is determined to be traveling (such as if a user were to exit a building, for example). Thus precipitation is determinable, and presence of ice and/or snow upon the ground may be comprehended by sensors 42 communicating within the proximal environment 30, such as cameras and/or temperature and moisture sensors disposed in objects (including vehicles) active in the proximal environment 30. Additionally, climatic history accessible over network, user uploaded data, and spatial aspect of known locations in the proximal environment (such as northern aspects and shady areas) may enable determination of likelihood of presence of ice and snow whereby the user may be directed accordingly.
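As a sketch of how those inputs might be combined, the heuristic below scores the likelihood of ice at a known location from temperature, recent precipitation, and shaded aspect; the weights and thresholds are assumptions for illustration, not values from the disclosure:

```python
def ice_likelihood(temp_c: float, recent_precip_mm: float,
                   shaded_aspect: bool) -> float:
    """Rough 0.0-1.0 score for the likelihood of ice upon the ground."""
    score = 0.0
    if temp_c <= 0.0:
        score += 0.5       # freezing conditions dominate the estimate
    elif temp_c <= 3.0:
        score += 0.25      # near-freezing: overnight refreeze remains possible
    if recent_precip_mm > 0.0:
        score += 0.3       # moisture available to freeze
    if shaded_aspect:
        score += 0.2       # northern aspects and shady areas retain ice
    return min(score, 1.0)

# Sub-zero, recent snowfall, north-facing sidewalk: route the user around it.
print(ice_likelihood(-2.0, 4.0, shaded_aspect=True))  # -> 1.0
```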
[0047] Communication with the user along the designated path 48, or prevention of collision with objects, is effected by issuance of the signal alarm 32. The signal alarm 32 is effected as sequential issuances perceptible to the user, said sequential issuances uniquely expressed to communicate proximity to various objects, to effect warning alerts, and to provide directions along the designated path 48. For example, a specific sequence of issuances, varied in rhythm, number, frequency, or amplitude, may communicate the presence of a particular object or a recommended direction of travel towards a future user location 36. The signal alarm 32 may be issued audibly, and may include verbal instructions intelligible to the user issued as commands, for example, or sonorously emitted as specific sounds matchable with associated responses and actions to be undertaken by the user.
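One plausible encoding of such sequential issuances, assuming a hypothetical pulse vocabulary (the message names and timings below are illustrative only), is a table mapping each message to a distinct pulse pattern:

```python
# Hypothetical alarm vocabulary: message -> (pulse count, pulse ms, gap ms),
# each uniquely perceptible per the sequential-issuance scheme above.
ALARM_PATTERNS = {
    "obstacle_ahead": (3, 100, 100),  # three short, urgent pulses
    "turn_left":      (1, 400, 0),    # one long pulse
    "turn_right":     (2, 400, 200),  # two long pulses
    "destination":    (5, 80, 80),    # rapid flutter upon arrival
}

def encode_alarm(message: str) -> list:
    """Expand a message into a timeline of (state, duration_ms) steps."""
    pulses, length_ms, gap_ms = ALARM_PATTERNS[message]
    timeline = []
    for i in range(pulses):
        timeline.append(("on", length_ms))
        if i < pulses - 1:
            timeline.append(("off", gap_ms))  # inter-pulse gap, none after the last
    return timeline

print(encode_alarm("turn_right"))  # -> [('on', 400), ('off', 200), ('on', 400)]
```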
[0048] The signal alarm 32 may be communicated haptically, whereby a user is enabled perception of the signal alarm 32 by vibrations, for example, whereby particular presences and/or particular sequences of vibrations may communicate specific information, such as direction of travel, presence or absence of objects, arrival at a desired location, or a moment in time when an action should be undertaken (such as crossing the road, for example).
[0049] A plurality of vibration motors 50 may be used, situated in contact with the user in a plurality of locations, whereby vibration of any one of the plurality of vibration motors 50 may, for example, communicate proximity to an object, indicate a desired direction, or mark a preferential moment at which to move towards a future user location 36. Thus, the present networked sensory enhanced navigation system 10 is usable to direct a user through a proximal environment 30 towards a future user location 36, or along the designated path 48 towards a desired destination 34, while protecting the user from impacts and collisions with known and sensed objects existing and operating in the proximal environment 30.
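To illustrate one assumed arrangement (four motors at fixed bearings about the user's facing direction; the layout is hypothetical), the motor chosen to vibrate may simply be the one nearest the desired direction of travel:

```python
# Hypothetical layout: four vibration motors worn at fixed bearings
# relative to the user's facing direction (degrees, clockwise from ahead).
MOTOR_BEARINGS = {"front": 0.0, "right": 90.0, "back": 180.0, "left": 270.0}

def motor_for_bearing(target_bearing_deg: float) -> str:
    """Pick the motor whose placement best matches the desired direction."""
    def angular_gap(a: float, b: float) -> float:
        d = abs(a - b) % 360.0
        return min(d, 360.0 - d)  # wrap-around distance on the circle
    return min(MOTOR_BEARINGS,
               key=lambda m: angular_gap(MOTOR_BEARINGS[m], target_bearing_deg))

print(motor_for_bearing(85.0))   # -> right
print(motor_for_bearing(350.0))  # -> front
```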
[0052] The at least one peripheral device 20 further enables computations of user velocity relative to said user location 28 and, in conjunction with available and updateable landscape data 26, enables predictions of a future user location 36. The present system 10 is thereby enabled to act preventatively. Further, when future user locations 36 are generated relative to a designated path 48, estimated arrival times are calculable and future designated paths 49 are calculable relative to each said future user location 36 along the designated path 48. Thus a user may rapidly execute a side route, for example, while traveling along a designated path 48 towards a pre-selected destination 34.
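A minimal arrival-time estimate along a designated path 48, assuming the path is held as planar waypoints in metres (an illustrative representation only), follows directly from remaining distance and sensed speed:

```python
import math

def eta_seconds(path_points: list, speed_mps: float) -> float:
    """Estimated arrival time along a designated path of (x, y) metre waypoints."""
    if speed_mps <= 0:
        raise ValueError("speed must be positive to estimate arrival")
    remaining = sum(
        math.hypot(x2 - x1, y2 - y1)
        for (x1, y1), (x2, y2) in zip(path_points, path_points[1:])
    )
    return remaining / speed_mps

# 150 m of sidewalk left at a 1.4 m/s walking pace: roughly 107 seconds out.
path = [(0.0, 0.0), (100.0, 0.0), (100.0, 50.0)]
print(round(eta_seconds(path, 1.4)))  # -> 107
```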
[0053] The present networked sensory enhanced navigation system 10 is enabled for voice recognition, whereby a user is enabled to verbally interact with the system 10 effective through the user peripheral device 20. Additionally, a Graphical User Interface (“GUI”) may present renditions of the proximal environment 30 to interact with landscape data 26. This enables third-party actors to update landscape data 26 in a particular proximal environment 30 by interacting manually with the GUI. Additionally, users may select destinations by manual interaction with the GUI.
[0054] Visually impaired users are enabled interaction with the present networked sensory enhanced navigation system 10 by contacting a screen of the at least one peripheral device 20, or otherwise effecting contact with said peripheral device 20. Audible commands may then direct the user through menus, and a single tap relative to a double tap, for example, may allow for selection in the alternative; likewise, a swipe in one direction relative to a swipe in another direction may also allow for selection in the alternative. Additionally, the present networked sensory enhanced navigation system 10 enables visual capture of objects whereby a user may query the presence of objects not communicated through action of the signal alarm 32. The present networked sensory enhanced navigation system 10 may relate information pertinent to the object when stored in the GDS 24, such as, for example, a building's address.
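A sketch of such non-visual menu interaction, assuming a hypothetical gesture vocabulary (the gesture names and actions are illustrative, not drawn from the disclosure), might dispatch gestures as follows:

```python
# Hypothetical gesture vocabulary for non-visual menu interaction:
# single versus double taps select in the alternative; swipes move the cursor.
GESTURE_ACTIONS = {
    "tap":        "select current item",
    "double_tap": "select alternative item",
    "swipe_up":   "previous menu item",
    "swipe_down": "next menu item",
}

def handle_gesture(gesture: str) -> str:
    """Resolve a screen gesture into a menu action to be read back audibly."""
    return GESTURE_ACTIONS.get(gesture, "gesture not recognized")

print(handle_gesture("double_tap"))  # -> select alternative item
print(handle_gesture("pinch"))       # -> gesture not recognized
```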
[0055] The present networked sensory enhanced navigation system 10 further enables tagging of objects, preferred routes, and favorite locations, wherein a user is enabled oral input of qualifiers associated with particular geographic data and storable in the GDS 24, whereby a particular location, for example, may be tagged with metadata unique to the user, such as a preference towards a particular restaurant, shop, park, route of travel, or other object or location.
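Such tagging might be kept, under the assumption of a simple coordinate-keyed store (the names and coordinate rounding are illustrative), as user-specific metadata attached to geographic data in the GDS 24:

```python
from collections import defaultdict

class TagStore:
    """User-specific tags attached to geographic data, as held in the GDS 24."""

    def __init__(self) -> None:
        self._tags = defaultdict(list)  # (lat, lon) key -> list of qualifiers

    def tag(self, lat: float, lon: float, qualifier: str) -> None:
        """Store an orally input qualifier as metadata unique to the user."""
        self._tags[(round(lat, 5), round(lon, 5))].append(qualifier)

    def tags_near(self, lat: float, lon: float) -> list:
        """Recall any qualifiers previously attached to this location."""
        return self._tags.get((round(lat, 5), round(lon, 5)), [])

store = TagStore()
store.tag(39.92050, -105.08660, "favorite coffee shop")
print(store.tags_near(39.92050, -105.08660))  # -> ['favorite coffee shop']
```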