Networked Sensory Enhanced Navigation System

20170270827 · 2017-09-21

    Abstract

    A networked sensory enhanced navigation system enables direction of visually impaired users through complex environments. At least one user peripheral is disposed in networked communication to determine a current user location, potential future user locations, and any selected destination, relative dynamic landscape data informing a proximal environment. Landscape data is populated by access to a Geographic Data Store (“GDS”) wherein previously determined landscape data is storable and accessible. Landscape data is verifiable by local capture effective through third-party peripherals connected over network. The user is directed through the landscape along a designated path, or prevented from collision, by issuance of signal alarms communicative of instructions to the user.

    Claims

    1. A networked sensory enhanced navigation system operable in communication with at least one peripheral device, said networked sensory enhanced navigation system comprising: continuous determination of a user location; determination of a proximal environment relative to said user location, said proximal environment generable by landscape data comprising known or sensed objects determined to be present in said proximal environment; and issuance of signal alarms perceptible to the user, said signal alarms communicative to direct the user through the proximal environment sensible of the known or sensed objects determined present in the proximal environment; wherein a visually impaired user is enabled comprehension of said user's surroundings and therefore unobstructed passage through the proximal environment.

    2. The networked sensory enhanced navigation system of claim 1 wherein the user location is determined by a Global Positioning System and the proximal environment is generable from a Geographic Data Store comprising landscape data uploaded to define landscape features captured previously and pertinent to said user location, said landscape data generable by maps, plans, blueprints, transportation schedules, traffic signals, and other available data accessible via network and pertinent to the proximal environment.

    3. The networked sensory enhanced navigation system of claim 2 wherein local capture of the proximal environment verifies the landscape data and updates the Geographic Data Store to include determination and position of novel and moveable objects sensed within the proximal environment.

    4. The networked sensory enhanced navigation system of claim 3 wherein local capture of the proximal environment is reinforced by multiple sensors networked through the proximal environment, said multiple sensors including at least one of: a camera disposed upon the user, sonar issued from the user, infrared issued from the user, static cameras disposed in situ such as traffic cameras and security cameras, cameras disposed upon third party objects and transports, radars, radio frequency identification chips disposed upon moveable and immoveable objects, traffic signal changes, third-party peripherals determined in use in the proximal environment, drones, satellites; wherein said multiple sensors are accessible over network to determine local positions of objects sensible within the proximal environment.

    5. The networked sensory enhanced navigation system of claim 4 wherein local capture of the proximal environment is effected in real time.

    6. The networked sensory enhanced navigation system of claim 5 wherein ambient temperature data and local weather data are accessible over network whereby environmental conditions at specific locations are determinable in the proximal environment.

    7. The networked sensory enhanced navigation system of claim 6 wherein a designated path is computable relative landscape data through the proximal environment towards a selected destination.

    8. The networked sensory enhanced navigation system of claim 7 wherein computations of user velocity, destination, user location, and landscape data enable predictions of a future user location relative the designated path, whereby arrival times are calculable and future designated paths are calculable relative said future user location.

    9. The networked sensory enhanced navigation system of claim 8 wherein the signal alarm is effected as sequential issuances perceptible to the user, said sequential issuances uniquely expressed to communicate proximity to various objects, to effect warning alerts, and to provide directions along the designated path.

    10. The networked sensory enhanced navigation system of claim 9 wherein the signal alarm is communicated audibly.

    11. The networked sensory enhanced navigation system of claim 10 wherein the signal alarm is communicated haptically.

    12. The networked sensory enhanced navigation system of claim 11 wherein the signal alarm is communicated by action of at least one vibration motor portable by the user.

    13. The networked sensory enhanced navigation system of claim 12 wherein the at least one vibration motor is disposed in an item of apparel.

    14. The networked sensory enhanced navigation system of claim 13 wherein the at least one vibration motor is disposed within at least one shoe.

    15. The networked sensory enhanced navigation system of claim 14 wherein a user is enabled to verbally interact with the system by action effective through the user peripheral device.

    16. The networked sensory enhanced navigation system of claim 15 wherein non-visually impaired users may effect capture of local landscape data and update the Geographic Data Store to reflect present conditions in the proximal environment.

    Description

    BRIEF DESCRIPTION OF THE DRAWINGS

    Figures

    [0028] FIG. 1 is a diagrammatic view illustrating an example embodiment articulating landscape data informing a proximal environment.

    [0029] FIG. 2 is a diagrammatic view illustrating an example embodiment effecting verification and update of a Geographic Data Store by determination of local environmental conditions and capture of landscape data from third party peripherals active within the proximal environment.

    [0030] FIG. 3 is a diagrammatic view of an example embodiment of a designated path determined from a current user location towards a selected destination by way of a plurality of future user locations.

    [0031] FIG. 4 is a diagrammatic view of an example embodiment of a plurality of vibration motors disposed in each of a user's pair of shoes.

    [0032] FIG. 5 is a diagrammatic view of an example embodiment of unique issuances of a signal alarm communicating directions and presence of objects to be avoided.

    DETAILED DESCRIPTION OF THE DRAWINGS

    [0033] The present networked sensory enhanced navigation system 10 has been devised to assist visually impaired users in navigating in real time through a proximal environment 30. The present networked sensory enhanced navigation system 10 enables generation of landscape data 26 relative a current user location 28 to determine scope of a proximal environment 30 through which said user is traveling. Issuance of signal alarms 32 communicable to the user is effective to assist in directing the user through the proximal environment 30, maintaining awareness of obstacles and objects that may otherwise impede travel. Users may therefore reach a desired destination 34 without having to visually interact with the proximal environment 30.

    [0034] The present networked sensory enhanced navigation system 10, therefore, is operable in communication with at least one peripheral device 20 worn or carried by a visually impaired user. The term “visually impaired”, as used herein throughout, is taken to include users who are not presently capable of visioning the proximal environment 30 relative a user location 28 and is not necessarily limited to blind users, or partially blind users, but may also include users who are visually occupied, as for example, users wearing headsets that obstruct view.

    [0035] As shown in FIG. 1, the at least one peripheral device 20 is disposed in network communication with a Global Positioning System (“GPS”) 22 whereby a user location 28 is determinable. GPS 22 may include triangulation of a signal between transceivers, as is possible between cellular communications towers, for example, or by interaction with other transceivers extant in the locale, or by satellite or repeating signal communications between available transceivers or additional sensors or other peripherals 40, and the at least one peripheral device 20. Additional means of determining user location 28 by sensed interaction with the peripheral device 20 are contemplated as part of this disclosure, as known in the art.
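The triangulation between transceivers described in [0035] can be made concrete. The following sketch (not part of the disclosure; the function name and the planar geometry are illustrative assumptions) estimates a 2-D position from three transceiver measurements of the form (x, y, distance):

```python
def trilaterate(beacons):
    """Estimate a 2-D location from three (x, y, distance) measurements
    by solving the linearized circle-intersection system."""
    (x1, y1, r1), (x2, y2, r2), (x3, y3, r3) = beacons
    # Subtracting the circle equations pairwise cancels the quadratic
    # terms, leaving two linear equations in x and y.
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = r1**2 - r2**2 + x2**2 - x1**2 + y2**2 - y1**2
    a2, b2 = 2 * (x3 - x2), 2 * (y3 - y2)
    c2 = r2**2 - r3**2 + x3**2 - x2**2 + y3**2 - y2**2
    det = a1 * b2 - a2 * b1  # zero when the three beacons are collinear
    x = (c1 * b2 - c2 * b1) / det
    y = (a1 * c2 - a2 * c1) / det
    return x, y
```

A practical system would fuse many noisy measurements (e.g. by least squares) rather than intersecting exactly three circles, but the geometric principle is the same.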

    [0036] Geographic Data Store (“GDS”) 24 memory is accessible over network wherein landscape data 26 is storable and retrievable and continuous determination of a user location 28 is effective relative the proximal environment 30. Scope of said proximal environment 30 is generated by relative situation of objects known to be present in the proximal environment 30, as assessed by maps, blueprints, building plans, and other spatial data sets 38 and landscape data 26 as may be accessed through the GDS 24, as will be described subsequently (such as upload of data from participating peripheral devices not in hand or in use by the user in question). See FIG. 2. In proximal environments 30 where landscape data 26 is changeable, sensed objects may be determined to be present in said proximal environment 30, whereby landscape data 26 is updateable to inform the proximal environment 30 in real time.

    [0037] Signal alarms 32 perceptible to the user are issued by at least one of the at least one peripheral device 20. These signal alarms 32 are communicative to direct the user through the proximal environment 30 sensible of the known or sensed objects determined present in the proximal environment 30, whereby avoidance of objects is maintained and unobstructed passage through the proximal environment 30 is enabled.

    [0038] As shown in FIG. 2, landscape data 26 may include, for example, maps, plans, blueprints, transportation schedules, among other available data accessible via network 38 and determined to be pertinent to the proximal environment 30 surrounding the relevant user location 28. Thus a proximal environment 30 interior to a particular building is comprehensible by access to building plans storable in the GDS 24 corresponding to the user location 28. Similarly a proximal environment 30 outside in a particular area is comprehensible by access to maps storable in the GDS 24 corresponding to the user location 28. See FIG. 2.

    [0039] Thus, accessing the GDS 24 enables computation of a virtual representation of the field of view surrounding the user, the current user location 28, and prognostication of future user locations 36 based on the user location 28 and a sensed user velocity. As shown in FIG. 2, capture of the proximal environment 30 by one of the at least one peripheral device 20 may verify 35 the landscape data 26 as accessed by the GDS 24 and update the GDS 24 to include determination of unique objects sensed within the proximal environment 30 otherwise unknown in the relevant landscape data 26. Thus, moveable objects discovered in a new position relative known landscape data 26 stored in the GDS 24 for the current user location 28, for example, are thereby relocated by the system 10. New objects unknown to the GDS 24 are thence populated to the GDS 24 when discovered. See FIG. 2.
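The verify-and-update cycle of [0039] — relocating moveable objects and populating newly discovered ones — can be modeled as reconciling local captures against stored positions. A minimal in-memory sketch (the class and method names are hypothetical, not from the disclosure):

```python
class GeographicDataStore:
    """Minimal sketch of the GDS: object positions keyed by identifier.
    Local captures relocate moved objects and add unknown ones."""

    def __init__(self):
        self.objects = {}  # object_id -> (x, y)

    def reconcile(self, captures):
        """captures: iterable of (object_id, (x, y)) sensed locally.
        Returns the ids that were relocated and the ids newly added."""
        relocated, added = [], []
        for obj_id, pos in captures:
            known = self.objects.get(obj_id)
            if known is None:
                added.append(obj_id)
            elif known != pos:
                relocated.append(obj_id)
            self.objects[obj_id] = pos
        return relocated, added
```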

    [0040] Capture of the proximal environment 30 is contemplated as effective by visual capture (as by a camera, for example), sonar capture (by monitoring of ultrasonic emissions, for example), infrared (by monitoring of electromagnetic emissions in the infrared spectrum, for example), or other waveform emission (such as radar, for example), as known in the art. Capture of the proximal environment 30 may be effected by at least one of the at least one peripheral device 20 in use by the user in question, or may be effected by additional peripheral devices 40, 44 found operating in the proximal environment 30, such as, for example, third party peripherals 40 or “internet-of-things” 44 discovered in operation in the proximal environment 30. Thus three dimensional spatial analysis and positioning of objects is effective between at least two cooperating peripheral devices found operating in the proximal environment 30.

    [0041] Third-party peripherals 40 include traffic cameras, security cameras, radio frequency identification chips as may be employed in objects as part of the present system or the “internet-of-things”, cameras or emitters in vehicles sensible by the present system, or handheld and other peripheral computing devices as may be in operation by third-party users active in the proximal environment 30. Thus scope of the proximal environment 30 may be effected by capture between a plurality of sensors 42 disposed operating in the proximal environment 30, and the GDS 24 accessed by the user's at least one peripheral device 20 is updateable by real-time acquisition of local landscape data 26. Situation of objects may be repeatedly verified by multiple views effected by more than one third-party peripheral 40, whereby confirmation and verification of object position is repeatedly known.

    [0042] The present networked sensory enhanced navigation system 10 may make use of the “internet-of-things”, wherein everyday objects 44 are networked to the internet and able to communicate local environmental data 46 such as temperature, pressure, sounds and sights, and other data, captured by said everyday objects 44. Further, the “internet-of-things” will enable unique identifiers of such networked everyday objects 44 whereby location data of such objects 44 is determinable over network and computable by the present system 10 to generate locally accurate and updateable landscape data 26 informative of the GDS 24. Near-field communication (“NFC”) between the at least one user peripheral device 20 and said everyday objects 44 further assists in continuously monitoring and updating the user location 28.

    [0043] Real time generation of landscape data 26, therefore, is contemplated whereby, for example, traffic patterns along streets or other conduits are determinable relative a current user location 28. Further, traffic signals, such as pedestrian walkways and traffic lights, may be networked with the present system 10 to enable communication to the user of safe passage across a roadway, street, or other vehicular causeway, for example (see FIGS. 2 and 3).

    [0044] Additionally, local weather conditions are contemplated as determinable by the present system 10 whereby local conditions in the proximal environment 30 may be predicted. For example, at least one of the at least one peripheral device 20 in use by the user may sense ambient temperature, barometric pressure, relative humidity, and other metrics, and thereby determine environmental data 46 as may be useful to the user. The at least one peripheral device 20 may likewise determine such environmental data 46 via network, communicating with weather stations 39, for example, to determine precipitation at the user location, or at a future user location 36 towards which said user is determined to be traveling (such as if a user were to exit a building, for example). Thus precipitation is determinable, and presence of ice and/or snow upon the ground may be comprehended by sensors 42 communicating within the proximal environment 30, such as cameras and/or temperature and moisture sensors disposed in objects (including vehicles) active in the proximal environment 30. Additionally, climatic history accessible over network, user uploaded data, and spatial aspect of known locations in the proximal environment (such as northern aspects and shady areas) may enable determination of likelihood of presence of ice and snow whereby the user may be directed accordingly.
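The likelihood-of-ice determination sketched in [0044] combines several weak signals: temperature, precipitation history, and spatial aspect. One illustrative scoring heuristic (the thresholds, weights, and function name are invented for illustration, not taken from the disclosure):

```python
def ice_risk(temp_c, recent_precip, shaded):
    """Combine temperature, recent precipitation, and spatial aspect
    (shade) into a 0-1 ice-likelihood score. Weights are illustrative."""
    risk = 0.0
    if temp_c <= 0:
        risk += 0.5   # at or below freezing: strongest signal
    elif temp_c <= 3:
        risk += 0.25  # near freezing: refreeze still plausible
    if recent_precip:
        risk += 0.3   # moisture available to freeze
    if shaded:
        risk += 0.2   # shaded/northern aspects retain ice longer
    return min(risk, 1.0)
```

A score above some threshold could cause the system to route the user around the suspect location.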

    [0045] As shown in FIG. 3, the networked sensory enhanced navigation system 10 is capable of determining a designated path 48, computable through the proximal environment 30, towards a desired and selected destination 34. The user may select a desired destination 34, which the present system 10 will subsequently navigate said user towards, or the system 10 may act preventatively, preventing collision or contact with objects known and sensed within the proximal environment 30 relative said user's user location 28 and projected future user locations 36.

    [0046] As shown in FIG. 3, the user location 28 orients the user within the proximal environment 30. In the simplified example illustrated in FIG. 3, the user is guided in a direction until reaching future location A whereat the user is directed to turn right and subsequently cross the road when the traffic signal 45, communicating over network with the at least one user peripheral 20, enables safe passage. The user then approaches future location B and is caused to again cross the road. The user then is directed through the park towards future location C, said user directed around trees and other objects as case may be. The user is then directed across crosswalk D and brought to the desired destination 34 by entering into the building at future location E.
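The designated-path computation of [0045] and [0046] amounts to shortest-path search over landscape data. A minimal breadth-first-search sketch over a walkability grid (the grid representation is an assumption for illustration; a deployed system would search richer landscape data with weighted edges for hazards):

```python
from collections import deque

def designated_path(grid, start, goal):
    """Breadth-first search over a walkability grid (True = passable).
    Returns the list of cells from start to goal, or None if unreachable."""
    queue = deque([start])
    came_from = {start: None}
    while queue:
        cell = queue.popleft()
        if cell == goal:
            # Walk the predecessor chain back to start, then reverse.
            path = []
            while cell is not None:
                path.append(cell)
                cell = came_from[cell]
            return path[::-1]
        x, y = cell
        for nxt in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            nx, ny = nxt
            if (0 <= nx < len(grid) and 0 <= ny < len(grid[0])
                    and grid[nx][ny] and nxt not in came_from):
                came_from[nxt] = cell
                queue.append(nxt)
    return None
```

The intermediate cells of the returned path correspond to the future user locations (A, B, C, ...) through which the user is guided in FIG. 3.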

    [0047] Communication with the user along the designated path 48, or preventative of collision with objects, is effective by issuance of the signal alarm 32. The signal alarm 32 is effected as sequential issuances perceptible to the user, said sequential issuances uniquely expressed to communicate proximity to various objects, to effect warning alerts, and to provide directions along the designated path 48. For example a specific sequence of issuances, determined by rhythm or number, frequency or amplitude, for example, may communicate presence of a particular object, say, or a direction recommended for travel towards a future user location 36. The signal alarm 32 may be issued audibly, and may include verbal instructions intelligible to the user issued as commands, for example, or sonorously emitted as specific sounds matchable with associated responses and actions to be undertaken by the user.
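The uniquely expressed sequential issuances of [0047] suggest a lookup from navigation events to pulse patterns distinguished by rhythm and count. A hypothetical encoding (the event names and pulse timings are illustrative assumptions, not specified by the disclosure):

```python
# Each pattern is a sequence of (pulse_ms, pause_ms) pairs; the count and
# rhythm of pulses uniquely identify the event being communicated.
ALARM_PATTERNS = {
    "turn_left":  [(200, 100)],
    "turn_right": [(200, 100), (200, 100)],
    "stop":       [(600, 0)],
    "obstacle":   [(100, 50), (100, 50), (100, 50)],
}

def issue_alarm(event):
    """Return the pulse sequence for an event; unknown events map to a
    generic attention pattern rather than silence."""
    return ALARM_PATTERNS.get(event, [(100, 100)] * 5)
```

The same pattern table could drive either an audible emitter or the haptic vibration motors described below.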

    [0048] The signal alarm 32 may be communicated haptically, whereby a user is enabled perception of the signal alarm 32 by vibrations, for example, whereby particular presences and/or particular sequences of vibrations may communicate specific information, such as direction of travel, presence or absence of objects, arrival at a desired location, or a moment in time when an action should be undertaken (such as crossing the road, for example). In such an embodiment, as illustrated diagrammatically in FIGS. 4 and 5, it is contemplated that at least one vibration motor 50 be disposed in operational communication with the at least one peripheral device 20, said at least one vibration motor 50 dispositional in contact with a user as integrated, for example, within an item of apparel or piece of equipment, accouterment, or other portable object.

    [0049] A plurality of vibration motors 50 may be used, situated in contact with the user in a plurality of locations, whereby vibration of any one of the plurality of vibration motors 50 may, for example, communicate proximity to an object or indicate a desired direction, or a preferential moment wherein to make a motion towards a future user location 36. Thus the present networked sensory enhanced navigation system 10 is usable to direct a user through a proximal environment 30 towards a future user location 36, or along the designated path 48 towards a desired destination 34, while protecting the user from impacts and collisions with known and sensed objects existing and operating in the proximal environment 30.

    [0050] In one embodiment of the invention 10, illustrated diagrammatically in FIGS. 4 and 5, vibration motors 50 are disposed in each of a user's shoes 52 whereby effect of the signal alarm 32 stimulates the user's feet in unique positions and arrays. Particular arrangements of vibration motors 50 enable multiform signal alarms 32 indicative of particular stimuli. Thus, for example, vibration of a leftmost vibration motor 50 might signal a user to turn left. Frequency of vibrations might signal approaching a future user location 36 whereat a second sequence of vibrations might indicate an action, such as a left turn, for example. Additionally, signal alarms 32 may be effected to represent proximity to objects, and frequency of vibrations may be inversely proportional to distance relative to each object. Thus a user may be made sensible that said user is passing by an object, as indicated by the signal alarm 32 moving across a plurality of vibration motors 50 along the outer longitudinal arch of said user's foot, for example. Continual generation of signal alarms 32 along a particular part of the user's body may, for example, indicate presence of a roadway or a wall adjacent said user. Additional sites of vibration motors 50 upon the user are contemplated as part of this invention 10, including, for example, upon each wrist, each foot, each leg, as part of eyewear or headwear, and as part of clothing worn upon the body, or some sporting equipment or accouterments ported by said user, for example. Directional significance of any signal alarm 32 may be projected as a position stimulated upon the body relative landscape data 26 informing the proximal environment 30, for example.
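The inverse relationship between vibration frequency and object distance described in [0050] can be sketched as a simple mapping (the sensing range and frequency limits below are illustrative assumptions, not values from the disclosure):

```python
def vibration_frequency(distance_m, max_hz=8.0, min_hz=0.5, range_m=10.0):
    """Pulse frequency inversely related to object distance: close objects
    vibrate fast, distant objects slowly, nothing beyond sensing range."""
    if distance_m >= range_m:
        return 0.0          # object too far away to warn about
    if distance_m <= 0:
        return max_hz       # contact imminent: maximum urgency
    hz = max_hz * (1.0 - distance_m / range_m)
    return max(min_hz, hz)  # keep any active warning perceptible
```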

    [0051] FIG. 4 illustrates an example of a plurality of vibration motors 50 disposed interior to each of a user's pair of shoes 52. In this example embodiment, vibration motors 50 are contemplated as integrated with an insole in each shoe 52. It should be obvious to anyone having ordinary skill in the art that such vibration motors 50 could be integrated with additional items of clothing, apparel, or sporting equipment (specifically, the handles of ski poles, for example, whereby visually impaired users may receive directional signals while skiing). FIG. 5 illustrates a simplified example of unique issuances of a signal alarm to provide directional instructions and alert the user to the presence of objects in the proximity. In FIG. 5 the proximal environment 30 is diagrammatically representative of an interior space, such as a room, for example. Objects 100, 102 are known from user history stored in the GDS 24. Thus the user is directed forwards by issuance of directional signal alarm X. The user is alerted to arrest forwards velocity by issuance of arrest signal alarm Y. Issuance of directional signal alarm Z communicates to the user to turn right. Continuing on, the user is alerted to object 100 by proximity signal alarm M. Proximity signal alarm N communicates to the user that the user is passing by object 100. Proximity signal alarm O communicates to the user the presence of another object, whereby the user is directed between said objects 100, 102 and enabled clear passage through the proximal environment 30.

    [0052] The at least one peripheral 20 further enables computations of user velocity relative said user location 28 and, in conjunction with available and updateable landscape data 26, enables predictions of a future user location 36. The present system 10 is thereby enabled to act preventatively. Further, when future locations 36 are generated relative a designated path 48, estimated arrival times are calculable and future designated paths 49 are calculable relative each said future user location 36 along the designated path 48. Thus a user may rapidly execute a side route, for example, while traveling along a designated path 48 towards a pre-selected destination 34.
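The future-location and arrival-time computations of [0052] reduce to dead reckoning along the sensed velocity. A minimal sketch (function names are illustrative):

```python
def predict_future_location(location, velocity, dt_s):
    """Dead reckoning: project the current location along the sensed
    velocity vector dt_s seconds ahead."""
    x, y = location
    vx, vy = velocity
    return (x + vx * dt_s, y + vy * dt_s)

def arrival_time_s(distance_m, speed_m_s):
    """Estimated arrival time over the remaining designated-path length;
    infinite when the user is stationary."""
    return float("inf") if speed_m_s <= 0 else distance_m / speed_m_s
```

Running this prediction repeatedly against the landscape data lets the system warn of an obstacle before the user reaches it, i.e. act preventatively as the paragraph describes.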

    [0053] The present networked sensory enhanced navigation system 10 is enabled for voice recognition whereby a user is enabled to verbally interact with the system 10 effective through the user peripheral device 20. Additionally, a Graphic User Interface (“GUI”) may present renditions of the proximal environment 30 to interact with landscape data 26. This enables third-party actors to update landscape data 26 in a particular proximal environment 30 by interacting manually with the GUI. Additionally users may select destinations by manual interaction with the GUI.

    [0054] Visually impaired users are enabled interaction with the present networked sensory enhanced navigation system 10 by contacting a screen of the at least one peripheral device 20, or otherwise effecting contact with said peripheral device 20. Audible commands may then direct the user through menus and a single tap, for example, relative a double tap, for example, may allow for selection in the alternative. Alternately, a swipe in one direction relative a swipe in another direction may also allow for selection in the alternative. Additionally, the present networked sensory enhanced navigation system 10 enables visual capture of objects whereby a user may query presence of objects not communicated through action of the signal alarm 32. The present networked sensory enhanced navigation system 10 may relate information pertinent to the object when stored in the GDS, such as, for example, a building's address.

    [0055] The present networked sensory enhanced navigation system 10 further enables tagging of objects, preferred routes, and favorite locations wherein a user is enabled oral input of qualifiers associated with particular geographic data and storable in the GDS 24 whereby a particular location, for example, may be tagged with metadata unique to the user such as, for example, preference towards a particular restaurant, shop, park, particular route of travel, or other object or location.