Patent classifications
G01C21/3652
Method and apparatus for providing tactile sensations
The present disclosure concerns a method and apparatus for the creation of an acoustic field for providing tactile sensations. More particularly, but not exclusively, this disclosure concerns a method and apparatus for the creation of an acoustic field providing tactile sensations for use with an interactive device. The disclosure provides a method of generating a tactile sensation. The method comprises the steps of providing a plurality of acoustic transducers arranged to generate a predetermined distribution of pressure patterns, wherein the pressure patterns comprise a first region providing a first tactile sensation and a second region providing a second, different, tactile sensation.
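The abstract above describes producing pressure-pattern regions in mid-air with an array of acoustic transducers. A minimal sketch of the standard phased-array idea behind this, driving each transducer with a phase delay so emissions arrive in phase at a focal point (the 40 kHz frequency, transducer positions, and function names are illustrative assumptions, not taken from the patent):

```python
import math

SPEED_OF_SOUND = 343.0  # m/s in air
FREQ = 40_000.0         # 40 kHz ultrasound, a common choice for such arrays

def phase_delays(transducers, focus):
    """Phase (radians) each transducer must be driven at so that its
    emission arrives in phase at the focal point, creating a local
    pressure maximum there. Positions are (x, y, z) in metres."""
    wavelength = SPEED_OF_SOUND / FREQ
    delays = []
    for t in transducers:
        dist = math.dist(t, focus)
        delays.append((2 * math.pi * dist / wavelength) % (2 * math.pi))
    return delays
```

Two distinct tactile regions, as the abstract describes, would follow from computing delays for two different focal points and time-multiplexing or superposing the drive signals.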
Transition of navigation modes for multi-modal transportation
Systems and methods for improved transitioning and updating of navigation modes for multi-modal transportation routes are presented. A transportation route may be received that includes a first segment associated with a first modality and a second segment associated with a second modality. A first interface associated with the first modality may be displayed and a trigger event associated with the first segment may be detected. In response, a second interface may be displayed that is associated with the second modality.
Navigation assistance for the visually impaired
Eyewear having a light array and vibration devices for indicating to a user when to turn left and right, such as when the eyewear is operating a navigation map application, improving the user experience of eyewear devices for users having partial or complete blindness. To compensate for partial blindness, the front portion of the eyewear frame, such as the bridge, may include the light array, where the left portion of the light array is illuminated to indicate to the user to turn left and the right portion is illuminated to indicate to the user to turn right. To compensate for complete blindness, the eyewear has a vibration device on each side of the eyewear, such as in the temples, which selectively vibrates to indicate when the user should turn left or right.
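The two-channel cue selection described above (light array for partial blindness, temple vibration for complete blindness) can be sketched as a simple dispatch; the string values and the function name are assumptions for illustration, not identifiers from the patent:

```python
def turn_cue(direction, impairment):
    """Choose which eyewear output to activate for a turn instruction.

    direction:  "left" or "right"
    impairment: "partial"  -> use the bridge-mounted light array
                "complete" -> use the temple vibration devices
    """
    if impairment == "partial":
        side = "left" if direction == "left" else "right"
        return {"device": "light_array", "segment": side}
    if impairment == "complete":
        side = "left_temple" if direction == "left" else "right_temple"
        return {"device": "vibration", "segment": side}
    raise ValueError(f"unknown impairment level: {impairment}")
```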
Methods and systems for providing feedback based on information received from an aerial vehicle
Described herein is a control system that facilitates assistance mode(s). In particular, the control system may determine a particular assistance mode associated with an account. This particular assistance mode may specify (i) operations for an aerial vehicle to carry out in order to obtain sensor data providing environment information corresponding to a location associated with the account and (ii) feedback processes to provide feedback, via a feedback system associated with the account, that corresponds to respective environment information. The control system may transmit to the aerial vehicle an indication of the particular operations corresponding to the particular assistance mode and may then receive environment information for the location associated with the account. Based on the received environment information, the control system may apply the specified feedback processes to initiate feedback in accordance with the particular assistance mode via the associated feedback system.
Ear set device
Disclosed is an ear set device that includes a main body and one or more earphones configured to output a response signal. At least one of the one or more earphones includes a first detecting unit configured to detect whether the one or more earphones are attached to a user. The main body includes a communication unit configured to perform wireless communication with the one or more earphones, a second detecting unit configured to detect a location of the main body, and a control unit configured to, based on a determination that the main body is inside a vehicle and a determination that the vehicle is operating while the one or more earphones are attached to the user, receive, from the vehicle, information related to an operation of the vehicle and output a response signal corresponding to the received information through the one or more earphones.
USER CONTROLLED DIRECTIONAL INTERFACE
A system and method for providing directional feedback. A wireless connection is established between a first directional interface and one or more directional interfaces. A free form path is received from a user utilizing the first directional interface. The free form path is converted to directional feedback, in which the free form path is mapped to available paths communicated by the first directional interface and the one or more directional interfaces to generate the directional feedback. The directional feedback is sent to the one or more directional interfaces wirelessly connected to the first directional interface. The directional feedback is communicated to one or more users associated with the one or more directional interfaces utilizing user preferences associated with the one or more directional interfaces.
System and method for providing directions haptically
Systems, methods, and computer-readable storage devices for providing directions haptically such that sight and hearing can continue unimpeded. In one exemplary embodiment, a wearable device (such as earphones, ear rings, gloves, glasses, or other wearable objects) configured as disclosed herein receives directions to an intended destination for a user, the directions comprising a movement action and a distance to the movement action. The wearable device has multiple haptic output units and generates, through one of those units, a haptic output based on the directions. This allows the user to receive the directions through touch rather than looking at their mobile device or from audio.
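The mapping above, from a direction instruction (movement action plus distance) to one of several haptic output units, can be sketched as follows; the unit indices and the distance-to-intensity ramp are assumptions for illustration:

```python
# Hypothetical unit layout on the wearable: one unit per movement action.
HAPTIC_UNITS = {"left": 0, "right": 1, "straight": 2}

def haptic_command(movement_action, distance_m):
    """Map a direction (action + distance to the action point) to a
    (unit index, intensity) pair for the wearable's haptic outputs.

    Intensity ramps up as the user approaches the turn point: full
    intensity within 10 m, fading to zero beyond 100 m.
    """
    unit = HAPTIC_UNITS[movement_action]
    intensity = max(0.0, min(1.0, (100.0 - distance_m) / 90.0))
    return unit, round(intensity, 2)
```

The user thus feels which side to turn to, and how soon, without looking at a screen or listening to audio.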
NAVIGATION SYSTEM FOR A VISUALLY IMPAIRED USER AND A METHOD OF NAVIGATING A VISUALLY IMPAIRED USER
A system and a method for navigating a visually impaired user include a navigation module arranged to derive a navigational path from a starting position to a predetermined destination, and a guiding vehicle arranged to guide the visually impaired user towards the predetermined destination based on the derived navigational path, whereby the movement of the guiding vehicle indicates navigation guidance to the visually impaired user. The navigation module is further arranged to adjust the navigational path of the guiding vehicle in response to the detection of an obstacle during the navigation of the guiding vehicle, so as to avoid the detected obstacle whilst following the navigational path.
NAVIGATION ASSISTANCE SYSTEM FOR VEHICLES AND A METHOD THEREOF
The present disclosure relates to a technique for providing navigation assistance to the driver as well as to other occupants of the vehicle. The technique involves acquiring placement information for a plurality of light sources and haptic sensors inside the vehicle, positioned so as to attract the driver's attention only in the peripheral vision, without any requirement to take the eyes off the road. It also discloses a navigation unit and an ambient light sensing module configured to provide navigation information and ambient light information, respectively, to the control unit. The control unit in turn processes this received information and controls the functioning of both the light sources and the haptic sensors in order to update the driver and passengers about upcoming navigation events.
Intelligent electronic footwear and logic for navigation assistance by automated tactile, audio, and visual feedback
Presented are intelligent electronic footwear and apparel with controller-automated features, methods for making/operating such footwear and apparel, and control systems for executing automated features of such footwear and apparel. A method for operating an intelligent electronic shoe (IES) includes receiving, e.g., via a controller through a wireless communications device from a GPS satellite service, location data of a user. The controller also receives, e.g., from a backend server-class computer or other remote computing node, location data for a target object or site, such as a virtual shoe hidden at a virtual spot. The controller retrieves or predicts path plan data including a derived route for traversing from the user's location to the target's location within a geographic area. The controller then transmits command signals to a navigation alert system mounted to the IES's shoe structure to output visual, audio, and/or tactile cues that guide the user along the derived route.
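Guiding a user along a derived route, as described above, reduces at each step to comparing the user's heading against the bearing to the next waypoint and emitting a left/right/straight cue. A sketch using the standard great-circle initial-bearing formula; the 20-degree cue threshold and function names are illustrative assumptions, not taken from the patent:

```python
import math

def bearing(a, b):
    """Initial compass bearing in degrees from point a to point b,
    both given as (lat, lon) in degrees."""
    lat1, lat2 = math.radians(a[0]), math.radians(b[0])
    dlon = math.radians(b[1] - a[1])
    x = math.sin(dlon) * math.cos(lat2)
    y = (math.cos(lat1) * math.sin(lat2)
         - math.sin(lat1) * math.cos(lat2) * math.cos(dlon))
    return math.degrees(math.atan2(x, y)) % 360

def cue(heading, target_bearing):
    """Pick a tactile/visual/audio cue from the signed difference
    between the user's heading and the bearing to the next waypoint."""
    diff = (target_bearing - heading + 180) % 360 - 180
    if abs(diff) < 20:
        return "straight"
    return "right" if diff > 0 else "left"
```

The navigation alert system mounted on the shoe would then translate each cue into the corresponding visual, audio, or tactile output.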