Patent classifications
G01C21/3652
GENERATING TACTILE OUTPUT SEQUENCES ASSOCIATED WITH AN OBJECT
In some embodiments, an electronic device generates a tactile output sequence in response to detecting that the electronic device is oriented within a range of orientations that changes as the distance between the electronic device and a respective object changes. In some embodiments, an electronic device changes one or more characteristics of a tactile output in response to detecting a change in the orientation of the electronic device relative to a respective object. In some embodiments, an electronic device generates tactile outputs with characteristics indicative of the orientation of a camera of the electronic device relative to one or more AR (augmented reality) planes. In some embodiments, an electronic device generates tactile outputs indicative of a data sharing process with a second electronic device.
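The first embodiment — a tactile output gated on an orientation range that narrows as the device nears the object — can be sketched as follows. This is a minimal illustration, not the claimed implementation; the linear falloff, the 10 m reference distance, and the angle constants are all assumptions for the example.

```python
import math

def orientation_window(distance_m, base_half_angle_deg=25.0, min_half_angle_deg=5.0):
    """Half-angle of the accepted orientation range. Assumed to shrink
    linearly as the device approaches the object, clamped so the window
    never collapses entirely."""
    half = base_half_angle_deg * min(distance_m / 10.0, 1.0)
    return max(half, min_half_angle_deg)

def should_emit_tactile(bearing_to_object_deg, device_heading_deg, distance_m):
    """True when the device heading points at the object within the
    distance-dependent orientation window."""
    # Wrap the heading error into [-180, 180] before taking its magnitude.
    error = abs((bearing_to_object_deg - device_heading_deg + 180) % 360 - 180)
    return error <= orientation_window(distance_m)
```

At 20 m the window is wide (25°), so a 3° heading error still triggers the tactile output; at 1 m the window has tightened to 5°, so a 20° error does not.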
Haptic guiding system
An apparatus for assisting the mobility of a user is disclosed. The apparatus includes an electronic device provided in a device body and a haptic unit. The electronic device includes a sensing unit, a machine vision unit, and a path planning unit. The machine vision unit is connected to the sensing unit and is configured to detect one or more obstacles in a vicinity of the user. The path planning unit is capable of developing a movement route for the user to avoid the one or more obstacles. The haptic unit includes a haptic arrangement having a plurality of haptic actuators adapted to provide a tactile stimulation to the torso or limbs of the user. The plurality of haptic actuators actuate based on the movement commands from the path planning unit to signify a travel direction in which there is no obstacle.
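The mapping from a clear travel direction to one of the plurality of haptic actuators can be sketched as below, assuming (hypothetically — the abstract does not fix a layout) that the actuators are spaced evenly around the torso, indexed clockwise from straight ahead.

```python
def actuator_for_direction(heading_deg, n_actuators=8):
    """Map a travel direction (degrees clockwise from straight ahead)
    to the index of the actuator to fire on an assumed evenly spaced
    ring of n_actuators around the user's torso."""
    sector = 360.0 / n_actuators
    # Shift by half a sector so each actuator covers the arc centred on it.
    return int(((heading_deg % 360) + sector / 2) // sector) % n_actuators
```

Straight ahead maps to actuator 0, due right (90°) to actuator 2, and a heading just left of straight ahead (350°) wraps back to actuator 0.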
Method and apparatus for navigation
Embodiments of the present disclosure provide methods and apparatuses for navigation. A method performed by a communication device comprises obtaining a current position of an object. The method further comprises determining whether the object is dangerous to a user of a terminal device based on the current position of the object. The method further comprises, in response to a positive determination, sending a message for guiding the user to a first wearable device of the user and/or a message for informing a maintainer about the dangerous object.
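The determination-and-dispatch flow of this method can be sketched as follows. The proximity test and the 5 m radius are assumptions for illustration; the disclosure leaves the danger criterion open.

```python
import math

def is_dangerous(object_pos, user_pos, danger_radius_m=5.0):
    """Assumed danger test: the object lies within danger_radius_m of the
    user (positions given as (x, y) in metres)."""
    dx = object_pos[0] - user_pos[0]
    dy = object_pos[1] - user_pos[1]
    return math.hypot(dx, dy) <= danger_radius_m

def dispatch(object_pos, user_pos):
    """On a positive determination, return the messages to send: one to
    guide the user via the wearable device, one to inform a maintainer."""
    if is_dangerous(object_pos, user_pos):
        return ["guide_user", "inform_maintainer"]
    return []
```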
IDENTIFYING EXPENSIVE SEGMENTS IN ROUTE PLANNING AND GUIDANCE
Navigation routing is optimized to enhance guidance provided near potential route divergence points that increase route cost. Historical route data is used to determine costs for routes between an origin and destination. More expensive routes are compared to inexpensive routes to identify route segments found only in the expensive routes. The beginning of such a segment is labeled as an expensive divergence point for the route. When a routing request is received, a routing engine determines the recommended route. A navigation engine identifies expensive divergence points located along the route and augments guidance related to the expensive divergence points, for example by emphasizing maneuvers required to avoid deviating at the expensive divergence point. The augmented guidance may include audio, haptic, and visual cues that reduce the likelihood of deviating from the route, and highlight for the user the expensive nature of not staying on route at that point.
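The segment comparison described above can be sketched as a set difference over consecutive node pairs, with the head of each expensive-only stretch labeled as the divergence point. Routes as node-id lists are an assumed representation for the example.

```python
def expensive_divergence_points(expensive_routes, cheap_routes):
    """Label the beginning of each maximal run of segments found only in
    expensive routes as an expensive divergence point. A route is a list
    of node ids; a segment is a consecutive node pair."""
    cheap_segs = {(a, b) for r in cheap_routes for a, b in zip(r, r[1:])}
    points = set()
    for route in expensive_routes:
        prev_expensive = False
        for a, b in zip(route, route[1:]):
            expensive_only = (a, b) not in cheap_segs
            if expensive_only and not prev_expensive:
                points.add(a)  # start of an expensive-only stretch
            prev_expensive = expensive_only
    return points
```

For an expensive route A-B-X-D against a cheap route A-B-C-D, the routes diverge at B, so B is the expensive divergence point where guidance would be augmented.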
Computer-implemented method, wearable device, computer program and computer readable medium for assisting the movement of a visually impaired user
In a first aspect of the invention, a computer-implemented method for assisting the movement of a visually impaired user by means of a wearable device 1 is claimed, comprising the following steps: S1: acquiring data from the environment of the visually impaired user; S2: fusing the acquired data and creating and repeatedly updating a Live Map; S3: determining, repeatedly updating, and storing at least one navigation path together with associated navigation guiding instructions for the visually impaired user to navigate from the current position of the visually impaired user to a point of interest, repeatedly selecting one preferred navigation path from the at least one navigation path, and repeatedly sending the preferred navigation path, together with its associated navigation guiding instructions, to the visually impaired user.
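The repeated selection of one preferred path in step S3 can be sketched as below. The cost criterion (fewest waypoints) and the (waypoints, instructions) pairing are assumptions for the example; the claim does not fix how preference is scored.

```python
def select_preferred(paths):
    """paths: list of (waypoints, guiding_instructions) tuples.
    Assumed preference: the path with the fewest waypoints wins;
    ties keep the first candidate found. Intended to be re-run as the
    Live Map and path set are repeatedly updated."""
    return min(paths, key=lambda p: len(p[0]))
```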
USER INTERFACES FOR PROVIDING NAVIGATION DIRECTIONS
In some embodiments, an electronic device displays indications of safety characteristics of one or more navigational segments of a navigation route. In some embodiments, an electronic device displays navigation options. In some embodiments, an electronic device presents indications of navigation directions while navigating along a route.
Wearable system and method for navigation
A method for user movement guidance includes receiving a destination request through a mobile device. The method also includes receiving, by the mobile device, sensor data from a wearable device. The wearable device includes a lanyard and an electronic pendant attached to the lanyard. The electronic pendant includes a camera and a LIDAR sensor. The method also includes establishing communication between the mobile device and a remote system in response to receiving the destination request. The method also includes transmitting, by the mobile device, the sensor data to the remote system in response to establishing the communication between the mobile device and the remote system. The method also includes receiving, by the mobile device, navigation instructions to the destination from the remote system.
Navigation assistance system for vehicles and a method thereof
The present disclosure relates to a technique for providing navigation assistance to both the driver and other occupants of the vehicle. The technique recites acquiring placement information for a plurality of light sources and haptic sensors inside the vehicle so as to attract the attention of the driver only in the peripheral vision, without any requirement to take the eyes off the road. It also discloses a navigation unit and an ambient light sensing module configured to provide navigation information and ambient light information, respectively, to the control unit. The control unit in turn processes this received information and controls the functioning of both the light sources and the haptic sensors in order to update the driver and passengers about upcoming navigation events.
Multi-functional walking cane and associated use thereof
A multi-functional walking cane includes an elongated rectilinear shaft having a centrally registered longitudinal axis, a handle adjustably coupled to the shaft, and a portable controller removably coupled to the shaft and spaced subjacent to the handle. Advantageously, the controller includes a programmable navigation determining mechanism, and an object-detecting mechanism in communication with the programmable navigation determining mechanism. In this manner, the controller cooperates with the programmable navigation determining mechanism and the object-detecting mechanism to thereby generate and emit output signals that facilitate directional travel for the blind user while preventing undesirable contact with hazardous obstacles during the directional travel. Notably, a portable earpiece is in wireless communication with the controller. Such an earpiece has a speaker for audibly emitting the output signals to the blind user.