Patent classifications
G01C21/365
Mobile Device and Vehicle
A mobile device includes an input device configured to receive a user input, a location receiver configured to receive current location information, an image obtainer configured to obtain an image of the surrounding environment, a controller, and a display device. The controller is configured to perform a navigation function based on destination information received by the input device and the current location information obtained by the location receiver, and to perform an augmented reality (AR) function based on image information of the image obtained by the image obtainer upon a determination, during execution of the navigation function, that the current location is adjacent to the destination based on the destination information and the current location information. The display device is configured to display a navigation image in response to the navigation function or an AR image in response to the AR function based on a control command of the controller.
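The proximity test that triggers the switch from the navigation image to the AR image can be sketched as follows. This is an illustrative Python sketch under stated assumptions, not the patented implementation; the 50 m adjacency threshold and all function names are invented.

```python
import math

ADJACENCY_THRESHOLD_M = 50.0  # hypothetical "adjacent to the destination" cutoff

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two WGS-84 coordinates."""
    r = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def select_display_mode(current, destination):
    """Show the AR image once the current location is adjacent to the
    destination; otherwise keep showing the ordinary navigation image."""
    lat1, lon1 = current
    lat2, lon2 = destination
    if haversine_m(lat1, lon1, lat2, lon2) <= ADJACENCY_THRESHOLD_M:
        return "AR"
    return "NAVIGATION"
```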
Multi-mode route selection
A method of managing a generated route for multiple modes of transportation includes displaying guidance for a generated route based on a travel itinerary between an origin location and a destination location, wherein the generated route includes at least one change in the mode of transportation. The method receives user data associated with a user traveling at a specific point along the generated route and generates a confidence score for the user traveling along the generated route based on the received user data. Responsive to determining that the confidence score is below a confidence level threshold, the method modifies the generated route based on an analysis of the generated route at the specific point, wherein the modified route causes the confidence score to increase above the confidence level threshold. The method then displays guidance for the modified route.
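The threshold logic described above can be sketched in a few lines of Python. The scoring rule, the 0.7 threshold value, and the function names below are illustrative assumptions, not taken from the abstract.

```python
CONFIDENCE_THRESHOLD = 0.7  # hypothetical confidence level threshold

def confidence_score(user_data):
    """Toy score: the fraction of recent position fixes that lie on the
    generated route (a stand-in for whatever the received user data encodes)."""
    return sum(1 for fix in user_data if fix["on_route"]) / len(user_data)

def manage_route(route, user_data, modify_route):
    """Keep the generated route while confidence is high; otherwise modify
    it at the user's current point and return the new route for display."""
    if confidence_score(user_data) < CONFIDENCE_THRESHOLD:
        route = modify_route(route)  # expected to raise the score above the threshold
    return route
```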
Vehicle having a projector for projecting an image on a road surface
Information related to a vehicle can be displayed by projecting an image based on the information on a road surface or the like. An image projection apparatus that projects an image includes: a sensor unit that acquires information related to a vehicle; and an image projection unit that projects the image based on the information acquired by the sensor unit.
Visual Content Overlay System
An augmented reality display system included in a vehicle generates an augmented reality display on one or more transparent surfaces of the vehicle. The augmented reality display can include an indicator of the vehicle speed which is spatially positioned according to the speed of the vehicle relative to the local speed limit. The augmented reality display can include display elements which conform to environmental objects and can obscure and replace content displayed on the objects. The augmented reality display can include display elements which indicate a position of environmental objects which are obscured from direct perception through the transparent surface. The augmented reality display can include display elements which simulate one or more particular environmental objects in the environment, based on monitoring manual driving performance of the vehicle by a driver. The augmented reality display can include display elements which identify environmental objects and particular zones in the environment.
AUGMENTED REALITY DISPLAYS FOR LOCATING VEHICLES
Augmented reality displays for locating vehicles are disclosed herein. An example method includes determining a current location of a mobile device associated with a ridehail user, determining a current location of a vehicle, and generating augmented reality view data that includes a first arrow that identifies a path of travel for the ridehail user towards the vehicle. The path of travel is based on the current location of the mobile device and the current location of the vehicle. The first arrow is combined with a view obtained by a camera of the mobile device or a view obtained by a camera of the vehicle.
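One way to orient such an arrow is from the initial great-circle bearing between the two locations. The formula below is the standard forward-bearing calculation; the function name and its use for rotating the arrow are assumptions, not the claimed method.

```python
import math

def arrow_heading_deg(device_latlon, vehicle_latlon):
    """Initial great-circle bearing from the mobile device to the vehicle,
    in degrees clockwise from true north, usable to rotate the AR arrow."""
    lat1, lon1 = map(math.radians, device_latlon)
    lat2, lon2 = map(math.radians, vehicle_latlon)
    dlon = lon2 - lon1
    y = math.sin(dlon) * math.cos(lat2)
    x = math.cos(lat1) * math.sin(lat2) - math.sin(lat1) * math.cos(lat2) * math.cos(dlon)
    return (math.degrees(math.atan2(y, x)) + 360.0) % 360.0
```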
Head-up display, vehicle device, and information display method
A head-up display (HUD) includes a direction-information generator, a shift device, and a display system. The direction-information generator generates direction information to be superimposed on a road surface ahead of a vehicle on which the HUD is mounted; the direction information represents a direction of travel to be followed by the vehicle. When the direction information includes direction-change information representing a change in the direction of travel and the direction-change information falls outside a display area, the shift device shifts at least some of the direction-change information into the display area. The display system displays the direction information within the display area as a virtual image.
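In the simplest reading, the shift operation amounts to clamping the marker's position into the display rectangle. A minimal sketch, assuming a 2-D display area; the names and coordinate convention are invented for illustration.

```python
def shift_into_display(marker, area):
    """Clamp a direction-change marker (x, y) into the display area
    (left, top, right, bottom) so at least part of it stays visible."""
    x, y = marker
    left, top, right, bottom = area
    return (min(max(x, left), right), min(max(y, top), bottom))
```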
Computer-vision based positioning for augmented reality navigation
Systems and methods for a more usable augmented reality (AR) display of navigation indications are described. A live camera image of a scene may be captured from a device. Navigation instructions may be generated from a navigation system and a navigation indication may be generated for display. A computer vision-based positioning algorithm may be performed on the camera image to determine the relative position between the viewpoint of the device and one or more landmarks in the live camera image. The location or shape of the visual display of the navigation indication may be determined based on the computer vision-based positioning algorithm.
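Once the relative position of a landmark (and hence of the indication's 3-D anchor point) is known in camera coordinates, placing the overlay reduces to a pinhole projection. A minimal sketch, assuming a calibrated camera with focal lengths fx, fy and principal point (cx, cy); the function name and conventions are illustrative, not the described algorithm.

```python
def project_to_pixel(point_cam, fx, fy, cx, cy):
    """Project a 3-D point in camera coordinates (x right, y down, z forward,
    meters) to pixel coordinates; returns None if the point is behind the camera."""
    x, y, z = point_cam
    if z <= 0:
        return None  # no overlay drawn for points behind the viewpoint
    return (fx * x / z + cx, fy * y / z + cy)
```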
Method for displaying the course of a trajectory in front of a transportation vehicle or an object by a display unit, and device for carrying out the method
A method for displaying a trajectory course in front of a transportation vehicle or object. The trajectory course is shown on a display unit in grid format. The points of the grid are represented by symbols, which are drawn either as outlines only or as completely filled-in symbols depending on environmental features. The size and/or color of the symbols is adjustable.
SYSTEM AND METHOD FOR ADAPTING A CONTROL FUNCTION BASED ON A USER PROFILE
A vehicle control system and method for adapting a control function based on a user profile may comprise: a gesture recognition module; a user profile module; a function control module; a processor; and a non-transitory storage element coupled to the processor and storing encoded instructions which, when executed by the processor, configure the system to: identify a user; retrieve a user profile for the identified user; receive, at the gesture recognition module, an input indicating a gesture from the user; identify a control function request corresponding to the gesture input; send a verification of the control function request; and receive, at the function control module, characteristics parsed from the user profile by the user profile module that affect the control function request, so as to adapt a control function command for an adapted control function output by the function control module.
Vehicle and method of controlling the same
A vehicle is provided. The vehicle includes: a global navigation satellite system (GNSS) receiver configured to receive a signal from a GNSS; a guide lamp installed on a front portion of the vehicle; and a controller electrically connected to the GNSS receiver and the guide lamp, wherein the controller is configured to: identify an entry of the vehicle into a parking lot based on a GNSS signal acquired by the GNSS receiver; and control the guide lamp to display a light line representing a path to be travelled by the vehicle and an area to be occupied by the vehicle on a road ahead of the vehicle based on the entry of the vehicle into the parking lot.
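A simple way to identify an entry into a parking lot from a GNSS fix is a geofence test against the lot's boundary polygon. The ray-casting sketch below is an illustrative assumption, not the claimed method; the names are invented.

```python
def inside_parking_lot(fix, boundary):
    """Ray-casting point-in-polygon test: returns True when a GNSS fix
    (lat, lon) lies inside the parking-lot boundary polygon."""
    x, y = fix
    inside = False
    n = len(boundary)
    for i in range(n):
        x1, y1 = boundary[i]
        x2, y2 = boundary[(i + 1) % n]
        if (y1 > y) != (y2 > y):  # edge straddles the fix's longitude
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside
```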