Patent classifications
G08G1/096861
NAVIGATION DEVICE, CONTROL METHOD FOR NAVIGATION DEVICE, AND CONTROL PROGRAM FOR NAVIGATION DEVICE
A navigation device includes an image display unit and a display control unit that controls display on the image display unit. The display control unit displays a map at a fixed scale as a bird's-eye view overlooking the area ahead of the vehicle, with the moving route shown by a guide line, until the distance to a guide point falls to a first display switching determination distance. When the distance to the guide point becomes equal to or shorter than the first display switching determination distance, the display control unit increases the scale as the guide point is approached; when the distance becomes equal to or shorter than a second display switching determination distance, it stops increasing the scale and displays, in the area where the guide line is displayed, a mark indicating the path at the guide point.
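The two-threshold scale behavior described above can be sketched as distance-based logic. This is a minimal illustration only: the linear interpolation, the threshold values, and the scale limits are assumptions, not details taken from the patent.

```python
def display_scale(distance, d1, d2, base_scale=1.0, max_scale=4.0):
    """Return a map scale factor for a given distance to the guide point.

    Beyond the first switching distance d1 the scale is fixed; between d1
    and d2 it grows as the guide point is approached (here, linearly); at
    or below d2 the increase stops and the scale is held at max_scale.
    """
    if distance > d1:
        return base_scale
    if distance > d2:
        # Interpolate between base_scale (at d1) and max_scale (at d2).
        t = (d1 - distance) / (d1 - d2)
        return base_scale + t * (max_scale - base_scale)
    return max_scale
```

Holding the scale constant below the second threshold keeps the guide-point mark stable on screen while the turn indication is shown.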
INFORMATION PROCESSING DEVICE, INFORMATION OUTPUT METHOD, PROGRAM, AND STORAGE MEDIUM
The information processing device provides information relating to the movement of a mobile body. A receiving unit receives an information request. A load determination unit determines the load on a driver with respect to driving the mobile body. A voice output unit outputs, in response to the information request, basic information when the load is large, and additional information in addition to the basic information when the load is small.
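The load-dependent output rule above amounts to a simple conditional. The dictionary keys and the numeric load threshold below are illustrative assumptions, not specified in the abstract.

```python
def respond(request_info, driver_load, load_threshold=0.5):
    """Return the voice-output content for an information request.

    When the driver's load is high, only the basic information is
    returned; when it is low, the additional information is appended.
    """
    basic = request_info["basic"]
    if driver_load >= load_threshold:
        return [basic]  # keep output short under high driving load
    return [basic, request_info["additional"]]
```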
MOBILE APPARATUS AND VEHICLE
A mobile apparatus performing an augmented reality function and a navigation function includes an input device, a location receiver for obtaining a current location, an image obtaining device, a display device provided to display a surrounding image obtained from the image obtaining device in response to execution of an augmented reality mode, and a controller provided to search for a route from the current location to a destination based on the current location, destination information, and map information obtained via the location receiver; identify a moving situation according to route information corresponding to the searched route; obtain location information of an inflection point at which the moving situation changes; and control the display device to display an object image of the inflection point on the surrounding image, based on the location information of the inflection point and the current location information.
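One way to read "an inflection point at which the moving situation is changed" is a point on the route polyline where the heading changes sharply. The sketch below finds such points under that assumption; the angle threshold and 2-D coordinate representation are illustrative, not from the patent.

```python
import math

def inflection_points(route, angle_threshold_deg=30.0):
    """Return points on a route polyline where the heading changes by
    more than angle_threshold_deg, i.e. candidate inflection points."""
    points = []
    for i in range(1, len(route) - 1):
        (x0, y0), (x1, y1), (x2, y2) = route[i - 1], route[i], route[i + 1]
        h_in = math.atan2(y1 - y0, x1 - x0)   # heading into the vertex
        h_out = math.atan2(y2 - y1, x2 - x1)  # heading out of the vertex
        # Wrap the heading difference into (-180, 180] degrees.
        turn = abs((h_out - h_in + math.pi) % (2 * math.pi) - math.pi)
        if math.degrees(turn) > angle_threshold_deg:
            points.append(route[i])
    return points
```

Each detected point would then be projected onto the camera's surrounding image as an AR object at the appropriate screen position.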
Display system
A display system of the present disclosure forms an AR route by shifting node information included in road map data onto the lane on which the subject vehicle is to travel, on the basis of lane information. This makes it possible to display an AR route that matches the shape of the route the subject vehicle is to travel, without appearing unnatural, while resolving the inconvenience of the AR route being largely displaced from the actual route at positions such as intersections and branch points, where a plurality of roads intersect.
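The node-shifting step can be pictured as a lateral offset of each map node perpendicular to the local travel direction. This is a geometric sketch under that assumption; the patent does not specify the offset computation.

```python
import math

def shift_nodes_to_lane(nodes, lane_offset):
    """Shift road-map node positions laterally onto the target lane.

    Each node is moved perpendicular to the local travel direction by
    lane_offset (positive = to the right of the travel direction).
    """
    shifted = []
    for i, (x, y) in enumerate(nodes):
        # Direction from this node toward the next; the last node reuses
        # the direction of the final segment.
        j = i if i < len(nodes) - 1 else i - 1
        dx = nodes[j + 1][0] - nodes[j][0]
        dy = nodes[j + 1][1] - nodes[j][1]
        norm = math.hypot(dx, dy) or 1.0
        # Perpendicular unit vector pointing to the right of travel.
        px, py = dy / norm, -dx / norm
        shifted.append((x + lane_offset * px, y + lane_offset * py))
    return shifted
```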
USER-TAILORED ROADWAY COMPLEXITY AWARENESS
A roadway complexity awareness system for a vehicle, comprising a processor, an augmented reality interface disposed in communication with the processor, and a memory for storing executable instructions. The processor is programmed to execute the instructions to receive roadway status information from an infrastructure processor associated with a roadway, obtain, from a vehicle sensory system, sensory information indicative of roadway route complexity, and determine a likelihood of roadway route complexity based on the roadway status information and the sensory information. The system may select an augmented reality message indicative of the roadway route complexity, and generate the augmented reality message using an augmented reality interface device, wherein the augmented reality message is visible to a vehicle operator.
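The fusion of infrastructure-reported status with on-vehicle sensory information, followed by message selection, can be sketched as a weighted score and tiered messages. The weights, thresholds, and message strings here are all hypothetical.

```python
def complexity_likelihood(roadway_status, sensor_score, w_infra=0.6, w_sensor=0.4):
    """Combine an infrastructure-reported status score and an on-vehicle
    sensory score (each 0..1) into one likelihood of route complexity."""
    return min(1.0, max(0.0, w_infra * roadway_status + w_sensor * sensor_score))

def select_ar_message(likelihood):
    """Pick an augmented reality message tier from the likelihood."""
    if likelihood >= 0.7:
        return "Complex roadway ahead: reduce speed"
    if likelihood >= 0.4:
        return "Caution: moderate roadway complexity"
    return None  # no AR message needed
```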
DISPLAY LIGHT EMITTING DEVICE
When the distance indicated by distance information received by a Bluetooth module (31) is less than a predetermined first reference distance (KY1), a guidance image (33) including a distance image (33a) and a direction image (33b) is generated and displayed on a combiner (26) of a helmet (20). From when the distance indicated by the distance information becomes less than a second reference distance (KY2), shorter than the first reference distance (KY1), until the guidance point is reached, the display of the distance image (33a) is stopped while the direction image (33b) continues to be displayed. This focuses the driver's attention on the direction image (33b), so that a mistake in the direction of travel at the guidance point can be prevented.
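The two reference distances partition the approach into three display states, which can be sketched directly. The numeric defaults for KY1 and KY2 are illustrative assumptions.

```python
def guidance_image_parts(distance, ky1=300.0, ky2=50.0):
    """Return which parts of the guidance image to show on the combiner.

    At or beyond KY1 nothing is shown; between KY2 and KY1 both the
    distance image and direction image are shown; below KY2 only the
    direction image remains, focusing attention on the travel direction.
    """
    if distance >= ky1:
        return []
    if distance >= ky2:
        return ["distance", "direction"]
    return ["direction"]
```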
Method for guiding path by extracting guiding information in lane and electronic device for executing the method
The present disclosure relates to a method for guiding a path of an electronic device, and to the electronic device itself. The method includes: acquiring a top-view image; extracting guiding information in a lane from lane information extracted from the top-view image; and guiding a path of a vehicle using the extracted lane information and the guiding information in the lane. According to the present disclosure, safe driving can be promoted by guiding the driver, in real time and through image analysis of the traveling road, on the possible traveling directions along the path. In addition, more detailed path guidance can be provided by recognizing the lane of the driving path and reflecting the necessity of a lane change in the guidance.
VEHICLE PARKING MANAGEMENT METHOD, ELECTRONIC DEVICE, AND COMPUTER STORAGE MEDIUM
A vehicle parking management method, comprising: on at least one of a vehicle and a mobile device, acquiring current parking place information and a plurality of pieces of additional information for a parking lot, the current parking place information at least comprising the location of an unoccupied parking place, each piece of additional information being configured to identify associated data; on the basis of the current parking place information and the additional information, generating a parking place distribution image that at least indicates the unoccupied parking place and the additional information; presenting the parking place distribution image; and in response to detecting an operation targeting the unoccupied parking place or the additional information, generating a navigation indication related to a target unoccupied parking place.
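The distribution image and the tap-to-navigate step can be sketched as a grid of occupancy labels plus a target check. The grid representation and the "O"/"X" labels are illustrative choices, not from the patent.

```python
def parking_distribution(spaces):
    """Build a simple parking-place distribution 'image' as a grid of
    cell labels: 'O' for an unoccupied place, 'X' for an occupied one."""
    return [["O" if free else "X" for free in row] for row in spaces]

def navigation_target(spaces, tapped):
    """Return the tapped cell as the navigation target if it is
    unoccupied; otherwise no navigation indication is generated."""
    r, c = tapped
    return (r, c) if spaces[r][c] else None
```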
LOCAL NAVIGATION ASSISTED BY VEHICLE-TO-EVERYTHING (V2X)
Techniques described herein provide enhanced ultra-local navigation services for V2X devices (e.g., smartphones incorporating V2X chipsets). The V2X devices can transmit vehicle information to edge network devices (e.g., roadside units). The roadside units can be deployed at intersections or along roads to collect traffic information through various sensor inputs and V2X communications with multiple vehicles. Communication between the V2X devices and the edge network devices can be accomplished through wireless communication (e.g., a direct PC5 interface, or a local Uu interface with edge computing). The edge network devices can perform local route optimization and compute one or more recommendations (e.g., a recommended route, a recommended speed, a recommended lane). The edge network devices can transmit the one or more recommendations via wireless communication to the V2X devices, which can display the recommendations to a user.
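A roadside unit's recommendation step might, for example, pick the least congested lane and scale the advised speed by that lane's congestion. This is a toy sketch of such a policy; the data shape and the scaling rule are assumptions.

```python
def recommend(lane_congestion, speed_limit):
    """Roadside-unit style recommendation: pick the least congested lane
    and an advised speed scaled down by that lane's congestion (0..1)."""
    lane = min(lane_congestion, key=lane_congestion.get)
    speed = round(speed_limit * (1.0 - lane_congestion[lane]))
    return {"recommended_lane": lane, "recommended_speed": speed}
```

In the described system this result would be computed at the edge and sent to the V2X device over PC5 or Uu for display.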
Mobile terminal and control method thereof
Disclosed is a mobile terminal that provides an augmented reality navigation screen while held in a vehicle, the mobile terminal including: at least one camera configured to obtain a front image; a display; and at least one processor configured to calibrate the front image and to drive an augmented reality navigation application so that the augmented reality navigation screen, including at least one augmented reality (AR) graphic object and the calibrated front image, is displayed on the display.