G01C21/3647

Assistance When Driving a Vehicle
20170248441 · 2017-08-31 ·

In order to support the driving of an ego-vehicle, the following steps are carried out: gathering information from the environment of the ego-vehicle; processing the gathered information so as to detect whether a neighboring vehicle is in the environment of the ego-vehicle and, if a neighboring vehicle is detected, additionally gathering and/or processing information relating to the neighboring vehicle in order to assign at least one typical attribute to it; and, according to the at least one typical attribute of the neighboring vehicle, providing control information for driving the ego-vehicle.
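The attribute-assignment and control steps described above can be sketched as follows. This is a minimal illustration, not the patented method: the specific attribute names, thresholds, and control rules are assumptions chosen for the example.

```python
# Hypothetical sketch of assigning typical attributes to a detected neighboring
# vehicle and deriving ego-vehicle control information from them.
# All attribute names and rules below are illustrative assumptions.

def assign_attributes(neighbor_observation):
    """Assign typical attributes to a detected neighboring vehicle."""
    attrs = set()
    if neighbor_observation.get("length_m", 0.0) > 7.0:
        attrs.add("truck")  # assumed size threshold for a large vehicle
    if neighbor_observation.get("lateral_drift_m", 0.0) > 0.5:
        attrs.add("erratic")  # assumed drift threshold for unstable driving
    return attrs

def control_info(attrs):
    """Derive control information for the ego-vehicle from neighbor attributes."""
    if "erratic" in attrs:
        return {"action": "increase_gap"}
    if "truck" in attrs:
        return {"action": "prepare_overtake"}
    return {"action": "maintain"}
```

A detected neighbor with `length_m=8.2` and negligible drift would be tagged `truck` and yield the `prepare_overtake` control decision under these assumed rules.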

INFORMATION MANAGEMENT DEVICE, VEHICLE, AND INFORMATION MANAGEMENT METHOD

An information management device includes a difference extractor and a determination unit. The difference extractor extracts, as difference information, a difference between vehicle-peripheral three-dimensional information detected by a vehicle and three-dimensional map information. The determination unit determines whether the extracted difference information is a difference inherent to the vehicle.
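One simple way to realize the difference extraction above is to quantize both the sensed 3D points and the map points into a coarse voxel grid and take a set difference. This is a hedged sketch under assumed representations; the voxel size and the test for a vehicle-inherent difference are illustrative, not from the patent.

```python
# Illustrative difference extraction between vehicle-sensed 3D points and
# 3D map data using a voxel grid. Voxel size is an assumption.

def voxelize(points, size=1.0):
    """Quantize 3D points into a set of integer voxel indices."""
    return {(int(x // size), int(y // size), int(z // size)) for x, y, z in points}

def extract_difference(sensed_points, map_points, size=1.0):
    """Voxels observed by the vehicle but absent from the map (difference information)."""
    return voxelize(sensed_points, size) - voxelize(map_points, size)

def inherent_to_vehicle(diff_voxels, confirmed_voxels):
    """A difference not confirmed by repeated/independent observations is treated
    as inherent to the vehicle (e.g. a sensor artifact) in this sketch."""
    return all(v not in confirmed_voxels for v in diff_voxels)
```

A difference confirmed across observations would instead indicate a real change in the environment worth propagating to the map.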

Implementing route generation with augmented reality

A method, system and computer program product are provided for implementing route generation with augmented reality. A crowd or event is analyzed and escape routes are intelligently distributed by attendee profile type or ability. Augmented Reality (AR) projections are used representing different subsets of an event population to convey the profile-specific escape routes for people to follow. The AR behavior is changed as an event situation changes.

Intersection guide system, method, and program

Intersection guide systems, methods, and programs acquire information on a path of a vehicle and acquire a travel direction of the vehicle at a guide intersection ahead of the vehicle on the basis of the information on the path. The systems, methods, and programs display a guide image that represents the travel direction superimposed on a portion of a forward scene ahead of the vehicle other than an image of the guide intersection, and a connection line image as superimposed on the forward scene, the connection line image connecting between the image of the guide intersection in the forward scene and the guide image. The connection line image is superimposed on the forward scene such that a length of the connection line image becomes shorter as a degree of approach of the vehicle to the guide intersection becomes larger.
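The distance-dependent connection line can be sketched as a simple scaling rule: the closer the vehicle gets to the guide intersection, the shorter the drawn line. The linear scaling, the reference guide distance, and the pixel length below are illustrative assumptions.

```python
# Minimal sketch of a connection line whose length shrinks as the vehicle
# approaches the guide intersection. Scaling constants are assumptions.

def connection_line_length(distance_to_intersection_m,
                           guide_distance_m=300.0,
                           max_length_px=200.0):
    """Return a drawn line length proportional to the remaining distance,
    clamped to [0, max_length_px]."""
    ratio = max(0.0, min(1.0, distance_to_intersection_m / guide_distance_m))
    return max_length_px * ratio
```

At the guide distance the line is drawn at full length; as the degree of approach increases the line contracts toward the intersection image, vanishing on arrival.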

BIRDS EYE VIEW VIRTUAL IMAGING FOR REAL TIME COMPOSITED WIDE FIELD OF VIEW
20170234692 · 2017-08-17 ·

A live image and a previously acquired or generated image are superimposed or composited to represent a virtual vantage point for flying, driving, or navigating a plane, vehicle, or vessel.

PARKING ASSIST DEVICE, PARKING ASSIST METHOD, AND RECORDING MEDIUM

A parking assist device is used for a vehicle on which a power receiving coil including a first coil and a second coil is mounted. The parking assist device includes a hardware processor functioning as an acquisition unit and a vehicle control unit. The acquisition unit serves to acquire route information for parking the vehicle at a facing position where a power feeding coil and the first coil of the power receiving coil face each other. The vehicle control unit serves to perform control of parking the vehicle by using the route information. The power receiving coil is mounted on the vehicle such that a distance between a center position of the first coil and a center position of the vehicle in a vehicle length direction becomes shorter than a distance between a center position of the second coil and the center position of the vehicle.

Alert system for environmental changes

For locations along a route a user will be traveling, the alert system for environmental changes compares first and second sets of images associated with first and second timestamps, respectively. The alert system determines degrees of environmental changes for the locations based on the comparisons. The alert system then generates and sends an alert to a user device. In determining the degrees of environmental changes, the alert system retrieves first and second sets of images matching a given location and associated with the first and second timestamps. The alert system identifies first and second sets of objects and extracts first and second sets of attributes for the first and second sets of images. The alert system compares the first and second sets of attributes and the first and second sets of objects, and determines a given degree of environmental changes at the given location based on the comparisons.
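The object/attribute comparison step can be sketched with a set-overlap measure. The Jaccard-style degree, the `(label, attribute)` representation, and the alert threshold below are assumptions made for illustration; the patent does not specify them.

```python
# Hedged sketch of comparing timestamped object/attribute sets and deriving
# a degree of environmental change. Representation and threshold are assumed.

def change_degree(objects_t1, objects_t2):
    """Degree of environmental change between two timestamps, where each
    object is a (label, attribute) tuple extracted from an image."""
    a, b = set(objects_t1), set(objects_t2)
    if not a and not b:
        return 0.0
    return 1.0 - len(a & b) / len(a | b)

def alert_if_changed(location, objects_t1, objects_t2, threshold=0.5):
    """Generate an alert message when the change degree exceeds a threshold."""
    degree = change_degree(objects_t1, objects_t2)
    if degree >= threshold:
        return f"Alert: environment at {location} changed (degree {degree:.2f})"
    return None
```

Two image sets sharing one object out of four distinct ones would score a change degree of 0.75 and trigger an alert under the assumed threshold.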

Satellite navigation method and system
09772194 · 2017-09-26 ·

A satellite navigation method and system are provided. The system includes a global positioning system module, an input unit, a picture database, a geographical information system module, an integrating unit, and a display unit. The method includes the following steps. Firstly, a navigation area is determined through the input unit, so as to search out several picture batches in the navigation area from the picture database. Next, a map relating to the navigation area is provided by the GIS module. Then, the picture batches and the map are integrated by the integrating unit to produce an integrated map shown on a first frame of the display unit. Afterwards, several pictures of the picture batch chosen from the first frame are displayed on a second frame, and satellite navigation information of the picture chosen from the second frame is displayed on a third frame by the display unit.

SYSTEM AND STEREOSCOPIC RANGE DETERMINATION METHOD FOR A ROADWAY LIGHTING SYSTEM

Provided is a system that includes a first camera and a second camera, each configured to capture image data from respective viewing angles, and a data processor coupled with the first camera and the second camera that receives the image data, calculates a range based on the received image data, and validates the calculated range for a parking space.

VEHICLE GUIDANCE SYSTEM

A vehicle guidance system facilitates maneuvering of an autonomous vehicle with respect to an object in a scene. The vehicle includes a steering apparatus having a range of angular positions and a multitude of actuators for controlling the dynamics of the vehicle. The system includes a steering angle sensor, a camera device, and a video processing module. The sensor is configured to monitor the angular position of the steering apparatus. The device is configured to capture an original image of a scene having the object. The module is configured to receive and process the original image from the device, detect the object in the original image, receive and process the angular position from the sensor, generate a dynamic trajectory based on at least the angular position, orient the dynamic trajectory with regard to the object, and operate at least one of the actuators to guide the vehicle along the dynamic trajectory.
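Generating a dynamic trajectory from the current steering angle can be sketched with a kinematic bicycle model: the steering angle fixes the curvature of the predicted path swept ahead of the vehicle. The wheelbase, step size, and point count below are illustrative assumptions, not values from the patent.

```python
import math

# Illustrative dynamic-trajectory generation from a steering angle using a
# kinematic bicycle model. Wheelbase and integration step are assumptions.

def dynamic_trajectory(steering_angle_rad, wheelbase_m=2.8,
                       step_m=0.5, n_points=20):
    """Predict the (x, y) path the vehicle will sweep at the given steering
    angle, starting at the origin with heading along +x."""
    x, y, heading = 0.0, 0.0, 0.0
    path = [(x, y)]
    for _ in range(n_points):
        x += step_m * math.cos(heading)
        y += step_m * math.sin(heading)
        heading += step_m * math.tan(steering_angle_rad) / wheelbase_m
        path.append((x, y))
    return path
```

A zero steering angle yields a straight path along the x-axis; a positive angle curves the predicted trajectory to the left, which the module would then orient relative to the detected object before commanding the actuators.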