Patent classification: G06T2207/30261
COLLISION WARNING SYSTEM AND METHOD
A collision warning system and method are provided. The system includes an image capturing apparatus, a sensing apparatus, and a computing apparatus. The computing apparatus analyzes a vehicle side image to generate a plurality of objects and an object coordinate corresponding to each of the objects. The computing apparatus calculates an aerial view coordinate corresponding to each of the objects based on the object coordinates and a pitch angle corresponding to the image capturing apparatus. The computing apparatus calculates a predicted collision time between the vehicle and each of the objects based on a vehicle speed, a turning angle, and the aerial view coordinates. The computing apparatus generates a plurality of warning regions and determines a warning scope corresponding to each of the warning regions based on the predicted collision times.
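The two computations this abstract names, projecting a detected object into an aerial (top-down) view using the camera's pitch angle, and deriving a predicted collision time from the vehicle speed, turning angle, and the aerial-view coordinate, can be sketched as below. This is a flat-road, pinhole-camera simplification; all parameter names and the heading treatment are illustrative assumptions, not taken from the patent.

```python
import math

def aerial_view_coordinate(u, v, pitch_deg, cam_height_m, focal_px, cx, cy):
    """Project image pixel (u, v) onto the ground plane to get a top-down
    (x, y) in meters. Assumes a flat road and a pinhole camera; parameter
    names are illustrative, not from the patent."""
    pitch = math.radians(pitch_deg)
    # Angle of the pixel's ray below the horizon for image row v.
    ray = pitch + math.atan2(v - cy, focal_px)
    if ray <= 0:
        return None  # ray points at or above the horizon: no ground hit
    y = cam_height_m / math.tan(ray)   # forward distance to the ground point
    x = (u - cx) * y / focal_px        # lateral offset (small-angle approx.)
    return x, y

def predicted_collision_time(obj_xy, speed_mps, turn_deg):
    """Crude time-to-collision: distance to the object divided by the
    closing speed along the (turn-adjusted) heading. Treating the turning
    angle as a fixed heading offset is a simplification."""
    x, y = obj_xy
    dist = math.hypot(x, y)
    bearing = math.atan2(x, y)  # bearing of the object from straight ahead
    closing = speed_mps * math.cos(bearing - math.radians(turn_deg))
    if closing <= 0:
        return math.inf  # not closing on the object
    return dist / closing
```

A warning scope could then be chosen per region by bucketing objects on their predicted collision times (e.g. tighter scopes for smaller times).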
Methods and apparatus for depth estimation on a non-flat road with stereo-assisted monocular camera in a vehicle
A non-transitory processor-readable medium stores code representing instructions to be executed by a processor. The code causes the processor to receive a first image and a second image from a stereo camera pair disposed on a vehicle. The code causes the processor to detect, using a machine learning model, an object based on the first image, the object being located within a pre-defined area in the vicinity of the vehicle. The code causes the processor to determine a distance between the object and the vehicle based on a disparity between the first image and the second image. The code causes the processor to determine a longitudinal value of the vehicle based on the distance and a height of the vehicle. The code causes the processor to send an instruction to facilitate driving of the vehicle based on a road profile associated with the longitudinal value.
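The disparity-to-distance step this abstract relies on is the standard rectified-stereo relation Z = f·B/d. A minimal sketch of just that step (the patent's further use of the vehicle height and road profile is omitted):

```python
def depth_from_disparity(focal_px, baseline_m, disparity_px):
    """Rectified-stereo depth: Z = f * B / d, with the focal length f in
    pixels, the baseline B in meters, and the disparity d in pixels.
    Larger disparity means a nearer object."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return focal_px * baseline_m / disparity_px
```

For example, with a 700 px focal length and a 0.5 m baseline, a 35 px disparity corresponds to 10 m.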
Safe exit assist system
A safe exiting assistance system includes an object detector that detects an object which approaches from a rear side of a vehicle, a body detector that detects physical information of a passenger in the vehicle, a door opening/closing detector that detects door opening or closing information of the vehicle, a determiner that determines whether a situation requires alert to the passenger in response to detection results of the object detector and the door opening/closing detector and calculates a value sensed by the body detector in response to vehicle information set in advance, and a controller that differently controls a position and an irradiation angle of a light source in response to the physical information when information in which the alert is necessary is received from the determiner.
Control of Autonomous Vehicle Based on Environmental Object Classification Determined Using Phase Coherent LIDAR Data
Determining classification(s) for object(s) in an environment of an autonomous vehicle, and controlling the vehicle based on the determined classification(s). For example, autonomous steering, acceleration, and/or deceleration of the vehicle can be controlled based on determined pose(s) and/or classification(s) for objects in the environment. The control can be based on the pose(s) and/or classification(s) directly, and/or based on movement parameter(s), for the object(s), determined based on the pose(s) and/or classification(s). In many implementations, pose(s) and/or classification(s) of environmental object(s) are determined based on data from a phase coherent Light Detection and Ranging (LIDAR) component of the vehicle, such as a phase coherent LIDAR monopulse component and/or a frequency-modulated continuous wave (FMCW) LIDAR component.
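For the FMCW component mentioned above, range follows from the beat frequency between the transmitted and received chirps: R = c·f_beat·T/(2B). A hedged sketch of that relation only, ignoring the Doppler term that phase-coherent FMCW also resolves:

```python
C_MPS = 299_792_458.0  # speed of light, m/s

def fmcw_range_m(beat_hz, chirp_duration_s, chirp_bandwidth_hz):
    """Range from an FMCW chirp's beat frequency:
    R = c * f_beat * T / (2 * B), where T is the chirp duration and B the
    swept bandwidth. Doppler compensation is omitted for brevity."""
    return C_MPS * beat_hz * chirp_duration_s / (2.0 * chirp_bandwidth_hz)
```

E.g. a 2 MHz beat over a 10 µs, 1.5 GHz chirp corresponds to roughly 2 m of range.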
BOUNDING BOX ESTIMATION AND LANE VEHICLE ASSOCIATION
Disclosed are techniques for estimating a 3D bounding box (3DBB) from a 2D bounding box (2DBB). Conventional techniques to estimate 3DBB from 2DBB rely upon classifying target vehicles within the 2DBB. When the target vehicle is misclassified, the projected bounding box from the estimated 3DBB is inaccurate. To address such issues, it is proposed to estimate the 3DBB without relying upon classifying the target vehicle.
Distance measurement device
A distance measurement device includes: a light source configured to emit visible illumination light; an imaging element configured to receive reflected light of the illumination light from an object; and a signal processing circuit configured to reduce the emission of the illumination light in a predetermined period, detect a timing when the reception of the reflected light at the imaging element is reduced due to the reduction of the illumination light, and measure a distance to the object on the basis of the detected timing.
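The measurement principle described here, detect when the received signal dips in response to a deliberate reduction of the emitted light, then convert the round-trip delay to range, can be sketched as follows. The sampled-signal representation and threshold are illustrative assumptions.

```python
C_MPS = 299_792_458.0  # speed of light, m/s

def detect_dip_index(samples, threshold):
    """Index of the first sample where received intensity falls below the
    threshold -- a stand-in for detecting 'the timing when reception of
    the reflected light is reduced'."""
    for i, s in enumerate(samples):
        if s < threshold:
            return i
    return None

def distance_from_dip_m(emit_dip_index, recv_dip_index, sample_period_s):
    """The received dip trails the emitted dip by the round trip 2R/c,
    so R = c * dt / 2."""
    dt = (recv_dip_index - emit_dip_index) * sample_period_s
    return C_MPS * dt / 2.0
```

With a 100 ns sample period, a two-sample lag between the emitted and received dips corresponds to about 30 m.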
VEHICULAR TRAILER HITCHING ASSIST SYSTEM
A vehicular trailer hitching assist system includes a rear backup camera disposed at a rear portion of a vehicle and a control disposed in the vehicle. A display device includes a video display screen operable to display video images for viewing by a driver of the vehicle. Responsive to the driver initiating a hitching maneuver event by engaging reverse gear of the vehicle, rear backup video images derived from image data captured by the rear backup camera are displayed at the video display screen. An overlay overlays the displayed rear backup video images and aids in guiding connection of a trailer hitch of the vehicle to a trailer tongue of a trailer. Display of the overlaid rear backup video images ceases upon elapse of a threshold period of time, or upon the vehicle exceeding a threshold speed or a threshold distance traveled.
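The cessation logic at the end of this abstract is an any-threshold-exceeded test over elapsed time, speed, and distance traveled. A sketch with made-up threshold values (the patent does not state them):

```python
def should_cease_overlay(elapsed_s, speed_mps, distance_m,
                         max_time_s=120.0, max_speed_mps=2.0, max_dist_m=30.0):
    """Cease displaying the overlaid rear backup video once ANY threshold
    is exceeded: elapsed time, vehicle speed, or distance traveled.
    Threshold values here are invented for illustration."""
    return (elapsed_s > max_time_s
            or speed_mps > max_speed_mps
            or distance_m > max_dist_m)
```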
Processing of Sensor Data for a Driver Assistance System
In order to process sensor data for a driver assistance system oriented towards the driver's comfort, sensor data that is sensed by a sensor device and describes objects is preprocessed such that a distinction is made between a driving zone and a non-driving zone, where the driving zone is designated as an object driving zone. The object driving zone is delimited by a boundary line. Since the sensor data is processed for a comfort-oriented driver assistance system, it does not have to describe the entire theoretical driving zone. Rather, the boundary line is used to delimit the driving zone within which the vehicle can normally be expected to drive. Based thereon, it is easy to determine an appropriate boundary line and significantly reduce the volume of data to be transmitted from the sensor device to a central control device of the comfort-oriented driver assistance system in order to describe the sensed objects.
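The data-reduction idea, transmit only the objects inside the driving zone delimited by the boundary line, might look like the following, where the boundary line is simplified to a function mapping forward distance to a maximum drivable lateral offset (an assumption for illustration; the patent's boundary line is more general):

```python
def filter_by_boundary(objects_xy, lateral_limit_m):
    """Keep only objects inside the delimited object driving zone.
    'lateral_limit_m' maps forward distance x to the maximum lateral
    offset |y| considered drivable -- a simplified boundary line."""
    return [(x, y) for (x, y) in objects_xy if abs(y) <= lateral_limit_m(x)]
```

Objects outside the boundary are dropped before transmission to the central control device, which is where the volume reduction comes from.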
IMAGE GENERATING DEVICE AND IMAGE GENERATING METHOD
An ECU captures an image of an imaging region around an own vehicle and acquires image data, the imaging region being made up of a plurality of imaging areas. The ECU determines whether an object is present in each of the imaging areas on the basis of detection results for objects present around the own vehicle. The ECU selects, on the basis of the determination results, a target area to be displayed in an easy-to-see state from among the plurality of imaging areas. The ECU reduces and corrects the image data of each imaging area such that the image of the target area is displayed in an easier-to-see state than the images of the other imaging areas, thereby generating display image data.
CONTROLLING HOST VEHICLE BASED ON DETECTED DOOR OPENING EVENTS
Systems and methods are provided for navigating an autonomous vehicle. In one implementation, a system for navigating a host vehicle based on detecting a door opening event may include at least one processing device. The processing device may be programmed to receive at least one image associated with the environment of the host vehicle, analyze the at least one image to identify a side of a parked vehicle, identify a first structural feature of the parked vehicle and a second structural feature of the parked vehicle, identify a door edge of the parked vehicle in a vicinity of the first and second structural features, determine a change of an image characteristic of the door edge of the parked vehicle, and alter a navigational path of the host vehicle based at least in part on the change of the image characteristic of the door edge of the parked vehicle.
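The "change of an image characteristic of the door edge" test can be illustrated with a toy proxy: compare the mean intensity of the door-edge image patch across frames. The patch representation and threshold are assumptions for illustration; a real system would also track the door edge's geometry between images.

```python
def door_edge_changed(prev_patch, curr_patch, rel_threshold=0.15):
    """Flag a possible door-opening event when the mean intensity of the
    door-edge patch changes by more than rel_threshold between two frames.
    Patches are flat sequences of pixel intensities (a simplification)."""
    prev_mean = sum(prev_patch) / len(prev_patch)
    curr_mean = sum(curr_patch) / len(curr_patch)
    if prev_mean == 0:
        return curr_mean > 0
    return abs(curr_mean - prev_mean) / prev_mean > rel_threshold
```

A positive result would feed the navigational-path alteration the abstract describes (e.g. widening the lateral clearance past the parked vehicle).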