
Vehicular vision system with image manipulation
11577645 · 2023-02-14

A vehicular vision system includes a camera disposed at a front portion of a vehicle, a display screen and a processor for processing image data captured by the camera. The processor performs first, second and third image manipulations on first, second and third portions of the image data to generate first, second and third region manipulated image data. The display screen displays first, second and third images derived from the manipulated image data at respective display regions. The displayed images are discontinuous at a first seam between first and second display regions and discontinuous at a second seam between first and third display regions. An object present in first and second regions of the view of the camera is displayed as discontinuous at the first seam and an object present in the first and third regions of the view of the camera is displayed as discontinuous at the second seam.
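The region-wise manipulation described above can be sketched as follows. This is a minimal illustration, not the patented method: the region boundaries and the particular per-region operations (pass-through, horizontal compression, mirroring) are assumptions standing in for whatever manipulations the real system applies.

```python
def region_manipulate(frame):
    """Apply a different manipulation to each of three horizontal
    portions of a captured frame (a list of pixel rows) and recompose
    them for display. Boundaries and operations are illustrative."""
    h, w = len(frame), len(frame[0])
    a, b = w // 4, 3 * w // 4
    out = []
    for row in frame:
        left = row[:a]          # first region: passed through unchanged
        center = row[a:b][::2]  # second region: horizontally compressed
        right = row[b:][::-1]   # third region: mirrored (stand-in warp)
        # Recomposing independently manipulated regions leaves visible
        # seams: an object spanning a region boundary is displayed
        # discontinuously at that seam.
        out.append(left + center + right)
    return out
```

Because each region is transformed independently, pixel columns that were adjacent in the captured image data need not be adjacent in the displayed image, which is exactly the seam discontinuity the abstract describes.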

Vehicular trailer hitching assist system

A vehicular trailer hitching assist system includes a rear backup camera disposed at a rear portion of a vehicle and a control disposed in the vehicle. A display device includes a video display screen operable to display video images for viewing by a driver of the vehicle. During a hitching maneuver event undertaken by the driver to hitch a trailer hitch of the vehicle to a trailer tongue of a trailer, and while the driver of the vehicle is maneuvering the vehicle to hitch the vehicle to the trailer, the control outputs video images derived from image data captured by the rear backup camera for display by the video display screen. At least one overlay overlays the displayed video images to guide the driver during the hitching maneuver. The at least one overlay aids in guiding connection of the trailer hitch of the vehicle to the trailer tongue of the trailer.

Vehicular vision system that dynamically calibrates a vehicular camera

A vehicular vision system includes a camera disposed at a vehicle and operable to capture multiple frames of image data during a driving maneuver of the vehicle. A control includes an image processor that processes frames of captured image data to determine feature points in an image frame when the vehicle is operated within a first range of steering angles, and to determine motion trajectories of those feature points in subsequent image frames for the respective range of steering angles. The control determines a horizon line based on the determined motion trajectories. Responsive to determination that the determined horizon line is non-parallel to the horizontal axis of the image plane, at least one of pitch, roll or yaw of the camera is adjusted. Image data captured by the camera is processed at the control for object detection.
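The final step of the calibration above can be sketched in a few lines. This assumes the horizon points have already been derived from the feature-point motion trajectories (the trajectory extraction itself is not shown), and it corrects only roll; the full system also adjusts pitch and yaw.

```python
import math

def horizon_roll_correction(horizon_points):
    """Least-squares fit a horizon line through (x, y) points derived
    from feature-point motion trajectories, then return the roll
    correction, in radians, that would make the fitted horizon parallel
    to the horizontal axis of the image plane."""
    n = len(horizon_points)
    mx = sum(x for x, _ in horizon_points) / n
    my = sum(y for _, y in horizon_points) / n
    sxy = sum((x - mx) * (y - my) for x, y in horizon_points)
    sxx = sum((x - mx) ** 2 for x, _ in horizon_points)
    slope = sxy / sxx
    # A non-zero slope means the determined horizon line is non-parallel
    # to the horizontal image axis; rotating by -atan(slope) levels it.
    return -math.atan(slope)
```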

WARNING DEVICE, WARNING METHOD, AND WARNING PROGRAM
20180012087 · 2018-01-11

To issue an appropriate warning based on detection of an object even under circumstances where it is difficult to determine the outside environment of a movable body, a warning device according to the present invention includes an image acquisition unit configured to acquire a plurality of images, each based on a different filter characteristic; a detection unit configured to perform detection of a specified object on each of the acquired images; and a warning unit configured to issue a specific warning when the object is detected in at least one of the acquired images. The warning unit issues a higher level of warning when the object is detected in all of the images than when it is detected in only some of them.
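The two-tier warning logic can be sketched directly from the abstract. The numeric levels (0/1/2) are an assumption for illustration; the abstract only specifies that detection in all images produces a higher level of warning than detection in some.

```python
def warning_level(detections):
    """detections: one boolean per acquired image (one image per filter
    characteristic), True if the specified object was detected in it.
    Returns 0 (no warning), 1 (detected in some images), or
    2 (detected in all images, i.e. the higher level of warning)."""
    if detections and all(detections):
        return 2  # object detected under every filter characteristic
    if any(detections):
        return 1  # object detected in at least one of the images
    return 0      # no detection, no warning issued
```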

Driving support device

Provided is a driving support device configured to: generate a bird's-eye view image around an own vehicle from periphery images acquired from a plurality of periphery monitoring cameras, and generate a cropped bird's-eye view image by cropping an image of a cropping range including a blind spot range from the generated bird's-eye view image; generate a rear-lateral side converted image by converting a rear-lateral side direction image acquired from a rear-lateral side monitoring camera into an image in which left and right are inverted; and generate a combined image in which the generated cropped bird's-eye view image and the generated rear-lateral side converted image are arranged, and control a display device such that the generated combined image is displayed by the display device.
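The composition of the combined image can be sketched as below. This assumes the cropping range containing the blind-spot range is supplied by the caller (how the blind spot is located is not shown), and that images are lists of pixel rows of matching heights.

```python
def combine_views(birdseye, rear_lateral, crop):
    """Crop the generated bird's-eye view image to the given
    (top, bottom, left, right) cropping range, convert the rear-lateral
    side image by inverting left and right, and arrange the two in one
    combined image for display."""
    t, b, l, r = crop
    cropped = [row[l:r] for row in birdseye[t:b]]   # cropped bird's-eye view
    mirrored = [row[::-1] for row in rear_lateral]  # left/right inverted
    # Arrange side by side: cropped bird's-eye image on the left, the
    # rear-lateral side converted image on the right.
    return [c + m for c, m in zip(cropped, mirrored)]
```

Mirroring the rear-lateral image makes it read like a conventional side mirror, which is presumably why the conversion inverts left and right before display; the side-by-side arrangement here is one possible layout, not necessarily the patented one.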

VEHICULAR CONTROL SYSTEM
20230005275 · 2023-01-05

A vehicular control system includes a camera and a control having a processor that processes image data captured by the camera to determine an approaching vehicle that is approaching an intersection forward of the equipped vehicle. The system determines a projected path of the equipped vehicle. The estimated time to arrival of the approaching vehicle at the intersection is determined at least in part by processing of captured image data. Responsive to determination that the equipped vehicle will complete a turn at the intersection before the estimated time to arrival elapses, the system may determine that it is safe to proceed along the projected path of travel. Responsive at least in part to determination that the equipped vehicle will not complete the turn before the estimated time to arrival elapses, the system may determine that it is not safe to proceed along the projected path of travel.
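The core decision reduces to a time comparison, sketched below. The safety margin is an assumption added for illustration; the abstract compares only the turn-completion time against the estimated time to arrival.

```python
def safe_to_proceed(turn_time, eta_approaching, margin=1.0):
    """Decide whether the equipped vehicle may proceed along its
    projected path through the intersection. turn_time is the estimated
    time (seconds) for the equipped vehicle to complete its turn;
    eta_approaching is the estimated time to arrival (seconds) of the
    approaching vehicle at the intersection."""
    # Safe only if the turn completes before the approaching vehicle's
    # estimated time to arrival elapses (with a safety margin).
    return turn_time + margin < eta_approaching
```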

Cross-validating sensors of an autonomous vehicle

Methods and systems are disclosed for cross-validating a second sensor with a first sensor. Cross-validating the second sensor may include obtaining sensor readings from the first sensor and comparing the sensor readings from the first sensor with sensor readings obtained from the second sensor. In particular, the comparison of the sensor readings may include comparing state information about a vehicle detected by the first sensor and the second sensor. In addition, comparing the sensor readings may include obtaining a first image from the first sensor, obtaining a second image from the second sensor, and then comparing various characteristics of the images. One characteristic that may be compared is the object label applied to the vehicle detected by the first and second sensors. The first and second sensors may be different types of sensors.
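The comparison step can be sketched as follows. The reading structure (a `label` classification plus a 2-D `position`) and the position tolerance are assumptions for illustration; the disclosed system compares whatever state information the sensors actually report.

```python
def cross_validate(reading_a, reading_b, pos_tol=0.5):
    """Compare state information about a detected vehicle reported by a
    first and a second sensor. Each reading is a dict with 'label' (the
    object label applied by that sensor) and 'position' (x, y). Returns
    True when the second sensor's reading is consistent with the first's."""
    if reading_a["label"] != reading_b["label"]:
        return False  # the object labels applied by the sensors disagree
    ax, ay = reading_a["position"]
    bx, by = reading_b["position"]
    # State information agrees when the detected positions match within
    # a tolerance (sensors of different types rarely agree exactly).
    return (ax - bx) ** 2 + (ay - by) ** 2 <= pos_tol ** 2
```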

METHOD FOR DISPLAYING A SURROUNDINGS MODEL OF A VEHICLE, COMPUTER PROGRAM, ELECTRONIC CONTROL UNIT AND VEHICLE
20230025209 · 2023-01-26

A method for displaying a surroundings model of a vehicle. The method includes: capturing at least one sequence of camera images of at least one section of the surroundings of the vehicle with the aid of at least one camera; detecting a position of the vehicle; storing at least one camera image of the surroundings of the vehicle, each stored camera image being assigned the detected position of the vehicle at the moment the stored camera image was captured; determining distances between the vehicle and objects in the surroundings; generating at least one close-range projection surface which represents the close range around the vehicle, the close-range projection surface being deformed three-dimensionally depending on the determined distances; and displaying the surroundings model as a function of the generated close-range projection surface, at least one current camera image, a stored camera image and the present vehicle position.

Close-in sensing camera system

The technology relates to an exterior sensor system for a vehicle configured to operate in an autonomous driving mode. The technology includes a close-in sensing (CIS) camera system to address blind spots around the vehicle. The CIS system is used to detect objects within a few meters of the vehicle. Based on object classification, the system is able to make real-time driving decisions. Classification is enhanced by employing cameras in conjunction with lidar sensors. The specific arrangement of multiple sensors in a single sensor housing is also important to object detection and classification. Thus, the positioning of the sensors and support components are selected to avoid occlusion and to otherwise prevent interference between the various sensor housing elements.

Collision avoidance and/or pedestrian detection system
11697371 · 2023-07-11

A collision avoidance and/or pedestrian detection system for a large passenger vehicle, such as a commuter bus, is described herein in various embodiments, along with a method for avoiding collisions and/or detecting pedestrians and features/articles of manufacture for improving the same. The system includes one or more exterior and/or interior sensing devices positioned strategically around the exterior and interior of the vehicle for recording data. The sensing devices may be responsive to one or more situational sensors, and may be connected to one or more interior and/or exterior warning systems configured to alert a driver inside the vehicle and/or a pedestrian outside the vehicle that a collision may be possible and/or imminent based on a path of the vehicle and/or a position of the pedestrian as detected by one or more sensing devices and/or situational sensors.