
Method and apparatus for an automated trailer backup system in a motor vehicle

Methods and apparatus are provided for performing an assisted-driving trailer reversing operation. A camera captures an image, and an interactive user interface displays a graphical user interface and receives a user input indicative of a trailer destination. A processor generates the graphical user interface in response to the user input, generates a left maneuverability margin and a right maneuverability margin in response to a trailer dimension and a hitch angle, and generates a projected trailer path in response to the trailer destination. The graphical user interface includes the image and a plurality of graphics overlaid on the image indicative of the left maneuverability margin, the right maneuverability margin, the projected trailer path and the trailer destination. A vehicle controller performs the trailer reversing operation in response to a control signal.
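The abstract does not specify how the margins are derived from the trailer dimension and hitch angle. As one plausible reading, a sketch assuming a kinematic single-axle trailer model with a constant hitch angle: the trailer sweeps an arc of curvature tan(phi)/L, and the margins are the path offset laterally by half the trailer width. The function name, step length and model are illustrative, not from the patent.

```python
import math

def maneuverability_margins(trailer_width, trailer_length, hitch_angle_rad,
                            steps=20, step_len=0.25):
    """Project a simple trailer path for a fixed hitch angle and offset it
    by half the trailer width to obtain left/right margin polylines."""
    path, left, right = [], [], []
    x, y, heading = 0.0, 0.0, 0.0
    # Kinematic single-axle approximation: curvature = tan(phi) / L.
    curvature = math.tan(hitch_angle_rad) / trailer_length
    for _ in range(steps):
        heading += curvature * step_len
        x += step_len * math.cos(heading)
        y += step_len * math.sin(heading)
        path.append((x, y))
        # Unit normal to the travel direction gives the lateral offset.
        nx, ny = -math.sin(heading), math.cos(heading)
        half = trailer_width / 2.0
        left.append((x + half * nx, y + half * ny))
        right.append((x - half * nx, y - half * ny))
    return path, left, right
```

The three polylines could then be drawn over the camera image as the overlaid graphics the abstract describes.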

Vehicular vision system with image manipulation
11577645 · 2023-02-14

A vehicular vision system includes a camera disposed at a front portion of a vehicle, a display screen and a processor for processing image data captured by the camera. The processor performs first, second and third image manipulations on first, second and third portions of the image data to generate first, second and third region manipulated image data. The display screen displays first, second and third images derived from the manipulated image data at respective display regions. The displayed images are discontinuous at a first seam between first and second display regions and discontinuous at a second seam between first and third display regions. An object present in first and second regions of the view of the camera is displayed as discontinuous at the first seam and an object present in the first and third regions of the view of the camera is displayed as discontinuous at the second seam.
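The abstract does not say which manipulations are applied; as a minimal illustration of the region-wise structure, a sketch that splits each row into three portions and applies an independent per-pixel transform to each. Because each region is transformed separately, content crossing a boundary can appear discontinuous at the seams, as the abstract describes. The function and the per-region transforms are illustrative assumptions.

```python
def region_manipulate(image, f1, f2, f3):
    """Apply three independent manipulations to the left, centre and right
    thirds of an image (given as a list of pixel rows)."""
    w = len(image[0])
    s1, s2 = w // 3, 2 * w // 3  # seam columns between the three regions
    out = []
    for row in image:
        out.append([f1(p) for p in row[:s1]] +
                   [f2(p) for p in row[s1:s2]] +
                   [f3(p) for p in row[s2:]])
    return out
```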

Vehicular vision system

A vehicular vision system includes a camera, a distance sensor and a controller having at least one processor. Image data captured by the camera and sensor data captured by the distance sensor are processed at the controller. The controller, responsive to processing of captured image data and of captured sensor data, detects an object. The controller determines the distance to the detected object based at least in part on difference between the positions of the detected object in captured image data and in captured sensor data. The controller, responsive to processing of captured image data and of captured sensor data, and responsive to the determined distance to the detected object, determines that the detected object represents a collision risk. The controller alerts a driver of the vehicle of the collision risk and/or controls the vehicle to mitigate the collision risk.
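Determining distance from the positional difference of the same detection in two sensors with a known baseline is classic triangulation, Z = f·B/d. A minimal sketch under that reading; the function name and the pixel-coordinate parameterization are assumptions, not from the patent.

```python
def distance_from_disparity(x_cam_px, x_sensor_px, focal_px, baseline_m):
    """Estimate object distance from the positional difference (disparity)
    between the camera detection and the distance-sensor detection,
    assuming a known baseline between the two sensors: Z = f * B / d."""
    disparity = abs(x_cam_px - x_sensor_px)
    if disparity == 0:
        return float('inf')  # no parallax: object effectively at infinity
    return focal_px * baseline_m / disparity
```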

Method for displaying the surroundings of a vehicle on a display device, processing unit and vehicle
20230226977 · 2023-07-20

A method for displaying an environment of a vehicle on a display includes: recording the environment with at least two cameras, each having a different field of view, wherein fields of view of adjacent cameras overlap; creating a panoramic image from at least two images taken by differing cameras, the images being projected into a reference plane for creating the panoramic image; ascertaining depth information pertaining to an object in the environment by triangulation from at least two differing individual images taken by the same camera; generating an overlay structure as a function of the ascertained depth information, the overlay structure being uniquely assigned to an imaged object; and representing the created panoramic image, containing the at least one object, and the at least one generated overlay structure on the display such that the overlay structure is displayed on, and/or adjacent to, the assigned object.
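Triangulating depth from two individual images taken by the same moving camera amounts to intersecting two bearing rays from two known camera positions. A minimal 2-D sketch of that step; the function and the ray parameterization are illustrative assumptions.

```python
def triangulate_2d(cam1, ray1, cam2, ray2):
    """Intersect two bearing rays from two camera positions to recover a
    point's position. Solves cam1 + t1*ray1 = cam2 + t2*ray2 for t1 using
    Cramer's rule on the 2x2 system [ray1, -ray2] [t1; t2] = cam2 - cam1."""
    dx, dy = cam2[0] - cam1[0], cam2[1] - cam1[1]
    denom = ray1[0] * (-ray2[1]) - ray1[1] * (-ray2[0])
    if abs(denom) < 1e-12:
        return None  # parallel rays: no depth information
    t1 = (dx * (-ray2[1]) - dy * (-ray2[0])) / denom
    return (cam1[0] + t1 * ray1[0], cam1[1] + t1 * ray1[1])
```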

Method and system for assisting drivers in locating objects that may move into their vehicle path
11697425 · 2023-07-11

A system and method for assisting drivers of vehicles are described. The system and method provide an extended view of the area surrounding the driver's vehicle while providing real-time trajectories for objects and other vehicles that may enter the driver's reactionary zone. The system and method capture images of the area surrounding the driver's vehicle, create a composite image of that area in real time, and, using augmented reality (AR), create a 3-D overlay that warns the driver as objects or other vehicles enter the reactionary zone, so that the driver can make more informed driving decisions.
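The abstract leaves the reactionary-zone test unspecified. As one simple reading, a sketch that models the zone as a circle around the ego vehicle and checks whether an object's constant-velocity trajectory enters it within a short horizon; the function, the circular zone and the constant-velocity assumption are all illustrative.

```python
def enters_reactionary_zone(obj_pos, obj_vel, zone_radius, horizon_s, dt=0.1):
    """Check whether an object moving at constant velocity enters the
    driver's reactionary zone (a circle of zone_radius centred on the ego
    vehicle at the origin) within horizon_s seconds."""
    x, y = obj_pos
    vx, vy = obj_vel
    t = 0.0
    while t <= horizon_s:
        if (x + vx * t) ** 2 + (y + vy * t) ** 2 <= zone_radius ** 2:
            return True  # trigger the AR warning overlay
        t += dt
    return False
```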

Display system and method

The present disclosure relates to a display system (1) for generating a composite view of a region behind a vehicle (V) towing a trailer (T). A first camera (C1) is provided for outputting first image data corresponding to a first image (IMG1), the first camera (C1) being configured to be mounted in a rear-facing orientation to the vehicle (V). A second camera (C2) is provided for outputting second image data corresponding to a second image (IMG2), the second camera (C2) being configured to be mounted in a rear-facing orientation to the trailer (T). An image processor (5) receives the first image data and said second image data. The image processor (5) is configured to combine said first image data and said second image data to generate composite image data corresponding to a composite image (IMG3). The present disclosure also relates to a corresponding method of generating a composite image (IMG3), and to a rig made up of a vehicle (V) and a trailer (T).
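The combining step can be sketched as masked blending: in the region of the vehicle camera's view occluded by the trailer body, pixels are taken (or blended) from the trailer camera, giving a "see-through" composite. The function, mask representation and blend factor are illustrative assumptions, not the disclosure's method.

```python
def composite_rear_view(vehicle_img, trailer_img, occluded, alpha=1.0):
    """Combine first image data (vehicle rear camera) with second image
    data (trailer rear camera). Where the per-pixel mask `occluded` is
    True, the trailer-camera pixel is blended in with weight alpha."""
    out = []
    for vr, tr, mr in zip(vehicle_img, trailer_img, occluded):
        out.append([(1 - alpha) * v + alpha * t if m else v
                    for v, t, m in zip(vr, tr, mr)])
    return out
```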

System and method for work machine
11549238 · 2023-01-10

A system includes a processor and a display. The processor acquires shape data indicative of a shape of the surroundings in a traveling direction of a work machine. The processor generates a guide line. The guide line is disposed spaced from the work machine. The guide line indicates the shape of the surroundings in the traveling direction of the work machine. The processor synthesizes a surroundings image and the guide line and generates an image including the surroundings image and the guide line. The display displays the image including the surroundings image and the guide line based on a signal from the processor. The system may further include the work machine, a camera that captures the surroundings image, and a sensor that measures the shape of the surroundings.

Vehicle positioning for inductive energy transfer

A method for bringing a vehicle closer to a vehicle-external primary charging unit configured to inductively charge the vehicle, where the vehicle includes a secondary charging unit, a camera system and a display device, includes the steps of a) capturing a real-time image of a vehicle environment using the camera system, wherein the primary charging unit is included in the real-time image, b) displaying the real-time image on the display device, and c) inserting at least one guide line into the real-time image. The direction and/or curvature of the guide line coincides with a steering angle lock of the vehicle, such that the guide line corresponds to the trajectory the vehicle would follow at that steering angle lock. The position of the at least one guide line in the real-time image of the vehicle environment is selected such that the guide line indicates a movement curve of the secondary charging unit of the vehicle. The method further includes indicating the movement curve of the secondary charging unit relative to the primary charging unit, based on a movement of the vehicle, by repeating steps a) to c).
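A guide line whose curvature matches the current steering angle is commonly derived from the kinematic bicycle model, curvature = tan(delta)/L; a lateral offset lets the same line trace the path of an off-centre component such as the secondary charging coil. A minimal sketch under those assumptions; the function name and parameters are illustrative.

```python
import math

def guide_line(steering_angle_rad, wheelbase_m, lateral_offset_m=0.0,
               n_points=15, step_m=0.2):
    """Ground-plane points of a guide line whose curvature matches the
    current steering angle (kinematic bicycle model)."""
    pts = []
    heading, x, y = 0.0, 0.0, lateral_offset_m
    k = math.tan(steering_angle_rad) / wheelbase_m  # path curvature
    for _ in range(n_points):
        heading += k * step_m
        x += step_m * math.cos(heading)
        y += step_m * math.sin(heading)
        pts.append((x, y))
    return pts
```

In a full system these ground-plane points would then be projected into the camera image before drawing.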

Assisted parking maneuvers for vehicles coupled in a towed recharging arrangement

Leading and trailing electrified vehicles are coupled together in a towing arrangement for in-flight transfer of an electrical charge between their battery systems. With the vehicles connected by a towing device, a parking maneuver is initiated in which the trailing vehicle leads the leading vehicle. For the parking maneuver, one of the vehicles is designated (e.g., automatically or by driver agreement) to be an active steering vehicle and the other vehicle to be a passive steering vehicle. At least the passive steering vehicle comprises an electrically-controlled steering actuator. During movement, a turning (e.g., steering angle) of the active steering vehicle is monitored. Based on the turning of the active steering vehicle, an assistive steering angle is determined for the passive steering vehicle. The electrically-controlled steering actuator is commanded according to the assistive steering angle. The parking maneuver may be reverse or forward.
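The abstract does not define how the assistive steering angle is computed from the active vehicle's turning. One plausible mapping is to match path curvature between the two wheelbases, since curvature = tan(delta)/L for each vehicle; the function below is a sketch of that assumption only.

```python
import math

def assistive_steering_angle(active_angle_rad, active_wheelbase_m,
                             passive_wheelbase_m):
    """Map the monitored steering angle of the active steering vehicle to
    a command for the passive vehicle's steering actuator by matching path
    curvature: delta_p = atan(L_p * tan(delta_a) / L_a)."""
    curvature = math.tan(active_angle_rad) / active_wheelbase_m
    return math.atan(curvature * passive_wheelbase_m)
```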

Method for calibrating a vehicular vision system
11535154 · 2022-12-27

A method for calibrating a vehicular vision system includes disposing a camera at a vehicle, disposing a processor at the vehicle, and disposing a video display screen in the vehicle so as to be viewable by the vehicle driver. The video display screen is operable to display video images derived from image data captured by the imager of the camera. Image data is captured by the imager of the camera and provided to the processor. The video display screen displays video images derived from image data captured by the imager of the camera. The processor generates a graphic overlay for display with the video images at the video display screen. Responsive to processing captured image data, the vehicular vision system is calibrated by adapting an orientation and position of the image data relative to the generated graphic overlay to a corrected orientation and position relative to the generated graphic overlay.
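Adapting the orientation and position of the image data relative to the overlay can be viewed as estimating a 2-D rigid correction from feature correspondences. A minimal two-point sketch of that idea; the function and the two-point formulation are illustrative assumptions, not the patented method.

```python
import math

def calibration_correction(detected, expected):
    """Estimate the rotation theta and translation (tx, ty) that map two
    detected feature positions in captured image data onto their expected
    positions relative to the graphic overlay."""
    (d1, d2), (e1, e2) = detected, expected
    # Rotation: difference of the segment orientations.
    ang_d = math.atan2(d2[1] - d1[1], d2[0] - d1[0])
    ang_e = math.atan2(e2[1] - e1[1], e2[0] - e1[0])
    theta = ang_e - ang_d
    c, s = math.cos(theta), math.sin(theta)
    # Translation that moves the rotated first detected point onto e1.
    tx = e1[0] - (c * d1[0] - s * d1[1])
    ty = e1[1] - (s * d1[0] + c * d1[1])
    return theta, (tx, ty)
```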