B60R2300/301

System and method for displaying vehicle driving information

A system for displaying vehicle driving information may include: first and second image acquisition devices configured to capture an image of a first side on which a driver seat is located and an image of a second side which is the opposite side of the first side, respectively; first and second display devices configured to display the captured images of the first and second image acquisition devices, respectively; first and second blind spot detection (BSD) devices configured to sense side-rear vehicle information; and a control device configured to recognize the location and vehicle distance of the side-rear vehicle by analyzing the side-rear vehicle information detected by the first and second BSD devices, determine whether to issue a BSD warning, and control BSD warning icons of the ego vehicle and the side-rear vehicle to be displayed as an on-screen display (OSD) overlay on the images displayed on the first and second display devices.
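The warning logic described above can be sketched as follows. This is a minimal illustration only; the distance threshold, field names, and the left/right mapping are assumptions, not taken from the patent.

```python
# Hypothetical sketch of the BSD warning decision and OSD icon selection.
# The 10 m threshold and the "side" naming are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class SideRearVehicle:
    side: str          # "left" (driver side) or "right"
    distance_m: float  # sensed distance from the ego vehicle, in metres

def should_warn(vehicle: SideRearVehicle, warn_distance_m: float = 10.0) -> bool:
    """Issue a BSD warning when a side-rear vehicle is within the warning distance."""
    return vehicle.distance_m <= warn_distance_m

def osd_icons(vehicles: list[SideRearVehicle]) -> dict[str, bool]:
    """Decide, per display side, whether a BSD warning icon should be overlaid."""
    icons = {"left": False, "right": False}
    for v in vehicles:
        if should_warn(v):
            icons[v.side] = True
    return icons
```

In this sketch each display device would consult its side's flag to decide whether to draw the warning icon over the camera image.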

System and method for enhancing situational awareness in a transportation vehicle

A method for enhancing situational awareness in a transportation vehicle including a tractor and trailer combination. The method includes locating a plurality of cameras on the trailer of the vehicle. Each camera is equipped with a camera wash system including a wash nozzle, dryer, and defroster. The cameras are operatively connected to a local area user network via an onboard access point. The onboard access point communicates with a data bus of the vehicle. At vehicle start-up, the cameras are automatically paired with a computing device located inside the tractor. The computing device includes a display screen and a graphical interface with icon tabs representing each of the connected cameras. Using the icon tabs rendered on the display screen of the computing device, a selected camera and its associated camera wash system can be manually activated.

Sensor array for an autonomously operated utility vehicle and method for surround-view image acquisition

A sensor apparatus for an autonomously operated commercial vehicle to allow panoramic capture of surroundings of the commercial vehicle, including: radar units mountable in front corner areas of the vehicle; downwardly directed cameras having a fisheye lens, mountable on front upper corner areas of the vehicle; at least one rearwardly directed sensor mounted on a section of the vehicle to allow rearward image capture; and an evaluation module to evaluate image data from the radar units, the downwardly directed cameras and the at least one rearwardly directed sensor to achieve the panoramic capture of the surroundings of the vehicle; in which the radar units and the at least one rearwardly directed sensor capture all points in a surrounding area of the vehicle, and wherein the downwardly directed cameras capture all points in the surrounding area of the vehicle. Also described are a related commercial vehicle, method and computer readable medium.

System for Monitoring the Surroundings of a Motor Vehicle

The invention relates to a system (1) for monitoring the surroundings of a motor vehicle (100), in particular an autonomous or semi-autonomous motor vehicle, wherein the system (1) includes at least one optical image capturing device (2) and a lighting device (3, 4), wherein the optical image capturing device (2) is arranged to capture an area of coverage (E1) of the surroundings, wherein the area of coverage (E1) can be illuminated at least partially, preferably completely, by the lighting device (3, 4), and wherein the lighting device (3, 4) is arranged to generate a motor vehicle light distribution or part of a motor vehicle light distribution. The system (1) further includes at least one additional optical image capturing device (5), which is arranged to capture an additional area of coverage (E2), also called the second area of coverage, and an additional lighting device (6), which is arranged to illuminate the additional area of coverage (E2) at least partially, preferably completely.

ASPHALT COMPACTOR BIRDS EYE CAMERA INTEGRATION

A compactor machine can include a machine frame; at least one cylindrical roller drum rotatably coupled to the machine frame and rotatable about a drum axis oriented generally transverse to a direction of travel of the compactor machine; a plurality of cameras mounted to the machine frame so as to provide a 360° bird's eye view of an area around the machine; a display showing the 360° bird's eye view; and a controller; wherein the controller is configured to determine certain conditions regarding a compacting operation of the compactor machine and to overlay a highlight symbol on the 360° bird's eye view on the display, notifying a machine operator of the certain conditions.

Apparatus for monitoring the blind spot of a motor vehicle

An apparatus for monitoring the blind spot of a motor vehicle has a mirror assembly with at least one mirror element that can be moved using an actuator, and a microphone assembly. The microphone assembly is designed to determine the sound direction, and the movable mirror element is in the form of a sound reflector.

Apparatus, method, and vehicle for providing braking level of forward vehicle

An apparatus, method, and vehicle for providing a braking level of a forward vehicle may quantify a degree of braking of the forward vehicle into a braking level and provide the braking level of the forward vehicle to a driver. The apparatus includes a brake lamp position recognizing device configured to recognize positions of brake lamps of the forward vehicle based on an image and a relative acceleration of the forward vehicle, a braking determining device configured to determine whether or not braking of the forward vehicle is performed based on a brake lamp image extracted from the image of the forward vehicle, a braking level determining device configured to determine a braking level of the forward vehicle based on the relative acceleration of the forward vehicle, and a braking level image providing device configured to provide the determined braking level of the forward vehicle through an image.
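The quantization of the forward vehicle's braking into discrete levels might look like the following sketch. The thresholds, the number of levels, and the sign convention for relative acceleration are illustrative assumptions, not values from the patent.

```python
def braking_level(relative_accel_mps2: float) -> int:
    """Map the forward vehicle's relative acceleration (m/s^2) to a braking level.

    Assumes a negative relative acceleration means the forward vehicle is
    slowing relative to the ego vehicle. Thresholds are illustrative only.
    """
    decel = max(0.0, -relative_accel_mps2)
    if decel < 1.0:
        return 0  # no meaningful braking
    if decel < 3.0:
        return 1  # light braking
    if decel < 6.0:
        return 2  # moderate braking
    return 3      # hard braking
```

A display device could then render the returned level, for example as a bar of one to three segments next to the image of the forward vehicle.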

Monitoring System for a Vehicle
20220402434 · 2022-12-22

A monitoring system for a vehicle. The monitoring system includes at least one imaging sensor arranged at a side mirror of a front-seat passenger side of the vehicle, a displaying unit arranged in a visual range of a driver of the vehicle, and a control unit configured to receive information on a status of at least a turning indicator. The imaging sensor is configured to capture image data of an environment of the vehicle. The displaying unit is configured to display the image data of the imaging sensor. The control unit is configured to control the display of the image data by means of the displaying unit based on the information of the status of the turning indicator.

System and methods for object detection and tracking using a lidar observation model
11531113 · 2022-12-20

A system for detecting and tracking objects using lidar can include one or more processors configured to receive lidar data. The one or more processors can determine shape data from the lidar data. The shape data can be indicative of an object. The one or more processors can determine a plurality of extents of the object based on the shape data. The one or more processors can update a state of the object based on the plurality of extents, the state including a boundary of the object. The one or more processors can provide the state of the object to an autonomous vehicle controller to cause the autonomous vehicle controller to control an autonomous vehicle responsive to the state of the object.
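A heavily simplified sketch of the extent computation and state update follows. Real lidar observation models work on 3D point clouds with oriented boxes; this 2D axis-aligned version, and all names in it, are assumptions for illustration only.

```python
# Illustrative sketch: derive 2D extents (an axis-aligned bounding box) from
# lidar shape points, then update a tracked object state with the boundary.
def extents(points: list[tuple[float, float]]) -> tuple[float, float, float, float]:
    """Return (min_x, min_y, max_x, max_y) of the object's shape points."""
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    return (min(xs), min(ys), max(xs), max(ys))

def update_state(state: dict, points: list[tuple[float, float]]) -> dict:
    """Update the tracked state: store the boundary and a centroid estimate."""
    x0, y0, x1, y1 = extents(points)
    state["boundary"] = (x0, y0, x1, y1)
    state["centroid"] = ((x0 + x1) / 2.0, (y0 + y1) / 2.0)
    return state
```

The updated state (boundary plus centroid) is what would be handed to a downstream controller in the arrangement the abstract describes.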

IMAGE BASED TRAILER WHEELBASE DETERMINATION

A method for determining a trailer detection parameter of a trailer includes receiving an image from at least one camera at a controller. The at least one camera defines a field of view including at least a portion of a vehicle trailer. The method determines a trailer angle of the vehicle trailer relative to a tractor, identifies at least one feature in an image of the trailer, determines a two-dimensional distance from the at least one feature to a predefined position on the image, and converts the two-dimensional distance to a three-dimensional distance based at least in part on the determined angle. The three-dimensional distance is a trailer detection parameter of the trailer.
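The angle-based 2D-to-3D conversion can be illustrated with a simple foreshortening correction. The cosine correction and the metres-per-pixel calibration factor are assumptions for the sketch, not the patent's actual conversion.

```python
import math

def pixel_to_metric(distance_px: float, trailer_angle_deg: float,
                    metres_per_px: float = 0.01) -> float:
    """Convert an image-plane (2D) distance to a real-world (3D) distance.

    Illustrative assumption: a feature distance measured in the image is
    foreshortened by cos(trailer angle), so the real distance is recovered
    by dividing by that cosine after applying a calibration scale.
    """
    return distance_px * metres_per_px / math.cos(math.radians(trailer_angle_deg))
```

With the trailer straight behind the tractor (angle 0°) the conversion reduces to the calibration scale alone; at larger articulation angles the same pixel distance maps to a longer physical distance.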