Patent classifications
B60Q2300/41
AUTOMATIC HIGH BEAM CONTROL FOR AUTONOMOUS MACHINE APPLICATIONS
In various examples, high beam control for vehicles may be automated using a deep neural network (DNN) that processes sensor data received from vehicle sensors. The DNN may process the sensor data to output pixel-level semantic segmentation masks in order to differentiate actionable objects (e.g., vehicles with front or back lights lit, bicyclists, or pedestrians) from other objects (e.g., parked vehicles). Resulting segmentation masks output by the DNN(s), when combined with one or more post-processing steps, may be used to generate masks for automated high beam on/off activation and/or dimming or shading—thereby providing additional illumination of an environment for the driver while controlling downstream effects of high beam glare for active vehicles.
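As an illustrative sketch of the mask-generation step described above, the segmentation output can be turned into a dim/shade mask by growing the actionable-object pixels by a safety margin. The class IDs, the margin, and the naive shift-based dilation below are assumptions for illustration, not details from the abstract:

```python
import numpy as np

# Hypothetical class IDs for actionable objects (lit vehicle, bicyclist, pedestrian)
ACTIONABLE = {1, 2, 3}

def dim_mask(seg: np.ndarray, margin: int = 2) -> np.ndarray:
    """Return a boolean mask of pixels where the high beam should be dimmed
    or shaded: actionable-object pixels grown by a square margin."""
    hit = np.isin(seg, list(ACTIONABLE))
    out = hit.copy()
    # Naive square dilation by shifting the mask within the margin.
    # Note: np.roll wraps at image borders, which is acceptable for a sketch.
    for dy in range(-margin, margin + 1):
        for dx in range(-margin, margin + 1):
            out |= np.roll(np.roll(hit, dy, axis=0), dx, axis=1)
    return out
```

A real pipeline would use a proper morphological dilation and project the image-space mask into the headlamp's beam pattern, but the on/off decision per pixel follows this shape.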
SYSTEM FOR CONTROLLING AT LEAST ONE HEADLIGHT OF A MOTOR VEHICLE
A system controls at least one headlight of a motor vehicle provided with at least one light-emitting diode. The system includes at least one camera including a calculator suitable for determining the presence of at least one vehicle to be detected in the area in front of the motor vehicle and suitable for determining the shape of a shadow zone around at least one detected vehicle such that the latter is not blinded by the light emitted by the at least one light-emitting diode of the headlight, and a controller for controlling the headlight. The controller is suitable for controlling the state of each light-emitting diode according to the operating state of the headlight and the shape of the shadow zone around the detected vehicle.
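The shadow-zone control can be sketched for a hypothetical matrix headlight in which each LED illuminates an equal angular sector of the beam; the LED count, beam span, and glare margin below are invented for illustration:

```python
# Hypothetical matrix headlight: N LEDs, each covering an equal angular sector.
N_LEDS = 16
BEAM_LEFT, BEAM_RIGHT = -20.0, 20.0  # beam span in degrees

def led_states(vehicle_left_deg, vehicle_right_deg, margin_deg=1.0):
    """Return per-LED on/off states so that the sectors covering the detected
    vehicle (plus a margin forming the shadow zone) are switched off."""
    lo = vehicle_left_deg - margin_deg
    hi = vehicle_right_deg + margin_deg
    width = (BEAM_RIGHT - BEAM_LEFT) / N_LEDS
    states = []
    for i in range(N_LEDS):
        a = BEAM_LEFT + i * width  # sector start
        b = a + width              # sector end
        # An LED stays on unless its sector overlaps the shadow zone
        states.append(not (a < hi and b > lo))
    return states
```

The shape of the shadow zone then follows the detected vehicle frame by frame as its angular extent changes.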
Vehicular driver assistance system with construction zone recognition
A vehicular driver assistance system includes a camera that views through the windshield of the vehicle and a control device having an image processor that processes captured image data. Responsive to processing of captured image data by the image processor, the system adjusts a light beam emitted by a headlamp of the vehicle. The control device, responsive at least in part to image data processing by the image processor, determines when the vehicle is at a construction zone. Responsive to determination that the vehicle is at a construction zone, image processing of image data captured by the camera is adjusted to discriminate construction zone signs from taillights of leading vehicles. Responsive to determination of the vehicle exiting the construction zone, the control device adjusts the light beam emitted by the headlamp of the vehicle responsive to determination of headlamps of approaching vehicles and taillights of leading vehicles.
VEHICULAR VISION SYSTEM WITH CONSTRUCTION ZONE RECOGNITION
A vehicular vision system includes an image processor and a camera that views through the windshield of the vehicle. The camera captures image data as the vehicle travels along a road, and the image processor processes image data captured by the camera. The vehicular vision system, responsive at least in part to processing by the image processor of image data captured by the camera, determines when the vehicle is at a construction zone. Responsive to determining that the vehicle is at the construction zone, the vehicular vision system adjusts a vehicular driver assistance system of the vehicle. The vehicular vision system determines that the vehicle exits the construction zone based at least in part on processing by the image processor of image data captured by the camera.
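One plausible (hypothetical) shape for the enter/exit determination above is a debounced state machine, so a single misclassified frame does not toggle the driver assistance adjustment; the frame thresholds are assumptions:

```python
class ConstructionZoneState:
    """Debounced construction-zone state fed by per-frame detections."""

    def __init__(self, enter_frames=3, exit_frames=5):
        self.in_zone = False
        self._streak = 0
        self.enter_frames = enter_frames
        self.exit_frames = exit_frames

    def update(self, zone_detected: bool) -> bool:
        """Feed one frame's detection result; return the current zone state.
        Requiring several consistent frames before switching avoids toggling
        the assistance adjustment on single-frame misdetections."""
        if zone_detected != self.in_zone:
            self._streak += 1
            need = self.enter_frames if not self.in_zone else self.exit_frames
            if self._streak >= need:
                self.in_zone = not self.in_zone
                self._streak = 0
        else:
            self._streak = 0
        return self.in_zone
```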
Vehicular control system using a camera and lidar sensor to detect objects
A vehicular control system includes a plurality of sensors that include at least a camera and a 3D point-cloud LIDAR. As the vehicle travels along a road, and responsive at least in part to processing at an electronic control unit of 3D point-cloud LIDAR data captured by the 3D point-cloud LIDAR, the vehicular control system (a) determines presence of a pedestrian or cross traffic vehicle present exterior of the vehicle that (i) is not on the road that is being travelled along by the vehicle and is approaching the road to cross the road ahead of the vehicle and (ii) is at least in the field of sensing of the 3D point-cloud LIDAR and (b) at least in part controls at least one vehicle function of the vehicle responsive at least in part to the determined presence of the pedestrian or cross traffic vehicle.
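Reduced to a minimal geometric sketch, the determination in (a) might look like the following, assuming the road is modeled as a straight corridor along the x-axis and that cluster positions and velocities come from hypothetical upstream point-cloud tracking:

```python
ROAD_HALF_W = 4.0  # assumed road corridor half-width, metres

def is_approaching_cross_traffic(pos, vel, ahead_min=0.0, ahead_max=80.0):
    """True if an off-road object ahead of the vehicle is moving toward
    the road corridor (its lateral velocity points at the road)."""
    x, y = pos
    vx, vy = vel
    on_road = abs(y) <= ROAD_HALF_W
    ahead = ahead_min < x < ahead_max
    moving_toward_road = (y > ROAD_HALF_W and vy < 0) or (y < -ROAD_HALF_W and vy > 0)
    return (not on_road) and ahead and moving_toward_road
```

A real system would work in a road-frame derived from map or lane geometry rather than a fixed corridor, but the off-road / ahead / closing-on-the-road decomposition is the same.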
DRIVER ASSISTANCE SYSTEM AND METHOD
In order to provide an enhanced driver assistance system for a vehicle, a prediction of a movement of the third-party vehicle is determined based upon motion data relating to a third-party vehicle travelling in front which has moved out of a region of view of at least one sensor of the vehicle, or relating to an oncoming third-party vehicle which has not yet entered the region of view of the sensor, and based upon map data. The driver assistance system is then operated on the basis of the prediction.
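A minimal sketch of such a prediction, assuming the map data is reduced to a road-centreline polyline and the motion data to an arc-length position and speed at the moment the third-party vehicle left the region of view (all names illustrative, not from the patent):

```python
import math

def predict_position(polyline, s0, speed, dt):
    """Advance the third-party vehicle by speed*dt metres along the polyline
    from arc length s0 and return the predicted (x, y) position."""
    s = s0 + speed * dt
    for (x1, y1), (x2, y2) in zip(polyline, polyline[1:]):
        seg = math.hypot(x2 - x1, y2 - y1)
        if s <= seg:
            t = s / seg
            return (x1 + t * (x2 - x1), y1 + t * (y2 - y1))
        s -= seg
    return polyline[-1]  # clamp at the end of the known map
```

This constant-speed dead reckoning along the mapped road is the simplest instance; a fuller model would also predict speed changes from curvature or junctions in the map data.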
VEHICLE HEADLAMP
A vehicle headlamp (1) includes a lamp fitting (10), and a control unit (CO) configured to, when a signal indicating detection of a preceding vehicle (80) is input, control the lamp fitting (10) such that a total luminous flux amount of light emitted to a first region (211) and a second region (212), and widths (W211 and W212) in the first region (211) and the second region (212) change according to a position of the preceding vehicle (80) with respect to a vehicle (100), in which the first region (211) overlaps a whole of a visual recognition unit of the preceding vehicle (80), and edges (212R and 212L) of the second region (212) on both sides in the left-right direction are located on a preceding vehicle (80) side with respect to edges (211R and 211L) of the first region (211) on both sides in the left-right direction.
Control apparatus for vehicle headlight
A control apparatus (1) for a vehicle headlight, including a rear end detection unit (3, 7) that detects a rear end (117) of a preceding vehicle (111) and a light blocking area setting unit (7) that sets a light blocking area (123) that includes the rear end within a light illumination area of a headlight (101, 103) of an own vehicle (107), wherein the light blocking area setting unit sets the light blocking area such that a spreading extent (125, 127) of the light blocking area outside the rear end is wider on the side on which a front end of the preceding vehicle is present than on the opposite side, with reference to the rear end as viewed from the own vehicle.
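The asymmetric spread can be sketched as a small function over bearings (degrees) of the rear end as seen from the own vehicle; the margin values are invented for illustration:

```python
def blocking_area(rear_left_deg, rear_right_deg, front_on_left: bool,
                  wide_margin=2.0, narrow_margin=0.5):
    """Return (left, right) edges of the light-blocking area, spreading it
    wider on the side where the preceding vehicle's front end lies: a turning
    or angled vehicle extends beyond its rear end on that side."""
    if front_on_left:
        return (rear_left_deg - wide_margin, rear_right_deg + narrow_margin)
    return (rear_left_deg - narrow_margin, rear_right_deg + wide_margin)
```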
Driving assistance apparatus and driving assistance method for vehicle
In a driving assistance apparatus, an object detecting unit detects an object that is present in a periphery of an own vehicle based on an image captured by an imaging apparatus provided in the own vehicle. An avoidance control unit performs collision avoidance control for avoiding a collision between the detected object and the own vehicle when a collision between the object and the own vehicle is likely. A light distribution control unit switches irradiated light of an irradiation apparatus provided in the own vehicle between high beam and low beam based on a predetermined switching condition. The light distribution control unit performs switching suppression control to suppress switching of the irradiated light from high beam to low beam while the avoidance control unit is performing collision avoidance control in a case where the irradiated light is set to high beam.
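The switching suppression reduces to a small decision function; the condition names below are hypothetical, with the predetermined switching condition abstracted into `low_beam_requested`:

```python
def next_beam_state(current_high: bool, low_beam_requested: bool,
                    avoidance_active: bool) -> bool:
    """Return True for high beam. While collision avoidance control is
    running and the beam is already high, the high-to-low switch is
    suppressed so the detected object stays well illuminated."""
    if current_high and low_beam_requested and avoidance_active:
        return True  # suppression: keep high beam during avoidance
    if low_beam_requested:
        return False
    return current_high
```

Note the suppression only blocks a high-to-low transition; it never forces high beam on when the beam is already low.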