
VEHICULAR COLLISION AVOIDANCE SYSTEM
20230079211 · 2023-03-16

A vehicular collision avoidance system includes a forward-viewing camera viewing through the windshield at least forward of the equipped vehicle, a rearward-sensing radar sensor sensing at least rearward of the equipped vehicle, and an electronic control unit. The vehicular collision avoidance system detects vehicles present forward and/or rearward of the equipped vehicle. Responsive to data processing of radar data captured by the rearward-sensing radar sensor, the vehicular collision avoidance system detects another vehicle approaching the equipped vehicle from the rear, determines distance between the equipped vehicle and the other vehicle, and determines speed difference between the equipped vehicle and the other vehicle. Based at least in part on the determined distance between the equipped vehicle and the other vehicle and the determined speed difference between the equipped vehicle and the other vehicle, the vehicular collision avoidance system controls the equipped vehicle to mitigate impact by the other vehicle.
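The core decision described above — acting on the determined distance and speed difference to the rear vehicle — can be sketched as a time-to-collision (TTC) check. This is a minimal illustration, not the patented method: the threshold values and the three-level action scheme are assumptions.

```python
def rear_collision_mitigation(distance_m: float, speed_diff_mps: float,
                              ttc_threshold_s: float = 2.0) -> str:
    """Decide a mitigation action for a vehicle approaching from the rear.

    distance_m: gap between the equipped vehicle and the approaching vehicle.
    speed_diff_mps: closing speed (positive when the other vehicle is faster).
    Returns one of "none", "warn", "mitigate" (an illustrative action set).
    """
    if speed_diff_mps <= 0.0:
        return "none"  # the other vehicle is not closing the gap
    ttc = distance_m / speed_diff_mps  # time-to-collision in seconds
    if ttc > 2 * ttc_threshold_s:
        return "none"
    if ttc > ttc_threshold_s:
        return "warn"
    return "mitigate"  # e.g. adjust speed or position to mitigate impact
```

A real system would of course fuse the forward camera and rearward radar before acting; the sketch only covers the rearward TTC logic.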

METHOD FOR AVOIDING A COLLISION IN ROAD TRAFFIC ON THE BASIS OF ADAPTIVELY SETTING POTENTIALLY OCCUPIED AREAS

The present invention relates to a method for avoiding collisions of a moving vehicle with other road users in the surroundings of the vehicle, comprising at least the method steps of: a) detecting, by means of one or more sensors, the vehicle surroundings and the other road users located therein; b) dividing the vehicle surroundings into a plurality of occupied areas; c) classifying the other road users detected in method step a), wherein, by means of the classification, at least one road user group is assigned to each of the other road users; d) prioritising the road users classified in method step c), taking into account both the classification carried out in method step c) and the occupied areas defined in method step b), wherein road users from one or more predetermined road user groups in the particular occupied area are given a high priority and road users from other, non-predetermined road user groups in the particular occupied area are given a lower priority; e) determining the probability of collision of the other road users with the vehicle, wherein the collision probability is determined in accordance with the prioritisation carried out in method step d) and the collision probability of the other road users having a high priority is determined first; and f) changing or maintaining the current driving behaviour of the vehicle on the basis of the collision probabilities determined in method step e).
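Step d) — ordering road users by group and occupied area before evaluating collision probabilities — can be sketched as a priority sort. The dict shapes and the two-level priority are illustrative assumptions, not the claimed data model.

```python
def prioritise_road_users(road_users, high_priority_groups):
    """Order detected road users so that members of predetermined groups
    (e.g. pedestrians in the frontal area) have their collision
    probability evaluated first.

    road_users: list of dicts with "group" and "area" keys (assumed shape).
    high_priority_groups: dict mapping an occupied area to the set of
        road-user groups given high priority in that area.
    """
    def priority(user):
        groups = high_priority_groups.get(user["area"], set())
        return 0 if user["group"] in groups else 1  # 0 sorts first
    # sorted() is stable, so ties keep their detection order
    return sorted(road_users, key=priority)
```

Step e) would then iterate over the returned list in order, so high-priority road users are handled before lower-priority ones under the same compute budget.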

Crowd sourcing data for autonomous vehicle navigation

Systems and methods of processing crowdsourced navigation information for use in autonomous vehicle navigation are disclosed. A method may include processing, by a mapping server, crowdsourced navigation information from a plurality of vehicles obtained by sensors coupled to the plurality of vehicles, wherein the navigation information describes road lanes of a road segment; collecting data about landmarks identified proximate to the road segment, the landmarks including a traffic sign; generating, by the mapping server, an autonomous vehicle map for the road segment, wherein the autonomous vehicle map includes a spline corresponding to a lane in the road segment and the landmarks identified proximate to the road segment; and distributing, by the mapping server, the autonomous vehicle map to an autonomous vehicle for use in autonomous navigation over the road segment.
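The crowdsourcing step — fusing lane observations from many drives into one lane geometry — can be sketched by averaging aligned traces into a single polyline that would serve as control points for the lane spline. This is a deliberately simplified sketch: it assumes the traces are already resampled at common longitudinal positions, which a real mapping server would have to do first.

```python
def aggregate_lane_centerline(drives, ):
    """Average lane-center traces from multiple drives into one polyline.

    drives: list of traces; each trace is a list of (x, y) points sampled
    at the same longitudinal positions (a simplifying assumption).
    Returns the averaged polyline, usable as spline control points.
    """
    n = len(drives)
    length = min(len(trace) for trace in drives)  # truncate to shortest drive
    centerline = []
    for i in range(length):
        x = sum(trace[i][0] for trace in drives) / n
        y = sum(trace[i][1] for trace in drives) / n
        centerline.append((x, y))
    return centerline
```

Averaging many noisy consumer-grade traces is what lets the map reach lane-level accuracy without survey vehicles.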

Vehicle control device

A vehicle control device includes: a first detection unit that detects a traveling state of a host vehicle; a merging detection unit that detects that the host vehicle has approached within a predetermined area of a merging point when the host vehicle travels on a merging road toward the merging point at which a main road joins the merging road; a second detection unit that detects a speed of a lane flow of other vehicles traveling on the main road toward the merging point; a position detection unit that obtains a position of a pre-merging point, a virtual point on the main road from which a vehicle would reach the merging point when the host vehicle reaches the merging point; and a display control unit that controls a display device to display the position of the host vehicle and the pre-merging point.
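The pre-merging point admits a simple kinematic reading: a main-road vehicle travelling at the lane-flow speed from that point would arrive at the merging point at the same moment as the host. A minimal constant-speed sketch (the constant-speed assumption is mine, not the patent's):

```python
def pre_merging_point(host_dist_to_merge_m: float, host_speed_mps: float,
                      lane_flow_speed_mps: float) -> float:
    """Distance (m) upstream of the merging point, along the main road,
    at which the pre-merging point lies, assuming constant speeds.
    """
    if host_speed_mps <= 0.0:
        raise ValueError("host vehicle must be moving toward the merging point")
    time_to_merge_s = host_dist_to_merge_m / host_speed_mps
    # a main-road vehicle covers this distance in the same time
    return lane_flow_speed_mps * time_to_merge_s
```

Displaying this point next to the host position lets the driver judge which main-road gap they will effectively be merging into.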

Apparatus, method, computer program, base station and vehicle for providing information related to an approaching vehicle

Embodiments relate to a method, an apparatus, a computer program, a base station and a vehicle for providing information related to an approaching vehicle. The method comprises receiving (110) information related to a velocity of a plurality of vehicles. The method further comprises determining (120) the information related to the approaching vehicle based on the information related to the velocity of the plurality of vehicles. The information related to the approaching vehicle indicates a presence of a vehicle of the plurality of vehicles having a velocity higher than an average velocity of the plurality of vehicles. The method further comprises providing (130) the information related to the approaching vehicle to at least a subset of vehicles of the plurality of vehicles.
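The determining step (120) reduces to comparing each reported velocity with the group average. A minimal sketch, with the dict shape and return tuple as assumptions:

```python
def approaching_vehicle_info(velocities_kmh):
    """Indicate the presence of a vehicle faster than the group average.

    velocities_kmh: dict mapping a vehicle id to its reported velocity
    (assumed shape). Returns (presence_flag, ids_of_faster_vehicles).
    """
    avg = sum(velocities_kmh.values()) / len(velocities_kmh)
    faster = sorted(vid for vid, speed in velocities_kmh.items() if speed > avg)
    return (len(faster) > 0, faster)
```

In the claimed system this determination would run at the base station over V2X reports, and the result (step 130) would be pushed back only to the subset of vehicles affected.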

Vehicle-trailer distance detection device and method

A method for determining a distance between a camera positioned on a rear portion of a tow vehicle and a trailer coupler supported by a trailer positioned behind the tow vehicle as the tow vehicle approaches the trailer. The method includes identifying the trailer coupler of the trailer within one or more images of a rearward environment of the tow vehicle. The method also includes receiving sensor data from an inertial measurement unit supported by the tow vehicle. The method includes determining a pixel-wise intensity difference between a current received image from the one or more images and a previously received image from the one or more images. The method includes determining the distance based on the identified trailer coupler, the sensor data, and the pixel-wise intensity difference, where the distance includes a longitudinal distance, a lateral distance, and a vertical distance.
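The pixel-wise intensity difference between consecutive frames can be sketched directly; frames are taken here as plain lists of grayscale rows, and the mean-absolute-difference reduction is an illustrative choice, not the claimed estimator.

```python
def mean_intensity_difference(current, previous):
    """Mean absolute pixel-wise intensity difference between two
    same-sized grayscale frames (lists of rows of int values).

    Larger values suggest more apparent motion between frames, which a
    distance estimator can combine with IMU-derived ego-motion.
    """
    total, count = 0, 0
    for row_c, row_p in zip(current, previous):
        for c, p in zip(row_c, row_p):
            total += abs(c - p)
            count += 1
    return total / count
```

The full method would restrict this comparison to the region around the identified coupler and fuse it with the IMU data to recover longitudinal, lateral, and vertical distance components.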

Sequential fusion for 3D object detection

Techniques are provided for improving a perception processing pipeline for object detection that fuses image segmentation data (e.g., segmentation scores) with LiDAR points. The disclosed techniques are implemented using an architecture that accepts point clouds and images as input and estimates oriented 3D bounding boxes for all relevant object classes. In an embodiment, a method comprises: matching temporally, using one or more processors of a vehicle, points in a three-dimensional (3D) point cloud with an image; generating, using an image-based neural network, semantic data for the image; decorating, using the one or more processors, the points in the 3D point cloud with the semantic data; and estimating, using a 3D object detector with the decorated points as input, oriented 3D bounding boxes for the one or more objects.
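The "decorating" step — appending per-pixel semantic scores to the LiDAR points that project onto them — can be sketched as follows. The camera projection is abstracted into a caller-supplied callable, and the list-of-lists score grid is an assumed stand-in for the segmentation network's output tensor.

```python
def decorate_points(points, scores, project):
    """Append per-class segmentation scores to each 3-D point.

    points: list of (x, y, z) tuples.
    scores: 2-D grid of per-pixel score vectors from the image network.
    project: callable mapping a 3-D point to (row, col) pixel indices —
        the camera model is assumed and supplied by the caller.
    Points projecting outside the image are dropped.
    """
    decorated = []
    h, w = len(scores), len(scores[0])
    for p in points:
        r, c = project(p)
        if 0 <= r < h and 0 <= c < w:
            # concatenate the point coordinates with the pixel's scores
            decorated.append(p + tuple(scores[r][c]))
    return decorated
```

The decorated points then feed the 3D object detector in place of raw points, which is what makes the fusion "sequential": image network first, LiDAR network second.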

VEHICLE AUTOMATED DRIVING SYSTEM

A vehicle automated driving system 100 comprises a surrounding environment information acquiring device 10, a vehicle information acquiring device 20, a driver information acquiring device 30, a package selecting part 90, a package proposing part 91, an automated driving executing part 92, and a rejection count detecting part 93. The package selecting part determines the driving assistance package based on at least one of the surrounding environment information, the vehicle information, and the driver information, selects the determined driving assistance package if the rejection count of the determined driving assistance package is less than a predetermined threshold value, and selects a driving assistance package different from the determined driving assistance package if the rejection count of the determined driving assistance package is equal to or greater than the threshold value.
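The selection rule of the package selecting part can be sketched as a threshold check with a back-off. The fallback list and its ordering are an assumed policy; the abstract only requires that some different package be selected once the threshold is reached.

```python
def select_package(determined, rejection_counts, fallbacks, threshold=3):
    """Select a driving-assistance package, backing off when the driver
    has rejected the determined package too often.

    determined: package id chosen from environment/vehicle/driver info.
    rejection_counts: dict of package id -> times the driver rejected it.
    fallbacks: ordered alternative package ids (an assumed policy).
    """
    if rejection_counts.get(determined, 0) < threshold:
        return determined
    for alt in fallbacks:
        if alt != determined and rejection_counts.get(alt, 0) < threshold:
            return alt
    return None  # no acceptable package remains
```

The effect is that the system stops re-proposing a package the driver keeps rejecting, rather than nagging with the same suggestion.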

Systems and method to trigger vehicle events based on contextual information

This disclosure relates to a system and method for detecting vehicle events. Some or all of the system may be installed in a vehicle, operate at the vehicle, and/or be otherwise coupled with a vehicle. The system includes one or more sensors configured to generate output signals conveying information related to the vehicle. The system receives contextual information from a source external to the vehicle. The system detects a vehicle event based on the information conveyed by the output signals from the sensors and the received contextual information.
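The key idea — letting externally supplied context change how on-board sensor output is interpreted — can be sketched with a single signal. The braking example, the zone names, and the thresholds are all illustrative assumptions.

```python
def detect_vehicle_event(decel_mps2: float, contextual_zone: str) -> bool:
    """Trigger a braking event from a sensor signal plus external context.

    A lower deceleration threshold applies inside a school zone, so the
    same on-board signal can mean different things in different contexts.
    Thresholds are illustrative, not from the disclosure.
    """
    threshold = 3.0 if contextual_zone == "school_zone" else 6.0
    return decel_mps2 >= threshold
```

A production system would combine many output signals and richer context (weather, traffic, geofences) in the same pattern: context selects or reweights the detection criteria.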

Vehicle exterior environment recognition apparatus
11628836 · 2023-04-18

A vehicle exterior environment recognition apparatus includes a travel path derivation unit, a speed derivation unit, and a follow-up controller. The travel path derivation unit estimates an own-vehicle travel path and derives a target-vehicle travel path that contains a point on a target vehicle and forms a parallel curve to the own-vehicle travel path. The speed derivation unit derives a target-vehicle speed vector. The follow-up controller makes a follow-up control based on the target-vehicle speed vector on the condition that an angle formed by the target-vehicle speed vector and a tangential line to the target-vehicle travel path at the point on the target vehicle falls within a predetermined angular range, and makes the follow-up control based on a tangential speed component of the target-vehicle speed vector on the condition that the angle falls out of the angular range.
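The follow-up controller's branch — full speed vector inside the angular range, tangential component outside it — can be sketched in 2-D. The 15° range is an assumed value; the abstract only specifies that some predetermined range exists.

```python
import math

def follow_up_speed(target_vx: float, target_vy: float,
                    tangent_angle_rad: float,
                    max_angle_rad: float = math.radians(15.0)) -> float:
    """Speed used for follow-up control of the target vehicle.

    If the target's speed vector lies within max_angle_rad of the tangent
    to its derived travel path, use the full vector magnitude; otherwise
    fall back to the component along the tangent.
    """
    speed = math.hypot(target_vx, target_vy)
    if speed == 0.0:
        return 0.0
    heading = math.atan2(target_vy, target_vx)
    # wrap the heading difference into [-pi, pi] before taking |.|
    angle = abs((heading - tangent_angle_rad + math.pi) % (2 * math.pi) - math.pi)
    if angle <= max_angle_rad:
        return speed
    return speed * math.cos(angle)  # tangential component only
```

Discarding the cross-path component when the vectors disagree keeps the follow-up control from chasing lateral noise, such as a target drifting within its lane.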