CONTROL METHOD AND INFORMATION DISPLAY SYSTEM

A control method is performed by an information display system that displays information on one or more display units included in a vehicle. The information display system executes guidance processing for guiding a line of sight of a driver of the vehicle toward a display unit which is not in the line of sight. The control method includes: obtaining vicinity information indicating a state of the vicinity of the vehicle from a device outside the vehicle; determining whether or not the driver should pay more attention to the state of the vicinity indicated in the vicinity information than to the information displayed on the one or more display units; and executing control for suppressing the guidance processing when it is determined that the driver should pay more attention to the state of the vicinity than to the information displayed on the one or more display units.
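A minimal sketch of the suppression decision described above, assuming an illustrative `VicinityInfo` type and a numeric `hazard_level`/`display_priority` comparison (these names and the scoring scheme are assumptions, not taken from the patent): sight-line guidance is skipped whenever the vicinity state demands more attention than the displayed information.

```python
# Hypothetical sketch: suppress gaze-guidance toward an off-gaze display
# whenever the vicinity state (from an external device) outranks the
# displayed information in urgency.
from dataclasses import dataclass


@dataclass
class VicinityInfo:
    hazard_level: int  # 0 = clear; higher values mean a more urgent vicinity state


def should_suppress_guidance(vicinity: VicinityInfo, display_priority: int) -> bool:
    """True when the vicinity deserves more attention than the display."""
    return vicinity.hazard_level > display_priority


def run_guidance(vicinity: VicinityInfo, display_priority: int) -> str:
    if should_suppress_guidance(vicinity, display_priority):
        return "guidance suppressed"
    return "guidance executed"
```

The single comparison stands in for whatever richer attention model a production system would use; the control flow (determine, then suppress) mirrors the claimed method.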

EMERGENCY VEHICLE DETECTION AND RESPONSE

Techniques for detecting and responding to an emergency vehicle are discussed. A vehicle computing system may determine that an emergency vehicle is present based on sensor data, such as audio and visual data. In some examples, the vehicle computing system may determine aggregate actions of objects (e.g., other vehicles yielding) proximate the vehicle based on the sensor data. In such examples, a determination that the emergency vehicle is operating may be based on the actions of the objects. The vehicle computing system may, in turn, identify a location to move out of a path of the emergency vehicle (e.g., yield) and may control the vehicle to the location. The vehicle computing system may determine that the emergency vehicle is no longer relevant to the vehicle and may control the vehicle along a route to a destination. Determining to yield and/or returning to a mission may be confirmed by a remote operator.
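The aggregate-behaviour idea can be sketched as follows, under loudly-labeled assumptions: a direct audio/visual detection score and the fraction of nearby vehicles observed yielding each independently trigger the determination, with both thresholds invented for illustration.

```python
# Illustrative sketch (not the patented system): the collective behaviour of
# nearby objects (how many are pulling over) raises confidence that an
# emergency vehicle is operating, complementing direct audio/visual detection.
def emergency_vehicle_operating(direct_detection_score: float,
                                nearby_yielding: int,
                                nearby_total: int,
                                score_threshold: float = 0.8,
                                yield_threshold: float = 0.5) -> bool:
    # Direct detection from audio/visual sensor data is sufficient on its own.
    if direct_detection_score >= score_threshold:
        return True
    # Otherwise, infer from aggregate actions of proximate objects.
    if nearby_total and nearby_yielding / nearby_total >= yield_threshold:
        return True
    return False
```

A real system would fuse these signals probabilistically rather than with hard thresholds; the two-path structure is what the abstract describes.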

LOW-ENERGY IMPACT COLLISION DETECTION

This disclosure relates to systems and techniques for identifying collisions, such as relatively low energy impact collisions involving an autonomous vehicle. Sensor data from a first sensor modality in a first array may be used to determine a first estimated location of impact, and second sensor data from a second sensor modality in a second array may be used to determine a second estimated location of impact. A low energy impact event may be confirmed when the first estimated location of impact corresponds to the second estimated location of impact.
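The cross-modality agreement check lends itself to a short sketch. The tolerance value and the choice of 2-D point estimates are assumptions for illustration; the abstract only requires that the two independently estimated locations "correspond".

```python
# Illustrative sketch (not the patented implementation): a low-energy impact
# is confirmed only when the impact locations estimated independently by two
# sensor modalities agree to within a tolerance.
import math


def estimated_locations_agree(loc_a, loc_b, tolerance_m=0.5):
    """True when two (x, y) impact estimates fall within tolerance_m metres."""
    return math.dist(loc_a, loc_b) <= tolerance_m


def confirm_low_energy_impact(loc_from_modality_1, loc_from_modality_2,
                              tolerance_m=0.5):
    # Requiring agreement between independent modalities filters out
    # single-sensor noise that might otherwise mimic a gentle impact.
    return estimated_locations_agree(loc_from_modality_1, loc_from_modality_2,
                                     tolerance_m)
```

The design point is redundancy: either modality alone could false-alarm on a pothole or door slam; spatial agreement between two arrays is much harder to produce spuriously.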

Process and system for sensor sharing for an autonomous lane change

A process for sensor sharing for an autonomous lane change is provided. The process includes: monitoring, within a dynamic controller of a host vehicle, sensors of the host vehicle; establishing communication between the host vehicle and a confederate vehicle on the same roadway as the host vehicle; monitoring sensors of the confederate vehicle; utilizing, within the dynamic controller of the host vehicle, data from the sensors of the host vehicle and data from the sensors of the confederate vehicle to initiate a lane change maneuver for the host vehicle; and executing the lane change maneuver for the host vehicle.
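A minimal sketch of the fusion-and-decide step, assuming detections are represented as `(lane, position)` tuples (a representation invented here, not specified by the patent): the host's controller merges its own detections with those received from the confederate vehicle and initiates the maneuver only when the merged view shows the target lane clear.

```python
# Hypothetical sketch: merge host and confederate detections, then gate the
# lane change on the combined view of the target lane.
def merge_detections(host_dets, confederate_dets):
    """Union of detected objects, each a (lane, position) tuple."""
    return set(host_dets) | set(confederate_dets)


def can_initiate_lane_change(host_dets, confederate_dets, target_lane):
    merged = merge_detections(host_dets, confederate_dets)
    # Initiate only if no detected object occupies the target lane.
    return all(lane != target_lane for lane, _pos in merged)
```

The value of sharing is visible here: an object seen only by the confederate vehicle (e.g., in the host's blind spot) still blocks the maneuver.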

Redundant environment perception tracking for automated driving systems

One example embodiment provides a surveillance system including a plurality of sensors, a memory, and an electronic processor. The electronic processor is configured to: receive, from the plurality of sensors, environmental information of a common field of view; generate, based on the environmental information, a plurality of hypotheses regarding an object within the common field of view, the plurality of hypotheses including at least one set of hypotheses excluding the environmental information from at least one sensor of a first sensor type; determine, based on a subset of the plurality of hypotheses, an object state of the object, wherein the subset includes the at least one set of hypotheses excluding the environmental information from the at least one sensor of the first sensor type; and track the object based on the determined object state.
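The redundancy idea can be sketched briefly: hypotheses about an object are formed from measurements that exclude a given sensor type, so a fault in that sensor type cannot corrupt the resulting object state. The dictionary representation and the simple averaging are illustrative assumptions, not the patent's estimator.

```python
# Illustrative sketch: derive the tracked object state only from the
# hypothesis subset that excludes one sensor type.
from statistics import mean


def hypotheses_excluding(measurements, excluded_type):
    """Keep only measurements not produced by the excluded sensor type."""
    return [m for m in measurements if m["type"] != excluded_type]


def object_state(measurements, excluded_type):
    # A real tracker would run a filter per hypothesis set; averaging the
    # surviving positions stands in for that here.
    subset = hypotheses_excluding(measurements, excluded_type)
    return mean(m["position"] for m in subset)
```

In the example below, a faulty camera reading far from the radar/lidar consensus simply never enters the state estimate, which is the point of maintaining the excluding subset.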

Parking assist system
11613251 · 2023-03-28

A parking assist system includes: an imaging device configured to capture an image of a surrounding of a vehicle; a display device configured to display a surrounding image of the vehicle based on the image captured by the imaging device; and a control device configured to control a display of the display device based on the surrounding image and to calculate a trajectory of the vehicle from a current position to a target position. In a case where the trajectory includes a switching position for steering the vehicle and changing a moving direction thereof and the vehicle is moving toward the switching position, the control device causes the display device to superimpose the switching position on the surrounding image and to hide at least a first part of the trajectory, the first part connecting the switching position and the target position.

Method and driver assistance system for improving ride comfort of a transportation vehicle and transportation vehicle

A method for improving the ride comfort of a transportation vehicle includes: planning a first driving route by a navigation system; automatically detecting at least one road parameter of the first driving route by a sensor system of the transportation vehicle; automatically evaluating the first driving route in view of its ride comfort by taking into account the road parameter; and, in response thereto, using the first driving route or planning an alternative driving route.
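The evaluate-then-decide step can be sketched under stated assumptions: the sensed road parameter is modeled as a per-segment roughness score, aggregated by a simple mean, and compared against a comfort threshold. All of these specifics are invented for illustration.

```python
# Hedged sketch: aggregate an assumed per-segment roughness score over the
# planned route, then either keep the route or plan an alternative.
def route_discomfort(segment_roughness):
    """Mean roughness over the route's segments (assumed comfort metric)."""
    return sum(segment_roughness) / len(segment_roughness)


def choose_route(first_route_roughness, threshold=0.4):
    if route_discomfort(first_route_roughness) <= threshold:
        return "use first route"
    return "plan alternative route"
```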

Driver re-engagement system

In a network of autonomous or semi-autonomous vehicles, an alert may be triggered when one of the vehicles switches from autonomous to manual mode. The alert may be communicated to nearby autonomous vehicles so that drivers of those vehicles may become aware of a potentially unpredictable manual driver nearby. Drivers of autonomous vehicles who may have become disengaged (e.g., sleeping, reading, talking, etc.) during autonomous driving may become re-engaged upon noticing the alert. A re-engaged driver may choose to switch his/her own vehicle from autonomous to manual mode in order to appropriately react to an unpredictable nearby manual driver. In additional or alternative embodiments, the alert may be triggered or intensified when indications of impairment of a nearby driver or malfunction of a nearby vehicle are detected.

Systems and methods for providing nature sounds

Systems and methods for generating sound elements in a vehicle are presented. In one example, a method comprises selecting a sound element, the sound element corresponding to a natural environment, and broadcasting the sound element via one or more speakers of a vehicle. In this way, a sound environment may be provided to a vehicle user based on at least one vehicle state.

AUTOMATIC CROSS-SENSOR CALIBRATION USING OBJECT DETECTIONS

Certain aspects of the present disclosure provide techniques for sensor calibration. First sensor data is received from a first sensor and second sensor data is received from a second sensor, where the first sensor data and the second sensor data each indicate detected objects in a space. The first sensor data is transformed using a first transformation profile to convert the first sensor data to a coordinate frame of the second sensor data. The first transformation profile is refined based on a difference between the transformed first sensor data and the second sensor data.
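The transform-and-refine loop can be illustrated with a deliberately simplified "transformation profile": a 2-D translation mapping detections from the first sensor into the second sensor's frame, nudged toward agreement by the residual between transformed and observed detections. A real system would estimate a full rigid (rotation + translation) transform; the translation-only profile is an assumption that keeps the sketch short.

```python
# Simplified sketch of cross-sensor calibration refinement: apply the current
# transformation profile, measure the residual against the second sensor's
# detections, and update the profile by a fraction of the mean residual.
def transform(points, profile):
    """Apply a translation-only transformation profile to (x, y) detections."""
    dx, dy = profile
    return [(x + dx, y + dy) for x, y in points]


def refine_profile(profile, points_a, points_b, lr=0.5):
    """One update step from matched detections seen by both sensors."""
    moved = transform(points_a, profile)
    # Mean residual between transformed first-sensor detections and the
    # corresponding second-sensor detections, per axis.
    ex = sum(bx - mx for (mx, _), (bx, _) in zip(moved, points_b)) / len(points_b)
    ey = sum(by - my for (_, my), (_, by) in zip(moved, points_b)) / len(points_b)
    return (profile[0] + lr * ex, profile[1] + lr * ey)
```

Iterating the update drives the profile toward the true offset between the two coordinate frames; because the correction comes from object detections both sensors already produce, no dedicated calibration target is needed, which is the appeal of the approach described above.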