B60W60/00272

TIME GAPS FOR AUTONOMOUS VEHICLES
20230047336 · 2023-02-16

Aspects of the disclosure provide for a method of controlling an autonomous vehicle in an autonomous driving mode. For instance, a predicted future trajectory for an object detected in a driving environment of the autonomous vehicle may be received. A routing intent for a planned trajectory for the autonomous vehicle may be received. Whether the predicted future trajectory and the routing intent intersect with one another may be determined. When the predicted future trajectory and the routing intent are determined to intersect with one another, a time gap may be applied to a predicted future state of the object defined in the predicted future trajectory. A planned trajectory may be determined for the autonomous vehicle based on the applied time gap. The autonomous vehicle may be controlled in the autonomous driving mode based on the planned trajectory.
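The time-gap idea above can be sketched in a few lines: detect where the object's predicted path crosses the ego routing intent, then widen the time interval during which the conflict region is treated as occupied. This is a minimal illustration with assumed names and geometry, not the patented implementation.

```python
# Hypothetical sketch of the time-gap concept; the function names,
# data layout, and simple segment geometry are assumptions.

def segments_intersect(p1, p2, p3, p4):
    """Return True if segments p1-p2 and p3-p4 strictly cross (orientation test)."""
    def cross(o, a, b):
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])
    d1 = cross(p3, p4, p1)
    d2 = cross(p3, p4, p2)
    d3 = cross(p1, p2, p3)
    d4 = cross(p1, p2, p4)
    return ((d1 > 0) != (d2 > 0)) and ((d3 > 0) != (d4 > 0))

def apply_time_gap(object_states, ego_route, gap_s=2.0):
    """object_states: list of (t, x, y) predicted future states of the object.
    ego_route: list of (x, y) routing-intent waypoints.
    Where the object's path crosses the ego route, return the occupancy
    windows padded by the time gap on both sides."""
    occupied = []
    for (t0, x0, y0), (t1, x1, y1) in zip(object_states, object_states[1:]):
        for a, b in zip(ego_route, ego_route[1:]):
            if segments_intersect((x0, y0), (x1, y1), a, b):
                occupied.append((t0 - gap_s, t1 + gap_s))
    return occupied
```

A downstream planner would then choose ego speeds so that the vehicle reaches the conflict region outside the padded windows.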

Systems and methods for hybrid prediction framework with inductive bias

Systems and methods are provided for implementing hybrid prediction. Hybrid prediction integrates two deep learning based trajectory prediction approaches: grid-based approaches and graph-based approaches. Hybrid prediction techniques can achieve enhanced performance by combining the grid and graph approaches in a manner that incorporates appropriate inductive biases for different elements of a high-dimensional space. A hybrid prediction framework processor can generate trajectory predictions relating to movement of agents in a surrounding environment based on a prediction model generated using hybrid prediction. Trajectory predictions output from the hybrid prediction framework processor can be used to control an autonomous vehicle. For example, the autonomous vehicle can perform safety-aware and autonomous operations to avoid oncoming objects, based on the trajectory predictions.
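One simple way to picture grid/graph fusion is to let a graph-based head propose candidate trajectories and a grid-based head produce an endpoint heat-map, then re-score each candidate by the heat-map value at its endpoint. This fusion rule is an illustrative assumption, not the disclosed architecture.

```python
import numpy as np

# Illustrative fusion sketch; the function name, inputs, and the
# endpoint re-scoring rule are assumptions for exposition only.

def fuse_predictions(heatmap, cell_size, candidates, scores):
    """heatmap: HxW array of endpoint probabilities over a BEV grid.
    candidates: list of trajectories, each a list of (x, y) points.
    scores: graph-head confidence per candidate."""
    fused = []
    for traj, s in zip(candidates, scores):
        x, y = traj[-1]
        i = min(max(int(y // cell_size), 0), heatmap.shape[0] - 1)
        j = min(max(int(x // cell_size), 0), heatmap.shape[1] - 1)
        fused.append(s * heatmap[i, j])
    total = sum(fused)
    return [f / total for f in fused] if total > 0 else fused
```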

System and method for future forecasting using action priors

A system and method for future forecasting using action priors that include receiving image data associated with a surrounding environment of an ego vehicle and dynamic data associated with dynamic operation of the ego vehicle. The system and method also include analyzing the image data and detecting actions associated with agents located within the surrounding environment of the ego vehicle and analyzing the dynamic data and processing an ego motion history of the ego vehicle. The system and method further include predicting future trajectories of the agents located within the surrounding environment of the ego vehicle and a future ego motion of the ego vehicle within the surrounding environment of the ego vehicle.

Detecting out-of-model scenarios for an autonomous vehicle

Detecting out-of-model scenarios for an autonomous vehicle including: determining, based on first sensor data from one or more sensors, an environmental state relative to the autonomous vehicle, wherein operational commands for the autonomous vehicle are based on a selected machine learning model, wherein the selected machine learning model comprises a first machine learning model; comparing the environmental state to a predicted environmental state relative to the autonomous vehicle; and determining, based on a differential between the environmental state and the predicted environmental state, whether to select a second machine learning model as the selected machine learning model.
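The selection logic above reduces to comparing the observed environmental state against the state the current model predicted, and switching models when the differential is too large. The distance metric and threshold below are illustrative assumptions, not the claimed method.

```python
import numpy as np

# Hedged sketch of the out-of-model check; the Euclidean differential
# and fixed threshold are assumptions for illustration.

def select_model(observed, predicted, current, fallback, threshold=1.0):
    """observed/predicted: feature vectors describing the environmental
    state relative to the vehicle. Returns the model to use and the
    measured differential; the fallback model is selected when the
    differential exceeds the threshold."""
    differential = float(np.linalg.norm(np.asarray(observed, dtype=float)
                                        - np.asarray(predicted, dtype=float)))
    chosen = fallback if differential > threshold else current
    return chosen, differential
```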

Techniques for maintaining vehicle formations
11708094 · 2023-07-25

A method of maintaining vehicle formation includes receiving a desired formation distance between a lead vehicle and a follower vehicle; receiving a pre-planned path for the follower vehicle; and defining a dynamic zone around a current position of the lead vehicle. The dynamic zone has a boundary characterized by a first radius from the current position of the lead vehicle. The first radius can be substantially equal to the desired formation distance. The method further includes determining a next speed of the follower vehicle based on a current position of the follower vehicle with respect to the boundary of the dynamic zone; determining a commanded curvature of the follower vehicle based on the current position of the follower vehicle with respect to the pre-planned path; and outputting the next speed and the commanded curvature to a control system of the follower vehicle for navigation of the follower vehicle.
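A minimal sketch of the two outputs: a speed command from the follower's distance to the dynamic-zone boundary, and a pure-pursuit curvature toward a point on the pre-planned path. The controller gain and the pure-pursuit choice are assumptions, not the patented technique.

```python
import math

# Hedged sketch; gain k and the pure-pursuit curvature law are
# illustrative assumptions.

def next_speed(lead_pos, follower_pos, formation_dist, lead_speed, k=0.5):
    """Speed up when outside the dynamic-zone boundary (radius ~ the
    desired formation distance), slow down when inside it."""
    dist = math.hypot(lead_pos[0] - follower_pos[0],
                      lead_pos[1] - follower_pos[1])
    return max(0.0, lead_speed + k * (dist - formation_dist))

def commanded_curvature(follower_pos, follower_heading, target):
    """Pure-pursuit curvature steering the follower toward a point
    on the pre-planned path."""
    dx = target[0] - follower_pos[0]
    dy = target[1] - follower_pos[1]
    # Lateral offset of the target expressed in the follower's frame.
    lateral = -math.sin(follower_heading) * dx + math.cos(follower_heading) * dy
    d2 = dx * dx + dy * dy
    return 2.0 * lateral / d2 if d2 > 0 else 0.0
```

Both outputs would then be handed to the follower's control system for navigation.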

TRACKING VANISHED OBJECTS FOR AUTONOMOUS VEHICLES
20230227074 · 2023-07-20

Aspects of the disclosure relate to methods for controlling a vehicle having an autonomous driving mode. For instance, sensor data may be received from one or more sensors of the perception system of the vehicle, the sensor data identifying characteristics of an object perceived by the perception system. When it is determined that the object is no longer being perceived by the one or more sensors of the perception system, predicted characteristics for the object may be generated based on one or more of the identified characteristics. The predicted characteristics of the object may be used to control the vehicle in the autonomous driving mode such that the vehicle is able to respond to the object when it is determined that the object is no longer being perceived by the one or more sensors of the perception system.
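One simple instance of "predicted characteristics" for a vanished object is constant-velocity extrapolation of its last perceived state; the abstract covers predicted characteristics generally, so the model below is an illustrative assumption.

```python
# Hedged sketch: propagate the last perceived state forward so the
# vehicle can keep responding to an object its sensors no longer see.
# The constant-velocity model and dict layout are assumptions.

def predict_vanished(last_state, dt):
    """last_state: dict with position (x, y) and velocity (vx, vy)
    from the final perception update; dt: seconds since the object
    was last perceived."""
    return {
        "x": last_state["x"] + last_state["vx"] * dt,
        "y": last_state["y"] + last_state["vy"] * dt,
        "vx": last_state["vx"],
        "vy": last_state["vy"],
    }
```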

3D Occlusion Reasoning for Accident Avoidance

An occlusion is identified in a vehicle transportation network. A visibility grid is identified on a second side of the occlusion for a vehicle that is on a first side of the occlusion. The visibility grid is identified with respect to a region of interest that is at least a predefined distance above ground. The visibility grid is used to identify first portions of roads sensed by a sensor positioned on the vehicle and second portions of the roads that are not sensed by the sensor. A driving behavior of the vehicle is altered based on the visibility grid.
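In two dimensions, the sensed/unsensed split can be sketched as a line-of-sight test: a road cell is unsensed when the ray from the sensor to the cell crosses the occluding segment. This flat sketch omits the above-ground region-of-interest reasoning; all names are assumptions.

```python
# Illustrative 2D visibility-grid sketch; the patent's 3D reasoning
# about a region of interest above ground is not modeled here.

def _cross(o, a, b):
    return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

def blocked(sensor, cell, occ_a, occ_b):
    """True if the segment sensor->cell strictly crosses occluder a-b."""
    d1 = _cross(occ_a, occ_b, sensor)
    d2 = _cross(occ_a, occ_b, cell)
    d3 = _cross(sensor, cell, occ_a)
    d4 = _cross(sensor, cell, occ_b)
    return ((d1 > 0) != (d2 > 0)) and ((d3 > 0) != (d4 > 0))

def visibility_grid(sensor, cells, occ_a, occ_b):
    """Split road cells into portions sensed by the sensor and
    portions hidden behind the occlusion."""
    sensed, unsensed = [], []
    for c in cells:
        (unsensed if blocked(sensor, c, occ_a, occ_b) else sensed).append(c)
    return sensed, unsensed
```

Driving behavior would then be altered conservatively with respect to the unsensed portions.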

SYSTEMS AND METHODS FOR ESTIMATING CUBOIDS FROM LIDAR, MAP AND IMAGE DATA
20230219602 · 2023-07-13

Systems and methods for operating a robotic system. The methods comprise: inferring, by a computing device, a first heading distribution for an object from a 3D point cloud; obtaining, by the computing device, a second heading distribution from a vector map; obtaining, by the computing device, a posterior distribution of a heading using the first and second heading distributions; defining, by the computing device, a cuboid on a 3D graph using the posterior distribution; and using the cuboid to facilitate driving-related operations of a robotic system.
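If the two heading distributions are discretized over the same heading bins, a standard Bayesian combination gives the posterior as the normalized elementwise product; that combination rule and the discretization are assumptions for illustration.

```python
import math

# Hedged sketch: fuse a LiDAR-inferred heading distribution with a
# vector-map heading distribution over shared heading bins.

def heading_posterior(lidar_dist, map_dist):
    """Posterior over heading bins, proportional to the product of the
    point-cloud distribution and the vector-map distribution."""
    joint = [p * q for p, q in zip(lidar_dist, map_dist)]
    total = sum(joint)
    return [j / total for j in joint]

def map_heading(bins, posterior):
    """Most likely heading (radians) for orienting the cuboid."""
    return bins[max(range(len(posterior)), key=posterior.__getitem__)]
```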

System and method for implementing reward based strategies for promoting exploration
11699062 · 2023-07-11

A system and method for implementing reward-based strategies for promoting exploration that include receiving data associated with an agent environment of an ego agent and a target agent and receiving data associated with a dynamic operation of the ego agent and the target agent within the agent environment. The system and method also include implementing a reward function that is associated with exploration of at least one agent state within the agent environment. The system and method further include training a neural network with a novel unexplored agent state.
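A common way to realize a reward function that favors unexplored agent states is a count-based novelty bonus that decays as a state is revisited; the 1/sqrt(count) form below is an assumption, not the patented reward function.

```python
import math
from collections import Counter

# Hedged sketch of an exploration-promoting reward: novel, unexplored
# agent states earn the largest bonus, which decays with revisits.

class ExplorationReward:
    def __init__(self, bonus_scale=1.0):
        self.visits = Counter()
        self.bonus_scale = bonus_scale

    def __call__(self, state, extrinsic_reward=0.0):
        """state: a hashable (discretized) agent state."""
        self.visits[state] += 1
        bonus = self.bonus_scale / math.sqrt(self.visits[state])
        return extrinsic_reward + bonus
```

The shaped reward would then be used when training the policy network on novel agent states.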

Control of autonomous vehicle based on environmental object classification determined using phase coherent LIDAR data

Determining classification(s) for object(s) in an environment of autonomous vehicle, and controlling the vehicle based on the determined classification(s). For example, autonomous steering, acceleration, and/or deceleration of the vehicle can be controlled based on determined pose(s) and/or classification(s) for objects in the environment. The control can be based on the pose(s) and/or classification(s) directly, and/or based on movement parameter(s), for the object(s), determined based on the pose(s) and/or classification(s). In many implementations, pose(s) and/or classification(s) of environmental object(s) are determined based on data from a phase coherent Light Detection and Ranging (LIDAR) component of the vehicle, such as a phase coherent LIDAR monopulse component and/or a frequency-modulated continuous wave (FMCW) LIDAR component.