
TRAJECTORY GENERATION AND OPTIMIZATION USING CLOSED-FORM NUMERICAL INTEGRATION IN ROUTE-RELATIVE COORDINATES
20210109539 · 2021-04-15

Techniques are discussed for generating and optimizing a trajectory using closed-form numerical integration in route-relative coordinates. A decision planner component of an autonomous vehicle, for example, can receive or generate a reference trajectory, which may correspond to an ideal route for an autonomous vehicle to traverse through an environment, such as a center of a road segment. Lateral dynamics (e.g., steering angles, curvature values of trajectory segments) and longitudinal dynamics (e.g., velocity and acceleration) can be represented in a single algorithm such that optimizing the reference trajectory (e.g., based on loss functions or costs) can substantially simultaneously optimize the lateral dynamics and longitudinal dynamics in a single convergence operation. In some cases, the trajectory can be used to control the autonomous vehicle to traverse an environment.
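A minimal sketch of what such a closed-form rollout can look like: each trajectory segment carries a constant curvature (lateral dynamics) and a constant acceleration (longitudinal dynamics), and the pose and speed at the segment end follow from exact arc-geometry and kinematics expressions rather than step-by-step numerical integration. The function name and segment parameterization are illustrative assumptions, not the patent's notation.

```python
import math

def integrate_segment(x, y, theta, v, kappa, accel, length):
    """Closed-form rollout of one trajectory segment with constant
    curvature kappa (lateral dynamics) and constant acceleration
    accel (longitudinal dynamics) over arc length `length`.
    Illustrative sketch only, not the patented algorithm."""
    if abs(kappa) < 1e-9:
        # Straight segment: heading is unchanged.
        x += length * math.cos(theta)
        y += length * math.sin(theta)
    else:
        # Circular arc: exact position update, no numerical stepping.
        dtheta = kappa * length
        x += (math.sin(theta + dtheta) - math.sin(theta)) / kappa
        y += (math.cos(theta) - math.cos(theta + dtheta)) / kappa
        theta += dtheta
    # v'^2 = v^2 + 2*a*s, exact for constant acceleration over arc length.
    v = math.sqrt(max(v * v + 2.0 * accel * length, 0.0))
    return x, y, theta, v
```

Because both the lateral update (curvature) and the longitudinal update (acceleration) live in the same state, a cost over the rolled-out states can be minimized over both sets of variables in one optimization pass, which is the gist of the single-convergence claim.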

SYSTEM AND METHOD FOR PROVIDING ACCURATE TRAJECTORY FOLLOWING FOR AUTOMATED VEHICLES IN DYNAMIC ENVIRONMENTS
20210094569 · 2021-04-01

A system and method provide accurate trajectory following for automated vehicles in dynamic environments. The system and method include receiving image data and LiDAR data associated with a dynamic environment of a vehicle. The system and method also include processing a planned trajectory of the vehicle that is based on an analysis of the image data and the LiDAR data. The system and method further include communicating control signals to autonomously control the vehicle to follow the planned trajectory and navigate within the dynamic environment to reach a goal.

CONTROL ARCHITECTURES FOR AUTONOMOUS VEHICLES

The subject matter described in this specification is generally directed to control architectures for an autonomous vehicle. In one example, a reference trajectory, a set of lateral constraints, and a set of speed constraints are received using a control circuit. The control circuit determines a set of steering commands based at least in part on the reference trajectory and the set of lateral constraints, and a set of speed commands based at least in part on the set of speed constraints. The vehicle is navigated, using the control circuit, according to the set of steering commands and the set of speed commands.

SYSTEMS AND METHODS FOR NAVIGATING A VEHICLE

Systems and methods are provided for vehicle navigation. In one implementation, a system may comprise an interface to obtain sensing data of an environment of a host vehicle and a processing device. The processing device may be configured to determine a planned navigational action; identify a target vehicle in the environment of the host vehicle; predict a distance between the host vehicle and the target vehicle if the planned navigational action were taken; determine a current host vehicle stopping distance based on a braking capability, acceleration capability, and speed of the host vehicle; determine a current target vehicle braking distance based on a speed and braking capability of the target vehicle; and implement the planned navigational action when the predicted distance is greater than a minimum safe longitudinal distance calculated based on the current host vehicle stopping distance and the current target vehicle braking distance.
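A hedged sketch of the distance comparison described above, in the spirit of responsibility-sensitive safety models: the host's stopping distance assumes it accelerates for a reaction time before braking at its minimum guaranteed rate, while the target brakes at its maximum rate. The function name, parameters, and the specific formula are assumptions for illustration, not the claimed method.

```python
def min_safe_longitudinal_distance(v_host, v_target, a_accel_max,
                                   b_host_min, b_target_max, rho):
    """Minimum safe following distance: worst case, the host
    accelerates at a_accel_max for reaction time rho, then brakes at
    b_host_min, while the target brakes at b_target_max.
    Illustrative formula, not the patented calculation."""
    # Host stopping distance (reaction phase + braking phase).
    v_host_after = v_host + rho * a_accel_max
    host_stop = (v_host * rho + 0.5 * a_accel_max * rho ** 2
                 + v_host_after ** 2 / (2.0 * b_host_min))
    # Target braking distance from its current speed.
    target_brake = v_target ** 2 / (2.0 * b_target_max)
    return max(host_stop - target_brake, 0.0)
```

A planned navigational action would then be implemented only when the predicted post-action distance exceeds this value.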

Impedance-based motion control for autonomous vehicles
10928832 · 2021-02-23

Methods and systems for controlling motion of a vehicle. Sensor data is obtained, representing an environment of the vehicle. A reference control trajectory is corrected, using one or more virtual forces, to provide a desired trajectory. The one or more virtual forces are generated by applying an impedance scheme to the sensor data. The desired trajectory is output for controlling the vehicle.
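A toy version of an impedance scheme, assuming a spring-damper form: a virtual spring pushes away from an obstacle once it is inside an influence radius, a virtual damper resists the closing speed, and the resulting force nudges the reference trajectory laterally. The force law, the admittance mapping, and all names are assumptions for illustration.

```python
def virtual_force(obstacle_dist, closing_speed, d0, k, c):
    """Impedance-style repulsive force from a sensed obstacle: a
    virtual spring (stiffness k) engages inside influence radius d0,
    and a virtual damper (coefficient c) resists the closing speed.
    Illustrative sketch, not the patented scheme."""
    if obstacle_dist >= d0:
        return 0.0
    return k * (d0 - obstacle_dist) + c * max(closing_speed, 0.0)

def correct_lateral_offset(reference_offset, force, compliance):
    """Shift the reference trajectory laterally in proportion to the
    net virtual force (a simple admittance mapping)."""
    return reference_offset + compliance * force
```

Integrated over the horizon, these corrections deform the reference control trajectory into the desired trajectory that is output for vehicle control.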

Systems and methods for navigating a vehicle

An autonomous system includes a processing device programmed to: receive, from an image capture device, an image of an environment of a host vehicle; detect an obstacle in the environment based on an analysis of the image; monitor a driver input to at least one of a throttle control, a brake control, or a steering control associated with the host vehicle; determine whether the driver input would result in the host vehicle navigating within a proximity buffer relative to the obstacle; allow the driver input to cause a corresponding change in one or more host vehicle motion control systems if the processing device determines that the driver input would not result in the host vehicle navigating within the proximity buffer; and prevent the driver input from causing the change if the driver input would result in the host vehicle navigating within the proximity buffer relative to the obstacle.

DRIVING SUPPORT DEVICE

A driving support device performs steering control and deceleration control to avoid an object detected in front of a host vehicle. The driving support device calculates a target lateral distance, which is the target of the steering control and is the lateral distance between the host vehicle and the object when the host vehicle passes by the object. When the target lateral distance is less than a threshold value, the device increases the target deceleration, which is the target of the deceleration control when the host vehicle passes by the object, as a lateral distance restraint value, obtained by subtracting the target lateral distance from the threshold value, increases.
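The deceleration rule reads more clearly as a formula. The sketch below assumes a simple linear gain on the restraint value (threshold minus target lateral distance); the gain, base deceleration, and function name are illustrative assumptions, not values from the patent.

```python
def target_deceleration(base_decel, target_lateral_dist, threshold, gain):
    """Target deceleration when passing the object: grows with the
    lateral-distance restraint value (threshold - target lateral
    distance) whenever the target lateral distance is below the
    threshold. Linear gain is an illustrative assumption."""
    restraint = threshold - target_lateral_dist
    if restraint <= 0.0:
        # Enough lateral clearance: keep the base deceleration.
        return base_decel
    return base_decel + gain * restraint
```

The tighter the achievable lateral clearance, the harder the vehicle brakes as it passes the object.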

APPARATUS FOR CONTROLLING BEHAVIOR OF AUTONOMOUS VEHICLE AND METHOD THEREOF

An apparatus for controlling a behavior of an autonomous vehicle includes: a joystick that receives an adjustment value corresponding to an amount of manipulation by a user, and a controller that controls the behavior of the autonomous vehicle based on the adjustment value received from the joystick.

Multi-sensor data fusion for automotive systems
10960838 · 2021-03-30

A sensor fusion system associated with a vehicle includes a sensor interface communicatively coupled to a plurality of sensors in the vehicle and a vehicle experience system. The sensor interface comprises an input that receives data from each of the plurality of sensors and an output configured to output fused vehicle data based on the data received from the plurality of sensors. The vehicle experience system is coupled to the output of the sensor interface to receive the fused vehicle data. The vehicle experience system includes one or more processors and a non-transitory computer readable storage medium storing instructions that, when executed by the one or more processors, cause the one or more processors to control at least one parameter of the vehicle based on the fused vehicle data.
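As one hedged illustration of producing fused vehicle data, redundant scalar readings (e.g., speed estimated from both wheel odometry and GPS) can be combined by inverse-variance weighting, so noisier sensors count for less. This stand-in scheme is an assumption for illustration, not the patented fusion method.

```python
def fuse(readings):
    """Inverse-variance weighted fusion of redundant scalar sensor
    readings. `readings` is a list of (value, variance) pairs, one
    per sensor. Illustrative stand-in, not the patented method."""
    weights = [1.0 / var for _, var in readings]
    total = sum(weights)
    # Weighted mean: noisier (higher-variance) readings count less.
    return sum(w * value for w, (value, _) in zip(weights, readings)) / total
```

The fused value would then feed downstream consumers such as the vehicle experience system's control logic.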

Contingency Planning and Safety Assurance

A method for contingency planning for an autonomous vehicle (AV) includes determining a nominal trajectory for the AV; detecting a hazard object that does not intrude into a path of the AV at the time it is detected; determining a hazard zone for the hazard object; determining a time of arrival of the AV at the hazard zone; determining a contingency trajectory for the AV; controlling the AV according to the contingency trajectory; and, in response to the hazard object intruding into the path of the AV, controlling the AV to perform a maneuver to avoid the hazard object. The contingency trajectory includes at least one of a lateral contingency or a longitudinal contingency and is determined using the time of arrival of the AV at the hazard zone.
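The control flow in the abstract can be sketched as follows, assuming a longitudinal contingency that caps speed so the AV could still stop at the hazard zone under comfortable braking; the switch to an avoidance maneuver fires only if the hazard actually intrudes. The function name and the speed-cap policy are illustrative assumptions, not the claimed method.

```python
def plan(time_to_hazard_zone, hazard_intrudes, v_nominal, b_comfort):
    """Pick between the contingency trajectory and an avoidance
    maneuver. The cap b_comfort * time_to_hazard_zone is the speed
    from which the AV can stop at the hazard zone by braking at
    b_comfort (illustrative policy, not the patented logic)."""
    if hazard_intrudes:
        # Hazard entered the path: execute the avoidance maneuver.
        return "evasive_maneuver"
    # Longitudinal contingency: pre-slow based on the AV's
    # time of arrival at the hazard zone.
    v_target = min(v_nominal, b_comfort * time_to_hazard_zone)
    return ("contingency", v_target)
```

Because the AV has already pre-slowed under the contingency trajectory, the avoidance maneuver starts from a more favorable state if the hazard does intrude.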