Patent classifications
B60W60/0027
Driving Assistance System and Driving Assistance Method for the Automated Driving of a Vehicle
A driving assistance system for automated driving of a vehicle includes at least one processor unit designed to perform the following while the driving assistance system carries out a maneuver of turning onto a street: determining, on the basis of environment data from an environment sensor system of the vehicle, whether another, oncoming road user is blocking the lane of the street that corresponds to the vehicle; and, when it is determined that the other road user is blocking that lane, carrying out a situation evaluation to determine whether the vehicle should back up in order to allow the other road user to exit the street, or whether the other road user should be requested to clear the lane by backing up.
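The situation evaluation described in this abstract could be sketched as a simple decision function. All inputs, names, and criteria below are hypothetical illustrations; the patent does not specify concrete evaluation rules.

```python
def resolve_blocked_turn(ego_reverse_dist, other_reverse_dist,
                         ego_has_traffic_behind, other_has_traffic_behind):
    """Decide who should back up when an oncoming road user blocks
    the ego vehicle's target lane during a turn-in maneuver.

    Inputs are illustrative: reversing distances in meters and flags
    for whether traffic is queued behind each vehicle.
    """
    # A vehicle with traffic queued behind it cannot reverse safely.
    if ego_has_traffic_behind and not other_has_traffic_behind:
        return "request_other_to_clear"
    if other_has_traffic_behind and not ego_has_traffic_behind:
        return "ego_backs_up"
    # Otherwise prefer whichever party has the shorter reversing distance.
    return ("ego_backs_up" if ego_reverse_dist <= other_reverse_dist
            else "request_other_to_clear")
```

A real system would fold many more factors (road geometry, visibility, vehicle types) into this evaluation; the sketch only shows the two-outcome structure the abstract describes.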
AUTONOMOUS DRIVING METHOD AND APPARATUS
The present application discloses autonomous driving methods and apparatuses that may be applied to an intelligent vehicle and an autonomous driving vehicle. An example method includes: obtaining a motion status of an obstacle; determining a location and a speed of a towing point based on the motion status of the obstacle, where both the location and the speed of the towing point are continuous functions of the motion status of the obstacle; and obtaining an expected acceleration of an ego vehicle based on the location and the speed of the towing point.
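One way to read the towing-point idea is as a smoothly varying reference state that the ego vehicle tracks. The mapping and gains below are hypothetical; the abstract only requires that the towing point's location and speed be continuous functions of the obstacle's motion status.

```python
def towing_point(obstacle_pos, obstacle_speed, standoff=8.0, k=0.5):
    """Hypothetical continuous mapping: the towing point sits a
    speed-dependent standoff behind the obstacle and moves at the
    obstacle's speed (1-D longitudinal model, positions in meters)."""
    tp_pos = obstacle_pos - (standoff + k * obstacle_speed)
    tp_speed = obstacle_speed
    return tp_pos, tp_speed

def expected_acceleration(ego_pos, ego_speed, tp_pos, tp_speed,
                          kp=0.3, kd=0.8):
    """Illustrative PD tracking of the towing point; gains are
    made-up values, not from the patent."""
    return kp * (tp_pos - ego_pos) + kd * (tp_speed - ego_speed)
```

Because both outputs of `towing_point` vary continuously with the obstacle state, the resulting expected acceleration avoids the discontinuous jumps a hard follow/not-follow rule would produce.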
RADAR-BASED DATA FILTERING FOR VISUAL AND LIDAR ODOMETRY
Aspects of the disclosed technology provide solutions for performing odometry, in particular by filtering moving objects from a scene using sensor data. In some aspects, a process can include steps for receiving a first set of sensor data corresponding to a plurality of objects in a scene, determining one or more moving objects and one or more stationary objects from among the plurality of objects, and receiving a second set of sensor data. In some aspects, the process can further include steps for filtering the second set of sensor data to remove data associated with the one or more moving objects, and generating odometry data associated with the filtered second set of sensor data. Systems and machine-readable media are also provided.
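The two filtering steps could be sketched as follows, assuming (hypothetically) that the first set of sensor data is radar tracks with measured speeds and the second is lidar points already associated with object IDs; data structures and the speed threshold are illustrative assumptions, not from the patent.

```python
def classify_objects(radar_tracks, speed_thresh=0.5):
    """Split object IDs into moving and stationary sets using
    radar-measured speeds (hypothetical input: {obj_id: speed_mps})."""
    moving = {oid for oid, v in radar_tracks.items() if abs(v) > speed_thresh}
    stationary = set(radar_tracks) - moving
    return moving, stationary

def filter_points(points, moving_ids):
    """Drop returns attributed to moving objects before running
    odometry. Each point is (x, y, z, obj_id); obj_id None means
    the point was not associated with any tracked object."""
    return [p for p in points if p[3] not in moving_ids]
```

Removing moving-object returns before visual or lidar odometry keeps the pose estimate anchored to the static scene, which is the motivation the abstract gives for the filtering step.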
Non-solid object monitoring
An autonomous navigation system may navigate through an environment in which one or more non-solid objects, including gaseous and/or liquid objects, are located. Non-solid objects may be determined, using sensor data, to present an obstacle or interference based on the determined chemical composition, size, position, velocity, or concentration of the objects.
Methods and systems for joint pose and shape estimation of objects from sensor data
Methods and systems for jointly estimating a pose and a shape of an object perceived by an autonomous vehicle are described. The system includes data and program code collectively defining a neural network which has been trained to jointly estimate a pose and a shape of a plurality of objects from incomplete point cloud data. The neural network includes a trained shared encoder neural network, a trained pose decoder neural network, and a trained shape decoder neural network. The method includes receiving an incomplete point cloud representation of an object, inputting the point cloud data into the trained shared encoder, and outputting a code representative of the point cloud data. The method also includes generating an estimated pose and shape of the object based on the code. The pose includes at least a heading or a translation, and the shape includes a denser point cloud representation of the object.
APPARATUS WITH COLLISION WARNING AND VEHICLE INCLUDING THE SAME
An apparatus for collision warning of a vehicle includes an information acquirer configured to acquire information on a surrounding object and information on the vehicle, and a controller configured to generate collision prediction information for the surrounding object based on the information on the surrounding object and the information on the vehicle, and to generate control information to control braking of the vehicle and, based on the collision prediction information, to deploy a buffer element to the outside of the vehicle while the braking of the vehicle is controlled.
SYSTEMS AND METHODS FOR TEMPORAL DECORRELATION OF OBJECT DETECTIONS FOR PROBABILISTIC FILTERING
Systems and methods for tracking an object. The method comprises: receiving, by a processor, a series of observations made over time for the object; selecting, by the processor, a plurality of sets of observations from the series of observations; causing, by the processor, the plurality of sets of observations to be used by at least one filter to generate a track for the object (where the at least one filter uses the sensor data associated with each of a plurality of frames of sensor data only once during generation of the track); and causing, by the processor, operations of an autonomous robot to be controlled based on the track for the object.
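The use-each-frame-only-once constraint could be realized by partitioning the time-ordered observations into disjoint subsets, one per filter pass. The round-robin partitioning rule below is an illustrative choice; the abstract does not fix a selection scheme.

```python
def disjoint_observation_sets(observations, n_sets):
    """Partition a time-ordered series of per-frame observations into
    disjoint subsets (round-robin), so each frame's sensor data is
    consumed by at most one filter pass, keeping the resulting track
    updates temporally decorrelated."""
    sets = [[] for _ in range(n_sets)]
    for i, obs in enumerate(observations):
        sets[i % n_sets].append(obs)
    return sets
```

Because the subsets are disjoint, no frame contributes to the filter twice, which avoids the artificially inflated confidence that correlated (reused) measurements would cause in a probabilistic filter.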
PLANNING-AWARE PREDICTION FOR CONTROL-AWARE AUTONOMOUS DRIVING MODULES
A method of generating an output trajectory of an ego vehicle includes recording trajectory data of the ego vehicle and pedestrian agents from a scene of a training environment of the ego vehicle. The method includes identifying, from among the pedestrian agents within the scene of the training environment, at least one pedestrian agent that causes a prediction discrepancy for the ego vehicle greater than that caused by the other pedestrian agents within the scene. The method includes updating parameters of a motion prediction model of the ego vehicle based on a magnitude of the prediction discrepancy caused by the at least one pedestrian agent to form a trained, control-aware prediction objective model. The method includes selecting a vehicle control action of the ego vehicle in response to a predicted motion from the trained, control-aware prediction objective model regarding detected pedestrian agents within a traffic environment of the ego vehicle.
Reducing inconvenience to surrounding road users caused by stopped autonomous vehicles
Aspects of the disclosure provide for reducing inconvenience to other road users caused by stopped autonomous vehicles. As an example, a vehicle having an autonomous driving mode may be stopped at a first location. While the vehicle is stopped, sensor data is received from a perception system of the vehicle. The sensor data may identify a road user. Using the sensor data, a value indicative of a level of inconvenience to the road user caused by stopping the vehicle at the first location may be determined. The vehicle is then controlled in the autonomous driving mode to move from the first location in order to reduce the value.
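The inconvenience value and the relocate decision could be sketched as a scalar cost plus a minimization step. The cost terms, weights, and hysteresis margin are all hypothetical; the abstract only states that a value is computed and the vehicle is moved to reduce it.

```python
def inconvenience(road_user_delay_s, blocked_lane_fraction,
                  w_delay=1.0, w_block=5.0):
    """Hypothetical scalar inconvenience value: weighted sum of the
    delay imposed on the road user and the fraction of the lane the
    stopped vehicle blocks."""
    return w_delay * road_user_delay_s + w_block * blocked_lane_fraction

def should_relocate(current_cost, candidate_costs, margin=0.1):
    """Return the index of a candidate stop location whose
    inconvenience value undercuts the current location's by at least
    `margin` (hysteresis against needless moves), else None."""
    best = min(range(len(candidate_costs)), key=candidate_costs.__getitem__)
    if candidate_costs[best] + margin < current_cost:
        return best
    return None
```

The margin term is a design choice to keep the vehicle from shuttling between locations with nearly equal costs.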
Detection of object awareness and/or malleability to state change
Determining whether another entity is coordinating with an autonomous vehicle and/or to what extent the other entity's behavior is based on the autonomous vehicle may comprise determining a collaboration score and/or negotiation score based at least in part on sensor data. The collaboration score may indicate an extent to which the entity is collaborating with the autonomous vehicle to navigate (e.g., a likelihood that the entity is increasingly yielding the right of way to the autonomous vehicle based on the autonomous vehicle's actions). A negotiation score may indicate an extent to which behavior exhibited by the entity is based on actions of the autonomous vehicle (e.g., how well the autonomous vehicle and the entity are communicating with their actions).
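A minimal reading of the collaboration score is a frequency estimate over observed interactions. The event representation below (parallel boolean lists of ego assertive actions and entity yields) is a hypothetical simplification; the patent derives these scores from sensor data in an unspecified way.

```python
def collaboration_score(ego_assertive_actions, entity_yields):
    """Hypothetical estimate: fraction of the ego vehicle's assertive
    actions (e.g., creeping forward at an intersection) that were
    followed by the other entity yielding. Returns a value in [0, 1].

    Inputs are parallel per-interaction flags (1/True = occurred).
    """
    if not ego_assertive_actions:
        return 0.0
    followed = sum(1 for a, y in zip(ego_assertive_actions, entity_yields)
                   if a and y)
    attempts = sum(1 for a in ego_assertive_actions if a)
    return followed / attempts if attempts else 0.0
```

A score near 1 would suggest the entity is responsive to the ego vehicle's actions; a score near 0 would suggest the entity is ignoring them, which is the distinction the collaboration and negotiation scores are meant to capture.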