Patent classifications
B60W2554/4029
Group and combine obstacles for autonomous driving vehicles
In one embodiment, a plurality of obstacles is sensed in an environment of an automated driving vehicle (ADV). One or more representations are formed to represent corresponding groupings of the plurality of obstacles. A vehicle route is determined in view of the one or more representations, rather than each and every one of the obstacles individually.
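The abstract above does not specify how obstacles are grouped; a minimal sketch, assuming obstacles are axis-aligned boxes merged whenever the gap between them falls below a threshold, might look like the following (the `Box` type, the gap criterion, and the threshold are all illustrative assumptions, not the patent's actual method):

```python
import math
from dataclasses import dataclass

@dataclass
class Box:
    """Axis-aligned obstacle footprint: (min_x, min_y, max_x, max_y)."""
    min_x: float
    min_y: float
    max_x: float
    max_y: float

def gap(a: Box, b: Box) -> float:
    """Euclidean gap between two boxes (0 if they overlap)."""
    dx = max(a.min_x - b.max_x, b.min_x - a.max_x, 0.0)
    dy = max(a.min_y - b.max_y, b.min_y - a.max_y, 0.0)
    return math.hypot(dx, dy)

def group_obstacles(boxes: list[Box], max_gap: float = 1.0) -> list[Box]:
    """Merge obstacles whose mutual gap is below max_gap; each group is
    represented by the single bounding box enclosing all its members."""
    parent = list(range(len(boxes)))

    def find(i: int) -> int:
        # Union-find with path compression.
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i

    for i in range(len(boxes)):
        for j in range(i + 1, len(boxes)):
            if gap(boxes[i], boxes[j]) <= max_gap:
                parent[find(i)] = find(j)

    groups: dict[int, Box] = {}
    for i, b in enumerate(boxes):
        r = find(i)
        if r in groups:
            g = groups[r]
            groups[r] = Box(min(g.min_x, b.min_x), min(g.min_y, b.min_y),
                            max(g.max_x, b.max_x), max(g.max_y, b.max_y))
        else:
            groups[r] = b
    return list(groups.values())
```

Route planning would then consider only the returned group representations instead of every obstacle individually.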
Autonomous vehicle operation with explicit occlusion reasoning
Autonomous vehicle operation with explicit occlusion reasoning may include traversing, by a vehicle, a vehicle transportation network. Traversing the vehicle transportation network can include receiving, from a sensor of the vehicle, sensor data for a portion of a vehicle operational environment, determining, using the sensor data, a visibility grid comprising coordinates forming an unobserved region within a defined distance from the vehicle, computing a probability of a presence of an external object within the unobserved region by comparing the visibility grid to a map (e.g., a high-definition map), and traversing a portion of the vehicle transportation network using the probability. An apparatus and a vehicle are also described.
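The comparison of the visibility grid against the map can be sketched as follows, assuming both are discretized into grid cells and each unobserved drivable cell independently hides an object with some prior probability (the independence assumption and the per-cell prior are illustrative choices, not taken from the patent):

```python
def occlusion_probability(unobserved_cells: set[tuple[int, int]],
                          drivable_cells: set[tuple[int, int]],
                          prior_per_cell: float = 0.05) -> float:
    """Probability that at least one external object occupies the unobserved
    region, given cells of the visibility grid that the sensors could not
    observe and cells the map marks as drivable.
    """
    # Only unobserved cells that the map says are drivable can hide a road user.
    hidden_drivable = unobserved_cells & drivable_cells
    # P(at least one object) = 1 - P(no cell hides an object)
    return 1.0 - (1.0 - prior_per_cell) ** len(hidden_drivable)
```

The vehicle can then slow or reposition when this probability exceeds a comfort threshold, trading speed for visibility into the occluded region.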
Partial point cloud-based pedestrians' velocity estimation method
A method, apparatus, and system for estimating a moving speed of a detected pedestrian at an autonomous driving vehicle (ADV) is disclosed. A pedestrian is detected in a plurality of frames of point clouds generated by a LIDAR device installed at an autonomous driving vehicle (ADV). In each of at least two of the plurality of frames of point clouds, a minimum bounding box enclosing points corresponding to the pedestrian excluding points corresponding to limbs of the pedestrian is generated. A moving speed of the pedestrian is estimated based at least in part on the minimum bounding boxes across the at least two of the plurality of frames of point clouds. A trajectory for the ADV is planned based at least on the moving speed of the pedestrian. Thereafter, control signals are generated to drive the ADV based on the planned trajectory.
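A rough sketch of this velocity estimate in 2D: exclude limb points, fit a minimum axis-aligned bounding box per frame, and divide the displacement of box centers by the frame interval. The patent does not say how limb points are identified; discarding the points farthest from the centroid is an illustrative stand-in for that step:

```python
import math
import statistics

def torso_bounding_box(points, trim=0.2):
    """Axis-aligned bounding box of the pedestrian's core points.

    Limb exclusion is approximated by discarding the `trim` fraction of
    points farthest from the centroid (an assumed filtering rule).
    """
    cx = statistics.fmean(p[0] for p in points)
    cy = statistics.fmean(p[1] for p in points)
    ranked = sorted(points, key=lambda p: (p[0] - cx) ** 2 + (p[1] - cy) ** 2)
    keep = ranked[: max(1, int(len(ranked) * (1 - trim)))]
    xs = [p[0] for p in keep]
    ys = [p[1] for p in keep]
    return (min(xs), min(ys), max(xs), max(ys))

def estimate_speed(frames, dt):
    """Average speed of the pedestrian from bounding-box centers across
    consecutive point-cloud frames taken dt seconds apart."""
    centers = []
    for pts in frames:
        x0, y0, x1, y1 = torso_bounding_box(pts)
        centers.append(((x0 + x1) / 2, (y0 + y1) / 2))
    dists = [math.hypot(b[0] - a[0], b[1] - a[1])
             for a, b in zip(centers, centers[1:])]
    return sum(dists) / (dt * len(dists))
```

Centering on the torso box rather than the raw point cloud keeps arm and leg swing from injecting spurious motion into the estimate, which is the intuition the abstract describes.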
VEHICLE CONTROL SYSTEM, VEHICLE INTEGRATED CONTROL DEVICE, ELECTRONIC CONTROL DEVICE, NETWORK COMMUNICATION DEVICE, VEHICLE CONTROL METHOD AND COMPUTER READABLE MEDIUM
A vehicle control system (500) controls a vehicle on which a plurality of ECUs (30) and a vehicle integrated control device (10) that controls the plurality of ECUs (30) are mounted. The vehicle integrated control device (10) includes a control target value operation unit to calculate a control target value to control the plurality of ECUs (30). Further, the vehicle integrated control device (10) includes a prediction control value operation unit to estimate a future state of the vehicle and to calculate a prediction control value to control the plurality of ECUs (30). The vehicle integrated control device (10) includes an instruction signal generation unit to generate an instruction signal including an operation instruction and a prediction control instruction. Each of the plurality of ECUs (30) includes an actuator control unit to control an actuator (50) based on the prediction control instruction.
REAL TIME EVENT TRIGGERED FEEDBACK FOR AUTONOMOUS VEHICLES
The disclosure relates to collecting feedback from passengers of autonomous vehicles. For instance, it may be determined that a triggering circumstance for triggering a feedback request has been met. The triggering circumstance may include a driving event, a presence of other road users, or a trip state. A display requirement and data collection parameters for the feedback request are identified based on the determination. The display requirement defines when the feedback request is displayed, and the data collection parameters identify the information that the feedback request is to collect. The feedback request is provided for display based on the display requirement and data collection parameters. In response, feedback from a passenger of the autonomous vehicle is received and stored for later use.
Trajectory classification
Techniques to predict object behavior in an environment are discussed herein. For example, such techniques may include inputting data into a model and receiving an output from the model representing a discretized representation. The discretized representation may be associated with a probability of an object reaching a location in the environment at a future time. A vehicle computing system may determine a trajectory and a weight associated with the trajectory using the discretized representation and the probability. A vehicle, such as an autonomous vehicle, can be controlled to traverse an environment based on the trajectory and the weight output by the vehicle computing system.
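One way to read the abstract's final step: the model's discretized output is a heatmap over grid cells, and each candidate trajectory is weighted by the probability of the cell it reaches at the future time. A minimal sketch under that reading (the endpoint-lookup rule and normalization are assumptions; the patent does not fix them):

```python
def weight_trajectories(candidates, heatmap):
    """Weight each candidate trajectory by the predicted probability of the
    grid cell it occupies at the future time, normalized over candidates.

    candidates: list of trajectories, each a list of (row, col) grid cells.
    heatmap: dict mapping (row, col) to the model's occupancy probability.
    """
    raw = [heatmap.get(traj[-1], 0.0) for traj in candidates]
    total = sum(raw) or 1.0  # avoid division by zero when all weights are 0
    return [w / total for w in raw]
```

The planner can then reason about the object's likely motion using a handful of weighted trajectories instead of the full discretized distribution.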
Vehicle ride sharing system and method using smart modules
A vehicle sharing system includes a vehicle having interior transceiver modules associated with different passenger seating areas and a vehicle computing system (VCS) including a processor and a memory in communication with the modules and programmed to detect occupancy status of each seating area based on signals from the modules and to communicate the occupancy statuses to a remote server to facilitate scheduling of ride-sharing passengers for a specified seating area of the vehicle. The reserved seating location may be used to align the seating location/door with a passenger during pick-up, adjust vehicle accessory settings associated with the reserved seating location, and activate a visual indicator to direct the passenger to the assigned/reserved seating location.
Predicting jaywalking behaviors of vulnerable road users
Jaywalking behaviors of vulnerable road users (VRUs) such as cyclists or pedestrians can be predicted. Location data is obtained that identifies a location of a VRU within a vicinity of a vehicle. Environmental data is obtained that describes an environment of the VRU, where the environmental data identifies a set of environmental features in the environment of the VRU. The system can determine a nominal heading of the VRU, and generate a set of predictive inputs that indicate, for each of at least a subset of the set of environmental features, a physical relationship between the VRU and the environmental feature. The physical relationship can be determined with respect to the nominal heading of the VRU and the location of the VRU. The set of predictive inputs can be processed with a heading estimation model to generate a predicted heading offset (e.g., a target heading offset) for the VRU.
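The predictive inputs described above can be sketched as distance/relative-bearing pairs computed against the VRU's nominal heading; the feature names and the exact input encoding here are illustrative assumptions, with only the geometry taken from the abstract:

```python
import math

def predictive_inputs(vru_xy, nominal_heading, features):
    """For each environmental feature, compute (distance, bearing relative to
    the VRU's nominal heading) as one predictive input pair.

    vru_xy: (x, y) location of the VRU.
    nominal_heading: heading in radians, measured from the +x axis.
    features: dict mapping feature names (e.g. 'crosswalk') to (x, y).
    """
    inputs = {}
    for name, (fx, fy) in features.items():
        dx, dy = fx - vru_xy[0], fy - vru_xy[1]
        dist = math.hypot(dx, dy)
        # Bearing relative to the nominal heading, wrapped to [-pi, pi).
        rel = math.atan2(dy, dx) - nominal_heading
        rel = (rel + math.pi) % (2 * math.pi) - math.pi
        inputs[name] = (dist, rel)
    return inputs
```

These pairs would then be fed to the heading estimation model, which outputs the predicted heading offset (e.g., a turn toward a nearby crosswalk).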
Systems and methods for robotic behavior around moving bodies
Systems and methods for detection of people are disclosed. In some exemplary implementations, a robot can have a plurality of sensor units. Each sensor unit can be configured to generate sensor data indicative of a portion of a moving body at a plurality of times. Based on at least the sensor data, the robot can determine that the moving body is a person by at least detecting the motion of the moving body and determining that the moving body has characteristics of a person. The robot can then perform an action based at least in part on the determination that the moving body is a person.
On-vehicle driving behavior modelling
This application is directed to on-vehicle behavior modeling of vehicles. A vehicle has one or more processors, memory, a plurality of sensors, and a vehicle control system. The vehicle collects training data via the plurality of sensors, and the training data include data for one or more vehicles during a collection period. The vehicle locally applies machine learning to train a vehicle driving behavior model using the collected training data. The vehicle driving behavior model is configured to predict a behavior of one or more vehicles. The vehicle subsequently collects sensor data from the plurality of sensors and drives itself by applying the vehicle driving behavior model to predict vehicle behavior based on the collected sensor data. The vehicle driving behavior model is configured to predict behavior of an ego vehicle and/or a distinct vehicle that appears near the ego vehicle.
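The abstract leaves the model class open; a deliberately tiny stand-in that captures the collect-then-train-locally loop is an online least-mean-squares predictor of a nearby vehicle's next speed, updated one observation at a time as sensor data arrives (the feature choice, learning rule, and learning rate are all assumptions for illustration):

```python
class OnVehicleSpeedModel:
    """Minimal stand-in for an on-vehicle driving behavior model: predicts a
    vehicle's next speed from its current speed and acceleration, with
    online least-mean-squares weight updates performed locally."""

    def __init__(self, lr: float = 0.01):
        self.lr = lr
        self.w = [0.0, 0.0]  # weights for [speed, acceleration]

    def predict(self, speed: float, accel: float) -> float:
        return self.w[0] * speed + self.w[1] * accel

    def train_step(self, speed: float, accel: float, next_speed: float) -> None:
        # One stochastic-gradient step on the squared prediction error.
        err = next_speed - self.predict(speed, accel)
        self.w[0] += self.lr * err * speed
        self.w[1] += self.lr * err * accel
```

Training on-vehicle, as the abstract describes, keeps raw sensor data local; only the learned weights would ever need to leave the vehicle.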