Patent classifications
B60W2754/00
Control of autonomous vehicle based on environmental object classification determined using phase coherent LIDAR data
Determining classification(s) for object(s) in an environment of an autonomous vehicle, and controlling the vehicle based on the determined classification(s). For example, autonomous steering, acceleration, and/or deceleration of the vehicle can be controlled based on determined pose(s) and/or classification(s) for objects in the environment. The control can be based on the pose(s) and/or classification(s) directly, and/or based on movement parameter(s), for the object(s), determined based on the pose(s) and/or classification(s). In many implementations, pose(s) and/or classification(s) of environmental object(s) are determined based on data from a phase coherent Light Detection and Ranging (LIDAR) component of the vehicle, such as a phase coherent LIDAR monopulse component and/or a frequency-modulated continuous wave (FMCW) LIDAR component.
Travel Control Device and Method for Vehicle
A travel control device acquires subject vehicle information including the position of a subject vehicle, acquires object information including the position of an avoidance object which the subject vehicle should avoid, plans a target route for avoiding the avoidance object in accordance with the position of the subject vehicle and the position of the avoidance object, and outputs command information for driving the subject vehicle on the target route. The planning function reduces the distance from the subject vehicle to a turning point as a margin distance decreases. The margin distance is the distance, along the width direction of the road, between the subject vehicle and a lane marker on the road on which the subject vehicle is traveling. The turning point is a point at which the location of the target route along the width direction varies by a predetermined distance or more.
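The abstract above only states the qualitative relationship (the turning point moves closer as the margin distance shrinks), not a formula. A minimal sketch of one possible such relationship, with all function names and constants invented for illustration:

```python
def turning_point_distance(margin_distance: float,
                           base_distance: float = 30.0,
                           min_distance: float = 5.0,
                           max_margin: float = 1.5) -> float:
    """Distance (m) from the subject vehicle to the turning point.

    Shrinks linearly as the margin distance (lateral gap between the
    subject vehicle and the lane marker) decreases. All names and
    constants here are illustrative assumptions, not from the patent.
    """
    # Clamp the margin to [0, max_margin] and normalize to [0, 1].
    ratio = max(0.0, min(margin_distance, max_margin)) / max_margin
    # A small margin (ratio near 0) yields a turning point close to
    # the vehicle; a full margin yields the base planning distance.
    return min_distance + (base_distance - min_distance) * ratio
```

For example, at the full 1.5 m margin the turning point sits 30 m ahead, and at zero margin it is pulled in to 5 m.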
Travel Control Device and Travel Control Method
A travel control device comprises: an object information acquisition unit that acquires object information including the position of an avoidance object around a subject vehicle; a planning unit that plans a target route passing the avoidance object on the basis of the position of the subject vehicle and the position of the avoidance object; a control unit that outputs command information for driving the subject vehicle on the target route; and a second setting unit that calculates a distance between the subject vehicle and the avoidance object along the vehicle width direction when driving the subject vehicle on the target route and uses the distance as the basis to set a tolerable vehicle width distance range. When the actual distance from the subject vehicle to the position of the avoidance object along the vehicle width direction is within the tolerable vehicle width distance range, the control unit drives the subject vehicle on the basis of the set target route.
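The check the control unit performs can be sketched as a simple interval test: the planned lateral distance to the avoidance object defines a tolerable band, and the vehicle follows the target route only while the actual lateral distance stays inside it. The function name and the tolerance value below are assumptions for illustration:

```python
def within_tolerable_range(actual_lateral_distance: float,
                           planned_lateral_distance: float,
                           tolerance: float = 0.3) -> bool:
    """True if the actual lateral distance to the avoidance object
    falls inside the tolerable band around the planned distance.

    The band width (+/- 0.3 m) is an illustrative assumption; the
    patent only says the range is set based on the planned distance.
    """
    lower = planned_lateral_distance - tolerance
    upper = planned_lateral_distance + tolerance
    return lower <= actual_lateral_distance <= upper
```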
Training machine learning model based on training instances with: training instance input based on autonomous vehicle sensor data, and training instance output based on additional vehicle sensor data
Various implementations described herein generate training instances that each include corresponding training instance input that is based on corresponding sensor data of a corresponding autonomous vehicle, and that include corresponding training instance output that is based on corresponding sensor data of a corresponding additional vehicle, where the corresponding additional vehicle is captured at least in part by the corresponding sensor data of the corresponding autonomous vehicle. Various implementations train a machine learning model based on such training instances. Once trained, the machine learning model can enable processing, using the machine learning model, of sensor data from a given autonomous vehicle to predict one or more properties of a given additional vehicle that is captured at least in part by the sensor data.
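The core idea above is a supervised pairing: the input comes from the autonomous vehicle's own sensors, while the label comes from sensors on the other vehicle it observed. A minimal sketch of that pairing (all type and field names are invented for illustration; the patent does not specify a data format):

```python
from dataclasses import dataclass
from typing import List, Tuple


@dataclass
class TrainingInstance:
    # Input: features derived from the autonomous vehicle's sensor data,
    # e.g. a LIDAR crop around the detected additional vehicle.
    input_features: List[float]
    # Output: a ground-truth label derived from the additional vehicle's
    # own sensor data, e.g. its self-reported (speed, heading).
    target: Tuple[float, float]


def make_instance(av_sensor_crop, additional_vehicle_state) -> TrainingInstance:
    """Pair one AV-side observation with the observed vehicle's own
    reported state to form a single training instance."""
    return TrainingInstance(list(av_sensor_crop), tuple(additional_vehicle_state))
```

A model trained on many such instances can then predict the additional vehicle's properties from the AV's sensor data alone.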
Control Of Autonomous Vehicle Based On Determined Yaw Parameter(s) of Additional Vehicle
Determining an instantaneous vehicle characteristic (e.g., at least one yaw rate) of an additional vehicle that is in addition to a vehicle being autonomously controlled, and adapting autonomous control of the vehicle based on the determined instantaneous vehicle characteristic of the additional vehicle. For example, autonomous steering, acceleration, and/or deceleration of the vehicle can be adapted based on a determined instantaneous vehicle characteristic of the additional vehicle. In many implementations, the instantaneous vehicle characteristic(s) of the additional vehicle are determined based on data from a phase coherent Light Detection and Ranging (LIDAR) component of the vehicle, such as a phase coherent LIDAR monopulse component and/or a frequency-modulated continuous wave (FMCW) LIDAR component.
METHOD AND APPARATUS FOR CONTROLLING AUTOMATED GUIDED VEHICLE
A method performed by a computing device for controlling an automated guided vehicle according to an embodiment of the present disclosure includes obtaining information on a movable area including nodes arranged in a grid pattern, obtaining information on a moving path connecting a source node and a destination node existing in the movable area, wherein the moving path includes a plurality of moving nodes, and reserving at least some of the moving nodes located between a current node and the destination node by using information on the current node occupied by an automated guided vehicle according to movement of the automated guided vehicle.
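The reservation step described above can be sketched as a sliding window over the moving path: starting from the node the vehicle currently occupies, reserve the next few nodes up to (at most) the destination. The window size and function names are illustrative assumptions; the patent only states that some nodes between the current node and the destination are reserved:

```python
def reserve_ahead(moving_path, current_node, destination_node, window: int = 3):
    """Return the nodes to reserve ahead of the vehicle.

    moving_path: ordered list of node identifiers from source to
    destination on the grid. As the vehicle advances (current_node
    changes), calling this again slides the reservation window forward.
    """
    i = moving_path.index(current_node)
    j = moving_path.index(destination_node)
    # Reserve up to `window` nodes past the current node, never beyond
    # the destination node itself.
    return moving_path[i + 1 : min(i + 1 + window, j + 1)]
```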
Control of autonomous vehicle based on determined yaw parameter(s) of additional vehicle
Determining yaw parameter(s) (e.g., at least one yaw rate) of an additional vehicle that is in addition to a vehicle being autonomously controlled, and adapting autonomous control of the vehicle based on the determined yaw parameter(s) of the additional vehicle. For example, autonomous steering, acceleration, and/or deceleration of the vehicle can be adapted based on a determined yaw rate of the additional vehicle. In many implementations, the yaw parameter(s) of the additional vehicle are determined based on data from a phase coherent Light Detection and Ranging (LIDAR) component of the vehicle, such as a phase coherent LIDAR monopulse component and/or a frequency-modulated continuous wave (FMCW) LIDAR component.
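Phase coherent (e.g., FMCW) LIDAR reports an instantaneous radial velocity for each return, which is what makes a direct yaw estimate possible: if two returns lie at opposite ends of the additional vehicle, a difference in their radial velocities indicates rotation. A crude sketch of that idea (not the patent's actual method; names are illustrative):

```python
def estimate_yaw_rate(v_radial_front: float,
                      v_radial_rear: float,
                      separation: float) -> float:
    """Crude yaw-rate estimate (rad/s) from two per-point radial
    velocities (m/s) measured at points on the additional vehicle
    separated by `separation` meters along its body.

    Assumes the line of sight is roughly perpendicular to the
    separation axis; a real system would account for viewing geometry.
    """
    # Equal radial velocities -> no rotation; a difference divided by
    # the lever arm approximates the angular rate.
    return (v_radial_front - v_radial_rear) / separation
```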
CONTROLLING AN AUTONOMOUS VEHICLE USING A PROXIMITY RULE
The subject matter described in this specification is generally directed to a system and techniques for controlling an autonomous vehicle. In one example, a control circuit receives a proximity rule and a reference trajectory from a planning circuit, where the reference trajectory is determined by the planning circuit based on the proximity rule. The control circuit then determines a predicted trajectory based on the reference trajectory and the proximity rule, and the autonomous vehicle is navigated according to the predicted trajectory.
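The abstract does not define what a proximity rule looks like; one simple reading is a minimum-clearance constraint that every point of a candidate trajectory must satisfy. A minimal sketch under that assumption (function name, data shapes, and the clearance value are all invented for illustration):

```python
import math


def satisfies_proximity_rule(trajectory, obstacle, min_clearance: float = 2.0) -> bool:
    """True if every (x, y) point on the candidate trajectory keeps at
    least `min_clearance` meters from the obstacle position.

    Illustrative only: the patent's proximity rule is not specified as
    a fixed-radius check.
    """
    return all(math.dist(point, obstacle) >= min_clearance for point in trajectory)
```

A control circuit could use such a predicate to accept or reject candidate predicted trajectories derived from the reference trajectory.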