Patent classifications
B60W2552/53
VISIBILITY CONDITION DETERMINATIONS FOR AUTONOMOUS DRIVING OPERATIONS
Techniques are described for determining visibility conditions of an environment in which an autonomous vehicle is operated and performing driving related operations based on the visibility conditions. An example method of adjusting driving related operations of a vehicle includes determining, by a computer located in an autonomous vehicle, a visibility related condition of an environment in which the autonomous vehicle is operating, adjusting, based at least on the visibility related condition, a set of one or more values of one or more variables associated with a driving related operation of the autonomous vehicle, and causing the autonomous vehicle to be driven to a destination by causing the driving related operation of one or more devices located in the autonomous vehicle based on at least the set of one or more values.
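The abstract above does not specify how visibility conditions map to adjusted driving variables. A minimal sketch of one such mapping, with entirely illustrative thresholds (50 m crawl floor, 300 m full-speed visibility) and made-up parameter names:

```python
def adjust_driving_parameters(visibility_m, base_speed_mps=30.0, base_gap_s=1.5):
    """Scale speed and following-gap variables down as visibility degrades.

    visibility_m: estimated sight distance in meters (hypothetical input).
    Returns (target_speed_mps, following_gap_s).
    """
    # Below 50 m of visibility, drive at a minimum crawl speed with a wide gap.
    if visibility_m < 50:
        return 5.0, base_gap_s * 3
    # Between 50 m and 300 m, interpolate linearly toward full speed.
    if visibility_m < 300:
        frac = (visibility_m - 50) / 250
        return 5.0 + frac * (base_speed_mps - 5.0), base_gap_s * (3 - 2 * frac)
    # At 300 m or more, use the unmodified base values.
    return base_speed_mps, base_gap_s
```

For example, at 175 m of estimated visibility this yields a 17.5 m/s target speed and a 3.0 s gap, halfway between the crawl and base settings.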
METHOD AND CONTROL UNIT FOR OPERATING A TRANSVERSE STABILIZATION SYSTEM OF A VEHICLE
A method for operating a transverse stabilization system of a vehicle. A steering direction of the vehicle and a setpoint direction of the vehicle are read in, with a transverse stabilization target for the transverse stabilization system being determined using the steering direction and the setpoint direction.
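The abstract only states that the transverse stabilization target is determined from the steering direction and the setpoint direction. One simple reading is a proportional law on the wrapped heading error; the gain and the angle wrapping below are illustrative assumptions, not the patented method:

```python
import math

def transverse_stabilization_target(steering_dir_rad, setpoint_dir_rad, gain=1.0):
    """Compute a corrective stabilization target from the heading error.

    Wraps the difference between setpoint and steering direction into
    (-pi, pi] so the correction always takes the short way around.
    """
    error = (setpoint_dir_rad - steering_dir_rad + math.pi) % (2 * math.pi) - math.pi
    return gain * error
```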
ASSOCIATING PERCEIVED AND MAPPED LANE EDGES FOR LOCALIZATION
A system for associating perceived and mapped lane edges can include a processor and a memory. The memory includes instructions such that the processor is configured to receive sensor data representing a perceived object; receive map data representing a map object; determine a cost matrix indicative of an association cost for associating the map object with the perceived object; compare the association cost with an association cost threshold; and associate the perceived object with the map object based on the association cost.
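The cost-matrix association described above can be sketched with Euclidean distances and a greedy assignment; the distance metric, greedy strategy, and point representation are all assumptions (a production system might use a Hungarian solver and richer edge features):

```python
def associate(perceived, mapped, cost_threshold):
    """Associate perceived lane-edge points with map lane-edge points.

    Builds a cost matrix of Euclidean distances, then greedily pairs the
    lowest-cost entries whose cost stays under the threshold.
    perceived / mapped: lists of (x, y) tuples.
    Returns a list of (perceived_index, mapped_index) pairs.
    """
    cost = [[((px - mx) ** 2 + (py - my) ** 2) ** 0.5
             for (mx, my) in mapped]
            for (px, py) in perceived]
    pairs, used_p, used_m = [], set(), set()
    # Greedy: repeatedly take the cheapest remaining association.
    candidates = sorted((cost[i][j], i, j)
                        for i in range(len(perceived))
                        for j in range(len(mapped)))
    for c, i, j in candidates:
        if c > cost_threshold:
            break  # all remaining costs exceed the threshold
        if i not in used_p and j not in used_m:
            pairs.append((i, j))
            used_p.add(i)
            used_m.add(j)
    return pairs
```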
Driving assistance apparatus
In a driving assistance apparatus, an image acquiring unit acquires a captured image captured by an onboard camera. Based on the captured image acquired by the image acquiring unit, a boundary line recognizing unit recognizes a boundary line that demarcates a traffic lane in which an own vehicle is driving. A road information acquiring unit acquires road information related to a road on which the own vehicle is driving. Based on the road information acquired by the road information acquiring unit, a degree-of-reliability setting unit sets a degree of reliability of the boundary line recognized by the boundary line recognizing unit. Based on the boundary line recognized by the boundary line recognizing unit, a driving assisting unit performs driving assistance of the own vehicle and varies control content of the driving assistance based on the degree of reliability.
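One way to picture the reliability-driven behavior above is a score derived from road information that then scales the control authority of the lane-keeping assistance. The road-info keys, penalty weights, and cut-off below are hypothetical:

```python
def boundary_reliability(road_info):
    """Set a degree of reliability for a recognized boundary line.

    road_info: dict with illustrative keys; each degrading factor
    lowers the score from 1.0.
    """
    score = 1.0
    if road_info.get("under_construction"):
        score -= 0.4
    if road_info.get("faded_markings"):
        score -= 0.3
    if road_info.get("merge_zone"):
        score -= 0.2
    return max(score, 0.0)

def steering_assist_gain(reliability, full_gain=1.0):
    """Vary lane-keeping control authority with boundary reliability."""
    if reliability < 0.3:
        return 0.0  # too unreliable: suspend assistance entirely
    return full_gain * reliability
```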
Autonomous vehicle park-and-go scenario design
In one embodiment, when an autonomous driving vehicle (ADV) is parked, the ADV can determine, based on criteria, whether to operate in an open-space mode or an on-lane mode. The criteria can include whether the ADV is within a threshold distance and threshold heading relative to a vehicle lane. If the criteria are not satisfied, then the ADV can enter the open-space mode. While in the open-space mode, the ADV can maneuver until it is within the threshold distance and the threshold heading relative to the vehicle lane. In response to the criteria being satisfied, the ADV can enter and operate in the on-lane mode for the ADV to resume along the vehicle lane.
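The mode-selection criteria reduce to two threshold checks. A minimal sketch, with illustrative threshold values (the abstract only says both a distance and a heading criterion must hold):

```python
import math

def select_mode(dist_to_lane_m, heading_err_rad,
                dist_threshold_m=2.0, heading_threshold_rad=math.radians(20)):
    """Choose open-space vs. on-lane operation after parking.

    On-lane mode requires being both close enough to the lane and
    aligned closely enough with it; otherwise stay in open-space mode.
    """
    if dist_to_lane_m <= dist_threshold_m and abs(heading_err_rad) <= heading_threshold_rad:
        return "on-lane"
    return "open-space"
```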
Autonomous vehicle operation using linear temporal logic
Techniques are provided for autonomous vehicle operation using linear temporal logic. The techniques include using one or more processors of a vehicle to store a linear temporal logic expression defining an operating constraint for operating the vehicle. The vehicle is located at a first spatiotemporal location. The one or more processors are used to receive a second spatiotemporal location for the vehicle. The one or more processors are used to identify a motion segment for operating the vehicle from the first spatiotemporal location to the second spatiotemporal location. The one or more processors are used to determine a value of the linear temporal logic expression based on the motion segment. The one or more processors are used to generate an operational metric for operating the vehicle in accordance with the motion segment based on the determined value of the linear temporal logic expression.
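To make the LTL evaluation concrete: over a finite motion segment, a "globally" constraint such as G(speed ≤ limit) reduces to checking every sampled state, and a margin can serve as the operational metric. The trace representation, the specific constraint, and the margin-based scoring are assumptions for illustration:

```python
def always(predicate, trace):
    """Evaluate the LTL 'globally' operator G(p) over a finite trace."""
    return all(predicate(state) for state in trace)

def operational_metric(trace, speed_limit=15.0):
    """Score a motion segment against the constraint G(speed <= limit).

    Returns 0.0 if the constraint is violated anywhere on the segment;
    otherwise a normalized worst-case speed margin (illustrative choice).
    """
    if not always(lambda s: s["speed"] <= speed_limit, trace):
        return 0.0
    worst_margin = min(speed_limit - s["speed"] for s in trace)
    return worst_margin / speed_limit
```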
Collision avoidance assist apparatus
A driving assist ECU determines that a current situation is a specific situation, in which it is predicted that there is no object about to enter an adjacent lane from an area outside of the host vehicle road on which the host vehicle is traveling, when a road-side object is detected at a part around an edge of the adjacent lane, and/or when a white line painted to define the adjacent lane is detected at that part and no object is detected near the detected white line. When it is not determined that the current situation is the specific situation, the driving assist ECU does not perform the collision-avoidance steering control that lets the host vehicle enter the adjacent lane.
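The and/or structure of the specific-situation test can be written out directly. The three boolean inputs below stand in for the ECU's detection results (hypothetical names):

```python
def is_specific_situation(roadside_object_at_edge, white_line_at_edge,
                          object_near_white_line):
    """Predict that nothing is about to enter the adjacent lane from outside.

    Mirrors the abstract's logic: a road-side object at the lane edge,
    or a white line at the edge with no object detected near it.
    """
    return roadside_object_at_edge or (white_line_at_edge and not object_near_white_line)

def allow_avoidance_steering(specific_situation):
    """The steering control into the adjacent lane is permitted only
    in the specific situation."""
    return specific_situation
```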
VEHICLE CONTROL SYSTEM AND METHOD
A control system for controlling generation of a steering overlay signal to control a trajectory of a host vehicle can include one or more controllers and is configured to identify a lateral boundary of the host-vehicle lane of travel. The control system monitors a position of the host vehicle in relation to the lateral boundary of the host-vehicle lane of travel. A lateral velocity of the host vehicle is determined by the control system. The steering overlay signal is generated based on a determination that the host vehicle is approaching or traversing the lateral boundary of the host-vehicle lane of travel and that the determined lateral velocity is greater than or equal to a first lateral velocity threshold. The control system can determine a lateral separation between the host vehicle and an object.
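A sketch of the overlay trigger: the signal fires only when the host vehicle has reached the lateral boundary and its lateral velocity meets the first threshold. The sign convention, gain, and proportional-to-drift response are illustrative assumptions:

```python
def steering_overlay(lateral_pos_m, boundary_m, lateral_vel_mps,
                     vel_threshold_mps=0.25, gain=2.0):
    """Generate a steering overlay signal when drifting over the boundary.

    Positive lateral_pos_m / lateral_vel_mps denote motion toward the
    boundary located at boundary_m. Returns 0.0 when no overlay is needed.
    """
    crossing = lateral_pos_m >= boundary_m and lateral_vel_mps >= vel_threshold_mps
    if not crossing:
        return 0.0
    # Steer back toward the lane center, proportional to the drift rate.
    return -gain * lateral_vel_mps
```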
System for Determining Road Slipperiness in Bad Weather Conditions
Systems and methods are disclosed for estimating slipperiness of a road surface. This estimate may be obtained using an image sensor mounted on a vehicle. The estimated road slipperiness may be utilized when calculating a risk index for the road, or for an area including the road. If a predetermined threshold for slipperiness is exceeded, corrective actions may be taken. For instance, warnings may be generated for human drivers who are in control of a vehicle, and autonomous vehicles may automatically adjust vehicle speed based upon the detected road slipperiness.
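The threshold-and-act step above can be sketched as a small policy function. The slipperiness index scale, threshold value, and speed-reduction rule are illustrative, not from the disclosure:

```python
def corrective_action(slipperiness, threshold=0.6, current_speed_mps=25.0):
    """Map an estimated slipperiness index in [0, 1] to corrective actions.

    Above the threshold: warn a human driver and compute a reduced
    target speed for an autonomous vehicle (illustrative policy).
    """
    if slipperiness <= threshold:
        return {"warn": False, "target_speed_mps": current_speed_mps}
    reduction = min(1.0, slipperiness)  # cap the index at 1.0
    return {"warn": True,
            "target_speed_mps": current_speed_mps * (1.0 - 0.5 * reduction)}
```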
AUTONOMOUS LOOK AHEAD METHODS AND SYSTEMS
Methods and systems are provided for controlling an autonomous vehicle. In one embodiment, a method includes: identifying, by a processor, at least one constraint on a longitudinal dimension of an upcoming road; defining, by the processor, constraint activation logic based on a type of the at least one constraint; performing, by the processor, the constraint activation logic to determine a state of the constraint to be at least one of active and inactive; when the state of the constraint is active, validating, by the processor, a motion plan of the autonomous vehicle based on the constraint; and selectively controlling the autonomous vehicle based on the validating of the motion plan.
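The constraint-activation flow above can be sketched as type-dispatched activation logic followed by plan validation. The two constraint types, their fields, and the speed-limit check are assumptions chosen to illustrate the active/inactive gating:

```python
def constraint_active(constraint, ego_position_m):
    """Determine the active/inactive state from the constraint type.

    A 'zone' constraint activates when the ego is inside its longitudinal
    interval; a 'point' constraint activates within a look-ahead window.
    """
    kind = constraint["type"]
    if kind == "zone":
        return constraint["start_m"] <= ego_position_m <= constraint["end_m"]
    if kind == "point":
        return 0.0 <= constraint["at_m"] - ego_position_m <= constraint.get("lookahead_m", 100.0)
    return False

def validate_plan(plan_speeds, constraint, ego_position_m):
    """Validate a motion plan only against constraints that are active."""
    if not constraint_active(constraint, ego_position_m):
        return True  # inactive constraints do not restrict the plan
    return all(v <= constraint["speed_limit_mps"] for v in plan_speeds)
```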