Patent classifications
B60W2420/52
External environment sensor data prioritization for autonomous vehicle
Sensor data is received from an array of sensors configured to capture one or more objects in an external environment of an autonomous vehicle. A first sensor group is selected from the array of sensors based on proximity data or environmental contexts. First sensor data from the first sensor group is prioritized for transmission based on the proximity data or environmental contexts.
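The prioritization described above can be illustrated with a minimal sketch. The proximity threshold, the field names, and the sorting rule are all hypothetical; the abstract does not specify how the first sensor group is chosen beyond "proximity data or environmental contexts":

```python
from dataclasses import dataclass

@dataclass
class SensorReading:
    sensor_id: str
    distance_m: float   # proximity of the nearest object this sensor captured
    payload: bytes

def select_priority_group(readings, proximity_threshold_m=20.0):
    """Split readings into a first sensor group (close objects) and the rest,
    and order transmission so the first group's data goes out earliest."""
    first_group = [r for r in readings if r.distance_m <= proximity_threshold_m]
    rest = [r for r in readings if r.distance_m > proximity_threshold_m]
    first_group.sort(key=lambda r: r.distance_m)  # closest objects first
    return first_group + rest

readings = [
    SensorReading("rear_lidar", 45.0, b"..."),
    SensorReading("front_radar", 8.5, b"..."),
    SensorReading("left_camera", 12.0, b"..."),
]
queue = select_priority_group(readings)
print([r.sensor_id for r in queue])  # ['front_radar', 'left_camera', 'rear_lidar']
```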
Driver assistance system and control method thereof
A driver assistance system according to an embodiment of the present disclosure includes: a radar provided in the vehicle, having a sensing field external to the vehicle and configured to acquire radar data; a memory configured to store a predetermined first graph; and a processor configured to determine a static target based on the radar data and on driving information comprising a driving velocity, generate a second graph based on the determined static target, and correct the driving velocity based on the first graph and the second graph.
DISTANCE-VELOCITY DISAMBIGUATION IN HYBRID LIGHT DETECTION AND RANGING DEVICES
The subject matter of this specification can be implemented in, among other things, a system that includes a first light source to produce a pulsed beam and a second light source to produce a continuous beam, a modulator to impart a modulation to the second beam, and an optical interface subsystem to transmit the pulsed beam and the continuous beam to an outside environment and to detect a plurality of signals reflected from the outside environment. The system further includes one or more circuits configured to identify associations of various reflected pulsed signals, used to detect distance to various objects in the environment, with correct reflected continuous signals, used to detect velocities of the objects. The one or more circuits identify the associations based on the modulation of the detected continuous signals.
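A toy sketch of the association step: assume the modulator stamps each continuous emission with a recoverable code, so each pulsed detection (distance) can be paired with the continuous detection (velocity) carrying the matching code. The code scheme and data shapes here are illustrative, not taken from the specification:

```python
def associate_returns(pulsed_returns, continuous_returns):
    """
    pulsed_returns:     list of (code, distance_m) from the pulsed beam.
    continuous_returns: list of (code, velocity_mps) from the continuous beam.
    Pairs each distance measurement with the velocity measurement whose
    recovered modulation code matches.
    """
    velocity_by_code = {code: v for code, v in continuous_returns}
    return {code: (dist, velocity_by_code.get(code))
            for code, dist in pulsed_returns}

# Returns from two objects arrive interleaved; the codes disambiguate them.
pulsed = [("A", 35.0), ("B", 120.0)]
continuous = [("B", -4.2), ("A", 12.5)]
print(associate_returns(pulsed, continuous))
# {'A': (35.0, 12.5), 'B': (120.0, -4.2)}
```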
Optical acquisition device for a motor vehicle, wherein the operation of a light source unit is carried out in dependence on a functional state of the housing, method, and motor vehicle
The invention relates to an optical acquisition device (3) for a motor vehicle (1), having a housing (8) in which a light source unit (10) of the optical acquisition device (3) is arranged, wherein light beams (6) can be emitted by the light source unit (10) through a housing part (9) of the housing (8) into the surroundings (4) of the motor vehicle (1). The optical acquisition device (3) comprises a checking unit (16) by means of which a functional state of the housing (8) can be checked; if an actual functional state of the housing (8) deviating from a reference functional state of the housing (8) is detected, a control signal can be generated. The invention furthermore relates to a motor vehicle (1) and a method.
Vehicle sensor calibration and verification
Systems and methods for automated vehicle sensor calibration and verification are provided. One example method involves monitoring a vehicle using one or more external sensors of a vehicle calibration facility to obtain sensor data. The sensor data may be indicative of a relative position of the vehicle in the vehicle calibration facility. The method also involves causing the vehicle to navigate in an autonomous driving mode, based on the sensor data, from a current position of the vehicle to a first calibration position in the vehicle calibration facility. The method also involves causing a first sensor of the vehicle to perform a first calibration measurement while the vehicle is at the first calibration position. The method also involves calibrating the first sensor based on at least the first calibration measurement.
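The monitor → navigate → measure → calibrate loop can be sketched as follows. The facility and vehicle classes, the bias model, and the offset-based calibration are all hypothetical stand-ins for whatever the actual system does:

```python
class CalibrationFacility:
    """Hypothetical facility whose external sensors track the vehicle."""
    def locate(self, vehicle):
        return vehicle.position

    def compute_offset(self, measurement, true_position):
        # Calibration here is simply the bias between the sensor's reading
        # of a surveyed target and the target's known ground-truth position.
        return tuple(m - t for m, t in zip(measurement, true_position))

class Vehicle:
    def __init__(self, position):
        self.position = position
        self.sensor_bias = (0.3, -0.1)  # unknown to the calibration routine

    def navigate_to(self, position):
        self.position = position  # autonomous drive to the calibration spot

    def measure_target(self, target):
        # Sensor reading of a known target, corrupted by the sensor's bias.
        return tuple(t + b for t, b in zip(target, self.sensor_bias))

def calibrate(facility, vehicle, calibration_position, target):
    facility.locate(vehicle)                   # monitor current position
    vehicle.navigate_to(calibration_position)  # drive to calibration position
    measurement = vehicle.measure_target(target)
    return facility.compute_offset(measurement, target)

v = Vehicle(position=(0.0, 0.0))
offset = calibrate(CalibrationFacility(), v, (10.0, 5.0), target=(12.0, 5.0))
print(offset)  # recovers the sensor bias, approximately (0.3, -0.1)
```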
Sensor fusion for precipitation detection and control of vehicles
An apparatus includes a processor configured to be disposed with a vehicle and a memory coupled to the processor. The memory stores instructions that cause the processor to receive at least two of: radar data, camera data, lidar data, or sonar data. The sensor data is associated with a predefined region of a vicinity of the vehicle while the vehicle is traveling during a first time period. At least a portion of the vehicle is positioned within the predefined region during the first time period. The instructions also cause the processor to detect that no other vehicle is present within the predefined region. An environment of the vehicle during the first time period is classified as one state from a set of states that includes at least one of dry, light rain, heavy rain, light snow, or heavy snow, based on at least two of the received sensor data, to produce an environment classification. An operational parameter of the vehicle is then modified based on the environment classification.
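A toy version of the fusion and classification step. The per-sensor "degradation" scores, the averaging rule, the thresholds, and the speed factors are all invented for illustration; the patent does not disclose its actual decision logic:

```python
def classify_environment(radar_noise, camera_blur, lidar_dropout):
    """Fuse normalized 0..1 degradation scores from three modalities into
    one of the precipitation states named in the abstract."""
    score = (radar_noise + camera_blur + lidar_dropout) / 3.0
    if score < 0.2:
        return "dry"
    if score < 0.4:
        return "light rain"
    if score < 0.6:
        return "heavy rain"
    if score < 0.8:
        return "light snow"
    return "heavy snow"

def adjust_speed_limit(base_limit_mps, classification):
    # The operational parameter modified from the classification.
    factors = {"dry": 1.0, "light rain": 0.9, "heavy rain": 0.75,
               "light snow": 0.7, "heavy snow": 0.5}
    return base_limit_mps * factors[classification]

state = classify_environment(0.5, 0.6, 0.4)  # -> "heavy rain"
print(adjust_speed_limit(30.0, state))       # 22.5
```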
Method and control device for controlling a motor vehicle
A method for controlling in an automated manner a motor vehicle (10) traveling on a road (12) in a current lane (14) is proposed, wherein the road (12) has at least one further lane (16). The method comprises the following steps: At least two preliminary driving maneuvers are generated and/or received, each of which includes a lane change from the current lane (14) to the at least one further lane (16) and a starting time of the lane change. The starting times of the at least two preliminary driving maneuvers differ from one another. The at least two preliminary driving maneuvers are compared taking into account the respective starting times, and one of the starting times is selected based on the comparison. A control device for a system for controlling a motor vehicle is also proposed.
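The compare-and-select step might look like the sketch below. The cost function, its weights, and the `gap_m` feature are purely illustrative assumptions; the abstract only says the candidates are compared with their starting times taken into account:

```python
def select_start_time(candidates):
    """
    candidates: list of dicts with 'start_time_s' (when the lane change
    would begin) and 'gap_m' (free gap in the target lane at that time).
    Cost prefers larger gaps and earlier starts; the weights are arbitrary.
    """
    def cost(c):
        return 100.0 / c["gap_m"] + 0.5 * c["start_time_s"]
    return min(candidates, key=cost)["start_time_s"]

candidates = [
    {"start_time_s": 1.0, "gap_m": 20.0},  # cost 5.5
    {"start_time_s": 3.0, "gap_m": 60.0},  # cost ~3.17
    {"start_time_s": 6.0, "gap_m": 80.0},  # cost 4.25
]
print(select_start_time(candidates))  # 3.0
```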
Vehicle systems and methods utilizing LIDAR data for road condition estimation
A system and method for estimating road conditions ahead of a vehicle, including: a LIDAR sensor operable for generating a LIDAR point cloud; a processor executing a road condition estimation algorithm stored in a memory, the road condition estimation algorithm performing the steps including: detecting a ground plane or drivable surface in the LIDAR point cloud; superimposing an M×N matrix on at least a portion of the LIDAR point cloud; for each patch of the LIDAR point cloud defined by the M×N matrix, statistically evaluating a relative position, a feature elevation, and a scaled reflectance index; and, from the statistically evaluated relative position, feature elevation, and scaled reflectance index, determining a slipperiness probability for each patch of the LIDAR point cloud; and a vehicle control system operable for, based on the determined slipperiness probability for each patch of the LIDAR point cloud, affecting an operation of the vehicle.
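A per-patch sketch of the idea: once the M×N matrix has partitioned the ground-plane points into patches, each patch's statistics are turned into a slipperiness probability. The heuristic below (flat and highly reflective ⇒ more likely slippery) and all constants are invented for illustration, not the patented statistics:

```python
import statistics

def patch_slipperiness(points):
    """
    points: list of (x, y, z, reflectance) falling in one M x N grid patch,
    with reflectance scaled to 0..1.
    Returns a 0..1 slipperiness score for the patch.
    """
    zs = [p[2] for p in points]
    refl = [p[3] for p in points]
    flatness = 1.0 / (1.0 + statistics.pstdev(zs))  # 1.0 = perfectly flat
    mean_reflectance = sum(refl) / len(refl)
    return max(0.0, min(1.0, flatness * mean_reflectance))

# A flat, highly reflective patch (ice-like) vs. a rough, dull one.
icy = [(x, 0.0, 0.01 * (x % 2), 0.9) for x in range(10)]
rough = [(x, 0.0, 0.15 * (x % 3), 0.3) for x in range(10)]
print(patch_slipperiness(icy) > patch_slipperiness(rough))  # True
```

A vehicle control system could then slow the vehicle when patches on the planned path exceed some score.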
VEHICLE DRIVER ASSIST SYSTEM
A vehicle driver assist system includes an expert evaluation system to fuse information acquired from various data sources. The data sources can correspond to conditions associated with the vehicle as a unit as well as external elements. The expert evaluation system monitors and evaluates the information from the data sources according to a set of rules by converting each data value into a metric value, determining a weight for each metric, assigning the determined weight to the metric, and generating a weighted metric corresponding to each data value. The expert evaluation system compares each weighted metric (or a linear combination of metrics) against one or more thresholds. The results from the comparison provide an estimation of a likelihood of one or more traffic features occurring.
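The convert → weight → combine → threshold pipeline described above can be sketched directly. The data sources, converter functions, weights, and threshold below are illustrative assumptions, not values from the patent:

```python
def evaluate(data_values, converters, weights, threshold):
    """
    data_values: {source_name: raw_value}
    converters:  {source_name: fn mapping the raw value to a 0..1 metric}
    weights:     {source_name: weight for that metric}
    A linear combination of the weighted metrics is compared to a
    threshold; meeting it flags the traffic feature as likely.
    """
    weighted = {name: weights[name] * converters[name](v)
                for name, v in data_values.items()}
    score = sum(weighted.values())
    return score, score >= threshold

converters = {
    "wiper_speed": lambda v: min(v / 3.0, 1.0),        # setting 0..3 -> 0..1
    "traffic_density": lambda v: min(v / 100.0, 1.0),  # vehicles/km -> 0..1
}
weights = {"wiper_speed": 0.4, "traffic_density": 0.6}
score, congestion_likely = evaluate(
    {"wiper_speed": 2, "traffic_density": 80}, converters, weights, 0.6)
print(round(score, 2), congestion_likely)  # 0.75 True
```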
Method for Providing Obstacle Maps for Vehicles
A method for the preparation of an obstacle map, wherein the obstacle map comprises cells, includes assigning each of the cells to segments of an environment of the vehicle, and assigning to each of the cells information as to whether the corresponding segment of the environment is occupied by an obstacle. The method also includes preparing an environment map that comprises the cells, and determining a threshold value specification, where the threshold value specification specifies different threshold values for the cells of the environment map. The threshold value specification is determined depending on a trajectory of the vehicle. An obstacle map is then determined on the basis of the environment map and the threshold value specification.