Patent classifications
B60W2554/404
VEHICLE DRIVE ASSIST APPARATUS
A vehicle drive assist apparatus to be applied to a vehicle executes at least emergency braking control for avoiding collision of the vehicle with a recognized object. The vehicle drive assist apparatus includes a surrounding environment recognition device and a traveling control unit. The surrounding environment recognition device includes a recognizer that recognizes a surrounding environment around the vehicle, and a feature information acquirer that acquires feature information of a target object in the recognized surrounding environment. The traveling control unit centrally controls the entire vehicle. The traveling control unit includes a determiner that determines, based on the acquired feature information, whether the target object has a possibility of hindering traveling of the vehicle. The traveling control unit continues normal traveling control for the vehicle when the target object is determined to have no possibility of hindering the traveling of the vehicle.
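The determiner described above can be sketched as a simple predicate over the acquired feature information. All names, fields, and the clearance threshold below are illustrative assumptions, not from the patent:

```python
# Hypothetical sketch of the determiner: feature information of the target
# object is checked to decide whether it could hinder traveling; if not,
# normal traveling control continues instead of emergency braking.
from dataclasses import dataclass

@dataclass
class FeatureInfo:
    height_m: float        # estimated height of the target object (assumed field)
    on_travel_path: bool   # whether the object lies on the vehicle's path (assumed field)

def may_hinder_travel(f: FeatureInfo, clearance_m: float = 0.15) -> bool:
    """Return True if the recognized object could hinder traveling."""
    return f.on_travel_path and f.height_m > clearance_m

def control_mode(f: FeatureInfo) -> str:
    # Continue normal traveling control when no hindrance is possible;
    # otherwise hand off to emergency braking control.
    return "emergency_braking" if may_hinder_travel(f) else "normal"
```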
DETERMINING PERCEPTUAL SPATIAL RELEVANCY OF OBJECTS AND ROAD ACTORS FOR AUTOMATED DRIVING
Disclosed herein are system, method, and computer program product embodiments for determining objects that are kinematically capable, even if non-compliant with rules-of-the-road, of affecting a trajectory of a vehicle. The computing system (e.g., perception system, etc.) of a vehicle may generate a trajectory for the vehicle and a respective trajectory for each object of a plurality of objects within a field of view (FOV) of a sensing device associated with the vehicle. The computing system may identify objects whose trajectories intersect the trajectory for the vehicle, and remove from those objects any whose trajectories either exit the FOV or intersect with the trajectories of other objects within the FOV. The computing system may then select, from the remaining objects with trajectories that intersect the trajectory for the vehicle, objects whose trajectories indicate a collision between the object and the vehicle, and assign a severity to each such collision.
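The staged filtering above can be illustrated with discrete, time-aligned trajectories. This is a minimal sketch under assumed representations (trajectories as lists of (x, y) waypoints sampled at common timesteps; an intersection as two trajectories passing within a threshold distance; severity as combined speed at closest approach):

```python
import math

def intersects(traj_a, traj_b, thresh_m=2.0):
    """True if two time-aligned trajectories come within thresh_m meters."""
    return any(math.dist(p, q) <= thresh_m for p, q in zip(traj_a, traj_b))

def severity(ego_traj, obj_traj):
    """Crude severity proxy: combined per-step speed at closest approach."""
    k = min(range(len(ego_traj)),
            key=lambda n: math.dist(ego_traj[n], obj_traj[n]))
    if k == 0:
        return 0.0
    v_ego = math.dist(ego_traj[k], ego_traj[k - 1])
    v_obj = math.dist(obj_traj[k], obj_traj[k - 1])
    return v_ego + v_obj

def relevant_collisions(ego_traj, obj_trajs, stays_in_fov):
    """obj_trajs: dict id -> trajectory; stays_in_fov: dict id -> bool."""
    # Stage 1: objects whose trajectory intersects the ego trajectory.
    candidates = {i for i, t in obj_trajs.items() if intersects(ego_traj, t)}
    # Stage 2: drop objects that exit the FOV or hit another object.
    remaining = {
        i for i in candidates
        if stays_in_fov[i]
        and not any(j != i and intersects(obj_trajs[i], obj_trajs[j])
                    for j in obj_trajs)
    }
    # Stage 3: assign a collision severity to each remaining object.
    return {i: severity(ego_traj, obj_trajs[i]) for i in remaining}
```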
INWARD/OUTWARD VEHICLE MONITORING FOR REMOTE REPORTING AND IN-CAB WARNING ENHANCEMENTS
Systems and methods are provided for intelligent driving monitoring systems, advanced driver assistance systems, and autonomous driving systems that provide alerts to the driver of a vehicle based on anomalies detected between driver behavior and the environment captured by an outward-facing camera. Various aspects of the driver, which may include direction of sight, point of focus, posture, and gaze, are determined by image processing of the driver's upper visible body, captured by a driver-facing camera in the vehicle. Other aspects of the environment around the vehicle, captured by the multiple cameras in the vehicle, are used to correlate driver behavior and actions with what is happening outside, in order to detect and warn of anomalies, prevent accidents, provide feedback to the driver, and in general provide a safer driving experience.
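One way such an anomaly check could work is comparing the driver's gaze direction against the bearing of a hazard seen by the outward-facing cameras. The angle representation and tolerance below are illustrative assumptions:

```python
import math

def gaze_anomaly(gaze_angle_rad, hazard_angle_rad, tol_rad=math.radians(20)):
    """Warn when the driver's gaze (from the driver-facing camera) does not
    cover the direction of a hazard detected by the outward-facing cameras.
    Angles are bearings in the vehicle frame; tolerance is an assumption."""
    # Smallest signed angular difference, wrapped to [-pi, pi].
    diff = abs((gaze_angle_rad - hazard_angle_rad + math.pi)
               % (2 * math.pi) - math.pi)
    return diff > tol_rad
```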
COOPERATIVE TRAFFIC CONGESTION DETECTION FOR CONNECTED VEHICULAR PLATFORM
Systems and methods are provided to implement cooperative traffic congestion detection, enhancing the accuracy of congestion detection for improved routing and maneuvering of vehicles along a travel route. A vehicle is configured to receive vehicle data from an ad-hoc network of a plurality of vehicles that are communicatively connected (and proximately located). A subset of the plurality of vehicles can be sensor-rich vehicles equipped with ranging sensors (e.g., cameras, LIDAR, radar, ultrasonic sensors), which enable real-time detection of multiple traffic parameters, such as the presence of other vehicles, vehicle speed, vehicle movement, traffic, and the like, within the vicinity along the route. The vehicle employs cooperative traffic congestion detection, fusing data from the plurality of vehicles, including sensor-rich vehicles and legacy vehicles, and applying a learning-based algorithm, such as a machine-learning (ML) algorithm, to generate a real-time and more accurate estimate of traffic congestion.
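The fusion step can be sketched with a fixed weighted average standing in for the learning-based algorithm. The vehicle classes, weights, and jam-speed threshold are illustrative assumptions:

```python
def estimate_congestion(reports, jam_speed_mps=5.0):
    """reports: (vehicle_class, mean_speed_mps) pairs gathered over the
    ad-hoc network. Sensor-rich reports are weighted more heavily than
    legacy reports; the weights are illustrative, not a trained model."""
    weight = {"sensor_rich": 0.7, "legacy": 0.3}
    num = sum(weight[cls] * speed for cls, speed in reports)
    den = sum(weight[cls] for cls, _ in reports)
    fused_speed = num / den
    # Declare congestion when the fused speed falls below the jam threshold.
    return fused_speed, fused_speed < jam_speed_mps
```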
VEHICLE CONTROL SYSTEM, VEHICLE INTEGRATED CONTROL DEVICE, ELECTRONIC CONTROL DEVICE, NETWORK COMMUNICATION DEVICE, VEHICLE CONTROL METHOD AND COMPUTER READABLE MEDIUM
A vehicle control system (500) controls a vehicle on which a plurality of ECUs (30) and a vehicle integrated control device (10) that controls the plurality of ECUs (30) are mounted. The vehicle integrated control device (10) includes a control target value operation unit to calculate a control target value for controlling the plurality of ECUs (30). Further, the vehicle integrated control device (10) includes a prediction control value operation unit to estimate a future state of the vehicle and to calculate a prediction control value for controlling the plurality of ECUs (30). The vehicle integrated control device (10) includes an instruction signal generation unit to generate an instruction signal including an operation instruction and a prediction control instruction. Each of the plurality of ECUs (30) includes an actuator control unit to control an actuator (50) based on the prediction control instruction.
SYSTEMS AND METHODS FOR ESTIMATING CUBOIDS FROM LIDAR, MAP AND IMAGE DATA
Systems and methods for operating a robotic system. The methods comprise: inferring, by a computing device, a first heading distribution for an object from a 3D point cloud; obtaining, by the computing device, a second heading distribution from a vector map; obtaining, by the computing device, a posterior distribution of the heading using the first and second heading distributions; defining, by the computing device, a cuboid on a 3D graph using the posterior distribution; and using the cuboid to facilitate driving-related operations of the robotic system.
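The posterior step can be illustrated with discretized heading distributions: the lidar-derived distribution acts as one factor, the map-derived distribution as the other, and the posterior is their normalized elementwise product. The binning and MAP selection are illustrative assumptions:

```python
import math

def posterior_heading(p_lidar, p_map):
    """Fuse two discrete heading distributions over the same angular bins:
    elementwise product, renormalized -- a discretized stand-in for the
    posterior fusion described above."""
    unnorm = [a * b for a, b in zip(p_lidar, p_map)]
    z = sum(unnorm)
    return [u / z for u in unnorm]

def map_heading(bins_rad, posterior):
    """Heading used for the cuboid: the maximum a posteriori bin."""
    return bins_rad[max(range(len(posterior)), key=posterior.__getitem__)]
```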
Vehicle collision alert system and method for detecting driving hazards
An impairment analysis (“IA”) computer system for alerting a first driver of a first vehicle to a driving hazard posed by a second vehicle operated by a second driver is provided. The IA computer system is associated with the first vehicle, and includes at least one processor in communication with at least one memory device. The at least one processor is programmed to: (i) receive second vehicle data including second driver data and second vehicle condition data, where the second vehicle data is collected by a plurality of sensors included on the first vehicle; (ii) analyze the second vehicle data by applying a baseline model to the second vehicle data; (iii) determine that the second vehicle poses a driving hazard to the first vehicle based upon the analysis; and/or (iv) generate an alert signal based upon the determination that the second vehicle poses a driving hazard to the first vehicle.
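A minimal sketch of the baseline-model comparison in step (ii)/(iii), assuming the baseline is a set of historical samples of one behavior metric and the hazard test is a z-score threshold (both assumptions, not specified by the patent):

```python
from statistics import mean, stdev

def poses_hazard(baseline_samples, observed, z_thresh=3.0):
    """Flag the second vehicle as a hazard when an observed behavior metric
    (e.g. lateral lane deviation in meters, from the first vehicle's
    sensors) departs from the baseline model by more than z_thresh
    standard deviations. Metric and threshold are illustrative."""
    mu, sigma = mean(baseline_samples), stdev(baseline_samples)
    return abs(observed - mu) / sigma > z_thresh
```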
Apparatus and method for controlling backward driving of vehicle
An apparatus for controlling backward driving of a vehicle, including: a driving trajectory generation unit configured to generate a driving trajectory for backward driving of an ego vehicle on a target path, using sensing information acquired while the ego vehicle drives forward along the target path; and a control unit configured to control the backward driving of the ego vehicle on the target path according to the driving trajectory generated by the driving trajectory generation unit, to correct the driving trajectory using driving information of another vehicle that has driven backward on the target path before the ego vehicle when a change on the target path is sensed relative to the forward driving of the ego vehicle, and to control the backward driving of the ego vehicle according to the corrected driving trajectory.
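The generation and correction steps can be sketched as follows. Representing the trajectory as (x, y) waypoints and correcting by linearly blending with the other vehicle's backward trace are illustrative assumptions:

```python
def backward_trajectory(forward_waypoints):
    """Backward-driving trajectory: the waypoints recorded during forward
    driving, traversed in reverse order."""
    return list(reversed(forward_waypoints))

def corrected_trajectory(own, other, change_detected, alpha=0.5):
    """When a change on the target path is sensed, blend the ego trajectory
    with the trace of another vehicle that already reversed along the path.
    The linear blend and the alpha weight are illustrative assumptions."""
    if not change_detected:
        return own
    return [((1 - alpha) * x1 + alpha * x2, (1 - alpha) * y1 + alpha * y2)
            for (x1, y1), (x2, y2) in zip(own, other)]
```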
METHOD AND DEVICE FOR PREDICTING A FUTURE ACTION OF AN OBJECT FOR A DRIVING ASSISTANCE SYSTEM FOR VEHICLE DRIVABLE IN HIGHLY AUTOMATED FASHION
A method for predicting a future action of an object for a driving assistance system for a highly automated mobile vehicle. At least one sensor signal from at least one vehicle sensor of the vehicle is read in, the sensor signal representing at least one piece of kinematic object information concerning the object that is detected by the vehicle sensor at an instantaneous point in time. A planner signal from a planner of the driving assistance system is read in, the planner signal representing at least one piece of semantic information concerning the object or the surroundings of the object at a point in time in the past. The kinematic object information is fused with the semantic information to obtain a fusion signal. A prediction signal is determined using the fusion signal, the prediction signal representing the future action of the object.
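The fusion of kinematic and semantic information can be sketched with a constant-velocity extrapolation gated by a semantic label. The labels, rules, and thresholds are illustrative assumptions, not from the method:

```python
import math

def predict_action(position, velocity, semantic_label, dt=1.0):
    """Fuse the kinematic state (from the sensor signal) with a semantic
    label (from the planner signal, a past point in time) into a predicted
    action and position. Label vocabulary and rules are illustrative."""
    x, y = position
    vx, vy = velocity
    predicted_pos = (x + vx * dt, y + vy * dt)  # constant-velocity kinematics
    speed = math.hypot(vx, vy)
    if speed < 0.2:
        action = "waiting"
    elif semantic_label == "pedestrian_at_crosswalk":
        action = "crossing"
    else:
        action = "following_path"
    return action, predicted_pos
```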
RADAR-BASED DATA FILTERING FOR VISUAL AND LIDAR ODOMETRY
Aspects of the disclosed technology provide solutions for performing odometry and in particular, for performing odometry by filtering moving objects from a scene using sensor data. In some aspects, a process can include steps for receiving a first set of sensor data corresponding with a plurality of objects in a scene, determining one or more moving objects and one or more stationary objects from among the plurality of objects, and receiving a second set of sensor data. In some aspects, the process can further include steps for filtering the second set of sensor data to remove data associated with the one or more moving objects and generating odometry data associated with the filtered second set of sensor data. Systems and machine-readable media are also provided.
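The pipeline above can be sketched in three steps: classify objects as moving using radar radial speeds, filter the second set of sensor points accordingly, and estimate ego motion from the remaining static points. The data layout and thresholds are illustrative assumptions:

```python
def moving_ids(radar_tracks, speed_thresh_mps=0.5):
    """radar_tracks: dict of object id -> radar radial speed (m/s).
    Objects above the threshold are treated as moving; the layout and
    threshold are illustrative assumptions."""
    return {i for i, v in radar_tracks.items() if abs(v) > speed_thresh_mps}

def filter_points(points, point_labels, moving):
    """Second sensor set: drop points associated with moving objects."""
    return [p for p, obj in zip(points, point_labels) if obj not in moving]

def ego_translation(prev_static, curr_static):
    """Crude odometry from the filtered data: ego motion is the negative
    of the mean displacement of matched static points between frames."""
    n = len(prev_static)
    dx = sum(c[0] - p[0] for p, c in zip(prev_static, curr_static)) / n
    dy = sum(c[1] - p[1] for p, c in zip(prev_static, curr_static)) / n
    return (-dx, -dy)
```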