Patent classifications
B60W2420/403
Image-based velocity control for a turning vehicle
An autonomous vehicle control system is provided. The control system may include a plurality of cameras to acquire a plurality of images of an area in a vicinity of a vehicle; and at least one processing device configured to: recognize a curve to be navigated based on map data and vehicle position information; determine an initial target velocity for the vehicle based on at least one characteristic of the curve as reflected in the map data; adjust a velocity of the vehicle to the initial target velocity; determine, based on the plurality of images, observed characteristics of the curve; determine an updated target velocity based on the observed characteristics of the curve; and adjust the velocity of the vehicle to the updated target velocity.
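The two-stage adjustment described above (an initial target velocity from map data, then an updated target velocity from observed curve characteristics) can be sketched as follows. This is a minimal illustration, not the patented method: the lateral-acceleration limit, the use of curve radius as the sole characteristic, and all names are assumptions for demonstration.

```python
import math

# Assumed comfort limit on lateral acceleration (m/s^2); illustrative only.
MAX_LATERAL_ACCEL = 2.0


def target_velocity(radius_m: float) -> float:
    """Speed at which lateral acceleration v^2/r stays within the limit."""
    return math.sqrt(MAX_LATERAL_ACCEL * radius_m)


def curve_speed_pipeline(map_radius_m: float, observed_radius_m: float):
    """Two-stage adjustment: map-based initial target, then image-based update."""
    initial = target_velocity(map_radius_m)       # stage 1: from map data
    updated = target_velocity(observed_radius_m)  # stage 2: from camera images
    return initial, updated
```

If the cameras observe a tighter curve than the map indicates (a smaller radius), the updated target velocity comes out lower than the initial one, which is the refinement the abstract describes.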
Device and method for virtualizing driving environment, and vehicle
A device for virtualizing a driving environment surrounding a first node, which includes: a data acquisition device, configured to acquire position data of the first node, and position data and sensing data of at least one second node, where the at least one second node and the first node are in a first communication network; and a scene construction device, configured to construct a scene virtualizing the driving environment surrounding the first node based on the position data of the first node and the at least one second node, and on the sensing data of the at least one second node. Accordingly, by utilizing position data and sensor data of a node, a scene for virtualizing a driving environment can be constructed in real time for a driver, which improves driving safety.
ENHANCED VEHICLE OPERATION
While operating a vehicle, a candidate marker is detected via first image data from a first image sensor. Upon failing to identify the candidate marker, vehicle exterior lighting is actuated to illuminate the candidate marker. Then the candidate marker is determined to be one of a real marker or a projected marker based on determining whether the candidate marker is detected via second image data from the first image sensor. Upon determining the candidate marker is the real marker, the vehicle is operated based on the real marker.
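The classification step above can be reduced to a small decision rule. This sketch assumes that a real (physical) marker reflects the actuated exterior lighting and so remains detectable in the second image, while a projected marker is assumed to wash out; the function and its return values are illustrative, not from the patent.

```python
def classify_marker(detected_before: bool, detected_after_lighting: bool) -> str:
    """Classify a candidate marker after actuating exterior lighting.

    Assumption: a painted/physical marker reflects the added illumination
    and is still detected; a light-projected marker washes out.
    """
    if not detected_before:
        return "none"        # no candidate marker in the first image
    if detected_after_lighting:
        return "real"        # survives illumination -> physical marker
    return "projected"       # disappears under illumination
```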
CREATING HIGHLIGHT REELS OF USER TRIPS
Systems and methods for generating images from an autonomous vehicle ride. The images can include images from inside the vehicle and outside the vehicle, and can be used to create a highlight reel of the ride. The images can be captured automatically and include images of passengers during the ride. The highlight reel is provided to a user who can choose to share it with others.
System of configuring active lighting to indicate directionality of an autonomous vehicle
Systems, apparatus and methods may be configured to implement actively-controlled light emission from a robotic vehicle. One or more light emitters of the robotic vehicle may be configurable to indicate a direction of travel of the robotic vehicle and/or display information (e.g., a greeting, a notice, a message, a graphic, passenger/customer/client content, vehicle livery, customized livery) using one or more colors of emitted light (e.g., orange for a first direction and purple for a second direction), one or more sequences of emitted light (e.g., a moving image/graphic), or positions of the light emitters on the robotic vehicle (e.g., symmetrically positioned light emitters). The robotic vehicle may not have a front or a back (e.g., a trunk/a hood) and may be configured to travel bi-directionally, in a first direction or a second direction (e.g., opposite the first direction), with the direction of travel being indicated by one or more of the light emitters.
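The color-per-direction mapping given as an example in the abstract (orange for a first direction, purple for a second) can be expressed directly; the function name and dictionary form are illustrative assumptions.

```python
# Illustrative mapping from travel direction to emitter color,
# using the example colors from the abstract.
DIRECTION_COLORS = {"first": "orange", "second": "purple"}


def emitter_color(direction: str) -> str:
    """Return the emitted-light color indicating the direction of travel."""
    return DIRECTION_COLORS[direction]
```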
Vehicle driving assist system with driver attentiveness assessment
A driving assist system for a vehicle includes a driver monitoring system that includes a plurality of sensors disposed in a vehicle and sensing driver hand positions of a driver driving the vehicle. A control includes a processor operable to process data sensed by the sensors to determine the driver hand positions of the driver driving the vehicle. The control, responsive to processing of data sensed by the sensors and at least in part responsive to the determined sensed driver hand positions, is operable to determine a level of attentiveness of the driver. The driving assist system operates to provide driving assistance responsive at least in part to the determined level of attentiveness of the driver.
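One simple way to derive an attentiveness level from sensed hand positions is to count hands detected on the steering wheel. The mapping below is an assumption for illustration only; the patent does not specify how hand positions translate to attentiveness levels.

```python
# Assumed mapping: number of hands detected on the wheel -> attentiveness level.
def attentiveness_level(hands_on_wheel: int) -> str:
    """Return a coarse attentiveness level from sensed hand positions."""
    levels = {2: "attentive", 1: "partially attentive", 0: "inattentive"}
    return levels.get(hands_on_wheel, "unknown")
```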
Automated vehicle actions such as lane departure warning, and associated systems and methods
Mappings of keys to actions can automate various vehicle systems. Some automations can provide lane departure warnings. Keys for lane departure mappings can specify vibration patterns expected when a vehicle drives over lane delineators. These vibration-based mappings can include keys with vibration patterns, e.g., defining vibration frequencies or vibration locations. Keys for emergency light mappings can be based on conditions such as (1) the vehicle being on the road, stopped, not in traffic, and not at a stop signal; (2) components of the vehicle having failed; or (3) weather conditions.
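The key-to-action matching described above can be sketched as comparing a sensed vibration signature against stored keys. The key format (a frequency plus a location), the tolerance, and the action names are illustrative assumptions.

```python
# Hedged sketch of matching a sensed vibration against mapping keys.
# Key format and tolerance are assumptions, not from the patent.

def matches_key(sensed_hz: float, sensed_location: str, key: dict,
                tol_hz: float = 5.0) -> bool:
    """True when the sensed vibration matches a key's expected pattern."""
    return (abs(sensed_hz - key["frequency_hz"]) <= tol_hz
            and sensed_location == key["location"])


def action_for_vibration(sensed_hz: float, sensed_location: str, mappings):
    """Return the action of the first matching key, or None."""
    for key, action in mappings:
        if matches_key(sensed_hz, sensed_location, key):
            return action
    return None
```

For example, a key expecting roughly 40 Hz vibration at the front-left wheel could map to a lane departure warning; a sensed 42 Hz vibration at that location would match within the assumed 5 Hz tolerance.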
NPU IMPLEMENTED FOR ARTIFICIAL NEURAL NETWORKS TO PROCESS FUSION OF HETEROGENEOUS DATA RECEIVED FROM HETEROGENEOUS SENSORS
A neural processing unit (NPU) includes a controller including a scheduler, the controller configured to receive from a compiler a machine code of an artificial neural network (ANN) including a fusion ANN, the machine code including data locality information of the fusion ANN, and receive heterogeneous sensor data from a plurality of sensors corresponding to the fusion ANN; at least one processing element configured to perform fusion operations of the fusion ANN including a convolution operation and at least one special function operation; a special function unit (SFU) configured to perform a special function operation of the fusion ANN; and an on-chip memory configured to store operation data of the fusion ANN, wherein the scheduler is configured to control the at least one processing element and the on-chip memory such that all operations of the fusion ANN are processed in a predetermined sequence according to the data locality information.
AUTONOMOUS VEHICLE WITH PATH PLANNING SYSTEM
A vehicular control system determines a planned path of travel for a vehicle along a traffic lane in which the vehicle is traveling on a road. The system determines a respective target speed for waypoints along the planned path that represents a speed the vehicle should travel when passing through the respective waypoint. The system determines a speed profile for the vehicle to travel at as the vehicle travels along the planned path, with at least two different speeds being based on a difference in target speeds of at least two consecutive respective waypoints of the plurality of waypoints. The system determines an acceleration profile for the vehicle to follow as it changes from one speed to another speed of the speed profile. The system controls the vehicle to maneuver the vehicle along the planned path in accordance with the determined speed and acceleration profiles.
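The relationship between consecutive waypoint target speeds and the acceleration profile can be illustrated with constant-acceleration segments. Uniform waypoint spacing and the kinematic relation v2² = v1² + 2·a·d are assumptions for this sketch, not details from the patent.

```python
# Illustrative derivation of an acceleration profile from per-waypoint
# target speeds, assuming uniform waypoint spacing along the planned path.

def acceleration_profile(target_speeds, spacing_m: float):
    """Constant acceleration over each segment between consecutive waypoints.

    Solves v2^2 = v1^2 + 2*a*d for a over each segment of length d.
    """
    return [(v2 ** 2 - v1 ** 2) / (2.0 * spacing_m)
            for v1, v2 in zip(target_speeds, target_speeds[1:])]
```

For instance, accelerating from 10 m/s to 20 m/s over a 15 m segment requires a = (400 − 100) / 30 = 10 m/s², and the control system would check such values against the vehicle's acceleration limits.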
DETERMINING ESTIMATED STEERING DATA FOR A VEHICLE
Techniques for using ball joint sensor data to determine conditions relevant to a vehicle are described in this disclosure. For example, the ball joint sensor data may be used to determine estimated steering data. The estimated steering data may be directly used to navigate through an environment, such as by the vehicle relying on the estimated steering data when planning, tracking, or executing a driving maneuver. Also, the estimated steering data may be used to verify the reliability of other steering sensor data used to navigate through the environment.
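The verification use case above amounts to a cross-check between the ball-joint-derived estimate and the primary steering sensor. The tolerance threshold and function names below are assumptions for illustration; the patent does not specify how agreement is measured.

```python
# Hypothetical cross-check of a primary steering sensor against the
# ball-joint-derived steering estimate. Tolerance is an assumed threshold.

def steering_sensor_reliable(estimated_angle_deg: float,
                             reported_angle_deg: float,
                             tolerance_deg: float = 2.0) -> bool:
    """Flag the primary steering sensor as reliable only when its reading
    agrees with the ball-joint estimate to within the tolerance."""
    return abs(estimated_angle_deg - reported_angle_deg) <= tolerance_deg
```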