Patent classification: B60W2554/20
IDENTIFICATION OF PLANAR POINTS IN LIDAR POINT CLOUD OBTAINED WITH VEHICLE LIDAR SYSTEM
A system in a vehicle includes a lidar system to transmit incident light and receive reflections from one or more objects as a point cloud. The system also includes processing circuitry to identify feature points within the point cloud, each feature point being either a horizontal feature point reflected from a horizontal surface or a vertical feature point reflected from a vertical surface. The processing circuitry processes the point cloud by obtaining a normal vector for each point of the point cloud. The normal vector includes a first component associated with a first dimension, a second component associated with a second dimension, and a third component associated with a third dimension.
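The abstract does not say how the per-point normal vector is obtained; a common approach is PCA over each point's nearest neighbors, after which the third (vertical) component of the unit normal separates horizontal from vertical surfaces. A minimal sketch, where the k-neighbor PCA method and the 0.9 threshold are assumptions, not the patent's disclosed method:

```python
import numpy as np

def estimate_normal(points, idx, k=10):
    """Estimate the surface normal at points[idx] as the eigenvector of the
    k-nearest-neighbor covariance with the smallest eigenvalue (PCA)."""
    d = np.linalg.norm(points - points[idx], axis=1)
    nbrs = points[np.argsort(d)[:k]]          # k nearest neighbors (incl. self)
    eigvals, eigvecs = np.linalg.eigh(np.cov(nbrs.T))
    return eigvecs[:, 0]                      # eigenvector of smallest eigenvalue

def classify_point(normal, thresh=0.9):
    """Classify by the third (z) component of the unit normal:
    near 1 -> horizontal surface, near 0 -> vertical surface."""
    nz = abs(normal[2]) / np.linalg.norm(normal)
    if nz > thresh:
        return "horizontal"
    if nz < 1.0 - thresh:
        return "vertical"
    return "neither"
```

A flat ground patch yields a normal close to (0, 0, 1) and classifies as horizontal; a wall patch yields a near-zero third component and classifies as vertical.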
ROAD INFORMATION COLLECTION DEVICE
An in-vehicle device includes a vehicle detection unit that detects the behavior of the user's own vehicle while it is traveling; a neighboring vehicle detection unit that detects the behavior of a neighboring vehicle preceding the user's vehicle, a neighboring vehicle following it, or both; a road determination unit that determines the condition of the road on which the user's vehicle is traveling on the basis of the behavior detected by the vehicle detection unit and the behavior detected by the neighboring vehicle detection unit; and a transmission information creation unit and a transmission unit that transmit road information including the determination result of the road determination unit.
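The road determination unit corroborates the own vehicle's behavior with a neighboring vehicle's. A toy sketch of such a decision rule, where the vertical-acceleration signal, the threshold, and the rule itself are all assumptions made for illustration:

```python
def judge_road_condition(own_vertical_acc, neighbor_vertical_acc, threshold=2.0):
    """Hypothetical rule: report a rough road section only when both the own
    vehicle and a neighboring vehicle show a vertical-acceleration spike."""
    own_hit = max(abs(a) for a in own_vertical_acc) > threshold
    nbr_hit = max(abs(a) for a in neighbor_vertical_acc) > threshold
    if own_hit and nbr_hit:
        return "rough"        # both vehicles disturbed -> likely a road defect
    if own_hit:
        return "unconfirmed"  # could be vehicle-specific (e.g., suspension)
    return "normal"
```

Requiring agreement between vehicles is one way to avoid transmitting road information triggered by a single vehicle's idiosyncratic behavior.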
System of configuring active lighting to indicate directionality of an autonomous vehicle
Systems, apparatus and methods may be configured to implement actively-controlled light emission from a robotic vehicle. One or more light emitters of the robotic vehicle may be configured to indicate a direction of travel of the robotic vehicle and/or display information (e.g., a greeting, a notice, a message, a graphic, passenger/customer/client content, vehicle livery, customized livery) using one or more colors of emitted light (e.g., orange for a first direction and purple for a second direction), one or more sequences of emitted light (e.g., a moving image/graphic), or positions of light emitters on the robotic vehicle (e.g., symmetrically positioned light emitters). The robotic vehicle may not have a front or a back (e.g., a trunk/a hood) and may be configured to travel bi-directionally, in a first direction or a second direction (e.g., opposite the first direction), with the direction of travel being indicated by one or more of the light emitters.
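The direction-to-lighting mapping can be sketched as a small configuration table. The emitter identifiers below are invented for illustration; only the orange/purple color example comes from the abstract:

```python
# Hypothetical symmetric emitter layout at the two ends of a
# bi-directional vehicle (identifiers are assumptions).
EMITTERS = {"first": ["end_a_left", "end_a_right"],
            "second": ["end_b_left", "end_b_right"]}
DIRECTION_COLORS = {"first": "orange", "second": "purple"}  # abstract's example

def directional_lighting(direction):
    """Select which emitters to activate and which color to emit so that the
    leading end of the vehicle indicates the current travel direction."""
    return {"active_emitters": EMITTERS[direction],
            "color": DIRECTION_COLORS[direction]}
```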
DRIVER ASSISTANCE SYSTEM AND METHOD
A driver assistance system for an ego vehicle, and a method for such a system, are provided. The system is configured to refine a coarse geolocation measurement based on the detection of static features located in the vicinity of the ego vehicle. The system performs at least one measurement of the visual appearance of each of at least one static feature in the vicinity of the ego vehicle, and from this calculates the position of the ego vehicle relative to the static feature. The real-world position of the static feature is identified, and the relative position is used, in turn, to calculate a static-feature measurement of the vehicle location. The coarse geolocation measurement and the static-feature measurement are combined to form a fine geolocation position. By combining the measurements, a more accurate location of the ego vehicle can be determined.
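The two steps above, deriving a vehicle position from a known static feature and then combining it with the coarse fix, can be sketched as follows. Inverse-variance weighting is one standard way to "combine" two position measurements; the patent does not prescribe the fusion rule, so treat this as an assumption:

```python
def vehicle_from_feature(feature_world, relative_to_feature):
    """Static-feature measurement of the vehicle location: the feature's known
    world position minus the measured vehicle-to-feature offset."""
    return tuple(fw - r for fw, r in zip(feature_world, relative_to_feature))

def fuse_positions(coarse, coarse_var, feature, feature_var):
    """Inverse-variance weighted fusion of the coarse geolocation fix and the
    static-feature-derived fix into a fine geolocation position."""
    w1, w2 = 1.0 / coarse_var, 1.0 / feature_var
    return tuple((w1 * c + w2 * f) / (w1 + w2) for c, f in zip(coarse, feature))
```

With equal variances the fused position is the midpoint; a more confident measurement (smaller variance) pulls the result toward itself.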
UNSTRUCTURED VEHICLE PATH PLANNER
The techniques discussed herein may comprise an autonomous vehicle guidance system that generates a path for controlling an autonomous vehicle based at least in part on a static object map and/or one or more dynamic object maps. The guidance system may identify a path based at least in part on determining a set of nodes and a cost map associated with the static and/or dynamic objects, among other costs, pruning the set of nodes, and creating further nodes from the remaining nodes until a computational or other limit is reached. The path output by the techniques may be associated with a cheapest node of the sets of nodes that were generated.
Method for Determining an Avoidance Path of a Motor Vehicle
A method for determining an avoidance path of a motor vehicle includes the steps of:
- acquiring data relating to an obstacle located in the surroundings of the motor vehicle by means of a detection system,
- determining a final position to be reached according to the position of the obstacle and an initial position of the motor vehicle,
- calculating a theoretical impact position located between the initial position and the final position, and
- developing the avoidance path such that the motor vehicle passes through the initial position and the final position and avoids the theoretical impact position around the outside.
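One way to realize the last step, a path through the initial and final positions that bows around the impact position, is a quadratic Bezier whose control point is pushed laterally away from the impact point. The Bezier form and the clearance parameter are assumptions for illustration:

```python
def avoidance_path(initial, final, impact, clearance=2.0, n=20):
    """Sample a quadratic Bezier from initial to final whose control point is
    offset away from the theoretical impact position, so the curve avoids it
    around the outside."""
    ix, iy = initial; fx, fy = final; px, py = impact
    mx, my = (ix + fx) / 2.0, (iy + fy) / 2.0   # straight-line midpoint
    dx, dy = mx - px, my - py                   # direction away from impact
    norm = (dx * dx + dy * dy) ** 0.5 or 1.0    # guard against zero offset
    cx, cy = mx + clearance * dx / norm, my + clearance * dy / norm
    path = []
    for k in range(n + 1):
        t = k / n
        x = (1 - t) ** 2 * ix + 2 * (1 - t) * t * cx + t ** 2 * fx
        y = (1 - t) ** 2 * iy + 2 * (1 - t) * t * cy + t ** 2 * fy
        path.append((x, y))
    return path
```

The sampled curve starts at the initial position, ends at the final position, and keeps clear of the impact point on the side away from it.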
VEHICLE CONTROL DEVICE
Host vehicle position estimation from stationary objects is achieved while traveling along a travel route, even if the environment around the route changes. A vehicle control device has a processor and a memory and controls traveling of a vehicle. It includes a sensor that acquires surrounding environment information of the vehicle, and a surrounding environment storage unit that extracts stationary objects from the surrounding environment information acquired by the sensor, calculates their positions, and stores the position of the vehicle on the travel route and the positions of the stationary objects in association with each other. Upon receiving a command to start storing the surrounding environment information and the travel route, the surrounding environment storage unit stores three or more stationary objects at each position on the travel route as stationary objects for host vehicle position estimation.
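Storing three or more stationary objects per route position is enough to fix the host vehicle position, for example by trilateration against their stored positions. The least-squares trilateration below is one standard realization; the patent does not fix the estimation method:

```python
import numpy as np

def estimate_position(landmarks, distances):
    """Least-squares trilateration: recover the (x, y) vehicle position from
    three or more stored stationary-object positions and measured ranges.
    Subtracting the first circle equation from the others linearizes the system."""
    (x0, y0), d0 = landmarks[0], distances[0]
    A, b = [], []
    for (xi, yi), di in zip(landmarks[1:], distances[1:]):
        A.append([2 * (xi - x0), 2 * (yi - y0)])
        b.append(d0**2 - di**2 + xi**2 - x0**2 + yi**2 - y0**2)
    sol, *_ = np.linalg.lstsq(np.array(A), np.array(b), rcond=None)
    return tuple(sol)
```

With more than three stored objects the system is overdetermined and the least-squares solution averages out range noise.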
ADAPTIVE CRUISE CONTROL WITH NON-VISUAL CONFIRMATION OF OBSTACLES
A system comprises a computer having a processor and a memory, the memory storing instructions executable by the processor to: access sensor data of a first sensor of a vehicle while an adaptive cruise control feature of the vehicle is active; detect, based on the sensor data of the first sensor, a stationary object located along the vehicle's path of travel, wherein the stationary object is located outside the range of a second sensor of the vehicle; determine the presence of an intersection within a threshold distance of the stationary object along the path of travel; and, responsive to a determination that the stationary object is a stopped vehicle at the intersection, adjust the speed of the vehicle by the adaptive cruise control feature.
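The decision flow can be sketched as a single rule: confirm a long-range stationary detection non-visually, via proximity to a known intersection, before slowing. The function names, threshold, and 50% speed reduction below are illustrative assumptions:

```python
def acc_speed_command(first_sensor_stationary, in_second_sensor_range,
                      intersection_dist_m, threshold_m, current_speed):
    """Reduce speed when the first sensor reports a stationary object that the
    second sensor cannot yet confirm, and a known intersection lies within the
    threshold distance of that object (so it is likely a stopped vehicle)."""
    if (first_sensor_stationary and not in_second_sensor_range
            and intersection_dist_m is not None
            and intersection_dist_m <= threshold_m):
        return current_speed * 0.5  # begin slowing for the likely stopped vehicle
    return current_speed            # otherwise keep the set speed
```

Using map knowledge of the intersection as confirmation avoids braking for stationary objects (signs, parked cars off-path) the second sensor has not yet classified.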
RIDE COMFORT IMPROVEMENT IN DIFFERENT TRAFFIC SCENARIOS FOR AUTONOMOUS VEHICLE
Disclosed are embodiments of motion control operations in various traffic scenarios that consider kinematic factors for trajectory planning. In some embodiments, a method includes: determining a danger rating for at least one object identified in an environment, wherein the danger rating represents a perceived risk associated with the respective object; evaluating a set of hierarchical factors with respect to a traffic scenario, wherein a metric is derived for trajectories of the traffic scenario that quantifies passenger ride comfort based on the danger rating and the set of hierarchical factors; determining a motion control operation in the traffic scenario to increase passenger ride comfort based on the metric; and augmenting a route planner of an autonomous vehicle with motion control operations in different traffic scenarios to increase passenger ride comfort.
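The abstract only states that a comfort metric is derived from the danger rating and the hierarchical factors. A hypothetical form of such a metric, and trajectory selection by maximizing it, might look like this (the weighting scheme is entirely an assumption):

```python
def comfort_metric(danger_rating, factor_scores, weights):
    """Illustrative comfort metric: a weighted sum of hierarchical-factor
    penalties (e.g., jerk, lateral acceleration), scaled by the danger rating;
    higher danger and harsher kinematics yield lower comfort."""
    penalty = sum(w * s for w, s in zip(weights, factor_scores))
    return 1.0 / (1.0 + danger_rating * penalty)

def select_trajectory(trajectories):
    """Pick the candidate trajectory with the highest comfort metric.
    Each entry: (name, danger_rating, factor_scores, weights)."""
    return max(trajectories, key=lambda t: comfort_metric(t[1], t[2], t[3]))[0]
```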
Object sensing device
An object sensing device is configured to sense a nearby object using an ultrasonic sensor. The object sensing device comprises: a distance judgment portion that judges the distance to the nearby object in accordance with received ultrasonic waves that are based on ultrasonic waves transmitted by the ultrasonic sensor; and a notification control portion that performs a predetermined notification operation in accordance with the received ultrasonic waves. The notification control portion performs the predetermined notification operation when the distance judgment portion has continuously judged that the object is within a predetermined close range, and does not perform the predetermined notification operation when the judgment history of the distance to the object indicates an abnormal appearance of the object within the predetermined close range.
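The two conditions, continuous close-range judgments and suppression on abnormal appearance, can be sketched as a small stateful controller. The consecutive-cycle count and the "abnormal appearance" rule (the object was only ever seen far away before jumping into close range) are assumptions for illustration:

```python
class NotificationController:
    """Notify only after N consecutive close-range judgments, and suppress the
    notification when the distance history shows the object appearing inside
    close range abruptly (a likely spurious echo)."""
    def __init__(self, close_range=0.5, consecutive_needed=3):
        self.close_range = close_range
        self.needed = consecutive_needed
        self.history = []           # judged distances, one per measurement cycle

    def update(self, distance):
        self.history.append(distance)
        recent = self.history[-self.needed:]
        if len(recent) < self.needed or any(d > self.close_range for d in recent):
            return False            # not continuously within close range
        prior = self.history[:-self.needed]
        if prior and min(prior) > 2 * self.close_range:
            return False            # abnormal appearance: never approached
        return True                 # perform the notification operation
```

A gradually approaching object triggers the notification; an object that "teleports" from far away into close range is treated as abnormal and suppressed.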