Patent classifications
B60W2420/408
Systems and methods for unmanned positioning and delivery of rental vehicles
A managing apparatus for positioning rental vehicles includes a memory storing instructions and a processor configured to execute the instructions to cause the managing apparatus to access model information and location information for a plurality of autonomous vehicles, receive a request including a delivery location and a chosen model for renting, select an autonomous vehicle of the chosen model from among the plurality of autonomous vehicles based on the model information, the location information, and the delivery location, instruct the selected autonomous vehicle to fully-autonomously or semi-autonomously travel to the delivery location, and instruct the selected autonomous vehicle to switch to manual operation mode at the delivery location for manual operation by a vehicle rental customer.
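The selection step described above can be sketched as follows. This is a minimal illustration only: the fleet records, coordinates, and the planar-distance proxy are all assumed names and values, not part of the patent.

```python
import math

# Hypothetical fleet records: (vehicle_id, model, (lat, lon)).
# All identifiers and positions are illustrative assumptions.
FLEET = [
    ("v1", "sedan", (35.68, 139.76)),
    ("v2", "suv",   (35.70, 139.70)),
    ("v3", "sedan", (35.60, 139.80)),
]

def select_vehicle(fleet, chosen_model, delivery_location):
    """Pick the closest available vehicle of the requested model,
    based on model information, location information, and the
    delivery location from the rental request."""
    candidates = [v for v in fleet if v[1] == chosen_model]
    if not candidates:
        return None

    def dist(vehicle):
        (lat, lon) = vehicle[2]
        (dlat, dlon) = delivery_location
        # Planar approximation; a real system would use road distance.
        return math.hypot(lat - dlat, lon - dlon)

    return min(candidates, key=dist)[0]
```

The selected vehicle would then be instructed to travel autonomously to the delivery location and switch to manual mode on arrival, which is outside the scope of this sketch.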
Monitoring drivers and external environment for vehicles
The present subject matter relates to varying warning intensity based on driving behavior and driver state. Data related to the environment external to a vehicle is fetched, and the driver state and driving behavior are monitored. Based on the fetched and monitored data, an event is determined and a warning is generated for a driver of the vehicle. The intensity of the warning is varied based on the severity of the event, the driver state, and the driving behavior.
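One way the intensity variation could look is sketched below. The input names, the [0, 1] scaling, and the weighting factors are all illustrative assumptions; the abstract does not specify a formula.

```python
def warning_intensity(event_severity, driver_alertness, behavior_risk):
    """Scale a warning level by event severity and driver state.
    All inputs are assumed to lie in [0, 1]: higher alertness lowers
    the intensity, while riskier behavior and more severe events
    raise it.  The 0.5 weights are illustrative assumptions."""
    drowsiness = 1.0 - driver_alertness
    raw = event_severity * (1.0 + 0.5 * drowsiness + 0.5 * behavior_risk)
    return min(1.0, raw)  # clamp to the maximum warning level
```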
System and method for controlling a vehicle steering system
A system for controlling a steering system of a vehicle that is mechanically coupled to the wheels and includes an actuator for applying a force or a torque to the steering system. A force or torque can be superimposed on a force or torque originating from the wheels. The system includes a detection unit disposed on the vehicle and configured to anticipatorily detect at least one surface condition of a surface section located ahead of the vehicle in the direction of vehicle travel and subsequently driven on by the vehicle. The system also includes a data processing unit disposed on the vehicle and connected to and communicating with the detection unit. The data processing unit is configured to generate control signals for controlling the actuator of the steering system based on the detected surface condition.
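A very simple interpretation of the control-signal generation is sketched below: the superimposed torque is derated on low-friction surfaces detected ahead. The surface classes, friction values, and the scaling rule are invented for illustration and are not taken from the patent.

```python
# Illustrative friction estimates per detected surface class
# (assumed values, not from the source).
MU = {"dry": 1.0, "wet": 0.6, "ice": 0.15}

def feedforward_torque(surface, base_torque, mu_nominal=1.0):
    """Scale the torque superimposed by the actuator down on
    low-friction surfaces detected ahead of the vehicle, so the
    steering demand stays within what the road can transmit."""
    mu = MU.get(surface, mu_nominal)  # unknown surface: no derating
    return base_torque * min(1.0, mu / mu_nominal)
```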
REDUNDANT ENVIRONMENT PERCEPTION TRACKING FOR AUTOMATED DRIVING SYSTEMS
Redundant environment perception tracking for automated driving systems. One example embodiment provides an automated driving system for a vehicle, the system including a plurality of sensors, a memory, and an electronic processor. The electronic processor is configured to receive, from the plurality of sensors, environmental information of a common field of view, generate, based on the environmental information, a plurality of hypotheses regarding an object within the common field of view, the plurality of hypotheses including at least one set of hypotheses excluding the environmental information from at least one sensor of a first sensor type, determine, based on a subset of the plurality of hypotheses, an object state of the object, wherein the subset includes the at least one set of hypotheses excluding the environmental information from the at least one sensor, and perform a vehicular maneuver based on the object state that is determined.
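The hypothesis-set construction can be sketched as follows: alongside the full fusion hypothesis, one hypothesis is kept per excluded sensor type, so the object state can still be determined if one modality is faulty. The dictionary layout and the averaging "fusion" are illustrative stand-ins for whatever estimator the system actually uses.

```python
def hypothesis_sets(measurements):
    """measurements: dict mapping sensor type -> measurement for one
    object in the common field of view.  Returns the full set plus
    one subset per sensor type with that type's data excluded."""
    sets = [dict(measurements)]  # hypothesis using all sensors
    for excluded in sorted(measurements):
        sets.append({t: m for t, m in measurements.items()
                     if t != excluded})
    return sets

def fuse(subset):
    """Toy object-state estimate: average of the measurements.
    A real tracker would run a proper filter here."""
    vals = list(subset.values())
    return sum(vals) / len(vals)
```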
METHOD AND DEVICE FOR EVALUATING THE ANGULAR POSITION OF AN OBJECT, AND DRIVER ASSISTANCE SYSTEM
A method for evaluating an angular position of an object recognized on the basis of radar data, the radar data being ascertained by a radar device. The method includes: ascertaining of an intrinsic speed of the radar device; ascertaining a relative speed of the recognized object in relation to the radar device, using the ascertained radar data; ascertaining at least one angular test region using the ascertained intrinsic speed and the ascertained relative speed, the at least one angular test region corresponding to possible stationary objects that have a relative speed that substantially corresponds to the ascertained relative speed; and ascertaining whether an azimuth angle of the recognized object lies in the ascertained angular test region.
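The geometry behind the angular test region: for a stationary object, the radial relative speed measured by a forward-moving radar is v_rel = -v_ego * cos(azimuth). Inverting this gives the azimuth band in which a return with the measured relative speed could come from a stationary object. The tolerance model below is an assumption; the patent does not specify one.

```python
import math

def stationary_test_region(v_ego, v_rel, tol=0.05):
    """Return the (theta_min, theta_max) azimuth band, in radians,
    in which an object with radial relative speed v_rel could be
    stationary given intrinsic speed v_ego, using
    v_rel = -v_ego * cos(theta).  Returns None if no azimuth is
    consistent.  The cosine-space tolerance `tol` is an assumption."""
    c = -v_rel / v_ego
    if abs(c) > 1.0:
        return None  # no stationary object can produce this speed
    lo = math.acos(min(1.0, c + tol))
    hi = math.acos(max(-1.0, c - tol))
    return (lo, hi)

def is_possibly_stationary(azimuth, region):
    """Check whether the recognized object's azimuth lies in the
    ascertained angular test region."""
    return region is not None and region[0] <= abs(azimuth) <= region[1]
```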
METHOD FOR OPERATING A VEHICLE GUIDING SYSTEM WHICH IS DESIGNED TO GUIDE A MOTOR VEHICLE IN A COMPLETELY AUTOMATED MANNER, AND MOTOR VEHICLE
The present disclosure relates to a method for operating a vehicle guiding system of a motor vehicle, said vehicle guiding system being designed to guide the motor vehicle in a completely automated manner. The presence of a traffic officer and/or instruction data which describes a traffic instruction provided by the traffic officer is ascertained by analyzing sensor data of at least one surroundings sensor of the motor vehicle and taken into consideration during the completely automated guidance of the vehicle. At least one radar sensor with a semiconductor chip which acts as a radar transceiver is used as the surroundings sensor, and upon detecting the presence of a traffic officer, the radar sensor is switched from at least one normal operating mode to an additional operating mode provided for detecting limbs of the traffic officer and/or their movement, wherein the sensor data of the radar sensor is analyzed for instruction data which describes the limbs of the traffic officer and/or their movement.
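The mode-switching logic reduces to a small state machine: while a traffic officer is detected, the radar runs in the additional limb-detection mode, otherwise in its normal mode. The class and mode names below are illustrative.

```python
NORMAL_MODE = "normal"
LIMB_DETECTION_MODE = "limb_detection"

class RadarModeController:
    """Switch the radar sensor into the additional operating mode
    while a traffic officer is present, and back to the normal
    operating mode otherwise (sketch; names are assumptions)."""

    def __init__(self):
        self.mode = NORMAL_MODE

    def update(self, officer_detected):
        self.mode = LIMB_DETECTION_MODE if officer_detected else NORMAL_MODE
        return self.mode
```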
METHODS AND SYSTEMS FOR COMPUTER-BASED DETERMINING OF PRESENCE OF DYNAMIC OBJECTS
A method for determining a set of dynamic objects in sensor data representative of a surrounding area of a vehicle having sensors, the method being executed by a server, the server executing a machine learning algorithm (MLA). Sensor data is received, and the MLA generates, based on the sensor data, a set of feature vectors. Vehicle data indicative of a localization of the vehicle is received. The MLA generates, based on the set of feature vectors and the vehicle data, a tensor, the tensor including a grid representation of the surrounding area. The MLA generates a mobility mask indicative of grid cells occupied by at least one potentially moving object in the grid, and a velocity mask indicative of a velocity associated with the at least one potential object in the grid. The MLA determines, based on the mobility mask and the velocity mask, the set of dynamic objects.
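The final combination step can be illustrated without the MLA itself: given a mobility mask and a velocity mask over the same grid, a cell is declared dynamic when it is occupied by a mobile object and its speed exceeds a threshold. The list-of-lists grid layout and the threshold value are illustrative assumptions.

```python
def dynamic_cells(mobility_mask, velocity_mask, v_min=0.5):
    """mobility_mask: rows of booleans, True where a potentially
    moving object occupies the cell.  velocity_mask: rows of
    per-cell speed estimates.  Returns a boolean grid marking the
    dynamic cells.  The v_min threshold is an assumption."""
    return [
        [bool(m) and v > v_min for m, v in zip(mrow, vrow)]
        for mrow, vrow in zip(mobility_mask, velocity_mask)
    ]
```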
REDUNDANCY INFORMATION FOR OBJECT INTERFACE FOR HIGHLY AND FULLY AUTOMATED DRIVING
A system for assessing the reliability of objects for driver assistance or automated driving of a vehicle includes a plurality of sensors that include one or more sensor modalities for providing sensor data for the objects. An electronic tracking unit is configured to receive the sensor data to determine a detection probability (p_D) for each of the plurality of sensors for each of the objects, to determine an existence probability (p_ex) for each of the plurality of sensors for each of the objects, and to provide vectors for each of the objects based on the existence probability (p_ex) for each contributing one of the plurality of sensors for the specific object. The vectors are provided by the electronic tracking unit for display as an object interface on a display device. The vectors are independent from the sensor data from the plurality of sensors.
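The per-object vector can be pictured as one existence-probability entry per sensor in a fixed modality order, with sensors that did not contribute to the object left at zero. The modality order and the zero fill-in are assumptions for illustration.

```python
# Illustrative fixed modality order (an assumption, not from the source).
SENSORS = ["radar", "camera", "lidar"]

def object_vector(per_sensor_p_ex, sensor_order=SENSORS):
    """Build the redundancy vector for one object: the existence
    probability p_ex from each contributing sensor, in a fixed
    order, with 0.0 for non-contributing sensors."""
    return [per_sensor_p_ex.get(s, 0.0) for s in sensor_order]
```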
NEURAL NETWORK DEVICE AND METHOD USING A NEURAL NETWORK FOR SENSOR FUSION
In accordance with an embodiment, a neural network is configured to: process a first grid representing at least a first portion of a field of view of a first sensor; process a second grid representing at least a second portion of a field of view of a second sensor; and fuse the processed first grid with the processed second grid into a fused grid, where the fused grid includes information about the occupancy of the first portion of the field of view of the first sensor and the occupancy of the second portion of the field of view of the second sensor.
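The patent performs the fusion with a neural network; as a non-learned stand-in, the sketch below merges two aligned occupancy-probability grids cell-by-cell in log-odds space, the classical way of combining independent occupancy evidence. It only illustrates what a fused grid contains, not the embodiment's actual network.

```python
import math

def fuse_grids(grid_a, grid_b):
    """Combine two aligned occupancy-probability grids (rows of
    probabilities in (0, 1), one per sensor's field of view) into a
    fused grid by adding log-odds per cell.  Classical stand-in for
    the neural-network fusion described in the source."""
    def logit(p):
        return math.log(p / (1.0 - p))

    def sigmoid(x):
        return 1.0 / (1.0 + math.exp(-x))

    return [
        [sigmoid(logit(a) + logit(b)) for a, b in zip(row_a, row_b)]
        for row_a, row_b in zip(grid_a, grid_b)
    ]
```

Note that a cell with probability 0.5 (no information) leaves the other sensor's estimate unchanged, which is the property that makes log-odds fusion a natural baseline here.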
DRIVING ASSIST SYSTEM
Driving assist control has plural control modes associated with plural scenes on a one-to-one basis. A scene corresponding to driving environment for a vehicle is a subject scene, and a selected control mode is one associated with the subject scene. Plural pieces of scene description information respectively define the plural scenes. Selected scene description information is one defining the subject scene and indicates parameters used in the driving assist control of the selected control mode. A processor executes the driving assist control of the selected control mode based on the parameters indicated by the selected scene description information, and switches the selected control mode by switching the selected scene description information. When the subject scene changes, the processor notifies an occupant of the vehicle of switching of the selected control mode before switching the selected scene description information and the selected control mode.
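The ordering constraint in the abstract (notify the occupant before switching the selected scene description information and control mode) can be sketched as below. The scene names, parameters, and notification text are illustrative assumptions.

```python
# scene -> (control mode, parameters); one-to-one association.
# All names and values here are illustrative assumptions.
SCENES = {
    "highway": ("cruise", {"target_speed": 100}),
    "urban":   ("caution", {"target_speed": 40}),
}

class DrivingAssist:
    """Notify the occupant first, then switch the selected scene
    description information and control mode (sketch)."""

    def __init__(self, scene):
        self.scene = scene
        self.notifications = []

    @property
    def mode(self):
        return SCENES[self.scene][0]

    def on_scene_change(self, new_scene):
        if new_scene != self.scene:
            # Notification precedes the switch, per the abstract.
            self.notifications.append(f"switching to {SCENES[new_scene][0]}")
            self.scene = new_scene
```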