Patent classifications
G01S2013/9319
MODULAR RADAR ELEMENT WIRELESS SYNCHRONIZATION
The present disclosure is directed to a modular radar apparatus and to methods for using a modular radar apparatus. Two or more modules of the radar apparatus may each include a reference signal antenna, such that a synchronization signal generated at a first module may be sent to a second module via the pair of respective reference signal antennas. When the synchronization signal is received by the second module, that signal may be provided to a phase-locked loop (PLL) to synchronize timing of the second module with the first module. The PLL or a multiplier circuit may then generate signals with timing synchronized to the synchronization signal yet at frequencies higher than that of the synchronization signal. This synchronized timing may allow transmitted radar signals to be synchronized in time with received reflected radar signals more accurately because the effects of transmitting synchronization signals over wired interconnects between modules are eliminated.
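As an illustrative sketch of the multiplication step (the function name, edge representation, and frequencies below are assumptions, not from the abstract), a module might derive a higher-frequency clock whose edges stay phase-aligned to the received synchronization signal:

```python
# Hypothetical sketch: deriving a higher-frequency, phase-aligned clock
# from a received synchronization signal, as a PLL/multiplier might.

def synthesize_clock_edges(sync_edges, multiplier):
    """Given rising-edge times of a reference sync signal, return edge
    times of a clock at `multiplier` times the reference rate,
    phase-aligned to the sync edges."""
    out = []
    for t0, t1 in zip(sync_edges, sync_edges[1:]):
        period = (t1 - t0) / multiplier      # subdivide each sync interval
        out.extend(t0 + k * period for k in range(multiplier))
    out.append(sync_edges[-1])               # edge aligned with final sync edge
    return out

# 10 MHz reference edges (times in microseconds), multiplied to 40 MHz
ref = [0.0, 0.1, 0.2]
edges = synthesize_clock_edges(ref, 4)
```

Because every synthesized edge is computed from the sync edges themselves, any drift between modules shows up only within one reference period rather than accumulating.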
Object velocity detection from multi-modal sensor data
Ground truth data may be too sparse to supervise training of a machine-learned (ML) model well enough to achieve sufficient accuracy and recall. For example, in some cases, ground truth data may only be available for every third, tenth, or hundredth frame of raw data. Training an ML model to detect a velocity of an object when ground truth data for training is sparse may comprise training the ML model to predict a future position of the object based at least in part on image, radar, and/or lidar data (e.g., data for which no ground truth may be available). The ML model may be altered based at least in part on a difference between ground truth data associated with a future time and the predicted future position.
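A minimal sketch of the sparse-supervision idea (the function, data layout, and label spacing are illustrative assumptions): the model predicts a future position for every frame, but the training loss is accumulated only on the frames that actually have ground truth.

```python
# Illustrative sketch: penalize predicted future positions only at
# frames where ground truth labels exist (e.g. every third frame).

def sparse_label_loss(predicted_future, ground_truth):
    """Mean squared position error over labeled frames only.
    `ground_truth` maps frame index -> (x, y); other frames are unlabeled."""
    errors = []
    for frame, (px, py) in enumerate(predicted_future):
        if frame in ground_truth:            # skip unlabeled frames
            gx, gy = ground_truth[frame]
            errors.append((px - gx) ** 2 + (py - gy) ** 2)
    return sum(errors) / len(errors)

preds = [(0.0, 0.0), (1.1, 0.0), (2.0, 0.1), (3.2, 0.0)]
labels = {0: (0.0, 0.0), 3: (3.0, 0.0)}      # labels only at frames 0 and 3
loss = sparse_label_loss(preds, labels)
```

The unlabeled intermediate frames still influence training indirectly, because the model must use their sensor data to predict the labeled future position.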
TARGET FOLLOWING METHOD, DEVICE, APPARATUS AND SYSTEM
The present disclosure relates to a target following method, device, apparatus and system. The method includes: acquiring first orientation data sent by a UWB base station arranged on a target following apparatus and a following mode sent by a UWB beacon arranged on a target to be followed; processing, based on the following mode, the first orientation data to obtain second orientation data including a current distance between a target position and the UWB base station, and a second azimuth angle between the line through the target position and the UWB base station and the current direction of movement of the target following apparatus; and comparing the second orientation data with preset orientation data to obtain a comparison result, and controlling the target following apparatus to perform target following according to the comparison result.
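The comparison-and-control step can be sketched as a simple decision rule (the thresholds, action names, and function signature below are illustrative assumptions, not the patent's logic):

```python
# Hypothetical sketch: compare measured distance/azimuth against preset
# orientation data and pick a follow action.

def following_command(distance_m, azimuth_deg,
                      target_distance_m=1.5, max_azimuth_deg=10.0):
    """Return a follow action from the second orientation data.
    Presets (target distance, azimuth tolerance) are illustrative."""
    if azimuth_deg > max_azimuth_deg:        # target drifted right
        return "turn_right"
    if azimuth_deg < -max_azimuth_deg:       # target drifted left
        return "turn_left"
    if distance_m > target_distance_m:       # target pulling ahead
        return "advance"
    return "hold"
```

A real controller would blend these into continuous steering and speed commands, but the structure — measured orientation data versus preset orientation data — is the same.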
Autonomous driving system
An autonomous driving system includes a target object position recognition unit configured to recognize a target object position detected by a vehicle-mounted sensor based on map information in a map database, a vehicle position recognition unit configured to recognize a vehicle position, a relative-relationship-on-map acquisition unit configured to acquire a relative-relationship-on-map between the target object and the vehicle based on the target object position and the vehicle position on the map, a detected-relative-relationship acquisition unit configured to acquire a detected-relative-relationship between the target object detected by the sensor and the vehicle based on a result of detection performed by the sensor, a map accuracy evaluation unit configured to evaluate map accuracy of the map information based on the relative-relationship-on-map and the detected-relative-relationship, and an autonomous driving permission unit configured to permit an autonomous driving control using the map information based on the result of evaluation of the map accuracy.
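The map-accuracy evaluation can be illustrated with a small sketch (the function name, tolerance, and 2-D simplification are assumptions): compute the relative position of the target object from the map, compare it with the sensor-detected relative position, and permit autonomous driving only if they agree.

```python
import math

# Illustrative sketch: evaluate map accuracy by comparing the
# relative-relationship-on-map with the detected-relative-relationship.

def map_accuracy_ok(target_on_map, vehicle_on_map, detected_offset, tol=0.5):
    """Return True if the map-based relative position of the target object
    agrees with the sensor-detected relative position within `tol` metres
    (tolerance is illustrative)."""
    rel_on_map = (target_on_map[0] - vehicle_on_map[0],
                  target_on_map[1] - vehicle_on_map[1])
    err = math.hypot(rel_on_map[0] - detected_offset[0],
                     rel_on_map[1] - detected_offset[1])
    return err <= tol
```

In this framing the target object acts as a calibration landmark: a large discrepancy suggests the map (or the localization) is stale, so autonomous control using that map should not be permitted.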
Self-correcting vehicle localization
A computer is programmed to determine a localization of a first vehicle, including location coordinates and an orientation of the first vehicle, based on first vehicle sensor data, and to wirelessly receive localizations of respective second vehicles, wherein a first vehicle field of view at least partially overlaps respective fields of view of each of the second vehicles. The computer is programmed to determine pair-wise localizations for respective pairs of the first vehicle and one of the second vehicles, wherein each of the pair-wise localizations defines a localization of the first vehicle relative to a global coordinate system based on (a) a relative localization of the first vehicle with reference to the respective second vehicle and (b) the second vehicle's localization relative to the global coordinate system, and to determine an adjusted localization for the first vehicle that has a minimized sum of distances to the pair-wise localizations.
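Minimizing the sum of Euclidean distances to a set of points is the geometric-median problem; one standard way to solve it is Weiszfeld's iteration. The sketch below (positions only, orientation omitted, names illustrative — the patent does not specify this solver) finds the adjusted localization from several pair-wise estimates:

```python
import math

# Illustrative sketch: Weiszfeld iteration to find the point minimizing
# the sum of Euclidean distances to the pair-wise localization estimates.

def adjusted_localization(pairwise, iters=100):
    """Return (x, y) minimizing sum of distances to the estimates."""
    # Start from the centroid of the estimates.
    x = sum(p[0] for p in pairwise) / len(pairwise)
    y = sum(p[1] for p in pairwise) / len(pairwise)
    for _ in range(iters):
        wsum = wx = wy = 0.0
        for px, py in pairwise:
            d = math.hypot(x - px, y - py)
            if d < 1e-12:                  # coincides with an estimate
                return (px, py)
            wsum += 1.0 / d                # inverse-distance weights
            wx += px / d
            wy += py / d
        x, y = wx / wsum, wy / wsum
    return (x, y)

# Four pair-wise estimates placed symmetrically around (1, 0)
pairwise = [(0.0, 0.0), (2.0, 0.0), (1.0, 1.0), (1.0, -1.0)]
x, y = adjusted_localization(pairwise)
```

Unlike averaging (which minimizes the sum of *squared* distances), the geometric median is robust: one second vehicle broadcasting a badly wrong localization pulls the result far less.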
Vehicle and controlling method thereof
A vehicle includes a communicator that is mounted on the vehicle to perform wireless communication with a server, and a controller that operates the communicator to transmit an accident reception request signal and image data acquired by another vehicle to the server when the vehicle has an accident with an accident target vehicle. The controller operates the communicator to receive a fault ratio from the server when the server generates fault ratio data between the vehicle and the accident target vehicle based on the image data.
ADVANCED DRIVER ASSISTANCE SYSTEM, AND VEHICLE HAVING THE SAME
Provided are an advanced driver assistance system (ADAS) and a vehicle (a first vehicle) having the same. The ADAS includes: a communicator configured to communicate with a plurality of other vehicles; an obstacle detector configured to detect an obstacle in the surroundings and output obstacle information about the detected obstacle; and a controller configured to acquire distance information about a distance to a second vehicle travelling in the surroundings of the first vehicle, among the detected obstacles, based on the obstacle information detected by the obstacle detector during a cruise control mode, acquire travel information and position information of a third vehicle travelling in the surroundings of the second vehicle based on information received through the communicator, and control acceleration and deceleration based on the distance information with respect to the second vehicle, the travel information of the third vehicle, and the position information of the third vehicle.
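A toy sketch of the control decision (thresholds, names, and the discrete actions are illustrative assumptions): the gap to the second vehicle comes from the obstacle detector, while the third vehicle's state arrives over V2V communication and lets the controller react before the second vehicle brakes.

```python
# Hypothetical sketch: cruise-control decision using both the detected
# gap to the second vehicle and communicated third-vehicle state.

def cruise_command(gap_to_second_m, desired_gap_m, third_vehicle_braking):
    """Pick accelerate/decelerate/maintain (illustrative policy)."""
    if third_vehicle_braking or gap_to_second_m < desired_gap_m:
        return "decelerate"                 # pre-emptive or gap-closing brake
    if gap_to_second_m > 1.5 * desired_gap_m:
        return "accelerate"                 # close an overly large gap
    return "maintain"
```

The point of the extra input is anticipation: if the third vehicle is braking, the system decelerates even while the measured gap to the second vehicle still looks safe.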
VEHICLE CONTROLLER
A route information storage function having decreased data storage requirements is provided in a vehicle controller that includes a processor, a first storage unit, and a second storage unit and that stores route information indicating a route to a target point. The vehicle controller includes a traveling state acquiring unit that acquires route information on a vehicle, a short-term storage information processing unit that stores the route information in the first storage unit as short-term storage information, the route information being acquired by the traveling state acquiring unit while the vehicle is traveling, and a long-term storage information processing unit that, after the vehicle has reached the target point, determines long-term storage information from the short-term storage information stored in the first storage unit, the long-term storage information processing unit storing the determined long-term storage information in the second storage unit.
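One plausible way such a consolidation step could reduce storage (the spacing rule and names below are assumptions, not the patent's method) is to keep only sufficiently spaced waypoints from the short-term record once the target is reached:

```python
import math

# Illustrative sketch: after reaching the target point, distill the
# densely sampled short-term route into sparse long-term waypoints.

def consolidate_route(short_term, min_spacing_m=5.0):
    """Keep only waypoints at least `min_spacing_m` apart
    (spacing is illustrative)."""
    long_term = [short_term[0]]
    for pt in short_term[1:]:
        last = long_term[-1]
        if math.hypot(pt[0] - last[0], pt[1] - last[1]) >= min_spacing_m:
            long_term.append(pt)
    return long_term

route = [(0, 0), (1, 0), (6, 0), (7, 0), (13, 0)]  # short-term samples
kept = consolidate_route(route)
```

The short-term store can then be discarded, so only the distilled long-term route occupies persistent storage.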
Sensor Fusion for Object-Avoidance Detection
This document describes techniques, apparatuses, and systems for sensor fusion for object-avoidance detection, including stationary-object height estimation. A sensor fusion system may include a two-stage pipeline. In the first stage, time-series radar data passes through a detection model to produce radar range detections. In the second stage, based on the radar range detections and camera detections, an estimation model detects an over-drivable condition associated with stationary objects in a travel path of a vehicle. By projecting radar range detections onto pixels of an image, a histogram tracker can be used to discern pixel-based dimensions of stationary objects and track them across frames. With depth information, highly accurate pixel-based width and height estimates can be made; after applying over-drivability thresholds to these estimates, a vehicle can quickly and safely make over-drivability decisions about objects in the road.
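The depth-to-metric step can be illustrated with a pinhole-camera sketch (the function, focal length, and height threshold are assumptions for illustration): radar supplies the depth, the histogram tracker supplies the pixel height, and the threshold turns the metric height into an over-drivability decision.

```python
# Illustrative sketch: convert a tracked pixel height to metres using
# radar depth (pinhole model), then apply an over-drivability threshold.

def is_over_drivable(pixel_height, depth_m, focal_px, max_height_m=0.15):
    """Return True if the object's estimated height is low enough to
    drive over (threshold is illustrative)."""
    height_m = pixel_height * depth_m / focal_px   # similar triangles
    return height_m <= max_height_m
```

For example, an object 10 px tall at 12 m depth with a 1000 px focal length is about 0.12 m high, below a 0.15 m threshold, while 30 px at the same depth is about 0.36 m and must be avoided.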
METHODS AND SYSTEMS FOR TRACKING A MOVER'S LANE OVER TIME
Systems and methods for assigning a lane to an object in an environment of an autonomous vehicle are disclosed. The methods include assigning an instantaneous probability to each of a plurality of lanes in the environment based on a current state of the object, generating a transition matrix for each of the plurality of lanes, and identifying the lane in which the object is moving at the current time t based on the instantaneous probability and the transition matrix. The instantaneous probability is a measure of likelihood that the object is in that lane at a current time. The transition matrix encodes one or more probabilities that the object transitioned either into that lane or out of that lane at the current time.
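Combining an instantaneous per-lane probability with a transition matrix is structurally a recursive Bayes-filter update; the sketch below shows that standard form (this is a generic formulation, not necessarily the patent's exact math, and all names are illustrative):

```python
# Illustrative sketch: one Bayes-filter update of the lane belief.
# transition[j][i] = probability the object moved from lane j to lane i.

def update_lane_belief(prior, transition, instantaneous):
    """Combine the previous belief, transition probabilities, and the
    instantaneous per-lane probability, then normalize."""
    n = len(prior)
    predicted = [sum(transition[j][i] * prior[j] for j in range(n))
                 for i in range(n)]                 # propagate through transitions
    post = [instantaneous[i] * predicted[i] for i in range(n)]
    z = sum(post)
    return [p / z for p in post]                    # normalize to a distribution

# Two lanes: mostly lane-keeping transitions, observation favors lane 0
belief = update_lane_belief([0.7, 0.3],
                            [[0.9, 0.1], [0.1, 0.9]],
                            [0.8, 0.2])
```

Because the transition matrix penalizes implausible lane hops, the tracked lane assignment stays stable across frames even when a single frame's instantaneous probability is noisy.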