Patent classifications
G01S17/006
METHOD AND SYSTEM FOR DETERMINING CORRECTNESS OF LIDAR SENSOR DATA USED FOR LOCALIZING AUTONOMOUS VEHICLE
Disclosed herein are a method and system for determining the correctness of Lidar sensor data used for localizing an autonomous vehicle. The system identifies one or more Regions of Interest (ROIs) in the Field of View (FOV) of the Lidar sensors of the autonomous vehicle along a navigation path, each ROI including one or more objects. For each ROI, the system obtains Lidar sensor data comprising one or more reflection points corresponding to the one or more objects, and forms one or more clusters. For each ROI, the system then identifies a distance value between the one or more clusters, projected onto a 2D map of the environment, and the corresponding navigation-map obstacle points. The system compares the distance value between the one or more clusters and the obstacle points, and on that basis determines the correctness of the Lidar sensor data. In this manner, the present disclosure provides a mechanism to detect the correctness of Lidar sensor data for navigation in real time.
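The cluster-versus-map comparison described above can be sketched as follows. This is an illustrative toy, not the patented implementation: the coarse grid clustering, the centroid summary, and the `max_dist` threshold are all assumptions.

```python
# Illustrative sketch: for each ROI, cluster 2D-projected Lidar reflection
# points and compare each cluster centroid against the nearest
# navigation-map obstacle point.
import math

def cluster_points(points, cell=1.0):
    """Group 2D points into clusters by snapping them to a coarse grid."""
    clusters = {}
    for x, y in points:
        key = (round(x / cell), round(y / cell))
        clusters.setdefault(key, []).append((x, y))
    return list(clusters.values())

def centroid(cluster):
    n = len(cluster)
    return (sum(p[0] for p in cluster) / n, sum(p[1] for p in cluster) / n)

def lidar_data_is_correct(points, obstacle_points, max_dist=0.5):
    """Deem the Lidar data correct when every cluster centroid lies within
    max_dist metres of some navigation-map obstacle point."""
    for cl in cluster_points(points):
        cx, cy = centroid(cl)
        d = min(math.hypot(cx - ox, cy - oy) for ox, oy in obstacle_points)
        if d > max_dist:
            return False
    return True
```

For example, reflection points near a mapped obstacle at the origin pass the check, while a cluster far from every obstacle point fails it.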
REMOTE SENSING INVERSION METHOD, SYSTEM, DEVICE AND COMPUTER READABLE STORAGE MEDIUM FOR AEROSOL COMPONENT DISTRIBUTION
According to the technical solution of the present invention, a vertical mass concentration curve of the fine aerosol component distribution can be obtained from coordinated remote sensing measurements by a ground-based Lidar and a solar photometer. Specifically, an internal mixing model and a normalized volume distribution model are constructed according to the light absorptivity and water solubility of the aerosol components, and profiles of inorganic salt components, black carbon components, water-soluble organic matter, water-insoluble organic matter, fine aerosol water content, and the like are separated out to obtain the corresponding vertical distributions. This provides a new line of thought and a development direction for remote sensing inversion with Lidar and similar instruments, while also promoting the development of aerosol component spectra. Comparison against on-site observation data and reanalysis data shows that the vertical distributions of aerosol components obtained by this technical solution are reasonable, and the approach has value for wider adoption.
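The generic shape of such a component separation can be illustrated as a per-altitude linear unmixing. The component names and optical signature coefficients below are invented placeholders, and a least-squares solve stands in for the patent's internal mixing and normalized volume distribution models.

```python
import numpy as np

# Columns: assumed per-unit-mass optical signatures of two placeholder
# components, inorganic salt and black carbon.
SIGNATURES = np.array([
    [3.0, 0.8],   # scattering coefficients (salt, black carbon)
    [0.1, 7.5],   # absorption coefficients (salt, black carbon)
])

def unmix_profile(observations):
    """observations: (n_altitudes, 2) optical measurements per altitude.
    Returns (n_altitudes, 2) non-negative component mass concentrations."""
    conc, *_ = np.linalg.lstsq(SIGNATURES, observations.T, rcond=None)
    return np.clip(conc.T, 0.0, None)  # a real inversion would enforce NNLS
```

Stacking the per-altitude solutions yields the vertical concentration profile of each component.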
METHOD, APPARATUS AND STORAGE MEDIUM FOR MULTI-TARGET MULTI-CAMERA TRACKING
The present disclosure relates to a method, an apparatus and a storage medium for multi-target multi-camera tracking. According to an embodiment of the present disclosure, the method comprises: determining an overall local target trajectory set including a local target trajectory set of each camera by performing single-camera multi-target tracking on a corresponding image sequence provided by each camera of a plurality of cameras; and determining a global target trajectory set for the plurality of cameras by performing multi-camera multi-target matching on the overall local target trajectory set; wherein determining the global target trajectory set comprises: determining a cluster matched global trajectory set by clustering local target trajectories; determining a cost-minimum path set by implementing a cost-minimum path algorithm on a directed graph; and merging corresponding trajectories in the cluster matched global trajectory set based on the cost-minimum path set.
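The second stage described above can be sketched in miniature: trajectory fragments are nodes of a directed graph, each edge carries the cost of linking one fragment to a later one, and a minimum-cost path selects which fragments to merge into one global trajectory. The costs, node names, and toy graph below are assumptions, not the patent's actual formulation.

```python
# Dijkstra over a directed graph given as {u: [(v, cost), ...]}, used here
# as a stand-in for the cost-minimum path algorithm on trajectory fragments.
import heapq

def min_cost_path(edges, start, goal):
    dist, prev = {start: 0.0}, {}
    pq = [(0.0, start)]
    while pq:
        d, u = heapq.heappop(pq)
        if u == goal:
            break
        if d > dist.get(u, float("inf")):
            continue
        for v, c in edges.get(u, []):
            nd = d + c
            if nd < dist.get(v, float("inf")):
                dist[v], prev[v] = nd, u
                heapq.heappush(pq, (nd, v))
    path, node = [goal], goal
    while node != start:
        node = prev[node]
        path.append(node)
    return list(reversed(path)), dist[goal]
```

For example, with fragments `A`, `B`, `C` and link costs `A→B` = 1, `B→C` = 1, `A→C` = 4, the minimum-cost path `A→B→C` tells the merger to chain all three fragments rather than jump directly from `A` to `C`.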
Systems and Methods for Generating Synthetic Sensor Data via Machine Learning
The present disclosure provides systems and methods that combine physics-based systems with machine learning to generate synthetic LiDAR data that accurately mimics a real-world LiDAR sensor system. In particular, aspects of the present disclosure combine physics-based rendering with machine-learned models such as deep neural networks to simulate both the geometry and intensity of the LiDAR sensor. As one example, a physics-based ray casting approach can be used on a three-dimensional map of an environment to generate an initial three-dimensional point cloud that mimics LiDAR data. According to an aspect of the present disclosure, a machine-learned model can predict one or more dropout probabilities for one or more of the points in the initial three-dimensional point cloud, thereby generating an adjusted three-dimensional point cloud which more realistically simulates real-world LiDAR data.
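The dropout-adjustment step can be sketched as follows, with the machine-learned model stubbed out by a given array of per-point probabilities; the function name and sampling scheme are assumptions.

```python
import numpy as np

def apply_dropout(points, dropout_probs, rng=None):
    """points: (N, 3) ray-cast point cloud; dropout_probs: (N,) values in
    [0, 1] predicted by a learned model (supplied externally here).
    Returns the adjusted point cloud with stochastically dropped points
    removed, to better mimic real LiDAR returns."""
    rng = rng or np.random.default_rng(0)
    keep = rng.random(len(points)) >= dropout_probs
    return points[keep]
```

Points with dropout probability 1 are always removed and points with probability 0 are always kept; intermediate values thin the cloud stochastically.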
SYSTEMS AND METHODS FOR LOCALIZATION
Systems and methods for localization are provided. In one aspect, a LIDAR scan is captured from above to generate a point cloud. One or more locations may be sampled in the point cloud and LIDAR scans may be simulated at each location. The sampled locations and associated simulated LIDAR scans may be used to train a regressor to localize vehicles in the environment that are at poses different from the pose from which the LIDAR point cloud was captured. In one aspect, a mapping UAV systematically scans an environment with a camera to generate a plurality of map images. The map images are stitched together into an orthographic image. A runtime UAV captures one or more runtime images of the environment with a camera. Feature matching is performed between the runtime images and the orthographic image for localization. In one aspect, a first machine learning model is trained to transform a camera image into a LIDAR image and a second machine learning model is trained to estimate a pose based on a LIDAR image. A runtime image may be input to the first machine learning model to generate a simulated LIDAR scan. The simulated LIDAR scan may be input to the second machine learning model to estimate a pose, which localizes the vehicle.
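The first aspect above, training a regressor on simulated scans at sampled locations, can be sketched with toy stand-ins: a "scan" is the sorted set of distances to fixed landmarks, and a 1-nearest-neighbour lookup plays the role of the regressor. All names and the scan model are assumptions.

```python
import numpy as np

def simulate_scan(landmarks, pose):
    """Toy 'LIDAR scan': sorted distances from a 2D pose to fixed landmarks."""
    return np.sort(np.linalg.norm(landmarks - pose, axis=1))

class ScanRegressor:
    """1-NN regressor: returns the training pose whose simulated scan is
    closest (L2) to the query scan."""
    def fit(self, scans, poses):
        self.scans, self.poses = np.asarray(scans), np.asarray(poses)
        return self
    def predict(self, scan):
        i = np.argmin(np.linalg.norm(self.scans - scan, axis=1))
        return self.poses[i]
```

At runtime a real scan (or, per the third aspect, a camera image transformed into a simulated scan) would be fed to `predict` to localize the vehicle.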
Transposition of a measurement of a radar cross-section from an RF-domain to an optical domain
An optical modality configured to simulate measurements of the radar cross-section of targets that are dimensioned to be conventionally measured in the RF portion of the electromagnetic spectrum, with sub-micron accuracy. A corresponding compact optical system, with a footprint comparable to a tabletop, employs an optical interferometric time-of-flight approach to reproduce, on a substantially shorter time scale, radar-ranging measurements ordinarily pertaining to a range of frequencies at least 10^3 times lower than those employed in the conventional RF-based measurement.
METHOD FOR LIGHT RANGING SYSTEM CONTROL IN A VEHICLE
A method is provided for controlling a light ranging system mounted on a vehicle, the system having receiving means for receiving external data and a plurality of sensors. The light ranging system is configured by a plurality of configurable parameters and comprises storage means for storing a plurality of predetermined operational schemes. The method comprises: collecting, via the receiving means and the plurality of sensors, data indicative of the vehicle's operation and/or of the conditions in which the vehicle is operating; determining, based on the collected data, an operational scheme among the plurality of predetermined operational schemes for operating the light ranging system; deriving one or more control instructions to set at least one configurable parameter from a stored parameter table corresponding to the operational scheme; and applying the one or more control instructions to the light ranging system.
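The scheme selection and parameter derivation can be sketched as a table lookup. The scheme names, condition fields, and parameter values below are invented placeholders for whatever the storage means actually holds.

```python
# Hypothetical stored parameter tables, one per predetermined operational
# scheme of the light ranging system.
SCHEME_TABLES = {
    "highway": {"frame_rate_hz": 20, "max_range_m": 250, "power_level": 3},
    "urban":   {"frame_rate_hz": 10, "max_range_m": 100, "power_level": 2},
    "fog":     {"frame_rate_hz": 10, "max_range_m": 60,  "power_level": 4},
}

def determine_scheme(data):
    """Pick an operational scheme from collected vehicle/condition data."""
    if data.get("visibility_m", 1000) < 150:
        return "fog"
    return "highway" if data.get("speed_kmh", 0) > 90 else "urban"

def derive_control_instructions(data):
    """Derive (parameter, value) control instructions from the stored
    parameter table corresponding to the determined scheme."""
    return list(SCHEME_TABLES[determine_scheme(data)].items())
```

Each returned pair would then be applied to the light ranging system to set the corresponding configurable parameter.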
LIDAR System With Reduced Speckle Sensitivity
Multiple LIDAR output signals are generated and are concurrently directed to the same sample region in a field of view. The LIDAR output signals have one or more optical diversities selected from a group consisting of wavelength diversity, polarization diversity, and diversity of an angle of incidence of the LIDAR output signal relative to the sample region.
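The statistical principle behind this is standard speckle averaging: intensities from N mutually decorrelated channels (wavelength, polarization, or angle diversity) average to an estimate whose speckle contrast shrinks roughly as 1/√N. A toy numerical check, using the exponential intensity statistics of fully developed speckle:

```python
import numpy as np

rng = np.random.default_rng(42)
n_channels, n_samples = 8, 100_000

# Fully developed speckle: per-channel intensity is exponentially distributed,
# so a single channel has speckle contrast (std/mean) of about 1.
single = rng.exponential(scale=1.0, size=n_samples)
averaged = rng.exponential(scale=1.0, size=(n_channels, n_samples)).mean(axis=0)

contrast_single = single.std() / single.mean()   # about 1.0
contrast_avg = averaged.std() / averaged.mean()  # about 1/sqrt(8)
```

With 8 diverse channels the contrast drops to roughly 0.35, which is the sensitivity reduction the diversity scheme is after.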
Adaptive Ladar Receiver Control Using Spatial Index of Prior Ladar Return Data
Disclosed herein are examples of ladar systems and methods in which data about a plurality of ladar returns from prior ladar pulse shots is stored in a spatial index that associates the ladar return data with the corresponding locations in a coordinate space to which that data pertains. This spatial index can then be accessed by a processor to retrieve ladar return data for locations in the coordinate space that are near a range point to be targeted by the ladar system with a new ladar pulse shot. This nearby prior ladar return data can then be analyzed by the ladar system to help define a control parameter for use by the ladar receiver with respect to the new ladar pulse shot.
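The spatial index can be sketched as a coarse grid hash: prior returns are binned by coordinates, returns near a targeted range point are retrieved from neighbouring cells, and a summary of them feeds a receiver control parameter. The gain formula here is a hypothetical placeholder, not the patent's actual control law.

```python
from collections import defaultdict

class ReturnIndex:
    """Grid-hash spatial index over prior ladar return intensities."""
    def __init__(self, cell=2.0):
        self.cell = cell
        self.grid = defaultdict(list)

    def _key(self, x, y):
        return (int(x // self.cell), int(y // self.cell))

    def add(self, x, y, intensity):
        self.grid[self._key(x, y)].append(intensity)

    def nearby(self, x, y):
        """Return prior intensities in the cell containing (x, y) and its
        eight neighbours."""
        kx, ky = self._key(x, y)
        out = []
        for dx in (-1, 0, 1):
            for dy in (-1, 0, 1):
                out.extend(self.grid.get((kx + dx, ky + dy), []))
        return out

def receiver_gain(index, x, y, default=1.0):
    """Hypothetical control parameter: lower the receiver gain when nearby
    prior returns were bright, keep the default when none exist."""
    prior = index.nearby(x, y)
    return default if not prior else default / (1.0 + sum(prior) / len(prior))
```

Before each new pulse shot, the targeted range point is looked up in the index and the derived parameter is handed to the receiver.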
DISTANCE TO OBSTACLE DETECTION IN AUTONOMOUS MACHINE APPLICATIONS
In various examples, a deep neural network (DNN) is trained to accurately predict, in deployment, distances to objects and obstacles using image data alone. The DNN may be trained with ground truth data that is generated and encoded using sensor data from any number of depth-predicting sensors, such as, without limitation, RADAR sensors, LIDAR sensors, and/or SONAR sensors. Camera adaptation algorithms may be used in various embodiments to adapt the DNN for use with image data generated by cameras with varying parameters, such as varying fields of view. In some examples, a post-processing safety bounds operation may be executed on the predictions of the DNN to ensure that the predictions fall within a safety-permissible range.
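The safety bounds post-processing amounts to clamping each DNN prediction into its permissible range. A minimal sketch, with the per-detection bounds assumed to come from elsewhere (the abstract does not say how they are derived):

```python
def apply_safety_bounds(predictions, bounds):
    """predictions: list of DNN-predicted distances (m); bounds: list of
    (min_dist, max_dist) safety-permissible ranges, one per prediction.
    Returns the predictions clamped into their respective ranges."""
    return [max(lo, min(p, hi)) for p, (lo, hi) in zip(predictions, bounds)]
```

A prediction already inside its range passes through unchanged; out-of-range predictions are pulled to the nearest bound.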