G01S7/417

ASSOCIATION OF CAMERA IMAGES AND RADAR DATA IN AUTONOMOUS VEHICLE APPLICATIONS
20230038842 · 2023-02-09

The described aspects and implementations enable fast and accurate object identification in autonomous vehicle (AV) applications by combining radar data with camera images. In one implementation, disclosed is a method and a system to perform the method that includes obtaining a radar image of a first hypothetical object in an environment of the AV, obtaining a camera image of a second hypothetical object in the environment of the AV, and processing the radar image and the camera image using one or more machine-learning models (MLMs) to obtain a prediction measure representing a likelihood that the first hypothetical object and the second hypothetical object correspond to the same object in the environment of the AV.
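As a rough sketch of what such a prediction measure looks like (not the claimed MLM, whose architecture the abstract does not specify), cosine similarity between per-sensor feature vectors can be squashed through a logistic function into a (0, 1) likelihood that the two detections are the same object:

```python
import math

def association_score(radar_feat, camera_feat):
    """Toy stand-in for the learned prediction measure: cosine
    similarity of per-sensor feature vectors squashed to (0, 1)."""
    dot = sum(r * c for r, c in zip(radar_feat, camera_feat))
    nr = math.sqrt(sum(r * r for r in radar_feat))
    nc = math.sqrt(sum(c * c for c in camera_feat))
    cos = dot / (nr * nc)
    # Logistic squashing; the slope 5.0 is illustrative only.
    return 1.0 / (1.0 + math.exp(-5.0 * cos))
```

In practice both feature vectors would come from learned per-modality encoders; here they are assumed given.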

SYSTEM AND METHOD FOR EARLY DIAGNOSTICS AND PROGNOSTICS OF MILD COGNITIVE IMPAIRMENT USING HYBRID MACHINE LEARNING

A system and method for predicting mild cognitive impairment (MCI) related diagnosis and prognosis utilizing hybrid machine learning. More specifically, the system and method produce predictions of MCI conversion to dementia and the related prognosis. Using available medical imaging and non-imaging data, a diagnosis and prognosis model is trained using transfer learning. A platform may then receive a request from a clinician for a target patient's diagnosis or prognosis. The target patient's medical data is retrieved and used to create a model for the target patient. Then details of the target patient's model and the diagnosis and prognosis model are compared, a prediction is generated, and the prediction is returned to the clinician. As new medical data becomes available, it is fed into the respective model to improve accuracy and update predictions.
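The comparison step can be sketched as a nearest-neighbour vote: the target patient's model parameters are compared against the parameters of previously modelled cases, and the outcomes of the closest matches drive the prediction. All names and the outcome labels below are illustrative, not taken from the patent:

```python
def compare_and_predict(population_weights, patient_weights,
                        population_outcomes):
    # Euclidean distance between model parameter vectors.
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

    # Rank population cases by closeness to the target patient's model.
    ranked = sorted(range(len(population_weights)),
                    key=lambda i: dist(population_weights[i], patient_weights))
    # Majority vote over the k nearest cases' recorded outcomes.
    k = min(3, len(ranked))
    votes = [population_outcomes[i] for i in ranked[:k]]
    return max(set(votes), key=votes.count)
```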

SMALL UNMANNED AERIAL SYSTEMS DETECTION AND CLASSIFICATION USING MULTI-MODAL DEEP NEURAL NETWORKS

Provided is a detection and classification system and method for small unmanned aircraft systems (sUAS). The system and method detect and classify multiple simultaneous heterogeneous RC transmitters/sUAS downlinks from the RF signature using Object Detection Deep Convolutional Neural Networks (DCNNs). The method utilizes not only passive RF but may also utilize Electro Optic/Infrared (EO/IR), radar, and acoustic sensors, with a fusion of the individual sensor classifications. Detection and classification, with Identification Friend or Foe (IFF), of individual sUAS in a swarm is achieved through a multi-modal approach for high-confidence classification and decision, implemented on a low C-SWaP (cost, size, weight and power) NVIDIA Jetson TX2 embedded AI platform.
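The fusion of individual sensor classifications can be illustrated with a simple late-fusion scheme: per-sensor class posteriors are multiplied naive-Bayes style and renormalized. The sensor set and class labels below are assumptions for illustration, not the patent's actual fusion rule:

```python
def fuse_classifications(sensor_probs):
    """Late fusion of per-sensor class posteriors (e.g. from RF,
    EO/IR, radar, acoustic classifiers): multiply and renormalize."""
    classes = sensor_probs[0].keys()
    fused = {}
    for c in classes:
        p = 1.0
        for probs in sensor_probs:
            p *= probs[c]
        fused[c] = p
    total = sum(fused.values())
    return {c: p / total for c, p in fused.items()}
```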

Signal detection and denoising systems

Disclosed herein are systems and methods for estimating target ranges, angles of arrival, and speed using optimization procedures. Target ranges are estimated by performing an optimization procedure to obtain a denoised signal, performing a correlation of a transmitted waveform and the denoised signal, and using a result of the correlation to determine an estimate of a distance between the sensor and at least one target. Target angles of arrival are estimated by determining ranges at which targets are located, and, for each range, constructing an array signal from samples of received echo signals, and using the array signal, performing another optimization procedure to estimate a respective angle of arrival for each target of the at least one target. Doppler shifts may also be estimated using another optimization procedure. Certain of the optimization procedures use atomic norm techniques.
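The range-estimation step (correlate the transmitted waveform against the denoised return, take the peak lag as the round-trip delay) can be sketched as follows; the denoising and atomic-norm optimization stages are omitted, and the denoised signal is assumed given:

```python
def estimate_range(tx, rx, fs, c=3e8):
    """Cross-correlate tx against the (denoised) received signal rx;
    the lag of the correlation peak gives the round-trip delay,
    hence range = c * delay / 2. fs is the sample rate in Hz."""
    n = len(rx) - len(tx) + 1
    best_lag, best_val = 0, -float("inf")
    for lag in range(n):
        val = sum(tx[i] * rx[lag + i] for i in range(len(tx)))
        if val > best_val:
            best_lag, best_val = lag, val
    delay = best_lag / fs          # seconds of round-trip delay
    return c * delay / 2.0         # one-way range in metres
```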

Sensor fusion for precipitation detection and control of vehicles

An apparatus includes a processor configured to be disposed with a vehicle and a memory coupled to the processor. The memory stores instructions to cause the processor to receive at least two of: radar data, camera data, lidar data, or sonar data. The sensor data is associated with a predefined region of a vicinity of the vehicle while the vehicle is traveling during a first time period. At least a portion of the vehicle is positioned within the predefined region during the first time period. The instructions also cause the processor to detect that no other vehicle is present within the predefined region. An environment of the vehicle during the first time period is classified as one state from a set of states that includes at least one of dry, light rain, heavy rain, light snow, or heavy snow, based on at least two of the received sensor data, to produce an environment classification. An operational parameter of the vehicle is then modified based on the environment classification.
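A toy rule-based classifier over two of the sensor modalities conveys the idea; the features (lidar spurious-return rate, radar attenuation) and all thresholds are illustrative assumptions, not the patent's trained classifier:

```python
def classify_environment(lidar_noise_rate, radar_attenuation_db):
    """Toy fusion rule: the lidar spurious-return rate tracks
    precipitation density, while radar attenuation separates rain
    from snow. Thresholds are illustrative only."""
    if lidar_noise_rate < 0.01:
        return "dry"
    heavy = lidar_noise_rate > 0.10
    if radar_attenuation_db > 3.0:
        return "heavy rain" if heavy else "light rain"
    return "heavy snow" if heavy else "light snow"
```

The returned label would then drive the operational parameter change (e.g. reduced speed limit or longer following distance).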

Near range radar

Apparatus and associated methods relate to enabling a radar system to use different sensing mechanisms to estimate a distance from a target based on different detection zones (e.g., far-field and near-field). In an illustrative example, a curve fitting method may be applied for near-field sensing, and a Fourier transform may be used for far-field sensing. A predetermined set of rules may be applied to select when to use the near-field sensing mechanism and when to use the far-field mechanism. The frequency of a target signal within a beat signal that has less than two sinusoidal cycles may be estimated with improved accuracy. Accordingly, the distance of a target that is within a predetermined distance range (e.g., two meters range for 24 GHz ISM band limitation) may be reliably estimated.
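The selection rule can be sketched as: estimate the beat frequency with a DFT, and if the window holds fewer than about two cycles, fall back to a time-domain estimate. Here a zero-crossing count stands in for the patent's curve-fitting method, and the two-cycle threshold is taken from the abstract; everything else is illustrative:

```python
import math

def dft_peak_freq(x, fs):
    """Far-field path: coarse frequency from the magnitude-DFT peak."""
    n = len(x)
    best_k, best_mag = 1, -1.0
    for k in range(1, n // 2):
        re = sum(x[t] * math.cos(2 * math.pi * k * t / n) for t in range(n))
        im = sum(-x[t] * math.sin(2 * math.pi * k * t / n) for t in range(n))
        mag = re * re + im * im
        if mag > best_mag:
            best_k, best_mag = k, mag
    return best_k * fs / n

def beat_freq(x, fs, min_cycles=2.0):
    """Rule-based selection: use the DFT estimate for far-field beats,
    but when the window holds fewer than min_cycles sinusoidal cycles,
    use a zero-crossing count (standing in for curve fitting)."""
    f = dft_peak_freq(x, fs)
    if f * len(x) / fs >= min_cycles:
        return f
    crossings = sum(1 for a, b in zip(x, x[1:])
                    if a <= 0 < b or b <= 0 < a)
    return crossings * fs / (2.0 * len(x))
```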

AXIS DEVIATION DETECTION DEVICE AND AXIS DEVIATION DETECTION PROGRAM
20230003877 · 2023-01-05

An axis deviation detection device detects a deviation of a central axis of an in-vehicle measurement device that executes measurement using radar waves. The axis deviation detection device performs correction of data that are obtained by the measurement device, to generate first data. Furthermore, the axis deviation detection device generates second data, based on behavior of the vehicle as measured by a sensor unit. The axis deviation detection device then detects the deviation of the central axis, from the first and second data.
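A minimal sketch of the final comparison, assuming the first data reduce to a radar-derived heading and the second data to a motion-sensor heading per sample (the correction and behavior-measurement stages are abstracted away; the 0.5° tolerance is illustrative):

```python
def detect_axis_deviation(radar_headings, imu_headings, tol_deg=0.5):
    """Average the per-sample difference between the radar-derived
    heading (first data) and the vehicle-behavior heading (second
    data); a persistent offset beyond tol_deg flags misalignment
    of the measurement device's central axis."""
    diffs = [r - i for r, i in zip(radar_headings, imu_headings)]
    mean = sum(diffs) / len(diffs)
    return abs(mean) > tol_deg, mean
```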

Radar-based target tracking using motion detection
11567185 · 2023-01-31

In an embodiment, a method includes: receiving reflected radar signals with a millimeter-wave radar; performing a range discrete Fourier Transform (DFT) based on the reflected radar signals to generate in-phase (I) and quadrature (Q) signals for each range bin of a plurality of range bins; for each range bin of the plurality of range bins, determining a respective strength value based on changes of respective I and Q signals over time; performing a peak search across the plurality of range bins based on the respective strength values of each of the plurality of range bins to identify a peak range bin; and associating a target to the identified peak range bin.

Adaptive thresholding and noise reduction for radar data

An electronic device for gesture recognition includes a processor operably connected to a transceiver. The transceiver is configured to transmit and receive signals for measuring range and speed. The processor is configured to transmit the signals via the transceiver. In response to a determination that a triggering event occurred, the processor is configured to track movement of an object relative to the electronic device within a region of interest, based on reflections of the signals received by the transceiver, to identify range measurements and speed measurements associated with the object. The processor is also configured to identify features from the reflected signals based on at least one of the range measurements and the speed measurements. The processor is further configured to identify a gesture based in part on the features from the reflected signals. Additionally, the processor is configured to perform an action indicated by the gesture.
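A toy version of the feature-then-classify stage conveys the flow; the two features (range span, mean radial speed), the gesture labels, and the thresholds are all illustrative assumptions:

```python
def identify_gesture(ranges_m, speeds_mps):
    """Toy feature-based gesture classifier: extract simple range and
    speed features from tracked measurements, then map them to a
    gesture label. Positive speed = moving away from the device."""
    span = max(ranges_m) - min(ranges_m)            # range feature
    mean_speed = sum(speeds_mps) / len(speeds_mps)  # speed feature
    if span > 0.10 and mean_speed > 0:
        return "swipe-away"
    if span > 0.10 and mean_speed < 0:
        return "swipe-toward"
    return "tap"
```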

Neural network based radiowave monitoring of fall characteristics in injury diagnosis

Training a machine learning neural network (MLNN) in radiowave-based monitoring of fall characteristics for diagnosing injury. The method comprises: receiving, in a first set of input layers of the MLNN, from a millimeter wave (mmWave) radar sensing device, a set of mmWave radar point cloud data representing fall attributes associated with a subject, each of the first set associated with a respective fall attribute; receiving, at a second set of input layers of the MLNN, a set of personal attributes of the subject; training an MLNN classifier based on supervised training that establishes a correlation between an injury condition of the subject as generated at the output layer, the mmWave point cloud data, and the personal attributes; and adjusting an initial matrix of weights by backpropagation to increase the correlation between the injury condition, the mmWave point cloud data, and the personal attributes.
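A minimal single-layer stand-in for the two-branch MLNN shows the training loop: the point-cloud features and personal attributes enter as separate input sets, are combined, and the weights are adjusted by gradient descent (the backpropagation step, collapsed to one layer). Feature meanings, learning rate, and epoch count are illustrative:

```python
import math

def train_injury_classifier(point_cloud_feats, personal_feats, labels,
                            epochs=300, lr=0.5):
    """Logistic-regression sketch of the MLNN: concatenate the two
    input sets, then adjust weights by gradient descent so the output
    correlates with the labelled injury condition (0 or 1)."""
    xs = [pc + pa for pc, pa in zip(point_cloud_feats, personal_feats)]
    w = [0.0] * len(xs[0])
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(xs, labels):
            logit = sum(wi * xi for wi, xi in zip(w, x)) + b
            p = 1.0 / (1.0 + math.exp(-logit))
            g = p - y  # gradient of log-loss w.r.t. the logit
            w = [wi - lr * g * xi for wi, xi in zip(w, x)]
            b -= lr * g

    def predict(pc, pa):
        x = pc + pa
        logit = sum(wi * xi for wi, xi in zip(w, x)) + b
        return 1.0 / (1.0 + math.exp(-logit))
    return predict
```

The real MLNN would use deeper layers per input branch, but the weight-adjustment-by-backpropagation loop has the same shape.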