G01S13/587

Compensating radio tracking with comparison to image based tracking

The present disclosure provides an error detector for determining an error vector between a radio trajectory and an image trajectory. The error detector includes: an input for monitoring a radio trajectory of an object from a radio signal and an image trajectory of the object from an image over an observation area; a correlation module arranged to correlate the radio trajectory with the image trajectory; an error module arranged to determine an error vector between the radio trajectory and the image trajectory; and an output arranged to transmit the error vector for use in determining an estimated trajectory of a target based on a target trajectory from a radio signal.
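
A minimal sketch of the compensation idea above, assuming two already time-aligned trajectories and a constant-bias error model (all names are illustrative, not from the patent):

```python
import numpy as np

def error_vector(radio_traj, image_traj):
    """Mean displacement between a radio-derived and an image-derived
    trajectory, pairing samples by index (assumes both tracks cover the
    same observation area at the same sample times)."""
    radio = np.asarray(radio_traj, dtype=float)
    image = np.asarray(image_traj, dtype=float)
    n = min(len(radio), len(image))
    # Per-sample difference; the mean acts as a constant-bias error vector.
    return (image[:n] - radio[:n]).mean(axis=0)

def compensate(radio_traj, err):
    """Apply the error vector to correct a later radio-only track."""
    return np.asarray(radio_traj, dtype=float) + err
```

A real detector would also need the claimed correlation step to associate the two tracks in time before differencing them.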

System and method for three dimensional object tracking using combination of radar and image data
11697046 · 2023-07-11

Methods, systems, and apparatus, including medium-encoded computer program products, for 3D flight tracking of objects include, in at least one aspect, a method including obtaining two dimensional image data of a golf ball in flight, the two dimensional image data originating from a camera; obtaining radar data of the golf ball in flight, the radar data originating from a Doppler radar device associated with the camera; interpolating the radar data to generate interpolated radar data of the golf ball in flight; and blending radial distance information derived from the interpolated radar data of the golf ball in flight with angular distance information derived from the two dimensional image data of the golf ball in flight to form three dimensional location information of the golf ball in three dimensional space.
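
The interpolate-and-blend step can be sketched as follows, assuming linear interpolation of radar ranges and a simple spherical convention for the camera angles (both assumptions are mine; the patent does not fix either choice):

```python
import numpy as np

def blend(radar_t, radar_range, cam_t, cam_angles):
    """Interpolate radar radial distances at the camera frame times,
    then combine each distance with the camera's angular observation
    (azimuth, elevation in radians) to form a 3D position."""
    r = np.interp(cam_t, radar_t, radar_range)   # interpolated radar data
    az, el = cam_angles[:, 0], cam_angles[:, 1]
    x = r * np.cos(el) * np.sin(az)              # horizontal
    y = r * np.sin(el)                           # vertical
    z = r * np.cos(el) * np.cos(az)              # depth
    return np.stack([x, y, z], axis=1)
```

The radar contributes the radial distance; the image contributes only direction, which is why neither sensor alone yields the 3D track.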

SENSOR FUSION ARCHITECTURE FOR LOW-LATENCY ACCURATE ROAD USER DETECTION

Aspects described herein provide sensor data stream processing for enabling camera/radar sensor fusion, with application to road user detection in the context of Autonomous Driving/Assisted Driving (ADAS). In particular, a scheme to extract Regions of Interest (ROIs) from a high-resolution, high-dimensional radar data cube that can then be transmitted to a sensor fusion unit is described. The ROI scheme allows extraction of relevant information, thus reducing the latency and data transmission rate to the sensor fusion module, without trading off accuracy and detection rates. The sensor data stream processing comprises receiving a first data stream comprising a 3D data cube from a radar sensor, forming a point cloud by extracting 3D points from the 3D data cube, performing clustering on the point cloud in order to identify high-density regions representing one or more ROIs, and extracting one or more 3D bounding boxes from the 3D data cube corresponding to the one or more ROIs and classifying each ROI.
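
A toy version of that pipeline, using a plain power threshold to form the point cloud and a naive single-linkage grouping in place of a production clustering algorithm (both are stand-ins I chose; the abstract does not specify the clustering method):

```python
import numpy as np

def extract_rois(cube, power_thresh, link_dist=1.5):
    """Threshold a 3D radar data cube into a point cloud, group nearby
    points, and return one axis-aligned 3D bounding box (min corner,
    max corner) per cluster, i.e. per ROI."""
    pts = [tuple(p) for p in np.argwhere(cube > power_thresh)]
    clusters = []
    for p in pts:
        placed = False
        for c in clusters:
            # Join the first cluster with a member within link_dist.
            if any(np.linalg.norm(np.subtract(p, q)) <= link_dist for q in c):
                c.append(p)
                placed = True
                break
        if not placed:
            clusters.append([p])
    return [(np.min(c, axis=0), np.max(c, axis=0)) for c in clusters]
```

Only the small bounding boxes, rather than the full cube, would then be sent to the fusion unit, which is the latency/bandwidth saving the abstract describes.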

Three dimensional object tracking using combination of radar data and two dimensional image data

Methods and systems include, in at least one aspect: obtaining from a camera 2D image data of an object, obtaining from a radar device radar data of the object, combining the radar data and the 2D image data to produce 3D location information of the object, and modeling a 2D trace of the object using the 2D image data by finding an initial version of the 2D trace, receiving an initial portion of the 3D location information, extending the initial portion of the 3D location information in accordance with physical-world conditions to find at least one 3D location beyond the initial portion of the 3D location information, projecting the at least one 3D location into a 2D image plane of the camera to locate a 2D region, and processing the 2D region in the 2D image data to extend the 2D trace of the object in flight.
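
The extend-then-project loop can be sketched with a gravity-only ballistic step and a pinhole camera model (the intrinsics and the y-up axis convention below are placeholders, and real trackers would also model drag):

```python
import numpy as np

G = np.array([0.0, -9.81, 0.0])  # gravity in m/s^2, y up (assumed convention)

def extend_3d(p, v, dt):
    """Extend the last known 3D location under physical-world
    conditions (gravity only); returns the new position and velocity."""
    return p + v * dt + 0.5 * G * dt**2, v + G * dt

def project(p, f=1000.0, cx=640.0, cy=360.0):
    """Pinhole projection of a camera-frame 3D point into the 2D image
    plane, giving the pixel around which to search for the object."""
    x, y, z = p
    return np.array([f * x / z + cx, f * -y / z + cy])
```

The projected pixel defines the 2D region searched in the next frame, which is how the 3D model keeps the 2D trace alive past the initially detected portion.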

THREE DIMENSIONAL OBJECT TRACKING USING COMBINATION OF RADAR SPEED DATA AND TWO DIMENSIONAL IMAGE DATA
20230091774 · 2023-03-23

Methods and systems include, in at least one aspect: determining an optical model of an object in flight using two dimensional image data obtained from a camera, determining a radar model of the object in flight using radar data obtained from a radar device, combining the radar model with the optical model to produce three dimensional location information of the object in flight in three dimensional space, comparing the three dimensional location information of the object in flight with data representing an expected ball launch, and rejecting (or verifying) the object as an actual ball launch in response to the three dimensional location information of the object in flight differing (or not differing) from the data representing the expected ball launch by a threshold amount.
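
The reject-or-verify comparison reduces to a threshold test between the observed 3D track and the expected launch; a simple max-deviation test is one plausible reading (the abstract does not specify the distance measure):

```python
import numpy as np

def verify_launch(observed, expected, threshold):
    """Verify the object as an actual ball launch when its 3D location
    information stays within `threshold` of the expected launch data;
    reject it otherwise."""
    dev = np.linalg.norm(np.asarray(observed) - np.asarray(expected), axis=1)
    return bool(np.max(dev) <= threshold)
```

This acts as a false-positive filter: a bird or a practice swing produces a track whose deviation from any plausible launch exceeds the threshold.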

SELF-ORGANIZED LEARNING OF THREE-DIMENSIONAL MOTION DATA
20230065922 · 2023-03-02

A method may include capturing image data associated with an object in a defined environment at one or more points in time. The method may include capturing radar data associated with the object in the defined environment at the same points in time. The method may include obtaining, by a machine learning model, the image data and the radar data associated with the object in the defined environment. The method may include pairing each image datum with a corresponding radar datum based on a chronological occurrence of the image data and the radar data. The method may include generating, by the machine learning model, a three-dimensional motion representation associated with the object that is associated with the image data and the radar data.
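
The pairing step before the model sees the data can be sketched as nearest-timestamp matching, which is one plausible reading of "based on a chronological occurrence" (the datum format below is my assumption):

```python
def pair_by_time(image_data, radar_data):
    """Pair each image datum with the radar datum closest in time.
    Each datum is a (timestamp, payload) tuple."""
    pairs = []
    for t_img, img in image_data:
        # Nearest radar sample by absolute time difference.
        _, rad = min(radar_data, key=lambda d: abs(d[0] - t_img))
        pairs.append((img, rad))
    return pairs
```

The paired (image, radar) tuples then form the training examples from which the machine learning model produces the 3D motion representation.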

System and Method for Three Dimensional Object Tracking Using Combination of Radar and Image Data
20230293940 · 2023-09-21

Methods, systems, and apparatus, including medium-encoded computer program products, for 3D flight tracking of objects include, in at least one aspect, a method including obtaining (from a camera) two dimensional image data of a golf ball in flight; obtaining radar data (originating from a Doppler radar device) of the golf ball in flight; fitting a curve to the radar data to generate a continuous function of time for the radar data of the golf ball in flight; determining three dimensional location information of the golf ball in three dimensional space including, for each of multiple camera observations, finding a radial distance using the continuous function and a time of the camera observation, finding a depth distance, finding a horizontal distance and finding a vertical distance to the golf ball; and providing the three dimensional location information of the golf ball in three dimensional space to augment other data before display.
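
The curve-fitting step can be sketched with a low-degree polynomial, which is one simple choice of curve (the abstract does not mandate the functional form):

```python
import numpy as np

def fit_radar_curve(t, radial, degree=2):
    """Fit a polynomial to radar radial-distance samples and return a
    continuous function of time; evaluating it at a camera observation's
    timestamp yields the radial distance for that observation."""
    coeffs = np.polyfit(t, radial, degree)
    return np.poly1d(coeffs)
```

Unlike the interpolation variant above, a fitted curve also smooths radar noise and can be evaluated slightly beyond the sampled interval.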

Three Dimensional Object Tracking Using Combination of Radar Speed Data and Two Dimensional Image Data
20210275873 · 2021-09-09

Methods and systems include, in at least one aspect: obtaining from a camera 2D image data of an object, obtaining from a radar device radar data of the object, combining the radar data and the 2D image data to produce 3D location information of the object, and modeling a 2D trace of the object using the 2D image data by finding an initial version of the 2D trace, receiving an initial portion of the 3D location information, extending the initial portion of the 3D location information in accordance with physical-world conditions to find at least one 3D location beyond the initial portion of the 3D location information, projecting the at least one 3D location into a 2D image plane of the camera to locate a 2D region, and processing the 2D region in the 2D image data to extend the 2D trace of the object in flight.

System and method of controlling operation of a device with a steerable optical sensor and a steerable radar unit

System and method of controlling operation of a device in real-time. The system includes an optical sensor having a steerable optical field of view for obtaining image data and a radar unit having a steerable radar field of view for obtaining radar data. A controller may be configured to steer a first one of the optical sensor and the radar unit to a first region of interest and a second one of the optical sensor and the radar unit to a second region of interest. The controller may be configured to steer both the optical sensor and the radar unit to the first region of interest. The radar data and the image data are fused to obtain a target location and a target velocity. The controller is configured to control operation of the device based in part on at least one of the target location and the target velocity.
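
One minimal reading of the fusion step: the camera supplies a bearing toward the target, the radar supplies range and radial speed, and combining them yields a location and a (radial) velocity vector. All names below are illustrative, not the patented method:

```python
import numpy as np

def fuse(bearing, rng, radial_speed):
    """Fuse a camera bearing (vector toward the target from the optical
    sensor) with radar range and radial speed to produce a target
    location and a radial velocity vector."""
    u = np.asarray(bearing, dtype=float)
    u = u / np.linalg.norm(u)          # unit direction from the camera
    return rng * u, radial_speed * u   # location, velocity (radial only)
```

Cross-track velocity would need successive camera bearings; the sketch covers only what a single fused observation provides.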

RADAR HEAD POSE LOCALIZATION

An augmented reality device has a radar system that generates radar maps of locations of real world objects. An inertial measurement unit detects measurement values such as acceleration, gravitational force and inclination ranges. The values from the measurement unit drift over time. The radar maps are processed to determine fingerprints and the fingerprints are combined with the values from the measurement unit to store a pose estimate. Pose estimates at different times are compared to determine drift of the measurement unit. A measurement unit filter is adjusted to correct for the drift.
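
The drift estimate reduces to comparing the IMU-derived pose against the radar-fingerprint pose at two times and measuring how fast the disagreement grows. A translation-only sketch (rotation drift is handled analogously but needs proper rotation arithmetic; names are illustrative):

```python
import numpy as np

def estimate_drift(imu_t0, radar_t0, imu_t1, radar_t1, dt):
    """Compare IMU- and radar-derived pose estimates at two times; the
    growth of their disagreement over dt is the IMU drift rate."""
    e0 = np.asarray(imu_t0) - np.asarray(radar_t0)
    e1 = np.asarray(imu_t1) - np.asarray(radar_t1)
    return (e1 - e0) / dt  # drift rate per unit time

def correct(imu_pose, drift_rate, elapsed):
    """Subtract accumulated drift, as an adjusted filter would."""
    return np.asarray(imu_pose) - drift_rate * elapsed
```

The radar fingerprints serve as a slowly varying external reference, so drift that accumulates between fingerprint matches can be removed in the filter.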