G01S17/66

Tracking positions using a scalable position tracking system

A scalable tracking system processes video of a space to track the positions of people within that space. The tracking system determines local coordinates for the people within frames of the video and then assigns these coordinates to time windows based on when the frames were received. The tracking system then combines or clusters certain local coordinates that have been assigned to the same time window to determine a combined coordinate for a person during that time window.
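
The window-and-combine step lends itself to a short sketch. The following Python illustration, a rough sketch rather than the patent's method, assigns per-frame local coordinates to fixed time windows and averages those that share a window; the `person_id` field and the 0.5-second window size are illustrative assumptions, since the patent describes clustering coordinates rather than relying on known identities.

```python
from collections import defaultdict

def combine_coordinates(detections, window_size=0.5):
    """Group per-frame local coordinates into time windows and average them.

    detections: list of (timestamp_s, person_id, x, y) tuples. The person_id
    field is a simplifying assumption standing in for the patent's clustering.
    """
    windows = defaultdict(list)
    for ts, pid, x, y in detections:
        window = int(ts // window_size)          # assign the frame to a time window
        windows[(window, pid)].append((x, y))

    combined = {}
    for (window, pid), coords in windows.items():
        # combine coordinates that share a window into one combined coordinate
        cx = sum(x for x, _ in coords) / len(coords)
        cy = sum(y for _, y in coords) / len(coords)
        combined[(window, pid)] = (cx, cy)
    return combined

# usage: the first two frames fall into window 0 and are averaged
dets = [(0.10, 1, 2.0, 3.0), (0.35, 1, 2.2, 3.4), (0.70, 1, 2.5, 3.9)]
print(combine_coordinates(dets))
```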

SENSOR-BASED CONTROL OF LIDAR RESOLUTION CONFIGURATION
20230044279 · 2023-02-09 ·

A computer-implemented method comprises: generating first output using a first sensor of a vehicle comprising an infrared camera or an event-based sensor, the first output indicating a portion of surroundings of the vehicle; providing the first output to a LiDAR of the vehicle having a field of view (FOV); configuring a resolution of the LiDAR based at least in part on the first output; generating a representation of at least part of the surroundings of the vehicle using the LiDAR; providing, to a perception component of the vehicle, second output of a second sensor of the vehicle and third output of the LiDAR, the perception component configured to perform object detection, sensor fusion, and object tracking regarding the second and third outputs, wherein the first output bypasses at least part of the perception component; and performing motion control of the vehicle using a fourth output of the perception component.
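
As a rough illustration of the resolution-configuration step, the sketch below raises the angular resolution of LiDAR sectors inside an azimuth range flagged by the first sensor. The `StubLidar` class and its per-sector resolution model are hypothetical stand-ins; real LiDAR driver APIs differ.

```python
from dataclasses import dataclass, field

@dataclass
class Sector:
    azimuth: float            # sector center azimuth, degrees
    resolution: float = 0.4   # angular spacing between beams, degrees

@dataclass
class StubLidar:
    """Stand-in for a real LiDAR driver; models only per-sector resolution."""
    sectors: list = field(
        default_factory=lambda: [Sector(float(a)) for a in range(0, 360, 30)]
    )

def configure_resolution(lidar, az_min, az_max, base=0.4, fine=0.1):
    """Densify scanning inside the azimuth range flagged by the first sensor
    (e.g. an infrared camera or event-based sensor spotting activity)."""
    for s in lidar.sectors:
        # fine resolution inside the region of interest, base elsewhere
        s.resolution = fine if az_min <= s.azimuth <= az_max else base

lidar = StubLidar()
configure_resolution(lidar, az_min=60, az_max=120)  # first-sensor output: 60-120 degrees
print([(s.azimuth, s.resolution) for s in lidar.sectors])
```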

System and method for vehicle position and velocity estimation based on camera and LIDAR data
11557128 · 2023-01-17 ·

A system and method for vehicle position and velocity estimation based on camera and LIDAR data are disclosed. A particular embodiment includes: receiving input object data from a subsystem of an autonomous vehicle, the input object data including image data from an image generating device and distance data from a distance measuring device; determining a two-dimensional (2D) position of a proximate object near the autonomous vehicle using the image data received from the image generating device; tracking a three-dimensional (3D) position of the proximate object using the distance data received from the distance measuring device over a plurality of cycles and generating tracking data; determining a 3D position of the proximate object using the 2D position, the distance data received from the distance measuring device, and the tracking data; determining a velocity of the proximate object using the 3D position and the tracking data; and outputting the 3D position and velocity of the proximate object relative to the autonomous vehicle.
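
The core fusion step, back-projecting the 2D image detection along the camera ray, scaling it to the LiDAR-measured distance, and differencing against the tracked position to get velocity, can be sketched as follows. The intrinsic matrix `K`, the single previous-position `prev_pos` state, and the fixed time step are simplifying assumptions, not the patent's exact formulation.

```python
import numpy as np

def estimate_3d_and_velocity(pixel_xy, depth_m, K, prev_pos, dt):
    """Back-project a 2D detection to 3D using the LiDAR-measured distance,
    then estimate velocity by differencing against the tracked 3D position."""
    u, v = pixel_xy
    ray = np.linalg.inv(K) @ np.array([u, v, 1.0])  # pixel -> camera-frame ray
    pos = ray / ray[2] * depth_m                    # scale the ray to LiDAR depth
    vel = (pos - prev_pos) / dt                     # finite-difference velocity
    return pos, vel

K = np.array([[800.0, 0.0, 320.0],                  # hypothetical camera intrinsics
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])
prev_pos = np.array([1.0, 0.2, 12.0])               # previous tracked 3D position, meters
pos, vel = estimate_3d_and_velocity((400, 260), depth_m=11.5,
                                    K=K, prev_pos=prev_pos, dt=0.1)
print(pos, vel)                                     # 3D position and velocity of the object
```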

SYSTEMS AND METHODS FOR PARTICLE FILTER TRACKING
20230012257 · 2023-01-12 ·

Systems and methods for operating a mobile platform. The methods comprise, by a computing device: obtaining a LiDAR point cloud; using the LiDAR point cloud to generate a track for a given object in accordance with a particle filter algorithm by generating states of the object over time (each state having a score indicating the likelihood that a cuboid would be created given an acceleration value and an angular velocity value); using the track to train a machine learning algorithm to detect and classify objects based on sensor data; and/or causing the machine learning algorithm to be used for controlling movement of the mobile platform.
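
A minimal particle filter of this flavor is sketched below: each state is perturbed with a sampled acceleration and angular velocity, scored by how well it explains the observed cuboid center, and resampled in proportion to the scores. The 2D state layout, Gaussian scoring model, and all parameter values are illustrative assumptions, not the patent's formulation.

```python
import numpy as np

rng = np.random.default_rng(0)

def particle_filter_step(particles, observed_xy, dt=0.1, sigma=0.5):
    """One predict/score/resample cycle over states (x, y, heading, speed)."""
    # predict: apply sampled acceleration and angular velocity to each state
    accel = rng.normal(0.0, 1.0, len(particles))
    ang_vel = rng.normal(0.0, 0.2, len(particles))
    particles[:, 2] += ang_vel * dt
    particles[:, 3] += accel * dt
    particles[:, 0] += particles[:, 3] * np.cos(particles[:, 2]) * dt
    particles[:, 1] += particles[:, 3] * np.sin(particles[:, 2]) * dt

    # score: likelihood that the observed cuboid center matches each state
    d2 = (particles[:, 0] - observed_xy[0]) ** 2 + (particles[:, 1] - observed_xy[1]) ** 2
    weights = np.exp(-d2 / (2 * sigma ** 2))
    weights /= weights.sum()

    # resample states in proportion to their scores
    idx = rng.choice(len(particles), size=len(particles), p=weights)
    return particles[idx]

particles = np.zeros((200, 4))
particles[:, 3] = 5.0                                   # start at origin, 5 m/s heading east
particles = particle_filter_step(particles, observed_xy=(0.5, 0.0))
print(particles[:, :2].mean(axis=0))                    # track estimate: mean particle position
```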

System and method for classifying agents based on agent movement patterns

Described is a system and method for the classification of agents based on agent movement patterns. In operation, the system receives position data of a moving agent from a camera or sensor. Motion data of the moving agent is then extracted and used to generate a predicted future motion using a set of pre-calculated Echo State Networks (ESNs), each of which represents an agent classification and generates its own predicted future motion. A prediction error is generated for each ESN by comparing its predicted future motion with the actual motion data. Finally, the agent is classified according to the ESN having the smallest prediction error.
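
The classify-by-smallest-prediction-error scheme can be sketched with a toy echo state network per class. `TinyESN` below is a minimal stand-in for the patent's pre-calculated ESNs, and the sinusoidal "walker"/"runner" motion traces are fabricated illustrative data.

```python
import numpy as np

class TinyESN:
    """Minimal echo state network: fixed random reservoir, trained linear readout."""
    def __init__(self, n_res=50, seed=0):
        rng = np.random.default_rng(seed)
        self.w_in = rng.normal(0.0, 0.5, n_res)
        w = rng.normal(0.0, 1.0, (n_res, n_res))
        self.w = w * (0.9 / max(abs(np.linalg.eigvals(w))))  # spectral radius < 1
        self.w_out = None

    def _states(self, series):
        x, states = np.zeros(len(self.w_in)), []
        for u in series:                                      # drive the reservoir
            x = np.tanh(self.w_in * u + self.w @ x)
            states.append(x.copy())
        return np.array(states)

    def fit(self, series):
        S = self._states(series[:-1])                         # readout: one-step-ahead
        self.w_out, *_ = np.linalg.lstsq(S, series[1:], rcond=None)

    def predict_error(self, series):
        pred = self._states(series[:-1]) @ self.w_out         # predicted future motion
        return np.mean((pred - series[1:]) ** 2)              # error vs. actual motion

# one ESN per agent class, trained on that class's characteristic motion
walker = np.sin(np.linspace(0, 10, 200) * 1.0)
runner = np.sin(np.linspace(0, 10, 200) * 3.0)
esns = {"walker": TinyESN(seed=1), "runner": TinyESN(seed=2)}
esns["walker"].fit(walker)
esns["runner"].fit(runner)

observed = np.sin(np.linspace(0, 10, 200) * 3.0 + 0.3)        # unknown agent's motion
print(min(esns, key=lambda c: esns[c].predict_error(observed)))  # smallest error -> "runner"
```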

DETECTOR FOR AN OPTICAL DETECTION OF AT LEAST ONE OBJECT

A detector (110) for an optical detection of at least one object (112) is proposed. The detector (110) comprises: at least one transfer device (120), wherein the transfer device (120) comprises at least two different focal lengths (140) in response to at least one incident light beam (136); at least two longitudinal optical sensors (132), wherein each longitudinal optical sensor (132) has at least one sensor region (146), wherein each longitudinal optical sensor (132) is designed to generate at least one longitudinal sensor signal in a manner dependent on an illumination of the sensor region (146) by the light beam (136), wherein the longitudinal sensor signal, given the same total power of the illumination, is dependent on a beam cross-section of the light beam (136) in the sensor region (146), wherein each longitudinal optical sensor (132) exhibits a spectral sensitivity in response to the light beam (136) in such a manner that two different longitudinal optical sensors (132) differ with regard to their spectral sensitivity, and wherein each longitudinal optical sensor (132) is located at a focal point (138) of the transfer device (120) related to the spectral sensitivity of the respective longitudinal optical sensor (132); and at least one evaluation device (150), wherein the evaluation device (150) is designed to generate at least one item of information on a longitudinal position and/or at least one item of information on a color of the object (112) by evaluating the longitudinal sensor signal of each longitudinal optical sensor (132). A simple and still efficient detector for accurately determining the position and/or color of at least one object in space is thereby provided.
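
Since the longitudinal sensor signals depend on the beam cross-section at each sensor, one plausible evaluation is to invert a calibration curve relating the ratio of two sensor signals to object distance. The sketch below assumes such a monotonic ratio-versus-distance relationship and a hypothetical calibration table; neither is taken from the patent.

```python
import numpy as np

def longitudinal_position(signal_a, signal_b, calibration):
    """Look up object distance from the ratio of two cross-section-dependent
    longitudinal sensor signals, via a pre-measured calibration curve."""
    distances, ratios = calibration
    return np.interp(signal_a / signal_b, ratios, distances)

# hypothetical calibration: signal ratio A/B measured at known distances
cal = (np.array([0.5, 1.0, 1.5, 2.0]),   # object distance, meters
       np.array([0.4, 0.8, 1.3, 1.9]))   # corresponding A/B signal ratio
print(longitudinal_position(1.1, 1.0, cal))  # ratio 1.1 -> about 1.3 m
```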

CROSSWIND SPEED MEASUREMENT BY OPTICAL MEASUREMENT OF SCINTILLATION
20180003824 · 2018-01-04 ·

The present disclosure describes methods and systems for measuring crosswind speed by optical measurement of laser scintillation. One method includes projecting radiation into a medium, receiving, over time, with a photodetector receiver, a plurality of scintillation patterns of scattered radiation, comparing a cumulative radiation intensity for each scintillation pattern of the received plurality of scintillation patterns, and measuring a cumulative weighted average cross-movement within the medium using the compared cumulative radiation intensities.
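
A related and classical way to turn drifting scintillation patterns into a crosswind estimate is time-of-flight scintillometry: cross-correlate the intensity records of two laterally spaced photodetectors and read the wind speed off the lag of the correlation peak. The sketch below implements that scheme with synthetic data; the patent's cumulative weighted-average formulation differs in detail.

```python
import numpy as np

def crosswind_speed(intensity_a, intensity_b, spacing_m, sample_rate_hz):
    """Estimate crosswind from intensity records at two spaced detectors:
    the lag maximizing their cross-correlation gives the pattern transit time."""
    a = intensity_a - intensity_a.mean()
    b = intensity_b - intensity_b.mean()
    corr = np.correlate(b, a, mode="full")        # correlation at every lag
    lag = np.argmax(corr) - (len(a) - 1)          # samples by which B trails A
    if lag == 0:
        return 0.0
    return spacing_m * sample_rate_hz / lag       # speed = spacing / transit time

rng = np.random.default_rng(3)
pattern = rng.normal(size=1200)                   # synthetic scintillation record
delay = 40                                        # B sees the pattern 40 samples after A
a, b = pattern[delay:], pattern[:-delay]
print(crosswind_speed(a, b, spacing_m=0.2, sample_rate_hz=2000))  # about 10 m/s
```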

Method, processing unit and surveying instrument for improved tracking of a target
11709269 · 2023-07-25 ·

A method implemented in a processing unit controlling a surveying instrument is provided. The method comprises obtaining a first set of data from optical tracking of a target with the surveying instrument, and identifying from the first set of data a dependence over time of at least one parameter representative of movements of the target. The method further comprises receiving a second set of data from a sensor unit via a communication channel, the second set of data including information about the at least one parameter over time, and determining whether a movement pattern for the optically tracked target as defined by the dependence over time of the at least one parameter is the same as, or deviates by a predetermined interval from, a movement pattern as defined by the dependence over time of the at least one parameter obtained from the second set of data.
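
The pattern-comparison step reduces to checking whether two time series of the same parameter agree within a predetermined interval. The sketch below uses target speed over time as the parameter and a normalized mean absolute deviation as the comparison; both choices, and the tolerance value, are illustrative assumptions rather than the patent's specification.

```python
import numpy as np

def same_movement_pattern(optical_param, sensor_param, tolerance=0.15):
    """Decide whether the optically tracked target and the sensor unit report
    the same movement pattern for one parameter (here: speed) over time."""
    optical = np.asarray(optical_param, dtype=float)
    sensor = np.asarray(sensor_param, dtype=float)
    # normalized mean absolute deviation between the two time series
    deviation = np.abs(optical - sensor).mean() / (np.abs(sensor).mean() + 1e-9)
    return deviation <= tolerance                 # within the predetermined interval?

# usage: speed over time from optical tracking vs. the target's own sensor unit
t = np.linspace(0, 5, 50)
optical = 1.0 + 0.5 * np.sin(t)
sensor = optical + np.random.default_rng(1).normal(0, 0.02, t.size)
print(same_movement_pattern(optical, sensor))     # True: the patterns agree
```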