G01S13/66

Advanced gaming and virtual reality control using radar
11656336 · 2023-05-23

Techniques are described herein that enable advanced gaming and virtual reality control using radar. These techniques enable small motions and displacements to be tracked, even at the millimeter or submillimeter scale, as user control actions, even when those actions are optically occluded or obscured.

Recognition processing system, recognition processing device, and recognition processing method

The aim is to perform high-speed recognition processing of an image signal acquired by imaging.

A recognition processing system of the present disclosure includes: a first sensor device that acquires an image signal by imaging; a second sensor device that performs an object detection process; a selection unit that selects one of a plurality of recognition processes based on information on the object detected by the detection process; and a recognition processing unit that executes the selected recognition process on the image signal.
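The two-stage flow above can be sketched as follows: a coarse detection result from the second sensor selects which recognizer runs on the full image signal. This is a minimal illustration, and the class labels and recognizer names are assumptions for demonstration, not from the disclosure.

```python
# Sketch: a lightweight detection result selects one of several recognition
# processes to run on the image signal (illustrative labels throughout).

def detect_objects(sensor_frame):
    # Stand-in for the second sensor device's object detection process:
    # returns a coarse class label for the dominant detected object.
    return sensor_frame.get("coarse_class", "unknown")

# The plurality of recognition processes the selection unit chooses from.
RECOGNIZERS = {
    "face":    lambda image: f"face-recognition on {len(image)} px",
    "vehicle": lambda image: f"plate-recognition on {len(image)} px",
    "unknown": lambda image: f"generic recognition on {len(image)} px",
}

def process(image_signal, sensor_frame):
    coarse = detect_objects(sensor_frame)                 # detection process
    recognize = RECOGNIZERS.get(coarse, RECOGNIZERS["unknown"])  # selection unit
    return recognize(image_signal)                        # recognition processing unit

result = process([0, 0, 0, 0], {"coarse_class": "vehicle"})
```

Because the heavy recognizer runs only after the cheap detector has narrowed the choice, the full image signal is processed by a single, targeted recognition process rather than all of them.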

Obstacle recognition device and obstacle recognition method
11465643 · 2022-10-11 · ·

An obstacle recognition device includes: a first sensor and a second sensor, which are configured to detect an object near a vehicle; a calculation unit configured to calculate, based on first detection data on a first object detected by the first sensor and second detection data on a second object detected by the second sensor, an index value for identifying whether the two objects are the same object; a determination unit configured to determine whether the two objects are the same object by comparing the index value with a threshold value set in advance; and a correction unit configured to calculate, when the determination unit has determined that the two objects are the same object, a detection error between the two sensors based on the two sets of detection data, and to generate corrected detection data from which the detection error is removed.

Deterrent for unmanned aerial systems

A system for providing integrated detection and deterrence against an unmanned vehicle, including but not limited to unmanned aerial systems, using a detection element, a tracking element, an identification element, and an interdiction or deterrent element. The elements contain sensors that observe real-time quantifiable data regarding the object of interest to create an assessment of risk or threat to a protected area of interest. This assessment may be based, e.g., on data mining of internal and external data sources. The deterrent element selects from a variable menu of possible deterrent actions. Though designed for autonomous action, a Human in the Loop may override the automated system's solutions.
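The assess-then-select flow might look like the following sketch. The risk-scoring formula, the deterrent menu, and its thresholds are assumptions chosen for illustration, not details taken from the system itself.

```python
# Sketch: threat assessment feeding a menu-driven deterrent selection,
# with a Human-in-the-Loop override (all names/thresholds illustrative).

DETERRENT_MENU = [
    (0.8, "interdict"),         # highest-risk action first
    (0.5, "jam-control-link"),
    (0.2, "audible-warning"),
]

def assess_threat(track):
    # Fuse real-time observations into a 0..1 risk score (assumed formula).
    speed_risk = min(track["speed_mps"] / 30.0, 1.0)
    range_risk = max(0.0, 1.0 - track["range_m"] / 1000.0)
    return 0.5 * speed_risk + 0.5 * range_risk

def select_deterrent(track, human_override=None):
    if human_override is not None:
        return human_override           # Human in the Loop overrides automation
    risk = assess_threat(track)
    for floor, action in DETERRENT_MENU:
        if risk >= floor:
            return action               # first menu entry whose floor is met
    return "monitor"                    # below every threshold: observe only

action = select_deterrent({"speed_mps": 20.0, "range_m": 200.0})
```

Ordering the menu from most to least severe and returning the first qualifying entry keeps the selection a single pass, while the override path short-circuits automation entirely.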

Systems, Methods and Computer-Readable Media for Improving Platform Guidance or Navigation Using Uniquely Coded Signals
20170370678 · 2017-12-28

A spatially-distributed architecture (SDA) of antennas transmits respective uniquely coded signals. A first receiver having a known position in a coordinate system defined by the SDA receives reflected versions of the uniquely coded signals. A first processor receives the reflected versions of the uniquely coded signals and identifies a position of a non-cooperative object in the coordinate system. A platform having a second receiver receives non-reflected versions of the uniquely coded signals. The platform determines a position of the platform in the coordinate system. In an example, the platform uses a self-determined position and a position of the non-cooperative object communicated from the SDA to navigate or guide the platform relative to the non-cooperative object. In another example, the platform uses a self-determined position and information from an alternative signal source in a second coordinate system to guide the platform. Guidance solutions may be generated in either coordinate system.
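One way the receiver might turn ranges recovered from the uniquely coded signals into a position in the SDA-defined coordinate system is plain trilateration. The sketch below solves the 2D three-antenna case by linearizing against the first antenna; the antenna layout is an assumption for illustration.

```python
# Sketch: 2D trilateration in the coordinate system defined by the SDA.
# Each range is assumed to have been attributed to its antenna via the
# unique code on that antenna's signal (antenna positions are illustrative).

ANTENNAS = [(0.0, 0.0), (100.0, 0.0), (0.0, 100.0)]

def trilaterate(ranges):
    (x1, y1), (x2, y2), (x3, y3) = ANTENNAS
    r1, r2, r3 = ranges
    # Subtracting the first circle equation from the others removes the
    # quadratic terms, leaving a 2x2 linear system in (x, y).
    a11, a12 = 2 * (x2 - x1), 2 * (y2 - y1)
    a21, a22 = 2 * (x3 - x1), 2 * (y3 - y1)
    b1 = r1**2 - r2**2 + x2**2 - x1**2 + y2**2 - y1**2
    b2 = r1**2 - r3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a11 * a22 - a12 * a21
    return ((b1 * a22 - b2 * a12) / det, (a11 * b2 - a21 * b1) / det)
```

The same machinery serves both receivers in the abstract: the fixed receiver localizes the non-cooperative object from reflected signals, while the platform localizes itself from the direct (non-reflected) signals.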

Adjusting weight of intensity in a PHD filter based on sensor track ID

In one embodiment, a method for tracking multiple objects with a probability hypothesis density (PHD) filter is provided. The method includes comparing second track IDs corresponding to newly obtained measurements to one or more first track IDs corresponding to a T_{k+1} predicted intensity having a predicted weight. If all of the one or more first track IDs match any of the second track IDs, the predicted weight is multiplied by a first value. If fewer than all of the one or more first track IDs match any of the second track IDs, the predicted weight is multiplied by a second value, wherein the second value is greater than the first value. The method then determines whether to prune the T_{k+1} predicted intensity based on the predicted weight after multiplying with either the first value or the second value.
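The weight-adjustment and pruning rule can be sketched directly. The multiplier values and the pruning threshold below are illustrative assumptions; the abstract only requires that the second value exceed the first.

```python
# Sketch of the track-ID-based weight adjustment for one predicted intensity.

def adjust_weight(predicted_weight, first_track_ids, second_track_ids,
                  first_value=0.1, second_value=0.9, prune_threshold=1e-3):
    # first_track_ids: IDs associated with the T_{k+1} predicted intensity.
    # second_track_ids: IDs carried by the newly obtained measurements.
    if set(first_track_ids) <= set(second_track_ids):
        weight = predicted_weight * first_value   # all first IDs matched
    else:
        weight = predicted_weight * second_value  # some first ID unmatched
    prune = weight < prune_threshold              # pruning decision
    return weight, prune
```

Note the asymmetry the abstract specifies: a fully matched intensity is scaled by the smaller value, so intensities whose track IDs are entirely accounted for by fresh measurements are driven toward the pruning threshold faster.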

Architecture for automation and fail operational automation

In an embodiment, an automation system for a vehicle may employ a variety of diversities to enhance reliability, accuracy, and stability in automating operation of the vehicle. For example, the automation system may include multiple sensor pods with overlapping fields of view. Each sensor pod may include multiple different sensors, providing diverse views of the environment surrounding the vehicle. A set of sensor pods with overlapping fields of view may also transmit their object data at different points in time, providing diversity in time. Redundancy in other areas, such as the network switches that connect the sensor pods to an automation controller, may also help provide fail-operational functionality. In an embodiment, the sensor pods may include local processing to convert the data captured by the sensors into object identifications.
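The time-diversity idea can be sketched as a staggered transmit schedule: pods whose fields of view overlap report at evenly spaced offsets within one sensing cycle, so the controller never waits a full cycle for a fresh view of any covered region. The pod names and the 100 ms cycle are assumptions for illustration.

```python
# Sketch: spread pod transmit times evenly across one sensing cycle
# (cycle length and pod names are illustrative assumptions).

CYCLE_MS = 100

def stagger_schedule(pod_ids):
    # Assign each pod a transmit offset of i * (cycle / pod_count) ms.
    step = CYCLE_MS / len(pod_ids)
    return {pod: round(i * step) for i, pod in enumerate(pod_ids)}

schedule = stagger_schedule(["front", "rear", "left", "right"])
```

With overlapping fields of view, each region is then observed by at least two pods reporting at different offsets, which is what turns the stagger into genuine diversity in time rather than just load spreading.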