
System and Method for Generating a Radar Image of a Scene

The present disclosure provides a system and a method for generating a radar image of a scene. The method comprises receiving radar measurements of a scene collected from a set of antennas, wherein the set of antennas is subject to uncertainties caused by one or a combination of position ambiguities and clock ambiguities of each of the antennas. The method further comprises generating the radar image of the scene by solving a sparse recovery problem. The sparse recovery problem determines, until a termination condition is met, a set of image shifts of the radar image corresponding to different uncertainties of the antennas and updates an estimate of the radar image based on the determined set of image shifts. The sparse recovery problem is solved with a neural network denoiser that denoises a filtering of the estimate of the radar image.
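The abstract does not spell out the iteration, but sparse recovery with a learned denoiser is commonly implemented in a plug-and-play fashion: a gradient step on the measurement fit, followed by a denoising step on the filtered estimate. A minimal pure-Python sketch under that assumption, with a soft-threshold standing in for the neural-network denoiser and `A`/`At` as hypothetical forward/adjoint measurement operators:

```python
def soft_threshold(x, t):
    # Stand-in for the learned neural-network denoiser.
    return [max(abs(v) - t, 0.0) * (1.0 if v >= 0 else -1.0) for v in x]

def sparse_recover(y, A, At, step=0.5, thresh=0.1, iters=50):
    """Iterate until a (here, fixed-count) termination condition is met:
    gradient step on the data fit, then 'denoise' the filtered estimate."""
    x = [0.0] * len(At(y))
    for _ in range(iters):
        residual = [yi - ai for yi, ai in zip(y, A(x))]  # measurement mismatch
        grad = At(residual)                              # back-projected update
        x = soft_threshold([xi + step * gi for xi, gi in zip(x, grad)], thresh)
    return x
```

With the identity as measurement operator this reduces to classic iterative soft-thresholding; the patented method additionally estimates per-antenna image shifts, which this sketch omits.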

Radar image processing device and radar image processing method

A radar image processing device includes a phase difference calculating unit that calculates a phase difference between phases with respect to first and second radio wave receiving points for each pixel at corresponding pixel positions among pixels in first and second suppression ranges, the first and second suppression ranges being suppression ranges in first and second radar images capturing an observation area from the first and second radio wave receiving points, respectively; and a rotation amount calculating unit that calculates a phase rotation amount for each pixel in the second suppression range from each phase difference, wherein a difference calculating unit rotates the phases of the pixels in the second suppression range based on the rotation amounts, and calculates a difference between pixel values at corresponding pixel positions among the pixels in the first suppression range and the phase-rotated pixels in the second suppression range.
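The per-pixel operation described (phase difference, rotation, then subtraction) can be sketched on complex-valued pixels; this is an illustrative stand-alone function, not the patented device, and it treats the images as flat pixel lists rather than suppression ranges:

```python
import cmath

def rotate_and_difference(img1, img2):
    """For each corresponding pixel pair: compute the phase difference,
    rotate the second image's phase by that amount, then take the
    difference of the pixel values."""
    out = []
    for p1, p2 in zip(img1, img2):
        dphi = cmath.phase(p1) - cmath.phase(p2)  # phase difference at this pixel
        rotated = p2 * cmath.exp(1j * dphi)       # phase-rotated second pixel
        out.append(p1 - rotated)
    return out
```

After the rotation, a pure phase shift between acquisitions cancels, so the residual difference reflects amplitude change rather than the differing viewing geometry.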

Crop phenology estimation and tracking with remote sensing imagery
11636672 · 2023-04-25

Techniques for estimating crop maturity using remote sensing imagery are disclosed. A method may include a step of providing a remote sensing image of crops acquired at an image acquisition time, for example, a synthetic aperture radar image acquired from a satellite. The method may also include a step of predicting a maturity level of the crops at the image acquisition time based on input crop maturity data from an input time and weather-based growth indication data from between the input time and the image acquisition time. The weather-based growth indication data may include growing degree day data. The method may further include a step of updating the predicted maturity level to an updated maturity level of the crops based on the remote sensing image and a response model relating remote sensing image information to crop maturity information. The predicting and updating steps may involve performing a particle filtering operation.
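The predict/update cycle described (growth prediction from growing degree days, then a particle-filter update against the remote sensing observation via a response model) can be sketched as below; the growth model, noise scales, and function names are illustrative assumptions, not the patented formulation:

```python
import math
import random

def maturity_pf_step(particles, gdd_growth, sar_obs, response,
                     obs_noise=0.1, seed=0):
    """One predict/update cycle: particles are scalar maturity levels."""
    rng = random.Random(seed)
    # Predict: advance each particle's maturity by the weather-based
    # growth indication (growing degree days), with small process noise.
    predicted = [p + gdd_growth + rng.gauss(0.0, 0.01) for p in particles]
    # Update: weight each particle by agreement between the response model
    # (maturity -> expected image information) and the SAR observation.
    weights = [math.exp(-(response(p) - sar_obs) ** 2 / (2 * obs_noise ** 2))
               for p in predicted]
    total = sum(weights)
    weights = [w / total for w in weights]
    # Resample in proportion to the weights.
    return rng.choices(predicted, weights=weights, k=len(particles))
```

With an identity response model and an observation near one particle, resampling concentrates the ensemble around that particle, which is the intended corrective effect of the update step.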

IMAGE PROCESSING METHOD AND DATA STRUCTURE OF METADATA

The present technology relates to an image processing method and a data structure of metadata that allow for image processing on a plurality of captured images obtained by operation of a formation flight. A satellite cluster management device and an image analysis server as image processing apparatuses perform predetermined image processing on the basis of satellite specification information for specifying an artificial satellite associated as metadata with a captured image captured by the artificial satellite. The present technology can be applied to, for example, artificial satellites that perform satellite remote sensing by a formation flight.

AIRBORNE SENSOR TO SENSOR INFORMATION SHARING TECHNIQUE
20230066768 · 2023-03-02

A radar system and method for sharing threat data are provided; the radar system is configured to communicate with radar systems in surrounding aircraft and share threat-specific data. Each radar system may be configured for a specific task according to the priorities of the aircraft and the capabilities of the surrounding aircraft; the gathered data is then shared with the surrounding aircraft so that each aircraft may commit longer dwell time to individual tasks while still receiving data for all of the potential tasks. The radar system may identify a fault and send a request for radar threat data to nearby aircraft. The radar system may receive such data within the operating band of the radar and allow continued operation.

Object velocity and/or yaw rate detection and tracking
11663726 · 2023-05-30

Tracking a current and/or previous position, velocity, acceleration, and/or heading of an object using sensor data may comprise determining whether to associate a current object detection generated from recently received (e.g., current) sensor data with a previous object detection generated from formerly received sensor data. In other words, a track may identify that an object detected in former sensor data is the same object detected in current sensor data. However, multiple types of sensor data may be used to detect objects and some objects may not be detected by different sensor types or may be detected differently, which may confound attempts to track an object. An ML model may be trained to receive outputs associated with different sensor types and/or a track associated with an object, and determine a data structure comprising a region of interest, object classification, and/or a pose associated with the object.
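The abstract's association step (deciding whether a current detection belongs to a previous track) is performed by a trained ML model; as a hand-rolled stand-in, a greedy gate on region-of-interest overlap illustrates the same decision. All names and the threshold are illustrative:

```python
def iou(a, b):
    """Intersection over union of axis-aligned boxes (x1, y1, x2, y2)."""
    ix = max(0.0, min(a[2], b[2]) - max(a[0], b[0]))
    iy = max(0.0, min(a[3], b[3]) - max(a[1], b[1]))
    inter = ix * iy
    area = lambda r: (r[2] - r[0]) * (r[3] - r[1])
    union = area(a) + area(b) - inter
    return inter / union if union > 0 else 0.0

def associate(current, tracks, threshold=0.3):
    """Greedy association: each current detection joins the previous track
    with the highest overlap, or is flagged as a new object (None)."""
    matches = []
    for det in current:
        best = max(range(len(tracks)),
                   key=lambda i: iou(det, tracks[i]), default=None)
        if best is not None and iou(det, tracks[best]) >= threshold:
            matches.append((det, best))
        else:
            matches.append((det, None))
    return matches
```

A learned model, as in the abstract, can additionally fuse detections from sensor types that see the object differently, which a geometric gate like this cannot.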

Method and system for detecting and analyzing objects

A method for detecting objects and labeling the objects with distances in an image includes steps of: obtaining a thermal image from a thermal camera, an RGB image from an RGB camera, and radar information from an mmWave radar; adjusting the thermal image based on the RGB image to generate an adjusted thermal image, and generating a fused image based on the RGB image and the adjusted thermal image; generating a second fused image based on the fused image and the radar information; detecting objects in the images, and generating, based on the fused image, another fused image including bounding boxes marking the objects; and determining motion parameters of the objects.
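Two of the steps above (blending the aligned RGB and thermal images, and attaching a radar-derived distance to a detected object) can be sketched as follows; the blend weight, point format, and function names are illustrative assumptions:

```python
def blend(rgb_gray, thermal, alpha=0.6):
    """Pixel-wise weighted blend of the (already aligned) RGB-derived
    grayscale image and the adjusted thermal image."""
    return [[alpha * r + (1 - alpha) * t for r, t in zip(rr, tt)]
            for rr, tt in zip(rgb_gray, thermal)]

def label_distance(box, radar_points):
    """Attach the range of the mmWave radar return nearest the bounding
    box centre. radar_points: list of (x, y, distance_m)."""
    cx, cy = (box[0] + box[2]) / 2, (box[1] + box[3]) / 2
    nearest = min(radar_points, key=lambda p: (p[0] - cx) ** 2 + (p[1] - cy) ** 2)
    return nearest[2]
```

In the described method the fusion is likely richer than a fixed blend, but the structure is the same: image-space fusion first, then radar information projected onto the detected boxes.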

POINT CLOUD REGISTRATION FOR LIDAR LABELING

The subject disclosure relates to techniques for detecting an object. A process of the disclosed technology can include steps for receiving three-dimensional (3D) Light Detection and Ranging (LiDAR) data of the object at a first time, generating a first point cloud based on the 3D LiDAR data at the first time, receiving 3D LiDAR data of the object at a second time, generating a second point cloud based on the 3D LiDAR data at the second time, aggregating the first point cloud and the second point cloud to form an aggregated point cloud, and placing a bounding box around the aggregated point cloud. Systems and machine-readable media are also provided.
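The aggregation and bounding steps reduce to a small sketch once both sweeps are registered into a common frame (the registration itself is the hard part and is omitted here); names are illustrative:

```python
def aggregate_and_box(cloud_a, cloud_b):
    """Aggregate two LiDAR sweeps of the same object, assumed already
    registered into a common frame, and fit an axis-aligned bounding box.
    Points are (x, y, z) tuples; returns (min_corner, max_corner)."""
    points = cloud_a + cloud_b            # aggregated point cloud
    xs, ys, zs = zip(*points)
    return ((min(xs), min(ys), min(zs)),  # box minimum corner
            (max(xs), max(ys), max(zs)))  # box maximum corner
```

Aggregating sweeps from two times densifies the cloud, so the fitted box reflects the object's full extent better than either sparse sweep alone.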

MOVING TARGET FOCUSING METHOD AND SYSTEM BASED ON GENERATIVE ADVERSARIAL NETWORK
20230162373 · 2023-05-25

A moving target focusing method and system based on a generative adversarial network are provided. The method includes: generating, using a Range Doppler algorithm, a two-dimensional image including at least one defocused moving target, as a training sample; generating at least one ideal Gaussian point at the position of the center of each defocused moving target in the two-dimensional image, to generate a training label; constructing the generative adversarial network, which includes a generative network and a discrimination network; inputting the training sample and the training label into the generative adversarial network and training repeatedly until an output of the generative network reaches a preset condition, thereby obtaining a trained network model; and inputting a testing sample into the trained network model to output a moving target focused image.
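The label-generation step (placing an ideal Gaussian point at each defocused target's center) is concrete enough to sketch; grid size, sigma, and the function name are illustrative assumptions, and the GAN training itself is not shown:

```python
import math

def gaussian_label(height, width, centers, sigma=1.0):
    """Training label: an ideal Gaussian point placed at the centre of
    each defocused moving target. centers: list of (row, col)."""
    label = [[0.0] * width for _ in range(height)]
    for cy, cx in centers:
        for y in range(height):
            for x in range(width):
                label[y][x] += math.exp(-((y - cy) ** 2 + (x - cx) ** 2)
                                        / (2 * sigma ** 2))
    return label
```

During training, the generative network learns to map the defocused Range Doppler image to this point-like label, so at test time its output is the focused target image.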

HIGH FIDELITY DATA-DRIVEN MULTI-MODAL SIMULATION
20230159033 · 2023-05-25

Provided are methods for generating high fidelity synthetic sensor data representing hypothetical driving scenarios for a vehicle. Some methods described include accessing sensor data associated with operation of a vehicle traversing a first path in an environment. Operation of a simulated vehicle is then simulated in a synthetic driving scenario within a simulated environment, along a second path different from the first path and in simulation with a plurality of simulated agents. Systems and computer program products are also provided.