G01S13/9027

Systems and methods for mapping manmade objects buried in subterranean surfaces using an unmanned aerial vehicle integrated with radar sensor equipment

An aerial vehicle system for mapping an object buried in a subterranean surface, the aerial vehicle system including an aerial vehicle, an electronic sensor, a processor, and a memory. The memory includes instructions which, when executed by the processor, cause the system to: receive, via the electronic sensor, a first input data set based on an electromagnetic signal and geographic location data; generate a raw image based on the first input data set; and compare the raw image to a calibration data set, the calibration data set being based on material calibration data. The material calibration data is based on unique spectral reflection patterns of an object in a controlled environment at predefined heights and subterranean conditions.
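The comparison of a raw image against material calibration data can be pictured as a nearest-match search over a library of reference spectra. The Python sketch below is purely illustrative and not taken from the patent; the material labels, the four-band spectra, and the normalized-correlation score are all assumptions:

```python
import numpy as np

def classify_material(raw_spectrum, calibration_set):
    """Match a measured spectral reflection pattern against a library of
    calibration spectra; return the best-matching material label."""
    best_label, best_score = None, -np.inf
    for label, ref in calibration_set.items():
        # normalized correlation as a similarity score in [-1, 1]
        score = float(np.dot(raw_spectrum, ref) /
                      (np.linalg.norm(raw_spectrum) * np.linalg.norm(ref)))
        if score > best_score:
            best_label, best_score = label, score
    return best_label, best_score

# hypothetical calibration library: spectra recorded in a controlled
# environment at predefined heights and subterranean conditions
calib = {
    "steel_pipe": np.array([0.9, 0.7, 0.2, 0.1]),
    "pvc_conduit": np.array([0.2, 0.4, 0.8, 0.9]),
}
label, score = classify_material(np.array([0.85, 0.75, 0.25, 0.05]), calib)
```

A real system would also compensate for sensor height and soil moisture before the match, per the calibration conditions named in the abstract.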

De-Aliased Imaging for a Synthetic Aperture Radar
20230018183 · 2023-01-19 ·

This document describes techniques for enabling de-aliased imaging for a synthetic aperture radar. Radar signals processed by a synthetic aperture radar (SAR) system may include false detections in the form of aliasing induced by grating lobes. The techniques described herein can reduce the adverse effects of grating lobes by obtaining an initial SAR image using a back-projection algorithm. Aliasing effects (e.g., false detections) in this initial image may be common due to the limitations of a SAR system moving at non-uniform speeds. A refined image is produced from the initial image by applying a de-aliasing filter to it. The refined image may have reduced or eliminated false detections attributable to aliasing effects, resulting in a better representation of the environment of the vehicle.
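As background on the back-projection step, here is a minimal time-domain back-projection sketch in Python. It is not the patent's implementation; the point-target simulation, carrier and sampling frequencies, and imaging grid are invented for illustration, and each pulse is assumed already range-compressed to a single complex sample:

```python
import numpy as np

def backproject(pulses, positions, grid_x, grid_y, fc, fs, c=3e8):
    """Time-domain back-projection: sample each range-compressed pulse at
    the two-way delay of every image pixel and sum coherently after a
    phase correction at the carrier frequency fc."""
    img = np.zeros((len(grid_y), len(grid_x)), dtype=complex)
    for pulse, (px, py) in zip(pulses, positions):
        for iy, y in enumerate(grid_y):
            for ix, x in enumerate(grid_x):
                r = np.hypot(x - px, y - py)       # slant range to pixel
                n = int(round(2 * r / c * fs))     # two-way delay sample
                if 0 <= n < len(pulse):
                    img[iy, ix] += pulse[n] * np.exp(4j * np.pi * fc * r / c)
    return np.abs(img)

# simulate one point target seen from 9 non-uniformly spaced positions
fc, fs = 1e9, 1e8
target = (0.0, 50.0)
positions = [(px, 0.0) for px in (-20, -15, -9, -4, 0, 3, 8, 14, 20)]
pulses = []
for px, py in positions:
    r = np.hypot(target[0] - px, target[1] - py)
    pulse = np.zeros(200, dtype=complex)
    pulse[int(round(2 * r / 3e8 * fs))] = np.exp(-4j * np.pi * fc * r / 3e8)
    pulses.append(pulse)

grid_x = np.linspace(-10, 10, 21)
grid_y = np.linspace(40, 60, 21)
img = backproject(pulses, positions, grid_x, grid_y, fc, fs)
```

The nine contributions add coherently only at the true target pixel; the grating-lobe aliasing the patent addresses would appear as spurious secondary peaks in such an image, which the de-aliasing filter then suppresses.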

RADAR PROCESSING METHOD AND SYSTEM
20230213648 · 2023-07-06 ·

A computer-implemented method for processing radar information comprises receiving synthetic aperture radar (SAR) data that specifies a sequence of radar acquisitions of a target area taken over a particular acquisition time. The method comprises forming a plurality of sub-aperture images, wherein each sub-aperture image is associated with radar acquisitions of the target area taken over a particular interval of the acquisition time. The method further comprises determining, based on the plurality of sub-aperture images, velocities associated with one or more target objects moving within the target area; and generating an image associated with the SAR data that depicts the one or more target objects and the velocities associated with them.
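The sub-aperture idea can be illustrated with a toy sketch: form images from sub-intervals of the acquisition, track a target's position across them, and fit position against time so the slope gives velocity. This is an assumption-laden simplification (brightest-pixel tracking standing in for whatever estimator the patent actually uses):

```python
import numpy as np

def velocity_from_subapertures(sub_images, times, pixel_spacing):
    """Track the brightest pixel across sub-aperture images (each formed
    from a sub-interval of the acquisition time) and fit a line to its
    position over time; the slope is the target's apparent velocity."""
    pos = []
    for im in sub_images:
        iy, ix = np.unravel_index(np.argmax(np.abs(im)), im.shape)
        pos.append((ix * pixel_spacing, iy * pixel_spacing))
    pos = np.asarray(pos, dtype=float)
    vx = np.polyfit(times, pos[:, 0], 1)[0]   # least-squares slope, x
    vy = np.polyfit(times, pos[:, 1], 1)[0]   # least-squares slope, y
    return vx, vy

# toy example: a target drifting 2 px right and 1 px down per sub-aperture
subs = []
for k in range(3):
    im = np.zeros((32, 32))
    im[5 + k, 5 + 2 * k] = 1.0
    subs.append(im)
vx, vy = velocity_from_subapertures(subs, times=[0.0, 1.0, 2.0],
                                    pixel_spacing=0.5)
```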

Radar image processing device and radar image processing method

A radar image processing device determines whether a pixel includes a ghost image and, on a radar image whose focus has been changed, changes the value of any pixel determined to include the ghost image.

Learning device, learning method, and storage medium

Provided is a learning device that can generate a feature deriving device capable of deriving, for an identical object, feature amounts which respectively express a feature of the object in different forms and which are mutually related. This learning device comprises: an acquisition unit that acquires first data and second data, with different forms of the object recorded therein; an encoder that derives a first feature amount from the first data; a conversion unit that converts the first feature amount to a second feature amount; a decoder that generates third data from the second feature amount; and a parameter updating unit that updates, on the basis of a comparison between the second data and the third data, the value of a parameter used in the derivation of the first feature amount, and the value of a parameter used in the generation of the third data.
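One way to picture the training loop is as an encoder/conversion/decoder chain trained so that the decoded third data approaches the second data. The sketch below is a toy stand-in, not the patent's architecture: linear maps for all three units, a fixed conversion matrix, and plain alternating gradient updates are all assumptions made for brevity:

```python
import numpy as np

rng = np.random.default_rng(0)

# toy paired data: X records the object in one form, Y in another
X = rng.normal(size=(200, 4))
Y = X @ rng.normal(size=(4, 3))                 # "second data"

E = rng.normal(size=(4, 5)) * 0.1               # encoder (learned)
C = np.eye(5) + rng.normal(size=(5, 5)) * 0.1   # conversion unit (fixed here)
D = rng.normal(size=(5, 3)) * 0.1               # decoder (learned)

def mse():
    return float(np.mean((X @ E @ C @ D - Y) ** 2))

mse_before = mse()
lr = 0.01
for _ in range(2000):
    F1 = X @ E           # first feature amount
    F2 = F1 @ C          # second feature amount (converted form)
    err = F2 @ D - Y     # third data compared with second data
    # gradient steps on encoder and decoder parameters
    # (constant factors absorbed into the learning rate)
    D -= lr * F2.T @ err / len(X)
    E -= lr * X.T @ err @ D.T @ C.T / len(X)
mse_after = mse()
```

The comparison between the second and third data drives updates to both the encoder (feature derivation) and the decoder (data generation), mirroring the parameter updating unit in the abstract.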

NAVIGATION APPARATUS AND POSITION DETERMINATION METHOD
20230087890 · 2023-03-23 ·

A navigation apparatus includes an image capturing device, a template database, a correlation device, an evaluation device, and an output interface. The image capturing device can create a radar image of the surroundings; the template database is configured to provide at least one template substantially matched to the radar image and containing at least one geo-referenced landmark, the at least one geo-referenced landmark being geo-referenced by at least one geo-coordinate. The correlation device can correlate the at least one geo-referenced landmark in the at least one template with the radar image and provide the at least one geo-coordinate belonging to the at least one geo-referenced landmark. The evaluation device can determine a position of the navigation apparatus from the at least one geo-coordinate of the at least one geo-referenced landmark and from a setting of the image capturing device. The output interface is configured to provide the determined position.
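The correlation step can be sketched as a normalized cross-correlation search for the template's offset in the radar image, from which a position follows once the landmark's geo-coordinate and the imaging geometry are known. The north-up geometry, the pixel size, and the toy scene below are all assumptions, not the patent's method:

```python
import numpy as np

def locate_template(radar_img, template):
    """Normalized cross-correlation search for the offset at which the
    geo-referenced template best matches the radar image."""
    th, tw = template.shape
    H, W = radar_img.shape
    t = template - template.mean()
    tn = np.linalg.norm(t)
    best_score, best_off = -np.inf, (0, 0)
    for dy in range(H - th + 1):
        for dx in range(W - tw + 1):
            p = radar_img[dy:dy + th, dx:dx + tw]
            p = p - p.mean()
            denom = np.linalg.norm(p) * tn
            score = float((p * t).sum() / denom) if denom > 0 else 0.0
            if score > best_score:
                best_score, best_off = score, (dy, dx)
    return best_off, best_score

def platform_position(landmark_geo, pixel_off, pixel_size):
    """Back out the platform position from the landmark's geo-coordinate,
    its pixel offset, and the imaging geometry (north-up assumption)."""
    return (landmark_geo[0] - pixel_off[0] * pixel_size,
            landmark_geo[1] - pixel_off[1] * pixel_size)

rng = np.random.default_rng(2)
landmark = rng.uniform(1, 2, (5, 5))   # distinctive landmark signature
scene = np.zeros((20, 20))
scene[8:13, 3:8] = landmark            # landmark embedded in the scene
off, score = locate_template(scene, landmark)
pos = platform_position((54.0, 10.0), off, pixel_size=0.01)
```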

IMAGE PROCESSING METHOD
20230081660 · 2023-03-16 ·

An image processing apparatus according to the present invention includes: an extracting unit configured to extract a candidate image, which is an image of a candidate region specified in accordance with a preset criterion, from a target image to be a target for an annotation process, and also extract a corresponding image, which is an image of a corresponding region corresponding to the candidate region, from a reference image that is an image corresponding to the target image; a displaying unit configured to display the candidate image and the corresponding image so as to be able to compare the images with each other; and an input accepting unit configured to accept input of input information for the annotation process for the candidate image.

Radar image processing device and radar image processing method

A radar image processing device includes a phase difference calculating unit that calculates, for each pixel at corresponding positions among the pixels in first and second suppression ranges, a phase difference between the phases observed at first and second radio wave receiving points, the first and second suppression ranges being suppression ranges in first and second radar images capturing an observation area from the first and second radio wave receiving points, respectively; and a rotation amount calculating unit that calculates a phase rotation amount for each pixel in the second suppression range from each phase difference, wherein a difference calculating unit rotates the phases of the pixels in the second suppression range based on the rotation amounts and calculates a difference between pixel values at corresponding pixel positions among the pixels in the first suppression range and the phase-rotated pixels in the second suppression range.
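The phase rotation and differencing can be sketched per pixel on complex-valued images. The sketch below collapses the patent's per-suppression-range machinery into a single per-pixel operation, so it is a simplification rather than the claimed method; the effect, though, is the same: phase-only variation between the two acquisitions is suppressed and amplitude changes survive the difference:

```python
import numpy as np

def phase_compensated_difference(img1, img2):
    """Per-pixel inter-image phase difference, phase rotation of the
    second image by that difference, then a pixel-value difference that
    suppresses phase-only variation and keeps amplitude changes."""
    phase_diff = np.angle(img1 * np.conj(img2))  # phase difference per pixel
    img2_rot = img2 * np.exp(1j * phase_diff)    # rotate to match img1's phase
    return img1 - img2_rot

# demo: same amplitudes with different phases -> the difference vanishes
# everywhere except the one pixel whose amplitude actually changed
rng = np.random.default_rng(1)
amp = rng.uniform(1, 2, (4, 4))
img_a = amp * np.exp(1j * rng.uniform(-np.pi, np.pi, (4, 4)))
img_b = amp * np.exp(1j * rng.uniform(-np.pi, np.pi, (4, 4)))
img_b[1, 1] *= 0.5                               # one real change
diff = np.abs(phase_compensated_difference(img_a, img_b))
```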

Method and system for determining a characteristic dimension of a ship
11635511 · 2023-04-25 ·

Disclosed is a method for determining a dimension of a ship, the method being implemented by an electronic system with a radar device having two receiving channels. The method includes: acquiring, for each of the two receiving channels of the radar device, a synthetic aperture radar image imaging the ship in an environment; summing the respective amplitudes of the pixels of the two radar images to obtain a sum image; extracting pixels from the sum image imaging the ship to obtain a mask of the ship; determining a range of phase differences between the radar signals received by each of the two receiving channels; and determining a dimension of the ship as a function of the mask of the ship and the determined phase difference range.
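A drastically simplified sketch of this pipeline follows. The toy data, the thresholding rule for the mask, and the linear phase-to-dimension scale factor (standing in for the actual interferometric height relation) are all assumptions, not the patent's formulation:

```python
import numpy as np

def ship_dimension(ch1, ch2, thresh_ratio=0.5, scale=1.0):
    """Simplified pipeline: amplitude-sum image, threshold mask of the
    ship, inter-channel phase difference over the mask, and a dimension
    proportional to the phase-difference range."""
    sum_img = np.abs(ch1) + np.abs(ch2)               # sum image
    mask = sum_img > thresh_ratio * sum_img.max()     # mask of the ship
    phase = np.angle(ch1[mask] * np.conj(ch2[mask]))  # phase differences
    return scale * (phase.max() - phase.min())        # dimension estimate

# toy scene: 4 bright ship pixels whose inter-channel phase difference
# ramps linearly (as height above the waterline would induce)
ch1 = np.full((8, 8), 0.1 + 0j)
ch2 = np.full((8, 8), 0.1 + 0j)
for i, h in enumerate(np.linspace(0.0, 1.0, 4)):
    ch1[3, 2 + i] = 5.0 * np.exp(1j * 0.2 * h)
    ch2[3, 2 + i] = 5.0
dim = ship_dimension(ch1, ch2, scale=10.0)
```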

Method and system for detecting and analyzing objects

A method for detecting objects and labeling the objects with distances in an image includes steps of: obtaining a thermal image from a thermal camera, an RGB image from an RGB camera, and radar information from an mmWave radar; adjusting the thermal image based on the RGB image to generate an adjusted thermal image, and generating a fused image based on the RGB image and the adjusted thermal image; generating a second fused image based on the fused image and the radar information; detecting objects in the images, and generating, based on the fused image, another fused image including bounding boxes marking the objects; and determining motion parameters of the objects.
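The final distance-labeling step can be sketched as matching each detected bounding box to the nearest radar detection in azimuth. The pinhole mapping from pixel column to azimuth, the field of view, and every value below are assumptions made for illustration, not details from the patent:

```python
def label_boxes_with_distance(boxes, radar_points, img_width, fov_deg=60.0):
    """Attach a radar-measured range to each bounding box by matching the
    box centre's azimuth (derived from its pixel column) to the nearest
    radar detection. boxes: (x0, y0, x1, y1) in pixels;
    radar_points: (azimuth_deg, range_m)."""
    labeled = []
    for x0, y0, x1, y1 in boxes:
        cx = (x0 + x1) / 2.0
        # map pixel column to azimuth within a centred camera FOV
        az = (cx / img_width - 0.5) * fov_deg
        _, range_m = min(radar_points, key=lambda p: abs(p[0] - az))
        labeled.append(((x0, y0, x1, y1), range_m))
    return labeled

boxes = [(280, 100, 320, 200), (30, 90, 90, 180)]
radar = [(-20.0, 30.0), (0.5, 12.0), (15.0, 50.0)]
labeled = label_boxes_with_distance(boxes, radar, img_width=600)
```

A production fusion system would match in both azimuth and elevation and gate the association by the box size, but the azimuth-only version conveys the idea.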