G01S17/933

WINDOW CONTAMINATION SENSOR
20220404266 · 2022-12-22

A contamination sensor for an optical sensor observation window includes a source, two prisms, a detector, and a controller. The source can emit a collimated light beam at an incident angle greater than the critical angle of the interface between a fluid and the window, the window having a refractive index greater than that of the fluid. The prisms can direct the collimated beam within the window such that it reflects within a contamination detection zone of the window. The detector can receive the beam. The controller can communicate with the source and detector and can calculate an emission/detection ratio, defined by the difference between the amount of light emitted by the source and the amount of light that passes from the source to the detector via total internal reflection within the window.
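
As a rough illustration of the comparison the controller performs, the sketch below computes the fraction of emitted light lost in transit and flags contamination when it exceeds a threshold. The function names and threshold value are assumptions for illustration, not taken from the publication.

```python
# Minimal sketch of the emission/detection comparison described above.
# CONTAMINATION_THRESHOLD and both function names are illustrative assumptions.

CONTAMINATION_THRESHOLD = 0.15  # assumed fraction of emitted light lost before flagging

def contamination_metric(emitted_power: float, detected_power: float) -> float:
    """Fraction of emitted light lost in transit through the window.

    With a clean window, total internal reflection returns nearly all of the
    collimated beam to the detector; contaminants on the outer surface
    frustrate the reflection, so the detected power drops.
    """
    return (emitted_power - detected_power) / emitted_power

def window_is_contaminated(emitted_power: float, detected_power: float) -> bool:
    return contamination_metric(emitted_power, detected_power) > CONTAMINATION_THRESHOLD

# Example: 100 mW emitted, 80 mW detected -> 20% loss -> flagged.
print(window_is_contaminated(100.0, 80.0))  # True
```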

OBJECT DETECTION VIA COMPARISON OF SYNCHRONIZED PULSED ILLUMINATION AND CAMERA IMAGING
20220408005 · 2022-12-22 ·

An image processing system may comprise a global shutter camera, an illumination emitter, and a processing system comprising at least one processor and memory. The processing system may be configured to control the image processing system to: control the illumination emitter to illuminate a scene; control the global shutter camera to capture a sequence of images of the scene, wherein the captured sequence of images includes images that are captured without illumination of the scene by the illumination emitter and images that are captured while the scene is illuminated by the illumination emitter; and determine presence of an object in the scene based on comparison of the images captured without illumination of the scene and images captured with illumination of the scene.
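
The comparison step lends itself to a short sketch: averaging the frames captured with and without illumination and differencing them cancels ambient light, leaving mostly the emitter's returns from nearby objects. The array shapes and threshold values below are illustrative assumptions.

```python
# Hedged sketch of the frame-differencing comparison: pixels that brighten
# only under the emitter are candidate reflections from an object in the scene.
import numpy as np

DIFF_THRESHOLD = 30        # assumed intensity delta (8-bit scale)
MIN_OBJECT_PIXELS = 500    # assumed minimum number of bright pixels to call it an object

def detect_object(lit_frames: np.ndarray, dark_frames: np.ndarray) -> bool:
    """lit_frames / dark_frames: (N, H, W) uint8 stacks from the global shutter camera."""
    # Average each stack to suppress per-frame sensor noise.
    lit = lit_frames.mean(axis=0)
    dark = dark_frames.mean(axis=0)
    # Ambient light appears in both stacks and cancels; emitter returns remain.
    diff = lit - dark
    return int((diff > DIFF_THRESHOLD).sum()) >= MIN_OBJECT_PIXELS
```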

Autonomous aircraft sensor-based positioning and navigation system using markers

A system and method are disclosed for the design of a suite of multispectral (MS) sensors and the processing of the enhanced data streams they produce for autonomous aircraft flight. The onboard MS sensor suite is specifically configured to sense a variety of sensor-tuned objects, whether strategically placed markers or surveyed, sensor-significant existing objects, to determine a position and verify position accuracy. The received MS sensor data enables an autonomous aircraft object identification and positioning system to correlate MS sensor output with a priori information stored onboard to determine and verify the position and trajectory of the autonomous aircraft. Once position and trajectory are known, the object identification and positioning system commands the autonomous aircraft's flight management system and autopilot.
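
One way the position determination could look in practice, assuming the sensor suite has already identified several surveyed markers and measured a range to each, is straightforward multilateration. The marker coordinates and the least-squares formulation below are illustrative assumptions, not drawn from the disclosure.

```python
# Hypothetical positioning step: markers identified by the MS sensors are
# matched to their surveyed 3D positions (the onboard a priori data), and the
# aircraft position is recovered from the measured ranges by linear least squares.
import numpy as np

def position_from_markers(marker_positions: np.ndarray, ranges: np.ndarray) -> np.ndarray:
    """marker_positions: (N, 3) surveyed coordinates; ranges: (N,) measured distances, N >= 4."""
    p0, r0 = marker_positions[0], ranges[0]
    # Subtracting the first range equation from the rest linearizes the problem:
    #   2 (p_i - p_0) . x = |p_i|^2 - |p_0|^2 - r_i^2 + r_0^2
    A = 2.0 * (marker_positions[1:] - p0)
    b = (np.sum(marker_positions[1:] ** 2, axis=1) - np.sum(p0 ** 2)
         - ranges[1:] ** 2 + r0 ** 2)
    x, *_ = np.linalg.lstsq(A, b, rcond=None)
    return x

# Illustrative surveyed markers and a simulated measurement.
markers = np.array([[0, 0, 0], [500, 0, 10], [0, 500, -5], [500, 500, 0]], float)
true_pos = np.array([200.0, 300.0, 120.0])
measured = np.linalg.norm(markers - true_pos, axis=1)
print(position_from_markers(markers, measured))  # ~[200, 300, 120]
```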

Methods and systems for acoustic machine perception for an aircraft
11531100 · 2022-12-20

In an example, a method is described. The method includes causing one or more sensors arranged on an aircraft to acquire, over a window of time, first data associated with a first object within the environment of the aircraft, where the one or more sensors include one or more of a light detection and ranging (LIDAR) sensor, a radar sensor, or a camera. The method further includes causing an array of microphones arranged on the aircraft to acquire, over approximately the same window of time, first acoustic data associated with the first object. The method then includes training a machine learning model, using the first acoustic data as an input value and using an azimuth, a range, an elevation, and a type of the first object identified from the first data as ground-truth output labels.
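
A minimal sketch of this training arrangement, with stand-in acoustic features and synthetic labels in place of real LIDAR/radar/camera-derived ground truth (the feature extraction and model choices here are assumptions):

```python
# Acoustic features from the microphone array are inputs; azimuth/range/elevation
# and object type derived from the synchronized LIDAR/radar/camera data serve as
# ground-truth labels. Random data below stands in for real recordings.
import numpy as np
from sklearn.ensemble import RandomForestClassifier, RandomForestRegressor

rng = np.random.default_rng(0)
n_windows, n_features = 200, 64   # assumed: one spectral feature vector per time window

X = rng.normal(size=(n_windows, n_features))     # stand-in acoustic features
y_pose = rng.uniform(size=(n_windows, 3))        # [azimuth, range, elevation] from LIDAR
y_type = rng.integers(0, 3, size=n_windows)      # object class from camera/LIDAR fusion

pose_model = RandomForestRegressor(n_estimators=50).fit(X, y_pose)  # multi-output regression
type_model = RandomForestClassifier(n_estimators=50).fit(X, y_type)

# Inference: once trained, the aircraft can perceive an object from sound alone.
features = X[:1]
print(pose_model.predict(features), type_model.predict(features))
```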

Creating a ground control point file using an existing landmark shown in images

In some examples, a system includes a memory configured to store a first image and a second image captured by one or more cameras mounted on one or more vehicles and store locations and orientations of the one or more cameras at times when the first and second images were captured. The system also includes processing circuitry configured to identify an existing landmark in the first and second images. The processing circuitry is also configured to determine a latitude, a longitude, and an altitude of the existing landmark based on the locations and orientations of the one or more cameras at the times when the images were captured. The processing circuitry is configured to create a file including the location of the existing landmark and pixel coordinates of the existing landmark in the first and second images.
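
As a sketch of how the landmark location might be recovered from two camera poses, assuming each pose yields a ray toward the landmark's pixel in a shared local frame: triangulate the rays' closest approach, then record the result alongside the pixel coordinates. The ray values and the file format are illustrative assumptions; the geodetic conversion from local coordinates to latitude/longitude/altitude is a separate step omitted here.

```python
# Triangulate an existing landmark from two camera rays, then write a simple
# ground-control-point record. All inputs and the record layout are hypothetical.
import json
import numpy as np

def triangulate(o1, d1, o2, d2):
    """Closest-approach midpoint of rays o1 + t*d1 and o2 + s*d2."""
    M = np.column_stack([d1, -d2])                 # solve t*d1 - s*d2 = o2 - o1
    (t, s), *_ = np.linalg.lstsq(M, o2 - o1, rcond=None)
    return 0.5 * ((o1 + t * d1) + (o2 + s * d2))

# Camera origins and unit view directions (already rotated into a local frame).
o1, d1 = np.array([0.0, 0.0, 100.0]), np.array([0.6, 0.0, -0.8])
o2, d2 = np.array([50.0, 0.0, 100.0]), np.array([-0.6, 0.0, -0.8])
landmark = triangulate(o1, d1, o2, d2)

# Assemble the ground-control-point file: landmark position plus its pixel
# coordinates in each image, as the abstract describes.
gcp = {"position_local": landmark.tolist(),
       "pixels": {"image_1": [812, 430], "image_2": [190, 415]}}
print(json.dumps(gcp))
```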

AIRCRAFT IDENTIFICATION

Methods, devices, and systems for aircraft identification are described herein. In some examples, one or more embodiments include a computing device comprising a memory and a processor to execute instructions stored in the memory to: simulate virtual light detection and ranging (Lidar) sensor data for a three-dimensional (3D) model of an aircraft type to generate a first point cloud corresponding to the 3D model of the aircraft type; generate a classification model utilizing the simulated virtual Lidar sensor data of the 3D model; and identify a type and/or sub-type of an incoming aircraft at an airport by receiving, from a Lidar sensor at the airport, Lidar sensor data for the incoming aircraft, generating a second point cloud corresponding to the incoming aircraft from that sensor data, and classifying the second point cloud using the classification model.
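
A compact sketch of this pipeline, with uniform random points standing in for ray-cast Lidar simulation of the 3D models and simple extent/height features standing in for a real shape descriptor (all of which are assumptions):

```python
# Simulated point clouds from 3D aircraft models train a classifier, which then
# labels a point cloud measured by the airport Lidar. Synthetic data throughout.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(1)

def shape_features(cloud: np.ndarray) -> np.ndarray:
    """cloud: (N, 3) points. Bounding extents plus a height histogram as a crude descriptor."""
    extents = cloud.max(axis=0) - cloud.min(axis=0)   # length, wingspan, height
    hist, _ = np.histogram(cloud[:, 2], bins=8, density=True)
    return np.concatenate([extents, hist])

def simulate_cloud(length, span, height, n=2000):
    """Stand-in for ray-casting a 3D model: random points in the aircraft's envelope."""
    return rng.uniform(0, 1, size=(n, 3)) * np.array([length, span, height])

# First point clouds, simulated from 3D models of two assumed aircraft types.
train_clouds = ([simulate_cloud(38, 34, 12) for _ in range(20)]
                + [simulate_cloud(73, 65, 19) for _ in range(20)])
labels = ["narrow-body"] * 20 + ["wide-body"] * 20

model = RandomForestClassifier(n_estimators=50).fit(
    [shape_features(c) for c in train_clouds], labels)

# Second point cloud, from the airport Lidar observing an incoming aircraft.
incoming = simulate_cloud(72, 64, 18)
print(model.predict([shape_features(incoming)]))  # -> ['wide-body']
```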

METHOD OF AUTONOMOUS HIERARCHICAL MULTI-DRONE IMAGE CAPTURING

A method for optimizing image capture of a scene by a swarm of drones, including a root drone and first and second level-1 drones, involves the root drone following a predetermined trajectory over the scene and capturing one or more root keyframe images at corresponding root drone orientations and root drone-to-scene distances. For each root keyframe image, the root drone generates a ground mask image and applies it to the keyframe image to generate a target image. The root drone then analyzes the target image to generate first and second scanning tasks for the first and second level-1 drones to capture a plurality of images of the scene at a level-1 drone-to-scene distance smaller than the root drone-to-scene distance, and the first and second level-1 drones carry out the first and second scanning tasks, respectively.
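
To make the per-keyframe step concrete, the sketch below masks out ground pixels, finds the image columns that still contain targets, and splits them into two scanning tasks for the level-1 drones. The task structure and all values are assumptions for illustration, not the disclosed format.

```python
# Hypothetical per-keyframe task generation: apply the ground mask, locate the
# remaining target region, and divide it between the two level-1 drones.
import numpy as np

def make_scanning_tasks(keyframe: np.ndarray, ground_mask: np.ndarray,
                        level1_distance: float):
    """keyframe: (H, W) image; ground_mask: (H, W) bool, True where ground."""
    target = keyframe * ~ground_mask            # suppress ground, keep structures
    cols = np.where(target.any(axis=0))[0]      # image columns containing targets
    mid = cols[len(cols) // 2]
    # Each task covers one half of the target region at the smaller distance.
    return ({"columns": (int(cols[0]), int(mid)), "distance": level1_distance},
            {"columns": (int(mid), int(cols[-1])), "distance": level1_distance})

# Toy keyframe: a bright structure above a ground region in the lower rows.
frame = np.zeros((100, 200), dtype=np.uint8)
frame[20:60, 40:160] = 255
mask = np.zeros((100, 200), dtype=bool)
mask[60:, :] = True
task1, task2 = make_scanning_tasks(frame, mask, level1_distance=15.0)
print(task1, task2)
```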