B64D47/08

SYSTEM FOR RECORDING AND REAL-TIME TRANSMISSION OF IN-FLIGHT AIRCRAFT COCKPIT ACTIVITY TO GROUND SERVICES
20230002072 · 2023-01-05

A system, method and device for monitoring an aircraft, including activity taking place within the aircraft and conditions of the aircraft, in which one or more sensing components, such as a camera, microphone, or other sensor, are situated at a location within the aircraft from which information may be ascertained. Preferably the component is disguised within the surfaces or instrumentation of the aircraft. The sensing component is connected with a communication mechanism that transmits communications from the portion of the system resident within the aircraft to a ground portion of the system through a communication link, such as a satellite communication link. The system may process the information corresponding to the aircraft condition or activity and generate alerts when a trigger threshold is met or exceeded. The system components on the aircraft may monitor conditions and activity without using or interfering with the aircraft instrumentation, drawing only on the aircraft's power supply.
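The trigger-and-alert processing described above can be sketched in a few lines. This is an illustrative sketch only, not code from the patent; the sensor names, `TRIGGER_THRESHOLDS` table, and `send_to_ground()` placeholder are all hypothetical stand-ins for the described on-board processing and satellite-link transmission.

```python
# Hypothetical trigger thresholds; a real system would define these per sensor.
TRIGGER_THRESHOLDS = {
    "cabin_noise_db": 95.0,  # microphone level above which an alert fires
    "smoke_ppm": 40.0,       # particulate sensor reading
}

def send_to_ground(alert: dict) -> None:
    """Placeholder for transmission over the satellite communication link."""
    print(f"ALERT -> ground station: {alert}")

def process_readings(readings: dict) -> list:
    """Compare sensor readings against triggers; alert when met or exceeded."""
    alerts = []
    for sensor, value in readings.items():
        threshold = TRIGGER_THRESHOLDS.get(sensor)
        if threshold is not None and value >= threshold:
            alert = {"sensor": sensor, "value": value, "threshold": threshold}
            alerts.append(alert)
            send_to_ground(alert)
    return alerts

alerts = process_readings({"cabin_noise_db": 102.3, "smoke_ppm": 12.0})
```

Note that "met or exceeded" maps to `>=` rather than `>`, so a reading exactly at the threshold still generates an alert.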

Autonomous Aerial Vehicle Hardware Configuration

The introduced autonomous aerial vehicle can include multiple cameras for capturing images of the surrounding physical environment that are utilized for motion planning by an autonomous navigation system. In some embodiments, the cameras can be integrated into one or more rotor assemblies that house powered rotors, freeing up space within the body of the aerial vehicle. In an example embodiment, an aerial vehicle includes multiple upward-facing cameras and multiple downward-facing cameras with overlapping fields of view to enable stereoscopic computer vision in a plurality of directions around the aerial vehicle. Similar camera arrangements can also be implemented in fixed-wing aerial vehicles.
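The overlapping fields of view are what make stereoscopic depth estimation possible: for a rectified camera pair, depth follows from the disparity between matched image features. A minimal sketch of that relation, with made-up focal length, baseline, and disparity values (none of these numbers come from the patent):

```python
def depth_from_disparity(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Depth Z = f * B / d for a rectified stereo camera pair.

    focal_px     -- focal length in pixels
    baseline_m   -- distance between the two camera centers in meters
    disparity_px -- horizontal offset of a matched feature between the views
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return focal_px * baseline_m / disparity_px

# A feature offset by 20 px between cameras 0.1 m apart with f = 400 px:
z = depth_from_disparity(400.0, 0.1, 20.0)  # 2.0 m
```

Closer obstacles produce larger disparities, so depth resolution is best at short range, which suits the obstacle-avoidance use of these camera pairs.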

TECHNOLOGIES FOR ANALYZING BEHAVIORS OF OBJECTS OR WITH RESPECT TO OBJECTS BASED ON STEREO IMAGERIES THEREOF

This disclosure enables various technologies for analyzing the behaviors of objects, or behaviors with respect to objects, based on stereo imagery thereof. For example, such analysis may be useful in enforcing certain actions by or toward objects, in surveillance of objects, or in other situations involving such behavioral analysis.

UNMANNED AERIAL VEHICLE WITH VIRTUAL UN-ZOOMED IMAGING

In some examples, a computing device receives, from an unmanned aerial vehicle (UAV), a first image from a first camera on the UAV and a plurality of second images from a plurality of second cameras on the UAV. The plurality of second cameras may be positioned on the UAV to provide a plurality of different fields of view in a plurality of different directions around the UAV. Further, the first camera has a longer focal length than the second cameras. The computing device presents, on a display, a composite image including at least a portion of the first image within a merged image generated from the plurality of second images. The presented composite image enables a user to zoom out from the first image to the merged image, or to zoom in from the merged image to the first image.
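The core compositing step can be sketched as pasting the long-focal-length (telephoto) image into the wide merged image, so that zooming interpolates between the two. This is a hedged sketch under assumed image sizes and a center placement; the patent does not specify where in the merged image the first image is placed.

```python
import numpy as np

def composite(merged: np.ndarray, tele: np.ndarray) -> np.ndarray:
    """Insert the telephoto image at the center of the merged wide image."""
    out = merged.copy()
    mh, mw = merged.shape[:2]
    th, tw = tele.shape[:2]
    y0, x0 = (mh - th) // 2, (mw - tw) // 2  # assumed center placement
    out[y0:y0 + th, x0:x0 + tw] = tele
    return out

wide = np.zeros((480, 640, 3), dtype=np.uint8)            # merged second-camera image
telephoto = np.full((120, 160, 3), 255, dtype=np.uint8)   # first-camera detail
frame = composite(wide, telephoto)
```

A real implementation would also scale and register the telephoto crop to the corresponding region of the merged image so the transition stays geometrically consistent while zooming.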

IMAGING SYSTEM AND ROBOT SYSTEM

An imaging system includes: an unmanned flight vehicle; an imager that is mounted on the unmanned flight vehicle and takes an image of a robot which performs work with respect to a target object; a display structure which is located away from the unmanned flight vehicle and displays the image taken by the imager to a user who manipulates the robot; and circuitry which controls operations of the imager and the unmanned flight vehicle. The circuitry acquires operation related information that is information related to an operation of the robot. The circuitry moves the unmanned flight vehicle such that a position and direction of the imager are changed so as to correspond to the operation related information.
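The pose-following idea can be illustrated with a simple geometric rule: from operation-related information (here reduced to the robot tool's position and approach direction) derive a target position and look direction for the vehicle-mounted imager. The standoff and height offsets are hypothetical, not values from the disclosure.

```python
def imager_target(tool_pos, approach_dir, standoff=1.5, height=0.5):
    """Place the imager `standoff` meters behind the tool along its approach
    direction, raised by `height`, looking back at the tool."""
    ax, ay, az = approach_dir
    norm = (ax * ax + ay * ay + az * az) ** 0.5
    ax, ay, az = ax / norm, ay / norm, az / norm  # unit approach direction
    pos = (tool_pos[0] - standoff * ax,
           tool_pos[1] - standoff * ay,
           tool_pos[2] - standoff * az + height)
    look = tuple(t - p for t, p in zip(tool_pos, pos))  # vector from imager to tool
    return pos, look

pos, look = imager_target((2.0, 0.0, 1.0), (1.0, 0.0, 0.0))
```

Recomputing this target each time the operation-related information changes gives the described behavior of moving the unmanned flight vehicle so the imager's position and direction track the robot's operation.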

METHOD, SYSTEM, AND IMAGE PROCESSING DEVICE FOR CAPTURING AND/OR PROCESSING ELECTROLUMINESCENCE IMAGES, AND AN AERIAL VEHICLE

A method (400) of capturing and processing electroluminescence (EL) images (1910) of a PV array (40) is disclosed herein. In a described embodiment, the method (400) includes: controlling the aerial vehicle (20) to fly along a flight path to capture EL images (1910) of corresponding PV array subsections (512b) of the PV array (40); deriving respective image quality parameters from at least some of the captured EL images; dynamically adjusting a flight speed of the aerial vehicle along the flight path, based on the respective image quality parameters, for capturing the EL images (1910) of the PV array subsections (512b); extracting a plurality of frames (1500) of the PV array subsection (512b) from the EL images (1910); determining a reference frame having a highest image quality of the PV array subsection (512b) from among the extracted frames (2100); performing image alignment of the extracted frames (2100) to the reference frame to generate image-aligned frames (2130); and processing the image-aligned frames (2130) to produce an enhanced image (2140) of the PV array subsection (512b) having a higher resolution than the reference frame. A system, an image processing device, and an aerial vehicle for the method are also disclosed.
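The reference-frame selection and frame-fusion steps can be sketched as follows. This is not the patented method: the sharpness metric (gradient energy) is one common proxy for image quality, the frames are assumed to be already aligned, and simple averaging stands in for the actual resolution-enhancement processing.

```python
import numpy as np

def sharpness(frame: np.ndarray) -> float:
    """Gradient-energy proxy for image quality; higher means sharper."""
    gy, gx = np.gradient(frame.astype(float))
    return float((gx ** 2 + gy ** 2).mean())

def enhance(frames: list) -> tuple:
    """Return (reference frame index, fused image) for aligned 2-D frames."""
    ref = max(range(len(frames)), key=lambda i: sharpness(frames[i]))
    fused = np.mean(np.stack(frames).astype(float), axis=0)  # noise-reducing average
    return ref, fused

blurry = np.ones((8, 8))                        # featureless frame, zero gradient
sharp = np.zeros((8, 8)); sharp[4:, :] = 1.0    # strong edge, high gradient energy
ref_idx, fused = enhance([blurry, sharp])
```

In the described pipeline the same quality parameters also feed back into the flight-speed adjustment, so subsections imaged at low quality can be re-captured at a slower speed.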

Flying body
11565810 · 2023-01-31

[Problem] To provide a flying body having a new structure capable of improving flight efficiency. [Solution] The problem is addressed by a flying body capable of traveling along at least a first direction and comprising an airframe part and an auxiliary part, wherein the airframe part has a body part and a lift-generating part, the body part having a right part and a left part extending along the first direction, and a connecting part by which the ends of the right part and the left part in a second direction, opposite to the first direction, are connected. The flying body is thus configured to create a surrounded space, bounded by the left part, the right part, and the connecting part, when seen from a third direction perpendicular to the first direction.