B64U101/30

Unmanned aerial image capture platform

Methods and systems are disclosed for an unmanned aerial vehicle (UAV) configured to autonomously navigate a physical environment while capturing images of the physical environment. In some embodiments, the motion of the UAV and a subject in the physical environment may be estimated based in part on images of the physical environment captured by the UAV. In response to estimating the motions, image capture by the UAV may be dynamically adjusted to satisfy a specified criterion related to a quality of the image capture.
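The abstract above describes adjusting image capture in response to estimated motion. A minimal sketch of one way such an adjustment could work, assuming a blur-threshold policy of my own invention (the function names, the pixel-based velocity estimate, and the shutter limits are all illustrative, not taken from the patent):

```python
# Hypothetical sketch: estimate the subject's apparent velocity from two
# image positions, then shorten the shutter time so that motion blur during
# the exposure stays below a pixel threshold.

def estimate_pixel_velocity(prev_px, curr_px, dt):
    """Apparent subject velocity in pixels/second from two image positions."""
    dx = curr_px[0] - prev_px[0]
    dy = curr_px[1] - prev_px[1]
    return (dx * dx + dy * dy) ** 0.5 / dt

def adjust_shutter(pixel_velocity, max_blur_px=2.0, max_shutter_s=1 / 60):
    """Choose the longest shutter time that keeps blur under max_blur_px."""
    if pixel_velocity <= 0:
        return max_shutter_s
    return min(max_shutter_s, max_blur_px / pixel_velocity)

# Subject moved 100 px in 0.1 s between frames -> 1000 px/s.
v = estimate_pixel_velocity((100, 100), (160, 180), dt=0.1)
shutter = adjust_shutter(v)  # 2 px / 1000 px/s = 0.002 s
```

In practice a real system would fold in UAV ego-motion and gimbal compensation; this sketch only shows the quality-criterion feedback idea in its simplest form.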

Distributed unmanned aerial vehicle architecture

Methods, systems, and apparatus, including computer programs encoded on computer storage media, for a distributed system architecture for unmanned aerial vehicles (UAVs). One of the methods includes obtaining flight information of a UAV, the flight information including flight phase information or a contingency condition associated with a flight-critical module included in the UAV. The obtained information is analyzed to determine one or more first payload modules that are to enter a modified power state, and a request to enter the modified power state is transmitted to each of the determined payload modules.
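The selection-and-request flow in this abstract can be sketched as a small policy lookup. Everything concrete here is assumed for illustration: the module names, the policy table, and the message string are not from the patent.

```python
# Hypothetical sketch: map a flight phase or contingency condition to the
# payload modules that should enter a modified (low) power state, then send
# each one a request via a supplied transmit callback.

LOW_POWER_POLICY = {
    "takeoff": {"survey_camera", "lidar"},  # defer non-critical payloads
    "contingency:low_battery": {"survey_camera", "lidar", "comms_relay"},
}

def modules_to_power_down(flight_phase, contingency, installed_modules):
    """A contingency condition, if present, overrides the flight phase."""
    key = f"contingency:{contingency}" if contingency else flight_phase
    targets = LOW_POWER_POLICY.get(key, set())
    return sorted(targets & set(installed_modules))

def send_power_requests(modules, transmit):
    for m in modules:
        transmit(m, "ENTER_MODIFIED_POWER_STATE")

sent = []
mods = modules_to_power_down("cruise", "low_battery",
                             ["survey_camera", "gimbal", "lidar"])
send_power_requests(mods, lambda m, msg: sent.append((m, msg)))
```

The intersection with `installed_modules` keeps the request list valid for whatever payload set this particular airframe actually carries.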

Autonomous aerial vehicle assisted viewing location selection for event venue

A processing system including at least one processor may collect, via at least one camera of at least one autonomous aerial vehicle, viewing information for a plurality of positions for each of a plurality of viewing locations within an event venue. The processing system may next present a viewing location selection interface to a user, where the viewing location selection interface provides a simulated view with respect to at least one of the plurality of positions for at least one of the plurality of viewing locations, and where the simulated view is based upon the collected viewing information. The processing system may then obtain a selection from the user of a viewing location of the plurality of viewing locations for an event at the event venue.

Robust laser scanning for generating a 3D model

In a method and system for scanning a structure, a structure scanner may acquire multiple scans of a surface of a structure. Each of the scans may correspond to a different portion of the surface. The system may generate a 3D model of the surface using the scans. To account for potential changes in position and/or orientation of the structure scanner between scans, the structure scanner may self-calibrate using a fiducial marker. By correcting for changes in position and orientation over time, the structure scanner may accurately map the scans of the different portions of the surface to a 3D model of the surface.
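The self-calibration step above can be illustrated with a deliberately simplified 2D, translation-only example (real scanners would also solve for rotation, typically via a rigid registration such as the Kabsch algorithm; the coordinates and function names here are assumed):

```python
# Hypothetical 2D sketch of fiducial-based self-calibration: a marker with a
# known world position is observed in each scan, so the scanner's translation
# drift between scans can be estimated and removed before merging points.

def estimate_offset(marker_world, marker_observed):
    """Translation that maps observed coordinates back into world coordinates."""
    return (marker_world[0] - marker_observed[0],
            marker_world[1] - marker_observed[1])

def correct_scan(points, offset):
    """Apply the estimated translation to every scanned point."""
    dx, dy = offset
    return [(x + dx, y + dy) for x, y in points]

# The scanner drifted between scans: the marker at world (10, 10) now
# appears at (10.5, 9.8) in the new scan's frame.
offset = estimate_offset((10.0, 10.0), (10.5, 9.8))
corrected = correct_scan([(10.5, 9.8), (12.5, 11.8)], offset)
```

After correction, points from successive scans share one world frame, which is what lets the per-portion scans be stitched into a single 3D model.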

Mutual recognition method between unmanned aerial vehicle and wireless terminal
11815913 · 2023-11-14

A mutual recognition method between an unmanned aerial vehicle (UAV) and a wireless terminal includes: when an image of the UAV is positioned in a predetermined section of an imaging surface of an image sensor in the wireless terminal, receiving, by a server, first state information about the wireless terminal from the wireless terminal, the first state information including information about a direction of an external magnetic field of the wireless terminal.

Collaborative relationship between a vehicle and a UAV

Exemplary embodiments described in this disclosure are generally directed to a collaborative relationship between a vehicle and a UAV. In one exemplary implementation, a computer provided in the vehicle uses images captured by an imaging system in the UAV, together with images captured by an imaging system in the vehicle, to modify a suspension system of the vehicle based on the nature of the terrain located below or ahead of the vehicle. The computer may, for example, modify the suspension system before the vehicle reaches a rock or a pothole on the ground ahead. In another exemplary implementation, the computer may generate an augmented reality image that includes a 3D model of the vehicle rendered on an image of the terrain located below or ahead of the vehicle. The augmented reality image may be used by a driver of the vehicle to drive the vehicle over such terrain.

Control apparatus and control method for specular object detection based on an unmanned aerial vehicle's reflection

A control apparatus includes an acquisition unit and a determination unit. The acquisition unit acquires captured data in which an object around a moving object is captured by an imaging unit, where the moving object is either a moving object irradiated with spontaneous emission light or a moving object that moves with a predetermined pattern. The determination unit determines that the object is an obstacle if the captured data acquired by the acquisition unit includes a specific pattern.
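One way to picture the "specific pattern" test in this abstract is as matching a known on/off emission sequence against captured intensity samples; a reflection that reproduces the sequence indicates a (specular) obstacle. The pattern, threshold, and function names below are illustrative assumptions, not the patent's implementation:

```python
# Hypothetical sketch: the moving object emits light in a predetermined
# on/off pattern; if that same pattern appears in the captured intensity
# samples, the reflecting object is flagged as an obstacle.

PATTERN = [1, 0, 1, 1, 0]  # illustrative emission pattern

def binarize(samples, threshold=0.5):
    """Threshold raw intensities into an on/off bit sequence."""
    return [1 if s >= threshold else 0 for s in samples]

def contains_pattern(samples, pattern=PATTERN):
    """True if the binarized samples contain the emission pattern anywhere."""
    bits = binarize(samples)
    n = len(pattern)
    return any(bits[i:i + n] == pattern for i in range(len(bits) - n + 1))

# Captured intensities containing a reflection of the emission pattern.
is_obstacle = contains_pattern([0.1, 0.9, 0.05, 0.8, 0.95, 0.1, 0.2])
```

A production detector would of course operate on image regions over time rather than a 1D sample stream, but the pattern-match-implies-obstacle logic is the same.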

Information processing apparatus

A time slot specifying unit reads out facility information of a facility to be inspected (specifically, the base station ID of a base station whose previous inspection day falls within a past predetermined period) from a facility information storage unit. A daytime information acquiring unit acquires daytime information at the positions at which the base stations to be inspected are installed. The time slot specifying unit specifies non-backlight time slots based on the read-out facility information and the acquired daytime information. An operation plan generating unit generates an operation plan for a drone in which the facilities are photographed in the specified non-backlight time slots. An operation plan outputting unit outputs the generated operation plan.
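The non-backlight selection step can be sketched geometrically: a shot is backlit when the sun lies in front of the camera, so keep only the hours where the angle between the camera heading and the sun's azimuth exceeds some separation. The azimuth table, the 90-degree threshold, and the function names are assumptions for illustration:

```python
# Hypothetical sketch: select hours where the sun is sufficiently far from
# the camera's viewing direction, so the photographed facility is not backlit.

def angle_diff(a, b):
    """Smallest absolute difference between two compass bearings, in degrees."""
    d = abs(a - b) % 360
    return min(d, 360 - d)

def non_backlight_slots(sun_azimuth_by_hour, camera_heading, min_sep_deg=90):
    """Hours where the sun's azimuth is more than min_sep_deg from the heading."""
    return [h for h, az in sun_azimuth_by_hour.items()
            if angle_diff(camera_heading, az) > min_sep_deg]

# Illustrative daytime sun azimuths (degrees); camera facing due east (90).
sun = {9: 120, 12: 180, 15: 240, 17: 270}
slots = non_backlight_slots(sun, camera_heading=90)  # afternoon hours only
```

Real sun azimuths would come from an ephemeris computed from the base station's latitude, longitude, and date, which is presumably what the "daytime information" in the abstract supplies.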

Unmanned aerial vehicle (UAV) systems and methods for maintaining railway situational awareness
11823578 · 2023-11-21

An unmanned aerial vehicle (UAV) system for maintaining railway situational awareness includes a ground station configured to be mounted to a train and a UAV including a sensor, a processor, and a memory. The sensor is configured to provide a signal indicative of a condition and/or an event. The memory contains instructions which, when executed by the processor, cause the system to: selectively deploy the UAV from the ground station mounted to the train; receive the signal from the sensor; and determine a condition and/or an event, relative to the train, based on the sensed signal.

Automated detection and remediation of contagion events

Techniques are provided for implementing automated contagion detection and remediation (ACDR) features to detect and remediate environmental contagion conditions. For example, ACDR techniques can be used to target contagion contamination on surfaces in a trafficked area, rather than focusing on detecting and remediating human symptoms. ACDR systems can include swarms of specially configured drones under the control of one or more centralized controllers to detect the presence of one or more types of pathogens on surfaces and to classify detected contagion events. In some embodiments, upon such detection, the same or other specially configured drones can be triggered to remediate the detected condition by removing the pathogen, by disinfecting surfaces, by cordoning off infected areas, and/or in other ways. Some embodiments can further log and aggregate data relating to detected contagion events to support tracking, remediation, enforcement, protocol updating, research, and/or other efforts.