G05D105/80

Unmanned aerial vehicle inspection route generating apparatus and method
12602061 · 2026-04-14

An unmanned aerial vehicle inspection route generating apparatus and method are provided. The apparatus generates a plurality of inspection points corresponding to a target object to be inspected, and each of the inspection points corresponds to a spatial coordinate. The apparatus calculates a plurality of flight segments based on a three-dimensional model corresponding to the target object to be inspected and the spatial coordinates corresponding to the inspection points, and each of the flight segments corresponds to two of the inspection points. The apparatus calculates a risk value corresponding to each of the flight segments. The apparatus generates an inspection route corresponding to the target object to be inspected based on the risk values and the flight segments.
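The abstract above describes a pipeline of inspection points, risk-scored flight segments, and route selection. A minimal sketch of that idea in Python, with entirely hypothetical coordinates and an invented risk model (the patent does not specify how risk values are computed), might look like this:

```python
import itertools
import math

# Hypothetical inspection points (id -> 3D spatial coordinate); not from the patent.
points = {
    "A": (0.0, 0.0, 10.0),
    "B": (5.0, 0.0, 10.0),
    "C": (5.0, 5.0, 12.0),
    "D": (0.0, 5.0, 12.0),
}

# Illustrative risk model (assumed): longer segments and lower altitudes
# score higher. A real apparatus would derive risk from the 3D model.
def segment_risk(p, q):
    length = math.dist(p, q)
    min_altitude = min(p[2], q[2])
    return length + 20.0 / min_altitude

# One flight segment per pair of inspection points, each with a risk value.
segments = {
    frozenset((a, b)): segment_risk(points[a], points[b])
    for a, b in itertools.combinations(points, 2)
}

def best_route(start="A"):
    """Exhaustively pick the visiting order with the lowest total risk.

    Fine for a handful of points; a real planner would use a heuristic.
    """
    rest = [p for p in points if p != start]

    def total_risk(order):
        route = [start, *order]
        return sum(segments[frozenset(pair)] for pair in zip(route, route[1:]))

    order = min(itertools.permutations(rest), key=total_risk)
    return [start, *order], total_risk(order)

route, risk = best_route()
print(route, round(risk, 2))
```

The exhaustive search stands in for whatever optimization the apparatus actually uses; the point is only that the route is selected by comparing accumulated per-segment risk values.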

Robotic bees and mantis and insect-like robots with embodied artificial intelligence

This invention relates to insect-like robots empowered by generative artificial intelligence (Gen-AI), specifically designed to address critical challenges in agriculture and environmental monitoring. The robotic bee, mantis, and dragonfly can autonomously perform essential tasks such as crop pollination, pest control, and detailed farmland inspection. These robots feature lifelike designs with components such as heads with integrated high-resolution cameras, antennae for communication, specialized mouthparts, thoraxes housing CPUs and actuators, and wings with thin-film photovoltaic solar materials. The AI models function as the brains, processing data captured by various sensors to provide real-time guidance and control commands. Leveraging advanced technologies and sustainable power sources, these insect-like robots enhance productivity, sustainability, and efficiency in agricultural practices, contributing to a more secure and eco-friendly future.

Unmanned platform with bionic visual multi-source information and intelligent perception

Disclosed is an unmanned platform with bionic visual multi-source information and intelligent perception. The unmanned platform is equipped with a bionic polarization vision/inertia/laser radar combined navigation module, a deep learning object detection module, and an autonomous obstacle avoidance module. The combined navigation module positions and orients the unmanned platform in real time; the object detection module senses the environment around the unmanned platform from RGB images of the surroundings collected by the combined navigation module; and the obstacle avoidance module determines, from the objects identified by the object detection module, whether there are obstacles around the unmanned platform during operation, and performs autonomous obstacle avoidance in combination with the platform's navigation and positioning information. The concealment, autonomous navigation, object detection, and autonomous obstacle avoidance capabilities of the unmanned platform are thus improved.
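The obstacle avoidance step above combines detections with navigation state. A minimal sketch of such a decision rule, with invented thresholds and detection format (the abstract gives no specifics), could be:

```python
import math

# Hypothetical detections: (label, bearing in degrees relative to the
# current heading, range in metres). Not the patent's actual data format.
detections = [
    ("tree", -5.0, 12.0),
    ("vehicle", 40.0, 30.0),
]

SAFE_RANGE = 15.0  # assumed minimum stand-off distance, metres
SAFE_CONE = 20.0   # assumed half-angle of the forward corridor, degrees

def avoidance_command(detections, heading_deg):
    """Return a new heading steering away from the nearest blocking
    obstacle, or the current heading if the forward corridor is clear."""
    blocking = [d for d in detections
                if abs(d[1]) < SAFE_CONE and d[2] < SAFE_RANGE]
    if not blocking:
        return heading_deg
    label, bearing, rng = min(blocking, key=lambda d: d[2])
    # Turn away from the obstacle, more sharply the closer it is.
    turn = math.copysign(SAFE_CONE + (SAFE_RANGE - rng), -bearing)
    return (heading_deg + turn) % 360.0

print(avoidance_command(detections, heading_deg=90.0))
```

This is only an illustration of fusing detector output with the platform's heading; the patented system presumably uses a richer planner over the full navigation and positioning state.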

Methods and systems for real-time enhanced learning services and intelligent on-demand task-based services

Aspects of the subject disclosure may include, for example, receiving a user selection relating to a course, transmitting a request to a controller in a vehicle for information regarding capabilities of available devices onboard the vehicle, wherein the available devices include uncrewed aerial vehicles (UAVs), based on the transmitting, obtaining, from the controller, the information regarding the capabilities, responsive to the obtaining, sending a command to the controller to facilitate deployment of one or more of the UAVs to collect data for the course, and after the sending, receiving the data from the controller and incorporating the data into the course for delivery to one or more users onboard the vehicle. Other embodiments are disclosed.

Automated aerial data capture for 3D modeling of unknown objects in unknown environments

A system and method are disclosed for a multi-phase process of automated data capture for photogrammetry and 3D model building of an unknown object (311) in an unknown environment. A planner module (152) generates a flight plan (413) for a camera drone (110) to fly autonomously on a flight path along a virtual polygon grid (302) defined above the target object (311) during a survey phase. A model builder computer (153) receives a point cloud dataset (321) captured by a LiDAR sensor on the camera drone (301) during the survey flight and constructs a low-resolution 3D mesh (331) of the target object (311). The planner module (152) then generates a flight path (413) for the camera drone's inspection phase, with virtual waypoints surrounding the target object (311) at a marginal distance from the surface defined by the low-resolution 3D mesh (331). The model builder (153, 163) builds a high-resolution 3D model (422) of the target object (311) by photogrammetry processing of high-resolution images captured by the camera drone (411, 412) during the inspection phase.
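The inspection-phase waypoints described above sit a marginal distance off the mesh surface. A minimal sketch of that offsetting step, assuming a toy mesh given as (vertex, outward normal) pairs and an invented stand-off distance, might be:

```python
import math

# Illustrative low-resolution mesh samples as (vertex, outward normal)
# pairs; a real mesh would come from meshing the LiDAR point cloud.
mesh = [
    ((0.0, 0.0, 5.0), (0.0, 0.0, 1.0)),
    ((2.0, 0.0, 4.0), (1.0, 0.0, 0.0)),
    ((0.0, 2.0, 4.0), (0.0, 1.0, 0.0)),
]

MARGIN = 3.0  # assumed stand-off distance from the surface, in metres

def offset_waypoint(vertex, normal, margin=MARGIN):
    """Place a waypoint `margin` metres off the surface along the normal."""
    length = math.sqrt(sum(c * c for c in normal))
    unit = tuple(c / length for c in normal)
    return tuple(v + margin * u for v, u in zip(vertex, unit))

waypoints = [offset_waypoint(v, n) for v, n in mesh]
print(waypoints)
```

Offsetting along surface normals keeps the camera at a roughly constant range from the object, which is the property the inspection phase needs for consistent photogrammetry image scale.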