Patent classifications
B64U2101/30
Autonomous unmanned vehicles for responding to situations
Autonomous unmanned vehicles (UVs) for responding to situations are described. Embodiments include UVs that launch upon detection of a situation, operate in the area of the situation, and collect and send information about the situation. The UVs may launch from a vehicle involved in the situation, a vehicle responding to the situation, or from a fixed station. In other embodiments, the UVs also provide communications relays to the situation and may facilitate access to the situation by responders. The UVs further may act as decoupled sensors for vehicles. In still other embodiments, the collected information may be used to recreate the situation as it happened.
Capture of ground truthed labels of plant traits method and system
In embodiments, sensor data associated with a plant growing in a field is acquired, and the sensor data is analyzed to extract one or more phenotypic traits associated with the plant. The one or more phenotypic traits are indexed to one or both of an identifier of the plant or a virtual representation of a part of the plant, and one or more plant insights are determined based on the one or more phenotypic traits, wherein the one or more plant insights include information about one or more of a health, a yield, a planting, a growth, a harvest, a management, a performance, and a state of the plant. One or more of the health, yield, planting, growth, harvest, management, performance, and state of the plant are included in a generated plant insights report.
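The flow described above (sensor data, to extracted traits, to insights indexed by a plant identifier, to a report) can be sketched minimally in Python. All names, thresholds, and trait choices here are illustrative assumptions, not the patent's implementation:

```python
from dataclasses import dataclass, field

@dataclass
class PlantRecord:
    plant_id: str                                  # identifier the traits are indexed to
    traits: dict = field(default_factory=dict)     # extracted phenotypic traits
    insights: dict = field(default_factory=dict)   # insights derived from the traits

def extract_traits(sensor_data: dict) -> dict:
    """Stand-in for trait extraction (e.g. height, leaf area) from sensor data."""
    return {"height_cm": sensor_data["height_cm"],
            "leaf_area_cm2": sensor_data["leaf_area_cm2"]}

def derive_insights(traits: dict) -> dict:
    """Toy insight: classify health from leaf area (the threshold is illustrative)."""
    return {"health": "good" if traits["leaf_area_cm2"] > 100 else "poor"}

def build_report(records: list) -> dict:
    """Index insights by plant identifier, yielding a plant insights report."""
    return {r.plant_id: r.insights for r in records}

rec = PlantRecord("plant-001")
rec.traits = extract_traits({"height_cm": 42.0, "leaf_area_cm2": 150.0})
rec.insights = derive_insights(rec.traits)
report = build_report([rec])
```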
Unmanned aerial vehicle control system, unmanned aerial vehicle control method, and program
Stability of an unmanned aerial vehicle is improved by an unmanned aerial vehicle control system in which a flight controller controls flight of the unmanned aerial vehicle based on an instruction from a first operator. A determiner determines, based on a predetermined determination method, whether a second operator visually recognizes the unmanned aerial vehicle. A switcher switches, based on a result of the determination obtained by the determiner, from a first state, in which the unmanned aerial vehicle flies in accordance with an instruction from the first operator, to a second state, in which the unmanned aerial vehicle flies in accordance with an instruction from the second operator.
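The determiner/switcher pair amounts to a small state machine. A minimal sketch, assuming a boolean stand-in for the "predetermined determination method" (class and method names are hypothetical):

```python
from enum import Enum

class ControlState(Enum):
    FIRST_OPERATOR = 1    # UAV flies on the first operator's instructions
    SECOND_OPERATOR = 2   # UAV flies on the second operator's instructions

class UAVControlSwitcher:
    """Control passes to the second operator only when the determiner
    reports that the second operator visually recognizes the UAV."""
    def __init__(self):
        self.state = ControlState.FIRST_OPERATOR

    def determiner(self, second_operator_sees_uav: bool) -> bool:
        # Stand-in for the predetermined determination method.
        return second_operator_sees_uav

    def switcher(self, second_operator_sees_uav: bool) -> ControlState:
        # Switch from the first state to the second state on a positive determination.
        if (self.state is ControlState.FIRST_OPERATOR
                and self.determiner(second_operator_sees_uav)):
            self.state = ControlState.SECOND_OPERATOR
        return self.state

s = UAVControlSwitcher()
before = s.switcher(False)   # no visual recognition: stays with first operator
after = s.switcher(True)     # visual recognition: hands over to second operator
```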
Method of navigating an unmanned aerial vehicle for streetlight maintenance
An unmanned aerial vehicle (UAV) includes a body that supports one or more rotors, the one or more rotors each driven by a motor and configured to provide lift to the body. The UAV further includes a parts handler coupled to the body, the parts handler configured to grasp a payload and rotate the payload with respect to an external structure to couple the payload to, or decouple the payload from, the external structure. The UAV includes a stabilizing mechanism extending from the body, the stabilizing mechanism configured to contact the external structure without transferring the entire weight of the UAV to the external structure, and to prevent rotation of the body when the parts handler rotates the payload.
UAV video aesthetic quality evaluation method based on multi-modal deep learning
The present disclosure provides a UAV video aesthetic quality evaluation method based on multi-modal deep learning, which establishes a UAV video aesthetic evaluation data set, analyzes the UAV video through a multi-modal neural network, extracts high-dimensional features, and concatenates the extracted features, thereby achieving aesthetic quality evaluation of the UAV video. The method has four steps. Step one: establish a UAV video aesthetic evaluation data set, divided into positive samples and negative samples according to video shooting quality. Step two: use SLAM technology to recover the UAV's flight trajectory and to reconstruct a sparse 3D structure of the scene. Step three: extract features of the input UAV video through a multi-modal neural network on the image branch, motion branch, and structure branch respectively. Step four: concatenate the features from the multiple branches to obtain the final video aesthetic label and video scene type.
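The fusion in step four is a plain concatenation of per-branch feature vectors before a final prediction head. A minimal sketch with placeholder extractors standing in for the three branches (the function names and fixed feature values are assumptions for illustration):

```python
# Stand-ins for the three branch networks described in step three.
def extract_image_features(frames):
    return [0.1, 0.2]          # image-branch features (placeholder values)

def extract_motion_features(trajectory):
    return [0.3]               # motion-branch features from the SLAM trajectory

def extract_structure_features(sparse_points):
    return [0.4, 0.5]          # structure-branch features from the sparse 3D scene

def concatenate_branches(frames, trajectory, sparse_points):
    """Step four: concatenate per-branch features into one vector that a
    final classifier would map to an aesthetic label and scene type."""
    return (extract_image_features(frames)
            + extract_motion_features(trajectory)
            + extract_structure_features(sparse_points))

fused = concatenate_branches(None, None, None)
```

In a real network the three extractors would be learned sub-networks and the concatenated vector would feed shared fully connected layers; the structure of the fusion is the same.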
System, devices and methods for tele-operated robotics
The system, devices and methods herein enable autonomous and tele-operation of tele-operated robots for maintenance of a property around known and unknown obstacles. A method may include using an unmanned aerial vehicle to obtain additional data, including an aerial image, relating to the property and obstacles within the property, and planning a path around the obstacles using the aerial image together with data from sensors on board the tele-operated robot. A method may also optimize the total time needed for performing the property maintenance and the labor costs in situations where manual intervention is needed to navigate the tele-operated robot around obstacles on the property or to remove obstacles from the property.
Systems and methods for generating a two-dimensional map
A system, computer-implemented method and non-transitory computer readable medium storing instructions for generating a two-dimensional (2D) map of an area of interest are provided. The system comprises a processor and memory storing instructions which when executed by the processor configure the processor to perform the method. The method comprises determining a perimeter of an area of interest, obtaining nadir images of the area of interest, obtaining at least one oblique image of the area of interest from at least one corner of the perimeter, and processing the nadir and oblique images together to form the 2D map of the area of interest.
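The described workflow (perimeter, nadir coverage, one oblique image per perimeter corner, joint processing) can be outlined as a small pipeline. The callables here are assumptions standing in for the camera and photogrammetry steps, not the patented implementation:

```python
def generate_2d_map(perimeter_corners, capture_nadir, capture_oblique, stitch):
    """Obtain nadir images over the area, an oblique image from each
    perimeter corner, then process all images together into a 2D map."""
    nadir_images = capture_nadir(perimeter_corners)
    oblique_images = [capture_oblique(corner) for corner in perimeter_corners]
    return stitch(nadir_images + oblique_images)

# Stub usage: a rectangular perimeter with string placeholders for images.
corners = [(0, 0), (0, 10), (10, 10), (10, 0)]
map_2d = generate_2d_map(
    corners,
    capture_nadir=lambda c: [f"nadir-{i}" for i in range(len(c))],
    capture_oblique=lambda corner: f"oblique-{corner}",
    stitch=lambda images: images,   # real stitching would run photogrammetry here
)
```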
OBJECT PLACEMENT VERIFICATION
Described are approaches for monitoring construction of a structure. In an embodiment, sensor data (e.g., imaging data, LIDAR, infrared, etc.) of a construction site is obtained. The sensor data is analyzed and objects related to the construction site are identified. The objects are mapped to corresponding objects of a builder's design plans of the construction site, and the locations of components are checked for accuracy. When a discrepancy above a threshold is detected, a report indicating such errors is generated and provided to appropriate entities.
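The threshold check reduces to comparing detected positions against planned positions and reporting only discrepancies above a tolerance. A minimal 2D sketch; the object names, coordinates, and 5 cm threshold are illustrative assumptions:

```python
def verify_placement(detected, planned, threshold_m=0.05):
    """Compare detected object locations (x, y in metres) against the
    design plan; report objects whose discrepancy exceeds the threshold."""
    report = []
    for name, (dx, dy) in detected.items():
        px, py = planned[name]
        error = ((dx - px) ** 2 + (dy - py) ** 2) ** 0.5  # Euclidean discrepancy
        if error > threshold_m:
            report.append({"object": name, "error_m": round(error, 3)})
    return report

detected = {"column-A": (1.00, 2.10), "beam-B": (5.00, 3.00)}
planned  = {"column-A": (1.00, 2.00), "beam-B": (5.00, 3.01)}
issues = verify_placement(detected, planned)   # only column-A exceeds 5 cm
```

A production system would compare full 3D poses from LIDAR or photogrammetry against a BIM model, but the report-on-threshold logic is the same shape.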
UNMANNED AERIAL VEHICLE
Provided is an unmanned aerial vehicle configured to be able to discharge a liquid, comprising: a container of the liquid; a discharging unit configured to discharge the liquid; and a coupling unit configured to couple the container and the discharging unit together, wherein the coupling unit includes a buffering unit configured to buffer stress generated in the coupling unit. The container may be an aerosol container. The coupling unit may include a rigid unit coupled to the buffering unit and configured to have higher rigidity than the buffering unit.
AERIAL RECONNAISSANCE DRONE AND METHOD
An aerial reconnaissance drone having a dragonfly format (elongate fuselage and flapping wings) with two cameras having respective diagonal fields of view, arranged at respective ends of the fuselage, both pointing forwards, wherein the second camera has a diagonal field of view that is at most half that of the first camera. This has the advantage of providing a drone that can capture enhanced imagery when required, by performing a half turn and switching which camera is being used. Since this avoids placing two cameras in the same location, both can have a clear view of the surroundings, and it helps avoid imbalance caused by concentrating too much mass at any particular off-centre location.