B64U2201/00

CLOUD & HYBRID-CLOUD FLIGHT VEHICLE & ROBOTIC CONTROL SYSTEM AI & ML ENABLED CLOUD-BASED SOFTWARE & DATA SYSTEM METHOD FOR THE OPTIMIZATION AND DISTRIBUTION OF FLIGHT CONTROL & ROBOTIC SYSTEM SOLUTIONS AND CAPABILITIES

A robotic vehicle management system for the control, optimization, and distribution of robotic vehicles is presented, in which vehicle operational data is recorded and used to model and optimize a vehicle's travel path. A process for receiving data from multiple vehicles is disclosed, wherein the recorded data is used to optimize control systems with regard to travel path, fuel savings, safety, and other considerations. The recorded data may be used to improve system operations or the operations of individual vehicles. Methods and techniques are also provided for reading data from vehicle sensors, applying analysis techniques to this data, and uploading improved operational processes to one or more vehicles or to a fleet of vehicles. Adaptive controls, learning-based controls, navigation systems, and other capabilities may be included for optimization and distribution by the disclosed system and methods.

Intelligent location awareness for unmanned systems
11518513 · 2022-12-06

In some embodiments, a method for determining a location of an unmanned system (UMS) can include: receiving data from a plurality of data sources, wherein the data sources include a geolocation sensor and at least one of an RF receiver, a RADAR system, a LIDAR system, a SONAR system, an infrared camera, a simultaneous localization and mapping (SLAM) system, an inertial sensor, or an acoustic sensor; determining a reliability of one or more of the data sources based on the received data; assigning weights to the data sources based at least in part on the determination of the reliability of the one or more data sources; and determining the location of the UMS using the received data and the assigned weights.
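The abstract leaves the weighting scheme open; one minimal sketch of reliability-weighted fusion, assuming each source yields a latitude/longitude estimate and a scalar weight (both the weighting function and its scale are illustrative, not the patented method), is:

```python
def fuse_location(estimates):
    """Weighted average of (lat, lon, weight) position estimates.

    `estimates` is a list of (lat, lon, weight) tuples, where each
    weight reflects the assessed reliability of that data source.
    """
    total = sum(w for _, _, w in estimates)
    if total == 0:
        raise ValueError("all sources were judged unreliable")
    lat = sum(la * w for la, _, w in estimates) / total
    lon = sum(lo * w for _, lo, w in estimates) / total
    return lat, lon

def reliability_weight(residual, scale=1.0):
    """Map a source's residual (its disagreement with the consensus
    estimate) to a weight: larger residuals yield smaller weights."""
    return 1.0 / (1.0 + (residual / scale) ** 2)
```

A source whose readings drift from the consensus (for example, a jammed RF receiver) would see its residual grow and its influence on the fused fix shrink accordingly.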

FLIGHT GUIDANCE AND CONTROL INTERFACES FOR UNMANNED AIR VEHICLES

Systems, methods, and devices described in the present disclosure provide technology for detecting features of interest depicted in a video stream supplied by a camera-equipped drone and adjusting the flight pattern of the drone to increase the amount of time in which the features of interest are within the line of sight of the drone. In addition, the present disclosure provides technology for detecting when events of interest occur at the features of interest and alerting personnel when the events occur. In some examples, the technology of the present disclosure also determines a format for an alert (e.g., a push notification) based on an event type so that personnel who receive the alert are apprised of the urgency of the event based on the format.
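The format-selection step described above could be as simple as a lookup from event type to alert attributes; the event categories and format fields below are hypothetical, since the abstract does not enumerate them:

```python
# Hypothetical event types and alert formats; the disclosure does not
# enumerate the actual categories.
ALERT_FORMATS = {
    "intrusion":           {"channel": "push",  "priority": "high"},
    "equipment_fault":     {"channel": "push",  "priority": "medium"},
    "routine_observation": {"channel": "email", "priority": "low"},
}

DEFAULT_FORMAT = {"channel": "email", "priority": "low"}

def format_alert(event_type, detail):
    """Build an alert whose delivery channel and priority convey the
    urgency of the detected event type to the recipient."""
    fmt = ALERT_FORMATS.get(event_type, DEFAULT_FORMAT)
    return {"message": detail, **fmt}
```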

Techniques for image recognition-based aerial vehicle navigation

A control terminal for controlling an unmanned aerial vehicle (UAV) includes a processor and a storage medium storing instructions that, when executed by the processor, cause the processor to render an image on a user interface of the control terminal. The image is captured by an imaging device coupled to the UAV and is associated with a view of the imaging device. The instructions further cause the processor to detect, via the user interface, a gesture-based input including one or more reference points in the image and indicating a view change of the imaging device, determine a type of the gesture-based input by analyzing the one or more reference points, and generate control data based on the type of the gesture-based input to control at least one of the UAV or the imaging device for the view change of the imaging device.

Altitude estimation for aerial vehicles
11512953 · 2022-11-29

A method for determining an altitude of a moving object includes obtaining pressure-dependent data from a plurality of sensors and computing the altitude of the moving object based on the pressure-dependent data from the plurality of sensors. Each of the sensors is mounted on the moving object with a respective primary orientation direction, and the primary orientation directions of at least two of the sensors are different.
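The abstract does not give the computation, but one plausible reading is that readings from differently oriented pressure sensors are averaged so that motion-induced dynamic-pressure errors (positive on the windward side, negative on the leeward side) tend to cancel before a standard-atmosphere conversion. A sketch under that assumption:

```python
def barometric_altitude(pressure_pa, p0=101325.0, t0=288.15, lapse=0.0065):
    """Standard-atmosphere altitude (m) from static pressure (Pa),
    using the usual troposphere model h = (T0/L) * (1 - (p/p0)^(R*L/g))."""
    g, R = 9.80665, 287.053
    return (t0 / lapse) * (1.0 - (pressure_pa / p0) ** (R * lapse / g))

def fused_altitude(sensor_pressures_pa):
    """Average the pressures from sensors with different primary
    orientation directions, then convert the mean to altitude."""
    mean_p = sum(sensor_pressures_pa) / len(sensor_pressures_pa)
    return barometric_altitude(mean_p)
```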

Flight status inspection system, flight status inspection method, and non-transitory computer-readable recording medium storing program

A flight status inspection system, a flight status inspection method, and a non-transitory computer-readable recording medium storing a program inspect the flight status of a flying object (drone). The drone has a center-of-gravity movement device for shifting the position of the center of gravity of the entire drone. In addition, the flight status inspection system has an inspection device for acquiring and storing information about the flight status when the drone's center of gravity is moved during flight, or when the flight details are changed while the center of gravity is moving.

IOT drone fleet

Apparatus, systems, processes, and computer-readable media for facilitating the use of drones are described. For one embodiment, such a system includes a user element having a user application computer program configured to instruct a user interface device to facilitate use of user data and use of mission parameter(s) for a proposed drone mission. An owner element includes an owner application computer program configured to facilitate use of owner data and use of at least one drone parameter. A fleet system element is communicatively coupled to the user element and to the owner element and includes a computer system processor configured to facilitate use of a fleet record and use of at least one fleet parameter.

Adhoc geo-fiducial mats for landing UAVs

An apparatus for visual navigation of a UAV includes a geo-fiducial mat and a plurality of geo-fiducials. The geo-fiducial mat includes a survey point and a landing pad region that provides a location for aligning with a landing pad of a UAV. Each geo-fiducial is specified for a unique direction and offset position in or about the landing pad region relative to the survey point, and each includes a two-dimensional (2D) pattern that visually conveys an alphanumeric code. The 2D pattern has a shape from which a visual navigation system of the UAV can visually triangulate the position of the UAV.
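The abstract does not specify the triangulation math. As an illustration only: if the decoded fiducials gave ranges to three known mat positions, a planar position could be recovered with a standard trilateration step (linearizing the circle equations), not necessarily the patented method:

```python
def trilaterate_2d(p1, r1, p2, r2, p3, r3):
    """Solve for (x, y) from three known fiducial positions p1..p3 and
    measured ranges r1..r3 by subtracting pairs of circle equations,
    which yields two linear equations in x and y."""
    x1, y1 = p1
    x2, y2 = p2
    x3, y3 = p3
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = r1**2 - r2**2 + x2**2 - x1**2 + y2**2 - y1**2
    a2, b2 = 2 * (x3 - x2), 2 * (y3 - y2)
    c2 = r2**2 - r3**2 + x3**2 - x2**2 + y3**2 - y2**2
    det = a1 * b2 - a2 * b1  # zero when the fiducials are collinear
    x = (c1 * b2 - c2 * b1) / det
    y = (a1 * c2 - a2 * c1) / det
    return x, y
```

The mat layout described in the abstract, with fiducials at distinct directions and offsets around the survey point, naturally avoids the collinear degenerate case.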

CONTROL APPARATUS, FIRST MOBILE TERMINAL, METHOD, PROGRAM, AND RECORDING MEDIUM
20220371731 · 2022-11-24

In order for a mobile terminal to land at an appropriate landing point in response to an incident that may occur while the mobile terminal flies along a flight path, a control apparatus 100 includes: an information acquisition section 131 configured to acquire, during a flight of a first mobile terminal (mobile terminal 200a) performed based on a flight path to a first landing point, information on one or more second landing points associated with the flight path to the first landing point; and a first communication processing section 133 configured to transmit the information on the one or more second landing points to the first mobile terminal (mobile terminal 200a) via a mobile communication network 300.

METHOD AND APPARATUS FOR UAV AND UAV CONTROLLER GROUP MEMBERSHIP UPDATE
20220371732 · 2022-11-24

In the method, an unmanned aerial system application enabler (UAE) server can determine, based on a received request, that a first UAV (UAV-1) is to be replaced with a second UAV (UAV-2). The UAV-2 is recognized by the UAE server based on a Civil Aviation Authority (CAA) level identity (ID) of the UAV-2. The UAE server sends a request to perform a group membership update to a SEAL group management (GM) server; the group membership update replaces the UAV-1 with the UAV-2. The UAE server then receives a response message from the SEAL GM server. The request to perform the group membership update includes (i) an ID of a UAE client that corresponds to the group of the UAV-1 and the UAV controller (UAV-C), (ii) a user equipment (UE) ID of the UAV-1, (iii) a UE ID of the UAV-2, and (iv) the CAA-level ID of the UAV-2.
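The four-field update request could be modeled as a simple record; the field names below are illustrative and not the schema 3GPP actually defines for this message:

```python
from dataclasses import dataclass, asdict

@dataclass
class GroupMembershipUpdateRequest:
    """Sketch of the UAE server's request to the SEAL GM server;
    field names are hypothetical, not the 3GPP-defined schema."""
    uae_client_id: str   # (i)  UAE client for the UAV-1 / UAV-C group
    old_uav_ue_id: str   # (ii) UE ID of the UAV being replaced (UAV-1)
    new_uav_ue_id: str   # (iii) UE ID of the replacement UAV (UAV-2)
    new_uav_caa_id: str  # (iv) CAA-level ID of UAV-2

def build_update_payload(req: GroupMembershipUpdateRequest) -> dict:
    """Serialize the request for transmission to the SEAL GM server."""
    return asdict(req)
```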