G05D2109/20

SYSTEMS AND METHODS FOR AIRCRAFT LANDING GUIDANCE DURING GNSS DENIED ENVIRONMENT

A system comprises a GNSS sensor onboard an aerial vehicle; a monitor warning system (MWS) that determines whether the vehicle is in a GNSS-denied environment; and a flight management system that includes a landing guidance module and a database holding location coordinates of landing sites. Onboard vision sensors and a radar velocity system (RVS) communicate with the guidance module. When the MWS determines that the vehicle is in a GNSS-denied environment, the guidance module calculates an optimal flight path by: receiving image data from the vision sensors; receiving position, velocity and altitude data from the RVS; receiving location coordinates of a landing site; processing the image data and the position, velocity and altitude data to determine a location of the vehicle and provide 3D imaging of a route to the landing site; and calculating a flight path angle to the landing site using the vehicle and landing site coordinates.
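
The final step, computing a flight path angle from vehicle and landing site coordinates, can be sketched as below. The local level (east/north/altitude) frame, the coordinate names, and the flat-earth approximation are assumptions for illustration, not the patent's method.

```python
import math

def flight_path_angle(vehicle, site):
    """Flight-path angle (radians) from vehicle to landing site.

    `vehicle` and `site` are (east_m, north_m, alt_m) tuples in a local
    level frame. Negative angle means the site is below the vehicle.
    """
    de = site[0] - vehicle[0]
    dn = site[1] - vehicle[1]
    dalt = site[2] - vehicle[2]
    horizontal = math.hypot(de, dn)  # ground distance to the site
    return math.atan2(dalt, horizontal)

# Vehicle at 500 m altitude, landing site 3 km ahead at ground level:
gamma = flight_path_angle((0.0, 0.0, 500.0), (3000.0, 0.0, 0.0))
```

A descent of 500 m over 3 km of ground track gives an angle of roughly -9.5 degrees.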

DEVICE AND METHOD FOR INSPECTING AN INDUSTRIAL VEHICLE

A method for inspecting industrial equipment in an operating space includes the following steps: receiving image data, detected by a measurement device, representing an image that includes at least one component of the industrial equipment; processing the image data by means of machine vision algorithms to identify a control parameter; processing the image data with machine vision algorithms to determine a dimensional measurement corresponding to the identified control parameter; formulating and sending a read request to a memory based on the identified control parameter; receiving, in response, an allowable maintenance value representing an allowable value of the identified control parameter; comparing the allowable maintenance value with the dimensional measurement; and generating a diagnosis based on that comparison.
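
The comparison and diagnosis steps might be sketched as below. The function name, the millimetre units, and the wear-style check (measurement must not fall below the allowable value, as for a brake pad or tread depth) are illustrative assumptions; the direction of the comparison would depend on the control parameter.

```python
def diagnose(measured_mm, allowable_mm, tolerance_mm=0.0):
    """Compare a vision-derived dimensional measurement against the
    allowable maintenance value read from memory, and return a diagnosis.
    Assumes a wear parameter: smaller than allowable means worn out.
    """
    if measured_mm >= allowable_mm - tolerance_mm:
        return "serviceable"
    return "maintenance required"
```

For example, a measured thickness of 3.2 mm against an allowable value of 2.0 mm yields "serviceable", while 1.5 mm yields "maintenance required".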

VEHICLE CONTROL LOOPS AND INTERFACES

Embodiments relate to an aircraft control and interface system configured to adaptively control an aircraft according to different flight states by modifying one or more processing control loops. The system receives sensor data from one or more sensors of the aircraft. The system determines, from the sensor data, that a component of the aircraft is compromised. The system determines that the aircraft is in a degraded flight state due to the compromised component. The system operates the aircraft according to the degraded flight state, wherein operating the aircraft according to the degraded flight state includes: (a) modifying one or more processing loops based on the degraded flight state and (b) generating an actuator command by applying the degraded flight state and a signal based on an input from a vehicle control interface to the modified one or more processing loops.
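
A minimal sketch of the idea of steps (a) and (b): the processing loop's parameters are swapped for more conservative ones when a degraded flight state is detected, and the actuator command is generated by running the control-interface input through the modified loop. The state names, gains, and rate limits are invented for illustration.

```python
from dataclasses import dataclass

@dataclass
class ControlLoop:
    gain: float        # command gain applied to the pilot input
    rate_limit: float  # max change in actuator command per step

NOMINAL = ControlLoop(gain=2.0, rate_limit=1.0)
DEGRADED = ControlLoop(gain=0.8, rate_limit=0.3)  # conservative settings

def actuator_command(pilot_input, flight_state, prev_cmd=0.0):
    """Apply the flight state and the interface input to the
    (possibly modified) processing loop to get an actuator command."""
    loop = DEGRADED if flight_state == "degraded" else NOMINAL
    raw = loop.gain * pilot_input
    delta = max(-loop.rate_limit, min(loop.rate_limit, raw - prev_cmd))
    return prev_cmd + delta
```

The same pilot input thus produces a smaller, rate-limited actuator command when the aircraft is in the degraded flight state.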

GOLF SUPPORT SYSTEM AND COMPUTER READABLE MEDIUM
20250229158 · 2025-07-17

A golf support system supports a user playing golf on a golf course. The golf support system includes a course travelling vehicle operated on the golf course. A photographing unit photographs the trajectory of a golf ball as a trajectory image, using a camera mounted on the course travelling vehicle. A trajectory prediction unit calculates the trajectory of the golf ball as a predicted trajectory by using the trajectory image. A route calculation unit calculates a predicted drop position, that is, the drop position of the golf ball, by using the predicted trajectory. The route calculation unit then calculates a travelling route to the predicted drop position of the golf ball.
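
As a first-order illustration of trajectory-based drop prediction (not the patent's image-based fit, which would also account for drag and spin), the drop distance of an idealized ballistic trajectory follows directly from the launch speed and angle:

```python
import math

def predicted_drop(speed_mps, launch_deg, g=9.81):
    """Drag-free drop distance (m) of a ball launched from ground
    level -- a deliberate simplification of trajectory prediction."""
    theta = math.radians(launch_deg)
    return speed_mps ** 2 * math.sin(2 * theta) / g

# A driver launch of ~70 m/s at 12 degrees:
carry = predicted_drop(70.0, 12.0)
```

The route calculation unit would then plan a travelling route from the vehicle's current position to this predicted drop position.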

IMAGING CONTROLS FOR UNMANNED AERIAL VEHICLES
20240124138 · 2024-04-18

Methods, systems, and apparatus, including computer programs encoded on a computer storage medium, are described for performing privacy-aware surveillance with a security camera drone. The drone obtains image data using an imaging device and processes the image data to detect an object external to the drone. In response to processing the image data, the drone determines multiple viewing angles indicated in the image data with respect to the object. Based on the image data and the multiple viewing angles, a first section of the area for surveillance and a second, different section of the area to be excluded from surveillance are identified. The drone determines an adjustment to at least one of the multiple viewing angles to cause the second section to be excluded from surveillance. Based on the adjustment, the drone controls the imaging device to exclude the second section from surveillance imagery obtained by the drone.
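
One simple way to enforce such an exclusion in post-processing is to mask the excluded pixel regions; this is a complementary illustration only, since the patent excludes the section by adjusting the imaging device's viewing angles rather than by masking. The image representation and region format are assumptions.

```python
def mask_excluded(image, excluded_regions):
    """Zero out pixels in regions excluded from surveillance imagery.

    `image` is a list of pixel rows; each region is an (x0, y0, x1, y1)
    box with exclusive upper bounds.
    """
    for (x0, y0, x1, y1) in excluded_regions:
        for y in range(y0, y1):
            for x in range(x0, x1):
                image[y][x] = 0  # blank the privacy-protected pixel
    return image
```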

AUTONOMOUS VEHICLE DELIVERY SYSTEM
20240140629 · 2024-05-02

An autonomous vehicle (AV) delivery system is configured to deliver a payload or package in a rural and/or urban environment. The AV delivery system includes a first autonomous vehicle (AV). The first AV is configured to travel between a payload receiving location and a payload drop location. The AV delivery system further includes a second autonomous vehicle (AV) coupled to the first AV. The second AV is coupled to a payload and configured to travel between the first AV and a designated drop target adjacent to a ground or receiving surface at the payload drop location.

INFORMATION PROCESSING DEVICE, INFORMATION PROCESSING METHOD, AND DELIVERY SYSTEM
20240169304 · 2024-05-23

The management server 2 calculates the total weight of a plurality of articles loaded on the UAV 1, on the basis of the weights of the articles included in each of a plurality of orders, and determines whether the UAV 1 is capable of loading and delivering the articles included in each of the plurality of orders by comparing the loadable weight of the UAV 1 with the calculated total weight.
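
The feasibility check reduces to summing per-order article weights and comparing against the UAV's loadable weight. A minimal sketch, with invented function and parameter names:

```python
def can_deliver(orders_item_weights_kg, loadable_weight_kg):
    """Return (total_kg, ok): whether the UAV's loadable weight covers
    the combined weight of all articles across the given orders.

    `orders_item_weights_kg` is a list of orders, each a list of
    per-article weights in kg (an assumed representation).
    """
    total = sum(w for order in orders_item_weights_kg for w in order)
    return total, total <= loadable_weight_kg
```

For example, orders weighing 1.2 + 0.8 kg and 2.5 kg total 4.5 kg, which fits a 5.0 kg loadable weight but not a 4.0 kg one.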

Disaster Situation Communication Network Infrastructure Supplementation Method and System Using Unmanned Mobile Device

A device for establishing a communication network and collecting situation information at a site of a collapse disaster is disclosed. The device includes a ground drone 10 deployed at the site of the collapse disaster, the ground drone 10 having a communication device 80 mounted thereon, a flying drone 32 mounted on and carried by the ground drone 10 to fly and photograph the site of the collapse disaster, a camera device 40 mounted on the ground drone 10 to photograph surroundings of the ground drone 10, a storage 50 installed on the ground drone 10, and a plurality of repeater modules 60 connected by a wireless communication network to relay wireless communications between the ground drone 10, the flying drone 32, and a command and control center 100, wherein the storage 50 accommodates the repeater modules 60, and throws the repeater modules 60 in response to an operation signal.

Concept for designing and using a UAV controller model for controlling a UAV
11984038 · 2024-05-14

Examples relate to a method for generating an Unmanned Aerial Vehicle (UAV) controller model for controlling a UAV, a system including a UAV, a wind generator, a motion-tracking system and a control module, and to a UAV. The method for training the UAV controller model includes providing a wind generator control signal to a wind generator, to cause the wind generator to emit a wind current towards the UAV. The method includes operating the UAV using the UAV controller model. A flight of the UAV is influenced by the wind generated by the wind generator. The method includes monitoring the flight of the UAV using a motion-tracking system to determine motion-tracking data. The method includes training the UAV controller model using a machine-learning algorithm based on the motion-tracking data.
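
The training loop can be caricatured as below: command a wind current, fly under the current controller, score tracking error from the motion-tracking data, and update the model. A one-dimensional hover task and a single scalar gain stand in for the real UAV, motion-tracking system, and machine-learning pipeline; all dynamics constants are invented for illustration.

```python
def train_controller(controller_gain, episodes=50, wind=3.0, lr=0.05):
    """Toy stand-in for wind-in-the-loop controller training.

    Each episode simulates a short monitored flight of a 1-D hover
    under a constant wind, accumulates position error (the
    'motion-tracking data'), and nudges the gain to fight the wind.
    """
    for _ in range(episodes):
        pos, vel, error = 0.0, 0.0, 0.0
        for _ in range(100):                      # one monitored flight
            accel = wind - controller_gain * pos - 2.0 * vel
            vel += 0.02 * accel                   # Euler integration
            pos += 0.02 * vel
            error += abs(pos)                     # tracked deviation
        controller_gain += lr * error / 100       # crude model update
    return controller_gain
```

The constant wind pushes the hover off its setpoint, so the accumulated error is always positive and the stiffness gain grows until the residual offset (wind / gain) becomes small.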

DEVICES, SYSTEMS AND METHODS FOR NAVIGATING A MOBILE PLATFORM
20240152159 · 2024-05-09

Aspects of embodiments relate to systems and methods for navigating a mobile platform using an imaging device on the platform, from a point of origin towards a target located in a scene, without requiring a Global Navigation Satellite System (GNSS), by employing the following steps: acquiring, by the imaging device, an image of the scene comprising the target; determining, based on analysis of the image, a direction vector pointing from the mobile platform to the target; advancing the mobile platform in accordance with the direction vector to a new position; and generating, by a distance sensing device, an output as a result of attempting to determine, with the distance sensing device, a distance between the mobile platform and the target. The mobile platform is advanced towards the target until the output produced by the distance sensing device is descriptive of a distance that meets a low-distance criterion.
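
The navigation loop above can be sketched in 2-D as follows. The perfect bearing estimate from imagery, the sensor's maximum range (beyond which the distance attempt fails), and the step size are simplifying assumptions for illustration.

```python
import math

def approach(start, target, step=1.0, stop_m=2.0, max_range=50.0):
    """Advance toward a target using only a bearing from image analysis
    and a short-range distance sensor; returns the stopping position."""
    x, y = start
    while True:
        # direction vector from image analysis (assumed perfect here)
        dx, dy = target[0] - x, target[1] - y
        dist = math.hypot(dx, dy)
        # distance sensor only produces a reading within max_range
        reading = dist if dist <= max_range else None
        if reading is not None and reading <= stop_m:
            return (x, y)            # low-distance criterion met
        # advance one step along the direction vector
        x += step * dx / dist
        y += step * dy / dist
```

Starting 100 m from the target, the platform advances in 1 m steps, with the distance sensor only reporting once within 50 m, and halts when the reading drops to the 2 m criterion.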