Patent classifications
G05D1/0676
Systems And Methods For Operating Drones In Proximity To Objects
Systems and methods for operating drones in proximity to objects are disclosed herein. An example method includes determining a change in drone flight status that involves a rotor of the drone being active, determining the presence of a mobile device within a designated clearance area established around the drone, preventing the drone from landing, providing a warning message to a user of the mobile device to clear away from the designated clearance area, detecting that the mobile device and the user are not within the designated clearance area, and causing the drone to land.
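The claimed sequence amounts to a hold-or-land decision gated on whether any mobile device lies inside the clearance area. A minimal sketch of that decision follows; the 5 m radius, the planar-distance model, and all function names are illustrative assumptions, not details taken from the patent.

```python
import math

CLEARANCE_RADIUS_M = 5.0  # assumed clearance radius around the drone


def distance_m(a, b):
    """Planar distance between two (x, y) positions in metres."""
    return math.hypot(a[0] - b[0], a[1] - b[1])


def landing_decision(drone_pos, mobile_device_positions, radius=CLEARANCE_RADIUS_M):
    """Return ('land', []) when the clearance area is empty; otherwise
    ('hold', intruders) so a warning can be pushed to each intruding device."""
    intruders = [p for p in mobile_device_positions
                 if distance_m(drone_pos, p) <= radius]
    if intruders:
        return "hold", intruders
    return "land", []
```

In this sketch the drone repeatedly re-evaluates `landing_decision` after warnings are sent, landing only once the returned action becomes `"land"`.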
CENTRAL MANAGEMENT SERVER, UNMANNED CARGO AIRCRAFT AND UNMANNED DELIVERY ROBOT FOR DELIVERING GOODS CONSIDERING STATUS OF LOCAL DELIVERY HUB
A central management server includes a determination module configured to determine a landable local delivery hub among a plurality of local delivery hubs located within a preset radius centered on a destination, when a location information request for the landable local delivery hub is received from an unmanned cargo aircraft; and a control module configured to transmit a landing command including the location information of the determined landable local delivery hub to the unmanned cargo aircraft, and to transmit a task execution command to cause the unmanned cargo aircraft to deliver goods to the destination. The determination module determines the landable local delivery hub based on a combination of an expected landing standby period of the unmanned cargo aircraft, an expected battery consumption amount of the unmanned cargo aircraft, an expected delivery period of the unmanned delivery robot, and an expected battery consumption amount of the unmanned delivery robot.
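The "combination" of the four estimates can be read as a weighted cost to minimize over candidate hubs. A minimal sketch under that reading follows; the equal default weights, the field names, and the linear cost form are assumptions, not specified by the abstract.

```python
from dataclasses import dataclass


@dataclass
class HubEstimate:
    hub_id: str
    landing_standby_min: float   # expected landing standby period of the aircraft
    aircraft_battery_pct: float  # expected battery consumption of the aircraft
    robot_delivery_min: float    # expected delivery period of the ground robot
    robot_battery_pct: float     # expected battery consumption of the robot


def select_hub(candidates, weights=(1.0, 1.0, 1.0, 1.0)):
    """Pick the landable hub minimizing a weighted sum of the four estimates."""
    w1, w2, w3, w4 = weights

    def cost(h):
        return (w1 * h.landing_standby_min + w2 * h.aircraft_battery_pct
                + w3 * h.robot_delivery_min + w4 * h.robot_battery_pct)

    return min(candidates, key=cost)
```

The weights let the server trade off aircraft availability against ground-robot effort, e.g. penalizing aircraft battery drain more heavily than robot travel time.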
SYSTEMS AND METHODS FOR PROVIDING REDUCED FLAPS TAKEOFF AND LANDING ADVICE
Methods and systems for providing reduced flaps takeoff or landing advice in an aircraft. The methods and systems include a display device and a processor in operable communication with the display device. The processor is configured to execute program instructions. The program instructions are configured to cause the processor to receive takeoff or landing performance data including weather data and runway conditions data for a plurality of runways at a destination aerodrome, calculate values of takeoff or landing performance parameters for a plurality of flap configurations for each of the plurality of runways, and present, on the display device, at least one of the values of takeoff or landing performance parameters.
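The per-runway, per-flap-configuration calculation can be sketched as a loop that keeps only the flap settings whose computed takeoff distance fits the runway. The `perf_model` callback, the runway dictionary layout, and the toy linear model in the test are hypothetical stand-ins for the patent's performance calculation, which is not specified in the abstract.

```python
def advise_flaps(runways, flap_configs, perf_model):
    """For each runway, evaluate every flap configuration with a
    caller-supplied perf_model(runway, flaps) -> required distance (m),
    and keep only the settings whose distance fits the runway length."""
    advice = {}
    for rwy in runways:
        feasible = {}
        for flaps in flap_configs:
            dist = perf_model(rwy, flaps)
            if dist <= rwy["length_m"]:
                feasible[flaps] = dist
        advice[rwy["id"]] = feasible
    return advice
```

The resulting `advice` mapping is what a display device would render, showing the pilot which reduced-flap settings remain within the available runway.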
Methods and system for autonomous landing
A computer-implemented method for controlling an unmanned aerial vehicle (UAV) includes identifying a set of target markers based on a plurality of images captured by an imaging device carried by the UAV. The set of target markers includes at least two or more types of target markers that are in close proximity to be detected within a same field of view of the imaging device. The method further includes determining a spatial relationship between the UAV and the set of target markers based at least in part on the plurality of images, and controlling the UAV to approach the set of target markers based at least in part on the spatial relationship while controlling the imaging device to track the set of target markers such that the set of target markers remains within the same field of view of the imaging device.
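Two conditions from the abstract are easy to sketch: the marker set qualifies only when at least two marker types share the field of view, and the approach control needs an offset from the image centre to the markers. The function names and the pixel-centroid steering cue below are illustrative assumptions, not the patent's actual pose estimation.

```python
def markers_detectable_together(detections):
    """detections: list of (marker_type, bbox) found in one image frame.
    The set qualifies only if at least two distinct marker types are
    visible within the same field of view."""
    types = {marker_type for marker_type, _ in detections}
    return len(types) >= 2


def approach_vector(image_center, marker_centroids):
    """Mean pixel offset from the image centre to the marker set; a
    controller would map this to horizontal velocity commands while
    the gimbal keeps the set centred in view."""
    cx = sum(x for x, _ in marker_centroids) / len(marker_centroids)
    cy = sum(y for _, y in marker_centroids) / len(marker_centroids)
    return cx - image_center[0], cy - image_center[1]
```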
Enhancing airborne visibility of ground markers by vertical variation
A ground marker for use in identifying a location associated with a mission performed by an aerial vehicle includes a visible surface with aspects that are positioned at different vertical heights or elevations. The vertical variation in the aspects of the visible surface enhances the visibility of the ground marker within images captured by cameras provided aboard the aerial vehicle, resulting in more accurate estimations of ranges to such markers (e.g., altitudes) determined from such images. The visible surface includes one-dimensional or two-dimensional bar codes, alphanumeric characters, or symbols thereon, and is provided on or within rigid or flexible frames that are adapted to be placed on ground surfaces at the location associated with the mission.
MOVING BODY, CONTROL METHOD, AND PROGRAM
The present disclosure relates to a moving body, a control method, and a program that enable safer movement and stopping. A safety degree estimation unit estimates a degree of safety according to the time the moving body has spent in a moving state, on the basis of external environmental information regarding the external environment, and a movement control unit controls movement of the moving body on the basis of the estimated degree of safety. The technology according to the present disclosure can be applied to, for example, a moving body such as a drone.
LIGHT EMITTING DEVICE POSITIONAL TRACKING FOR MOBILE PLATFORMS
Light emitting device positional tracking systems and methods are provided. In one example, a method includes receiving images captured of a target location comprising a plurality of light emitting devices, where each of the light emitting devices has an associated blinking pattern. The method may further include detecting the blinking pattern for each of the light emitting devices in the images. The method may further include determining a classification for each of the light emitting devices based on its detected blinking pattern. The method may further include aligning a mobile platform with the target location based on the classifications of the light emitting devices. Related devices and systems are also provided.
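Classifying a light emitting device by its blinking pattern can be sketched as matching the observed on/off sequence against a table of known patterns. Since capture can begin at any phase of the blink cycle, every cyclic shift is tried. The pattern table, class names, and one-sample-per-frame encoding are illustrative assumptions.

```python
# Hypothetical blink-pattern table: on/off sequence -> device class
KNOWN_PATTERNS = {
    (1, 0, 1, 0): "corner",
    (1, 1, 0, 0): "centre",
}


def classify_led(observed):
    """Match an observed on/off sequence (one sample per frame) against the
    known blinking patterns, trying every cyclic shift because the first
    observed frame falls at an arbitrary phase of the cycle."""
    n = len(observed)
    for shift in range(n):
        rotated = tuple(observed[shift:] + observed[:shift])
        if rotated in KNOWN_PATTERNS:
            return KNOWN_PATTERNS[rotated]
    return None  # no known pattern matched
```

With each device classified, the platform can establish correspondences between image detections and the known layout of the target location, and align itself accordingly.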
Landing gear deployment
A method (300) for deploying an aircraft landing gear, including: receiving an aircraft landing gear deployment signal (310); receiving an aircraft position signal indicative of a distance of the aircraft from an aircraft landing site (320); receiving one or more flight signals indicating one or more dynamic conditions or parameters relating to the flight of the aircraft (330); determining, based at least on the one or more flight signals, a first aircraft position, relative to the aircraft landing site, at which landing gear deployment should commence (340); and deploying the landing gear (a) when the aircraft reaches the first aircraft position, in the event that the deployment signal is received before the aircraft reaches the first aircraft position, or (b) immediately, in the event that the deployment signal is received after the aircraft has passed the first aircraft position (350).
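The two deployment cases reduce to a small decision routine: if the signal arrives before the computed deployment point, schedule deployment at that point; if it arrives after the point has been passed, deploy immediately. The sketch below assumes distances along the approach path in metres; the action labels and function name are hypothetical.

```python
def deployment_action(signal_received, distance_to_site_m, deploy_distance_m):
    """Decide the landing gear action.

    distance_to_site_m  -- current distance of the aircraft from the landing site
    deploy_distance_m   -- distance from the site at which deployment should commence,
                           as determined from the flight signals
    """
    if not signal_received:
        return "wait"
    if distance_to_site_m > deploy_distance_m:
        # Case (a): signal received before the deployment point is reached.
        return "deploy_at_point"
    # Case (b): the aircraft has already passed the deployment point.
    return "deploy_now"
```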
Autonomous aircraft sensor-based positioning and navigation system using markers
A system and method are disclosed for design of a suite of multispectral (MS) sensors and processing of enhanced data streams produced by the sensors for autonomous aircraft flight. The onboard suite of MS sensors is specifically configured to sense and use a MS variety of sensor-tuned objects, either strategically placed objects and/or surveyed and sensor significant existing objects to determine a position and verify position accuracy. The received MS sensor data enables an autonomous aircraft object identification and positioning system to correlate MS sensor data output with a-priori information stored onboard to determine and verify position and trajectory of the autonomous aircraft. Once position and trajectory are known, the object identification and positioning system commands the autonomous aircraft flight management system and autopilot control of the autonomous aircraft.
Method for Controlling a Flight Movement of an Aerial Vehicle for Landing or for Dropping a Cargo, and Aerial Vehicle
The preferred embodiments relate to a method for controlling a flight movement of an aerial vehicle for landing the aerial vehicle, including: recording first image data by means of a first camera device, which is provided on the aerial vehicle and is configured to record an area of ground, wherein the first image data is indicative of a first sequence of first camera images. The method also includes recording second image data by means of a second camera device, which is provided on the aerial vehicle and is configured to record the same area of ground, wherein the second image data is indicative of a second sequence of second camera images.