Patent classifications
B64F1/18
Drone support and operations system
Systems and methods for drone support and operation are provided. The support system includes a base with at least one docking area, a cover portion configured to move between open and closed positions, and a drone support unit configured to support the drone at least while the drone is in the at least one docking area of the base.
VISUAL LANDING AIDS FOR UNMANNED AERIAL SYSTEMS
Visual landing aids comprising a series of contrasting circles and polygons that can be accurately detected over a wide range of angles and distances by an unmanned aerial vehicle equipped with a camera and shape-detection capability. The visual landing aid may be implemented using contrasting colors for the pattern that reflect visible, UV, or infrared light, or using light-emitting elements. In some examples, the landing aid includes a secondary, smaller version of the shape pattern embedded within the larger pattern, to enable greater detection range while facilitating close-in precision guidance. In still further examples, the light-emitting elements may be pulsed at a rate synchronized with the camera shutter on the unmanned aerial vehicle to further enhance accurate detection.
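The camera-plus-shape-detection pipeline the abstract implies can be sketched minimally. The function and synthetic frame below are illustrative assumptions, not the patent's method: a real detector would verify the concentric-ring geometry (and, per the abstract, could switch to the smaller embedded pattern at close range) rather than simply thresholding brightness.

```python
import numpy as np

def find_marker_center(image, threshold=0.5):
    """Locate a bright, high-contrast landing marker in a grayscale
    image by thresholding and taking the centroid of bright pixels.
    Returns None when no marker pixels are in view."""
    mask = image > threshold
    if not mask.any():
        return None
    ys, xs = np.nonzero(mask)
    return xs.mean(), ys.mean()

# Synthetic test frame: dark background with a bright disk-and-ring
# pattern centered at (x=40, y=25), standing in for the landing aid.
frame = np.zeros((60, 80))
yy, xx = np.mgrid[0:60, 0:80]
r = np.hypot(xx - 40, yy - 25)
frame[(r < 10) | ((r > 14) & (r < 18))] = 1.0  # contrasting rings

center = find_marker_center(frame)
```

By symmetry of the ring pattern, the centroid recovers the marker center near (40, 25), which a guidance loop could then convert into a lateral correction.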
SYSTEMS AND METHODS OF PRECISION LANDING FOR OFFSHORE HELICOPTER OPERATIONS USING SPATIAL ANALYSIS
Systems and methods of precision landing in adverse conditions are provided. In one embodiment, a precision landing system comprises a vehicle including: a receiver configured to receive position information for structures and a landing zone of a landing site; a processor coupled to a memory, the memory storing three-dimensional geometric structural information for the landing site; and a display device. The processor is configured to: receive the position information from the receiver; assign geographical coordinates to the three-dimensional geometric structural information using the position information for the structures and the landing zone; and send the three-dimensional geometric structural information and graphical rendering information to the display device. The display device is configured to render and display a three-dimensional representation of the landing site in real time based on the three-dimensional geometric structural information and the graphical rendering information from the processor.
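The "assign geographical coordinates" step amounts to registering a stored structural model against received survey positions. The sketch below is a deliberately simplified, hypothetical version (translation only, in a local ENU frame in metres); the patent's system would also need to resolve heading and work in true geographic coordinates.

```python
def georeference(model_vertices, local_ref, world_ref):
    """Assign world coordinates to a stored 3D structural model by
    translating it so that a known reference point on a structure
    (local_ref, in model coordinates) lands on the position received
    for that structure (world_ref, local ENU frame, metres).
    Assumes the model axes are already aligned with the ENU axes."""
    dx = world_ref[0] - local_ref[0]
    dy = world_ref[1] - local_ref[1]
    dz = world_ref[2] - local_ref[2]
    return [(x + dx, y + dy, z + dz) for x, y, z in model_vertices]

# Helideck corners in model coordinates, shifted onto the surveyed
# position of corner 0 as received over the datalink.
deck = [(0, 0, 0), (20, 0, 0), (20, 20, 0), (0, 20, 0)]
world = georeference(deck, local_ref=(0, 0, 0),
                     world_ref=(500, -120, 35))
```

The georeferenced vertices can then be handed to the renderer together with the graphical rendering information.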
HELIPAD AND HELIPAD ILLUMINATION SYSTEM
The invention relates to a dynamic helipad illumination system (2) for dynamically illuminating the floor surface or supporting deck of a helipad (1), at least in part, in order to communicate information regarding one or more current local conditions of, at, and/or near the helipad. The dynamic helipad illumination system comprises one or more illumination elements (3). The system is arranged to bring at least a part (2a) of those illumination elements from a first illumination state into a second illumination state. In the first state, at least a part of the floor surface or supporting deck is illuminated such that it radiates a first visual appearance, which may correspond to a first message to be communicated to a helicopter crew or pilot; in the second state, it is illuminated such that it radiates a second visual appearance, which may correspond to a second message to be communicated to the helicopter crew or pilot.
MACHINE VISION-BASED METHOD AND SYSTEM FOR AIRCRAFT DOCKING GUIDANCE AND AIRCRAFT TYPE IDENTIFICATION
A machine vision-based method and system for aircraft docking guidance and aircraft type identification, comprising: S1, dividing a monitoring scene into different information-processing function areas; S2, pre-processing a captured image; S3, identifying the engine and the front wheel of an aircraft in the image to confirm that the aircraft has appeared; S4, continuously tracking and updating, in real time, the image of the engine and front wheel captured in step S3; S5, positioning the aircraft in real time and accurately determining its degree of deviation from a guide line and its distance to a stop line; S6, outputting and displaying the deviation and distance determined in step S5.
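The geometry behind step S5 is simple once the front wheel has been located in stand coordinates: lateral deviation is the signed perpendicular distance to the guide line, and remaining distance is the projection toward the stop line. The function below is an illustrative sketch with hypothetical names and frame conventions, not the patent's algorithm.

```python
def guidance_offsets(wheel, guide_point, guide_dir, stop_point):
    """Given the tracked front-wheel position (metres, stand frame),
    return (lateral deviation from the guide line, remaining distance
    to the stop line). guide_dir is a unit vector along the guide
    line toward the stop line; the stop line is assumed to run
    perpendicular to it through stop_point."""
    dx, dy = wheel[0] - guide_point[0], wheel[1] - guide_point[1]
    # Signed lateral deviation: 2D cross product with the guide direction.
    deviation = dx * guide_dir[1] - dy * guide_dir[0]
    # Remaining distance: projection of wheel->stop onto the guide direction.
    sx, sy = stop_point[0] - wheel[0], stop_point[1] - wheel[1]
    remaining = sx * guide_dir[0] + sy * guide_dir[1]
    return deviation, remaining

# Aircraft 0.5 m right of the guide line and 12 m short of the stop line.
dev, dist = guidance_offsets(wheel=(0.5, -12.0),
                             guide_point=(0.0, -40.0),
                             guide_dir=(0.0, 1.0),
                             stop_point=(0.0, 0.0))
```

Step S6 would then render these two numbers on the docking guidance display for the flight crew.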
SYSTEMS AND METHODS EMPLOYING CODED LIGHT TO DOCK AERIAL DRONES, SELF-DRIVING CARS AND SURFACE ROBOTS
Precision docking is one of the most important tasks for drones and surface robots, which must dock accurately to charge themselves and to load and unload packages. Without accurate docking, surface robots and drones miss their charging pad or charging contacts and cannot recharge automatically for later tasks. Described is a system that uses coded light to guide the precision docking process for drones and ground robots. More specifically, the system uses projectors to project temporal identifiers onto space partitioned by pixel projections; each spatial partition receives a different identifier. Using a simple light sensor, a drone or ground robot can determine its precise location within the partitioned space and therefore knows where to move for precise docking. Depending on the required docking precision, the resolution of the coded light may be adjusted by using projectors with different resolutions.
FLIGHT MANAGEMENT SYSTEM DEPARTURE AND ARRIVAL PERFORMANCE DISPLAY BASED ON WEATHER DATA UPLINK
A flight management system includes a communications system configured to receive weather data from a remote source, a display system configured to generate an output for a flight display of an aircraft, and at least one processor with a non-transitory processor-readable medium storing processor-executable code. The output includes weather information based on the received weather data. The processor-executable code causes the processor to receive a user input from a user interface element of the aircraft where the user input requests updated weather information. The processor-executable code causes the processor to retrieve, via the communications system and in response to the user input, updated weather data from the remote source; calculate a departure or arrival performance flight parameter based at least in part on the updated weather data; and provide, via the display system, an output for the flight display of the aircraft where the output includes the flight parameter.
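One weather-dependent input to departure and arrival performance is density altitude. The rule-of-thumb formula below (roughly 120 ft per degree Celsius above ISA) is a standard aviation approximation, not the patent's method; an actual flight management system would use certified aircraft performance tables driven by the uplinked weather.

```python
def density_altitude(pressure_alt_ft, oat_c):
    """Approximate density altitude from uplinked weather data:
    pressure altitude plus ~120 ft per degree C that the outside
    air temperature exceeds the ISA standard temperature (15 C at
    sea level, lapsing ~2 C per 1000 ft)."""
    isa_temp_c = 15.0 - 2.0 * (pressure_alt_ft / 1000.0)
    return pressure_alt_ft + 120.0 * (oat_c - isa_temp_c)

# A 25 C day at a 5000 ft pressure altitude yields a density
# altitude of 7400 ft, degrading takeoff and climb performance.
da = density_altitude(5000, 25)
```

A value like this, recomputed whenever the user requests updated weather, would feed the departure or arrival performance parameter shown on the flight display.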