Patent classifications
B64D47/08
Automatically deployable drone for vehicle accidents
Methods and systems for automatically deploying an autonomous drone from a vehicle in response to a triggering event or accident so that data associated with the triggering event or accident may be automatically obtained are described. In one embodiment, a method for deploying an autonomous drone in response to a triggering event is described. The method includes providing an autonomous drone in a vehicle. The method also includes detecting a triggering event associated with the vehicle. Upon detection of the triggering event, the method includes automatically deploying the autonomous drone from the vehicle. The method further includes implementing, by the autonomous drone, a plurality of automatic actions, including recording data associated with the vehicle in which the autonomous drone is provided.
Systems and methods for performing remote maintenance
Various embodiments provide systems and/or methods for automated maintenance, delivery, retrieval, and/or communications using drones.
Camera agnostic core monitor incorporating projected images with high spatial frequency
A camera agnostic core monitor for an enhanced flight vision system (EFVS) is disclosed. In embodiments, a structured light projector (SLP) generates and projects a precise geometric pattern or other like artifact, which is reflected by collimating elements into the EFVS optical path. Within the optical path, the EFVS focal plane array is illuminated by, and detects, the projected artifacts within the scene imagery captured for display by the EFVS. Image processors assess the presentation of the detected artifacts (e.g., position/orientation relative to the expected presentation of the detected artifact within the scene imagery) to verify that the displayed EFVS imagery is not misleading.
User equipment, system, and control method for controlling drone
Provided is a user equipment for controlling a drone. The user equipment analyzes an original video to control the drone to photograph a reproduction video giving a feeling identical to or similar to the original video. An electronic device may be connected to an artificial intelligence module, a robot, an augmented reality (AR) device, a virtual reality (VR) device, a device related to 5G service, and the like.
CAMERA SYSTEM USING STABILIZING GIMBAL
Disclosed is an electronic gimbal with camera and mounting configuration. The gimbal can include an inertial measurement unit which can sense the orientation of the camera and three electronic motors which can manipulate the orientation of the camera. The gimbal can be removably coupled to a variety of mount platforms, such as an aerial vehicle, a handheld grip, or a rotating platform. Moreover, a camera can be removably coupled to the gimbal and can be held in a removable camera frame. Also disclosed is a system for allowing the platform, to which the gimbal is mounted, to control settings of the camera or to trigger actions on the camera, such as taking a picture, or initiating the recording of a video. The gimbal can also provide a connection between the camera and the mount platform, such that the mount platform receives images and video content from the camera.
SYSTEM AND METHOD FOR ASSESSING OPERATOR SITUATIONAL AWARENESS VIA CONTEXT-AWARE GAZE DETECTION
A system and method for continuous real-time assessment of the situational awareness of an aircraft operator incorporates gaze sensors to determine the current gaze target (or sequence of gaze targets) of the operator, e.g., which interfaces the operator is looking at. The system receives operational context from aircraft systems indicative of current events and conditions both internal and external to the aircraft (e.g., operational status, mission or flight plan objectives, weather conditions). Based on the determined gaze targets and coterminous operational context, the system evaluates the operator's situational awareness relative to the operational context, e.g., whether the operator perceives the operational context; comprehends the operational context and its implications; and projects that perception and comprehension into responsive action and second-order ramifications, according to task models indicative of expected behavior.