Patent classifications
G06V20/17
AIRCRAFT DOOR CAMERA SYSTEM FOR DOCKING ALIGNMENT MONITORING
A camera with a field of view toward an external environment of an aircraft is disposed within an aircraft door such that a ground surface is within the field of view of the camera during taxiing of the aircraft. A display device is disposed within an interior of the aircraft. A processor is operatively coupled to the camera and to the display device. The processor analyzes image data captured by the camera for docking guidance by identifying, within the captured image data, a region on the ground surface corresponding to an alignment fiducial indicating a parking location for the aircraft, determining, based on the region of the captured image data corresponding to the alignment fiducial indicating the parking location, a relative location of the aircraft with respect to the alignment fiducial, and outputting an indication of the relative location of the aircraft to the alignment fiducial.
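The abstract does not specify how the relative location is computed; a minimal sketch, assuming a nadir-looking pinhole camera model with an illustrative focal length, camera height, and steering tolerance (all names and numbers are hypothetical, not from the patent):

```python
# Hypothetical sketch: estimate the lateral offset of an aircraft from a
# ground fiducial seen by a door-mounted camera, assuming a simple pinhole
# model looking straight down at the ground surface.

def lateral_offset_m(fiducial_px_x, image_width_px, camera_height_m, focal_px):
    """Horizontal offset (metres) of the fiducial from the camera axis.

    fiducial_px_x   : x-coordinate (pixels) of the detected fiducial centroid
    image_width_px  : image width in pixels (optical axis at the centre)
    camera_height_m : height of the camera above the ground surface
    focal_px        : focal length expressed in pixels
    """
    dx_px = fiducial_px_x - image_width_px / 2
    # For a nadir-looking pinhole camera: ground offset = height * dx / f
    return camera_height_m * dx_px / focal_px

def guidance(offset_m, tolerance_m=0.1):
    """Map the offset to a simple docking cue for the cabin display."""
    if abs(offset_m) <= tolerance_m:
        return "ALIGNED"
    return "STEER LEFT" if offset_m > 0 else "STEER RIGHT"

offset = lateral_offset_m(fiducial_px_x=720, image_width_px=1280,
                          camera_height_m=3.0, focal_px=1000.0)
print(round(offset, 3), guidance(offset))  # 0.24 STEER LEFT
```

In practice the fiducial centroid would come from an image-segmentation step and the camera geometry would be calibrated; the sketch only illustrates the offset-to-cue mapping.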
MANAGEMENT PLATFORM FOR AUTONOMOUS DRONE OPERATIONS
Methods, systems, and computer programs are presented for executing a mission by an autonomous device to inspect an asset. One method includes an operation for obtaining a workflow. The workflow includes operations to be executed during a mission to be performed by a robot and a destination for sending data resulting from the mission. The method further includes an operation for generating a package after completion of the mission associated with the workflow. The package is self-contained and comprises information obtained during the mission that enables generation of results. The package comprises sensor information collected by one or more sensors, telemetry information obtained by the robot, information about assets associated with the mission, software version identifier for the package generation, and routing information for transmitting the package to the destination. The method further includes an operation for analyzing the information of the package to determine results for the mission.
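The self-contained package described above can be sketched as a simple data structure; the field names and serialization format here are illustrative assumptions, not taken from the patent:

```python
from dataclasses import dataclass, asdict
import json

@dataclass
class MissionPackage:
    """Self-contained record of a completed mission (illustrative fields)."""
    sensor_data: list       # readings collected by the robot's sensors
    telemetry: list         # position/attitude samples obtained by the robot
    assets: list            # identifiers of the assets associated with the mission
    software_version: str   # identifier of the package-generation software
    destination: str        # routing information for transmitting the package

def build_package(sensor_data, telemetry, assets, version, destination):
    """Assemble the package after mission completion."""
    return MissionPackage(sensor_data, telemetry, assets, version, destination)

def serialize(pkg):
    """Serialize the package for transmission to its destination."""
    return json.dumps(asdict(pkg))
```

Bundling the software version alongside the data lets the destination reproduce the analysis with the exact toolchain that generated the package.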
USER SAFETY AND SUPPORT IN SEARCH AND RESCUE MISSIONS
A method for locating, aiding, and communicating with users and personnel in emergency situations includes traversing a defined path utilizing an unmanned vehicle, detecting a user within a threshold distance of the defined path, logging a geolocation of the user within the unmanned vehicle, and determining whether to dispatch assistance to the user.
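The detect-and-log loop can be sketched with a great-circle distance check; the waypoint/user record shapes and the always-dispatch decision are placeholder assumptions:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two WGS-84 points."""
    r = 6371000.0  # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dlat = math.radians(lat2 - lat1)
    dlon = math.radians(lon2 - lon1)
    a = (math.sin(dlat / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dlon / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

def patrol(path_points, users, threshold_m):
    """Traverse the defined path; log any user within threshold_m of it."""
    log = []
    for user in users:
        near = any(haversine_m(user["lat"], user["lon"], lat, lon) <= threshold_m
                   for lat, lon in path_points)
        if near:
            # Log the user's geolocation aboard the unmanned vehicle;
            # the dispatch decision here is a placeholder.
            log.append({"id": user["id"], "lat": user["lat"],
                        "lon": user["lon"], "dispatch": True})
    return log
```

A real implementation would detect users with onboard sensors rather than receive their coordinates, and the dispatch decision would weigh factors the abstract leaves unspecified.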
SYSTEMS AND METHODS FOR IDENTIFYING INCLINED REGIONS
Systems and methods for identifying inclined regions are provided. In one aspect, a method is provided that includes receiving shadow data for at least one first ground object in a first region, wherein each first ground object is depicted in one overhead image of the first region, wherein the shadow data comprises a length of the respective first ground object as identified from the respective overhead image; receiving shadow data for at least one second comparable ground object in a second region, wherein each second ground object is depicted in one overhead image of the second region, wherein the shadow data comprises a length of the respective second ground object as identified from the respective overhead image; calculating a statistical measure describing the variability of the shadow lengths between objects in the first region and the second region; comparing the statistical measure to a predetermined threshold; and based on the comparison, identifying that the first region is inclined relative to the second region.
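The abstract names "a statistical measure describing the variability of the shadow lengths" without specifying it; a minimal sketch using the coefficient of variation of the pooled shadow lengths as an illustrative stand-in (the measure and threshold are assumptions):

```python
from statistics import mean, pstdev

def shadow_variability(lengths_a, lengths_b):
    """Coefficient of variation of shadow lengths pooled across two regions.

    On flat ground, comparable objects cast similarly long shadows in the
    same overhead image, so variability is low; an incline between the
    regions stretches or shortens one group's shadows, raising it.
    """
    pooled = list(lengths_a) + list(lengths_b)
    return pstdev(pooled) / mean(pooled)

def is_inclined(lengths_a, lengths_b, threshold=0.15):
    """Flag the first region as inclined relative to the second."""
    return shadow_variability(lengths_a, lengths_b) > threshold
```

For example, shadow lengths of roughly 10 m in both regions stay below the threshold, while a jump to ~14 m in the second region exceeds it.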
Systems, devices, and methods for in-field diagnosis of growth stage and crop yield estimation in a plant area
Methods, devices, and systems may be utilized for detecting one or more properties of a plant area and generating a map of the plant area indicating at least one property of the plant area. The system comprises an inspection system associated with a transport device, the inspection system including one or more sensors configured to capture at least 3D image data and 2D image data and to generate geolocational data. A datacenter is configured to: receive the 3D image data, 2D image data, and geolocational data from the inspection system; correlate the 3D image data, 2D image data, and geolocational data; and analyze the data for the plant area. A dashboard is configured to display a map with icons at the proper geolocations, together with the image data and the analysis.
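The correlation step can be sketched as nearest-timestamp matching across the three streams; matching by a shared timestamp key `"t"` within a tolerance is an assumption, since the abstract does not say how the streams are aligned:

```python
def correlate(frames_2d, frames_3d, geotags, tol_s=0.5):
    """Pair each 2D frame with the nearest-in-time 3D frame and geotag.

    Each record is a dict carrying a capture timestamp under key "t"
    (seconds); pairs farther apart than tol_s are dropped.
    """
    def nearest(ts, records):
        return min(records, key=lambda r: abs(r["t"] - ts))

    paired = []
    for f2 in frames_2d:
        f3 = nearest(f2["t"], frames_3d)
        geo = nearest(f2["t"], geotags)
        if abs(f3["t"] - f2["t"]) <= tol_s and abs(geo["t"] - f2["t"]) <= tol_s:
            paired.append({"2d": f2, "3d": f3, "geo": geo})
    return paired
```

The paired records would then feed the analysis stage, and each geotag would place a dashboard icon at the proper map location.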
Method for size estimation by image recognition of specific target using given scale
The present invention relates to a method for size estimation by image recognition of a specific target using a given scale. First, a reference object is recognized in an image and the corresponding scale is established. Then the specific target is searched for, and its size is estimated according to the established scale.
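The scale-then-measure procedure can be sketched in two functions; the pixel counts and centimetre units below are illustrative, not from the patent:

```python
def scale_from_reference(ref_length_px, ref_length_cm):
    """Derive the image scale (pixels per centimetre) from a recognized
    reference object whose real-world size is known."""
    return ref_length_px / ref_length_cm

def estimate_size_cm(target_length_px, px_per_cm):
    """Estimate the real size of the target from its extent in pixels."""
    return target_length_px / px_per_cm

# Example: a reference object known to be 20 cm long spans 100 px,
# giving 5 px/cm; a target spanning 250 px is then estimated at 50 cm.
scale = scale_from_reference(100.0, 20.0)
print(estimate_size_cm(250.0, scale))  # 50.0
```

This assumes the reference and target lie at a similar depth in the scene, so a single scale applies to both.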
User equipment, system, and control method for controlling drone
Provided is a user equipment for controlling a drone. The user equipment analyzes an original video and controls the drone to photograph a reproduction video that conveys a feeling identical or similar to that of the original video. An electronic device may be connected to an artificial intelligence module, a robot, an augmented reality (AR) device, a virtual reality (VR) device, a device related to 5G service, and the like.
Method and a system for detecting road ice by spectral imaging
A method for detecting ice on a road surface includes: providing a spectral imaging camera; recording a first reflectance (R1) of the surface at 0.545 to 0.565 μm using the spectral imaging camera; recording a second reflectance (R2) of the surface at 0.620 to 0.670 μm using the spectral imaging camera; recording a third reflectance (R3) of the surface at 0.841 to 0.876 μm using the spectral imaging camera; calculating an ice index based on the first reflectance, the second reflectance, and the third reflectance; providing a thermometer; recording a surface temperature of the surface using the thermometer; and detecting a presence of ice on the surface based on the ice index and the surface temperature. A system for detecting ice on a surface is also disclosed.
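The abstract does not give the ice-index formula; a minimal sketch using an illustrative normalized-difference combination of the three bands (ice reflects strongly in the visible relative to the near-infrared), with an assumed index threshold and a freezing-temperature gate:

```python
def ice_index(r1, r2, r3):
    """Illustrative ice index from three band reflectances.

    r1: green-band reflectance   (0.545-0.565 um)
    r2: red-band reflectance     (0.620-0.670 um)
    r3: near-infrared reflectance (0.841-0.876 um)

    This normalized-difference form is a stand-in; the patent's actual
    formula is not stated in the abstract.
    """
    return (r1 + r2 - 2 * r3) / (r1 + r2 + 2 * r3)

def ice_present(r1, r2, r3, surface_temp_c, index_threshold=0.2):
    """Detect ice: spectral index above threshold AND surface at/below 0 C."""
    return ice_index(r1, r2, r3) > index_threshold and surface_temp_c <= 0.0
```

Gating on the thermometer reading suppresses false positives from other bright, visible-reflective surfaces (e.g. road markings) when the surface is too warm for ice.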