G06V20/188

AIRCRAFT DOOR CAMERA SYSTEM FOR DOCKING ALIGNMENT MONITORING
20230052176 · 2023-02-16

A camera with a field of view toward an external environment of an aircraft is disposed within an aircraft door such that a ground surface is within the field of view of the camera during taxiing of the aircraft. A display device is disposed within an interior of the aircraft. A processor is operatively coupled to the camera and to the display device. The processor analyzes image data captured by the camera for docking guidance by: identifying, within the captured image data, a region on the ground surface corresponding to an alignment fiducial indicating a parking location for the aircraft; determining, based on that region, a relative location of the aircraft with respect to the alignment fiducial; and outputting an indication of the relative location of the aircraft with respect to the alignment fiducial.
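
The abstract does not disclose how the fiducial region is localized or how the offset is computed. As an illustrative sketch only (the function name, the brightness-threshold approach, and the synthetic frame are all assumptions), the offset step could look like:

```python
import numpy as np

def fiducial_offset(frame: np.ndarray, threshold: int = 200):
    """Locate a bright alignment fiducial in a grayscale frame and return
    its centroid offset (dx, dy) from the image center, in pixels.
    Returns None if no fiducial-like region is found."""
    ys, xs = np.nonzero(frame >= threshold)   # pixels belonging to the fiducial
    if xs.size == 0:
        return None
    cx, cy = xs.mean(), ys.mean()             # centroid of the bright region
    h, w = frame.shape
    return cx - w / 2.0, cy - h / 2.0         # offset relative to the fiducial

# Example: a synthetic 100x100 frame with a bright patch left of center
frame = np.zeros((100, 100), dtype=np.uint8)
frame[45:55, 20:30] = 255
dx, dy = fiducial_offset(frame)
```

A negative `dx` here indicates the fiducial lies left of the camera's optical axis; a real system would convert this pixel offset to a ground-plane distance using the camera's calibration.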

Agricultural pattern analysis system

A pattern recognition system including an image gathering unit that gathers at least one digital representation of a field, an image analysis unit that pre-processes the at least one digital representation of the field, and an annotation unit that provides a visualization of at least one channel for each of the at least one digital representation of the field, where the image analysis unit generates a plurality of image samples from each of the at least one digital representation of the field and sorts each of the image samples into one of a plurality of categories.
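
The sample-generation and categorization steps are not specified in the abstract. A minimal sketch, assuming non-overlapping tiles and a toy intensity-based stand-in for the real classifier (both assumptions, not the patented method):

```python
import numpy as np

def split_into_samples(field_image: np.ndarray, tile: int = 32):
    """Cut a field image into non-overlapping tile x tile image samples."""
    h, w = field_image.shape[:2]
    return [field_image[y:y + tile, x:x + tile]
            for y in range(0, h - tile + 1, tile)
            for x in range(0, w - tile + 1, tile)]

def categorize(sample: np.ndarray) -> str:
    """Toy categorization by mean pixel intensity (placeholder for a
    trained classifier)."""
    m = sample.mean()
    return "bare_soil" if m < 85 else "crop" if m < 170 else "saturated"

img = np.full((64, 64), 100, dtype=np.uint8)   # synthetic field image
samples = split_into_samples(img)              # yields four 32x32 samples
labels = [categorize(s) for s in samples]
```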

Systems, devices, and methods for in-field diagnosis of growth stage and crop yield estimation in a plant area

Methods, devices, and systems may be utilized for detecting one or more properties of a plant area and generating a map of the plant area indicating at least one property of the plant area. The system comprises an inspection system associated with a transport device, the inspection system including one or more sensors configured to generate data for the plant area, including to capture at least 3D image data and 2D image data and to generate geolocational data. A datacenter is configured to: receive the 3D image data, 2D image data, and geolocational data from the inspection system; correlate the 3D image data, 2D image data, and geolocational data; and analyze the data for the plant area. A dashboard is configured to display a map with icons at the proper geolocations together with the image data and the analysis.
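
The correlation step is not detailed in the abstract. One plausible sketch (the record layout, field names, and nearest-timestamp matching rule are all assumptions) pairs each image record with the geolocation record closest in time:

```python
def correlate_by_timestamp(images, geotags, tol=0.5):
    """Pair each image record with the geolocation record whose timestamp
    is nearest, keeping only pairs within tol seconds of each other."""
    pairs = []
    for img in images:
        best = min(geotags, key=lambda g: abs(g["t"] - img["t"]))
        if abs(best["t"] - img["t"]) <= tol:
            pairs.append((img["id"], best["lat"], best["lon"]))
    return pairs

images = [{"id": "img1", "t": 10.0}, {"id": "img2", "t": 20.0}]
geotags = [{"t": 10.1, "lat": 52.1, "lon": 5.2},
           {"t": 19.9, "lat": 52.2, "lon": 5.3}]
pairs = correlate_by_timestamp(images, geotags)
```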

Plant group identification

A farming machine moves through a field and includes an image sensor that captures an image of a plant in the field. A control system accesses the captured image and applies the image to a machine learned plant identification model. The plant identification model identifies pixels representing the plant and categorizes the plant into a plant group (e.g., plant species). The identified pixels are labeled as the plant group and a location of the pixels is determined. The control system actuates a treatment mechanism based on the identified plant group and location. Additionally, the images from the image sensor and the plant identification model may be used to generate a plant identification map. The plant identification map is a map of the field that indicates the locations of the plant groups identified by the plant identification model.
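
The abstract describes labeling pixels with a plant group, locating them, and actuating a treatment mechanism. A minimal sketch of those downstream steps, assuming the model has already produced a per-pixel label mask (the class ids, group names, and functions here are illustrative, not the patented model):

```python
import numpy as np

# Hypothetical class ids produced by a plant identification model
PLANT_GROUPS = {0: "background", 1: "crop", 2: "broadleaf_weed"}

def plant_locations(label_mask: np.ndarray, group_id: int):
    """Return the centroid (row, col) of all pixels labeled group_id,
    or None if that group is absent from the mask."""
    ys, xs = np.nonzero(label_mask == group_id)
    if ys.size == 0:
        return None
    return float(ys.mean()), float(xs.mean())

def should_treat(label_mask: np.ndarray, target_group: int = 2) -> bool:
    """Actuate the treatment mechanism only when the target group appears."""
    return plant_locations(label_mask, target_group) is not None

mask = np.zeros((8, 8), dtype=np.uint8)
mask[2:4, 5:7] = 2                        # a small labeled weed patch
loc = plant_locations(mask, 2)
```

Accumulating `(group, centroid)` pairs per frame, indexed by machine position, would yield the plant identification map the abstract mentions.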

Apparatus and method for image-guided agriculture

A method for image-guided agriculture includes receiving images; processing the images to generate reflectance maps respectively corresponding to spectral bands; synthesizing the reflectance maps to generate a multispectral image including vegetation index information of a target area; receiving crop information for regions of the target area; and assessing crop conditions for the regions based on the crop information and the vegetation index information.
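
The abstract does not say which vegetation index is used; assuming the common NDVI, computed per pixel from the red and near-infrared reflectance maps, the synthesis step could be sketched as:

```python
import numpy as np

def ndvi(nir: np.ndarray, red: np.ndarray) -> np.ndarray:
    """Normalized Difference Vegetation Index from per-band reflectance maps:
    NDVI = (NIR - Red) / (NIR + Red), defined as 0 where both bands are 0."""
    nir = nir.astype(float)
    red = red.astype(float)
    denom = nir + red
    out = np.zeros_like(denom)
    np.divide(nir - red, denom, out=out, where=denom > 0)
    return out

nir = np.array([[0.5, 0.4]])   # toy 1x2 reflectance maps
red = np.array([[0.1, 0.4]])
vi = ndvi(nir, red)            # high value = vigorous vegetation
```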

COMPUTER-EXECUTABLE METHOD RELATING TO WEEDS AND COMPUTER SYSTEM
20230044040 · 2023-02-09

A computer-executable method relating to weeds, and a computer system. The method comprises: receiving an image (S11); recognizing one or more plants in the image in order to obtain the classification and/or names of the plants, and determining whether the plants are weeds (S12); and in response to determining that at least one plant is a weed, outputting information indicating that the at least one plant is a weed (S13).

TREE CROWN EXTRACTION METHOD BASED ON UNMANNED AERIAL VEHICLE MULTI-SOURCE REMOTE SENSING
20230039554 · 2023-02-09 ·

A tree crown extraction method based on UAV multi-source remote sensing includes: obtaining a visible light image and LIDAR point clouds; taking a digital orthophoto map (DOM) and the LIDAR point clouds as data sources; and using watershed segmentation together with object-oriented multi-scale segmentation to extract single-tree crown information under different canopy densities. The object-oriented multi-scale segmentation method extracts crown and non-crown areas, and a tree crown distribution range is extracted with the crown area as a mask. A preliminary segmentation result for single tree crowns is obtained by watershed segmentation based on a canopy height model. Taking the brightness value of the DOM as a feature, secondary segmentation is performed on the crown area of the DOM based on the crown boundary to obtain optimized single-tree crown boundary information, which greatly increases the accuracy of remote-sensing tree crown extraction.
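
Watershed segmentation of a canopy height model (CHM) typically starts from seed markers at local height maxima (candidate tree tops). A sketch of that marker step only, using SciPy and a synthetic CHM (the window size, height cutoff, and function name are assumptions; the full watershed and DOM refinement are omitted):

```python
import numpy as np
from scipy import ndimage

def tree_top_markers(chm: np.ndarray, window: int = 3, min_height: float = 2.0):
    """Detect candidate tree tops as local maxima of a canopy height model;
    the labeled peaks serve as seed markers for watershed segmentation."""
    local_max = ndimage.maximum_filter(chm, size=window) == chm
    peaks = local_max & (chm >= min_height)      # drop low, non-canopy maxima
    markers, n_trees = ndimage.label(peaks)      # one integer label per tree top
    return markers, n_trees

chm = np.zeros((9, 9))
chm[2, 2] = 10.0                                 # two synthetic tree tops
chm[6, 6] = 8.0
markers, n = tree_top_markers(chm)
```

Each labeled marker would then seed one catchment basin in the watershed step, giving the preliminary per-tree crown segments the abstract describes.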

Gardening apparatus

A gardening apparatus includes one or more of a base, a fluid reservoir, and a plant tray or support disposed on the reservoir. The support is adapted for receiving one or more modular plant inserts, and can define a flow structure for channeling fluid to each insert. A pump supplies fluid from the reservoir to the plant tray or support, with a light assembly adapted to generate a spectrum of light for growth of plants from the inserts. A processor is configured for controlling fluid flow from the pump, the light spectrum generated by the lighting elements, or both. For example, the processor can use a dynamic recipe, algorithm, or control schedule to modulate the fluid flow or spectrum based on the plant type, growth stage, height, plant health data, digital phenotyping data, or ambient conditions, or a combination thereof.
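
A "dynamic recipe" keyed by plant type and growth stage, modulated by ambient conditions, could be as simple as a lookup table plus an adjustment rule. The table contents, field names, and 28 °C cutoff below are purely illustrative assumptions:

```python
# Hypothetical dynamic recipe table keyed by (plant type, growth stage)
RECIPES = {
    ("basil", "seedling"):   {"pump_minutes_per_day": 10, "spectrum": "blue-rich"},
    ("basil", "vegetative"): {"pump_minutes_per_day": 25, "spectrum": "full"},
}

def control_settings(plant_type: str, stage: str, ambient_temp_c: float):
    """Look up the base recipe and increase fluid flow in warm conditions."""
    recipe = dict(RECIPES[(plant_type, stage)])  # copy so the table is untouched
    if ambient_temp_c > 28.0:
        recipe["pump_minutes_per_day"] += 5      # extra irrigation when hot
    return recipe

settings = control_settings("basil", "vegetative", 30.0)
```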

Method and system for estimating crop coefficient and evapotranspiration of crops based on remote sensing

Methods and systems estimate crop coefficients of a crop. At least one image sensor system captures a plurality of multispectral images of the crop, and image data is derived from the multispectral images. At least one vegetation index of the crop is determined based on image data in at least a first spectral band. The reflectance of the crop increases monotonically and reaches at least 20% for at least one wavelength in the first spectral band. A crop coefficient of the crop is estimated based on the determined at least one vegetation index.
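
Crop coefficients are commonly estimated from a vegetation index via a linear empirical relation, and crop evapotranspiration then follows as ETc = Kc · ET0. A sketch under those assumptions (the coefficients a = 1.25 and b = 0.2 are illustrative calibration constants, not values from the patent):

```python
def crop_coefficient(ndvi: float, a: float = 1.25, b: float = 0.2) -> float:
    """Estimate the crop coefficient Kc from a vegetation index via a
    linear empirical relation Kc = a * NDVI + b."""
    return a * ndvi + b

def evapotranspiration(kc: float, et0: float) -> float:
    """Crop evapotranspiration ETc = Kc * ET0, where ET0 is the
    reference evapotranspiration (e.g., in mm/day)."""
    return kc * et0

kc = crop_coefficient(0.8)            # Kc = 1.25 * 0.8 + 0.2 = 1.2
etc = evapotranspiration(kc, 5.0)     # ETc in mm/day for ET0 = 5.0
```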

Method for selectively deploying sensors within an agricultural facility

One variation of a method for deploying sensors within an agricultural facility includes: accessing scan data of a set of modules deployed within the agricultural facility; extracting characteristics of plants occupying the set of modules from the scan data; selecting a first subset of target modules from the set of modules, each target module in the first subset containing a group of plants exhibiting characteristics representative of plants occupying modules neighboring the target module; for each target module, scheduling a robotic manipulator within the agricultural facility to remove a particular plant from a particular plant slot in the target module and load the particular plant slot with a sensor pod from a population of sensor pods deployed in the agricultural facility; and monitoring environmental conditions at target modules in the first subset based on sensor data recorded by the sensor pods.
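
Selecting modules whose plants are "representative" can be sketched as ranking modules by distance to a mean feature vector. This simplifies the abstract's neighborhood-based criterion to a global mean, and the feature layout is an assumption:

```python
import numpy as np

def select_target_modules(features: np.ndarray, k: int = 1):
    """Rank modules by how representative their plant characteristics are
    (distance to the mean feature vector over all modules) and return the
    indices of the k most representative ones."""
    mean = features.mean(axis=0)
    dist = np.linalg.norm(features - mean, axis=1)
    return np.argsort(dist)[:k].tolist()

# Rows: modules; columns: plant characteristics (e.g., height, leaf area)
feats = np.array([[10.0, 3.0],
                  [11.0, 3.1],
                  [30.0, 9.0]])       # third module is an outlier
targets = select_target_modules(feats, k=1)
```

Sensor pods would then be scheduled into plant slots of the returned modules, so their readings generalize to the unsensed neighbors.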