Patent classifications
A01B69/001
Machine-learned tillage plug detection in an autonomous farming vehicle
A detection system detects malfunctions in an autonomous farming vehicle during an autonomous routine using one or more models and data from sensors coupled to the autonomous farming vehicle. The models may include machine-learned models trained on the sensor data and configured to identify objects indicative of an operational or malfunctioning component within a tilling assembly such as a tilling shank or sweep. Additionally, a machine-learned model may be trained on sensor data to detect whether debris has plugged the tilling assembly of the autonomous farming vehicle. In response to detecting a malfunction or a plug, the detection system may modify the autonomous routine (e.g., pausing operation) or provide information for the malfunction to be addressed (e.g., the likely location of a malfunctioning sweep that has detached from the tilling assembly).
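The response logic described above (pause the routine or surface information when a plug is detected) can be sketched as a small handler. `Detection`, the `routine` dict, and the confidence threshold are hypothetical stand-ins for the patent's ML model output and vehicle routine state, not anything specified in the abstract.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    """Illustrative output of a plug/malfunction classifier."""
    plug_detected: bool
    confidence: float

def handle_detection(det, routine, threshold=0.8):
    """Modify the autonomous routine when a plug is detected with
    sufficient confidence; otherwise leave the routine unchanged."""
    if det.plug_detected and det.confidence >= threshold:
        routine["state"] = "paused"
        routine["alert"] = "tilling assembly plug detected"
    return routine

routine = {"state": "tilling"}
routine = handle_detection(Detection(True, 0.93), routine)
```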
SYSTEMS AND METHODS FOR MONITORING PLANTS IN PLANT GROWING AREAS
Systems and methods for monitoring plants' conditions in one or more plant growing areas are presented. The system comprises a data collection system for providing characterization data about various parameters of plants in the one or more plant growing areas, the data collection system comprising data collection modules of at least first and second different types, comprising respectively one or more first type imaging devices of predetermined first field of view and first resolution, and one or more second type imaging devices of predetermined second field of view narrower than the first field of view and second resolution higher than the first resolution. The characterization data provided by the first type imaging device(s) comprises first type image data indicative of one or more plants in the plant growing area and of the location of at least one of the second type imaging devices with respect to said one or more plants; the characterization data provided by the second type imaging device(s) comprises second type image data indicative of one or more portions of plants in the plant growing area. The system further comprises a control system for activating at least one first type imaging device and at least one second type imaging device at least partially simultaneously, and for responding to operational data based on analysis of the first type image data and comprising navigation data to navigate the at least one second type imaging device, or at least one of the first type imaging devices, in the plant growing area.
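Since the wide-field (first type) image locates both the plants and the narrow-field (second type) device, the navigation data can be reduced to a pixel-offset calculation. A minimal sketch, assuming a known ground-sample scale; the function name, coordinates, and scale are illustrative, not taken from the patent.

```python
def navigation_offset(plant_px, device_px, metres_per_px):
    """From the wide-FOV (first type) image, compute how far (in metres)
    to move the narrow-FOV (second type) imaging device so it sits over
    the target plant. Assumes a top-down view with uniform scale."""
    dx = (plant_px[0] - device_px[0]) * metres_per_px
    dy = (plant_px[1] - device_px[1]) * metres_per_px
    return dx, dy

# e.g. plant seen at pixel (400, 300), device at (100, 300), 0.01 m/pixel
print(navigation_offset((400, 300), (100, 300), 0.01))  # (3.0, 0.0)
```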
IMAGE MONITORING FOR CONTROL OF INVASIVE GRASSES
The invention relates to weed mapping, monitoring and control, in particular to the mapping, monitoring and control of invasive annual grasses. A computer system comprises a receiving unit providing a processing unit with images of a geographical area, the images displaying the geographical area at points in time during a phenological cycle of a weed. At least two images depicting the weed at two different phenological stages are used, because two sequentially collected images are required in order to identify a characteristic temporal change. The processing unit analyzes the images to identify image areas showing a spectral signature characteristic of the weed, identifies geographical sub-areas corresponding to the identified image areas, and creates a weed distribution map with the identified geographical sub-areas marked as areas affected by the weed. An output unit outputs the map.
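The characteristic temporal change between the two phenological stages can be sketched as a per-cell difference in a single spectral band: invasive annual grasses cure earlier than surrounding vegetation, so a sharp drop in greenness between two dates flags a sub-area. The band values, grid layout, and threshold below are illustrative assumptions, not values from the patent.

```python
def weed_subareas(img_t1, img_t2, green_drop=0.2):
    """Flag grid cells whose spectral value drops characteristically
    between two phenological stages. Each image is a 2-D list of a
    single greenness-band value per geographical sub-area."""
    flagged = []
    for r, (row1, row2) in enumerate(zip(img_t1, img_t2)):
        for c, (v1, v2) in enumerate(zip(row1, row2)):
            if v1 - v2 >= green_drop:  # early senescence -> likely weed
                flagged.append((r, c))
    return flagged

img_may  = [[0.60, 0.60], [0.30, 0.60]]
img_june = [[0.20, 0.55], [0.28, 0.58]]
print(weed_subareas(img_may, img_june))  # [(0, 0)]
```

Only the top-left cell loses enough greenness between the two dates to be marked on the distribution map.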
PRECISION TREATMENT OF AGRICULTURAL OBJECTS ON A MOVING PLATFORM
Various embodiments relate generally to computer vision and automation to autonomously identify, and deliver a treatment for application to, an object among other objects; to data science and data analysis, including machine learning, deep learning, and other disciplines of computer-based artificial intelligence, to facilitate identification and treatment of objects; and to robotics and mobility technologies to navigate a delivery system. More specifically, the embodiments relate to an agricultural delivery system configured to identify and apply, for example, an agricultural treatment to an identified agricultural object. In some examples, a method may include receiving data representing a policy specifying a type of action for an agricultural object, selecting an emitter with which to perform the type of action for the agricultural object as one of one or more classified subsets, and configuring the agricultural projectile delivery system to activate the emitter to propel an agricultural projectile to intercept the agricultural object.
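The policy-to-emitter step can be sketched as a lookup from object class to action and emitter, returning an activation command for the delivery system. The policy table, class names, and emitter identifiers are hypothetical; the abstract describes the flow, not these values.

```python
# Hypothetical policy: object class -> (type of action, emitter to use)
POLICY = {
    "weed":    ("spray_herbicide",  "emitter_A"),
    "pest":    ("spray_insecticide", "emitter_B"),
    "blossom": ("thin",             "emitter_C"),
}

def configure_delivery(object_class):
    """Select the emitter per the policy and return an activation
    command for the projectile delivery system."""
    action, emitter = POLICY[object_class]
    return {"emitter": emitter, "action": action, "activate": True}

print(configure_delivery("weed"))
```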
System and method for monitoring the levelness of a multi-wing agricultural implement
A system for monitoring the levelness of a multi-wing agricultural implement may include a central frame section, a wing section pivotably coupled to the central frame section and a field contour sensor configured to generate data indicative of a contour of an aft portion of the field located rearward of the implement relative to a direction of travel of the implement. The system may further include a controller communicatively coupled to the field contour sensor. The controller may be configured to monitor the data received from the field contour sensor and assess a levelness of the implement based at least in part on the contour of the aft portion of the field.
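The controller's assessment can be sketched as a uniformity check on the aft contour: a level implement should leave a roughly uniform surface across the worked strip, so large deviations in the sampled heights suggest a tilted wing. The sample layout and tolerance are illustrative assumptions.

```python
def assess_levelness(contour_heights, tolerance=0.02):
    """Estimate implement levelness from aft field contour data.
    `contour_heights` is a list of surface heights (m) sampled across
    the width of the strip behind the implement."""
    mean = sum(contour_heights) / len(contour_heights)
    max_dev = max(abs(h - mean) for h in contour_heights)
    return {"level": max_dev <= tolerance, "max_deviation_m": max_dev}

# A 5 cm ridge at one edge indicates an unlevel wing section
print(assess_levelness([0.00, 0.01, 0.01, 0.05]))
```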
REMOTE CONTROL APPARATUS
A remote control apparatus capable of communicating with a control apparatus of an autonomously running work vehicle via a communication apparatus, the remote control apparatus comprising a communication apparatus, a control apparatus, a display apparatus, and cameras for obtaining images of the areas in front of and behind the vehicle, wherein the display apparatus is provided with at least a remote control region for controlling the autonomously running work vehicle, a peripheral image region for displaying images captured by the cameras, and a work status display region, and wherein the peripheral image region is provided with a frontal view and a rear view.
Implement control of vehicle and implement combination
An implement includes a global positioning system (GPS) receiver and an implement control system. The global positioning system receiver is configured to obtain position information for the implement. The implement control system is configured to determine a lateral error for the implement based on the position information, estimate a lateral error for a vehicle relative to the implement, the vehicle being attached to the implement, and steer the vehicle to guide the implement based on at least the lateral error for the implement and the lateral error for the vehicle.
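Steering the vehicle from both errors can be sketched as a proportional control law blending the implement's GPS-derived lateral error with the estimated vehicle error into one steering angle. The gains, sign convention, and saturation limit are illustrative assumptions, not the patent's values.

```python
def steering_command(implement_error, vehicle_error,
                     k_impl=0.8, k_veh=0.4, max_angle=0.5):
    """Blend implement and vehicle lateral errors (m, positive = right
    of the guidance line) into a steering angle (rad, positive = right),
    saturated at the steering limit."""
    angle = -(k_impl * implement_error + k_veh * vehicle_error)
    return max(-max_angle, min(max_angle, angle))

# Implement 0.5 m right of the line, vehicle 0.25 m right -> steer left
print(steering_command(0.5, 0.25))  # -0.5 (saturated at the limit)
```

Weighting the implement error more heavily reflects the abstract's goal: the vehicle is steered to guide the implement, not merely itself.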
Autonomous detection and control of vegetation
A method includes obtaining, by a treatment system configured to implement a machine learning (ML) algorithm, one or more images of a region of an agricultural environment near the treatment system, wherein the one or more images are captured from a real-world region where agricultural target objects are expected to be present; determining one or more parameters for use with the ML algorithm, wherein at least one of the parameters is based on one or more ML models related to identification of an agricultural object; determining a real-world target in the one or more images using the ML algorithm, wherein the ML algorithm is at least partly implemented using one or more processors of the treatment system; and applying a treatment to the target by selectively activating a treatment mechanism based on a result of determining the target.
WORKING VEHICLE
A working vehicle includes a vehicle body, a linkage provided on the vehicle body to link a working device, a controller to cause the vehicle body to perform automatic travel, a rear obstacle detector to detect obstacles behind the vehicle body, and a setting changer to change rear setting information regarding obstacle detection performed by the rear obstacle detector.
Rearward facing multi-purpose camera with windrow width indications
A crop windrow monitoring system includes an image sensor positioned to include a field of view facing a rearward direction of a power unit, and a visual monitor operable to display an image. A computing device is operable to determine an intended direction of movement of the power unit. The image is displayed on the visual monitor in a first mode having a first magnification when the intended direction of movement includes the rearward direction. The image is displayed on the visual monitor in a second mode having a second magnification and overlaid with indicia indicating a width of the windrow when the intended direction of movement includes the forward direction. The second magnification may be larger than the first magnification.
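The mode switch can be sketched as a function of the intended travel direction: rearward travel gets the plain first mode, forward travel gets the second mode with the windrow-width overlay. The concrete magnification values are illustrative assumptions; the abstract only states that the second magnification may be larger than the first.

```python
def display_mode(intended_direction, windrow_width_m):
    """Choose the monitor display mode from the power unit's intended
    direction of movement ('forward' or 'rearward')."""
    if intended_direction == "rearward":
        return {"mode": 1, "magnification": 1.0, "overlay": None}
    return {"mode": 2, "magnification": 2.0,
            "overlay": f"windrow width: {windrow_width_m:.1f} m"}

print(display_mode("forward", 1.5))
```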