Patent classifications
A01B69/001
System and method for controlling the operation of a seed-planting implement based on topographical features present within a field
In one aspect, a system for controlling the operation of a seed-planting implement may include a furrow-forming tool configured to form a furrow in soil present within a field. Furthermore, the system may include a sensor configured to capture data indicative of a topographical profile of the soil within the field. Additionally, a controller of the disclosed system may be configured to identify a topographical feature within the field based on the data received from the sensor. Furthermore, the controller may be configured to determine a position of the furrow-forming tool relative to the identified topographical feature. Additionally, the controller may be configured to initiate a control action to adjust the position of the furrow-forming tool when it is determined that the relative position between the furrow-forming tool and the identified topographical feature is offset from a predetermined positional relationship defined for the furrow-forming tool.
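The control logic described above can be sketched in a few lines. This is an illustrative sketch only, not the patented implementation: the names `TARGET_OFFSET_M`, `TOLERANCE_M`, and both functions are hypothetical, and a one-dimensional lateral offset stands in for the full positional relationship.

```python
# Hypothetical sketch: trigger a position adjustment when the furrow-forming
# tool drifts from a predetermined positional relationship with a detected
# topographical feature.

TARGET_OFFSET_M = 0.10   # assumed predetermined lateral offset (meters)
TOLERANCE_M = 0.02       # assumed allowed deviation before acting

def control_action_needed(tool_pos_m: float, feature_pos_m: float) -> bool:
    """Return True when the tool/feature offset departs from the target."""
    measured_offset = tool_pos_m - feature_pos_m
    return abs(measured_offset - TARGET_OFFSET_M) > TOLERANCE_M

def correction(tool_pos_m: float, feature_pos_m: float) -> float:
    """Signed adjustment (meters) that restores the target offset."""
    return TARGET_OFFSET_M - (tool_pos_m - feature_pos_m)
```

In practice the controller would feed `correction()` into an actuator command rather than apply it directly.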
System and method for determining soil clod size distribution using spectral analysis
In one aspect, a system for determining soil clod size distribution while an agricultural implement is being towed across a field by a work vehicle may include a vision-based sensor provided in operative association with either the work vehicle or the agricultural implement. As such, the vision-based sensor may be configured to capture vision data associated with a portion of the field present within its field of view. Furthermore, the system may include a controller configured to receive the vision data from the vision-based sensor. Moreover, the controller may be configured to analyze the received vision data using a spectral analysis technique to determine a size distribution of soil clods present within the field of view of the vision-based sensor.
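One common spectral-analysis approach (an assumption here, since the abstract does not name the technique) is to take a 2-D Fourier transform of an image patch and read the dominant spatial frequency from a radially averaged power spectrum; the corresponding period is a proxy for characteristic clod size. The function name and the synthetic test image below are hypothetical.

```python
import numpy as np

def dominant_scale_px(gray: np.ndarray) -> float:
    """Radially averaged 2-D power spectrum; returns the spatial period
    (in pixels) carrying the most energy."""
    f = np.fft.fftshift(np.fft.fft2(gray - gray.mean()))
    power = np.abs(f) ** 2
    h, w = gray.shape
    y, x = np.indices((h, w))
    r = np.hypot(y - h // 2, x - w // 2).astype(int)
    radial = np.bincount(r.ravel(), weights=power.ravel())
    counts = np.bincount(r.ravel())
    profile = radial / np.maximum(counts, 1)
    k = profile[1:].argmax() + 1      # dominant radial frequency (skip DC)
    return min(h, w) / k              # period in pixels

# Synthetic "soil" texture with a dominant 16-pixel period:
yy, xx = np.mgrid[0:128, 0:128]
img = np.sin(2 * np.pi * xx / 16.0)
print(dominant_scale_px(img))         # -> 16.0
```

Multiplying the returned period by the sensor's ground resolution (meters per pixel) would convert it to a physical clod size; a full size *distribution* would use the whole radial profile rather than only its peak.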
Tillage implement with vision sensors
A control system for a tillage implement broadly includes front and rear sensors, a leveling assembly, and a controller. The front sensor is positioned on a front of a central section and is configured to obtain height information indicative of a height of the front of the central section above the ground. The rear sensor is positioned on a rear of the central section and is configured to obtain height information indicative of a height of the rear of the central section above the ground. The leveling assembly is configured to adjust a front-to-rear orientation of the central section. The controller is configured to receive the height information from the front and rear sensors, and to provide instructions to the leveling assembly to adjust the front-to-rear orientation of the central section based on the received height information.
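A minimal sketch of such a leveling loop, assuming a simple proportional control law (the abstract does not specify one), could look like this; the gain and function name are hypothetical:

```python
# Hypothetical proportional leveling sketch: drive the leveling actuator
# until the front and rear heights of the central section agree.

K_P = 0.5  # assumed proportional gain

def leveling_command(front_height_m: float, rear_height_m: float) -> float:
    """Positive command pitches the front down (front sits higher than rear);
    zero means the section is level front to rear."""
    pitch_error = front_height_m - rear_height_m
    return K_P * pitch_error
```

A production controller would likely add integral action, rate limits, and sensor filtering, none of which are described in the abstract.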
Micro-precision application of multiple treatments to agricultural objects
Various embodiments relate generally to computer vision and automation for autonomously identifying and delivering a treatment to an object among other objects; to data science and data analysis, including machine learning, deep learning, and other disciplines of computer-based artificial intelligence, to facilitate identification and treatment of objects; and to robotics and mobility technologies for navigating a delivery system. More specifically, the embodiments relate to an agricultural delivery system configured to identify and apply, for example, an agricultural treatment to an identified agricultural object. In some examples, a method may include receiving data representing a policy specifying a type of action for an agricultural object, selecting an emitter with which to perform the type of action for the agricultural object as one of one or more classified subsets, and configuring the agricultural projectile delivery system to activate the emitter to propel an agricultural projectile to intercept the agricultural object.
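The policy-to-emitter step can be sketched as a simple lookup. Everything here is assumed for illustration: the object classes, action types, and emitter names are hypothetical, and a real system would derive the policy and capabilities from learned models and hardware configuration.

```python
# Hypothetical sketch: map a classified agricultural object to an action
# type via a policy, then pick an emitter capable of performing it.
from typing import Optional

POLICY = {            # assumed policy table: object class -> action type
    "weed": "herbicide",
    "crop_flower": "pollinate",
    "pest": "insecticide",
}
EMITTERS = {          # assumed emitter capabilities
    "emitter_a": {"herbicide", "insecticide"},
    "emitter_b": {"pollinate"},
}

def select_emitter(object_class: str) -> Optional[str]:
    """Return the first emitter able to perform the policy's action."""
    action = POLICY.get(object_class)
    for name, capabilities in EMITTERS.items():
        if action in capabilities:
            return name
    return None
```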
Obscurant emission to assist image formation to automate agricultural management and treatment
Various embodiments relate generally to computer vision and automation for autonomously identifying and delivering a treatment to an object among other objects; to data science and data analysis, including machine learning, deep learning, and other disciplines of computer-based artificial intelligence, to facilitate identification and treatment of objects; and to robotics and mobility technologies for navigating a delivery system. More specifically, the embodiments relate to an agricultural delivery system configured to identify and apply, for example, an agricultural treatment to an identified agricultural object. In some examples, a method may include calculating an amount of light originating from a light source as sensed at a location in which an agricultural object is interposed between the light source and an image capture device, causing emission of an obscurant, and directing the obscurant to a region interposed between the light source and the agricultural object.
Smart sprayer for precision agriculture
An illustrative smart sprayer for a precision agricultural implement includes a mount for coupling the sprayer to a modular tool arm of the implement, along with a spray head, an airflow generator, a solenoid valve, and an inspirator mixer located within a containment shroud. The smart sprayer is one of several alternative agricultural tools that can be selectively coupled to and operated by a control system of the common-platform precision agricultural implement, which includes a machine vision module for identifying and locating commodity plants and non-commodity plants and for positioning the agricultural tools laterally and vertically relative to commodity plant lines.
Control system for controlling filling based on load weight
A receiving vehicle is automatically identified and the number of times it is filled is automatically counted. Control signals can be generated based on the number of times the receiving vehicle is filled.
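The counting logic implied by the abstract is straightforward; the sketch below is an assumed realization, with a hypothetical per-vehicle fill limit standing in for whatever control signal the system actually generates.

```python
# Hypothetical sketch: identify each receiving vehicle by an ID and count
# how many times it has been filled; a control signal is derived from the
# count (here, whether the vehicle may accept another load).
from collections import defaultdict

class FillCounter:
    def __init__(self, max_fills: int):
        self.max_fills = max_fills           # assumed per-vehicle limit
        self.counts = defaultdict(int)       # vehicle id -> fill count

    def record_fill(self, vehicle_id: str) -> bool:
        """Count one fill; return True if another load is still allowed."""
        self.counts[vehicle_id] += 1
        return self.counts[vehicle_id] < self.max_fills
```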
Calibration of systems to deliver agricultural projectiles
Various embodiments relate generally to computer vision and automation for autonomously identifying and delivering a treatment to an object among other objects; to data science and data analysis, including machine learning, deep learning, and other disciplines of computer-based artificial intelligence, to facilitate identification and treatment of objects; and to robotics and mobility technologies for navigating a delivery system. More specifically, the embodiments relate to an agricultural delivery system configured to identify and apply, for example, an agricultural treatment to an identified agricultural object. In some examples, a method may include identifying an emitter of an agricultural projectile delivery system to calibrate a trajectory of an agricultural projectile to intercept a target, predicting a projectile impact site relative to a reference of alignment, determining one or more calibration parameters to align the projectile impact site and the target, and adjusting the trajectory based on the one or more calibration parameters.
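In the simplest model (an assumption; the patent likely covers richer trajectory models), the calibration parameter is just the planar offset between the predicted impact site and the target, applied to the aim point. Both function names below are hypothetical.

```python
# Hypothetical sketch: compute a 2-D calibration offset between the
# predicted impact site and the target, then shift the aim point by it.

def calibration_offset(impact_xy, target_xy):
    """Offset (dx, dy) that moves the predicted impact onto the target."""
    return (target_xy[0] - impact_xy[0], target_xy[1] - impact_xy[1])

def adjusted_aim(aim_xy, offset_xy):
    """Apply the calibration offset to the current aim point."""
    return (aim_xy[0] + offset_xy[0], aim_xy[1] + offset_xy[1])
```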
Systems, methods, and apparatuses for automated crop monitoring
A crop monitoring system may include an observation robot, a centralized server, and a user device. The observation robot may be autonomous and may include a suite of sensors, including two cameras oriented toward opposite sides of the robot to capture images of plants on either side of it. The centralized server may store and/or execute various instructions for operating and/or communicating with the observation robot; for processing data such as images, sensor measurements, and environmental data obtained from the robot; for analyzing such data; and for presenting the data to a user, such as via the user device.
System and method for presenting the surroundings of an agricultural implement
Systems and methods are disclosed herein for displaying images of certain surroundings of an agricultural implement, for example one including a frame extending between opposing distal ends of a length transverse to a working direction of the agricultural implement. Individual image regions of the surroundings of the agricultural implement are captured using cameras arranged on the agricultural implement and directed toward a working area in the working direction, wherein a corresponding display is generated on a user interface. One or more traveling conditions (e.g., an edge of the working area and/or an edge of the frame, respectively corresponding to a first end and/or second end of the frame) may be automatically projected in the working direction, wherein respective indicia corresponding to the projected traveling conditions are superimposed on the generated display. The indicia may optionally be modified dynamically based on determined changes in a projected course of the working direction.
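The projection of the frame's two distal ends along the working direction can be sketched geometrically. This is an assumed formulation (planar, straight-line course) with hypothetical names; the disclosed system may project a curved course and render the results as overlay indicia.

```python
# Hypothetical sketch: project the implement frame's two distal ends forward
# along the working direction so their predicted tracks can be drawn as
# indicia over the camera display.
import math

def projected_edge_tracks(center_xy, heading_rad, half_width_m, distances_m):
    """For each look-ahead distance, return (left, right) frame-edge
    positions along the projected straight-line course."""
    hx, hy = math.cos(heading_rad), math.sin(heading_rad)
    px, py = -hy, hx          # unit vector perpendicular to the heading
    tracks = []
    for d in distances_m:
        cx, cy = center_xy[0] + d * hx, center_xy[1] + d * hy
        left = (cx + half_width_m * px, cy + half_width_m * py)
        right = (cx - half_width_m * px, cy - half_width_m * py)
        tracks.append((left, right))
    return tracks
```

Re-running this as the heading changes would give the dynamic modification of the indicia that the abstract describes.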