Patent classifications
A01B69/001
ADVANCED MOVEMENT THROUGH VEGETATION WITH AN AUTONOMOUS VEHICLE
Disclosed here are methods and systems for automatically operating automated vehicles moving through vegetation obstacles with minimal damage, comprising receiving image(s) depicting vegetation obstacle(s) at least partially blocking a path of an automated vehicle executing a mission, analyzing the image(s) to extract one or more obstacle attributes of the vegetation obstacle(s), computing a plurality of movement patterns for operating the automated vehicle to cross the vegetation obstacle(s) based on one or more vehicle attributes of the automated vehicle with respect to one or more of the obstacle attributes where each movement pattern defines one or more movement parameters of the automated vehicle, selecting one of the movement patterns estimated to reduce a cost of damage to the automated vehicle and/or to the one or more vegetation obstacles, and outputting instructions for operating the automated vehicle to move through the vegetation obstacle(s) according to the selected movement pattern.
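The selection step above can be sketched as a minimum-cost search over candidate movement patterns. The cost model below is a hypothetical stand-in (the abstract does not specify one); the attribute names `speed`, `steering_angle`, `obstacle_stiffness`, and `vehicle_mass` are illustrative assumptions.

```python
from dataclasses import dataclass

@dataclass
class MovementPattern:
    speed: float           # m/s, a movement parameter of the vehicle
    steering_angle: float  # degrees

    def damage_cost(self, obstacle_stiffness: float, vehicle_mass: float) -> float:
        # Hypothetical cost: damage grows with impact energy (~speed^2,
        # scaled by obstacle stiffness) and with aggressive steering.
        return obstacle_stiffness * self.speed ** 2 \
            + 0.1 * vehicle_mass * abs(self.steering_angle)

def select_pattern(patterns, obstacle_stiffness, vehicle_mass):
    # Pick the pattern estimated to minimize damage cost.
    return min(patterns, key=lambda p: p.damage_cost(obstacle_stiffness, vehicle_mass))
```

In practice the cost would be learned or calibrated from the extracted obstacle attributes; the point here is only the minimize-over-candidates structure of the claim.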
CAMERA ARRAY CALIBRATION IN A FARMING MACHINE
The calibration system of the farming machine receives images from each camera of the camera array. The images comprise visual information representing a view of a portion of an area surrounding the farming machine. To calibrate a pair of cameras including a first camera and a second camera, the calibration system determines a relative pose between the pair of cameras by extracting relative position and orientation characteristics from visual information in both an image received from the first camera and an image received from the second camera. The calibration system identifies a calibration error for the pair of cameras based on a comparison of the relative pose with an expected pose between the first camera and the second camera. The calibration system transmits a notification to an operator of the farming machine that describes the calibration error and instructions for remedying the calibration error.
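The comparison of measured versus expected relative pose can be sketched as a tolerance check. The pose representation (3D translation plus a single yaw angle) and the tolerance values are simplifying assumptions for illustration, not the patent's actual parameterization.

```python
import math

def calibration_errors(relative_pose, expected_pose, pos_tol=0.02, ang_tol=2.0):
    """Compare a measured relative pose against the expected pose.

    Poses are (x, y, z, yaw_deg) tuples -- a simplified stand-in for a
    full 6-DoF pose. Returns human-readable error descriptions suitable
    for an operator notification; empty list means within tolerance.
    """
    translation_error = math.dist(relative_pose[:3], expected_pose[:3])
    rotation_error = abs(relative_pose[3] - expected_pose[3])
    errors = []
    if translation_error > pos_tol:
        errors.append(f"translation off by {translation_error:.3f} m")
    if rotation_error > ang_tol:
        errors.append(f"rotation off by {rotation_error:.1f} deg")
    return errors
```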
VIRTUAL SAFETY BUBBLES FOR SAFE NAVIGATION OF FARMING MACHINES
An autonomous farming machine navigable in an environment for performing farming action(s) is disclosed. The farming machine receives a notification from a manager that there are no obstacles in the blind spots of the detection system. The farming machine applies an obstacle detection model to the captured images to verify that there are no obstacles in unobstructed views. The farming machine determines a configuration of the farming machine. The farming machine determines a virtual safety bubble for the farming machine to autonomously perform the farming action(s) based on the determined configuration. The farming machine detects an obstacle in the environment by applying the obstacle detection model to the captured images. The farming machine determines that the obstacle is entering the virtual safety bubble. In response to determining that the obstacle is entering the virtual safety bubble, the farming machine terminates operation of the farming machine and/or enacts preventive measures.
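The bubble-intrusion check that triggers termination can be sketched as a distance test against a configuration-dependent radius. The radius formula and function names are illustrative assumptions; the patent does not define the bubble geometry in the abstract.

```python
import math

def bubble_radius(machine_width: float, implement_width: float,
                  safety_margin: float = 1.0) -> float:
    # Hypothetical: the virtual safety bubble scales with the widest
    # part of the determined machine configuration plus a margin.
    return max(machine_width, implement_width) / 2.0 + safety_margin

def check_obstacle(obstacle_xy, machine_xy, radius) -> str:
    # If a detected obstacle enters the bubble, terminate operation
    # (or enact preventive measures); otherwise continue the mission.
    if math.dist(obstacle_xy, machine_xy) <= radius:
        return "terminate"
    return "continue"
```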
AGRICULTURAL MACHINE, AND DEVICE AND METHOD FOR CONTROLLING AGRICULTURAL MACHINE
An agricultural machine includes one or more illuminators to illuminate surroundings of the agricultural machine in a traveling direction thereof, and a controller to control self-driving of the agricultural machine while keeping at least one of the one or more illuminators deactivated at nighttime.
DETERMINING VEHICLE HEADING AND IMPLEMENT HEADING OF LOW SPEED FARMING MACHINE
A system and a method are disclosed for determining a heading of a vehicle and a heading of an implement of a farming machine when the farming machine is stationary or moving at a speed below a threshold speed. The vehicle and the implement are attached together via a pivot hitch. A farming machine management system receives coordinates from a first location sensor coupled to the vehicle and a second location sensor coupled to the implement. The farming machine management system determines intersection points between a first circle centered at the first location sensor and a second circle centered at the second location sensor. The farming machine management system selects one of the intersection points based on an output of a machine learning model. The farming machine management system determines the headings of the vehicle and the implement and generates instructions for operating the farming machine based on the headings.
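The geometric core of this claim, finding the intersection points of two circles centered at the two location sensors, is standard planar geometry and can be sketched directly. The disambiguation between the two candidate points (which the patent assigns to a machine learning model) is not reproduced here.

```python
import math

def circle_intersections(c1, r1, c2, r2):
    """Return the 0, 1, or 2 intersection points of two circles.

    c1, c2 are (x, y) centers; r1, r2 are radii. Uses the classic
    radical-line construction.
    """
    (x1, y1), (x2, y2) = c1, c2
    d = math.dist(c1, c2)
    if d == 0 or d > r1 + r2 or d < abs(r1 - r2):
        return []  # coincident centers, too far apart, or one inside the other
    # Distance from c1 to the radical line, and half-chord length.
    a = (r1 ** 2 - r2 ** 2 + d ** 2) / (2 * d)
    h = math.sqrt(max(r1 ** 2 - a ** 2, 0.0))
    # Midpoint on the line between centers, then offset perpendicular to it.
    mx = x1 + a * (x2 - x1) / d
    my = y1 + a * (y2 - y1) / d
    ox = h * (y2 - y1) / d
    oy = h * (x2 - x1) / d
    return [(mx + ox, my - oy), (mx - ox, my + oy)]
```

Given the selected intersection point, the vehicle and implement headings would follow from the angles between each sensor position and that point; a learned model picks the physically plausible candidate of the two.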
Working vehicle
A working vehicle is capable of autonomously traveling on a target traveling route and appropriately performing illumination during autonomous traveling to enable an operator to reliably confirm the presence of obstacles or the like on the target traveling route. The working vehicle includes a traveling body to autonomously travel on the target traveling route, illumination lamps located on the traveling body to respectively illuminate different directions, and a controller to change a control relating to a way of turning on the illumination lamps during the autonomous traveling.
Autonomous mobile robots for movable production systems
A system for performing autonomous agriculture within an agriculture production environment includes one or more agriculture pods, a stationary robot system, and one or more mobile robots. The agriculture pods include one or more plants and one or more sensor modules for monitoring the plants. The stationary robot system collects sensor data from the sensor modules, performs farming operations on the plants according to an operation schedule based on the collected sensor data, and generates a set of instructions for transporting the agriculture pods within the agriculture production environment. The stationary robot system communicates the set of instructions to the agriculture pods. The mobile robots transport the agriculture pods between the stationary robot system and one or more other locations within the agriculture production environment according to the set of instructions.
Method for Treating Plants in a Field
A method for treating plants in a field, in which a specific crop is planted, has the following steps: selecting a treatment tool for treating plants; capturing an image of the field, the image being correlated with positional information; determining a position of a plant to be treated in a field, using a neural network into which the captured image is fed, wherein the neural network has multiple specific classes and a general class, the crop belongs to one of the specific classes, and plants not corresponding to the crop belong to both one of the specific classes and to the general class, or at least belong to the general class; directing the treatment tool to the position of the plant; and treating the plant using the treatment tool.
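The class logic above, where the crop belongs only to a specific class while non-crop plants fall in the general class (and possibly a specific class too), reduces to a simple membership test when deciding which detected plants to treat. The data layout below is an illustrative assumption; the actual method feeds images through a neural network.

```python
def plants_to_treat(detections, crop_class):
    """Select positions of plants whose class labels exclude the crop.

    detections: list of (position, classes) pairs, where classes is the
    set of labels the network assigned to the plant. Per the scheme
    above, non-crop plants always carry the general class, so any plant
    not labeled as the crop is a treatment candidate.
    """
    return [pos for pos, classes in detections if crop_class not in classes]
```

The treatment tool would then be directed to each returned position.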
IMPLEMENT MANAGEMENT SYSTEM FOR DETERMINING IMPLEMENT STATE
An implement management system detects implement wear and monitors implement states to modify operating modes of a vehicle. The system can determine implement wear using the pull of the implement on the vehicle, the force and angle of which is represented by an orientation vector. The system may measure a current orientation vector and determine an expected orientation vector using sensors and a model (e.g., a machine learned model). Additionally, the implement management system can determine an implement state based on images of the soil and the implement captured by a camera onboard the vehicle during operation. The system may apply different models to the images to determine a likely state of the implement. The difference between the expected and current orientation vectors or the determined implement state may be used to determine whether and how the vehicle's operating mode should be modified.
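The comparison between the current and expected orientation vectors can be sketched as an angular-deviation check against a threshold. The 2D vector representation and the 10-degree threshold are illustrative assumptions; the patent leaves the model and decision rule unspecified in the abstract.

```python
import math

def vector_deviation_deg(current, expected):
    # Angle between two 2D orientation vectors (force direction of the
    # implement's pull on the vehicle), in degrees.
    dot = current[0] * expected[0] + current[1] * expected[1]
    norms = math.hypot(*current) * math.hypot(*expected)
    # Clamp for floating-point safety before acos.
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / norms))))

def should_modify_mode(current, expected, threshold_deg=10.0):
    # A large deviation between expected and measured pull suggests
    # implement wear or an abnormal implement state.
    return vector_deviation_deg(current, expected) > threshold_deg
```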
AGRICULTURAL MACHINE
An agricultural machine includes a traveling vehicle body, a detector to detect objects in an area surrounding the traveling vehicle body, a detection adjuster to adjust a detection direction of the detector, a position sensor to detect a position of the traveling vehicle body, and an information acquirer to acquire information relating to an entrance/exit of an agricultural field. The detection adjuster is operable to perform a first adjustment of orienting the detection direction toward an end point of the entrance/exit when the traveling vehicle body is traveling toward the entrance/exit and a distance from the position of the traveling vehicle body to the entrance/exit included in the information is a predetermined value or less, and a second adjustment of orienting the detection direction toward the end point of the entrance/exit or a space forward of the end point in a traveling direction when the traveling vehicle body is traveling on the entrance/exit.
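The two-stage adjustment above is essentially a state selection driven by position relative to the entrance/exit. A minimal sketch, with hypothetical state names and a caller-supplied distance threshold standing in for the claimed "predetermined value":

```python
def detection_adjustment(distance_to_entrance: float,
                         on_entrance: bool,
                         threshold: float) -> str:
    # Second adjustment: traveling on the entrance/exit itself, so aim
    # the detector at its end point or the space forward of it.
    if on_entrance:
        return "second_adjustment"
    # First adjustment: approaching and within the predetermined distance,
    # so orient the detection direction toward the entrance/exit end point.
    if distance_to_entrance <= threshold:
        return "first_adjustment"
    return "default"
```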