Patent classifications
G05D1/6486
Mobile work machine control based on control zone map data
Control zones are identified on a thematic map and work machine actuator settings are identified for each control zone. A position of the work machine is sensed and actuators on the work machine are controlled based on the control zone that the work machine is in, and based upon the actuator settings corresponding to that control zone. The control zone is then divided, on a display, into a harvested portion, on which an observed condition value is shown, and an unharvested portion, on which an estimated value of the condition is shown.
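A minimal sketch of the zone-based control described above, with hypothetical zone names, bounds, and actuator settings (the real system works from a thematic map rather than rectangles):

```python
# Illustrative control zone map: each zone has a bounding box and the
# actuator settings to apply while the machine is inside it.
ZONES = {
    "zone_a": {"bounds": (0, 0, 50, 50), "header_height_cm": 20},
    "zone_b": {"bounds": (50, 0, 100, 50), "header_height_cm": 35},
}

def settings_for_position(x, y):
    """Return (zone name, actuator settings) for the zone containing (x, y)."""
    for name, zone in ZONES.items():
        x0, y0, x1, y1 = zone["bounds"]
        if x0 <= x < x1 and y0 <= y < y1:
            return name, {k: v for k, v in zone.items() if k != "bounds"}
    return None, {}
```

The sensed machine position is looked up on every control cycle, so the actuators retune automatically as the machine crosses a zone boundary.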
Direction Information for a Harvesting Vehicle
A process for generating directional information indicating a target direction for a combine harvester. Point cloud data representing a region ahead of the harvester is obtained and processed to identify a target region enclosing a representation of unharvested crop. The directional information is generated by processing an edge of the target region to determine a target direction that would result in the harvester harvesting the edge of the unharvested crop.
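The edge-to-direction step above can be sketched as a least-squares line fit over 2-D edge points extracted from the point cloud; the fitted slope gives the heading that keeps the harvester tracking the crop edge (the point format and fitting method are assumptions, not the patent's actual process):

```python
import math

def target_direction(edge_points):
    """Fit a heading (radians) to 2-D (x, y) edge points of the
    unharvested-crop region via least squares on the centered points."""
    n = len(edge_points)
    mx = sum(p[0] for p in edge_points) / n
    my = sum(p[1] for p in edge_points) / n
    sxx = sum((p[0] - mx) ** 2 for p in edge_points)
    sxy = sum((p[0] - mx) * (p[1] - my) for p in edge_points)
    return math.atan2(sxy, sxx)  # angle of the fitted edge line
```

Using atan2 rather than a plain slope keeps the fit stable even when the edge runs nearly perpendicular to the x-axis.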
Work vehicle guidance and/or automation with respect to identified regions of interest in a work area
Computer-implemented systems and methods are provided for automatically conforming guidance of work vehicles to detected conditions in a work area. Upon determining a deviation of a work vehicle's current movement path from a first planned movement path with respect to the work area, at least a portion of the work area may be identified as a region of interest, along with coverage characteristics associated with the region of interest, based at least in part on the determined deviation. A second planned movement path is generated for the work vehicle and/or another work vehicle subsequently traversing the work area relative to the region of interest, wherein the generated second planned movement path accounts for the coverage characteristics associated with the region of interest. The second planned movement path may be generated to avoid the region, prevent turns within the region, optimize work vehicle operation within the work area generally, etc.
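One hedged sketch of the deviation-detection step: compare the current movement path against the first planned path point by point and flag contiguous stretches that exceed a cross-track threshold as candidate regions of interest (the threshold and point pairing are illustrative assumptions):

```python
def deviation_regions(planned, actual, threshold=1.0):
    """Return index ranges where the actual path deviates from the
    planned path by more than `threshold` (same units as the points)."""
    regions, current = [], None
    for i, (p, a) in enumerate(zip(planned, actual)):
        d = ((p[0] - a[0]) ** 2 + (p[1] - a[1]) ** 2) ** 0.5
        if d > threshold:
            current = current or [i, i]
            current[1] = i
        elif current:
            regions.append(tuple(current))
            current = None
    if current:
        regions.append(tuple(current))
    return regions
```

A second planned path could then be generated to avoid, or constrain turns within, the flagged index ranges.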
Determination of task area boundaries during driver-assisted construction vehicle operation
Construction vehicles using a teach and repeat system can ascertain boundaries of a task area while working, without knowing a perimeter of the task area before starting the task. The system maps a first path of the vehicle by recording a position of a vehicle moving from a first start position to a first end position. A second path is predicted, based on a shape of the first path, the second path having a second start position and a second end position. A third path is mapped from a third start position to a third end position. Boundaries of the task area are ascertained based on the first path, the first start position, the second start position, the third start position, the third path, the first end position, the second end position, and the third end position.
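A simplified sketch of the teach-and-repeat idea, assuming straight parallel passes: predict the next path by offsetting the recorded one laterally, then bound the task area over all mapped and predicted path points (the offset model and bounding-box boundary are assumptions for illustration):

```python
def predict_next_path(path, lateral_offset):
    """Predict the next pass by shifting the recorded (x, y) path
    sideways by one implement width (assumes straight rows)."""
    return [(x, y + lateral_offset) for x, y in path]

def task_boundary(paths):
    """Axis-aligned bounding box over all path points: (min_x, min_y, max_x, max_y)."""
    xs = [x for path in paths for x, _ in path]
    ys = [y for path in paths for _, y in path]
    return min(xs), min(ys), max(xs), max(ys)
```

Each additional mapped pass refines the boundary estimate without the perimeter being known in advance.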
Sweeping method of swimming pool cleaning robot and cleaning robot
The present application discloses a sweeping method of a swimming pool cleaning robot and a cleaning robot, the method including: acquiring map information about an area to be cleaned; planning a first sweeping path based on the map information, the first sweeping path meeting pre-set cleaning parameter requirements; controlling the cleaning robot to travel and perform a cleaning operation based on the first sweeping path; determining whether the cleaning operation has ended, and if so, controlling the cleaning robot to travel to a missed area so as to perform supplementary sweeping. This application can improve sweeping coverage and efficiency.
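A first sweeping path like the one described is commonly a boustrophedon (lawn-mower) pattern; here is a minimal sketch over a rectangular floor, with the pool dimensions and lane pitch as illustrative parameters:

```python
def plan_sweep(width, height, spacing):
    """Boustrophedon sweeping path over a rectangular area: alternate
    left-to-right and right-to-left lanes separated by `spacing`."""
    path, y, left_to_right = [], 0, True
    while y <= height:
        lane = [(0, y), (width, y)]
        path.extend(lane if left_to_right else lane[::-1])
        left_to_right = not left_to_right
        y += spacing
    return path
```

After the first sweep completes, any cells of the map not covered by the executed path would be queued as missed areas for supplementary sweeping.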
Robot sensing animal waste located indoors and controlling method thereof
A robot includes a gas collector; a gas sensor configured to sense a gas collected by the gas collector; a camera; a driver; and a processor configured to: based on the gas sensed by the gas sensor being identified as a first-type gas, control the driver to decrease a moving speed of the robot; identify a gas generating area based on data sensed by the gas sensor while the robot is moving at the decreased speed; and control the camera to capture the identified gas generating area.
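The processor's logic can be sketched as: slow down on a first-type gas detection, then treat the position with the peak concentration along the slow traverse as the gas generating area (the gas label, speeds, and reading format are hypothetical):

```python
def respond_to_gas(gas_type, readings, normal_speed=0.5, slow_speed=0.1):
    """If the sensed gas is the first type, return the reduced speed and
    the peak-concentration position for the camera to capture.
    `readings` is a list of (position, concentration) pairs."""
    if gas_type != "type_1":
        return normal_speed, None
    peak_pos, _ = max(readings, key=lambda r: r[1])
    return slow_speed, peak_pos
```

Slowing down gives the gas sensor more samples per unit distance, which is what makes the peak localization meaningful.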
REAL-TIME ROBOT-MOUNTED SPILL DETECTION SYSTEM WITH MULTI-CAMERAS UTILIZING DEEP LEARNING
A system for detecting and addressing a spill is provided. The system includes an imaging device coupled to a mobile robot and a controller including a processor and memory. The memory includes an interface module that receives a plurality of images, including infrared thermal and RGB images, from the imaging device; an artificial intelligence (AI) module that evaluates the plurality of images to determine the presence or absence of a spill; and an alert module that provides an alert of the spill, marks an area of the spill, or initiates a cleanup of the spill. The AI module provides an output to the alert module when a spill has occurred, evaluating the thermal and RGB images together for training and real-time inference on the mobile robot using a voting module that executes an ensemble algorithm, or a secondary layer based on the separate outputs, to generate a single output.
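A minimal sketch of the voting module's fusion step, assuming the thermal and RGB models each emit a spill probability (soft voting with an illustrative weight and decision threshold; the actual ensemble may differ):

```python
def vote(thermal_prob, rgb_prob, thermal_weight=0.5, threshold=0.5):
    """Fuse per-modality spill probabilities into a single decision.
    Returns True when the weighted score indicates a spill, which
    would trigger the alert module."""
    score = thermal_weight * thermal_prob + (1 - thermal_weight) * rgb_prob
    return score >= threshold
```

Fusing the two modalities helps because thermal imaging catches spills with little visual contrast, while RGB catches spills at ambient temperature.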
POOL ROBOT CONTROL METHOD AND APPARATUS, STORAGE MEDIUM, AND POOL ROBOT
Embodiments of the present disclosure provide a pool robot control method and apparatus, a storage medium, and a pool robot. The method includes: obtaining image information of a target water region captured by a pool robot, where the pool robot operates in the target water region; analyzing the image information to determine an analysis result; and, if the analysis result indicates that a target object is present in the target water region, determining a target position of the target object and controlling the pool robot to move to the target position and perform a target operation corresponding to a type of the target object.
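The type-to-operation dispatch can be sketched as a lookup from the analysis result to a move-and-act command (the object types and operation names here are hypothetical placeholders):

```python
# Illustrative mapping from detected object type to the target operation.
OPERATIONS = {"debris": "suction", "algae": "scrub", "obstacle": "avoid"}

def handle_detection(analysis):
    """`analysis` is the image-analysis result: None when no target
    object is found, else an (object_type, position) pair."""
    if analysis is None:
        return None
    obj_type, position = analysis
    return {"move_to": position, "operation": OPERATIONS.get(obj_type, "inspect")}
```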
METHOD FOR CONTROLLING ROBOT, COMPUTER PROGRAM PRODUCT, ROBOT, AND STORAGE MEDIUM
A method for controlling a robot, where the method includes: obtaining an image collected in real time; determining whether a current scene is a specific scene according to the image collected in real time; and controlling, when the current scene is determined to be the specific scene, the robot to execute a cleaning task corresponding to the specific scene.
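The scene-to-task step reduces to a dispatch once a classifier has labeled the current image; a minimal sketch with hypothetical scene labels and task names:

```python
# Illustrative mapping from a classified scene label to its cleaning task.
SCENE_TASKS = {"kitchen": "degrease_mop", "carpet": "deep_vacuum"}

def task_for_scene(scene_label):
    """Return the cleaning task for a specific scene; any other scene
    continues the default cleaning task."""
    return SCENE_TASKS.get(scene_label, "default_clean")
```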
POOL CLEANING DEVICE AND ITS CONTROLLING METHOD
Disclosed are a pool cleaning device and a method for controlling the pool cleaning device. The method includes: controlling the device to move along a main path in a pool; obtaining at least one image of the current environment within an angle of view of at least one image sensor of the device; determining at least one cleaning target in the current environment based on the at least one image; controlling the device to leave the main path to clean the at least one cleaning target; and controlling the device to return to the main path and continue moving along it after cleaning the at least one cleaning target.
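The leave-and-return behavior above can be sketched as a main-path traversal with detours: when a detected cleaning target lies near the current waypoint, the device visits it and resumes from the same waypoint (the nearness test and distance metric are illustrative assumptions):

```python
def run_with_detours(main_path, targets, near=1.0):
    """Follow `main_path`; detour to any cleaning target within `near`
    (Manhattan distance) of the current waypoint, then return to the
    waypoint and continue along the main path."""
    visited, remaining = [], list(targets)
    for wp in main_path:
        visited.append(wp)
        for t in list(remaining):
            if abs(wp[0] - t[0]) + abs(wp[1] - t[1]) <= near:
                visited.append(t)   # leave the main path to clean the target
                visited.append(wp)  # return to the main path
                remaining.remove(t)
    return visited
```

Resuming from the departure waypoint keeps the main path's coverage guarantee intact while still servicing opportunistic targets.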