G05D2111/60

AUTOMATIC SWIMMING POOL CLEANER WITH AUTO-SCHEDULING SYSTEMS AND METHODS
20240272647 · 2024-08-15

Automatic swimming pool cleaners for swimming pools and spas may automatically schedule, define, and/or plan a future cleaning cycle for the automatic swimming pool cleaner. The automatic swimming pool cleaner itself may plan the future cleaning cycle without receiving a cleaning order from a user, allowing the pool system to keep the pool or spa clean on its own rather than relying on an order from the user as traditionally required.
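The abstract does not disclose a specific scheduling rule; as an illustration only, a minimal Python sketch of self-scheduling might adapt the interval between cleaning cycles to a measured debris level, with no order from the user (the function name, debris scale, and base interval are all assumptions):

```python
from datetime import datetime, timedelta

def plan_next_cycle(last_cycle_end: datetime,
                    debris_level: float,
                    base_interval_hours: float = 48.0) -> datetime:
    """Plan the next cleaning cycle with no user order.

    Shortens the interval when the measured debris level (0..1) is
    high; keeps the full base interval when the pool stayed clean.
    """
    # Scale the interval between 25% and 100% of the base value.
    scale = max(0.25, 1.0 - debris_level)
    return last_cycle_end + timedelta(hours=base_interval_hours * scale)
```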

Spatial blind spot monitoring systems and related methods of use
12259733 · 2025-03-25

Embodiments of the present disclosure provide a system and a method of controlling a robot for autonomous navigation. The method includes receiving a set of point values defining LIDAR data from a LIDAR sensor scanning a 2D omnidirectional plane, receiving a sensor value from an ultrasonic sensor having a 3D field of view excluding the plane, and resolving an observable field of view for the LIDAR sensor that includes a blind spot of the LIDAR sensor. The LIDAR data is modified using the sensor value when the sensor value is less than one or more point values corresponding to the portion of the plane extending along the observable field of view, which indicates that the object is located in the blind spot. The modified LIDAR data indicates that the object is detected by the LIDAR sensor even though the object lies outside the 2D plane scanned by the sensor.
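The comparison the abstract describes (sensor value less than the LIDAR point values over the blind-spot bearings) can be sketched in a few lines of Python. This is an illustrative reading of the claim, not the patented implementation; the per-degree range array and the bearing-index bounds of the blind spot are assumptions:

```python
def fuse_blind_spot(lidar_ranges, ultrasonic_range, blind_start, blind_end):
    """Overwrite LIDAR points inside the blind spot when the ultrasonic
    sensor reports an object closer than the LIDAR reading there.

    lidar_ranges: list of ranges in meters, one per degree of bearing.
    blind_start/blind_end: bearing indices bounding the blind spot.
    """
    fused = list(lidar_ranges)
    for bearing in range(blind_start, blind_end + 1):
        # The ultrasonic value being less than the LIDAR point value
        # indicates an object hidden inside the blind spot.
        if ultrasonic_range < fused[bearing]:
            fused[bearing] = ultrasonic_range
    return fused
```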

Detecting and tracking features of plants for treatment in an agricultural environment

A method and system having instructions to perform actions includes obtaining images of an agricultural environment, including a first image comprising at least one background portion and one or more regions of interest. A machine learning (ML) algorithm is implemented on a portion of the first image that includes part of the background portion and the one or more regions of interest, detecting a plurality of objects associated with real-world objects in the agricultural environment in at least one region of interest, including detecting a first object and a second object. A second algorithm is implemented on the portion of the first image comprising the first object to detect one or more divided features of the first object. A feature of the one or more divided features is tracked across subsequent images as a moving platform traverses the agricultural environment, the second object is tracked, and a target action configured to target the second object is selected and applied to the second object via a target mechanism.
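The abstract does not specify how a feature is tracked across subsequent images; one common approach, shown here purely as a hedged sketch, is nearest-neighbor association of the feature's last known position with detections in the next frame (the function, the pixel-distance gate, and the coordinate tuples are all assumptions):

```python
def track_feature(prev_pos, detections, max_dist=20.0):
    """Associate a tracked feature with the nearest detection in the
    next image; return None if nothing is within max_dist pixels."""
    best, best_d = None, max_dist
    for det in detections:
        d = ((det[0] - prev_pos[0]) ** 2 + (det[1] - prev_pos[1]) ** 2) ** 0.5
        if d < best_d:
            best, best_d = det, d
    return best
```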

Systems and Methods for Dynamic Object Removal from Three-Dimensional Data

Systems and methods for generating simulation data based on real-world environments are provided. A method includes obtaining multi-modal sensor data indicative of a dynamic object within an environment of a robotic platform. The multi-modal sensor data is associated with a plurality of timesteps including a first timestep and a second timestep. The method includes providing the multi-modal sensor data indicative of the dynamic object within the environment as an input to a machine-learned dynamic object removal model. The method further includes receiving, as an output of the machine-learned dynamic object removal model in response to receipt of the multi-modal sensor data, a scene representation indicative of at least a portion of the environment, including a reconstructed region, based at least in part on removal of the dynamic object, at multiple levels of granularity. The scene representation is used as a template for generating different simulations within the depicted environment.
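The patent uses a machine-learned model; the underlying idea of reconstructing an occluded region from other timesteps can nonetheless be illustrated with a non-learned toy version. Everything here (the cell-dictionary scene format, the per-timestep occlusion masks) is an assumption for illustration only:

```python
def remove_dynamic_object(frames, dynamic_masks):
    """Reconstruct a static scene from multiple timesteps.

    frames: list of dicts mapping cell id -> observed value, one per timestep.
    dynamic_masks: per-timestep set of cells covered by the dynamic object.
    A cell occluded at one timestep is filled in from another timestep
    where the object had moved elsewhere.
    """
    scene = {}
    for frame, mask in zip(frames, dynamic_masks):
        for cell, value in frame.items():
            if cell not in mask and cell not in scene:
                scene[cell] = value
    return scene
```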

Systems, methods, and apparatus for using remote assistance to classify objects in an environment
12488592 · 2025-12-02

Example embodiments relate to techniques for enabling one or more systems of a vehicle (e.g., an autonomous vehicle) to request remote assistance to help the vehicle navigate in an environment. A computing device may be configured to receive a request for assistance from a vehicle to classify an object and to initiate display of a graphical user interface at a display device. The graphical user interface may be configured to visually represent the object and may comprise one or more graphical user interface elements to enable input to be provided for classifying the object. The computing device may also be configured to generate a response that includes a classification of the object based on detecting a selection of at least one of the one or more graphical user interface elements. Further, the computing device may be configured to transmit the response to the vehicle.
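The request/response exchange the abstract describes can be sketched as a pair of plain data structures plus a function that builds the response from the operator's GUI selection. The type names, fields, and indexing convention are illustrative assumptions, not the patented message format:

```python
from dataclasses import dataclass

@dataclass
class AssistanceRequest:
    vehicle_id: str
    object_snapshot: bytes   # sensor crop of the unclassified object
    candidate_labels: list   # labels offered as GUI elements to the operator

@dataclass
class AssistanceResponse:
    vehicle_id: str
    classification: str

def classify_with_operator(request: AssistanceRequest,
                           selection: int) -> AssistanceResponse:
    """Build the response from the operator's selected GUI element."""
    return AssistanceResponse(request.vehicle_id,
                              request.candidate_labels[selection])
```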

Robot, charging station, and robot charging system comprising same

Disclosed are a robot, a charging station, and a robot charging system, the charging station including: at least one indicator; at least one reflector configured to reflect light received from the outside to the at least one indicator; an interface configured to dock an external device; and a processor that, when it is detected that the external device is docked in the interface, supplies power to the docked external device through the interface.

Perception-Based Worksite Control System

A control system manages a plurality of mobile machines, each equipped with a visual perception system to capture perception data. Via an onboard controller, the control system applies an object detection operation to the perception data to detect a marker position corresponding to a visual marker. The onboard controller assesses a marker health status with respect to the detected marker position and transmits it to a central worksite server. The central worksite server aggregates the assessed marker health statuses from the plurality of mobile machines to determine an aggregate marker health status for the visual marker.
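The abstract leaves the aggregation rule open; a minimal sketch of what the central worksite server might do is a quorum vote over the per-machine assessments. The status strings, the quorum threshold, and the output labels are all assumptions for illustration:

```python
from collections import Counter

def aggregate_marker_health(statuses, degraded_quorum=0.5):
    """Aggregate per-machine assessments ('ok' / 'degraded' / 'missing')
    into one status for the visual marker.

    Flags the marker when at least `degraded_quorum` of the reporting
    machines saw it as degraded or missing.
    """
    counts = Counter(statuses)
    bad = counts["degraded"] + counts["missing"]
    return "needs_service" if bad >= degraded_quorum * len(statuses) else "ok"
```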

Remote assistance for autonomous vehicles in predetermined situations

Example systems and methods enable an autonomous vehicle to request assistance from a remote operator in certain predetermined situations. One example method includes determining a representation of an environment of an autonomous vehicle based on sensor data of the environment. Based on the representation, the method may also include identifying a situation from a predetermined set of situations for which the autonomous vehicle will request remote assistance. The method may further include sending a request for assistance to a remote assistor, the request including the representation of the environment and the identified situation. The method may additionally include receiving a response from the remote assistor indicating an autonomous operation. The method may also include causing the autonomous vehicle to perform the autonomous operation.

Travel route control of autonomous work vehicle to reduce travel distance

An autonomous work vehicle includes a position information obtaining unit, a driving unit, a control unit, and a memory storing a destination position. The position information obtaining unit includes a GNSS receiver that acquires the position of the autonomous work vehicle. The driving unit includes a motor, and the control unit includes a processor. The processor is configured to calculate the direction of the destination position relative to the current position of the autonomous work vehicle, using the current position and the stored destination position, and the driving unit drives the autonomous work vehicle in a traveling direction toward the direction calculated by the processor.
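The direction calculation from two GNSS positions can be sketched with a flat-earth approximation valid over the short distances a work vehicle covers; this is an illustrative formula, not the one claimed in the patent:

```python
import math

def bearing_to_destination(current, destination):
    """Direction (degrees clockwise from north) from the current GNSS
    position to the stored destination position, using a small-area
    flat-earth approximation.

    current / destination: (latitude, longitude) tuples in degrees.
    """
    dlat = destination[0] - current[0]
    # Scale the longitude difference by cos(latitude) so east-west
    # offsets are comparable to north-south ones.
    dlon = (destination[1] - current[1]) * math.cos(math.radians(current[0]))
    return math.degrees(math.atan2(dlon, dlat)) % 360.0
```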

AUTONOMOUS REFILLING BASE FOR AGRICULTURAL ROBOTS WITH MULTIPLE MAINTENANCE AND REFUEL FUNCTIONS

The present invention relates to an autonomous refilling base for agricultural robots, designed to increase their autonomy and enable continuous operation without human intervention. The base uses an automatic camera-guided navigation system to connect to the robot and perform maintenance and resupply. It has tanks for storing various products, such as herbicides and insecticides, and decontaminates the robot and its pipes. With solar panels and batteries, the base is energy self-sufficient, and it is equipped with RTK GPS and an internet connection, ensuring precise operation and real-time synchronization.