
Bio-hybrid odor-guided autonomous palm-sized air vehicle

A bio-hybrid odor-localizing autonomous air vehicle includes an airborne robotic platform having a navigation platform, a wireless transmitter communicatively coupled to a management console, and a biological sensor, mounted on the airborne robotic platform, that reacts to at least one odor. A controller is communicatively coupled to the airborne robotic platform, the navigation platform, and the biological sensor, and monitors the biological sensor. In response to the biological sensor detecting the at least one odor, the controller directs the airborne robotic platform to three-dimensionally map an olfactory plume of the odor using an olfactory-driven search pattern. The controller stores the three-dimensional map for later retrieval or transmits it to the management console via the wireless transmitter.
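The controller's detect-then-map behavior can be sketched in a few lines. This is a minimal illustration, not the patent's implementation: the class name `OdorPlumeMapper`, the detection threshold, and the grid positions are all assumptions.

```python
# Hypothetical sketch of the olfactory-driven mapping loop; names and the
# threshold model are illustrative, not from the patent.

class OdorPlumeMapper:
    def __init__(self, threshold):
        self.threshold = threshold      # minimum sensor response that counts as a detection
        self.plume_map = {}             # (x, y, z) -> recorded odor intensity

    def record(self, position, sensor_reading):
        """Store a map entry when the biological sensor reacts to the odor."""
        if sensor_reading >= self.threshold:
            self.plume_map[position] = sensor_reading
            return True
        return False

    def transmit(self):
        """Return the 3-D map in a form suitable for the management console."""
        return sorted(self.plume_map.items())

mapper = OdorPlumeMapper(threshold=0.5)
mapper.record((0, 0, 1), 0.8)   # detection -> mapped
mapper.record((1, 0, 1), 0.2)   # below threshold -> ignored
print(len(mapper.plume_map))    # 1
```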

Method for autonomously controlling a vehicle

The present application provides a method for autonomously controlling a vehicle, performed by a control system of the vehicle on the basis of a mission received from a mission controller, the method comprising: receiving a mission comprising a set of instructions from the mission controller; validating the mission by checking whether the mission meets a first set of requirements; executing the mission if the mission meets the first set of requirements; and rejecting the mission if it does not meet the first set of requirements.
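The validate-then-execute-or-reject flow can be sketched as follows. The requirement checks shown (a non-empty instruction set whose operations are all known) are assumptions for illustration, not the patent's actual first set of requirements.

```python
# Illustrative mission-validation sketch; the requirement checks and the
# instruction vocabulary are assumptions, not from the patent.

KNOWN_INSTRUCTIONS = {"goto", "hold", "return"}

def validate_mission(mission):
    """Check the mission against a first set of requirements."""
    instructions = mission.get("instructions", [])
    return bool(instructions) and all(
        step["op"] in KNOWN_INSTRUCTIONS for step in instructions
    )

def handle_mission(mission):
    """Execute the mission if valid, otherwise reject it."""
    if validate_mission(mission):
        return "executed"
    return "rejected"

print(handle_mission({"instructions": [{"op": "goto"}]}))        # executed
print(handle_mission({"instructions": [{"op": "fly_to_moon"}]}))  # rejected
```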

Mobile robot using artificial intelligence and controlling method thereof
11700989 · 2023-07-18

A mobile robot of the present disclosure includes: a traveling unit configured to move a main body; a cleaning unit configured to perform a cleaning function; a sensing unit configured to sense a surrounding environment; an image acquiring unit configured to acquire an image outside the main body; and a controller configured to generate a distance map indicating distance information from an obstacle for a cleaning area, based on information detected through the sensing unit and the image acquired through the image acquiring unit, divide the cleaning area into a plurality of detailed areas according to the distance information of the distance map, and control cleaning to be performed independently for each of the detailed areas. The area division in the map of the cleaning area is thereby optimized for a mobile robot that travels in straight lines.
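The distance-map idea can be sketched on a small occupancy grid: a breadth-first search from every obstacle cell gives each free cell its distance to the nearest obstacle, and thresholding that distance splits the cleaning area into detailed areas. The grid layout and the band boundary used here are illustrative assumptions.

```python
# Sketch of a grid distance map and distance-based area division;
# grid contents and the band threshold are assumptions.
from collections import deque

def distance_map(grid):
    """grid: list of rows, 1 = obstacle, 0 = free. Returns per-cell distance."""
    h, w = len(grid), len(grid[0])
    dist = [[None] * w for _ in range(h)]
    queue = deque()
    for y in range(h):
        for x in range(w):
            if grid[y][x] == 1:
                dist[y][x] = 0          # obstacle cells seed the BFS
                queue.append((y, x))
    while queue:
        y, x = queue.popleft()
        for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ny, nx = y + dy, x + dx
            if 0 <= ny < h and 0 <= nx < w and dist[ny][nx] is None:
                dist[ny][nx] = dist[y][x] + 1
                queue.append((ny, nx))
    return dist

grid = [[1, 0, 0, 0],
        [0, 0, 0, 0],
        [0, 0, 0, 0]]
d = distance_map(grid)
# Divide the free cells into "near-obstacle" and "open" detailed areas.
near = [(y, x) for y in range(3) for x in range(4) if 0 < d[y][x] <= 1]
open_area = [(y, x) for y in range(3) for x in range(4) if d[y][x] > 1]
```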

Vehicle

A vehicle traveling from a first area to a second area includes a traveling mechanism having a plurality of wheels and a traveling function, and a control unit configured to cause the traveling mechanism to perform a wheel cleaning operation and then cause the vehicle to enter the second area.

CONTROL DEVICE OF LIFTING PLATFORM FOR DETECTION DEVICE AND DETECTION DEVICE

Provided are a detection device and a control device of a lifting platform thereof. The control device is used for controlling the lifting platform of the detection device and comprises a first distance measuring sensor arranged on the top of the detection device, and a first processor connected with the first distance measuring sensor and used for obtaining a first distance measurement instruction and controlling the first distance measuring sensor, according to the first distance measurement instruction, to measure a first distance between the top of the detection device and an obstacle directly above the detection device. The first processor is further used for obtaining the first distance sent by the first distance measuring sensor, generating an elevation instruction according to the first distance, and controlling a lifting motor of the detection device to drive the lifting platform to rise to a target position.
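The elevation logic amounts to raising the platform to just under the obstacle measured directly above it. The safety margin and the helper name `elevation_target` below are assumptions made for illustration.

```python
# Minimal sketch of generating an elevation target from the first distance;
# the safety margin is an assumed parameter, not from the patent.

SAFETY_MARGIN_MM = 50

def elevation_target(first_distance_mm, current_top_mm):
    """Target platform height given distance from the device top to the obstacle."""
    headroom = first_distance_mm - SAFETY_MARGIN_MM
    return current_top_mm + max(headroom, 0)   # never command a downward "rise"

print(elevation_target(first_distance_mm=300, current_top_mm=1200))  # 1450
```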

ROBOTIC WORK TOOL SYSTEM, AND METHOD FOR DEFINING A WORKING AREA PERIMETER
20230015812 · 2023-01-19

A robotic work tool system (200) for defining a working area perimeter (105) surrounding a working area (150) in which a robotic work tool (100) is intended to operate. The robotic work tool system (200) comprises a boundary definition unit (300) comprising at least one position unit (175) for receiving position data; and at least one controller (210) for controlling operation of the boundary definition unit (300). The controller (210) is configured to receive, from the position unit (175), position data while the boundary definition unit (300) is moved around the working area (150) to define a preliminary working area perimeter (110). The controller (210) is further configured to identify, based on the received position data, a geometry of the preliminary working area perimeter (110) approximately corresponding to a predefined geometry, and to adjust the identified geometry to define an adjusted working area perimeter (105), wherein the identified geometry is adjusted to correspond to the predefined geometry.
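One concrete way to adjust a recorded preliminary perimeter toward a predefined geometry is to snap noisy corner positions to a clean shape. The sketch below assumes the predefined geometry is an axis-aligned rectangle and uses the bounding box of the recorded positions; the patent does not specify this particular scheme.

```python
# Assumed adjustment scheme: snap a noisy, roughly rectangular perimeter to
# the axis-aligned bounding rectangle of its recorded positions.

def adjust_to_rectangle(positions):
    """Return the four corners of the adjusted (rectangular) perimeter."""
    xs = [p[0] for p in positions]
    ys = [p[1] for p in positions]
    x0, x1, y0, y1 = min(xs), max(xs), min(ys), max(ys)
    return [(x0, y0), (x1, y0), (x1, y1), (x0, y1)]

# Noisy corner fixes recorded while walking the boundary definition unit.
preliminary = [(0.1, -0.2), (10.3, 0.1), (9.8, 5.2), (-0.1, 4.9)]
print(adjust_to_rectangle(preliminary))
```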

AUTONOMOUS MACHINE NAVIGATION IN VARIOUS LIGHTING ENVIRONMENTS
20230020033 · 2023-01-19

Training an autonomous machine in a work region for navigation in various lighting conditions includes determining a feature detection range based on an environmental lighting parameter, determining a feature detection score for each of one or more positions in the containment zone based on the feature detection range, determining one or more localizable positions in the containment zone based on the corresponding feature detection scores, and updating the navigation map to include a localization region within the containment zone based on the one or more localizable positions. Navigation may use one or more of an uncertainty area, the localization region, and one or more buffer zones, depending on lighting conditions.
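The scoring step can be sketched as: the detection range shrinks as light fades, and a position's score is the number of mapped features within that range. The linear range model, its 2–10 m bounds, and the feature coordinates below are all assumptions, not the patent's actual parameters.

```python
# Assumed lighting-to-range model and feature-count scoring; the patent does
# not specify these formulas.
import math

def detection_range(lighting):
    """lighting in [0, 1]; assumed 2 m in darkness up to 10 m in full light."""
    return 2.0 + 8.0 * lighting

def feature_score(position, features, lighting):
    """Count mapped features visible from a position under given lighting."""
    r = detection_range(lighting)
    return sum(1 for f in features if math.dist(position, f) <= r)

features = [(0, 0), (3, 4), (8, 0)]
print(feature_score((0, 0), features, lighting=0.5))  # range 6 m -> 2 features
```

A position would then count as localizable when its score clears some minimum threshold.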

Control device and work machine

A control device capable of improving the position accuracy of map data is disclosed. The control device is configured to acquire information output from a working machine that includes a working part and works along a boundary between a working area and a non-working area. The control device includes an operating state acquisition part configured to acquire information indicating an operating state of a machine body of the working machine; a judgment part configured to determine the operating state of the machine body based on the information acquired by the operating state acquisition part; a position information acquisition part configured to acquire position information indicating a position of the machine body; and a storage control part configured to store the position information acquired by the position information acquisition part in a storage, based on the determination result of the judgment part.
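The storage-control idea can be sketched as a filter: a boundary position is kept only while the judgment part reports that the machine body is actually working along the boundary, so turning or idle samples do not pollute the map. The state names used here are illustrative.

```python
# Sketch of state-gated position storage; the operating-state labels are
# assumptions, not the patent's actual states.

def store_positions(samples):
    """samples: iterable of (operating_state, position) tuples."""
    stored = []
    for state, position in samples:
        if state == "working":          # judgment result gates storage
            stored.append(position)
    return stored

samples = [("working", (0, 0)), ("turning", (0, 1)), ("working", (0, 2))]
print(store_positions(samples))  # [(0, 0), (0, 2)]
```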

Generating a local mapping of an agricultural field for use in performance of agricultural operation(s)

Implementations are directed to assigning corresponding semantic identifiers to a plurality of rows of an agricultural field, generating a local mapping of the agricultural field that includes the plurality of rows of the agricultural field, and subsequently utilizing the local mapping in performance of one or more agricultural operations. In some implementations, the local mapping can be generated based on overhead vision data that captures at least a portion of the agricultural field. In these implementations, the local mapping can be generated based on GPS data associated with the portion of the agricultural field captured in the overhead vision data. In other implementations, the local mapping can be generated based on driving data generated during an episode of locomotion of a vehicle through the agricultural field. In these implementations, the local mapping can be generated based on GPS data associated with the vehicle traversing through the agricultural field.
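The first step, assigning semantic identifiers to rows, can be sketched by ordering detected row centerlines along one GPS axis and labeling them in that order. The ordering key (easting), the label scheme `row_N`, and the coordinates are assumptions for illustration.

```python
# Assumed labeling scheme for detected field rows; the patent does not
# prescribe this particular ordering or naming.

def label_rows(row_centerlines):
    """row_centerlines: list of (easting, northing) row center points."""
    ordered = sorted(range(len(row_centerlines)),
                     key=lambda i: row_centerlines[i][0])
    return {f"row_{rank + 1}": row_centerlines[i]
            for rank, i in enumerate(ordered)}

rows = [(412.5, 900.0), (408.1, 901.2), (410.3, 899.7)]
print(label_rows(rows))
```

An agricultural operation could then target rows by identifier ("spray row_2") instead of raw coordinates.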

WORKING MAP CONSTRUCTION METHOD AND APPARATUS, ROBOT, AND STORAGE MEDIUM
20230015335 · 2023-01-19

Embodiments of this specification provide a working map construction method and apparatus, a robot, and a storage medium. The method includes: determining a moving path of a robot when the robot moves forward as a first forward moving path; determining, after the robot moves backward, a position of the robot when the robot changes from moving backward to moving forward again as a correction position; determining an auxiliary position on the first forward moving path according to the correction position when the correction position is not on the first forward moving path; and determining a correction path according to the correction position and the auxiliary position, so as to construct a working map of the robot according to the correction path and the first forward moving path.
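The correction step can be sketched as: when the robot resumes forward motion off the first forward moving path, pick the nearest point on that path as the auxiliary position and join the two into a correction path. Searching only the path's vertices (rather than projecting onto its segments) is a simplifying assumption here.

```python
# Sketch of the correction-position / auxiliary-position step; nearest-vertex
# search is a simplification of whatever projection the patent intends.
import math

def auxiliary_position(correction_pos, forward_path):
    """Nearest recorded point on the first forward moving path."""
    return min(forward_path, key=lambda p: math.dist(p, correction_pos))

def correction_path(correction_pos, forward_path):
    """Path joining the correction position back to the forward path."""
    if correction_pos in forward_path:
        return []                        # already on the path, nothing to correct
    return [correction_pos, auxiliary_position(correction_pos, forward_path)]

forward = [(0, 0), (1, 0), (2, 0), (3, 0)]
print(correction_path((2.2, 1.0), forward))  # [(2.2, 1.0), (2, 0)]
```

The working map is then built from the first forward moving path together with this correction path.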