Patent classifications
G05D1/0272
Localization method and system for mobile remote inspection and/or manipulation tools in confined spaces
A localization method and system for mobile remote inspection and/or manipulation tools in confined spaces are provided. The system comprises a mobile remote inspection and/or manipulation device including a carrier movable within the confined space and an inspection and/or manipulation tool, such as an inspection camera; pose sensors arranged on the movable carrier for providing signals indicative of the position and orientation of the movable carrier; and distance sensors arranged on the movable carrier for providing signals indicative of the distance to interior surfaces of the confined space. The localization method makes use of probabilistic sensor fusion of the measurement data provided by the pose sensors and the distance sensors in order to precisely determine the actual pose of the movable carrier and localize data generated by the inspection and/or manipulation tool.
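The probabilistic fusion of pose-sensor and distance-sensor data can be illustrated with a minimal one-dimensional Kalman-style update. This is a sketch of the general technique, not the patented implementation; the pipe diameter, sensor variances, and function names are illustrative assumptions.

```python
# Minimal 1-D probabilistic fusion sketch (assumed Kalman-style update):
# combine a pose-sensor estimate of lateral position inside a pipe of
# known diameter with a distance-sensor reading to the interior wall.

PIPE_DIAMETER = 1.0  # metres (assumed confined-space geometry)

def fuse(pose_est, pose_var, wall_dist, dist_var):
    """Variance-weighted fusion of a pose estimate and a wall range.

    The distance sensor measures the gap to the wall, so the lateral
    position implied by the range reading is PIPE_DIAMETER - wall_dist.
    """
    z = PIPE_DIAMETER - wall_dist          # position implied by the range
    k = pose_var / (pose_var + dist_var)   # Kalman gain (scalar case)
    fused = pose_est + k * (z - pose_est)  # correction toward measurement
    fused_var = (1.0 - k) * pose_var       # uncertainty shrinks after fusion
    return fused, fused_var

pos, var = fuse(pose_est=0.40, pose_var=0.04, wall_dist=0.55, dist_var=0.01)
```

Because the distance sensor is the more certain source here, the fused estimate moves most of the way toward the range-implied position while the variance drops below either input alone.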
Unsupervised learning of metric representations from slow features
A method of unsupervised learning of a metric representation, and a corresponding system, determine metric position information for a mobile device from an environmental representation. The mobile device comprises at least one sensor for acquiring sensor data and an odometer system configured to acquire displacement data of the mobile device. An environmental representation is generated based on the acquired sensor data by applying an unsupervised learning algorithm. The mobile device moves along a trajectory, and the displacement data and the sensor data are acquired while the mobile device is moving along the trajectory. A set of mapping parameters is calculated based on the environmental representation and the displacement data. A metric position estimation is determined based on a further environmental representation and the calculated set of mapping parameters.
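The calculation of mapping parameters from the environmental representation and the displacement data can be sketched as a regression problem. The following is an assumed minimal instance: a one-dimensional learned feature, odometry increments integrated into metric positions, and a linear map fitted by least squares. The actual patent does not specify this particular form.

```python
# Sketch (assumed linear mapping): learn parameters that map a learned
# environmental representation (here a 1-D feature value) to a metric
# position accumulated from odometry displacement data.

def fit_mapping(features, displacements):
    """Least-squares fit of position = a * feature + b.

    'displacements' are per-step odometry increments; integrating them
    yields the metric positions used as regression targets.
    """
    positions, s = [], 0.0
    for d in displacements:
        s += d
        positions.append(s)
    n = len(features)
    mx = sum(features) / n
    my = sum(positions) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(features, positions))
    var = sum((x - mx) ** 2 for x in features)
    a = cov / var
    b = my - a * mx
    return a, b

def estimate_position(feature, params):
    """Metric position estimate from a further representation value."""
    a, b = params
    return a * feature + b

# Toy data: feature values growing linearly with distance travelled
params = fit_mapping(features=[0.1, 0.2, 0.3, 0.4],
                     displacements=[1.0, 1.0, 1.0, 1.0])
```

Once the parameters are fitted, `estimate_position` needs only a new representation value, mirroring the claim that a metric position estimation is determined from a further environmental representation and the calculated parameters.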
Method of localization by synchronizing multi sensors and robot implementing same
Disclosed herein are a method of localization by synchronizing multi sensors and a robot implementing the same. The robot according to an embodiment includes a controller that, when a first sensor acquires first type information, generates first type odometry information using the first type information, that, at a time point when the first type odometry information is generated, acquires second type information by controlling a second sensor and then generates second type odometry information using the second type information, and that localizes the robot by combining the first type odometry information and the second type odometry information.
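The synchronization idea above can be sketched as follows: the second sensor is sampled at the time point of the first sensor's odometry, and the two estimates are combined. The weighted-average combination and the sensor callables are illustrative assumptions, not the patent's actual fusion rule.

```python
# Sketch (assumed implementation): trigger the second sensor at the time
# point of the first odometry estimate, then combine the two estimates
# with a simple weighted average over (x, y, heading).

def combine_odometry(first_odom, second_odom, w_first=0.5):
    """Weighted combination of two (x, y, heading) odometry estimates."""
    w2 = 1.0 - w_first
    return tuple(w_first * a + w2 * b for a, b in zip(first_odom, second_odom))

def localize(first_sensor, second_sensor, t):
    """Sample the second sensor at the first sensor's time point t."""
    first = first_sensor(t)    # e.g. LiDAR-based odometry at time t
    second = second_sensor(t)  # second sensor controlled to sample at t
    return combine_odometry(first, second)

# Toy sensors returning fixed (x, y, heading) estimates
pose = localize(lambda t: (1.0, 2.0, 0.1), lambda t: (1.2, 1.8, 0.1), t=0.0)
```

Sampling both sources at the same time point avoids interpolating between misaligned timestamps, which is the core of the synchronization claim.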
Mobile robot system and method for generating map data using straight lines extracted from visual images
A mobile robot is configured to navigate on a sidewalk and deliver a delivery to a predetermined location. The robot has a body and an enclosed space within the body for storing the delivery during transit. At least two cameras are mounted on the robot body and are adapted to take visual images of an operating area. A processing component is adapted to extract straight lines from the visual images taken by the cameras and generate map data based at least partially on the images. A communication component is adapted to send and receive image and/or map data. A mapping system includes at least two such mobile robots, with the communication component of each robot adapted to send and receive image data and/or map data to the other robot. A method involves operating such a mobile robot in an area of interest in which deliveries are to be made.
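Straight-line extraction from camera images is commonly done with a Hough-style vote; the sketch below applies a coarse Hough transform to already-extracted edge points. This is an illustrative stand-in for the patent's processing component, and the parameter choices are assumptions.

```python
# Illustrative sketch (not the patented pipeline): a coarse Hough-style
# vote over line angles to find the dominant straight line among edge
# points taken from a camera image.

import math

def dominant_line(points, n_angles=180, rho_step=1.0):
    """Return (theta, rho) of the line with the most supporting points.

    A line is parameterized as rho = x*cos(theta) + y*sin(theta); each
    point votes for every (theta, rho) bin it is consistent with.
    """
    votes = {}
    for theta_idx in range(n_angles):
        theta = math.pi * theta_idx / n_angles
        for x, y in points:
            rho = x * math.cos(theta) + y * math.sin(theta)
            key = (theta_idx, round(rho / rho_step))
            votes[key] = votes.get(key, 0) + 1
    (theta_idx, rho_bin), _ = max(votes.items(), key=lambda kv: kv[1])
    return math.pi * theta_idx / n_angles, rho_bin * rho_step

# Edge points on the vertical line x = 5 (e.g. a building edge)
theta, rho = dominant_line([(5, y) for y in range(100)])
```

The extracted (theta, rho) pairs are compact line features; map data can then be built from such features rather than from raw pixels, which keeps the data exchanged between robots small.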
GRASS-CUTTING ROBOT AND CONTROL METHOD THEREFOR
Disclosed in the present invention are a grass-cutting robot and a control method therefor. The grass-cutting robot comprises a travelling apparatus, a motive power apparatus, a detection apparatus and a control apparatus. The travelling apparatus is configured to facilitate travel of the grass-cutting robot on a physical surface in a first direction. The motive power apparatus is configured to drive the travelling apparatus. The detection apparatus is configured to detect an attitude of the grass-cutting robot. The control apparatus is configured to apply a control signal to the grass-cutting robot when the attitude meets a predetermined condition, the control signal causing resistance to arise in the travelling apparatus, and the resistance causing a tendency of at least part of the travelling apparatus to move in the first direction to be hindered. Further disclosed in the present invention is a control method for a grass-cutting robot. The grass-cutting robot and control method therefor according to one or more embodiments of the present invention can improve the precision of grass-cutting robot control, and increase work effectiveness and safety.
Ground Treatment Appliance
An autonomous ground treatment appliance, in particular a robotic lawnmower, includes a housing, a running gear, a control unit, at least one wheel unit, and a sensor unit. The control unit is configured to control the autonomous ground treatment appliance. The at least one wheel unit is mounted on the housing so as to be at least partially movable relative to the housing. The sensor unit is configured to ascertain a position of the wheel unit relative to the housing.
AUTONOMOUS MACHINE NAVIGATION USING REFLECTIONS FROM SUBSURFACE OBJECTS
Autonomous machine navigation involves determining a current pose of an autonomous machine based on non-vision-based pose data captured by one or more non-vision-based sensors of the autonomous machine. The pose represents one or both of a position and an orientation of the autonomous machine in a work region defined by one or more boundaries. Pose data is determined based on a return signal received in response to a wireless signal transmitted to a surface or subsurface object that passively provides the return signal. The return signal is identifiable with the object. The current pose is updated based on the pose data to correct or localize the current pose and to provide an updated pose of the autonomous machine in the work region.
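Updating the current pose from return signals can be sketched as a range-based correction against markers at known map locations. The iterative scheme below is an assumed illustration (a relaxed Gauss-Newton style nudge), not the patent's actual localization method.

```python
# Sketch (assumed correction scheme): refine a dead-reckoned 2-D position
# using measured ranges to subsurface markers at known map locations.

import math

def correct_pose(est, markers, ranges, iters=20, step=0.5):
    """Nudge (x, y) until distances to known markers match the ranges."""
    x, y = est
    for _ in range(iters):
        for (mx, my), r in zip(markers, ranges):
            d = math.hypot(x - mx, y - my)
            if d == 0:
                continue
            err = d - r                     # range residual to this marker
            x -= step * err * (x - mx) / d  # move along the range gradient
            y -= step * err * (y - my) / d
    return x, y

markers = [(0.0, 0.0), (10.0, 0.0)]           # known buried-marker positions
ranges = [5.0, math.hypot(3.0 - 10.0, 4.0)]   # ranges from true pose (3, 4)
x, y = correct_pose((2.0, 5.0), markers, ranges)
```

With two or more markers the ranges constrain the position uniquely near the dead-reckoned estimate, so the correction converges to the true pose and bounds odometry drift.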
Cleaning machine and path planning method of the cleaning machine
A cleaning machine and a path planning method of the cleaning machine are provided. According to one embodiment of the invention, a cleaning machine for cleaning a surface is provided. The cleaning machine includes a sensing module and a control system. The sensing module senses an environment of the cleaning machine to obtain map data. The control system divides the map data into multiple blocks, and controls the cleaning machine to perform a first cleaning process and a second cleaning process in a current block of the blocks, and then controls the cleaning machine to move to a next block of the blocks.
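The block-wise scheduling described above can be sketched as tiling the map into fixed-size blocks and queueing both cleaning processes in the current block before moving on. Block size and pass names are illustrative assumptions.

```python
# Sketch (assumed block scheme): split a grid map into fixed-size blocks
# and schedule two cleaning passes per block, block by block.

def split_into_blocks(width, height, block):
    """Return (x0, y0, x1, y1) blocks tiling a width x height map."""
    blocks = []
    for y in range(0, height, block):
        for x in range(0, width, block):
            blocks.append((x, y, min(x + block, width), min(y + block, height)))
    return blocks

def plan(width, height, block):
    """Ordered cleaning schedule: both passes finish before moving on."""
    schedule = []
    for b in split_into_blocks(width, height, block):
        schedule.append(("first_pass", b))   # e.g. coverage sweep
        schedule.append(("second_pass", b))  # e.g. edge/spot cleaning
    return schedule

steps = plan(width=10, height=6, block=5)
```

Finishing both processes in a block before advancing keeps travel between distant regions to one transition per block, rather than revisiting every block twice.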
InCycle planner checkout for autonomous vehicles
Process for clearing an autonomous machine including first evaluating operation at a high-curvature offline location. Following acceptable operation, the machine is placed into service and evaluated at a worksite. Following acceptable worksite operation, online operating speed of the machine is increased incrementally, and performance reevaluated. Following acceptable performance characteristics, online operating speed of the machine continues to be increased and reevaluated until the machine reaches maximum designated operating speed, or is evaluated as unacceptable, in which case the machine continues to operate at the last acceptable online operating speed and identifies the unacceptable performance characteristic for further evaluation.
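The incremental checkout described above amounts to a ratcheting evaluation loop. The sketch below is an assumed rendering of that logic; the evaluation callable, step size, and return convention are illustrative, not from the patent.

```python
# Sketch of the incremental checkout loop (assumed logic): raise the
# online operating speed step by step, re-evaluating after each increase,
# and fall back to the last acceptable speed when a trial speed fails.

def checkout(evaluate, start, step, max_speed):
    """Return (operating_speed, failed_speed) after incremental checkout.

    'evaluate' returns True when performance at the given speed is
    acceptable. failed_speed is None if the maximum speed is reached.
    """
    speed = start
    if not evaluate(speed):
        return None, speed         # unacceptable even at the initial speed
    while speed < max_speed:
        trial = min(speed + step, max_speed)
        if evaluate(trial):
            speed = trial          # acceptable: keep the higher speed
        else:
            return speed, trial    # keep last good speed, flag the failure
    return speed, None             # reached maximum designated speed

# Toy evaluator: performance acceptable only up to 7.0 units of speed
speed, failed_at = checkout(lambda s: s <= 7.0, start=2.0, step=2.0,
                            max_speed=10.0)
```

The machine ends up operating at the last acceptable speed while the flagged trial speed identifies the performance characteristic needing further evaluation.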
HIGH-DEFINITION MAPPING
A method may include obtaining sensor data about a total measurable world around an autonomous vehicle. The sensor data may be captured by sensor units co-located with the autonomous vehicle. The method may include generating a mapping dataset including the obtained sensor data and identifying data elements that each represents a point in the mapping dataset. The method may include sorting the data elements according to a structural data categorization that is a template for a high-definition map of the total measurable world and determining a mapping trajectory of the autonomous vehicle. The mapping trajectory may describe a localization and a path of motion of the autonomous vehicle. The method may include generating the high-definition map based on the structural data categorization and relative to the mapping trajectory of the autonomous vehicle, and the high-definition map may be updated based on the path of motion of the autonomous vehicle.
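Sorting data elements according to a structural data categorization can be illustrated with a voxel-grid template: each sensed point is keyed to the map cell it falls in. The grid form and cell size are assumptions for illustration; the patent's categorization is not specified here.

```python
# Sketch (assumed structure): sort sensed data elements into a voxel-grid
# template standing in for the "structural data categorization" of the
# high-definition map.

def voxel_key(point, cell=1.0):
    """Map a 3-D point to its voxel cell index (floor division per axis)."""
    return tuple(int(c // cell) for c in point)

def build_map(points, cell=1.0):
    """Sort data elements into voxel cells keyed by cell index."""
    grid = {}
    for p in points:
        grid.setdefault(voxel_key(p, cell), []).append(p)
    return grid

# Toy sensor points around the vehicle; two fall in the same cell
hd_map = build_map([(0.2, 0.3, 0.0), (0.8, 0.1, 0.0), (1.5, 0.2, 0.0)])
```

Keying elements by cell lets the map be updated incrementally along the mapping trajectory: new sensor data touches only the cells the vehicle's path of motion passes near.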