System of robotic cleaning devices
11712142 · 2023-08-01
Assignee
Inventors
CPC classification
G05D1/0225
PHYSICS
Y10S901/01
GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
A47L9/2894
HUMAN NECESSITIES
A47L9/2852
HUMAN NECESSITIES
A47L2201/04
HUMAN NECESSITIES
International classification
A47L9/28
HUMAN NECESSITIES
G05D1/00
PHYSICS
Abstract
A system of robotic cleaning devices and a method of a master robotic cleaning device of controlling at least one slave robotic cleaning device. The method performed by a master robotic cleaning device of controlling at least one slave robotic cleaning device includes detecting obstacles, deriving positional data from the detection of obstacles, positioning the master robotic cleaning device with respect to the detected obstacles from the derived positional data, controlling movement of the master robotic cleaning device based on the positional data, and submitting commands to the at least one slave robotic cleaning device to control a cleaning operation of said at least one slave robotic cleaning device, the commands being based on the derived positional data, wherein the cleaning operation of the slave robotic cleaning device is controlled as indicated by the submitted commands.
Claims
1. A method performed by a master robotic cleaning device of controlling at least one slave robotic cleaning device, the method comprising: detecting obstacles; deriving positional data from the detection of obstacles; positioning the master robotic cleaning device with respect to the detected obstacles from the derived positional data; controlling movement of the master robotic cleaning device based on the positional data; and submitting commands to the at least one slave robotic cleaning device to control a cleaning operation of said at least one slave robotic cleaning device, the commands being based on the derived positional data, wherein the cleaning operation of the slave robotic cleaning device is controlled as indicated by the submitted commands; wherein the commands submitted by the master robotic cleaning device comprise an instruction to the at least one slave robotic cleaning device to remove debris from a surface to be cleaned and gather the debris for subsequent pick-up by the master robotic cleaning device.
2. The method of claim 1, wherein the commands submitted by the master robotic cleaning device comprise data indicating a surface over which the at least one slave robotic cleaning device is instructed to move.
3. The method of claim 1, wherein the commands submitted by the master robotic cleaning device comprise data indicating a time at which the at least one slave robotic cleaning device is instructed to perform the cleaning operation.
4. The method of claim 1, wherein the commands submitted by the master robotic cleaning device comprise an instruction to the at least one slave robotic cleaning device to return to its charger after the cleaning operation has been performed.
5. A computer product comprising a non-transitory computer readable medium, the non-transitory computer readable medium comprising a computer program comprising computer-executable instructions for causing a device to perform the steps recited in claim 1 when the computer-executable instructions are executed on a processing unit included in the device.
6. The method of claim 1, further comprising controlling the master robotic cleaning device to pick up the gathered debris.
7. The method of claim 1, wherein the master robotic cleaning device comprises a suction fan and suction fan motor configured to operate to push the gathered debris into the master robotic cleaning device.
8. The method of claim 7, wherein the slave robotic cleaning device comprises a brush configured to push the debris in front of the slave robotic cleaning device.
9. The method of claim 1, wherein the slave robotic cleaning device comprises a brush configured to push the debris in front of the slave robotic cleaning device.
10. The method of claim 1, wherein the slave robotic cleaning device does not have a dust container.
11. The method of claim 1, wherein the slave robotic cleaning device does not have a vacuum fan.
12. The method of claim 1, wherein the surface to be cleaned is under at least one of the detected obstacles.
13. The method of claim 1, wherein the master robotic cleaning device has a first height, and the slave robotic cleaning device has a second height, wherein the second height is less than the first height.
14. The method of claim 13, wherein the surface to be cleaned is under at least one of the detected obstacles and a bottom surface of the at least one of the detected obstacles has a height between the first height and the second height.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
(1) The invention is now described, by way of example, with reference to the accompanying drawings.
DETAILED DESCRIPTION
(12) The invention will now be described more fully hereinafter with reference to the accompanying drawings, in which certain embodiments of the invention are shown. This invention may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein; rather, these embodiments are provided by way of example so that this disclosure will be thorough and complete, and will fully convey the scope of the invention to those skilled in the art. Like numbers refer to like elements throughout the description.
(13) The invention relates to robotic cleaning devices, or in other words, to automatic, self-propelled machines for cleaning a surface, e.g. a robotic vacuum cleaner, a robotic sweeper or a robotic floor washer. The robotic cleaning device according to the invention can be mains-operated and have a cord, be battery-operated or use any other kind of suitable energy source, for example solar energy.
(15) A controller 22 such as a microprocessor controls the wheel motors 15, 16 to rotate the driving wheels 12, 13 as required in view of information received from an obstacle detecting device (not shown), such that the robotic cleaning device 10 can be moved across the surface to be cleaned.
(16) Further, the main body 11 may optionally be arranged with a cleaning member 17 for removing debris and dust from the surface to be cleaned in the form of a rotatable brush roll arranged in an opening 18 at the bottom of the robotic cleaner 10. Thus, the rotatable brush roll 17 is arranged along a horizontal axis in the opening 18 to enhance the dust and debris collecting properties of the cleaning device 10. In order to rotate the brush roll 17, a brush roll motor 19 is operatively coupled to the brush roll to control its rotation in line with instructions received from the controller 22.
(17) Moreover, the main body 11 of the robotic cleaner 10 comprises a suction fan 20 creating an air flow for transporting debris to a dust bag or cyclone arrangement (not shown) housed in the main body via the opening 18 in the bottom side of the main body 11. The suction fan 20 is driven by a fan motor 21 communicatively connected to the controller 22 from which the fan motor 21 receives instructions for controlling the suction fan 20. It should be noted that a robotic cleaning device having either one of the rotatable brush roll 17 and the suction fan 20 for transporting debris to the dust bag can be envisaged. A combination of the two will however enhance the debris-removing capabilities of the robotic cleaning device 10.
(18) The robotic cleaning device 10 may further be equipped with an inertial measurement unit (IMU) 24, such as a gyroscope and/or an accelerometer and/or a magnetometer, or any other appropriate device for measuring displacement of the robotic cleaning device 10 with respect to a reference position, in the form of e.g. orientation, rotational velocity, gravitational forces, etc. A three-axis gyroscope is capable of measuring rotational velocity in a roll, pitch and yaw movement of the robotic cleaning device 10. A three-axis accelerometer is capable of measuring acceleration in all directions, which is mainly used to determine whether the robotic cleaning device is bumped or lifted, or if it is stuck (i.e. not moving even though the wheels are turning). The robotic cleaning device 10 further comprises encoders (not shown) on the driving wheels, from which the distance covered by the robotic cleaning device 10 can be derived.
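The "stuck" condition described above (wheels turning while the body does not move) can be illustrated with a minimal sketch that compares wheel-encoder odometry against IMU readings. All names and thresholds here are illustrative assumptions, not taken from the patent.

```python
# Minimal sketch: declare the robot "stuck" when the encoders report wheel
# motion that the accelerometer cannot confirm. Thresholds are illustrative.

def is_stuck(encoder_distance_m: float,
             imu_accel_magnitude: float,
             interval_s: float,
             min_wheel_speed: float = 0.05,    # m/s, assumed threshold
             min_accel: float = 0.02) -> bool:  # m/s^2, assumed noise floor
    """Return True when wheel odometry claims movement the IMU cannot confirm."""
    wheel_speed = encoder_distance_m / interval_s
    wheels_turning = wheel_speed > min_wheel_speed
    body_moving = imu_accel_magnitude > min_accel
    return wheels_turning and not body_moving
```

A real controller would low-pass filter both signals over a window rather than test a single interval, but the comparison is the same.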
(19) The main body 11 may further be arranged with a rotating side brush 14 adjacent to the opening 18, the rotation of which could be controlled by the drive motors 15, 16, the brush roll motor 19, or alternatively a separate side brush motor (not shown). Advantageously, the rotating side brush 14 sweeps debris and dust from the surface to be cleaned such that the debris ends up under the main body 11 at the opening 18 and thus can be transported to a dust chamber of the robotic cleaning device. A further advantage is that the reach of the robotic cleaning device 10 is improved, such that e.g. corners and areas where a floor meets a wall are much more effectively cleaned.
(20) The robotic cleaning device 10 further comprises a communication interface 29 comprising a wireless receiver and transmitter, typically embodied by a single unit known as a transceiver. The communication interface 29 communicates via e.g. infrared (IR), ultrasonic or radio-frequency (RF) signals with for instance a remote control utilizing line-of-sight communication or a server using wireless local area network (WLAN) technology.
(21) The communication interface may further be connected to a user interface (not shown) provided on the robotic cleaning device 10, via which a user can provide the robotic cleaner 10 with a particular type of instruction, such as “start”, “stop”, “return to charging station”, etc. The user interface may be of touch-screen type, or be mechanically configured with physical buttons to be operated. Further, the user interface may comprise display means for visually indicating a user selection. It should be noted that the user does not necessarily need to provide input to the user interface by physically touching it, but may alternatively communicate with the robotic cleaner by means of the previously mentioned remote control.
(24) The first and second line lasers 27, 28 are typically arranged on a respective side of the camera 23 along an axis being perpendicular to an optical axis of the camera. Further, the line lasers 27, 28 are directed such that their respective laser beams intersect within the field of view of the camera 23. Typically, the intersection coincides with the optical axis of the camera 23.
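The geometry described above, a line laser mounted at a lateral offset from the camera and tilted so that its beam crosses the camera's optical axis, permits range measurement by triangulation: the image position of the laser line encodes the distance to the illuminated obstacle. The following is a minimal sketch of that relation under a pinhole-camera model; the function name, units and parameter values are assumptions for illustration, not from the patent.

```python
# Triangulation sketch for a side-mounted, inward-tilted line laser.
# In the camera frame the beam follows x(z) = baseline * (1 - z / crossing),
# where `crossing` is the distance at which the beam meets the optical axis.
# The projected image coordinate is u = f * x / z = f * baseline * (1/z - 1/crossing),
# which we invert to recover the range z from the observed pixel offset u.

def range_from_pixel(u_px: float, focal_px: float,
                     baseline_m: float, crossing_m: float) -> float:
    """Distance to the surface lit by the laser line, from its image offset."""
    return 1.0 / (u_px / (focal_px * baseline_m) + 1.0 / crossing_m)
```

Note that an object exactly at the crossing distance produces zero pixel offset, and nearer objects produce larger offsets, which is what makes the arrangement usable for obstacle ranging.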
(25) The first and second line lasers 27, 28 are configured to scan, preferably in a vertical orientation, the vicinity of the robotic cleaning device 10, normally in the direction of movement of the robotic cleaning device 10. The first and second line lasers 27, 28 are configured to send out laser beams, which illuminate furniture, walls and other objects of e.g. a room to be cleaned. The camera 23 is controlled by the controller 22 to capture and record images from which the controller 22 creates a representation or layout of the surroundings that the robotic cleaning device 10 is operating in, by extracting features from the images and by measuring the distance covered by the robotic cleaning device 10 while it is moving across the surface to be cleaned. Thus, the controller 22 derives positional data of the robotic cleaning device 10 with respect to the surface to be cleaned from the recorded images, generates a 3D representation of the surroundings from the derived positional data, and controls the driving motors 15, 16 to move the robotic cleaning device across the surface to be cleaned in accordance with the generated 3D representation and navigation information supplied to the robotic cleaning device 10. The surface to be cleaned can thus be navigated with the generated 3D representation taken into account. Since the derived positional data serves as a foundation for the navigation of the robotic cleaning device, it is important that the positioning is correct; the robotic device will otherwise navigate according to a “map” of its surroundings that is misleading.
(26) The 3D representation generated from the images recorded by the 3D sensor system thus facilitates detection of obstacles in the form of walls, floor lamps and table legs, around which the robotic cleaning device must navigate, as well as rugs, carpets, doorsteps, etc., that the robotic cleaning device 10 must traverse. The robotic cleaning device 10 is hence configured to learn about its environment or surroundings by operating/cleaning.
(27) Hence, the 3D sensor system comprising the camera 23 and the first and second vertical line lasers 27, 28 is arranged to record images of a vicinity of the robotic cleaning device 10, from which objects/obstacles may be detected. The controller 22 is capable of positioning the robotic cleaning device 10 with respect to the detected obstacles, and hence a surface to be cleaned, by deriving positional data from the recorded images. From the positioning, the controller 22 controls movement of the robotic cleaning device 10 across the surface to be cleaned by means of controlling the wheels 12, 13 via the wheel drive motors 15, 16.
(28) The derived positional data facilitates control of the movement of the robotic cleaning device 10 such that the cleaning device can be navigated to move very close to an object, and to move closely around the object to remove debris from the surface on which the object is located. Hence, the derived positional data is utilized to move flush against the object, the object being e.g. a thick rug or a wall. Typically, the controller 22 continuously generates and transfers control signals to the drive wheels 12, 13 via the drive motors 15, 16 such that the robotic cleaning device 10 is navigated close to the object.
(32) Thus, for objects having a clearance height less than the height h₁ of the master robot 10, the master robot cannot pass under the object to perform a cleaning operation.
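The height-based division of labour between the two robots (claims 13 and 14) can be sketched as a simple decision on the clearance height under an obstacle: surfaces whose clearance lies between the slave's height and the master's height are assigned to the low-profile slave. Heights in metres and the function name are illustrative assumptions.

```python
# Sketch of clearance-based task assignment: the master cleans open surfaces
# and high-clearance areas, the lower slave cleans under obstacles the master
# cannot pass beneath, and too-low gaps are unreachable by either robot.

def assign_cleaner(clearance_m: float,
                   master_height_m: float,
                   slave_height_m: float) -> str:
    assert slave_height_m < master_height_m  # claim 13: second height < first height
    if clearance_m >= master_height_m:
        return "master"
    if clearance_m >= slave_height_m:
        return "slave"
    return "unreachable"
```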
(34) As in the case of the master robot 10, the slave robot 30 comprises a controller configured to control a propulsion system comprising driving means in the form of e.g. two electric wheel motors 33, 34 for enabling movement of the driving wheels 35, 36 (or any other appropriate movement means) such that the slave cleaning device 30 can be moved over a surface to be cleaned, such as the surfaces under the furniture which the master robot 10 cannot reach. In this particular example, the slave robot 30 further comprises a cleaning member in the form of a rotatable brush roll 37 for more effectively removing debris from the surface to be cleaned.
(35) With the system of cleaning robots according to embodiments of the present invention, the slave robot 30 is advantageously not required to be equipped with the same sophisticated obstacle detecting device as the master robot 10, i.e. the 3D sensor system comprising the camera 23 and the first and second line lasers 27, 28, which makes the slave robot 30 considerably less complex.
(38) Thus, the master robot 10 emits light by means of its laser light sources 27, 28 onto the slave robot 30, and the camera 23 records images of a vicinity of the master robotic cleaning device 10 from which the slave robot 30 may be detected. Thereafter, the master robot 10 derives positional data of the detected objects from the recorded images, and positions itself in relation to the objects, including the slave robot 30.
(40) Upon detecting the laser light emitted by the master robot 10, the slave robot 30 communicates via its communication interface to the master robot 10 that the laser light has been detected. As previously discussed, the interface may communicate via e.g. IR, ultrasonic or RF signals (possibly utilizing WLAN technology).
(41) In this way, the master robot 10 is capable of detecting, and positioning itself in relation to, the slave robot 30, using for instance simultaneous localization and mapping (SLAM).
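The detection-and-registration step above can be sketched as follows: the master sweeps its line lasers, the slave's optical detector 39 reports a hit over the wireless link, and the master then labels one of the objects seen in its camera images as the slave. The data structure, the matching rule (nearest lit object) and all names are illustrative assumptions; a real system would match on timestamps and bearing rather than range alone.

```python
# Hypothetical sketch of registering the slave as a landmark in the master's
# map once the slave confirms (over the wireless link) that it was hit by
# the master's laser line.

from dataclasses import dataclass
from typing import Optional

@dataclass
class Detection:
    bearing_deg: float   # direction in which the laser struck the object
    range_m: float       # triangulated distance to the lit object

def register_slave(detections: list[Detection],
                   slave_reported_hit: bool) -> Optional[Detection]:
    """Return the detection to label as the slave, if the slave confirmed a hit.

    Assumes, purely for illustration, that the slave is the nearest lit
    object at the moment it reports being illuminated.
    """
    if not slave_reported_hit or not detections:
        return None
    return min(detections, key=lambda d: d.range_m)
```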
(42) A further advantage is that, by having the slave robot 30 detect light emitted by the line lasers 27, 28 of the master robot 10 and instantly communicate that the light has been detected at the optical detector 39, the operational clocks of the master and the slave, respectively, can be synchronized with each other. Hence, any clock drift may be eliminated, which facilitates system navigation.
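Because the laser-detection instant is a single physical event observed on both clocks (the master knows when it emitted, the slave reports when it detected, and light propagation delay is negligible at room scale), the offset between the two clocks can be estimated and re-estimated to cancel drift. A minimal sketch, with illustrative names:

```python
# Clock-synchronization sketch built on the shared laser-detection event.
# The offset maps slave-clock timestamps into the master's time base; it can
# be recomputed at every detection event to absorb drift between the clocks.

def clock_offset(master_emit_time: float, slave_detect_time: float) -> float:
    """Offset to add to slave-clock timestamps to express them in master time."""
    return master_emit_time - slave_detect_time

def to_master_time(slave_timestamp: float, offset: float) -> float:
    return slave_timestamp + offset
```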
(44) This embodiment is advantageous since there is no need to equip the slave robot 30 with a dust container or a suction fan, thereby facilitating an even less complex (and less noisy) slave robot 30. Further, with this embodiment, the slave robot is advantageously more or less maintenance-free, as there is no need to empty a dust container. A user will only occasionally have to remove debris that is stuck to the cleaning member of the slave robot 30.
(48) In an embodiment of the invention, the commands submitted from the master robotic cleaning device 10 to the slave robotic cleaning device 30 comprise an instruction to the slave robotic cleaning device 30 to return to its charger (not shown) after the cleaning operation has been performed.
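A command carrying the fields recited in claims 1 through 4 (the surface the slave should cover, the time at which to clean, the instruction to gather debris for later pick-up by the master, and the instruction to return to the charger afterwards) could be sketched as a simple message structure. All field names are assumptions for illustration; the patent does not specify a message format.

```python
# Illustrative command message from the master to a slave, bundling the
# command contents recited in the claims into one structure.

from dataclasses import dataclass

@dataclass
class CleaningCommand:
    surface: list[tuple[float, float]]      # polygon (x, y) in the master's map
    start_time: float                       # scheduled cleaning time, epoch seconds
    gather_debris_for_pickup: bool = True   # claim 1: pile debris for the master
    return_to_charger_when_done: bool = True  # claim 4

# Example command covering a rectangular surface under a piece of furniture.
cmd = CleaningCommand(
    surface=[(0.0, 0.0), (1.2, 0.0), (1.2, 0.8), (0.0, 0.8)],
    start_time=1_700_000_000.0,
)
```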
(50) Advantageously, by communicating via a WLAN, the master robot 10 and the slave robot 30 can be located at a great distance from each other, such as on different floors of a building, while still being capable of communicating with each other.
(52) The invention has mainly been described above with reference to a few embodiments. However, as is readily appreciated by a person skilled in the art, other embodiments than the ones disclosed above are equally possible within the scope of the invention, as defined by the appended patent claims.