METHOD FOR AUTONOMOUS PROCESSING OF FLOOR SURFACES

20240057835 · 2024-02-22

    Abstract

    A method for autonomous processing of floor surfaces by a mobile, self-driving device, in particular a floor cleaning device such as a vacuum and/or sweeping and/or mopping robot. The method includes: the mobile, self-driving device carrying out an exploration travel in a proposed floor processing region, detecting obstacles and/or objects located on the floor by a detection device, classifying the obstacles and/or objects as movable or not movable, and transporting the obstacles and/or objects classified as movable to at least one collection point. The collection point is cleaned before obstacles and/or objects classified as movable are transported to the collection point.

    Claims

    1-10. (canceled)

    11. A method for autonomous processing of floor surfaces with the aid of a mobile, self-driving device, the method comprising the following steps: implementing an exploration tour by the mobile, self-driving device in a proposed floor processing area; detecting obstacles and/or objects disposed on the proposed floor processing area by means of a detection facility; classifying the obstacles and/or the objects as movable or non-movable; and transporting the obstacles and/or the objects classified as movable to at least one collection point, the at least one collection point being cleaned before the obstacles and/or the objects classified as movable are transported to the at least one collection point.

    12. The method according to claim 11, which further comprises determining the at least one collection point via the mobile, self-driving device itself or by a user.

    13. The method according to claim 11, which further comprises determining a route from the obstacle and/or the object classified as movable to the at least one collection point by the mobile, self-driving device itself.

    14. The method according to claim 11, wherein a plurality of collection points is defined.

    15. The method according to claim 11, which further comprises cleaning free areas of the proposed floor processing area, before and/or while the obstacles and/or the objects classified as movable are transported to the at least one collection point.

    16. The method according to claim 11, which further comprises transporting the obstacles and/or the objects classified as movable to the at least one collection point.

    17. The method according to claim 11, wherein the at least one collection point lies outside of the proposed floor processing area.

    18. The method according to claim 11, which further comprises storing the obstacles and/or the objects classified as movable in a sorted fashion.

    19. The method according to claim 11, wherein the mobile, self-driving device is a floor cleaning device or a suction and/or a sweeping and/or a mopping robot.

    20. The method according to claim 11, which further comprises transporting the obstacles and/or the objects classified as movable to the at least one collection point before the proposed floor processing area is cleaned.

    21. A mobile, self-driving device, comprising: a detector for detecting obstacles and/or objects disposed on a floor; a classification facility for classifying the obstacles and/or the objects as movable or non-movable; and a transporter for transporting the obstacles and/or the objects classified as movable to at least one collection point, wherein the at least one collection point is cleaned before transporting the obstacles and/or the objects classified as movable to the at least one collection point.

    22. The mobile, self-driving device according to claim 21, wherein said transporter has a gripping facility, a clamping facility, a wall facility and/or a scooper facility.

    23. The mobile, self-driving device according to claim 21, wherein the mobile, self-driving device is a floor cleaning device for autonomous processing of floor areas.

    24. The mobile, self-driving device according to claim 21, wherein the floor cleaning device for autonomous processing of the floor areas is a suction and/or a sweeping and/or a mopping robot.

    Description

    [0045] The invention is explained in more detail below using exemplary embodiments, which merely show examples. The drawings show:

    [0046] FIG. 1: a schematic view of an exemplary embodiment of a surroundings map before implementing the inventive method for automatically processing floor surfaces with the aid of a mobile, self-driving device,

    [0047] FIG. 2A: a schematic view of an exemplary embodiment of a surroundings map of a method step for implementing the inventive method for automatically processing floor surfaces with the aid of a mobile, self-driving device,

    [0048] FIG. 2B: a schematic view of an exemplary embodiment of a surroundings map of a further method step for implementing the inventive method for automatically processing floor surfaces with the aid of a mobile, self-driving device.

    [0049] FIG. 1 shows a surroundings map 10 of an apartment to be cleaned, in particular of a floor processing area 1 to be cleaned. This surroundings map is shown in an app, in particular a cleaning app, of a portable additional device, in particular a mobile device. Interaction between a user and a mobile, self-driving device, in particular a suction robot, is thus possible.

    [0050] The surroundings map 10 is produced by the mobile, self-driving device. To this end, the suction robot carries out an exploration tour, during which it travels through and captures its surroundings and forwards these to the user in the form of the surroundings map in the app of the mobile device. During the exploration tour, any obstacles, items and objects 3 located on the floor, such as socks, toys, shoes, rugs, walls, door thresholds etc., are detected by means of a detection facility and shown to the user at the correct location in the surroundings map. In order to detect the obstacles, items and objects, the suction robot comprises a suitable sensor system, such as, for instance, a 3D camera, a 3D LIDAR, a PMD sensor and/or a (stereo) camera.

    [0051] In addition to detecting the obstacles, items and objects 3 located on the floor, these are additionally classified by the suction robot as movable obstacles and non-movable obstacles. Identifying the exact type of each object is not necessarily required here. The suction robot identifies in particular which obstacles must remain in their position and subsequently be driven around, such as for instance socket strips, cables, lamp stands, chair legs, pet excrement and the like, and which obstacles can be moved by the suction robot and transported away, such as for instance toys, socks, shoes and other items lying around.
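
    Since full object identification is not required, the distinction can be reduced to a label-based rule. The following is a minimal sketch; the label sets merely mirror the examples named above, and treating unknown objects as non-movable is an assumed safe default, not part of the claimed method:

```python
# Illustrative sketch: classify a detected object as movable or
# non-movable from a recognized label. The label sets only mirror the
# examples in the text; a real system would use a trained classifier.

MOVABLE_LABELS = {"toy", "sock", "shoe"}
NON_MOVABLE_LABELS = {"socket_strip", "cable", "lamp_stand",
                      "chair_leg", "pet_excrement"}

def classify(label: str) -> str:
    """Return 'movable' or 'non-movable' for a detected label."""
    if label in MOVABLE_LABELS:
        return "movable"
    # Unknown objects are treated as non-movable and driven around,
    # which is the safe default (assumption for this sketch).
    return "non-movable"
```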

    [0052] After classifying the obstacles, the obstacles and/or objects classified as movable are transported to a collection point 2. The collection point 2, to which the suction robot is to transport the loose, movable obstacles lying around, can for instance be predefined by the user in the surroundings map 10. Alternatively, the suction robot can itself establish the area which it selects as the collection point 2. In doing so, the location of the area, its rapid accessibility and other properties, such as for instance travel paths for the user and tripping hazards, are taken into account. This proposal is preferably shown to the user in the surroundings map 10, so that if necessary the user can change and/or adjust the collection point 2.

    [0053] Moreover, it is possible to define a plurality of collection points (not shown), to which the suction robot can bring the obstacles. In particular, the suction robot approaches the collection point which is closest to it. Alternatively or in addition, the suction robot only uses a first collection point until it is full and a second collection point is required, or until using a second collection point brings noticeable advantages in working speed.
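
    The "closest collection point that is not yet full" policy described above can be sketched as follows; the coordinate and capacity model is an illustrative assumption, not taken from the application:

```python
import math

def nearest_open_collection_point(robot_pos, points):
    """Pick the closest collection point that still has capacity.

    `points` is a list of dicts with 'pos' (x, y), 'used' and
    'capacity'; this data model is assumed for illustration only.
    """
    open_points = [p for p in points if p["used"] < p["capacity"]]
    if not open_points:
        return None  # every collection point is full
    # Approach the collection point closest to the robot's position.
    return min(open_points, key=lambda p: math.dist(robot_pos, p["pos"]))
```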

    [0054] The collection point can lie outside of the floor area to be cleaned, for instance in an office when the living room is to be cleaned. This can ensure that the floor area to be cleaned is cleaned as thoroughly as possible.

    [0055] FIG. 2A shows a surroundings map 10, in which a plurality of obstacles, items and objects 3 are specified. Moreover, movable obstacles 4 and non-movable obstacles 5 are shown, which have been detected by the suction robot during the exploration tour and classified.

    [0056] Before the suction robot transports the obstacles classified as movable to the collection point, it cleans the collection point, in order to ensure that the collection point is also cleaned; this is no longer possible once the items have been transported there. The movable obstacles 4 are then preferably collected and taken away while the cleaning program is carried out. In this process the suction robot continuously scans the floor area in front of it, identifies what the obstacles are even before physical contact, collects obstacles 4 classified as movable, brings them to the collection point 2 and then continues its cleaning. At the end of the cleaning program, all obstacles 4 classified as movable are located at the collection point 2, while the obstacles 5 classified as non-movable remain at their location, as shown in FIG. 2B.
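
    The ordering constraint described above, namely clean the collection point first and then interleave transport with cleaning, can be summarized as a simple control loop. The grid-cell map representation and the action names are hypothetical placeholders chosen for this sketch:

```python
# Hypothetical control-flow sketch of the cleaning run: the collection
# point is cleaned first, then the floor is worked cell by cell, with
# movable obstacles brought to the collection point before their cell
# is cleaned. Non-movable obstacles are simply driven around.

def cleaning_run(cells, collection_point):
    """Return the ordered action list for a run over map `cells`.

    `cells` maps a cell id to 'free', 'movable' or 'non-movable';
    this representation is an illustrative assumption.
    """
    actions = [("clean", collection_point)]  # collection point first
    for cell, kind in cells.items():
        if kind == "movable":
            actions.append(("transport", cell, collection_point))
            actions.append(("clean", cell))  # cell is now free
        elif kind == "free":
            actions.append(("clean", cell))
        # non-movable: no action, the robot drives around it
    return actions
```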

    [0057] Alternatively, it is possible that, before the cleaning program actually starts and after the collection point 2 has been cleaned, the suction robot carries out a search tour in which it searches the floor of the floor processing area to be cleaned. Identified movable obstacles are brought to the collection point 2. The actual cleaning then begins.

    [0058] Further alternatively or in addition, it is possible to offer the user a clear-up program which the user can start via the app without a subsequent floor cleaning process ensuing.

    [0059] Identified movable obstacles 4 are preferably stored in sorted fashion at the collection point 2. For instance, socks are placed on a first pile at the collection point 2, shoes on a second pile and toys on a third pile. This advantageously facilitates subsequent clearing up by the user.

    [0060] To ensure that the suction robot is able to collect the obstacles, it preferably comprises a robot arm or a similar mechanism which allows gripping, clamping, lifting, carrying away and/or depositing. Alternatively or in addition, fold-out/extendable side walls are used, which allow the obstacles to be slid over the floor and prevent them from slipping sideways during cornering. A preferred grid-type partition which can be folded out/extended in front of the robot prevents obstacles from reaching the suction mouth, the brush roller or the underside of the suction robot. Once the obstacles have been placed at the collection point 2, the side walls and the front partition are retracted again in order to simplify maneuvering of the suction robot. Further additionally or alternatively, a blade-type collection mechanism can be used, which lifts the obstacles and thus also enables transportation over door thresholds.

    [0061] With the method according to the invention, the obstacles 3, 4, 5 lying on the floor are identified and classified, and the movable obstacles 4 are brought to the collection point 2, in order to be able to clean the floor area to be cleaned as completely as possible and at the same time to carry out a rough type of clearing up. Non-movable obstacles 5 remain in their location and position.