DEVICE, SYSTEM AND METHOD FOR ASSISTING MOBILE ROBOT OPERATIONS

20220063108 · 2022-03-03


    Abstract

    Disclosed are a device, system and method for assisting mobile robots. A service robot is disclosed comprising a body; a motion component fitted to the body and configured to propel the service robot in a direction; an engagement component configured to exert a localized force on a geometrically defined interaction area; a sensor configured to detect the interaction area; and a communication component configured to at least communicate with mobile robots and to at least receive requests to engage the interaction area. A system comprising a mobile robot and a service robot is also disclosed. Also disclosed is a method comprising a mobile robot approaching a pedestrian road crossing at a first location; the mobile robot requesting assistance from a service robot; the service robot executing at least one assistive action; and, in response to the assistive action, the mobile robot crossing the road via the pedestrian road crossing.

    Claims

    1. A service robot configured to assist mobile robots, the service robot comprising: a body; a motion component fitted to the body and configured to propel the service robot in a direction; an engagement component configured to exert a localized force on a geometrically defined interaction area; a sensor configured to detect the interaction area; and a communication component configured to at least communicate with mobile robots and to at least receive requests to engage the interaction area.

    2. The service robot according to claim 1, wherein the interaction area comprises a pushbutton.

    3. The service robot according to claim 1, wherein the engagement component is motor operated and comprises a mechanical arm.

    4. The service robot according to claim 1, wherein the engagement component is configured to exert a force of at least 10 N on the interaction area.

    5. The service robot according to claim 1, wherein the engagement component comprises at least two positions comprising an idle position and an active position and wherein in the idle position the engagement component is substantially flush with an upper surface of the body; and in the active position the engagement component is substantially protruding from the body.

    6. The service robot according to claim 5, wherein the engagement component is configured to actuate from the idle position to the active position in response to a request to engage the interaction area.

    7. The service robot according to claim 1, wherein the motion component is configured to displace the body substantially vertically in response to the sensor detecting the interaction area.

    8. The service robot according to claim 1, wherein the engagement component is configured to exert a localized force on an area at a height of between 80 and 150 cm above ground.

    9. A system for assisting mobile robots, the system comprising: at least one mobile robot configured to navigate in unstructured outdoor environments on pedestrian walkways; and at least one service robot according to claim 1, wherein the service robot is configured to assist the mobile robot by engaging the interaction area.

    10. The system according to claim 9, wherein the service robot is configured to assist the mobile robot at a first location and time upon request.

    11. The system according to claim 9, wherein the service robot is configured to assist the mobile robot by pressing a pedestrian crossing pushbutton so as to enable the mobile robot to cross a traffic road.

    12. The system according to claim 9, wherein the service robot is configured to assist the mobile robot at a first location, and wherein the system further comprises a server configured to communicate with the mobile robot and the service robot and wherein the server is configured to instruct the service robot to navigate to the first location where the service robot can assist the mobile robot.

    13. The system according to claim 12, wherein the system comprises a plurality of mobile robots and wherein the server is configured to optimize placement of the service robot based on ongoing mobile robot operations.

    14. The system according to claim 12, wherein the server is configured to estimate navigational time of the service robot and the mobile robot and instruct the service robot to start navigating to the first location.

    15. A method for assisting mobile robots, the method comprising: a mobile robot approaching a pedestrian road crossing at a first location; the mobile robot requesting assistance from a service robot; the service robot executing at least one assistive action; and in response to the assistive action, the mobile robot crossing the road via the pedestrian road crossing.

    16. The method according to claim 15, further comprising the service robot travelling to the first location ahead of the mobile robot so as to provide assistive action upon arrival of the mobile robot.

    17. The method according to claim 15, further comprising the service robot departing the first location after executing the assistive action.

    18. The method according to claim 17, further comprising the service robot travelling to a second location and providing an assistive action to a second mobile robot at the second location.

    Description

    BRIEF DESCRIPTION OF THE DRAWINGS

    [0225] FIGS. 1a and 1b depict an embodiment of a service robot;

    [0226] FIGS. 2a and 2b depict a different embodiment of a service robot;

    [0227] FIGS. 3a, 3b, 3c and 3d depict schematic partial views of an embodiment of a service robot;

    [0228] FIG. 4 depicts a schematic embodiment of a system according to one embodiment of the invention;

    [0229] FIG. 5 schematically depicts a method according to an embodiment of the invention; and

    [0230] FIG. 6 shows an embodiment of a mobile robot as per an embodiment of the present invention.

    DESCRIPTION OF EMBODIMENTS

    [0231] FIGS. 1a and 1b schematically depict an embodiment of a service robot according to an aspect of the present invention. The service robot 1 is shown engaging an interaction area 50. In the present figures, the interaction area 50 is shown as a pedestrian crossing pushbutton.

    [0232] The service robot 1 comprises a body 2. The body comprises an upper surface 22 which is shown as convex in the figures. The service robot 1 further comprises a motion component 4, shown as wheels 4. The depicted embodiment shows a service robot 1 with six wheels.

    [0233] The service robot 1 further comprises an engagement component 6. The engagement component 6 can be configured to engage or activate the interaction area 50. In the depicted embodiments, the engagement component 6 is configured to engage or push the pushbutton 50 of a pedestrian crossing.

    [0234] The service robot 1 in any of the shown embodiments generally comprises a processing component as well (not shown in the figures). The processing component can serve to control and coordinate the service robot's 1 operations, such as navigating (and generally using the motion component 4), actuating the engagement component 6, or using a communication component (also not shown) to send and receive data, instructions, or operational information.

    [0235] FIGS. 2a and 2b show another embodiment of a service robot. In this embodiment, the engagement component 6 is shown as a flag or antenna that can have a double function of increasing the service robot's visibility and engaging interaction areas. The service robot 1 has a similar body 2 and a motion component 4, also depicted as wheels.

    [0236] FIGS. 3a, 3b, 3c and 3d show partial views of the service robot 1. These correspond to the schematic embodiment of FIGS. 1a and 1b. Sensor 8 is shown, placed at the top or end of the engagement component 6. The shown sensor 8 comprises a visual camera 8, but there can be different sensors (such as a Lidar sensor or a Time of Flight sensor), and/or a plurality of sensors. In the shown embodiment, the camera 8 is placed within an indentation of a protruding element that is configured to engage an interaction area.

    [0237] The upper surface 22 of the body is shown as well. Further, an engagement mechanism 62, protruding through the upper surface 22, is shown. The engagement mechanism 62 comprises a lever connected to a motor that can actuate the engagement component 6, so that it can move between an idle position (as shown in FIG. 3b) and an active position (as shown in FIGS. 3a, 3c and 3d). In the idle position, the engagement component 6 can advantageously be out of the way, so that it does not impede the movement of the service robot 1, nor present any inconvenience to passersby if the service robot 1 is travelling. In the active position, the engagement component 6 can engage or activate the interaction area 50.

    [0238] The engagement mechanism 62 can be implemented differently. For example, the engagement mechanism 62 could comprise a kinematic structure such as folding bars, to optimize space taken by the engagement mechanism 62.

    [0239] FIG. 3b shows the engagement component 6 protruding slightly from the body's upper surface 22. In other embodiments, the engagement component 6 can be substantially flush with the upper surface 22. That is, the upper surface 22 could comprise an indentation where the engagement component 6 could fit, and from where it could extend beyond the body when moved from an idle into an active position.
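    By way of illustration only, the idle/active behavior described in the preceding paragraphs might be sketched as a simple two-position state machine. The class and method names below are assumptions chosen for the sketch; they are not part of the disclosed embodiments.

```python
# Hypothetical sketch of the two-position engagement component: in the idle
# position it sits substantially flush with the body's upper surface, and it
# actuates to the active position in response to a request to engage the
# interaction area (cf. claims 5 and 6).

class EngagementComponent:
    IDLE = "idle"      # substantially flush with the body's upper surface
    ACTIVE = "active"  # substantially protruding from the body

    def __init__(self):
        self.position = self.IDLE

    def handle_request(self, request):
        # Move from the idle to the active position when asked to engage.
        if request == "engage interaction area" and self.position == self.IDLE:
            self.position = self.ACTIVE
        return self.position

    def retract(self):
        # Return flush with the upper surface once the engagement is done,
        # so the component does not impede movement or inconvenience passersby.
        self.position = self.IDLE
        return self.position
```

    A request to engage would then drive `handle_request(...)`, and `retract()` would restore the space-saving idle position for travel.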

    [0240] FIG. 4 shows a schematic embodiment of a system according to one aspect of the present invention. A service robot 1 communicates with a server 200, which in turn communicates with a plurality of mobile robots 100. The server 200 is optional and is shown for illustrative purposes only; in other words, the service robot 1 and the mobile robots 100 can also communicate directly. In the depicted embodiment, the server 200 may coordinate the operations of the mobile robots 100 and the service robot 1. That is, the server 200 may direct the service robot 1 to navigate to different locations in order to assist different mobile robots 100. Additionally or alternatively, the service robot 1 may coordinate at least part of the operations of the mobile robots 100. For example, the service robot 1 may coordinate a plurality of mobile robots 100 crossing a traffic road via a pedestrian crossing. On busy routes, a queue of mobile robots may form, all waiting to cross the traffic road to navigate to their destinations. This may be undesirable, as the robots may block parts of the sidewalk, arrive at their destinations later than expected, and/or generally slow down mobile robot operations. The service robot 1 may then be placed in the vicinity of such busy road crossings in order to streamline mobile robot operations. The service robot 1 and the mobile robots 100 may also be coordinated by the server 200, which might calculate optimal routes for the robots and optimal placement for the service robot 1. The service robot 1 may observe the road crossing and transmit data useful to the mobile robots 100 for crossing it as quickly as possible (e.g. any vehicles detected within the robots' 100 blind spots or outside their field of view, the state of the traffic light, etc.). The service robot 1 may also coordinate (or enable the server 200 to coordinate) a plurality of robots crossing the pedestrian crossing in tandem or in formation (e.g. a column, pairwise crossing, or the like). This can also allow for quicker road crossings, since the mobile robots 100 would not need to individually ensure that the crossing is safe to perform, but would rather be authorized by the service robot 1 to cross without first stopping and verifying the safety of such a crossing. The service robot 1 can use a plurality of sensors to ensure that the crossing is safe (e.g. a combination of cameras and a time of flight sensor or a radar). Additionally, the service robot 1 can be placed at a better vantage point to observe the intersection compared to the mobile robots 100, which would observe it from the pedestrian crossing.
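    The queue coordination described above can be sketched in simplified form as follows. The function name, sensor labels and formation size are assumptions made for the sketch, not part of the disclosed system.

```python
# Simplified sketch: the service robot authorizes the waiting mobile robots
# to cross in small formations (e.g. pairwise) only when all of its sensors
# (e.g. camera, time-of-flight sensor, radar) agree the crossing is clear,
# so no mobile robot has to stop and verify the crossing individually.

def authorize_crossing(sensor_clear, queue, group_size=2):
    """Return groups of robot IDs authorized to cross, or [] if unsafe.

    sensor_clear: dict mapping sensor name -> True if that sensor reports
        the crossing is clear.
    queue: robot IDs waiting at the crossing, in arrival order.
    group_size: robots released per formation (e.g. 2 for pairwise crossing).
    """
    # Require agreement between all sensors before authorizing anyone.
    if not all(sensor_clear.values()):
        return []
    # Release the queue in consecutive formations of group_size robots.
    return [queue[i:i + group_size] for i in range(0, len(queue), group_size)]
```

    For instance, `authorize_crossing({"camera": True, "radar": True}, ["A", "B", "C"])` would release the pair `["A", "B"]` followed by `["C"]`, while any sensor reporting the crossing unclear would release no one.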

    [0241] FIG. 5 schematically shows an embodiment of a method for assisting mobile robot operations. In S1, the mobile robot operations in a predetermined region are monitored. The region can comprise a neighborhood, a campus, a shopping center or the like. In S2, a location and time for providing assistance to the mobile robot by the service robot are estimated. In S3, the service robot is instructed to navigate to the estimated location so as to arrive at the estimated time. In S4, the service robot provides assistance to the mobile robot. The service robot can then depart the location in order to assist a different mobile robot at a different location, for example.
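    Steps S2 and S3 can be illustrated with a minimal scheduling sketch. The timing model below is an assumption made for illustration; the specification leaves the estimation method open.

```python
# Minimal sketch of steps S2-S3: a server estimates when the service robot
# must start navigating so that it arrives at the assistance location no
# later than the mobile robot (cf. claim 14). All times are in seconds.

def latest_departure(mobile_robot_eta, service_robot_travel_time, now=0):
    """Return the latest time the service robot can start navigating."""
    # The service robot should be at the location when the mobile robot is.
    arrival_deadline = now + mobile_robot_eta
    departure = arrival_deadline - service_robot_travel_time
    # If the service robot cannot arrive in time, it should depart at once.
    return max(now, departure)
```

    Under this sketch, with a mobile robot 300 s away from the crossing and a 120 s service-robot travel time, the service robot would be instructed to depart at t = 180 s; if its travel time exceeded the mobile robot's, it would be dispatched immediately.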

    [0242] FIG. 6 demonstrates an exemplary embodiment of the mobile robot 100. The mobile robot 100 can comprise a delivery or a vending robot, that is, it can transport and deliver packages, consumable items, groceries or other items to customers. Preferably, the mobile robot 100 is outfitted with a beverage module (not shown in the figure).

    [0243] The mobile robot 100 comprises a robot body 102. The body 102 comprises an item compartment in which items can be placed and transported by the robot (not shown in the present figure).

    [0244] The mobile robot 100 further comprises a robot motion component 104 (depicted as wheels 104). In the present embodiment, the robot motion component 104 comprises six wheels 104. This can be particularly advantageous for the mobile robot 100 when traversing curbstones or other similar obstacles on the way to delivery recipients.

    [0245] The mobile robot 100 comprises a lid 106. The lid 106 can be placed over the item compartment and locked to prevent unauthorized access to the beverage module.

    [0246] The mobile robot 100 further comprises a robot signaling device 108, depicted here as a flagpole or stick 108 used to increase the visibility of the robot 100. Particularly, the visibility of the robot 100 during road crossings can be increased. In some embodiments, the signaling device 108 can comprise an antenna. The mobile robot 100 further comprises robot headlights 109 configured to facilitate the robot's navigation in reduced natural light scenarios and/or increase the robot's visibility further. The headlights are schematically depicted as two symmetric lights 109, but can comprise one light, a plurality of lights arranged differently and other similar arrangements.

    [0247] The mobile robot 100 also comprises robot sensors 110, 112, 113, 114. The sensors are depicted as visual cameras (110, 112, 113) and ultrasonic sensors (114) in the figure, but can also comprise radar sensors, lidar sensors, time of flight cameras and/or other sensors. Further sensors can also be present on the mobile robot 100. One sensor can comprise a front camera 110. The front camera 110 can be generally forward facing. The sensors may also comprise front (112, 113), side and/or back stereo cameras. The front stereo cameras 112 and 113 can be slightly downward facing. The side stereo cameras (not depicted) can be forward-sideways facing. The back camera (not depicted) may be a mono or a stereo camera and can be generally backward facing. The sensors present on multiple sides of the robot can contribute to its situational awareness and navigation capabilities. That is, the robot 100 can be configured to detect approaching objects and/or hazardous moving objects from a plurality of sides and act accordingly.

    [0248] The robot sensors can also allow the robot 100 to navigate and travel to its destinations at least partially autonomously. That is, the robot can be configured to map its surroundings, localize itself on such a map and navigate towards different destinations using in part the input received from the multiple sensors.

    [0249] The service robot 1 can be structurally and physically similar to the mobile robot 100. However, the service robot 1 can be specifically optimized for performing an assistive action, such as pushing a button, whereas the mobile robot 100 can be optimized for tasks such as item delivery and transportation or the like. The service robot 1 may not have an item compartment, or the item compartment may be utilized for the engagement component mechanism or the like.

    LIST OF REFERENCE NUMERALS

    [0250] 1—Service robot
    [0251] 2—Body
    [0252] 22—Upper surface of the body
    [0253] 4—Motion component
    [0254] 6—Engagement component
    [0255] 62—Engagement mechanism
    [0256] 8—Sensor
    [0257] 10—Communication component
    [0258] 50—Interaction area
    [0259] 100—Mobile robot
    [0260] 102—Robot body
    [0261] 104—Robot motion component
    [0262] 106—Lid
    [0263] 108—Robot signaling device
    [0264] 109—Headlights
    [0265] 110—Front camera
    [0266] 112—Front stereo camera
    [0267] 113—Front stereo camera
    [0268] 114—Ultrasonic sensor
    [0269] 116—Robot communication component
    [0270] 200—Server

    [0271] Whenever a relative term, such as “about”, “substantially” or “approximately” is used in this specification, such a term should also be construed to also include the exact term. That is, e.g., “substantially straight” should be construed to also include “(exactly) straight”.

    [0272] Whenever steps were recited in the above or also in the appended claims, it should be noted that the order in which the steps are recited in this text may be the preferred order, but it may not be mandatory to carry out the steps in the recited order. That is, unless otherwise specified or unless clear to the skilled person, the order in which steps are recited may not be mandatory. That is, when the present document states, e.g., that a method comprises steps (A) and (B), this does not necessarily mean that step (A) precedes step (B), but it is also possible that step (A) is performed (at least partly) simultaneously with step (B) or that step (B) precedes step (A). Furthermore, when a step (X) is said to precede another step (Z), this does not imply that there is no step between steps (X) and (Z). That is, step (X) preceding step (Z) encompasses the situation that step (X) is performed directly before step (Z), but also the situation that (X) is performed before one or more steps (Y1), . . . , followed by step (Z). Corresponding considerations apply when terms like “after” or “before” are used.