PATH PLANNING METHOD AND NAVIGATION METHOD AND MOBILE MACHINE USING THE SAME
20250390099 · 2025-12-25
Inventors
- Huaguang DU (Pasadena, CA, US)
- Peijie Xu (Pasadena, CA, US)
- Zhipeng LIU (Pasadena, CA, US)
- Wenda Xu (Pasadena, CA, US)
- Fangyun ZHAO (Pasadena, CA, US)
- Chengkun Zhang (Pasadena, CA, US)
CPC classification
G05D2105/31
PHYSICS
International classification
G05D1/246
PHYSICS
Abstract
Path planning and navigation for a mobile machine is disclosed. A path planning method plans a path for the mobile machine having a plurality of sensors by: receiving, from each of the sensors of the mobile machine, sensor data; creating, based on the received sensor data from each of the sensors, a plurality of local sensor layers each corresponding to the received sensor data from each of the sensors; creating a local map by integrating all the created local sensor layers; inflating the local map; creating a global costmap by fusing an inflated global map and the inflated local map; planning, according to the costmap, the path for navigating the mobile machine; and providing the planned path to the mobile machine for navigating the mobile machine using the planned path.
Claims
1. A method for navigating a mobile machine having a plurality of sensors, comprising: obtaining a global map; inflating the global map; and in response to receiving a navigation task of the mobile machine: creating a global costmap for the mobile machine by performing a costmap creation process; planning, based on the costmap, a path for navigating the mobile machine; and navigating the mobile machine using the planned path; wherein, the costmap creation process includes: receiving, from each of the sensors of the mobile machine, sensor data; creating, based on the received sensor data from each of the sensors, a plurality of local sensor layers each corresponding to the received sensor data from each of the sensors; creating a local map by integrating all the local sensor layers; inflating the local map; and creating the costmap for the mobile machine by fusing the inflated global map and the inflated local map.
2. The method of claim 1, further comprising: obtaining, based on the received sensor data from each of the sensors, dynamic obstacle information; and recording the obtained dynamic obstacle information of obstacles in the local map that corresponds to the received sensor data at a current time frame and transform coordinates representing a pose of the mobile machine at the current time frame.
3. The method of claim 2, further comprising: discarding the recorded dynamic obstacle information of the obstacles that is beyond at least one of a field of view of the mobile machine and a specified number of previous time frames before the current time frame.
4. The method of claim 2, wherein creating the local map by integrating all the local sensor layers comprises: creating the local map by integrating all the local sensor layers, the recorded dynamic obstacle information corresponding to the received sensor data at previous time frames before the current time frame and the transform coordinates of the previous time frames.
5. The method of claim 1, further comprising: adjusting a size of the local map according to a velocity of the mobile machine so that the size is proportional to the velocity.
6. The method of claim 1, wherein the global map is a pre-built static map corresponding to a facility, and obtaining the global map comprises: obtaining, based on the static map, static obstacle information; and creating, based on the obtained static obstacle information, the global map.
7. The method of claim 1, wherein the costmap is a map having a plurality of cells each with a cost value with respect to obstacles; wherein planning, based on the created costmap, the path for navigating the mobile machine comprises: planning, according to the costs in the created costmap, the path for navigating the mobile machine while avoiding the obstacles.
8. A method for planning a path for navigating a mobile machine having a plurality of sensors, comprising: receiving, from each of the sensors of the mobile machine, sensor data; creating, based on the received sensor data from each of the sensors, a plurality of local sensor layers each corresponding to the received sensor data from each of the sensors; creating a local map by integrating all the created local sensor layers; inflating the local map; creating a global costmap for the mobile machine by fusing an inflated global map and the inflated local map; planning, according to the costmap, the path for navigating the mobile machine; and providing the planned path to the mobile machine for navigating the mobile machine using the planned path.
9. The method of claim 8, wherein the method is performed in response to receiving a navigation task of the mobile machine.
10. The method of claim 8, further comprising: obtaining, based on the received sensor data from each of the sensors, dynamic obstacle information; and recording the obtained dynamic obstacle information of obstacles in the local map that corresponds to the received sensor data at a current time frame and transform coordinates representing a pose of the mobile machine at the current time frame.
11. The method of claim 10, further comprising: discarding the recorded dynamic obstacle information of the obstacles that is beyond at least one of a field of view of the mobile machine and a specified number of previous time frames before the current time frame.
12. The method of claim 10, wherein creating the local map by integrating all the local sensor layers comprises: creating the local map by integrating all the local sensor layers, the recorded dynamic obstacle information corresponding to the received sensor data at previous time frames before the current time frame and the transform coordinates of the previous time frames.
13. The method of claim 8, further comprising: adjusting a size of the local map according to a velocity of the mobile machine so that the size is proportional to the velocity.
14. A mobile machine, comprising: one or more sensors; one or more processors; and one or more memories storing a costmap module configured to be executed by the one or more processors, wherein the costmap module comprises a layer manager and an inflation manager, and the costmap module comprises instructions to: receive, from each of the sensors of the mobile machine, sensor data; create, using the layer manager based on the received sensor data from each of the sensors, a plurality of local sensor layers each corresponding to the received sensor data from each of the sensors; create, using the layer manager, a local map by integrating all the created local sensor layers; inflate, using the inflation manager, the local map; create a global costmap for the mobile machine by fusing an inflated global map and the inflated local map; plan, according to the costmap, a path for navigating the mobile machine; and provide the planned path to the mobile machine for navigating the mobile machine using the planned path.
15. The mobile machine of claim 14, wherein the costmap module is triggered to execute by the one or more processors in response to receiving a navigation task of the mobile machine.
16. The mobile machine of claim 14, wherein the costmap module further comprises instructions to: obtain the global map; inflate the global map; and navigate the mobile machine using the planned path.
17. The mobile machine of claim 14, wherein the costmap module further comprises a memory manager, and further comprises instructions to: obtain, based on the received sensor data from each of the sensors, dynamic obstacle information; and record, using the memory manager, the obtained dynamic obstacle information of obstacles in the local map that corresponds to the received sensor data at a current time frame and transform coordinates representing a pose of the mobile machine at the current time frame.
18. The mobile machine of claim 17, wherein the costmap module further comprises instructions to: discard, using the memory manager, the recorded dynamic obstacle information of the obstacles that is beyond at least one of a field of view of the mobile machine and a specified number of previous time frames before the current time frame.
19. The mobile machine of claim 17, wherein creating the local map by integrating all the local sensor layers comprises: creating, using the layer manager, the local map by integrating all the local sensor layers, the recorded dynamic obstacle information corresponding to the received sensor data at previous time frames before the current time frame and the transform coordinates of the previous time frames.
20. The mobile machine of claim 14, wherein the costmap module further comprises instructions to: adjust a size of the local map according to a velocity of the mobile machine so that the size is proportional to the velocity.
Description
BRIEF DESCRIPTION OF DRAWINGS
[0004] In order to more clearly illustrate the technical solutions in this embodiment, the drawings used in the embodiments or the description of the prior art will be briefly introduced below. In the drawing(s), like reference numerals designate corresponding parts throughout the figures. It should be understood that the drawings in the following description are only examples of the present disclosure. For those skilled in the art, other drawings can be obtained based on these drawings without creative work.
DETAILED DESCRIPTION
[0015] In order to make the objects, features and advantages of the present disclosure more obvious and easy to understand, the technical solutions in this embodiment will be clearly and completely described below with reference to the drawings. Apparently, the described embodiments are part of the embodiments of the present disclosure, not all of the embodiments. All other embodiments obtained by those skilled in the art based on the embodiments of the present disclosure without creative efforts are within the scope of the present disclosure.
[0016] It is to be understood that, when used in the description and the appended claims of the present disclosure, the terms including, comprising, having and their variations indicate the presence of stated features, integers, steps, operations, elements and/or components, but do not preclude the presence or addition of one or a plurality of other features, integers, steps, operations, elements, components and/or combinations thereof.
[0017] It is also to be understood that, the terminology used in the description of the present disclosure is only for the purpose of describing particular embodiments and is not intended to limit the present disclosure. As used in the description and the appended claims of the present disclosure, the singular forms one, a, and the are intended to include the plural forms as well, unless the context clearly indicates otherwise.
[0018] It is also to be further understood that the term and/or used in the description and the appended claims of the present disclosure refers to any combination of one or more of the associated listed items and all possible combinations, and includes such combinations.
[0019] In the present disclosure, the terms first, second, and third are for descriptive purposes only, and are not to be comprehended as indicating or implying the relative importance or implicitly indicating the amount of technical features indicated. Thus, the feature limited by first, second, and third may include at least one of the feature either explicitly or implicitly. In the description of the present disclosure, the meaning of a plurality is at least two, for example, two, three, and the like, unless specifically defined otherwise.
[0020] In the present disclosure, descriptions such as one embodiment, some embodiments, or the like mean that one or more embodiments of the present disclosure can include particular features, structures, or characteristics related to the described embodiments. Therefore, the phrases in one embodiment, in some embodiments, and in other embodiments that appear in different places of the specification do not mean that the described embodiments should be referred to by all other embodiments, but instead by one or more but not all other embodiments, unless otherwise specifically emphasized.
[0021] The present disclosure relates to navigating a mobile machine. As used herein, the term mobile machine refers to a machine such as a mobile robot or a vehicle that has the capability to move around in its environment. The term path planning refers to finding a sequence of valid configurations that moves a mobile machine from the starting point to the destination, where path denotes a sequence of poses without time stamps (cf. trajectory, which denotes a sequence of poses or positions with time stamps). The term pose refers to position (e.g., x and y coordinates on the x and y axes) and posture (e.g., a yaw angle along the z axis). The term navigation refers to the process of monitoring and controlling the movement of a mobile machine from one place to another. The term collision avoidance refers to preventing a collision or reducing its severity. The term sensor refers to a device, module, machine, or subsystem such as an ambient light sensor or an image sensor (e.g., a camera) whose purpose is to detect events or changes in its environment and send the information to other electronics (e.g., a processor).
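The path/trajectory distinction drawn above can be made concrete with a small sketch; the names Pose, Path, Trajectory, and path_length are illustrative only and are not taken from the disclosure:

```python
from dataclasses import dataclass
from typing import List, Tuple


@dataclass
class Pose:
    """Position (x, y) plus posture (a yaw angle about the z axis)."""
    x: float
    y: float
    yaw: float


# A path is a sequence of poses without time stamps ...
Path = List[Pose]

# ... while a trajectory pairs each pose with a time stamp.
Trajectory = List[Tuple[float, Pose]]  # (t, pose)


def path_length(path: Path) -> float:
    """Euclidean length of a path, summed over consecutive poses."""
    return sum(
        ((b.x - a.x) ** 2 + (b.y - a.y) ** 2) ** 0.5
        for a, b in zip(path, path[1:])
    )
```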
[0023] For realizing the path planning (and the navigation) of the mobile machine 100, a static map for the facility 1000 has to be built in advance, and the current (real-time) position of the mobile machine 100 in the facility 1000 has to be determined. For example, a global path (e.g., the global path P.sub.g) between the start point and the destination in the built static map for the facility 1000 may be planned based on the built static map, and a (real-time) local path (e.g., the detour path P.sub.d) may be planned based on a local map (e.g., the local map M.sub.l).
[0025] The navigation module 121 in the storage unit 120 of the mobile machine 100 may be a software module (of the operating system of the mobile machine 100) that may belong to a navigation stack (e.g., a ROS (robot operating system) navigation stack) of the mobile machine 100, which has instructions I.sub.n (e.g., instructions for actuating motor(s) M of the mobile machine 100 to move the mobile machine 100) for implementing the navigation of the mobile machine 100, a map builder 1211, and path planner(s) 1212. Each of the map builder 1211 and the path planner(s) 1212 may be a submodule separated from the instructions I.sub.n or other submodules of the navigation module 121, or a part of the instructions I.sub.n for implementing the navigation of the mobile machine 100. The map builder 1211 may be a software module having instructions I.sub.b for building maps for the mobile machine 100. The costmap module 122 in the storage unit 120 of the mobile machine 100 may be a software module (of the operating system of the mobile machine 100) that may belong to the navigation stack, which has instructions I.sub.c for creating costmaps for the mobile machine 100, and a layer manager 12211, an inflation manager 12212, and a memory manager 12213 that are management functionalities specific to building costmaps.
[0026] The path planner(s) 1212 may be software module(s) having instructions I.sub.p for planning paths for the mobile machine 100. In some embodiments, the path planner(s) 1212 may include a global path planner for planning global paths (e.g., the global path P.sub.g) for the mobile machine 100 and a local path planner for planning local paths (e.g., the detour path P.sub.d) for the mobile machine 100. The global path planner may be, for example, a path planner which plans global paths based on a costmap built by the map builder 1211 and/or another map built by the map builder 1211 through, for example, simultaneous localization and mapping (SLAM). The local path planner may be, for example, a path planner based on the A*, RRT* (rapidly-exploring random trees), or TEB (timed elastic band) algorithm, which plans local paths based on the global paths and other data collected by the mobile machine 100. For example, images may be collected through a camera C and/or a Lidar R of the mobile machine 100, and the collected images may be analyzed so as to identify obstacles, so that the local path can be planned with reference to the identified obstacles, and the obstacles can be avoided by moving the mobile machine 100 according to the planned local path. In other embodiments, rather than including the global path planner and the local path planner, the path planner(s) 1212 may include a single path planner for planning both the global paths and the local paths. The path planner(s) 1212 may further have data (e.g., input/output data and temporary data) related to the path planning of the mobile machine 100, which may be stored in the one or more memories and accessed by the processing unit 110.
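As a sketch of the kind of grid planner named above, the following implements A* over a costmap so that per-cell costs steer the path away from obstacles. The lethal threshold of 254, the 4-connected neighbourhood, and the Manhattan heuristic are common costmap-planning conventions assumed here, not details prescribed by the disclosure:

```python
import heapq
from typing import Dict, List, Optional, Tuple

Cell = Tuple[int, int]


def astar(costmap: List[List[int]], start: Cell, goal: Cell,
          lethal: int = 254) -> Optional[List[Cell]]:
    """A* over a 2D costmap grid.

    Cells with cost >= `lethal` are treated as obstacles; each step pays
    1 plus the entered cell's cost, so the planner prefers cells far from
    obstacles. Returns the cell sequence, or None if the goal is unreachable.
    """
    rows, cols = len(costmap), len(costmap[0])

    def h(c: Cell) -> int:
        # Manhattan distance: admissible since every step costs at least 1.
        return abs(c[0] - goal[0]) + abs(c[1] - goal[1])

    open_set = [(h(start), start)]          # (f = g + h, cell)
    g: Dict[Cell, int] = {start: 0}         # best known cost-to-come
    came: Dict[Cell, Cell] = {}             # back-pointers for path recovery
    while open_set:
        _, cur = heapq.heappop(open_set)
        if cur == goal:                     # reconstruct the path
            path = [cur]
            while cur in came:
                cur = came[cur]
                path.append(cur)
            return path[::-1]
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nxt = (cur[0] + dr, cur[1] + dc)
            if not (0 <= nxt[0] < rows and 0 <= nxt[1] < cols):
                continue
            if costmap[nxt[0]][nxt[1]] >= lethal:
                continue                    # lethal cell: never enter
            cand = g[cur] + 1 + costmap[nxt[0]][nxt[1]]
            if cand < g.get(nxt, cand + 1):
                g[nxt] = cand
                came[nxt] = cur
                heapq.heappush(open_set, (cand + h(nxt), nxt))
    return None
```

A global planner could run this on the fused costmap, while a local planner reruns it (or TEB/RRT*) on the smaller local window around the machine.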
[0027] In some embodiments, each of the path planner(s) 1212 may be a module in the storage unit 120 that is separated from the navigation module 121. The instructions I.sub.n may include instructions for implementing collision avoidance of the mobile machine 100 (e.g., obstacle detection and detour path planning). In addition, the local path planner may plan a detour path (e.g., the detour path P.sub.d) to graft to the global path(s) (e.g., the global path P.sub.g) in response to, for example, the original global path(s) being blocked (e.g., blocked by an unexpected obstacle such as the obstacle O) or inadequate for collision avoidance (e.g., impossible to avoid a detected obstacle when adopted). The detour path may be grafted to the global path(s) by replacing a part of the original global path(s) that is near to the obstacle. In other embodiments, the navigation module 121 may be a navigation unit communicating with the processing unit 110, the storage unit 120, and the control unit 130 over the one or more communication buses or signal lines L, and may further include one or more memories (e.g., high-speed random access memory (RAM) and non-transitory memory) for storing the instructions I.sub.n, the map builder 1211, and the path planner(s) 1212, and one or more processors (e.g., MPU and MCU) for executing the stored instructions I.sub.n, I.sub.b, I.sub.p, and I.sub.c to implement the navigation of the mobile machine 100.
[0028] The mobile machine 100 may further include a communication subunit 131 and an actuation subunit 132. The communication subunit 131 and the actuation subunit 132 communicate with the control unit 130 over one or more communication buses or signal lines that may be the same or at least partially different from the above-mentioned one or more communication buses or signal lines L. The communication subunit 131 is coupled to communication interfaces of the mobile machine 100, for example, network interface(s) 1311 for the mobile machine 100 to communicate with the control device 200 via network(s) and I/O interface(s) 1312 (e.g., a physical button), and the like. The actuation subunit 132 is coupled to component(s)/device(s) for implementing the motions of the mobile machine 100 by, for example, actuating motor(s) M of wheels and/or joints of the mobile machine 100. The communication subunit 131 may include controllers for the above-mentioned communication interfaces of the mobile machine 100, and the actuation subunit 132 may include controller(s) for the above-mentioned component(s)/device(s) for implementing the motions of the mobile machine 100. In other embodiments, the communication subunit 131 and/or the actuation subunit 132 may just be abstract components for representing the logical relationships between the components of the mobile machine 100.
[0029] The mobile machine 100 may further include a sensor subunit 133 which may include a set of sensor(s) and related controller(s), for example, the camera C (e.g., an RGB-D camera) and an IMU (inertial measurement unit) U (or an accelerometer and a gyroscope), for detecting the environment in which the mobile machine 100 is located so as to realize its navigation. The sensor subunit 133 communicates with the control unit 130 over one or more communication buses or signal lines that may be the same or at least partially different from the above-mentioned one or more communication buses or signal lines L. In other embodiments, in the case that the navigation module 121 is the above-mentioned navigation unit, the sensor subunit 133 may communicate with the navigation unit over one or more communication buses or signal lines that may be the same or at least partially different from the above-mentioned one or more communication buses or signal lines L. In addition, the sensor subunit 133 may just be an abstract component for representing the logical relationships between the components of the mobile machine 100. Furthermore, the sensor subunit 133 may further include other sensors for detecting the environment in which the mobile machine 100 is located, for example, an ultrasonic sensor and an infrared (IR) sensor.
[0030] In some embodiments, the map builder 1211, the path planner(s) 1212, the sensor subunit 133, and the motor(s) M (and the wheels and/or joints of the mobile machine 100 coupled to the motor(s) M) jointly compose a (navigation) system which implements map building, (global and local) path planning, and motor actuating so as to realize the navigation of the mobile machine 100.
[0034] A map represented as an occupancy grid containing a plurality of cells, each assigned an (inflated/uninflated) cost value, is called a costmap M.sub.c, where cells close to obstacles have higher cost values and cells far away from obstacles have lower cost values.
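A minimal sketch of such a costmap follows, assuming a linear fall-off of cost with grid distance from the nearest obstacle; the occupied-cell value of 254 and the decay rate are illustrative conventions, not values specified by the disclosure:

```python
from collections import deque
from typing import List

LETHAL = 254  # cost assigned to an occupied cell (illustrative convention)


def distance_costmap(grid: List[List[int]], decay: int = 50) -> List[List[int]]:
    """Turn an occupancy grid (1 = obstacle, 0 = free) into a costmap.

    Each cell's cost falls off with its grid distance to the nearest
    obstacle: obstacles get LETHAL, their neighbours LETHAL - decay, and
    so on, floored at 0. Computed with a multi-source BFS from obstacles.
    """
    rows, cols = len(grid), len(grid[0])
    cost = [[0] * cols for _ in range(rows)]
    q = deque()
    seen = set()
    for r in range(rows):
        for c in range(cols):
            if grid[r][c]:
                cost[r][c] = LETHAL
                seen.add((r, c))
                q.append((r, c, 0))
    while q:
        r, c, d = q.popleft()
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols and (nr, nc) not in seen:
                seen.add((nr, nc))
                cost[nr][nc] = max(0, LETHAL - decay * (d + 1))
                q.append((nr, nc, d + 1))
    return cost
```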
[0035] A trigger logic (see step S315) may be realized for enabling the mobile machine 100 to conserve computational resources when not engaged in navigation tasks. At step S315, it is determined whether a navigation task of the mobile machine 100 is received or not. If yes, step S316 will be performed to start a costmap creation process for creating the costmap M.sub.c; otherwise, step S315 will be reperformed to loop and wait for a navigation task. The navigation task may be created by the navigation/operating system of the mobile machine 100 for performing the costmap creation process, in response to a request for the navigation of the mobile machine 100. Steps S311-S314 are performed before step S315 so as to speed up the initiation of the navigation stack, and may also be performed after step S315 to, for example, run in parallel with steps S316-S318. Upon receiving the navigation task, the navigation stack of the mobile machine 100 triggers the costmap creation process for creating the costmap M.sub.c. At step S316, the sensor data D.sub.s is received from each of the sensors for navigation of the mobile machine 100. The sensor data D.sub.s is received in a current time frame, that is, the time frame of the current time. When the sensors for navigation include the camera C and the Lidar R, the received sensor data D.sub.s includes the sensor data D.sub.s of the camera C (e.g., images) and that of the Lidar R (e.g., point clouds). At step S317, the received sensor data D.sub.s is preprocessed, for example, by removing sensor noise and/or handling data dropouts. At step S318, dynamic obstacle information O.sub.d is obtained from the preprocessed sensor data D.sub.s.
The dynamic obstacle information O.sub.d is information (e.g., the position) of obstacles detected by the sensors for navigation, where the obstacles may include dynamic obstacles that are recognized from the received sensor data D.sub.s (e.g., the obstacle O).
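One cycle of steps S315-S318 might be sketched as below. The range-based preprocess is only a stand-in for whatever noise removal and dropout handling an implementation would use, and the dict-of-callables sensor interface is an assumption for illustration:

```python
from typing import Callable, Dict, List, Optional, Tuple

Point = Tuple[float, float]  # obstacle position in the machine's frame


def preprocess(readings: List[Point], max_range: float = 10.0) -> List[Point]:
    """Step S317 stand-in: discard returns beyond the sensor's usable
    range, a simple proxy for removing noise and handling dropouts."""
    return [p for p in readings if (p[0] ** 2 + p[1] ** 2) ** 0.5 <= max_range]


def on_tick(
    task: Optional[dict],
    sensors: Dict[str, Callable[[], List[Point]]],
) -> Optional[Dict[str, List[Point]]]:
    """Steps S315-S318 for one cycle: do nothing until a navigation task
    arrives; once one does, read and preprocess every sensor, yielding the
    current time frame's dynamic obstacle candidates per sensor."""
    if task is None:
        return None  # S315: no task received yet, loop and keep waiting
    # S316 + S317: receive each sensor's data and preprocess it.
    return {name: preprocess(read()) for name, read in sensors.items()}
```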
[0036] According to the navigation method, the processing unit 110 may further provide the (global) costmap M.sub.c for the mobile machine 100 based on the global map M.sub.g, the preprocessed sensor data D.sub.s, and other obstacle information O.sub.t (block 320).
[0038] Steps S3221-S3222 are performed in parallel with steps S3223-S3225 so as to speed up the initiation of the creation of the local map M.sub.l in step S3226. At step S3223, the dynamic obstacle information O.sub.d is obtained based on the preprocessed sensor data D.sub.s from each of the sensors for navigation and the other obstacle information O.sub.t including the historical dynamic obstacle information O.sub.d. At step S3224, the memory manager 12213 records the dynamic obstacle information O.sub.d of obstacles in the local map M.sub.l that corresponds to the sensor data D.sub.s received at the current time frame and transform coordinates representing the pose of the mobile machine 100 at the current time frame. The memory manager 12213 significantly reduces the computational burden of the mobile machine 100 by retaining only the transform coordinates of different time frames. At step S3225, the recorded dynamic obstacle information O.sub.d corresponding to the received sensor data D.sub.s at the previous time frames and the transform coordinates of the previous time frames are obtained. At step S3226, the layer manager 12211 creates the local map M.sub.l of the adjusted size by integrating all the local sensor layers L.sub.s with the recorded obstacle information O.sub.d (i.e., the historical dynamic obstacle information O.sub.d) and the transform coordinates. By integrating all the local sensor layers L.sub.s and the historical dynamic obstacle information O.sub.d, a comprehensive representation of the local map M.sub.l can be constructed. The integration may be performed by aligning the historical dynamic obstacle information O.sub.d with the current time frame using the transform coordinates; combining the sensor data D.sub.s in each of the local sensor layers L.sub.s into an empty occupancy grid that is taken as the local map M.sub.l; and integrating the aligned historical dynamic obstacle information O.sub.d into the grid.
The memory manager 12213 may save storage and computational resources and facilitate rational updates of the costmap M.sub.c by selectively discarding outdated historical dynamic obstacle information O.sub.d through a forget logic (see steps S3227-S3228) which spans both the spatial and temporal dimensions. This dual-dimensional approach optimizes the memory utilization of the mobile machine 100 while upholding the integrity of the navigation/operating system of the mobile machine 100. At step S3227, the memory manager 12213 discards (e.g., removes from a memory of the storage unit 120) the recorded dynamic obstacle information O.sub.d of the obstacles that is beyond the FOV of the mobile machine 100. The sensor data D.sub.s for obtaining the recorded dynamic obstacle information O.sub.d that is within the FOV of the mobile machine 100 is considered ground truth and is retained, while the sensor data D.sub.s outside the FOV is systematically erased to ensure real-time accuracy. At step S3228, the memory manager 12213 discards the recorded obstacle information O.sub.d of the obstacles that is beyond a specified number (e.g., 100) of the previous time frames. In other embodiments, step S3227 and step S3228 may be performed in the reverse order, or only one of the two steps may be performed rather than both.
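The record-and-forget behavior of the memory manager (steps S3224, S3227, S3228) can be sketched as follows; the circular FOV test and the fixed-length frame buffer are simplifying assumptions made for brevity, not details from the disclosure:

```python
import math
from collections import deque
from typing import List, Tuple

Obstacle = Tuple[float, float]  # obstacle position in the map frame


class MemoryManager:
    """Sketch of the forget logic spanning the spatial and temporal
    dimensions: recorded dynamic obstacles are discarded when they leave
    the field of view (S3227) or age past `max_frames` time frames (S3228).
    """

    def __init__(self, fov_radius: float, max_frames: int):
        self.fov_radius = fov_radius
        # One entry per time frame: (transform coordinates, obstacles).
        # deque(maxlen=...) silently enacts S3228: old frames fall off.
        self.frames = deque(maxlen=max_frames)

    def record(self, pose_xy: Obstacle, obstacles: List[Obstacle]) -> None:
        """S3224: record this frame's obstacles together with the
        transform coordinates representing the machine's pose."""
        self.frames.append((pose_xy, obstacles))

    def forget_outside_fov(self, pose_xy: Obstacle) -> None:
        """S3227: drop recorded obstacles beyond the machine's FOV,
        approximated here as a circle of radius `fov_radius`."""
        for i, (tf, obs) in enumerate(self.frames):
            kept = [o for o in obs if math.dist(o, pose_xy) <= self.fov_radius]
            self.frames[i] = (tf, kept)
```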
[0039] According to the costmap providing process, the processing unit 110 may further inflate the local map M.sub.l (block 323).
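The disclosure does not spell out the rule for fusing the inflated global map with the inflated local map into the costmap; a common choice, sketched here purely as an assumption, is a cell-wise maximum so that whichever source reports the greater cost for a cell wins:

```python
from typing import List, Tuple


def fuse(global_map: List[List[int]], local_map: List[List[int]],
         origin: Tuple[int, int] = (0, 0)) -> List[List[int]]:
    """Fuse an inflated global map with an inflated local map by taking
    the cell-wise maximum, with the local map overlaid at `origin`
    (row, col) in global grid coordinates. Local cells falling outside
    the global grid are ignored. The inputs are left unmodified."""
    fused = [row[:] for row in global_map]  # start from a copy of the global map
    r0, c0 = origin
    for r, row in enumerate(local_map):
        for c, v in enumerate(row):
            gr, gc = r0 + r, c0 + c
            if 0 <= gr < len(fused) and 0 <= gc < len(fused[0]):
                fused[gr][gc] = max(fused[gr][gc], v)
    return fused
```

The max rule is conservative: a cell marked dangerous by either the static global map or the fresh local sensing stays dangerous in the fused costmap.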
[0040] According to the navigation method, the processing unit 110 may further perform path planning based on the created costmap M.sub.c to provide the global path P.sub.g (and the detour path P.sub.d) (block 330).
[0041] By focusing on enhancing FOV adaptability, dynamic map sizing, and intelligent computational resource management, a novel framework of costmap architecture is provided, which improves navigational precision and efficiency in real-world environments.
[0042] It can be understood by those skilled in the art that, all or part of the method in the above-mentioned embodiment(s) can be implemented by one or more computer programs to instruct related hardware. In addition, the one or more programs can be stored in a non-transitory computer readable storage medium. When the one or more programs are executed, all or part of the corresponding method in the above-mentioned embodiment(s) is performed. Any reference to a storage, a memory, a database or other medium may include non-transitory and/or transitory memory. Non-transitory memory may include read only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), flash memory, solid-state drive (SSD), or the like. Volatile memory may include random access memory (RAM), external cache memory, or the like.
[0043] The processing unit 110 (and the above-mentioned processor) may include a central processing unit (CPU), or may be another general purpose processor, a graphics processing unit (GPU), a digital signal processor (DSP), an application specific integrated circuit (ASIC), a field-programmable gate array (FPGA), or another programmable logic device, discrete gate, transistor logic device, or discrete hardware component. The general purpose processor may be a microprocessor, or the processor may also be any conventional processor. The storage unit 120 (and the above-mentioned memory) may include an internal storage unit such as a hard disk and internal memory. The storage unit 120 may also include an external storage device such as a plug-in hard disk, a smart media card (SMC), a secure digital (SD) card, or a flash card.
[0044] The exemplificative units/modules and methods/steps described in the embodiments may be implemented through software, hardware, or a combination of software and hardware. Whether these functions are implemented through software or hardware depends on the specific application and design constraints of the technical schemes. The above-mentioned path planning method and mobile machine may be implemented in other manners. For example, the division of units/modules is merely a logical functional division, and other division manner may be used in actual implementations, that is, multiple units/modules may be combined or be integrated into another system, or some of the features may be ignored or not performed. In addition, the above-mentioned mutual coupling/connection may be direct coupling/connection or communication connection, and may also be indirect coupling/connection or communication connection through some interfaces/devices, and may also be electrical, mechanical or in other forms.
[0045] The above-mentioned embodiments are merely intended for describing but not for limiting the technical schemes of the present disclosure. Although the present disclosure is described in detail with reference to the above-mentioned embodiments, the technical schemes in each of the above-mentioned embodiments may still be modified, or some of the technical features may be equivalently replaced, so that these modifications or replacements do not make the essence of the corresponding technical schemes depart from the spirit and scope of the technical schemes of each of the embodiments of the present disclosure, and should be included within the scope of the present disclosure.