Autonomous exploration framework for indoor mobile robotics using reduced approximated generalized Voronoi graph

11567502 · 2023-01-31

Abstract

An autonomous robotic exploration method based on a reduced approximated generalized Voronoi graph, the method including: 1) constructing a reduced approximated generalized Voronoi topological map based on a morphological method; 2) obtaining a Next-Best-View and planning a global path from the robot to the Next-Best-View; and 3) navigating to the Next-Best-View along the global path R={r.sub.0, r.sub.1, r.sub.2, . . . , p.sub.NBV} based on a visual force field (VFF) algorithm.

Claims

1. A method, comprising: 1) constructing a reduced approximated generalized Voronoi topological map by means of a morphological method according to 1.1)-1.4): 1.1) performing threshold extraction for an occupancy probabilistic grid map: extracting obstacle pixels with a gray scale smaller than an empirical threshold th.sub.obstacle as an obstacle region, the th.sub.obstacle being a value between 50-70; and extracting non-obstacle pixels with a gray scale greater than an empirical threshold th.sub.free as a passable region, the th.sub.free being a value between 200-220; 1.2) performing small-gap filling for the obstacle region and the passable region generated in 1.1) using a morphological closing operation; removing those pixel blocks with the number of pixels smaller than an empirical threshold th.sub.conn using connectivity analysis, in which the value of the th.sub.conn is determined according to the resolution of the occupancy probabilistic grid map; obtaining a smooth obstacle region and a smooth passable region by removing a convex part of the boundary of the pixel block using smoothing filtering; 1.3) obtaining the reduced approximated generalized Voronoi diagram by performing center line extraction for the smooth passable region in 1.2) using an image thinning algorithm; and 1.4) performing topologization for the reduced approximated generalized Voronoi graph obtained in 1.3), extracting a node set V in the reduced approximated generalized Voronoi graph by adopting a neighborhood analysis, obtaining edges E connecting these nodes based on a flood-fill algorithm, and recording edge lengths in a distance matrix M.sub.dist, to obtain a topological map G={V, E, M.sub.dist} of the passable region; 2) obtaining a Next-Best-View and planning a global path from the robot to the Next-Best-View according to 2.1)-2.5): 2.1) obtaining an edge e closest to a current location of the robot in the topological map G and two nodes v.sub.e.sup.1 and v.sub.e.sup.2 of the edge e
by taking all leaf nodes V.sub.leaf in the topological map G as initial candidate frontiers, and querying two paths R.sub.1={v.sub.e.sup.1, r.sub.0.sup.1, r.sub.1.sup.1, r.sub.2.sup.1, . . . , p} and R.sub.2={v.sub.e.sup.2, r.sub.0.sup.2, r.sub.1.sup.2, r.sub.2.sup.2, . . . , p} from each candidate point p to the v.sub.e.sup.1 and the v.sub.e.sup.2 respectively in the topological map based on a Dijkstra algorithm; 2.2) performing curve simplification for the two paths obtained in 2.1) based on a Douglas-Peucker algorithm, then calculating lengths of two simplified paths as D(p, v.sub.e.sup.1) and D(p, v.sub.e.sup.2) respectively, selecting a minimum value as a distance D(p)=min{D(p, v.sub.e.sup.1), D(p, v.sub.e.sup.2)} from the p to the robot, and calculating a sum T(p) of turning angles of a path corresponding to the distance; 2.3) calculating an estimate A(p) of a potential environmental information gain of each candidate point p in the occupancy probabilistic grid map, and calculating a perception degree C(p) of a sensor of the candidate point; 2.4) performing normalization for four feature values D(p), T(p), A(p) and C(p) of the candidate point calculated in 2.2) and 2.3), obtaining a candidate point set P.sub.candidate by excluding the candidate points with extremely low scores based on a low threshold, and completing the exploration when the candidate point set is an empty set; and 2.5) evaluating each candidate point in the candidate point set P.sub.candidate by a Multi-Criteria-Decision-Making approach based on a fuzzy measure function, taking the candidate point with the highest score as the Next-Best-View p.sub.NBV, and obtaining the global path R={r.sub.0, r.sub.1, r.sub.2, . . . , p.sub.NBV} from the current location of the robot to the Next-Best-View by tracing back in the result of 2.2); and 3) navigating to the Next-Best-View along the global path R={r.sub.0, r.sub.1, r.sub.2, . . .
, p.sub.NBV} based on a visual force field (VFF) algorithm according to 3.1)-3.3): 3.1) letting a current navigation target point be r.sub.0, performing motion planning in real time according to laser sensor data based on the VFF algorithm, transmitting motion planning instructions to the robot, so that the robot starts to move toward the current navigation target point r.sub.0 until the robot reaches r.sub.0; 3.2) when the robot reaches a critical point r.sub.i in the global path, letting the current navigation target point be r.sub.i+1, performing motion planning in real time based on the VFF algorithm, so that the robot continues moving to the r.sub.i+1 until the robot reaches the Next-Best-View p.sub.NBV; and 3.3) returning to 1), and starting to query the next Next-Best-View for the next stage of exploration; wherein the sum T(p) of the turning angles in 2.2) is calculated based on the following formula: T(p)=Σ.sub.i=1.sup.M−2 θ.sub.i, wherein: θ.sub.i=π−∠D.sub.i−1D.sub.iD.sub.i+1, which refers to a degree of an i-th turning angle.

2. The method of claim 1, wherein the perception degree C(p) of the sensor in 2.3) is calculated based on the following formulas: x.sub.uc=argmin.sub.x∈UC Dist(p, x), x.sub.c=argmin.sub.x∈C Dist(p, x), and C(p)=−Dist(p, x.sub.uc) if p∈C, or C(p)=Dist(p, x.sub.c) if p∈UC, wherein: a set UC refers to a region undetected by another sensor, a set C refers to a region already detected by another sensor, x.sub.uc refers to a pixel point that is closest to the candidate point p and undetected by another sensor, and x.sub.c refers to a point that is closest to the candidate point p and already detected by another sensor.

3. The method of claim 1, wherein the normalization in 2.4) is performed for the four feature values D(p), T(p), A(p) and C(p) of the candidate point based on the following formulas: u.sub.A(p)=(A(p)−min.sub.q∈CP A(q))/(max.sub.q∈CP A(q)−min.sub.q∈CP A(q)), u.sub.D(p)=1−(D(p)−min.sub.q∈CP D(q))/(max.sub.q∈CP D(q)−min.sub.q∈CP D(q)), u.sub.T(p)=1−(T(p)−min.sub.q∈CP T(q))/(max.sub.q∈CP T(q)−min.sub.q∈CP T(q)), and u.sub.C(p)=(C(p)−min.sub.q∈CP C(q))/(max.sub.q∈CP C(q)−min.sub.q∈CP C(q)).

4. The method of claim 1, wherein, in 2.5), the calculation of the score of candidate point p involves the following formula:
F(p)=Σ.sub.j=1.sup.n(u.sub.j(p)−u.sub.j−1(p))μ(A.sub.j), wherein: a sub-set A.sub.j⊂F is defined as A.sub.j={F.sub.m, m∈N|u.sub.j(p)≤u.sub.m(p)≤u.sub.n(p)}, μ is a normalized fuzzy metric function used for defining a weight of a group of features, and feature score vector sequences of the candidate points p are sorted in an ascending order according to normalized numerical values of the candidate points, that is, 0=u.sub.0(p)≤u.sub.1(p)≤u.sub.2(p)≤ . . . ≤u.sub.n(p)≤1.

Description

BRIEF DESCRIPTION OF THE DRAWINGS

(1) FIG. 1 is a flowchart illustrating indoor autonomous robotic exploration according to an example of the disclosure.

(2) FIG. 2 illustrates a main flow of an autonomous exploration method based on a Next-Best-View according to an example of the disclosure.

(3) FIG. 3 illustrates a main flow of constructing an environmental topological map from an occupancy probabilistic grid map based on a reduced approximated generalized Voronoi graph according to an example of the disclosure.

(4) FIG. 4A is an occupancy probabilistic grid map according to an example of the disclosure.

(5) FIG. 4B is a schematic diagram illustrating a smooth obstacle region according to an example of the disclosure.

(6) FIG. 4C is a schematic diagram illustrating a smooth passable region according to an example of the disclosure.

(7) FIG. 5 is an environmental topological map based on a reduced approximated generalized Voronoi graph according to an example of the disclosure.

(8) FIG. 6A is a schematic diagram illustrating a global path calculation method according to an example of the disclosure.

(9) FIG. 6B is a schematic diagram illustrating a path curvature calculation method according to an example of the disclosure.

(10) FIG. 7A is a schematic diagram of calculating an estimate A(p) of a potential environmental information gain of each candidate point p by a ray casting method according to an example of the disclosure.

(11) FIG. 7B is a schematic diagram illustrating a coverage of another sensor carried by a robot and calculating a perception degree C(p) of the sensor of the robot according to an example of the disclosure.

(12) FIG. 8 is a schematic diagram illustrating an indoor autonomous robotic exploration process according to an example of the disclosure.

DETAILED DESCRIPTION OF THE EMBODIMENTS

(13) To further illustrate, embodiments detailing an autonomous robotic exploration method based on a reduced approximated generalized Voronoi graph are described below. It should be noted that the following embodiments are intended to describe and not to limit the disclosure.

(14) An example of the disclosure provides an autonomous exploration method of a mobile robot applicable to an indoor scenario. The indoor autonomous exploration method of the mobile robot is designed and implemented by generating a topological map from an occupancy probabilistic grid map and combining the respective advantages of the two environmental maps. The example is implemented through the following technical solution: a Next-Best-View is obtained by constructing a reduced approximated generalized Voronoi topological map based on a morphological method, a global path from a current location of the robot to the Next-Best-View is planned, and the robot is navigated to the Next-Best-View along the global path R={r.sub.0, r.sub.1, r.sub.2, . . . , p.sub.NBV}, as shown in FIG. 1. By fully utilizing the features of the reduced approximated generalized Voronoi graph, the method is applied to the indoor autonomous exploration task of the mobile robot, and the autonomous exploration problem is converted from a two-dimensional plane space into a topological map space, thereby greatly reducing the number of candidate frontiers and decreasing the calculation amount of the path planning. Further, the generated global path is ensured to be an optimal collision-free path. Compared with other methods, the method of the disclosure enables indoor autonomous robotic exploration to be faster, the reduced approximated generalized Voronoi graph is more flexible in specific applications, and the requirements of the autonomous exploration task of the mobile robot are better satisfied.

(15) In a specific implementation, an autonomous robotic exploration method based on a reduced approximated generalized Voronoi graph is applicable to tasks such as indoor surveying and indoor search and rescue of the mobile robot in an unknown indoor space. The method includes the following blocks.

(16) At block S1, a reduced approximated generalized Voronoi topological map is constructed based on a morphological method as shown in FIG. 3, which includes the following sub-blocks.

(17) At sub-block S1.1, threshold extraction is performed for an occupancy probabilistic grid map as shown in FIG. 4A, where pixels with a gray scale smaller than an empirical threshold th.sub.obstacle are extracted as an obstacle region, and the th.sub.obstacle usually is 50-70; and pixels with a gray scale greater than an empirical threshold th.sub.free are extracted as a passable region, and the th.sub.free usually is 200-220.
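By way of a non-limiting illustration, the threshold extraction of sub-block S1.1 may be sketched as follows in Python. The concrete thresholds 60 and 210 are hypothetical picks from the ranges 50-70 and 200-220 given above, and the function name is illustrative only, not part of the disclosure.

```python
import numpy as np

# Hypothetical thresholds chosen from the claimed ranges 50-70 and 200-220.
TH_OBSTACLE = 60
TH_FREE = 210

def threshold_grid_map(grid: np.ndarray):
    """Split a uint8 occupancy probabilistic grid map into boolean
    obstacle and passable masks by gray-scale thresholding (S1.1)."""
    obstacle = grid < TH_OBSTACLE   # dark pixels: occupied cells
    passable = grid > TH_FREE       # bright pixels: free cells
    # Pixels between the two thresholds stay unclassified (unknown space).
    return obstacle, passable

# Tiny illustrative map: 0/40 occupied, 255/230/215 free, rest unknown.
grid = np.array([[0, 128, 255],
                 [40, 205, 230],
                 [70, 215, 90]], dtype=np.uint8)
obstacle, passable = threshold_grid_map(grid)
```

Cells whose gray value falls between the two thresholds belong to neither mask, which matches the three-way partition (obstacle, unknown, passable) implied by the grid map.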

(18) At sub-block S1.2, small-gap filling is performed for the obstacle region and the passable region generated at the above sub-block S1.1 by using a morphological closing operation, a pixel block with the number of pixels smaller than an empirical threshold th.sub.conn is removed based on a connectivity analysis, and the th.sub.conn is determined according to a resolution and an application requirement of the occupancy probabilistic grid map. In this example, one pixel of the occupancy probabilistic grid map represents a region of 0.05 m×0.05 m, and the th.sub.conn is 40. A convex part of a pixel block boundary is removed by smoothing filtering, so that a smooth obstacle region as shown in FIG. 4B and a smooth passable region as shown in FIG. 4C are obtained.
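One possible realization of sub-block S1.2 using the SciPy `ndimage` routines is sketched below. It applies a morphological closing, then drops connected components smaller than th.sub.conn; the 3×3 structuring element and the toy threshold are assumptions for illustration, and the boundary-smoothing filter is omitted.

```python
import numpy as np
from scipy import ndimage

def clean_region(mask: np.ndarray, th_conn: int) -> np.ndarray:
    """Fill small gaps with a morphological closing, then remove
    connected components with fewer than th_conn pixels (S1.2 sketch)."""
    closed = ndimage.binary_closing(mask, structure=np.ones((3, 3)))
    labeled, n = ndimage.label(closed)                  # 4-connected labeling
    sizes = ndimage.sum(closed, labeled, range(1, n + 1))
    keep = np.zeros_like(closed)
    for i, size in enumerate(sizes, start=1):
        if size >= th_conn:                             # keep large blocks only
            keep |= labeled == i
    return keep

# Toy mask: one 5x5 region (25 px) plus an isolated corner pixel.
mask = np.zeros((9, 9), dtype=bool)
mask[2:7, 2:7] = True
mask[0, 0] = True
out = clean_region(mask, th_conn=10)
```

In the example of the description, one pixel covers 0.05 m × 0.05 m and th.sub.conn is 40, so components smaller than 0.1 m² would be discarded.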

(19) At sub-block S1.3, the reduced approximated generalized Voronoi diagram is obtained by performing center line extraction for the smooth passable region at sub-block S1.2 based on an image thinning algorithm of eroding edge pixels, where the Voronoi diagram is in the form of a continuous network graph with a width being 1 on a grid image.

(20) At sub-block S1.4, topologization is performed for the reduced approximated generalized Voronoi graph obtained at sub-block S1.3, a node set V in the reduced approximated generalized Voronoi graph is extracted by adopting a neighborhood analysis, edges E connecting these nodes are obtained based on a flood-fill algorithm, and edge lengths are recorded in a distance matrix M.sub.dist, to obtain a topological map G={V, E, M.sub.dist} of the passable region as shown in FIG. 5.
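As a non-limiting sketch of the neighborhood analysis in sub-block S1.4 (node extraction only; the flood-fill edge tracing and distance matrix M.sub.dist are omitted), a pixel of the width-1 skeleton with exactly one 8-neighbor is a leaf node, while a pixel with three or more 8-neighbors is a junction node:

```python
import numpy as np

def extract_nodes(skeleton: np.ndarray):
    """Neighborhood analysis on a width-1 skeleton: classify each set
    pixel by its number of set 8-neighbors (its degree). Degree 1 marks
    a leaf node, degree >= 3 a junction; degree 2 lies on an edge."""
    h, w = skeleton.shape
    leaves, junctions = [], []
    for y in range(h):
        for x in range(w):
            if not skeleton[y, x]:
                continue
            y0, y1 = max(0, y - 1), min(h, y + 2)
            x0, x1 = max(0, x - 1), min(w, x + 2)
            degree = int(skeleton[y0:y1, x0:x1].sum()) - 1  # exclude pixel itself
            if degree == 1:
                leaves.append((y, x))
            elif degree >= 3:
                junctions.append((y, x))
    return leaves, junctions

# Toy skeleton: a single horizontal center line with two endpoints.
skel = np.zeros((3, 5), dtype=bool)
skel[1, :] = True
leaves, junctions = extract_nodes(skel)
```

The leaf nodes found here are exactly the initial candidate frontiers V.sub.leaf used at sub-block S2.1.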

(21) At block S2, a Next-Best-View is obtained, and a global path from the robot to the Next-Best-View is planned, where the block S2 includes the following sub-blocks.

(22) At sub-block S2.1, all leaf nodes V.sub.leaf in the topological map G are taken as initial candidate frontiers, an edge e closest to the current location of the robot and two nodes v.sub.e.sup.1 and v.sub.e.sup.2 of the edge e in the G are obtained, and two paths R.sub.1={v.sub.e.sup.1, r.sub.0.sup.1, r.sub.1.sup.1, r.sub.2.sup.1, . . . , p} and R.sub.2={v.sub.e.sup.2, r.sub.0.sup.2, r.sub.1.sup.2, r.sub.2.sup.2, . . . , p} from each candidate point p to the v.sub.e.sup.1 and the v.sub.e.sup.2 are queried respectively in the topological map based on a Dijkstra algorithm.
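The two path queries of sub-block S2.1 can be served by a single-source Dijkstra run from each endpoint of the edge e. A minimal sketch over a hypothetical toy topological map (the node names and edge lengths are invented for illustration) follows:

```python
import heapq

def dijkstra(adj, src):
    """Shortest-path distances and predecessors over a topological map
    G = {V, E, M_dist}, given as an adjacency dict
    adj[u] = [(v, edge_length), ...]. Sketch of sub-block S2.1."""
    dist = {src: 0.0}
    prev = {}
    heap = [(0.0, src)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue  # stale heap entry
        for v, w in adj.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v], prev[v] = nd, u
                heapq.heappush(heap, (nd, v))
    return dist, prev

# Hypothetical map: edge endpoints v1, v2 and leaf candidates p1, p2.
adj = {
    "v1": [("v2", 1.0), ("p1", 4.0)],
    "v2": [("v1", 1.0), ("p1", 2.0), ("p2", 5.0)],
    "p1": [("v1", 4.0), ("v2", 2.0)],
    "p2": [("v2", 5.0)],
}
dist, prev = dijkstra(adj, "v1")
```

Running it once from v.sub.e.sup.1 and once from v.sub.e.sup.2 yields the paths R.sub.1 and R.sub.2 to every candidate point p; the `prev` map is what sub-block S2.5 traces back through to recover the global path.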

(23) At sub-block S2.2, curve simplification is performed for the two paths obtained at sub-block S2.1 based on a Douglas-Peucker algorithm, and lengths of the two simplified paths are then calculated as D(p, v.sub.e.sup.1) and D(p, v.sub.e.sup.2) respectively, as shown in FIG. 6A; a minimum value is taken as a distance D(p)=min{D(p, v.sub.e.sup.1), D(p, v.sub.e.sup.2)} from the p to the robot, and a sum T(p) of turning angles of the path corresponding to the distance is calculated as shown in FIG. 6B based on the following formula:
T(p)=Σ.sub.i=1.sup.M−2θ.sub.i.

(24) In the above formula, θ.sub.i=π−∠D.sub.i−1D.sub.iD.sub.i+1, which refers to a degree of the i-th turning angle.
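Sub-block S2.2 can be sketched as follows: a recursive Douglas-Peucker simplification followed by the turning-angle sum T(p). The deflection angle between consecutive segment directions equals π−∠D.sub.i−1D.sub.iD.sub.i+1, so it is computed directly here; the tolerance `eps` is a hypothetical parameter.

```python
import math

def douglas_peucker(points, eps):
    """Recursive Douglas-Peucker curve simplification (S2.2 sketch)."""
    if len(points) < 3:
        return list(points)
    (x1, y1), (x2, y2) = points[0], points[-1]

    def chord_dist(p):
        # Perpendicular distance from p to the chord endpoints->endpoints.
        px, py = p
        num = abs((y2 - y1) * px - (x2 - x1) * py + x2 * y1 - y2 * x1)
        den = math.hypot(x2 - x1, y2 - y1) or 1.0
        return num / den

    idx = max(range(1, len(points) - 1), key=lambda i: chord_dist(points[i]))
    if chord_dist(points[idx]) <= eps:
        return [points[0], points[-1]]
    left = douglas_peucker(points[: idx + 1], eps)
    right = douglas_peucker(points[idx:], eps)
    return left[:-1] + right

def turning_angle_sum(path):
    """T(p): sum over interior vertices of the deflection angle
    pi - angle(D_{i-1} D_i D_{i+1})."""
    total = 0.0
    for a, b, c in zip(path, path[1:], path[2:]):
        v1 = (b[0] - a[0], b[1] - a[1])
        v2 = (c[0] - b[0], c[1] - b[1])
        ang = math.atan2(v2[1], v2[0]) - math.atan2(v1[1], v1[0])
        total += abs(math.atan2(math.sin(ang), math.cos(ang)))  # wrap to [0, pi]
    return total

simp = douglas_peucker([(0, 0), (1, 0), (2, 0), (2, 1)], eps=0.1)
```

On this toy path the collinear point (1, 0) is dropped and T(p) over the simplified path is a single right-angle turn, π/2.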

(25) At sub-block S2.3, an estimate A(p) of a potential environmental information gain of each candidate point p is calculated in the occupancy probabilistic grid map by a ray casting method as shown in FIG. 7A; FIG. 7B illustrates a coverage of another sensor (such as a panorama camera, a laser radar or a bio-detector) carried by the robot, and the perception degree C(p) of the sensor is calculated based on the following formulas.

(26) x.sub.uc=argmin.sub.x∈UC Dist(p, x), x.sub.c=argmin.sub.x∈C Dist(p, x), and C(p)=−Dist(p, x.sub.uc) if p∈C, or C(p)=Dist(p, x.sub.c) if p∈UC.

(27) In the above formulas, a set UC refers to a region undetected by another sensor, a set C refers to a region already detected by another sensor, x.sub.uc refers to a pixel point that is closest to the candidate point p and undetected by another sensor, and x.sub.c refers to a point that is closest to the candidate point p and already detected by another sensor.
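The perception degree is thus a signed distance to the coverage boundary: negative inside already-detected space, positive in still-undetected space. An illustrative sketch (brute-force nearest-cell search over a hypothetical 10×10 grid; a real implementation would use a distance transform):

```python
import math

def perception_degree(p, covered, uncovered):
    """Perception degree C(p): signed distance to the nearest cell of
    opposite coverage status. `covered` is the set C of cells already
    seen by the other sensor; `uncovered` is the set UC."""
    if p in covered:
        # C(p) = -Dist(p, x_uc): candidate sits in already-covered space.
        return -min(math.dist(p, x) for x in uncovered)
    # C(p) = Dist(p, x_c): candidate sits in still-uncovered space.
    return min(math.dist(p, x) for x in covered)

# Hypothetical 10x10 grid whose lower half (y < 5) is already covered.
covered = {(x, y) for x in range(10) for y in range(5)}
uncovered = {(x, y) for x in range(10) for y in range(5, 10)}
```

After the sign-preserving normalization at sub-block S2.4, candidates deep inside uncovered territory therefore score highest on this criterion.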

(28) At sub-block S2.4, a feature score vector U(p.sub.i)=(u.sub.1(p.sub.i), u.sub.2(p.sub.i), . . . , u.sub.n(p.sub.i)) of each candidate point is constructed by performing, based on the following formulas, normalization for the four feature values D(p), T(p), A(p) and C(p) of the candidate points calculated at sub-blocks S2.2 and S2.3, and a candidate point set P.sub.candidate is obtained by excluding the candidate points with extremely low scores based on a low threshold. The exploration is completed if the candidate point set is an empty set.

(29) u.sub.A(p)=(A(p)−min.sub.q∈CP A(q))/(max.sub.q∈CP A(q)−min.sub.q∈CP A(q)), u.sub.D(p)=1−(D(p)−min.sub.q∈CP D(q))/(max.sub.q∈CP D(q)−min.sub.q∈CP D(q)), u.sub.T(p)=1−(T(p)−min.sub.q∈CP T(q))/(max.sub.q∈CP T(q)−min.sub.q∈CP T(q)), and u.sub.C(p)=(C(p)−min.sub.q∈CP C(q))/(max.sub.q∈CP C(q)−min.sub.q∈CP C(q)).
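The normalization above is a plain min-max scaling over the candidate set CP, with the cost-like features D and T flipped so that larger is always better. A minimal sketch with hypothetical feature values for two candidates:

```python
def normalize_features(feats):
    """Min-max normalize candidate features per sub-block S2.4. D and T
    are costs (smaller is better), so their scores are flipped as
    1 - (...). feats: {candidate: {"A":..., "D":..., "T":..., "C":...}}."""
    keys = ("A", "D", "T", "C")
    lo = {k: min(f[k] for f in feats.values()) for k in keys}
    hi = {k: max(f[k] for f in feats.values()) for k in keys}

    def mm(k, v):
        span = hi[k] - lo[k]
        return (v - lo[k]) / span if span else 0.0

    return {
        p: {"A": mm("A", f["A"]),
            "D": 1.0 - mm("D", f["D"]),
            "T": 1.0 - mm("T", f["T"]),
            "C": mm("C", f["C"])}
        for p, f in feats.items()
    }

# Hypothetical feature values for two candidate points.
feats = {"p1": {"A": 10.0, "D": 2.0, "T": 0.5, "C": -1.0},
         "p2": {"A": 30.0, "D": 6.0, "T": 1.5, "C": 3.0}}
scores = normalize_features(feats)
```

The resulting scores lie in [0, 1] for every criterion, which is what the low-score threshold of S2.4 and the fuzzy-measure evaluation of S2.5 operate on.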

(30) At sub-block S2.5, each candidate point in the candidate point set P.sub.candidate is evaluated based on the following formula by a Multi-Criteria-Decision-Making approach based on a fuzzy measure function, the candidate point with the highest score is taken as the Next-Best-View p.sub.NBV, and the global path R={r.sub.0, r.sub.1, r.sub.2, . . . , p.sub.NBV} from the current location of the robot to the Next-Best-View is obtained by tracing back in the result of sub-block S2.2.
F(p)=Σ.sub.j=1.sup.n(u.sub.j(p)−u.sub.j−1(p))μ(A.sub.j)

(31) In the above formula, a sub-set A.sub.j⊂F is defined as A.sub.j={F.sub.m, m∈N|u.sub.j(p)≤u.sub.m(p)≤u.sub.n(p)}, μ is a normalized fuzzy metric function used for defining a weight of a group of features, and feature score vector sequences of the candidate points p are sorted in an ascending order according to normalized numerical values of the candidate points, that is, 0=u.sub.0(p)≤u.sub.1(p)≤u.sub.2(p)≤ . . . ≤u.sub.n(p)≤1.
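This F(p) is a Choquet-integral style aggregation: scores are sorted ascending, and each increment u.sub.j−u.sub.j−1 is weighted by the measure of the features still at or above that level. The sketch below uses a hypothetical additive measure (per-feature weights summed over the subset), under which the integral reduces to an ordinary weighted sum; the weights are invented for illustration.

```python
def choquet_integral(scores, mu):
    """Score a candidate by the Choquet integral w.r.t. fuzzy measure mu
    (sketch of sub-block S2.5). scores: {feature: value in [0, 1]};
    mu: callable mapping a frozenset of features to its weight, with
    mu(empty set) = 0 and mu(all features) = 1."""
    items = sorted(scores.items(), key=lambda kv: kv[1])  # ascending u_j
    total, prev = 0.0, 0.0
    for j, (_feat, u) in enumerate(items):
        subset = frozenset(f for f, _ in items[j:])  # A_j: features >= u_j
        total += (u - prev) * mu(subset)
        prev = u
    return total

# Hypothetical additive measure: mu(S) is the sum of per-feature weights,
# under which the Choquet integral collapses to a weighted sum.
weights = {"A": 0.4, "D": 0.3, "T": 0.1, "C": 0.2}
mu = lambda s: sum(weights[f] for f in s)
score = choquet_integral({"A": 0.5, "D": 1.0, "T": 0.0, "C": 0.5}, mu)
```

A non-additive μ is where the approach earns its keep: it can reward or penalize specific feature combinations, which a fixed-weight sum cannot express.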

(32) At block S3, navigation is performed to the Next-Best-View along the global path R={r.sub.0, r.sub.1, r.sub.2, . . . , p.sub.NBV} based on a local obstacle avoidance algorithm of a visual force field (VFF), which specifically includes the following sub-blocks.

(33) At sub-block S3.1, the current navigation target point is set to r.sub.0, motion planning is performed in real time according to laser sensor data based on the VFF algorithm, and motion planning instructions are transmitted to the robot, so that the robot starts to move toward the current navigation target point r.sub.0 until it reaches the point r.sub.0.

(34) At sub-block S3.2, when the robot reaches a critical point r.sub.i in the global path, the current navigation target point is set to r.sub.i+1. The motion planning is performed in real time based on the VFF algorithm, so that the robot continues moving to r.sub.i+1 until it reaches the Next-Best-View p.sub.NBV.
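A force-field style planning step of the kind used in S3.1 and S3.2 can be sketched as follows. This is a generic potential-field sketch, not the patented VFF implementation: the gains `f_att` and `f_rep` and the inverse-square repulsion law are assumptions for illustration.

```python
import math

def vff_step(pos, target, obstacles, f_att=1.0, f_rep=2.0):
    """One motion-planning step of a force-field style local planner.
    Obstacle cells push the robot away with magnitude decaying as 1/r^2;
    the current navigation target r_i pulls with constant magnitude.
    Returns a unit heading vector for the next motion command."""
    dx, dy = target[0] - pos[0], target[1] - pos[1]
    d = math.hypot(dx, dy) or 1.0
    fx, fy = f_att * dx / d, f_att * dy / d        # attraction toward target
    for ox, oy in obstacles:                       # e.g. laser scan endpoints
        rx, ry = pos[0] - ox, pos[1] - oy
        r = math.hypot(rx, ry)
        if r > 0:
            fx += f_rep * rx / r ** 3              # 1/r^2 magnitude along unit vector
            fy += f_rep * ry / r ** 3
    m = math.hypot(fx, fy) or 1.0
    return fx / m, fy / m

heading = vff_step((0, 0), (5, 0), obstacles=[])
```

Each laser scan updates the obstacle list, so the heading is recomputed in real time while the robot tracks the current target point r.sub.i of the global path.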

(35) At sub-block S3.3, a return is made to block S1 to start to query the next Next-Best-View for the next stage of exploration. FIG. 8 is a schematic diagram illustrating a progress of an indoor environmental exploration task according to an example of the disclosure.

(36) In the example, a method based on a reduced approximated generalized Voronoi graph is implemented. The reduced approximated generalized Voronoi graph may be regarded as a center line of the passable region in the map, and the topological map generated from the reduced approximated generalized Voronoi graph can express the connectivity and reachability of the entire passable region. On this basis, redundant edges and nodes are culled from the reduced approximated generalized Voronoi graph proposed in the example, so that the entire topological map can express almost the same environmental information with a smaller number of edges and nodes. The leaf nodes of the reduced approximated generalized Voronoi graph may be taken as candidate frontiers, and the edges may be taken as an initial space of the path planning. Thus, the number of targets to be calculated is reduced, the spatial and time complexity of the algorithm is greatly lowered, and the functional, real-time, and completeness requirements of indoor autonomous exploration of the mobile robot are satisfied.

(37) It will be obvious to those skilled in the art that changes and modifications may be made, and therefore, the aim in the appended claims is to cover all such changes and modifications.