Robot and method for controlling the same
10976749 · 2021-04-13
CPC classification
Y10S901/01
GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
Y10S901/49
G05D1/0088
PHYSICS
Abstract
A robot and a method for controlling the robot are provided, each of which prevents the robot from hitting an obstacle when autonomous movement of the robot is controlled using open-source software (OSS). The method controls a robot that includes a plurality of rangefinder sensors and is configured to move autonomously based on a map created using one or more measurement data portions of the plurality of rangefinder sensors. The method includes: deleting some of the one or more measurement data portions of the plurality of rangefinder sensors; integrating the one or more measurement data portions remaining after the deleting into an integrated measurement data portion; and determining the integrated measurement data portion to be a measurement data portion of one or more virtual rangefinder sensors which are fewer in number than the plurality of rangefinder sensors.
Claims
1. A method for controlling a robot including a plurality of rangefinder sensors, the method comprising: plotting a plurality of measurement points of the plurality of rangefinder sensors on a range-finding range of a virtual rangefinder sensor; dividing the range-finding range of the virtual rangefinder sensor into a plurality of divided measurement areas; for each of the plurality of divided measurement areas, (i) selecting a measurement point in the divided measurement area, (ii) deleting remaining measurement points in the divided measurement area other than the selected measurement point, and (iii) determining the selected measurement point as a measurement point of the virtual rangefinder sensor; and creating a map based on the measurement point of the virtual rangefinder sensor in each of the plurality of divided measurement areas, wherein the range-finding range of the virtual rangefinder sensor is circular, the plurality of divided measurement areas are obtained by equally dividing the range-finding range of the virtual rangefinder sensor according to a predetermined angle, and the robot is configured to move autonomously based on the map.
2. The method according to claim 1, wherein, in each of the divided measurement areas, the deleted remaining measurement points are unmeasurable at a position of the virtual rangefinder sensor.
3. The method according to claim 1, wherein for each of the plurality of divided measurement areas, the selected measurement point in the divided measurement area is a measurement point, from among measurement points in the divided measurement area, having a shortest distance to a position of the virtual rangefinder sensor.
4. The method according to claim 1, wherein each of the divided measurement areas has a wedge shape having a center angle identical to the predetermined angle.
5. The method according to claim 1, wherein a current position of the robot is estimated using one or more of the measurement point of the virtual rangefinder sensor for each of the plurality of divided measurement areas.
6. A robot including a plurality of rangefinder sensors, the robot comprising a controller configured to: plot a plurality of measurement points of the plurality of rangefinder sensors on a range-finding range of a virtual rangefinder sensor; divide the range-finding range of the virtual rangefinder sensor into a plurality of divided measurement areas; for each of the plurality of divided measurement areas, (i) select a measurement point in the divided measurement area, (ii) delete remaining measurement points in the divided measurement area other than the selected measurement point, and (iii) determine the selected measurement point as a measurement point of the virtual rangefinder sensor; and create a map based on the measurement point of the virtual rangefinder sensor in each of the plurality of divided measurement areas, wherein the range-finding range of the virtual rangefinder sensor is circular, the plurality of divided measurement areas are obtained by equally dividing the range-finding range of the virtual rangefinder sensor according to a predetermined angle, and the robot is configured to move autonomously based on the map.
Description
BRIEF DESCRIPTION OF DRAWINGS
DESCRIPTION OF EMBODIMENTS
(17) Hereinafter, a description will be given of an embodiment of the present invention with reference to the accompanying drawings.
(18) First, a description will be given of a configuration of mobile robot 100 according to an embodiment of the present invention with reference to
(19) As illustrated in
(20) Furthermore, as illustrated in
(21) Sensor integration section 11 integrates a measurement data portion obtained by measurement using laser rangefinder sensor 4R and a measurement data portion obtained by measurement using laser rangefinder sensor 4L. The measurement data portion resulting from the integration is referred to as an “integrated measurement data portion,” hereinafter.
(22) Map section 12 creates a map of the region (space) in which mobile robot 100 moves, and stores (hereinafter, "manages") the map.
(23) Destination setting section 13 sets coordinates of a specific position in a map as a moving destination (destination) of mobile robot 100. The set coordinates of the moving destination are referred to as “target coordinates,” hereinafter.
(24) Self-localization section 14 estimates a current position (self-location) of mobile robot 100 based on the integrated measurement data portion and the map. The coordinates of the estimated position are referred to as “current position coordinates,” hereinafter.
(25) Intelligence section 15 creates a path along which mobile robot 100 moves, based on the integrated measurement data portion, the target coordinates, and the current position coordinates, and recognizes obstacles on the path.
(26) Drive-wheel control section 16 controls drive wheels 1 such that mobile robot 100 moves (runs) on the path created by intelligence section 15.
(27) Mobile robot 100 runs (moves forward) in a direction of arrow “b” in
(28) Furthermore, mobile robot 100 is capable of running in a direction opposite to arrow “b” (running backward) by rotating the pair of drive wheels 1 in a direction opposite to arrow “a.”
(29) Moreover, the pair of drive wheels 1 are configured to be individually rotatable. Accordingly, by rotating the pair of drive wheels in directions opposite to each other, mobile robot 100 can perform a spin turn.
(30) Laser rangefinder sensors 4R and 4L are provided on right and left sides with respect to the traveling direction of mobile robot 100, respectively. It is favorable that laser rangefinder sensors 4R and 4L are both provided in a forward most position of mobile robot main body 3 as illustrated in
(31) The configuration of mobile robot 100 has been described thus far.
(32) Next, a description will be given of an exemplary map created and stored by map section 12, with reference to
(33) In occupancy grid map 200, a region in which mobile robot 100 moves is partitioned into square grids as illustrated in the diagram indicated by (a) in
(34) Next, a method for creating occupancy grid map 200 illustrated in
(35)
(36) In
(37) Moreover, in
(38) Note that, in a case where a laser beam emitted from laser rangefinder sensor 10 does not intersect obstacle 9 within a certain distance, neither measurement point 10b nor line segment data portion 10c of the laser rangefinder sensor exists.
(39)
(40) All the grids included in occupancy grid map 200 initially belong to unknown region 6. Map section 12 changes a grid through which line segment data portion 10c passes to movable region 8, and changes a grid including measurement point 10b of a laser rangefinder sensor to obstacle region 7. Map section 12 performs this processing continuously with time as with "t," "t+Δt," and "t+Δt+1." In this manner, occupancy grid map 200 is created.
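The map-creation rule of paragraph (40) can be sketched in a few lines of Python. This is a minimal illustration only, not the patented implementation: the dictionary-based grid, the numeric cell states, and the Bresenham line tracing used to find the cells crossed by a line segment data portion are all assumptions of this sketch.

```python
# Cell states of the occupancy grid (terminology from the description:
# unknown region 6, obstacle region 7, movable region 8).
UNKNOWN, OBSTACLE, MOVABLE = 0, 1, 2

def bresenham(x0, y0, x1, y1):
    """Grid cells on the line segment from (x0, y0) to (x1, y1)."""
    cells = []
    dx, dy = abs(x1 - x0), abs(y1 - y0)
    sx = 1 if x0 < x1 else -1
    sy = 1 if y0 < y1 else -1
    err = dx - dy
    x, y = x0, y0
    while True:
        cells.append((x, y))
        if (x, y) == (x1, y1):
            break
        e2 = 2 * err
        if e2 > -dy:
            err -= dy
            x += sx
        if e2 < dx:
            err += dx
            y += sy
    return cells

def update_map(grid, origin, point):
    """Apply one measurement: cells crossed by the line segment from the
    sensor origin become movable; the cell containing the measurement
    point becomes an obstacle. Unvisited cells stay unknown."""
    for cell in bresenham(*origin, *point)[:-1]:
        grid[cell] = MOVABLE
    grid[point] = OBSTACLE
```

Repeating `update_map` for every measurement at times "t," "t+Δt," ... gradually converts unknown cells into movable or obstacle cells, which is how the occupancy grid map is built up.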
(41) Next, a description will be given of an operation flow during movement of mobile robot 100 with reference to
(42) In step S1, destination setting section 13 sets a destination (target coordinates) of mobile robot 100 in occupancy grid map 200 managed by map section 12.
(43) More specifically, destination setting section 13 sets any of the grids forming occupancy grid map 200 to be the destination. The grid set at this time has to belong to movable region 8.
(44) In step S2, intelligence section 15 identifies a current position (current position coordinates) of mobile robot 100, which is estimated by self-localization section 14 in occupancy grid map 200 managed by map section 12, and calculates a path from the current position of mobile robot 100 to the destination of mobile robot 100, which has been set in step S1.
(45) More specifically, intelligence section 15 searches for a grid corresponding to the current position among the grids forming occupancy grid map 200, and sets the found grid to be the current position. Intelligence section 15 then calculates a path from the grid set as the current position to the grid set as the destination on occupancy grid map 200. In this calculation (creation) of the path, for example, the publicly known A* (A-star) algorithm is used, but the calculation technique is not limited to this, and another publicly known technique may be used.
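The path calculation of step S2 can be sketched as a small A* search over such a grid. This is a minimal illustration under assumptions not stated in the source: a dictionary-based grid whose movable cells have state 2, a 4-connected neighborhood, unit move cost, and a Manhattan-distance heuristic.

```python
import heapq

MOVABLE = 2  # assumed cell state for movable region 8

def a_star(grid, start, goal):
    """A* over 4-connected grid cells; `grid` maps (x, y) -> state and
    only MOVABLE cells may be entered. Returns a list of cells from
    start to goal, or None if no path exists."""
    def h(c):  # admissible Manhattan-distance heuristic
        return abs(c[0] - goal[0]) + abs(c[1] - goal[1])

    open_set = [(h(start), 0, start, [start])]  # (f, g, cell, path)
    best_g = {start: 0}
    while open_set:
        _, g, cell, path = heapq.heappop(open_set)
        if cell == goal:
            return path
        x, y = cell
        for nxt in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if grid.get(nxt) != MOVABLE:
                continue  # unknown or obstacle cells are not entered
            if g + 1 < best_g.get(nxt, float("inf")):
                best_g[nxt] = g + 1
                heapq.heappush(open_set, (g + 1 + h(nxt), g + 1, nxt, path + [nxt]))
    return None
```

Because the heuristic never overestimates the remaining distance, the returned path is a shortest path through the movable region.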
(46) In step S3, drive-wheel control section 16 controls drive wheels 1 such that mobile robot 100 moves (runs) along the created path. In this manner, mobile robot 100 moves toward the destination.
(47) During this movement, self-localization section 14 estimates the current position of mobile robot 100 as needed, and intelligence section 15 calculates the distance between the current position and the set destination and determines whether or not the distance has become equal to or less than a predetermined distance.
(48) In step S4, in a case where the distance between the current position and the destination becomes equal to or less than the predetermined distance, intelligence section 15 determines that mobile robot 100 has arrived at the destination. Drive-wheel control section 16 then stops driving of drive wheels 1, stopping the movement of mobile robot 100.
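The arrival determination of step S4 reduces to a simple distance test. A minimal sketch follows; the Euclidean distance metric and the particular threshold value are assumptions for illustration, as the source does not specify them.

```python
def reached_destination(current, target, threshold=0.5):
    """Arrival test of step S4: the robot is considered to have arrived
    when the distance from the current position to the target coordinates
    falls to the threshold or below (threshold value assumed)."""
    dx = target[0] - current[0]
    dy = target[1] - current[1]
    return (dx * dx + dy * dy) ** 0.5 <= threshold
```

The controller would call this test each time self-localization section 14 updates the current position, and stop the drive wheels once it returns true.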
(49) Next, a description will be given, using
(50)
(51) In
(52) Next, a description will be given of a flow of a method for integrating measurement data portions, using
(53) In step S11, sensor integration section 11 plots all measurement points 4Rb and 4Lb obtained by laser rangefinder sensors 4R and 4L, respectively, on a two-dimensional coordinate system. This is the two-dimensional coordinate system of the virtual laser rangefinder sensor.
(54) In step S12, sensor integration section 11 divides the range-finding range of the virtual laser rangefinder sensor on the two-dimensional coordinate system into a plurality of wedge-shaped portions. A specific example of this division will be described using
(55) In
(56) Measurement area 5d may include measurement points 4Rb and 4Lb of laser rangefinder sensors as illustrated in
(57) In step S13, in a case where a plurality of measurement points of laser rangefinder sensors are included in one wedge-shape (measurement area 5d), sensor integration section 11 selects one measurement point from among the plurality of measurement points.
(58) As indicated by (a) in
(59) Sensor integration section 11 selects the measurement point whose distance from origin 5a is shortest as the processing data for integration (the measurement data portion to be integrated). The other measurement points in the measurement area are deleted. The reason for this deletion is that such points cannot be measured from the position of the virtual laser rangefinder sensor in the first place; if they were kept while a plurality of measurement points are present in one measurement area, they would remain after the integration of the measurement data portions, causing an inconsistency.
(60) The processing of step S13 is performed only in a case where a plurality of measurement points of a laser rangefinder sensor are included in one measurement area 5d.
(61) In step S14, sensor integration section 11 changes (determines) the measurement point selected in step S13 in each measurement area 5d to be a measurement point of the virtual laser rangefinder sensor.
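Steps S11 through S14 can be sketched together as one function: plot the points of both sensors in the virtual sensor's coordinate system, bin them into equal wedge-shaped measurement areas by angle, and keep only the point nearest the origin in each wedge. This is a minimal sketch; the origin-centered frame, the sector count, and the use of `atan2` binning are assumptions of the illustration, not details given in the source.

```python
import math

def integrate_scans(points, num_sectors=360):
    """Integrate measurement points of several rangefinder sensors into
    the measurement points of a single virtual sensor at the origin.
    The circular range-finding range is divided into `num_sectors`
    equal wedges; per wedge, only the point nearest the origin
    survives (the others are deleted, as in step S13)."""
    sector_angle = 2 * math.pi / num_sectors
    nearest = {}  # sector index -> (distance, point)
    for x, y in points:
        sector = int((math.atan2(y, x) % (2 * math.pi)) / sector_angle)
        d = math.hypot(x, y)
        if sector not in nearest or d < nearest[sector][0]:
            nearest[sector] = (d, (x, y))
    # The surviving points are the measurement points of the
    # virtual rangefinder sensor (step S14).
    return [p for _, p in nearest.values()]
```

A farther point hidden behind a nearer one in the same wedge is exactly the kind of point that the virtual sensor could not physically measure, which is why it is discarded before the map is created.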
(62) Thereafter, an occupancy grid map is created based on the measurement points of the virtual laser rangefinder sensor resulting from the change, and autonomous movement of mobile robot 100 is controlled based on the occupancy grid map.
(63) Next, a comparative example will be described using
(64)
(65)
(66)
(67) Meanwhile,
(68)
(69) In
(70) As illustrated in
(71)
(72) A mobile robot cannot actually enter obstacle 9, but the presence of movable region 8 inside grid line "a" (corresponding to obstacle 9) allows the mobile robot to enter obstacle 9 in occupancy grid map 200 illustrated in
(73)
(74)
(75)
(76) In occupancy grid map 200 as illustrated in
(77) Note that, although a description has been given of an exemplary case where laser rangefinder sensors (laser rangefinders) are used as the rangefinder sensors in the embodiment described above, the rangefinder sensors may instead be ultrasound sensors, optical sensors, or camera sensors. In any of these cases, the performance of autonomous movement of a mobile robot increases.
(78) Furthermore, the integration method in
(79) As has been described in detail, in the present embodiment, in a case where a map is created using measurement data portions of a plurality of laser rangefinder sensors included in a mobile robot and the mobile robot autonomously moves based on the map, some of the measurement data portions of the plurality of laser rangefinder sensors are deleted, the measurement data portions remaining after the deletion are integrated into an integrated measurement data portion, and the integrated measurement data portion is determined (changed) to be a measurement data portion of one or more virtual rangefinder sensors fewer in number than the plurality of rangefinder sensors.
(80) Thus, even in a case where autonomous movement of a mobile robot is controlled using OSS, an accurate map can be created, and it is therefore possible to prevent the mobile robot from erroneously hitting an obstacle.
(81) The present invention is not limited to the description of the above embodiment and can be variously modified within a range not departing from the gist of the present invention.
INDUSTRIAL APPLICABILITY
(82) The present invention can be utilized in the fields of autonomous movement or automatic driving of a mobile object.
REFERENCE SIGNS LIST
(83)
1 Drive wheel
2 Quasi-wheel
3 Mobile robot main body
4R, 4L Laser rangefinder sensor
4Ra, 4La, 10a Origin of laser rangefinder sensor
4Rb, 4Lb, 10b Measurement point of laser rangefinder sensor
4Rc, 4Lc, 10c Line segment data portion
5a Origin of virtual laser rangefinder sensor
5b Measurement point of virtual laser rangefinder sensor
5c Line segment data portion of virtual laser rangefinder sensor
5d Measurement area
6 Unknown region
7 Obstacle region
8 Movable region
9 Obstacle
11 Sensor integration section
12 Map section
13 Destination setting section
14 Self-localization section
15 Intelligence section
16 Drive-wheel control section
17 Control section
20, 100 Mobile robot
21a, 21b, 21c Ultrasound sensor
22a, 22b, 22c Peripheral map
23 Integration map