HUMAN-MACHINE COOPERATIVE CONTROL SYSTEM AND HUMAN-MACHINE COOPERATIVE CONTROL METHOD
20240231381 · 2024-07-11
CPC classification
G05D2105/05 (PHYSICS)
Abstract
A system for presenting information to a human to avoid dangers such as collisions, without hindering the respective movement of the human and an autonomous machine. The human-machine cooperative control system exclusively manages each movable region so that the human and the autonomous machine do not collide. The system comprises a moving body position measurement unit comprising at least one sensor for measuring the position of moving bodies, including a human and a machine; a moving body motion prediction unit for predicting the future motion of a subject moving body on the basis of the measured moving body position; an exclusive management unit for planning a movable region for each moving body on the basis of a planned route for the unmanned machine and a moving body predicted motion obtained from the moving body motion prediction unit; and an information presentation unit for presenting information about the movable region to the human.
Claims
1. A human-machine cooperative control system that exclusively manages each movable area such that a person and an unmanned machine capable of autonomously moving do not collide with each other in a shared area, the human-machine cooperative control system comprising: a moving body position measurement unit that includes one or a plurality of sensors that measure a position of a moving body including the person and the unmanned machine; a moving body motion prediction unit that predicts a future motion of a target moving body from the position of the moving body measured by the moving body position measurement unit; an exclusion management unit that plans a movable area of each moving body based on a planned route of the unmanned machine and a moving body prediction motion obtained from the moving body motion prediction unit; and an information presentation unit that presents, to a target person, movable area information for a person among movable areas of moving bodies planned by the exclusion management unit.
2. The human-machine cooperative control system according to claim 1, further comprising an area deviation prevention function that transmits, to a target unmanned machine, movable area information for an unmanned machine among the movable areas of the moving bodies planned by the exclusion management unit, and limits traveling of the unmanned machine so as not to enter an area other than the movable area.
3. The human-machine cooperative control system according to claim 1, wherein the exclusion management unit plans the movable area of each moving body based on the planned route and/or the moving body prediction motion for a predetermined time length with respect to the planned route of the unmanned machine and the moving body prediction motion obtained from the moving body motion prediction unit.
4. The human-machine cooperative control system according to claim 1, wherein the exclusion management unit divides an exclusion management target area into a lattice of a predetermined size, and determines the movable area of each moving body so that the movable areas of the moving bodies do not overlap each other in any minimum-unit divided area (division area).
5. The human-machine cooperative control system according to claim 4, wherein, in a case where the planned route and/or the moving body prediction motion overlaps in the same division area, the exclusion management unit determines the division area as the movable area for one unmanned machine or moving body based on a predetermined priority order.
6. The human-machine cooperative control system according to claim 4, wherein, in a case where the planned route and/or the moving body prediction motion overlaps in the same division area, and in a case where the division area has already been allocated as a movable area of any unmanned machine or moving body, the exclusion management unit continuously determines the division area as the movable area with respect to the unmanned machine or the moving body.
7. The human-machine cooperative control system according to claim 1, wherein the exclusion management unit changes a size of a division area allocated as a movable area for each moving body based on attribute and action history information of the moving body.
8. The human-machine cooperative control system according to claim 1, wherein the exclusion management unit allocates, in addition to the movable area, a preferential area around the movable area in which a movable area is preferentially allocated to that moving body.
9. The human-machine cooperative control system according to claim 1, wherein the information presentation unit displays information by projecting an image or a video onto the ground with a projector.
10. The human-machine cooperative control system according to claim 1, wherein the information presentation unit displays information by switching light emission of light-emitting objects embedded in the ground.
11. The human-machine cooperative control system according to claim 1, wherein the information presentation unit displays information by drawing an image or a video on a display device held or worn by the person.
12. A human-machine cooperative control method for exclusively managing each movable area such that a person and an unmanned machine capable of autonomously moving do not collide with each other in a shared area, the human-machine cooperative control method comprising: measuring a position of a moving body including the person and the unmanned machine; predicting a future motion of a target moving body from the position of the moving body; planning a movable area of each moving body based on a planned route and a moving body prediction motion of the unmanned machine; and presenting, to a target person, movable area information for a person among movable areas of moving bodies.
13. The human-machine cooperative control method according to claim 12, wherein the movable area includes an occupied area set along a course of each moving body and a preferential area set around the occupied area, and when movable areas of a plurality of the moving bodies do not overlap each other, the plurality of the moving bodies are movable within the shared area without limitation.
14. The human-machine cooperative control method according to claim 13, wherein, when preferential areas of the plurality of moving bodies overlap each other, the plurality of moving bodies are set to be movable in the shared area in accordance with a priority order set in advance.
15. The human-machine cooperative control method according to claim 13, wherein, when occupied areas for the plurality of moving bodies overlap each other, the unmanned machine is stopped.
Description
BRIEF DESCRIPTION OF DRAWINGS
DESCRIPTION OF EMBODIMENTS
[0021] Hereinafter, an embodiment of the present invention will be described with reference to the drawings. Note that the following description shows specific examples of the contents of the present invention, and the present invention is not limited to these descriptions. Various changes and modifications can be made by those skilled in the art within the scope of the technical idea disclosed in this specification. In all the drawings for describing the present invention, components having the same function are denoted by the same reference signs, and the repetitive description thereof may be omitted.
EXAMPLES
[0022] A human-machine cooperative control system according to an example of the present invention will be described with reference to the drawings.
[0024] In addition, there are a plurality of moving bodies 1 as control targets of the human-machine cooperative control system, and there is a shared area 2 as a control target area. The moving body 1 can be subdivided into, for example, an unmanned machine 1a that has an autonomous movement function and moves in an unmanned manner, a manned machine 1b that is operated and moved by a person, a worker 1c, and the like. The shared area 2 is an area where the unmanned machine 1a and the manned machine 1b and/or the worker 1c perform work at the same time.
[0025] Among the moving bodies, a target task is assigned to the unmanned machine 1a by a control function (not illustrated), and the unmanned machine 1a plans an action and acts by itself in accordance with the task. In the present example, a site is assumed where the manned machine 1b, represented by a forklift, and a worker work in the same area. A task is assigned to the unmanned machine 1a by the control function in units such as, for example, carrying a cargo placed at a certain point to another point. The unmanned machine 1a plans a moving route from its position at that time to the point in the task instruction by a route planning unit 101 (illustrated in the drawings).
[0027] Among the processing functions, the moving body position measurement unit 301 has a function of detecting the moving body 1 and measuring the position thereof. The sensor unit 3 may be any sensor as long as the sensor can measure the position of the moving body 1. For example, the sensor unit may be a GPS or a beacon attached to the moving body 1, or a camera or a sensor such as LiDAR for estimating a position by matching with map information, or may be a camera or a sensor such as LiDAR, which is disposed to be fixed in an environment and directly measures the moving body 1. Furthermore, a plurality of the sensors may be combined.
[0028] In addition, although specific processing contents may change depending on the sensor configuration of the sensor unit 3, implementations of the moving body position measurement unit 301 are roughly classified into those that are installed on the moving body 1 itself and measure its own position, and those that are installed on the moving body 1 or on the environment side, detect a moving body 1 moving within a measurement range, and measure its position. Note that the moving body position measurement unit 301 is not limited to specific measurement means, and any method may be used as long as the position of the moving body 1 can be acquired and transmitted.
[0029] The moving body motion prediction unit 501 has a function of receiving the positions of the moving bodies 1 measured by one or a plurality of moving body position measurement units 301, performing integration processing, such as associating observations of the same moving body when positions of a plurality of moving bodies 1 are received from a plurality of moving body position measurement units 301, and predicting the motion of each moving body 1 up to a predetermined time in the future based on its position information up to that time point.
[0030] Specifically, one method predicts the motion as a probability density distribution representing the existence probability of the moving body 1 after a predetermined time. For example, probability density distributions of the moving body 1 at three future time points, 1 second, 2.5 seconds, and 5 seconds ahead, are calculated, and the mean and standard deviation of the distribution at each time point are taken as the prediction motion.
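As a minimal sketch of this kind of prediction: the constant-velocity mean and the linearly growing standard deviation below are illustrative modelling assumptions, not the specific model fixed by the patent, and the function and parameter names are hypothetical.

```python
def predict_distribution(track, horizons=(1.0, 2.5, 5.0),
                         base_sigma=0.3, sigma_growth=0.2):
    """Predict a (mean position, standard deviation) pair per future horizon.

    track: list of (t, x, y) observations, most recent last.
    The mean assumes constant velocity from the last two fixes; the standard
    deviation grows linearly with the horizon (assumed uncertainty model).
    """
    (t0, x0, y0), (t1, x1, y1) = track[-2], track[-1]
    dt = t1 - t0
    vx, vy = (x1 - x0) / dt, (y1 - y0) / dt
    prediction = {}
    for h in horizons:
        mean = (x1 + vx * h, y1 + vy * h)        # expected position h seconds ahead
        sigma = base_sigma + sigma_growth * h    # uncertainty widens with the horizon
        prediction[h] = (mean, sigma)
    return prediction
```

A body moving 1 m/s along x from (0, 0) would, under these assumptions, be predicted at x = 2 after 1 second and x = 6 after 5 seconds, with a wider distribution at the later horizon.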
[0031] The exclusion management unit 502 has a function of receiving the prediction motion of one or the plurality of moving bodies 1 predicted by the moving body motion prediction unit 501 and the planned route planned by the route planning unit 101 of the unmanned machine 1a, excluding an area in the shared area 2 such that the courses of the moving bodies 1 do not overlap, and determining the occupied area of each moving body 1. Details of this function will be described later.
[0032] The information presentation unit 401 has a function of transferring the occupied area of each moving body 1 determined by the exclusion management unit 502 to an operator of the manned machine 1b and/or the worker 1c. Details of this function will be described later.
[0034] In the present example, the information presentation device 4 is assumed to be a projector, but it is not limited to a projector. The information presentation device 4 may display the occupied area 601 by using light-emitting objects, such as a display or lights, embedded in the ground in advance. Alternatively, a display device may be held or worn by a person and the occupied area 601 may be displayed on that device. In particular, if an eyeglass-type device having an augmented reality function that superimposes the occupied area 601 on the surrounding environment is used, information equivalent to that displayed on the ground can be obtained in the same manner without displaying anything on the ground, and a decrease in work efficiency can be prevented compared with a case where the information is simply displayed on a hand-held display device. In short, any means may be used as long as the worker can recognize the occupied area, or the preferential area described later, set on the ground of the shared area.
[0037] Then, in a processing step S101, the moving body motion prediction unit 501 predicts the actions of the manned and unmanned moving bodies 1. For the manned moving bodies 1b and 1c, for example, the position at a future time point (for example, 1 second later) is estimated from the positions 1 and 2 seconds in the past and the current position. For the unmanned moving body 1a, the current and future positions on the route planned by the route planning unit 101 are used as the action prediction.
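Step S101 can be sketched as a simple branch: manned bodies are extrapolated from their track, while the unmanned machine's future position is read from its plan. The record layout (`kind`, `track`, `route`) is a hypothetical data structure introduced only for illustration.

```python
def predict_position(body, t_ahead=1.0):
    """Step S101 sketch (assumed data layout):
      {"kind": "manned",   "track": [(t, x, y), ...]}        -- worker / manned machine
      {"kind": "unmanned", "route": {t_ahead: (x, y), ...}}  -- planned-route lookup
    Manned bodies are extrapolated at constant velocity from the two latest
    fixes; the unmanned machine's prediction comes from its planned route.
    """
    if body["kind"] == "unmanned":
        return body["route"][t_ahead]
    (t0, x0, y0), (t1, x1, y1) = body["track"][-2:]
    dt = t1 - t0
    return (x1 + (x1 - x0) / dt * t_ahead,
            y1 + (y1 - y0) / dt * t_ahead)
```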
[0038] In a processing step S102, the exclusion management unit 502 determines an occupied area and a preferential area of each moving body 1 by using the planned routes and/or the prediction motions for all the moving bodies 1 existing in the shared area 2.
[0039] In a processing step S103, the information presentation unit 401 presents the occupied area and the preferential area in the shared area 2 using a method that a person can recognize.
[0040] Thus, area display as illustrated in the drawings is realized.
[0041] In order to enable display of the information presentation example, the exclusion management unit 502 functions more specifically as follows. First, the exclusion management unit 502 receives a prediction motion 702 for the manned moving bodies 1b and 1c from the moving body motion prediction unit 501.
[0042] Further, the exclusion management unit 502 receives a planned route 701 from the route planning unit 101 of the unmanned machine 1a.
[0043] Furthermore, the exclusion management unit 502 may change the size of the occupied area 601 for each moving body based on attribute information, a motion history, and the like of each moving body, which have been set in advance. For example, the occupied area 601 can be set relatively small for a worker having sufficient experience to improve work efficiency, and relatively large for a worker without sufficient experience to further reduce the risk of an accident. In addition, for a worker who tends to take actions that are difficult to predict, that is, a worker whose matching degree between the predicted motion and the actual motion is low, the occupied area 601 can be set relatively large according to the low matching degree to further reduce the risk of an accident.
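The per-worker sizing above can be sketched as a small scaling rule. The experience flag, the 0.5 matching-degree threshold, and the scale factors are all illustrative assumptions, not values given by the patent.

```python
def occupied_area_size(base_size, experienced, match_rate):
    """Adjust the occupied-area size per worker attributes and history.

    experienced: whether the worker has sufficient experience (assumed flag).
    match_rate:  agreement between predicted and actual motion, 0..1.
    Factors and the 0.5 threshold are illustrative choices.
    """
    size = base_size * (0.8 if experienced else 1.2)  # shrink for experienced workers
    if match_rate < 0.5:   # motion is hard to predict -> widen the area further
        size *= 1.5
    return size
```

An experienced, predictable worker thus gets a smaller area (higher work efficiency), while an inexperienced or hard-to-predict worker gets a larger one (lower accident risk).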
[0044] The exclusion management unit 502 determines the occupied area 601 and the preferential area 602 of each moving body 1 by using the planned routes 701 and/or the prediction motions 702 for all the moving bodies 1 existing in the shared area 2, as illustrated in the drawings.
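The exclusive allocation of division areas can be sketched as a first-claimant-wins pass over grid cells; ordering the requests list by priority stands in for the priority handling described later, and the function name and data layout are hypothetical.

```python
def allocate(requests):
    """Allocate grid cells (division areas) exclusively.

    requests: list of (body_id, occupied_cells, preferential_cells),
              highest-priority claimant first (assumed ordering).
    Returns a dict cell -> (body_id, kind); a contested cell stays with the
    earlier claimant, so movable areas never overlap.
    """
    owner = {}
    for body, occ, pref in requests:
        for kind, cells in (("occupied", occ), ("preferential", pref)):
            for cell in cells:
                owner.setdefault(cell, (body, kind))  # keep the first claimant
    return owner
```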
[0045] In addition, the exclusion management unit 502 preferably has the following functions and characteristics. For example, it is preferable to have an area deviation prevention function of transmitting movable area information for the unmanned machine 1a, among the movable areas (occupied area and preferential area) of the moving bodies 1 planned by the exclusion management unit 502, to the target unmanned machine 1a, and limiting traveling so that the unmanned machine does not enter an area other than the occupied area. In addition, regarding the planned route of the unmanned machine 1a and the moving body prediction motion obtained from the moving body motion prediction unit 501, the movable area of each moving body is preferably planned based on the planned route and/or the moving body prediction motion for a predetermined time length. Furthermore, in a case where the planned route and/or the moving body prediction motion overlap in the same division area, the exclusion management unit 502 preferably determines the division area as the movable area of one unmanned machine or moving body based on a predetermined priority order. In addition, in a case where the division area has already been allocated as a movable area of any moving body, the division area is preferably kept allocated to that moving body.
[0046] Next, a concept of monitoring mutual movement of the plurality of moving bodies 1 in the shared area 2 and performing control protection will be described with reference to the drawings.
[0047] According to this measure, while the movable areas of the moving bodies 1 do not overlap each other, each moving body 1 can move freely within the shared area 2.
[0048] Then, when it is determined in the processing step S200 of the flow described above that movable areas overlap, the process branches according to whether the overlap is in the preferential area or in the occupied area.
[0049] In a case where the overlap is in the preferential area, a process is performed as follows in the processing step S202. As this case, for example, a case is assumed in which the worker 1c approaches the stopped unmanned machine 1a and their preferential areas 602 overlap.
[0050] Since the unmanned machine 1a is stopped, the worker 1c tries to pass in front of the unmanned machine 1a. In addition, since the unmanned machine 1a has already acquired the occupied area 601 and the preferential area 602 in the traveling direction of the worker 1c, it is not possible for the worker 1c to acquire the occupied area 601 in the range.
[0051] In this manner, the exclusion management unit 502 lets the moving body 1 that acquired a division area first keep holding the occupied area 601 and the preferential area 602 of that division area. In addition, in a case where a plurality of moving bodies 1 try to enter a new division area at the same time, the moving body 1 having a higher priority acquires the occupied area 601, or the preferential area 602 associated with it, based on a predetermined priority order. For example, it is conceivable that the priority order is basically set in the order of the worker 1c, the manned machine 1b, and the unmanned machine 1a, with individual priorities further assigned within each class, but the present example is not limited thereto.
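The hold-and-priority rule can be sketched as follows. The worker > manned > unmanned ordering follows the example in this paragraph; tie-breaking by an individual `id` is an added assumption.

```python
TYPE_PRIORITY = {"worker": 0, "manned": 1, "unmanned": 2}  # lower value = higher priority

def resolve(holder, claimants):
    """Decide who gets a contested division area.

    A moving body that previously acquired the area keeps it; otherwise the
    simultaneous claimant with the highest type priority wins, with the
    individual id as an (assumed) tie-breaker.
    Each claimant is a dict like {"type": "worker", "id": 2}.
    """
    if holder is not None:
        return holder  # previously acquired body holds the area
    return min(claimants, key=lambda c: (TYPE_PRIORITY[c["type"]], c["id"]))
```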
[0052] In a case where the overlap is in the occupied area, processing is performed as follows in the processing step S203. As this case, for example, a case is assumed in which the worker 1c goes out from the worker's own occupied area 601 toward the occupied area 601 of the unmanned machine 1a.
[0053] In this case, the worker deviates from the rule that the worker does not go out from the occupied area. The exclusion management unit 502 detects the deviation and transmits rule deviation information to the information presentation unit 401, and transmits a stop command to the unmanned machine 1a in the shared area 2. The information presentation unit 401 displays a rule deviation state 703, and the unmanned machine 1a that has received the stop command stops autonomous traveling.
[0054] As described above, a moving body cannot newly acquire the occupied area 601 when it approaches the course of another moving body 1. If the moving body waits within its own occupied area 601 for the other moving body 1 to pass, or selects another course, the work can continue without stopping the unmanned machine 1a. Even in a case where the moving body goes out from its occupied area 601, the risk of an accident is reduced by displaying the rule deviation state 703 to make the worker aware of the deviation before entering the occupied area 601 of the other moving body 1, and by stopping the autonomous traveling of the unmanned machine 1a.
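The branching of processing steps S200 through S203 described above can be summarized as a small dispatcher; the overlap labels and return strings are illustrative names, not terms from the patent.

```python
def control_action(overlap_kind):
    """Simplified branching of steps S200-S203: map the detected kind of
    overlap between two moving bodies' areas to the system's response."""
    if overlap_kind is None:
        return "move_freely"                 # no overlap: unrestricted movement
    if overlap_kind == "preferential":
        return "holder_keeps_area"           # S202: the prior holder retains the area
    if overlap_kind == "occupied":
        return "display_deviation_and_stop"  # S203: show rule deviation 703, stop 1a
    raise ValueError(f"unknown overlap kind: {overlap_kind!r}")
```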
[0055] In addition, the same control can be applied to the manned machine 1b illustrated in the drawings.
REFERENCE SIGNS LIST
[0056] 1 moving body
[0057] 1a unmanned machine
[0058] 1b manned machine
[0059] 1c worker
[0060] 2 shared area
[0061] 3 sensor unit
[0062] 4 information presentation device
[0063] 5 computer
[0064] 101 route planning unit
[0065] 301 moving body position measurement unit
[0066] 401 information presentation unit
[0067] 501 moving body motion prediction unit
[0068] 502 exclusion management unit
[0069] 601 occupied area
[0070] 602 preferential area