Robot control device, robot control method, and robot control program
11527008 · 2022-12-13
Assignee
Inventors
- Haruka Fujii (Uji, JP)
- Toshihiro Moriya (Tokyo, JP)
- Takeshi Kojima (Kyoto, JP)
- Norikazu Tonogai (Nara, JP)
CPC classification
H04N23/66
ELECTRICITY
International classification
Abstract
A robot control device includes an obtaining unit that obtains, from an image sensor that captures a workpiece group to be handled by a robot, a captured image, a simulation unit that simulates operation of the robot, and a control unit that performs control such that the captured image is obtained if, in the simulation, the robot is retracted from an image capture forbidden space, in which an image is potentially captured with the workpiece group and the robot overlapping each other, and which is set based on either or both a first space being the visual field range of the image sensor, and a columnar second space obtained by taking a workpiece region including the workpiece group or each of divided regions into which the workpiece region is divided, as a bottom area, and extending the bottom area to the position of the image sensor.
Claims
1. A robot control device comprising: a processor configured with a program to perform operations comprising: operation as an obtaining unit configured to obtain a captured image from an image sensor configured and positioned at a position to capture an image of a workpiece group to be handled by a robot; operation as a simulation unit configured to execute simulation of operation of the robot; and operation as a control unit configured to perform control such that an image captured by the image sensor is obtained in response to, in the simulation, the robot being retracted from at least one image capture forbidden space extending above the workpiece group, in which an image is potentially captured with the workpiece group and the robot overlapping each other, the image capture forbidden space being a space comprising one or both of: a first space, comprising a visual field range of the image sensor; and at least one columnar second space comprising a workpiece region including the workpiece group or a plurality of divided regions into which the workpiece region is divided, as a bottom area, which extends toward the sensor to the position of the image sensor.
2. The robot control device according to claim 1, wherein the image capture forbidden space comprises a space in which the first space and the second space overlap each other.
3. The robot control device according to claim 2, wherein the processor is configured with the program to perform operations such that operation as the control unit is configured to calculate the second space with a bottom area having a shape in which an outline of the workpiece group is contained, and calculate, as the image capture forbidden space, a space in which the calculated second space and the first space overlap each other.
4. The robot control device according to claim 3, wherein the shape in which an outline of the workpiece group is contained is given by a contour that indicates the outline of the workpiece group, or is a shape that encloses the contour of the workpiece group.
5. The robot control device according to claim 4, wherein the shape that encloses the contour of the workpiece group comprises a circumscribed shape that circumscribes the outline of the workpiece group or an inner edge shape of a receptacle in which the workpiece group is accommodated.
6. The robot control device according to claim 1, wherein the image sensor comprises an image sensor configured to capture a still image, and the processor is configured with the program to perform operations such that operation as the control unit is configured to control the image sensor to start image capture at a timing at which, in the simulation, the robot has been retracted from the image capture forbidden space.
7. The robot control device according to claim 1, wherein the image sensor comprises an image sensor configured to capture a moving image, and the processor is configured with the program to perform operations such that operation as the control unit is configured to perform control such that a frame image is obtained based on the moving image captured by the image sensor at a timing at which, in the simulation, the robot has been retracted from the image capture forbidden space.
8. The robot control device according to claim 1, wherein a plurality of image capture forbidden spaces are set, and the processor is configured with the program to perform operations such that operation as the control unit is configured to perform control such that an image captured by the image sensor is obtained in response to the robot being retracted from at least one of the plurality of image capture forbidden spaces.
9. The robot control device according to claim 8, wherein at least two of the plurality of image capture forbidden spaces partially overlap each other in a direction in which the image sensor captures an image.
10. The robot control device according to claim 8, wherein the processor is configured with the program to perform operations such that operation as the control unit is configured to perform control such that an image captured by the image sensor is obtained in response to a robot arm of the robot being retracted to a reference position at which the robot arm does not affect a layout of the workpiece group, and there being, out of image capture forbidden spaces that correspond to the divided regions, an image capture forbidden space in which a workpiece is present but the robot arm is not present.
11. A robot control method in which a computer executes processing, the processing comprising: obtaining a captured image from an image sensor configured and positioned at a position to capture an image of a workpiece group to be handled by a robot; executing a simulation of operation of the robot; and performing control such that an image captured by the image sensor is obtained in response to, in the simulation, the robot being retracted from at least one image capture forbidden space extending above the workpiece group, in which an image is potentially captured with the workpiece group and the robot overlapping each other, the image capture forbidden space being a space comprising one or both of: a first space, comprising a visual field range of the image sensor; and at least one columnar second space comprising a workpiece region including the workpiece group or a plurality of divided regions into which the workpiece region is divided, as a bottom area, which extends toward the sensor to the position of the image sensor.
12. A non-transitory computer-readable storage medium storing a robot control program, which when read and executed causes a computer to perform operations comprising: operation as an obtaining unit configured to obtain a captured image from an image sensor configured and positioned at a position to capture an image of a workpiece group to be handled by a robot; operation as a simulation unit configured to execute simulation of operation of the robot; and operation as a control unit configured to perform control such that an image captured by the image sensor is obtained in response to, in the simulation, the robot being retracted from at least one image capture forbidden space extending away from the workpiece group, in which an image is potentially captured with the workpiece group and the robot overlapping each other, the image capture forbidden space being a space comprising one or both of: a first space, comprising a visual field range of the image sensor; and at least one columnar second space comprising a workpiece region including the workpiece group or a plurality of divided regions into which the workpiece region is divided, as a bottom area, which extends toward the sensor to the position of the image sensor.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
DETAILED DESCRIPTION
(17) Hereinafter, an example of an embodiment will be described with reference to the drawings. Note that the same reference numerals are given to the same or equivalent constituent components or parts in the drawings. Furthermore, for illustrative reasons, the scaling of the drawings may be exaggerated and may be different from the actual scale.
(19) The robot RB picks up a workpiece W selected from a workpiece group WS, which is a group of workpieces W received in a box-shaped receptacle 20A, transports the held workpiece W to another box-shaped receptacle 20B, and places it into the receptacle 20B. Hereinafter, an operation of picking up a workpiece W is referred to as a “pick-up operation”. Also, an operation of placing a workpiece W is referred to as a “placing operation”.
(20) In one or more embodiments, as an example, a robot hand H, serving as an end effector, is attached to a leading end of a robot arm of the robot RB, and the robot hand H holds, by gripping, a workpiece W received in the receptacle 20A. Then, the robot RB transports, while holding, the workpiece W to the other receptacle 20B, and releases and places the held workpiece W. Note that the member that holds a workpiece W is not limited to the robot hand H, and may be a suction pad that suctions the workpiece W.
(21) The image sensor S is set at a position above the workpiece group WS at which it can capture an image of an area including the workpiece group WS. The image sensor S is a camera that captures a still image of the workpiece group WS in accordance with an instruction of the robot control device 10. Note that, as the image sensor S, a stereo camera may also be employed that captures images of an object from different directions at the same time.
(22) The robot control device 10 subjects a captured image obtained from the image sensor S to image processing, and recognizes the position and orientation of a workpiece W to be picked up based on a result of the image processing. Then, the robot control device 10 generates a planned path specifying a path from a position at which the robot RB picks up the workpiece W from the receptacle 20A to a position at which the robot RB places it into the receptacle 20B. The robot control device 10 outputs an operation instruction value to the robot RB so that the robot RB operates in accordance with the generated planned path.
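The capture–recognize–plan–execute cycle described above can be sketched as follows. This is a minimal Python illustration under stated assumptions; `FakeSensor`, `FakeRobot`, `detect`, and `plan` are hypothetical stand-ins, not the device's actual interfaces.

```python
from dataclasses import dataclass


@dataclass
class FakeSensor:
    frames: list                       # queue of captured images

    def capture(self):
        return self.frames.pop(0)


@dataclass
class FakeRobot:
    pose: tuple = (0.0, 0.0)

    def follow(self, path):
        self.pose = path[-1]           # move to the end of the planned path


def pick_cycle(sensor, robot, detect, plan):
    """One cycle: obtain image -> recognize workpiece -> plan -> execute."""
    image = sensor.capture()           # captured image obtained from the sensor
    poses = detect(image)              # recognized workpiece positions/orientations
    if not poses:
        return None                    # no workpiece left in the receptacle
    target = poses[0]                  # e.g. the uppermost workpiece
    robot.follow(plan(robot.pose, target))
    return target
```

A usage sketch: with `detect` returning a list of positions and `plan` returning a list of waypoints, one call to `pick_cycle` performs a single pick-up.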
(23) The following will describe the robot RB. In one or more embodiments, a case will be described in which the robot RB is a vertically articulated robot, but one or more aspects are also applicable to a horizontal articulated robot (SCARA robot), a parallel link robot, and the like.
(25) The base link BL and the link L1 are connected to each other via the joint J1, which rotates about a vertical axis S1 in a direction of an arrow C1 in the drawings.
(26) The link L1 and the link L2 are connected to each other via the joint J2, which rotates about a horizontal axis S2 in a direction of an arrow C2 in the drawings.
(27) The link L2 and the link L3 are connected to each other via the joint J3, which rotates about an axis S3 in a direction of an arrow C3 in the drawings.
(28) The link L3 and the link L4 are connected to each other via the joint J4, which rotates about an axis S4 in a direction of an arrow C4 in the drawings.
(29) The link L4 and the link L5 are connected to each other via the joint J5, which rotates about an axis S5 in a direction of an arrow C5 in the drawings.
(30) The link L5 and the link L6 are connected to each other via the joint J6, which rotates about an axis S6 in a direction of an arrow C6 in the drawings.
(31) For each of the joints J1 to J6, a predetermined rotation angle range of −180 degrees to +180 degrees is set as a range of motion.
(32) The orientation (posture) of the robot RB depends on the rotation angles of the joints J1 to J6.
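Because the posture is fully determined by the joint rotation angles, it can be computed by forward kinematics. The sketch below does this for a simplified planar arm, a 2-D analogue of the joints J1 to J6, purely for illustration; the actual robot is a 6-axis spatial mechanism, and the function name is an assumption of this example.

```python
import math


def planar_fk(link_lengths, joint_angles_deg):
    """End point of a planar serial arm, determined solely by its
    joint rotation angles (a 2-D stand-in for the 6-axis posture)."""
    for angle in joint_angles_deg:
        # each joint's range of motion is -180 to +180 degrees
        assert -180.0 <= angle <= 180.0
    x = y = theta = 0.0
    for length, angle in zip(link_lengths, joint_angles_deg):
        theta += math.radians(angle)   # accumulated rotation along the chain
        x += length * math.cos(theta)
        y += length * math.sin(theta)
    return x, y
```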
(34) As shown in the drawings, the robot control device 10 includes a CPU 11, a ROM 12, a RAM 13, a storage 14, an input unit 15, a monitor 16, an optical disk drive 17, and a communication interface 18.
(35) In one or more embodiments, a robot control program for executing robot control processing is stored in the ROM 12 or the storage 14. The CPU 11 is a central processing unit, and is configured to execute various types of programs and control the constituent components, for example. In other words, the CPU 11 reads programs from the ROM 12 or the storage 14, and executes the read programs in the RAM 13 serving as a work area. The CPU 11 controls the constituent components and executes the various types of arithmetic processing in accordance with the programs stored in the ROM 12 or the storage 14.
(36) Note that the CPU 11 is a processor that can execute a plurality of processes in parallel to each other. Examples of such a processor include a multi-core CPU. Furthermore, the CPU 11 may be a processor that can execute a plurality of processes in parallel to each other based on a multi-task operating system.
(37) The ROM 12 stores various types of programs and various types of data. The RAM 13 serves as a work area to temporarily store a program or data. The storage 14 is constituted by a hard disk drive (HDD) or a solid state drive (SSD), and stores various types of programs including an operating system, and various types of data.
(38) The input unit 15 includes a keyboard 151 and a pointing device such as a mouse 152, and is used to perform various types of input. The monitor 16 is, for example, a liquid crystal display, and displays various types of information. The monitor 16 may also employ a touch panel method, so as to also function as the input unit 15. The optical disk drive 17 reads data stored in various types of recording media (such as CD-ROMs or Blu-ray discs), and writes data into the recording media, for example.
(39) The communication interface 18 is an interface for communicating with another device such as the robot RB and the image sensor S, and uses a standard such as the Ethernet (registered trademark) standard, an FDDI standard, or a Wi-Fi (registered trademark) standard.
(40) Hereinafter, functional configurations of the robot control device 10 will be described.
(42) As shown in the drawings, the robot control device 10 includes, as functional configurations, an obtaining unit 30, a simulation unit 32, a control unit 34, and a storage unit 36.
(43) The obtaining unit 30 obtains a captured image from the image sensor S that captures an image of the workpiece group WS to be handled by the robot RB.
(44) The simulation unit 32 simulates the operation of the robot RB.
(45) The control unit 34 performs control such that an image captured by the image sensor S is obtained when, in the simulation performed by the simulation unit 32, the robot RB has been retracted from an image capture forbidden space in which an image is potentially captured with the workpiece group WS and the robot RB overlapping each other, the image capture forbidden space being a space that is set based on at least one of a visual field range of the image sensor S and the shape of a region including the workpiece group WS.
(46) The storage unit 36 stores various types of information such as a robot control processing program, information relating to the visual field range of the image sensor S, data on the shapes of the base link BL and the links L1 to L6 of the robot RB, information relating to the ranges of motion of the joints J1 to J6, data on the shape of the robot hand H, and data on the shape of workpieces W.
(47) Hereinafter, effects of the robot control device 10 will be described.
(49) Serving as the control unit 34, the CPU 11 instructs the image sensor S to capture an image, and serving as the obtaining unit 30, the CPU 11 obtains the image captured by the image sensor S (step S100).
(50) Serving as the control unit 34, the CPU 11 reads the visual field range of the image sensor S from the storage unit 36, and calculates an image capture forbidden space based on the captured image obtained from the image sensor S and the visual field range of the image sensor S read from the storage unit 36 (step S102).
(51) Here, “image capture forbidden space” refers to a space in which an image is potentially captured with the workpiece group WS and the robot RB overlapping each other. As described later, when the robot RB is to pick up a workpiece W, the workpiece W to be picked up is selected based on an image captured by the image sensor S. At this time, if an overlap of the workpiece group WS and the robot arm of the robot RB appears in the captured image, it may be difficult to select a workpiece W to be picked up. Accordingly, image capture by the image sensor S is forbidden while the robot arm of the robot RB is present in the image capture forbidden space.
(52) In one or more embodiments, as the image capture forbidden space, a space is set in which a first space, which is the visual field range of the image sensor S, and a second space overlap each other, the second space being obtained by taking a region including the workpiece group WS as a bottom area and extending the bottom area to the position of the image sensor S.
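As a rough geometric sketch, the overlap can be tested point by point: a point lies in the forbidden space when it is both inside the sensor's view cone (first space) and inside the column over the workpiece region (second space). The downward-pointing cone axis and the rectangular bottom area are simplifying assumptions of this example, not the patented geometry.

```python
import math


def in_first_space(p, sensor_pos, half_angle_deg):
    """Inside the view cone of the image sensor (first space 40)?"""
    dx = p[0] - sensor_pos[0]
    dy = p[1] - sensor_pos[1]
    dz = p[2] - sensor_pos[2]
    if dz >= 0:                        # at or above the sensor: outside the cone
        return False
    horizontal = math.hypot(dx, dy)
    return math.degrees(math.atan2(horizontal, -dz)) <= half_angle_deg


def in_second_space(p, bottom_rect, sensor_z):
    """Inside the column over the bottom area 42 (second space 44)?"""
    xmin, ymin, xmax, ymax = bottom_rect
    return xmin <= p[0] <= xmax and ymin <= p[1] <= ymax and p[2] <= sensor_z


def in_forbidden_space(p, sensor_pos, half_angle_deg, bottom_rect):
    """Image capture forbidden space 46 = overlap of spaces 40 and 44."""
    return (in_first_space(p, sensor_pos, half_angle_deg)
            and in_second_space(p, bottom_rect, sensor_pos[2]))
```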
(53) For example, as shown in the drawings, the first space 40 is the visual field range of the image sensor S, which spreads out from the position of the image sensor S toward the workpiece group WS.
(54) When, as shown in the drawings, a region including the workpiece group WS is taken as a bottom area 42 and the bottom area 42 is extended to the position of the image sensor S, a columnar second space 44 is obtained. The space in which the first space 40 and this second space 44 overlap each other is the image capture forbidden space 46.
(55) The shape of the bottom area 42 is not limited to the example described above, and may be any shape in which the outline of the workpiece group WS is contained.
(56) In this case, the shape in which the outline of the workpiece group WS is contained may also be given by a contour 50 that indicates the outline of the workpiece group WS as shown in the drawings.
(57) Furthermore, the shape in which the outline of the workpiece group WS is contained may also be a shape that encloses the contour of the workpiece group WS. As shown in the drawings, examples of such a shape include a circumscribed shape 52 that circumscribes the outline of the workpiece group WS, and an inner edge shape 48 of the receptacle 20A in which the workpiece group WS is accommodated.
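One simple way to derive an enclosing shape from detected workpiece positions is an axis-aligned circumscribed rectangle. This is only an illustrative realization of a circumscribed shape; a convex hull or the receptacle's inner edge could serve equally well.

```python
def circumscribed_rect(points):
    """Smallest axis-aligned rectangle enclosing all workpiece
    positions (one realization of a circumscribed shape)."""
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    return (min(xs), min(ys), max(xs), max(ys))
```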
(58) Serving as the control unit 34, the CPU 11 calculates, based on the captured image obtained from the image sensor S, the position and orientation of a workpiece W to be picked up (step S104). For example, by performing well-known edge extraction processing, feature extraction processing, and the like on the captured image, the position and orientation of workpieces W are detected. Then, a workpiece W to be picked up is selected from among the detected workpieces W. Note that examples of criteria for selecting a workpiece W to be picked up include selecting a workpiece W located at the uppermost position of the workpiece group WS, selecting a workpiece W located in the center of the workpiece group WS, and selecting a workpiece W that does not overlap with another workpiece W, but the present invention is not limited to these criteria.
(59) Serving as the control unit 34, the CPU 11 generates a planned path for the pick-up operation (step S106). Specifically, based on the position and orientation of the workpiece W to be picked up that was calculated in step S104, a target orientation (posture) of the robot RB is determined, and a planned path specifying a path from the initial orientation of the robot RB to the target orientation is calculated. Here, the planned path for the robot RB refers to a list of orientations through which the robot RB passes when moved from the initial orientation to the target orientation, that is, a list of rotation angles of the joints J1 to J6 of the robot RB. For the calculation of the planned path, a motion planning method can be used in which a planned path is automatically generated upon designation of an initial orientation and a target orientation. Alternatively, a planned path generated by using a teaching playback method may also be used. In other words, a configuration is also possible in which the robot RB is directly moved to teach paths that correspond to the positions and orientations of various workpieces, these paths are stored in the storage unit 36, and the path that corresponds to the position and orientation, calculated in step S104, of the workpiece W to be picked up is read from the storage unit 36.
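A planned path, as a list of joint-angle tuples from the initial to the target orientation, can be illustrated with straight-line interpolation in joint space. Real motion planners additionally avoid obstacles, so this is only a schematic sketch.

```python
def plan_joint_path(start, goal, steps):
    """Planned path: a list of joint-angle tuples leading from the
    initial orientation to the target orientation (linear in joint
    space; obstacle avoidance is omitted for brevity)."""
    path = []
    for k in range(steps + 1):
        t = k / steps                  # interpolation parameter in [0, 1]
        path.append(tuple(s + (g - s) * t for s, g in zip(start, goal)))
    return path
```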
(60) Serving as the control unit 34 and the simulation unit 32, the CPU 11 transmits, to the robot RB, an operation instruction value for causing the robot RB to operate in accordance with the planned path generated in step S106, so that the robot RB actually operates, and executes simulation of the same operation as the actual operation of the robot RB (step S108). Accordingly, at a timing at which the pick-up operation is actually executed by the robot RB, a pick-up operation is also executed in a simulation by a virtual robot RB. In other words, the pick-up operation by the real robot RB and the pick-up operation in the simulation by the virtual robot RB are executed in parallel to each other. Note that a configuration is also possible in which the operation in the simulation by the robot RB is followed by the operation by the real robot RB, that is, the pick-up operation in the simulation may also be executed slightly prior to the pick-up operation executed by the real robot RB. Accordingly, when, in later-described step S114, it is determined in the simulation that the robot arm with the held workpiece W is retracted from the image capture forbidden space 46, and in step S116, the image sensor S is instructed to capture an image, it is possible to match the timing at which the image sensor S is instructed to capture an image with the timing at which the robot RB is actually retracted from the image capture forbidden space 46 as much as possible.
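The parallel execution of the real operation and its simulated twin can be sketched with two threads; `real_op` and `sim_op` are hypothetical callables standing in for the robot command and the simulation step.

```python
import threading


def run_in_parallel(real_op, sim_op):
    """Run the real robot operation and the simulated operation in
    parallel, returning both results."""
    results = {}

    def sim_worker():
        results["sim"] = sim_op()      # simulation runs on its own thread

    t = threading.Thread(target=sim_worker)
    t.start()
    results["real"] = real_op()        # real operation on the calling thread
    t.join()
    return results
```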
(61) Also, serving as the control unit 34, the CPU 11 executes processing for generating a planned path for the placing operation, in parallel to the processing in step S108 (step S110).
(62) Specifically, a planned path specifying a path from the initial orientation of the robot RB to the target orientation is calculated, where the initial orientation is an orientation of the robot RB when it picks up the workpiece W in the pick-up operation, and the target orientation is an orientation when it places the workpiece W into the receptacle 20B.
(63) Serving as the control unit 34 and the simulation unit 32, the CPU 11 transmits, to the robot RB, an operation instruction value for causing the robot RB to operate in accordance with the planned path generated in step S110, so that the robot RB actually operates, and executes simulation of the same operation as the actual operation of the robot RB (step S112). Accordingly, at a timing at which the placing operation is actually executed by the robot RB, a placing operation in the simulation by a virtual robot RB is also executed. In other words, the placing operation by the real robot RB and the placing operation in the simulation by the virtual robot RB are executed in parallel to each other.
(64) Furthermore, the CPU 11 executes, in parallel to the processing in step S112, the processing in steps S114 and S116. In other words, serving as the control unit 34, the CPU 11 determines whether or not the timing to obtain a captured image has come (step S114).
(65) The timing to obtain a captured image is the timing at which, in the simulation, the robot arm with the held workpiece W has just been retracted from the image capture forbidden space 46. To calculate the timing at which the robot arm with the held workpiece W has been retracted from the image capture forbidden space 46, the CPU 11 uses, in the simulation, a well-known interference determination technique for determining whether or not the robot interferes with an obstacle, for example. Here, “interference” refers to a situation in which the robot is in contact with an obstacle. A technique disclosed in, for example, JP 2002-273675A can be used as the well-known interference determination technique.
(66) As shown in the drawings, for example, when picking up a workpiece W, the robot arm of the robot RB enters the image capture forbidden space 46, holds the workpiece W, and then moves out of the image capture forbidden space 46.
(67) It is also conceivable to regard the timing at which, after the robot arm of the robot RB enters the image capture forbidden space 46, the robot arm with the held workpiece W is moved in a direction of the arrow B to get out from the image capture forbidden space 46, as the timing at which, if the image capture forbidden space 46 is regarded as an obstacle, the robot arm with the held workpiece W that is interfering with the obstacle is brought into a non-interfering state in which it no longer interferes therewith.
(68) Accordingly, the CPU 11 regards, in the simulation, the image capture forbidden space 46 as an obstacle and determines the timing at which, after the robot arm of the robot RB interferes with the obstacle, the robot arm with the held workpiece W no longer interferes with the obstacle, as the timing at which the robot RB is retracted from the image capture forbidden space 46.
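Treating the forbidden space as an obstacle, the capture timing is the first simulation step at which the arm, having interfered, no longer interferes. A minimal sketch, where `interferes` is a hypothetical per-pose interference test:

```python
def retraction_step(path, interferes):
    """Index of the first pose at which the arm no longer interferes
    with the forbidden space after having interfered with it."""
    entered = False
    for i, pose in enumerate(path):
        inside = interferes(pose)
        if entered and not inside:
            return i                   # timing to trigger image capture
        entered = entered or inside    # remember that the arm has entered
    return None                        # arm never left within this path
```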
(69) If the timing to obtain a captured image has not yet come (NO in step S114), the CPU 11 repeats the determination in step S114.
(70) On the other hand, if the timing to obtain a captured image has come (YES in step S114), the CPU 11, serving as the control unit 34, instructs the image sensor S to capture an image, and, serving as the obtaining unit 30, obtains a captured image of the workpiece group WS from the image sensor S (step S116).
(71) The CPU 11 determines whether or not all of the workpieces W of the workpiece group WS have been transported from the receptacle 20A to the receptacle 20B (step S118). Specifically, for example, the CPU 11 subjects the captured image obtained in step S116 to image processing, and determines whether or not there is any workpiece W in the captured image. Furthermore, if the number of workpieces originally received in the receptacle 20A is known in advance, it is also possible to determine whether or not the number of times a workpiece W has been transported reaches the number of workpieces W. Then, if all of the workpieces W have been transported (YES in step S118), this routine is ended. On the other hand, if the transporting of all of the workpieces W is not complete (NO in step S118), the procedure moves to step S102, and the processing from steps S102 to S118 is repeated until all of the workpieces W have been transported. In this case, the initial orientation used when a planned path for the pick-up operation is generated in step S106 is an orientation of the robot RB when a workpiece W is placed in the previous placing operation.
(73) In one or more embodiments, simulation of the operation of the robot RB is executed. When, in the simulation, the robot arm with the held workpiece W is retracted from the image capture forbidden space 46, the image sensor S is instructed to capture an image. Accordingly, it is possible to promptly capture an image at a timing at which the robot RB is retracted from the image capture forbidden space 46, reducing the time required for one cycle.
(74) The robot control device 10 is not limited to the above-described embodiments, and various modifications are possible. For example, one or more embodiments have described a case in which the image sensor S is a camera that captures a still image, but a camera that captures a moving image may also be used. In this case, the image sensor S continues to capture a moving image during the execution of the above-described procedure, and a frame image is obtained from the moving image at the timing at which, in the simulation, the robot RB has been retracted from the image capture forbidden space 46.
(75) Furthermore, one or more embodiments have described a case in which, when the entire robot arm with a workpiece W is retracted from the image capture forbidden space 46, the image sensor S is instructed to capture an image, but the image sensor S may also be instructed to capture an image even when part of the robot arm with a held workpiece W remains in the image capture forbidden space 46. For example, when the bottom area of the second space 44 has the inner edge shape 48 of the receptacle 20A as shown in the drawings, part of the robot arm may remain in the image capture forbidden space 46 without overlapping the workpiece group WS in the captured image.
(76) Furthermore, the image sensor S may also be instructed to capture an image if it is determined that the robot arm including the robot hand H without a held workpiece W is retracted from the image capture forbidden space 46. Alternatively, the image sensor S may also be instructed to capture an image if it is determined that the robot arm without a held workpiece W and the robot hand H is retracted from the image capture forbidden space 46.
(77) Furthermore, one or more embodiments have described a case where a space in which the first space 40 and the second space 44 overlap each other is defined as the image capture forbidden space 46, but the first space 40 or the second space 44 may also be defined as the image capture forbidden space 46. When the first space 40 is defined as the image capture forbidden space 46, the calculation of the second space 44 can be omitted in step S102.
(78) Furthermore, one or more embodiments have described a case where only one image capture forbidden space is set, but a plurality of image capture forbidden spaces may also be set in step S102.
(79) Specifically, as shown in the drawings, the bottom area 42 may be divided into a plurality of divided regions 42A to 42D, and a columnar space may be calculated for each of the divided regions 42A to 42D by taking the divided region as a bottom area and extending it to the position of the image sensor S. In this case, a space in which each calculated columnar space and the first space 40 overlap each other is set as an image capture forbidden space.
(80) Note that the divided regions 42A to 42D may also be set so that the numbers of workpieces W present in the respective divided regions 42A to 42D are equal to each other. In this case, the positions and number of workpieces W are calculated based on a captured image, and the sizes of the divided regions 42A to 42D may simply be set based on the calculated positions and number of workpieces W.
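Dividing the region so that each divided region holds the same number of workpieces can be done, for a split along one axis, by placing boundaries between sorted workpiece coordinates. This is an illustrative sketch that assumes the workpiece count is divisible by the number of regions.

```python
def divide_by_count(workpiece_xs, n_regions):
    """Boundary x-coordinates splitting the workpiece region into
    n_regions slices holding equal numbers of workpieces (assumes
    the count is divisible by n_regions)."""
    xs = sorted(workpiece_xs)
    per = len(xs) // n_regions         # workpieces per divided region
    # each boundary lies midway between the flanking workpieces
    return [(xs[k * per - 1] + xs[k * per]) / 2
            for k in range(1, n_regions)]
```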
(82) Note that, in the illustrated example, at least two of the plurality of image capture forbidden spaces may partially overlap each other in a direction in which the image sensor S captures an image.
(83) If a plurality of image capture forbidden spaces are set, processing that replaces the determination in step S114 is executed. In other words, serving as the control unit 34, the CPU 11 determines whether or not, in the simulation, the robot arm is retracted to a reference position (step S114A).
(84) Note that the reference position may also be set based on, for example, the captured image obtained in step S100. The reference position is a position at which the robot arm of the robot RB is not in contact with any workpiece W and does not affect a layout of the workpiece group WS.
(85) Then, if, in the simulation, the robot arm is retracted to the reference position (YES in step S114A), that is, the robot arm is not in contact with a workpiece W and does not affect the layout of the workpiece group WS, the procedure moves to step S114B. On the other hand, if the robot arm is not retracted to the reference position (NO in step S114A), that is, the robot arm may get into contact with a workpiece W and may affect the layout of the workpiece group WS, the processing in step S114A is repeated until the robot arm is retracted to the reference position.
(86) The CPU 11 determines whether or not, out of the image capture forbidden spaces that correspond to the divided regions 42A to 42D, there is any image capture forbidden space in which any workpiece W is present but the robot arm is not present (step S114B). If, out of the image capture forbidden spaces that correspond to the divided regions 42A to 42D, there is any image capture forbidden space in which any workpiece W is present but the robot arm is not present (YES in step S114B), the procedure moves to step S116. On the other hand, if there is no such image capture forbidden space (NO in step S114B), the processing in step S114B is repeated.
(87) Note that the determination as to whether or not any workpiece W is present is performed in the following manner. When step S102 is executed, the positions of the workpieces W are calculated based on the captured image obtained in step S100, and for each of the image capture forbidden spaces that correspond to the divided regions 42A to 42D, it is determined based on the calculated positions whether or not any workpiece W is present.
(88) In this way, if, out of the image capture forbidden spaces that correspond to the divided regions 42A to 42D, there is any image capture forbidden space in which any workpiece W is present but the robot arm is not present, it is determined that the timing to obtain a captured image has come. A case is assumed in which, for example, a workpiece W is present in the divided region 42A but the robot arm is not present in the image capture forbidden space that corresponds to the divided region 42A. In this case, even if the robot arm is present in another image capture forbidden space, an image in which the workpiece W in the divided region 42A does not overlap the robot arm can be captured, and therefore a captured image can be obtained at an earlier timing.
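The step-S114B decision over the divided regions reduces to a per-region check: capture as soon as some region still contains a workpiece while the arm is clear of it. A minimal sketch, with each region represented by a pair of hypothetical boolean flags:

```python
def capture_timing_reached(regions):
    """regions: (workpiece_present, arm_present) flags, one per image
    capture forbidden space.  True once any space holds a workpiece
    while the robot arm is absent from it (the step S114B condition)."""
    return any(workpiece and not arm for workpiece, arm in regions)
```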
(89) Furthermore, one or more embodiments have described a case where one or more aspects are applied to the pick-up operation in which the robot RB picks up a workpiece W, but one or more aspects are also applicable to the placing operation in which the robot RB places a held workpiece W into the receptacle 20B. In this case, an image sensor is installed at a position above the receptacle 20B, an image capture forbidden space is calculated based on an image captured by this image sensor, and the position at which a workpiece W is to be placed is determined. Then, the image sensor may simply be instructed to capture an image when the robot arm that has placed the workpiece W into the receptacle 20B is retracted from the image capture forbidden space.
(90) Furthermore, in one or more embodiments, a well-known interference determination technique is used in the simulation to determine the timing at which the robot RB is retracted from the image capture forbidden space 46, but it is also possible to determine the timing at which the robot RB is retracted from the image capture forbidden space 46 based on a planned path of the robot RB. Specifically, because the orientation of the robot RB at each point of the operation is known in advance from the planned path for the placing operation generated in step S110, the timing at which the robot RB is retracted from the image capture forbidden space 46 can be calculated in advance from the planned path.
(91) Note that the robot control processing executed by the CPU reading software (a program) in the above-described embodiments may also be executed by various types of processors other than the CPU. Examples of the processors in this case include a programmable logic device (PLD) such as a field-programmable gate array (FPGA), whose circuit configuration can be changed after manufacture, and a dedicated electric circuit such as an application specific integrated circuit (ASIC), which is a processor having a circuit configuration designed only to execute specific processing. Furthermore, the robot control processing may be executed by one of these types of processors, or by a combination of two or more processors of the same or different types (for example, a plurality of FPGAs, or a combination of a CPU and an FPGA). More specifically, the hardware structure of each of these various types of processors is an electric circuit obtained by combining circuit elements such as semiconductor elements.
(92) Furthermore, the above-described embodiments have described a case where the robot control program is stored (installed) in advance in the storage 14 or the ROM 12, but the present invention is not limited to this. The program may also be provided in such a form that it is recorded in a recording medium such as a compact disk read only memory (CD-ROM), a digital versatile disk read only memory (DVD-ROM), or a universal serial bus (USB) memory. Furthermore, the program may also be downloaded from an external device via a network.
LIST OF REFERENCE NUMERALS
(93)
1 Pick-and-place device
10 Robot control device
20A, 20B Receptacle
30 Obtaining unit
32 Simulation unit
34 Control unit
36 Storage unit
40 First space
42 Bottom area
44 Second space
46 Image capture forbidden space
50 Contour
52 Circumscribed shape