TRANSFER ROBOT AND CONTROL METHOD THEREOF
20170173796 · 2017-06-22
Inventors
- Kwang-Jun KIM (Ansan-si, KR)
- Doojin Kim (Hwaseong-si, KR)
- Kongwoo Lee (Seoul, KR)
- Joohyung Kim (Seongnam-si, KR)
- Kyungbin Park (Suwon-si, KR)
- Nam-Su Yuk (Suwon-si, KR)
CPC classification
G05D1/0225
PHYSICS
B25J9/1612
PERFORMING OPERATIONS; TRANSPORTING
B25J9/1679
PERFORMING OPERATIONS; TRANSPORTING
Y10S901/01
GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
G05B2219/40613
PHYSICS
B25J9/162
PERFORMING OPERATIONS; TRANSPORTING
International classification
Abstract
The present disclosure relates to a transfer robot and a method for controlling the same. The transfer robot includes a robot main body and a driving unit configured to move the robot main body toward a stage. The robot main body includes a distance sensor unit configured to obtain distance information between the robot main body and the stage, a first image acquisition unit configured to receive an image of a first mark of the stage and obtain a first image information, a manipulation unit configured to pick up a target object disposed on the stage, and a control unit configured to control the driving unit using the distance information and the first image information, thereby causing the robot main body to be placed at a desired position spaced apart from the stage.
Claims
1. A transfer robot, comprising: a robot main body; and a driving unit configured to move the robot main body toward a stage, wherein the robot main body comprises: a distance sensor unit configured to obtain distance information between the robot main body and the stage, and a first image acquisition unit configured to take an image of a first mark of the stage and to obtain a first image information; a manipulation unit configured to pick up a target object disposed on the stage; and a control unit configured to control the driving unit using the distance information and the first image information, thereby causing the robot main body to be placed at a desired position spaced apart from the stage.
2. The transfer robot of claim 1, wherein the distance sensor unit comprises: a first distance sensor; and a second distance sensor having spatial separation from the first distance sensor.
3. The transfer robot of claim 2, wherein the control unit is configured to determine a relative angle between the robot main body and the stage based on a distance between the first and second distance sensors, a first distance and a second distance, which are respectively obtained by the first and second distance sensors, and to control the driving unit to cause the relative angle to be equal to or smaller than a predetermined angle value.
4. The transfer robot of claim 2, wherein the control unit is configured to control the driving unit to cause a difference between a first distance and a second distance, which are respectively obtained by the first and second distance sensors, to be equal to or smaller than a predetermined value.
5. The transfer robot of claim 2, wherein the first image acquisition unit is disposed equidistantly between the first and second distance sensors.
6. The transfer robot of claim 1, wherein the control unit is configured to obtain an X-coordinate and a Z-coordinate of a reference point of the first mark from the first image information and to control the driving unit to allow at least one of the X-coordinate and the Z-coordinate to coincide with a predetermined one of the reference coordinates.
7. The transfer robot of claim 6, wherein the reference point of the first mark is a center point of the first mark.
8. The transfer robot of claim 1, wherein the robot main body further comprises a target object-sensing unit, configured to detect the target object disposed in a scan region and to obtain an X-coordinate, a Y-coordinate and a Z-coordinate of the target object.
9. The transfer robot of claim 8, wherein the target object-sensing unit comprises: a detection sensor; and a scan unit configured to move the detection sensor to scan the scan region.
10. The transfer robot of claim 8, wherein the manipulation unit comprises: a robot hand configured to grasp the target object; and a robot arm connected to the robot hand, and configured to change a position of the robot hand, wherein the control unit is configured to calculate a grasping position, allowing the robot hand to grasp the target object, and to control the robot arm to place the robot hand at the grasping position, and the X-coordinate, the Y-coordinate and the Z-coordinate of the target object are used to calculate the grasping position.
11. The transfer robot of claim 10, wherein the robot main body further comprises a second image acquisition unit, configured to take an image of a second mark of the target object and to obtain a second image information, the robot hand comprises fingers configured to be inserted into respective grip recesses of the target object, and the control unit is configured to obtain a position information of the second mark from the second image information and to control the robot hand, based on the position information of the second mark, to allow each of the fingers to be placed near a position of a corresponding one of the grip recesses.
12. The transfer robot of claim 10, wherein the robot hand comprises fingers configured to be inserted into grip recesses of the target object, and the control unit is configured to control the robot hand to partially insert the fingers into the grip recesses, to control the robot arm to elevate the robot hand, and to control the robot hand to further insert the fingers into the grip recesses.
13. A method of controlling a transfer robot, comprising: moving a robot hand to a first position using a robot arm, wherein the robot hand comprises a plurality of fingers configured to grasp a target object having grip recesses; partially inserting the fingers into the grip recesses with the robot hand at the first position; elevating the robot hand to a second position higher than the first position, using the robot arm; and further inserting the fingers into the grip recesses with the robot hand at the second position.
14. The method of claim 13, further comprising detecting the target object and obtaining coordinate information including an X-coordinate, a Y-coordinate, and a Z-coordinate of the target object, using a target object-sensing unit, wherein the first position is calculated from the X-coordinate, the Y-coordinate and the Z-coordinate of the target object.
15. The method of claim 14, wherein the robot arm and the robot hand are used as parts of a robot main body, wherein the robot main body further comprises a distance sensor unit and an image acquisition unit, and wherein the target object is disposed on a stage having spatial separation from a desired position and comprises a first mark, wherein the method further comprises: obtaining a distance information between the robot main body and the stage, using the distance sensor unit; obtaining a first image information containing an image of the first mark, using the first image acquisition unit; and moving the robot main body to the desired position, using the distance information and the first image information.
16. A transfer robot comprising: a steerable platform having an articulating arm attached thereto; a controller coupled to the steerable platform, the controller configured to position a first surface of the steerable platform at a predetermined distance, and with parallel alignment, to a second surface of a stage having a target object disposed thereon; and a robotic hand connected to the articulating arm, the robotic hand including at least two movable phalanxes configured to grip a respective recessed feature of the target object.
17. The transfer robot of claim 16 further comprising a plurality of distance sensors proximally located to the first surface and configured to measure a measured distance between the first surface and the second surface, wherein the controller directs a movement of the steerable platform towards the stage until the measured distance is the same as the predetermined distance.
18. The transfer robot of claim 17 wherein a difference between distances measured by two of the plurality of distance sensors is reduced by steering the steerable platform, thereby aligning the first surface in parallel to the second surface.
19. The transfer robot of claim 16 further comprising an obstacle sensor proximally located to the first surface, the obstacle sensor configured to detect an object between the steerable platform and the stage, and to communicate to the controller to alter a path between the steerable platform and the stage.
20. The transfer robot of claim 16 further comprising a rotatable connection between the robotic hand and the articulating arm configured to provide rotational alignment between the robotic hand and the target object, an image sensor on the robotic hand configured to receive an image of an alignment mark on the target object, the image communicated to the controller to move the at least two phalanxes to grip the respective recessed feature of the target object.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0011] Example embodiments will be more clearly understood from the following brief description taken in conjunction with the accompanying drawings. The accompanying drawings represent non-limiting, example embodiments as described herein.
[0031] It should be noted that these figures are intended to illustrate the general characteristics of methods, structure and/or materials utilized in certain example embodiments and to supplement the written description provided below. These drawings are not, however, to scale and may not precisely reflect the precise structural or performance characteristics of any given embodiment, and should not be interpreted as defining or limiting the range of values or properties encompassed by example embodiments. For example, the relative thicknesses and positioning of molecules, layers, regions and/or structural elements may be reduced or exaggerated for clarity. The use of similar or identical reference numbers in the various drawings is intended to indicate the presence of a similar or identical element or feature.
DETAILED DESCRIPTION
[0032] Reference will now be made in detail to the embodiments of the present general inventive concept, examples of which are illustrated in the accompanying drawings, wherein like reference numerals refer to the like elements throughout. The embodiments are described below in order to explain the present general inventive concept by referring to the figures.
[0033]
[0034] Referring to
[0035] The robot main body 100-800 may include a body unit 100, a control unit 800, a distance sensor unit 300, a first image acquisition unit 400, and a manipulation unit 600. The transfer robot 10 may further include a target object-sensing unit 500, a second image acquisition unit 700, and an obstacle-sensing unit 200.
[0036] At least a part of the appearance of the transfer robot 10 may be defined by the body unit 100. The body unit 100 may be equipped with various units. For example, the body unit 100 may be equipped with the obstacle-sensing unit 200, the distance sensor unit 300, the first image acquisition unit 400, the target object-sensing unit 500, the manipulation unit 600, the control unit 800, and the driving unit 900. In various embodiments, the control unit 800 of
[0037] The obstacle-sensing unit 200 may be oriented to a driving direction of the robot main body 100-800. Accordingly, the obstacle-sensing unit 200 may be configured to detect an obstacle O (such as an object, a human, or a stage) (e.g., see
[0038] The distance sensor unit 300 may be configured to obtain information (hereinafter, distance information I2) pertaining to a distance between the robot main body 100-800 and a stage 20, (e.g., see
[0039] As shown in
[0040] The first image acquisition unit 400 may be provided on the top surface of the body unit 100. The first image acquisition unit 400 may be disposed between the first distance sensor 310 and the second distance sensor 320. For example, the first image acquisition unit 400 may be positioned to be equidistant from the first and second distance sensors 310 and 320. The first image acquisition unit 400 may be placed on a Y-Z plane, (wherein the Z axis is orthogonal to both the X and Y axes), passing through a center of the body unit 100. Here, the center of the body unit 100 may be the center of gravity of the body unit 100. The first image acquisition unit 400 may include a charge-coupled device (CCD), but the inventive concept may not be limited thereto. For example, various imaging units may be used for the first image acquisition unit 400, if they have an imaging function.
[0041] The manipulation unit 600 may be configured to grasp and pick up a target object 30 (e.g., see
[0042] The robot arm 610 may include a plurality of rods 611-614 and at least one hinge 615-617. Alternatively, the robot arm 610 may be provided in the form of a single rod. In some embodiments, the robot arm 610 may include a first rod 611, a second rod 612, a third rod 613, a fourth rod 614, a first hinge 615, a second hinge 616, and a third hinge 617. At least one or all of the first, second, third, and fourth rods 611-614 may be shaped like an elongated bar with a circular or rectangular section, but the inventive concept may not be limited thereto.
[0043] The first rod 611 may include an end portion, which is connected to the body unit 100. The first rod 611 may be placed on an X-Y plane, as shown in
[0044] The first hinge 615 may connect the first rod 611 to the third rod 613 to allow the third rod 613 to be rotatable about the first rod 611. The second hinge 616 may connect the third rod 613 to the fourth rod 614 to allow the fourth rod 614 to be rotatable about the third rod 613. The third hinge 617 may connect the second rod 612 to the fourth rod 614 to allow the second rod 612 to be rotatable about the fourth rod 614.
[0045] As described above, the robot hand 620 may be connected to an end portion of the robot arm 610, (e.g., to the second rod 612 for the embodiment shown in
[0046] The robot hand 620 may include a palm 621, a plurality of fingers 622, and a palm-rotating unit 623. The palm 621 may be a flat plate with a specific area. The palm 621 may be a circular or rectangular disk with a flat surface. The palm-rotating unit 623 may be connected to a surface of the palm 621. The palm-rotating unit 623 may be configured to rotate the palm 621. In certain embodiments, the palm 621 may be configured to be rotated by the robot arm 610. For example, the second rod 612 may be configured to rotate about a rotation axis passing through its two opposite end portions. Such a rotation of the second rod 612 may lead to rotation of the palm 621. The fingers 622 may be connected to an opposite surface of the palm 621, opposing a surface connected to the palm-rotating unit 623. In addition, the second image acquisition unit 700 may be provided on the opposite surface of the palm 621. In one example, the second image acquisition unit 700 is on the same surface of the palm 621 as the fingers 622.
[0047] Each of the fingers 622 may be inserted into a corresponding one of a plurality of grip recesses 31a and 31b (e.g., see
[0048] The palm-rotating unit 623 may be connected to an end portion of the robot arm 610. As described above, the palm-rotating unit 623 may be connected to the surface of the palm 621. The palm-rotating unit 623 may be configured to rotate the palm 621. This may make it possible to change or control positions of the fingers 622.
[0049] The target object-sensing unit 500 may be configured to detect the target object 30 provided on the stage 20. The target object-sensing unit 500 may include a detection sensor 510 and a scan unit 520. A position of the detection sensor 510 may be controlled by the scan unit 520, to enable the detection sensor 510 to detect the target object 30 in a scan region S (e.g., see
[0050] The target object 30 may have second marks 32a and 32b, (e.g., see
[0051] The control unit 800 may be provided in the body unit 100. Accordingly, the control unit 800 may be protected from an external impact. The control unit 800 may be configured to receive the obstacle-sensing information I1 from the obstacle-sensing unit 200. The control unit 800 may be configured to receive the distance information I2 from the distance sensor unit 300. The control unit 800 may be configured to receive the first image information I3 from the first image acquisition unit 400. The control unit 800 may be configured to receive the target object position information I4 from the target object-sensing unit 500. The control unit 800 may be configured to receive the second image information I5 from the second image acquisition unit 700. In the control unit 800, the received information I1-I5 may be used to control the driving unit 900 and the manipulation unit 600. The control unit 800 will be described in more detail with reference to
[0052] The robot main body 100-800 may be moved toward one of a plurality of stages 20 (e.g., see
[0053] In some embodiments, the driving unit 900 may include a plurality of driving wheels (not shown), which are configured to control the motion of the robot main body 100-800, and a driving part (not shown), which is configured to apply a driving force to the driving wheels, but the inventive concept is not limited thereto. For example, various devices may be provided in the driving unit 900, if they are capable of moving the robot main body 100-800. The driving part may apply a driving force to the plurality of driving wheels, in response to control signals transmitted from the control unit 800. The driving force of the driving unit 900 may be used to move the robot main body 100-800 along the X-Y plane. In addition, the driving unit 900 may further include an apparatus for changing a position of the robot main body 100-800 in a Z-direction. In one embodiment, the driving unit includes four wheels. In other embodiments, the driving unit includes three wheels to ensure that all wheels remain in contact with the floor. In other embodiments, the driving unit includes low wear components that are suitable for a clean room environment.
[0054]
[0055] Referring to
[0056] With reference to
[0057]
[0058] Referring to
[0059] As a result of the movement along the driving path P, the body unit 100 may be spaced apart from the stage 20 by a predetermined relative distance D. In addition, the body unit 100 may be placed to form a predetermined relative angle with respect to the stage 20. Here, the relative distance D may refer to a straight distance from a center point of a surface of the body unit 100 (e.g., the first image acquisition unit 400) to a surface of the stage 20 provided with the first mark 21. The relative angle may refer to an angle between the surface of the body unit 100 and the surface of the stage 20 provided with the first mark 21.
[0060] The distance sensor unit 300 may obtain information on a distance between the robot main body 100-800 and a stage 20 (e.g., the distance information I2) (in step S15 of
[0061] When the body unit 100 is positioned adjacent to the stage 20, the control unit 800 may obtain the relative angle θ between the body unit 100 and the stage 20 and the relative distance D between the body unit 100 and the stage 20 from information on the first distance D.sub.1, the second distance D.sub.2, and the distance L between the first and second distance sensors 310 and 320 (see
[0062] The control unit 800 may calculate the relative distance D, using the following equation 1:
D=(D.sub.1+D.sub.2)/2   (Equation 1)
[0063] The control unit 800 may calculate the relative angle θ, using the following equation 2:
θ=tan.sup.-1((D.sub.1-D.sub.2)/L)   (Equation 2)
[0064] With reference to
[0065] In certain embodiments, the control unit 800 may control the driving unit 900 to allow the difference between the first and second distances D.sub.1 and D.sub.2 to be equal to or less than a predetermined value. For example, the control unit 800 may control the driving unit 900 until the difference between the first and second distances D.sub.1 and D.sub.2 is zero (as defined by the measurement resolution of the distance sensors 310 and 320). In this case, the body unit 100 may be positioned in such a way that its surface is parallel to a surface of the stage 20. The control unit 800 may also control the driving unit 900 to allow the relative distance D between the body unit 100 and the stage 20 to be within a predetermined distance range.
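The pose-estimation step above (Equations 1 and 2, plus the parallelism check of paragraph [0065]) can be sketched as follows. This is a minimal illustration, not the patented implementation; the function names and the tolerance default are assumptions.

```python
import math

def relative_pose(d1, d2, sensor_separation):
    """Estimate the relative distance D (Equation 1) and the relative
    angle theta (Equation 2) of the robot body with respect to the
    stage, from two distance-sensor readings d1 and d2 taken by sensors
    spaced sensor_separation (the distance L) apart."""
    distance = (d1 + d2) / 2.0                          # Equation 1
    angle = math.atan((d1 - d2) / sensor_separation)    # Equation 2
    return distance, angle

def is_aligned(d1, d2, tolerance=0.001):
    """The body surface is treated as parallel to the stage surface when
    the two readings agree to within the sensors' resolution."""
    return abs(d1 - d2) <= tolerance
```

When both sensors report the same distance, the computed angle is zero and the body surface is parallel to the stage, matching the condition described in paragraph [0065].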
[0066]
[0067] Referring to
[0068] The control unit 800 may control the driving unit 900 until at least one of the X- and Z-coordinates (x.sub.2, z.sub.2) corresponding to C.sub.2 coincides with the predetermined reference coordinate C.sub.1 (in step S19 of
[0069] When the robot main body 100-800 is located adjacent to the stage 20, the body unit 100 may be positioned in such a way that the relative distance D is equal to a predetermined distance. Here, the obtained coordinates of the reference point C.sub.2 of the first mark 21 may not coincide with the predetermined reference coordinate C.sub.1.
[0070] The control unit 800 may calculate an error Δx between the X-coordinate x.sub.2 of the reference point C.sub.2 of the first mark 21 and the X-coordinate x.sub.1 of the predetermined reference coordinate C.sub.1. The control unit 800 may calculate an error Δz between the obtained Z-coordinate z.sub.2 of the reference point C.sub.2 of the first mark 21 and the Z-coordinate z.sub.1 of the predetermined reference coordinate C.sub.1.
[0071] The control unit 800 may control the driving unit 900 to move the robot main body 100-800 by the calculated errors Δx and Δz in the X- and Z-directions, thereby minimizing the errors Δx and Δz in a subsequent calculation. Accordingly, the robot main body 100-800 may be located at the target position C that is appropriately spaced apart from the stage 20.
[0072] In some embodiments, the control unit 800 may control the driving unit 900 to allow the X- and Z-coordinates (x.sub.2, z.sub.2) obtained from the first image information I3 to coincide with the X- and Z-coordinates (x.sub.1, z.sub.1) contained in the predetermined reference coordinate C.sub.1. In certain embodiments, the control unit 800 may control the driving unit 900 to allow only the X-coordinate x.sub.2 to coincide with the X-coordinate x.sub.1 contained in the predetermined reference coordinate C.sub.1.
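The mark-centering loop of paragraphs [0070]-[0072] can be illustrated with a short sketch. The `get_mark_xz` and `move_by` callables stand in for the first image acquisition unit and the driving unit; they are hypothetical interfaces, not part of the source.

```python
def centering_correction(mark_xz, reference_xz):
    """Return the X and Z corrections that move the obtained reference
    point C2 of the first mark onto the reference coordinate C1
    (the negatives of the errors dx and dz of paragraph [0070])."""
    x2, z2 = mark_xz
    x1, z1 = reference_xz
    return x1 - x2, z1 - z2

def center_on_mark(get_mark_xz, move_by, reference_xz, tol=0.5, max_iters=10):
    """Repeatedly measure the mark position and drive the body until
    the mark's reference point coincides with the reference coordinate
    to within tol (step S19). Returns True on convergence."""
    for _ in range(max_iters):
        dx, dz = centering_correction(get_mark_xz(), reference_xz)
        if abs(dx) <= tol and abs(dz) <= tol:
            return True
        move_by(dx, dz)
    return False
```

The loop structure mirrors the description: measure, compute the error, move by the error, and re-measure until the residual is within tolerance.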
[0073] When the relative distance D between the body unit 100 and the stage 20 coincides with the predetermined distance and the X- and Z-coordinates (x.sub.2, z.sub.2) obtained from the first image information I3 coincide with the X- and Z-coordinates (x.sub.1, z.sub.1) contained in the predetermined reference coordinate, the robot main body 100-800 may be positioned at the target position C (e.g., see
[0074] Referring to
[0075] Information code 21a may be formed on the first mark 21 of the stage 20. For example, the information code 21a of the first mark 21 may include a QR code, a barcode, or a DATA matrix. The control unit 800 may obtain the information code 21a of the first mark 21 from the first image information I3.
[0076] The information code 21a of the first mark 21 may include one or more of a position of the stage 20, a relative distance between the robot main body 100-800 and the stage 20, a relative angle between the robot main body 100-800 and the stage 20, and a reference coordinate of the first mark 21.
[0077] The control unit 800 may obtain the information on the position of the stage 20, on the relative distance between the robot main body 100-800 and the stage 20, on the relative angle between the robot main body 100-800 and the stage 20, and on the reference coordinate of the first mark 21 from the information code 21a.
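A decoded information code of the kind described in paragraphs [0075]-[0077] might carry its fields in a simple serialized form. The payload layout below (semicolon-separated key=value pairs and the field names) is purely illustrative; the source does not specify an encoding.

```python
def parse_stage_code(payload):
    """Parse a hypothetical key=value payload decoded from the first
    mark's information code (e.g. a QR code) into the fields named in
    paragraph [0076]: stage position/identity, relative distance,
    relative angle, and the reference coordinate of the first mark."""
    fields = dict(item.split("=", 1) for item in payload.split(";"))
    return {
        "stage_id": fields["stage_id"],
        "relative_distance_mm": float(fields["dist_mm"]),
        "relative_angle_deg": float(fields["angle_deg"]),
        "reference_xz": (float(fields["ref_x"]), float(fields["ref_z"])),
    }
```

In practice the control unit would obtain `payload` by running a QR or DataMatrix decoder over the first image information I3; that decoding step is omitted here.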
[0078] When a plurality of stages 20 are provided (as shown in
[0079]
[0080] Referring to
[0081] The scan unit 520 may be configured to adjust or change a position of the detection sensor 510 in a Z-direction. Accordingly, the target object-sensing unit 500 may obtain information on a Z-coordinate of the target object 30 in the scan region S (e.g., a two dimensional scan region). The information on X, Y, and Z-coordinates of the target object 30 obtained by the target object-sensing unit 500 may be transmitted to the control unit 800.
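The scan described in paragraphs [0080]-[0081] yields target-object coordinates in X, Y, and Z. A minimal sketch, assuming the scan is available as a sequence of samples (the sample format is an assumption, not from the source):

```python
def locate_target(scan_samples):
    """Given hypothetical scan samples of the form
    (z_position, x, y, detected), return the (X, Y, Z) coordinates of
    the first detected target object in the scan region S, or None if
    no object is detected."""
    for z, x, y, detected in scan_samples:
        if detected:
            return (x, y, z)
    return None
```

The returned triple corresponds to the target object position information I4 that the target object-sensing unit 500 transmits to the control unit 800.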
[0082] Based on the target object position information I4 obtained by the target object-sensing unit 500, the control unit 800 may control the robot arm 610 to move the robot hand 620 toward the target object 30 (in step S22 of
[0083]
[0084] Referring to
[0085] The second image acquisition unit 700 provided on the palm 621 of the robot hand 620 may be configured to take images of the second marks 32a and 32b of the target object 30 and to obtain the second image information I5, in which the images of the second marks 32a and 32b are contained (in step S23 of
[0086] The control unit 800 may obtain information on positions of the second marks 32a and 32b, based on the second image information I5 (in step S24 of
[0087] The control unit 800 may extract an information code (not shown) of the second marks 32a and 32b from the second image information I5. The information code of the second marks 32a and 32b may contain information on the target object 30. For example, the information code of the second marks 32a and 32b may contain various types of information (e.g., a kind or a production year of the target object 30). The control unit 800 may transmit the information on the target object 30 to a user via a communication unit (not shown).
[0088]
[0089] Referring to
[0090] Thereafter, under the control of the control unit 800, the fingers 622 of the robot hand 620 may be partially inserted into the grip recesses 31a and 31b, respectively (in step S25 of
[0091] If the fingers 622 are partially inserted into the grip recesses 31a and 31b, the control unit 800 may control the robot arm 610 to elevate the robot hand 620 in the Z-direction (in step S26 of
[0092] In certain cases, the target object 30 may be placed at an angle to the stage 20. Consequently, the target object 30 will also be placed at an angle to the palm 621 of the robot hand 620. Accordingly, a distance Z.sub.1 between a side portion of the target object 30 and the palm 621 may be different from a distance Z.sub.2 between an opposite side portion of the target object 30 and the palm 621, (see
[0093] If the robot hand 620, in which the fingers 622 are partially inserted into the grip recesses 31a and 31b, is elevated in the Z-direction, the target object 30 may be rotated by gravitational force and thus will become aligned to be parallel to the palm 621. Accordingly, it is possible to compensate for the difference in level between the side portions of the target object 30.
[0094] The control unit 800 may control the robot hand 620 to further insert the fingers 622 into remaining regions of the grip recesses 31a and 31b, respectively, after the elevation of the robot hand 620 (in step S27 of
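The two-stage gripping sequence of steps S25-S27 can be summarized in a short sketch. The `hand` and `arm` objects and their methods are hypothetical interfaces standing in for the robot hand 620 and robot arm 610.

```python
def pick_up(hand, arm, lift_height):
    """Two-stage gripping sequence: partially insert the fingers
    (step S25), elevate the hand so that gravity rotates a tilted
    target object into alignment with the palm (step S26), then
    insert the fingers fully (step S27)."""
    hand.insert_fingers(depth="partial")   # step S25
    arm.elevate(lift_height)               # step S26: object self-aligns
    hand.insert_fingers(depth="full")      # step S27
```

The intermediate elevation is the key design point: it uses gravity, rather than an extra sensing step, to compensate for the tilt described in paragraph [0092].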
[0095]
[0096] The target object-sensing unit 501 may include the detection sensor 510 and a scan unit 521. In some embodiments, the scan unit 521 may be configured to rotate the detection sensor 510 on an X-Y plane within a specific angle range. This may allow the target object-sensing unit 501 to obtain information on X- and Y-coordinates of the target object 30 in the scan region S (e.g., see
[0097]
[0098] The stage 20 may include a first mark 21 (e.g., see
[0099] The first image acquisition unit 401 may be configured to obtain the first image information I3, in which three-dimensional images of the first mark 21 of the stage 20 are contained. The first image acquisition unit 401 may also be configured to transmit the first image information I3 to the control unit 800. The first image information I3 may include at least one two-dimensional or three-dimensional image of the first mark 21.
[0100] The control unit 800 may receive the first image information I3 obtained by the first image acquisition unit 401. In the control unit 800, the first image information I3 may be used to control the driving unit 900 to allow the robot main body 100-800 to be located at a desired position that is appropriately spaced apart from the stage 20. This will be described in more detail with reference to
[0101] The manipulation unit 600 may be provided on the body unit 100 and may be used to grasp and pick up a target object (not shown) disposed on the stage 20. The manipulation unit 600 may include the robot hand 620, which is configured to grasp the target object (not shown), and the robot arm 610, which is used to change a position of the robot hand 620.
[0102]
[0103] Referring to
[0104] As a result of the movement along the driving path, the robot main body 100-800 may be spaced apart from a surface of the stage 20 by a predetermined relative distance D. The robot main body 100-800 may be placed to form a predetermined relative angle with respect to the stage 20. Here, the relative distance D may refer to a straight distance from a center point of a surface of the body unit 100 of the robot main body 100-800 to the stage 20. For example, the distance D may be measured from a centroid of the first image acquisition unit 401 to the stage 20. The relative angle may refer to an angle between a surface of the body unit 100 of the robot main body 100-800 and the surface of the stage 20 provided with the first mark 21.
[0105] The control unit 800 may obtain a projection area A.sub.1 (see
[0106] The control unit 800 may obtain a length y.sub.2 in Y-direction of the first mark 21, based on the first image information I3. When the body unit 100 is placed to form the relative angle with respect to the stage 20 (as shown in
[0107] The control unit 800 may obtain the length y.sub.2 in the Y-direction of the first mark 21, based on the three-dimensional image of the first mark 21. The control unit 800 may also obtain the relative angle θ between the body unit 100 and the stage 20 from the obtained length y.sub.2. For example, the larger the relative angle θ between the body unit 100 and the stage 20, the longer the obtained length y.sub.2 in the Y-direction of the first mark 21. Conversely, the smaller the relative angle θ, the shorter the obtained length y.sub.2 in the Y-direction of the first mark 21.
[0108] The control unit 800 may control the driving unit 900 until the relative angle is equal to a predetermined angle value. In certain embodiments, the control unit 800 may control the driving unit 900 until the relative angle is less than the predetermined angle value. In some embodiments, the predetermined angle value may be about 0 degrees, but the inventive concept may not be limited thereto. If the relative angle is about 0 degrees, the body unit 100 may be placed in such a way that a surface thereof is substantially parallel to a surface of the stage 20. If the body unit 100 is placed to have a surface parallel to a surface of the stage 20, the length in Y-direction of the first mark 21 obtained by the control unit 800 may be substantially zero.
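The relation between the relative angle and the apparent Y-direction length of the mark, described in paragraphs [0106]-[0108], can be sketched under one added geometric assumption not stated in the source: the first mark is flat, of known width, and lies on the stage surface, so that at relative angle θ it projects a depth of width × sin(θ) along the Y-direction.

```python
import math

def angle_from_mark_depth(y2, mark_width):
    """Estimate the relative angle (in radians) from the apparent
    Y-direction length y2 of the first mark in a 3-D image, assuming a
    flat mark of known width: y2 == 0 when the body surface is parallel
    to the stage, and y2 grows with the relative angle. The ratio is
    clamped to guard against measurement noise pushing it above 1."""
    return math.asin(min(1.0, y2 / mark_width))
```

Consistent with paragraph [0108], the control loop would drive the body until this estimate falls to (approximately) zero, at which point the surfaces are parallel.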
[0109] The control unit 800 may obtain position information on the reference point C.sub.2 of the first mark 21, based on the first image information I3. For example, the control unit 800 may be configured to calculate X- and Z-coordinates (x.sub.2, z.sub.2) of the reference point C.sub.2 of the first mark 21, based on the first image information I3. In some embodiments, the reference point C.sub.2 of the first mark 21 may be a center point of the first mark 21, but the inventive concept is not limited thereto.
[0110] The control unit 800 may control the driving unit 900 until at least one of the X- and Z-coordinates (x.sub.2, z.sub.2) coincides with the predetermined reference coordinate C.sub.1. Here, the reference coordinate C.sub.1 may represent coordinates of the reference point C.sub.2 of the first mark 21, which are contained in the first image information I3 when the robot main body 100-800 is located at a desired position that is appropriately spaced apart from the stage 20, and the reference coordinate C.sub.1 may include X- and Z-coordinates (x.sub.1, z.sub.1).
[0111] When the robot main body 100-800 is located to be adjacent to the stage 20, the obtained coordinates of the reference point C.sub.2 of the first mark 21 may not coincide with the predetermined reference coordinate C.sub.1. The control unit 800 may calculate an error Δx between the X-coordinate x.sub.2 of the reference point C.sub.2 of the first mark 21, which is obtained from the first image information I3, and the X-coordinate x.sub.1 contained in the predetermined reference coordinate C.sub.1. The control unit 800 may calculate an error Δz between the Z-coordinate z.sub.2 of the reference point C.sub.2 of the first mark 21, which is obtained from the first image information I3, and the Z-coordinate z.sub.1 of the predetermined reference coordinate C.sub.1.
[0112] Referring to
[0113] While example embodiments of the inventive concepts have been particularly shown and described, it will be understood by one of ordinary skill in the art that variations in form and detail may be made therein without departing from the spirit and scope of the attached claims.