Control device and method for controlling a robot system by means of gesture control

09901411 · 2018-02-27

Abstract

The invention relates to a control device (25, 28, 29) for controlling a robot system (11) with at least one robot arm (14, 18) on which a surgical instrument (15, 19) having an end effector (17, 21) is secured, wherein the control device (25, 28, 29) comprises an imaging system (25, 28) which records the control specification of at least one hand (30.sub.L, 30.sub.R), evaluates it and converts it into corresponding control commands for one or more components of the robot system (11). In order to simplify the control of the robot system (11), and in particular to make it intuitive, it is proposed that a control unit (25) be provided which determines the orientation and/or the position and/or the degree of opening of the end effector (17, 21) of a surgical instrument (15, 19) as first parameter or first parameters, and moreover determines the orientation and/or the position and/or the degree of opening of at least one hand (30.sub.L, 30.sub.R) as second parameter or second parameters, and, in the event of a deviation of one or more of the first parameters from the corresponding second parameter, suppresses a manual control of the end effector (17, 21), and, in the event of an agreement of one or more of the first parameters with the corresponding second parameter, enables a gesture control, such that the surgical instrument (15, 19) can be actuated by hand.

Claims

1. A control device for controlling a robot system with at least one robot arm on which a surgical instrument is secured which has an end effector, the control device comprising: an imaging system which records the control specification of at least one hand, evaluates it and converts it into corresponding control commands for one or more components of the robot system, and a control unit which is constructed and arranged to: determine the orientation and/or the position and/or the degree of opening of the end effector of the surgical instrument as a first parameter or first parameters, and furthermore determine the orientation and/or the position and/or the degree of opening of at least one hand as a second parameter or second parameters, and in the event that one or more of the first parameters deviate from the respectively corresponding second parameter or in the event that the controlling hand moves out of a detection region, automatically deactivate the gesture control of the end effector, and in the event that one or more of the first parameters agree with the respectively corresponding second parameter and in the event that the controlling hand is moved in the detection region, enable a gesture control such that the end effector can be operated by means of the gesture control.

2. The control device according to claim 1, wherein at least two of the second parameters must agree with the respectively corresponding first parameters in order to enable the gesture control.

3. The control device according to claim 1, wherein the control unit is constructed and arranged to emit a signal in the event of a deviation of one or more of the first parameters from the respectively corresponding second parameter, said signal requiring the user of the robot system to adapt the orientation and/or the position and/or the degree of opening of at least one hand to the corresponding state of the end effector.

4. The control device according to claim 1, wherein in the event of an agreement of one or more of the second parameters with the respectively corresponding first parameter, the control unit is constructed and arranged to emit a signal which signals to the user that an alignment of a hand with the controlled end effector was successful.

5. The control device according to claim 1, wherein the control unit is constructed and arranged to depict a virtual element on a screen, the orientation and/or position and/or degree of opening of which corresponds to that or those of the hand.

6. The control device according to claim 1, wherein the control unit is constructed and arranged to depict a further element on a screen, the orientation and/or position and/or degree of opening of which corresponds to that or those of the controlled end effector, and which serves as a target specification for the alignment between the controlling hand and the controlled end effector.

7. The control device according to claim 1, further comprising a manually operated auxiliary element.

8. The control device according to claim 1, wherein the control unit is constructed and arranged to determine the position of the hand by setting a point on the hand or on the associated arm or on an associated auxiliary element and determining this point in a coordinate system of the imaging system.

9. The control device according to claim 1, wherein the control unit is constructed and arranged to determine the orientation of at least one hand by setting at least one vector and determining this in the coordinate system of the imaging system.

10. The control device according to claim 9, wherein to set the vector, the control unit is constructed and arranged to set at least one further point on the hand and/or on an auxiliary element.

11. The control device according to claim 10, wherein the vector lies on a plane which is spanned by at least three of the set points.

12. The control device according to claim 1, wherein the control unit is constructed and arranged to set at least one point on the hand and/or on the auxiliary element which lies on a tip of a finger or of the auxiliary element.

13. The control device according to claim 1, wherein the control unit is constructed and arranged to determine the degree of opening of at least one hand by setting two lines and determining the angle between the lines in the coordinate system of the imaging system.

14. The control device according to claim 11, wherein the control unit is constructed and arranged to determine the degree of opening of at least one hand by determining the distance between two points on the hand.

15. The control device according to claim 7, wherein the manually operated auxiliary element comprises an inertial sensor which can measure a position of the auxiliary element in space and/or a movement of the auxiliary element in or around three spatial axes.

16. The control device according to claim 1, wherein the control unit is constructed and arranged such that a control movement of a hand is converted into a corresponding movement of the end effector using a predetermined scaling factor.

17. The control device according to claim 1, further comprising a screen on which an object controlled by means of gesture control is depicted, wherein the orientation of at most two axes of a coordinate system of the imaging system agrees with the orientation of the corresponding axes of a coordinate system of the screen, and the object is displayed such that it is moved on the screen in relation to the coordinate system of the screen in the same direction as the movement of the hand in relation to the coordinate system of the imaging system.

18. The control device according to claim 1, wherein a camera is provided which records the end effector, and a coordinate system of the camera is aligned to the image axes of the image recorded by the camera, wherein one of the axes of the coordinate system of the camera points in the viewing direction of the camera.

19. The control device according to claim 18, wherein the control unit is constructed and arranged such that a control command of a hand in the coordinate system of the imaging system is converted into a corresponding movement of the end effector in the coordinate system of the camera.

20. The control device according to claim 19, wherein, to control the end effector, the active coordinate system switches between the coordinate systems of the cameras, depending on which camera records the end effector.

21. A method for controlling a robot system by means of at least one hand, comprising the following steps: determining an orientation and/or a position and/or a degree of opening of an end effector of a surgical instrument as a first parameter or first parameters; determining the orientation and/or the position and/or the degree of opening of the at least one hand (30.sub.L, 30.sub.R) as a second parameter or second parameters; comparing at least one first parameter with the respectively corresponding second parameter; in the event that one or more of the first parameters deviate from the respectively corresponding second parameter or in the event that the controlling hand moves out of a detection region, automatically deactivating the gesture control of the end effector; and in the event that one or more of the first parameters agree with the respectively corresponding second parameter and in the event that the controlling hand is moved in the detection region, enabling the gesture control such that the end effector can be controlled using the at least one hand.

Description

BRIEF DESCRIPTION OF THE DRAWINGS

(1) The invention is explained in more detail below by means of the enclosed drawings, which show:

(2) FIG. 1 a depiction of an input device known from prior art and an end effector controlled therewith;

(3) FIG. 2 a perspective view of a robot system for minimally invasive surgery having two robot arms and a control device to implement a gesture control;

(4) FIG. 3 a side view of several surgical instruments which are introduced into the body of a patient;

(5) FIG. 4 a depiction of two hands and different reference points and lines to determine the orientation, the position and the angle of opening of the hands;

(6) FIG. 5 a depiction of two hands similar to FIG. 4, wherein each hand holds an auxiliary element to guide the hand movements;

(7) FIG. 6 a screen on which the end effectors controlled by means of gesture control as well as two virtual end effectors are depicted; and

(8) FIG. 7 a schematic depiction of different method steps of a method for controlling a robot system by means of gesture control.

(9) With regard to the explanation of FIG. 1, reference is made to the introduction of the description.

(10) FIG. 2 shows a schematic depiction of a robot system 11 for minimally invasive surgery having two robot arms 14, 18 and a control device for controlling the different components of the robot system 11 by means of gesture control. The robot arms 14, 18 are secured here to an operating table 12 via a rail 24. The robot arms 14, 18 can be constructed in the same manner or differently. A patient 13, on whom a surgical intervention is to be carried out, lies on the operating table 12.

(11) The robot arms 14 and 18 are each equipped with a surgical instrument 15 or 19. Surgical instruments can in principle be all instruments which are suitable for a surgical intervention, such as, for example, a scalpel, a gripper, scissors, electro-surgical instruments, endoscopes, a camera, a stapler, etc. The instrument 15 can, for example, be formed as scissors, and the instrument 19 as a gripper. The instruments 15, 19 typically have a shaft 16, 20, on the distal end of which an end effector 17, 21 corresponding to the function of the instrument is attached (see FIG. 3).

(12) The instruments 15, 19 are moved by the robot arms 14 and 18 and introduced into the body of the patient 13 via small artificial openings. The actual surgical intervention can then be implemented by operation of the end effectors 17 or 21.

(13) To control the end effectors 17, 21 (this is understood below to mean the positioning and orientation in space as well as the actuation of the actual working elements), a control device with an imaging system is provided. The imaging system comprises a camera 28 which monitors a cubic detection region 36. The imaging system is thereby able to recognise and interpret manually executed gestures. The image data recorded by the camera are transmitted to a control unit 25 and evaluated by an evaluation unit (software) contained therein. The control unit 25 then generates control commands corresponding to the recognised gestures, with which the actuators of the controlled components 14, 15, 17, 18, 19, 21 are controlled. The robot arms 14, 18, the instruments 15, 19 and/or the end effectors 17, 21 can thereby each be operated individually or all at the same time.

(14) According to one exemplary embodiment, for example, the left end effector 21 can be operated with the left hand 30.sub.L of the user and the right end effector 17 with the right hand 30.sub.R. By the term operation of the end effector, it is here understood that the end effector 17, 21 can be positioned or orientated in three-dimensional space and a certain function, such as, for example, cutting, gripping or coagulating, can be executed. The end effectors 17, 21 can optionally be controlled or regulated in terms of position and/or speed.

(15) The robot arms 14 and 18 are connected here to the control unit 25 via a respective cable 26 or 27. Alternatively, a wireless control could also be provided.

(16) As has been explained above, a predetermined detection region 36 is available to the user in which he can execute manual gestures in order to control the robot system 11. The control device is preferably designed such that only gestures which are executed within this region 36 are converted into corresponding control commands by the control unit 25. Gestures implemented outside the detection region 36 are, however, not converted into corresponding control commands.
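
This gating can be illustrated with a short sketch. The cuboid bounds, the point format and the function names are assumptions chosen for illustration; the text does not specify an implementation:

```python
import numpy as np

# Hypothetical bounds of the cubic detection region 36 in the coordinate
# system K_G of the imaging system (in metres); the real dimensions are
# not specified in the text.
REGION_MIN = np.array([-0.3, -0.3, 0.0])
REGION_MAX = np.array([0.3, 0.3, 0.6])

def in_detection_region(point):
    """True if a tracked hand point lies inside the detection region 36."""
    p = np.asarray(point, dtype=float)
    return bool(np.all(p >= REGION_MIN) and np.all(p <= REGION_MAX))

def convert_gesture(hand_point, command):
    """Forward a gesture only while the hand is inside the region 36;
    gestures outside the region are ignored."""
    return command if in_detection_region(hand_point) else None
```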

(17) The detection region 36 has a first coordinate system K.sub.G in which the positions of the hands 30.sub.L, 30.sub.R as well as their orientation in three-dimensional space can be determined unambiguously. Therefore, for example, the direction in which the hand 30.sub.L, 30.sub.R or its fingers are pointing can be determined, as well as the distances of the hands 30.sub.L, 30.sub.R or fingers from each other. Additionally, the movement of the hands 30.sub.L, 30.sub.R or of their fingers can be determined, such as, for example, the distance covered, the movement speed and/or the movement direction.

(18) In order to determine the position of a hand 30.sub.L, 30.sub.R, one or more points on the hand 30.sub.L, 30.sub.R and/or on the associated arm can be recognised with the aid of the imaging system and used as a reference point for the position of the respective hand 30.sub.L, 30.sub.R. As is shown in FIG. 4, for example, the points P.sub.1L of the left hand 30.sub.L and P.sub.1R of the right hand 30.sub.R are determined, which lie on the wrist joint of the respective hand 30.sub.L, 30.sub.R. The position of the two hands 30.sub.L, 30.sub.R can therefore be described unambiguously in the coordinate system K.sub.G.

(19) Additionally, further points on the hands, such as, for example, the points P.sub.2L, P.sub.2R and P.sub.3L, P.sub.3R on the fingertips of the thumb and index finger of the left and right hand 30.sub.L, 30.sub.R, can also be defined as reference points. In order to determine the orientation of the hands 30.sub.L, 30.sub.R, a respective line 31, 32, 33 or 34 can be laid between two points of the hand 30.sub.L or 30.sub.R. The orientation of a hand 30.sub.L, 30.sub.R can, for example, be defined as a vector which lies on one of the lines 31 to 34 referred to. Alternatively, however, a vector V.sub.1L, V.sub.1R could also be defined which lies between the lines 31 and 32 or 33 and 34 referred to. Such a vector could lie in a plane which is spanned by the three points P.sub.1L, P.sub.2L and P.sub.3L of the left hand 30.sub.L or by the three points P.sub.1R, P.sub.2R and P.sub.3R of the right hand 30.sub.R. This vector V.sub.1L, V.sub.1R is then used as a reference for the orientation of the hand 30.sub.L or 30.sub.R.

(20) Alternatively, the vectors V.sub.1L and V.sub.1R could, however, also enclose any other angle and, for example, point directly to the fingertip of the index finger (point P.sub.3L or P.sub.3R).

(21) The degree of opening of a hand 30.sub.L, 30.sub.R between the thumb and the index finger can, for example, be defined by the angle α.sub.L or α.sub.R between the lines 31 and 32 or 33 and 34. This angle α.sub.L, α.sub.R is dependent on the distance between the thumb and the index finger. The degree of opening of a hand 30.sub.L, 30.sub.R can, however, also be determined by the distance between two points P.sub.2L, P.sub.2R, P.sub.3L, P.sub.3R on the hand 30.sub.L, 30.sub.R, for example by the distance between the tip of the index finger P.sub.3L and the tip of the thumb P.sub.2L.

(22) The position, the orientation and the degree of opening of a hand 30.sub.L, 30.sub.R can therefore be determined unambiguously in the coordinate system K.sub.G. Additionally, an associated speed or acceleration can be determined for any change of position, orientation and degree of opening.

(23) As shown in FIG. 4, the vectors V.sub.1L or V.sub.1R advantageously always run at half of the angle of opening α.sub.L or α.sub.R between the lines 31 and 32 or 33 and 34.
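
Paragraphs (18) to (23) can be summarised in a minimal numpy sketch. It assumes the three landmark points (wrist, thumb tip, index fingertip) are already available in K.sub.G; the function name and return convention are illustrative only:

```python
import numpy as np

def hand_parameters(p1, p2, p3):
    """Derive position, orientation vector V1 and degree of opening alpha
    of one hand from three landmarks in K_G:
    p1 = wrist joint (P.1), p2 = thumb tip (P.2), p3 = index fingertip (P.3)."""
    p1, p2, p3 = (np.asarray(p, dtype=float) for p in (p1, p2, p3))
    u = (p2 - p1) / np.linalg.norm(p2 - p1)  # line 31/33: wrist -> thumb tip
    v = (p3 - p1) / np.linalg.norm(p3 - p1)  # line 32/34: wrist -> index tip
    alpha = np.arccos(np.clip(np.dot(u, v), -1.0, 1.0))  # opening angle
    v1 = (u + v) / np.linalg.norm(u + v)  # V1 halves alpha and lies in the
                                          # plane spanned by p1, p2 and p3
    return p1, v1, alpha
```

The distance np.linalg.norm(p3 - p2) between thumb tip and index fingertip would give the alternative measure of opening mentioned in paragraph (21).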

(24) Besides the recognition of hand or arm points described above, there are, however, further possibilities for determining the position, the orientation or the degree of opening of a hand. As is shown in FIG. 5, the control device of the robot system 11 can also comprise manually operated auxiliary elements 35.sub.L, 35.sub.R which serve to guide an opening and closing movement of the hand between thumb and index finger. In the depicted exemplary embodiment, a separate auxiliary element 35.sub.L, 35.sub.R is provided for each hand 30.sub.L, 30.sub.R.

(25) Each auxiliary element 35.sub.L, 35.sub.R comprises two limbs which are connected to each other by a hinge and which correspond in their shape and arrangement to a gripper or to surgical scissors. The auxiliary elements 35.sub.L, 35.sub.R each have precisely one degree of freedom, i.e. only a movement of the thumb tip and index fingertip towards or away from each other around a defined axis 37 is possible. Due to this limitation of the freedom of movement of thumb and index finger, incorrect interpretations during gesture recognition can be excluded, as the fingertips can only move along a fixedly defined path. The movable limbs of the auxiliary elements 35.sub.L, 35.sub.R can, for example, have loops 36.sub.L, 36.sub.R into which the user can insert his thumb or index finger.

(26) In a manner analogous to the determination of distinctive points on the hands 30.sub.L, 30.sub.R, points can also be determined on the auxiliary elements 35.sub.L, 35.sub.R, and from these the position, the orientation and/or the degree of opening of the associated hand 30.sub.L or 30.sub.R can be determined. Instead of the wrist joint points P.sub.1L or P.sub.1R, for example, the joint points 37.sub.L or 37.sub.R of the auxiliary elements, which are depicted in FIG. 5, could be used. Instead of the vectors V.sub.1L and V.sub.1R, for example, the hinge axes which run through the points 37.sub.L or 37.sub.R could be used. The angle between the hinged limbs of an auxiliary element 35.sub.L or 35.sub.R, or the distance between the two limb ends (analogously to the fingertips), for example, could be used as a reference for the degree of opening of the hand.

(27) Alternatively or additionally, the auxiliary elements 35.sub.L and 35.sub.R could each be equipped with a sensor system 38.sub.L and 38.sub.R. This sensor system can likewise be used to determine the position, the orientation and/or the degree of opening of the hand. If, for example, an angle sensor is provided which measures the angle between the two hinged limbs, the degree of opening of the hand 30.sub.L or 30.sub.R between the thumb and index finger can be determined. Additionally, magnetic sensors, acceleration sensors or inertial sensors could also be integrated, with which the position and the orientation, but also a movement speed or acceleration of the hand 30.sub.L or 30.sub.R, can be detected. The sensor data can, for example, be transferred in a contactless manner to the control unit 25. Redundant pieces of information can be compared to one another in order to detect or rectify possible errors. A missing or incorrect piece of information can, for example, also be replaced by a piece of information which is present redundantly.
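
The comparison of redundant pieces of information might look as follows; the tolerance value and the fallback behaviour are assumptions for illustration, not specifications from the text:

```python
import numpy as np

def fused_opening_angle(camera_angle, sensor_angle, tol=np.deg2rad(5.0)):
    """Cross-check the opening angle derived from the camera 28 against the
    angle sensor of an auxiliary element; either value may be None."""
    if camera_angle is None and sensor_angle is None:
        raise ValueError("no opening-angle information available")
    if camera_angle is None:
        return sensor_angle   # redundant sensor value replaces missing data
    if sensor_angle is None:
        return camera_angle
    if abs(camera_angle - sensor_angle) > tol:
        raise ValueError("redundant opening angles disagree")  # possible fault
    return 0.5 * (camera_angle + sensor_angle)
```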

(28) During a minimally invasive operation, the location at which the intervention takes place is monitored by one or more cameras 22, 23. For this purpose, one or more laparoscopic instruments are introduced into the body of the patient 13 through small artificial openings, as is also depicted in FIG. 3. The image of the camera 23 is depicted on a screen 29, on which the surgeon can observe and monitor the progress of the operation. Additionally, a further camera 22 is provided in the robot system 11 of FIG. 2, said camera recording the events outside the body of the patient 13. This camera 22 can, for example, be used to detect the position and orientation of the robot arms 14, 18 as well as of the instruments 15 and 19 and the camera 23. The images of the camera 22 and of the camera 28 can also be depicted on the screen 29. It is also possible to switch back and forth between the cameras 22 and 23, automatically or manually, depending on whether an end effector is located inside or outside the patient. Optionally, the image on the screen 29 can be split in order to display simultaneously an end effector detected by the camera 22 and an end effector detected by the camera 23.

(29) Preferably, all cameras and/or screens are 3D-capable. Devices known from prior art can be used for this purpose, for example stereoscopic cameras or cameras having dual image recording. In order to enable a control of the surgical instruments 15, 19, including their end effectors 17, 21, which is as intuitive as possible for the surgeon, at least the orientation, but preferably also the position and the degree of opening of the controlling hand 30.sub.L or 30.sub.R in relation to the coordinate system K.sub.G should agree with the orientation, the position or the degree of opening of the controlled end effector 17 or 21 in relation to the respective camera coordinate system K.sub.K1 or K.sub.K2. In the case of the robot system 11 depicted here, for this purpose, an alignment process is implemented in which the user can adapt the orientation and/or the position and/or the degree of opening of his hand to the corresponding state of the controlled end effector 17 or 21.

(30) As soon as an agreement of at least one of the state parameters referred to, preferably of all state parameters, has been achieved, the controlled end effector 17, 21 is enabled or activated and can then be controlled by hand. The activation can occur automatically or can require an additional action by the user.

(31) In order to simplify for the user the alignment between his hands 30.sub.L, 30.sub.R and the controlled end effector 17, 21, in this exemplary embodiment a virtual end effector 17′ is overlaid on the screen 29, the orientation, position and degree of opening of which correspond to those of the controlling hand, for example 30.sub.R. The virtual end effector 17′ therefore represents the state of the controlling hand, for example 30.sub.R. Additionally, a further virtual end effector 17″ is imaged in the window 40, said further end effector depicting the state of the controlled end effector 17. The first virtual end effector 17′ therefore displays to the user the actual state of his hand 30.sub.R, and the further virtual end effector 17″ the target state of the hand 30.sub.R.

(32) As is shown in FIG. 6, the orientation of the hand 30.sub.R and the orientation of the end effector 17 differ by an angle Δφ. In order to bring the orientation of the right hand 30.sub.R into agreement with the orientation of the end effector 17, the user only has to change the orientation of his hand. The robot system 11 can additionally display to the user how he must move his hand, for example by overlaying arrows. If the orientation of the hand 30.sub.R agrees with the orientation of the end effector 17, feedback to the user can likewise occur, for example by displaying a symbol on the screen 29 or by coloured highlighting of the virtual end effector 17′. Depending on the design of the robot system 11, tolerances can be specified for the agreement of the orientation; an exact agreement is not necessarily required.
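
A minimal sketch of such a tolerance check, assuming the orientations are given as the vectors V.sub.1 and V.sub.2 and assuming an illustrative tolerance of 10 degrees:

```python
import numpy as np

def orientation_aligned(v_hand, v_effector, tol=np.deg2rad(10.0)):
    """True if the hand vector V1 and the end effector vector V2 agree to
    within the tolerance, i.e. the deviation angle is small enough."""
    v_hand = np.asarray(v_hand, dtype=float)
    v_effector = np.asarray(v_effector, dtype=float)
    v_hand = v_hand / np.linalg.norm(v_hand)
    v_effector = v_effector / np.linalg.norm(v_effector)
    delta_phi = np.arccos(np.clip(np.dot(v_hand, v_effector), -1.0, 1.0))
    return delta_phi <= tol
```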

(33) In an analogous way, the orientation of the left hand 30.sub.L can also be brought into agreement with the left end effector 21. After the alignment of the orientation, the gesture control of the two end effectors 17, 21 is enabled. Additionally, the window 40 can be hidden. As a requirement for an activation of the gesture control, it can be provided that, besides the orientation, one or more further states of the hand 30.sub.L, 30.sub.R must be brought into agreement with the respectively controlled end effector 17, 21. Therefore, for example, it can be provided that the degrees of opening α.sub.L and α.sub.R of the two hands 30.sub.L and 30.sub.R must be brought into agreement with the angles of opening β.sub.L and β.sub.R of the two end effectors 17, 21. The alignment of the degree of opening can occur analogously to the alignment of the orientation, as has been described above. It is therefore determined whether the degree of opening of a hand 30.sub.L, 30.sub.R deviates from the degree of opening of the controlled end effector 17, 21. In the event of a deviation, the user can in turn be required to change the degree of opening α.sub.L, α.sub.R of his hand 30.sub.L, 30.sub.R. After the adaptation has occurred, the gesture control can in turn be enabled automatically. During the alignment, an offset can be defined between the degrees of opening α.sub.L and α.sub.R of the two hands 30.sub.L and 30.sub.R and the angles of opening β.sub.L and β.sub.R of the two end effectors 17, 21. This offset can, for example, mean that the fingers do not have to be completely closed in order to close an end effector. This is particularly helpful if an auxiliary element 35.sub.L or 35.sub.R is guided with the hands and the fingers therefore cannot be completely closed.
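
The offset between hand opening and effector opening described here can be sketched as follows; the clamping at zero and the tolerance are illustrative assumptions:

```python
def effector_target_opening(alpha_hand, offset=0.0):
    """Map the measured hand opening alpha onto the commanded effector
    opening beta; with a positive offset the fingers need not close
    completely in order to close the end effector."""
    return max(0.0, alpha_hand - offset)

def opening_aligned(alpha_hand, beta_effector, offset=0.0, tol=0.1):
    """Alignment test between hand and end effector opening (radians)."""
    return abs(effector_target_opening(alpha_hand, offset) - beta_effector) <= tol
```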

(34) It can, however, also be necessary for the activation of the gesture control that, additionally, the position P.sub.1L or P.sub.1R of the controlling hand 30.sub.L or 30.sub.R is brought into agreement with the position Q.sub.1L or Q.sub.1R of the controlled end effector 17 or 21. The method for aligning the position can in turn occur analogously to the alignment of the orientation or of the degree of opening. As is shown in FIG. 6, the position of the point Q.sub.1R′ of the virtual end effector 17′ is offset compared to the point Q.sub.1R of the end effector 17. The offset can, for example, be displayed by a vector 39. The user can now change the position of his hand until it agrees with the position of the virtual end effector 17″. The agreement can in turn be displayed to the user. After the alignment, the relative positions of the hands 30.sub.L and 30.sub.R and of the end effectors 17, 21 agree with each other. In other words, if the fingertips of the left hand 30.sub.L and the right hand 30.sub.R touch, then the tips of the end effectors 17, 21 should also touch.

(35) In order to design the control of the robot system 11 to be as simple as possible, and in particular to enable an intuitive control, the objects 17, 21, 17′, 17″ depicted on the screen 29 are preferably depicted such that they follow a hand movement precisely in the direction of the hand movement. If a hand 30.sub.L, 30.sub.R is moved, for example, in the x-direction in the coordinate system K.sub.G of the imaging system, then the controlled object in the image, such as, for example, the end effector 17 or the virtual end effector 17′, is also moved in the x-direction in the coordinate system K.sub.B of the screen 29. The same applies to movements having a component in the y-direction or z-direction. The coordinate system K.sub.B of the screen 29 is thereby orientated in the same direction as the coordinate system K.sub.G of the imaging system (the z-axis points, for example, upwards and the x-axis to the right). A hand movement to the right therefore always results in a movement of the controlled object 17, 17′ on the screen to the right, and a movement upwards (in the z-direction) in a corresponding movement of the controlled object 17, 17′ on the screen upwards. The actual movement of the object 17, 21 in space differs, however, as a rule, from the movement displayed on the screen 29. In order to achieve such a depiction, there are fundamentally various possibilities.

(36) In the robot system 11 depicted in FIGS. 1 to 6, an individual coordinate system K.sub.K1, K.sub.K2 is allocated to each camera 22, 23. During a pivot of the camera 22 or 23, the associated coordinate system K.sub.K1 or K.sub.K2 also pivots with the camera. The orientation of a camera coordinate system K.sub.K1 or K.sub.K2 can therefore be changed by adjusting the corresponding camera 22 or 23. For example, the camera 23 can be rotated around its axis 42. Alternatively, the orientation of the camera coordinate system K.sub.K1 or K.sub.K2 can be adapted by the user in any manner by means of the control unit 25 using a coordinate transformation, such that the camera does not necessarily have to be adjusted.

(37) Additionally, the alignment of the camera coordinate systems K.sub.K1 and K.sub.K2 in relation to the respective cameras 22, 23 should be identical. The x-axis of the coordinate system K.sub.K1 of the camera 22 and the x-axis of the coordinate system K.sub.K2 of the camera 23 introduced into the patient 13 can, for example, each point in the recording direction of the respective camera 22, 23. The orientation of the coordinate system K.sub.B of the screen 29 likewise agrees with the orientation of the coordinate systems K.sub.K1, K.sub.K2, wherein the coordinate system K.sub.B is aligned in a fixed manner on the screen 29. For example, the z-axis of K.sub.B always points vertically upwards and the x-axis points into the screen 29 as a normal to the screen surface. If an object recorded by the camera 23, for example, is moved in the z-direction in the coordinate system K.sub.K2, the object is also moved on the screen 29 in the z-direction of the screen coordinate system K.sub.B. The robot system 11 automatically recognises where the end effector 17, 21 is located and controls it accordingly. The relevant coordinate system for the respective end effector 17, 21 is preferably changed automatically when the end effector 17, 21 is guided into or out of the patient 13.

(38) The coordinate system K.sub.G is allocated to the detection region 36. This coordinate system can be aligned according to the coordinate system K.sub.B of the screen 29, but does not have to be. Preferably, however, the y-axis of the coordinate system K.sub.G is aligned substantially parallel to the y-axis of the coordinate system K.sub.B of the screen 29. The x-axis of the coordinate system K.sub.G points substantially straight ahead as seen from the user.

(39) The real movement of an end effector 17, 21 in space, however, as a rule does not agree with the movement direction displayed on the screen 29 and also does not agree with the movement direction of the controlling hand 30.sub.L, 30.sub.R in space. The end effectors 17, 21 are controlled in particular in the camera coordinate system K.sub.K1 or K.sub.K2. Depending on how the camera 22 or 23 is aligned, the camera coordinate system K.sub.K1 or K.sub.K2 is also aligned differently in space. In other words, a hand movement in the z-direction of the coordinate system K.sub.G indeed causes a movement of the end effector in the z-direction of the respective camera coordinate system K.sub.K1 or K.sub.K2. The actual movement of the end effector in space then, however, depends on the alignment of the z-axis of the camera coordinate system K.sub.K1 or K.sub.K2 in space.

(40) The position and alignment (orientation of the x-, y- and z-axes) of the coordinate systems K.sub.K1, K.sub.K2 are known to the robot system 11 or to the control unit 25 and can be converted into a global robot coordinate system K.sub.R by a coordinate transformation. As a consequence, all physical parameters in each of the coordinate systems K.sub.K1, K.sub.K2, K.sub.B, K.sub.G can be converted into corresponding parameters of a different coordinate system by means of a coordinate transformation. Therefore, for example, the positions of the points Q.sub.1L and Q.sub.1R could be described by vectors in the global robot coordinate system K.sub.R. The positions of the points Q.sub.1L and Q.sub.1R could likewise be transformed from the robot coordinate system K.sub.R into the coordinate systems K.sub.K1 and K.sub.K2 of the respective cameras 22, 23. The control unit 25 can therefore convert the movement parameters of a hand 30.sub.L, 30.sub.R, which are detected in the coordinate system K.sub.G, into control parameters for an end effector 17, 21 which is operated in the respective camera coordinate system K.sub.K1 or K.sub.K2.
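
As a sketch of this chain of transformations: a hand displacement measured in K.sub.G is first expressed in the global robot coordinate system K.sub.R and then in the active camera coordinate system. The rotation matrices below are placeholders; in the robot system 11 they would follow from the known poses of the cameras and of the detection region. The scaling factor of claim 16 is applied at the end, with an assumed value:

```python
import numpy as np

# Placeholder rotations; identity matrices stand in for the calibrated
# orientations known to the control unit 25.
R_G_TO_R = np.eye(3)   # imaging system K_G -> global robot system K_R
R_R_TO_K = np.eye(3)   # robot system K_R   -> active camera system K_K1/K_K2

def hand_delta_to_effector_delta(delta_g, scaling=0.5):
    """Convert a hand displacement in K_G into a commanded end effector
    displacement in the active camera coordinate system, scaled by a
    predetermined factor (claim 16); the factor 0.5 is illustrative."""
    delta_r = R_G_TO_R @ np.asarray(delta_g, dtype=float)  # motion in K_R
    delta_k = R_R_TO_K @ delta_r                           # motion in K_K
    return scaling * delta_k
```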

(41) The orientation, the position and the angle of opening of the end effectors 17, 21 can be determined by the imaging system in a way analogous to the hands 30.sub.L, 30.sub.R. The angle of opening can, for example, be set as the angle between the two working elements, as is depicted in FIG. 3. The orientation of the end effectors 17, 21 can be set by the vectors V.sub.2L and V.sub.2R, and the position of the end effectors 17 and 21 can be defined by the position of the joint points Q.sub.1L or Q.sub.1R. If, as has been defined above, the hand vectors V.sub.1L and V.sub.1R each run at half of the angle α.sub.L or α.sub.R between the thumb and the index finger, then it is advantageously defined analogously that the end effector vectors V.sub.2L or V.sub.2R run at half of the angle of opening β.sub.L or β.sub.R between the two working elements of the respective end effector 17 or 21. The individual parameters can be described in any coordinate system K.sub.K1, K.sub.K2, K.sub.R. In this way, multiply redundant pieces of information can be obtained which can be compared to one another for checking purposes.

(42) The data present in the robot system 11 can be used to recognise the position, the orientation and the angle of opening of an end effector 17, 21. For example, the control unit 25 can determine the position of the end effector 17 by means of the position of the robot arm 14. Since the control unit 25 furthermore generates the control commands for the end effectors 17, 21, the orientation and the angle of opening of each end effector 17, 21 are likewise known to it.

(43) After the activation of the gesture control for one of the end effectors 17, 21, the relevant end effector 17, 21 can be operated by hand. As long as the controlling hand 30.sub.L, 30.sub.R is located in the detection region 36, the control gestures executed by the user are converted into corresponding control commands. If, however, the controlling hand 30.sub.L or 30.sub.R moves out of the detection region 36, the gesture control is preferably interrupted; in other words, the end effector 17, 21 is stopped. It can therefore be excluded that the end effector 17, 21 executes an action which is not desired. The relevant end effector 17, 21 can be activated again after a new alignment process has been executed.

(44) FIG. 7 shows various method steps of a method for controlling the robot system 11 of FIGS. 1 to 6. In step S1, the hands 30.sub.L, 30.sub.R are moved into the detection region 36 and in step S2 are detected by means of the camera 28. In step S3, the determination of the orientation of the hands occurs, wherein the vectors V.sub.1L, V.sub.1R are set. In step S4, the orientation of the end effectors 17, 21 is determined and the vectors V.sub.2L, V.sub.2R are set. In step S5, a target-actual alignment of the orientations then follows.

(45) In step S6, the position of the hands 30.sub.L and 30.sub.R is determined and corresponding points P.sub.1L, P.sub.1R are set. The determination of the position of the end effectors follows in step S7, wherein the points Q.sub.1L and Q.sub.1R are set. In step S8, finally, a target-actual value alignment of the positions follows.

(46) Step S9 describes the determination of the degree of opening of the fingers, wherein the angles α.sub.L and α.sub.R are determined. Correspondingly, in step S10, the angles of opening β.sub.L and β.sub.R of the end effectors 17, 21 are then determined. In step S11, finally, the target-actual value alignment of the degree of opening or angle of opening follows.

(47) In the event of a deviation of the orientations, the positions and/or the degrees of opening, in step S12 an instruction is emitted to the user to implement an alignment. As soon as an agreement of at least one actual value with the respectively associated target value has been achieved, the gesture control is activated. The robot system 11 then recognises the manual control commands executed by the user and controls the end effectors 17, 21 according to the commands.

(48) If a parameter of the hand 30.sub.L or 30.sub.R no longer agrees with the respective parameter of the end effector 17, 21, the control of this end effector 17, 21 is preferably deactivated (step S15). A deactivation of the end effector 17, 21 preferably also occurs if the controlling hand 30.sub.L, 30.sub.R has moved out of the detection region 36.

(49) The end effector 17, 21 can be activated or operated again if the procedure for the agreement of the respective parameters is carried out again. If a hand has been moved out of the detection region 36, then it must first be moved into the detection region 36 again (see step S1). If a non-agreement of at least one of the parameters was the trigger of the interruption while the hands were located in the detection region 36, the renewed determination of the respective parameters can be continued directly (see steps S2, S3, S6 and S9).
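
The enable/disable logic of FIG. 7 can be condensed into a small state machine. The sketch assumes the stricter variant in which all three parameters must agree (paragraph (30) also permits activation on agreement of a single parameter); the names and structure are illustrative only:

```python
class GestureControl:
    """Minimal sketch of the FIG. 7 flow: gesture control is active only
    while the hand is inside the detection region 36 and the compared
    parameters agree; otherwise it is deactivated (step S15)."""

    def __init__(self):
        self.active = False

    def update(self, in_region, orientation_ok, position_ok, opening_ok):
        aligned = orientation_ok and position_ok and opening_ok
        if not in_region or not aligned:
            self.active = False   # stop the end effector (steps S15, S1)
        else:
            self.active = True    # alignments S5, S8 and S11 succeeded
        return self.active
```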

(50) The steps shown in FIG. 7 can be stored on a storage medium in the control unit 25, such that the control unit 25 can execute them at any time.