Robot and method for controlling a robot

09579793 · 2017-02-28

Abstract

The invention relates to a robot and a method for controlling a robot. The distance between an object and the robot and/or the derivative thereof or a first motion of the object is detected by means of a non-contact distance sensor arranged in or on a robot arm of the robot and/or on or in an end effector fastened on the robot arm. The robot arm is moved based on the first motion detected by means of the distance sensor, a target force or a target torque to be applied by the robot is determined based on the distance detected between the object and the robot, and/or a function of the robot or a parameterization of a function of the robot is triggered based on the first motion detected and/or a target distance between the object and the robot and/or the derivative thereof detected by means of the distance sensor.

Claims

1. A method of controlling a robot having at least one robot arm, the method comprising: detecting with a non-contact distance sensor a first movement of an object; wherein the non-contact distance sensor is attached to at least one of the robot arm or an end effector attached to the robot arm; moving the robot arm on the basis of the detected first movement in a manner that prevents collision between the robot arm and the object; and maintaining the position and orientation of an attaching device of the robot arm or of a tool center point of the end effector while moving the robot arm in the manner that prevents collision.

2. The method of claim 1, further comprising: in response to the detected movement, performing at least one of: triggering a robot function based at least on the detected movement, or parameterizing a robot function based at least on the detected movement.

3. The method of claim 1, further comprising: detecting a distance between the object and the robot with the non-contact distance sensor; and in response to the detected distance, performing at least one of: determining a target force or torque to be produced by the robot based on at least one of the detected distance or the derivative of the detected distance, triggering a robot function based on at least one of the detected distance or the derivative of the detected distance, or parameterizing a robot function based on at least one of the detected distance or the derivative of the detected distance.

4. The method of claim 3, wherein a target force or torque is determined and the method further comprises actuating the robot arm such that the end effector produces the determined target force or torque.

5. The method of claim 2, wherein the robot function is at least one of an activation or deactivation of the end effector.

6. The method of claim 3, wherein the robot function is at least one of an activation or deactivation of the end effector.

7. The method of claim 1, wherein the non-contact distance sensor is a capacitive distance sensor.

8. A robot, comprising: a robot arm including a plurality of articulation axes, an attaching device for attaching an end effector to the robot arm, and a plurality of drives for moving the robot arm about the respective axes; at least one non-contact distance sensor operatively coupled with the robot arm or an end effector attached to the robot arm; and a control device communicating with the drives and the sensor, the control device configured to actuate the drives to move the robot arm based on a first movement of an object detected by the at least one sensor; the control device actuating the drives to move the robot arm in a manner that prevents collision between the robot arm and the object while maintaining a position and orientation of an attaching device of the robot arm or of a tool center point of the end effector.

9. The robot of claim 8, wherein the control device is further configured to perform at least one of: determining a target force or torque to be produced by the robot based on at least one of a distance between the robot and the object detected by the at least one sensor, or on a derivative of the detected distance; triggering a robot function based on at least one of the detected first movement, the detected distance, or the derivative of the detected distance; or parameterizing a robot function based on at least one of the detected first movement, the detected distance, or the derivative of the detected distance.

10. The robot of claim 9, wherein: the robot function is at least one of an activation or deactivation of an end effector attached to the robot arm; and/or the control device actuates the robot arm such that the end effector produces the determined target force or torque.

11. The robot of claim 8, wherein the non-contact distance sensor is a capacitive distance sensor.

Description

BRIEF DESCRIPTION OF THE DRAWINGS

(1) Exemplary embodiments of the invention are depicted in the accompanying schematic drawings. The figures show the following:

(2) FIG. 1 a robot with non-contact distance sensors,

(3) FIGS. 2, 3 an acquisition field of the non-contact distance sensors,

(4) FIGS. 4-6 flow charts to illustrate the operation of the robot of FIG. 1,

(5) FIG. 7 another robot,

(6) FIG. 8 a flow chart to illustrate the operation of the robot of FIG. 7, and

(7) FIG. 9 an end effector for a robot.

DETAILED DESCRIPTION

(8) FIG. 1 shows a robot R with a robot arm M. Robot arm M represents essentially the movable part of robot R, and includes a plurality of axes 1-6, a plurality of levers 7-10 and a flange 18, to which an end effector 19, which is for example a medical instrument, in particular a surgical instrument, is attached.

(9) Each of the axes 1-6 is moved by a drive, for example one of the electric drives 11-16, which are electrically connected in a non-depicted manner to a control computer 17 of robot R, so that control computer 17, or a computer program running on control computer 17, is able to activate electric drives 11-16 in such a way that the position and orientation of flange 18 of robot R, or of the tool center point TCP of robot R, can be set essentially freely in space. Electric drives 11-16 of robot R each include, for example, an electric motor and possibly power electronics that activate the motors.

(10) In the case of the present exemplary embodiment, robot R also has non-contact distance sensors 20 that are connected to control computer 17 in a non-depicted manner, which are situated in or on robot arm M, in particular integrated into the structure of robot arm M, or are set into an outer skin that encloses robot arm M, for example at individual spots or over a wide area. The non-contact distance sensors 20 are for example, as is the case in the present exemplary embodiment, capacitive distance sensors 20, which as such are known to a person skilled in the art. The distance sensors 20 are set up to ascertain a distance d to an object, for example to a person P.

(11) As just mentioned, in the case of the present exemplary embodiment the non-contact distance sensors 20 are capacitive distance sensors 20, which together produce an acquisition field 21 which at least partially encloses robot arm M and is depicted in FIGS. 2 and 3. FIG. 2 shows robot R with acquisition field 21 in a side view, and FIG. 3 shows robot R with acquisition field 21 in a top view.

(12) If person P for example approaches this acquisition field 21, then distance sensors 20 generate a signal corresponding to the distance d between person P and robot R, which is transmitted to control computer 17 for further processing. In the case of the present exemplary embodiment, control computer 17 or a computer program running on control computer 17 controls the movement of robot R on the basis of this signal. This is summarized by means of a flow chart depicted in FIG. 4.

(13) If person P approaches robot R, distance sensors 20 detect the distance d between robot R and person P. The signals assigned to distance d and coming from distance sensors 20 are transmitted to control computer 17, which recognizes a movement of person P on the basis of the signals, in particular on the basis of a detected change in the distance d between person P and robot R, step S1 of the flow chart in FIG. 4.

(14) On the basis of the detected movement of person P, in the case of the present exemplary embodiment control computer 17 calculates a movement corresponding to the movement of person P, which is to be executed by effector 19 or its tool center point TCP, step S2 of the flow chart.

(15) On the basis of the calculated movement that effector 19 or the tool center point TCP is to execute, control computer 17 activates drives 11-16 in such a way that effector 19 or tool center point TCP executes this movement without person P touching robot R, step S3 of the flow chart in FIG. 4.
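The sequence of steps S1-S3 of FIG. 4 can be sketched as follows. All function names and the simple "mirror the detected movement" mapping are illustrative assumptions made here for clarity; the patent leaves the concrete mapping between the person's movement and the TCP movement open.

```python
# Illustrative sketch of steps S1-S3 of FIG. 4 (assumed names/mapping).

def detect_movement(prev_distances, curr_distances):
    """Step S1: infer the person's movement from the change in the
    distance signals of the individual sensors 20."""
    return [c - p for p, c in zip(prev_distances, curr_distances)]

def tcp_movement(movement, gain=1.0):
    """Step S2: derive a movement for the tool center point TCP that
    corresponds to the detected movement of person P."""
    return [gain * m for m in movement]

def control_cycle(prev_distances, curr_distances):
    """One control cycle: S1 detect the movement, S2 compute the TCP
    movement; S3 (activating drives 11-16 so that the TCP executes the
    movement without contact) is left to the robot controller."""
    return tcp_movement(detect_movement(prev_distances, curr_distances))
```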

(16) In another exemplary embodiment, robot R or control computer 17 is configured so that a target force or target torque to be produced by robot R, and/or their respective derivatives, are set by means of the distance sensors 20. This is summarized by means of a flow chart depicted in FIG. 5.

(17) If person P approaches robot R, distance sensors 20 detect the distance d between robot R and person P. The signals assigned to distance d and coming from distance sensors 20 are transmitted to control computer 17, step S1 of the flow chart in FIG. 5.

(18) On the basis of the distance d, in the case of the present exemplary embodiment control computer 17 calculates a target force or target torque corresponding to the distance d, and/or their respective derivatives, which the end effector 19 situated on robot arm M is to produce, step S2 of the flow chart in FIG. 5.

(19) Control computer 17 then activates drives 11-16 in such a way that end effector 19 applies the calculated target force to an object not depicted in further detail in the figures, step S3 of the flow chart in FIG. 5. Thus it is possible to set the force to be produced by end effector 19 and/or its derivative without contact, via the distance d to robot R. Alternatively, it is also possible to set a torque to be produced by end effector 19 and its derivative, instead of a force, via the distance d between person P and robot R.
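A minimal sketch of the distance-to-force mapping of steps S2-S3 in FIG. 5 might look as follows. The linear mapping and the numeric limits are assumptions made here for illustration; the patent does not specify the mapping.

```python
def target_force_from_distance(d, d_min=0.05, d_max=0.50, f_max=10.0):
    """Map the contact-free distance d (in m) to a target force (in N)
    for end effector 19: full force at or below d_min, zero force at or
    above d_max, linear in between (assumed mapping and limits)."""
    if d <= d_min:
        return f_max
    if d >= d_max:
        return 0.0
    return f_max * (d_max - d) / (d_max - d_min)
```

The same scheme applies when a target torque, rather than a force, is set via the distance d.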

(20) In another exemplary embodiment, robot R or control computer 17 is configured so that a function of robot R and/or a parameterizing of a function of robot R is carried out automatically if the distance d between person P and robot R falls below a minimum distance or exceeds a maximum distance. This is summarized by means of a flow chart depicted in FIG. 6.

(21) If person P approaches robot R so that distance d falls below the minimum distance, then this is detected by the distance sensors 20 or by control computer 17, step A1 of the flow chart in FIG. 6.

(22) In the case of the present exemplary embodiment, control computer 17 then activates end effector 19, step A2 of the flow chart in FIG. 6.
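The automatic triggering of a robot function in FIG. 6 can be sketched as a simple threshold rule. The hysteresis behaviour between the two thresholds is an assumption made here; the patent only specifies triggering when the distance falls below a minimum or exceeds a maximum.

```python
def effector_state(d, d_min, d_max, active):
    """FIG. 6, steps A1-A2: activate end effector 19 when the distance d
    falls below the minimum distance, deactivate it when d exceeds the
    maximum distance; otherwise keep the current state (assumed hysteresis)."""
    if d < d_min:
        return True
    if d > d_max:
        return False
    return active
```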

(23) FIG. 7 shows another robot 70, with a robot arm M having a plurality of axes. In the case of the present exemplary embodiment, robot 70 is a redundant robot, for which reason it is able to execute a movement in its zero space (null space). The zero space designates the joint angle space of redundant robot 70 in which robot joints 71-77 can be reconfigured in such a way that the pose (position and orientation) of end effector 19 in space remains unchanged. This embodiment of robot 70 can be used, for example, in robot-supported surgery.

(24) In the case of the present exemplary embodiment, person P can operate robot 70 as follows, which is illustrated by means of a flow chart depicted in FIG. 8.

(25) If person P approaches robot 70, this is detected by means of the distance sensors 20, for example by differentiating the signal assigned to the distance d, step A1 of the flow chart in FIG. 8.

(26) To prevent a collision of person P with the relevant part of robot 70, control computer 17 activates drives 81-87 of robot 70 on the basis of the detected approach of person P, in such a way that, on the one hand, the pose of end effector 19 in space remains unchanged and, on the other hand, the relevant lever or the relevant joint 71-77 with which person P would potentially collide yields to the approach of person P, step A2 of the flow chart in FIG. 8.
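The evasive zero-space motion of step A2 can be sketched with the standard null-space projector N = I - J⁺J: the avoidance joint velocity is projected onto the null space of the Jacobian J, so that the end-effector position is unchanged to first order. The sketch below uses a planar three-joint arm holding only the TCP position fixed (robot 70 itself has seven joints 71-77 and also holds orientation); all numeric details are illustrative assumptions.

```python
import math

def fk(q, l=(1.0, 1.0, 1.0)):
    """Forward kinematics of a planar 3-joint arm: TCP position (x, y)."""
    x = y = a = 0.0
    for qi, li in zip(q, l):
        a += qi
        x += li * math.cos(a)
        y += li * math.sin(a)
    return x, y

def jacobian(q, eps=1e-6):
    """2x3 numerical Jacobian of fk by forward differences."""
    J = [[0.0] * 3 for _ in range(2)]
    x0, y0 = fk(q)
    for j in range(3):
        qe = list(q)
        qe[j] += eps
        x1, y1 = fk(qe)
        J[0][j] = (x1 - x0) / eps
        J[1][j] = (y1 - y0) / eps
    return J

def nullspace_step(q, qdot_avoid, dt=0.01):
    """Project the avoidance joint velocity into the zero space of J,
    qdot = (I - J+ J) qdot_avoid, so the TCP stays put to first order."""
    J = jacobian(q)
    # J+ = J^T (J J^T)^-1 for the full-row-rank 2x3 Jacobian
    JJt = [[sum(J[r][k] * J[s][k] for k in range(3)) for s in range(2)]
           for r in range(2)]
    det = JJt[0][0] * JJt[1][1] - JJt[0][1] * JJt[1][0]
    inv = [[JJt[1][1] / det, -JJt[0][1] / det],
           [-JJt[1][0] / det, JJt[0][0] / det]]
    Jp = [[sum(J[s][r] * inv[s][c] for s in range(2)) for c in range(2)]
          for r in range(3)]  # 3x2 pseudoinverse
    # N = I - J+ J, the 3x3 null-space projector
    N = [[(1.0 if r == c else 0.0) - sum(Jp[r][k] * J[k][c] for k in range(2))
          for c in range(3)] for r in range(3)]
    qdot = [sum(N[r][c] * qdot_avoid[c] for c in range(3)) for r in range(3)]
    return [qi + dt * di for qi, di in zip(q, qdot)]
```

One evasive step moves the joints while the TCP position changes only to second order in the step size.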

(27) The robots R, 70 described above include the distance sensors 20, which are situated in or on the corresponding robot arms M. But it is also possible, additionally or alternatively, to situate the distance sensors 20 on the end effector. FIG. 9 shows such an end effector 19, in which or on which distance sensors 20 are situated.