Robot system and method for controlling a robot system

11040455 · 2021-06-22

Abstract

The present invention relates to a robotic system having at least one robotic arm, a control unit for controlling the robotic arm, and a robotic arm sensor system, wherein the control unit and the robotic arm sensor system are designed to respond to predetermined haptic gestures of a user acting on the robotic arm in such a way that the robotic system performs at least one predetermined operation associated with the haptic gesture.

Claims

1. A robotic system, comprising: at least one robotic arm; a control unit for controlling the robotic arm; a robotic arm sensor system; and a display device with a graphical user interface, wherein the control unit and the robotic arm sensor system are designed to respond to predetermined forces and/or moments generated by haptic gestures of a user acting on the robotic arm, in which said control unit is designed to assign the forces and/or moments to at least one predetermined operation so that the robotic system performs said at least one predetermined operation associated with the haptic gesture, wherein the control unit is further designed to assign said forces and/or moments generated by the haptic gesture to a navigation and selection control on the graphical user interface of said display device.

2. The robotic system according to claim 1, in which the forces and/or moments generated by the haptic gesture are assigned to different operations depending on their respective directions.

3. The robotic system according to claim 1, in which the forces and/or moments generated by the haptic gesture are assigned to different operations depending on their respective magnitudes.

4. The robotic system according to claim 1, in which a different chronological sequence of haptic gestures is assigned to different operations.

5. The robotic system according to claim 1, in which the control unit is adapted to generate a feedback in response to the haptic gesture.

6. The robotic system according to claim 5, in which the feedback is auditory and/or visual.

7. The robotic system according to claim 5, in which the feedback is formed to be haptically detectable.

8. The robotic system according to claim 7, in which the compliance of the robotic arm is variable.

9. A method for controlling a graphical user interface of a display device of a robotic system comprising at least one robotic arm, a control unit for controlling the robotic arm, and a robotic arm sensor system, the method comprising the steps of: manipulating the robotic arm by at least one predetermined force and/or moment generated by a haptic gesture; generating signals by the robotic arm sensor system in response to the at least one predetermined force and/or moment generated by the haptic gesture; transmitting the signals to the control unit; and assigning the signals to different predetermined operations by the control unit, wherein the control unit assigns the signals to a navigation and selection control on the graphical user interface of the display device.

10. The method according to claim 9, further comprising the step of: generating at least one feedback in dependence on the different predetermined operations.

11. The method according to claim 10, further comprising the step of: adjusting the compliance of the robotic arm depending on a predetermined haptic gesture.

Description

(1) Further features and advantages will become apparent from the following description of an embodiment shown with reference to the sole figure, FIG. 1.

(2) FIG. 1 shows a perspective view of a multi-axis lightweight robot, which is composed of a plurality of articulated arm members 10 connected to one another.

(3) Using the example of the front (distal) member 20, which cooperates with an end effector (not shown), the figure schematically indicates the possible directions in which haptic gestures can act on the robotic system.

(4) Thus, it is possible for an operator to exert tensile or compressive forces on the member 20 in the X-, Y-, Z-directions.

(5) A light push on the member 20 in the Z-direction, for example, may symbolize the pressing of a push-button; this haptic gesture is then assigned to a start command that activates a predetermined operation.
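Such a gesture assignment can be sketched as a simple classifier over the measured force vector. The function name and the force threshold below are illustrative assumptions, not taken from the patent:

```python
def classify_push(force_xyz, press_threshold=2.0):
    """Return 'start' if the measured force acts mainly in the
    Z-direction and exceeds the press threshold, mimicking the
    pressing of a push-button. Threshold is an assumed value."""
    fx, fy, fz = force_xyz
    if abs(fz) >= press_threshold and abs(fz) > abs(fx) and abs(fz) > abs(fy):
        return "start"
    return None

# A light push mostly along Z is assigned to the start command:
print(classify_push((0.1, 0.2, 3.0)))  # start
```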

(6) By pushing the member 20 sideways to the left (L) or to the right (R), the operator can be guided through a complex menu of a graphical user interface on a display device (not shown), such as the screen of a computer. By pressing in the Z- or Y-direction, the entry selected in the graphical menu by these haptic gestures can then be confirmed.
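The menu navigation described above can be sketched as a small state machine that steps through entries on lateral pushes and confirms on a Z- or Y-press. Class and method names are illustrative assumptions:

```python
class MenuNavigator:
    """Sketch: lateral pushes (L/R) on the arm member step through
    menu entries; a press in the Z- or Y-direction confirms the
    currently highlighted entry."""

    def __init__(self, entries):
        self.entries = entries
        self.index = 0          # currently highlighted entry
        self.selected = None    # confirmed entry, if any

    def handle_gesture(self, direction):
        if direction == "L":
            self.index = (self.index - 1) % len(self.entries)
        elif direction == "R":
            self.index = (self.index + 1) % len(self.entries)
        elif direction in ("Z", "Y"):
            self.selected = self.entries[self.index]
        return self.entries[self.index]

nav = MenuNavigator(["Home", "Teach", "Run"])
nav.handle_gesture("R")    # highlight "Teach"
nav.handle_gesture("Z")    # confirm it
print(nav.selected)        # Teach
```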

(7) It is also possible to transmit the movement of the member 20 in the X-Y-plane to the movement of a cursor on a screen and to perform a “mouse click” by pressing in the Z-direction.
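This cursor mode amounts to scaling the arm member's X-Y displacement into screen coordinates and treating a Z-press as a click. The gain and click threshold below are assumed for illustration:

```python
def arm_to_cursor(dx, dy, gain=50.0):
    """Map the X-Y displacement of the member (e.g. in metres)
    to cursor motion in pixels; gain is an assumed scale factor."""
    return round(dx * gain), round(dy * gain)

def is_click(fz, click_threshold=2.0):
    """A press in the Z-direction beyond the assumed threshold
    is interpreted as a mouse click."""
    return fz >= click_threshold
```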

(8) Furthermore, the operator can apply a haptic rotation R to the member 20, and the amount of torque applied can be used as a signal for the strength of a parameter to be selected by this rotation. The haptic rotation in the R-direction thus simulates a rotary knob.

(9) For an emergency, it may be provided that an operator simply strikes the robotic arm, as indicated by arrow P. The robotic system recognizes from the force or the acceleration that an emergency is indicated and then immediately stops its motion.
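A blow against the arm shows up as a spike in the measured force or acceleration, so the emergency check reduces to comparing both signals against limits. The limits below are assumed values, not from the patent:

```python
def emergency_stop_needed(force, accel, f_limit=50.0, a_limit=20.0):
    """Sketch: trigger an immediate stop when either the measured
    force (N) or acceleration (m/s^2) crosses its assumed limit,
    as produced by an operator striking the arm."""
    return force >= f_limit or accel >= a_limit
```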

(10) The haptic gestures are not limited to the foremost arm member 20. In principle, each individual arm member 10 may be provided with such functionality.

(11) In addition, the behavior of physical surfaces, keys, or rotary knobs can be simulated by virtual, freely programmable resistances when moving the robotic arm. In this way, the robotic system can itself be used as an input or control device to activate complex systems and operations that are in a close functional relationship with the robotic system or its desired functionality.
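One way such a programmable resistance can be sketched is as a virtual spring pulling the member toward the nearest of a series of detents, so that moving the arm feels like turning a notched knob. The stiffness and detent spacing are illustrative assumptions:

```python
def virtual_resistance(displacement, stiffness=80.0, detent_spacing=0.05):
    """Sketch: a freely programmable resistance force opposing the
    operator's motion, with periodic detents that make the arm feel
    like a notched rotary knob. Constants are assumed values."""
    # Spring-like force pulling toward the nearest detent position.
    nearest = round(displacement / detent_spacing) * detent_spacing
    return -stiffness * (displacement - nearest)
```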