System for Performing an Input on a Robotic Manipulator
20220362943 · 2022-11-17
Inventors
CPC classification
B25J9/1656
PERFORMING OPERATIONS; TRANSPORTING
G05B2219/36401
PHYSICS
G05B19/423
PHYSICS
B25J13/06
PERFORMING OPERATIONS; TRANSPORTING
International classification
B25J13/08
PERFORMING OPERATIONS; TRANSPORTING
B25J13/06
PERFORMING OPERATIONS; TRANSPORTING
Abstract
A system for performing an input on a robotic manipulator, wherein the system includes: a robotic manipulator having a plurality of limbs connected to one another by articulations and having actuators; a sensor unit configured to record an input variable, applied by a user by manually guiding the robotic manipulator, on the robotic manipulator, wherein the input variable is a kinematic variable or a force and/or a moment, and wherein the sensor unit is configured to transmit the input variable; and a computing unit connected to the robotic manipulator and to the sensor unit, the computing unit configured to transform the input variable received from the sensor unit via a predefined input variable mapping, wherein the input variable mapping defines a mathematical mapping of the input variable onto a coordinate of a graphical user interface or onto a setting of a virtual control element.
Claims
1. A system to perform an input on a robotic manipulator, the system comprising: a robotic manipulator having a plurality of limbs connected to one another by articulations and having actuators; a sensor unit configured to record an input variable, applied by a user by manually guiding the robotic manipulator, on the robotic manipulator, wherein the input variable is a kinematic variable or a force and/or a moment, and wherein the sensor unit is configured to transmit the input variable; and a computing unit connected to the robotic manipulator and to the sensor unit, the computing unit configured to transform the input variable received from the sensor unit via a predefined input variable mapping, wherein the predefined input variable mapping defines a mathematical mapping of the input variable onto a coordinate of a graphical user interface or onto a setting of a virtual control element.
2. The system according to claim 1, wherein the system comprises a display unit, wherein the display unit is configured to display at least one of the following: the input variable; an amount of the input variable; the input variable as transformed; an amount of the input variable as transformed; a graphical user interface and objects on a graphical object surface; and a mouse pointer.
3. The system according to claim 1, wherein: the sensor unit is configured to record a current position and/or a current orientation of the robotic manipulator for transmitting the current position and/or the current orientation to the computing unit; and the computing unit is configured to actuate the actuators in such a manner that the robotic manipulator, during a manual guiding onto a predefined geometric structure, generates a resistance against movement of the robotic manipulator caused by the manual guiding, and wherein the computing unit is configured to activate a predefined function if the resistance exceeds a predetermined limit value or if a distance of a predefined point of the robotic manipulator with respect to the predefined geometric structure undershoots a predetermined limit value.
4. The system according to claim 3, wherein the predefined function is activation of an object on the graphical user interface, at least when the coordinate coincides with a predetermined coordinate range of the object.
5. The system according to claim 3, wherein the predefined geometric structure is a plane and the plane is invariant in terms of its orientation and position with respect to a terrestrial coordinate system.
6. The system according to claim 3, wherein the computing unit is configured to actuate the actuators in such a manner that, during the manual guiding of the robotic manipulator, the robotic manipulator outputs a haptic feedback and/or a tactile feedback of the input variable as recorded and/or the input variable as transformed.
7. The system according to claim 6, wherein the haptic feedback and/or the tactile feedback in each case includes at least one of the following: a position-dependent grid; a resistance-caused limitation of a work region in which the robotic manipulator is capable of being manually guided; a feedback in case the input variable and/or the current position and/or the current orientation of the robotic manipulator coincides with an object on the graphical user interface; and a signal if the coordinate coincides with a predefined coordinate range of the object of the graphical user interface.
8.-10. (canceled)
11. The system according to claim 7, wherein in the case of feedback the coordinate of the graphical user interface is assigned, using the computing unit, to the input variable and/or the current position and/or the current orientation of the robotic manipulator.
12. The system according to claim 2, wherein the display unit is a screen.
13. The system according to claim 1, wherein the computing unit is configured to actuate the robotic manipulator with gravity compensation.
14. A method of performing an input on a robotic manipulator having a plurality of limbs connected to one another by articulations and having actuators, the method comprising: recording, using a sensor unit, an input variable applied by a user by manually guiding the robotic manipulator, on the robotic manipulator, wherein the input variable is a kinematic variable or a force and/or a moment; transmitting the input variable using the sensor unit; and transforming, using a computing unit, the input variable received from the sensor unit via a predefined input variable mapping, wherein the predefined input variable mapping defines a mathematical mapping of the input variable onto a coordinate of a graphical user interface or onto a setting of a virtual control element of the computing unit.
15. The method according to claim 14, wherein the method comprises displaying at least one of the following using a display unit: the input variable; an amount of the input variable; the input variable as transformed; an amount of the input variable as transformed; a graphical user interface and objects on a graphical object surface; and a mouse pointer.
16. The method according to claim 14, wherein the method comprises: recording, using the sensor unit, a current position and/or a current orientation of the robotic manipulator for transmitting the current position and/or the current orientation to the computing unit; and actuating, using the computing unit, the actuators in such a manner that the robotic manipulator, during a manual guiding onto a predefined geometric structure, generates a resistance against movement of the robotic manipulator caused by the manual guiding; and activating, using the computing unit, a predefined function if the resistance exceeds a predetermined limit value or if a distance of a predefined point of the robotic manipulator with respect to the predefined geometric structure undershoots a predetermined limit value.
17. The method according to claim 16, wherein the predefined function is activation of an object on the graphical user interface, at least when the coordinate coincides with a predetermined coordinate range of the object.
18. The method according to claim 16, wherein the predefined geometric structure is a plane and the plane is invariant in terms of its orientation and position with respect to a terrestrial coordinate system.
19. The method according to claim 16, wherein actuation of the actuators is performed by the computing unit in such a manner that, during the manual guiding of the robotic manipulator, the robotic manipulator outputs a haptic feedback and/or a tactile feedback of the input variable as recorded and/or the input variable as transformed.
20. The method according to claim 19, wherein the haptic feedback and/or the tactile feedback in each case includes at least one of the following: a position-dependent grid; a resistance-caused limitation of a work region in which the robotic manipulator is capable of being manually guided; a feedback in case the input variable and/or the current position and/or the current orientation of the robotic manipulator coincides with an object on the graphical user interface; and a signal if the coordinate coincides with a predefined coordinate range of the object of the graphical user interface.
21. The method according to claim 20, wherein in the case of feedback the method comprises assigning, using the computing unit, the coordinate of the graphical user interface to the input variable and/or the current position and/or the current orientation of the robotic manipulator.
22. The method according to claim 15, wherein the display unit is a screen.
23. The method according to claim 14, wherein the method comprises actuating, using the computing unit, the robotic manipulator with gravity compensation.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0045] In the drawings:
[0048] The representations in the figures are diagrammatic and not true to scale.
DETAILED DESCRIPTION
[0053] The sensor unit 7 is used to record an input variable applied on the robotic manipulator 1 by the user manually guiding the robotic manipulator 1, wherein the input variable is a force applied on the robotic manipulator 1 by the user, and wherein the sensor unit 7 is designed to transmit the input variable to the computing unit 5. The computing unit 5 is used to transform the input variable via a predefined input variable mapping, wherein the input variable mapping is a mathematical mapping of the input variable onto a coordinate of a graphical user interface or onto a setting of a virtual control element. The system 100 furthermore includes a display unit 9, wherein the display unit 9 is designed to display at least one of the following:
[0054] the input variable,
[0055] the amount of the input variable,
[0056] the transformed input variable, and
[0057] the amount of the transformed input variable.
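The input variable mapping described above can be sketched in code. The following is a minimal illustrative example in Python, not the patented implementation: the function name, the linear gain, and the screen dimensions are assumptions chosen for clarity, and the mapping here linearly relates a lateral guiding force to a mouse-pointer coordinate, clamped to the visible screen.

```python
# Hypothetical sketch of an "input variable mapping": a lateral force
# (fx, fy) in newtons, measured by the sensor unit on the manipulator,
# is mapped linearly onto a mouse-pointer coordinate of the GUI.
# All names, gains, and dimensions are illustrative assumptions.

SCREEN_W, SCREEN_H = 1920, 1080  # assumed display resolution in pixels
GAIN = 40.0                      # assumed gain in pixels per newton

def map_force_to_coordinate(fx: float, fy: float) -> tuple[int, int]:
    """Map a lateral force (fx, fy) in N onto a screen coordinate.

    Zero force corresponds to the screen centre; the result is
    clamped to the visible screen area.
    """
    x = SCREEN_W / 2 + GAIN * fx
    y = SCREEN_H / 2 - GAIN * fy  # screen y axis grows downward
    x = min(max(round(x), 0), SCREEN_W - 1)
    y = min(max(round(y), 0), SCREEN_H - 1)
    return x, y

print(map_force_to_coordinate(0.0, 0.0))    # screen centre
print(map_force_to_coordinate(100.0, 0.0))  # clamped to the right edge
```

A nonlinear or velocity-based mapping (e.g. treating the force as a pointer velocity, as a joystick does) would fit the same interface; the linear position mapping is merely the simplest instance.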
[0058] In particular, the display unit is designed to display a mouse pointer with the current mouse pointer coordinate. The sensor unit 7 is therefore used to record a current position and/or a current orientation of the robotic manipulator 1 and to transmit the current position and/or the current orientation to the computing unit 5, wherein the computing unit 5 is designed to actuate the actuators 3 in such a manner that the robotic manipulator 1, during manual guiding onto a predefined, spatially fixed plane, generates a resistance against movement of the robotic manipulator 1 caused by the manual guiding. The computing unit 5 is furthermore used to activate a predefined function if a predefined limit value in the resistance is exceeded or if a predefined limit value in the distance of a predefined point of the robotic manipulator 1 with respect to the geometric structure is undershot. The predefined function is an activation of an object on the graphical user interface, at least if the coordinate coincides with a predefined coordinate range of the object.
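The activation logic of paragraph [0058] can be sketched as follows. This is an illustrative Python example under stated assumptions: the thresholds, the rectangle representation of a GUI object, and all function names are hypothetical and not taken from the disclosure.

```python
# Hypothetical sketch of the activation condition: a predefined function
# (here, a GUI "click" on an object) fires when the guiding resistance
# exceeds a limit OR the distance of a predefined point of the
# manipulator to the plane undershoots a limit. Thresholds are assumed.

RESISTANCE_LIMIT_N = 15.0  # assumed force threshold in newtons
DISTANCE_LIMIT_M = 0.005   # assumed distance threshold (5 mm)

def should_activate(resistance_n: float, distance_to_plane_m: float) -> bool:
    """True when the predefined function is to be activated."""
    return (resistance_n > RESISTANCE_LIMIT_N
            or distance_to_plane_m < DISTANCE_LIMIT_M)

def coordinate_in_object(coord: tuple[int, int],
                         obj_rect: tuple[int, int, int, int]) -> bool:
    """True when the GUI coordinate lies in the object's coordinate range."""
    x, y = coord
    x0, y0, x1, y1 = obj_rect
    return x0 <= x <= x1 and y0 <= y <= y1

# Example: pressing firmly while the pointer is over a button activates it.
button = (100, 100, 260, 140)  # assumed pixel rectangle (x0, y0, x1, y1)
if should_activate(18.0, 0.02) and coordinate_in_object((150, 120), button):
    print("object activated")
```

The two conditions give two physical "click" gestures: pressing against the resistive plane hard enough, or pushing through it close enough, either of which activates the object under the pointer.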
The computing unit 5 is moreover designed to actuate the actuators 3 in such a manner that, during manual guiding of the robotic manipulator 1, the robotic manipulator 1 outputs haptic feedback and/or tactile feedback of the recorded input variable and/or the transformed input variable, wherein the haptic feedback and/or the tactile feedback in each case includes at least one of the following:
[0059] a position-dependent grid,
[0060] a resistance-caused limitation of the work region in which the robotic manipulator 1 can be manually guided,
[0061] feedback in the case where an input variable and/or a current position and/or a current orientation of the robotic manipulator 1 coincides with an object on the graphical user interface, wherein the coordinate of the graphical user interface is assigned to the input variable and/or the current position and/or the current orientation of the robotic manipulator 1, and
[0062] a signal if the coordinate coincides with a predefined coordinate range of the object of the graphical user interface.
The display unit 9 here is a screen.
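Two of the haptic-feedback modes listed above can be sketched in code. This Python example is a minimal illustration under stated assumptions: the grid spacing, stiffness values, and work-region limits are invented for the example, and the virtual-spring formulation is one common way (not necessarily the patented one) to render such feedback through the actuators.

```python
# Illustrative sketch of two haptic-feedback modes: a position-dependent
# grid realised as a virtual spring pulling the guided axis toward the
# nearest grid line, and a stiff "wall" limiting the work region.
# All numeric values are assumptions.

GRID_M = 0.02                  # assumed grid spacing in metres
STIFFNESS = 500.0              # assumed spring stiffness in N/m
WORK_MIN, WORK_MAX = 0.0, 0.5  # assumed work region on this axis, metres
WALL_STIFFNESS = 5000.0        # assumed wall stiffness in N/m

def grid_feedback_force(pos_m: float) -> float:
    """Restoring force (N) toward the nearest grid line."""
    nearest = round(pos_m / GRID_M) * GRID_M
    return STIFFNESS * (nearest - pos_m)

def workspace_force(pos_m: float) -> float:
    """Stiff restoring force (N) when the work-region border is crossed."""
    if pos_m < WORK_MIN:
        return WALL_STIFFNESS * (WORK_MIN - pos_m)
    if pos_m > WORK_MAX:
        return WALL_STIFFNESS * (WORK_MAX - pos_m)
    return 0.0
```

The sum of these per-axis forces would be converted into joint torques for the actuators 3 (e.g. via the manipulator Jacobian in an impedance controller), so that the user feels the grid detents and the workspace border while guiding the manipulator by hand.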
[0068] Although the invention has been illustrated and explained in greater detail by way of a preferred exemplary embodiment, the invention is not limited by the disclosed examples, and other variations can be derived therefrom by the person skilled in the art without exceeding the scope of protection of the invention. It is therefore clear that a multiplicity of possible variations exists, and that the embodiments mentioned by way of example are in fact only examples which are in no way to be interpreted as limiting the scope of protection, the possible applications, or the configuration of the invention. Instead, the above description and the description of the figures enable the person skilled in the art to implement the exemplary embodiments concretely, wherein the person skilled in the art, being cognizant of the disclosed inventive idea, can make diverse changes, for example with regard to the function or the arrangement of elements mentioned in an exemplary embodiment, without departing from the scope of protection defined by the claims and their legal equivalents, such as further explanations in the description.
LIST OF REFERENCE NUMERALS
[0069] 1 Robotic manipulator
[0070] 3 Actuators
[0071] 5 Computing unit
[0072] 7 Sensor unit
[0073] 9 Display unit
[0074] 100 System
[0075] S1 Recording
[0076] S2 Transmitting
[0077] S3 Transforming
[0078] S4 Displaying