Robotic Training System
20180333861 · 2018-11-22
Inventors
- Andreas Keibel (Augsburg, DE)
- Henry Arenbeck (Duisburg, DE)
- Melanie Kolditz (Aachen, DE)
- Kirsten Albracht (Koln, DE)
- Dirk Abel (Aachen, DE)
- Gert-Peter Brueggemann (Koln, DE)
CPC Classification
A63B24/0087
HUMAN NECESSITIES
A63B23/03525
HUMAN NECESSITIES
A63B21/00178
HUMAN NECESSITIES
A63B23/0405
HUMAN NECESSITIES
A61H2230/605
HUMAN NECESSITIES
A61B5/222
HUMAN NECESSITIES
A61B5/0205
HUMAN NECESSITIES
B25J13/08
PERFORMING OPERATIONS; TRANSPORTING
A61H2201/1659
HUMAN NECESSITIES
A63B2071/0072
HUMAN NECESSITIES
B25J13/089
PERFORMING OPERATIONS; TRANSPORTING
A63B71/0054
HUMAN NECESSITIES
A63B2225/15
HUMAN NECESSITIES
A61H2230/045
HUMAN NECESSITIES
A63B2024/0093
HUMAN NECESSITIES
International Classification
Abstract
A method for controlling the robot of a training system, wherein a biomechanical and/or cardiovascular stress of the user, particularly based on a measured impingement of the actuation surface, is determined and the robot is controlled based on a predetermined and the measured biomechanical and/or cardiovascular stress of the user. A computer program product with a program code, stored on a computer-readable medium, for implementing such a method.
Claims
1. A training system with a robot (10); a robot-guided actuation surface (30A); an activity detection means (40) for detecting a biomechanical and/or cardiovascular stress of a user (20), particularly based on an impingement of the actuation surface determined by a force detection means (12) of the training system; and a control means (40) for controlling the robot based on a predetermined and a measured biomechanical and/or cardiovascular stress of the user.
2. A training system according to claim 1, wherein the activity detection means is implemented to determine the stress of the user based on at least one biomechanical and/or cardiovascular model, particularly a modular and/or parameterizable one, and/or based on a measured status of the user.
3. A training system according to the previous claim, wherein the activity detection means is embodied to determine the status of the user based on a detected position, acceleration, nerve, muscle, and/or cardiovascular activity, and/or dimensions of a biological structure of the user.
4. A training system according to the previous claim, wherein the activity detection means features at least one, particularly inertial, position sensor arranged at the user, at least one acceleration sensor, at least one EMG sensor, at least one sensor for determining a cardiovascular parameter, at least one particularly non-invasive sensor for determining a dimension of a biological structure of the user, and/or at least one room monitoring sensor (70).
5. A training system according to any of the previous claims, wherein the control means is implemented to control a force, particularly its direction and/or magnitude, exerted by the robot upon the robot-guided actuation surface, and/or a motion of the robot-guided actuation surface by the robot, particularly its direction and/or speed of motion, based on the predetermined and the measured biomechanical and/or cardiovascular stress of the user.
6. A training system according to any of the previous claims, featuring a safety means (50) for the particularly redundant monitoring of the impingement of the actuation surface, the measured biomechanical and/or cardiovascular stress of the user, and/or the status of the robot.
7. A training system according to the previous claim, wherein the safety means is implemented to perform compensating motions if an impermissible impingement of the actuation surface or biomechanical and/or cardiovascular stress of the user or an impermissible status of the robot is determined.
8. A training system according to any of the previous claims, wherein the control means is implemented to identify the user (20), particularly in a touchless fashion, and to control the robot based on the user identified.
9. A training system according to any of the previous claims, featuring at least two actuation surfaces (30A, 30B, 30C), which can optionally be coupled to the robot, with the control means being implemented to at least partially automatically change the robot-guided actuation surfaces and/or identify them and to control the robot based on the identified robot-guided actuation surface.
10. A training system according to any of the previous claims, featuring a fixing means for fixing the user to a robotic actuation surface and/or a user positioning device (60).
11. A training system according to any of the previous claims, featuring output means for issuing feedback based on the determined biomechanical and/or cardiovascular stress.
12. A method for controlling the robot of a training system according to any of the previous claims, wherein a biomechanical and/or cardiovascular stress of the user, particularly based on a measured impingement of the actuation surface, is determined and the robot is controlled using a predetermined and the measured biomechanical and/or cardiovascular stress of the user.
13. A computer program product with a program code, which is stored on a computer-readable medium, for implementing a method according to the previous claim.
Description
DETAILED DESCRIPTION
[0067] The training system features a robot 10. The robot features an arm with six rotary joints actuated by electric motors, whose axes of rotation are pairwise perpendicular or parallel to each other.
[0068] The training system further features several different actuation surfaces 30A, 30B, and 30C, which can optionally be coupled in a detachable fashion to a robot flange 11 and are thus guided by the robot. The robot flange 11 has, in reference to a robot base fixed to the environment, the degrees of freedom defined by the six joints of the robot.
[0069] In the exemplary embodiment the presently coupled and/or robot-guided actuation surface 30A comprises a platform for supporting one or both feet of a user, so that the training system can particularly act as a so-called function support, as indicated in
[0070] The training system features a force detection means for determining the forces and torques acting on the actuation surface in three mutually orthogonal directions, in the form of a six-axis force/torque sensor 12, which is arranged between the robot flange 11 and the actuation surface 30A.
[0071] The training system features an activity detection means for determining a biomechanical stress of a user 20 based on the measured impingement of the actuation surface as well as control means for controlling the drives of the robot 10 based on a predetermined and the measured biomechanical stress of the user, which are both implemented in a control 40.
[0072] In the exemplary embodiment, in which the robot 10 acts as a function support, for example the forces and torques acting in the knee joint of the user 20 are determined from the measured impingement of the actuation surface 30A using the biomechanical model and compared to the predetermined stress. The control 40 then controls the robot 10 such that the forces and torques acting in the knee joint of the user 20 approach the desired stress, or such that the permitted stress is not exceeded.
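The closed loop described above can be sketched as follows. This is a minimal illustrative model in Python; the function name, the proportional gain, and the scalar stress values are assumptions for illustration, not the patent's actual controller:

```python
def adjust_robot_force(commanded_force, measured_stress, target_stress,
                       max_stress, gain=0.1):
    """One step of a simple proportional adjustment: drive the measured
    joint stress toward the predetermined target, and withdraw the force
    entirely if the permitted stress is exceeded (illustrative sketch)."""
    if measured_stress > max_stress:
        return 0.0  # do not exceed the permitted joint stress
    # move the commanded force proportionally toward the target stress
    return commanded_force + gain * (target_stress - measured_stress)
```

For instance, with a commanded force of 50 N, a measured knee stress of 80 (arbitrary units) against a target of 100, the command is raised slightly; once the measured stress exceeds the permitted maximum, the force is withdrawn.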
[0073] Additionally, based on the measured impingement of the actuation surface 30A and using a biomechanical model, the control determines a muscular stress in the knee extensor, compares it with a predetermined optimal training stimulus, and controls the robot 10 such that the forces acting in the knee extensor of the user 20 approach the desired training stress.
[0074] This way, excess stress of the knee joint can advantageously be avoided while the knee extensor is simultaneously loaded optimally.
[0075] The control 40 features several biomechanical model modules, which model various parts of the motion system of the user at different degrees of complexity. From these modules, the control 40 optionally assembles, particularly for each training plan, the respective biomechanical model, based on which it then determines the stress of the user 20 and controls the robot 10.
[0076] The biomechanical models can be parameterized in order to adapt them to different users. The parameters of the model are entered by the user or a trainer, or determined from a database, particularly by recognizing a user identity and recalling parameters associated with said user identity from a memory unit of the control 40. Additionally or alternatively, one or more of the parameters can be determined, particularly identified or estimated, by the training system itself.
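The parameter sourcing described in the preceding paragraph can be sketched as a simple precedence scheme in Python; the parameter names, user IDs, and numeric values are purely illustrative assumptions:

```python
# Defaults used when nothing is known about the user (illustrative values).
DEFAULT_PARAMS = {"body_mass_kg": 75.0, "shank_length_m": 0.43}

# Stand-in for the memory unit of the control 40, keyed by user identity.
USER_DB = {"user-001": {"body_mass_kg": 62.0, "shank_length_m": 0.40}}

def model_parameters(user_id, entered=None):
    """Assemble biomechanical model parameters: defaults first, then
    values recalled for a recognized user identity, then values entered
    by the user or a trainer, in increasing order of precedence."""
    params = dict(DEFAULT_PARAMS)
    params.update(USER_DB.get(user_id, {}))
    params.update(entered or {})
    return params
```

A recognized user's stored parameters override the defaults, while manually entered values take precedence over both; system-identified estimates could be merged in the same way.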
[0077] When determining the stress of the user 20, the control 40 additionally considers the positions of reference points of the user, which in the exemplary embodiment are determined by room monitoring sensors fixed in reference to the environment, for example a camera 70 with appropriate image detection. In a variant, not shown, the positions of the reference points can additionally or alternatively be determined by position sensors arranged at the user. In another variant, not shown either, the control 40 can additionally or alternatively also consider nerve and/or muscle activities of the user 20, determined by EMG sensors arranged at the user, when determining his/her stress.
[0078] The reference points may have a known position in reference to joints of the motion system of the user, for example the knee joint. The control 40 can then determine the position of the knee joint based on the registered positions of the reference points and, in consideration of the impingement of the actuation surface 30A, determine the stress in the knee joint.
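As a worked sketch of the last step, the moment about the knee joint can be estimated as the cross product τ = r × F of the lever arm r (from the tracked knee position to the point of force application) with the measured contact force. The rigid, single-point-contact model below is an illustrative assumption, not the patent's actual biomechanical model:

```python
def knee_moment(knee_pos, contact_pos, force):
    """Moment about the knee joint, tau = r x F, with r the lever arm
    from the knee position to the point where the measured force acts
    (positions in meters, force in newtons; illustrative sketch)."""
    rx = contact_pos[0] - knee_pos[0]
    ry = contact_pos[1] - knee_pos[1]
    rz = contact_pos[2] - knee_pos[2]
    fx, fy, fz = force
    # component-wise cross product r x F
    return (ry * fz - rz * fy,
            rz * fx - rx * fz,
            rx * fy - ry * fx)
```

For a contact point 0.4 m below the knee and a horizontal 100 N contact force, this yields a 40 N·m moment about the transverse axis, which the control can then compare against a permitted knee load.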
[0079] In the exemplary embodiment the control 40 controls a force, which the robot 10 exerts upon the robot-guided actuation surface 30A as well as a motion of the robot-guided actuation surface by the robot based on the predetermined and the measured biomechanical stress of the user.
[0080] If, for example, excess biomechanical stress of the knee joint of the user 20 is determined based on the measured impingement of the actuation surface 30A, the control 40 can reduce the force by which the robot 10 impinges upon the actuation surface 30A, particularly against a motion of the user (concentric training), and/or change its direction, and/or alter the motion trajectory of the actuation surface 30A such that the biomechanical stress of the knee joint is reduced, for example by (better) aligning the motion of the actuation surface 30A with an axis of motion of the knee joint.
[0081] The training system features a safety means with a safety control 50 for monitoring the impingement of the actuation surface 30A, the measured biomechanical stress of the user, and the condition of the robot 10.
[0082] The safety control 50 redundantly detects the impingement of the actuation surface 30A via the force/torque sensor 12, and the status of the robot 10, particularly its position, speed, and/or acceleration, via light sensors 71, 72. Additionally, it compares the measured biomechanical stress of the user 20 with a predetermined permissible biomechanical stress, for example maximally permitted forces in the knee.
[0083] If the safety control 50 detects an impermissible impingement of the actuation surface 30A or an impermissible status of the robot 10, for example a force exerted upon the actuation surface 30A exceeding a predetermined limit or the robot 10 leaving the predetermined area defined by the light sensors 71, 72, or if it detects an impermissible biomechanical stress of the user 20, it performs a compensating motion of the robot-guided actuation surface 30A into a predetermined default position.
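The monitoring logic of the safety control 50 can be sketched as follows; the limit values and the string return codes are illustrative assumptions, not figures from the patent:

```python
def safety_action(surface_force_n, robot_in_area, knee_load_n,
                  force_limit_n=800.0, knee_limit_n=150.0):
    """Return 'compensate' (retreat to the predetermined default
    position) if any monitored quantity is impermissible, else
    'continue' (illustrative sketch of the safety control 50)."""
    if surface_force_n > force_limit_n:  # impermissible impingement
        return "compensate"
    if not robot_in_area:                # robot left the light-sensor area
        return "compensate"
    if knee_load_n > knee_limit_n:       # impermissible user stress
        return "compensate"
    return "continue"
```

Any single violated condition suffices to trigger the compensating motion, matching the "or" structure of the paragraph above.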
[0084] In addition or as an alternative to the light sensors 71, 72 and/or the camera 70, the safety control 50 can also detect the position of the robot 10 by position and/or joint angle sensors 13 at the joints of the robot.
[0085] As already mentioned above, the training system features in the exemplary embodiment three different actuation surfaces 30A-30C, which can optionally be coupled to the robot 10.
[0086] The control 40 identifies the actuation surface that is presently robot-guided, i.e., coupled to the robot flange 11 (in the exemplary embodiment, 30A), and controls the robot 10 based on the identified robot-guided actuation surface. For this purpose, the different actuation surfaces 30A-30C each include an RFID transponder 32A, 32B, or 32C, and the robot 10 includes a means 31 for the electromagnetic detection of the presently coupled RFID transponder.
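The identification step can be sketched as a simple registry lookup; the transponder-to-surface mapping mirrors the reference characters, while the function name and the error handling are illustrative assumptions:

```python
# Maps each RFID transponder (32A-32C) to its actuation surface (30A-30C).
SURFACE_REGISTRY = {"32A": "30A", "32B": "30B", "32C": "30C"}

def identify_surface(transponder_id):
    """Resolve the electromagnetically read transponder to the coupled
    actuation surface; refuse to operate with an unknown surface
    (illustrative sketch)."""
    surface = SURFACE_REGISTRY.get(transponder_id)
    if surface is None:
        raise ValueError("unknown actuation surface: " + transponder_id)
    return surface
```

Controlling the robot based on the identified surface could then amount to selecting the training mode and model associated with that surface.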
[0087] The training system features a user positioning device 60 with an adjustable seating area and a backrest.
[0088] Although in the previous description exemplary embodiments were explained, it shall be pointed out that a plurality of variants is possible. Additionally, it shall be pointed out that the exemplary embodiments only represent examples which shall not limit the scope of protection, the applications, and the design in any way. Rather, a specialist shall be provided in the previous description with a guideline for implementing at least one exemplary embodiment, wherein various changes, particularly with regards to the function and arrangement of the components described, may be performed without leaving the scope of protection, as discernible from the claims and combinations of features equivalent thereto.
[0089] While the present invention has been illustrated by a description of various embodiments, and while these embodiments have been described in considerable detail, it is not intended to restrict or in any way limit the scope of the appended claims to such detail. The various features shown and described herein may be used alone or in any combination. Additional advantages and modifications will readily appear to those skilled in the art. The invention in its broader aspects is therefore not limited to the specific details, representative apparatus and method, and illustrative example shown and described. Accordingly, departures may be made from such details without departing from the spirit and scope of the general inventive concept.
List of reference characters
10 robot
11 robot flange
12 force/torque sensor
13 joint angle sensor
20 user
30A, 30B, 30C actuation surface
31 means for detecting an RFID transponder
32A, 32B, 32C RFID transponder
40 (robot) control (activity detection and control means)
50 safety control
60 user positioning device
70 camera (room monitoring system)
71, 72 light sensors