MONITORING METHOD AND ROBOTIC SYSTEM
20230405823 · 2023-12-21
Inventors
- Hayo Knoop (Forchheim, DE)
- Elisabeth Preuhs (Erlangen, DE)
- Markus Kowarschik (Nürnberg, DE)
- Stephan Kellnberger (Erlangen, DE)
CPC classification
B25J9/1679
PERFORMING OPERATIONS; TRANSPORTING
B25J9/1674
PERFORMING OPERATIONS; TRANSPORTING
Abstract
For a simple monitoring of a robotic system that is configured for robot-assisted actuation of a movement of a medical object in a hollow organ of a patient, the robotic system includes at least one drive system, a robot control unit, and an acoustic sensor. A method is provided and includes receiving, by the acoustic sensor, acoustic signals of the robotic system during the operation of the robotic system for moving the medical object. At least one signal pattern is recognized in the received acoustic signals. The at least one recognized signal pattern is evaluated with respect to an associated action flow of at least one component of the robotic system. The method includes checking whether the action flow is an intended action flow, and actuating an action if the action flow is unintended.
Claims
1. A method for monitoring a robotic system that is configured for robot-assisted actuation of a movement of a medical object in a hollow organ of a patient, the robotic system including at least one drive system, a robot control unit, and an acoustic sensor, the method comprising: receiving, by the acoustic sensor, acoustic signals of the robotic system during operation of the robotic system for moving the medical object; recognizing at least one signal pattern in the received acoustic signals; evaluating the at least one recognized signal pattern with respect to an associated action flow of at least one component of the robotic system; checking whether the associated action flow is an intended action flow; and actuating an action when the associated action flow is unintended.
2. The method of claim 1, wherein the actuated action includes outputting a message, outputting a warning, outputting an action proposal, automatically interrupting the operation of the robotic system, an automatic corrective action so as to eliminate the unintended action flow, or any combination thereof.
3. The method of claim 1, wherein checking whether the associated action flow is the intended action flow comprises comparing the associated action flow with a planning guideline, a database, a control guideline, or any combination thereof.
4. The method of claim 1, wherein the acoustic signals are received from at least one drive of the at least one drive system.
5. The method of claim 1, wherein at least one pre-trained machine-learning algorithm is used for evaluating and checking the signal pattern.
6. The method of claim 1, further comprising: performing continuous monitoring during the operation of the robotic system; and terminating the continuous monitoring upon deactivation of the robotic system.
7. The method of claim 1, wherein the method is triggered by an activation of the robotic system.
8. The method of claim 1, further comprising using a further monitoring method.
9. The method of claim 8, wherein the further monitoring method includes monitoring by imaging.
10. The method of claim 1, further comprising: performing an evaluation with respect to an unintended slip between a drive and the medical object; and actuating an action when the unintended slip is determined.
11. A robotic system comprising: a robot control unit; a robot-assisted drive system comprising: a drive; and a drive mechanism, wherein the drive system is configured to move a medical object in a hollow organ of a patient based on control signals of the robot control unit; an acoustic sensor configured to receive acoustic signals of the robotic system, the acoustic sensor being arranged on the robotic system; and an evaluating unit configured to: recognize signal patterns; and evaluate the signal patterns with respect to an associated action flow of at least one component of the robotic system.
12. The robotic system of claim 11, wherein the robot control unit is configured to actuate an action.
13. The robotic system of claim 12, wherein the actuation of the action comprises: output of a message; output of a warning; output of an action proposal; automatic interruption of the operation of the robotic system; performance of a corrective action so as to eliminate the unintended action flow; or any combination thereof.
14. The robotic system of claim 11, further comprising a display unit configured to display messages, warnings, action proposals, or any combination thereof.
15. The robotic system of claim 13, further comprising a display unit configured to display messages, warnings, action proposals, or any combination thereof.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
DETAILED DESCRIPTION
[0019] During operation 20 of a robotic system 1 (e.g., during a robot-assisted navigation of a catheter 3 and a guide wire 4 through a hollow organ of a patient), acoustic signals from at least one component of the robotic system 1 are received in a first act 21 by at least one acoustic sensor 11 (e.g., a microphone). The component may be, for example, a drive 7 that drives a drive mechanism, such as, for example, a manipulator 10. The manipulator 10 is connected directly or indirectly to the medical object (e.g., the guide wire 4 or the catheter 3). The component may also be a different mechanical component, a component for transferring force to the medical object or the medical object itself, or two or more medical objects that are mutually connected or nested in one another (e.g., catheter or guide wire). It is also possible, using a number of acoustic sensors 11 (e.g., microphones), to receive acoustic signals from a number of components.
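The first act (receiving and buffering acoustic samples from a component) can be illustrated with a minimal sketch. The sampling rate, frame length, hop size, and the synthetic test signal below are assumptions for illustration only; a real system would read the sample stream from the microphone hardware:

```python
import numpy as np

def frame_signal(samples, frame_len, hop):
    """Split a 1-D sample stream into overlapping analysis frames."""
    n_frames = 1 + (len(samples) - frame_len) // hop
    return np.stack([samples[i * hop : i * hop + frame_len]
                     for i in range(n_frames)])

# Stand-in for a microphone stream: 1 s of synthetic drive noise at 8 kHz.
fs = 8000
t = np.arange(fs) / fs
samples = 0.1 * np.sin(2 * np.pi * 120 * t) + 0.01 * np.random.randn(fs)

frames = frame_signal(samples, frame_len=256, hop=128)
print(frames.shape)  # one row per analysis frame
```

Overlapping frames of this kind are the usual unit on which signal patterns are subsequently recognized.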
[0020] In a second act 22, the received acoustic signals are analyzed with respect to signal patterns, or signal patterns are detected in the acoustic signals. This may be performed, for example, by software algorithms (e.g., using artificial intelligence or pre-trained machine-learning algorithms, such as deep neural networks). Signal patterns may be continuously recognized in this manner. In a third act 23, the recognized signal patterns are evaluated with respect to an associated action flow of at least one component of the robotic system. This may likewise be performed by software algorithms (e.g., using artificial intelligence or pre-trained machine-learning algorithms, such as deep neural networks). It is possible to provide that an acoustic sensor is respectively assigned to a predetermined component so that the evaluation may be limited to this component.
[0021] An action flow that is associated with a signal pattern is understood in this case to mean the process and/or function and/or incorrect function and/or event by which the corresponding characteristic signal pattern has been generated. The present embodiments are based on the knowledge that each action flow (e.g., each process and/or function and/or incorrect function and/or event) generates a characteristic noise (e.g., signal pattern) that differs from the signal patterns of other action flows and, as a result, is clearly identifiable. Thus, for example, the signal pattern of the drive in the case of a continuous translational or rotational movement of the guide wire differs from the signal pattern that a slipping through/slip of the guide wire generates, since in that case, a resistance is exceeded, for example. Such a signal pattern is clearly recognizable and, as a result, easily identifiable in such an evaluation. Other processes, events, or functions may likewise be unambiguously assigned. An appropriate machine-learning algorithm may be trained with the possibly occurring signal patterns prior to use of the method. As a result, it is also possible to differentiate short pattern pulses from longer-lasting patterns.
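The distinction between the characteristic pattern of a continuous drive movement and that of a slip can be sketched with a toy classifier. The coarse band-energy features and the nearest-centroid rule below stand in for the pre-trained machine-learning algorithm; all numeric values and labels are illustrative assumptions, not values from the disclosure:

```python
import numpy as np

def band_energies(frame, n_bands=4):
    """Coarse spectral signature: normalized energy in equal-width bands."""
    spec = np.abs(np.fft.rfft(frame)) ** 2
    e = np.array([b.sum() for b in np.array_split(spec, n_bands)])
    return e / e.sum()

def classify(frame, centroids):
    """Nearest-centroid stand-in for the pre-trained pattern recognizer."""
    feat = band_energies(frame)
    dists = {label: np.linalg.norm(feat - c) for label, c in centroids.items()}
    return min(dists, key=dists.get)

fs = 8000
t = np.arange(256) / fs
drive_frame = np.sin(2 * np.pi * 120 * t)                   # steady low-frequency hum
slip_frame = np.random.default_rng(0).standard_normal(256)  # broadband burst

centroids = {
    "continuous_movement": band_energies(drive_frame),
    "slip": band_energies(slip_frame),
}
print(classify(drive_frame, centroids))  # continuous_movement
```

A deployed system would replace the centroids with a network trained on recorded signal patterns of each action flow, but the mapping from frame to action-flow label is the same.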
[0022] If the evaluation indicates that a predetermined action flow has occurred, a check is performed in a fourth act 24 as to whether the action flow is an intended action flow (e.g., an action flow that is intended in the manner in which it has occurred). This may be performed, for example, with the aid of a comparison with a planning guideline, a database, or a control guideline. Which action flows are intended and which are not may also be stored in a storage device or in the evaluation software itself. For example, it may be stored that slip is unintended during a navigation procedure and that a uniform movement of the medical object is intended.
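The comparison against a stored planning or control guideline reduces to a lookup. The flow names and the phase table below are hypothetical, chosen only to illustrate the check:

```python
# Hypothetical guideline table: which action flows are intended in which
# procedure phase (illustrative names, not taken from the disclosure).
INTENDED_FLOWS = {
    "navigation": {"continuous_translation", "continuous_rotation"},
    "idle": {"standstill"},
}

def is_intended(action_flow: str, phase: str) -> bool:
    """Check the recognized action flow against the stored guideline."""
    return action_flow in INTENDED_FLOWS.get(phase, set())

print(is_intended("continuous_translation", "navigation"))  # True
print(is_intended("slip", "navigation"))                    # False
```

An equivalent check could query a database or a planning system instead of an in-memory table.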
[0023] If an unintended action flow is determined (e.g., if it is stored that a slip or an incorrect function is unintended and a slip is determined in the case of a movement), an action is subsequently actuated in a fifth act 25. The actuated action may be, for example, outputting a message, outputting a warning, outputting an action proposal, automatically interrupting the operation of the robotic system, and/or an automatic corrective action so as to eliminate the unintended action flow.
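The fifth act can be sketched as a dispatch over the listed actions. The severity levels and message strings below are assumptions for illustration; an actual robot control unit would drive hardware and user-interface outputs rather than return log lines:

```python
def actuate(action_flow: str, severity: str):
    """Choose a response to an unintended action flow (illustrative policy)."""
    log = [f"warning: unintended action flow '{action_flow}' detected"]
    if severity == "critical":
        log.append("operation of the robotic system interrupted")
    elif severity == "correctable":
        # e.g., short forward/backward movements to eliminate slip
        log.append("automatic corrective movement actuated")
    else:
        log.append("action proposal output to the user")
    return log

for line in actuate("slip", "correctable"):
    print(line)
```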
[0024] A warning may be output in an acoustic, haptic, or visual manner, for example (e.g., as a colored light or as output text on a monitor). Additionally or alternatively, a request may be made for a user action that is a general action or an action that is adapted to the incorrect function (e.g., for cleaning or checking predetermined components or also for manually continuing the movement). Further, it is possible, for example, to suggest or automatically actuate additional movements of the components (e.g., short forward and backward movements) in order to eliminate the unintended action flow (e.g., slip). It is also possible, for example, to activate further or the same mechanical components or amplify their action in order to correct the unintended action flow. It is also possible to correct the target values originally calculated by activating the component (e.g., drive) and to activate a repeated (e.g., semi-automatic) movement. In critical situations, it is also possible to completely stop the operation of the robotic system. Further, the unintended action flow may be recorded. Thus, it is possible to subsequently correct erroneous measurement values of the robotic system that occur as a result of slipping through/slip, for example.
[0025] The machine-learning algorithm(s) that are used for the method (e.g., deep neural networks, long short-term memory networks, or recurrent neural networks) may be pre-trained using large amounts of data. For this purpose, automatic movements may be performed by an appropriate component (e.g., a drive mechanism, such as a manipulator, may perform a number of passes). Training may be performed by repeated and varied model tests and comparisons with the actual geometry as ground truth.
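Pre-training on labeled signal patterns can be sketched with a toy stand-in: logistic regression on two coarse features replaces the deep network, and synthetic feature clusters replace the recorded passes. Everything below is illustrative, not the disclosed training setup:

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic labeled data: "drive" frames cluster near [1.0, 0.0] and
# "slip" frames near [0.5, 0.5] in a two-band feature space (assumed values).
X = np.vstack([rng.normal([1.0, 0.0], 0.05, (100, 2)),
               rng.normal([0.5, 0.5], 0.05, (100, 2))])
y = np.array([0] * 100 + [1] * 100)  # 0 = drive, 1 = slip

# Logistic regression trained by gradient descent (stand-in for the network).
w, b = np.zeros(2), 0.0
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))   # predicted slip probability
    w -= 1.0 * (X.T @ (p - y)) / len(y)      # gradient step on weights
    b -= 1.0 * (p - y).mean()                # gradient step on bias

pred = (1.0 / (1.0 + np.exp(-(X @ w + b))) > 0.5).astype(int)
print((pred == y).mean())  # training accuracy
```

The same loop shape applies when the stand-in model is replaced by a recurrent or LSTM network over raw signal windows.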
[0026] The recognition of signal patterns and the evaluation thereof may be performed online or live in a very short time. By appropriately training the algorithm used, it is also possible to recognize parts of typical signal patterns at an early stage so that in many cases an occurring problem (e.g., unintended action flow) may be identified and counteracted at an early stage. In this manner, signal patterns may be used to predict the next movements and/or events (e.g., whether and when slip will occur). This information may then likewise be used to actuate actions (e.g., in order to output warnings or to perform appropriate automatic actions so as to prevent the event (slip)).
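Early recognition of a developing pattern can be sketched as prefix matching against a stored template: only the part of the pattern received so far is compared, so a warning can be raised before the pattern completes. The template values and tolerance are illustrative assumptions:

```python
import numpy as np

# Hypothetical stored template of the onset of a slip signal pattern.
slip_onset = np.array([0.1, 0.3, 0.9, 0.8])

def predicts_slip(partial, template, tol=0.15):
    """Compare an incoming partial pattern against the template prefix."""
    n = min(len(partial), len(template))
    return bool(np.max(np.abs(partial[:n] - template[:n])) < tol)

incoming = np.array([0.12, 0.28, 0.85])  # pattern still developing
if predicts_slip(incoming, slip_onset):
    print("early warning: slip predicted")
```

A trained sequence model would replace the fixed template, but the principle of acting on a partial match is the same.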
[0027] In addition to the acoustic monitoring by the acoustic sensors, it is also possible to use a further monitoring method in order to observe the robotic system or the movements of the medical object through the hollow organ of the patient. Thus, monitoring may be performed by fluoroscopy (e.g., X-ray imaging), for example. The corresponding images captured by an X-ray machine may also be evaluated (e.g., automatically by algorithms). The results may be coupled (e.g., temporally) with the results of the acoustic monitoring in order to obtain an even better evaluation of the navigation of the medical object and to be able to optimize the treatment of the patient. Fluoroscopy images may also be used for training the machine-learning algorithm for acoustic monitoring.
[0028] The exemplary robotic system 1 is shown in the figures.
[0029] Typical robotic systems for which the method is applicable (provided that the robotic systems are supplemented with acoustic sensors, an evaluating unit, and appropriate software) include the CorPath GRX system from Corindus, Inc. or an LBR system having a catheter. In the case of the former, the acoustic sensors may be arranged, for example, on the cassette of the drive unit.
[0030] The monitoring method uses acoustic sensors that receive acoustic signals from components of a robotic system and, by evaluating signal patterns, makes it possible to improve the robot-assisted movement of a medical object in a hollow organ of a patient and to optimize the navigation.
[0031] The present embodiments may be briefly summarized in the following manner. For a particularly simple monitoring of a robotic system that is configured for the robot-assisted actuation of a movement of a medical object in a hollow organ of a patient, the robotic system having at least one drive system, a robot control unit, and an acoustic sensor, a method that includes the following acts is provided: using the at least one acoustic sensor, receiving acoustic signals of the robotic system during the operation of the robotic system for moving the medical object; recognizing at least one signal pattern in the received acoustic signals; evaluating the at least one recognized signal pattern with respect to an associated action flow of at least one component of the robotic system; checking whether the action flow is an intended action flow; and actuating an action if the action flow is unintended.
[0032] The elements and features recited in the appended claims may be combined in different ways to produce new claims that likewise fall within the scope of the present invention. Thus, whereas the dependent claims appended below depend from only a single independent or dependent claim, it is to be understood that these dependent claims may, alternatively, be made to depend in the alternative from any preceding or following claim, whether independent or dependent. Such new combinations are to be understood as forming a part of the present specification.
[0033] While the present invention has been described above by reference to various embodiments, it should be understood that many changes and modifications can be made to the described embodiments. It is therefore intended that the foregoing description be regarded as illustrative rather than limiting, and that it be understood that all equivalents and/or combinations of embodiments are intended to be included in this description.