Robot System And Method For Operating A Teleoperative Process

20180085926 · 2018-03-29

Abstract

A system and a method for carrying out a teleoperative process, wherein an image recording device and a tool are used. The tool may be guided by a first manipulator, and a currently captured region of the image recording device is determined. Monitoring the position of the tool relative to the currently captured region of the image recording device makes it possible to prevent the tool from unintentionally leaving the captured region. The axes of the manipulator are provided with sensors for detecting the forces and/or torques acting on the axes.

Claims

1-12. (canceled)

13. A robot system for a teleoperative process, comprising: an image recording device; a tool guided by a first manipulator, wherein the axes of the manipulator are provided with sensors for detecting at least one of forces or torques acting on the axes; and a control unit configured to: (a) determine a currently captured region of the image recording device, (b) determine a pose of the tool relative to the image recording device, and (c) perform an action when the tool violates limits of the currently captured region of the image recording device.

14. The robot system of claim 13, wherein the first manipulator is a multi-axis articulated robot arm.

15. The robot system of claim 13, wherein the action performed by the control unit comprises at least one of the output of a warning, a shutdown, hard switching of the first manipulator, or a deactivation of the tool.

16. The robot system of claim 13, wherein the image recording device is guided by a second manipulator.

17. The robot system of claim 16, wherein the second manipulator is a multi-axis articulated robot arm.

18. The robot system of claim 16, wherein the control unit is further configured to determine the currently captured region of the image recording device using a manipulator controller of the second manipulator.

19. The robot system of claim 13, wherein the control unit is further configured to determine the position of the tool using a manipulator controller of the first manipulator.

20. The robot system of claim 13, wherein the image recording device is an ultrasound probe or an endoscopic imaging device.

21. The robot system of claim 13, wherein the control unit determines the pose of the tool relative to the image recording device by determining the pose of only a part of the tool.

22. A method for operating a teleoperative process, comprising: determining the pose of a tool guided by a first manipulator, wherein the axes of the manipulator are provided with sensors for detecting at least one of forces or torques acting on the axes; determining a currently captured region of an image recording device that provides a user with a visualization of the pose of the tool; determining the pose of the tool relative to the image recording device; and when the tool violates limits of the currently captured region of the image recording device, then at least one of emitting a warning, shutting down, hard switching the first manipulator, or deactivating the tool.

23. The method of claim 22, wherein the first manipulator is a multi-axis articulated robot arm.

24. The method of claim 22, wherein the image recording device is guided by a second manipulator.

25. The method of claim 24, wherein the second manipulator is a multi-axis articulated robot arm.

26. The method of claim 24, wherein the currently captured region of the image recording device is determined using a manipulator controller of the second manipulator.

27. The method of claim 22, wherein the pose of the tool is determined using a manipulator controller of the first manipulator.

28. The method of claim 22, wherein the image recording device is an ultrasound probe or an endoscopic imaging device.

29. The method of claim 22, wherein determining the pose of the tool relative to the image recording device comprises determining the pose of only a part of the tool.

Description

BRIEF DESCRIPTION OF THE DRAWINGS

[0024] The present invention is described below with respect to one non-restrictive exemplary embodiment.

[0025] FIG. 1 shows in schematic form an inventive system for the robot-assisted treatment of a patient.

DETAILED DESCRIPTION

[0026] FIG. 1 shows in schematic form and by way of example an inventive system 1 for the robot-assisted treatment of a patient 50. It will be clear to those skilled in the art that the principles described herein may, of course, be applied to any type of teleoperation, in particular to teleoperations in an industrial environment. The system comprises a control unit 10, which comprises a computer 12 and a screen 11. The patient 50 lies on an operating table 53; in the drawing, the reference numeral 51 indicates a cross-sectional view through the neck of the patient 50. In the neck 51 there is a target point 52, for example a tumor or the like, that is to be examined or treated. The tool to be used in the treatment is a surgical instrument, in this case a biopsy needle 30. The biopsy needle 30 is guided by a first manipulator 31, which in the illustrated case is a multi-axis articulated robot arm 31. The articulated robot arm 31 is assigned a manipulator controller 32, which is connected to the computer 12 of the control unit 10, as indicated by the dashed arrows. The biopsy needle 30 is to be guided to the target point 52. In order to make it easier for the surgeon to guide the biopsy needle 30, or to make this possible at all, an image recording device in the form of an ultrasound probe 20 is used. The ultrasound probe 20 is guided by a second manipulator 21, which is likewise a multi-axis articulated robot arm and which is assigned a manipulator controller 22. The manipulator controller 22 and the ultrasound probe 20 are both connected to the control unit 10, as indicated by the dashed arrows.

[0027] The articulated robot arm 21 carries and moves the ultrasound probe 20. The ultrasound probe 20 is pressed by the articulated robot arm 21 against the body of the patient 50 in order to produce ultrasound images of the inside of the patient's body. The ultrasound images are transmitted to the control unit 10 or, more specifically, to the associated computer 12, processed in the computer 12, and then displayed on the screen 11. The reference numeral 24 indicates the currently captured region of the ultrasound probe 20, i.e., the image plane (sound plane) of the ultrasound probe. The image plane or sound plane of the probe is typically only a few millimeters thick, so the probe has to be aligned very precisely in order to deliver informative images. The alignment and the pressing of the probe are carried out by the manipulator or, more specifically, by the articulated robot arm 21, so that the surgeon is relieved of these tasks. To this end, it is advantageous for the articulated robot arm 21 to be provided with force sensors and to operate under closed-loop force control, so that the articulated robot arm presses the ultrasound probe 20 with a defined force against the skin surface of the patient 50.
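The closed-loop force control mentioned above can be illustrated by the following minimal sketch. It is not part of the disclosure; the target force, gain, and function name are hypothetical, and a real implementation would run inside the manipulator controller at a fixed cycle rate with filtering and safety limits.

```python
def force_control_step(f_measured, f_target=5.0, kp=0.0004):
    """One iteration of a simple proportional force controller.

    f_measured : contact force (N) sensed at the probe.
    f_target   : desired pressing force (N); a hypothetical value.
    kp         : proportional gain (m/N); a hypothetical value.

    Returns a small displacement command (m) along the probe axis:
    positive advances the probe toward the patient when the sensed
    force is below the target, negative retracts it when the probe
    presses too hard.
    """
    error = f_target - f_measured  # positive: pressing too lightly
    return kp * error              # displacement along probe axis
```

Called once per control cycle, such a loop keeps the contact force near the defined target regardless of small movements of the patient's skin surface.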

[0028] Since the pose of the ultrasound probe 20 is fixed by the current position of the manipulator, or can be calculated from it, and since the contour and the orientation of the captured region 24 are also known, it is possible to calculate precisely where the captured region 24 is located in space.
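The calculation described in this paragraph amounts to a frame transformation: the captured region is defined in the probe's coordinate frame, and the probe's pose in a common base frame is known from the manipulator controller. A minimal sketch follows; the function name, frame convention, and plane dimensions are illustrative assumptions, not part of the disclosure.

```python
import numpy as np

def plane_corners_in_base(T_base_probe, width=0.04, depth=0.08):
    """Corners of the planar captured region, expressed in the base frame.

    T_base_probe : 4x4 homogeneous pose of the probe, e.g. obtained
                   from the manipulator controller's forward kinematics.
    width, depth : extent of the sound plane in metres (hypothetical).
    """
    # Corners in the probe frame: the plane spans x (lateral) and
    # z (depth, away from the probe face); y is the thin direction.
    w, d = width / 2.0, depth
    corners_probe = np.array([
        [-w, 0.0, 0.0, 1.0],
        [ w, 0.0, 0.0, 1.0],
        [ w, 0.0, d,   1.0],
        [-w, 0.0, d,   1.0],
    ])
    # Transform all corners into the base frame in one multiplication.
    return (T_base_probe @ corners_probe.T).T[:, :3]
```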

[0029] In FIG. 1 the tip of the biopsy needle 30 is inside the currently captured region 24, so that the surgeon can track the movement of the tip through the body of the patient 50 on the screen 11 and can guide the biopsy needle 30 accordingly, in a target-oriented manner, to the target point 52. The position and orientation of the biopsy needle are known precisely from the position and pose of the manipulator 31, or can be accurately determined therefrom. Since the control unit 10 knows the respective position and pose of the two manipulators 21 and 31 from the two manipulator controllers 22 and 32, or can calculate them with the aid of the manipulator controllers, the control unit 10 can determine the pose of the biopsy needle 30 relative to the ultrasound probe 20 and thus also the relative pose of, for example, the tip of the biopsy needle 30 with respect to the currently captured region 24. This allows the control unit to determine whether the biopsy needle 30 has violated the limits of the captured region 24 of the ultrasound probe 20. If such a violation of the limits is determined, a corresponding warning can be emitted, for example on the screen 11, or the manipulator 31 can be hard switched.
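The limit check described above can be sketched as follows: the needle tip is transformed into the probe frame, where the captured region is an axis-aligned box only a few millimetres thick. All names, frame conventions, and dimensions here are hypothetical illustrations, not part of the disclosure.

```python
import numpy as np

def tip_violates_region(T_base_probe, p_tip_base,
                        width=0.04, depth=0.08, thickness=0.003):
    """Check whether the needle tip lies outside the captured region.

    T_base_probe : 4x4 homogeneous pose of the probe in the base frame,
                   from the second manipulator's controller.
    p_tip_base   : tip position (3-vector, metres) in the same base
                   frame, from the first manipulator's controller.
    width, depth, thickness : sound-plane dimensions (hypothetical).
    """
    # Express the tip in the probe frame: p_probe = T_base_probe^-1 * p_base
    T_probe_base = np.linalg.inv(T_base_probe)
    x, y, z = (T_probe_base @ np.append(p_tip_base, 1.0))[:3]
    # Inside means: within the lateral span, within the thin sound
    # plane, and within the imaging depth in front of the probe face.
    inside = (abs(x) <= width / 2.0
              and abs(y) <= thickness / 2.0
              and 0.0 <= z <= depth)
    return not inside
```

When the function returns true, the control unit could react as claimed, for example by emitting a warning on the screen, hard switching the first manipulator, or deactivating the tool.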

[0030] While the present invention has been illustrated by a description of various embodiments, and while these embodiments have been described in considerable detail, it is not intended to restrict or in any way limit the scope of the appended claims to such detail. The various features shown and described herein may be used alone or in any combination. Additional advantages and modifications will readily appear to those skilled in the art. The invention in its broader aspects is therefore not limited to the specific details, representative apparatus and method, and illustrative example shown and described. Accordingly, departures may be made from such details without departing from the spirit and scope of the general inventive concept.

LIST OF REFERENCE NUMERALS

[0031] 1 system
[0032] 10 control unit
[0033] 11 screen
[0034] 12 computer
[0035] 20 image recording device (ultrasound probe)
[0036] 21, 31 manipulators (articulated robot arms)
[0037] 22, 32 manipulator controllers
[0038] 24 currently captured region (sound plane)
[0039] 30 tool (biopsy needle)
[0040] 50 patient
[0041] 51 cross section through the neck
[0042] 52 target point
[0043] 53 operating table