SURGERY ROBOT SYSTEM AND USE METHOD THEREFOR
20220022985 · 2022-01-27
Inventors
CPC classification
A61B34/20
HUMAN NECESSITIES
A61B2034/2072
HUMAN NECESSITIES
A61B2090/366
HUMAN NECESSITIES
A61B90/10
HUMAN NECESSITIES
A61B2034/107
HUMAN NECESSITIES
G06T7/521
PHYSICS
A61B2090/064
HUMAN NECESSITIES
A61B2034/105
HUMAN NECESSITIES
International classification
A61B34/20
HUMAN NECESSITIES
G06T7/521
PHYSICS
Abstract
The present invention provides a surgical robot system which includes a workstation, a robotic arm, a scanning module, a guiding module and the like. Rapid registration may be completed by means of the scanning module. The system improves the speed and accuracy of registration and shortens the duration of the operation.
Claims
1. A surgical robot system, comprising: a workstation, comprising a housing, a computation and control center, a display apparatus and an input device; a robotic arm, comprising a plurality of arm segments which are connected by joints; a scanning module, configured to collect information for a target space; and a guiding module, configured to guide a surgical instrument to move in a desired trajectory, wherein information collected by the scanning module is processed by the workstation to acquire three-dimensional information of the target space.
2. The system according to claim 1, wherein the scanning module comprises an image acquiring apparatus.
3. The system according to claim 1, wherein the scanning module comprises a light emitting component and an image acquiring apparatus.
4. The system according to claim 1, wherein the scanning module comprises a projecting component and an image acquiring apparatus, wherein the projecting component may emit a specific coded image to the target space and the image acquiring apparatus collects the image, whereby an accurate three-dimensional structure of the target space is acquired by a corresponding decoding algorithm.
5. The system according to claim 4, wherein the projecting component may also project an image to the target space; and/or the projecting component and the image acquiring apparatus have a predetermined relative spatial position relationship; and/or the projecting component comprises a light source, a lens group, a digital micromirror device and a control module.
6. The system according to claim 1, wherein a position of the scanning module in a coordinate system of the robotic arm is determined by the robotic arm.
7. The system according to claim 1, wherein the system further comprises a position tracking module, wherein the position tracking module is configured to track a position of the scanning module and the position tracking module is an image acquiring apparatus; and/or the position tracking module is an optical tracking apparatus; and/or the position tracking module is an electromagnetic tracking apparatus.
8. The system according to claim 1, wherein the scanning module is connected in a detachable manner to the robotic arm through a flange or is independent, and/or the scanning module is integrated in the robotic arm.
9. The system according to claim 1, wherein a force applied to the robotic arm can be calculated from a current of a motor, or at least one force sensor is provided; preferably, each joint of the robotic arm is provided with a force sensor.
10. The system according to claim 1, wherein the robotic arm has at least 6 degrees of freedom; preferably, the robotic arm has 6, 7, 8, 9, or 10 degrees of freedom.
11. A method for using the surgical robot system according to claim 1, comprising the following steps: a) using the surgical robot system to receive image data for visualized display, and making a surgical plan in the three-dimensional model; b) using the scanning module to scan a target space, generating a three-dimensional structure from the scanned data via the workstation, and registering it with the image acquired in step a); and c) mounting the guiding module at an end of the robotic arm and executing the predetermined surgical plan.
12. The system according to claim 2, wherein a position of the scanning module in a coordinate system of the robotic arm is determined by the robotic arm.
13. The system according to claim 3, wherein a position of the scanning module in a coordinate system of the robotic arm is determined by the robotic arm.
14. The system according to claim 4, wherein a position of the scanning module in a coordinate system of the robotic arm is determined by the robotic arm.
15. The system according to claim 2, wherein the system further comprises a position tracking module, wherein the position tracking module is configured to track a position of the scanning module and the position tracking module is an image acquiring apparatus; and/or the position tracking module is an optical tracking apparatus; and/or the position tracking module is an electromagnetic tracking apparatus.
16. The system according to claim 3, wherein the system further comprises a position tracking module, wherein the position tracking module is configured to track a position of the scanning module and the position tracking module is an image acquiring apparatus; and/or the position tracking module is an optical tracking apparatus; and/or the position tracking module is an electromagnetic tracking apparatus.
17. The system according to claim 4, wherein the system further comprises a position tracking module, wherein the position tracking module is configured to track a position of the scanning module and the position tracking module is an image acquiring apparatus; and/or the position tracking module is an optical tracking apparatus; and/or the position tracking module is an electromagnetic tracking apparatus.
18. The system according to claim 2, wherein the scanning module is connected in a detachable manner to the robotic arm through a flange or is independent, and/or the scanning module is integrated in the robotic arm.
19. The system according to claim 3, wherein the scanning module is connected in a detachable manner to the robotic arm through a flange or is independent, and/or the scanning module is integrated in the robotic arm.
20. The system according to claim 4, wherein the scanning module is connected in a detachable manner to the robotic arm through a flange or is independent, and/or the scanning module is integrated in the robotic arm.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0032] In order to more clearly illustrate the technical solutions of the specific embodiments of the present invention or the prior art, a brief introduction may be given hereinafter to the accompanying drawings that may be used in the description of the specific embodiments or the prior art. Obviously, the accompanying drawings in the description below are used for illustrating some embodiments of the present invention, and those of ordinary skill in the art may still derive other drawings from these accompanying drawings without creative efforts.
REFERENCE MARKS
[0044] 100—workstation; 101—housing; 102—computation and control center; 103—display apparatus; 104—input device; 1011—wheel; 1012—fixing apparatus; 200—robotic arm; 300—scanning module; 400—guiding module; 500—tracking module; 201—base; 202—first joint; 203—first arm segment; 204—second joint; 205—second arm segment; 206—third joint; 207—third arm segment; 208—fourth joint; 209—fourth arm segment; 210—fifth joint; 211—fifth arm segment; 212—sixth joint; 213—sixth arm segment; 214—seventh joint; 215—seventh arm segment (end of robotic arm); 301—projecting component; 302—image acquiring apparatus; 303—traceable structure; 3031—traceable marker (spherical marker or corner point); 501—camera or projecting component; 502—camera; 503—infrared transmitting apparatus.
DETAILED DESCRIPTION
[0045] In order to make the objectives, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions of the present invention will be clearly and completely described below in conjunction with the accompanying drawings. Obviously, the described embodiments are only a part of the embodiments of the present invention, but not all of them. Based on the embodiments of the present invention, all other embodiments derived by those of ordinary skill in the art without creative efforts shall fall within the protection scope of the present invention.
[0046] To facilitate understanding of the embodiments, the surgical robot system disclosed in the present invention is first introduced in detail.
[0047] With reference to
[0048] the workstation 100 includes a housing 101, a computation and control center 102, a display apparatus 103, and an input device 104; the computation and control center 102 communicates with the display apparatus 103, the input device 104 and other medical devices, for example, a magnetic resonance imaging (MRI) device or an X-ray computed tomography (CT) device, or a database. The display apparatus 103 is configured to display a three-dimensional image and a software control interface generated by the computation and control center 102. There may be more than one display apparatus, and the display apparatus may also be another existing device, for example, an LCD display, a notebook computer, a tablet computer, a smart phone or the like. In one embodiment, a touch screen with both display and input functions may be used; in another embodiment, the display apparatus 103 may be glasses with a projection display function or a helmet with a projection screen, for the convenience of the user. The input device 104 is any input accessory, for example, a foot switch, a touch pad, a touch pen, a touch screen, a joystick, a trackball, a wireless mouse, a mouse, a keyboard, a voice input port or a combination thereof, which allows the user to input a command to the computation and control center 102; the input device 104 may be omitted when the display apparatus 103 has an input function. The housing 101 has a wheel, a fixing apparatus and a handle, which ensures that the user can easily move the workstation 100; the housing 101 may also have a connecting apparatus which connects the workstation 100 to an operating table, a head frame or the like in a fixed manner. The robotic arm 200 is any robotic arm having at least 6 degrees of freedom, for example, a robotic arm having 7, 8, 9 or 10 degrees of freedom, and an end of the robotic arm is connected to the workstation 100 in a fixed manner.
[0049] The scanning module 300 may have a plurality of structures. In a first solution, the scanning module includes only an image acquiring apparatus, for example, a binocular camera; in a second solution, the scanning module includes a light emitting component and an image acquiring apparatus, wherein the light emitting component emits light to the target space, the image acquiring apparatus collects images, and, after sufficient data is collected, the computation and control center calibrates the coordinates of the target space based on the acquired information; and in a third solution, the scanning module includes a projecting component and an image acquiring apparatus, which have a predetermined relative spatial position relationship. The projecting component may not only emit a specific coded image but also project an ordinary image onto the target space. For example, important physiological information of the patient, such as heartbeat, blood pressure and blood type, may be projected onto the surface of the patient's skin to display the information in a non-contact and safe manner, and distortion correction may also be performed. The scanning module 300 may be independent, connected to the robotic arm 200 in a detachable manner, or integrated in the robotic arm 200.
[0050] A guiding module 400 may be connected to the end 215 of the robotic arm via a flange. The guiding module 400 includes a through hole through which another surgical instrument, such as a guide wire, a drill bit or an electrode, may be guided and positioned; once the guiding module has moved to a specified position and stays there, the surgical instrument reaches the specified position via the through hole of the guiding module according to the pre-planned path and length.
[0051] The scanning module 300 is connected to the robotic arm 200. Since the relative position of the scanning module 300 and the robotic arm 200 is determined, the spatial position of the scanning module may be determined via the robotic arm 200. The scanning module acquires scanning data and transmits the data to the workstation 100 for processing. After a three-dimensional structure is established, registration is performed, and the subsequent operation is then carried out by the guiding module 400.
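The registration step aligns the intraoperative scan with the preoperative image model. The patent does not specify the algorithm; a minimal sketch, assuming point correspondences between the two surfaces are already known (as the inner step of an ICP-style loop), is the Kabsch/SVD solution for the least-squares rigid transform:

```python
# Minimal sketch of rigid registration via the Kabsch (SVD) algorithm.
# Assumes corresponding 3-D point pairs are given; correspondence
# search (as in full ICP) is outside this sketch.
import numpy as np

def rigid_register(src: np.ndarray, dst: np.ndarray):
    """Return (R, t) such that R @ p + t best maps points of src onto dst.
    src, dst: (N, 3) arrays of corresponding points."""
    c_src, c_dst = src.mean(axis=0), dst.mean(axis=0)
    H = (src - c_src).T @ (dst - c_dst)        # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))     # guard against reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = c_dst - R @ c_src
    return R, t
```

The resulting (R, t) maps scan coordinates into image-model coordinates, after which the planned trajectory can be expressed in the robotic arm's frame.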
[0052] With reference to
[0056] The tracking module 500 may be implemented by different devices, as long as it can track the spatial position of the scanning module 300. For example, in a first situation, the position tracking module is a camera with a tracking ability, for example, a binocular camera, and the scanning module includes corner points arranged in a special structure or self-luminous markers. According to the principle of binocular imaging, the position of the tracked scanning module is acquired, and the spatial position of the acquired image information may then be determined from the position of the scanning module. In a second situation, the position tracking module is an optical tracking apparatus, which usually includes a light-traceable marker, a camera unit and a light emitting unit; preferably, the light is infrared. The marker is fixed to the scanning module, so that the position of the scanning module may be monitored in real time by the optical tracking apparatus; the marker may take a plurality of forms, for example, a ball. In a third situation, the position tracking module is an electromagnetic tracking apparatus, which determines the position of an electromagnetic marker by the influence of the marker on the magnetic field; by fixing the electromagnetic marker to the scanning module, the spatial position of the scanning module may be determined. The tracking module 500 has a determined position relationship with the workstation 100 or the robotic arm 200.
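For the first situation, the "principle of binocular imaging" amounts to triangulation: a marker seen at slightly different pixel positions in the two cameras yields depth from the disparity. A sketch with a rectified stereo pair is below; the focal length, baseline and principal point are illustrative calibration values, not taken from the patent.

```python
# Illustrative stereo triangulation of a tracked marker.  Assumes a
# rectified binocular camera: epipolar lines are horizontal, so a
# marker appears on the same row in both images.  f (pixels),
# baseline b (mm) and principal point (cx, cy) are assumed values.
import numpy as np

def triangulate(uL, vL, uR, f=800.0, b=120.0, cx=640.0, cy=360.0):
    """Return the (x, y, z) of a marker in the left-camera frame (mm),
    from its pixel coordinates in the left and right images."""
    disparity = uL - uR       # horizontal offset between the two views
    z = f * b / disparity     # depth from disparity
    x = (uL - cx) * z / f
    y = (vL - cy) * z / f
    return np.array([x, y, z])
```

Triangulating several such markers in a known arrangement on the scanning module gives the module's full 6-DOF pose, which is what the tracking module 500 reports.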
Embodiment 1
[0057] With reference to
Embodiment 2
[0058] With reference to
Embodiment 3
[0059] With reference to
Embodiment 4
[0060] With reference to
Embodiment 5
[0061] With reference to
Embodiment 6
[0062] With reference to
Embodiment 7
[0063] With reference to
[0064] The surgical robot system of the invention may be used in a variety of surgical situations and supports different methods of use. Only some examples are shown below.
Embodiment 8
[0065] An embodiment of the method for using the surgical robot system according to Embodiment 1 comprises the following steps:
[0066] A) receiving medical image data, such as magnetic resonance image data, functional magnetic resonance image data, CT image data, phase contrast magnetic resonance angiography (PC-MRA) data and the like, by the workstation 100 of the surgical robot system through the interface, preferably unifying the formats of the image data, and then constructing a three-dimensional model of the target space with software, "Neurosurgery Robot Planning Software", pre-loaded in the workstation 100, wherein blood vessels are displayed in the three-dimensional model, and the user plans a surgical solution according to a planning guide provided by the software and determines the path of the surgical instrument;
[0067] B) fixing the workstation 100 to an appropriate position; sending, by a user, an instruction through the input device 104, for example, a mouse or a keyboard, so that the robotic arm 200 controls the scanning module 300 connected to the flange; projecting structured light onto a target space by the projecting component; collecting the image by the camera and calculating a three-dimensional structure of the target space by decoding the coded images, wherein, preferably, the software of the workstation 100 may control the robotic arm 200 to adjust the position based on the range of the collected images, collecting data a plurality of times until a three-dimensional structure which meets the requirements is acquired; and then registering the three-dimensional structure of the surgical area with the three-dimensional model in step A); and
[0068] C) performing the surgical operation after the registration is completed. In an operation in which a deep electrode is placed, the following steps are performed: replacing the scanning module 300 with the guiding module 400; sending an instruction to the robotic arm 200 by the workstation 100 according to the surgical plan made in step A), whereupon the robotic arm 200 moves to a specified position; determining a direction and a position for the drill bit of the surgical drill by the user via the guiding module 400; mounting surgical accessories, such as a limiting stopper, based on the parameters provided by the Neurosurgery Robot Planning Software; making a hole in the surgical site, for example, the head; and then replacing the drill bit with another surgical instrument, such as a guide wire or an electrode, which advances along a channel of the orienting device to reach the specified position. In a multi-step operation, the robotic arm may be dragged to each desired position in accordance with the pre-determined operation plan, and the process is repeated until all planned steps are completed.
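In step C), the workstation must convert the planned path (an entry point and a target point from step A)) into a pose for the guiding module, so that the through hole's axis lies along the trajectory. How the real workstation parameterizes poses is not described in the patent; the following is a hedged sketch in which the tool z-axis is aligned with the entry-to-target line and the remaining in-plane orientation is chosen arbitrarily.

```python
# Hypothetical sketch: build a 4x4 homogeneous pose for the guiding
# module from a planned entry point and target point.  The z-axis of
# the pose points from entry toward target; the x/y axes complete an
# arbitrary orthonormal frame (the guide hole is symmetric about z).
import numpy as np

def guide_pose(entry: np.ndarray, target: np.ndarray) -> np.ndarray:
    """Return a 4x4 pose with origin at the entry point and z-axis
    pointing along the entry-to-target trajectory."""
    z = target - entry
    z = z / np.linalg.norm(z)
    # Pick any reference vector not parallel to z to complete the frame.
    ref = np.array([1.0, 0.0, 0.0]) if abs(z[0]) < 0.9 else np.array([0.0, 1.0, 0.0])
    x = np.cross(ref, z)
    x /= np.linalg.norm(x)
    y = np.cross(z, x)
    T = np.eye(4)
    T[:3, 0], T[:3, 1], T[:3, 2], T[:3, 3] = x, y, z, entry
    return T
```

The planned insertion length is then simply the entry-to-target distance, which is what a limiting stopper on the instrument would be set to.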
Embodiment 9
[0069] With reference to
Embodiment 10
[0070] An example of the method for using the surgical robot system in Embodiment 9 is basically the same as that in Embodiment 7. The difference is that the scanning module is integrated in an end of the robotic arm; as a result, the guiding apparatus is connected directly to the robotic arm via the flange after the registration, without occupying the flange beforehand.
[0071] In the description of the embodiments of the present invention, unless otherwise explicitly defined or limited, the terms “mounted”, “connected with”, and “connected to” should be interpreted broadly. For example, they may refer to a fixed connection, detachable connection or integrated connection, or may be a mechanical connection or electrical connection, or may refer to a direct connection or an indirect connection via an intermediary, or may be an internal communication of two elements. For those of ordinary skill in the art, the specific meanings of the above-mentioned terms in the present invention may be understood according to specific situations.
[0072] Finally, it should be noted that the above-mentioned embodiments are only specific implementations of the present invention, which are used to illustrate the technical solutions of the present invention and shall not be construed as a limitation. The protection scope of the present invention is not limited thereto. Although a detailed description of the present invention has been made with reference to the foregoing embodiments, those of ordinary skill in the art should understand that any person skilled in the art may still modify the technical solutions described in the foregoing embodiments, easily conceive of changes, or make equivalent substitutions for some of the technical features within the technical scope disclosed in the present invention; these modifications, changes or substitutions do not cause the nature of the corresponding technical solutions to depart from the spirit and scope of the technical solutions of the embodiments of the present invention, and shall fall within the protection scope of the present invention. Therefore, the protection scope of the present invention should be subject to the protection scope of the claims.