GUIDING A ROBOTIC SURGICAL SYSTEM TO PERFORM A SURGICAL PROCEDURE
20230225810 · 2023-07-20
Inventors
CPC classification
A61B17/1615
HUMAN NECESSITIES
A61B2090/365
HUMAN NECESSITIES
A61B2034/305
HUMAN NECESSITIES
A61B34/20
HUMAN NECESSITIES
A61B2034/256
HUMAN NECESSITIES
A61B90/37
HUMAN NECESSITIES
A61B2034/107
HUMAN NECESSITIES
A61B2034/102
HUMAN NECESSITIES
A61B2034/105
HUMAN NECESSITIES
International classification
A61B90/00
HUMAN NECESSITIES
A61B17/16
HUMAN NECESSITIES
Abstract
A robotic surgical system may be used to perform a surgical procedure. Providing guidance for the robotic surgical system includes integrating a Point of View (PoV) surgical drill with a camera to capture a PoV image of a surgical area of a subject patient, and displaying an image of the surgical area based on a viewing angle of the PoV surgical drill, thereby enabling the surgeon to operate on the surgical area using the PoV surgical drill. The PoV surgical drill operates based on the surgeon's control of a guidance drill. The content of the images may change based on a change in the viewing angle of the PoV surgical drill.
Claims
1. A method of guiding a robotic endoscope, the method comprising: identifying a region of interest of an affected body part of a subject patient; creating distance-based rules for the region of interest for generating alerts during a surgical procedure; endoscope image referencing by displaying real images of the subject patient captured by the robotic endoscope along with augmented reality (AR) anatomical images of the region of interest; wherein after the surgical procedure including inserting the robotic endoscope into the subject patient has been performed, guiding the robotic endoscope based on the captured real images and the AR anatomical images.
2. The method of claim 1, wherein performing the endoscope image referencing includes displaying images taken by the robotic endoscope with the AR anatomical images.
3. The method of claim 2, wherein the distance-based rules are created based on at least one of the AR anatomical image, the real images from the robotic endoscope, or the image of the robotic endoscope.
4. The method of claim 1, wherein the images of the robotic endoscope are provided by an ultrasound device.
5. The method of claim 1, wherein a plurality of cameras is provided around a periphery of a head of the robotic endoscope, and wherein a field of view of the plurality of cameras includes a point of view (PoV) image region.
6. The method of claim 5, further comprising: displaying the PoV image region of the plurality of cameras with the AR anatomical images.
7. The method of claim 6, wherein the displaying of the field of view of the plurality of cameras includes: displaying the field of view of the plurality of cameras in a subwindow of the AR anatomical images.
8. The method of claim 6, wherein the displaying of the field of view of the plurality of cameras includes: automatically changing a size of a viewing window based on a distance of the robotic endoscope from one or more internal organs.
9. The method of claim 1, wherein the guiding of the robotic endoscope is based on at least one of a point of view (PoV) of the robotic endoscope, a location of the robotic endoscope, or a distance of the robotic endoscope from one or more internal organs.
10. The method of claim 1, wherein the distance-based rules include using a distance between one or more internal organs.
11. A non-transitory computer-readable medium having executable instructions stored thereon that, when executed, cause one or more processors to: receive an identification of a region of interest of an affected body part of a subject patient; create distance-based rules for the region of interest for generating alerts during a surgical procedure; perform an endoscope image referencing by displaying captured real images of the subject patient from the robotic endoscope with augmented reality (AR) anatomical images of the region of interest; wherein after the surgical procedure including inserting the robotic endoscope into the subject patient has been performed, guide the robotic endoscope based on the captured real images and the AR anatomical images.
12. A system for a robotic endoscope, the system comprising: a robotic endoscope; a measurement recognition module to receive an identification of a region of interest of an affected body part of a subject patient, wherein the measurement recognition module is configured to: create distance-based rules for the region of interest for generating alerts during a surgical procedure, and perform an endoscope image referencing by displaying captured real images of the subject patient from the robotic endoscope with augmented reality (AR) anatomical images of the region of interest; and an endoscope control system configured to guide the robotic endoscope based on the captured real images and the AR anatomical images.
13. The system of claim 12, wherein the measurement recognition module is further configured to: display images of the robotic endoscope with the AR anatomical images.
14. The system of claim 13, wherein the distance-based rules are created based on at least one of the AR anatomical image, the real images from the robotic endoscope, or the image of the robotic endoscope.
15. The system of claim 12, wherein a plurality of cameras is provided around a periphery of a head of the robotic endoscope, wherein a field of view of the plurality of cameras includes a point of view (PoV) image region.
16. The system of claim 15, wherein the measurement recognition module is further configured to: display the PoV image region of the plurality of cameras with the AR anatomical images.
17. The system of claim 16, wherein the displaying the field of view of the plurality of cameras includes at least one of: displaying the field of view of the plurality of cameras in a subwindow of the AR anatomical images; or automatically changing a size of a viewing window based on a distance of the robotic endoscope from one or more internal organs.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0007] The accompanying drawings illustrate various embodiments of systems, methods, and various other aspects of the disclosure. Any person with ordinary skill in the art will appreciate that the illustrated element boundaries (e.g., boxes, groups of boxes, or other shapes) in the figures represent one example of the boundaries. In some examples, one element may be designed as multiple elements, or multiple elements may be designed as one element. In some examples, an element shown as an internal component of one element may be implemented as an external component in another, and vice versa. Non-limiting and non-exhaustive descriptions are described with reference to the following drawings. The components in the figures are not necessarily to scale, emphasis instead being placed upon illustrating principles.
DETAILED DESCRIPTION
[0014] Some embodiments of this disclosure, illustrating all its features, will now be discussed in detail. The words “comprising,” “having,” “containing,” and “including,” and other forms thereof, are intended to be equivalent in meaning and be open ended in that an item or items following any one of these words is not meant to be an exhaustive listing of such item or items, or meant to be limited to only the listed item or items.
[0015] It must also be noted that, as used herein and in the appended claims, the singular forms “a,” “an,” and “the” include plural references unless the context clearly dictates otherwise. Although any systems and methods similar or equivalent to those described herein can be used in the practice or testing of embodiments of the present disclosure, the preferred systems and methods are now described.
[0016] Embodiments of the present disclosure will be described more fully hereinafter with reference to the accompanying drawings in which like numerals represent like elements throughout the several figures, and in which example embodiments are shown. Embodiments of the claims may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein. The examples set forth herein are non-limiting examples and are merely examples among other possible examples.
[0018] The image database 106 may store images of a subject patient, as well as images of previous patients who have undergone similar surgeries. The images may be captured using X-ray, ultrasound, and/or Magnetic Resonance Imaging (MRI). Further, the images may be present in raw form, as Three-Dimensional (3D) models, Augmented Reality (AR) images, Virtual Reality (VR) images, and Point of View (PoV) images. The position database 108 may store real-time position information of a PoV surgical drill 122 and that of a virtual drill that may be shown to a surgeon during surgery.
[0019] The communication network 104 may be a wired and/or a wireless network. The communication network 104, if wireless, may be implemented using communication techniques such as Visible Light Communication (VLC), Worldwide Interoperability for Microwave Access (WiMAX), Long Term Evolution (LTE™), Wireless Local Area Network (WLAN), Infrared (IR) communication, Public Switched Telephone Network (PSTN), radio waves, and other communication techniques known in the art.
[0020] The system 102 may further include a processor 110, interface(s) 112, and a memory 114. The processor 110 may execute an algorithm stored in the memory 114 for processing the PoV images and for guiding the robotic surgical system when performing a surgical procedure. The processor 110 may also be configured to decode and execute any instructions received from one or more other electronic devices or server(s).
[0021] In at least one embodiment, the processor 110 may include one or more general purpose processors (e.g., INTEL® or Advanced Micro Devices® (AMD) microprocessors) and/or one or more special purpose processors (e.g., digital signal processors or Xilinx System On Chip (SOC) Field Programmable Gate Array (FPGA) processor). The processor 110 may be configured to execute one or more computer-readable program instructions, such as program instructions to carry out any of the functions described in this description.
[0022] The interface(s) 112 may facilitate interaction between a surgeon and the system 102. The interface(s) 112 may accept an input from the surgeon or other user who is associated with an on-going surgery and/or provide an output to the surgeon or other user. The interface(s) 112 may be a Command Line Interface (CLI), a Graphical User Interface (GUI), or a voice interface.
[0023] The memory 114 may include, but is not limited to, fixed (hard) drives, magnetic tape, floppy diskettes, optical disks, Compact Disc Read-Only Memories (CD-ROMs), and magneto-optical disks; semiconductor memories such as ROMs, Random Access Memories (RAMs), Programmable Read-Only Memories (PROMs), Erasable PROMs (EPROMs), Electrically Erasable PROMs (EEPROMs), and flash memory; magnetic or optical cards; or other types of media/machine-readable media suitable for storing electronic instructions.
[0025] The user device 116 is shown as a tablet in the accompanying drawings.
[0028] In at least one embodiment, a drill bit may be placed in an opening 208 of a drill holder 210 of the PoV surgical drill 122. Once the drill bit is placed in the drill holder 210, a module 212 connected to the drill holder 210 may identify parameters of the PoV surgical drill 122, including a type of the surgical drill, a type of the drill bit, a size of the drill bit, and an absolute position of a tip of the drill bit in an XYZ coordinate system referencing the operating table 202.
[0029] In at least one embodiment, the module 212 may further comprise a surgical drill reader configured to read a serial number present on the drill bit. The serial number may be related to the PoV surgical drill 122 and/or the drill bit thereof. Serial numbers respectively corresponding to different drill bits and different categories of surgical drills may be stored in a memory corresponding to the module 212. The received serial number may be matched with the serial numbers stored in the memory to identify details related to the PoV surgical drill 122 and the drill bit. In at least one example, the surgical drill reader may be implemented as a Near Field Communication (NFC) reader, and an NFC-encoded chip may be attached to the drill bit. The NFC reader may therefore communicate with the NFC-encoded chip to receive the serial number of the drill bit.
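The serial-number matching described above can be sketched as a simple registry lookup. The registry contents, serial numbers, and field names below are illustrative assumptions, not details taken from the disclosure:

```python
from typing import Optional

# Hypothetical registry of drill-bit serial numbers; the entries,
# types, and sizes are illustrative only.
DRILL_BIT_REGISTRY = {
    "SN-4411": {"drill_type": "PoV surgical drill", "bit_type": "twist", "bit_size_mm": 2.0},
    "SN-4412": {"drill_type": "PoV surgical drill", "bit_type": "burr", "bit_size_mm": 3.5},
}


def identify_drill_bit(serial_number: str) -> Optional[dict]:
    """Match a serial number read over NFC against the stored registry,
    returning the drill/bit details, or None if the serial is unknown."""
    return DRILL_BIT_REGISTRY.get(serial_number)
```

A real implementation would populate the registry from the memory corresponding to the module 212 rather than hard-coding it.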
[0030] In at least one embodiment, the module 212 may identify and/or determine the drill bit being cradled, reference the position of the drill bit with the virtual grid 200, identify the surgical drill and the drill bit, convert the surgical drill identification to an associated virtual surgical icon, and convert the drill bit identification to an associated virtual surgical drill bit icon. The module 212 may further transmit the virtual surgical icon and the virtual drill bit icon, referenced to the XYZ coordinate system of the operating table 202, to an AR imaging system and to the reference holder system 126.
[0031] In at least one other embodiment, the reference holder system 126, shown and described with regard to
[0033] In at least one embodiment, images captured by any one or more of the cameras 308, 310, and 312 may be integrated to produce one composite PoV image using known image processing tools and techniques. In at least one example, the composite PoV image may be cropped in a circle and centered with regard to the drill bit 302 based on default settings stored by the surgeon. The cropped image may then be sent to the reference holder system 126.
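The circular crop described above might look like the following sketch, which masks out pixels beyond a given radius from the drill-bit center. The function name and the plain-list image representation are assumptions for illustration; an actual pipeline would use an image-processing library:

```python
import math


def circular_crop(image, cx, cy, radius):
    """Keep pixels within `radius` of center (cx, cy) and mask the rest to 0.
    `image` is a 2D list of pixel values (rows of columns)."""
    out = []
    for y, row in enumerate(image):
        out.append([
            px if math.hypot(x - cx, y - cy) <= radius else 0
            for x, px in enumerate(row)
        ])
    return out
```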
[0034] In at least one embodiment, while performing a surgical procedure on the subject patient, the surgeon may maneuver the guidance drill 120 to control the PoV surgical drill 122 based on the PoV images seen on the AR/VR display 118. As set forth above, the PoV images may be collected using one or more of the cameras 308, 310, and 312 positioned on the head of the PoV surgical drill 122. Thus, content of the PoV images may change based on the orientation and direction faced by the PoV surgical drill 122.
[0035] In at least one embodiment, the processor 110 may synchronize the position of the PoV surgical drill 122 with a reference linked with augmented images shown on the AR display device 118. Based on such synchronization, a Virtual Reality (VR) drill may be shown to the surgeon on the augmented images displayed using the AR display device 118. The VR drill may move based on changes in position of the PoV surgical drill 122, controlled by the surgeon controlling the guidance drill 120. Thus, such synchronization of the VR drill and the PoV surgical drill 122 provides a realistic experience to the surgeon. Further, operating room cameras may also be used for capturing images of the surgical procedure from a fixed angle, as set by the surgeon or based on a positioning of the operating room cameras. Such images may be stored in the image database 106, and may be displayed to the surgeon using an image display, e.g., AR display device 118.
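The synchronization described above amounts to mapping the PoV surgical drill's table-referenced position into the coordinate frame of the AR display so that the VR drill tracks it. The sketch below assumes, purely for illustration, that the two frames differ only by a fixed translation; a real system would apply a full rigid-body transform (rotation plus translation):

```python
from dataclasses import dataclass


@dataclass
class Pose:
    """A position (e.g., in millimeters) within some coordinate frame."""
    x: float
    y: float
    z: float


def sync_virtual_drill(physical_pose: Pose, table_to_display_offset: Pose) -> Pose:
    """Map the drill's position in the operating-table XYZ frame into the
    AR display's frame by applying a fixed translation offset, yielding
    the pose at which the VR drill should be rendered."""
    return Pose(
        physical_pose.x + table_to_display_offset.x,
        physical_pose.y + table_to_display_offset.y,
        physical_pose.z + table_to_display_offset.z,
    )
```

Re-running this mapping each time the PoV surgical drill moves keeps the VR drill's motion locked to the surgeon's control of the guidance drill.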
[0038] In an illustrative embodiment, any of the operations, processes, etc. described herein can be implemented as computer-readable instructions stored on a computer-readable medium. The computer-readable instructions can be executed by a processor of a mobile unit, a network element, and/or any other computing device.
[0039] There is little distinction left between hardware and software implementations of aspects of systems; the use of hardware or software is generally (but not always, in that in certain contexts the choice between hardware and software can become significant) a design choice representing cost vs. efficiency tradeoffs. There are various vehicles by which processes and/or systems and/or other technologies described herein can be effected (e.g., hardware, software, and/or firmware), and that the preferred vehicle will vary with the context in which the processes and/or systems and/or other technologies are deployed. For example, if an implementer determines that speed and accuracy are paramount, the implementer may opt for a mainly hardware and/or firmware vehicle; if flexibility is paramount, the implementer may opt for a mainly software implementation; or, yet again alternatively, the implementer may opt for some combination of hardware, software, and/or firmware.
[0040] The foregoing detailed description has set forth various embodiments of the devices and/or processes via the use of block diagrams, flowcharts, and/or examples. Insofar as such block diagrams, flowcharts, and/or examples contain one or more functions and/or operations, it will be understood by those within the art that each function and/or operation within such block diagrams, flowcharts, or examples can be implemented, individually and/or collectively, by a wide range of hardware, software, firmware, or virtually any combination thereof. In one embodiment, several portions of the subject matter described herein may be implemented via Application Specific Integrated Circuits (ASICs), Field Programmable Gate Arrays (FPGAs), digital signal processors (DSPs), or other integrated formats. However, those skilled in the art will recognize that some aspects of the embodiments disclosed herein, in whole or in part, can be equivalently implemented in integrated circuits, as one or more computer programs running on one or more computers (e.g., as one or more programs running on one or more computer systems), as one or more programs running on one or more processors (e.g., as one or more programs running on one or more microprocessors), as firmware, or as virtually any combination thereof, and that designing the circuitry and/or writing the code for the software and/or firmware would be well within the skill of one skilled in the art in light of this disclosure. In addition, those skilled in the art will appreciate that the mechanisms of the subject matter described herein are capable of being distributed as a program product in a variety of forms, and that an illustrative embodiment of the subject matter described herein applies regardless of the particular type of signal bearing medium used to actually carry out the distribution.
Examples of a signal bearing medium include, but are not limited to, the following: a recordable type medium such as a floppy disk, a hard disk drive, a CD, a DVD, a digital tape, a computer memory, etc.; and a transmission type medium such as a digital and/or an analog communication medium (e.g., a fiber optic cable, a waveguide, a wired communications link, a wireless communication link, etc.).
[0041] Those skilled in the art will recognize that it is common within the art to describe devices and/or processes in the fashion set forth herein, and thereafter use engineering practices to integrate such described devices and/or processes into data processing systems. That is, at least a portion of the devices and/or processes described herein can be integrated into a data processing system via a reasonable amount of experimentation. Those having skill in the art will recognize that a typical data processing system generally includes one or more of a system unit housing, a video display device, a memory such as volatile and non-volatile memory, processors such as microprocessors and digital signal processors, computational entities such as operating systems, drivers, graphical user interfaces, and applications programs, one or more interaction devices, such as a touch pad or screen, and/or control systems including feedback loops and control motors (e.g., feedback for sensing position and/or velocity; control motors for moving and/or adjusting components and/or quantities). A typical data processing system may be implemented utilizing any suitable commercially available components, such as those typically found in data computing/communication and/or network computing/communication systems.
[0042] The herein described subject matter sometimes shows different components contained within, or connected with, different other components. It is to be understood that such depicted architectures are merely examples, and that in fact many other architectures can be implemented which achieve the same functionality. In a conceptual sense, any arrangement of components to achieve the same functionality is effectively “associated” such that the desired functionality is achieved. Hence, any two components herein combined to achieve a particular functionality can be seen as “associated with” each other such that the desired functionality is achieved, irrespective of architectures or intermedial components. Likewise, any two components so associated can also be viewed as being “operably connected,” or “operably coupled,” to each other to achieve the desired functionality, and any two components capable of being so associated can also be viewed as being “operably couplable” to each other to achieve the desired functionality. Specific examples of operably couplable include, but are not limited to, physically mateable and/or physically interacting components, and/or wirelessly interactable and/or wirelessly interacting components, and/or logically interacting and/or logically interactable components.
[0043] From the foregoing, it will be appreciated that various embodiments of the present disclosure have been described herein for purposes of illustration, and that various modifications may be made without departing from the scope and spirit of the present disclosure. Accordingly, the various embodiments disclosed herein are not intended to be limiting.