AUGMENTED REALITY SOFT TISSUE BIOPSY AND SURGERY SYSTEM
20230120638 · 2023-04-20
Assignee
Inventors
CPC classification
A61B2090/365
HUMAN NECESSITIES
A61B90/37
HUMAN NECESSITIES
A61B2090/397
HUMAN NECESSITIES
H04N13/239
ELECTRICITY
A61B90/36
HUMAN NECESSITIES
A61B2090/3966
HUMAN NECESSITIES
A61B90/39
HUMAN NECESSITIES
A61B2090/3945
HUMAN NECESSITIES
International classification
A61B90/00
HUMAN NECESSITIES
G06T19/00
PHYSICS
H04N13/239
ELECTRICITY
Abstract
The combination of a marker system with binocular AR glasses creates a soft tissue procedure system used for surgery, biopsies, etc. A marker localization system is functionally integrated with augmented reality (AR) glasses. This soft tissue system is applicable to AR surgeries in soft tissues involving mobile and static anatomies anywhere in the body, such as, for example, breast, lung, lymph node and liver surgeries. By placing single or multiple markers and localizing them with a locator, such as a hand-held locator as a non-limiting example, the system provides a real-time intraoperative coordinate frame for the virtual projection of the markers (and the associated region of interest, ROI) on or in the surgical/biopsy field of view, using commercial off-the-shelf computer hardware (laptops, tablets, etc.) for the required image processing.
Claims
1. An augmented reality (AR) system for use in soft tissue procedures, the system comprising: a marker localization system comprising: one or more smart markers placed in a soft tissue region of interest (ROI) of a subject prior to the procedure, wherein each of the one or more smart markers is configured to respond to a signal; and a locator configured to be moved over a tissue surface of the soft tissue ROI to detect the one or more smart markers and to designate at least one reference point; binocular AR glasses for collecting and displaying data related to the ROI during the soft tissue procedure; at least one processing unit; and at least one set of optically contrast visual elements.
2. The AR system of claim 1, wherein the AR system is configured to operate in one of at least two modes of operation including an augmented reality mode and an ordinary vision mode, whereby in the ordinary vision mode, a user can operate with normal vision through the binocular AR glasses.
3. The AR system of claim 1, wherein the locator is further configured to determine depth measurements of the one or more smart markers and transfer the depth measurements in real-time to the processing unit.
4. The AR system of claim 3, wherein the one or more smart markers comprise ferrite-free signal-emitting transponders with collective IDs, wherein the locator determines the depth measurements of the one or more smart markers by processing a phase shift between signals received from the smart markers and a reference signal.
5. The AR system of claim 1, wherein the binocular AR glasses comprise an assembly of virtual reality (VR) glasses and high-resolution binocular video cameras, wherein both the VR glasses and the high-resolution binocular video cameras interface with the processing unit in real-time.
6. The AR system of claim 5, wherein the processing unit is configured to simultaneously receive marker depth data from the locator and video data from the binocular AR glasses and in response, to visibly present the region of interest within the binocular AR glasses.
7. The AR system of claim 6, wherein the processing unit is integrated within the marker localization system and runs a software program.
8. The AR system of claim 6, wherein the processing unit is a computer that is separate from the marker localization system.
9. The AR system of claim 1, wherein the at least one set of optically contrast visual elements creates contrast visual points on the locator, with a first portion of the at least one set of optically contrast elements being associated with the upper-end tip of the locator and a second portion of the at least one set of optically contrast elements being associated with a mid-point of the locator, the at least one set of optically contrast visual elements allowing for determination of the coordinates of the tip of the locator.
10. The AR system of claim 9, wherein the at least one set of optically contrast visual elements are at least two LEDs, wherein the binocular AR glasses can receive signals from the at least two LEDs and determine the coordinates of the tip of the locator.
11. The AR system of claim 10, wherein the at least two LEDs are removable sealed clips.
12. The AR system of claim 9, wherein the locator is one of a group of devices including: a biopsy needle, a surgical instrument, an ablation needle, and a radiation device.
13. The AR system of claim 1, wherein the processing unit is configured to operate in at least two major modes of operation: a first mode that includes pre-procedure, pre-biopsy and pre-cutting operation, and a second mode that is an active session that includes performing a procedure or biopsy and providing surgery assistance.
14. The AR system of claim 13, wherein the processing unit operating in the first mode is configured to receive depth measurements of the one or more smart markers in real-time from the locator and simultaneously collect and process video data received from the binocular system.
15. The AR system of claim 13, wherein the processing unit operating in the second mode is configured to provide marker tracking without the use of the locator, using only physical reference points.
16. The AR system of claim 15, wherein the processing unit is configured to create a virtual object of the ROI based on 3D coordinates of the one or more smart markers and reference points received from the locator and AR binocular glasses and present the virtual object in real-time.
17. The AR system of claim 16, wherein the AR binocular glasses are configured to present within an augmented scene the real-time position of the working edge of the locator relative to the virtual object.
18. The AR system of claim 15, wherein the processing unit is configured to create a virtual object of the ROI based on the 3D coordinates of a single marker received from the locator and present the virtual object in real-time.
19. The AR system of claim 15, wherein the processing unit is configured to prepare an augmented reality video stream for the AR binocular glasses.
20. The AR system of claim 15, wherein the processing unit receives and processes the data of the video stream from the contrast points located on the locator.
21. The AR system of claim 15, wherein the processing unit is further configured to use the video stream to calibrate the locator by using at least three contrast visual elements.
22. The AR system of claim 1, configured to project a virtual, variable density “arrow field”, wherein a spatial extent and a density of the virtual, variable density “arrow field” is inversely proportional to an estimated distance between a tip of the locator and a currently selected smart marker of the one or more smart markers, wherein an arrow direction of the virtual, variable density “arrow field” corresponds to an algorithmically estimated best direction of search for the currently selected marker.
Description
BRIEF DESCRIPTION OF THE SEVERAL VIEWS OF THE DRAWING
DETAILED DESCRIPTION OF VARIOUS EMBODIMENTS
[0058] The present invention, as well as features and aspects thereof, is directed towards providing a marker localization system with binocular augmented reality (AR) glasses that are communicatively coupled to a server that directs a procedure.
[0059] The disclosed AR system, which may be utilized in surgical settings, biopsy settings or other procedures, comprises a marker localization system with binocular AR glasses interconnected, by wire or wirelessly, with at least one master computer or processing unit (server, tablet, laptop, etc.). The computer can be the computer of the smart-marker localization system, which can run additional appropriate software for control of the AR processes; a computer independent of the AR system (a sole AR computer); or combined AR and marker localization system computers with appropriate interfaces for automatically transferring the marker depth data in real-time. The term AR computer can also be referred to as the AR master computer. Throughout the disclosure, the general term 'computer' will be understood to refer to either one, or a combination, of the above computer systems.
[0060] The AR glasses can comprise an assembly of VR glasses and high-resolution binocular video cameras, wherein both the binocular system and the VR glasses have appropriate interfaces with the computer(s) allowing real-time operations. The system also includes optically contrasted objects/signs/icons. The computer can provide: the distance measurements from the currently selected marker to a locator (which may be handheld); the marker's electronic ID, if available; and inter-marker distances; as well as perform all required AR calculations. Simultaneously, the AR glasses provide the visual information used for geometry measurements, geometry transformation and composite image creation. The reader is referred to U.S. Provisional Patent Application No. 63/218,973, filed on Jul. 7, 2021, now U.S. patent application Ser. No. 17/841,663, filed on Jun. 16, 2022 (incorporated herein above by reference), for a detailed description of an exemplary marker system that is suitable for various embodiments of the AR surgical system. It should be appreciated by those skilled in the relevant art that markers and locators of designs other than what is described in U.S. Ser. No. 17/841,663 may also be used to construct a virtual rendering of the ROI. Generally, the marker and locator design should be such that they provide marker depth/distance measurements from the locator tip to the marker. For example, Cianna Medical Savi Scout, Health Beacons LOCalizer, Magseed, Elucent, and Molli markers and their appropriate locators can be used to generate the AR rendering of the ROI by, for example, attaching appropriate LEDs to their locators (the locator in the case of Elucent being a surgical instrument). Grid/chess board references can also be used for this purpose.
[0061] Embodiments of the AR system utilize markers that are transponders, although in some embodiments markers that are signal reflectors or signal activated may also be used. Further, the transponder embodiment of the AR system operates by processing the phase shift between the signals received from the markers and the reference signal, rather than just the wave amplitude. In some embodiments, smart markers with collective IDs may be utilized. Some embodiments of the AR surgical system utilize a pair of commonly available glasses and smart markers of various companies, each covered by patents, combined with a control system to provide a unique AR surgical system. The various embodiments provide a solution for use in soft tissue surgery or procedures, especially in mobile tissue environments, such as breast surgery, where standard augmented reality image fusion is difficult. The various embodiments likewise provide a solution for image-guided soft tissue biopsy, especially using ultrasound, in cases where the tumor has completely or almost completely disappeared after the use of systemic therapy given prior to planned surgery. In such instances, there is a need to biopsy the tissue area previously occupied by the tumor, which is no longer visible, in order to determine whether microscopic remnants of tumor are still present.
[0064] The coordinate system XOZ has its origin at the middle between the left sensor matrix (2-1) and the right sensor matrix (2-2). The rays from the point P (X, Y, Z) pass through the left lens (2-3) and the right lens (2-4) to create images at the left matrix (n_L, m_L) pixels and at the right matrix (n_R, m_R) pixels. Both matrices are identical and have a vertical number of pixels M and a horizontal number of pixels N. In linear units of measure, the matrix resolutions are δ_x and δ_y respectively. The local origin of each matrix is at its center, and the axes of the local coordinate system of each one are:
Vertical: m=−M/2, . . . 0, . . . M/2, and
Horizontal: n=−N/2, . . . 0, . . . N/2.
[0065] The coordinates of point P can be calculated according to the following formulas:
[0066] This binocular system, for example, has a base D=70 mm, focal lengths F=10 mm, sensor resolution of 20 MP (mega-pixels) and pixel size of 1.5 μm and provides an accuracy of X, Y, Z coordinate estimation of about 0.1 mm, at a distance range of Z=[300 to 500] mm. This range satisfies the distance from the eyes of the surgeon to the operation field for most surgeries including breast surgeries.
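For illustration only, the triangulation implied by the binocular geometry above can be sketched as follows. This is a non-limiting sketch assuming ideal pinhole optics and the nominal parameters quoted in the text (baseline D = 70 mm, focal length F = 10 mm, 1.5 μm pixel pitch); the function names and the test coordinates are illustrative assumptions, not the patented formulas themselves.

```python
# Nominal parameters from the text: baseline D = 70 mm, focal length
# F = 10 mm, pixel pitch 1.5 um (0.0015 mm).
D = 70.0        # baseline between the two sensor centres, mm
F = 10.0        # focal length of each lens, mm
DELTA = 0.0015  # pixel pitch, mm

def project(X, Y, Z):
    """Project world point P = (X, Y, Z) onto the left and right sensors.

    The origin is midway between the sensors; returns continuous
    (un-quantized) pixel coordinates (n_L, m_L), (n_R, m_R)."""
    n_L = F * (X + D / 2) / (Z * DELTA)
    n_R = F * (X - D / 2) / (Z * DELTA)
    m = F * Y / (Z * DELTA)          # same vertical coordinate on both
    return (n_L, m), (n_R, m)

def triangulate(n_L, n_R, m):
    """Recover P = (X, Y, Z) from the pixel coordinates on both sensors."""
    disparity = n_L - n_R            # horizontal disparity, pixels
    Z = F * D / (disparity * DELTA)
    X = Z * DELTA * (n_L + n_R) / (2 * F)
    Y = Z * DELTA * m / F
    return X, Y, Z

# Round-trip check at a surgical working distance (Z = 400 mm).
(pL, pR) = project(25.0, -10.0, 400.0)
X, Y, Z = triangulate(pL[0], pR[0], pL[1])
```

With this geometry, one whole pixel of disparity corresponds to roughly Z²·δ/(F·D) ≈ 0.34 mm of depth at Z = 400 mm, so the quoted ~0.1 mm accuracy presumes sub-pixel image-point estimation.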
[0067] As described above, the AR process requires the surgeon to utilize the locator in the creation of the virtual image. For example, the surgeon can hold the locator and glide its tip across the tissue surface, for example, the patient's breast. The computer system provides an accurate measurement of the distance between the tip of the locator and the marker-under-test. The tip of the locator can be (visually) obscured by the hands of the surgeon, resulting in its coordinates not being optically measurable directly by the AR system. This situation can require an estimation of the coordinates of the locator's tip.
[0068] Estimation of the coordinates of the locator's tip is illustrated in the accompanying figure.
[0069] The solution is a point with maximum value of Z (most remote point from the AR origin). Due to the outside diameter of the locator body, a number of LEDs can be used around the body at point P2 to ensure an optically accessible (unobscured) visual point and allow for correct calculation(s).
[0070] The 3D estimation of the locator's tip position is performed according to the following algorithm:
Calculate the vector V = P2 − P1; [F7]
[0071] Calculate the norm of the vector V:
Norm(V) = √((X2 − X1)² + (Y2 − Y1)² + (Z2 − Z1)²); [F8]
Calculate the unit vector V_UNIT = V / Norm(V); [F9]
Find the required point P3(X, Y, Z) = P1 + L·V_UNIT. [F10]
[0072] The result of this algorithm can be verified:
√(Σi=1..3 (P3(αi) − P1(αi))²) = L, where X = α1, Y = α2, Z = α3. [F11]
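The tip-estimation algorithm of formulas F7 to F10, together with the verification of F11, can be sketched as follows; this is a non-limiting sketch, with the LED positions and the length L chosen purely for illustration.

```python
import math

def estimate_tip(p1, p2, length):
    """Estimate the locator tip P3 from two LED positions (formulas F7-F10).

    p1:     upper-end LED position (X1, Y1, Z1)
    p2:     mid-point LED position (X2, Y2, Z2)
    length: known distance L from P1 to the tip along the locator axis
    """
    v = [b - a for a, b in zip(p1, p2)]                  # F7:  V = P2 - P1
    norm = math.sqrt(sum(c * c for c in v))              # F8:  Norm(V)
    v_unit = [c / norm for c in v]                       # F9:  unit vector
    return [a + length * u for a, u in zip(p1, v_unit)]  # F10: P3 = P1 + L*V_UNIT

# Verification (F11): the estimated tip must lie exactly L away from P1.
p1, p2, L = (0.0, 0.0, 300.0), (10.0, 0.0, 320.0), 120.0
p3 = estimate_tip(p1, p2, L)
dist = math.sqrt(sum((b - a) ** 2 for a, b in zip(p1, p3)))
```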
[0073] This algorithm provides an accuracy of 3D estimation of locator tip position to better than +/−0.2 mm.
[0074] The accumulation of marker localization data permits determination of the distances between the implanted markers and between the last localized marker and the tip of the locator. The combination of all these distances in the computer memory can be interpreted as a 3D pattern of 3 to 5 points, for example a pyramid as depicted in the accompanying figure.
[0076] By performing the above-described process, the surgeon, after having set up a first marker pyramid, is free to operate on the markers and the ROI which they define with free hands and real-time vision. During the course of a surgery, for example with tissue mobilized and moved, the surgeon can at any time reestablish a new pyramid by relocating the markers and establishing a new ROI reference point. In the event of 3 or more markers being placed, the markers defining the base of the resultant 3D volume will determine the configuration/shape of that volume and the position of its reference tip.
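The distance bookkeeping behind the marker "pyramid" can be sketched as follows. This is a non-limiting sketch: the marker coordinates are illustrative, and only a translation of the tissue block is shown, although any rigid motion preserves the inter-point distance signature, which is what allows the pattern to be re-identified after tissue mobilization.

```python
import math

def pairwise_distances(points):
    """All inter-point distances of a marker set; this collection of
    distances is the rigid 'pyramid' signature described in the text."""
    n = len(points)
    return {
        (i, j): math.dist(points[i], points[j])
        for i in range(n) for j in range(i + 1, n)
    }

# Three implanted markers plus a locator-designated reference point
# (illustrative coordinates, mm).
pyramid = [(0, 0, 0), (30, 0, 0), (15, 25, 0), (15, 10, 20)]
sig = pairwise_distances(pyramid)

# A rigid motion of the whole tissue block (here a pure translation)
# leaves the signature unchanged, so a new pyramid re-established
# mid-surgery can be matched against the original one.
moved = [(x + 5, y - 8, z + 12) for (x, y, z) in pyramid]
sig_moved = pairwise_distances(moved)
```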
[0082] The calibration procedure can be performed pre-biopsy/pre-surgery for biopsy needles and surgical instruments, such as an electrocautery device, diathermy knife, scalpels, etc., planned to be used in the course of the biopsy/surgical procedure. The calibration process uses a calibration point (calibration LED (7-5)) located on an appropriate calibration surface. The operator can touch the calibration LED and assign this event by sending a signal to a computer system designating the start of the appropriate calibration program. The binocular sub-system of the AR system performs an estimation of the 3D locations of three LEDs: P1, P2 and P. The length of the needle/instrument is determined by the following algorithm:
Calculate the vector V = P2 − P1; (F12)
[0083] Calculate the norm of the vector V:
Norm(V) = √((X2 − X1)² + (Y2 − Y1)² + (Z2 − Z1)²); (F13)
Calculate the unit vector V_UNIT = V / Norm(V); (F14)
Find the required length L = (P(X, Y, Z) − P1)·V_UNIT. (F15)
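The length calibration of formulas F12 to F15 can be sketched as follows. This is a non-limiting sketch in which F15 is written as the projection (dot product) of (P − P1) onto the shaft axis, which equals the scalar L when the tip lies on that axis; the LED positions are illustrative.

```python
import math

def instrument_length(p1, p2, p_cal):
    """Calibrate the needle/instrument length from two shaft LEDs and
    the calibration-LED position touched by the tip (formulas F12-F15).

    p1:    upper shaft LED (X1, Y1, Z1)
    p2:    lower shaft LED (X2, Y2, Z2)
    p_cal: calibration LED position P(X, Y, Z) at the instant of touch
    """
    v = [b - a for a, b in zip(p1, p2)]               # F12: V = P2 - P1
    norm = math.sqrt(sum(c * c for c in v))           # F13: Norm(V)
    v_unit = [c / norm for c in v]                    # F14: unit vector
    d = [c - a for a, c in zip(p1, p_cal)]
    return sum(di * ui for di, ui in zip(d, v_unit))  # F15: projection onto axis

# Illustrative case: shaft pointing along (0, 0, -1), tip 150 mm from
# the upper LED at the moment it touches the calibration LED.
p1, p2 = (0.0, 0.0, 200.0), (0.0, 0.0, 180.0)
L = instrument_length(p1, p2, (0.0, 0.0, 50.0))
```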
[0084] A method other than utilizing LEDs attached to the locator and surgical instruments can be used. For example, grid/chess board references can also be used for calibration.
[0085] The combination of the system of marker coordinates together with reference points and the coordinates of the tip of a surgical instrument/needle, for example of a biopsy device, can create the virtual object of the dynamic field for surgery/needle biopsy of the tumor bed. This object, combined (by appropriate software) with the binocular AR glasses worn by, for example, a surgeon/breast imager, creates an AR dynamic field allowing the operator full visual control of the entire ROI as defined by the marker(s).
[0086] The described method can allow the surgeon to see if the tip of his instrument(s), for example the cutting diathermy or scalpel, has ‘entered’ the virtual volume of tissue as defined by the markers and hence, transgressed the desired surgical margin. This transgression can also be indicated by a signal, such as for example, by a warning sound or a red (transgression) or green (non-transgression) light. By including the surgical instrument as described above in the AR scene, the surgeon is able to operate in real-time, seamlessly, without the need to periodically revert to using the locator. The surgical instrument can be seen in 3D with respect to the AR defined ROI. For example, the instrument can be seen or designated as lying anterior or posterior to the 3D wired virtual marker defined object.
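One non-limiting way such a transgression check could be implemented is sketched below, under the simplifying assumption that the marker-defined ROI is approximated by the tetrahedron spanned by four markers; the red/green signal logic is represented only by the boolean result, and all coordinates are illustrative.

```python
def _signed_volume(a, b, c, d):
    """Signed volume (up to a constant factor) of tetrahedron (a, b, c, d),
    computed as the determinant of the edge vectors from a."""
    ab = [b[i] - a[i] for i in range(3)]
    ac = [c[i] - a[i] for i in range(3)]
    ad = [d[i] - a[i] for i in range(3)]
    return (ab[0] * (ac[1] * ad[2] - ac[2] * ad[1])
            - ab[1] * (ac[0] * ad[2] - ac[2] * ad[0])
            + ab[2] * (ac[0] * ad[1] - ac[1] * ad[0]))

def tip_inside_volume(tip, markers):
    """Return True (transgression -> red signal) when the instrument tip
    lies inside the tetrahedron spanned by four markers, else False
    (green signal).

    The tip is inside when every sub-tetrahedron formed by replacing one
    marker with the tip has the same orientation (sign of volume) as the
    full tetrahedron."""
    a, b, c, d = markers
    whole = _signed_volume(a, b, c, d)
    subs = [_signed_volume(tip, b, c, d),
            _signed_volume(a, tip, c, d),
            _signed_volume(a, b, tip, d),
            _signed_volume(a, b, c, tip)]
    return all((s > 0) == (whole > 0) for s in subs)

# Illustrative marker positions (mm) and two candidate tip positions.
markers = [(0, 0, 0), (40, 0, 0), (20, 35, 0), (20, 12, 30)]
inside = tip_inside_volume((20, 12, 8), markers)      # within the volume
outside = tip_inside_volume((100, 100, 100), markers)  # well outside
```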
[0087] In cases where tumors have responded fully or almost completely to preoperative systemic therapy, a biopsy of the suspected residual tumor bed can be done under imaging, for example with ultrasound guidance. The physician/operator can see, for example on real-time ultrasound, whether the tip of the biopsy needle has 'entered' the virtual volume of tissue as defined by the markers or transgressed the desired originally defined tumor volume. This transgression can also be indicated by a signal, such as, for example, a warning sound or a red (transgression) or green (non-transgression) light. The biopsy needle can be seen in 3D with respect to the AR-defined ROI. For example, the needle/instrument can be seen or designated as lying anterior or posterior to the 3D wired virtual marker-defined object. The AR field can be image-fused with the image of the imaging modality being used for the biopsy, for example the ultrasound, CT or MRI screen image.
[0088] The node points established by the markers and the designated reference points can be integrated with an AI program that can be combined with the above-described method to enhance the integration of the reference points and the 3D virtual defined volume. AI can be used to collect data from each procedure for the establishment of a historical data set. This will allow for suggestions and alerts to surgeons, providing real-time visual data for intelligent surgery.
[0089] The AR system described and the associated computer program may be used to generate a simulation model of the ROI(s) requiring biopsy and/or excision. The markers and associated ROI(s) in the simulation model can duplicate the in vivo scenario and allow for teaching, review and practice of the interventional procedures prior to the actual in vivo event(s).
[0090] In the description and claims of the present application, each of the verbs, “comprise”, “include” and “have”, and conjugates thereof, are used to indicate that the object or objects of the verb are not necessarily a complete listing of members, components, elements, or parts of the subject or subjects of the verb.
[0091] The present invention has been described using detailed descriptions of embodiments thereof that are provided by way of example and are not intended to limit the scope of the invention. The described embodiments comprise different features, not all of which are required in all embodiments of the invention. Some embodiments of the present invention utilize only some of the features or possible combinations of the features. Variations of embodiments of the present invention that are described and embodiments of the present invention comprising different combinations of features noted in the described embodiments will occur to persons of the art.
[0092] It will be appreciated by persons skilled in the art that the present invention is not limited by what has been particularly shown and described herein above. Rather the scope of the invention is defined by the claims that follow.