AUGMENTED REALITY SOFT TISSUE BIOPSY AND SURGERY SYSTEM
20250352295 · 2025-11-20
Inventors
CPC classification
H04N13/239
ELECTRICITY
A61B90/36
HUMAN NECESSITIES
A61B2090/3945
HUMAN NECESSITIES
A61B2090/395
HUMAN NECESSITIES
A61B2090/365
HUMAN NECESSITIES
A61B34/20
HUMAN NECESSITIES
A61B90/37
HUMAN NECESSITIES
A61B2090/397
HUMAN NECESSITIES
A61B90/39
HUMAN NECESSITIES
H04N13/271
ELECTRICITY
International classification
A61B90/00
HUMAN NECESSITIES
A61B90/50
HUMAN NECESSITIES
G06T19/00
PHYSICS
H04N13/239
ELECTRICITY
Abstract
Apparatus and methods are described for use with a region of interest (ROI) within breast tissue of a breast of a subject and one or more reference markers configured to be placed at respective reference points on skin of the breast. A processing unit determines a relationship between positions of the ROI within the breast tissue and the one or more reference markers on the skin of the breast, and, subsequently, determines a location of the ROI within the breast tissue by identifying the one or more reference markers to thereby localize the ROI within the breast tissue. The processing unit outputs augmented-reality data at a location corresponding to the ROI. Other applications are also described.
Claims
1. An apparatus for use with a region of interest (ROI) within breast tissue of a breast of a subject and one or more reference markers configured to be placed at respective reference points on skin of the breast, the apparatus comprising: a processing unit configured to: determine a relationship between positions of the ROI within the breast tissue and the one or more reference markers on the skin of the breast; subsequently, determine a location of the ROI within the breast tissue by identifying the one or more reference markers to thereby localize the ROI within the breast tissue; and output augmented-reality data at a location corresponding to the ROI.
2. The apparatus according to claim 1, wherein the one or more reference markers include one or more markings on the skin of the breast and/or one or more objects placed on the skin of the breast, and wherein the processing unit is configured to identify the one or more reference markers by identifying the one or more markings on the skin of the breast and/or the one or more objects placed on the skin of the breast.
3. The apparatus according to claim 1, wherein the apparatus is configured for use with one or more auxiliary reference markers placed on skin and/or tissue of the breast, and wherein, in the event that one or more of the reference points on the skin of the breast lies in an area requiring skin removal or surgical entry, the processing unit is configured to: determine a relationship between the position of the ROI within the breast tissue and the one or more auxiliary reference markers; subsequently, determine a location of the ROI within the breast tissue by identifying the one or more auxiliary reference markers to thereby localize the ROI within the breast tissue; and output augmented reality data at a location corresponding to the ROI.
4. The apparatus according to claim 1, wherein the processing unit is configured to receive an input from a user indicating that the relationship between positions of the ROI within the breast tissue and the one or more reference markers on the skin of the breast has been determined and in response thereto to associate the position of the ROI within the breast tissue and the one or more reference markers on the skin of the breast via a 3D connectivity mesh.
5. The apparatus according to claim 1, wherein the processing unit is configured to output the augmented reality data at a location on the breast that corresponds to the ROI.
6. The apparatus according to claim 1, wherein the one or more reference markers comprise three or more reference markers and wherein the processing unit is configured to determine the relationship between positions of the ROI within the breast tissue and the one or more reference markers on the skin of the breast by determining respective relationships between the position of the ROI within the breast tissue and positions of each of the reference markers on the skin of the breast.
7. The apparatus according to claim 1, wherein the processing unit is configured to determine the relationship between the position of the ROI within the breast tissue and positions of each of the reference markers on the skin of the breast, by defining one or more wired 3D geometry figures with nodes that connect the ROI within the breast tissue and positions of the one or more reference markers on the skin of the breast.
8. The apparatus according to claim 1, wherein the one or more reference markers comprise one or more visible reference markers.
9. The apparatus according to claim 1, wherein the one or more reference markers are configured to be repositioned to new positions during a procedure that is performed on the subject's breast, and wherein the processing unit is configured to determine an updated relationship between positions of the ROI within the breast tissue and the new positions of the reference markers.
10. The apparatus according to claim 1, wherein the processing unit is configured to localize the ROI within the breast tissue by translating movement of the one or more reference markers to movement of the ROI within the breast tissue.
11. The apparatus according to claim 10, wherein the processing unit is configured to translate movement of the one or more reference markers to movement of the ROI within the breast tissue by assuming that movement of the breast tissue will not significantly vary a distance between the ROI within the breast tissue and the one or more reference markers on the skin of the breast.
12. The apparatus according to claim 1, wherein the apparatus is for use with at least one breast marker that is placed within the breast tissue such as to designate the ROI and wherein the processing unit is configured to determine the relationship between positions of the ROI within the breast tissue and the one or more reference markers on the skin of the breast by determining a relationship between positions of the at least one breast marker within the breast tissue and the one or more reference markers on the skin of the breast.
13. The apparatus according to claim 12, wherein the apparatus is for use with a plurality of breast markers that are placed within the breast tissue such as to designate the ROI and wherein the processing unit is configured to define wired 3D geometry figures with nodes that connect the plurality of breast markers within the breast tissue and positions of respective reference markers on the skin of the breast in order to define the ROI volume.
14. The apparatus according to claim 1, wherein the apparatus is configured for use with augmented reality glasses and wherein the processing unit is configured to output augmented reality data at a location corresponding to the ROI on the augmented reality glasses.
15. The apparatus according to claim 14, wherein the processing unit is configured to output an augmented reality image of the ROI volume at the location corresponding to the ROI on the augmented reality glasses.
16. The apparatus according to claim 14, wherein the apparatus is configured for use with a tool having a working edge, and wherein the processing unit is configured to output an augmented reality image of a real-time position of the working edge of the tool relative to the ROI on the augmented reality glasses.
17. The apparatus according to claim 1, wherein the ROI includes a tumor and wherein the processing unit is configured to: determine a relationship between positions of the tumor within the breast tissue and the one or more reference markers on the skin of the breast; subsequently, determine a location of the tumor within the breast tissue by identifying the one or more reference markers to thereby localize the tumor within the breast tissue; and output augmented-reality data at a location corresponding to the tumor.
18. The apparatus according to claim 17, wherein the processing unit is configured to determine the position of the tumor within the breast tissue based on an ultrasound image of the tumor and to determine the relationship between positions of the tumor within the breast tissue and the one or more reference markers on the skin of the breast by determining a relationship between the positions of the one or more reference markers on the skin of the breast and the position of the tumor within the breast tissue, as determined based on an ultrasound image of the tumor.
19. The apparatus according to claim 17, wherein the processing unit is configured to determine the position of the tumor within the breast tissue by localizing one or more passive markers that are placed in a vicinity of the tumor, and to determine the relationship between positions of the tumor within the breast tissue and the one or more reference markers on the skin of the breast by determining a relationship between positions of the one or more reference markers on the skin of the breast and the position of the tumor within the breast tissue, as determined by localizing the one or more passive markers.
20. The apparatus according to claim 17, wherein the processing unit is configured to determine the position of the tumor within the breast tissue using guide wire localization, and to determine the relationship between positions of the tumor within the breast tissue and the one or more reference markers on the skin of the breast by determining a relationship between positions of the one or more reference markers on the skin of the breast and the position of the tumor within the breast tissue, as determined using guide wire localization.
21. A method for use with a region of interest (ROI) within breast tissue of a breast of a subject and one or more reference markers configured to be placed at respective reference points on skin of the breast, the method comprising: determining a relationship between positions of the ROI within the breast tissue and the one or more reference markers on the skin of the breast; subsequently, determining a location of the ROI within the breast tissue by identifying the one or more reference markers to thereby localize the ROI within the breast tissue; and outputting augmented reality data at a location corresponding to the ROI.
22. The method according to claim 21, further comprising receiving an input from a user indicating that the relationship between positions of the ROI within the breast tissue and the one or more reference markers on the skin of the breast has been determined and in response thereto associating the position of the ROI within the breast tissue and the one or more reference markers on the skin of the breast via a 3D connectivity mesh.
23. The method according to claim 21, wherein the one or more reference markers include three or more reference markers and wherein determining the relationship between positions of the ROI within the breast tissue and the one or more reference markers on the skin of the breast comprises determining respective relationships between the position of the ROI within the breast tissue and positions of each of the reference markers on the skin of the breast.
24. The method according to claim 21, wherein determining the relationship between the position of the ROI within the breast tissue and positions of each of the reference markers on the skin of the breast comprises defining one or more wired 3D geometry figures with nodes that connect the ROI within the breast tissue and positions of the one or more reference markers on the skin of the breast.
25. The method according to claim 21, wherein the ROI includes a tumor and wherein: determining the relationship between positions of the ROI within the breast tissue and the one or more reference markers on the skin of the breast comprises determining a relationship between positions of the tumor within the breast tissue and the one or more reference markers on the skin of the breast; determining the location of the ROI within the breast tissue by identifying the one or more reference markers to thereby localize the ROI within the breast tissue comprises determining a location of the tumor within the breast tissue by identifying the one or more reference markers to thereby localize the tumor within the breast tissue; and outputting augmented reality data at the location corresponding to the ROI comprises outputting augmented-reality data at a location corresponding to the tumor.
26. The method according to claim 25, further comprising determining the position of the tumor within the breast tissue based on an ultrasound image of the tumor, wherein determining the relationship between positions of the tumor within the breast tissue and the one or more reference markers on the skin of the breast comprises determining a relationship between positions of the one or more reference markers on the skin of the breast and the position of the tumor within the breast tissue, as determined based on the ultrasound image.
27. The method according to claim 25, further comprising determining the position of the tumor within the breast tissue by localizing one or more passive markers that are placed in a vicinity of the tumor, wherein determining the relationship between positions of the tumor within the breast tissue and the one or more reference markers on the skin of the breast comprises determining a relationship between positions of the one or more reference markers on the skin of the breast and the position of the tumor within the breast tissue, as determined by localizing the one or more passive markers that are placed in the vicinity of the tumor.
28. The method according to claim 25, further comprising determining the position of the tumor within the breast tissue by using guide wire localization, wherein determining the relationship between positions of the tumor within the breast tissue and the one or more reference markers on the skin of the breast comprises determining a relationship between positions of the one or more reference markers on the skin of the breast and the position of the tumor within the breast tissue, as determined by using guide wire localization.
Description
BRIEF DESCRIPTION OF THE SEVERAL VIEWS OF THE DRAWING
DETAILED DESCRIPTION OF VARIOUS EMBODIMENTS
[0058] The present invention, as well as features and aspects thereof, is directed towards providing a marker localization system with binocular augmented reality (AR) glasses that are communicatively coupled to a server that directs a procedure.
[0059] The disclosed AR system, which may be utilized in surgical settings, biopsy settings, or other procedures, comprises a marker localization system with binocular AR glasses interconnected, by wire or wirelessly, with at least one master computer or processing unit (server, tablet, laptop, etc.). The computer can be the computer of the smart-marker localization system, running additional appropriate software for control of the AR processes; a computer independent of the marker localization system (a sole AR computer); or combined AR and marker-localization-system computers with appropriate interfaces for automatically transferring marker-depth data in real time. The term AR computer can also be referred to as the AR master computer. Throughout the disclosure, the general term computer will be understood to refer to either of, or a combination of, the above computer systems.
[0060] The AR glasses can comprise an assembly of VR glasses and high-resolution binocular video cameras, wherein both the binocular system and the VR glasses have appropriate interfaces with the computer(s) allowing real-time operations. The system also includes optically contrasted objects/signs/icons. The computer can provide the distance measurement from the currently selected marker to a locator (which may be handheld), the marker's electronic ID if available, and inter-marker distances, as well as perform all required AR calculations. Simultaneously, the AR glasses provide the visual information used for geometry measurements, geometry transformation and composite image creation. The reader is referred to U.S. Provisional Patent Application No. 63/218,973, filed on Jul. 7, 2021, now U.S. patent application Ser. No. 17/841,663, filed on Jun. 16, 2022 (incorporated hereinabove by reference), for a detailed description of an exemplary marker system that is suitable for various embodiments of the AR surgical system. It should be appreciated by those skilled in the relevant art that markers and locators of designs other than those described in U.S. Ser. No. 17/841,663 may also be used to construct a virtual rendering of the ROI. Generally, the marker and locator design should be such that it provides marker depth/distance measurements from the locator tip to the marker. For example, Cianna Medical Savi Scout, Health Beacons LOCalizer, Magseed, Elucent, and Molli markers and their appropriate locators can be used to generate the AR rendering of the ROI by, for example, attaching appropriate LEDs to their locators (the locator in the case of Elucent being a surgical instrument). Grid/chessboard references can also be used for this purpose.
[0061] Embodiments of the AR system utilize markers that are transponders, although in some embodiments markers that are signal reflectors or signal-activated may also be used. Further, the transponder embodiments of the AR system operate by processing the phase shift between the signals that are received from the markers and the reference signal, rather than just the wave amplitude. In some embodiments, smart markers may be utilized that have collective IDs. Some embodiments of the AR surgical system utilize a pair of commonly available glasses and smart markers of various companies, each covered by patents, combined with a control system to provide a unique AR surgical system. The various embodiments provide a solution for use in soft tissue surgery or procedures, especially in mobile tissue environments, such as breast surgery, where standard augmented reality image fusion is difficult. The various embodiments likewise provide a solution for image-guided soft tissue biopsy, especially using ultrasound, in cases where the tumor has completely or almost completely disappeared after the use of systemic therapy given prior to planned surgery. In such instances, there is a need to biopsy the tissue area previously occupied by the tumor, which is no longer visible, in order to see if microscopic remnants of tumor are still present.
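The disclosure states that range is derived from the phase shift between the marker's return signal and the reference signal, but gives no equations. The sketch below is a textbook illustration of that idea, not code from the disclosure; the function name, the carrier-frequency parameter, and the assumption that the phase is unambiguous within one wavelength (and that propagation is at free-space speed) are all illustrative simplifications.

```python
import math

def range_from_phase(phase_shift_rad: float, carrier_freq_hz: float,
                     c: float = 299_792_458.0) -> float:
    """Estimate the one-way distance to a transponder marker from the
    measured phase shift between its return signal and the reference.

    Illustrative only: assumes the phase shift is within one carrier
    wavelength (no integer-cycle ambiguity) and free-space propagation.
    """
    wavelength = c / carrier_freq_hz
    # A 2*pi phase shift corresponds to one wavelength of round-trip
    # path; halve the round-trip path for the one-way distance.
    round_trip = (phase_shift_rad / (2.0 * math.pi)) * wavelength
    return round_trip / 2.0
```

A practical system would additionally resolve the cycle ambiguity (e.g., with multiple carrier frequencies) and correct for the signal speed in tissue.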
[0064] The coordinate system XOZ has its origin midway between the left sensor matrix (2-1) and the right sensor matrix (2-2). The rays from the point P(X, Y, Z) pass through the left lens (2-3) and the right lens (2-4) to create images at pixel (n_L, m_L) of the left matrix and pixel (n_R, m_R) of the right matrix. Both matrices are identical, with M pixels vertically and N pixels horizontally; in linear units of measure, the matrix resolutions in the X and Y directions are the corresponding pixel pitches. The local origin of each matrix is at its center, while the axis of the local coordinate system of each one is:
[0065] The coordinates of point P can be calculated according to the following formulas:
[0066] This binocular system, for example, has a base D=70 mm, focal lengths F=10 mm, sensor resolution of 20 MP (mega-pixels) and pixel size of 1.5 μm, and provides an accuracy of X, Y, Z coordinate estimation of about 0.1 mm at a distance range of Z=[350 to 500] mm. This range satisfies the distance from the eyes of the surgeon to the operation field for most surgeries including breast surgeries.
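The triangulation formulas themselves are not reproduced in this text. A minimal sketch of the standard rectified-stereo computation, using the example parameters above (baseline D = 70 mm, focal length F = 10 mm), might look as follows; the function name and the assumption of ideal, distortion-free, rectified geometry are illustrative rather than taken from the disclosure.

```python
def triangulate(xL, yL, xR, yR, D=0.070, F=0.010):
    """Recover P = (X, Y, Z) from metric image coordinates on the two
    sensors (measured from each sensor's center, in meters).

    Assumes ideal rectified geometry: optical axes parallel to Z,
    baseline D along X, common focal length F, origin midway between
    the cameras (as in the coordinate system described above).
    """
    disparity = xL - xR            # positive for points in front
    Z = F * D / disparity          # depth from disparity
    X = Z * (xL + xR) / (2.0 * F)  # lateral position
    Y = Z * (yL + yR) / (2.0 * F)  # vertical position
    return X, Y, Z
```

With a 1.5 μm pixel pitch, a one-pixel disparity error at Z = 500 mm corresponds to roughly Z²/(F·D) × 1.5 μm ≈ 0.5 mm of depth error, so sub-pixel localization (e.g., of LED centroids) is presumably what allows the stated ~0.1 mm accuracy.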
[0067] As described above, the AR process requires the surgeon to utilize the locator in the creation of the virtual image. For example, the surgeon can hold the locator and glide its tip across the tissue surface, for example, the patient's breast. The computer system provides an accurate measurement of the distance between the tip of the locator and the marker under test. The tip of the locator can be (visually) obscured by the hands of the surgeon, resulting in its coordinates not being optically measurable directly by the AR system. This situation can require an estimation of the coordinates of the locator's tip.
[0068] Estimation of the coordinates of the locator's tip is illustrated by
[0069] The solution is a point with maximum value of Z (most remote point from the AR origin). Due to the outside diameter of the locator body, a number of LEDs can be used around the body at point P2 to ensure an optically accessible (unobscured) visual point and allow for correct calculation(s).
[0070] The 3D estimation of the locator's tip position is performed according to the following algorithm:
[0071] The result of this algorithm can be verified:
[0072] This algorithm provides an accuracy of 3D estimation of the locator tip position of better than +/−0.2 mm.
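The tip-estimation algorithm and its verification step are not reproduced in this text. The following sketch illustrates one plausible form of the computation, under the assumption that the obscured tip lies on the locator axis at a known offset beyond an optically visible point (such as the centroid of the LED ring at P2 described above); the two-point parameterization and names are illustrative.

```python
import math

def estimate_tip(p1, p2, tip_offset):
    """Extrapolate the position of the (visually obscured) locator tip.

    p1, p2: 3D points on the locator axis (for example, centroids of
            LED groups mounted on the body), ordered handle-to-tip.
    tip_offset: known distance from p2 to the tip along the axis.

    Consistent with the selection rule stated above, the returned
    point lies farther from the AR origin (larger Z) than the LEDs.
    """
    d = [b - a for a, b in zip(p1, p2)]        # axis direction vector
    n = math.sqrt(sum(c * c for c in d))        # length of that vector
    # Extend the unit axis direction from p2 by the known tip offset.
    return tuple(b + tip_offset * c / n for b, c in zip(p2, d))
```

A verification analogous to the one mentioned above could check that the returned point is at distance `tip_offset` from p2 and collinear with p1 and p2.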
[0073] The accumulation of marker localization data permits determination of the distances between the implanted markers and between the last localized marker and the tip of the locator. The combination of all these distances in the computer memory can be interpreted as a (3-5) 3D pattern, for example a pyramid as depicted in
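The distance pattern described here can be converted into coordinates by trilateration: given three marker positions and the measured distances from the locator tip to each, there are two mirror-image candidates for the pyramid apex, and the selection rule stated earlier (the point most remote from the AR origin, i.e., maximum Z) resolves the ambiguity. The routine below is a generic trilateration sketch, not code from the disclosure; all names are illustrative.

```python
import math

def trilaterate(c1, c2, c3, r1, r2, r3):
    """Return the two candidate points at distances r1, r2, r3 from
    the base points c1, c2, c3 (mirror images across the base plane)."""
    def sub(a, b): return [x - y for x, y in zip(a, b)]
    def add(a, b): return [x + y for x, y in zip(a, b)]
    def dot(a, b): return sum(x * y for x, y in zip(a, b))
    def mul(a, s): return [x * s for x in a]
    def cross(a, b):
        return [a[1]*b[2] - a[2]*b[1],
                a[2]*b[0] - a[0]*b[2],
                a[0]*b[1] - a[1]*b[0]]
    # Orthonormal frame spanning the base (marker) plane.
    ex = sub(c2, c1); d = math.sqrt(dot(ex, ex)); ex = mul(ex, 1.0 / d)
    t = sub(c3, c1); i = dot(ex, t)
    ey = sub(t, mul(ex, i)); j = math.sqrt(dot(ey, ey)); ey = mul(ey, 1.0 / j)
    ez = cross(ex, ey)
    # Intersect the three spheres in that frame.
    x = (r1*r1 - r2*r2 + d*d) / (2.0 * d)
    y = (r1*r1 - r3*r3 + i*i + j*j - 2.0*i*x) / (2.0 * j)
    h = math.sqrt(max(r1*r1 - x*x - y*y, 0.0))   # clamp noise-induced negatives
    base = add(add(c1, mul(ex, x)), mul(ey, y))
    return add(base, mul(ez, h)), add(base, mul(ez, -h))

def locator_tip_from_distances(c1, c2, c3, r1, r2, r3):
    """Resolve the two-fold ambiguity per the rule stated earlier in
    the text: keep the candidate most remote from the AR origin."""
    a, b = trilaterate(c1, c2, c3, r1, r2, r3)
    return max((a, b), key=lambda p: p[2])
```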
[0075] By performing the above-described process, the surgeon, after having set up a first marker pyramid, is free to operate on the markers and the ROI which they define with free hands and real-time vision. During the course of a surgery, for example with tissue mobilized and moved, the surgeon can at any time reestablish a new pyramid by relocating the markers and establishing a new ROI reference point. In the event of 3 or more markers being placed, the markers defining the base of the resultant 3D volume will determine the configuration/shape of that volume and the position of its reference tip.
[0081] The calibration procedure can be performed pre-biopsy/pre-surgery for biopsy needles and for surgical instruments, such as an electrocautery device, a diathermy knife, scalpels, etc., that are planned to be used in the course of the biopsy/surgical procedure. The calibration process uses a calibration point (calibration LED (7-5)) located on an appropriate calibration surface. The operator can touch the calibration LED and assign this event by sending a signal to a computer system designating the start of the appropriate calibration program. The binocular sub-system of the AR system performs an estimation of the 3D locations of 3 LEDs: P_1, P_2 and P. The length of the needle/instrument is determined by the following algorithm:
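The calibration algorithm itself is not reproduced in this text. A plausible sketch of the computation: while the operator touches the calibration LED, the tip position is known exactly, so the offset from an instrument-mounted LED to the tip can be measured once and reused later by the tip-extrapolation step. Function and parameter names, and the collinearity tolerance, are illustrative assumptions.

```python
import math

def calibrate_tip_offset(p1, p2, p_cal, max_axis_error=0.002):
    """Measure the LED-to-tip offset while the tip touches the
    calibration LED.

    p1, p2:  instrument-mounted LED positions defining the tool axis.
    p_cal:   position of the calibration LED currently touched by the tip.
    Returns the distance from p2 to the tip (i.e., to p_cal), raising
    if p_cal lies off the p1->p2 axis (e.g., a bent needle or a
    mis-detected LED).
    """
    def sub(a, b): return [x - y for x, y in zip(a, b)]
    def norm(v): return math.sqrt(sum(x * x for x in v))
    axis = sub(p2, p1)
    u = [x / norm(axis) for x in axis]            # unit axis direction
    w = sub(p_cal, p2)                            # LED-to-tip vector
    along = sum(a * b for a, b in zip(u, w))      # projection on axis
    off_axis = norm(sub(w, [along * x for x in u]))
    if off_axis > max_axis_error:
        raise ValueError("calibration point is off the instrument axis")
    return norm(w)
```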
[0082] Methods other than LEDs attached to the locator and surgical instruments can be used; for example, grid/chessboard references can also be used for calibration.
[0083] The combination of the system of marker coordinates, together with the reference points and the coordinates of the tip of a surgical instrument/needle (for example, of a biopsy device), can create the virtual object of the dynamic field for surgery/needle biopsy of the tumor bed. This object, combined (by appropriate software) with the binocular AR glasses worn by, for example, a surgeon/breast imager, creates an AR dynamic field allowing the operator full visual control of the entire ROI as defined by the marker(s).
[0084] The described method can allow the surgeon to see if the tip of his instrument(s), for example the cutting diathermy or scalpel, has entered the virtual volume of tissue as defined by the markers and hence, transgressed the desired surgical margin. This transgression can also be indicated by a signal, such as for example, by a warning sound or a red (transgression) or green (non-transgression) light. By including the surgical instrument as described above in the AR scene, the surgeon is able to operate in real-time, seamlessly, without the need to periodically revert to using the locator. The surgical instrument can be seen in 3D with respect to the AR defined ROI. For example, the instrument can be seen or designated as lying anterior or posterior to the 3D wired virtual marker defined object.
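As a concrete illustration of the transgression signal described above, a tip-inside-volume test against a marker-defined volume could look like the following sketch. The restriction to a four-marker tetrahedron and the "red"/"green" return values are illustrative assumptions, not taken from the disclosure.

```python
def _same_side(v1, v2, v3, v4, p):
    """True if p and v4 lie on the same side of the plane (v1, v2, v3)."""
    def sub(a, b): return [x - y for x, y in zip(a, b)]
    def dot(a, b): return sum(x * y for x, y in zip(a, b))
    def cross(a, b):
        return [a[1]*b[2] - a[2]*b[1],
                a[2]*b[0] - a[0]*b[2],
                a[0]*b[1] - a[1]*b[0]]
    n = cross(sub(v2, v1), sub(v3, v1))
    return dot(n, sub(v4, v1)) * dot(n, sub(p, v1)) >= 0

def margin_signal(markers, tip):
    """markers: four marker positions defining the virtual ROI volume
    (a tetrahedron, for this sketch).  Returns 'red' if the instrument
    tip has entered the marker-defined volume (transgression), and
    'green' otherwise, mirroring the light signal described above."""
    a, b, c, d = markers
    inside = (_same_side(a, b, c, d, tip) and _same_side(b, c, d, a, tip)
              and _same_side(c, d, a, b, tip) and _same_side(d, a, b, c, tip))
    return "red" if inside else "green"
```

For more than four markers, the same idea generalizes to a test against the convex hull of the marker positions, evaluated each frame as the tip is tracked.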
[0085] In cases where tumors have responded fully or almost completely to preoperative systemic therapy, a biopsy of the suspected residual tumor bed can be done under imaging, for example with ultrasound guidance. The physician/operator can see, for example at real-time ultrasound, if the tip of the biopsy needle has entered the virtual volume of tissue as defined by the markers or transgressed the desired originally defined tumor volume. This transgression can also be indicated by a signal such as for example by a warning sound or a red (transgression) or green (non-transgression) light. The biopsy needle can be seen in 3D with respect to the AR defined ROI. For example, the needle/instrument can be seen or designated as lying anterior or posterior to the 3D wired virtual marker defined object. The AR field can be image fused with the image of the imaging modality being used for the biopsy for example the ultrasound or CT or MRI screen image.
[0086] The node points established by the markers and the designated reference points can be integrated with an AI program that can be combined with the above-described method to enhance the integration of the reference points and 3D virtual defined volume. AI can be used to collect data from each procedure for the establishment of a historical data set. This will allow for suggestions and alerts to surgeons, providing real time visual data for intelligent surgery.
[0087] The AR system described and the associated computer program may be used to generate a simulation model of the ROI/ROIs requiring biopsy and/or excision. The markers and associated ROI/ROIs in the simulation model can duplicate the in vivo scenario and allow for teaching/review and practice of the interventional procedures prior to the actual in vivo events.
[0088] In the description and claims of the present application, each of the verbs "comprise," "include," and "have," and conjugates thereof, is used to indicate that the object or objects of the verb are not necessarily a complete listing of members, components, elements, or parts of the subject or subjects of the verb.
[0089] The present invention has been described using detailed descriptions of embodiments thereof that are provided by way of example and are not intended to limit the scope of the invention. The described embodiments comprise different features, not all of which are required in all embodiments of the invention. Some embodiments of the present invention utilize only some of the features or possible combinations of the features. Variations of embodiments of the present invention that are described and embodiments of the present invention comprising different combinations of features noted in the described embodiments will occur to persons of the art.
[0090] It will be appreciated by persons skilled in the art that the present invention is not limited by what has been particularly shown and described herein above. Rather the scope of the invention is defined by the claims that follow.