C-ARM-BASED MEDICAL IMAGING SYSTEM, AND METHOD FOR MATCHING 2D IMAGE AND 3D SPACE

20220039874 · 2022-02-10

    Abstract

    Proposed is a medical imaging system including: a C-arm including an X-ray source and a detector; a first plate installed on an X-ray path between the X-ray source and the detector, and including a first transmissive surface provided with a plurality of first ball markers blocking an X-ray, and a first optical marker; a second plate installed on the X-ray path between the X-ray source and the detector, and including a second transmissive surface provided with a plurality of second ball markers blocking the X-ray, and a second optical marker; a reference optical marker configured to provide a 3D reference coordinate system; an optical tracking device configured to recognize locations of the first and second optical markers and the reference optical marker; and a matcher configured to calculate a matching relationship between coordinates on the 3D reference coordinate system and locations on first and second captured images.

    Claims

    1. A medical imaging system comprising: a C-type fluoroscopy device (hereinafter, referred to as ‘C-arm’) comprising an X-ray source and a detector; a first plate installed on an X-ray path between the X-ray source and the detector, and comprising a first transmissive surface provided with a plurality of first ball markers blocking an X-ray, and a first optical marker; a second plate installed on the X-ray path between the X-ray source and the detector, and comprising a second transmissive surface provided with a plurality of second ball markers blocking the X-ray, and a second optical marker; a reference optical marker configured to provide a 3D reference coordinate system; an optical tracking device configured to recognize locations of the first and second optical markers and the reference optical marker; and a matcher configured to calculate a matching relationship between coordinates on the 3D reference coordinate system and locations on first and second captured images, based on the first and second captured images respectively obtained by the detector with regard to a subject at first and second locations, locations of the first and second ball markers on the first and second captured images, and location information obtained by the optical tracking device.

    2. The medical imaging system of claim 1, wherein the matcher is configured to obtain first and second projected images by respectively projecting the first and second captured images to the first plate along the X-ray path, based on a matching relationship between the locations of the first ball markers on the first and second captured images and the locations of the first ball markers on the first plate calculated using the first optical marker.

    3. The medical imaging system of claim 2, wherein the matcher is configured to obtain first and second source locations of the X-ray source respectively corresponding to the first and second captured images, based on a matching relationship between the locations of the second ball markers on the first and second projected images and the locations of the second ball markers on the second plate calculated using the second optical marker.

    4. The medical imaging system of claim 3, wherein the matcher is configured to calculate a 3D imaging space in which a projection path of a first X-ray emitted at the first source location from a location relationship between the first source location and the first projected image overlaps a projection path of a second X-ray emitted at the second source location from a location relationship between the second source location and the second projected image.

    5. The medical imaging system of claim 1, further comprising: a display configured to display the first and second captured images; and a user interface configured to receive information about certain locations on the first and second captured images from a surgical operator, wherein the matcher is configured to calculate spatial coordinates in the reference coordinate system, which correspond to the location information input by the surgical operator.

    6. The medical imaging system of claim 1, further comprising a medical instrument comprising a third optical marker, wherein the matcher is configured to obtain location coordinates of the medical instrument obtained by the optical tracking device using the third optical marker, and calculate location information about the medical instrument on the first and second captured images based on the matching relationship.

    7. The medical imaging system of claim 1, wherein the first plate comprises a fastening member to be fastened to the C-arm so that the first transmissive surface can be located in front of the detector.

    8. The medical imaging system of claim 1, wherein the second plate is installed between the first plate and the X-ray detector.

    9. A method of matching a C-arm 2D image and a 3D space performed by a medical imaging system, comprising: obtaining a first captured image at a first location by a C-arm detector of the medical imaging system; by a matcher of the medical imaging system, obtaining a first projected image by projecting the first captured image back to a first plate along a first X-ray path corresponding to the first location, in which the first plate is provided on the first X-ray path and comprises a first optical marker of which a location is identifiable in a medical spatial coordinate system (hereinafter, referred to as a ‘reference coordinate system’); by the matcher, calculating a location of a C-arm source corresponding to the first location, i.e., a first source location, based on a matching relationship between a location on the reference coordinate system of a ball marker of a second plate provided on the first X-ray path and a location of the ball marker of the second plate on the first projected image; obtaining a second captured image at a second location by the C-arm detector; by the matcher, obtaining a second projected image by projecting the second captured image back to the first plate provided on a second X-ray path along the second X-ray path corresponding to the second location; by the matcher, calculating a location of the C-arm source corresponding to the second location, i.e., a second source location, based on a matching relationship between a location on the reference coordinate system of the ball marker of the second plate provided on the second X-ray path and a location of the ball marker of the second plate on the second projected image; and by the matcher, calculating coordinates of an intersection between a first line connecting the first source location and a first pixel on the first projected image and a second line connecting the second source location and a second pixel on the second projected image.

    10. A method of matching a C-arm 2D image and a 3D space performed by a medical imaging system, comprising: obtaining a first captured image at a first location by a C-arm detector of the medical imaging system; by a matcher of the medical imaging system, obtaining a first projected image by projecting the first captured image back to a first plate along a first X-ray path corresponding to the first location, in which the first plate is provided on the first X-ray path and comprises a first optical marker of which a location is identifiable in a reference coordinate system; by the matcher, calculating a location of a C-arm source corresponding to the first location, i.e., a first source location, based on a matching relationship between a location on the reference coordinate system of a ball marker of a second plate provided on the first X-ray path and a location of the ball marker of the second plate on the first projected image; obtaining a second captured image at a second location by the C-arm detector; by the matcher, obtaining a second projected image by projecting the second captured image back to the first plate provided on a second X-ray path along the second X-ray path corresponding to the second location; by the matcher, calculating a location of the C-arm source corresponding to the second location, i.e., a second source location, based on a matching relationship between a location on the reference coordinate system of the ball marker of the second plate provided on the second X-ray path and a location of the ball marker of the second plate on the second projected image; and by the matcher, calculating first pixels to which certain spatial coordinates are projected on the first projected image along the first X-ray path, and calculating second pixels to which the certain spatial coordinates are projected on the second projected image along the second X-ray path.

    11. The method of claim 9, wherein the obtaining the first projected image of the first captured image on the first plate by the matcher comprises calculating the first projected image by warping the first captured image based on a location on the first captured image corresponding to the ball marker of the first plate and a location on the reference coordinate system.

    12. The method of claim 9, wherein the obtaining the second projected image of the second captured image on the first plate by the matcher comprises calculating the second projected image by warping the second captured image based on a location on the second captured image corresponding to the ball marker of the first plate and a location on the reference coordinate system.

    13. The method of claim 10, wherein the obtaining the first projected image of the first captured image on the first plate by the matcher comprises calculating the first projected image by warping the first captured image based on a location on the first captured image corresponding to the ball marker of the first plate and a location on the reference coordinate system.

    14. The method of claim 10, wherein the obtaining the second projected image of the second captured image on the first plate by the matcher comprises calculating the second projected image by warping the second captured image based on a location on the second captured image corresponding to the ball marker of the first plate and a location on the reference coordinate system.

    Description

    DESCRIPTION OF DRAWINGS

    [0017] FIG. 1 is a schematic structure view of a conventional C-arm apparatus;

    [0018] FIG. 2 is a schematic view of a C-arm-based medical imaging system according to an embodiment of the disclosure;

    [0019] FIG. 3 is a flowchart of a matching method between 2D image pixels and 3D space coordinates according to an embodiment of the disclosure; and

    [0020] FIGS. 4 to 8 are schematic views for describing a matching process according to an embodiment of the disclosure.

    MODE FOR INVENTION

    [0021] Below, embodiments of the disclosure will be described with reference to the accompanying drawings.

    [0022] FIG. 2 is a schematic view of a C-arm-based medical imaging system according to an embodiment of the disclosure.

    [0023] Referring to FIG. 2, the C-arm-based medical imaging system according to an embodiment of the disclosure includes a C-arm 10, a first plate 20, a second plate 30, a reference optical marker 40, an optical tracking device 50, a display 60, a user interface 70, and a matcher 80.

    [0024] The C-arm 10 is provided with an X-ray source 14 and a detector 16 which are installed at opposite ends of a frame 12 and face each other, and includes an actuator 18 for rotationally or translationally moving the frame 12.

    [0025] The first plate 20 includes a first transmissive surface 21, on which a plurality of first ball markers are arranged in a first pattern, and a first optical marker 22. Here, the first transmissive surface 21 is installed to intersect the X-ray path between the X-ray source 14 and the detector 16 of the C-arm 10, and the first ball markers are made of an X-ray blocking material. The first optical marker 22 is provided to define a coordinate system for the first plate 20 and to be detectable by an external optical tracking device.

    [0026] The first plate 20 may be stationarily installed on the C-arm 10 so that the first plate 20 is located on a first X-ray path when the C-arm source 14 and the detector 16 are in a first location to capture an image, and on a second X-ray path when the source 14 and the detector 16 are moved to a second location to capture an image. For example, the first plate 20 may include a fastening member (not shown) so as to be fastened to the front surface of the detector 16.

    [0027] The second plate 30 includes a second transmissive surface 31, on which a plurality of second ball markers are arranged in a second pattern, and a second optical marker 32. The second transmissive surface 31 is installed to intersect the X-ray path between the X-ray source 14 and the detector 16 of the C-arm 10, and the second ball markers are made of an X-ray blocking material. The second optical marker 32 is provided to define a coordinate system for the second plate 30 and to be detectable by an external optical tracking device.

    [0028] The second plate 30 may be stationarily or semi-stationarily installed to intersect the first X-ray path and the second X-ray path corresponding to the first and second locations, like the first plate 20.

    [0029] The reference optical marker 40 provides a reference for defining a coordinate system of the medical space in which the C-arm 10 operates, and is detected by the external optical tracking device.

    [0030] The optical tracking device 50 may be embodied by an optical tracking system (OTS) to recognize optical markers such as the first and second optical markers 22 and 32 and the reference optical marker 40. A commercial medical OTS provides not only the distance, bearing, and height of each optical marker but also a transformation between the optical marker's coordinate system and its own, and therefore a commercial medical OTS may be employed as the optical tracking device 50.
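    Although the disclosure does not specify an implementation, the coordinate bookkeeping such an OTS enables can be sketched with 4×4 homogeneous transforms: the tracker reports each marker's pose in its own camera frame, and a pose is re-expressed in the reference-marker frame by composing transforms. The frame names and numeric poses below are illustrative assumptions, not values from the disclosure.

```python
import numpy as np

def pose(R, t):
    """Build a 4x4 homogeneous transform from rotation R (3x3) and translation t (3,)."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# Hypothetical tracker output: poses of the reference marker 40 and the
# first optical marker 22, both expressed in the tracker's camera frame.
T_cam_ref = pose(np.eye(3), np.array([0.0, 0.0, 1.0]))
T_cam_plate1 = pose(np.eye(3), np.array([0.1, 0.0, 1.2]))

# Pose of the first plate in the reference coordinate system:
T_ref_plate1 = np.linalg.inv(T_cam_ref) @ T_cam_plate1

# A point known in the plate frame (e.g. a ball-marker center), mapped
# into the reference frame via homogeneous coordinates.
p_plate = np.array([0.05, 0.02, 0.0, 1.0])
p_ref = T_ref_plate1 @ p_plate
```

    The same composition would give the poses of the second plate and of any tool marker in the reference coordinate system.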

    [0031] The display 60 is to show an X-ray image captured by the detector 16 to a surgical operator, and the user interface 70 is to receive location information input by the surgical operator through, for example, a touch screen, a mouse, etc. The input location information may be displayed on the display 60 while overlapping with the displayed X-ray image.

    [0032] The matcher 80 calculates a matching relationship between image pixels and 3D space coordinates, based on the coordinate systems of the first and second optical markers 22 and 32 obtained by the optical tracking device 50, the reference coordinate system of the reference optical marker 40 and the image captured by the detector 16. The matcher 80 may be embodied by a processor, a memory and software for performing an image processing process based on the coordinate systems.

    [0033] FIG. 3 is a flowchart of a matching method between 2D image pixels and 3D space coordinates according to an embodiment of the disclosure, and FIGS. 4 to 7 are schematic views for describing the matching method step by step according to an embodiment of the disclosure.

    [0034] With reference to FIGS. 3 to 7, operations of the medical imaging system and the matcher 80 disclosed in FIG. 2 will be described in detail.

    [0035] First, in the state that the C-arm source 14 and the detector 16 are in the first location, the source 14 emits an X-ray to a patient's lesion and the detector 16 obtains a first captured image (S1). Here, the first location refers to any location that is selectable by a surgical operator. Because an anterior-posterior (AP) image and a lateral-lateral (LL) image are needed during a surgical procedure, a location at which the AP image is obtainable is selected in this embodiment. However, an exact AP image is not required in this embodiment. Therefore, the first location may be any location at which an image approximating the AP image is obtainable, and the images captured at this location will be collectively called the AP image. At the first location, the X-ray emitted from the X-ray source 14, given as a point source, passes through the subject along the first X-ray path and reaches the detector 16, and the detector 16 obtains the first captured image, i.e., the AP image. The surgical operator does not have to capture a plurality of images to get the exact AP location, and therefore ease of use is significantly improved.

    [0036] The matcher 80 obtains a first projected image by projecting the first captured image back to the first plate 20 along the X-ray path (S2). Here, the first ball markers located on the first captured image and the first ball markers located on the first plate 20 are matched as ‘reference points’, thereby obtaining the first projected image.

    [0037] Because the path of the first X-ray emitted from the X-ray source 14, given as the point source, expands in the form of a cone, a well-known image warping algorithm may be applied to the first captured image so as to obtain the first projected image based on nonlinear transformation. Here, the locations of the first ball markers on the first plate 20, which lies in the same plane as the first projected image, and the locations of the first ball markers formed on the first captured image are matched as the reference points, and the other pixels on the first captured image are subjected to linear or nonlinear interpolation, thereby obtaining the first projected image.

    [0038] For example, as disclosed in FIG. 4, a transformation matrix is obtained with reference to the first ball markers of the first plate 20, using the ‘rigid body landmark transform’ (see “Closed-form solution of absolute orientation using unit quaternions” by Berthold K. P. Horn, Journal of the Optical Society of America A, 4:629-642), and data interpolation and smoothing are performed based on a spline curve using the ‘thin plate spline transform’ (see “Splines minimizing rotation invariant semi-norms in Sobolev spaces” by J. Duchon, in Constructive Theory of Functions of Several Variables, Oberwolfach 1976), thereby transforming the first captured image, i.e., the AP image, into the first projected image.
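    As a rough illustration of the thin-plate-spline step (a minimal 2D numpy sketch; the control points, function names, and the affine test mapping are illustrative assumptions, not from the disclosure), the ball-marker locations detected on the captured image serve as control points pinned to their known locations on the plate, and all other pixels are interpolated smoothly:

```python
import numpy as np

def tps_kernel(r):
    """Thin-plate-spline radial kernel U(r) = r^2 log r, with U(0) = 0."""
    out = np.zeros_like(r)
    nz = r > 0
    out[nz] = r[nz] ** 2 * np.log(r[nz])
    return out

def tps_fit(src, dst):
    """Fit a 2D thin-plate spline mapping src control points onto dst."""
    n = src.shape[0]
    K = tps_kernel(np.linalg.norm(src[:, None] - src[None, :], axis=-1))
    P = np.hstack([np.ones((n, 1)), src])
    A = np.zeros((n + 3, n + 3))
    A[:n, :n] = K
    A[:n, n:] = P
    A[n:, :n] = P.T
    b = np.zeros((n + 3, 2))
    b[:n] = dst
    return np.linalg.solve(A, b)   # n warp weights + 3 affine terms, per axis

def tps_apply(params, src, pts):
    """Evaluate the fitted spline at query points pts."""
    K = tps_kernel(np.linalg.norm(pts[:, None] - src[None, :], axis=-1))
    P = np.hstack([np.ones((len(pts), 1)), pts])
    return np.hstack([K, P]) @ params

# Illustrative check: marker locations on the captured image (src) pinned
# to locations on the plate (dst); here the warp is a pure affine map so
# the fit can be verified exactly at a fresh query point.
src = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0], [0.5, 0.3]])
dst = src * 2.0 + np.array([1.0, -1.0])
params = tps_fit(src, dst)
warped = tps_apply(params, src, np.array([[0.25, 0.75]]))
```

    In practice `pts` would be the full pixel grid of the captured image, and the interpolated locations would be resampled to form the projected image.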

    [0039] By the transformation matrix, which defines a transformation relationship between the first captured image and the first projected image, a certain pixel of the first captured image may be transformed into a pixel of the first projected image, and the pixel of the first projected image may be transformed into coordinates on the medical spatial reference coordinate system by means of the first optical marker 22 of the first plate 20.

    [0040] Next, the matcher 80 obtains the location of the C-arm source 14 with regard to the first captured image (hereinafter, referred to as a ‘first source location’), based on a relationship between the second ball markers located on the first projected image and the second ball markers on the second plate 30 (S3). Here, instead of the first captured image, of which location information is unknown, the first projected image, of which 3D location coordinates are known through the first optical marker 22, is used as the reference.

    [0041] Referring to FIG. 5, the second ball markers on the first captured image are warped onto the first projected image, and then the locations of the second ball markers on the first projected image are connected to the locations of the second ball markers on the second plate 30, thereby obtaining the first source location S.sub.AP at the point where the extended lines intersect. Because the location information of the first projected image and the location information of the second plate 30 are obtained by the optical tracking device 50, it is possible to calculate the first source location S.sub.AP.
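    This step can be sketched as a least-squares intersection of the marker-to-shadow lines (a numpy sketch under an idealized geometry; the coordinates and plate planes below are invented for illustration): each second ball marker and its warped shadow on the first projected image define a line in the reference coordinate system, and the point nearest all such lines estimates the source location.

```python
import numpy as np

def nearest_point_to_lines(points, dirs):
    """Least-squares point minimizing distance to a set of 3D lines,
    each given by a point on the line and a direction."""
    d = dirs / np.linalg.norm(dirs, axis=1, keepdims=True)
    # Projector onto the plane orthogonal to each line direction.
    M = np.eye(3)[None, :, :] - d[:, :, None] * d[:, None, :]
    A = M.sum(axis=0)
    b = np.einsum('nij,nj->i', M, points)
    return np.linalg.solve(A, b)

# Illustrative setup: the true source, three second ball markers on the
# second plate (z = 300), and their shadows on the first-plate plane
# (z = 0) as they would appear on the first projected image.
source_true = np.array([0.0, 0.0, 1000.0])
markers = np.array([[10.0, 5.0, 300.0],
                    [-20.0, 8.0, 300.0],
                    [4.0, -15.0, 300.0]])
t = 1000.0 / 700.0                       # ray parameter reaching z = 0
shadows = source_true + t * (markers - source_true)
# Each marker-shadow pair defines a line; extended back, all lines meet
# at the source.
s_ap = nearest_point_to_lines(markers, shadows - markers)
```

    With noisy marker detections the lines no longer meet exactly, which is why a least-squares formulation rather than a pairwise intersection is the natural choice.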

    [0042] Next, in the state that the C-arm source 14 and the detector 16 are in the second location, the source 14 emits the X-ray to the patient's lesion and the detector 16 obtains a second captured image (S4). Here, the second location refers to any location that is selectable by a surgical operator, and is selected in this embodiment as a location for obtaining the LL image. However, an exact LL image is not required in this embodiment. Therefore, the second location may be any location at which an image approximating the LL image is obtainable, and the images captured at the second location will be collectively called the LL image.

    [0043] The matcher 80 obtains a second projected image by projecting the second captured image back to the first plate 20 along the second X-ray path (S5). As in operation S2, the plurality of first ball markers located on the second captured image and the first ball markers on the first plate 20 are matched as the ‘reference points’, thereby obtaining the second projected image, in which the well-known image warping algorithm may be employed.

    [0044] The matcher 80 obtains the location of the C-arm source 14 corresponding to the second location (hereinafter, referred to as a ‘second source location’), based on a location relationship between the second ball markers located on the second projected image and the second ball markers on the second plate 30 (S6).

    [0045] FIG. 6 shows the space in which the first X-ray, conically extended from the first source location S.sub.AP to the first plate 20, and the second X-ray, conically extended from the second source location S.sub.LL to the first plate 20, overlap. This overlapped space (hereinafter, referred to as an ‘imaging space’) is the 3D imaging space in which the pixels of interest on the first and second projected images, and the subject itself, are located.

    [0046] Next, the matcher 80 matches the coordinates in the imaging space and the pixels on the captured image (S7).

    [0047] As disclosed in FIG. 7, certain coordinates P in the imaging space are matched to a first pixel P.sub.APVD, at which P is projected onto the first projected image by the first X-ray, and to a second pixel P.sub.LLVD, at which P is projected onto the second projected image by the second X-ray. The first and second pixels P.sub.APVD and P.sub.LLVD on the first and second projected images are then matched to pixels P.sub.APimg and P.sub.LLimg on the first and second captured images by the inverse warping matrices T.sub.APwarp.sup.−1 and T.sub.LLwarp.sup.−1, and displayed on the display 60.
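    The forward matching in this step reduces to intersecting the ray from the source through a spatial point with the plate plane (a minimal numpy sketch; the plane orientation and all coordinates are illustrative assumptions, not from the disclosure):

```python
import numpy as np

def project_to_plane(source, point, plane_point, plane_normal):
    """Intersect the ray from `source` through `point` with the plane
    given by a point on it and its normal."""
    d = point - source
    t = np.dot(plane_point - source, plane_normal) / np.dot(d, plane_normal)
    return source + t * d

# Illustrative geometry: the first-plate plane is z = 0 and the AP
# source sits 1000 units above it.
s_ap = np.array([0.0, 0.0, 1000.0])
P = np.array([30.0, -10.0, 400.0])        # a point in the imaging space
plane_point = np.array([0.0, 0.0, 0.0])
plane_normal = np.array([0.0, 0.0, 1.0])
p_apvd = project_to_plane(s_ap, P, plane_point, plane_normal)
# p_apvd is the location of P on the first projected image; the inverse
# warp would then carry it back to a pixel on the captured AP image.
```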

    [0048] Here, a certain point in the imaging space may, for example, indicate the location of the tip of a surgical tool. The optical tracking device 50 may obtain the coordinates, i.e., location information, of the tip by recognizing a third optical marker of the surgical tool, and the matcher 80 may calculate the location of the tip on the first and second captured images based on the coordinates of the tip so as to display the location of the tip on the display 60. Therefore, the surgical operator can use a tracking function and a navigation function with the location of the surgical tool being displayed on two given 2D images in real time.

    [0049] FIG. 8 shows how a first interest pixel P.sub.APimg on the first captured image and a corresponding second interest pixel P.sub.LLimg3 on the second captured image are warped onto the first and second projected images and back-projected into the imaging space, forming an intersection at certain coordinates P in the imaging space.

    [0050] Specifically, a first line P.sub.in2-P.sub.in1 intersecting the imaging space is formed by back-projecting the first pixel P.sub.APVD on the first projected image, which corresponds to the first interest pixel P.sub.APimg on the first captured image, toward the source 14 along the first X-ray path. The first line P.sub.in2-P.sub.in1 in the imaging space is matched to a line P.sub.in4-P.sub.in3 on the second projected image, and matched again, by the inverse warping T.sub.LLwarp.sup.−1, to a line P.sub.LLimg1-P.sub.LLimg2 on the second captured image, so that the surgical operator can select a second interest pixel P.sub.LLimg3 among the pixels on the matched line P.sub.LLimg1-P.sub.LLimg2. The second interest pixel P.sub.LLimg3 is warped onto the second projected image and matched to a pixel P.sub.LLVD, from which the coordinates of the intersection P between the back-projections of the first interest pixel P.sub.APimg and the second interest pixel P.sub.LLimg3 in the imaging space are calculated.
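    The intersection of the two back-projected lines can be sketched as the closest-approach midpoint of two rays (a numpy sketch; in practice the two lines rarely intersect exactly, so the midpoint of their common perpendicular is a natural estimate of P — the coordinates below are invented for illustration):

```python
import numpy as np

def ray_intersection(p1, d1, p2, d2):
    """Midpoint of the closest approach between two 3D lines, each
    given by a point and a direction."""
    d1 = d1 / np.linalg.norm(d1)
    d2 = d2 / np.linalg.norm(d2)
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
    w = p1 - p2
    denom = a * c - b * b               # zero only for parallel lines
    t1 = (b * (d2 @ w) - c * (d1 @ w)) / denom
    t2 = (a * (d2 @ w) - b * (d1 @ w)) / denom
    return 0.5 * ((p1 + t1 * d1) + (p2 + t2 * d2))

# Illustrative check: two source locations and rays through a common
# point; the closest-approach midpoint recovers that point exactly.
P_true = np.array([10.0, 20.0, 30.0])
s_ap = np.array([0.0, 0.0, 1000.0])
s_ll = np.array([800.0, 0.0, 500.0])
P_est = ray_intersection(s_ap, P_true - s_ap, s_ll, P_true - s_ll)
```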

    [0051] The first interest pixel P.sub.APimg and the second interest pixel P.sub.LLimg3 may be selected by the surgical operator as points identified as the same feature point of the subject. Therefore, it will be understood that the surgical operator can select a desired point as the first interest pixel P.sub.APimg and the second interest pixel P.sub.LLimg3 on the first and second captured images through the user interface 70, and this selected point may be transformed into spatial coordinates and provided as surgical planning information to a medical robot or the like.

    [0052] Although a few embodiments of the disclosure have been described, it will be understood by a person having ordinary knowledge in the art to which the disclosure pertains that changes or replacements can be made in the embodiments of the disclosure without departing from the technical scope of the disclosure.

    [0053] For example, the locations where the first plate 20 and the second plate 30 are stationarily installed may be changeable within a range of intersecting the X-ray path, and the shapes and patterns of the markers on the first and second plates 20 and 30 may be selected within a range for achieving the same function and purpose.

    [0054] Further, the matching steps of the flowchart shown in FIG. 3 may be performed in a different order, except where a logical order between them is required. Two images are employed in the foregoing embodiments, but more images may be used in the matching.

    [0055] Therefore, it is appreciated that the foregoing embodiments of the disclosure are for illustrative purposes only, and the scope of the disclosure is defined by the technical concept of the appended claims and their equivalents.