2D/3D Registration
20170243361 · 2017-08-24
Inventors
CPC classification
A61B6/5229
HUMAN NECESSITIES
A61B6/5211
HUMAN NECESSITIES
G06T11/006
PHYSICS
A61B6/5235
HUMAN NECESSITIES
G06T2207/20016
PHYSICS
G06T11/005
PHYSICS
International classification
G06T19/00
PHYSICS
A61B6/00
HUMAN NECESSITIES
G06T7/246
PHYSICS
Abstract
A method includes, following specification of an initial transformation as a test transformation that is to be optimized, determining a 2D gradient x-ray image and a 3D gradient dataset of the image dataset, carrying out, for each image element of the gradient comparison image, a check for selection as a contour point, and determining an environment best corresponding to a local environment of the contour point and extending around a comparison point in the gradient x-ray image for all contour points in the at least one gradient comparison image. Local 2D displacement information is determined by comparing the contour points with the associated comparison points, and motion parameters of a 3D motion model describing a movement of the target region between the acquisition of the image dataset and the x-ray image are determined from the displacement information and a registration transformation describing the registration.
Claims
1. A method for registering a three-dimensional (3D) image dataset of a target region of a patient with a two-dimensional (2D) x-ray image of the target region recorded in an acquisition geometry, the method comprising: following specification of an initial transformation as a test transformation that is to be optimized: determining a 2D gradient x-ray image and a 3D gradient dataset of the image dataset; determining at least one 2D gradient comparison image forward-projected in the acquisition geometry of the x-ray image using the test transformation from the 3D gradient dataset after the determining of the 2D gradient x-ray image and the 3D gradient dataset of the image dataset, wherein depth information describing a depth in a radiation direction of a central beam of the acquisition geometry is associated in the at least one 2D gradient comparison image with gradients described two-dimensionally in an image plane; carrying out a check for each image element of the at least one 2D gradient comparison image to determine whether the gradient described two-dimensionally in the image plane satisfies a selection criterion indicating the presence of a contour in the image element, wherein all image elements satisfying the selection criterion are selected as contour points; determining an environment that best corresponds to a local environment of the contour point based on a comparison measure, and extends around a comparison point in the gradient x-ray image for all contour points in the at least one gradient comparison image from a plurality of environments extending around test points; determining local 2D displacement information, the determining of the local 2D displacement information comprising comparing the contour points with the associated comparison points; and determining motion parameters of a 3D motion model that describe a movement of the target region between the acquisition of the image dataset and the x-ray image from the local 2D displacement 
information and a registration transformation describing the registration by correcting the test transformation based on the motion parameters.
2. The method of claim 1, further comprising determining a plurality of gradient comparison images for different depth intervals of the 3D image dataset in the radiation direction of the central beam of the acquisition geometry, wherein the respective depth interval is assigned to the at least one 2D gradient comparison image as depth information, for each of the at least one 2D gradient comparison image, the depth position in relation to the radiation direction in the depth interval covered by the at least one 2D gradient comparison image, at which position a maximum contribution to the gradient occurred during the forward projection, is assigned as depth information to each gradient described two-dimensionally in the image plane, or a combination thereof.
3. The method of claim 2, wherein the at least one 2D gradient comparison image is determined by applying a render algorithm to the gradient dataset, and wherein one result channel of the render algorithm contains a normalized angular value as a direction indication of the gradient in the image plane, and a further result channel contains a measure for a magnitude of the gradient in the image plane.
4. The method of claim 3, wherein during determination of the depth position as depth information, a third result channel that contains the depth position normalized by the extension of the covered depth interval is used.
5. The method of claim 3, wherein the render algorithm is executed by a GPU.
6. The method of claim 1, wherein the selection criterion comprises a threshold value comparison for the magnitude of the gradient described two-dimensionally in the image plane.
7. The method of claim 1, wherein the 2D gradient x-ray image and the at least one 2D gradient comparison image are initially determined in a coarser resolution with larger image elements than the x-ray image, and wherein the determination of the motion parameters and the correction of the test transformation are repeated successively to at least provide a better resolution.
8. The method of claim 7, wherein the determination of the motion parameters and the correction of the test transformation are repeated successively to at least provide a resolution up to a resolution of the 2D x-ray image.
9. The method of claim 7, wherein the gradients of a larger image element formed by a plurality of image elements of a size of the image elements of the 2D x-ray image are determined at least for the 2D gradient x-ray image from a structure tensor of the larger image element.
10. The method of claim 1, wherein the method is repeated iteratively with the registration transformation as a new test transformation until an abort criterion is satisfied.
11. The method of claim 10, wherein the abort criterion is an undershooting of a threshold value for a norm of the motion parameters, the exceeding of a predetermined number of iteration steps, or a combination thereof.
12. The method of claim 1, wherein a coarse transformation determined from a known position of the patient relative to the x-ray device acquiring the x-ray image, a coarse transformation determined from a registration process using forward-projected virtual x-ray images and performed at a coarser resolution, a coarse transformation having an accuracy of less than 20 mm with respect to translations, less than 10° with respect to rotations, or a combination thereof is used as a first used initial transformation.
13. The method of claim 1, further comprising acquiring a time series of x-ray images, wherein a registration is performed for each x-ray image of the time series of x-ray images, and the registration transformation determined for the previously acquired x-ray image is used in each case as the initial transformation.
14. The method of claim 11, wherein, when a registration transformation determined for an x-ray image acquired immediately beforehand is used as the initial transformation, a smaller environment, a smaller number of test points, or a combination thereof is chosen than in the case of any other initial transformation.
15. The method of claim 1, wherein in addition to a base test point corresponding to the contour point, test points are chosen along the direction of the gradient projected onto the image plane in the starting point associated with the contour point.
16. The method of claim 1, further comprising: determining comparison measures for a specific number of test points corresponding to one image element in each case; and choosing the test point associated with the comparison measure indicating a greatest correspondence as the comparison point or choosing the test points by an optimization algorithm maximizing the comparison measure and operating along the direction of the projected three-dimensional gradient in the starting point associated with the contour point.
17. The method of claim 1, wherein the test points, the number of test points, or the test points and the number of test points to be used are determined as a function of deviation information describing a degree of a present deviation of the test transformation from reality.
18. The method of claim 17, wherein motion parameters of at least one registration process in relation to an x-ray image acquired at an earlier point in time, prediction information derived therefrom, or a combination thereof is used as deviation information.
19. The method of claim 1, wherein the associated 2D displacement information is discarded if a minimum value for the comparison measure at the comparison point is undershot.
20. The method of claim 1, wherein the gradient correlation is used as the comparison measure.
21. The method of claim 1, wherein a motion model permitting a rotation and a translation and describing a rigid movement of all reference structures described by 3D starting points associated with contour points, in relation to individual reference structures, depth intervals, or a combination thereof is used as the motion model.
22. The method of claim 1, wherein, in order to determine the motion parameters, for each pair of a contour point and the starting point associated with the contour point in the 3D image dataset, which starting point is determinable taking into account the depth information, a target plane is determined, taking into account the displacement information describing the observable component of the movement, in which target plane the three-dimensionally displaced starting point, the comparison point, and the focal point of the radiation source in the acquisition geometry are located, the focal point being chosen as the point of origin of the coordinate system, whereupon an equation system is formed by setting to zero the scalar products of the respective normal vectors of the target planes with a vector that describes the starting point displaced three-dimensionally due to the movement and contains the motion parameters, and the equation system is solved for the motion parameters.
23. The method of claim 22, wherein the normal vector of the respective target planes is formed as a cross product of a vector that is formed as the cross product of the gradient vector of the gradient dataset standing perpendicularly on the course of the contour in the 3D image dataset at the starting point with the vector describing the 3D position of the starting point in the 3D image dataset or of the gradient vector of the at least one 2D gradient comparison image at the contour point with the vector describing the 3D position of the contour point, with the vector describing the 3D position of the comparison point.
24. The method of claim 22, wherein a rotational component of the movement is assumed as linear in order to determine a linear equation system.
25. The method of claim 22, wherein the equation system is solved by applying an iterative, reweighted optimization algorithm that employs the least squares method, by minimizing the sum of the terms of the equations for the contour points, each term being provided with a weighting factor and yielding zero.
26. The method of claim 25, wherein the weighting factors are determined as a function of the comparison measure, as the comparison measure for the respective comparison point, or a combination thereof.
27. The method of claim 26, wherein the weighting factors are updated during the iteration steps as a product of the original weighting factor with a residual confidence.
28. The method of claim 1, wherein depth information known based on subregions is taken into account in the determination of the motion parameters, in the formulation of the motion model, or in the determination of the motion parameters and in the formulation of the motion model.
29. A controller of an x-ray device for registering a three-dimensional (3D) image dataset of a target region of a patient with a two-dimensional (2D) x-ray image of the target region recorded in an acquisition geometry, the controller comprising: a processor configured to: following specification of an initial transformation as a test transformation that is to be optimized: determine a 2D gradient x-ray image and a 3D gradient dataset of the image dataset; determine at least one 2D gradient comparison image forward-projected in the acquisition geometry of the x-ray image using the test transformation from the 3D gradient dataset after the determining of the 2D gradient x-ray image and the 3D gradient dataset of the image dataset, wherein depth information describing a depth in a radiation direction of a central beam of the acquisition geometry is associated in the at least one 2D gradient comparison image with gradients described two-dimensionally in an image plane; carry out a check for each image element of the at least one 2D gradient comparison image to determine whether the gradient described two-dimensionally in the image plane satisfies a selection criterion indicating the presence of a contour in the image element, wherein all image elements satisfying the selection criterion are selected as contour points; determine an environment that best corresponds to a local environment of the contour point based on a comparison measure, and extends around a comparison point in the gradient x-ray image for all contour points in the at least one gradient comparison image from a plurality of environments extending around test points; determine local 2D displacement information, the determining of the local 2D displacement information comprising comparing the contour points with the associated comparison points; and determine motion parameters of a 3D motion model that describe a movement of the target region between the acquisition of the image dataset and the x-ray 
image from the local two-dimensional displacement information and a registration transformation describing the registration by correcting the test transformation based on the motion parameters.
30. In a non-transitory computer-readable storage medium storing instructions executable by a computing device to register a three-dimensional (3D) image dataset of a target region of a patient with a two-dimensional (2D) x-ray image of the target region recorded in an acquisition geometry, the instructions comprising: following specification of an initial transformation as a test transformation that is to be optimized: determining a 2D gradient x-ray image and a 3D gradient dataset of the image dataset; determining at least one 2D gradient comparison image forward-projected in the acquisition geometry of the x-ray image using the test transformation from the 3D gradient dataset after the determining of the 2D gradient x-ray image and the 3D gradient dataset of the image dataset, wherein depth information describing a depth in a radiation direction of a central beam of the acquisition geometry is associated in the at least one 2D gradient comparison image with gradients described two-dimensionally in an image plane; carrying out a check for each image element of the at least one 2D gradient comparison image to determine whether the gradient described two-dimensionally in the image plane satisfies a selection criterion indicating the presence of a contour in the image element, wherein all image elements satisfying the selection criterion are selected as contour points; determining an environment that best corresponds to a local environment of the contour point based on a comparison measure, and extends around a comparison point in the gradient x-ray image for all contour points in the at least one gradient comparison image from a plurality of environments extending around test points; determining local 2D displacement information, the determining of the local 2D displacement information comprising comparing the contour points with the associated comparison points; and determining motion parameters of a 3D motion model that describe a movement of the target region 
between the acquisition of the image dataset and the x-ray image from the local two-dimensional displacement information and a registration transformation describing the registration by correcting the test transformation based on the motion parameters.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0064]
[0065]
[0066]
[0067]
[0068]
[0069]
DETAILED DESCRIPTION
[0070]
[0071] In the present case the x-ray image 2 is the first of a series of x-ray images that were acquired for the purpose of monitoring a medical intervention by means of fluoroscopic imaging. The information 3 is sourced from a preoperative three-dimensional image dataset, for example a CT image dataset and/or a magnetic resonance image dataset.
[0072]
[0073] Once the contour points have been selected and the corresponding starting points determined, a 3D/2D tracking is performed in step S2 using the gradient correlation in specific environments around the contour points lying in the image plane. Two-dimensional displacements are determined by finding those environments, each associated with a comparison point in the gradient x-ray image containing the spatial gradients of the x-ray image, that represent the best correspondence with the environment of the contour point in a forward-projected gradient comparison image. Step S2 may thus be referred to as a “2D displacement measurement”.
[0074] The positions of the comparison points determined in step S2, which, in conjunction with the known positions of the contour points, reproduce the two-dimensional displacement information, are used in step S3 to determine, in a robust manner, the three-dimensional motion underlying the displacement of the contour points. A point-to-plane correspondence is used in this process, and an equation system is determined and solved by means of which the non-observable components of the three-dimensional motion may also be determined. In the first pass through steps S2 and S3, the motion parameters determined in this way therefore provide a means of correcting the initial transformation adopted as the test transformation. The test transformation corrected with the aid of the motion parameters, which relate to a specific motion model, in this case a rigid motion model, forms the determined registration transformation. In an exemplary embodiment, it is also possible (at least in a first determination) to work with a plurality of increasing resolutions, which are used successively in order to refine the registration, thereby contributing to the robustness of the method.
[0075] However, the method is also performed iteratively in order to realize further improvements. Accordingly, in step S4, a check is carried out against an abort criterion, for example a maximum number of iteration steps and/or an extremely small change in the registration transformation relative to the test transformation, in other words an abort criterion indicating convergence. If the abort criterion is satisfied, the most recently determined registration transformation is taken as the definitive registration transformation 5, which means that the 2D/3D registration is then completed. If this is not the case, however, the 3D/2D tracking is performed again in step S2 and motion parameters are determined again in step S3, using the determined registration transformation as the new test transformation, in order to find a more accurate registration transformation. It has been shown that, in the exemplary embodiment illustrated here, in spite of an only very roughly estimated initial transformation, which may be derived for example from a known position of the patient in relation to the x-ray device recording the x-ray image, it is possible to arrive at an accurate, valid 2D/3D registration in only 5 to 10 iteration steps.
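The iterative structure of steps S2 to S4 can be sketched as follows. This is a minimal, hypothetical Python sketch; the names register, estimate_motion, and apply_motion are illustrative only, and the abort criterion shown combines the motion-parameter norm threshold with the maximum number of iteration steps mentioned above.

```python
# Hypothetical sketch of the outer loop over steps S2-S4: repeat the
# displacement measurement and motion estimation until the norm of the
# motion parameters falls below a threshold (convergence) or a maximum
# number of iteration steps is exceeded.
import numpy as np

def register(test_transform, estimate_motion, apply_motion,
             tol=1e-4, max_iter=10):
    """estimate_motion(T) -> motion-parameter vector (steps S2 + S3);
    apply_motion(T, p)   -> test transformation corrected by p."""
    for _ in range(max_iter):
        p = estimate_motion(test_transform)
        test_transform = apply_motion(test_transform, p)
        if np.linalg.norm(p) < tol:   # abort criterion indicating convergence
            break
    return test_transform
```

In a real implementation, estimate_motion would encapsulate the 2D displacement measurement and the point-to-plane motion estimation described below in the text.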
[0076] Steps S1 to S3, and therefore the actual workflow sequence of the exemplary embodiment of the method will now be explained in detail hereinbelow.
[0077] Once the initial transformation has been determined, it is necessary in the first instance to determine contour points. These lie on contours which also exist in the three-dimensional image dataset, which for example may therefore correspond to contours of the anatomical structures 4 that are visible in the three-dimensional image dataset.
[0078] For this purpose, a spatial gradient dataset of the three-dimensional image dataset is first determined, that is to say a dataset in which each voxel of the three-dimensional image dataset is assigned a corresponding three-dimensional gradient vector. While it is conceivable in principle to determine a single gradient comparison image by gradient projection according to the gradient projection theorem (cf. the article by W. Wein et al. or formula (1)), this exemplary embodiment avoids the overlapping of structures, in particular reference structures, at different depths along the central beam of the acquisition geometry by defining subregions along the central beam. A depth-aware gradient projection is performed within the meaning of the known concept of depth intervals (cf. the article by J. Wang et al.), so that a plurality of projected gradient comparison images is generated, each corresponding to one depth interval. The resulting stack of gradient comparison images may be denoted as {∇I_d^proj}, where d is the depth index and accordingly denotes a depth interval. The depth index is assigned as depth information to the gradient comparison image.
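The depth-aware gradient projection described above can be illustrated with a simplified sketch. The following assumes a parallel projection along the depth axis in place of the true cone-beam forward projection, and all function names are illustrative rather than taken from the patent:

```python
# Simplified sketch: split the 3D gradient dataset into depth intervals
# along the central-beam axis and produce one projected 2D gradient
# comparison image per interval.  A parallel projection stands in for
# the cone-beam geometry of the actual acquisition.
import numpy as np

def depth_aware_gradient_projection(volume, n_intervals=4):
    """volume: (D, H, W) scalar dataset; returns a list of (H, W, 2)
    projected gradient images, one per depth interval."""
    gz, gy, gx = np.gradient(volume.astype(float))   # 3D gradient dataset
    stack = []
    bounds = np.linspace(0, volume.shape[0], n_intervals + 1).astype(int)
    for d in range(n_intervals):
        sl = slice(bounds[d], bounds[d + 1])
        # project only the in-plane (y, x) gradient components
        proj = np.stack([gy[sl].sum(axis=0), gx[sl].sum(axis=0)], axis=-1)
        stack.append(proj)
    return stack
```

With narrow intervals, structures at different depths no longer overlap in any single comparison image, which is the stated purpose of the depth-interval concept.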
[0079] In the present case, the gradient comparison images are determined in a single step by using a render algorithm on a GPU (graphics processing unit). Three normalized result channels are used for this purpose: the first contains the direction of the gradient described two-dimensionally in the image plane, the second contains the magnitude thereof, and the third contains an additional, further piece of depth information, namely the normalized depth position within the depth interval at which the greatest contribution to the gradient of the gradient comparison image described two-dimensionally in the image plane was present during the forward projection of the gradient image. In practice, therefore, the result of the render operation may be an output texture of the type
frag_r = α / 2π,
frag_g = ‖g‖⁻¹,
frag_b = d_m,

wherein the normalized angle α of the gradient direction is yielded as a radian value divided by 2π, the inverse normalized gradient magnitude is yielded as ‖g‖⁻¹, and d_m is the depth position normalized by way of the maximum depth. This depth-aware gradient render operation generates the projected gradient together with a depth estimate in a single render step on a GPU.
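A non-authoritative CPU sketch of the three result channels follows, computed here with NumPy rather than a GPU render pass; the exact normalization of the inverse gradient magnitude is an assumption drawn from the (garbled) formula above.

```python
# Sketch of the three-channel output texture for one depth interval:
# frag_r = normalized gradient angle, frag_g = inverse gradient
# magnitude, frag_b = normalized depth of the maximum contribution.
import numpy as np

def render_channels(gy, gx, contrib):
    """gy, gx: (D, H, W) in-plane gradient components per depth slice;
    contrib: (D, H, W) per-slice contribution to the projected gradient.
    Returns three (H, W) channels."""
    sy, sx = gy.sum(axis=0), gx.sum(axis=0)
    mag = np.hypot(sx, sy)
    # channel 1: gradient angle in [0, 2π), normalized by 2π
    frag_r = (np.arctan2(sy, sx) % (2 * np.pi)) / (2 * np.pi)
    # channel 2: inverse gradient magnitude (0 where no gradient exists)
    frag_g = np.where(mag > 0, 1.0 / np.maximum(mag, 1e-12), 0.0)
    # channel 3: depth slice of maximum contribution, normalized
    d_max = contrib.argmax(axis=0)
    frag_b = d_max / max(contrib.shape[0] - 1, 1)
    return frag_r, frag_g, frag_b
```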
[0080] The contour points are now selected in the gradient comparison image by applying a selection criterion to each image element in order to find prominent gradients that correspond to a contour. In a simple yet practical option, the selection criterion may comprise a threshold value comparison, such that gradients whose magnitude exceeds a threshold value satisfy the selection criterion. Further aspects of the selection criterion may, however, also relate to considerations of the environment, for example in order to identify longer edges/contours that are associated, for example, with an anatomical structure 4 lying within the subregion defined by the depth interval.
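The threshold-based selection criterion can be sketched as follows (the function name is illustrative; a real implementation could add the environment-based checks mentioned above):

```python
# Sketch of contour-point selection: every image element whose in-plane
# gradient magnitude exceeds a threshold satisfies the selection
# criterion and is selected as a contour point.
import numpy as np

def select_contour_points(magnitude, threshold):
    """magnitude: (H, W) gradient magnitudes of one gradient comparison
    image.  Returns the (row, col) indices of the contour points."""
    rows, cols = np.nonzero(magnitude > threshold)
    return list(zip(rows.tolist(), cols.tolist()))
```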
[0081] If coarser resolutions are used initially, in which a plurality of smaller image elements at the resolution of the x-ray image are combined into one larger image element (field or “patch”), the gradient direction and the gradient magnitude of such a patch are in the present case defined by means of the structure tensor at the center point, in other words the central smaller image element, of the larger image element. The eigenvectors and the eigenvalues of the structure tensor are used in order to determine the gradient direction.
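A sketch of the structure-tensor computation for a patch follows. The unweighted sum over the patch is an assumption (a Gaussian weighting is also common); the eigenvector belonging to the dominant eigenvalue gives the patch gradient direction.

```python
# Sketch: the structure tensor of a patch is the sum over the patch of
# the outer products of the per-pixel gradients; its dominant
# eigenvector gives the patch gradient direction and the square root of
# the dominant eigenvalue a magnitude measure.
import numpy as np

def patch_gradient_from_structure_tensor(gy, gx):
    """gy, gx: per-pixel gradient components of one patch (2D arrays).
    Returns a unit direction (x, y) and a magnitude measure."""
    J = np.array([[np.sum(gx * gx), np.sum(gx * gy)],
                  [np.sum(gx * gy), np.sum(gy * gy)]])
    eigvals, eigvecs = np.linalg.eigh(J)     # eigenvalues in ascending order
    direction = eigvecs[:, -1]               # dominant eigenvector, (x, y)
    magnitude = np.sqrt(eigvals[-1])
    return direction, magnitude
```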
[0082] The object of step S2 is now to establish whether the point in the real x-ray image corresponding to the contour point actually lies at the position of the contour point or has been displaced, and if so, by how much. For this purpose, a local gradient-based comparison is conducted.
[0083]
[0084] Around the contour point 7 there is drawn an environment 8, the size of which permits a useful local comparison and which is beneficially chosen as a function of deviation information describing a presumed deviation of the test transformation from the valid registration transformation; for example, there may be a variation between 5×5 image elements and 20×20 image elements in a 512×512 image. In later iteration steps, i.e. repetitions of steps S2 and S3, it is of course possible in the event of convergence to start from smaller deviations. A conclusion may possibly also be drawn in respect of how well the initial transformation applies. If, for example, it is simply a rough estimate based on the known position of the patient in the x-ray device, a greater potential deviation is likely than if, in the case of a series of x-ray images, the initial transformation is derived from the finally determined registration transformation 5 of the most recently acquired x-ray image. In a multi-resolution scheme, the current resolution level may also influence the choice of the size of the environment 8, since smaller environments may be chosen at coarser resolutions where appropriate. This is not absolutely necessary, however.
[0085] The point in the gradient x-ray image 9 whose position corresponds to the contour point 7 will be referred to in the following as the base test point 11, cf. arrow 10; the gradient x-ray image 9 contains, for each pixel of the x-ray image, the spatial (in this case two-dimensional) gradient at that pixel. A search direction 12 is also defined in the gradient x-ray image 9 as the projected direction of the three-dimensional gradient at the starting point. Since (narrow) depth intervals are used and the gradient comparison image 6 relates to only one of said depth intervals, the gradient described two-dimensionally in the image plane, i.e. the first result channel, indicates this direction sufficiently accurately and may be used directly. In a test range or search area along the search direction, test points 13, each corresponding to a pixel, are determined as candidate positions; the number of test points (and hence the search area) may also be made dependent on the already mentioned deviation information. In the present case, four further test points 13 are shown by way of example on either side of the base test point 11; it goes without saying that other numbers may be used in a real-world implementation.
[0086] An environment 14 corresponding in size to the environment 8 may now also be defined for each of said test points 11, 13.
[0087] The gradient correlation (cf. once again the article by W. Wein et al., as well as equation (3)) is now determined as a comparison measure between the environment 8 and each of the environments 14, of which only two are shown in
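Gradient correlation is commonly computed as the mean of the normalized cross-correlations of the two gradient components; the following sketch reflects that common definition and is not taken verbatim from the cited article.

```python
# Sketch of the gradient correlation comparison measure between the
# environment around a contour point and the environment around a test
# point: the mean of the normalized cross-correlations (NCC) of the
# vertical and horizontal gradient components.
import numpy as np

def ncc(a, b):
    """Normalized cross-correlation of two equally shaped arrays."""
    a = a - a.mean()
    b = b - b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return (a * b).sum() / denom if denom > 0 else 0.0

def gradient_correlation(env_a, env_b):
    """env_a, env_b: (H, W, 2) gradient environments (gy, gx channels).
    Returns a value in [-1, 1]; 1 indicates perfect correspondence."""
    return 0.5 * (ncc(env_a[..., 0], env_b[..., 0]) +
                  ncc(env_a[..., 1], env_b[..., 1]))
```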
[0088] A valid comparison point 15 corresponds to the contour point 7 displaced under the influence of the movement that is now to be determined and that describes the error in the test transformation. The difference between the comparison point and the contour point 7 therefore indicates the two-dimensional observable displacement due to the movement.
[0089] The motion parameters, which are thus assigned to specific displacements by the 3D/2D tracking and describe the three-dimensional movement in a three-dimensional motion model (assumed here in simplified form as a rigid, common movement of all of the reference structures), will now be determined on the basis of a point-to-target-plane correspondence, as proposed by DE 10 2013 214 479 A1, in such a way that those components of the three-dimensional movement that are not observable at individual points may also be reconstructed.
[0090] This will now be explained in more detail with regard to
[0091] It should be pointed out at this juncture that, in order to simplify the illustration, the image plane 17 is shown here by way of example as lying at “1” in the corresponding coordinate. An x-ray detector or its actual position is not necessarily required to define this, because a rescaling is possible without difficulty. The calculations then become significantly simpler, also by virtue of the position of the focal point C, since the latter, as already explained hereinabove, is of course also part of the target plane π.
[0092] In the actual x-ray image, the comparison point 15 or P′ corresponds to the starting point W, which means that the movement describing the error in the test transformation has displaced the contour point P by the just-determined displacement dp to the comparison point, i.e. the displaced contour point P′.
[0093] The corresponding three-dimensional movement has displaced the starting point W in accordance with the vector dw to the displaced starting point W′. If a rigid movement with differential rotation is assumed, dw is yielded according to formula (7), the vectors contained therein likewise being shown in
[0094] The two-dimensional and three-dimensional movements are linked to one another by means of the target plane π, which is likewise shown in
[0095] The normal vector n of the target plane π is also depicted accordingly in
[0096] As described, the displacements determined during the 3D/2D tracking and the error in the test transformation described by the motion parameters are related by formula (6), which produces the equation system. The motion parameters, i.e. the rotation dω and the translation dt, must now be determined by means of the point-to-plane correspondence model given by equation (6), cf. in this regard also formulas (7) and (8). The equation system, which is linear by virtue of the assumption of the differential rotation, could be solved using conventional methods, for example a RANSAC method.
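Under the differential-rotation assumption, each point-to-plane constraint n·(W + dω×W + dt) = 0 is linear in (dω, dt), since n·(dω×W) = dω·(W×n). A plain least-squares sketch of the resulting system follows (names are illustrative; in practice a robust solver such as RANSAC or a reweighted scheme would replace plain least squares):

```python
# Sketch of the point-to-plane equation system: with the focal point at
# the origin, each target plane through the origin with normal n_i must
# contain the displaced starting point W_i' = W_i + dω × W_i + dt,
# giving the linear equation (W_i × n_i)·dω + n_i·dt = -n_i·W_i.
import numpy as np

def solve_motion(starting_points, normals):
    """starting_points, normals: (N, 3) arrays.  Returns (dω, dt)."""
    W = np.asarray(starting_points, float)
    n = np.asarray(normals, float)
    A = np.hstack([np.cross(W, n), n])        # (N, 6) coefficient matrix
    b = -np.einsum("ij,ij->i", n, W)          # right-hand side
    params, *_ = np.linalg.lstsq(A, b, rcond=None)
    return params[:3], params[3:]             # rotation dω, translation dt
```

With at least six well-conditioned point-to-plane correspondences, the six motion parameters, including the components not observable at individual points, can be recovered.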
[0097] In the present case, however, it is provided that use be made of the IRLS optimization scheme, as has been described with regard to equation (9), the weightings β_i being given, as described, by way of the gradient correlation and, in the different iteration steps, by means of the residual confidence.
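A sketch of an IRLS loop consistent with the description follows; the specific residual-confidence update 1/max(|r|, ε), and the names used, are assumptions rather than the patent's exact scheme.

```python
# Sketch of iteratively reweighted least squares (IRLS): solve the
# weighted least-squares problem repeatedly, updating each weight as the
# product of its original value (e.g. the gradient correlation of the
# match) with a residual confidence that downweights outliers.
import numpy as np

def irls(A, b, w0, n_iter=10, eps=1e-6):
    """A: (N, M) coefficient matrix; b: (N,) right-hand side;
    w0: (N,) initial non-negative per-equation weights."""
    w0 = np.asarray(w0, float)
    w = w0.copy()
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        sw = np.sqrt(w)
        x, *_ = np.linalg.lstsq(A * sw[:, None], b * sw, rcond=None)
        residual_conf = 1.0 / np.maximum(np.abs(A @ x - b), eps)
        w = w0 * residual_conf   # original weight times residual confidence
    return x
```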
[0098] Once the motion parameters have first been ascertained, the registration transformation may be determined in step S3 as the test transformation corrected by the transformation described by the motion parameters, the process then being continued further in an iterative manner, as has been described already with regard to
[0099] Finally,
[0100] The x-ray device 19 also has a control device 25 (e.g., a controller including a processor), which is embodied as a computing device for performing the method.
[0101] Although the invention has been illustrated and described in more detail on the basis of the preferred exemplary embodiment, the invention is not limited by the disclosed examples, and other variations may be derived herefrom by the person skilled in the art without departing from the scope of protection of the invention.
[0102] The elements and features recited in the appended claims may be combined in different ways to produce new claims that likewise fall within the scope of the present invention. Thus, whereas the dependent claims appended below depend from only a single independent or dependent claim, it is to be understood that these dependent claims may, alternatively, be made to depend in the alternative from any preceding or following claim, whether independent or dependent. Such new combinations are to be understood as forming a part of the present specification.
[0103] While the present invention has been described above by reference to various embodiments, it should be understood that many changes and modifications can be made to the described embodiments. It is therefore intended that the foregoing description be regarded as illustrative rather than limiting, and that it be understood that all equivalents and/or combinations of embodiments are intended to be included in this description.