FEATURE-BASED REGISTRATION METHOD

20210358142 · 2021-11-18

    Abstract

    Methods for registering a three-dimensional model of a body volume to a real-time indication of a sensor position that involve analyzing scanned and sensed voxels and using parameters or thresholds to identify said voxels as being either tissue or intraluminal fluid. Those voxels identified as fluid are then used to construct a real-time sensed three-dimensional model of the lumen which is then compared to a similarly constructed, but previously scanned model to establish and update registration.

    Claims

    1-20. (canceled)

    21. A method for registering a sensing volume of a sensor to a three-dimensional model, the method comprising: segmenting a plurality of first voxels of a three-dimensional model; receiving sensing volume voxels of a plurality of second voxels from a sensor located within a body lumen; segmenting the sensing volume voxels of the second voxels based on a threshold; comparing the segmented volume voxels of the second voxels with a plurality of portions of the segmented plurality of first voxels of the three-dimensional model; identifying a fit portion of the segmented plurality of first voxels of the three-dimensional model based on the comparison; and registering the segmented volume voxels of the second voxels to the fit portion of the segmented plurality of first voxels of the three-dimensional model.

    22. The method according to claim 21, wherein the fit portion is identified based on a minimum difference between the segmented volume voxels of the second voxels and the segmented plurality of first voxels of the three-dimensional model.

    23. The method according to claim 21, wherein comparing the segmented volume voxels of the second voxels with a plurality of portions of the segmented plurality of first voxels of the three-dimensional model includes: calculating a difference between the segmented sensing volume voxels of the second voxels and each of the plurality of segmented first voxels of the three-dimensional model; and selecting the fit portion that provides a minimum difference between the segmented sensing volume voxels of the second voxels and the segmented plurality of first voxels of the three-dimensional model.

    24. The method according to claim 23, wherein calculating the difference includes: calculating an absolute value of a difference between each segmented value of the segmented volume voxels of the second voxels and each segmented value of each of the segmented plurality of first voxels of the three-dimensional model; and summing the absolute values of the difference to calculate a total difference between the segmented sensing voxels of the second voxels and the segmented plurality of first voxels of the three-dimensional model.

    25. The method according to claim 21, wherein segmenting the sensing volume voxels of the second voxels includes assigning a density value to each voxel of the sensing volume voxels of the second voxels.

    26. The method according to claim 25, wherein the density value is based on an advancement speed of the sensor.

    27. The method according to claim 25, wherein an advancement speed of the sensor is inversely proportional to the density value.

    28. A method for registering a three-dimensional model to an image of a sensor, the method comprising: receiving location data from a sensor located within a body lumen; assigning a density value to each voxel of a plurality of voxels of a three-dimensional model based on the received location data; determining a voxel of the plurality of voxels, which is closest to the sensor, based on a plurality of parameters, each of which has a predefined threshold.

    29. The method according to claim 28, wherein the density value is based on an advancement speed of the sensor.

    30. The method according to claim 28, wherein an advancement speed of the sensor is inversely proportional to the density value.

    31. The method according to claim 28, wherein the density value is a Hounsfield number.

    32. The method according to claim 28, wherein the density value of each voxel is proportional to a probability that the sensor occupies each voxel.

    33. The method according to claim 28, further comprising determining whether a voxel is tissue or air.

    34. The method according to claim 33, wherein a voxel is determined as tissue when the density value is higher than the threshold.

    35. A method for registering a three-dimensional model to a sensing volume of a sensor, the method comprising: segmenting a plurality of first voxels of a three-dimensional model; receiving sensing volume voxels of a plurality of second voxels from a sensor located within a body lumen; assigning a density value for each voxel of the sensing volume voxels of the second voxels; segmenting the sensing volume voxels of the second voxels based on a threshold; and registering a first portion of the plurality of first voxels of the three-dimensional model to the segmented sensing volume voxels of the second voxels.

    36. The method according to claim 35, wherein the density value is based on an advancement speed of the sensor.

    37. The method according to claim 35, wherein an advancement speed of the sensor is inversely proportional to the density value.

    38. The method according to claim 35, wherein the density value is a Hounsfield number.

    39. The method according to claim 35, wherein the density value of each voxel is proportional to a probability that the sensor occupies the voxel.

    40. The method according to claim 35, wherein a voxel of the second voxels is considered as tissue when the density value of the voxel is higher than the threshold and the voxel is considered as an airway when the density value of the voxel is lower than the threshold.

    Description

    BRIEF DESCRIPTION OF THE DRAWINGS

    [0017] FIG. 1 is a flowchart of a method of the present invention; and

    [0018] FIG. 2 is a flowchart of a more specific example of an embodiment of the method of FIG. 1.

    DETAILED DESCRIPTION

    [0019] Generally, the present invention includes a system and method for registering a three-dimensional model of a body volume, such as a CT volume, to a real-time image of a sensor. This registration method compares anatomical cavity features voxel-to-voxel, rather than matching anatomical shapes or locations to structural shapes or locations.

    [0020] Referring now to the flowchart of FIG. 1, the method of the present invention begins at 20 with the collection of reference data. This step involves the acquisition of a plurality of CT scans, which are then assembled into a CT volume. During the procedure, the sensor is inserted into the lungs of the patient and a data stream is established between the sensor and a system processor.

    [0021] At step 22, the acquired data is processed, which involves de-cluttering and digitization. Each voxel is assigned a number based on the tissue-density Hounsfield number. This density value can be associated with a gray level or color using well-known window-leveling techniques. The density is proportional to the probability that the sensor will occupy a given voxel. The data is also filtered as desired. For example, if the sensor is advanced slowly rather than quickly, higher densities will necessarily result, as any one voxel is occupied for a longer period while the sensor takes longer to pass through. Hence, an advancement rate may be noted and used to normalize the densities by speed. After filtering, voxels with higher densities are given higher weight in registration than voxels having lower densities.
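    The speed normalization described above can be sketched as follows. This is a minimal illustration, not the patented implementation: the function name, grid layout, and the choice to weight each sample by its instantaneous speed (so that slow passes, which generate many samples per voxel, do not inflate that voxel's density) are all assumptions made for clarity.

    ```python
    import numpy as np

    def accumulate_density(grid_shape, positions, speeds):
        """Accumulate sensor samples into a voxel-occupancy density grid.

        A slowly advancing sensor produces many samples in the same voxel;
        weighting each sample by its advancement speed cancels that
        dwell-time bias, as suggested in paragraph [0021]."""
        density = np.zeros(grid_shape, dtype=float)
        for (i, j, k), speed in zip(positions, speeds):
            # Sample count per voxel is roughly proportional to 1/speed,
            # so multiplying by speed normalizes the accumulated density.
            density[i, j, k] += speed
        return density
    ```

    With this weighting, a voxel crossed slowly (two samples at speed 0.5) accumulates the same density as a voxel crossed quickly (one sample at speed 1.0).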

    [0022] At step 24 the desired parameters are defined. By way of example only, the voxel could be required to meet parameters such as: 1) falls within a particular density range, 2) falls within a predefined proximity from a currently accepted (registered) voxel, 3) fits within a specific template such as a group of continuous densities corresponding to air next to a plurality of densities corresponding to a blood vessel.
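    The first two example parameters of step 24 can be expressed as a simple acceptance predicate. The numeric thresholds below (an airway-like Hounsfield range and a proximity limit) are illustrative assumptions, not values from the source.

    ```python
    import numpy as np

    # Illustrative thresholds; the patent leaves the actual values open.
    AIR_DENSITY_RANGE = (-1000, -500)  # Hounsfield units typical of airway lumen
    MAX_PROXIMITY = 3.0                # max distance from a registered voxel

    def voxel_meets_parameters(density, position, registered_position):
        """Accept a candidate voxel only if it falls within the density
        range and lies close enough to a currently accepted voxel."""
        in_range = AIR_DENSITY_RANGE[0] <= density <= AIR_DENSITY_RANGE[1]
        close_enough = np.linalg.norm(
            np.asarray(position, float) - np.asarray(registered_position, float)
        ) <= MAX_PROXIMITY
        return bool(in_range and close_enough)
    ```

    The third example parameter (template matching against adjacent density patterns) would extend this predicate with a neighborhood comparison.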

    [0023] At 26, a compare and fit function is performed. This step includes multiple sub-steps, beginning with step 30. These steps are performed iteratively and repeatedly until the target is reached.

    [0024] Step 30 involves an initial guess and is based on assumptions or known landmark techniques. For example, the main carina in the CT volume is relatively easy to match to the main carina of the sensed bronchial tree (BT).

    [0025] At 32, the CT volume is registered to the sensor data using the initial guess and a difference between the two is calculated.

    [0026] At 34, for each real voxel visited by the sensor, the registration software finds the closest voxel in the CT volume that matches the specific parameters. The registration is then updated accordingly. If the process is iterative, the matched voxels may, ideally, be aligned completely. If the process is continuous, a density function is used to weight the importance of that particular voxel match, and the registration is adjusted, using frequency and/or density, to a degree that is proportional to the weighted importance.
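    The closest-matching-voxel search of step 34 can be sketched as below. This is a brute-force illustration under assumed inputs (explicit voxel centers and densities, and an airway density range); a practical system would use a spatial index rather than a linear scan.

    ```python
    import numpy as np

    def closest_matching_voxel(sensed_point, ct_voxel_centers, ct_densities,
                               density_range=(-1000, -500)):
        """Return the index of the nearest CT voxel whose density falls
        within the accepted parameter range, or None if no voxel qualifies."""
        centers = np.asarray(ct_voxel_centers, dtype=float)
        densities = np.asarray(ct_densities, dtype=float)
        mask = (densities >= density_range[0]) & (densities <= density_range[1])
        if not mask.any():
            return None
        candidates = np.flatnonzero(mask)
        dists = np.linalg.norm(
            centers[candidates] - np.asarray(sensed_point, dtype=float), axis=1
        )
        return int(candidates[np.argmin(dists)])
    ```

    Note that a geometrically nearer voxel is skipped if its density falls outside the parameter range, which is exactly why the parameter filtering of step 24 precedes the distance search.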

    [0027] Referring now to FIG. 2 for illustration purposes, there is shown a more specific example of an embodiment of the method of FIG. 1, which represents a binary voxel-based approach. At 60 a collection of reference data is taken, similar to the data acquisition step 20 described above. This step involves the acquisition of a plurality of CT scans, which are then assembled into a CT volume. The voxels representing internal lung air are then segmented from the CT volume using a known segmentation algorithm, obviating the need to extract the geometry, surfaces, or structures of the lung. During the procedure, the sensor is inserted into the lungs of the patient and a data stream is established between the sensor and a system processor.

    [0028] At step 62, the data acquired from the sensor is processed, which involves de-cluttering and digitization. Each voxel is assigned a number based on the tissue-density Hounsfield number. This density value can be associated with a gray level or color using well-known window-leveling techniques. The density is proportional to the probability that the sensor will occupy a given voxel. The data is also filtered as desired. For example, if the sensor is advanced slowly rather than quickly, higher densities will necessarily result, as any one voxel is occupied for a longer period while the sensor takes longer to pass through. Hence, an advancement rate may be noted and used to adjust the densities accordingly. After filtering, voxels with higher densities are given higher registration importance than voxels having lower densities.

    [0029] At step 64 a threshold value is set for the sensing volume voxels. For example, if the density of a given voxel is higher than the threshold value, that voxel is considered to be tissue and is given a value of zero. If the density of the voxel is below the threshold, that voxel is considered to be air and is given a value of 1. Hence the voxel space now becomes a binary voxel space. This function is performed both on the CT volume as well as on the sensor data.
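    The thresholding of step 64 reduces to a one-line binarization. The sketch below follows the rule stated in the paragraph (density above threshold is tissue and maps to 0; below threshold is air and maps to 1); the handling of a density exactly equal to the threshold is an assumption, as the source does not specify it.

    ```python
    import numpy as np

    def binarize_volume(densities, threshold):
        """Step 64: map each voxel density to a binary value.

        Tissue (density above the threshold) becomes 0; air (density
        below the threshold) becomes 1, yielding a binary voxel space."""
        return (np.asarray(densities) < threshold).astype(np.uint8)
    ```

    As the paragraph notes, the same binarization is applied to both the CT volume and the sensor data, so the two volumes can be compared directly in the next step.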

    [0030] At step 66 a compare and fit function is performed. Because a binary system is being used, it is possible to use a variety of matching methods to register the two binary volumes. For example, a subtraction method could be used. A subtraction method superimposes a segment of the sensor data over a corresponding segment of the binary CT volume. The registration is effected by subtracting the binary values of the one volume from the other. For example, for any given voxel, if the values are both 1, subtracting the aligned voxels yields zero for that matched voxel space. If they are not the same, however, subtraction results in either a 1 or a −1. All values are converted to their absolute values and totaled. The registration of that particular segment of sensor data is adjusted until a minimum subtracted total is acquired. One advantage of this method is that a minimum may be acquired regardless of image quality.
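    The subtraction method of step 66 can be sketched as follows. For simplicity this illustration searches 1-D integer offsets over flattened binary arrays; the actual registration adjustment would search over 3-D rigid transforms, and the function names are assumptions.

    ```python
    import numpy as np

    def subtraction_mismatch(binary_sensor, binary_ct_segment):
        """Subtract aligned binary voxels, take absolute values, and total.

        A perfect alignment yields zero; mismatched voxels each
        contribute |1 - 0| = 1 to the total."""
        diff = binary_sensor.astype(int) - binary_ct_segment.astype(int)
        return int(np.abs(diff).sum())

    def best_fit_offset(binary_sensor, binary_ct, offsets):
        """Try candidate offsets and return the one with the minimum
        subtracted total, i.e., the best registration of the segment."""
        n = len(binary_sensor)
        scores = {off: subtraction_mismatch(binary_sensor, binary_ct[off:off + n])
                  for off in offsets}
        return min(scores, key=scores.get)
    ```

    Because the score is a simple count of disagreeing voxels, a minimum exists for any input, which reflects the paragraph's observation that a minimum may be acquired regardless of image quality.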

    [0031] Although the invention has been described in terms of particular embodiments and applications, one of ordinary skill in the art, in light of this teaching, can generate additional embodiments and modifications without departing from the spirit of or exceeding the scope of the claimed invention. Accordingly, it is to be understood that the drawings and descriptions herein are proffered by way of example to facilitate comprehension of the invention and should not be construed to limit the scope thereof.