PERSONALIZED REGISTRATION METHOD FOR TEMPLATE LIBRARY OF ANATOMICAL MORPHOLOGY AND MECHANICAL PROPERTIES OF MATERIALS OF BONE CT IMAGES

20230146649 · 2023-05-11

    Abstract

    The present invention provides a personalized registration method for a template library of anatomical morphology and mechanical properties of materials of bone CT images. In the method, a large number of bone CT images of healthy persons are used to build a statistical model capable of containing anatomical morphology and mechanical properties of materials of bones, the parameterized description of bones of a patient is realized by a personalized registration method for the statistical model and bone CT images, a prosthesis template library is built by using images of the patient and registration parameters, and the template library is matched with the CT images of the patient by the personalized registration method to retrieve template images and prosthesis models similar to the bone conditions of the patient from the template library as the initial reference template for design of personalized prosthesis implants.

    Claims

    1-3. (canceled)

    4. A personalized registration method for a template library of anatomical morphology and mechanical properties of materials of bone CT images, comprising the following steps: S1: building a statistical model containing a shape model of anatomical morphology of bones and a gray model of mechanical properties of materials; S11: processing data; collating a large number of bone CT images of healthy persons as a sample set, and normalizing pixel resolutions and gray ranges of the sample set; and resampling to a unified spatial resolution, selecting the finest spatial resolution; S12: calibrating anatomical characteristic information; marking anatomical landmarks of all the sample images in the sample set, and denoting them as L={(x.sub.1, y.sub.1, z.sub.1), . . . , (x.sub.i, y.sub.i, z.sub.i), . . . , (x.sub.k, y.sub.k, z.sub.k)}, wherein k is the number of the anatomical landmarks, and (x.sub.i, y.sub.i, z.sub.i) represents the 3D coordinate of the i.sup.th point; S13: building an average template image; selecting an average image or any sample image from the sample set as the initial template, deforming all the samples into the space of the average image by means of image registration, calculating a new average image to substitute the initial template image, and repeating the registration operation until the average template image no longer changes; S14: modeling the shape; obtaining the position changes of corresponding pixels between images through image registration; the deformation field of an area where the bone is located describes the morphology changes of the bone, and the deformation field is used to build a shape model of the bone; S15: modeling the gray; the gray value of CT images reflects the attenuation coefficient of organs to X-ray, and the high contrast of bone CT images with surrounding soft tissues accurately describes the geometrical morphology of the bone; statistically modeling the gray of the bone by using the gray value of the CT images;
and modeling the gray for all the bone CT images by PCA after down-sampling of the CT images, with the formula as follows:
    g=g̅+Φ.sub.gb.sub.g wherein g̅∈R.sup.N represents the average gray vector of the sample set; Φ.sub.g∈R.sup.N×M is a gray feature matrix of the gray model in the statistical model obtained by PCA of the sample set, containing M gray changing patterns learned from the sample set, and the subscript g represents the gray; and b.sub.g∈R.sup.M is a gray parameter, g∈R.sup.N represents the gray of the model, the value of g is under control of the gray parameter b.sub.g, and the purpose of controlling the gray change of the model is achieved by adjusting the value of the gray parameter b.sub.g; S2: building a prosthesis template library; S21: collecting and processing data; establishing a pre-implantation data set D.sub.1 and a post-implantation data set D.sub.2 according to the bone CT image samples before and after bone prosthesis implantation, wherein D.sub.1 contains the shape and gray information of damaged bones of the patient, and the post-implantation data set D.sub.2 contains the shape and material information of a personalized bone prosthesis; and normalizing pixel resolutions and gray ranges of D.sub.1 and D.sub.2, and marking the anatomical landmarks of the images in D.sub.1; S22: matching the shape model; registering and fitting the shape model in the statistical model to the bone CT images of the patient not implanted with a bone prosthesis, and calculating the shape parameter b.sub.s of the bone of the patient; S23: matching the gray model; fitting the gray model in the statistical model to the bone images of the patient not implanted with a bone prosthesis, and calculating the gray parameter b.sub.g of the bone of the patient; Φ.sub.gb.sub.g=g.sub.p, wherein g.sub.p represents the deviation between the gray vector of the sampled bone of the patient and the average gray vector, and Φ.sub.g represents the gray feature matrix of the gray model in the statistical model; S24: acquiring a bone prosthesis model; extracting the tagged image of a prosthesis
    implant from the CT images of the postoperative patient with a prosthesis by the threshold segmentation method, and transforming the tagged image of the prosthesis to a 3D surface model by the Marching Cubes algorithm; S25: building a prosthesis template library; building a prosthesis template library based on the statistical model, the CT image data of the patient, the shape parameter and gray parameter of the bone of the patient, and the prosthesis model; S3: carrying out personalized registration of the template library; obtaining the shape parameter and the gray parameter of the bone through personalized registration of the prosthesis template library and the CT images, and retrieving from the prosthesis template library a prosthesis model matching with the bone in the CT images best; S31: preprocessing data; acquiring the bone CT images of the patient to be implanted with a bone prosthesis, carrying out unified space and gray normalization on the CT images, and marking the anatomical landmarks of the bone CT images of the patient; S32: carrying out personalized registration and parameterized representation; carrying out personalized registration on the bone CT images of the patient by using the statistical model, and calculating the shape parameter b′.sub.s and the gray parameter b′.sub.g of the bone of the patient through matching of the shape model and the gray model; S33: matching data; retrieving the data of template images stored in the prosthesis template library and similar to those of the bone of the patient from the prosthesis template library by using a similarity calculation formula according to the shape parameter and gray parameter describing the bone of the patient obtained by registration, the similarity calculation formula is as follows: D=λ.sub.1Σ.sub.i=0.sup.M(b′.sub.s−b.sub.s).sup.2+λ.sub.2Σ.sub.i=0.sup.M(b′.sub.g−b.sub.g).sup.2, wherein Σ.sub.i=0.sup.M(b′.sub.s−b.sub.s).sup.2 is the variance between the shape parameter of the bone of the patient and that of the template image in the prosthesis template library for measurement of the level of similarity in the shape of the bone between two images; Σ.sub.i=0.sup.M(b′.sub.g−b.sub.g).sup.2 is the variance between the gray parameter of the bone of the patient and that of the template image in the prosthesis template library for measurement of the level of similarity in the gray of the bone between two images; and λ.sub.1 and λ.sub.2 are weights of the shape similarity and the gray similarity, the sum thereof is 1, and the values of λ.sub.1 and λ.sub.2 are assigned according to different weights to realize the retrieval matching method based on the shape similarity or based on the gray similarity; calculating the similarity D between the data in the prosthesis template library and the data of the patient, obtaining the template CT image most similar to the bone of the patient after comparison, and acquiring a bone prosthesis model corresponding to the image as the initial reference template for the personalized bone prosthesis of the patient for subsequent analysis and design of bone prostheses.

    5. The personalized registration method for a template library of anatomical morphology and mechanical properties of materials of bone CT images according to claim 1, wherein the steps of modeling the shape in S14 are as follows: S141: calculating shape vectors; calculating nonlinear spatial transformation between the sample image and the average template image by means of image registration, building a shape vector for each sample, and letting s.sub.i=[p.sub.i, l.sub.i]∈R.sup.3(N+K) be the shape vector of the sample i, wherein p.sub.i∈R.sup.3N is a 3D coordinate set of N pixels obtained by registering the average template to the sample i, and l.sub.i∈R.sup.3K is a 3D coordinate set of K anatomical landmarks in the sample image; S142: aligning the shape vectors s.sub.i of all the samples by generalized procrustes analysis; S143: modeling the shape change of the bone by principal component analysis; the shape model is as follows:
    s=s̅+Φ.sub.sb.sub.s wherein s̅∈R.sup.3(N+K) represents the average shape of the sample set, Φ.sub.s∈R.sup.3(N+K)×M represents an eigenvector matrix obtained by principal component analysis of the sample set, and M feature vectors represent M bone deformation patterns learned by the shape model; b.sub.s∈R.sup.M is a shape parameter, and M elements in b.sub.s respectively represent weights of M deformation patterns in Φ.sub.s superimposed with the average shape; and s∈R.sup.3(N+K) represents the shape of the model, s is under control of b.sub.s, and the purpose of controlling model deformation is achieved by adjusting the value of b.sub.s.

    6. The personalized registration method for a template library of anatomical morphology and mechanical properties of materials of bone CT images according to claim 1, wherein the steps of matching the shape model in S22 are as follows: S221: sampling; carrying out equidistant sampling on the vertexes of the average shape in the shape model, and selecting a small number of sampling points for subsequent calculation; S222: carrying out personalized registration; obtaining the mapping relationship, i.e., deformation field, between the average template and the bone image of the patient by the nonlinear registration method of the image, and deforming the vertexes of the average shape in the shape model by using the deformation field to obtain a shape vector describing the bone of the patient; the gray similarity measure between images and the distance measure of the anatomical landmarks are used as the constraints for the loss function of nonlinear registration, and the objective function is as follows:
    E=ω.sub.iE.sub.i+ω.sub.lE.sub.l wherein E is the weighted similarity measure between the template and the image of the patient, E.sub.i is the gray similarity measure between the template and the image of the patient, and E.sub.l is the distance measure of the anatomical landmarks; and ω.sub.i and ω.sub.l are weights of the two measures, and the sum thereof is 1; S223: aligning the registered shape vector of the bone of the patient with the average shape vector, wherein similarity transformation is composed of a scaling factor and a rotation matrix, the scaling factor is obtained by calculating the ratio of the distance between the vertex and the centroid of the registered shape vector of the bone to that of the average shape vector, and the rotation matrix is obtained by singular value decomposition of the vertex matrixes of the two shape vectors; S224: setting the number of the shape parameters b.sub.s; directly defining the number of the shape parameters b.sub.s or selecting the number of the shape parameters b.sub.s capable of retaining more than 95% of data information; and extracting the corresponding shape feature matrix Φ.sub.s from the shape model according to the number of the shape parameters b.sub.s; S225: calculating the shape parameter b.sub.s of the bone of the patient; carrying out linear regression fitting by least squares according to the shape feature matrix Φ.sub.s extracted from the shape model and the registered shape vector of the bone of the patient, and solving the system of linear equations Φ.sub.sb.sub.s=s.sub.p to obtain the shape parameter b.sub.s of the bone of the patient, wherein s.sub.p represents the displacement deviation between the shape vector of the bone of the patient and the average shape vector, and Φ.sub.s represents the shape feature matrix contained in the shape model in the statistical model.

    Description

    DESCRIPTION OF DRAWINGS

    [0038] FIG. 1 is an overall work flow chart of personalized registration of a template library.

    [0039] FIG. 2 is a flow chart for building a statistical model.

    [0040] FIG. 3 is a flow chart of a nonlinear registration method.

    [0041] FIG. 4(a) is an average template image for personalized registration;

    [0042] FIG. 4(b) shows an image of a patient for personalized registration;

    [0043] FIG. 5(a) is a CT image of a patient I;

    [0044] FIG. 5(b) is a diagram of a prosthesis model of the patient I;

    [0045] FIG. 5(c) is a CT image of a patient II;

    [0046] FIG. 5(d) is a diagram of a prosthesis model of the patient II;

    [0047] FIG. 5(e) is a CT image of a patient III;

    [0048] FIG. 5(f) is a diagram of a prosthesis model of the patient III.

    DETAILED DESCRIPTION

    [0049] With the lower hip bone as an example, the present invention is further described in combination with specific implementation steps, as shown in FIG. 1. A personalized registration method for a template library of anatomical morphology and mechanical properties comprises the following steps:

    [0050] Step 1: building a statistical model containing anatomical morphology and mechanical properties of materials of bones.

    [0051] S11: preprocessing data. Collating a large number of bone CT images of healthy persons as a sample set, and normalizing the sample set, including normalization of pixel resolutions and gray ranges.

    [0052] Resampling CT images from different hospitals to a uniform spatial resolution to eliminate the inconsistency in spatial resolution. In order to ensure that image information is not lost, the finest spatial resolution among the hospitals is selected as the resampling resolution; and using a unified CT value range to normalize the gray of the images and eliminate the influence of outliers.
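
    As an illustrative sketch of this gray normalization step (the HU range below is an assumption, not a value fixed by the method), the clipping and scaling to a unified range can be written as:

```python
import numpy as np

def normalize_ct(volume, hu_min=-1000.0, hu_max=3071.0):
    """Clip a CT volume to a unified HU range and scale gray values to [0, 1].

    Clipping removes outliers above/below the chosen CT value range;
    the subsequent scaling puts all images on the same gray scale.
    """
    v = np.clip(volume.astype(np.float64), hu_min, hu_max)
    return (v - hu_min) / (hu_max - hu_min)

# toy 2x2 "slice" with an outlier above the assumed CT range
slice_hu = np.array([[-1000.0, 0.0],
                     [2000.0, 5000.0]])
norm = normalize_ct(slice_hu)
```

    The outlier at 5000 HU is clipped to the top of the range before scaling, so all normalized values lie in [0, 1].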

    [0053] S12: calibrating anatomical characteristic information. The anatomical characteristic information refers to the symbolic information in bone CT images, which can significantly reflect the changes of morphological structure and density of bones or have medical significance, such as anatomical landmarks or masks for bone segmentation.

    [0054] Marking anatomical landmarks of all the sample images in the sample set by experts, and denoting them as L={(x.sub.1, y.sub.1, z.sub.1), . . . , (x.sub.i, y.sub.i, z.sub.i), . . . , (x.sub.k, y.sub.k, z.sub.k)}, wherein k is the number of the anatomical landmarks, and (x.sub.i, y.sub.i, z.sub.i) represents the 3D coordinate of the i.sup.th point.

    [0055] S13: building an average template image. First, selecting an average image or any sample image from the sample set as the initial template, then deforming all the samples into the space of the average image by means of image registration, calculating a new average image to substitute the initial template image, repeating the registration operation, and carrying out loop iteration until the average template image no longer changes.
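
    The loop iteration of S13 can be sketched with a toy one-dimensional example; the shift-based `register_shift` below is a deliberately simplified stand-in for true image registration, used only to show the iterate-register-average structure:

```python
import numpy as np

def register_shift(template, sample):
    """Toy 1-D 'registration': find the integer shift that best aligns
    the sample to the template by cross-correlation, and apply it."""
    shifts = range(-len(sample) + 1, len(sample))
    best = max(shifts, key=lambda d: np.dot(template, np.roll(sample, d)))
    return np.roll(sample, best)

def build_average_template(samples, n_iter=10, tol=1e-6):
    """Deform all samples into the template space, replace the template
    by their mean, and repeat until the template no longer changes."""
    template = samples[0].copy()            # any sample as the initial template
    for _ in range(n_iter):
        warped = [register_shift(template, s) for s in samples]
        new_template = np.mean(warped, axis=0)
        if np.max(np.abs(new_template - template)) < tol:
            break                           # average template no longer changes
        template = new_template
    return template

# three shifted copies of the same 'anatomy' (a single spike)
base = np.zeros(8)
base[2] = 1.0
samples = [np.roll(base, k) for k in (0, 1, 3)]
template = build_average_template(samples)
```

    After alignment all samples coincide, so the converged average template equals the common underlying signal.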

    [0056] S14: modeling the shape. A deformation field obtained by image registration describes the position changes of corresponding pixels between images, the deformation field of an area where the bone is located can describe the morphology change of the bone, and the deformation field is used to build a shape model of the bone.

    [0057] The specific practice is as follows:

    [0058] S141: calculating shape vectors. Calculating nonlinear spatial transformation between the sample image and the average template image by means of image registration, building a shape vector for each sample, and letting s.sub.i=[p.sub.i, l.sub.i]∈R.sup.3(N+K) be the shape vector of the sample i, wherein p.sub.i∈R.sup.3N is a 3D coordinate set of all the pixels (N in total) obtained by registering the average template to the sample i, and l.sub.i∈R.sup.3K is a 3D coordinate set of all the anatomical landmarks (K in total) in the sample image.

    [0059] S142: performing generalized procrustes analysis (GPA). Before using the shape vectors s.sub.i for statistical modeling, it is necessary to eliminate the shape change caused by other factors. All the shape vectors s.sub.i need to be aligned by rotation and translation in the common coordinate system, and the shape vectors s.sub.i of all the samples are aligned by the GPA method.

    [0060] S143: performing principal component analysis (PCA). For all the shape vectors s.sub.i, modeling the shape change of the bone by the PCA method, and the shape model obtained is shown as follows:


    s=s̅+Φ.sub.sb.sub.s

    [0061] wherein s̅∈R.sup.3(N+K) represents the average shape of the sample set; Φ.sub.s∈R.sup.3(N+K)×M represents an eigenvector matrix obtained by PCA of the sample set, and the subscript s represents the shape; the M feature vectors represent M bone deformation patterns learned by the shape model; and b.sub.s∈R.sup.M is a shape parameter, and M elements in b.sub.s respectively represent weights of M deformation patterns in Φ.sub.s superimposed with the average shape. s∈R.sup.3(N+K) represents the shape of the model, s is under control of b.sub.s, and the purpose of controlling model deformation is achieved by adjusting the value of b.sub.s.
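
    The shape modeling of S141-S143 can be sketched as follows; the synthetic sample vectors and the SVD-based PCA are illustrative stand-ins for real registered and GPA-aligned shape vectors:

```python
import numpy as np

def build_pca_model(vectors, var_keep=0.95):
    """Build a PCA model x ~ mean + Phi @ b from stacked sample vectors.

    vectors: (n_samples, dim) array of aligned shape (or gray) vectors.
    Keeps the smallest number of modes retaining >= var_keep of the variance.
    """
    mean = vectors.mean(axis=0)
    X = vectors - mean                              # center the data
    U, S, Vt = np.linalg.svd(X, full_matrices=False)
    var = S**2 / max(len(vectors) - 1, 1)
    ratio = np.cumsum(var) / var.sum()
    M = int(np.searchsorted(ratio, var_keep) + 1)   # number of retained modes
    Phi = Vt[:M].T                                  # (dim, M) feature matrix
    return mean, Phi

def reconstruct(mean, Phi, b):
    """Shape generated by the model: s = mean + Phi @ b."""
    return mean + Phi @ b

rng = np.random.default_rng(0)
# toy 'shape vectors': one dominant deformation pattern plus tiny noise
pattern = np.array([1.0, -1.0, 0.5, 0.0])
samples = (0.1 * rng.standard_normal((20, 1)) @ pattern[None, :]
           + 0.001 * rng.standard_normal((20, 4)))
mean, Phi = build_pca_model(samples)
b = Phi.T @ (samples[0] - mean)     # parameters fitting the first sample
recon = reconstruct(mean, Phi, b)   # model shape controlled by b
```

    Adjusting `b` moves the reconstructed shape along the learned deformation patterns, which is the sense in which b.sub.s controls the model deformation.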

    [0062] S15: modeling the gray. The gray value of CT images reflects the attenuation coefficient of organs to X-ray, and the high contrast of bone CT images with surrounding soft tissues can accurately describe the geometrical morphology of the bone. Meanwhile, the CT value has an approximate linear relationship with the apparent density of the bone, and the apparent density of the bone has an empirical formula for the exponential relationship with the properties of materials of the bone, which can accurately describe the mechanical properties of materials of the bone.

    [0063] The gray value of the CT images can be used for statistical modeling of the gray of the bone. Down-sampling performed on the pixels of the sample images with the same scale standard can reduce the calculation amount of the subsequent steps and improve the speed of gray modeling. Letting g.sub.i∈R.sup.N be the gray vector obtained by sampling in the sample i image, which represents the image gray of each pixel in the average template at the corresponding position in the sample i. Modeling all the gray vectors of the sample set, the built model is represented as follows:


    g=g̅+Φ.sub.gb.sub.g

    [0064] wherein g̅∈R.sup.N represents the average gray vector of the sample set; Φ.sub.g∈R.sup.N×M represents a gray feature matrix of the gray model in the statistical model obtained by PCA of the sample set, containing M gray changing patterns learned from the sample set, and the subscript g represents the gray; b.sub.g∈R.sup.M is a gray parameter; and g∈R.sup.N represents the gray of the model, the value of g is under control of the gray parameter b.sub.g, and the purpose of controlling the gray change of the model is achieved by adjusting the value of the gray parameter b.sub.g. The flow for building the statistical model is shown in FIG. 2.
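
    The relationship described above between the CT value, the apparent density and the mechanical properties of the bone can be illustrated as below; the linear and power-law coefficients A, B, C, D are placeholder values for illustration only, not calibrated constants from the method:

```python
import numpy as np

# Illustrative coefficients only: the linear HU-to-density relation and the
# power-law density-to-modulus relation follow the text, but A, B, C, D are
# hypothetical placeholder values, not the patent's calibrated constants.
A, B = 0.0010, 1.0    # apparent density rho = A*HU + B   [g/cm^3]
C, D = 6.95, 1.49     # elastic modulus  E = C * rho**D   [GPa]

def hu_to_modulus(hu):
    """Map a CT value (HU) to an elastic modulus via apparent density."""
    rho = A * np.asarray(hu, dtype=float) + B   # approximate linear relation
    return C * rho ** D                         # empirical exponential relation

E_cortical = hu_to_modulus(1500.0)    # dense cortical bone voxel
E_trabecular = hu_to_modulus(300.0)   # spongy trabecular bone voxel
```

    Denser (higher-HU) bone maps to a higher modulus, which is why the gray model can stand in for the mechanical properties of materials.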

    [0065] Step 2: building a prosthesis template library by personalized registration of bone CT images.

    [0066] S21: collecting and processing data. Collecting and collating preoperative and postoperative bone CT images of a patient implanted with a bone prosthesis, and establishing a preoperative data set D.sub.1 and a postoperative data set D.sub.2, wherein the preoperative data set D.sub.1 contains the morphology and gray (density) information of the damaged bone of the patient, and the postoperative data set D.sub.2 contains the shape and material information of a personalized bone prosthesis. Referring to the method of S11 in step 1, normalizing pixel resolutions and gray ranges of the data sets D.sub.1 and D.sub.2, and marking the anatomical landmarks of the images in the preoperative data set D.sub.1 by experts.

    [0067] S22: matching the shape model. Registering and fitting the shape model in the statistical model to the preoperative bone CT images of the patient, and calculating the shape parameter b.sub.s of the bone of the patient.

    [0068] The specific steps are as follows:

    [0069] S221: sampling. Carrying out equidistant sampling on the vertexes of the average shape in the shape model, and selecting a small number of sampling points such as 10000 vertexes for subsequent calculation to reduce the calculation cost and time;

    [0070] S222: carrying out personalized registration. Obtaining the mapping relationship between the average template and the bone image of the patient by the nonlinear registration method of the image, and deforming the vertexes of the average shape in the shape model by using the deformation field.

    [0071] The gray similarity measure between images and the distance measure of the anatomical landmarks are used as the constraints for the loss function of nonlinear registration so that the template can be correctly matched with the images of the patient, and the objective function is as follows:


    E=ω.sub.iE.sub.i+ω.sub.lE.sub.l

    wherein E.sub.i is the gray similarity measure (such as mutual information or cross-correlation) between the template and the images of the patient, which helps align the template with the undamaged part in the bone CT image of the patient based on gray registration; and E.sub.l is the distance measure of the anatomical landmarks, which is used to ensure alignment between the anatomical landmarks in the template and the images of the patient. For the registration of the damaged part of the bone, the surrounding anatomical landmarks can provide guidance to ensure that the registration of the damaged area does not have excessive deformation. ω.sub.i and ω.sub.l are weights of the two measures, and the sum thereof is 1. The flow of the nonlinear registration method is shown in FIG. 3.
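
    A minimal sketch of this weighted objective follows; a sum-of-squared-differences term stands in for the gray similarity measure (mutual information would be handled analogously), and the landmark term is the mean squared landmark distance:

```python
import numpy as np

def registration_loss(template_img, patient_img, template_lm, patient_lm,
                      w_i=0.5, w_l=0.5):
    """Weighted objective E = w_i*E_i + w_l*E_l, with w_i + w_l = 1.

    E_i: gray dissimilarity (SSD stand-in for mutual information here).
    E_l: mean squared distance between corresponding anatomical landmarks.
    """
    assert abs(w_i + w_l - 1.0) < 1e-9          # weights must sum to 1
    E_i = np.mean((template_img - patient_img) ** 2)
    E_l = np.mean(np.sum((template_lm - patient_lm) ** 2, axis=1))
    return w_i * E_i + w_l * E_l

img_a = np.zeros((4, 4))
img_b = np.ones((4, 4))
lm_a = np.array([[0.0, 0.0], [1.0, 1.0]])
lm_b = np.array([[0.0, 1.0], [1.0, 1.0]])
loss = registration_loss(img_a, img_b, lm_a, lm_b)   # 0.5*1.0 + 0.5*0.5
```

    In an actual registration, an optimizer would adjust the deformation field to minimize this combined measure.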

    [0072] S223: aligning the registered shape vector of the bone of the patient with the average shape vector to eliminate the shape effect caused by similarity transformation, wherein similarity transformation is composed of a scaling factor and a rotation matrix, the scaling factor is obtained by calculating the ratio of the distance between the vertex and the centroid of the registered shape vector of the bone to that of the average shape vector, and the rotation matrix is obtained by singular value decomposition of the vertex matrixes of the two shape vectors.
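
    The alignment of S223 can be sketched as follows, using the mean vertex-to-centroid distance for the scaling factor and the SVD solution of the orthogonal Procrustes problem for the rotation matrix (the square point set is an illustrative stand-in for a registered shape vector):

```python
import numpy as np

def similarity_align(points, ref):
    """Align one shape's vertices to a reference shape.

    Both shapes are centered; the scaling factor is the ratio of mean
    vertex-to-centroid distances; the rotation is the orthogonal Procrustes
    solution obtained by SVD of the covariance of the two vertex matrices.
    """
    p = points - points.mean(axis=0)
    r = ref - ref.mean(axis=0)
    scale = np.mean(np.linalg.norm(r, axis=1)) / np.mean(np.linalg.norm(p, axis=1))
    p = p * scale
    U, _, Vt = np.linalg.svd(p.T @ r)   # rotation from SVD
    R = U @ Vt
    return p @ R + ref.mean(axis=0)

# reference shape: unit square; 'patient' shape: rotated, scaled, translated copy
ref = np.array([[0.0, 0.0], [1.0, 0.0], [1.0, 1.0], [0.0, 1.0]])
theta = np.pi / 6
rot = np.array([[np.cos(theta), -np.sin(theta)],
                [np.sin(theta),  np.cos(theta)]])
points = (ref - ref.mean(axis=0)) @ rot.T * 2.0 + np.array([5.0, -3.0])
aligned = similarity_align(points, ref)
```

    The aligned vertices coincide with the reference, removing the shape effect caused by similarity transformation before the parameters are fitted.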

    [0073] S224: setting the number of the shape parameters b.sub.s. Directly defining the number of the shape parameters b.sub.s or selecting the number of the shape parameters b.sub.s capable of retaining more than 95% of data information, and extracting the corresponding shape feature matrix Φ.sub.s from the shape model according to the number of the shape parameters b.sub.s.

    [0074] S225: Calculating the shape parameter b.sub.s of the bone of the patient. Carrying out linear regression fitting by least squares according to the shape feature matrix Φ.sub.s extracted from the shape model and the registered shape vector of the bone of the patient, and solving the system of linear equations Φ.sub.sb.sub.s=s.sub.p to obtain the shape parameter b.sub.s of the bone of the patient, wherein s.sub.p represents the displacement deviation between the shape vector of the bone of the patient and the average shape vector, and Φ.sub.s represents the shape feature matrix contained in the shape model in the statistical model.
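
    The fit of S225 reduces to an ordinary linear least-squares solve of Φ.sub.sb.sub.s=s.sub.p; the random orthonormal matrix below is a toy stand-in for the learned shape feature matrix:

```python
import numpy as np

rng = np.random.default_rng(1)
# toy stand-in for the shape feature matrix: 12-dim shapes, 3 deformation modes
Phi_s = np.linalg.qr(rng.standard_normal((12, 3)))[0]   # orthonormal columns
b_true = np.array([0.8, -0.3, 0.1])
s_p = Phi_s @ b_true        # deviation of the patient's shape from the mean

# least-squares solution of Phi_s @ b_s = s_p
b_s, *_ = np.linalg.lstsq(Phi_s, s_p, rcond=None)
```

    Because the deviation lies in the span of the feature matrix here, the recovered parameters match exactly; with real data the solve returns the best-fitting parameters in the least-squares sense.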

    [0075] S23: matching the gray model. Fitting the gray model in the statistical model to the bone images of the patient, and calculating the gray parameter b.sub.g of the bone of the patient. Transferring the point cloud describing the bone shape of the patient from a physical space coordinate system to a matrix space coordinate system, wherein the point cloud also expresses the mapping relationship between the average template and the image pixels of the patient, and the result of registering the average template to the image of the patient can be calculated from the image of the patient.

    [0076] Carrying out linear regression fitting by least squares according to the gray vector of the registered image and the gray feature matrix Φ.sub.g in the gray model, and solving the system of linear equations Φ.sub.gb.sub.g=g.sub.p to obtain the gray parameter b.sub.g of the bone of the patient, wherein g.sub.p represents the deviation between the gray vector of the sampled bone of the patient and the average gray vector, and Φ.sub.g represents the gray feature matrix contained in the gray model in the statistical model. The parameterized representation of the bone of the patient after personalized registration is shown in FIG. 4.

    [0077] S24: acquiring a bone prosthesis model. Metal prosthesis implants usually show higher CT values in the CT images, for example, the CT value of aluminum metal is about 2000 HU, and stainless steel and titanium alloy reach the highest CT value, i.e., 3071 HU, in ordinary CT images. The CT value of metal prosthesis implants is obviously different from that of internal organs of a human body, the tagged image of a prosthesis implant can be extracted from the CT images of the postoperative patient with a prosthesis by the threshold segmentation method, and the tagged image of the prosthesis is transformed to a 3D surface model by the Marching Cubes algorithm. The CT image of the postoperative patient and the prosthesis model are shown in FIG. 5.
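
    The threshold segmentation of S24 can be sketched as follows; the 2500 HU threshold is an assumed value chosen between bone and metal, and the resulting tag image could then be passed to a Marching Cubes implementation (e.g. skimage.measure.marching_cubes) to obtain the 3D surface model:

```python
import numpy as np

def segment_prosthesis(volume, threshold=2500.0):
    """Tag voxels whose CT value exceeds a metal threshold (in HU).

    Metal implants show much higher CT values than bone and soft tissue,
    so a simple threshold isolates the prosthesis. The binary tag image
    can then be converted to a surface mesh by Marching Cubes.
    """
    return volume >= threshold

# toy volume: soft tissue (~40 HU), bone (~1200 HU), one metal voxel (3071 HU)
vol = np.full((4, 4, 4), 40.0)
vol[1:3, 1:3, 1:3] = 1200.0
vol[2, 2, 2] = 3071.0
mask = segment_prosthesis(vol)
```

    Only the metal voxel survives the threshold; bone at 1200 HU is correctly excluded.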

    [0078] S25: building a prosthesis template library. Building a prosthesis template library based on the statistical model, the CT image data of the patient, the shape parameter and gray parameter of the bone of the patient, and the prosthesis model. The contents stored in the prosthesis template library contain the parts shown in the table below:

    Prosthesis Template Library

    [0079]

    TABLE-US-00002
    Name                  Data Type   Remarks
    Serial No.            int         Unrepeatable, unique
    File Name             varchar255  Image file name
    Storage Path          varchar255  Absolute path
    Shape Parameter       varchar255  File path
    Average Shape Vector  varchar255  File path
    Shape Feature Matrix  varchar255  File path
    Gray Parameter        varchar255  File path
    Average Gray Vector   varchar255  File path
    Gray Feature Matrix   varchar255  File path
    Storage Time          timestamp   Automatically affixing timestamp
    The shape feature matrix, the shape parameter, the average shape vector, the gray feature matrix, the gray parameter and the average gray vector cannot be directly stored in the database table because the data amount is too large; alternatively, they can be saved as binary files, and their storage paths can be stored in the database.
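
    A minimal sketch of such a table, here in SQLite with only a subset of the columns (the column and file names are illustrative), shows the design choice of storing file paths rather than the large binary data itself:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE prosthesis_template (
        serial_no     INTEGER PRIMARY KEY,                -- unrepeatable, unique
        file_name     VARCHAR(255),                       -- image file name
        storage_path  VARCHAR(255),                       -- absolute path
        shape_param   VARCHAR(255),                       -- path to binary file
        gray_param    VARCHAR(255),                       -- path to binary file
        stored_at     TIMESTAMP DEFAULT CURRENT_TIMESTAMP -- auto timestamp
    )
""")
conn.execute(
    "INSERT INTO prosthesis_template (file_name, storage_path, shape_param, gray_param) "
    "VALUES (?, ?, ?, ?)",
    ("patient_001.nii", "/data/templates/patient_001.nii",
     "/data/params/shape_001.bin", "/data/params/gray_001.bin"),
)
row = conn.execute(
    "SELECT file_name, shape_param FROM prosthesis_template"
).fetchone()
```

    Retrieval then loads the binary parameter files from the stored paths instead of pulling large vectors out of the database itself.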

    [0080] Step 3: carrying out personalized registration of the prosthesis template library.

    [0081] S31: preprocessing data. Acquiring the bone CT images of the patient to be implanted with a bone prosthesis. Referring to the methods of S11 and S12 in step 1, carrying out unified space and gray normalization on the CT images, and marking the anatomical landmarks of the bone CT images of the patient by experts.

    [0082] S32: carrying out personalized registration and parameterized representation. Referring to the methods of S22 and S23 in step 2, carrying out personalized registration on the bone CT images of the patient by using the statistical model, and calculating the shape parameter b′.sub.s and the gray parameter b′.sub.g of the bone of the patient through matching of the shape model and the gray model.

    [0083] S33: matching data. Retrieving the data of template images stored in the prosthesis template library and similar to those of the bone of the patient from the prosthesis template library by using a similarity calculation formula according to the shape parameter and gray parameter describing the bone of the patient obtained by registration, the similarity calculation formula is as follows:

    [00002] D=λ.sub.1Σ.sub.i=0.sup.M(b′.sub.s−b.sub.s).sup.2+λ.sub.2Σ.sub.i=0.sup.M(b′.sub.g−b.sub.g).sup.2

    wherein Σ.sub.i=0.sup.M(b′.sub.s−b.sub.s).sup.2 is the variance between the shape parameter of the bone of the patient and that of the template image in the prosthesis template library for measurement of the level of similarity in the shape of the bone between two images; Σ.sub.i=0.sup.M(b′.sub.g−b.sub.g).sup.2 is the variance between the gray parameter of the bone of the patient and that of the template image in the prosthesis template library for measurement of the level of similarity in the gray of the bone between two images; λ.sub.1 and λ.sub.2 are weights of the shape similarity and the gray similarity, the sum thereof is 1, and the values of λ.sub.1 and λ.sub.2 are assigned according to different weights to realize the retrieval matching method based on the shape similarity or based on the gray similarity.

    [0084] Calculating the similarity D between each record of data in the prosthesis template library and the data of the patient, obtaining the template CT image most similar to the bone of the patient after comparison, and acquiring the bone prosthesis model corresponding to that image as the initial reference template for the personalized bone prosthesis of the patient, for doctors to perform subsequent analysis and design of bone prostheses.
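
    The retrieval of S33 can be sketched as follows; the parameter vectors and the two library entries are hypothetical, and equal weights are used for the shape and gray similarity:

```python
import numpy as np

def similarity(b_s_new, b_g_new, b_s_tpl, b_g_tpl, lam1=0.5, lam2=0.5):
    """D = lam1*sum((b_s' - b_s)^2) + lam2*sum((b_g' - b_g)^2), lam1+lam2=1."""
    return (lam1 * np.sum((b_s_new - b_s_tpl) ** 2)
            + lam2 * np.sum((b_g_new - b_g_tpl) ** 2))

def retrieve_best(b_s_new, b_g_new, library):
    """Return the index of the library entry with the smallest distance D."""
    scores = [similarity(b_s_new, b_g_new, e["b_s"], e["b_g"]) for e in library]
    return int(np.argmin(scores))

# hypothetical template library with precomputed shape/gray parameters
library = [
    {"b_s": np.array([0.0, 0.0]), "b_g": np.array([0.0, 0.0])},
    {"b_s": np.array([1.0, 0.2]), "b_g": np.array([0.5, 0.1])},
]
best = retrieve_best(np.array([0.9, 0.1]), np.array([0.4, 0.0]), library)
```

    Setting λ.sub.1 close to 1 would rank templates by shape similarity alone; setting λ.sub.2 close to 1 would rank them by gray (density) similarity alone.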

    [0085] The registration of the hip bone is described as an example; although the anatomical morphology and the gray changes of other bones of a human body are different, the thought and the method of the present invention are also applicable to other bones. Although the embodiments of the present invention have been shown and described, various amendments and improvements made by those of ordinary skill in the art without departing from the principle and basic thought of the present invention shall be considered to belong to the protection scope of the present invention.