METHOD FOR TRAINING POST-PROCESSING DEVICE FOR DENOISING MRI IMAGE AND COMPUTING DEVICE FOR THE SAME

20230162326 · 2023-05-25

    Abstract

    Disclosed is a training method including outputting an MRI signal from a plurality of coils included in an MRI scanner and performing, by a computing device, supervised learning on a post-processing part included in the computing device by using, as training input data, a first image generated using a first group of coils among the plurality of coils and using, as a label, a second image generated using a second group of coils among the plurality of coils.

    Claims

    1. A magnetic resonance imaging (MRI) system comprising: an MRI scanner including a first group of coils and a second group of coils; and a computing device including a post-processing part for post-processing an MRI image and a training management part, wherein a first image generated based on signals obtained from the first group of coils is used as training input data for supervised learning of the post-processing part, a second image generated based on signals obtained from the second group of coils is used as a label for supervised learning of the post-processing part, and the training management part is configured to perform supervised learning on the post-processing part using the training input data and the label.

    2. The MRI system of claim 1, wherein the first image is a first MRI image generated based on a first group of MRI signals obtained from the first group of coils, generating of the second image includes: generating a second MRI image based on a second group of MRI signals obtained from the second group of coils; generating an intermediate label image based on the second MRI image so as to eliminate a correlation between first noise in the first MRI image and second noise in the second MRI image; and generating a label image based on the intermediate label image so as to compensate for a difference in sensitivity between the first group of coils and the second group of coils, and the second image is the generated label image.

    3. The MRI system of claim 2, wherein the first MRI image is an image obtained by synthesizing images of a first group generated from the MRI signals of the first group obtained from the first group of coils, the second MRI image is an image obtained by synthesizing images of a second group generated from the MRI signals of the second group obtained from the second group of coils, and the MRI scanner includes a transform part configured to generate the images of the first group from the MRI signals of the first group and generate the images of the second group from the MRI signals of the second group.

    4. The MRI system of claim 2, wherein the intermediate label image is generated based on a weighted sum of the first MRI image and the second MRI image.

    5. The MRI system of claim 1, wherein the first image is a first MRI image generated based on a first group of MRI signals obtained from the first group of coils, generating of the second image includes: generating a second MRI image based on a second group of MRI signals obtained from the second group of coils; and generating a label image based on the second MRI image so as to compensate for a difference in sensitivity between the first group of coils and the second group of coils, and the second image is the generated label image.

    6. The MRI system of claim 1, wherein while performing the supervised learning, the post-processing part is configured to receive an input of the first image to generate a post-processed image, and the training management part is configured to train the post-processing part using a loss function between the post-processed image and the second image.

    7. A magnetic resonance imaging (MRI) system comprising: an MRI scanner including a first group of coils and a second group of coils and configured to output an MRI image; and a computing device including a trainable post-processing part and a training management part configured to train the post-processing part, wherein the post-processing part is configured to, during a training process of the post-processing part, receive an input of a first image generated based on signals obtained from the first group of coils to generate a training post-processed image, the training management part is configured to, during the training process of the post-processing part, train the post-processing part using a loss function between the training post-processed image and a second image generated based on signals obtained from the second group of coils, and the post-processing part is configured to receive the MRI image to output an image obtained by denoising the MRI image after the training process of the post-processing part is completed.

    8. The MRI system of claim 7, wherein the first image is a first MRI image generated based on a first group of MRI signals obtained from the first group of coils, generating of the second image includes: generating a second MRI image based on a second group of MRI signals obtained from the second group of coils; generating an intermediate label image based on the second MRI image so as to eliminate a correlation between first noise in the first MRI image and second noise in the second MRI image; and generating a label image based on the intermediate label image so as to compensate for a difference in sensitivity between the first group of coils and the second group of coils, and the second image is the generated label image.

    9. The MRI system of claim 7, wherein the first image is a first MRI image generated based on a first group of MRI signals obtained from the first group of coils, generating of the second image includes: generating a second MRI image based on a second group of MRI signals obtained from the second group of coils; and generating a label image based on the second MRI image so as to compensate for a difference in sensitivity between the first group of coils and the second group of coils, and the second image is the generated label image.

    10. A method of denoising a magnetic resonance imaging (MRI) image, comprising: outputting, by an MRI scanner, an MRI signal from a plurality of coils included in the MRI scanner; and inputting, by a computing device, an MRI image generated using signals obtained from the plurality of coils to a post-processing part included in the computing device to generate a post-processed image obtained by denoising the MRI image, wherein the post-processing part is trained using a supervised learning method, wherein the supervised learning method includes: generating, by a second MRI scanner including a first group of coils and a second group of coils, a first image based on signals obtained from the first group of coils; generating, by the second MRI scanner, a second image based on signals obtained from the second group of coils; and performing, by a second computing device, supervised learning on the post-processing part by using the first image as training input data for supervised learning of the post-processing part and using the second image as a label for supervised learning of the post-processing part.

    11. The method of claim 10, wherein the second computing device is the same device as the computing device, the second MRI scanner is the same device as the MRI scanner, and the plurality of coils included in the MRI scanner include the first group of coils and the second group of coils.

    12. The method of claim 10, wherein the first image is a first MRI image generated based on a first group of MRI signals obtained from the first group of coils, generating of the second image includes: generating a second MRI image based on a second group of MRI signals obtained from the second group of coils; generating an intermediate label image based on the second MRI image so as to eliminate a correlation between first noise in the first MRI image and second noise in the second MRI image; and generating a label image based on the intermediate label image so as to compensate for a difference in sensitivity between the first group of coils and the second group of coils, and the second image is the generated label image.

    13. The method of claim 10, wherein the first image is a first MRI image generated based on a first group of MRI signals obtained from the first group of coils, generating of the second image includes: generating a second MRI image based on a second group of MRI signals obtained from the second group of coils; and generating a label image based on the second MRI image so as to compensate for a difference in sensitivity between the first group of coils and the second group of coils, and the second image is the generated label image.

    14. A neural network training method for training a post-processing part configured to receive an input of a magnetic resonance imaging (MRI) image and denoise the MRI image, the method comprising: generating, by an MRI scanner including a first group of coils and a second group of coils, a first image based on signals obtained from the first group of coils; generating, by the MRI scanner, a second image based on signals obtained from the second group of coils; and performing, by a computing device, supervised learning on the post-processing part by using the first image as training input data for supervised learning of the post-processing part and using the second image as a label for supervised learning of the post-processing part.

    Description

    BRIEF DESCRIPTION OF THE DRAWINGS

    [0081] Exemplary embodiments can be understood in more detail from the following description taken in conjunction with the accompanying drawings, in which:

    [0082] FIG. 1 illustrates a main configuration of an MRI scanner;

    [0083] FIG. 2 is a diagram illustrating a process of generating a K-space or MRI image using an MRI signal obtained by the MRI scanner illustrated in FIG. 1;

    [0084] FIG. 3 is a diagram illustrating a concept of an MRI image including noise;

    [0085] FIG. 4 illustrates a method of denoising an MRI image according to a comparative embodiment;

    [0086] FIG. 5 illustrates a method of denoising an MRI image according to another comparative embodiment;

    [0087] FIG. 6 is a diagram illustrating images measured by using a plurality of coils included in an MRI scanner and a method of generating an MRI image using the images;

    [0088] FIG. 7 is a diagram for describing an equation that represents a relationship between a sub-image obtained by a coil [c] having an index c among coils and sensitivity of the coil [c];

    [0089] FIG. 8 is a diagram illustrating a configuration of an MRI system provided according to an embodiment of the present invention;

    [0090] FIG. 9 is a diagram illustrating a configuration of an MRI system provided according to an embodiment of the present invention modified from FIG. 8;

    [0091] FIG. 10 is a diagram illustrating a function performed by a computing device provided according to an embodiment of the present invention after training of a post-processing part is completed;

    [0092] FIG. 11A is a diagram illustrating a process of generating two images through a one-time data acquisition process in an MRI scanner and training a post-processing part by using the two images according to an embodiment of the present invention;

    [0093] FIG. 11B illustrates an embodiment modified from the embodiment illustrated in FIG. 11A;

    [0094] FIG. 11C illustrates a configuration of a system, which is provided according to a preferred embodiment of the present invention, for performing a training method of a post-processing part for denoising an MRI image;

    [0095] FIG. 12 illustrates a method of denoising an MRI image using a trained post-processing part according to an embodiment of the present invention;

    [0096] FIG. 13 is a flowchart illustrating a training method provided according to an embodiment of the present invention;

    [0097] FIG. 14 is a flowchart illustrating the supervised learning operation of FIG. 13 in detail; and

    [0098] FIG. 15 is a diagram illustrating the computing device shown in FIGS. 9, 10, and 12 from a hardware aspect.

    DETAILED DESCRIPTION OF EMBODIMENTS

    [0099] Embodiments of the present invention will be described with reference to the accompanying drawings. However, the present invention is not limited to the embodiments described herein, and may be implemented in various different forms. The terminology used herein is not for limiting the scope of the present invention but for describing the embodiments. Furthermore, the singular forms used herein include the plural forms as well, unless otherwise indicated.

    [0100] FIG. 8 is a diagram illustrating a configuration of an MRI system provided according to an embodiment of the present invention.

    [0101] An MRI system 1000 may include an MRI scanner 200 and a computing device 100.

    [0102] In the case where the MRI scanner 200 and the computing device 100 are integrally provided, the MRI system may be simply referred to as an MRI scanner.

    [0103] The MRI scanner 200 may include a plurality of coils 211, 212, 221, and 222. The MRI scanner 200 has a space capable of accommodating an object 70.

    [0104] The number of the plurality of coils may be, for example, N in total, but only four coils are illustrated in FIG. 8 for convenience.

    [0105] The plurality of coils are divided into a plurality of groups, for example, two groups.

    [0106] In the example of FIG. 8, the plurality of coils 211, 212, 221, and 222 are divided into first group coils 211 and 212 [G1] and second group coils 221 and 222 [G2]. The first group is indicated by symbol G1, and the second group is indicated by symbol G2.

    [0107] In an embodiment, a total number of coils included in the first group and a total number of coils included in the second group may be different from each other. Alternatively, in another embodiment, the total number of coils included in the first group and the total number of coils included in the second group may be the same. However, the present invention is not limited by the number of coils included in each group.

    [0108] In a preferred embodiment, a union of the first group of coils and the second group of coils is the same as a set of all of coils included in the MRI scanner 200. Here, there may be no overlapping coil between the first group of coils and the second group of coils.
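    The disjoint-group condition described above can be checked with a short sketch. The coil identifiers below are hypothetical stand-ins for coils 211, 212, 221, and 222 and are not taken from the present disclosure:

```python
# Hypothetical identifiers standing in for the coils 211, 212, 221, and 222.
all_coils = {"c11", "c12", "c21", "c22"}
group1 = {"c11", "c12"}  # first group G1 of coils
group2 = {"c21", "c22"}  # second group G2 of coils

# The union of the two groups equals the set of all coils in the scanner.
assert group1 | group2 == all_coils
# There is no overlapping coil between the two groups.
assert group1 & group2 == set()
```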

    [0109] The computing device 100 may include a post-processing part 110, a training management part 120, and a label image generating part 130.

    [0110] The post-processing part 110 may be a trainable network. For example, the post-processing part 110 may include an artificial intelligence network, a neural network, or a machine learning network.

    [0111] The training management part 120 may be configured to manage a training process of the post-processing part 110.

    [0112] In an embodiment of the present invention, a second MRI image 620 may be used as a label for supervised learning as it is.

    [0113] In a preferred embodiment of the present invention, an image generated by correcting the second MRI image 620, i.e., a label image, may be used as a label for supervised learning. The label image generating part 130 is configured to generate the label image by correcting the second MRI image 620.

    [0114] A label generation process for generating, by the label image generating part 130, the label image from the second MRI image 620 may include correcting the second MRI image 620 based on a difference in sensitivity between the first group of coils and the second group of coils. Furthermore, the label generation process may further include correcting the second MRI image 620 by eliminating a correlation between noise included in a first MRI image and noise included in the second MRI image.

    [0115] When training of the post-processing part 110 is completed, the post-processing part 110 may autonomously operate without intervention of the training management part 120.

    [0116] FIG. 9 is a diagram illustrating a configuration of an MRI system provided according to an embodiment of the present invention modified from FIG. 8.

    [0117] FIG. 9 illustrates the same structure as that illustrated in FIG. 8 except that the computing device 100 is separated from the MRI system 1000.

    [0118] FIG. 10 is a diagram illustrating a function performed by a computing device provided according to an embodiment of the present invention after training of a post-processing part is completed.

    [0119] The computing device 100 may obtain an MRI image output from the MRI scanner 200 and provide the MRI image to the post-processing part 110.

    [0120] The MRI image may be an image including noise. The MRI image may be an image generated using all of the coils included in the MRI scanner 200. For example, the MRI image may be an image including noise and indicated by reference number 600 in FIGS. 2, 3, 4, or 5.

    [0121] The post-processing part 110 may output a denoised image by processing the MRI image including noise. Performance of the post-processing part 110 may be evaluated to be better as an image output from the post-processing part 110 is closer to the true image (x).

    [0122] FIG. 11A is a diagram illustrating a process of generating two images through a one-time data acquisition process in an MRI scanner and training a post-processing part by using the two images according to an embodiment of the present invention.

    [0123] Here, the “one-time data acquisition process” may represent a process of acquiring one output signal from each of the substantially available coils among the coils included in the MRI scanner 200.

    [0124] Here, the coils may have a signal detection function.

    [0125] As described above, the plurality of coils included in the MRI scanner 200 are divided into a plurality of groups, for example, two groups.

    [0126] FIG. 11A illustrates an example in which the coils 211 and 212 belong to the first group G1 among the plurality of groups and the coils 221 and 222 belong to the second group G2 among the plurality of groups.

    [0127] The MRI scanner 200 or the MRI system 1000 may be configured to generate a first MRI signal 510 including signals output from the first group G1 of coils and generate a second MRI signal 520 including signals output from the second group of coils G2.

    [0128] In a preferred embodiment, the signals output from the second group G2 of coils may not be included in the first MRI signal 510, and the signals output from the first group G1 of coils may not be included in the second MRI signal 520.

    [0129] A transform part 230 included in the MRI scanner 200 or the MRI system 1000 may be configured to transform the first MRI signal 510 into a first image 610 (= first MRI image) (I.sub.input) and transform the second MRI signal 520 into the second MRI image 620 (I.sub.label).

    [0130] In the present disclosure, the first image 610 (I.sub.input) may be referred to as a first MRI image 610 (I.sub.input).

    [0131] In an embodiment of the present invention, the transform part 230 may include a first transform part 231 for transforming the first MRI signal 510 into the first MRI image 610 and a second transform part 232 for transforming the second MRI signal 520 into the second MRI image 620.

    [0132] The first image 610 (I.sub.input) generated using the signals output from the first group G1 of coils may be provided as training input data of the post-processing part 110. The post-processing part 110 may output a post-processed image 630 generated by post-processing the first image 610 (I.sub.input). The post-processed image 630 may be provided as a first input to the training management part 120.

    [0133] In an embodiment, the second MRI image 620 (I.sub.label) generated using the signals output from the second group G2 of coils may be directly provided as a second input to the training management part 120. In this case, the second MRI image 620 (I.sub.label) is a label image, i.e., a second image 820.

    [0134] However, as illustrated in FIG. 11A, according to a preferred embodiment of the present invention, the image generated by correcting the second MRI image 620, i.e., the label image 820 (I″.sub.label), may be used as a label for supervised learning. The label image generating part 130 is configured to generate the label image 820 (I″.sub.label) by correcting the second MRI image 620.

    [0135] The label generation process may further include generating a first corrected image (I′.sub.label) by correcting the second MRI image 620 (I.sub.label) in order to eliminate a correlation between noise included in the first MRI image 610 and noise included in the second MRI image 620.

    [0136] If there is no correlation between the noise included in the first MRI image 610 and the noise included in the second MRI image 620, the second MRI image 620 (I.sub.label) may serve as the first corrected image (I′.sub.label) as it is.

    [0137] Furthermore, a label generation process for generating, by the label image generating part 130, the label image 820 (I″.sub.label) from the second MRI image 620 (I.sub.label) may include generating the label image 820 (I″.sub.label) based on a difference in sensitivity between the first group G1 of coils and the second group G2 of coils.

    [0138] By compensating for the difference in sensitivity between the coils, the noise-free content of the first MRI image 610 provided to the post-processing part 110 and the noise-free content of the label image 820 (I″.sub.label) provided to the training management part 120 may be rendered identical.

    [0139] The training management part 120 may calculate a loss value according to a loss function between the post-processed image 630 and the label image 820 (I″.sub.label). Furthermore, update information P121 for changing a parameter θ of the post-processing part 110 may be generated so as to reduce the loss value according to the loss function. The training management part 120 may train the post-processing part 110 by changing the parameter θ of the post-processing part 110 using the update information P121.
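    The training step described above can be sketched as follows. This is a minimal illustration under stated assumptions: a hypothetical single-parameter denoiser f.sub.θ(I) = θ·I and plain gradient descent stand in for the actual network and optimizer of the post-processing part 110, and synthetic arrays stand in for MRI images:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=(8, 8))                     # stand-in for the true image
I_input = x + 0.3 * rng.normal(size=x.shape)    # first image: noisy training input
I_label = x + 0.3 * rng.normal(size=x.shape)    # second image: independently noisy label

theta = 0.0   # the parameter θ of the hypothetical denoiser f_θ(I) = θ·I
lr = 0.01     # learning rate (illustrative value)
losses = []
for _ in range(200):
    post_processed = theta * I_input              # post-processed image 630
    residual = post_processed - I_label
    losses.append(float(np.mean(residual ** 2)))  # loss value of the loss function
    grad = 2.0 * np.mean(residual * I_input)      # d(loss)/dθ for this toy denoiser
    theta -= lr * grad                            # analogue of update information P121

assert losses[-1] < losses[0]   # the loss value decreases over training
```

Because the label noise is independent of the input noise, minimizing the loss against the noisy label drives θ toward the same value as minimizing it against the clean image, which is the premise of this training scheme.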

    [0140] When the MRI scanner 200 performs the data acquisition process multiple times, a plurality of different sets of the first image and the label image may be obtained. For example, the MRI scanner 200 may perform the data acquisition process N times in order to prepare N different sets of the first image and the label image.

    [0141] With regard to two different data acquisition processes, the scan target (for example, a person) scanned by the MRI scanner 200 may be different. Alternatively, the same scan target may assume a different posture in the two processes.

    [0142] The computing device 100 may finish training of the post-processing part 110 by repeating the training using the plurality of different sets of the first image and the label image.

    [0143] Here, the first image 610 (I.sub.input) may satisfy [Equation 5].

$$I_{\text{input}} = \sum_i s_i^H y_i = \sum_i s_i^H (s_i x + n_i) = \sum_i s_i^H s_i\,x + \sum_i s_i^H n_i, \qquad E\!\left[\sum_i s_i^H n_i\right] = \sum_i s_i^H E[n_i] = 0$$

    [0144] In [Equation 5], the subscript i denotes an index identifying the coils belonging to the first group G1, and s.sub.i denotes the spatial sensitivity of the corresponding coil.

    [0145] Here, the second MRI image 620 (I.sub.label) may satisfy [Equation 6].

$$I_{\text{label}} = \sum_j s_j^H y_j = \sum_j s_j^H (s_j x + n_j) = \sum_j s_j^H s_j\,x + \sum_j s_j^H n_j, \qquad E\!\left[\sum_j s_j^H n_j\right] = \sum_j s_j^H E[n_j] = 0$$

    [0146] In [Equation 6], the subscript j denotes an index identifying the coils belonging to the second group G2, and s.sub.j denotes the spatial sensitivity of the corresponding coil.

    [0147] In [Equation 5] and [Equation 6], the expectation values of the noise terms are 0, and the noise terms are independent of each other. Furthermore, n.sub.i and n.sub.j play symmetric roles.

    [0148] Meanwhile, when the first image 610 (I.sub.input) and the second MRI image 620 (I.sub.label) are given as expressed in [Equation 5] and [Equation 6], a loss function used by the training management part 120 may be defined as [Equation 7].

$$\text{loss} = \left\| f_\theta(I_{\text{input}}) - I_{\text{label}} \right\|_2^2$$

    [0149] Here, f.sub.θ(I.sub.input) represents the post-processed image 630.

    [0150] The description of FIG. 5 was provided on the assumption that the value (x) of the term obtained by eliminating noise from the MRI image 600 (x+n.sub.1) input to the network 111 (f.sub.θ) is equal to the value (x) of the term obtained by eliminating noise from the other label image 604 (x+n.sub.2) used as a label.

    [0151] However, the noise-free term $\sum_i s_i^H s_i\,x$ of I.sub.input in [Equation 5] is the true image (x) multiplied by $\sum_i s_i^H s_i$, whereas the noise-free term $\sum_j s_j^H s_j\,x$ of the second MRI image 620 (I.sub.label) in [Equation 6] is the true image (x) multiplied by $\sum_j s_j^H s_j$. That is, the value of the term obtained by eliminating noise from I.sub.input of [Equation 5] and the value of the term obtained by eliminating noise from I.sub.label of [Equation 6] are different from each other.

    [0152] Therefore, there may occur an issue in which a combination of [Equation 5], [Equation 6], and [Equation 7] does not satisfy the assumption given with regard to FIG. 5. This issue may be resolved by correcting the loss function as expressed in [Equation 8].

$$\text{loss} = \left\| \left( \sum_j s_j^H s_j \right) f_\theta(I_{\text{input}}) - \left( \sum_i s_i^H s_i \right) I_{\text{label}} \right\|_2^2$$

    [0153] That is, the loss function is redefined by multiplying f.sub.θ(I.sub.input) by $\sum_j s_j^H s_j$, which is the proportionality factor contained in I.sub.label, and multiplying I.sub.label by $\sum_i s_i^H s_i$, which is the proportionality factor contained in f.sub.θ(I.sub.input).
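    The effect of this correction can be illustrated with a small numerical sketch. The per-pixel proportionality factors below are hypothetical, and noise is omitted so that only the sensitivity mismatch remains: for a noise-free input and an ideal denoiser, the loss of [Equation 7] stays non-zero, while the loss of [Equation 8] vanishes:

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4))  # true image (complex)

# Hypothetical per-pixel proportionality factors sum_i s_i^H s_i and
# sum_j s_j^H s_j; both are real and non-negative by construction.
A = rng.uniform(0.5, 1.5, size=(4, 4))   # factor of group G1 in [Equation 5]
B = rng.uniform(0.5, 1.5, size=(4, 4))   # factor of group G2 in [Equation 6]

I_input = A * x   # noise-free part of the first image per [Equation 5]
I_label = B * x   # noise-free part of the second image per [Equation 6]

f_out = I_input   # an ideal denoiser leaves the noise-free input unchanged

loss_eq7 = np.sum(np.abs(f_out - I_label) ** 2)          # [Equation 7]
loss_eq8 = np.sum(np.abs(B * f_out - A * I_label) ** 2)  # [Equation 8]

assert loss_eq7 > 1e-6   # sensitivity mismatch leaves a residual error
assert loss_eq8 < 1e-9   # the corrected loss vanishes for the ideal denoiser
```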

    [0154] In a first embodiment of the present invention, I.sub.input of [Equation 5] is used as the first image provided to the post-processing part 110, I.sub.label of [Equation 6] is used as the label image provided to the training management part 120, and the loss of [Equation 8] is used as the loss function. This first embodiment satisfies the assumption given with regard to FIG. 5. That is, even if the post-processing part 110 is trained using the second MRI image 620 (I.sub.label) as a label image, a training effect may be achieved that is the same as or similar to the effect obtained when training the post-processing part 110 using, as a label image, the true image (x) underlying the first image 610 (I.sub.input).

    [0155] In a second embodiment of the present invention, I.sub.input of [Equation 5] is used as the first image provided to the post-processing part 110, I.sub.label of [Equation 9] shown below is used as the label image provided to the training management part 120, and the loss of [Equation 7] is used as the loss function. This second embodiment satisfies the assumption given with regard to FIG. 5. Therefore, the same training effect as that of the first embodiment may be achieved.

$$I_{\text{label}} = \left( \sum_j s_j^H s_j\,x + \sum_j s_j^H n_j \right) \cdot \left( \sum_i s_i^H s_i \right) \Big/ \left( \sum_j s_j^H s_j \right)$$

    [0156] In a third embodiment of the present invention, I.sub.input of [Equation 10] shown below is used as the first image provided to the post-processing part 110, I.sub.label of [Equation 6] is used as the label image provided to the training management part 120, and the loss of [Equation 7] is used as the loss function. This third embodiment satisfies the assumption given with regard to FIG. 5. Therefore, the same training effect as that of the first embodiment may be achieved.

$$I_{\text{input}} = \left( \sum_i s_i^H s_i\,x + \sum_i s_i^H n_i \right) \cdot \left( \sum_j s_j^H s_j \right) \Big/ \left( \sum_i s_i^H s_i \right)$$

    [0157] FIG. 11B illustrates an embodiment modified from the embodiment illustrated in FIG. 11A.

    [0158] Hereinafter, descriptions about FIG. 11B will be provided with focus on differences with FIG. 11A.

    [0159] The transform part 230 may transform the first MRI signal 510 into the first MRI image 610, and thereafter may transform the second MRI signal 520 into the second MRI image 620. To this end, the transform part 230 may include a buffer for storing the second MRI signal 520 while transforming the first MRI signal 510 into the first MRI image 610.

    [0160] In FIGS. 11A and 11B, the transform part 230 may be a computing module including a programmed FPGA or dedicated calculation part and a memory.

    [0161] In FIGS. 11A and 11B, the signals output from the first group of coils are input to the transform part 230 after being combined by a first signal combining part 241, and the signals output from the second group of coils are input to the transform part 230 after being combined by a second signal combining part 242. However, in the modified embodiment, all of the signals output from the coils belonging to the first and second groups may be directly input to the transform part 230 without passing through the first signal combining part 241 and the second signal combining part 242. Furthermore, the signals output from the first group of coils may be combined with each other in the transform part 230, and the signals output from the second group of coils may be combined with each other in the transform part 230.

    Preferred Embodiment

    [0162] FIG. 11C illustrates a configuration of a system, which is provided according to a preferred embodiment of the present invention, for performing a training method of a post-processing part for denoising an MRI image.

    [0163] Described below with reference to FIG. 11C is a process of generating two images through a one-time data acquisition process in an MRI scanner and training a post-processing part by using the two images according to an embodiment of the present invention.

    [0164] The post-processing part is a denoising network.

    [0165] The MRI scanner 200 may include the plurality of coils 211, 212, 221, and 222, the transform part 230, and the image combining parts 251 and 252.

    [0166] The plurality of coils included in the MRI scanner 200 are divided into a plurality of groups, for example, two groups. In the example of FIG. 11C, the coils 211 and 212 belong to the first group G1 among the plurality of groups and the coils 221 and 222 belong to the second group G2 among the plurality of groups.

    [0167] When the one-time data acquisition process is performed in the MRI scanner 200, the 11th coil 211, the 12th coil 212, the 21st coil 221, and the 22nd coil 222 output an 11th MRI signal 511, a 12th MRI signal 512, a 21st MRI signal 521, and a 22nd MRI signal 522, respectively.

    [0168] The transform part 230 may generate an 11th MRI image 611, a 12th MRI image 612, a 21st MRI image 621, and a 22nd MRI image 622 by transforming the 11th MRI signal 511, the 12th MRI signal 512, the 21st MRI signal 521, and the 22nd MRI signal 522, respectively.

    [0169] In the present disclosure, an MRI image obtained by combining all of the 11th MRI image 611, the 12th MRI image 612, the 21st MRI image 621, and the 22nd MRI image 622 may be denoted by x. Here, each of the 11th MRI image 611, the 12th MRI image 612, the 21st MRI image 621, and the 22nd MRI image 622 may be referred to as an individual channel image and denoted by y.sub.i.

    [0170] Here, [Equation 11] is established.

$$y_i = s_i x + n_i, \quad \text{where } x \in \mathbb{C}^2,\; y_i \in \mathbb{C}^2,\; s_i \in \mathbb{C}^2$$

    [0171] Here, s.sub.i is the coil sensitivity, and n.sub.i is the noise of the i.sup.th channel image, modeled as zero-mean Gaussian with a standard deviation of σ.sub.i along both the real and imaginary axes. Matrix multiplication (or division) hereafter denotes Hadamard (element-wise) multiplication (or division).

    [0172] In an embodiment, the transform part 230 may include an 11th transform part 2311 for generating the 11th MRI image 611 from the 11th MRI signal 511, a 12th transform part 2312 for generating the 12th MRI image 612 from the 12th MRI signal 512, a 21st transform part 2321 for generating the 21st MRI image 621 from the 21st MRI signal 521, and a 22nd transform part 2322 for generating the 22nd MRI image 622 from the 22nd MRI signal 522.

    [0173] The first image combining part 251 may generate the first MRI image 610 (I_input) by combining the 11th MRI image 611 and the 12th MRI image 612.

    [0174] The second image combining part 252 may generate the second MRI image 620 (I_label) by combining the 21st MRI image 621 and the 22nd MRI image 622.

    [0175] The first MRI image 610 (I_input) and the second MRI image 620 (I_label) satisfy [Equation 12].

    [Equation 12] I_input = Σ_j s_j^H y_j

    I_label = Σ_k s_k^H y_k

    [0176] Here, j ranges over the first group G1, k ranges over the second group G2, and s_i^H is the Hermitian (conjugate) of s_i. It is assumed that the two images cover the entire imaging volume because most individual coils have relatively large volume coverage and are mutually coupled.
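    The sensitivity-weighted combination of [Equation 12] can be sketched as follows; all sensitivities, images, and noise levels are toy assumptions for illustration only.

```python
import numpy as np

# Sketch of [Equation 12]: Hermitian-sensitivity-weighted combination,
# I_input = sum_j conj(s_j) * y_j over group G1, likewise I_label over G2.
rng = np.random.default_rng(1)
H, W = 8, 8
x = rng.standard_normal((H, W)) + 1j * rng.standard_normal((H, W))  # true image
sens = {  # hypothetical per-channel sensitivities (constant maps)
    "c11": 0.8 + 0.2j, "c12": 0.7 - 0.1j,   # first group G1
    "c21": 0.9 + 0.0j, "c22": 0.6 + 0.3j,   # second group G2
}
noise = lambda: 0.05 * (rng.standard_normal((H, W)) + 1j * rng.standard_normal((H, W)))
y = {c: s * x + noise() for c, s in sens.items()}  # channel images y_i

def combine(channels):
    # s^H y reduces to conj(s) * y under voxel-wise (Hadamard) operations
    return sum(np.conj(sens[c]) * y[c] for c in channels)

I_input = combine(["c11", "c12"])   # first MRI image (group G1)
I_label = combine(["c21", "c22"])   # second MRI image (group G2)
```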

    [0177] The first MRI image 610 (I_input) and the second MRI image 620 (I_label) may be provided to the computing device 100.

    [0178] The first MRI image 610 may be provided as training input data of the post-processing part 110. The post-processing part 110 may output the post-processed image 630 generated by post-processing the first MRI image 610 (I_input). The post-processed image 630 may be provided as a first input to the training management part 120.

    [0179] The second MRI image 620 (I_label) may be input to the noise decorrelation part 131. The noise decorrelation part 131 is configured to transform the second MRI image 620 (I_label) so as to eliminate a correlation between first noise in the first MRI image 610 (I_input) and second noise in the second MRI image 620 (I_label).

    [0180] The noise decorrelation part 131 is configured to output an intermediate label image 720 (I′_label) by transforming the second MRI image 620 (I_label).

    [0181] These two images, I_input and I_label, have different coil sensitivity weightings and may have correlated noise (e.g., due to mutual inductance between channels). Therefore, they must be further processed to satisfy three conditions: first, the paired images have independent noise; second, they share the same noise-free image; and third, the expectation of the noise is zero. To impose noise independence between the two images, a generalized least-squares solution is applied, which modifies the label image as indicated by [Equation 13].

    [0182] The intermediate label image 720 (I′_label), the second MRI image 620 (I_label), and the first MRI image 610 (I_input) have the relationship expressed in [Equation 13].

    [Equation 13] I′_label = α I_input + β I_label

    with α = −σ_JK² / √(σ_J²σ_K² − (σ_JK²)²) and β = σ_J² / √(σ_J²σ_K² − (σ_JK²)²)

    [0183] Here, σ_J², σ_K², and σ_JK² are matrices (∈ ℝ²) calculated as var(|Σ_j s_j^H y_j|), var(|Σ_k s_k^H y_k|), and cov(|Σ_j s_j^H y_j|, |Σ_k s_k^H y_k|), respectively. All operations in these equations are voxel-wise. As a result, I_label is modified to I′_label, so that the noise covariance between I_input and I′_label becomes zero.
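    The voxel-wise decorrelation of [Equation 13] can be sketched numerically. The variance and covariance maps below are estimated from repeated noise draws purely to make the example verifiable; all noise levels are assumed values.

```python
import numpy as np

# Sketch of [Equation 13]: generalized-least-squares decorrelation of the
# label image against the input image, voxel by voxel.
rng = np.random.default_rng(2)
n_draws, H, W = 4000, 4, 4
clean = np.ones((H, W))                                  # shared noise-free image
shared = 0.05 * rng.standard_normal((n_draws, H, W))     # correlated noise component
nJ = 0.1 * rng.standard_normal((n_draws, H, W))
nK = 0.1 * rng.standard_normal((n_draws, H, W))
I_input = clean + nJ + shared        # group-J image (noise correlated with K)
I_label = clean + nK + shared        # group-K image

# Voxel-wise variance and covariance maps (sample statistics over draws)
varJ = I_input.var(axis=0)
varK = I_label.var(axis=0)
cov_JK = ((I_input - I_input.mean(0)) * (I_label - I_label.mean(0))).mean(0)

denom = np.sqrt(varJ * varK - cov_JK**2)
alpha = -cov_JK / denom
beta = varJ / denom
I_label_prime = alpha * I_input + beta * I_label   # decorrelated intermediate label

# Residual noise covariance between I_input and I'_label should vanish
resid = ((I_input - I_input.mean(0)) * (I_label_prime - I_label_prime.mean(0))).mean(0)
```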

    [0184] In this specification, I′_label may be referred to as the intermediate label image 720.

    [0185] If there is no correlation between the first noise in the first MRI image 610 (I_input) and the second noise in the second MRI image 620 (I_label), the intermediate label image 720 (I′_label) is identical to the second MRI image 620 (I_label).

    [0186] The intermediate label image 720 (I′_label) may be input to a coil sensitivity compensation part 132.

    [0187] The coil sensitivity compensation part 132 is configured to transform the intermediate label image 720 (I′_label) by compensating for a difference in sensitivity between the coils 211 and 212 of the first group and the coils 221 and 222 of the second group.

    [0188] To impose the requirement of the same noise-free image, the coil sensitivity of I′_label (i.e., S′_K = α|Σ_j s_j^H| + β|Σ_k s_k^H|) is modified to match that of I_input (i.e., S_J = |Σ_j s_j^H|) by multiplying I′_label by the sensitivity ratio (S_J/S′_K) in each voxel, generating a final image (I″_label = (S_J/S′_K) · I′_label). Since multiplying by a coefficient is a linear process, the first condition of noise independence still holds after this processing.
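    The sensitivity-ratio compensation can be sketched as follows; all sensitivity maps and weights are assumed constant toy values, chosen only to show that the compensated label recovers the input image's sensitivity weighting.

```python
import numpy as np

# Sketch of I''_label = (S_J / S'_K) * I'_label, performed voxel-wise so the
# label shares the noise-free image S_J * x of I_input.
H, W = 4, 4
S_J = np.full((H, W), 1.5)          # |sum_j s_j^H|, sensitivity of I_input
S_K = np.full((H, W), 1.2)          # |sum_k s_k^H|, sensitivity of I_label
alpha = np.full((H, W), -0.2)       # decorrelation weights from [Equation 13]
beta = np.full((H, W), 1.1)

S_K_prime = alpha * S_J + beta * S_K        # sensitivity of intermediate label
x_true = np.ones((H, W))                    # toy noise-free image
I_label_prime = S_K_prime * x_true          # noise-free part of I'_label
I_label_final = (S_J / S_K_prime) * I_label_prime  # compensated label image
```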

    [0189] In this specification, the final image I″_label may be referred to as the label image 820.

    [0190] The coil sensitivity compensation part 132 is configured to output the label image 820 (I″_label) by transforming the intermediate label image 720 (I′_label).

    [0191] Here, the covariance between the first noise included in the first MRI image 610 (I_input) and the third noise included in the label image 820 (I″_label) is zero. Furthermore, the image obtained by eliminating the first noise from the first MRI image 610 (I_input) and the image obtained by eliminating the third noise from the label image 820 (I″_label) are the same.

    [0192] The above-mentioned third condition of zero-mean noise is valid under the assumption that the combined images have reasonably high SNR, such that the noise within the image can be modeled as Gaussian with zero mean.

    [0193] The label image 820 (I″_label) may be provided as a second input to the training management part 120.

    [0194] The training management part 120 may calculate a loss value according to a loss function between the post-processed image 630 and the label image 820 (I″_label). Furthermore, the training management part 120 may generate update information P121 for changing a parameter θ of the post-processing part 110 so as to reduce the loss value according to the loss function. The training management part 120 may train the post-processing part 110 by changing the parameter θ of the post-processing part 110 using the update information P121.
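    The loss-then-update cycle performed by the training management part can be sketched with a single scalar parameter standing in for the network; the learning rate, data values, and gradient-descent update rule are illustrative assumptions, not the disclosed implementation.

```python
# Sketch of the training management step: compute a loss between the
# post-processed output and the label, derive update information for
# parameter theta, and apply it. A one-parameter "network" (output scale)
# stands in for the denoising network.
theta = 0.0                       # single trainable parameter
I_input, I_label = 2.0, 1.8       # toy paired pixel values
lr = 0.05                         # assumed learning rate

for _ in range(200):
    pred = theta * I_input                     # post-processed image
    loss = (pred - I_label) ** 2               # loss between output and label
    grad = 2.0 * (pred - I_label) * I_input    # d(loss)/d(theta)
    theta -= lr * grad                         # apply update information to theta
```

    After convergence, theta maps the input onto the label (theta ≈ 0.9 here), which is the role the update information P121 plays for the network's parameters.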

    [0195] In this embodiment, for the training of the denoising network, the L2 loss is utilized as shown in [Equation 14].

    [Equation 14] loss = ‖S′_K f_θ(I_input) − S_J I′_label‖₂²

    where f_θ is the neural network. Note that the scaled version of I′_label is used instead of I″_label to avoid division. This loss function is calculated within a brain mask.
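    The masked L2 loss of [Equation 14] can be sketched as follows; the network is stubbed as the identity, and all arrays and the mask are assumed toy values.

```python
import numpy as np

# Sketch of [Equation 14]: || S'_K f_theta(I_input) - S_J I'_label ||_2^2,
# restricted to a brain mask. f_theta is a stand-in for the denoising network.
H, W = 4, 4
f_theta = lambda img: img                      # identity stub for the network
I_input = np.full((H, W), 2.0)
I_label_prime = np.full((H, W), 2.5)           # scaled intermediate label
S_J = np.full((H, W), 1.0)
S_K_prime = np.full((H, W), 1.0)
mask = np.zeros((H, W), dtype=bool)
mask[1:3, 1:3] = True                          # hypothetical brain mask

diff = S_K_prime * f_theta(I_input) - S_J * I_label_prime
loss = np.sum(np.abs(diff[mask]) ** 2)         # L2 loss inside the mask only
```

    Using S′_K and S_J as multiplicative weights reproduces the scaling of I″_label without the per-voxel division S_J/S′_K.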

    [0196] FIG. 12 illustrates a method of denoising an MRI image using a trained post-processing part according to an embodiment of the present invention.

    [0197] The post-processing part illustrated in FIG. 12 may be one that has been trained using the method described with reference to FIGS. 11A, 11B, and 11C.

    [0198] The MRI scanner 200 may output the MRI image 600. The MRI image 600 may be an image generated using all of the coils included in the MRI scanner 200. The MRI image 600 may be an image in which noise (n) is added to a true image (x).

    [0199] The output MRI image 600 may be input to the post-processing part 110 of the computing device 100. The post-processing part 110 may output the post-processed image 603. When the post-processing part 110 has been sufficiently trained according to an embodiment of the present invention, an error between the post-processed image and the true image 601 (x) may be very small.

    [0200] A method of denoising an MRI image provided according to an embodiment of the present invention may include: outputting, by the MRI scanner 200, an MRI signal from the plurality of coils 211 included in the MRI scanner 200; and inputting, by the computing device 100, an MRI image generated using signals obtained from the plurality of coils to the post-processing part 110 included in the computing device 100 to generate a post-processed image obtained by denoising the MRI image.

    [0201] Here, the post-processing part 110 may be one that has been trained according to a supervised learning method.

    [0202] FIG. 13 is a flowchart illustrating a training method provided according to an embodiment of the present invention.

    [0203] In operation S10, the MRI scanner 200 may output an MRI signal from the plurality of coils 211 included in the MRI scanner 200.

    [0204] In operation S20, the computing device 100 that post-processes an MRI image may perform supervised learning on the post-processing part 110 included in the computing device 100 by using, as training input data, an image generated using a signal obtained from a first group of coils among the plurality of coils included in the MRI scanner 200 and using, as a label, an image generated using a signal obtained from a second group of coils among the plurality of coils.

    [0205] FIG. 14 is a flowchart illustrating the supervised learning operation of FIG. 13 in detail.

    [0206] The above supervised learning operation S20 may include operations S21 and S22.

    [0207] In operation S21, the post-processing part 110 receives an input of the image 610 generated by transforming the first MRI signal 510 obtained by the first group of coils and generates the post-processed image 630.

    [0208] In operation S22, the training management part 120 included in the computing device 100 trains the post-processing part 110 using a loss function between the post-processed image 630 and an image generated by transforming the second MRI signal 520 obtained by the second group of coils.

    [0209] Here, the image generated by transforming the second MRI signal 520 may be the label image 820 (I″_label) illustrated in FIGS. 11A, 11B, and 11C.

    [0210] FIG. 15 is a diagram illustrating the computing device shown in FIGS. 9, 10, and 12 from a hardware aspect.

    [0211] The computing device 100 may include a device interface unit 3 capable of reading a computer-readable nonvolatile recording medium 2 and a processing unit 4.

    [0212] The nonvolatile recording medium 2 may store a program including a first instruction code for executing a function of the post-processing part 110. The first instruction code may be referred to as a post-processing instruction code. The processing unit 4 may be configured to execute the function of the post-processing part 110 by reading and executing the first instruction code through the device interface unit 3.

    [0213] Furthermore, the nonvolatile recording medium 2 may store a program including a second instruction code for executing a function of the training management part 120. The second instruction code may be referred to as a training management instruction code. The processing unit 4 may be configured to execute the function of the training management part 120 by reading and executing the second instruction code through the device interface unit 3.

    [0214] The nonvolatile recording medium 2 may store a program including a third instruction code for executing: controlling, by the computing device 100, the MRI scanner 200 to operate so as to output an MRI signal from a plurality of coils included in the MRI scanner 200; and performing supervised learning on the post-processing part 110 included in the computing device 100 by using, as training input data, the first image 610 generated using a signal obtained from a first group of coils among the plurality of coils included in the MRI scanner 200 and using, as a label, the second image 820 generated using a signal obtained from a second group of coils among the plurality of coils. The processing unit 4 may be configured to perform a method of performing supervised learning on the post-processing part 110 by reading and executing the third instruction code through the device interface unit 3.

    [0215] Here, the performing supervised learning may include: receiving, by the post-processing part 110, an input of the first image 610 generated by transforming the first MRI signal 510 obtained by the first group of coils to generate the post-processed image 630; and training, by the training management part 120, the post-processing part 110 using a loss function between the post-processed image 630 and the second image 820 generated by transforming the second MRI signal 520 obtained by the second group of coils.

    [0216] The nonvolatile recording medium 2 may store a program including a fourth instruction code for executing: generating, by the computing device 100, a first K-space corresponding to the first MRI signal by transforming the first MRI signal and generating the first image 610 corresponding to the first MRI signal; and generating, by the computing device 100, a second K-space corresponding to the second MRI signal by transforming the second MRI signal and generating the second image 820 corresponding to the second MRI signal. The processing unit 4 may be configured to perform a method of generating the first image 610 and the second image 820 by reading and executing the fourth instruction code through the device interface unit 3.

    [0217] According to the present invention, a specific technology for generating label data and training data for supervised learning of a post-processing part for denoising an MRI image can be provided.

    [0218] Those skilled in the art could make various alterations or modifications to the above-mentioned embodiments without departing from the essential characteristics of the present invention. Claims that do not refer to each other may be combined with each other within the scope of understanding of the present disclosure.