METHOD FOR DETERMINING WHETHER A CELL SHOWN IN A NUCLEAR FLUORESCENCE IMAGE ACQUIRED THROUGH CONFOCAL MICROSCOPE IS A TUMOROUS CELL
20230377143 · 2023-11-23
Inventors
- Fabrizio FREZZA (Roma, IT)
- Fabio MANGINI (Roma, IT)
- Maurizio TROIANO (Roma, IT)
- Marco MUZI (Roma, IT)
- Anna ALISI (Roma, IT)
Abstract
A method determines whether a cell shown in a nuclear fluorescence image acquired through a confocal microscope is a tumorous cell. The method is based on the application of a discrete Wavelet transform to a reference matrix associated with a reference image of the nucleus of the cell, obtained by inserting a segmented image of the nucleus on a background of a predetermined color, to obtain four further matrices, and on the generation of a respective co-occurrence matrix for each further matrix. Statistical functions that characterize the nucleus of the cell are calculated starting from each co-occurrence matrix, and their results are provided as input to a predetermined neural network (NN).
Claims
1. A method for determining whether at least one cell of body tissue shown in a nuclear fluorescence image acquired through a confocal microscope is a tumorous cell, wherein said fluorescence is obtained through a DNA intercalating agent, said method comprising: A) segmenting said nuclear fluorescence image to obtain at least one segmented image I.sub.S referred to a nucleus (C) of a single cell; B) inserting said at least one segmented image referred to said nucleus (C) of said cell on a background having a predetermined color to obtain at least one reference image (I.sub.REF), in which a reference matrix M.sub.REF of dimensions M×N is associated with said reference image (I.sub.REF) and each pixel of said reference image (I.sub.REF) corresponds a respective number in said reference matrix M.sub.REF whose value is the respective grey level of said pixel; C) applying a discrete Wavelet transform to said reference matrix M.sub.REF to obtain a further first matrix M.sub.1 associated with a further first image (I.sub.1) which is an image of the nucleus (C) of the cell shown in said reference image (I.sub.REF), in which said further first image (I.sub.1) has a resolution lower than the resolution of said reference image (I.sub.REF), a further second matrix M.sub.2 associated with a further second image (I.sub.2) referred to the horizontal components of said reference image (I.sub.REF), a further third matrix M.sub.3 associated with a further third image (I.sub.3) referred to the vertical components of said reference image (I.sub.REF), a further fourth matrix M.sub.4 associated with a further fourth image (I.sub.4) referred to the diagonal components of said reference image (I.sub.REF), in which each of said further matrices M.sub.1,M.sub.2,M.sub.3,M.sub.4 is a matrix of dimensions M′×N′ and a pixel in position x, y of each further image (I.sub.1,I.sub.2,I.sub.3,I.sub.4) corresponds to a respective number in position x,y inside a respective further matrix 
M.sub.1,M.sub.2,M.sub.3,M.sub.4 and the value of said number is the respective grey level of said pixel; D) creating a respective Co-occurrence matrix P.sub.1(i,j|Δx, Δy), P.sub.2(i,j|Δx, Δy), P.sub.3(i,j|Δx, Δy), P.sub.4(i,j|Δx, Δy) for each of said further four matrices M.sub.1,M.sub.2,M.sub.3,M.sub.4, in which each Co-occurrence matrix contains information on the nucleus (C) of said cell in terms of texture, magnitude and morphology, and is a matrix of dimensions G×G, where G is the number of grey levels and each of said Co-occurrence matrices P.sub.1(i,j|Δx, Δy), P.sub.2(i,j|Δx, Δy), P.sub.3(i,j|Δx, Δy), P.sub.4(i,j|Δx, Δy) has in a respective position i,j the number of pairs of elements of a respective further matrix M.sub.1,M.sub.2,M.sub.3,M.sub.4, in which each pair of elements is associated with a respective pair of pixels and is formed by a first element associated with a first pixel of said pair of pixels having a grey level equal to i and by a second element associated with a second pixel of said pair of pixels, different from said first pixel and having a grey level equal to j, where i is a positive integer i=0 . . . G and j is a positive integer j=0 . . . G; E) calculating a plurality of statistical functions SF.sub.1,SF.sub.2 . . . SF.sub.N starting from each Co-occurrence matrix P.sub.1(i,j|Δx, Δy), P.sub.2(i,j|Δx, Δy), P.sub.3(i,j|Δx, Δy), P.sub.4(i,j|Δx, Δy) to characterize at least the texture of the nucleus (C) of said cell, in which each statistical function SF.sub.1,SF.sub.2 . . . SF.sub.N is associated with a respective parameter of a further image of the nucleus (C) of said cell and the result of each statistical function SF.sub.1,SF.sub.2 . . . 
SF.sub.N is a respective number, so that a vector V of numbers comprising four sub-vectors v.sub.1,v.sub.2,v.sub.3,v.sub.4, is associated with the nucleus (C) of said cell, each sub-vector being associated with a respective further image (I.sub.1,I.sub.2,I.sub.3,I.sub.4) and containing k elements in which k is the number of said statistical functions, F) supplying as input to a predetermined neural network (NN) the results of said statistical functions SF.sub.1,SF.sub.2 . . . SF.sub.N, in which said predetermined neural network (NN) comprises an output layer with at least a first output node (N.sub.OUT1) and is configured to provide as output a first numerical value between 0 and 1 at said first output node (N.sub.OUT1), G) comparing said first numerical value with a predetermined threshold, H) identifying said cell as a tumorous cell, by determining that said first numerical value is greater than said predetermined threshold.
2. The method according to claim 1, wherein step G comprises the sub-step G1 of approximating said first numerical value to 1, when said first numerical value is greater than said predetermined threshold, and to 0, when said first numerical value is less than or equal to said predetermined threshold, wherein with reference to step H, the nucleus (C) of said cell is the nucleus of a diseased cell, in particular a tumorous cell, when said first numerical value is approximated to 1.
3. The method according to claim 1, wherein said output layer comprises a second output node (N.sub.OUT2), wherein with reference to step F, said predetermined neural network (NN) is configured to provide as output a second numerical value between 0 and 1 at said second output node (N.sub.OUT2), wherein step G comprises comparing said second numerical value with said predetermined threshold, wherein step H makes it possible to determine whether said cell is a diseased cell, in particular a tumorous cell, when said second numerical value is less than or equal to said predetermined threshold, as well as when said first numerical value is greater than said predetermined threshold.
4. The method according to claim 3, wherein step G comprises the sub-step G2 of approximating said second numerical value to 1, when said second numerical value is greater than said predetermined threshold, and to 0, when said second numerical value is less than or equal to said predetermined threshold, wherein with reference to step H, the nucleus (C) of said cell is the nucleus of a diseased cell, in particular a tumorous cell, when said second numerical value is approximated to 0, as well as when said first numerical value is approximated to 1.
5. The method according to claim 1, wherein said plurality of statistical functions comprises: a first statistical function SF.sub.1 named Inverse Difference Moment to indicate a homogeneity in the distribution of grey levels:
6. The method according to claim 1, wherein said plurality of statistical functions comprises two further statistical functions to characterize the magnitude and the morphology of the nucleus (C) of said cell, respectively: an eighth statistical function SF.sub.8 named Extension to offer an estimate of the magnitude of the nucleus (C) of the cell through a number of pixel pairs, each of which is formed by a respective first pixel and a respective second pixel, different from said first pixel and positioned next to said first pixel, wherein the first pixel and the second pixel of each pixel pair have a grey level equal to 0:
EX=1/P.sub.z(i=1,j=1|Δx,Δy) where P.sub.z(i=1,j=1|Δx, Δy) is the first element of the Co-occurrence matrix; a ninth statistical function SF.sub.9 named EdgeLengthEstimate to offer an estimate of the perimeter of the nucleus (C) of the cell through a number of pixel pairs, each of which is formed by a respective first pixel and by a respective second pixel, different from said first pixel and positioned next to said first pixel, wherein one of the two pixels has a grey level equal to 0:
7. The method according to claim 1, wherein said predetermined neural network (NN) comprises an input layer and said input layer comprises a number of input nodes (N.sub.IN1,N.sub.IN2 . . . N.sub.INF) equal to the total number of statistical functions SF.sub.1,SF.sub.2 . . . SF.sub.N calculated for each matrix of Co-occurrence.
8. The method according to claim 1, wherein said predetermined neural network (NN) comprises at least one hidden layer, in which said hidden layer comprises at least one respective first hidden node (N.sub.N1).
9. The method according to claim 8, wherein said hidden layer comprises ten hidden nodes (N.sub.N1,N.sub.N2 . . . N.sub.N10).
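By way of non-limiting editorial illustration, the topology recited in claims 7 to 9 (one input node per statistical function computed, at least one hidden layer, here ten hidden nodes, and an output node yielding a value between 0 and 1 that is compared against a threshold) can be sketched as a minimal feed-forward network. Python/NumPy, the input width of 28 (seven functions times four sub-band images) and the untrained random weights are assumptions of this sketch, not part of the disclosure:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    # Squashes the output node into the interval (0, 1).
    return 1.0 / (1.0 + np.exp(-x))

# Assumed topology: 28 input nodes (7 statistical functions x 4
# sub-band images), ten hidden nodes, one output node. The weights
# are untrained placeholders; a real network would be trained.
n_in, n_hidden = 28, 10
w1 = rng.normal(size=(n_hidden, n_in))
w2 = rng.normal(size=(1, n_hidden))

def forward(v):
    # Feed-forward pass: hidden layer, then the [0, 1] output node.
    h = np.tanh(w1 @ v)
    return sigmoid(w2 @ h)

v = rng.normal(size=n_in)      # stands in for the feature vector V
score = float(forward(v)[0])   # first numerical value, in (0, 1)
is_tumorous = score > 0.5      # comparison with the threshold
```

The sigmoid output node guarantees the claimed range between 0 and 1, so the thresholding of steps G and H reduces to a single comparison.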
10. The method according to claim 1, wherein said predetermined color of said background is the black color.
11. The method according to claim 1, wherein said DNA intercalating agent is a fluorochrome, preferably the DRAQ5.
12. A system for determining whether at least one cell (C) of body tissue shown in a nuclear fluorescence image acquired through a confocal microscope is a diseased cell, in particular a tumorous cell, wherein said fluorescence is obtained through a DNA intercalating agent, said system comprising: storage means (SM) in which said nuclear fluorescence image and a predetermined threshold are stored, a predetermined neural network (NN) comprising an output layer, wherein said output layer comprises at least one first output node (N.sub.OUT1), and configured to provide as output a first numerical value between 0 and 1 at said first output node (N.sub.OUT1), a logic control unit (U), connected to said storage means (SM) and to said predetermined neural network (NN) and configured to: segment said nuclear fluorescence image to obtain at least one segmented image referred to a nucleus (C) of a single cell; insert said at least one segmented image referred to said nucleus (C) of said cell on a background having a predetermined color to obtain at least one reference image (I.sub.REF), in which a reference matrix M.sub.REF of dimensions M×N is associated with said reference image (I.sub.REF) and each pixel of said reference image (I.sub.REF) corresponds a respective number in said reference matrix M.sub.REF whose value is the respective grey level of said pixel; apply a discrete Wavelet transform to said reference matrix M.sub.REF to obtain: a further first matrix M.sub.1 associated with a further first image (I.sub.1) which is an image of the nucleus (C) of the cell shown in said reference image (I.sub.REF), in which said further first image (I.sub.1) has a resolution lower than the resolution of said reference image (I.sub.REF), a further second matrix M.sub.2 associated with a further second image (I.sub.2) referred to the horizontal components of said reference image (I.sub.REF), a further third matrix M.sub.3 associated with a further third image (I.sub.3) referred to 
the vertical components of said reference image (I.sub.REF), a further fourth matrix M.sub.4 associated with a further fourth image (I.sub.4) referred to the diagonal components of said reference image (I.sub.REF), in which each of said further matrices M.sub.1,M.sub.2,M.sub.3,M.sub.4 is a matrix of dimensions M′×N′ and a respective number in position x,y inside a respective further matrix M.sub.1,M.sub.2,M.sub.3,M.sub.4 corresponds a pixel in position x,y of each further image (I.sub.1,I.sub.2,I.sub.3,I.sub.4) and the value of said number is the respective grey level of said pixel; create a respective Co-occurrence matrix P.sub.1(i,j|Δx, Δy), P.sub.2(i,j|Δx, Δy), P.sub.3(i,j|Δx, Δy), P.sub.4(i,j|Δx, Δy) for each of said further four matrices M.sub.1,M.sub.2,M.sub.3,M.sub.4, in which each Co-occurrence matrix contains information on the nucleus (C) of said cell in terms of texture, magnitude and morphology, and is a matrix of dimensions G×G, where G is the number of grey levels and each of said Co-occurrence matrices P.sub.1(i,j|Δx, Δy), P.sub.2(i,j|Δx, Δy), P.sub.3(i,j|Δx, Δy), P.sub.4(i,j|Δx, Δy) has in a respective position i,j the number of pairs of elements of a respective further matrix M.sub.1,M.sub.2,M.sub.3,M.sub.4, in which each pair of elements is associated with a respective pair of pixels and is formed by a first element associated with a first pixel of said pair of pixels having a grey level equal to i and by a second element associated with a second pixel of said pair of pixels, different from said first pixel and having a grey level equal to j, where i is a positive integer i=0 . . . G and j is a positive integer j=0 . . . G; calculate a plurality of statistical functions SF.sub.1,SF.sub.2 . . . SF.sub.N starting from each Co-occurrence matrix P.sub.1(i,j|Δx, Δy), P.sub.2(i,j|Δx, Δy), P.sub.3(i,j|Δx, Δy), P.sub.4(i,j|Δx, Δy) to characterize at least the texture of the nucleus (C) of said cell, in which each statistical function SF.sub.1,SF.sub.2 . . 
. SF.sub.N is associated with a respective parameter of a further image of the nucleus (C) of said cell and the result of each statistical function SF.sub.1,SF.sub.2 . . . SF.sub.N is a respective number, so that a vector V of numbers comprising four sub-vectors v.sub.1,v.sub.2,v.sub.3,v.sub.4, is associated with the nucleus (C) of said cell, each sub-vector being associated with a respective further image (I.sub.1,I.sub.2,I.sub.3,I.sub.4) and containing k elements in which k is the number of said statistical functions, supply as input to said predetermined neural network (NN) the results of said statistical functions SF.sub.1,SF.sub.2 . . . SF.sub.N to obtain a first numerical value between 0 and 1 at said first output node (N.sub.OUT1), compare said first numerical value with said predetermined threshold stored in said storage means (SM), identify said cell as a tumorous cell, by determining that said first numerical value is greater than said predetermined threshold.
13. The system according to claim 12, wherein said logic control unit is configured to approximate said first numerical value to 1, when said first numerical value is greater than said predetermined threshold, and to 0, when said first numerical value is less than or equal to said predetermined threshold, and to determine whether the nucleus (C) of said cell is the nucleus of a tumorous cell, when said first numerical value is approximated to 1.
14. The system according to claim 13, wherein said output layer comprises a second output node (N.sub.OUT2), wherein said predetermined neural network (NN) is configured to provide as output a second numerical value between 0 and 1 at said second output node (N.sub.OUT2), wherein said logic control unit (U) is configured to compare said second numerical value with said predetermined threshold and to determine whether the nucleus (C) of said cell is the nucleus of a tumorous cell, when said second numerical value is less than or equal to said predetermined threshold, as well as when said first numerical value is greater than said predetermined threshold.
15. The system according to the claim 14, wherein said logic control unit (U) is configured to approximate said second numerical value to 1, when said second numerical value is greater than said predetermined threshold, and to 0, when said second numerical value is less than or equal to said predetermined threshold, and to determine whether the nucleus (C) of said cell is the nucleus of a tumorous cell, when said second numerical value is approximated to 0, as well as when said first numerical value is approximated to 1.
16. (canceled)
17. (canceled)
18. The system according to claim 12, wherein said plurality of statistical functions comprises: a first statistical function SF.sub.1 named Inverse Difference Moment to indicate a homogeneity in the distribution of grey levels:
19. The system according to claim 12, wherein said plurality of statistical functions comprises two further statistical functions to characterize the magnitude and the morphology of the nucleus (C) of said cell, respectively: an eighth statistical function SF.sub.8 named Extension to offer an estimate of the magnitude of the nucleus (C) of the cell through a number of pixel pairs, each of which is formed by a respective first pixel and a respective second pixel, different from said first pixel and positioned next to said first pixel, wherein the first pixel and the second pixel of each pixel pair have a grey level equal to 0:
EX=1/P.sub.z(i=1,j=1|Δx,Δy) where P.sub.z(i=1,j=1|Δx, Δy) is the first element of the Co-occurrence matrix; a ninth statistical function SF.sub.9 named EdgeLengthEstimate to offer an estimate of the perimeter of the nucleus (C) of the cell through a number of pixel pairs, each of which is formed by a respective first pixel and by a respective second pixel, different from said first pixel and positioned next to said first pixel, wherein one of the two pixels has a grey level equal to 0:
20. The system according to claim 12, wherein said predetermined color of said background is the black color.
21. The system according to claim 12, wherein said DNA intercalating agent is a fluorochrome, preferably the DRAQ5.
22. The system according to claim 12, wherein said predetermined neural network (NN) comprises an input layer and said input layer comprises a number of input nodes (N.sub.IN1,N.sub.IN2 . . . N.sub.INF) equal to the total number of statistical functions SF.sub.1,SF.sub.2 . . . SF.sub.N calculated for each matrix of Co-occurrence.
23. The system according to claim 12, wherein said hidden layer comprises ten hidden nodes (N.sub.N1,N.sub.N2 . . . N.sub.N10).
24. A non-transitory tangible medium comprising instructions which, when executed by a computer, cause the computer to carry out the method according to claim 1.
Description
FIGURE LIST
[0100] The present invention will be now described, for illustrative, but not limitative purposes, according to its embodiment, making particular reference to the enclosed figures, wherein:
[0101]-[0111] (brief descriptions of the individual figures not reproduced)
DETAILED DESCRIPTION OF THE INVENTION
[0112] With reference to
[0113] In particular, the method is conceived to verify whether the nucleus of a cell is the nucleus of a diseased cell, through an analysis of some characteristics of the nucleus itself.
[0114] Although the method can be applied to a nucleus of a healthy cell of a liver tissue (shown in
[0115] In particular,
[0116] Furthermore, the image shown in
[0117] In particular, said fluorescence is obtained through a DNA intercalating agent, i.e. a chemical agent capable of binding to the cell's DNA and emitting fluorescence.
[0118] Said DNA intercalating agent can be a fluorochrome, preferably the DRAQ5.
[0119] The DRAQ5 is an anthraquinone-based dye that binds stoichiometrically to the DNA present in the nucleus of a cell and emits fluorescence.
[0120] The fact that the image of the cell is a nuclear fluorescence image (in which the fluorescence is obtained through said DNA intercalating agent and not through an antibody), and that said image is obtained with a confocal microscope allows the method object of the present invention to accurately determine if a cell is diseased on the basis of the analysis of some characteristics of the nucleus of said cell, such as texture, size and morphology.
[0121] This makes the method object of the present patent application different from the methods of the known type which are designed to analyse whether a cell expresses a protein or not to answer a diagnostic question.
[0122] In the disclosed embodiment, the fluorescence technique was performed on images of sections of a diseased liver tissue fixed in formalin and included in paraffin.
[0123] The nuclei of the cells present in said sections of liver tissue have been marked using a fluorochrome, DRAQ5, diluted 1:5000 and incubated for 5 minutes at room temperature.
[0124] After washing the liver tissue sections, a drop of phosphate buffer saline (PBS)/glycerol (1:1) was placed on those liver tissue sections which were subsequently covered with a coverslip.
[0125] The images concerning liver tissue sections have been acquired through an Olympus Fluoview FV1000 confocal microscope provided with the software FV10-ASW version 4.1, by using a 40× lens and a further 20× lens (numerical aperture: 0.75).
[0126] Individual liver tissue sections have been acquired with a scan format of 1024×1024 pixels and a sampling rate equal to 20 μs/pixel; the resulting images are 12-bit/pixel images.
[0127] The mixing of the fluorochromes was carried out through the automatic sequential acquisition of multi-channel images, in order to reduce the spectral crosstalk between the channels.
[0128] The fluorochrome is a molecule which, when excited by photons emitted from a light radiation source, emits further photons having a wavelength greater than the wavelength of the photons with which the fluorochrome was excited.
[0129] In particular, the DRAQ5 has an optimal excitation wavelength of 647 nm and its emission spectrum has a peak value in the 681/697 nm band.
[0130] This fluorochrome is used to highlight the DNA present in the cell nucleus.
[0131] Hepatocarcinoma is difficult to identify and presents abnormal groups of hepatocytes, as well as anomalies of the nucleus.
[0132] Therefore, one or more liver cells will have a high N/C (nucleus/cytoplasm) ratio.
[0133] The essential features that will be highlighted concern the alteration of the nuclei of the liver cells, which will appear large and often joined together.
[0134] With reference to the method object of the invention, said method comprises the following steps: [0135] A) segmenting said nuclear fluorescence image to obtain at least one segmented image I.sub.S referred to a nucleus C of a single cell; [0136] B) inserting said at least one segmented image I.sub.S referred to said nucleus C of said cell on a background having a predetermined color to obtain at least one reference image I.sub.REF, in which a reference matrix M.sub.REF of dimensions M×N is associated with said reference image I.sub.REF and each pixel of said reference image I.sub.REF corresponds a respective number in said reference matrix M.sub.REF whose value is the respective grey level of said pixel; [0137] C) applying a discrete Wavelet transform to said reference matrix M.sub.REF (associated with said reference image I.sub.REF) to obtain four further matrices M.sub.1,M.sub.2,M.sub.3,M.sub.4, different from one another, each of which is associated with a further image I.sub.1,I.sub.2,I.sub.3,I.sub.4 of the same nucleus C of said cell: [0138] a further first matrix M.sub.1 associated with a further first image I.sub.1 (shown in
[0148]
[0149] With reference to step A, a segmented image I.sub.s of the nucleus C of a single cell is obtained.
[0150] In the embodiment being described, as already said, said cell is a cell of a diseased liver tissue.
[0151] The number of pixels of the segmented image I.sub.s does not depend on the dimensions of the nucleus of the cell.
[0152] In the embodiment being described, the segmentation is a binary segmentation.
[0153] It is known that binary segmentation applies to a grey-scale image and makes it possible to distinguish an object (in the specific case, the nucleus of a cell) from its background. As a result, if the originally acquired image were a color image, it would be necessary to transform said color image into a grey-scale image before performing a binary segmentation.
[0154] If the grey level of a pixel is greater than a predetermined threshold value, this pixel belongs to the object, otherwise this pixel belongs to the background.
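By way of non-limiting editorial illustration, the thresholding rule of step A described above can be sketched as follows; Python, the toy grey levels and the threshold value 128 are assumptions of this sketch, not part of the disclosure:

```python
import numpy as np

def binary_segment(gray, threshold):
    """Binary segmentation: pixels whose grey level exceeds the
    threshold belong to the object (the nucleus); all other pixels
    belong to the background."""
    return (gray > threshold).astype(np.uint8)

# Toy 4x4 grey-scale image; the bright 2x2 centre plays the nucleus.
gray = np.array([[10, 10, 10, 10],
                 [10, 200, 220, 10],
                 [10, 210, 230, 10],
                 [10, 10, 10, 10]])
mask = binary_segment(gray, threshold=128)  # 1 = nucleus, 0 = background
```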
[0155] With reference to step B, as said, the segmented image I.sub.s of the nucleus C of the cell is inserted in a background of a predetermined color, so that the resulting image is a reference image I.sub.REF.
[0156] A reference matrix M.sub.REF is associated with said reference image I.sub.REF.
[0157] A respective number in said reference matrix M.sub.REF is associated with each pixel of said reference image I.sub.REF and the value of said number is the respective grey level of said pixel.
[0158] As already said, the predetermined color for the background is preferably the black color.
[0159] Advantageously, from the computational point of view, a number equal to 0 is associated with each pixel having black color.
[0160] The scale of grey levels goes from black to white and the number 0 corresponds to the black color.
[0161] Consequently, the reference image I.sub.REF is the real image of the nucleus C of the cell, since the black background is not taken into account.
[0162] However, the predetermined color for the background can be a color different from the black color, such as dark blue, without departing from the scope of the invention.
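A minimal editorial sketch of step B, under the same assumptions as above (Python, toy grey levels, threshold 128): the segmented nucleus is placed on a black background, so every background entry of the reference matrix M.sub.REF equals 0 and contributes nothing to the later statistics.

```python
import numpy as np

# Toy grey-scale image and its binary segmentation (step A).
gray = np.array([[10, 10, 10, 10],
                 [10, 200, 220, 10],
                 [10, 210, 230, 10],
                 [10, 10, 10, 10]])
mask = (gray > 128).astype(np.uint8)

# Reference matrix M_REF (step B): nucleus pixels keep their grey
# level, background pixels are forced to 0 (black).
m_ref = np.where(mask == 1, gray, 0)
```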
[0163] With reference to step C, the discrete Wavelet transform makes it possible to reveal the texture of the nucleus of the cell.
[0164] The discrete Wavelet transform is applied to the reference matrix M.sub.REF associated with the reference image I.sub.REF (i.e. the image obtained by inserting the segmented image I.sub.s on a background of a predetermined color) and makes it possible to obtain four further matrices M.sub.1,M.sub.2,M.sub.3,M.sub.4 associated with respective further images I.sub.1,I.sub.2,I.sub.3,I.sub.4 of the nucleus of the same cell.
[0165] Each further matrix M.sub.1,M.sub.2,M.sub.3,M.sub.4 has dimensions M′×N′.
[0166] The sum of said further matrices M.sub.1,M.sub.2,M.sub.3,M.sub.4 is a matrix of dimensions M×N.
[0167] If, on the one hand, as said, said further first image I.sub.1 is an image of the nucleus of the cell shown in said reference image I.sub.REF, wherein said further first image I.sub.1 has a resolution less than the resolution of said reference image I.sub.REF, on the other hand, said further first image I.sub.1 is the only further image in which the real perimeter of the nucleus of the cell is visible.
[0168] The other further images (i.e. the further second image I.sub.2, the further third image I.sub.3 and the further fourth image I.sub.4) are images of the same nucleus C of the cell respectively referring to the horizontal components of the nucleus of the cell, to the vertical components of the nucleus of the cell and to the diagonal components of the nucleus of the cell.
[0169] Furthermore, the discrete Wavelet transform mentioned in step C of the method is a transform of first order.
[0170] However, the discrete Wavelet transform can be a transform of any order, without departing from the invention.
[0171] In case of discrete Wavelet transforms of order higher than the first order, for example up to the third order, the Wavelet transform of second order will be applied to the further images I.sub.1, I.sub.2, I.sub.3, I.sub.4 which are the four sub-bands obtained from the Wavelet transform of first order and the Wavelet transform of third order will be applied to the further images which will be the four sub-bands obtained from the Wavelet transform of second order.
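As an editorial illustration of step C, a single-level 2-D Haar transform (the excerpt does not state which Wavelet family is used, so Haar is an assumption) splits the reference matrix into the four sub-bands M.sub.1 (approximation), M.sub.2 (horizontal details), M.sub.3 (vertical details) and M.sub.4 (diagonal details), each of dimensions M/2×N/2; note that the horizontal/vertical naming convention differs between libraries:

```python
import numpy as np

def haar_dwt2(m_ref):
    """Single-level 2-D Haar DWT of an MxN matrix (M, N even).
    Returns the approximation (M1) plus horizontal (M2), vertical
    (M3) and diagonal (M4) detail sub-bands, each M/2 x N/2."""
    a = m_ref[0::2, 0::2].astype(float)  # top-left of each 2x2 block
    b = m_ref[0::2, 1::2].astype(float)  # top-right
    c = m_ref[1::2, 0::2].astype(float)  # bottom-left
    d = m_ref[1::2, 1::2].astype(float)  # bottom-right
    m1 = (a + b + c + d) / 4.0           # low-pass approximation
    m2 = (a - b + c - d) / 4.0           # horizontal details
    m3 = (a + b - c - d) / 4.0           # vertical details
    m4 = (a - b - c + d) / 4.0           # diagonal details
    return m1, m2, m3, m4

m_ref = np.arange(16).reshape(4, 4)      # toy 4x4 reference matrix
m1, m2, m3, m4 = haar_dwt2(m_ref)
```

Applying the same function again to `m1` would give the second-order decomposition described in the paragraph above.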
[0172] With reference to step D, a respective Co-occurrence matrix P.sub.1(i,j|Δx, Δy), P.sub.2(i,j|Δx, Δy), P.sub.3(i,j|Δx, Δy), P.sub.4(i,j|Δx, Δy) is created for each further matrix M.sub.1,M.sub.2,M.sub.3,M.sub.4 obtained through the discrete Wavelet transform (as well as associated with a respective further image I.sub.1,I.sub.2,I.sub.3,I.sub.4).
[0173] In general, the Co-occurrence matrix contains information on the characteristics of the nucleus C of the cell, among which information on the texture, on the size and on the morphology.
[0174] Each Co-occurrence matrix P.sub.1(i,j|Δx, Δy) P.sub.2 (i,j|Δx, Δy) P.sub.3 (i,j|Δx, Δy) P.sub.4(i,j|Δx, Δy) is calculated according to the following formula:
P.sub.z(i,j|Δx,Δy)=W.sub.zQ.sub.z(i,j|Δx,Δy) [0175] where [0176] z is an index to indicate the respective Co-occurrence matrix, wherein said index is a positive integer z=1 . . . 4;
[0185] Each Co-occurrence matrix P.sub.1(i,j|Δx, Δy), P.sub.2(i,j|Δx, Δy), P.sub.3(i,j|Δx, Δy), P.sub.4(i,j|Δx, Δy) is a matrix of dimensions G×G, wherein G is the number of grey levels associated with the pixels present in said further matrices M.sub.1, M.sub.2, M.sub.3, M.sub.4.
[0186] Each Co-occurrence matrix P.sub.1(i,j|Δx, Δy), P.sub.2(i,j|Δx, Δy), P.sub.3(i,j|Δx, Δy), P.sub.4(i,j|Δx, Δy) has in a respective position i,j the number of pairs of elements of a respective further matrix M.sub.1,M.sub.2,M.sub.3,M.sub.4, wherein each pair of elements is associated with a respective pair of pixels.
[0187] In particular, each pair of elements is formed by a first element associated with a first pixel of said pair of pixels having a grey level equal to i and by a second element associated with a second pixel of said pair of pixels, different from said first pixel and having a grey level equal to j.
[0188] Consequently, in each element in position i,j of a respective Co-occurrence matrix P.sub.1(i,j|Δx, Δy), P.sub.2(i,j|Δx, Δy), P.sub.3(i,j|Δx, Δy), P.sub.4(i,j|Δx, Δy) a triple contribution is present: the grey level of a first pixel, the grey level of a second pixel, different from said first pixel, and the number of pairs of pixels formed by a first pixel and by a second pixel with respective grey levels.
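The pair-counting rule of step D can be sketched as follows; this builds the unnormalised count matrix Q.sub.z(i,j|Δx,Δy), which the formula of paragraph [0174] then scales by the weight W.sub.z (whose definition is not reproduced in this excerpt). Python and the toy 3×3 matrix are assumptions of this sketch:

```python
import numpy as np

def cooccurrence(m, dx, dy, g):
    """Grey-level co-occurrence count matrix Q(i, j | dx, dy), of
    dimensions G x G: entry (i, j) counts the pixel pairs formed by
    a pixel of grey level i and the pixel displaced by (dx, dy)
    with grey level j."""
    q = np.zeros((g, g), dtype=int)
    rows, cols = m.shape
    for x in range(rows):
        for y in range(cols):
            x2, y2 = x + dy, y + dx   # displaced second pixel
            if 0 <= x2 < rows and 0 <= y2 < cols:
                q[m[x, y], m[x2, y2]] += 1
    return q

# Toy matrix with G = 3 grey levels; horizontal neighbour pairs.
m = np.array([[0, 0, 1],
              [0, 1, 1],
              [1, 1, 2]])
q = cooccurrence(m, dx=1, dy=0, g=3)
```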
[0189] With reference to step E, a plurality of statistical functions SF.sub.1,SF.sub.2 . . . SF.sub.N are calculated starting from each Co-occurrence matrix P.sub.1(i,j|Δx, Δy) P.sub.2 (i,j|Δx, Δy) P.sub.3 (i,j|Δx, Δy) P.sub.4(i,j|Δx, Δy).
[0190] Said statistical functions are predetermined and chosen to characterize at least the texture and preferably the size and the morphology of the nucleus C of the cell, as explained below.
[0191] In other words, a respective plurality of statistical functions SF.sub.1,SF.sub.2 . . . SF.sub.N is calculated for each of said Co-occurrence matrices P.sub.1(i,j|Δx, Δy), P.sub.2(i,j|Δx, Δy), P.sub.3(i,j|Δx, Δy), P.sub.4(i,j|Δx, Δy).
[0192] The result of each statistical function SF.sub.1,SF.sub.2 . . . SF.sub.N is a respective number, so that a vector V of numbers comprising four sub-vectors v.sub.1,v.sub.2,v.sub.3,v.sub.4 (i.e. V=[v.sub.1;v.sub.2;v.sub.3;v.sub.4]) is associated with the nucleus C of said cell.
[0193] Each of said sub-vectors v.sub.1,v.sub.2,v.sub.3,v.sub.4 is associated with a respective further image I.sub.1,I.sub.2,I.sub.3,I.sub.4 and contains k elements wherein k is the number of the used statistical functions (i.e. the number of elements is equal to the number of statistical functions).
[0194] In the embodiment being described, said plurality of statistical functions comprises seven statistical functions SF.sub.1,SF.sub.2 . . . SF.sub.7, mentioned below.
[0195] A first statistical function SF.sub.1 named Inverse Difference Moment (IDM) is conceived to indicate a homogeneity in the distribution of grey levels
[0200] Said first statistical function SF.sub.1 is a measure of the homogeneity of the image (i.e. of a homogeneity of the grey levels) and therefore offers an indication of how much the image is free of significant variations between two grey levels.
[0201] The greater the numerical result of said first statistical function SF.sub.1, the lower the numerical result of a further statistical function called Contrast mentioned below.
[0202] A second statistical function SF.sub.2 named Energy (EN) is conceived to indicate a homogeneity in the structure of the texture of the nucleus of the cell:
[0205] In other words, said second statistical function SF.sub.2 relates to the structure of the texture of the nucleus of the cell intended as a macrostructure of the texture, since it refers to the nucleus of the cell in its entirety.
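A minimal sketch, assuming the usual GLCM Energy definition EN=ΣP(i,j)² (also known as Angular Second Moment); the patent's own equation is in the omitted formula:

```python
import numpy as np

def energy(P):
    # Energy (Angular Second Moment): equals 1.0 when a single grey-level
    # pair dominates the whole texture, and shrinks as pairs spread out.
    return float(np.sum(np.asarray(P) ** 2))

P_one_pair = np.zeros((4, 4)); P_one_pair[2, 2] = 1.0  # one dominant pair
P_uniform = np.full((4, 4), 1.0 / 16.0)                # maximally mixed texture
```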
[0206] A third statistical function SF.sub.3 named Norm Entropy (NE) is conceived to take into account the level of clutter between pixels:
[0210] In other words, the numerical result of said third statistical function SF.sub.3 is higher the closer the numerical values associated with the respective grey levels are to the maximum value of the grey levels, which depends on the number of grey levels with which it has been chosen to encode the reference image.
[0211] For example, if the grey levels range from 0 to 256, the numerical result of said third statistical function will be greater the closer the grey levels are to 256.
[0212] In a further example, if the grey levels range from 0 to 56, the numerical result of said third statistical function will be greater the closer the grey levels are to 56.
[0213] A fourth statistical function SF.sub.4 named Local Homogeneity (LO) is conceived to indicate the presence of homogeneous areas or non-homogeneous areas:
[0218] The numerical result of said fourth statistical function SF.sub.4 is higher the greater the number of homogeneous areas inside the nucleus of the cell, and lower the greater the number of inhomogeneous areas inside the nucleus of the cell.
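As a sketch, assuming the common Local Homogeneity definition LO=ΣP(i,j)/(1+|i−j|) (the patent's own equation is in the omitted formula):

```python
import numpy as np

def local_homogeneity(P):
    # Rewards pairs of similar grey levels (homogeneous areas) and
    # penalizes pairs with a large grey-level gap (inhomogeneous areas).
    G = P.shape[0]
    i, j = np.indices((G, G))
    return float(np.sum(P / (1.0 + np.abs(i - j))))

P_homogeneous = np.eye(4) / 4.0  # all pairs identical in grey level
P_inhomogeneous = np.zeros((4, 4)); P_inhomogeneous[0, 3] = 1.0  # maximal gap
```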
[0219] A fifth statistical function SF.sub.5 named Cluster Shade (CS) is conceived to indicate an asymmetry of the Co-occurrence matrix:
[0226] A sixth statistical function SF.sub.6 named Cluster Prominence (CP) is conceived to indicate a further asymmetry of the Co-occurrence matrix:
[0233] The higher the numerical results of said fifth statistical function SF.sub.5 and of said sixth statistical function SF.sub.6 the more the Co-occurrence matrix is asymmetric with respect to its diagonal.
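The equations for SF.sub.5 and SF.sub.6 are also omitted from this text; a sketch under the standard assumption that Cluster Shade and Cluster Prominence are the third and fourth moments of (i+j) about its mean, with μ.sub.i=Σi·P(i,j) and μ.sub.j=Σj·P(i,j):

```python
import numpy as np

def cluster_moments(P):
    # Cluster Shade (3rd moment) and Cluster Prominence (4th moment) of
    # (i + j) around its mean; both grow as the mass of P drifts away
    # from a symmetric arrangement around the mean grey levels.
    G = P.shape[0]
    i, j = np.indices((G, G))
    mu_i = np.sum(i * P)
    mu_j = np.sum(j * P)
    d = i + j - mu_i - mu_j
    shade = float(np.sum(d ** 3 * P))
    prominence = float(np.sum(d ** 4 * P))
    return shade, prominence

P_uniform = np.full((4, 4), 1.0 / 16.0)  # (i + j) distributed symmetrically
```

For a matrix whose (i+j) distribution is symmetric around its mean, the shade vanishes while the prominence stays positive.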
[0234] A seventh statistical function SF.sub.7 named Contrast (CO) is conceived to identify the difference in intensity between two grey levels, a first grey level associated with said first pixel and a second grey level associated with said second pixel:
[0239] The higher the numerical result of said seventh statistical function SF.sub.7, the greater the difference in intensity between the two pixels of a pair of pixels.
[0240] As mentioned, said two pixels can be placed side by side or at a predetermined distance from each other.
[0241] As regards said seventh statistical function SF.sub.7, it is preferable that said two pixels are side by side.
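A sketch assuming the standard Contrast definition CO=Σ(i−j)²P(i,j) (the patent's own equation is in the omitted formula):

```python
import numpy as np

def contrast(P):
    # Weights each pair of grey levels by the squared intensity difference,
    # so pairs of identical neighbouring pixels contribute nothing.
    G = P.shape[0]
    i, j = np.indices((G, G))
    return float(np.sum((i - j) ** 2 * P))

P_flat = np.eye(4) / 4.0                         # adjacent pixels always equal
P_edges = np.zeros((4, 4)); P_edges[0, 3] = 1.0  # adjacent pixels maximally different
```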
[0242] With reference to vector V, said vector V is given by four sub-vectors v.sub.1,v.sub.2,v.sub.3,v.sub.4, each of which is formed by the numerical results of the seven statistical functions SF.sub.1,SF.sub.2 . . . SF.sub.7 mentioned above and referred to a respective Co-occurrence matrix P.sub.1(i,j|Δx, Δy) P.sub.2(i,j|Δx, Δy) P.sub.3(i,j|Δx, Δy) P.sub.4(i,j|Δx, Δy).
[0243] In other words, the vector V=[IDM.sub.1, EN.sub.1, NE.sub.1, LO.sub.1, CS.sub.1, CP.sub.1, CO.sub.1; IDM.sub.2, EN.sub.2, NE.sub.2, LO.sub.2, CS.sub.2, CP.sub.2, CO.sub.2; IDM.sub.3, EN.sub.3, NE.sub.3, LO.sub.3, CS.sub.3, CP.sub.3, CO.sub.3; IDM.sub.4, EN.sub.4, NE.sub.4, LO.sub.4, CS.sub.4, CP.sub.4, CO.sub.4].
[0244] Consequently, in the embodiment being described, each sub-vector v.sub.1,v.sub.2,v.sub.3,v.sub.4 is so defined: [0245] v.sub.1=[IDM.sub.1, EN.sub.1, NE.sub.1, LO.sub.1, CS.sub.1, CP.sub.1, CO.sub.1]; [0246] v.sub.2=[IDM.sub.2, EN.sub.2, NE.sub.2, LO.sub.2, CS.sub.2, CP.sub.2, CO.sub.2]; [0247] v.sub.3=[IDM.sub.3, EN.sub.3, NE.sub.3, LO.sub.3, CS.sub.3, CP.sub.3, CO.sub.3]; [0248] v.sub.4=[IDM.sub.4, EN.sub.4, NE.sub.4, LO.sub.4, CS.sub.4, CP.sub.4, CO.sub.4].
[0249] However, it is preferable that said plurality of statistical functions comprises two further statistical functions to also characterize the size and the morphology of the nucleus of said cell: an eighth statistical function SF.sub.8 and a ninth statistical function SF.sub.9.
[0250] The eighth statistical function SF.sub.8 called Extension is conceived to offer an estimate of the size of the cell nucleus C through the number of pairs of pixels, each of which is formed by a respective first pixel and a respective second pixel, different from said first pixel and positioned next to said first pixel, in which the first pixel and the second pixel of each pair of pixels have a grey level equal to 0:
EX=1/P.sub.z(i=1,j=1|Δx,Δy) [0251] where [0252] P.sub.z(i=1,j=1|Δx, Δy) is the first element of the Co-occurrence matrix.
[0253] The greater the number of pixel pairs with both pixels having a grey level equal to 0, the smaller the size of the cell nucleus.
[0254] Consequently, this eighth statistical function offers an estimate of the size of the cell's nucleus.
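The formula EX=1/P.sub.z(i=1,j=1|Δx,Δy) given above takes the reciprocal of the first element of the (normalized) co-occurrence matrix, which counts pairs of background pixels with grey level 0. A sketch:

```python
import numpy as np

def extension(P):
    # EX = 1 / P[0, 0]: the more background-background pixel pairs
    # (grey level 0 next to grey level 0), the smaller the nucleus,
    # and the smaller the resulting Extension value.
    return 1.0 / P[0, 0]

# Illustrative normalized matrices (hypothetical values):
P_small_nucleus = np.array([[0.5, 0.1], [0.1, 0.3]])  # much background
P_large_nucleus = np.array([[0.2, 0.1], [0.1, 0.6]])  # little background
```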
[0255] A ninth statistical function SF.sub.9 named EdgeLengthEstimate is conceived to offer an estimate of the perimeter of the nucleus C of the cell through the number of pairs of pixels, each of which is formed by a respective first pixel and a respective second pixel, different from said first pixel and positioned next to said first pixel, in which one of said two pixels has a grey level equal to 0:
[0259] As can be seen from the formula, the ninth statistical function adds a first number, which is the sum of all the elements of the first row of the Co-occurrence matrix, to a second number, which is the sum of the elements of the first column of the same Co-occurrence matrix.
[0260] The result obtained by adding said first number and said second number is the number of pairs of pixels arranged on the edge of the nucleus of the cell.
[0261] This ninth statistical function offers an estimate of the perimeter of the cell nucleus.
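The sum described above can be sketched as follows (implemented literally as stated: first row plus first column, so the corner element contributes to both sums):

```python
import numpy as np

def edge_length_estimate(P):
    # Pairs with grey level 0 on either side straddle the nucleus edge:
    # first row = (0, j) pairs, first column = (i, 0) pairs.
    P = np.asarray(P, dtype=float)
    return float(P[0, :].sum() + P[:, 0].sum())

# Illustrative count matrix (hypothetical values): row sum 3, column sum 4.
Q = np.array([[1.0, 2.0, 0.0],
              [3.0, 0.0, 0.0],
              [0.0, 0.0, 0.0]])
```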
[0262] The values of the eighth statistical function and the ninth statistical function offer an estimate of the size and morphology of a nucleus of a cell.
[0263] In fact, if the value of the eighth statistical function is low and the value of the ninth statistical function is high, it means that the nucleus of the cell has a jagged edge and a jagged edge may be characteristic of a tumorous cell.
[0264] To determine the size and morphology of the nucleus of the cell, the same matrix, from which information on the texture of said nucleus was obtained, has been used, so as to simplify the calculations and optimize the calculation time.
[0265] If nine statistical functions were used, each of the four sub-vectors v.sub.1,v.sub.2,v.sub.3,v.sub.4 mentioned above would be formed by the numerical results of the nine statistical functions SF.sub.1,SF.sub.2 . . . SF.sub.9, referred to a respective Co-occurrence matrix P.sub.1(i,j|Δx, Δy), P.sub.2(i,j|Δx, Δy), P.sub.3(i,j|Δx, Δy), P.sub.4(i,j|Δx, Δy).
[0266] In other words, the vector V=[IDM.sub.1, EN.sub.1, NE.sub.1, LO.sub.1, CS.sub.1, CP.sub.1, CO.sub.1, EX.sub.1, ELE.sub.1; IDM.sub.2, EN.sub.2, NE.sub.2, LO.sub.2, CS.sub.2, CP.sub.2, CO.sub.2, EX.sub.2, ELE.sub.2; IDM.sub.3, EN.sub.3, NE.sub.3, LO.sub.3, CS.sub.3, CP.sub.3, CO.sub.3, EX.sub.3, ELE.sub.3; IDM.sub.4, EN.sub.4, NE.sub.4, LO.sub.4, CS.sub.4, CP.sub.4, CO.sub.4, EX.sub.4, ELE.sub.4].
[0267] Consequently, each sub-vector v.sub.1,v.sub.2,v.sub.3,v.sub.4 would be so defined: [0268] v.sub.1=[IDM.sub.1, EN.sub.1, NE.sub.1, LO.sub.1, CS.sub.1, CP.sub.1, CO.sub.1, EX.sub.1, ELE.sub.1]; [0269] v.sub.2=[IDM.sub.2, EN.sub.2, NE.sub.2, LO.sub.2, CS.sub.2, CP.sub.2, CO.sub.2,EX.sub.2, ELE.sub.2]; [0270] v.sub.3=[IDM.sub.3, EN.sub.3, NE.sub.3, LO.sub.3, CS.sub.3, CP.sub.3, CO.sub.3, EX.sub.3, ELE.sub.3]; [0271] v.sub.4=[IDM.sub.4, EN.sub.4, NE.sub.4, LO.sub.4, CS.sub.4, CP.sub.4, CO.sub.4, EX.sub.4, ELE.sub.4].
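The assembly of vector V from the four co-occurrence matrices can be sketched as below; the dummy matrices and dummy statistics stand in for the real ones purely for illustration:

```python
def build_feature_vector(cooccurrence_matrices, stat_funcs):
    # V = [v1; v2; v3; v4]: one sub-vector per co-occurrence matrix,
    # each holding the numerical results of all k statistical functions.
    return [f(P) for P in cooccurrence_matrices for f in stat_funcs]

# Illustration only: four dummy matrices and nine dummy statistics,
# giving the 4 x 9 = 36 elements of the preferred embodiment.
dummy_matrices = [[[0.0]]] * 4
dummy_stats = [lambda P, n=n: float(n) for n in range(9)]
V = build_feature_vector(dummy_matrices, dummy_stats)
```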
[0272] With reference to step F, as said, said predetermined neural network NN is designed to provide at least a first numerical value between 0 and 1 at a respective output node, i.e. the first output node.
[0273] In particular, in the embodiment being described, said predetermined neural network is a feed-forward neural network.
[0274] Furthermore, the learning method for said neural network is a quasi-Newton method.
[0275] With reference to steps G and H, said first numerical value will be compared with a predetermined threshold and the cell will be considered a diseased cell, if said first numerical value is greater than said predetermined threshold.
[0276] With particular reference to steps G and H, said step G can comprise a sub-step G1 of approximating said first numerical value to 1, when said first numerical value is greater than said predetermined threshold, and to 0, when said first numerical value is less than or equal to said predetermined threshold, and with reference to step H said cell is a diseased cell, in particular a tumorous cell, when said first numerical value is approximated to 1.
[0277] Returning to step F, in the embodiment being described, said predetermined neural network NN comprises a second output node N.sub.OUT2.
[0278] Furthermore, said predetermined neural network NN is configured to provide as output a second numerical value between 0 and 1 at said second output node N.sub.OUT2.
[0279] Said second numerical value is compared with the same predetermined threshold with which the first numerical value is compared.
[0280] After the comparison with said predetermined threshold, said second numerical value is approximated to 1 or 0.
[0281] A diseased cell (in the embodiment being described) is identified by a first numerical value (at the first output node N.sub.OUT1) which has been approximated to 1 and by a second numerical value (at the second output node N.sub.OUT2) which was approximated to 0.
[0282] A healthy cell is identified by a first numerical value (at the first output node N.sub.OUT1) which has been approximated to 0 and by a second numerical value (at the second output node N.sub.OUT2) which was approximated to 1.
[0283] In other words, the steps from F to H have been modified as follows.
[0284] The step F of the method provides that said predetermined neural network NN is configured to provide as output a second numerical value at said second output node N.sub.OUT2.
[0285] The step G of the method comprises the comparison of said second numerical value at said second output node N.sub.OUT2 with said predetermined threshold.
[0286] The step H of the method makes it possible to determine whether the nucleus C of said cell is the nucleus of a diseased cell, in particular a tumorous cell, when said first numerical value is greater than said predetermined threshold and said second numerical value is less than or equal to said predetermined threshold.
[0287] In particular, the step G can comprise a sub-step G2 of approximating the second numerical value to 1, when said second numerical value is greater than said predetermined threshold, and to 0, when said second numerical value is less than or equal to said predetermined threshold and with reference to step H said cell is a diseased cell, in particular a tumorous cell, when said first numerical value is approximated to 1 and when said second numerical value is approximated to 0.
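The decision of sub-steps G1/G2 and step H can be sketched as follows; the "indeterminate" label returned for the two output combinations the text does not define ((0,0) and (1,1)) is a hypothetical addition for completeness:

```python
def classify(out1, out2, threshold):
    # Sub-steps G1/G2: approximate each output to 1 if above the threshold,
    # to 0 otherwise.
    b1 = 1 if out1 > threshold else 0
    b2 = 1 if out2 > threshold else 0
    # Step H: (1, 0) -> diseased (tumorous), (0, 1) -> healthy.
    if b1 == 1 and b2 == 0:
        return "tumorous"
    if b1 == 0 and b2 == 1:
        return "healthy"
    return "indeterminate"  # combination not defined in the text (assumption)
```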
[0288] With reference to the two output nodes N.sub.OUT1,N.sub.OUT2, said two output nodes N.sub.OUT1,N.sub.OUT2 are included in an output layer of said predetermined neural network NN.
[0289] As is clear from the system capable of implementing this method, shown in
[0292] With reference to the input layer, in the embodiment being described, said input layer comprises twenty-eight input nodes N.sub.IN1,N.sub.IN2 . . . N.sub.IN28, each of which is associated with a respective numerical result of each of said seven statistical functions SF.sub.1,SF.sub.2 . . . SF.sub.7 for each of the four Co-occurrence matrices P.sub.1(i,j|Δx, Δy), P.sub.2(i,j|Δx, Δy), P.sub.3(i,j|Δx, Δy), P.sub.4(i,j|Δx, Δy).
[0293] With reference to the hidden layer, in the embodiment being described, said hidden layer comprises ten hidden nodes N.sub.N1,N.sub.N2 . . . N.sub.N10.
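A minimal numpy sketch of the 28-10-2 feed-forward topology described here, with random untrained weights; sigmoid activations are an assumption (the text specifies the quasi-Newton learning method but not the activation functions):

```python
import numpy as np

def forward(x, W1, b1, W2, b2):
    # One hidden layer of 10 nodes, two output nodes; the sigmoid keeps
    # both output values between 0 and 1, as required by steps F-H.
    sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))
    h = sigmoid(W1 @ x + b1)
    return sigmoid(W2 @ h + b2)

rng = np.random.default_rng(0)
W1, b1 = rng.standard_normal((10, 28)), rng.standard_normal(10)
W2, b2 = rng.standard_normal((2, 10)), rng.standard_normal(2)
out = forward(rng.standard_normal(28), W1, b1, W2, b2)
```

In practice the weights would be learned with a quasi-Newton method (e.g. BFGS) on labelled feature vectors, as stated in paragraph [0274].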
[0294] The present invention also relates to a system, shown in
[0295] Said system comprises: [0296] storage means SM in which said nuclear fluorescence image and a predetermined threshold are stored, [0297] a predetermined neural network NN comprising an output layer, wherein said output layer comprises at least one first output node N.sub.OUT1, and configured to provide as output a first numerical value between 0 and 1 at said first output node N.sub.OUT1, [0298] a logic control unit U, connected to said storage means SM and to said predetermined neural network NN and configured to: [0299] segment said nuclear fluorescence image to obtain at least one segmented image I.sub.s referred to a nucleus C of a single cell; [0300] insert said at least one segmented image I.sub.s referred to said nucleus C of said cell on a background having a predetermined color to obtain at least one reference image I.sub.REF, in which a reference matrix M.sub.REF of dimensions M×N is associated with said reference image I.sub.REF and each pixel of said reference image I.sub.REF corresponds to a respective number in said reference matrix M.sub.REF whose value is the respective grey level of said pixel; [0301] apply a discrete Wavelet transform to said reference matrix M.sub.REF to obtain further four matrices M.sub.1,M.sub.2,M.sub.3,M.sub.4, different from each other, each of which is associated with a respective further image I.sub.1,I.sub.2,I.sub.3,I.sub.4 of the same nucleus C of the cell: [0302] a further first matrix M.sub.1 associated with a further first image I.sub.1 which is an image of the nucleus C of the cell shown in said reference image I.sub.REF, in which said further first image I.sub.1 has a resolution lower than the resolution of said reference image I.sub.REF, [0304] a further second matrix M.sub.2 associated with a further second image I.sub.2 referred to the
horizontal components of said reference image I.sub.REF, [0305] a further third matrix M.sub.3 associated with a further third image I.sub.3 referred to the vertical components of said reference image I.sub.REF, [0306] a further fourth matrix M.sub.4 associated with a further fourth image I.sub.4 referred to the diagonal components of said reference image I.sub.REF, [0307] in which each of said further matrices M.sub.1,M.sub.2,M.sub.3,M.sub.4 is a matrix of dimensions M′ x N′ and a respective number in position x,y inside a respective further matrix M.sub.1,M.sub.2,M.sub.3,M.sub.4 corresponds a pixel in position x,y of each further image I.sub.1,I.sub.2,I.sub.3,I.sub.4 and the value of said number is the respective grey level of said pixel; [0308] create a respective Co-occurrence matrix P.sub.1(i,j|Δx, Δy), P.sub.2(i,j|Δx, Δy), P.sub.3(i,j|Δx, Δy), P.sub.4(i,j|Δx, Δy) for each of said further four matrices M.sub.1,M.sub.2,M.sub.3,M.sub.4, in which each Co-occurrence matrix contains information on the nucleus C of said cell in terms of texture, magnitude and morphology, and is a matrix of dimensions G×G, where G is the number of grey levels and each of said Co-occurrence matrices P.sub.1(i,j|Δx, Δy), P.sub.2(i,j|Δx, Δy), P.sub.3(i,j|Δx, Δy), P.sub.4(i,j|Δx, Δy) has in a respective position i,j the number of pairs of elements of a respective further matrix M.sub.1,M.sub.2,M.sub.3,M.sub.4, in which each pair of elements is associated with a respective pair of pixels and is formed by a first element associated with a first pixel of said pair of pixels having a grey level equal to i and by second element associated with a second pixel of said pair of pixels, different from said first pixel and having a grey level equal to j, where i is a positive integer i=0 . . . G and j is a positive integer j=0 . . . G; [0309] calculate a plurality of statistical functions SF.sub.1,SF.sub.2 . . . 
SF.sub.N starting from each Co-occurrence matrix P.sub.1(i,j|Δx, Δy), P.sub.2(i,j|Δx, Δy), P.sub.3(i,j|Δx, Δy), P.sub.4(i,j|Δx, Δy) to characterize at least the texture of the nucleus C of said cell, in which each statistical function SF.sub.1,SF.sub.2 . . . SF.sub.N is associated with a respective parameter of a further image of the nucleus C of said cell and the result of each statistical function SF.sub.1,SF.sub.2 . . . SF.sub.N is a respective number, so that a vector V of numbers comprising four sub-vectors v.sub.1,v.sub.2,v.sub.3,v.sub.4, is associated with the nucleus C of said cell, each of which is associated with a respective further image I.sub.1,I.sub.2,I.sub.3,I.sub.4 and contains k elements in which k is the number of said statistical functions, [0310] supply as input to said predetermined neural network NN the results of said statistical functions SF.sub.1,SF.sub.2 . . . SF.sub.N to obtain a first numerical value between 0 and 1 at said first output node N.sub.OUT1, [0311] compare said first numerical value with said predetermined threshold stored in said storage means SM, [0312] determine whether said cell is a diseased cell, in particular a tumorous cell, when said first numerical value is greater than said predetermined threshold.
[0313] In particular, said logic control unit U is configured to approximate said first numerical value to 1, when said first numerical value is greater than said predetermined threshold, and to 0, when said first numerical value is less than or equal to said predetermined threshold, and to determine whether the nucleus C of a cell is the nucleus of a diseased cell, in particular a tumorous cell, when said first numerical value is approximated to 1.
[0314] Furthermore, as said for the method, said first output node is included in the output layer of said predetermined neural network NN.
[0315] Said predetermined neural network NN can comprise a second output node N.sub.OUT2 (also included in said output layer) and said predetermined neural network NN can be configured to provide a second numerical value between 0 and 1 at said second output node N.sub.OUT2 (in addition to the first numerical value and always on the basis of the results of the statistical functions provided as input to the neural network), and said logic control unit U can be configured to compare said second numerical value with said predetermined threshold and determine whether said cell C is a diseased cell, in particular a tumorous cell, when said second numerical value is less than or equal to said predetermined threshold, in addition to said first numerical value being greater than said predetermined threshold.
[0316] In particular, said logic control unit U can be configured to approximate said second numerical value to 1, when said second numerical value is greater than said predetermined threshold, and to 0, when said second numerical value is less than or equal to said predetermined threshold, and to determine whether the nucleus C of said cell is the nucleus of a diseased cell, in particular the nucleus of a tumorous cell, when said second numerical value is approximated to 0, in addition to said first numerical value being approximated to 1.
[0317] As said for the method, said plurality of statistical functions can comprise seven statistical functions to characterize the texture and preferably two further statistical functions to characterize the size and the morphology of the nucleus of a cell.
[0318] The present invention also relates to a computer program, comprising code means configured in such a way that, when executed on a computer, they perform the steps of the method described above.
[0319] Furthermore, the present invention also relates to a computer-readable storage medium comprising instructions, which, when executed by a computer, cause the computer to carry out the steps of the method described above.
[0320] Example of Creating a Co-Occurrence Matrix
[0321] Below is an example of how a Co-occurrence matrix is created starting from a further matrix associated with a further image, wherein said further matrix has dimensions 5×5 (consequently M′ is equal to 5 and N′ is equal to 5) and said further image is coded with 5 levels of grey (i.e. through the values 0,1,2,3,4).
[0322] It is assumed that said further matrix is the further first matrix M.sub.1 for convenience.
[0323] Below is an example of said further first matrix:
[0324] As mentioned, the Co-occurrence matrix is defined by the following general formula:
P.sub.z(i,j|Δx,Δy)=W.sub.z·Q.sub.z(i,j|Δx,Δy)
[0325] In the example being described Δx=1 and Δy=0.
[0326] This means that pairs of elements of said further matrix are taken into consideration (in which each element corresponds to a respective pixel) formed by two elements side by side, i.e. a first element and a second element arranged within said further matrix in the position subsequent to said first element.
[0327] Consequently, the general formula indicated above becomes:
P.sub.1(i,j|1,0)=W.sub.1·Q.sub.1(i,j|1,0)
[0328] In the example being described the parameter W.sub.1 (i.e. the number referred to the number of possible pairs of elements associated with a respective pixel pairs) becomes:
[0329] As regards the calculation of the parameter Q.sub.1 (i.e. the number referred to the number of pairs of elements of a further matrix, wherein each pair of elements is formed by said first element associated with said first pixel with grey level equal to i and from said second element associated with said second pixel with grey level equal to j), in order to facilitate the calculation of this parameter, a table is shown below which shows the number of pairs of elements as i and j vary.
TABLE-US-00001
           j
  i     0   1   2   3
  0     1   2   1   0
  1     0   1   3   0
  2     0   0   3   5
  3     0   0   2   2
[0330] With reference to the first row of the table: [0331] when i=0 and j=0 a pair of elements [0,0] is present in the further first matrix M.sub.1 (see second row), [0332] when i=0 and j=1 two pairs of elements [0,1] are present in the further first matrix M.sub.1 (see first row and third row), [0333] when i=0 and j=2 a pair of elements [0,2] is present in the further first matrix M.sub.1 (see second row), [0334] when i=0 and j=3 no pair of elements [0,3] is present in the further first matrix M.sub.1.
[0335] With reference to the second row of the table: [0336] when i=1 and j=0 no pair of elements [1,0] is present in the further first matrix M.sub.1, [0337] when i=1 and j=1 a pair of elements [1,1] is present in the further first matrix M.sub.1 (see first row), [0338] when i=1 and j=2 three pairs of elements [1,2] are present in the further first matrix M.sub.1 (see first row, third row and fourth row), [0339] when i=1 and j=3 no pair of elements [1,3] is present in the further first matrix M.sub.1.
[0340] With reference to the third row of the table: [0341] when i=2 and j=0 no pair of elements [2,0] is present in the further first matrix M.sub.1, [0342] when i=2 and j=1 no pair of elements [2,1] is present in the further first matrix M.sub.1, [0343] when i=2 and j=2 three pairs of elements [2,2] are present in the further first matrix M.sub.1 (see third row, fourth row and fifth row), [0344] when i=2 and j=3 five pairs of elements [2,3] are present in the further first matrix M.sub.1 (see first row, second row, third row, fourth row and fifth row).
[0345] With reference to the fourth row of the table: [0346] when i=3 and j=0 no pair of elements [3,0] is present in the further first matrix M.sub.1, [0347] when i=3 and j=1 no pair of elements [3,1] is present in the further first matrix M.sub.1, [0348] when i=3 and j=2 two pairs of elements [3,2] are present in the further first matrix M.sub.1 (see fourth row and fifth row), [0349] when i=3 and j=3 two pairs of elements [3,3] are present in the further first matrix M.sub.1 (see second row and fifth row).
[0350] As a result:
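The 5×5 example matrix itself does not survive in this text, but the pair counts and row references listed above fully determine one consistent reconstruction (an assumption, not necessarily the original figure). The sketch below builds the Co-occurrence matrix for Δx=1, Δy=0 and reproduces the table:

```python
import numpy as np

# Reconstructed 5x5 further first matrix, consistent with every pair
# described in paragraphs [0330]-[0349] (assumption: the original figure
# is omitted from the text).
M1 = np.array([[0, 1, 1, 2, 3],
               [0, 0, 2, 3, 3],
               [0, 1, 2, 2, 3],
               [1, 2, 2, 3, 2],
               [2, 2, 3, 3, 2]])

def glcm_counts(M, levels, dx=1, dy=0):
    # Q[i, j] = number of pairs whose first pixel has grey level i and
    # whose neighbour at offset (dx, dy) has grey level j.
    Q = np.zeros((levels, levels), dtype=int)
    rows, cols = M.shape
    for y in range(rows):
        for x in range(cols):
            x2, y2 = x + dx, y + dy
            if 0 <= x2 < cols and 0 <= y2 < rows:
                Q[M[y, x], M[y2, x2]] += 1
    return Q

Q1 = glcm_counts(M1, levels=5)  # the table of pair counts above
W1 = 1.0 / Q1.sum()             # 5 rows x 4 horizontal pairs -> W1 = 1/20
P1 = W1 * Q1                    # P1(i, j | 1, 0) = W1 * Q1(i, j | 1, 0)
```

Normalizing by W.sub.1 (the reciprocal of the number of possible pairs) makes the elements of P.sub.1 sum to one.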
Test Example for the Method Described Above
[0351] A nuclear fluorescence image of a liver tissue containing a number of cells equal to 573 (including healthy cells and diseased cells) has been processed through the method described above, by using a neural network already trained with other nuclear fluorescence images concerning a plurality of cells present in a healthy and diseased liver tissue. The results have been compared with the results of the traditional anatomopathological methods.
[0352] Furthermore, in order to evaluate the robustness of the method described above, it has been chosen to apply different predetermined threshold values to determine whether the cell is healthy or diseased.
[0353] In the example being disclosed, said threshold values have been chosen between 0 and 1.
[0354] In particular, the chosen threshold values are the following: 0.2, 0.4, 0.6 and 0.8.
[0355] Below is a table showing the results obtained by varying the threshold values.
TABLE-US-00002
  TP    TN    FP    FN    Threshold value    fp       tp
  79    471    3    20    0.8                0.006    0.79
  80    469    4    20    0.6                0.008    0.8
  80    468    5    20    0.4                0.010    0.8
  82    463    9    19    0.2                0.019    0.81
[0356] In the table above: [0357] TP indicates the number of cells recognized as diseased cells correctly identified by the method described above; [0358] TN indicates the number of cells recognized as healthy cells correctly identified by the method described above; [0359] FP indicates the number of cells recognized as healthy cells mistakenly identified as diseased cells by the method described above; [0360] FN indicates the number of cells recognized as diseased cells mistakenly identified as healthy cells by the method described above; [0361] fp indicates an estimate of the likelihood that the method described above mistakenly identifies as diseased cells the cells recognized as healthy cells, wherein fp=FP/(FP+TN); and [0362] tp indicates an estimate of the likelihood that the method described above correctly identifies diseased cells, wherein tp=TP/(TP+FN).
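Using the first row of the table, the two rates are consistent with the standard definitions fp=FP/(FP+TN) and tp=TP/(TP+FN) (an assumption, as the equations themselves are omitted from this text):

```python
# First table row (threshold 0.8): TP, TN, FP, FN from the test example.
TP, TN, FP, FN = 79, 471, 3, 20

fp = FP / (FP + TN)  # false-positive rate: healthy cells flagged as diseased
tp = TP / (TP + FN)  # true-positive rate: diseased cells correctly flagged

total = TP + TN + FP + FN  # all 573 cells of the liver tissue image
```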
[0363] The values shown in the table have been used to construct a respective confusion matrix for each predetermined threshold value and to construct a ROC curve concerning all the confusion matrices.
[0364] The
[0365] The accuracy of the method described above to determine whether the cells are healthy cells or diseased cells is directly proportional to the area subtended by the ROC curve.
[0366] The area under the ROC curve is called AUC and measures the probability that the result of a test on a sick person randomly chosen from a group of sick people is different from (greater than) the result of a test on a healthy person randomly chosen from a group of healthy people.
[0367] In addition, several methods are known to estimate the area subtended by the ROC curve, i.e. the AUC.
[0368] In particular, a known method for estimating the AUC provides for a numerical integration, for example by calculating different areas, each of which is associated with a respective polygon subtended by the curve, and then adding the areas of all the polygons.
[0369] The result of the sum of the areas of all the polygons will provide a lower estimate of the real area subtended by the ROC curve.
[0370] In particular, it is possible to use a known method to interpret the value of the AUC, according to which: [0371] if AUC<0.5, the method is not considered informative; [0372] if 0.5≤AUC<0.7, the method is considered inaccurate; [0373] if 0.7≤AUC<0.9, the method is considered moderately accurate; [0374] if 0.9≤AUC<1, the method is considered to be highly accurate; [0375] if AUC=1, the method is considered perfect.
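A sketch of the polygon (trapezoid) lower estimate described above, applied to the (fp, tp) points of the table; closing the curve at (0, 0) and (1, 1) is a conventional assumption, not stated in the text:

```python
# ROC points (fp, tp) from the table, one per threshold value, plus the
# conventional end points (0, 0) and (1, 1) (assumption).
points = sorted([(0.0, 0.0), (0.019, 0.81), (0.010, 0.8),
                 (0.008, 0.8), (0.006, 0.79), (1.0, 1.0)])

# Sum of trapezoid areas under consecutive points: a numerical-integration
# estimate of the AUC.
auc = sum((x2 - x1) * (y1 + y2) / 2.0
          for (x1, y1), (x2, y2) in zip(points, points[1:]))
```

With these values the estimate is about 0.9, at the boundary of the moderately accurate and highly accurate bands, consistent with the conclusion below.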
[0376] Regardless of the predetermined threshold value, the method is moderately accurate or highly accurate.
[0377] In other words, the method described above is accurate regardless of the predetermined threshold value and robust with respect to the choice of said threshold value.
[0378] In the example being disclosed, it is preferable that the predetermined threshold value is greater than 0.2 and more preferably greater than or equal to 0.8.
Advantages
[0379] Advantageously, as said, the method object of the present invention makes it possible to determine automatically whether a cell shown in a nuclear fluorescence image obtained through a confocal microscope is a diseased cell, in particular a tumorous cell.
[0380] A second advantage is given by the fact that, through said method, it is possible to distinguish diseased cells from healthy cells.
[0381] A further advantage is due to the reliability of the method. The present invention has been described for illustrative, but not limitative, purposes, according to its preferred embodiment, but it is to be understood that variations and/or modifications can be carried out by a person skilled in the art, without departing from the scope thereof, as defined by the enclosed claims.