Image Processing Device, Image Processing Method, Image Processing Program, Endoscope Device, and Endoscope Image Processing System
20230218175 · 2023-07-13
Inventors
- Hiroshi Takemura (Shinjuku-ku, Tokyo, JP)
- Kohei Soga (Shinjuku-ku, Tokyo, JP)
- Reiichirou Ike (Shinjuku-ku, Tokyo, JP)
- Toshihiro Takamatsu (Shinjuku-ku, Tokyo, JP)
- Hiroaki Ikematsu (Tokyo, JP)
- Hideo Yokota (Saitama, JP)
CPC classification
A61B5/0077
HUMAN NECESSITIES
A61B5/004
HUMAN NECESSITIES
Abstract
An image processing device acquires an image obtained by irradiating an area of a living body with light having a wavelength of 955 [nm] to 2025 [nm]. The image processing device inputs the acquired image to a learned model or a statistical model generated in advance for detecting, from the image, a tumor present in the area, and determines whether or not a tumor is present at each point in the image.
Claims
1. An image processing device, comprising: an image acquisition unit that acquires an image obtained by irradiating an area of a living body with light having a wavelength of 955 nm to 2025 nm; and a determination unit that inputs the image acquired by the image acquisition unit to a learned model or a statistical model generated in advance for detecting, from the image, a tumor present in the area, and that determines whether or not a tumor is present at each point in the image acquired by the image acquisition unit.
2. The image processing device according to claim 1, wherein: the image acquisition unit acquires an image obtained by irradiating a gastrointestinal tract area in a living body with light having a wavelength of 1000 nm to 1500 nm, and the determination unit inputs the image acquired by the image acquisition unit to a learned model or a statistical model generated in advance for detecting, from the image, a gastrointestinal stromal tumor present in the gastrointestinal tract area, and determines whether or not a gastrointestinal stromal tumor is present at each point in the image acquired by the image acquisition unit.
3. The image processing device according to claim 2, wherein the light includes at least one of: first light representing light of a wavelength of 1050 to 1105 nm, second light representing light of a wavelength of 1145 to 1200 nm, third light representing light of a wavelength of 1245 to 1260 nm, or fourth light representing light of a wavelength of 1350 to 1405 nm.
4. The image processing device according to claim 3, wherein the light includes: first light representing light of a wavelength of 1050 to 1105 nm, second light representing light of a wavelength of 1145 to 1200 nm, third light representing light of a wavelength of 1245 to 1260 nm, and fourth light representing light of a wavelength of 1350 to 1405 nm.
5. The image processing device according to claim 2, wherein the learned model or the statistical model is a model generated in advance based on data in which an in-vivo image for training, and information indicating whether or not a gastrointestinal stromal tumor is present inside a gastrointestinal tract area appearing in the in-vivo image, are associated with each other.
6. The image processing device according to claim 2, wherein the determination unit inputs a pixel value of each pixel of the image acquired by the image acquisition unit to the learned model or the statistical model, and determines whether or not a gastrointestinal stromal tumor is present for each pixel of the image acquired by the image acquisition unit.
7. The image processing device according to claim 1, wherein: the image acquisition unit acquires an image obtained by irradiating lungs in a living body with light having a wavelength of 955 nm to 2025 nm, and the determination unit inputs the image acquired by the image acquisition unit to a learned model or a statistical model generated in advance for detecting, from the image, a tumor present in the lungs, and determines whether or not a tumor is present at each point in the image acquired by the image acquisition unit.
8. The image processing device according to claim 7, wherein the light includes at least one of: first light representing light of a wavelength of 955 to 1020 nm, second light representing light of a wavelength of 1055 to 1135 nm, third light representing light of a wavelength of 1135 to 1295 nm, fourth light representing light of a wavelength of 1295 to 1510 nm, fifth light representing light of a wavelength of 1510 to 1645 nm, or sixth light representing light of a wavelength of 1820 to 2020 nm.
9. The image processing device according to claim 8, wherein the light includes: first light representing light of a wavelength of 955 to 1020 nm, second light representing light of a wavelength of 1055 to 1135 nm, third light representing light of a wavelength of 1135 to 1295 nm, fourth light representing light of a wavelength of 1295 to 1510 nm, fifth light representing light of a wavelength of 1510 to 1645 nm, and sixth light representing light of a wavelength of 1820 to 2020 nm.
10. The image processing device according to claim 7, wherein the learned model or the statistical model is a model generated in advance based on data in which an in-vivo image for training, and information indicating whether or not a tumor is present in the lungs appearing in the in-vivo image, are associated with each other.
11. The image processing device according to claim 1, wherein: the image acquisition unit acquires an image obtained by irradiating a stomach in a living body with light having a wavelength of 1085 nm to 1405 nm, and the determination unit inputs the image acquired by the image acquisition unit to a learned model or a statistical model generated in advance for detecting, from the image, a tumor present in the stomach, and determines whether or not a tumor is present at each point in the image acquired by the image acquisition unit.
12. The image processing device according to claim 11, wherein the light includes at least one of: first light representing light of a wavelength of 1065 to 1135 nm, second light representing light of a wavelength of 1180 to 1230 nm, third light representing light of a wavelength of 1255 to 1325 nm, or fourth light representing light of a wavelength of 1350 to 1425 nm.
13. The image processing device according to claim 12, wherein the light includes: first light representing light of a wavelength of 1065 to 1135 nm, second light representing light of a wavelength of 1180 to 1230 nm, third light representing light of a wavelength of 1255 to 1325 nm, and fourth light representing light of a wavelength of 1350 to 1425 nm.
14. The image processing device according to claim 11, wherein the learned model or the statistical model is a model generated in advance based on data in which an in-vivo image for training, and information indicating whether or not a tumor is present in the stomach appearing in the in-vivo image, are associated with each other.
15. The image processing device according to claim 1, wherein: the image acquisition unit acquires an image obtained by irradiating a large bowel in a living body with light having a wavelength of 1020 nm to 1540 nm, and the determination unit inputs the image acquired by the image acquisition unit to a learned model or a statistical model generated in advance for detecting, from the image, a tumor present in the large bowel, and determines whether or not a tumor is present at each point in the image acquired by the image acquisition unit.
16. The image processing device according to claim 15, wherein the light includes at least one of: first light representing light of a wavelength of 1020 to 1140 nm, second light representing light of a wavelength of 1140 to 1260 nm, third light representing light of a wavelength of 1315 to 1430 nm, or fourth light representing light of a wavelength of 1430 to 1535 nm.
17. The image processing device according to claim 15, wherein the light includes: first light representing light of a wavelength of 1020 to 1140 nm, second light representing light of a wavelength of 1140 to 1260 nm, third light representing light of a wavelength of 1315 to 1430 nm, and fourth light representing light of a wavelength of 1430 to 1535 nm.
18. The image processing device according to claim 15, wherein the learned model or the statistical model is a model generated in advance based on data in which an in-vivo image for training, and information indicating whether or not a tumor is present in the large bowel appearing in the in-vivo image, are associated with each other.
19. An image processing program executable by a computer to function as: an image acquisition unit that acquires an image obtained by irradiating an area of a living body with light having a wavelength of 955 nm to 2025 nm; and a determination unit that inputs the image acquired by the image acquisition unit to a learned model or a statistical model generated in advance for detecting, from the image, a tumor present in the area, and that determines whether or not a tumor is present at each point in the image acquired by the image acquisition unit.
20. An image processing method according to which a computer executes processing, the processing comprising: acquiring an image obtained by irradiating an area of a living body with light having a wavelength of 955 nm to 2025 nm; and inputting the acquired image to a learned model or a statistical model generated in advance for detecting, from the image, a tumor present in the area, and determining whether or not a tumor is present at each point in the acquired image.
21. An endoscope device, comprising: a light output unit that outputs light having a wavelength of 955 nm to 2025 nm; and an imaging device that captures an image when an area of a living body is irradiated with light from the light output unit.
22. The endoscope device according to claim 21, wherein: the area in the living body is a gastrointestinal tract area, and the light includes at least one of: first light representing light of a wavelength of 1050 to 1105 nm, second light representing light of a wavelength of 1145 to 1200 nm, third light representing light of a wavelength of 1245 to 1260 nm, or fourth light representing light of a wavelength of 1350 to 1405 nm.
23. The endoscope device according to claim 21, wherein: the area in the living body is lungs, and the light includes at least one of: first light representing light of a wavelength of 955 to 1020 nm, second light representing light of a wavelength of 1055 to 1135 nm, third light representing light of a wavelength of 1135 to 1295 nm, fourth light representing light of a wavelength of 1295 to 1510 nm, fifth light representing light of a wavelength of 1510 to 1645 nm, or sixth light representing light of a wavelength of 1820 to 2020 nm.
24. The endoscope device according to claim 21, wherein: the area in the living body is a stomach, and the light includes at least one of: first light representing light of a wavelength of 1065 to 1135 nm, second light representing light of a wavelength of 1180 to 1230 nm, third light representing light of a wavelength of 1255 to 1325 nm, or fourth light representing light of a wavelength of 1350 to 1425 nm.
25. The endoscope device according to claim 21, wherein: the area in the living body is a large bowel, and the light includes at least one of: first light representing light of a wavelength of 1020 to 1140 nm, second light representing light of a wavelength of 1140 to 1260 nm, third light representing light of a wavelength of 1315 to 1430 nm, or fourth light representing light of a wavelength of 1430 to 1535 nm.
26. An endoscope image processing system, comprising: the endoscope device according to claim 21; and the image processing device comprising: an image acquisition unit that acquires an image obtained by irradiating an area of a living body with light having a wavelength of 955 nm to 2025 nm; and a determination unit that inputs the image acquired by the image acquisition unit to a learned model or a statistical model generated in advance for detecting, from the image, a tumor present in the area, and that determines whether or not a tumor is present at each point in the image acquired by the image acquisition unit.
Description
BRIEF DESCRIPTION OF DRAWINGS
DESCRIPTION OF EMBODIMENTS
[0039] An example of an embodiment of the disclosure will be described hereinbelow with reference to the drawings. Note that, in the drawings, the same or equivalent constituent elements and portions are assigned the same reference signs. Furthermore, the dimensional ratios in the drawings are exaggerated for convenience of description, and are sometimes different from the actual ratios.
[0040] (Configuration of Endoscope Image Processing System 1)
[0041]
[0042] The endoscope image processing system 1 according to the present embodiment detects a gastrointestinal stromal tumor (hereinafter simply referred to as “GIST”), which is a malignant tumor occurring under the mucosa of the gastrointestinal tract, such as the stomach or the small intestine.
[0043] Conventionally, biological tissue has been irradiated with near-infrared light across the entire wavelength region (for example, light having a wavelength of approximately 800 to 2500 nm), and an image of the biological tissue has been captured at that time. In this case, a camera capable of capturing near-infrared light across the entire wavelength region (for example, a near-infrared hyperspectral camera or the like) must be mounted on the endoscope. However, it is difficult to mount such a camera on the endoscope.
[0044] Therefore, according to the present embodiment, light of specific wavelengths useful for discriminating a GIST is selected from the near-infrared light. The endoscope image processing system 1 according to the present embodiment irradiates a gastrointestinal tract area with light of the specific wavelengths selected in advance, and captures an image of the gastrointestinal tract area at that time. Further, the endoscope image processing system 1 according to the present embodiment discriminates, based on the captured image, the presence or absence of a GIST in the gastrointestinal tract area.
[0045] A specific description will be provided hereinbelow.
[0046] (Endoscope System)
[0047] As illustrated in
[0048] The endoscope device 12 includes an insertion portion 14 that is inserted into the human body H. The insertion portion 14 is attached to an operation unit 16. The operation unit 16 includes various buttons for instructing operations such as curving a distal end 18 of the insertion portion 14 in the vertical and horizontal directions within a predetermined angle range, collecting a tissue sample by operating a puncture needle attached to the distal end 18 of the endoscope device 12, and spraying medicine.
[0049] The endoscope device 12 according to the present embodiment is an endoscope for the gastrointestinal tract, and the distal end 18 is inserted into the gastrointestinal tract of the human body H. A light output unit is provided at the distal end 18 of the insertion portion 14 of the endoscope device 12, and light outputted from the light output unit is irradiated onto the gastrointestinal tract area in the living body. Further, the endoscope device 12 acquires an image of the gastrointestinal tract, which is the subject, using an imaging optical system.
[0050]
[0051] Light of a specific wavelength is outputted from the light source device of the endoscope device 12 according to the present embodiment, and light of the specific wavelength is outputted from the light guides 18B and 18C, which are examples of the light output unit. Specifically, the light source device is configured to be capable of outputting light having a wavelength of 1000 [nm] to 1500 [nm].
[0052] More specifically, the light source device (not illustrated) of the control device 19 is configured to be capable of outputting light having a wavelength of 1050 to 1105 [nm] (hereinafter simply referred to as “first light”), light having a wavelength of 1145 to 1200 [nm] (hereinafter simply referred to as “second light”), light having a wavelength of 1245 to 1260 [nm] (hereinafter simply referred to as “third light”), and light having a wavelength of 1350 to 1405 [nm] (hereinafter simply referred to as “fourth light”). These light components are light of the specific wavelengths selected in advance.
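The four bands above can be written down as simple constants. The sketch below takes the (low, high) limits in nm from the description; the dictionary and helper names are illustrative, not from the patent:

```python
# The first to fourth light bands from paragraph [0052], as (low, high) in nm.
GIST_BANDS_NM = {
    "first": (1050, 1105),
    "second": (1145, 1200),
    "third": (1245, 1260),
    "fourth": (1350, 1405),
}

def band_for_wavelength(nm):
    """Return the name of the band containing the wavelength, or None."""
    for name, (lo, hi) in GIST_BANDS_NM.items():
        if lo <= nm <= hi:
            return name
    return None
```

For example, `band_for_wavelength(1192.32)` returns `"second"`, while a wavelength outside all four bands returns `None`.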
[0053] The endoscope device 12 performs control to irradiate the gastrointestinal tract area in the living body with the first light, and captures an image (hereinafter simply referred to as a “first image”) using the camera 18A at that time. Furthermore, the endoscope device 12 performs control to irradiate the gastrointestinal tract area in the living body with the second light, and captures an image (hereinafter simply referred to as a “second image”) using the camera 18A at that time. The endoscope device 12 performs control to irradiate the gastrointestinal tract area in the living body with the third light, and captures an image (hereinafter simply referred to as a “third image”) using the camera 18A at that time. The endoscope device 12 performs control to irradiate the gastrointestinal tract area in the living body with the fourth light, and captures an image (hereinafter simply referred to as a “fourth image”) using the camera 18A at that time.
[0054] The control device 19 acquires each image captured by the camera of the endoscope device 12. The control device 19 integrates the first image, the second image, the third image, and the fourth image to generate an image Im of the gastrointestinal tract area as illustrated in
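The exact integration step is not spelled out here; a minimal sketch, assuming integration amounts to stacking the four single-band captures so that each pixel of the image Im carries one value per wavelength band:

```python
def integrate_bands(*band_images):
    """Stack equally sized 2-D single-band images so that each pixel of the
    result holds one value per band (first to fourth light)."""
    h, w = len(band_images[0]), len(band_images[0][0])
    for img in band_images:
        assert len(img) == h and all(len(row) == w for row in img)
    return [[[img[y][x] for img in band_images] for x in range(w)]
            for y in range(h)]

# 2 x 2 toy captures under the first to fourth light (values are illustrative)
first = [[10, 11], [12, 13]]
second = [[20, 21], [22, 23]]
third = [[30, 31], [32, 33]]
fourth = [[40, 41], [42, 43]]

im = integrate_bands(first, second, third, fourth)
# im[0][0] == [10, 20, 30, 40]: the four band values of the top-left pixel
```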
[0055] The control device 19 then transmits the image Im of the gastrointestinal tract area to the image processing device 30.
[0056] (Image Processing Device)
[0057]
[0058] The image acquisition unit 32 acquires the image Im of the gastrointestinal tract area transmitted from the control device 19. The image acquisition unit 32 then temporarily stores the image Im of the gastrointestinal tract area, in the image storage unit 34.
[0059] The image storage unit 34 stores the image Im of the gastrointestinal tract area.
[0060] The learned model storage unit 36 stores a learned model generated in advance for detecting a GIST that is present inside the gastrointestinal tract area from the image Im of the gastrointestinal tract area.
[0061] The learned model according to the present embodiment is realized by, for example, a known neural network. The learned model according to the present embodiment is a model generated in advance based on data in which an in-vivo image for training and information (so-called label) indicating whether or not a GIST is present inside a gastrointestinal tract area appearing in the in-vivo image are associated with each other.
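The data association described above, pairing each training pixel's band values with its GIST/normal label, can be sketched as follows; the array layout and names are illustrative assumptions:

```python
def make_training_pairs(image, mask):
    """image: H x W lists of per-band pixel values; mask: H x W labels,
    1 where a GIST is present and 0 for normal tissue."""
    pairs = []
    for pixel_row, label_row in zip(image, mask):
        for spectrum, label in zip(pixel_row, label_row):
            pairs.append((list(spectrum), label))
    return pairs

image = [[[0.2, 0.5, 0.1, 0.7], [0.3, 0.4, 0.2, 0.6]]]  # one 1 x 2 image
mask = [[1, 0]]                                          # GIST, normal
pairs = make_training_pairs(image, mask)
# pairs[0] == ([0.2, 0.5, 0.1, 0.7], 1)
```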
[0062]
[0063] The determination unit 38 inputs the pixel value of each pixel of the image Im of the gastrointestinal tract area stored in the image storage unit 34 to the learned model stored in the learned model storage unit 36, and determines, for each pixel in the image Im of the gastrointestinal tract area, whether or not a GIST is present.
[0064] For example, in a case in which a certain pixel is inputted to the learned model, when the probability of being a GIST is higher than the probability of not being a GIST, the determination unit 38 determines that a GIST is present in the point corresponding to the pixel. Furthermore, in a case in which a certain pixel is inputted to the learned model, when the probability of being a GIST is equal to or less than the probability of not being a GIST, the determination unit 38 determines that a GIST is not present at a point corresponding to the pixel.
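Only the comparison of the two probabilities is taken from the description above; the sketch below uses a hypothetical toy callable standing in for the learned model:

```python
def classify_pixels(image, model):
    """Return an H x W map: True where a GIST is determined to be present."""
    result = []
    for row in image:
        out_row = []
        for pixel in row:
            p_gist, p_normal = model(pixel)
            # a GIST probability equal to or less than the not-GIST
            # probability means "no GIST", per paragraph [0064]
            out_row.append(p_gist > p_normal)
        result.append(out_row)
    return result

# Hypothetical toy model: maps a pixel's band values to (p_gist, p_normal).
def toy_model(pixel):
    p = min(max(sum(pixel) / 4.0, 0.0), 1.0)
    return (p, 1.0 - p)

image = [[[0.9, 0.9, 0.9, 0.9], [0.1, 0.1, 0.1, 0.1]]]
gist_map = classify_pixels(image, toy_model)
# gist_map == [[True, False]]
```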
[0065] The determination unit 38 outputs, to a display unit (not illustrated), the determination result regarding the presence or absence of a GIST for each pixel in the image Im of the gastrointestinal tract area.
[0066] A display unit (not illustrated) displays the determination result, outputted from the determination unit 38, regarding the presence or absence of a GIST. Note that the determination result regarding the presence or absence of a GIST is outputted, for example, in a format superimposed on the image Im of the gastrointestinal tract area (for example, points where a GIST is present are displayed in red). The user then checks the determination result displayed on the display unit.
[0067]
[0068] The CPU 21 is a central processing unit, and executes various programs and controls each unit. That is, the CPU 21 reads a program from the ROM 22 or the storage 24, and executes the program using the RAM 23 as a work area. The CPU 21 performs control of each of the foregoing configurations and various types of arithmetic processing according to the program stored in the ROM 22 or the storage 24. In the present embodiment, the ROM 22 or the storage 24 stores various programs for processing information inputted from the input device.
[0069] The ROM 22 stores various programs and various data. Serving as a work area, the RAM 23 temporarily stores programs or data. The storage 24 is configured from a hard disk drive (HDD) or a solid state drive (SSD) or the like, and stores various programs including an operating system, and various data.
[0070] The input unit 25 includes a pointing device such as a mouse, and a keyboard, and is used to perform various inputs.
[0071] The display unit 26 is, for example, a liquid crystal display, and displays various types of information. The display unit 26 may function as the input unit 25 by adopting a touch panel system.
[0072] The communication I/F 27 is an interface for communicating with another device such as an input device, and for example, standards such as Ethernet (registered trademark), FDDI, and Wi-Fi (registered trademark) are used.
[0073] Next, the operation of the endoscope image processing system 1 will be described.
[0074] When the distal end 18 of the insertion portion 14 of the endoscope device 12 is inserted into the living body in response to operation by the user and the distal end 18 reaches the gastrointestinal tract area, imaging of an image of the gastrointestinal tract area is started.
[0075] The endoscope device 12 performs control to irradiate the gastrointestinal tract area with the first light, and captures the first image by using the camera 18A at that time. Further, the endoscope device 12 performs control to irradiate the gastrointestinal tract area with the second light, and captures the second image by using the camera 18A at that time. Further, the endoscope device 12 performs control to irradiate the gastrointestinal tract area with the third light, and captures the third image by using the camera 18A at that time. The endoscope device 12 performs control to irradiate the gastrointestinal tract area with the fourth light, and captures the fourth image by using the camera 18A at that time.
[0076] The control device 19 integrates the first image, the second image, the third image, and the fourth image to generate an image Im of the gastrointestinal tract area as illustrated in
[0077] When acquiring the image Im of the gastrointestinal tract area transmitted from the control device 19, the image acquisition unit 32 of the image processing device 30 stores the image Im of the gastrointestinal tract area, in the image storage unit 34.
[0078] Further, upon receiving an instruction signal to start determination processing to determine whether or not a GIST is present at each point of the image Im of the gastrointestinal tract area, the image processing device 30 executes the image processing routine illustrated in
[0079] Specifically, the CPU 21 performs image processing by reading the image processing program from the ROM 22 or the storage 24, expanding the image processing program in the RAM 23, and executing the image processing program.
[0080] In step S50, the image acquisition unit 32 reads the image Im of the gastrointestinal tract area stored in the image storage unit 34.
[0081] In step S52, the determination unit 38 reads the learned model stored in the learned model storage unit 36.
[0082] In step S54, the determination unit 38 inputs the pixel value of each pixel of the image Im of the gastrointestinal tract area read in step S50 to the learned model read in step S52, and determines whether or not a GIST is present for each pixel in the image Im of the gastrointestinal tract area.
[0083] In step S56, the determination unit 38 outputs the determination result regarding the presence or absence of a GIST for each pixel in the image Im of the gastrointestinal tract area to the display unit 26, and terminates the image processing routine.
[0084] The display unit 26 displays the determination result regarding the presence or absence of a GIST thus outputted from the determination unit 38.
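The routine of steps S50 to S56 can be outlined as below; the storage and display objects are simple stand-ins, not the actual units of the device, and the stand-in model is purely illustrative:

```python
def image_processing_routine(image_store, model_store, display):
    im = image_store["gastrointestinal"]        # S50: read the stored image Im
    model = model_store["gist"]                 # S52: read the learned model
    result = [[model(px) for px in row] for row in im]  # S54: per-pixel GIST?
    display(result)                             # S56: output the determination
    return result

shown = []
out = image_processing_routine(
    {"gastrointestinal": [[(0.9, 0.9), (0.1, 0.2)]]},
    {"gist": lambda px: sum(px) > 1.0},  # toy stand-in for the learned model
    shown.append,
)
# out == [[True, False]]
```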
[0085] As described above, the endoscope device according to the present embodiment outputs light having a wavelength of 1000 [nm] to 1500 [nm], more specifically, first light representing light having a wavelength of 1050 to 1105 [nm], second light representing light having a wavelength of 1145 to 1200 [nm], third light representing light having a wavelength of 1245 to 1260 [nm], and fourth light representing light having a wavelength of 1350 to 1405 [nm], and captures an image when the gastrointestinal tract area in the living body is irradiated with the light. Further, the image processing device according to the present embodiment inputs the image of the gastrointestinal tract area to a learned model generated in advance for detecting a GIST that is present inside the gastrointestinal tract area from the image of the gastrointestinal tract area, and determines whether or not a GIST is present at each point of the image. Thus, the presence or absence of a GIST can be detected using an image captured by irradiating a gastrointestinal tract area in a living body with light of a specific wavelength. As a result, for example, the presence or absence of a GIST can be detected without mounting a large-scale imaging device such as a near-infrared hyperspectral camera on an endoscope device.
Example
GIST-Related Example
[0086] Next, a method of selecting light of specific wavelengths according to the present embodiment will be described as an Example. In the present embodiment, when light of specific wavelengths is selected from near-infrared light, wavelengths useful for GIST detection are selected using a neural network, which is an example of a learned model obtained by machine learning, and partial least squares discriminant analysis (PLS-DA), which is an example of a statistical model obtained by statistical analysis.
[0087] Table 1 shows the number of pieces of training data used in generating the neural network and the PLS-DA model. As described above, in the present embodiment, because the presence or absence of a GIST is determined for each pixel, the number of pieces of training data corresponds to the number of pixels. Note that “Tumor” in Table 1 represents pixels in which a GIST is present, and “Normal” represents pixels corresponding to normal regions in which a GIST is not present. Furthermore, the number at the left end of Table 1 represents the date on which the image was collected; for example, “20160923” represents Sep. 23, 2016.
TABLE 1
  Date        Tumor    Normal     All
  20160923    27760    15522     43282
  20160930     6090     5382     11472
  20171115    18619     1917     20536
  20171215     6399    12936     19335
  20180202     9617     2332     11949
  All         68485    38089    106574
[0088] (Wavelength Selection Using Neural Network)
[0089]
[0090] In the present embodiment, the neural network of
[0091] (1) A learned neural network is generated using the training data set. One piece of training data is data in which a pixel value of a certain pixel in an image captured when a gastrointestinal tract area is irradiated with light of each wavelength is associated with a label indicating whether or not the pixel is a GIST. Note that, when the neural network is trained, only the weights of the fully connected layers in the neural network are learned. Further, the neural network has no bias terms, and the activation function is the ReLU function.
[0092] (2) One piece of data is selected from the training data set, and a pixel value for each wavelength of the data is inputted to the learned neural network to calculate forward propagation. At this time, the output values of all the nodes in the learned neural network are recorded. Note that the output value refers to both a value outputted by each node of the neural network and a value outputted by the output layer of the neural network.
[0093] (3) The pixel value corresponding to one wavelength is selected from the pixel values for each wavelength in the one piece of data used in (2) above, and pseudo forward propagation is calculated using the learned neural network with the pixel values of the other wavelengths set to 0. The output of the learned neural network at this time is set as the contribution amount of that wavelength for that piece of data. Note that, at this time, the ReLU function of the learned neural network is not applied. Instead, for each node of the learned neural network, the output value of any node whose value recorded in (2) above is 0 or less is calculated as 0.
[0094] (4) (3) above is repeated for all wavelengths of one piece of data, and the contribution amount for the input of the pixel value corresponding to each wavelength is obtained.
[0095] (5) (2) to (4) above are repeated for a plurality of pieces of training data to obtain data of the contribution amounts for the plurality of pieces of training data.
[0096] The contribution amount will be described in specific terms hereinbelow.
[0097] The two values ultimately outputted by the learned neural network are, for example, {0.7, 0.3} (indicating a tumor pixel and a normal pixel, respectively). The larger of the two output values of the learned neural network is selected, and this specifies whether the pixel is a tumor pixel or a normal pixel. For example, in a case in which the output value when the pixel value of a certain pixel is inputted to the learned neural network is {0.7, 0.3}, the pixel is determined to be a tumor pixel.
[0098] When the pixel values of the pixels corresponding to all the wavelengths have been inputted to the learned neural network, there exist, among those wavelengths, wavelengths that contribute to a correct output and wavelengths that contribute to an incorrect output. For example, consider a case in which correct data of a certain pixel is {1, 0} and the pixel is a tumor. In this case, it is assumed that {0.7, 0.3} is outputted in a case in which the pixel values corresponding to all the wavelengths of the pixels have been inputted to the learned neural network. A wavelength that directs this output value toward {1, 0} is a wavelength that contributes to the correct output, and a wavelength that directs this output value toward {0, 1} is a wavelength that contributes to the incorrect output.
[0099] Therefore, one piece of data (the pixel value of one pixel) of the training data is inputted to the learned neural network with the values for all wavelengths other than a certain wavelength set to 0, and a final output value is calculated. In this case, the output of any node that directs the final output value in the wrong direction is set to 0. This processing corresponds to (2) and (3) above.
[0100] Further, for a certain wavelength, the sum of the output values calculated as described above over a plurality of pieces of data (the pixel values of a plurality of pixels) is calculated and set as the contribution amount of the wavelength. For example, in a case in which the correct data of a certain pixel is {1, 0} and the output when the pixel value corresponding to a certain wavelength is inputted to the learned neural network is {0.8, 0.2}, the calculated value is 0.8. This value is likewise calculated for a plurality of pixels, and the sum of the values calculated for the plurality of pixels is set as the contribution amount. The same calculation is executed for each of the plurality of wavelengths, and the contribution amount of each wavelength is calculated.
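A minimal sketch of the contribution computation in steps (2) to (4), assuming a tiny two-layer network with toy weights: the full pass records each hidden node's output, and the per-wavelength pseudo pass zeroes the other wavelengths and forces to 0 any node whose recorded value was 0 or less, without applying a fresh ReLU (the additional zeroing of wrong-direction output nodes in [0099] is omitted for brevity):

```python
def matvec(W, x):
    return [sum(w * xi for w, xi in zip(row, x)) for row in W]

def forward_record(x, W1, W2):
    """Full forward pass with ReLU; records hidden outputs as in step (2)."""
    hidden = [v if v > 0 else 0.0 for v in matvec(W1, x)]
    return matvec(W2, hidden), hidden

def contribution(x, k, W1, W2, recorded):
    """Pseudo forward propagation for wavelength k only, as in step (3):
    other wavelengths are zeroed, no fresh ReLU is applied, and any node
    whose recorded output was 0 or less is forced to 0."""
    masked = [xi if i == k else 0.0 for i, xi in enumerate(x)]
    hidden = [v if rec > 0 else 0.0
              for v, rec in zip(matvec(W1, masked), recorded)]
    return matvec(W2, hidden)

# Toy setup: 3 wavelengths, 2 hidden nodes, 2 outputs (tumor, normal), no bias.
W1 = [[1.0, -1.0, 0.5],
      [0.5, 1.0, -0.5]]
W2 = [[1.0, 0.0],
      [0.0, 1.0]]
x = [0.6, 0.2, 0.4]

out, rec = forward_record(x, W1, W2)
per_wavelength = [contribution(x, k, W1, W2, rec) for k in range(len(x))]
# With every ReLU gate open, the per-wavelength outputs sum exactly to `out`,
# so each term measures how much that wavelength pushed the final output.
```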
[0101] Tables 2 and 3 below show the correspondence between the wavelength numbers and the wavelengths [nm].
TABLE 2 (wavelength number: wavelength [nm])
  1:  913.78     2:  920.11     3:  926.45     4:  932.78     5:  939.12     6:  945.45     7:  951.78     8:  958.12
  9:  964.45    10:  970.79    11:  977.12    12:  983.45    13:  989.78    14:  996.12    15: 1002.45    16: 1008.78
 17: 1015.11    18: 1021.44    19: 1027.78    20: 1034.11    21: 1040.44    22: 1046.77    23: 1053.1     24: 1059.43
 25: 1065.76    26: 1072.09    27: 1078.42    28: 1084.75    29: 1091.08    30: 1097.41    31: 1103.74    32: 1110.07
 33: 1116.4     34: 1122.72    35: 1129.05    36: 1135.38    37: 1141.71    38: 1148.03    39: 1154.36    40: 1160.69
 41: 1167.02    42: 1173.34    43: 1179.67    44: 1186       45: 1192.32    46: 1198.65    47: 1204.97    48: 1211.3
 49: 1217.62    50: 1223.95    51: 1230.27    52: 1236.6     53: 1242.92    54: 1249.25    55: 1255.57    56: 1261.89
 57: 1268.22    58: 1274.54    59: 1280.86    60: 1287.19    61: 1293.51    62: 1299.83    63: 1306.15    64: 1312.48
 65: 1318.8     66: 1325.12    67: 1331.44    68: 1337.76    69: 1344.08    70: 1350.4     71: 1356.72    72: 1363.05
 73: 1369.37    74: 1375.69    75: 1382.01    76: 1388.32    77: 1394.64    78: 1400.96    79: 1407.28    80: 1413.6
 81: 1419.92    82: 1426.24    83: 1432.56    84: 1438.87    85: 1445.19    86: 1451.51    87: 1457.83    88: 1464.14
 89: 1470.46    90: 1476.78    91: 1483.09    92: 1489.41    93: 1495.73    94: 1502.04    95: 1508.36    96: 1514.67
 97: 1520.99    98: 1527.3     99: 1533.62   100: 1539.93   101: 1546.25   102: 1552.56   103: 1558.87   104: 1565.19
105: 1571.5    106: 1577.81   107: 1584.13   108: 1590.44   109: 1596.75   110: 1603.06   111: 1609.38   112: 1615.69
113: 1622      114: 1628.31   115: 1634.62   116: 1640.94   117: 1647.25   118: 1653.56   119: 1659.87   120: 1666.18
121: 1672.49   122: 1678.8    123: 1685.11   124: 1691.42   125: 1697.73   126: 1704.04   127: 1710.34   128: 1715.65
TABLE 3 (wavelength number: wavelength [nm])
129: 1722.96   130: 1729.27   131: 1735.58   132: 1741.89   133: 1748.19   134: 1754.5    135: 1760.81   136: 1767.11
137: 1773.42   138: 1779.73   139: 1786.03   140: 1792.34   141: 1798.65   142: 1804.95   143: 1811.26   144: 1817.56
145: 1823.87   146: 1830.17   147: 1836.48   148: 1842.78   149: 1849.09   150: 1855.39   151: 1861.69   152: 1868
153: 1874.3    154: 1880.61   155: 1886.91   156: 1893.21   157: 1899.51   158: 1905.82   159: 1912.12   160: 1918.42
161: 1924.72   162: 1931.02   163: 1937.32   164: 1943.63   165: 1949.93   166: 1956.23   167: 1962.53   168: 1968.83
169: 1975.13   170: 1981.43   171: 1987.73   172: 1994.03   173: 2000.33   174: 2006.63   175: 2012.92   176: 2019.22
177: 2025.52   178: 2031.82   179: 2038.12   180: 2044.42   181: 2050.71   182: 2057.01   183: 2063.31   184: 2069.6
185: 2075.9    186: 2082.2    187: 2088.49   188: 2094.79   189: 2101.09   190: 2107.38   191: 2113.68   192: 2119.97
193: 2126.27   194: 2132.56   195: 2138.86   196: 2145.15   197: 2151.45   198: 2157.74   199: 2164.03   200: 2170.33
201: 2176.62   202: 2182.91   203: 2189.21   204: 2195.5    205: 2201.79   206: 2208.08   207: 2214.38   208: 2220.67
209: 2226.96   210: 2233.25   211: 2239.54   212: 2245.83   213: 2252.13   214: 2258.42   215: 2264.71   216: 2271
217: 2277.29   218: 2283.58   219: 2289.87   220: 2296.16   221: 2302.45   222: 2308.73   223: 2315.02   224: 2321.31
225: 2327.6    226: 2333.89   227: 2340.18   228: 2346.46   229: 2352.75   230: 2359.04   231: 2365.33   232: 2371.61
233: 2377.9    234: 2384.19   235: 2390.47   236: 2396.76   237: 2403.04   238: 2409.33   239: 2415.62   240: 2421.9
241: 2428.19   242: 2434.47   243: 2440.75   244: 2447.04   245: 2453.32   246: 2459.61   247: 2465.89   248: 2472.17
249: 2478.46   250: 2484.74   251: 2491.02   252: 2497.31   253: 2503.59   254: 2509.87   255: 2518.15   256: 2522.44
[0102] The larger the absolute value of the contribution amount illustrated in the drawing, the greater the contribution of the light of that wavelength to the detection of a GIST.
[0103] Accordingly, it may be said that light having a wavelength of 1050 to 1105 [nm], light having a wavelength of 1145 to 1200 [nm], light having a wavelength of 1245 to 1260 [nm], and light having a wavelength of 1350 to 1405 [nm] are light having wavelengths useful for detecting a GIST.
[0104] (Wavelength Selection by PLS-DA)
[0105] In the present embodiment, wavelength selection and discrimination are also performed using PLS-DA, which is a known statistical method. Specifically, an identification model was generated by PLS-DA using the training data in Table 1, and the contribution amount of the light of each wavelength was calculated as the sum of the factor loadings of the light of that wavelength up to the eighth principal component of the generated identification model.
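The loading-sum calculation described above can be sketched with a plain-NumPy PLS1 (NIPALS) implementation. The toy spectra, labels, and selection of the four largest contributions are illustrative assumptions; the Example's actual identification model and the training data in Table 1 are not reproduced here:

```python
import numpy as np

def pls1_loadings(X, y, n_components=8):
    """PLS1 via the NIPALS algorithm; returns the x-loading matrix
    (n_wavelengths x n_components). A minimal sketch, not the exact
    implementation used in the Example."""
    Xc = X - X.mean(axis=0)
    yc = y - y.mean()
    loadings = []
    for _ in range(n_components):
        w = Xc.T @ yc
        w /= np.linalg.norm(w)          # weight vector
        t = Xc @ w                      # scores for this component
        p = Xc.T @ t / (t @ t)          # x-loadings
        q = (yc @ t) / (t @ t)
        Xc = Xc - np.outer(t, p)        # deflate X
        yc = yc - q * t                 # deflate y
        loadings.append(p)
    return np.column_stack(loadings)

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 40))                  # toy spectra: 40 wavelengths
y = (X[:, 5] + X[:, 20] > 0).astype(float)      # toy tumor/normal labels

P = pls1_loadings(X, y, n_components=8)
contribution = P.sum(axis=1)    # sum of loadings up to the 8th component
top4 = np.argsort(np.abs(contribution))[::-1][:4]
print("selected wavelengths:", top4)
```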
[0106]
[0107] Table 4 below shows the GIST discrimination accuracy using the neural network (denoted as "proposed method" in Table 4) and the GIST discrimination accuracy by PLS-DA. Note that the numbers in parentheses in Table 4 represent the numbers of wavelengths. "After dimension reduction" indicates that four wavelengths were selected from all the wavelengths to reduce the dimension. As shown in Table 4, detection accuracy substantially equivalent to that obtained using all the wavelengths is achieved using only the light of the four selected wavelengths.
TABLE 4
                            Proposed method    PLS-DA
All wavelengths             97.6% (196)        93.1% (196)
After dimension reduction   95.9% (4)          91.4% (4)
Values: accuracy [%] (number of wavelengths)
[0108] As described above, according to the present embodiment, when a GIST is detected using light of the selected wavelengths, the detection accuracy is substantially equivalent to that obtained when light of all wavelengths in the near-infrared region is used.
[0109] (Lung Cancer-Related Example)
[0110] Next, a lung cancer-related Example will be described. Light of specific wavelengths useful for detecting lung cancer was selected using the same method as in the GIST-related Example above. The wavelength range covers 196 wavelengths, from 913.78 [nm] to 2145.15 [nm].
[0111] The larger the absolute value of the contribution amount illustrated in the drawing, the greater the contribution of the light of that wavelength to the detection of lung cancer.
[0112] In this Example, six wavelengths in regions having a high contribution amount were selected, and the discrimination accuracy obtained using all the wavelengths was compared with that obtained using the six selected wavelengths, as shown in Table 5 below.
TABLE 5
                                            Validation accuracy
All wavelengths (196 wavelengths)           94.3%
After dimension reduction (6 wavelengths)   89.3%
[0113] As shown in the above table, even when all 196 wavelengths are reduced to six wavelengths, the accuracy decreases by only about 5%. Accordingly, it may be said that at least one of light having a wavelength of 955 to 1020 [nm], light having a wavelength of 1055 to 1135 [nm], light having a wavelength of 1135 to 1295 [nm], light having a wavelength of 1295 to 1510 [nm], light having a wavelength of 1510 to 1645 [nm], and light having a wavelength of 1820 to 2020 [nm] is light having a wavelength useful for detecting lung cancer.
[0114] Furthermore,
[0115] (Gastric Cancer-Related Example)
[0116] Next, a gastric cancer-related Example will be described. A learned SVM model was generated by replacing the neural network in the GIST-related embodiment above with a support vector machine (SVM), and light of specific wavelengths useful for detecting gastric cancer was selected by a similar method, using LASSO (Least Absolute Shrinkage and Selection Operator) for the wavelength selection. When the learned SVM model was generated, learning data of 6 specimens (normal: 405,525 pix; tumor: 107,078 pix) was used. In addition, leave-one-out cross-validation was used for the evaluation of the learned SVM model.
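The two-stage procedure described above (wavelength selection by LASSO followed by SVM classification) can be sketched with scikit-learn. The data here are synthetic stand-ins for the per-pixel spectra, and the regularization strength `alpha=0.05` is an illustrative assumption, not a parameter from the Example:

```python
import numpy as np
from sklearn.linear_model import Lasso
from sklearn.metrics import accuracy_score
from sklearn.svm import SVC

rng = np.random.default_rng(2)

# Toy data standing in for the per-pixel spectra over 95 wavelengths.
X = rng.normal(size=(300, 95))
y = (X[:, 10] - X[:, 40] > 0).astype(int)   # 1 = tumor, 0 = normal (synthetic)

# Step 1: LASSO drives most wavelength coefficients to exactly zero;
# the surviving nonzero coefficients mark the selected wavelengths.
lasso = Lasso(alpha=0.05).fit(X, y)
selected = np.flatnonzero(lasso.coef_)
print("selected wavelengths:", selected)

# Step 2: train an SVM using only the selected wavelengths.
svm = SVC(kernel="rbf").fit(X[:, selected], y)
acc = accuracy_score(y, svm.predict(X[:, selected]))
print("training accuracy:", acc)
```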
[0117] In this Example, four wavelengths in regions having a high contribution were selected, and the discrimination performance obtained using all the wavelengths was compared with that obtained using the four selected wavelengths, as shown in Table 6 below.
TABLE 6
              All wavelengths (95)    Selected wavelengths (4)
Accuracy      0.784                   0.800
Precision     0.920                   0.773
Recall        0.746                   0.968
Specificity   0.500                   0.512
F-measure     0.971                   0.859
[0118] As shown in the above table, it can be seen that there is no significant difference in accuracy, precision, recall, specificity, and F-measure between all (95) wavelengths and the selected (4) wavelengths. Accordingly, it may be said that at least one of light having a wavelength of 1065 to 1135 [nm], light having a wavelength of 1180 to 1230 [nm], light having a wavelength of 1255 to 1325 [nm], and light having a wavelength of 1350 to 1425 [nm] is light having a wavelength useful for detecting gastric cancer.
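The five measures reported in Table 6 (and in Table 7 below) follow the usual definitions computed from a per-pixel confusion matrix. A brief sketch, using made-up counts rather than the counts behind the tables:

```python
def metrics(tp, fp, fn, tn):
    """Evaluation measures of Tables 6 and 7, computed from a per-pixel
    confusion matrix (tumor = positive class)."""
    accuracy    = (tp + tn) / (tp + fp + fn + tn)
    precision   = tp / (tp + fp)
    recall      = tp / (tp + fn)          # sensitivity to tumor pixels
    specificity = tn / (tn + fp)          # correct rate on normal pixels
    f_measure   = 2 * precision * recall / (precision + recall)
    return accuracy, precision, recall, specificity, f_measure

# Illustrative (made-up) counts, not the counts behind Table 6:
print(metrics(tp=90, fp=10, fn=30, tn=70))
```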
[0119] Furthermore,
[0120]
[0121]
[0122]
[0123] When
[0124] (Tumor-Bearing Mouse-Related Example)
[0125] Next, a tumor-bearing mouse-related Example will be described. Light of specific wavelengths useful for detecting a tumor was selected using the same method as in the gastric cancer-related Example above. Note that the tumor-bearing mouse according to this Example is a tumor-bearing mouse in which cells derived from human colorectal cancer are used. In this Example, learning data of 11 specimens (normal: 245,866 pix; tumor: 107,078 pix) was used. In addition, leave-one-out cross-validation was used for the evaluation of the learned SVM model.
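The leave-one-out cross-validation mentioned above, applied specimen by specimen, can be sketched with scikit-learn's LeaveOneGroupOut. The specimen grouping and spectra below are synthetic placeholders, not the Example's data:

```python
import numpy as np
from sklearn.model_selection import LeaveOneGroupOut
from sklearn.svm import SVC

rng = np.random.default_rng(3)

# Synthetic stand-in: 4 specimens, 50 pixels each, 10 wavelengths.
X = rng.normal(size=(200, 10))
y = (X[:, 2] > 0).astype(int)                   # 1 = tumor, 0 = normal
groups = np.repeat(np.arange(4), 50)            # specimen id of each pixel

# Hold out one whole specimen per fold and train on the rest.
scores = []
for train_idx, test_idx in LeaveOneGroupOut().split(X, y, groups):
    svm = SVC(kernel="rbf").fit(X[train_idx], y[train_idx])
    scores.append(svm.score(X[test_idx], y[test_idx]))
print("per-specimen accuracies:", scores)
```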
[0126] In this Example, four wavelengths in regions having a high contribution were selected, and the discrimination performance obtained using all the wavelengths was compared with that obtained using the four selected wavelengths, as shown in Table 7 below.
TABLE 7
              All wavelengths (95)    Selected wavelengths (4)
Accuracy      0.885                   0.891
Precision     0.276                   0.287
Recall        0.917                   0.909
Specificity   0.883                   0.890
F-measure     0.424                   0.436
[0127] As shown in the above table, it can be seen that there is no significant difference in accuracy, precision, recall, specificity, and F-measure between all (95) wavelengths and the selected (4) wavelengths.
[0128] In addition,
[0129]
[0130]
[0131] When
[0132] Each of the above-described Examples is applicable to the foregoing embodiment, and can have a similar system configuration.
[0133] Note that, in the foregoing embodiments, each process executed as a result of the CPU reading software (a program) may be executed by various processors other than the CPU. Examples of such processors include a programmable logic device (PLD), such as a field-programmable gate array (FPGA), whose circuit configuration can be changed after manufacture, a dedicated electric circuit that is a processor having a circuit configuration exclusively designed for executing specific processing, such as an application specific integrated circuit (ASIC), and the like. Moreover, each process may be executed by one of these various processors, or may be executed by a combination of two or more processors of the same type or different types (for example, a plurality of FPGAs, a combination of a CPU and an FPGA, and so forth). Furthermore, the hardware structure of these various processors is, more specifically, an electric circuit in which circuit elements such as semiconductor elements are combined.
[0134] Note that the disclosure is not limited to or by the foregoing embodiment, rather, a variety of variations and applications are possible within a scope not departing from the spirit of the invention.
[0135] For example, in the embodiment, a case in which a GIST is detected using a neural network has been described as an example, but the disclosure is not limited thereto. For example, a GIST may be detected using the statistical model in the Example.
[0136] Further, in the embodiment, a case in which a gastrointestinal tract area is irradiated with first light representing light having a wavelength of 1050 to 1105 [nm], second light representing light having a wavelength of 1145 to 1200 [nm], third light representing light having a wavelength of 1245 to 1260 [nm], and fourth light representing light having a wavelength of 1350 to 1405 [nm] to acquire an image has been described as an example, but the disclosure is not limited thereto. A gastrointestinal tract area may be irradiated with at least one of these four light components to acquire an image. In addition, the same applies to each of the Examples, and an image may be acquired by irradiating an area of a living body with at least one of the light components.
[0137] Further, in the foregoing embodiment, a case in which a GIST is detected using light of four wavelengths has been described as an example, but the disclosure is not limited thereto. For example, a GIST may be detected using light of a plurality of wavelengths whose contribution amount in the Examples is equal to or greater than a predetermined threshold value. Furthermore, the same applies to the Examples, and a tumor may be detected using light of a plurality of wavelengths whose contribution amount is equal to or greater than a predetermined threshold value.
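The threshold-based selection just described can be sketched in a few lines; the contribution values and the threshold here are made up purely for illustration:

```python
import numpy as np

# Hypothetical contribution amounts per wavelength (index = wavelength number).
contribution = np.array([0.12, -0.85, 0.40, 1.30, -0.05, 0.95, -1.10, 0.20])
threshold = 0.9

# Select every wavelength whose |contribution| is at or above the threshold,
# as described in paragraph [0137].
selected = np.flatnonzero(np.abs(contribution) >= threshold)
print(selected)  # wavelengths 3, 5, and 6 in this made-up example
```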
[0138] Moreover, in each of the Examples, light having a plurality of wavelengths is selected; however, any light may be selected as long as its wavelength can be regarded as useful from the graph illustrated in the drawing.
[0139] The disclosure of Japanese Patent Application No. 2020-130535, filed on Jul. 31, 2020, is incorporated in the present specification by reference in its entirety. All documents, patent applications, and technical standards disclosed in the present specification are incorporated herein by reference to the same extent as a case in which the individual documents, patent applications, and technical standards were specifically and individually marked as being incorporated by reference.