OPTICAL MONITORING DEVICE AND METHOD AND DATA PROCESSING SYSTEM FOR DETERMINING INFORMATION FOR DISTINGUISHING BETWEEN TISSUE FLUID CELLS AND TISSUE CELLS
20220392061 · 2022-12-08
CPC classification (PHYSICS): G06T2207/20016; G06T7/30; G06V20/69
Abstract
The invention relates to a method for determining information for distinguishing between tissue fluid cells and tissue cells in a high-resolution image (34) of a tissue area. In the method, low-resolution images (33A-E) that were buffered at a high frame rate before the high-resolution image (34) was recorded are accessed, and the information for distinguishing between tissue fluid cells and tissue cells is obtained from these buffered low-resolution, high-frame-rate images (33A-E).
Claims
1. A computer-implemented method for determining information for distinguishing between tissue fluid cells and tissue cells in a high-resolution image of a tissue region, wherein low-resolution images with a high frame rate that were buffered before the high-resolution image was recorded are accessed, and the information for distinguishing between tissue fluid cells and tissue cells is obtained from the low-resolution buffered images with the high frame rate.
2. The computer-implemented method as claimed in claim 1, wherein the information for distinguishing between tissue fluid cells and tissue cells is obtained by virtue of a video sequence being generated from at least some of the low-resolution buffered images with the high frame rate.
3. The computer-implemented method as claimed in claim 1, wherein the information to distinguish between tissue fluid cells and tissue cells is obtained by virtue of an image background and those image elements that move in relation to the image background being determined in the low-resolution buffered images with the high frame rate, and the image elements that move in relation to the image background being represented with emphasis in an image as tissue fluid cells.
4. The computer-implemented method as claimed in claim 2, wherein the video sequence or the image elements represented with emphasis as tissue fluid cells are overlaid on the high-resolution image.
5. The computer-implemented method as claimed in claim 2, wherein the low-resolution buffered images with the high frame rate, which are used to generate the video sequence or are used to determine the image elements that move in relation to the image background, are registered in relation to a reference image.
6. The computer-implemented method as claimed in claim 1, wherein the tissue cells are classified.
7. The computer-implemented method as claimed in claim 6, wherein the tissue cells are labeled in the high-resolution image in accordance with their classification.
8. The computer-implemented method as claimed in claim 1, wherein a check is carried out for each low-resolution buffered image with the high frame rate as to whether it is suitable for obtaining the information for distinguishing between tissue fluid cells and tissue cells, and only the low-resolution buffered images with the high frame rate which were determined as suitable for obtaining the information for distinguishing between tissue fluid cells and tissue cells are used to obtain information for distinguishing between tissue fluid cells and tissue cells.
9. The computer-implemented method as claimed in claim 8, wherein a check is carried out as to whether a sufficient number of low-resolution buffered images with the high frame rate which are suitable for obtaining the information for distinguishing between tissue fluid cells and tissue cells are available and, should the check yield that a sufficient number of low-resolution buffered images with the high frame rate which are suitable for obtaining the information for distinguishing between tissue fluid cells and tissue cells are not available, the recording of further low-resolution images with the high frame rate is prompted.
10. A computer program for determining information for distinguishing between tissue fluid cells and tissue cells in a high-resolution image of a tissue region, said computer program comprising instructions which, when executed on a computer, prompt the computer to access low-resolution images with a high frame rate which were buffered before the high-resolution image was recorded and to obtain the information for distinguishing between tissue fluid cells and tissue cells from the low-resolution buffered images with the high frame rate.
11. A nonvolatile computer-readable storage medium with instructions stored thereon for determining information for distinguishing between tissue fluid cells and tissue cells in a high-resolution image of a tissue region, said instructions, when executed on a computer, prompting the computer to access low-resolution images with a high frame rate which were buffered before the high-resolution image was recorded and to obtain the information for distinguishing between tissue fluid cells and tissue cells from the low-resolution buffered images with the high frame rate.
12. A data processing system for determining information for distinguishing between tissue fluid cells and tissue cells in a high-resolution image of a tissue region, comprising a processor and at least one memory, the processor being configured, on the basis of instructions of a computer program stored in the memory, to access low-resolution images with a high frame rate which were buffered before the high-resolution image was recorded and to obtain the information for distinguishing between tissue fluid cells and tissue cells from the low-resolution buffered images with the high frame rate.
13. A method for recording a high-resolution image of a tissue region with assigned information for distinguishing between tissue fluid cells and tissue cells by means of a scanning imaging method, in which low-resolution images of the tissue region are recorded with a high frame rate in a first scanning mode and the low-resolution recorded images with the high frame rate are buffered for a certain period of time, and in which a trigger signal is followed by a change to a second scanning mode with a high resolution, in which a high-resolution image is recorded, and a determination of the information for distinguishing between tissue fluid cells and tissue cells in accordance with the steps of the method as claimed in claim 1.
14. The method as claimed in claim 13, wherein the recording of the high-resolution image is followed by a return into the first scanning mode and the method resumes with the recording of low-resolution images with the high frame rate.
15. The method as claimed in claim 13, wherein the recording of a high-resolution image triggers the obtainment of the information for distinguishing between tissue fluid cells and tissue cells.
16. An optical observation apparatus comprising scanning image recording equipment and a data processing system as claimed in claim 12.
Description
[0024] Further features, properties and advantages of the present invention will become apparent from the following description of exemplary embodiments with reference to the accompanying figures.
[0032] For explanatory purposes, the invention will be described in detail below on the basis of exemplary embodiments.
[0033] The endomicroscope 1 shown in the figures comprises an optical fiber 3 with a first end 5, which faces the observation object 9, and a second end 7, as well as a scanning device 11 for scanning the observation object 9.
[0034] The second end 7 of the optical fiber 3 faces a sensor 13, by means of which it is possible to capture luminous energy incident on the sensor 13. The sensor 13 is located in a housing 15, which is embodied as a separate module in the present exemplary embodiment but which can also be embodied as a handle, and in which, moreover, a light source (not illustrated in the figure) for generating illumination light for illuminating the observation object 9 and input coupling equipment for coupling the illumination light into the second end 7 of the optical fiber 3 are housed. In particular, the light source can be a laser light source. However, the light source can also be arranged outside of the housing 15 and be connected to the latter by way of a light guide. Then, the output end of the light guide is situated in the housing 15. In this case, the input coupling equipment couples the illumination light emerging from the output end of the light guide into the second end 7 of the optical fiber 3. The illumination light can be white light, i.e., have a broadband spectrum, or light with a spectrum that consists of one or more narrowband spectral ranges, for example of one or more narrowband spectral ranges suitable for exciting a fluorescence in the observation object 9.
[0035] Illumination light coupled into the second end 7 of the optical fiber 3 is guided through the optical fiber 3 to the first end 5, from where the illumination light emerges in the direction of the observation object 9. Illumination light reflected by the observation object 9 or light excited by the illumination light and emitted by the observation object 9, for instance fluorescent light, enters into the first end 5 of the optical fiber 3 in turn and is guided from the latter to the second end 7, from where it emerges in the direction of the sensor 13. Moreover, focusing optical units can be located at, or in front of, the ends 5, 7 of the optical fiber 3 and these can be used to focus light onto the surface of the observation object 9 or onto the sensor 13. In particular, the endomicroscope 1 can be embodied as a confocal endomicroscope. In addition or as an alternative thereto, it can also be embodied as an endomicroscope for carrying out optical coherence tomography (OCT). Confocal microscopy and optical coherence tomography are well-known methods and are described in US 2010/0157308 A1 and U.S. Pat. No. 9,921,406 B2, for example. Therefore, the description of details in respect of confocal microscopy and in respect of optical coherence tomography is dispensed with in the scope of the present description. Instead, reference is made to US 2010/0157308 A1 and U.S. Pat. No. 9,921,406 B2.
[0036] Recording the image with the aid of the endomicroscope 1 is controlled with the aid of a computer 17 in the present exemplary embodiment. However, the control can also be implemented by means of a dedicated control device. The computer 17 used for controlling in the present exemplary embodiment is connected both to the scanning device 11 and to the sensor 13. In the present exemplary embodiment, the scanning device 11 is controlled by the computer 17 in such a way that the observation object 9 is scanned along a grid 19 with grid points 21 (see the figures).
[0037] In the present exemplary embodiment, the grid comprises grid lines which extend in the x-direction.
[0039] To increase the frame rate there is the option of reducing the number of grid points 21 used during the scanning, as shown in the figures.
[0040] The omission of lines 25 during the scanning leads to a reduction in the image resolution in the y-direction, as is evident from the figures.
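The trade-off described in paragraphs [0039] and [0040] can be sketched as follows. The grid size and the dwell time per grid point below are illustrative assumptions, not values from the description; the sketch only illustrates that scanning every k-th line raises the frame rate roughly k-fold at the cost of y-resolution.

```python
# Sketch of the line-skipping trade-off: scanning only every line_step-th
# grid line divides the number of sampled lines, and hence the time per
# frame, by line_step, raising the frame rate accordingly.  Grid size and
# dwell time are illustrative assumptions.

def frame_rate(lines_total, points_per_line, dwell_s, line_step=1):
    """Frames per second when only every line_step-th grid line is scanned."""
    lines_scanned = (lines_total + line_step - 1) // line_step
    time_per_frame = lines_scanned * points_per_line * dwell_s
    return 1.0 / time_per_frame

full = frame_rate(512, 512, 1e-6)               # all 512 lines scanned
fast = frame_rate(512, 512, 1e-6, line_step=4)  # every 4th line only
```

With these assumed numbers, omitting three of every four lines quadruples the frame rate while the y-resolution drops by the same factor.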
[0041] While recording a video stream with a reduced resolution may be sufficient in some cases, for example while there is merely navigation to an examination site, there are situations in which a high-resolution image of the structures 23 of the observation object 9 is required. This applies, in particular, once the endomicroscope 1 has reached the examination site and the examination site is to be examined in respect of changes.
[0042] While the video stream is recorded, the low-resolution frames of the video stream are buffered in the memory of the computer 17 for a certain amount of time in the present exemplary embodiment, for the purposes of which use can be made, for example, of a circular buffer, i.e., a memory in which data are stored for a certain storage time and, in this case, the data in the memory for which the storage time has elapsed are overwritten with current data. When the user has reached the examination site, they can trigger the recording of a high-resolution image by virtue of entering a corresponding command into the computer 17 as a trigger signal. By way of example, the command can be a keyboard entry or, should the computer 17 be equipped to accept voice commands, a voice command. However, it is also possible to use an external input apparatus, which is connected to the computer 17 by wire or radio. By way of example, such an external input apparatus can be a foot switch, following the actuation of which a trigger signal is transmitted to the computer 17. Compared to a keyboard entry, foot switch and voice commands are advantageous in that the user of the endomicroscope 1 requires no hands for generating the trigger signal.
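The circular buffer described in paragraph [0042] can be sketched as follows. The buffer depth and the frame placeholders are illustrative assumptions; `collections.deque` with a `maxlen` reproduces the described behavior of overwriting the oldest data once the storage capacity is reached.

```python
# Minimal sketch of the circular frame buffer: the most recent low-resolution
# frames are kept, and frames whose storage time has elapsed are overwritten
# by current ones.  Frame content and buffer depth are illustrative.
from collections import deque

class FrameRingBuffer:
    def __init__(self, capacity):
        self._frames = deque(maxlen=capacity)  # oldest entries drop out automatically

    def push(self, frame):
        self._frames.append(frame)

    def snapshot(self):
        """Called on the trigger signal: return the buffered frames, oldest first."""
        return list(self._frames)

buf = FrameRingBuffer(capacity=5)
for i in range(8):            # simulate a stream of 8 incoming frames
    buf.push(f"frame-{i}")
frames = buf.snapshot()       # only the 5 most recent frames remain
```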
[0043] An embodiment of the computer-implemented method for determining information for distinguishing between tissue fluid cells and cells embedded in an extracellular matrix in a high-resolution image is explained below on the basis of the flowchart depicted in the figures.
[0044] In the present exemplary embodiment, the computer-implemented method is carried out on the computer 17. Its implementation is triggered by a trigger signal, for example by actuating a foot switch, by way of a keyboard input or by a voice command. The trigger signal may be a signal intended only for carrying out the method but, as in the previous embodiment, it may also be a signal coupled to the triggering of the recording of a high-resolution image 34.
[0045] After the buffered frames 33A-E have been accessed in step S2, the quality of the frames 33A-E is evaluated in step S3 and the frames whose quality is too low are rejected. The evaluation of the frames in respect of their quality can be implemented, in particular, in view of the sharpness of the respective frame, the presence of movement artifacts, the contrast, etc. In particular, a quality parameter can be determined on the basis of the image sharpness, the contrast, the presence of movement artifacts, etc. Those low-resolution frames 33A-E which do not reach a specified value of the quality parameter are rejected. In the process, there also is the option of defining a plurality of quality parameters and rejecting all those images which do not reach the specified value for at least one quality parameter.
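The quality check of step S3 can be sketched as follows. The description leaves the exact quality parameters open; the contrast and sharpness measures and the threshold values below are illustrative assumptions. As in the description, a frame is rejected as soon as it fails the specified value for at least one quality parameter.

```python
# Sketch of the quality check in step S3 with two assumed quality parameters:
# a contrast measure and a crude sharpness (focus) measure.  Thresholds and
# frame contents are illustrative.

def contrast(frame):
    flat = [p for row in frame for p in row]
    return max(flat) - min(flat)

def sharpness(frame):
    # mean absolute horizontal gradient as a simple focus measure
    diffs = [abs(row[i + 1] - row[i]) for row in frame for i in range(len(row) - 1)]
    return sum(diffs) / len(diffs)

def passes_quality(frame, min_contrast=50, min_sharpness=5):
    # rejected if at least one quality parameter falls below its threshold
    return contrast(frame) >= min_contrast and sharpness(frame) >= min_sharpness

crisp = [[0, 100, 0], [100, 0, 100]]        # strong edges: retained
flat_frame = [[10, 12, 11], [11, 10, 12]]   # low contrast: rejected
kept = [f for f in (crisp, flat_frame) if passes_quality(f)]
```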
[0046] Frames whose quality has a sufficient value are subjected to a check in step S4 to the effect of whether there is a sufficient overlap of the image fields of the respective frames. The frames for which there is no sufficient overlap of the image fields are rejected. To determine a sufficient overlap, a reference image can for example be determined from the low-resolution frames 33A-E and the overlap of the remaining frames 33A-E with the reference image can be determined. By way of example, the overlap can be represented by a numerical value which indicates what portion of the image field of the respective frame 33A-E corresponds to the image field of the reference image. Even though the frames with an insufficient image quality are rejected first in the present embodiment before the frames which do not have sufficient overlap with the reference image are rejected, it is also possible to reverse the sequence of the test, that is to say first reject those frames which do not have a sufficient overlap with the reference image and subsequently reject those frames whose quality is insufficient. Instead of a frame, the high-resolution image 34 may also serve as a reference image.
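The overlap criterion of step S4 can be sketched as follows. The description only requires a numerical value indicating what portion of a frame's image field corresponds to that of the reference image; representing the image fields as axis-aligned rectangles `(x, y, width, height)` and the 50% threshold are illustrative assumptions.

```python
# Sketch of the overlap check in step S4: the overlap is the fraction of a
# frame's image field that coincides with the image field of the reference
# image.  Image fields are modeled as axis-aligned rectangles (an assumption).

def overlap_fraction(field, ref_field):
    fx, fy, fw, fh = field
    rx, ry, rw, rh = ref_field
    ix = max(0, min(fx + fw, rx + rw) - max(fx, rx))  # intersection width
    iy = max(0, min(fy + fh, ry + rh) - max(fy, ry))  # intersection height
    return (ix * iy) / (fw * fh)

ref = (0, 0, 100, 100)
near = overlap_fraction((10, 10, 100, 100), ref)   # slightly shifted frame
far = overlap_fraction((90, 90, 100, 100), ref)    # strongly shifted frame
kept_near = near >= 0.5    # sufficient overlap: frame retained
kept_far = far >= 0.5      # insufficient overlap: frame rejected
```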
[0047] After the images whose quality was insufficient or which did not have a sufficient overlap have been rejected in steps S3 and S4, a check is carried out in step S5 as to whether the remaining number of low-resolution frames 33A-E is sufficient to determine the information for distinguishing between tissue fluid cells and tissue cells embedded in an extracellular matrix. By way of example, a lower limit for the number of frames, which should not be undershot, can be specified to this end. Additionally or as an alternative, it is possible to specify a maximum time interval between two low-resolution frames 33A-E which should not be exceeded. Further additionally or as a further alternative, it is also possible to specify an overall time interval that should be covered by the low-resolution frames 33A-E.
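The sufficiency check of step S5 can be sketched with the three criteria named above: a minimum number of frames, a maximum time interval between two consecutive frames, and a minimum overall time interval covered. The threshold values are illustrative assumptions.

```python
# Sketch of the sufficiency check in step S5.  Thresholds are illustrative.

def frames_sufficient(timestamps_s, min_count=10, max_gap_s=0.2, min_span_s=1.0):
    if len(timestamps_s) < min_count:
        return False                        # lower limit on the number of frames
    ts = sorted(timestamps_s)
    gaps = [b - a for a, b in zip(ts, ts[1:])]
    if gaps and max(gaps) > max_gap_s:
        return False                        # interval between frames exceeded
    return ts[-1] - ts[0] >= min_span_s     # overall interval to be covered

good = frames_sufficient([i * 0.1 for i in range(12)])   # 12 frames over 1.1 s
sparse = frames_sufficient([0.0, 0.5, 1.0, 1.5])         # too few frames, long gaps
```

If the check fails, further low-resolution frames are added as in step S6 and the check is repeated.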
[0048] Should the number of low-resolution frames 33A-E be determined in step S5 as being insufficient for determining the information for distinguishing between tissue fluid cells and tissue cells embedded in an extracellular matrix, the method advances to step S6, in which further low-resolution frames 35A-C are added to the already available frames 33A-E. Typically, the endomicroscope 1 continues to record low-resolution frames 35A-C after the high-resolution image has been created. Therefore, the low-resolution frames 35A-C recorded after the high-resolution image has been created merely need to be accessed in step S6. Should the endomicroscope 1 not automatically continue recording low-resolution frames 35A-C after the high-resolution image 34 was created, recording a number of low-resolution frames 35A-C is triggered in step S6. Independently of whether recording a number of low-resolution frames 35A-C is triggered in step S6 or whether the endomicroscope 1 automatically continues recording low-resolution frames 35A-C after the high-resolution image 34 was created, the method returns from step S6 to step S3 in order to carry out the check of steps S3, S4 and S5 again. This is carried out until a sufficient number of low-resolution frames 33A-E, 35A-C is determined as being available in step S5.
[0050] As soon as a sufficient number of low-resolution frames 33A-E, 35A-C are determined as being available in step S5, the method advances to step S7, in which the low-resolution frames are registered in relation to the reference image. If the reference image is the high-resolution image 34, it is advantageous for the quality of the registration if pixel values for the pixels omitted in the low-resolution frames 33A-E, 35A-C are calculated by means of an interpolation of the pixels that were not omitted.
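The interpolation mentioned in step S7 can be sketched as follows: before registration against the high-resolution reference image, pixel values for the omitted scan lines are synthesized from the neighboring scanned lines. Linear interpolation and the frame content are illustrative assumptions.

```python
# Sketch of the interpolation before registration (step S7): a frame scanned
# with every line_step-th line is expanded back to full height by linearly
# interpolating the omitted lines.  Frame content is illustrative.

def interpolate_omitted_lines(scanned_rows, line_step):
    """Expand a line-skipped frame to full height by linear interpolation."""
    full = []
    for a, b in zip(scanned_rows, scanned_rows[1:]):
        full.append(a)
        for k in range(1, line_step):       # synthesize the omitted lines
            t = k / line_step
            full.append([(1 - t) * pa + t * pb for pa, pb in zip(a, b)])
    full.append(scanned_rows[-1])
    return full

frame = interpolate_omitted_lines([[0, 0], [4, 8]], line_step=2)
# the synthesized line lies halfway between its scanned neighbours
```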
[0051] Using the comparison of the registered frames with one another as a basis, the image elements which in each case adopt the same position in the image field in the frames 33A-E, 35A-C are then determined in step S8 of the present embodiment. These image elements form the image background 29 and represent tissue cells embedded in the extracellular matrix, whereas image elements 31 that move in relation to the image background represent tissue fluid cells.
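Step S8 can be sketched under a common assumption: the image background is estimated as the per-pixel median over the registered frames, and pixels of an individual frame that deviate strongly from that background are flagged as moving image elements. The description does not prescribe this particular estimator; the frames and the threshold are illustrative.

```python
# Sketch of step S8: per-pixel median over the registered frames as the image
# background, and a deviation threshold to flag moving image elements
# (tissue fluid cells).  Estimator, threshold and frames are assumptions.
from statistics import median

def background(frames):
    h, w = len(frames[0]), len(frames[0][0])
    return [[median(f[y][x] for f in frames) for x in range(w)] for y in range(h)]

def moving_mask(frame, bg, threshold=20):
    return [[abs(p - b) > threshold for p, b in zip(frow, brow)]
            for frow, brow in zip(frame, bg)]

# three registered 1x3 frames: only the middle pixel changes between frames
frames = [[[10, 10, 10]], [[10, 90, 10]], [[10, 10, 10]]]
bg = background(frames)              # static pixels dominate the median
mask = moving_mask(frames[1], bg)    # flags the transient bright pixel
```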
[0052] After the information regarding which image elements represent tissue cells embedded in an extracellular matrix and which image elements represent tissue fluid cells was obtained in step S8, the information obtained is suitably processed for display in step S9. By way of example, the preparation may include the tissue cells embedded in the extracellular matrix or the tissue fluid cells being emphasized by color in the high-resolution image. Alternatively, there is also the option of generating a short video sequence on the basis of the low-resolution frames 33A-E, 35A-C, said video sequence being overlaid on the high-resolution image following a registration with the high-resolution image 34. Instead of an overlay, there is also the option of presenting the video sequence next to the high-resolution image 34. The prepared information is then output for display on an external monitor (step S10). Alternatively, the information can also be presented on the monitor of the computer 17. The method ends with the presentation of the information.
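The color emphasis mentioned in step S9 can be sketched as a simple overlay: image elements identified as tissue fluid cells are replaced with a marker color in an RGB copy of the high-resolution image. The marker color and the image content are illustrative assumptions.

```python
# Sketch of the preparation in step S9: flagged image elements are emphasized
# by a marker colour in an RGB copy of the grey-value image.  Colour choice
# and image content are illustrative.

HIGHLIGHT = (255, 0, 0)  # assumed marker colour for tissue fluid cells

def emphasize(gray_image, fluid_mask):
    return [[HIGHLIGHT if m else (g, g, g)
             for g, m in zip(grow, mrow)]
            for grow, mrow in zip(gray_image, fluid_mask)]

image = [[10, 200, 10]]
mask = [[False, True, False]]
rgb = emphasize(image, mask)   # only the flagged pixel is recoloured
```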
[0053] The tissue cells can be classified following the determination of the information which allows the tissue cells embedded in the extracellular matrix to be distinguished from the tissue fluid cells. Such a classification can be implemented by means of suitable software and, for example, be based on methods as described in the following publications: A. BenTaieb et al. "Deep Learning Models for Digital Pathology", arXiv:1910.12329v2 [cs.CV], 29 Oct. 2019; A. Bizzego et al. "Evaluating reproducibility of AI algorithms in digital pathology with DAPPER" in PLoS Comput Biol 15(3):e1006269, Mar. 27, 2019; and T. Fuchs et al. "Computational pathology: Challenges and promises for tissue analysis" in Computerized Medical Imaging and Graphics 35 (2011), pages 515-530. With the aid of the classification, it is possible to label the tissue cells differently in the present exemplary embodiment. By way of example, labeling allows a distinction to be made in the high-resolution image between tissue cells which do not have a change in relation to the normal state and those which do have a change.
[0054] The present invention has been described in detail on the basis of exemplary embodiments for explanatory purposes. However, a person skilled in the art recognizes that there can be deviations from the exemplary embodiments within the scope of the present invention, as has also already been indicated within the scope of the exemplary embodiments. Thus, it is for example possible to merely merge the registered low-resolution frames into a short video sequence that is presented next to the high-resolution image instead of the option of determining, on the basis of the low-resolution frames, image regions forming the background and image regions that move in relation to the background. In the video sequence constructed from the registered low-resolution frames, a user of the endomicroscope can identify the tissue fluid cells on the basis of their movement presented in the short video sequence.
[0055] Hence, an automated determination of the image regions that form the image background and of the image regions that move in relation to the determined image background, as was implemented in the described exemplary embodiment, is not mandatory. A person skilled in the art recognizes that further modifications of the described exemplary embodiments are possible.
[0056] Therefore, the present invention is intended to be restricted only by the appended claims.
LIST OF REFERENCE SIGNS
[0057] 1 Endomicroscope
[0058] 3 Optical fiber
[0059] 5 First end
[0060] 7 Second end
[0061] 9 Observation object
[0062] 11 Scanning device
[0063] 13 Sensor
[0064] 15 Housing
[0065] 17 Computer
[0066] 19 Grid
[0067] 21 Grid point
[0068] 23 Structure
[0069] 25 Omitted lines
[0070] 27 Scanned lines
[0071] 29 Image background-forming image elements
[0072] 31 Image elements moving in relation to the image background
[0073] 33 Low-resolution images
[0074] 34 High-resolution image
[0075] S1 Start
[0076] S2 Reading buffered images
[0077] S3 Selection according to image quality
[0078] S4 Selection according to image field
[0079] S5 Checking whether a sufficient number of low-resolution frames is available
[0080] S6 Adding further low-resolution frames
[0081] S7 Registering the low-resolution frames
[0082] S8 Determining the image elements forming the image background and image elements moving in relation to the image background
[0083] S9 Processing the information
[0084] S10 Outputting the information for display purposes