OPTICAL MONITORING DEVICE AND METHOD AND DATA PROCESSING SYSTEM FOR DETERMINING INFORMATION FOR DISTINGUISHING BETWEEN TISSUE FLUID CELLS AND TISSUE CELLS

20220392061 · 2022-12-08

    Abstract

    The invention relates to a method for determining information for distinguishing between tissue fluid cells and tissue cells in a high-resolution image (34) of a tissue region. In the method, low-resolution images (33A-E) that were buffered at a high frame rate before the high-resolution image (34) was recorded are accessed, and the information for distinguishing between tissue fluid cells and tissue cells is obtained from these buffered low-resolution, high-frame-rate images (33A-E).

    Claims

    1. A computer-implemented method for determining information for distinguishing between tissue fluid cells and tissue cells in a high-resolution image of a tissue region, wherein low-resolution images with a high frame rate that were buffered before the high-resolution image was recorded are accessed, and the information for distinguishing between tissue fluid cells and tissue cells is obtained from the low-resolution buffered images with the high frame rate.

    2. The computer-implemented method as claimed in claim 1, wherein the information for distinguishing between tissue fluid cells and tissue cells is obtained by virtue of a video sequence being generated from at least some of the low-resolution buffered images with the high frame rate.

    3. The computer-implemented method as claimed in claim 1, wherein the information to distinguish between tissue fluid cells and tissue cells is obtained by virtue of an image background and those image elements that move in relation to the image background being determined in the low-resolution buffered images with the high frame rate, and the image elements that move in relation to the image background being represented with emphasis in an image as tissue fluid cells.

    4. The computer-implemented method as claimed in claim 2, wherein the video sequence or the image elements represented with emphasis as tissue fluid cells are overlaid on the high-resolution image.

    5. The computer-implemented method as claimed in claim 2, wherein the low-resolution buffered images with the high frame rate, which are used to generate the video sequence or are used to determine the image elements that move in relation to the image background, are registered in relation to a reference image.

    6. The computer-implemented method as claimed in claim 1, wherein the tissue cells are classified.

    7. The computer-implemented method as claimed in claim 6, wherein the tissue cells are labeled in the high-resolution image in accordance with their classification.

    8. The computer-implemented method as claimed in claim 1, wherein a check is carried out for each low-resolution buffered image with the high frame rate as to whether it is suitable for obtaining the information for distinguishing between tissue fluid cells and tissue cells, and only the low-resolution buffered images with the high frame rate which were determined as suitable for obtaining the information for distinguishing between tissue fluid cells and tissue cells are used to obtain information for distinguishing between tissue fluid cells and tissue cells.

    9. The computer-implemented method as claimed in claim 8, wherein a check is carried out as to whether a sufficient number of low-resolution buffered images with the high frame rate which are suitable for obtaining the information for distinguishing between tissue fluid cells and tissue cells are available and, should the check yield that a sufficient number of low-resolution buffered images with the high frame rate which are suitable for obtaining the information for distinguishing between tissue fluid cells and tissue cells are not available, this prompts the recording of further low-resolution images with the high frame rate.

    10. A computer program for determining information for distinguishing between tissue fluid cells and tissue cells in a high-resolution image of a tissue region, said computer program comprising instructions which, when executed on a computer, prompt the computer to access low-resolution images with a high frame rate which were buffered before the high-resolution image was recorded and to obtain the information for distinguishing between tissue fluid cells and tissue cells from the low-resolution buffered images with the high frame rate.

    11. A nonvolatile computer-readable storage medium with instructions stored thereon for determining information for distinguishing between tissue fluid cells and tissue cells in a high-resolution image of a tissue region, said instructions, when executed on a computer, prompting the computer to access low-resolution images with a high frame rate which were buffered before the high-resolution image was recorded and to obtain the information for distinguishing between tissue fluid cells and tissue cells from the low-resolution buffered images with the high frame rate.

    12. A data processing system for determining information for distinguishing between tissue fluid cells and tissue cells in a high-resolution image of a tissue region, comprising a processor and at least one memory, the processor being configured, on the basis of instructions of a computer program stored in the memory, to access low-resolution images with a high frame rate which were buffered before the high-resolution image was recorded and to obtain the information for distinguishing between tissue fluid cells and tissue cells from the low-resolution buffered images with the high frame rate.

    13. A method for recording a high-resolution image of a tissue region with assigned information for distinguishing between tissue fluid cells and tissue cells by means of a scanning imaging method, in which low-resolution images of the tissue region are recorded with a high frame rate in a first scanning mode and the low-resolution recorded images with the high frame rate are buffered for a certain period of time, and in which a trigger signal is followed by a change to a second scanning mode with a high resolution, in which a high-resolution image is recorded, and a determination of the information for distinguishing between tissue fluid cells and tissue cells in accordance with the steps of the method as claimed in claim 1.

    14. The method as claimed in claim 13, wherein the recording of the high-resolution image is followed by a return into the first scanning mode and the method resumes with the recording of low-resolution images with the high frame rate.

    15. The method as claimed in claim 13, wherein the recording of a high-resolution image triggers the obtainment of the information for distinguishing between tissue fluid cells and tissue cells.

    16. An optical observation apparatus comprising scanning image recording equipment and a data processing system as claimed in claim 12.

    Description

    [0024] Further features, properties and advantages of the present invention will become apparent from the following description of exemplary embodiments with reference to the accompanying figures.

    [0025] FIG. 1 shows a very schematic illustration of an endomicroscope which is configured to carry out a scanning imaging method.

    [0026] FIG. 2 shows a schematic illustration of a high-resolution image.

    [0027] FIG. 3 shows schematic illustrations of a low-resolution image.

    [0028] FIG. 4 shows, on the basis of a flowchart, an example of the computer-implemented method for determining information for distinguishing between tissue fluid cells and tissue cells in a high-resolution image.

    [0029] FIG. 5 shows a sequence of recorded images.

    [0030] FIG. 6 shows a very schematic illustration of an image with tissue cells and a tissue fluid cell in a first position.

    [0031] FIG. 7 shows the image of FIG. 6 with the tissue fluid cell in a second position.

    [0032] For explanatory purposes, the invention will be described in detail below on the basis of exemplary embodiments. Here, FIG. 1 shows an endomicroscope with a scanning device as an exemplary embodiment of an optical observation apparatus with an image recording device designed to record images of an object, said images being composed of a pixel grid. FIGS. 2 and 3 show very schematic images that have been obtained on the basis of scans carried out by the endomicroscope.

    [0033] The endomicroscope 1 shown in FIG. 1 comprises an optical fiber 3 with an input end 5 and an output end 7. The input end 5 faces the observation object 9 and is located in a scanning device 11, with the aid of which the end 5 can be moved along two lateral directions, referred to as x-direction and y-direction below, with respect to the observation object 9. In particular, the scanning device can be realized by means of microelectromechanical systems (MEMS). By way of example, a scanning device using microelectromechanical systems is described in US 2016/0051131 A1. Reference is made to this document in respect of the structure of a suitable scanning device. Alternatively, if the fiber end 5 is stationary, the observation object 9 can be scanned with the aid of a movable microelectromechanical mirror (MEMS mirror) or with the aid of a plurality of movable microelectromechanical mirrors.

    [0034] The second end 7 of the optical fiber 3 faces a sensor 13, by means of which it is possible to capture luminous energy incident on the sensor 13. The sensor 13 is located in a housing 15, which is embodied as a separate module in the present exemplary embodiment but which can also be embodied as a handle, and in which, moreover, a light source (not illustrated in the figure) for generating illumination light for illuminating the observation object 9 and input coupling equipment for coupling the illumination light into the second end 7 of the optical fiber 3 are housed. In particular, the light source can be a laser light source. However, the light source can also be arranged outside of the housing 15 and be connected to the latter by way of a light guide. Then, the output end of the light guide is situated in the housing 15. In this case, the input coupling equipment couples the illumination light emerging from the output end of the light guide into the second end 7 of the optical fiber 3. The illumination light can be white light, i.e., have a broadband spectrum, or light with a spectrum that consists of one or more narrowband spectral ranges, for example of one or more narrowband spectral ranges suitable for exciting a fluorescence in the observation object 9.

    [0035] Illumination light coupled into the second end 7 of the optical fiber 3 is guided through the optical fiber 3 to the first end 5, from where the illumination light emerges in the direction of the observation object 9. Illumination light reflected by the observation object 9 or light excited by the illumination light and emitted by the observation object 9, for instance fluorescent light, enters into the first end 5 of the optical fiber 3 in turn and is guided from the latter to the second end 7, from where it emerges in the direction of the sensor 13. Moreover, focusing optical units can be located at, or in front of, the ends 5, 7 of the optical fiber 3 and these can be used to focus light onto the surface of the observation object 9 or onto the sensor 13. In particular, the endomicroscope 1 can be embodied as a confocal endomicroscope. In addition or as an alternative thereto, it can also be embodied as an endomicroscope for carrying out optical coherence tomography (OCT). Confocal microscopy and optical coherence tomography are well-known methods and are described in US 2010/0157308 A1 and U.S. Pat. No. 9,921,406 B2, for example. Therefore, the description of details in respect of confocal microscopy and in respect of optical coherence tomography is dispensed with in the scope of the present description. Instead, reference is made to US 2010/0157308 A1 and U.S. Pat. No. 9,921,406 B2.

    [0036] In the present exemplary embodiment, the recording of images with the endomicroscope 1 is controlled by a computer 17. However, the control can also be implemented by means of a dedicated control device. The computer 17 used for control in the present exemplary embodiment is connected both to the scanning device 11 and to the sensor 13. The scanning device 11 is controlled by the computer 17 in such a way that the observation object 9 is scanned along a grid 19 with grid points 21 (see FIG. 2). At each scanned grid point 21, the observation object 9 is illuminated with illumination light, and the reflected illumination light or the light emitted by the observation object 9 on account of an excitation by the illumination light is recorded. The computer 17 then generates an image from the light recorded at the grid points 21, the pixel grid of said image corresponding to the grid 19 used during the scanning. The optical fiber 3, the scanning device 11, the sensor 13 and the computer 17 therefore together form an image recording device, in which the computer 17 serves as image generation device.

    [0037] In the present exemplary embodiment, the grid comprises grid lines extending in the x-direction in FIG. 2 and grid columns extending in the y-direction in FIG. 2. Scanning of the observation object is carried out line-by-line, i.e., a line is scanned along the x-direction and, after the line has been completed, the optical fiber 3 is offset in the y-direction before the next line extending in the x-direction is scanned. During the scanning procedure, the sensor 13 records a signal for each grid point 21 at which the optical fiber is situated at the time of the recording. In this way, an image of the observation object 9, as shown schematically in FIG. 2, is generated line-by-line with the aid of the sensor 13 and the scanning device 11.

    [0038] FIG. 2 very schematically shows an image in which all grid points 21 of the grid 19 have been used when scanning the observation object 9. Here, FIG. 2 schematically shows a structure 23 of the observation object 9, which is represented by a ring for illustration purposes. Grid points 21 situated over the structure 23 lead to a different signal on the sensor 13 than those grid points 21 that are not situated over the structure 23. In FIG. 2, the signal generated by the grid points 21 situated over the structure 23 is represented by hatched grid points 21. A high resolution of structures 23 in the observation object 9 is possible with small dimensions of the grid points 21 and correspondingly small pitches between the grid points 21, as are facilitated by the use of the optical fiber 3. On account of the large number of grid points to be scanned in connection with the high resolution, the generation of a high-resolution image with the aid of the scanning imaging method requires a comparatively long time. If a video sequence is recorded using the scanning imaging method, only low frame rates can therefore be achieved on account of the time required for recording each frame.

    [0039] To increase the frame rate, there is the option of reducing the number of grid points 21 used during the scanning, as shown in FIG. 3, in order to increase the speed with which the scan for a frame can be carried out. To this end, in the present exemplary embodiment the scanning device 11 can be controlled by the computer 17 in such a way that certain lines 25 are omitted when scanning along the grid 19. Expressed differently, only every n-th line is scanned, as illustrated schematically in FIG. 3 where, by way of example, only every third line is scanned. The grid points 21 of the lines 27 used during the scanning are represented by full lines in the figure, while the grid points 21 of the lines 25 omitted during the scanning are represented by dashed lines. In the shown illustration, only every third line of the grid 19 is used during the scanning, and so the frame rate can be approximately tripled. In reality, more than two lines are omitted between two scanned lines in order to increase the frame rate to at least 4 fps.
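
The frame-rate gain from skipping lines can be illustrated with a small calculation (a hypothetical sketch; the line count, skip factor, and per-line scan time below are assumed values, not taken from the document):

```python
def low_res_frame_rate(total_lines, skip_factor, line_time_s):
    """Approximate frame rate when only every skip_factor-th grid line is scanned.

    total_lines: number of grid lines in the full high-resolution grid
    skip_factor: scan only every n-th line (n = skip_factor)
    line_time_s: time needed to scan a single line, in seconds
    """
    scanned_lines = -(-total_lines // skip_factor)  # ceiling division
    return 1.0 / (scanned_lines * line_time_s)

# Hypothetical numbers: a 512-line grid at 2 ms per line gives roughly
# 1 fps at full resolution; scanning only every fifth line raises this
# to almost 5 fps, consistent with the "at least 4 fps" target above.
full_rate = low_res_frame_rate(512, 1, 0.002)
fast_rate = low_res_frame_rate(512, 5, 0.002)
```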

    [0040] The omission of lines 25 during the scanning leads to a reduction in the image resolution in the y-direction. It is evident from FIG. 3 that, on account of the reduced number of lines, fewer grid points 21 are available in the image for representing the structure 23 than when using all lines.

    [0041] While recording a video stream with a reduced resolution may be sufficient in some cases, for example while merely navigating to an examination site, there are situations in which a high-resolution image of the structures 23 of the observation object 9 is required. This applies, in particular, once the endomicroscope 1 has reached the examination site and the examination site is to be examined for changes.

    [0042] While the video stream is recorded, the low-resolution frames of the video stream are buffered in the memory of the computer 17 for a certain amount of time in the present exemplary embodiment. For this purpose, use can be made, for example, of a circular buffer, i.e., a memory in which data are stored for a certain storage time and in which data whose storage time has elapsed are overwritten with current data. When the user has reached the examination site, they can trigger the recording of a high-resolution image by entering a corresponding command into the computer 17 as a trigger signal. By way of example, the command can be a keyboard entry or, should the computer 17 be equipped to accept voice commands, a voice command. However, it is also possible to use an external input apparatus connected to the computer 17 by wire or radio. By way of example, such an external input apparatus can be a foot switch, the actuation of which transmits a trigger signal to the computer 17. Compared to a keyboard entry, foot switches and voice commands are advantageous in that the user of the endomicroscope 1 requires no hands to generate the trigger signal.
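
The circular-buffer behavior described above can be sketched in a few lines (a minimal illustration; the buffer length and frame rate are assumed values, since the document does not specify them):

```python
from collections import deque

# Hypothetical buffer dimensions, assumed for illustration only.
BUFFER_SECONDS = 5
FRAME_RATE = 4  # low-resolution frames per second

# A deque with maxlen behaves like the circular buffer described above:
# once it is full, appending a new frame overwrites the oldest one.
frame_buffer = deque(maxlen=BUFFER_SECONDS * FRAME_RATE)

def on_new_low_res_frame(frame):
    """Buffer every low-resolution frame of the video stream."""
    frame_buffer.append(frame)  # the oldest frame is dropped automatically

def on_trigger():
    """On the trigger signal, snapshot the frames buffered before the
    high-resolution image; step S2 of the method then accesses them."""
    return list(frame_buffer)
```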

    [0043] An embodiment of the computer-implemented method for determining information for distinguishing between tissue fluid cells and cells embedded in an extracellular matrix in a high-resolution image is explained below on the basis of the flowchart depicted in FIG. 4.

    [0044] In the present exemplary embodiment, the computer-implemented method is carried out on the computer 17. It is triggered by a trigger signal, for example by actuating a foot switch, by way of a keyboard input or by a voice command. The trigger signal may be a signal intended only for carrying out the method but, as in the previous embodiment, it may also be a signal coupled to the triggering of a high-resolution image 34 (see FIG. 5). Once the method has been started in step S1 following the trigger signal, the method in step S2 accesses the low-resolution frames 33A-E of the video stream stored in the circular buffer, which were recorded before the high-resolution image 34 was recorded. Should the low-resolution frames 33A-E not be stored in a memory of the computer 17 but be stored externally, for example in a memory of the endomicroscope 1, accessing the low-resolution frames 33A-E of the video stream also comprises reading the low-resolution frames 33A-E into the computer 17. It should be noted at this point that the low-resolution frames 33A-E are the frames of a video stream in the present embodiment but that this is not mandatory for the present invention.

    [0045] After the buffered frames 33A-E have been accessed in step S2, the quality of the frames 33A-E is evaluated in step S3 and frames whose quality is too low are rejected. The frames can be evaluated in respect of their quality in view of, in particular, the sharpness of the respective frame, the presence of movement artifacts, the contrast, etc. In particular, a quality parameter can be determined on the basis of the image sharpness, the contrast, the presence of movement artifacts, etc. Those low-resolution frames 33A-E which do not reach a specified value of the quality parameter are rejected. It is also possible to define a plurality of quality parameters and to reject all images that do not reach the specified value for at least one quality parameter.
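
The quality check of step S3 can be sketched as follows. This is a minimal illustration: the two metrics shown (gradient energy as a sharpness proxy, pixel standard deviation as a contrast proxy) and the thresholds are assumptions, since the document does not prescribe specific quality parameters:

```python
import numpy as np

def quality_ok(frame, sharpness_min=0.05, contrast_min=0.1):
    """Return True if a frame passes both assumed quality parameters.

    contrast:  standard deviation of the pixel values.
    sharpness: mean squared gradient magnitude (a simple focus measure).
    A frame is rejected if it misses the threshold for at least one
    quality parameter, as described in the text.
    """
    frame = np.asarray(frame, dtype=float)
    contrast = frame.std()
    gy, gx = np.gradient(frame)
    sharpness = float(np.mean(gx**2 + gy**2))
    return contrast >= contrast_min and sharpness >= sharpness_min
```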

    [0046] Frames of sufficient quality are checked in step S4 as to whether there is a sufficient overlap of the image fields of the respective frames. Frames for which there is no sufficient overlap of the image fields are rejected. To determine a sufficient overlap, a reference image can, for example, be determined from the low-resolution frames 33A-E and the overlap of the remaining frames 33A-E with the reference image can be determined. By way of example, the overlap can be represented by a numerical value which indicates what portion of the image field of the respective frame 33A-E corresponds to the image field of the reference image. Although in the present embodiment the frames with insufficient image quality are rejected before the frames without sufficient overlap with the reference image, the sequence of the tests can also be reversed, that is to say those frames without a sufficient overlap with the reference image are rejected first and those frames whose quality is insufficient are rejected subsequently. Instead of a frame, the high-resolution image 34 may also serve as the reference image.
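
If, for instance, the lateral displacement of a frame relative to the reference image is known (e.g., from a registration step), the numerical overlap value mentioned above can be computed as the fraction of the image field shared with the reference image. This is a hypothetical sketch; the document does not prescribe how the overlap value is calculated, and the 80% minimum is an assumed threshold:

```python
def overlap_fraction(shift_x, shift_y, width, height):
    """Portion of a frame's image field that coincides with the reference
    image field when the frame is displaced by (shift_x, shift_y) pixels."""
    overlap_x = max(0, width - abs(shift_x))
    overlap_y = max(0, height - abs(shift_y))
    return (overlap_x * overlap_y) / (width * height)

def has_sufficient_overlap(shift_x, shift_y, width, height, minimum=0.8):
    """Reject frames below an assumed minimum overlap (here 80%)."""
    return overlap_fraction(shift_x, shift_y, width, height) >= minimum
```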

    [0047] After the images whose quality was insufficient or which did not have a sufficient overlap have been rejected in steps S3 and S4, a check is carried out in step S5 as to whether the remaining number of low-resolution frames 33A-E is sufficient to determine the information for distinguishing between tissue fluid cells and tissue cells embedded in an extracellular matrix. By way of example, a lower limit for the number of frames, which should not be undershot, can be specified to this end. Additionally or as an alternative, it is possible to specify a maximum time interval between two low-resolution frames 33A-E which should not be exceeded. Further additionally or as a further alternative, it is also possible to specify an overall time interval that should be covered by the low-resolution frames 33A-E.

    [0048] Should the number of low-resolution frames 33A-E be determined in step S5 as being insufficient for determining the information for distinguishing between tissue fluid cells and tissue cells embedded in an extracellular matrix, the method advances to step S6, in which further low-resolution frames 35A-C are added to the already available frames 33A-E. Typically, the endomicroscope 1 continues to record low-resolution frames 35A-C after the high-resolution image has been created. Therefore, the low-resolution frames 35A-C recorded after the high-resolution image has been created merely need to be accessed in step S6. Should the endomicroscope 1 not automatically continue recording low-resolution frames 35A-C after the high-resolution image 34 was created, recording a number of low-resolution frames 35A-C is triggered in step S6. Independently of whether recording a number of low-resolution frames 35A-C is triggered in step S6 or whether the endomicroscope 1 automatically continues recording low-resolution frames 35A-C after the high-resolution image 34 was created, the method returns from step S6 to step S3 in order to carry out the check of steps S3, S4 and S5 again. This is carried out until a sufficient number of low-resolution frames 33A-E, 35A-C is determined as being available in step S5.
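
The sufficiency check of step S5 can combine the three criteria named above: a minimum frame count, a maximum time interval between consecutive frames, and a minimum overall time interval covered. A sketch with assumed threshold values:

```python
def enough_frames(timestamps, min_count=5, max_gap=0.5, min_span=1.0):
    """Check whether the remaining low-resolution frames suffice.

    timestamps: recording times (in seconds) of the frames that survived
    the quality and overlap checks of steps S3 and S4.  All three
    threshold values are assumptions for illustration.
    """
    if len(timestamps) < min_count:
        return False  # lower limit for the number of frames undershot
    ts = sorted(timestamps)
    if any(b - a > max_gap for a, b in zip(ts, ts[1:])):
        return False  # maximum interval between two frames exceeded
    return ts[-1] - ts[0] >= min_span  # overall time interval covered?
```

When the check fails, further frames are added as in step S6 and the checks of steps S3 to S5 are run again.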

    [0049] FIG. 5 shows a schematic illustration of the temporal sequence of recording the low-resolution frames 33A-E and 35A-C, which are recorded at different times and used in steps S3 to S6. Furthermore, FIG. 5 represents a high-resolution image 34, which was recorded at a time t.sub.0. Triggering the high-resolution image 34 at the time t.sub.0, for example, simultaneously triggers the determination of information for distinguishing between tissue fluid cells and tissue cells in the high-resolution image 34. To this end, according to step S2, the buffered low-resolution frames recorded at times t.sub.−1 to t.sub.−m are evaluated according to steps S3 to S5. At the same time, the recording of low-resolution frames 35A-C is continued in the present exemplary embodiment after the high-resolution image 34 has been recorded. Should the evaluation according to steps S3 to S5 yield that the number of low-resolution frames 33A-E that meet the required quality criteria does not suffice for determining the information for distinguishing between tissue fluid cells and tissue cells embedded in an extracellular matrix, low-resolution frames 35A-C recorded at times t.sub.1 to t.sub.3, i.e., after the high-resolution image 34 was recorded, are added.

    [0050] As soon as a sufficient number of low-resolution frames 33A-E, 35A-C are determined as being available in step S5, the method advances to step S7, in which the low-resolution frames are registered in relation to the reference image. If the reference image is the high-resolution image 34, it is advantageous for the quality of the registration if pixel values for the pixels omitted in the low-resolution frames 33A-E, 35A-C are calculated by means of an interpolation of the pixels that were not omitted.
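
One common way to register frames that differ only by a lateral displacement is phase correlation. The following is an assumed illustration for a pure integer translation; the document does not specify which registration method is used:

```python
import numpy as np

def estimate_shift(reference, frame):
    """Estimate the integer (dy, dx) translation of `frame` relative to
    `reference` via phase correlation: the peak of the inverse FFT of the
    normalized cross-power spectrum marks the displacement."""
    F_ref = np.fft.fft2(reference)
    F_frm = np.fft.fft2(frame)
    cross = np.conj(F_ref) * F_frm
    cross /= np.abs(cross) + 1e-12   # normalize to pure phase
    corr = np.fft.ifft2(cross).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    h, w = corr.shape
    if dy > h // 2:                  # map wrap-around to negative shifts
        dy -= h
    if dx > w // 2:
        dx -= w
    return int(dy), int(dx)
```

The estimated shift can then be undone to bring each low-resolution frame into register with the reference image.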

    [0051] Using the comparison of the registered frames with one another as a basis, the image elements which adopt the same position in the image field in each of the frames 33A-E, 35A-C are then determined in step S8 of the present embodiment. These image elements are depicted in FIGS. 6 and 7 and are labeled with reference sign 29. By comparing the low-resolution frames 33A-E, 35A-C with one another, those image elements that adopt different positions in the frames 33A-E, 35A-C are moreover determined. One such image element is depicted in FIGS. 6 and 7 and is labeled, by way of example, with reference sign 31. The image elements 29 which are always present in the same position of the image field in the frames 33A-E, 35A-C represent the tissue cells embedded in the extracellular matrix, while the image elements 31 whose position differs between the image fields of the frames represent the moving tissue fluid cells.
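
Step S8 can be sketched as a simple background-subtraction scheme: a per-pixel median over the registered frames yields the static image background (image elements 29, tissue cells), and pixels deviating strongly from it mark the moving image elements (image elements 31, tissue fluid cells). The median estimator and the threshold are assumptions; the document does not fix a particular algorithm:

```python
import numpy as np

def split_static_and_moving(frames, threshold=5.0):
    """Separate static and moving image elements in registered frames.

    Returns the estimated background (static tissue cells) and, per
    frame, a boolean mask of pixels that deviate from the background
    (moving tissue fluid cells).
    """
    stack = np.stack([np.asarray(f, dtype=float) for f in frames])
    background = np.median(stack, axis=0)  # static across the frames
    moving_masks = [np.abs(f - background) > threshold for f in stack]
    return background, moving_masks
```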

    [0052] After the information regarding which image elements represent tissue cells embedded in an extracellular matrix and which image elements represent tissue fluid cells has been obtained in step S8, the information obtained is suitably processed for display in step S9. By way of example, this processing may include the tissue cells embedded in the extracellular matrix or the tissue fluid cells being emphasized by color in the high-resolution image. Alternatively, there is also the option of generating a short video sequence on the basis of the low-resolution frames 33A-E, 35A-C, said video sequence being overlaid on the high-resolution image following a registration with the high-resolution image 34. Instead of an overlay, there is also the option of presenting the video sequence next to the high-resolution image 34. The processed information is then output for display on an external monitor (step S10). Alternatively, the information can also be presented on the monitor of the computer 17. The method ends with the presentation of the information.

    [0053] The tissue cells can be classified following the determination of the information which allows the tissue cells embedded in the extracellular matrix to be distinguished from the tissue fluid cells. Such a classification can be implemented by means of suitable software and, for example, be based on methods as described in the following publications: A. BenTaieb et al. “Deep Learning Models for Digital Pathology”, arXiv:1910.12329v2 [cs.CV] 29 Oct. 2019; A. Bizzego et al. “Evaluating reproducibility of AI algorithms in digital pathology with DAPPER” in PLoS Comput Biol 15(3):e1006269, Mar. 27, 2019 and T. Fuchs et al. “Computational pathology: Challenges and promises for tissue analysis” in Computerized Medical Imaging and Graphics 35 (2011), pages 515-530. With the aid of the classification, it is possible to label the tissue cells differently in the present exemplary embodiment. By way of example, labeling allows a distinction to be made in the high-resolution image between tissue cells which do not have a change in relation to the normal state and those which do have a change.

    [0054] The present invention has been described in detail on the basis of exemplary embodiments for explanatory purposes. However, a person skilled in the art recognizes that there can be deviations from the exemplary embodiments within the scope of the present invention, as has also already been indicated within the scope of the exemplary embodiments. For example, instead of determining, on the basis of the low-resolution frames, the image regions forming the background and the image regions that move in relation to the background, it is possible merely to merge the registered low-resolution frames into a short video sequence that is presented next to the high-resolution image. In the video sequence constructed from the registered low-resolution frames, a user of the endomicroscope can identify the tissue fluid cells on the basis of their movement.

    [0055] Hence, an automated determination of the image regions that form the image background and of the image regions that move in relation to the determined image background, as was implemented in the described exemplary embodiment, is not mandatory. A person skilled in the art recognizes that further modifications of the described exemplary embodiments are possible.

    [0056] Therefore, the present invention is intended to be restricted only by the appended claims.

    LIST OF REFERENCE SIGNS

    [0057] 1 Endomicroscope
    [0058] 3 Optical fiber
    [0059] 5 First end
    [0060] 7 Second end
    [0061] 9 Observation object
    [0062] 11 Scanning device
    [0063] 13 Sensor
    [0064] 15 Housing
    [0065] 17 Computer
    [0066] 19 Grid
    [0067] 21 Grid point
    [0068] 23 Structure
    [0069] 25 Omitted lines
    [0070] 27 Scanned lines
    [0071] 29 Image background-forming image elements
    [0072] 31 Image elements moving in relation to the image background
    [0073] 33 Low-resolution images
    [0074] 34 High-resolution image
    [0075] S1 Start
    [0076] S2 Reading buffered images
    [0077] S3 Selection according to image quality
    [0078] S4 Selection according to image field
    [0079] S5 Checking whether a sufficient number of low-resolution frames is available
    [0080] S6 Adding further low-resolution frames
    [0081] S7 Registering the low-resolution frames
    [0082] S8 Determining the image elements forming the image background and the image elements moving in relation to the image background
    [0083] S9 Processing the information
    [0084] S10 Outputting the information for display purposes