METHOD AND APPARATUS FOR MULTIMODAL SOFT TISSUE DIAGNOSTICS

20230222767 · 2023-07-13

Assignee

Inventors

CPC classification

International classification

Abstract

A method and device for multimodal imaging of dermal and mucosal lesions. The method includes using at least two imaging modalities, of which one is a 3D scan of the lesion, additionally providing information on the distance and angulation between the scanning device and the dermis or mucosa, and mapping at least the second modality over the 3D data.

Claims

1. An apparatus for multimodal imaging of dermal and mucosal lesions comprising: a scanning device having illumination light sources and sensors; and at least one processor configured to: compute images from raw data provided by the scanning device, which is adapted to use at least two imaging modalities of which a first imaging modality generates 3D data for a 3D image in a 3D scan of the lesion, compute 3D information on the distance and angulation between the scanning device and the dermis or mucosa, and map at least an image generated by a second imaging modality over the 3D image of the 3D scan based on the 3D information.

2. The apparatus of claim 1, wherein the processor is further configured to compute exact dimensions of the lesion by using the 3D information on the distance and angulation between the scanning device and the lesion.

3. The apparatus of claim 1, wherein the processor is further configured to compute a 3D surface texture of the lesion by using the 3D information of the 3D scan.

4. The apparatus of claim 1, wherein the second imaging modality generates, as said image, at least one of a 2D image or an auto-fluorescence image by employing one sensor, wherein the 2D image is spectrally resolved with 3 or more channels.

5. The apparatus of claim 4, wherein the spectrally resolved 2D data of the 2D image, superimposed by the processor onto the 3D data, support the correct registration of the single 3D images of the 3D data to form a complete 3D image of the region of interest, wherein the 3D images include at least one of a 3D texture image or a subsurface structure image.

6. The apparatus of claim 1, wherein the apparatus is configured to capture 3D data of the 3D scan with a technology that suppresses volume scattering through confocal imaging, OCT, or combinations of confocal scanning or OCT with depth of focus based technologies.

7. The apparatus of claim 1, wherein (i) the scanning device is configured to use wavelengths between 350 and 400 nm for the 3D scan of the surface by employing one corresponding sensor for the 3D image or (ii) the scanning device is configured to use a wavelength longer than 840 nm for the 3D scan of the subsurface by employing one corresponding sensor for a subsurface structure image, so as to operate at wavelengths where dermal and mucosal lesions show a smaller scattering coefficient.

8. (canceled)

9. The apparatus of claim 7, wherein the scanning device is configured to use both wavelength ranges together for the 3D scan of the surface and the 3D scan of the subsurface.

10. The apparatus of claim 9, wherein the scanning device is configured to switch the illumination sequentially between different wavelengths, wherein the illumination light sources are coupled into the same light path with dichroic mirrors.

11. The apparatus of claim 9, wherein the light path of the illumination sources for the different wavelengths is not fully congruent, provided that the light source dimensions are small enough and the slight angular deviation causing a displacement of the illumination pattern on the associated sensor is corrected by calculation.

12. The apparatus of claim 9, wherein in the scanning device when using illumination with wavelength beyond 1000 nm, the scanning device comprises at least one beam splitter configured to separate the optical path for the different sensors.

13. The apparatus of claim 1, wherein the first or the second imaging modality is further configured for fluorescence imaging with excitation light in the UV/Blue range for fluorophores such as FAD, NADH and collagen.

14. The apparatus of claim 1, wherein the scanning device is adapted to use, in the second imaging modality, light in the red wavelength range for excitation of porphyrins and the processor is configured to overlay a fluorescence image over the 3D data of the 3D scan.

15. The apparatus of claim 13, wherein the scanning device comprises a blocking filter for the excitation light, uses the same UV/Blue-wavelength illumination pattern as used for the 3D imaging, and is adapted to introduce the blocking filter for fluorescence detection into the imaging path after its separation from the illumination path.

16. The apparatus of claim 15, wherein the blocking filter is in the 2D imaging path for a 2D image of the second imaging modality.

17. The apparatus of claim 13, wherein the scanning device comprises a separate excitation light source placed on the sides of a replaceable hood and a blocking filter is integrated in a window of the replaceable hood and the blocking filter is designed not to suppress the structured light for the 3D measurement.

18. The apparatus of claim 13, wherein the excitation wavelength is below 350 nm, and the wavelength of an illumination pattern is in the range of 365 nm-405 nm.

19. The apparatus of claim 16, wherein the blocking filter is placed in front of one 2D sensor for the 2D image.

20. The apparatus of claim 1, wherein the scanning device has one or more combinations of the following imaging modalities: use of visual wavelength 2D images together with 3D information used for recalculation of the magnification, distance and angulation of the lesion; visual wavelength 2D image overlaid over the 3D texture image of the lesion; use of visual wavelength 2D images and fluorescent image with 3D information used for recalculation of the magnification, distance and angulation; visual wavelength 2D image and fluorescent image overlaid over the 3D texture image of the lesion; visual wavelength 2D image overlaid over the 3D texture image of the lesion and subsurface structure image; visual wavelength 2D image and fluorescent image overlaid over the 3D texture image of the lesion and subsurface structure image.

21. (canceled)

22. The apparatus of claim 1, wherein an artificial neural network is integrated into the apparatus and adapted to use images of a multimodal image database for training, and to classify multimodal images which are fed into the artificial neural network.

23. The apparatus of claim 1, wherein the calculated multimodal images are sent by a computer to a cloud-based artificial neural network for training of the network and are collected in a multimodal image database, wherein the network is connectable by the apparatus for classification of the intra-oral lesions captured by the apparatus and provided to the network.

24. The apparatus of claim 22 or 23, wherein, in addition to the multimodal imaging data, information such as palpation results, removability of whitish layers, lesion history, and risk factors such as smoking and alcohol is used for training and retrieval of the artificial neural network.

25. The apparatus of claim 8, wherein the scanning device comprises one InGaAs image sensor configured to cover at least the wavelength range from 1000 nm-1600 nm.

26. The apparatus of claim 4, wherein the scanning device comprises a mosaic type CMOS sensor with a plurality of different filters in combination with a lens array configured for the 2D spectral imaging.

27. The apparatus of claim 17, wherein the hood covers fluorescence excitation LEDs to illuminate the field of interest through the window wherein the hood is removable for sterilization, while the LEDs stay on the rest of the scanning device.

28. The apparatus of claim 8, wherein the scanning device comprises a CQD image sensor configured to extend the sensitivity into the NIR range covering at least additionally the wavelength range from 1000 nm-1400 nm.

29. The apparatus according to claim 1, wherein the processor is configured to additionally provide the 3D information on the distance and angulation between the scanning device and the dermis or mucosa through the use of an illumination pattern, stereogrammetry, or time of flight.

30. A method for multimodal imaging of dermal and mucosal lesions comprising: providing an apparatus comprising a scanning device having illumination light sources and sensors; computing images from raw data provided by the scanning device, which is adapted to use at least two imaging modalities of which a first imaging modality generates 3D data for a 3D image in a 3D scan of the lesion; computing 3D information on the distance and angulation between the scanning device and the dermis or mucosa; and mapping at least an image generated by a second imaging modality over the 3D image of the 3D scan based on the 3D information.

Description

BRIEF DESCRIPTION OF THE DRAWINGS

[0049] In the subsequent description, further aspects and advantageous effects of the present invention will be described in more detail by using exemplary embodiments and by reference to the drawings, wherein

[0050] FIG. 1: shows a comparison of photographs and corresponding auto-fluorescence images;

[0051] FIG. 2a: shows a whitish colored lesion;

[0052] FIG. 2b: shows a lesion with increased pigmentation;

[0053] FIG. 3: shows core functional blocks of a 3D scanning device;

[0054] FIG. 4: shows functional blocks of a 3D scanning device;

[0055] FIG. 5: shows a frontend hood of a 3D scanning device;

[0056] FIG. 6: shows emission bands of different fluorophores;

[0057] FIG. 7: shows combinations of imaging modalities;

[0058] FIG. 8: shows a setup with artificial neural networks for diagnostic support.

[0059] The reference numbers shown in the drawings denote the elements as listed below and will be referred to in the subsequent description of the exemplary embodiment.
[0060] 1-1: Lesion
[0061] 3-1: 3D scanning optics
[0062] 3-2: Dichroic mirror/beam splitter
[0063] 3-3: Sensor (e.g. CMOS)
[0064] 3-4: Sensor (e.g. InGaAs detector)
[0065] 4-1: 3D scanning optics
[0066] 4-2: Beam splitter
[0067] 4-3: Sensor (e.g. CMOS)
[0068] 4-4: Blocking filter
[0069] 4-5: Sensor (e.g. CMOS)
[0070] 4-6: Frontend
[0071] 5-1: Frontend hood
[0072] 5-2: UV LED
[0073] 5-3: Imaging window
[0074] 7-1: 2D color image
[0075] 7-2: Auto-fluorescence image
[0076] 7-3: 3D texture image
[0077] 7-4: Subsurface structure image
[0078] 8-1: Device
[0079] 8-2: 2D image
[0080] 8-3: Database
[0081] 8-4: Brush biopsy
[0082] 8-5: X-ray image
[0083] 8-6: Palpation result

[0084] FIG. 1 shows a comparison of photographs and corresponding auto-fluorescence images, in this case taken with a “VELscope” device. The corresponding images clearly show better contrast between the lesion (1-1) and healthy tissue in the auto-fluorescence image.

[0085] FIG. 2a shows a whitish colored lesion mainly caused by thickening of the epidermal layer and thus a significantly increased scattering coefficient, while FIG. 2b shows a lesion with increased pigmentation, which causes an increased absorption coefficient.

[0086] FIG. 3 shows core functional blocks of a 3D scanning device. (3-1) is the 3D scanning optics, (3-2) is a dichroic mirror/beam splitter that passes the wavelength bands in the range from 300 nm-800 nm to a CMOS sensor (3-3), while wavelengths longer than 1000 nm are mirrored to sensor (3-4), which can be an InGaAs detector covering at least the wavelength range from 1000 nm-1600 nm.

[0087] FIG. 4 shows functional blocks of a 3D scanning device. (4-6) is the frontend, which deflects the image towards a beam splitter (4-2) separating the 2D imaging path from the 3D imaging path. (4-1) is the 3D scanning optics with a CMOS sensor (4-3) (as a 3D sensor), and (4-4) is a blocking filter that prevents the excitation light from reaching the CMOS sensor (4-5). Alternatively, the CMOS sensor (4-3) may optionally be replaced with a CQD sensor (4-3) with extended sensitivity in the NIR range. The cutoff wavelength of the blocking filter is around 370 nm-400 nm. This allows the emission light of the auto-fluorescence to pass, as well as the visible wavelengths for a color image. The CMOS sensor (4-5) is not limited to traditional RGB 3-channel sensors but may contain more channels with better spectral resolution, e.g. a mosaic-type CMOS sensor with a multitude of different filters in combination with a lens array (not shown in the figure). This allows differentiation between different fluorophores as shown in FIG. 6, since they have emission bands with maxima at different wavelengths. The optical components (3-2), (3-3), and (3-4) in FIG. 3 can replace the component (4-5) in FIG. 4 in order to have a 2D sensor (3-3) for the visible range and another 2D sensor (3-4) for the NIR light.
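The differentiation of fluorophores with such a spectrally resolved sensor can be illustrated with a minimal linear-unmixing sketch. The channel responses in SPECTRA are illustrative placeholders, not measured emission curves of FAD, NADH or collagen, and least squares is only one of several techniques that could be used:

```python
import numpy as np

# Illustrative (not measured) normalized channel responses:
# rows = sensor filter channels, columns = fluorophores
# (e.g. collagen, NADH, FAD).
SPECTRA = np.array([
    [0.9, 0.3, 0.1],
    [0.5, 0.8, 0.3],
    [0.2, 0.6, 0.9],
    [0.1, 0.2, 0.7],
])

def unmix(channel_signals):
    """Estimate per-fluorophore contributions of one pixel from its
    multi-channel reading by least squares, clipped to non-negative."""
    coeffs, *_ = np.linalg.lstsq(SPECTRA, channel_signals, rcond=None)
    return np.clip(coeffs, 0.0, None)
```

With more channels than fluorophores the system is overdetermined, which is what makes separation possible even when one emitter, such as collagen, contributes a strong background.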

[0088] The apparatus for multimodal imaging of dermal and mucosal lesions comprises: a scanning device (8-1) having illumination light sources and sensors (3-3, 3-4, 4-3, 4-5); and at least one processing means for calculation of images from raw data provided by the scanning device (8-1), which is adapted to use at least two imaging modalities of which the first imaging modality generates 3D data for a 3D image (7-3; 7-4) in a 3D scan of the lesion, wherein the processing means is adapted to additionally provide 3D information on the distance and angulation between the scanning device (8-1) and the dermis or mucosa through the use of an illumination pattern, stereogrammetry, or time of flight, and to map at least an image (7-1; 7-2) generated by the second imaging modality over the 3D image (7-3; 7-4) of the 3D scan based on the 3D information. Illumination patterns, stereogrammetry, and time of flight are examples of various techniques which can be used by those skilled in the art.
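The mapping of a second-modality image over the 3D scan can be sketched, under a simple pinhole-camera assumption, as projecting each 3D surface point into the 2D sensor plane and sampling the image there. The focal length f_px, the pose (R, t) derived from the distance/angulation information, and the nearest-neighbour lookup are illustrative choices, not details taken from the disclosure:

```python
import numpy as np

def project_points(points_3d, f_px, R, t):
    """Project 3D surface points (N, 3) into the sensor plane of a
    pinhole camera with focal length f_px (pixels), rotation R and
    translation t derived from the distance/angulation information."""
    cam = (R @ points_3d.T).T + t           # world -> camera coordinates
    return f_px * cam[:, :2] / cam[:, 2:3]  # perspective division

def map_texture(points_3d, image, f_px, R, t):
    """Assign each 3D point the image value at its projection
    (nearest-neighbour sampling, image centre as principal point)."""
    uv = project_points(points_3d, f_px, R, t)
    h, w = image.shape[:2]
    u = np.clip(np.round(uv[:, 0] + w / 2).astype(int), 0, w - 1)
    v = np.clip(np.round(uv[:, 1] + h / 2).astype(int), 0, h - 1)
    return image[v, u]
```

The same model illustrates how absolute dimensions follow from the 3D information: with a known distance z, a feature spanning n pixels has a physical extent of roughly n·z/f_px.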

[0089] FIG. 5 shows the frontend hood (5-1) of a 3D scanning device. The hood is typically removable from the rest of the scanning device for disinfection. UV LEDs (5-2) are positioned parallel to the imaging window (5-3) to illuminate the field of interest and excite auto-fluorescence. The backscattered light passes through the imaging window (5-3), which can itself be covered with an excitation light blocking filter (interference filter) if this filter is not positioned elsewhere in the detection ray path. The hood may contain the optical elements including the UV LEDs, or may be a more or less empty shell covering the optics inside it; the latter avoids subjecting the UV LEDs to sterilization cycles.

[0090] FIG. 6 shows the different emission bands and maxima of different fluorophores, excited in this case at 308 nm. The different peaks may allow a separation of different fluorophores. However, this is a normalized plot; in reality, the emission intensity of collagen forms a high background signal that can dominate the other fluorophores.

[0091] FIG. 7 shows different combinations of imaging modalities possible with the proposed device, with 2D color images (7-1) (e.g., 2D spectrally resolved images), auto-fluorescence images (7-2), 3D texture images (7-3), and subsurface structure images (7-4) taken at longer wavelengths.

[0092] FIG. 8 shows a setup for using artificial neural networks to support the diagnosis of dermal/mucosal lesions with images taken with the proposed device (8-1). FIG. 8 shows only 2D images (8-2), which contain additional 3D information such as distance and angulation, but the setup is not limited to these images; all combinations shown in FIG. 7 or described in the text apply. From the 2D images (8-2) a database (8-3) is built, which is used for training of the artificial neural network. To improve the classification performance, non-image data can be added as further information to the network, such as brush biopsy results (8-4), X-ray images (8-5) and palpation results (8-6). The image for the palpation result (8-6) is chosen only as an example, showing an elevation in the gingiva, which might be hard or soft. Of course, a multitude of cases with these data has to be included in the database, connected to the corresponding case images used for training.
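The combination of image-derived features with non-image findings for training can be sketched as follows. The feature encoding and the logistic-regression model are illustrative stand-ins for the artificial neural network described above, and the field names (palpation_hard, smoker) are hypothetical:

```python
import numpy as np

def make_record(image_features, palpation_hard, smoker):
    """Concatenate image-derived features with encoded non-image
    findings (palpation result, risk factor) into one training vector."""
    return np.concatenate([image_features,
                           [float(palpation_hard), float(smoker)]])

def train_logistic(X, y, lr=0.1, steps=500):
    """Minimal logistic-regression training loop as a stand-in for
    the classifier; X is the record matrix, y the 0/1 labels."""
    w = np.zeros(X.shape[1])
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-X @ w))   # predicted probabilities
        w -= lr * X.T @ (p - y) / len(y)   # gradient step
    return w
```

The point of the sketch is only the data flow: clinical metadata enters the classifier as additional vector components alongside the multimodal image features.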

[0093] With the present invention, the exactly known absolute dimensions and the known imaging conditions, such as the angulation and distance of the lesion surface relative to the imaging plane, provided by the combination of at least the 3D measurement and the 2D color image, allow images of the same lesion taken at different times to be overlaid (registered) so that even small deviations become visible, which makes it possible to monitor the development of a lesion over time.
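Because the images are normalized to known absolute dimensions, distance and angulation, the overlay of two acquisitions reduces, in the simplest case, to estimating a residual translation. A minimal phase-correlation sketch (integer-pixel shifts only, an illustrative simplification of a full registration):

```python
import numpy as np

def phase_correlation_shift(img_a, img_b):
    """Return the (dy, dx) translation that maps img_a onto img_b,
    estimated from the peak of the normalized cross-power spectrum."""
    F = np.conj(np.fft.fft2(img_a)) * np.fft.fft2(img_b)
    corr = np.fft.ifft2(F / (np.abs(F) + 1e-12)).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    h, w = img_a.shape
    if dy > h // 2:          # unwrap shifts past the half-period
        dy -= h
    if dx > w // 2:
        dx -= w
    return int(dy), int(dx)
```

After compensating the estimated shift, a simple difference image of the two acquisitions highlights even small changes of the lesion between visits.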