MEDICAL IMAGING DEVICE FOR SPATIALLY RESOLVED RECORDING OF MULTISPECTRAL VIDEO DATA

20220360700 · 2022-11-10

    Abstract

    A medical imaging device configured for the spatially resolved recording of multispectral video data of an examination area of a patient, including a light source having multiple optical emitters with different wavelengths in the visible and NIR spectral range. The light source has an emitter whose wavelength lies within ±50% of its half-width around the intersection of the blue and green filter curves or of the green and red filter curves, and the exposure control and the data processing means are arranged to separately detect the two affected colour signals (red and green, or green and blue) in an exposure pattern in which the emitter at the intersection point is activated, and to evaluate them in the multispectral analysis as two supporting point wavelengths with mutually different wavelengths shifted by the two affected filter curves.

    Claims

    1-14. (canceled)

    15. An endoscopic system comprising: a camera sensor; at least one lens directing light onto the camera sensor; a plurality of LED emitters controlled by a recording control device, each of the plurality of LED emitters having an associated wavelength, one or more of the plurality of LED emitters selectively activated based on a measurement to be taken; and a computing device configured to receive and store one or more of images and video from the camera sensor.

    16. The system of claim 15, wherein multiple combinations of the plurality of LEDs are selectively actuated in succession by the recording control device to effect a plurality of measurements.

    17. The system of claim 15, wherein the measurement includes one or more of an oxygen saturation of arterial blood (SpO2), a heart rate (HR), a pulsation index (PI), and a heart rate variability (HRV).

    18. The system of claim 15, wherein the measurement includes one or more of information about an oxygen saturation of a microcirculation in a tissue (StO2), a tissue haemoglobin content (THI), a tissue water content (TWI), a tissue fat content (TLI), and a respiratory rate (RR).

    19. The system of claim 15, wherein one or more of pulsatile and non-pulsatile signal components are received by the computing device.

    20. The system of claim 15, further comprising the computing device receiving one or more of a visible image and a video.

    21. The system of claim 15, wherein through programming, the recording control device: generates a sequence of spectral exposure patterns in a predetermined activation sequence, whereby in each individual spectral exposure pattern one or more of the LED emitters with different wavelengths are activated, and repeats this activation sequence successively over time, wherein each spectral exposure pattern of the activation sequence is recorded by the camera sensor.

    22. The system of claim 15, wherein synchronization between the plurality of LED emitters and the camera sensor is provided by the recording control device.

    23. The system of claim 15, wherein the selective activation is caused by a user selection for a desired test.

    24. The system of claim 15, wherein the associated wavelengths include one or more of 405 nm, 430 nm, 455 nm, 490 nm, 520 nm, 540 nm, 600 nm, 620 nm, 660 nm, 760 nm, 810 nm, 880 nm, 930 nm, and 960 nm.

    25. A method to operate an endoscopic system comprising: receiving, via a lens, light onto a camera sensor; controlling, by a recording control device, a plurality of LED emitters, each of the plurality of LED emitters having an associated wavelength; selectively activating one or more of the plurality of LED emitters based on a measurement to be taken; and receiving and storing one or more of images and video from the camera sensor.

    26. The method of claim 25, wherein multiple combinations of the plurality of LEDs are selectively actuated in succession by the recording control device to effect a plurality of measurements.

    27. The method of claim 25, wherein the measurement includes one or more of an oxygen saturation of arterial blood (SpO2), a heart rate (HR), a pulsation index (PI), and a heart rate variability (HRV).

    28. The method of claim 25, wherein the measurement includes one or more of information about an oxygen saturation of a microcirculation in a tissue (StO2), a tissue haemoglobin content (THI), a tissue water content (TWI), a tissue fat content (TLI), and a respiratory rate (RR).

    29. The method of claim 25, wherein one or more of pulsatile and non-pulsatile signal components are received by the computing device.

    30. The method of claim 25, further comprising receiving one or more of a visible image and a video at the computing device.

    31. The method of claim 25, wherein through programming, the recording control device: generates a sequence of spectral exposure patterns in a predetermined activation sequence, whereby in each individual spectral exposure pattern one or more of the LED emitters with different wavelengths are activated, and repeats this activation sequence successively over time, wherein each spectral exposure pattern of the activation sequence is recorded by the camera sensor.

    32. The method of claim 25, wherein synchronization between the plurality of LED emitters and the camera sensor is provided by the recording control device.

    33. The method of claim 25, wherein the selective activation is caused by a user selection for a desired test.

    34. The method of claim 25, wherein the associated wavelengths include one or more of 405 nm, 430 nm, 455 nm, 490 nm, 520 nm, 540 nm, 600 nm, 620 nm, 660 nm, 760 nm, 810 nm, 880 nm, 930 nm, and 960 nm.

    35. An endoscopic system comprising: a camera sensor; at least one lens directing light onto the camera sensor; a plurality of LED emitters controlled by a recording control device, each of the plurality of LED emitters having an associated wavelength between 405 nm and 960 nm, one or more of the plurality of LED emitters selectively activated at a specific point in time to produce the light for the camera sensor, wherein the selection of the one or more of the plurality of LED emitters is based on a desired measurement; and a computing device configured to receive and store one or more of images and video from the camera sensor.

    Description

    BRIEF DESCRIPTION OF THE DRAWINGS

    [0037] The invention is described below with reference to examples of embodiments in connection with the drawings, in which:

    [0038] FIG. 1 schematically shows the structure of a medical imaging device according to the invention,

    [0039] FIG. 2 schematically shows the structure of a stereoscopic embodiment of the medical imaging device,

    [0040] FIG. 3 shows in the graph on the left the emission spectra of a number of emitters in the visible range and the total spectrum resulting from this by superposition, and in the graph on the right the blue, green and red filter curves of an RGB camera sensor,

    [0041] FIG. 4 shows the spectral emission curve of a broadband orange LED emitter, the three colour filter curves of the RGB camera sensor and the resulting spectral signal distribution in the green and red colour signals,

    [0042] FIG. 5 shows on the left an activation sequence with two successive spectral exposure patterns, in each of which several emitters emit light pulses, and in the right part the emission spectra of the first exposure pattern (S1-top) and the second exposure pattern (S2-bottom) in each case together with the red, green and blue filter curves of the RGB camera sensor,

    [0043] FIG. 6 shows the time sequence of exposure and readout of the RGB camera sensor with rolling shutter,

    [0044] FIGS. 7a-7c each show on the left the emission spectra of a spectral exposure pattern of an activation sequence with three successive spectral exposure patterns (FIGS. 7a, 7b and 7c) each together with the three colour filter curves, each show two resulting colour signals in the center and the third resulting colour signal in the right column,

    [0045] FIG. 8 shows the emission spectra of a series of LED emitters in the visible range at the top and the resulting red, green and blue colour signals at the bottom,

    [0046] FIG. 9 shows a graph of the activation sequence with the three successive spectral exposure patterns S1, S2 and S3 as a function of time for the activation sequence shown in FIG. 7,

    [0047] FIG. 10 shows in the middle column from top to bottom the red, green, blue colour signals and below that the red and blue colour signals for seven successive activation sequences with two successive spectral exposure patterns each according to FIG. 5 and next to it and below it summaries of the signals,

    [0048] FIG. 11 shows the absorption curves of the most important absorbers of the tissue and the effective interpolation wavelengths for multispectral analysis, and

    [0049] FIG. 12 schematically shows the composition of tissue absorption and the physiological parameters derived from it.

    DETAILED DESCRIPTION

    [0050] FIG. 1 shows a schematic block diagram of the structure of a medical imaging device which can be operated according to the present invention. The imaging device comprises a light source 3, which is shown schematically as a matrix with several emitters shown in different shades of grey. The emitters may be formed by LEDs whose wavelengths are distributed over the visible and NIR spectral ranges. Where the phrase is used herein that the emitters each have “one” wavelength, this refers to the peak wavelength of the emission spectrum of the respective emitter. Further details of the light source are described below.

    [0051] The light from the light source 3 is directed onto the examination area via an imaging lens 1. The light reflected from the examination area is directed via a lens 2 onto an RGB colour camera sensor 4. The red, green and blue colour filters, designated R, G and B, are arranged in the so-called Bayer pattern directly in front of the pixels of the camera sensor, which may be a CMOS or CCD sensor. Each pixel therefore directly measures only one of the three colour signals; the two missing colours are usually estimated (interpolated) from the colour values of the eight neighboring pixels.
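    The neighbour interpolation mentioned above can be sketched in a few lines. The following is a minimal bilinear demosaicing example assuming an RGGB Bayer layout; it is an illustration of the principle, not the sensor manufacturer's actual algorithm:

```python
import numpy as np

def _conv3x3(img, kernel):
    """3x3 convolution with edge replication (no external dependencies)."""
    padded = np.pad(img, 1, mode="edge")
    out = np.zeros_like(img, dtype=float)
    for dy in range(3):
        for dx in range(3):
            out += kernel[dy, dx] * padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out

def demosaic_bilinear(raw):
    """Estimate a full RGB value per pixel from an RGGB Bayer mosaic by
    averaging the neighbouring pixels that carry the missing colours."""
    h, w = raw.shape
    masks = np.zeros((h, w, 3), dtype=bool)
    masks[0::2, 0::2, 0] = True   # red sites
    masks[0::2, 1::2, 1] = True   # green sites on red rows
    masks[1::2, 0::2, 1] = True   # green sites on blue rows
    masks[1::2, 1::2, 2] = True   # blue sites
    kernel = np.array([[0.25, 0.5, 0.25],
                       [0.5,  1.0, 0.5],
                       [0.25, 0.5, 0.25]])
    rgb = np.zeros((h, w, 3))
    for c in range(3):
        # Weighted average of the known samples of this colour around each pixel.
        num = _conv3x3(np.where(masks[..., c], raw, 0.0), kernel)
        den = _conv3x3(masks[..., c].astype(float), kernel)
        rgb[..., c] = num / den
    return rgb
```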

    [0052] A recording control device 5 communicates with the light source 3 and the RGB colour camera sensor 4. The recording control device 5 is arranged to synchronize and control the operation of the light source 3 and the operation of the colour camera sensor 4. The recording control device 5 is also in data exchange connection with a data processing unit 6, 7, represented here by two EDP system modules 6 and 7. The first module 6 of the data processing unit receives and processes the signals from the colour camera sensor and generates successive images (video frames) therefrom. Each successive frame represents a spectral exposure pattern from an activation sequence with several successive, different spectral exposure patterns. The activation sequences are in turn repeated sequentially in time by the recording control device. Each image of the RGB colour camera sensor taken by the module 6 thus corresponds to a specific spectral exposure pattern from the activation sequence. The successive images of the spectral illumination patterns of an activation sequence are subjected to multispectral analysis in the analysis module 7 of the data processing device using predetermined algorithms in order to derive desired information and physiological parameters, as described further below. The analysis module 7 of the data processing device may also be associated with other functions such as image storage, image analysis and other functions. The analysis module 7 of the data processing device may be based on a PC, for example. The data acquisition module 6 can in principle also be implemented in this PC; as a rule, the data acquisition module 6 is implemented in a real-time processor system.

    [0053] FIG. 2 shows a corresponding block diagram, whereby the medical imaging device sketched there provides stereoscopic images. For this purpose, it is provided with a further lens 2′ and a further RGB colour camera sensor 4′, the two lenses 2, 2′ and the associated RGB colour camera sensors 4, 4′ being arranged in such a way that stereoscopic image information is derived from the images of the two RGB colour cameras 4, 4′ in the data processing device 6, 7 in order to achieve three-dimensional imaging. In this embodiment, the RGB colour camera sensor 4 determines the synchronization as master, while the second RGB colour camera sensor 4′ and the synchronized multispectral light source 3 are operated as slave.

    [0054] The light source 3 comprises several LED emitters in the visible and NIR spectral range. LEDs are fast-switching, controllable emitters (light pulses with a length in the range of 10 μs to 10,000 μs are typical) and operate without thermal problems at light powers that are not critical for the tissue. Preferably, the light source comprises more than 10 LEDs, so that more than 10 supporting wavelengths distributed over the covered spectral range are available. In the visible spectral range, supporting points should be present at local absorption maxima of oxygenated haemoglobin. This allows an additional check of whether a periodic signal supposedly detected as a pulse signal is actually the pulse signal of the circulation, since the real pulse signal is accompanied by a corresponding pulse signal for oxygenated haemoglobin.

    [0055] It is further preferred that the light source comprises at least one emitter with emission maximum in the range of 500 nm to 540 nm and one in the range of 570 nm to 600 nm. In particular, an emitter with a wavelength of about 520 nm and an emitter with a wavelength of 585 nm may be present. These wavelengths allow good discrimination between oxygenated and deoxygenated haemoglobin, since 520 nm is an isosbestic point of haemoglobin, i.e. the absorption coefficients of oxygenated and deoxygenated haemoglobin are equal there. Another isosbestic point of haemoglobin is at 810 nm. In contrast, the absorption of deoxygenated haemoglobin at 575 nm and 880 nm is much smaller than that of oxygenated haemoglobin. At 600 nm and 760 nm, the absorption of oxygenated haemoglobin is again smaller, and therefore these four wavelengths can also be used to calculate the oxygenation of haemoglobin.
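    The document does not spell out the calculation, but a textbook-style sketch of how such a wavelength pair yields haemoglobin oxygenation is a two-wavelength Beer-Lambert unmixing. The extinction coefficients below are illustrative placeholders, not measured values:

```python
import numpy as np

def two_wavelength_sto2(A, eps_hbo2, eps_hb):
    """Solve the 2x2 Beer-Lambert system A = E @ c for the concentrations
    of oxygenated and deoxygenated haemoglobin at two wavelengths and
    return the saturation c_HbO2 / (c_HbO2 + c_Hb).

    A         : absorbances at the two wavelengths
    eps_hbo2  : extinction coefficients of HbO2 at those wavelengths
    eps_hb    : extinction coefficients of Hb at those wavelengths
    """
    E = np.column_stack([eps_hbo2, eps_hb])
    c_hbo2, c_hb = np.linalg.solve(E, A)
    return c_hbo2 / (c_hbo2 + c_hb)

# Illustrative values: the first wavelength is isosbestic (equal
# extinction for HbO2 and Hb), the second oxygenation-sensitive.
sat = two_wavelength_sto2(np.array([1.0, 1.55]),
                          np.array([1.0, 2.0]),
                          np.array([1.0, 0.5]))
```

At an isosbestic wavelength the total haemoglobin fixes one equation independently of saturation, which is why pairing it with an oxygenation-sensitive wavelength makes the small system well conditioned.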

    [0056] By using short-wavelength (violet) emitters, the imaging device can be used to increase the contrast between organic structures such as blood vessels, between structures with different water contents, or between regions with different concentrations of other chromophores. Due to the high absorption coefficients of haemoglobin in this spectral range, combined with constant scattering coefficients of the tissue, very sharp images are obtained, since the light penetrates only a short distance into the tissue and is scattered little.

    [0057] A preferred selection of wavelengths as spectral interpolation points is listed in Table 1 and shown in FIG. 11.

    TABLE 1 — Preferred emission wavelengths of the light source

    Wavelength (nm)  Function
    405              Haemoglobin absorption for contrast enhancement; excitation of fluorescence in the red channel
    430              Haemoglobin absorption for contrast enhancement; excitation of fluorescent dyes and autofluorescence in tissue
    455              Colour image (blue channel)
    490              Colour image (CRI); oximetry wavelength; splitting into 2 wavelengths through the colour filters of the camera sensor (blue/green)
    520              Colour image (green channel); isosbestic point of haemoglobin absorption
    540              Support of colour video and additional interpolation point
    600              Colour image (CRI); oximetry wavelength; splitting into 2 wavelengths through the colour filters of the camera sensor (green/red)
    620              Colour image (red channel 1); oximetry wavelength; pulse oximetry
    660              Colour image (red channel 2); oximetry wavelength; pulse oximetry
    760              Local absorption maximum of deoxygenated haemoglobin
    810              Isosbestic point of haemoglobin absorption
    880              Pulse oximetry; reference wavelength for haemoglobin, water and fat absorption
    930              Local absorption maximum of fat
    960              Local absorption maximum of water

    [0058] The recording control device is arranged, e.g. by programming, to generate a sequence of spectral exposure patterns in a predetermined activation sequence, whereby in each individual spectral exposure pattern one or more emitters with different wavelengths are activated, and to repeat this activation sequence successively in time. Each spectral exposure pattern of the activation sequence is recorded with an image of the colour camera sensor. This synchronization between the light source 3 and the colour camera sensor 4 is provided by the recording control device 5. FIG. 5 shows an example of an activation sequence not covered by the claims of the application with two successive spectral exposure patterns S1 and S2. In the graph shown on the left side of FIG. 5, the activation of five emitters during two successive spectral exposure patterns S1 and S2 as well as the recording activity of the colour camera sensor are shown as a function of time. At a maximum frame rate of the colour camera sensor of 120 Hz, two spectral exposure patterns (the first with simultaneous activation of emitters at 520, 660 and 960 nm, and the second with simultaneous activation of two emitters at 455 and 880 nm) are generated here repeatedly one after the other in an activation sequence. Thus, through efficient selection of the emitters and utilizing the three colour images of the colour camera sensor, five spectral interpolation wavelengths are available, with two images being captured by the colour camera sensor per activation sequence, so that the frame rate for multispectral imaging with five interpolation wavelengths is 60 Hz. These spectral wavelength support points can be used, for example, for simultaneous calculation of a colour video image, spatially resolved SpO2 representation and spatially resolved tissue water index representation.
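    The frame-rate bookkeeping in this paragraph can be checked with a short calculation (a sketch with illustrative names, not device code): one camera frame is captured per spectral exposure pattern, so the multispectral data-set rate is the sensor frame rate divided by the number of patterns per activation sequence.

```python
def multispectral_rate(sensor_fps, patterns_per_sequence):
    """One camera frame is taken per spectral exposure pattern, so the
    complete multispectral data-set rate is the sensor frame rate
    divided by the number of patterns in the activation sequence."""
    return sensor_fps / patterns_per_sequence

# Two-pattern sequence of FIG. 5 at 120 Hz -> 60 complete data sets/s.
assert multispectral_rate(120, 2) == 60
# Three-pattern sequence of FIGS. 7a-7c -> 40 complete data sets/s.
assert multispectral_rate(120, 3) == 40
```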

    [0059] The number of spectral exposure patterns and the selection of emitters to be activated in the respective exposure patterns can be set during operation of the recording control device. For this purpose, an input device can be provided in which the user enters the number of spectral exposure patterns in the activation sequence and then, for each spectral exposure pattern, the respective emitters to be activated. Thus, if required, a higher number of spectral interpolation wavelengths can be recorded at a lower frame rate or, conversely, a lower number of spectral interpolation wavelengths can be recorded at a higher frame rate.

    [0060] The activation sequence illustrated in FIG. 5 with two successive spectral exposure patterns can be used to generate a colour image (colour video), a physiological parameter video representing the water concentration (water absorption band 960 nm to reference point 880 nm) and to generate a parameter video for SpO2, the latter extracted from the pulsatile part of the signal at 660 nm, 880 nm and 960 nm. During the first spectral exposure pattern, emitter 2 (520 nm), emitter 3 (660 nm) and emitter 5 (960 nm) are switched on. The normalized intensity of the emitters and the colour filter curves of the colour camera sensor are shown in the graph at the top right. The largest signal component in the blue colour signal comes from emitter 5. Since the sensitivity of the colour camera sensor above 800 nm is almost identical in the three colour channels R, G and B, the information about the signal intensity of emitter 5 in the blue colour signal can be used to correct the signal intensities of the green and red colour signals. To obtain the actual signal of emitter 2 (520 nm), the blue colour signal is subtracted from the green colour signal. To determine the actual signal of emitter 3 (660 nm), the measured blue colour signal is subtracted from the red colour signal. Thus, after the recording of the first image (of the first spectral exposure pattern S1), there are already two supporting points in the visible and one in the near-infrared range. For the second spectral exposure pattern S2, only emitters 1 (455 nm) and 4 (880 nm) are switched on. The red colour signal comes only from emitter 4. The signal from emitter 1 can be calculated by subtracting the red colour signal from the blue colour signal. After completion of the activation sequence with the two spectral exposure patterns S1 and S2, five spectral interpolation wavelengths are available.
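    The channel arithmetic of the first exposure pattern S1 can be summarized as follows. The function name and scalar signals are illustrative, and the subtraction relies on the assumption stated above that the sensor's R, G and B sensitivities are nearly identical above 800 nm:

```python
def recover_emitter_signals_s1(red, green, blue):
    """First exposure pattern (S1): emitters at 520, 660 and 960 nm active.
    The 960 nm emitter contributes almost equally to all three colour
    channels, so its contribution (measured via the blue channel, where
    it dominates) is subtracted from the green and red channels to
    isolate the two visible emitters."""
    sig_960 = blue           # NIR contribution, dominant in the blue channel
    sig_520 = green - blue   # green channel minus NIR crosstalk
    sig_660 = red - blue     # red channel minus NIR crosstalk
    return sig_520, sig_660, sig_960
```

The same pattern of one NIR reference channel plus per-channel subtraction applies to S2, where the red channel carries only the 880 nm emitter.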

    [0061] FIG. 6 shows the time sequence of reading out and exposing CMOS colour camera sensors of the rolling shutter type. These CMOS colour camera sensors do not have a global shutter, so that individual lines of the camera sensor chip are read out continuously and with a time delay relative to one another, which can lead to overlapping of the exposure within individual frames when the LED emitters are activated sequentially. To solve this problem, a certain dead time is provided between successive spectral exposure patterns of the LEDs, which corresponds at least to the readout time and/or reset time of the CMOS sensor. Alternatively, it is conceivable that the exposure times of the spectral exposure patterns are selected to be greater than the time intervals between a reset and a readout of the entire CMOS sensor.
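    One possible way to schedule the LED pulses around a rolling-shutter readout, along the lines of the paragraph above, is sketched below. The helper name and timing values are hypothetical; the sketch simply ends each pulse window one full readout time before the end of its frame period, leaving the required dead time between successive patterns:

```python
def led_pulse_windows(n_patterns, frame_period_us, readout_us):
    """Return (start, end) times in microseconds during which the LEDs
    of each successive spectral exposure pattern may be on.  Each pulse
    window ends one readout time before the frame period elapses, so a
    dead time of at least the readout time separates the patterns and
    no pattern bleeds into the line-by-line readout of the next frame."""
    windows = []
    for k in range(n_patterns):
        frame_start = k * frame_period_us
        windows.append((frame_start, frame_start + frame_period_us - readout_us))
    return windows
```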

    [0062] FIG. 4 illustrates the generation of two spectral interpolation points from the emission curve of a single emitter (emitter orange) and the colour filter curves of the colour camera sensor. The peak wavelength of the orange emitter lies close to the intersection of the green filter curve and the red filter curve. Because the green and red filter curves run in opposite directions in this region, the resulting signal maxima in the green and red colour signals are shifted in opposite directions by a few nanometers: the signal maximum in the green colour signal is shifted to smaller wavelengths compared to the maximum of the emission curve of the orange emitter, while the signal maximum in the red colour signal is shifted to longer wavelengths. Effectively, the emission curve of the emitter is divided by the two filter curves into two wavelength maxima in the two colour signals in the intersection area of the filter curves. By decomposing the emission curve into two adjacent signal maxima in the resulting colour signals, the number of wavelengths that can be evaluated (supporting points) for multispectral analysis can be effectively increased using a single emitter.
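    The splitting effect can be reproduced numerically by multiplying the emitter's emission curve by each filter curve and locating the maximum of each product. The Gaussian curves below are stand-ins for the real emission and filter curves, chosen only to demonstrate the opposite shifts:

```python
import numpy as np

def effective_support_wavelengths(wl, emission, filter_curves):
    """Peak wavelength of an emitter's spectrum as seen through each
    colour filter, i.e. the maximum of emission x filter transmission."""
    peaks = {}
    for name, f in filter_curves.items():
        peaks[name] = wl[np.argmax(emission * f)]
    return peaks

wl = np.linspace(450, 750, 3001)
gauss = lambda mu, sigma: np.exp(-0.5 * ((wl - mu) / sigma) ** 2)

emitter_600 = gauss(600, 15)    # stand-in for the broadband orange emitter
filters = {"green": gauss(540, 40),   # falling green filter flank
           "red":   gauss(620, 40)}   # rising red filter flank
peaks = effective_support_wavelengths(wl, emitter_600, filters)
# The green-filtered maximum shifts below 600 nm, the red-filtered one above.
```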

    [0063] FIGS. 7a-7c illustrate an activation sequence that exploits the effect just described of splitting the spectrum of an emitter near the intersection of the green and red filter curves into mutually shifted green and red colour signals. Such an activation sequence is applicable in an imaging device according to the invention. The activation sequence with three successive spectral exposure patterns is shown in FIGS. 7a to 7c respectively in the form of emission spectra of the emitters involved and the resulting colour signals for the first (FIG. 7a), the second (FIG. 7b) and the third spectral exposure pattern (FIG. 7c); FIG. 9 illustrates these spectral exposure patterns in the form of a graph of the activation periods of the individual emitters and the camera sensor as a function of time over the three successive spectral exposure patterns S1, S2 and S3. In particular, FIG. 9 shows the time sequence of the switching edges during the three spectral exposure patterns with a total of seven emitters as shown in FIGS. 7a to 7c. During the time period S1 of the first spectral exposure pattern, emitter 1 (430 nm) and emitter 5 (600 nm) are switched on. During the second spectral exposure pattern (S2), emitter 3 (490 nm) and emitter 7 (660 nm) are switched on.

    [0064] For the third spectral exposure pattern over time period S3, emitter 2 (455 nm), emitter 4 (520 nm) and emitter 6 (620 nm) are switched on. The camera sensor is switched on during the phases of all three spectral exposure patterns for image recording.

    [0065] The activation sequence in FIGS. 7a-7c and 9 uses seven emitters in the visible range from which nine spectral interpolation points are obtained. This activation sequence can be used, for example, to generate a colour video, a parameter video for tissue oxygenation (StO2) and a parameter video of the relative tissue haemoglobin concentration.

    [0066] FIGS. 7a to 7c each show in the left graph the emission spectra of the emitters used in the three successive spectral exposure patterns together with the filter curves of the colour camera sensor. The middle and right graphs in FIGS. 7a to 7c show the effective colour signals for the three colours. The first spectral exposure pattern of the activation sequence switches on two emitters at 430 nm and 600 nm, whereby the emission spectrum of the latter emitter lies close to the intersection of the green and red filter curves and is therefore split between the resulting green and red colour signals, as previously described in connection with FIG. 4, so that the first spectral exposure pattern provides three spectral interpolation points.

    [0067] The second spectral exposure pattern uses two emitters at 490 nm and 660 nm, whereby the signals resulting from the first emitter at 490 nm are decomposed by the blue and green filter curves into two neighboring signal maxima in the blue and green colour signals, so that the second spectral exposure pattern also provides three supporting wavelengths.

    [0068] The third spectral exposure pattern of the activation sequence switches on three emitters at 455 nm, 520 nm and 620 nm, so that the third spectral exposure pattern also provides three supporting point wavelengths. Thus, the activation sequence described in connection with FIGS. 7 and 9 yields a total of nine interpolation wavelengths. At a frame rate of 120 Hz, this results in forty complete data sets per second, each with nine spectral interpolation points for the multispectral analysis.

    [0069] FIG. 8 shows in the upper graph the normalized intensities of the emitters used and in the lower graph the resulting effective three colour signals of the activation sequence described above in connection with FIGS. 7a to 7c and 9. After completion of the activation sequence of three spectral exposure patterns each, nine spectral support points are present in the visible range from 400 to 700 nm.

    [0070] FIG. 10 illustrates the recording of the signals and their summary for the activation sequence of FIG. 5, which is repeated seven times in succession in the graphs in the middle column of FIG. 10. The top five graphs in the middle column, I_E1(t) . . . I_E5(t), show the following colour signals over seven activation sequences with two spectral exposure patterns S1 and S2 each: the graph I_E3(t) shows the signal due to the emission of emitter 3 in the red colour signal (the rising and again falling course over the seven activation sequences is due to the pulsation of haemoglobin, which is clearly visible at this wavelength); the signal I_E2(t) shows the signal resulting from the activation of emitter 2 (520 nm) in the green colour signal; the signal I_E5(t) shows the signal resulting from the activation of emitter 5 (960 nm) in the blue colour signal. These first three signals are each recorded simultaneously in the first spectral exposure pattern S1 of the successive activation sequences. The fourth signal I_E4(t) is due to the activation of emitter 4 (880 nm) and is recorded as a red colour signal. Simultaneously, the signal I_E1(t) resulting from the activation of emitter 1 (455 nm) is recorded as a blue colour signal in the second spectral exposure pattern S2. Since the second spectral exposure pattern follows the first spectral exposure pattern of each activation sequence with a time delay, the signals I_E4(t) and I_E1(t) are time-delayed compared to the signals of the first spectral exposure pattern in the three upper graphs.

    [0071] The right column shows the time-averaged signals (i.e. the time-averaged signals at the respective wavelengths over the seven activation phases shown in the graphs in the middle column). From these average signals at the individual wavelengths, the graph Ī(λ) with all five spectral interpolation points can be assembled (bottom right in FIG. 10), which can be evaluated as a tissue spectrum. This tissue spectrum can be used for the calculation of non-pulsatile physiological tissue parameters for which time averaging is necessary to increase the signal quality.

    [0072] The graph I(t), shown second from the bottom in the middle column of FIG. 10, shows the measured intensities of all three colour signals in the respective successive first and second spectral exposure patterns over the seven successive activation sequences. Further information can be obtained from the difference of the intensities of two temporally separated activation sequences. In the two lowest graphs of the middle column of FIG. 10, the intensities I_t1(λ) and I_t2(λ) as a function of wavelength in two temporally separated activation sequences are shown. In other words, I_t1(λ) shows the five spectral interpolation points recorded in the first activation sequence and I_t2(λ) shows the five spectral interpolation points recorded in the fifth activation sequence. The pulsatile part of the signal (time-varying part) can be derived from the difference of the interpolation points in the two graphs, I_t1(λ) − I_t2(λ). This pulsatile part can be used for the generation of the physiological parameter image for SpO2, which is extracted from the pulsatile part of the signal at 660 nm, 880 nm and 960 nm.
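    Per support wavelength, the processing described for FIG. 10 amounts to a time mean (non-pulsatile part) and a difference between two temporally separated sequences (pulsatile part). The following is a schematic sketch with made-up numbers, not the actual SpO2 algorithm:

```python
import numpy as np

def mean_and_pulsatile(I, t1, t2):
    """I: array of shape (n_sequences, n_wavelengths) holding the support
    point intensities of successive activation sequences.  Returns the
    time-averaged tissue spectrum (non-pulsatile part) and the pulsatile
    part as the difference of two temporally separated sequences."""
    I_bar = I.mean(axis=0)      # time-averaged spectrum over all sequences
    I_pulse = I[t1] - I[t2]     # difference of two separated sequences
    return I_bar, I_pulse
```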

    [0073] FIG. 11 shows the absorption curves of the main absorbers in the tissue in the visible and near-infrared range. In addition, the preferred emission wavelengths of the emitters of the light sources and the effective interpolation wavelengths resulting from this, if applicable in combination with the filter curves of the colour camera sensor, are shown.

    [0074] FIG. 12 shows the pulsatile and non-pulsatile signal components of tissue absorption. The pulsatile component is caused by the arterial blood and, after separation from the non-pulsatile signal component, can be used to determine the oxygen saturation of the arterial blood (SpO2), the heart rate (HR), the pulsation index (PI) and the heart rate variability (HRV). The non-pulsatile signal component contains, among other things, information about the oxygen saturation of the microcirculation in the tissue (StO2), the tissue haemoglobin content (THI), the tissue water content (TWI) and the tissue fat content (TLI). Another signal component is caused by the displacement of the venous blood and can be used, among other things, to determine the respiratory rate (RR).