Device for detecting water on a surface and a method for detecting water on a surface
11680895 · 2023-06-20
CPC classification: G01N2021/1738 (PHYSICS); G06V20/56 (PHYSICS)
Abstract
A device for identifying water on a surface, including an optical sensor and a processor. The optical sensor is configured to produce a first image of the surface which has a first optical bandwidth within which the water has a first absorption rate, and a second image of the surface which has a second optical bandwidth within which the water has a second absorption rate that is higher than the first absorption rate. The processor is configured to combine the first image and the second image to produce a combined image in which the surface is reduced or eliminated as compared to the water. In addition, the processor is configured to detect water in the combined image.
Claims
1. A device for identifying water on a surface, comprising: an optical sensor for producing a first image of the surface which exhibits a first optical bandwidth within which the water exhibits a first absorption rate, and a second image of the surface which exhibits a second optical bandwidth within which the water exhibits a second absorption rate that is higher than the first absorption rate; a processor for combining the first image and the second image to produce a combined image in which the surface is reduced or eliminated as compared to the water, and for detecting the water in the combined image; and an additional optical sensor configured to provide to the processor a third image with a first polarization filter comprising a first polarization angle, and to provide to the processor a fourth image with a second polarization filter comprising a second polarization angle, wherein the two polarization filters differ from each other with regard to their polarization angles, and wherein the processor is configured to detect an aggregate state of the water while using the third image and the fourth image in the combined image.
2. The device as claimed in claim 1, wherein the first optical bandwidth comprises a waveband selected from a spectral range between 400 nm and 900 nm; and the second optical bandwidth comprises a waveband selected from a spectral range between 900 nm and 1200 nm.
3. The device as claimed in claim 1, wherein the first optical bandwidth at a full width at half maximum exhibits a value ranging between 820 nm and 870 nm; and the second optical bandwidth at the full width at half maximum exhibits a value ranging between 950 nm and 990 nm.
4. The device as claimed in claim 1, wherein the optical sensor comprises a vertical polarization filter configured to reduce or eliminate the horizontally polarized light rays.
5. The device as claimed in claim 1, wherein the optical sensor comprises a silicon-based image sensor, a monochrome sensor, or a CMOS sensor.
6. The device as claimed in claim 1, wherein the optical sensor comprises a first individual sensor comprising a first optical bandpass filter comprising the first bandwidth, and a second individual sensor comprising a second optical bandpass filter comprising the second bandwidth, wherein the optical sensor is configured to simultaneously produce the first image by means of the first individual sensor and the second image by means of the second individual sensor; or wherein the optical sensor comprises a third individual sensor comprising exchangeable optical bandpass filters, the exchangeable optical bandpass filters comprising the first optical bandpass filter comprising the first bandwidth and the second optical bandpass filter comprising the second bandwidth, wherein the optical sensor is configured to produce, in succession, the first image by means of the third individual sensor comprising the first optical bandpass filter, and the second image by means of the third individual sensor comprising the second optical bandpass filter; or wherein the optical sensor comprises a fourth individual sensor comprising a filter layer, wherein the filter layer is configured to divide pixels of the image produced by the fourth individual sensor into two groups in a row-wise, column-wise or checkerboard-wise manner, wherein the first optical bandpass filter comprising the first bandwidth is arranged above the pixels of the first group, so that the pixels of the first group produce the first image comprising the first optical bandwidth, and the second optical bandpass filter comprising the second bandwidth is arranged above the pixels of the second group, so that the pixels of the second group produce the second image comprising the second optical bandwidth.
7. The device as claimed in claim 1, wherein the processor is configured to use, in combining the first image and the second image, pixel-by-pixel quotient calculation, pixel-by-pixel difference calculation, pixel-by-pixel normalized difference calculation, or a difference between 1 and quotient calculation.
8. The device as claimed in claim 1, wherein the processor is configured to produce a convolved image, wherein the processor is configured to scan the combined image with pixels on a pixel-by-pixel or block-by-block basis by using a pixel window, to compute a weighted average of the values of the pixels of the pixel window, and to produce a convolved image from the weighted averages, wherein positions of the weighted averages in the convolved image match positions of the pixel windows in the combined image.
9. The device as claimed in claim 1, wherein the processor is configured to compare values of individual pixels of the combined image or of a convolved image derived from the combined image with a threshold, wherein the pixels with values that exhibit a certain relation to the threshold are detected as being water, or to perform feature detection to detect water.
10. The device as claimed in claim 1, wherein the additional optical sensor comprises an individual sensor comprising a polarization filter layer, wherein the polarization filter layer is configured to divide picture elements of the image produced by the individual sensor into two groups in a row-by-row, column-by-column, or checkerboard manner, wherein the first polarization filter is arranged above the picture elements of the first group, so that the picture elements of the first group produce the third image with the first polarization angle, and the second polarization filter is arranged above the picture elements of the second group, so that the picture elements of the second group produce the fourth image with the second polarization angle.
11. The device as claimed in claim 1, wherein the optical sensor and the additional optical sensor are configured in an individual sensor comprising a filter layer, wherein the filter layer is configured to divide picture elements of the image produced by the individual sensor into four groups in a row-by-row, column-by-column, or checkerboard manner, wherein the first optical bandpass filter comprising the first bandwidth is arranged above the pixels of the first group, so that the pixels of the first group produce the first image comprising the first optical bandwidth, wherein the second optical bandpass filter comprising the second bandwidth is arranged above the pixels of the second group, so that the pixels of the second group produce the second image comprising the second optical bandwidth, wherein the first polarization filter is arranged above the picture elements of the third group, so that the picture elements of the third group produce the third image with the first polarization angle, and wherein the second polarization filter is arranged above the picture elements of the fourth group, so that the picture elements of the fourth group produce the fourth image with the second polarization angle.
12. The device as claimed in claim 1, wherein the polarization angles of the polarization filters are arranged to be offset from one another by 90°.
13. The device as claimed in claim 1, wherein the processor is configured to detect regions comprising water in the combined image or in a convolved image derived from the combined image, to evaluate picture elements from the third and fourth images in the regions comprising water, and to detect an aggregate state of the water, i.e., whether the regions comprising water comprise liquid water or ice.
14. The device as claimed in claim 13, wherein the processor is configured to generate normalized intensity values of the third and fourth images when evaluating the picture elements of the third and fourth images, to use, in generating the normalized intensity values, a comparison of the values of the picture elements with a sum of the values of the picture elements of the third and fourth images, to detect the aggregate state of the water, the processor being configured to detect an aggregate state of the water as ice when the normalized intensity values of the third and fourth images are within a range comprising +/−10% of the mean value of the normalized intensity values of the third and fourth images, or to detect the aggregate state of the water as liquid water when a normalized intensity value from the normalized intensity values of the third and fourth images is greater than 1.5 times the mean value of the normalized intensity values of the third and fourth images.
15. The device as claimed in claim 1, comprising a temperature sensor configured to provide an ambient temperature value to the processor, by means of which the processor may evaluate an aggregate state of the water.
16. The device as claimed in claim 1, wherein the processor is configured to sense a set of state values comprising information from the first image and the second image, and to provide a variable threshold as a function of the set while using an artificial-intelligence element or a neural network, and to detect, while using the variable threshold, water in the combined image or an aggregate state of the water in the combined image.
17. The device as claimed in claim 1, wherein the processor is configured to sense a set of state values comprising information from the third image and from the fourth image, and to provide a variable threshold as a function of the set while using an artificial-intelligence element or a neural network, and to detect, while using the variable threshold, water in the combined image or an aggregate state of the water in the combined image.
18. The device as claimed in claim 1, wherein the optical sensor comprises the light-sensitive area for producing the first image of the surface which exhibits the first optical bandwidth, and the second image of the surface which exhibits the second optical bandwidth, wherein the additional optical sensor is configured to provide the third image with the first polarization filter, and to provide the fourth image with the second polarization filter, wherein the processor is configured to determine a difference from the first image and the second image, and to calculate the combined image while using the first polarization filter, and wherein the processor is configured to determine a further difference from the third image and the fourth image, and to calculate a polarization difference while using the first absorption filter so as to use the polarization difference for evaluating the regions comprising water.
19. Means of locomotion comprising: the device for identifying water on a surface, said device comprising: an optical sensor for producing a first image of the surface which exhibits a first optical bandwidth within which the water exhibits a first absorption rate, and a second image of the surface which exhibits a second optical bandwidth within which the water exhibits a second absorption rate that is higher than the first absorption rate; a processor for combining the first image and the second image to produce a combined image in which the surface is reduced or eliminated as compared to the water, and for detecting the water in the combined image; and an additional optical sensor configured to provide to the processor a third image with a first polarization filter comprising a first polarization angle, and to provide to the processor a fourth image with a second polarization filter comprising a second polarization angle, wherein the two polarization filters differ from each other with regard to their polarization angles, and wherein the processor is configured to detect an aggregate state of the water while using the third image and the fourth image in the combined image, and an interface, wherein the interface is configured to alert a driver of the means of locomotion and/or to influence control of the means of locomotion if the device detects a solid state of the water.
20. A method of distinguishing a liquid or solid aggregate state of water in a region comprising water, the method comprising: acquiring a first image of the surface which exhibits a first optical bandwidth within which the water exhibits a first absorption rate, acquiring a second image of the surface which exhibits a second optical bandwidth within which the water exhibits a second absorption rate that is higher than the first absorption rate, combining the first image and the second image to acquire a combined image in which the surface is reduced or eliminated as compared to the water, detecting the water in the combined image, acquiring a third image with a first polarization filter comprising a first polarization angle, acquiring a fourth image with a second polarization filter comprising a second polarization angle, wherein the two polarization filters differ from each other with regard to their polarization angles, evaluating the picture elements of the regions comprising water of the third and fourth images, and detecting the aggregate state of the water on the basis of the evaluation of the picture elements.
21. A non-transitory digital storage medium having a computer program stored thereon to perform the method of distinguishing a liquid or solid aggregate state of water in a region comprising water, the method comprising: acquiring a first image of the surface which exhibits a first optical bandwidth within which the water exhibits a first absorption rate, acquiring a second image of the surface which exhibits a second optical bandwidth within which the water exhibits a second absorption rate that is higher than the first absorption rate, combining the first image and the second image to acquire a combined image in which the surface is reduced or eliminated as compared to the water, detecting the water in the combined image, acquiring a third image with a first polarization filter comprising a first polarization angle, acquiring a fourth image with a second polarization filter comprising a second polarization angle, wherein the two polarization filters differ from each other with regard to their polarization angles, evaluating the picture elements of the regions comprising water of the third and fourth images, and detecting the aggregate state of the water on the basis of the evaluation of the picture elements, when said computer program is run by a computer.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
(1) Embodiments of the present invention will be detailed subsequently referring to the appended drawings, in which:
(2)-(26) [The individual figure descriptions are not reproduced in this text; among them, one figure shows the polarization rates of the light rays which are polarized vertically and in parallel and are reflected off the water surface at different incident angles.]
DETAILED DESCRIPTION OF THE INVENTION
(27) Before embodiments of the present invention will be explained in detail below with reference to the drawings, it shall be pointed out that identical elements, objects and/or structures having the same function or the same effect are provided with the same or similar reference numerals in the different figures, so that the descriptions of these elements which are given in the different embodiments are interchangeable, or mutually applicable.
(29) The optical sensor 110 includes a vertical polarization filter 115, a first individual sensor 123 comprising a first optical bandpass filter 133, and a second individual sensor 126 comprising a second optical bandpass filter 136. The vertical polarization filter 115 is optional.
(30) The first optical bandpass filter has a first bandwidth within which the water has a first absorption rate. The second optical bandpass filter has a second bandwidth within which the water has a second absorption rate that is higher than the first absorption rate.
(31) The device is configured to detect water on the surface 150. Light rays 160, as will be discussed further below, partially penetrate the water and are partially reflected off the water surface.
(32) The vertical polarization filter 115 is configured to reduce or eliminate the horizontally polarized rays that are reflected off the water surface in a specular manner.
(33) The non-reduced rays will arrive either at the first individual sensor 123 comprising the first optical bandpass filter 133 or at the second individual sensor 126 comprising the second optical bandpass filter 136.
(34) The optical sensor 110 is configured to produce a first image 143 by means of the first individual sensor 123 and a second image 146 by means of the second individual sensor as simultaneously as possible, so that, as far as possible, the same surface 150 is imaged in the first image and in the second image. Furthermore, the sensor is configured to provide the first image and the second image to the processor.
(35) The processor is configured to provide a combined image 130 as a combination of the first image and the second image, and to detect or locate water in the combined image, as will be discussed further below. The detected regions comprising water 140 are the outputs of the processor or the outputs of the device.
(37) The processor may be coupled to an optional additional temperature sensor 230.
(38) The optical sensor 110 includes an individual sensor 123 comprising a filter layer 233. As indicated in frontal view 233F, the filter layer is divided into two groups in a row-by-row, column-by-column, or checkerboard manner. The first optical bandpass filter having the first bandwidth is located in the first group, and the second optical bandpass filter having the second bandwidth is located in the second group.
(39) The first optical bandpass filter has a first bandwidth within which the water has a first absorption rate, and the second optical bandpass filter has a second bandwidth within which the water has a second absorption rate that is higher than the first absorption rate.
(40) The additional optical sensor 210 includes an individual sensor 243 comprising a polarization filter layer 253. Similar to the filter layer 233, the polarization filter layer 253 is divided into three groups in a row-by-row, column-by-column, or checkerboard manner. A first polarization filter having a first polarization angle is located in the first group, a second polarization filter having a second polarization angle is located in the second group, and a third polarization filter having a third polarization angle is located in the third group.
(41) The device 200 is configured to detect the aggregate state of the water on the surface 150.
(42) The optical sensor 110 is configured to produce a first image 143 having a first bandwidth and a second image 146 having a second bandwidth.
(43) The additional optical sensor 210 is configured to produce a third image 263 with a first polarization filter, a fourth image 266 with a second polarization filter, and a fifth image 269 with a third polarization filter.
(44) The optical sensor 110 and the additional optical sensor 210 are configured to provide the first, second, third, fourth, and fifth images 143, 146, 263, 266, 269 to the processor 120.
(45) The optional temperature sensor 230 is configured to provide an ambient temperature value to the processor. If the ambient temperature is well above or well below the freezing point of water, the region comprising water will be identified as being water or ice, depending on the ambient temperature.
(46) The processor 120 is configured to evaluate an absorption measurement on the basis of the first and second images 280 and to detect the water on the surface 150. Furthermore, the processor 120 is configured to evaluate a backscattering measurement while using the third, fourth, and fifth images 290 and to detect an aggregate state of the water in the regions comprising water. The evaluation method 280 of an absorption measurement and properties of the evaluation method 290 of a backscattering measurement will be further explained below.
(47) The detected aggregate states 240 of the regions comprising water are the outputs of the processor or the outputs of the device.
(49) As was explained in
(50) The interface 310 is configured to alert a driver 320 of the means of locomotion and/or to preemptively influence control of the means of locomotion if the device detects a solid state of the water.
(51) The device 200 of the means of locomotion 300 enables safe driving by preemptively identifying and locating wet areas, puddles, and ice or snow formations on the road and classifying the aggregate state when the ambient temperature is near the freezing point of water. Inference of an actual hazard or risk of traction loss is a key feature for future advanced driver assistance systems (ADAS) or autonomous driving vehicles.
(52) Alternatively,
(55) As was explained in
(56) Since other environmental materials, such as asphalt, exhibit an essentially constant absorption rate within the spectral range between 400 nm and 1200 nm, external influencing factors, such as a changing background or illumination, will affect both pictures equally.
(57) To reduce or eliminate the background, or to highlight the regions comprising water, the processor 120 of the device 100 is configured to combine the pictures 143, 146 with each other. By optimally selecting the first optical bandwidth of the first optical bandpass filter 133 and the second optical bandwidth of the second optical bandpass filter 136, the maximum possible difference in pixel intensities between the pictures is achieved within the regions comprising water.
(58) The absorption peak of water is at about 980 nm, so the second bandpass filter is used within the wavelength range between 900 nm and 1000 nm. The first bandpass filter is in principle freely selectable within the wavelength range between 400 nm and 900 nm.
(59) In an optimum selection, the first optical bandwidth may have a value ranging between 820 nm and 870 nm, 840 nm and 880 nm, or 850 nm and 890 nm, for example, at a half-width, and the second optical bandwidth may have a value ranging between 920 nm and 970 nm, 940 nm and 980 nm, or 950 nm and 990 nm, for example, at the half-width.
(61) As a starting point, a surface 510 is shown which is partially covered with water 520, such as a road with a puddle.
(62) As a first step, the optical sensor 110 of the device 200 produces a first image 143 having a first optical bandwidth within which the water 520 has a first absorption rate, and a second image 146 having a second optical bandwidth within which the water has a second absorption rate that is higher than the first absorption rate.
(63) Because of the different absorption rates, regions comprising water 520 are imaged differently in the first image 143 and the second image 146. In the second image 146, regions comprising water 520 are considerably darker.
(64) As a second step, the processor 120 of the device 200 produces a combined image 530 from the first image 143 and the second image 146. In this case, pixel-by-pixel quotient calculation is used for the pixel-by-pixel combining. Other combining methods may include pixel-by-pixel difference calculation, pixel-by-pixel normalized difference calculation, or a difference between 1 and the quotient.
(65) The combined image, or the result of the pixel-by-pixel quotient calculation, is a kind of heat map: the more a quotient value deviates from 1.0, the more likely it is that water or ice will be detected there.
(66) As a third step, the processor 120 of the device 200 detects or locates regions comprising water. Locating or detecting the regions comprising water may be performed as a final decision with a threshold on a pixel-by-pixel basis or via “feature extraction,” i.e., identification of contiguous areas. The detected regions comprising water 140 are the outputs of the evaluation method 280 of the device 200.
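The three steps above can be sketched in a few lines of Python. This is a minimal illustration, not part of the patent: the function names are invented, the intensity values follow the worked example given later in the description, and the 0.9 threshold is the one mentioned there.

```python
def combine_quotient(img1, img2):
    """Pixel-by-pixel quotient of the second image over the first.

    A value near 1.0 means similar intensity in both spectral bands
    (dry surface); values well below 1.0 indicate strong absorption in
    the second band, i.e. likely water. Images are lists of rows of
    pixel intensity values.
    """
    return [[i2 / i1 if i1 else 1.0 for i1, i2 in zip(r1, r2)]
            for r1, r2 in zip(img1, img2)]

def detect_water(combined, threshold=0.9):
    """Mark pixels whose quotient falls below the threshold as water."""
    return [[q < threshold for q in row] for row in combined]

# A surface with a puddle on its right half: intensities drop in the
# high-absorption band (img_950) where water is present, but hardly in
# the low-absorption band (img_850).
img_850 = [[200, 200, 50, 50]]
img_950 = [[198, 199, 40, 41]]
heat = combine_quotient(img_850, img_950)
water = detect_water(heat)
```

Note that the quotient stays comparable across light (200) and dark (50) background areas, which is exactly the property the description argues for below.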
(67) It should be noted here that the absorption rate of water is only about 2.0×10^−3 per millimeter of water depth. That is, for a 1 mm deep film of water with a pixel saturation of 255, the absorption will amount to only 2.0×10^−3×255=0.51 at best.
(69) For the absorption effect to be measurable at all, the adjacent pixels are also incorporated via a pixel window. The larger the pixel window, the more the absorption effect is amplified and, correspondingly, the blurrier the resulting convolved image, or water heat map, becomes. For example, a puddle that covers only a single pixel, or too few pixels, will not be identified.
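The pixel-window averaging described above (claim 8 calls it a convolved image) can be sketched as follows; a hedged illustration with an invented function name, using a uniform weighted average and clipping the window at the image border:

```python
def convolve_window(img, k=3):
    """Scan the combined image with a k-by-k pixel window and replace
    each pixel by the average of the window, so that the weak absorption
    signal of neighboring pixels accumulates. Windows are clipped at the
    image border. Larger k amplifies the effect but blurs the result."""
    h, w = len(img), len(img[0])
    r = k // 2
    out = []
    for y in range(h):
        row = []
        for x in range(w):
            window = [img[yy][xx]
                      for yy in range(max(0, y - r), min(h, y + r + 1))
                      for xx in range(max(0, x - r), min(w, x + r + 1))]
            row.append(sum(window) / len(window))
        out.append(row)
    return out
```

A single dark pixel (e.g. a one-pixel puddle) is averaged away by its eight dry neighbors, which is precisely why too-small puddles are not identified.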
(70) Furthermore, the difference between a simple difference calculation and a simple quotient calculation will be explained in more detail in the following example.
(71) As a starting point, a surface having light and dark areas is selected which is covered by a water film. Within a spectral range within which water absorbs negligibly, a measured value of e.g. 200 is obtained for the light area and e.g. 50 for the dark area. Within the spectral range within which water absorbs to a noticeable extent, one obtains readings of, e.g., 0.8×200=160 for the light area and 0.8×50=40 for the dark area.
(72) The difference would now be 40 for the light area and 10 for the dark area, which would lead to the wrong conclusion that there is more water present on the light area.
(73) However, the quotient results in the value of 0.8 for both areas. A dry area corresponds to the value of 1.0 in the case of a quotient calculation.
(74) For example, pixel-by-pixel normalization of the difference would also be possible, e.g.
(75) (i1(x,y)−i2(x,y))/i1(x,y),
which is simply 1 minus the quotient. The values i1(x,y) and i2(x,y) represent individual pixel intensity values of the first and second images at the position x, y.
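The worked example can be checked numerically. A minimal Python sketch (not part of the patent; the function names are invented) showing why the plain difference misleads while the quotient and the normalized difference do not:

```python
def quotient(i1, i2):
    """Pixel-wise quotient i2/i1; 1.0 corresponds to a dry area."""
    return i2 / i1

def normalized_difference(i1, i2):
    """(i1 - i2) / i1, i.e. 1 minus the quotient; 0.0 means dry."""
    return (i1 - i2) / i1

# The same water film (80 % transmission) over a light area (200) and a
# dark area (50): the plain differences are 40 vs. 10 and would wrongly
# suggest more water on the light area, while the quotient is 0.8 and
# the normalized difference 0.2 for both areas.
light_diff = 200 - 160
dark_diff = 50 - 40
```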
(78) In the example, the following bandpass filters were used: 850 nm CWL, 50 nm FWHM bandpass filter with a 400-2200 nm vertical polarization filter with a contrast of 400:1, and 950 nm CWL, 50 nm FWHM bandpass filter with a 400-2200 nm vertical polarization filter with a contrast of 400:1.
(79) The combined image was evaluated with a simple threshold. The threshold decision may be defined while using, for example, the following function:
min(round(i(x,y)−0.4),1.0)
(80) Dry regions or regions of milky ice are light or white in the picture and have a value greater than 0.9. Values less than 0.9 are dark or black in the picture and are identified as water. Absorption of clear/wet ice behaves like water.
(81) Light cannot penetrate milky ice or snow, which is why milky ice or snow exhibits no measurable absorption effect.
(82) Regions comprising water 610b in the combined image are clearly visible. Furthermore, reflections of trees on the water surface hardly have an effect on the absorption measurement. The combined image corrects the measurements for changing external influences such as subsurface material, lighting, or a changing environment reflected in the water body.
(83) Clear ice behaves like water.
(84) This absorption measurement method copes with unknown light sources. The only requirement is that the light source has enough energy within the near infrared (NIR) range, i.e. below 1000 nm. In sunlight, there is no problem at all, and at night, car headlights such as halogen lamps, for example, are sufficient.
(85) The absorption peak of water is around 980 nm, so the second bandpass filter is used within the wavelength range between 900 nm and 1000 nm.
(86) The first bandpass filter is in principle freely selectable within the wavelength range between 400 nm and 900 nm. In embodiments, the first bandpass filter is within the wavelength range between 800 nm and 900 nm.
(87) Thus, simply put, two colors are compared; however, the optical bandpass filters 133, 136 that are used are not within the visible range. The optical sensors 110 used, e.g. CMOS sensors, have no Bayer filter coating and are thus monochrome. The three colors red, green and blue (R-G-B) are themselves nothing but bandpass filters, within the wavelength ranges of approximately 600-700 nm, 500-600 nm and 400-500 nm, respectively.
(92) The polarization rates of the light rays that are polarized vertically and in parallel and are reflected off the water surface are shown at different incidence angles 740. The polarization rate has a maximum value at the Brewster angle 750, i.e., at about 53°.
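The roughly 53° figure follows from Brewster's law for an air-water interface. A small numeric check (illustrative only; the variable names are invented, and a refractive index of 1.33 for water is assumed):

```python
import math

# Brewster's law: theta_B = arctan(n2 / n1) for light passing from
# medium 1 (air) toward medium 2 (water). At this incidence angle the
# specularly reflected light is fully horizontally polarized.
n_air, n_water = 1.0, 1.33
theta_b = math.degrees(math.atan(n_water / n_air))  # about 53 degrees
```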
(94) Only the penetrating light rays 720 are affected by the absorption of the water. The horizontally polarized rays 730 that are reflected in a specular manner are not relevant to, or even interfere with, absorption measurement. Reduction or elimination of the horizontally polarized rays 730 that are reflected in a specular manner is possible with a vertical polarization filter, such as the vertical polarization filter 115 in
(95) As is shown in
(96) If an image sensor of a means of locomotion, e.g. the means of locomotion 300 in
(97) In the following, the backscattering measurement will be explained. The evaluation 290 of the backscattering measurement is performed by the processor 120 of the device 200 while using the third, fourth, and fifth images, the three images differing from one another with regard to their polarization planes.
(98) The polarization angles of the images may be represented as a polarization ellipse.
(101) As was explained in
(102) Ice crystals scatter the light ray and rotate the polarization ellipse. The result is a more scattered polarization with a slightly shifted orientation.
(103) The rotation of the polarization ellipse depends on the illumination. With homogeneous illumination, the rotation of the polarization ellipse is also homogeneous and may be used as a feature.
(107) The following polarization filters were used in the example: 400-2200 nm polarization filter with a contrast of 400:1 at 0° (horizontal). 400-2200 nm polarization filter with a contrast of 400:1 at 60°. 400-2200 nm polarization filter with a contrast of 400:1 at 120°.
(108) The backscattering measurement is independent of the absorption measurement. Here, the three pictures taken are normalized to the average of the three pictures on a pixel-by-pixel basis and compared with one another. In normalization, the following equation is used:
(109) p1N = p1/((p1+p2+p3)/3),
where p1N represents a normalized pixel of the first image, and p1, p2 and p3 represent individual picture-element values of the third, fourth and fifth images.
(110) A simple threshold decision in this case would be, for example: if p0 << p60 and p0 << p120, then water; if p0 ≈ p60 ≈ p120, then milky ice/snow; if p0 > p60 ≈ p120, then clear ice. The values p0, p60 and p120 represent individual picture-element values of the third, fourth and fifth images.
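The threshold decision above might be sketched as follows. This is an illustrative toy, not the patent's implementation: the function name and the `ratio` and `tol` parameters are invented placeholders for the "much smaller" and "approximately equal" conditions.

```python
def classify_backscatter(p0, p60, p120, ratio=0.5, tol=0.1):
    """Toy version of the threshold decision on the three polarization
    images (0 deg, 60 deg, 120 deg). The picture-element values are first
    normalized to their mean, as in the normalization equation above;
    'ratio' and 'tol' are illustrative thresholds, not from the patent."""
    m = (p0 + p60 + p120) / 3
    n0, n60, n120 = p0 / m, p60 / m, p120 / m
    if n0 < ratio * n60 and n0 < ratio * n120:
        return "water"            # p0 << p60 and p0 << p120
    if (abs(n0 - n60) <= tol and abs(n60 - n120) <= tol
            and abs(n0 - n120) <= tol):
        return "milky ice/snow"   # p0 ~= p60 ~= p120
    if n0 > n60 and abs(n60 - n120) <= tol:
        return "clear ice"        # p0 > p60 ~= p120
    return "undetermined"
```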
(111) In the first approaches, heat maps of the individual methods are simply combined. The backscattering method measures surface scattering and would interpret any smooth object as water. The absorption method would ignore this.
(112) What is at hand is liquid water only if the absorption rate is high and the backscattering measurement ascertains little scattering or a smooth surface.
(113) In the case of clear ice, the logic is different. What is at hand is clear ice when the absorption rate is high, i.e. water is present, while the backscattering measurement shows high scattering.
(114) Turbid ice/snow is poorly detected by absorption but is all the more easily detected by backscattering, since it exhibits no absorption but very high scattering.
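The three fusion rules above can be written out directly. The following is a minimal sketch, assuming both heat maps are per-pixel arrays normalized to [0, 1]; the threshold values are illustrative assumptions, not taken from the text:

```python
import numpy as np

def fuse(absorption, scattering, thr_a=0.5, thr_s=0.5):
    """Combine the two independent per-pixel heat maps.

    absorption: water absorption rate estimate from the absorption method.
    scattering: surface scattering estimate from the backscattering method.
    """
    result = np.full(np.shape(absorption), "dry", dtype=object)
    # High absorption + little scattering (smooth surface) -> liquid water.
    result[(absorption >= thr_a) & (scattering < thr_s)] = "water"
    # High absorption + high scattering -> clear ice.
    result[(absorption >= thr_a) & (scattering >= thr_s)] = "clear ice"
    # No absorption but very high scattering -> turbid ice / snow.
    result[(absorption < thr_a) & (scattering >= thr_s)] = "turbid ice/snow"
    return result
```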
(115) The absorption method may be improved by using algorithms that assess neighboring pixels, and by using trained decision models that take the entire surroundings into account. To cover the different cases, a model is trained here by means of machine learning. For image sections of a size of 32×32 pixels, the recognition rate was 98%, and was still a very good 85% when cross-validated with field-test data. When using a neural network that also evaluates the shape, and when training with field-test data, more than 95% is again achieved.
(117) The filter layer is evenly divided, on a pixel-by-pixel basis, into four different filter groups F1, F2, F3, F4. For example, filter groups F1 and F2 may include a bandpass filter, and filter groups F3 and F4 may include a polarization filter.
(118) Light rays 1040 will arrive at the filter layer 1010. Pixels 1030 of image sensor 1020 will receive filtered light rays from different filter groups F1, F2, F3, F4 of the filter layer 1010. The image produced by the individual sensor is divided into four pictures 1060, 1070, 1080, 1090 by filter groups F1, F2, F3, F4, and the images are provided to the processor.
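Splitting the single sensor image into the four pictures can be sketched as follows, assuming (the exact layout is not stated in the text) that the filter groups F1-F4 tile the sensor as a repeating 2×2 mosaic, analogous to a Bayer pattern:

```python
import numpy as np

def split_filter_groups(raw):
    """Divide the raw frame of image sensor 1020 into the four pictures
    1060, 1070, 1080, 1090 according to filter groups F1..F4
    (assumed repeating 2x2 mosaic layout)."""
    f1 = raw[0::2, 0::2]  # e.g. first bandpass filter
    f2 = raw[0::2, 1::2]  # e.g. second bandpass filter
    f3 = raw[1::2, 0::2]  # e.g. first polarization filter
    f4 = raw[1::2, 1::2]  # e.g. second polarization filter
    return f1, f2, f3, f4
```

Each of the four pictures then has half the resolution of the raw frame in each dimension.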
(119) The optical sensor 1000 may replace the sensors 110 and 210 in the device 200.
(120) Even though some aspects have been described within the context of a device, it is understood that said aspects also represent a description of the corresponding method, so that a block or a structural component of a device is also to be understood as a corresponding method step or as a feature of a method step. By analogy therewith, aspects that have been described in connection with or as a method step also represent a description of a corresponding block or detail or feature of a corresponding device. Some or all of the method steps may be performed by a hardware device (or while using a hardware device). In some embodiments, some or several of the most important method steps may be performed by such a device.
(122) For the evaluation method, ΔXφ and ΔXλ may be used as they appear in brackets on the left-hand side of the block of equations 1105. In the 4-camera/filter system, these quantities are obtained directly. In the present 3-camera system shown in
(123) It shall be noted that it is not relevant whether the arrangement comprises three dedicated cameras or a single chip having three light-sensitive areas (pixels) 1101, 1102, 1103.
(124) In the advantageous embodiment in
(125) φ1=0° (horizontally polarized)
(126) φ2=90° (vertically polarized)
(127) λ1=850 nm (bandpass 50 nm FWHM)
(128) λ2=950 nm (bandpass 50 nm FWHM)
(129) The advantageous device thus includes the optical sensor having the light-sensitive area (1101, 1102, 1103) for producing the first image (X.sub.01) of the surface which has the first optical bandwidth (λ.sub.1), and the second image (X.sub.02) of the surface which has the second optical bandwidth (λ.sub.2). Moreover, the additional optical sensor is configured to provide the third image (X.sub.01) having the first polarization filter, and to provide the fourth image (X.sub.03) having the second polarization filter to the processor. In addition, the processor is configured to determine a difference (X.sub.01−X.sub.02) from the first image and the second image, and to calculate the combined image while using the first polarization filter. Furthermore, the processor is configured to determine a further difference (X.sub.01−X.sub.03) from the third image and the fourth image, and to calculate a polarization difference (ΔXφ) while using the first absorption filter so as to use the polarization difference (ΔXφ) for evaluating the regions comprising water.
(130) The first image mentioned for the purpose of this application is, in the case of the 3-camera/element solution described, the same image as the third image. However, the second image and the fourth image are different images of different light-sensitive areas 1102 and 1103. In addition, it shall be noted that the combined image thus cannot necessarily be obtained by a simple difference but may be obtained by processing a difference, possibly while using a filter as in the embodiment of
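Under the naming of this embodiment (X01 at λ1/φ1 serving as both the first and the third image, X02 at λ2, X03 at φ2), the two difference images can be sketched as plain pixel-wise subtractions. This is a simplification: as noted above, the combined image may require further processing of the difference, e.g. filtering. The assignment of ΔXλ to the wavelength difference and ΔXφ to the polarization difference is assumed here:

```python
import numpy as np

def evaluation_differences(x01, x02, x03):
    """Form the two difference images of the 3-camera/element evaluation.

    x01: picture at lambda_1 with polarization phi_1 (first == third image),
    x02: picture at lambda_2, x03: picture with polarization phi_2.
    """
    delta_x_lambda = x01 - x02  # absorption (wavelength) difference
    delta_x_phi = x01 - x03     # polarization difference
    return delta_x_lambda, delta_x_phi
```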
(131) Depending on particular implementation requirements, embodiments of the invention may be implemented in hardware or in software. The implementation may be performed using a digital storage medium, for example, a floppy disk, a DVD, a Blu-ray disc, a CD, a ROM, a PROM, an EPROM, an EEPROM, or a FLASH memory, a hard disk, or any other magnetic or optical storage medium on which electronically readable control signals are stored, which may interact, or actually do interact, with a programmable computer system in such a way as to perform the respective method. Therefore, the digital storage medium may be computer readable.
(132) Thus, some embodiments according to the invention include a storage medium having electronically readable control signals capable of interacting with a programmable computer system such that any of the methods described herein are performed.
(133) Generally, embodiments of the present invention may be implemented as a computer program product having program code, the program code being operative to perform any of the methods when the computer program product is running on a computer.
(134) For example, the program code may also be stored on a machine-readable medium.
(135) Other embodiments include the computer program for performing any of the methods described herein, wherein the computer program is stored on a machine-readable medium.
(136) In other words, an embodiment of the method of the invention is thus a computer program comprising program code for performing any of the methods described herein when the computer program runs on a computer.
(137) Thus, another embodiment of the methods according to the invention is a data carrier (or a digital storage medium or a computer-readable medium) on which the computer program for performing any of the methods described herein is recorded.
(138) Thus, another embodiment of the method of the invention is a data stream or sequence of signals representing the computer program for performing any of the methods described herein. The data stream or sequence of signals may, for example, be configured to be transferred over a data communication link, such as over the Internet.
(139) Another embodiment comprises a processing device, such as a computer or programmable logic device, configured or adapted to perform any of the methods described herein.
(140) A further embodiment comprises a computer having installed thereon the computer program for performing any of the methods described herein.
(141) Another embodiment according to the invention comprises a device or system configured to transmit a computer program for performing at least one of the methods described herein to a receiver. The transmission may be, for example, electronic or optical. The receiver may be, for example, a computer, mobile device, storage device, or similar device. The device or system may include, for example, a file server for transmitting the computer program to the receiver.
(142) In some embodiments, a programmable logic device (for example, a field programmable gate array, an FPGA) may be used to perform some or all of the functionalities of the methods described herein. In some embodiments, a field programmable gate array may interact with a microprocessor to perform any of the methods described herein. In general, in some embodiments, the methods are performed on the part of any hardware device. This may be general-purpose hardware, such as a computer processor (CPU), or hardware specific to the method, such as an ASIC.
(143) While this invention has been described in terms of several embodiments, there are alterations, permutations, and equivalents which fall within the scope of this invention. It should also be noted that there are many alternative ways of implementing the methods and compositions of the present invention. It is therefore intended that the following appended claims be interpreted as including all such alterations, permutations and equivalents as fall within the true spirit and scope of the present invention.
LITERATURE
(144)
A. L. Rankin, L. H. Matthies, and P. Bellutta (2011), "Daytime water-detection based on sky reflections", Robotics and Automation (ICRA), 2011 IEEE International Conference on, IEEE, 2011, pp. 5329-5336.
C. V. Nguyen, M. Milford, R. Mahony (2017), "3D tracking of water hazards with polarized stereo cameras", 2017 IEEE International Conference on Robotics and Automation (ICRA), Singapore, May 29-Jun. 3, 2017.
V. Vikari, T. Varpula, and M. Kantanen (2009), "Road-condition recognition using 24-GHz automotive radar", IEEE Transactions on Intelligent Transportation Systems, vol. 10, no. 4, pp. 639-648, 2009.
J. Casselgren, S. Rosendahl, M. Sjödahl, P. Jonsson (2015), "Road condition analysis using NIR illumination and compensating for surrounding light", Optics and Lasers in Engineering 77 (2016), pp. 175-182.
L. Colace, F. Santoni, G. Assanto, "A near-infrared optoelectronic approach to detection of road conditions", NooEL-Nonlinear Optics and OptoElectronics Lab, University "Roma Tre", Via della Vasca Navale 84, 00146 Rome, Italy.
K. Maček, B. Williams, S. Kolski, R. Siegwart, "A Lane Detection Vision Module for Driver Assistance".
A. Rankin and L. Matthies, "Daytime Water Detection Based on Color Variation".
L. Matthies, P. Bellutta, M. McHenry, "Detecting water hazards for autonomous off-road navigation".
D. Gailius, S. Jačėnas, "Ice detection on a road by analyzing tire to road friction ultrasonic noise".
M. Riehm, T. Gustavsson, J. Bogren, P.-E. Jansson, "Ice formation detection on road surfaces using infrared thermometry".
J. Kim, J. Baek, H. Choi, and E. Kim, "Wet Area and Puddle Detection for Advanced Driver Assistance Systems (ADAS) Using a Stereo Camera".
D. Gregoris, S. Yu, and F. Teti, "Multispectral Imaging of Ice".
H. Abdel-Moati, J. Morris, Y. Zeng, M. W. Corie II, V. G. Yanni, "Near field ice detection using infrared based optical imaging technology".
A. Troiano, E. Pasero, and L. Mesin, "New System for Detecting Road Ice Formation".
S. Nakauchi, K. Nishino, and T. Yamashita, "Selection of optimal combinations of band-pass filters for ice detection by hyperspectral imaging".
A. Piccardi and L. Colace, "A Sensor for the Optical Detection of Dangerous Road Condition".
B. Xie, H. Pan, Z. Xiang, J. Liu, "Polarization-Based Water Hazards Detection for Autonomous Off-road Navigation".
S. Achar, B. Sankaran, S. Nuske, S. Scherer, and S. Singh, "Self-Supervised Segmentation of River Scenes".