METHOD FOR MEASURING A PARTICLE PRECIPITATION RATE, AND DEVICE THEREOF
20170363776 · 2017-12-21
Abstract
A method for measuring a particle precipitation rate includes the steps of: acquiring at least one first image during a precipitation event through an image acquisition device having a sensor and a lens; detecting the particles of the precipitation in the at least one first image by subtracting a background of the first image and setting a brightness threshold for detecting the particles, the particles being visible as a plurality of streaks in the image, wherein a first portion of the plurality of streaks comprises blurred streaks and a second portion of the plurality of streaks comprises focused streaks; determining an apparent diameter and an apparent length for the plurality of streaks; estimating an actual diameter and an actual length for the plurality of streaks by solving a system of three equations in three unknowns, namely the actual diameter, the actual length and a depth position of the plurality of streaks, the depth position being the distance of each particle from the lens, in which a first equation gives the actual diameter as a function of the depth position, a second equation gives the actual length as a function of the depth position, and a third equation equates the theoretical terminal velocity of the particles to an estimated velocity of the particles as a function of the depth position; estimating the velocity of the particles as the ratio between a net streak length and the exposure time used to take the at least one first image; and estimating the particle precipitation rate from the actual diameter and the velocity (v).
Claims
1. A method for measuring a particles' precipitation rate, said method comprising the steps of: acquiring at least one first image during a precipitation event through an image acquisition device having a sensor and a lens; detecting said particles of said precipitation in said at least one first image by subtracting a background of said first image and setting a brightness threshold for detecting said particles, said particles being visible as a plurality of streaks in said image, wherein a first portion of said plurality of streaks comprises blurred streaks and a second portion of said plurality of streaks comprises focused streaks; determining an apparent diameter and an apparent length for said plurality of streaks; estimating an actual diameter and an actual length for said plurality of streaks by solving a system of three equations in three unknowns, namely said actual diameter, said actual length and a depth position of said plurality of streaks, said depth position being the distance of each particle from said lens, in which a first equation gives said actual diameter as a function of said depth position, a second equation gives said actual length as a function of said depth position, and a third equation equates the theoretical terminal velocity of said particles to an estimated velocity of said particles as a function of said depth position; estimating said velocity of said particles as the ratio between a net streak length and an exposure time used to take said at least one first image; and estimating said particles' precipitation rate from said actual diameter and said velocity.
2. The method according to claim 1, further comprising acquiring a second image through said image acquisition device and subtracting said second image from said first image to delete a background of said first image.
3. The method according to claim 1, wherein said method provides for determining said depth position of a particle in a sampled volume.
4. The method according to claim 1, wherein said velocity of said particles is estimated based on said depth position of the particles.
5. The method according to claim 1, wherein said particles' precipitation rate is obtained by averaging precipitation rates of a set of acquired images.
6. The method according to claim 1, wherein said particles are in liquid or solid state and comprise one or more of the elements of the group consisting of: hydrometeors, settling particles, chemical particles.
7. A computer product loadable into a memory of an image acquisition device, comprising portions of software code adapted to implement the method of claim 1.
Description
[0018] The above objects will become more apparent from the following detailed description of a method and a device for measuring a particle precipitation rate according to the present invention, with particular reference to the annexed drawings, wherein:
[0026] An image acquisition device (not shown in the figures) according to the present invention can be a camera able to shoot images and/or videos, a smartphone or a tablet having a camera, a webcam, and so on. The image acquisition device comprises at least one processor, a lens and a sensor arranged for taking images and/or videos, and memory means for storing said images and/or videos.
[0027] With reference to
[0034] It is important to specify that the focused streaks are taken into account for the computation of the particle precipitation rate as well. Indeed, the diameters and lengths of focused streaks are directly derivable from the first image, because they are not affected by the blur.
[0035] The method of the present invention further comprises the step of acquiring a second image (taken at a different instant from the first image) through said image acquisition device and subtracting said second image from said first image to delete a background of said first image. If three images are available (taken at three different instants), the noise removal from the first image is more efficient. Deleting the background from the first image makes it possible to detect particles in the first image. Moreover, a brightness threshold is set for detecting particles.
[0036] The method according to the present invention will be clearer from the following description. In the following, the method is described in detail considering raindrops as particles and a camera having a sensor as the image acquisition device. Consider the camera observing a volume of rain. Rain produces sharp intensity changes in pictures/images and videos. Rain visibility strongly depends on camera parameters. For instance, it is rather easy to verify that, at short exposure times (~0.001 s), rain appears in the form of stationary drops (an example is shown on
[0037] Different rain streaks have different diameters (i.e., width or size), lengths and intensities, depending on drop characteristics and camera parameters. The present invention advantageously exploits streak characteristics to quantitatively derive the drop diameter, the drop velocity, and the rain rate (i.e., the particle precipitation rate).
[0038] With reference to
1/f=1/z.sub.0+1/f.sub.1 (2)

expresses the relation among f, f.sub.1 and z.sub.0.
[0039] As aforementioned, the method of the present invention comprises the step of acquiring a second image (taken at a different instant from the first image) through the image acquisition device and subtracting the second image from the first image to delete a background of the first image. If three images are available (taken at three different instants), the noise removal from the first image is more efficient.
[0040] In the next step, the method provides for detecting particles of the precipitation in at least one first image. Particles are visible as a plurality of streaks in the first image and a first portion of said plurality of streaks comprises blurred streaks, and a second portion of said plurality of streaks comprises focused streaks (i.e. not affected by blur).
[0041] Detecting particles is carried out by finding candidate rain pixels in at least one image. Thus, in this example, the method provides for finding candidate rain pixels by referring to couples (or triplets) of frames, i.e. images taken at two (or three) adjacent time steps (e.g., j−1, j and j+1). Subsequently, the brightness intensity I (e.g., 0 ≤ I ≤ 255) at each pixel in the first image Im.sub.j is compared with the corresponding pixel intensity in the second image Im.sub.j−1 (and in the third image Im.sub.j+1). If the background remains stationary in said images, then the conditions of the following equation (1)
I(Im.sub.j)−I(Im.sub.j−1)>S.sub.1 & I(Im.sub.j)−I(Im.sub.j+1)>S.sub.1 (1)
[0042] can be used to detect candidate drops with reference to the first image Im.sub.j, where S.sub.1 is a brightness threshold representing the minimum change in intensity due to a drop and the symbol & represents the logical AND condition. The latter condition applies only if three images are acquired. In other words, the method provides for setting the brightness threshold S.sub.1 for detecting particles.
[0043] When three images are available, both conditions in equation (1) must be met in order to detect and select real drops. More in detail, pixels that meet just one of the two conditions of equation (1) should not be associated with raindrops, but with random noise in the first image (i.e., apparent particles or irregular borders).
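By way of non-limiting illustration, the frame-differencing test of equation (1) can be sketched in Python as follows; the function name and the default threshold value s1=3 are illustrative and not taken from the text:

```python
import numpy as np

def detect_candidate_rain_pixels(im_prev, im_curr, im_next, s1=3):
    """Flag pixels that are brighter in the current frame than in both
    neighbouring frames by more than the threshold s1 (equation (1)).

    im_prev, im_curr, im_next: 2-D grayscale arrays (frames j-1, j, j+1);
    s1 is the minimum brightness change attributable to a drop.
    """
    curr = im_curr.astype(np.int16)  # widen to avoid unsigned underflow
    cond_prev = curr - im_prev.astype(np.int16) > s1
    cond_next = curr - im_next.astype(np.int16) > s1
    return cond_prev & cond_next  # boolean mask of candidate rain pixels
```

With two frames only, the first condition alone would be used, at the cost of more false positives.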
[0044] The isolation of candidate rain pixels along a focused rain streak allows one to detect:
[0045] a) the number of candidate raindrops within the gauged volume by counting the number of streaks;
[0046] b) the drop diameter D.sub.P (in pixels) by setting it to the average width of the streak; and
[0047] c) the drop velocity, which is proportional to the ratio of the net streak length to the exposure time t.sub.e.
[0048] The net streak length is obtained by subtracting one drop diameter D.sub.P from the total length L.sub.P (in pixels) of the streak as it appears in the acquired image. Indeed, the velocity of a moving object is proportional to the distance covered in a time step by a fixed point of the object; taking the raindrop centre as the fixed point, it covers L.sub.P−D.sub.P pixels in a time t.sub.e, while the total length of the streak is L.sub.P, because the drop occupies D.sub.P/2 pixels above the drop centre and D.sub.P/2 pixels below it.
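The net-streak-length reasoning above can be sketched as follows (a hypothetical helper; the physical pixel size at the drop's distance is taken as an input):

```python
def drop_velocity(l_p, d_p, pixel_size_mm, t_e):
    """Estimate drop velocity (m/s) from a streak.

    l_p           : total streak length in pixels (L_P)
    d_p           : drop diameter in pixels (D_P, average streak width)
    pixel_size_mm : physical size of one pixel at the drop's distance (mm)
    t_e           : exposure time in seconds

    The drop centre travels L_P - D_P pixels during the exposure.
    """
    net_length_mm = (l_p - d_p) * pixel_size_mm
    return net_length_mm / 1000.0 / t_e  # mm -> m, divided by seconds
```

For example, a 100-pixel streak of a 10-pixel-wide drop, with 0.5 mm pixels and a 5 ms exposure, yields 9 m/s.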
[0049] For non-stationary backgrounds, e.g., vegetation with leaves moved by the wind, equation (1) is not effective in detecting candidate drops. In this case the subtraction of two frames does not guarantee the removal of the false positives created by the visual effects of light interacting with moving surfaces. False positives can be removed with specific post-processing algorithms that, for instance, verify the presence/absence of the sub-vertical preferential directions ascribable to the effect of rain streaks.
[0050] The appearance of rain in the acquired image (first image) is significantly affected by blur. The blur effect is caused by a cone of light rays from a lens not coming to a perfect focus when imaging a point source. Thus, the next step of the method is determining an apparent diameter (D.sub.P,b) and an apparent length (L.sub.P,b) of the blurred streaks.
[0051] With reference to
[0052] The diameter of the blur circle (or circle of confusion, c.sub.P(z), in pixels) is obtained from the diameter of the auxiliary blur circle C and the magnification factor f.sub.1/z.sub.0, where z.sub.0 is the distance of the focus plane from the lens 13. C is obtained via similar triangles as

C(z)=A·|z−z.sub.0|/z

The blur circle in the image plane can hence be written as

c(z)=C(z)·f.sub.1/z.sub.0=(A·f.sub.1/z.sub.0)·|z−z.sub.0|/z

which exemplifies the dependence of the blur magnitude on z, z.sub.0, f.sub.1 and the aperture diameter A. By setting z=z.sub.0 in the expression for c, the blur effect is null on the focus plane. The diameter of the blur in pixels (c.sub.P(z)), to be compared with the drop diameters and streak lengths, is obtained as

c.sub.P(z)=c(z)·H.sub.P/h

where H.sub.P is the image height in the focus plane (in pixels) and h is the sensor height in millimetres (see
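As a non-limiting sketch, the blur-circle computation can be written as follows, assuming the auxiliary blur circle C=A·|z−z.sub.0|/z in the focus plane (consistent with the C=A/2 value at the volume bounds mentioned in paragraph [0065]), scaled to the sensor by f.sub.1/z.sub.0 and to pixels by H.sub.P/h:

```python
def blur_circle_pixels(z, z0, aperture_a, f1, h_sensor_mm, h_pixels):
    """Diameter of the blur circle c_P(z) in pixels for a drop at
    distance z from the lens, with the focus plane at z0.

    Sketch under the stated assumptions; aperture_a is the aperture
    diameter A, f1 the lens-sensor distance, h_sensor_mm the sensor
    height in mm, h_pixels the image height in pixels (H_P).
    """
    c_aux = aperture_a * abs(z - z0) / z       # blur circle, focus plane
    c_sensor = c_aux * f1 / z0                 # projected on the sensor (mm)
    return c_sensor * h_pixels / h_sensor_mm   # converted to pixels
```

On the focus plane (z=z0) the function returns 0, i.e. no blur, as the text requires.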
[0053] With reference to
[0054] The drop detection in turn acts as a further threshold filter (line 17 in
where
and ΔI is the maximum positive brightness impulse due to a drop (typically ΔI=50). It is to be noted that, in the presence of blur, drops that are either very small or very distant from the focus plane may produce small brightness variations in the image, which may be indistinguishable from the random noise. This effect, in
[0055] From the image analysis, the actual diameter D.sub.P and the actual length L.sub.P of each drop present in the image can be obtained; to obtain D.sub.P and L.sub.P, the system of equations (3) has to be solved. However, in these equations there is a third unknown quantity, the distance z from the lens, i.e. the depth position. One more equation is thus needed to position the drops at the right distance z from the lens and infer the blur magnitude. In other words, the method of the present invention provides for estimating the actual diameter D.sub.P and the actual length L.sub.P of the blurred streaks using the actual depth position z of the blurred streaks, the apparent diameter D.sub.P,b and the apparent length L.sub.P,b. Said depth position z is the actual distance of each particle (e.g., raindrop) from the lens of the camera.
[0056] To set the third equation, the present invention provides for estimating the velocity v of the particles, e.g., raindrops, as the ratio of the net streak length (preferably calculated as L.sub.P−D.sub.P multiplied by the pixel dimension d.sub.P(z), to express it in millimetres) to the exposure time t.sub.e used to take the at least one first image. More in detail, the step of estimating the actual diameter D.sub.P and the actual length L.sub.P of the blurred streaks provides for solving a system of three equations in three unknowns, namely said actual diameter D.sub.P, said actual length L.sub.P and said actual depth position z, in which a first equation gives said actual diameter D.sub.P as a function of the depth position z, a second equation gives said actual length L.sub.P as a function of the depth position z (see equation (3)), and a third equation equates the theoretical terminal velocity of said particles to the estimated (drop) velocity v of said particles as a function of the depth position z (see equation (4) below).
[0057] The estimated drop velocity v is set equal to the drop terminal speed (theoretical terminal velocity), expressed as

v.sub.t=[4·g·D·(ρ−ρ.sub.a)/(3·C.sub.D·ρ.sub.a)].sup.1/2

where C.sub.D is the drag coefficient, approximately equal to 0.5 for a sphere, ρ is the water density, ρ.sub.a is the air density and g is the gravitational acceleration. The terminal velocity is also a function of the drop diameter, being D=D.sub.P·d.sub.P(z). The diameters and streak lengths can be expressed in millimetres, while the other variables are in SI (International System) units.
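The drag/gravity balance for a falling sphere can be sketched as follows; the default densities and g are illustrative values, not taken from the text:

```python
import math

def terminal_velocity(d_mm, c_d=0.5, rho_w=1000.0, rho_a=1.2, g=9.81):
    """Theoretical terminal velocity (m/s) of a spherical drop of
    diameter d_mm (millimetres), from the balance between drag and
    gravity: v = sqrt(4*g*D*(rho_w - rho_a) / (3*c_d*rho_a)).

    c_d   : drag coefficient (about 0.5 for a sphere)
    rho_w : water density (kg/m^3), rho_a : air density (kg/m^3)
    """
    d = d_mm / 1000.0  # mm -> m
    return math.sqrt(4.0 * g * d * (rho_w - rho_a) / (3.0 * c_d * rho_a))
```

For a 2 mm drop this yields roughly 6.6 m/s, in line with observed raindrop fall speeds.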
[0058] The equivalence between the two expressions for the drop speed reads:

(L.sub.P−D.sub.P)·d.sub.P(z)/t.sub.e=[4·g·D.sub.P·d.sub.P(z)·(ρ−ρ.sub.a)/(3·C.sub.D·ρ.sub.a)].sup.1/2 (4)
[0059] The dependence on the depth position z is expressed through D.sub.P and L.sub.P (see equation (3)) and through the variation of the pixel dimension d.sub.P with the distance z of the object from the lens, expressed as

d.sub.P(z)=h·z/(f.sub.1·H.sub.P) (5)
[0060] In the following, equation (4) is assumed to be valid for all drop diameters, because very small drops play a minor role in the rain rate estimation.
[0061] The present invention provides for positioning drops along the dimension z, i.e. moving away from the lens 13; in other words, it is possible to infer the third dimension from intrinsically two-dimensional information (the image). The position of the drops in z is obtained by substituting equation (5) into equation (4) and solving the system of equations (3) and (4), where the unknowns are the actual diameter D.sub.P, the actual length L.sub.P and the depth position z. The equation in z to be solved, in squared form, hence reads:
α.sup.2·z.sup.2=β·z−γ·|z−z.sub.0| (6)
where:
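Once α, β and γ are known (their full expressions are omitted here), equation (6) can be solved by splitting the absolute value into the two branches z≥z.sub.0 and z<z.sub.0, each an ordinary quadratic. A non-limiting sketch, treating the coefficients as given:

```python
import math

def solve_depth_candidates(alpha, beta, gamma, z0):
    """Solve alpha^2 z^2 = beta*z - gamma*|z - z0| (equation (6)).

    Splits the absolute value into its two branches, solves each
    quadratic, and keeps only the positive roots consistent with
    their branch. Returns the admissible depths sorted ascending.
    """
    roots = []
    a = alpha ** 2
    branches = (
        # z >= z0:  a*z^2 - (beta - gamma)*z - gamma*z0 = 0
        (-(beta - gamma), -gamma * z0, lambda z: z >= z0),
        # z <  z0:  a*z^2 - (beta + gamma)*z + gamma*z0 = 0
        (-(beta + gamma), gamma * z0, lambda z: z < z0),
    )
    for b, c, keep in branches:
        disc = b * b - 4.0 * a * c
        if disc < 0:
            continue  # this branch has no real solution
        for sign in (1.0, -1.0):
            z = (-b + sign * math.sqrt(disc)) / (2.0 * a)
            if z > 0 and keep(z):
                roots.append(z)
    return sorted(roots)
```

In the typical case this returns the two admissible solutions, one on each side of the focus plane, matching the z.sub.1/z.sub.4 ambiguity discussed below.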
[0062] With reference to
[0064] With reference to
[0065] Indeed, before making a decision between z.sub.1 and z.sub.4, an additional constraint is set to define the sampled volume V: preferably, the depth of the volume is limited to ⅔z.sub.0 as a lower bound and to 2z.sub.0 as an upper bound. This condition, in some cases, makes it possible to flag one of the two solutions as unlikely and to identify the other solution as the best one. Note that an object at ⅔z.sub.0 or at 2z.sub.0 would produce a blur circle in the focus plane C=A/2. The sampled volume V (i.e., the volume of the truncated pyramid with bases at ⅔z.sub.0 and 2z.sub.0) can be computed as a function of the image dimensions in pixels, H.sub.P (height of the image) and W.sub.P (width of the image), and of the width of the volume section at z=2z.sub.0,
[0066] More precisely,
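As a non-limiting sketch, the truncated-pyramid volume can be computed with the frustum formula V=d/3·(A.sub.1+A.sub.2+√(A.sub.1·A.sub.2)), assuming the pixel footprint grows linearly with z as d.sub.P(z)=h·z/(f.sub.1·H.sub.P); since the section area then grows with z.sup.2, the frustum formula is exact:

```python
import math

def sampled_volume(z0, f1, h_sensor_mm, hp, wp):
    """Volume of the truncated pyramid seen by the camera between the
    planes z = (2/3) z0 and z = 2 z0.

    z0 : focus-plane distance; f1 : lens-sensor distance;
    h_sensor_mm : sensor height; hp, wp : image height/width in pixels.
    Lengths share one unit, so the result is in that unit cubed.
    """
    def section_area(z):
        d_p = h_sensor_mm * z / (f1 * hp)   # pixel footprint at depth z
        return (wp * d_p) * (hp * d_p)      # W(z) x H(z) cross-section
    z_near, z_far = 2.0 * z0 / 3.0, 2.0 * z0
    a1, a2 = section_area(z_near), section_area(z_far)
    depth = z_far - z_near
    return depth / 3.0 * (a1 + a2 + math.sqrt(a1 * a2))
```

The same helper, evaluated between z.sub.0 and 2z.sub.0, gives the partial volume used for the probability P.sub.1 below.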
[0067] Despite the confinement of the sampled volume V, two admissible solutions z.sub.1 and z.sub.4, and hence two admissible diameters D(z.sub.1) and D(z.sub.4), still exist for the majority of drops. Non-univocal cases of drop positioning can be further reduced by constraining the drop diameters: for example, it is possible to consider that drop diameters larger than 6 mm or smaller than 0.5 pixel are very unlikely to occur, the former being a physical limit currently reported in the literature, the latter deriving from the very low variations in pixel brightness induced by drops occupying less than half a pixel. These conditions help to discern the most likely solution between z.sub.1 and z.sub.4, and to univocally attribute N.sub.1u drops to z.sub.1 and N.sub.4u to z.sub.4, with N.sub.1u+N.sub.4u<N, where N is the total number of detected raindrops. Moreover, the upper bound set on diameters makes it possible to discard unlikely large diameters that would be responsible for significant overestimations in the final estimate of the rain rate.
[0068] To disentangle the remaining N−(N.sub.1u+N.sub.4u) cases of non-univocal drop positioning, the method adopts a pragmatic approach, namely attributing drops to z.sub.1 and z.sub.4 randomly, according to the probability of falling before (i.e. at z.sub.4) or behind (i.e. at z.sub.1) the focus plane. A first probability P.sub.1, for a drop to fall in the volume behind z.sub.0, is computed as the ratio between the volume of the truncated pyramid with bases at z.sub.0 and 2z.sub.0 and the total sampled volume V. A second probability P.sub.4, i.e. the probability for a drop to fall in the volume before z.sub.0, is computed accordingly as P.sub.4=1−P.sub.1. The number of drops N.sub.1a attributed to the volume behind z.sub.0 is hence computed as
N.sub.1a=P.sub.1·N−N.sub.1u (8)
[0069] Conversely, the number of drops N.sub.4a attributed to the volume before z.sub.0 is obtained as
N.sub.4a=(1−P.sub.1)·N−N.sub.4u (9)
[0070] For each non-univocally positioned drop, a random number q is sampled from a uniform (0,1) distribution. If q<N.sub.1a/(N−N.sub.1u−N.sub.4u) the drop is attributed to z.sub.1, otherwise to z.sub.4.
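The random attribution of the remaining drops can be sketched as follows (function and parameter names are illustrative; a deterministic rng can be injected for testing):

```python
import random

def attribute_ambiguous_drops(n_total, n1u, n4u, p1, rng=random.random):
    """Randomly attribute the N - (N1u + N4u) non-univocally positioned
    drops to z1 (behind the focus plane) or z4 (before it).

    n_total : total detected drops N
    n1u/n4u : drops already univocally attributed to z1/z4
    p1      : probability of falling behind z0 (volume ratio)
    Returns (extra drops sent to z1, extra drops sent to z4).
    """
    n1a = p1 * n_total - n1u            # remaining quota behind z0, eq. (8)
    n_ambiguous = n_total - n1u - n4u
    to_z1 = 0
    for _ in range(n_ambiguous):
        # q < N1a / (N - N1u - N4u) sends the drop to z1
        if rng() < n1a / n_ambiguous:
            to_z1 += 1
    return to_z1, n_ambiguous - to_z1
```

Averaged over many drops, the attribution reproduces the volume-ratio probabilities, which is why the Monte Carlo check mentioned below finds only a marginal effect on the rain rate.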
[0071] Through Monte Carlo simulations it has been verified that this random attribution algorithm only marginally affects the rain rate estimation.
[0072] Finally, the method provides for estimating the particle precipitation rate based on said actual diameter (D.sub.P) and said velocity (see the following equation (10)).
[0073] The method can also be applied to a set of images recorded as a sequence at adjacent and close time steps. In such an application, the following equation (10) can be applied to estimate the rain rate from each image. An average rainfall rate can then be obtained by averaging the image-specific rainfall rates, thus reducing the sample variability and the inherent uncertainty in the rain rate estimation.
[0074] The method provides that for each image Im.sub.j, where j is an integer index over the number of images, the intensity rain rate R.sub.Im.sub.j is estimated.

[0075] Each drop is assumed to be responsible for a quota R.sub.i of the total rain rate of one image R.sub.Im.sub.j:

R.sub.Im.sub.j=Σ.sub.i=1.sup.N R.sub.i, with R.sub.i=(1/6)π·D.sub.i.sup.3·v.sub.i/V (10)

where N is the total number of drops in the image, 1/6π·D.sub.i.sup.3 is the volume of the i-th drop (in mm.sup.3), v.sub.i is the velocity of the i-th drop (in m/s), and V is the total sampled volume (in m.sup.3).
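A per-image rain-rate computation along these lines can be sketched as follows; the 3.6·10.sup.−3 factor is an assumed unit conversion (from mm.sup.3·m/s per m.sup.3 to mm/h) and is not taken from the text:

```python
import math

def rain_rate_mm_per_h(diameters_mm, velocities_ms, volume_m3):
    """Per-image rain rate: each drop i contributes its volume
    (pi/6 * D_i^3, in mm^3) times its velocity v_i (m/s), divided by
    the sampled volume V (m^3).

    Assumed conversion: 1 mm^3*(m/s)/m^3 = 1e-9 m/s of depth
    = 3.6e-3 mm/h, hence the leading factor.
    """
    total = 0.0
    for d, v in zip(diameters_mm, velocities_ms):
        total += math.pi / 6.0 * d ** 3 * v
    return 3.6e-3 * total / volume_m3
```

Averaging the outputs over a sequence of images gives the mean rainfall rate described in paragraph [0073].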
[0076] Furthermore, as aforementioned, the method of the present invention takes into account the focused streaks, i.e. their diameters and velocities, for the computation of the rain rate, and not only the blurred streaks.
[0077] The method according to the invention can be implemented by means of a computer product which can be loaded into a memory of the image acquisition device and which comprises software code portions adapted to implement said method.
[0078] The features of the present invention, as well as the advantages thereof, are apparent from the above description.
[0079] A first advantage offered by the method and the image acquisition device according to the present invention is that the estimation of the particle precipitation rate is comparable with that of the prior-art techniques.

[0080] A second advantage offered by the method and the image acquisition device according to the present invention is that the results have errors of the same order of magnitude as those of standard measuring devices (i.e. rain gauges, if meteorological precipitation is considered).

[0081] A further advantage offered by the method and the image acquisition device according to the present invention is that measures of precipitation intensity can be retrieved at very high temporal resolution (e.g., one measure per second) at a very low cost.

[0082] A further advantage offered by the method and the image acquisition device according to the present invention is the possibility to dramatically increase the spatial density of precipitation observations (e.g., one measure/km.sup.2, where km is a kilometre).
[0083] The method and the image acquisition device for measuring a particles precipitation rate according to the present invention may be subject to many possible variations without departing from the novelty spirit of the inventive idea; it is also clear that in the practical implementation of the invention the illustrated details may have different shapes or be replaced with other technically equivalent elements.
[0084] According to one possible alternative, for example, the image acquisition device is an intelligent mobile terminal, e.g., a smartphone or a tablet, which implements the method of the present invention. The intelligent mobile terminals available today, and certainly also those available in the future, include at least one camera which can be used for acquiring at least one image to be processed according to the method of the present invention. In such a case, virtually anyone with a smartphone can obtain a rain rate measure in the place where he/she is located and, since smartphones are widespread, the estimation of meteorological conditions in a certain area (e.g., a district) can be improved with respect to known-art techniques, e.g., rain gauges, thanks to the large number of sampled points.
[0085] It can therefore be easily understood that the present invention is not limited to a method for measuring a particles precipitation rate, and related image acquisition device, but may be subject to many modifications, improvements or replacements of equivalent parts and elements without departing from the novelty spirit of the inventive idea, as clearly specified in the following claims.