VISIBILITY MEASUREMENT DEVICE
20230368357 · 2023-11-16
Inventors
- Paul Smith (Portishead Bristol, GB)
- Alec Bennett (Portishead Bristol, GB)
- Matthew Bennett (Portishead Bristol, GB)
CPC classification
H04N23/695
ELECTRICITY
International classification
H04N23/698
ELECTRICITY
Abstract
A visibility measurement device (10) comprising a camera (12) and a computing device (14) communicatively coupled to the camera and configured to: receive a first image from the camera; select a measurement feature in the first image, the measurement feature corresponding to a location within the field of view of the camera; measure an optical characteristic of the measurement feature to obtain a measured optical characteristic value for the location; and determine an ambient fog intensity value based on: the measured optical characteristic value; a reference optical characteristic value for the location; and a distance between the camera and the location, such that the ambient fog intensity value can be used to calculate the visibility in a direction between the camera and the location.
Claims
1. A visibility measurement device comprising a camera and a computing device communicatively coupled to the camera and configured to: receive a first image from the camera; select a measurement feature in the first image, the measurement feature corresponding to a location within the field of view of the camera; measure an optical characteristic of the measurement feature to obtain a measured optical characteristic value for the location; and determine an ambient fog intensity value based on: the measured optical characteristic value; a reference optical characteristic value for the location; and a distance between the camera and the location, such that the ambient fog intensity value can be used to calculate the visibility in a direction between the camera and the location.
2. The visibility measurement device of claim 1 wherein the computing device is further configured to determine the visibility and output a signal representative of the visibility.
3. The visibility measurement device of claim 1, wherein the optical characteristic is any of intensity, brightness, or color hue.
4. The visibility measurement device of claim 1, wherein the reference optical characteristic value is determined based on a reference feature corresponding to the location in a second image distinct from the first image.
5. The visibility measurement device of claim 1, wherein determining the ambient fog intensity value comprises iteratively optimizing the ambient fog intensity value based on the respective values of: a measured optical characteristic value; a reference optical characteristic value; and a distance, for one or more measurement features in the first image, until the respective measured optical characteristic of each selected measurement feature is equal to an estimated optical characteristic of each of the selected features.
6. The visibility measurement device of claim 1, wherein the computing device is configured to: receive a third image captured from a different location than the first image; identify a comparison feature in the third image that corresponds to the location; use the position of the measurement and comparison features to determine by way of photogrammetry the distance between the first camera and the location; and store the distance in a database.
7. The visibility measurement device of claim 6, wherein: the visibility measurement device comprises a second camera having a known spatial relationship with respect to the first camera and configured to capture the third image; or the first camera is configured to move a known distance to capture the third image.
8. The visibility measurement device of claim 1, wherein the camera is configured to rotate to capture multiple first images in different directions such that the visibility measurement device can determine the visibility in corresponding multiple directions.
9. A computer implemented method of measuring visibility, the method comprising: receiving a first image from a first camera; selecting a measurement feature in the first image, the measurement feature corresponding to a location within the field of view of the camera; measuring an optical characteristic of the measurement feature to obtain a measured optical characteristic value for the location; determining an ambient fog intensity value based on: the measured optical characteristic value; a reference optical characteristic value for the location; and a distance between the camera and the location; and using the ambient fog intensity value to calculate the visibility in a direction between the camera and the location.
10. The method of claim 9, wherein the optical characteristic is any of intensity, brightness, or color hue.
11. The method of claim 9, wherein the reference optical characteristic value is determined based on a reference feature corresponding to the location in a second image.
12. The method of claim 9, comprising: identifying one or more further measurement features within the first image, each further measurement feature corresponding to a distinct location within the field of view of the camera and the step of determining the ambient fog intensity value comprising iteratively optimizing the ambient fog intensity value based on: the respective values of measured optical characteristic; reference value of the optical characteristic; and the distance for some or all of the measurement features, until the respective measured optical characteristic of each measurement feature is equal to an estimated optical characteristic of each of the selected features.
13. The method of claim 9, comprising: receiving a third image captured from a different location than the first image; identifying a comparison feature in the third image that corresponds to the location; and using the position of the measurement and comparison features within the respective images to determine by way of photogrammetry the distance between the first camera and the location, wherein the third image is captured at a known distance from the point at which the first image was captured.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0038] By way of example only, certain embodiments of the invention will now be described by reference to the accompanying drawings, in which:
[0039]
[0040]
[0041]
[0042]
[0043]
[0044]
[0045]
[0046]
DETAILED DESCRIPTION
[0047]
[0048] The visibility measurement device 10 comprises a camera 12 and a computing device 14.
[0049] The camera 12 can be any suitable digital camera and can therefore comprise a sensor sensitive to visible light, i.e. electromagnetic radiation with a wavelength between about 350 and 750 nanometers. In some embodiments the sensor can be sensitive to wavelengths greater than 750 nanometers and/or less than 350 nanometers.
[0050] In the embodiment of
[0051] Advantageously, a camera that can rotate and/or tilt enables images to be captured in a variety of different directions. This enables the visibility measurement device 10 to measure the visibility in different directions, e.g., in the case of rotation, in multiple octants (N, NE, SW, etc.).
[0052] In some embodiments, the camera 12 can be configured to rotate and capture multiple images that are combined by the computing device 14 to generate a panoramic image with a field of view of 30 to 360 degrees, and preferably of 180 to 360 degrees. Advantageously, capturing a panoramic image with an expanded field of view can enable the visibility measurement device to simultaneously determine the visibility in multiple directions from the camera, e.g. in multiple octants (N, NE, SW, etc.).
[0053] In the embodiment of
[0054] The computing device 14 is communicatively coupled to the camera 12 through a wireless or wired connection (not shown). The computing device 14 can have one or more local or distributed processing cores (not shown), network interface (not shown), and volatile and non-volatile memory 16. The memory can store images, video, or metadata for example.
[0055] The computing device 14 is configured to run a computer implemented algorithm that uses images to produce data that can be used to calculate visibility.
[0056] Referring additionally to
[0057] The visibility measurement device 10 obtains or measures the distance from the camera 12 to the location. The visibility measurement device 10 can for example comprise a database stored on a local memory 16, or on remote memory accessible by the device 10. The database comprises a table of distances between the location of the camera 12 and a plurality of locations in the environment of the camera 12 that can serve as measurement features.
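The distance database described in the paragraph above can be sketched as a simple lookup table. The feature identifiers and distance values below are illustrative assumptions, not values taken from the patent.

```python
# Sketch of the distance database described above: a table mapping a
# measurement-feature identifier to the distance (in metres) between
# the camera 12 and the corresponding real-world location.
# All identifiers and values here are illustrative assumptions.
DISTANCE_TABLE = {
    "MF1": 450.0,   # e.g. a building edge
    "MF2": 1200.0,  # e.g. a distant mast
    "MF3": 3100.0,  # e.g. a hilltop feature
}

def distance_to(feature_id: str) -> float:
    """Return the stored camera-to-location distance for a feature."""
    try:
        return DISTANCE_TABLE[feature_id]
    except KeyError:
        raise KeyError(f"No stored distance for feature {feature_id!r}")
```

In practice the table could equally be keyed by bearing or pixel region; the patent leaves the indexing scheme open.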
[0058] An optical characteristic of the measurement feature MF1 is then determined. In the embodiment of
[0059] Thus, in the embodiment of
[0060] Referring additionally to
[0061] Thus, in the embodiment of
[0062] The visibility measurement device 10 then calculates an estimated intensity I_e of the measurement feature MF1 based on equation (1):

I_e = I_f + (I_r − I_f)·e^(−k·d)  (1)

where: [0063] d is the distance of the measurement feature MF1 from the first camera 12; [0064] I_f is the ambient fog intensity, also known as ambient fog density, scatter intensity, backscatter intensity, scatter value or ambient fog intensity value; [0065] I_r is the reference intensity of the first measurement feature MF1; and [0066] k is the atmospheric extinction coefficient.
[0067] The visibility measurement device 10 then determines the values of k and I_f such that the estimated intensity I_e is equal to the measured intensity I_m of the feature MF1. The skilled person will recognise that solving for k and I_f such that I_e is equal to a specific value is an optimisation problem, which can be solved by employing a variety of known algorithms, e.g. a non-linear least-squares optimisation algorithm that minimises the squared error between estimated and measured feature point intensities.
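The optimisation described above can be sketched as follows. This is an illustrative implementation, not the patent's own: the model assumed is I_e = I_f + (I_r − I_f)·e^(−k·d), and a coarse grid search stands in for the non-linear least-squares optimiser mentioned in the text.

```python
import numpy as np

def estimated_intensity(d, i_r, k, i_f):
    """Assumed form of equation (1): estimated feature intensity at
    distance d, given reference intensity i_r, extinction coefficient k
    and ambient fog intensity i_f."""
    return i_f + (i_r - i_f) * np.exp(-k * d)

def fit_fog_parameters(d, i_r, i_m, k_grid, if_grid):
    """Return the (k, I_f) pair minimising the squared error between
    estimated and measured intensities over all measurement features.
    A grid search is used here for clarity; a non-linear least-squares
    solver would be used in practice."""
    best, best_err = None, np.inf
    for k in k_grid:
        for i_f in if_grid:
            err = np.sum((estimated_intensity(d, i_r, k, i_f) - i_m) ** 2)
            if err < best_err:
                best, best_err = (k, i_f), err
    return best

# Synthetic scene (assumed values): three features at known distances,
# "measured" under a true k of 0.002 per metre and I_f of 200.
d   = np.array([450.0, 1200.0, 3100.0])
i_r = np.array([60.0, 90.0, 120.0])
i_m = estimated_intensity(d, i_r, 0.002, 200.0)
k_hat, if_hat = fit_fog_parameters(
    d, i_r, i_m,
    k_grid=np.linspace(0.0005, 0.005, 91),
    if_grid=np.linspace(150.0, 250.0, 101),
)
```

With the true parameters lying on the search grid, the fit recovers them; with real measurements the minimiser returns the best-fitting pair instead.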
[0068] Once the visibility measurement device calculates the value of I_f for which I_e = I_m, the system calculates the contrast C of the measurement feature MF1 based on the following equation (2):
[0069] Once the contrast C of the measurement feature MF1 is determined, the visibility V in the direction of MF1 can be calculated based on equation (3):
[0070] The visibility measurement device 10 can be configured to output a signal representative of a calculated visibility, or can for example simply output the contrast C for another process or user to use to calculate the visibility.
[0071] Thus, in order to measure the visibility of a single feature point, the measured intensity, the reference intensity and the distance are used. Two global constants are also used: the ambient fog intensity, which indicates how close a feature's intensity is to the ambient level, i.e. its contrast; and the extinction coefficient, which converts contrast to visibility. Knowing the two global constants for the scene, the system can calculate the visibility for a measurement feature without reference to any other feature points.
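Equations (2) and (3) are not reproduced in this extract. As an illustration only, a common Koschmieder-style formulation is sketched below: contrast taken relative to the ambient fog intensity, and visibility as the distance at which contrast decays to a 5% threshold. Both formulas are assumptions for illustration, not the patent's own equations.

```python
import math

# Illustrative only: equations (2) and (3) are missing from this text,
# so a standard Koschmieder-style formulation is assumed here.
CONTRAST_THRESHOLD = 0.05  # 5% threshold commonly used for visibility

def contrast(i_m: float, i_f: float) -> float:
    """Assumed form of equation (2): contrast of a feature's measured
    intensity against the ambient fog intensity."""
    return abs(i_m - i_f) / i_f

def visibility(k: float) -> float:
    """Assumed form of equation (3): visibility from the atmospheric
    extinction coefficient k (Koschmieder's law, 5% threshold)."""
    return -math.log(CONTRAST_THRESHOLD) / k
```

Under this assumed formulation, a smaller extinction coefficient k yields a proportionally larger visibility.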
[0072] While the above description focuses on a single measurement feature MF1, solving equation (1) for k and I_f such that the estimated intensity I_e is equal to the measured intensity I_m may utilise multiple measurement features in multiple directions. For example, in addition to the first measurement feature MF1, the visibility measurement device also selects additional measurement features MF2 and MF3. The measurement features can for example represent objects at various distances from the first camera 12. In cases where the two global constants are unknown for a given scene, they can be estimated by an optimisation algorithm. An optimisation algorithm may iterate over some or all of the measurement features using the constants in equation (1) to find the values of k and I_f that result in the smallest error between the respective estimated intensity and measured intensity for each selected measurement feature. Once a consensus is found, the two global constants are associated with the scene. Any feature point in the scene can then have its visibility calculated independently. As such, embodiments involving the use of multiple measurement points can determine the visibility for a measurement point in the image without any pre-existing knowledge of what is in the scene other than the distance to the measurement point.
[0073] Selecting multiple measurement features in multiple directions enables substantially concurrent estimations of visibility from the camera 12 towards their respective directions. This is advantageous when compared to known methods of estimating visibility which usually only calculate visibility towards one direction at any given time, with the process starting over when there is a need to estimate the visibility in a different direction.
[0074] In embodiments where the visibility measurement device 10 has selected multiple measurement features MF1 to MF3, a corresponding number of reference features RF1 to RF3 are utilised as described above, and determination of the ambient fog intensity value using equation (1) comprises iteratively optimising the values of the atmospheric extinction coefficient and the ambient fog intensity value for all of the selected features, until the respective measured intensity of each selected feature is equal to the respective estimated intensity of each of the selected features. In some embodiments, only a subset of the selected features are used to determine the ambient fog intensity value.
[0075] Where the image includes regions in different octants, measurement features can be grouped and in some cases averaged by octant. An orientation sensor such as an encoder can inform which direction the camera is facing when capturing an image.
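The grouping and averaging by octant described above can be sketched as follows. The bearing-to-octant mapping and the per-octant averaging are illustrative assumptions; the patent specifies only that features can be grouped, and in some cases averaged, by octant.

```python
from collections import defaultdict

# Illustrative sketch of grouping measurement features by compass octant.
# Each feature carries the bearing (degrees clockwise from north) of its
# direction from the camera, e.g. as reported by an orientation encoder.
OCTANTS = ["N", "NE", "E", "SE", "S", "SW", "W", "NW"]

def octant(bearing_deg: float) -> str:
    """Map a bearing in degrees to one of the eight compass octants,
    with each octant spanning 45 degrees centred on its cardinal or
    intercardinal direction."""
    index = int(((bearing_deg % 360.0) + 22.5) // 45.0) % 8
    return OCTANTS[index]

def average_visibility_by_octant(features):
    """features: iterable of (bearing_deg, visibility) pairs.
    Returns a dict mapping octant name to the mean visibility of the
    features falling in that octant."""
    groups = defaultdict(list)
    for bearing, vis in features:
        groups[octant(bearing)].append(vis)
    return {name: sum(v) / len(v) for name, v in groups.items()}
```

For example, features at bearings 10° and 350° both fall in the N octant and are averaged together.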
[0076] While in the above-described embodiment the optical characteristic is intensity, in other embodiments the computing device 14 can determine the optical characteristic of a feature based on values of brightness, colour hue, saturation, tone or the like. In some embodiments, where the captured images are greyscale, the intensity of a feature can be the greyscale brightness of the image pixels that form the feature. In other embodiments, where the captured images are in colour, the image is converted to greyscale using an algorithmic combination of the red, green and blue components to enable the computing device 14 to extract a greyscale brightness.
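The colour-to-greyscale conversion mentioned above can be sketched as below. The Rec. 601 luma weights are one common algorithmic combination of the red, green and blue components; the patent does not specify which combination is used, so the weights here are an assumption.

```python
import numpy as np

def to_greyscale(rgb: np.ndarray) -> np.ndarray:
    """rgb: H x W x 3 array of red, green and blue components.
    Returns an H x W array of greyscale brightness values, using the
    Rec. 601 luma weights as an assumed weighting."""
    weights = np.array([0.299, 0.587, 0.114])  # assumed combination
    return rgb.astype(float) @ weights
```

A neutral grey pixel (equal R, G and B) maps to the same greyscale value, since the weights sum to one.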
[0077] Selecting a measurement feature can comprise ensuring that the measurement feature satisfies a set of requirements. The set of requirements can comprise having a brightness value greater than a pre-defined threshold or having a contrast value greater than a pre-defined threshold. Features can be identified and selected using any known feature or image matching technique, for example the Oriented FAST and Rotated BRIEF (ORB) algorithm (E. Rublee, V. Rabaud, K. Konolige and G. Bradski, "ORB: an efficient alternative to SIFT or SURF", in IEEE International Conference on Computer Vision, ICCV 2011, Barcelona, Spain, November 6-13, 2011). An intensity threshold is set for a feature to be considered (relating to the intensity of a centre pixel and those in a circular ring around it, as per the FAST algorithm, and a measurement of "cornerness" calculated by the Harris response). Features that have been selected but whose determined contrast, brightness or intensity is below a pre-defined threshold can be discarded. The computing device 14 can also limit the number of selected features per image as a balance between information and computational requirements. The computing device 14 can also discard information relating to features that are determined to be at approximately the same distance and direction as other features. In embodiments utilising photogrammetry, a point cloud of measurement features can be generated.
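The filtering and capping of candidate features described above can be sketched as follows. The thresholds, field names and the choice to keep the highest-contrast features are illustrative assumptions; in practice the candidates would come from a detector such as ORB.

```python
# Illustrative sketch of the feature-selection requirements described
# above. Candidate features (e.g. from an ORB detector) are kept only
# if they exceed brightness and contrast thresholds, and the total
# count is capped. All thresholds and field names are assumptions.
def select_features(candidates, min_brightness=40.0, min_contrast=0.1,
                    max_features=50):
    """candidates: list of dicts with 'brightness' and 'contrast' keys.
    Returns the subset satisfying the requirements, capped in number."""
    kept = [c for c in candidates
            if c["brightness"] >= min_brightness
            and c["contrast"] >= min_contrast]
    # Cap the number of features as a balance between information and
    # computational requirements; prefer the highest-contrast ones.
    kept.sort(key=lambda c: c["contrast"], reverse=True)
    return kept[:max_features]
```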
[0078] In any embodiment, communication hardware (not shown) can be provided that enables the visibility measurement device 10 to communicate with a remote computing unit (not shown). The communication hardware can comprise an antenna, a transceiver, a wired connection, etc. The remote computing unit can comprise an online database which comprises a table of distances between the location of the camera 12 and a plurality of objects in the environment of the camera 12. The remote computing unit can provide to the visibility measurement device 10 the respective distances between the location of the camera 12 and a plurality of objects in the environment of the camera 12.
[0079] In such embodiments, the visibility measurement device 10 can provide to the remote computing unit the location of the camera 12 so that the remote computing unit can provide the distances between the location of the camera 12 and a plurality of objects in the environment of the camera 12.
[0080] In some embodiments, the visibility measurement device 10 can be configured to communicate wirelessly with user equipment (not shown). A user can use the user equipment to manually provide information about the location of the camera 12 to visibility measurement device 10. In some embodiments, a user can use the user equipment to capture an image of an area visible by the camera 12 and provide the image to the visibility measurement device 10 along with the location from which the image was taken.
[0081]
[0082]
[0083] The visibility measurement device 30 and visibility measurement device 40 are configured to use the first image and the third image to measure the distance of objects that are visible in both the first image and the third image using photogrammetry. As the first image and the third image are images of the same area from spatially offset points of view, the computing device of visibility measurement device 30 and/or visibility measurement device 40 can automatically identify features in the first and third image that correspond to the same object. The computing device can use pattern matching or envelope matching algorithms to identify features in the first and third image that correspond to the same identified object. The computing device can determine the distance of the identified object from the first camera 12 based on the position of the corresponding feature in the first image, the position of the corresponding feature in the third image and the distance D in case of the visibility measurement device 30 or the distance D′ in case of the visibility measurement device 40.
[0084] As the visibility measurement devices 30 and 40 have the ability to measure both the intensity and the distance of the measurement features captured by the camera(s), the feature intensity and location reference values can be refreshed at times without manual intervention. This is advantageous should the environmental conditions change and affect the intensity of the imaged features while maintaining good visibility, e.g. in cases where the angle of sunlight changes, or if imaged objects are physically moved over time.
[0085]
[0086] In the captured images, measurement feature MF1 and comparison feature CF1 correspond to the same location, measurement feature MF2 and comparison feature CF2 correspond to the same location and measurement feature MF3 and comparison feature CF3 correspond to the same location. However, because the two images are captured from different vantage points, the objects that are closer to the camera have a greater displacement when compared with objects further from the camera. The computing device 14 can measure the distances D1, D1′, D2 and D2′ and determine the distance of the objects corresponding to imaged features using known photogrammetry algorithms. In some embodiments, the computing device 14 has access to look-up tables that associate a degree of displacement of a selected feature with a distance from the first camera 12.
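The relationship described above, where closer objects show a greater displacement between the two vantage points, can be sketched with the standard rectified-stereo formula. This formula is an assumption for illustration; the patent refers generally to known photogrammetry algorithms and look-up tables rather than to a specific equation.

```python
# Illustrative sketch: recovering feature distance from the displacement
# (disparity) between the first and third images. For a rectified stereo
# pair, distance = f * B / disparity, where f is the focal length in
# pixels and B is the baseline between the two capture positions.
# This standard formula is an assumption, not taken from the patent.
def distance_from_disparity(disparity_px: float,
                            focal_length_px: float,
                            baseline_m: float) -> float:
    """Closer objects show larger disparity, hence smaller distance."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_length_px * baseline_m / disparity_px
```

With an assumed focal length of 1000 px and a 0.5 m baseline, a feature displaced by 10 px lies twice as far away as one displaced by 20 px.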
[0087]
[0088] In step 602, the computing device 14 receives a first image from the camera 12. The first image can comprise one or more measurement features as described above.
[0089] In step 604, the computing device 14 selects a first measurement feature.
[0090] In step 606, the computing device 14 measures an optical characteristic of the measurement feature, preferably the intensity of the measurement feature as described previously.
[0091] In step 608, the computing device 14 determines an ambient fog intensity value in the direction of the location, as described above. In some embodiments, the reference value of the optical characteristic has been determined prior to step 602, based on a reference image of the location captured in good visibility conditions.
[0092] The ambient fog intensity value can be used to calculate visibility in a direction between the camera and the location.
[0093]
[0094] Step 702 corresponds to step 602. However, after step 702 concludes, the system proceeds to step 702b.
[0095] In step 702b the computing device 14 receives a third image captured from a different location than the first image. If the visibility measurement device comprises a movable camera as the one described in relation to
[0096] Step 704 corresponds to step 604. However, after step 704 concludes, the system proceeds to step 704b.
[0097] In step 704b the computing device 14 identifies a comparison feature in the third image that corresponds to the measurement feature.
[0098] In step 704c the computing device 14 determines by way of photogrammetry the distance between the first camera 12 and the location. Once the distance between the first camera 12 and the location has been determined, the method proceeds to steps 706 and 708 which are similar to steps 606 and 608.
[0099] It should be noted that the above-mentioned embodiments illustrate rather than limit the invention, and that those skilled in the art will be capable of designing many alternative embodiments without departing from the scope of the invention as defined by the appended claims.