Device and method for clustering light spots

10970570 · 2021-04-06


Abstract

A device for clustering light spots is configured to receive an image captured by a camera, detect light spots in the image, calculate a distance measure for each of a plurality of pairs of light spots detected in the image, and group the detected light spots into clusters based on the calculated distance measures.

Claims

1. A device for clustering light spots, wherein the light spots are spots of headlights and tail lights of vehicles; the device is configured to: receive an image captured by a camera; detect light spots in the image; calculate a distance measure for each of a plurality of pairs of light spots detected in the image based on two or more features of the two light spots of the respective pair, wherein the distance measure quantifies a dissimilarity of features between each of the plurality of pairs of light spots; and group the detected light spots to form clusters based on the calculated distance measures, wherein two light spots form a new cluster if their distance measure is below a predetermined threshold T.sub.2, and a third light spot is assigned to the cluster if its distance measure to at least one of the two light spots of the cluster is below the predetermined threshold T.sub.2.

2. The device as claimed in claim 1, wherein the two or more features of the two light spots of the respective pair are selected from the following group of features: a difference in blob area of the two light spots, a difference in angle with the optical axis of the camera between the two light spots, a difference in center of gravity of the two light spots, a difference in color average of the two light spots, a difference in intensity average of the two light spots and a Euclidean distance between the two light spots.

3. The device as claimed in claim 1, wherein the distance measure for a respective pair of light spots and for a feature i is a function of ((fi−μi)/σi)^2, wherein fi is the value of the feature i for the two light spots of the respective pair, μi is a mean value of fi for a plurality of pairs of light spots and σi is a standard deviation value of fi for the plurality of pairs of light spots.

4. The device as claimed in claim 3, wherein the distance measure for the respective pair of light spots and for a plurality of features i is a function of Σ(i=1, nf)[((fi−μi)/σi)^2], wherein nf is the number of the plurality of features.

5. The device as claimed in claim 1, wherein the device comprises a Bayes classifier and the device is further configured to use the Bayes classifier to calculate the distance measure for each of the plurality of pairs of light spots.

6. The device as claimed in claim 1, wherein the device is further configured to: receive images captured by the camera in temporally successive frames; group the light spots detected in the image of one of the frames to clusters based on the calculated distance measures; and determine which of the clusters still exist in the subsequent frame.

7. The device as claimed in claim 6, wherein a cluster still exists in the subsequent frame if the number of the light spots of the cluster that are detected in the subsequent frame exceeds a predetermined fraction of the number of the light spots of the cluster that were detected in the previous frame.

8. The device as claimed in claim 6, wherein the device is further configured to determine for a light spot of one of the clusters in the subsequent frame whether the distance measure of the respective light spot to at least one of the other light spots of the cluster is below a further predetermined threshold T.sub.1 in order to evaluate whether the respective light spot belongs to the cluster in the subsequent frame.

9. The device as claimed in claim 8, wherein the further predetermined threshold T.sub.1 is greater than the predetermined threshold T.sub.2.

10. The device as claimed in claim 1, wherein the device is further configured to merge together overlapping clusters.

11. The device as claimed in claim 1, wherein the device is further configured to use a k-means clustering algorithm in order to merge clusters together.

12. A system, comprising: a camera mounted on a vehicle and configured to capture images; and a device for clustering light spots as claimed in claim 1.

13. A method for clustering light spots of headlights and tail lights of vehicles, comprising: receiving an image captured by a camera; detecting light spots in the image; calculating a distance measure for each of a plurality of pairs of light spots detected in the image based on two or more features of the two light spots of the respective pair, wherein the distance measure quantifies a dissimilarity of features between each of the plurality of pairs of light spots; and grouping the detected light spots into clusters based on the calculated distance measures, wherein two light spots form a new cluster if their distance measure is below a predetermined threshold T.sub.2, and a third light spot is assigned to the cluster if its distance measure to at least one of the two light spots of the cluster is below the predetermined threshold T.sub.2.

14. The method as claimed in claim 13, wherein the two or more features of the two light spots of the respective pair are selected from the following group of features: a difference in blob area of the two light spots, a difference in angle with the optical axis of the camera between the two light spots, a difference in center of gravity of the two light spots, a difference in color average of the two light spots, a difference in intensity average of the two light spots and a Euclidean distance between the two light spots.

15. The method as claimed in claim 13, wherein the distance measure for a respective pair of light spots and for a feature i is a function of ((fi−μi)/σi)^2, wherein fi is the value of the feature i for the two light spots of the respective pair, μi is a mean value of fi for a plurality of pairs of light spots and σi is a standard deviation value of fi for the plurality of pairs of light spots.

16. The method as claimed in claim 15, wherein the distance measure for the respective pair of light spots and for a plurality of features i is a function of Σ(i=1, nf)[((fi−μi)/σi)^2], wherein nf is the number of the plurality of features.

17. The method as claimed in claim 13, wherein a Bayes classifier is used to calculate the distance measure for each of the plurality of pairs of light spots.

18. The method as claimed in claim 13, further comprising: receiving images captured by the camera in temporally successive frames; grouping the light spots detected in the image of one of the frames into clusters based on the calculated distance measures; and determining which of the clusters still exist in the subsequent frame.

19. The method as claimed in claim 18, wherein a cluster still exists in the subsequent frame if the number of the light spots of the cluster that are detected in the subsequent frame exceeds a predetermined fraction of the number of the light spots of the cluster that were detected in the previous frame.

20. The device as claimed in claim 6, wherein the device is further configured to assign a light spot, which is detected in the subsequent frame and does not belong to a cluster, to one of the existing clusters or a new cluster of the subsequent frame if its distance measure to at least one of the light spots of one of the existing clusters or a new cluster is below the predetermined threshold T.sub.2.

21. The method as claimed in claim 18, further comprising determining for a light spot of one of the clusters in the subsequent frame whether the distance measure of the respective light spot to at least one of the other light spots of the cluster is below a further predetermined threshold T.sub.1 in order to evaluate whether the respective light spot belongs to the cluster in the subsequent frame.

22. The method as claimed in claim 21, wherein the further predetermined threshold T.sub.1 is greater than the predetermined threshold T.sub.2.

23. The method as claimed in claim 22, further comprising assigning a light spot, which is detected in the subsequent frame and does not belong to a cluster, to one of the existing clusters or a new cluster of the subsequent frame if its distance measure to at least one of the light spots of one of the existing clusters or a new cluster is below the predetermined threshold T.sub.2.

24. The method as claimed in claim 13, further including the step of merging together overlapping clusters.

25. The method as claimed in claim 13, further including the step of using a k-means clustering algorithm in order to merge clusters together.

Description

BRIEF DESCRIPTION OF DRAWINGS

(1) The invention will be described in more detail below, by way of example, with reference to an embodiment and to the drawings, in which:

(2) FIG. 1 is a schematic representation of an exemplary embodiment of a system including a device for clustering light spots;

(3) FIG. 2 is a schematic representation of an exemplary embodiment of a method for clustering light spots;

(4) FIG. 3 is a schematic representation of a further exemplary embodiment of a method for clustering light spots;

(5) FIG. 4A is a schematic illustration of an image captured by a camera in a frame; and

(6) FIG. 4B is a schematic illustration of an image captured by the camera in a frame subsequent to the frame of FIG. 4A.

DETAILED DESCRIPTION

(7) Reference will now be made in detail to embodiments, examples of which are illustrated in the accompanying drawings. In the following detailed description, numerous specific details are set forth in order to provide a thorough understanding of the various described embodiments. However, it will be apparent to one of ordinary skill in the art that the various described embodiments may be practiced without these specific details. In other instances, well-known methods, procedures, components, circuits, and networks have not been described in detail so as not to unnecessarily obscure aspects of the embodiments.

(8) ‘One or more’ includes a function being performed by one element, a function being performed by more than one element, e.g., in a distributed fashion, several functions being performed by one element, several functions being performed by several elements, or any combination of the above.

(9) It will also be understood that, although the terms first, second, etc. are, in some instances, used herein to describe various elements, these elements should not be limited by these terms. These terms are only used to distinguish one element from another. For example, a first contact could be termed a second contact, and, similarly, a second contact could be termed a first contact, without departing from the scope of the various described embodiments. The first contact and the second contact are both contacts, but they are not the same contact.

(10) The terminology used in the description of the various described embodiments herein is for describing embodiments only and is not intended to be limiting. As used in the description of the various described embodiments and the appended claims, the singular forms “a”, “an” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will also be understood that the term “and/or” as used herein refers to and encompasses all possible combinations of one or more of the associated listed items. It will be further understood that the terms “includes,” “including,” “comprises,” and/or “comprising,” when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.

(11) As used herein, the term “if” is, optionally, construed to mean “when” or “upon” or “in response to determining” or “in response to detecting,” depending on the context. Similarly, the phrase “if it is determined” or “if [a stated condition or event] is detected” is, optionally, construed to mean “upon determining” or “in response to determining” or “upon detecting [the stated condition or event]” or “in response to detecting [the stated condition or event],” depending on the context.

(12) FIG. 1 schematically illustrates a system 10 including a camera 11 and a device 12 that allows clustering light spots.

(13) The camera 11 is mounted on a vehicle and captures images 13 of the area in front of the vehicle and thus the traffic in front of the vehicle. Alternatively, the camera 11 may be directed to an area in the rear and/or at a side of the vehicle.

(14) The images 13 captured by the camera 11 are fed to the device 12. The device 12 processes the images 13 and generates an output signal 14. The device 12 performs a method 20 for clustering light spots as schematically illustrated in FIG. 2.

(15) In step 21 of the method 20, the device 12 receives an image 13, which was previously captured by the camera 11 and shows a road scene in front of the vehicle.

(16) In step 22, the device 12 detects light spots in the image 13, in particular headlight spots and tail light spots of other vehicles.

(17) In step 23, for each pair of light spots detected in the image 13, the device 12 calculates a distance measure.

(18) In step 24, the device 12 groups the detected light spots to clusters based on the calculated distance measures. The clusters can be output by the device 12 as the output signal 14.

(19) The device 12, the system 10 and the method 20 are exemplary embodiments according to the first, second and third aspect of the invention, respectively.

(20) In the following, embodiments and functions of the device 12, the system 10 and the method 20 are explained in more detail.

(21) In current technology, hand-picked rules are used for the pairing and/or clustering of headlights and tail lights in image data. However, such rules may not be optimal and are not flexible. In order to overcome these disadvantages of conventional technology, an automatic feature selection algorithm is used to pick the best combination of features out of a large set of features. For example, in tests the best 6 features out of a set of 94 features were extracted.

(22) The problem of the inflexibility of a hard threshold for every feature is solved by using a distance measure, which is based on a Bayes classifier and the mean as well as the standard deviation of the features.

(23) For the feature selection process, a Bayes classifier is trained on labeled training data. The training data include images showing headlights and tail lights, in particular pairs of headlights and tail lights which are labelled accordingly. These images are input into the Bayes classifier in order to train the Bayes classifier. The features used for the training are selected by a sequential backward selection strategy out of a huge feature set. For example, in tests the initial feature set included 94 features.

(24) The features are used to characterize the similarity or dissimilarity of the two light spots of a light spot pair. The set of features can include one or more of the following features: a difference in blob area between the two light spots of the respective pair of light spots, a difference in angles of the two light spots with the optical axis of the camera, a difference in center of gravity of the two light spots, a difference in color average of the two light spots, a difference in intensity average of the two light spots and an Euclidean distance between the two light spots, i.e., an Euclidean distance between the two light spots in a coordinate system with x and y axes. It is to be noted that the aforementioned features are only examples of possible features. The set of features may contain other features and/or additional features.

(25) At the beginning of the feature selection process, the Bayes classifier is trained on all of the features of the feature set, and its performance using all of the features is evaluated. Subsequently, each feature is individually removed from the feature set and the Bayes classifier is trained on the remaining features of the feature set. For each of the reduced feature sets, the performance of the Bayes classifier is evaluated. This makes it possible to determine which feature's removal improved the performance of the Bayes classifier most. That feature is then permanently removed from the feature set. Afterwards, the process of removing a single feature from the feature set is repeated until the removal of any remaining feature only worsens the performance of the Bayes classifier. The remaining feature set is considered optimal with regard to the initially chosen set of features.
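In outline, the backward selection loop described above can be sketched as follows. This is a minimal sketch; `evaluate` is a hypothetical stand-in for whatever classifier-performance metric is used, which the text does not specify beyond "performance":

```python
def backward_select(features, evaluate):
    """Sequential backward selection: repeatedly drop the single feature
    whose removal most improves the score, until no removal helps."""
    current = set(features)
    best_score = evaluate(current)
    improved = True
    while improved and len(current) > 1:
        improved = False
        # Try removing each feature individually; keep the best reduction.
        candidates = [(evaluate(current - {f}), f) for f in current]
        score, feature = max(candidates)
        if score > best_score:
            current.discard(feature)
            best_score = score
            improved = True
    return current
```

With 94 initial features this performs at most 94 passes, each evaluating the classifier once per remaining feature, which is why the text can afford an exhaustive per-feature search at every step.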

(26) The trained Bayes classifier is used to calculate the distance measure. The Bayes classifier outputs the probability p of a feature value f.sub.i for a feature i under the condition that the two light spots, on which the feature value f.sub.i was calculated, are the light spots of the headlights or tail lights of the same vehicle:

(27) p(f.sub.i|paired) = (1/(σ.sub.i√(2π))) · e^(−(1/2)((f.sub.i−μ.sub.i)/σ.sub.i)^2)  (3)
wherein μ.sub.i is a mean value of f.sub.i for a plurality of pairs of light spots and σ.sub.i is a standard deviation value of f.sub.i for the plurality of pairs of light spots.

(28) The distance measure d(f.sub.i) for the pair of light spots and for the feature i can be calculated by taking the logarithm of equation (3) and negating it:

(29) d(f.sub.i) = −log p(f.sub.i|paired) = (1/2)((f.sub.i−μ.sub.i)/σ.sub.i)^2 + log σ.sub.i + (1/2) log 2π  (4)

(30) In case n.sub.f features (with n.sub.f≥2) are considered to determine the similarity or dissimilarity of the light spots, the distance measure d for the n.sub.f features can be calculated by the sum of the distance measures of all single features:

(31) d = Σ(i=1..n.sub.f) d(f.sub.i) = (1/2) Σ(i=1..n.sub.f) ((f.sub.i−μ.sub.i)/σ.sub.i)^2 + log(Π(i=1..n.sub.f) σ.sub.i) + (n.sub.f/2) log 2π  (5)

(32) Equation (4) or (5) can be used to calculate the distance measure d between light spots and clusters of light spots in order to group light spots to clusters and to decide to which cluster a specific light spot fits best.
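Equations (4) and (5) translate directly into code. The following sketch assumes the per-feature means μ.sub.i and standard deviations σ.sub.i have already been estimated from labelled pairs, as described for the training step:

```python
import math

def feature_distance(f, mu, sigma):
    """Equation (4): negative log of the Gaussian likelihood that the
    pair producing feature value f is a genuine light pair."""
    return (0.5 * ((f - mu) / sigma) ** 2
            + math.log(sigma)
            + 0.5 * math.log(2 * math.pi))

def pair_distance(features, mus, sigmas):
    """Equation (5): the distance over n_f features is the sum of the
    per-feature distances."""
    return sum(feature_distance(f, m, s)
               for f, m, s in zip(features, mus, sigmas))
```

Because the distance is a negative log-likelihood, a smaller value means the two light spots look more like a genuine headlight or tail-light pair, so thresholds T.sub.1 and T.sub.2 act as likelihood cut-offs.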

(33) FIG. 3 illustrates a method 30 as a further exemplary embodiment of a method for clustering light spots. The method 30 can be performed by the device 12. The method 30 can be divided into three parts: preprocessing 31, clustering 32 and postprocessing 33. The preprocessing 31 includes steps 34 and 35, the clustering includes steps 36 and 37, and the postprocessing includes steps 38 and 39.

(34) In step 34 of the method 30, the device 12 receives an image 13, which was previously captured by the camera 11 and shows a road scene in front of the vehicle. The device 12 receives the images 13 captured by the camera 11 in temporally successive frames. In each of the frames, the device 12 receives a single image 13.

(35) Further, in step 34, the device 12 detects light spots in the image 13, which can be headlight spots or tail light spots of other vehicles. A Gaussian curve is applied to every detected light spot, with a standard deviation dependent on the radius of the respective light spot and a coloring dependent on whether the respective light spot is a headlight or a tail light. Afterwards, the values of the Gaussian curves are summed up at the position of every detected light spot. The respective light spot is then ultimately classified based on the coloring with the highest summed value, i.e., the respective light spot is classified as a headlight or a tail light. This ensures that single false detections, for example a false tail light in a cluster of many headlights, are smoothed out.
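The smoothing step can be illustrated with a simplified sketch. The exact dependence of the standard deviation on the spot radius is not specified in the text, so the sketch simply uses the radius itself; the spot representation (`pos`, `radius`, `label`) is likewise an assumption for illustration:

```python
import math

def smooth_labels(spots):
    """Every spot casts a Gaussian-weighted vote for its own coloring at
    every spot position; the coloring with the highest summed weight wins,
    which smooths out isolated misclassifications.

    Each spot is a dict with 'pos' (x, y), 'radius', and a preliminary
    'label' of either 'head' or 'tail'."""
    result = []
    for s in spots:
        votes = {'head': 0.0, 'tail': 0.0}
        for t in spots:
            dx = s['pos'][0] - t['pos'][0]
            dy = s['pos'][1] - t['pos'][1]
            sigma = t['radius']  # spread grows with the spot's radius
            w = math.exp(-(dx * dx + dy * dy) / (2 * sigma * sigma))
            votes[t['label']] += w
        result.append(max(votes, key=votes.get))
    return result
```

A lone "tail" detection surrounded by nearby "head" detections receives more summed head weight than its own tail vote, so it is relabelled, matching the false-tail-light example in the text.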

(36) In step 35, the distance measure for every pair of light spots detected in the current image 13 is calculated with the help of equation (4) or (5). For example, if the image 13 shows 4 light spots, then 6 pairs of light spots can be formed, and for each of these pairs the distance measure can be calculated.
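Enumerating every unordered pair, as in the 4-spots/6-pairs example, can be done with a standard combinations iterator; `distance` here stands for equation (4) or (5):

```python
from itertools import combinations

def all_pair_distances(spots, distance):
    """One distance per unordered pair: n spots give n*(n-1)/2 pairs."""
    return {(i, j): distance(spots[i], spots[j])
            for i, j in combinations(range(len(spots)), 2)}
```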

(37) In steps 36 and 37, the detected light spots are grouped to clusters based on the calculated distance measures.

(38) In step 36, it is evaluated which light clusters detected in the previous frame still exist in the image 13 of the current frame. This is done by evaluating which light spots of the previous frame are still detected and matched in the current frame, i.e., the frame subsequent to the previous frame. A cluster that was detected in the previous frame still exists in the current frame if the number of the light spots of the cluster that are detected in the current frame exceeds a predetermined fraction of the number of the light spots of the cluster that were detected in the previous frame. Thus, the cluster still exists if the percentage of the light spots of the cluster that are detected in the current frame exceeds a predetermined percentage value.

(39) As an example, FIG. 4A schematically shows an image 41 captured in the previous frame and FIG. 4B schematically shows an image 42 captured in the current frame. In the image 41, light spots L.sub.1 to L.sub.8 are detected. Further, light spots L.sub.1 to L.sub.4 are grouped into a cluster C.sub.1 and light spots L.sub.5 to L.sub.8 are grouped into a cluster C.sub.2. In the image 42, the light spots L.sub.1, L.sub.2, L.sub.4 and L.sub.5 to L.sub.8 are detected; however, the light spot L.sub.3 detected in the previous frame is no longer detected in the current frame. Further, a new light spot L.sub.9 is detected in image 42.

(40) The cluster C.sub.1 in the image 42 thus contains 75% of the light spots of the cluster C.sub.1 in the image 41. If the predetermined percentage value is 70%, the cluster C.sub.1 in the image 42 exceeds this value, so that the cluster C.sub.1 still exists in the current frame. The cluster C.sub.2 also still exists in the current frame since the cluster C.sub.2 has the same number of light spots in both images 41 and 42.
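The persistence test from the FIG. 4A/4B example can be sketched directly; the 70% value below is the example's predetermined fraction, not a fixed constant:

```python
def cluster_persists(prev_members, current_detections, min_fraction=0.7):
    """A cluster from the previous frame persists if the fraction of its
    members re-detected in the current frame exceeds min_fraction."""
    redetected = sum(1 for spot in prev_members if spot in current_detections)
    return redetected / len(prev_members) > min_fraction
```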

(41) In step 36, it is also evaluated whether the light spots of a cluster are still connected in the current frame, for example by using an iterative breadth-first-search algorithm. A light spot is considered to be connected to the cluster if its distance measure to at least one of the other light spots of this cluster is below a first predetermined threshold T.sub.1. In the example of FIGS. 4A and 4B, an iterative breadth-first-search algorithm can be executed to evaluate whether the light spots L.sub.1, L.sub.2, L.sub.4 detected in the current frame are connected to the cluster C.sub.1 and the light spots L.sub.5 to L.sub.8 are connected to the cluster C.sub.2.
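A minimal iterative breadth-first search over a cluster's members, using the threshold T.sub.1, might look as follows; the `distance` function is assumed to be equation (4) or (5), and the first member is taken as the search seed for illustration:

```python
from collections import deque

def connected_members(members, distance, t1):
    """Breadth-first search: a spot stays in the cluster if its distance
    to at least one already-connected spot is below T1."""
    if not members:
        return set()
    start = members[0]
    connected, queue = {start}, deque([start])
    while queue:
        current = queue.popleft()
        for other in members:
            if other not in connected and distance(current, other) < t1:
                connected.add(other)
                queue.append(other)
    return connected
```

Spots that the search never reaches are the "remaining" light spots handled in step 37.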

(42) In step 37, the remaining light spots are clustered and, if possible, clusters are merged. For this purpose, an iterative breadth-first-search algorithm can be performed running over all light spots L.sub.1, L.sub.2, L.sub.4 and L.sub.5 to L.sub.9 that are detected in the image 42. In particular, it is evaluated whether the distance measure of a remaining light spot, i.e., the light spot L.sub.9, to at least one of the light spots of one of the existing clusters or a new cluster is below a second predetermined threshold T.sub.2, wherein the second predetermined threshold T.sub.2 is smaller than the first predetermined threshold T.sub.1. Step 37 can have the following outcomes: (a) clusters C.sub.1 and C.sub.2 remain as they are and a new cluster C.sub.3 is formed containing only light spot L.sub.9, because L.sub.9 is not connected to either of the clusters C.sub.1 and C.sub.2; (b) light spot L.sub.9 is assigned to cluster C.sub.1, because it is connected to cluster C.sub.1; (c) light spot L.sub.9 is assigned to cluster C.sub.2, because it is connected to cluster C.sub.2; (d) a new cluster C.sub.3 is formed containing light spot L.sub.9 and clusters C.sub.1 and C.sub.2 are merged into a single cluster, because L.sub.9 is not connected to either cluster while at least some of the light spots of clusters C.sub.1 and C.sub.2 are connected to the respective other cluster; or (e) clusters C.sub.1 and C.sub.2 and light spot L.sub.9 are merged into a single cluster.

(43) It is to be noted that for the initial frame when clusters have not been formed yet, two light spots form a new cluster if their distance measure is below the second predetermined threshold. Another light spot may be assigned to this cluster if its distance measure to at least one of the light spots of this cluster is also below the second predetermined threshold.
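The initial-frame grouping with threshold T.sub.2 can be sketched as a greedy pass over the detected spots. Note that this simplified version leaves an unconnected spot in a singleton cluster and defers any merging of clusters to the later steps; the order-dependent greedy assignment is an illustrative simplification, not the claimed procedure:

```python
def initial_clusters(spots, distance, t2):
    """First-frame clustering: a spot joins the first cluster that has a
    member within distance T2; otherwise it starts a new cluster."""
    clusters = []
    for spot in spots:
        for cluster in clusters:
            if any(distance(spot, member) < t2 for member in cluster):
                cluster.append(spot)
                break
        else:
            # No existing cluster is close enough: start a new one.
            clusters.append([spot])
    return clusters
```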

(44) In step 38, overlapping clusters are merged together in the current frame.

(45) In step 39, a k-means clustering algorithm is executed in order to limit the number of clusters to a predetermined number L.sub.c. The centers of the light clusters that were also detected in the previous frame are chosen as the initial cluster centers to gain temporal consistency. Further, the centers of randomly chosen remaining light clusters are chosen as further initial cluster centers. Execution of the k-means clustering algorithm will merge together the closest light clusters until exactly L.sub.c light clusters remain.
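The limiting step might be sketched as a small k-means over the cluster centres. The seeding order (persistent cluster centres first, then other centres) reflects the temporal-consistency choice described above; the fixed iteration count and the 2-D tuple representation of centres are simplifications for illustration:

```python
def limit_clusters(centers, persistent, l_c, iters=10):
    """Reduce the cluster count to l_c by k-means over cluster centres,
    seeded with the centres of clusters that persisted from the previous
    frame for temporal consistency."""
    seeds = persistent + [c for c in centers if c not in persistent]
    means = list(seeds[:l_c])
    for _ in range(iters):
        groups = [[] for _ in means]
        for c in centers:
            # Assign each centre to its nearest mean (squared distance).
            k = min(range(len(means)),
                    key=lambda i: (c[0] - means[i][0]) ** 2
                                  + (c[1] - means[i][1]) ** 2)
            groups[k].append(c)
        # Recompute each mean; keep the old mean if its group is empty.
        means = [(sum(p[0] for p in g) / len(g), sum(p[1] for p in g) / len(g))
                 if g else m for g, m in zip(groups, means)]
    return means
```

After convergence, each of the l_c means represents one merged light cluster, so nearby clusters have effectively been merged until exactly l_c remain.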

(46) While this invention has been described in terms of the preferred embodiments thereof, it is not intended to be so limited, but rather only to the extent set forth in the claims that follow.