Method, System, and Computer Program Product for Determining a Blockage of a Sensor of a Plurality of Sensors of an Ego Vehicle

20200353942 · 2020-11-12


    Abstract

    A method determines a blockage of a sensor of a plurality of sensors of an ego vehicle. The method determines a prior blockage probability of the sensor of the plurality of sensors; receives sensor data of the sensor of the plurality of sensors; determines a performance of the sensor based on the received sensor data; calculates a posterior blockage probability based on the prior blockage probability of the sensor and the performance of the sensor; and determines the blockage of the sensor using the calculated posterior blockage probability.

    Claims

    1.-9. (canceled)

    10. A method for determining a blockage of a sensor of a plurality of sensors of an ego vehicle, the method comprising: determining a prior blockage probability of each single sensor of the plurality of sensors; receiving sensor data of the sensor of the plurality of sensors; determining a performance of the sensor based on the received sensor data; calculating a posterior blockage probability based on the prior blockage probability of the sensor and the performance of the sensor; and determining the blockage of the sensor using the calculated posterior blockage probability.

    11. The method according to claim 10, wherein determining the prior blockage probability of the sensor of the plurality of sensors comprises: determining a first blockage probability of a sensor of the plurality of sensors of the ego vehicle based on a relative change of a motion of an object detected by the sensor; determining a second blockage probability of the sensor of the plurality of sensors of the ego vehicle using a predefined performance of the sensor regarding a current weather condition; and calculating a prior blockage probability of the sensor of the plurality of sensors of the ego vehicle based on the first blockage probability and the second blockage probability.

    12. The method according to claim 11, wherein the relative change of the motion of the object is determined by comparing a current change of the motion of the object to one or more previous changes of the motion of the object; and wherein determining the first blockage probability comprises: checking whether the relative change of the motion of the object deviates from a predefined range; and if the relative change of the motion of the object deviates from the predefined range, determining the first blockage probability of a sensor of the plurality of sensors of the ego vehicle based on the relative change of the motion of the object detected by the sensor.

    13. The method according to claim 10, wherein determining the performance of the sensor based on the received sensor data comprises: determining an occupancy probability of a sensor of the plurality of sensors for a current field of view; receiving a predefined occupancy probability of an external reference point for the current field of view; calculating a fused occupancy probability of the current field of view based on the occupancy probability of the sensor and the predefined occupancy probability of the external reference point; and determining a deviation of the occupancy probability of the sensor from the fused occupancy probability; determining the deviation of the occupancy probability of the sensor from the fused occupancy probability as the performance of the sensor.

    14. The method according to claim 10, wherein the field of view comprises an occupancy grid of a predefined number of cells; and wherein the occupancy probability, the predefined occupancy probability and the fused occupancy probability is determined for each cell of the occupancy grid.

    15. The method according to claim 10, wherein the occupancy probability is determined for at least a subset of sensors of the plurality of sensors.

    16. The method according to claim 10, wherein determining the blockage of the sensor using the calculated posterior blockage probability comprises: if the posterior blockage probability exceeds a predetermined blockage threshold: determining the blockage of the sensor as blocked; and if the posterior blockage probability does not exceed the predetermined blockage threshold: determining the blockage of the sensor as non-blocked.

    17. A computer program product for determining a blockage of a sensor of a plurality of sensors of an ego vehicle, the computer program product comprising a non-transitory computer readable medium having stored thereon program code which, when executed by a processor, causes the product to: determine a prior blockage probability of each single sensor of the plurality of sensors; receive sensor data of the sensor of the plurality of sensors; determine a performance of the sensor based on the received sensor data; calculate a posterior blockage probability based on the prior blockage probability of the sensor and the performance of the sensor; and determine the blockage of the sensor using the calculated posterior blockage probability.

    18. A system for determining a blockage of a sensor of a plurality of sensors of an ego vehicle, the system comprising: a processor; a memory; instructions stored within the memory, wherein the instructions, when executed on the processor, cause the system to: determine a prior blockage probability of each single sensor of the plurality of sensors; receive sensor data of the sensor of the plurality of sensors; determine a performance of the sensor based on the received sensor data; calculate a posterior blockage probability based on the prior blockage probability of the sensor and the performance of the sensor; and determine the blockage of the sensor using the calculated posterior blockage probability.

    19. A vehicle comprising the system for determining a blockage of a sensor of a plurality of sensors of an ego vehicle according to claim 18.

    Description

    BRIEF DESCRIPTION OF THE DRAWINGS

    [0022] FIG. 1 shows a method for determining a blockage of a sensor.

    [0023] FIG. 2 shows exemplary blockage probabilities of sensors.

    [0024] FIG. 3 shows an exemplary movement model for tracking an object by a single sensor.

    [0025] FIG. 4 shows an exemplary implementation of ground truth using occupancy grids.

    DETAILED DESCRIPTION OF THE DRAWINGS

    [0026] The illustration in the drawings is schematic. It is noted that in different figures, similar or identical elements are provided with the same reference signs, or with reference signs that differ from the corresponding reference signs only in the first digit.

    [0027] The present application describes a solution to the problem of identifying blockage of sensors. A sensor is considered blocked if its field of view, FoV for short, is limited due to one or more environmental effects not related to the presence of other objects, e.g., other moving or static objects present on a road. For example, a reduction of a sensor's FoV due to a truck moving near the ego vehicle is not considered a blockage, whereas a reduction of a LIDAR's FoV due to snow or salt in front of the laser is considered a blockage. Environmental conditions that are typically associated with a blockage are snow, ice, fog, rain, night, mud and/or direct sunlight.

    [0028] Typical approaches for blockage detection are based on a comparison between expected and actual sensor measurements as mentioned above. However, this comparison may lack the ground truth that is required to properly detect and/or identify objects on or near roads or streets, and in particular to properly detect and/or identify objects in a field of view of a sensor of the ego vehicle.

    [0029] In view of the above, the approach as set forth below may rely on the following two key features:

    [0030] 1) using data from one or more data sources which are known to be unaffected by a blockage of a sensor in order to define ground truth; and

    [0031] 2) adjusting an importance of each sensor using the data from the one or more data sources when computing a blockage probability of a sensor.

    [0032] An exemplary data source may provide weather information, in particular local weather information, which may be used to adjust the importance of a particular sensor of the ego vehicle. For example, at night, the probability that a camera is blocked due to direct sunlight is considered to be very low. In summer, the probability that a LIDAR is blocked due to salt on the road or snow should also be very low.

    [0033] FIG. 1 illustrates a method 100 for determining a blockage of a sensor of a plurality of sensors of an ego vehicle. The method 100 may determine 102 the prior blockage probability of a single sensor of the plurality of sensors of the ego vehicle. Preferably, the prior blockage probability may be estimated using a machine learning algorithm and/or an estimation function, e.g., a binary Bayes approach. To determine 102 the prior blockage probability, the method 100 may determine 104 an environmental condition, e.g., a weather condition, at a position of the ego vehicle. Exemplary environmental conditions are day, night, sunny, cloudy, rainy, icy, and/or snowy. The environmental condition may be obtained from a weather information service via an internet connection of the ego vehicle, from a backend server, and/or from a weather sensor of the ego vehicle. The backend server may comprise a machine learning component which may receive data related to environmental conditions from other vehicles and may aggregate the received data to generate a sensor-specific blockage probability based on the environmental condition. Alternatively, the backend server may compute a sensor-specific blockage probability based on a weather report of a weather information service.

    [0034] Exemplary blockage probabilities of sensors 200 of the ego vehicle based on different weather conditions are shown in FIG. 2. When the current weather condition indicates fog, the blockage probability of a camera sensor of the ego vehicle may be 0.5. In other words, there is a probability of 50% that the camera sensor is blocked when the weather condition is foggy. If a blockage probability cannot be determined for a sensor and a particular weather condition, a qualitative estimation of the sensor's blockage may be used instead, e.g., "can be blocked" or "not relevant for blockage".
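A weather-conditioned lookup of this kind can be sketched as follows. Apart from the fog/camera value of 0.5 taken from the example, the table entries, sensor and condition names, and the qualitative fallback string are illustrative assumptions:

```python
# Hypothetical per-sensor blockage probabilities by weather condition.
# Only the camera/fog value of 0.5 follows the description; the other
# entries are illustrative placeholders.
BLOCKAGE_BY_WEATHER = {
    ("camera", "fog"): 0.5,
    ("camera", "night"): 0.1,
    ("lidar", "snow"): 0.6,
}

def weather_blockage(sensor, condition):
    """Return the predefined blockage probability for a sensor/condition
    pair, or a qualitative estimation when no value is available."""
    try:
        return BLOCKAGE_BY_WEATHER[(sensor, condition)]
    except KeyError:
        return "can be blocked"  # qualitative fallback only

print(weather_blockage("camera", "fog"))  # 0.5
print(weather_blockage("radar", "fog"))   # can be blocked
```

In a deployed system such a table would be supplied or updated by the backend server described above rather than hard-coded.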

    [0035] To determine 102 the prior blockage probability, the method 100 may determine 104 a motion or movement of an object detected by a single sensor of the plurality of sensors. If the object cannot be detected at a particular point in time, the motion model may assume a constant velocity of the ego vehicle. By using the constant velocity, the distance between the object and the sensor may be predicted. Depending on the accuracy of the prediction, a blockage probability of the sensor may be determined. For example, the sensor may detect an object which actually does not exist. In this example, the detected object may always stay at a constant position with respect to the ego vehicle. The blockage probability of the sensor is high since the relative position of the object does not change. In a further example, an object detected by the sensor may suddenly disappear, e.g., as the object moves from one area in the field of view of the sensor to another area in the field of view of the sensor, and appears again in a further area in the field of view of the sensor. This unexpected behavior may also indicate a high blockage probability of the sensor.
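A minimal sketch of this constant-velocity plausibility check; the function names, the tolerance, and the numeric values are our own illustrative assumptions, not part of the described method:

```python
def predict_distance(prev_distance, rel_velocity, dt):
    """Constant-velocity prediction of the object-to-sensor distance,
    used when the sensor misses a detection."""
    return prev_distance + rel_velocity * dt

def motion_suspicious(predicted, measured, tolerance):
    """A large prediction error, or a target frozen relative to the
    moving ego vehicle, hints at a blocked sensor."""
    return abs(predicted - measured) > tolerance

# Object receding at 2 m/s relative to the ego vehicle, sampled at 0.1 s.
pred = predict_distance(20.0, 2.0, 0.1)        # 20.2 m expected
print(motion_suspicious(pred, 20.2, 0.5))      # False: measurement matches
print(motion_suspicious(pred, 25.2, 0.5))      # True: implausible jump
```

Repeated suspicious steps would raise the tracking-based blockage probability; a single outlier would not.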

    [0036] FIG. 3 illustrates an approach to determining a blockage probability of a single sensor based on the tracking of an object in the sensor's field of view. From previous measurements y_{1:k−1} and appropriate filtering, e.g., by application of a Kalman filter, a predicted object state x̂_k is computed for time t_k. The object state x̂_k comprises object position and size. This results in a probability P_occ(x | x̂_k) 302 that location x is occupied by the predicted object.

    [0037] At time t_k the sensor may register the detection of an object at a position with the probability distribution p(x | y_k) 304. The blockage probability b_{s_i,tracking}^{t_k} of a single sensor s_i at time t_k can be determined based on the performed object tracking and the present detection. For the shown example, the blockage probability of a Lidar sensor may be determined as follows:

    [00001] b_{Lidar,tracking}^{t_k} = 1 − ∫_{−∞}^{+∞} P_occ(x | x̂_k) · p(x | y_k) dx = 0.67.
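The overlap integral can be evaluated numerically on a discretized axis. The sketch below assumes a box-shaped predicted occupancy and a Gaussian detection density; both are illustrative choices, so the resulting value differs from the 0.67 of the figure:

```python
import numpy as np

def tracking_blockage(p_occ, p_det, dx):
    """b = 1 - integral of P_occ(x | x_hat_k) * p(x | y_k) dx,
    approximated by a Riemann sum on a uniform grid."""
    return 1.0 - float(np.sum(p_occ * p_det) * dx)

x = np.linspace(-10.0, 10.0, 2001)
dx = x[1] - x[0]
# Predicted object occupies roughly [-1, 1] (assumed box occupancy).
p_occ = ((x > -1.0) & (x < 1.0)).astype(float)
# Detection density: Gaussian around x = 1.5 with sigma 0.8 (assumed).
p_det = np.exp(-0.5 * ((x - 1.5) / 0.8) ** 2) / (0.8 * np.sqrt(2.0 * np.pi))

b = tracking_blockage(p_occ, p_det, dx)  # partial overlap -> moderate blockage
```

A detection that falls entirely inside the predicted occupancy drives the integral toward 1 and the blockage probability toward 0.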

    [0038] The blockage probability of the sensor based on the motion model and the blockage probability of the sensor based on the environmental condition may be used to compute the prior blockage probability of the sensor. For example, a binary Bayes approach may be used to compute the prior blockage probability of a Lidar sensor in foggy weather conditions.

    [00002] C_1 = (b_{Lidar,tracking}^{t_K} · b_{Lidar,weather}^{t_K}) / ((1 − b_{Lidar,tracking}^{t_K}) · (1 − b_{Lidar,weather}^{t_K})) = (0.67 · 0.5) / ((1 − 0.67) · (1 − 0.5)) = 2.03 and b_{AP,Lidar}^{t_K} = C_1 / (1 + C_1) = 2.03 / (1 + 2.03) ≈ 0.7.

    [0039] As described in the example above, the prior blockage probability of the Lidar sensor of the ego vehicle may be 0.7. In other words, the Lidar sensor of the ego vehicle may be blocked with a probability of 70%.
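The odds-form combination in equation [00002] can be sketched as a small helper (the function name is ours; the input values reproduce the worked example):

```python
def binary_bayes_fuse(p1, p2):
    """Fuse two independent blockage probabilities in odds form:
    C = p1*p2 / ((1-p1)*(1-p2)), fused = C / (1 + C)."""
    c = (p1 * p2) / ((1.0 - p1) * (1.0 - p2))
    return c / (1.0 + c)

# Worked example from the description: tracking 0.67, weather (fog) 0.5.
prior = binary_bayes_fuse(0.67, 0.5)
print(round(prior, 2))  # 0.67, rounded up to 0.7 in the text
```

Note that fusing with 0.5 leaves a probability unchanged: an uninformative weather estimate neither raises nor lowers the tracking-based value.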

    [0040] Further, the method 100 may receive 108 sensor data of a plurality of sensors and may receive 110 data related to one or more external reference points. The sensor data of the plurality of sensors and the data related to the one or more external reference points may be used to determine 112 the posterior blockage probability. The plurality of sensors may comprise any sensor of the ego vehicle, e.g., a Radar sensor, a Lidar sensor, a camera sensor, and/or a GPS sensor. The data related to an external reference point may be map data comprising one or more landmarks as external reference points. A landmark may comprise a bridge, a road signal, a sign, a traffic sign and/or a traffic light. A position of the landmark may be obtained from GPS data.

    [0041] Further exemplary external reference points may comprise one or more mechanically movable elements whose position may be controlled. For example, a windshield washer is a mechanically movable element of the ego vehicle which may be controlled by the ego vehicle. When the windshield washer moves in front of a camera sensor, the position of the windshield washer is known. Thus, the camera sensor should observe the windshield washer in its entire field of view. Accordingly, the windshield washer may be used as an external reference point.

    [0042] As exemplarily illustrated in FIG. 4, a field of view of a sensor may be represented by an occupancy grid. Each cell in the occupancy grid has a probability value between 0 and 1 representing the probability of the occupancy of that cell, in the following also referred to as occupancy probability. Values close to 1 represent a high certainty that the cell is occupied by an object in that area of the field of view. Values close to 0 represent a high certainty that the cell is not occupied by an object in that area of the field of view.

    [0043] For example, an occupancy grid 402 of a Lidar sensor for a particular field of view may comprise 4 cells representing 4 areas of the particular field of view. Cell 1 of the occupancy grid 402 comprises an occupancy probability o_{1,Lidar}^{t_K} of 0.1. The Lidar sensor indicates that there is a high certainty that the cell is not occupied by the object. In other words, the Lidar sensor indicates that the corresponding area of the field of view is most likely not occupied by the object. The Lidar sensor cannot detect the object with a high certainty.

    [0044] Further, an occupancy grid 404 of a Radar sensor for the particular field of view may also comprise 4 cells representing 4 areas of the particular field of view. Cell 1 of the occupancy grid 404 comprises an occupancy probability o_{1,Radar}^{t_K} of 0.9. The Radar sensor indicates that there is a high certainty that the cell is occupied by the object. In other words, the Radar sensor indicates that the corresponding area of the field of view is most likely occupied by the object. The Radar sensor can detect the object with a high certainty.

    [0045] Furthermore, an occupancy grid 406 related to map data of the particular field of view may also comprise 4 cells representing 4 areas of the particular field of view. The map data may comprise a landmark which can be detected by the plurality of sensors of the ego vehicle.

    [0046] The landmark may represent the external reference object as described above. Cell 1 of the occupancy grid 406 comprises an occupancy probability o_{1,Map}^{t_K} of 0.9. The map data indicates that there is a high certainty that the cell is occupied by the object, e.g., a landmark. In other words, the map data indicates that the corresponding area of the field of view is most likely occupied by the object, e.g., the landmark. The map data defines the ground truth for the sensors' detection of the object, e.g., the landmark.

    [0047] Further, the method 100 may determine 114 a performance of the sensor based on the received sensor data of the plurality of sensors of the ego vehicle and the data related to one or more external reference points. To determine 114 the performance of the sensor, the occupancy grids 402, 404, and 406 may be fused using a Bayes approach to compute a fused occupancy grid 408. Next, a deviation of each sensor's occupancy grid 402, 404 from the fused occupancy grid 408 may be derived. Occupancy grid 410 shows the deviation for the Lidar sensor and occupancy grid 412 shows the deviation for the Radar sensor.

    [0048] Formally, the deviation of the occupancy probability o for each cell_j of an occupancy grid for a sensor s_i at time t_K may be computed as follows:


    o_{cell_j,s_i,deviation}^{t_K} = | o_{cell_j,fusion}^{t_K} − o_{cell_j,s_i}^{t_K} |.

    [0049] Finally, the performance b.sub.p of a sensor s.sub.i at time t.sub.K may be determined as follows:

    [00003] b_{p,s_i}^{t_K} = ( Σ_{cell_j} | o_{cell_j,fusion}^{t_K} − o_{cell_j,s_i}^{t_K} | ) / (number of cells).

    [0050] For example, the performance of the Lidar sensor in the given example is:

    [00004] b_{p,Lidar}^{t_K} = 0.7 / 4 = 0.175.

    As defined above, the performance of a sensor defines a blockage probability of the sensor for its entire field of view. In particular, the performance of the Lidar sensor b_{p,Lidar}^{t_K} is 0.175. This means that the blockage probability of the Lidar sensor, based on its deviation from the fused occupancy grid, is 17.5 percent.
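The fusion-and-deviation steps above can be sketched as follows. Only the cell-1 occupancies (0.1 for Lidar, 0.9 for Radar and map) come from the description; the remaining cells are neutral 0.5 placeholders, so the resulting performance value (0.2) does not reproduce the 0.175 of the example:

```python
def fuse_cell(probs):
    """Binary Bayes fusion of one cell's occupancy probabilities:
    multiply the odds of all sources, then convert back."""
    odds = 1.0
    for p in probs:
        odds *= p / (1.0 - p)
    return odds / (1.0 + odds)

def sensor_performance(grid, fused):
    """Mean absolute per-cell deviation of a sensor grid from the fused grid."""
    return sum(abs(f - g) for f, g in zip(fused, grid)) / len(grid)

# Cell 1 values follow the description; cells 2-4 are 0.5 placeholders.
lidar = [0.1, 0.5, 0.5, 0.5]
radar = [0.9, 0.5, 0.5, 0.5]
map_grid = [0.9, 0.5, 0.5, 0.5]

fused = [fuse_cell(cell) for cell in zip(lidar, radar, map_grid)]
b_p_lidar = sensor_performance(lidar, fused)
print(round(b_p_lidar, 3))  # 0.2 with these placeholder cells
```

The map grid acts as ground truth here: because it agrees with the Radar, the fused cell stays at 0.9 and the Lidar's disagreement shows up as deviation.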

    [0051] The method 100 may calculate 112 the posterior blockage probability b for a sensor s_i at time t_K based on the determined performance b_p of the sensor and the prior blockage probability b_AP using a binary Bayes approach:

    [00005] C_2 = (b_{p,s_i}^{t_K} · b_{AP,s_i}^{t_K}) / ((1 − b_{p,s_i}^{t_K}) · (1 − b_{AP,s_i}^{t_K})) = (0.175 · 0.7) / ((1 − 0.175) · (1 − 0.7)) = 0.49 and b_{Lidar}^{t_K} = C_2 / (1 + C_2) = 0.49 / (1 + 0.49) ≈ 0.32.

    [0052] For the exemplary Lidar sensor the calculated posterior blockage probability may be 0.32. Based on the calculated posterior blockage probability, the method 100 may determine 116 the blockage of the sensor. To determine the blockage of the sensor, a threshold may be used. If the posterior blockage probability of a sensor s_i is larger than the predefined threshold g, i.e., b_{s_i}^{t_K} > g, the sensor is blocked. If the posterior blockage probability of a sensor s_i is smaller than or equal to the predefined threshold g, i.e., b_{s_i}^{t_K} ≤ g, the sensor is not blocked. For example, the threshold of the Lidar sensor may be g = 0.5. Since the posterior blockage probability of the Lidar sensor is 0.32, the Lidar sensor is not blocked.
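The posterior computation and threshold test can be sketched as below (the function names are ours; exact arithmetic yields roughly 0.33, while the description rounds intermediate results to reach 0.32; either way the value stays below g = 0.5):

```python
def posterior_blockage(performance, prior):
    """Binary Bayes combination of the sensor's performance and its
    prior blockage probability."""
    c = (performance * prior) / ((1.0 - performance) * (1.0 - prior))
    return c / (1.0 + c)

def is_blocked(posterior, threshold=0.5):
    """Blocked when the posterior exceeds the predefined threshold g."""
    return posterior > threshold

# Worked Lidar example: performance 0.175, prior 0.7, threshold g = 0.5.
b = posterior_blockage(0.175, 0.7)
print(round(b, 2), is_blocked(b))  # ~0.33, False: Lidar not blocked
```

A low performance value (small deviation from the fused grid) thus pulls even a high prior back below the blockage threshold.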

    [0053] In case a sensor is determined to be blocked, the method 100 may perform one or more actions to resolve the blockage of the sensor, e.g., by activating an integrated high-performance spray mechanism. The blockage of the sensor may be determined before the action to resolve the blockage and after the action to resolve the blockage. If there is a large difference between the two determinations, the sensor was most likely blocked. The method 100 may perform the action to resolve the blockage repeatedly so that a blockage of the sensor may be prevented in advance.

    [0054] Advantageously, the blockage of a particular sensor may be determined more efficiently by using data regarding an external reference point, which defines the ground truth.

    [0055] It should be noted that the term "comprising" does not exclude other elements or steps and the use of the articles "a" or "an" does not exclude a plurality. Also, elements described in association with different embodiments may be combined. It should also be noted that reference signs in the claims should not be construed as limiting the scope of the claims.

    LIST OF REFERENCE SIGNS

    [0056] 100 method
    [0057] 102 determine a prior blockage probability
    [0058] 104 determine a blockage probability
    [0059] 106 determine a blockage probability
    [0060] 108 receive sensor data
    [0061] 110 receive data regarding an external reference point
    [0062] 112 determine an a posteriori blockage probability
    [0063] 114 calculate a performance of a sensor
    [0064] 116 determine a blockage of a sensor
    [0065] 200 exemplary blockage probabilities
    [0066] 300 exemplary movement model for tracking an object by a single sensor
    [0067] 302 probability
    [0068] 304 probability distribution
    [0069] 400 exemplary implementation of ground truth using occupancy grids
    [0070] 402 occupancy grid
    [0071] 404 occupancy grid
    [0072] 406 occupancy grid
    [0073] 408 fused occupancy grid
    [0074] 410 deviation from fused occupancy grid
    [0075] 412 deviation from fused occupancy grid