CONCEPT FOR MONITORING A DATA FUSION FUNCTION OF AN INFRASTRUCTURE SYSTEM
20230061522 · 2023-03-02
Inventors
- Adwait Sanjay Kale (Ludwigsburg, DE)
- Nils Uhlemann (Ludwigsburg, DE)
- Alexander Geraldy (Hildesheim, DE)
- Holger Mindt (Steinheim A. D. Murr, DE)
CPC classification
- B60W50/0098 (Performing operations; transporting)
- G01S7/003 (Physics)
- G01S7/4802 (Physics)
- G08G1/096783 (Physics)
- G01S2013/9316 (Physics)
- G08G1/096708 (Physics)
- G01S13/87 (Physics)
- B60W2554/4044 (Performing operations; transporting)
International classification
- G08G1/0967 (Physics)
Abstract
A method for monitoring a data fusion function of an infrastructure system for the infrastructure-supported assistance of motor vehicles during an at least semi-automated driving task within an infrastructure, the infrastructure including multiple infrastructure surroundings sensors for detecting an area of the infrastructure. The method includes: receiving multiple input data sets intended for the data fusion function, each of which includes surroundings data based on the respective detection of the area, which represent the detected area; receiving output data based on a data fusion of the input data sets, output by the data fusion function; checking the input data sets and/or the output data for consistency; outputting a check result of the check. A device, a computer program, and a machine-readable memory medium are also provided.
Claims
1-11. (canceled)
12. A method for monitoring a data fusion function of an infrastructure system for an infrastructure-supported assistance of motor vehicles during an at least semi-automated driving task within an infrastructure, the infrastructure system including multiple infrastructure surroundings sensors configured to detect an area of the infrastructure, the method comprising the following steps: receiving multiple input data sets intended for the data fusion function, each of which includes surroundings data based on a respective detection of the area, which represent the detected area; receiving output data based on a data fusion of the input data sets, output by the data fusion function; checking the input data sets and/or the output data for consistency; and outputting a check result of the check.
13. The method as recited in claim 12, wherein some of the input data sets include in each case an open space recognition result, which indicates a result of an open space recognition of the area, the output data including a fused open space recognition result of the respective open space recognition results.
14. The method as recited in claim 13, wherein some of the input data sets include in each case an object detection result, which indicates a result of an object detection of the area, the output data including a fused object detection result of the respective object detection results.
15. The method as recited in claim 14, wherein the check includes a comparison of the fused open space recognition result with the fused object detection result in order to detect inconsistencies.
16. The method as recited in claim 14, wherein an object detection majority result is ascertained, which corresponds to the object detection result of a majority of identical object detection results, the check including a comparison of the fused object detection results with the object detection majority result in order to detect inconsistencies.
17. The method as recited in claim 12, wherein one of the input data sets includes trajectory data, which represent a trajectory of an object located within the area, the check including a check of the trajectory for plausibility in order to detect inconsistencies.
18. The method as recited in claim 12, wherein one of the input data sets includes position data, which represent an initial position of an object located within the area at a point in time of an initial detection by a corresponding one of the infrastructure surroundings sensors, the check including a comparison of the initial position with a maximum detection range of the corresponding infrastructure surroundings sensor in order to detect inconsistencies between the initial position and the maximum detection range.
19. The method as recited in claim 12, wherein one of the input data sets includes position data, which represent an end position of an object located within the area at a point in time of a final detection by a corresponding one of the infrastructure surroundings sensors, the check including a comparison of the end position with a maximum detection range of the corresponding infrastructure surroundings sensor in order to detect inconsistencies between the end position and the maximum detection range.
20. A device configured to monitor a data fusion function of an infrastructure system for an infrastructure-supported assistance of motor vehicles during an at least semi-automated driving task within an infrastructure, the infrastructure system including multiple infrastructure surroundings sensors configured to detect an area of the infrastructure, the device configured to: receive multiple input data sets intended for the data fusion function, each of which includes surroundings data based on a respective detection of the area, which represent the detected area; receive output data based on a data fusion of the input data sets, output by the data fusion function; check the input data sets and/or the output data for consistency; and output a check result of the check.
21. A non-transitory machine-readable memory medium on which is stored a computer program for monitoring a data fusion function of an infrastructure system for an infrastructure-supported assistance of motor vehicles during an at least semi-automated driving task within an infrastructure, the infrastructure system including multiple infrastructure surroundings sensors configured to detect an area of the infrastructure, the computer program, when executed by a computer, causing the computer to perform the following steps: receiving multiple input data sets intended for the data fusion function, each of which includes surroundings data based on a respective detection of the area, which represent the detected area; receiving output data based on a data fusion of the input data sets, output by the data fusion function; checking the input data sets and/or the output data for consistency; and outputting a check result of the check.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0047] Exemplary embodiments of the present invention are represented in the figures and explained in greater detail below.
DETAILED DESCRIPTION OF EXAMPLE EMBODIMENTS
[0053] In the following, identical reference numerals may be used for identical features.
[0054] A method for monitoring the data fusion function of the infrastructure system includes the following steps:
[0055] receiving 101 multiple input data sets intended for the data fusion function, each of which includes surroundings data based on the respective detection of the area, which represent the detected area,
[0056] receiving 103 output data based on a data fusion of the input data sets, output by the data fusion function,
checking 105 the input data sets and/or the output data for consistency,
[0057] outputting 107 a check result of the check.
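Purely as an illustration, these monitoring steps may be sketched in Python as follows; all names are hypothetical, and the sketch assumes that the individual consistency checks are supplied as callables rather than reflecting any particular implementation of the method:

    def monitor_data_fusion(input_data_sets, output_data, checks):
        # Run every configured consistency check over the received input
        # data sets and/or the fused output data (steps 101, 103, 105).
        findings = []
        for check in checks:
            findings.extend(check(input_data_sets, output_data))
        # Output a check result (step 107), e.g., to a state machine.
        return {"consistent": not findings, "findings": findings}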
[0058] According to one specific embodiment, it is decided based on the check result whether the infrastructure system is to be switched off or whether an assistance function provided by the infrastructure system is to be limited.
[0059] In one specific embodiment, the method includes a step of detecting the area via the multiple infrastructure surroundings sensors in order to output surroundings sensor data corresponding to the detection. The output surroundings sensor data are further processed, for example, in order to ascertain surroundings data based in each case on the respective detection of the area, which represent the detected area.
[0060] In one specific embodiment, the method includes a fusing of the input data sets in order to ascertain one or multiple fusion results, the output data including the ascertained fusion result or results. This means, in particular, that the method includes, for example, an implementation of the data fusion function.
[0064] According to block diagram 401, multiple video cameras 403 including one video sensor each are provided. Multiple radar sensors 405 are further provided. Multiple LIDAR sensors 407 are further provided. These infrastructure surroundings sensors detect one or multiple areas of an infrastructure, through which motor vehicles are able to drive in at least a semi-automated manner. The motor vehicles are supported in this case by an infrastructure system for the infrastructure-supported assistance of motor vehicles during an at least semi-automated driving task. Infrastructure surroundings sensors 403, 405, 407 are encompassed by the infrastructure system.
[0065] According to one function block 409, an object recognition is carried out based on the video images of video cameras 403. A track detection also takes place within the scope of this object recognition.
[0066] According to one function block 411, an object recognition is carried out based on the radar images of radar sensors 405.
[0067] According to one function block 413, an object detection is carried out based on the LIDAR images of LIDAR sensors 407.
[0068] A result of the object recognition according to function block 409 is fed to a function block 415, according to which the result is checked. This check includes, for example, a plausibility check. A digital map 416 of the infrastructure is used for the check. If, for example, digital map 416 shows an object at a point that is not present in the result according to function block 409, it is assumed that an error has occurred, for example, within the scope of the object recognition and/or already in one or multiple of video cameras 403.
[0069] Similarly, a result of the object recognition according to function block 411 is fed to a function block 417, according to which, similarly to function block 415, the result is checked. The corresponding explanations apply similarly.
[0070] Similarly, a result of the object detection according to function block 413 is fed to a function block 419, according to which, similarly to function blocks 415, 417, the result is checked. The corresponding explanations apply similarly.
[0071] The results of these checks are conveyed to a state machine 421. Based on these results, state machine 421 is able to ascertain, for example, a state of the operability of the infrastructure system with respect to an infrastructure-supported assistance of motor vehicles. One state may, for example, be that the infrastructure system functions completely correctly. One state may, for example, be that the infrastructure system functions to only a limited degree. One state may, for example, be that the infrastructure system functions completely incorrectly.
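A minimal sketch of such a state machine follows; the three states mirror this paragraph, while the severity encoding of the check results and all names are hypothetical assumptions:

    from enum import Enum

    class SystemState(Enum):
        FULLY_OPERATIONAL = "functions completely correctly"
        DEGRADED = "functions to only a limited degree"
        FAILED = "functions completely incorrectly"

    def decide_state(check_results):
        # check_results: iterable of (severity, description) pairs with
        # severity in {"ok", "minor", "critical"} (assumed encoding).
        severities = {severity for severity, _ in check_results}
        if "critical" in severities:
            return SystemState.FAILED
        if "minor" in severities:
            return SystemState.DEGRADED
        return SystemState.FULLY_OPERATIONAL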
[0072] The checked result according to function block 415 is fed to a function block 423. This means, therefore, that function block 423 is fed a checked result of the object and track recognition according to function block 409. According to function block 423, detected objects are tracked over time.
[0073] Similarly, the result of the object recognition checked by function block 417 according to function block 411 is fed to a function block 425. Function block 425 tracks the detected objects over time, similarly to function block 423.
[0074] Similarly, the result of the object detection according to function block 413, checked by function block 419, is delivered to a function block 427. Similarly to function blocks 423, 425, a detected object or multiple detected objects is/are tracked over time according to function block 427.
[0075] Thus, for example, respective trajectories of the detected objects are ascertained in function blocks 423, 425, 427. These trajectories are conveyed to a data fusion function 429. The data fusion function includes a function block 431, according to which the ascertained trajectories, which have been ascertained in each case based on video images, radar images, and LIDAR images, are checked for consistency. This check includes, for example, a plausibility check of the trajectories.
[0076] A result of this check is provided to state machine 421, which is able to decide based on the result which state the infrastructure system has.
[0077] The correspondingly checked trajectories are provided to a function block 433, according to which the individual trajectory data are fused.
[0078] The video images, radar images and LIDAR images are provided to a function block 435. From these input data, function block 435 generates for each sensor the information about the areas visible to the sensor. With this information, the occlusions resulting from static and dynamic obstacles are known to the system for each sensor. The generated pieces of information are the output of function block 435.
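How such per-sensor visibility information might be computed is sketched below on a coarse occupancy grid; the sampling-based line-of-sight test merely stands in for whatever method function block 435 actually uses, and all names are hypothetical:

    import numpy as np

    def visibility_grid(occupancy, sensor_rc, samples_per_cell=3):
        # occupancy: 2D bool array, True where a static or dynamic
        # obstacle blocks the line of sight; sensor_rc: (row, col) of
        # the sensor position. Returns the cells visible to the sensor.
        rows, cols = occupancy.shape
        visible = np.zeros_like(occupancy, dtype=bool)
        r0, c0 = sensor_rc
        for r in range(rows):
            for c in range(cols):
                # Sample the line of sight from the sensor to the cell.
                n = samples_per_cell * max(abs(r - r0), abs(c - c0), 1)
                clear = True
                for i in range(1, n):
                    t = i / n
                    rr = round(r0 + t * (r - r0))
                    cc = round(c0 + t * (c - c0))
                    if (rr, cc) != (r, c) and occupancy[rr, cc]:
                        clear = False
                        break
                visible[r, c] = clear
        return visible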
[0079] The output of function block 435 is provided firstly to function blocks 423, 425, 427 and secondly also to function block 433 for the purpose of carrying out the fusion of the trajectories.
[0080] Function block 433 outputs as output data within the context of the description a fused object detection result, which is checked for consistency in a function block 437.
[0081] The video images are further provided to a function block 439, according to which an open space recognition is carried out. The radar images are further provided to a function block 441, according to which an open space recognition is carried out based on the radar images. The LIDAR images are further provided to a function block 443, according to which an open space recognition is carried out based on the LIDAR images.
[0082] Corresponding open space recognition results of individual function blocks 439, 441, 443 are provided to a function block 445 of data fusion function 429, according to which the open space recognition results are fused to form a fused open space recognition result. Function block 445 outputs as output data within the context of the description the fused open space recognition result to function block 437, according to which the fused open space recognition result is checked.
[0083] The check steps according to function block 437 are, for example, the check steps described above and/or below.
[0084] A result of this check, i.e., a check result, is output to state machine 421, which is able to decide based thereupon which state the infrastructure system has.
[0085] Accordingly, it may then be decided according to function block 447 what exactly is to be sent, for example, to motor vehicles, which drive in at least a semi-automated manner through the area or areas of the infrastructure.
[0086] If it should be established, for example, that according to function block 437 an inconsistency between the object detection result and the open space recognition result is present, the corresponding area in the open space recognition result is marked as not open, i.e., as occupied, and is sent to motor vehicles. In no event are inconsistent or invisible areas reported as open.
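This safe resolution rule may be expressed, for example, over boolean grids; the grid representation and the function name are assumptions for illustration only:

    def resolve_open_space(open_mask, object_mask, inconsistent_mask,
                           visible_mask):
        # All arguments: 2D numpy boolean grids over the monitored area.
        # A cell is only reported as open if the open space fusion says
        # so, no fused object contradicts it, no inconsistency was
        # flagged there, and the cell is visible to at least one sensor.
        return open_mask & ~object_mask & ~inconsistent_mask & visible_mask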
[0087] In one specific embodiment not shown, it is provided that, similarly to function block 431, a corresponding function block is provided upstream from function block 445 which, similarly to function block 431, checks the open space recognition results, for example, checks for consistency and/or for plausibility. A corresponding result may also be provided to state machine 421.
[0088] In one example, an infrastructure includes a first infrastructure surroundings sensor 505, a second infrastructure surroundings sensor 507, a third infrastructure surroundings sensor 509, and a fourth infrastructure surroundings sensor 511, whose detection areas 513, 515, 517, 519 cover an area of the infrastructure through which a motor vehicle 501 drives.
[0089] The four detection areas 513, 515, 517, 519 are each represented with the aid of differently dashed lines.
[0090] First infrastructure surroundings sensor 505 and third infrastructure surroundings sensor 509 detect ("see") counter to the driving direction of motor vehicle 501, which extends from left to right relative to the plane of the drawing.
[0091] Second infrastructure surroundings sensor 507 and fourth infrastructure surroundings sensor 511 detect (“see”) in the driving direction of motor vehicle 501.
[0092] At the point in time of its drive shown in the figure, motor vehicle 501 is located within the detection areas of the infrastructure surroundings sensors and is detected by them.
[0093] As heat map 523 shows, the trajectories ascertained over time for motor vehicles driving through the area extend within the boundary markings of the traffic lanes.
[0094] Thus, if the calibration is maintained and the infrastructure surroundings sensors function flawlessly, a correspondingly ascertained heat map should show that in the future as well the corresponding trajectories extend within the boundary markings.
[0095] In the case of a decalibration and/or an error in the infrastructure surroundings sensors, a different heat map is expected.
[0096] Thus, when a corresponding heat map is ascertained during the runtime of the infrastructure system, i.e., during the operation of the infrastructure system, then this is an indication that, for example, corresponding infrastructure surroundings sensors are decalibrated and/or an error has occurred.
[0097] It may further be provided, for example, to ascertain for motor vehicle 501 a start position of an initial detection, for example, via first infrastructure surroundings sensor 505. When this is carried out for multiple motor vehicles over time, a heat map may also be ascertained. Under good environmental conditions, the ascertained start positions correspond, for example, to the maximum detection range of first infrastructure surroundings sensor 505.
[0098] If, however, the environmental conditions deteriorate, for example, due to rain or snow, first infrastructure surroundings sensor 505 is no longer able to detect motor vehicles at its maximum detection range. Instead, an instantaneous detection range decreases, so that a start position of an initial detection is located within first detection area 513 and no longer corresponds to the maximum detection range. If this is ascertained for multiple motor vehicles over time, a corresponding heat map may again be ascertained.
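A sketch of how such heat maps might be accumulated and compared; the histogram representation and the overlap threshold are assumptions, not part of the description:

    import numpy as np

    def creation_heatmap(start_positions, bins, area):
        # Accumulate the first-detection positions of many vehicles into
        # a 2D histogram; start_positions: iterable of (x, y) positions,
        # area: ((xmin, xmax), (ymin, ymax)).
        xs, ys = zip(*start_positions)
        heat, _, _ = np.histogram2d(xs, ys, bins=bins, range=area)
        return heat

    def range_degraded(heat, nominal_heat, overlap_threshold=0.8):
        # Low overlap between the current and the nominal distribution
        # of first detections indicates that the instantaneous detection
        # range no longer matches the maximum detection range.
        h = heat / max(heat.sum(), 1e-9)
        n = nominal_heat / max(nominal_heat.sum(), 1e-9)
        return np.minimum(h, n).sum() < overlap_threshold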
[0100] In summary, the concept described herein is based, in particular, on the monitoring of a data fusion function with the aid of a data fusion monitoring function and, for example, conveying a result of the monitoring to a state machine. The data fusion monitoring function aids in identifying inconsistencies in various phases of the perception pipeline. On the basis of the monitoring results, the state machine decides, for example, whether degradation functions are required to be activated. The instantaneous status of the degradation is communicated, for example, at the system output.
[0101] The data fusion monitoring function may, for example, include one or multiple of the following three possibilities:
[0102] Possibility 1: based on the comparison of pieces of fused open space information (fused open space recognition result) with global fused objects (fused object detection result).
[0103] Pieces of open space information and global fused objects are complementary and mutually exclusive: an area cannot simultaneously be open and occupied by an object. This is the basis on which inconsistencies between the two information sources are able to be identified. Example: in a particular area, the global fusion reports an object with an 80% probability of existence. If the open space fusion function reports an open space in the same area with a high degree of certainty, this is a clear inconsistency, which is able to be recognized.
[0104] In the case of infrastructure perception systems, dynamic occlusions may occur due to different mounting points and view angles of the infrastructure surroundings sensors. Such occlusions are taken into account, for example, in this monitoring function by using the pieces of information about the dynamic visibility grid calculated for each sensor.
[0105] Moreover, the ascertained discrepancies may, for example, be used for adapting the pieces of open space information in order to avoid erroneous interpretations within the at least semi-automated motor vehicle. In the above example, in which an object has been reported with a high degree of probability, the uncertainty of the open space in this area may, for example, be heightened in order to ensure the overall consistency of the surroundings model provided by the system.
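Possibility 1 may be sketched, for example, as a per-cell comparison over probability grids; the grid encoding and the thresholds (cf. the 80% example above) are illustrative assumptions:

    def open_space_object_inconsistency(p_object, p_open, visible,
                                        p_obj_thresh=0.8,
                                        p_open_thresh=0.8):
        # p_object: per-cell existence probabilities from the object
        # fusion; p_open: per-cell open-space probabilities from the
        # open space fusion; visible: boolean visibility grid from the
        # dynamic visibility grid, so that dynamic occlusions do not
        # produce false alarms. All arguments are assumed to be numpy
        # arrays of equal shape; returns the inconsistent cells.
        return ((p_object >= p_obj_thresh)
                & (p_open >= p_open_thresh)
                & visible)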
[0106] Possibility 2: based on the comparison between multiple infrastructure surroundings sensors.
[0107] Using the 2-out-of-3 principle, the existence probability of each individual sensor object may be used together with the existence probability of the global fusion object in order to identify false-positive and false-negative cases.
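A sketch of the 2-out-of-3 evaluation for a single object; the threshold and the return values are illustrative assumptions:

    def two_out_of_three(p_sensor, p_global, exist_thresh=0.5):
        # p_sensor: existence probabilities of one object as reported by
        # the three individual sensors (e.g., video, radar, LIDAR);
        # p_global: existence probability of the global fusion object.
        votes = sum(p >= exist_thresh for p in p_sensor)
        if p_global >= exist_thresh and votes < 2:
            return "possible false positive in the fusion"
        if p_global < exist_thresh and votes >= 2:
            return "possible false negative in the fusion"
        return "consistent"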
[0108] Possibility 3: based on the trajectories of the recognized sensor objects.
[0109] Based on the fact that the perception system (the arrangement of the multiple infrastructure surroundings sensors) is static, particular trajectories may be assumed for the objects that pass through the field of view of the perception system. An accumulation of improbable start points, end points, or paths of the trajectories may be recognized. If the object data provide improbable trajectories in any stage of the perception pipeline, the part of the perception pipeline providing these improbable trajectories is reported to the state machine.
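A sketch of a trajectory plausibility check in the sense of possibility 3; the edge criterion and the jump bound are assumptions:

    def implausible_trajectory(trajectory, near_fov_edge, max_jump=5.0):
        # trajectory: chronological list of (x, y) positions of one
        # tracked object; near_fov_edge: callable that is True for
        # positions close to the border of the static field of view,
        # where tracks are expected to be created and terminated;
        # max_jump: assumed bound on the displacement between
        # consecutive detections, in meters.
        reasons = []
        if not near_fov_edge(trajectory[0]):
            reasons.append("track created far inside the field of view")
        if not near_fov_edge(trajectory[-1]):
            reasons.append("track terminated far inside the field of view")
        for (x0, y0), (x1, y1) in zip(trajectory, trajectory[1:]):
            if ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5 > max_jump:
                reasons.append("improbable jump between consecutive positions")
                break
        return reasons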
[0110] Thus, this yields the technical advantage of identifying inconsistencies in order to determine a suitable system response, for example, whether a system degradation is necessary.
[0111] This may ensure an improvement in the safety and performance of the system.
[0112] The perception pipeline obtains its input data from the infrastructure surroundings sensors, for example, using different measuring principles, in order to minimize the risk of errors having a common cause. From this point, two parallel processing paths, for example, are carried out:
1. Object-Based Perception
[0113] a) Based on the received sensor data, sensor-specific object recognition algorithms and sensor-specific tracking algorithms generate object data (local tracks) of tracked objects for each infrastructure surroundings sensor. At this stage already, monitoring functions are possible in order to validate the local tracks.
[0114] b) Function block 431 checks the content of the local tracks, for example, based on their trajectories (possibility 3, described above).
[0115] c) The object fusion according to function block 433 combines, for example, the local tracks with the knowledge about where each infrastructure surroundings sensor is able to recognize objects (from the dynamic visibility grid) to form fused object tracks (global tracks).
[0116] d) Function block 437 uses, for example, the global tracks (from function block 433), the pieces of information about the contributing infrastructure surroundings sensors (from function block 433), and the knowledge about where each infrastructure surroundings sensor is able to recognize objects (from the dynamic visibility grid) in order to recognize false-positive or false-negative cases.
2. Open Space-Based Perception
[0117] a) The sensor-specific open space detectors generate pieces of open space information (local open space) on the basis of the received surroundings sensor data.
[0118] b) The open space fusion combines the local open spaces to form a global open space.
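One conceivable form of this open space fusion, sketched over per-sensor probability grids; the averaging rule is an assumption, not the method of the description:

    import numpy as np

    def fuse_open_space(local_open_spaces, local_visibilities):
        # local_open_spaces: per-sensor grids of open-space
        # probabilities; local_visibilities: matching boolean
        # visibility grids. Average over the sensors that can actually
        # see a cell; cells seen by no sensor keep probability 0
        # ("not known to be open").
        stack = np.stack([s.astype(float) for s in local_open_spaces])
        seen = np.stack(local_visibilities)
        counts = seen.sum(axis=0)
        summed = (stack * seen).sum(axis=0)
        return np.divide(summed, counts,
                         out=np.zeros_like(summed),
                         where=counts > 0)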
[0119] In addition to the recognition of false-positive and false-negative cases, function block 437 also compares, for example, the global tracks and the global open space in order to find inconsistencies between them (possibilities 1 and 2 described above).
[0120] Each monitoring block along the perception pipeline reports a corresponding monitoring result to state machine 421, which controls, for example, an output of the infrastructure system by deciding on a suitable system response.
[0121] The monitoring function, which is based on the trajectories of the sensor objects, is explained in greater detail below.
[0122] Decalibrated infrastructure surroundings sensors may result in false sensor-object trajectories (in particular, in areas which are covered by only a single infrastructure surroundings sensor). This decalibration of a single sensor is to be monitored.
[0123] It is expected that the trajectories of the motor vehicles follow essentially the boundaries of the traffic lanes. If the traffic lanes on the map and the average trajectories do not coincide, a decalibrated sensor could be the cause thereof.
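A sketch of such a comparison between the mapped traffic lanes and the averaged trajectories; the lane-center representation and the decision threshold are assumptions:

    def mean_lateral_offset(trajectories, lane_center_y):
        # trajectories: list of (x, y) position lists of tracked
        # vehicles; lane_center_y: callable mapping x to the
        # lane-center y coordinate taken from the digital map.
        offsets = [y - lane_center_y(x)
                   for trajectory in trajectories
                   for x, y in trajectory]
        return sum(offsets) / len(offsets)

    # Usage sketch: a persistent offset, e.g., abs(offset) > 0.5 m,
    # points to a decalibrated sensor as a possible cause.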
[0124] Moreover, the performance of the infrastructure surroundings sensors is not always identical. Environmental influences (poor weather, glare, dim light) or a poor calibration may affect the detection performance of the infrastructure surroundings sensors. This performance deterioration is monitored, for example.
[0125] Under perfect conditions, it is expected that the trajectories of the motor vehicles are created at similar distances/positions, i.e., when they enter into the field of view of the sensor. The same applies, for example, to the termination of the trajectories of motor vehicles, when they leave the field of view of the infrastructure surroundings sensor. The area of the trajectory creation and/or trajectory termination may change in comparison to normal behavior due to environmental influences (for example, weather or time of day) and/or decalibration.
[0126] Heat maps may be used in order to store past areas of the object creation and object termination. Heat map 701 shows the nominal behavior during object creation. The objects are generated mainly at the edge of the field of view. According to heat map 801, the object recognition is heavily influenced by the surroundings and thus the object recognition area is also reduced. The object tracks are generated later at a different position than in the setpoint behavior, which is indicated by the shift according to heat map 601. A similar heat map may be created and monitored, for example, for the termination of the track.
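A sketch of how the shift between a nominal and a current creation heat map might be quantified; the center-of-mass measure is an assumption:

    import numpy as np

    def heatmap_shift(nominal, current):
        # nominal, current: 2D histograms of object-creation positions
        # (cf. heat maps 701 and 801). Returns the displacement of the
        # center of mass between the two distributions, a simple
        # indicator for a reduced object recognition area.
        def center_of_mass(h):
            h = h / max(h.sum(), 1e-9)
            rows, cols = np.indices(h.shape)
            return np.array([(rows * h).sum(), (cols * h).sum()])
        return np.linalg.norm(center_of_mass(current)
                              - center_of_mass(nominal))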