Method and Apparatus for Sensor Data Fusion for a Vehicle

20220128680 · 2022-04-28


    Abstract

    A method for sensor data fusion for a vehicle includes providing current sensor object data that are representative of a sensor object s.sub.t ascertained at the time t; providing fusion object data that are representative of fusion objects f.sub.t.sup.j intended for sensor data fusion at the time t; providing a sensor object data record H including historical sensor object data that are representative of a sensor object s.sub.t-k.sup.i ascertained at a preceding time t−k; taking the sensor object data record H as a basis for ascertaining a reduced sensor object data record H′={s.sub.t-k.sup.α.sup.t-k|k=1 . . . n}, wherein s.sub.t.sup.α.sup.t denotes a sensor object associated with a fusion object f.sub.t at the time t; and taking the reduced sensor object data record H′ as a basis for associating the sensor object s.sub.t with a fusion object f.sub.t.sup.j and ascertaining a refreshed fusion object.

    Claims

    1.-9. (canceled)

    10. A method for sensor data fusion for a vehicle, wherein a sensor apparatus is assigned to the vehicle, the method comprising: providing current sensor object data that are representative of a sensor object s.sub.t ascertained by the sensor apparatus in surroundings of the vehicle at a time t; providing a fusion object data set comprising fusion object data, each of which is representative of one fusion object f.sub.t.sup.j out of j=1 . . . p fusion objects ascertained in the surroundings of the vehicle and each of which is provided at the time t for sensor data fusion; providing a sensor object data set H={s.sub.t-k.sup.i|i=1 . . . m, k=1 . . . n} comprising historic sensor object data, each of which is representative of one sensor object s.sub.t-k.sup.i out of i=1 . . . m sensor objects ascertained by the sensor apparatus in the surroundings of the vehicle at a preceding time t−k, where n≥1 and m≥1; ascertaining a reduced sensor object data set H′={s.sub.t-k.sup.α.sup.t-k|k=1 . . . n} depending on the sensor object data set H, wherein s.sub.t.sup.α.sup.t refers to a sensor object associated with a fusion object f.sub.t at the time t; depending on the reduced sensor object data set H′, associating the sensor object s.sub.t with a fusion object f.sub.t.sup.j; and ascertaining a refreshed fusion object.

    11. The method according to claim 10, wherein 1≤n≤5.

    12. The method according to claim 11, wherein n=2.

    13. The method according to claim 10, wherein the sensor object s.sub.t is associated with the fusion object f.sub.t.sup.j by the information matrix fusion (IMF) algorithm.

    14. The method according to claim 13, wherein: the reduced sensor object data set H′ is assigned to the respective fusion objects f.sub.t.sup.j and stored in the fusion object data, and depending on the fusion object data set, the sensor object s.sub.t is associated with the fusion object f.sub.t.sup.j, and the refreshed fusion object is ascertained.

    15. The method according to claim 14, wherein, when the reduced sensor object data set H′ assigned to a respective fusion object f.sub.t.sup.j does not comprise an associated sensor object s.sub.t-k.sup.α.sup.t-k at any preceding time t−k, k=1 . . . n, the sensor object s.sub.t is associated with a fusion object f.sub.t.sup.j by using the cross-covariance algorithm.

    16. The method according to claim 14, wherein: feature data x.sub.t-1.sup.s, x.sub.t.sup.s that are representative of a lateral extent, position and orientation of the sensor object s.sub.t-1, s.sub.t, and indicator data p.sub.x.sub.t-1.sup.s, p.sub.x.sub.t.sup.s that are representative of an uncertainty in the ascertainment of the lateral extent, position or orientation, are assigned to the sensor object s.sub.t-1, s.sub.t; feature data x.sub.t-1.sup.f, x.sub.t.sup.f that are representative of a lateral extent, position and orientation of the fusion object f.sub.t-1.sup.j, f.sub.t.sup.j, and indicator data p.sub.x.sub.t-1.sup.f, p.sub.x.sub.t.sup.f that are representative of an uncertainty in the ascertainment of the lateral extent, position or orientation, are assigned to the fusion object f.sub.t-1.sup.j, f.sub.t.sup.j; depending on the feature data x.sub.t-1.sup.s, x.sub.t-1.sup.f, a feature fusion state x.sub.t;t-1.sup.f that is representative of the lateral extent, position and orientation of the fusion object f.sub.t-1.sup.j at the time t following the sensor data fusion with x.sub.t-1.sup.s is ascertained; depending on the feature data x.sub.t.sup.s, x.sub.t;t-1.sup.f and on the indicator data p.sub.x.sub.t.sup.s, p.sub.x.sub.t;t-1.sup.f, a refreshed feature fusion state x.sub.t;t.sup.f is ascertained that is representative of the lateral extent, position and orientation of the fusion object f.sub.t.sup.j at the time t following the sensor data fusion with x.sub.t.sup.s; and depending on the feature fusion state x.sub.t;t-1.sup.f, the refreshed feature fusion state x.sub.t;t.sup.f and the feature data x.sub.t.sup.s, a check is made as to whether the refreshed feature fusion state x.sub.t;t.sup.f satisfies the equation min(x.sub.t;t-1.sup.f, x.sub.t.sup.s)≤x.sub.t;t.sup.f≤max(x.sub.t;t-1.sup.f, x.sub.t.sup.s), wherein, if the equation is satisfied, the refreshed fusion object is ascertained depending on the refreshed feature fusion state x.sub.t;t.sup.f, and if the equation is not satisfied, the sensor object s.sub.t is associated by using the cross-covariance algorithm with a fusion object f.sub.t.sup.j.

    17. An apparatus for sensor data fusion for a vehicle, wherein the apparatus is configured to carry out a method comprising: providing current sensor object data that are representative of a sensor object s.sub.t ascertained by a sensor apparatus in surroundings of the vehicle at a time t; providing a fusion object data set comprising fusion object data, each of which is representative of one fusion object f.sub.t.sup.j out of j=1 . . . p fusion objects ascertained in the surroundings of the vehicle and each of which is provided at the time t for sensor data fusion; providing a sensor object data set H={s.sub.t-k.sup.i|i=1 . . . m, k=1 . . . n} comprising historic sensor object data, each of which is representative of one sensor object s.sub.t-k.sup.i out of i=1 . . . m sensor objects ascertained by the sensor apparatus in the surroundings of the vehicle at a preceding time t−k, where n≥1 and m≥1; ascertaining a reduced sensor object data set H′={s.sub.t-k.sup.α.sup.t-k|k=1 . . . n} depending on the sensor object data set H, wherein s.sub.t.sup.α.sup.t refers to a sensor object associated with a fusion object f.sub.t at the time t; depending on the reduced sensor object data set H′, associating the sensor object s.sub.t with a fusion object f.sub.t.sup.j; and ascertaining a refreshed fusion object.

    18. A computer program product comprising a non-transitory computer readable medium having stored thereon program code which, when executed on a processor, a microcontroller or a programmable hardware component, carries out the acts of: providing current sensor object data that are representative of a sensor object s.sub.t ascertained by a sensor apparatus in surroundings of a vehicle at a time t; providing a fusion object data set comprising fusion object data, each of which is representative of one fusion object f.sub.t.sup.j out of j=1 . . . p fusion objects ascertained in the surroundings of the vehicle and each of which is provided at the time t for sensor data fusion; providing a sensor object data set H={s.sub.t-k.sup.i|i=1 . . . m, k=1 . . . n} comprising historic sensor object data, each of which is representative of one sensor object s.sub.t-k.sup.i out of i=1 . . . m sensor objects ascertained by the sensor apparatus in the surroundings of the vehicle at a preceding time t−k, where n≥1 and m≥1; ascertaining a reduced sensor object data set H′={s.sub.t-k.sup.α.sup.t-k|k=1 . . . n} depending on the sensor object data set H, wherein s.sub.t.sup.α.sup.t refers to a sensor object associated with a fusion object f.sub.t at the time t; depending on the reduced sensor object data set H′, associating the sensor object s.sub.t with a fusion object f.sub.t.sup.j; and ascertaining a refreshed fusion object.

    Description

    BRIEF DESCRIPTION OF THE DRAWINGS

    [0041] FIG. 1 shows an exemplary vehicle with an apparatus for sensor data fusion.

    [0042] FIG. 2 shows an exemplary flowchart of a method for sensor data fusion.

    [0043] FIG. 3 shows an exemplary association between a respective reduced sensor object data set and fusion objects.

    DETAILED DESCRIPTION OF THE DRAWINGS

    [0044] Elements of the same design or function are given the same reference signs throughout the figures.

    [0045] A method for high-level object fusion based on the information matrix fusion (IMF) algorithm, which permits reliable use in embedded systems, is proposed below. The exemplary embodiment of FIG. 1 illustrates a vehicle F according to embodiments of the invention with an apparatus V for sensor data fusion and a sensor apparatus S1 that is configured to capture objects, in particular other road users and relevant properties thereof, and to ascertain a corresponding sensor object s.sub.t. The sensor apparatus S1 can, for example, be a camera, lidar (light detection and ranging), ladar (laser detection and ranging), radar (radio detection and ranging), ultrasonic, point laser or infrared sensor. Regardless of the method used for high-level object fusion, in the automobile field a list of rectangles or squares is usually output as the result of the object fusion, these object representations representing recognized, and in particular moving, objects in the surroundings of the vehicle F. FIG. 1 shows such a fusion object f.sub.t.sup.j, to which, for example, a reference point A, a length and a width with respect to the reference point A, and an orientation α.sub.f of the fusion object f.sub.t.sup.j with respect to a reference coordinate system R of the vehicle F are assigned as relevant properties. Indicator data that are representative of an uncertainty in the ascertainment of the length, width and orientation α.sub.f can, moreover, be assigned to the fusion object f.sub.t.sup.j. The uncertainty can, for example, be expressed by a variance. FIG. 1 further shows a sensor object s.sub.t ascertained by the sensor apparatus S1.
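    The rectangle representation with reference point, extent, orientation and per-property variance described above can be sketched as a simple data structure. The field names below are illustrative choices, not taken from the patent:

```python
from dataclasses import dataclass

@dataclass
class ObjectState:
    """Rectangle object representation; shared by sensor objects s_t and
    fusion objects f_t^j (field names are illustrative)."""
    x: float           # reference point A, longitudinal position [m]
    y: float           # reference point A, lateral position [m]
    length: float      # extent along the object axis [m]
    width: float       # extent across the object axis [m]
    alpha: float       # orientation relative to vehicle coordinate system R [rad]
    var_length: float  # variance (uncertainty) of the length estimate
    var_width: float   # variance of the width estimate
    var_alpha: float   # variance of the orientation estimate

# Example fusion object f_t^j with illustrative values.
f_t = ObjectState(x=12.0, y=-1.5, length=4.5, width=1.8, alpha=0.05,
                  var_length=0.4, var_width=0.1, var_alpha=0.01)
```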

    [0046] The vehicle F further comprises, by way of example, a further sensor apparatus S2 that is also configured for capture of the surroundings of the vehicle F. The sensor apparatuses S1, S2 are signal-coupled to the apparatus V for sensor data fusion. Sensor object data provided by the sensor apparatuses S1, S2 can be fused by the apparatus V in accordance with any method for high-level object fusion and stored in a fusion object data set.

    [0047] The method for high-level object fusion can involve a recursive estimation process on the basis of the information matrix fusion (IMF) algorithm.

    [0048] It emerges from the formal description of the IMF algorithm that, for every sensor object s.sub.t={x.sub.t.sup.s, p.sub.x.sub.t.sup.s} with properties x.sub.t.sup.s and their uncertainties p.sub.x.sub.t.sup.s that is to be fused at the time t to one of j=1 . . . p fusion objects f.sub.t.sup.j, the state of the sensor object s.sub.t-k at the last fusion time must be known. Since in theory very large values of k>>1 are possible, it is not readily possible to make all the information required for the IMF algorithm available on an embedded system with low memory resources.
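    For orientation, in the commonly published information form of IMF track-to-track fusion, the global estimate is updated with only the information the sensor track has gained since the last fusion time t−k, which is exactly why the sensor state at that earlier time must be stored. The sketch below shows this formulation under that assumption; the function and variable names are illustrative and not quoted from the patent:

```python
import numpy as np

def imf_update(x_f, P_f, x_s_t, P_s_t, x_s_prev, P_s_prev):
    """One IMF fusion step in information (inverse-covariance) form.

    x_f, P_f         : fusion object state and covariance before the update
    x_s_t, P_s_t     : sensor track state and covariance at time t
    x_s_prev, P_s_prev : sensor track state and covariance at the last
                         fusion time t-k -- the quantity whose storage the
                         patent's reduced history H' makes feasible
    Only the sensor information gained since t-k is added to the fusion
    object, avoiding double-counting of already-fused information.
    """
    I_f = np.linalg.inv(P_f)
    # Information matrix: add new sensor information, subtract old.
    I_new = I_f + np.linalg.inv(P_s_t) - np.linalg.inv(P_s_prev)
    # Information vector, updated the same way.
    y_new = (I_f @ x_f
             + np.linalg.inv(P_s_t) @ x_s_t
             - np.linalg.inv(P_s_prev) @ x_s_prev)
    P_new = np.linalg.inv(I_new)
    return P_new @ y_new, P_new
```

    In the scalar case this reduces to a precision-weighted average in which the sensor's earlier contribution is removed before its current one is added.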

    [0049] In this context, a data and program memory is in particular assigned to the apparatus V, in which a computer program that is explained below in more detail with reference to the flowchart of FIG. 2 is stored.

    [0050] In a first step P10, current sensor object data that are representative of a sensor object s.sub.t ascertained by the sensor apparatus S1 in the surroundings of the vehicle F at a time t are provided. In particular, a plurality of sensor object data can be ascertained in this step by the sensor apparatus S1, so that altogether q>1 different sensor objects s.sub.t.sup.i, i=1 . . . q ascertained by the sensor apparatus S1 are provided. For example, in the first step P10 current sensor object data of the further sensor apparatus S2 can also be provided.

    [0051] The program is continued in a step P20, in which a fusion object data set is provided. The fusion object data set comprises fusion object data, which are in each case representative of one fusion object f.sub.t.sup.j out of j=1 . . . p ascertained in the surroundings of the vehicle F, that are each provided at the time t for sensor data fusion. The fusion object data set is, for example, stored in a data memory of the apparatus V, and was ascertained in a preceding fusion process from the sensor object data of the sensor apparatuses S1, S2.

    [0052] A sensor object data set H={s.sub.t-k.sup.i|i=1 . . . m, k=1 . . . n} is provided in a subsequent step P30. The sensor object data set H comprises historic sensor object data, which are in each case representative of one sensor object s.sub.t-k.sup.i out of i=1 . . . m sensor objects ascertained by the sensor apparatus S1 in the surroundings of the vehicle F at a preceding time t−k, where n, m≥1. For example, in the step P30 a sensor object data set of the further sensor apparatus S2 can also be provided.

    [0053] The sensor object data set H can also be referred to as the association history. A fusion only results from a previous association of sensor object and fusion object. Thus instead of recording all q of the sensor objects s.sub.t.sup.i, i=1 . . . q recognized by the sensor apparatus S1 at the time t, it is sufficient to include only the p≤q sensor objects s.sub.t.sup.i, i=1 . . . p actually associated with a fusion object in the association history. In other words, in a step preceding the step P30 and following the step P10 in which the current sensor data are ascertained, it is possible for a selection to be made each time as to which of the ascertained current sensor data are included in the association history. In doing so, a unique association history can be recorded for each sensor apparatus S1, S2 independently of the association histories of other sensor apparatuses in the fusion system. Only the association history of the sensor apparatus S1 will therefore be considered below.
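    The selection described above — recording, per fusion object and per sensor apparatus, only the sensor objects actually associated with a fusion object — can be sketched as follows. The container layout and names are illustrative assumptions, not taken from the patent:

```python
def update_association_history(history, associations, sensor_objects):
    """Append to the per-fusion-object history only the sensor objects
    associated at time t.

    history:        dict fusion_id -> list of previously associated sensor objects
    associations:   dict fusion_id -> sensor_id associated at time t
    sensor_objects: dict sensor_id -> sensor object data at time t
    """
    for fusion_id, sensor_id in associations.items():
        history.setdefault(fusion_id, []).append(sensor_objects[sensor_id])
    return history

# Of q = 3 recognized sensor objects, only the one actually associated
# with fusion object "f1" enters the association history.
hist = update_association_history({}, {"f1": "s3"},
                                  {"s1": 1.0, "s2": 2.0, "s3": 3.0})
```

    A separate such history would be kept for each sensor apparatus S1, S2, matching the per-sensor histories described in the text.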

    [0054] The method is continued in a step P40 that follows step P30, in which, depending on the sensor object data set H, a reduced sensor object data set H′={s.sub.t-k.sup.α.sup.t-k|k=1 . . . n} is ascertained, wherein s.sub.t.sup.α.sup.t refers to a sensor object associated at the time t with a fusion object f.sub.t. The reduced sensor object data set H′ is then assigned to the respective fusion objects f.sub.t.sup.j and stored in the fusion object data.

    [0055] For the sensor object data set H, the necessity of n>1 only results from a possible association of different sensor objects at different times with the same fusion object. It can, however, usually be assumed that the sensor apparatus S1 continuously tracks the objects in the surroundings of the vehicle F, and it is only in dense traffic situations that ambiguous object formations can result, so that an actual object in the surroundings results in the formation of multiple sensor objects (separated in space or time). In addition, at any time no more than one object of the sensor apparatus S1 is associated with the same fusion object.

    [0056] It is therefore sufficient to note, at the fusion object, the few sensor objects s.sub.t-k.sup.α.sup.t-k, k=1 . . . n most recently associated with it. The history thus now contains different sensor objects at different times, H′={s.sub.t-k.sup.α.sup.t-k|k=1 . . . n}. A value of n=2 has been found appropriate in practice, so that one-off ambiguities can be handled. The resulting model, with n=2, for the realization of the association history is shown in FIG. 3. A reduced sensor object data set H′.sub.S1 of the sensor apparatus S1 and a reduced sensor object data set H′.sub.S2 of the further sensor apparatus S2 are here respectively assigned to each fusion object f.sub.t.sup.1, f.sub.t.sup.2. The reduced sensor object data set H′.sub.S1 comprises the sensor objects s.sub.t-1.sup.α.sup.t-1 and s.sub.t-2.sup.α.sup.t-2 of the sensor apparatus S1 associated at the times t−1 and t−2, while the reduced sensor object data set H′.sub.S2 comprises the sensor objects s.sub.t-1.sup.α.sup.t-1 and s.sub.t-2.sup.α.sup.t-2 of the further sensor apparatus S2 associated at the times t−1 and t−2.
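    With n=2, the reduced history H′ per fusion object and sensor apparatus behaves like a two-element ring buffer: appending a newly associated sensor object automatically evicts the oldest one. A minimal sketch (the use of collections.deque is an implementation choice, not from the patent):

```python
from collections import deque

N = 2  # history depth n = 2, found appropriate in practice

def make_reduced_history():
    """Reduced sensor object data set H' for one fusion object and one
    sensor apparatus: keeps only the N most recently associated objects."""
    return deque(maxlen=N)

h_prime = make_reduced_history()
for associated_sensor_object in ["s_t-3", "s_t-2", "s_t-1"]:
    h_prime.append(associated_sensor_object)
# The entry from t-3 has been dropped automatically; only the N most
# recent associations remain.
```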

    [0057] In a subsequent step P50, depending on the reduced sensor object data set H′, the sensor object s.sub.t is associated with a fusion object f.sub.t.sup.j, and a refreshed fusion object is ascertained.

    [0058] To this end a check is first made as to whether the reduced sensor data set H′ assigned to the respective fusion object f.sub.t.sup.j comprises a sensor object s.sub.t-k.sup.α.sup.t-k associated at an earlier time t−k, k=1 . . . n.

    [0059] In the event that the reduced sensor object data set H′ does not comprise an associated sensor object s.sub.t-k.sup.α.sup.t-k at any preceding time t−k, k=1 . . . n, the sensor object s.sub.t is associated in a step P60 with a fusion object f.sub.t.sup.j by using the cross-covariance algorithm, and the refreshed fusion object is ascertained. Otherwise the sensor object s.sub.t is associated in a step P70 with a fusion object f.sub.t.sup.j by the information matrix fusion (IMF) algorithm on the basis of the reduced sensor object data set H′, and a refreshed fusion object is ascertained.
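    The branching between the steps P60 and P70 can be sketched as a simple dispatch on the content of the reduced history; imf_fuse and cross_cov_fuse are placeholder callables standing in for the IMF and cross-covariance algorithms, not implementations from the patent:

```python
def fuse(sensor_obj, fusion_obj, reduced_history, imf_fuse, cross_cov_fuse):
    """Steps P50/P60/P70: use IMF when the reduced history H' holds a
    previously associated sensor object, fall back to the cross-covariance
    algorithm when it is empty."""
    if reduced_history:
        return imf_fuse(sensor_obj, fusion_obj, reduced_history)
    return cross_cov_fuse(sensor_obj, fusion_obj)

# With an empty history H' (e.g. first fusion of the sensor object),
# the cross-covariance path is taken.
path_empty = fuse("s_t", "f_t", [],
                  imf_fuse=lambda s, f, h: "IMF",
                  cross_cov_fuse=lambda s, f: "cross-covariance")
path_known = fuse("s_t", "f_t", ["s_t-1"],
                  imf_fuse=lambda s, f, h: "IMF",
                  cross_cov_fuse=lambda s, f: "cross-covariance")
```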

    [0060] If no fused sensor object related to an earlier time can be found in the limited association history explained above, for example during the first fusion of the sensor object, or because the choice of n cannot resolve the ambiguity of the sensor apparatus, the fusion is carried out with only approximate consideration of possible correlations, using the cross-covariance method.

    [0061] The program is subsequently ended or, possibly following a specified interruption, continued in step P10 with an updated object data set.

    [0062] In an alternative variant embodiment, the step P70 is supplemented with a plausibility check of the refreshed fusion object.

    [0063] In the practical application of the IMF algorithm for the fusion of the sensor object data of multiple sensor apparatuses on an embedded system, an error case can occur that results in the determination of fused properties that deviate grossly from the properties reported by the corresponding sensor apparatus.

    [0064] In this variant embodiment, feature data x.sub.t-1.sup.s, x.sub.t.sup.s are assigned to the respective sensor object s.sub.t-1, s.sub.t, that are representative of the properties reported by the sensor apparatus, such as a lateral extent, position and orientation of the sensor object s.sub.t-1, s.sub.t. Indicator data p.sub.x.sub.t-1.sup.s, p.sub.x.sub.t.sup.s are furthermore assigned to the respective sensor object s.sub.t-1, s.sub.t, that are representative of an uncertainty in the ascertainment of the properties.

    [0065] Similarly, feature data x.sub.t-1.sup.f, x.sub.t.sup.f that are representative of the properties of the corresponding fusion object f.sub.t-1.sup.j, f.sub.t.sup.j and indicator data p.sub.x.sub.t-1.sup.f, p.sub.x.sub.t.sup.f that are representative of an uncertainty in the ascertainment of the properties are assigned to the respective fusion object f.sub.t-1.sup.j, f.sub.t.sup.j.

    [0066] In a step P72 that follows the step P50, depending on the feature data x.sub.t-1.sup.s, x.sub.t-1.sup.f, a feature fusion state x.sub.t;t-1.sup.f is ascertained that is representative of the properties of the fusion object f.sub.t-1.sup.j at the time t following the sensor data fusion with x.sub.t-1.sup.s.

    [0067] In a step P74, depending on the feature data x.sub.t.sup.s, x.sub.t;t-1.sup.f and on the indicator data p.sub.x.sub.t.sup.s, p.sub.x.sub.t;t-1.sup.f, a refreshed feature fusion state x.sub.t;t.sup.f is thereupon ascertained that is representative of the properties of the fusion object f.sub.t.sup.j at the time t following the sensor data fusion with x.sub.t.sup.s.

    [0068] Finally, in a step P76, depending on the feature fusion state x.sub.t;t-1.sup.f, the refreshed feature fusion state x.sub.t;t.sup.f and the feature data x.sub.t.sup.s, a check is made as to whether the refreshed feature fusion state x.sub.t;t.sup.f satisfies the equation min(x.sub.t;t-1.sup.f, x.sub.t.sup.s)≤x.sub.t;t.sup.f≤max(x.sub.t;t-1.sup.f, x.sub.t.sup.s). In other words, a check is made in the step P76 as to whether the intuitive assumption that the updated properties following the fusion lie between the original properties and the properties of the fused sensor object is violated.
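    The plausibility check of the step P76 amounts to an element-wise interval test over the object properties. A minimal sketch, assuming each feature state is given as a sequence of scalar properties (extent, position, orientation):

```python
def fusion_result_plausible(x_pred_f, x_sensor, x_fused):
    """Step P76: each fused property must lie between the prior fusion
    value and the sensor value,
    min(x_t;t-1^f, x_t^s) <= x_t;t^f <= max(x_t;t-1^f, x_t^s)."""
    return all(min(p, s) <= f <= max(p, s)
               for p, s, f in zip(x_pred_f, x_sensor, x_fused))

# Fused value inside the interval spanned by prior and sensor value:
# the temporarily stored IMF result may be kept.
ok = fusion_result_plausible([4.0], [4.6], [4.4])
# Fused value outside the interval: discard the IMF result and fall
# back to the cross-covariance method.
bad = fusion_result_plausible([4.0], [4.6], [5.1])
```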

    [0069] In the event that the equation is satisfied, the refreshed fusion object is ascertained depending on the refreshed feature fusion state x.sub.t;t.sup.f in the step P70. Otherwise, the sensor object s.sub.t is associated in the step P60 with a fusion object f.sub.t.sup.j by using the cross-covariance algorithm, and the refreshed fusion object is ascertained.

    [0070] To handle this error case, the IMF fusion of the fusion and sensor objects is in particular carried out as usual, and the result stored temporarily for checking. If the result violates the intuitive assumption described above, the temporary result is discarded. A fusion instead takes place with only approximative consideration of possible correlations, making use of the cross-covariance method.