Method and Apparatus for Sensor Data Fusion for a Vehicle
20220128680 · 2022-04-28
Inventors
- Dominik BAUCH (Muehldorf am Inn, DE)
- Marco BAUMGARTL (Gilching, DE)
- Michael HIMMELSBACH (Muenchen, DE)
- Josef MEHRINGER (Gmund, DE)
- Daniel MEISSNER (Friedberg, DE)
- Luca TRENTINAGLIA (Eichenau, DE)
CPC Classification
- G01S7/53 (Physics)
- G01S13/86 (Physics)
- G01S15/86 (Physics)
- G01S13/87 (Physics)
International Classification
- G01S13/86 (Physics)
- G01S13/72 (Physics)
Abstract
A method for sensor data fusion for a vehicle includes providing current sensor object data that are representative of a sensor object s.sub.t ascertained at the time t; providing fusion object data that are representative of fusion objects f.sub.t.sup.j intended for sensor data fusion at the time t; providing a sensor object data record H including historical sensor object data that are representative of a sensor object s.sub.t-k.sup.i ascertained at a preceding time t−k; taking the sensor object data record H as a basis for ascertaining a reduced sensor object data record H′={s.sub.t-k.sup.α.sup.t-k|k=1 . . . n}, wherein s.sub.t.sup.α.sup.t denotes a sensor object associated with a fusion object f.sub.t.sup.j at the time t; and taking the reduced sensor object data record H′ as a basis for associating the sensor object s.sub.t with a fusion object f.sub.t.sup.j and ascertaining a refreshed fusion object.
Claims
1.-9. (canceled)
10. A method for sensor data fusion for a vehicle, wherein a sensor apparatus is assigned to the vehicle, the method comprising: providing current sensor object data that are representative of a sensor object s.sub.t ascertained by the sensor apparatus in surroundings of the vehicle at a time t; providing a fusion object data set comprising fusion object data, each of which is representative of one fusion object f.sub.t.sup.j out of j=1 . . . p fusion objects ascertained in the surroundings of the vehicle and each of which is provided at the time t for sensor data fusion; providing a sensor object data set H={s.sub.t-k.sup.i|i=1 . . . m, k=1 . . . n} comprising historic sensor object data, each of which is representative of one sensor object s.sub.t-k.sup.i out of i=1 . . . m sensor objects ascertained by the sensor apparatus in the surroundings of the vehicle at a preceding time t−k, where n, m≥1; ascertaining a reduced sensor object data set H′={s.sub.t-k.sup.α.sup.t-k|k=1 . . . n} depending on the sensor object data set H, wherein s.sub.t.sup.α.sup.t refers to a sensor object associated with a fusion object f.sub.t.sup.j at the time t; depending on the reduced sensor object data set H′, associating the sensor object s.sub.t with a fusion object f.sub.t.sup.j; and ascertaining a refreshed fusion object.
11. The method according to claim 10, wherein 1≤n≤5.
12. The method according to claim 11, wherein n=2.
13. The method according to claim 10, wherein the sensor object s.sub.t is associated with the fusion object f.sub.t.sup.j by the information matrix fusion (IMF) algorithm.
14. The method according to claim 13, wherein: the reduced sensor object data set H′ is assigned to the respective fusion objects f.sub.t.sup.j and stored in the fusion object data, and depending on the fusion object data set, the sensor object s.sub.t is associated with the fusion object f.sub.t.sup.j, and the refreshed fusion object is ascertained.
15. The method according to claim 14, wherein, if the reduced sensor object data set H′ assigned to a respective fusion object f.sub.t.sup.j does not comprise an associated sensor object s.sub.t-k.sup.α.sup.t-k at any preceding time t−k, k=1 . . . n, the sensor object s.sub.t is associated with a fusion object f.sub.t.sup.j by using the cross-covariance algorithm.
16. The method according to claim 14, wherein: feature data x.sub.t-1.sup.s, x.sub.t.sup.s that are representative of a lateral extent, position and orientation of the sensor object s.sub.t-1, s.sub.t, and indicator data p.sub.x.sub.t-1.sup.s, p.sub.x.sub.t.sup.s that are representative of an uncertainty in the ascertainment of the lateral extent, position or orientation, are assigned to the sensor object s.sub.t-1, s.sub.t; feature data x.sub.t-1.sup.f, x.sub.t.sup.f that are representative of a lateral extent, position and orientation of the fusion object f.sub.t-1.sup.j, f.sub.t.sup.j, and indicator data p.sub.x.sub.t-1.sup.f, p.sub.x.sub.t.sup.f that are representative of an uncertainty in the ascertainment of the lateral extent, position or orientation, are assigned to the fusion object f.sub.t-1.sup.j, f.sub.t.sup.j; depending on the feature data x.sub.t-1.sup.s, x.sub.t-1.sup.f, a feature fusion state x.sub.t;t-1.sup.f that is representative of the lateral extent, position and orientation of the fusion object f.sub.t.sup.j at the time t following the sensor data fusion with x.sub.t-1.sup.s is ascertained; depending on the feature data x.sub.t.sup.s, x.sub.t;t-1.sup.f and on the indicator data p.sub.x.sub.t.sup.s, p.sub.x.sub.t;t-1.sup.f, a refreshed feature fusion state x.sub.t;t.sup.f is ascertained that is representative of the lateral extent, position and orientation of the fusion object f.sub.t.sup.j at the time t following the sensor data fusion with x.sub.t.sup.s; and depending on the feature fusion state x.sub.t;t-1.sup.f, the refreshed feature fusion state x.sub.t;t.sup.f and the feature data x.sub.t.sup.s, a check is made as to whether the refreshed feature fusion state x.sub.t;t.sup.f satisfies the equation min(x.sub.t;t-1.sup.f, x.sub.t.sup.s)≤x.sub.t;t.sup.f≤max(x.sub.t;t-1.sup.f, x.sub.t.sup.s), wherein, if the equation is satisfied, the refreshed fusion object is ascertained depending on the refreshed feature fusion state x.sub.t;t.sup.f, and if the equation is not satisfied, the sensor object s.sub.t is associated by using the cross-covariance algorithm with a fusion object f.sub.t.sup.j.
17. An apparatus for sensor data fusion for a vehicle, wherein the apparatus is configured to carry out a method comprising: providing current sensor object data that are representative of a sensor object s.sub.t ascertained by a sensor apparatus in surroundings of the vehicle at a time t; providing a fusion object data set comprising fusion object data, each of which is representative of one fusion object f.sub.t.sup.j out of j=1 . . . p fusion objects ascertained in the surroundings of the vehicle and each of which is provided at the time t for sensor data fusion; providing a sensor object data set H={s.sub.t-k.sup.i|i=1 . . . m, k=1 . . . n} comprising historic sensor object data, each of which is representative of one sensor object s.sub.t-k.sup.i out of i=1 . . . m sensor objects ascertained by the sensor apparatus in the surroundings of the vehicle at a preceding time t−k, where n, m≥1; ascertaining a reduced sensor object data set H′={s.sub.t-k.sup.α.sup.t-k|k=1 . . . n} depending on the sensor object data set H, wherein s.sub.t.sup.α.sup.t refers to a sensor object associated with a fusion object f.sub.t.sup.j at the time t; depending on the reduced sensor object data set H′, associating the sensor object s.sub.t with a fusion object f.sub.t.sup.j; and ascertaining a refreshed fusion object.
18. A computer product comprising a non-transitory computer readable medium having stored thereon program code which, when executed on a processor, a microcontroller or a programmable hardware component, carries out the acts of: providing current sensor object data that are representative of a sensor object s.sub.t ascertained by a sensor apparatus in surroundings of the vehicle at a time t; providing a fusion object data set comprising fusion object data, each of which is representative of one fusion object f.sub.t.sup.j out of j=1 . . . p fusion objects ascertained in the surroundings of the vehicle and each of which is provided at the time t for sensor data fusion; providing a sensor object data set H={s.sub.t-k.sup.i|i=1 . . . m, k=1 . . . n} comprising historic sensor object data, each of which is representative of one sensor object s.sub.t-k.sup.i out of i=1 . . . m sensor objects ascertained by the sensor apparatus in the surroundings of the vehicle at a preceding time t−k, where n, m≥1; ascertaining a reduced sensor object data set H′={s.sub.t-k.sup.α.sup.t-k|k=1 . . . n} depending on the sensor object data set H, wherein s.sub.t.sup.α.sup.t refers to a sensor object associated with a fusion object f.sub.t.sup.j at the time t; depending on the reduced sensor object data set H′, associating the sensor object s.sub.t with a fusion object f.sub.t.sup.j; and ascertaining a refreshed fusion object.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
DETAILED DESCRIPTION OF THE DRAWINGS
[0044] Elements of the same design or function are given the same reference signs throughout the figures.
[0045] A method for high-level object fusion based on the information matrix fusion (IMF) algorithm, which permits reliable use in embedded systems, is proposed below. On the basis of the exemplary embodiment of
[0046] The vehicle F further comprises, by way of example, a further sensor apparatus S2 that is also configured to capture the surroundings of the vehicle F. The sensor apparatuses S1, S2 are signal-coupled to the apparatus V for sensor data fusion. Sensor object data provided by the sensor apparatuses S1, S2 can be fused by the apparatus V in accordance with any method for high-level object fusion and stored in a fusion object data set.
[0047] The method for high-level object fusion can involve a recursive estimation process on the basis of the information matrix fusion (IMF) algorithm.
[0048] It emerges from the formal description of the IMF algorithm that for every sensor object s.sub.t={x.sub.t.sup.s, p.sub.x.sub.t.sup.s} with properties x.sub.t.sup.s and their uncertainties p.sub.x.sub.t.sup.s that is to be fused at the time t to one of j=1 . . . p fusion objects f.sub.t.sup.j, the state of the sensor object s.sub.t-k at the last fusion time must be known. Since in theory very large values of k>>1 are possible, it is not readily possible to make all the information required for the IMF algorithm available on an embedded system with low memory resources.
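To illustrate why the state at the last fusion time is needed, the following minimal sketch shows an information-form track-to-track update in the style of the IMF algorithm: the information contributed by the sensor track at the last fusion time t−k is subtracted so that shared history is not counted twice. The function name, the use of numpy and the Gaussian mean/covariance representation are illustrative assumptions, not the patent's implementation.

```python
import numpy as np

def imf_style_update(x_f, P_f, x_s, P_s, x_s_last, P_s_last):
    """Fuse a sensor track (x_s, P_s) into a fusion track (x_f, P_f),
    removing the sensor information already fused at the last fusion
    time (x_s_last, P_s_last) to avoid double-counting (sketch only)."""
    to_info = lambda x, P: (np.linalg.inv(P), np.linalg.inv(P) @ x)
    I_f, i_f = to_info(x_f, P_f)            # current fusion track
    I_s, i_s = to_info(x_s, P_s)            # current sensor track
    I_l, i_l = to_info(x_s_last, P_s_last)  # sensor track at last fusion
    I_new = I_f + I_s - I_l                 # add new info, subtract old
    i_new = i_f + i_s - i_l
    P_new = np.linalg.inv(I_new)
    return P_new @ i_new, P_new             # refreshed mean and covariance
```

Without (x_s_last, P_s_last), the subtraction above is impossible, which is exactly the memory requirement that the reduced history described below is designed to bound.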
[0049] In this context, a data and program memory is in particular assigned to the apparatus V, in which is stored a computer program that is explained below in more detail with reference to the flowchart of
[0050] In a first step P10, current sensor object data that are representative of a sensor object s.sub.t ascertained by the sensor apparatus S1 in the surroundings of the vehicle F at a time t are provided. In particular, a plurality of sensor object data can be ascertained in this step by the sensor apparatus S1, so that altogether q>1 different sensor objects s.sub.t.sup.i, i=1 . . . q ascertained by the sensor apparatus S1 are provided. For example, current sensor object data of the further sensor apparatus S2 can also be provided in the first step P10.
[0051] The program is continued in a step P20, in which a fusion object data set is provided. The fusion object data set comprises fusion object data, which are in each case representative of one fusion object f.sub.t.sup.j out of j=1 . . . p fusion objects ascertained in the surroundings of the vehicle F, each provided at the time t for sensor data fusion. The fusion object data set is, for example, stored in a data memory of the apparatus V and was ascertained in a preceding fusion process from the sensor object data of the sensor apparatuses S1, S2.
[0052] A sensor object data set H={s.sub.t-k.sup.i|i=1 . . . m, k=1 . . . n} is provided in a subsequent step P30. The sensor object data set H comprises historic sensor object data, which are in each case representative of one sensor object s.sub.t-k.sup.i out of i=1 . . . m sensor objects ascertained by the sensor apparatus S1 in the surroundings of the vehicle F at a preceding time t−k, where n, m≥1. For example, a sensor object data set of the further sensor apparatus S2 can also be provided in the step P30.
[0053] The sensor object data set H can also be referred to as the association history. A fusion only results from a previous association of a sensor object with a fusion object. Thus, instead of recording all q sensor objects s.sub.t.sup.i, i=1 . . . q recognized by the sensor apparatus S1 at the time t, it is sufficient to include in the association history only the p≤q sensor objects s.sub.t.sup.i, i=1 . . . p actually associated with a fusion object. In other words, in a step preceding the step P30 and following the step P10, in which the current sensor data are ascertained, a selection can be made each time as to which of the ascertained current sensor data are included in the association history; a sketch of this selection is given below. In doing so, a separate association history can be recorded for each sensor apparatus S1, S2, independently of the association histories of the other sensor apparatuses in the fusion system. Only the association history of the sensor apparatus S1 will therefore be considered in the following.
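The following sketch illustrates this selection, assuming the association result is available as a mapping from each sensor apparatus to the sensor objects associated at time t; all names and types are illustrative assumptions rather than the patent's data model.

```python
def update_association_history(history, associations, t):
    """Append, per sensor apparatus, only the sensor objects that were
    actually associated with some fusion object at time t (sketch).

    history:      dict mapping sensor_id -> list of (time, sensor_object)
    associations: dict mapping sensor_id -> iterable of the p <= q sensor
                  objects associated with a fusion object at time t
    """
    for sensor_id, associated in associations.items():
        history.setdefault(sensor_id, [])
        for s in associated:
            # unassociated sensor objects never enter the history
            history[sensor_id].append((t, s))
    return history
```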
[0054] The method is continued in a step P40 that follows the step P30, in which, depending on the sensor object data set H, a reduced sensor object data set H′={s.sub.t-k.sup.α.sup.t-k|k=1 . . . n} is ascertained, wherein s.sub.t.sup.α.sup.t refers to the sensor object associated with a fusion object f.sub.t.sup.j at the time t.
[0055] For the sensor object data set H, the necessity of n>1 only results from a possible association of different sensor objects at different times with the same fusion object. It can, however, usually be assumed that the sensor apparatus S1 continuously tracks the objects in the surroundings of the vehicle F, and it is only in dense traffic situations that ambiguous object formations can result, so that an actual object in the surroundings results in the formation of multiple sensor objects (separated in space or time). In addition, at any time no more than one object of the sensor apparatus S1 is associated with the same fusion object.
[0056] It is therefore sufficient to store at the fusion object the few sensor objects s.sub.t-k.sup.α.sup.t-k, k=1 . . . n most recently associated with it. The history thus now contains different sensor objects at different times, H′={s.sub.t-k.sup.α.sup.t-k|k=1 . . . n}. A value of n=2 has been found appropriate in practice, so that one-off ambiguities can be handled. The resulting model, with n=2, for the realization of the association history is shown in the corresponding figure: the reduced sensor object data set H′.sub.S1 comprises the sensor objects s.sub.t-1.sup.α.sup.t-1 and s.sub.t-2.sup.α.sup.t-2 of the sensor apparatus S1 associated at the times t−1 and t−2, while the reduced sensor object data set H′.sub.S2 comprises the sensor objects s.sub.t-1.sup.α.sup.t-1 and s.sub.t-2.sup.α.sup.t-2 of the further sensor apparatus S2 associated at the times t−1 and t−2.
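A bounded realization of this per-sensor history with n=2 can be sketched as follows; the class layout and the use of a fixed-length deque are illustrative assumptions chosen to show the constant memory footprint, not the patent's data structure.

```python
from collections import deque

N = 2  # history depth n = 2, found appropriate in practice

class FusionObject:
    """Sketch: a fusion object storing, per sensor apparatus, only the
    n sensor objects most recently associated with it (the set H')."""

    def __init__(self, state):
        self.state = state
        self.reduced_history = {}  # sensor_id -> deque of (time, sensor_object)

    def record_association(self, sensor_id, t, sensor_object):
        buf = self.reduced_history.setdefault(sensor_id, deque(maxlen=N))
        buf.append((t, sensor_object))  # oldest entry drops out automatically

    def last_associated(self, sensor_id):
        buf = self.reduced_history.get(sensor_id)
        return buf[-1] if buf else None  # most recent (t-k, s_{t-k}) or None
```

The `maxlen` bound is what makes the memory requirement independent of how far back the last fusion lies, addressing the embedded-system constraint from paragraph [0048].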
[0057] In a subsequent step P50, depending on the reduced sensor object data set H′, the sensor object s.sub.t is associated with a fusion object f.sub.t.sup.j, and a refreshed fusion object is ascertained.
[0058] To this end a check is first made as to whether the reduced sensor object data set H′ assigned to the respective fusion object f.sub.t.sup.j comprises a sensor object s.sub.t-k.sup.α.sup.t-k associated at a preceding time t−k, k=1 . . . n. If this is the case, the sensor object s.sub.t is associated with the fusion object f.sub.t.sup.j in a step P70 by the IMF algorithm, and the refreshed fusion object is ascertained.
[0059] In the event that the reduced sensor object data set H′ does not comprise an associated sensor object s.sub.t-k.sup.α.sup.t-k at any preceding time t−k, k=1 . . . n, the sensor object s.sub.t is associated with a fusion object f.sub.t.sup.j in a step P60 by using the cross-covariance algorithm, and the refreshed fusion object is ascertained.
[0060] If no fused sensor object relating to an earlier time can be found in the limited association history explained above, for example during the first fusion of the sensor object, or because the choice of n cannot resolve the ambiguity of the sensor apparatus, the fusion is applied with only approximative consideration of possible correlations, using the cross-covariance method.
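Putting the two branches together, the dispatch between the IMF path (step P70) and the cross-covariance fallback (step P60) might look as follows; the callables imf_fuse and cross_cov_fuse stand for the two fusion routines and are assumptions supplied by the caller, as is the FusionObject sketch above.

```python
def associate_and_fuse(fusion_obj, sensor_id, s_t, t, imf_fuse, cross_cov_fuse):
    """Sketch of steps P50-P70: fuse with IMF when the reduced history
    H' holds an earlier associated sensor object, otherwise fall back
    to the cross-covariance method (first fusion, unresolved ambiguity)."""
    previous = fusion_obj.last_associated(sensor_id)
    if previous is not None:
        t_last, s_last = previous
        refreshed = imf_fuse(fusion_obj, s_t, s_last)    # step P70
    else:
        refreshed = cross_cov_fuse(fusion_obj, s_t)      # step P60
    fusion_obj.record_association(sensor_id, t, s_t)     # update H'
    return refreshed
```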
[0061] The program is subsequently ended or, possibly following a specified interruption, continued in step P10 with an updated object data set.
[0062] In an alternative variant embodiment, the step P70 is supplemented with a plausibility check of the refreshed fusion object.
[0063] In the practical application of the IMF algorithm for the fusion of the sensor object data of multiple sensor apparatuses on an embedded system, an error case can occur that results in the determination of fused properties that deviate grossly from the properties reported by the corresponding sensor apparatus.
[0064] In this variant embodiment, feature data x.sub.t-1.sup.s, x.sub.t.sup.s are assigned to the respective sensor object s.sub.t-1, s.sub.t, that are representative of the properties reported by the sensor apparatus, such as a lateral extent, position and orientation of the sensor object s.sub.t-1, s.sub.t. Indicator data p.sub.x.sub.t-1.sup.s, p.sub.x.sub.t.sup.s that are representative of an uncertainty in the ascertainment of these properties are likewise assigned to the sensor object s.sub.t-1, s.sub.t.
[0065] Similarly, feature data x.sub.t-1.sup.f, x.sub.t.sup.f that are representative of the properties of the corresponding fusion object f.sub.t-1.sup.j, f.sub.t.sup.j and indicator data p.sub.x.sub.t-1.sup.f, p.sub.x.sub.t.sup.f that are representative of an uncertainty in their ascertainment are assigned to the fusion object f.sub.t-1.sup.j, f.sub.t.sup.j.
[0066] In a step P72 that follows the step P50, depending on the feature data x.sub.t-1.sup.s, x.sub.t-1.sup.f, a feature fusion state x.sub.t;t-1.sup.f is ascertained that is representative of the properties of the fusion object f.sub.t.sup.j at the time t following the sensor data fusion with x.sub.t-1.sup.s.
[0067] In a step P74, depending on the feature data x.sub.t.sup.s, x.sub.t;t-1.sup.f and on the indicator data p.sub.x.sub.t.sup.s, p.sub.x.sub.t;t-1.sup.f, a refreshed feature fusion state x.sub.t;t.sup.f is thereupon ascertained that is representative of the properties of the fusion object f.sub.t.sup.j at the time t following the sensor data fusion with x.sub.t.sup.s.
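One simple reading of the step P74, assuming scalar features and variance-like indicator data, is an inverse-variance weighted combination; this concrete weighting is an illustrative assumption, since the patent leaves the fusion rule to the IMF algorithm.

```python
def refresh_feature(x_pred_f, p_pred_f, x_s, p_s):
    """Sketch of step P74: combine the feature fusion state x_{t|t-1}^f
    with the sensor feature x_t^s, weighting each by the inverse of its
    uncertainty indicator (assumed variance-like)."""
    w_f = 1.0 / p_pred_f       # weight of the feature fusion state
    w_s = 1.0 / p_s            # weight of the sensor feature
    x_tt = (w_f * x_pred_f + w_s * x_s) / (w_f + w_s)
    p_tt = 1.0 / (w_f + w_s)   # fused uncertainty indicator
    return x_tt, p_tt
```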
[0068] Finally, in a step P76, depending on the feature fusion state x.sub.t;t-1.sup.f, the refreshed feature fusion state x.sub.t;t.sup.f and the feature data x.sub.t.sup.s, a check is made as to whether the refreshed feature fusion state x.sub.t;t.sup.f satisfies the equation min(x.sub.t;t-1.sup.f, x.sub.t.sup.s)≤x.sub.t;t.sup.f≤max(x.sub.t;t-1.sup.f, x.sub.t.sup.s). In other words, a check is made in the step P76 as to whether the intuitive assumption that the updated properties following the fusion lie between the original properties and the properties of the fused sensor object is violated.
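This plausibility check translates directly into an elementwise bracket test; the following sketch assumes the features are numpy arrays, which is an illustrative choice.

```python
import numpy as np

def is_plausible(x_pred_f, x_s, x_tt):
    """Sketch of step P76: accept the refreshed feature fusion state
    only if every component lies between the feature fusion state and
    the fused sensor feature."""
    lo = np.minimum(x_pred_f, x_s)
    hi = np.maximum(x_pred_f, x_s)
    return bool(np.all((lo <= x_tt) & (x_tt <= hi)))
```

If the test fails, the temporarily stored IMF result is discarded and the cross-covariance fallback of step P60 is used, as described in the following paragraphs.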
[0069] In the event that the equation is satisfied, the refreshed fusion object is ascertained depending on the refreshed feature fusion state x.sub.t;t.sup.f in the step P70. Otherwise, the sensor object s.sub.t is associated in the step P60 with a fusion object f.sub.t.sup.j by using the cross-covariance algorithm, and the refreshed fusion object is ascertained.
[0070] To handle this error case, the IMF fusion of the fusion and sensor objects is in particular carried out as usual, and the result is stored temporarily for checking. If the result violates the intuitive assumption described above, the temporary result is discarded. A fusion instead takes place with only approximative consideration of possible correlations, making use of the cross-covariance method.