Method and device for sensor data fusion for a vehicle
11756310 · 2023-09-12
Assignee
Inventors
- Michael Himmelsbach (Munich, DE)
- Luca TRENTINAGLIA (Eichenau, DE)
- Dominik BAUCH (Muehldorf am Inn, DE)
- Daniel MEISSNER (Friedberg, DE)
- Josef Mehringer (Gmund, DE)
- Marco BAUMGARTL (Gilching, DE)
CPC classification
B60W50/0098
PERFORMING OPERATIONS; TRANSPORTING
G06V20/58
PHYSICS
G06V20/56
PHYSICS
International classification
G06V20/58
PHYSICS
G06V10/80
PHYSICS
Abstract
A method and device for sensor data fusion for a vehicle as well as a computer program and a computer-readable storage medium are disclosed. At least one sensor device (S1) is associated with the vehicle (F), and in the method, fusion object data is provided representative of a fusion object (O.sub.F) detected in an environment of the vehicle (F); sensor object data is provided representative of a sensor object (O.sub.S) detected by the sensor device (S1) in the environment of the vehicle (F); indicator data is provided representative of an uncertainty in the determination of the sensor object data; reference point transformation candidates of the sensor object (O.sub.S) are determined depending on the indicator data; and an innovated fusion object is determined depending on the reference point transformation candidates.
Claims
1. A method of sensor data fusion for a vehicle (F), wherein a sensor device (S1) is associated with the vehicle (F), and wherein in the method a fusion object data set is provided comprising fusion object data each representative of a fusion object (O.sub.F) detected in an environment of the vehicle (F), each fusion object (O.sub.F) being associated with a fusion reference point, sensor object data is provided, which is representative of a sensor object (O.sub.S) detected by the respective sensor device (S1) in the environment of the vehicle (F) and to which a sensor reference point (A.sub.S) is assigned, indicator data is provided that is representative of an uncertainty in the determination of the sensor object data, wherein the indicator data comprises a first, a second, and a third characteristic value of the sensor object (O.sub.S), the first characteristic value being representative of an uncertainty in the determination of a length (l), the second characteristic value being representative of an uncertainty in the determination of a width (b), and the third characteristic value being representative of an uncertainty in the determination of an orientation (α), and wherein the first, second and third characteristic values are each compared with a separate predetermined threshold value, reference point transformation candidates of the sensor object (O.sub.S) are determined depending on the indicator data, wherein reference point transformation candidates are determined when either the first characteristic value or the second characteristic value is less than or equal to the respective threshold value and the third characteristic value is less than or equal to the respective threshold value, and depending on the reference point transformation candidates, an innovated fusion object is determined.
2. The method according to claim 1, wherein the sensor object data and fusion object data is representative of a lateral extent and/or an orientation of the sensor object (O.sub.S) and/or fusion object (O.sub.F), the fusion reference point (A.sub.F) and/or sensor reference point (A.sub.S) lies on a contour of the sensor object (O.sub.S) and/or fusion object (O.sub.F), and the indicator data is representative of an uncertainty in the determination of the lateral extent and/or orientation.
3. The method according to claim 1, wherein the sensor object data and fusion object data are representative of a rectangular or cuboid representation of the sensor object (O.sub.S) and/or fusion object (O.sub.F), wherein the respective sensor object (O.sub.S) and/or fusion object (O.sub.F) extends laterally rectangularly along its longitudinal axis with the length (l) and along its transverse axis with the width (b) in the orientation (α) relative to the vehicle (F), and a reference point is assigned to each corner of the rectangle and to each centre point of its sides, wherein one of these reference points in each case forms the fusion reference point (A.sub.F) and/or sensor reference point (A.sub.S), and in the event that: the first characteristic value exceeds the respective threshold value and the second and third characteristic values are less than or equal to the respective threshold value, only reference points of the sensor object (O.sub.S) along its transverse axis are assigned to the sensor object (O.sub.S) as reference point transformation candidates; the second characteristic value exceeds the respective threshold value and the first and third characteristic values are less than or equal to the respective threshold value, only reference points of the sensor object (O.sub.S) along its longitudinal axis are assigned to the sensor object (O.sub.S) as reference point transformation candidates; the third characteristic value exceeds the respective threshold value or the first and the second characteristic values exceed the respective threshold value, no reference point of the sensor object (O.sub.S) is assigned to the sensor object (O.sub.S) as a reference point transformation candidate.
4. The method according to claim 3, wherein the indicator data is representative of the fact that the corresponding sensor device (S1) is capable of detecting the lateral extent of an object and/or the orientation of the object, and reference points of the sensor object (O.sub.S) are assigned to the sensor object (O.sub.S) as reference point transformation candidates only if they have been determined by a sensor device (S1) which is capable of detecting the lateral extent of an object and the orientation of the object.
5. The method according to claim 3, wherein the indicator data is representative of a visual range of the sensor device (S1), depending on the sensor object data and the indicator data, it is determined whether the corresponding sensor object (O.sub.S) is located within a predetermined range within the visual range of the corresponding sensor device (S1), and reference points of the sensor object (O.sub.S) are assigned to the sensor object (O.sub.S) as reference point transformation candidates only if it is located within the predetermined range within the visual range of the corresponding sensor device (S1).
6. The method according to claim 3, wherein it is checked whether a reference point transformation candidate is associated with the sensor object (O.sub.S), wherein in the case that a reference point transformation candidate is associated with the sensor object (O.sub.S): depending on the sensor object (O.sub.S), a respective transformation state of the sensor object (O.sub.S) with respect to each reference point transformation candidate is determined and compared with a respective fusion object (O.sub.F); depending on the comparison, the sensor object (O.sub.S) is assigned to a fusion object (O.sub.F); and depending on the corresponding transformation state of the sensor object (O.sub.S) and the fusion object data, an innovated fusion object is determined; and in the case that no reference point transformation candidate is assigned to the sensor object (O.sub.S), the sensor object (O.sub.S) is evaluated as a point target and: a respective shortest distance between the reference points of the sensor object (O.sub.S) and the reference points of a respective fusion object (O.sub.F) is determined; a transformation state of the respective fusion object (O.sub.F) with respect to the reference point of the fusion object (O.sub.F) which has the shortest distance to the corresponding reference point of the sensor object (O.sub.S) is determined and compared with the sensor object (O.sub.S); depending on the comparison, the sensor object (O.sub.S) is assigned to a fusion object (O.sub.F); and depending on the corresponding transformation state of the fusion object (O.sub.F) and the sensor object data, an innovated fusion object is determined.
7. The method according to claim 6, wherein the fusion object (O.sub.F) is assigned, depending on the respective assigned sensor object, a respective visual range characteristic value which is representative of the visual range of the respective sensor device (S1), depending on the respective fusion object data and the respective visual range characteristic value, it is determined whether the fusion object (O.sub.F) is located within a predetermined range within the visual range of the corresponding sensor device (S1), and the innovated fusion object is evaluated as a point target if the fusion object (O.sub.F) is located outside the predetermined range within the respective visual range of all corresponding sensor devices (S1).
8. The method according to claim 6, wherein in determining the innovated fusion object, it is checked whether the fusion object (O.sub.F) or the sensor object (O.sub.S) has been evaluated as a point target, wherein: in the case that only the sensor object (O.sub.S) has been evaluated as a point target, the fusion reference point of the fusion object (O.sub.F) is taken over as the fusion reference point of the innovated fusion object, and in the case that neither the fusion object (O.sub.F) nor the sensor object (O.sub.S) has been evaluated as a point target or only the fusion object (O.sub.F) has been evaluated as a point target, the sensor reference point (A.sub.S) of the sensor object (O.sub.S) is taken over as the fusion reference point of the innovated fusion object.
9. A non-transitory, computer-readable storage medium storing a computer program for sensor data fusion for a vehicle (F), wherein the computer program comprises instructions which, when the computer program is executed by a computer, cause the same to execute the method according to claim 1.
10. A sensor data fusion device (V) to determine an innovated fusion object for a vehicle (F), wherein the device (V) is configured to: obtain a fusion object data set comprising fusion object data each representative of a fusion object (O.sub.F) detected in an environment of the vehicle (F), each fusion object (O.sub.F) being associated with a fusion reference point, obtain sensor object data, which is representative of a sensor object (O.sub.S) detected by the respective sensor device (S1) in the environment of the vehicle (F) and to which a sensor reference point (A.sub.S) is assigned, obtain indicator data, which is representative of an uncertainty in the determination of the sensor object data, wherein the indicator data comprises a first, a second, and a third characteristic value of the sensor object (O.sub.S), the first characteristic value being representative of an uncertainty in the determination of a length (l), the second characteristic value being representative of an uncertainty in the determination of a width (b), and the third characteristic value being representative of an uncertainty in the determination of an orientation (α), and wherein the first, second and third characteristic values are each compared with a separate predetermined threshold value, determine reference point transformation candidates of the sensor object (O.sub.S) depending on the indicator data, wherein reference point transformation candidates are determined when either the first characteristic value or the second characteristic value is less than or equal to the respective threshold value and the third characteristic value is less than or equal to the respective threshold value, and determine the innovated fusion object based on the reference point transformation candidates.
Description
BRIEF DESCRIPTION OF THE FIGURES
(1) Examples of embodiments of the invention are explained in more detail below with reference to the schematic drawings.
DETAILED DESCRIPTION
(6) Elements of the same construction or function are provided with the same reference signs throughout all Figures.
(7) In the following, a method for high-level object fusion is proposed, which provides a reference point treatment appropriate to the particular sensor and object. In an exemplary embodiment, a vehicle F comprises a sensor device S1 arranged to detect an environment of the vehicle F, as well as a sensor data fusion device V.
(8) The sensor device S1 is associated, by its orientation and physical characteristics, with a field of view or visual range indicated by dashed lines around the angle α.
(9) The vehicle F further comprises, by way of example, a further sensor device S2 also arranged to detect the environment of the vehicle F. The sensor devices S1, S2 are signal-coupled to the sensor data fusion device V. Sensor object data provided by the sensor devices S1, S2 may be fused by the device V according to any high-level object fusion method and stored in a fusion object data set. By way of example, reference is made in this context to the remarks of N. Kämpchen in “Feature-level fusion of laser scanner and video data”, Ulm: Ulm University, 2007; and F. Seeliger in “Fahrzeugübergreifende Informationsfusion,” Ulm: Schriftenreihe des Instituts für Mess-, Regel- und Mikrotechnik der Universität Ulm, 2017.
(10) In order to carry out the association between sensor and fusion objects and, if necessary, to fuse the associated objects, a common reference point may first have to be found in order to make the properties of the objects comparable. Changing the reference point of an object requires knowledge of its extent (length and width) as well as its orientation. However, this requirement cannot always be met in practice, for example due to the physical measurement principle of the sensor device or due to incomplete object detection, with the result that reliable sensor data fusion may be compromised.
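The following Python sketch illustrates this kind of reference point transformation, assuming a rectangular object model with eight reference points (the four corners and the four side midpoints); the point names and the coordinate convention are illustrative and not taken from the patent.

```python
import math

# (longitudinal factor, lateral factor) of each reference point relative to
# the rectangle centre, in units of half-length / half-width
REFERENCE_POINTS = {
    "front_center": (+1.0, 0.0),
    "rear_center":  (-1.0, 0.0),
    "left_center":  (0.0, +1.0),
    "right_center": (0.0, -1.0),
    "front_left":   (+1.0, +1.0),
    "front_right":  (+1.0, -1.0),
    "rear_left":    (-1.0, +1.0),
    "rear_right":   (-1.0, -1.0),
}

def transform_reference_point(x, y, length, width, alpha, src, dst):
    """Re-reference an object position (x, y) from reference point `src` to `dst`.

    Requires the length, the width and the orientation `alpha` (radians) to be
    known, which is exactly the precondition discussed above."""
    sx, sy = REFERENCE_POINTS[src]
    dx, dy = REFERENCE_POINTS[dst]
    # offset from src to dst in the object's own coordinate frame
    du = (dx - sx) * length / 2.0
    dv = (dy - sy) * width / 2.0
    # rotate the offset into the vehicle/world frame and apply it
    return (x + du * math.cos(alpha) - dv * math.sin(alpha),
            y + du * math.sin(alpha) + dv * math.cos(alpha))
```

Note, for instance, that moving between two points on the same rear edge (e.g. `rear_left` to `rear_center`) uses only the width, which motivates the restriction of candidates described further below.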
(11) In order to avoid negative effects of wrong reference point information on the fusion result, e.g., wrong positioning of objects, an extended treatment of the reference points of objects appropriate to the respective sensor and object is proposed.
(12) In this context, the device V is associated in particular with a data and program memory in which a computer program is stored, which is explained in more detail below with reference to the flow chart.
(13) In a first step P10, a fusion object data set is provided comprising fusion object data representative of the fusion object O.sub.F. For example, the fusion object data set is stored in a data memory of the device V and was determined from the sensor object data of the sensor devices S1, S2 in a previous fusion process.
(14) The program continues in a step P20 of providing sensor object data representative of the sensor object O.sub.S.
(15) In a subsequent step P30, indicator data representative of an uncertainty in the determination of the sensor object data is provided. The indicator data may comprise, for example, the above-mentioned indicator characteristic values describing the uncertainty in the respective determination of the length l, the width b and the orientation of the sensor object O.sub.S, for example by variances. Furthermore, the indicator data may be representative of whether the sensor device S1 is technically capable of detecting the lateral extent and orientation of the sensor object O.sub.S. For example, a sensor type is stored in the device V for this purpose or is determined in a step P26 preceding the step P30. Further, the indicator data may be representative of the visual range of the sensor device S1. For example, the visual range may also be stored in the device V or determined in a step P28 preceding the step P30.
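Purely as an illustration, the quantities gathered in steps P10 to P30 could be held in containers such as the following; the field names are assumptions and not taken from the patent.

```python
from dataclasses import dataclass

@dataclass
class ObjectState:
    x: float              # position of the reference point [m]
    y: float
    length: float         # extent along the longitudinal axis [m]
    width: float          # extent along the transverse axis [m]
    alpha: float          # orientation relative to the vehicle [rad]
    reference_point: str  # e.g. "rear_left", see the sketch above

@dataclass
class IndicatorData:
    var_length: float     # uncertainty (e.g. variance) of the length
    var_width: float      # uncertainty of the width
    var_alpha: float      # uncertainty of the orientation
    can_measure_extent_and_orientation: bool  # sensor capability (step P26)
    visual_range: float   # visual range of the sensor device (step P28)
```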
(16) The method is then continued in a step P40, in which reference point transformation candidates of the sensor object O.sub.S are determined depending on the indicator data. For this purpose, in a step P46, the indicator characteristic values are each compared with a predefined threshold value. If the respective uncertainty exceeds the respective threshold value, the quantity is considered unknown and the possibility of reference point transformation is restricted accordingly. Thus, only reference points of the sensor object O.sub.S along its back or front side are determined as reference point transformation candidates if the length l is unknown but the width b and orientation are known.
(17) If the width b is unknown, but length l and orientation are known, only reference points of the sensor object O.sub.S along its longitudinal axis are determined as reference point transformation candidates. If both the length l and the width b and/or the orientation of the sensor object O.sub.S are unknown, the sensor object O.sub.S is considered as a point target to which no reference point transformation candidates are assigned.
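A minimal sketch of the candidate selection described in paragraphs (16) and (17) is given below. The threshold values are purely illustrative, and the grouping of the reference points is an assumption chosen so that each candidate can be reached without using the unknown quantity; the point names follow the sketch after paragraph (10).

```python
VAR_LENGTH_THRESH = 1.0   # illustrative variance thresholds
VAR_WIDTH_THRESH = 0.5
VAR_ALPHA_THRESH = 0.1

# points on the front/rear edge: reachable from one another using only the width
TRANSVERSE_AXIS_POINTS = ("front_left", "front_center", "front_right",
                          "rear_left", "rear_center", "rear_right")
# points on the left/right edge: reachable from one another using only the length
LONGITUDINAL_AXIS_POINTS = ("front_left", "left_center", "rear_left",
                            "front_right", "right_center", "rear_right")
ALL_POINTS = tuple(sorted(set(TRANSVERSE_AXIS_POINTS + LONGITUDINAL_AXIS_POINTS)))

def reference_point_candidates(var_length, var_width, var_alpha):
    """Return the reference point transformation candidates of a sensor object.

    An empty tuple means the sensor object is treated as a point target."""
    length_known = var_length <= VAR_LENGTH_THRESH
    width_known = var_width <= VAR_WIDTH_THRESH
    alpha_known = var_alpha <= VAR_ALPHA_THRESH

    if not alpha_known or (not length_known and not width_known):
        return ()                        # P46: point target, no candidates
    if not length_known:                 # width and orientation known
        return TRANSVERSE_AXIS_POINTS
    if not width_known:                  # length and orientation known
        return LONGITUDINAL_AXIS_POINTS
    return ALL_POINTS                    # all quantities known
```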
(18) In an optional step P42 before or after step P46, it is checked, depending on the indicator data, whether the sensor device S1 is technically capable of detecting the lateral extent and orientation of the sensor object O.sub.S. For example, if the sensor device S1 is an ultrasonic or radar sensor that does not meet this requirement, the sensor object O.sub.S is considered a point target.
(19) In an optional step P44 before or after the step P46, depending on the indicator data, it is checked whether the sensor object O.sub.S is outside the field of view of the sensor device S1 or at the edge thereof. In this case, the sensor object O.sub.S is also considered as a point target.
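The optional checks P42 and P44 could be sketched as follows; the angular field-of-view model, the sensor pose at the origin and the edge margins are assumptions, since the text above only states that objects outside the visual range or at its edge are treated as point targets.

```python
import math

def outside_or_at_fov_edge(obj_x, obj_y, fov_half_angle, visual_range,
                           edge_margin_angle=0.05, edge_margin_range=2.0):
    """True if the object lies outside the sensor's field of view or close to its edge.

    The sensor is assumed to sit at the origin and look along the +x axis."""
    bearing = math.atan2(obj_y, obj_x)
    distance = math.hypot(obj_x, obj_y)
    inside_core = (abs(bearing) <= fov_half_angle - edge_margin_angle and
                   distance <= visual_range - edge_margin_range)
    return not inside_core

def sensor_object_is_point_target(can_measure_extent_and_orientation,
                                  obj_x, obj_y, fov_half_angle, visual_range):
    """Combine checks P42 and P44: either condition makes the object a point target."""
    if not can_measure_extent_and_orientation:   # P42: e.g. ultrasonic or radar sensor
        return True
    return outside_or_at_fov_edge(obj_x, obj_y, fov_half_angle, visual_range)  # P44
```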
(20) In a step P50 following step P46, an innovated fusion object is determined depending on the reference point transformation candidates. For this purpose, it is first checked whether any reference point transformation candidates are assigned to the sensor object O.sub.S at all or whether it is a point target. Furthermore, it is checked whether the fusion object O.sub.F is a point target, for example in a step P48 preceding the step P50. If neither object is a point target, an association with the fusion object O.sub.F is performed, in which the reference point transformation is restricted to the reference point transformation candidates, and the transformed sensor object O.sub.S and/or a transformation state is subsequently fused with the fusion object O.sub.F, i.e., the innovated fusion object is determined. The sensor reference point is adopted as the fusion reference point of the innovated fusion object.
(21) In the case where the sensor object O.sub.S or the fusion object O.sub.F is a point target, a pair of the locally closest possible reference points of the sensor object O.sub.S and the fusion object O.sub.F is determined. Based on the properties of the sensor and fusion objects in these reference points, a decision is made about an association and fusion is performed if necessary. This does not necessarily result in a common reference point for both objects.
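A sketch of the pairing of the locally closest reference points mentioned above is given below; representing each object's reference points as a mapping from point name to world coordinates is an assumption (a point target simply contributes a single entry).

```python
import math

def closest_reference_point_pair(sensor_points, fusion_points):
    """Return (sensor_point_name, fusion_point_name, distance) with minimal distance.

    Each argument maps a reference point name to its (x, y) world coordinates."""
    best = None
    for s_name, (sx, sy) in sensor_points.items():
        for f_name, (fx, fy) in fusion_points.items():
            d = math.hypot(sx - fx, sy - fy)
            if best is None or d < best[2]:
                best = (s_name, f_name, d)
    return best
```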
(22) The fusion reference point of the innovated fusion object is the sensor reference point if the fusion object O.sub.F is a point target, or the previous fusion reference point of the fusion object O.sub.F if the sensor object O.sub.S is a point target.
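The choice of the fusion reference point of the innovated fusion object, as described in paragraphs (20) and (22) and in claim 8, can be condensed into the following sketch; the handling of the case where both objects are point targets is not spelled out above and is an assumption here.

```python
def innovated_fusion_reference_point(sensor_ref, fusion_ref,
                                     sensor_is_point_target,
                                     fusion_is_point_target):
    """Select the fusion reference point of the innovated fusion object."""
    if sensor_is_point_target:
        # only the sensor object (or, by assumption, both objects) is a point
        # target: keep the previous fusion reference point
        return fusion_ref
    # neither object is a point target, or only the fusion object is:
    # adopt the sensor reference point
    return sensor_ref
```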
(23) Finally, in a subsequent step P60, the innovated fusion object is assigned an updated visual range characteristic value representative of the visual range of the sensor device S1 as well as of all sensor devices contributing to the fusion so far.
(24) The program is then terminated or, if necessary, continued after a predetermined interruption in step P10 with an updated object data set.
(25) In step P48, the visual range characteristic value associated with the fusion object O.sub.F is used to check whether the fusion object O.sub.F is a point target. For this purpose, it is checked whether the fusion object O.sub.F is located outside the field of view of all sensor devices S1, S2 contributing to the fusion or is located at the edge thereof. If this is the case, the fusion object is evaluated as a point target.
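The check of step P48 could be sketched as follows; the predicate `inside_fov` stands for any test of the predetermined range within a sensor's visual range (for example the one sketched after paragraph (19)) and is an assumption.

```python
def fusion_object_is_point_target(fusion_obj_xy, contributing_sensors, inside_fov):
    """Evaluate the fusion object as a point target if it lies outside the
    predetermined range within the visual range of every contributing sensor.

    `contributing_sensors` corresponds to the visual range characteristic value
    kept with the fusion object; `inside_fov(sensor, xy)` is a caller-supplied
    predicate."""
    return not any(inside_fov(sensor, fusion_obj_xy)
                   for sensor in contributing_sensors)
```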