METHOD FOR ASCERTAINING A SPATIAL ORIENTATION OF A TRAILER
20220258766 · 2022-08-18
Inventors
- Ganesh Nageswaran (Benningen Am Neckar, DE)
- Benjamin Haas (Stuttgart, DE)
- Daniel Eckstein (Stuttgart, DE)
- Daniel Kuhner (Marbach, DE)
- Jannik Steinkamp (Stuttgart, DE)
- Benjamin Classen (Weinsberg, DE)
- Eva Zimmermann (Reutlingen, DE)
- Robert Herzig (Stuttgart, DE)
CPC classification
- B60Y2200/148 (PERFORMING OPERATIONS; TRANSPORTING)
- B60W2300/14 (PERFORMING OPERATIONS; TRANSPORTING)
- B60W60/001 (PERFORMING OPERATIONS; TRANSPORTING)
- B60W60/0015 (PERFORMING OPERATIONS; TRANSPORTING)
- B60W2520/22 (PERFORMING OPERATIONS; TRANSPORTING)
- B60Y2200/147 (PERFORMING OPERATIONS; TRANSPORTING)
Abstract
A method for ascertaining a spatial orientation of a trailer of an autonomously driving towing vehicle with trailer. In the method, sensor data of a plurality of different sensor types are processed so that the trailer and objects in the environment are identified in the sensor data; the sensor data communicated by the various sensor types and processed in this way are combined with one another, and the movement path of the trailer relative to the objects in the environment is determined on the basis of the combined sensor data.
Claims
1-14. (canceled)
15. A method for ascertaining a spatial orientation of a trailer of an autonomously driving towing vehicle with trailer, the method comprising the following steps: processing respective sensor data of each of a plurality of different sensor types to identify the trailer and objects in an environment in each of the respective sensor data; combining the processed respective sensor data with one another; and determining a movement path of the trailer relative to the objects in the environment based on the combined sensor data.
16. The method as recited in claim 15, wherein the respective sensor data are ascertained with sensors used for the autonomous driving.
17. The method as recited in claim 15, wherein an estimation filter is used to determine the movement path of the trailer.
18. The method as recited in claim 15, wherein the trailer is identified in the sensor data based on dimensions of the trailer, a position of a support point of the trailer, and a towing vehicle steering angle.
19. The method as recited in claim 18, wherein the dimensions of the trailer are determined from the sensor data using a dimension estimation algorithm.
20. The method as recited in claim 15, wherein the identification of the trailer in the sensor data is carried out using a deep learning algorithm.
21. The method as recited in claim 15, wherein a factor is calculated indicating how reliably the trailer was identified in each of the respective sensor data.
22. The method as recited in claim 21, wherein the identification of the trailer in the respective sensor data is rejected if the factor is below a specified value.
23. The method as recited in claim 15, wherein a previously ascertained movement path of the trailer is used as a basis for the identification of the trailer in the respective sensor data.
24. The method as recited in claim 15, wherein the identification of the trailer is carried out during a detected state of the towing vehicle.
25. A device, comprising: at least two different sensor types; and a processing unit configured to ascertain a spatial orientation of a trailer of an autonomously driving towing vehicle with trailer, the processing unit configured to: process respective sensor data of each of the two different sensor types to identify the trailer and objects in an environment in each of the respective sensor data; combine the processed respective sensor data with one another; and determine a movement path of the trailer relative to the objects in the environment based on the combined sensor data.
26. A non-transitory machine-readable data carrier on which is stored a computer program for ascertaining a spatial orientation of a trailer of an autonomously driving towing vehicle with trailer, the computer program, when executed by one or more computers, causing the one or more computers to perform the following steps: processing respective sensor data of each of a plurality of different sensor types to identify the trailer and objects in an environment in each of the respective sensor data; combining the processed respective sensor data with one another; and determining a movement path of the trailer relative to the objects in the environment based on the combined sensor data.
27. A computer configured to ascertain a spatial orientation of a trailer of an autonomously driving towing vehicle with trailer, the computer configured to: process respective sensor data of each of a plurality of different sensor types to identify the trailer and objects in an environment in each of the respective sensor data; combine the processed respective sensor data with one another; and determine a movement path of the trailer relative to the objects in the environment based on the combined sensor data.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0023] Exemplary embodiments of the present invention are shown in the figures and are explained in more detail in the following description.
[0024]
[0025]
DETAILED DESCRIPTION OF EXAMPLE EMBODIMENTS
[0026]
[0027] On towing vehicle 14, there are additionally situated at least two radar sensors 30, likewise facing the rear, for detecting trailer 22. In this exemplary embodiment, at least one lidar sensor 34 is additionally oriented in the direction of trailer 22. All sensors 26, 30, 34 are connected to a processing unit 38 provided in towing vehicle 14, so that the sensor data of all of these sensors 26, 30, 34 can be communicated to processing unit 38 for further processing.
[0028]
[0029] In a next step B (B1, B2, B3), the sensor data are processed. In this step, trailer 22 and objects in the environment are identified in the respective sensor data. Objects in the environment may be, for example, other traffic participants or construction work in the roadway. Trailer 22 is identified here, for example, by a deep learning method that has been trained to recognize trailer 22 for the respective sensor type.
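The per-modality identification of step B can be sketched as follows. All names here (`Detection`, `process_sensor_data`, the detector callables) are illustrative assumptions, since the patent does not specify an implementation:

```python
from dataclasses import dataclass

@dataclass
class Detection:
    label: str        # "trailer" or an environment-object class
    position: tuple   # (x, y) in the towing-vehicle frame, in meters
    confidence: float # detector score in [0, 1]

def process_sensor_data(frames_by_sensor, detectors):
    """Step B: run the detector trained for each sensor type (camera,
    radar, lidar) on that sensor's raw frame and collect the detections."""
    detections = {}
    for sensor_type, frame in frames_by_sensor.items():
        detections[sensor_type] = detectors[sensor_type](frame)
    return detections
```

In practice each detector would be a deep learning model trained separately per modality, as the paragraph above describes.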
[0030] In a further step (C1, C2, C3), on the basis of the trailer 22 ascertained in this way, a factor F is calculated indicating how reliably trailer 22 was able to be identified in the sensor data. This factor F reflects how reliably trailer 22 has been recognized, based for example on the quality of the sensor data, the recognized shape of trailer 22, the weather conditions, and the current functional status of the respective sensor 26, 30, 34. If the resulting factor F is greater than or equal to a predefined value W, these sensor data are forwarded. If, on the other hand, factor F is smaller than value W, the corresponding sensor data are not forwarded, so as not to degrade the overall result.
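The gating on factor F can be sketched as a minimal filter. The function names and the shape of `factor_fn` are assumptions; a real factor would weigh data quality, recognized trailer shape, weather, and sensor status as described above:

```python
def gate_by_reliability(detections_by_sensor, factor_fn, threshold_w):
    """Steps C1-C3: compute a factor F per sensor and forward only the
    sensor data with F >= W, so low-confidence identifications cannot
    degrade the combined result."""
    forwarded = {}
    for sensor_type, dets in detections_by_sensor.items():
        f = factor_fn(sensor_type, dets)  # reliability factor F
        if f >= threshold_w:              # compare against value W
            forwarded[sensor_type] = dets
    return forwarded
```

Sensor data that fail the comparison are simply dropped rather than down-weighted, matching the reject-below-threshold behavior of the paragraph above.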
[0031] The forwarded sensor data are combined in a step D, so that, in a final step E, the movement path of trailer 22 relative to the recognized objects in the environment can be determined from the combined sensor data. To ease the identification of trailer 22 in step B, the movement path determined in step E can in turn be fed back into step B as an aid.
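Steps D and E might be sketched as below. The averaging fusion and the exponential smoother are simplifying assumptions standing in for the estimation filter of claim 17 (a Kalman filter would be a typical choice); the function names are hypothetical:

```python
def fuse_positions(positions):
    """Step D: combine the forwarded per-sensor trailer position
    estimates, here by simple averaging."""
    n = len(positions)
    return (sum(p[0] for p in positions) / n,
            sum(p[1] for p in positions) / n)

def update_path(path, fused, alpha=0.5):
    """Step E: extend the trailer's movement path with a smoothed
    estimate. An exponential smoother stands in for the estimation
    filter; the resulting path can be fed back to step B as a prior."""
    if not path:
        path.append(fused)
    else:
        px, py = path[-1]
        path.append((alpha * fused[0] + (1.0 - alpha) * px,
                     alpha * fused[1] + (1.0 - alpha) * py))
    return path
```

The growing `path` list is the movement path of the trailer relative to the environment; feeding its latest entry back into the identification step corresponds to the aid described in the paragraph above.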