METHOD FOR ASCERTAINING A SPATIAL ORIENTATION OF A TRAILER

20220258800 · 2022-08-18

    Abstract

    A method for ascertaining a spatial orientation of a trailer of an autonomously driving towing vehicle with trailer. The method includes the steps of reading in image data of at least one rear-facing camera, assigning image points of the image data to the trailer or to the vehicle surrounding environment, ascertaining a rear trailer edge or at least one point of the rear trailer contour from the image points assigned to the trailer, and determining the trailer angle as a function of image coordinates of the trailer edge or of the point, the dimensions of the trailer, and the position of the camera relative to the support point of the trailer.

    Claims

    1-10. (canceled)

    11. A method for ascertaining a spatial orientation of a trailer of an autonomously driving towing vehicle with trailer, comprising the following steps: reading in image data of at least one rear-facing camera; assigning image points of the image data to the trailer or to the vehicle surrounding environment; ascertaining a rear trailer edge or at least one point of a rear trailer contour, from the image points assigned to the trailer; and determining a trailer angle as a function of image coordinates of the trailer edge or of the point, dimensions of the trailer, and a position of the camera relative to a support point of the trailer.

    12. The method as recited in claim 11, wherein the assignment of the image points is carried out using a machine learning model that was trained on the assignment of the image points to the trailer.

    13. The method as recited in claim 11, wherein the ascertaining of the rear trailer edge or of the point is carried out using a machine learning model.

    14. The method as recited in claim 11, wherein an image processing algorithm is used to ascertain the rear trailer edge or the point.

    15. The method as recited in claim 11, wherein the rear trailer edge or the point, and the distance thereto, are ascertained using a disparity determined by a stereo camera.

    16. The method as recited in claim 11, wherein the trailer angle is calculated based on geometric and trigonometric relations between the image coordinates, the position of the camera relative to the support point of the trailer, and the dimensions of the trailer.

    17. The method as recited in claim 11, wherein the trailer angle is determined based on a lower point of the trailer edge or an outermost point of the rear trailer contour.

    18. The method as recited in claim 11, wherein the longitudinal and lateral position of the trailer edge or of the point, relative to the support point of the trailer or to the camera, is determined based on the trailer angle.

    19. The method as recited in claim 11, wherein the dimensions of the trailer are ascertained via sensors of the towing vehicle with trailer.

    20. A device, comprising: at least two rear-facing cameras configured to acquire image data; and a processing unit configured to ascertain a spatial orientation of a trailer of an autonomously driving towing vehicle with trailer, the processing unit configured to ascertain a rear trailer edge or at least one point of a rear trailer contour, and to ascertain a trailer angle as a function of image coordinates of the trailer edge or of the point, dimensions of the trailer, and a position of the camera relative to a support point of the trailer.

    Description

    BRIEF DESCRIPTION OF THE DRAWINGS

    [0023] FIG. 1 shows an exemplary embodiment of a method according to the present invention for ascertaining the spatial orientation of a trailer.

    [0024] FIG. 2 shows an exemplary embodiment for ascertaining the trailer angle, in accordance with the present invention.

    DETAILED DESCRIPTION OF EXAMPLE EMBODIMENTS

    [0025] FIG. 1 shows an exemplary embodiment of a method according to the present invention for ascertaining the spatial orientation of a trailer 10 (see also FIG. 2). Trailer 10 is in particular a trailer 10 of an autonomously driving towing vehicle with trailer 14. For the autonomous driving functions, this vehicle-trailer unit is as a rule equipped with a multiplicity of sensors such as, inter alia, cameras 18. These are also used in this method for ascertaining the spatial orientation of trailer 10, so that no additional cameras 18 are required.

    [0026] In a first step A, the image data recorded by a rear-facing camera 18 (see FIG. 2) are read into a processing unit 22. These image data here contain at least one rear part of trailer 10. In the processing unit, in a next step B the image points are assigned to trailer 10 or to the vehicle surrounding environment. This assignment can be carried out for example via a machine learning model that, on the basis of training data, has learned to distinguish the trailer from the vehicle surrounding environment. In this way, trailer 10 is recognized in the image data.
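    Step B can be illustrated with a minimal sketch in which the output of an already trained segmentation model is assumed to be available as a per-pixel trailer probability map; the map, the threshold, and the function name are illustrative and not from the present description:

```python
def assign_image_points(prob_map, threshold=0.5):
    """Step B as a minimal sketch: given a per-pixel trailer probability
    map (assumed output of a trained segmentation model), label each
    image point as trailer (True) or surrounding environment (False)."""
    return [[p >= threshold for p in row] for row in prob_map]

# Toy 3x3 probability map standing in for real model output.
prob_map = [
    [0.1, 0.2, 0.1],
    [0.3, 0.9, 0.8],
    [0.2, 0.7, 0.6],
]
mask = assign_image_points(prob_map)
print(mask[1][1])  # -> True
```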

    [0027] In a next step C, on the basis of trailer 10 recognized in this way, a rear trailer edge 26 is ascertained. For this purpose, the image points of trailer 10 are input into a machine learning model that, based on its training, recognizes rear trailer edge 26. It is also possible to use only a single point of rear trailer edge 26 for the rest of the method. In particular, a point 26 is used that is situated in a lower region of trailer edge 26. Such a point 26 has the advantage that a rolling movement of trailer 10, caused by cornering, has only a small influence on trailer angle α.
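    Step C can be sketched as follows, assuming the assignment from step B is available as a boolean mask; choosing the lowest trailer pixel reflects the preference for a point in a lower region of trailer edge 26 (the mask and function name are illustrative):

```python
def lowest_trailer_point(mask):
    """Return (row, col) of the lowest trailer pixel (largest row index)
    in a boolean segmentation mask, or None if no pixel was assigned to
    the trailer. A point low on the trailer edge is preferred because
    roll motion in curves then has little influence on the angle."""
    best = None
    for row_idx, row in enumerate(mask):
        for col_idx, is_trailer in enumerate(row):
            if is_trailer:
                best = (row_idx, col_idx)  # rows scanned top-down, so the last hit is lowest
    return best

# Toy 5x5 mask: True marks pixels assigned to the trailer in step B.
mask = [
    [False, False, False, False, False],
    [False, True,  True,  False, False],
    [False, True,  True,  False, False],
    [False, False, True,  False, False],
    [False, False, False, False, False],
]
print(lowest_trailer_point(mask))  # -> (3, 2)
```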

    [0028] In a further step D, trailer angle α is determined. For this step, if the dimensions of trailer 10 are not already stored, they have to be determined beforehand. This determination only has to be carried out once, at the beginning of the trip, and can be performed for example using a radar sensor installed in the towing vehicle with trailer 14.

    [0029] FIG. 2 shows an exemplary embodiment for ascertaining the trailer angle α. This Figure shows a plan view of a towing vehicle with trailer 14 in which trailer angle α is 0° and in which trailer 10 is turned (broken lines). In this exemplary embodiment, camera 18 is situated on a left external mirror of towing vehicle 30. In addition, a left rear trailer edge or point 26 is shown. A radius r between this point 26 and a point of rotation, here formed by a support point 34, results from the trailer width and a trailer length extending from support point 34. From these values there results an angle α_0 between diagonal r and a vehicle longitudinal axis v.
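    The relation between r, α_0, and the trailer dimensions can be sketched as follows; the concrete trailer dimensions are example values, since the description gives none:

```python
import math

def diagonal_and_offset(trailer_width, trailer_length):
    """Radius r from support point 34 to the left rear corner point 26,
    and the fixed angle alpha_0 between that diagonal and the vehicle
    longitudinal axis v. trailer_length is measured from the support
    point; variable names and values are illustrative."""
    half_w = trailer_width / 2.0
    r = math.hypot(trailer_length, half_w)       # diagonal to the corner
    alpha_0 = math.atan2(half_w, trailer_length)  # offset from axis v
    return r, alpha_0

# Example: a 2.5 m wide trailer extending 7.0 m behind the support point.
r, alpha_0 = diagonal_and_offset(2.5, 7.0)
```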

    [0030] The lines SF show a field of view of camera 18 in an optimally oriented position. In contrast, lines SF′ show a field of view of a camera 18 that is displaced from the optimal position by a correction angle γ. This correction angle γ has to be taken into account in the calculation as a correction, because a center axis x of camera 18 is also displaced by this angle. Here, the point distance x_w represents the distance of point 26 from camera 18 along center axis x. This value is calculated according to x_w = r·cos(α + α_0) + c_x. Because the values α, α_0, and r are calculated starting from support point 34, for the calculation of the point distance x_w the camera distance c_x between support point 34 and camera 18 also has to be added. Likewise, the point distance y_w represents the distance of the point from camera 18 along transverse axis y. This value is calculated according to y_w = r·sin(α + α_0) − c_y. Here, the camera distance c_y, which corresponds to the lateral distance between support point 34 and camera 18, has to be subtracted.
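    The two relations for x_w and y_w can be written directly as code; all numeric values below are illustrative:

```python
import math

def point_position(r, alpha, alpha_0, c_x, c_y):
    """Point distances of corner point 26 from camera 18, following the
    relations in the description:
        x_w = r*cos(alpha + alpha_0) + c_x   (along center axis x)
        y_w = r*sin(alpha + alpha_0) - c_y   (along transverse axis y)
    c_x and c_y are the offsets of camera 18 from support point 34."""
    x_w = r * math.cos(alpha + alpha_0) + c_x
    y_w = r * math.sin(alpha + alpha_0) - c_y
    return x_w, y_w

# Example values: 2.5 m wide, 7.0 m long trailer; camera 1.0 m ahead of
# and 1.1 m beside the support point.
r = math.hypot(7.0, 1.25)
alpha_0 = math.atan2(1.25, 7.0)
x_w, y_w = point_position(r, 0.0, alpha_0, 1.0, 1.1)
print(x_w, y_w)  # with alpha = 0: x_w ≈ 8.0, y_w ≈ 0.15
```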

    [0031] An image of trailer 10 is recorded in camera 18 on an image plane 38, formed for example by an image sensor. For clearer illustration, image plane 38 is shown in front of camera 18. The value y_i describes the image distance between center axis x and point 26 of trailer edge 26 on image plane 38; this image distance y_i can be determined from the image data. The image distance y_i relates to focal length f in the same way as the point distance y_w relates to x_w. Taking the correction angle γ into account, the following results:

    [00001] y_i / f = (−sin(γ)·x_w + cos(γ)·y_w) / (cos(γ)·x_w + sin(γ)·y_w)

    [0032] If the equations for x_w and y_w are substituted into this equation, the following results:

    [00002] y_i / f = (−sin(γ)·(r·cos(α + α_0) + c_x) + cos(γ)·(r·sin(α + α_0) − c_y)) / (cos(γ)·(r·cos(α + α_0) + c_x) + sin(γ)·(r·sin(α + α_0) − c_y))

    [0033] If this equation is solved for trailer angle α, then trailer angle α can be ascertained on the basis of the image on camera 18, the dimensions of trailer 10, and the position of camera 18 relative to support point 34. By substituting the trailer angle α determined in this way into the equations for x_w and y_w, the longitudinal and lateral position of point 26 of trailer edge 26 can then also be determined.
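    Rather than solving the equation for α in closed form, a simple numerical sketch can invert it. The bisection approach and all parameter values below are assumptions for illustration; the description does not prescribe a solution method:

```python
import math

def yi_over_f(alpha, r, alpha_0, c_x, c_y, gamma):
    """Forward model: the ratio y_i / f from the description's equation."""
    x_w = r * math.cos(alpha + alpha_0) + c_x
    y_w = r * math.sin(alpha + alpha_0) - c_y
    return ((-math.sin(gamma) * x_w + math.cos(gamma) * y_w)
            / (math.cos(gamma) * x_w + math.sin(gamma) * y_w))

def solve_trailer_angle(ratio, r, alpha_0, c_x, c_y, gamma,
                        lo=-1.0, hi=1.0, tol=1e-9):
    """Invert yi_over_f for alpha by bisection; valid while y_i / f is
    monotonic in alpha, which holds for moderate articulation angles."""
    f_lo = yi_over_f(lo, r, alpha_0, c_x, c_y, gamma) - ratio
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        f_mid = yi_over_f(mid, r, alpha_0, c_x, c_y, gamma) - ratio
        if f_lo * f_mid <= 0:
            hi = mid            # sign change: root lies in [lo, mid]
        else:
            lo, f_lo = mid, f_mid
        if hi - lo < tol:
            break
    return 0.5 * (lo + hi)

# Round trip with illustrative geometry: generate a measurement for a
# known trailer angle of 0.3 rad, then recover the angle from it.
r, alpha_0, c_x, c_y, gamma = 7.1107, 0.1767, 1.0, 1.1, 0.05
measured = yi_over_f(0.3, r, alpha_0, c_x, c_y, gamma)
alpha = solve_trailer_angle(measured, r, alpha_0, c_x, c_y, gamma)
```

    With the recovered α, back-substitution into the x_w and y_w relations then yields the longitudinal and lateral position of the point, as stated above.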