METHOD FOR DETECTING WHETHER AN EGO VEHICLE CHANGES FROM A CURRENTLY TRAVELED TRAFFIC LANE OF A ROADWAY TO AN ADJACENT TRAFFIC LANE OR WHETHER IT STAYS IN THE CURRENTLY TRAVELED TRAFFIC LANE

20230351887 · 2023-11-02

    Abstract

    A method for detecting whether an ego vehicle will leave a currently traveled traffic lane of a roadway to the left or right or whether it will stay in the currently traveled traffic lane. In the method, an image of a measuring space, which includes the vehicle area in front of the ego vehicle, is generated using an image sensor; an expected trajectory of the ego vehicle is projected into the image; at least one traffic lane boundary laterally adjacent to the trajectory is detected; and a decision is made whether the traffic lane will be changed or maintained by comparing the trajectory to the at least one detected traffic lane boundary.

    Claims

    1. A method for detecting whether an ego vehicle will leave a currently traveled traffic lane of a roadway to the left or right or whether it will stay in the currently traveled traffic lane, the method comprising the following steps: generating, using an image sensor, an image of a measuring space, which includes a vehicle area in front of the ego vehicle; projecting an expected trajectory of the ego vehicle into the image; detecting at least one traffic lane boundary laterally adjacent to the trajectory; and making a decision whether the traffic lane will be changed or maintained by comparing the trajectory to the at least one detected traffic lane boundary.

    2. The method as recited in claim 1, wherein an image sequence of multiple temporally consecutive images is examined.

    3. The method as recited in claim 1, wherein the trajectory is determined from a proper movement of the ego vehicle.

    4. The method as recited in claim 1, wherein the trajectory includes a left and a right vehicle boundary of the ego vehicle.

    5. The method as recited in claim 1, wherein the comparison includes a determination of a distance of the trajectory, including a left and/or a right vehicle boundary, from at least one detected traffic lane boundary.

    6. The method as recited in claim 5, wherein: a first and/or second and/or third and/or fourth distance is used in the distance determination of the distance of the trajectory from the at least one traffic lane boundary, wherein: the first distance is measured from the left vehicle boundary to a next traffic lane boundary situated to the left of the left vehicle boundary, the second distance is measured from the right vehicle boundary to a next traffic lane boundary situated to the right of the right vehicle boundary, the third distance is measured from the left vehicle boundary to the next traffic lane boundary situated to the right of the left vehicle boundary, the fourth distance is measured from the right vehicle boundary to the next traffic lane boundary situated to the left of the right vehicle boundary.

    7. The method as recited in claim 5, wherein a decision that the ego vehicle will leave the currently traveled traffic lane is made when at least one distance changes over time.

    8. The method as recited in claim 5, wherein a decision that the ego vehicle will stay in the current traffic lane is made when at least one distance remains constant over time.

    9. The method as recited in claim 6, wherein the determination of the distances takes place in a predetermined image line of the recorded image.

    10. The method as recited in claim 1, wherein the determination of the trajectory is additionally implemented using map data from a navigation system available in the ego vehicle.

    11. The method as recited in claim 1, wherein the detection of the at least one traffic lane boundary takes place using an image analysis using a neural network.

    12. A control device for an ego vehicle, configured to detect whether an ego vehicle will leave a currently traveled traffic lane of a roadway to the left or right or whether it will stay in the currently traveled traffic lane, the control device configured to: generate, using an image sensor, an image of a measuring space, which includes a vehicle area in front of the ego vehicle; project an expected trajectory of the ego vehicle into the image; detect at least one traffic lane boundary laterally adjacent to the trajectory; and make a decision whether the traffic lane will be changed or maintained by comparing the trajectory to the at least one detected traffic lane boundary.

    13. An ego vehicle, comprising: an image sensor; and a control device configured to detect whether an ego vehicle will leave a currently traveled traffic lane of a roadway to the left or right or whether it will stay in the currently traveled traffic lane, the control device configured to: generate, using the image sensor, an image of a measuring space, which includes a vehicle area in front of the ego vehicle; project an expected trajectory of the ego vehicle into the image; detect at least one traffic lane boundary laterally adjacent to the trajectory; and make a decision whether the traffic lane will be changed or maintained by comparing the trajectory to the at least one detected traffic lane boundary; wherein the image sensor is connected to the control device in a data-transmitting manner, for the acquisition and transmission of image data pertaining to the vehicle area in front of the ego vehicle to the control device.

    Description

    BRIEF DESCRIPTION OF THE DRAWINGS

    [0025] FIGS. 1 and 2 show images regarding staying in the currently traveled traffic lane (FIG. 1) or changing the traffic lane (FIG. 2).

    DETAILED DESCRIPTION OF EXAMPLE EMBODIMENTS

    [0026] FIG. 1 shows an image 10 of area 3 in front of the vehicle, generated by an image or video sensor (not shown) of an ego vehicle according to the present invention (not shown). Image 10 thus represents the native measuring space 9 of the image sensor. Shown in image 10, i.e., in area 3 in front of the vehicle, is a road featuring traffic lane 1, in which the ego vehicle is currently traveling. This traffic lane 1 is delimited on the left side by a left traffic lane boundary 2a and on the right side by a right traffic lane boundary 2b, each in the form of a marking.

    [0027] In addition, FIG. 1 shows expected trajectory 5 of the ego vehicle in the form of the positions of the left and right vehicle boundaries 4l, 4r of the ego vehicle. Trajectory 5 may be determined from the proper movement of the ego vehicle, and traffic lane boundaries 2a, 2b are able to be detected with the aid of a neural network. As output data, the neural network usually generates, for example, point lists, polygon chains, polynomials, or splines, which describe the course of the two traffic lane boundaries 2a, 2b. By comparing trajectory 5, which includes the left and right vehicle boundaries 4l, 4r, to the two traffic lane boundaries 2a, 2b, a decision is made as to whether traffic lane 1 will be left or maintained. Optionally, the determination of trajectory 5 may additionally draw on map data from a navigation system available in the ego vehicle. The use of a radar sensor, which acquires and monitors the area in front of the vehicle, is also possible.
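    A detector that outputs a traffic lane boundary as a point list or polygon chain can be queried for the boundary's lateral (column) position in a given image row by interpolating between adjacent points. The following is a minimal, illustrative sketch of such a lookup; the function name and the (row, column) point representation are assumptions for illustration, not part of the claimed method.

    ```python
    from bisect import bisect_left

    def boundary_x_at_row(polyline, row):
        """Interpolate the horizontal (column) position of a lane-boundary
        polyline at a given image row. `polyline` is a list of (row, col)
        points sorted by row, as a detector might output."""
        rows = [p[0] for p in polyline]
        if not (rows[0] <= row <= rows[-1]):
            return None  # boundary not visible in this image row
        i = bisect_left(rows, row)
        if rows[i] == row:
            return polyline[i][1]
        # linear interpolation between the two enclosing points
        (r0, c0), (r1, c1) = polyline[i - 1], polyline[i]
        t = (row - r0) / (r1 - r0)
        return c0 + t * (c1 - c0)
    ```

    Evaluating all detected boundaries in one predetermined image row in this way yields the set of boundary positions against which the projected trajectory can be compared.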

    [0028] According to FIG. 1, four distances d1, d2, d3, d4 are determined between trajectory 5 and detected traffic lane boundaries 2a, 2b. First distance d1 is measured from left vehicle boundary 4l to the next traffic lane boundary situated to the left of left vehicle boundary 4l; in the scenario of FIG. 1, this is left traffic lane boundary 2a of currently traveled traffic lane 1. Second distance d2 is measured from right vehicle boundary 4r to the next traffic lane boundary situated to the right of right vehicle boundary 4r; in the scenario of FIG. 1, this is right traffic lane boundary 2b of currently traveled traffic lane 1. Third distance d3 is measured from left vehicle boundary 4l to the next traffic lane boundary situated to the right of left vehicle boundary 4l; in the scenario of FIG. 1, this is right traffic lane boundary 2b of currently traveled traffic lane 1. Fourth distance d4 is measured from right vehicle boundary 4r to the next traffic lane boundary situated to the left of right vehicle boundary 4r; in the scenario of FIG. 1, this is left traffic lane boundary 2a of currently traveled traffic lane 1.
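    The four distances above can be computed in a single image row once the projected vehicle boundaries and the detected lane-boundary positions in that row are known. A minimal sketch, assuming all positions are given as image columns in the same row (the function name is hypothetical):

    ```python
    def four_distances(left_x, right_x, boundaries_x):
        """Compute d1..d4 as described above, in one image row.
        left_x / right_x: projected columns of the left and right vehicle
        boundary; boundaries_x: detected lane-boundary columns in the same
        row. A distance is None when no corresponding boundary exists
        (e.g. no marking further to the left)."""
        left_of_left   = [b for b in boundaries_x if b < left_x]
        right_of_right = [b for b in boundaries_x if b > right_x]
        right_of_left  = [b for b in boundaries_x if b > left_x]
        left_of_right  = [b for b in boundaries_x if b < right_x]
        d1 = left_x - max(left_of_left) if left_of_left else None
        d2 = min(right_of_right) - right_x if right_of_right else None
        d3 = min(right_of_left) - left_x if right_of_left else None
        d4 = right_x - max(left_of_right) if left_of_right else None
        return d1, d2, d3, d4
    ```

    In the FIG. 1 scenario, with one marking to the left and one to the right of the vehicle, d1 and d2 measure to the lane's own markings, while d3 and d4 span across the lane to the opposite marking.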

    [0029] By evaluating multiple temporally consecutive images 10, it is now possible to ascertain whether and, if so, in which way the individual distances d1, d2, d3, d4 change over time. In the example of FIG. 1, none of the four distances d1 to d4 changes, that is, dd1/dt = dd2/dt = dd3/dt = dd4/dt = 0.
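    The temporal criterion above can be sketched as a comparison of the distance tuples across consecutive images; constant distances indicate lane keeping, drifting distances indicate a lane change. The function name and the pixel tolerance `eps` (to absorb measurement noise) are illustrative assumptions, not part of the claimed method.

    ```python
    def lane_change_detected(distance_history, eps=2.0):
        """Decide whether a lane change is under way from a sequence of
        (d1, d2, d3, d4) tuples taken from temporally consecutive images.
        Returns True if any distance drifts by more than `eps` between
        the first and last image, False if all distances stay constant."""
        first, last = distance_history[0], distance_history[-1]
        return any(
            a is not None and b is not None and abs(b - a) > eps
            for a, b in zip(first, last)
        )
    ```

    A production implementation would likely fit a trend over the whole history rather than compare only the endpoints, but the endpoint comparison suffices to illustrate the decision rule.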

    [0030] It is therefore determined as the result of the method according to the present invention that the ego vehicle will stay in current traffic lane 1.

    [0031] FIG. 2 illustrates a scenario in which the ego vehicle is just leaving the currently traveled traffic lane 1 and changing to adjacent left traffic lane 1′. This adjacent left-side traffic lane 1′ is delimited on the right by left traffic lane boundary 2a of currently traveled traffic lane 1, and on the left by a further traffic lane boundary 2c. For the determination of distances d1 to d4 of the two vehicle boundaries 4l, 4r, the statements made above in connection with FIG. 1 apply.

    [0032] Here, too, an evaluation of multiple temporally consecutive images 10 therefore makes it possible to ascertain whether and, if so, in which way the individual distances d1, d2, d3, d4 change over time. In the example of FIG. 2, all four distances d1 to d4 change over time. It is therefore determined as the result of the method according to the present invention that the ego vehicle changes from currently traveled traffic lane 1 to adjacent traffic lane 1′.