METHOD USED FOR DERIVING A CONTROL VARIABLE FOR LATERAL GUIDANCE OF A MOTOR VEHICLE
20230227035 · 2023-07-20
Inventors
CPC classification
B60W2050/0031
PERFORMING OPERATIONS; TRANSPORTING
B60W2552/53
PERFORMING OPERATIONS; TRANSPORTING
G06V20/588
PHYSICS
International classification
B60W50/00
PERFORMING OPERATIONS; TRANSPORTING
Abstract
A method used for deriving a control variable for lateral guidance of a motor vehicle by means of images of at least one camera of the motor vehicle, wherein the at least one camera captures images of a roadway. A more precise and/or resource-saving derivation of the control variable, and thus lateral guidance, of the motor vehicle is achieved by projecting a traffic lane to be followed as well as a travel trajectory of the motor vehicle onto the images, and by a control variable for lateral guidance of the motor vehicle being derived by comparing the traffic lane projected onto the images with the travel trajectory projected onto the images. A motor vehicle in which the method is performed is also described.
Claims
1. A method for deriving a control variable for lateral guidance of a motor vehicle using images of at least one camera of the motor vehicle, wherein the at least one camera captures images of a roadway, the method comprising the following steps: projecting a traffic lane of the roadway to be followed onto the images; determining a travel trajectory of the motor vehicle in reference to the images based on a self-motion estimate and projecting the determined travel trajectory onto the images; and deriving a control variable for lateral guidance of the motor vehicle from a comparison between the traffic lane projected onto the images and the determined travel trajectory projected onto the images.
2. The method according to claim 1, wherein the travel trajectory is determined using a single-track model and a ground plane estimate, the single-track model including an Ackermann model.
3. The method according to claim 1, further comprising: determining, from the projected traffic lane and the projected travel trajectory, various time constants for a collision between the traffic lane and the travel trajectory, the control variable being derived as a projection in the images for at least a portion of the various time constants.
4. The method according to claim 1, wherein the control variable is derived by a line-by-line comparison between the projected traffic lane and the projected travel trajectory.
5. The method according to claim 1, further comprising: determining a trajectory vanishing point of the self-motion estimate, and projecting the trajectory vanishing point onto the images as the travel trajectory; determining at least one lane vanishing point of the traffic lane and projecting the at least one lane vanishing point of the traffic lane onto the images; and wherein the control variable is derived by comparing the trajectory vanishing point and the at least one lane vanishing point.
6. The method according to claim 5, wherein, for a curved traffic lane, for each of a plurality of points along the traffic lane, a respective associated lane vanishing point is determined, and the control variable is derived by comparing the respective associated lane vanishing points with the trajectory vanishing point.
7. The method according to claim 1, wherein the traffic lane to be followed is determined from the images using road boundary features.
8. The method according to claim 1, wherein the traffic lane to be followed is determined by determining a drivable surface and/or using a geographic map and/or using semantic segmentation and/or by a comparison with trajectories of other vehicles.
9. A non-transitory computer-readable storage medium on which is stored a computer program for deriving a control variable for lateral guidance of a motor vehicle using images of at least one camera of the motor vehicle, wherein the at least one camera captures images of a roadway, the computer program, when executed by a computer or control device, causing the computer or control device to perform the following steps: projecting a traffic lane of the roadway to be followed onto the images; determining a travel trajectory of the motor vehicle in reference to the images based on a self-motion estimate and projecting the determined travel trajectory onto the images; and deriving a control variable for lateral guidance of the motor vehicle from a comparison between the traffic lane projected onto the images and the determined travel trajectory projected onto the images.
10. A motor vehicle, comprising: at least one camera which, during operation, captures images of a roadway; and a control device communicatively connected to the at least one camera, wherein the control device is configured to derive a control variable for lateral guidance of the motor vehicle using images of the at least one camera of the motor vehicle, the control device configured to: project a traffic lane of the roadway to be followed onto the images; determine a travel trajectory of the motor vehicle in reference to the images based on a self-motion estimate and project the determined travel trajectory onto the images; and derive a control variable for lateral guidance of the motor vehicle from a comparison between the traffic lane projected onto the images and the determined travel trajectory projected onto the images.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0042]
[0043]
[0044]
DETAILED DESCRIPTION OF EXAMPLE EMBODIMENTS
[0045] For lateral guidance, e.g., for lane center guidance and/or for lane keeping in a motor vehicle 1, as shown by way of example in
[0046]
[0047] In this context, the projections and derivations explained below are performed within the images 3. In contrast to approaches that transform between the images 3 and a separate coordinate system, errors and accuracy losses caused by such transformations are therefore prevented or at least reduced. This leads to a more precise derivation of the control variable and hence to a more precise lateral guidance of the motor vehicle 1. A traffic lane 5 to be followed as well as a known extrinsic calibration of the at least one camera 2 are required for this purpose. In the exemplary embodiment shown in
[0048] According to
[0049] In the flow chart shown in
[0050] The travel trajectory 6 can be determined by means of an Ackermann model and a ground plane estimate. The travel trajectory 6 thus determined is then projected onto the images 3 during the travel trajectory step 21. The travel trajectory step 21 can in this case also include determining the travel trajectory 6.
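By way of illustration only (not part of the disclosure), the determination of the travel trajectory from an Ackermann-type single-track model and its projection onto the images under a flat ground-plane estimate can be sketched as follows. The function names, the wheelbase and camera-height parameters, and the intrinsic matrix K are assumptions made for this sketch; the camera is assumed to look along the driving direction.

```python
import numpy as np

def trajectory_points(speed, steering_angle, wheelbase, dt=0.1, steps=30):
    """Integrate a simple Ackermann (single-track) model to predict the
    travel trajectory on the ground plane (vehicle frame: x forward, y left)."""
    x, y, yaw = 0.0, 0.0, 0.0
    pts = []
    for _ in range(steps):
        x += speed * np.cos(yaw) * dt
        y += speed * np.sin(yaw) * dt
        yaw += speed / wheelbase * np.tan(steering_angle) * dt
        pts.append((x, y))
    return np.array(pts)

def project_to_image(pts_ground, K, cam_height):
    """Project ground-plane points into the image with a pinhole camera that
    looks along +x at a known height above the estimated flat ground plane."""
    # Camera frame: z forward (= vehicle x), x right (= -vehicle y), y down.
    X = np.stack([-pts_ground[:, 1],                     # right
                  np.full(len(pts_ground), cam_height),  # down to ground
                  pts_ground[:, 0]], axis=1)             # forward
    uvw = (K @ X.T).T
    return uvw[:, :2] / uvw[:, 2:3]                      # pixel (u, v) per point
```

With a steering angle of zero, the projected trajectory runs straight up the principal column of the image, with farther points appearing higher (smaller row index), as one would expect for a flat road.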
[0051] To derive the control variable (hence during the derivation step 22), various time constants regarding a collision between the traffic lane 5 and the travel trajectory 6 can be determined based on the projected traffic lane 5 and the projected travel trajectory 6. These time constants are also known to the person skilled in the art as “Time To Contact” (abbreviated as “TTC”) and are indicated as “TTC” in
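An illustrative sketch (not part of the disclosure) of a line-by-line comparison of the two projections, weighted by time to contact, might look as follows. The helper `row_to_range`, which maps an image row to a range ahead of the vehicle, is a hypothetical assumption for this sketch, as is the choice of weighting.

```python
import numpy as np

def lateral_control_from_rows(lane_u, traj_u, rows, speed, row_to_range):
    """Compare the projected lane centre and the projected travel trajectory
    row by row in the image, and derive a control variable as a weighted
    pixel offset. Rows that the vehicle reaches sooner (smaller time to
    contact, TTC) receive a larger weight."""
    lane_u, traj_u = np.asarray(lane_u), np.asarray(traj_u)
    offsets = lane_u - traj_u                       # pixel offset per image row
    ranges = np.array([row_to_range(r) for r in rows])
    ttc = ranges / max(speed, 1e-3)                 # time until each row is reached
    weights = 1.0 / (1.0 + ttc)                     # emphasise the near field
    return float(np.sum(weights * offsets) / np.sum(weights))
```

If the lane and the trajectory coincide in every row, the control variable is zero; a constant rightward offset of the lane yields a proportional positive control value.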
[0052] Alternatively or additionally, a vanishing-point-based approach can be used to derive the control variable. For this purpose, according to
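The vanishing-point comparison can be sketched as follows, purely for illustration and under assumptions not stated in the disclosure: the lane vanishing point is taken as the intersection of two lane-boundary lines in homogeneous image coordinates, and the control variable is taken as proportional to its horizontal distance from the trajectory vanishing point. The `gain` parameter is hypothetical.

```python
import numpy as np

def line_intersection(p1, p2, p3, p4):
    """Lane vanishing point as the intersection of two boundary lines, each
    given by two image points, computed via homogeneous coordinates."""
    l1 = np.cross([*p1, 1.0], [*p2, 1.0])   # line through p1, p2
    l2 = np.cross([*p3, 1.0], [*p4, 1.0])   # line through p3, p4
    vp = np.cross(l1, l2)                    # their intersection
    return vp[:2] / vp[2]                    # back to pixel coordinates

def vanishing_point_control(lane_vp_u, traj_vp_u, gain=0.01):
    """Control variable proportional to the horizontal gap between the lane
    vanishing point and the trajectory vanishing point in the image."""
    return gain * (lane_vp_u - traj_vp_u)
```

For a curved lane, the same intersection could be evaluated for local tangents at several points along the lane, matching the per-point lane vanishing points of claim 6.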
[0053] The traffic lane 5 can be determined from the images 3 from the camera 2. Doing so has the advantage that the required steps are performed in the images 3, which further reduces possible errors and/or inaccuracies. In other words, the traffic lane 5 to be followed is extracted, and thus determined, from the images 3, e.g., by means of the road boundary features 9. Alternatively or additionally, however, the traffic lane 5 to be followed can be determined by means of determining a drivable surface and/or by means of a geographic map and/or by means of semantic segmentation and/or by comparison with the trajectories of other vehicles.
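As a rough, purely illustrative sketch (not part of the disclosure) of deriving the lane to be followed from detected road-boundary features, the lane centre can be taken per image row as the midpoint between left and right boundary detections, and smoothed with a low-order polynomial so that rows with missing detections can be interpolated. The function names and the polynomial degree are assumptions for this sketch.

```python
import numpy as np

def lane_centre_from_boundaries(left_u, right_u):
    """Per image row, take the lane centre column as the midpoint between
    detected left and right road-boundary features (e.g. markings, curbs)."""
    return 0.5 * (np.asarray(left_u, dtype=float) + np.asarray(right_u, dtype=float))

def fit_lane_polynomial(rows, centre_u, degree=2):
    """Smooth the per-row centre estimates with a low-order polynomial in the
    image-row coordinate, so rows without detections can be filled in."""
    coeffs = np.polyfit(np.asarray(rows, dtype=float), centre_u, degree)
    return np.poly1d(coeffs)
```

For a straight, centred lane the fitted polynomial simply reproduces the constant centre column at any queried row.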
[0054] It is understood that the method steps 20, 21, 22 will be repeated continuously in order to achieve an appropriate and continuous derivation of the control variable.
[0055] The method is performed in an automated manner and, advantageously, by means of a computer program product, e.g., appropriately configured software and/or an algorithm.
[0056] In order to perform the method, the motor vehicle 1 comprises a control device 10 as indicated in
[0057] Although only one control variable for lateral guidance of the vehicle 1 has been addressed in the foregoing description of the figures, it is to be understood that two or more control variables can also be derived using the method. It is further understood that deriving the control variable also includes changes to an existing and/or provided control variable.
[0058] The at least one derived control variable is in this case provided to a driving assistance system of the motor vehicle 1 (not shown). The driving assistance system is preferably able to drive, in particular steer, the motor vehicle 1 at least partially autonomously.