METHOD AND SYSTEM FOR ESTIMATING ROAD LANE GEOMETRY
20220406077 · 2022-12-22
CPC classification
G06V20/588 (Physics)
Abstract
A method and system for estimating road lane geometry include a camera-estimated lane segment, which estimates lane geometry based on camera detection of road markings, and a leading-vehicle-estimated lane segment, which estimates lane geometry based on traces of at least one leading vehicle. The estimated road lane geometry is obtained from a combination of the camera-estimated lane segment and the leading-vehicle-estimated lane segment.
Claims
1. A method for estimating road lane geometry of a lane of a road on which a vehicle travels, the method comprising: detecting a camera-estimated lane segment of the lane of the road based on road markings of the lane of the road in images obtained from a camera; detecting a leading-vehicle-estimated lane segment of the lane of the road based on a trace of a leading vehicle traveling on the road in front of the vehicle; and determining the road lane geometry of the lane of the road based on the camera-estimated lane segment and the leading-vehicle-estimated lane segment.
2. The method according to claim 1, wherein detecting the camera-estimated lane segment and the leading-vehicle-estimated lane segment comprises detecting a left edge of the lane of the road and detecting a right edge of the lane of the road.
3. The method according to claim 1, wherein detecting the camera-estimated lane segment comprises performing image recognition of the road markings based on the images obtained by the camera, and wherein the road markings comprise at least one of lane markings, curb stones, guide posts and guardrails.
4. The method according to claim 1, wherein detecting the leading-vehicle-estimated lane segment comprises tracking the leading vehicle, storing temporal location information of the leading vehicle to obtain the trace of the leading vehicle, and determining a lane width of the lane of the road based on the trace.
5. The method according to claim 2, further comprising: smoothing data of the camera-estimated lane segment and the leading-vehicle-estimated lane segment by fitting a polynomial to a list of two-dimensional points; and sampling the data of the camera-estimated lane segment and the leading-vehicle-estimated lane segment fit to the polynomial as the list of two-dimensional points.
6. The method according to claim 1, further comprising extrapolating the camera-estimated lane segment or the leading-vehicle-estimated lane segment until the camera-estimated lane segment or the leading-vehicle-estimated lane segment overlaps with an adjacent lane segment.
7. The method according to claim 5, wherein the determining comprises stitching adjacent lane segments together if the overlap of the camera-estimated lane segment or the leading-vehicle-estimated lane segment with the adjacent lane segment in a direction perpendicular to a driving direction of the lane is greater than 50%.
8. The method according to claim 7, further comprising assigning a stitching quality measure to each of the adjacent lane segments based on the overlap of the camera-estimated lane segment or the leading-vehicle-estimated lane segment with the adjacent lane segment in the direction perpendicular to the driving direction of the lane.
9. The method according to claim 8, further comprising determining an extra road lane of the road if the overlap of the leading-vehicle-estimated lane segment with another lane segment in the direction perpendicular to the driving direction of the lane is less than 15%.
10. The method according to claim 9, further comprising: providing a road boundary estimation of a road boundary of the road based on radar detection of road boundaries of the road; and rejecting leading-vehicle-estimated lane segments that lie more than 25% outside the road boundary.
11. The method according to claim 10, further comprising broadcasting the road lane geometry to an advanced driver assistance system.
12. A system for estimating road lane geometry of a lane of a road on which a vehicle travels, the system comprising: at least one camera; and a computing unit configured to detect a camera-estimated lane segment of the lane of the road based on road markings of the lane of the road in images obtained from the at least one camera, detect a leading-vehicle-estimated lane segment of the lane of the road based on a trace of a leading vehicle traveling on the road in front of the vehicle, and determine the road lane geometry of the lane of the road based on the camera-estimated lane segment and the leading-vehicle-estimated lane segment.
13. (canceled)
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0024] The above and other aspects of the present application will be apparent from and elucidated further with reference to the embodiments described by way of examples in the following description and with reference to the accompanying drawings.
[0031] It should be noted that the figures are purely diagrammatic and not drawn to scale. In the figures, elements which correspond to elements already described may have the same reference numerals. Examples, embodiments or optional features, whether indicated as non-limiting or not, are not to be understood as limiting the present application as claimed.
DETAILED DESCRIPTION OF THE EMBODIMENTS
[0033] The camera 3 is configured to take images of the area ahead of the vehicle 1, in particular of road markings and of leading vehicles. Said images are transferred to the computing unit 4 and analyzed by the computing unit 4. Said analysis may be performed by artificial intelligence, e.g., a neural network or a decision tree. As a result of the analysis, road markings and leading vehicles are identified and their locations are determined, e.g., as two-dimensional points.
[0034] Based on the identified road markings, a camera lane estimation is generated, comprising at least one camera-estimated lane segment. Based on traces of leading vehicles, a vehicle lane estimation is generated, comprising at least one leading-vehicle-estimated lane segment. Finally, the at least one camera-estimated lane segment and the at least one leading-vehicle-estimated lane segment are stitched together by the computing unit 4. As a result, an estimated road lane geometry with a range greater than that of the camera lane estimation is obtained. Said estimated road lane geometry may be used by an advanced driver assistance system of the vehicle 1, which benefits greatly from the extended range.
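The smoothing and sampling of segment data recited in claim 5, a polynomial fit to a list of two-dimensional points followed by resampling, can be sketched as follows. This is a minimal illustration only, assuming each lane edge can be parameterized over the longitudinal coordinate; the function name, polynomial degree, and sample count are assumptions, not part of the specification.

```python
import numpy as np

def smooth_edge(points, degree=3, num_samples=50):
    """Fit a polynomial y = p(x) to an edge's two-dimensional points
    (least squares), then resample it at evenly spaced x positions.
    A sketch of the smoothing/sampling step; assumes the edge can be
    expressed as a function of x."""
    pts = np.asarray(points, dtype=float)          # (N, 2) points
    coeffs = np.polyfit(pts[:, 0], pts[:, 1], degree)
    xs = np.linspace(pts[:, 0].min(), pts[:, 0].max(), num_samples)
    ys = np.polyval(coeffs, xs)
    return np.stack([xs, ys], axis=1)              # smoothed (num_samples, 2)
```

A cubic fit as above keeps curved edges smooth while suppressing per-point detection noise; the resampled list of two-dimensional points is then used by the later stitching steps.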
[0037] Further, a leading vehicle 9.1 is shown along with a trace 10.1 of the leading vehicle 9.1. Said trace 10.1 has been obtained by tracking the leading vehicle 9.1 using the camera 3 and/or radar 5 and storing temporal location information of the leading vehicle 9.1. Also, the driving direction D′ of the leading vehicle 9.1 is shown.
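The storing of temporal location information described above can be sketched as a rolling per-track buffer. This is an illustrative structure only; the class name, the track-id keying, and the buffer length are assumptions, not taken from the specification.

```python
from collections import defaultdict, deque

class TraceBuffer:
    """Keep a rolling history of observed leading-vehicle positions,
    keyed by track id, from which a trace such as 10.1 can be read out.
    A sketch of the trace storage; names are illustrative."""
    def __init__(self, max_len=200):
        # One bounded deque per tracked leading vehicle.
        self._traces = defaultdict(lambda: deque(maxlen=max_len))

    def update(self, track_id, timestamp, position):
        """Append one (timestamp, position) observation for a track."""
        self._traces[track_id].append((timestamp, position))

    def trace(self, track_id):
        """Return the stored positions, oldest first."""
        return [pos for _, pos in self._traces[track_id]]
```

Bounding the buffer keeps memory constant while retaining enough history for the lane-segment construction that follows.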
[0038] As a next step, a leading-vehicle-estimated lane segment 11.1 is generated from the trace 10.1 of the leading vehicle 9.1. The leading-vehicle-estimated lane segment 11.1 is likewise given by its left edge 12.1 and its right edge 13.1. The leading-vehicle-estimated lane segment 11.1 is generated by creating the left edge 12.1 and the right edge 13.1 at a distance of one-half of the lane width from the trace 10.1, to either side in the direction perpendicular to the driving direction D′ of the leading vehicle 9.1.
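The edge construction in paragraph [0038], offsetting the trace by half the lane width to either side perpendicular to the driving direction, can be sketched as follows. Function and parameter names are illustrative assumptions; the driving direction is approximated here by the local tangent of the trace.

```python
import numpy as np

def edges_from_trace(trace, lane_width):
    """Build left and right lane edges by offsetting a leading-vehicle
    trace by half the lane width to either side, perpendicular to the
    local driving direction. A sketch of the construction in [0038]."""
    trace = np.asarray(trace, dtype=float)           # (N, 2) trace points
    d = np.gradient(trace, axis=0)                   # local driving direction
    d /= np.linalg.norm(d, axis=1, keepdims=True)    # unit tangent vectors
    normal = np.stack([-d[:, 1], d[:, 0]], axis=1)   # left-hand unit normal
    left = trace + 0.5 * lane_width * normal
    right = trace - 0.5 * lane_width * normal
    return left, right
```

For a straight trace along the x axis with a 3.5 m lane width, the edges lie 1.75 m to either side of the trace, as expected.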
[0039] Then, the camera-estimated lane segments 6 and the leading-vehicle-estimated lane segment 11.1 are extrapolated until they reach one another, as shown in
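The extrapolation step can be sketched by extending a fitted edge polynomial beyond its last sample until it reaches the neighboring segment. This is an illustration under the same parameterization assumption as above; the target coordinate and degree are hypothetical parameters.

```python
import numpy as np

def extrapolate_edge(points, x_target, degree=3):
    """Extend a lane edge beyond its last sampled point toward x_target
    by evaluating its fitted polynomial there, so that adjacent segments
    can be made to overlap. A sketch of the extrapolation in [0039]."""
    pts = np.asarray(points, dtype=float)
    # Cap the degree so the fit is determined by the available points.
    coeffs = np.polyfit(pts[:, 0], pts[:, 1], min(degree, len(pts) - 1))
    xs = np.linspace(pts[-1, 0], x_target, 10)       # extension samples
    ys = np.polyval(coeffs, xs)
    return np.stack([xs, ys], axis=1)
```

Extrapolating both the camera-estimated and the leading-vehicle-estimated edges toward one another yields the overlap region that the stitching test below operates on.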
[0040] Hence, the camera-estimated lane segment 6.2 and the leading-vehicle-estimated lane segment 11.1 are stitched together to obtain the estimated road lane geometry 14 as shown in
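The stitching decision of claims 7 and 9, comparing the lateral overlap of adjacent segments against thresholds (greater than 50% to stitch, less than 15% to infer an extra lane), can be sketched in simplified one-dimensional form. Here each segment is reduced to its pair of lateral edge positions at the junction; this reduction and the function names are assumptions for illustration.

```python
def lateral_overlap(seg_a, seg_b):
    """Overlap of two lane segments in the direction perpendicular to
    the driving direction, as a fraction of the narrower segment's
    width. seg_* are (left, right) lateral edge positions at the
    junction; a simplified sketch of the overlap measure."""
    lo = max(min(seg_a), min(seg_b))
    hi = min(max(seg_a), max(seg_b))
    width = min(max(seg_a) - min(seg_a), max(seg_b) - min(seg_b))
    return max(0.0, hi - lo) / width

def should_stitch(seg_a, seg_b, threshold=0.5):
    """Stitch adjacent segments when the lateral overlap exceeds 50%,
    per claim 7; an overlap below 15% would instead suggest an extra
    lane, per claim 9."""
    return lateral_overlap(seg_a, seg_b) > threshold
```

The same overlap value can double as the stitching quality measure of claim 8, since a larger lateral overlap indicates a more reliable junction.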
[0044] Other variations to the disclosed embodiments can be understood and effected by those skilled in the art in practicing the aspects of the embodiments of the present application, from the study of the drawings, the disclosure, and the appended claims. In the claims the word “comprising” does not exclude other elements or steps and the indefinite article “a” or “an” does not exclude a plurality. The mere fact that certain measures are recited in mutually different dependent claims does not indicate that a combination of these measures cannot be used to advantage. Any reference signs in the claims should not be construed as limiting the scope of the claims.