Lane keeping assistance system

10836385 · 2020-11-17

Abstract

A lane keeping assistance system including position means for determining a first position of the vehicle in a network with a first accuracy; a first interface for providing a target trajectory of the vehicle in the network; a second interface for providing position data of right boundary objects and position data of left boundary objects, and radar signatures of these boundary objects; a radar system to scan right and left lateral environments of the vehicle and determine distances to objects to the right of the vehicle and radar signatures thereof, and distances to objects to the left of the vehicle and radar signatures thereof; an evaluation unit to perform identification of acquired objects based on the first position, the provided data, and the determined data, and to determine a second position of the vehicle with a second position accuracy; and a control device to control the vehicle taking into consideration the target trajectory and the second position.

Claims

1. A lane keeping assistance system for transverse control of a vehicle, the system comprising: a position means for determining a current first position P1(t) of the vehicle in a road traffic network, wherein the first position P1(t) has a position accuracy ΔP1: P1(t)=P1(t)±ΔP1; a first interface to provide a target trajectory ST(t) of the vehicle in the road traffic network; a second interface to provide georeferenced position data P.sub.OR,i of objects OR.sub.i of a right traffic lane boundary and georeferenced position data P.sub.OL,i of objects OL.sub.i of a left traffic lane boundary, and 2D or 3D radar signatures RS.sub.OR,i and RS.sub.OL,i of the objects OR.sub.i and OL.sub.i for a route section of the road traffic network being traveled by the vehicle, wherein the georeferenced position data P.sub.OR,i and P.sub.OL,i have a position accuracy ΔP2, where ΔP2<ΔP1, i=1, 2, 3, . . . ; a radar system to scan a right lateral environment and a left lateral environment of the vehicle to determine data including distances D.sub.OR to objects OR present laterally to the right of the vehicle and radar signatures RS.sub.OR of the objects OR, and distances D.sub.OL to objects OL present laterally to the left of the vehicle and radar signatures RS.sub.OL of the objects OL; an evaluation unit, via which initially, based on the first position P1(t), provided data P.sub.OR,i, P.sub.OL,i, RS.sub.OR,i, and RS.sub.OL,i, and determined data D.sub.OR, D.sub.OL, RS.sub.OR, and RS.sub.OL, an identification of the objects OR and OL as respective objects OR.sub.i and OL.sub.i is performed, and based on a successful identification, a second position P2(t) of the vehicle is determined using georeferenced position data P.sub.OR,i of an identified object OR.sub.i coupled with a determined distance D.sub.OR of an object OR associated with the successful identification of the object OR as the object OR.sub.i, or using the georeferenced position data P.sub.OL,i of an identified object OL.sub.i coupled with a distance D.sub.OL of an object OL associated with the successful identification of the object OL as the object OL.sub.i, wherein the second position P2(t) has a position accuracy ΔP2: P2(t)=P2(t)±ΔP2, at least in one dimension; and a control device to perform transverse control of the vehicle in the road traffic network taking into consideration the target trajectory ST(t) and the second position P2(t) of the vehicle.

2. The lane keeping assistance system according to claim 1, wherein the position means for determining the first position P1(t) is designed based on data of a satellite navigation system, or based on optical environment data of an optical system, or based on radar data of the radar system, or based on dead reckoning data of a dead reckoning system, or a combination of these data.

3. The lane keeping assistance system according to claim 1, wherein the objects OR.sub.i and OL.sub.i are guardrails, vertical guardrail supports, curbstones, concrete deflectors, pylons, metal fences, noise control walls, side walls or radar reflectors, retroreflectors, corner reflectors, poles of traffic signs, overhead sign structures, or roadside emergency telephones.

4. The lane keeping assistance system according to claim 1, wherein the radar signatures RS.sub.OR,i and RS.sub.OL,i of the objects OR.sub.i and OL.sub.i are based on radar signatures acquired by a sensor on board an aircraft or on board a satellite in top view and converted to radar signatures acquirable by the radar system of the vehicle.

5. The lane keeping assistance system according to claim 1, wherein the evaluation unit is designed and set up for a plausibility verification, wherein probabilities W(OR) and W(OL) of an unequivocal identification of objects OR and OL are determined in each case as one of the objects OR.sub.i and OL.sub.i based on the determined radar signatures RS.sub.OR and RS.sub.OL, the provided radar signatures RS.sub.OR,i and RS.sub.OL,i and the first position P1(t), wherein, for the determination of the second position P2(t), only those objects OR and OL are used, the probabilities W(OR) and W(OL) of which are above a predetermined limit value G1, and wherein for objects OR and OL, the probabilities W(OR) and W(OL) of which in each case are below the predetermined limit value G1, a warning signal WARN is generated.

6. The lane keeping assistance system according to claim 5, wherein, in the case in which the evaluation unit has determined the second position P2(t) at a time step t.sub.0, and the warning signal WARN is generated for time steps t.sub.k greater than t.sub.0, where t.sub.k=t.sub.k-1+Δt, k=1, 2, . . . , and Δt:=time increment, the evaluation unit for these time steps t.sub.k determines a position P2(t.sub.k), where
P2(t.sub.k)=P2(t.sub.k-1)+{dot over (P)}1(t.sub.k-1)·Δt,  (1)
and the control device carries out the transverse control of the vehicle at least for a predetermined period of time ZS based on the position P2(t.sub.k).
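Equation (1) is a short dead-reckoning extrapolation: while the warning signal WARN is active, the last trusted second position is propagated with the velocity {dot over (P)}1 derived from the first-position source. A minimal sketch in Python, assuming 2D east/north coordinates in metres (all names illustrative, not from the patent):

```python
def extrapolate_p2(p2_prev, p1_dot_prev, dt):
    """Dead-reckon P2(t_k) = P2(t_{k-1}) + P1'(t_{k-1}) * dt per equation (1).

    p2_prev:     last trusted second position (east, north) in metres
    p1_dot_prev: velocity estimate (east, north) in m/s derived from P1
    dt:          time increment in seconds
    """
    return (p2_prev[0] + p1_dot_prev[0] * dt,
            p2_prev[1] + p1_dot_prev[1] * dt)
```

The control device would rely on such extrapolated positions for at most the predetermined period of time ZS.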

7. The lane keeping assistance system according to claim 1, wherein the system comprises: a third interface, via which data is provided that enables an optical characterization and/or ultrasound characterization of the objects OR.sub.i and OL.sub.i; and an optical system OPT and/or an ultrasound system US for scanning the right lateral environment and the left lateral environment of the vehicle for the determination of distances D.sub.R,OR,OPT and/or D.sub.R,OR,US to objects OR present laterally to the right of the vehicle and for the identification thereof as objects OR.sub.i, and of distances D.sub.L,OL,OPT and/or D.sub.L,OL,US to objects OL present laterally to the left of the vehicle and for the identification thereof as objects OL.sub.i; wherein, in the determination of the second position P2(t), the evaluation unit takes into consideration the distances D.sub.R,OR,OPT, D.sub.L,OL,OPT and/or D.sub.R,OR,US, D.sub.L,OL,US to the respective identified objects OR.sub.i and OL.sub.i.

8. The lane keeping assistance system according to claim 1, wherein the evaluation unit is designed and set up in such a manner that a transmission to a central station is made, indicating that, with the radar system, at positions P1(t) or P2(t), objects OR and/or OL were determined, which are not identifiable as objects OR.sub.i and/or OL.sub.i and/or that, with the radar system, at positions P1(t) or P2(t), no objects OR and/or OL were determined, which, however, should be present as objects OR.sub.i and/or OL.sub.i.

9. The lane keeping assistance system according to claim 1, wherein, in the identification of the objects OR and OL acquired with the radar system as objects OR.sub.i and OL.sub.i, the evaluation unit acquires, with a counter Z.sub.L, a number ANZ.sub.OL of the objects OL acquired laterally to the left, and with a counter Z.sub.R, a number ANZ.sub.OR of the objects OR acquired laterally to the right, wherein the quantities ANZ.sub.OL and ANZ.sub.OR are taken into consideration in the determination of the vehicle position P2(t).

10. A method for the transverse control of a vehicle, the method comprising: determining a current first position P1(t) of the vehicle in a road traffic network, wherein the first position P1(t) has a position accuracy ΔP1: P1(t)=P1(t)±ΔP1; providing a target trajectory ST(t) of the vehicle in the road traffic network; providing georeferenced position data P.sub.OR,i of objects OR.sub.i of a right traffic lane boundary and georeferenced position data P.sub.OL,i of objects OL.sub.i of a left traffic lane boundary, and 2D or 3D radar signatures RS.sub.OR,i and RS.sub.OL,i of the objects OR.sub.i and OL.sub.i, for a route section of the road traffic network being traveled by the vehicle, wherein the georeferenced position data P.sub.OR,i and P.sub.OL,i have a position accuracy ΔP2, where ΔP2<ΔP1, i=1, 2, 3, . . . ; scanning, using a radar system, a right lateral environment and a left lateral environment of the vehicle to determine data including distances D.sub.OR to objects OR present laterally to the right of the vehicle and radar signatures RS.sub.OR of the objects OR, and distances D.sub.OL to objects OL present laterally to the left of the vehicle and radar signatures RS.sub.OL of the objects OL; performing initially, based on the first position P1(t), provided data P.sub.OR,i, P.sub.OL,i, RS.sub.OR,i, and RS.sub.OL,i, and determined data D.sub.OR, D.sub.OL, RS.sub.OR, and RS.sub.OL, an identification of the objects OR and OL as respective objects OR.sub.i and OL.sub.i, and based on a successful identification, determining a second position P2(t) of the vehicle using georeferenced position data P.sub.OR,i of an identified object OR.sub.i coupled with a determined distance D.sub.OR of an object OR associated with the successful identification of the object OR as the object OR.sub.i, or using the georeferenced position data P.sub.OL,i of an identified object OL.sub.i coupled with a distance D.sub.OL of an object OL associated with the successful identification of the object OL as the object OL.sub.i, wherein the second position P2(t) has a position accuracy ΔP2: P2(t)=P2(t)±ΔP2, at least in one dimension; and performing transverse control of the vehicle in the road traffic network taking into consideration the target trajectory ST(t) and the second position P2(t) of the vehicle.

11. A lane keeping system for the transverse control of a vehicle, the system comprising: a processing device; and a memory storing instructions that, when executed by the processing device, cause the processing device to perform operations comprising: determining a current first position P1(t) of the vehicle in a road traffic network, wherein the first position P1(t) has a position accuracy ΔP1: P1(t)=P1(t)±ΔP1; providing a target trajectory ST(t) of the vehicle in the road traffic network; providing georeferenced position data P.sub.OR,i of objects OR.sub.i of a right traffic lane boundary and georeferenced position data P.sub.OL,i of objects OL.sub.i of a left traffic lane boundary, and 2D or 3D radar signatures RS.sub.OR,i and RS.sub.OL,i of the objects OR.sub.i and OL.sub.i, for a route section of the road traffic network being traveled by the vehicle, wherein the georeferenced position data P.sub.OR,i and P.sub.OL,i have a position accuracy ΔP2, where ΔP2<ΔP1, i=1, 2, 3, . . . ; scanning, using a radar system, a right lateral environment and a left lateral environment of the vehicle to determine data including distances D.sub.OR to objects OR present laterally to the right of the vehicle and radar signatures RS.sub.OR of the objects OR, and distances D.sub.OL to objects OL present laterally to the left of the vehicle and radar signatures RS.sub.OL of the objects OL; performing initially, based on the first position P1(t), the provided data P.sub.OR,i, P.sub.OL,i, RS.sub.OR,i, and RS.sub.OL,i, and determined data D.sub.OR, D.sub.OL, RS.sub.OR, and RS.sub.OL, an identification of the objects OR and OL as respective objects OR.sub.i and OL.sub.i, and based on a successful identification, determining a second position P2(t) of the vehicle using georeferenced position data P.sub.OR,i of an identified object OR.sub.i coupled with a determined distance D.sub.OR of an object OR associated with the successful identification of the object OR as the object OR.sub.i, or using the georeferenced position data P.sub.OL,i of an identified object OL.sub.i coupled with a distance D.sub.OL of an object OL associated with the successful identification of the object OL as the object OL.sub.i, wherein the second position P2(t) has a position accuracy ΔP2: P2(t)=P2(t)±ΔP2, at least in one dimension; and performing transverse control of the vehicle in the road traffic network taking into consideration the target trajectory ST(t) and the second position P2(t) of the vehicle.

Description

BRIEF DESCRIPTION OF THE DRAWINGS

(1) Additional advantages, features, and details result from the following description, wherein, if applicable in reference to the drawings, at least one embodiment example is described in detail. Identical, similar, and/or functionally equivalent parts are provided with identical reference numerals.

(2) In the drawings:

(3) FIG. 1 shows a schematic design of a lane keeping assistance system according to the invention;

(4) FIG. 2 shows a schematic flowchart of a method for transverse control of a vehicle in a route network according to the invention;

(5) FIG. 3 shows a schematic design of a road traffic network in which the lane keeping assistance system shown in FIG. 1 and method shown in FIG. 2 are applicable; and

(6) FIG. 4 shows a schematic design of optional components of the lane keeping assistance system shown in FIG. 1 in relation to the road traffic network shown in FIG. 3.

DETAILED DESCRIPTION

(7) FIG. 1 shows a schematic design of an inventive lane keeping assistance system 100, in view of FIG. 3 that shows a road traffic network 300 in which the lane keeping assistance system 100 is applicable. The lane keeping assistance system 100 includes a position means 101, a first interface 102, a second interface 103, a radar system 104, an evaluation unit 105, and a control device 106.

(8) The position means 101 determines a current first position P1(t) of a vehicle 304 in the road traffic network 300 with a position accuracy ΔP1: P1(t)=P1(t)±ΔP1. The first interface 102 provides a target trajectory ST(t) 305 of the vehicle 304 in the road traffic network 300. The second interface 103 provides georeferenced position data P.sub.OR,i of objects OR.sub.i of a right traffic lane boundary 301 and georeferenced position data P.sub.OL,i of objects OL.sub.i of a left traffic lane boundary 302, and 2D or 3D radar signatures RS.sub.OR,i and RS.sub.OL,i of these objects OR.sub.i and OL.sub.i for a route section 303 of the road traffic network 300 being traveled by the vehicle 304, wherein the georeferenced position data P.sub.OR,i, P.sub.OL,i have a position accuracy ΔP2, where ΔP2<ΔP1, i=1, 2, 3, . . . , the foregoing position and signature data collectively referred to as georeferenced data 306 of objects OR.sub.i, OL.sub.i.

(9) The radar system 104 scans a right lateral environment and a left lateral environment of the vehicle for the determination of distances D.sub.OR to objects OR present laterally to the right of the vehicle 304 and the radar signatures RS.sub.OR thereof, and distances D.sub.OL to objects OL present laterally to the left of the vehicle and the radar signatures RS.sub.OL thereof, the foregoing distance and signature data collectively referred to as determined data of acquired objects OR, OL.

(10) The evaluation unit 105, based on the first position P1(t), the provided georeferenced data P.sub.OR,i, P.sub.OL,i, RS.sub.OR,i and RS.sub.OL,i, and the determined data D.sub.OR, RS.sub.OR, D.sub.OL, RS.sub.OL, performs an identification of the acquired objects OR and OL as objects OR.sub.i, OL.sub.i, and based on a successful identification, determines a second position P2(t) of the vehicle 304 using georeferenced position data P.sub.OR,i of an identified object OR.sub.i coupled with a determined distance D.sub.OR of an object OR associated with the successful identification of the object OR as the object OR.sub.i, or using the georeferenced position data P.sub.OL,i of an identified object OL.sub.i coupled with a distance D.sub.OL of an object OL associated with the successful identification of the object OL as the object OL.sub.i, wherein the second position P2(t) has the position accuracy ΔP2: P2(t)=P2(t)±ΔP2, at least in one direction.
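The identification step described in paragraph (10) can be sketched as a gated nearest-signature match; a minimal illustration in Python, assuming planar east/north coordinates in metres, scalar radar signatures, and illustrative gate and tolerance values (none of these names or thresholds come from the patent):

```python
import math

def identify_objects(p1, provided, measured, gate_m=20.0, sig_tol=0.2):
    """Match objects OR/OL acquired by the on-board radar to the
    georeferenced objects OR_i/OL_i supplied via the second interface.

    p1:       coarse first position (east, north) in metres
    provided: georeferenced data, list of {'id', 'pos': (e, n), 'sig': float}
    measured: radar detections, list of {'dist': float, 'sig': float, 'side': 'R'|'L'}
    Returns a list of (measurement index, matched object id) pairs.
    """
    matches = []
    for m_idx, m in enumerate(measured):
        best = None
        for obj in provided:
            # Plausibility gate: the object's range from the coarse
            # position P1 must roughly agree with the measured distance.
            rng = math.dist(p1, obj['pos'])
            if abs(rng - m['dist']) > gate_m:
                continue
            # Signature comparison: keep the closest signature match.
            err = abs(obj['sig'] - m['sig'])
            if err <= sig_tol and (best is None or err < best[1]):
                best = (obj['id'], err)
        if best is not None:
            matches.append((m_idx, best[0]))
    return matches
```

A production system would compare full 2D or 3D signatures and account for the radar mounting geometry; the gate width would be derived from the accuracy ΔP1 of the first position.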

(11) The control device 106 transversely controls the vehicle 304, taking into consideration the target trajectory ST(t) 305 and the second position P2(t) of the vehicle.

(12) FIG. 2 shows a schematic flowchart of an inventive method 200 for the transverse control of a vehicle 304 in a road traffic network 300 of FIG. 3. The method includes the following steps. In step 201, there is determined a current first position P1(t) of the vehicle 304 in the road traffic network 300 with a position accuracy ΔP1: P1(t)=P1(t)±ΔP1. In step 202, there is provided a target trajectory ST(t) 305 of the vehicle 304 in the road traffic network 300. In step 203, there are provided georeferenced position data P.sub.OR,i of objects OR.sub.i of a right traffic lane boundary, georeferenced position data P.sub.OL,i of objects OL.sub.i of a left traffic lane boundary, and 2D or 3D radar signatures RS.sub.OR,i and RS.sub.OL,i of these objects OR.sub.i, OL.sub.i, for the route section 303 of the road traffic network 300 being traveled by the vehicle 304, wherein the georeferenced position data P.sub.OR,i, P.sub.OL,i have a position accuracy ΔP2 of <0.15 m, where ΔP2<ΔP1, i=1, 2, 3, . . . , the foregoing position and signature data collectively referred to as georeferenced data 306 of objects OR.sub.i, OL.sub.i.

(13) In step 204, using the radar system 104, there is performed a scanning of a right and of a left lateral environment of the vehicle 304 for the determination of distances D.sub.OR to objects OR present laterally to the right of the vehicle 304 and the radar signatures RS.sub.OR thereof, and of distances D.sub.OL to objects OL present laterally to the left of the vehicle 304 and the radar signatures RS.sub.OL thereof. In step 205, based on the first position P1(t), the provided georeferenced data P.sub.OR,i, P.sub.OL,i, RS.sub.OR,i and RS.sub.OL,i, and the determined data D.sub.OR, RS.sub.OR, D.sub.OL, RS.sub.OL, there is performed a determination of a second position P2(t) of the vehicle, which has the position accuracy ΔP2 at least in one direction. In step 206, there is performed a transverse control of the vehicle 304, taking into consideration the target trajectory ST(t) and the position P2(t).
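Steps 201 through 206 can be wired together as one control cycle; a schematic sketch with each component injected as a callable (all names are illustrative stand-ins for the patent's components, not defined by it):

```python
def method_200_cycle(first_position, target_trajectory,
                     georeferenced_data, radar_scan,
                     determine_p2, transverse_control):
    """One cycle of method 200, steps 201-206, composed of callables."""
    p1 = first_position()                    # step 201: coarse P1(t)
    st = target_trajectory(p1)               # step 202: target ST(t)
    geo = georeferenced_data(p1)             # step 203: P_OR,i / P_OL,i / RS
    det = radar_scan()                       # step 204: D_OR, RS_OR, D_OL, RS_OL
    p2 = determine_p2(p1, geo, det)          # step 205: refined P2(t)
    transverse_control(st, p2)               # step 206: steer toward ST(t)
    return p2
```

In a vehicle this cycle would repeat at the radar update rate, feeding each new P2(t) to the control device.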

(14) FIG. 3 shows a schematic design of a road traffic network 300 in which the lane keeping assistance system 100 of FIG. 1 and the method shown in FIG. 2 are applicable. In this regard, it is understood that vehicle 304 includes the lane keeping assistance system 100 as shown in FIG. 1, and can be controlled as it travels along the road traffic network 300 using the lane keeping assistance method as shown in FIG. 2. As particularly shown in FIG. 3, the road traffic network 300 includes a right traffic boundary 301 and a left traffic boundary 302, and the vehicle 304 travels along the road traffic network 300 generally between the right and left traffic boundaries 301, 302.

(15) The first position P1(t) having a position accuracy ΔP1 is determined for the vehicle 304 in the road traffic network 300 using the position means 101. The position means 101 can include a radar system 104 as shown in FIG. 1, a satellite navigation system 401 that communicates with satellites 308, an optical system 402, an ultrasound system 403, or a dead reckoning system 404, as shown in FIG. 4, or a combination thereof. For example, as described hereinabove, the position accuracy ΔP1 of the first position P1(t) is relatively imprecise (in the case of a satellite navigation system 401, in a range between 5 m and 15 m), and as such, the first position P1(t) is not sufficiently accurate for autonomous vehicle operation.

(16) The target trajectory ST(t) 305 provides the upcoming target route of the vehicle 304. As described hereinabove, target trajectory ST(t) 305 of the vehicle 304 in the road network 300 can be provided by the satellite navigation system 401 of FIG. 4, a central system of the vehicle, or an external central traffic control center (e.g., a central server to which the vehicle is connected for data exchange).

(17) The georeferenced position data 306 include positions P.sub.OR,i of the objects OR.sub.i of the right traffic lane boundary 301 and positions P.sub.OL,i of objects OL.sub.i of the left traffic lane boundary 302, and radar signatures RS.sub.OR,i, RS.sub.OL,i of the respective objects OR.sub.i, OL.sub.i, for a route section 303 of the road traffic network 300 associated with travel of the vehicle 304, wherein the foregoing georeferenced position data have a position accuracy ΔP2. As described hereinabove, the georeferenced position data 306 are significantly more precise (e.g., position accuracy of 0.5 m-0.05 m) than the first position P1(t) of the vehicle 304 (e.g., position accuracy of 5 m-15 m). The radar signatures RS.sub.OR,i, RS.sub.OL,i of the georeferenced position data 306 can be those resulting from signatures acquired by an airplane 307 or a satellite 308 from a top view of the objects OR.sub.i, OL.sub.i, and converted to signatures acquirable by the radar system 104 of the vehicle 304.

(18) The right and the left environments of the vehicle 304 are scanned using the radar system 104. The radar system 104 determines distances D.sub.OR to objects OR on the right and their radar signatures RS.sub.OR, and similarly distances D.sub.OL to objects OL on the left and their radar signatures RS.sub.OL. As described hereinabove, the determined distance data D.sub.OR, D.sub.OL have relatively precise distance accuracy (e.g., 0.4 m-0.05 m), which in combination with the radar signatures RS.sub.OR, RS.sub.OL helps the evaluation unit 105 to identify the objects OR, OL as respective objects OR.sub.i, OL.sub.i, as described below.

(19) The evaluation unit 105 performs determination of a second position P2(t) of the vehicle based on the first position P1(t) of the vehicle 304, the georeferenced data, and the determined data, wherein P2(t) has a position accuracy ΔP2. In particular, based on the first position P1(t), the provided data P.sub.OR,i, P.sub.OL,i, RS.sub.OR,i and RS.sub.OL,i, and the determined data D.sub.OR, RS.sub.OR, D.sub.OL, RS.sub.OL, the evaluation unit 105 initially performs an identification of the objects OR and OL as respective objects OR.sub.i and OL.sub.i. Thereafter, based on a successful identification (e.g., object OR successfully identified as object OR.sub.i, which is illustrated as cross-hatched), the evaluation unit 105 determines a second position P2(t) of the vehicle 304 using georeferenced position data P.sub.OR,i of an identified object OR.sub.i coupled with a determined distance D.sub.OR of an object OR associated with the successful identification of the object OR as the object OR.sub.i, or using the georeferenced position data P.sub.OL,i of an identified object OL.sub.i coupled with a distance D.sub.OL of an object OL associated with the successful identification of the object OL as the object OL.sub.i.
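The coupling of a georeferenced object position with a measured distance can be sketched in one dimension; a simplified illustration assuming the road runs roughly north, so that the lateral dimension is east, which is why P2(t) reaches the accuracy ΔP2 "at least in one dimension" while the along-track coordinate keeps the accuracy of P1(t). Names and geometry are illustrative, not from the patent:

```python
def refine_lateral(p1, obj_east, d_or):
    """Couple the georeferenced east coordinate of an identified
    right-boundary object OR_i with the measured radar distance D_OR:
    the vehicle's east coordinate becomes obj_east - d_or (accuracy on
    the order of the map accuracy plus the radar distance accuracy),
    while the north coordinate is simply kept from P1(t).
    """
    return (obj_east - d_or, p1[1])
```

For a left-boundary object OL_i the sign would flip (obj_east + d_ol); using one object on each side would additionally constrain the heading.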

(20) Thereafter, the control device 106 performs transverse control of the vehicle 304 in the road traffic network 300, taking into consideration the target trajectory ST(t) and the second position P2(t) of the vehicle 304. In this regard, the relatively precise position of the georeferenced data coupled with the relatively precise distance of the determined data, based on a successful identification, enables determination of a second position P2(t) of the vehicle 304 that is significantly more precise than the first position P1(t) of the vehicle 304, and as such, transverse control of the vehicle 304 can be performed in the road traffic network 300 for autonomous operation of the vehicle 304.
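The patent leaves the control law of the control device 106 open; as a toy stand-in, a saturated proportional controller on the cross-track error between the target trajectory ST(t) and the refined position P2(t) (gains and limits are illustrative):

```python
def steering_angle(p2_lat, target_lat, k_p=0.5, max_angle=0.5):
    """Toy transverse controller: steer toward the target trajectory.

    p2_lat:     lateral coordinate of the refined position P2(t), metres
    target_lat: lateral coordinate of the target trajectory ST(t), metres
    Returns a steering angle in radians, saturated to +/- max_angle.
    """
    error = target_lat - p2_lat      # cross-track error in metres
    cmd = k_p * error                # proportional steering command
    return max(-max_angle, min(max_angle, cmd))
```

A real lane keeping controller would also use heading error, vehicle speed, and trajectory curvature, but the benefit of the precise P2(t) is the same: the cross-track error term becomes trustworthy at lane-keeping scale.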

(21) FIG. 4 shows a schematic design of optional components 400 of the lane keeping assistance system 100 as shown in FIG. 1, in relation to the road traffic network 300 as shown in FIG. 3. The lane keeping assistance system 100 can include a satellite navigation system 401, an optical system OPT 402, an ultrasound system US 403, a dead reckoning system 404, or combinations thereof. The lane keeping assistance system 100 can also include a third interface 405 connected to the evaluation unit 105 of FIG. 1, which can further be in communication with a central station 406.

(22) Via the third interface 405, data can be provided that enables optical and/or ultrasound characterization of the objects OR.sub.i, OL.sub.i. The optical system OPT 402 and/or the ultrasound system US 403 can be used to scan the right and left lateral environments of the vehicle 304 and to determine corresponding distances D.sub.R,OR,OPT, D.sub.R,OR,US to objects OR present laterally to the right of the vehicle 304 and corresponding distances D.sub.L,OL,OPT, D.sub.L,OL,US to objects OL present laterally to the left of the vehicle 304.

(23) Moreover, the optical system OPT 402 and/or the ultrasound system US 403 can also be used to identify the acquired objects as objects OR.sub.i or OL.sub.i based on the optical data and/or ultrasound data acquired by the respective systems 402, 403. The evaluation unit 105, for the determination of the second position P2(t), can thus take into consideration the determined distances D.sub.R,OR,OPT, D.sub.L,OL,OPT and/or D.sub.R,OR,US, D.sub.L,OL,US to the respective identified objects OR.sub.i, OL.sub.i. In this development, the objects OR.sub.i and OL.sub.i detected by the radar system 104 of FIG. 1 can be initially verified using the optical system OPT 402 and/or the ultrasound system US 403 of FIG. 4.
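One plausible way for the evaluation unit to "take into consideration" the optical and ultrasound distances alongside the radar distance is a weighted mean per identified object; a sketch under that assumption (the patent does not specify the combination rule, and the weights here are illustrative):

```python
def fuse_distances(d_radar, d_opt=None, d_us=None,
                   w_radar=1.0, w_opt=1.0, w_us=1.0):
    """Combine the radar distance with optional optical (D_R,OR,OPT) and
    ultrasound (D_R,OR,US) distances to the same identified object by a
    weighted mean. Missing sensors are simply skipped.
    """
    pairs = [(d_radar, w_radar)]
    if d_opt is not None:
        pairs.append((d_opt, w_opt))
    if d_us is not None:
        pairs.append((d_us, w_us))
    total_w = sum(w for _, w in pairs)
    return sum(d * w for d, w in pairs) / total_w
```

In practice the weights would reflect each sensor's distance accuracy, e.g. inverse measurement variance, so the more precise sensor dominates.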

(24) In the lane keeping assistance system 100, the evaluation unit 105 can also make a transmission to a central station 406, indicating that with the radar system 104, at positions P1(t) and/or P2(t), objects OR and/or OL were determined, which are not identifiable as objects OR.sub.i and/or OL.sub.i, and/or indicating that with the radar system 104, at positions P1(t) and/or P2(t), no objects OR and/or OL were determined, which, however, should be present as objects OR.sub.i and/or OL.sub.i. Moreover, the central station 406 can provide the georeferenced position data P.sub.OR,i, P.sub.OL,i of objects OR.sub.i and OL.sub.i as well as associated 2D or 3D radar signatures RS.sub.OR,i and RS.sub.OL,i to the vehicle 304 via the second interface 103.

(25) Although the invention was illustrated and explained in further detail by preferred embodiments, the invention is not limited by the disclosed examples, and other variations can be derived therefrom by the person skilled in the art without leaving the scope of protection of the invention. It is therefore clear that numerous variation possibilities exist. It is also clear that the embodiments mentioned as examples are in fact only examples, which in no way should be taken as a limitation of, for example, the scope of protection, the application possibilities, or the configuration of the invention. Instead, the above description and the description of the figures enable the person skilled in the art to concretely implement the exemplary embodiments, wherein the person skilled in the art, with knowledge of the disclosed inventive idea, can make various changes, for example with regard to the function or the arrangement, in an exemplary embodiment, without leaving the scope of protection defined by the claims and their legal equivalents, such as, for example, further explanations in the description.