METHOD FOR OPERATING A DRIVER ASSISTANCE SYSTEM, COMPUTER PROGRAM PRODUCT, DRIVER ASSISTANCE SYSTEM, AND VEHICLE
20240194077 · 2024-06-13
Assignee
Inventors
- Ludovic Mosnier-Thoumas (Bietigheim-Bissingen, DE)
- Markus Heimberger (Bietigheim-Bissingen, DE)
- Niko Moritz Scholz (Kronach Neuses, DE)
- Jean-Francois Bariant (Bietigheim-Bissingen, DE)
CPC classification
G01S15/58
PHYSICS
B60W50/14
PERFORMING OPERATIONS; TRANSPORTING
B60W30/0956
PERFORMING OPERATIONS; TRANSPORTING
B60Q5/006
PERFORMING OPERATIONS; TRANSPORTING
G01S15/876
PHYSICS
G08G1/166
PHYSICS
G01S15/86
PHYSICS
B60Q1/525
PERFORMING OPERATIONS; TRANSPORTING
G01S15/42
PHYSICS
B60W2554/4044
PERFORMING OPERATIONS; TRANSPORTING
International classification
G01S15/58
PHYSICS
Abstract
The invention relates to a method for operating a driver assistance system (110). The method has the steps of: a) receiving (S1) a drive state sensor signal (SIG0(t)), which indicates the drive state, at a number of different points in time (t0-t5), b) receiving (S2) a number of sensor signals (SIG1(t)), which indicate the surroundings (200), at a number of different points in time (t0-t5), c) detecting (S3) a number of objects (210, 211) in the surroundings (200) on the basis of a first number of sensor signals (SIG1(t)), which have been received at a first point in time, d) ascertaining (S4) a position (POS) and a movement vector (VEC) for a detected object (210, 211) on the basis of the first number of sensor signals (SIG1(t)) and a second number of sensor signals (SIG1(t)), which have been received at a second point in time following the first point in time, using a plurality of different ascertainment methods (V1, V2), wherein different ascertainment methods (V1, V2) of the plurality have a different degree of computing complexity, and e) outputting (S5) a warning signal if a potential collision with the detected object (210, 211) is ascertained on the basis of the drive state sensor signal (SIG0(t)) received at a specified point in time and the position (POS) and the movement vector (VEC) ascertained for the detected object (210, 211).
Claims
1. A method for operating a driver assistance system for a vehicle, the method comprising: a) receiving a drive state sensor signal, which indicates the drive state of the vehicle, at a number of different points in time, b) receiving a number of sensor signals, which indicate the environment of the vehicle, at a number of different points in time; c) detecting a number of objects in the environment of the vehicle on the basis of a first number of sensor signals, which have been received at a first point in time; d) ascertaining a position and a movement vector for a detected object on the basis of the first number of sensor signals and a second number of sensor signals, which have been received at a second point in time following the first point in time, using a plurality of different ascertainment methods, wherein different ascertainment methods of the plurality have a different degree of computing complexity; and e) outputting a warning signal if a potential collision of the vehicle with the detected object is ascertained on the basis of the drive state sensor signal received at a specified point in time and the position and the movement vector ascertained for the detected object.
2. The method as claimed in claim 1, wherein the number of different ascertainment methods comprises at least one first ascertainment method, in which for each detected object of the number a Kalman filter is assigned and initialized, which is used to ascertain the position and the movement vector of the respective object.
3. The method as claimed in claim 2, wherein different sensor signals of the number are assigned different scanning regions in the environment, wherein each sensor signal from the number of sensor signals received at a given time, which is assigned to a specific scanning region in the environment, is supplied to the Kalman filter, the assigned object of which has a position that is located within the scanning region assigned to the sensor signal.
4. The method as claimed in claim 2, wherein the output of the warning signal, if a potential collision is ascertained on the basis of the position and the movement vector ascertained for the respective detected object using the first ascertainment method, takes place only if the ascertained movement vector of the object is non-zero.
5. The method as claimed in claim 1, further comprising determining a driving tube for the vehicle on the basis of the received drive state sensor signal.
6. The method as claimed in claim 5, wherein a warning signal is output only if a distance from the respective object to the vehicle and/or to the ascertained driving tube is less than or equal to a lower threshold value.
7. The method as claimed in claim 5, wherein a warning signal is output only if the ascertained movement vector of the respective object points in the direction of the vehicle and/or the direction of the ascertained driving tube.
8. The method as claimed in claim 5, wherein step e) comprises: ascertaining a future trajectory of the detected object on the basis of the ascertained position and the movement vector, wherein a warning signal is only output if the ascertained future trajectory, at at least one position, falls below a predetermined minimum distance from the ascertained driving tube and/or has a point of intersection with the ascertained driving tube.
9. The method as claimed in claim 1, wherein the received sensor signals exclusively comprise ultrasonic sensor signals.
10. The method as claimed in claim 1, wherein the number of different ascertainment methods comprises at least one second ascertainment method, in which a feature recognition is carried out on the basis of the number of sensor signals received at each point in time and a digital environment map is determined using recognized features.
11. The method as claimed in claim 1, wherein the method is carried out exclusively if the vehicle has a speed of less than or equal to 15 km/h.
12. A computer program product comprising instructions that, when the program is executed by a computer, cause said computer to perform the method as claimed in claim 1.
13. A driver assistance system for a vehicle comprising: a reception unit for receiving a drive state sensor signal, which indicates a drive state of the vehicle, at a number of different points in time, and for receiving a number of sensor signals, which indicate the environment of the vehicle, at a number of different points in time; a detection unit for detecting a number of objects in the environment of the vehicle on the basis of a first number of sensor signals, which have been received at a first point in time; an ascertainment unit for ascertaining a position and a movement vector for a detected object on the basis of the first number of sensor signals and a second number of sensor signals, which have been received at a second point in time following the first point in time, using a plurality of different ascertainment methods, wherein different ascertainment methods of the plurality have a different degree of computing complexity; and an output unit for outputting a warning signal if a potential collision of the vehicle with the detected object is ascertained on the basis of the drive state sensor signal received at a specified point in time and the position and the movement vector ascertained for the detected object.
14. A vehicle comprising a number of environmental sensor units for capturing an environment of the vehicle and for outputting a respective sensor signal, and having a driver assistance system as claimed in claim 13.
15. The vehicle as claimed in claim 14, wherein the environmental sensor units exclusively comprise ultrasonic sensors.
16. The vehicle as claimed in claim 14, wherein the vehicle has a mass of more than 2.5 tons and/or a length of more than 5 meters.
Description
[0076] Further advantageous configurations and aspects of the invention are the subject of the dependent claims and of the exemplary embodiments of the invention that are described below. The invention is explained in more detail below on the basis of preferred embodiments with reference to the accompanying figures.
[0084] Identical or functionally identical elements have been provided with the same reference signs in the figures, unless stated otherwise.
[0086] The driver assistance system 110 is designed, for example, as explained in more detail with reference to the accompanying figures.
[0088] In addition to the six ultrasonic sensors 130 shown here, which are physically present, additional virtual ultrasonic sensors (not shown) may be present. For example, a virtual ultrasonic sensor is based on the principle that a first ultrasonic sensor 130 emits an ultrasonic signal and a second ultrasonic sensor 130 receives a reflection of the ultrasonic signal emitted by the first ultrasonic sensor. For example, a virtual ultrasonic sensor has a virtual position between two physically present ultrasonic sensors 130.
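The virtual-sensor principle described in the preceding paragraph can be illustrated with a short sketch. This is an illustrative approximation and not part of the original disclosure; the function name, the sensor positions, and the midpoint/half-path approximation (exact only for an object on the perpendicular bisector between the two physical sensors) are assumptions:

```python
SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 degrees Celsius

def virtual_sensor_measurement(tx, rx, time_of_flight):
    """Approximate a bistatic echo (emitted by sensor `tx`, received by
    sensor `rx`) as a monostatic measurement from a virtual sensor located
    midway between the two physical sensors.

    tx, rx: (x, y) positions of the physical ultrasonic sensors in metres.
    time_of_flight: seconds between emission and reception of the echo.
    Returns (virtual_position, estimated_range).
    """
    # Total acoustic path: tx -> object -> rx.
    total_path = SPEED_OF_SOUND * time_of_flight
    # The midpoint between the two physical sensors acts as the virtual sensor.
    vx, vy = (tx[0] + rx[0]) / 2.0, (tx[1] + rx[1]) / 2.0
    # Half the total path approximates the range from the virtual sensor.
    est_range = total_path / 2.0
    return (vx, vy), est_range
```

For an object roughly centered between the transmitting and the receiving sensor, the half-path range closely matches the true distance from the virtual midpoint position.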
[0089] Two objects 210, 211 are located in the environment 200 of the vehicle 100. A first object 210, for example a cyclist, is located in the scanning regions 135, 136 of two ultrasonic sensors 130. The cyclist 210 is therefore detected in particular by two ultrasonic sensors. In addition, the cyclist can be detected by a virtual ultrasonic sensor as described above. A second object 211, for example a pedestrian, is located in the scanning region 132 of a single ultrasonic sensor 130. However, the pedestrian 211 can also be detected by a virtual ultrasonic sensor as described above.
[0090] To ascertain the position POS and the movement vector VEC of a detected object 210, 211, a plurality of different ascertainment methods V1, V2 with a different degree of computing complexity is used, as explained below.
[0092] The ultrasonic sensors 130 capture the environment 200 of the vehicle 100 and output respective ultrasonic sensor signals SIG1(t).
[0093] In the situation shown, the pedestrian 210 is moving toward the driving tube TR of the vehicle 100. The current distance D of the pedestrian 210 from the driving tube TR is also shown. The driver assistance system 110 is configured to output a warning signal depending on predetermined criteria. For example, it is checked whether the distance D of the pedestrian 210 from the current driving tube TR (alternatively from the vehicle 100) is less than or equal to a predetermined threshold value, or whether the ascertained movement vector VEC points in the direction of the driving tube TR or toward the vehicle 100. If one or more of these criteria are met, the warning signal is output, since a collision with the pedestrian 210 is then likely unless the vehicle 100 is stopped or changes direction.
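The warning criteria just described can be sketched as a simple check. This is an illustrative sketch, not the original implementation; the function name, the tuple representation of vectors, and the dot-product test for "points in the direction of the driving tube" are assumptions:

```python
def should_warn(distance_to_tube, movement_vec, direction_to_tube, threshold):
    """Return True if either warning criterion holds: the object is within
    `threshold` metres of the driving tube, or its movement vector has a
    positive component along `direction_to_tube`, the direction from the
    object toward the driving tube (or the vehicle)."""
    if distance_to_tube <= threshold:
        return True
    # A positive dot product means the movement vector points toward the tube.
    dot = (movement_vec[0] * direction_to_tube[0]
           + movement_vec[1] * direction_to_tube[1])
    return dot > 0.0
```

The first branch corresponds to the distance check against the threshold value; the dot product is one common way to formalize "the movement vector VEC points in the direction of the driving tube TR or toward the vehicle 100".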
[0094] In other words, the warning signal is output if a potential collision of the vehicle 100 with the detected object 210, 211 is ascertained on the basis of the drive state sensor signal SIG0(t) received at a specified point in time t0-t5 (see
[0096] On the basis of ultrasonic sensor signals SIG1(t) received at a number of points in time t0-t5, the position POS and the movement vector VEC of the pedestrian 210 are ascertained, and a future trajectory TR1 of the pedestrian 210 is ascertained on the basis of the ascertained position POS and the movement vector VEC.
[0097] A smallest distance between the driving tube TR and the future trajectory TR1 can be ascertained. If this smallest distance is less than a predetermined minimum distance, a warning signal is output, for example.
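With both the driving tube TR and the future trajectory TR1 represented as sampled point sequences, the smallest distance can be sketched as a brute-force pairwise comparison. This is an illustrative sketch; the point-list representation and the function name are assumptions:

```python
from math import hypot

def min_trajectory_distance(trajectory, driving_tube):
    """Smallest Euclidean distance between sampled points of the object's
    future trajectory TR1 and sampled points of the driving tube TR.
    Both arguments are non-empty lists of (x, y) tuples in metres."""
    return min(hypot(px - qx, py - qy)
               for (px, py) in trajectory
               for (qx, qy) in driving_tube)
```

A warning signal would then be output when the returned value falls below the predetermined minimum distance.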
[0099] The ascertainment of the position POS and the movement vector VEC at a particular time t0-t5 is preferably carried out both on the basis of the first ascertainment method V1 using a Kalman filter and on the basis of a further ascertainment method V2 having a different degree of computing complexity.
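The first ascertainment method V1, with one Kalman filter assigned per detected object to estimate the position POS and the movement vector VEC, can be sketched with a minimal constant-velocity filter. This is an illustrative sketch, not the patented implementation; the state layout, the noise matrices Q and R, and the class name are assumptions:

```python
import numpy as np

class ObjectTracker:
    """Minimal constant-velocity Kalman filter, one per detected object,
    estimating the state [x, y, vx, vy] (position POS and movement vector
    VEC) from noisy 2-D position measurements."""

    def __init__(self, x0, y0, dt=0.1):
        self.x = np.array([x0, y0, 0.0, 0.0])        # initial state
        self.P = np.eye(4)                           # state covariance
        self.F = np.array([[1, 0, dt, 0],            # constant-velocity model
                           [0, 1, 0, dt],
                           [0, 0, 1, 0],
                           [0, 0, 0, 1]], dtype=float)
        self.H = np.array([[1, 0, 0, 0],             # only position is measured
                           [0, 1, 0, 0]], dtype=float)
        self.Q = np.eye(4) * 0.01                    # process noise (assumed)
        self.R = np.eye(2) * 0.1                     # measurement noise (assumed)

    def step(self, zx, zy):
        # Predict the state forward by one time step.
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        # Update with the measured position z = (zx, zy).
        z = np.array([zx, zy])
        y = z - self.H @ self.x
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)
        self.x = self.x + K @ y
        self.P = (np.eye(4) - K @ self.H) @ self.P
        return self.x[:2], self.x[2:]                # POS, VEC
```

Each step fuses the newest position measurement with the constant-velocity prediction, so the movement vector VEC emerges from successive positions even though only positions are measured.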
[0102] Although the present invention has been described on the basis of exemplary embodiments, it may be modified in many ways.
LIST OF REFERENCE SIGNS
[0103] 100 vehicle
[0104] 110 driver assistance system
[0105] 112 reception unit
[0106] 114 detection unit
[0107] 116 ascertainment unit
[0108] 118 output unit
[0109] 120 sensor
[0110] 130 ultrasonic sensor
[0111] 131 scanning region
[0112] 132 scanning region
[0113] 133 scanning region
[0114] 134 scanning region
[0115] 135 scanning region
[0116] 136 scanning region
[0117] 200 environment
[0118] 210 object
[0119] 210(t0) object
[0120] 210(t1) object
[0121] 210(t2) object
[0122] 210(t3) object
[0123] 210(t4) object
[0124] 210(t5) object
[0125] 211 object
[0126] D distance
[0127] POS position
[0128] S1 method step
[0129] S2 method step
[0130] S3 method step
[0131] S4 method step
[0132] S5 method step
[0133] SIG0(t) drive state sensor signal
[0134] SIG1(t) sensor signal
[0135] t time
[0136] t0 point in time
[0137] t1 point in time
[0138] t2 point in time
[0139] t3 point in time
[0140] t4 point in time
[0141] t5 point in time
[0142] TR driving tube
[0143] TR1 trajectory
[0144] V1 ascertainment method
[0145] V2 ascertainment method
[0146] VEC movement vector
[0147] VEC(t1) movement vector
[0148] VEC(t2) movement vector
[0149] VEC(t3) movement vector
[0150] VEC(t4) movement vector
[0151] VEC(t5) movement vector