Method for Operating a Vehicle

20220388544 · 2022-12-08

    Abstract

    A method for operating a vehicle in an automatic driving operation, which requires no user action and can be deactivated by a deactivation action of a driver of the vehicle, includes, during a learning phase of the automatic driving operation, recording, by a surroundings recording device, driving situations in which the driver deactivates the automatic driving operation, and storing the recorded driving situations in a memory as subjectively critical driving situations. The method further includes, during an operating phase of the automatic driving operation, comparing a currently recorded driving situation to the stored subjectively critical driving situations and emitting a warning to the driver when the currently recorded driving situation matches one of the stored subjectively critical driving situations within a tolerance range.

    Claims

    1.-10. (canceled)

    11. A method for operating a vehicle in an automatic driving operation which requires no user action and can be deactivated by a deactivation action of a driver of the vehicle, comprising the steps of: during a learning phase of the automatic driving operation, recording, by a surroundings recording device, driving situations in which the driver deactivates the automatic driving operation, and storing the recorded driving situations in a memory as subjectively critical driving situations; during an operating phase of the automatic driving operation, comparing a currently recorded driving situation to the stored subjectively critical driving situations; and emitting a warning to the driver when the currently recorded driving situation matches one of the stored subjectively critical driving situations within a tolerance range.

    12. The method according to claim 11, wherein the subjectively critical driving situations are stored in the vehicle or on a remote server.

    13. The method according to claim 11, wherein the subjectively critical driving situations are stored specifically to the driver and/or specifically to a location.

    14. The method according to claim 11, wherein, when recording the driving situations, a driving track and objects in the surroundings of the vehicle are recorded.

    15. The method according to claim 11, wherein the surroundings recording device comprises one or more cameras, radar sensors, Lidar sensors and/or ultrasound sensors.

    16. The method according to claim 11, wherein sensor data recorded by the surroundings recording device are divided into interest regions, wherein a first one of the interest regions is an ego lane in which the vehicle moves, wherein a second one of the interest regions is a left lane and/or a right lane adjacent to the ego lane, wherein movement data of all objects perceived in the interest regions are calculated, and wherein a critical object is identified which moves into a safety corridor inside the ego lane.

    17. The method according to claim 16, wherein, based on the sensor data recorded by the surroundings recording device, at least one of the following variables is calculated for at least one or each of the objects: a) a time which is necessary for the object to reach the safety corridor when trajectories of the vehicle and the object intersect; b) a time which is necessary for the vehicle to cover a longitudinal distance to the object; c) a time which is necessary for the vehicle to reach a point at which the object reaches the safety corridor less a time necessary for the object to reach the point; and d) a longitudinal distance between the vehicle and the object when a limit of the safety corridor is exceeded.

    18. The method according to claim 17, wherein an object with the lowest time a) or c) is identified as a most critical object.

    19. The method according to claim 18, wherein, as soon as the critical object or the most critical object is identified, fuzzy logic is used to predict whether the driver perceives a higher or lower degree of complexity.

    20. The method according to claim 11, wherein the comparing is performed by a majority voting mechanism.

    21. The method according to claim 19, wherein, based on the prediction and the comparing, a trust percentage is calculated, wherein an adaptive benchmark that is adjustable by the driver determines whether the trust percentage is high enough to warn the driver of a critical driving situation.

    Description

    BRIEF DESCRIPTION OF THE DRAWINGS

    [0023] FIG. 1 is a schematic view of a driving situation, and

    [0024] FIG. 2 is a schematic view of a driving situation and a workflow for ascertaining a criticality of the driving situation.

    DETAILED DESCRIPTION OF THE DRAWINGS

    [0025] Parts corresponding to one another are provided with the same reference numerals in all figures.

    [0026] FIG. 1 shows a schematic view of a driving situation 1 with a vehicle 2, which is driving in an ego lane 3 of a driving track 4. In front of the vehicle 2 in the driving direction, a safety corridor 5 is defined, which the vehicle 2 will presumably travel along. The safety corridor 5 is wider than the vehicle 2, and its width can vary with the distance to the vehicle 2. An object 7, for example a further vehicle, is travelling in a lane 6 adjacent to the ego lane 3, at a longitudinal distance X and a transverse distance Y from the front edge, in the driving direction, of the vehicle 2. The object 7 moves with a speed V having longitudinal and transverse components toward the ego lane 3, such that it enters the safety corridor 5 after a time ΔT. The vehicle 2 has sensors for recognising the driving situation 1, in particular the driving track 4 and objects 7. Such sensors can comprise one or more cameras, radar sensors, Lidar sensors and/or ultrasound sensors.
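
    The geometry of FIG. 1 can be illustrated in code. The following sketch is not from the patent; the function name and parameters are my own, and it assumes constant speeds and an object measured relative to the vehicle centreline.

```python
# Hypothetical sketch of the FIG. 1 geometry: how long until the object 7,
# closing laterally at spd_y, crosses the limit of the safety corridor 5.
# Assumes constant speeds and distances measured from the vehicle centreline.

def time_to_enter_corridor(dist_y, corridor_half_width, spd_y):
    """Time until the object's lateral offset shrinks to the corridor edge.

    dist_y: lateral distance of the object from the vehicle centreline (m)
    corridor_half_width: half the safety-corridor width (m)
    spd_y: lateral speed of the object toward the corridor (m/s)
    Returns None when the object is not moving toward the corridor.
    """
    gap = dist_y - corridor_half_width
    if gap <= 0:
        return 0.0          # already inside the corridor
    if spd_y <= 0:
        return None         # moving away from, or parallel to, the corridor
    return gap / spd_y

delta_t = time_to_enter_corridor(dist_y=2.5, corridor_half_width=1.3, spd_y=0.6)
print(delta_t)  # → 2.0 (seconds until the corridor limit is crossed)
```

    The `None` return for objects moving away mirrors the later table's note that TT_cross_border is only considered when the trajectories intersect.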

    [0027] FIG. 2 shows a schematic view of a driving situation 1 with a vehicle 2, which drives along an ego lane 3 of a driving track 4. Furthermore, a workflow is depicted for determining a criticality of the driving situation 1.

    [0028] The criticality of the driving situation 1 can be ascertained by means of a subjective complexity model.

    [0029] The proposed model is based on the vehicle 2 having a driver assistance system which is able to take over longitudinal and transverse control of the vehicle 2 simultaneously, without the driver having to keep their hands on the steering wheel. Typically, this is only possible with advanced level 2 or level 3 systems, since the driver retains complete or partial longitudinal and transverse control at lower automation levels. The aim of the model is to predict when the complexity of the driving situation 1 in the surroundings reaches a point at which the driver has the impression that their intervention is required. When the driver intervenes, the driver subjectively decides that, from their point of view, the complexity of the driving situation 1 is too high to trust the driver assistance system to handle the driving situation. In this case, the driver assistance system stores the sensor data of this driving situation 1, or data characterising the driving situation, in a database DB. The database DB can be located in the vehicle 2 or on an external server, to which a communication connection is established via radio.

    [0030] FIG. 2 shows a workflow for ascertaining a criticality of the driving situation 1 by means of the driver assistance system. Sensor data are divided into the interest regions of left lane 6.1, ego lane 3 and right lane 6.2. Movement data of all objects 7 perceived in these interest regions are calculated, whereupon the critical objects are identified, for example the object 7 shown in FIG. 1, which moves into the safety corridor 5. The criticality is calculated based on previously stored data.

    [0031] Sensor data relating to the objects 7 in the surroundings are received and divided into the three possible interest regions, left lane 6.1, ego lane 3 and right lane 6.2. The received raw data include transverse and longitudinal positions and speeds V of the objects 7 in relation to the vehicle 2. This makes it possible to easily calculate the relative speed between each of the objects 7 and the vehicle 2.
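
    As an illustration only, the division into interest regions and the relative-speed calculation of paragraph [0031] could look as follows. The lane width, the sign convention (positive lateral offset to the left) and all names are my assumptions, not taken from the patent.

```python
# Illustrative sketch: classify perceived objects into the three interest
# regions (left lane 6.1, ego lane 3, right lane 6.2) by lateral offset,
# and derive the longitudinal relative speed. All constants are assumed.

LANE_WIDTH = 3.5  # assumed lane width in metres

def interest_region(dist_y):
    """Classify by lateral offset; positive dist_y is assumed to be left."""
    if dist_y > LANE_WIDTH / 2:
        return "left lane"
    if dist_y < -LANE_WIDTH / 2:
        return "right lane"
    return "ego lane"

def relative_speed(obj_spd_x, ego_spd_x):
    """Longitudinal speed of the object relative to the ego vehicle."""
    return obj_spd_x - ego_spd_x

objects = [
    {"id": 1, "dist_y": 3.4, "spd_x": 24.0},   # neighbouring lane
    {"id": 2, "dist_y": 0.2, "spd_x": 19.0},   # same lane, slower
]
for obj in objects:
    print(obj["id"], interest_region(obj["dist_y"]),
          relative_speed(obj["spd_x"], ego_spd_x=22.0))
```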

    [0032] In an exemplary embodiment of the proposed driver assistance system, six objects 7 can be perceived, a maximum of two per interest region 3, 6.1, 6.2. In other embodiments, higher numbers of objects 7 can be perceived and their positions and speeds V analysed.

    [0033] Based on sensor data, for example radar data, the following kinematic variables are calculated in a step S1:

    [00001]

        TT_cross_border = (Dist_y − (0.5·Width_y + Buffer)) / Spd_y   (1)

        TT_headway = Dist_x / RelSpd   (2)

        TTC_cross_border = TT_headway − TT_cross_border   (3)

        Dist_cross_border = TT_cross_border · RelSpd   (4)
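
    Equations (1) to (4) can be sketched directly in code. The variable names follow the table below; the function itself and the example values are my own, and the sketch omits guards for a zero Spd_y or RelSpd.

```python
# Hedged sketch of equations (1)-(4). Names follow the patent's table;
# the function, signature and example values are assumptions for illustration.

def kinematic_variables(dist_x, dist_y, spd_y, width_y, buffer, rel_spd):
    tt_cross_border = (dist_y - (0.5 * width_y + buffer)) / spd_y   # (1)
    tt_headway = dist_x / rel_spd                                   # (2)
    ttc_cross_border = tt_headway - tt_cross_border                 # (3)
    dist_cross_border = tt_cross_border * rel_spd                   # (4)
    return tt_cross_border, tt_headway, ttc_cross_border, dist_cross_border

# Example: object 30 m ahead, 3 m to the side, 2 m wide, closing laterally
# at 0.5 m/s, with a 0.5 m buffer and a 5 m/s closing longitudinal speed.
tt, th, ttc, dc = kinematic_variables(30.0, 3.0, 0.5, 2.0, 0.5, 5.0)
print(tt, th, ttc, dc)  # → 3.0 6.0 3.0 15.0
```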

    [0034] Overall, ten kinematic variables, for example, are taken into consideration with the available sensors, for example the following variables:

    TABLE-US-00001

    Dist_x: Longitudinal distance X of the front edge of the vehicle 2 to the rear end of the object 7

    Dist_y: Transverse distance Y of the front edge of the vehicle 2 to the rear end of the object 7

    Spd_x: Longitudinal speed of the object 7 in relation to the longitudinal axis of the vehicle 2

    Spd_y: Transverse speed of the object 7 in relation to the transverse axis of the vehicle 2

    EgoSpd_x: Speed V of the vehicle 2 along its longitudinal axis

    RelSpd: Difference of the longitudinal speeds of the object 7 and the vehicle 2

    TT_cross_border: The time necessary for the object 7 to reach the safety corridor 5 of the vehicle 2; only taken into consideration when the trajectories of the vehicle 2 and the object 7 intersect

    TT_headway: The time necessary for the vehicle 2 to cover the longitudinal distance Dist_x to the object 7

    TTC_cross_border: The time necessary for the vehicle 2 to reach the point at which the object 7 reaches the safety corridor 5, less the time necessary for the object 7 to reach this point; this measure represents the time until the collision when the limit of the safety corridor 5 is exceeded

    Dist_cross_border: The longitudinal distance between the vehicle 2 and the object 7 when the limit of the safety corridor 5 is exceeded

    [0035] Each relevant object 7 in the three lanes 3, 6.1, 6.2 has a separate set of these variables. Further variables can include the angle of intersection and the time at which the object 7 will leave the safety corridor 5.

    [0036] After calculating the kinematic variables, the critical object 7 is identified in a step S2, for example the object 7 shown in FIG. 1. For example, the minimum time TTC_cross_border over all recognised objects 7 is used as the comparison measure for the objective complexity (see equation (3)). More than one critical object 7 can also be taken into consideration.
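
    The selection in step S2 reduces to taking the minimum over the computed times. A minimal sketch, with my own naming, that also respects the table's note that TTC_cross_border only exists when the trajectories intersect (represented here as `None`):

```python
# Sketch of step S2: the most critical object is the one with the minimum
# TTC_cross_border; objects whose trajectory never crosses the safety
# corridor carry None and are ignored. Naming is illustrative only.

def most_critical(objects):
    """objects: list of (object_id, ttc_cross_border) pairs."""
    candidates = [obj for obj in objects if obj[1] is not None]
    if not candidates:
        return None            # no object threatens the safety corridor
    return min(candidates, key=lambda obj: obj[1])

print(most_critical([(1, 4.2), (2, None), (3, 1.7)]))  # → (3, 1.7)
```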

    [0037] As soon as a critical object 7, or the most critical object 7, is identified, the driver assistance system uses fuzzy logic to predict whether the driver perceives a high or a low degree of subjective complexity. All kinematic variables can be used for the prediction. A majority voting mechanism determines the comparability of stored driving situations with the current driving situation 1. Based on the prediction and the comparison of the stored driving situations with the current driving situation 1, a trust percentage is calculated in a step S5. An adaptive benchmark which can be adjusted by the driver determines, in a step S3, whether the trust percentage is high enough to warn the driver of a critical driving situation; the adaptive benchmark thus determines whether the driver assistance system is sensitive or reserved with its warnings. Every time the driver takes back control of the vehicle 2 during the learning phase, the current driving situation 1 is recorded and stored in the database DB in a step S4. The stored data comprise the current constellation, that is to say both the kinematic variables of the objects 7 in the surroundings and the constellation of the vehicle 2 shortly before; the moment that counts as shortly before depends on the reaction times of the driver. Since the driving surroundings can change considerably with the type of driving track 4 and national regulations, the data, and the driving culture they reflect, can be made fully adaptive to the surroundings. The preferences of the driver in terms of the warning can also vary over time and can be compared to those of other drivers. As more and more data are recorded, the database DB adapts over time, such that it can be personalised to each driver.
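
    The decision chain of paragraph [0037] can be sketched as follows. The tolerance, the equal weighting of vote ratio and predicted complexity, and all names are assumptions of mine; the patent specifies neither the voting rule nor how the trust percentage is combined.

```python
# Illustrative sketch of steps S5 and S3: majority vote of stored
# subjectively critical situations against the current one, combined with
# a fuzzy-style complexity prediction (0..1) into a trust percentage,
# then compared against a driver-adjustable benchmark. All thresholds
# and the combination rule are assumptions, not taken from the patent.

def matches(stored, current, tol=0.15):
    """A stored situation 'votes' critical when every kinematic variable
    agrees with the current one within a relative tolerance."""
    return all(abs(s - c) <= tol * max(abs(s), 1e-6)
               for s, c in zip(stored, current))

def trust_percentage(stored_situations, current, predicted_complexity):
    """Equal-weight combination of vote ratio and predicted complexity."""
    votes = sum(matches(s, current) for s in stored_situations)
    ratio = votes / len(stored_situations) if stored_situations else 0.0
    return 100.0 * (0.5 * ratio + 0.5 * predicted_complexity)

def should_warn(trust, benchmark):
    """benchmark: driver-adjustable threshold; higher means fewer warnings."""
    return trust >= benchmark

# Stored (TT_cross_border, TT_headway) pairs from the learning phase:
db = [(3.0, 6.0), (2.9, 6.2), (8.0, 1.0)]
trust = trust_percentage(db, current=(3.1, 6.1), predicted_complexity=0.8)
print(round(trust, 1), should_warn(trust, benchmark=60.0))  # → 73.3 True
```

    Raising the benchmark makes the system more reserved, lowering it makes it more sensitive, which mirrors the driver-adjustable behaviour described above.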

    [0038] The particular advantage of the present exemplary embodiment is that driving situations are identified which the driver subjectively perceives as critical and which they do not trust the driver assistance system to handle. Here, such driving situations are identified as subjectively critical in which the driver ends the automatic driving operation by taking over driving of the vehicle. The identified driving situations are stored for later use. In the future automatic driving operation, the driving situations currently recorded in each case are compared to the stored situations and, when a sufficient agreement is established, the driver is informed about this by a warning being emitted. The driver is thus warned of the emergence of a driving situation which, from their point of view, is critical. Thus, the warning threshold is adjusted to the needs of the driver.