System for a vehicle

11273835 · 2022-03-15

Assignee

Inventors

CPC classification

International classification

Abstract

A system for a vehicle, having a driver monitoring device with a sensor device and with a first camera, an environment detection device with sensors for surroundings acquisition, a control unit and a classifier. Using the environment detection device and the classifier, traffic scenarios A and B may be classified. Using the driver monitoring device, it is possible to determine whether a direction of view of the driver is directed towards a traffic situation and whether the driver has their hands on a steering wheel of the vehicle. Using the control unit, it is possible, on the basis of information from the driver monitoring device and the environment detection device, to determine whether the driver is capable of resolving the traffic scenario classified as A or B within a response time.

Claims

1. A system for a vehicle, the system comprising: a driver monitoring device; a sensor device being part of the driver monitoring device, the sensor device is configured to detect whether at least one hand of a driver of the vehicle is in contact with a steering wheel of the vehicle, and to generate corresponding contact data; a first camera, and the first camera is configured to detect whether a direction of view of the driver is directed towards a traffic situation in which the vehicle is situated and to generate corresponding direction of view data; an environment detection device with sensors for surroundings acquisition, such that the sensors for surroundings acquisition are configured to detect lane information and object information in an external environment of the vehicle and to detect objects in a region in front of the vehicle; a control unit; and a classifier, and the sensors for surroundings acquisition are configured to acquire a traffic scenario from the detected objects, lane information and object information and transmit it to the classifier; wherein the classifier is configured to identify, on the basis of the traffic scenario, whether a system limit is reached within a response time (t.sub.L) and, if a system limit is identified, to classify the traffic scenario into a traffic scenario A and a traffic scenario B, wherein the driver of the vehicle may better resolve the traffic scenario A if the direction of view of the driver is directed towards the traffic situation in which the vehicle is situated, and wherein the driver of the vehicle may better resolve the traffic scenario B if at least one hand of the driver is in contact with the steering wheel; wherein the control unit is configured, in the case of an identified system limit and a traffic scenario classified as A or B, to evaluate on the basis of the contact data or the direction of view data whether the driver is capable of resolving the traffic scenario A or B within the response time 
(t.sub.L), wherein the control unit is configured to compare an estimated response time (t.sub.1A, t.sub.1B, t.sub.2A or t.sub.2B) of the driver with the response time (t.sub.L) until the system limit is reached.

2. The system of claim 1, wherein the estimated response time of the driver is stored as a constant and the control unit may access the constant.

3. The system of claim 1, wherein the control unit is configured to determine continuously the estimated response time (t.sub.1A, t.sub.1B, t.sub.2A or t.sub.2B) of the driver on the basis of the traffic scenario classified as A and the direction of view data or on the basis of the traffic scenario classified as B and the contact data.

4. The system of claim 1, wherein the control unit is configured, in the case of an identified system limit and a traffic scenario classified as A, to draw the conclusion that the driver is capable of resolving the traffic scenario A within the response time (t.sub.L), if the direction of view data include the fact that the direction of view of the driver is directed towards the traffic situation in which the vehicle is located, and to generate a take-over request to the driver, and to draw the conclusion that the driver is not capable of resolving the traffic scenario A within the response time (t.sub.L), if the direction of view data include the fact that the direction of view of the driver is not directed towards the traffic situation in which the vehicle is located, and to initiate a minimum risk maneuver of the vehicle.

5. The system of claim 1, wherein the control unit is configured, in the case of an identified system limit and a traffic scenario classified as B, to draw the conclusion that the driver is capable of resolving the traffic scenario B within the response time (t.sub.L), if the contact data include the fact that at least one hand of the driver of the vehicle is in contact with the steering wheel of the vehicle, and to generate a take-over request to the driver, and to draw the conclusion that the driver is not capable of resolving the traffic scenario B within the response time (t.sub.L), if the contact data include the fact that neither hand of the driver of the vehicle is in contact with the steering wheel of the vehicle, and to initiate a minimum risk maneuver of the vehicle.

6. The system of claim 1, the sensor device further comprising a capacitance sensor.

7. The system of claim 1, wherein the sensor device is configured to measure a torque from the hand of the driver acting on the steering wheel.

8. The system of claim 1, wherein the traffic scenario A involves the filtering-in of a further vehicle in the region ahead of the vehicle or the merging of two directly adjacent lanes in the same direction of travel in the region ahead of the vehicle.

9. The system of claim 1, wherein the traffic scenario B includes a sharp bend, a discrepancy between the course of a bend and a steering angle of the vehicle or a sudden extreme change in road conditions.

10. A vehicle comprising a system, the system further comprising: a driver monitoring device; a sensor device being part of the driver monitoring device, the sensor device is configured to detect whether at least one hand of a driver of the vehicle is in contact with a steering wheel of the vehicle, and to generate corresponding contact data; a first camera, and the first camera is configured to detect whether a direction of view of the driver is directed towards a traffic situation in which the vehicle is situated and to generate corresponding direction of view data; an environment detection device with sensors for surroundings acquisition, such that the sensors for surroundings acquisition are configured to detect lane information and object information in an external environment of the vehicle and to detect objects in a region in front of the vehicle; a control unit; and a classifier, and the sensors for surroundings acquisition are configured to acquire a traffic scenario from the detected objects, lane information and object information and transmit it to the classifier; wherein the classifier is configured to identify, on the basis of the traffic scenario, whether a system limit is reached within a response time (t.sub.L) and, if a system limit is identified, to classify the traffic scenario into a traffic scenario A and a traffic scenario B, wherein the driver of the vehicle may better resolve the traffic scenario A if the direction of view of the driver is directed towards the traffic situation in which the vehicle is situated, and wherein the driver of the vehicle may better resolve the traffic scenario B if at least one hand of the driver is in contact with the steering wheel; wherein the control unit is configured, in the case of an identified system limit and a traffic scenario classified as A or B, to evaluate on the basis of the contact data or the direction of view data whether the driver is capable of resolving the traffic scenario A or B within the 
response time (t.sub.L), wherein the control unit is configured to compare an estimated response time (t.sub.1A, t.sub.1B, t.sub.2A or t.sub.2B) of the driver with the response time (t.sub.L) until the system limit is reached.

11. A method, comprising the steps of: providing a driver monitoring device; providing a sensor device being part of the driver monitoring device; providing a first camera; providing an environment detection device with sensors for surroundings acquisition; providing a control unit; and providing a classifier; detecting with the sensor device whether at least one hand of a driver of the vehicle is in contact with a steering wheel of the vehicle, and generating corresponding contact data; detecting with the first camera whether a direction of view of the driver is directed towards a traffic situation in which the vehicle is situated and generating corresponding direction of view data; detecting, with the environment detection device with sensors for surroundings acquisition, lane information and object information in an external environment of the vehicle and detecting objects in a region in front of the vehicle; acquiring, with the environment detection device with sensors for surroundings acquisition, a traffic scenario from the detected objects, lane information and object information and transmitting it to the classifier; identifying with the classifier, on the basis of the traffic scenario, whether a system limit is reached within a response time (t.sub.L) and, if a system limit is identified, classifying the traffic scenario into a traffic scenario A and a traffic scenario B, wherein the driver of the vehicle may better resolve the traffic scenario A if the direction of view of the driver is directed towards the traffic situation in which the vehicle is situated, and wherein the driver of the vehicle may better resolve the traffic scenario B if at least one hand of the driver is in contact with the steering wheel; evaluating with the control unit, in the case of an identified system limit and a traffic scenario classified as A or B, on the basis of the contact data or the direction of view data whether the driver is capable of resolving the traffic scenario A or B within the response time (t.sub.L); comparing, with the control unit, an estimated response time (t.sub.1A, t.sub.1B, t.sub.2A or t.sub.2B) of the driver with the response time (t.sub.L) until the system limit is reached.

Description

BRIEF DESCRIPTION OF THE DRAWINGS

(1) Exemplary embodiments of the invention are explained in greater detail below with reference to schematic drawings, in which:

(2) FIG. 1 is a side view of a vehicle having an exemplary embodiment of a system according to the invention,

(3) FIG. 2 is a plan view of a first traffic scenario A,

(4) FIG. 3 is a plan view of a second traffic scenario A,

(5) FIG. 4 is a plan view of a first traffic scenario B,

(6) FIG. 5 is a plan view of a second traffic scenario B,

(7) FIG. 6 is a functional representation of the system according to FIG. 1 and

(8) FIG. 7 is a flow chart of a calculation of response times of the driver for a function according to FIG. 6.

DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS

(9) The following description of the preferred embodiment(s) is merely exemplary in nature and is in no way intended to limit the invention, its application, or uses.

(10) FIG. 1 shows a vehicle 1 in the form of an automobile. The vehicle 1 comprises an exemplary embodiment of a system according to the invention with a driver monitoring device and with an environment detection device. The driver monitoring device comprises a sensor device 2 within a steering wheel 3 and a first camera 4. The environment detection device comprises a second camera 5 and a radar device 6. Furthermore, the system comprises a control unit 7 and a classifier 8. In addition, a driving function 9, for example a propulsion, steering or braking system of the vehicle 1, may be controlled by the control unit 7.

(11) The driving function 9 may be automatically operated, for example at SAE Level 3 or higher, whereby a driver 10 of the vehicle 1 is enabled to turn their attention away from the traffic for a certain time. It may then arise that the vehicle 1 is situated in a traffic situation which may be classified as a traffic scenario A or B. FIGS. 2 and 3 each show an example of a traffic scenario A, and FIGS. 4 and 5 each show an example of a traffic scenario B.

(12) In FIG. 2 the vehicle 1, depicted in highly simplified manner from above, is driving in a lane 11 behind a second vehicle 12. A third vehicle 13 suddenly filters in between the vehicle 1 and the second vehicle 12. In this traffic situation, a system limit is reached after a response time has elapsed. If the system's own reaction would come too late, the driver 10 of the vehicle 1 must brake within the response time, or the system must initiate an appropriate minimum risk maneuver, in order to avoid a collision with the third vehicle 13. This traffic scenario A may be optically or visually identified and dealt with by the driver 10 of the vehicle 1, if the direction of view of the driver 10 is directed towards the traffic situation in which the vehicle 1 according to FIG. 2 is situated. On the other hand, the driver 10 of the vehicle 1 does not experience any feedback via the steering wheel 3 in the traffic situation shown in FIG. 2, and it is therefore very difficult to ensure timely intervention by the driver 10 unless the direction of view of the driver 10 is directed towards the traffic situation.

(13) In FIG. 3 the vehicle 1, depicted in highly simplified manner from above, is driving in a first lane 14, which narrows and merges with a second lane 15 traveling in the same direction. Ahead of the vehicle 1, (still) in the first lane 14, there is situated a second vehicle 16, which filters into the second lane 15 between a third vehicle 17 and a fourth vehicle 18. As the vehicle 1 proceeds, it too must filter into the second lane 15 due to the narrowing of the first lane 14. Were the vehicle 1 automatically to follow the second vehicle 16, this might lead to a side-on collision, for example with the fourth vehicle 18. In this traffic situation a system limit is reached after a response time has elapsed. To avoid the side-on collision with the fourth vehicle 18, the driver 10 of the vehicle 1 would themselves have to carry out the filtering within the response time, or the system would have to initiate an appropriate minimum risk maneuver. This traffic scenario may be readily resolved by a driver 10 who is looking ahead with their view directed towards the traffic. If the driver 10 of the vehicle 1 has at least one hand on the steering wheel 3, sideways movement of the vehicle 1 or of the steering wheel 3 will be perceptible via the at least one hand of the driver 10 on the steering wheel 3, a function which avoids the side-on collision thereby being necessary.

(14) These traffic scenarios A according to FIGS. 2 and 3 are better resolved by the driver 10 of the vehicle 1 if the driver 10 is observing the traffic, a fact which may be detected by the first camera 4 identifying whether the direction of view of the driver 10 is directed towards the traffic situation in which the vehicle 1 according to FIG. 2 or 3 is situated. On the other hand, in the present traffic scenarios A the driver 10 who, despite having their hands on the steering wheel 3, turns their gaze away from the traffic, might grasp and deal with the traffic scenario less well.

(15) In FIG. 4 the vehicle 1, depicted in highly simplified manner from above, is driving in a lane 19 with a sharp bend 20. If a maximum steering torque of the vehicle 1 is insufficient to take the sharp bend 20, the vehicle 1 may drift away from the lane 19. In this traffic situation a system limit is reached after a response time has elapsed. To prevent the vehicle 1 from drifting away from the lane 19, the driver 10 of the vehicle 1 has to intervene actively within the response time in the steering of the vehicle 1 or the system has to initiate an appropriate minimum risk maneuver. In this case, the limitation of the steering torque may be identified at a very early point by the driver 10 of the vehicle 1 via the steering wheel 3 and timely intervention by the driver 10 is possible provided the driver 10 has at least one of their hands on the steering wheel 3. In contrast, it is unlikely that association of the curvature of the sharp bend 20 with a correspondingly necessary torque is well resolved optically or visually by the driver 10 of the vehicle 1, making timely intervention by the driver 10 barely possible.

(16) In FIG. 5 the vehicle 1, depicted in highly simplified manner from above (position of the vehicle 1 shown on the left), is driving in a lane 21 with a bend 22, which does not have to be as sharp as the bend according to FIG. 4. To take the bend 22, the vehicle 1 needs to be steered to the right. If a sensor (not shown) of the system identifies steering movement of the vehicle 1 to the left in the region of the bend 22 (discrepancy between necessary and actual steering movement), this has the consequence that the vehicle 1 immediately takes an incorrect course and may for example move into the position of the vehicle 1 shown on the right in FIG. 5. In this traffic situation a system limit is reached after a response time has elapsed.

(17) In this situation it is necessary for the driver 10 of the vehicle 1 to intervene actively within the response time in the steering movement of the vehicle 1 by turning the steering wheel 3 to the right, or the system initiates a corresponding minimum risk maneuver. In this case, a sudden implausible steering movement of the vehicle 1 is perceptible via the steering wheel 3 of the vehicle at the instant at which the implausible steering movement is performed. Timely intervention by the driver 10 is possible in this case, provided that the driver 10 has at least one of their hands on the steering wheel 3. In contrast, sudden implausible steering movements may only be perceived optically or visually when the steering movement of the vehicle 1 has been performed. Timely intervention by the driver 10 is therefore barely possible if the driver 10 does not have at least one hand on the steering wheel 3.

(18) It may moreover arise that the lane 21 has ruts (not shown), that the lane 21 is icy (black ice) or that strong side winds are acting on the vehicle 1 while it drives along the lane 21. In such traffic situations a system limit is reached after a response time has elapsed, wherein the system limit may optionally already have been reached beforehand. The driver 10 has to adapt the driving behavior of the vehicle 1 accordingly, or the system must initiate an appropriate minimum risk maneuver to avoid an accident situation. The conditions on the lane 21 are felt by the driver 10 of the vehicle 1 via the steering wheel 3. This makes it possible for the driver 10 to adapt the driving behavior of the vehicle 1 or to intervene in critical situations. On the other hand, the above-described conditions on the lane 21 are difficult to perceive optically or visually.

(19) These traffic scenarios B according to FIGS. 4 and 5 are better resolved by the driver 10 of the vehicle 1 if the driver 10 has at least one of their hands on the steering wheel 3, a fact which is detected by the sensor device 2. On the other hand, in the present traffic scenarios B, a driver 10 who, despite looking at the traffic, does not have their hands on the steering wheel 3, might grasp the traffic scenario B and thus the system limits less well.

(20) Functions of the system of vehicle 1 according to FIG. 1 are explained in greater detail below in connection with FIGS. 6 and 7.

(21) By means of the second camera 5 and the radar device 6, it is possible in a first step 100 to analyze an environment in which the vehicle 1 is currently situated. By means of the classifier 8, the previously analyzed environment may then be classified in a classification step 200 into traffic scenarios A (result 201) and B (result 202).

(22) To this end, the second camera 5 may detect lane and object information in an external environment of the vehicle 1. The radar device 6 may moreover detect objects in a region in front of the vehicle 1. Corresponding data from the second camera 5 and the radar device 6 (objects, course of a lane and bend etc.) is supplied preprocessed to the classifier 8. On the basis of the environment data, the classifier 8 may classify whether a system limit is reached after a response time t.sub.L has elapsed. The response time t.sub.L is in this case a maximum possible response time before the system limit is reached. If a system limit is detected, the system classifies whether the traffic scenario is A (result 201) or B (result 202).
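The classification step 200 described above may be sketched as follows. This is a minimal, hypothetical illustration only: the class `EnvironmentData`, its fields and the rule-based mapping are assumptions for the purpose of exposition and do not appear in the patent, which leaves the internal operation of the classifier 8 open.

```python
from dataclasses import dataclass
from enum import Enum


class Scenario(Enum):
    A = "A"        # better resolved with the driver's view on the traffic (results 201)
    B = "B"        # better resolved with at least one hand on the steering wheel (result 202)
    NONE = "NONE"  # no system limit identified


@dataclass
class EnvironmentData:
    """Hypothetical preprocessed data from the second camera 5 and the radar device 6."""
    cut_in_detected: bool           # a vehicle filtering in ahead (FIG. 2)
    lanes_merging: bool             # two directly adjacent lanes merging (FIG. 3)
    curvature_exceeds_torque: bool  # bend sharper than the maximum steering torque allows (FIG. 4)
    steering_discrepancy: bool      # discrepancy between bend course and steering angle (FIG. 5)
    road_condition_change: bool     # ruts, black ice or strong side winds


def classify(env: EnvironmentData) -> Scenario:
    """Classification step 200: map the environment data to scenario A or B."""
    if env.cut_in_detected or env.lanes_merging:
        return Scenario.A
    if env.curvature_exceeds_torque or env.steering_discrepancy or env.road_condition_change:
        return Scenario.B
    return Scenario.NONE
```

In practice the classifier 8 could equally be a learned model; the rule-based mapping merely mirrors the scenario examples of FIGS. 2 to 5.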

(23) In combination with the results 201 and 202 of the classification 200, the control unit 7 may, in a further step 300, use the first camera 4 and the sensor device 2 to determine the driver's condition, this being shown in detail in FIG. 7.

(24) The first camera 4 may in the process detect if a direction of view of the driver 10 is directed towards a traffic situation (result 301) in which the vehicle 1 is situated, and generate corresponding direction of view data. The sensor device 2 may for example detect by means of a capacitance sensor or a torque sensor (neither of which is shown) if at least one hand of the driver 10 of the vehicle 1 is in contact with the steering wheel 3 of the vehicle 1 (result 302), and generate corresponding contact data.

(25) Based on the results 301 and 302 of the analysis 300 of the driver's condition and the results 201 and 202 of the classification 200, a respective estimated response time of the driver 10 may be continuously determined. The following four different response times are obtained: a first response time t.sub.1A for a direction of view of the driver 10 directed towards the traffic situation (result 301) and for a traffic scenario classified as A (result 201); a second response time t.sub.1B for a direction of view of the driver 10 directed towards the traffic situation (result 301) and for a traffic scenario classified as B (result 202); a third response time t.sub.2A for at least one hand of the driver 10 detected on the steering wheel 3 (result 302) and for a traffic scenario classified as A (result 201); and a fourth response time t.sub.2B for at least one hand of the driver 10 detected on the steering wheel 3 (result 302) and for a traffic scenario classified as B (result 202).

(26) As an alternative to the above-described continuous evaluation (which allows extension to multiple classes and dynamic adaptation), the response times may also be saved as predefined constants.
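The selection of the estimated response time from the driver-condition results 301/302 and the classification results 201/202 may be sketched as follows, here using the constant-based variant of paragraph 26. The numeric values, the function name and the fallback for an unknown driver condition are assumptions for illustration; the patent does not specify concrete values or the continuous estimation method.

```python
# Hypothetical estimated response times of the driver, stored as predefined
# constants (all values in seconds, assumed for illustration). A continuous
# estimation as in paragraph 25 would update these values at runtime instead.
T_1A = 1.0  # view on traffic (result 301), scenario A (result 201)
T_1B = 2.5  # view on traffic (result 301), scenario B (result 202)
T_2A = 2.5  # hand on wheel (result 302), scenario A (result 201)
T_2B = 1.0  # hand on wheel (result 302), scenario B (result 202)


def estimated_response_time(scenario: str, view_on_traffic: bool, hand_on_wheel: bool) -> float:
    """Select t_1A, t_1B, t_2A or t_2B from the results of steps 200 and 300."""
    if view_on_traffic:
        return T_1A if scenario == "A" else T_1B
    if hand_on_wheel:
        return T_2A if scenario == "A" else T_2B
    # Driver condition unknown: assume no timely take-over is possible.
    return float("inf")
```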

(27) In the following comparison steps 401, 402, 403 and 404 respectively, the respectively determined response time t.sub.1A, t.sub.1B, t.sub.2A or t.sub.2B is compared by the control unit 7 with the maximum possible response time t.sub.L before the system limit is reached. Both the determined response times t.sub.1A, t.sub.1B, t.sub.2A and t.sub.2B and the response time t.sub.L before the system limit is reached are periods of time.

(28) If it is identified in the comparison steps 401, 402, 403 and 404 that the determined response time t.sub.1A, t.sub.1B, t.sub.2A or t.sub.2B is shorter than the response time t.sub.L until the system limit is reached (result 402), the control unit 7 draws the conclusion that the driver 10 is capable of resolving the respective traffic scenario A or B within the response time t.sub.L and generates a take-over request to the driver 10. In this case, the control unit does not initiate a minimum risk maneuver.

(29) If it is identified in the comparison steps 401, 402, 403 and 404 that the determined response time t.sub.1A, t.sub.1B, t.sub.2A or t.sub.2B is longer than the response time t.sub.L until the system limit is reached or is the same length as the response time t.sub.L until the system limit is reached (result 403), the control unit 7 draws the conclusion that the driver 10 is incapable of resolving the respective traffic scenario A or B within the response time t.sub.L and initiates a minimum risk maneuver. In this case, the control unit 7 does not generate a take-over request to the driver 10.
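The decision logic of the comparison steps 401 to 404 described in paragraphs 27 to 29 reduces to a single threshold comparison, sketched below. The function name and the string return values are assumptions for illustration.

```python
def decide(t_driver: float, t_L: float) -> str:
    """Compare the determined response time of the driver (t_1A, t_1B, t_2A
    or t_2B) with the maximum possible response time t_L until the system
    limit is reached (comparison steps 401-404)."""
    if t_driver < t_L:
        # Driver is capable of resolving the scenario in time (paragraph 28):
        # generate a take-over request, do not initiate a minimum risk maneuver.
        return "take_over_request"
    # t_driver >= t_L (paragraph 29): initiate a minimum risk maneuver,
    # do not generate a take-over request.
    return "minimum_risk_maneuver"
```

Note that equality of the two periods of time already triggers the minimum risk maneuver, consistent with paragraph 29.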

(30) The description of the invention is merely exemplary in nature and, thus, variations that do not depart from the gist of the invention are intended to be within the scope of the invention. Such variations are not to be regarded as a departure from the spirit and scope of the invention.