System for a vehicle
11273835 · 2022-03-15
Assignee
Inventors
CPC classification
B60W50/14 · B60W60/0059 · B60W2540/223 · B60W30/18163 · B60W2554/408 · B60W30/16 · B60W2554/00 · B60W60/0057 · B60W2554/804 · B60W2540/221 · B60W2555/60 · B60W2540/22 (all class B60W: PERFORMING OPERATIONS; TRANSPORTING)
International classification
B60W50/14 · B60W30/16 (PERFORMING OPERATIONS; TRANSPORTING)
Abstract
A system for a vehicle includes a driver monitoring device with a sensor device and a first camera, an environment detection device with sensors for surroundings acquisition, a control unit, and a classifier. Using the environment detection device and the classifier, traffic scenarios A and B may be classified. Using the driver monitoring device, it is possible to determine whether a direction of view of the driver is directed towards a traffic situation and whether the driver has their hands on a steering wheel of the vehicle. Using the control unit, it is possible, on the basis of information from the driver monitoring device and the environment detection device, to determine whether the driver is capable of resolving the traffic scenario classified as A or B within a response time.
Claims
1. A system for a vehicle, the system comprising: a driver monitoring device; a sensor device being part of the driver monitoring device, the sensor device being configured to detect whether at least one hand of a driver of the vehicle is in contact with a steering wheel of the vehicle and to generate corresponding contact data; a first camera configured to detect whether a direction of view of the driver is directed towards a traffic situation in which the vehicle is situated and to generate corresponding direction of view data; an environment detection device with sensors for surroundings acquisition, the sensors for surroundings acquisition being configured to detect lane information and object information in an external environment of the vehicle and to detect objects in a region in front of the vehicle; a control unit; and a classifier, the sensors for surroundings acquisition being configured to acquire a traffic scenario from the detected objects, lane information and object information and to transmit it to the classifier; wherein the classifier is configured to identify, on the basis of the traffic scenario, whether a system limit is reached within a response time (t.sub.L) and, if a system limit is identified, to classify the traffic scenario into a traffic scenario A or a traffic scenario B, wherein the driver of the vehicle may better resolve the traffic scenario A if the direction of view of the driver is directed towards the traffic situation in which the vehicle is situated, and wherein the driver of the vehicle may better resolve the traffic scenario B if at least one hand of the driver is in contact with the steering wheel; wherein the control unit is configured, in the case of an identified system limit and a traffic scenario classified as A or B, to evaluate on the basis of the contact data or the direction of view data whether the driver is capable of resolving the traffic scenario A or B within the response time (t.sub.L); and wherein the control unit is configured to compare an estimated response time (t.sub.1A, t.sub.1B, t.sub.2A or t.sub.2B) of the driver with the response time (t.sub.L) until the system limit is reached.
2. The system of claim 1, wherein the estimated response time of the driver is stored as a constant and the control unit may access the constant.
3. The system of claim 1, wherein the control unit is configured to determine continuously the estimated response time (t.sub.1A, t.sub.1B, t.sub.2A or t.sub.2B) of the driver on the basis of the traffic scenario classified as A and the direction of view data or on the basis of the traffic scenario classified as B and the contact data.
4. The system of claim 1, wherein the control unit is configured, in the case of an identified system limit and a traffic scenario classified as A, to draw the conclusion that the driver is capable of resolving the traffic scenario A within the response time (t.sub.L), if the direction of view data include the fact that the direction of view of the driver is directed towards the traffic situation in which the vehicle is located, and to generate a take-over request to the driver, and to draw the conclusion that the driver is not capable of resolving the traffic scenario A within the response time (t.sub.L), if the direction of view data include the fact that the direction of view of the driver is not directed towards the traffic situation in which the vehicle is located, and to initiate a minimum risk maneuver of the vehicle.
5. The system of claim 1, wherein the control unit is configured, in the case of an identified system limit and a traffic scenario classified as B, to draw the conclusion that the driver is capable of resolving the traffic scenario B within the response time (t.sub.L), if the contact data include the fact that at least one hand of the driver of the vehicle is in contact with the steering wheel of the vehicle, and to generate a take-over request to the driver, and to draw the conclusion that the driver is not capable of resolving the traffic scenario B within the response time (t.sub.L), if the contact data include the fact that neither hand of the driver of the vehicle is in contact with the steering wheel of the vehicle, and to initiate a minimum risk maneuver of the vehicle.
6. The system of claim 1, wherein the sensor device further comprises a capacitance sensor.
7. The system of claim 1, wherein the sensor device is configured to measure a torque from the hand of the driver acting on the steering wheel.
8. The system of claim 1, wherein the traffic scenario A involves the filtering-in of a further vehicle in the region ahead of the vehicle or the merging of two directly adjacent lanes in the same direction of travel in the region ahead of the vehicle.
9. The system of claim 1, wherein the traffic scenario B includes a sharp bend, a discrepancy between the course of a bend and a steering angle of the vehicle or a sudden extreme change in road conditions.
10. A vehicle comprising a system, the system comprising: a driver monitoring device; a sensor device being part of the driver monitoring device, the sensor device being configured to detect whether at least one hand of a driver of the vehicle is in contact with a steering wheel of the vehicle and to generate corresponding contact data; a first camera configured to detect whether a direction of view of the driver is directed towards a traffic situation in which the vehicle is situated and to generate corresponding direction of view data; an environment detection device with sensors for surroundings acquisition, the sensors for surroundings acquisition being configured to detect lane information and object information in an external environment of the vehicle and to detect objects in a region in front of the vehicle; a control unit; and a classifier, the sensors for surroundings acquisition being configured to acquire a traffic scenario from the detected objects, lane information and object information and to transmit it to the classifier; wherein the classifier is configured to identify, on the basis of the traffic scenario, whether a system limit is reached within a response time (t.sub.L) and, if a system limit is identified, to classify the traffic scenario into a traffic scenario A or a traffic scenario B, wherein the driver of the vehicle may better resolve the traffic scenario A if the direction of view of the driver is directed towards the traffic situation in which the vehicle is situated, and wherein the driver of the vehicle may better resolve the traffic scenario B if at least one hand of the driver is in contact with the steering wheel; wherein the control unit is configured, in the case of an identified system limit and a traffic scenario classified as A or B, to evaluate on the basis of the contact data or the direction of view data whether the driver is capable of resolving the traffic scenario A or B within the response time (t.sub.L); and wherein the control unit is configured to compare an estimated response time (t.sub.1A, t.sub.1B, t.sub.2A or t.sub.2B) of the driver with the response time (t.sub.L) until the system limit is reached.
11. A method, comprising the steps of: providing a driver monitoring device; providing a sensor device being part of the driver monitoring device; providing a first camera; providing an environment detection device with sensors for surroundings acquisition; providing a control unit; and providing a classifier; detecting, with the sensor device, whether at least one hand of a driver of a vehicle is in contact with a steering wheel of the vehicle, and generating corresponding contact data; detecting, with the first camera, whether a direction of view of the driver is directed towards a traffic situation in which the vehicle is situated, and generating corresponding direction of view data; detecting, with the environment detection device with sensors for surroundings acquisition, lane information and object information in an external environment of the vehicle and objects in a region in front of the vehicle; acquiring, with the environment detection device with sensors for surroundings acquisition, a traffic scenario from the detected objects, lane information and object information and transmitting it to the classifier; identifying, with the classifier, on the basis of the traffic scenario, whether a system limit is reached within a response time (t.sub.L) and, if a system limit is identified, classifying the traffic scenario into a traffic scenario A or a traffic scenario B, wherein the driver of the vehicle may better resolve the traffic scenario A if the direction of view of the driver is directed towards the traffic situation in which the vehicle is situated, and wherein the driver of the vehicle may better resolve the traffic scenario B if at least one hand of the driver is in contact with the steering wheel; evaluating, with the control unit, in the case of an identified system limit and a traffic scenario classified as A or B, on the basis of the contact data or the direction of view data, whether the driver is capable of resolving the traffic scenario A or B within the response time (t.sub.L); and comparing, with the control unit, an estimated response time (t.sub.1A, t.sub.1B, t.sub.2A or t.sub.2B) of the driver with the response time (t.sub.L) until the system limit is reached.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
(1) Exemplary embodiments of the invention are explained in greater detail below with reference to schematic drawings, in which:
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS
(9) The following description of the preferred embodiment(s) is merely exemplary in nature and is in no way intended to limit the invention, its application, or uses.
(11) The driving function 9 may be automatically operated, for example at SAE Level 3 or higher, whereby a driver 10 of the vehicle 1 is enabled to turn their attention away from the traffic for a certain time. It may then arise that the vehicle 1 is situated in a traffic situation which may be classified as a traffic scenario A or B.
(12) In
(13) In
(14) These traffic scenarios A according to
(15) In
(16) In
(17) In this situation it is necessary for the driver 10 of the vehicle 1 to intervene actively in the steering movement of the vehicle 1 within the response time by turning the steering wheel 3 to the right, or the system initiates a corresponding minimum risk maneuver. A sudden implausible steering movement of the vehicle 1 is perceptible via the steering wheel 3 of the vehicle at the instant at which it is performed, so timely intervention by the driver 10 is possible provided that the driver 10 has at least one hand on the steering wheel 3. Optically or visually, in contrast, such a sudden implausible steering movement can only be perceived once it has been performed; timely intervention by the driver 10 is therefore barely possible if the driver 10 does not have at least one hand on the steering wheel 3.
(18) It may moreover arise that the lane 21 has ruts (not shown), that the lane 21 is icy (black ice), or that strong side winds act on the vehicle 1 while it drives along the lane 21. In such traffic situations a system limit is reached after a response time has elapsed, or may already have been reached beforehand. The driver has to adapt the driving behavior of the vehicle 1 accordingly, or the system must initiate an appropriate minimum risk maneuver to avoid an accident situation. The conditions on the lane 21 are felt by the driver 10 of the vehicle 1 via the steering wheel 3, which makes it possible for the driver 10 to adapt the driving behavior of the vehicle 1 or to intervene in critical situations. The above-described conditions on the lane 21 are, by contrast, difficult to perceive optically or visually.
(19) These traffic scenarios B according to
(20) Functions of the system of vehicle 1 according to
(21) By means of the second camera 5 and the radar device 6, it is possible in a first step 100 to analyze an environment in which the vehicle 1 is currently situated. By means of the classifier 8, the previously analyzed environment may then be classified in a classification step 200 into traffic scenarios A (result 201) and B (result 202).
(22) To this end, the second camera 5 may detect lane and object information in an external environment of the vehicle 1, and the radar device 6 may moreover detect objects in a region in front of the vehicle 1. Corresponding data from the second camera 5 and the radar device 6 (objects, course of the lane and of any bend, etc.) are supplied in preprocessed form to the classifier 8. On the basis of these environment data, the classifier 8 may identify whether a system limit is reached after a response time t.sub.L has elapsed; the response time t.sub.L is in this case the maximum possible response time before the system limit is reached. If a system limit is detected, the system classifies the traffic scenario as A (result 201) or B (result 202).
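The classification step 200 described above can be sketched as follows. This is a minimal illustrative sketch, not the patented implementation: the boolean feature fields of `EnvironmentData` are hypothetical stand-ins for the preprocessed camera and radar data, and the mapping of features to scenarios A and B merely mirrors the examples given in the description.

```python
from dataclasses import dataclass
from enum import Enum
from typing import Optional

class Scenario(Enum):
    A = "A"        # better resolved when the driver's gaze is on the traffic situation
    B = "B"        # better resolved when a hand is on the steering wheel
    NONE = "NONE"  # no system limit identified

@dataclass
class EnvironmentData:
    # Hypothetical preprocessed features derived from the second camera and radar device
    merge_or_cut_in_ahead: bool            # e.g. a vehicle filtering in, or lanes merging ahead
    bend_or_road_anomaly: bool             # e.g. a sharp bend or a sudden change in road conditions
    time_to_system_limit: Optional[float]  # seconds until a system limit, None if none detected

def classify(env: EnvironmentData) -> Scenario:
    """Classification step 200: classify the scenario once a system limit is identified."""
    if env.time_to_system_limit is None:
        return Scenario.NONE       # no system limit; automated driving continues
    if env.merge_or_cut_in_ahead:
        return Scenario.A          # result 201
    if env.bend_or_road_anomaly:
        return Scenario.B          # result 202
    return Scenario.NONE
```

In this sketch a scenario-A feature takes precedence when both feature groups are present; the description does not specify how such overlaps are resolved.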
(23) In combination with the results 201 and 202 of the classification 200, the control unit 7 may, in a further step 300, use the first camera 4 and the sensor device 2 to determine the driver's condition, this being shown in detail in
(24) The first camera 4 may in the process detect if a direction of view of the driver 10 is directed towards a traffic situation (result 301) in which the vehicle 1 is situated, and generate corresponding direction of view data. The sensor device 2 may for example detect by means of a capacitance sensor or a torque sensor (neither of which is shown) if at least one hand of the driver 10 of the vehicle 1 is in contact with the steering wheel 3 of the vehicle 1 (result 302), and generate corresponding contact data.
(25) Based on the results 301 and 302 of the analysis 300 of the driver's condition and the results 201 and 202 of the classification 200, a respective estimated response time of the driver 10 may be continuously determined. The following four different response times are obtained: a first response time t.sub.1A for a direction of view of the driver 10 directed towards the traffic situation (result 301) and a traffic scenario classified as A (result 201); a second response time t.sub.1B for a direction of view of the driver 10 directed towards the traffic situation (result 301) and a traffic scenario classified as B (result 202); a third response time t.sub.2A for at least one hand of the driver 10 detected on the steering wheel 3 (result 302) and a traffic scenario classified as A (result 201); and a fourth response time t.sub.2B for at least one hand of the driver 10 detected on the steering wheel 3 (result 302) and a traffic scenario classified as B (result 202).
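The mapping from driver-state result and scenario class to an estimated response time can be sketched as a simple lookup. The numeric values below are illustrative placeholders only; the description gives no concrete values, and a real system would determine them continuously or store them as calibrated constants.

```python
# Illustrative placeholder values in seconds (NOT from the description).
RESPONSE_TIMES = {
    ("gaze_on_traffic", "A"): 1.5,  # t_1A: result 301 with result 201
    ("gaze_on_traffic", "B"): 2.5,  # t_1B: result 301 with result 202
    ("hand_on_wheel", "A"): 2.0,    # t_2A: result 302 with result 201
    ("hand_on_wheel", "B"): 1.0,    # t_2B: result 302 with result 202
}

def estimated_response_time(driver_state: str, scenario: str) -> float:
    """Look up the estimated driver response time for a driver-state/scenario pair."""
    return RESPONSE_TIMES[(driver_state, scenario)]
```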
(26) As an alternative to the above-described continuous evaluation (which allows extension to multiple classes and dynamic adaptation), the response times may also be saved as predefined constants.
(27) In the following comparison steps 401, 402, 403 and 404 respectively, the respectively determined response time t.sub.1A, t.sub.1B, t.sub.2A or t.sub.2B is compared by the control unit 7 with the maximum possible response time t.sub.L before the system limit is reached. Both the determined response times t.sub.1A, t.sub.1B, t.sub.2A and t.sub.2B and the response time t.sub.L before the system limit is reached are periods of time.
(28) If it is identified in the comparison steps 401, 402, 403 and 404 that the determined response time t.sub.1A, t.sub.1B, t.sub.2A or t.sub.2B is shorter than the response time t.sub.L until the system limit is reached (result 402), the control unit 7 draws the conclusion that the driver 10 is capable of resolving the respective traffic scenario A or B within the response time t.sub.L and generates a take-over request to the driver 10. In this case, the control unit does not initiate a minimum risk maneuver.
(29) If it is identified in the comparison steps 401, 402, 403 and 404 that the determined response time t.sub.1A, t.sub.1B, t.sub.2A or t.sub.2B is longer than the response time t.sub.L until the system limit is reached or is the same length as the response time t.sub.L until the system limit is reached (result 403), the control unit 7 draws the conclusion that the driver 10 is incapable of resolving the respective traffic scenario A or B within the response time t.sub.L and initiates a minimum risk maneuver. In this case, the control unit 7 does not generate a take-over request to the driver 10.
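The comparison steps 401 to 404 and the resulting decision described in the two paragraphs above condense to a single comparison; a minimal sketch, under the stated convention that only a strictly shorter estimated response time triggers the take-over request:

```python
def decide(t_estimated: float, t_limit: float) -> str:
    """Compare the estimated driver response time with the response time t_L
    until the system limit is reached (comparison steps 401-404).

    Strictly shorter estimate -> take-over request (result 402), no minimum risk maneuver.
    Equal or longer estimate  -> minimum risk maneuver (result 403), no take-over request.
    """
    if t_estimated < t_limit:
        return "TAKE_OVER_REQUEST"
    return "MINIMUM_RISK_MANEUVER"
```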
(30) The description of the invention is merely exemplary in nature and, thus, variations that do not depart from the gist of the invention are intended to be within the scope of the invention. Such variations are not to be regarded as a departure from the spirit and scope of the invention.