Method and device for detecting the alertness of a vehicle driver
09731727 · 2017-08-15
CPC classification
B60W2900/00 · B60W2554/00 · B60W30/09 · B60W30/08 · B60W2040/0818 (Performing operations; transporting) · G06V20/597 (Physics)
International classification
B60W30/08 (Performing operations; transporting)
Abstract
A method for detecting the alertness of a vehicle driver includes the detection of a potentially dangerous situation by a system of the vehicle; the estimation of a visual observation field of the vehicle driver; and the detection of the alertness of the vehicle driver by comparing an orientation or a position of the potentially dangerous situation to the visual observation field of the vehicle driver.
Claims
1. A method for ascertaining an alertness of a vehicle driver, characterized by the steps: detecting a potentially dangerous situation by a system of the vehicle; detecting at least one optical feature of the vehicle as a reference point, the at least one optical feature having a known position on the vehicle; receiving at least one image recorded by a camera unit; estimating a visual observation field of the vehicle driver by comparing the at least one image with the position of the at least one optical feature; and detecting the alertness of the vehicle driver by comparing an orientation or a position of the potentially dangerous situation to the visual observation field of the vehicle driver.
2. The method as recited in claim 1, further comprising: outputting a warning signal if the orientation of the potentially dangerous situation and the visual observation field do not agree.
3. The method as recited in claim 2, wherein at least one of an intensity, a duration, a type, and a starting instant of the warning signal is a function of the detected alertness.
4. The method as recited in claim 1, further comprising: intervening in a driving operation of the vehicle if the orientation of the potentially dangerous situation and the visual observation field do not agree.
5. The method as recited in claim 4, wherein at least one of an intensity, a duration, a type, and a starting instant of the intervention is a function of the detected alertness.
6. The method as recited in claim 1, wherein the visual observation field is ascertained based on at least one of: i) a plurality of optical features of the vehicle in a recording of the camera unit, ii) at least an indication of a cardinal direction of a device for detecting a cardinal direction of a portable sensor unit and iii) at least one signal from an acceleration sensor of the portable sensor unit.
7. The method as recited in claim 1, wherein the potentially dangerous situation is indicated by the orientation or position of an object.
8. The method as recited in claim 1, further comprising adjusting at least one of a resolution and accuracy of the estimated visual observation field based on at least one of a number and a position of the at least one optical feature.
9. A device for alertness detection of a vehicle driver of a vehicle having a system for detecting a potentially dangerous situation, comprising: a sensor unit designed to: detect at least one optical feature of the vehicle as a reference point, the at least one optical feature having a known position on the vehicle; receive at least one image recorded by a camera unit; and estimate a visual observation field of the vehicle driver by comparing the at least one image with the position of the at least one optical feature; a communications module for communication with the system and the sensor unit; and a computer unit set up to detect alertness of the vehicle driver by comparing an orientation or a position of the potentially dangerous situation to the visual observation field of the vehicle driver.
10. The device as recited in claim 9, wherein the computer unit is situated in at least one of the system and in the portable sensor unit.
11. The device as recited in claim 9, wherein the at least one optical feature is an infrared transmitter.
12. The device as recited in claim 9, wherein the system has at least one camera designed to record pupils, and thus a gaze direction, of the driver, the computer unit being designed to superpose the gaze direction onto the visual observation field.
13. The device as recited in claim 9, wherein the sensor unit is further designed to adjust at least one of a resolution and accuracy of the estimated visual observation field based on at least one of a number and a position of the at least one optical feature.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
(1) Exemplary embodiments of the present invention are explained in greater detail below with reference to the figures.
DETAILED DESCRIPTION OF EXAMPLE EMBODIMENTS
(6) A head 3 of vehicle driver 2 is situated in a certain orientation, the head orientation generally corresponding to the direction of attention of vehicle driver 2. In addition or as an alternative to the head orientation, the gaze direction of vehicle driver 2 is able to be utilized for estimating a visual observation field of vehicle driver 2. While a determination of the head orientation is assumed by way of example in the following figure description, the detection or estimation of the visual observation field is encompassed as well.
(7) To illustrate the head orientation in
(8) Head orientations in the Z plane, i.e., from above to below, may be monitored here as well, so that the Z direction can also be sensed. This makes it possible to detect scenarios in which the driver's attention is drawn to a region below or above the normal field of vision, i.e., the windshield. This may be the case, for example, when the driver is busy with an operating element such as a radio, or when he gazes at the operating unit of a sliding roof. In other words, it is detectable whether the head is inclined downward or upward.
(9) A system 5, such as a driver assistance system or a safety system, for example, is situated in vehicle 1; it monitors the environment, or at least the region lying in front of vehicle 1 in the driving direction, for potentially dangerous situations 6. A potentially dangerous situation 6 is a situation that may lead to a collision with a potentially dangerous object 7 if vehicle 1 and/or object 7 maintain(s) their current heading or their current speed. To record the environment, system 5 includes radar and/or video sensors, for example, which may be implemented as stereo or mono sensors.
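The criterion in the paragraph above — a collision may occur if vehicle and object maintain their current heading and speed — can be sketched as a closest-approach check under a constant-velocity assumption. This is only an illustrative model; the patent does not prescribe a prediction method, and the time horizon and collision radius below are assumptions:

```python
import math

def time_to_closest_approach(p_rel, v_rel):
    """Given the object's position (m) and velocity (m/s) relative to the
    vehicle in 2D, return the time of closest approach and the miss
    distance, assuming both keep their current heading and speed.
    Hypothetical helper; not specified in the patent."""
    px, py = p_rel
    vx, vy = v_rel
    v2 = vx * vx + vy * vy
    if v2 == 0.0:
        # no relative motion: the current separation never changes
        return 0.0, math.hypot(px, py)
    # time at which the relative distance is minimal (clamped to the future)
    t = max(0.0, -(px * vx + py * vy) / v2)
    miss = math.hypot(px + vx * t, py + vy * t)
    return t, miss

def is_potentially_dangerous(p_rel, v_rel, horizon_s=4.0, radius_m=2.0):
    """A situation counts as potentially dangerous if, with headings and
    speeds maintained, the object passes within radius_m of the vehicle
    inside the time horizon. Threshold values are illustrative."""
    t, miss = time_to_closest_approach(p_rel, v_rel)
    return t <= horizon_s and miss <= radius_m
```

An object 20 m ahead closing at 10 m/s on the vehicle's path would be flagged; the same object offset 50 m to the side would not.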
(10) In the situation shown in
(13) An augmented reality system or a portable sensor unit 12, in this instance shown in the form of smartglasses by way of example, encompasses a camera unit 13, which makes recordings, such as images or video films, from the viewpoint of vehicle driver 2. In addition, augmented reality system 12 includes a computer unit 14. Portable sensor unit 12 advantageously includes additional sensors, for instance a compass or similar device, for ascertaining the cardinal direction or orientation, and/or acceleration sensors.
(14) While a portable sensor unit is described within the context of the exemplary embodiment shown in the figures, the sensor unit may alternatively or additionally also be integrated in the vehicle.
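Claim 6 names a compass (cardinal-direction device) and acceleration sensors of the portable unit as additional inputs for ascertaining the observation field. One common way to combine such inertial data with an absolute heading reference is a complementary filter; the sketch below is an assumption for illustration, since the patent does not specify any fusion method:

```python
class YawEstimator:
    """Complementary filter: integrate the angular rate from the portable
    unit's inertial sensors for smooth short-term tracking, and pull the
    estimate toward the absolute yaw from the camera/compass to cancel
    drift. Illustrative sketch; the weighting alpha is an assumption."""

    def __init__(self, alpha: float = 0.98):
        self.alpha = alpha      # trust placed in the integrated gyro path
        self.yaw_deg = 0.0      # current head-yaw estimate, degrees

    def update(self, gyro_rate_deg_s: float, dt_s: float,
               absolute_yaw_deg: float) -> float:
        # short-term prediction from the rate sensor
        predicted = self.yaw_deg + gyro_rate_deg_s * dt_s
        # blend with the drift-free absolute reference
        self.yaw_deg = (self.alpha * predicted
                        + (1.0 - self.alpha) * absolute_yaw_deg)
        return self.yaw_deg
```

With repeated updates the estimate converges to the absolute reference while remaining responsive to fast head movements between camera/compass readings.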
(15) A communications connection 15 links augmented reality system 12 to system 5 or control unit 10. Communications connection 15 may be part of a vehicle bus system, such as CAN or FlexRay. Communications connection 15 may also be compatible with other wired or wireless connection types or connection protocols, for instance Bluetooth, WLAN, USB, etc.
(16) Below, the way in which the attention of vehicle driver 2 is ascertained will be described with the aid of the figures.
(17) By monitoring the environment of vehicle 1, system 5 detects a potentially dangerous situation 6, such as the movement of object 7 into the current driving path of vehicle 1.
(18) Subsequently or at the same time, the head orientation of vehicle driver 2 is detected with the aid of the sensors of portable sensor unit 12. For example, a recording of camera unit 13 is used for this purpose and possibly supplemented by measured values of the further sensors. One or more image(s) of camera unit 13, for instance, is/are analyzed to this end, which may take place in computer unit 14 or computer unit 11. By comparing the recording, or sections or parts of the recording, with optical features of the vehicle whose appearance and position are known, such as the steering wheel or a specially applied marking, the head orientation of vehicle driver 2 is determined.
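The comparison of a recording with an optical feature of known position can be illustrated with a pinhole-camera bearing calculation: from the pixel column at which the feature appears, its angular offset from the optical axis follows, and subtracting that offset from the feature's known bearing in vehicle coordinates yields the head yaw. This is a deliberately simplified single-feature sketch (a real system would typically solve a full pose from several features, e.g. via PnP); the field of view and marker bearing are hypothetical values:

```python
import math

def yaw_from_marker(pixel_x: float, image_width: int,
                    horizontal_fov_deg: float,
                    marker_bearing_deg: float) -> float:
    """Estimate the head (camera) yaw from where a vehicle feature with a
    known bearing appears in the image. Pinhole model, single feature;
    an illustrative simplification of the patent's feature comparison."""
    # focal length in pixels from the horizontal field of view
    f = (image_width / 2.0) / math.tan(math.radians(horizontal_fov_deg / 2.0))
    # angular offset of the marker from the image centre
    offset_deg = math.degrees(math.atan2(pixel_x - image_width / 2.0, f))
    # camera yaw = where the marker really is, minus where it appears
    return marker_bearing_deg - offset_deg
```

A marker known to lie straight ahead (bearing 0°) that appears at the right image edge of a 90°-FOV camera implies the head is turned about 45° to the left.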
(19) In
(20) The head orientation of vehicle driver 2 is now compared to the orientation or position of potentially dangerous situation 6 or object 7. The orientation of object 7, i.e., the alignment with respect to the current driving direction, amounts to approximately 20° in
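The comparison described above reduces to an angular-difference test between head orientation and the bearing of the dangerous situation. A minimal sketch, in which the 30° attention tolerance is an illustrative assumption rather than a value from the patent:

```python
def angular_difference(a_deg: float, b_deg: float) -> float:
    """Smallest absolute difference between two directions in degrees,
    handling the 360-degree wraparound."""
    d = (a_deg - b_deg) % 360.0
    return min(d, 360.0 - d)

def driver_is_attentive(head_yaw_deg: float, danger_bearing_deg: float,
                        tolerance_deg: float = 30.0) -> bool:
    """The driver counts as attentive to the danger if its bearing lies
    within a tolerance of the head orientation. Tolerance is assumed."""
    return angular_difference(head_yaw_deg, danger_bearing_deg) <= tolerance_deg
```

With the object at roughly +20°, a driver looking ahead would pass the check, while a driver glancing 45° over the left shoulder would not.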
(21) In the situation according to
(22) Computer units 11 and/or 14 carrying out the alertness detection receive(s) the orientation and/or position of object 7 and optionally also further information, such as, for example, an ascertained collision time, the movement direction of object 7, etc. Based on an individual or multiple recording(s) of camera unit 13, computer unit 11 and/or 14 check(s) the gaze direction of the driver, for example by detecting certain vehicle interior features, exterior vehicle features, special optical features or features outside the vehicle, such as road markings (this may be lines in front of the vehicle, for example).
(23) Depending on the ascertained head orientation or gaze direction of vehicle driver 2, computer unit 11 and/or 14 decide(s) whether it is already time to warn driver 2 because the driver is not gazing in the direction of dangerous object 7, for instance while he is glancing over the shoulder in connection with a lane change, as shown in
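Claims 2 and 3 make the intensity, type, and starting instant of the warning a function of the detected alertness. Such a graded warning could be sketched as follows; the urgency formula, all thresholds, and the mapping to optical, acoustic, and haptic channels are illustrative assumptions, not values from the patent:

```python
def warning_parameters(angular_mismatch_deg: float,
                       time_to_collision_s: float):
    """Grade the warning: the larger the mismatch between gaze and danger
    direction, and the shorter the time to collision, the earlier and
    stronger the warning. Hypothetical grading for illustration."""
    if angular_mismatch_deg <= 30.0:
        return None  # driver is already looking at the danger: no warning
    # crude urgency score in [0, 1], rising with mismatch and imminence
    urgency = min(1.0, angular_mismatch_deg / 180.0
                       + 1.0 / max(time_to_collision_s, 0.1))
    return {
        # escalate the channel with urgency: optical -> acoustic -> haptic
        "type": ("haptic" if urgency > 0.8
                 else "acoustic" if urgency > 0.5
                 else "optical"),
        "intensity": round(urgency, 2),
        # start earlier (smaller delay) when urgency is high
        "start_delay_s": round(max(0.0, time_to_collision_s - 2.0 * urgency), 2),
    }
```

A small mismatch yields no warning at all, while a driver looking 90° away from an object one second from collision would receive an immediate, full-intensity haptic warning.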
(24) If a warning is to be output, vehicle driver 2 receives a timely warning by way of a suitable signal. This may be an optical signal, output for instance from augmented reality system 12 or a vehicle information unit. In addition, an acoustic warning signal may be provided, which is output by an audio unit of vehicle 1 and/or augmented reality system 12. Moreover, a haptic warning, such as on the steering wheel of vehicle 1, is possible.
(25) As an active response of the alertness detection, an intervention in the driving operation of the vehicle is performed when the orientation of the dangerous situation and the head orientation do not agree, or if their difference is excessive. In this case, information pertaining to the alertness or inattentiveness of vehicle driver 2 is returned to control unit 10, provided the calculations have not been performed there. Control unit 10 then induces autonomous braking or a preliminary activation of the brakes, for instance, in an attempt to shorten the braking distance.