Drowsy driving management device, system including the same, and method thereof
10882533 · 2021-01-05
Assignee
Inventors
CPC classification
B60W50/14
PERFORMING OPERATIONS; TRANSPORTING
G06V20/597
PHYSICS
B60W2540/223
PERFORMING OPERATIONS; TRANSPORTING
B60W40/08
PERFORMING OPERATIONS; TRANSPORTING
A61B2503/22
HUMAN NECESSITIES
B60W50/16
PERFORMING OPERATIONS; TRANSPORTING
A61B5/4809
HUMAN NECESSITIES
International classification
B60W40/08
PERFORMING OPERATIONS; TRANSPORTING
B60W50/16
PERFORMING OPERATIONS; TRANSPORTING
A61B5/00
HUMAN NECESSITIES
Abstract
A drowsy driving management device includes: a processor configured to determine whether slow eye movement of a user occurs and whether there is no change in steering torque and to determine that the user drives while drowsy when the slow eye movement of the user occurs and when there is no change in the steering torque for a predetermined period of time; and a storage that stores information indicating whether the slow eye movement occurs and a result of determining whether there is no change in the steering torque.
Claims
1. A drowsy driving management device, comprising: a processor configured to: determine whether slow eye movement of a user occurs and whether there is no change in steering torque, and determine that the user drives while drowsy when the slow eye movement of the user occurs and when there is no change in the steering torque for a predetermined period of time; and a storage storing information indicating whether the slow eye movement occurs and a result of determining whether there is no change in the steering torque, wherein the processor is configured to: calculate first eye movement information with respect to a vehicle based on information about a fixation of the user and information about a head pose of the user based on image data, and calculate second eye movement information with respect to an external environment based on the first eye movement information, wherein the information about the fixation includes a gaze yaw angle and a gaze pitch angle, and wherein the information about the head pose includes a head pose yaw angle and a head pose pitch angle.
2. The drowsy driving management device of claim 1, wherein the processor is configured to determine that the user drives while drowsy, when there is no user maneuver in a state where the slow eye movement of the user occurs and where there is no change in the steering torque for the predetermined period of time.
3. The drowsy driving management device of claim 1, wherein the processor is configured to: calculate a first eye movement yaw angle with respect to the vehicle in the first eye movement information by subtracting the head pose yaw angle from the gaze yaw angle, and calculate a first eye movement pitch angle with respect to the vehicle in the first eye movement information by subtracting the head pose pitch angle from the gaze pitch angle.
4. The drowsy driving management device of claim 3, wherein the processor is configured to: calculate a second eye movement yaw angle with respect to the external environment in the second eye movement information using the first eye movement yaw angle and a vehicle yaw rate, and calculate a second eye movement pitch angle with respect to the external environment in the second eye movement information using the first eye movement pitch angle and a vehicle pitch angle.
5. The drowsy driving management device of claim 4, wherein the processor is configured to: calculate an eye movement distance during the predetermined period of time using a value obtained by adding the second eye movement yaw angle and the second eye movement pitch angle, and determine that the slow eye movement occurs, when the eye movement distance is less than a predetermined reference value.
6. The drowsy driving management device of claim 1, wherein the processor is configured to determine that there is no change in the steering torque, when an output value of a steering torque sensor is less than a predetermined reference value.
7. A vehicle system, comprising: a drowsy driving management device configured to: determine whether slow eye movement of a user occurs and whether there is no change in steering torque, and determine that the user drives while drowsy when the slow eye movement of the user occurs and when there is no change in the steering torque for a predetermined period of time; and a warning device configured to output a warning to the user, when it is determined that the user drives while drowsy, wherein the drowsy driving management device is configured to: calculate first eye movement information with respect to a vehicle based on information about a fixation of the user and information about a head pose of the user based on image data, and calculate second eye movement information with respect to an external environment based on the first eye movement information, wherein the information about the fixation includes a gaze yaw angle and a gaze pitch angle, and wherein the information about the head pose includes a head pose yaw angle and a head pose pitch angle.
8. The vehicle system of claim 7, wherein the warning device is configured to provide at least one or more of a visual warning, a tactile warning, and an audible warning.
9. The vehicle system of claim 7, further comprising: a camera configured to detect information about a gaze of the user and information about a head pose of the user; a yaw pitch sensor configured to sense a vehicle yaw rate and a vehicle pitch angle; a steering torque sensor configured to sense a change in steering torque of the vehicle; and a decelerator/accelerator pedal sensor configured to sense a change value in decelerator/accelerator pedal.
10. The vehicle system of claim 7, wherein the drowsy driving management device includes: a processor configured to: determine whether the slow eye movement of the user occurs and whether there is no change in the steering torque, and determine that the user drives while drowsy when the slow eye movement of the user occurs and when there is no change in the steering torque for the predetermined period of time; and a storage storing information indicating whether the slow eye movement occurs and a result of determining whether there is no change in the steering torque, the information and the result being obtained by the processor.
11. The vehicle system of claim 10, wherein the processor is configured to determine that the user drives while drowsy, when there is no user maneuver in a state where the slow eye movement of the user occurs and where there is no change in the steering torque for the predetermined period of time.
12. A drowsy driving management method, comprising: determining whether slow eye movement of a user occurs and whether there is no change in steering torque; and determining that the user drives while drowsy, when the slow eye movement of the user occurs and when there is no change in the steering torque for a predetermined period of time, wherein the determining whether slow eye movement of a user occurs and whether there is no change in steering torque includes: calculating first eye movement information with respect to a vehicle based on information about a fixation of the user and information about a head pose of the user based on image data; and calculating second eye movement information with respect to an external environment based on the first eye movement information, wherein the information about the fixation includes a gaze yaw angle and a gaze pitch angle, and wherein the information about the head pose includes a head pose yaw angle and a head pose pitch angle.
13. The drowsy driving management method of claim 12, wherein the determining that the user drives while drowsy includes determining that the user drives while drowsy, when there is no user maneuver in a state where the slow eye movement of the user occurs and where there is no change in the steering torque for the predetermined period of time.
14. The drowsy driving management method of claim 12, wherein the determining whether slow eye movement of a user occurs and whether there is no change in steering torque includes: calculating a first eye movement yaw angle with respect to the vehicle in the first eye movement information by subtracting the head pose yaw angle from the gaze yaw angle; and calculating a first eye movement pitch angle with respect to the vehicle in the first eye movement information by subtracting the head pose pitch angle from the gaze pitch angle.
15. The drowsy driving management method of claim 14, wherein the determining whether slow eye movement of a user occurs and whether there is no change in steering torque includes: calculating a second eye movement yaw angle with respect to the external environment in the second eye movement information using the first eye movement yaw angle and a vehicle yaw rate; and calculating a second eye movement pitch angle with respect to the external environment in the second eye movement information using the first eye movement pitch angle and a vehicle pitch angle.
16. The drowsy driving management method of claim 15, wherein the determining whether slow eye movement of a user occurs and whether there is no change in steering torque includes: calculating an eye movement distance during the predetermined period of time using a value obtained by adding the second eye movement yaw angle and the second eye movement pitch angle; and determining that the slow eye movement occurs, when the eye movement distance is less than a predetermined reference value.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
(1) The above and other objects, features and advantages of the present disclosure will be more apparent from the following detailed description taken in conjunction with the accompanying drawings:
DETAILED DESCRIPTION
(9) Hereinafter, some embodiments of the present disclosure will be described in detail with reference to the exemplary drawings. In adding reference numerals to the components of each drawing, it should be noted that identical or equivalent components are designated by the same numeral even when they are displayed on other drawings. Further, in describing the embodiments of the present disclosure, detailed descriptions of well-known features or functions are omitted in order not to unnecessarily obscure the gist of the present disclosure.
(10) In describing the components of the embodiment according to the present disclosure, terms such as first, second, A, B, (a), (b), and the like may be used. These terms are merely intended to distinguish one component from another component, and the terms do not limit the nature, sequence or order of the constituent components. Unless otherwise defined, all terms used herein, including technical or scientific terms, have the same meanings as those generally understood by those skilled in the art to which the present disclosure pertains. Such terms as those defined in a generally used dictionary are to be interpreted as having meanings equal to the contextual meanings in the relevant field of art, and are not to be interpreted as having ideal or excessively formal meanings unless clearly defined as having such in the present application.
(11) Because drowsiness relaxes the muscles, it is difficult to take purposeful action, such as driving, while drowsy. In the case of the eyes, when a person is drowsy, the movement of the pupils no longer appears as a saccade and the fixation of the eyes moves at a slow speed. In the case of the hands, in a drowsy situation, clear behavior is not performed due to muscle relaxation and the strength of the grasp of the hands is reduced. Thus, an exemplary embodiment of the present disclosure may disclose technologies for determining drowsy driving at the beginning of drowsiness, based on slow eye movement of a user and a vehicle signal (e.g., a steering torque and information indicating whether various switches are manipulated), and warning the user to prevent an accident caused by the drowsy driving.
(12) Hereinafter, a description will be given in detail of embodiments of the present disclosure with reference to
(14) Referring to
(15) The drowsy driving management device 100 may determine whether slow eye movement of a user occurs and whether there is no change in steering torque. When the slow eye movement of the user occurs and when there is no change in steering torque for a predetermined period of time, the drowsy driving management device 100 may determine that the user drives while drowsy.
(16) Furthermore, when there is no user maneuver in the state where the slow eye movement of the user occurs and where there is no change in the steering torque for the predetermined period of time, the drowsy driving management device 100 may determine that the user drives while drowsy.
(17) The drowsy driving management device 100 may include a communicator 110, a storage 120, and a processor 130.
(18) The communicator 110 may be a hardware device implemented with various electronic circuits to transmit and receive a signal over a wireless or wired connection. In an embodiment of the present disclosure, the communicator 110 may perform inter-vehicle communication through controller area network (CAN) communication, local interconnect network (LIN) communication, or the like and may communicate with the sensing module 200 and the warning device 300.
(19) The storage 120 may store a sensing result of the sensing module 200 and information indicating whether slow eye movement occurs, the result of determining whether there is no change in steering torque, and the like, obtained by the processor 130. The storage 120 may include at least one type of storage medium, such as a flash memory type memory, a hard disk type memory, a micro type memory, a card type memory (e.g., a secure digital (SD) card or an extreme digital (XD) card), a random access memory (RAM), a static RAM (SRAM), a read-only memory (ROM), a programmable ROM (PROM), an electrically erasable PROM (EEPROM), a magnetic RAM (MRAM), a magnetic disk, and an optical disk.
(20) The processor 130 may be electrically connected with the communicator 110, the storage 120, the warning device 300, or the like and may electrically control the respective components. The processor 130 may be an electrical circuit which executes instructions of software and may perform a variety of data processing and calculation described below.
(21) The processor 130 may determine whether slow eye movement of the user occurs and whether there is no change in steering torque. When the slow eye movement of the user occurs and when there is no change in the steering torque during a predetermined time, the processor 130 may determine that the user drives while drowsy.
(22) When there is no user maneuver in the state where the slow eye movement of the user occurs and where there is no change in the steering torque for the predetermined period of time, the processor 130 may determine that the user drives while drowsy.
(23) The processor 130 may calculate eye movement information with respect to a vehicle based on information about a fixation of the user and information about a head pose of the user based on image data and may calculate eye movement information with respect to an external environment based on the eye movement information with respect to the vehicle.
(24) The processor 130 may subtract a head pose yaw angle from a gaze yaw angle to calculate an eye movement yaw angle with respect to the vehicle in the eye movement information with respect to the vehicle. The processor 130 may subtract a head pose pitch angle from a gaze pitch angle to calculate an eye movement pitch angle with respect to the vehicle in the eye movement information with respect to the vehicle.
(25) The processor 130 may calculate an eye movement yaw angle with respect to the external environment in the eye movement information with respect to the external environment using the eye movement yaw angle with respect to the vehicle and a vehicle yaw rate. The processor 130 may calculate an eye movement pitch angle with respect to the external environment in the eye movement information with respect to the external environment using the eye movement pitch angle with respect to the vehicle and a vehicle pitch angle.
(26) The processor 130 may calculate an eye movement distance during a predetermined period of time using a value obtained by adding the eye movement yaw angle with respect to the external environment and the eye movement pitch angle with respect to the external environment. When the eye movement distance is less than a predetermined reference value, the processor 130 may determine that the slow eye movement occurs.
(27) When an output value of a steering torque sensor 230 is less than a predetermined reference value, the processor 130 may determine that there is no change in steering torque.
(28) The sensing module 200 may sense information about a fixation of the user, information about a head pose (head movement) of the user, a vehicle yaw rate, a vehicle pitch angle, a change in steering torque of the vehicle, whether a decelerator/accelerator pedal operates, or the like.
(29) To this end, the sensing module 200 may include a camera 210, a yaw/pitch sensor 220, the steering torque sensor 230, and a decelerator/accelerator pedal sensor 240.
(30) The camera 210 may be a driver status monitoring (DSM) camera and may detect information about a gaze of the user and information about a head pose of the user.
(31) The yaw/pitch sensor 220 may sense a vehicle yaw rate and a vehicle pitch angle and may deliver the sensed information to the drowsy driving management device 100.
(32) The steering torque sensor 230 may sense a change in the steering torque of the vehicle and may deliver the sensed information to the drowsy driving management device 100.
(33) The decelerator/accelerator pedal sensor 240 may sense a change value in decelerator/accelerator pedal and may deliver the sensed information to the drowsy driving management device 100.
(34) Although not illustrated in
(35) When it is determined by the drowsy driving management device 100 that the user drives while drowsy, the warning device 300 may be controlled by the drowsy driving management device 100 to output at least one of a visual warning, a tactile warning, and an audible warning to the user. To help the user shake off sleepiness, the warning device 300 may display a screen recommending that the user take a break, recommending that the user operate a driving assistance function, or guiding the user toward a rest area or a sleeping shelter, or may output a corresponding voice message. The warning device 300 may be implemented as a head-up display (HUD), a cluster, an audio video navigation (AVN) system, or the like. Furthermore, the warning device 300 may include at least one of a liquid crystal display (LCD), a thin film transistor-LCD (TFT-LCD), a light emitting diode (LED) display, an organic LED (OLED) display, an active matrix OLED (AMOLED) display, a flexible display, a bended display, and a three-dimensional (3D) display. Some of these may be implemented as transparent displays configured as a transparent type or a semi-transparent type through which the outside can be seen. Moreover, the warning device 300 may be implemented as a touchscreen including a touch panel, so that it may be used as an input device as well as an output device.
(36) As such, an embodiment of the present disclosure may determine drowsy driving of the user with high reliability at the beginning of the drowsy driving, based on slow eye movement and an interval where there is no change in steering torque. Furthermore, when there is no user input for manipulating various switches (e.g., a pedal operation), in addition to the slow eye movement and the interval where there is no change in the steering torque, an embodiment of the present disclosure may determine that the user drives while drowsy.
(39) Referring to
(40) Referring to
(41) Hereinafter, a description will be given in detail of a drowsy driving management method according to an embodiment of the present disclosure with reference to
(42) Hereinafter, it is assumed that a drowsy driving management device 100 of
(43) Referring to
(44) In S102, the drowsy driving management device 100 may convert a relative fixation with respect to a vehicle into an absolute fixation with respect to an external environment using the fixation and a value output from a yaw/pitch sensor 220 of
(45) In S103, the drowsy driving management device 100 may verify the slow eye movement based on the fixation. When a person in a normal state looks at one place and wants to see another place, he or she performs eye movement in the form of quickly moving the fixation to the other point. Such eye movement is called a saccade. When the eye movement is measured using a gaze tracking camera, it is shown in
(46) In S104, the drowsy driving management device 100 may detect an interval where there is no change in steering torque, using a value output from the steering torque sensor 230 of
(47) In S105, the drowsy driving management device 100 may verify an interval where there is no user maneuver, based on an output value applied from a decelerator/accelerator pedal sensor 240 of
(48) In S106, the drowsy driving management device 100 may determine whether there are the slow eye movement, the interval where there is no change in the steering torque, and the interval where there is no user maneuver at the same time to determine whether the user drives while drowsy.
(49) When it is determined that the user drives while drowsy, in S107, the drowsy driving management device 100 may warn the user of drowsiness by a warning device 300 of
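The S101–S107 flow above can be summarized as a conjunction of three per-window flags. The sketch below is illustrative only (the function and flag names are not from the patent); it assumes each flag has already been computed for the evaluation window, as described in the preceding steps.

```python
def manage_drowsy_driving(slow_eye_movement: bool,
                          steering_torque_changed: bool,
                          user_maneuver_detected: bool) -> bool:
    """Return True (i.e., warn the user) only when slow eye movement,
    an interval with no steering torque change, and an interval with
    no user maneuver all occur at the same time."""
    return (slow_eye_movement
            and not steering_torque_changed
            and not user_maneuver_detected)

# Slow eye movement with no steering change and no pedal/switch input -> drowsy
assert manage_drowsy_driving(True, False, False) is True
# Any purposeful maneuver suppresses the drowsiness decision
assert manage_drowsy_driving(True, False, True) is False
```

Requiring all three conditions simultaneously is what gives the method its reliability: any one signal alone (e.g., a quiet steering wheel on a straight road) is ambiguous.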
(51) Referring to
(52) In this case, the information about the fixation of the user may include a gaze yaw angle w.sub.gaze and a gaze pitch angle v.sub.gaze. The information about the head pose may include a head pose yaw angle w.sub.head and a head pose pitch angle v.sub.head.
(53) In S203, the drowsy driving management device 100 may calculate eye movement information with respect to a vehicle using the information about the fixation of the user and the information about the head pose of the user.
(54) In this case, the eye movement information with respect to the vehicle may include an eye movement yaw angle w.sub.eye with respect to the vehicle and an eye movement pitch angle v.sub.eye with respect to the vehicle.
(55) In this case, the eye movement yaw angle w.sub.eye with respect to the vehicle and the eye movement pitch angle v.sub.eye with respect to the vehicle may be calculated as Equation 1 below.
w.sub.eye=w.sub.gaze-w.sub.head
v.sub.eye=v.sub.gaze-v.sub.head  [Equation 1]
(56) In other words, the drowsy driving management device 100 may subtract the head pose yaw angle w.sub.head from the gaze yaw angle w.sub.gaze to calculate the eye movement yaw angle w.sub.eye with respect to the vehicle. The drowsy driving management device 100 may subtract the head pose pitch angle v.sub.head from the gaze pitch angle v.sub.gaze to calculate the eye movement pitch angle v.sub.eye with respect to the vehicle.
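Equation 1 can be sketched in a few lines. This is an illustrative helper, not code from the patent; the function and argument names are assumptions, and angles are taken to be in degrees.

```python
def eye_movement_wrt_vehicle(gaze_yaw: float, gaze_pitch: float,
                             head_yaw: float, head_pitch: float):
    """Equation 1: eye movement relative to the vehicle is the gaze
    angle minus the head pose angle, per axis."""
    w_eye = gaze_yaw - head_yaw      # eye movement yaw angle (w.sub.eye)
    v_eye = gaze_pitch - head_pitch  # eye movement pitch angle (v.sub.eye)
    return w_eye, v_eye

# If the gaze turned 15 deg in yaw but the head turned 5 deg of that,
# the eyes themselves moved only 10 deg relative to the vehicle.
assert eye_movement_wrt_vehicle(15.0, 8.0, 5.0, 3.0) == (10.0, 5.0)
```

Subtracting the head pose isolates pure eyeball motion, which is what the slow-eye-movement test in the later steps operates on.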
(57) When an amount of movement of the gaze yaw angle w.sub.gaze and the gaze pitch angle v.sub.gaze greater than or equal to a certain rate (e.g., 30%) is caused by a head pose, the drowsy driving management device 100 may refrain from determining drowsy driving. This is because, when the user moves his or her gaze while turning his or her face (head), it is not drowsy driving.
(58) In S204, the drowsy driving management device 100 may receive a vehicle yaw rate y.sub.car and a vehicle pitch angle v.sub.car from a yaw/pitch sensor 220 of
(59) In S205, the drowsy driving management device 100 may calculate eye movement information with respect to an external environment, as in Equation 2 below, using the eye movement information with respect to the vehicle, the vehicle yaw rate y.sub.car, and the vehicle pitch angle v.sub.car, to convert a relative fixation with respect to the vehicle into an absolute fixation with respect to the external environment. In this case, the eye movement information with respect to the external environment may include an eye movement yaw angle w.sub.eye with respect to the external environment and an eye movement pitch angle v.sub.eye with respect to the external environment.
w.sub.eye=w.sub.eye-y.sub.car
v.sub.eye=v.sub.eye-v.sub.car  [Equation 2]
(60) In other words, the drowsy driving management device 100 may subtract an integral value in a period (e.g., 2 seconds) when the vehicle yaw rate y.sub.car is evaluated from the eye movement yaw angle w.sub.eye with respect to the vehicle to calculate the eye movement yaw angle w.sub.eye with respect to the external environment. The drowsy driving management device 100 may subtract the vehicle pitch angle v.sub.car from the eye movement pitch angle v.sub.eye with respect to the vehicle to calculate the eye movement pitch angle v.sub.eye with respect to the external environment.
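A minimal sketch of this vehicle-motion compensation, assuming yaw rate samples in degrees/second taken at a fixed interval over the evaluation period (e.g., 2 s); names and the rectangular integration are illustrative choices, not from the patent.

```python
def eye_movement_wrt_environment(w_eye: float, v_eye: float,
                                 yaw_rate_samples, dt: float,
                                 vehicle_pitch: float):
    """Equation 2: subtract the yaw rate integrated over the evaluation
    period from the yaw angle, and subtract the vehicle pitch angle
    from the pitch angle."""
    # Rectangular approximation of the integral of y_car over the window
    yaw_integral = sum(rate * dt for rate in yaw_rate_samples)
    return w_eye - yaw_integral, v_eye - vehicle_pitch

# Four samples of 2 deg/s at 0.5 s spacing -> the vehicle yawed 4 deg,
# so 4 deg of the apparent eye yaw is attributed to vehicle motion.
assert eye_movement_wrt_environment(10.0, 5.0, [2.0] * 4, 0.5, 1.0) == (6.0, 4.0)
```

This compensation is what prevents a curve or a pitching road from masquerading as slow eye movement, as the description notes later.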
(61) In S206, the drowsy driving management device 100 may determine slow eye movement based on the total eye movement distance.
(62) In other words, the drowsy driving management device 100 may determine whether the absolute value of the sum of the eye movement yaw angle w.sub.eye with respect to the external environment and the eye movement pitch angle v.sub.eye with respect to the external environment is identical to the value obtained by applying the arccosine function to the sum of the cosines of those two angles, as in Equation 3 below. When the two values are identical, the drowsy driving management device 100 may determine that the slow eye movement occurs.
|w.sub.eye+v.sub.eye|=cos.sup.-1(cos(w.sub.eye)+cos(v.sub.eye))  [Equation 3]
(63) For example, when an eye movement distance for 2 seconds is greater than or equal to 10 degrees and when there is no eye movement at a speed of greater than or equal to 300 degrees/second within the 2 seconds, the drowsy driving management device 100 may determine that the slow eye movement occurs. When a fixation moves by eye movement at a speed of greater than or equal to 300 degrees/second, the drowsy driving management device 100 may determine that a saccade occurs.
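The thresholds above (10 degrees of total movement over 2 seconds, and 300 degrees/second as the saccade speed) can be applied to a sampled angle trace as follows. This is an illustrative sketch, not the patent's implementation; the sampling scheme and function name are assumptions.

```python
def classify_eye_movement(angles, dt: float) -> str:
    """angles: per-sample total eye movement angle (deg) over a ~2 s window,
    sampled every dt seconds. Uses the thresholds given in the text:
    any inter-sample speed >= 300 deg/s -> saccade; otherwise a total
    movement >= 10 deg -> slow eye movement; otherwise ordinary fixation."""
    speeds = [abs(b - a) / dt for a, b in zip(angles, angles[1:])]
    total_distance = abs(angles[-1] - angles[0])
    if any(speed >= 300.0 for speed in speeds):
        return "saccade"
    if total_distance >= 10.0:
        return "slow"
    return "fixation"

# A steady 6 deg/s drift totaling 12 deg is classified as slow eye movement.
assert classify_eye_movement([0.0, 3.0, 6.0, 9.0, 12.0], 0.5) == "slow"
# A 40 deg jump within 0.1 s (400 deg/s) is a saccade.
assert classify_eye_movement([0.0, 40.0], 0.1) == "saccade"
```

Checking the speed condition first matters: a saccade can easily cover 10 degrees, so distance alone cannot distinguish the two.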
(64) In S207, the drowsy driving management device 100 may determine whether there is understeering, based on a maximum value and a minimum value of a steering torque. In other words, the drowsy driving management device 100 may determine whether an output value of a steering torque sensor 230 of
(65) In S208, the drowsy driving management device 100 may determine whether a turn signal operates, whether there is a change in decelerator/accelerator pedal, or whether various switches operate. For example, when there is a change greater than or equal to an accelerator pedal compression of 5% and when there is a change greater than or equal to a decelerator pedal compression of 10 bar, the drowsy driving management device 100 may determine that there is a change in a switch operation by the user.
(66) When the slow eye movement occurs, when the vehicle is in an understeering state, and when there is no operation of a turn signal, no change in the decelerator/accelerator pedal, and no operation of various switches, in S209, the drowsy driving management device 100 may determine that the user drives while drowsy.
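The S207–S209 decision can be sketched as below. The 5% accelerator and 10 bar brake thresholds come from the text; the steering torque reference value is not specified in this excerpt, so the default used here is a labeled placeholder, and all names are illustrative.

```python
def drowsy_decision(slow_eye: bool, torque_range_nm: float,
                    turn_signal_on: bool, accel_delta_pct: float,
                    brake_delta_bar: float,
                    torque_threshold_nm: float = 0.5) -> bool:
    """S207-S209 sketch. torque_range_nm is (max - min) steering torque
    over the window; torque_threshold_nm is an assumed placeholder for
    the predetermined reference value."""
    understeering = torque_range_nm < torque_threshold_nm  # no steering change
    user_maneuver = (turn_signal_on
                     or accel_delta_pct >= 5.0    # accelerator change threshold
                     or brake_delta_bar >= 10.0)  # brake pressure change threshold
    return slow_eye and understeering and not user_maneuver

# Slow eye movement, a quiet steering wheel, and no pedal/switch activity -> drowsy
assert drowsy_decision(True, 0.1, False, 0.0, 0.0) is True
# A 6% accelerator change counts as a user maneuver and suppresses the decision
assert drowsy_decision(True, 0.1, False, 6.0, 0.0) is False
```

Expressing the maneuver check as a disjunction mirrors the text: any one of the turn signal, pedal, or switch signals is enough to rule out drowsiness for that window.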
(67) As such, an exemplary embodiment of the present disclosure may determine drowsy driving of the user with higher reliability than long eye opening and closing, which is an existing drowsy driving detection method, based on slow eye movement and muscle relaxation, which are initial drowsiness biomarkers, and may provide a warning at an early, accurate time.
(68) Furthermore, when determining slow eye movement and saccades using a camera that monitors the user's face, an embodiment of the present disclosure may compensate for the yaw and pitch of the vehicle to prevent pseudo slow eye movement caused by vehicle motion from being incorrectly detected, thus detecting drowsy driving more accurately.
(70) Referring to
(71) The processor 1100 may be a central processing unit (CPU) or a semiconductor device that processes instructions stored in the memory 1300 and/or the storage 1600. The memory 1300 and the storage 1600 may include various types of volatile or non-volatile storage media. For example, the memory 1300 may include a ROM (Read Only Memory) 1310 and a RAM (Random Access Memory) 1320.
(72) Thus, the operations of the method or the algorithm described in connection with the embodiments disclosed herein may be embodied directly in hardware or a software module executed by the processor 1100, or in a combination thereof. The software module may reside on a storage medium (that is, the memory 1300 and/or the storage 1600) such as a RAM memory, a flash memory, a ROM, an EPROM, an EEPROM, a register, a hard disk, a removable disk, and a CD-ROM.
(73) The exemplary storage medium may be coupled to the processor 1100, and the processor 1100 may read information out of the storage medium and may record information in the storage medium. Alternatively, the storage medium may be integrated with the processor 1100. The processor 1100 and the storage medium may reside in an application specific integrated circuit (ASIC). The ASIC may reside within a user terminal. In another case, the processor 1100 and the storage medium may reside in the user terminal as separate components.
(74) The present technology may detect drowsy driving of the user early and accurately, based on slow eye movement and an interval where there is no change in steering torque indicated by a vehicle signal, thus preventing an accident due to the drowsy driving.
(75) In addition, various effects directly or indirectly ascertained through the present disclosure may be provided.
(76) Hereinabove, although the present disclosure has been described with reference to exemplary embodiments and the accompanying drawings, the present disclosure is not limited thereto, but may be variously modified and altered by those skilled in the art to which the present disclosure pertains without departing from the spirit and scope of the present disclosure claimed in the following claims.
(77) Therefore, the exemplary embodiments of the present disclosure are provided to explain the spirit and scope of the present disclosure, but not to limit them, so that the spirit and scope of the present disclosure is not limited by the embodiments. The scope of the present disclosure should be construed on the basis of the accompanying claims, and all the technical ideas within the scope equivalent to the claims should be included in the scope of the present disclosure.