SYSTEM FOR A MOTOR VEHICLE AND METHOD FOR ASSESSING THE EMOTIONS OF A DRIVER OF A MOTOR VEHICLE
20230020786 · 2023-01-19
Assignee
Inventors
- David Bethge (Stuttgart-Feuerbach, DE)
- Tobias Große-Puppendahl (Tübingen, DE)
- Mohamed Kari (Essen, DE)
Cpc classification
- G01S19/01 (PHYSICS)
- G06V10/774 (PHYSICS)
- G06V10/12 (PHYSICS)
International classification
- G06V20/59 (PHYSICS)
- G01S19/01 (PHYSICS)
- G06V10/12 (PHYSICS)
Abstract
A system for a motor vehicle includes a sensor apparatus having a sensor for determining motor vehicle data and/or driving data of the motor vehicle, and an evaluation unit. The evaluation unit includes an emotion determination unit configured to assess the emotions of the driver of the motor vehicle on the basis of sensor signals transmitted by the sensor.
Claims
1. A system for a motor vehicle, comprising: a sensor apparatus comprising a sensor that is configured for determining motor vehicle data and/or driving data of the motor vehicle, and an evaluation unit including an emotion determination unit, wherein the emotion determination unit is configured to assess emotions of a driver of the motor vehicle based on sensor signals transmitted by the sensor.
2. The system according to claim 1, further comprising a mobile device, wherein the mobile device comprises the sensor apparatus and the evaluation unit.
3. The system according to claim 2, wherein the mobile device is connected for signal exchange to a readout interface of the motor vehicle, wherein the motor vehicle data can be accessed via the readout interface.
4. The system according to claim 1, wherein a control unit of the motor vehicle comprises the sensor apparatus and the evaluation unit.
5. The system according to claim 1, wherein the sensor is a GPS sensor, wherein the driving data of the motor vehicle can be determined by the sensor signals from the GPS sensor and up-to-date map material, wherein the emotions of the driver can be assessed by way of the emotion determination unit of the evaluation unit on the basis of the driving data of the motor vehicle.
6. The system according to claim 1, wherein the sensor apparatus comprises a camera that is configured to be focused on the driver for directly capturing the emotions of the driver, wherein the camera is connected for signal exchange with the evaluation unit and wherein the emotions of the driver of the motor vehicle are assessed on the basis of the sensor signals of the sensor and supplemented by images from the camera.
7. The system according to claim 1, further comprising an input device that is connected for signal exchange with the evaluation unit and is configured such that information about the driver can be added.
8. The system according to claim 1, wherein the emotions of the driver can be determined by means of artificial intelligence, wherein the emotion determination unit comprises an emotion assessment model which is trained with training data that contains a range of sensor signals of the sensor and the resulting motor vehicle data and/or driving data as well as different emotion levels, and wherein the emotions of the driver are assessed from sensor signals of the sensor by way of the emotion assessment model.
9. A method for assessing the emotions of a driver of a motor vehicle using the system according to claim 1.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0025] An exemplary embodiment of the invention will be explained in more detail with reference to the drawings.
[0026]
[0027]
[0028]
DETAILED DESCRIPTION OF THE INVENTION
[0029]
[0030] The system 12 includes a sensor apparatus 19 comprising a sensor 20, which is configured as a GPS sensor. The system 12 further comprises an evaluation unit 22, which is connected to the sensor 20 via a data line 21 and is used to evaluate the sensor signals of the sensor 20. The system 12 also comprises a camera 24 and an input device 30. The camera 24 is mounted on the steering wheel 16 and is focused on the face of the driver 14 such that the face of the driver 14 is continuously captured by the camera 24. The camera 24 is connected to the evaluation unit 22 via a data line 25, wherein the images from the camera 24 are also evaluated by the evaluation unit 22. The input device 30 is used to input information, which is likewise processed in the evaluation unit 22.
[0031]
[0032]
[0033] In the first step 40, the location of the motor vehicle 10 is continuously ascertained by means of the GPS sensor 20 while driving. This information is used to determine the road conditions, the current weather and the current traffic conditions with the aid of accessible data; for this purpose, the evaluation unit 22 is connected to a corresponding device via a mobile network. The current vehicle speed and the vehicle acceleration are likewise determined on the basis of the sensor signals from the GPS sensor 20. The face of the driver 14 is also continuously captured by the camera 24 while the motor vehicle 10 is being driven, and data about the driver 14, for example the mood before driving and the age of the driver 14, is entered by means of the input device 30.
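The patent gives no implementation details for deriving speed and acceleration from the GPS sensor 20. As a purely illustrative sketch (function and variable names are assumptions, not part of the disclosure), successive timestamped GPS fixes could be converted to speed and acceleration via a great-circle distance:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two GPS fixes (haversine formula)."""
    r = 6371000.0  # mean Earth radius in metres
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

def speed_and_acceleration(fixes):
    """Derive speeds (m/s) and accelerations (m/s^2) from timestamped GPS fixes.

    `fixes` is a list of (timestamp_s, lat_deg, lon_deg) tuples, in time order.
    Returns one speed per pair of fixes and one acceleration per pair of speeds.
    """
    speeds, accels = [], []
    for (t0, la0, lo0), (t1, la1, lo1) in zip(fixes, fixes[1:]):
        dt = t1 - t0
        v = haversine_m(la0, lo0, la1, lo1) / dt
        if speeds:
            accels.append((v - speeds[-1]) / dt)
        speeds.append(v)
    return speeds, accels
```

In practice, raw GPS fixes would additionally be filtered (e.g. smoothed) before differentiation, since differencing amplifies position noise.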
[0034] In a second step 42, the sensor data of the sensor 20 with the associated motor vehicle data and driving data, the images from the camera 24 and the inputs made via the input device 30 are processed, and the processed data is fed into the determination of the emotions of the driver 14 in the third step 44. The determination or assessment of the emotions of the driver 14 is carried out by an emotion assessment model 50, which is implemented in an emotion determination unit 23 of the evaluation unit 22. The data from the camera 24 is used only to confirm the emotions assessed by the emotion assessment model 50.
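The disclosure does not specify how the processed data is represented before it reaches the emotion assessment model 50. One minimal sketch, with entirely hypothetical field names and encodings, is to gather the second-step outputs into a record and flatten it into a numeric feature vector:

```python
from dataclasses import dataclass

@dataclass
class DrivingContext:
    """Processed data from the second step 42; all field names are illustrative."""
    speed_mps: float          # derived from GPS sensor 20
    accel_mps2: float         # derived from GPS sensor 20
    traffic_density: float    # 0..1, from accessible traffic data
    rain: bool                # current weather at the GPS position
    driver_age: int           # entered via input device 30
    mood_before_driving: str  # entered via input device 30

def to_feature_vector(ctx: DrivingContext) -> list:
    """Flatten the processed data into the numeric vector a model could consume."""
    moods = {"calm": 0.0, "neutral": 0.5, "stressed": 1.0}
    return [ctx.speed_mps, ctx.accel_mps2, ctx.traffic_density,
            1.0 if ctx.rain else 0.0, float(ctx.driver_age),
            moods.get(ctx.mood_before_driving, 0.5)]
```

A real system would need a fixed, documented feature order shared between training and inference; this sketch only shows the shape of such a mapping.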
[0035] The result obtained in the third step 44 is the mood or emotion of the driver 14 at a specific point in time. In the fourth step 46, this result is output and can be used further, in particular for adapting the environment of the driver 14 or for adapting the control of a drive train of the motor vehicle 10 to the emotional situation of the driver 14. For example, the lighting of the interior of the motor vehicle, the music played by an audio system or even the driving dynamics of the motor vehicle could be adapted to the emotional situation of the driver 14.
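The adaptations named above (interior lighting, music, driving dynamics) could be sketched as a simple lookup from the assessed emotion to cabin and drivetrain settings. The emotion labels and setting values below are assumptions for illustration only:

```python
def adapt_environment(emotion: str) -> dict:
    """Map an assessed emotion to illustrative cabin and drivetrain settings."""
    presets = {
        "anger":  {"ambient_light": "soft_blue", "music": "calming", "drive_mode": "comfort"},
        "joy":    {"ambient_light": "warm",      "music": "upbeat",  "drive_mode": "dynamic"},
        "stress": {"ambient_light": "dimmed",    "music": "calming", "drive_mode": "comfort"},
    }
    # Fall back to neutral settings for emotions without a dedicated preset.
    return presets.get(emotion, {"ambient_light": "neutral", "music": "off",
                                 "drive_mode": "comfort"})
```

A production system would likely smooth over successive assessments rather than switching settings on every single prediction.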
[0036] The emotion assessment model 50, which is used to assess the emotions of the driver 14 and is implemented in the emotion determination unit 23, is created by training an algorithm 54 with training data 52, wherein the algorithm 54 recognizes patterns, correlations, dependencies and hidden structures in the training data 52 and independently creates program code. This machine learning produces a statistically optimized emotion assessment model 50. The training data 52 includes a wide range of driving data and motor vehicle data affecting the emotions of the driver 14, such as the weather, the density of traffic, the route, the driving speed, the age of the driver 14, the mood of the driver 14 as a result of driving and different facial expressions of different drivers 14. It also includes, as a target variable, the different emotions corresponding to the driving data and the motor vehicle data, such as anger or joy.
[0037] Structural embodiments other than the described embodiments, which fall within the scope of protection of the claims, are possible as well.