METHOD FOR DETERMINING KINETOSIS

20230074207 · 2023-03-09

Abstract

The invention relates to a method for determining kinetosis in a vehicle user of a vehicle during at least one travel event in which at least one body part of the vehicle user is monitored, as a result of which image data are generated. Driving dynamics of the vehicle are monitored as the vehicle is being driven, as a result of which driving dynamics data are generated for every travel event while the vehicle is in motion. The image data are evaluated to determine the formation of sweat on the at least one body part of the vehicle user, as a result of which approximated electrodermal activity data are generated. The driving dynamics data are associated with the approximated electrodermal activity data, as a result of which the kinetosis of the vehicle user in at least one of the travel events is determined.

Claims

1. A method for determining kinetosis in a vehicle user of a vehicle during at least one travel event, wherein at least one body part of the vehicle user is monitored, by which image data are generated, driving dynamics of the vehicle are monitored during a journey of the vehicle, by which driving dynamics data are generated for each travel event during the journey, the image data are evaluated to ascertain perspiration on the at least one body part of the vehicle user, by which approximated electrodermal activity data are generated, the driving dynamics data are linked to the approximated electrodermal activity data, by which the kinetosis of the vehicle user during at least one of the travel events is determined.

2. The method as claimed in claim 1, wherein the monitoring of the at least one body part is carried out by means of a camera.

3. The method as claimed in claim 1, wherein the driving dynamics of the vehicle are monitored by means of a driving dynamics sensor.

4. The method as claimed in claim 3, wherein the driving dynamics sensor is formed as an acceleration sensor and/or as a gyro sensor.

5. The method as claimed in claim 1, wherein if kinetosis is determined during the at least one travel event, the at least one travel event is stored as kinetosis-triggering.

6. The method as claimed in claim 5, wherein during a repetition of the at least one travel event or during a new travel event, which is comparable to the at least one travel event, the driving dynamics of the vehicle are adapted, so that reduced kinetosis or no kinetosis is triggered.

7. An evaluation device for a vehicle, wherein the evaluation device is configured to be connected to a camera of the vehicle and to a driving dynamics sensor of the vehicle, wherein the evaluation device has means to carry out the method as claimed in any one of the preceding claims.

8. A computer program product comprising commands which, upon execution of the program by an evaluation device as claimed in claim 7, cause the evaluation device to carry out the method as claimed in claim 1.

9. A vehicle having an evaluation device as claimed in claim 7, a camera, and a driving dynamics sensor, wherein the evaluation device is connected to the camera and the driving dynamics sensor, wherein the vehicle is configured to carry out automated functions.

Description

[0028] FIG. 1 shows a schematic illustration of a vehicle according to one exemplary embodiment,

[0029] FIG. 2 shows a schematic illustration of a forehead of the vehicle user of the vehicle from FIG. 1 without kinetosis,

[0030] FIG. 3 shows a schematic illustration of the forehead of the vehicle user from FIG. 2 with kinetosis,

[0031] FIG. 4 shows a schematic illustration of a method which is carried out by the vehicle from FIG. 1.

[0032] FIG. 1 shows a schematic illustration of a vehicle 1 according to one exemplary embodiment. In the vehicle 1, a vehicle user 2 is located in the passenger compartment, wherein the vehicle user 2 travels with the vehicle 1. The vehicle 1 is capable of carrying out automated functions, for example, driving autonomously. The vehicle 1 travels along a route, wherein during this journey a travel event B occurs. This travel event B can be, for example, cornering or an acceleration.

[0033] The vehicle 1 has an evaluation device 5, a camera 3, and a driving dynamics sensor 7. The evaluation device 5 is connected to the camera 3, so that a data and signal exchange can take place. The connection can be wireless or wired. The evaluation device 5 is likewise connected to the driving dynamics sensor 7, so that a data and signal exchange can take place. This connection can also be wireless or wired.

[0034] The camera 3 is arranged in the vehicle interior, more precisely in the passenger compartment of the vehicle 1. The camera 3 is aligned and arranged so that it can at least partially acquire the vehicle user 2. The camera 3 thus acquires a body part 4 of the vehicle user 2, the forehead here. The body part 4 of the vehicle user 2 is monitored by means of the camera 3 for reflections, which are caused by perspiration 6 on the forehead of the vehicle user 2. Image data, which are evaluated by the evaluation device 5, are generated by the monitoring by means of the camera 3. The image data are therefore passed on to the evaluation device 5.

[0035] The evaluation device 5 evaluates the image data of the camera 3 to infer the strength of the perspiration 6. Strongly acquired reflections are evaluated by the evaluation device 5 as strong perspiration 6 on the body part 4. Weakly acquired reflections are evaluated by the evaluation device 5 as weak or nonexistent perspiration 6 on the body part 4. For example, the evaluation device 5 can make use of a trained artificial neural network to evaluate the image data with respect to the perspiration. The evaluation device 5 approximates the electrodermal activity data EDA starting from the determined perspiration 6 or the degree of the determined perspiration 6. This is carried out, for example, starting from a database in which electrodermal activity data EDA are linked to data on the perspiration 6. If the approximated electrodermal activity data EDA are high, this means that the vehicle user 2 has symptoms of kinetosis. If the approximated electrodermal activity data EDA are low, this means that the vehicle user 2 has no symptoms of kinetosis.
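The approximation described in this paragraph can be sketched as follows. This is a minimal illustrative sketch, not the claimed implementation: the function names, the thresholds, and the lookup table standing in for the database of paragraph [0035] are all assumptions.

```python
# Hypothetical sketch of the perspiration-to-EDA approximation:
# reflection intensity -> degree of perspiration -> approximated EDA.
# All names, thresholds, and table values are illustrative assumptions.

def reflection_intensity(image_pixels):
    """Mean brightness of the monitored skin region, used here as a
    crude stand-in for the specular reflection caused by perspiration."""
    return sum(image_pixels) / len(image_pixels)

# Illustrative lookup table linking a degree of perspiration to
# approximated electrodermal activity data (EDA), standing in for the
# database mentioned in the description.
EDA_LOOKUP = {"none": 0.05, "weak": 0.2, "strong": 0.8}

def approximate_eda(image_pixels, weak_threshold=0.3, strong_threshold=0.6):
    """Map acquired reflections to approximated EDA values."""
    intensity = reflection_intensity(image_pixels)
    if intensity >= strong_threshold:
        degree = "strong"   # strong reflections -> strong perspiration
    elif intensity >= weak_threshold:
        degree = "weak"
    else:
        degree = "none"
    return EDA_LOOKUP[degree]
```

A high returned value would then be read as indicating symptoms of kinetosis, a low value as their absence, mirroring the last two sentences of the paragraph.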

[0036] The driving dynamics sensor 7, which can be formed, for example, as an acceleration sensor or as a gyro sensor, ascertains the driving dynamics for each travel event B during the journey of the vehicle 1, by which driving dynamics data are generated. For example, the driving dynamics sensor 7 ascertains the positive and/or the negative acceleration during the travel event B. The ascertained driving dynamics data are passed on to the evaluation device 5.
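The per-event aggregation of sensor readings described above might look like the following sketch; the data structure and the choice of peak values are assumptions, since the description only states that positive and/or negative acceleration is ascertained.

```python
# Illustrative sketch: summarize acceleration-sensor samples recorded
# during one travel event B into driving dynamics data F.
# The dictionary layout is an assumption, not the claimed format.

def dynamics_for_event(accel_samples):
    """Return peak positive and negative acceleration for a travel
    event, as the description says both may be ascertained."""
    positive = max((a for a in accel_samples if a > 0), default=0.0)
    negative = min((a for a in accel_samples if a < 0), default=0.0)
    return {"max_accel": positive, "max_decel": negative}
```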

[0037] The evaluation device 5 subsequently links the driving dynamics data to the approximated electrodermal activity data EDA. The kinetosis of the vehicle user 2 during the travel event B is thus determined. It can thus be ascertained which travel events B trigger kinetosis in the vehicle user 2. It can thus be determined whether and to what extent acceleration, braking, turning, pitching, rolling, or yawing, which are all related to specific travel events B, trigger kinetosis in the vehicle user 2.
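The linking step of paragraph [0037] can be sketched as pairing each travel event's dynamics data with its approximated EDA and flagging the kinetosis-triggering events. The decision threshold is an illustrative assumption; the patent does not specify how the link is evaluated.

```python
# Hedged sketch of linking driving dynamics data F with approximated
# EDA per travel event B. The threshold is an assumed decision
# boundary, not part of the described method.

EDA_KINETOSIS_THRESHOLD = 0.5  # assumption for illustration only

def link_events(events):
    """events: list of (event_label, dynamics_data, approximated_eda).
    Returns (label, dynamics_data) for events judged kinetosis-triggering,
    e.g. so they can be stored as such (cf. claim 5)."""
    triggering = []
    for label, dynamics, eda in events:
        if eda >= EDA_KINETOSIS_THRESHOLD:
            triggering.append((label, dynamics))
    return triggering
```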

[0038] FIG. 2 shows a schematic illustration of a forehead of the vehicle user 2 of the vehicle 1 from FIG. 1 without kinetosis. The forehead of the vehicle user 2 represents the body part 4 which is monitored by means of the camera 3. Alternatively thereto, instead of the forehead or simultaneously thereto, hands of the vehicle user 2, cheeks of the vehicle user 2, a neck of the vehicle user 2, an upper lip of the vehicle user 2, a throat of the vehicle user 2, a chest area of the vehicle user 2, or the like could be monitored as the body part 4.

[0039] It can be seen clearly here that the forehead of the vehicle user 2 only has minor perspiration 6 during the present travel event. This minor perspiration 6 only causes a slight reflection, which is acquired by the camera 3. Therefore, low electrodermal activity data EDA are approximated by the evaluation device. The vehicle user 2 thus does not have kinetosis.

[0040] FIG. 3 shows a schematic illustration of the forehead of the vehicle user 2 from FIG. 2 with kinetosis. The same vehicle user 2 is shown here as in FIG. 2. The forehead of the vehicle user 2 again represents the body part 4 which is monitored by means of the camera 3. It can be seen clearly here that the forehead of the vehicle user 2 has strong perspiration 6 during the present travel event. This strong perspiration 6 causes a strong reflection, which is acquired by the camera 3. High electrodermal activity data EDA are thus approximated by the evaluation device. The vehicle user 2 thus has kinetosis.

[0041] FIG. 4 shows a schematic illustration of a method V which is executed by the vehicle from FIG. 1. The method V runs during the journey of the vehicle during a travel event B.

[0042] In a first step 101, the body part of the vehicle user is monitored by means of the camera, by which image data D are generated. These image data D comprise data on the reflection of the body part of the vehicle user.

[0043] In a second step 102, which runs in parallel, for example, driving dynamics of the vehicle are monitored during a journey of the vehicle by means of the driving dynamics sensor, by which driving dynamics data F are generated for the travel event B.


[0044] In a third step 103 following first step 101, the image data D are evaluated by means of the evaluation device to ascertain perspiration on the body part of the vehicle user, by which approximated electrodermal activity data EDA are generated. The perspiration is ascertained starting from the acquired reflection of the body part. For this purpose, the evaluation device can make use, for example, of a trained artificial neural network. Third step 103 can run in parallel to second step 102, for example. The electrodermal activity data EDA are approximated starting from the strength of the perspiration. A database can be used for this purpose, for example, in which various electrodermal activity data EDA are linked to data on the perspiration or degrees of perspiration.

[0045] In a last fourth step 104, the driving dynamics data F are linked to the approximated electrodermal activity data EDA, by which the kinetosis K of the vehicle user during the travel events B is determined. It can thus be ascertained which travel events B trigger kinetosis K in the vehicle user.
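The four steps 101 to 104 of method V can be condensed into one pass per travel event, sketched below. Every function body is an illustrative placeholder under the same assumptions as above (brightness as a reflection proxy, a fixed EDA threshold); the patent leaves these details open.

```python
# One illustrative pass of method V for a single travel event B.
# Steps 101-104 follow the description; all concrete computations
# are placeholder assumptions.

def method_v(camera_frame, sensor_samples):
    # Step 101: monitor the body part by camera, generating image data D.
    image_data = camera_frame  # e.g. pixel values of the forehead region

    # Step 102 (in parallel): monitor driving dynamics, generating data F.
    dynamics_data = max(sensor_samples)  # e.g. peak acceleration (assumed)

    # Step 103: evaluate image data D for perspiration and approximate
    # electrodermal activity data EDA (here: mean brightness as proxy).
    eda = sum(image_data) / len(image_data)

    # Step 104: link F with EDA to determine kinetosis K for the event.
    kinetosis = eda >= 0.5  # assumed decision threshold
    return {"F": dynamics_data, "EDA": eda, "K": kinetosis}
```

Because the method runs continuously during the journey, such a pass would be repeated for every travel event B.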

[0046] The method V runs continuously during the journey of the vehicle. This means that the method V is continued when a travel event B is ended, but the vehicle moves further, or when kinetosis K has been determined in the vehicle user.

[0047] The examples shown here are only selected by way of example. For example, another body part of the vehicle user or multiple body parts of the vehicle user can be monitored simultaneously by means of the camera. For example, more than one vehicle user can use the vehicle. In this case, all vehicle users are monitored to be able to determine kinetosis in them.

LIST OF REFERENCE SIGNS

[0048] 1 vehicle [0049] 2 vehicle user [0050] 3 camera [0051] 4 body part [0052] 5 evaluation device [0053] 6 perspiration [0054] 7 driving dynamics sensor [0055] 101 first step [0056] 102 second step [0057] 103 third step [0058] 104 fourth step [0059] B travel event [0060] D image data [0061] EDA electrodermal activity data [0062] F driving dynamics data [0063] K kinetosis [0064] V method