METHOD AND CONTROL UNIT FOR OPERATING A SELF-DRIVING CAR
20190337521 · 2019-11-07
CPC classification
B60W2400/00 · PERFORMING OPERATIONS; TRANSPORTING
B60W50/085 · PERFORMING OPERATIONS; TRANSPORTING
B60W50/0098 · PERFORMING OPERATIONS; TRANSPORTING
B60W10/18 · PERFORMING OPERATIONS; TRANSPORTING
B60W50/082 · PERFORMING OPERATIONS; TRANSPORTING
G05D1/0088 · PHYSICS
B60W40/08 · PERFORMING OPERATIONS; TRANSPORTING
B60W10/04 · PERFORMING OPERATIONS; TRANSPORTING
B60W50/10 · PERFORMING OPERATIONS; TRANSPORTING
B60W2420/403 · PERFORMING OPERATIONS; TRANSPORTING
A61B5/4227 · HUMAN NECESSITIES
A61B5/4803 · HUMAN NECESSITIES
B60W2540/221 · PERFORMING OPERATIONS; TRANSPORTING
A61B5/0205 · HUMAN NECESSITIES
B60W2540/22 · PERFORMING OPERATIONS; TRANSPORTING
International classification
B60W40/08 · PERFORMING OPERATIONS; TRANSPORTING
A61B5/00 · HUMAN NECESSITIES
A61B5/0205 · HUMAN NECESSITIES
Abstract
The invention relates to a control unit for autonomous driving (18) comprising a processor (40) configured to rate an executed or upcoming driving maneuver (M1-M4) on the basis of physiological measurements.
Claims
1. A control device for autonomous driving comprising a processor configured to rate an executed or upcoming driving maneuver (M1-M4) based on physiological measurements.
2. The control device for autonomous driving according to claim 1, wherein the physiological measurements comprise a heart rate, skin conductance, oxygen saturation in blood, blood pressure, and/or adrenaline concentration of a vehicle occupant.
3. The control device for autonomous driving according to claim 1, wherein the physiological measurements comprise information obtained through image analysis or speech analysis.
4. The control device for autonomous driving according to claim 1, wherein the processor is configured to create a user profile based on rated driving maneuvers.
5. The control device for autonomous driving according to claim 4, wherein the processor is configured to rate an upcoming driving maneuver based on the created user profile.
6. The control device for autonomous driving according to claim 1, wherein the processor is configured to rate numerous variations of a driving maneuver, and select an optimal variation based on the rating.
7. The control device for autonomous driving according to claim 1, wherein the physiological measurements are given different priorities.
8. The control device for autonomous driving according to claim 1, wherein the processor is configured to identify relative changes in the physiological measurements, and rate a driving maneuver based on the relative changes in the physiological measurements.
9. The control device for autonomous driving according to claim 1, wherein the processor is configured to check whether a change in the physiological measurements can be attributed to a driving maneuver or some other reason, and to then rate the driving maneuver when the driving maneuver can be assumed to be the reason for the change.
10. A method for autonomous driving, in which an executed or upcoming driving maneuver is rated on the basis of physiological measurements.
11. The control device for autonomous driving according to claim 2, wherein the processor is configured to create a user profile based on rated driving maneuvers.
12. The control device for autonomous driving according to claim 2, wherein the processor is configured to rate numerous variations of a driving maneuver, and select an optimal variation based on the rating.
13. The control device for autonomous driving according to claim 2, wherein the physiological measurements are given different priorities.
14. The control device for autonomous driving according to claim 2, wherein the processor is configured to identify relative changes in the physiological measurements, and rate a driving maneuver based on the relative changes in the physiological measurements.
15. The control device for autonomous driving according to claim 2, wherein the processor is configured to check whether a change in the physiological measurements can be attributed to a driving maneuver or some other reason, and to then rate the driving maneuver when the driving maneuver can be assumed to be the reason for the change.
16. The control device for autonomous driving according to claim 3, wherein the processor is configured to create a user profile based on rated driving maneuvers.
17. The control device for autonomous driving according to claim 3, wherein the processor is configured to rate numerous variations of a driving maneuver, and select an optimal variation based on the rating.
18. The control device for autonomous driving according to claim 3, wherein the physiological measurements are given different priorities.
19. The control device for autonomous driving according to claim 3, wherein the processor is configured to identify relative changes in the physiological measurements, and rate a driving maneuver based on the relative changes in the physiological measurements.
20. The control device for autonomous driving according to claim 3, wherein the processor is configured to check whether a change in the physiological measurements can be attributed to a driving maneuver or some other reason, and to then rate the driving maneuver when the driving maneuver can be assumed to be the reason for the change.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0029] Embodiments shall now be described by way of example and with reference to the attached drawings.
DETAILED DESCRIPTION
[0049] In the example shown in
[0050] The autonomous vehicle 2 also comprises a control unit 14 (ECU 2) that controls a braking system. The braking system relates to the components that enable braking of the vehicle.
[0051] The autonomous vehicle 2 furthermore comprises a control unit 16 (ECU 3) that controls a drive train. The drive train relates to the drive components of the vehicle. The drive train can comprise a motor, a transmission, a drive/propulsion shaft, a differential and an axle drive.
[0052] The autonomous vehicle 2 also comprises a control unit for autonomous driving 18 (ECU 4). The control unit for autonomous driving 18 is configured to drive, steer and park the autonomous vehicle 2 such that it is operated entirely or partially without the influence of a human driver.
[0053] The control unit for autonomous driving 18, which is illustrated in
[0054] The vehicle sensor system of the autonomous vehicle 2 also comprises a satellite navigation unit 24 (GPS unit). It should be noted that in the context of the present invention, GPS stands for any global navigation satellite system (GNSS), such as GPS, A-GPS, Galileo, GLONASS (Russia), Compass (China), IRNSS (India), etc.
[0055] When an operating state is activated for autonomous driving by the control system or the driver, the control unit 18 for autonomous driving determines parameters for the autonomous operation of the vehicle (e.g. target speed, target torque, distance to the vehicle in front, steering procedure, etc.) on the basis of available data regarding a predefined route and vehicle operating data recorded by means of vehicle sensors that are sent to the control unit 18 from the control units 12, 14, 16.
[0056] The autonomous vehicle 2 also comprises one or more environment sensors 20 that are configured to record the environment of the vehicle, wherein the environment sensors 20 are installed on the vehicle and record objects or states in the environment of the vehicle independently, i.e. without outside information signals. These include cameras, radar sensors, lidar sensors, ultrasound sensors, etc. The environment sensors 20 can be located inside or outside the vehicle (e.g. on the outer surface of the vehicle).
[0057] The autonomous vehicle 2 also comprises one or more interior cameras 21, which provide image data on the occupants of the vehicle 2. Based on the image data on the vehicle occupants, the control unit for autonomous driving 18 derives further information regarding the well-being of the vehicle occupants, and incorporates this knowledge in the evaluation of driving maneuvers, in order to adapt the manner of driving to the well-being, or a profile, of the occupant. If it is determined, for example, that a vehicle occupant has made an angry gesture, or has kept his eyes closed for an extended period, conclusions can be drawn regarding a state of fear or stress. The skin temperature, oxygen saturation or blood pressure of a vehicle occupant can be determined from the image data with processes known to the person skilled in the art.
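As a hedged illustration of such image-based inference, the following sketch flags prolonged eye closure as a possible sign of drowsiness or impaired well-being; the function name, frame rate, and threshold are illustrative assumptions, not details from the patent:

```python
def drowsiness_flag(eye_closed_frames: int, fps: float, threshold_s: float = 2.0) -> bool:
    """Flag a possible drowsy state when the occupant's eyes have been
    closed for longer than a threshold (all values illustrative)."""
    # Convert the run of eyes-closed camera frames into seconds.
    return eye_closed_frames / fps >= threshold_s

# 90 frames at 30 fps correspond to 3 s of closed eyes, which is flagged.
print(drowsiness_flag(eye_closed_frames=90, fps=30))
```

In a real system this check would be one of several image-analysis cues, combined with the physiological measurements described below.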
[0058] The autonomous vehicle 2 also comprises a user interface 26 (HMI=Human Machine Interface), which enables a vehicle occupant to interact with one or more vehicle systems. This user interface 26 can comprise an electronic display, for example (e.g. a GUI=graphical user interface) for outputting a graphic, symbols and/or content in text form, and an input interface for receiving an input (e.g. a manual input, speech input, and inputs through gestures, head or eye movements). The input interface can comprise keyboards, switches, touchscreens, eye trackers, etc. By way of example, a personal user profile can be set by a user via the user interface 26. The selection can take place, for example, via a dropdown menu, by means of which the user can select a user profile from a predefined list of user profiles by tapping a touchscreen, or by pressing keys on the user interface. The user can also transmit user authentication data to a server or a cloud, on which numerous user profiles are provided. Based on the user authentication data, the relevant configuration data for the vehicle can be selected from the user profile, and transmitted to the vehicle via a communication link.
[0059] The autonomous vehicle 2 also comprises a microphone 23, and means for speech recognition and speech analysis. Speech analysis while driving likewise allows conclusions to be drawn regarding the well-being of the vehicle occupants.
[0060] The autonomous vehicle 2 also comprises a communication interface 19 for mobile networks. This comprises, e.g., a SIM card, with which the control unit can communicate via a mobile network, e.g. UMTS or LTE. The communication interface 19 can enable communication with a cloud, for example.
[0061] The autonomous vehicle 2 also comprises a communication interface 22 for an external user device, e.g. a wireless WLAN or Bluetooth interface, or a hard-wired USB connection. The communication interface 22 for an external user device is used for connecting to user devices such as smartphones, smartwatches, etc.
[0062] The communication interfaces 22 and/or 19 can also provide interfaces for the exchange of information and data between motor vehicles (C2C, car-to-car communication) or between vehicles and a traffic infrastructure (C2X, X2C).
[0063] As is described below in reference to exemplary embodiments, the control unit 18 for autonomous driving continuously receives physiological measurements regarding one or more vehicle occupants from a portable user device (1 in
[0065] The portable user device 1 also comprises one or more biosensors 31, which are configured to record physiological measurements regarding a vehicle occupant. These preferably comprise one or more of the following sensors: a sensor for determining the heart rate and the blood oxygen saturation, wherein these sensors are preferably formed by an optical sensor (e.g. a photoplethysmography sensor), a sensor for measuring the electrical conductance of the skin, in particular the electrodermal activity, a sensor for measuring the temperature or a heat flow, a sensor for measuring respiration, a sensor for monitoring blood pressure, a sensor for monitoring muscle tone, etc.
[0066] The sensors can be in a single portable user device 1, which can be worn, for example, as a bracelet, a finger clip, or on the ear. Alternatively, numerous user devices can provide physiological measurements to the control unit for autonomous driving.
[0067] The portable user device 1 comprises a communication interface 36, e.g. a wireless transmitter, with which the recorded data can be transmitted wirelessly, in particular via UMTS, WLAN or Bluetooth, to the control unit for autonomous driving 18 (see
[0068] The portable user device 1 also comprises a power source 37. The power source 37 is preferably in the form of a battery, and supplies the components connected thereto with electrical energy. The portable user device 1 also has a user interface 35 (UI), by means of which the user can read or input information. This user interface 35 can comprise a display, for example, in particular an LED monitor, which can indicate a measured physiological state to the wearer of the portable user device 1.
[0070] The processor of the control unit for autonomous driving 18 is configured to evaluate an executed or upcoming driving maneuver by assigning a rating to it. The rating can be defined, e.g., by a parameter, a numerical value, etc. By way of example, various rating values can be predefined in the control unit, wherein each rating value is assigned to a state of well-being of a vehicle occupant (see
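A minimal sketch of such predefined rating values, assuming an illustrative mapping from a classified well-being state to a numeric rating; the state names and values are assumptions, not taken from the patent:

```python
# Illustrative predefined rating values, each assigned to a
# well-being state of a vehicle occupant (names/values assumed).
RATING_BY_STATE = {
    "relaxed": 5,
    "neutral": 3,
    "tense": 2,
    "stressed": 1,
}

def rate_maneuver(occupant_state: str) -> int:
    """Assign the predefined rating value for the occupant's
    well-being state to an executed or upcoming driving maneuver."""
    return RATING_BY_STATE[occupant_state]

print(rate_maneuver("relaxed"))  # highest rating for a calm occupant
```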
[0071] The control unit for autonomous driving 18 also comprises a memory and an input/output interface. The memory can be composed of one or more non-volatile computer-readable media, and comprises at least one program storage region and one data storage region. The program storage region and the data storage region can comprise combinations of various types of memory, e.g. a read-only memory 43 (ROM) and a random access memory 42 (RAM) (e.g. dynamic RAM (DRAM), synchronous DRAM (SDRAM), etc.). The control unit for autonomous driving 18 can also comprise an external memory 44, e.g. an external hard disk drive (HDD), a flash memory drive, or a non-volatile solid state drive (SSD).
[0072] The control unit for autonomous driving 18 also comprises a communication interface 45, via which the control unit can communicate with the vehicle communication network (28 in
[0076] The well-being of a vehicle occupant can also be expressed by an increased adrenaline level, a specific posture, a change in skin conductance, a change in oxygen saturation in blood, etc.
[0077] Numerous types of measurements can also be recorded, and a conclusion regarding the well-being of the vehicle occupant can be drawn from a combined view of all of the measurements. By way of example, predefined multidimensional tables can be stored for this, or rating values can be calculated analytically from the measurements. Different priorities can be assigned to the individual measurement variables when rating values are derived from numerous physiological measurements. By way of example, the measurement of the heart rate can be given a higher priority than the measurement of skin conductance.
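The priority-weighted combination of several measurements could be sketched as follows, assuming normalized stress indicators in [0, 1] and illustrative weights (a lower combined score means a better-received maneuver); the function and parameter names are assumptions:

```python
def combined_rating(measurements: dict, weights: dict) -> float:
    """Combine several physiological measurements into one score.

    measurements: normalized stress indicators in [0, 1]
        (0 = fully relaxed, 1 = maximally stressed).
    weights: priorities per measurement variable; e.g. heart rate
        can be weighted higher than skin conductance, as in the text.
    Returns a weighted mean stress score in [0, 1].
    """
    total_weight = sum(weights[k] for k in measurements)
    return sum(measurements[k] * weights[k] for k in measurements) / total_weight

score = combined_rating(
    {"heart_rate": 0.8, "skin_conductance": 0.2},
    {"heart_rate": 2.0, "skin_conductance": 1.0},  # heart rate prioritized
)
print(round(score, 3))  # (0.8*2 + 0.2*1) / 3 = 0.6
```

A predefined multidimensional table, as also mentioned in the text, would replace the analytic formula with a lookup.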
[0082] In the above example, the driving maneuvers M1a-M1e are determined solely by the transverse acceleration and the context, cornering. In alternative exemplary embodiments, numerous vehicle operating parameters that are recorded by the vehicle sensors can also be drawn on to define a driving maneuver.
[0084] In the above example, the driving maneuvers M2a-M2g are determined solely by the speed and the context, highway driving. In alternative exemplary embodiments, numerous vehicle operating parameters can also be drawn on to define a driving maneuver. Moreover, the traffic conditions can also be taken into account. Thus, parameters for the states of other vehicles in the environment can be drawn on, for example, in particular their positions, speeds or directions of travel, dangerous situations, etc., which are obtained via the vehicle-based communication system (e.g. a car-to-X interface and/or an X-to-car interface). Furthermore, the traffic congestion can be estimated using numerous sensors, e.g. radar, lidar and optical cameras, which may already be available on a vehicle as parts of other systems.
[0086] The measurement for the heart rate can also be a mean value of numerous measurements that have been taken at the same time as a specific driving maneuver.
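Averaging heart-rate samples over the time window of a specific driving maneuver might look like this; the timestamps, sample values, and helper name are illustrative:

```python
def mean_heart_rate_during(samples, start: float, end: float) -> float:
    """Average the (timestamp, bpm) heart-rate samples that were
    taken at the same time as a specific driving maneuver."""
    in_window = [bpm for t, bpm in samples if start <= t <= end]
    return sum(in_window) / len(in_window)

# Illustrative samples: heart rate rises during a maneuver from t=2 s to t=3 s.
samples = [(0.0, 70), (1.0, 72), (2.0, 90), (3.0, 94), (4.0, 71)]
print(mean_heart_rate_during(samples, 2.0, 3.0))  # 92.0
```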
[0087] The associated rating values VAL1 to VAL7 are thus determined by the control unit for autonomous driving from the obtained physiological measurements based on predefined ratings, which represent emotional states such as relaxed or stressed, or drowsiness, thereby evaluating the driving maneuver that has been executed. As such, the control unit for autonomous driving 18 can determine, based on the recorded physiological measurements, whether the driver is relaxed or tense, or even stressed.
[0089] In this manner, a rating of the driving maneuvers of the control unit for autonomous driving can take place on the basis of the measured physiological measurements of one or more vehicle occupants, e.g. continuously or during special training or learning phases. The ratings of the various executed driving maneuvers obtained in this manner can be used to create one or more user profiles.
[0091] In the above exemplary embodiment shown in
[0093] In step S202, the control unit for autonomous driving determines a change in the physiological measurements regarding a vehicle occupant. In step S204, the control unit for autonomous driving assigns the determined change in the physiological measurement to a predefined rating value VAL. When the physiological measurements do not change, or change only slightly, or there are no noticeable gestures or speech, the maneuver can be rated well. If the physiological measurements change to a greater extent, it is determined whether the driving maneuver is the reason for the change. In a step S206, the control unit for autonomous driving detects a driving maneuver M that correlates temporally to the determined change in the physiological measurements. In step S207, the control unit for autonomous driving checks whether there is another reason for the change in the physiological measurements. If it is determined in step S207 that there is no other reason, the control unit for autonomous driving rates the detected driving maneuver M in step S208 with the rating value VAL that is assigned to the determined change in the physiological measurements. In step S210, the control unit for autonomous driving registers the rated driving maneuver M in a user profile of the vehicle occupant. If instead it is determined in step S207 that there is another reason, the driving maneuver M and the rating VAL are discarded in step S209.
[0094] The reason can likewise be identified from gestures and speech. If the driving maneuver is the only possible reason, a corresponding rating can be defined on the basis of the change or expression. If instead it is determined with means such as image or speech analysis that the occupant is in an emotionally tense conversation, the control unit for autonomous driving can decide that the associated physiological measurements are not to be drawn on for rating driving maneuvers.
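The check described in steps S202 to S210 can be sketched as follows; the rating values, the change threshold, and the maneuver name are illustrative assumptions:

```python
def rating_of(change: float) -> str:
    # Illustrative S204 mapping: small changes rate well, large ones poorly.
    return "VAL_GOOD" if abs(change) < 10 else "VAL_BAD"

def process_change(change, maneuver, other_reason, profile):
    """Sketch of steps S202-S210: rate the temporally correlated
    maneuver only when no other reason explains the change."""
    val = rating_of(change)            # S204: assign a rating value to the change
    if maneuver is None or other_reason:
        return None                    # S207/S209: discard maneuver and rating
    profile[maneuver] = val            # S208/S210: rate and register in profile
    return val

profile = {}
process_change(change=25, maneuver="M1c", other_reason=False, profile=profile)
print(profile)  # {'M1c': 'VAL_BAD'}
```

If `other_reason` is set, e.g. because an emotionally tense conversation was detected, the maneuver and rating are discarded, as in the paragraph above.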
[0095] Through a continuous analysis of all of the maneuvers for all of the vehicle occupants, user profiles can be created with ratings for all driving maneuvers. As a result, it is possible for the control unit for autonomous driving to anticipate how specific types of people will react during specific maneuvers, i.e. how the maneuvers are received by the vehicle occupants. Occupants rate manners of driving differently. The aim can be to adapt the manner of driving such that different people always feel safe.
[0096] Based on the user profile created in this manner, the control unit for autonomous driving can then rate an upcoming driving maneuver or driving maneuver variation. The selection of the appropriate manner of driving (speeds, transverse accelerations, maneuvers: evasive maneuvers, braking, or combinations of accelerations and decelerations, position in the driving lane: left, right, middle, . . . ) is carried out in this manner by the autonomous vehicle itself.
[0098] The user profile that has been created is then used by the control unit for autonomous driving 18 to control upcoming driving maneuvers such that the driving behavior of the vehicle is optimized. The manner of driving is thus adapted to the people in the vehicle. Accordingly, when an upcoming driving maneuver of the driving maneuver category M1, cornering, is to be carried out, the control unit for autonomous driving selects the driving maneuver variation M1c, cornering with a transverse acceleration of 4 m/s², which is assigned the optimal rating value VAL4.
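Selecting the best-rated variation of an upcoming maneuver category, as in the cornering example above, might look like this; the numeric ratings in the profile are assumptions (only the fact that M1c is rated best is taken from the text):

```python
# Illustrative user profile: one rating per cornering variation
# (higher = better received by the occupant; numbers assumed).
profile = {"M1a": 1, "M1b": 3, "M1c": 7, "M1d": 4, "M1e": 2}

def select_variation(variations, profile):
    """Pick the variation of an upcoming maneuver category with the
    best rating in the occupant's user profile."""
    return max(variations, key=lambda v: profile.get(v, 0))

print(select_variation(["M1a", "M1b", "M1c", "M1d", "M1e"], profile))  # M1c
```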
[0099] The control unit 18 for autonomous driving according to the present invention can store the created user profile, and reuse it later in order to optimize the driving behavior of the vehicle.
[0101] In the above exemplary embodiment, the control unit for autonomous driving selects optimal driving maneuver variations in each case from the perspective of the well being of the occupants. In alternative exemplary embodiments, the control unit for autonomous driving can also decide between the well being of the vehicle occupants and the demands of a traffic situation, based on the traffic situation. The control unit for autonomous driving can thus decide during a passing maneuver that it is appropriate in the current traffic situation to accept a certain amount of stress for the vehicle occupant(s), in order to quickly complete the passing maneuver. In this manner, a parameter can be determined for every type of driving maneuver, which determines the extent to which the well being of the occupants should affect the selection of the driving maneuver variation.
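The per-maneuver-type parameter mentioned above, i.e. the extent to which occupant well-being should affect the selection, could be modeled as a simple blend; all scores and the weight are illustrative assumptions:

```python
def maneuver_score(comfort: float, urgency: float, comfort_weight: float) -> float:
    """Blend occupant well-being against the demands of the traffic
    situation; comfort_weight is the per-maneuver-type parameter
    (0 = traffic demands only, 1 = well-being only).
    All inputs are illustrative scores in [0, 1]."""
    return comfort_weight * comfort + (1.0 - comfort_weight) * urgency

# During a passing maneuver, some occupant stress may be accepted
# (lower comfort) if completing the maneuver quickly is urgent.
quick = maneuver_score(comfort=0.3, urgency=0.9, comfort_weight=0.4)
gentle = maneuver_score(comfort=0.9, urgency=0.2, comfort_weight=0.4)
print(quick > gentle)  # the quick pass wins under this weighting
```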
[0102] On the basis of the rating of the driving maneuver, or on the basis of information regarding the health of a vehicle occupant, certain maneuvers for specific occupants can also be replaced by other maneuvers. By way of example, the control unit for autonomous driving can detect an obstruction on the street and identify avoiding a collision with the obstacle as an upcoming driving maneuver. For this driving maneuver, the control unit for autonomous driving can identify two possible driving maneuver variations, specifically driving around the obstacle as the first driving maneuver variation, and braking before reaching the obstacle as the second driving maneuver variation. Depending on the information in the user profile, certain maneuvers can then be avoided, and replaced by other possible maneuvers. By way of example, it can be derived from the user profile that a vehicle occupant normally reacts negatively to the driving maneuver variation of driving around the obstacle, but positively to the driving maneuver variation of braking before reaching the obstacle. Consequently, the control unit for autonomous driving can select the preferred driving maneuver variation, here braking before reaching the obstacle. In this manner, the control unit for autonomous driving replaces the driving maneuver variation of driving around the obstacle with the driving maneuver variation of braking before reaching the obstacle for the vehicle occupants in question.
[0103] Information regarding the health of a vehicle occupant can also be stored in a user profile, and certain driving maneuvers can be avoided, depending on the state of health. The state of health can affect the rating of a driving maneuver. The manner of driving can be rated based on the health data, and continuously optimized with this data. If the control device for autonomous driving detects an ill vehicle occupant in the vehicle, for example, then those driving maneuvers that present a difficulty for the vehicle occupant, such as cornering with high transverse accelerations, etc., are systematically given a worse rating by the control unit for autonomous driving. In this manner, health data can have an effect on the driving manner of the vehicle. The system can thus be pre-conditioned, setting a specific driving manner when people get in the vehicle. Ill, old, or sensitive people can be brought to their destination very gently (with low transverse and longitudinal accelerations). Vehicle occupants that are stressed can be reassured by a specific driving manner that has been set and adapted to their needs. Music and lighting in the vehicle can likewise be adapted on the basis of the health data. The aim is to calm the occupants, and bring them safely and happily to their destination.
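Systematically worsening the rating of demanding maneuvers when an ill occupant is detected, as described above, might be sketched as follows; the maneuver names, penalty, and rating scale are assumptions:

```python
def adjusted_rating(base_rating: int, maneuver: str, occupant_ill: bool, penalty: int = 3) -> int:
    """Give demanding maneuvers (e.g. cornering with high transverse
    acceleration) a systematically worse rating for an ill occupant.
    Ratings, maneuver names and the penalty are illustrative."""
    demanding = {"high_lateral_corner", "hard_brake"}
    if occupant_ill and maneuver in demanding:
        return base_rating - penalty
    return base_rating

print(adjusted_rating(7, "high_lateral_corner", occupant_ill=True))   # 4
print(adjusted_rating(7, "high_lateral_corner", occupant_ill=False))  # 7
```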
[0104] According to another aspect of the invention, injured people or pregnant women in pain can be brought to a hospital as quickly as possible, using the shortest route, if the control unit for autonomous driving detects a corresponding state of health through analysis of the physiological measurements.
[0105] The ratings of driving maneuvers determined by the above exemplary embodiments, or the user profiles created on the basis thereof, can also be used to further optimize the driving manner, e.g. through the use of deep learning, artificial intelligence (AI) or machine learning. The driving manner can be adapted incrementally, i.e. in noticeable steps, or continuously, i.e. modified in small steps, or in a combination of incremental and continuous modifications.
[0106] The control unit for autonomous driving can also be configured such that the continuous optimization of the driving manner can be switched on or off by means of a user interface (HMI=human-machine interface) (user interface 26 in
[0107] The acquired data (user profile) can also be exported from the control unit for autonomous driving and used, e.g., in other vehicles. The user profile can thus also be stored in a central memory, e.g. on a server or in a cloud server. A user profile in the control unit for autonomous driving can also be synchronized with a central user profile. Data from the user profile can be sent to the manufacturer, e.g. for long-term studies and further analyses. Collected data can be used as training data for AI algorithms. The vehicle control device or the user end device (insofar as aspects thereof are outsourced to a server or the cloud) can send the data to a central unit (e.g. the manufacturer), and the manufacturer can thus draw conclusions as to how the algorithm is received in a large sample, and whether it has reached the limits of its capabilities (the vehicle should behave differently, but is unable to adapt because it has already reached the limits of the function). This has the advantage that the manufacturer can then further develop the algorithm, and offer an update for the control logic in the vehicle.
REFERENCE SYMBOLS
[0108] 1 portable user device
[0109] 2 autonomous vehicle
[0110] 12 steering system control device
[0111] 14 braking system control device
[0112] 16 drive train control device
[0113] 18 control device for autonomous driving
[0114] 19 communication interface
[0115] 20 environment sensors
[0116] 21 interior camera
[0117] 22 communication interface
[0118] 23 microphone
[0119] 24 satellite navigation unit
[0120] 26 user interface
[0121] 28 vehicle communication network
[0122] 30 portable user device processor
[0123] 31 biosensor
[0124] 32 RAM memory
[0125] 33 ROM memory
[0126] 34 memory drive
[0127] 35 user interface
[0128] 36 communication interface
[0129] 37 power source
[0130] 40 control unit for autonomous driving processor
[0131] 42 RAM memory
[0132] 43 ROM memory
[0133] 44 memory drive
[0134] 46 user interface