Communication flow from a road user to a vehicle driving in an automated manner
11126872 · 2021-09-21
Assignee
Inventors
- Philipp Notheis (Salem-Mimmenhausen, DE)
- Leschek Debernitz (Eriskirch, DE)
- Daniel Schmidt (Markdorf, DE)
CPC classification
G06V20/58
PHYSICS
B60Q1/507
PERFORMING OPERATIONS; TRANSPORTING
G06V40/10
PHYSICS
International classification
B60Q1/50
PERFORMING OPERATIONS; TRANSPORTING
Abstract
Disclosed is a training evaluation device for a vehicle having a first input interface for receiving a recording of a sign given by a road user, and a second input interface for receiving a driver control command corresponding to the sign, wherein the training evaluation device propagates an artificial neural network with the recording of the sign and the driver control command to obtain a vehicle control command in the propagation of the artificial neural network, and adjusts weighting factors such that the vehicle control command matches the driver control command, for the machine learning of a meaning of the sign. A method for training an artificial neural network, a working evaluation device for an automatically operated vehicle, a driver assistance system, and a method for recognizing a meaning of a sign and for indicating a vehicle reaction to a known meaning of this sign are also disclosed.
Claims
1. A training evaluation device for a vehicle comprising: a first input interface for obtaining a recording of a sign given by a road user; and a second input interface for obtaining a driver control command corresponding to the sign, wherein the training evaluation device is configured to feed an artificial neural network with the recording of the sign and the driver control command to obtain a vehicle control command in a propagation of the artificial neural network, and to adjust weighting factors such that the vehicle control command matches the driver control command, for a machine learning of a meaning of the sign; wherein a vehicle reaction corresponding to the vehicle control command is indicated to the road user by generating at least one of an electric, optical, or acoustic signal using a signal transmitter, wherein the signal transmitter comprises a light strip located in at least one of a front region or a rear region of the vehicle, and wherein indicating the vehicle reaction to the road user further comprises outputting the signal from the signal transmitter to a portable device worn by the road user to make the road user aware of the indication of the vehicle reaction based on the signal from the signal transmitter.
2. The training evaluation device according to claim 1, wherein the first input interface is configured to receive a visual sign comprising a gesture, a hand movement, a facial expression, or an acoustic sign from the road user.
3. The training evaluation device according to claim 1, wherein the first input interface is configured to receive a recording of at least one of a size or a facial expression of the road user, and wherein the artificial neural network is configured to obtain an age of the road user based on the at least one of the size or the facial expression, in order to adjust the vehicle control command based on the age.
4. The training evaluation device according to claim 1, wherein the first input interface comprises an interface to at least one environment detection sensor comprising at least one of an image sensor of a camera, a radar sensor, a lidar sensor, or a sound sensor, and wherein the second input interface comprises an interface to a vehicle data transfer system.
5. The training evaluation device according to claim 1, wherein the artificial neural network is a multi-layer convolutional or recurrent artificial neural network.
6. A working evaluation device for an automatically operated vehicle, comprising: an input interface configured to receive a recording of a sign given by a road user, wherein the working evaluation device is configured to propagate an artificial neural network that has been trained with a meaning of the sign with the recording of the sign to recognize the meaning of the sign, and to obtain a vehicle control command corresponding to the sign; and a first output interface for outputting the vehicle control command to the road user to indicate to the road user a vehicle reaction to a known meaning of the sign, wherein indicating the vehicle reaction to the road user comprises generating at least one of an electric, optical, or acoustic signal using a signal transmitter, wherein the signal transmitter comprises a light strip located in at least one of a front region or a rear region of the vehicle, and wherein indicating the vehicle reaction to the road user further comprises outputting the signal from the signal transmitter to a portable device worn by the road user to make the road user aware of the indication of the vehicle reaction based on the signal from the signal transmitter.
7. The working evaluation device according to claim 6, wherein the working evaluation device is configured to indicate that the vehicle will continue or stop, based on a recorded stopping sign or a recorded continuation sign given by the road user.
8. The working evaluation device according to claim 6, wherein the working evaluation device has a second output interface configured to output the vehicle control command to a vehicle control device.
9. A driver assistance system comprising: the working evaluation device according to claim 6; at least one environment detection sensor for recording the sign given by the road user; and the signal transmitter configured to indicate the vehicle reaction to a known meaning of the sign to the road user.
10. The driver assistance system according to claim 9, further comprising an interface for outputting the signal from the signal transmitter to the portable device worn by the road user.
11. A method for recognizing a meaning of a sign given by a road user comprising: recording the sign with at least one environmental sensor of a vehicle; propagating an artificial neural network executed on a processor that has been trained with the meaning of the sign with the recording of the sign; obtaining a vehicle control command in the propagation of the artificial neural network; outputting an indication of a vehicle reaction according to the vehicle control command to the road user by generating at least one of an electric, optical, or acoustic signal using a signal transmitter, wherein the signal transmitter comprises a light strip located in at least one of a front region or a rear region of the vehicle, and wherein outputting the indication comprises outputting the signal from the signal transmitter to a portable device worn by the road user to make the road user aware of the indication of the vehicle reaction based on the signal from the signal transmitter.
12. The method according to claim 11, wherein a non-transitory computer readable medium has stored thereon a computer program that, when executed by a computer, causes the computer to execute the method according to claim 11.
13. The method according to claim 11, further comprising outputting the indication of the vehicle reaction comprising an indication that the vehicle will continue or stop.
14. The method according to claim 11, further comprising outputting the vehicle control command to a vehicle control device to responsively control at least one of a longitudinal or a transverse aspect of the vehicle according to the vehicle control command.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
(1) The present disclosure shall be explained in detail based on the following figures.
DETAILED DESCRIPTION
(11) The same reference symbols are used in the figures for identical or functionally similar components. Only the components relevant in each case are indicated with reference numbers in the figures.
(12) The training evaluation device 10 shown in
(13) The training evaluation device 10 is a graphics card processor configured specifically for automotive applications, which runs an artificial neural network 13. The artificial neural network 13 is, for example, a recurrent artificial neural network in which neurons 16 in one layer are connected to neurons 16 in a preceding layer.
(14) The inputs of the artificial neural network 13 are the recording of the sign 2 and a driver control command 4 corresponding to the sign 2. The output of the artificial neural network 13 is a vehicle control command 14 calculated on the basis of the inputs. By adjusting the weighting factors 15, the calculated vehicle control command 14 is optimized, such that the vehicle control command 14 matches the driver control command 4. The training evaluation device 10 learns, through actual test drives and/or simulated test drives, to assign a corresponding vehicle control command 14 to a recorded sign, and thus to assign a corresponding meaning thereto. If the training evaluation device 10 located on the vehicle 1 records a hand signal from a road user 3 waiting at a crosswalk 9, together with a corresponding driver control command 4, e.g., pressing the gas pedal when signaled to continue or braking when signaled to stop, the training evaluation device 10 learns to generate a corresponding situation-dependent vehicle control command 14. In a similar situation, the training evaluation device 10 can then react appropriately with a vehicle control command 14, without having recorded a corresponding driver control command 4.
(15)
(16) The environment detection sensor 5 provides the training evaluation device 10, via the first input interface 11, with the sign 2 given by the road user 3 and recorded while the vehicle 1 is driving; the driver control command 4 corresponding to the sign 2 is received via the second input interface 12. The training evaluation device 10 feeds the sign 2 and the driver control command 4 to the artificial neural network 13. In addition, the training evaluation device 10 simulates an incorrect movement of the vehicle 1, e.g., a stopping vehicle control command for a sign indicating to continue given by the road user. In this way, incorrect movements can be generated directly, without additional sensors. The artificial neural network 13 learns to react to errors with simulated incorrect movements. The corresponding vehicle control command 14 is obtained as an output from the artificial neural network 13. The weighting factors 15 are adjusted, e.g., through backward propagation of the artificial neural network 13, such that the vehicle control command 14 matches the driver control command 4. As a result, the artificial neural network 13 learns the semantic meaning of the sign 2.
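The training step described above can be sketched as follows. This is a minimal illustrative example only: the single linear layer, the feature encoding of the sign, and all names are assumptions for illustration, not the multi-layer convolutional or recurrent network 13 of the disclosure. The principle is the same, however: the weighting factors are adjusted by backward propagation until the computed vehicle control command matches the recorded driver control command.

```python
# Sketch (assumed names and encoding): one supervised training step in
# which weighting factors are adjusted so that the computed vehicle
# control command matches the driver control command.
import math

def train_step(weights, sign_features, driver_command, lr=0.1):
    """One update of the weighting factors.

    sign_features  -- numeric encoding of the recorded sign (assumed)
    driver_command -- target: 1.0 = continue (gas pedal), 0.0 = stop (brake)
    """
    # Forward propagation: weighted sum squashed to (0, 1) serves as the
    # vehicle control command.
    z = sum(w * x for w, x in zip(weights, sign_features))
    vehicle_command = 1.0 / (1.0 + math.exp(-z))

    # Backward propagation: gradient of the squared error with respect
    # to each weighting factor, then a gradient-descent step.
    error = vehicle_command - driver_command
    grad_scale = error * vehicle_command * (1.0 - vehicle_command)
    return [w - lr * grad_scale * x for w, x in zip(weights, sign_features)]

# Toy data: feature vector [1.0, 1.0] stands for a "continue" hand signal,
# [1.0, -1.0] for a "stop" signal (both encodings are assumptions).
samples = [([1.0, 1.0], 1.0), ([1.0, -1.0], 0.0)]
w = [0.0, 0.0]
for _ in range(2000):
    for features, target in samples:
        w = train_step(w, features, target)
```

After training, the forward pass alone reproduces the driver's behavior: the "continue" encoding yields a command close to 1.0 and the "stop" encoding a command close to 0.0, which is the machine-learned meaning of the sign in this toy setting.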
(17) A computer program product 20 is shown in
(18)
(19) The artificial neural network 13 is optimized by the configuration of its neurons 16 and the adjustment of the weighting factors 15 that takes place in the training phase, in order to calculate the vehicle control command 14 corresponding to the sign 2 that has been recorded. If the working evaluation device 30 records a clear sign 2 to continue, the working evaluation device 30 uses the artificial neural network 13 to calculate the corresponding vehicle control command 14 to continue, i.e., to press the gas pedal.
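The working phase can be sketched in the same toy setting as above. Unlike training, only the forward propagation is executed: the trained weighting factors are fixed, and the recorded sign is mapped to a vehicle control command. All names, the weight values, and the feature encoding are illustrative assumptions.

```python
# Sketch (assumed names): the working evaluation device runs the trained
# network forward only, to recognize the meaning of a recorded sign.
import math

def recognize_sign(weights, sign_features, threshold=0.5):
    """Forward propagation only: map the recorded sign to a vehicle
    control command ("continue" or "stop")."""
    z = sum(w * x for w, x in zip(weights, sign_features))
    score = 1.0 / (1.0 + math.exp(-z))
    return "continue" if score >= threshold else "stop"

# Weighting factors as they might look after training (assumed values).
trained_weights = [0.0, 3.0]
command = recognize_sign(trained_weights, [1.0, 1.0])  # a clear "continue" sign
```

The returned command would then be output via the first output interface 32 as an indication to the road user and via the second output interface 33 to the vehicle control device.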
(20) The working evaluation device 30 outputs the vehicle control command 14 to the road user 3 via a first output interface 32, in order to indicate to the road user 3 that the vehicle 1 has acknowledged the sign 2 and will react with a corresponding driving behavior. The vehicle control command 14 is output via a second output interface 33 to a vehicle control device 40, in order to automatically control the vehicle 1 in accordance with the vehicle control command 14.
(21) The vehicle control device 40, also shown in
(22) A driver assistance system 50 according to the present disclosure is shown in
(23) A pedestrian in the form of a road user 3 is recorded at a crosswalk by the camera and the radar devices. The road user can also be at another type of crossing, e.g., a traffic light. The environment detection sensors 5 also record the sign 2 given by the road user 3 to continue driving the vehicle 1. The working evaluation device 30 of the driver assistance system 50 calculates the vehicle control command 14 corresponding to the sign 2 using the artificial neural network 13.
(24) The driver assistance system 50 also has a signal transmitter 51, with which the vehicle control command 14 is indicated to the road user 3. The signal transmitter 51 thus indicates a vehicle reaction to the known meaning of the sign 2 to the road user 3.
(25) The signal transmitter 51 generates and outputs a signal. The signal is a visible signal, e.g., a light signal in a specific color. The signal transmitter 51 can be in the form of a light strip 8 in particular, which contains numerous light sources, as is shown in the middle exemplary embodiment in
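The mapping from a vehicle control command to the visible signal of the light strip could be sketched as follows. The color assignment is purely an assumption for illustration; the disclosure specifies only that the signal is, e.g., a light signal in a specific color.

```python
# Illustrative mapping (colors assumed, not specified in the disclosure)
# from the vehicle control command to the color shown on the light strip.
SIGNAL_COLORS = {
    "stop": "red",        # vehicle will stop / yield to the road user
    "continue": "green",  # vehicle will continue driving
}

def light_strip_signal(vehicle_command):
    """Return the color the light strip displays for a given command;
    "off" for any command without a defined indication."""
    return SIGNAL_COLORS.get(vehicle_command, "off")
```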
(26) The signal transmitter 51 is configured to indicate a continuation of the vehicle 1 to the road user 3 shown in
(27) The driver assistance system 50 shown in
(28) The method according to the present disclosure for recognizing a meaning of a sign 2 given by a road user 3, and for indicating a vehicle reaction to a known meaning of this sign 2 is shown in
(29) A computer program product 70 for executing the method according to the present disclosure for recognizing a meaning of a sign 2 given by a road user 3 and for indicating a vehicle reaction to a known meaning of this sign 2 is shown in
(30) The driver assistance system 50 can also be successfully used in the following situation. A road user 3, e.g., a pedestrian, is standing at a crosswalk 9, and is waiting for another road user 3. The driver assistance system 50 of the automated vehicle 1 detects the individual road user 3, and indicates by means of the signal transmitter 51 that the vehicle 1 will stop. The road user 3 indicates through a clear gesture, e.g., a hand movement, that the vehicle 1 should continue. This hand movement is recognized as a sign 2 with the corresponding meaning, and processed by the driver assistance system 50. The vehicle 1 then changes the signal, and the signal transmitter 51 indicates that the vehicle 1 will now continue. The driver assistance system 50 assesses the gesture based on the size, age, and/or facial expressions of the road user 3. This assessment is necessary in order to interpret signs given by children and elderly people properly.
REFERENCE SYMBOLS
(31) 1 vehicle
(32) 2 sign
(33) 3 road user
(34) 4 driver control command
(35) 5 environment detection sensor
(36) 6 vehicle data transfer system
(37) 7 bumper
(38) 8 light strip
(39) 9 crosswalk
(40) 10 training evaluation device
(41) 11 first input interface
(42) 12 second input interface
(43) 13 artificial neural network
(44) 14 vehicle control command
(45) 15 weighting factor
(46) 16 neuron
(47) 20 computer program product
(48) 21 memory
(49) 22 computer
(50) 30 working evaluation device
(51) 31 input interface
(52) 32 first output interface
(53) 33 second output interface
(54) 40 vehicle control device
(55) 50 driver assistance system
(56) 51 signal transmitter
(57) 52 interface
(58) 60 portable device
(59) 70 computer program product