METHOD AND SYSTEM FOR THE CONTROL OF A VEHICLE BY AN OPERATOR
20230036840 · 2023-02-02
Inventors
- Alexander Geraldy (Hildesheim, DE)
- Holger Kiehne (Peine, DE)
- Jan Wolter (Hannover, DE)
- Jens Schwardmann (Hildesheim, DE)
- Peter Engel (Hessisch Oldendorf, DE)
CPC classification
B60W50/14
PERFORMING OPERATIONS; TRANSPORTING
B60W60/001
PERFORMING OPERATIONS; TRANSPORTING
Abstract
A method for the control of a vehicle by an operator. The method includes: using a predictive map to control the vehicle by: detecting a situation and/or location reference of the vehicle, transmitting data of a defined set of sensors, fusing and processing the data of the defined set of sensors; displaying the fused and processed data for the operator; creating/updating the predictive map by: recognizing a problematic situation and/or a problematic location by observation of the operator and/or marking by the operator, storing the problematic situation and/or the problematic location in a first database for storing problematic situations and locations, and training a model for selecting the defined set of sensors and fusing the data of the defined set of sensors by machine learning.
Claims
1. A method for control of a vehicle by an operator, comprising the following steps: using a predictive map to control the vehicle by: detecting a situation and/or location reference of the vehicle, transmitting data of a defined set of sensors, fusing and processing the data of the defined set of sensors, displaying the fused and processed data for the operator; and creating/updating the predictive map by: recognizing a problematic situation and/or a problematic location by: observation of the operator and/or marking by the operator, storing the problematic situation and/or the problematic location in a first database for storing problematic situations and locations, and training a model for selecting the defined set of sensors and fusing the data of the defined set of sensors by machine learning.
2. The method as recited in claim 1, wherein the observation of the operator is carried out by detecting: stress level of the operator, and/or viewing direction of the operator, and/or behavior of the operator.
3. The method as recited in claim 1, further comprising the following steps: retrieving parameters for upcoming routes and/or areas from a second database for storing situation-related and/or location-related detection, fusion, and display parameters; adapting the defined set of sensors, whose data are transmitted; adapting the fusion of the data of the defined set of sensors; adapting the display for the operator.
4. The method as recited in claim 1, wherein the fusion of the data of the defined set of sensors is allocated onto multiple partial fusions.
5. The method as recited in claim 1, further comprising: searching for recognized situations and/or locations in the first database; evaluating the recognized situations and/or locations; generating situation-adapted and/or location-adapted detection, fusion, and display parameters; storing the situation-adapted and/or location-adapted detection, fusion, and display parameters in the second database.
6. A system for control of a vehicle by an operator, comprising: a vehicle which permits teleoperation; an operator who controls the vehicle without direct line of sight based on pieces of vehicle and surroundings information; sensors, which enable a comprehensive surroundings model of the vehicle for the operator; a predictive map to select the defined set of sensors and fuse the data of the defined set of sensors, which is configured to indicate whether and how data of individual sensors of the defined set of sensors are fused with one another; a wireless network configured to transmit data of the sensors; a control center configured to control the vehicle; and a training system configured to train the predictive map to select the defined set of sensors and use the defined set of sensors as a function of location, and/or situation, and/or preferences of the operator.
7. The system as recited in claim 6, further comprising: a backend, in which the data of the defined set of sensors are processed between the wireless network and the control center.
8. The system as recited in claim 7, wherein the backend is a part of the control center or is separate from the control center.
9. The system as recited in claim 6, wherein the fusion of the data of the individual sensors takes place at arbitrary points of the system.
10. A non-transitory machine-readable memory medium on which is stored a computer program for control of a vehicle by an operator, the computer program, when executed by a computer, causing the computer to perform the following steps: using a predictive map to control the vehicle by: detecting a situation and/or location reference of the vehicle, transmitting data of a defined set of sensors, fusing and processing the data of the defined set of sensors, displaying the fused and processed data for the operator; and creating/updating the predictive map by: recognizing a problematic situation and/or a problematic location by: observation of the operator and/or marking by the operator, storing the problematic situation and/or the problematic location in a first database for storing problematic situations and locations, and training a model for selecting the defined set of sensors and fusing the data of the defined set of sensors by machine learning.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0064] Specific embodiments of the present invention are explained in greater detail on the basis of the figures and the following description.
DETAILED DESCRIPTION OF EXAMPLE EMBODIMENTS
[0072] In the following description of the specific embodiments of the present invention, identical or similar elements are identified by identical reference numerals, a repeated description of these elements being omitted in individual cases. The figures only schematically represent the subject matter of the present invention.
[0074] It may be seen from
[0075] Situations in which data of multiple sensors 12 have to be fused with one another are defined not only by bad weather or temporal aspects, but are often also dependent on the local conditions.
[0076] It is apparent from the representation according to
[0077] Backend 32 is designed here as a data processing center, in which the data of sensors 12 are processed between wireless network 40 and control center 34. In the present case in
[0078] The fusion may take place at arbitrary points of system 100. The fusion may also be allocated onto multiple partial fusions at different points of system 100, for example, to preprocess and reduce data prior to the wireless transmission and to process and finally fuse data after the wireless transmission.
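The allocation onto partial fusions described above can be sketched as two stages: an in-vehicle stage that preprocesses and reduces data before the wireless transmission, and a backend stage that performs the final fusion afterward. The function names, the downsampling step, and the simple weighted averaging below are illustrative assumptions, not details taken from the present disclosure.

```python
# Hypothetical sketch of allocating a fusion onto multiple partial fusions:
# the vehicle preprocesses and reduces sensor data before wireless
# transmission, and a backend performs the final fusion after it.

def vehicle_partial_fusion(lidar_points, camera_pixels, keep_every=4):
    """In-vehicle stage: downsample raw data to reduce the wireless payload."""
    reduced_lidar = lidar_points[::keep_every]
    reduced_camera = camera_pixels[::keep_every]
    return {"lidar": reduced_lidar, "camera": reduced_camera}

def backend_final_fusion(payload, w_lidar=0.5, w_camera=0.5):
    """Backend stage: weighted fusion of the transmitted, reduced data."""
    pairs = zip(payload["lidar"], payload["camera"])
    return [w_lidar * l + w_camera * c for l, c in pairs]

# Example: simulate the reduced transmission and the final fusion.
lidar = list(range(0, 16))       # stand-in for LIDAR range samples
camera = list(range(100, 116))   # stand-in for camera intensity samples
payload = vehicle_partial_fusion(lidar, camera)  # sent over the wireless network
fused = backend_final_fusion(payload)
```

The split keeps the wireless payload small (here, a quarter of the raw samples) while deferring the compute-heavy final fusion to the backend.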
[0079] Thus, for example, an in-vehicle fusion of data of LIDAR sensor 14 and camera sensor 16 of vehicle 10 may be carried out. The fusion result is transmitted via wireless network 40 to backend 32.
[0080] Optionally, infrastructure unit 20, in the present case traffic sign 22, may be configured to fuse data of various sensors 12. The fused data are also transmitted to backend 32.
[0081] Backend 32 may be configured to receive the data sent from vehicle 10 and infrastructure unit 20, fuse them, and transmit the resulting fused data onward to control center 34.
[0082] Control center 34 may also be configured to fuse the received data. The data thus fused are provided directly to operator 36, for example, via audiovisual or haptic devices.
[0084] Initially, data of a LIDAR sensor 14 are detected in a first step 201 and data of a camera sensor 16 are detected in a second step 202. Subsequently, the data of LIDAR sensor 14 and camera sensor 16 are brought together in a third step 203.
[0085] The data of LIDAR sensor 14 and camera sensor 16 are then fused with one another. There is not only one fusion possibility for a combination of two sensors 12. Two possibilities 210, 220 for fusing data of LIDAR sensor 14 and camera sensor 16 are shown in
[0086] Finally, in a seventh step 207, the fused data of LIDAR sensor 14 and camera sensor 16 are displayed to operator 36.
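The sequence of steps 201 through 207 above can be sketched as a small pipeline with a selectable fusion possibility. The two fusion functions below merely stand in for possibilities 210 and 220; their concrete logic (a weighted average and a per-cell maximum) is an assumption for illustration, not taken from the present disclosure.

```python
# Illustrative sketch of steps 201-207: LIDAR and camera data are detected,
# brought together, fused via one of several fusion possibilities, and
# prepared for display to the operator.

def fuse_weighted_average(lidar, camera, w=0.5):
    """Stand-in for one fusion possibility: weighted average per cell."""
    return [w * l + (1 - w) * c for l, c in zip(lidar, camera)]

def fuse_max_confidence(lidar, camera):
    """Stand-in for another possibility: keep the larger value per cell."""
    return [max(l, c) for l, c in zip(lidar, camera)]

def pipeline(lidar, camera, fusion):
    fused = fusion(lidar, camera)        # steps 203-206: combine and fuse
    return [round(v, 2) for v in fused]  # step 207: prepare for display

lidar_data = [0.2, 0.8, 0.5]   # stand-in for detected LIDAR data (step 201)
camera_data = [0.6, 0.4, 0.5]  # stand-in for detected camera data (step 202)
out_a = pipeline(lidar_data, camera_data, fuse_weighted_average)
out_b = pipeline(lidar_data, camera_data, fuse_max_confidence)
```

Passing the fusion function as a parameter mirrors the point that there is more than one fusion possibility for the same pair of sensors.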
[0088] In this situation, the data of camera sensor 16 are fused with the data of a LIDAR sensor 14 of vehicle 10. The fusion is carried out on the basis of a weighting of the respective sensors 12. In the present case, a weighting of 0.5 for camera sensor 16 and a weighting of 0.5 for LIDAR sensor 14 are selected for an area 304 that is problematic for camera sensor 16. Outside area 304, a weighting of 1 is selected for camera sensor 16.
[0089] Initially an object detection is carried out from the LIDAR data. The detected objects are subsequently shown to operator 36 as a bounding box 306 in first camera image 300.
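The location-dependent weighting just described can be sketched as follows. The rectangular representation of area 304 and the concrete coordinates are illustrative assumptions; only the weight values (0.5/0.5 inside the problematic area, 1 for the camera outside it) come from the description above.

```python
# Minimal sketch of location-dependent sensor weighting: inside an area that
# is problematic for the camera, camera and LIDAR are weighted 0.5 each;
# outside it, the camera weight is 1.

def sensor_weights(x, y, problematic_area):
    """Return (camera_weight, lidar_weight) for an image position."""
    x0, y0, x1, y1 = problematic_area
    if x0 <= x <= x1 and y0 <= y <= y1:
        return 0.5, 0.5   # fuse both sensors inside the problematic area
    return 1.0, 0.0       # rely on the camera alone elsewhere

def fuse_value(x, y, camera_val, lidar_val, problematic_area):
    w_cam, w_lidar = sensor_weights(x, y, problematic_area)
    return w_cam * camera_val + w_lidar * lidar_val

area_304 = (10, 10, 20, 20)   # hypothetical bounds of the problematic area
inside = fuse_value(15, 15, camera_val=0.2, lidar_val=0.8,
                    problematic_area=area_304)
outside = fuse_value(5, 5, camera_val=0.2, lidar_val=0.8,
                     problematic_area=area_304)
```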
[0093] In a first method step 601, the method according to the present invention is started. Vehicle 10 is controlled by operator 36. In a second method step 602, a situation and/or location reference of vehicle 10 is detected. Subsequently, data of a defined set of sensors 12 are transmitted in a third method step 603. The transmitted data are then fused in a fourth method step 604. The fused and processed data are then displayed to operator 36 in a fifth method step 605.
[0094] With the aid of method steps 602 through 605, a predictive map is used which may be updated during the control by operator 36. It is checked whether a problematic situation and/or a problematic location was recognized. A problematic situation and/or a problematic location may be recognized in a sixth method step 606 by observation of operator 36. A problematic situation and/or a problematic location may also, however, be recognized by marking by operator 36 in a seventh method step 607.
[0095] If a problematic situation and/or a problematic location are recognized, it and/or these are stored in an eighth method step 608 in a first database 630 for storing problematic situations and locations.
[0096] In a ninth method step 609, it is checked whether the trip is ended. If the trip is ended, the method is ended in a tenth method step 610. If vehicle 10 drives further, method steps 602 through 609 repeat.
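The main loop of method steps 601 through 610 can be sketched as follows. The per-segment data structure, the averaging used as a stand-in for fusion, and the list-based first database are illustrative assumptions; only the ordering of the steps follows the description above.

```python
# Hedged sketch of the main control loop (steps 601-610): per trip segment,
# detect the situation/location reference, transmit and fuse sensor data,
# display the result, and store recognized problematic situations/locations
# in the first database; repeat until the trip is ended.

def control_loop(trip_segments, recognize_problem, first_database):
    displays = []
    for segment in trip_segments:
        reference = segment["location"]               # step 602: detect reference
        data = segment["sensor_data"]                 # step 603: transmitted data
        fused = sum(data) / len(data)                 # step 604: fuse (stand-in)
        displays.append(f"{reference}: {fused:.2f}")  # step 605: display
        problem = recognize_problem(segment)          # steps 606/607: recognize
        if problem:
            first_database.append(problem)            # step 608: store
    return displays, first_database                   # steps 609/610: trip ended

segments = [
    {"location": "A", "sensor_data": [0.2, 0.4], "problem": None},
    {"location": "B", "sensor_data": [0.9, 0.7], "problem": "glare at B"},
]
displays, first_db = control_loop(segments, lambda s: s["problem"], [])
```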
[0097] In the creation of the predictive map, the detection, fusion, and display parameters are adapted if parameters are already present for the situation and/or location reference detected in second method step 602.
[0098] In an eleventh method step 611, the parameters for upcoming routes and/or areas are retrieved from a second database 640 for storing situation-related and/or location-related detection, fusion, and display parameters, which the predictive map represents. Subsequently, the defined set of sensors 12, whose data are transmitted, is adapted in a twelfth method step 612. The fusing of the data of sensors 12 is adapted in a thirteenth method step 613 and the display for operator 36 is adapted in a fourteenth method step 614.
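The retrieval and adaptation of steps 611 through 614 can be sketched as a lookup against the second database with a fallback to default parameters. The parameter schema (sensor list, fusion weights, display mode) and the route keys are hypothetical assumptions.

```python
# Illustrative sketch of steps 611-614: retrieve parameters for an upcoming
# route from the second database, then adapt the sensor set, the fusion,
# and the display accordingly.

def adapt_for_route(route, second_database, defaults):
    params = second_database.get(route, defaults)  # step 611: retrieve
    sensor_set = params["sensors"]                 # step 612: adapt sensor set
    fusion_weights = params["weights"]             # step 613: adapt fusion
    display_mode = params["display"]               # step 614: adapt display
    return sensor_set, fusion_weights, display_mode

second_db = {
    "tunnel_17": {"sensors": ["lidar", "camera"],
                  "weights": {"lidar": 0.7, "camera": 0.3},
                  "display": "lidar_overlay"},
}
defaults = {"sensors": ["camera"], "weights": {"camera": 1.0},
            "display": "camera"}
known = adapt_for_route("tunnel_17", second_db, defaults)
unknown = adapt_for_route("open_road", second_db, defaults)
```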
[0099] If a problematic situation and/or a problematic location are recognized, an aggregation of data is carried out in parallel. The aggregation is started in a fifteenth method step 615 if a problematic situation and/or a problematic location are recognized.
[0100] A search is made for the recognized situations and/or locations in a sixteenth method step 616. Subsequently, in a seventeenth method step 617, an evaluation of identical situations and/or locations is compiled. In an eighteenth method step 618, it is then checked whether the recognized situation and/or the recognized location are permanently critical. Method steps 616 through 618 are repeated. If the recognized situation and/or the recognized location are permanently critical, it is furthermore checked in a nineteenth method step 619 whether parameters are already present. If the parameters are already present, they are retrieved in a twentieth method step 620 from second database 640 and taken into consideration when evaluating the recognized situation and/or the recognized location in a twenty-first method step 621. Subsequently, situation-adapted and/or location-adapted detection, fusion, and display parameters are generated in a twenty-second method step 622, which are stored in a twenty-third method step 623 in second database 640. After the storage of the adapted parameters, the aggregation of data is ended in a twenty-fourth method step 624.
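The aggregation of steps 615 through 624 can be sketched as counting recognized entries in the first database and, for entries that prove permanently critical, generating and storing adapted parameters in the second database. The report-count threshold for "permanently critical" and the generated parameter values are illustrative assumptions.

```python
# Hedged sketch of the aggregation (steps 615-624): search and evaluate
# recognized situations/locations, check whether they are permanently
# critical, merge in any already present parameters, and store adapted
# parameters in the second database.

def aggregate(first_database, second_database, min_reports=3):
    counts = {}
    for entry in first_database:                  # step 616: search recognized
        counts[entry] = counts.get(entry, 0) + 1  # step 617: evaluate identical
    for location, count in counts.items():
        if count >= min_reports:                  # step 618: permanently critical?
            existing = second_database.get(location, {})   # steps 619-621
            params = {"weights": {"lidar": 0.5, "camera": 0.5}}  # step 622
            params.update(existing)               # prefer already stored values
            second_database[location] = params    # step 623: store
    return second_database                        # step 624: aggregation ended

first_db = ["bridge_3", "bridge_3", "bridge_3", "exit_9"]
second_db = aggregate(first_db, {})
```

Only `bridge_3`, reported three times, crosses the (assumed) threshold and receives stored parameters; the single report for `exit_9` is not treated as permanently critical.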
[0101] However, as already stated above in general, this selected sequence for carrying out the method according to the present invention in
[0102] The present invention is not restricted to the exemplary embodiments described here and the aspects highlighted therein. Rather, a variety of modifications are possible within the scope of the present invention, which are within the expertise of those skilled in the art.