METHOD FOR CALIBRATING A TRAFFIC MANAGEMENT SYSTEM, AND TRAFFIC MANAGEMENT SYSTEM
20230343212 · 2023-10-26
Abstract
The embodiment relates to a method for calibrating a traffic management system which is configured for the automated guiding of a vehicle within a traffic region, in particular for the automated parking of a vehicle, and to such a traffic management system. The traffic management system comprises at least one monitoring sensor for monitoring the traffic region, a communication device for communication with a vehicle, in particular for sending driving directions to the vehicle, and a controller for processing the signals from the at least one monitoring sensor and for determining driving instructions for the vehicle. The method comprises the following steps: detecting at least one object in the traffic region with the at least one monitoring sensor and storing the information about the object; querying sensor information of at least one environmental sensor of the vehicle and checking whether the detected object is detected by the environmental sensor; if the object is detected by the environmental sensor, retrieving the information of the environmental sensor relating to that object; and calibrating the monitoring sensor taking account of the information about the object acquired by the monitoring sensor and the information about the object acquired by the environmental sensor of the vehicle.
Claims
1. A method for calibrating a traffic management system which is configured for the automated guiding of a vehicle, the method comprising: detecting an object in a traffic region of the vehicle; obtaining information relating to the object detected by an environmental sensor of the vehicle; and calibrating a monitoring sensor of the traffic management system for detecting the object based on information relating to the object acquired by the monitoring sensor and the information relating to the object acquired by the environmental sensor of the vehicle.
2. The method as claimed in claim 1, wherein the object is defined beforehand in the traffic region.
3. The method as claimed in claim 2, further comprising storing the information relating to the object.
4. The method as claimed in claim 2, wherein the information relating to the object comprises a position and an orientation of the object within the traffic region.
5. The method as claimed in claim 4, wherein the object is accentuated by a structure, shape, material or color.
6. The method as claimed in claim 5, wherein the object comprises reflector elements provided on the object.
7. The method as claimed in claim 6, further comprising determining a distance of the object from the environmental sensor of the vehicle, wherein the calibrating comprises calibrating the monitoring sensor based on the distance of the object from the environmental sensor.
8. The method as claimed in claim 1, wherein the environmental sensor is an ultrasonic sensor, an optical sensor or a RADAR sensor.
9. The method as claimed in claim 8, wherein the monitoring sensor is an optical sensor.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0021] Further advantages and features will become apparent from the following description in connection with the appended drawings.
DETAILED DESCRIPTION
[0026] The traffic management system 10 has multiple monitoring sensors 16 for monitoring the traffic region 12. The monitoring sensors 16 are, for example, optical sensors, in particular cameras, which each produce an image of a region of the traffic region 12. The monitoring sensors 16 are arranged such that the entire traffic region 12 can be monitored.
[0027] The traffic management system 10 further has a controller 18 which is connected to the monitoring sensors 16 and which is able to process the information from the monitoring sensors 16. A map of the traffic region 12, for example, is stored in the controller 18. The controller 18 is further connected to a communication device 20 which is able to establish communication with the vehicles 14 and send information to or receive information from the vehicle 14.
[0028] The monitoring sensors 16 of the traffic management system 10 detect the vehicles 14 which are within the traffic region 12. Preferably, the vehicles 14 must register beforehand and check in with the controller 18. The controller 18 determines a driving path 22 for the vehicles 14, for example to a parking space 24 for the vehicle 14. The communication device 20 then sends corresponding driving directions to the vehicle 14. The driving directions are received by a vehicle communication device 26. The vehicle 14 then drives, autonomously or remotely controlled by the controller 18, along the driving path 22 to the parking space 24.
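By way of illustration, the guidance step described above, in which the controller determines a driving path and the communication device sends per-segment driving directions, could be sketched as follows. The function names, the straight-line path model, and the message format are assumptions for illustration only and are not part of the disclosure:

```python
def plan_path(start, goal, steps=4):
    """Toy straight-line driving path as a list of waypoints
    (an illustrative stand-in for the controller's path planning)."""
    return [
        (start[0] + (goal[0] - start[0]) * i / steps,
         start[1] + (goal[1] - start[1]) * i / steps)
        for i in range(steps + 1)
    ]

def driving_directions(path):
    """Turn the path into per-segment driving directions of the kind
    the communication device would send to the vehicle."""
    return [{"from": a, "to": b} for a, b in zip(path, path[1:])]
```

In practice the controller would replan and resend directions while the vehicle is driving, as described in the following paragraph.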
[0029] While the vehicle is driving within the traffic management system 10, the position, orientation and direction of movement of the vehicle 14 are monitored by the monitoring sensors 16. In addition, the traffic region 12 as a whole is monitored by the sensors. The driving directions can be adapted and sent to the vehicle while the vehicle 14 is driving, for example if the controller 18 detects that the vehicle 14 has deviated from the calculated driving path 22 or obstacles or hazards on the driving path are identified by the monitoring sensors.
[0030] Monitoring of the vehicle 14 must be carried out with very high accuracy in order to prevent collisions with other vehicles or other objects 28, for example pillars or walls. During normal operation, however, the orientation of a monitoring sensor 16 can change, whereby the accuracy of the monitoring sensor 16 decreases.
[0031] In order to check the traffic management system or the monitoring sensors 16, stationary objects 28 which can be detected by the monitoring sensors 16 are defined within the traffic region 12. The information about these objects is transmitted to or stored in the controller 18. The information can be, for example, a position or an orientation of the object 28. In addition, the objects 28 and information about that object 28 can be recorded in the map which is stored in the controller 18.
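The stored object information described above, a position and an orientation recorded in the controller's map, could be represented as a simple registry, sketched below. The field names and values are illustrative assumptions, not part of the disclosure:

```python
# Hypothetical registry of calibration objects stored in the
# controller's map; keys are object reference numbers.
calibration_objects = {
    28: {
        "position": (12.5, 4.0),   # metres in the traffic-region frame
        "orientation_deg": 90.0,   # heading of the object's front face
        "features": ["reflector", "striped_pattern"],
    },
}

def lookup_object(object_id):
    """Retrieve the stored reference data for a detected object,
    or None if the object is not a defined calibration object."""
    return calibration_objects.get(object_id)
```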
[0032] The vehicle 14 has environmental sensors 30 with which the environment of the vehicle 14 can be detected in normal vehicle operation. For example, the environmental sensors 30 are ultrasonic sensors of a parking aid. Alternatively, the environmental sensors 30 can also be optical sensors, RADAR or LIDAR sensors. The environmental sensors 30 are connected to a vehicle controller 32.
[0033] In order to check the monitoring sensors 16, the controller checks whether one of the objects 28 is within the detection range of an environmental sensor 30 of the vehicle 14. If that is the case, the information about the object 28 acquired by the environmental sensor 30 or processed by the vehicle controller 32 is retrieved from the vehicle controller 32. The vehicle controller 32 sends the information to the communication device 20 or the controller 18 by way of the vehicle communication device 26.
[0034] In the controller 18, the information about the object 28 acquired by the environmental sensor 30 is compared with the information about the object 28 stored in the controller or acquired by the monitoring sensors 16.
[0035] If these pieces of information are different, the reliability of the information from the vehicle 14 can additionally be checked, for example by checking the position of the vehicle by additional monitoring sensors 16. The monitoring sensor 16 can then be calibrated taking account of the information about the object 28 acquired with the environmental sensor 30.
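The comparison and correction described in the two paragraphs above can be sketched as follows. This is a minimal illustration assuming a 2-D position model and a simple translational correction; the class and function names, the tolerance value, and the correction model are assumptions, not part of the disclosure:

```python
from dataclasses import dataclass

@dataclass
class ObjectObservation:
    """One sensor's report of an object's position
    in the traffic-region frame (metres)."""
    object_id: int
    x: float
    y: float

def estimate_offset(monitor_obs, vehicle_obs):
    """Offset of the monitoring sensor's report from the vehicle
    environmental sensor's report for the same object."""
    return vehicle_obs.x - monitor_obs.x, vehicle_obs.y - monitor_obs.y

def calibrate(monitor_pose, monitor_obs, vehicle_obs, tolerance=0.05):
    """Shift the monitoring sensor's pose so that both sensors
    agree on the object position; no correction within tolerance."""
    dx, dy = estimate_offset(monitor_obs, vehicle_obs)
    if abs(dx) <= tolerance and abs(dy) <= tolerance:
        return monitor_pose
    return (monitor_pose[0] + dx, monitor_pose[1] + dy)
```

A real system would estimate a full extrinsic correction (rotation and translation) from several such observations rather than a single offset.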
[0036] Recognition of the objects 28 can additionally be improved by accentuating the objects 28 such that they can be detected more reliably and more accurately by the environmental sensor 30 and/or the monitoring sensor 16. For example, the objects 28 can have a particular structure or shape. The structure or shape can have, for example, special edges or surfaces which can be detected and identified particularly easily by the environmental sensor 30 and/or the monitoring sensor 16. For example, the objects 28 can also have a defined surface structure, for example a three-dimensional structure, or can be painted a specific color or in a color pattern. In particular, the objects 28 can be clearly identified by their structure, that is to say the individual objects 28 each have an individual, distinctive structure.
[0037] In addition, further elements which permit better detection and identification of the object 28 can also be provided on the objects 28. For example, the object 28 can be provided with reflector elements which reflect signals of an active sensor system which has a signal transmitter and a signal receiver.
[0038] The elements can, for example, also be active elements, for example lighting which is activated by the controller 18 in order to improve detection of the objects 28.
[0039] Optionally, the controller can check whether the quality of the information acquired by the environmental sensor 30 and of the information about an object 28 acquired by the monitoring sensor 16 is sufficient to compare those pieces of information with one another and/or to carry out a calibration of the monitoring sensor 16. If the quality is not sufficient, additional measures can be initiated by the controller 18 in order to improve the detection of the object 28 with the environmental sensor 30 and/or the monitoring sensor 16. For example, additional lighting can be activated. The lighting can be, for example, stationary lighting of the traffic region 12. Optionally, the controller 18 can also send a corresponding instruction to the vehicle 14 to switch on the vehicle lighting of the vehicle 14. In particular, such lighting can effect better accentuation of edges or shapes of the object 28.
[0039] Furthermore, the distance of the vehicle 14 from the object 28 can additionally be detected. As a result, it can be checked, for example, whether the distance of the object 28 from the vehicle 14 is sufficient to acquire the information about the object 28 with sufficient accuracy. In particular, the information about the object acquired with the environmental sensor 30 can be weighted or taken into account in the calibration in dependence on the distance of the vehicle 14 from the object 28. In particular, the vehicle 14 can also have different environmental sensors 30, wherein the information from the environmental sensors 30 is taken into account or used in dependence on the distance from the object 28.
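The distance-dependent weighting and sensor selection described above could be sketched as follows. The linear weighting model and the distance thresholds are illustrative assumptions, not part of the disclosure:

```python
def distance_weight(distance_m, max_range_m=10.0):
    """Weight for an environmental-sensor measurement in the
    calibration: full trust close to the object, falling linearly
    to zero at the sensor's maximum range (an assumed model)."""
    if distance_m >= max_range_m:
        return 0.0
    return 1.0 - distance_m / max_range_m

def select_sensor(distance_m):
    """Pick which environmental sensor to use by distance, e.g. an
    ultrasonic sensor close up and a RADAR sensor farther away
    (the 3 m threshold is an illustrative assumption)."""
    return "ultrasonic" if distance_m < 3.0 else "radar"
```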