Method and system for automated calibration of sensors

20220343656 · 2022-10-27

    Abstract

    The invention relates to a method for automated calibration of sensors of a vehicle, wherein at least one first passive optical sensor and at least one second active optical sensor are calibrated by a calibration unit based on a matching spatial orientation of recognised environmental features in transformed sensor data of the first sensor and the sensor data captured by the second sensor.

    Claims

    1. Method for automated calibration of sensors in a vehicle, comprising the steps of: a. capturing sensor data relating to a vehicle environment passed through by the vehicle during operation by at least one first passive optical sensor arranged on the vehicle and at least one second active optical sensor arranged on the vehicle; b. calibrating the at least one first sensor by determining intrinsic sensor parameters and distortion parameters based on the sensor data captured by the first sensor by a calibration unit and applying the intrinsic sensor parameters and the distortion parameters to the sensor data captured by the first sensor to obtain transformed sensor data; c. recognising environmental features of the vehicle environment previously passed through in the transformed sensor data of the first sensor and the sensor data captured by the second sensor by a recognition unit; and d. calibrating the at least one first sensor and the at least one second sensor based on a matching spatial orientation of the recognised environmental features in the transformed sensor data of the first sensor and the sensor data captured by the second sensor by the calibration unit, determining extrinsic sensor parameters and applying the extrinsic sensor parameters to the transformed sensor data of the first sensor and the sensor data captured by the second sensor to obtain calibrated sensor data in each case.

    2. Method according to claim 1, wherein: the environmental features are at least parts of substantially static objects which are arranged in the vehicle environment passed through by the vehicle during operation and were detected by the at least one first sensor and the at least one second sensor using the sensor data, wherein the environmental features are recognised by the recognition unit by means of a gradient-based or a keypoint-based method.

    3. Method according to claim 1, wherein the at least one first sensor is designed as a monocular camera or as a stereo camera and the at least one second sensor uses radar, lidar or ultrasound technology or is designed as a time-of-flight camera.

    4. Method according to claim 1, wherein: the steps a. to c. are continuously repeated, wherein environmental features recognised in step c., if they are recognised as matching in a plurality of sensor data captured consecutively in terms of time and/or location, are stored by the recognition unit as consistent environmental features or are otherwise deleted, wherein step d. is carried out only when the number of consistent environmental features exceeds a predetermined first threshold value.

    5. Method according to claim 4, wherein: a distribution parameter is determined by an evaluation unit which specifies a spatial distribution of the consistent environmental features in the respective sensor data, wherein step d. is carried out only when the number of consistent environmental features exceeds the first threshold value and when the value of the distribution parameter exceeds a predetermined second threshold value.

    6. Method according to claim 1, wherein: a third sensor is provided on the vehicle, wherein the third sensor is provided and designed to capture the spatial orientation and movement of the vehicle, wherein the captured spatial orientation and movement of the vehicle are also used for the calibration of the at least one first sensor in step b., wherein the third sensor is a GNSS receiver.

    7. Method according to claim 1, wherein: the calibration unit, the recognition unit and/or the evaluation unit are arranged on the vehicle or external to the vehicle, wherein the calibration unit, the recognition unit and/or the evaluation unit are designed externally to the vehicle as part of a data processing system or are cloud-based, wherein the intrinsic sensor parameters, the extrinsic sensor parameters, the distortion parameters, the consistent environmental features and/or the distribution parameter are stored in a retrievable manner on a memory unit arranged on the vehicle or external to the vehicle.

    8. Method according to claim 1, wherein: a sequence of the method is displayed on a display device based on the number of consistent environmental features relative to the first threshold value and the value of the distribution parameter relative to the second threshold value, so that the sequence of the method can be followed by a user, wherein an operating device is provided, by means of which the sequence of the method can be regulated by the user.

    9. System for carrying out a method according to claim 1, comprising at least one first sensor and at least one second sensor, a calibration unit and a recognition unit.

    10. Vehicle equipped with a system according to claim 9.

    Description

    [0049] Additional aims, advantages and expediencies of the present invention can be found in the following description in conjunction with the drawings. In the drawings:

    [0050] FIG. 1 shows a vehicle in a vehicle environment for calibrating sensors according to a preferred embodiment of the invention;

    [0051] FIG. 2 is a schematic representation of a system for calibrating sensors according to a preferred embodiment;

    [0052] FIG. 3 shows a method for calibrating sensors using a flow chart according to a preferred embodiment;

    [0053] FIG. 4 shows a display device displaying the sequence of the method according to the invention according to a preferred embodiment.

    [0054] FIG. 1 shows a vehicle 1 according to the invention which is equipped with a system 1000 according to the invention. The system 1000 includes sensors arranged on the vehicle which are to be calibrated. The vehicle 1 drives or moves in a movement direction R (indicated by an arrow) in a vehicle environment 4. By way of example, a plurality of buildings and trees are arranged in the vehicle environment.

    [0055] The system 1000 is designed as a so-called roof box, which has been retrofitted onto the vehicle roof. The system 1000 has four first passive optical sensors 2a, 2b, 2c, 2d, which are designed as cameras. The sensor 2a is oriented towards the front in the travel direction, the sensors 2b, 2c are oriented laterally perpendicular to the travel direction and the sensor 2d is oriented towards the rear contrary to the travel direction. In this way, the entire vehicle environment 4 around the vehicle 1 can be captured by the sensors 2a-d (cameras). The system 1000 also has three second active optical sensors 3a, 3b, 3c, which are designed as lidar sensors (distance sensors). The sensor 3a here is designed as a 360° lidar sensor with 32 layers and is oriented towards the front in the travel direction (capture area). The sensors 3b, 3c are designed as 360° lidar sensors with 16 layers and are likewise oriented forwards in the travel direction (capture area). The sensor design and arrangement shown are only examples and can also be implemented differently.

    [0056] The system 1000 according to FIG. 1 also comprises a calibration unit 6 and a recognition unit 7, which are not shown (see FIG. 2).

    [0057] Two environmental features 5a, 5b, which can be recognised by the recognition unit 7 in the sensor data of the corresponding first sensors 2a-d and second sensors 3a-c, are shown as examples in the vehicle environment 4 through which the vehicle 1 passes while driving in the movement direction R. The environmental feature 5a is a part of a building, more precisely a window front or the transition from window to masonry. The environmental feature 5b relates to vegetation, more precisely a tree. The environmental features 5a, 5b are characterised in that they differ in colour and/or structure from their surroundings and are therefore easily recognisable.

    [0058] A system 1000 according to a preferred embodiment is shown schematically in FIG. 2.

    [0059] The system 1000 comprises at least one first sensor 2, at least one second sensor 3 and a third sensor 8, wherein the sensors 2, 3, 8 are arranged in and/or on the vehicle 1. The first sensor 2 is a passive optical sensor, such as a camera, the second sensor 3 is an active optical sensor, such as a lidar, radar or ultrasonic sensor, and the third sensor 8 is a position sensor, such as a GNSS receiver. The sensors 2, 3, 8 are each provided and designed to capture sensor data which comprise corresponding measured variables. The third sensor 8 is provided and designed to capture the spatial orientation and movement of the vehicle 1, wherein the sensor data of the third sensor 8 can also be used for calibrating the first sensor 2. The sensor data are transmitted to a computing unit 11 via at least one signalling connection 14, wherein the computing unit 11 comprises a calibration unit 6, a recognition unit 7 and an evaluation unit 9. The computing unit 11 can be arranged on the vehicle side and designed as part of a driver assistance system or independently (preferably as a general vehicle computing unit). Furthermore, the computing unit 11 could be designed as part of a data processing system or cloud which is arranged externally to the vehicle.
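
    Purely for illustration, this unit structure of FIG. 2 could be mirrored in code roughly as in the following Python sketch; the unit names track the reference signs, while all field names and default threshold values are assumptions made for this example, not part of the disclosure.

```python
# Hypothetical structural sketch of the system of FIG. 2; field names and
# default threshold values are assumptions made for illustration only.
from dataclasses import dataclass, field

@dataclass
class CalibrationUnit:                        # calibration unit 6
    intrinsics: dict = field(default_factory=dict)
    extrinsics: dict = field(default_factory=dict)

@dataclass
class RecognitionUnit:                        # recognition unit 7
    consistent_features: list = field(default_factory=list)

@dataclass
class EvaluationUnit:                         # evaluation unit 9
    first_threshold: int = 100                # consistent-feature count (assumed)
    second_threshold: float = 0.5             # distribution parameter (assumed)

@dataclass
class ComputingUnit:                          # computing unit 11
    calibration: CalibrationUnit = field(default_factory=CalibrationUnit)
    recognition: RecognitionUnit = field(default_factory=RecognitionUnit)
    evaluation: EvaluationUnit = field(default_factory=EvaluationUnit)
```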

    [0060] The calibration unit 6 is provided and designed to calibrate the at least one first sensor 2 by determining intrinsic sensor parameters and distortion parameters based on the sensor data captured by the first sensor 2, wherein a mathematical model or an algorithm is preferably used for this purpose in order to compensate for errors in the capture of the sensor data and to make the sensor data of the first sensor 2 usable for the further method. Furthermore, the calibration unit 6 is provided and designed to apply the determined intrinsic sensor parameters and distortion parameters to the sensor data of the first sensor 2 and thus to obtain transformed sensor data. Furthermore, the calibration unit 6 is provided and designed to calibrate the at least one first sensor 2 and the at least one second sensor 3 based on a matching spatial orientation of the recognised environmental features 5a, 5b in the transformed sensor data of the first sensor 2 and the sensor data captured by the second sensor 3 by determining extrinsic sensor parameters, and to apply the extrinsic sensor parameters to the transformed sensor data of the first sensor 2 and the sensor data captured by the second sensor 3 to obtain calibrated sensor data in each case.
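
    A minimal sketch of the transformation into "transformed sensor data" follows, assuming OpenCV's pinhole camera model: the intrinsic matrix K and the distortion coefficients d stand in for the intrinsic sensor parameters and distortion parameters, and all numeric values are placeholders; the patent does not prescribe this library or model.

```python
# Minimal sketch of step b: undistort a camera frame using assumed intrinsic
# and distortion parameters. All numeric values below are placeholders.
import cv2
import numpy as np

K = np.array([[800.0,   0.0, 640.0],          # fx,  0, cx (assumed values)
              [  0.0, 800.0, 360.0],          #  0, fy, cy
              [  0.0,   0.0,   1.0]])
d = np.array([-0.25, 0.07, 0.0, 0.0, 0.0])    # radial/tangential distortion

def transform(frame: np.ndarray) -> np.ndarray:
    """Apply intrinsics and distortion parameters -> 'transformed sensor data'."""
    return cv2.undistort(frame, K, d)

# e.g. transformed = transform(cv2.imread("camera_2a.png"))
```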

    [0061] The recognition unit 7 is provided and designed to detect environmental features 5a, 5b of the vehicle environment 4 previously passed through in the transformed sensor data of the first sensor 2 and the sensor data captured by the second sensor 3. The recognition preferably takes place based on mathematical models or an algorithm, which are preferably gradient-based or keypoint-based. The recognition unit 7 is also provided and designed to determine and to store consistent environmental features.
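
    As one concrete example of a keypoint-based method, a sketch using OpenCV's ORB detector follows; the patent names only the class of method (gradient-based or keypoint-based), so the specific detector is an assumption.

```python
# Sketch of a keypoint-based recognition, using ORB as one example of such a
# method; any comparable detector would serve the same role.
import cv2

orb = cv2.ORB_create(nfeatures=500)

def recognise(image):
    """Detect candidate environmental features (cf. 5a, 5b) in one frame."""
    keypoints, descriptors = orb.detectAndCompute(image, None)
    return keypoints, descriptors
```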

    [0062] The evaluation unit 9 is preferably provided and designed to determine a distribution parameter which specifies a spatial distribution of the consistent environmental features 5 in the respective sensor data. Furthermore, the evaluation unit 9 is provided and designed to compare a number of consistent environmental features 5 and the value of the distribution parameter with an assigned predetermined threshold value in each case and to enable step d. only when the threshold values are exceeded.
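
    One plausible realisation of the distribution parameter is sketched below, assuming it is computed as the fraction of image grid cells that contain at least one consistent feature; the patent only requires that the parameter specifies a spatial distribution, and the grid size and threshold values here are assumptions.

```python
# Plausible realisation of the distribution parameter: the fraction of grid
# cells containing at least one consistent feature (0.0 .. 1.0).
def distribution_parameter(points, width, height, grid=(4, 4)):
    """points: iterable of (x, y) feature positions in one sensor frame."""
    occupied = set()
    for x, y in points:
        col = min(int(x / width * grid[0]), grid[0] - 1)
        row = min(int(y / height * grid[1]), grid[1] - 1)
        occupied.add((row, col))
    return len(occupied) / (grid[0] * grid[1])

def step_d_enabled(n_features, dist_value, first_threshold=100, second_threshold=0.5):
    """Step d. is enabled only when both threshold values are exceeded."""
    return n_features > first_threshold and dist_value > second_threshold
```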

    [0063] The computing unit 11 is connected to a memory unit 10 via a bidirectional signalling connection 15. The intrinsic and extrinsic sensor parameters, the distortion parameter, the distribution parameter, the consistent environmental features and/or sensor data can be stored on the memory unit 10 in a retrievable manner.

    [0064] Furthermore, the computing unit 11 is connected via a bidirectional signalling connection 15 to a display device 12 with an operating device 13, wherein the display device is arranged in the vehicle 1. The display device 12 is provided and designed to display a sequence of the method based on the number of consistent environmental features relative to the first threshold value and the value of the distribution parameter relative to the second threshold value, so that the sequence of the method can be followed by a user, such as a vehicle occupant. The sequence of the method can be regulated by a user via the operating device 13.

    [0065] FIG. 3 shows a preferred embodiment of the method 100 according to the invention using a flow chart.

    [0066] The method 100 can start automatically when the vehicle 1 is put into operation or started, or it can be started manually by a user using the operating device 13.

    [0067] The method begins with a step S1 (corresponding to step a) and the capture of sensor data relating to a vehicle environment 4 passed through by the vehicle 1 during operation by at least one first passive optical sensor 2 arranged on the vehicle and at least one second active optical sensor 3 arranged on the vehicle.

    [0068] The at least one first sensor 2 is then calibrated by a calibration unit 6 according to step S2 (corresponding to step b), determining intrinsic sensor parameters and distortion parameters based on the sensor data captured by the first sensor 2, and the intrinsic sensor parameters and the distortion parameters are applied to the sensor data captured by the first sensor 2, wherein transformed sensor data are obtained.

    [0069] In a subsequent step S3 (corresponding to step c), environmental features 5 of the vehicle environment 4 previously passed through are recognised by the recognition unit 7 in the transformed sensor data of the first sensor 2 and the sensor data captured by the second sensor 3. Furthermore, environmental features 5 are stored as consistent environmental features 5 by the recognition unit 7 if they have been recognised as matching in a plurality of sensor data captured consecutively in terms of time and/or location, or are otherwise deleted.
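
    A sketch of this consistency filter follows; the identity test between captures and the required number of consecutive recognitions are assumptions, since the patent leaves both open.

```python
# Sketch of step S3's consistency filter: a feature is stored as "consistent"
# once re-recognised in enough consecutive captures, otherwise deleted.
# min_streak is an assumed value; the patent does not fix it.
def update_consistent(tracks, new_feature_ids, rerecognised_ids, min_streak=3):
    """tracks: {feature_id: streak count}. Returns ids of consistent features."""
    consistent = []
    for fid in list(tracks):
        if fid in rerecognised_ids:
            tracks[fid] += 1                  # matched again in this capture
            if tracks[fid] >= min_streak:
                consistent.append(fid)        # store as consistent feature
        else:
            del tracks[fid]                   # not re-recognised -> deleted
    for fid in new_feature_ids:
        tracks.setdefault(fid, 1)             # new candidate feature
    return consistent
```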

    [0070] In a further step S4, a distribution parameter which indicates a spatial distribution of the consistent environmental features 5 in the respective sensor data is determined by the evaluation unit 9. This step can also be carried out in step S3. The number of stored consistent environmental features is then compared with a predetermined first threshold value and the value of the distribution parameter is compared with a predetermined second threshold value by the evaluation unit 9 to determine whether the respective threshold value is exceeded.

    [0071] If it is determined in step S4 that the number of stored consistent environmental features and the value of the distribution parameter exceed the assigned threshold value in each case, step S5 is carried out. If, on the other hand, it is determined in step S4 that one of the threshold values is not exceeded, the method returns to step S3 and new or further environmental features 5 of the vehicle environment 4 previously passed through are recognised by the recognition unit 7 in new transformed sensor data of the first sensor 2 and new sensor data captured by the second sensor 3.

    [0072] According to step S5 (corresponding to step d), the at least one first sensor 2 and the at least one second sensor 3 are calibrated by the calibration unit 6 based on a matching spatial orientation of the recognised consistent environmental features 5 in the transformed sensor data of the first sensor 2 and the sensor data captured by the second sensor 3, by determining extrinsic sensor parameters. The extrinsic sensor parameters are applied to the transformed sensor data of the first sensor 2 and the sensor data captured by the second sensor 3, obtaining calibrated sensor data in each case.
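
    As an illustration of how such extrinsic sensor parameters could be recovered, the sketch below solves a perspective-n-point problem on assumed 2D/3D feature correspondences between the camera image and the lidar point cloud; the patent specifies the matching-orientation criterion, not this particular solver.

```python
# Illustrative recovery of extrinsic parameters (rotation R, translation t
# between camera and lidar) from matched environmental features. The PnP
# solver and the correspondence format are assumptions.
import cv2
import numpy as np

def extrinsics_from_matches(lidar_pts_3d, camera_pts_2d, K):
    """lidar_pts_3d: Nx3 array; camera_pts_2d: Nx2 array; K: 3x3 intrinsics."""
    ok, rvec, tvec = cv2.solvePnP(
        lidar_pts_3d.astype(np.float64),
        camera_pts_2d.astype(np.float64),
        K, None)                      # image already undistorted in step b
    if not ok:
        raise RuntimeError("PnP solve failed")
    R, _ = cv2.Rodrigues(rvec)        # 3x3 rotation from rotation vector
    return R, tvec
```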

    [0073] In a subsequent step S6, the intrinsic sensor parameters, the extrinsic sensor parameters, the distortion parameters, the consistent environmental features and/or the distribution parameter are stored on a memory unit 10.

    [0074] The method 100 ends automatically when the vehicle 1 is taken out of operation or parked, or when a user stops or ends the method 100 manually by means of the operating device 13.

    [0075] FIG. 4 shows a display device 12 with an operating device 13 according to a preferred embodiment.

    [0076] The operating device 13 comprises operating elements 16 in order to start, stop or restart the method, wherein further operating elements are conceivable. The operating elements 16 can be designed as touchscreen elements of the display device 12 or as mechanically actuatable buttons.

    [0077] The transformed sensor data of the at least one first sensor 2 in the form of a camera image 17 and the sensor data of the at least one second sensor 3 in the form of a point cloud 18 are displayed superimposed on one another on the display device 12. Recognised (consistent) environmental features 5 are highlighted by markings 20 (shown here as hatched circles by way of example). In this way, a user can follow the progress of the method in real time and can easily identify the recognised environmental features without having to be specially trained to do so.

    [0078] Furthermore, the number of consistent environmental features relative to the first threshold value and the value of the distribution parameter relative to the second threshold value can each be represented as a percentage (illustrated by “XX%”) by a calibration progress indicator 19 with an associated progress bar arranged above it. This further simplifies recognition of the sequence of the method and the progress thereof.
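
    Assuming each percentage is simply the ratio of the current value to its threshold, capped at 100%, the indicator values could be computed as follows; this mapping is an illustrative guess, as the patent does not define it.

```python
# Assumed mapping behind the "XX%" display of progress indicator 19.
def progress_percent(value, threshold):
    return min(value / threshold, 1.0) * 100.0

# e.g. progress_percent(n_consistent, first_threshold) -> value shown as "XX%"
```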

    [0079] All features disclosed in the application documents are claimed as substantial to the invention, provided that they are, individually or in combination, novel over the prior art.

    LIST OF REFERENCE SIGNS

    [0080] 1 vehicle
    [0081] 100 method
    [0082] 1000 system
    [0083] 2 first sensor
    [0084] 3 second sensor
    [0085] 4 vehicle environment
    [0086] 5 environmental feature
    [0087] 6 calibration unit
    [0088] 7 recognition unit
    [0089] 8 third sensor
    [0090] 9 evaluation unit
    [0091] 10 memory unit
    [0092] 11 computing unit
    [0093] 12 display device
    [0094] 13 operating device
    [0095] 14, 15 signalling connection
    [0096] 16 operating element
    [0097] 17 camera image
    [0098] 18 point cloud
    [0099] 19 calibration progress indicator
    [0100] 20 marking
    [0101] R movement direction, travel direction
    [0102] S step