METHOD FOR CALIBRATING A CAMERA AND/OR A LIDAR SENSOR OF A VEHICLE OR A ROBOT

20230082700 · 2023-03-16

    Abstract

    Calibrating a camera and/or a lidar sensor of a vehicle or a robot involves the camera capturing images of a vehicle or robot environment. The lidar sensor emits a real pattern into the vehicle or robot environment in at least one portion of a detection range of the camera, and the real pattern is captured by the camera. A virtual pattern generated in a coordinate system of the lidar sensor is projected by the lidar sensor onto a virtual plane in the vehicle or robot environment. Laser radiation emitted by the lidar sensor penetrates the virtual plane, and the real pattern, which correlates with the virtual pattern, is generated on a real projection surface. The real pattern captured by the camera is recalculated onto the virtual plane based on a surface profile of the real projection surface. From the recalculated pattern, a rectified virtual pattern is generated in a coordinate system of the camera, and the camera and/or the lidar sensor is/are calibrated by comparing the virtual pattern and the rectified virtual pattern.

    Claims

    1-10. (canceled)

    11. A method for calibrating a camera or a lidar sensor of a vehicle or robot, the method comprising: capturing, by the camera, images of a vehicle or robot environment; emitting, by the lidar sensor into the vehicle or robot environment, a real pattern in at least one portion of a detection range of the camera; capturing, by the camera, the real pattern; projecting, by the lidar sensor onto a virtual plane in the vehicle or robot environment, a virtual pattern generated in a coordinate system of the lidar sensor, wherein laser radiation emitted by the lidar sensor penetrates the virtual plane, generating the real pattern correlated with the virtual pattern on a real projection surface; recalculating the real pattern captured by the camera onto the virtual plane based on a surface profile of the real projection surface; generating, based on the recalculated real pattern, a rectified virtual pattern in a coordinate system of the camera; and calibrating the camera or the lidar sensor based on a comparison of the virtual pattern and the rectified virtual pattern.

    12. The method of claim 11, wherein the surface profile of the real projection surface is determined by the lidar sensor based on distances to pixels of the real pattern.

    13. The method of claim 11, wherein the comparison of the virtual pattern and the rectified virtual pattern involves determining at least one transformation equation for converting the virtual pattern into the rectified virtual pattern or for converting the rectified virtual pattern into the virtual pattern.

    14. The method of claim 11, wherein the calibration further comprises determining an azimuth error or elevation error of the camera or of the lidar sensor.

    15. The method of claim 11, wherein the generation of the virtual pattern involves the laser radiation emitted by the lidar sensor being deflected by a rotating mirror of the lidar sensor.

    16. The method of claim 11, wherein, during the capture of the real pattern by the camera, an integration is performed across several successively captured images of the camera.

    17. The method of claim 11, wherein calibrating the camera further comprises switching the camera to a calibration mode.

    18. The method of claim 11, wherein the virtual pattern is projected by infrared laser radiation.

    19. The method of claim 11, wherein the images of the vehicle or robot environment are captured by the camera while light radiation hitting the camera is filtered by a camera-internal infrared light filter.

    20. The method of claim 19, wherein the camera-internal infrared light filter is transparent to infrared laser radiation emitted by the lidar sensor and reflected by the real projection surface, or is switched, in a calibration mode of the camera, to be transparent to the infrared laser radiation emitted by the lidar sensor and reflected by the real projection surface.

    Description

    BRIEF DESCRIPTION OF THE DRAWING FIGURES

    [0020] In the drawings:

    [0021] FIG. 1 shows, in schematic form, a perspective view of a vehicle, a virtual pattern, and a real pattern.

    [0022] FIG. 2 shows, in schematic form, a block diagram of a device for calibrating a camera and/or a lidar sensor of a vehicle.

    [0023] Corresponding parts are provided with the same reference numerals in all figures.

    DETAILED DESCRIPTION

    [0024] FIG. 1 shows a perspective view of a vehicle 1, a virtual pattern M.sub.v and a real pattern M.sub.r. FIG. 2 shows a block diagram of a possible exemplary embodiment of a device 4 for calibrating a camera 3 and/or a lidar sensor 2.1 of the vehicle 1.

    [0025] The following statements apply analogously to robots that comprise at least one camera 3 and/or at least one lidar sensor 2.1. Such robots may, for example, also be designed as vehicles, for example as highly or fully automated passenger cars, highly or fully automated transport vehicles, or highly or fully automated trucks. The robots may also be industrial robots, automated lawn mowers, vacuum robots, mopping robots, or automated watercraft.

    [0026] The vehicle 1 comprises a lidar 2 with at least one lidar sensor 2.1 and a camera 3, wherein the lidar 2 and the camera 3 are designed to capture a vehicle environment.

    [0027] The device 4 comprises the lidar 2, the camera 3, and a processing unit 5.

    [0028] The aim of the calibration is to calibrate the camera 3 relative to the lidar sensor 2.1 in such a way that any differences between the optical axes of the camera 3 and the lidar sensor 2.1, i.e., azimuth and elevation errors, are compensated, so that the camera 3 and the lidar sensor 2.1 see the same object at the same location.

    [0029] For this purpose, a virtual plane E.sub.v located in the detection range of the camera 3 and the lidar sensor 2.1 is defined in front of the vehicle 1. By means of the lidar sensor 2.1, infrared laser radiation is emitted, and the virtual pattern M.sub.v is thus generated on the virtual plane E.sub.v. The virtual pattern M.sub.v is, for example, a chessboard pattern and is generated in a coordinate system of the lidar sensor 2.1. The infrared laser radiation penetrates the virtual plane E.sub.v, so that a real pattern M.sub.r correlating with the virtual pattern M.sub.v is projected onto a projection surface A in the vehicle environment, for example onto a road surface. This projected real pattern M.sub.r is distorted compared to the virtual pattern M.sub.v, because the projection surface A on which the real pattern M.sub.r arises is not plane-parallel to the virtual plane E.sub.v.
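    As an illustration of the geometry just described, the following minimal Python sketch generates a chessboard-corner grid M.sub.v on a virtual plane in the lidar coordinate system and extends each laser ray through that plane onto a tilted projection surface, yielding the distorted real pattern M.sub.r. The axis convention (x forward, y left, z up), the placement of E.sub.v at x = plane_dist, the planar model of the projection surface, and all names are illustrative assumptions, not part of the disclosure.

        import numpy as np

        def virtual_pattern_on_plane(rows, cols, spacing, plane_dist):
            """Chessboard-corner grid M_v on the virtual plane E_v, assumed
            here to lie at x = plane_dist in the lidar frame."""
            ys = (np.arange(cols) - (cols - 1) / 2.0) * spacing
            zs = (np.arange(rows) - (rows - 1) / 2.0) * spacing
            yy, zz = np.meshgrid(ys, zs)
            return np.stack(
                [np.full(yy.size, float(plane_dist)), yy.ravel(), zz.ravel()],
                axis=1)  # (rows * cols, 3) points in lidar coordinates

        def project_through_plane(pattern_pts, surface_pt, surface_n):
            """Extends the laser ray through each virtual-pattern point until
            it hits the real projection surface (modeled here as a plane with
            point surface_pt and normal surface_n), giving M_r."""
            real = []
            for p in pattern_pts:
                d = p / np.linalg.norm(p)  # ray from the lidar origin
                t = np.dot(surface_pt, surface_n) / np.dot(d, surface_n)
                real.append(t * d)  # assumes each ray does hit the surface
            return np.array(real)

    Because the projection surface is generally not plane-parallel to E.sub.v, the points of M.sub.r returned by this sketch are unevenly spaced, which is exactly the distortion referred to above; for a road-surface projection, the pattern would be directed below the horizon so that every ray actually intersects the road.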

    [0030] In a possible embodiment, the lidar 2 or the lidar sensor 2.1 has a rotating mirror with which the infrared laser radiation is deflected across a scene to be scanned.

    [0031] By means of the lidar sensor 2.1, a distance d to individual pixels of the projected real pattern M.sub.r is determined. From these distances, by means of the processing unit 5, a surface profile of the projection surface A is determined three-dimensionally, and a so-called ground truth is generated.
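    A minimal sketch of this step, assuming the lidar sensor reports a distance and known beam angles (azimuth, elevation) for each pattern pixel; the spherical-to-Cartesian conversion and the axis convention are illustrative assumptions:

        import numpy as np

        def surface_profile(distances, azimuths, elevations):
            """Converts the measured distance d per pattern pixel, together
            with the known beam angles, into 3D surface points: the 'ground
            truth' profile of the projection surface A (angles in radians)."""
            x = distances * np.cos(elevations) * np.cos(azimuths)
            y = distances * np.cos(elevations) * np.sin(azimuths)
            z = distances * np.sin(elevations)
            return np.stack([x, y, z], axis=1)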

    [0032] At least one image B of the real pattern M.sub.r is captured by means of the camera 3. Based on the determined surface profile of the projection surface A, the captured real pattern M.sub.r is rectified by means of the processing unit 5 by recalculating what it would look like on the virtual plane E.sub.v. The result of this recalculation is the rectified virtual pattern M.sub.ev in the coordinate system of the camera 3.
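    The recalculation onto the virtual plane can be pictured as a ray-plane intersection. The sketch below assumes that the pattern points observed by the camera have already been lifted to 3D camera coordinates using the surface profile, and that a nominal (approximately known) lidar origin in the camera frame is available; both assumptions and all names are illustrative:

        import numpy as np

        def rectify_onto_virtual_plane(pts_real_cam, lidar_origin_cam,
                                       plane_pt, plane_n):
            """Moves each observed 3D point of the real pattern M_r back
            along the ray from the (nominal) lidar origin until it crosses
            the virtual plane E_v (given by point plane_pt and normal
            plane_n), yielding the rectified virtual pattern M_ev in
            camera coordinates."""
            rectified = []
            for p in pts_real_cam:
                d = p - lidar_origin_cam  # ray lidar origin -> surface point
                t = (np.dot(plane_pt - lidar_origin_cam, plane_n)
                     / np.dot(d, plane_n))
                rectified.append(lidar_origin_cam + t * d)
            return np.array(rectified)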

    [0033] Synchronization fluctuations of the rotating mirror of the lidar 2 can cause the distances between the lines of the virtual pattern M.sub.v, and therefore the distances between the lines of the real pattern M.sub.r, to vary. In order to compensate for these errors when capturing the real pattern M.sub.r, in a possible embodiment an integration is carried out across several images B of the camera 3. This integration also increases the effective sensitivity of the camera 3 in the infrared range, which is advantageous because a camera 3 designed as a conventional color camera is most sensitive in the visible light range and has a rather low sensitivity in the infrared range.
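    A hedged sketch of such an integration, here simply averaging a burst of successively captured infrared images; a real implementation would additionally handle exposure control and motion compensation:

        import numpy as np

        def integrate_frames(frames):
            """Averages several successively captured images B: the pattern
            signal adds coherently, while line-spacing jitter from the
            rotating mirror and sensor noise average out."""
            stack = np.stack([f.astype(np.float64) for f in frames])
            return stack.mean(axis=0)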

    [0034] In a further possible embodiment, the camera 3 is switched to a calibration mode for the calibration. Furthermore, the camera 3 may have an infrared light filter to reduce interference and/or to increase color quality. This infrared light filter is designed, for example, in such a way that it is either transparent to the reflected infrared laser radiation, or can be switched, in the calibration mode of the camera 3, to a state transparent to the infrared laser radiation of the lidar sensor 2.1.

    [0035] Due to an azimuth and elevation error, the coordinate system of the lidar sensor 2.1 is shifted and rotated relative to the coordinate system of the camera 3. Therefore, the two virtual patterns M.sub.v, M.sub.ev are also shifted and rotated relative to one another.

    [0036] From the two virtual patterns M.sub.v, M.sub.ev, transformation equations are determined for converting one virtual pattern M.sub.v, M.sub.ev into the other virtual pattern M.sub.ev, M.sub.v. This means that transformation parameters P of a coordinate transformation are determined, with which the data captured by the camera 3 can be transformed into the coordinate system of the lidar sensor 2.1, or with which the data captured by the lidar sensor 2.1 can be transformed into the coordinate system of the camera 3.
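    One common way to determine such transformation parameters from corresponding points of the two patterns is a least-squares rigid alignment (Kabsch algorithm). The sketch below, including the small-angle read-out of azimuth and elevation errors under an assumed x-forward, y-left, z-up convention, is an illustrative choice, not the method prescribed by the disclosure:

        import numpy as np

        def estimate_transform(pts_v, pts_ev):
            """Least-squares rigid transform (R, t) mapping the virtual
            pattern M_v onto the rectified virtual pattern M_ev, given
            row-wise corresponding (N, 3) point arrays."""
            cv, cev = pts_v.mean(axis=0), pts_ev.mean(axis=0)
            H = (pts_v - cv).T @ (pts_ev - cev)   # cross-covariance
            U, _, Vt = np.linalg.svd(H)
            S = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
            R = Vt.T @ S @ U.T                    # proper rotation
            t = cev - R @ cv
            return R, t  # the transformation parameters P

        def azimuth_elevation_error(R):
            """Reads azimuth (yaw) and elevation (pitch) errors from R,
            assuming x forward, y left, z up and a ZYX Euler convention."""
            azimuth = np.arctan2(R[1, 0], R[0, 0])
            elevation = -np.arcsin(R[2, 0])
            return azimuth, elevation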

    [0037] With the determined transformation parameters P, the environment data determined by the lidar sensor 2.1 or the camera 3 is then transformed into the coordinate system of the respective other sensor during regular operation. The lidar sensor 2.1 and the camera 3 then see the same object at the same location.
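    With row-vector points and the parameters P = (R, t) from the sketch above, this regular-operation step reduces to a single line; again an illustration, not a prescribed implementation:

        # Map lidar detections into the camera coordinate system so that
        # both sensors report the same object at the same location.
        pts_camera = pts_lidar @ R.T + t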

    [0038] Although the invention has been illustrated and described in detail by way of preferred embodiments, the invention is not limited by the examples disclosed, and other variations can be derived therefrom by the person skilled in the art without departing from the scope of the invention. It is therefore clear that a plurality of possible variations exists. It is also clear that embodiments stated by way of example are really only examples that are not to be seen as limiting the scope, possible applications, or configuration of the invention in any way. In fact, the preceding description and the description of the figures enable the person skilled in the art to implement the exemplary embodiments in concrete form and, with knowledge of the disclosed inventive concept, to undertake various changes, for example with regard to the functioning or arrangement of individual elements stated in an exemplary embodiment, without departing from the scope of the invention, which is defined by the claims and their legal equivalents, such as further explanations in the description.