METHOD FOR CALIBRATING THE ALIGNMENT OF CAMERAS

20230230281 · 2023-07-20


    Abstract

    A method for calibrating a multiplicity of cameras on a vehicle in a common coordinate system. The method includes: a) acquiring camera images using each camera of the multiplicity of cameras, b) ascertaining overlap regions of camera images acquired in step a), c) setting up a common optimization function, which describes the alignments of each camera of the multiplicity of cameras as a target variable of the optimization, based on the overlap regions, d) solving the common optimization function set up in step c) to ascertain the alignments of each camera.

    Claims

    1. A method for calibrating a multiplicity of cameras on a vehicle in a common coordinate system, the method comprising the following steps: a) acquiring camera images using each camera of the multiplicity of cameras; b) ascertaining overlap regions of the camera images acquired in step a); c) setting up a common optimization function, which describes alignments of each camera of the multiplicity of cameras as a target variable of the optimization, based on the overlap regions; and d) solving the common optimization function set up in step c) to ascertain the alignments of each camera of the multiplicity of cameras.

    2. The method as recited in claim 1, wherein the common coordinate system is a vehicle coordinate system.

    3. The method as recited in claim 1, wherein a rectification of the camera images is implemented.

    4. The method as recited in claim 1, wherein the alignment of each camera of the multiplicity of cameras is described by a rotation matrix, which describes the alignment of the camera relative to the common coordinate system.

    5. The method as recited in claim 1, wherein the optimization function includes a system matrix, which describes the alignment of the cameras relative to one another in the form of variables that are identified with a minimum error when the optimization function is solved in step d).

    6. The method as recited in claim 1, wherein an optimization, which is at least of the second order, is performed to solve the optimization function.

    7. The method as recited in claim 1, wherein a Gauss-Newton algorithm is used to solve the optimization function.

    8. The method as recited in claim 1, wherein the multiplicity of cameras forms a camera circle in which the cameras cover at least a portion of an environment of the vehicle.

    9. The method as recited in claim 1, wherein absolute alignment information is inserted into the common coordinate system for at least one camera of the multiplicity of cameras when setting up the optimization function.

    10. The method as recited in claim 1, wherein pixels in the overlap region of different camera images are allocated to one another when setting up the common optimization function in step c).

    11. A non-transitory machine-readable memory medium on which is stored a computer program for calibrating a multiplicity of cameras on a vehicle in a common coordinate system, the computer program, when executed by a computer, causing the computer to perform the following steps: a) acquiring camera images using each camera of the multiplicity of cameras; b) ascertaining overlap regions of the camera images acquired in step a); c) setting up a common optimization function, which describes alignments of each camera of the multiplicity of cameras as a target variable of the optimization, based on the overlap regions; and d) solving the common optimization function set up in step c) to ascertain the alignments of each camera of the multiplicity of cameras.

    12. A control device configured to calibrate a multiplicity of cameras on a vehicle in a common coordinate system, the control device configured to: a) acquire camera images using each camera of the multiplicity of cameras; b) ascertain overlap regions of the camera images acquired in step a); c) set up a common optimization function, which describes alignments of each camera of the multiplicity of cameras as a target variable of the optimization, based on the overlap regions; and d) solve the common optimization function set up in step c) to ascertain the alignments of each camera of the multiplicity of cameras.

    Description

    BRIEF DESCRIPTION OF THE DRAWINGS

    [0095] FIG. 1 shows a top view of an exemplary representation of a vehicle including a camera circle.

    [0096] FIG. 2 shows a top view of a further exemplary representation of a vehicle having a plurality of cameras.

    [0097] FIG. 3 shows an exemplary sequence of the method introduced here, according to the present invention.

    [0098] FIG. 4 shows exemplary image representations to illustrate an advantageous aspect of the method according to an example embodiment of the present invention.

    [0099] FIG. 5 shows a top view of a further exemplary representation of a vehicle having a multiplicity of cameras, according to the present invention.

    DETAILED DESCRIPTION OF EXAMPLE EMBODIMENTS

    [0100] FIG. 1 schematically shows a top view of an exemplary representation of a vehicle 3 having a camera circle 5. In this context, FIG. 1 shows an exemplary system of video sensors on a vehicle. While front and rear cameras 1 can be calibrated exactly and robustly relative to the world, this is not possible in all situations, or with the required accuracy and robustness, for sideways-pointing cameras 2 while the vehicle is driving. The method introduced here can contribute to solving this problem.

    [0101] FIG. 2 schematically shows a further exemplary representation of a vehicle 3 having a plurality of cameras 1, 2 in a top view. In this context, FIG. 2 illustrates, by way of example, that in calibration methods according to the related art, side cameras 2 (K1 and K3) can be calibrated relative to the front and rear cameras 1 (K2 and K0). A calibration may be calculated for each overlap region 4 between two cameras 1, 2. This results in two calibrations for each side camera 2 (dotted and dashed lines). It is unclear, a priori, how these calibrations are to be reconciled with one another. The related art presently uses a simple harmonization, which averages the error; however, this may also worsen an already well-calibrated overlap region 4. The method introduced here can contribute to solving this problem as well.

    [0102] By way of example, FIG. 2 also shows a control device 6, which is configured to execute the described method.

    [0103] FIG. 3 schematically shows an exemplary sequence of the method introduced here. The sequence of steps a), b), c) and d) represented by blocks 110, 120, 130 and 140 serves as an example and may be cycled through at least once in the illustrated sequence, for instance. The method is used to calibrate the alignment of a multiplicity of cameras 1, 2 on a vehicle 3 in a common coordinate system. The common coordinate system may be a vehicle coordinate system, for instance.

    [0104] In block 110, each camera 1, 2 of the multiplicity of cameras 1, 2 acquires camera images according to step a).

    [0105] A rectification of the camera images may particularly be implemented in this context, for instance in such a way that virtual images are generated from original or real images (see FIG. 4).
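One way such a rectification can be realized is to define a virtual pinhole camera rotated toward the overlap region and to compute, for each virtual pixel, the corresponding pixel in the real image. The sketch below is only an illustration of this idea: it assumes pinhole models for both the real and the virtual camera (the patent does not prescribe a camera model), and the names `K_real`, `K_virt`, and `R_virt` are hypothetical.

```python
import numpy as np

def virtual_view_map(K_real, K_virt, R_virt, h, w):
    """Pixel map that warps a real camera image into a virtual
    (rectified) view rotated by R_virt. Pinhole model assumed for
    both cameras; illustrative only."""
    # Pixel grid of the virtual image in homogeneous coordinates.
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    pix = np.stack([u, v, np.ones_like(u)], axis=-1).reshape(-1, 3).T
    # Back-project to viewing rays, rotate into the real camera
    # frame, and reproject into the real image.
    rays = np.linalg.inv(K_virt) @ pix
    rays = R_virt @ rays
    proj = K_real @ rays
    proj = proj[:2] / proj[2]
    return proj[0].reshape(h, w), proj[1].reshape(h, w)  # source x, y
```

With identical intrinsics and an identity rotation, the map reduces to the identity, which provides a simple sanity check of the geometry.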

    [0106] In block 120, according to step b), overlap regions 4 of camera images acquired in step a) are ascertained.

    [0107] In block 130, according to step c), a common optimization function, which describes the alignments of each camera 1, 2 of the multiplicity of cameras 1, 2 as a target variable of the optimization, is set up with the aid of overlap regions 4.

    [0108] For example, when setting up the optimization function, absolute alignment information may be inserted into the common coordinate system for at least one camera 1 of the multiplicity of cameras 1, 2.

    [0109] As a further example, pixels in overlap region 4 of different (real or virtual) camera images are able to be allocated to one another when setting up the common optimization function in step c).
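Such an allocation could, for instance, be realized by mutual nearest-neighbour matching of feature descriptors extracted in the overlap region. The following sketch is a hypothetical illustration; the patent does not prescribe a particular matching scheme, and `max_dist` is an assumed rejection threshold.

```python
import numpy as np

def match_overlap_pixels(desc_a, desc_b, max_dist=0.5):
    """Allocate pixels of two camera images in their overlap region
    to one another by mutual nearest-neighbour descriptor matching.
    desc_a: (Na, D) descriptors from image A, desc_b: (Nb, D)."""
    # Pairwise Euclidean distances between all descriptor pairs.
    d = np.linalg.norm(desc_a[:, None, :] - desc_b[None, :, :], axis=-1)
    best_ab = d.argmin(axis=1)  # best match in B for each pixel of A
    best_ba = d.argmin(axis=0)  # best match in A for each pixel of B
    # Keep only mutual matches below the distance threshold.
    return [(i, j) for i, j in enumerate(best_ab)
            if best_ba[j] == i and d[i, j] <= max_dist]
```

The mutual-consistency check discards ambiguous allocations, which keeps gross outliers out of the optimization function.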

    [0110] In block 140, according to step d), the common optimization function set up in step c) is solved to ascertain the alignments of each camera 1, 2.

    [0111] To solve the optimization function, an optimization of at least the second order, simply by way of example, may be carried out. However, as an especially advantageous but likewise merely exemplary embodiment variant, a Gauss-Newton algorithm may be used to solve the optimization function.
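A Gauss-Newton iteration of the kind mentioned above can be sketched generically as follows. This is a minimal illustration with a forward-difference Jacobian, not the patent's implementation; a production calibration would typically parameterize the rotations on SO(3) and use analytic Jacobians.

```python
import numpy as np

def gauss_newton(residual, x0, iters=20):
    """Minimal Gauss-Newton solver minimizing the sum of squared
    residuals of `residual(x)` over the parameter vector x."""
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        r = residual(x)
        # Forward-difference Jacobian, one column per parameter.
        eps = 1e-6
        J = np.stack([(residual(x + eps * e) - r) / eps
                      for e in np.eye(x.size)], axis=1)
        # Gauss-Newton step: linearized least-squares update.
        dx = np.linalg.lstsq(J, -r, rcond=None)[0]
        x = x + dx
        if np.linalg.norm(dx) < 1e-10:
            break
    return x
```

Because the step solves the linearized problem exactly, the method converges quadratically near the optimum, which is the second-order behaviour referred to above.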

    [0112] For instance, the alignment of each individual camera 1, 2 is able to be described by a rotation matrix, which describes the alignment of camera 1, 2 in relation to the common coordinate system.

    [0113] By way of example, the optimization function may include a system matrix, which describes the relative alignment of cameras 1, 2 relative to one another in the form of variables that are able to be identified with a minimal error when the optimization function is solved in step d).
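The rotation-matrix description of paragraph [0112] and the relative alignments of paragraph [0113] can be illustrated as follows. The yaw-only model and the camera names are simplifying assumptions for this sketch; in general each alignment is a full rotation in three dimensions.

```python
import numpy as np

def rot_z(yaw):
    """Rotation about the vertical axis of the common (vehicle) frame."""
    c, s = np.cos(yaw), np.sin(yaw)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

# Alignment of each camera i expressed as a rotation R_i into the
# common frame; the relative alignment entering the system matrix
# is then R_j^T @ R_i for a camera pair (i, j).
R_front = rot_z(0.0)
R_side = rot_z(np.deg2rad(90.0))
R_rel = R_front.T @ R_side
```

Writing the unknowns this way keeps every camera's alignment referenced to the one common coordinate system while the measurements constrain only the relative rotations.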

    [0114] FIG. 4 schematically shows an exemplary image representation to illustrate an advantageous aspect of the present method, i.e., a possible rectification. In this context, FIG. 4 shows, for example, original images (on top) and an associated rectification (at the bottom) of a rear and a right fisheye camera. The region marked in the above original view is the common overlap region 4. The virtual views shown at the bottom predominantly relate to overlap region 4.

    [0115] Following the rectification (bottom), the images are clearly more similar, which benefits subsequent image processing, for example the calculation of the optical flow. Since these images are slightly rotated during the rectification, this can also be taken into account in the modeling of the calibration problem. The new approach described here simplifies the estimation of the calibration considerably and makes it more robust insofar as it estimates the real camera calibration and operates only indirectly on the virtual cameras.

    [0116] FIG. 5 schematically shows another exemplary representation of a vehicle 3 having multiple cameras 1, 2 in a view from above. FIG. 5 is specifically used to illustrate advantageous aspects of the new method described here.

    [0117] FIG. 5 particularly illustrates the new problem modeling. While the classic method directly calculated the pose between the virtual cameras (see virtual alignments 9 and the dash-dotted arrow), the pose is now calculated implicitly via the concatenation of the poses and optimized with regard to the delta pose ^(K0)P_(U0). In this context, reference is made to the above equations for a more detailed explanation of the mathematical relationships. Here, it is assumed that camera 1 (the front camera) is already calibrated. The advantage is obvious: instead of estimating an orientation of the virtual cameras relative to one another, the calibration of side camera 2 is estimated directly. This makes it possible to couple and jointly optimize any number of cameras. In FIG. 5, reference numeral 7 denotes uncalibrated alignments, reference numeral 8 denotes calibrated alignments, and reference numeral 9 denotes virtual alignments.
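The coupling of any number of cameras described above can be sketched, reduced to yaw angles for illustration, as one joint least-squares problem: the known front-camera alignment anchors the solution, and the relative alignments measured in each overlap region of the camera circle constrain the rest. The yaw-only model and all names are assumptions of this sketch, not the patent's notation.

```python
import numpy as np

def solve_circle_yaws(yaw_front, rel_yaws):
    """Jointly estimate the yaw of every camera in a camera circle.
    rel_yaws[i] measures yaw[i+1] - yaw[i] around the ring, with
    camera 0 being the calibrated front camera."""
    n = len(rel_yaws)  # number of cameras in the circle
    rows, rhs = [], []
    # Absolute anchor: the front camera is already calibrated.
    row0 = np.zeros(n); row0[0] = 1.0
    rows.append(row0); rhs.append(yaw_front)
    # One relative constraint per overlap region around the ring.
    for i, dy in enumerate(rel_yaws):
        row = np.zeros(n)
        row[(i + 1) % n] = 1.0
        row[i] = -1.0
        rows.append(row); rhs.append(dy)
    A, b = np.array(rows), np.array(rhs)
    return np.linalg.lstsq(A, b, rcond=None)[0]
```

Because all constraints enter one system, an error in a single overlap region is localized during the optimization instead of being propagated around the ring by pairwise chaining.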

    [0118] The described algorithm is preferably implemented in software. The advantage of the described method over a separate calibration of each camera can be recognized, for instance, once a complete online calibration has been performed (detectable by the diagnosis output). Subsequently, exactly one side camera is deliberately rotated by a few degrees (approximately 3 to 5 degrees) in different directions so that this camera is now decalibrated. If conventional methods are then used for an individual calibration of the cameras, this can be detected in that not only the camera itself but also adjacent cameras no longer exhibit the same values as during the first calibration process, because the error of 3 to 5 degrees propagates further and is thus distributed to other cameras. In contrast, if the method introduced here is used, no such difference can be detected: through the fusion of multiple cameras, the method identifies precisely which camera is decalibrated and automatically recalibrates it.