SYNCHRONIZED CAMERA SYSTEM HAVING TWO DIFFERENT CAMERAS
20220038646 · 2022-02-03
Assignee
Inventors
- Aless Lasaruk (Lindau, DE)
- Reik Müller (Oberreitnau, DE)
- Simon Hachfeld (Lindau, DE)
- Dieter Krökel (Eriskirch, DE)
- Stefan Heinrich (Achern, DE)
CPC classification
B60R11/04
PERFORMING OPERATIONS; TRANSPORTING
H04N25/41
ELECTRICITY
H04N23/90
ELECTRICITY
H04N23/58
ELECTRICITY
H04N13/25
ELECTRICITY
International classification
Abstract
The invention relates to a camera system, and to a method for controlling the camera system, for capturing images of the surroundings of a vehicle for a driver assistance system of the vehicle. The camera system comprises a first rolling shutter camera (1) having a first aperture angle α, a second rolling shutter camera (2) having a second aperture angle β, and control electronics. The first camera (1) is suitable for generating a wide-angle camera image, that is, the first aperture angle α is greater than the second aperture angle β of the second camera (2) which is suitable for generating a tele camera image. The two cameras (1, 2) are designed in such a way that both camera images have an overlap region.
The control electronics is configured to synchronize the two cameras (1, 2).
The geometric arrangement of the two cameras (1, 2) with respect to one another, and the position of the overlap region (10) in the wide-angle image and in the tele camera image, are determined by means of continuous estimation.
The stored geometric arrangement and position are taken into consideration during synchronization of the first camera (1) and the second camera (2) of the camera system.
Claims
1-11. (canceled)
12. A camera system for capturing images of the surroundings of a vehicle for a driver assistance system of the vehicle, the camera system comprising: a first rolling shutter camera having a first aperture angle; a second rolling shutter camera having a second aperture angle; wherein the first camera is configured to generate a wide-angle camera image and the second camera is designed to generate a tele camera image such that an overlap region exists among the camera images; and control electronics configured to synchronize the two cameras; wherein a geometric arrangement of the two cameras with respect to one another and a position of the overlap region are determined by continuous estimation; and wherein the determined geometric arrangement and position are taken into consideration during synchronization.
13. The camera system according to claim 12, wherein the control electronics is configured to produce the synchronization in such a manner that the greater line geometry of the first camera predefines the start of the synchronization and the clock frequency of the second camera, such that the exposure of each line of the first camera is begun synchronously with the relevant line of the second camera.
14. The camera system according to claim 13, wherein the first camera captures each line n.sub.w with a readout clock Δt.sub.l and the second camera captures each line n.sub.T with the readout clock Δt.sub.l/k, wherein k indicates the number of lines n.sub.T, n.sub.T+1, . . . , n.sub.T+(k−1) of the second camera which are contained in one line n.sub.w of the first camera.
15. The camera system according to claim 14, wherein two pixels of the second camera are predefined inside the overlap region and are exposed synchronously to the relevant pixels of the first camera.
16. The camera system according to claim 15, wherein a start pixel of the first line and an end pixel of the last line of the second camera are each exposed, inside the overlap region for each line of the first camera, synchronously to a relevant start and end pixel of the line of the first camera.
17. The camera system according to claim 12, wherein the image sensor of the first camera exposes with an exposure time b1 and the image sensor of the second camera exposes with an exposure time b2, wherein b2&gt;b1, and the start of exposure of the first camera is delayed by (b2−b1)/2 with respect to the start of exposure of the second camera.
18. The camera system according to claim 17, wherein the start of capturing of the second camera is delayed by half a line clock Δt.sub.l of the first camera with respect to the first camera inside the overlap region.
19. The camera system according to claim 12, wherein the image sensors of the first camera and the second camera have a comparable number of pixels.
20. The camera system according to claim 12, wherein the resolution of the second camera in the overlap region is n times the resolution of the first camera, wherein n is a natural number.
21. The camera system according to claim 12, wherein the geometric arrangement of the two cameras with respect to one another and the position of the overlap region are continuously estimated by utilizing a fundamental matrix estimation.
22. A method for controlling a camera system for capturing images of the surroundings of a vehicle for a driver assistance system of the vehicle, wherein a first rolling shutter camera has a first aperture angle and a second rolling shutter camera has a second aperture angle, wherein the first camera is configured to generate a wide-angle camera image and the second camera is configured to generate a tele camera image such that an overlap region exists among the camera images, said method comprising: determining the geometric arrangement of the two cameras with respect to one another, and the position of the overlap region in the wide-angle image and in the tele camera image, by continuous estimation; and synchronizing the two cameras, including considering the determined geometric arrangement and position.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0036] Exemplary embodiments are described and explained in greater detail below on the basis of the figures.
DETAILED DESCRIPTION
[0045] An exemplary embodiment of a camera system is represented in
[0047] A projection of the spatial point (5) onto a field of view plane of the second camera (2) produces the image (9) of the second camera (2).
[0048] In the general case, the projection (9) of the field of view of the second camera (2), intersected with the image of the first camera (1), is a distorted four-sided figure.
[0050] In this case, the four-sided figure
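The projection geometry described above can be sketched with a simple pinhole model. All intrinsics, the baseline, and the spatial point below are illustrative assumptions, not values from the patent; they merely show how the same world point (5) lands at different pixel positions in the wide-angle image and the tele image.

```python
import numpy as np

def project(K, R, t, X):
    """Project a 3D point X (world frame) to pixel coordinates."""
    x = K @ (R @ X + t)
    return x[:2] / x[2]

# Wide-angle camera (1): short focal length, large aperture angle alpha.
K_wide = np.array([[400.0, 0, 640], [0, 400.0, 400], [0, 0, 1]])
# Tele camera (2): long focal length, small aperture angle beta.
K_tele = np.array([[1600.0, 0, 640], [0, 1600.0, 400], [0, 0, 1]])

X = np.array([1.0, 0.5, 20.0])        # spatial point (5), 20 m ahead
t_base = np.array([-0.2, 0.0, 0.0])   # assumed 20 cm baseline between cameras

p_wide = project(K_wide, np.eye(3), np.zeros(3), X)   # image in camera (1)
p_tele = project(K_tele, np.eye(3), t_base, X)        # image (9) in camera (2)
print(p_wide, p_tele)
```

Projecting the four corner rays of the tele field of view in the same way yields the distorted four-sided overlap region in the wide-angle image.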
[0051] An exemplary embodiment of a method for controlling the camera system is schematically represented in
[0052] The ratio represented in
[0053] Relationships are represented in
[0054] The difference in the start of exposure of subsequent lines, that is, the exposure start clock of the second camera (2), is advantageously reduced to the fractional value Δt=Δt.sub.l/k. The exposure start clock corresponds to the readout clock of a line. The second camera (2) therefore has a shorter exposure start clock overall. As a result, the k lines of the second camera (2) start their exposure within the time Δt.sub.l. Consequently, line n+1 of the first camera (1) in turn starts simultaneously with the corresponding line of the second camera (2).
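The timing relationship in this paragraph can be sketched numerically. The readout clock Δt.sub.l and the factor k below are illustrative example values; the sketch only demonstrates that every k-th tele line start coincides with a wide-angle line start.

```python
# Line synchronization sketch: the tele camera's exposure-start clock is
# reduced to dt_l / k, so tele line k*n starts together with wide line n.
dt_l = 20e-6   # wide-angle readout clock per line (assumed: 20 us)
k = 4          # tele lines contained in one wide-angle line (assumed)

def wide_start(n):
    return n * dt_l           # start of exposure of wide-angle line n

def tele_start(m):
    return m * (dt_l / k)     # start of exposure of tele line m

for n in range(3):
    # Tele line k*n is the first of the k lines covered by wide line n.
    assert abs(wide_start(n) - tele_start(k * n)) < 1e-12
    # The remaining k-1 tele lines start inside the same wide line period.
    offsets = [tele_start(k * n + m) - wide_start(n) for m in range(k)]
    print(n, offsets)   # offsets: 0, dt_l/k, 2*dt_l/k, 3*dt_l/k
```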
[0055] The optimized imager control (4) can now be used in order to repeat the method according to
[0056] An exemplary overlap region of the two cameras (1, 2) is represented schematically in
[0057] Inside the overlap region, the start pixel (17) of the first line n.sub.T and the end pixel (18) of the last line n.sub.T+2 of the second camera (2) are now each exposed synchronously to the relevant start or end pixels (that is, the first or the fourth pixel, lower line (14) in
[0058] As a general rule, the second camera (2) has more pixels per degree. Each of its pixels collects significantly less light and must therefore generally be exposed for longer than a pixel of the first camera (1). Consequently, the second camera (2) exposes object movements over a longer period of time than the first camera (1), which results in more pronounced blurring in the image of the second camera (2), in particular due to motion blur. In order to reduce the motion blur effect for the imaging of the stereo image, the following synchronization can be used in a preferred embodiment. If, for example, the image sensor of the second camera (2) requires an exposure time of b2=10 ms and the image sensor of the first camera (1) only requires b1=1 ms, the image sensor of the first camera (1) can be started with a delay of (b2−b1)/2=4.5 ms. The motion blur of the image sensor of the second camera (2) is thereby distributed more uniformly over the relevant regions of the image sensor of the first camera (1), and the error in the calculation of the stereo image is decreased.
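The arithmetic behind the (b2−b1)/2 delay can be verified directly: delaying the shorter exposure by half the difference aligns the temporal midpoints of both exposures, using the example values b1=1 ms and b2=10 ms given above.

```python
# Exposure centering sketch for paragraph [0058]: delay the short wide-angle
# exposure so that both exposure intervals share the same temporal midpoint.
b1 = 1e-3    # wide-angle exposure time (1 ms, from the example)
b2 = 10e-3   # tele exposure time (10 ms, from the example)

delay = (b2 - b1) / 2            # 4.5 ms, as stated in the text
mid_wide = delay + b1 / 2        # midpoint of the delayed wide exposure
mid_tele = 0.0 + b2 / 2          # midpoint of the tele exposure
print(delay, mid_wide, mid_tele) # both midpoints fall at 5.0 ms
```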
[0059] Consequently, the respective imager control (4) allows a suitable synchronization for two cameras (1, 2) having different fields of view.
[0060] Results from camera calibration methods are utilized in order to define a temporal exposure schedule which seeks to minimize the temporal differences in the exposure of corresponding world points (5).
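Claim 21 names fundamental matrix estimation as the means of continuously estimating the geometric arrangement. One snapshot of such an estimation could look like the following normalized 8-point fit; the patent does not specify the algorithm, and the synthetic camera geometry below is purely illustrative.

```python
import numpy as np

def estimate_fundamental(pts1, pts2):
    """Normalized 8-point fundamental matrix fit from (N,2) pixel arrays."""
    def normalize(p):
        c = p.mean(axis=0)
        s = np.sqrt(2) / np.mean(np.linalg.norm(p - c, axis=1))
        T = np.array([[s, 0, -s * c[0]], [0, s, -s * c[1]], [0, 0, 1.0]])
        ph = np.hstack([p, np.ones((len(p), 1))])
        return (T @ ph.T).T, T

    x1, T1 = normalize(pts1)
    x2, T2 = normalize(pts2)
    A = np.stack([x2[:, 0] * x1[:, 0], x2[:, 0] * x1[:, 1], x2[:, 0],
                  x2[:, 1] * x1[:, 0], x2[:, 1] * x1[:, 1], x2[:, 1],
                  x1[:, 0], x1[:, 1], np.ones(len(x1))], axis=1)
    F = np.linalg.svd(A)[2][-1].reshape(3, 3)   # null vector of A
    U, S, Vt = np.linalg.svd(F)
    F = U @ np.diag([S[0], S[1], 0.0]) @ Vt     # enforce rank 2
    F = T2.T @ F @ T1                           # undo normalization
    return F / np.linalg.norm(F)

# Synthetic correspondences: world points seen by both cameras (overlap region).
rng = np.random.default_rng(0)
X = rng.uniform([-2, -2, 4], [2, 2, 8], size=(20, 3))
K = np.array([[800.0, 0, 320], [0, 800.0, 240], [0, 0, 1]])
a = 0.1
R2 = np.array([[np.cos(a), 0, np.sin(a)], [0, 1, 0], [-np.sin(a), 0, np.cos(a)]])

def project(R, t, X):
    x = (K @ (R @ X.T + t[:, None])).T
    return x[:, :2] / x[:, 2:]

pts1 = project(np.eye(3), np.zeros(3), X)
pts2 = project(R2, np.array([-0.5, 0, 0]), X)
F = estimate_fundamental(pts1, pts2)

h1 = np.hstack([pts1, np.ones((20, 1))])
h2 = np.hstack([pts2, np.ones((20, 1))])
residuals = np.abs(np.sum(h2 * (F @ h1.T).T, axis=1))  # x2^T F x1 should be ~0
print(residuals.max())
```

Repeating such a fit on a stream of correspondences would yield the continuous estimate referred to in the claims.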
[0061] It can be ensured with the method described above that the exposure takes place simultaneously at the start and end of the overlap region (first and last pixel or line). Moreover, it is ensured that, at the beginning of the exposure of a line of the wide-angle camera, there is synchronicity with the first relevant line of the tele camera. The further lines of the tele camera, which also correspond to the relevant line of the wide-angle camera, diverge in synchronicity and then converge again at the beginning of the next line of the wide-angle camera. This is illustrated by
[0062] A further optional improvement in the synchronicity of the capturing consists of additionally delaying the start of capturing of the tele camera with respect to the wide-angle camera by half the line clock of the wide-angle camera, as a result of which the maximum asynchronicity ΔT.sub.max is again halved. The very schematic representation in
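The halving effect can be checked numerically. With the line-clock scheme from paragraph [0054], the tele line starts within one wide-angle line drift from 0 up to nearly one full wide line clock; shifting the tele capture by half a wide line clock centers these differences around zero. The values and the sign of the shift below are illustrative assumptions.

```python
# Asynchronicity sketch for paragraph [0062]: a half-line-clock offset roughly
# halves the maximum start-time difference between corresponding lines.
dt_l = 20e-6   # wide-angle line clock (assumed)
k = 10         # tele lines per wide-angle line (assumed)

# Start-time difference of tele line m (within one wide line) vs. that wide line.
diffs = [m * dt_l / k for m in range(k)]
max_plain = max(abs(d) for d in diffs)       # approaches dt_l * (k-1)/k

shifted = [d - dt_l / 2 for d in diffs]      # apply half-line-clock offset
max_shifted = max(abs(d) for d in shifted)   # now bounded by dt_l / 2

print(max_plain, max_shifted)
```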