Camera system for a vehicle, mirror replacement system comprising such camera system, and driver assistance system comprising such system

10509207 · 2019-12-17

Abstract

A camera system for a vehicle comprising a capturing unit which comprises an optical element and an image sensor with an image sensor surface, to capture a section of a vehicle environment. The optical element has a distortion curve r=f(α), wherein r is the distance between an object point displayed on the image sensor surface and the intersection point of the optical axis with the image sensor surface, and α is the angle between the optical axis of the optical element and the beam incident on the optical element from the object point. The distortion curve r=f(α) has, for r.sub.w=f(α.sub.w) within 0<r<r.sub.max, a turning point (α.sub.w; r.sub.w), for which r″=f″(α.sub.w)=d.sup.2r/dα.sup.2(α.sub.w)=0 applies, wherein r.sub.max is the distance r=f(α.sub.max) on the image sensor surface from the optical axis to the most distant edge of the image sensor.

Claims

1. A mirror replacement system, wherein the mirror replacement system is part of a commercial vehicle, and comprising a camera system for a vehicle, wherein the camera system comprises a capturing unit including an optical element and an image sensor having an image sensor surface and adapted to capture a section of a vehicle environment, wherein the optical element has a distortion with a distortion curve r=f(α), wherein r is the distance from an object point depicted on the image sensor surface to the intersection point of the optical axis with the image sensor surface, and α is the angle between the optical axis of the optical element and the beam incident on the optical element from the object point, the distortion curve r=f(α) for r.sub.w=f(α.sub.w) has a turning point (α.sub.w; r.sub.w) within 0<r<r.sub.max, for which r″=f″(α.sub.w)=d.sup.2r/dα.sup.2(α.sub.w)=0 applies, wherein r.sub.max is the distance r=f(α.sub.max) on the image sensor surface from the optical axis to the most distant boundary point of the image sensor surface, and for the curvature of the distortion curve r″=f″(α)<0 for 0<α<α.sub.w and r″=f″(α)>0 for α.sub.w<α<α.sub.max applies, and further comprising a processing unit for processing the data of the capturing unit and/or a display unit for displaying information captured by the capturing unit visible for the vehicle driver, wherein: the capturing unit is adapted to capture at least one of the field of vision of a main mirror, or the field of vision of a wide angle mirror on a side of the commercial vehicle, or the field of vision of a (close-)proximity mirror, or the field of vision of a front mirror, the capturing unit is preferably adapted to capture both the field of vision of a main mirror and the field of vision of a wide angle mirror on the same side of the commercial vehicle, the optical axis of the optical element of the capturing unit intersects the field of vision or one of the fields of vision, the optical axis of the optical element crosses one of the fields of vision in an intersection point (S) at a maximal distance of 5 m to a lateral boundary line of the vehicle, wherein the lateral boundary line is an intersecting line of a plane in parallel to the central longitudinal plane of the vehicle, which plane passes through a lateral outermost point of the vehicle, with the plane horizontal road.

2. The mirror replacement system according to claim 1, wherein the distortion curve r=f(α) has exactly one turning point (α.sub.w; r.sub.w) within 0<r<r.sub.max.

3. The mirror replacement system according to claim 1, wherein the gradient r′=dr/dα of the distortion curve r=f(α) is maximal in the region 0<α<α.sub.w at the zero point r=f(0)=0 of the distortion curve.

4. The mirror replacement system according to claim 1, wherein the gradient r′=dr/dα of the distortion curve r=f(α) is minimal at the turning point r=f(α.sub.w)=r.sub.w of the distortion curve.

5. The mirror replacement system according to claim 1, wherein the gradient r′=dr/dα of the distortion curve r=f(α) is maximal in the region α.sub.w<α<α.sub.max for α.sub.max (r=f(α.sub.max)=r.sub.max) of the distortion curve.

6. The mirror replacement system according to claim 1, wherein the distortion curve r=f(α) is a polynomial function f(α)=Σ.sub.i=0.sup.n a.sub.iα.sup.i.

7. The mirror replacement system according to claim 1, wherein the distortion curve r=f(α) is a spline function.

8. The mirror replacement system according to claim 1, wherein the distortion curve r=f(α) is a Bézier curve.

9. The mirror replacement system according to claim 1, wherein the centroid of the image sensor area and the intersection point of the optical axis with the image sensor surface do not coincide, and the optical axis is disposed eccentrically with respect to the image sensor surface.

10. The mirror replacement system according to claim 1, wherein the optical element includes at least one lens having a shape other than a partial sphere.

11. The mirror replacement system according to claim 1, wherein the optical element includes at least one aspherical lens.

12. The mirror replacement system according to claim 1, wherein the optical element includes at least two lenses that are different from each other.

13. The mirror replacement system according to claim 1, wherein the optical element has a rotationally symmetric distortion with regard to its optical axis, so that the distortion curves r=f(α) are identical for each angle of rotation around the optical axis.

14. The mirror replacement system according to claim 1, wherein the optical element has a distortion that is not rotationally symmetric with regard to its optical axis, so that a first distortion curve r.sub.1=f(α) for a rotational angle φ.sub.1 about the optical axis differs from a second distortion curve r.sub.2=f(α) for a rotational angle φ.sub.2 about the optical axis.

15. The mirror replacement system according to claim 14, wherein the optical element is anamorphic.

16. The mirror replacement system according to claim 1, wherein the mirror replacement system is adapted to display at least two fields of vision around the vehicle visible for the driver, wherein preferably a first field of vision is visible in a first region of the display unit, and a second field of vision is visible in a second region of the display unit, which second region is optically separated from the first region.

17. The mirror replacement system according to claim 16, wherein the mirror replacement system is adapted to capture the information of the two fields of vision by means of a joint/common capturing unit of the camera system, and the processing unit is adapted to separate and extract the data received from the capturing unit into information to be displayed in the first region of the display unit and the second region of the display unit, respectively.

18. The mirror replacement system according to claim 1, wherein a straight line-of-sight segment perpendicular to the lateral boundary line, which line segment passes through the intersection point (S) and is limited by the limitation of the field of vision of the main mirror, is in the region of the distortion curve r=f(α) for 0<α<α.sub.w with r″=f″(α)<0.

19. The mirror replacement system according to claim 18, wherein the turning point (α.sub.w; r.sub.w) is beyond the straight line-of-sight segment.

20. A driver assistance system comprising the mirror replacement system according to claim 1.

21. The mirror replacement system according to claim 1, wherein the mirror replacement system is adapted to visually display the information captured by the capturing unit, and is further adapted to display at least one field of vision, which is located on a plane horizontal part of the road around the vehicle, on the display unit visible for the vehicle driver.

Description

BRIEF DESCRIPTION OF THE DRAWINGS

(1) In the following, the invention is exemplarily described by means of the accompanying drawings, in which:

(2) FIG. 1 is a schematic view of a mirror replacement system using a camera system according to the invention;

(3) FIG. 2 is a perspective view of a commercial vehicle with a camera system according to the invention;

(4) FIG. 3 is a simplified sectional view of the essential components of the camera system;

(5) FIG. 4 is a perspective view of the essential components of the camera system, in accordance with FIG. 3;

(6) FIG. 5 illustrates a schematic, perspective sectional view of an embodiment of a structure of the optical element of the capturing unit of the camera system;

(7) FIG. 6 is a further sectional perspective view of the detailed structure of the optical element of the capturing unit of the camera system;

(8) FIG. 7 shows the distortion curve of the optical element of the capturing unit in an α, r coordinate system;

(9) FIG. 8 shows the first derivative of the distortion curve of the optical element of the capturing unit of the camera system;

(10) FIG. 9 shows the second derivative of the distortion curve of the optical element of the capturing unit of the camera system;

(11) FIG. 10a shows the distortion curve of the optical element of the capturing unit of the camera system in comparison to conventional distortion curves;

(12) FIG. 10b is a detail of FIG. 10a showing the distortion curves around the origin of the distortion curve;

(13) FIG. 11 is a schematic view of the image sensor of the capturing unit, where fields of vision of a main mirror and a wide angle mirror according to FIG. 2 are illustrated;

(14) FIG. 12 shows details of the distortion curve of the capturing unit of the camera system of the embodiment shown in FIG. 11;

(15) FIG. 13 is a schematic top view of a commercial vehicle, which schematically shows the movement of a region to be illustrated depending on the driving situations of the vehicle; and

(16) FIG. 14 is a schematic view of an image sensor surface of the capturing unit of the camera system, which illustrates the shift/displacement of the regions of interest according to FIG. 13 on the image sensor surface.

DETAILED DESCRIPTION OF THE PRESENTLY PREFERRED EMBODIMENTS

(17) FIG. 1 shows a schematic view of a mirror replacement system 100, for example, for a commercial vehicle. The mirror replacement system 100 can be integrated in a driver assistance system, or may be used as a separate mirror replacement system 100. The mirror replacement system 100 comprises a camera system 130, a processing unit 120, and a display unit 110. Image data captured by the camera system 130 are supplied to the processing unit 120 which, after adequate processing, supplies these image data to the display unit 110 for displaying the same visible for a vehicle driver. The mirror replacement system 100 may further be coupled with one or more vehicle sensors 140, which also supply data to the processing unit 120, like, for example, the current driving state (steering angle, driving speed, driving direction) of the vehicle, which data are taken into account by the processing unit 120 when processing the data received from the camera system 130. Alternatively, the vehicle sensor(s) 140 could be directly coupled with the camera system 130, so that the camera system is directly controlled dependent on the data received from the vehicle sensor(s) 140. It is, therefore, also possible that the processing unit 120 outputs data for controlling the camera system 130 to the camera system 130.

(18) The processing unit 120 may be provided as a processing unit separate from the camera system 130, e.g. in the form of an on-board computer of the vehicle, or, alternatively, it may be integrated into the camera system 130. The display unit 110 is, for example, a monitor provided in the vehicle, where the data supplied from the processing unit 120 are displayed visible for the vehicle driver. Alternatively, instead of a monitor provided in the vehicle, a display unit attached outside the vehicle, e.g. in the region of conventional vehicle mirrors, could be provided. Furthermore, the display unit could be implemented in the form of a projection on a vehicle structure component in the vehicle interior. With regard to the display unit 110 it has to be noted that, besides the illustrated embodiment where a monitor is provided for displaying the data supplied by the processing unit 120, also a plurality of separate monitors or display units may constitute the display unit. Depending on the requirements, these monitors may be formed identically to or differently from each other.

(19) Moreover, in particular if used in the context of a driver assistance system (ADAS), the mirror replacement system 100, and in particular its processing unit 120, is connected, if required, with further information or control components 150 of the vehicle, which components may be units that provide information to the driver, e.g. audio message units, or components that directly control the vehicle, e.g. a steering assistance.

(20) The camera system 130 comprises at least one capturing unit 30, which will be described in more detail in the following; it may, however, also comprise a plurality of capturing units 30 of the above-described type. Moreover, further capturing units 31 may be provided, which do not necessarily have to meet the requirements imposed on the capturing units 30. It is therefore possible that the processing unit 120, as indicated in FIG. 1, directly receives image data from the individual capturing units 30 and 31, respectively, instead of receiving said image data from the general camera system 130. Accordingly, the processing unit 120 can also supply control signals directly to the individual capturing units 30, 31.

(21) FIG. 2 shows a perspective view of a commercial vehicle 10 that is provided with a mirror replacement system 100 according to FIG. 1. Accordingly, one or more capturing units 30 are mounted to the commercial vehicle 10. As illustrated in FIG. 2, the exemplary mirror replacement system 100, which is mounted on the vehicle, is adapted for capturing a field of vision 11 of a main mirror and a field of vision 12 of a wide angle mirror by means of the capturing unit 30, and to display the same in/at the driver's cabin of the vehicle 10 visible for the driver of the commercial vehicle 10. In FIG. 2, the field of vision 11 of a main mirror and the field of vision 12 of a wide angle mirror are schematically illustrated by dashed lines (field of vision 11 of a main mirror by long dashes, field of vision 12 of a wide angle mirror by shorter dashes) on the plane road surface beside the vehicle 10.

(22) Further, in FIG. 2, the forward driving direction is denoted by an arrow D. All directions specified in this description, i.e. front, rear, left, right etc. refer to the forward driving direction D of the vehicle.

(23) The field of vision 11 of the main mirror, which is shown in FIG. 2, extends from a lateral boundary line 13 of the vehicle in lateral direction away from the vehicle and to the rear. The lateral boundary line 13 is a line defined by the intersection of the plane horizontal road with a plane in parallel to the central longitudinal plane (not illustrated) of the vehicle and passing through the laterally outermost point of the vehicle.

(24) The optical axis 302 of the optical element 301 (FIGS. 3, 4) of the capturing unit 30, which is, for example, provided by a camera, extends in lateral direction at an angle with respect to the central longitudinal plane of the vehicle and the road surface, such that it intersects the field of vision 11 of a main mirror on the road surface. This means that the intersection point S or crossing point of the optical axis 302 and the road surface is within the field of vision 11 of a main mirror in the embodiment shown in FIG. 2. Preferably, the intersection point S is located maximally 6 m behind the capturing unit 30 when viewed in the longitudinal direction of the vehicle, more preferably in the range of 4 to 5 m.

(25) Moreover, a straight line-of-sight segment 14 is illustrated in FIG. 2 by thin dotted lines, which straight line-of-sight segment is defined by the line segment of a line perpendicular to the lateral boundary line 13 and passing through the intersection point S of the optical axis 302 and the road surface, which is located within the area of the field of vision 11 of the main mirror.

(26) In the following, the capturing unit 30 of the camera system 130 is described in further detail with reference to FIGS. 3 and 4, which schematically show the beam path through a schematically illustrated optical element in a sectional view and a perspective view (cut), respectively, as well as by means of FIGS. 5 and 6, which schematically show embodiments of a structure of the optical element.

(27) The optical element 301 and the image sensor 303 form the essential components of the capturing unit 30 for the camera system for a motor vehicle. As can be seen in FIGS. 3 and 4, in the present embodiment, the optical element 301 is substantially rotationally symmetric around the optical axis 302. This means that any light beam incident on the capturing unit 30 from an object point to be displayed at an identical angle, e.g. α.sub.1, with regard to the optical axis 302, is displayed on the image sensor surface 304 rotationally symmetric around the optical axis with identical distortion. The angle α, in the following also referred to as object angle α, corresponds to the incidence angle of the light beam into the optical element 301 with respect to an incidence surface (not illustrated) perpendicular to the optical axis 302. Accordingly, any light beam incident at the object angle α.sub.1 is displayed on the image sensor surface 304 with a distance r.sub.1 to the optical axis 302. Here, the image sensor surface 304 is the surface that is actually provided for display within the entire image aperture angle of the optical element 301, i.e. the surface of the image sensor 303 which is suitable for display and which faces the optical element 301.

(28) FIGS. 3 and 4 schematically illustrate the beam paths through the optical element 301 and their respective display on the image sensor surface 304 of the image sensor 303 at distances r.sub.1, r.sub.2, . . . , r.sub.n for different object angles α.sub.1, α.sub.2, . . . α.sub.n. As, in the illustrated embodiment, the optical element 301 is rotationally symmetric, the distances r.sub.1, r.sub.2, . . . , r.sub.n and the beam path through the optical element 301 are also rotationally symmetric with regard to the optical axis 302. The optical axis 302, for which α=α.sub.0=0 and r=r.sub.0=0, strikes the image sensor surface 304 at the origin (α=0; r=0) of the distortion curve in an α, r coordinate system.

(29) The optical element 301, which, as schematically illustrated in FIGS. 5 and 6, is composed of a lens system and, if necessary, further optical components, and comprises a plurality of rotationally symmetric lenses arranged in a row one behind the other, has a so-called distortion curve r=f(α), which describes a geometrical imaging error of the optical element causing a local change of the image scale. Due to the rotational symmetry of the optical element 301, the distortion curve r=f(α) of the embodiment shown in FIGS. 3 and 4 is also rotationally symmetric with regard to the optical axis 302.

(30) An embodiment of the lens arrangement is illustrated in FIG. 5. Here, the seven lenses 314 to 320 are arranged in a row in the path of the incident light (from left to right in FIG. 5). Lens 314 is a spherical convex-concave lens, and lens 315 is likewise a spherical convex-concave lens. Lens 316 is formed by a spherical concave-convex lens, lens 317 by a freeform lens (aspherical lens) having a convex-concave surface and a concave surface, lens 318 is a spherical bi-convex lens, lens 319 is an aspherical lens having a convex surface and a convex-concave surface, and lens 320 is an aspherical lens having a concave and a convex-concave surface. The freeform surfaces of lenses 317 and 320 are also rotationally symmetric, such that the optical element 301 of the embodiment of FIG. 5, which is formed by the seven lenses, is rotationally symmetric with regard to the optical axis 302. As in the embodiment of FIG. 6, a sensor protection glass 305 and an infrared filter 329 may be provided in front of the image sensor 303. Also in this case, the optical element 301 has the schematically indicated beam path, as well as a distortion curve r=f(α) having a turning point in the region 0<r<r.sub.max.

(31) In the alternative embodiment of the optical element 301 shown in FIG. 6, the optical element 301 comprises eight lenses 306, 307, 308, 309, 310, 311, 312, 313 arranged in a row along the beam path of the incident light (from left to right in FIG. 6). In the order in which the incident light passes the lenses on its way to the image sensor 303, lens 306 is a spherical convex-concave lens, lens 307 is a spherical convex-concave lens, lens 308 is a spherical concave-convex lens, lenses 309 and 310 are, respectively, spherical bi-concave lenses, lens 311 is a spherical bi-concave lens, lens 312 is a freeform lens (aspherical lens) having a rotationally symmetric convex-concave surface and a convex surface, and lens 313 is an aspherical lens having a concave surface and a spherical convex surface. Moreover, a sensor protection glass 305 as well as an infrared filter 329 are provided as additional optical components in front of the image sensor 303. By means of such a lens arrangement, the incident light, as exemplarily shown for some beams in FIG. 6, is guided through and diverted by the optical element 301. Therefore, and due to the respective lens arrangement, the optical element 301 as a whole has the distortion curve r=f(α), which has a turning point in the region 0<r<r.sub.max of the image sensor surface 304.

(32) Both the optical system shown in FIG. 5 and the optical system shown in FIG. 6 comprise an aperture as a further component. Additional filters, apertures etc. may be provided if necessary. The lenses may be formed, for example, of glass (especially the spherical lenses) or of synthetic material. Different materials can be combined where required. Moreover, the lenses may be provided with, for example, a vapour-deposited metallic coating or a different coating, which usually has no influence on the light refraction, but only serves to influence scattering, eliminate undesired reflexions, etc.

(33) Most of the lenses 306 to 320 of the embodiments shown in FIGS. 5 and 6 are lenses having at least a partially spherical surface. Lenses 312, 317 and 320, however, are so-called aspherical lenses, which have at least one surface that is not partially spherical. Although not illustrated in FIGS. 5 and 6, by selecting suitable lenses that are not rotationally symmetric with regard to the optical axis 302, it is also possible to form the optical element 301 anamorphic, so that the optical element has a distortion curve that is not rotationally symmetric with regard to the optical axis.

(34) By means of the exemplary lens arrangements shown in FIGS. 5 and 6, rotationally symmetric distortion curves r=f(α) of the optical element can be generated, which are a function r=f(α) having a turning point (α.sub.w; r.sub.w) within the maximal distance r.sub.max, which is the maximal distance of a point on the image sensor surface 304 with regard to the optical axis 302. To attain the turning point in the region of 0<r(α)<r.sub.max for the distortion curve r=f(α), for α.sub.w corresponding to a radius r.sub.w on the image sensor surface 304, which is smaller than r.sub.max, the following has to apply: r″=f″(α.sub.w)=d.sup.2r/dα.sup.2(α=α.sub.w)=0; r″=f″(α)<0 for 0<α<α.sub.w; r″=f″(α)>0 for α.sub.w<α<α.sub.max. Such type of distortion curve, which may, for example, be achieved by the lens arrangement of the optical element according to FIGS. 5 and 6, is schematically shown in FIG. 7. Its first derivative is shown in FIG. 8 and its second derivative is shown in FIG. 9.
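The turning-point conditions above can be checked numerically for any candidate distortion curve. The following sketch uses a hypothetical cubic curve; the coefficients C1, C3 and the inflection angle ALPHA_W are illustrative assumptions, not values from this disclosure. It verifies that r″ vanishes at α.sub.w and changes sign around it:

```python
import math

# Hypothetical distortion curve r = f(alpha) with an inflection at ALPHA_W
# (coefficients are illustrative, not taken from the patent):
ALPHA_W = math.radians(30)   # inflection angle alpha_w
C1, C3 = 8.0, 4.0            # assumed mm/rad and mm/rad^3

def f(alpha):
    # r(0) = 0; r'' < 0 for alpha < ALPHA_W, r'' > 0 for alpha > ALPHA_W
    t = alpha - ALPHA_W
    return C1 * alpha + C3 * (t * t * t + ALPHA_W ** 3)

def d2f(alpha, h=1e-5):
    # central-difference second derivative d^2 r / d alpha^2
    return (f(alpha + h) - 2 * f(alpha) + f(alpha - h)) / h ** 2

assert abs(f(0.0)) < 1e-12      # curve passes through the origin
assert abs(d2f(ALPHA_W)) < 1e-3 # f''(alpha_w) = 0: turning point
assert d2f(0.1) < 0             # curved to the right before alpha_w
assert d2f(1.0) > 0             # curved to the left after alpha_w
```

The same numerical check applies to any other curve shape, e.g. a spline fitted to measured lens data.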

(35) As shown in FIG. 7, in an α, r coordinate system, a turning point (α.sub.w; r.sub.w) is present in the region [0; r.sub.max]. Further, at the specific object angle α=α.sub.2=α.sub.w, the second derivative (FIG. 9) of the distortion curve is zero, i.e. has a zero crossing at α.sub.w. As illustrated in FIG. 9, in front of the turning point, i.e. in the region 0<α<α.sub.w, the second derivative of the distortion curve r=f(α) is negative; in the region α.sub.w<α<α.sub.max, the second derivative is positive. This means that, as can be seen in FIG. 7, the distortion curve r(α) is curved to the right in a first region 0<α<α.sub.w, and curved to the left in a second region α.sub.w<α<α.sub.max.

(36) The origin of the α, r coordinate system in FIG. 7, i.e. r=0 mm, α=0, corresponds to the point of the optical axis 302 on the image sensor. r.sub.max is the maximal distance a point on the image sensor can have from the optical axis 302. If, in a rectangular image sensor, the optical axis is centric, i.e. arranged at the centroid, and the rectangular image sensor has edge lengths a, b, r.sub.max={square root over (a.sup.2+b.sup.2)}/2 applies, i.e. half the sensor diagonal. If the optical axis is not arranged centrically on the image sensor, the distance r.sub.max is the distance from the intersection point of the optical axis 302 with the image sensor surface 304 to the most distant corner of the image sensor surface 304.
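The distinction between a centric and an eccentric optical axis can be made concrete with a short helper; the sensor dimensions and axis offsets below are illustrative assumptions, not values from the description. For a centric axis the most distant corner lies at half the sensor diagonal, while an eccentric axis increases r.sub.max:

```python
import math

def r_max(a, b, cx=None, cy=None):
    """Distance from the optical-axis intersection point (cx, cy) to the
    most distant corner of an a x b image sensor surface; centric default."""
    if cx is None:
        cx = a / 2
    if cy is None:
        cy = b / 2
    return max(math.hypot(cx - x, cy - y)
               for x in (0.0, a) for y in (0.0, b))

# Centric axis: half the sensor diagonal (illustrative 6.4 mm x 4.8 mm sensor)
assert math.isclose(r_max(6.4, 4.8), math.hypot(6.4, 4.8) / 2)
# Eccentric axis (cf. FIG. 11): r_max grows accordingly
assert r_max(6.4, 4.8, cx=2.0, cy=2.4) > r_max(6.4, 4.8)
```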

(37) FIGS. 10a and 10b show, also in the α, r coordinate system, the distortion curve r=f(α) for an optical element 301 of the capturing unit 30 in comparison to several distortion curves of the prior art. FIG. 10b shows an enlarged section Z in the region of the origin of the α, r coordinate system. The distortion curve r=f(α) having a turning point (α.sub.w; r.sub.w) in the region of 0<α<α.sub.max is illustrated by a solid line and denoted by f.sub.1. f.sub.2 denotes a gnomonic (distortion-free) distortion curve, f.sub.3 a stereographic, i.e. conformal, distortion curve, f.sub.4 an equidistant distortion curve, f.sub.5 an equal-area distortion curve, and f.sub.6 an orthographic distortion curve. The focal lengths with regard to the distortion curves are selected such that all distortion curves pass through the point (α.sub.w, r.sub.w).

(38) As can be seen from FIGS. 10a and 10b, distortion curve f.sub.1 has a turning point at (α.sub.w; r.sub.w), where the curvature of the distortion curve changes from curved to the right (in the region 0<α<α.sub.w) to curved to the left (in the region α.sub.w<α<α.sub.max). Further, as can be seen in particular from FIG. 10b, the gradient of the distortion curve f.sub.1 near the origin of the α, r coordinate system is large, in particular compared to the other distortion curves. This means that a relatively large area for displaying a relatively small angle is provided on the image sensor 303, which has the effect that this region can be displayed with high resolution. Moreover, the gradient of the distortion curve r=f.sub.1(α) at the turning point (α.sub.w; r.sub.w) is minimal, i.e. at the turning point itself and in its close proximity, a relatively low gradient is present. Finally, for α.sub.max, the gradient of the distortion curve is preferably maximal or relatively large, as can particularly be seen from the illustration in FIG. 10a.

(39) A distortion curve as illustrated in FIGS. 7 to 10b is, for example, described by a polynomial function f(α)=Σ.sub.i=0.sup.n a.sub.iα.sup.i. Alternatively, the distortion curve may be described by a spline function, i.e. a piecewise polynomial function consisting of a plurality of polynomial segments, or by a Bézier curve, i.e. a parametrically defined, numerically generated curve.
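As a sketch of the polynomial case, f(α)=Σ a.sub.iα.sup.i can be evaluated with Horner's scheme; the coefficients below are purely illustrative, not values from the disclosure. For a cubic, the second derivative 2a.sub.2+6a.sub.3α vanishes at α.sub.w=−a.sub.2/(3a.sub.3), which places the turning point inside the used angle range:

```python
def horner(coeffs, alpha):
    """Evaluate f(alpha) = a_0 + a_1*alpha + ... + a_n*alpha**n."""
    r = 0.0
    for c in reversed(coeffs):
        r = r * alpha + c
    return r

# Hypothetical coefficients a_0..a_3; a_0 = 0 so that f(0) = 0
coeffs = [0.0, 9.0, -6.0, 4.0]
alpha_w = -coeffs[2] / (3 * coeffs[3])   # inflection angle of the cubic
assert horner(coeffs, 0.0) == 0.0        # curve passes through the origin
assert alpha_w == 0.5                    # turning point within the angle range
```

A spline or Bézier description can be evaluated in the same piecewise manner, one polynomial segment at a time.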

(40) Referring to FIG. 11 and FIG. 12, the display of the fields of vision 11, 12 for the commercial vehicle 10 (FIG. 2), which are captured by means of the camera system 130, on the image sensor surface 304 is explained. The image sensor surface 304 of the image sensor 303 shown in FIG. 11 is rectangular with side lengths a and b. As can be seen in FIG. 11, in the illustrated embodiment, the optical axis 302 is arranged eccentrically with regard to the image sensor surface 304, i.e. beyond the centroid of the image sensor surface 304. Specifically, the optical axis 302 is eccentric with regard to side a of the rectangular image sensor surface 304. This results in a maximal distance r.sub.3=r.sub.max from the optical axis 302 on the image sensor surface 304 to the most distant corners of the image sensor surface 304. Moreover, in accordance with FIGS. 10a and 10b, those radii having the optical axis 302 as center are illustrated which pass through the turning point of the distortion curve r=f(α) (r.sub.2=r.sub.w); a radius r.sub.SB corresponding to the maximal distance from the optical axis 302 to the edge of the sensor surface in parallel to the sensor edge a; a radius r.sub.SH corresponding to the maximal distance from the optical axis 302 on the sensor surface 304 to the edge of the sensor surface in parallel to the sensor edge b; as well as a radius r.sub.1, which corresponds to the first radius r.sub.1 at an angle α.sub.1 as illustrated in FIGS. 10a and 10b. Further, in FIG. 11, the displayed image 11′ of the field of vision 11 of the main mirror (see FIG. 2), the displayed image 12′ of the field of vision 12 of the wide angle mirror (see FIG. 2), as well as a displayed image 15′ of the horizon line are illustrated.
As can be seen, specifically the displayed image 11′ of the field of vision 11 of the main mirror is largely within a region located within the radius r.sub.1, such that this region within radius r.sub.1 is displayed with increased resolution compared to usual distortion curves of the prior art. Moreover, the entire displayed image 12′ of the field of vision 12 of the wide angle mirror can be captured on the same image sensor with the same optical element. It is not necessary to provide a second optical element and/or a second image sensor and to subsequently combine the images for display.

(41) Furthermore, in FIG. 11, the displayed image 14′ of the straight line-of-sight segment 14 (see FIG. 2) is illustrated. As can be seen from FIG. 11, this line segment extends substantially in parallel to a lateral edge of the image sensor surface 304 (lateral edge a).

(42) In FIG. 12, this displayed image 14′ of the straight line-of-sight segment is also represented in the α, r coordinate system, in addition to the distortion curve r=f.sub.1(α). It is clearly recognizable that the entire width of the displayed image of the field of vision 11 of the main mirror is within the region 0<α<α.sub.w, i.e. in the region of the distortion curve r=f(α) that is curved to the right and, thus, provides high resolution (specifically when compared to the distortion curves of conventional optical systems).

(43) In the presently described embodiment, where the camera system 130 is used in a mirror replacement system 100 of a vehicle, a processing unit 120 of the mirror replacement system 100 can subsequently evaluate the image data captured by the image sensor 303 and display the same, for example on a monitor, visible for a driver located, for example, in the driver's cabin of a commercial vehicle. In the present embodiment, separate regions are read out for the field of vision 11 of a main mirror and the field of vision 12 of a wide angle mirror and, in a preferred embodiment (not illustrated), displayed to the driver in separate regions of the display unit 110. The separate regions may be provided on a common monitor or on separate monitors. It is therefore possible to model the usual appearance of a main mirror and a wide angle mirror for the driver of the commercial vehicle. If the camera system 130 is, for example, used in the context of a driver assistance system, the regions of interest of the image sensor surface 304 can also be evaluated with regard to specific environmental information (e.g. road lines, traffic signs, other road users, etc.) by a processing unit and, dependent on the captured and determined information, an intervention in the vehicle control system can be effected, a note or information may be indicated to the driver, etc.
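The separate read-out of display regions described above can be sketched as a simple rectangular crop of the captured frame; the region names, coordinates, and frame layout are assumed for illustration only:

```python
# Minimal sketch of splitting one captured frame into two display regions.

def extract_roi(frame, top, left, height, width):
    """Cut a rectangular region of interest out of a row-major frame."""
    return [row[left:left + width] for row in frame[top:top + height]]

# Dummy 8x6 "image": each pixel stores its own (row, column) coordinates
frame = [[(y, x) for x in range(8)] for y in range(6)]
main_mirror = extract_roi(frame, top=0, left=0, height=4, width=5)
wide_angle = extract_roi(frame, top=4, left=0, height=2, width=8)
assert len(main_mirror) == 4 and len(main_mirror[0]) == 5
assert wide_angle[0][0] == (4, 0)
```

In a real system the two crops would then be rendered into the two optically separated regions of the display unit.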

(44) In a mirror replacement system 100 as described above, it is further possible, dependent on the driving situation of the vehicle, for example a commercial vehicle 10, to extract the data to be displayed to the driver on the display unit 110 from different regions of the image sensor surface 304, i.e. to evaluate different portions of the image sensor surface 304 at different times during driving operation. This is exemplarily described with reference to FIGS. 13 and 14.

(45) FIG. 13 shows a top view of a commercial vehicle during forward or straight-ahead driving, wherein the field of vision 11 of a main mirror and the field of vision 12 of a wide angle mirror are schematically illustrated. FIG. 14 illustrates, also for straight-ahead driving, the image sensor surface 304 with the displayed image 11′ of the field of vision 11 and the displayed image 12′ of the field of vision 12. As already explained above, during normal straight-ahead driving, the region 21 for displaying the field of vision 11 of the main mirror, and thus for providing the driver with a view of the field of vision 11 of the main mirror, is extracted from the data in a specific first portion of the image sensor surface 304. If the driving situation changes, it may happen that, while the alignment of the capturing unit 30 on the vehicle does not change, the region of interest shifts from the original region of interest 21 to the displaced/shifted region 22. This may be the case if a vehicle, in particular a commercial vehicle with trailer, drives along curves or performs a manoeuvring process. In this case, the region of interest, which corresponds to the field of vision 11 of the main mirror as illustrated in FIG. 13, shifts to the region 22. By means of the camera system 130 comprising the capturing unit 30 that includes the optical element 301, which has the distortion curve r=f(α) of the above-described type, it is possible to also shift the region on the image sensor surface 304 from which image sensor data are extracted, such that, as illustrated in FIG. 14, image data of a region 22 on the image sensor surface 304 are extracted. This is possible without losing the required precision of the image data, in particular the resolution, as the distortion curve r=f(α) provides the required resolution and distortion in all regions from which data may be extracted, without requiring data post-processing.
Thus, the field of vision 11 and its displayed image 11′, respectively, may be updated corresponding to the driving situation. A mechanical adjustment of the capturing unit 30 is not required. Rather, the adjustment may be effected exclusively by extracting the image data of the image sensor surface 304 in selected regions.
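The driving-situation-dependent shift of the read-out region can be sketched as follows. The use of a steering angle as the situation input, the proportional gain, and the pixel offsets are purely illustrative assumptions; what matters is that only the extraction coordinates change, not the camera alignment:

```python
def region_of_interest(base_roi, steering_angle_deg, max_shift_px=200):
    """Shift the read-out region horizontally in proportion to the steering
    angle, instead of mechanically adjusting the capturing unit."""
    r0, r1, c0, c1 = base_roi
    # clamp the proportional shift to what the sensor surface allows
    shift = int(max(-max_shift_px,
                    min(max_shift_px, steering_angle_deg * 4)))
    return (r0, r1, c0 + shift, c1 + shift)

BASE_ROI = (100, 700, 200, 700)  # region 21 during straight-ahead driving

# straight-ahead driving: region 21 is read out unchanged
assert region_of_interest(BASE_ROI, 0.0) == (100, 700, 200, 700)
# cornering: the read-out shifts toward region 22 while the camera stays fixed
assert region_of_interest(BASE_ROI, 30.0) == (100, 700, 320, 820)
```

Because the distortion curve guarantees sufficient resolution everywhere data may be extracted, the shifted region 22 needs no post-processing beyond the read-out itself.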

(46) These advantages are achieved by using at least one capturing unit 30 comprising an optical element 301 having a distortion curve r=f(α), which has a turning point within the maximal distance of a point on the image sensor surface 304 to the optical axis 302 on the image sensor surface 304.

(47) It is explicitly stated that all features disclosed in the description and/or the claims are intended to be disclosed separately and independently from each other for the purpose of original disclosure as well as for the purpose of restricting the claimed invention independent of the composition of the features in the embodiments and/or the claims.

(48) Thus, while there have been shown and described and pointed out fundamental novel features of the invention as applied to a preferred embodiment thereof, it will be understood that various omissions and substitutions and changes in the form and details of the devices illustrated, and in their operation, may be made by those skilled in the art without departing from the spirit of the invention. For example, it is expressly intended that all combinations of those elements and/or method steps which perform substantially the same function in substantially the same way to achieve the same results are within the scope of the invention. Moreover, it should be recognized that structures and/or elements and/or method steps shown and/or described in connection with any disclosed form or embodiment of the invention may be incorporated in any other disclosed or described or suggested form or embodiment as a general matter of design choice. It is the intention, therefore, to be limited only as indicated by the scope of the claims appended hereto.