Method and device for determining a headlight range alignment
09970752 · 2018-05-15
CPC classification
G01J1/4257
PHYSICS
B60Q1/10
PERFORMING OPERATIONS; TRANSPORTING
International classification
B60Q1/10
PERFORMING OPERATIONS; TRANSPORTING
Abstract
A method for determining a headlight range alignment of at least one first headlight of a motor vehicle. The motor vehicle includes at least one optical sensor which is designed for detecting at least one part of a first illuminated area of the first headlight and generating a first image having the part of the first illuminated area. The method includes the following steps: reading in the first image from the optical sensor, selecting at least one first image area in the first image, wherein a cut-off line of the first headlight is intended to be imaged in the first image area, and determining the headlight range alignment as a function of the first image area.
Claims
1. A method for determining a headlight range alignment of a first headlight and a second headlight of a motor vehicle, the motor vehicle including at least one optical sensor to detect at least one part of a first illuminated area of the first headlight and to generate a first image including the part of the first illuminated area, the method comprising: reading in the first image from the at least one optical sensor; selecting at least one first image area in the first image, wherein a cut-off line of the first headlight is intended to be imaged in the first image area; determining the headlight range alignment as a function of the first image area; orienting the first headlight as a function of the headlight range alignment and a setpoint orientation; and determining if there is a misadjustment of the first headlight and the second headlight relative to one another based on the setpoint orientation as a function of a headlight range alignment of the second headlight.
2. The method as recited in claim 1, further comprising: selecting at least one of: i) at least one second image area in the first image, in which a part of the first illuminated area is intended to be imaged, and ii) at least one third image area, in which an area outside the illuminated area is intended to be imaged; and determining the headlight range alignment as a function of the at least one of the second and the third image area.
3. The method as recited in claim 2, further comprising: determining a reflectivity of a road surface as a function of at least one of the first, second, and third image areas; and determining the headlight range alignment as a function of the reflectivity.
4. The method as recited in claim 1, further comprising: determining a topology of a road surface; and determining the headlight range alignment as a function of the topology of the road surface.
5. The method as recited in claim 1, further comprising: detecting movements of the motor vehicle; and determining the headlight range alignment as a function of the movements.
6. The method as recited in claim 1, further comprising: determining characteristic values for a data series of the first image area; and determining the headlight range alignment as a function of the characteristic values.
7. The method as recited in claim 6, wherein image rows of the first image area are used as the data series.
8. The method as recited in claim 6, wherein sums of the data series of the first image area are used as characteristic values for the data series.
9. The method as recited in claim 1, further comprising: orienting the at least first headlight in a predefined basic orientation.
10. The method as recited in claim 1, wherein a driver-side illuminated area is used as the first illuminated area.
11. A control and evaluation unit for determining a headlight range alignment of a first headlight and a second headlight of a motor vehicle, the motor vehicle including at least one optical sensor to detect at least one part of a first illuminated area of the first headlight and to generate a first image including the part of the first illuminated area, the control and evaluation unit being configured to: read in the first image from the at least one optical sensor; determine at least a first image area in the first image, in which a cut-off line of the first headlight is intended to be imaged; determine the headlight range alignment as a function of the first image area; orient the first headlight as a function of the headlight range alignment and a setpoint orientation; and determine if there is a misadjustment of the first headlight and the second headlight relative to one another based on the setpoint orientation as a function of a headlight range alignment of the second headlight.
12. A non-transitory storage medium storing program code, which is executable by a processor, comprising: a program code arrangement having program code for determining a headlight range alignment of a first headlight and a second headlight of a motor vehicle, the motor vehicle including at least one optical sensor to detect at least one part of a first illuminated area of the first headlight and to generate a first image including the part of the first illuminated area, by performing the following: reading in the first image from the at least one optical sensor; selecting at least one first image area in the first image, wherein a cut-off line of the first headlight is intended to be imaged in the first image area; determining the headlight range alignment as a function of the first image area; orienting the first headlight as a function of the headlight range alignment and a setpoint orientation; and determining if there is a misadjustment of the first headlight and the second headlight relative to one another based on the setpoint orientation as a function of a headlight range alignment of the second headlight.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
DETAILED DESCRIPTION OF EXAMPLE EMBODIMENTS
(15) The example represented here relates to a motor vehicle designed for right-hand traffic. Headlight 20 emits a light cone 22, which is reflected on a road surface 24. Light cone 22 may be pivoted, in its entirety, in the direction of arrows 25 via an actuator of headlight 20. Light cone 22 has a first upper limit 26 on the driver side, which defines a cut-off line 28 on road surface 24. In addition, light cone 22 has a second, passenger-side upper limit 30. In this case, upper limits 26 and 30 are situated next to one another as viewed in the direction of travel. Passenger-side upper limit 30, together with road surface 24, also forms a cut-off line; this cut-off line is not represented here for the sake of clarity. This results in a driver-side illuminated area 32 and a passenger-side illuminated area 34.
(16) Camera 12 is oriented in the direction of travel of motor vehicle 10 and records a surrounding area ahead of motor vehicle 10, which is located within a recording range 36 of camera 12.
(17) As is apparent in
(21) Passenger-side illuminated area 34 in section a, in turn, has a horizontally situated cut-off line 28, which is delimited laterally by a vertical course of the cut-off line. Driver-side illuminated area 32 is formed in a corresponding manner in section b. Area 40 located outside illuminated areas 32, 34, 32′ and 34′ is thus formed as a section which may mask, for example, preceding motor vehicles in order to avoid glare. In the case of such headlights 20, it is advantageous if both the horizontal and the vertical orientation of the headlights can be determined and preferably calibrated.
(23) It is clear from
(24) This applies correspondingly for passenger-side illuminated areas 34 and 34′. Such a state of illuminated areas 32, 34, 32′ and 34′ must then be corrected in the horizontal direction.
(27) Section b shows the course of cut-off lines 28 and 28′, which are partially detected within a first image area 50. Image area 50 has a matrix-like configuration and includes data series in the horizontal and vertical directions, i.e., pixel rows and pixel columns. Characteristic values are now ascertained along the particular pixel rows in order to detect the position of the particular sections of cut-off lines 28 and 28′ in the vertical direction. This initially takes place here as a function of a gradient image, particularly high values then being present in the area of cut-off lines 28 and 28′ and no values or only small values being present in the other areas. Forming the characteristic values, preferably by row-by-row summation, results in a very easily evaluated and schematic representation of the type shown in section c. A coordinate system having an abscissa 52 and an ordinate 54 is represented here for better understanding. Abscissa 52 indicates the quantitative value of the characteristic value. Ordinate 54 assigns the characteristic values to the individual rows. A characteristic data series 56, made up of the characteristic values, is represented within the coordinate system. Two maxima 58 and 60 are apparent within characteristic data series 56. These correspond to the vertical locations of cut-off lines 28 and 28′, respectively. Therefore, the relative position of the two headlights with respect to one another may be ascertained very easily and quickly by evaluating the characteristic data series.
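The row-wise evaluation described above can be sketched as follows. This is a minimal illustration under the assumption that the image area is available as a grayscale NumPy array; the function name and the peak-picking details are illustrative and not taken from the patent.

```python
import numpy as np

def cutoff_line_rows(image_area):
    """Locate the vertical positions of two cut-off lines in a
    matrix-like image area (rows x columns of pixel intensities).

    Returns the indices (in gradient-row coordinates, i.e. the
    transition between row i and row i+1) of the two strongest
    responses in the row-wise characteristic data series.
    """
    # Vertical gradient image: large values where brightness changes
    # sharply from one row to the next, i.e. at a cut-off line.
    gradient = np.abs(np.diff(image_area.astype(float), axis=0))

    # Characteristic value per data series (here: per image row),
    # formed by row-by-row summation as described in the text.
    characteristic = gradient.sum(axis=1)

    # The two maxima of the characteristic data series correspond
    # to the vertical locations of the two cut-off lines.
    rows = np.argsort(characteristic)[-2:]
    return np.sort(rows)
```

With a synthetic area in which the left half of the columns turns bright between rows 7 and 8 and the right half between rows 13 and 14, the function returns those two transition indices.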
(28) In order to clarify this method, different headlight range alignments of the headlights relative to one another are represented in
(29) A situation similar to that in
(32) The depictions in
(33) At the first point in time, in
(34) At the second point in time, in
(35) At the third point in time, in
(36) At the fourth point in time, in
(38) A scene is represented in section b, on the basis of which it becomes clear that cut-off line 28 is situated within first image area 50. Therefore, the method may then be carried out immediately in a highly precise manner.
(39) The case which is complementary to section a is represented in section c. In this case, cut-off line 28 is located within third image area 64. Therefore, cut-off line 28 must be moved in the vertical direction downward to image area 50.
(40) It also becomes clear from section b that the illuminated area is situated within second image area 62 when cut-off line 28 is correctly positioned. At the same time, dark area 40 is located in third image area 64. By evaluating second and third image areas 62 and 64, inferences may therefore be made regarding the reflectivity of the illuminated road surface. On the basis of this reflectivity, the evaluation within first image area 50 may be adapted and improved, since it may be concluded how strong the gradient should be. For example, a high reflectivity and a low scattering rate result in a very strong gradient, whereas a fuzzy, blurred transition is to be expected in the case of low reflectivity and high scattering.
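A rough sketch of this reflectivity inference, assuming mean pixel brightness as the measure of illumination; the function name, the contrast measure, and the `scatter` parameter are illustrative assumptions, not values defined by the patent.

```python
import numpy as np

def expected_gradient_strength(second_area, third_area, scatter=0.0):
    """Estimate how strong the cut-off gradient should be, based on
    the illuminated second image area and the dark third image area
    (corresponding to areas 62 and 64 in the text).
    """
    illuminated_mean = float(np.mean(second_area))  # road lit by the headlight
    dark_mean = float(np.mean(third_area))          # road outside the light cone

    # High reflectivity: large brightness difference between the
    # illuminated and the dark road surface -> a strong, sharp gradient.
    reflectivity_contrast = illuminated_mean - dark_mean

    # High scattering (scatter in [0, 1]) blurs the transition,
    # so the expected peak gradient is reduced accordingly.
    return reflectivity_contrast * (1.0 - scatter)
```

The result could then serve as a reference against which the measured gradient peak within first image area 50 is compared.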
(42) Within coordinate system 66, the values for five exemplary light angles are emphasized at points 80, 82, 84, 86 and 88.
(43) At point 80, it is apparent that the mean value within all three image areas 50, 62 and 64 is at the low level. It may therefore be assumed that cut-off line 28 is situated, in the vertical direction, below and outside all three image areas 50, 62 and 64, since all three image areas then image dark area 40.
(44) At point 82, only second image area 62 is fully illuminated. Therefore, cut-off line 28 is situated exactly between first image area 50 and second image area 62.
(45) At point 84, second image area 62 is fully illuminated, first image area 50 is partially illuminated, and third image area 64 is not illuminated. This results, for example, in the optimal position, as is represented in
(46) At point 86, both first and second image areas 50, 62 are fully illuminated and third image area 64 is not illuminated. Therefore, cut-off line 28 is situated between first image area 50 and third image area 64.
(47) At point 88, all image areas 50, 62 and 64 are fully illuminated. Therefore, cut-off line 28 is situated above and outside all image areas 50, 62 and 64.
(48) As a result of the qualitative evaluation of individual image areas 50, 62 and 64, a qualitative headlight range alignment and/or a qualitative determination of the headlight range alignment may be carried out very quickly and efficiently.
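The qualitative evaluation of points 80 through 88 amounts to a simple decision table. The sketch below assumes the vertical order of the areas from bottom to top is second (62), first (50), third (64), as implied by the preceding paragraphs; the string labels and function name are illustrative.

```python
def classify_cutoff_position(first_lit, second_lit, third_lit):
    """Qualitatively locate cut-off line 28 from the illumination
    state of the three image areas (50, 62, 64 in the text).

    Each argument is "full", "partial", or "none".
    """
    if second_lit == "none":
        return "below all image areas"          # point 80: everything dark
    if second_lit == "full" and first_lit == "none":
        return "between areas 50 and 62"        # point 82
    if first_lit == "partial" and third_lit == "none":
        return "within first image area 50"     # point 84: the optimal position
    if first_lit == "full" and third_lit == "none":
        return "between areas 50 and 64"        # point 86
    if third_lit == "full":
        return "above all image areas"          # point 88: everything lit
    return "within third image area 64"
```

Such a coarse classification can steer the headlight toward the optimal position before the precise row-wise evaluation is applied.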
(50) The method begins at a start point 92.
(51) In a subsequent step 94, headlight 20 is moved into a predefined home position.
(52) In a step 96, a predefined light distribution of headlight 20 is preferably set, which produces a particularly distinct cut-off line 28, so that the headlight range alignment can be determined with high precision.
(53) Subsequently, in step 98, a check of parallelism of first headlight 20 relative to the second headlight is carried out.
(54) In a further step 100, the result from step 98 is checked to establish whether the two headlights are oriented parallel to one another, and the following step is selected accordingly.
(55) If the headlights are not parallel to one another, an alert is output to a driver of the motor vehicle in an additional step 102. In addition, in a further step 104, a headlight correction is carried out, as is described, for example, with reference to
(56) If it is established in step 100 that the headlights are already parallel to one another, the method is immediately continued in step 106.
(57) In a further step 108, the two headlights are moved into a second home position and a second light distribution is set for measuring purposes. The second home position and the second light distribution are used for setting a horizontal orientation of the two headlights relative to one another.
(58) In a further step 110, another image is recorded and image areas are selected for evaluation. These image areas may be selected in a manner analogous to image areas 50, 62 and 64, these image areas being situated next to one another in the horizontal direction.
(59) In a further step 112, the absolute orientation of the headlights is checked. This is also carried out in a manner analogous to the previously described procedures and with reference to predefined setpoint values.
(60) In a further step 114, a decision is made as to whether the orientation of the headlights is correct overall. If this is not the case, a corresponding alert is output to the driver in a further step 116 and, in addition, the recorrection of the orientation of the headlights is carried out in a step 118. The method is subsequently continued in a further step 120.
(61) If it is established in step 114 that the orientation is correct, the method is immediately continued in step 120.
(62) The method ends in a final step 122.
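The sequence of steps 92 through 122 can be sketched as control flow. All method names on the `headlights` and `camera` objects, and the `alert_driver` helper, are illustrative assumptions; the patent defines only the steps themselves.

```python
def alert_driver(message):
    # Placeholder for the driver alert of steps 102 and 116.
    print("ALERT:", message)

def calibrate_headlights(headlights, camera):
    """Sketch of the calibration sequence (steps 92-122)."""
    # Steps 94/96: predefined home position and a measuring light
    # distribution with a well-defined cut-off line.
    headlights.move_to_home_position()
    headlights.set_measuring_light_distribution()

    # Steps 98-104: check whether the two headlights are parallel
    # to one another; if not, alert the driver and correct.
    if not headlights.are_parallel(camera.capture()):
        alert_driver("headlights not parallel")
        headlights.correct_relative_orientation()

    # Step 108: second home position / light distribution for the
    # horizontal orientation of the two headlights.
    headlights.move_to_second_home_position()

    # Steps 110-118: record an image, check the absolute orientation
    # against setpoint values; if incorrect, alert and recorrect.
    image = camera.capture()
    if not headlights.orientation_correct(image):
        alert_driver("headlight orientation incorrect")
        headlights.recorrect_orientation()
```

In a vehicle, the two condition checks would be backed by the image-area evaluations described above.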
(63) The exemplary embodiments described and shown in the figures are selected merely by way of example. Different exemplary embodiments may be combined with one another entirely or with respect to the individual features. One exemplary embodiment may also be supplemented by features of a further exemplary embodiment.
(64) Furthermore, method steps according to the present invention may be repeated and may be carried out in a sequence other than that described.
(65) If an exemplary embodiment includes an and/or linkage between a first feature and a second feature, this is to be understood to mean that the exemplary embodiment, according to one specific embodiment, includes both the first feature and the second feature and, according to a further specific embodiment, includes either only the first feature or only the second feature.