Apparatus and method for determining a double image angle and/or a viewing angle

11668659 · 2023-06-06

Assignee

Inventors

Cpc classification

International classification

Abstract

The invention concerns an apparatus for determining a secondary image angle (20) of a light source (11) on a transparent object (14). To provide a simple apparatus that determines the secondary image angle (20) quickly, reliably and with few movements at a high measuring point density, even on transparent objects (14) with large surfaces, the apparatus includes an illuminating device (10) having multiple, simultaneously illuminating, punctiform light sources (11); a two-dimensional target (16a) with at least one camera (16), whereby the at least one camera (16) is set up to capture the positions of a primary image (21a) and a secondary image (21b) of multiple simultaneously illuminating light sources (11) at the same time, and whereby the primary image (21a) and the secondary image (21b) of each light source (11) are generated on the target by a volume element (14a) of the transparent object illuminated by that light source (11); and an evaluation device (18) set up to determine the secondary image angle (20) of the respective volume element (14a) of the transparent object (14) based on the positions of the primary image (21a) and the secondary image (21b). Furthermore, a method for determining the secondary image angle is also specified.

Claims

1. An apparatus to determine an angle of vision (σ) of a light source on a transparent object, the apparatus comprising: an illuminating device, which has multiple, partially simultaneously illuminated, punctiform light sources, whereby the light sources have an illuminating distance (G) from the transparent object, are located in front of the transparent object and the light of the light sources falls on the transparent object at an angle of incidence (κ) with regard to a surface normal; at least one camera, which is set up to capture simultaneously the positions of a primary image and a secondary image of multiple, simultaneously illuminated light sources on a two-dimensional target, whereby the target has a viewing distance (A) from the transparent object, is arranged behind the transparent object along a light path and is formed by a recording surface or image sensor of the at least one camera, whereby the primary image and the secondary image of each light source are generated by a volume element of the transparent object illuminated by the light source on the target; and an evaluation unit, which is set up to determine the angle of vision (σ) of the respective volume element of the transparent object based on the positions of the primary image and the secondary image as well as based on the viewing distance (A), the angle of incidence (κ) and the illuminating distance (G), whereby the angle of vision (σ) is the angle at which the at least one camera perceives the primary image and the secondary image of the light source.

2. The apparatus of claim 1, wherein the evaluation unit is further set up for at least one volume element of the transparent object, to determine a second angle of vision (σ*) for a second configuration, which is different from a first configuration with regard to a second viewing distance (A*), a second angle of incidence (κ*), and/or a second illuminating distance (G*), based on the angle of vision (σ) determined with the first configuration with the viewing distance (A), the incidence angle (κ), and the illuminating distance (G).

3. The apparatus according to claim 1, characterized in that the light sources of the illuminating device can be switched on and off separately so that there is sequential recording of the primary and secondary images of all light sources, whereby in each step a subset of multiple light sources is switched on simultaneously and another subset is switched off at the same time.

4. The apparatus according to claim 1, characterized in that the camera only captures the vertical component of the position of the primary image and the secondary image of each light source, and the evaluation unit only uses the captured vertical positions for determining the angle of vision (σ).

5. The apparatus according to claim 1, characterized in that adjacent light sources of the illuminating device can be controlled such that they have different light intensity and/or color.

6. The apparatus according to claim 3, characterized in that for at least one subset of two adjacent light sources one light source is switched on and the other light source is switched off at the same time.

7. The apparatus according to claim 1, characterized in that the evaluation unit is further set up to calculate for at least one volume element of the transparent object a secondary image angle (ρ) for the respective volume element using the angle of vision (σ).

8. A method to determine the angle of vision (σ) of a light source on a transparent object, the method comprising: illuminating the transparent object by an illuminating device with multiple, simultaneously illuminated, punctiform light sources, whereby the light sources have an illuminating distance (G) from the transparent object, are located in front of the transparent object, and the light of the light sources falls on the transparent object at an angle of incidence (κ) with regard to a surface normal; capturing the positions of a primary image and a secondary image of multiple simultaneously illuminated light sources at the same time by at least one camera on a two-dimensional target, whereby the target has a viewing distance (A) from the transparent object, is arranged behind the transparent object along a light path and is formed by a recording surface or image sensor of the at least one camera, and whereby the primary image and the secondary image of each light source are generated on the target through a volume element of the transparent object illuminated by the respective light source; and determining, by means of an evaluation unit, the angle of vision (σ) of the respective volume element of the transparent object based on the positions of the primary image and the secondary image as well as on the viewing distance (A), the angle of incidence (κ) and the illuminating distance (G), whereby the angle of vision (σ) is the angle at which the at least one camera perceives the primary image and the secondary image of the light source.

9. The method according to claim 8, whereby for at least one volume element of the transparent object a second angle of vision (σ*) is determined for a second configuration, which is different from a first configuration with regard to a second viewing distance (A*), a second angle of incidence (κ*), and/or a second illuminating distance (G*), based on the angle of vision (σ) determined with the first configuration with the viewing distance (A), the incidence angle (κ), and the illuminating distance (G).

10. The method according to claim 9, characterized in that the determination of the second angle of vision (σ*) of the second configuration includes the following steps once the first angle of vision (σ) has been determined: calculation of a wedge angle (η) for the respective volume element based on the first angle of vision (σ), and calculation of the second angle of vision (σ*) by using the calculated wedge angle (η).

11. The method according to claim 8, characterized in that the light sources of the illuminating device can be switched on and off separately so that there is sequential recording of the primary and secondary images of all light sources, whereby in each step a subset of multiple light sources is switched on simultaneously and another subset is switched off at the same time.

12. The method according to claim 11, characterized in that for at least one subset of two adjacent light sources one light source is switched on and the other light source is switched off at the same time.

13. The method according to claim 8, characterized in that the camera only captures the vertical components of the position of the primary image and the secondary image of each light source and the evaluation unit only uses the captured vertical positions for determining the angle of vision (σ).

14. The method according to claim 8, characterized in that adjacent light sources of the illuminating device can be controlled such that they have different light intensities and/or colors.

15. The method according to claim 8, characterized in that the transparent object is moved relative to the illuminating device and the target when determining the angle of vision (σ).

16. The method according to claim 8, characterized in that a fraction of the angle of vision (σ) caused by a bending radius (R) and a thickness (d) of the transparent object in the respective volume element is determined.

17. The method according to claim 8, characterized in that a fraction of the angle of vision (σ) caused by a wedge angle (η) in the respective volume element of the transparent object is determined.

18. The method according to claim 8, characterized in that a secondary image angle (ρ) for the respective volume element is calculated for at least one volume element of the transparent object using the angle of vision (σ).

Description

(1) The invention is explained below by means of embodiments and with reference to the figures. All features described and/or illustrated in the figures form the subject matter of this invention, irrespective of their combination in the claims or the claim dependencies.

(2) Illustrated schematically below are:

(3) FIG. 1 A first embodiment of the device according to the invention for determining a secondary image angle of a light source on a transparent object in the form of a windscreen in a longitudinal section,

(4) FIGS. 2 and 2a The formation of an angle of vision by a single, illuminated volume element of a transparent object and of a secondary image angle by the transparent object, respectively, in cross-section,

(5) FIG. 3 The positions of a primary image and a secondary image of a single light source on a target in a front view,

(6) FIG. 4 Intensities of the primary and secondary images of six vertically arranged light sources from FIG. 1 when adjacent light sources have different light intensities,

(7) FIG. 5 A second embodiment of the device according to the invention in a longitudinal section,

(8) FIG. 6 An illuminating device in front view

(9) FIG. 7 Generation of an angle of vision through a single wedge-shaped, illuminated volume element of a transparent object in cross section,

(10) FIG. 8 Generation of an angle of vision similar to FIG. 7 under a second ambient parameter,

(11) FIGS. 9 to 11 Generation of an angle of vision through a single curved, illuminated volume element of a transparent object in cross section, and

(12) FIG. 12 A third embodiment of the device according to the invention for determining a secondary image angle and/or angle of vision of a light source on a transparent object in the form of a windscreen in longitudinal section.

(13) The embodiment of a device according to the invention illustrated in FIG. 1 comprises an illuminating device 10 with, for example, nine simultaneously illuminated, punctiform light sources 11 arranged vertically one above the other, which are designed, for example, as LEDs. The illuminating device 10 is arranged on the first side of a transparent object in the form of a windscreen (hereinafter referred to as a pane) 14. A camera 16 is placed on the second side of the pane 14, which is opposite the first side. The pane 14 is placed in a horizontal direction 12 at a distance 13 of 7 m from the illuminating device 10 and arranged at an inclination angle 15 with regard to the vertical direction, whereby the inclination angle corresponds to the later installation position of the pane 14. Each light source 11 emits light in the direction of the pane 14 and illuminates one volume element 14a of the pane 14 in each case. The target is formed by the recording surface 16a of the camera 16. The camera 16 simultaneously captures the positions of a primary image 21a and a secondary image 21b (see FIG. 3) for each of the illuminated light sources 11. An optical filter 17 can be arranged in the light path in front of the camera 16, wherein the filter transmits only the wavelength at which the light sources 11 emit. Disturbing light from other sources with different wavelengths is then not captured by the camera 16. An evaluation device 18 connected to the camera 16 determines the secondary image angle and/or angle of vision of each illuminated volume element 14a of the pane 14 from the positions of the associated primary and secondary images 21a, 21b, as described below, simultaneously for all volume elements 14a illuminated by the illuminating device 10.

(14) Alternatively, the illuminating device 10 in FIG. 1 may have twelve vertically juxtaposed light sources 11 (for example at a distance of 4.5 mm), wherein the light sources can be switched on and off separately. To explain a sample switching scheme, the status of each light source is marked with “1” for switched on and with “0” for switched off. In the first step, every fourth light source 11 is switched on when viewed from top to bottom (first switching pattern: 100010001000). In the second step, the light sources switched on in the first step are switched off and the light sources below them are switched on (second switching pattern: 010001000100). This is followed analogously by a third step with a switching status corresponding to a third switching pattern 001000100010 and a fourth step with a switching status corresponding to a fourth switching pattern 000100010001. Thus, the secondary image angle and/or angle of vision can be determined for each individual volume element in a total of four steps with all light sources 11 of the illuminating device 10, without difficulties in allocating the primary and secondary images 21a, 21b of adjacent light sources 11 that would otherwise arise from the high light source density (see FIG. 2).
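Purely as an illustration of the switching scheme described above (not part of the patent disclosure), the four switching patterns can be generated programmatically; the constants mirror the twelve-source example:

```python
# Illustrative sketch of the sequential switching scheme described above:
# twelve separately switchable light sources, every fourth one lit per step,
# with the lit subset shifted down by one source in each subsequent step.
NUM_SOURCES = 12   # number of light sources 11 in the example
STEP_STRIDE = 4    # every fourth light source is switched on in each step

def switching_pattern(step: int) -> str:
    """Return the on/off status string ('1' = on, '0' = off) for a 0-based step."""
    return "".join(
        "1" if (i - step) % STEP_STRIDE == 0 else "0"
        for i in range(NUM_SOURCES)
    )

patterns = [switching_pattern(s) for s in range(STEP_STRIDE)]
# After STEP_STRIDE steps, every light source has been lit exactly once.
```

Each source is lit in exactly one of the four steps, so the primary and secondary images of all twelve sources are recorded after four exposures.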

(15) Illuminating devices 10 with a different number and/or distribution of light sources 11 are also possible.

(16) FIG. 2 shows a single volume element 14a of the pane 14 from FIG. 1, which is illuminated by a single light source 11 from FIG. 1. A light beam from the light source 11 falls on the illuminated volume element 14a at an angle of incidence κ with regard to the surface normal. A portion of the light from the light source 11 follows a primary light path 19a and passes through the volume element 14a without being reflected. Another portion of the light from the light source 11 follows a secondary light path 19b and passes through the volume element 14a after being reflected at the second interface of the pane 14. On the second side of the volume element 14a, which corresponds to the second side of the pane 14 from FIG. 1, the primary light path 19a and the secondary light path 19b form an angle of vision σ. The pane 14 is a flat, i.e. not curved, pane with a wedge. This means that the front and rear sides of the pane 14 in the area of the illuminated volume element 14a do not run parallel to each other but form a wedge angle η.

(17) FIG. 2a shows the formation of the secondary image angle ρ by the primary light path 19a and the secondary light path 19b of an incident beam 19 through pane 14. For the secondary image angle ρ:

(18) ρ = 2η·√(n² − sin²κ) / cos κ  (F1)

(19) Here n is the refractive index of the material of the pane 14, κ is the angle of incidence of the incident beam 19, and η is the wedge angle of the pane 14.
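Formula (F1) can be evaluated directly; the following sketch (illustrative only, not part of the patent disclosure) computes ρ, with the numeric values for n, η and κ chosen purely as assumptions:

```python
import math

def secondary_image_angle(eta: float, n: float, kappa: float) -> float:
    """Secondary image angle rho per formula (F1):
    rho = 2 * eta * sqrt(n^2 - sin^2(kappa)) / cos(kappa).
    All angles in radians."""
    return 2.0 * eta * math.sqrt(n**2 - math.sin(kappa)**2) / math.cos(kappa)

# Assumed example values: refractive index n = 1.52 (typical glass),
# wedge angle eta = 0.1 mrad, angle of incidence kappa = 60 degrees.
rho = secondary_image_angle(1e-4, 1.52, math.radians(60.0))  # about 0.5 mrad
```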

(20) FIG. 3 shows a section of the target 16a of the camera 16 from FIG. 1. Shown by way of example are the positions of a primary image 21a and a secondary image 21b of a single light source 11. Based on a vertical distance 22 and a horizontal distance 23 between the positions of the primary image 21a and the secondary image 21b on the target 16a, the evaluation device 18 in FIG. 1 can determine the secondary image angle ρ and/or the angle of vision σ for the volume element 14a of the pane 14 from FIG. 1 which is illuminated by the light source 11. Alternatively, the absolute horizontal and vertical positions of the primary image 21a and the secondary image 21b can be determined in a two-dimensional coordinate system of the target 16a. The length of the light path from the light source 11 to the target 16a is known. The vertical distance 22 is calculated from the vertical pixel number Pv determined by the camera 16. Using a proportionality factor Fv, which comprises the pixel pitch and magnification scale of the camera 16 in the vertical direction, a vertical component ρv = arctan(Pv·Fv/E) of the secondary image angle ρ is determined, taking the distance E between the target 16a and the pane into consideration. The horizontal component ρh of the secondary image angle is calculated analogously on the basis of a horizontal pixel number Ph. Likewise, a primary and a secondary image are captured for each additional light source 11 and the secondary image angle is determined with both components ρv and ρh. The same can be done with regard to the angle of vision σ.
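The pixel-to-angle conversion in paragraph (20) amounts to a single arctangent; a minimal sketch follows (the numeric values are assumptions for illustration, not taken from the patent):

```python
import math

def angle_component(pixel_count: float, factor: float, distance_e: float) -> float:
    """One component of the secondary image angle from a pixel separation:
    rho_v = arctan(P_v * F_v / E), where the factor F_v bundles the pixel
    pitch and the camera's magnification scale and E is the target-to-pane
    distance. Result in radians."""
    return math.atan(pixel_count * factor / distance_e)

# Assumed example: 35 pixels vertical separation, F_v = 0.2 mm/pixel, E = 7000 mm
rho_v = angle_component(35, 0.2, 7000.0)  # about 1 mrad
```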

(21) By way of example, the diagram in FIG. 4 shows the intensities on the target of the primary and secondary images of six vertically arranged light sources 11 from FIG. 1, whereby two adjacent light sources 11 each have different light intensities. The intensity is plotted on the vertical axis 40. The light sources 11 with a higher light intensity generate primary images with a high primary image intensity 41a, and the light sources 11 with a lower light intensity generate primary images with a lower primary image intensity 42a. Accordingly, the secondary images of the light sources 11 with a higher light intensity also show a higher secondary image intensity 41b than the secondary image intensity 42b of the light sources 11 with a lower light intensity. The primary and secondary images can thus be assigned to each other by using the different light intensities.

(22) FIG. 5 shows a second embodiment of the device according to the invention. In contrast to the first embodiment shown in FIG. 1, this design has a first mirror 50 and a second mirror 51 in the light path between the illuminating device 10 and pane 14. The illuminating device 10, the first mirror 50, and the second mirror 51 are arranged together in an enclosure 52. The light path between the illuminating device 10 and transparent object 14 is folded twice through the first mirror 50 and second mirror 51. The space requirement 53 on the front side of pane 14 for the apparatus may thereby be reduced significantly. By using a double fold, a light path of more than 7 m length between the illuminating device 10 and pane 14 can be realized with a space requirement 53 of only 2.5 m.

(23) FIG. 6 shows another design option for an illuminating device 10a. It has multiple horizontally arranged light strips 10b, whereby each light strip 10b has multiple vertically juxtaposed light sources 11, with a uniform light source distance 11v. The vertical light source distance 11v in this embodiment is larger than the horizontal light source distance 11h, which corresponds to the distance of two adjacent light strips 10b. Two adjacent light sources 11, i.e. arranged next to or below each other, preferentially illuminate in different colors and/or intensities and/or polarizations. Two adjacent light strips 10b are shifted vertically by a distance that is smaller than the vertical light source distance 11v to achieve a higher light source density in the vertical direction. Due to the horizontal distance of the light strips 10b, the illuminating device 10a from FIG. 6 is also suitable for determining the horizontal component of the secondary image angle.

(24) FIG. 7 explains the formation of an angle of vision at a wedge-shaped volume element 14a from a primary image 71a and a secondary image 71b of the light source 11 on the target 16a. Light on a primary light path from the light source 11 at an illuminating distance G falls on the front side of the volume element 14a at an angle of incidence κ and is refracted due to the refractive index n of the volume element 14a so that it passes through the volume element 14a at an angle λ to the normal on the front side of the volume element 14a. It then leaves the rear side at an exit angle ν to the normal on the rear side, passes through an aperture 70 at a viewing distance A and generates the primary image 71a on the target 16a. On a secondary light path (shown dotted), light of the same light source 11 falls on the front side at another angle of incidence α, is refracted at an angle β, is reflected twice in the volume element 14a, and then leaves its rear side at an exit angle φ. It then passes through the aperture 70 and generates the secondary image 71b on the target 16a. On the second side of the volume element 14a, i.e. between its rear side and the target 16a, the primary and secondary light paths run at an angle of vision σ_η.

(25) As an approximation, it is assumed that the thickness of the volume element 14a is constant in spite of wedge angle η. In addition, it is assumed that σ and η are small angles. Thus, the primary and secondary light paths pass through aperture 70 and the following equations are generally applicable for a wedge-shaped transparent object:
sin θ = sin κ − η·√(n² − sin²κ)  (F2)
sin φ = sin α − 3η·√(n² − sin²α)  (F3)
G cos κ·(tan α − tan κ) + d·[2 tan(β − 2η) + tan β − tan λ] − A cos(θ + η)·[tan(ζ + η) − tan(φ + η)] = 0  (F4)
σ_η = ν − φ  (F5)

(26) The angle of vision σ_η of the same volume element 14a can be different for different ambient parameters or configurations (illuminating distance G, viewing distance A, angle of incidence κ).

(27) Instead, the angles of vision σ_η for the different volume elements 14a are each determined under first ambient conditions (i.e. for a first configuration) which differ from the reference parameters, and the angle of vision σ_η* that would be formed under the reference parameters (reference viewing distance A*, reference illuminating distance G*, reference angle of incidence κ*) is calculated (cf. FIGS. 7 and 8). For example, the angle of vision can be determined for other parameters (i.e. for a configuration deviating from the norm), e.g. a viewing distance of 4 m, and the angle of vision and/or secondary image angle can then be calculated for the reference viewing distance of 7 m specified in the standard, i.e. for the reference or standard configuration.

(28) The calculation of the angle of vision is based on the illuminating distance G, the angle of incidence κ, the thickness d, the wedge angle η and the viewing distance A being known, and the angle of vision σ_η being measured. The angle of incidence α is then varied in the equation system of formulas (F2) to (F5) and the equation system is solved with an iterative process (for example with MS Solver).

(29) Alternatively, if the illuminating distance G, the angle of incidence α, the angle of incidence κ, the thickness d and the viewing distance A as well as the angle of vision σ_η are known, the wedge angle η of the tested volume element can be determined by using the equation system of formulas (F2) to (F5). A starting or approximation value for the wedge angle η is obtained by rearranging formula (F1), with the measured angle of vision σ_η used instead of the secondary image angle ρ. The wedge angle η is then varied until the measured angle of vision σ_η is reproduced in formula (F5). With the wedge angle η and by solving the above equation system, an angle of vision σ_η* is determined for a second ambient parameter. Furthermore, the wedge angle η and the angle of incidence κ can be used with formula (F1) to calculate the associated secondary image angle ρ (independent of A and G) for the volume element.
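The starting-value step and the subsequent iteration described in paragraph (29) can be sketched as follows (illustrative only; the bisection stands in for the generic iterative solver mentioned in the text, e.g. MS Solver, and the caller-supplied forward model is a placeholder for solving (F2) to (F5) at a given wedge angle):

```python
import math

def wedge_angle_start_value(sigma_eta: float, n: float, kappa: float) -> float:
    """Starting value for the wedge angle eta, obtained by rearranging (F1)
    with the measured angle of vision sigma_eta used in place of rho."""
    return sigma_eta * math.cos(kappa) / (2.0 * math.sqrt(n**2 - math.sin(kappa)**2))

def refine_wedge_angle(forward_model, sigma_measured: float,
                       eta_lo: float, eta_hi: float, tol: float = 1e-12) -> float:
    """Vary eta by bisection until forward_model(eta) reproduces the measured
    angle of vision. forward_model is a stand-in for solving the equation
    system (F2)-(F5) at a given eta; it must be monotonic on [eta_lo, eta_hi]."""
    f_lo = forward_model(eta_lo) - sigma_measured
    for _ in range(200):
        mid = 0.5 * (eta_lo + eta_hi)
        f_mid = forward_model(mid) - sigma_measured
        if abs(f_mid) < tol:
            return mid
        if (f_lo < 0.0) == (f_mid < 0.0):
            eta_lo, f_lo = mid, f_mid
        else:
            eta_hi = mid
    return 0.5 * (eta_lo + eta_hi)
```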

(30) Since the equation system has four equations, known numeric methods can be used in many cases to find solutions for multiple unknown values, especially for wedge angle η and angle of incidence α.

(31) FIGS. 9 to 11 illustrate a curved transparent object (pane) 14 without a wedge (i.e. η = 0), in which an angle of vision σ_B of the volume element 14a is caused by the bending radius R. For better clarity, the primary and secondary light paths are shown in FIG. 10 only up to the front side of the volume element 14a and in FIG. 11 only from the rear side of the volume element 14a up to the target 16a. In the area of the volume element 14a, the pane 14 has a bending radius R on the rear side and a bending radius R + d on the front side, whereby d is the thickness of the pane in the area of the volume element 14a. The bending radius R and the thickness d are known in most cases, in contrast to possible wedge angles. On the primary light path, the light from the light source 11 falls on the front side of the volume element 14a at an illuminating distance G and an angle of incidence κ, and on the secondary light path at an illuminating distance G_S and an angle of incidence α. With regard to the center of curvature M of the volume element 14a, the points of impact of the light of the primary and secondary light paths on the front side are separated by an angle Ω.

(32) The light of the primary light path passes through the volume element 14a only once and leaves it at its rear side at an exit angle φ. The light of the secondary light path, in contrast, is first reflected at the rear side, then at the front side of the volume element 14a, and only then leaves it at the rear side at an exit angle ν. As is evident from FIG. 11, the exit points of the primary and secondary light paths on the rear side are separated by an angle ξ with reference to the center of curvature M. The light of the primary light path, coming from the rear side of the volume element 14a, falls on the target 16a at a viewing distance A. The light of the secondary light path also falls on the target 16a coming from the rear side. After leaving the volume element 14a, the primary light path and the secondary light path run at an angle of vision σ_B, which is determined by the curvature or bending radius R of the volume element and the thickness d of the volume element.

(33) The following equation system is applicable for the situation shown in FIGS. 9 to 11:

(34)
sin φ = ((R + d)/R)·sin κ  (F6)
sin ν = ((R + d)/R)·sin α  (F7)
Ω = G sin(κ − α) / [(R + d) cos α + G cos(κ − α)]  (F8)
ω = (3d/R)·sin α / √(n² − sin²α)  (F9)
ϖ = (d/R)·sin κ / √(n² − sin²κ)  (F10)
ξ = Ω − ω + ϖ  (F11)
σ_B = ν − φ + ξ  (F12)
G cos(κ − Ω/2)·sin(ξ/2)·[tan(κ − Ω/2) − tan(α + Ω/2)] − 2d·sin(Ω/2)·sin(ξ/2) = A cos(φ − ξ/2)·sin(Ω/2)·[tan(ν + ξ/2) − tan(φ − ξ/2)]  (F13)

(35) In particular, with a known measuring arrangement and in the absence of a wedge error (η = 0) in a volume element 14a, the bending radius R can be determined based on the measured angle of vision σ_B. For this purpose, the above equation system of formulas (F6) to (F13) is solved with known numeric solution methods by varying α.
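Illustratively (and only under the equation system (F6) to (F13) as printed above, with numeric values chosen as assumptions), the forward evaluation and the residual of (F13) can be coded as one function; a root-finder on α then yields the solution, excluding the trivial root α = κ:

```python
import math

def curved_pane_system(alpha, kappa, R, d, G, A, n):
    """Evaluate (F6)-(F13) for a curved, wedge-free volume element.
    Returns (sigma_B, residual); the physical secondary-path angle of
    incidence alpha is a non-trivial root of residual == 0 (alpha == kappa
    always yields residual == 0 and must be excluded). Angles in radians,
    lengths in consistent units."""
    phi = math.asin((R + d) / R * math.sin(kappa))                               # (F6)
    nu = math.asin((R + d) / R * math.sin(alpha))                                # (F7)
    Omega = G * math.sin(kappa - alpha) / (
        (R + d) * math.cos(alpha) + G * math.cos(kappa - alpha))                 # (F8)
    omega = 3.0 * d / R * math.sin(alpha) / math.sqrt(n**2 - math.sin(alpha)**2) # (F9)
    omega_bar = d / R * math.sin(kappa) / math.sqrt(n**2 - math.sin(kappa)**2)   # (F10)
    xi = Omega - omega + omega_bar                                               # (F11)
    sigma_b = nu - phi + xi                                                      # (F12)
    residual = (G * math.cos(kappa - Omega / 2) * math.sin(xi / 2)
                * (math.tan(kappa - Omega / 2) - math.tan(alpha + Omega / 2))
                - 2.0 * d * math.sin(Omega / 2) * math.sin(xi / 2)
                - A * math.cos(phi - xi / 2) * math.sin(Omega / 2)
                * (math.tan(nu + xi / 2) - math.tan(phi - xi / 2)))              # (F13)
    return sigma_b, residual
```

Any standard scalar root-finding method can be applied to the returned residual, varying α as the text describes.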

(36) The angle of vision σ_B generated by the bending radius R also generally depends on the ambient parameters. In particular, with a known radius R, the above equation system can be used to calculate an angle of vision σ_B* under a second ambient parameter, for example with a desired viewing distance A.

(37) The aforementioned method is used separately and simultaneously for each volume element 14a of the multiple volume elements illuminated by the illuminating device. The angle of vision and/or the secondary image angle can thus be calculated simultaneously for all volume elements 14a of a large area of the pane 14 and, if required, for the entire pane 14.

(38) The third embodiment of the device according to the invention shown in FIG. 12 differs from the apparatus shown in FIG. 1 in that the target 16a is formed from the recording surfaces/image sensors of multiple cameras 16, which are arranged in a camera line 16b. The illuminating device 10 is designed as an LED strip with light sources 11 arranged one above the other. As in FIG. 1, the flat pane 14 is inclined at an inclination angle 15 and arranged in the horizontal direction 12 at a distance 13 from the illuminating device 10. Due to the inclination angle 15, the individual light sources 11 of the illuminating device 10 have different horizontal illuminating distances G. An illuminating distance G_1 between the lowest light source 11 and the lowest volume element 14a is less than the illuminating distance G_n between the uppermost light source 11 and the uppermost volume element 14a, and less than the distance 13. Conversely, the viewing distance A_1 is greater than the viewing distance A_n. The aperture is provided by a camera opening, not shown in detail, of the cameras 16. For each volume element 14a, the above method is used to calculate, by means of a measured first angle of vision, a second angle of vision σ* (not shown) for reference conditions, i.e. a common reference illuminating distance G* and reference viewing distance A*. The reference illuminating distance is preferentially G* = 7 m.

(39) The measured angle of vision σ of a volume element 14a for a first ambient parameter (e.g. G = 5 m) is often produced both by a wedge angle η as shown in FIG. 7 and by the bending radius R as shown in FIGS. 9 to 11. If second angles of vision σ* for a specified second ambient parameter (e.g. G* = 100 m) have to be determined for the volume elements 14a of a curved pane 14 using the apparatus from FIG. 12, the angle of vision σ_B caused by the bending radius R and thickness d is first calculated for each volume element 14a using the method described above and subtracted from the measured first angle of vision σ (σ_η = σ − σ_B). Here the bending radii R of the volume elements 14a may vary over the pane 14. The result of the subtraction corresponds to the angle of vision σ_η for the respective volume element 14a, as it is assumed that a non-zero value of σ_η is caused by the wedge angle η present in this volume element. If the pane 14, as shown in FIG. 12, does not have any bending (σ_B = 0), the measured first angle of vision σ corresponds to the angle of vision σ_η of the wedge in accordance with FIG. 7.

(40) The above-mentioned distance G* = 100 m is of practical interest if the light comes from a source at that distance. In order to determine the angle of vision σ* for this situation from the value σ determined with G = 5 m, the angle of vision σ_η* for the ambient parameter G* = 100 m is calculated for each volume element 14a from σ_η, as described with regard to FIGS. 7 and 8 (the other ambient parameters remain unchanged). Then, if applicable, the angle of vision σ_B* caused by the bending radius R and thickness d of the volume element 14a for this ambient parameter is calculated for each volume element 14a from σ_B as described above and added to σ_η*, so that a second angle of vision σ* under the second ambient parameter is obtained for each volume element 14a as:
σ* = σ_B* + σ_η*  (F14)
If the volume element 14a does not have a bending radius R, then σ* = σ_η* applies for the second angle of vision.
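As a purely arithmetical sketch of the decomposition and recombination described in paragraph (40) (the two scale factors are placeholders for the full equation-system conversions of the wedge and curvature parts to the second ambient parameter, not quantities from the patent):

```python
def second_angle_of_vision(sigma_measured: float, sigma_b_first: float,
                           scale_wedge: float, scale_bend: float) -> float:
    """Decompose the measured angle of vision into wedge and curvature parts,
    rescale each to the second ambient parameter, and recombine per the
    formula above. The scale factors stand in for solving the respective
    equation systems for the wedge and curvature contributions."""
    sigma_eta_first = sigma_measured - sigma_b_first   # sigma_eta = sigma - sigma_B
    sigma_eta_star = scale_wedge * sigma_eta_first     # wedge part, second parameter
    sigma_b_star = scale_bend * sigma_b_first          # curvature part, second parameter
    return sigma_b_star + sigma_eta_star

# With no bending (sigma_B = 0) the result reduces to the rescaled wedge part.
```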

(41) Alternatively, the second angle of vision σ* for the reference ambient parameter G* = 7 m defined in a standard can also be calculated from the measured angle of vision σ. One such standard is Regulation No. 43 of the United Nations Economic Commission for Europe (UNECE) mentioned at the beginning. In addition, the associated secondary image angle ρ can be calculated for each volume element 14a with known or calculated values of the angle of incidence κ and the wedge angle η by using formula (F1).

(42) The described method with an arrangement as shown in FIG. 12 allows, in spite of the inclination angle 15 of the pane and the resulting different illuminating distances G_1 to G_n and viewing distances A_1 to A_n, the first angle of vision of all volume elements 14a to be measured simultaneously for known first ambient parameters, and the second angle of vision σ* then to be calculated for each volume element for the second ambient parameters. This results in major time savings when determining the angle of vision and/or secondary image angle, because otherwise each volume element would have to be brought into the configuration of the second ambient parameter by corresponding movements.

REFERENCE SIGN LIST

(43) 10, 10a Illuminating device 10b Light strip 11 Light source 11h Horizontal light source distance 11v Vertical light source distance 12 Horizontal direction 13 Distance 14 Pane 14a Volume element 15 Inclination angle 16 Camera 16a Target 16b Camera line 17 Optical filter 18 Evaluation device 19 Incident beam 19a Primary light path 19b Secondary light path 21a, 71a Primary image 21b, 71b Secondary image 22 Vertical distance 23 Horizontal distance 40 Vertical axis 41a, 42a Primary image intensity 41b, 42b Secondary image intensity 50 First mirror 51 Second mirror 52 Enclosure 53 Space requirement 70 Aperture α, κ, κ* Angle of incidence β, λ, ζ, ξ, ω, Ω Angle ν, φ Exit angle σ, σ_η, σ_η*, σ_B, σ_B* Angle of vision η Wedge angle ρ Secondary image angle A, A_1, A_n, A* Viewing distance d Thickness G, G_S, G_1, G_n, G* Illuminating distance M Center of curvature R Bending radius