METHOD FOR CONTROLLING A MOTOR VEHICLE LIGHTING SYSTEM

20230042933 · 2023-02-09

Abstract

A method for controlling a lighting system of a host motor vehicle having a plurality of selectively controllable elementary light sources, each able to emit an elementary light beam whose vertical angular aperture is less than 1°. The method includes detecting a target object by a sensor system; determining a vertical angle between a given point of the sensor system of the host vehicle and a detected point of the target object; determining, from the vertical angle, a lower angle and an upper angle between a given point of the lighting system and, respectively, a lower cut-off and an upper cut-off intended to together vertically border the target object; and controlling the elementary light sources to emit a pixelated high light beam, as a function of the lower and upper angles, to generate, in the light beam, a dark area extending substantially between the upper and lower cut-offs.

Claims

1. A method for controlling a lighting system of a host motor vehicle, the lighting system including a plurality of selectively controllable elementary light sources each able to emit an elementary light beam whose vertical angular aperture is smaller than 1°, the method comprising: detecting a target object with a sensor system of the host vehicle; determining a vertical angle between a given point of the sensor system of the host vehicle and a detected point of the target object; determining, from the vertical angle, a lower angle and an upper angle between a given point of the lighting system of the host vehicle and, respectively, a lower cut-off and an upper cut-off intended to together vertically border the target object; and controlling the elementary light sources of the lighting system of the host motor vehicle to emit a pixelated high light beam, some of the elementary light sources being controlled, as a function of the lower and upper angles, to generate, in the light beam, a dark region extending substantially between the upper and lower cut-offs.

2. The method as claimed in claim 1, wherein detecting the target object includes detecting a light source of the target object, with the detected point being a point of the light source.

3. The method as claimed in claim 1, wherein determining the lower and upper angles includes comparing the vertical angle with a lower threshold and an upper threshold.

4. The method as claimed in claim 1, wherein the determined vertical angle is the vertical angle at a given time, the method further comprising predicting a value of the vertical angle at a future time with respect to the given time.

5. The method as claimed in claim 4, wherein the predicting includes determining a vertical angular velocity of the target object, the vertical angular velocity being used to predict the value.

6. The method as claimed in claim 1, wherein determining the vertical angle includes determining a distance separating the host vehicle and the target object, with the value of the lower angle being responsive to the determined distance.

7. The method as claimed in claim 1, wherein determining the vertical angle includes determining a height of the target object, with the upper angle being responsive to the lower angle and the height.

8. The method as claimed in claim 1, wherein the detecting the target object includes classifying the type of the target object among a set of predetermined target object types, and wherein a height of the target object is determined as a function of the classified target object type.

9. The method as claimed in claim 1, wherein the controlling the elementary light sources of the lighting system of the host vehicle includes turning off some elementary light sources each capable of emitting an elementary light beam between the upper and lower cut-offs.

10. A motor vehicle with a sensor system, a lighting system and a controller, the controller being configured to implement a method comprising: detecting a target object with the sensor system; determining a vertical angle between a given point of the sensor system and a detected point of the target object; determining, from the vertical angle, a lower angle and an upper angle between a given point of the lighting system of the motor vehicle and, respectively, a lower cut-off and an upper cut-off intended to together vertically border the target object; and controlling elementary light sources of the lighting system to emit a pixelated high light beam, some of the elementary light sources being controlled, as a function of the lower and upper angles, to generate, in the light beam, a dark region extending substantially between the upper and lower cut-offs.

Description

BRIEF DESCRIPTION OF THE DRAWINGS

[0035] The present invention will now be described by way of examples that are only illustrative and that in no way limit the scope of the invention, and with reference to the accompanying illustrations, in which:

[0036] FIG. 1 shows, schematically and partially, a motor vehicle according to one embodiment of the invention;

[0037] FIG. 2 shows a method according to one embodiment of the invention, implemented by the motor vehicle of FIG. 1;

[0038] FIG. 3 shows a side view of a road scene during the implementation of the method of FIG. 2 by the vehicle of FIG. 1; and

[0039] FIG. 4 shows a front view of a road scene during the implementation of the method of FIG. 2 by the vehicle of FIG. 1.

DETAILED DESCRIPTION OF THE INVENTION

[0040] In the following description, elements that are identical in terms of structure or in terms of function and that appear in various figures have been designated with the same reference, unless otherwise indicated.

[0041] FIG. 1 shows a partial view of a host motor vehicle 1 according to one embodiment of the invention. The host motor vehicle 1 comprises a sensor system 2 comprising a camera 21, for example arranged at the level of an interior rear-view mirror of the vehicle 1 so as to be able to acquire images of the road ahead of the vehicle 1, and a computer 22 designed to implement various methods for processing these images. The host vehicle 1 also comprises a lighting system 3 comprising a light module 31, for example arranged in a headlight of the vehicle 1. The light module 31 comprises in particular a pixelated light source 32 associated with a lens 33. In the example described, the pixelated light source 32 is a monolithic pixelated light-emitting diode, each of the light-emitting elements of which forms an elementary light source 32.sub.i,j that is able to be selectively activated and controlled by an integrated controller so as to emit light toward the lens 33, which thus projects an elementary light beam HD.sub.i,j, the light intensity of which is able to be controlled, onto the road. Each elementary light beam HD.sub.i,j is projected by the lens in a given emission cone, defined by a given direction of emission and a given angular aperture. Thus, in the example described, all of the elementary light beams HD.sub.i,j thus form a pixelated light beam HD featuring 500 pixels distributed over 25 columns and 20 rows, extending vertically over an angular vertical range from −1° to +5°, each pixel of which is formed by one of these elementary light beams HD.sub.i,j.

[0042] Each elementary light beam HD.sub.i,j emitted by one of the elementary light sources 32.sub.i,j of the source 32 has a vertical aperture smaller than 1°. More specifically, the elementary light sources 32.sub.i,j of the source 32 are arranged such that the vertical angular aperture of the elementary light beams HD.sub.i,j that they are able to emit increases toward the top of the pixelated light beam. In particular:

[0043] Each of the elementary light sources whose emission cone belongs to the angular vertical range from −1° to +1° is able to emit an elementary light beam whose vertical angular aperture is substantially 0.25°;

[0044] Each of the elementary light sources whose emission cone belongs to the angular vertical range from +1° to +2° is able to emit an elementary light beam whose vertical angular aperture is substantially 0.3°;

[0045] Each of the elementary light sources whose emission cone belongs to the angular vertical range from +2° to +3° is able to emit an elementary light beam whose vertical angular aperture is substantially 0.35°;

[0046] Each of the elementary light sources whose emission cone belongs to the angular vertical range from +3° to +5° is able to emit an elementary light beam whose vertical angular aperture is substantially 0.4°.
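The band structure of paragraphs [0043] to [0046], with the elementary aperture growing toward the top of the beam, can be sketched as a simple lookup. This is an illustrative sketch only; the band boundaries and aperture values come from the text, while the function name and API are assumptions.

```python
# Hypothetical sketch: map a pixel's vertical emission angle (in degrees)
# to the elementary-beam vertical aperture described in [0043]-[0046].
def elementary_aperture_deg(vertical_angle_deg: float) -> float:
    """Return the vertical angular aperture of the elementary beam whose
    emission cone lies at the given vertical angle (-1 deg to +5 deg)."""
    bands = [
        (-1.0, 1.0, 0.25),  # -1 deg to +1 deg: aperture ~0.25 deg
        (1.0, 2.0, 0.30),   # +1 deg to +2 deg: aperture ~0.30 deg
        (2.0, 3.0, 0.35),   # +2 deg to +3 deg: aperture ~0.35 deg
        (3.0, 5.0, 0.40),   # +3 deg to +5 deg: aperture ~0.40 deg
    ]
    for lo, hi, aperture in bands:
        if lo <= vertical_angle_deg <= hi:
            return aperture
    raise ValueError("angle outside the -1 deg to +5 deg beam range")

print(elementary_aperture_deg(0.0))  # 0.25
print(elementary_aperture_deg(4.2))  # 0.4
```

The finer apertures near the horizon reflect where the dark region must be placed most precisely.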

[0047] The light module 31 comprises a controller 34 designed to control the integrated controller of the pixelated light source 32 so as to selectively control the turning on, the turning off and the modification of the light intensity of each of the elementary light beams HD.sub.i,j as a function of instructions received from a controller 4 of the host vehicle 1, these instructions being in particular determined based on the information provided by the computer 22 of the sensor system 2 of the host vehicle.

[0048] It will be noted that, in the example described, the camera 21 is located at a height H.sub.cam and the light module 31 is located at a height H.sub.HL, these heights being measured with respect to the road on which the host vehicle 1 is traveling. Furthermore, the camera 21 and the light module 31 are separated by a distance D.sub.capot.

[0049] FIG. 2 shows a method for controlling the lighting system 3 of the host vehicle 1, implemented by the controller 4 using the sensor system 2, that allows the lighting system 3 to emit a high light beam that is non-dazzling for a target object 5. FIG. 3 and FIG. 4 show side and front views of the road scene onto which this light beam is projected during the implementation of this method. It should be noted that FIGS. 3 and 4 show only partial views of this light beam.

[0050] In a first step E1, the sensor system 2 detects the presence of a target object 5, in this case a target vehicle 5, on the road. In the example described, the computer 22 implements one or more methods for processing the images acquired by the camera 21 that allow light sources to be detected in these images, and thus the presence of the taillights 51 of the target vehicle 5 to be detected.

[0051] In a second step E2, the computer 22 determines a vertical angle α between the camera 21 of the host vehicle 1 and the taillights 51 of the target vehicle 5 and the distance D.sub.HC separating the camera 21 of the host vehicle from the taillights 51 of the target vehicle 5. In addition, the computer 22 classifies the type of the target vehicle among a set of predetermined vehicle types and determines, based on the type of the target vehicle 5 that has been selected, the height H.sub.C of the target vehicle 5. Each of these operations may be performed by one or more algorithms for processing the images acquired by the camera 21 and implemented by the computer 22. All of this information α, D.sub.HC and H.sub.C is transmitted by the computer 22 to the controller 4.
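The height determination of step E2 amounts to a table lookup keyed by the classified target type. A minimal sketch follows; the type names and height values are illustrative assumptions, not values from the patent.

```python
# Hypothetical lookup for step E2: once the target is classified among a set
# of predetermined types, its height H_C is read from a stored table.
# The entries below are illustrative assumptions only.
TARGET_HEIGHT_M = {
    "motorcycle": 1.3,
    "passenger_car": 1.5,
    "van": 2.0,
    "truck": 4.0,
}

def target_height_m(target_type: str) -> float:
    """Return the stored height H_C for a classified target type."""
    return TARGET_HEIGHT_M[target_type]

print(target_height_m("truck"))  # 4.0
```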

[0052] In a step E3, the controller 4 compares the value of the vertical angle α with a lower threshold α.sub.min, for example −0.7°, and with an upper threshold α.sub.max, for example +4.8°. If the angle α is not between α.sub.min and α.sub.max, the method stops: it can be deduced that the host 1 and target 5 vehicles are traveling on a road whose slope does not allow, or does not require, a non-dazzling high-beam function. If the vertical angle α is between α.sub.min and α.sub.max, the method moves on to the next step.
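The step E3 gate reduces to a range check. A sketch using the example thresholds of paragraph [0052]; the function name is an illustrative assumption.

```python
# Sketch of the step E3 gate, with the example thresholds from the text.
ALPHA_MIN_DEG = -0.7  # example lower threshold
ALPHA_MAX_DEG = 4.8   # example upper threshold

def glare_free_beam_applicable(alpha_deg: float) -> bool:
    """True when the slope geometry allows (and requires) the
    non-dazzling high-beam function to proceed."""
    return ALPHA_MIN_DEG <= alpha_deg <= ALPHA_MAX_DEG

print(glare_free_beam_applicable(1.0))   # True
print(glare_free_beam_applicable(-2.0))  # False
```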

[0053] The vertical angle α that has been determined by the computer 22 relates to the position of the target vehicle 5 at the time t of acquisition by the camera 21 of the image that allowed it to be determined. However, both the various methods implemented by the computer 22 of the sensor system 2 and the steps of the method according to the invention which will be described below, and which allow a non-dazzling high beam to be produced by the lighting system 3, require a given execution time Δt after which the beam is actually emitted. During this time Δt, the target vehicle 5 may have moved such that the value of the vertical angle α no longer corresponds to the actual position of the target vehicle 5 when the beam is emitted.

[0054] In order to compensate for this latency, in a step E4, the controller 4 predicts a value of a vertical angle α′ between the camera 21 of the host vehicle 1 and the taillights 51 of the target vehicle at a future time t+Δt with respect to the time t of acquisition by the camera 21 of the image that allowed the vertical angle α to be determined in step E2. To that end, the controller 4 determines a vertical angular velocity {dot over (θ)} of the target vehicle by differentiating the successive values of the vertical angle α determined in previous iterations of step E2. The predicted value α′ may thus be obtained using the following equation:


α′ = α + {dot over (θ)}·Δt   (Math 3)

[0055] Where α is the value of the vertical angle at time t determined in step E2, α′ the predicted value of the vertical angle at the future time t+Δt, {dot over (θ)} the vertical angular velocity of the target vehicle 5 and Δt the latency time of the method according to the invention.
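The extrapolation of Math 3 can be written directly as a one-line function. A worked sketch follows; the variable names and the numerical values in the usage example are illustrative assumptions, while the equation itself is from paragraph [0054].

```python
# Sketch of Math 3: extrapolate the measured angle by the target's vertical
# angular velocity over the pipeline latency.
def predict_vertical_angle_deg(alpha_deg: float,
                               angular_velocity_deg_per_s: float,
                               latency_s: float) -> float:
    """alpha' = alpha + theta_dot * delta_t"""
    return alpha_deg + angular_velocity_deg_per_s * latency_s

# A target drifting upward at 0.5 deg/s, seen at alpha = 1.0 deg with a
# 100 ms latency, is predicted at about 1.05 deg.
alpha_pred = predict_vertical_angle_deg(1.0, 0.5, 0.1)
```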

[0056] In a step E5, the controller 4 determines a lower angle V.sub.inf between the light module 31 and the taillights 51 of the target vehicle 5. The controller 4 thus transforms the predicted vertical angle α′ by changing reference frame from a reference frame centered on the camera 21 of the sensor system 2 to a reference frame centered on the light module 31 of the lighting system 3 of the host vehicle, using the following equation:

[00003] V.sub.inf = tan⁻¹[(tan(α′)·D.sub.HC·cos(α′) + (H.sub.cam − H.sub.HL)) / (D.sub.HC·cos(α′) − D.sub.capot)]   (Math 4)

[0057] Where V.sub.inf is the lower angle, α′ the value of the vertical angle predicted in step E4, D.sub.HC the distance separating the host vehicle 1 and the target vehicle 5 determined in step E2, H.sub.cam the height of the sensor system 2 of the host vehicle 1 with respect to the road, H.sub.HL the height of the lighting system 3 of the host vehicle 1 with respect to the road and D.sub.capot the distance separating the sensor system 2 from the lighting system 3.

[0058] The values H.sub.cam, H.sub.HL, and D.sub.capot are known in advance and stored in a memory of the controller 4.
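The reference-frame change of paragraphs [0056] and [0057] can be sketched numerically as below, under one reading of Math 4 in which the predicted angle α′ and the geometry terms combine as shown; the function and argument names are illustrative assumptions.

```python
import math

# Sketch of the step E5 frame change: camera-frame angle alpha' plus the
# mounting geometry give the headlamp-frame lower angle V_inf.
def lower_cutoff_angle_deg(alpha_pred_deg: float,
                           d_hc: float,     # camera-to-taillight distance (m)
                           h_cam: float,    # camera height above the road (m)
                           h_hl: float,     # headlamp height above the road (m)
                           d_capot: float) -> float:  # camera-to-headlamp offset (m)
    a = math.radians(alpha_pred_deg)
    horiz = d_hc * math.cos(a)  # horizontal distance camera -> taillight
    # Height of the taillight relative to the headlamp, over the horizontal
    # distance from the headlamp, gives the angle in the headlamp frame.
    return math.degrees(math.atan2(math.tan(a) * horiz + (h_cam - h_hl),
                                   horiz - d_capot))
```

As a sanity check, with the camera and headlamp at the same height and no longitudinal offset, V.sub.inf equals α′, as expected from a pure change of reference frame between coincident points.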

[0059] Furthermore, still in step E5, the controller 4 determines an upper angle V.sub.sup from the value of the lower angle V.sub.inf obtained previously and the height of the target vehicle H.sub.C determined in step E2, for example using the following equation:

[00004] V.sub.sup = tan⁻¹[(H.sub.C − H.sub.HL) / (D.sub.HC·cos(α′) − D.sub.capot)] + V.sub.inf   (Math 5)

[0060] Where V.sub.sup is the upper angle, α′ the value of the vertical angle predicted in step E4, D.sub.HC the distance separating the host vehicle 1 and the target vehicle 5 determined in step E2, H.sub.C the height of the target vehicle determined in step E2, H.sub.HL the height of the lighting system of the host vehicle with respect to the road, V.sub.inf the value of the lower angle and D.sub.capot the distance separating the sensor system 2 from the lighting system 3.
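A companion sketch for the upper angle of paragraph [0059]: the angular height of the target above the headlamp, added to the lower angle. The names, and the exact reading of the printed equation, are assumptions.

```python
import math

# Sketch of the step E5 upper-angle computation: the classified target height
# H_C over the headlamp-to-target horizontal distance, added to V_inf.
def upper_cutoff_angle_deg(v_inf_deg: float,
                           alpha_pred_deg: float,
                           d_hc: float,      # camera-to-target distance (m)
                           h_c: float,       # classified target height (m)
                           h_hl: float,      # headlamp height above the road (m)
                           d_capot: float) -> float:  # camera-to-headlamp offset (m)
    horiz = d_hc * math.cos(math.radians(alpha_pred_deg)) - d_capot
    return math.degrees(math.atan2(h_c - h_hl, horiz)) + v_inf_deg
```

A taller classified target (for example a truck rather than a passenger car) yields a larger V.sub.sup and hence a taller dark region.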

[0061] Upon completion of step E5, the controller 4 transmits the pair of lower V.sub.inf and upper V.sub.sup angles to the controller 34 of the light module 31. Furthermore, in steps that are not described, the controller 4 determines a pair of lateral angles, right V.sub.LD and left V.sub.LG, from the positions of the taillights 51 of the target vehicle 5, and also transmits this pair of angles to the controller 34.

[0062] In a step E6, the controller 34 selects those elementary light sources 32.sub.i,j of the light source 32 which are able to emit elementary light beams HD.sub.i,j whose emission cones are vertically at least partially between the lower V.sub.inf and upper V.sub.sup angles and horizontally at least partially between the right V.sub.LD and left V.sub.LG lateral angles. The controller 34 thus controls the turning off of these selected elementary light sources 32.sub.i,j while controlling the turning on of the other elementary light sources. The light module 31 thus emits a pixelated high light beam HD in which is formed a dark region Zc centered on the target vehicle 5 and defined vertically by lower and upper cut-offs each forming a vertical angle with the light module 31, the respective values of which are substantially V.sub.inf and V.sub.sup; and horizontally by right and left lateral cut-offs each forming a horizontal angle with the light module 31, the respective values of which are substantially V.sub.LD and V.sub.LG. It will be noted that the term "substantially" should be interpreted here in relation to the vertical and horizontal resolutions of the pixelated light beam HD.
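The selection rule of step E6 is an interval-overlap test in two angular axes. A minimal sketch, assuming a per-pixel data model of angular cone bounds; this model and the function names are illustrative, not the patent's actual controller interface.

```python
# Sketch of step E6: turn off every elementary source whose emission cone
# overlaps the angular box bounded vertically by (V_inf, V_sup) and
# horizontally by (V_LG, V_LD). Angles are in degrees.
def overlaps(lo: float, hi: float, cut_lo: float, cut_hi: float) -> bool:
    """True if intervals [lo, hi] and [cut_lo, cut_hi] intersect
    (strict inequalities: merely touching edges do not count)."""
    return lo < cut_hi and hi > cut_lo

def select_sources_to_turn_off(pixels, v_inf, v_sup, v_lg, v_ld):
    """pixels: iterable of (pixel_id, v_lo, v_hi, h_lo, h_hi) cone bounds."""
    return [pid for pid, v_lo, v_hi, h_lo, h_hi in pixels
            if overlaps(v_lo, v_hi, v_inf, v_sup)
            and overlaps(h_lo, h_hi, v_lg, v_ld)]

# One pixel inside the dark region, one above it:
pixels = [("p1", 0.0, 0.25, -1.0, 0.0), ("p2", 2.0, 2.35, -1.0, 0.0)]
print(select_sources_to_turn_off(pixels, 0.1, 1.0, -2.0, 1.0))  # ['p1']
```

The "at least partially" wording of the text maps to the overlap test above, which is why the resulting cut-offs match V.sub.inf and V.sub.sup only to within the pixel resolution.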

[0063] The preceding description clearly explains how the invention achieves the set objectives, in particular by providing a method for controlling a lighting system of a host vehicle that controls the turning on or the turning off of the elementary light sources of the lighting system so as to create a dark region in a pixelated light beam delimited by an upper cut-off and a lower cut-off, the positions of which are determined based on information from a sensor system of the host vehicle, and in particular relate to the vertical position of a target object on the road that should not be dazzled.

[0064] In any event, the invention should not be regarded as being limited to the embodiments specifically described in this document, and extends, in particular, to any equivalent means and to any technically operative combination of these means. In particular, it is possible to envisage other types of light module than that described, and in particular a light module comprising a combination of a light source and a matrix array of selectively activatable micromirrors. It will also be possible to envisage other methods for determining the various values used in the equations that allow the values of the lower and upper angles to be determined, or even equations other than those which have been described, and in particular equations integrating margins that allow the position of the upper and lower cut-offs of the dark region in the pixelated light beam to be moved vertically.