Method and device for detecting a braking situation
09827956 · 2017-11-28
Assignee
Inventors
Cpc classification
B60T8/171
PERFORMING OPERATIONS; TRANSPORTING
B60W30/16
PERFORMING OPERATIONS; TRANSPORTING
B60W30/0956
PERFORMING OPERATIONS; TRANSPORTING
B60T2201/022
PERFORMING OPERATIONS; TRANSPORTING
G06V10/22
PHYSICS
G08G1/166
PHYSICS
G06V20/56
PHYSICS
International classification
B60Q9/00
PERFORMING OPERATIONS; TRANSPORTING
B60W30/095
PERFORMING OPERATIONS; TRANSPORTING
B60T8/171
PERFORMING OPERATIONS; TRANSPORTING
Abstract
A method for detecting a braking situation of a vehicle located on a traffic route includes ascertaining a red component of at least one image area of an image which depicts at least one section of the traffic route, and determining the braking situation based on the red component.
Claims
1. A method comprising: obtaining, by processing circuitry of a first vehicle, an image of a route in front of the first vehicle in which a second vehicle is located in columnar line with, and between, the first vehicle and a third vehicle, which is not imaged in the obtained image; processing, by the processing circuitry, the obtained image to ascertain a red component within the image, which red component does not include lighted braking lights; identifying the ascertained red component, by the processing circuitry, as a reflection of light from a braking light of the third vehicle that is not imaged in the obtained image; and the processing circuitry at least one of outputting a warning and controlling a motion of the first vehicle in response to the identification of the reflection of the light from the braking light of the third vehicle.
2. The method of claim 1, wherein the red component identified as the reflection is in an area of the image that depicts a roadway surface of the route.
3. The method of claim 1, wherein the red component identified as the reflection is in an area of the image that depicts a barrier at a side of a roadway of the route.
4. The method of claim 1, wherein the red component is ascertained as at least one of an absolute red component and a relative red component of the image.
5. The method of claim 1, further comprising: ascertaining an additional red component within an additional image of the route at a different time than that to which the image corresponds, wherein the identifying of the red component as the reflection of light from the braking light of the third vehicle is based on the red component and the additional red component.
6. The method of claim 1, wherein the identifying is based on a determination that the red component exceeds a threshold value.
7. The method of claim 6, further comprising setting the threshold value as a function of an environmental condition.
8. The method of claim 1, wherein the identifying includes comparing the red component to each of a plurality of threshold values.
9. The method of claim 1, wherein the second vehicle, which blocks the third vehicle from being imaged in the image, is imaged in the image.
10. The method of claim 9, further comprising, based on another red component, identifying, by the processing circuitry, a braking by the second vehicle.
11. The method of claim 1, wherein the red component identified as the reflection is in an area of the image that depicts a side of a body of a fourth vehicle in a lane of a roadway of the route that is adjacent to a lane of the first, second, and third vehicles, the side of the body of the fourth vehicle not including any brake lights.
12. A device comprising: an imaging device of a first vehicle; and processing circuitry of the first vehicle that is in communication with the imaging device, wherein the processing circuitry is configured to: obtain from the imaging device an image of a route in front of the first vehicle in which a second vehicle is located in columnar line with, and between, the first vehicle and a third vehicle, which is not imaged in the obtained image; process the obtained image to ascertain a red component within the image, which red component does not include lighted braking lights; identify the ascertained red component as a reflection of light from a braking light of the third vehicle that is not imaged in the obtained image; and at least one of output a warning and control a motion of the first vehicle in response to the identification of the reflection of the light from the braking light of the third vehicle.
13. A non-transitory computer readable medium on which is stored a computer program that is executable by a processor of a first vehicle, and, when executed by the processor, causes the processor to perform a method, the method comprising: obtaining, from an imaging device of the first vehicle, an image of a route in front of the first vehicle in which a second vehicle is located in columnar line with, and between, the first vehicle and a third vehicle, which is not imaged in the obtained image; processing the obtained image to ascertain a red component within the image, which red component does not include lighted braking lights; identifying the ascertained red component as a reflection of light from a braking light of the third vehicle that is not imaged in the obtained image; and at least one of outputting a warning and controlling a motion of the first vehicle in response to the identification of the reflection of the light from the braking light of the third vehicle.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
DETAILED DESCRIPTION
(5) In the following description of exemplary embodiments of the present invention, the same or similar reference numerals are used for elements that are shown in the various figures and act similarly, so that a repeated description of these elements is dispensed with.
(8) Because video cameras are increasingly present in vehicles, e.g., for high-beam assistants and/or lane-keeping assistants, longitudinal guidance functions that were originally implemented using radar may now be implemented, or at least supported, using a video camera.
(9) A video camera records the scene ahead of the vehicle using an image converter, or imager, which, in addition to brightness information (the so-called “gray value”), is also able to evaluate color information, particularly red. In this context, a so-called R31 imager may be used. Of every four image points, or pixels, three are intensity pixels without a color filter, so-called “I pixels,” and one pixel has a red filter. By evaluating the pixels provided with the red filter, an image plane may be detected which depicts a red range of the vehicle environment recorded by the camera. By a suitable evaluation, the red component in the image provided by the imager may be ascertained in this way. It has also proven advantageous to ascertain the red component relative to the measured intensity, whereby a normalized, or relative, red component is obtained. Because of the relative red component, the dependence of the absolute red component on the environmental brightness may be reduced.
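The red-component extraction described above can be sketched as follows. The exact 2×2 pixel layout (one red-filtered pixel, three clear intensity pixels per block) and the normalization by the mean of the three intensity pixels are illustrative assumptions, not details taken from the patent.

```python
import numpy as np

def relative_red_component(raw, red_pos=(0, 0)):
    """Estimate absolute and relative red components from an RCCC-style
    raw frame. Assumes a repeating 2x2 block with one red-filtered pixel
    (at position red_pos within the block) and three clear "I" pixels;
    this layout is an assumption for illustration."""
    r0, c0 = red_pos
    red = raw[r0::2, c0::2].astype(float)            # red-filtered pixels
    # Sum of all four pixels per 2x2 block, then remove the red pixel
    block_sum = (raw[0::2, 0::2].astype(float) + raw[0::2, 1::2]
                 + raw[1::2, 0::2] + raw[1::2, 1::2])
    intensity = (block_sum - red) / 3.0              # mean of the 3 clear pixels
    absolute_red = red.mean()
    # Normalizing by intensity reduces dependence on ambient brightness
    relative_red = (red / np.maximum(intensity, 1e-6)).mean()
    return absolute_red, relative_red
```

On a uniformly lit frame where the red-filtered pixels read twice the clear pixels, the relative red component would be 2.0 regardless of the absolute exposure, which is the point of the normalization.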
(10) The video camera may be installed in vehicle 100 behind the rear-view mirror, looking forward, and record the scene in front of the vehicle shown in
(11) Building on the distance estimation, warning or intervention functions, such as FCW, may be implemented in a similar way as with radar, for example.
(12) If such a function is implemented based on the distance estimate, certain delay times arise, since the distance estimation is relatively inaccurate compared to a radar measurement. In this case, a good situation interpretation is essential in order to warn at the right time and to keep the false alarm rate low.
(13) If vehicles 100, 102, 104 travel close behind one another in a column, the danger of an accident rises, since braking interventions often turn out to be stronger than in the case of sufficiently large spacing. The same applies if one unexpectedly has to brake hard from a high speed, for instance at a poorly visible end of a traffic jam.
(14) Furthermore, at night, the driver of vehicle 100 is able to estimate distances or the strength of braking interventions only with difficulty, since he only sees the lights or signal lights of vehicle 102, and can sense the strength of a braking intervention only via the change in distance of the lights. During the day, a person also tends to use the strength of the pitching motion of other vehicles to infer the strength of a braking intervention.
(15) At night, the camera has difficulty estimating distances, since objects “vanish” in the dark. Bad weather likewise reduces visibility.
(16) In
(17) This means that vehicle 104, located in section 210, is not able to be detected in an image taken by the camera.
(18) When vehicle 104 brakes, vehicle 102 will presumably also brake. Depending on the speed and distance of host vehicle 100, it must subsequently also brake in order to avoid a collision with vehicle 102.
(19) Distance-regulating or distance-warning systems in vehicle 100 can, without a suitable method, react only to the behavior of vehicle 102, since vehicle 104 is hidden from the view of vehicle 100. However, since the behavior of vehicle 102 depends on the behavior of vehicle 104, it is useful to include the behavior of vehicle 104 in the system behavior of the systems of vehicle 100.
(20) Each vehicle 100, 102, 104 is equipped with red brake lights, which are activated when the brake is applied. Although the brake lights of vehicle 104 are covered by vehicle 102, the red light of those brake lights is reflected by the surroundings and can be perceived by the camera in vehicle 100, as shown in
(23) Reflections 312 are caused by an activation of brake lights 314 during a braking process of vehicle 104. The braking process of vehicle 104 can be perceived, via reflections 312 of brake lights 314, by the camera in vehicle 100 even before vehicle 102 reacts to the braking process of vehicle 104. Information on the braking process of vehicle 104, or generally on a braking situation of vehicle 104 located in a column in front of vehicle 102, can be gained by evaluating the image taken by the camera of vehicle 100. This information can be utilized by an assistance system of vehicle 100, which is thereby able to react presciently to the potential braking process of vehicle 102.
(24) The evaluation of the camera image of the camera of vehicle 100 may be limited to individual areas in which reflections 312 are most likely to appear. Such areas may be situated, for example, directly on the roadway, around vehicle 102, or at structures along the edge of the road.
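Restricting the evaluation to such areas can be sketched as follows. The function name, the ROI names, and the rectangle coordinates are hypothetical examples chosen for illustration.

```python
import numpy as np

def red_component_in_rois(red_image, rois):
    """Evaluate the mean red component only in image areas where
    brake-light reflections are most likely to appear (roadway surface,
    around the preceding vehicle, road edge). Each ROI is a hypothetical
    (row0, row1, col0, col1) rectangle in pixel coordinates."""
    values = {}
    for name, (r0, r1, c0, c1) in rois.items():
        values[name] = float(red_image[r0:r1, c0:c1].mean())
    return values

# Example ROIs for a 480x640 red-channel image (illustrative only)
rois = {
    "roadway":     (320, 480,   0, 640),   # lower image half: road surface
    "around_lead": (200, 320, 220, 420),   # region around the preceding vehicle
    "road_edge":   (200, 320, 560, 640),   # guardrail / barrier at the side
}
```

Evaluating only these regions reduces computation and avoids false contributions from red objects elsewhere in the scene, such as traffic signs high above the road.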
(28) The light emitted by the brake lights of the vehicle located ahead of vehicle 102 is reflected on the surface of road 420 and on roadway boundary 422. An image of reflection 312 in the image shown in
(30) A column brake signal, which includes information on a braking situation of vehicles moving in a column, may be produced, for instance, by evaluating the absolute red component and the change in the red component of an image section or of image areas of, for example, the images shown in
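The combination of an absolute red component and its change over time can be sketched as follows. The class name, threshold values, and history length are illustrative assumptions, not values from the patent.

```python
from collections import deque

class ColumnBrakeDetector:
    """Sketch of a column brake signal: combines the absolute level of
    the red component in an image area with its change over recent
    frames. All threshold values are illustrative assumptions."""

    def __init__(self, abs_threshold=0.15, delta_threshold=0.05, history=5):
        self.abs_threshold = abs_threshold
        self.delta_threshold = delta_threshold
        self.red_history = deque(maxlen=history)

    def update(self, relative_red):
        """Feed the relative red component of the current frame; return
        True if a braking situation in the column is indicated."""
        self.red_history.append(relative_red)
        if len(self.red_history) < 2:
            return False
        # Rise of the red component over the stored history window
        delta = self.red_history[-1] - self.red_history[0]
        return (relative_red > self.abs_threshold
                and delta > self.delta_threshold)
```

Requiring both a high absolute level and a recent rise suppresses false alarms from statically red scene content, such as signs or painted surfaces.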
(31) By evaluating the red component, a traffic jam end in flowing traffic may also be detected, for example. At a sufficiently great distance, the brake lights themselves can only poorly be classified as such. However, at the end of a traffic jam, all preceding vehicles brake almost simultaneously or shortly after one another, so the red component in the image or image area increases.
(32) The approach according to the present invention may be used for situation detection in distance-regulating or distance-warning systems, such as FCW or (video) ACC. For this purpose, the column brake signal, or information on the braking situation, may be supplied to a corresponding system and used by the system for decision making. The column brake signal, or information on the braking situation, may also be used in light systems such as AHC (Adaptive High Beam Control; a flowing transition of the bright-dark border between low beam and high beam, related to illumination-distance regulation) for predictive adaptation of the safety distance or the safety angle.
(33) The reflections can be perceived especially well at night and/or on wet roads. At night, the environmental brightness is low, so low light intensities (or light reflections) can be resolved better. When the roadway is wet, the road has good reflection properties, so more red light reaches the camera than when the road is dry. Likewise, the reflection from guardrails and concrete barriers at construction sites is easy to recognize.
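Since reflections show up more strongly at night and on wet roads, the detection threshold can be lowered under those conditions, in line with setting the threshold as a function of an environmental condition (cf. claim 7). The following sketch uses illustrative factor values that are not from the patent.

```python
def detection_threshold(ambient_brightness, road_is_wet,
                        base=0.20, night_factor=0.5, wet_factor=0.7):
    """Set the red-component threshold as a function of environmental
    conditions. Reflections are stronger at night and on wet roads, so
    the threshold can be lowered there. The base value, the brightness
    cutoff, and both factors are illustrative assumptions."""
    threshold = base
    if ambient_brightness < 0.1:      # normalized brightness: night
        threshold *= night_factor
    if road_is_wet:
        threshold *= wet_factor
    return threshold
```

A lower threshold at night and in wet conditions makes the system more sensitive exactly when the reflection signal is most reliable, while the higher daytime threshold keeps the false alarm rate low.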
(35) Vehicle 100 has a camera 730, which is designed to image a section 732 of traffic route 420 located ahead of vehicle 100. The image may correspond to one of the images shown in
(37) The exemplary embodiments described and shown in the figures have been selected merely as examples. Different exemplary embodiments are combinable with one another, either completely or with regard to individual features. An exemplary embodiment may also be supplemented by features of another exemplary embodiment. Furthermore, method steps according to the present invention may be repeated and also performed in a sequence other than the one described. If an exemplary embodiment includes an “and/or” linkage between a first feature and a second feature, this may be understood to mean that the exemplary embodiment according to one specific embodiment has both the first feature and the second feature, and according to an additional specific embodiment has either only the first feature or only the second feature.