Vehicle driver information method
10136105 · 2018-11-20
Assignee
Inventors
- Florian Oszwald (Munich, DE)
- Renaud Debon (Munich, DE)
- Thomas Barmeyer (Munich, DE)
- Marc Walessa (Munich, DE)
CPC classification
B60R2300/802
PERFORMING OPERATIONS; TRANSPORTING
B60R2300/302
PERFORMING OPERATIONS; TRANSPORTING
H04N7/181
ELECTRICITY
B60R2300/70
PERFORMING OPERATIONS; TRANSPORTING
B60R2300/301
PERFORMING OPERATIONS; TRANSPORTING
International classification
H04N7/18
ELECTRICITY
Abstract
A driver information method is provided, wherein an image of the surrounding area is captured by at least one side camera attached to the motor vehicle. The image covers an image capturing area that lies essentially laterally relative to the motor vehicle. A display unit visible to the driver of the motor vehicle outputs a screen image containing a reproduction of the image of the surrounding area; the reproduction may be graphically processed and is preferably essentially photo-realistic. The output of the screen image is only temporary. An obstructed view situation, in which an obstructed lateral view of the driver is to be assumed, is detected on the basis of distance signals of at least one environment detection system that is not camera-based. The output of the screen image is started automatically when an obstructed view situation is detected.
Claims
1. A method of providing information to a driver of a motor vehicle, the motor vehicle being equipped with at least one side camera attached to a first side of the motor vehicle for capturing an image of an image capturing area lying laterally to said first side of the motor vehicle, and a display unit visible to the driver on which a screen image is output that is a reproduction of the image of the image capturing area, the method comprising the acts of: detecting that an obstacle is present laterally to said first side of the motor vehicle, wherein said detecting is based on distance signals of at least one environment detection system, the environment detection system not being a camera-based system; determining from the obstacle's detected presence that the obstacle is causing an obstructed lateral view situation in which a lateral view of the driver to said first side of the motor vehicle is obstructed; and in response to determining the obstructed lateral view situation, starting automatically an output of the screen image upon detecting the obstructed lateral view situation such that the output of the screen image is only temporary, wherein the screen image is a reproduction of the image capturing area provided by the at least one side camera attached to the first side of the motor vehicle.
2. The method according to claim 1, further comprising the act of: only temporarily capturing the image of the image capturing area; and upon detecting of the obstructed lateral view situation, starting the temporary capturing of the image of the image capturing area.
3. The method according to claim 1, wherein the detecting of the obstructed lateral view situation is based additionally on a result of an image processing of the image of the image capturing area of at least one side camera.
4. The method according to claim 2, wherein the detecting of the obstructed lateral view situation is based additionally on a result of an image processing of the image of the image capturing area of at least one side camera.
5. The method according to claim 3, wherein the image processing of the image uses a motion-stereo vision method of image processing.
6. The method according to claim 1, wherein the detecting of the obstructed lateral view situation is based additionally on a result of a determination of a driving speed of the motor vehicle.
7. The method according to claim 1, wherein the detecting of the obstructed lateral view situation is based additionally on a result of a determination of steering angle of the motor vehicle.
8. The method according to claim 6, wherein the detecting of the obstructed lateral view situation is based additionally on a result of a determination of steering angle of the motor vehicle.
9. The method according to claim 1, wherein the detecting of the obstructed lateral view situation is based additionally on a result of a determination of a gear selection of the motor vehicle.
10. The method according to claim 8, wherein the detecting of the obstructed lateral view situation is based additionally on a result of a determination of a gear selection of the motor vehicle.
11. The method according to claim 1, wherein the detecting of the obstructed lateral view situation is based additionally on a result of a determination of a geographical position of the motor vehicle.
12. The method according to claim 10, wherein the detecting of the obstructed lateral view situation is based additionally on a result of a determination of a geographical position of the motor vehicle.
13. The method according to claim 1, wherein the at least one environment detection system that is not a camera-based system comprises an ultrasonic-based parking assistance system.
14. The method according to claim 2, wherein the at least one environment detection system that is not a camera-based system comprises an ultrasonic-based parking assistance system.
15. The method according to claim 1, further comprising the acts of: detecting obstacles in a surrounding area of the motor vehicle and classifying said obstacles via the environment detection system that is not a camera-based system; and wherein the detecting of the obstructed lateral view situation is based on a result of a determination of a presence of at least one obstacle of a certain class and its position relative to the motor vehicle.
16. The method according to claim 2, further comprising the acts of: detecting obstacles in a surrounding area of the motor vehicle and classifying said obstacles via the environment detection system that is not a camera-based system; and wherein the detecting of the obstructed lateral view situation is based on a result of a determination of a presence of at least one obstacle of a certain class and its position relative to the motor vehicle.
17. The method according to claim 1, wherein the reproduction is essentially photo-realistic.
18. The method according to claim 1, wherein the reproduction is graphically processed and is essentially photo-realistic.
19. The method of claim 1, wherein the image capturing area extends in a direction that is substantially orthogonal to the motor vehicle.
Description
BRIEF DESCRIPTION OF THE DRAWING
(1)
DETAILED DESCRIPTION OF THE DRAWING
(2) The following description is based on the example of a motor vehicle that is equipped with a side view system comprising two side cameras that are positioned on the left and on the right of the front bumper of the motor vehicle and oriented orthogonally to the direction of travel. The two video images (images of the surrounding area) that are captured by these side cameras are collected by a control unit of the motor vehicle and jointly prepared, or more specifically, processed to form a screen image (cf. rectangle 1 in the drawing).
(3) The screen image 1 includes (with a slightly distorted perspective for the sake of a better orientation of the driver) screen images 2 and 3 of the surrounding area; each of these screen images 2, 3 matches in essence the image of the surrounding area captured by one side camera. A display panel 4, arranged centrally between the screen images 2 and 3 of the surrounding area, shows an image 7 of the motor vehicle as well as two virtual screen objects 5 and 6 representing the image capturing areas. The image 7 and the screen objects 5 and 6 show the observer how the image capturing areas of the images of the surrounding area are arranged relative to the motor vehicle.
(4) The output of the screen image on the CID can be activated by the driver of the motor vehicle by a key stroke or by selecting an appropriate menu option in a menu structure. When a specified value of the driving speed is exceeded, the output is automatically terminated or, more specifically, disabled.
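The manual activation and speed-based cutoff described in paragraph (4) can be sketched as a small state holder. This is an illustrative sketch only; the class name, method names, and the cutoff value are assumptions, since the patent leaves the specified speed value open.

```python
# Illustrative sketch of the manual activation and speed-based cutoff
# described in paragraph (4). Names and the cutoff value are assumed,
# not taken from the patent.

V_CUTOFF_KMH = 30.0  # assumed "specified value of the driving speed"


class SideViewOutput:
    def __init__(self, v_cutoff_kmh: float = V_CUTOFF_KMH):
        self.v_cutoff_kmh = v_cutoff_kmh
        self.active = False

    def on_key_stroke(self) -> None:
        # Driver activates the output via a key stroke or menu selection.
        self.active = True

    def on_speed_update(self, speed_kmh: float) -> None:
        # The output is automatically terminated/disabled when the
        # specified driving speed is exceeded.
        if self.active and speed_kmh > self.v_cutoff_kmh:
            self.active = False
```

The sketch deliberately keeps deactivation one-way: exceeding the cutoff disables the output, and a fresh driver action is needed to re-enable it.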
(5) Since it is necessary to perform a key stroke or to select an appropriate menu option, the situation may arise that the function is not available in many situations in which it would be very useful, for example, because the driver cannot find the time to activate it.
(6) In order to solve this problem, the exemplary system has the possibility of an automatic system activation at defined positions, in certain situations and/or under certain conditions.
(7) The following characteristic activation situations are defined, in particular, for an automatic activation of the output of the screen image, preferably without the evaluation of so-called geotags. In this case, the data required for the necessary interpretation of the environment are provided by an ultrasonic-based parking assistance system (for example, PDC/PMA), a navigation system and/or an image processing with a motion-stereo vision method, together with existing CAN messages. The described course of action places, to some extent, high demands (not specified in detail herein) on the accuracy of the navigation localization and/or on the coverage and range of PDC/PMA and of the image processing (motion-stereo vision); these demands are considered to have been met in the following description.
(8) A first characteristic activation situation represents the case of driving out of a parallel parking space with a forward motion of the vehicle. This activation situation is detected when the following conditions are completely (optionally, also only predominantly) met:
(9) a) starting the engine with the start/stop button (within a specified time span);
(10) b) engaging a forward gear/step D (within a specified time span);
(11) c) detection by use of PDC/PMA of at least one constant (that is, non-moving) obstacle/object in the immediate vicinity laterally on the left and laterally on the right relative to the motor vehicle (an obstacle/object on only one side significantly reduces the probability of the presence of a transverse/diagonal parking situation);
(12) d) front obstacle detection by use of PDC/PMA: no front obstacle, path of travel open; and
(13) e) vehicle's own position off the road of a certain category (in order to avoid mis-activation at intersections or in congested traffic, in particular, no intersection, no multi-lane road).
(14) The use of motion-stereo-vision-based image processing methods for detecting blind spots is not possible in connection with this first characteristic activation situation, especially if an activation is supposed to occur as early as in a standstill state.
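Taken together, conditions (a) through (e) of the first activation situation form a simple conjunction over vehicle and sensor signals. A minimal sketch follows; the field names are illustrative stand-ins for the CAN/PDC signals named above and are not part of the patent.

```python
from dataclasses import dataclass


@dataclass
class VehicleState:
    """Illustrative signal snapshot; field names are assumed."""
    engine_started_recently: bool        # (a) start via start/stop button within time span
    forward_gear_engaged_recently: bool  # (b) gear/step D engaged within time span
    obstacle_left: bool                  # (c) constant obstacle laterally on the left
    obstacle_right: bool                 # (c) constant obstacle laterally on the right
    front_path_open: bool                # (d) no front obstacle per PDC/PMA
    off_relevant_road: bool              # (e) no intersection, no multi-lane road


def pull_out_of_parallel_space(s: VehicleState) -> bool:
    """First characteristic activation situation: driving forward out of a
    parallel parking space, with all conditions (a)-(e) met."""
    return (s.engine_started_recently
            and s.forward_gear_engaged_recently
            and s.obstacle_left and s.obstacle_right
            and s.front_path_open
            and s.off_relevant_road)
```

Note that, per condition (c), obstacles must be present on both sides: an obstacle on only one side is treated as lowering the probability of a parking situation, so the conjunction fails.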
(15) A second characteristic activation situation represents the case of driving through a narrow parking garage exit with a forward motion of the vehicle. This activation situation is detected when the following conditions are completely (optionally, also only predominantly) met:
(16) a) slow travel (low accelerator pedal position and/or speed less than a specified value v_slow);
(17) b) optionally, after a previous vehicle standstill: brake application;
(18) c) driving approximately straight ahead (small steering angle);
(19) d) front obstacle detection by use of PDC/PMA: no front obstacle, path of travel open;
(20) e) detection by use of PDC/PMA of at least one constant (that is, non-moving) obstacle/object in the immediate vicinity laterally on the left and laterally on the right relative to the motor vehicle;
(21) f) vehicle's own position off the road of a certain category (in order to avoid mis-activation at intersections or in congested traffic, in particular, no intersection, no multi-lane road, preferably a position on private grounds or in a parking space, that is, not (yet) on a public road); and
(22) g) detection of a change or, more specifically, a transition from the detection of obstacles/objects (on the left and right) to the detection of free space (on the left and right) by use of PDC/PMA and/or by means of image processing with a motion-stereo vision method.
(23) A third characteristic activation situation represents the case of turning into or turning out of a side road with an obstructed view. This activation situation is detected when the following conditions are completely (optionally, also only predominantly) met:
(24) a) slow travel (low accelerator pedal position and/or speed less than a specified value v_slow);
(25) b) optionally, after a previous vehicle standstill: brake application;
(26) c) driving approximately straight ahead (small steering angle);
(27) d) front obstacle detection by use of PDC/PMA: no front obstacle, path of travel open;
(28) e) detection by use of PDC/PMA of at least one constant (that is, non-moving) obstacle/object in the immediate vicinity laterally on the left and laterally on the right relative to the motor vehicle;
(29) f) vehicle's own position off the road of a certain category (in order to avoid mis-activation at intersections or in congested traffic, here in particular, vehicle's own position at single lane intersections in residential areas); and
(30) g) detection of a change or, more specifically, a transition from the detection of obstacles/objects (on the left and right) to the detection of free space (on the left and right) by use of PDC/PMA and/or by means of image processing with a motion-stereo vision method (preferred due to its higher range).
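Condition (g), shared by the second and third activation situations, amounts to watching the lateral detections over time for an obstacles-then-free-space transition. A hypothetical sketch, assuming the lateral detections arrive as a chronological sequence of (left, right) boolean samples (a representation chosen here for illustration, not specified in the patent):

```python
def free_space_transition(samples) -> bool:
    """Detect the transition of condition (g): the lateral sensing first
    reports obstacles on both the left and the right, and subsequently
    reports free space on both sides.

    `samples` is a chronological iterable of (obstacle_left, obstacle_right)
    booleans; the format is an illustrative assumption."""
    seen_obstacles_both_sides = False
    for left, right in samples:
        if left and right:
            # Obstacles currently detected on both sides.
            seen_obstacles_both_sides = True
        elif seen_obstacles_both_sides and not left and not right:
            # Free space on both sides after obstacles were seen: transition.
            return True
    return False
```

Requiring both sides to clear after both sides were occupied mirrors the left-and-right wording of condition (g); a one-sided clearance alone does not trigger the transition in this sketch.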
(31) In order to safeguard against mis-activation, it is also possible to include an additional active trigger by the driver, for example, braking or a short-term vehicle standstill, before the activation time. A suitable combination of environment detection and driver behavior can increase the rate of detection and, at the same time, decrease the rate of mis-detection for the purpose of a (semi-)automatic activation without a key stroke.
(32) It must be pointed out that the definition of the above-described characteristic activation situations for an automatic activation of the output of the screen image is also advantageous in connection with other types of detection of at least one image of the surrounding area and in connection with other types of processing and presentation of the screen image.
(33) The above description is based, influenced by the predominant orientation of the prior art and without loss of generality, on a side view system with side cameras arranged in the front area of the vehicle, as well as on an automatic activation in the event of driving out of a parking space with a forward motion and/or driving through with a forward motion, etc. However, the invention can also be transferred and/or applied to an automatic activation of the output of a screen image in a side view system with side cameras arranged in the rear area of the motor vehicle and/or, for example, in the event of driving out of a parking space with a rearward motion.
(34) The foregoing disclosure has been set forth merely to illustrate the invention and is not intended to be limiting. Since modifications of the disclosed embodiments incorporating the spirit and substance of the invention may occur to persons skilled in the art, the invention should be construed to include everything within the scope of the appended claims and equivalents thereof.