DRIVER ASSISTANCE SYSTEM FOR A MOTOR VEHICLE
20180012495 · 2018-01-11
CPC classification
B60W50/14 · PERFORMING OPERATIONS; TRANSPORTING
B60W2422/00 · PERFORMING OPERATIONS; TRANSPORTING
B60W50/0098 · PERFORMING OPERATIONS; TRANSPORTING
G08G1/166 · PHYSICS
B60W2756/10 · PERFORMING OPERATIONS; TRANSPORTING
B60W2554/00 · PERFORMING OPERATIONS; TRANSPORTING
B60W2050/0043 · PERFORMING OPERATIONS; TRANSPORTING
B60W30/095 · PERFORMING OPERATIONS; TRANSPORTING
International classification
B60W30/095 · PERFORMING OPERATIONS; TRANSPORTING
B60W50/00 · PERFORMING OPERATIONS; TRANSPORTING
Abstract
A driver assistance system for motor vehicles, including at least one sensor for detecting object properties of objects which are located in the surroundings of the motor vehicle; a first interface; an output unit for transmitting the object properties to a user; and a control unit. The sensor transmits the object properties in the form of a first signal to the first interface. The first interface transmits the object properties, received in the form of the first signal, to the control unit in the form of a second signal, the control unit being configured to forward the object properties, received in the form of the second signal, to the output unit and to control the output of the object properties by the output unit.
Claims
1-17. (canceled)
18. A driver assistance system for a motor vehicle, comprising: at least one sensor to detect object properties of objects which are located in surroundings of the motor vehicle; an interface; an output unit to transmit the object properties to a user; and a control unit; wherein the sensor transmits the object properties to the interface in the form of a first signal, the interface transmits the object properties, received in the form of the first signal, to the control unit in the form of a second signal, and wherein the control unit is configured to forward the object properties, received in the form of the second signal, to the output unit and to control the output of the object properties by the output unit.
19. The driver assistance system as recited in claim 18, wherein the control unit is a processor of a smartphone, on which an application software (APP) is executed, the control unit being configured to process the object properties, before forwarding them to the output unit, as a function of inputs of a user received via an input mask of the APP.
20. The driver assistance system as recited in claim 19, wherein the processing of the object properties includes that the object properties are processed as a function of data detected by at least one sensor of the mobile unit.
21. The driver assistance system as recited in claim 19, wherein the processing of the object properties includes that at least one of: (i) the object properties are incorporated into a graphic or a diagram, and (ii) the object properties are superimposed with a camera image from a camera of the mobile unit.
22. The driver assistance system as recited in claim 18, wherein the control unit is a processor of a navigation system which is installed in the motor vehicle.
23. The driver assistance system as recited in claim 19, wherein the output unit is a display of the smartphone, the object properties being transmitted to the user at least one of: (i) in the form of a graphic output of the APP, and (ii) acoustically.
24. The driver assistance system as recited in claim 18, wherein the output unit is part of a GPS navigation device of the motor vehicle, the object properties being transmitted to the user via the GPS navigation device of the motor vehicle in at least one of: (i) visual form, and (ii) acoustic form.
25. The driver assistance system as recited in claim 18, wherein the output unit is a projector which transmits the object properties to the user in visual form, the visual transmission being carried out in the form of a projection onto a windshield of the motor vehicle.
26. The driver assistance system as recited in claim 18, wherein the first signal is a CAN signal as is also used in a CAN (Controller Area Network).
27. The driver assistance system as recited in claim 18, wherein the second signal is a Bluetooth signal, the interface converting the first signal into a Bluetooth signal.
28. The driver assistance system as recited in claim 18, wherein the second signal is a WLAN signal, the interface converting the first signal into a WLAN signal.
29. The driver assistance system as recited in claim 18, wherein the control unit is designed in such a way that the control unit monitors the object properties for critical properties with respect to at least one of: (i) a course of the motor vehicle, and (ii) a speed of the motor vehicle, for imminent collisions or critical safety distances, the critical properties being identified through comparison of one or multiple of the object properties with threshold values of the object properties stored in a memory of the control unit.
30. The driver assistance system as recited in claim 29, wherein the control unit is designed in such a way that the control unit outputs to the user at least one of: (i) a visual warning, and (ii) an acoustic warning, via the output unit if a critical property of the object properties has been ascertained.
31. The driver assistance system as recited in claim 19, wherein the output unit is a display of the smartphone and the object properties of different objects are grouped with respect to their values, different marking intensities being assigned to the groups in the output unit.
32. The driver assistance system as recited in claim 18, wherein the object property is a differential speed relative to the motor vehicle, the differential speeds of the objects being color coded in the output unit.
33. The driver assistance system as recited in claim 18, further comprising: an additional output unit, the control unit being configured to forward the output of the object properties to the output unit and to the additional output unit.
34. The driver assistance system as recited in claim 33, wherein the control unit is a processor of a smartphone, on which an application software (APP) is executed, the output unit being the display of the smartphone, the additional output unit being the display of a navigation device, and the forwarding of the object properties to the additional output unit being carried out by a mirroring function of the APP executed on the processor of the smartphone.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0023] The present invention is explained in greater detail below based on preferred exemplary embodiments, the same reference numerals being used for the same features.
[0024]
[0025]
DETAILED DESCRIPTION OF EXAMPLE EMBODIMENTS
[0026]
[0027] One or multiple sensor(s) 120, such as radar sensors, ultrasonic sensors, and/or video sensors, or the like, may be installed as sensor 120 at suitable points outside or inside the motor vehicle. The installation complexity may be limited to a sufficiently stable fastening of sensor 120 and the laying of a power supply cable. Thus, for example, a radar sensor may be installed as sensor 120 on the dashboard and/or on the rear window of the vehicle, so that monitoring of the rear area of the vehicle, including the adjacent lane and a large part of the blind spot, is facilitated, to provide a lane change assistant function, for example, during passing on express highways.
[0028] Sensor 120 transmits the object properties to an interface 130 in the form of a first signal 125. First signal 125 in the specific embodiment shown is a signal of a vehicle-typical bus system 170, in particular a CAN bus. A CAN bus is a known standard in the automotive field, which does not need to be explained in greater detail at this point. While the MOST bus was developed, in particular, for multimedia applications, CAN buses with a bandwidth of 1 Mbit/s and LIN buses with a bandwidth of 20 Kbit/s are used for the control of the vehicle electronics. In contrast, FlexRay offers a data rate of 10 Mbit/s.
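Purely as an illustrative sketch (not part of the original disclosure): the following Python fragment shows how object properties might be read from such a CAN bus using the python-can library. The channel name, message ID, and payload layout are assumptions for illustration; a real sensor 120 would define these in its CAN matrix.

```python
import can

# Hypothetical arbitration ID under which sensor 120 publishes object data;
# the real ID would be defined by the sensor's CAN matrix.
OBJECT_MSG_ID = 0x123

# SocketCAN channel "can0" is an assumption for a Linux-based setup.
bus = can.interface.Bus(channel="can0", bustype="socketcan")

msg = bus.recv(timeout=1.0)  # wait up to one second for a frame
if msg is not None and msg.arbitration_id == OBJECT_MSG_ID:
    # Assumed payload layout: distance in cm (2 bytes, unsigned),
    # differential speed in cm/s (2 bytes, signed).
    distance_m = int.from_bytes(msg.data[0:2], "big") / 100.0
    rel_speed_kmh = int.from_bytes(msg.data[2:4], "big", signed=True) * 0.036
    print(f"object at {distance_m:.1f} m, {rel_speed_kmh:+.0f} km/h")
```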
[0029] Interface 130 receives the object properties in the form of first signal 125 and transmits them to a control unit 140 in the form of a second signal 135. The conversion of first signal 125 into second signal 135 is necessary because one object of the present invention is to offer a retrofittable driver assistance system 100. First signal 125, based for example on the known CAN standard, must therefore be converted into second signal 135, which is compatible with external mobile units 110, such as a smartphone, which may in some circumstances already be in the possession of a user and, if necessary, also be used for additional purposes. In preferred specific embodiments, second signal 135 is a Bluetooth signal or a WLAN signal. In this way, the object properties are transmitted from interface 130 to control unit 140 in a common signal standard.
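A minimal sketch of what interface 130 could do when second signal 135 is a WLAN signal, reusing the assumed payload layout from the previous sketch and a UDP datagram as transport; the address and port of mobile unit 110 are invented for illustration, not taken from the disclosure.

```python
import json
import socket
import can

# Assumed WLAN address of mobile unit 110 (e.g., the smartphone).
MOBILE_UNIT_ADDR = ("192.168.4.2", 5005)

bus = can.interface.Bus(channel="can0", bustype="socketcan")
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

while True:
    frame = bus.recv()  # first signal 125: a raw CAN frame
    # Decode using the assumed payload layout from the previous sketch,
    # then repack as a self-describing JSON datagram (second signal 135).
    packet = {
        "distance_m": int.from_bytes(frame.data[0:2], "big") / 100.0,
        "rel_speed_kmh": int.from_bytes(frame.data[2:4], "big", signed=True) * 0.036,
    }
    sock.sendto(json.dumps(packet).encode("utf-8"), MOBILE_UNIT_ADDR)
```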
[0030] Control unit 140 forwards the object properties to an output unit 150, control unit 140 forwarding the object properties, received in the form of second signal 135, to output unit 150 and being configured to control the output of the object properties by output unit 150. In this way, the information about the object properties of detected objects 160 provided by sensor 120 may be represented on the display serving as output unit 150, in particular on a display 150 of a mobile unit 110, as text, by symbols or pictograms, or also in the form of an image, for example a situation image from a bird's-eye view. Alternatively, the indication may also be carried out acoustically or in another form.
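As a sketch of the control-unit side, again under the assumed JSON-over-UDP transport from the previous sketch: control unit 140 receives second signal 135 and forwards each set of object properties to the output unit, here reduced to a simple text rendering standing in for a richer graphical output.

```python
import json
import socket

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(("0.0.0.0", 5005))  # port assumed in the previous sketch

def render_text(props):
    # Reduce one set of object properties to a line of display text.
    return f"{props['distance_m']:.1f} m away, {props['rel_speed_kmh']:+.0f} km/h"

while True:
    raw, _ = sock.recvfrom(1024)
    props = json.loads(raw)
    # Stand-in for output unit 150: a real APP would instead draw symbols,
    # pictograms, or a bird's-eye situation image.
    print(render_text(props))
```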
[0031]
[0032] If internal functions of smartphone 110, for example the camera of smartphone 110 or sensors 148 integrated into the smartphone, may be accessed by APP 145, then this access may be used to also consider the information obtained in this way during the processing of the object properties. Thus, for example, the camera of smartphone 110 may be used to obtain a picture of the vehicle surroundings and, using corresponding image recognition algorithms that are integrated into APP 145, to identify in this picture those objects 160 whose object properties are detected by sensor 120 and transmitted to control unit 140. An output to the user may then be carried out in such a way that an actual camera image of the vehicle surroundings is displayed in display 150 of smartphone 110, the object properties of objects 160 located in the camera image being superimposed on the camera image.
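A compact sketch of such a superimposition, assuming OpenCV is available and that an image-recognition step has already produced a bounding box for object 160; the box coordinates, color choice, and labeling are illustrative only.

```python
import cv2

def overlay_object(image, bbox, rel_speed_kmh):
    # Draw a frame around the detected object in the camera image and
    # annotate it with the differential speed ascertained by sensor 120.
    x, y, w, h = bbox
    color = (0, 0, 255) if rel_speed_kmh > 0 else (0, 255, 0)  # BGR red/green
    cv2.rectangle(image, (x, y), (x + w, y + h), color, 2)
    cv2.putText(image, f"{rel_speed_kmh:+.0f} km/h", (x, y - 8),
                cv2.FONT_HERSHEY_SIMPLEX, 0.6, color, 2)
    return image
```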
[0033] This superimposition is carried out intuitively. For example, a road user in the surroundings of the motor vehicle, which moves at a certain speed and whose speed has been ascertained by sensor 120 as an object property, is displayed simultaneously in display 150 of smartphone 110 and is identified there by a colored frame, the color of the frame being subject to a certain intuitive color coding. Thus, for example, road users which are moving faster than the motor vehicle itself may be identified with red frames. Conversely, road users which are moving slower than the motor vehicle itself may be identified with green frames. A gradual gradation along a certain continuous color spectrum is also possible here.
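The gradual gradation mentioned above could be realized, for example, by mapping the differential speed onto a continuous green-to-red spectrum; the clamping range of ±50 km/h is an assumed calibration value, not a figure from the disclosure.

```python
def speed_to_color(rel_speed_kmh, max_abs_kmh=50.0):
    # Clamp the differential speed to [-1, 1] and interpolate linearly:
    # slower road users tend toward green, faster ones toward red.
    t = max(-1.0, min(1.0, rel_speed_kmh / max_abs_kmh))
    red = int(255 * (t + 1.0) / 2.0)
    return (red, 255 - red, 0)  # (R, G, B)
```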
[0034] Basically, a projector which, for example, generates a projection onto a windshield of the motor vehicle may also be used as output unit 150, so that the user may be informed at all times during the driving operation about potential objects and their object properties via the easily visible windshield.
[0035] Furthermore, road users which, as ascertained by control unit 140 on the basis of the object data, are located on a collision course with the motor vehicle may be identified on output unit 150, in particular on display 150 of smartphone 110, by a red blinking frame.
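A minimal sketch of how such a collision-course check could be implemented as a threshold comparison, in the spirit of claims 29 and 30; the threshold values are invented for illustration and would in practice be stored in the memory of control unit 140.

```python
# Assumed threshold values; in practice these would be calibrated
# and stored in the memory of control unit 140.
MIN_SAFE_DISTANCE_M = 10.0
MAX_CLOSING_SPEED_KMH = 30.0

def is_critical(distance_m, rel_speed_kmh):
    # An object is flagged as critical if it is closing in (positive
    # differential speed) inside the safety distance, or if it is
    # approaching faster than the closing-speed threshold.
    closing = rel_speed_kmh > 0
    return (closing and distance_m < MIN_SAFE_DISTANCE_M) or (
        rel_speed_kmh > MAX_CLOSING_SPEED_KMH
    )

# A critical object would then trigger the red blinking frame and,
# if desired, an acoustic warning via output unit 150.
```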
[0036] The present invention is not limited to the described exemplary embodiments, but instead also includes other similar specific embodiments. The description of the figures serves only to aid understanding of the present invention.