INTELLIGENT HEAD-UP DISPLAY
20230271622 · 2023-08-31
Inventors
CPC classification
B60W50/14
PERFORMING OPERATIONS; TRANSPORTING
G02B2027/0141
PHYSICS
G02B2027/0187
PHYSICS
B60K35/00
PERFORMING OPERATIONS; TRANSPORTING
International classification
B60W50/14
PERFORMING OPERATIONS; TRANSPORTING
Abstract
Technologies and techniques for displaying information for an occupant of a motor vehicle. The display position of a head-up display can be varied flexibly depending on the context. To this end, both a displacement of the X and/or Y position and a color selection adapted to the background may be implemented and performed automatically. To determine the X and/or Y position, information from the steering angle, speed, map data, and head tracking by the interior camera may be combined. In order to achieve optimum visibility for any use case, sensor fusion of the position data and exterior camera data may take place: content or information (24) may thus be displayed depending on the current display position so as to be more visible. If only some of the information is in front of a background, only some of it may change color.
Claims
1-10. (canceled)
11. A device for displaying a notice for an occupant of a motor vehicle, comprising: an input interface for receiving data comprising information regarding the notice for the occupant; an analysis unit for analyzing the data and generating a control command, the control command effectuating an optical output of the notice comprising a notice symbol; and an output interface for transmitting the control command to an augmented reality display unit, wherein the control command effectuates a predefined optical output of the notice at a predefined position in a field of vision of the occupant.
12. The device according to claim 11, wherein the device is configured as part of an on-board computer and/or navigation system, and further configured to be installed in a motor vehicle.
13. The device according to claim 11, wherein the input interface is configured to receive sensor data comprising information regarding the field of vision of the occupant, and wherein the analysis unit is configured to determine the predefined position in the field of vision of the occupant based on the sensor data.
14. The device according to claim 11, wherein the input interface is configured to receive a plurality of sensor data comprising at least two of steering angle sensor data, speed sensor data, position sensor data, head-tracking sensor data, passenger compartment camera data, and/or exterior camera data, wherein the analysis unit is configured to combine the plurality of sensor data via sensor fusion.
15. The device according to claim 11, wherein the analysis unit is configured to determine a hazard source in the field of vision of the occupant, wherein the control command effectuates a displacement of the notice symbol toward the hazard source in the field of vision of the occupant.
16. The device according to claim 11, wherein the analysis unit is configured to determine a predefined optical output comprising a color and/or a type of the notice symbol, based on an object ahead of the motor vehicle in the field of vision of the occupant and the predefined position in the field of vision of the occupant, so as to effectuate an output of the notice symbol with increased visibility for the occupant via the control command.
17. A method for displaying a notice for an occupant of a motor vehicle, comprising: receiving data via an input interface, the data comprising information regarding the notice for the occupant; analyzing, via an analysis unit, the data and generating a control command, the control command effectuating an optical output of the notice comprising a notice symbol; and transmitting, via an output interface, the control command to an augmented reality display unit, wherein the control command effectuates a predefined optical output of the notice at a predefined position in a field of vision of the occupant.
18. The method according to claim 17, wherein the analysis unit is configured as part of an on-board computer and/or navigation system, and further configured to be installed in a motor vehicle.
19. The method according to claim 17, further comprising receiving sensor data comprising information regarding the field of vision of the occupant, and determining the predefined position in the field of vision of the occupant based on the sensor data.
20. The method according to claim 17, further comprising receiving a plurality of sensor data comprising at least two of steering angle sensor data, speed sensor data, position sensor data, head-tracking sensor data, passenger compartment camera data, and/or exterior camera data, and further comprising combining, via the analysis unit, the plurality of sensor data via sensor fusion.
21. The method according to claim 17, further comprising determining, via the analysis unit, a hazard source in the field of vision of the occupant, wherein the control command effectuates a displacement of the notice symbol toward the hazard source in the field of vision of the occupant.
22. The method according to claim 17, further comprising determining, via the analysis unit, a predefined optical output comprising a color and/or a type of the notice symbol, based on an object ahead of the motor vehicle in the field of vision of the occupant and the predefined position in the field of vision of the occupant, so as to effectuate an output of the notice symbol with increased visibility for the occupant via the control command.
23. A system for displaying a notice for an occupant of a motor vehicle, comprising: an input interface for receiving data comprising information regarding the notice for the occupant; an analysis unit for analyzing the data and generating a control command, the control command effectuating an optical output of the notice comprising a notice symbol; and an output interface for transmitting the control command to an augmented reality display unit, wherein the control command effectuates a predefined optical output of the notice at a predefined position in a field of vision of the occupant.
24. The system according to claim 23, wherein the system is configured as part of an on-board computer and/or navigation system, and further configured to be installed in a motor vehicle.
25. The system according to claim 23, wherein the input interface is configured to receive sensor data comprising information regarding the field of vision of the occupant, and wherein the analysis unit is configured to determine the predefined position in the field of vision of the occupant based on the sensor data.
26. The system according to claim 23, wherein the input interface is configured to receive a plurality of sensor data comprising at least two of steering angle sensor data, speed sensor data, position sensor data, head-tracking sensor data, passenger compartment camera data, and/or exterior camera data, wherein the analysis unit is configured to combine the plurality of sensor data via sensor fusion.
27. The system according to claim 23, wherein the analysis unit is configured to determine a hazard source in the field of vision of the occupant, wherein the control command effectuates a displacement of the notice symbol toward the hazard source in the field of vision of the occupant.
28. The system according to claim 23, wherein the analysis unit is configured to determine a predefined optical output comprising a color and/or a type of the notice symbol, based on an object ahead of the motor vehicle in the field of vision of the occupant and the predefined position in the field of vision of the occupant, so as to effectuate an output of the notice symbol with increased visibility for the occupant via the control command.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0023] Aspects of the present disclosure will be described hereafter in exemplary embodiments based on the associated drawings. In the drawings:
DETAILED DESCRIPTION
[0032] The input interface 12 is configured to receive data including information regarding the notice for the occupant. The data can encompass a speed or a navigation notice, for example, and can stem from a speedometer, a GPS receiver and/or a navigation system, for example. For reception, the input interface 12 is preferably connected to a vehicle-internal transmission network. The input interface 12 can furthermore also be designed for wireless communication or be connected to a proprietary, for example hard-wired, transmission network.
[0033] The analysis unit 14 is configured to receive the data and to generate a control command, wherein the control command effectuates an optical output of the notice in the form of a notice symbol by means of an augmented reality display unit. The control command effectuates a predefined optical output of the notice at a predefined position in the field of vision of the occupant.
[0034] The output interface 16 is configured to transmit the control command to the augmented reality display unit. The output interface 16 can be designed for communication, analogously to the input interface 12. It shall be understood that the input interface 12 and the output interface 16 can also be designed to be combined, serving as a communication interface for transmission and reception.
[0036] In some examples, the device 10 receives data including information regarding the notice. These data can be, for example, a speed of the motor vehicle, data of an eye tracking sensor, data about the traffic situation, position data of the motor vehicle, data regarding a position of a hazard source, and the like.
[0037] Based on the data, the device 10 creates a control command for the AR display unit 20 and thereby effectuates an output of a notice 24 regarding information for an occupant on a windshield 26 of a motor vehicle. In the shown example, the AR display unit 20 comprises a head-up display, wherein the projection unit 22 is designed to be movable and, for example, can be moved by means of actuators so that the notice 24 can be projected at the predefined position in the field of vision of the occupant on the windshield 26. In this way, the notice symbol can be displayed directly on the windshield 26 of the motor vehicle.
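As an illustrative, non-claimed sketch of the actuator-driven repositioning described above, a target position on the windshield could be translated into step commands for the movable projection unit 22. The step resolution, coordinate ranges, and function names here are assumptions for illustration, not part of the disclosure:

```python
# Hypothetical sketch: translating a target display position on the windshield
# into step commands for the X and Y actuators of a movable projection unit.
# STEPS_PER_MM and the millimeter coordinates are illustrative assumptions.

STEPS_PER_MM = 10  # assumed actuator resolution

def actuator_steps(target_mm, current_mm):
    """Steps needed to move one projection axis from current to target."""
    return round((target_mm - current_mm) * STEPS_PER_MM)

def move_command(target_xy_mm, current_xy_mm):
    """Step commands for the X and Y actuators of the projection unit."""
    return tuple(actuator_steps(t, c) for t, c in zip(target_xy_mm, current_xy_mm))

# Shift the notice 24 mm to the right and 5.5 mm up from the current position.
dx, dy = move_command((120.0, 45.5), (96.0, 40.0))
```

Rounding to whole steps models the discrete nature of typical stepper actuators; a real unit would also clamp the command to the mechanical travel range.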
[0039] The device 10 receives data including information regarding the notice 24 for an occupant 30, as described above. In the shown example, the system 18 is designed as part of a navigation system and connected to a GPS receiver 32 for position determination or speed determination. It shall be understood that the system 18 can also be designed as an independent unit. In the shown example, the navigation system can form a unit that determines a piece of information, such as a navigation prompt to be transmitted to the occupant 30.
[0040] Further data can stem from an occupant camera 34, for example for determining a viewing direction of the occupant 30, or from an exterior camera 36, such as, for example, a front camera, for determining a situation ahead of the motor vehicle 28. It shall be understood that further sensors that, in principle, are known in the prior art can also be used. In particular, sensor data fusion of data from multiple sensors may be carried out so as to achieve a higher data quality.
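The sensor data fusion mentioned above can be sketched, purely as an illustration, with inverse-variance weighting of two estimates of the same quantity, e.g. a horizontal gaze angle from the occupant camera 34 and a coarser steering-based prediction. The variances and values are assumptions chosen for the example:

```python
# Hypothetical sketch: inverse-variance weighted fusion of two estimates of
# the same quantity. Sources with lower variance (higher confidence) dominate.

def fuse(estimates):
    """Fuse (value, variance) pairs via inverse-variance weighting."""
    weights = [1.0 / var for _, var in estimates]
    total = sum(weights)
    value = sum(w * v for w, (v, _) in zip(weights, estimates)) / total
    return value, 1.0 / total  # fused value and fused variance

# Example: horizontal gaze angle in degrees from two sources.
camera_estimate = (4.0, 1.0)    # occupant camera: more precise
steering_estimate = (8.0, 4.0)  # steering-based prediction: coarser
angle, var = fuse([camera_estimate, steering_estimate])
# The fused angle lies between the inputs, closer to the camera estimate,
# and the fused variance is lower than either input variance.
```

This is the textbook two-sensor fusion rule; it illustrates why combining sensors yields the "higher data quality" mentioned in the paragraph above, since the fused variance is always smaller than each individual variance.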
[0041] The device 10 analyzes the received data and determines a control command, which effectuates an output of the notice 24 at a predefined position in the field of vision of the occupant 30.
[0043] In contrast to the variant shown in
[0044] In the shown example, an illustration of the sensors was not shown for the sake of clarity and brevity.
[0046] In the shown example, it is advantageous when a driver of the motor vehicle 28 directs his or her gaze at the apex of the curve, which is why the predefined position of the notice symbol 42 is the apex of the curve. The position of a notice symbol 44 known in the prior art is shown in dotted lines. By displaying the notice symbol 42 at the predefined position, the driver is able to detect the notice symbol 42 without having to avert his or her gaze from the apex of the curve. The visibility of the notice symbol 42 is increased. Furthermore, safety in road traffic is increased since the driver is encouraged to optimally guide his or her gaze.
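One way a lateral shift toward the curve apex could be derived from steering angle and speed, purely as a non-claimed illustration, is a bicycle-model preview point: the steering angle gives a turn radius, the speed gives a lookahead distance, and the lateral offset of the preview point shifts the symbol. The wheelbase, preview time, and function names are assumptions:

```python
import math

# Hypothetical sketch: lateral display offset toward the curve apex from
# steering angle and speed, using a bicycle-model turn radius.
# WHEELBASE_M and the preview time are illustrative assumptions.

WHEELBASE_M = 2.7  # assumed wheelbase

def lateral_offset(steering_angle_rad, speed_mps, preview_time_s=1.5):
    """Lateral offset (m) of the preview point where the driver likely looks."""
    if abs(steering_angle_rad) < 1e-6:
        return 0.0  # driving straight: no shift
    radius = WHEELBASE_M / math.tan(abs(steering_angle_rad))
    lookahead = min(speed_mps * preview_time_s, radius)  # stay on the circle
    offset = radius - math.sqrt(radius**2 - lookahead**2)
    return math.copysign(offset, steering_angle_rad)  # sign follows the turn

# Gentle right-hand curve at 20 m/s: the symbol shifts toward the apex.
shift = lateral_offset(math.radians(2.0), 20.0)
```

The offset grows with speed and steering angle, matching the intuition that the driver's gaze moves further into the curve the faster and tighter it is; map data and head tracking, as described above, could refine this purely odometric estimate.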
[0048] In contrast to the situation shown in
[0049] In the shown example, the notice symbol 44 is furthermore optically output in a predefined manner. So as to increase the visibility or perceptibility of the notice symbol 44 for the occupant or driver, a background 48 is placed behind the notice symbol 44. It shall be understood that the predefined optical output can also encompass a change in color or contrast of the notice symbol 44.
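The change in color or contrast mentioned above can be sketched, as a non-claimed illustration, by choosing whichever candidate symbol color differs most in relative luminance from the scene behind the symbol. The luminance weights follow the standard sRGB approximation; the candidate colors and threshold logic are assumptions:

```python
# Hypothetical sketch: picking a symbol color by relative luminance contrast
# against the background behind the symbol. Candidate colors are assumptions.

def luminance(rgb):
    """Approximate relative luminance of an sRGB color (0..255 channels)."""
    r, g, b = (c / 255.0 for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def pick_symbol_color(background_rgb, light=(255, 255, 255), dark=(0, 0, 0)):
    """Choose the candidate color with the larger luminance difference."""
    bg = luminance(background_rgb)
    if abs(luminance(light) - bg) >= abs(luminance(dark) - bg):
        return light
    return dark

# Bright sky behind the symbol -> dark symbol; dark road -> light symbol.
on_sky = pick_symbol_color((200, 220, 255))
on_road = pick_symbol_color((40, 40, 45))
```

A backing plate, as in the example with the background 48 above, is the fallback when neither candidate color reaches sufficient contrast against a busy scene.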
[0051] The control command effectuates an optical output of the notice 24 at a predefined position in the field of vision of the occupant 30.
[0052] It shall be understood that the method can be carried out by means of a computer program including program code means, wherein the program code means are designed to carry out all the steps of the method when the computer program is being executed on a computer or a corresponding processing unit, such as an on-board computer, a navigation system and/or infotainment system.
[0053] Aspects of the present disclosure were described in detail. Using the disclosed teaching, it is possible to achieve the following advantages and/or solve the following problems by at least one embodiment:
[0054] The display position of a head-up display can be varied flexibly as a function of the context.
[0055] To this end, both a displacement of the X and/or Y position and a color selection that is adapted to the background can be implemented and carried out automatically.
[0056] To determine the X and/or Y position, information comprising the steering angle, speed, map data, and head tracking from the passenger compartment camera can be combined.
[0057] In particular, so as to achieve optimal visibility for any use case, sensor fusion of the position data and exterior camera data can be carried out. In this way, content or notices can be represented as a function of the current display position in such a way that they are more visible. If only a portion of the notice or notice symbol is in front of a background, it is also possible for only that portion to change color, so that visibility is ensured for the entire notice symbol.
[0058] In addition, by using visual warning colors and simultaneously displacing the display toward a hazard source, the driver's gaze can be guided toward the hazardous situation.
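The partial recoloring described above, where only the part of the symbol lying in front of a background changes color, can be sketched as follows. This is a deliberately simplified, non-claimed row-wise model; real rendering would operate on pixels or glyph segments, and all names and colors here are assumptions:

```python
# Hypothetical sketch: only the rows of a notice symbol that overlap a bright
# background region are switched to a dark color; the rest keep the default.

def recolor_rows(symbol_rows, bright_rows, default=(255, 255, 255), dark=(0, 0, 0)):
    """Assign a color per symbol row; rows over a bright background turn dark."""
    return [dark if r in bright_rows else default for r in symbol_rows]

# Rows 0-4 form the symbol; only rows 3-4 lie over a bright sign behind it,
# so only those two rows change color while the rest stay light.
colors = recolor_rows(range(5), {3, 4})
```

The same per-region decision could reuse the luminance-contrast rule from the earlier sketch, evaluated separately for each region the symbol overlaps.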
TABLE-US-00001
List of Reference Signs
10 device
12 input interface
14 analysis unit
16 output interface
18 system
20 augmented reality display unit (AR display unit)
22 projection unit
24 notice
26 windshield
28 motor vehicle
30 occupant
32 GPS receiver
34 occupant camera
36 exterior camera
38 output
40 roadway
42 notice symbol
44 notice symbol
46 obstacle
48 background
S1-S4 method steps