Abstract
Controlling a display of an augmented reality head-up display for a motor vehicle. The content to be displayed by the augmented reality head-up display is first analyzed. Optionally, the content to be displayed can be prioritized in the process. Subsequently, a position of an eyebox of the augmented reality head-up display is adapted on the basis of the content to be displayed.
Claims
1-11. (canceled)
12. A method for controlling a display of an augmented reality head-up display for a motor vehicle, comprising: analyzing content that is to be displayed by the augmented reality head-up display; and adapting a position of an eyebox of the augmented reality head-up display as a function of the content that is to be displayed.
13. The method according to claim 12, wherein a vertical shift of the eyebox of the augmented reality head-up display occurs as a function of the content that is to be displayed.
14. The method according to claim 12, further comprising adapting the position of the eyebox using an adjustment of an optical component of the augmented reality head-up display.
15. The method according to claim 12, further comprising analyzing an image that has been rendered for the display during the analyzing of the content that is to be displayed by the augmented reality head-up display.
16. The method according to claim 15, wherein analyzing the image that has been rendered for the display comprises analyzing color values of the image.
17. The method according to claim 12, further comprising analyzing input data for a rendering of an image for the display during the analyzing of the content that is to be displayed by the augmented reality head-up display.
18. The method according to claim 17, further comprising prioritizing the content that is to be displayed.
19. The method according to claim 18, wherein the prioritization occurs as a function of a driving situation, or wherein the prioritization can be influenced by a user of the augmented reality head-up display.
20. An apparatus for controlling a display of an augmented reality head-up display for a motor vehicle, comprising: an analysis module for analyzing content that is to be displayed by the augmented reality head-up display; and a control module for adapting a position of an eyebox of the augmented reality head-up display as a function of the content that is to be displayed.
21. The apparatus according to claim 20, wherein the analysis module and control module are configured to enable a vertical shift of the eyebox of the augmented reality head-up display as a function of the content that is to be displayed.
22. The apparatus according to claim 20, wherein the analysis module and control module are configured to adapt the position of the eyebox using an adjustment of an optical component of the augmented reality head-up display.
23. The apparatus according to claim 20, wherein the analysis module and control module are configured to analyze an image that has been rendered for the display during the analyzing of the content that is to be displayed by the augmented reality head-up display.
24. The apparatus according to claim 23, wherein the analysis module and control module are configured to analyze the image that has been rendered for the display by analyzing color values of the image.
25. The apparatus according to claim 20, wherein the analysis module and control module are configured to analyze input data for a rendering of an image for the display during the analyzing of the content that is to be displayed by the augmented reality head-up display.
26. The apparatus according to claim 25, wherein the analysis module and control module are configured to prioritize the content that is to be displayed.
27. The apparatus according to claim 26, wherein the prioritization occurs as a function of a driving situation, or wherein the prioritization can be influenced by a user of the augmented reality head-up display.
28. A computer program with instructions which, upon being executed by a computer, cause the computer to: analyze content that is to be displayed by an augmented reality head-up display; and adapt a position of an eyebox of the augmented reality head-up display as a function of the content that is to be displayed.
29. The computer program according to claim 28, wherein a vertical shift of the eyebox of the augmented reality head-up display occurs as a function of the content that is to be displayed.
30. The computer program according to claim 28, wherein the instructions further cause the computer to adapt the position of the eyebox using an adjustment of an optical component of the augmented reality head-up display.
31. The computer program according to claim 28, wherein the instructions further cause the computer to analyze an image that has been rendered for the display during the analyzing of the content that is to be displayed by the augmented reality head-up display.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0028] Further features of the present disclosure will become apparent from the following description and the appended claims in conjunction with the figures.
[0029] FIG. 1 shows an approach to an intersection, seen from the driver's perspective, at a great distance from the intersection according to some aspects of the present disclosure;
[0030] FIG. 2 shows the approach to the intersection, seen from the driver's perspective, at a short distance from the intersection according to some aspects of the present disclosure;
[0031] FIG. 3 shows, schematically, a method for controlling a display of an augmented reality head-up display for a motor vehicle according to some aspects of the present disclosure;
[0032] FIG. 4 shows a first embodiment of an apparatus for controlling a display of an augmented reality head-up display for a motor vehicle according to some aspects of the present disclosure;
[0033] FIG. 5 shows a second embodiment of an apparatus for controlling a display of an augmented reality head-up display according to some aspects of the present disclosure;
[0034] FIG. 6 shows, schematically, a motor vehicle inside which a solution according to the present disclosure has been implemented according to some aspects of the present disclosure;
[0035] FIG. 7 schematically shows a general structure of an augmented reality head-up display for a motor vehicle according to some aspects of the present disclosure;
[0036] FIG. 8 shows a field of view of a head-up display and adjoining tolerance ranges for different positions of the eyebox according to some aspects of the present disclosure;
[0037] FIG. 9 shows a turning situation, which is to be illustrated by means of an augmented reality display according to some aspects of the present disclosure;
[0038] FIG. 10 shows an augmented reality representation of a navigation marking without any shifting of the eyebox according to some aspects of the present disclosure; and
[0039] FIG. 11 shows an augmented reality representation of the navigation marking with the eyebox shifted consistent with the situation according to some aspects of the present disclosure.
DETAILED DESCRIPTION
[0040] For a better understanding of the principles of the present disclosure, embodiments of the present disclosure will be explained in more detail below with reference to the figures. It is understood that the present disclosure is not limited to these embodiments and that the features described can also be combined or modified without departing from the scope of the present disclosure as defined in the appended claims.
[0041] FIG. 1 shows an approach to an intersection, seen from the driver's perspective, at a great distance from the intersection. The augmented reality head-up display represents, on the one hand, a contact-analog navigation marking 60, here in the form of a visualized trajectory of a vehicle approaching an intersection, and, on the other hand, a contact-analog object marking 61, here in the form of a frame around a person. Also displayed are two different fields of view 62, 62′, a large field of view 62′ corresponding to an angular range of 20°×10° and a small field of view 62 corresponding to an angular range of 10°×4°. At this distance, the virtual content can be represented without any problems for both sizes of the fields of view 62, 62′.
[0042] FIG. 2 shows the approach to the intersection, seen from the driver's perspective, at a short distance from the intersection. At this distance, the representations of both the contact-analog navigation marking 60 and the contact-analog object marking 61 are severely truncated by the small field of view 62. The navigation marking 60 can hardly be recognized as such. This effect reduces the added value and the user experience of an augmented reality head-up display.
[0043] FIG. 3 schematically shows a method for controlling a display of an augmented reality head-up display for a motor vehicle. In a first step 10, the content to be displayed by the augmented reality head-up display is analyzed. For example, an image rendered for the display can be analyzed, in particular its color values. Alternatively, input data for the rendering of an image for the display can be analyzed. Optionally, the content to be displayed can also be prioritized 11. This prioritization can be a function of a driving situation, or it can be influenced by a user. A position of an eyebox of the augmented reality head-up display is then adapted 12 as a function of the content that is to be displayed. The eyebox is preferably shifted at least vertically. The position of the eyebox can be adapted, for example, by adjusting an optical component of the augmented reality head-up display.
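The steps of FIG. 3 can be sketched in code, purely for illustration. The following Python sketch assumes the content analysis operates on a rendered RGB frame and that a normalized shift value is a suitable control signal; the function names, the non-black color-value test, and the sign convention are hypothetical choices, not part of the disclosed implementation.

```python
import numpy as np

def analyze_content(frame: np.ndarray) -> tuple[int, int]:
    """Analyze the rendered frame (step 10): rows whose color values are
    non-black carry content; return the content's vertical extent."""
    rows = np.any(frame.sum(axis=2) > 0, axis=1)   # rows containing lit pixels
    idx = np.flatnonzero(rows)
    return (int(idx[0]), int(idx[-1])) if idx.size else (0, frame.shape[0] - 1)

def adapt_eyebox(top: int, bottom: int, height: int) -> float:
    """Adapt the eyebox position (step 12): return a normalized vertical
    shift in [-1, 1]; a negative value means 'shift the eyebox downward'."""
    content_center = (top + bottom) / 2.0
    return (height / 2.0 - content_center) / (height / 2.0)

# A frame whose content sits in the lower half calls for a downward shift,
# as in the turning situation of FIGS. 10 and 11.
frame = np.zeros((100, 200, 3), dtype=np.uint8)
frame[70:90, 50:150] = 255                          # navigation marking
top, bottom = analyze_content(frame)
shift = adapt_eyebox(top, bottom, frame.shape[0])
```

In a real system, the shift value would be translated into an adjustment command for the optical component 52 rather than used directly.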
[0044] FIG. 4 shows a simplified schematic representation of a first embodiment of an apparatus 20 for controlling a display of an augmented reality head-up display for a motor vehicle. The apparatus 20 has an input 21 via which, for example, image data from a camera 43, data from a sensor system 44, or data from a navigation system 45 can be received. The sensor system 44 can, for example, have a laser scanner or a stereo camera for detecting objects in the surroundings of the motor vehicle. The apparatus 20 also has an analysis module 22 which can analyze the content to be displayed by the augmented reality head-up display, in particular with regard to how well it can be represented in a display area of the augmented reality head-up display. For example, the analysis module 22 can be configured to analyze an image rendered for the display, in particular its color values. Alternatively, input data for the rendering of an image for the display can be analyzed. The analysis module 22 can also prioritize the content to be displayed. This prioritization can be a function of a driving situation, or it can be influenced by a user. Finally, a control module 23 causes an adaptation of a position of an eyebox of the augmented reality head-up display as a function of the content that is to be displayed. In so doing, at least a vertical shift of the eyebox preferably occurs. The position of the eyebox can be adapted, for example, by adjusting an optical component of the augmented reality head-up display. Control signals from the control module 23 can be output via an output 26 of the apparatus 20, e.g., to a control device 42 of the augmented reality head-up display.
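The situation-dependent prioritization described above can be illustrated with a short sketch. The priority table, the situation names, and the user-override mechanism below are hypothetical examples of how a prioritization "as a function of a driving situation" influenced by a user might look; the disclosure does not prescribe this scheme.

```python
# Hypothetical priority table: in a turning situation the navigation
# marking outranks other content; while cruising, the speed display leads.
PRIORITIES = {
    "turning":  {"navigation_marking": 3, "object_marking": 2, "speed": 1},
    "cruising": {"speed": 3, "object_marking": 2, "navigation_marking": 1},
}

def prioritize(contents, driving_situation, user_override=None):
    """Order the content to be displayed (step 11), highest priority first.
    A user override moves one content type to the front of the list."""
    table = PRIORITIES.get(driving_situation, {})
    ranked = sorted(contents, key=lambda c: table.get(c, 0), reverse=True)
    if user_override in ranked:
        ranked.remove(user_override)
        ranked.insert(0, user_override)
    return ranked

ranked = prioritize(["speed", "navigation_marking", "object_marking"], "turning")
```

The control module would then adapt the eyebox position to suit the highest-ranked content.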
[0045] The analysis module 22 and the control module 23 can be controlled by a control unit 24. If necessary, settings of the analysis module 22, the control module 23, or the control unit 24 can be changed via a user interface 27. The data collected by the apparatus 20 can, if necessary, be stored in a memory 25 of the apparatus 20, for example, for later analysis or for use by the components of the apparatus 20. The analysis module 22, the control module 23, and the control unit 24 can be implemented as dedicated hardware, for example, as integrated circuits. Of course, they can also be partially or fully combined or implemented as software that is executed on a suitable processor, for example, a GPU. The input 21 and the output 26 can be implemented as separate interfaces or as one combined bidirectional interface. In the example described here, the apparatus 20 is an independent component. However, it can also be integrated in the control device 42 of the augmented reality head-up display.
[0046] FIG. 5 shows a simplified schematic representation of a second embodiment of an apparatus 30 for controlling a display of an augmented reality head-up display for a motor vehicle. The apparatus 30 has a processor 32 and a memory 31. For example, the apparatus 30 is a computer or a control unit. The memory 31 stores instructions which, when executed by the processor 32, cause the apparatus 30 to execute the steps of any one of the described methods. The instructions stored in the memory 31 therefore embody a program which, when executed by the processor 32, implements the method according to the present disclosure. The apparatus 30 has an input 33 for receiving information, for example, navigation data or data relating to the surroundings of the motor vehicle. Data generated by the processor 32 are provided via an output 34. In addition, they can be stored in the memory 31. The input 33 and the output 34 can be combined to form a bidirectional interface.
[0047] The processor 32 may comprise one or more processing units, for example, microprocessors, digital signal processors, or combinations thereof.
[0048] The memories 25, 31 of the described embodiments can have volatile and non-volatile data storage areas, and they can comprise a wide variety of storage apparatuses and storage media, for example, hard drives, optical storage media, or semiconductor memories.
[0049] FIG. 6 schematically shows a motor vehicle 40 in which a solution according to the present disclosure has been implemented. The motor vehicle 40 has an augmented reality head-up display 41 with an associated control device 42. Furthermore, the motor vehicle 40 has an apparatus 20 for controlling the display of the augmented reality head-up display 41. The apparatus 20 can, of course, also be integrated in the augmented reality head-up display 41 or in the control device 42 of the augmented reality head-up display 41. Further components of the motor vehicle 40 are a camera 43 and a sensor system 44 for detecting objects, a navigation system 45, a data transmission unit 46, and a number of assistance systems 47, one of which is shown as an example. A connection to service providers can be established by means of the data transmission unit 46, for example, for retrieving map data. A memory 48 is provided for storing data. The data exchange between the various components of the motor vehicle 40 takes place via a network 49.
[0050] FIG. 7 schematically shows an augmented reality head-up display 41 for a motor vehicle 40 that is used for displaying content on a projection area 53 of the motor vehicle 40, for example, on the windshield or on an additional pane made of glass or plastic arranged on the dashboard between the driver and the windshield. The displayed content is generated by means of an imaging unit 50 and projected onto the projection area 53 with the aid of an optical module 51. The projection typically occurs in an area of the windshield above the steering wheel. The position of an eyebox of the augmented reality head-up display 41 can be adapted by means of an optical component 52 of the optical module 51. The imaging unit 50 can be an LCD-TFT display, for example. The augmented reality head-up display 41 is usually installed in a dashboard of the motor vehicle 40.
[0051] A preferred embodiment of the present disclosure will be described below with reference to FIGS. 8 to 11.
[0052] FIG. 8 shows a field of view 62 of a head-up display and the adjoining tolerance ranges 63 for different positions of the eyebox. FIG. 8a) illustrates a middle position of the eyebox, FIG. 8b) a high position, and FIG. 8c) a low position. Due to the optical design of head-up displays, the virtual image is only visible when the viewer's eyes are inside the eyebox. By adjusting the optics of the head-up display, e.g., by tilting a mirror, this eyebox can be shifted vertically. The available adjustment range is indicated by the vertical double arrow and the rectangle shown with dotted lines. The adjustment of the optics thus defines the vertical position of the field of view 62, i.e., the look-down angle (the downward viewing angle, i.e., the angle of the viewing axis relative to the road) with respect to the center point of the eyebox. If the eyebox is set too high or too low for the driver, the image of the display is truncated at the upper or lower edge of the field of view 62. When the setting is correct, on the other hand, the driver can see the image fully. In addition, tolerance ranges 63 result above and below the field of view 62. If the field of view 62 had a greater vertical extension, the virtual image would also be visible in these areas.
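The look-down angle mentioned above follows from simple geometry. The sketch below is illustrative only: it computes the angle of the viewing axis from the eyebox center to the center of the virtual image relative to the horizontal, and the function name and the example heights and distance are assumed values, not figures from the disclosure.

```python
import math

def look_down_angle(eyebox_center_height_m: float,
                    image_center_height_m: float,
                    image_distance_m: float) -> float:
    """Angle of the viewing axis (eyebox center -> virtual image center)
    below the horizontal, in degrees; positive means looking down."""
    drop = eyebox_center_height_m - image_center_height_m
    return math.degrees(math.atan2(drop, image_distance_m))

# Eyes 1.2 m above the road, virtual image center 0.5 m above the road and
# 7.5 m ahead: the driver looks down at roughly 5.3 degrees.
angle = look_down_angle(1.2, 0.5, 7.5)
```

Shifting the eyebox vertically changes this angle and thus which part of the scene the field of view 62 covers.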
[0053] FIG. 9 shows a turning situation that is to be illustrated by means of an augmented reality display. Shown is a visualized trajectory of travel that reflects the course of the turn. This is not an actual augmentation by means of an augmented reality head-up display but merely a visualization of the trajectory of travel that makes full use of the driver's field of view.
[0054] FIG. 10 shows an augmented reality display of a navigation marking 60 without shifting the eyebox. The augmented reality head-up display is used to show an augmentation in the form of a navigation marking 60, which corresponds to the trajectory of travel as shown in FIG. 9. Because of the short distance to the intersection, the display of the contact-analog navigation marking 60 is severely truncated by the field of view 62. The navigation marking 60 is hardly visible as such.
[0055] FIG. 11 shows an augmented reality display of the navigation marking 60 with the eyebox shifted to suit the situation. In light of the navigation marking 60 that is to be displayed, the eyebox was shifted downward, whereby a significantly larger part of the trajectory of travel becomes visible. Even without a larger vertical extension of the field of view 62, the display of the navigation marking 60 is improved significantly. By enlarging the field of view 62, the display could be improved further.
LIST OF REFERENCE NUMERALS
[0056] 10 Analyze content to be displayed
[0057] 11 Prioritize the content to be displayed
[0058] 12 Adapt a position of an eyebox
[0059] 20 Apparatus
[0060] 21 Input
[0061] 22 Analysis module
[0062] 23 Control module
[0063] 24 Control unit
[0064] 25 Memory
[0065] 26 Output
[0066] 27 User interface
[0067] 30 Apparatus
[0068] 31 Memory
[0069] 32 Processor
[0070] 33 Input
[0071] 34 Output
[0072] 40 Motor vehicle
[0073] 41 Augmented reality head-up display
[0074] 42 Control device of the augmented reality head-up display
[0075] 43 Camera
[0076] 44 Sensor system
[0077] 45 Navigation system
[0078] 46 Data transmission unit
[0079] 47 Assistance system
[0080] 48 Memory
[0081] 49 Network
[0082] 50 Imaging unit
[0083] 51 Optical module
[0084] 52 Optical component
[0085] 53 Projection area
[0086] 60 Navigation marking
[0087] 61 Object marking
[0088] 62, 62′ Field of view
[0089] 63 Tolerance range