Method For Displaying Augmented Reality Information In Vehicles
20230234442 · 2023-07-27
Inventors
CPC classification
B60K35/00
PERFORMING OPERATIONS; TRANSPORTING
International classification
Abstract
The present invention relates to a method for displaying augmented reality information in vehicles. In particular, the invention relates to a method for displaying an augmentation on a windshield (12) of a vehicle (10), taking into account a current viewing direction of an occupant. The method prepares (900) augmented reality information for display on the windshield (12); detects (910) the current viewing direction (R) of an occupant (F) with respect to the windshield (12); defines (930) a clearance area (A) of the windshield (12) around the current viewing direction (R); and displays (940) the prepared augmented reality information taking into account the defined clearance area (A) on the windshield (12).
Claims
1. A method for displaying augmented reality information in vehicles (10), wherein augmentation takes place on a windshield (12) of the vehicle (10), comprising: preparing (900) augmented reality information for display on the windshield (12); detecting (910) a current viewing direction (R) of an occupant (F) with respect to the windshield (12); defining (930) a clearance area (A) of the windshield (12) around the current viewing direction (R); and displaying (940) the prepared augmented reality information taking into account the defined clearance area (A) on the windshield (12), wherein after the detection (910) of the current viewing direction (R), a determination (920) of an ambient object (82, 84) located in the current viewing direction (R) for an expanded and/or enlarged display as augmented reality information takes place, wherein a display of the ambient object (82, 84) as augmented reality information is made within the clearance area (A), within a surrounding area (B) located in the vicinity of the clearance area (A) or on a screen, and wherein after a display of the ambient object (82, 84), a renewed detection (910) of the current viewing direction (R) of the occupant (F) is deactivated or can be deactivated for a limited time.
2. The method according to claim 1, wherein the clearance area (A) of the windshield (12) around the current viewing direction (R) has a viewing-direction-dependent size and/or shape, or these values are determined situation-dependently on the basis of external parameters.
3. The method according to claim 1, wherein within the clearance area (A) no augmented reality information or only augmented reality information presented in dimmed form is displayed.
4. The method according to claim 1, wherein the detection (910) of the current viewing direction (R) of an occupant (F) and/or the definition of a clearance area (A) of the windshield (12) can be deactivated.
5. The method according to claim 1, wherein the display of the enlarged ambient object (82, 84) largely masks reality by means of brightness and, optionally, color enhancement or contouring.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0031] The invention is explained below in embodiment examples with reference to the accompanying drawing.
DETAILED DESCRIPTION OF EXAMPLE EMBODIMENTS
[0035] The computing unit 20 can also be configured to use external sensor data to determine 920 an ambient object 82, 84 in the viewing direction R and to process this data for an expanded and/or enlarged display as augmented reality information. After detection 910 of the current viewing direction R, a determination 920 of an ambient object 82, 84 lying in the current viewing direction R can thus be carried out. As ambient objects, a tree 82 and an oncoming bus 84 along the course of a road 80 are shown as examples. Since the viewing direction alone does not permit an unambiguous assignment to individual objects, an object detection can preferably first be carried out so that the individual candidate objects are represented in the computing unit 20. Thus, in the display shown, the tree 82 leads to a first object detection 30 and the bus 84 to a second object detection 32. By means of a suitable evaluation algorithm or, for example, by means of neural networks, the most probable object, or the object probably to be regarded as most relevant in the viewing direction R, can then be selected according to certain criteria for an expanded and/or enlarged display. Alternatively or in addition, a corresponding selection can also be made by the occupant F.
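The selection of the most relevant object in the viewing direction can be illustrated with a minimal sketch. It assumes, purely for illustration, that the object detections (e.g. 30, 32) are available as labeled 3-D center points relative to the occupant's eye position and uses angular deviation from the gaze ray as the sole relevance criterion; the actual criteria and any neural-network evaluation are left open by the description.

```python
import math

def select_object_in_view(gaze_dir, objects):
    """Pick the detected object most likely meant by the gaze ray.

    gaze_dir: unit vector (x, y, z) of the current viewing direction R.
    objects:  list of (label, center) pairs, where center is a 3-D point
              relative to the occupant's eye position.
    Returns the label whose direction deviates least from the gaze ray
    (an illustrative stand-in for the 'most probable object').
    """
    def angle_to(center):
        norm = math.sqrt(sum(c * c for c in center))
        dot = sum(g * c for g, c in zip(gaze_dir, center)) / norm
        # Clamp to avoid domain errors from floating-point rounding.
        return math.acos(max(-1.0, min(1.0, dot)))

    return min(objects, key=lambda o: angle_to(o[1]))[0]
```

For example, with a gaze straight ahead and a tree 82 well off-axis but an oncoming bus 84 nearly on-axis, the sketch would select the bus for the enlarged display.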
[0037] In the flow chart shown, a preparation 900 of augmented reality information for display on the windshield 12 is carried out first. In this context, preparation is understood to mean in particular the compilation and arrangement of the content of the information to be displayed on the windshield 12. Since a renewed preparation 900 generally need not take place continuously, but rather only at sometimes significant time intervals, this step is preferably carried out before the other steps of the method according to this embodiment of the invention, which by their nature can be expected to have a higher repetition rate. However, the preparation 900 can also be carried out at another point in the sequence or, in the case of repeated execution of the method, only as required when the display content changes.
[0038] This is followed by detection 910 of the current viewing direction R of an occupant F in relation to the windshield 12. After detection 910, a definition 930 of a clearance area A of the windshield 12 around the current viewing direction R takes place. This is followed by displaying 940 the prepared augmented reality information on the windshield 12, taking into account the defined clearance area A. Once this sequence has been completed, the method can be repeated. If the augmented reality information is to remain unchanged in the next run, the preparation 900 can be skipped, and the display 940 of the prepared augmented reality information need only be carried out with a newly defined clearance area A when a change in the current viewing direction R of the occupant F with respect to the windshield 12 is detected 910.
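The repeating sequence of steps 900 through 940 can be sketched as a loop. The function names, the callback for detecting a content change, and the skip logic for an unchanged viewing direction are illustrative assumptions, not part of the claimed method.

```python
def ar_display_loop(prepare, detect_gaze, define_clearance, display,
                    content_changed, cycles=3):
    """Illustrative loop over the method steps.

    prepare()           -> AR info (step 900), run only when content changes
    detect_gaze()       -> current viewing direction R (step 910)
    define_clearance(R) -> clearance area A around R (step 930)
    display(info, A)    -> render taking A into account (step 940)
    content_changed()   -> True when re-preparation (900) is required
    """
    info = prepare()                       # initial preparation (900)
    last_gaze = None
    for _ in range(cycles):
        if content_changed():
            info = prepare()               # re-run 900 only on demand
        gaze = detect_gaze()               # 910
        if gaze != last_gaze:              # react only to a changed R
            area = define_clearance(gaze)  # 930
            display(info, area)            # 940
            last_gaze = gaze
```

Note how the higher-rate steps 910 to 940 run every cycle, while the preparation 900 is invoked only once up front and thereafter on demand, matching the ordering rationale given above.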
[0039] The detection 910 of the current viewing direction R of an occupant F and/or the definition of a clearance area A of the windshield 12 can be deactivated. This can be done, for example, by a digital or analog switching element, by gesture or voice control, or depending on the situation. In the case of deactivation, for example, a current display 940 of the prepared augmented reality information can remain displayed permanently and without further adaptation to the current viewing direction. Deactivating the detection 910 of the current viewing direction R of an occupant F also has the advantage that a corresponding sensor 24 used for the detection can likewise be deactivated. The optional interruption or delay in the method sequence is indicated in the illustration by an interrupted feedback arrow.
[0040] Preferably, after the detection 910 of the current viewing direction R, a determination 920 of an ambient object 82, 84 located in the current viewing direction R can take place for an expanded and/or enlarged display as augmented reality information. A display of the ambient object 82, 84 as augmented reality information can then take place within the clearance area A, within a surrounding area B located in the vicinity of the clearance area A, or on a screen. Preferably, after an expanded and/or enlarged display of the ambient object 82, 84, a renewed detection 910 of the current viewing direction R of the occupant F is deactivated for a limited time or can be so deactivated. Such a deactivation can be triggered, for example, by a digital or analog switching element, by gesture or voice control, or depending on the situation, so that the expanded and/or enlarged display of an ambient object 82, 84 located in the current viewing direction R allows an in-depth view. This deactivation can take place independently of a deactivation of the detection 910 of the current viewing direction R of an occupant F and/or of the definition of a clearance area A of the windshield 12, so that an expanded and/or enlarged display of an ambient object 82, 84 lying in a previous viewing direction R can nevertheless be tracked, for example, into a new clearance area A that has meanwhile been adapted to a changed viewing direction.
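The time-limited deactivation of the renewed detection 910 after an enlarged display can be sketched as a small timer. The class name, method names, and the timeout value are illustrative assumptions; the description leaves the trigger and duration open.

```python
import time

class GazeTracker:
    """Illustrative tracker that suspends renewed gaze detection (910)
    for a limited time after an enlarged ambient-object display, so the
    occupant can study the object without the display following the eyes.
    """

    def __init__(self, suspend_seconds=5.0, clock=time.monotonic):
        self.suspend_seconds = suspend_seconds
        self.clock = clock          # injectable clock, eases testing
        self._suspended_until = 0.0

    def on_enlarged_display(self):
        # Called when an ambient object (e.g. 82, 84) is shown enlarged.
        self._suspended_until = self.clock() + self.suspend_seconds

    def detection_active(self):
        # Renewed detection 910 resumes once the time window has elapsed.
        return self.clock() >= self._suspended_until
```

Because the timer only gates the renewed detection 910, a deactivation of the clearance-area definition 930 can be handled independently, as described above.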
LIST OF REFERENCE NUMBERS
[0041] 10 vehicle
[0042] 12 windshield
[0043] 14 steering wheel
[0044] 20 computing unit
[0045] 22 displaying unit
[0046] 24 sensor
[0047] 30 first object detection
[0048] 32 second object detection
[0049] 80 road
[0050] 82 tree
[0051] 84 bus
[0052] A clearance area
[0053] B surrounding area
[0054] F occupant (driver)
[0055] R current viewing direction
[0056] 900 preparation of augmented reality information
[0057] 910 detecting the current viewing direction
[0058] 920 determining an ambient object lying in the current viewing direction
[0059] 930 setting a clearance area
[0060] 940 display of the prepared augmented reality information