METHOD FOR OPERATING A DISPLAY DEVICE FOR A MOTOR VEHICLE AND MOTOR VEHICLE

20200310548 · 2020-10-01


    Abstract

    A method of operating a display device for a motor vehicle having a plurality of display areas includes providing a plurality of non-contact input operations, wherein a first non-contact input operation of a user is detected and verified by a second non-contact input operation of the user in order to select at least one display element on a first display area, and at least one third non-contact input operation of the user is detected in order to displace the at least one selected display element within the first display area or to a second display area.

    Claims

    1-15. (canceled)

    16. A method of operating a display device for a motor vehicle comprising a plurality of display areas, the method comprising: providing a plurality of non-contact input operations, wherein a first non-contact input operation of a user is detected and verified by a second non-contact input operation of the user in order to select at least one display element on a first display area, and at least one third non-contact input operation of the user is detected in order to displace the at least one selected display element at least once from the first display area within the first display area or to a second display area,

    wherein, in order to detect the first non-contact input operation, at least one characteristic variable for a head of the user is detected and, for verification purposes, at least one characteristic variable for a torso of the user is detected as the second non-contact input operation, or in order to detect the first non-contact input operation, at least one characteristic variable for the torso of the user is detected and, for verification purposes, at least one characteristic variable for the head of the user is detected as the second non-contact input operation, and

    wherein, in order to detect the third non-contact input operation, at least one characteristic variable for a hand, finger, fingers, or mouth of the user is detected.

    17. The method according to claim 16, wherein the first, second and third non-contact input operations can be at least one of selected or amended by the user.

    18. The method according to claim 16, wherein the characteristic variable for the head of the user comprises at least one of a line of vision, an eye position, an iris position, a pupil position, a nose position, a posture of the head, a head position, a head orientation, or a facial expression of the user, the characteristic variable of the torso of the user comprises at least one of a bodily posture, a body position, a body orientation, a shoulder position, or a shoulder orientation of the user, the characteristic variable for a hand or a finger of the user comprises at least one of a gesture such as a skimming past, approaching, moving away, splaying of fingers, bending of fingers, touching of fingers, making a fist, a finger or hand position, or a finger or hand orientation, or the characteristic variable for the mouth of the user comprises at least one of a movement of the lips, a noise, or a voice command.

    19. The method according to claim 16, wherein, in order to detect one or more of the first, second or third non-contact input operations, at least one depth image camera or a time-of-flight camera is used.

    20. The method according to claim 16, wherein at least one of monitors, projection screens, head-up displays, flexible OLED displays, liquid crystal displays, light-transmitting fabrics, or light-transmitting films are used for the plurality of display areas.

    21. The method according to claim 16, wherein the plurality of display areas are arranged in at least one of an instrument panel, a windshield, a headliner, a central console, or in a further interior trim part of the motor vehicle.

    22. The method according to claim 16, wherein the at least one display element is at least one of an icon, a menu item, a total content of the first display area, or a selected subarea of the first display area.

    23. The method according to claim 16, wherein brain activity of the user is detected as a further non-contact input operation.

    24. The method according to claim 16, wherein at least one of the at least one display element on the first display area is selected in response to the first and second non-contact input operations being detected within a first interval of time, or the at least one selected display element is displaced from the first display area within the first display area or to a second display area in response to one or more of the first or second non-contact input operations and the third non-contact input operation being detected within a second interval of time.

    25. The method according to claim 16, wherein the displacement is cancelled by a fourth non-contact input operation of the user, which differs from the first, second and third non-contact input operations of the user, or by actuating an input apparatus comprising a touch panel.

    26. The method according to claim 16, wherein during the displacement of the at least one display element, at least one of the at least one display element is enlarged, additional information regarding the at least one display element is displayed, or a change is made to a submenu regarding the at least one display element.

    27. The method according to claim 16, wherein during the displacement of the at least one display element, at least one of the at least one display element is reduced, less information regarding the at least one display element is displayed, or the at least one display element disappears from a line of vision of the user or enters the background.

    28. A motor vehicle comprising a display device with a plurality of display areas; at least one sensor for detecting a non-contact input operation; and a control apparatus which is configured to perform the method according to claim 16.

    Description

    BRIEF DESCRIPTION OF THE DRAWINGS

    [0027] Preferred embodiments of the present invention are explained in greater detail below by way of example with reference to the drawings, in which

    [0028] FIG. 1 shows a schematic representation of a motor vehicle which is configured to perform an exemplary embodiment of the method according to the invention;

    [0029] FIG. 2 shows a schematic representation of the detection of non-contact operating actions within the framework of an exemplary embodiment of the method according to the invention;

    [0030] FIG. 3 shows a schematic representation of the performance of an operating action within the framework of an exemplary embodiment of the method according to the invention; and

    [0031] FIG. 4 shows a further schematic representation of the performance of an operating action within the framework of a further exemplary embodiment of the method according to the invention.

    DETAILED DESCRIPTION

    [0032] It is to be understood that the disclosure is not limited in its application to the details of construction and to the arrangements of the components set forth in the following description or illustrated in the drawings. The Figures and written description are provided to teach those skilled in the art to make and use the inventions for which patent protection is sought. The disclosure is capable of other embodiments and of being practiced and carried out in various ways. Those skilled in the art will appreciate that not all features of a commercial embodiment are shown for the sake of clarity and understanding.

    [0033] In addition, it is to be understood that the phraseology and terminology employed herein are for the purpose of describing the present disclosure and should not be regarded as limiting. For example, the use of a singular term, such as "a," is not intended as limiting of the number of items. Also, relational terms, such as, but not limited to, top, bottom, left, right, upper, lower, down, up, and side, are used in the description for clarity in specific reference to the Figures and are not intended to limit the scope of the present disclosure. Further, it should be understood that any one of the features may be used separately or in combination with other features. Other systems, methods, features, and advantages will be or become apparent to those skilled in the art upon examination of the Figures and the description. The term driver is used throughout this disclosure but is not limited to a person who is operating or controlling the vehicle; it may refer to any vehicle occupant, person, passenger, or user inside the vehicle, or, in certain circumstances, a person who is outside the vehicle but controlling the vehicle or interested in movement of the vehicle. It is intended that all such additional systems, methods, features, and advantages be included within this description, and be within the scope of the present disclosure.

    [0034] A motor vehicle which is designated in its entirety with 10 includes a first camera 12 which is arranged in an instrument panel 14, and a second camera 16 which is installed at the transition between a windshield 18 and the roof 20 of the motor vehicle 10. The camera 16 can, for example, also be integrated into an internal rearview mirror of the motor vehicle 10.

    [0035] Furthermore, the motor vehicle 10 includes a plurality of display devices (not represented in FIG. 1), which can be integrated, for example, into the instrument panel 14, implemented as a head-up display on the windshield 18, or installed in a headliner 22 or other interior trim parts of the motor vehicle 10.

    [0036] In order to enable non-contact control of the display devices, the position of a head 24 of a vehicle occupant 26, in the example shown the driver of the motor vehicle 10, is observed with the camera 16. The camera 16 detects, for example, both the eyes 28 of the vehicle occupant 26 and his entire head 24. The position of the eyes 28 can be monitored by image recognition of the eyes 28 as a whole. However, a more precise analysis can also be performed, in which the position of the pupils or the iris of the eyes 28 is observed. In order to determine the position and orientation of the head 24, the camera 16 can observe parts of the head which are particularly easy to recognize, such as, for example, the nose 30 of the vehicle occupant 26.
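The two-level analysis described above, a coarse head orientation from easily recognizable landmarks such as the nose 30, refined by the position of the pupils within the eyes 28, can be illustrated with a simplified 2D sketch. The function, coordinates, and weighting below are illustrative assumptions and not part of the disclosed method; in practice the landmark coordinates would come from an image-recognition stage.

```python
def horizontal_gaze(eye_left, eye_right, pupil, nose):
    """Return a value in [-1, 1]: negative = looking left, positive = looking right.

    eye_left/eye_right: outer eye-corner x-coordinates (pixels)
    pupil: pupil x-coordinate; nose: nose-tip x-coordinate
    """
    eye_center = (eye_left + eye_right) / 2
    eye_width = eye_right - eye_left
    # Coarse head yaw: offset of the nose from the midpoint between the eyes.
    head_term = (nose - eye_center) / eye_width
    # Fine eye term: offset of the pupil within the eye region.
    eye_term = (pupil - eye_center) / (eye_width / 2)
    # Weighted combination; the weights are an illustrative assumption.
    return max(-1.0, min(1.0, 0.6 * head_term + 0.4 * eye_term))

# Head straight ahead, pupil shifted toward the right eye corner:
print(horizontal_gaze(100, 180, 155, 140))
```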

    [0037] Furthermore, the further camera 12 in the instrument panel 14 records the position and movement of a hand 32 of the vehicle occupant 26.

    [0038] The cameras 12, 16 are preferably depth image cameras, particularly preferably so-called time-of-flight cameras which also supply distance information for each pixel so that the image recognition becomes particularly precise.
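The benefit of the per-pixel distance information mentioned in paragraph [0038] can be sketched as follows: a hand held toward the instrument panel can be separated from the cabin background by a simple depth threshold. The toy depth image and the threshold value are assumptions for illustration only, not part of the disclosure.

```python
def segment_foreground(depth_m, max_range_m=0.6):
    """Mask of pixels closer than max_range_m meters (0 means no return)."""
    return [[0.0 < d < max_range_m for d in row] for row in depth_m]

# Toy 4x4 depth image in meters: a "hand" at ~0.3 m in front of a
# dashboard background at ~1.2 m.
depth = [[1.2] * 4 for _ in range(4)]
for r in (1, 2):
    for c in (1, 2):
        depth[r][c] = 0.3

mask = segment_foreground(depth)
print(sum(sum(row) for row in mask))  # number of foreground pixels
```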

    [0039] The combination of the detection of the head position, eye position and gestures of the vehicle occupant 26 makes possible, as explained below with reference to FIGS. 2 to 4, a particularly precise non-contact control of the display device of the motor vehicle 10.

    [0040] FIG. 2 summarizes how the cameras 12 and 16 detect both the gestures and the bodily posture, head posture, and eye position of the vehicle occupant 26. The detection of all of the indicated parameters is particularly advantageous for the described method.

    [0041] FIG. 3 shows how a display device 34 having a first display area 36 and a second display area 38 can be operated. A first display element 40, for example an icon, is arranged on the first display area 36. The second display area 38 shows a second display element 42, which can likewise be an icon, a menu item, or the like.

    [0042] If the vehicle occupant 26 wishes to carry out an operating action, he will first look, for example, at the one of the two display areas 36, 38 that is to be the target of his operating action. He can thus, for example, look at the display area 36, and his line of vision is detected by means of his eye position with the camera 16. At the same time, it is checked whether the bodily posture of the vehicle occupant 26 coincides with his line of vision in order to verify the recognition of the display area 36 to be selected. To this end, the camera 16 also detects the body orientation. If the vehicle occupant 26 then executes a gesture with his hand 32, which can be detected by the camera 12, in the example shown a lateral wiping movement between the display areas 36, 38, the display element 40 selected by the look and verified by the body orientation is displaced from the selected display area 36 to the other display area 38.
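The interplay of the three inputs in paragraph [0042], selection by the look, verification by the body orientation, and displacement by the gesture, can be sketched as a small state machine. The event types, target names, and time windows (compare the intervals of claim 24) are illustrative assumptions, not the patented implementation.

```python
from dataclasses import dataclass

@dataclass
class InputEvent:
    kind: str        # "gaze", "torso", or "gesture"
    target: str      # display area the input points at, e.g. "area36"
    timestamp: float # seconds

class DisplacementController:
    def __init__(self, verify_window=1.0, gesture_window=2.0):
        self.verify_window = verify_window    # first interval of time (assumed)
        self.gesture_window = gesture_window  # second interval of time (assumed)
        self.gaze = None
        self.selected = None  # (area, time of verified selection)

    def feed(self, event):
        """Return ("displace", src, dst) when all three inputs agree, else None."""
        if event.kind == "gaze":
            self.gaze = event
        elif event.kind == "torso" and self.gaze is not None:
            # The second input verifies the first: same target, close in time.
            same_target = event.target == self.gaze.target
            in_time = event.timestamp - self.gaze.timestamp <= self.verify_window
            if same_target and in_time:
                self.selected = (self.gaze.target, event.timestamp)
        elif event.kind == "gesture" and self.selected is not None:
            src, t_sel = self.selected
            if event.timestamp - t_sel <= self.gesture_window:
                self.selected = None
                return ("displace", src, event.target)
        return None

ctrl = DisplacementController()
ctrl.feed(InputEvent("gaze", "area36", 0.0))    # look at display area 36
ctrl.feed(InputEvent("torso", "area36", 0.4))   # body orientation verifies it
result = ctrl.feed(InputEvent("gesture", "area38", 1.2))  # wipe toward area 38
print(result)
```

A gesture without a preceding verified selection is ignored, which is the point of the two-stage verification: an accidental hand movement alone cannot trigger a displacement.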

    [0043] FIG. 4 shows this again in detail. Here, the entire content of the display area 36 is displaced in the way described to the display area 38, as a result of which a new, predetermined display element 44 is displayed on the display area 36 which is now vacated. A displacement of the entire content of the display area 36 to the display area 38 is, however, not necessarily always the case. It is also possible that only one display element 40, 42, 44 is displaced between the display areas 36, 38 in the way described, or that further actions regarding the selected display element 40, 42, 44 are performed. For example, by displacing an individual icon to a new display area 36, 38, a program associated with said icon can be activated in the newly selected display area 36, 38, or similar further operating functions can be performed. Here, the selection of a menu item, the adjustment of slide controls such as, for example, for the volume of an entertainment system, the selection from a list such as, for example, a list of telephone numbers for an integrated telephone or the like is also conceivable.

    [0044] To this end, further non-contact inputs can also be detected and evaluated. Additional voice recognition or the recognition of brain activities of the user by way of a brain-machine interface is conceivable, for example.

    [0045] The features of the invention disclosed in the above description, in the drawings and in the claims can be material, both individually and in any combination, for the realization of the invention in its various embodiments.

    LIST OF REFERENCE NUMERALS

    Motor vehicle 10
    Camera 12
    Instrument panel 14
    Camera 16
    Windshield 18
    Roof 20
    Headliner 22
    Head 24
    Vehicle occupant 26
    Eye 28
    Nose 30
    Hand 32
    Display device 34
    Display area 36
    Display area 38
    Display element 40
    Display element 42
    Display element 44