METHOD FOR OPERATING A DISPLAY AND OPERATING DEVICE, DISPLAY AND OPERATING DEVICE, AND MOTOR VEHICLE

20210278909 · 2021-09-09

Abstract

A control device of a display and operating device specifies a graphical configuration of a display element to be displayed by the display device. Based on the specified graphical configuration, at least one light wave parameter for a respective light wave of a group of light waves is ascertained. A totality of the light wave parameters describes an interference pattern which describes an at least partial superposition of the light waves for generating an image representation of the display element. An interference pattern signal which describes the totality of the ascertained light wave parameters is generated and the generated interference pattern signal is transferred to an interference output device of the display and operating device. Based on the transferred interference pattern signal, the interference output device generates a light wave interference, and an output of a real image of the display element is displayed.

Claims

1.-9. (canceled)

10. A method for operating a display and operating device, the method comprising: specifying a graphical configuration of a display element to be displayed by the display and operating device; based on the specified graphical configuration, ascertaining at least one light wave parameter for a respective light wave of a group of light waves, each of the light wave parameters describing a frequency, an amplitude, or a wavelength of the respective light wave, and a totality of the light wave parameters describing an interference pattern which describes an at least partial superposition of the light waves for generating an image representation of the display element; generating an interference pattern signal which describes the totality of the ascertained light wave parameters; transferring, from a control device of the display and operating device, the generated interference pattern signal to an interference output device; based on the transferred interference pattern signal, the interference output device generating and outputting the group of light waves and setting the respective at least one light wave parameter for each of the light waves among the group of light waves so as to generate a light wave interference and output a real image of the display element; receiving an operating signal from an operating device, the operating signal describing an operating action by a user with respect to the real image of the display element; based on the received operating signal, establishing a change in a relative position and/or a position of the real image of the display element, as described by the operating action; and based on the established change in the relative position and/or the position, adapting at least one light wave parameter for at least one light wave to change the interference pattern.

11. The method according to claim 10, wherein the display element is displayable at a position at which, during use of the display and operating device, a body part of the user is intersectable with a display plane of the display element.

12. The method according to claim 10, further comprising: generating, by a contact trigger device of the display and operating device, a field perceivable in a haptic and tactile fashion in and/or on an operating region which is a region reachable by a body part of the user for performing the operating action with respect to the real image of the display element.

13. The method according to claim 12, wherein generating the field perceivable in the haptic and tactile fashion is performed by generating a vibration at a contact point of the body part and the display element and/or by graphically highlighting a portion of the display element at the contact point of the body part and the display element.

14. The method according to claim 10, wherein the generated interference pattern signal describes a configuration of an operating element as the display element and/or of a three-dimensional display element as the display element.

15. The method according to claim 10, wherein the operating device includes a gesture recognition device, and the operating action by the user is an operating gesture recognized by the gesture recognition device.

16. The method according to claim 15, wherein the operating gesture includes a rotary movement of a body part of the user, and adapting the at least one light wave parameter for the at least one light wave to change the interference pattern causes the real image of the display element to be rotated in accordance with the rotary movement of the body part of the user.

17. A control device for a display and operating device, the control device comprising: a memory to store instructions; and a processor to execute the instructions stored in the memory to: specify a graphical configuration of a display element to be displayed by the display and operating device, based on the specified graphical configuration, ascertain at least one light wave parameter for a respective light wave of a group of light waves, each of the light wave parameters describing a frequency, an amplitude, or a wavelength of the respective light wave, and a totality of the light wave parameters describing an interference pattern which describes an at least partial superposition of the light waves for generating an image representation of the display element, generate an interference pattern signal which describes the totality of the ascertained light wave parameters, transfer the generated interference pattern signal to an interference output device to cause a real image of the display element to be output, receive an operating signal from an operating device, the operating signal describing an operating action by a user with respect to the real image of the display element, based on the received operating signal, establish a change in a relative position and/or a position of the real image of the display element, as described by the operating action, and based on the established change in the relative position and/or the position, adapt at least one light wave parameter for at least one light wave to change the interference pattern.

18. A display and operating device, comprising: an interference output device configured to, based on an interference pattern signal, generate and output a group of light waves and set a respective at least one light wave parameter for each of the light waves among the group of light waves, so as to generate a light wave interference and output a real image of a display element; and a control device configured to: specify a graphical configuration of the display element to be displayed, based on the specified graphical configuration, ascertain at least one light wave parameter for a respective light wave of the group of light waves, each of the light wave parameters describing a frequency, an amplitude, or a wavelength of the respective light wave, and a totality of the light wave parameters describing an interference pattern which describes an at least partial superposition of the light waves for generating an image representation of the display element, generate the interference pattern signal which describes the totality of the ascertained light wave parameters, transfer the generated interference pattern signal to the interference output device, receive an operating signal from an operating device, the operating signal describing an operating action by a user with respect to the real image of the display element, based on the received operating signal, establish a change in a relative position and/or a position of the real image of the display element, as described by the operating action, and based on the established change in the relative position and/or the position, adapt at least one light wave parameter for at least one light wave to change the interference pattern.

19. The display and operating device according to claim 18, further comprising a contact trigger device configured to generate a field perceivable in a haptic and a tactile fashion in an operating region which is a region reachable by a body part of the user for performing the operating action with respect to the real image of the display element, the contact trigger device including at least one airflow generating element.

20. The display and operating device according to claim 18, further comprising the operating device, wherein the operating device includes a gesture recognition device configured to capture and recognize the operating action in an operating region which is a region reachable by a body part of the user for performing the operating action with respect to the real image of the display element.

21. A motor vehicle, comprising: a chassis; and the control device of claim 17.

22. The motor vehicle according to claim 21, further comprising: a vehicle system including at least one of a climate system, navigation system, or entertainment system, wherein the operating action by the user with respect to the real image of the display element is to select or change a function or setting of the vehicle system.

23. A motor vehicle, comprising: a chassis; and the display and operating device of claim 18.

24. The motor vehicle according to claim 23, further comprising: a vehicle system including at least one of a climate system, navigation system, or entertainment system, wherein the operating action by the user with respect to the real image of the display element is to select or change a function or setting of the vehicle system.

25. The motor vehicle according to claim 24, wherein the display and operating device further comprises a contact trigger device configured to generate a field perceivable in a haptic and a tactile fashion in an operating region which is a region reachable by a body part of the user for performing the operating action with respect to the real image of the display element.

26. The motor vehicle according to claim 24, wherein the display and operating device further comprises the operating device, and the operating device includes a gesture recognition device configured to capture and recognize the operating action in an operating region which is a region reachable by a body part of the user for performing the operating action with respect to the real image of the display element.

Description

BRIEF DESCRIPTION OF THE DRAWINGS

[0037] These and other aspects and advantages will become more apparent and more readily appreciated from the following description of the example embodiments, taken in conjunction with the accompanying drawings of which:

[0038] FIG. 1 is a schematic illustration of a first embodiment of the method described herein and the display and operating apparatus described herein;

[0039] FIG. 2 is a schematic illustration of a further embodiment of the method described herein and the display and operating apparatus described herein;

[0040] FIG. 3 is a schematic illustration of a further embodiment of the method described herein and the display and operating apparatus described herein;

[0041] FIG. 4 is a schematic illustration of a further embodiment of the method described herein and the display and operating apparatus described herein;

[0042] FIG. 5 is a schematic illustration of a further embodiment of the method described herein and the display and operating apparatus described herein; and

[0043] FIG. 6 is a schematic illustration of a further embodiment of the method described herein and the display and operating apparatus described herein.

DETAILED DESCRIPTION

[0044] Reference will now be made in detail to various examples which are illustrated in the accompanying drawings, wherein like reference characters refer to like elements throughout.

[0045] The example embodiments described below are merely examples. In the example embodiments, the described components of the embodiments each constitute individual features which are to be considered independently of one another, which also each develop the disclosure independently of one another, and which are thus also to be considered to be a component of the disclosure individually or in a combination other than the one depicted. Moreover, the embodiments described can also be supplemented with further features of the already described features.

[0046] In the drawings, functionally identical elements are each denoted by the same reference characters.

[0047] FIG. 1 elucidates the principle of the method described herein and of the display and operating apparatus 10 described herein on the basis of a first example embodiment. By way of example, the display and operating apparatus (device) 10 can be installed in a center console or in a control panel of a motor vehicle 12, wherein the motor vehicle can be configured as an automobile, for example. By way of example, the display and operating apparatus 10 can include an optional processor device 11 and an optional storage device 13, wherein the storage device 13 can be configured, for example, as a memory card or memory chip. Here, in the example of FIG. 1, both the processor device 11 and the storage device 13 are shown as optional constituent parts of the control device 14; however, both components can also be arranged outside of the control device 14. By way of example, the control device 14 can be configured as a control circuit board.

[0048] The example display and operating apparatus 10 of FIG. 1 includes an interference output device 16, which can for example include a holographic generator or a plurality of holographic generators. The interference output device 16 is configured and set up to generate light wave interference. To this end, the interference output device 16 can generate and output one or more light waves or light wave groups. The individual light waves or light wave groups can be varied in terms of their light wave parameters in such a way that two or more light waves can superpose. As an alternative or in addition thereto, the light waves or light wave groups can be steered in such a way that such a superposition can take place. Appropriate appliances are known to a person skilled in the art from the related art. The interferences, i.e. superpositions, generate one or more picture elements in front of the display area (not shown in FIG. 1) or therebehind. Expressed differently, a picture element arises where two or more light waves meet. As a result, an image arises by a combination of a plurality of points. In contrast to a hologram, the image of the interference output device 16 is not static.

[0049] Expressed differently, a light spot that can be actively controlled in space can arise where two or more light waves with high amplitudes meet. This control can be implemented in a three-dimensional and/or color-dependent fashion. In the process, these light spots or picture elements actually arise in space, i.e. these are not virtual points. Expressed differently, this relates to a real image. Advantageously, the picture elements can be actively controlled, and so moving pictures arise which, for example, may appear as real shining illuminants.
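The interference principle described above can be sketched numerically: a bright picture element arises where the scalar fields of two or more coherent light waves add constructively, and cancellation occurs where they meet out of phase. The following minimal sketch is illustrative only; the data layout and the 532 nm wavelength are assumptions, not part of the disclosed apparatus.

```python
import math

def field_at_point(waves, distance_fn):
    """Sum the scalar fields of several light waves at one point in space.

    Each wave is a dict with an amplitude, a wavelength (m), and a phase (rad);
    distance_fn(i) gives the path length from wave i's source to the point.
    A bright picture element arises where the waves add constructively.
    """
    total = 0.0
    for i, w in enumerate(waves):
        k = 2 * math.pi / w["wavelength"]  # wavenumber
        r = distance_fn(i)                 # path length from source i to the point
        total += w["amplitude"] * math.cos(k * r + w["phase"])
    return total

# Two coherent waves meeting with equal path lengths reinforce each other
# (constructive interference); a half-wavelength path difference cancels them.
waves = [
    {"amplitude": 1.0, "wavelength": 532e-9, "phase": 0.0},
    {"amplitude": 1.0, "wavelength": 532e-9, "phase": 0.0},
]
bright = field_at_point(waves, lambda i: 0.0)              # equal paths
dark = field_at_point(waves, lambda i: [0.0, 266e-9][i])   # half-wave offset
```

A real interference output device would steer such superpositions so that constructive spots trace out the picture elements of the display element in space.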

[0050] The example of FIG. 1 shows an optional contact trigger device 18 which can be configured, for example, to generate small blasts of air. Alternatively, the contact trigger device 18 can be configured as an ultrasonic device which may include a sound generator element 20. By way of example, the sound generator element 20 can be a membrane transmitter. Such a sound generator element can also be referred to as a sound generator or signal generator. For example, the sound generator element can include an arrangement or a matrix of a plurality of sound generators or signal generators, i.e. it can be configured as a so-called “signal generator array” or “sound generator array”, for example as an ultrasonic array. Suitable appliance components or appliances for generating a field perceivable in haptic and tactile fashion, a so-called “acoustic field”, are known to a person skilled in the art from the related art, for example from “Ultrahaptics”. For example, the example sound generator element 20 can be an array of a plurality of ultrasonic sensors, which is used in park assist systems in the related art.

[0051] In the example of FIG. 1, the example sound generator element 20 can be aligned at an operating region 22, i.e. the region in which the user 15 must hold their hand, for example, so that an operating gesture can be recognized. As an alternative to the operation by way of an operating gesture, operation by way of, for example, a touchpad or a voice input or a rotary controller is also conceivable, which may be connected to the display and operating apparatus 10.

[0052] In the example of FIG. 1, the operating region 22 can correspond with a display region 24, or the operating region 22 can be, for example, disposed upstream of the display region 24, i.e. be located next to the display region 24 within which the interference output device 16 outputs the image, which can also be referred to as interference image.

[0053] By way of example, the object or display element 26 displayed in the display region can be an image of a rotary/push controller. By way of example, such an image of an operating element can be provided to set a function of an air-conditioning unit or for navigating through an operating menu.

[0054] In the case of the optional operation by an operating gesture, provision can be made for a gesture recognition device 28, which is shown in example fashion as a constituent part of the display and operating apparatus 10. By way of example, the gesture recognition device 28 can include a camera 30 arranged on a roofliner of the motor vehicle 12, or can be connected to such an example camera 30. Suitable appliance components and software for gesture recognition are known to a person skilled in the art from the related art. The example camera 30 is for example aligned at the operating region 22.

[0055] In combination with the acoustic field, the gesture recognition offers the advantage, as already described further above, that a body part whose operating gesture is intended to be recognized can be positioned exactly. This improves the recognition of the gesture. In addition or as an alternative thereto, provision can be made, for example, for a user 15 to have to place their hand on a hand rest for operating purposes, for example on a touch-sensitive surface ("touchpad"). The gesture recognition device 28 can be set up to distinguish between, for example, an unwittingly moving hand which is "searching for" the operating region or moving randomly, and an operating hand. As an alternative to the embodiment of the gesture recognition with the aid of the camera 30, a radar or an infrared camera, for example, can be used for gesture recognition.

[0056] In a first method operation S1, a graphical configuration of the example operating element can be specified, for example by providing three-dimensional graphics data. By way of example, these graphics data can be stored in the storage device 13 or, for example, can be retrieved from a vehicle-external data server (not shown in FIG. 1). The example graphics data can then be read. Within the scope of this example read-out process, it is possible to ascertain properties of the light waves and/or how the latter have to be deflected in order to generate the desired image of the operating element by way of light wave interference. By way of example, ascertaining one or more light wave parameters S2 can be based on empirical values, or the parameters can be read from the graphics file. Here, example light wave parameters can be an amplitude or a frequency or a wavelength. In a group of light waves generated by the interference output device 16, one or more light wave parameters can be ascertained for each of the light waves (S2) and the interference pattern can be established thus, for example on the basis of a preprogrammed specification. Expressed differently, the interference pattern can describe the image of the example operating element.

[0057] The control device 14 generates the interference pattern signal which describes this pattern and/or the respective light wave parameters (S3).

[0058] The generated interference pattern signal is transferred to the interference output device 16 (S4). Data communication connections 31 are shown as black connecting lines in FIG. 1 and can be configured, for example, for a wireless data transfer, for example as a WLAN or Bluetooth-LE connection. Alternatively, a data communication connection 31 can be wired, i.e., for example, configured as a cable or data bus.

[0059] The light waves are controlled by the interference output device 16 on the basis of the transferred interference pattern signal (S5), for the purposes of which, for example, specified light wave parameters can be set (S6). As a result of this, light wave interference is generated or provided (S7), i.e. the real image of the example operating element is output (S8).
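The sequence S1 through S8 can be summarized as a control pipeline: graphics data in, per-wave parameters out, an interference pattern signal transferred to the output device, which then sets each wave. The class and field names below are illustrative assumptions made for the sketch, not the disclosed signal format.

```python
from dataclasses import dataclass

@dataclass
class LightWaveParams:
    # Each light wave is described by at least one of these parameters (S2).
    amplitude: float
    frequency: float   # Hz
    wavelength: float  # m

def ascertain_parameters(graphics_data):
    """S1/S2: derive one parameter set per light wave from 3D graphics data.

    Illustrative rule only: one wave per vertex, all at an assumed 532 nm.
    """
    c = 299_792_458.0  # speed of light in vacuum, m/s
    wl = 532e-9
    return [
        LightWaveParams(amplitude=v["brightness"], frequency=c / wl, wavelength=wl)
        for v in graphics_data["vertices"]
    ]

def generate_interference_pattern_signal(params):
    """S3: the totality of the parameters describes the interference pattern."""
    return {"pattern": [vars(p) for p in params]}

def interference_output_device(signal):
    """S4-S8 stub: set each wave's parameters and output the real image."""
    for wave in signal["pattern"]:
        pass  # a real device would emit and steer one light wave per entry
    return len(signal["pattern"])  # number of controlled light waves

graphics = {"vertices": [{"brightness": 0.8}, {"brightness": 0.5}]}
signal = generate_interference_pattern_signal(ascertain_parameters(graphics))
n_waves = interference_output_device(signal)
```

The transfer in S4 would occur over a data communication connection 31 (wireless or wired); the stub above only counts the waves it would control.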

[0060] If an acoustic field is generated by the optional contact trigger device 18 (S9), the user 15 is helped with finding the operating region 22. Upon incidence on the skin, the acoustic field triggers the feeling of contact and can consequently convey, for example, to the user 15 that their hand is situated in or on the operating region 22.

[0061] FIG. 2 shows a further example embodiment of a display and operating apparatus 10 described herein, which may correspond to that from the example of FIG. 1. For reasons of clarity, the example gesture recognition device 28 and the contact trigger device 18 are no longer shown here. For the same reason, the control device 14 is no longer shown in the individual partial images of FIG. 2 either. Only differences to the functionality of FIG. 1 are discussed in the following.

[0062] The interference output device 16 of the example of FIG. 2 can include a display area 32, which can optionally be configured as a holographic screen (“holographic display”). On a side facing the user 15 in an example fashion, a round, disk-shaped display element 26, for example, can be displayed (upper part of FIG. 2).

[0063] The purpose of the display element 26 can be the selection of the air-conditioning function, for example. To trigger this function, provision can be made, for example, for the user 15 to be able to hold a hand (or finger(s)) 34 against or on the display element 26 for a certain amount of time, and/or, for example, carry out a tapping gesture or, for example, a swiping gesture. In the case of an example tapping gesture which uses two fingers, it is possible to effectively guard against or even avoid a mal-operation. Example alternative operating functions can be those of a navigation appliance or a radio, for example.

[0064] The gesture recognition device 28 can capture the operating gesture by use of an appropriate signal from the camera 30 (S10), and can recognize the gesture with the aid of the example gesture recognition software (S11).

[0065] For example, it is possible to provide an operating concept in which a plurality of subfunctions of the triggered function, the example air-conditioning unit operation, are provided by "grasping" and/or "tapping" the display element 26 (i.e. by a grasping gesture and/or a tapping gesture) and, to this end, the individual display element 26 is replaced by four display elements 26 (central part of FIG. 2). Each of the four example display elements 26 can then be assigned a sub-function, for example setting a temperature, a fan power, a fan direction and an additional extra function. Expressed differently, the menu can "unfold".

[0066] A hand (or finger(s)) 34 of the user 15, which is currently able to carry out an operating gesture, is likewise shown. If the user 15 then uses the latter to operate the temperature setting, for example, by rotation with the aid of a rotary movement (D), the user is able to select and adjust a specific temperature within the temperature menu. In this case, the currently active sub-function can be presented by a display element 26 that is larger than the two further display elements 26 which can represent the other sub-functions (lower part of FIG. 2).

[0067] Here, the operating gesture can be assigned to the "rotated" display element 26, and hence to the function to be activated, for example on the basis of the camera image. The gesture recognition device 28 can generate an operating signal (S12), which can describe the operating gesture and/or the function to be triggered. This allows the generated operating signal to be provided (S13).

[0068] By way of example, if the example operating gesture describes a rotary movement D, it is possible to establish a corresponding rotary movement of the display element 26 (S14). Optionally, provision can be made for the gesture recognition to be able to capture a length of the rotary movement, for example, and be able to adapt a corresponding visually displayed rotary movement of the display element 26.

[0069] To this end, the light wave parameters can be adapted (S15) such that the interference pattern changes and the image of the display element 26 rotates.
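The conversion of a recognized rotary gesture D into an adapted interference pattern (S14/S15) can be sketched as follows. Here the pattern is modeled, purely as an assumption for illustration, as a list of target points (x, y) at which light waves are made to meet; rotating those targets, and re-deriving the wave parameters from them, rotates the real image.

```python
import math

def adapt_for_rotation(pattern, angle_rad):
    """S14/S15 sketch: rotate the picture elements that form the real image.

    `pattern` is a list of (x, y) target points where light waves interfere
    constructively. Rotating every target point by the gesture angle yields
    a changed interference pattern whose output image appears rotated.
    Illustrative data layout, not the disclosed signal format.
    """
    c, s = math.cos(angle_rad), math.sin(angle_rad)
    return [(x * c - y * s, x * s + y * c) for (x, y) in pattern]

# A quarter-turn rotary gesture D maps the picture element at (1, 0) to (0, 1).
rotated = adapt_for_rotation([(1.0, 0.0)], math.pi / 2)
```

Capturing the length of the rotary movement, as described above, would simply scale `angle_rad` before the adaptation.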

[0070] FIG. 3 shows a further example embodiment relating to a further operating concept. In order to provide a better overview, not all of the individual components of the display and operating apparatus 10 are illustrated. However, in principle, the display and operating apparatus 10 of FIG. 3 can be a display and operating apparatus 10 of FIG. 1, for example. The operating concept of FIG. 3 can optionally be combinable with the operating concept of FIG. 2.

[0071] FIG. 3 shows an example display area 32, which can optionally be referred to as holographic display or as interferometric display. By way of example, the interference output device 16 can display four display elements 26, wherein each of the display elements can have two or more regions and different operating functions can be assigned to the respective regions. Here, the individual functions can be operated by, for example, a tapping or swiping gesture.

[0072] FIG. 4, FIG. 5 and FIG. 6 show an operating concept with a display element 26 configured in an example spherical fashion, which can be rotated in three dimensions in space. A possible conversion of a rotary gesture into a rotary movement of the display element 26 has already been described further above. The spherical operating element as display element 26 can be subdivided, for example, into different regions, which can each be assigned a function of a motor vehicle system. As a result of the three-dimensional configuration, different so-called functional regions can be present both on the front side and on the rear side. By way of, for example, a pressing gesture on a corresponding functional region, it is possible to trigger a function, wherein the spherical display element 26 can be rotated in the case of a rotary gesture D. If the different functional regions are labeled, for example, by an image, writing or an image of a symbol, these symbols or labels can also rotate with the functional region when the display element 26 rotates.

[0073] By way of example, functions for a motor vehicle seat can be provided in the example of FIG. 4, for example the sub-functions “rear”, “off” and “auto”. By way of example, these operating functions can serve to select an air-conditioning state for the back seat, a switch-off or an automatic setting. Other function regions can be displayed in the case of a virtual rotation of the display element 26. If the display element 26 is rotated (FIG. 5), it is possible, for example, to provide functions for ventilation or for a temperature setting.
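The assignment of functional regions to the rotatable sphere can be sketched as a lookup: each azimuth band of the sphere carries a function label, and rotating the sphere shifts which band faces the user. The band layout and the labels below ("rear", "off", "auto", per FIG. 4) are used only for illustration; a real subdivision could cover the full sphere in two dimensions.

```python
def region_for_direction(regions, azimuth_deg, sphere_rotation_deg):
    """Look up which functional region faces the user after a rotary gesture.

    `regions` maps azimuth bands [start, end) in degrees to function labels;
    rotating the sphere by `sphere_rotation_deg` shifts which band faces the
    viewing direction `azimuth_deg`. Illustrative assumption, not the
    disclosed subdivision.
    """
    facing = (azimuth_deg - sphere_rotation_deg) % 360
    for (start, end), label in regions.items():
        if start <= facing < end:
            return label
    return None

# Sub-functions of FIG. 4 spread over the sphere (illustrative bands).
regions = {(0, 120): "rear", (120, 240): "off", (240, 360): "auto"}
before = region_for_direction(regions, 0, 0)    # "rear" faces the user
after = region_for_direction(regions, 0, 180)   # half-turn: "off" faces the user
```

A pressing gesture on the facing region would then trigger the function returned by the lookup, while a rotary gesture D only changes `sphere_rotation_deg`.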

[0074] FIG. 6 shows a configuration of the spherical display element 26, which can be configured as a roller, for example, after, for example, setting a temperature by, for example, a push gesture (FIG. 5).

[0075] Optionally, provision can be made for a so-called learning phase for getting used to the operating concept, within the scope of which the user 15 can learn to operate the display and operating apparatus described herein. Thus, the user 15 can train their motor memory and, for example, learn an optimum speed for rotating a display element 26, or learn where a certain function is located on the display element 26. For learning purposes, haptic feedback, for example, can be generated by the contact trigger device 18 in order to assist the user 15 with the learning.

[0076] Optionally, provision can be made for the display element 26 or a plurality of display elements 26 to be adjustable for a front passenger, for example. In the process, it is possible to alter the position and/or the relative position of the display element 26, for example for as long as it can be visible within a sight cone. By way of example, if the display and operating apparatus 10 is in a control panel of a motor vehicle 12, the display element 26 can be positioned for a driver in such a way that the latter's gaze can fall on the display area 32 in perpendicular fashion. Expressed differently, the display area 32 can be aligned on the user 15. In the case of a large display and operating apparatus 10, a plurality of occupants, for example, can simultaneously see the display area 32 and/or the display element 26. By way of example, a position and/or relative position can be manually adjustable by the user 15. As a result, the display and operating apparatus 10 has a very ergonomic design.

[0077] Overall, the example embodiments elucidate a holographic operating concept which can be based on a virtual 3D body and by use of which it is possible to navigate on this body by an input medium or operating element such as, for example, a rotary/push controller or touch pad, or by way of gesture recognition, for example.

[0078] According to a further example embodiment (FIG. 2), the interference output device 16, which can optionally include a holographic generator, can generate 3D images in front of a display area 32 with the aid of the example holographic generator. For example, by way of a gesture recognition, it is possible to "grasp" and operate a menu. Potentially, this can be supplemented by haptics, for example by blasts of air or ultrasound. By way of example, the user 15 can initially see a three-dimensional body for setting ambient conditions, for example. By way of example, by grasping (or a grasping gesture), this body of the display element 26 can unfold into a plurality of bodies, i.e. into a plurality of display elements 26, which can each display or represent a function. By way of example, it is possible to alter a value by grasping and/or selecting.

[0079] According to a further example embodiment, FIG. 3 describes objects or display elements 26 that appear to “float” in the air. Here, the picture elements are real and not virtual. Here, alternatively, the display elements 26 can be represented as “sheets”, for example for different media or contacts, through which the user 15 can scroll with a “scroll gesture”. Each of the display elements 26 can optionally have a text field with a menu text (for example: “est leucocyte”, “est sushi in shoeshop”, “est goldwrap”, “est dodge the dodo”, “est strange phase”).

[0080] FIG. 4, FIG. 5 and FIG. 6 show a sphere as a display element 26 for a further example embodiment, which sphere is able to, for example, “float in the air” with all air-conditioning functions (FIG. 4). By way of example, the side of the display element 26 shown in FIG. 4 can be a back side. In this respect, FIG. 5 shows, in example fashion, a corresponding front side and a function can be searched for by, for example, rotating and/or grasping.

[0081] In this respect, FIG. 6 shows a selection and adjustment of a value in example fashion.

[0082] Expressed differently, the example sphere can actually “float” in three-dimensional fashion in space, and functions can likewise be displayed on a back side, for example. Thus, a plurality of functions can be imaged in a small space.

[0083] According to a further example embodiment, functions can be operated blindly following a learning phase, for example using an input instrument. By way of example, a seat heater can be selected and/or set using three click gestures to the right and one click gesture in the upward direction.

[0084] A description has been provided with reference to various examples, but it will be understood that variations and modifications can be effected within the spirit and scope of the claims which may include the phrase “at least one of A, B and C” as an alternative expression that means one or more of A, B and C may be used, contrary to the holding in Superguide v. DIRECTV, 358 F3d 870, 69 USPQ2d 1865 (Fed. Cir. 2004).