METHOD FOR OPERATING A DISPLAY AND OPERATING DEVICE, DISPLAY AND OPERATING DEVICE, AND MOTOR VEHICLE
20210278909 · 2021-09-09
CPC classification
G06F3/017 · G06F3/011 · G06F3/0425 · G03H2001/0061 · B60K35/00 · G06F3/016 · G06F3/04845
Abstract
A control device of a display and operating device specifies a graphical configuration of a display element to be displayed by the display device. Based on the specified graphical configuration, at least one light wave parameter is ascertained for each light wave of a group of light waves. The totality of the light wave parameters describes an interference pattern, which in turn describes an at least partial superposition of the light waves for generating an image representation of the display element. An interference pattern signal describing the totality of the ascertained light wave parameters is generated and transferred to an interference output device of the display and operating device. Based on the transferred interference pattern signal, the interference output device generates a light wave interference and outputs a real image of the display element.
Claims
1.-9. (canceled)
10. A method for operating a display and operating device, the method comprising: specifying a graphical configuration of a display element to be displayed by the display and operating device; based on the specified graphical configuration, ascertaining at least one light wave parameter for a respective light wave of a group of light waves, each of the light wave parameters describing a frequency, an amplitude, or a wavelength of the respective light wave, and a totality of the light wave parameters describing an interference pattern which describes an at least partial superposition of the light waves for generating an image representation of the display element; generating an interference pattern signal which describes the totality of the ascertained light wave parameters; transferring, from a control device of the display and operating device, the generated interference pattern signal to an interference output device; based on the transferred interference pattern signal, the interference output device generating and outputting the group of light waves and setting the respective at least one light wave parameter for each of the light waves among the group of light waves so as to generate a light wave interference and output a real image of the display element; receiving an operating signal from an operating device, the operating signal describing an operating action by a user with respect to the real image of the display element; based on the received operating signal, establishing a change in a relative position and/or a position of the real image of the display element, as described by the operating action; and based on the established change in the relative position and/or the position, adapting at least one light wave parameter for at least one light wave to change the interference pattern.
11. The method according to claim 10, wherein the display element is displayable at a position at which, during use of the display and operating device, a body part of the user is intersectable with a display plane of the display element.
12. The method according to claim 10, further comprising: generating, by a contact trigger device of the display and operating device, a field perceivable in a haptic and tactile fashion in and/or on an operating region which is a region reachable by a body part of the user for performing the operating action with respect to the real image of the display element.
13. The method according to claim 12, wherein generating the field perceivable in the haptic and tactile fashion is performed by generating a vibration at a contact point of the body part and the display element and/or by graphically highlighting a portion of the display element at the contact point of the body part and the display element.
14. The method according to claim 10, wherein the generated interference pattern signal describes a configuration of an operating element as the display element and/or of a three-dimensional display element as the display element.
15. The method according to claim 10, wherein the operating device includes a gesture recognition device, and the operating action by the user is an operating gesture recognized by the gesture recognition device.
16. The method according to claim 15, wherein the operating gesture includes a rotary movement of a body part of the user, and adapting the at least one light wave parameter for the at least one light wave to change the interference pattern causes the real image of the display element to be rotated in accordance with the rotary movement of the body part of the user.
17. A control device for a display and operating device, the control device comprising: a memory to store instructions; and a processor to execute the instructions stored in the memory to: specify a graphical configuration of a display element to be displayed by the display and operating device, based on the specified graphical configuration, ascertain at least one light wave parameter for a respective light wave of a group of light waves, each of the light wave parameters describing a frequency, an amplitude, or a wavelength of the respective light wave, and a totality of the light wave parameters describing an interference pattern which describes an at least partial superposition of the light waves for generating an image representation of the display element, generate an interference pattern signal which describes the totality of the ascertained light wave parameters, transfer the generated interference pattern signal to an interference output device to cause a real image of the display element to be output, receive an operating signal from an operating device, the operating signal describing an operating action by a user with respect to the real image of the display element, based on the received operating signal, establish a change in a relative position and/or a position of the real image of the display element, as described by the operating action, and based on the established change in the relative position and/or the position, adapt at least one light wave parameter for at least one light wave to change the interference pattern.
18. A display and operating device, comprising: an interference output device configured to, based on an interference pattern signal, generate and output a group of light waves and set a respective at least one light wave parameter for each of the light waves among the group of light waves, so as to generate a light wave interference and output a real image of a display element; and a control device configured to: specify a graphical configuration of the display element to be displayed, based on the specified graphical configuration, ascertain at least one light wave parameter for a respective light wave of the group of light waves, each of the light wave parameters describing a frequency, an amplitude, or a wavelength of the respective light wave, and a totality of the light wave parameters describing an interference pattern which describes an at least partial superposition of the light waves for generating an image representation of the display element, generate the interference pattern signal which describes the totality of the ascertained light wave parameters, transfer the generated interference pattern signal to the interference output device, receive an operating signal from an operating device, the operating signal describing an operating action by a user with respect to the real image of the display element, based on the received operating signal, establish a change in a relative position and/or a position of the real image of the display element, as described by the operating action, and based on the established change in the relative position and/or the position, adapt at least one light wave parameter for at least one light wave to change the interference pattern.
19. The display and operating device according to claim 18, further comprising a contact trigger device configured to generate a field perceivable in a haptic and a tactile fashion in an operating region which is a region reachable by a body part of the user for performing the operating action with respect to the real image of the display element, the contact trigger device including at least one airflow generating element.
20. The display and operating device according to claim 18, further comprising the operating device, wherein the operating device includes a gesture recognition device configured to capture and recognize the operating action in an operating region which is a region reachable by a body part of the user for performing the operating action with respect to the real image of the display element.
21. A motor vehicle, comprising: a chassis; and the control device of claim 17.
22. The motor vehicle according to claim 21, further comprising: a vehicle system including at least one of a climate system, navigation system, or entertainment system, wherein the operating action by the user with respect to the real image of the display element is to select or change a function or setting of the vehicle system.
23. A motor vehicle, comprising: a chassis; and the display and operating device of claim 18.
24. The motor vehicle according to claim 23, further comprising: a vehicle system including at least one of a climate system, navigation system, or entertainment system, wherein the operating action by the user with respect to the real image of the display element is to select or change a function or setting of the vehicle system.
25. The motor vehicle according to claim 24, wherein the display and operating device further comprises a contact trigger device configured to generate a field perceivable in a haptic and a tactile fashion in an operating region which is a region reachable by a body part of the user for performing the operating action with respect to the real image of the display element.
26. The motor vehicle according to claim 24, wherein the display and operating device further comprises the operating device, and the operating device includes a gesture recognition device configured to capture and recognize the operating action in an operating region which is a region reachable by a body part of the user for performing the operating action with respect to the real image of the display element.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0037] These and other aspects and advantages will become more apparent and more readily appreciated from the following description of the example embodiments, taken in conjunction with the accompanying drawings of which:
DETAILED DESCRIPTION
[0044] Reference will now be made in detail to various examples which are illustrated in the accompanying drawings, wherein like reference characters refer to like elements throughout.
[0045] The example embodiments described below are merely examples. In the example embodiments, the described components each constitute individual features which are to be considered independently of one another, which each develop the disclosure independently of one another, and which are thus also to be considered a component of the disclosure individually or in a combination other than the one depicted. Moreover, the described embodiments can also be supplemented by further ones of the features already described.
[0046] In the drawings, functionally identical elements are each denoted by the same reference characters.
[0048] The example display and operating apparatus 10 of the figure includes a control device 14 with a storage device 13, an interference output device 16, an optional contact trigger device 18, and a gesture recognition device 28.
[0049] Expressed differently, a light spot that can be actively controlled in space can arise wherever two or more high-amplitude light waves meet. This control can be implemented in a three-dimensional and/or color-dependent fashion. In the process, these light spots or picture elements actually arise in space, i.e. they are not virtual points. Expressed differently, this relates to a real image. Advantageously, the picture elements can be actively controlled, so that moving pictures arise which, for example, may appear as real shining illuminants.
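The principle described above can be illustrated by a minimal numerical sketch: where the superposed amplitude of several waves is large (constructive interference), a bright picture element arises; where the waves cancel, no light spot is produced. The dictionary keys and function name below are illustrative, not taken from the source.

```python
import math

def superposed_intensity(waves, position):
    """Sum a group of waves at one spatial coordinate and return the
    intensity (square of the superposed amplitude). Each wave carries
    the parameters named in the text: amplitude, wavelength, phase."""
    total = 0.0
    for w in waves:
        k = 2 * math.pi / w["wavelength"]  # wave number
        total += w["amplitude"] * math.cos(k * position + w["phase"])
    return total ** 2

# Two identical in-phase waves interfere constructively: the intensity
# is four times that of a single wave. A pi phase shift cancels them.
wave = {"amplitude": 1.0, "wavelength": 500e-9, "phase": 0.0}
single = superposed_intensity([wave], 0.0)
double = superposed_intensity([wave, wave], 0.0)
opposed = superposed_intensity([wave, {**wave, "phase": math.pi}], 0.0)
```

Sweeping `position` over a grid would yield the spatial interference pattern from which the bright "light spots" can be read off.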
[0053] By way of example, the object or display element 26 displayed in the display region can be an image of a rotary/push controller. By way of example, such an image of an operating element can be provided to set a function of an air-conditioning unit or for navigating through an operating menu.
[0054] In the case of the optional operation by an operating gesture, provision can be made for a gesture recognition device 28, shown by way of example as a constituent part of the display and operating apparatus 10. By way of example, the gesture recognition device 28 can include a camera 30 arranged on a roofliner of the motor vehicle 12, or can be connected to such a camera 30. Suitable hardware components and software for gesture recognition are known to a person skilled in the art from the related art. The example camera 30 is, for example, aligned with the operating region 22.
[0055] In combination with the acoustic field, the gesture recognition offers the advantage, as already described further above, that a body part whose operating gesture is intended to be recognized can be positioned exactly. This improves the recognition of the gesture. In addition or as an alternative thereto, provision can be made, for example, for a user 15 to have to place their hand on a hand rest for operating purposes, for example on a touch-sensitive surface ("touchpad"). The gesture recognition device 28 can be set up to distinguish between, for example, a hand that is unwittingly moving or "searching for" the operating region, a randomly moving hand, and an operating hand. As an alternative to implementing the gesture recognition with the aid of the camera 30, a radar or an infrared camera, for example, can be used.
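One conceivable way to separate an operating hand from a hand merely searching for or passing through the operating region is a dwell-time criterion over tracked camera frames. The source does not specify the actual criterion; the function and threshold below are assumptions for illustration only.

```python
def classify_hand(positions, in_region, min_dwell_frames=10):
    """Classify a tracked hand as 'operating' or 'searching'.

    positions: sequence of hand positions, one per camera frame.
    in_region: predicate returning True if a position lies inside the
    operating region 22.
    A hand that stays in the region for min_dwell_frames consecutive
    frames is treated as an operating hand (a simplifying assumption).
    """
    dwell = 0
    for p in positions:
        dwell = dwell + 1 if in_region(p) else 0
        if dwell >= min_dwell_frames:
            return "operating"
    return "searching"

# A hand resting in the region long enough counts as operating;
# a brief pass through the region does not.
steady = classify_hand([(0.0, 0.0, 0.0)] * 12, lambda p: True)
passing = classify_hand([(0.0, 0.0, 0.0)] * 5, lambda p: True)
```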
[0056] In a first method operation S1, a graphical configuration of the example operating element can be specified, for example by providing three-dimensional graphics data. By way of example, these graphics data can be stored in the storage device 13 or, for example, retrieved from a vehicle-external data server (not shown). Based on the specified graphical configuration, the control device 14 can ascertain the at least one light wave parameter for each light wave of the group of light waves (S2).
[0057] The control device 14 generates the interference pattern signal which describes this pattern and/or the respective light wave parameters (S3).
[0058] The generated interference pattern signal is transferred to the interference output device 16 (S4). Data communication connections 31 are shown as black connecting lines in
[0059] The light waves are controlled by the interference output device 16 on the basis of the transferred interference pattern signal (S5); to this end, for example, the specified light wave parameters can be set (S6). As a result, light wave interference is generated or provided (S7), i.e. the real image of the example operating element is output (S8).
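The chain of operations S1 through S8 described above can be sketched as a small data pipeline. All names and the frequency/amplitude encoding of the graphical configuration are illustrative assumptions; the source does not prescribe how the parameters are derived from the graphics data.

```python
from dataclasses import dataclass

@dataclass
class LightWaveParameter:
    """The parameters named in the text: frequency, amplitude, wavelength."""
    frequency: float
    amplitude: float
    wavelength: float

def ascertain_parameters(graphics_data):
    """S2: derive one parameter set per light wave from the graphical
    configuration (a stand-in mapping via c = f * lambda)."""
    return [LightWaveParameter(frequency=f, amplitude=a, wavelength=3e8 / f)
            for f, a in graphics_data]

def generate_interference_pattern_signal(parameters):
    """S3: the totality of the parameters describes the interference
    pattern; here the signal simply carries the collected parameters."""
    return {"pattern": parameters}

def interference_output_device(signal):
    """S5-S8: set the transferred parameters and output the real image
    (represented here by the number of controlled light waves)."""
    return len(signal["pattern"])

# S1: specify a graphical configuration as (frequency, amplitude) pairs.
config = [(6.0e14, 1.0), (5.5e14, 0.8)]
signal = generate_interference_pattern_signal(ascertain_parameters(config))
n_waves = interference_output_device(signal)  # S4: transfer the signal
```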
[0060] If an acoustic field is generated by the optional contact trigger device 18 (S9), the user 15 is helped with finding the operating region 22. Upon incidence on the skin, the acoustic field triggers the feeling of contact and can consequently convey, for example, to the user 15 that their hand is situated in or on the operating region 22.
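The optional contact trigger behavior (S9) could be gated on whether the tracked hand is in or on the operating region 22. The spherical region shape, radius, and return values below are simplifying assumptions for illustration.

```python
def hand_in_operating_region(hand_pos, region_center, region_radius):
    """True if the tracked hand position lies within a spherical
    operating region 22 (the region's shape is an assumption)."""
    dx, dy, dz = (h - c for h, c in zip(hand_pos, region_center))
    return (dx * dx + dy * dy + dz * dz) ** 0.5 <= region_radius

def contact_trigger(hand_pos, region_center, region_radius=0.10):
    """S9: emit the acoustic field only while the hand is in or on the
    operating region, conveying the feeling of contact to the user 15."""
    if hand_in_operating_region(hand_pos, region_center, region_radius):
        return "acoustic_field_on"
    return "acoustic_field_off"

# A hand 5 cm from the region center triggers the field; 50 cm does not.
near = contact_trigger((0.0, 0.0, 0.05), (0.0, 0.0, 0.0))
far = contact_trigger((0.0, 0.0, 0.50), (0.0, 0.0, 0.0))
```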
[0062] The interference output device 16 of the example outputs the real image of the display element 26.
[0063] The purpose of the display element 26 can be the selection of the air-conditioning function, for example. To trigger this function, provision can be made, for example, for the user 15 to be able to hold a hand (or finger(s)) 34 against or on the display element 26 for a certain amount of time, and/or to carry out, for example, a tapping gesture or a swiping gesture. An example tapping gesture that uses two fingers can effectively guard against, or even avoid, an erroneous operation. Example alternative operating functions can be those of a navigation appliance or a radio, for example.
[0064] The gesture recognition device 28 can capture the operating gesture by use of an appropriate signal from the camera 30 (S10), and can recognize the gesture with the aid of the example gesture recognition software (S11).
[0065] For example, it is possible to provide an operating concept in which a plurality of subfunctions of the triggered function, the example air-conditioning unit operation, are provided by "grasping" and/or "tapping" the display element 26 (i.e. by a grasping gesture and/or a tapping gesture) and, to this end, the individual display element 26 is replaced by four display elements 26 (central part of the figure).
[0066] A hand (or finger(s)) 34 of the user 15, which is currently able to carry out an operating gesture, is likewise shown. If the user 15 then uses this hand to operate the temperature setting, for example by a rotary movement (D), the user is able to select and adjust a specific temperature within the temperature menu. In this case, the currently active sub-function can be presented by a display element 26 that is larger than the two further display elements 26 which represent the other sub-functions (lower part of the figure).
[0067] Here, the assignment of the operating gesture to the "rotated" display element 26, and hence to the function to be activated, can be performed, for example, on the basis of the camera image. The gesture recognition device 28 can generate an operating signal (S12) which describes the operating gesture and/or the function to be triggered. The generated operating signal can then be provided (S13).
[0068] By way of example, if the example operating gesture describes a rotary movement D, it is possible to establish a corresponding rotary movement of the display element 26 (S14). Optionally, provision can be made for the gesture recognition to be able to capture a length of the rotary movement, for example, and be able to adapt a corresponding visually displayed rotary movement of the display element 26.
[0069] For this, the light wave parameters can be adapted (S15) such that the interference pattern changes and the image of the display element 26 rotates accordingly.
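Operations S12 through S15 can be sketched as follows: the recognized rotary movement D is read from the operating signal, and one parameter per light wave (here a phase offset) is adapted so that the changed interference pattern shows the rotated image. The field name and the phase-to-rotation mapping are assumptions; the source does not specify how rotation maps onto the parameters.

```python
import math

def establish_rotation(operating_signal):
    """S14: derive the rotation described by the operating gesture from
    the operating signal (the field name is illustrative)."""
    return operating_signal["rotation_rad"]

def adapt_phases(phases, rotation_rad):
    """S15: adapt one light wave parameter per wave -- here a phase
    offset proportional to the rotation -- so that the changed
    interference pattern presents the display element rotated."""
    n = len(phases)
    return [(p + rotation_rad * i / n) % (2 * math.pi)
            for i, p in enumerate(phases)]

# S12/S13: an operating signal describing a quarter-turn rotary
# movement D, applied to a group of four light waves.
signal = {"rotation_rad": math.pi / 2}
new_phases = adapt_phases([0.0, 0.0, 0.0, 0.0], establish_rotation(signal))
```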
[0073] By way of example, functions for a motor vehicle seat can be provided in this example.
[0075] Optionally, a so-called learning phase for getting used to the operating concept can be provided, within the scope of which the user 15 can learn to operate the display and operating apparatus described herein. Thus, the user 15 can train their motor memory and, for example, learn an optimum speed for rotating a display element 26, or learn where a certain function is located on the display element 26. For learning purposes, haptic feedback, for example, can be generated by the contact trigger device 18 in order to assist the user 15.
[0076] Optionally, provision can be made for the display element 26, or a plurality of display elements 26, to be adjustable for a front passenger, for example. In the process, it is possible to alter the position and/or the relative position of the display element 26, for example for as long as it is visible within a sight cone. By way of example, if the display and operating apparatus 10 is in a control panel of a motor vehicle 12, the display element 26 can be positioned for a driver in such a way that the driver's gaze falls on the display area 32 in perpendicular fashion. Expressed differently, the display area 32 can be aligned on the user 15. In the case of a large display and operating apparatus 10, a plurality of occupants, for example, can see the display area 32 and/or the display element 26 simultaneously. By way of example, a position and/or relative position can be manually adjustable by the user 15. As a result, the display and operating apparatus 10 has a very ergonomic design.
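The perpendicular-gaze condition described above can be checked geometrically: the gaze direction should be anti-parallel to the display area's normal vector within some tolerance. The tolerance value and function name are assumptions for illustration.

```python
import math

def gaze_perpendicular(display_normal, gaze_direction, tol_deg=5.0):
    """True if the user's gaze falls on the display area 32 in
    approximately perpendicular fashion, i.e. the gaze vector is
    anti-parallel to the display normal within tol_deg degrees
    (the tolerance is an assumption)."""
    dot = sum(a * b for a, b in zip(display_normal, gaze_direction))
    na = math.sqrt(sum(a * a for a in display_normal))
    nb = math.sqrt(sum(b * b for b in gaze_direction))
    angle = math.degrees(math.acos(max(-1.0, min(1.0, dot / (na * nb)))))
    return abs(angle - 180.0) <= tol_deg

# A gaze straight onto the display area satisfies the condition;
# a gaze parallel to the display area does not.
head_on = gaze_perpendicular((0.0, 0.0, 1.0), (0.0, 0.0, -1.0))
sideways = gaze_perpendicular((0.0, 0.0, 1.0), (1.0, 0.0, 0.0))
```

Repositioning the display element for a driver or front passenger would then amount to adjusting the display normal until this condition holds for that occupant's gaze.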
[0077] Overall, the example embodiments elucidate a holographic operating concept which can be based on a virtual 3D body and by use of which it is possible to navigate on this body using an input medium or operating element such as, for example, a rotary/push controller or a touchpad, or by way of gesture recognition.
[0082] Expressed differently, the example sphere can actually “float” in three-dimensional fashion in space, and functions can likewise be displayed on a back side, for example. Thus, a plurality of functions can be imaged in a small space.
[0083] According to a further example embodiment, functions can be operated blindly following a learning phase, for example using an input instrument. By way of example, a seat heater can be selected and/or set using three click gestures to the right and one click gesture in the upward direction.
[0084] A description has been provided with reference to various examples, but it will be understood that variations and modifications can be effected within the spirit and scope of the claims, which may include the phrase "at least one of A, B and C" as an alternative expression meaning that one or more of A, B and C may be used, contrary to the holding in Superguide v. DIRECTV, 358 F.3d 870, 69 USPQ2d 1865 (Fed. Cir. 2004).