OPERATING DEVICE AND METHOD FOR DETECTING A USER SELECTION OF AT LEAST ONE OPERATING FUNCTION OF THE OPERATING DEVICE

20190346966 · 2019-11-14


Abstract

One, some or all of a specified number of multiple operating functions of an operating device can be selected by a user. Each operating function is assigned to a respective finger of a hand of the user in the operating device by a controller, and a detection device is used to detect a contact of at least one of the fingers of the hand on a specified contact surface of the operating device and ascertain which of the fingers of the hand is contacting the contact surface. Each of the operating functions whose assigned finger contacts the contact surface is determined as the user selection by a controller, and the user selection is signaled to a subsequent process of the operating device using selection data which identifies each selected operating function.

Claims

1-10. (canceled)

11. A method for detecting a user selection of one or more or of all of a predetermined set of operating functions of an operating device, comprising: displaying all of the operating functions of the predetermined set, each represented by a dedicated graphical object, on a display face; assigning fingers of a hand of a user to each of the operating functions of the predetermined set, by a control device; detecting an instance of finger contact of at least one of the fingers of the hand, by a detection device, on a predetermined contact face of the operating device; determining each selection finger, among the fingers of the hand, that contacts the predetermined contact face; specifying a whole user selection, including each selected operating function corresponding to each selection finger, by the control device; signaling the whole user selection to a subsequent process of the operating device by selection data identifying each selected operating function; displaying each selected operating function respectively represented by the dedicated graphical object on the display face in the subsequent process while excluding the dedicated graphical object of any operating function excluded from the whole user selection, the dedicated graphical object of each selected operating function being displayed in a first spatial region different than a second spatial region in which the finger contact of the at least one of the fingers is detected; activating, in the subsequent process, each selected operating function; and outputting on the display face, in a sub-divided fashion, in a respective dedicated sub-region of the display face, respective functional data simultaneously for each activated operating function.

12. The method as claimed in claim 11, further comprising detecting a confirmation gesture of the hand by the detection device, and wherein the specifying of the whole user selection is performed only when the confirmation gesture of the hand is detected by the detection device.

13. The method as claimed in claim 12, wherein an overall movement of the hand together with the fingers which contact the contact face in a predetermined direction along the contact face is detected as the confirmation gesture.

14. The method as claimed in claim 13, wherein the detection device detecting the finger contact of the at least one of the fingers on the contact face is at least one of a sensor matrix with proximity sensitivity, contact-sensitive sensors and a camera.

15. The method as claimed in claim 14, wherein the determining of each selection finger that contacts the contact face includes at least one of detecting an arrangement of contact points of contacting fingers on the contact face, and a sequence of detecting the fingers of the hand by a time-of-flight camera; recognizing each finger of the hand in 3D image data of the time-of-flight camera; and determining a respective distance of each finger from the contact face.

16. The method as claimed in claim 13, wherein the determining of each selection finger that contacts the contact face includes at least one of detecting an arrangement of contact points of contacting fingers on the contact face, and a sequence of detecting the fingers of the hand by a time-of-flight camera; recognizing each finger of the hand in 3D image data of the time-of-flight camera; and determining a respective distance of each finger from the contact face.

17. The method as claimed in claim 12, wherein the detection device detecting the finger contact of the at least one of the fingers on the contact face is at least one of a sensor matrix with proximity sensitivity, contact-sensitive sensors and a camera.

18. The method as claimed in claim 12, wherein the determining of each selection finger that contacts the contact face includes at least one of detecting an arrangement of contact points of contacting fingers on the contact face, and a sequence of detecting the fingers of the hand by a time-of-flight camera; recognizing each finger of the hand in 3D image data of the time-of-flight camera; and determining a respective distance of each finger from the contact face.

19. The method as claimed in claim 11, wherein the detection device detecting the finger contact of the at least one of the fingers on the contact face is at least one of a sensor matrix with proximity sensitivity, contact-sensitive sensors and a camera.

20. The method as claimed in claim 11, wherein the determining of each selection finger that contacts the contact face includes at least one of detecting an arrangement of contact points of contacting fingers on the contact face, and a sequence of detecting the fingers of the hand by a time-of-flight camera; recognizing each finger of the hand in 3D image data of the time-of-flight camera; and determining a respective distance of each finger from the contact face.

21. An operating device for detecting a whole user selection of one or more or of all of a predetermined set of operating functions of the operating device, comprising: a display face initially displaying all of the operating functions of the predetermined set, each operating function represented by a dedicated graphical object; a contact face; a detection device detecting an instance of finger contact of at least one finger of a hand of a user on the contact face of the operating device and determining each selection finger, among the fingers of the hand, that contacts the contact face; and a control device configured to assign the fingers of the hand to each of the operating functions of the predetermined set, specify the whole user selection, including each selected operating function corresponding to each selection finger, signal the whole user selection to a subsequent process of the operating device by selection data identifying each selected operating function, display each selected operating function respectively represented by the dedicated graphical object on the display face in the subsequent process while excluding the dedicated graphical object of any operating function excluded from the whole user selection, the dedicated graphical object of each selected operating function being displayed in a first spatial region different than a second spatial region in which the finger contact of the at least one of the fingers is detected, activate, in the subsequent process, each selected operating function, and output on the display face in a respective dedicated sub-region of the display face, respective functional data simultaneously for each activated operating function.

22. A motor vehicle, comprising: a chassis; and an operating device, including a display face initially displaying all of the operating functions of the predetermined set, each operating function represented by a dedicated graphical object; a contact face; a detection device detecting an instance of finger contact of at least one finger of a hand of a user on the contact face of the operating device and determining each selection finger, among the fingers of the hand, that contacts the contact face; and a control device configured to assign the fingers of the hand to each of the operating functions of the predetermined set, specify the whole user selection, including each selected operating function corresponding to each selection finger, signal the whole user selection to a subsequent process of the operating device by selection data identifying each selected operating function, display each selected operating function respectively represented by the dedicated graphical object on the display face in the subsequent process while excluding the dedicated graphical object of any operating function excluded from the whole user selection, the dedicated graphical object of each selected operating function being displayed in a first spatial region different than a second spatial region in which the finger contact of the at least one of the fingers is detected, activate, in the subsequent process, each selected operating function, and output on the display face in a respective dedicated sub-region of the display face, respective functional data simultaneously for each activated operating function.

Description

BRIEF DESCRIPTION OF THE DRAWINGS

[0021] In the text which follows, an exemplary embodiment is described that makes the aspects and advantages more apparent and more readily appreciated, taken in conjunction with the accompanying drawings of which:

[0022] FIG. 1 is a schematic illustration providing a block diagram and front view of an embodiment of the operating device during the display of graphical objects which each represent a selectable operating function of the operating device;

[0023] FIG. 2 is a schematic illustration providing a block diagram and front view of the operating device in FIG. 1 during the detection of a plurality of instances of finger contact on a contact face;

[0024] FIG. 3 is a schematic illustration providing a block diagram and front view of the operating device during the detection of a confirmation gesture;

[0025] FIG. 4 is a schematic perspective view of a hand of a user during a selection of two other operating functions of the operating device; and

[0026] FIG. 5 is a schematic illustration of a display face of the operating device, which can be operated in a sub-divided fashion in such a way that a plurality of operating functions can output respective functional data simultaneously.

DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENT

[0027] An exemplary embodiment is described in the text which follows. In the exemplary embodiment, the described components of the embodiment each represent individual features which can be considered independently of one another and which each also may be used independently of one another and can therefore also be considered individually or in another combination than that shown. Furthermore, other features than those already described can also be added to the described embodiment.

[0028] In the figures, functionally identical elements are respectively provided with the same reference symbols.

[0029] FIG. 1 shows an operating device 1 which can be made available, for example, in a motor vehicle 2 or (not illustrated) in a portable mobile terminal, for example a smartphone or a tablet PC. The operating device 1 can have a display device 3 with a display face 4. The display device 3 can be, for example, a screen or a head-up display. A display content on the display face 4 can be controlled by a control device 5 which can be implemented, for example, on the basis of a microcontroller or a microprocessor. The control device 5 can control, for example, device components 6, for example vehicle components of the motor vehicle 2, by a corresponding control signal 7. The device components 6 are illustrated in FIG. 1 merely by a single element. For each of the device components 6, the control device 5 can make available a respective operating function 8 by which, for example, a graphical user interface for operating the respective device component or a control function for triggering a device function of the respective device component can be implemented. The control device 5 can represent the operating functions 8 by corresponding graphical data 9 on the display face 4 of the display device 3, respectively by a graphical object 10, for example an icon or lettering, or a menu entry.

[0030] The operating device 1 now makes it possible for a user to select a plurality of the graphical objects 10 simultaneously and, as a result, to make a selection, for example in a menu, of the operating functions 8 which are to be offered to the user on the display face 4, that is to say a selection of only those graphical objects 10 which are still to be displayed. It is also possible to implement simultaneous activation of a plurality of the operating functions 8 by a corresponding simultaneous selection of a plurality of the graphical objects 10. In order to recognize or detect the user selection, the operating device 1 can have a detection device 11 which can include, for example, a camera 12, in particular a TOF camera, and/or a proximity-sensitive and/or contact-sensitive sensor matrix 13. The sensor matrix 13 can be made available, for example, on the display face 4, that is to say in this case the display device 3 is a touchscreen. A detection range 14 of the camera 12 can also be directed toward the display face 4.

[0031] FIG. 2 illustrates how the control device 5 detects that the user contacts the display face 4 with fingers 15 of his hand 16. The display face 4 constitutes the contact face in this case. The control device 5 assigns the fingers 15 of the hand 16 respectively to the operating functions 8. Accordingly, one of the fingers is also respectively assigned to each of the graphical objects 10. A possible assignment is illustrated in FIG. 2 by corresponding letters A, B, C, D, E.
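Purely as an illustration, the finger-to-function assignment described above could be sketched as follows. All identifiers and the example function names are hypothetical; the patent does not prescribe any particular implementation:

```python
# Illustrative sketch of assigning each operating function to one finger
# of the hand, as in FIG. 2. Function names are examples only.

OPERATING_FUNCTIONS = ["radio", "navigation", "media", "climate", "seat"]
FINGERS = ["A", "B", "C", "D", "E"]  # thumb through little finger, per FIG. 2


def assign_fingers(functions, fingers):
    """Assign each operating function of the predetermined set to one finger."""
    if len(functions) > len(fingers):
        raise ValueError("more operating functions than fingers available")
    return dict(zip(fingers, functions))


assignment = assign_fingers(OPERATING_FUNCTIONS, FINGERS)
# e.g. finger A (thumb) is assigned to "radio", finger B to "navigation"
```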

[0032] The detection device 11 is able to detect which finger 15 the user uses to contact the contact face in the form of the display face 4. In the example shown, these are the fingers A, B, that is to say the thumb and the index finger, for example.

[0033] Corresponding instances of finger contact 18 can be detected, for example, on the basis of 3D image data of the camera 12 and/or contact data 17 of the sensor matrix 13 by the detection device 11 and signaled to the control device 5. It is also signaled which of the fingers 15, that is to say the fingers A, B here, carry out the instances of finger contact 18 on the contact face in the form of the display face 4.
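One conceivable way to determine the selection fingers, following the variant in the claims that evaluates a respective distance of each finger from the contact face in TOF-camera data, is sketched below. The threshold value and data shapes are assumptions for illustration only:

```python
# Illustrative sketch: decide which fingers contact the contact face from
# per-finger distances, e.g. estimated from 3D image data of a TOF camera.

CONTACT_THRESHOLD_MM = 2.0  # assumed: a finger this close counts as touching


def selection_fingers(finger_distances_mm):
    """Return the labels of all fingers contacting the contact face."""
    return [finger for finger, dist in finger_distances_mm.items()
            if dist <= CONTACT_THRESHOLD_MM]


# Example: thumb (A) and index finger (B) touch, as in FIG. 2.
touching = selection_fingers({"A": 0.0, "B": 1.5, "C": 40.0, "D": 55.0, "E": 60.0})
# touching now contains ["A", "B"]
```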

[0034] FIG. 3 illustrates how the user confirms, by a confirmation gesture 19, the selection of the operating functions 8 selected by the instances of finger contact 18. The control device 5 deletes the other, non-selected graphical objects 10 on the display face 4 by adapting the corresponding graphical data 9. It is then possible to provide that only an operating menu which has been adapted in this way is then displayed to the user in future on the display face 4.

[0035] The confirmation gesture can be, for example, a movement of the hand together with the fingers 15 along a predetermined movement direction, for example upward in FIG. 3. When the confirmation gesture 19 is detected, selection data 20 which describe which of the operating functions 8 the user has selected are generated by the control device 5.
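The confirmation step could be sketched as follows: the selection only becomes effective once an overall movement of the hand in a predetermined direction along the contact face is detected, after which selection data identifying each selected operating function are generated. Directions, thresholds, and coordinate conventions are assumptions for illustration:

```python
# Illustrative sketch of the confirmation gesture 19 and selection data 20.

UP = (0.0, -1.0)       # assumed screen-up direction (y decreasing)
MIN_TRAVEL_PX = 50.0   # assumed minimum travel to count as a swipe


def is_confirmation_gesture(start, end, direction=UP, min_travel=MIN_TRAVEL_PX):
    """Check that the hand moved far enough along the predetermined direction."""
    dx, dy = end[0] - start[0], end[1] - start[1]
    travel = dx * direction[0] + dy * direction[1]  # projection onto direction
    return travel >= min_travel


def make_selection_data(assignment, touching_fingers):
    """Selection data identifying each selected operating function."""
    return [assignment[finger] for finger in touching_fingers]


assignment = {"A": "radio", "B": "navigation"}
if is_confirmation_gesture(start=(100, 300), end=(102, 180)):
    selection_data = make_selection_data(assignment, ["A", "B"])
    # selection_data identifies the operating functions of fingers A and B
```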

[0036] FIG. 4 illustrates the finger position 21 with which the user could select two different operating functions 8 on the operating face, namely those operating functions 8 which are assigned to the fingers A, E.

[0037] FIG. 5 illustrates, as an alternative to the configuration of an operating menu, how the selected operating functions 8, as described by the selection data 20, can be activated simultaneously. In accordance with the number of selected operating functions, a sub-face or sub-region 22 is made available on the display face 4 for each operating function, in which sub-region 22 each activated operating function 8 respectively outputs functional data, for example displays status data or makes available contact faces or operating faces for receiving user inputs for the respective operating function 8. FIG. 5 shows by way of example how four operating functions 8 can be displayed by the display device 3. For example, an air conditioning setting 23, a navigation operating device 24, a media playback operating device 25 for playing back, for example, music files S1, S2, and a seat-setting function 26 for setting a sitting position can be represented as a respective operating function 8 simultaneously on the display face 4 in a respective sub-region 22.
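The sub-division of the display face per activated operating function could be sketched as below. A simple equal-width column split is assumed purely for illustration; the 2x2 grid of FIG. 5 or any other layout would work in the same way:

```python
# Illustrative sketch: give each activated operating function a dedicated
# sub-region 22 of the display face 4, as described for FIG. 5.


def subdivide(display_width, display_height, active_functions):
    """Return one (x, y, width, height) sub-region per activated function."""
    count = len(active_functions)
    col_width = display_width / count
    return {func: (i * col_width, 0, col_width, display_height)
            for i, func in enumerate(active_functions)}


regions = subdivide(1280, 480, ["climate", "navigation", "media", "seat"])
# each of the four functions receives a 320-pixel-wide column of the display
```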

[0038] Therefore, the operating device 1 permits a multi-contact gesture operating system or multi-touch gesture operating system for menu selection of a plurality of operating functions 8. The gesture operating system on the contact-sensitive display face 4 of a touchscreen 3, or else on a virtual contact level or a contact level of a touchpad, serves to detect a desired arrangement of the menu structure or to simultaneously activate a plurality of operating functions.

[0039] In the case of the menu arrangement it is possible, as illustrated in FIG. 1, initially to display a standard orientation of the individual menu points in the form of a respective graphical object 10, for example for radio or navigation. FIG. 2 illustrates how the selection is made by detection of instances of finger contact 18 or finger taps, wherein the number and the finger position are determined. FIG. 3 shows how all the menu items in the form of the graphical objects 10 apart from the selected ones are removed by sliding upward or by an alternative direction of movement.

[0040] The described gestures can be detected on the touchscreen by the sensor matrix 13 or can be freely detected in the air by contactless recognition by a TOF camera. Corresponding program modules can be adapted with little expenditure to the respective configuration of the operating device 1.

[0041] Overall, the example shows how a multi-touch gesture operating system provides for menu selection.

[0042] A description has been provided with particular reference to preferred embodiments thereof and examples, but it will be understood that variations and modifications can be effected within the spirit and scope of the claims which may include the phrase at least one of A, B and C as an alternative expression that means one or more of A, B and C may be used, contrary to the holding in Superguide v. DIRECTV, 358 F3d 870, 69 USPQ2d 1865 (Fed. Cir. 2004).