CONTROL SYSTEM AND METHOD FOR CONTROLLING A VEHICLE

20210252979 · 2021-08-19

    Abstract

    An operating system for a motor vehicle has a steering wheel (10) comprising a steering-wheel rim, a first input device (14) assigned to the steering wheel (10), a second input device (16) assigned to the steering wheel (10), and a processing unit (18) which is connected in a signal-transmitting manner to the first input device (14) and to the second input device (16). The operating system is characterized in that the first input device (14) is spatially separated from the second input device (16), the first input device (14) being assigned to a first side of the steering wheel, and the second input device (16) being assigned to a second side of the steering wheel opposite to the first side, the first input device (14) and the second input device (16) being operatively coupled to the processing unit (18) such that a main function (38) can at least be preselected via the first input device (14), and a subfunction (44) assigned to the main function (38) being adapted to be controlled via the second input device (16).

    Claims

    1. An operating system for a motor vehicle, having a steering wheel (10) for steering the motor vehicle, which comprises a steering wheel rim (22) and at least one steering wheel spoke (24), a first input device (14) assigned to the steering wheel (10), a second input device (16) assigned to the steering wheel (10), and a processing unit (18) which is connected in a signal-transmitting manner to the first input device (14) and to the second input device (16), the processing unit (18) being configured to detect a manual input on the first input device (14) and a manual input on the second input device (16) and to convert them into a control of a vehicle function (38, 44), characterized in that the first input device (14) is spatially separated from the second input device (16), the first input device (14) being assigned to a first side (28) of the steering wheel (10), and the second input device (16) being assigned to a second side (30) of the steering wheel (10) opposite to the first side (28), the first input device (14) and the second input device (16) being operatively coupled to the processing unit (18) such that a main function (38) can at least be preselected via the first input device (14), and a subfunction (44) assigned to the main function (38) can be controlled via the second input device (16).

    2. The operating system according to claim 1, wherein the processing unit (18) is arranged to make a distinction between and recognize a choice of the main function (38) and a selection of the main function (38), the choice and/or the selection of the main function (38) being performed by at least one input via the first input device (14).

    3. The operating system according to claim 1, wherein the first input device (14) and/or the second input device (16) comprise(s) a capacitive sensor system, an optical sensor system, a force sensor system and/or a mechanical sensor system.

    4. The operating system according to claim 1, wherein at least a haptic, an optical and/or an acoustic feedback device (34) are/is assigned to the first input device (14) and/or to the second input device (16).

    5. The operating system according to claim 1, wherein at least the first input device (14) and the processing unit (18) are configured to recognize a number of points of contact (42) on the first input device (14) and to cause a change in one input parameter depending on the number of recognized points of contact (42).

    6. The operating system according to claim 1, wherein the first input device (14) and/or the second input device (16) comprise(s) an internal display device.

    7. The operating system according to claim 1, wherein a display device (20) is provided, which is formed separately from the first input device (14) and the second input device (16) and which is connected to the processing unit (18).

    8. The operating system according to claim 1, wherein the processing unit (18) is arranged to generate a graphical user interface (41, 46) which comprises a menu including menu items which are at least adapted to be preselected via the first input device (14).

    9. The operating system according to claim 1, wherein the first and/or the second input device (14, 16) are/is configured in a web-like or rocker-like manner.

    10. The operating system according to claim 1, wherein the processing unit (18) is arranged to make a distinction between different input gestures and to control different main functions (38) and/or subfunctions (44) depending on the input gestures.

    11. A method for operating a motor vehicle, comprising the steps of: detecting at least one finger of an operator on a first input device (14) which is arranged on a steering wheel (10) and is assigned to a first side (28) of the steering wheel (10), preselecting a main function (38) by means of the at least one detected finger, providing a subfunction (44) assigned to the preselected main function (38) on a second input device (16) which is arranged on the steering wheel (10) and is assigned to a second side (30) of the steering wheel (10) which is opposite to the first side (28), and operating the second input device (16) to control a vehicle function (38, 44) depending on the subfunction (44) assigned to the preselected main function (38).

    12. The method according to claim 11, characterized in that the main function (38) is assigned to the at least one detected finger due to the relative location or the absolute position of the at least one finger on the first input device (14).

    13. The method according to claim 11, wherein the preselected main function (38) is confirmed via the subfunction (44) assigned thereto, and/or wherein a (pre)selection of a main function (38) and/or a subfunction (44) is performed via one or more input gestures.

    14. The method according to claim 11, wherein the preselected main function (38) is confirmed automatically after the expiration of a time period and/or due to an exerted pressure, in particular wherein further fingers resting previously on the first input device (14) are removed for confirming the selection.

    15. The method according to claim 11, wherein at least one icon which is assigned to the preselected main function (38) and/or the subfunction (44) is represented on a display device (20).

    Description

    [0068] Further advantages and properties of the invention will become apparent from the following description and from the accompanying drawings, to which reference is made and in which:

    [0069] FIG. 1 shows a schematic representation of a cockpit of a motor vehicle,

    [0070] FIG. 2 shows a schematic representation of an operating system of the motor vehicle according to the invention as shown in FIG. 1,

    [0071] FIG. 3 shows a schematic representation of a first embodiment of the operating system according to the invention as shown in FIG. 2,

    [0072] FIG. 4 shows a schematic representation of a second embodiment of the operating system according to the invention as shown in FIG. 2,

    [0073] FIG. 5 shows a schematic representation of a third embodiment of the operating system according to the invention as shown in FIG. 2,

    [0074] FIG. 6 shows a schematic representation of a fourth embodiment of the operating system according to the invention as shown in FIG. 2, and

    [0075] FIG. 7 shows a detailed view of an operating variant of the operating system according to the invention as shown in FIG. 6.

    [0076] FIG. 1 shows a cockpit of a vehicle.

    [0077] The vehicle has various vehicle components, such as a steering wheel 10, a driver's seat, a passenger seat, a dashboard, a center console and further components.

    [0078] In addition, an operating system 12 is provided via which vehicle functions of the vehicle, such as an entertainment system and a navigation system, among others, can be controlled.

    [0079] In the example embodiment shown, the operating system 12 comprises two first input devices 14, two second input devices 16, at least one processing unit 18, and a plurality of external display devices 20.

    [0080] For example, the processing unit 18 comprises at least one processor, a data memory, a working memory, and/or the like. The processing unit 18 may in particular also comprise a plurality of processors.

    [0081] The first input devices 14 and the second input devices 16 are connected to the processing unit 18 in a signal-transmitting manner, so that an input via the first input devices 14 and/or the second input devices 16 can be processed by the processing unit 18.

    [0082] In addition, the processing unit 18 is connected to the plurality of display devices 20 in a signal-transmitting manner so that corresponding information provided by the processing unit 18 can be displayed.

    [0083] The first input devices 14 are arranged on a rear side of the steering wheel 10, that is, facing away from an operator, whereas the second input devices 16 are arranged on a front side of the steering wheel 10, that is, facing the operator.

    [0084] Accordingly, the first input devices 14 and the second input devices 16 are provided on opposite sides of the steering wheel 10.

    [0085] In the embodiment shown, the first input devices 14 have a web-like configuration, which can be seen more clearly in FIG. 2. In addition, the first input devices 14 have touch-sensitive surfaces, for example capacitive surfaces, via which a manual input can be made.

    [0086] The second input devices 16 also have touch-sensitive surfaces, for example capacitive surfaces, via which a manual input can also be made.

    [0087] Two screens 20.1, 20.2 in the dashboard are provided as external display devices 20 in FIG. 1, and a screen of a head-up display 20.3 (HUD) also serves as a display device 20.

    [0088] However, the number of input devices 14, 16 is to be understood as exemplary only. The operating system 12 may likewise be implemented with only one first input device 14 and/or only one second input device 16 or any other number of first and/or second input devices 14, 16.

    [0089] The number of external display devices 20 is also to be understood as exemplary. The operating system 12 may likewise have no or any other number of external display devices 20.

    [0090] Inputs which are processed by the processing unit 18 and can optionally be displayed on at least one of the external display devices 20 can be made by an operator via the input devices 14, 16.

    [0091] FIG. 2 shows the operating system 12 in more detail. For clarity, the operating system 12 is shown here with only one external display device 20.

    [0092] In the embodiment shown, the steering wheel 10 has a steering wheel rim 22, three steering wheel spokes 24 and a center section 26 in which, for example, an airbag can be accommodated.

    [0093] The steering wheel 10, in particular the steering wheel rim 22, defines a plane having a first side 28 and a second side 30 opposite the first side 28.

    [0094] Here, the first side 28 corresponds to a rear side of the steering wheel 10 facing away from an operator, and the second side 30 corresponds to a front side of the steering wheel 10 facing the operator.

    [0095] As already explained, the first input devices 14 have touch-sensitive surfaces which are assigned to the first side 28, i.e. have their active surface facing away from the operator. In this respect, the first input devices 14 are mounted on the rear side of the steering wheel 10, in particular on the steering wheel spokes 24. The first input devices 14 may be attached to the steering wheel spokes 24 as separately formed switching elements. The first input devices 14 may also be integrated in the steering wheel spokes 24.

    [0096] The design and attachment of the first input devices 14 is to be understood as exemplary only. The first input devices 14 may have any other shape, mode of operation, or positioning.

    [0097] In particular, the first input devices 14 may also be arranged on the steering wheel rim 22 or on a base of the steering wheel 10, wherein they are assigned to the first side 28, i.e. the rear side of the steering wheel 10, or wherein their operating surface faces away from the operator or the driver.

    [0098] In other words, one or both of the first input devices 14 may be arranged at a different location of the steering wheel 10, and for example be integrated in the steering wheel spoke 24 or in the steering wheel rim 22.

    [0099] It is also conceivable that one or both of the first input devices 14 is/are mounted on a (substantially) immovable part of the cockpit, so as to not rotate with the steering wheel 10. This is also referred to as a vehicle-fixed arrangement of the first input devices 14, as the absolute position of the first input devices 14 is maintained, regardless of the steering wheel position.

    [0100] For example, one or both of the first input devices 14 may include a capacitive sensor system, an optical sensor system, a force sensor system, or a mechanical sensor system. A combination of different sensor systems per input device 14 is also possible. Furthermore, the first input devices 14 may each comprise different sensor systems.

    [0101] Instead of or in addition to the touch-sensitive surface, one or both of the first input devices 14 may, for example, comprise one or more mechanical buttons, mechanical wheels or the like. Accordingly, the shape may of course also differ from a web-like shape.

    [0102] It may further be provided that a haptic, optical and/or acoustic feedback device 34 is assigned to the first input devices 14 and/or the second input devices 16.

    [0103] The haptic feedback by the feedback device 34 may take place actively, for example by an actuator controlled by the processing unit 18, in particular an electric motor.

    [0104] Additionally or alternatively, the haptic feedback may take place passively, for example by structural components arranged on the first input device 14 and/or the second input device 16, such as recesses, grooves or protrusions, which are haptically sensed by the operator of the operating system 12 when the operator actuates the corresponding input device 14, 16.

    [0105] The properties, features, and variations of the first input devices 14 described above are generally equally applicable to the second input devices 16.

    [0106] One or both of the second input devices 16 may comprise an internal display device. The latter can, for example, be designed to be touch-sensitive and thus form a touch screen.

    [0107] The processing unit 18 is in signal or data connection 32 with the steering wheel 10, in particular with the input devices 14, 16, and the external display device 20. The signal or data transmission may be wired or wireless.

    [0108] The external display device 20 can be used to visualize inputs via the first input devices 14 and/or the second input devices 16.

    [0109] In the following, the general operation of the operating system 12 will be explained. FIGS. 3 to 7 describe special variants of the operation of the operating system 12.

    [0110] To operate the operating system 12, the operator makes an input at one of the first input devices 14 with one or more fingers by placing the at least one finger on the touch-sensitive surface.

    [0111] The input causes a main function to be at least preselected via the processing unit 18.

    [0112] A preselection corresponds here to a choice of the main function. The preselection can be confirmed in a further input step, which corresponds to a selection of the main function. The processing unit 18 is able to make a distinction between and recognize a choice of the main function and a selection of the main function.

    [0113] In the case of a selection, the choice must additionally be confirmed. The confirmation can, for example, take place after the expiration of a time period or via a further input, in particular an input via the second input device 16.
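    The distinction the processing unit 18 makes between a choice (preselection) and a selection (confirmed choice) can be sketched as a small state machine. The state names, method names and the dwell period below are illustrative assumptions, not taken from the description:

```python
from enum import Enum, auto

class State(Enum):
    IDLE = auto()
    CHOSEN = auto()      # "choice": a main function is merely preselected
    SELECTED = auto()    # "selection": the choice has been confirmed

class MainFunctionSelector:
    DWELL_S = 1.5  # assumed confirmation period; not specified in the text

    def __init__(self):
        self.state = State.IDLE
        self.function = None

    def touch(self, function):
        # placing a finger on the first input device chooses a main function
        self.state = State.CHOSEN
        self.function = function

    def confirm(self):
        # a further input (e.g. via the second input device) selects it
        if self.state is State.CHOSEN:
            self.state = State.SELECTED

    def dwell(self, seconds):
        # alternatively, the choice is confirmed after a time period
        if self.state is State.CHOSEN and seconds >= self.DWELL_S:
            self.state = State.SELECTED
```

    The point of the two-stage model is that a mere choice remains revocable, while a selection hands the associated subfunctions over to the second input device.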

    [0114] By (pre)selecting the main function via the first input device 14, a certain set of subfunctions specific to the preselected main function can be controlled. The main function thus represents a categorization or a menu item in which subfunctions that are usual for and specific to the category or menu item are classified.

    [0115] The subfunctions specific to the main function are provided to the second input device 16 after the (pre)selection of the main function by the first input device 14 and can be controlled by an input via the second input device 16.

    [0116] The provision of the subfunctions assigned to the main function can (additionally) be carried out visually by displaying the available subfunctions on the internal display device of the second input device 16 and/or on the external display device 20. This is also clearly shown in FIG. 3.

    [0117] Optionally, the first input device 14 can also perform a kind of “shift” function. An actuation of the first input device 14, for example a tilting (similar to shift paddles), a pressing, a touching or the like, thus makes it possible to change to a shift mode; when the actuation is terminated, the shift mode can either be maintained or be terminated, the latter switching back to the default mode. A further actuation of the first input device 14 may then terminate the shift mode or change to another shift mode.

    [0118] For example, other main functions and/or subfunctions can be provided for a (pre)selection in the shift mode. Additionally or alternatively, an input parameter may be changed in the shift mode, e.g. incremental scrolling may be provided in the default mode and faster scrolling in the shift mode.

    [0119] Thus, the “shift” function can be used to switch between different sets of main functions or subfunctions.
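    The switching between different sets of main functions or subfunctions via the “shift” function might be modeled as follows; the cycling behavior on repeated actuation and the example function sets are assumptions for illustration only:

```python
class ShiftableInput:
    """Illustrative model of the "shift" function of paragraph [0117]."""

    def __init__(self, default_set, shift_sets):
        # index 0 is the default mode; further indices are shift modes
        self.sets = [default_set] + shift_sets
        self.mode = 0

    def actuate(self):
        # each actuation advances to the next shift mode and wraps
        # back to the default mode after the last one
        self.mode = (self.mode + 1) % len(self.sets)

    def available_functions(self):
        # the set of (pre)selectable functions depends on the mode
        return self.sets[self.mode]
```

    With one shift set, a second actuation returns the device to its default function set, matching the described switch back to the default mode.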

    [0120] Therefore, the core idea of the operating system 12 or the method for operating the operating system 12 is the combined operation of the first input devices 14 and the second input devices 16.

    [0121] Accordingly, the first input devices 14 are operatively coupled to the second input devices 16.

    [0122] The input for a (pre)selection of the main functions only via one of the first input devices 14 has been described above. In this variant, both first input devices 14 are interdependently coupled, i.e. the operation of one input device 14 has an influence on the operation of the other input device 14.

    [0123] For example, the operator performs a scrolling through the main functions via the right-hand first input device 14, which however must be interrupted, for example, due to a necessary shifting operation using a gearshift lever. The operator can then continue scrolling at the point where scrolling was interrupted using the left-hand first input device 14. Therefore, both first input devices 14 act together here as a common first input device 14.
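    The interdependent coupling, in which both first input devices 14 act together as one common input device sharing a single scroll position, could be sketched like this; the class name and the side argument are hypothetical:

```python
class CoupledScroller:
    """Sketch of paragraph [0123]: scrolling interrupted on one first
    input device can be continued on the other from the same position."""

    def __init__(self, main_functions):
        self.items = main_functions
        self.index = 0  # one shared cursor for both first input devices

    def scroll(self, side, steps):
        # 'side' ("left"/"right") is informational only: scrolling on
        # either device moves the same shared cursor
        self.index = (self.index + steps) % len(self.items)
        return self.items[self.index]
```

    Because the cursor is shared state, a scroll begun on the right-hand device and interrupted (e.g. for a gearshift) resumes seamlessly on the left-hand device.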

    [0124] The properties described above may also apply to the second input devices 16.

    [0125] As an alternative variant, the first input devices 14 may also be operated independently of each other. This means that the first input devices 14 are each assigned a mutually identical or different set of main functions, from which a main function can be (pre)selected independently of one another by a respective one of the two first input devices 14. For example, the left-hand first input device 14 is assigned only main functions relating to the “media” area, such as “telephone”, “radio”, “music”, etc., whereas the right-hand first input device 14 is assigned only main functions relating to other vehicle functions of the vehicle, such as “air conditioning”, “navigation”, “system settings”, etc.

    [0126] The properties described above may also apply to the second input devices 16.

    [0127] FIGS. 3 to 7 show specific variants of the operation of the operating system 12.

    [0128] Here, the first input devices 14 have touch-sensitive surfaces and the second input devices 16 have touch-sensitive display surfaces (touch screens) that can be operated independently of each other.

    [0129] In addition, the input devices 14, 16 are coupled to the external display device 20.

    [0130] It goes without saying that the embodiments shown in FIGS. 3 to 7 are to be understood as examples only, and that the individual components of the operating system 12 may also have other properties, features and modes of operation—as explained in the foregoing.

    [0131] In a first operating variant of the operating system 12 according to FIG. 3, a main function set 36 is respectively assigned to a first input device 14, the main function sets 36 each having five mutually identical main functions 38, for example “climate control”, “music”, “navigation”, “telephone” and “assistance systems”, in particular “driving assistance systems”.

    [0132] Furthermore, different driving modes can for example be controlled by the first input device 14 and/or the second input device 16.

    [0133] Here, one of the first input devices 14 is respectively subdivided virtually and optionally structurally into sections 40 corresponding to the number of main functions 38 in the main function set 36, for example by passive haptic feedback devices such as recesses and/or elevated parts.

    [0134] Optionally, this may also be the case for the second input device 16.

    [0135] The processing unit 18 is arranged to generate a graphical user interface 41 on the external display device 20 comprising the main function sets 36 as a menu with the individual main functions 38 as menu items.

    [0136] As soon as the operator places a finger on the first input device 14, the processing unit 18 detects a point of contact 42 on the touch-sensitive surface via the first input device 14.

    [0137] Depending on the section 40 in which the point of contact 42 is located, a main function 38 assigned to the respective section 40 is chosen, as shown in FIG. 3 by a plurality of marking lines around the corresponding main function 38.

    [0138] Subfunctions 44 specific to the main function 38 can be assigned to the second input device 16 already in a chosen state of a main function 38 and optionally be displayed by the second input device 16, in particular as a preview, for example on the integrated display device on the front side of the steering wheel 10.

    [0139] Alternatively, the subfunctions 44 specific to the main function 38 may be assigned to the second input device 16 and optionally be displayed by the second input device 16 only after a selection of the corresponding main function 38.

    [0140] The selection of a chosen main function 38 may take place, for example, by removing the finger from the section 40 of the associated chosen main function 38, by making a specific gesture, such as “tap” or “press”, in particular with a higher force, or by remaining in the chosen state for a specific period of time.

    [0141] Several fingers may also rest on the first input device 14 and thus create several points of contact 42 in different sections 40.

    [0142] The processing unit 18 can then detect how many fingers are resting on the input device 14, i.e. on the touch-sensitive surface, and in which sections 40 the points of contact 42 are located.

    [0143] A main function 38 assigned to the respective section 40 is chosen depending on which sections 40 the individual points of contact 42 are located in.

    [0144] Here, a selection of a main function 38 can also be made by removing the fingers from the sections 40 assigned to the main functions 38 that are not to be selected, as described above.

    [0145] Thus, the (pre)selection of the main functions 38 depends on an absolute location of the point of contact 42 on the first input device 14. In other words, the main function 38 is assigned to the at least one detected finger based on the absolute position of that finger on the first input device 14.
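    The mapping of absolute touch positions to sections 40, and hence to main functions 38, can be illustrated with a minimal sketch. The normalized coordinate, the pad length and the five example functions follow FIG. 3 only loosely; everything else is assumed:

```python
MAIN_FUNCTIONS = ["climate control", "music", "navigation",
                  "telephone", "assistance systems"]

class SectionedPad:
    """Illustrative absolute-position mapping of paragraph [0145]."""

    def __init__(self, functions, length=1.0):
        # the pad is virtually subdivided into one section per function
        self.functions = functions
        self.section_len = length / len(functions)

    def section_of(self, y):
        # map an absolute touch position (0..length) to a section index
        return min(int(y / self.section_len), len(self.functions) - 1)

    def chosen(self, touch_points):
        # one choice per point of contact, keyed by its section
        return {self.functions[self.section_of(y)] for y in touch_points}
```

    A selection by removing fingers, as in paragraph [0144], then corresponds to calling `chosen` again with only the remaining point of contact.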

    [0146] The confirmation or selection of a chosen main function 38 may also be performed via the second input device 16.

    [0147] It may be provided that a particular gesture and/or a determined number of points of contact 42 may initiate a change in the menu of the graphical user interface 41.

    [0148] For example, a “swipe” of the finger across a plurality of sections 40 may cause the currently listed main functions 38 to be replaced by other main functions 38.

    [0149] By (pre)selecting a main function 38, subfunctions 44 specific to the main function 38 are assigned to the second input device 16. Optionally, the subfunctions 44 may be displayed on the second input device 16 such that a graphical user interface 46 is created on the second input device 16 by the processing unit 18.

    [0150] A subfunction 44 can then be controlled, preferably with only one finger (thumb) by a gesture, for example “tap” or “press”.

    [0151] The subfunctions 44 shown in FIG. 3 are all individual functions.

    [0152] An individual function can be executed after a gesture, for example “tap”, or during a gesture, for example “tap and hold”. However, it is not possible to perform another gesture during the input process to control another individual function.

    [0153] Two or more individual functions can be combined in a collective function. Accordingly, a collective function has two or more individual functions, wherein a first individual function is performed during the execution of a first gesture (e.g. “pull to the right”) and optionally a second, e.g. opposite individual function can be executed during the same input process by a second, e.g. opposite input gesture (e.g. “pull to the left”). A collective function is shown in more detail in FIG. 7, to which reference will be made below.
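    A collective function, in which opposing gestures within one input process drive two opposing individual functions, might look as follows; the gesture names, the volume range and the step size are illustrative assumptions:

```python
def collective_volume(gestures, volume=5, lo=0, hi=10):
    """Sketch of a collective function per paragraph [0153]: within one
    input process, a first gesture and its opposite gesture execute two
    opposing individual functions (here: volume up/down)."""
    for g in gestures:
        if g == "pull_right":            # first individual function
            volume = min(hi, volume + 1)
        elif g == "pull_left":           # opposite individual function
            volume = max(lo, volume - 1)
    return volume
```

    Unlike an individual function, the input process need not end between the opposing gestures; the sequence of gestures is processed as one continuous interaction.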

    [0154] It may be provided that a particular gesture initiates a change in the menu of the graphical user interface 46. For example, “circling” the thumb over a plurality of subfunctions 44 may cause the currently listed subfunctions 44 to be replaced by other subfunctions 44.

    [0155] Only the operation using a first input device 14 and a second input device 16 has been discussed above. Operation can of course also be performed simultaneously on both first input devices 14 and both second input devices 16.

    [0156] In this case, it is possible that different main function sets 36 are assigned to each of the first input devices 14.

    [0157] The operating concept described above is illustrated below using the specific example shown in FIG. 3.

    [0158] An operator places one of his left fingers on the fifth section 40 of the left first input device 14 counted from the top of the first input devices 14, and places one of his right fingers on the second section 40 of the right first input device 14.

    [0159] On the one hand, the main function 38 “assistance systems”—on the left in the graphical user interface 41 of the external display device 20—and, on the other hand, the main function 38 “music”—on the right in the graphical user interface 41 of the external display device 20—are thus chosen.

    [0160] Optionally, the chosen main functions 38 are confirmed and thus selected by a selection gesture, for example “tap” or “press”, by a removal of the finger from the section 40 of the chosen main function 38 or by a period of time.

    [0161] During or shortly after the pre-selection or during or shortly after the selection, subfunctions 44 corresponding to the (pre)selected main function 38 are displayed on the internal display device of the second input devices 16, as is apparent from FIG. 3.

    [0162] Starting at the top and proceeding clockwise around the control pad, the subfunctions 44 for the main function 38 “assistance systems” are “lane keeping assistant”, “increase cruise control speed”, “distance control assistant” and “decrease cruise control speed”.

    [0163] Radially further inward with respect to the steering wheel 10, the subfunction “turn cruise control off or on” is shown at the top and the standard subfunction “undo or exit” is shown at the bottom.

    [0164] Starting at the top and proceeding clockwise around the control pad, the subfunctions 44 for the main function 38 “music” are “increase volume”, “next track”, “decrease volume” and “previous track”.

    [0165] Radially further inward with respect to the steering wheel 10, subfunction 44 “confirm” is shown at the top and the standard subfunction “undo or exit” is shown at the bottom.

    [0166] As mentioned above, the subfunctions 44 shown are all individual functions.

    [0167] An individual function can be executed after a gesture, e.g. “tap”, or during a gesture, e.g. “tap and hold”.

    [0168] For example, a “tap” on “increase cruise control speed” increases the speed gradually with each “tap”, whereas a “tap and hold” increases the speed more quickly and continuously during the hold.

    [0169] A “tap” on “next track” will skip to the next track, wherein a fast forwarding of the current track can be performed by a “tap and hold”.

    [0170] It is however not possible to perform another gesture during the input process to control another individual function.
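    The difference between executing an individual function after a gesture (“tap”) and during a gesture (“tap and hold”) can be sketched for the “increase cruise control speed” example of paragraph [0168]; the step size and the hold rate are assumed values:

```python
def cruise_speed_after(events, speed=100, step=1, hold_rate=5):
    """Illustrative "tap" vs. "tap and hold" semantics for the
    "increase cruise control speed" subfunction."""
    for ev in events:
        if ev == "tap":
            # each "tap" increases the speed by one increment
            speed += step
        elif isinstance(ev, tuple) and ev[0] == "hold":
            # "tap and hold" increases the speed continuously;
            # ev[1] is the hold duration in seconds
            speed += hold_rate * ev[1]
    return speed
```

    The same pattern would apply to “next track” versus fast-forwarding the current track by a “tap and hold”.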

    [0171] Alternatively, one or more related subfunctions 44, for example “increase cruise control speed” and “decrease cruise control speed”, “increase volume” and “decrease volume”, or “next track” and “previous track”, may be formed as a collective function.

    [0172] It is for example possible to increase the volume during a “swipe up” and, optionally, to “swipe down” within the same input process, thereby decreasing the volume.

    [0173] A second operating variant of the operating system 12 according to FIG. 4 is similar to the first operating variant.

    [0174] Here, the number of main functions 38 of a main function set 36 is however not limited to the number of virtual sections of the first input device 14, but the main function set 36 is set up per first input device 14 as a kind of endless list (“scroll”), the endless lists of the two first input devices 14 having mutually identical main functions 38.

    [0175] The processing unit 18 is configured to generate a graphical user interface 41 on the external display device 20, which comprises the main function sets 36 as a menu with the individual main functions 38 as menu items.

    [0176] As soon as the operator places one or more fingers on the first input device 14, the processing unit 18 detects one or more points of contact 42 via the first input device 14.

    [0177] It may be provided that the start choice of a main function 38 is dependent on the touch position on the first input device 14.

    [0178] Alternatively, the start choice may always be the same, in particular an individually adjustable “default” main function 38, or it may be the last active main function 38.

    [0179] If the point of contact 42 is located in the upper area of the first input device 14, the start choice will tend to be a main function 38 located in the upper area of the menu of the graphical user interface 41.

    [0180] A gesture, for example “swipe”, can be used to scroll through the menu of main functions 38 and choose one main function 38 at a time.

    [0181] It is here conceivable that an input parameter can be changed depending on a number of points of contact 42 performing the scroll gesture. For example, an input parameter may be a scroll speed through the menu of the main functions 38.

    [0182] The input parameter, for example the scroll speed, may additionally or alternatively be changed depending on the movement speed of the at least one finger across the corresponding input device 14 and/or by an initiation of a shift mode (“shift” function).

    [0183] Alternatively, only incremental scrolling may be provided, which is independent of the number of points of contact 42, the speed of movement of the at least one finger and/or the initiation of a shift mode.
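The input-parameter logic of paragraphs [0181] to [0183] can be summarized in a small sketch: the scroll speed may depend on the number of points of contact 42, on the movement speed of the finger, and on a shift mode, or may instead be a fixed increment. This is a hypothetical illustration only; the function name, units, and the factor applied in shift mode are assumptions.

```python
def scroll_speed(num_contacts: int, finger_speed: float,
                 shift: bool = False, incremental: bool = False) -> float:
    """Map the detected touch state to a scroll speed (menu items per second).

    In the purely incremental variant, the speed is a fixed step that is
    independent of the number of contacts, the finger speed, and shift mode.
    """
    if incremental:
        return 1.0
    # More fingers and faster movement yield faster scrolling (assumed mapping).
    speed = num_contacts * finger_speed
    if shift:
        speed *= 2.0  # assumed: the "shift" function doubles the speed
    return speed
```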

    [0184] A mechanical wheel and/or mechanical buttons may for example be used instead of a touch-sensitive surface.

    [0185] It is also conceivable that the complete first input device 14 is tilted similar to a rocker in order to perform scrolling, in particular incremental scrolling, through the main functions 38.

    [0186] Subfunctions 44 specific to the main function 38 can be assigned to the second input device 16 already in the chosen state of a main function 38 and optionally be displayed by the second input device 16.

    [0187] Alternatively, the subfunctions 44 specific to the main function 38 may be assigned to the second input device 16 and optionally be displayed by the second input device 16 only after a selection of the corresponding main function 38.

    [0188] The selection of a chosen main function 38 may take place, for example, by removing the finger from the first input device 14, by making a specific gesture, such as “tap” or “press,” or by remaining in the chosen state for a specific period of time.
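The three selection triggers named above (lift-off, a specific gesture, or remaining in the chosen state for a period of time) can be expressed as a small decision function. The following is a hypothetical sketch; the event names and the dwell time are assumptions.

```python
DWELL_SECONDS = 1.5  # assumed dwell time for confirmation by waiting

def is_selected(event: str, chosen_since: float, now: float) -> bool:
    """Confirm the chosen main function on lift-off, on a 'tap' or 'press'
    gesture, or after the finger has remained in the chosen state long enough.

    `chosen_since` and `now` are timestamps in seconds.
    """
    if event in ("lift_off", "tap", "press"):
        return True
    return now - chosen_since >= DWELL_SECONDS
```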

    [0189] The (pre)selection of the main functions 38 is thus dependent on the relative movement of the point of contact 42 on the first input device 14.

    [0190] Or, in other words, the main function 38 is assigned to the detected finger based on the relative location or the relative movement of the finger on the first input device 14.

    [0191] In this respect, it is not necessary for the operator to find the absolute position of the corresponding main function 38, as the corresponding main function 38 is determined based on a relative location or a relative movement of the at least one finger, irrespective of the absolute position at which this is done.
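The relative preselection described above can be sketched as follows: only the displacement of the point of contact 42 determines the menu index, not the absolute touch position. This is a hypothetical illustration; the function name and the assumed travel distance per menu item are not part of the claimed system.

```python
ITEM_HEIGHT_PX = 40.0  # assumed finger travel distance per menu item

def preselect(start_index: int, y_start: float, y_now: float,
              menu_len: int) -> int:
    """Choose a menu item from the *relative* finger movement only.

    The same swipe produces the same index change regardless of where on
    the input device the movement starts.
    """
    steps = round((y_now - y_start) / ITEM_HEIGHT_PX)
    return (start_index + steps) % menu_len
```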

    [0192] The confirmation or selection of a chosen main function 38 may also be performed via the second input device 16.

    [0193] With regard to the subfunctions 44, reference is made to the explanations as to the first operating variant according to FIG. 3.

    [0194] Only operation using a first input device 14 and a second input device 16 has been discussed above.

    [0195] Operation can of course also be performed simultaneously on both first input devices 14 and both second input devices 16.

    [0196] The operating concept described above is illustrated below using the specific example shown in FIG. 4.

    [0197] An operator places one of his left fingers on an area of the left first input device 14 and one of his right fingers on an area of the right first input device 14.

    [0198] As a result, on the one hand, a main function 38 corresponding to the touch position of the left first input device 14 is chosen in the left menu of the graphical user interface 41 of the external display device 20 for starting, and on the other hand, a main function 38 corresponding to the touch position of the right first input device 14 is chosen in the right menu for starting.

    [0199] It may be provided that the start choice of a main function 38 is dependent on the touch position on the first input device 14.

    [0200] Alternatively, the start choice can always be the same, in particular individually adjustable “default” main function 38 or the last active main function 38.

    [0201] The operator then “swipes” with his left and/or right finger in the direction of the desired main functions 38, in this case the main functions 38 “assistance systems” or “music”, and thus chooses them, as shown in FIG. 4 by several marking lines around the corresponding main functions 38.

    [0202] Optionally, the chosen main functions 38 are confirmed and thus selected by a selection gesture, for example “tap” or “press”, by removing the finger from the section 40 of the chosen main function 38, or by remaining there for a specific period of time.

    [0203] During or shortly after the preselection, or during or shortly after the selection, subfunctions 44 corresponding to the preselected main function 38 are displayed on the internal display device of the second input devices 16.

    [0204] For the descriptions of the main functions 38 and subfunctions 44 (individual functions and collective functions), reference is made to the explanations as to the first operating variant according to FIG. 3.

    [0205] A third operating variant of the operating system 12 according to FIG. 5 is similar to the first operating variant.

    [0206] Here, however, the main functions 38 of a main function set 36 are not assigned to virtual sections of the first input device 14, but the at least one main function 38 is assigned to the at least one point of contact 42 of the at least one finger touching the first input device 14.

    [0207] The processing unit 18 is configured to generate a graphical user interface 41 on the external display device 20 which visualizes the main functions 38 assigned to the points of contact 42.

    [0208] As soon as the operator places one or more fingers on the first input device 14, the processing unit 18 detects one or more points of contact 42 via the first input device 14.

    [0209] Each point of contact 42 is assigned to a main function 38, which is thereby chosen.

    [0210] Only the chosen main functions 38 are displayed on the external display device 20. Alternatively, the main functions 38 that are not chosen may also be displayed and the main functions 38 that are chosen may be highlighted in some manner.

    [0211] It may be provided that the start choice of one or more main functions 38 is dependent on the point of contact 42 or points of contact 42 on the first input device 14.

    [0212] Alternatively, the start choice may always be the same, in particular individually adjustable “default” main function 38 or the last active main function 38.

    [0213] For example, if the point of contact 42 or points of contact 42 is/are located in the upper area of the first input device 14, one or more main function(s) 38 located in the upper area of the menu of the graphical user interface 41 are more likely to be selected for start. For this purpose, main function sets 36 may already be displayed in the menu prior to touching the first input devices 14.

    [0214] Subfunctions 44 specific to the main function 38 may be assigned to the second input device 16 already in a chosen state of a main function 38 and optionally be displayed by the second input device 16.

    [0215] Several fingers may also rest on the first input device 14 and thus create several points of contact 42, such that several main functions 38 are selected. In this case, however, no subfunctions 44 can be assigned to the second input device 16 in a selected state of the main functions 38 and optionally be displayed by the second input device 16.

    [0216] The processing unit 18 may then recognize how many fingers rest on the input device 14, that is, the touch-sensitive surface.

    [0217] Here, a corresponding main function 38 may be assigned to each recognized finger, in particular based on a relative location of the fingers, so that a main function 38 is (pre-)selected as soon as the operator removes the fingers from the touch-sensitive surface, except for the finger whose assigned main function 38 is to be (pre-)selected.

    [0218] The relative location of the finger with respect to the input device 14 is in particular recognized, i.e., whether the respective finger is the uppermost or lowermost finger on the touch-sensitive surface, regardless of whether the respective finger is the index finger or the middle finger.
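The assignment by relative location can be sketched by ordering the detected points of contact 42 vertically and mapping one main function 38 to each, uppermost first. This is a hypothetical illustration; the function name and coordinate convention (screen coordinates with smaller y meaning higher up) are assumptions.

```python
def assign_functions(contacts, functions):
    """Assign one main function per detected finger, ordered by the
    fingers' relative vertical position (uppermost first), regardless of
    which anatomical finger creates each point of contact.

    `contacts` is a list of (x, y) tuples; smaller y means higher up.
    """
    ordered = sorted(contacts, key=lambda point: point[1])
    return {point: functions[i] for i, point in enumerate(ordered)}
```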

    [0219] Then, the correspondingly associated subfunctions 44 may be assigned to the second input device 16 in a chosen state of the main function 38 and optionally be displayed by the second input device 16.

    [0220] Alternatively, the subfunctions 44 specific to the main function 38 may be assigned to the second input device 16 and optionally be displayed by the second input device 16 only after a selection of the corresponding main function 38.

    [0221] The selection of a chosen main function 38 may take place, for example, by removing the finger of the assigned chosen main function 38, by making a specific gesture, such as “tap” or “press,” or by remaining in the chosen state for a specific period of time.

    [0222] If several points of contact 42 have been created and thereby several main functions 38 are chosen, a selection of a main function 38 may also be made by removing the fingers assigned to the main functions 38 that are not to be selected.
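The lift-off selection just described can be sketched as follows: when exactly one of the originally assigned fingers remains on the surface, its main function 38 is selected. This is a hypothetical illustration; the function name and data shapes are assumptions.

```python
def selected_by_lift_off(assigned: dict, remaining: set):
    """Return the main function whose finger is the only one still resting
    on the surface; return None while zero or several fingers remain.

    `assigned` maps a contact identifier to its main function; `remaining`
    is the set of contact identifiers still touching the surface.
    """
    still_down = [func for contact, func in assigned.items()
                  if contact in remaining]
    return still_down[0] if len(still_down) == 1 else None
```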

    [0223] Thus, the (pre)selection of the main functions 38 is dependent on a relative location of the point of contact 42 or points of contact 42 on the first input device 14. Or, in other words, the main function 38 is assigned to the at least one detected finger based on the relative location of the finger on the first input device 14.

    [0224] The confirmation or selection of a chosen main function 38 may also be performed via the second input device 16.

    [0225] It may be provided that a particular gesture initiates a change in the menu of the graphical user interface 41. For example, a “swipe” of the finger or fingers may cause the currently chosen main functions 38 to be replaced by other main functions 38.

    [0226] With regard to the subfunctions 44, reference is made to the explanations as to the first operating variant according to FIG. 3.

    [0227] Only the operation using a first input device 14 and a second input device 16 has been discussed above. Operation can of course also be performed simultaneously on both first input devices 14 and both second input devices 16.

    [0228] The operating concept described above is illustrated below on the basis of the specific example shown in FIG. 5.

    [0229] An operator places three of his left fingers on a lower area of the left first input device 14 and two of his right fingers on an upper area of the right first input device 14.

    [0230] As a result, on the one hand, main functions 38 (“navigation”, “telephone” & “assistance systems”) corresponding to three of the points of contact 42 of the left first input device 14 are chosen for starting in the left menu of the graphical user interface 41 of the external display device 20, and only these are displayed.

    [0231] On the other hand, main functions 38 (“climate” & “music”) corresponding to two of the points of contact 42 of the right first input device 14 are chosen for starting in the right menu, and only these are displayed.

    [0232] The desired main functions 38 are selected by a selection gesture, for example “tap” or “press”, by removing the finger assigned to the main function 38 to be selected, or by removing the fingers assigned to the main functions 38 not to be selected.

    [0233] FIG. 5 shows the (pre-)selected main functions 38 (“assistance systems” & “music”) by several marking lines around the corresponding main function 38.

    [0234] During or shortly after the pre-selection or during or shortly after the selection, subfunctions 44 corresponding to the (pre-)selected main function 38 are displayed on the internal display device of the second input devices 16.

    [0235] For the descriptions of the main functions 38 and subfunctions 44 (individual functions and collective functions), reference is made to the explanations as to the first operating variant according to FIG. 3.

    [0236] FIG. 6 shows a fourth operating variant. Here, the available main functions 38 are additionally or exclusively visualized in the menu of the graphical user interface 46 on the internal display device of the second input devices 16.

    [0237] One main function 38 per operating page can then be (pre-)selected according to one of the (pre-)selection methods described above, and the subfunctions 44 can be controlled as described above.

    [0238] Alternatively or additionally, when a main function 38 of the menu shown in FIG. 6 is controlled via the second input device 16, a collective function of the most important subfunctions 44 of the controlled main function 38 can be executed, as shown in FIG. 7. This collective function may, for example, be individually preset by the operator.

    [0239] In the example shown, the collective function is “adjust volume” and comprises the two individual functions “increase volume” and “decrease volume”, which can be controlled by placing a finger on an icon of the desired collective function and the gesture “clockwise circular movement” or “counterclockwise circular movement”.
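The circular-movement gesture of the collective function can be sketched by summing the signed angle increments of the touch points around the icon. This is a hypothetical illustration, not the claimed implementation; it assumes screen coordinates (y pointing down), under which a visually clockwise motion corresponds to increasing angles from `atan2`.

```python
import math

def circular_direction(points):
    """Classify a sequence of (x, y) touch points as a clockwise
    ('increase volume') or counterclockwise ('decrease volume') circle.

    The centre of rotation is estimated as the centroid of the points;
    signed angle increments are accumulated, unwrapping jumps across
    the -pi/pi boundary.
    """
    cx = sum(p[0] for p in points) / len(points)
    cy = sum(p[1] for p in points) / len(points)
    angles = [math.atan2(p[1] - cy, p[0] - cx) for p in points]
    total = 0.0
    for a0, a1 in zip(angles, angles[1:]):
        d = a1 - a0
        if d > math.pi:
            d -= 2 * math.pi
        elif d < -math.pi:
            d += 2 * math.pi
        total += d
    # In screen coordinates (y down), a positive total angle is clockwise.
    return "clockwise" if total > 0 else "counterclockwise"
```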

    [0240] Matching subfunctions 44 in the operating variants according to FIGS. 2 to 5 may of course also be formed together as a collective function.

    [0241] In the present case, the second input devices 16 are arranged on the steering wheel spokes 24 and face the driver.

    [0242] Alternatively, the second input devices 16 can also be provided on the webs or rockers on which the first input devices 14 are arranged in the present case.

    [0243] The first and second input devices 14, 16 are then arranged on opposite sides of the rockers or webs.

    [0244] In an analogous manner, the second input devices 16 can then be operated by the operator via the thumb.

    [0245] The described properties, features and modes of operation of all operating systems described above can of course be freely combined, interchanged or the like.