Method for Increasing Safety During the Operation of a Device
20250077042 · 2025-03-06
CPC classification
B60K35/29 (PERFORMING OPERATIONS; TRANSPORTING)
B60K2360/122 (PERFORMING OPERATIONS; TRANSPORTING)
B60W2540/229 (PERFORMING OPERATIONS; TRANSPORTING)
B60K2360/162 (PERFORMING OPERATIONS; TRANSPORTING)
B60K2360/126 (PERFORMING OPERATIONS; TRANSPORTING)
B60K35/28 (PERFORMING OPERATIONS; TRANSPORTING)
B60W2540/22 (PERFORMING OPERATIONS; TRANSPORTING)
B60K2360/161 (PERFORMING OPERATIONS; TRANSPORTING)
B60K2360/146 (PERFORMING OPERATIONS; TRANSPORTING)
G06F3/017 (PHYSICS)
B60K2360/11 (PERFORMING OPERATIONS; TRANSPORTING)
B60K35/60 (PERFORMING OPERATIONS; TRANSPORTING)
B60K2360/113 (PERFORMING OPERATIONS; TRANSPORTING)
B60K2360/111 (PERFORMING OPERATIONS; TRANSPORTING)
G06F3/0484 (PHYSICS)
B60K35/10 (PERFORMING OPERATIONS; TRANSPORTING)
B60K2360/197 (PERFORMING OPERATIONS; TRANSPORTING)
Abstract
A method for increasing safety during the operation of a device. Multiple widgets are displayed on a graphical user interface. Each widget is assigned an operating function of the device; each operating function is selectable by selecting the corresponding widget via an input gesture of a user. Data relating to a current situation is detected, and it is determined whether increased attention is required in the current situation based on the data. If increased attention is required, the widgets are arranged such that each widget is assigned a simple input gesture which differs greatly from input gestures assigned to other displayed widgets in order to increase safety during the operation of the device.
Claims
1-15. (canceled)
16. A method for increasing safety during the operation of a device, wherein a plurality of widgets are displayed on a graphical user interface, wherein each widget is assigned an operating function of the device, and wherein each operating function can be selected by selecting the corresponding widget by means of an input gesture that can be carried out by a user, the method comprising: detecting data relating to a current situation; determining whether increased attention is required in the current situation on the basis of the data relating to the current situation; if it is determined that increased attention is required in the current situation, arranging the widgets displayed on the graphical user interface such that each widget is assigned a simple input gesture which differs greatly from input gestures assigned to other displayed widgets in order to increase safety during the operation of the device.
17. The method of claim 16, further comprising: setting confidence intervals when assigning detected input gestures to individual widgets such that, if it is determined that increased attention is required in the current situation, each detected input gesture can be assigned to a displayed widget.
18. The method of claim 16, further comprising: if it is determined that increased attention is required in the current situation, reducing a number of widgets displayed on the graphical user interface compared to a state in which increased attention is not required.
19. The method of claim 16, further comprising: if it is determined that increased attention is required in the current situation, presenting at least one piece of information relating to the input gestures assigned to the displayed widgets.
20. The method of claim 16, further comprising: displaying a detected input gesture in real time on a display apparatus.
21. The method of claim 16, wherein, in addition to executing an input gesture assigned to a widget, a confirmation action is also required to select the operating function assigned to the corresponding widget.
22. The method of claim 16, further comprising: outputting a feedback signal once an operating function assigned to a widget has been selected.
23. An input apparatus for increasing safety during the operation of a device, comprising: a graphical user interface configured to display a plurality of widgets, wherein each widget is assigned an operating function of the device, and each operating function is selectable by selecting the corresponding widget via an input gesture of a user; an input unit configured to detect input gestures of the user; a detection unit configured to detect data relating to a current situation; a determination unit configured to determine whether increased attention is required in the current situation on the basis of the data relating to the current situation; and a user interface control module configured to arrange the widgets displayed on the graphical user interface such that each widget is assigned a simple input gesture which differs greatly from input gestures assigned to other displayed widgets to increase safety during the operation of the device if it is determined by the determination unit that increased attention is required in the current situation.
24. The input apparatus of claim 23, wherein the user interface control module is further configured to, in response to the determination unit determining that increased attention is required in the current situation: set confidence intervals in the assignment of detected input gestures to individual widgets such that each input gesture detected by the input unit can be assigned to a widget.
25. The input apparatus of claim 23, wherein the user interface control module is further configured to, in response to the determination unit determining that increased attention is required in the current situation: reduce a number of widgets displayed on the graphical user interface compared to a state in which increased attention is not required.
26. The input apparatus of claim 23, wherein the user interface control module is further configured to, in response to the determination unit determining that increased attention is required in the current situation: present at least one piece of information relating to the input gestures assigned to the displayed widgets on a display apparatus.
27. The input apparatus of claim 23, further comprising: a display module configured to display, in real time on a display apparatus, an input gesture detected by the input unit.
28. The input apparatus of claim 23, further comprising: a confirmation element configured to detect a confirmation action of a user, wherein the user interface control module is further configured to select an operating function assigned to a widget only when a corresponding input gesture is detected by the input unit and a confirmation action is detected by the confirmation element.
29. The input apparatus of claim 23, further comprising: a feedback generator configured to output a feedback signal once an operating function assigned to a widget has been selected.
30. A motor vehicle, comprising: the input apparatus of claim 23.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
DETAILED DESCRIPTION OF THE DRAWINGS
[0056] For example, motor vehicles usually have a large number of technical devices that a user can operate at will. Examples of such devices are a radio, a navigation system, a motor vehicle telephone or a hands-free device integrated in the motor vehicle, or an air conditioning system.
[0057] These devices can be operated via input units that receive input gestures assigned to individual operating functions, i.e., operable or adjustable functions of a device, and that are coupled to the device in such a way that an operating function can be selected on the basis of a received input gesture and then set or controlled accordingly. The input units can be, for example, a touchpad or a trackball arranged on a steering wheel of the motor vehicle.
[0058] In individual situations, for example when controlling a motor vehicle, it is important that the user focuses their attention and concentration completely on controlling the vehicle. Corresponding input apparatuses should draw the user's concentration as little as possible so as not to distract them unnecessarily. At the same time, however, safe operation of at least the most important devices must be ensured.
[0060] If, on the other hand, it is determined in step 3 that no increased attention is required in the current situation, the input apparatus is operated in its normal state.
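The decision step described above can be sketched as follows. This is a minimal, hypothetical illustration: the field names (`vehicle_speed_kmh`, `gloves_detected`) and threshold values are assumptions, not taken from the patent.

```python
# Hypothetical sketch of the "increased attention" decision (step 3).
# Field names and thresholds are illustrative assumptions.

def increased_attention_required(situation: dict) -> bool:
    """Return True if the current situation calls for increased attention."""
    # Driving above walking pace demands the driver's full attention.
    if situation.get("vehicle_speed_kmh", 0.0) > 5.0:
        return True
    # Gloved input degrades gesture detection accuracy (cf. [0061]).
    if situation.get("gloves_detected", False):
        return True
    return False
```

If the function returns False, the apparatus keeps its full widget layout; otherwise it switches to the simplified, gesture-distinct layout.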
[0061] Situations in which increased attention is required may involve, for example, driving a motor vehicle. Thus, while driving, a driver's view to the outside should be continuous, i.e., as uninterrupted as possible. In this case, the input gestures can be detected, for example, by means of a trackball attached to the steering wheel of the motor vehicle, whereas the graphical user interface can be displayed on an instrument cluster or a head-up display of the motor vehicle. However, a situation requiring increased attention also arises, for example, if a user performing the input gestures is wearing gloves, so that the input gestures can be received or detected only with difficulty and no longer precisely.
[0062] Simple input gestures are understood here to mean, for example, gestures that have a small or reduced number of degrees of freedom, in particular rotational movements or linear movements. An example of such a simple input gesture is a forward or backward movement with a finger and/or hand. The fact that individual input gestures differ greatly means that they differ significantly, so that the input gestures that can be used to select displayed widgets can be distinguished from one another easily and without much effort.
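One way to read "simple, greatly differing gestures" is that each widget's gesture lies on a different axis or points in the opposite direction, so a raw displacement resolves unambiguously. A minimal sketch, with assumed gesture names:

```python
# Illustrative sketch: classify a raw trackball/pointer displacement into
# one of a few "simple" one-degree-of-freedom gestures. Names are assumed.

def classify_simple_gesture(dx: float, dy: float) -> str:
    """Map a 2-D displacement to 'forward', 'backward', 'left' or 'right'."""
    if abs(dy) >= abs(dx):            # vertical axis dominates
        return "forward" if dy > 0 else "backward"
    return "right" if dx > 0 else "left"
```

Because each widget is bound to exactly one such gesture, the gestures differ greatly: they either lie on different axes or run in opposite directions.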
[0065] The selection of the widgets, which are to be displayed even in situations in which increased attention is required, can be carried out by artificial intelligence on the basis of widgets and corresponding operating functions that a user has selected in comparable situations in the past. The training of a corresponding artificial neural network can be done, for example, on the basis of past selections of widgets or corresponding operating functions by a user in comparable situations as input variables. If there are different users of the corresponding input apparatus, a separate artificial neural network can also be trained for each user. Furthermore, in situations in which increased attention is required, only those widgets can be displayed which are assigned to operating functions that are absolutely necessary for the regular operation of the device and/or the motor vehicle.
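As a simpler stand-in for the learned selection described above (the patent envisages a trained artificial neural network; a frequency count over past selections illustrates the same idea), one could rank widgets by how often the user chose them in comparable past situations. All names and the data layout are assumptions:

```python
from collections import Counter

# Hypothetical sketch: rank widgets by past selection frequency in
# comparable situations, as a stand-in for the trained network.

def select_widgets(history: list[tuple[str, str]], situation: str, k: int = 2) -> list[str]:
    """Return the k widgets most often chosen in the given situation type.

    `history` holds (situation_type, selected_widget) pairs from past use.
    """
    counts = Counter(widget for sit, widget in history if sit == situation)
    return [widget for widget, _ in counts.most_common(k)]
```

With per-user histories, this naturally yields the separate per-user models mentioned above.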
[0067] The at least one piece of information can, in particular, be a graphical representation or a description of an input gesture on a display apparatus or on the graphical user interface. Furthermore, information relating to selectable operating functions can also be presented to a user with the at least one piece of information.
[0068] Further, the method comprises a step 9 of displaying a detected input gesture in real time on a display apparatus.
[0070] Furthermore, the method 1 also comprises a step 11 of outputting a feedback signal after an operating function assigned to a widget has been selected. The feedback signal can be output, for example, haptically, visually, or in the form of audio. For example, a brightness of a display on which the graphical user interface is displayed may be increased, or the display may flash.
[0074] The memory 25 contains control information in the form of a script with window description source text that defines the elements or widgets of the graphical user interface 23 and specifies their features or properties, such as appearance and behavior. In addition, the memory 25 contains a repository holding the text, graphics and sound resources to be displayed in the graphical user interface 23.
[0075] As can be seen, the user interface control module 26 further comprises an interpreter 27, a GUI manager 28, and an interface 29 to at least one device to be controlled or operated by the input apparatus 20. The interface 29 is designed to create a logical interface between corresponding applications running on the device, the individual operating functions and the GUI script, i.e. the control information. The interface 29 is based here on the three functional elements of event, action and data.
[0080] In addition, the depicted user interface control module 26 has an arrangement module 34 configured to arrange the widgets depicted on the graphical user interface 23 such that each widget is assigned a simple input gesture which differs greatly from input gestures assigned to other depicted widgets in order to increase safety during the operation of the device if it is determined by the determination unit 33 that increased attention is required in the current situation. For example, various look-up tables may be stored in this context, which are matched to the corresponding application and in which assignments between widgets displayed in the corresponding situation and their arrangement within the graphical user interface 23 are stored.
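The look-up tables mentioned above can be sketched as follows; the keys, widget names, and gesture names are illustrative assumptions:

```python
# Hypothetical look-up table: for each (application, attention state) it
# stores which widgets to show and which simple, mutually distinct input
# gesture each one is assigned. All entries are assumed examples.

ARRANGEMENTS = {
    ("telephone", "increased_attention"): {
        "accept_call": "flick_forward",
        "reject_call": "flick_backward",
    },
    ("telephone", "normal"): {
        "accept_call": "tap",
        "reject_call": "tap",
        "mute": "tap",
        "keypad": "tap",
    },
}

def arrange(application: str, attention_state: str) -> dict:
    """Return the widget-to-gesture arrangement for the given situation."""
    return ARRANGEMENTS[(application, attention_state)]
```

Note that in the increased-attention entry every widget carries a different gesture, whereas the normal layout may reuse the same selection mechanism for all widgets.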
[0081] Also, the illustrated user interface control module 26 further comprises a confidence module 35 configured to set confidence intervals in the assignment of detected input gestures to individual widgets such that each input gesture detected by the input unit 21 can be assigned to a widget if it is determined by the determination unit 33 that increased attention is required in the current situation.
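The confidence-interval widening can be pictured as a nearest-prototype assignment: in the normal state, a gesture too far from every prototype is rejected as ambiguous; with increased attention required, the acceptance interval is widened so that every detected gesture resolves to some widget. A minimal sketch under assumed names and values:

```python
# Sketch of confidence-interval widening in gesture-to-widget assignment.
# Prototype coordinates and the threshold value are assumptions.

def assign_gesture(gesture, prototypes, increased_attention, max_dist=0.5):
    """Assign a (x, y) gesture to the widget with the nearest prototype."""
    best, best_d = None, float("inf")
    for widget, (px, py) in prototypes.items():
        d = ((gesture[0] - px) ** 2 + (gesture[1] - py) ** 2) ** 0.5
        if d < best_d:
            best, best_d = widget, d
    if not increased_attention and best_d > max_dist:
        return None          # ambiguous input rejected in the normal state
    return best              # widened interval: always resolves to a widget
```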
[0083] The selection of the widgets which are still to be displayed even in situations in which increased attention is required can be carried out by artificial intelligence on the basis of widgets and corresponding operating functions which a user has selected in comparable situations in the past. For this purpose, information relating to situations detected in the past, together with information on which widgets or operating functions the user selected in those situations, is provided as input variables to an artificial intelligence module, which is designed to train an artificial neural network on the basis of these input variables. Furthermore, in situations in which increased attention is required, only those widgets can be displayed which are assigned to operating functions that are absolutely necessary for the regular operation of the device and/or the motor vehicle, wherein corresponding assignments can be stored in a memory, for example.
[0084] In addition, the user interface control module 26 has a presentation module 37 that is configured to present at least one piece of information relating to the input gestures assigned to the displayed widgets on the display unit 22 if it is determined by the determination unit 33 that increased attention is required in the current situation. The information relating to the input gestures assigned to individual widgets may in turn be stored in a memory. In addition, the information relating to input gestures assigned to individual widgets is also preferably presented adjacent to or in close proximity to the corresponding widgets on the display unit 22.
[0085] As can also be seen, the illustrated user interface control module 26 further comprises a display module 38 configured to display an input gesture detected by the input unit 21 in real time on the display unit 22.
[0086] Also, the input apparatus 20 further comprises a confirmation element 39 designed to detect a confirmation action of a user.
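The two-factor selection implied by the confirmation element 39 can be sketched as follows; the gesture and function names are assumed for illustration:

```python
# Sketch of the confirmation requirement: an operating function is selected
# only when both the matching input gesture and a separate confirmation
# action (e.g. a button press) have been detected. Names are assumed.

def try_select(detected_gesture, confirmed, bindings):
    """Return the selected operating function, or None if not yet selectable."""
    function = bindings.get(detected_gesture)
    if function is None or not confirmed:
        return None
    return function
```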
[0089] According to the first embodiment, two widgets 51, 52 are displayed here on a graphical user interface. In particular, the widgets 51, 52 are assigned functions of a motor vehicle telephone or a hands-free device integrated in a motor vehicle.
[0090] A situation requiring increased attention is currently present. During this situation, only two widgets 51, 52 are displayed on the graphical user interface: a user can accept an incoming call via the first widget 51 and reject the incoming call via the second widget 52.
[0091] A cursor 53 is also shown, which can be moved by input gestures detected via an input unit, for example a trackball arranged on the steering wheel of the motor vehicle, in order to select one of the two widgets 51, 52 and the corresponding operating function. In doing so, the cursor 53 can indicate the current finger position of a finger of a user.
[0092] According to the first embodiment, the widgets 51, 52 are further arranged on the graphical user interface in such a way that a simple input gesture is assigned to each widget 51, 52 in order to increase safety during the operation of the device. According to the first embodiment, the simple input gestures are input gestures that have only one degree of freedom, in particular forward or backward movements with a finger and/or hand, or a corresponding forward or backward flicking of a trackball.
[0097] In particular, according to the first embodiment, the method is designed in such a way that an input gesture detected by a trackball arranged on a steering wheel of a motor vehicle can be reliably processed even during a steering movement.
[0099] According to the second embodiment, six widgets 61, 62, 63, 64, 65, 66 are displayed on a graphical user interface. The first five widgets 61, 62, 63, 64, 65 are widgets assigned to operating functions that a user has selected in the past during corresponding situations requiring increased attention. These widgets 61, 62, 63, 64, 65 are thus displayed to provide quick access to these functions. By selecting the additional widget 66, the selection of the displayed widgets can be changed and, for example, it is possible to navigate between different levels, wherein the widgets displayed in individual levels are stored and thus predetermined.
[0100] According to the second embodiment, the widgets 61, 62, 63, 64, 65, 66 are again arranged on the graphical user interface in such a way that each of the widgets 61, 62, 63, 64, 65 is assigned a simple input gesture, in particular a rotational movement, which differs greatly from the input gesture required for selecting the additional widget 66, in particular a linear movement, in order to increase safety during the operation of the device. Thus, the first five widgets 61, 62, 63, 64, 65 can be selected in particular by a rotational movement with a finger or a hand, or by guiding a trackball in a circle. The additional widget 66 can in turn be selected in particular by a forward movement with a finger or a hand, or by flicking a trackball forward.
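One simple heuristic for telling the two gesture classes of this embodiment apart compares the length of the traced path with the straight-line distance between its endpoints: a circular stroke travels far while ending near where it started, whereas a linear flick does the opposite. This is an assumed illustration, not the patent's detection method:

```python
import math

# Hypothetical sketch distinguishing a rotational stroke (selects among
# widgets 61-65) from a linear forward stroke (selects widget 66).
# The 0.8 straightness threshold is an assumption.

def gesture_class(points):
    """Classify a stroke, a list of (x, y) samples, as 'rotation' or 'linear'."""
    path = sum(math.dist(points[i], points[i + 1]) for i in range(len(points) - 1))
    chord = math.dist(points[0], points[-1])
    return "linear" if chord > 0.8 * path else "rotation"
```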
[0104] It should be noted here that, according to the second embodiment, the last-selected widget remains active after the corresponding input apparatus is switched off and on again, i.e., a cursor displayed on the graphical user interface does not jump back to its initial position when the corresponding input apparatus is switched off and on again.