Abstract
The invention relates to a means of transportation, a user interface and a method for assisting a user during interaction with a user interface (1). The method comprises the steps: detecting a crossing, by an input means (2) of the user, of a border of a detection region for detecting gestures freely executed in space, and in response thereto, displaying this crossing by means of a light strip (8) in an edge region of a display device (4) of the user interface (1).
Claims
1. A method for assisting a user during interaction with a user interface (1), comprising the steps: detecting (100) a crossing, by an input means (2) of the user, of a border of a detection area (3) for detecting gestures freely made in space, and in response thereto, displaying (300) the crossing by means of a light strip (8) in an edge area of a display apparatus (4) of the user interface (1).
2. The method according to claim 1, wherein the light strip (8) is depicted by means of pixels of the display apparatus (4).
3. The method according to claim 1 or 2, wherein the light strip (8) borders an edge, preferably all edges, of the display apparatus (4).
4. The method according to one of the preceding claims, wherein the light strip (8) is only displayed for the duration of the crossing.
5. The method according to one of the preceding claims, wherein an intensity of the light strip (8) is modified depending on a position of the input means (2) with respect to the border of the detection area (3).
6. The method according to one of the preceding claims, wherein a color of the light strip (8) is modified depending on a position of the input means (2) with respect to the border of the detection area (3).
7. The method according to one of the preceding claims, wherein the direction of a change in color and/or a change in intensity of the light strip is changed depending on a direction of the crossing with respect to the detection area (3).
8. The method according to one of the preceding claims, further comprising the step: increasing (200) the light strip (8) by widening the light strip (8) toward the middle of the display apparatus (4), or decreasing (400) the light strip (8) by narrowing the light strip (8) toward an edge area of the display apparatus (4).
9. The method according to one of the preceding claims, wherein the input means (2) comprises a hand and/or a finger of a user.
10. The method according to one of the preceding claims, wherein the border of the detection area (3) encloses a pyramidal frustum, and/or a conical frustum, and/or a cubic space.
11. The method according to one of the preceding claims, wherein the crossing in a first direction with respect to the detection area (3) is accompanied by a first sound indicator (9), and the crossing in a second direction with respect to the detection area (3) is accompanied by a second sound indicator (11), and wherein the first direction and second direction differ, and in particular the first sound indicator (9) and second sound indicator (11) differ.
12. A user interface for assisting a user with an interaction, comprising a detection apparatus (5) for detecting an input means of a user, an evaluation unit (6) and a signaling apparatus (12), wherein the evaluation unit (6) is configured to detect a crossing, by an input means (2) of the user, of a border of a detection area (3) for detecting gestures freely made in space, and the signaling apparatus (12) is configured to report, in response to the crossing, the crossing by means of a light strip (8) in an edge area of a display apparatus (4) of the user interface (1).
13. A computer program product comprising instructions that, when run on a programmable evaluation unit (6) of the user interface (1) according to claim 12, cause the evaluation unit (6) to perform the steps of a method according to one of claims 1 to 11.
14. A signal sequence representing instructions that, when run on a programmable evaluation unit (6) of the user interface (1) according to claim 12, cause the evaluation unit (6) to perform the steps of a method according to one of claims 1 to 11.
15. A means of transportation comprising a user interface according to claim 12.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0020] Exemplary embodiments of the invention will be described below in detail with reference to the accompanying drawings.
[0021] In the drawings:
[0022] FIG. 1 shows a schematic representation of components of an exemplary embodiment of a means of transportation designed according to the invention with an exemplary embodiment of a user interface designed according to the invention;
[0023] FIG. 2 shows an illustration of feedback of a first exemplary embodiment of a user interface according to the invention when an input means enters a detection area;
[0024] FIG. 3 shows an illustration of feedback of a second exemplary embodiment of a user interface according to the invention when an input means leaves a detection area;
[0025] FIG. 4 shows an illustration of feedback of a third exemplary embodiment of a user interface according to the invention when an input means enters a detection area;
[0026] FIG. 5 shows an illustration of feedback of a fourth exemplary embodiment of a user interface according to the invention when an input means leaves a detection area; and
[0027] FIG. 6 shows a flow chart illustrating steps of an exemplary embodiment of a method according to the invention for assisting a user during interaction with a user interface.
EMBODIMENTS OF THE INVENTION
[0028] FIG. 1 shows a passenger car 10 as a means of transportation according to the invention that has a user interface 1 with a screen 4 as a display apparatus and an electronic controller 6 as an evaluation unit. An infrared LED strip 5 is provided below the screen 4 as a detection apparatus and covers a rectangular detection area 3 in front of the screen 4. A data memory 7 is configured to provide program code for executing the method according to the invention, as well as references for the signals of the infrared LED strip 5 that occur when an input means crosses a border of the detection area 3. The electronic controller 6 is connected for data transmission in a star topology to the aforementioned components. A speaker 13 is likewise connected for data transmission to the electronic controller 6 so that an acoustic output of a sound indicator can underscore the display of the crossing.
[0029] FIG. 2 shows the screen 4 of FIG. 1, whose edge area (pixels of the display surface otherwise used to display optional content) is used to depict a light strip 8 in order to display, or respectively report, the entry of a hand 2 that moves along a double arrow P across the border of the detection area toward a central position in front of the screen 4. The light strip 8 consists of four substantially linear elements that occupy a plurality of the outermost pixels along the four edges of the screen 4. The light strip 8 can therefore be understood as a light frame. It shines while the hand 2 of the user crosses the border of the detection area and goes dark once the crossing is entirely completed. The change in intensity of the light emitted by the light strip 8 during the crossing can take the form of a swelling or, respectively, fading dimming process to calm the visual appearance, and can comprise a color change of the emitted light. Conversely, the signaling occurs when the hand of the user subsequently leaves the detection area in front of the screen 4 (for example, following an input). Since the screen 4 is used to signal, or respectively generate, the light strip, it can also be understood as a signaling apparatus 12.
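The frame-shaped light strip described above, occupying the outermost pixels along all four screen edges, can be illustrated with a minimal sketch. The function name and parameters below are illustrative assumptions; the patent does not prescribe any implementation:

```python
# Illustrative sketch (not from the patent): mark the outermost pixels
# of a display buffer as a frame-shaped light strip ("light frame").
def light_frame_mask(width, height, thickness=2):
    """Return a 2D list where True marks pixels belonging to the light strip."""
    mask = [[False] * width for _ in range(height)]
    for y in range(height):
        for x in range(width):
            # A pixel belongs to the frame if it lies within `thickness`
            # pixels of any of the four screen edges.
            if (x < thickness or x >= width - thickness or
                    y < thickness or y >= height - thickness):
                mask[y][x] = True
    return mask

mask = light_frame_mask(8, 6, thickness=1)
```

For the dimming process, such a mask would be combined with a time-varying intensity factor that swells on entry and fades on exit.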
[0030] FIG. 3 shows an alternative embodiment to FIG. 2 of a signaling apparatus 12 that borders the screen 4 in the form of a separate light outlet. Whereas the function substantially corresponds to that described in conjunction with FIG. 2, the light strip 8 can be designed independently of the capabilities of the screen 4, in particular with regard to the maximum possible intensity of the light emitted by the separate signaling apparatus 12. Alternatively or in addition, the emitted light can be generated as indirect light in that the light strip 8 is generated behind an opaque cover and shines into the surroundings of the screen 4. This embodiment enables a light strip 8 that is visually particularly attractive, subdued and minimally distracting.
[0031] FIG. 4 shows another embodiment of a user interface according to the invention in which the crossing of the hand 2 is acknowledged by a light strip 8 whose width, or respectively strength, grows from the outside to the inside. Over time, pixels lying further toward the middle of the screen 4 successively join the pixels close to the edge of the screen 4 in generating the light strip 8. Visually, the light strip 8, or respectively the light frame formed by a plurality of light strips 8, swells. Optionally, the feedback to the user is accompanied by the emission of a first sound indicator 9 in the form of an interrupted two-note sound of rising pitch.
[0032] FIG. 5 shows a situation subsequent to that of FIG. 4, in which the user's hand 2 leaves the detection area in front of the screen 4, in response to which the light strip 8 fades such that the pixels closest to the middle of the screen (the middle horizontal, or respectively middle vertical) first reduce their light intensity, and pixels further from the middle of the screen are dimmed thereafter. The exit of the hand 2 along arrow P is also underscored by a second sound indicator 11, this time in the form of an interrupted two-note sound of falling pitch, whereby the user recognizes that he has just left the detection area for 3D gesture detection without having to direct his attention to the user interface.
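The direction-dependent acoustic feedback of FIGS. 4 and 5, a rising two-note sound on entry and a falling two-note sound on exit, can be sketched as follows. The function name, the event labels and the concrete frequencies are hypothetical; the patent only requires that the two sound indicators differ:

```python
# Illustrative sketch (assumed names and frequencies, not from the patent):
# select a two-note sound indicator depending on the crossing direction.
def sound_indicator(direction):
    """Return the (first, second) note frequencies in Hz for a crossing."""
    if direction == "enter":
        # First sound indicator 9: interrupted two-note sound of rising pitch.
        return (440, 660)
    if direction == "leave":
        # Second sound indicator 11: same notes in falling order.
        return (660, 440)
    raise ValueError(f"unknown crossing direction: {direction!r}")
```

Because the indicators mirror each other, the user can distinguish entry from exit purely by ear, without looking at the screen.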
[0033] FIG. 6 shows steps of an exemplary embodiment of a method according to the invention for assisting a user during interaction with a user interface. In step 100, a crossing, by an input means of the user, of a border of a detection area for detecting gestures freely made in space is detected. The crossing can be an entry into or an exit from the detection area, or respectively a section of the detection area (such as a core area, an edge area, etc.). Then, in step 200, a light strip is built up by widening it in the direction of the middle of the display apparatus of the user interface. In step 300, the light strip has reached its maximum width in an edge area of the display apparatus, and the input means has fully entered the detection area. By performing 3D gestures, the user can now communicate with the user interface, or respectively a technical apparatus associated therewith. After the input, the user leaves the detection area in step 400, in response to which the light strip is reduced by narrowing it toward an edge area of the display apparatus. This decrease also takes the form of a dimming process in terms of the overall intensity of the emitted light, up to the complete disappearance of the light strip. Consequently, the feedback to the user about the crossing of the border of a detection area occurs in a more intuitively understandable, visually attractive and subdued manner, so that user acceptance of a correspondingly designed user interface is increased and driving safety is improved.
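The step sequence of FIG. 6 can be summarized as a minimal state progression. This is a sketch under assumptions: the function and event names are illustrative, and the patent defines steps 100 to 400 only functionally:

```python
# Illustrative sketch (assumed names, not from the patent): map crossings
# of the detection area border to the method steps 100-400 of FIG. 6.
def run_interaction(events):
    """events: sequence of 'enter' / 'leave' crossings of the detection area."""
    steps = []
    for event in events:
        if event == "enter":
            steps.append(100)  # detect crossing of the detection area border
            steps.append(200)  # widen the light strip toward the screen middle
            steps.append(300)  # light strip fully formed; 3D gesture input possible
        elif event == "leave":
            steps.append(400)  # narrow and dim the light strip until it disappears
    return steps
```

A complete interaction, entering the detection area, performing gestures and leaving again, thus traverses the steps 100, 200, 300 and 400 in order.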
[0034] Although the aspects and advantageous embodiments according to the invention were explained in detail with reference to the exemplary embodiments explained in conjunction with the associated drawings, modifications and combinations of features of the depicted exemplary embodiments are possible for a person skilled in the art without departing from the ambit of the present invention whose scope of protection is defined by the accompanying claims.
REFERENCE NUMBER LIST
[0035] 1 User interface
[0036] 2 Hand
[0037] 3 Detection area
[0038] 4 Screen
[0039] 5 Infrared LED strip
[0040] 6 Electronic controller
[0041] 7 Data memory
[0042] 8 Light strip
[0043] 9 Sound indicator
[0044] 10 Passenger car
[0045] 11 Sound indicator
[0046] 12 Signaling apparatus
[0047] 13 Speaker
[0048] 100-400 Method steps
[0049] P Arrow