METHOD FOR OPERATING AN OPERATOR CONTROL DEVICE AND OPERATOR CONTROL DEVICE FOR A MOTOR VEHICLE

20180173316 · 2018-06-21

Abstract

An operating gesture of a user and at least one spatial position in which the operating gesture is performed are sensed without contact by a sensing apparatus of an operating device of a motor vehicle. Then a function of the motor vehicle is controlled according to the operating gesture if it was sensed that the at least one spatial position lies within a predetermined interaction space. To determine the interaction space, a predetermined determination gesture performed by the user is detected, at least one position in which the determination gesture is performed is sensed, and the at least one sensed position of the determination gesture is defined as a coordinate of the interaction space.

Claims

1-10. (canceled)

11. A method for operating an operator control device of a motor vehicle, comprising: sensing an operator control gesture of a user and at least one spatial location at which the operator control gesture is carried out in a contactless fashion by a sensing apparatus of the operator control device; controlling, in reaction to said sensing of the operator control gesture, a function of the motor vehicle in dependence on the operator control gesture when the at least one spatial location lies within an interaction space; and determining the interaction space, prior to said sensing of the operator control gesture, by detecting a predetermined determining gesture carried out by the user in at least one location defined as a coordinate of the interaction space, including activating the interaction space determining as soon as a predetermined activation position of hands of the user is detected, sensing a predetermined relative movement of the hands from the predetermined activation position into an end position of the hands as the predetermined determining gesture, defining locations of the hands during the predetermined relative movement as the at least one location, including the locations of the hands in the end position as coordinates of outer boundaries of the interaction space, sensing, as the predetermined relative movement, movement apart of the hands along a first spatial direction from the predetermined activation position, in which the hands are separated by a first distance, into the end position, in which the hands are separated by a second distance larger than the first distance, and defining a first spatial extent of the interaction space as the second distance.

12. The method as claimed in claim 11, wherein the second distance defines a second spatial extent of the interaction space in a second spatial direction oriented perpendicularly with respect to the first spatial direction, and the second distance defines a third spatial extent of the interaction space in a third spatial direction oriented perpendicularly with respect to the first and second spatial directions.

13. The method as claimed in claim 12, wherein contact between surfaces of the hands and/or contact between at least two fingers of a first hand and at least two fingers of a second hand is detected as the predetermined activation position.

14. The method as claimed in claim 11, wherein contact between surfaces of the hands and/or contact between at least two fingers of a first hand and at least two fingers of a second hand is detected as the predetermined activation position.

15. The method as claimed in claim 11, further comprising displaying the predetermined determining gesture figuratively to the user on a display apparatus of the operator control device.

16. The method as claimed in claim 11, further comprising defining a tolerance range directly adjoining the interaction space, and wherein said controlling controls the function of the motor vehicle when the operator control gesture is carried out within at least one of the interaction space and the tolerance range.

17. The method as claimed in claim 11, further comprising determining, when a further determining gesture is sensed, a new interaction space, and wherein said controlling controls the function of the motor vehicle subsequent to determining the new interaction space only when the operator control gesture is carried out in the new interaction space.

18. The method as claimed in claim 11, further comprising: identifying the user to make available a personalized interaction space, by sensing the user carrying out the predetermined determining gesture, and storing the personalized interaction space determined for each user, and wherein said controlling the function of the motor vehicle is performed based on the operator control gesture sensed in the personalized interaction space of the user.

19. An operator control device of a motor vehicle for controlling a function of the motor vehicle, comprising: a sensing apparatus configured to sense an operator control gesture of a user in at least one spatial location; and a control apparatus configured to control the function of the motor vehicle based on the operator control gesture only when the at least one spatial location sensed by the sensing apparatus lies within an interaction space, to detect a predetermined determining gesture, carried out by the user, determining the interaction space, to sense at least one location at which the predetermined determining gesture is carried out, to define the at least one location of the predetermined determining gesture as a coordinate of the interaction space, to activate the determining of the interaction space as soon as a predetermined activation position of hands of the user is detected by the sensing apparatus, to sense a predetermined relative movement of the hands from the predetermined activation position into an end position of the hands as the predetermined determining gesture, and to define locations of the hands during the predetermined relative movement as the at least one location, including the locations of the hands in the end position as coordinates of outer boundaries of the interaction space, the sensing apparatus being configured to sense, as the predetermined relative movement, movement apart of the hands in a first spatial direction from the predetermined activation position, in which the hands are separated by a first distance, into the end position in which the hands are separated by a second distance larger than the first distance and defining a first spatial extent of the interaction space, bounded by the locations of the hands in the end position.

20. The operator control device as claimed in claim 19, wherein the second distance defines a second spatial extent of the interaction space in a second spatial direction oriented perpendicularly with respect to the first spatial direction, and the second distance defines a third spatial extent of the interaction space in a third spatial direction oriented perpendicularly with respect to the first and second spatial directions.

21. The operator control device as claimed in claim 20, wherein contact between surfaces of the hands and/or contact between at least two fingers of a first hand and at least two fingers of a second hand is detected as the predetermined activation position.

22. The operator control device as claimed in claim 19, wherein contact between surfaces of the hands and/or contact between at least two fingers of a first hand and at least two fingers of a second hand is detected as the predetermined activation position.

23. The operator control device as claimed in claim 19, further comprising a display device configured to display the predetermined determining gesture figuratively to the user.

24. The operator control device as claimed in claim 19, wherein the control apparatus is further configured to define a tolerance range directly adjoining the interaction space, and to control the function of the motor vehicle when the operator control gesture is carried out within at least one of the interaction space and the tolerance range.

25. The operator control device as claimed in claim 19, wherein the control apparatus is further configured to determine, when a further determining gesture is sensed, a new interaction space, and to control the function of the motor vehicle subsequent to determining the new interaction space only when the operator control gesture is carried out in the new interaction space.

26. The operator control device as claimed in claim 19, wherein the sensing apparatus is further configured to sense the user carrying out the predetermined determining gesture, and wherein the control apparatus is further configured to store a personalized interaction space associated with each user and to control the function of the motor vehicle based on the operator control gesture sensed in the personalized interaction space of the user.

Description

BRIEF DESCRIPTION OF THE DRAWINGS

[0023] These and other aspects and advantages will become more apparent and more readily appreciated from the description below on the basis of an exemplary embodiment and also with reference to the appended drawings of which:

[0024] FIG. 1 is a schematic side view of a motor vehicle with an embodiment of an operator control device;

[0025] FIG. 2a is a schematic perspective view of an activation position of two hands during a determining gesture; and

[0026] FIG. 2b is a schematic perspective view of an end position of two hands during a determining gesture.

DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENT

[0027] In the figures, identical and functionally identical elements are provided with the same reference symbols.

[0028] In the exemplary embodiment, the described components each constitute individual features which are to be considered independently of one another, which each also develop the invention independently of one another, and which are therefore also to be regarded as a component of the invention, either individually or in a combination other than that shown. Furthermore, further features which have already been described can also be added to the described embodiment.

[0029] FIG. 1 shows a motor vehicle 10 having an operator control device 20 by means of which a user 14 can control a function F of the motor vehicle 10. The user 14 is illustrated here sitting in a passenger compartment 12 of the motor vehicle 10 on a rear seat 16 of the motor vehicle 10, in particular in a comfortable, reclined sitting position. The function F which is to be controlled is here a function of a display apparatus 38, for example in the form of a tablet or a touch-sensitive screen, which is arranged on a backrest of a front seat 18 of the motor vehicle 10 and lies, in particular, outside the reach of the user 14. In other words, the user 14 cannot control the function F of the touch-sensitive screen by touching the touch-sensitive screen. However, the user 14 can control the function F of the motor vehicle 10 in a contactless fashion by means of operator control gestures which the user 14 carries out with his hands 22, 24. In order to sense the operator control gestures of the user 14 and to sense at least one location of the hands 22, 24 of the user 14, the operator control device 20 has a sensing apparatus 26, for example in the form of a so-called time-of-flight camera. In order to avoid undesired incorrect triggering or incorrect control of the function F, a control apparatus 40 of the operator control device 20 is configured to control the function F only when it has been sensed by the sensing apparatus 26 that the operator control gestures of the user 14 have been carried out within a predetermined interaction space 28.
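The gating described in the paragraph above, in which an operator control gesture triggers the function F only when its sensed location lies inside the interaction space, can be sketched as follows. This is purely an illustrative sketch: the `InteractionSpace` class, the axis-aligned-box model, and the `handle_gesture` function are assumptions made for illustration and do not appear in the patent text.

```python
from dataclasses import dataclass
from typing import Callable, Tuple

Point = Tuple[float, float, float]

@dataclass
class InteractionSpace:
    # Outer boundaries of the space in a coordinate system fixed to the cabin.
    min_corner: Point
    max_corner: Point

    def contains(self, point: Point) -> bool:
        # Axis-aligned box test against the stored boundaries.
        return all(lo <= p <= hi
                   for p, lo, hi in zip(point, self.min_corner, self.max_corner))

def handle_gesture(space: InteractionSpace, gesture: str,
                   location: Point, control_fn: Callable[[str], None]) -> bool:
    # Control the function only when the gesture was carried out in the space;
    # otherwise ignore it, avoiding incorrect triggering.
    if space.contains(location):
        control_fn(gesture)
        return True
    return False
```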

[0030] There is provision here that the user 14 can himself define or determine the interaction space 28, in particular a position and dimensions of the interaction space 28 within the passenger compartment 12 of the motor vehicle 10. In this way, the user 14 can determine the interaction space 28, for example as a function of his sitting position, in such a way that operator control gestures for controlling the function F can be carried out easily and comfortably within the interaction space 28. For this purpose, the user 14 carries out a predetermined determining gesture with his hands 22, 24, which gesture is sensed by the sensing apparatus 26 and detected as such. In addition, at least one location of the determining gesture or at least one location of the hands 22, 24 of the user 14 is sensed during the execution of the determining gesture and defined as a coordinate of the interaction space 28, for example by the control apparatus 40 of the operator control device 20.

[0031] In order to initialize the determination of the interaction space 28, the sensing apparatus 26 detects a predetermined activation position 34 of the hands 22, 24 of the user 14. One embodiment of the predetermined activation position 34 is depicted on the basis of the hands 22, 24 illustrated in FIG. 2a. Such an activation position 34 can be assumed, for example, by contact of surfaces 30, 32 of the hands 22, 24 of the user 14. From this activation position 34, the user 14 then moves his hands 22, 24 in accordance with the predetermined relative movement. Movement apart of the hands 22, 24 in a first spatial direction R1, for example in a horizontal spatial direction, can be detected as such a predetermined relative movement by the sensing apparatus 26. The relative movement of the hands 22, 24 is carried out as far as an end position 36 of the hands 22, 24. One embodiment of an end position 36 of the hands 22, 24 is shown on the basis of the hands 22, 24 illustrated in FIG. 2b.
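The activation sequence just described — determination begins once the hands are sensed in contact, and the hands are then tracked as they move apart along the first spatial direction — might look like the following sketch. The contact threshold, the frame format, and the function name are all assumptions made for this sketch, not details from the patent.

```python
CONTACT_THRESHOLD = 0.03  # meters; hands treated as touching below this gap (assumed)
AXIS = 0                  # first spatial direction R1, e.g. horizontal

def detect_determining_gesture(frames):
    """frames: iterable of (left_hand_pos, right_hand_pos) coordinate tuples.
    Returns the most recent hand locations after activation, otherwise None."""
    activated = False
    end_positions = None
    for left, right in frames:
        separation = abs(left[AXIS] - right[AXIS])
        if not activated:
            # Wait for the predetermined activation position: hands in contact.
            activated = separation <= CONTACT_THRESHOLD
        else:
            # Track the hands during the predetermined relative movement; the
            # last tracked pair serves as the end position of the hands.
            end_positions = (left, right)
    return end_positions if activated else None
```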

[0032] In the end position 36 according to FIG. 2b, the hands 22, 24 are at a distance a from one another, which distance a can be freely determined by the user 14. A first spatial extent A1 of the interaction space 28 in the first spatial direction R1 is determined by this distance a. Furthermore, there can be provision that a second spatial extent A2 of the interaction space 28 in a second spatial direction R2, oriented perpendicularly with respect to the first spatial direction R1, and a third spatial extent A3 in a third spatial direction R3, oriented perpendicularly with respect to the first spatial direction R1 and perpendicularly with respect to the second spatial direction R2, are likewise defined by the distance a, for example by the control apparatus 40. By means of the movement apart of the hands 22, 24, a virtual cube is therefore drawn which is determined as a user-specific interaction space 28, for example by the control apparatus 40, and stored, for example in a storage apparatus (not illustrated) of the operator control device 20.
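The cube construction of the paragraph above, in which the distance a supplies all three spatial extents A1 = A2 = A3, could be sketched as follows. Centering the cube on the midpoint between the hands is an added assumption for the sketch; the patent itself fixes only the outer boundaries given by the hand locations.

```python
def cube_from_end_positions(p1, p2, axis=0):
    # Distance a between the hands along the first spatial direction R1
    # determines the first spatial extent A1 of the interaction space.
    a = abs(p1[axis] - p2[axis])
    # Sketch assumption: the cube is centered between the hands, and the
    # extents A2 and A3 also equal a, as described for the virtual cube.
    center = tuple((c1 + c2) / 2 for c1, c2 in zip(p1, p2))
    half = a / 2
    min_corner = tuple(c - half for c in center)
    max_corner = tuple(c + half for c in center)
    return min_corner, max_corner
```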

[0033] Furthermore, the sensing apparatus 26 senses the locations which the hands 22, 24 assume during the execution of the relative movement. FIG. 2b shows, for example, the end locations P1, P2 of the hands 22, 24, wherein the hand 22 assumes the location P1 in the end position 36 of the hands 22, 24, and the hand 24 assumes the location P2 in the end position 36 of the hands 22, 24. The locations P1, P2 are defined here as coordinates of an outer boundary of the interaction space 28. In a coordinate system fixed with respect to the passenger compartment 12 of the motor vehicle 10, the locations P1, P2 of the hands 22, 24 are therefore identical to the coordinates of the outer boundary of the interaction space 28.

[0034] In addition there can be provision that the determining gesture for determining the interaction space 28 is displayed, for example in a film sequence, to the user 14 on the display apparatus 38 of the operator control device 20, for example the tablet which is arranged in the backrest of the front seat 18. The user 14 is therefore provided with visual guidance as to how he can define his personal interaction space 28.

[0035] By means of the determining gesture, the user 14 can therefore determine both the position of the interaction space 28 in the passenger compartment 12 of the motor vehicle 10 and the dimensions of the interaction space 28, that is to say the spatial extents A1, A2, A3. Furthermore, the control apparatus 40 can, for example, define a tolerance range which directly adjoins the interaction space 28, wherein the control apparatus 40 controls the function F even if it has been sensed by the sensing apparatus 26 that the user 14 has carried out the operator control gesture for controlling the function F outside the interaction space 28 but within the adjoining tolerance range.
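The tolerance-range behavior described in this paragraph amounts to testing the gesture location against the interaction space enlarged by a margin on every side. A minimal sketch, in which the margin value and the function name are assumptions for illustration:

```python
def within_with_tolerance(min_corner, max_corner, point, tolerance=0.05):
    # Accept the gesture if the sensed point lies inside the interaction
    # space expanded by the tolerance margin on all sides. The default
    # margin of 0.05 m is an assumed value, not taken from the patent.
    return all(lo - tolerance <= p <= hi + tolerance
               for p, lo, hi in zip(point, min_corner, max_corner))
```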

[0036] A description has been provided with particular reference to preferred embodiments thereof and examples, but it will be understood that variations and modifications can be effected within the spirit and scope of the claims which may include the phrase at least one of A, B and C as an alternative expression that means one or more of A, B and C may be used, contrary to the holding in Superguide v. DIRECTV, 358 F3d 870, 69 USPQ2d 1865 (Fed. Cir. 2004).