Vehicle gesture recognition system and method

09959461 · 2018-05-01

Abstract

Embodiments of vehicle gesture recognition systems and methods are disclosed. An example vehicle gesture recognition system comprises a data interface configured for receiving 2d image data from a 2d sensor and/or from a portable device camera via a portable device interface. Additionally or alternatively, the data interface is configured for receiving gesture data indicating a gesture. A vehicle processing unit is configured for controlling user interfacing with a user interface based on the gestures recognized from the 2d image data and/or as indicated by the gesture data.

Claims

1. A vehicle gesture recognition system, comprising: a data interface configured for receiving gesture data, the gesture data including an indicator indicating a gesture from a predefined set of gestures selected for use with a vehicle processing unit, the gesture data including gestures detected based on image data from a 2d sensor of a portable device, the detected gestures restricted to the predefined set of gestures, wherein the indicator indicates the detected gestures indirectly, and wherein fields of view of the 2d sensor of the portable device and at least one other 2d sensor overlap in a common region located in free space of a passenger compartment between a middle console and a center console; a portable device interface being configured for: establishing a data connection between the portable device and the data interface, wherein the data interface is further configured for receiving the gesture data via the data connection from a portable device application executed on the portable device; and the vehicle processing unit configured for controlling user interfacing with a user interface based on the gesture indicated by the gesture data.

2. The vehicle gesture recognition system of claim 1, wherein the indicator indicates the gesture indirectly by use of a code and a lookup table, the lookup table corresponding to the predefined set of gestures.

3. The vehicle gesture recognition system of claim 1, wherein the indicator indicates the gesture by specifying a start point of a linear movement associated with the gesture.

4. The vehicle gesture recognition system of claim 1, wherein the indicator indicates the gesture by specifying an end point of a linear movement associated with the gesture.

5. The vehicle gesture recognition system of claim 1, wherein the indicator indicates the gesture by specifying a series of trace points of a movement associated with the gesture.

6. The vehicle gesture recognition system of claim 5, wherein the series of trace points relates to 2D or 3D coordinates of the movement in space.

7. The vehicle gesture recognition system of claim 1, wherein the indicator indicates a velocity of a movement associated with the gesture.

8. The vehicle gesture recognition system of claim 1, wherein the at least one other 2d sensor is physically integrated with a vehicle head unit in a vehicle.

9. The vehicle gesture recognition system of claim 1, wherein the gesture data further includes a three-dimensional movement reconstructed from the image data from the 2d sensor of the portable device and data received from the at least one other 2d sensor.

10. A method of gesture recognition in a vehicle, comprising: establishing a data connection between a portable device and a portable device interface; receiving gesture data including an indicator indicating a gesture from a predefined set of gestures selected for use with a vehicle processing unit via the data connection from a portable device application executed on the portable device, the gesture data including gestures detected based on image data from a 2d sensor of the portable device, the detected gestures restricted to the predefined set of gestures, wherein the indicator indicates the detected gestures indirectly, and wherein a field of view of the 2d sensor of the portable device and a field of view of one other 2d sensor overlap in a common region located in free space of a passenger compartment between a middle console and a center console; and in the vehicle processing unit, controlling a user interface based on the gesture indicated by the gesture data.

11. The method of claim 10, wherein the indicator indicates the gesture indirectly by use of a code and a lookup table, the lookup table corresponding to the predefined set of gestures.

12. The method of claim 10, further comprising identifying the gesture by the indicator, the indicator specifying a start point of a linear movement associated with the gesture.

13. The method of claim 10, further comprising identifying the gesture by the indicator, the indicator specifying an end point of a linear movement associated with the gesture.

14. The method of claim 10, further comprising identifying the gesture by the indicator, the indicator specifying a series of trace points of a movement associated with the gesture.

15. The method of claim 14, wherein the series of trace points relates to 2D or 3D coordinates of the movement in space.

16. The method of claim 10, wherein the indicator indicates a velocity of a movement associated with the gesture.

17. The method of claim 10, further comprising determining a predetermined position of the portable device based on the image data received from the 2d sensor of the portable device and further based on data received from the one other 2d sensor.

18. A vehicle head unit of a vehicle, the vehicle head unit comprising: a data interface configured for receiving gesture data, the gesture data including an indicator indicating a gesture from a predefined set of gestures selected for use with a vehicle processing unit, the gesture data including gestures detected based on image data from a 2d sensor of a portable device, the detected gestures restricted to the predefined set of gestures, wherein the indicator indicates the detected gestures indirectly, and wherein fields of view of the 2d sensor of the portable device and at least one other 2d sensor overlap in a common region located in free space of a passenger compartment between a middle console of the vehicle and a center console of the vehicle; a portable device interface being configured for: establishing a data connection between the portable device and the data interface, wherein the data interface is further configured for receiving the gesture data via the data connection from a portable device application executed on the portable device; and the vehicle processing unit configured for controlling user interfacing with a user interface of the vehicle based on the gesture indicated by the gesture data.

19. The vehicle head unit of claim 18, wherein the indicator indicates the gesture by specifying one or more of a start point, an end point, a series of trace points, and a velocity of a movement associated with the gesture.

20. The vehicle head unit of claim 18, wherein the gestures are detected based on the image data from the 2d sensor of the portable device and the at least one other 2d sensor by redundant and multiple recognizing of the detected gestures from the image data from both the 2d sensor of the portable device and the at least one other 2d sensor in a correlated manner.

Description

BRIEF DESCRIPTION OF THE DRAWINGS

(1) In the following, the disclosure will be explained in further detail with respect to embodiments illustrated in the accompanying drawings.

(2) FIG. 1 is a schematic illustration of a vehicle gesture recognition system according to various embodiments;

(3) FIG. 2 illustrates a gesture;

(4) FIG. 3 illustrates gesture data including indicators indicating at least one specific gesture from a predefined set of gestures;

(5) FIG. 4 is a perspective view of a vehicle center console including the vehicle gesture recognition system of FIG. 1;

(6) FIG. 5 illustrates fields of view of a 2d sensor and a cell phone camera for the system of FIG. 4; and

(7) FIG. 6 is a flowchart of a method of gesture recognition in a vehicle according to various embodiments.

DETAILED DESCRIPTION

(8) The foregoing and additional features and effects of the disclosure will become apparent from the following detailed description when read in conjunction with the accompanying drawings, in which like reference numerals refer to like elements. The drawings are to be regarded as being schematic representations, and elements illustrated in the drawings are not necessarily shown to scale. Rather, the various elements are represented such that their function and general purpose become apparent to a person skilled in the art. Any connection or coupling between functional blocks, devices, components or other physical or functional units shown in the drawings or described herein may also be implemented by an indirect connection or coupling. A coupling between components may also be established over a wireless connection. Functional blocks may be implemented in hardware, firmware, software or a combination thereof.

(9) In FIG. 1, a vehicle gesture recognition system 100 is schematically illustrated. The vehicle gesture recognition system 100 comprises a vehicle processing unit 120 coupled to interface elements 122 and a display 150. For example, the interface elements 122, sometimes referred to as control elements, can comprise a touch panel, buttons, rotary push buttons, etc. The vehicle processing unit 120 is configured to provide a GUI on the display 150 for user interfacing.

(10) Furthermore, the vehicle processing unit 120 is coupled to a data interface 110. The data interface 110 can be configured to receive 2d image data from at least one image source covering a field of view within the vehicle; additionally or alternatively, the data interface 110 can be configured to receive gesture data indicating a gesture. For example, when the data interface 110 is configured for receiving 2d image data, the vehicle processing unit 120 can recognize a gesture from the received 2d image data.

(11) Turning to FIG. 2, a gesture 300 is illustrated. For example, the depicted hand of a user may be moved in a horizontal or vertical wiping motion, or may be turned. These movements may correspond to different gestures.

(12) In FIG. 3, gesture data 310 is schematically illustrated. The gesture data 310 comprises two indicators 311, each indicating a particular gesture 300 in a parameterized manner. For example, the gesture labeled 7 may correspond to a wiping gesture from left to right, while the gesture labeled 3 may correspond to a turning of the hand. The gestures 300 discussed with respect to FIG. 3 are merely illustrative and are not to be construed as limiting.
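
The indirect indication of a gesture by a code and a lookup table (cf. claims 2 and 11) can be sketched as follows. This is an illustrative snippet only: the code values 3 and 7 follow the example above, while the gesture names and function are assumptions, not part of the patent.

```python
# Hypothetical lookup table corresponding to the predefined set of gestures.
# Codes 3 and 7 follow the example in paragraph (12); names are assumed.
GESTURE_LOOKUP = {
    3: "turn_hand",
    7: "wipe_left_to_right",
}

def decode_gesture(indicator):
    """Resolve an indicator 311 code to a gesture from the predefined set."""
    gesture = GESTURE_LOOKUP.get(indicator)
    if gesture is None:
        # Detected gestures are restricted to the predefined set, so an
        # unknown code is rejected rather than guessed.
        raise ValueError("indicator %r is not in the predefined gesture set" % indicator)
    return gesture
```

Restricting the table to the predefined set means the vehicle side never has to interpret raw image data, only the compact code.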

(13) Turning back to FIG. 1, by recognizing the gesture 300 from the 2d image data and/or by obtaining the pre-recognized gesture 300 from the indicators 311 of the gesture data 310, a certain gesture 300 is provided to the vehicle processing unit 120. Based on this gesture 300, the vehicle processing unit 120 is configured for controlling user interfacing with the GUI displayed on the display 150. As a specific example, it is possible that a wiping gesture from left to right moves a cursor of the GUI. Likewise, it is possible that a pushing gesture executes a certain command or menu entry of the GUI at which the cursor is currently positioned.
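
A minimal sketch of the gesture-to-GUI dispatch just described might look as follows; the gesture names and the command format are illustrative assumptions, not identifiers from the patent.

```python
def handle_gesture(gesture, cursor, n_items):
    """Map a recognized gesture to a GUI update.

    Returns (new cursor position, command to execute or None).
    Gesture names and the command format are hypothetical."""
    if gesture == "wipe_left_to_right":
        return min(cursor + 1, n_items - 1), None   # move cursor right
    if gesture == "wipe_right_to_left":
        return max(cursor - 1, 0), None             # move cursor left
    if gesture == "push":
        # execute the menu entry at the current cursor position
        return cursor, "execute_menu_entry_%d" % cursor
    return cursor, None  # unrecognized gestures leave the GUI unchanged
```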

(14) As will be appreciated from the above, the gesture control of the GUI can be based on either 2d image data of the gesture 300, or the gesture 300 indicated by the gesture data 310, or a combination thereof. Scenarios where the 2d image data is employed will be discussed next.

(15) For example, the data interface 110 may receive at least parts of the 2d image data from a 2d sensor 140. For example, the 2d sensor 140 can be an infrared sensor, a 2d optical camera, a capacitive sensor, a magnetic field sensor, or an ultrasonic sensor. It is possible to provide a plurality of 2d sensors 140 which can each contribute to providing the 2d image data. In such a scenario, each of the plurality of 2d sensors 140 may provide a part of the 2d image data.

(16) In particular, compared to scenarios where 3d sensors and 3d image data are provided, said recognizing of the gesture from the received 2d image data may be comparably simple. This may reduce the computational effort required for said recognizing in the vehicle processing unit 120. At the same time, it is possible to provide reliable recognizing of the gesture 300, e.g., by providing a plurality of 2d sensors 140. Redundant and multiple recognizing of the gesture 300 may then be possible, e.g., if the respective fields of view of the plurality of 2d sensors 140 overlap and the gesture is executed in the common region.

(17) In further scenarios, a cell phone interface 130 is configured for establishing a data connection between a cell phone (not shown in FIG. 1) and the data interface 110. In general, a connection may be established between the portable device interface 130 and any portable device, e.g., a touch pad, laptop, gaming console, webcam, cell phone, etc. The data interface 110 is configured to receive at least parts of the 2d image data via the data connection of the cell phone interface 130 from a 2d camera of the cell phone. In other words, it is possible to employ the 2d camera of the cell phone as an image source in order to acquire and receive the 2d image data.

(18) It is possible to receive the 2d image data via the cell phone interface 130 alternatively or additionally to 2d image data received from at least one 2d sensor 140. In other words, in various examples it may be dispensable to provide the 2d sensor 140; in other scenarios, it may be dispensable to provide the cell phone interface 130.

(19) Additionally or alternatively to the receiving of at least parts of the 2d image data via the cell phone interface 130, the cell phone interface 130 can be configured to receive the gesture data 310. This will be explained in more detail below.

(20) In one scenario, the gesture data 310 relates to a gesture 300 executed within the field of view of a cell phone camera. An application executed on the cell phone can recognize this free-space gesture 300, e.g. a wipe left-to-right or turning of the hand, and determine the respective gesture data 310 for the recognized gesture 300.

(21) In a further scenario, the gesture data 310 relates to a gesture 300 executed on a touch panel of the cell phone. Cell phones may comprise a touch-sensitive display as the touch panel. Typically, capacitive sensors are employed for this purpose. An application executed on the cell phone can recognize this touch gesture 300 and determine the respective gesture data 310 for the recognized gesture 300. It is possible to obtain the gesture data from both a cell phone touch panel and a cell phone camera.

(22) In other words, the cell phone interface 130 can be configured to receive 2d image data and/or gesture data 310. Depending on the particular application, i.e., whether it is desired to recognize free-space gestures 300 executed in a field of view of the cell phone camera and/or touch gestures 300 executed on a touch panel of the cell phone, either one of the scenarios or both scenarios may be implemented.
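
On the portable-device side, the application's output in either scenario is a small gesture-data record rather than raw images. The sketch below shows one plausible serialization; the field names, the JSON encoding, and the 'camera'/'touch' source tags are all assumptions for illustration.

```python
import json

def package_gesture_data(indicator, source, trace=None):
    """Serialize a recognized gesture into gesture data 310 for transmission
    over the data connection to the vehicle's data interface 110.

    indicator: code from the predefined set of gestures.
    source: hypothetical tag, 'camera' for a free-space gesture seen by the
            phone camera, 'touch' for a gesture on the touch panel.
    trace: optional series of trace points of the movement."""
    payload = {"indicator": indicator, "source": source}
    if trace is not None:
        payload["trace"] = trace
    return json.dumps(payload).encode("utf-8")
```

Sending only the compact record keeps the bandwidth and vehicle-side processing needs low, which is the point of pre-recognizing gestures on the phone.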

(23) The units illustrated in FIG. 1 can be implemented using hardware and/or software where applicable. For example, it is possible to implement the data interface 110 and/or the cell phone interface 130 primarily as software executed on the vehicle processing unit 120. This may be particularly true if wireless technology is applied for the data connection with the cell phone. Antennas for the wireless connection may be provided.

(24) Turning to FIG. 4, a perspective view of a center console and middle console of a vehicle is depicted. Visible in FIG. 4 are the display 150, the 2d sensor 140, the interface elements 122, and the cell phone interface 130. In the scenario depicted in FIG. 4, the vehicle processing unit 120 is associated with a vehicle head unit comprising a housing where one outer surface is a cover 121 shielding an interior of the vehicle head unit from the passenger compartment of the vehicle. The interface elements 122, as well as the display 150 and the 2d sensor 140, are located on the cover 121. In other words, the 2d sensor 140 is fully integrated with the vehicle head unit, which makes the coupling and data communication between the 2d sensor 140 and the data interface 110 comparably simple.

(25) As can be further seen from FIG. 4, the cell phone interface 130 is configured to releasably mount the cell phone in a predetermined position, namely, in the scenario of FIG. 4, in the middle console between driver and co-driver. The cell phone interface 130 is arranged such that the mounted cell phone faces approximately along the vertical direction. The cell phone interface 130 further comprises locking means which engage with side surfaces of the cell phone in order to secure the cell phone in the predetermined position. In the scenario of FIG. 4, it is possible to obtain 2d image data both from the 2d camera of the cell phone mounted to the cell phone interface 130 and from the 2d sensor 140 located on the cover 121 of the vehicle head unit.

(26) Turning to FIG. 5, the fields of view 160, 161 of the cell phone camera and the 2d sensor 140, respectively, are graphically indicated. As can be seen from FIG. 5, the cell phone interface 130 is configured for releasably mounting the cell phone such that the fields of view 160, 161 of the at least one 2d sensor and the cell phone camera overlap. The field of view 160 of the cell phone camera, i.e., as defined by the cell phone interface 130, is oriented approximately vertically. The field of view 161 of the 2d sensor 140 is oriented approximately horizontally. In particular, a common region of the fields of view 160, 161 is located in the free space between the middle and center consoles. If the user executes the gesture 300 in this common region, the 2d image data received from both the cell phone interface 130 and the 2d sensor 140 depicts the gesture 300.
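
A crude way to model this overlap is to approximate each field of view as an axis-aligned box and test whether a gesture point lies in both; real fields of view are frusta, so this is a simplifying assumption, and the coordinates below are made up for illustration.

```python
def in_box(point, box):
    """True if a 3d point lies inside an axis-aligned box.

    box: ((x_lo, x_hi), (y_lo, y_hi), (z_lo, z_hi))."""
    return all(lo <= c <= hi for c, (lo, hi) in zip(point, box))

def in_common_region(point, phone_fov, sensor_fov):
    """A gesture is seen by both image sources only when executed inside
    the overlap of fields of view 160 and 161 (here modeled as boxes)."""
    return in_box(point, phone_fov) and in_box(point, sensor_fov)
```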

(27) The fields of view 160, 161 enclose an angle of approximately 90°, i.e., they offer substantially different perspectives onto the common region. This may facilitate said recognizing of the gesture. Complementary 2d image data may be available, allowing a wide variety of gestures 300 to be recognized unambiguously.

(28) The vehicle processing unit 120 can then recognize the gesture 300 individually from both 2d image data sets, or can use knowledge about the relative positioning of the 2d sensor 140 with respect to the cell phone camera in order to recognize the gesture 300 in a correlated manner. In other words, the vehicle processing unit 120 can be configured to recognize the gesture 300 taking into account the predetermined position of the cell phone and/or the predetermined position of the 2d sensor 140. For example, this can occur as part of a stereoscopic reconstruction which allows 3d position information of the gesture 300 executed within the fields of view 160, 161 to be determined. Respective techniques are known to the skilled person, such that further details need not be discussed in this context.
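
The intuition behind the reconstruction can be reduced to a toy case: if the two views are calibrated, roughly perpendicular, and axis-aligned, the upward-facing phone camera resolves the horizontal coordinates while the horizontally-facing head-unit sensor resolves depth. This sketch assumes exactly that idealized geometry; a real system would use full camera projection models and triangulation.

```python
def reconstruct_3d(phone_xy, sensor_xz):
    """Combine two perpendicular, axis-aligned 2d views into a 3d point.

    phone_xy:  (x, y) of the hand as seen by the upward-facing phone camera.
    sensor_xz: (x, z) of the hand as seen by the horizontal head-unit sensor.
    The x coordinate appears in both views and is averaged."""
    x1, y = phone_xy
    x2, z = sensor_xz
    return ((x1 + x2) / 2.0, y, z)
```

Tracking such reconstructed points over time yields the 3d trajectory of the movement associated with the gesture 300.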

(29) In the scenario of FIGS. 4 and 5, the cell phone interface 130 is shown in a fixed location on the middle console of the vehicle. Yet, it should be understood that in various scenarios it is possible that the cell phone interface 130 is freely movable throughout the interior of the vehicle. For example, this can be the case when the data connection between the cell phone interface 130 and the cell phone is established in a wireless manner and/or if the cell phone interface 130 is implemented via software. In such a scenario it may be possible that the vehicle processing unit 120 is further configured for determining the predetermined position of the cell phone (and therefore of the cell phone interface 130) based on parts of the 2d image data received from the 2d camera of the cell phone and further based on parts of the 2d image data received from the at least one 2d sensor. For example, this may occur as part of a correlated gesture recognition where, based on a stereoscopic reconstruction, e.g. of a reference gesture 300, the camera position of the cell phone camera is determined.

(30) In FIG. 6 a flowchart of a method of gesture recognition is shown. The method starts with step S1.

(31) In step S2, the gesture data 310 is received. Step S2 is an optional step. As discussed previously with respect to FIG. 1, in various scenarios no gesture data 310 is received; alternatively or additionally to step S2, it is possible to receive 2d image data (step S3).

(32) For example, the 2d image data in step S3 can be received from the 2d sensor 140 and/or the cell phone camera via the cell phone interface 130.

(33) In step S4, it is checked whether 2d image data has been received in step S3. If so, in step S5, the gesture 300 is recognized from the 2d image data. Otherwise, the method proceeds directly to step S6.

(34) In step S6, the vehicle processing unit 120 controls the GUI displayed on the display 150 based on the gesture 300 which was recognized from the 2d image data in step S5 and/or indicated by the gesture data 310 received in step S2.

(35) The method ends in step S7. It should be understood that the particular order of the steps may vary. E.g., it is possible to receive the gesture data of step S2 only after step S5 has been executed.
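
The flow of FIG. 6 can be sketched as straight-line code. The stand-in functions below are placeholders for the recognition of step S5 and the GUI control of step S6; their names, signatures, and return values are assumptions made for illustration.

```python
def recognize(image_data):
    """Stand-in for gesture recognition from 2d image data (step S5)."""
    return image_data.get("gesture")

def control_gui(gesture):
    """Stand-in for step S6: derive the GUI action for a gesture."""
    return None if gesture is None else "gui_action_for_%s" % gesture

def gesture_recognition_method(gesture_data=None, image_data=None):
    """Steps S1-S7 of FIG. 6 as a single pass (illustrative sketch).

    Both inputs are optional, mirroring the optional steps S2 and S3."""
    gesture = None
    if image_data is not None:               # S4: was 2d image data received?
        gesture = recognize(image_data)      # S5: recognize from image data
    elif gesture_data is not None:           # otherwise fall back to S2 data
        gesture = gesture_data["indicator"]
    return control_gui(gesture)              # S6: control the GUI; S7: done
```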

(36) As can be seen from the above, the vehicle head unit may rely on various input channels for receiving either pre-recognized gesture data and/or 2d image data which can be used for gesture recognition. The 2d image data may originate from various image sources, e.g. cameras, touch pads, etc.

(37) Although the disclosure has been shown and described with respect to certain preferred embodiments, equivalents and modifications will occur to others skilled in the art upon the reading and understanding of the specification. The present disclosure includes all such equivalents and modifications. For example, while the disclosure has been described predominantly with respect to vehicles such as passenger cars, it should be understood that vehicles such as airplanes, trains, trucks, etc. may also employ techniques described herein.