Method for operating a mobile terminal using a gesture recognition and control device, gesture recognition and control device, motor vehicle, and an output apparatus that can be worn on the head
12008168 · 2024-06-11
Assignee
Inventors
CPC classification
B60K35/80
PERFORMING OPERATIONS; TRANSPORTING
G06F3/04815
PHYSICS
G06F3/017
PHYSICS
G06F3/011
PHYSICS
B60K2360/569
PERFORMING OPERATIONS; TRANSPORTING
B60K2360/563
PERFORMING OPERATIONS; TRANSPORTING
B60W50/10
PERFORMING OPERATIONS; TRANSPORTING
B60W60/001
PERFORMING OPERATIONS; TRANSPORTING
B60W2540/00
PERFORMING OPERATIONS; TRANSPORTING
B60W2420/403
PERFORMING OPERATIONS; TRANSPORTING
G06F3/0484
PHYSICS
G06T3/40
PHYSICS
B60K35/00
PERFORMING OPERATIONS; TRANSPORTING
B60K35/10
PERFORMING OPERATIONS; TRANSPORTING
B60K2360/577
PERFORMING OPERATIONS; TRANSPORTING
B60K35/28
PERFORMING OPERATIONS; TRANSPORTING
International classification
B60K35/00
PERFORMING OPERATIONS; TRANSPORTING
B60W50/10
PERFORMING OPERATIONS; TRANSPORTING
B60W60/00
PERFORMING OPERATIONS; TRANSPORTING
G06F3/04815
PHYSICS
G06F3/0484
PHYSICS
G06T3/40
PHYSICS
Abstract
A gesture recognition and control device recognizes a mobile terminal and ascertains a current graphical user interface generated by a display device of the mobile terminal. The gesture recognition and control device provides an output signal describing, as display content, the graphical user interface generated by the display device of the mobile terminal, and transmits the output signal to an output apparatus that can be worn on the head for outputting the display content in a predefined output region in the interior of a motor vehicle as part of an augmented reality or virtual reality output by the output apparatus. While the display content is being output, the gesture recognition and control device recognizes a spatial gesture of the user, generates a remote control signal for triggering an operating function of the mobile terminal, and transmits the remote control signal to a control device of the recognized mobile terminal.
Claims
1. A method for operating a mobile terminal using a gesture recognition and control device, the method comprising: recognizing the mobile terminal; ascertaining a graphical user interface as a GUI of the mobile terminal generated to be displayed by a display device of the mobile terminal, the GUI providing an operating option that involves an operating function of the mobile terminal being triggerable; providing an output signal describing the GUI of the mobile terminal as a display content of the mobile terminal; mirroring the display content of the mobile terminal in the form of a virtual display content of the mobile terminal in an output region in an interior of a motor vehicle as part of an augmented reality or a virtual reality, output by a wearable output apparatus, based on the output signal describing the GUI of the mobile terminal as the display content of the mobile terminal; while the virtual display content of the mobile terminal is mirrored in the output region, recognizing a contactless operating gesture of a user based on the operating option of the GUI of the mobile terminal; generating a remote control signal based on the contactless operating gesture, the remote control signal describing triggering of the operating function of the mobile terminal assigned to the contactless operating gesture; and transmitting the remote control signal to a control device of the mobile terminal for triggering the operating function of the mobile terminal.
2. The method as claimed in claim 1, further comprising: recognizing a further contactless operating gesture to predefine a positioning of the virtual display content in the interior; and predefining the output region in the interior based on the further contactless operating gesture.
3. The method as claimed in claim 2, further comprising: providing a further output signal describing a further graphical user interface as a further GUI providing an operating menu for predefining the positioning of the virtual display content.
4. The method as claimed in claim 1, further comprising: recognizing a further contactless operating gesture describing a scaling of the virtual display content; and scaling an image showing the virtual display content based on the further contactless operating gesture.
5. The method as claimed in claim 4, further comprising: providing a further output signal describing a further graphical user interface as a further GUI providing an operating menu for scaling the virtual display content.
6. The method as claimed in claim 1, wherein the method is performed by the gesture recognition and control device based on at least one of an activation of a fully autonomous driving mode and a start signal of the motor vehicle.
7. A gesture recognition and control device, comprising: a memory configured to store instructions; and a processor configured to execute the instructions stored in the memory to: recognize a mobile terminal, ascertain a graphical user interface as a GUI of the mobile terminal generated to be displayed by a display device of the mobile terminal, the GUI providing an operating option that involves an operating function of the mobile terminal being triggerable, provide an output signal describing the GUI of the mobile terminal as a display content of the mobile terminal, mirror the display content of the mobile terminal in the form of a virtual display content of the mobile terminal in an output region in an interior of a motor vehicle as part of an augmented reality or a virtual reality, output by a wearable output apparatus, based on the output signal describing the GUI of the mobile terminal as the display content of the mobile terminal, while the virtual display content of the mobile terminal is mirrored in the output region, recognize a contactless operating gesture of a user based on the operating option of the GUI of the mobile terminal, generate a remote control signal based on the contactless operating gesture, the remote control signal describing triggering of the operating function of the mobile terminal assigned to the contactless operating gesture, and transmit the remote control signal to a control device of the mobile terminal to trigger the operating function of the mobile terminal.
8. The gesture recognition and control device as claimed in claim 7, wherein the processor is configured to execute the instructions stored in the memory to: recognize a further contactless operating gesture to predefine a positioning of the virtual display content in the interior, and predefine the output region in the interior based on the further contactless operating gesture.
9. The gesture recognition and control device as claimed in claim 8, wherein the processor is configured to execute the instructions stored in the memory to: provide a further output signal describing a further graphical user interface as a further GUI providing an operating menu for predefining the positioning of the virtual display content.
10. The gesture recognition and control device as claimed in claim 7, wherein the processor is configured to execute the instructions stored in the memory to: recognize a further contactless operating gesture describing a scaling of the virtual display content, and scale an image showing the virtual display content based on the further contactless operating gesture.
11. The gesture recognition and control device as claimed in claim 10, wherein the processor is configured to execute the instructions stored in the memory to: provide a further output signal describing a further graphical user interface as a further GUI providing an operating menu for scaling the virtual display content.
12. The gesture recognition and control device as claimed in claim 7, wherein the processor is configured to execute the instructions stored in the memory to recognize the mobile terminal based on at least one of an activation of a fully autonomous driving mode and a start signal of the motor vehicle.
13. A motor vehicle comprising the gesture recognition and control device claimed in claim 7.
14. An output apparatus wearable on a head of a user for outputting augmented reality and/or virtual reality, the output apparatus comprising: a display; and a gesture recognition and control device, including: a memory configured to store instructions, and a processor configured to execute the instructions stored in the memory to: recognize a mobile terminal, ascertain a graphical user interface as a GUI of the mobile terminal generated to be displayed by a display device of the mobile terminal, the GUI providing an operating option that involves an operating function of the mobile terminal being triggerable, generate an output signal describing the GUI of the mobile terminal as a display content of the mobile terminal, mirror the display content of the mobile terminal in the form of a virtual display content of the mobile terminal in an output region in an interior of a motor vehicle as part of an augmented reality or a virtual reality, output by the output apparatus, based on the output signal describing the GUI of the mobile terminal as the display content of the mobile terminal, while the virtual display content of the mobile terminal is mirrored in the output region, recognize a contactless operating gesture of a user based on the operating option of the GUI of the mobile terminal, generate a remote control signal based on the contactless operating gesture, the remote control signal describing triggering of the operating function of the mobile terminal assigned to the contactless operating gesture, and transmit the remote control signal to a control device of the mobile terminal to trigger the operating function of the mobile terminal.
15. The output apparatus as claimed in claim 14, further comprising a camera to detect the contactless operating gesture of the user.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
(1) These and other aspects and advantages will become more apparent and more readily appreciated from the following description of the example embodiments, taken in conjunction with the accompanying drawings of which:
DETAILED DESCRIPTION
(7) Reference will now be made in detail to example embodiments which are illustrated in the accompanying drawings, wherein like reference numerals refer to like elements throughout.
(8) The example embodiments described below are examples of the disclosure. In the example embodiments, the described components of the embodiments each constitute individual features which should be considered independently of one another and which each also develop the disclosure independently of one another. Therefore, the disclosure is also intended to encompass combinations of the features of the embodiments other than the combinations presented. Furthermore, the embodiments described can also be supplemented by further features from among those already described.
(9) In the drawings, identical reference signs in each case designate functionally identical elements.
(12) The communication with an output apparatus 18 that can be worn on the head can for example be effected via a wireless data communication connection 20, for example via a WLAN connection, Bluetooth connection or mobile radio connection. Alternatively, the data communication connection 20 can be for example a wired data communication connection 20, for example a cable.
(13) The output apparatus 18 that can be worn on the head can for example be an output apparatus 18 for augmented and/or virtual reality. If the gesture recognition and control device 10 is a component of the motor vehicle 12, then it is possible to use any output apparatus 18 known to a person skilled in the art, for example any known smartglasses.
(14) Alternatively (not shown in the figure), the gesture recognition and control device 10 can be a component of the output apparatus 18 that can be worn on the head.
(15) Communication with a mobile terminal 22, for example a smartphone or a tablet PC, can likewise be effected via a wireless data communication connection 20, or by use of a wired data communication connection 20, for example a data bus of the motor vehicle 12 and/or a cable.
(16) The mobile terminal 22 includes a display device 24, which can for example include a touch-sensitive screen. Via this example touch-sensitive screen, depending on the current graphical user interface, various operating functions of the mobile terminal 22 can be triggered, for example opening a program, switching to a navigation overview with small views of open programs, accepting a telephone call, or playing back a video. In this case, the direct control of the display device 24 is undertaken by a control device 26, for example a control circuit board and/or an application program (app) or an operating system. The control device 26 of the mobile terminal 22 can likewise include a processor device and/or a data memory, these components not being shown in the figure.
(18) Finally, an example of the method, with operations S1 to S12, is described below.
(19) For example, the gesture recognition and control device 10 can also include a plurality of such sensors 30.
(20) The mobile terminal 22 can be recognized (method operation S1) for example as soon as the mobile terminal 22 approaches the motor vehicle 12, for which purpose a Bluetooth LE receiver can be situated on an exterior of the motor vehicle 12, for example. Alternatively, the mobile terminal 22 can be recognized (S1) if it is placed into a charging cradle in the motor vehicle 12, for example, wherein the mobile terminal 22 can be recognized (S1) by use of recognition techniques known to a person skilled in the art.
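The recognition operation S1 described above can be sketched as follows. This is an illustrative sketch only: the device registry, the MAC address, and the RSSI threshold are assumed values, not part of the disclosure.

```python
# Hypothetical sketch of recognizing a paired mobile terminal (S1) as it
# approaches the vehicle, e.g. via a Bluetooth LE receiver reporting RSSI.
# The registry contents and the threshold are illustrative assumptions.

KNOWN_TERMINALS = {"a4:c1:38:10:22:f3": "user_smartphone"}  # paired MAC -> label
RSSI_NEAR_THRESHOLD = -60  # dBm; a stronger (less negative) value means closer

def recognize_terminal(adverts):
    """Return the label of the first known terminal seen close enough.

    `adverts` is an iterable of (mac_address, rssi) tuples, as a BLE
    scanner might deliver them.
    """
    for mac, rssi in adverts:
        if mac in KNOWN_TERMINALS and rssi >= RSSI_NEAR_THRESHOLD:
            return KNOWN_TERMINALS[mac]
    return None

scan = [("ff:ee:dd:00:11:22", -80), ("a4:c1:38:10:22:f3", -52)]
print(recognize_terminal(scan))  # → user_smartphone
```

A charging-cradle variant would simply replace the RSSI check with a wired-connection event.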
(21) The mobile terminal 22 in the example in the figure can be a smartphone.
(22) The graphical user interface currently being displayed by the display device 24 of the recognized mobile terminal 22 can be ascertained (S2), for example by a corresponding output signal of the display device 24 being transmitted to the gesture recognition and control device 10, or by the gesture recognition and control device 10 interrogating the mobile terminal 22 as to what is currently being displayed.
(23) To provide the output signal used for mirroring the graphical user interface of the mobile terminal 22 on the display surface 28 of the output apparatus 18 that can be worn on the head (S3), the output signal can be generated by the gesture recognition and control device 10, for example, or an output signal of the mobile terminal 22 can be forwarded to the output apparatus 18. In the latter example, the providing (S3) also encompasses transmitting (S4) the provided output signal to the example smartglasses.
(24) The predefined output region 32 can be preset, for example.
(25) Techniques for determining whether or not the user has turned his/her head precisely toward the example predefined output region 32 are known to a person skilled in the art in the field of virtual reality and smartglasses.
(27) By way of example, a user interface can be displayed to the user, the user interface displaying various menu items and/or, for example, values of various vehicle functions. If the user would then like to activate one of the possible operating functions displayed, or for example acquire more detailed information concerning one of the vehicle functions, the spatial gesture can, for example, involve the user pointing a finger at a corresponding icon or display element. After the optional process of detecting the contactless operating gesture by way of the sensor 30 (S5), a corresponding sensor signal can be received (S6) by the gesture recognition and control device 10, and the contactless operating gesture can be recognized (S7), for example as a gesture for triggering a video playback function.
(28) Recognizing or tracking a body part by use of infrared sensors, for example, is known to a person skilled in the art as the so-called Leap Motion technique, for example. This technique can be implemented independently of whether the gesture recognition and control device 10 is a component of the motor vehicle 12.
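The pointing recognition described in the preceding paragraphs can be illustrated with a small hit-testing sketch. All names, the output-region geometry, and the element bounds here are hypothetical; a real implementation would work on tracked 3D fingertip coordinates delivered by the sensor 30.

```python
# Illustrative sketch only: mapping a tracked fingertip to an operating
# option on the mirrored user interface. The output region is modeled as
# an axis-aligned rectangle in interior coordinates; the element bounds
# and screen size are assumed values.

REGION = {"x": 0.4, "y": 0.2, "w": 0.3, "h": 0.2}   # metres, interior coords
SCREEN = (1080, 1920)                                # mirrored GUI in pixels

ELEMENTS = {"play_video": (200, 800, 680, 1120)}     # top, left, bottom, right

def fingertip_to_pixels(fx, fy):
    """Normalize a fingertip position inside REGION to GUI pixel coords."""
    u = (fx - REGION["x"]) / REGION["w"]
    v = (fy - REGION["y"]) / REGION["h"]
    return int(v * SCREEN[0]), int(u * SCREEN[1])    # (row, col)

def hit_test(fx, fy):
    """Return the operating option the user points at, if any (S7)."""
    row, col = fingertip_to_pixels(fx, fy)
    for name, (top, left, bottom, right) in ELEMENTS.items():
        if top <= row <= bottom and left <= col <= right:
            return name
    return None
```

Pointing inside the assumed icon bounds yields its operating function; pointing elsewhere yields no match.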
(29) The graphical user interface, or an excerpt therefrom, can be mirrored for example by use of a grabber, i.e. a so-called content grabber or frame grabber, and be inserted into the virtual reality. Optionally, in addition to the inserted user interface, an image of the hand 36 can be inserted, for which purpose, according to a principle similar to the gesture recognition described above, the user's hand 36 can for example be filmed or tracked and then imaged onto the display surface 28 by the gesture recognition and control device 10.
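The frame-grabber idea can be illustrated as follows; pixel frames are modeled as nested lists purely for illustration, whereas a real implementation would copy GPU textures.

```python
# Minimal sketch of the frame-grabber idea: each grabbed frame of the
# terminal's screen is copied into the region of the VR frame reserved
# for the virtual display content.

def composite(vr_frame, grabbed_frame, top, left):
    """Insert the grabbed terminal frame into the VR frame at (top, left)."""
    for r, row in enumerate(grabbed_frame):
        for c, pixel in enumerate(row):
            vr_frame[top + r][left + c] = pixel
    return vr_frame

vr = [[0] * 8 for _ in range(6)]   # stand-in for the HMD framebuffer
phone = [[1, 1], [1, 1]]           # stand-in for one mirrored frame
composite(vr, phone, top=2, left=3)
```

Repeating this per frame keeps the mirrored image 34 synchronized with the terminal's screen.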
(30) By way of example, yet another display element 38 is shown on the display surface 28 of the output apparatus 18 in the example in the figure.
(31) In the example of the figure, a further spatial gesture of the user can predefine a positioning of the image 34, and thus the output region 32, in the interior (S8).
(32) Analogously thereto, a further spatial gesture can scale (S9) the image 34; that is to say, the image 34 can be dragged at the corners and thereby magnified, for example.
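The corner-drag scaling (S9) can be illustrated as a ratio of distances from the image centre; the function and the values are illustrative only.

```python
# Sketch of scaling (S9): dragging a corner of the image 34 rescales it.
# The scale factor is the ratio of the dragged corner's distance from the
# image centre to its original distance.

import math

def corner_drag_scale(center, corner_before, corner_after):
    """Return the scale factor implied by dragging one corner."""
    d0 = math.dist(center, corner_before)
    d1 = math.dist(center, corner_after)
    return d1 / d0

# Dragging a corner outward to twice its distance doubles the image.
print(corner_drag_scale((0.0, 0.0), (1.0, 1.0), (2.0, 2.0)))  # → 2.0
```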
(33) For controlling the mobile terminal 22, the gesture recognition and control device 10 can generate a remote control signal (S10), which can activate for example the operating function for displaying the detailed information concerning an operating parameter or for playing back a video. The generated remote control signal is transmitted (S11) to the control device 26 of the recognized mobile terminal 22 via the data communication connection 20. The mobile terminal 22, for example the control device 26, then triggers the operating function (S12).
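Operations S10 and S11 can be sketched as follows. The JSON message shape, the field names, and the gesture-to-function map are assumptions made for illustration; the patent does not define a wire format for the remote control signal.

```python
# Hypothetical sketch of the remote control signal (S10/S11): a small
# message naming the operating function assigned to the recognized
# gesture, serialized for the data communication connection 20.

import json
import time

GESTURE_TO_FUNCTION = {"point_play_icon": "play_video",
                       "point_call_icon": "accept_call"}   # assumed mapping

def build_remote_control_signal(gesture, function_map):
    """Map a recognized gesture to a remote-control message (S10)."""
    function = function_map.get(gesture)
    if function is None:
        return None                      # no operating option assigned
    return json.dumps({"type": "remote_control",
                       "function": function,
                       "timestamp": time.time()})

msg = build_remote_control_signal("point_play_icon", GESTURE_TO_FUNCTION)
```

Transmitting `msg` over the data communication connection 20 and dispatching it in the control device 26 would then correspond to S11 and S12.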
(34) Optionally, the gesture recognition and control device 10 can be in communication with a motor vehicle system (not shown in the figure).
(37) The mirrored graphical user interface can represent a TV menu, for example, or a desktop of the mobile terminal 22 can be mirrored, or the graphical user interface can show a so-called content streamer.
(41) This system can be combined well with an entertainment system that provides virtual reality. During the example game, the user would therefore not have to exit virtual reality in order to view his/her secondary activities, for example retrieving emails, or to operate his/her motor vehicle 12.
(42) Overall, the examples show how VR and/or AR-based remote operation may be made possible by the methods and apparatuses described herein.
(43) In accordance with a further example embodiment, within virtual reality, the user, regardless of the context (for example gaming and/or meeting), can activate for example a flying HMI or VR HMI, i.e. the mirrored, virtual user interface. This can offer the same functionality as a serial HMI. In this case, the serial HMI and the VR HMI can always be synchronized and show the same display contents. For example, analogously to serial operation (for example touch, slide, pinch), the VR HMI can likewise be operated as usual by use of analogous gestures. Moreover, the user can position the VR HMI, i.e. the mirrored graphical user interface, at a location expedient for the user, i.e. can predefine the output region 32. The system can be implemented very easily both in VR and in AR (augmented reality).
(44) In accordance with a further example embodiment, a technical implementation can provide that, by tracking the hands 36 or one hand 36 of the user by using one or more sensors 30, for example infrared sensors, for example using Leap Motion, and by determining the hand coordinates in relation to the coordinates of the output apparatus 18, i.e. of the HMD, the action space of the user can be detected completely in three dimensions. The image represented in the serial HMI, i.e. the image represented by the display device 24 of the mobile terminal 22, can for example be streamed by use of grabbers into the area of interest, i.e. into the output region 32, which can be freely defined by the user within virtual reality. Actions by a third person on the serial HMI, i.e. for example on the screen of the mobile terminal 22, can therefore be represented in real time in the VR HMI, i.e. in the image 34 of the mirrored user interface. During operation of the corresponding operating function in the VR HMI, the corresponding message can be communicated to a main unit, for example, which performs the actions.
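The coordinate relation described above (hand coordinates expressed relative to the HMD coordinates) can be illustrated with a planar rigid transform; the pose values and the yaw-only rotation are simplifying assumptions, not the disclosed method.

```python
# Sketch of relating tracked hand coordinates to the coordinates of the
# output apparatus 18 (the HMD): a rigid transform (rotation about the
# vertical axis plus an offset), expressed with plain tuples.

import math

def hand_in_hmd_frame(hand_world, hmd_pos, hmd_yaw):
    """Express a world-frame hand point in the HMD's own frame.

    `hmd_yaw` is the HMD's heading in radians about the vertical axis;
    pitch and roll are ignored in this simplified sketch.
    """
    dx = hand_world[0] - hmd_pos[0]
    dy = hand_world[1] - hmd_pos[1]
    c, s = math.cos(-hmd_yaw), math.sin(-hmd_yaw)
    return (c * dx - s * dy, s * dx + c * dy, hand_world[2] - hmd_pos[2])

# Example: a hand at (0, 1, 0) seen from an HMD at the origin with yaw pi/2.
p = hand_in_hmd_frame((0.0, 1.0, 0.0), (0.0, 0.0, 0.0), math.pi / 2)
```

With the hand expressed in the HMD frame, its position can be compared directly against the output region 32 for gesture recognition.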
A description has been provided with reference to various examples, but it will be understood that variations and modifications can be effected within the spirit and scope of the claims which may include the phrase at least one of A, B, and C as an alternative expression that means one or more of A, B, and C may be used, contrary to the holding in Superguide v. DIRECTV, 358 F3d 870, 69 USPQ2d 1865 (Fed. Cir. 2004). That is, the scope of the expression at least one of A, B, and C is intended to include all of the following: (1) at least one of A, (2) at least one of B, (3) at least one of C, (4) at least one of A and at least one of B, (5) at least one of A and at least one of C, (6) at least one of B and at least one of C, and (7) at least one of A, at least one of B, and at least one of C. In addition, the term and/or includes a plurality of combinations of relevant items or any one item among a plurality of relevant items. That is, the scope of the expression or phrase A and/or B includes all of the following: (1) the item A, (2) the item B, and (3) the combination of items A and B.