CAPTURING APPARATUS FOR RECOGNIZING A GESTURE AND/OR A VIEWING DIRECTION OF AN OCCUPANT OF A MOTOR VEHICLE BY SYNCHRONOUS ACTUATION OF LIGHTING UNITS, OPERATING ARRANGEMENT, MOTOR VEHICLE AND METHOD

20170323165 · 2017-11-09

Abstract

The invention relates to a capturing apparatus (3) for recognizing a gesture and/or a viewing direction of an occupant (13) of a motor vehicle (1), comprising a first sensor device (5) and comprising at least one second sensor device (6), wherein each of the sensor devices (5, 6) respectively has a lighting unit (7) for emitting light (11), a receiving unit (8) for receiving the light (12) reflected by the occupant (13) and a computer unit (9) for actuating the lighting unit (7) and for recognizing the gesture and/or the viewing direction on the basis of the reflected light (12), wherein the computer units (9) of the sensor devices (5, 6) are designed to actuate the lighting units (7) synchronously as a function of a synchronization signal.

Claims

1. A capturing apparatus for recognizing a gesture and/or a viewing direction of an occupant of a motor vehicle, comprising: a first sensor device; and at least one second sensor device, wherein each of the sensor devices respectively has a lighting unit for emitting light, a receiving unit for receiving the light reflected by the occupant, and a computer unit for actuating the lighting unit and for recognizing the gesture and/or the viewing direction on the basis of the reflected light, wherein the computer units of the sensor devices are designed to actuate the lighting units synchronously as a function of a synchronization signal.

2. The capturing apparatus according to claim 1, wherein the computer unit of the first sensor device is designed to provide the synchronization signal and transfer the latter to the computer unit of the second sensor device.

3. The capturing apparatus according to claim 1, wherein the computer units of the first sensor device and the at least one second sensor device are linked via a data line for transferring the synchronization signal.

4. The capturing apparatus according to claim 3, wherein the data line is a CAN bus of the motor vehicle.

5. The capturing apparatus according to claim 1, wherein the first sensor device and/or the at least one second sensor device is designed to determine a distance to at least a part of the occupant on the basis of a time-of-flight of the emitted light and of the reflected light.

6. The capturing apparatus according to claim 1, wherein the first sensor device and/or the at least one second sensor device is configured to capture an image of at least a part of the occupant on the basis of the reflected light.

7. The capturing apparatus according to claim 1, wherein the computer unit of the first sensor device and the computer unit of the at least one second sensor device alternately actuate the lighting units when capturing the gesture by means of the first sensor device and the at least one second sensor device.

8. The capturing apparatus according to claim 1, wherein the computer unit of the first sensor device and the computer unit of the at least one second sensor device simultaneously actuate the lighting units when capturing the viewing direction with the first sensor device and the at least one second sensor device.

9. An operating arrangement for a motor vehicle comprising: a capturing apparatus according to claim 1; a functional device actuatable as a function of the gesture and/or viewing direction recognized by the capturing apparatus.

10. A motor vehicle comprising an operating arrangement according to claim 9.

11. A method for recognizing a gesture and/or a viewing direction of an occupant of a motor vehicle, in which a first sensor device and at least one second sensor device are provided, the method comprising: in each one of the sensor devices, actuating a lighting unit for emitting light by a computer unit; receiving the light reflected by the occupant by a receiving unit; and recognizing the gesture and/or the viewing direction by the computer unit on the basis of the reflected light, wherein the lighting units are actuated synchronously by the computer units of the sensor devices as a function of a synchronization signal.

Description

[0027] In the figures:

[0028] FIG. 1 shows a schematic illustration of a motor vehicle in accordance with one embodiment of the present invention;

[0029] FIG. 2 shows a capturing apparatus of the motor vehicle, by means of which a gesture and/or a viewing direction of an occupant is captured; and

[0030] FIG. 3 shows the capturing apparatus in accordance with FIG. 2, which comprises a first sensor device and a second sensor device.

[0031] In the figures, equivalent or functionally equivalent elements are provided with the same reference signs.

[0032] FIG. 1 shows a schematic illustration of a motor vehicle 1 in accordance with one embodiment of the present invention. In the present case, the motor vehicle 1 is embodied as a passenger motor vehicle. The motor vehicle 1 comprises an operating arrangement 2. The operating arrangement 2 in turn comprises a capturing apparatus 3. Using the capturing apparatus 3, it is possible as explained in more detail below to capture a gesture and/or a viewing direction of an occupant 13 of the motor vehicle 1.

[0033] Depending on the captured gesture and/or the captured viewing direction, it is possible to transfer a corresponding operating signal from the capturing apparatus 3 to a functional device 4 of the motor vehicle 1.

[0034] By way of example, the functional device 4 of the motor vehicle 1 may be a navigation system, an infotainment system, an air conditioning unit or the like. The functional device 4 may also be an appropriate actuator for opening and/or closing the windows, an actuator for adjusting the external mirrors, an actuator for opening and/or closing a sliding roof or a soft top, an actuator for adjusting the seats or the like. The functional device may also be part of a driver assistance system of the motor vehicle 1.

[0035] FIG. 2 shows a schematic illustration of an embodiment of the capturing apparatus 3. In the present exemplary embodiment, the capturing apparatus 3 comprises a first sensor device 5 and a second sensor device 6. Provision may also be made for the capturing apparatus 3 to comprise more than two sensor devices 5, 6. The sensor devices 5, 6 may be arranged distributed in the interior of the motor vehicle 1. Each of the sensor devices 5, 6 comprises a lighting unit 7, by means of which it is possible to emit light. By way of example, the lighting unit 7 may be embodied to emit light in the visible wavelength range or light in the infrared wavelength range. The emitted light 11 is reflected by a part of the occupant 13. The reflected light 12 reaches a receiving unit 8 of the respective sensor device 5, 6.

[0036] One or both of the sensor devices 5, 6 may be embodied as cameras which, depending on the reflected light 12, may capture an image of at least a part of the occupant 13. One or both of the sensor devices 5, 6 may be embodied as so-called 3D cameras or TOF cameras. Using these, it is possible to recognize the spatial orientation of a part of the occupant 13 on the basis of the reflected light 12. Hence, it is possible, for example, to recognize a gesture carried out by a hand 15 of the occupant 13. Furthermore, it is possible, for example, to identify an orientation, an inclination and/or a rotation of the head 14 of the occupant 13. The sensor devices 5, 6 may also be configured to recognize a viewing direction of the occupant 13, for example on the basis of the position of the eyes of the occupant 13. In the present case, this is depicted in an exemplary manner by arrow 16.
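The time-of-flight principle mentioned above can be illustrated with a minimal sketch: the distance to the reflecting part of the occupant follows from the round-trip time of the emitted light pulse. The function name and example values below are illustrative and not taken from the specification.

```python
# Minimal illustration of the time-of-flight (TOF) distance principle used by
# 3D/TOF cameras: distance = (speed of light * round-trip time) / 2.

SPEED_OF_LIGHT_M_PER_S = 299_792_458.0

def distance_from_time_of_flight(round_trip_seconds: float) -> float:
    """Return the distance in metres to the reflecting surface,
    given the measured round-trip time of the emitted light pulse.
    The factor 1/2 accounts for the light travelling to the surface
    and back to the receiving unit."""
    return SPEED_OF_LIGHT_M_PER_S * round_trip_seconds / 2.0

# A pulse returning after about 4 nanoseconds corresponds to roughly 0.6 m,
# a plausible distance between a dashboard-mounted camera and a hand.
d = distance_from_time_of_flight(4e-9)
```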

[0037] Each of the sensor devices 5, 6 comprises a computer unit 9. By way of example, it may be formed by an appropriate processor, by an integrated circuit or by a so-called FPGA (field programmable gate array). The respective computer units 9 serve to actuate the lighting units 7 of the sensor devices 5, 6. If the respective lighting units 7 are actuated, the latter emit the light 11. Moreover, the computer units 9 are designed to recognize the gesture or the viewing direction of the occupant 13 on the basis of the reflected light 12. To this end, it is possible, for example, to carry out appropriate image processing, on the basis of which gestures and/or viewing direction are identified.

[0038] The sensor devices 5, 6 are linked by way of a data line 10 for data transfer. In particular, the computer units 9 of the respective sensor devices 5, 6 are linked by the data line 10. By way of example, the data line 10 may be formed by a data bus of the motor vehicle 1, for example the CAN bus. By way of the data bus, it is possible to transfer a corresponding synchronization signal, as a function of which the respective computer units 9 synchronously actuate the lighting units 7.
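The synchronization described in this paragraph can be modelled in a short sketch: a frame broadcast on the shared data line reaches every computer unit, and each unit actuates its lighting unit as a function of that frame. The frame format and class names are illustrative assumptions, not part of the specification.

```python
# Toy model of synchronization over a shared data line: one broadcast frame,
# all sensor devices react to it, so the lighting units operate in lockstep.

class DataLine:
    """Stand-in for the vehicle data bus (e.g. a CAN bus)."""
    def __init__(self):
        self.subscribers = []

    def broadcast(self, frame):
        # Every subscriber on the bus receives every frame.
        for callback in self.subscribers:
            callback(frame)

class SensorDevice:
    def __init__(self, name, line):
        self.name = name
        self.flash_times = []  # records when the lighting unit was actuated
        line.subscribers.append(self.on_frame)

    def on_frame(self, frame):
        # Actuate the lighting unit as a function of the sync signal.
        if frame.get("type") == "sync":
            self.flash_times.append(frame["timestamp"])

line = DataLine()
first = SensorDevice("first", line)
second = SensorDevice("second", line)
line.broadcast({"type": "sync", "timestamp": 0})
# Both devices actuate their lighting units on the same sync frame.
```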

[0039] FIG. 3 shows the capturing apparatus 3 in a further embodiment. Here, it can be seen that each of the sensor devices 5, 6 has a corresponding sensor board 17, on which the computer units 9 are arranged. Moreover, a corresponding processor 18 is provided. Further, each of the sensor devices 5, 6 has a communication interface 19, said communication interfaces being linked to the data line 10. Furthermore, a direct link between the communication interface 19 and the computer unit 9 is provided by way of a data line 20. When the capturing apparatus 3 is started up, corresponding data frames can be transferred via the data line 10, in particular the CAN bus. Here, it is possible, for example, to define the computer unit 9 of the first sensor device 5 as a master. The computer unit 9 of the second sensor device 6 may be defined as a slave.
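The start-up role assignment described above can be sketched as follows. The specification only states that one computer unit is defined as master and another as slave via data frames; the rule used here (lowest device identifier wins) is an illustrative assumption.

```python
# Hypothetical sketch of the start-up role assignment: one computer unit
# becomes the master (it will later provide the synchronization signal),
# all others become slaves. The lowest-id rule is an assumption for the
# sake of the example, not taken from the specification.

from dataclasses import dataclass

@dataclass
class ComputerUnit:
    device_id: int
    role: str = "undefined"

def assign_roles(units):
    """Designate the unit with the lowest device id as master and all
    remaining units as slaves, mirroring a start-up frame exchange."""
    for i, unit in enumerate(sorted(units, key=lambda u: u.device_id)):
        unit.role = "master" if i == 0 else "slave"
    return units

units = assign_roles([ComputerUnit(6), ComputerUnit(5)])
# The unit of the first sensor device (id 5) becomes master, id 6 a slave.
```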

[0040] Then, a corresponding synchronization signal may be provided by the computer unit 9 of the first sensor device 5. This synchronization signal may be provided by any data frame which is transferred via the data line 10. As a function of the transferred synchronization signal, the computer units 9 are then able to actuate the respective lighting units 7. By way of example, the lighting units 7 may be actuated at the same time or with a temporal offset. As a result of the synchronous operation of the lighting units 7, it is possible, in particular, to avoid mutual interference between the sensor devices 5, 6.
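The two actuation patterns mentioned in the claims and in this paragraph can be summarised in a small scheduling sketch: alternating time slots for gesture capture (so the TOF measurements of the two devices do not disturb each other, cf. claim 7) and simultaneous actuation for viewing-direction capture (cf. claim 8). The mode names and slot numbering are illustrative assumptions.

```python
# Sketch of the two synchronous actuation patterns for two sensor devices
# (index 0 and 1): alternating slots for gesture capture, simultaneous
# flashes for viewing-direction capture. Mode names are assumptions.

def lighting_schedule(mode: str, device_index: int, slot: int) -> bool:
    """Return True if the lighting unit of the given device should emit
    light in the given time slot."""
    if mode == "gesture":
        # Temporal offset: the devices take turns, avoiding mutual
        # interference between their time-of-flight measurements.
        return slot % 2 == device_index
    if mode == "viewing_direction":
        # Simultaneous actuation of both lighting units.
        return True
    raise ValueError(f"unknown mode: {mode}")
```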