CAPTURING APPARATUS FOR RECOGNIZING A GESTURE AND/OR A VIEWING DIRECTION OF AN OCCUPANT OF A MOTOR VEHICLE BY SYNCHRONOUS ACTUATION OF LIGHTING UNITS, OPERATING ARRANGEMENT, MOTOR VEHICLE AND METHOD
20170323165 · 2017-11-09
Assignee
Inventors
Cpc classification
G06F3/038
PHYSICS
G06F3/017
PHYSICS
B60R1/00
PERFORMING OPERATIONS; TRANSPORTING
G06V20/597
PHYSICS
G06V40/28
PHYSICS
B60R2300/30
PERFORMING OPERATIONS; TRANSPORTING
B60K35/00
PERFORMING OPERATIONS; TRANSPORTING
International classification
Abstract
The invention relates to a capturing apparatus (3) for recognizing a gesture and/or a viewing direction of an occupant (13) of a motor vehicle (1), comprising a first sensor device (5) and comprising at least one second sensor device (6), wherein each of the sensor devices (5, 6) respectively has a lighting unit (7) for emitting light (11), a receiving unit (8) for receiving the light (12) reflected by the occupant (13) and a computer unit (9) for actuating the lighting unit (7) and for recognizing the gesture and/or the viewing direction on the basis of the reflected light (12), wherein the computer units (9) of the sensor devices (5, 6) are designed to actuate the lighting units (7) synchronously as a function of a synchronization signal.
Claims
1. A capturing apparatus for recognizing a gesture and/or a viewing direction of an occupant of a motor vehicle, comprising: a first sensor device; and at least one second sensor device, wherein each of the sensor devices respectively has a lighting unit for emitting light, a receiving unit for receiving the light reflected by the occupant, and a computer unit for actuating the lighting unit and for recognizing the gesture and/or the viewing direction on the basis of the reflected light, wherein the computer units of the sensor devices are designed to actuate the lighting units synchronously as a function of a synchronization signal.
2. The capturing apparatus according to claim 1, wherein the computer unit of the first sensor device is designed to provide the synchronization signal and transfer the latter to the computer unit of the second sensor device.
3. The capturing apparatus according to claim 1, wherein the computer units of the first sensor device and the at least one second sensor device are linked via a data line for transferring the synchronization signal.
4. The capturing apparatus according to claim 3, wherein the data line is a CAN bus of the motor vehicle.
5. The capturing apparatus according to claim 1, wherein the first sensor device and/or the at least one second sensor device is designed to determine a distance to at least a part of the occupant on the basis of a time-of-flight of the emitted light and of the reflected light.
6. The capturing apparatus according to claim 1, wherein the first sensor device and/or the at least one second sensor device is configured to capture an image of at least a part of the occupant on the basis of the reflected light.
7. The capturing apparatus according to claim 1, wherein the computer unit of the first sensor device and the computer unit of the at least one second sensor device alternately actuate the lighting units when capturing the gesture by means of the first sensor device and the at least one second sensor device.
8. The capturing apparatus according to claim 1, wherein the computer unit of the first sensor device and the computer unit of the at least one second sensor device simultaneously actuate the lighting units when capturing the viewing direction with the first sensor device and the at least one second sensor device.
9. An operating arrangement for a motor vehicle comprising: a capturing apparatus according to claim 1; a functional device actuatable as a function of the gesture and/or viewing direction recognized by the capturing apparatus.
10. A motor vehicle comprising an operating arrangement according to claim 9.
11. A method for recognizing a gesture and/or a viewing direction of an occupant of a motor vehicle, in which a first sensor device and at least one second sensor device are provided, the method comprising: in each of the sensor devices, actuating a lighting unit for emitting light by a computer unit; receiving the light reflected by the occupant by a receiving unit; and recognizing the gesture and/or the viewing direction by the computer unit on the basis of the reflected light, wherein the lighting units are actuated synchronously by the computer units of the sensor devices as a function of a synchronization signal.
Description
[0031] In the figures, equivalent or functionally equivalent elements are provided with the same reference signs.
[0033] Depending on the captured gesture and/or the captured viewing direction, it is possible to transfer a corresponding operating signal from the capturing apparatus 3 to a functional device 4 of the motor vehicle 1.
[0034] By way of example, the functional device 4 of the motor vehicle 1 may be a navigation system, an infotainment system, an air conditioning unit or the like. The functional device 4 may also be an appropriate actuator for opening and/or closing the windows, an actuator for adjusting the external mirrors, an actuator for opening and/or closing a sliding roof or a soft top, an actuator for adjusting the seats or the like. The functional device may also be part of a driver assistance system of the motor vehicle 1.
[0036] One or both of the sensor devices 5, 6 may be embodied as cameras which, depending on the reflected light 12, may capture an image of at least a part of the occupant 13. One or both of the sensor devices 5, 6 may be embodied as so-called 3D cameras or TOF (time-of-flight) cameras. Using these, it is possible to recognize the spatial orientation of a part of the occupant 13 on the basis of the reflected light 12. Hence, it is possible, for example, to recognize a gesture carried out by a hand 15 of the occupant 13. Furthermore, it is possible, for example, to identify an orientation, an inclination and/or a rotation of the head 14 of the occupant 13. The sensor devices 5, 6 may also be configured to recognize a viewing direction of the occupant 13, for example on the basis of the position of the eyes of the occupant 13. In the present case, this is depicted in an exemplary manner by arrow 16.
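A TOF camera of this kind infers distance from how long the emitted light 11 needs to travel to the occupant 13 and back as reflected light 12 (compare claim 5). The following is only a minimal sketch of that relation; the function name and the example round-trip time are illustrative assumptions, not taken from the application:

```python
# Sketch: distance from the light round-trip time, as measured by a
# time-of-flight (TOF) camera (compare claim 5).
SPEED_OF_LIGHT = 299_792_458.0  # m/s

def distance_from_round_trip(round_trip_s: float) -> float:
    """Distance to the reflecting surface for a measured round-trip time.

    The light travels to the occupant and back, so the one-way
    distance is half the round-trip path.
    """
    return SPEED_OF_LIGHT * round_trip_s / 2.0

# A hand roughly 0.6 m from the sensor reflects light after ~4 ns:
d = distance_from_round_trip(4.0e-9)  # ≈ 0.60 m
```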
[0037] Each of the sensor devices 5, 6 comprises a computer unit 9. By way of example, it may be formed by an appropriate processor, by an integrated circuit or by a so-called FPGA (field programmable gate array). The respective computer units 9 serve to actuate the lighting units 7 of the sensor devices 5, 6. When actuated, the lighting units 7 emit the light 11. Moreover, the computer units 9 are designed to recognize the gesture or the viewing direction of the occupant 13 on the basis of the reflected light 12. To this end, it is possible, for example, to carry out appropriate image processing, on the basis of which the gesture and/or the viewing direction is identified.
[0038] The sensor devices 5, 6 are linked by way of a data line 10 for data transfer. In particular, the computer units 9 of the respective sensor devices 5, 6 are linked by the data line 10. By way of example, the data line 10 may be formed by a data bus of the motor vehicle 1, for example the CAN bus. By way of the data bus, it is possible to transfer a corresponding synchronization signal, as a function of which the respective computer units 9 synchronously actuate the lighting units 7.
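The synchronization path described above can be sketched as a small simulation: the computer unit 9 of one sensor device broadcasts a synchronization signal over the shared data line 10, and every connected computer unit latches the same reference instant. The class names, the microsecond time base, and the callback structure are illustrative assumptions, not the application's implementation:

```python
# Minimal simulation of the synchronization path over the data line 10.
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class DataLine:
    """Stand-in for the data line 10 (e.g. the vehicle's CAN bus)."""
    subscribers: List["ComputerUnit"] = field(default_factory=list)

    def broadcast(self, sync_timestamp_us: int) -> None:
        # A single data frame carries the synchronization signal to
        # every connected computer unit 9.
        for unit in self.subscribers:
            unit.on_sync(sync_timestamp_us)

@dataclass
class ComputerUnit:
    """Stand-in for a computer unit 9 of one sensor device."""
    name: str
    last_sync_us: Optional[int] = None

    def on_sync(self, sync_timestamp_us: int) -> None:
        # All units latch the same reference instant, so later
        # lighting pulses share one common time base.
        self.last_sync_us = sync_timestamp_us

line = DataLine()
unit_5 = ComputerUnit("sensor device 5")  # provides the signal
unit_6 = ComputerUnit("sensor device 6")
line.subscribers += [unit_5, unit_6]
line.broadcast(1_000)
# Both computer units now hold the same reference time (1000 µs).
```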
[0040] Then, a corresponding synchronization signal may be provided by the computer unit 9 of the first sensor device 5. This synchronization signal may be provided by any data frame which is transferred via the data line 10. Depending on the transferred synchronization signal, the computer units 9 are then able to actuate the respective lighting units 7. By way of example, the lighting units 7 may be actuated at the same time or with a temporal offset. As a result of the synchronous operation of the lighting units 7, it is possible, in particular, to avoid mutual interference between the sensor devices 5, 6.