METHOD AND APPARATUS FOR ASSIGNING CONTROL INSTRUCTIONS IN A VEHICLE, AND VEHICLE
20170293355 · 2017-10-12
CPC classification
B60R11/04
PERFORMING OPERATIONS; TRANSPORTING
B60W40/08
PERFORMING OPERATIONS; TRANSPORTING
B60R2300/8006
PERFORMING OPERATIONS; TRANSPORTING
B60W50/10
PERFORMING OPERATIONS; TRANSPORTING
B60W2540/00
PERFORMING OPERATIONS; TRANSPORTING
B60K35/00
PERFORMING OPERATIONS; TRANSPORTING
International classification
B60W50/10
PERFORMING OPERATIONS; TRANSPORTING
B60W40/08
PERFORMING OPERATIONS; TRANSPORTING
B60R11/04
PERFORMING OPERATIONS; TRANSPORTING
Abstract
A method for assigning control instructions in a vehicle includes reading in an occupant gaze datum via an interface to an occupant detection device of the vehicle, the occupant gaze datum representing a gaze of an occupant of the vehicle toward a device of the vehicle which is to be controlled; and assigning a control instruction of a first control device of the vehicle and/or a control instruction of a second control device of the vehicle to the device using the occupant gaze datum, in order to control the device with the first control device or the second control device.
Claims
1. A method for assigning control instructions in a vehicle, the method comprising: obtaining, by processing circuitry and via an interface to an occupant detection device of the vehicle, an occupant gaze datum that represents a gaze of an occupant of the vehicle toward a vehicle device that is to be controlled; and based on the occupant gaze datum, assigning, by the processing circuitry, at least one of a control instruction of a first control device of the vehicle and a control instruction of a second control device of the vehicle to the vehicle device in order to control the vehicle device.
2. The method of claim 1, wherein the obtaining includes generating the occupant gaze datum using at least one of: a gaze direction datum of the occupant detection device that represents coordinates of a current gaze direction of the occupant; and a head posture datum of an optical sensor of the occupant detection device that represents coordinates of a current head posture of the occupant.
3. The method of claim 1, wherein at least one of: the control instruction of the first control device is generated by a manual actuation of the first control device; and the control instruction of the second control device is generated by a manual actuation of the second control device.
4. The method of claim 1, further comprising: outputting, to the vehicle device to be controlled and in response to the assigning step, an indicating signal that at least one of visually, optically, and acoustically indicates to the occupant the vehicle device that is to be controlled.
5. The method of claim 1, wherein the first control device is a scroll wheel integrated into a steering wheel of the vehicle and the second control device is a rotary knob integrated into a center console of the vehicle.
6. The method of claim 1, further comprising: obtaining, via the interface, a second occupant gaze datum that represents a gaze by the occupant toward a further vehicle device that is to be controlled; and based on the second occupant gaze datum, assigning at least one of a control instruction of the first control device and a control instruction of the second control device to the further vehicle device in order to control the further vehicle device.
7. The method of claim 6, wherein one of the vehicle device and further vehicle device is a combination instrument and the other of the vehicle device and further vehicle device is a central operating and indicating element of the vehicle.
8. A vehicle device arrangement for assigning control instructions in a vehicle, the vehicle device arrangement comprising: processing circuitry; and an interface to an occupant detection device of the vehicle; wherein the processing circuitry is configured to: obtain via the interface an occupant gaze datum that represents a gaze of an occupant of the vehicle toward a vehicle device that is to be controlled; and based on the occupant gaze datum, assign at least one of a control instruction of a first control device of the vehicle and a control instruction of a second control device of the vehicle to the vehicle device in order to control the vehicle device.
9. The vehicle device arrangement of claim 8, wherein the processing circuitry includes a control switch that is configured to perform the assignment.
10. The vehicle device arrangement of claim 8, further comprising the occupant detection device.
11. A non-transitory computer-readable medium on which are stored instructions that are executable by a processor and that, when executed by the processor, cause the processor to perform a method for assigning control instructions in a vehicle, the method comprising: obtaining, via an interface to an occupant detection device of the vehicle, an occupant gaze datum that represents a gaze of an occupant of the vehicle toward a vehicle device that is to be controlled; and based on the occupant gaze datum, assigning at least one of a control instruction of a first control device of the vehicle and a control instruction of a second control device of the vehicle to the vehicle device in order to control the vehicle device.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
DETAILED DESCRIPTION
[0030] Driver detection device or occupant detection device 102 shown by way of example in
[0031] In an example embodiment of the present invention, an apparatus 110 for assigning control instructions is provided in vehicle 100. Apparatus 110 is electrically conductively connected to occupant detection device 102. Apparatus 110 is furthermore electrically conductively connected, for example via a CAN bus of vehicle 100, (a) to a device 112 and a first control device 114, the latter of which is assigned in a default setting to the device 112, and (b) to a further device 116 and a second control device 118, the latter of which is assigned in the default setting to the further device 116.
[0032] Depending on the exemplifying embodiment, apparatus 110 can be accommodated in a shared housing with occupant detection device 102, or can be disposed physically remotely from occupant detection device 102 in vehicle 100 and coupled to occupant detection device 102, for example, via the CAN bus of vehicle 100.
[0033] In the scenario shown in
[0034] Occupant detection device 102 is configured to generate an occupant gaze datum 120 using a gaze direction datum and/or head posture datum based on the gaze data or head posture data of camera 108, and to furnish it via a suitable interface to apparatus 110. The gaze direction datum or head posture datum represents, for example, coordinates of a current gaze direction or a current head posture of occupant 104. Occupant gaze datum 120 represents a gaze 122 of occupant 104 toward device 112 of vehicle 100 which is to be controlled.
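The resolution of gaze or head-posture coordinates to a gazed-at device, as described above, can be sketched as follows. This is an illustrative sketch only; the patent does not specify a concrete data format for occupant gaze datum 120, and all identifiers (the region table, the coordinate convention, the function name) are hypothetical:

```python
# Illustrative sketch: resolving gaze coordinates to a vehicle device.
# Each device is associated with a rectangular region in normalized
# cabin coordinates (x_min, y_min, x_max, y_max). The regions and the
# coordinate convention are assumptions, not part of the disclosure.
GAZE_REGIONS = {
    "combination_instrument_112": (0.30, 0.55, 0.50, 0.75),
    "operating_indicating_element_116": (0.55, 0.40, 0.75, 0.65),
}

def resolve_gaze_target(gaze_x, gaze_y):
    """Return the device whose region contains the gaze point, else None."""
    for device, (x0, y0, x1, y1) in GAZE_REGIONS.items():
        if x0 <= gaze_x <= x1 and y0 <= gaze_y <= y1:
            return device
    return None
```

A gaze datum resolving to `None` (for example, a glance out the windshield) would simply leave the current assignment unchanged.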
[0035] According to an alternative exemplifying embodiment, apparatus 110 is configured to read in the gaze direction datum and/or head posture datum from occupant detection device 102 and to generate occupant gaze datum 120.
[0036] Apparatus 110 is furthermore configured to assign to device 112, using occupant gaze datum 120 and in response to a manual actuation of first control device 114 or of second control device 118 by occupant 104, a control instruction 124 of first control device 114 or a control instruction 126 of second control device 118. In the scenario sketched in
[0037] In accordance with the concept sketched here, apparatus 110 therefore allows occupant 104 to apply control, both via first control device 114 and via second control device 118, to device 112 that occupant 104 is aiming to control, depending on what is more convenient or safer for occupant 104 in a current situation.
[0038] Apparatus 110 analogously assigns a control instruction of first control device 114 and/or a control instruction of second control device 118 to further device 116, using a second occupant gaze datum 128 that represents a gaze 130 of occupant 104 toward further device 116, in order to control further device 116 with first control device 114 or with second control device 118. Occupant 104 directs gaze 130, for example, at an interval in time with respect to gaze 122.
[0039] The concept of assigning multiple functions to a single control device, as presented on the basis of the scenario shown in
[0041] In the exemplifying embodiment shown in
[0042] Apparatus 110 encompasses a control switch 200 that redirects, in accordance with the approach presented here, the instructions furnished by control devices 114, 118. Control switch 200 is an electrical operating means for converting a manual actuation into a signal intended for further processing.
[0043] Control switch 200 redirects control instructions furnished by control devices 114, 118 depending on whether occupant gaze datum 120 most recently furnished by occupant detection device 102 to apparatus 110 prior to an actuation of first control device 114 or of second control device 118 represents an occupant's gaze toward combination instrument 112 or toward central operating and indicating element 116. In the switching logic shown in
[0044] According to an exemplifying embodiment, in response to the assignment of control instructions 124, 126, apparatus 110 furnishes an indicating signal 202 to the device that is currently to be controlled (in this case, combination instrument 112). Indicating signal 202 generates visual and/or optical and/or acoustic feedback indicating that the assignment has occurred, so that the occupant knows which of the devices 112, 116 shown by way of example in
[0045] If, alternatively, occupant gaze datum 120 or a further occupant gaze datum at a later point in time represents an occupant's gaze toward central operating and indicating element 116 as the most recent gaze prior to a manual actuation of one of control elements 114, 118 by the occupant, control switch 200 switches out of the first switching position into a second switching position characterized by dashed lines in the depiction of
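The switching behavior of control switch 200 described in the preceding paragraphs can be sketched as a small state machine. This is an illustrative sketch under stated assumptions; the class and method names are hypothetical and the patent describes the switch as an electrical operating means, not as software:

```python
# Illustrative sketch of the control-switch routing logic: a control
# instruction from either control device is forwarded to whichever
# device the occupant most recently gazed at before the actuation.
class ControlSwitch:
    def __init__(self, default_target):
        # Default assignment, e.g. combination instrument 112.
        self.target = default_target

    def on_gaze(self, gazed_device):
        # The most recent gaze prior to an actuation selects the target.
        self.target = gazed_device

    def route(self, control_instruction):
        # Both the scroll wheel (114) and the rotary knob (118) feed
        # their instructions through this single switch.
        return (self.target, control_instruction)
```

For example, after `on_gaze("device_116")`, a subsequent scroll-wheel actuation is routed to device 116 rather than to the default target.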
[0047] In a reading-in step 302, an occupant gaze datum that represents a vehicle occupant's gaze toward a device of the vehicle which is to be controlled is read in via an interface to an occupant detection device of the vehicle.
[0048] In an assigning step 304, a control instruction of a first control device of the vehicle and/or a control instruction of a second control device of the vehicle is assigned to the device using the occupant gaze datum, in order to control the device with the first control device or the second control device.
[0049] In a furnishing step 306 that is executed at a later point in time than the reading-in step 302, a second occupant gaze datum is furnished via the interface to the occupant detection device of the vehicle. The second occupant gaze datum represents the occupant's gaze toward a further device of the vehicle which is to be controlled.
[0050] In a further assigning step 308, a control instruction of the first control device and/or a control instruction of the second control device is assigned to the further device using the second occupant gaze datum, in order to control the further device with the first control device or the second control device.
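Steps 302 through 308 can be summarized as the following event sequence. This is a minimal sketch, assuming a simple dictionary of assignments; the function and device names are hypothetical:

```python
# Illustrative sequence for steps 302-308: two successive gaze data
# reassign both control devices, first to one device, then to a
# further device. All names are assumptions for illustration.
def assign_control(assignments, gaze_datum, control_devices):
    """Assign the instructions of each control device to the gazed device."""
    for ctrl in control_devices:
        assignments[ctrl] = gaze_datum["target"]
    return assignments

assignments = {}
# Steps 302/304: the first gaze datum assigns both controls to device 112.
assign_control(assignments, {"target": "device_112"}, ["ctrl_114", "ctrl_118"])
# Steps 306/308: a later gaze datum reassigns both controls to device 116.
assign_control(assignments, {"target": "device_116"}, ["ctrl_114", "ctrl_118"])
```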
[0052] In a conventional configuration, the display screen, or operating and indicating element 116, of an infotainment system of the passenger car is controlled using a touch display or using rotary knob 118 that is integrated into the center console in the passenger car. The occupant can turn rotary knob 118 to the right or left in order to operate operating and indicating element 116. A rotation of rotary knob 118 generates a control instruction for controlling operating and indicating element 116. Because operating and indicating element 116 is typically disposed above the center console, it is also referred to as “head unit” 116.
[0053] In accordance with the concept presented here, of assigning multiple functions to one operating element 114, 118, the occupant can operate device 112, 116 with scroll wheel 114 or with rotary knob 118, based on the device 112, 116 toward which the occupant is gazing.
[0054] For example, if the occupant has the occupant's hands on the steering wheel and wishes to operate operating and indicating element 116, the occupant no longer needs to remove the occupant's hands from the wheel in order to grasp rotary knob 118. The occupant can instead use scroll wheel 114 in the steering wheel. Conversely, if the occupant's right arm is resting on the armrest behind the center console, it is possibly more convenient for the occupant to reach for rotary knob 118.
[0055] Based on gaze direction recognition using a driver monitoring device directed toward the occupant's head, the occupant can use scroll wheel 114, and alternatively rotary knob 118, to control both devices 112, 116, as illustrated graphically in the depiction of
[0056] The novel concept proposed here can be transferred to as many devices, or display screens thereof, as are present in the passenger car. The proposed concept can of course also be applied to elements that do not have display screens. For example, based on a detected gaze toward the climate control region, the temperature in the passenger car can be regulated upward or downward, for example by turning scroll wheel 114 or rotary knob 118.
[0057] According to a further exemplifying embodiment, based on a detected gaze toward the mirrors in the vehicle, individual settings of all the mirrors in the vehicle can be configured using a directional pad 500 as the control device that is to be utilized, as shown by way of example in
[0058] The concept presented herein can be applied to all types of operating elements or input devices such as knobs, joysticks, pushbuttons, switches, or even elements such as voice control or gesture recognition.
[0059] It is furthermore conceivable for the control devices to be “non-displays,” for example the control units of the climate control system in the vehicle, or the outside mirrors. A display can also be subdivided into areas, and each area can be operated individually depending on which of the areas the vehicle occupant is gazing toward. Another aspect of the approach presented here is that an acknowledgment of the control unit selected is given to the user or the vehicle occupant. The intention is for the occupant to know which display, which area, or which control unit the occupant is currently operating. For this, the background on the display or on the selected control unit can appear in a different color than the non-selected control units or vehicle elements or can be given a border, or a special feature can be displayed in such a case. Acknowledgment of the selection of a “non-display” element as a control element can occur, for example, via a light source such as an LED. A further aspect of the approach presented here, specifically in the motor vehicle context, can be that when the vehicle occupant looks back at the road, the display, area, non-display unit, or control unit in general that was most recently gazed toward remains active, so that the occupant can continue to operate it without constantly needing to stare at the display. The vehicle occupant is thus less distracted and can, as accustomed, merely verify with monitoring glances that the desired menu is still selected.
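The behavior described above, in which the most recently gazed-at unit remains active when the occupant looks back at the road, can be sketched as follows. The class name, area labels, and update rule are illustrative assumptions, not part of the original disclosure:

```python
# Illustrative sketch of the "sticky selection" behavior: a glance at
# the road does not clear the active control target, so the occupant
# can keep operating it without staring at the display.
ROAD = "road"

class StickySelector:
    def __init__(self):
        self.active = None

    def update(self, gazed_area):
        # Only a gaze at an operable area changes the selection;
        # gazes at the road (or no detected gaze) leave it unchanged.
        if gazed_area is not None and gazed_area != ROAD:
            self.active = gazed_area
        return self.active
```

In the same sketch, an acknowledgment (highlight, border, or LED) would be driven from the value of `active` whenever it changes.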
[0060] If an exemplifying embodiment encompasses an “and/or” relationship between a first feature and a second feature, this is to be read to mean that the exemplifying embodiment according to one embodiment has both the first feature and the second feature, and according to a further embodiment has either only the first feature or only the second feature.