Device and Method for Controlling a Vehicle Function of a Vehicle

20230127363 · 2023-04-27

    Abstract

    A device for controlling a vehicle function of a vehicle includes a steering device for manual lateral control of the vehicle, wherein the steering device has at least one touch sensor, which is designed to detect sensor data relating to a touch of the steering device by a driver of the vehicle. The device is designed to determine sensor data of the touch sensor and to carry out a hands-on detection for the steering device based on the sensor data of the touch sensor. In addition to the hands-on detection, the device is designed to operate a vehicle function of the vehicle in dependence on the sensor data.

    Claims

    1.-16. (canceled)

    17. A device for controlling a vehicle function of a vehicle, comprising: a steering device for manual lateral control of the vehicle, the steering device comprising at least one touch sensor that acquires sensor data with respect to a touch of the steering device by a driver of the vehicle; and a control unit operatively configured to: ascertain the sensor data of the touch sensor; carry out a hands-on recognition for the steering device based on the sensor data of the touch sensor; and in addition to the hands-on recognition, operate a vehicle function of the vehicle in dependence on the sensor data.

    18. The device according to claim 17, wherein the steering device comprises at least one rod-shaped steering device segment, designed for the manual lateral control of the vehicle, to be touched by the driver of the vehicle at different touch positions along a linear touch region, and the control unit is configured to: ascertain, on the basis of the sensor data of the touch sensor, the touch position, at which the driver of the vehicle touches the rod-shaped steering device segment; and operate the vehicle function of the vehicle in dependence on the touch position.

    19. The device according to claim 18, wherein the control unit is configured to: effectuate an optical representation on a display screen of the vehicle depending on the touch position; and/or set and/or adapt a perspective of a representation of surroundings of the vehicle reproduced on a display screen of the vehicle, and/or an external view of the vehicle in dependence on the touch position.

    20. The device according to claim 19, wherein the control unit is configured to: detect a change of the touch position on the rod-shaped steering device segment on the basis of the sensor data of the touch sensor; and in reaction thereto, adapt a perspective of the visual representation on the display screen.

    21. The device according to claim 18, wherein the rod-shaped steering device segment comprises at least one linear lighting element which extends along the linear touch region and generates light signals selectively in different partial regions of the linear lighting element; and the control unit is configured to cause the lighting element to generate a light signal selectively at the ascertained touch position.

    22. The device according to claim 18, wherein the control unit is configured to: ascertain a time curve of the touch position on the basis of the sensor data of the touch sensor; on the basis of the time curve of the touch position, detect a gesture which the driver of the vehicle effectuates by touching the rod-shaped steering device segment at different touch positions; and operate the vehicle function in dependence on the detected gesture.

    23. The device according to claim 22, wherein at least one of: the gesture comprises a movement in a first direction along the rod-shaped steering device segment, the gesture comprises a movement in an opposing second direction along the rod-shaped steering device segment, or the gesture comprises a repeated alternating movement in the first direction and in the second direction along the rod-shaped steering device segment.

    24. The device according to claim 22, wherein the vehicle function comprises cleaning a surroundings sensor of the vehicle, and the control unit is configured to effectuate the cleaning of the surroundings sensor in reaction to a detected gesture.

    25. The device according to claim 24, wherein the rod-shaped steering device segment comprises at least one linear lighting element, which extends along the linear touch region and generates light signals having different lengths, and the control unit is configured to: ascertain a required extent of the cleaning of the surroundings sensor; and to adapt a length of the light signal generated by the lighting element in dependence on the required extent of the cleaning of the surroundings sensor.

    26. The device according to claim 25, wherein the control unit is configured to: indicate on the basis of the length of the light signal generated by the lighting element, how frequently the driver has to repeat the gesture until the cleaning of the surroundings sensor is triggered; and/or effectuate a change of the length of the light signal generated by the lighting element as a result of an execution of the gesture in dependence on the required extent of the cleaning of the surroundings sensor.

    27. The device according to claim 18, wherein the control unit is configured to set and/or adapt a parameter value of a vehicle parameter settable within a specific value range in dependence on the touch position; and the vehicle parameter comprises at least one of: a parameter of an infotainment system and/or a climate control system of the vehicle; a volume of an audio signal played back by the vehicle; a component of highs and/or lows of an audio signal played back by the vehicle; or a setpoint temperature in a passenger compartment of the vehicle.

    28. The device according to claim 27, wherein the rod-shaped steering device segment comprises at least one linear lighting element, which extends along the linear touch region and generates light signals having different lengths; and the control unit is configured to set and/or adapt the length of the light signal generated by the lighting element in dependence on the parameter value.

    29. The device according to claim 17, wherein the steering device comprises a steering wheel having a steering wheel rim, and the at least one touch sensor is designed to acquire sensor data with respect to a touch of the steering wheel rim by the driver of the vehicle.

    30. The device according to claim 17, wherein the control unit is configured to carry out the hands-on recognition on the basis of the sensor data of the touch sensor during operation of a driving function for at least partially automated driving of the vehicle, and/or the vehicle function which is operated in dependence on the sensor data of the touch sensor is independent of the hands-on recognition and/or independent of the driving function for at least partially automated driving of the vehicle.

    31. The device according to claim 17, wherein at least one of: the vehicle function comprises output of an optical representation on a display screen of the vehicle, the vehicle function comprises cleaning of a surroundings sensor of the vehicle, or the vehicle function comprises setting of a vehicle parameter.

    32. The device according to claim 18, wherein the rod-shaped steering device segment comprises a steering wheel rim.

    33. A method for controlling a vehicle function of a vehicle, which comprises a steering device for manual lateral control of the vehicle, wherein the steering device comprises at least one touch sensor designed to acquire sensor data with respect to a touch of the steering device by a driver of the vehicle, wherein the method comprises: carrying out a hands-on recognition for the steering device on the basis of the sensor data of the touch sensor; and in addition to the hands-on recognition, operating a vehicle function of the vehicle in dependence on the sensor data.

    Description

    BRIEF DESCRIPTION OF THE DRAWINGS

    [0034] FIG. 1a shows exemplary components of a vehicle;

    [0035] FIG. 1b shows an exemplary steering wheel of a vehicle;

    [0036] FIGS. 2a and 2b show an exemplary control of a surroundings display of a vehicle by means of a steering wheel input;

    [0037] FIGS. 3a to 3c show an exemplary control of a vehicle function by means of a steering wheel input;

    [0038] FIGS. 4a to 4c show an exemplary setting of a parameter value by means of a steering wheel input; and

    [0039] FIG. 5 is a flow chart of an exemplary method for providing a user interface for a vehicle.

    DETAILED DESCRIPTION OF THE DRAWINGS

    [0040] As described at the outset, the present document relates to providing a convenient user interface, which can possibly also be used in conjunction with a driving function, for a hands-on requirement. In this context, FIG. 1a shows exemplary components of a vehicle 100, in particular a motor vehicle. The vehicle 100 comprises one or more surroundings sensors 102, which are configured to acquire sensor data (in this document also referred to as surroundings data) with respect to the surroundings of the vehicle 100. Exemplary surroundings sensors 102 are a camera, a radar sensor, a lidar sensor, an ultrasonic sensor, etc.

    [0041] The vehicle 100 furthermore comprises one or more longitudinal and/or lateral control actuators 103 (e.g., a drive motor, a braking device, a steering unit, etc.), which are configured to longitudinally and/or laterally control the vehicle 100 automatically or in an automated manner. A control unit 101 (or a device) of the vehicle 100 can be configured to operate the one or more longitudinal and/or lateral control actuators 103 of the vehicle as a function of the surroundings data in order to longitudinally and/or laterally control the vehicle 100 in an automated manner (in particular according to SAE level 1, according to SAE level 2, according to SAE level 3, or higher).

    [0042] The vehicle 100 comprises one or more manual control devices 105, which enable the driver of the vehicle 100 to make manual control inputs with respect to the longitudinal and/or lateral control of the vehicle 100. Exemplary control devices 105 are: a steering wheel, a brake pedal, and/or an accelerator pedal. The control unit 101 can be configured (in particular when the vehicle 100 is operated in a manual driving mode) to detect a manual control input at a manual control device 105 of the vehicle 100. Furthermore, the control unit 101 can be configured to operate the one or more longitudinal and/or lateral control actuators 103 of the vehicle 100 as a function of the manual control input, in particular to enable the driver of the vehicle 100 to longitudinally and/or laterally control the vehicle 100 manually.

    [0043] The vehicle 100 can comprise a user interface 106, which enables an interaction between the vehicle 100 and the driver of the vehicle 100. The user interface 106 can comprise one or more operating elements (e.g., a button, a rotary knob, etc.) and/or one or more output elements (e.g., a display screen, a lighting element, a loudspeaker, etc.). The control unit 101 can be configured to output an optical, haptic, and/or acoustic notice to the driver of the vehicle 100 via the user interface 106. Furthermore, it can be made possible for the driver of the vehicle 100 to activate or deactivate one or more driving functions (possibly having different degrees of automation) via the user interface 106.

    [0044] FIG. 1b shows exemplary components of a vehicle 100 at the driver position of the vehicle 100. In particular, FIG. 1b shows a steering wheel 110 as an exemplary manual control or steering device 105, which enables the driver of the vehicle 100 to steer the vehicle 100 manually (in order to effectuate the lateral control of the vehicle 100). One or more touch sensors 121, 122 can be arranged on the steering wheel 110, which are configured to detect whether the driver of the vehicle 100 touches the steering wheel 110 with at least one hand. The control unit 101 can be configured to determine on the basis of the sensor data of the one or more touch sensors 121, 122 of the steering wheel 110 whether the driver of the vehicle 100 touches the steering wheel 110 with at least one hand, touches it with two hands, or does not touch it. Furthermore, FIG. 1b shows a display screen 116 and a loudspeaker 117 as exemplary components of the user interface 106.

    [0045] The steering wheel 110 can furthermore have one or more lighting elements 111, 112, which can be activated or deactivated. A lighting element 111, 112 preferably has an elongated shape. In particular, a lighting element 111, 112 can be designed in such a way that the lighting element 111, 112 extends linearly along the circumference of the steering wheel rim 115. For example, a lighting element 111, 112 can extend over an angle range of 45° or more, in particular of 90° or 120° or more, along the circumference of the steering wheel rim 115.

    [0046] A linear lighting element 111, 112 can have a plurality of partial segments (each having one or more LEDs), which can each be activated or deactivated individually. In other words, a linear lighting element 111, 112 can be designed in such a way that if needed only a part of the lighting element 111, 112 is activated, so that the length of a linear light signal emitted by the lighting element 111, 112 can be changed, in particular reduced or increased.

    [0047] The one or more touch sensors 121, 122 on the steering wheel rim 115 of the steering wheel 110 can be designed to indicate the position, in particular the angle, at which the driver of the vehicle 100 touches the steering wheel rim 115. For this purpose, the one or more touch sensors 121, 122 can be divided into a plurality of partial segments, for example, to indicate the position of the touch of the steering wheel rim 115 with a specific position resolution or a specific angle resolution. For example, the circumference of the steering wheel rim 115 can be divided (possibly uniformly) into 10 or more, or into 20 or more partial segments, so that the position of the touch can be determined at an angle resolution of 360°/10 or less or at an angle resolution of 360°/20 or less, respectively, on the basis of the sensor data of the one or more touch sensors 121, 122.
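    The segment-based position resolution described above could be sketched as follows; the segment count, function name, and centre-angle convention are illustrative assumptions, not taken from the document:

    ```python
    # Hypothetical sketch: map the index of the touched sensor partial segment
    # on the steering wheel rim to the centre angle of that segment.
    # A division into 20 segments yields an angle resolution of 360°/20 = 18°.

    def touch_angle(active_segment: int, num_segments: int = 20) -> float:
        """Return the centre angle (degrees) of the touched rim segment."""
        if not 0 <= active_segment < num_segments:
            raise ValueError("segment index out of range")
        resolution = 360.0 / num_segments      # e.g., 18° for 20 segments
        return active_segment * resolution + resolution / 2.0
    ```

    With 20 segments, for instance, segment 10 would map to an angle of 189°, i.e., roughly the bottom of the rim.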

    [0048] The control unit 101 can be configured to adapt a display represented on the display screen 116 of the user interface 106 of the vehicle 100 in dependence on the sensor data of the one or more touch sensors 121, 122, in particular in dependence on the position at which the driver of the vehicle 100 touches the steering wheel rim 115. On the display, for example, the surroundings of the vehicle 100 can be shown, such as a 360° bird's-eye view of and/or around the vehicle 100. The perspective of the view of the surroundings and/or the position of the camera with which the represented surroundings are acquired can be adapted in dependence on the touch position on the steering wheel rim 115. It can thus be made possible for the driver of the vehicle 100, in particular in the case of a parking assistant, to call up different views of the surroundings of the vehicle 100 on the display screen 116 in a convenient manner.

    [0049] FIG. 2a shows an exemplary touch of the steering wheel rim 115 at a touch position 222. The touch position 222 can be ascertained on the basis of the sensor data of the one or more touch sensors 121, 122. FIG. 2b shows an exemplary visual representation 230 on a display screen 116 of the vehicle 100. The visual representation 230 comprises, for example, an external representation 231 of the vehicle 100. It can be made possible here for the driver of the vehicle 100 to change the virtual position of the camera 232, using which the external representation 231 of the vehicle 100 is acquired, by changing the touch position 222 (shown by the arrow in FIG. 2a). In particular, by circling along the circumference of the steering wheel rim 115, it is possible to cause the virtual position of the camera 232 to circle around the vehicle 100 (as shown by the ring in FIG. 2b).

    [0050] The control unit 101 of the vehicle 100 can thus be designed to map the (touch) position 222 of the hand or a finger on the steering wheel 110 directly or indirectly onto a camera position. The point 222 of the touch and/or the point of the current camera position can be indicated as a light spot 212 on the lighting element 111, 112. Particularly convenient and reliable assistance of the driver of the vehicle 100 can be effectuated by such optical feedback with respect to the perspective of the optical representation 230 which is shown on the display screen 116.
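    The mapping of a touch position to a virtual camera position circling the vehicle could look like the following sketch; the radius parameter and the direct angle-to-angle mapping are assumptions for illustration:

    ```python
    import math

    # Illustrative sketch (not from the patent): place the virtual camera on a
    # circle around the vehicle so that circling a finger along the steering
    # wheel rim circles the camera position around the car.

    def camera_position(touch_angle_deg: float, radius: float = 5.0):
        """Return the (x, y) position of the virtual camera for a rim angle."""
        rad = math.radians(touch_angle_deg % 360.0)
        return (radius * math.cos(rad), radius * math.sin(rad))
    ```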

    [0051] The control unit 101 can be configured to detect an upward and downward movement of one or both hands along the steering wheel rim 115 (as shown by way of example in FIGS. 3a to 3c) on the basis of the sensor data of the one or more touch sensors 121, 122. An extent of the upward and downward movement can also be ascertained. In particular, it can be ascertained which angle range of the steering wheel rim 115 is passed over during the upward and downward movement. Upward and downward movements having touch regions 322 of different sizes are shown by way of example in FIGS. 3a to 3c.
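    Recognizing such an upward and downward movement from the time curve of touch positions could be sketched as below; the reversal criterion and the minimum-extent threshold are illustrative assumptions:

    ```python
    # Hedged sketch: detect an up-and-down stroke from a time series of touch
    # angles on the rim and ascertain the swept angle range (the "extent").

    def stroke_extent(angles):
        """Angle range (degrees) passed over by the touch positions."""
        return max(angles) - min(angles)

    def is_up_down_gesture(angles, min_extent: float = 20.0) -> bool:
        """True if the touch position reversed direction at least once and
        the movement swept a sufficiently large angle range."""
        reversals = 0
        for a, b, c in zip(angles, angles[1:], angles[2:]):
            if (b - a) * (c - b) < 0:      # sign change => direction reversal
                reversals += 1
        return reversals >= 1 and stroke_extent(angles) >= min_extent
    ```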

    [0052] The touch region 322 passed over by the driver of the vehicle 100 can be indicated by a light signal 212 generated by the one or more lighting elements 111, 112. The length of the light signal 212 along the steering wheel rim 115 can correspond to the length of the touch region 322. In particular, the one or more lighting elements 111, 112 can be activated precisely in the partial regions which correspond to the touch region 322 of the upward and downward movement. Particularly convenient feedback with respect to an input, which is effectuated or can be effectuated via the steering wheel 110, can thus be given to the driver of the vehicle 100.

    [0053] For example, the cleaning of one or more surroundings sensors 102 of the vehicle 100 can be effectuated by an upward and downward movement. The extent of the cleaning can be set, for example, via the length of the touch region 322 and/or via the number of repetitions of the upward and downward movement. Particularly convenient and precise cleaning of surroundings sensors 102 of the vehicle 100 can thus be enabled.

    [0054] An interaction with the vehicle 100 can thus be effectuated by a gesture taking place on a circular path of the steering wheel rim 115. For example, cleaning of the camera lenses and/or other optical sensors in the vehicle 100 can be triggered by a (possibly repeated) upward and downward movement on both sides or on one side of the steering wheel 110.

    [0055] Unnecessary cleaning of the one or more sensors 102 is typically undesired (for example, because of a relatively high water consumption and/or because of an impairment of a driving function for automated driving). The control unit 101 can be configured to determine whether cleaning of the one or more surroundings sensors 102 is required or not. An extent of the required cleaning can possibly be ascertained. The length of the touch region 322 and/or the number of repetitions of the gestures which are required to effectuate cleaning of the one or more surroundings sensors 102 can be changed in dependence on the ascertained extent of the required cleaning. In particular, the length of the touch region 322 to be effectuated and/or the required number of repetitions of the gestures can be reduced with increasing extent of the required cleaning, or vice versa. The triggering of the cleaning can thus be made more difficult by increasing the number of the required repetitions of the gesture if cleaning is not required or can be facilitated by reducing the number of the required repetitions of the gesture if cleaning is required.

    [0056] The light display of the steering wheel 110 can be used as a feedback element. In particular, the one or more light signals 212 on the steering wheel rim 115 can be used to give feedback about the time and/or the number of the gesture repetitions still required until reaching the threshold value for triggering the sensor cleaning. For example, with each repetition of the gesture, the length of the one or more light signals 212 can be increased, wherein the sensor cleaning is initiated as soon as, for example, 100% of a lighting element 111, 112 is lit up. FIGS. 3a to 3c show by way of example a lengthening of the light signal 212 with increasing number of repetitions of the upward and downward movement. The driver of a vehicle 100 can thus be assisted in a reliable and convenient manner in the cleaning of the one or more surroundings sensors 102 of the vehicle 100.
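    The repetition-counting logic with light-signal feedback described in the two paragraphs above could be sketched as follows; the class, the linear mapping from required cleaning extent to repetition count, and the concrete numbers are assumptions for illustration only:

    ```python
    # Illustrative sketch: the more urgently cleaning is required, the fewer
    # gesture repetitions are needed to trigger it; the light signal fills up
    # with each repetition and cleaning triggers at 100 %.

    class CleaningTrigger:
        def __init__(self, required_extent: float):
            # required_extent in [0, 1]; higher urgency => fewer repetitions
            self.required_reps = max(1, round(5 * (1.0 - required_extent)))
            self.reps = 0

        def register_gesture(self) -> bool:
            """Count one gesture repetition; True once cleaning triggers."""
            self.reps += 1
            return self.reps >= self.required_reps

        def light_fill(self) -> float:
            """Fraction of the lighting element to light up as feedback."""
            return min(1.0, self.reps / self.required_reps)
    ```

    Under these assumed numbers, no required cleaning demands five repetitions, while fully required cleaning triggers on the first gesture.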

    [0057] Alternatively or additionally, it can be made possible for the driver of the vehicle 100, for example, via a stroking movement upward or downward, along the steering wheel rim 115, to change a parameter value of a vehicle parameter settable in a specific value range (e.g., the playback volume of an audio signal or the setting of an equalizer), in particular to increase or reduce it. The set parameter value can be indicated via the length of the light signal 212 on the steering wheel rim 115. FIGS. 4a to 4c show exemplary stroke gestures (illustrated by the arrows) and light signals 212 of different lengths for parameter values of different levels.

    [0058] The steering wheel lights 111, 112 can thus be used as a form of representation for bar diagrams. The length of a bar diagram can be changed via a stroke gesture. Thus, for example, the volume level (from 0% to 100%) can be shown (length of the light signal 212) and changed (level/fill level by upward/downward movements of the finger or the hand on the steering wheel 110). Alternatively or additionally, in a corresponding manner the bass or treble component of the music (equalizer function) can be depicted and parameterized. Furthermore, feedback with respect to the presently set (volume) level can possibly be provided via the length of the light signal 212.
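    Setting a bounded parameter such as the playback volume by a stroke gesture, with the bar-diagram light display, could be sketched as below; the gain factor and the segment count of the lighting element are illustrative assumptions:

    ```python
    # Minimal sketch (assumed, not from the patent): change a parameter that is
    # settable within a value range (here a volume of 0..100 %) proportionally
    # to the swept angle of a stroke gesture, and derive the light-bar length.

    def adjust_volume(volume: float, swept_deg: float, gain: float = 0.5) -> float:
        """Positive sweep = upward stroke (louder), negative = downward."""
        return min(100.0, max(0.0, volume + gain * swept_deg))

    def bar_segments(volume: float, num_segments: int = 10) -> int:
        """Number of lighting-element partial segments to switch on."""
        return round(volume / 100.0 * num_segments)
    ```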

    [0059] FIG. 5 shows a flow chart of an exemplary (computer-implemented) method 500 for controlling a vehicle function of a vehicle 100, which comprises a steering device 110, in particular a steering wheel or handlebars, for manual lateral control of the vehicle 100. The steering device 110 comprises at least one touch sensor 121, 122, which is designed to acquire sensor data with respect to a touch of the steering device 110 by a driver of the vehicle 100, in particular by at least one hand or at least one finger of the driver. The touch sensor 121, 122 can be designed, for example, as a capacitive and/or resistive sensor. The touch sensor 121, 122 can extend along the steering device 110.

    [0060] The method 500 comprises carrying out 501 a hands-on recognition for the steering device 110 on the basis of the sensor data of the touch sensor 121, 122. In other words, the touch sensor 121, 122 can be used to recognize whether the driver of the vehicle 100 touches the steering device 110 with one or two hands. This can be necessary, for example, so that a driving function for at least partially automated driving is provided in the vehicle 100. For example, the driving function can be suppressed or terminated if it is recognized on the basis of the sensor data of the touch sensor 121, 122 that the driver holds no hands or does not hold both hands on the steering device 110. On the other hand, the driving function can possibly only be enabled when it is recognized on the basis of the sensor data of the touch sensor 121, 122 that the driver holds at least one hand on the steering device 110.
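    The gating of the driving function by the hands-on recognition could be sketched as follows; the two-sensor model and the configurable hand-count threshold are assumptions, since the document leaves open whether one or both hands are required:

    ```python
    # Hedged sketch: suppress or enable a driving function for at least
    # partially automated driving depending on hands-on recognition derived
    # from the rim touch sensors.

    def hands_on_count(left_sensor: bool, right_sensor: bool) -> int:
        """Naive hand count from two steering-wheel-rim touch sensors."""
        return int(left_sensor) + int(right_sensor)

    def driving_function_allowed(left: bool, right: bool,
                                 min_hands: int = 1) -> bool:
        """Enable the driving function only while the driver holds at least
        `min_hands` hands on the steering device."""
        return hands_on_count(left, right) >= min_hands
    ```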

    [0061] The method 500 furthermore comprises, additionally to and/or independently of the hands-on recognition, the operation 502 of a vehicle function of the vehicle 100 in dependence on the sensor data. The vehicle function is possibly independent here of the automated longitudinal and/or lateral control of the vehicle 100. For example, an optical representation 230 on a display screen 116 of the vehicle 100 can be effectuated in dependence on the sensor data of the touch sensor 121, 122 as a vehicle function.

    [0062] The (already present) sensor system 121, 122 on a steering wheel 110, which is used, for example, to enable a hands-on recognition for a driving function, can thus be used as part of a user interface 106 of a vehicle 100. In particular, a circumferential sensor system 121, 122 (for example 360° circumferentially) around the steering wheel rim 115 of a steering wheel 110 can be provided having a relatively fine position resolution. For example, the sensor system 121, 122 can be designed in such a way that the position of multiple fingers of a hand on the steering wheel 110 can be associated with a corresponding point 222 of the touch of the steering wheel rim 115. One or more (possibly already present) lighting elements 111, 112 on the steering wheel 110 can be used to assist the interaction. The touch sensor system 121, 122 can be used to control the content of a display screen 116 decoupled therefrom (for example, to set the perspective of a set scene) and/or to control a specific vehicle function (for example, the sensor cleaning). A particularly convenient and safe user interface 106 for a vehicle 100 can thus be provided.

    [0063] The present invention is not restricted to the exemplary embodiments shown. In particular, it is to be noted that the description and the figures are only to illustrate the principle of the proposed methods, devices, and systems by way of example.