Vehicle user interface
10719218 · 2020-07-21
Assignee
Inventors
- Björn Alexander Jubner (Spånga, SE)
- Björn Thomas Eriksson (Stockholm, SE)
- Gunnar Martin Fröjdh (Dalarö, SE)
- Simon Greger Fellin (Sigtuna, SE)
- Stefan Johannes Holmgren (Sollentuna, SE)
CPC classification
G06F3/04842
PHYSICS
G06F3/017
PHYSICS
G06F3/0488
PHYSICS
G06F3/0421
PHYSICS
H04M2250/22
ELECTRICITY
B60K35/60
PERFORMING OPERATIONS; TRANSPORTING
G06F3/033
PHYSICS
B62D1/046
PERFORMING OPERATIONS; TRANSPORTING
B60K2360/141
PERFORMING OPERATIONS; TRANSPORTING
G06F3/04847
PHYSICS
G06F3/0416
PHYSICS
G06F2203/0339
PHYSICS
G06F3/0487
PHYSICS
B60K35/10
PERFORMING OPERATIONS; TRANSPORTING
International classification
G06F3/0484
PHYSICS
G06F3/041
PHYSICS
G06F3/0488
PHYSICS
Abstract
A vehicle user interface including a vehicle steering wheel including a grip, a sensor mounted in the steering wheel grip detecting objects touching the steering wheel grip, a plurality of individually activatable illumination units illuminating respective locations on the steering wheel grip, and a processor receiving outputs from the sensor, selectively activating a subset of the illumination units adjacent to a detected object, and controlling a plurality of vehicle functions in response to outputs of the sensor.
Claims
1. A vehicle user interface comprising: a vehicle steering wheel comprising a grip; a sensor detecting objects touching said steering wheel grip; a set of individually activatable illumination units mounted in said steering wheel grip illuminating respective locations on said steering wheel grip; and a processor receiving outputs from said sensor, selectively activating a subset of said illumination units adjacent to a driver's hand gripping said steering wheel grip, detected by said sensor, and activating a vehicle function in response to said sensor further detecting a thumb or finger of the driver's hand touching said steering wheel grip at an illuminated location.
2. The vehicle user interface of claim 1, wherein the activated subset of illumination units together illuminates a contiguous segment of said steering wheel grip, and wherein said processor activates a first vehicle function in response to said sensor detecting the thumb or finger touching one end of the illuminated segment, and a second vehicle function, different than the first vehicle function, in response to said sensor detecting the thumb or finger touching the opposite end of the illuminated segment.
3. The vehicle user interface of claim 2, wherein the first vehicle function increases a parameter for a vehicle system, and the second vehicle function decreases that parameter.
4. The vehicle user interface of claim 1, wherein said sensor outputs enable said processor to determine whether an object touching said steering wheel grip is thumb-sized or hand-sized, and wherein said processor activates vehicle functions in response to said sensor detecting an object touching said steering wheel grip, in accordance with the size of the object determined by said processor.
5. The vehicle user interface of claim 4, wherein said sensor comprises a plurality of light emitters and a plurality of light detectors mounted in said steering wheel grip, said processor determining whether an object touching said steering wheel grip is thumb-sized or hand-sized based on the number of neighboring activated emitter-detector pairs that are responsive to the object, an activated emitter-detector pair comprising an activated one of said emitters and a synchronously activated one of said detectors, and a responsive emitter-detector pair being characterized by light emitted by the light emitter of the pair out of said steering wheel grip being reflected by the object back to the light detector of the pair.
6. The vehicle user interface of claim 4, wherein autonomous drive functionality is provided in the vehicle and controlled by said processor, and wherein said processor deactivates a previously activated autonomous drive function in response to sensor outputs indicating that a hand-sized object is taking hold of said steering wheel grip.
7. The vehicle user interface of claim 1, wherein autonomous drive functionality is provided in the vehicle, and wherein prior to deactivating a previously activated autonomous drive function, said processor informs the driver that the previously activated autonomous drive function will be deactivated, by selectively activating a subset of said illumination units.
8. The vehicle user interface of claim 7, wherein said processor informs the driver that an autonomous drive function will be activated, via an activation sequence of said illumination units, and further informs the driver that the activated autonomous drive function will be deactivated, via an inversion of the activation sequence of said illumination units.
9. A vehicle user interface comprising: a vehicle steering wheel comprising a grip; a sensor comprising a plurality of light emitters and a plurality of light detectors mounted in said steering wheel grip; and a processor controlling a plurality of vehicle functions, said processor (i) activating a plurality of emitter-detector pairs, an activated emitter-detector pair comprising an activated one of said emitters and a synchronously activated one of said detectors, (ii) receiving detector outputs for the activated emitter-detector pairs, (iii) determining that an object is touching said steering wheel grip, and determining whether the object is thumb-sized or hand-sized, based on the number of activated emitter-detector pairs that are responsive to the object as determined by the received detector outputs, a responsive emitter-detector pair being characterized by light emitted by the light emitter of the pair out of said steering wheel grip being reflected by the object back to the light detector of the pair, and (iv) activating different vehicle functions in accordance with the determined size of the object.
10. The vehicle user interface of claim 9, wherein autonomous drive functionality is provided in the vehicle and controlled by said processor, and wherein said processor deactivates a previously activated autonomous drive function in response to said processor determining that a hand-sized object is taking hold of said steering wheel grip.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
(1) The present invention will be more fully understood and appreciated from the following detailed description, taken in conjunction with the drawings in which:
(19) In the disclosure and figures, the following numbering scheme is used. Light transmitters are numbered in the 100's, light detectors are numbered in the 200's, light guides and lenses are numbered in the 300's, miscellaneous items are numbered in the 400's, light beams are numbered in the 600's, and flow chart elements are numbered 1000-1100. Like numbered elements are similar but not necessarily identical.
(20) The following tables catalog the numbered elements and list the figures in which each numbered element appears.
(21) TABLE-US-00001 Light Transmitters (element: figures)
- 100: FIGS. 4, 5
- 101: FIG. 6
- 102: FIG. 6
- 105: FIGS. 8-10
- 106: FIGS. 6, 7
- 107: FIGS. 6, 7
- 108: FIGS. 6, 7
- 109: FIGS. 6, 7
(22) TABLE-US-00002 Light Detectors (element: figures)
- 200: FIG. 4
- 201: FIGS. 6, 8, 9
- 202: FIGS. 6, 8, 9
- 203: FIGS. 8, 9
- 205: FIGS. 8, 9
- 206: FIG. 8
(23) TABLE-US-00003 Light Guides and Lenses (element: figures)
- 300: FIGS. 2-7
- 301: FIGS. 8, 9
- 302: FIGS. 8, 9
- 303: FIGS. 8, 9
- 304: FIGS. 8, 9
- 305: FIGS. 8-10
(24) TABLE-US-00004 Light Beams (element: figures, description)
- 601: FIG. 10, light beam
- 602: FIGS. 3, 8-10, light beam
- 603: FIGS. 8, 9, light beam
- 604: FIGS. 8, 9, light beam
(25) TABLE-US-00005 Flow Chart Stages (element: figures, description)
- 1001-1005: FIG. 13, vehicle application state
- 1010-1019: FIG. 13, vehicle application action
(26) TABLE-US-00006 Miscellaneous Items (element: figures, description)
- 400: FIG. 1, steering wheel
- 401: FIG. 1, grip
- 402: FIG. 1, right spoke
- 403: FIG. 1, left spoke
- 404: FIG. 1, bottom spoke
- 405: FIG. 1, answer button
- 406: FIG. 1, reject button
- 410: FIGS. 12, 14-19, steering wheel
- 411: FIGS. 2-5, steering wheel frame
- 412: FIGS. 2-5, top cover
- 413: FIGS. 2-5, thumb notch
- 414: FIGS. 2-7, PCB
- 415: FIGS. 2-5, 7, light baffle
- 416: FIGS. 3, 5, transparent cover section
- 417: FIGS. 3, 5, transparent cover section
- 418: FIGS. 12, 14, 15, 17, 18, finger
- 419: FIGS. 12, 14, 16-19, hand
- 420: FIGS. 12, 15, steering wheel surface
- 421-424, 428, 431-434: FIGS. 12, 15, hand/finger movement directions
- 425: FIG. 12, clock icon
- 426: FIG. 12, finger
- 430: FIG. 14, double-tap gesture
- 436: FIGS. 14-16, illumination
- 437: FIG. 14, movement of illumination
- 438: FIGS. 14, 17, tap gesture
- 440: FIGS. 6, 11, processor
- 441-443: FIG. 11, network client
- 444: FIG. 11, message bus
DETAILED DESCRIPTION
(27) Aspects of the present disclosure relate to light-based touch controls that allow a driver to keep his hands on a steering element while operating peripheral electronic devices and automated features in a vehicle.
(28) According to a first embodiment of the invention, a steering wheel is provided with a touch sensitive strip disposed along the entire circumference of the steering wheel. In order to facilitate locating the strip, it is disposed in a thumb receiving notch or groove that is etched or otherwise formed along the circumference of the steering wheel. In addition to a touch sensor, there is also a visible-light illuminator behind or around the touch sensitive strip that is used to indicate the state of the user interface to the user, and also indicate where certain tap gestures should be performed.
(29) A user interface for this steering wheel is designed to be independent of the rotation of the steering wheel. Sweep gestures are clockwise and counter-clockwise so that they are independent of rotation of the wheel. A function is activated in response to a gesture, such as a double-tap, performed anywhere along the circumference of the wheel. The activation of some functions places the user interface into a state in which one or more additional functions can be selectively activated. In order to activate these additional functions, the touch location at which the initial gesture was performed is illuminated and subsequent gestures are performed in relation to the illuminated portion of the wheel. When a portion of the wheel is thus illuminated, and the driver slides his hand along the steering wheel grip, the illuminated portion of the steering wheel follows the hand so that the hand is always next to the location for performing subsequent gestures. Similarly, when the user switches hands gripping the steering wheel, the illumination jumps to the newly gripped part of the wheel.
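The follow-the-hand illumination described above can be sketched in a few lines, assuming a ring of evenly spaced, individually addressable LEDs around the grip. The function name, LED count, and segment width below are illustrative assumptions, not details from this disclosure:

```python
# Sketch: pick the LEDs that illuminate an arc centered on the gripping
# hand, so the lit segment tracks the hand as it slides along the wheel.
def leds_for_segment(center_deg, width_deg, led_count):
    """Return indices of LEDs covering an arc centered at center_deg."""
    spacing = 360.0 / led_count
    half = width_deg / 2.0
    return sorted(
        i for i in range(led_count)
        # angular distance from LED i to the segment center, wrapped to [0, 180]
        if min(abs(i * spacing - center_deg) % 360,
               360 - abs(i * spacing - center_deg) % 360) <= half
    )

# Hand gripping the wheel at the 90-degree position: 16 LEDs, 45-degree segment
print(leds_for_segment(90.0, 45.0, 16))
```

Re-running this with the hand's new center angle, as each Grab update arrives, produces the sliding (or jumping) illumination behavior.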
(30) Reference is made to
(31) Reference is made to
(32) Reference is made to
(33)
(34) Reference is made to
(35) The outward-facing light emitters are used to provide visual indications to the user by illuminating light-transmissive portion 416 of the steering wheel cover, and emit light in the visible range. Lenses 300 are described in assignee's U.S. application Ser. No. 14/555,731, entitled DOOR HANDLE WITH OPTICAL PROXIMITY SENSORS.
(36) Reference is made to
(37) Reference is made to
(38) Reference is made to
(39) Methods for determining two-dimensional coordinates of an object detected by the disclosed proximity sensor array are described in assignee's U.S. application Ser. No. 14/312,787, entitled OPTICAL PROXIMITY SENSORS, and U.S. application Ser. No. 14/555,731, entitled DOOR HANDLE WITH OPTICAL PROXIMITY SENSORS. Because the present application is for a steering wheel and the proximity sensor array is arranged along an arc-shaped grip of the steering wheel, the determined coordinates are polar coordinates, including a polar angle and a radial coordinate. The polar angle corresponds to a coordinate along the proximity sensor array, which in U.S. application Ser. Nos. 14/312,787 and 14/555,731 is described as an x-axis coordinate. The radial coordinate corresponds to a distance from the proximity sensor array, which in U.S. application Ser. Nos. 14/312,787 and 14/555,731 is described as a y-axis coordinate.
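The coordinate mapping above, array position to polar angle and array distance to radial coordinate, can be sketched as follows. The pair pitch and grip radius constants are assumed values for illustration only, not figures from the referenced applications:

```python
# Sketch: convert the sensor array's x/y coordinates (as described in the
# referenced applications) into the polar coordinates used on the wheel.
PAIR_PITCH_DEG = 2.0    # assumed angular spacing between adjacent sensor pairs
GRIP_RADIUS_MM = 190.0  # assumed radius of the steering wheel grip

def sensor_to_polar(x_index, y_distance_mm):
    """Map (position along array, distance from array) -> (angle, radius)."""
    polar_angle = (x_index * PAIR_PITCH_DEG) % 360.0
    radial = GRIP_RADIUS_MM + y_distance_mm  # distance outward from the grip
    return polar_angle, radial

print(sensor_to_polar(45, 10.0))
```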
(40) Discussion now turns to the firmware and software used to detect and interpret user gestures. There are five basic gesture components that are detected by the hardware and low-level drivers: (i) Thumb-Tap, (ii) Thumb-Glide, (iii) Thumb-Long-Press, (iv) Grab and (v) Rim-Tap. These components are emitted on the network as they are detected, and are used by higher level software to assemble more complex gestures such as double-taps. Application software interprets these gestures as input commands. In some embodiments of the invention multiple client applications are connected via a network to the detector firmware. The firmware sends information for each detected gesture component over the network, and a client application translates that information into commands and/or constructs compound gestures from multiple gesture components.
(41) Reference is made to
(42) The five basic gesture components are categorized according to whether they are performed by a large object (hand) or small object (thumb), and whether the nature of the gesture component is discrete or continuous, as presented in the table below.
(43) TABLE-US-00007 (component: description, object, type)
- Thumb-Tap: tap thumb on steering wheel rim, Small, Discrete
- Thumb-Glide: glide thumb along steering wheel rim, Small, Continuous
- Thumb-Long-Press: hold thumb on steering wheel rim, Small, Continuous
- Grab: grab hold of steering wheel rim, Large, Continuous
- Rim-Tap: tap hand on steering wheel rim, Large, Discrete
These gesture components are alternatively referred to as follows.
(44) TABLE-US-00008 (component: alternative name)
- Thumb-Tap: small-object tap
- Thumb-Glide: small-object glide
- Thumb-Long-Press: small-object touch-and-hold
- Grab: large-object grab
- Rim-Tap: large-object tap
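The small-versus-large distinction is made, per claims 5 and 9, by counting how many neighboring emitter-detector pairs detect light reflected by the object. A hedged sketch, where the threshold value is an assumption for illustration:

```python
# Sketch: classify a detected object as thumb-sized (small) or hand-sized
# (large) from the responsive emitter-detector pairs, per claims 5 and 9.
THUMB_MAX_PAIRS = 3  # assumed: more responsive pairs than this means a hand

def object_size(responsive_pairs):
    """Classify by the number of responsive emitter-detector pairs."""
    count = len(responsive_pairs)
    if count == 0:
        return None       # nothing touching the grip
    if count <= THUMB_MAX_PAIRS:
        return "small"    # thumb-sized object
    return "large"        # hand-sized object

print(object_size([4, 5]))              # a thumb covers few pairs
print(object_size([2, 3, 4, 5, 6, 7]))  # a hand covers many pairs
```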
(45) The parameters are the same for all gesture components; namely, time stamp, start angle (min_angle), end angle (max_angle), center angle (angle) and state.
(46) The angle parameters refer to a polar angle along the steering wheel at which the object is detected. Because of the object's size, there is a first polar angle at which the object begins (start angle) and a last polar angle at which the object ends (end angle). The midpoint between the start and end angles (center angle) is used as the object's polar angle. The start and end angles are useful for determining the size of a detected object.
(47) The state parameter takes on three values: RECOGNIZED, UPDATED and ENDED. The ENDED state is applied to all discrete gesture components, and also when a continuous gesture component ends. The RECOGNIZED and UPDATED states are only applied to continuous gesture components. The RECOGNIZED state is applied when a continuous gesture component is first detected. The UPDATED state is applied during the course of a continuous gesture component.
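The parameter set and state machine above suggest a message structure along the following lines. The field names follow the text (`min_angle`, `max_angle`, `angle`, state), but the class itself is an illustrative sketch, not the actual firmware format:

```python
from dataclasses import dataclass
from enum import Enum

class State(Enum):
    RECOGNIZED = 1  # continuous component first detected
    UPDATED = 2     # continuous component in progress
    ENDED = 3       # component finished (the only state used for discrete taps)

@dataclass
class GestureComponent:
    name: str         # e.g. "Thumb-Glide"
    timestamp: float
    min_angle: float  # start angle: first polar angle at which the object begins
    max_angle: float  # end angle: last polar angle at which the object ends
    state: State

    @property
    def angle(self):
        """Center angle: midpoint of the start and end angles."""
        return (self.min_angle + self.max_angle) / 2.0

    @property
    def width(self):
        """Angular extent, useful for determining the object's size."""
        return self.max_angle - self.min_angle

g = GestureComponent("Thumb-Tap", 0.0, 88.0, 92.0, State.ENDED)
print(g.angle, g.width)
```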
(48) The discrete gesture components, Thumb-Tap and Rim-Tap, are emitted to the clients only after they have completed, and only one message is sent per component. They are only sent with the state ENDED.
(49) The continuous gesture components, Thumb-Glide, Thumb-Long-Press and Grab, are emitted to the clients intermittently from the instant that they are recognized until they end when the hand or finger leaves the rim. When they are first recognized, they are sent to the network with the state RECOGNIZED. With a configurable interval, the gesture component is reported to the network with new parameters and the state UPDATED. When the gesture component ends, the gesture component is sent with the state ENDED.
(50) Reference is made to
(51) A Thumb-Tap gesture component is generated when a small object touches the rim (or gets very close) and then is lifted from the rim within a short period. This period is configurable, but typically it is 100-200 ms.
(52) A Rim-Tap gesture component is the same as a Thumb-Tap, but for a large object such as a hand.
(53) A Thumb-Glide gesture component is generated when a small object touches the rim and moves at least a certain threshold distance along the rim. That distance is configurable. When it continues to move, UPDATE messages are sent when the object has moved a certain distance, also configurable.
(54) A Grab gesture component is the same as a Thumb-Glide gesture component, but for a large object touching the rim, and with the difference that the Grab gesture component does not have to move to be reported on the network. When the hand has been on the rim for a certain time threshold, the Grab gesture component is recognized and messages are intermittently sent to the network.
(55) A Thumb-Long-Press gesture component is generated when a small object is present, and not moving, on the rim. When the small object has been present for a certain time, messages are sent intermittently to the network about the gesture component. If the object starts moving, the Thumb-Long-Press gesture component is ended and a Thumb-Glide gesture component is started instead.
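The component rules in paragraphs (51) through (55) can be sketched as a simple classifier for small-object touches. The text describes the thresholds as configurable; the specific constant values and function below are illustrative assumptions:

```python
# Sketch: classify a small-object (thumb) touch into a gesture component,
# following the rules in paragraphs (51)-(55). Thresholds are assumed.
TAP_MS = 150      # a tap must lift within roughly 100-200 ms
GLIDE_DEG = 5.0   # minimum movement along the rim to count as a glide
HOLD_MS = 500     # stationary time before a long-press is reported

def classify_small_object(duration_ms, moved_deg, still_touching):
    """Return the gesture component name, or None if not yet classifiable."""
    if not still_touching and duration_ms <= TAP_MS and moved_deg < GLIDE_DEG:
        return "Thumb-Tap"
    if moved_deg >= GLIDE_DEG:
        # movement ends a long-press and starts a glide, per paragraph (55)
        return "Thumb-Glide"
    if still_touching and duration_ms >= HOLD_MS:
        return "Thumb-Long-Press"
    return None

print(classify_small_object(120, 0.0, False))  # quick lift
print(classify_small_object(800, 12.0, True))  # moving along the rim
print(classify_small_object(800, 0.0, True))   # held still
```

The large-object components (Grab, Rim-Tap) would follow the same pattern, keyed on the object-size classification rather than on movement alone.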
(56) As mentioned above, gesture components are combined into compound user interface gestures. In some cases, environment conditions at the gesture location are combined with the gesture component to define a gesture. For example, a Thumb-Tap gesture performed at one end of an illuminated portion of the rim is translated into a first command, and a Thumb-Tap gesture performed at the other end of the illuminated portion of the rim is translated into a second command. The following table lists the different gestures and compound gestures in the steering wheel user interface, the gesture components that make up each gesture, additional gesture parameters, and example context and commands for each gesture.
(57) TABLE-US-00009 (gesture: components, additional parameters, example context, example command)
- Tap inside notch: Thumb-Tap; thumb-tap performed at top or bottom of illuminated segment of steering wheel; cruise control is active; increase or decrease cruise control speed in 5 mph increments.
- Tap on steering wheel outer rim: Rim-Tap; during phone call; reactivate phone interaction, e.g., when phone call is active for a set period of time, enabling hanging up the phone call with a clockwise swipe gesture.
- Single object double-tap inside notch: two Thumb-Taps; thumb-taps have different time stamps, similar center angles; vehicle is in motion; activate cruise control and illuminate location of double-tap.
- Single object double-tap on steering wheel outer rim: two Rim-Taps; side of steering wheel rim (left or right) at which double-tap is performed; car is not moving, and Park Assist icon is displayed on HUD; activate Park Assist to park on left or right side of car, based on tapped side of rim.
- Multi-touch double-tap inside notch: two Thumb-Taps; thumb-taps have similar time stamps, different center angles; autonomous drive is not active; activate autonomous drive.
- Extended touch inside notch: Thumb-Long-Press; thumb-long-press performed at top or bottom of illuminated segment of steering wheel; cruise control is active; increase or decrease cruise control speed in 1 mph increments.
- Grab: Grab; autonomous drive is active; deactivate autonomous drive and enter cruise control mode.
- Swipe: Thumb-Glide; clockwise/counter-clockwise; cruise control is active; increase or decrease distance from forward car in cruise control mode.
- Radial swipe: Thumb-Glide; thumb-glide data structures have similar center angles and different radial coordinates; cruise control is active; open cruise control menu on HUD.
- Slide: Grab; grab data structures have different time stamps and different center angles; portion of steering wheel is selectively illuminated; move illumination to new hand location (follow slide movement).
- Switch hands: Grab; grab data structures have different time stamps and different center angles; portion of steering wheel is selectively illuminated; move illumination to new hand location (jump to other side of wheel).
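The first row of the gesture table, a tap inside the notch while cruise control is active, combines a gesture component with the environment condition at the gesture location. A hedged sketch of that translation, where the segment bounds and the angular tolerance are illustrative assumptions:

```python
# Sketch: translate a Thumb-Tap near one end of the illuminated segment
# into "increase speed" and near the other end into "decrease speed".
def cruise_command(tap_angle, seg_start, seg_end, tol=5.0):
    """Map a Thumb-Tap's center angle to a cruise control command."""
    if abs(tap_angle - seg_end) <= tol:
        return "speed +5"   # tap at top of the illuminated portion
    if abs(tap_angle - seg_start) <= tol:
        return "speed -5"   # tap at bottom of the illuminated portion
    return None             # tap elsewhere: no cruise command

print(cruise_command(118.0, 60.0, 120.0))
print(cruise_command(62.0, 60.0, 120.0))
```

The extended-touch row works the same way, substituting Thumb-Long-Press for Thumb-Tap and 1 mph for 5 mph increments.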
(58) Reference is made to
(59) The flowchart of
(60) The user enters Adaptive Cruise Control mode from Normal Drive mode by performing a double-tap gesture. The user enters Autonomous Drive mode from Normal Drive mode and from Adaptive Cruise Control mode by performing a multi-touch double-tap gesture. These gestures are described below. In order to alert the driver that Autonomous Drive mode will begin shortly, the steering wheel is illuminated with an illumination pattern that indicates a countdown until Autonomous Drive is activated.
(61) The user exits Adaptive Cruise Control mode by performing a double-tap gesture that opens a menu on the HUD for changing the mode 1015 of cruise control. The user performs clockwise or counter-clockwise swipe gestures to scroll through the different modes on the HUD, and performs a single-tap gesture to select the displayed mode. One of the modes is Exit ACC 1018, and selecting this mode exits Adaptive Cruise Control. Another mode configures the cruise control application to follow the road signage 1019.
(62) The user exits Autonomous Drive mode 1013 by grabbing the rim of the steering wheel. In order to alert the driver that Autonomous Drive mode is about to exit, the steering wheel is illuminated with an illumination pattern that indicates a countdown until Autonomous Drive is deactivated. Upon exiting Autonomous Drive mode, the vehicle enters Adaptive Cruise Control mode.
(63) In Adaptive Cruise Control mode 1002 the user adjusts a distance 1016 between the vehicle and the vehicle directly in front of it, by performing a clockwise or counter-clockwise swipe gesture. The user adjusts the speed of the vehicle by performing either a tap gesture or an extended touch gesture. When the vehicle enters Adaptive Cruise Control mode 1002 a segment of the steering wheel is illuminated. A tap gesture or extended touch gesture at one end of the illuminated segment increases the vehicle speed, and a tap gesture or extended touch gesture at the other end of the illuminated segment decreases the vehicle speed.
(64) A voice control state 1004 can be entered from Normal Drive mode and Adaptive Cruise Control mode. In this state, the user can initiate a phone call by saying "call" followed by the name of a contact from his phone's contact list. Once the call has been connected, the user can hang up 1010 by performing a clockwise swipe gesture. The user can also adjust the volume 1011 by saying the word "volume" and then performing a counter-clockwise swipe gesture to raise the volume, or a clockwise swipe gesture to lower the volume.
(65) When an incoming phone call 1005 is received, the user can answer the call 1012 by performing a counter-clockwise swipe gesture, or decline the call 1012 by performing a clockwise swipe gesture.
(66) Reference is made to
(67) When Adaptive Cruise Control is active the user has four options; namely, adjust cruise control speed, adjust the distance between the vehicle and the vehicle ahead, open an adaptive cruise control menu, and activate Autonomous Drive mode. As mentioned above Adaptive Cruise Control is activated when the user taps twice with his thumb in the steering wheel thumb notch. The location of these taps is subsequently illuminated to indicate to the user where to perform future gestures. This is illustrated in drawing (b) in
(68) If the user slides his hand 419 along steering wheel 410, the illuminated portion 436 moves with the hand so that the user's thumb is always next to the illuminated portion of the steering wheel. This is illustrated in drawing (c) in
(69) In some embodiments the cruise control speed is also adjusted in response to extended touch gestures above and below the illuminated portion of the steering wheel. For example, the speed is adjusted by 5 km/h in response to a tap gesture, and is adjusted by 1 km/h in response to an extended touch gesture.
(70) In order to increase or decrease the distance between the vehicle and the vehicle in front of it on the road, the user performs clockwise and counter-clockwise swipe gestures. These are illustrated in drawings (f) and (g) in
(71) In order to change the mode of Adaptive Cruise Control the user performs a radial swipe gesture with his thumb across the width of the steering wheel thumb notch. This is illustrated in drawings (h) and (i) in
(72) Reference is made to
(73) Once the user performs this multi-touch double-tap gesture, a series of locations on the steering wheel is sequentially illuminated over time to indicate a countdown until Autonomous Drive is activated, as illustrated in drawings (b) and (c). For example, viewing the upright steering wheel as a clock, drawing (b) illustrates a sequence of illuminations that begins with (i) the 2:30 and 9:30 clock positions indicated by a 1; followed by (ii) the 1:30 and 10:30 clock positions indicated by 2; followed by (iii) the 12:30 and 11:30 clock positions indicated by 3. Drawing (c) illustrates finally illuminating the 12 o'clock position indicated by the word Go to inform the user that Autonomous Drive is activated and the user can safely take his hands off the wheel.
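The countdown illumination described above can be sketched as a fixed sequence of symmetric clock positions ending with the "Go" indication. The `light_position` callback is a hypothetical LED-driver hook, and the one-second step is an assumed interval:

```python
import time

# Sketch: sequentially illuminate pairs of clock positions to count down
# to Autonomous Drive, then light the 12 o'clock "Go" position.
COUNTDOWN = [("2:30", "9:30"), ("1:30", "10:30"), ("12:30", "11:30")]

def run_countdown(light_position, step_s=1.0):
    """Illuminate symmetric clock positions, then the 12 o'clock 'Go'."""
    for left, right in COUNTDOWN:
        light_position(left)
        light_position(right)
        time.sleep(step_s)
    light_position("12:00")  # 'Go': the driver may take his hands off the wheel

lit = []
run_countdown(lit.append, step_s=0.0)
print(lit)
```

Paragraph (74) describes the exit countdown; per claim 8, it could be produced by running this sequence inverted.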
(74) In order to exit Autonomous Drive mode and enter Adaptive Cruise Control mode, the user grabs the steering wheel. Reference is made to
(75) In both Normal Drive mode and Adaptive Cruise Control mode the user can enable voice-activated controls by tapping twice on the outer rim of the steering wheel. When voice-activated controls are enabled the user disables these controls by repeating the same double-tap gesture.
(76) Two voice-activated controls are illustrated in
(77) Reference is made to
(78) Reference is made to
(79) In a city scenario the user interface provides a park assist function that automatically parks the car without the user's intervention. Reference is made to
(80) In the foregoing specification, the invention has been described with reference to specific exemplary embodiments thereof. It will, however, be evident that various modifications and changes may be made to the specific exemplary embodiments without departing from the broader spirit and scope of the invention. In particular, sensors other than optical sensors may be used to implement the user interface, inter alia capacitive sensors disposed along the circumference of the steering wheel, and cameras that capture images of the steering wheel. Accordingly, the specification and drawings are to be regarded in an illustrative rather than a restrictive sense.