Adaptive Touch User Interface Systems and Methods
20200363925 · 2020-11-19
Inventors
- Franziska Lang (Ventura, CA, US)
- Martin Francisco (Pasadena, CA, US)
- Eric Brown (North Hollywood, CA, US)
- Matthew Potter (Porter Ranch, CA, US)
- Paul Ferraiolo (Ventura, CA, US)
- Ross Carmichael (Everett, WA, US)
CPC classification
- G06F3/038 (PHYSICS)
- G06F3/0488 (PHYSICS)
- B60K2360/1442 (PERFORMING OPERATIONS; TRANSPORTING)
- G06F3/04886 (PHYSICS)
- G06F2203/04809 (PHYSICS)
- G06F2203/04803 (PHYSICS)
- G06F3/04847 (PHYSICS)
- G06F3/0416 (PHYSICS)
- B60K35/00 (PERFORMING OPERATIONS; TRANSPORTING)
- G06F3/016 (PHYSICS)
- B60K35/10 (PERFORMING OPERATIONS; TRANSPORTING)
- B60K35/28 (PERFORMING OPERATIONS; TRANSPORTING)
International classification
- G06F3/0488 (PHYSICS)
Abstract
A system for an adaptive touch user interface includes a visual display and an adaptive touch-input interface. The visual display is configured to display one or more interactive elements thereon in accordance with a software application that receives user input in connection with displayed information. The touch-input interface is configured to receive user input and communicate the user input to the software application. The touch-input interface also has a surface profile that is adaptable in connection with the display of the one or more interactive elements.
Claims
1. A system for an adaptive touch user interface, comprising: a visual display configured to display one or more interactive elements thereon in accordance with a software application that receives user input in connection with displayed information; and a touch-input interface having a surface profile adaptable in connection with the display of the one or more interactive elements, the touch-input interface configured to receive user input and communicate the user input to the software application.
2. The system of claim 1, further including one or more actuators configured to adapt the surface profile of the touch-input interface in connection with the display of the one or more interactive elements.
3. The system of claim 1, wherein the touch-input interface comprises a plurality of touch-input surface elements that define the surface profile, each touch-input surface element being individually extendable and retractable to adjust the surface profile in connection with the display of the one or more interactive elements.
4. The system of claim 3, wherein extended touch-input surface elements are activated so as to receive the user input; and wherein retracted touch-input surface elements are deactivated so as to not receive the user input.
5. The system of claim 3, wherein the plurality of touch-input surface elements comprise an array of substantially rectangular touch-input surface elements.
6. The system of claim 3, wherein the plurality of touch-input surface elements comprise a matrix of substantially square touch-input surface elements.
7. The system of claim 1, wherein the touch-input interface is configured to adapt to form one or more of: a substantially planar surface profile, a bar surface profile, a track ball surface profile, and a keyboard surface profile.
8. The system of claim 1, wherein the user input communicates interaction with the interactive elements by way of the touch-input interface via one or more of: clicking, scrolling, dragging, selecting, zooming, swiping, and pointing actions on the touch-input interface.
9. The system of claim 1, wherein the touch-input interface is substantially flush with a vehicle panel, when adjusted to a neutral surface profile.
10. A vehicle having an adaptive touch user interface, the vehicle comprising: a control unit configured to execute a software application for controlling one or more vehicle systems, the software application receiving user input in connection with displayed information; a visual display configured to display one or more interactive elements thereon in accordance with the software application; and a touch-input interface having a surface profile adaptable in connection with the display of the one or more interactive elements, the touch-input interface configured to receive user input and communicate the user input to the software application.
11. The vehicle of claim 10, further including one or more actuators configured to adapt the surface profile of the touch-input interface in connection with the display of the one or more interactive elements.
12. The vehicle of claim 10, wherein the touch-input interface comprises a plurality of touch-input surface elements that define the surface profile, each touch-input surface element being individually extendable and retractable to adjust the surface profile in connection with the display of the one or more interactive elements.
13. The vehicle of claim 12, wherein extended touch-input surface elements are activated so as to receive the user input; and wherein retracted touch-input surface elements are deactivated so as to not receive the user input.
14. The vehicle of claim 12, wherein the plurality of touch-input surface elements comprise an array of substantially rectangular touch-input surface elements.
15. The vehicle of claim 12, wherein the plurality of touch-input surface elements comprise a matrix of substantially square touch-input surface elements.
16. The vehicle of claim 10, wherein the touch-input interface is configured to adapt to form one or more of: a substantially planar surface profile, a bar surface profile, a track ball surface profile, and a keyboard surface profile.
17. The vehicle of claim 10, wherein the user input communicates interaction with the interactive elements by way of the touch-input interface via one or more of: clicking, scrolling, dragging, selecting, zooming, swiping, and pointing actions on the touch-input interface.
18. The vehicle of claim 10, wherein the touch-input interface is substantially flush with a panel of the vehicle, when adjusted to a neutral surface profile.
19. A method for providing user input to a software application, the method comprising: visually displaying one or more interactive elements on a visual display in accordance with the software application; adapting a surface profile of a touch-input interface in connection with the display of the one or more interactive elements; receiving user input via the touch-input interface having the adapted surface profile; and communicating the received user input to the software application.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0012] The features, objects, and advantages of the present invention will become more apparent from the detailed description set forth below, when taken in conjunction with the drawings, in which like reference characters identify corresponding elements throughout.
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS
[0016] The above described drawing figures illustrate the present invention in at least one embodiment, which is further defined in detail in the following description. Those having ordinary skill in the art may be able to make alterations and modifications to what is described herein without departing from its spirit and scope. While the present invention is susceptible of embodiment in many different forms, there is shown in the drawings and will herein be described in detail at least one preferred embodiment of the invention with the understanding that the present disclosure is to be considered as an exemplification of the principles of the present invention, and is not intended to limit the broad aspects of the present invention to any embodiment illustrated. It will therefore be understood that what is illustrated is set forth for the purposes of example, and should not be taken as a limitation on the scope of the present invention.
[0017] The present invention generally relates to an adaptive touch user interface having a surface profile that is adaptable in connection with the display of the one or more interactive elements.
[0018] In at least one embodiment, a system 100 for an adaptive touch user interface includes a visual display 120, a touch-input interface 140, and a control unit 160 configured to execute a software application.
[0019] The software application may be any software application that involves the receipt of user input in connection with displayed information. Exemplary software applications include those for controlling various vehicle systems, such as vehicle entertainment systems, climate control systems, driver assistance systems, security systems, navigation systems, etc., through user input, as well as operating systems and other software applications.
[0020] The visual display 120 may be any type of device capable of visually communicating information to a user, such as a liquid-crystal display (LCD) screen, a plasma screen, etc. The visual display 120 may be configured to visually display one or more interactive elements 122 thereon in accordance with software applications being executed by the control unit 160. The interactive elements 122 may be arranged according to various arrangements, such as lists, matrices, etc.
[0021] The touch-input interface 140 may be any type of interface capable of allowing a user to provide the user input in accordance with the software application via touch, e.g., a touch-sensitive surface. The touch-input interface 140 may include a plurality of touch-input surface elements 142, which define the surface profile, and which individually and collectively may be capable of allowing the user to provide the user input.
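As a non-limiting illustration of this structure, the following Python sketch models the touch-input interface 140 as a grid of individually extendable touch-input surface elements 142 whose extend/retract states collectively define the surface profile. The class and attribute names (SurfaceElement, TouchInputInterface, extended, active) are hypothetical and are introduced only for this sketch.

```python
from dataclasses import dataclass, field

@dataclass
class SurfaceElement:
    """One individually extendable/retractable touch-input surface element (142)."""
    row: int
    col: int
    extended: bool = False  # physical state, driven by an actuator
    active: bool = False    # whether touch input on this element is reported

@dataclass
class TouchInputInterface:
    """A grid of surface elements whose states define the surface profile (140)."""
    rows: int
    cols: int
    elements: list = field(default_factory=list)

    def __post_init__(self):
        self.elements = [SurfaceElement(r, c)
                         for r in range(self.rows) for c in range(self.cols)]

    def surface_profile(self):
        """Return the profile as a rows x cols matrix of 1 (extended) / 0 (retracted)."""
        profile = [[0] * self.cols for _ in range(self.rows)]
        for e in self.elements:
            profile[e.row][e.col] = 1 if e.extended else 0
        return profile

pad = TouchInputInterface(rows=2, cols=3)
print(pad.surface_profile())  # all elements start retracted: [[0, 0, 0], [0, 0, 0]]
```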
[0022] The touch-input interface 140 may be configured to allow the user to provide the user input by way of selecting one or more of the interactive elements 122 displayed on the visual display 120, or otherwise interacting with the system 100 in accordance with the software application. Accordingly, portions of the touch-input interface 140 may correspond to portions of the visual display 120 such that touch input received by the touch-input interface 140 may cause the generation of a corresponding visualization on the display 120.
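As a minimal sketch of the correspondence described above, the function below maps a contact point on the touch-input interface 140 to a point on the visual display 120 using a simple proportional mapping; the function name, coordinate convention, and example dimensions are assumptions for illustration only, and a real system could restrict the mapping to the active area 146 and interactive area 126.

```python
def map_touch_to_display(touch_xy, interface_size, display_size):
    """Map a contact point on the touch-input interface (140) to display
    coordinates (120), assuming the whole interface corresponds to the
    whole display."""
    tx, ty = touch_xy
    iw, ih = interface_size
    dw, dh = display_size
    return (tx / iw * dw, ty / ih * dh)

# Example: a touch at (60, 20) on a 120 x 40 interface maps to (400.0, 240.0)
# on an 800 x 480 display, where a corresponding visualization may be generated.
print(map_touch_to_display((60, 20), (120, 40), (800, 480)))
```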
[0023] The touch-input interface 140 may further include one or more actuators 144 configured to adapt the surface profile in accordance with the displayed interactive elements 122.
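By way of a non-limiting illustration, the actuator behavior may be sketched as follows, assuming one actuator per element; the class names and the extend/retract commands are hypothetical stand-ins, and the physical drive mechanism (solenoid, piezo, pneumatic, etc.) is deliberately abstracted away. Consistent with claims 4 and 13, extending an element also activates it for input, and retracting it deactivates it.

```python
from dataclasses import dataclass

@dataclass
class Element:
    extended: bool = False
    active: bool = False

class ElementActuator:
    """Sketch of an actuator (144) driving a single touch-input surface element (142)."""
    def __init__(self, element):
        self.element = element

    def extend(self):
        self.element.extended = True
        self.element.active = True   # extended elements are activated to receive input

    def retract(self):
        self.element.extended = False
        self.element.active = False  # retracted elements are deactivated

element = Element()
ElementActuator(element).extend()
print(element)  # Element(extended=True, active=True)
```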
[0025] The control unit 160 may include a processor 162 and a memory 164. The processor 162 may instruct other components, such as the touch-input interface 140 and the visual display 120, to perform various tasks based on the processing of information and/or data that may have been previously stored or received, such as instructions and/or data stored in the memory 164. The processor 162 may be a standard processor, such as a central processing unit (CPU), or may be a dedicated processor, such as an application-specific integrated circuit (ASIC) or a field-programmable gate array (FPGA).
[0026] The memory 164 stores at least instructions and/or data that can be accessed by the processor 162. For example, the memory 164 may be any hardware capable of storing information accessible by the processor, such as a ROM, a RAM, a hard drive, a CD-ROM, a DVD, or other write-capable or read-only storage media. The set of instructions may be included in software that can be implemented by the system 100. It should be noted that the terms instructions, steps, algorithms, and programs may be used interchangeably. Data can be retrieved, manipulated, or stored by the processor 162 in accordance with the set of instructions or other sets of executable instructions. The data may also be stored as a collection of data.
[0027] It is to be understood that the configuration illustrated in the drawing figures is merely exemplary, and that other configurations of the components described herein may be employed without departing from the scope of the present disclosure.
[0028] In at least one embodiment, a vehicle (not shown) may be provided with the system 100 for an adaptive touch user interface. In such embodiments, the vehicle may include a panel surface, which may at least partially comprise the touch-input interface 140 such that the touch-input interface 140 is, when in the neutral position, substantially flush with the remainder of the panel surface adjacent thereto. The panel surface may include, for example, a dashboard surface, a console surface, a door surface, a ceiling surface, a floor surface, an armrest surface, or any other vehicle surface accessible by the user.
[0029] An exemplary operation of the system 100 will now be described.
[0030] In accordance with the grid arrangement 202 of the displayed interactive elements 122, the control unit 160 may control the actuators 144 to extend the touch-input surface elements 142 so as to form the touch-input interface 140 in a substantially planar surface profile 204. In the substantially planar surface profile 204, substantially the entire surface of the touch-input interface 140 defines the active area 146, which corresponds to the interactive area 126 of the visual display 120 spanning substantially the entire display. This arrangement allows the user to interact with any of the displayed interactive elements 122 in a natural and intuitive manner.
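A possible activation mask for this planar profile is sketched below; the 0/1 encoding (retracted/extended) and the function name are illustrative assumptions rather than a description of the actual control logic.

```python
def planar_profile(rows, cols):
    """Mask for the substantially planar surface profile 204: every
    touch-input surface element is extended (1), so the entire interface
    is the active area 146 and corresponds to the full display."""
    return [[1] * cols for _ in range(rows)]

print(planar_profile(3, 4))  # [[1, 1, 1, 1], [1, 1, 1, 1], [1, 1, 1, 1]]
```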
[0032] In accordance with the list arrangement 206 of the displayed interactive elements 122, the control unit 160 may control the actuators 144 to extend the touch-input surface elements 142 so as to form the touch-input interface 140 in a left bar surface profile 208, while inactive touch-input surface elements 142 may be retracted. In the left bar surface profile 208, the active area 146 is defined by a bar of one or more touch-input surface elements 142 on the left side of the touch-input interface 140, which corresponds to the interactive area 126 of the visual display 120 containing the list of interactive elements 122. This arrangement allows the user to interact with the displayed list of interactive elements 122 in a natural and intuitive manner.
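The left bar surface profile 208 may be expressed with the same hypothetical 0/1 mask convention, extending only a bar of elements on the left side and retracting (and deactivating) the rest; the bar width is an assumed parameter.

```python
def left_bar_profile(rows, cols, bar_width=1):
    """Mask for the left bar surface profile 208: a bar of elements on the
    left side is extended/active (1); the remaining elements are retracted
    and deactivated (0)."""
    return [[1 if c < bar_width else 0 for c in range(cols)] for _ in range(rows)]

for row in left_bar_profile(4, 6):
    print(row)
# [1, 0, 0, 0, 0, 0]  (printed once per row of the 4 x 6 interface)
```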
[0034] It will be understood that the arrangements described herein are illustrative, and any arrangement of touch-input surface elements 142 and interactive elements 122 is within the scope of this disclosure. Moreover, correlations between various arrangements of interactive elements 122 and various arrangements of touch-input surface elements 142 may be stored in the memory 164 and may be referred to by the processor 162 in controlling the relevant components.
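One way such stored correlations might look is sketched below as a simple lookup table; the arrangement names, profile names, and fallback behavior are illustrative assumptions rather than a description of the actual contents of the memory 164.

```python
# Hypothetical correlation table, as might be held in memory 164 and
# consulted by the processor 162 when the displayed arrangement changes.
ARRANGEMENT_TO_PROFILE = {
    "grid": "planar",
    "list": "left_bar",
    "text_entry": "keyboard",
    "map": "track_ball",
}

def profile_for(arrangement):
    """Look up the surface profile correlated with a displayed arrangement,
    falling back to a neutral (flush) profile for unknown arrangements."""
    return ARRANGEMENT_TO_PROFILE.get(arrangement, "neutral")

print(profile_for("list"))   # left_bar
print(profile_for("video"))  # neutral
```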
[0035] It will also be understood that, while the touch-input surface elements 142 are shown as rectangular in shape, the touch-input surface elements 142 may be of any shape or configuration that allows for the adaptation of the surface profile in connection with the display of the interactive elements 122. In particular, it will be understood that configurations in which the touch-input interface 140 takes on a curved profile (e.g., resembling a track ball) or a more modular profile (e.g., resembling a keyboard) are expressly contemplated.
[0036] It will further be understood that the touch-input interface 140 may be configured to recognize and receive various touch input actions in addition to, or as an alternative to, the touch selection described herein for illustrative purposes. Such touch input actions may include clicking, scrolling, dragging, selecting, zooming, swiping, pointing, and other actions known for providing touch input via touch-sensitive surface user interfaces, which actions may include statically or dynamically contacting the active area 146 at one or more locations of the touch-input interface 140.
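By way of illustration only, a static selection (e.g., a tap) might be distinguished from a dynamic action (e.g., a swipe) using a short sequence of contact samples, as sketched below; the distance threshold and the two-way classification are simplifying assumptions, not the recognition logic of the disclosed system.

```python
def classify_touch_action(samples, move_threshold=10.0):
    """Classify a sequence of (x, y) contact samples on the active area 146
    as a static 'tap' or a dynamic 'swipe' based on total travel distance."""
    if len(samples) < 2:
        return "tap"
    (x0, y0), (x1, y1) = samples[0], samples[-1]
    travel = ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5
    return "swipe" if travel > move_threshold else "tap"

print(classify_touch_action([(5, 5), (6, 5), (5, 6)]))    # tap
print(classify_touch_action([(5, 5), (20, 5), (40, 6)]))  # swipe
```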
[0037] A method for providing user input to a software application via an adaptive touch user interface will now be described with reference to steps 301 through 303 below.
[0038] At step 301, interactive elements 122 are visually displayed on the visual display 120 in accordance with the software application. As discussed herein, this may involve displaying the interactive elements 122 in various arrangements.
[0039] At step 302, the surface profile of the touch-input interface 140 is adapted in connection with the display of the interactive elements 122. As discussed herein, this may involve actuating the touch-input surface elements 142 (e.g., extending or retracting them) to form the active area 146 corresponding to the arrangement of the interactive elements 122.
[0040] At step 303, user input is received by the touch-input surface elements 142 of the active area 146 and communicated to the software application. If the software application's receipt of the user touch input results in the software application modifying the arrangement of the interactive elements 122, the process returns to step 301.
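The overall flow of steps 301 through 303 may be sketched as a simple control loop; the app, display, and touch_interface objects and their methods (current_arrangement, show, adapt_to, read_input, handle) are hypothetical stand-ins for the software application, visual display 120, and touch-input interface 140.

```python
def run_adaptive_touch_loop(app, display, touch_interface):
    """Illustrative control loop for steps 301-303 (a sketch, not the claimed method)."""
    arrangement = app.current_arrangement()
    while True:
        display.show(arrangement)                   # step 301: display interactive elements
        touch_interface.adapt_to(arrangement)       # step 302: adapt the surface profile
        user_input = touch_interface.read_input()   # step 303: receive user input ...
        arrangement = app.handle(user_input)        # ... and communicate it to the application;
                                                    # if the arrangement changes, the next pass
                                                    # of the loop returns to step 301
```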
[0041] The objects, advantages, and features described in detail above are considered novel over the prior art of record and are considered critical to the operation of at least one embodiment of the present invention and to the achievement of at least one objective of the present invention. The words used in this specification to describe these objects, advantages, and features are to be understood not only in the sense of their commonly defined meanings, but also to include any special definition with regard to structure, material, or acts that would be understood by one of ordinary skill in the art to apply in the context of the entire disclosure.
[0042] Moreover, various elements described herein generally include hardware and/or software/firmware, including but not limited to: processors, memories, input/output interfaces, operating systems and network interfaces, configured to effectuate the functionalities described herein. When implemented in software, the elements of the invention are essentially the code segments to perform the necessary tasks. The code segments can be stored in a processor readable medium or transmitted by a computer data signal. The processor readable medium may include any medium that can store information. Examples of the processor readable medium include an electronic circuit, a semiconductor memory device, a ROM, a flash memory or other non-volatile memory, a floppy diskette, a CD-ROM, an optical disk, a hard disk, etc.
[0043] As used herein, the terms "a" or "an" shall mean one or more than one. The term "plurality" shall mean two or more than two. The term "another" is defined as a second or more. The terms "including" and/or "having" are open ended (e.g., comprising). The term "or" as used herein is to be interpreted as an inclusive or, meaning any one or any combination. Therefore, "A, B or C" means any of the following: A; B; C; A and B; A and C; B and C; A, B and C. An exception to this definition will occur only when a combination of elements, functions, steps, or acts are in some way inherently mutually exclusive.
[0044] Reference throughout this document to "one embodiment," "certain embodiments," "an embodiment," or similar terms means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the present invention. Thus, the appearances of such phrases in various places throughout this specification are not necessarily all referring to the same embodiment. Furthermore, the particular features, structures, or characteristics may be combined in any suitable manner in one or more embodiments without limitation.
[0045] Moreover, the definitions of the words or drawing elements described herein are meant to include not only the combination of elements which are literally set forth, but all equivalent structures, materials or acts for performing substantially the same function in substantially the same way to obtain substantially the same result. In this sense, it is therefore contemplated that an equivalent substitution of two or more elements may be made for any one of the elements described and its various embodiments or that a single element may be substituted for two or more elements in a claim without departing from the scope of the present invention.
[0046] Changes from the claimed subject matter as viewed by a person with ordinary skill in the art, now known or later devised, are expressly contemplated as being equivalents within the scope intended and its various embodiments. Therefore, obvious substitutions now or later known to one with ordinary skill in the art are defined to be within the scope of the defined elements. This disclosure is thus meant to be understood to include what is specifically illustrated and described above, what is conceptually equivalent, what can be obviously substituted, and also what incorporates the essential ideas.
[0047] The scope of this description is to be interpreted in conjunction with the appended claims.