User terminal, electronic device, and control method thereof
10416848 · 2019-09-17
Assignee
Inventors
- Young-kyu Jin (Seoul, KR)
- Young-ho Rhee (Yongin-si, KR)
- Young-shil Jang (Suwon-si, KR)
- Il-ku Chang (Seongnam-si, KR)
CPC classification
G06F3/04842
PHYSICS
B60K35/211
PERFORMING OPERATIONS; TRANSPORTING
G06F3/0488
PHYSICS
G06F2203/04808
PHYSICS
B60K35/60
PERFORMING OPERATIONS; TRANSPORTING
B60K35/10
PERFORMING OPERATIONS; TRANSPORTING
International classification
G06F3/0488
PHYSICS
Abstract
A method for controlling a user terminal is provided. The user terminal controlling method includes, when an external apparatus providing a UI screen is connected, providing a touch mode where a touch manipulation to navigate menu items provided on the UI screen is input, and when a predetermined touch manipulation is input in the touch mode, transmitting information corresponding to the input touch manipulation to the external apparatus.
Claims
1. A method for controlling a user terminal, comprising: when an external apparatus providing a UI screen is connected, providing a touch mode where a touch manipulation to navigate menu items provided on the UI screen is input; and when a predetermined touch manipulation is input in the touch mode, transmitting information corresponding to the input touch manipulation to the external apparatus, wherein the external apparatus is a center fascia screen apparatus of a vehicle, and wherein, when the touch manipulation is input in the touch mode, comparing information corresponding to the input touch manipulation with received information regarding the UI screen by comparing a number of touch points with a number of sub menu items of the selected item.
2. The method as claimed in claim 1, wherein the information corresponding to a touch manipulation is at least one of a number of touch manipulations and the number of touch points input at a time, wherein the number of touch manipulations navigates a depth of the menu items and the number of touch points input at a time navigates menu items provided in a same depth.
3. The method as claimed in claim 1, further comprising: receiving information regarding the UI screen from the external apparatus, wherein the transmitting information corresponding to the input touch manipulation to the external apparatus comprises transmitting information corresponding to the input touch manipulation to the external apparatus according to a result of the comparison.
4. The method as claimed in claim 3, wherein the information regarding a UI screen is information regarding an item currently selected on a menu provided on the UI screen and the number of sub menu items of the selected item, wherein the information corresponding to a touch manipulation is at least one of a number of touch manipulations and the number of touch points input at a time.
5. The method as claimed in claim 4, wherein if the number of touch points exceeds the number of sub menu items of the selected item based on the comparison result, an error message is output, and if the number of touch points does not exceed the number of sub menu items of the selected item, the number of touch points is transmitted to the external apparatus.
6. A method for controlling an electronic apparatus which is interlocked with a user terminal, comprising: providing a UI screen; performing connection with a user terminal which provides a touch mode where a touch manipulation to navigate menu items provided on the UI screen is input; receiving information corresponding to a touch manipulation input in a touch mode of the user terminal; and navigating menu items provided on the UI screen according to the received information, wherein the electronic apparatus is a center fascia screen apparatus of a vehicle, and wherein, when the touch manipulation is input in the touch mode, comparing information corresponding to the input touch manipulation with received information regarding the UI screen by comparing a number of touch points with a number of sub menu items of the selected item.
7. The method as claimed in claim 6, wherein the information corresponding to a touch manipulation is at least one of a number of touch manipulations and the number of touch points input at a time, wherein the number of touch manipulations navigates a depth of the menu items and the number of touch points input at a time navigates menu items provided in a same depth.
8. A user terminal, comprising: a communicator which performs communication with an external apparatus providing a UI screen; a display which provides a touch mode where a touch manipulation to navigate menu items provided on the UI screen is input; and a controller which, when it is connected to the external apparatus, provides the touch mode, and when a predetermined touch manipulation is input in the touch mode, controls to transmit information corresponding to the input touch manipulation to the external apparatus, wherein the external apparatus is a center fascia screen apparatus of a vehicle, and wherein, when the touch manipulation is input in the touch mode, comparing information corresponding to the input touch manipulation with received information regarding the UI screen by comparing a number of touch points with a number of sub menu items of the selected item.
9. The user terminal as claimed in claim 8, wherein the information corresponding to a touch manipulation is at least one of a number of touch manipulations and the number of touch points input at a time, wherein the number of touch manipulations navigates a depth of the menu items and the number of touch points input at a time navigates menu items provided in a same depth.
10. The user terminal as claimed in claim 8, wherein the controller, when information regarding the UI screen is received from the external apparatus and the touch manipulation is input in the touch mode, controls to compare information corresponding to the input touch manipulation with the received information regarding the UI screen and transmit information corresponding to the input touch manipulation to the external apparatus.
11. The user terminal as claimed in claim 10, wherein the information regarding the UI screen is information regarding an item which is currently selected on a menu provided on the UI screen and the number of sub menu items of the selected item, wherein the information corresponding to a touch manipulation is at least one of a number of touch manipulations and the number of touch points input at a time.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
(1) The above and/or other aspects of the present inventive concept will be more apparent by describing certain exemplary embodiments of the present inventive concept with reference to the accompanying drawings, in which:
DETAILED DESCRIPTION
(11) Certain exemplary embodiments are described in greater detail below with reference to the accompanying drawings.
(12) In the following description, like drawing reference numerals are used for the like elements, even in different drawings. The matters defined in the description, such as detailed construction and elements, are provided to assist in a comprehensive understanding of exemplary embodiments. However, exemplary embodiments can be practiced without those specifically defined matters. Also, well-known functions or constructions are not described in detail since they would obscure the application with unnecessary detail.
(14) Referring to
(15) The user terminal receives a user command to manipulate a User Interface (UI) screen provided by the electronic apparatus 200 and transmits it to the electronic apparatus 200.
(16) Specifically, once the user terminal 100 is connected to the electronic apparatus 200, the user terminal 100 may provide a touch mode where a touch manipulation to navigate menu items provided on the UI screen of the electronic apparatus 200 is input.
(17) In addition, when a predetermined touch manipulation is input in the touch mode, the user terminal 100 may transmit information corresponding to the input touch manipulation to the electronic apparatus 200.
(18) Meanwhile, the user terminal 100 may be realized in various forms such as a mobile phone, a notebook computer, a PMP, an MP3 player, and so on, but is not limited thereto as long as it includes a touch pad.
(19) The electronic apparatus 200 provides a UI screen which can be manipulated in the user terminal 100, and herein the UI screen may include various menu items.
(20) In addition, the electronic apparatus 200 may be connected to the user terminal 100 which provides a touch mode where a touch manipulation to navigate menu items provided in the UI screen is input.
(21) The electronic apparatus 200 may also receive information corresponding to a touch manipulation input in the touch mode of the user terminal 100 and navigate menu items provided on the UI screen according to the received information.
(22) Meanwhile, the electronic apparatus 200 may be realized as a center fascia screen apparatus in a vehicle, but may be any apparatus providing a UI which can be controlled by a touch manipulation input in the user terminal 100. Herein, the center fascia refers to the front portion of the vehicle interior between the driver seat and the passenger seat, where devices such as an audio system, an air conditioner, a navigation system, etc. are located. The center fascia screen apparatus may be an audio system, a navigation system, and so on.
(24) Referring to
(25) The communication unit 110 communicates with the external apparatus 200 which provides a UI screen.
(26) In particular, the communication unit 110 performs connection to the external apparatus 200 and then transmits an event occurring in the display unit 120, which will be explained later, to the external apparatus 200.
(27) Herein, the communication unit 110 may be realized as a Bluetooth communication module, a Wi-Fi communication module, a USB communication module, and so on. Accordingly, the communication unit 110 may perform communication with the electronic apparatus 200 using a known wired or wireless protocol such as Bluetooth, Wireless Fidelity (Wi-Fi), USB, Internet, LAN, Ethernet, TCP/IP, IPX, FireWire, IEEE 1394, iLink, CDMA, TDMA, High Definition Multimedia Interface (HDMI-CEC), Wireless HDMI-CEC, Radio Frequency (RF), and so on.
(28) The display unit 120 may display various information provided by the user terminal 100.
(29) However, when the user terminal 100 is connected to the external apparatus 200, the display unit 120 may provide a touch mode where a user touch command to manipulate a UI screen provided by the external apparatus 200 is input.
(30) Herein, the display unit 120 may be realized as a touch screen which forms an interlayer structure with respect to a touch pad. In this case, the display unit 120 may perform not only an output function but also the functions of a user interface which will be explained later. In addition, the touch screen may be configured to detect not only the location and size of a touch input but also the pressure of a touch input.
(31) The controller 130 controls overall operations of the user terminal 100.
(32) In particular, if the controller 130 is connected to the external apparatus 200, the controller may control the display unit 120 to provide a touch mode where a user touch command to manipulate a UI screen provided by the external apparatus 200 is input.
(33) In addition, if a predetermined touch manipulation is input, the controller 130 may control the communication unit 110 to transmit information corresponding to the input touch manipulation to the external apparatus 200.
(34) Herein, the information corresponding to a touch manipulation may be at least one of the number of touch manipulations and the number of touch points which are input at a time.
(35) In this case, the number of touch manipulations may navigate the depth of menu items, and the number of touch points which are input at a time may navigate menu items provided in the same depth.
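The two counts described above can be sketched as follows. This is a hypothetical Python sketch, not part of the disclosed embodiment: the function name `interpret_touch` and the dictionary fields are assumptions; the embodiment specifies only that the number of touch manipulations selects the depth and the number of simultaneous touch points selects an item within that depth.

```python
def interpret_touch(num_manipulations: int, num_touch_points: int) -> dict:
    """Translate the two counts transmitted by the user terminal into a
    navigation command. Field names are illustrative only."""
    return {
        "depth": num_manipulations,         # which menu depth level to act on
        "item_index": num_touch_points - 1  # which item within that depth (0-based)
    }

# Two successive manipulations select the second depth level;
# three simultaneous touch points select the third item at that depth.
command = interpret_touch(num_manipulations=2, num_touch_points=3)
```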
(36) In addition, when information regarding a UI screen is received from the external apparatus 200, the controller 130 may compare information corresponding to an input manipulation with information regarding the received UI screen and control to transmit information corresponding to the input touch manipulation to the external apparatus based on the comparison result.
(37) Herein, the information regarding the UI screen may be information regarding an item which is currently selected on the menus provided by the UI screen and information regarding the number of sub-menu items of the selected item.
(38) In this case, the controller 130 may compare the number of touch points and the number of sub-menu items of the currently-selected item.
(39) Based on the comparison result, if the number of touch points does not exceed the number of sub-menu items of the currently-selected item on the UI screen, the controller 130 may control to transmit the number of touch points to the external apparatus 200.
(40) Alternatively, if the number of touch points exceeds the number of sub-menu items of the selected item, the controller 130 may control to output an error message.
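The validation step in the preceding paragraphs can be illustrated with a minimal sketch. The function name and return convention here are assumptions; the embodiment states only that the touch-point count is compared against the sub-menu-item count, with an error message on overflow and transmission otherwise.

```python
def handle_touch_points(num_touch_points: int, num_sub_items: int):
    """Compare the input touch points against the number of sub-menu items
    of the currently selected item, as the controller 130 is described
    to do. Returns ("error", message) or ("send", payload)."""
    if num_touch_points > num_sub_items:
        # More fingers than sub-menu items: output an error message locally
        # instead of transmitting anything to the external apparatus.
        return ("error", "no menu item matches this number of touch points")
    # Valid selection: transmit the touch-point count to the external apparatus.
    return ("send", num_touch_points)

result = handle_touch_points(2, 3)   # within range -> ("send", 2)
overflow = handle_touch_points(4, 3) # out of range -> error
```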
(41) Meanwhile, various manipulation methods which will be explained later may be used when moving from a sub menu to an upper menu.
(42) According to an exemplary embodiment, a specific touch manipulation may be set as a command to move to an upper step in advance.
(43) For example, it may be predetermined that the touch gesture of sweeping from right to left is a command to move to an upper step. In addition, with reference to information received from the electronic apparatus 200, a command to move to an upper step may be set automatically according to the number of sub-menu items of a currently-highlighted menu item. For example, if there are 3 sub-menu items, a touch manipulation of 4 touch points, which is otherwise unused, may be set as a command to move to an upper step.
(44) Meanwhile, a touch manipulation to execute a selected item may be set. For example, the gesture of sweeping from top to bottom may be set as a command to execute the corresponding menu item.
(45) In addition, a touch manipulation to move a current highlight key position to an uppermost item (for example, a home menu) may be set. For example, the gesture of sweeping from bottom to top may be set as a command to move to the uppermost item.
(46) However, the above-described touch gestures are only examples, and various other touch gestures may be set as a specific user command. In addition, it is possible to set/change a touch gesture in various ways according to a user setting.
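The gesture assignments above can be summarized in a short sketch. The gesture names and the mapping structure are illustrative assumptions; only the three example gestures and the rule that an unused touch-point count (one more than the number of sub-menu items) may mean "move up" come from the text.

```python
# Example gesture-to-command table, per the gestures named in the text.
GESTURE_COMMANDS = {
    "sweep_right_to_left": "move_up_one_level",
    "sweep_top_to_bottom": "execute_selected_item",
    "sweep_bottom_to_top": "move_to_home_menu",
}

def command_for_touch_points(num_touch_points: int, num_sub_items: int) -> str:
    """Map a touch-point count to a command, treating the first unused
    count (num_sub_items + 1) as an automatic 'move up' command."""
    if num_touch_points <= num_sub_items:
        return f"select_item_{num_touch_points}"
    if num_touch_points == num_sub_items + 1:
        return "move_up_one_level"
    return "invalid"

# With 3 sub-menu items, a 4-point touch maps to the "move up" command.
up = command_for_touch_points(4, 3)
```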
(48) Referring to
(49) The display unit 210 displays various UI screens provided by the electronic apparatus 200. Herein, the display unit 210 may be realized as at least one of a thin film transistor-liquid crystal display, an organic light-emitting diode, a flexible display, and a 3D display. In addition, the display unit 210 may be realized as a touch screen which forms an interlayer structure with respect to a touch pad.
(50) Meanwhile, the UI screen provided by the display unit 210 may include menu items which are configured to have a predetermined depth. In this case, the menu items are displayed such that whether the menu items are selected or not is identified using highlight.
(51) The communication unit 220 performs communication with the user terminal 100 which receives a user command to navigate menu items provided on the UI screen.
(52) Herein, the user terminal 100 may provide a touch mode where a touch manipulation to navigate menu items provided on the UI screen is input.
(53) In this case, the communication unit 220 may receive information corresponding to a touch manipulation input in the touch mode of the user terminal 100.
(54) When information corresponding to a touch manipulation input in the touch mode of the user terminal 100 is received through the communication unit 220, the controller 230 navigates menu items provided on the UI screen according to the received information.
(55) Herein, the information corresponding to a touch manipulation may be at least one of the number of touch manipulations and the number of touch points which are input at a time.
(56) In this case, the controller 230 may navigate the depth of the menu items based on the number of touch manipulations, and navigate the menu items provided in the same depth based on the number of touch points which are input at a time. This will be explained later in greater detail with reference to the corresponding drawings.
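The apparatus-side behavior just described can be sketched as a simple walk over a menu tree. The `MenuItem` structure, the example center-fascia menu contents, and the 1-based indexing convention are assumptions for illustration; the embodiment states only that the controller 230 moves the highlight according to the received touch information.

```python
class MenuItem:
    """Illustrative node in a menu tree with a predetermined depth."""
    def __init__(self, name, children=None):
        self.name = name
        self.children = children or []

# Hypothetical center-fascia menu for illustration.
root = MenuItem("home", [
    MenuItem("audio", [MenuItem("radio"), MenuItem("mp3")]),
    MenuItem("navigation"),
    MenuItem("air_conditioner"),
])

def navigate(current: MenuItem, num_touch_points: int) -> MenuItem:
    """Highlight the n-th sub-item (1-based) of the current item,
    descending one depth level per received manipulation."""
    index = num_touch_points - 1
    if 0 <= index < len(current.children):
        return current.children[index]
    return current  # out of range: keep the current highlight

selected = navigate(root, 1)      # one touch point  -> "audio"
selected = navigate(selected, 2)  # two touch points -> "mp3"
```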
(57) Although not illustrated in the drawings, the electronic apparatus 200 may further comprise a UI processor (not shown).
(58) The UI processor (not shown) performs the function of processing/generating various UI screens in 2D or 3D form. Herein, the UI screen may be a menu screen including a plurality of menu items, but is not limited thereto. That is, the UI screen may be a screen displaying a warning sentence, or text or figures such as a time or channel number.
(59) In addition, the UI processor (not shown) may perform such operations as 2D/3D conversion of UI elements, adjustment of transparency, color, size, shape, and location, highlighting, and animation effects under the control of the controller 230.
(61) As illustrated in
(62) That is, a user may navigate menu items included in a UI screen provided by the electronic apparatus 200 by touching the screen of the user terminal 100, which provides a touch mode, with his or her fingers in series or all at once, as if playing a piano.
(63) In this case, according to the number of fingers touching the screen, an audio feedback may be provided so as to make the user feel as if he or she is playing a piano, thereby enhancing reliability of interaction.
(65) As illustrated in
(66) As illustrated in
(67) As illustrated in
(68) As illustrated in
(69) As illustrated in
(71) As illustrated in
(73) Specifically, if a one-touch-point manipulation is input in the user terminal 100, the waterdrop shape of the first menu item may be magnified and highlighted.
(74) Subsequently, if a two-touch-point manipulation is input, the first menu item may be restored to its original shape, and the waterdrop shape of the second menu item, corresponding to the two touch points, may be magnified and highlighted.
(75) If a three- or four-touch-point manipulation is input, the shapes of the menu items may be changed in a similar manner.
(77) According to the method for controlling a user terminal illustrated in
(78) Herein, when a predetermined touch manipulation is input in the touch mode, information corresponding to the input touch manipulation is transmitted to the external apparatus (S730).
(79) In this case, the information corresponding to the touch manipulation may be at least one of the number of touch manipulations and the number of touch points input at once.
(80) In addition, the number of touch manipulations may navigate the depth of menu items, and the number of touch points input at once may navigate menu items provided in the same depth.
(81) Further, when information regarding a UI screen is received from the external apparatus and a touch manipulation is input in the touch mode, information corresponding to the input touch manipulation may be compared with the received information regarding the UI screen. Subsequently, the information corresponding to the input touch manipulation may be transmitted to the external apparatus based on the result of comparison.
(82) Herein, the information regarding a UI screen may be information regarding an item which is currently selected on a menu provided by the UI screen and information regarding the number of sub menu items of the selected item. In addition, the information corresponding to the touch manipulation may be at least one of the number of touch manipulations and the number of touch points input at a time.
(83) In this case, the number of touch points and the number of sub menu items of the selected item may be compared, and information corresponding to the touch manipulation may be transmitted to an external apparatus based on the result of the comparison.
(84) Specifically, if the number of touch points exceeds the number of sub menu items of the selected item based on the comparison result, an error message may be output. In addition, if the number of touch points does not exceed the number of sub menu items of the selected item, the number of touch points may be transmitted to an external apparatus.
(86) According to the method for controlling an electronic apparatus which is interlocked with a user terminal illustrated in
(87) Subsequently, information corresponding to a touch manipulation input in the touch mode of the user terminal is received (S830).
(88) The menu items provided on the UI screen are navigated according to the received information (S840).
(89) Herein, the information corresponding to the touch manipulation may be at least one of the number of touch manipulations and the number of touch points input at a time.
(90) In addition, the number of touch manipulations may navigate the depth of menu items, and the number of touch points input at once may navigate menu items provided in the same depth.
(92) According to an operation of a user terminal and an electronic apparatus illustrated in
(93) Subsequently, the user terminal 100 may provide a touch mode for manipulating a UI screen of the electronic apparatus 200 (S930).
(94) When a predetermined touch manipulation is input in the touch mode of the user terminal 100 (S940), information corresponding to the input touch manipulation is transmitted to the electronic apparatus 200 (S950). In this case, the electronic apparatus 200 may navigate menu items provided on the UI screen according to the received information (S960).
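The end-to-end exchange from S940 through S960 can be sketched under the same assumptions as above: the terminal validates the touch input against the UI information it received, then transmits it; the apparatus moves its highlight accordingly. The transport is replaced by a direct function call purely for illustration; the actual connection would use one of the wired or wireless protocols listed earlier.

```python
def terminal_send(num_touch_points, num_sub_items):
    """User-terminal side (S940-S950): validate, then transmit or refuse."""
    if num_touch_points > num_sub_items:
        return None  # terminal outputs an error message instead of sending
    return num_touch_points

def apparatus_navigate(menu_items, payload):
    """Electronic-apparatus side (S960): highlight the matching item."""
    if payload is None:
        return None
    return menu_items[payload - 1]  # 1-based touch-point count

# Hypothetical menu on the center fascia screen apparatus.
menu = ["radio", "mp3", "bluetooth_audio"]
payload = terminal_send(2, len(menu))
highlighted = apparatus_navigate(menu, payload)  # second item is highlighted
```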
(95) As described above, according to an exemplary embodiment, a method for manipulating a UI remotely using a piano metaphor method may be provided.
(96) Accordingly, a user may access a desired function more rapidly and accurately without watching the screen, unlike the existing interaction where the user touches a touch screen and selects a desired function thereon. A user may select a desired function more rapidly once he or she is accustomed to the method, as if playing a musical instrument.
(97) The above methods according to various exemplary embodiments may be realized simply by upgrading the software of an existing device or user terminal.
(98) A program to perform the above various methods may be stored and used in various types of recording medium.
(99) Specifically, the program may be stored and provided in various types of recording medium which is readable by a terminal, such as Random Access Memory (RAM), flash memory, Read Only Memory (ROM), Erasable Programmable ROM (EPROM), Electrically Erasable Programmable ROM (EEPROM), register, hard disk, removable disk, memory card, USB memory, CD-ROM, and so on.
(100) The foregoing embodiments and advantages are merely exemplary and are not to be construed as limiting the present invention. The present teaching can be readily applied to other types of apparatuses. Also, the description of the exemplary embodiments of the present inventive concept is intended to be illustrative, and not to limit the scope of the claims, and many alternatives, modifications, and variations will be apparent to those skilled in the art.