DISPLAY APPARATUS, DISPLAY CONTROL METHOD, AND NON-TRANSITORY COMPUTER-READABLE STORAGE MEDIUM FOR STORING PROGRAM
20190369867 · 2019-12-05
Inventors
CPC classification
G06F3/04842
PHYSICS
G06F3/0416
PHYSICS
G01C21/3664
PHYSICS
G06F3/0488
PHYSICS
B60K35/50
PERFORMING OPERATIONS; TRANSPORTING
B60K2360/1442
PERFORMING OPERATIONS; TRANSPORTING
G06F2203/04101
PHYSICS
G06F3/04886
PHYSICS
B60K35/10
PERFORMING OPERATIONS; TRANSPORTING
International classification
G06F3/0488
PHYSICS
Abstract
A display apparatus comprises a display unit configured to display screen data containing a plurality of selectable items; and a display control unit configured to change the screen data in accordance with selection of an item contained in the screen data, when a touch operation is performed on a display surface of the display unit. During a course of the touch operation, the display control unit changes the screen data in accordance with a position of a conductive object in a space on the display surface, which corresponds to an item.
Claims
1. A display apparatus comprising: a display unit configured to display screen data containing a plurality of selectable items; and a display control unit configured to change the screen data in accordance with selection of an item contained in the screen data, when a touch operation is performed on a display surface of the display unit, wherein during a course of the touch operation, the display control unit changes the screen data in accordance with a position of a conductive object in a space on the display surface, which corresponds to an item.
2. The apparatus according to claim 1, wherein when the touch operation has passed the position in the space on the display surface, which corresponds to the item, the display control unit determines that the item is selected and changes the screen data.
3. The apparatus according to claim 1, wherein the display unit is a capacitive touch panel capable of detecting a change in capacitance between the touch panel and the conductive object.
4. The apparatus according to claim 3, further comprising: a detection unit configured to detect a capacitance between the detection unit and the object; and a determination unit configured to determine whether the capacitance detected by the detection unit falls within a predetermined range, wherein after the determination unit determines that the capacitance falls within the predetermined range, if the determination result from the determination unit changes and indicates that the capacitance falls outside the predetermined range, the display control unit determines that the item is selected and changes the screen data.
5. The apparatus according to claim 4, wherein the object is a finger, and the display apparatus further comprises a changing unit configured to change the predetermined range based on information about the finger.
6. The apparatus according to claim 5, wherein the information about the finger contains one of humidity and a size of the finger.
7. The apparatus according to claim 5, further comprising an obtaining unit configured to obtain the information about the finger.
8. The apparatus according to claim 7, wherein the display apparatus is mounted in a vehicle, and the obtaining unit obtains the information about the finger based on information about a passenger of the vehicle.
9. The apparatus according to claim 1, wherein the number of items contained in screen data as a transition target of the display control unit is not more than the number of items contained in screen data as a transition destination.
10. The apparatus according to claim 1, wherein the display control unit changes the screen data between a plurality of predetermined screen data.
11. The apparatus according to claim 10, further comprising a determination unit configured to determine a layout of items contained in each of the plurality of screen data.
12. The apparatus according to claim 11, further comprising a storage unit configured to store the selected item as log information, wherein the determination unit determines the layout of items contained in each of the plurality of screen data based on the log information.
13. The apparatus according to claim 11, wherein the determination unit determines the layout such that scattering of items contained in each of the plurality of screen data decreases on the display screen.
14. A display control method to be executed in a display apparatus, comprising: displaying screen data containing a plurality of selectable items on a display unit; changing the screen data in accordance with selection of an item contained in the screen data, when a touch operation is performed on a display surface of the display unit; and changing the screen data in accordance with a position of a conductive object in a space on the display surface, which corresponds to an item, during a course of the touch operation.
15. A non-transitory computer-readable storage medium storing a program for causing a computer to perform: displaying screen data containing a plurality of selectable items on a display unit; changing the screen data in accordance with selection of an item contained in the screen data, when a touch operation is performed on a display surface of the display unit; and changing the screen data in accordance with a position of a conductive object in a space on the display surface, which corresponds to an item, during a course of the touch operation.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
DESCRIPTION OF THE EMBODIMENT
[0027] An embodiment will be explained in detail below with reference to the accompanying drawings. Note that the following embodiment does not restrict the invention according to the scope of the claims, and not all combinations of features explained in the embodiment are necessarily essential to the invention. Two or more of the features explained in the embodiment can be combined freely. Note also that the same reference numerals denote the same or similar parts, and a repetitive explanation thereof will be omitted.
[0028]
[0029]
[0030] In the control unit 200, the individual blocks are connected via a bus, and a CPU 201 controls each block connected to the bus. A ROM 202 stores a basic control program and parameters for operating the control unit 200. The operation of the display apparatus 110 explained in this embodiment is implemented by the CPU 201 loading the control program and parameters into a RAM 203 and executing them. The display apparatus 110 can thus also serve as a computer that carries out the present invention in accordance with the program. The ROM 202 stores locus information obtained from the touch panel 222 by a locus obtaining unit 204.
[0031] The locus obtaining unit 204 obtains, from the touch panel 222, the locus information indicating the locus of the finger of a passenger on the touch panel 222. A screen data generation unit 205 generates screen data to be displayed on the touch panel 222, based on operation log information 212 (to be described later). A region adjusting unit 206 adjusts a detection region for detecting the finger on the touch panel 222, based on user attribute information obtained from the ECU 230. The user attribute information will be described later.
[0032] The storage unit 210 is a hard disk or the like, and stores screen data 211, the operation log information 212, and region adjustment reference information 213. The screen data 211 is, for example, the setting screen of each device 240 mounted in the cabin of the vehicle 100, and contains screen data of a plurality of layers. The operation log information 212 is log information about an operation indicating a setting item selected on the touch panel 222 by the user of the display apparatus 110. In this embodiment, the user is a passenger in the cabin of the vehicle 100. The region adjustment reference information 213 contains reference information with which the region adjusting unit 206 adjusts the detection region for detecting the finger on the touch panel 222.
[0033] The speaker 220 outputs voice guidance for, for example, a setting screen or a navigation screen displayed on the touch panel 222. The microphone 221 receives the voice of a user. The input voice data can also be used in, for example, the authentication of a passenger. The touch panel 222 is a capacitive touch panel that can detect a change in capacitance between the touch panel 222 and a conductive object, such as a finger, approaching the touch panel 222, and can specify the position of the finger by detecting the change. The touch panel 222 can be either a surface capacitive type or a projected capacitive type. The operation accepting unit 223 can accept user operations via, for example, a power switch, an LED, and hardware keys.
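As an illustration of the capacitive sensing principle described above, the following sketch models the capacitance between a hovering fingertip and the panel as rising while the finger approaches, and inverts that model to estimate hover height. The model form and all constants are hypothetical assumptions for illustration, not values from this disclosure.

```python
# Hypothetical model (not from the patent): capacitance between a fingertip
# and the panel grows as the finger approaches the display surface.

BASE_CAPACITANCE = 10.0   # pF, assumed panel capacitance with no finger present
COUPLING = 50.0           # pF*mm, assumed finger-to-panel coupling constant

def capacitance_at(distance_mm: float) -> float:
    """Modelled capacitance for a finger hovering at the given height."""
    return BASE_CAPACITANCE + COUPLING / max(distance_mm, 0.1)

def estimate_distance(capacitance_pf: float) -> float:
    """Invert the model to recover an approximate hover height in mm."""
    return COUPLING / max(capacitance_pf - BASE_CAPACITANCE, 1e-6)
```

Under this assumed model, a larger measured capacitance always corresponds to a smaller Z-axis distance, which is the relationship the detection processing below relies on.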
[0034] The ECU 230 is a unit mounted in a control device that implements driving control of the vehicle 100. This driving control includes control in which the vehicle system is the main driving party, and control in which the driver is the main driving party. The ECU 230 identifies a user by obtaining image data of a passenger captured by a camera 231 installed in the cabin of the vehicle 100. The ECU 230 can also identify the user by using not only the camera 231 but also detection information from a sensor 232 such as a pressure sensor mounted on a seat.
[0035] The ECU 230 can communicate with the external server 250 across a wireless communication network (not shown) by using an I/F 233. The server 250 includes a CPU 251, a ROM 252, a RAM 253, and a storage unit 254. The CPU 251 comprehensively controls the server 250 by loading a program stored in the ROM 252 into the RAM 253 and executing the program. The storage unit 254 stores information for identifying a passenger of the vehicle 100. The CPU 251 can identify the passenger based on, for example, voice data, image data, and sensor detection information transmitted from the vehicle 100.
[0036] The display apparatus 110 is connected to the devices 240 so that they can communicate with each other. The devices 240 include an air-conditioner 241, an illuminator 242, an audio component 243, and a radio 244 installed in the cabin of the vehicle 100. The display apparatus 110 transmits setting information set on the setting screen displayed on the touch panel 222 (for example, volume information for the audio component 243) to each of the devices 240. Based on the transmitted setting information, each device controls its operation.
[0037]
[0038] The one-touch setting mode will be explained below with reference to
[0039] In this embodiment as shown in
[0040]
[0041] The screen 900 shows a state in which the selection item 904 is selected. In this case, the screen 900 changes to a screen 910. The screen 910 displays selection items 911, 912, and 913 for selecting devices. The screen 910 changes to a setting screen for a CD when the selection item 911 is selected, changes to a setting screen for a radio when the selection item 912 is selected, and changes to a setting screen for a USB when the selection item 913 is selected.
[0042] The screen 910 shows a state in which the selection item 912 is selected. In this case, the screen 910 changes to a screen 920. The screen 920 displays selection items 921, 922, and 923 for selecting stations. For example, when the selection item 923 is selected, the radio 244 outputs the radio broadcast of an 80.0-MHz station.
[0043]
[0044] First, when the finger 804 reaches the region 801 shown in
[0045] Assume that the user selects the selection item 904 on the screen 900. In this case, the user moves the finger 804 to the position of the selection item 904 on the XY plane, and further moves the finger 804 closer to the touch panel 222 (Z-direction movement). When the finger 804 reaches the region 802 as shown in
[0046] Assume that the user selects the selection item 912 on the screen 910. In this case, the user moves the finger 804 to the position of the selection item 912 on the XY plane, and further moves the finger 804 closer to the touch panel 222 (Z-direction movement). When the finger 804 reaches the region 803 as shown in
[0047] The selection items 904, 912, and 923 are selected by the locus of the series of movements of the finger 804 described above. The arrow shown in
[0048] Referring to
[0049] In the above description, the determination in step S103 is explained by taking, as an example, whether the menu for the one-touch setting mode is selected. However, it is also possible to display a button such as Execute one-touch setting mode on the main screen, and, if this button is selected, determine in step S103 that designation of the one-touch setting mode is accepted.
[0050] In the above explanation, the one-touch setting mode is not set on the main screen displayed in step S102. This can also be implemented by imposing the limitation that the position of the finger 804 on the XY plane can be specified when the surface of the touch panel 222 and the finger 804 are in contact with each other, that is, when the capacitance between the surface and the finger 804 becomes larger than a threshold. In addition, the CPU 201 can also cancel this limitation in step S104.
[0051]
[0052] In step S202, the CPU 201 determines whether the finger 804 exists in a first region, that is, the region 801. If it is determined that the finger 804 exists in the region 801, the process advances to step S203, and the CPU 201 obtains locus information of the finger 804 and stores the information in the ROM 202. In this step, the CPU 201 displays a pointer on the screen 900 so that the pointer corresponds to the position of the finger 804 on the XY plane. With a configuration like this, the user can easily recognize the position pointed on the touch panel 222 by the finger 804, even when the finger 804 is apart from the touch panel 222.
[0053] The processing in step S202 is repeated after step S203. On the other hand, if it is determined in step S202 that the finger 804 does not exist in the region 801, the process advances to step S204. In step S204, the CPU 201 determines whether the finger 804 exists in a second region, that is, the region 802. If it is determined that the finger 804 does not exist in the region 802, the process advances to step S205, and the CPU 201 resets the locus information stored in the ROM 202. Since this is a case in which the finger 804 moves away from the region 801, the processing in step S201 is executed when the finger 804 reaches the region 801 again. On the other hand, if it is determined in step S204 that the finger 804 exists in the region 802, the process advances to step S206.
[0054] Details of the procedures in steps S202, S203, and S204 will be explained below with reference to
[0055] The process shown in
[0056] In step S403, the CPU 201 determines whether the change in capacitance in step S401 is an increase. If it is determined that the change in capacitance is an increase, the process advances to step S404, and the CPU 201 determines that the finger 804 has moved closer to the touch panel 222, and terminates the process shown in
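The direction test of steps S403 and S404 can be sketched as follows: a rise in the measured capacitance is read as the finger approaching the panel, and a fall as the finger retreating. The noise-floor threshold is an assumed parameter added for illustration; the patent does not specify one.

```python
# Sketch of the approach/retreat decision in steps S401-S404.

NOISE_FLOOR = 0.05  # pF; assumed threshold below which a change is ignored

def finger_direction(prev_pf: float, curr_pf: float) -> str:
    """Classify one capacitance sample pair as approaching, retreating, or steady."""
    delta = curr_pf - prev_pf
    if abs(delta) < NOISE_FLOOR:
        return "steady"
    return "approaching" if delta > 0 else "retreating"
```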
[0057] The process shown in
[0058] Referring to
[0059] The processing in step S206 will be explained with reference to
[0060] In step S503, the CPU 201 determines a transition destination screen in accordance with the specified item. For example, if the selection item 904 on the screen 900 is selected, the CPU 201 determines that the screen 910 is a transition destination screen. After that, the CPU terminates the process shown in
[0061] In step S208, the CPU 201 determines whether the finger 804 exists in the second region, that is, the region 802. If it is determined that the finger 804 exists in the region 802, the process advances to step S209, and the CPU 201 obtains locus information of the finger 804 on the XY plane and stores the information in the ROM 202. In this step, the CPU 201 displays a pointer on the screen 910 so that the pointer corresponds to the position of the finger 804 on the XY plane. A configuration like this enables the user to easily recognize the position pointed on the touch panel 222 by the finger 804, even when the finger 804 is apart from the touch panel 222.
[0062] After step S209, the CPU 201 repeats the processing in step S208. On the other hand, if it is determined in step S208 that the finger 804 does not exist in the region 802, the process advances to step S210. In step S210, the CPU 201 determines whether the finger 804 exists in a third region, that is, the region 803. If it is determined that the finger 804 does not exist in the region 803, the process advances to step S211, and the CPU 201 resets the locus information stored in the ROM 202. Since the finger 804 has moved away toward the region 801 in this case, the CPU 201 re-executes the processing in step S201. On the other hand, if it is determined in step S210 that the finger 804 exists in the region 803, the process advances to step S212.
[0063] Details of the procedures in steps S208, S209, and S210 will be explained below with reference to
[0064] The process shown in
[0065] In step S403, the CPU 201 determines whether the change in capacitance in step S401 is an increase. If it is determined that the change in capacitance is an increase, the process advances to step S404, and the CPU 201 determines that the finger 804 has moved closer to the touch panel 222, and terminates the process shown in
[0066] The process shown in
[0067] Referring to
[0068] The processing in step S212 will be explained with reference to
[0069] In step S503, the CPU 201 determines a transition destination screen in accordance with the specified item. For example, if the selection item 912 on the screen 910 is selected, the CPU 201 determines that the screen 920 is a transition destination screen. After that, the CPU terminates the process shown in
[0070] In step S214, the CPU 201 controls the devices 240 based on a combination of the selection items selected as the finger 804 approaches the touch panel 222. Since the items 904, 912, and 923 are selected in the examples shown in
[0071] In the above explanation, the devices 240 are controlled in step S214 based on the combination of the selection items having been selected. In this step, it is also possible to associate the selection item combination with user identification information, and store the result as the operation log information 212 in the storage unit 210. The user identification information in this case is information obtained when the ECU 230 identifies the user when he or she gets in the vehicle 100, based on a feature amount obtained from the camera 231 or the sensor 232. Alternatively, the ECU 230 transmits the feature amount obtained by the camera 231 or the sensor 232 to an external server (not shown), and the server identifies the user based on the feature amount and transmits the user identification information to the ECU 230.
[0072] A process of generating screen data to be displayed on the touch panel 222 based on a combination of selection items frequently used by the user will be explained below with reference to
[0073] In step S601, the CPU 201 identifies the user. The CPU 201 may also obtain the user identification information obtained by the ECU 230 as described above. In step S602, the CPU 201 obtains the operation log information 212 corresponding to the user identified in step S601 from the storage unit 210. Then, in step S603, the CPU 201 specifies a combination of selection items most frequently used by the user from the obtained operation log information 212. For example, the CPU 201 specifies a combination of selection items such as Audio, CD, Random playback as the operation log information 212 corresponding to user A.
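Steps S602 and S603 above can be sketched as a frequency count over the identified user's operation log. The log format assumed here (a mapping from user identifier to a list of selection-item combinations) is illustrative; the patent only states that combinations are stored as the operation log information 212.

```python
# Sketch of steps S602-S603: pick the combination of selection items
# the identified user has chosen most often.

from collections import Counter

def most_frequent_combination(log: dict, user_id: str):
    """Return the user's most frequent selection-item combination, or None."""
    combos = Counter(tuple(c) for c in log.get(user_id, []))
    if not combos:
        return None
    return combos.most_common(1)[0][0]
```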
[0074] In step S604, the CPU 201 performs optimization so that icons of the selection items specified in step S603 are aligned in the Z-axis direction, that is, so that the motion of the finger 804 in the XY-axis direction decreases when the finger 804 approaches the touch panel 222. This processing in step S604 will be explained below with reference to
[0075] Referring to
[0076] Assume that the selection items 1311, 1312, and 1313 respectively correspond to Audio, CD, and Random playback.
[0077] As shown in
[0078] The process of aligning the selection items 1311 to 1313 as closely as possible in the Z-axis direction will be explained. Let w1, w2, and w3 be the icon widths of the selection items 1311, 1312, and 1313, respectively. Aligning the selection items 1311 to 1313 in the Z-axis direction amounts to adjusting the positions of the individual icons so as to maximize the width w4 over which the icon widths w1, w2, and w3 overlap one another. In
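The overlap width w4 described above is the intersection of the icons' horizontal extents, and centring all icons on a common X position maximizes it (w4 then equals the narrowest icon width). The following sketch uses illustrative coordinates and widths; only the overlap criterion itself comes from the description.

```python
# Sketch of the alignment criterion in step S604.

def overlap_width(icons: list[tuple[float, float]]) -> float:
    """icons: list of (left_x, width). Return the shared overlap width w4."""
    left = max(x for x, _ in icons)
    right = min(x + w for x, w in icons)
    return max(right - left, 0.0)

def centre_align(widths: list[float], axis_x: float = 0.0) -> list[tuple[float, float]]:
    """Place every icon so that its centre sits on the common axis axis_x."""
    return [(axis_x - w / 2.0, w) for w in widths]
```

With centred icons of widths 40, 60, and 50, w4 equals 40, the width of the narrowest icon; this is the configuration that minimizes the finger's XY-plane motion during the approach.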
[0079] Referring to
[0080] In the case of
[0081] This processing in step S607 will be explained. In
[0082] Subsequently, the CPU 201 adjusts the icon widths w1 to w3 of the selection items 1311 to 1313 so as to maximize the width w4. As shown in
[0083]
[0084] In the abovementioned explanation, selection items having low use frequencies are specified on the screens 1301 and 1302 by performing the processing in step S607 once. However, this operation may also be performed by performing the processing in step S607 a plurality of times. That is, selection items having low use frequencies are first specified on the screen 1302 in step S607, and the process returns to step S604 after that. After step S605, selection items having low use frequencies are specified again on the screen 1301 in step S607.
[0085] If it is determined in step S605 that the number of selection items on each screen satisfies the predetermined condition, the process advances to step S606, and the CPU 201 generates screen data based on the optimized selection item layout. For example, the CPU 201 generates screen data indicating the screens 1301 to 1303 based on the layout of the selection items 1311 to 1313 shown in
[0086] After the screen data is generated in step S606, the touch panel 222 displays a screen such as the screen 1200 shown in
[0087] Next, a process of changing the ranges of the regions 801 to 803 shown in
[0088] In step S701, the CPU 201 obtains user attribute information. For example, the CPU 201 obtains the user attribute information when obtaining the user identification information from the ECU 230. Examples of the user attribute information are the nationality, the height, the weight, and the degree of humidity of the finger. Information such as the nationality, the height, and the weight can be stored in the external server 250 by associating the information with the user identification information. In this case, when the server identifies the user by receiving information obtained by the camera 231 or the sensor 232 from the ECU 230, the server can transmit the user attribute information such as the nationality, the height, and the weight to the ECU 230 in addition to the user identification information. The degree of humidity of the finger may also be obtained based on, for example, information detected by the ECU 230 from a humidity sensor attached to the steering wheel.
[0089] The capacitance is strongly affected by the degree of humidity and the size of the finger 804. Therefore, the user attribute information is not limited to the abovementioned items as long as these two factors can be derived from it.
[0090] In step S702, the CPU 201 adjusts the range of each of the regions 801 to 803 in the Z-axis direction based on the user attribute information obtained in step S701. For example, as the degree of humidity of the finger 804 increases, the CPU 201 decreases the range of each of the regions 801 to 803 in the Z-axis direction, because it can be expected that the detection performance improves. Likewise, as the size of the finger 804 increases, the CPU 201 decreases the range of each of the regions 801 to 803 in the Z-axis direction, because it can be expected that the detection performance improves. It is also possible to combine the two pieces of information. The size of the finger 804 can also be estimated from, for example, the height and the weight, or the nationality.
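The adjustment of step S702 can be sketched as scaling down each region's Z-axis depth as finger humidity or finger size increases, since both strengthen the capacitive coupling and improve detection. The scaling weights and the normalized inputs below are assumptions chosen for illustration, not values from the patent.

```python
# Sketch of step S702: shrink a region's Z-axis range for wetter or larger
# fingers. Weights (0.3, 0.2) and the 0.5 floor are assumed parameters.

def adjust_region_depth(base_depth_mm: float, humidity: float,
                        finger_size: float) -> float:
    """Scale a region's Z-axis depth; humidity and finger_size are in [0, 1]."""
    scale = 1.0 - 0.3 * humidity - 0.2 * finger_size
    return base_depth_mm * max(scale, 0.5)   # never shrink below half depth
```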
[0091] In the explanation of this embodiment, the position of the finger 804 in the Z-axis direction is detected based on the capacitance. However, the position of the finger 804 in the Z-axis direction may also be detected by another detection method. For example, it is also possible to form a detection plate including a plurality of electrode patterns for detecting the capacitance, such that the detection plate is perpendicular to the display surface of the touch panel 222 and positioned beside the touch panel 222. That is, the position of the finger 804 in the Z-axis direction is detected not by the electrodes of the touch panel 222 but by the electrodes of the detection plate formed beside the touch panel 222. If a detection plate like this is formed so as to detect both the position of the finger 804 in the Z-axis direction and the position of the finger 804 on the XY plane in accordance with the change in capacitance, a display unit that is not a capacitive touch panel may be used instead of the touch panel 222. Furthermore, the abovementioned detection plate can also be configured to detect the position of the finger 804 in the air by using an infrared sensor or the like instead of detecting the capacitance.
Summary of Embodiment
[0092] A display apparatus of the abovementioned embodiment comprises a display unit (the touch panel 222) configured to display screen data containing a plurality of selectable items, and a display control unit configured to change the screen data in accordance with selection of an item contained in the screen data, when a touch operation is performed on a display surface of the display unit (
[0093] In a configuration like this, when the position in the space, which corresponds to an item to be selected, is passed, the screen data can be changed by determining that the item is selected. This can further simplify the user operation.
[0094] The display unit is a capacitive touch panel (the touch panel 222,
[0095] The display apparatus further comprises a detection unit configured to detect a capacitance between the detection unit and the object (step S302), and a determination unit configured to determine whether the capacitance detected by the detection unit falls within a predetermined range (step S401), wherein after the determination unit determines that the capacitance falls within the predetermined range, if the determination result from the determination unit changes and indicates that the capacitance falls outside the predetermined range, the display control unit determines that the item is selected and changes the screen data (step S402). A configuration like this can change the screen data based on the change in capacitance.
[0096] The object is a finger, and the display apparatus further comprises a changing unit configured to change the predetermined range based on information about the finger (
[0097] The display apparatus further comprises an obtaining unit configured to obtain the information about the finger (step S701). The display apparatus is mounted in a vehicle, and the obtaining unit obtains the information about the finger based on information about a passenger of the vehicle. A configuration like this can obtain, for example, the size of the finger based on the information about the passenger of the vehicle.
[0098] The number of items contained in screen data as a transition target of the display control unit is not more than the number of items contained in screen data as a transition destination (step S605). A configuration like this decreases the number of items as the finger moves away from the display surface of the touch panel. This can compensate for a decrease in detection accuracy.
[0099] The display control unit changes the screen data between a plurality of predetermined screen data. The display apparatus further comprises a determination unit configured to determine a layout of items contained in each of the plurality of screen data (
[0100] A configuration like this can lay out, for example, selection items having high use frequencies so as to decrease displacement on the XY plane.
[0101] The invention is not limited to the abovementioned embodiment and can variously be modified and changed without departing from the spirit and scope of the invention.