APPARATUS FOR CONTROLLING VEHICLE DISPLAY BASED ON APPROACH DIRECTION DETERMINATION USING PROXIMITY SENSOR
20230094520 · 2023-03-30
Inventors
- Tae Hun Kim (Seongnam-si, KR)
- Sung Hyun PARK (Hwaseong-si, KR)
- Jun Seong SEO (Yongin-si, KR)
- Seung Hwan Lee (Hwaseong-si, KR)
CPC classification
B60K35/00
PERFORMING OPERATIONS; TRANSPORTING
Abstract
An apparatus for determining an approach direction of a gesture of a vehicle occupant (a driver or a fellow passenger) by using a proximity sensor and controlling a vehicle display based on the approach direction determination. The apparatus includes an input unit to receive information on an approach direction in which an occupant’s hand in a vehicle moves to operate a vehicle function, a memory to store a program for controlling the vehicle display by using the information on the approach direction, and a processor to execute the program. The processor controls the vehicle display to perform different functions when an input is made to the same button based on information on a first approach direction in which a driver’s hand approaches from a driver seat side and information on a second approach direction in which a fellow passenger’s hand approaches from a passenger seat side.
Claims
1. An apparatus for controlling a vehicle display based on approach direction determination using a proximity sensor, the apparatus comprising: an input unit configured to receive information on an approach direction in which an occupant’s hand in a vehicle moves to operate a vehicle function; a memory configured to store a program for controlling the vehicle display by using the information on the approach direction; and a processor configured to execute the program, wherein the processor is configured to control the vehicle display to perform different functions when an input is made to the same button based on information on a first approach direction in which a driver’s hand approaches from a driver seat side and information on a second approach direction in which a fellow passenger’s hand approaches from a passenger seat side.
2. The apparatus of claim 1, wherein the processor is configured to display an operation guide, which is related to a function operable by a hard button, through the vehicle display based on the information on the approach direction made by determining whether a hand approaching the hard button is the driver’s hand or the fellow passenger’s hand.
3. The apparatus of claim 1, wherein the processor is configured to determine a menu, which is activated among menu buttons displayed in a preset region of the vehicle display by using the information on the approach direction, and to provide an operation guide related to the menu.
4. The apparatus of claim 1, wherein the processor is configured to recognize a connection state of the driver’s smartphone and the fellow passenger’s smartphone, and when a function related to cooperation of smartphones is selected, the processor is configured to control contents playing on the driver’s smartphone or the fellow passenger’s smartphone to be displayed through the vehicle display by using the information on the approach direction.
5. The apparatus of claim 1, wherein, when a smartphone, which operates in conjunction with the vehicle, is the driver’s smartphone and the fellow passenger’s hand intends to approach to perform a function of the smartphone that operates in conjunction with the vehicle, the processor is configured to control the corresponding function not to operate.
6. The apparatus of claim 5, wherein when the fellow passenger’s hand approaches, the processor is configured to control the vehicle display to separately display a menu that does not operate even when the menu is selected by the fellow passenger’s hand.
7. The apparatus of claim 1, wherein, when a point of interest (POI) is clicked during a navigation guide, the processor is configured to control the vehicle display to move to the POI, display information on the POI based on the information on the first approach direction, and display the information on the POI in a screen region configured to be separately split from a route guide screen based on the information on the second approach direction.
8. A method of controlling a vehicle display based on approach direction determination using a proximity sensor, the method comprising: (a) checking an approach direction in which an occupant’s hand in a vehicle moves to operate a vehicle function; and (b) controlling the vehicle display to perform different functions when an input is made to the same button based on information on a first approach direction in which a driver’s hand approaches from a driver seat side and information on a second approach direction in which a fellow passenger’s hand approaches from a passenger seat side.
9. The method of claim 8, wherein (a) includes checking the approach direction on whether a hand approaching a hard button is the driver’s hand or the fellow passenger’s hand, and (b) includes displaying an operation guide, which is related to a function operable by the hard button, through the vehicle display based on the approach direction.
10. The method of claim 8, wherein (b) includes determining a menu, which is activated among menu buttons displayed in a preset region of the vehicle display based on the approach direction, and providing an operation guide related to the menu.
11. The method of claim 8, wherein (b) includes recognizing a connection state of the driver’s smartphone and the fellow passenger’s smartphone, and, when a function related to cooperation of smartphones is selected, controlling contents playing on the driver’s smartphone or the fellow passenger’s smartphone to be displayed through the vehicle display based on the approach direction.
12. The method of claim 8, wherein, when a smartphone, which operates in conjunction with the vehicle, is the driver’s smartphone and the fellow passenger’s hand intends to approach to perform a function of the smartphone that operates in conjunction with the vehicle, (b) includes controlling the corresponding function not to operate.
13. The method of claim 12, wherein, when the fellow passenger’s hand approaches, (b) includes controlling the vehicle display to separately display a menu that does not operate even when the menu is selected by the fellow passenger’s hand.
14. The method of claim 8, wherein, when a point of interest (POI) is clicked during a navigation guide, (b) includes controlling the vehicle display to move to the POI, displaying information on the POI based on the information on the first approach direction, and displaying the information on the POI in a screen region configured to be separately split from a route guide screen based on the information on the second approach direction.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
DETAILED DESCRIPTION
[0037] The above-mentioned object, other objects, advantages, and features of the present disclosure and methods of achieving the objects, advantages, and features will be clear with reference to embodiments described in detail below together with the accompanying drawings.
[0038] However, the present disclosure is not limited to the embodiments disclosed herein and may be implemented in various forms. The following embodiments are merely provided so that those skilled in the technical field to which the present disclosure pertains may easily understand the object, configuration, and effect of the present disclosure, and the scope of the present disclosure is defined by the appended claims.
[0039] Meanwhile, the terms used in the present specification are for explaining the embodiments, not for limiting the present disclosure. Unless particularly stated otherwise in the present specification, a singular form also includes a plural form. The terms “comprise (include)” and/or “comprising (including)” used in the specification are intended to specify the presence of the mentioned constituent elements, steps, operations, and/or elements, but do not exclude the presence or addition of one or more other constituent elements, steps, operations, and/or elements.
[0041] The system for controlling a vehicle display based on approach direction determination using a proximity sensor according to the embodiment of the present disclosure includes: an approach direction detection unit 100 configured to detect an approach direction of a vehicle occupant’s hand by using a proximity sensor in a vehicle; and an AVN screen control unit 200 configured to perform AVN screen control based on the approach direction.
[0042] According to the embodiment of the present disclosure, it is possible to determine the approach direction of the occupant’s hand in the vehicle by using the proximity sensor. As another example, it is also possible to determine the approach direction of the occupant’s hand by capturing an image or by using an ultrasonic sensor, a non-contact haptic display, or the like.
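The approach direction determination described above can be illustrated with a minimal sketch. It assumes (purely for illustration; the function name, threshold, and two-channel model are hypothetical, not part of the disclosure) that the proximity sensor reports two intensity channels sampled over time, one toward the driver seat side and one toward the passenger seat side, and that the side whose signal crosses a detection threshold first indicates the approach direction:

```python
def classify_approach(left_readings, right_readings, threshold=0.5):
    """Classify the approach direction of an occupant's hand.

    Hypothetical sketch: `left_readings` is the driver-seat-side
    channel and `right_readings` the passenger-seat-side channel
    (left-hand-drive vehicle assumed). The side whose signal crosses
    `threshold` first is taken as the approach direction; ties fall
    back to comparing peak intensity. Returns "driver", "passenger",
    or None when no hand is detected.
    """
    def first_crossing(samples):
        for i, value in enumerate(samples):
            if value >= threshold:
                return i
        return None

    left_t = first_crossing(left_readings)
    right_t = first_crossing(right_readings)

    if left_t is None and right_t is None:
        return None  # no hand detected on either side
    if right_t is None or (left_t is not None and left_t < right_t):
        return "driver"
    if left_t is None or right_t < left_t:
        return "passenger"
    # Simultaneous crossing: compare peak signal strength instead.
    return "driver" if max(left_readings) >= max(right_readings) else "passenger"
```

A camera- or ultrasonic-based variant mentioned in the paragraph above would replace only this classification step; the downstream display control would be unchanged.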
[0043] The AVN screen control unit 200 includes: a button operation control unit 210 configured to differently control an operation of a hard button based on an approach direction; an operation guide providing unit 220 configured to change and provide information display in accordance with an approach for each detailed position and provide an operation guide in accordance with a keypad button operation during an approach to a keypad; a phone connection control unit 230 configured to recognize connection states of a smartphone owned by the driver and a smartphone owned by a fellow passenger and control connection between an AVN and the smartphone in accordance with an approach of the driver’s hand or an approach of the fellow passenger’s hand; a privacy mode control unit 240 configured to control an operation of a function of the smartphone connected to the AVN or an operation of a function of the AVN in accordance with the approach of the driver’s hand or the approach of the fellow passenger’s hand; and a navigation function control unit 250 configured to provide different screens when a point of interest (POI) in a map is clicked during a navigation guide in accordance with the approach of the driver’s hand or the approach of the fellow passenger’s hand.
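The sub-controller structure above can be sketched as a simple dispatcher, keyed on the input event type, that routes each event together with the detected approach direction to one of the units 210 to 250. This is a hypothetical structural sketch; the event type names and handler bodies are placeholders, not the disclosed behavior:

```python
class AVNScreenControlUnit:
    """Sketch of AVN screen control unit 200 dispatching input events
    to its sub-controllers (reference numerals follow the text)."""

    def __init__(self):
        # Event type -> sub-controller, mirroring units 210-250.
        self._handlers = {
            "hard_button": self._button_operation,       # 210
            "keypad": self._operation_guide,             # 220
            "phone_connection": self._phone_connection,  # 230
            "privacy": self._privacy_mode,               # 240
            "poi_click": self._navigation_function,      # 250
        }

    def handle(self, event_type, approach_direction):
        handler = self._handlers.get(event_type)
        if handler is None:
            return "unhandled"
        return handler(approach_direction)

    # Placeholder handler bodies; each real unit would drive the AVN screen.
    def _button_operation(self, direction):
        return f"button operation for {direction}"

    def _operation_guide(self, direction):
        return f"operation guide for {direction}"

    def _phone_connection(self, direction):
        return f"phone connection control for {direction}"

    def _privacy_mode(self, direction):
        return f"privacy mode control for {direction}"

    def _navigation_function(self, direction):
        return f"navigation function for {direction}"
```

Each handler receives the approach direction, which is the common input that lets every sub-controller behave differently for the driver and the fellow passenger.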
[0045] During the approach to each keypad, each keypad button is activated, and an operation guide is provided when a corresponding keypad button is pushed for a short time (■) or pushed for a long time (▬).
[0046] Referring to
[0047] For example, a screen related to radio function control is displayed at a lower end of the AVN screen when the driver’s hand approaches from the driver’s side, and a screen related to navigation control is displayed at the lower end of the AVN screen when the fellow passenger’s hand approaches from the fellow passenger’s side.
[0048] As the driver’s hand approaches, the operation guide is provided such that the frequency is adjusted to FM 89.1 MHz when the hard button is pushed for a short time (■), and the frequency is adjusted to FM 94.5 MHz when the hard button is pushed for a long time (▬).
[0049] As the fellow passenger’s hand approaches, the operation guide is provided such that a destination is set to Home when the hard button is pushed for a short time (■), and a navigation menu is displayed when the hard button is pushed for a long time (▬).
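The two examples above reduce to a lookup keyed on the pair (approach direction, press duration): the same physical hard button resolves to four different actions. The concrete actions below are the illustrative ones from the text, not a fixed specification:

```python
# (approach direction, press duration) -> action, per the radio and
# navigation examples above; entries are illustrative, not exhaustive.
GUIDE_MAP = {
    ("driver", "short"): "tune FM 89.1 MHz",
    ("driver", "long"): "tune FM 94.5 MHz",
    ("passenger", "short"): "set destination: Home",
    ("passenger", "long"): "open navigation menu",
}

def hard_button_action(direction, press):
    """Resolve the action of the shared hard button for a given
    approach direction and press duration."""
    return GUIDE_MAP.get((direction, press), "no action")
```

The operation guide shown at the lower end of the AVN screen would simply display the two entries for the detected direction before the button is pressed.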
[0051] According to the embodiment of the present disclosure, the detailed position of the approach is determined, it is determined whether the approaching hand is the driver’s or the fellow passenger’s, and the information display is changed and provided in accordance with the approach.
[0052] Referring to
[0053] Referring to
[0054] Referring to
[0056] As illustrated in
[0058] According to the embodiment of the present disclosure, the connection states of the driver’s smartphone and the fellow passenger’s smartphone are recognized, a proximity sensor is used to recognize that the driver or fellow passenger approaches and pushes a button, and a smartphone cooperation function is provided to conform to the user’s needs.
[0059] Referring to
[0060] In a case in which only the driver is seated in the vehicle, the vehicle recognizes, as a registered phone, a smartphone owned by the driver.
[0061] In addition, when a smartphone other than the driver’s smartphone connected in advance is additionally registered, the audio system is connected, and a fellow passenger seated in the vehicle is recognized, the vehicle recognizes the additionally registered smartphone as the fellow passenger’s smartphone.
[0062] A button corresponding to menu 6 on the AVN screen is a media button, and when the driver’s hand approaches from the driver’s side and touches a Bluetooth audio button, a sound source of the driver’s smartphone is played through the driver’s Bluetooth audio system, and the AVN screen is changed to a driver’s background screen.
[0063] In this case, when a request to change the Bluetooth audio system is inputted, a sound source of the fellow passenger’s smartphone is played through the fellow passenger’s Bluetooth audio system, and the AVN screen is changed to a fellow passenger’s background screen.
[0064] In addition, when the fellow passenger’s hand approaches from the fellow passenger’s side and touches the Bluetooth audio button, a sound source of the fellow passenger’s smartphone is played through the fellow passenger’s Bluetooth audio system, and the AVN screen is changed to the fellow passenger’s background screen.
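The Bluetooth audio behavior above can be sketched as a source-selection rule: the touched button is the same, but the approach direction decides whose smartphone plays and whose background screen is shown. The parameter names below are illustrative, and the fallback to the driver's phone when no passenger phone is registered is an assumption of this sketch:

```python
def select_bluetooth_source(direction, driver_phone, passenger_phone=None):
    """Pick the Bluetooth audio source when the Bluetooth audio button
    is touched. `driver_phone` is the pre-connected (registered) phone;
    `passenger_phone` is the additionally registered phone, if any.
    Returns the selected source and the background screen to show.
    """
    if direction == "passenger" and passenger_phone is not None:
        # Passenger's hand approached: play the passenger's sound source.
        return {"source": passenger_phone, "screen": "passenger background"}
    # Driver's hand approached, or no passenger phone is registered.
    return {"source": driver_phone, "screen": "driver background"}
```

The change request described in paragraph [0063] would correspond to calling the same selector with the opposite direction after an explicit user input.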
[0066] Referring to
[0067] When the driver’s hand approaches from the driver’s side and applies a touch input to the phone button, a phone screen is displayed through the AVN screen.
[0068] When the fellow passenger’s hand approaches from the fellow passenger’s side and applies a touch input to the phone button, the phone screen is not displayed through the AVN screen (the phone function does not operate).
[0069] Referring to
[0070] For example, functions such as Phone (menu 5), Projection (menu 7), Setup (menu 8), Voice Memo (menu 9), Notification (menu 10), and Manual (menu 12) are displayed as being inoperative even when the fellow passenger’s hand pushes the corresponding buttons. Further, the text of such a function may be deactivated so that the function the fellow passenger cannot access is displayed as inactive, or the function may be eliminated from the screen altogether.
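The privacy-mode menu handling above can be sketched as a filter applied when the passenger's hand approaches: driver-only functions are rendered inactive, or hidden entirely. The `DRIVER_ONLY` set below mirrors the menus listed in the text, but treating exactly these menus as driver-only is an assumption of this sketch:

```python
# Menus the fellow passenger cannot operate (from the example above;
# assumed driver-only for this sketch).
DRIVER_ONLY = {"Phone", "Projection", "Setup", "Voice Memo", "Notification", "Manual"}

def render_menus(menus, direction, hide_restricted=False):
    """Return (menu name, state) pairs for the AVN screen.

    When the passenger's hand approaches, driver-only menus are either
    shown as "inactive" (greyed out, inoperative when selected) or,
    with hide_restricted=True, eliminated from the screen entirely.
    """
    rendered = []
    for name in menus:
        if direction == "passenger" and name in DRIVER_ONLY:
            if hide_restricted:
                continue                         # removed from the screen
            rendered.append((name, "inactive"))  # shown but inoperative
        else:
            rendered.append((name, "active"))
    return rendered
```

Either presentation satisfies the claim-7/claim-13 behavior of "separately displaying a menu that does not operate" for the fellow passenger; the choice between greying out and hiding is a UI design decision.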
[0072] When the driver’s hand approaches from a driver seat side and touches Wujeong Art Center as a point of interest (POI) on the AVN screen during the navigation guide, the AVN screen displays information on the POI while enlarging the corresponding POI. Next, when the movement to the current position is requested, the screen returns to a route information screen.
[0073] When the fellow passenger’s hand approaches from a passenger seat side and touches a POI on the AVN screen during the navigation guide, detailed information on the POI is displayed through a separate split screen, and the traveling route screen continues to be provided without moving, so as not to hinder the driver’s traveling along the route.
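The direction-dependent POI behavior in the two paragraphs above can be sketched as follows. The return keys are illustrative placeholders for the actual screen state:

```python
def handle_poi_click(direction, poi):
    """Resolve a POI touch on the AVN map screen during navigation.

    Driver touch: the map moves to and enlarges the POI and shows its
    information in place (returning to the route screen on request).
    Passenger touch: POI information opens in a separate split region
    while the route guidance view stays put, so the driver's guidance
    is not disturbed.
    """
    if direction == "driver":
        return {"map": f"enlarge {poi}", "info_panel": poi, "split_screen": False}
    return {"map": "route guidance", "info_panel": poi, "split_screen": True}
```

The asymmetry is the point: the same touch target yields a full-screen map change for the driver but only an auxiliary panel for the fellow passenger.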
[0075] The apparatus for controlling a vehicle display based on approach direction determination using a proximity sensor according to the embodiment of the present disclosure includes: an input unit 1001 configured to receive information on an approach direction in which an occupant’s hand in a vehicle moves to operate a vehicle function; a memory 1002 configured to store a program for controlling the vehicle display by using the information on the approach direction; and a processor 1003 configured to execute the program. The processor 1003 controls the vehicle display to perform different functions when an input is made to the same button based on information on a first approach direction in which a driver’s hand approaches from a driver seat side and information on a second approach direction in which a fellow passenger’s hand approaches from a passenger seat side.
[0076] The processor 1003 displays an operation guide, which is related to a function operable by a hard button, through the vehicle display based on the information on the approach direction made by determining whether a hand approaching the hard button is the driver’s hand or the fellow passenger’s hand.
[0077] The processor 1003 determines a menu, which is activated among menu buttons displayed in a preset region of the vehicle display by using the information on the approach direction, and provides an operation guide related to the activated menu.
[0078] The processor 1003 recognizes the connection state of the driver’s smartphone and the fellow passenger’s smartphone. When the function related to the cooperation of the smartphones is selected, the processor 1003 controls the contents playing on the driver’s smartphone or the fellow passenger’s smartphone to be displayed through the vehicle display by using the information on the approach direction.
[0079] The processor 1003 controls the corresponding function not to operate when the smartphone, which operates in conjunction with the vehicle, is the driver’s smartphone and the fellow passenger’s hand intends to approach to perform a function of the smartphone that operates in conjunction with the vehicle.
[0080] When the fellow passenger’s hand approaches, the processor 1003 controls the vehicle display to separately display a menu that does not operate even though the menu is selected by the fellow passenger’s hand.
[0081] When a POI is clicked during the navigation guide, the processor 1003 controls the vehicle display to move to the POI, display information related to the POI based on the information on the first approach direction, and display the information on the POI in a screen region configured to be separately split from a route guide screen based on the information on the second approach direction.
[0082] Meanwhile, a method of controlling a vehicle display based on approach direction determination using a proximity sensor according to the embodiment of the present disclosure may be implemented by a computer system or recorded in a recording medium. The computer system may include one or more processors, a memory, a user input device, a data communication bus, a user output device, and a storage. The above-mentioned constituent elements perform data communication through the data communication bus.
[0083] The method of controlling a vehicle display based on approach direction determination using a proximity sensor according to the embodiment of the present disclosure includes: step (a) of checking an approach direction in which an occupant’s hand in a vehicle moves to operate a vehicle function; and step (b) of controlling the vehicle display to perform different functions when an input is made to the same button based on information on a first approach direction in which a driver’s hand approaches from a driver seat side and the information on a second approach direction in which a fellow passenger’s hand approaches from a passenger seat side.
[0084] Step (a) checks the approach direction on whether a hand approaching a hard button is the driver’s hand or the fellow passenger’s hand. Step (b) displays an operation guide, which is related to a function operable by using the hard button, through the vehicle display based on the approach direction.
[0085] Step (b) determines a menu, which is activated among menu buttons displayed in the preset region of the vehicle display based on the approach direction, and provides an operation guide related to the activated menu.
[0086] Step (b) recognizes a connection state of the driver’s smartphone and the fellow passenger’s smartphone. When the function related to the cooperation of the smartphones is selected, step (b) controls the contents playing on the driver’s smartphone or the fellow passenger’s smartphone to be displayed through the vehicle display based on the approach direction.
[0087] When a smartphone, which operates in conjunction with the vehicle, is the driver’s smartphone and the fellow passenger’s hand intends to approach to perform a function of the smartphone that operates in conjunction with the vehicle, step (b) controls the corresponding function not to operate.
[0088] When the fellow passenger’s hand approaches, step (b) controls the vehicle display to separately display a menu that does not operate even though the menu is selected by the fellow passenger’s hand.
[0089] When a POI is clicked during a navigation guide, step (b) controls the vehicle display to move to the POI, display information related to the POI based on the information on the first approach direction, and display the information on the POI in a screen region configured to be separately split from a route guide screen based on the information on the second approach direction.
[0090] The computer system may further include a network interface coupled to a network. The processor may be a central processing unit (CPU) or a semiconductor device that processes commands stored in the memory and/or the storage.
[0091] The memory and the storage may include volatile or non-volatile storage media having various shapes. Examples of the memory may include a read only memory (ROM) and a random-access memory (RAM).
[0092] Therefore, the method of controlling a vehicle display based on approach direction determination using a proximity sensor according to the embodiment of the present disclosure may be implemented as a method that may be executed by a computer. When the method is performed by a computer, computer-readable instructions may perform the method of controlling a vehicle display according to the embodiment of the present disclosure.
[0093] Meanwhile, the method of controlling a vehicle display based on approach direction determination using a proximity sensor according to the embodiment of the present disclosure may be implemented as a computer-readable code in a computer-readable recording medium. Examples of the computer-readable recording medium include all kinds of recording media for storing data readable by a computer system. Specific examples thereof may include a read only memory (ROM), a random-access memory (RAM), a magnetic tape, a magnetic disc, a flash memory, an optical data storage device, and the like. In addition, the computer-readable recording medium may be distributed to computer systems connected by a computer communication network and stored and executed as a computer-readable code in a distributed manner.
[0094] The present invention has been described above with respect to embodiments thereof. Those skilled in the art should understand that various changes in form and details may be made herein without departing from the essential characteristics of the present invention. Therefore, the embodiments described herein should be considered from an illustrative aspect rather than from a restrictive aspect. The scope of the present invention should be defined not by the detailed description but by the appended claims, and all differences falling within a scope equivalent to the claims should be construed as being encompassed by the present invention.
[0095] The components described in the example embodiments may be implemented by hardware components including, for example, at least one digital signal processor (DSP), a processor, a controller, an application-specific integrated circuit (ASIC), a programmable logic element, such as an FPGA, other electronic devices, or combinations thereof. At least some of the functions or the processes described in the example embodiments may be implemented by software, and the software may be recorded on a recording medium. The components, the functions, and the processes described in the example embodiments may be implemented by a combination of hardware and software.
[0096] The method according to example embodiments may be embodied as a program that is executable by a computer, and may be implemented as various recording media such as a magnetic storage medium, an optical reading medium, and a digital storage medium.
[0097] Various techniques described herein may be implemented as digital electronic circuitry, or as computer hardware, firmware, software, or combinations thereof. The techniques may be implemented as a computer program product, i.e., a computer program tangibly embodied in an information carrier, e.g., in a machine-readable storage device (for example, a computer-readable medium) or in a propagated signal, for processing by, or to control the operation of, a data processing apparatus, e.g., a programmable processor, a computer, or multiple computers. A computer program may be written in any form of programming language, including compiled or interpreted languages, and may be deployed in any form, including as a stand-alone program or as a module, a component, a subroutine, or another unit suitable for use in a computing environment. A computer program may be deployed to be executed on one computer or on multiple computers at one site, or distributed across multiple sites and interconnected by a communication network.
[0098] Processors suitable for the execution of a computer program include, by way of example, both general and special purpose microprocessors, and any one or more processors of any kind of digital computer. Generally, a processor will receive instructions and data from a read-only memory, a random access memory, or both. Elements of a computer may include at least one processor to execute instructions and one or more memory devices to store instructions and data. Generally, a computer will also include, or be operatively coupled to receive data from or transfer data to, one or more mass storage devices for storing data, e.g., magnetic, magneto-optical, or optical disks. Examples of information carriers suitable for embodying computer program instructions and data include semiconductor memory devices; magnetic media such as a hard disk, a floppy disk, and a magnetic tape; optical media such as a compact disk read only memory (CD-ROM) and a digital video disk (DVD); magneto-optical media such as a floptical disk; a read only memory (ROM), a random access memory (RAM), a flash memory, an erasable programmable ROM (EPROM), and an electrically erasable programmable ROM (EEPROM); and any other known computer-readable medium. A processor and a memory may be supplemented by, or integrated into, a special purpose logic circuit.
[0099] The processor may run an operating system (OS) and one or more software applications that run on the OS. The processor device also may access, store, manipulate, process, and create data in response to execution of the software. For purposes of simplicity, a processor device is described in the singular; however, one skilled in the art will appreciate that a processor device may include multiple processing elements and/or multiple types of processing elements. For example, a processor device may include multiple processors, or a processor and a controller. In addition, different processing configurations are possible, such as parallel processors.
[0100] Also, non-transitory computer-readable media may be any available media that may be accessed by a computer, and may include both computer storage media and transmission media.
[0101] The present specification includes details of a number of specific implementations, but it should be understood that the details do not limit any invention or what is claimable in the specification but rather describe features of the specific example embodiment. Features described in the specification in the context of individual example embodiments may be implemented as a combination in a single example embodiment. In contrast, various features described in the specification in the context of a single example embodiment may be implemented in multiple example embodiments individually or in an appropriate sub-combination. Furthermore, although features may operate in a specific combination and may initially be described and claimed as such, one or more features may be excluded from the claimed combination in some cases, and the claimed combination may be changed into a sub-combination or a modification of a sub-combination.
[0102] Similarly, even though operations are described in a specific order in the drawings, this should not be understood as requiring that the operations be performed in that specific order or in sequence to obtain desired results, or that all of the operations be performed. In specific cases, multitasking and parallel processing may be advantageous. In addition, the separation of various apparatus components in the above-described example embodiments should not be understood as being required in all example embodiments, and it should be understood that the above-described program components and apparatuses may be incorporated into a single software product or packaged in multiple software products.
[0103] It should be understood that the example embodiments disclosed herein are merely illustrative and are not intended to limit the scope of the invention. It will be apparent to one of ordinary skill in the art that various modifications of the example embodiments may be made without departing from the spirit and scope of the claims and their equivalents.