APPARATUS AND METHOD FOR CONTROLLING VEHICLE INTERIOR LIGHTING
20250324501 · 2025-10-16
Assignee
Inventors
CPC classification
International classification
Abstract
Disclosed are an apparatus and a method for controlling vehicle interior lighting. The apparatus for controlling vehicle interior lighting according to an aspect of the present disclosure includes: a first communication module configured to communicate with a display device installed in a vehicle interior; a lighting module including a plurality of light sources; and a processor connected to the first communication module and the lighting module, wherein the processor is configured to: receive first image information, which is information on an image being output on the display device, through the first communication module; generate first color information based on the first image information; and control the lighting module based on the first color information.
Claims
1. An apparatus for controlling vehicle interior lighting, comprising: a first communication module configured to communicate with a display device disposed at an interior of a vehicle; a lighting module comprising a plurality of light sources; and a processor connected to the first communication module and the lighting module and configured to: receive, via the first communication module, first image information including information on an image displayed on the display device; generate first color information based on the first image information; and control the lighting module based on the first color information.
2. The apparatus of claim 1, wherein the lighting module is configured to output ambient light.
3. The apparatus of claim 1, wherein, to generate the first color information, the processor is further configured to: sample the first image information to extract a first frame; divide the first frame into a plurality of areas; determine a representative color for each of the plurality of areas; and generate the first color information by including the representative color for each of the plurality of areas.
4. The apparatus of claim 3, wherein, to determine the representative color for each of the plurality of areas, the processor is further configured to: detect a color value at each of a number of randomly selected points in each of the plurality of areas; calculate a median value of the color values detected at the randomly selected points; and use a color corresponding to the median value as the representative color for each of the plurality of areas.
5. The apparatus of claim 3, wherein the processor is configured to: identify an area corresponding to each of the plurality of light sources based on a location of the display device and a location of each of the plurality of light sources; and control the lighting module such that each of the plurality of light sources outputs light having the representative color of the area corresponding to each of the plurality of light sources.
6. The apparatus of claim 1, wherein the processor is further configured to: identify, from the plurality of light sources, at least one light source corresponding to a location of a user; and control the at least one identified light source based on the first color information.
7. The apparatus of claim 1, further comprising a second communication module configured to communicate with a mobile device located at the interior of the vehicle, wherein the processor is further configured to: receive, via the second communication module, second image information including information on an image displayed on the mobile device; generate second color information based on the second image information; and control the lighting module based on the second color information.
8. The apparatus of claim 7, wherein, to generate the second color information, the processor is configured to: sample the second image information to extract a second frame; divide the second frame into a plurality of areas; determine a representative color for each of the plurality of areas; and generate the second color information by including the representative color for each of the plurality of areas.
9. The apparatus of claim 8, wherein the processor is further configured to: identify an area corresponding to each of the plurality of light sources based on a location of the mobile device and a location of each of the plurality of light sources; and control the lighting module such that each of the plurality of light sources outputs light having the representative color of the area corresponding to each of the plurality of light sources.
10. A method for controlling vehicle interior lighting, comprising: receiving first image information on an image displayed on a display device disposed at an interior of a vehicle; generating first color information based on the first image information; and controlling, based on the first color information, a lighting module comprising a plurality of light sources.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0018]
[0019]
[0020]
[0021]
[0022]
[0023]
[0024]
[0025]
[0026]
DETAILED DESCRIPTION
[0027] The components described in the example embodiments may be implemented by hardware components including, for example, at least one digital signal processor (DSP), a processor, a controller, an application-specific integrated circuit (ASIC), a programmable logic element, such as an FPGA, other electronic devices, or combinations thereof. At least some of the functions or the processes described in the example embodiments may be implemented by software, and the software may be recorded on a recording medium. The components, the functions, and the processes described in the example embodiments may be implemented by a combination of hardware and software.
[0028] The method according to example embodiments may be embodied as a program executable by a computer, and may be recorded on various recording media such as a magnetic storage medium, an optical reading medium, and a digital storage medium.
[0029] Various techniques described herein may be implemented as digital electronic circuitry, or as computer hardware, firmware, software, or combinations thereof. The techniques may be implemented as a computer program product, i.e., a computer program tangibly embodied in an information carrier, e.g., in a machine-readable storage device (for example, a computer-readable medium) or in a propagated signal, for processing by, or to control an operation of, a data processing apparatus, e.g., a programmable processor, a computer, or multiple computers. A computer program may be written in any form of programming language, including compiled or interpreted languages, and may be deployed in any form, including as a stand-alone program or as a module, a component, a subroutine, or another unit suitable for use in a computing environment. A computer program may be deployed to be executed on one computer, or on multiple computers at one site or distributed across multiple sites and interconnected by a communication network.
[0030] Processors suitable for execution of a computer program include, by way of example, both general and special purpose microprocessors, and any one or more processors of any kind of digital computer. Generally, a processor will receive instructions and data from a read-only memory, a random access memory, or both. Elements of a computer may include at least one processor to execute instructions and one or more memory devices to store instructions and data. Generally, a computer will also include, or be coupled to receive data from, transfer data to, or both, one or more mass storage devices for storing data, e.g., magnetic, magneto-optical, or optical disks. Examples of information carriers suitable for embodying computer program instructions and data include semiconductor memory devices; magnetic media such as a hard disk, a floppy disk, and a magnetic tape; optical media such as a compact disk read-only memory (CD-ROM) and a digital video disk (DVD); magneto-optical media such as a floptical disk; a read-only memory (ROM); a random access memory (RAM); a flash memory; an erasable programmable ROM (EPROM); an electrically erasable programmable ROM (EEPROM); and any other known computer-readable medium. A processor and a memory may be supplemented by, or integrated into, a special purpose logic circuit.
[0031] The processor may run an operating system (OS) and one or more software applications that run on the OS. The processor device also may access, store, manipulate, process, and create data in response to execution of the software. For purposes of simplicity, a processor device is described in the singular; however, it will be appreciated by one skilled in the art that a processor device may include multiple processing elements and/or multiple types of processing elements. For example, a processor device may include multiple processors or a processor and a controller. In addition, different processing configurations are possible, such as parallel processors.
[0032] Also, non-transitory computer-readable media may be any available media that may be accessed by a computer, and may include both computer storage media and transmission media.
[0033] The present specification includes details of a number of specific implementations, but it should be understood that the details do not limit any invention or what is claimable in the specification, but rather describe features of specific example embodiments. Features described in the specification in the context of individual example embodiments may be implemented in combination in a single example embodiment. Conversely, various features described in the context of a single example embodiment may be implemented in multiple example embodiments individually or in any suitable sub-combination. Furthermore, although features may be described as operating in a specific combination and may even be initially claimed as such, one or more features may in some cases be excised from the claimed combination, and the claimed combination may be directed to a sub-combination or a variation of a sub-combination.
[0034] Similarly, although operations are depicted in the drawings in a specific order, this should not be understood as requiring that the operations be performed in the specific order shown or in sequential order, or that all illustrated operations be performed, to obtain desired results. In certain cases, multitasking and parallel processing may be advantageous. In addition, the separation of various apparatus components in the above-described example embodiments should not be understood as being required in all example embodiments, and it should be understood that the described program components and apparatuses may be incorporated into a single software product or packaged into multiple software products.
[0035] It should be understood that the example embodiments disclosed herein are merely illustrative and are not intended to limit the scope of the invention. It will be apparent to one of ordinary skill in the art that various modifications of the example embodiments may be made without departing from the spirit and scope of the claims and their equivalents.
[0036] Hereinafter, with reference to the accompanying drawings, embodiments of the present disclosure will be described in detail so that a person skilled in the art can readily carry out the present disclosure. However, the present disclosure may be embodied in many different forms and is not limited to the embodiments described herein.
[0037] In the following description of the embodiments of the present disclosure, a detailed description of known functions and configurations incorporated herein will be omitted when it may make the subject matter of the present disclosure rather unclear. Parts not related to the description of the present disclosure in the drawings are omitted, and like parts are denoted by similar reference numerals.
[0038] In the present disclosure, components that are distinguished from each other are intended to clearly illustrate each feature. However, it does not necessarily mean that the components are separate. That is, a plurality of components may be integrated into one hardware or software unit, or a single component may be distributed into a plurality of hardware or software units. Thus, unless otherwise noted, such integrated or distributed embodiments are also included within the scope of the present disclosure.
[0039] In the present disclosure, components described in the various embodiments are not necessarily essential components, and some may be optional components. Accordingly, embodiments consisting of a subset of the components described in one embodiment are also included within the scope of the present disclosure. In addition, embodiments that include other components in addition to the components described in the various embodiments are also included in the scope of the present disclosure.
[0042] In the present disclosure, when a component is referred to as being linked, coupled, or connected to another component, this may include not only a direct connection relationship but also an indirect connection relationship through an intermediate component. In addition, when a component is described as comprising or having another component, it may mean the further inclusion of another component, not the exclusion thereof, unless explicitly described to the contrary.
[0043] In the present disclosure, the terms first, second, etc. are used only for the purpose of distinguishing one component from another, and do not limit the order or importance of components, etc., unless specifically stated otherwise. Thus, within the scope of this disclosure, a first component in one exemplary embodiment may be referred to as a second component in another embodiment, and similarly a second component in one exemplary embodiment may be referred to as a first component.
[0046]
[0047] Referring to
[0048] The first communication module 110 may perform communication with a display device disposed (or installed) in a vehicle. The first communication module 110 may perform communication with the display device by using various types of communication methods. The display device is a device that includes a display for outputting an image, and may include a navigation system. The first communication module 110 may perform communication with the display device, for example, by using wired communication. However, the communication method of the first communication module 110 is not limited to the embodiments described above, and the first communication module 110 may perform communication with the display device by using a variety of known communication methods.
[0049] The second communication module 120 may perform communication with a mobile device located within the vehicle. The second communication module 120 may perform communication with the mobile device by using various types of communication methods. The mobile device is a device that includes a display for outputting an image, and may include a variety of wireless electronic devices, such as a smartphone, tablet, and laptop. The second communication module 120 may perform communication with the mobile device by using, for example, Wi-Fi Direct technology. However, the communication method of the second communication module 120 is not limited to the embodiments described above, and the second communication module 120 may perform communication with the mobile device by using a variety of known wireless communication methods.
[0050] The lighting module 130 may illuminate a vehicle interior with light. The lighting module 130 may include a plurality of light sources 131. Each of the light sources 131 may be disposed or installed at different locations in the vehicle. The light sources 131 may illuminate the interior of the vehicle with various colors of light. The colors of the light output by the plurality of light sources 131 may be different from each other. For example, the lighting module 130 may output ambient light.
[0051] The memory 140 may store at least one instruction executed by the processor 150. The memory 140 may be implemented as a volatile storage medium and/or a non-volatile storage medium, for example, as read-only memory (ROM) and/or random access memory (RAM). The memory 140 may store various information required during the operation of the processor 150. In addition, the memory 140 may store various information generated during the operation of the processor 150.
[0052] The processor 150 may be operatively connected to the first communication module 110, the second communication module 120, the lighting module 130, and the memory 140. The processor 150 may be implemented in a central processing unit (CPU) or a system on chip (SoC), may run an operating system or an application to control a plurality of hardware or software components connected to the processor 150, and may perform various data processing and computations. The processor 150 may be configured to execute at least one instruction stored in the memory 140 and to store the resulting data in the memory 140.
[0053] The processor 150 may receive first image information, which is information on an image being output on the display device, via (or through) the first communication module 110, generate first color information based on the first image information, and control the lighting module 130 based on the first color information. The present embodiment may change the output color of the light sources in the vehicle in response to the image being output on the display device, thereby enhancing a user's immersion in the image being output on the display device.
[0054] The processor 150 may receive second image information, which is information on an image being output on the mobile device, via (or through) the second communication module 120, generate second color information based on the second image information, and control the lighting module 130 based on the second color information. The present embodiment may change the output color of the light sources in the vehicle in response to the image being output on the mobile device, thereby enhancing a user's immersion in the image being output on the mobile device.
[0055]
[0056] Referring to
[0057] First, the processor 150 may receive first image information, which is information on an image being output on the display device, via or through the first communication module 110 (S201). The first image information may include color information, for each area, on the image being output on the display device. In various embodiments, the processor 150 may receive information on only some areas (e.g., edge areas) of the image being output on the display device.
[0058] Next, the processor 150 may sample the first image information to extract a first frame (S203). The first frame may refer to an image being output on the display device at a specific point in time. In step S203, the processor 150 may perform sampling at a predetermined set interval. The set interval may be changed depending on the interior illuminance of the vehicle. To this end, the apparatus for controlling vehicle interior lighting 100 may further include an illuminance sensor configured to detect the interior illuminance of the vehicle. Relationship information on the interval depending on the interior illuminance may be preset and stored in the memory 140, and the processor 150 may detect the interval corresponding to a current interior illuminance from the relationship information stored in the memory 140, and set the detected interval as a set interval. The interval at which the light sources 131 change colors may be changed depending on the set interval.
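The illuminance-dependent sampling interval described in step S203 can be sketched as a simple lookup over preset relationship information. The threshold values, interval lengths, and the name `set_interval_for_illuminance` are illustrative assumptions, not taken from the disclosure:

```python
# Hypothetical preset relationship information: (minimum illuminance in lux,
# sampling interval in seconds), ordered from brightest to darkest interior.
ILLUMINANCE_TO_INTERVAL = [
    (500.0, 0.5),   # bright interior: sample frequently
    (100.0, 1.0),
    (0.0, 2.0),     # dark interior: sample less often
]

def set_interval_for_illuminance(illuminance_lux):
    """Return the sampling interval corresponding to the current interior
    illuminance, using the preset relationship information above."""
    for min_lux, interval in ILLUMINANCE_TO_INTERVAL:
        if illuminance_lux >= min_lux:
            return interval
    return ILLUMINANCE_TO_INTERVAL[-1][1]
```

Because the light sources change colors once per sampling cycle, choosing the interval this way also sets how often the ambient lighting is updated.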
[0059] Next, the processor 150 may divide the first frame into a plurality of areas (S205). As illustrated in
[0060] Next, the processor 150 may determine a representative color for each of the plurality of areas (S207). In various embodiments, the processor 150 may determine representative colors for only some areas (e.g., edge areas) out of the plurality of areas. For example, if the first frame is divided as shown in
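Steps S205 and S207 can be sketched as dividing the frame into a rectangular grid and, optionally, keeping only the border areas. The 3x3 grid size and the function names are illustrative assumptions; the disclosure only requires that the size and number of areas reflect the number and locations of the light sources:

```python
def divide_frame(width, height, cols=3, rows=3):
    """Divide a frame of the given pixel size into a rows x cols grid of
    rectangular areas, keyed by (row, col) and valued as (x, y, w, h)."""
    areas = {}
    area_w, area_h = width // cols, height // rows
    for r in range(rows):
        for c in range(cols):
            areas[(r, c)] = (c * area_w, r * area_h, area_w, area_h)
    return areas

def edge_areas(areas, cols=3, rows=3):
    """Keep only the areas on the border of the grid, for embodiments that
    determine representative colors only for edge areas."""
    return {k: v for k, v in areas.items()
            if k[0] in (0, rows - 1) or k[1] in (0, cols - 1)}
```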
[0061] Next, the processor 150 may generate the first color information by including the representative color for each of the plurality of areas (S209). In step S209, the processor 150 may generate the first color information by integrating information on the representative colors determined in step S207.
[0062] Next, the processor 150 may control the lighting module 130 based on the first color information (S211). In step S211, the processor 150 may identify a representative color of the area corresponding to each of the plurality of light sources 131 by using the first color information, and may control the lighting module 130 such that each of the plurality of light sources 131 outputs light having the representative color of the area corresponding to each of the plurality of light sources 131. For example, as illustrated in
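The per-light-source control of step S211 amounts to looking up, for each light source, the representative color of its corresponding area. A minimal sketch, assuming a hypothetical precomputed mapping from light-source identifiers to area names and color information keyed the same way:

```python
def control_lighting(light_to_area, color_info):
    """Return the color command for each light source as {light_id: (r, g, b)},
    using the representative color of the area corresponding to that source.
    Sources whose area has no representative color are left uncontrolled."""
    return {light_id: color_info[area]
            for light_id, area in light_to_area.items()
            if area in color_info}
```

For example, with `color_info = {"left": (200, 30, 30), "right": (20, 20, 180)}`, a door light mapped to the "left" area would be driven with the reddish color extracted from the left portion of the frame.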
[0063] In this case, the processor 150 may determine a location of a user, identify at least one light source 131, out of the plurality of light sources 131, corresponding to the location of the user, and control only the at least one identified light source 131. The processor 150 may receive information on the location of the user from various sensors and systems provided in the vehicle. For example, the processor 150 may receive information on the location of the user from an airbag system. To this end, the apparatus for controlling vehicle interior lighting 100 may include a separate communication module (third communication module) for communicating with sensors and/or systems in the vehicle.
[0064] In the present embodiment, the light source corresponding to the location of the user may refer to a light source of which the output light (or color of light) is detected in the user's field of view. For example, as illustrated in
[0065]
[0066] Referring to
[0067] First, the processor 150 may randomly select a number (n) of points in an area for which a representative color is to be determined (hereinafter the target area) (S601). For example, assuming that the target area is the right area and n is 10, the processor 150 may select points as illustrated in
[0068] Next, the processor 150 may detect a color value at each of the number (n) of the randomly selected points (S603). In the present embodiment, a color value may refer to an RGB value. The processor 150 may calculate a difference between a color value detected from a previous frame and a color value detected from a current frame, and may change a set interval (sampling interval) based on the calculated difference. The processor 150 may decrease the set interval as the difference between the color value detected from the previous frame and the color value detected from the current frame becomes greater.
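The difference-driven interval adjustment described above — shrinking the set interval as the color change between consecutive frames grows — can be sketched as follows. The scale factor and the minimum-interval floor are illustrative assumptions:

```python
def adjust_interval(prev_colors, curr_colors, base_interval,
                    min_interval=0.2, scale=0.005):
    """Decrease the sampling interval as the mean absolute per-channel RGB
    difference between the previous and current frame samples grows.
    prev_colors and curr_colors are lists of (r, g, b) tuples taken at the
    same sampled points in the previous and current frames."""
    diffs = [abs(a - b)
             for prev, curr in zip(prev_colors, curr_colors)
             for a, b in zip(prev, curr)]
    mean_diff = sum(diffs) / len(diffs)
    return max(min_interval, base_interval - scale * mean_diff)
```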
[0069] Next, the processor 150 may calculate a median value for the color values detected in step S603 (S605), and determine a color corresponding to the calculated median value as a representative color for the target area (S607). In various embodiments, the processor 150 may utilize an average value, a maximum value, or a minimum value instead of a median value. The processor 150 may receive user input regarding which value to use among the median, average, maximum, and minimum values, and may determine which value to use based on the received user input.
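Steps S601 through S607 can be sketched as follows. The function name, the pixel-access callback, and the default of ten points are assumptions for illustration; per the paragraph above, the statistic is selectable among median, mean, maximum, and minimum:

```python
import random
import statistics

def representative_color(pixel_at, area, n=10, stat="median", seed=None):
    """Determine a representative color for `area` = (x, y, w, h) by sampling
    n randomly selected points and aggregating each RGB channel with the
    chosen statistic. `pixel_at(x, y)` returns an (r, g, b) tuple and stands
    in for real frame access."""
    rng = random.Random(seed)
    x0, y0, w, h = area
    samples = [pixel_at(rng.randrange(x0, x0 + w), rng.randrange(y0, y0 + h))
               for _ in range(n)]                       # S601, S603
    funcs = {"median": statistics.median, "mean": statistics.mean,
             "max": max, "min": min}
    agg = funcs[stat]
    # Aggregate per channel (S605) and form the representative color (S607).
    return tuple(int(agg(channel)) for channel in zip(*samples))
```

Using the median rather than the mean makes the representative color robust to a few outlier pixels, such as subtitles or bright highlights inside the area.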
[0070]
[0071] Referring to
[0072] First, the processor 150 may determine a location of the display device (S801). Information on the location of the display device may be pre-stored in the memory 140, and the processor 150 may determine the location of the display device from the memory 140.
[0073] Next, based on the location of the display device and the location of a light source to be controlled (hereinafter the target light source), the processor 150 may identify an area corresponding to the target light source among the plurality of areas constituting the first frame (S803). Information on the location of each of the light sources 131 may be pre-stored in the memory 140, and the processor 150 may determine the location of the target light source from the memory 140. In step S803, the processor 150 may identify an area closest to the target light source as the area corresponding to the target light source. Mapping information in which the relationship information on the corresponding area depending on the location of the light sources 131 is set for each location of the display device may be pre-stored in the memory 140, and the processor 150 may identify the area corresponding to the target light source by using the mapping information.
[0074] However, the method of identifying the area corresponding to the target light source is not limited to the embodiments described above, and the processor 150 may identify the area corresponding to the target light source by using various methods, by considering the locations of both the display device and the target light source. There may be one or more light sources 131 corresponding to the area.
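The closest-area identification of step S803 can be sketched as a nearest-neighbor lookup. The coordinate convention (area centers expressed in a common frame with the light-source location, derived from the display location) is an illustrative assumption standing in for the preset mapping information:

```python
def nearest_area(light_pos, area_centers):
    """Identify the frame area whose center is closest to the target light
    source. `light_pos` is an (x, y) tuple; `area_centers` maps area names
    to (x, y) centers. Squared Euclidean distance is sufficient for ranking."""
    return min(area_centers,
               key=lambda name: (area_centers[name][0] - light_pos[0]) ** 2
                              + (area_centers[name][1] - light_pos[1]) ** 2)
```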
[0075] Next, the processor 150 may detect a representative color for the area identified in step S803 from the first color information (S805). Information on representative colors for each area may have been recorded in the first color information, and the processor 150 may detect a representative color for the area corresponding to the target light source from the first color information.
[0076] Next, the processor 150 may control the target light source to output light having the representative color detected in step S805 (S807).
[0077]
[0078] Referring to
[0079] First, the processor 150 may receive second image information, which is information on an image being output on the mobile device, via or through the second communication module 120 (S901). The second image information may include color information for each area on the image being output on the mobile device. In various embodiments, the processor 150 may receive information on only some areas (e.g., edge areas) of the image being output on the mobile device.
[0080] Next, the processor 150 may sample the second image information to extract a second frame (S903). The second frame may refer to an image being output on the mobile device at a specific point in time. In step S903, the processor 150 may perform sampling at a predetermined set interval. The set interval may be changed depending on the interior illuminance of the vehicle.
[0081] Next, the processor 150 may divide the second frame into a plurality of areas (S905). The size and number of the plurality of areas may be set by considering the number of the light sources 131, the location of the light sources 131, and the like.
[0082] Next, the processor 150 may determine a representative color for each of the plurality of areas (S907). In various embodiments, the processor 150 may determine representative colors for only some areas (e.g., edge areas) out of the plurality of areas.
[0083] Next, the processor 150 may generate the second color information by including the representative color for each of the plurality of areas (S909). In step S909, the processor 150 may generate the second color information by integrating information on the representative colors determined in step S907.
[0084] Next, the processor 150 may control the lighting module 130 based on the second color information (S911). In step S911, the processor 150 may identify a representative color of the area corresponding to each of the plurality of light sources 131 by using the second color information, and may control the lighting module 130 such that each of the plurality of light sources 131 outputs light having the representative color of the area corresponding to each of the plurality of light sources 131.
[0085] In this case, the processor 150 may determine a location of a user, identify at least one light source 131, out of the plurality of light sources 131, corresponding to the location of the user, and control only the at least one identified light source 131. In the present embodiment, the light source corresponding to the location of the user may refer to a light source of which the output light is detected in the user's field of view.
[0086]
[0087] Referring to
[0088] First, the processor 150 may determine a location of the mobile device (S1001). The apparatus for controlling vehicle interior lighting 100 may include a capture module (e.g., a camera), and the processor 150 may determine the location of the mobile device based on an image captured by the capture module.
[0089] Next, based on the location of the mobile device and the location of a light source to be controlled (hereinafter the target light source), the processor 150 may identify an area corresponding to the target light source among the plurality of areas constituting the second frame (S1003). In step S1003, the processor 150 may identify an area closest to the target light source as the area corresponding to the target light source. Mapping information in which the relationship information on the corresponding area depending on the location of the light sources 131 is set for each location of the mobile device may be pre-stored in the memory 140, and the processor 150 may identify the area corresponding to the target light source by using the mapping information.
[0090] Next, the processor 150 may detect a representative color for the area identified in step S1003 from the second color information (S1005). Information on representative colors for each area may have been recorded in the second color information, and the processor 150 may detect a representative color for the area corresponding to the target light source from the second color information.
[0091] Next, the processor 150 may control the target light source to output light having the representative color detected in step S1005 (S1007).
[0092] As described above, the apparatus and the method for controlling vehicle interior lighting according to embodiments of the present disclosure may control vehicle interior lighting to correspond to an image output on the display device or mobile device, thereby enhancing the user's immersion in the image output on the display device or mobile device.