LAMP CONTROL SYSTEM, LAMP CONTROL METHOD, AND VEHICLE
20250303956 · 2025-10-02
CPC classification
B60Q2300/335 (PERFORMING OPERATIONS; TRANSPORTING)
B60Q1/1423
B60Q2300/20
Abstract
A lamp control system includes a pair of lamps configured to emit light forward; a sensing module configured to sense an adjacent lane of a travelling vehicle; and a processor configured to: receive information on the sensed adjacent lane; and when the adjacent lane corresponds to a specific lane, control a light output of a lamp adjacent to the specific lane among the pair of lamps based on the received information on the adjacent lane. The specific lane may include at least one of a center line and an outermost lane.
Claims
1. A lamp control system comprising: a pair of lamps configured to emit light forward; a sensing module configured to sense an adjacent lane of a travelling vehicle; and a processor configured to: receive information on the sensed adjacent lane; and in response to the adjacent lane corresponding to a specific lane, control a light output of a lamp adjacent to the specific lane among the pair of lamps based on the received information on the adjacent lane, wherein the specific lane includes at least one of a center line and an outermost lane.
2. The lamp control system of claim 1, wherein the processor is configured to: in response to the specific lane corresponding to the center line, turn off a light output of a lamp adjacent to the center line or control the light output of the lamp adjacent to the center line to have a first light output; and in response to the specific lane corresponding to the outermost lane, control a light output of a lamp adjacent to the outermost lane to have a second light output.
3. The lamp control system of claim 2, wherein the first light output has a value smaller than that of the second light output.
4. The lamp control system of claim 1, wherein, in response to the specific lane corresponding to the outermost lane, the processor is configured to: analyze a driving tendency of a driver; and control a light output of a lamp adjacent to the outermost lane based on the driving tendency of the driver.
5. The lamp control system of claim 4, wherein the processor is configured to analyze the driving tendency of the driver based on at least one of: an acceleration tendency of the driver; a safe distance tendency of the driver; and a reaction speed tendency of the driver.
6. The lamp control system of claim 1, wherein: the sensing module is configured to detect at least one of static and dynamic objects around the vehicle, and in response to (1) the specific lane corresponding to the outermost lane and (2) information on at least one of static and dynamic objects outside the outermost lane being received, the processor is configured to restore a light output of a lamp adjacent to the outermost lane for a predetermined period of time.
7. The lamp control system of claim 6, wherein the processor is configured to: configure an operating condition based on information on a reaction speed tendency of a driver; configure a recovery point based on the operating condition; and in response to the operating condition being satisfied for the driver, restore the light output of the lamp adjacent to the outermost lane at the recovery point.
8. The lamp control system of claim 1, wherein: the sensing module is configured to detect a pupil movement of a driver, and in response to the specific lane corresponding to the outermost lane, the processor is configured to: receive information on the detected pupil movement of the driver; and restore a light output of a lamp adjacent to the outermost lane for a predetermined period of time based on the received information on the pupil movement.
9. The lamp control system of claim 8, wherein the processor is configured to: configure an operating condition based on information on a reaction speed tendency of a driver; and in response to the information on the pupil movement satisfying the operating condition for the driver, restore the light output of the lamp adjacent to the outermost lane.
10. The lamp control system of claim 9, wherein the information on the pupil movement includes at least one of: information on a time at which pupils of the driver are focused on the outermost lane; and information on a number of times the pupils of the driver are focused on the outermost lane.
11. The lamp control system of claim 1, wherein the information on the adjacent lane includes at least one of a number of lanes, a lane type, and a lane color.
12. The lamp control system of claim 1, wherein: the pair of lamps includes a pair of high beams; and the processor is configured to control a beam pattern of a high beam adjacent to the specific lane among the pair of high beams.
13. A method for controlling light outputs of a pair of lamps emitting light forward, the method comprising: sensing an adjacent lane of a travelling vehicle; receiving information on the sensed adjacent lane; and in response to the adjacent lane corresponding to a specific lane, controlling a light output of a lamp adjacent to the specific lane among the pair of lamps based on the received information on the adjacent lane, wherein the specific lane includes at least one of a center line and an outermost lane.
14. The method of claim 13, wherein controlling the light output of the lamp adjacent to the specific lane comprises: in response to the specific lane corresponding to the center line, turning off a light output of a lamp adjacent to the center line or controlling the light output of the lamp adjacent to the center line to have a first light output; and in response to the specific lane corresponding to the outermost lane, controlling a light output of a lamp adjacent to the outermost lane to have a second light output.
15. A vehicle comprising: a pair of lamps configured to emit light forward; a sensing module configured to sense an adjacent lane while travelling; and a processor configured to: receive information on the sensed adjacent lane; and in response to the adjacent lane corresponding to a specific lane, control a light output of a lamp adjacent to the specific lane among the pair of lamps based on the received information on the adjacent lane, wherein the specific lane includes at least one of a center line and an outermost lane.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0029] The accompanying drawings, which are included to provide a further understanding of the disclosure and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the disclosure and together with the description serve to explain the principle of the disclosure. In the drawings:
DETAILED DESCRIPTION
[0036] Hereinafter, preferred embodiments of the present disclosure will be described in detail, examples of which are shown in the attached drawings. The detailed description below with reference to the attached drawings is intended to explain the preferred embodiments of the present disclosure rather than to represent the only embodiments that can be implemented according to the present disclosure. The following detailed description includes specific details to provide a thorough understanding of the embodiments. However, it will be evident to those skilled in the art that the embodiments can be practiced without these specific details.
[0037] Most of the terms used herein are selected from terms commonly used in the relevant field. However, some terms are arbitrarily chosen by the applicant, and their meanings are detailed in the following description as needed. Therefore, the embodiments should be understood based on the intended meanings of the terms rather than on their mere names or dictionary meanings.
[0040] The vehicle according to the embodiments may be configured as shown in
[0041] An autonomous vehicle 1000 may be implemented based on an autonomous driving integrated controller 600, which transmits and receives data necessary for autonomous driving control through a driving information input interface 101, a travelling information input interface 201, an occupant output interface 301, and an autonomous vehicle control output interface 401. However, in this specification, the autonomous driving integrated controller 600 may also be referred to as a processor, processors, or simply as a controller.
[0042] The autonomous driving integrated controller 600 may acquire, through the driving information input interface 101, driving information based on the operations of an occupant on the user input unit 100 in the autonomous driving mode or manual driving mode of the autonomous vehicle. As shown in
[0043] For example, the driving mode (i.e., autonomous driving mode/manual driving mode or sports mode/eco mode/safe mode/normal mode) of the autonomous vehicle, which is determined by the operation of the driving mode switch 110 by the occupant, may be transmitted to the autonomous driving integrated controller 600 through the driving information input interface 101 as the above-described driving information.
[0044] Navigation information such as an occupant destination and a route to the destination (e.g., the shortest route or preferred route selected by the occupant among candidate routes to the destination), which is input by the occupant through the control panel 120, may be transmitted to the autonomous driving integrated controller 600 through the driving information input interface 101 as the above-described driving information.
[0045] The control panel 120 may be implemented as a touch screen panel providing a user interface (UI) for the occupant to input or modify information for autonomous driving control of the autonomous vehicle. In this case, the aforementioned driving mode switch 110 may be implemented as a touch button on the control panel 120.
[0046] In addition, the autonomous driving integrated controller 600 may obtain travelling information indicating the driving state of the autonomous vehicle through the travelling information input interface 201. The travelling information may include various information indicating the driving states and behaviors of the autonomous vehicle, such as a steering angle formed when the occupant manipulates a steering wheel, an accelerator pedal stroke or brake pedal stroke formed when the occupant depresses an accelerator pedal or brake pedal, and behaviors of the vehicle including a vehicle speed, acceleration, a yaw, a pitch, and a roll formed in the vehicle. As shown in
[0047] The travelling information on the autonomous vehicle may also include information on the location of the vehicle, which may be obtained through a global positioning system (GPS) receiver 260 applied to the autonomous vehicle. The travelling information may be transmitted to the autonomous driving integrated controller 600 through the travelling information input interface 201 and then used to control the driving of the autonomous vehicle in the autonomous driving mode or manual driving mode.
[0048] The autonomous driving integrated controller 600 may transmit information on the driving state of the autonomous vehicle in the autonomous driving mode or manual driving mode, which is intended for the occupant, to an output unit 300 through the occupant output interface 301. In other words, the autonomous driving integrated controller 600 may transmit the information on the driving state of the autonomous vehicle to the output unit 300, enabling the occupant to check the autonomous or manual driving state of the vehicle based on the driving state information displayed through the output unit 300. The driving state information may include various information indicating the driving state of the autonomous vehicle, such as the current driving mode, gear range, vehicle speed, and so on.
[0049] If the autonomous driving integrated controller 600 determines that a warning is necessary for the occupant in the autonomous driving mode or manual driving mode, the autonomous driving integrated controller 600 may transmit warning information along with the aforementioned driving state information to the output unit 300 through the occupant output interface 301 to enable the output unit 300 to issue the warning to the occupant. To output the driving state information and warning information both audibly and visually, the output unit 300 may include a speaker 310 and a display device 320 as shown in
[0050] The autonomous driving integrated controller 600 may transmit control information for driving control of the autonomous vehicle in the autonomous driving mode or manual driving mode to a lower control system 400 applied to the autonomous vehicle through the autonomous vehicle control output interface 401. As shown in
[0051] As described above, the autonomous driving integrated controller 600 in this embodiment may obtain driving information based on the operations of the occupant and travelling information indicating the driving state of the autonomous vehicle through the driving information input interface 101 and the travelling information input interface 201, respectively. The autonomous driving integrated controller 600 may transmit driving state information and warning information, generated according to the autonomous driving algorithm, to the output unit 300 through the occupant output interface 301. Additionally, the autonomous driving integrated controller 600 may transmit control information, generated according to the autonomous driving algorithm, to the lower control system 400 through the autonomous vehicle control output interface 401 to enable the driving control of the autonomous vehicle.
[0052] To ensure stable autonomous driving of the autonomous vehicle, it is necessary to continuously monitor the driving state of the autonomous vehicle by accurately measuring the driving environment of the autonomous vehicle and control driving based on the measured driving environment. To this end, as illustrated in
[0053] The sensing module 500, as shown in
[0054] The LiDAR sensor 510 may detect a surrounding object outside the autonomous vehicle by transmitting a laser signal around the autonomous vehicle and receiving the signal reflected back from the object. The LiDAR sensor 510 may detect the surrounding object located within the ranges of a preset distance, a preset vertical field of view, and a preset horizontal field of view, which are predefined depending on specifications thereof. The LiDAR sensor 510 may include a front LiDAR sensor 511, a top LiDAR sensor 512, and a rear LiDAR sensor 513 installed at the front, top, and rear of the autonomous vehicle, respectively, but the installation location of each LiDAR sensor and the number of LiDAR sensors are not limited to a specific embodiment. A threshold for determining the validity of a laser signal reflected and returning from an object may be previously stored in a memory (not illustrated) of the autonomous driving integrated controller 600. The autonomous driving integrated controller 600 may determine the position (including the distance to the object), speed, and direction of movement of the object by measuring the time taken for the laser signal transmitted by the LiDAR sensor 510 to be reflected back from the object.
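The time-of-flight ranging described in this paragraph can be illustrated with a minimal sketch (not part of the disclosed system): the one-way distance is half the total path covered during the measured round trip of the laser signal.

```python
# Time-of-flight range estimate for a LiDAR return (illustrative sketch,
# not part of the disclosed system).
SPEED_OF_LIGHT = 299_792_458.0  # meters per second

def lidar_distance(round_trip_time_s: float) -> float:
    """Distance to the object that reflected the laser signal.

    The signal travels to the object and back, so the one-way
    distance is half the total path length of the round trip.
    """
    return SPEED_OF_LIGHT * round_trip_time_s / 2.0
```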
[0055] The radar sensor 520 may detect a surrounding object outside the autonomous vehicle by emitting an electromagnetic wave around the autonomous vehicle and receiving the signal reflected back from the object. The radar sensor 520 may detect the surrounding object located within the ranges of a preset distance, a preset vertical field of view, and a preset horizontal field of view, which are predefined depending on specifications thereof. The radar sensor 520 may include a front radar sensor 521, a left radar sensor 522, a right radar sensor 523, and a rear radar sensor 524 installed at the front, left, right, and rear of the autonomous vehicle, respectively, but the installation location of each radar sensor and the number of radar sensors are not limited to a specific embodiment. The autonomous driving integrated controller 600 may determine the position (including the distance to the object), speed, and direction of movement of the object by analyzing the power of an electromagnetic wave transmitted and received by the radar sensor 520.
[0056] The camera sensor 530 may detect a surrounding object outside the autonomous vehicle by capturing an image of the area around the vehicle. The camera sensor 530 may detect the surrounding object within the ranges of a preset distance, a preset vertical field of view, and a preset horizontal field of view, which are predefined depending on specifications thereof.
[0057] The camera sensor 530 may include a front camera sensor 531, a left camera sensor 532, a right camera sensor 533, and a rear camera sensor 534 installed at the front, left, right, and rear of the autonomous vehicle, respectively, but the installation location of each camera sensor and the number of camera sensors are not limited to a specific embodiment. The autonomous driving integrated controller 600 may determine the position (including the distance to the object), speed, and direction of movement of the object by applying a predefined image processing algorithm to the image captured by the camera sensor 530.
[0058] In addition, an internal camera sensor 535 for capturing the inside of the autonomous vehicle may be mounted at a predetermined location (e.g., rearview mirror) inside the vehicle. The autonomous driving integrated controller 600 may monitor the behavior and state of the occupant based on an image captured by the internal camera sensor 535 and provide guidance or warnings to the occupant through the output unit 300.
[0059] In addition to the LiDAR sensor 510, radar sensor 520, and camera sensor 530, the sensing module 500 may also include an ultrasonic sensor 540 as shown in
[0060] For better understanding of this embodiment,
[0061] To assess the condition of the occupant in the autonomous vehicle, the sensing module 500 may also include a biometric sensor to detect biological signals of the occupant (e.g., heart rate, electrocardiogram, respiration, blood pressure, body temperature, brain waves, blood flow (pulse wave), and blood sugar). The biometric sensors may include a heart rate sensor, electrocardiogram sensor, respiration sensor, blood pressure sensor, body temperature sensor, electroencephalogram sensor, photoplethysmography sensor, and blood sugar sensor.
[0062] The sensing module 500 additionally includes a microphone 550. In particular, an internal microphone 551 and an external microphone 552 may be used for different purposes.
[0063] The internal microphone 551 may be used, for example, to analyze the voice of the occupant in the autonomous vehicle 1000 based on artificial intelligence (AI) or to respond immediately to direct voice commands.
[0064] On the other hand, the external microphone 552 may be used, for example, to support safe driving by analyzing various sounds generated outside the autonomous vehicle 1000 based on various analysis tools such as deep learning and responding appropriately.
[0065] For reference, the symbols illustrated in
[0067] A sensing module 3200 shown in
[0068] Referring to
[0069] The lamps 3100 are a type of output unit that emits light in the forward direction of the vehicle and may be provided as a pair. More specifically, the lamps 3100 may be a type of headlamp configured as a pair on the left front and right front of the vehicle. In general, headlamps or headlights may include low beams, high beams, turn signals, daytime running lights, and position lights. Among these, the lamps 3100 controlled by the processor 3300 of the lamp control system 3000 according to the embodiments may correspond to the high beams.
[0070] The sensing module 3200 may sense an adjacent lane of the traveling vehicle to determine a specific lamp (e.g., the left lamp or the right lamp) among the pair of lamps 3100 of which the light output is to be controlled. In other words, the sensing module 3200 may sense the adjacent lane of the traveling vehicle and transmit information on the sensed adjacent lane to the processor 3300. For example, the sensing module 3200 may include a camera 3201 that captures images of the adjacent lane of the traveling vehicle. More specifically, the camera 3201 may capture images of the adjacent lane located in front of the vehicle, and the camera 3201 may correspond to the front camera 531 shown in
[0071] Alternatively, the sensing module 3200 may detect static or dynamic objects around the vehicle. In other words, the sensing module 3200 may detect whether there are static or dynamic objects around the vehicle. Upon detecting such objects, the sensing module 3200 may transmit information on the detected objects to the processor 3300. For example, the sensing module 3200 may include the camera 3201 or a LiDAR sensor 3202 for determining the presence of dynamic objects around the vehicle or a navigation system 3203 for determining the presence of static objects. More specifically, the camera 3201 may detect dynamic objects on the right side and in front of the vehicle, and the camera 3201 may correspond to the front camera 531 and/or the right camera 533 in
[0072] Alternatively, the sensing module 3200 may detect the pupil movement of the driver. In other words, the sensing module 3200 may detect the pupil movement of the driver to determine whether the driver is looking to the left, right, or forward and transmit information on the gaze direction of the driver to the processor 3300. For example, the sensing module 3200 may include a pupil tracking sensor 3204 that detects the pupil movement of the driver. More specifically, the pupil tracking sensor 3204 may be a type of camera located inside the vehicle (e.g., a camera in the cluster) that detects the pupil movement of the driver, and the pupil tracking sensor 3204 may correspond to the internal camera sensor 535 shown in
[0073] The sensing module 3200 and the processor 3300 may be connected to each other via an interface. In other words, the interface may transmit information on the adjacent lane sensed by the sensing module 3200, information on the presence of dynamic or static objects around the vehicle, and/or information on the pupil movement of the driver to the processor 3300.
[0074] Additionally, the interface may also transmit the status of the pair of lamps 3100 to the processor 3300.
[0075] The processor 3300 may receive the information on the adjacent lane of the traveling vehicle sensed by the sensing module 3200. Based on the received information on the adjacent lane, the processor 3300 may control the light output of the specific lamp (e.g., the left lamp or the right lamp) among the pair of lamps 3100. In other words, the processor 3300 may determine which lamp in the pair of lamps 3100 to control the light output based on the information on the adjacent lane of the traveling vehicle. Therefore, the processor 3300 may control the light output of one of the lamps in the pair of lamps 3100 or the light output of both lamps in the pair of lamps 3100 based on the information on the adjacent lane.
[0076] Upon receiving information that the adjacent lane of the traveling vehicle is either the center line or the outermost lane, the processor 3300 may control the light output of a lamp in the pair of lamps 3100 that is adjacent to the corresponding lane. The outermost lane may refer to the rightmost lane. Details thereof will be described later with reference to
[0078] A subject vehicle (SV) shown in
[0080] First, referring to (a) of
[0081] For example, when the SV is traveling in the second lane as shown in (a) of
[0082] Referring to (b) of
[0083] For example, when the SV is traveling on a single-lane one-way road as shown in (b) of
[0084] Referring to (c) of
[0085] For example, when the SV is traveling in the outermost lane, i.e., third lane on a three-lane one-way road as shown in (c) of
[0086] Therefore, when the adjacent lane of the traveling vehicle corresponds to the center line or the outermost lane, the lamp control system 3000 according to the embodiments may control the light output of a lamp adjacent to the lane.
[0087] When the adjacent lane of the traveling vehicle corresponds to the center line, the processor 3300 may completely turn off the light output of the lamp adjacent to the center line or control the light output of the lamp adjacent to the center line to have a first light output. In this case, the first light output may be 10%.
[0088] More specifically, the processor 3300 may completely turn off the light output of the left-side portion of a beam pattern provided by the lamp adjacent to the center line or control the light output of the left-side portion to have the first light output. In other words, the processor 3300 may control the light output of the lamp adjacent to the center line (left lamp) for a range that crosses the center line (opposite lane range or left range), while the processor 3300 may not control the light output for a range that does not cross the center line (driving lane range or right range).
[0089] That is, since the left side of the road based on the center line is an opposite lane, the lamp control system 3000 according to the embodiments may preserve the visibility of drivers in oncoming vehicles by completely turning off the light output or providing a lower light output for the range that crosses the center line.
[0090] When the adjacent lane of the traveling vehicle corresponds to the outermost lane, the processor 3300 may control the light output of a lamp adjacent to the outermost lane to have a second light output. In this case, the second light output may have a value greater than the first light output. For example, if the first light output is 10%, the second light output may be less than 100% but greater than 10%.
[0091] More specifically, the processor 3300 may control the light output of the right-side portion of a beam pattern provided by the lamp adjacent to the outermost lane to have the second light output.
[0092] In other words, since the right side of the road based on the outermost lane is a pedestrian road, the lamp control system 3000 according to the embodiments reduces the light output for the range that crosses the outermost lane to prevent glare for pedestrians outside the lane, while simultaneously providing a certain level of light output to alert pedestrians to the presence of the vehicle.
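The control logic just described can be sketched as follows. This is an illustrative sketch, not the claimed implementation; the lane labels are hypothetical names, and the 10% and 50% output values are examples drawn from this description.

```python
def controlled_light_output(adjacent_lane: str, current_output: float) -> float:
    """Illustrative sketch of the light-output control for one lamp.

    adjacent_lane: classification of the lane next to that lamp,
    one of "center_line", "outermost_lane", or "other".
    Outputs are expressed as fractions of full light output.
    """
    FIRST_LIGHT_OUTPUT = 0.10   # example value toward oncoming traffic
    SECOND_LIGHT_OUTPUT = 0.50  # example: greater than 10%, less than 100%

    if adjacent_lane == "center_line":
        # Dim the portion of the beam crossing the center line to preserve
        # the visibility of oncoming drivers (0.0 would also be permitted).
        return FIRST_LIGHT_OUTPUT
    if adjacent_lane == "outermost_lane":
        # Reduced but non-zero output: avoids glare for pedestrians
        # while still signaling the presence of the vehicle.
        return SECOND_LIGHT_OUTPUT
    return current_output  # no special lane: leave the output unchanged
```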
[0093] As described in
[0094] The information on the adjacent lane may include information on the number of lanes, information on the types of lanes, and information on the colors of lanes. That is, the processor 3300 may determine the road on which the vehicle is currently driving and/or the position of the vehicle based on the information on the number of lanes, the information on the types of lanes, the information on the colors of lanes, and/or combination thereof.
[0095] For example, for a road with four lanes (a three-lane road), a logic for determining which lane the adjacent lane of the traveling vehicle corresponds to, based on information on the adjacent lane, will be explained.
[0096] First, the sensing module 3200 may sense information on the left adjacent lane of the pair of adjacent lanes of the traveling vehicle and transmit the information to the processor 3300. The processor 3300 may determine which lane the adjacent lane corresponds to, based on the information on the adjacent lane.
[0097] For example, if the left adjacent lane is a double solid line and there is a barrier to the left of the left adjacent lane, the processor 3300 may determine that the left adjacent lane corresponds to the center line. Alternatively, if the left adjacent lane is a dashed line and there is a double solid line to the left of the left adjacent lane, the processor 3300 may determine that the left adjacent lane is the second lane, which does not correspond to the center line. Alternatively, if the left adjacent lane is a dashed line and there is another dashed line to the left of the left adjacent lane, the processor 3300 may determine that the left adjacent lane is the third lane, which does not correspond to the center line.
[0098] Similarly, for example, if the right adjacent lane is a solid line and there is a barrier to the right of the right adjacent lane, the processor 3300 may determine that the right adjacent lane corresponds to the outermost lane (the fourth lane in the case of a three-lane road). Alternatively, if the right adjacent lane is a dashed line and there is a solid line to the right of the right adjacent lane, the processor 3300 may determine that the right adjacent lane does not correspond to the outermost lane. Alternatively, if the right adjacent lane is a dashed line and there is another dashed line to the right of the right adjacent lane, the processor 3300 may determine that the right adjacent lane does not correspond to the outermost lane.
[0099] The determination logic based on the type and number of lanes has been explained, but lanes may also be determined by combining lane color information such as white, yellow, and blue.
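The determination logic just described can be sketched as follows; the line-type and neighbor labels are hypothetical names chosen for illustration, and the rules mirror the examples given above for a three-lane road.

```python
def classify_adjacent_line(line_type: str, beyond: str) -> str:
    """Illustrative sketch of the adjacent-lane determination logic.

    line_type: marking of the adjacent lane line
               ("double_solid", "solid", or "dashed").
    beyond:    what lies past that line ("barrier", "double_solid",
               "solid", or "dashed").
    """
    if line_type == "double_solid" and beyond == "barrier":
        return "center_line"     # left boundary toward the opposite lanes
    if line_type == "solid" and beyond == "barrier":
        return "outermost_lane"  # right boundary toward the pedestrian road
    # Dashed lines with further markings beyond them are interior lanes.
    return "interior_lane"
```

Lane color information (white, yellow, blue) could be combined in the same way as additional conditions.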
[0101] An SV shown in
[0102] Hereinafter, various embodiments of the lamp control system 3000 according to the embodiments will be described with reference to
[0103] First, referring to
[0104] That is, compared to a case where the adjacent lane is the center line, when the adjacent lane is the outermost lane, the processor 3300 may require a certain level of light output to notify pedestrians of the presence of the vehicle and control the required light output differently according to the tendency or tendencies of the driver.
[0105] For example, as shown in
[0106] For example, if the driver is categorized as aggressive, it may mean that the driver has a strong tendency to accelerate, maintains a short safe distance from the vehicle ahead, and has a slow reaction time to surrounding situations. Therefore, in this case, there is a greater need to notify pedestrians of the presence of the vehicle, and thus the processor 3300 may control the light output of the lamp to have a relatively high value. For instance, the light output may be controlled to 70%.
[0107] Alternatively, if the driver is categorized as average, it may mean that the driver has a moderate tendency to accelerate, maintains a moderate safe distance from the vehicle ahead, and has a moderate reaction time to surrounding situations. Therefore, in this case, the need to notify pedestrians of the presence of the vehicle is less than for aggressive drivers, and thus the processor 3300 may control the light output of the lamp to have a lower value than for aggressive drivers. For instance, the light output may be controlled to 50%.
[0108] Alternatively, if the driver is categorized as safety-conscious, it may mean that the driver has a low tendency to accelerate, maintains a long safe distance from the vehicle ahead, and has a quick reaction time to surrounding situations. Therefore, in this case, the need to notify pedestrians of the presence of the vehicle is lower (the driver is more likely to be aware of the presence of the pedestrians), and thus the processor 3300 may control the light output of the lamp to have a relatively low value. For instance, the light output may be controlled to 30%.
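The tendency-based control above amounts to a simple mapping from driver category to output level. The sketch below is illustrative, not the claimed implementation; the category names are hypothetical labels and the percentages follow the examples in this description.

```python
# Illustrative mapping from analyzed driving tendency to the light output
# of the lamp adjacent to the outermost lane (percentages follow the
# examples in this description; category names are hypothetical labels).
TENDENCY_TO_OUTPUT = {
    "aggressive": 0.70,        # strong acceleration, short gap, slow reaction
    "average": 0.50,           # moderate in all three respects
    "safety_conscious": 0.30,  # gentle acceleration, long gap, quick reaction
}

def outermost_lane_output(tendency: str) -> float:
    # Fall back to the average level for an unrecognized category.
    return TENDENCY_TO_OUTPUT.get(tendency, 0.50)
```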
[0109] Therefore, the lamp control system 3000 according to the embodiments may efficiently control lamps based on the tendency or tendencies of drivers.
[0110] Alternatively, referring to
[0111] That is, as described in
[0112] The processor 3300 may receive information on static objects such as facilities or dynamic objects around the vehicle detected by the sensing module 3200. Referring also to
[0113] That is, when the left adjacent lane of the vehicle is the outermost lane, in other words, when the vehicle is driving in the outermost lane, if a facility or moving object is detected in the right front of the outermost lane, the processor 3300 may restore the light output of the lamp adjacent to the outermost lane to the original level. Therefore, when the facility or moving object is detected in the right front of the outermost lane, the lamp control system 3000 according to the embodiments may provide sufficient light to the area of interest of the driver by restoring the light output of the lamp adjacent to the outermost lane to the original level.
[0114] Additionally, the processor 3300 may analyze the reaction speed tendency or tendencies of the driver, configure operating conditions based on that tendency or those tendencies, and configure a recovery point based on the configured operating conditions, so that when the operating conditions for a given driver are satisfied, the light output of the lamp adjacent to the outermost lane is restored at the recovery point. The analyzed reaction speed tendency or tendencies of the driver may also be stored in the memory described in
[0115] That is, when the facility or moving object is detected in the right front of the outermost lane, the processor 3300 may restore the light output of the lamp adjacent to the outermost lane and configure different conditions for restoring the light output based on the reaction speed of the driver.
[0116] For example, as shown in
[0117] For example, if the reaction speed of the driver is categorized as strong, it may mean that the driver has a relatively fast reaction time. Therefore, in this case, even if the speed of the vehicle is relatively high, the driver may detect a facility or dynamic object in the right front of the outermost lane. Thus, the processor 3300 may set the operating condition to a relatively high value. For example, the operating condition may be set to a vehicle speed of 80 km/h or less. When the operating condition is set to 80 km/h or less, the processor 3300 may restore the light output of the lamp to the original level at a point 300 meters before the detected facility or dynamic object.
[0118] Alternatively, if the reaction speed of the driver is categorized as normal, it may mean that the driver has an average reaction time. Therefore, in this case, the speed of the vehicle needs to be lower than that for a driver with a strong reaction speed to allow the driver to detect a facility or dynamic object in the right front of the outermost lane. Thus, the processor 3300 may set the operating condition to a relatively lower value. For instance, the operating condition may be set to a vehicle speed of 70 km/h or less, which is lower than that for a driver with a strong reaction speed. When the operating condition is set to 70 km/h or less, the processor 3300 may restore the light output of the lamp to the original level at a point 400 meters before the detected facility or dynamic object.
[0119] Alternatively, if the reaction speed of the driver is categorized as weak, it may mean that the driver has a relatively slow reaction time. Therefore, in this case, the speed of the vehicle needs to be relatively lower to allow the driver to detect a facility or dynamic object in the right front of the outermost lane. Thus, the processor 3300 may set the operating condition to a relatively low value. For instance, the operating condition may be set to a vehicle speed of 65 km/h or less. When the operating condition is set to 65 km/h or less, the processor 3300 may restore the light output of the lamp to the original level at a point 450 meters before the detected facility or dynamic object.
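The per-driver operating conditions and recovery points of paragraphs [0117] through [0119] can be sketched as below. This is a hedged illustration under assumptions: the function and dictionary names are invented, and a real system would take these inputs from the sensing module 3200; the speed thresholds (80/70/65 km/h) and recovery distances (300/400/450 m) are the example values from the text.

```python
# Illustrative sketch: per-reaction-speed operating conditions for restoring
# the lamp output when a facility or moving object is detected ahead in the
# outermost lane. Values are the example figures from the text.
OPERATING_CONDITIONS = {
    # reaction speed: (max vehicle speed in km/h,
    #                  recovery point in metres before the detected object)
    "strong": (80, 300),
    "normal": (70, 400),
    "weak":   (65, 450),
}

def should_restore(reaction_speed: str, vehicle_speed_kmh: float,
                   distance_to_object_m: float) -> bool:
    """True when the speed condition holds and the vehicle has reached the
    recovery point for this driver's reaction-speed category."""
    max_speed, recovery_m = OPERATING_CONDITIONS[reaction_speed]
    return vehicle_speed_kmh <= max_speed and distance_to_object_m <= recovery_m
```

Note how the sketch encodes the text's trade-off: a slower-reacting driver gets both a stricter speed condition and an earlier (more distant) recovery point.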
[0120] Therefore, the lamp control system 3000 according to the embodiments may efficiently control lamps based on the reaction speed tendency or tendencies of drivers.
[0121] Alternatively, referring to
[0122] That is, as described in
[0123] The processor 3300 may receive the information on the pupil movement of the driver detected by the sensing module 3200. Referring also to
[0124] That is, when the left adjacent lane of the vehicle is the outermost lane, that is, when the vehicle is driving in the outermost lane, if it is determined that the gaze of the driver remains on the outermost lane or the driver frequently focuses on the outermost lane within a specific period, the processor 3300 may restore the light output of the lamp adjacent to the outermost lane to the original level. Therefore, when the driver is focusing on the outermost lane, the lamp control system 3000 according to the embodiments may provide sufficient light to the area of interest of the driver by restoring the light output of the lamp adjacent to the outermost lane to the original level.
[0125] The processor 3300 may analyze the reaction speed tendency or tendencies of the driver and configure operating conditions based on that tendency or those tendencies. Then, if the operating conditions for a given driver are satisfied, the processor 3300 may restore the light output of the lamp adjacent to the outermost lane. The analyzed reaction speed tendency or tendencies of the driver may also be stored in the memory described in
[0126] In other words, when the driver is focusing on the outermost lane, the processor 3300 may restore the light output of the lamp adjacent to the outermost lane. The conditions for restoring the light output may be set differently based on the reaction speed of the driver.
[0127] For example, as shown in
[0128] For example, if the reaction speed of the driver is categorized as strong, it may mean that the driver has a relatively fast reaction time. Therefore, in this case, since the gaze of the driver remains on the outermost lane for a relatively short duration, or the frequency with which the driver focuses on the outermost lane within the specific period is relatively high, the processor 3300 may set the operating condition to a relatively small value for the duration of the gaze or a relatively large value for the frequency of focusing. For instance, the operating condition may be configured such that the gaze of the driver remains on the outermost lane for at least 0.2 seconds or the driver focuses on the outermost lane at least three times within the specific period.
[0129] Alternatively, if the reaction speed of the driver is categorized as normal, it may mean that the driver has an average reaction time. Therefore, in this case, since the gaze of the driver remains on the outermost lane for a relatively longer duration than a driver with a strong reaction speed or the frequency of focusing on the outermost lane within the specific period is relatively lower than a driver with a strong reaction speed, the processor 3300 may set the operating condition with a larger value for the duration of the gaze or a smaller value for the frequency of focusing compared to a driver with a strong reaction speed. For example, the operating condition may be configured such that the gaze of the driver remains on the outermost lane for at least 0.4 seconds or the driver focuses on the outermost lane at least two times within the specific period.
[0130] Alternatively, if the reaction speed of the driver is categorized as weak, it may mean that the driver has a relatively slow reaction time. Therefore, in this case, since the gaze of the driver remains on the outermost lane for a relatively long duration or the frequency of focusing on the outermost lane within the specific period is relatively low, the processor 3300 may configure the operating condition with a larger value for the duration of the gaze or a smaller value for the frequency of focusing. For instance, the operating condition may be configured such that the gaze of the driver remains on the outermost lane for at least 0.5 seconds or the driver focuses on the outermost lane at least two times within the specific period.
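The gaze-based operating conditions of paragraphs [0128] through [0130] can likewise be sketched. Again this is an assumption-laden illustration: names are invented, and the gaze duration and focus count would in practice come from the pupil-movement data sensed by the sensing module 3200; the thresholds (0.2 s / 3 times, 0.4 s / 2 times, 0.5 s / 2 times) are the example values from the text, with either condition sufficing.

```python
# Illustrative sketch: gaze-based operating conditions. Restoration is
# triggered when the driver's gaze dwells on the outermost lane for at least
# the minimum duration, OR the driver focuses on it at least the minimum
# number of times within the observation period. Values from the text.
GAZE_CONDITIONS = {
    # reaction speed: (min gaze duration in seconds, min focus count)
    "strong": (0.2, 3),
    "normal": (0.4, 2),
    "weak":   (0.5, 2),
}

def gaze_condition_met(reaction_speed: str, gaze_duration_s: float,
                       focus_count: int) -> bool:
    """True when either the dwell-time or the focus-frequency condition
    for this driver's reaction-speed category is satisfied."""
    min_duration, min_count = GAZE_CONDITIONS[reaction_speed]
    return gaze_duration_s >= min_duration or focus_count >= min_count
```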
[0131] Therefore, the lamp control system 3000 according to the embodiments may efficiently control lamps based on the reaction speed tendency or tendencies of drivers.
[0132]
[0133] Specifically,
[0134] Referring to
[0135] The sensing step (S8000) may be performed by the sensing module 3200 of the lamp control system 3000, and the control step (S8001) may be performed by the processor 3300 of the lamp control system 3000.
[0136] In the control step (S8001), if the specific lane corresponds to the center line, that is, if the right adjacent lane among adjacent lanes corresponds to the center line, the light output of a lamp adjacent to the center line may be turned off or controlled to have a first light output. In this case, the first light output may be 10%. Alternatively, in the control step (S8001), if the specific lane corresponds to the outermost lane, that is, if the left adjacent lane among the adjacent lanes corresponds to the outermost lane, the light output of a lamp adjacent to the outermost lane may be controlled to have a second light output. In this case, the second light output may be greater than the first light output and less than 100%.
[0137] The embodiments have been described from the perspectives of the method and/or apparatus, and the description of the method and apparatus may be mutually complementary and applicable.
[0138] While each drawing has been explained separately for the sake of clarity, it is also possible to design new embodiments by combining the embodiments illustrated in each drawing. Designing a computer-readable recording medium having recorded thereon a program for executing the above-described embodiments, as needed by a person of ordinary skill in the art, falls within the scope of the present disclosure. The device and method according to the embodiments are not limited to the configurations and methods in the above-described embodiments. Instead, the embodiments may be selectively combined in whole or in part to allow for various modifications. While preferred embodiments of the present disclosure have been illustrated and explained, the present disclosure is not limited to the specific embodiments described above. In addition, those skilled in the art will appreciate that various modifications may be made in the embodiments without departing from the essence of the embodiments claimed in the claims. These variations should not be individually understood apart from the technical concept or perspective of the embodiments.
[0139] Various components of the apparatus according to the embodiments may be implemented by hardware, software, firmware, or a combination thereof. Various components of the embodiments may be implemented as a single chip such as a hardware circuit, for example. According to embodiments, the components of the embodiments may be implemented as separate chips. According to embodiments, at least one or more of the components of the apparatus according to the embodiments may include one or more processors capable of executing one or more programs. The one or more programs may perform one or more of the operations/methods according to embodiments or include instructions for performing the same. Executable instructions for performing the methods/operations of the apparatus according to the embodiments may be stored in a non-transitory computer-readable medium (CRM) or other computer program products configured to be executed by the one or more processors. Alternatively, the instructions may be stored in a transitory CRM or other computer program products configured to be executed by the one or more processors. The concept of memory according to embodiments may encompass not only a volatile memory (e.g., random-access memory (RAM)) but also a non-volatile memory, a flash memory, a programmable read-only memory (PROM), and the like. The memory may also be implemented in the form of carrier waves, such as transmission over the Internet. Furthermore, a processor-readable recording medium may be distributed to computer systems connected over a network, where processor-readable code may be stored and executed in a distributed manner.
[0140] In this document, "/" and "," are interpreted as "and/or". For example, "A/B" is interpreted as "A and/or B", and "A, B" is interpreted as "A and/or B". In addition, "A/B/C" means at least one of A, B, and/or C. Similarly, "A, B, C" also means at least one of A, B, and/or C. Furthermore, "or" is interpreted as "and/or". For example, "A or B" may mean: 1) A only, 2) B only, or 3) A and B. In other words, "or" in this document may mean "additionally or alternatively".
[0141] Terms such as first and second may be used to describe various components of the embodiments. However, the various components according to the embodiments should not be limited by the interpretation of these terms. These terms are merely used to distinguish one component from another. For example, a first user input signal and a second user input signal are both user input signals, but unless clearly indicated in context, the first user input signal and second user input signal do not refer to the same user input signals.
[0142] The terms used to describe the embodiments are used for the purpose of describing specific embodiments. In other words, the terms are not intended to limit the embodiments. As described in the embodiments and claims, the singular form is intended to encompass the plural unless explicitly specified in context. The "and/or" expression is used to mean all possible combinations of terms. Terms such as "includes" or "comprises" are used to describe the presence of features, numbers, steps, elements, and/or components and do not imply the exclusion of additional features, numbers, steps, elements, and/or components. Conditional expressions such as "if" and "when" used to describe embodiments are not limited to optional cases but are intended to be interpreted to mean that when specific conditions are satisfied, related operations or definitions are performed or interpreted.
[0143] The operations according to embodiments described in this document may be performed by a transmitting/receiving device, which includes a memory and/or a processor according to embodiments. The memory may store programs for performing/controlling the operations according to the embodiments, and the processor may control various operations described in this document. The processor may also be referred to as a controller. The operations according to the embodiments may be performed by firmware, software, and/or a combination thereof. The firmware, software, and/or combination thereof may be stored in the processor or memory.
[0144] On the other hand, the operations according to the embodiments may also be performed by a transmitting device and/or a receiving device according to embodiments. The transmitting/receiving device may include a transceiver for transmitting and receiving media data, a memory for storing instructions (e.g., program code, algorithms, flowcharts, and/or data) for processes according to embodiments, and a processor for controlling the operations of the transmitting/receiving device.
[0145] The processor may be referred to as a controller. The processor may correspond to hardware, software, and/or a combination thereof. The operations according to the embodiments may be performed by the processor. Additionally, the processor may be implemented as an encoder/decoder for the operations according to the embodiments.
[0146] Hereinabove, the best mode for implementing the embodiments has been described.
[0147] As described above, the embodiments may be applied in whole or in part to an autonomous valet driving apparatus and system.
[0148] Those skilled in the art may make various modifications or variations to the embodiments without departing from the scope of the present disclosure.
[0149] The embodiments may include modifications/variations without departing from the scope of the claims and their equivalents.