AUTONOMOUS DRIVING VEHICLE AND CONTROL METHOD THEREOF
20250065801 · 2025-02-27
CPC classification
B60W30/16; B60Q1/143; B60Q2300/45; B60Q1/085; B60W2554/4045; B60W60/0015; B60W2554/4044; B60W10/20; B60W50/0097; B60W30/09 (all Section B: Performing operations; Transporting)
International classification
B60W30/09; B60W30/16; B60W60/00; B60W50/00 (all Section B: Performing operations; Transporting)
Abstract
A method of controlling an autonomous vehicle includes sensing, using a first sensor, an object located in front of the autonomous vehicle as the autonomous vehicle is driving on a driving path using a low beam of a headlight mounted on the autonomous vehicle. The method also includes, when a portion of the object is exposed by the low beam and is additionally sensed by a second sensor, switching, by a processor, the low beam to a high beam.
Claims
1. A method of controlling an autonomous vehicle, the method comprising: sensing, using a first sensor, an object located in front of the autonomous vehicle as the autonomous vehicle is driving on a driving path using a low beam of a headlight mounted on the autonomous vehicle; and when a portion of the object is exposed by the low beam and is additionally sensed by a second sensor, switching, by a processor, the low beam to a high beam.
2. The method of claim 1, further comprising: in response to switching to the high beam, setting, by the processor, the object to be a target; and determining, by the processor, a braking distance between the autonomous vehicle and the target.
3. The method of claim 2, further comprising controlling, by the processor, a forward collision-avoidance assist (FCA) system or steering of the autonomous vehicle, based on the determined braking distance.
4. The method of claim 3, wherein controlling the FCA system or the steering includes controlling the FCA system in response to determining, by the processor, that the determined braking distance is greater than a preset safety distance.
5. The method of claim 3, wherein controlling the FCA system or the steering includes controlling the steering in response to determining, by the processor, that the determined braking distance is less than a preset safety distance.
6. The method of claim 1, wherein switching the low beam to the high beam includes switching the low beam gradually to the high beam.
7. The method of claim 2, further comprising predicting, by the processor, at least one of a movement of the object or a direction of the object while determining the braking distance.
8. The method of claim 7, further comprising re-determining, by the processor, the braking distance based on the at least one of the movement of the object or the direction of the object.
9. The method of claim 8, further comprising controlling, by the processor, a forward collision-avoidance assist (FCA) system or steering of the autonomous vehicle, based on the re-determined braking distance.
10. A non-transitory computer-readable storage medium storing instructions that, when executed by a processor, cause the processor to: sense, using a first sensor, an object located in front of an autonomous vehicle as the autonomous vehicle is driving on a driving path using a low beam mounted on the autonomous vehicle; and when a portion of the object is exposed by the low beam and is additionally sensed by a second sensor, switch the low beam to a high beam.
11. An autonomous vehicle, comprising: a headlight; a first sensor; a second sensor; and a processor configured to control the headlight, wherein the processor is configured to sense an object located in front of the autonomous vehicle using the first sensor as the autonomous vehicle is driving on a driving path using a low beam, and when a portion of the object is exposed by the low beam and is additionally sensed by the second sensor, switch the low beam to a high beam.
12. The autonomous vehicle of claim 11, wherein the processor is further configured to: in response to switching to the high beam, set the object to be a target; and determine a braking distance between the autonomous vehicle and the target.
13. The autonomous vehicle of claim 12, wherein the processor is further configured to control a forward collision-avoidance assist (FCA) system or steering, based on the determined braking distance.
14. The autonomous vehicle of claim 13, wherein the processor is configured to control the FCA system in response to determining that the determined braking distance is greater than a preset safety distance.
15. The autonomous vehicle of claim 13, wherein the processor is configured to control the steering in response to determining that the determined braking distance is less than a preset safety distance.
16. The autonomous vehicle of claim 11, wherein the processor is configured to switch the low beam gradually to the high beam.
17. The autonomous vehicle of claim 12, wherein the processor is further configured to predict at least one of a movement of the object or a direction of the object while determining the braking distance.
18. The autonomous vehicle of claim 17, wherein the processor is further configured to re-determine the braking distance based on the at least one of a movement of the object or a direction of the object.
19. The autonomous vehicle of claim 18, wherein the processor is further configured to control a forward collision-avoidance assist (FCA) system or steering, based on the re-determined braking distance.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
DETAILED DESCRIPTION OF THE DISCLOSURE
[0037] Hereinafter, embodiments of the present disclosure are described in detail with reference to the accompanying drawings. In the accompanying drawings, the same or similar elements are designated with the same reference numerals regardless of reference symbols, and a repeated description thereof has been omitted. Further, in describing the embodiments, where it was determined that a detailed description of related publicly known technology may obscure the gist of the embodiments described herein, the detailed description thereof has been omitted. The accompanying drawings are used to explain various technical features of embodiments of the present disclosure. It should be understood that the embodiments presented herein are not limited by the accompanying drawings.
[0038] As used herein, the terms "include," "comprise," "have," or the like, specify the presence of stated features, numbers, operations, elements, components, and/or combinations thereof. Such terms do not preclude the presence or addition of one or more other features, numbers, operations, elements, components, and/or combinations thereof. In addition, when describing embodiments with reference to the accompanying drawings, like reference numerals refer to like components and a repeated description related thereto is omitted.
[0039] The terms "unit" and "control unit" included in names such as a vehicle control unit (VCU) may be terms widely used in the naming of a control device or controller configured to control vehicle-specific functions. Such terms may not refer to a generic function unit. For example, each controller or control unit may include a communication device that communicates with other controllers or sensors to control a corresponding function, a memory that stores an operating system (OS) or logic commands and input/output information, and a processor that performs determination, calculation, selection, and the like necessary to control the function.
[0040] When a component, device, element, or the like of the present disclosure is described as having a purpose or performing an operation, function, or the like, the component, device, or element should be considered herein as being configured to meet that purpose or perform that operation or function.
[0042] Referring to
[0043] The sensors 130 may include front detection sensors 130 disposed in the front of the autonomous vehicle 100 or a different vehicle. For example, the sensors 130 may include a radar 131, a camera 132, and a lidar 133. The radar 131 may also be referred to herein as a first sensor, the camera 132 may also be referred to herein as a second sensor, and the lidar 133 may also be referred to herein as a third sensor.
[0044] The radar 131 may be provided as at least one radar in the autonomous vehicle 100. The radar 131 may measure a relative speed and a relative distance with respect to a recognized object, along with a wheel speed sensor (not shown) provided in the autonomous vehicle 100. For example, the radar 131 may be provided in the front of the autonomous vehicle 100 to recognize an object present ahead.
[0045] The camera 132 may be provided as at least one camera in the autonomous vehicle 100. The camera 132 may capture an image of an object present around the autonomous vehicle 100 and an image of a state of the object, and output image data based on information associated with the captured image. For example, the camera 132 may be provided in the autonomous vehicle 100 to recognize a portion of an object present ahead. For example, the camera 132 may be provided in the autonomous vehicle 100 to recognize a portion of an object present ahead or a lower body of a pedestrian present ahead that is exposed by a low beam.
[0046] The lidar 133 may be provided as at least one lidar in the autonomous vehicle 100. The lidar 133 may emit a laser pulse and measure a time at which the laser pulse reflected from an object present within a measurement range returns. The lidar 133 may sense information such as a distance to an object, a direction of the object, and a speed of the object, and may output lidar data based on the sensed information. The term object as used herein may be an obstacle, a person, a thing, or the like, that is present outside the autonomous vehicle 100.
[0047] The processor 110 may sense an object present in front of the autonomous vehicle 100 using the first sensor while the autonomous vehicle is driving on a dark driving path (e.g., at night) using a low beam provided in the autonomous vehicle 100. For example, the processor 110 may sense an object located in front of the autonomous vehicle 100, even in an area that is not reachable by the low beam, using the radar 131.
[0048] When a portion of the object is exposed by the low beam and is additionally sensed by the second sensor different from the first sensor, the processor 110 may control the low beam to be switched to a high beam. For example, when the portion of the object is exposed by the low beam, the processor 110 may recognize, using the camera 132, the object that has been sensed by the radar 131. In this case, when the portion of the object present ahead is recognized, the processor 110 may switch the low beam to the high beam, enabling better recognition of the object ahead.
[0049] In an embodiment, in response to switching to the high beam, the processor 110 may set, as a target, the object additionally sensed through the camera 132, which is the second sensor. The processor 110 may then determine a braking distance between the autonomous vehicle 100 and the object based on a result of the setting. For example, in response to switching to the high beam, the processor 110 may predict or estimate various data about the object sensed through the camera 132. For example, the processor 110 may analyze the sensed object to predict or estimate a movement of the object, a direction of the object, a speed of the object, a braking distance between the object and the autonomous vehicle 100, a relative speed between the object and the autonomous vehicle 100, and the like.
[0050] The processor 110 may control a forward collision-avoidance assist (FCA) system and/or steering, based on the determined braking distance. This is described in more detail below.
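The disclosure gives no source code for this sensing-and-switching logic; the following is a minimal Python sketch of the flow described in paragraphs [0047]-[0048], in which a radar track confirmed by a partial camera detection triggers the low-to-high beam switch. All names (`Detection`, `select_beam`, the beam labels) are illustrative assumptions, not identifiers from the disclosure.

```python
from dataclasses import dataclass


@dataclass
class Detection:
    radar_hit: bool       # object sensed by the first sensor (radar), even beyond low-beam reach
    camera_partial: bool  # portion of the object exposed by the low beam and seen by the camera


def select_beam(det: Detection, current_beam: str) -> str:
    """Switch the low beam to the high beam only when the radar track is
    additionally confirmed by a partial camera detection; otherwise keep
    the current beam state."""
    if current_beam == "low" and det.radar_hit and det.camera_partial:
        return "high"
    return current_beam
```

As a usage example, `select_beam(Detection(radar_hit=True, camera_partial=False), "low")` would keep the low beam, since the camera has not yet confirmed the radar track.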
[0052] Referring to
[0053] The autonomous vehicle 100 may drive on a vehicle driving path under the control of the processor 110. When it is determined as night driving in a step or operation S11 while the autonomous vehicle 100 is driving on the vehicle driving path, the autonomous vehicle 100 may activate or deactivate a low beam in a step or operation S12, under the control of the processor 110. For example, when the low beam is activated, a downward light may be activated.
[0054] A low beam as used herein may refer to a light located in the front of the autonomous vehicle 100 to secure forward visibility for safety during night driving. Although the low beam is used to secure the forward vision of a driver, it may shine or blink toward the bottom of the driving path, rather than fully illuminating the front side of the vehicle. Accordingly, a downward light emitted through the low beam may shine or blink toward the bottom of the driving path.
[0055] A high beam as used herein may refer to a light located in the front of the autonomous vehicle 100 to secure forward visibility for safety during night driving. The high beam may shine or blink fully toward the front side of the autonomous vehicle 100 to secure the forward vision of the driver. Accordingly, an upward light emitted through the high beam may shine or blink toward the front side of the autonomous vehicle 100.
[0056] A headlight may include the low beam and the high beam. The expression the low beam of the headlight as used herein may also indicate that the downward light is activated. The expression the high beam of the headlight as used herein may also indicate the upward light is activated.
[0057] In a step or operation S13, when the low beam is deactivated, the autonomous vehicle 100 may recognize an object 200 located in front of the autonomous vehicle 100, under the control of the processor 110. The object 200 may include a pedestrian 200, a thing, or the like. For example, a case where the low beam is deactivated may indicate a situation where a surrounding area of a driving lane is brighter than the standard brightness, and thus a dimmed light or a fog light is activated. For example, when the autonomous vehicle 100 is driving on a driving lane in an auto mode and the surrounding area of the driving lane is brighter than the standard brightness, the low beam may be deactivated. However, examples are not limited to the foregoing examples. For example, the case where the low beam is deactivated may also indicate a situation where the surrounding area of the driving lane is darker than the standard darkness and the high beam is thus activated.
[0058] Subsequently, the autonomous vehicle 100 may recognize the pedestrian 200 located in front of the autonomous vehicle 100 and determine it as a target, under the control of the processor 110.
[0059] When the recognized pedestrian 200 is determined as the target, the autonomous vehicle 100 may determine a braking distance based on the pedestrian 200, under the control of the processor 110.
[0060] In a step or operation S14, the autonomous vehicle 100 may activate an FCA system and warn of a risk of collision based on the determined braking distance, under the control of the processor 110, as shown in
[0061] In a step or operation S15, the autonomous vehicle 100 may control collision risk braking after warning of the risk of collision, under the control of the processor 110. Accordingly, the autonomous vehicle 100 may prevent a collision with the object 200.
[0062] For example, as shown in
[0063] In a step or operation S16, when stop braking control is completed, the autonomous vehicle 100 may deactivate the FCA system or cancel the operation, under the control of the processor 110.
[0064] In a step or operation S17, when the low beam is activated, the autonomous vehicle 100 may sense the object 200 located in front of the autonomous vehicle 100 using a first sensor, under the control of the processor 110. For example, the autonomous vehicle 100 may recognize the object 200 present ahead, using a radar or a front radar, under the control of the processor 110. For example, the autonomous vehicle 100 may use the radar to sense the object 200 located in front of the autonomous vehicle 100 up to an area where the low beam does not reach, under the control of the processor 110.
[0065] In a step or operation S18, when a portion of the object 200 is exposed by the low beam, the autonomous vehicle 100 may recognize the portion of the object 200 using a second sensor different from the first sensor, under the control of the processor 110. For example, in the step or operation S18, when a lower body of the pedestrian 200 is exposed by the low beam, the autonomous vehicle 100 may recognize the lower body of the pedestrian 200 using a camera, under the control of the processor 110.
[0066] In a step or operation S19, when the portion of the object 200 is recognized by the low beam, the autonomous vehicle 100 may switch the low beam to the high beam, under the control of the processor 110. In the step or operation S19, the autonomous vehicle 100 may switch the low beam of the headlight to the high beam of the headlight, under the control of the processor 110.
[0067] For example, when the lower body of the pedestrian 200 is exposed by the low beam, the autonomous vehicle 100 may recognize, by the camera, the pedestrian 200 sensed by the radar, under the control of the processor 110. In this case, when the lower body of the pedestrian 200 present ahead is recognized, the processor 110 may switch the low beam to the high beam to more accurately recognize the pedestrian 200 present ahead.
[0068] In a step or operation S20, the autonomous vehicle 100 may determine, as a target, the pedestrian 200 that is additionally sensed through the camera after the switch from the low beam to the high beam, under the control of the processor 110.
[0069] In a step or operation S21, when the sensed pedestrian 200 is not determined as the target, the autonomous vehicle 100 may switch the high beam to the low beam, under the control of the processor 110. For example, the autonomous vehicle 100 may switch the high beam of the headlight to the low beam of the headlight under the control of the processor 110.
[0070] In a step or operation S22, when the pedestrian 200 is determined or set as the target in step S20, the autonomous vehicle 100 may determine a braking distance between the autonomous vehicle 100 and the pedestrian 200 based on the determined or set result, under the control of the processor 110.
[0071] The autonomous vehicle 100 may control the FCA system in the step or operation S14 or control steering in a step or operation S23 based on the determined braking distance, under the control of the processor 110.
[0072] For example, when the determined braking distance is greater than a preset safety distance, the autonomous vehicle 100 may activate the FCA system, under the control of the processor 110. In this case, the determined braking distance being greater than the preset safety distance signifies that the autonomous vehicle 100 has sufficient distance to brake.
[0073] When the determined braking distance is satisfied in the step or operation S22, the autonomous vehicle 100 may activate the FCA system and warn of a risk of collision in the step or operation S14, under the control of the processor 110, as shown in
[0074] Subsequently, after warning of the risk of collision, the autonomous vehicle 100 may control the collision risk braking in the step or operation S15 to prevent a collision with the pedestrian 200, under the control of the processor 110.
[0075] In the step or operation S23, when the determined braking distance is less than the preset safety distance, the autonomous vehicle 100 may control steering of the autonomous vehicle 100, under the control of the processor 110, as shown in
[0076] However, examples are not limited to the foregoing. For example, when the determined braking distance is not satisfied in the step or operation S22, the autonomous vehicle 100 may activate the FCA system but control the steering to avoid the pedestrian 200 in the step or operation S23, under the control of the processor 110.
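The branch between the FCA path (S14) and the steering path (S23) described in paragraphs [0071]-[0076] can be sketched as a single threshold decision. This is a minimal illustration under assumed names and labels (`choose_avoidance`, `"fca_braking"`, `"steering"` are not from the disclosure):

```python
def choose_avoidance(braking_distance: float, safety_distance: float) -> str:
    """Per the described control flow: if the determined braking distance
    exceeds the preset safety distance, warning and braking via the FCA
    system suffice; otherwise the vehicle steers around the target."""
    if braking_distance > safety_distance:
        return "fca_braking"  # S14: warn of collision risk, then brake
    return "steering"         # S23: avoid the pedestrian by steering
```

Note that the disclosure also describes a mixed case (activating the FCA system while steering to avoid), which this two-way sketch deliberately omits.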
[0077] Subsequently, when the steering control is completed, the autonomous vehicle 100 may deactivate the FCA system and the steering or may cancel the operation in the step or operation S16, under the control of the processor 110.
[0079] Referring to
[0080] For example, when a lower body of the pedestrian 200 is exposed by a low beam as shown in
[0081] Subsequently, when a portion of the object 200 is recognized by the low beam as shown in
[0082] In an embodiment, when switching the low beam to the high beam, the autonomous vehicle 100 may control the low beam to be switched gradually to the high beam, under the control of the processor 110, as shown in
[0083] In addition, to prevent such a safety accident, the autonomous vehicle 100 may gradually switch the low beam to the high beam, but when the pedestrian 200 is clearly recognized, may control a light such that the light illuminates an upper body of the pedestrian 200, excluding a face thereof, under the control of the processor 110.
[0084] However, examples are not limited to the foregoing. For example, the autonomous vehicle 100 may gradually switch the low beam to the high beam but lower the brightness of the high beam to be less than the brightness of the low beam, under the control of the processor 110, thereby preventing a safety accident involving the pedestrian 200 or animals that may otherwise occur due to glare from the high beam.
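The gradual switch with a brightness cap described in paragraphs [0082]-[0084] can be modeled as a stepwise interpolation of headlight intensity. The following Python sketch is illustrative only: the function name, the linear ramp, and the normalized intensity units are assumptions, as the disclosure specifies neither a ramp profile nor concrete levels.

```python
def beam_intensity(step: int, steps: int, low_level: float, high_cap: float) -> float:
    """Linearly interpolate headlight intensity from the low-beam level
    toward a capped high-beam level over `steps` increments, so the
    switch is gradual rather than abrupt. `high_cap` may be set below
    `low_level` to limit glare, as the description suggests."""
    frac = min(max(step, 0), steps) / steps  # clamp progress to [0, 1]
    return low_level + (high_cap - low_level) * frac
```

For example, with `low_level=1.0` and `high_cap=0.8`, the intensity ramps down smoothly across the switch instead of jumping, limiting glare for the pedestrian.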
[0086] Referring to
[0087] The autonomous vehicle 100 may predict at least one of a movement or a direction of the object 200 while determining a braking distance between the autonomous vehicle 100 and the object 200 based on a result of setting a target, under the control of the processor 110.
[0088] For example, the autonomous vehicle 100 may predict or estimate a movement of the object 200, a direction of the object 200, a speed of the object 200, a braking distance between the object 200 and the autonomous vehicle 100, a relative speed between the object 200 and the autonomous vehicle 100, and/or the like, by analyzing the sensed object 200, under the control of the processor 110.
[0089] As shown in
[0090] The autonomous vehicle 100 may control the FCA system or control steering based on the re-determined braking distance, under the control of the processor 110.
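The re-determination of the braking distance from the object's predicted movement, described in paragraphs [0087]-[0090], can be sketched as a simple gap update. The constant-closing-speed model, units, and names below are illustrative assumptions; the disclosure states only that the braking distance is re-determined from the predicted movement and/or direction of the object.

```python
def redetermine_braking_distance(gap_m: float, closing_speed_mps: float, dt_s: float) -> float:
    """Re-estimate the distance available for braking after dt_s seconds,
    given the predicted closing speed toward the target (positive when
    the gap is shrinking). The result is floored at zero, since a gap
    cannot be negative."""
    return max(gap_m - closing_speed_mps * dt_s, 0.0)
```

The re-determined value could then feed the same FCA-versus-steering threshold decision described above, so that a pedestrian moving into the lane shortens the effective braking distance and can flip the vehicle from braking to steering avoidance.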
[0091] As shown in
[0092] For example, as shown in
[0093] In contrast, as shown in
[0094] In addition, as shown in
[0095] As described above, an autonomous vehicle (e.g., the autonomous vehicle 100) and a control method thereof may recognize, at an early point, a pedestrian (e.g., the pedestrian 200) present far ahead using a front radar and a front camera while the autonomous vehicle 100 is driving on a dark road at night with a low beam. The autonomous vehicle 100 and the control method thereof may then switch the low beam to a high beam so that the high beam illuminates the whole body of the pedestrian 200 and the front camera identifies the pedestrian 200. The autonomous vehicle 100 and the control method thereof may thus allow FCA warning and braking control to be performed at a suitable time, thereby improving the driving stability of the autonomous vehicle 100.
[0096] In addition, an autonomous vehicle (e.g., the autonomous vehicle 100) and control method thereof may use the front radar capable of sensing an object (e.g., the pedestrian 200) present ahead in the distance to allow the autonomous vehicle 100 driving on a dark road at night with the low beam to recognize the pedestrian 200 normally. The autonomous vehicle 100 and the control method thereof may also use the front camera to recognize a lower body of the pedestrian 200, thereby enhancing recognition accuracy.
[0097] In addition, an autonomous vehicle (e.g., the autonomous vehicle 100) and control method thereof may determine the pedestrian 200 present in the distance on a dark road using an enhanced method, switch a headlight from the low beam to the high beam such that the front camera quickly recognizes the pedestrian 200 present ahead, and control warning and braking of an FCA function at a normal timing.
[0098] In addition, an autonomous vehicle (e.g., the autonomous vehicle 100) and control method thereof may switch the control to steering control, rather than FCA braking control, when a collision is expected because the pedestrian 200 is recognized relatively late and the braking distance is not sufficient, thereby facilitating the avoidance of the pedestrian 200.
[0099] The embodiments of the present disclosure described herein may be implemented as computer-readable instructions on a computer-readable medium in which a program is recorded. The computer-readable medium may include various types of recording devices that store data to be read by a computer system. The computer-readable medium may include, for example, a hard disk drive (HDD), a solid-state drive (SSD), a silicon disk drive (SDD), a read-only memory (ROM), a random-access memory (RAM), a compact disc ROM (CD-ROM), a magnetic tape, a floppy disk, an optical data storage device, or the like.
[0100] The foregoing detailed description should not be construed as restrictive but as illustrative in all respects. The scope of the embodiments of the present disclosure should be determined by reasonable interpretation of the appended claims, and all changes and modifications within the equivalent scope of the present disclosure are included in the scope of the present disclosure.