COLLISION AVOIDANCE METHOD AND APPARATUS
20230182722 · 2023-06-15
CPC classification
B60W30/0956
PERFORMING OPERATIONS; TRANSPORTING
B60W2554/40
PERFORMING OPERATIONS; TRANSPORTING
B60W60/0015
PERFORMING OPERATIONS; TRANSPORTING
International classification
B60W30/09
PERFORMING OPERATIONS; TRANSPORTING
Abstract
A collision avoidance method and apparatus are provided. The collision avoidance method includes sensing a forward vehicle and a lane of a front road, receiving global positioning system (GPS) information and vehicle specification information from the forward vehicle, generating a virtual lane corresponding to the forward vehicle upon failing to detect the lane of the front road, and performing a control operation to avoid collision with the forward vehicle based on the generated virtual lane.
Claims
1. A collision avoidance method, comprising: sensing, by a sensor, a forward vehicle and a lane of a front road; receiving, by a communicator, global positioning system (GPS) information and vehicle specification information from the forward vehicle; upon failing to detect the lane of the front road, generating, by a processor, a virtual lane corresponding to the forward vehicle; and performing, by the processor, a control operation to avoid collision with the forward vehicle based on the generated virtual lane.
2. The collision avoidance method of claim 1, wherein the generating the virtual lane corresponding to the forward vehicle comprises: upon failing to detect the lane of the front road, generating, by the processor, a virtual vehicle corresponding to the forward vehicle based on the GPS information and the vehicle specification information; and generating, by the processor, the virtual lane based on the generated virtual vehicle.
3. The collision avoidance method of claim 2, wherein the generating the virtual lane based on the generated virtual vehicle comprises: generating, by the processor, the virtual lane based on a width of a lane in which the virtual vehicle is traveling and an entire width of the virtual vehicle.
4. The collision avoidance method of claim 2, further comprising: receiving, by the processor, the GPS information and the vehicle specification information from each of a plurality of forward vehicles based on presence of the plurality of forward vehicles; generating, by the processor, a plurality of virtual vehicles corresponding to the plurality of forward vehicles, respectively; and generating, by the processor, a plurality of virtual lanes corresponding to the plurality of generated virtual vehicles, respectively.
5. The collision avoidance method of claim 4, further comprising determining, by the processor, whether the plurality of virtual lanes are straight lanes.
6. The collision avoidance method of claim 5, further comprising generating, by the processor, virtual lanes of an entire road by fusing the plurality of virtual lanes, based on the plurality of virtual lanes being the straight lanes.
7. The collision avoidance method of claim 5, further comprising: disregarding, by the processor, non-straight virtual lanes when some of the plurality of virtual lanes are not straight lanes; and generating, by the processor, virtual lanes of an entire road by fusing a plurality of virtual lanes except for the disregarded virtual lanes.
8. The collision avoidance method of claim 2, further comprising: receiving, by the processor, curvature information of the front road; and generating, by the processor, the virtual lane in correspondence to the curvature information.
9. The collision avoidance method of claim 1, further comprising: generating, by the processor, a hologram based on the generated virtual lane; and outputting, by an output unit, the generated hologram to a front and a rear of a vehicle.
10. A recording medium storing a collision avoidance program that causes a computer to sense a forward vehicle and a lane of a front road, receive global positioning system (GPS) information and vehicle specification information from the forward vehicle, upon failing to detect the lane of the front road, generate a virtual lane corresponding to the forward vehicle, and perform a control operation to avoid collision with the forward vehicle based on the generated virtual lane.
11. A collision avoidance apparatus, comprising: a sensor configured to sense a forward vehicle and a lane of a front road; a communicator configured to receive global positioning system (GPS) information and vehicle specification information from the forward vehicle; a navigation system configured to provide map information of the front road; and a processor configured to generate a virtual lane corresponding to the forward vehicle, upon failing to detect the lane of the front road, and perform a control operation to avoid collision with the forward vehicle based on the generated virtual lane.
12. The collision avoidance apparatus of claim 11, wherein the processor generates a virtual vehicle corresponding to the forward vehicle based on the GPS information and the vehicle specification information upon failing to detect the lane of the front road, and generates the virtual lane based on the generated virtual vehicle.
13. The collision avoidance apparatus of claim 12, wherein the processor generates the virtual lane based on a width of a lane in which the virtual vehicle is traveling and an entire width of the virtual vehicle.
14. The collision avoidance apparatus of claim 12, wherein the communicator receives the GPS information and the vehicle specification information from each of a plurality of forward vehicles based on presence of the plurality of forward vehicles, wherein the processor generates a plurality of virtual vehicles corresponding to the plurality of forward vehicles, respectively, and generates a plurality of virtual lanes corresponding to the plurality of generated virtual vehicles, respectively.
15. The collision avoidance apparatus of claim 14, wherein the processor determines whether the plurality of virtual lanes are straight lanes.
16. The collision avoidance apparatus of claim 15, wherein the processor generates virtual lanes of an entire road by fusing the plurality of virtual lanes, based on the plurality of virtual lanes being the straight lanes.
17. The collision avoidance apparatus of claim 15, wherein the processor disregards non-straight virtual lanes when some of the plurality of virtual lanes are not the straight lanes, and generates virtual lanes of an entire road by fusing a plurality of virtual lanes except for the disregarded virtual lanes.
18. The collision avoidance apparatus of claim 12, wherein the processor receives curvature information of the front road from the navigation system, and generates the virtual lane in correspondence to the curvature information.
19. The collision avoidance apparatus of claim 11, wherein the processor generates a hologram based on the generated virtual lane, and performs the control operation to output the generated hologram to a front and a rear of a vehicle.
20. An autonomous driving vehicle, comprising: at least one sensor configured to sense a forward vehicle and a lane of a front road; and a collision avoidance apparatus configured to generate a virtual lane corresponding to the forward vehicle, upon failing to detect the lane of the front road, and perform a control operation to avoid collision with the forward vehicle based on the generated virtual lane.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0031] The accompanying drawings, which are included to provide a further understanding of the disclosure and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the disclosure and together with the description serve to explain the principle of the disclosure.
DETAILED DESCRIPTION
[0042] Hereinafter, embodiments of the present disclosure will be described in detail with reference to the accompanying drawings so that the present disclosure may be easily realized by those skilled in the art. However, the present disclosure may be achieved in various different forms and is not limited to the embodiments described herein. In the drawings, parts that are not related to a description of the present disclosure are omitted to clearly explain the present disclosure and similar reference numbers will be used throughout this specification to refer to similar parts.
[0043] In the specification, when a part “includes” an element, it means that the part may further include another element rather than excluding another element unless otherwise mentioned.
[0045] First, a structure and function of an autonomous driving control system (e.g., an autonomous driving vehicle) to which an autonomous driving apparatus according to the present embodiments is applicable will be described with reference to the accompanying drawings.
[0047] The autonomous driving integrated controller 600 may obtain, through the driving information input interface 101, driving information based on manipulation of an occupant for a user input unit 100 in an autonomous driving mode or manual driving mode of a vehicle. The user input unit 100 may include, for example, a driving mode switch 110 and a control panel 120 described below.
[0048] For example, a driving mode (i.e., an autonomous driving mode/manual driving mode or a sports mode/eco mode/safety mode/normal mode) of the vehicle determined by manipulation of the occupant for the driving mode switch 110 may be transmitted to the autonomous driving integrated controller 600 through the driving information input interface 101 as the driving information.
[0049] Furthermore, navigation information, such as the destination of the occupant input through the control panel 120 and a path up to the destination (e.g., the shortest path or preference path, selected by the occupant, among candidate paths up to the destination), may be transmitted to the autonomous driving integrated controller 600 through the driving information input interface 101 as the driving information.
[0050] The control panel 120 may be implemented as a touchscreen panel that provides a user interface (UI) through which the occupant inputs or modifies information for autonomous driving control of the vehicle. In this case, the driving mode switch 110 may be implemented as touch buttons on the control panel 120.
[0051] In addition, the autonomous driving integrated controller 600 may obtain traveling information indicative of a driving state of the vehicle through the traveling information input interface 201. The traveling information may include a steering angle formed when the occupant manipulates a steering wheel, an accelerator pedal stroke or brake pedal stroke formed when the occupant depresses an accelerator pedal or brake pedal, and various types of information indicative of driving states and behaviors of the vehicle, such as a vehicle speed, acceleration, a yaw, a pitch, and a roll formed in the vehicle. The traveling information may be detected by a traveling information detection unit 200, including a steering angle sensor 210, an accelerator position sensor (APS)/pedal travel sensor (PTS) 220, a vehicle speed sensor 230, an acceleration sensor 240, and a yaw/pitch/roll sensor 250.
[0052] Furthermore, the traveling information of the vehicle may include location information of the vehicle. The location information of the vehicle may be obtained through a global positioning system (GPS) receiver 260 applied to the vehicle. Such traveling information may be transmitted to the autonomous driving integrated controller 600 through the traveling information input interface 201 and may be used to control the driving of the vehicle in the autonomous driving mode or manual driving mode of the vehicle.
[0053] The autonomous driving integrated controller 600 may transmit driving state information provided to the occupant to an output unit 300 through the occupant output interface 301 in the autonomous driving mode or manual driving mode of the vehicle. That is, the autonomous driving integrated controller 600 transmits the driving state information of the vehicle to the output unit 300 so that the occupant may check the autonomous driving state or manual driving state of the vehicle based on the driving state information output through the output unit 300. The driving state information may include various types of information indicative of driving states of the vehicle, such as a current driving mode, transmission range, and speed of the vehicle.
[0054] If it is determined that it is necessary to warn a driver in the autonomous driving mode or manual driving mode of the vehicle along with the above driving state information, the autonomous driving integrated controller 600 transmits warning information to the output unit 300 through the occupant output interface 301 so that the output unit 300 may output a warning to the driver. In order to output such driving state information and warning information acoustically and visually, the output unit 300 may include a speaker 310 and a display 320.
[0055] Furthermore, the autonomous driving integrated controller 600 may transmit control information for driving control of the vehicle to a lower control system 400, applied to the vehicle, through the vehicle control output interface 401 in the autonomous driving mode or manual driving mode of the vehicle.
[0056] As described above, the autonomous driving integrated controller 600 according to the present embodiment may obtain the driving information based on manipulation of the driver and the traveling information indicative of the driving state of the vehicle through the driving information input interface 101 and the traveling information input interface 201, respectively, and transmit the driving state information and the warning information, generated based on an autonomous driving algorithm, to the output unit 300 through the occupant output interface 301. In addition, the autonomous driving integrated controller 600 may transmit the control information generated based on the autonomous driving algorithm to the lower control system 400 through the vehicle control output interface 401 so that driving control of the vehicle is performed.
[0057] In order to guarantee stable autonomous driving of the vehicle, it is necessary to continuously monitor the driving state of the vehicle by accurately measuring a driving environment of the vehicle and to control driving based on the measured driving environment. To this end, the vehicle may include a sensor unit 500 for detecting a nearby object.
[0058] The sensor unit 500 may include one or more of a LiDAR sensor 510, a radar sensor 520, or a camera sensor 530 in order to detect a nearby object outside the vehicle.
[0059] The LiDAR sensor 510 may transmit a laser signal to the periphery of the vehicle and detect a nearby object outside the vehicle by receiving a signal reflected and returning from a corresponding object. The LiDAR sensor 510 may detect a nearby object located within the ranges of a preset distance, a preset vertical field of view, and a preset horizontal field of view, which are predefined depending on specifications thereof. The LiDAR sensor 510 may include a front LiDAR sensor 511, a top LiDAR sensor 512, and a rear LiDAR sensor 513 installed at the front, top, and rear of the vehicle, respectively, but the installation location of each LiDAR sensor and the number of LiDAR sensors installed are not limited to a specific embodiment. A threshold for determining the validity of a laser signal reflected and returning from a corresponding object may be previously stored in a memory (not illustrated) of the autonomous driving integrated controller 600. The autonomous driving integrated controller 600 may determine a location (including a distance to a corresponding object), speed, and moving direction of the corresponding object by measuring the time taken for a laser signal, transmitted through the LiDAR sensor 510, to be reflected and return from the corresponding object.
[0060] The radar sensor 520 may radiate electromagnetic waves around the vehicle and detect a nearby object outside the vehicle by receiving a signal reflected and returning from a corresponding object. The radar sensor 520 may detect a nearby object within the ranges of a preset distance, a preset vertical field of view, and a preset horizontal field of view, which are predefined depending on specifications thereof. The radar sensor 520 may include a front radar sensor 521, a left radar sensor 522, a right radar sensor 523, and a rear radar sensor 524 installed at the front, left, right, and rear of the vehicle, respectively, but the installation location of each radar sensor and the number of radar sensors installed are not limited to a specific embodiment. The autonomous driving integrated controller 600 may determine a location (including a distance to a corresponding object), speed, and moving direction of the corresponding object using a method of analyzing power of electromagnetic waves transmitted and received through the radar sensor 520.
[0061] The camera sensor 530 may detect a nearby object outside the vehicle by photographing the periphery of the vehicle and detect a nearby object within the ranges of a preset distance, a preset vertical field of view, and a preset horizontal field of view, which are predefined depending on specifications thereof.
[0062] The camera sensor 530 may include a front camera sensor 531, a left camera sensor 532, a right camera sensor 533, and a rear camera sensor 534 installed at the front, left, right, and rear of the vehicle, respectively, but the installation location of each camera sensor and the number of camera sensors installed are not limited to a specific embodiment. The autonomous driving integrated controller 600 may determine a location (including a distance to a corresponding object), speed, and moving direction of the corresponding object by applying predefined image processing to an image captured by the camera sensor 530.
[0063] In addition, an internal camera sensor 535 for capturing the inside of the vehicle may be mounted at a predetermined location (e.g., rear view mirror) within the vehicle. The autonomous driving integrated controller 600 may monitor a behavior and state of the occupant based on an image captured by the internal camera sensor 535 and output guidance or a warning to the occupant through the output unit 300.
[0066] Furthermore, in order to determine a state of the occupant within the vehicle, the sensor unit 500 may further include a bio sensor for detecting bio signals (e.g., heart rate, electrocardiogram, respiration, blood pressure, body temperature, electroencephalogram, photoplethysmography (or pulse wave), and blood sugar) of the occupant. The bio sensor may include a heart rate sensor, an electrocardiogram sensor, a respiration sensor, a blood pressure sensor, a body temperature sensor, an electroencephalogram sensor, a photoplethysmography sensor, and a blood sugar sensor.
[0067] Finally, the sensor unit 500 additionally includes a microphone 550 having an internal microphone 551 and an external microphone 552 used for different purposes.
[0068] The internal microphone 551 may be used, for example, to analyze the voice of the occupant in the autonomous driving vehicle 1000 based on AI or to immediately respond to a direct voice command of the occupant.
[0069] In contrast, the external microphone 552 may be used, for example, to support safe driving by analyzing, with analysis tools such as deep learning, various sounds generated outside the autonomous driving vehicle 1000.
[0072] Referring to the drawings, the collision avoidance apparatus according to an embodiment of the present disclosure may include a sensor unit 2100, a communicator 2200, a navigation system 2300, a processor 2400, and an output unit 2500.
[0073] The sensor unit 2100 may include a camera that captures the front of an autonomous driving vehicle 1000.
[0074] The sensor unit 2100 may detect a road, a lane, vehicles, etc., located in front thereof from an image obtained by capturing the front of the autonomous driving vehicle 1000.
[0075] The sensor unit 2100 may provide detection information of vehicles located within a predetermined distance in front of the autonomous driving vehicle 1000 to the processor 2400.
[0076] The communicator 2200 may communicate with an external vehicle for collision avoidance control of the autonomous driving vehicle 1000 according to the present disclosure. For example, the communicator 2200 may receive GPS information and vehicle specification information from an external vehicle. The communicator 2200 may transmit and receive data with the external vehicle through vehicle-to-vehicle (V2V) communication.
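The disclosure does not specify the V2V payload format. The following Python sketch illustrates one plausible message structure carrying the GPS information and vehicle specification information; every field name is hypothetical:

```python
from dataclasses import dataclass

@dataclass
class V2VMessage:
    """Hypothetical V2V payload; field names are illustrative only."""
    sender_id: str           # identifier of the forward vehicle
    latitude: float          # GPS position of the forward vehicle (degrees)
    longitude: float
    overall_width_m: float   # vehicle specification: entire width (m)
    overall_length_m: float  # vehicle specification: entire length (m)

def on_v2v_received(raw: dict) -> V2VMessage:
    # Parse a raw payload into the structure consumed by the processor.
    return V2VMessage(
        sender_id=raw["id"],
        latitude=raw["lat"],
        longitude=raw["lon"],
        overall_width_m=raw["width"],
        overall_length_m=raw["length"],
    )
```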
[0077] The navigation system 2300 may provide navigation information. The navigation information may include at least one of information about a set destination, route information based on the destination, map information related to a driving route, and information about a current location of a vehicle. The navigation system 2300 may provide information such as a curvature of a road, the number of lanes of the road, and the size of a lane of the road to the processor 2400 as the map information related to a driving route.
[0078] The processor 2400 may detect a forward vehicle that travels on a front road of the autonomous driving vehicle 1000, and a lane of the front road, based on data detected by the camera of the sensor unit 2100.
[0079] The processor 2400 may receive GPS information and vehicle specification information of the forward vehicle from the communicator 2200.
[0080] The processor 2400 may receive map information of the front road from the navigation system 2300.
[0081] Upon failing to detect the lane of the front road, the processor 2400 may generate a virtual vehicle corresponding to the forward vehicle based on the GPS information and the vehicle specification information.
[0082] The processor 2400 may generate a virtual lane based on the generated virtual vehicle. Specifically, the processor 2400 may generate the virtual lane based on the width of the lane in which the virtual vehicle is traveling and the overall width of the virtual vehicle.
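As an illustration only, the following sketch shows one way the processor 2400 might place such a virtual vehicle in a local coordinate frame, reusing the hypothetical V2VMessage above; the equirectangular approximation and all names are assumptions, not part of the disclosure:

```python
import math
from dataclasses import dataclass

EARTH_RADIUS_M = 6_371_000.0

@dataclass
class VirtualVehicle:
    x: float       # east offset from the ego vehicle (m)
    y: float       # north offset from the ego vehicle (m)
    width: float   # entire width of the forward vehicle (m)
    length: float  # entire length of the forward vehicle (m)

def make_virtual_vehicle(ego_lat: float, ego_lon: float,
                         msg: "V2VMessage") -> VirtualVehicle:
    # Equirectangular approximation: adequate over the short distances
    # separating the ego vehicle from a forward vehicle.
    dlat = math.radians(msg.latitude - ego_lat)
    dlon = math.radians(msg.longitude - ego_lon)
    x = EARTH_RADIUS_M * dlon * math.cos(math.radians(ego_lat))
    y = EARTH_RADIUS_M * dlat
    return VirtualVehicle(x, y, msg.overall_width_m, msg.overall_length_m)
```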
[0083] In addition, when there is a plurality of forward vehicles, the processor 2400 may receive the GPS information and the vehicle specification information from each of the forward vehicles, thereby generating a plurality of virtual vehicles corresponding respectively to the plurality of forward vehicles. Furthermore, the processor 2400 may generate a plurality of virtual lanes based on the generated plurality of virtual vehicles.
[0084] The processor 2400 may determine whether the generated virtual lanes are straight lanes.
[0085] When the virtual lanes are straight lanes, the processor 2400 may generate virtual lanes of the entire road by fusing the virtual lanes.
[0086] Meanwhile, when some of the virtual lanes are not straight lanes, the processor 2400 may disregard the non-straight virtual lanes and fuse the remaining virtual lanes, thereby generating virtual lanes of the entire road.
[0087] Then, the processor 2400 may receive information about the curvature of the road from the navigation system 2300. The processor 2400 may generate a virtual lane corresponding to the curvature information.
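As an illustration, the sketch below samples a virtual lane boundary along a constant-curvature arc built from the curvature provided by the navigation system; the constant-curvature model, sampling parameters, and function name are assumptions for this example:

```python
import math

def curved_lane_boundary(curvature: float, lane_offset: float,
                         length_m: float = 50.0, step_m: float = 1.0) -> list:
    """Sample a virtual lane boundary along a constant-curvature arc.

    curvature is 1/R taken from the navigation map (0 for a straight road);
    lane_offset is the lateral distance of the boundary from the driven path.
    """
    points = []
    s = 0.0
    while s <= length_m:
        heading = curvature * s  # heading angle after arc length s
        if abs(curvature) > 1e-9:
            x = math.sin(heading) / curvature
            y = (1.0 - math.cos(heading)) / curvature
        else:
            x, y = s, 0.0
        # Offset along the left normal of the heading to reach the boundary.
        points.append((x - lane_offset * math.sin(heading),
                       y + lane_offset * math.cos(heading)))
        s += step_m
    return points
```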
[0088] The processor 2400 may perform control to avoid collision with a forward vehicle based on the generated virtual lane.
[0089] The processor 2400 may perform control to generate a hologram based on the generated virtual lane.
[0090] The output unit 2500 may output the hologram to the front and rear of the autonomous driving vehicle 1000 based on a control signal generated from the processor 2400.
[0094] In this case, the autonomous driving vehicle 1000 may fail to detect the lane of the front road.
[0095] Thereafter, the autonomous driving vehicle 1000 may generate a virtual vehicle 3000 corresponding to the forward vehicle based on the GPS information and the vehicle specification information received from the forward vehicle.
[0096] Thereafter, the autonomous driving vehicle 1000 may generate a virtual lane 4300 based on the generated virtual vehicle 3000 and lane width information from the navigation information.
[0097] That is, the autonomous driving vehicle 1000 may generate the virtual lane 4300 at a preset distance w1 to the left and right of the virtual vehicle 3000. According to an embodiment, the preset distance w1 may be half the difference between the width w2 of the lane, obtained from the navigation information, and the entire width w of the forward vehicle, i.e., w1 = (w2 − w)/2.
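In code, this offset reduces to a one-line computation. A minimal sketch, assuming units of meters and that the lateral position of the virtual vehicle's center is known (all names are illustrative):

```python
def virtual_lane_boundaries(center_y: float, vehicle_width: float,
                            lane_width: float) -> tuple:
    """Place the virtual lane edges a preset distance w1 to the left and
    right of the virtual vehicle, where w1 = (w2 - w) / 2."""
    w1 = (lane_width - vehicle_width) / 2.0
    half = vehicle_width / 2.0 + w1  # equals lane_width / 2
    return center_y - half, center_y + half
```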
[0098] Accordingly, the autonomous driving vehicle 1000 may perform a collision avoidance control operation through the generated virtual vehicle and virtual lane.
[0102] However, when collision avoidance is performed using a virtual lane generated from the GPS information alone, a collision may occur during a lane change of the forward vehicle because the entire width of the forward vehicle is not taken into account.
[0103] Accordingly, when performing the collision avoidance operation according to a lane change, the autonomous driving vehicle 1000 may prevent collision with the forward vehicle by considering both the virtual lane and the virtual vehicle.
[0106] However, when collision avoidance with respect to sudden braking of the forward vehicle is performed using the GPS information alone, a collision may occur because the entire length of the forward vehicle is not taken into account.
[0107] Accordingly, upon performing a collision avoidance operation according to braking of the forward vehicle, the autonomous driving vehicle 1000 may prevent collision with the forward vehicle by considering the entire length l of the virtual vehicle.
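A minimal sketch of the gap computation implied above, assuming the received GPS point marks the center of the forward vehicle; coordinates are longitudinal positions in meters and all names are illustrative:

```python
def longitudinal_gap(ego_front_x: float, target_center_x: float,
                     target_length: float) -> float:
    """Distance from the ego front bumper to the virtual vehicle's rear end.

    Assumes the received GPS point marks the center of the forward vehicle,
    so its rear end sits half the entire length l closer to the ego vehicle
    than the raw GPS point suggests.
    """
    rear_x = target_center_x - target_length / 2.0
    return rear_x - ego_front_x

# Braking control can then trigger when this gap falls below a safety
# margin, rather than when the raw GPS point does.
```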
[0109] First, the autonomous driving vehicle 1000 according to an embodiment of the present disclosure may acquire front image data from the front camera (S601).
[0110] Furthermore, the autonomous driving vehicle 1000 may detect a forward vehicle and a driving lane of the forward vehicle from the image data input through the front camera (S602). The detection covers a forward vehicle and a driving lane located within the camera detection range.
[0111] Upon failing to detect the forward vehicle and the driving lane of the forward vehicle, the autonomous driving vehicle 1000 may receive GPS information and vehicle specification information of the forward vehicle from the forward vehicle (S603). For example, the GPS information may indicate the location of the forward vehicle, and the vehicle specification information may include the entire width and entire length of the forward vehicle.
[0112] After step S603, the autonomous driving vehicle 1000 may generate a virtual vehicle corresponding to the forward vehicle based on the GPS information and the vehicle specification information of the forward vehicle (S604).
[0113] The autonomous driving vehicle 1000 may generate a virtual lane in which the virtual vehicle travels, based on the virtual vehicle generated in step S604 and the navigation information (S605).
[0114] The autonomous driving vehicle 1000 may perform collision avoidance control based on the virtual lane generated in step S605 and the virtual vehicle (S606). Accordingly, in a collision avoidance control situation of the autonomous driving vehicle 1000, a possibility of collision may be avoided through the virtual vehicle.
[0115] The technical idea of the present disclosure may be applied to the entire autonomous driving vehicle or only to some components within the autonomous driving vehicle. The scope of the present disclosure should be determined according to the matters described in the claims.
[0119] The accuracy of the autonomous driving vehicle is lowered in that a virtual lane generated from data of only one forward vehicle may not match the actual lane, and thus there is a risk that the autonomous driving vehicle 1000 travels in a lane different from the actual lane during the collision prevention operation.
[0120] Accordingly, the autonomous driving vehicle 1000 needs to perform a method of increasing the reliability of a virtual lane through a plurality of forward vehicles, as described below.
[0122] The autonomous driving vehicle 1000 may generate virtual lanes of the entire road by combining the generated virtual lanes. In this way, the reliability of the autonomous driving vehicle 1000 may be increased.
[0123] According to an embodiment, when three vehicles are driving in front of the autonomous driving vehicle 1000, which is traveling on a three-lane straight road, the autonomous driving vehicle 1000 may generate a first virtual lane corresponding to a forward vehicle traveling in the first lane, a second virtual lane corresponding to a forward vehicle traveling in the second lane, and a third virtual lane corresponding to a forward vehicle traveling in the third lane.
[0124] Thereafter, the autonomous driving vehicle 1000 may generate virtual lanes of the entire three-lane road by mapping the generated first to third virtual lanes onto the road on which the autonomous driving vehicle 1000 is currently traveling.
[0126] The autonomous driving vehicle 1000 may generate virtual lanes of the entire road based on its forward vehicles. In this case, when comparison of the generated virtual lanes reveals a difference in lane information, the autonomous driving vehicle 1000 may select, as driving lanes, the virtual lanes generated based on more of the vehicles, and travel in those lanes.
[0127] According to an embodiment, when three vehicles in front of the autonomous driving vehicle 1000 are each traveling in their respective lanes, the autonomous driving vehicle 1000 may generate a first virtual lane corresponding to the forward vehicle traveling in the first lane, a second virtual lane corresponding to the forward vehicle traveling in the second lane, and a third virtual lane corresponding to the forward vehicle traveling in the third lane.
[0128] In this case, the first virtual lane and the third virtual lane may be virtual lanes corresponding to straight lanes of the road, and the second virtual lane may be a virtual lane which does not correspond to a straight lane of the road.
[0129] When it is determined that the second virtual lane does not correspond to the straight lane, the autonomous driving vehicle 1000 may disregard data of the forward vehicle corresponding to the second virtual lane and generate virtual lanes of the entire road based on the first virtual lane and the third virtual lane.
[0130] Thereafter, the autonomous driving vehicle 1000 may autonomously travel by determining that the virtual lanes of the entire road implemented by the first virtual lane and the third virtual lane are actual lanes.
[0132] Referring to the drawings, the autonomous driving vehicle 1000 may first determine whether there is a plurality of forward vehicles (S801).
[0133] As a result of the determination, when there is a plurality of forward vehicles, the autonomous driving vehicle 1000 may generate virtual vehicles corresponding to the plurality of forward vehicles, respectively (S802).
[0134] The autonomous driving vehicle 1000 may generate a plurality of virtual lanes corresponding to a plurality of virtual vehicles (S803).
[0135] The autonomous driving vehicle 1000 may determine whether the plurality of virtual lanes are straight lanes (S804).
[0136] When some of the virtual lanes are not straight lanes in step S804, the autonomous driving vehicle 1000 may exclude the virtual lanes corresponding to non-straight lanes (S805).
[0137] The autonomous driving vehicle 1000 may generate virtual lanes of the entire road by fusing the plurality of virtual lanes (S806). Accordingly, when comparison of the generated virtual lanes reveals a difference in lane information, the autonomous driving vehicle 1000 may generate the virtual lanes of the entire road based on the virtual lanes supported by more of the virtual vehicles.
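A sketch of steps S801 to S806, assuming each virtual lane is represented as an N×2 array of sampled center-line points in a local road frame; the least-squares straightness test and the lateral-offset ordering are plausible implementation choices, not criteria mandated by the disclosure:

```python
import numpy as np

def is_straight(points: np.ndarray, tol_m: float = 0.2) -> bool:
    # S804: fit a line y = a*x + b to the sampled lane points and treat the
    # lane as straight when the worst-case residual stays below a tolerance.
    a, b = np.polyfit(points[:, 0], points[:, 1], deg=1)
    residuals = np.abs(points[:, 1] - (a * points[:, 0] + b))
    return float(residuals.max()) < tol_m

def fuse_virtual_lanes(virtual_lanes: list) -> list:
    # S805: exclude virtual lanes that fail the straightness test.
    straight = [lane for lane in virtual_lanes if is_straight(lane)]
    # S806: order the surviving lanes by mean lateral offset so they map
    # onto the first, second, ... lanes of the entire road.
    straight.sort(key=lambda lane: float(lane[:, 1].mean()))
    return straight
```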
[0139] The autonomous driving vehicle 1000 may generate a virtual lane by applying a curvature of a road on which the autonomous driving vehicle 1000 is currently traveling. In addition, the autonomous driving vehicle 1000 may increase the accuracy of the virtual lane by receiving the motion data of a forward vehicle.
[0141] The autonomous driving vehicle 1000 may generate a virtual lane in front thereof based on the location of a lane in which the autonomous driving vehicle 1000 is currently traveling.
[0147] The autonomous driving vehicle 1000 may output a virtual lane 4300, generated based on a forward vehicle, as a hologram 4500 to the front and rear thereof. In this case, the hologram 4500 may be output at the same position as the virtual lane 4300. In addition, the hologram 4500 may be output in the same form as the virtual lane 4300.
[0148] For example, the autonomous driving vehicle 1000 may visually provide the hologram 4500 according to the virtual lane 4300 to the driver thereof.
[0149] For example, the autonomous driving vehicle 1000 may visually provide lane information to a following vehicle through the hologram according to the virtual lane 4300. Accordingly, collision with a vehicle approaching from the rear of the autonomous driving vehicle may be avoided.
[0151] The autonomous driving vehicle 1000 may detect a driving lane of a forward vehicle located within a camera detection range from image data.
[0152] Upon failing to detect the driving lane of the road, the autonomous driving vehicle 1000 may receive information about the width of the entire road from the navigation system 2300.
[0153] The autonomous driving vehicle 1000 may generate a virtual lane based on the received information about the width of the entire road. To this end, the autonomous driving vehicle 1000 may generate a virtual lane 4300 corresponding to a center line by dividing the width of the entire road by two.
[0154] Thereafter, the autonomous driving vehicle 1000 should be capable of avoiding collision with the forward vehicle and should pose no problems for safe driving even with respect to an oncoming vehicle. For this purpose, the autonomous driving vehicle 1000 needs to consider all of a width W3 of the actual road, a width W4 of the virtual lane, and the entire width of the oncoming vehicle.
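A sketch of the center-line construction and clearance check described above, with widths in meters; the safety margin and the specific inequalities are assumptions for illustration:

```python
def center_line_offset(road_width_w3: float) -> float:
    # The virtual center line 4300 is placed at half the entire road width.
    return road_width_w3 / 2.0

def safe_against_oncoming(road_width_w3: float, virtual_lane_width_w4: float,
                          oncoming_width: float, margin_m: float = 0.3) -> bool:
    # The ego side must fit within its half of the road, and the opposite
    # half must still accommodate the oncoming vehicle plus a buffer.
    half = center_line_offset(road_width_w3)
    ego_side_ok = virtual_lane_width_w4 <= half
    oncoming_side_ok = half >= oncoming_width + margin_m
    return ego_side_ok and oncoming_side_ok
```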
[0155] Thereafter, the autonomous driving vehicle 1000 may output the generated virtual lane 4300 through a hologram 4500.
[0156] Therefore, even on a road on which lanes are not detected, the possibility of collision between the autonomous driving vehicle and an oncoming vehicle may be eliminated while the center line is secured so that nearby vehicles may pass.
[0157] As another aspect of the present disclosure, the above-described proposal or operation of the disclosure may be provided as code that may be implemented, carried out, or executed by a “computer” (a comprehensive concept including a system-on-chip (SoC) or a microprocessor), or as an application, a computer-readable storage medium, or a computer program product that stores or includes the code, and this also falls within the scope of the present disclosure.
[0158] As described above, the detailed description of the embodiments of the present disclosure has been given to enable those skilled in the art to implement and practice the disclosure. Although the disclosure has been described with reference to the embodiments, those skilled in the art will appreciate that various modifications and variations may be made in the present disclosure without departing from the spirit or scope of the disclosure and the appended claims. For example, those skilled in the art may use constructions disclosed in the above-described embodiments in combination with each other.
[0159] Accordingly, the present disclosure should not be limited to the specific embodiments described herein, but should be accorded the broadest scope consistent with the principles and features disclosed herein.
[0160] Various implementations of the apparatus, system, unit, controller, and processor described herein may include digital electronic circuits, integrated circuits, field programmable gate arrays (FPGAs), application specific integrated circuits (ASICs), computer hardware, firmware, software, and/or a combination thereof. These various implementations may include an implementation using one or more computer programs executable on a programmable system. The programmable system includes at least one programmable processor (which may be a special purpose processor or a general-purpose processor) coupled to receive and transmit data and instructions from and to a storage system, at least one input device, and at least one output device. Computer programs (also known as programs, software, software applications or codes) contain instructions for a programmable processor and are stored in a computer-readable recording medium.