SYSTEM FOR PREDICTING A LOCATION-BASED MANEUVER OF A REMOTE VEHICLE IN AN AUTONOMOUS VEHICLE
20230234612 · 2023-07-27
Inventors
CPC classification
B60W30/0956
PERFORMING OPERATIONS; TRANSPORTING
B60W2554/4045
PERFORMING OPERATIONS; TRANSPORTING
B60W60/00274
PERFORMING OPERATIONS; TRANSPORTING
B60W2554/4044
PERFORMING OPERATIONS; TRANSPORTING
B60W60/0011
PERFORMING OPERATIONS; TRANSPORTING
International classification
B60W60/00
PERFORMING OPERATIONS; TRANSPORTING
B60W30/09
PERFORMING OPERATIONS; TRANSPORTING
B60W30/095
PERFORMING OPERATIONS; TRANSPORTING
Abstract
A system for an autonomous vehicle that predicts a location-based maneuver of a remote vehicle located in a surrounding environment includes one or more vehicle sensors collecting sensory data indicative of one or more vehicles located in the surrounding environment. The system also includes one or more automated driving controllers in electronic communication with the one or more vehicle sensors. The one or more automated driving controllers execute instructions to compare a lane of travel of the remote vehicle with a current lane of travel of the autonomous vehicle. In response to determining the lane of travel of the remote vehicle is a different lane than the current lane of the autonomous vehicle, the one or more automated driving controllers predict the location-based maneuver of the remote vehicle based on aggregated vehicle metrics that are based on historical data collected at a specific geographical location where the remote vehicle is located.
Claims
1. A system for an autonomous vehicle that predicts a location-based maneuver of a remote vehicle located in a surrounding environment, the system comprising: one or more vehicle sensors collecting sensory data indicative of one or more vehicles located in the surrounding environment; and one or more automated driving controllers in electronic communication with the one or more vehicle sensors, wherein the one or more automated driving controllers execute instructions to: monitor the one or more vehicle sensors for the sensory data; identify the remote vehicle located in a specific geographical location relative to the autonomous vehicle based on the sensory data; determine a lateral distance and a longitudinal distance between the remote vehicle and the autonomous vehicle; compare the lateral distance and the longitudinal distance with respective threshold distance values based on the sensory data; in response to determining the lateral distance and the longitudinal distance are less than the respective threshold distance values, determine a lane of travel of the remote vehicle based on the sensory data; compare the lane of travel of the remote vehicle with a current lane of travel of the autonomous vehicle; in response to determining the lane of travel of the remote vehicle is a different lane than the current lane of the autonomous vehicle, predict the location-based maneuver of the remote vehicle based on aggregated vehicle metrics that are based on historical data collected at the specific geographical location; and determine an adaptive maneuver that the autonomous vehicle performs in response to predicting the location-based maneuver of the remote vehicle.
2. The system of claim 1, wherein the remote vehicle is located in front of the autonomous vehicle, and wherein the remote vehicle travels in the same direction as the autonomous vehicle.
3. The system of claim 2, wherein the location-based maneuver of the remote vehicle is a lane change.
4. The system of claim 2, wherein the one or more automated driving controllers execute instructions to: compare the lateral distance with a maximum threshold lateral distance value that is part of the aggregated vehicle metrics; determine the lateral distance is less than the maximum threshold lateral distance value; and in response to determining the lateral distance is less than the maximum threshold lateral distance value, determine a probability that the remote vehicle performs the lane change from a lane of travel into a current lane in which the autonomous vehicle is located, based on the aggregated vehicle metrics.
5. The system of claim 1, wherein the remote vehicle travels in an opposite direction from the autonomous vehicle, and wherein the autonomous vehicle and the remote vehicle are both located at a four-way intersection.
6. The system of claim 5, wherein the location-based maneuver is a turn at the four-way intersection.
7. The system of claim 5, wherein the one or more automated driving controllers execute instructions to: compare the lateral distance with a maximum threshold lateral distance value that is part of the aggregated vehicle metrics; determine the lateral distance is less than the maximum threshold lateral distance value; and in response to determining the lateral distance is less than the maximum threshold lateral distance value, determine a probability that the remote vehicle performs a turn at the four-way intersection based on the aggregated vehicle metrics.
8. The system of claim 1, wherein the adaptive maneuver is either decelerating the autonomous vehicle or having the autonomous vehicle come to a stop.
9. The system of claim 1, wherein the historical data is collected over a period of time and represents overall vehicle behavior in the specific geographical location.
10. The system of claim 1, wherein the historical data accounts for changes in overall vehicle behavior based on a time of day, a day of the week, and zoning rules.
11. The system of claim 1, wherein the historical data may include discrete profiles for a unique geographical location based on different times of the day or day of the week.
12. The system of claim 1, wherein the adaptive maneuver includes a deceleration or stop, increasing a longitudinal distance between the autonomous vehicle and the remote vehicle, merging left or right, or changing lanes.
13. A method for predicting a location-based maneuver of a remote vehicle located in a surrounding environment, the method comprising: monitoring, by one or more controllers, one or more vehicle sensors for sensory data, wherein the one or more vehicle sensors are part of an autonomous vehicle and collect sensory data indicative of one or more vehicles located in the surrounding environment; identifying, by the one or more controllers, the remote vehicle located in a specific geographical location relative to the autonomous vehicle based on the sensory data; determining a lateral distance and a longitudinal distance between the remote vehicle and the autonomous vehicle; comparing the lateral distance and the longitudinal distance with respective threshold distance values based on the sensory data; in response to determining the lateral distance and the longitudinal distance are less than the respective threshold distance values, determining a lane of travel of the remote vehicle based on the sensory data; comparing the lane of travel of the remote vehicle with a current lane of travel of the autonomous vehicle; in response to determining the lane of travel of the remote vehicle is a different lane than the current lane of the autonomous vehicle, predicting the location-based maneuver of the remote vehicle based on aggregated vehicle metrics that are based on historical data collected at the specific geographical location relative to the autonomous vehicle; and determining an adaptive maneuver that the autonomous vehicle performs in response to predicting the location-based maneuver of the remote vehicle.
14. A system for an autonomous vehicle that predicts a change in vehicle speed of a remote vehicle located in a surrounding environment, the system comprising: one or more vehicle sensors collecting sensory data indicative of one or more vehicles located in the surrounding environment; and one or more automated driving controllers in electronic communication with the one or more vehicle sensors, wherein the one or more automated driving controllers execute instructions to: monitor the one or more vehicle sensors for the sensory data; identify the remote vehicle located in a specific geographical location relative to the autonomous vehicle based on the sensory data; determine a lateral distance and a longitudinal distance between the remote vehicle and the autonomous vehicle; compare the lateral distance and the longitudinal distance with respective threshold distance values based on the sensory data; in response to determining the lateral distance and the longitudinal distance are less than the respective threshold distance values, determine a lane of travel of the remote vehicle based on the sensory data; compare the lane of travel of the remote vehicle with a current lane of travel of the autonomous vehicle; in response to determining the lane of travel of the remote vehicle is a different lane than the current lane of the autonomous vehicle, predict the change in vehicle speed of the remote vehicle based on aggregated vehicle metrics that are based on historical data collected at the specific geographical location relative to the autonomous vehicle; and determine an adaptive maneuver that the autonomous vehicle performs in response to predicting the change in vehicle speed of the remote vehicle.
15. The system of claim 14, wherein the change in vehicle speed is either a deceleration event or an acceleration event.
16. The system of claim 14, wherein the remote vehicle travels in the same direction as the autonomous vehicle.
17. The system of claim 14, wherein the historical data is collected over a period of time and represents overall vehicle behavior in the specific geographical location.
18. The system of claim 14, wherein the historical data accounts for changes in overall vehicle behavior based on a time of day, a day of the week, and zoning rules.
19. The system of claim 14, wherein the historical data may include discrete profiles for a unique geographical location based on different times of the day or day of the week.
20. The system of claim 14, wherein the adaptive maneuver includes a deceleration or stop, increasing a longitudinal distance between the autonomous vehicle and the remote vehicle, merging left or right, or changing lanes.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0025] The drawings described herein are for illustration purposes only and are not intended to limit the scope of the present disclosure in any way.
[0026]
[0027]
[0028]
[0029]
[0030]
[0031]
[0032]
DETAILED DESCRIPTION
[0033] The following description is merely exemplary in nature and is not intended to limit the present disclosure, application, or uses.
[0034] Referring to
[0035] As explained below, the location-based maneuver of the remote vehicle 14 that is predicted by the system 12 is either a lane change performed by a remote vehicle 14 located in a position in front of the autonomous vehicle 10, where the autonomous vehicle 10 and the remote vehicle 14 travel in the same direction (seen in
[0036] As explained below, the system 12 predicts the location-based maneuver of the remote vehicle 14 based on aggregated vehicle metrics that are based on historical data collected at a specific geographical location where the remote vehicle 14 is presently located. The aggregated vehicle metrics are stored in memory of the one or more automated driving controllers 20 or, in the alternative, by one or more databases 40 that are part of one or more centralized computers 42 located at the back-end office 36. The historical data that the aggregated vehicle metrics are based on is collected over a period of time and is representative of overall vehicle behavior in the specific geographical location. The overall vehicle behavior includes information such as vehicle speed, whether the vehicle accelerated or decelerated, and any possible maneuvers that were performed. In an embodiment, the aggregated vehicle metrics include the probability that the remote vehicle 14 will perform a specific maneuver at the specific geographical location. For example, the aggregated vehicle metrics may indicate eighty percent probability that a vehicle may continue straight at a specific intersection, a five percent probability the vehicle turns right, and a fifteen percent probability that the vehicle turns left.
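The aggregated vehicle metrics described above can be sketched as a simple lookup of historically observed maneuver probabilities keyed by location. A minimal illustration follows; the location identifier, maneuver labels, and probability values are assumptions for illustration only, not part of the disclosure:

```python
# Illustrative sketch of aggregated vehicle metrics: for each geographical
# location, the probability of each maneuver observed in historical data.
# All identifiers and values below are hypothetical.
AGGREGATED_METRICS = {
    "intersection_42": {"straight": 0.80, "right_turn": 0.05, "left_turn": 0.15},
}

def most_likely_maneuver(location_id):
    """Return the maneuver with the highest historical probability at the
    given location, or None when no metrics exist for that location."""
    metrics = AGGREGATED_METRICS.get(location_id)
    if metrics is None:
        return None
    return max(metrics, key=metrics.get)
```

For the example probabilities above, `most_likely_maneuver("intersection_42")` would return `"straight"`, matching the eighty percent case in the text.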
[0037] The historical data accounts for changes in the overall vehicle behavior based on a time of day, a day of the week, and zoning rules. Some examples of zoning rules include, but are not limited to, areas of reduced speed during specific hours of the day, such as school zones, and signage forbidding vehicles from performing specific maneuvers, such as turning during a red light. In one embodiment, the historical data may include discrete profiles for a unique geographical location based on different times of the day or days of the week. For example, a first profile may be used during a morning rush hour time during the weekday, a second profile for an evening rush hour time during the weekday, and a third profile for weekends with respect to a unique geographical location. For example, if the specific geographical location is in a school zone, then the probability that a remote vehicle 14 may turn left or right at an intersection in the school zone may be significantly greater during the morning rush hour on a weekday, as parents drop their children off at school, when compared to other times of the day or on weekends.
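The discrete-profile selection described above can be sketched as a function that maps a timestamp to a profile label. The profile names and the hour ranges for rush hour below are illustrative assumptions, not values from the disclosure:

```python
from datetime import datetime

def select_profile(when):
    """Pick a historical-behavior profile for a unique geographical location
    based on day of week and hour of day (thresholds are hypothetical)."""
    if when.weekday() >= 5:               # Saturday (5) or Sunday (6)
        return "weekend"
    if 6 <= when.hour < 10:               # assumed morning rush window
        return "weekday_morning_rush"
    if 15 <= when.hour < 19:              # assumed evening rush window
        return "weekday_evening_rush"
    return "weekday_offpeak"
```

A system along these lines would then look up maneuver probabilities within the selected profile rather than in a single aggregate for the location.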
[0038] Referring to
[0039] Referring to
[0040]
[0041] In block 204, the one or more automated driving controllers 20 identify the remote vehicle 14 located in the specific geographical location relative to the autonomous vehicle 10 based on the sensory data. In addition to the specific geographical location, the one or more automated driving controllers 20 also determine a direction of travel of the remote vehicle 14 relative to the autonomous vehicle 10. In the example as shown in
[0042] In block 206, the one or more automated driving controllers 20 determine the lateral distance d.sub.lat and the longitudinal distance d.sub.long between the remote vehicle 14 and the autonomous vehicle 10 based on the sensory data. The method 200 may then proceed to block 208.
[0043] In block 208, the one or more automated driving controllers 20 compare the lateral distance d.sub.lat and the longitudinal distance d.sub.long with respective threshold distance values. That is, the lateral distance d.sub.lat is compared with a lateral threshold distance value and the longitudinal distance d.sub.long is compared with a longitudinal threshold distance value.
[0044] The lateral threshold distance value and the longitudinal threshold distance value are part of the aggregated vehicle metrics that are stored in memory of the one or more automated driving controllers 20 or, in the alternative, by the one or more databases 40. When the lateral distance d.sub.lat is less than the lateral threshold distance value and the longitudinal distance d.sub.long is less than the longitudinal threshold distance value, the one or more automated driving controllers 20 determine a potential change in motion of the autonomous vehicle 10. The potential change in motion occurs when the remote vehicle 14 performs the location-based maneuver. For example, in the embodiment as shown in FIG. 2A, the potential change in motion is when the remote vehicle 14 changes lanes from the center lane C to the right lane R. In addition to the lateral distance d.sub.lat and the longitudinal distance d.sub.long, in an embodiment the potential change is also determined based on factors such as, for example, road shape and speed limit.
[0045] In response to determining the lateral distance d.sub.lat is less than the lateral threshold distance value and the longitudinal distance d.sub.long is less than the longitudinal threshold distance value, the method 200 may proceed to block 210. Otherwise, the method 200 terminates.
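The threshold comparison of blocks 206 and 208 amounts to a simple gate on the two distances. A minimal sketch, with hypothetical parameter names:

```python
def within_prediction_range(d_lat, d_long, lat_threshold, long_threshold):
    """Gate corresponding to blocks 206-208: maneuver prediction proceeds
    only when both the lateral and longitudinal distances are below their
    respective threshold values; otherwise the method terminates."""
    return d_lat < lat_threshold and d_long < long_threshold
```

In practice the two threshold values would be read from the aggregated vehicle metrics for the specific geographical location, as described above.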
[0046] In block 210, in response to determining the lateral distance d.sub.lat and the longitudinal distance d.sub.long are less than the respective threshold distance values, the one or more automated driving controllers 20 determine a lane of travel of the remote vehicle 14 based on the sensory data. In the example as shown in
[0047] In block 212, the one or more automated driving controllers 20 compare the lane of travel of the remote vehicle 14 with a current lane of the autonomous vehicle 10. In response to the one or more automated driving controllers 20 determining both the autonomous vehicle 10 and the remote vehicle 14 are traveling in the same lane, the method 200 may terminate. However, in response to determining the lane of travel of the remote vehicle 14 is different than the current lane of the autonomous vehicle 10, the method 200 may then proceed to block 214.
[0048] In block 214, in response to determining the lane of travel of the remote vehicle 14 is different than the current lane of the autonomous vehicle 10, the one or more controllers 20 may predict the location-based maneuver of the remote vehicle 14 based on the aggregated vehicle metrics that are based on historical data collected at the specific geographical location relative to the autonomous vehicle 10.
[0049] In the example as shown in
[0050] In block 216, the one or more automated driving controllers 20 determine the adaptive maneuver that the autonomous vehicle 10 performs in response to predicting the location-based maneuver of the remote vehicle 14. That is, in the example as shown in
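The selection of an adaptive maneuver in block 216 can be sketched as a mapping from the predicted remote-vehicle maneuver (and its probability) to one of the adaptive maneuvers listed in the claims. The maneuver labels, the probability threshold, and the fallback value below are illustrative assumptions:

```python
def select_adaptive_maneuver(predicted_maneuver, probability, threshold=0.5):
    """Hypothetical mapping from a predicted remote-vehicle maneuver to an
    adaptive maneuver for the autonomous vehicle. The disclosure lists
    deceleration or stop, increasing longitudinal distance, merging, and
    lane changes as possible adaptive maneuvers; the rules here are a sketch."""
    if predicted_maneuver == "lane_change" and probability >= threshold:
        return "increase_longitudinal_distance"
    if predicted_maneuver == "turn" and probability >= threshold:
        return "decelerate"
    return "maintain_course"  # hypothetical default when no maneuver is likely
```

A production system would base this decision on the full set of aggregated vehicle metrics rather than a single probability value.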
[0051]
[0052] Referring now to
[0053] In block 304, the one or more automated driving controllers 20 identify the remote vehicle 14 located in the specific geographical location relative to the autonomous vehicle 10 based on the sensory data. In the embodiment as shown in
[0054] In block 306, the one or more automated driving controllers 20 determine the lateral distance d.sub.lat and the longitudinal distance d.sub.long between the remote vehicle 14 and the autonomous vehicle 10. The method 300 may then proceed to block 308.
[0055] In block 308, the one or more automated driving controllers 20 compare the lateral distance d.sub.lat and the longitudinal distance d.sub.long with respective threshold distance values. That is, the lateral distance d.sub.lat is compared with the lateral threshold distance value and the longitudinal distance d.sub.long is compared with the longitudinal threshold distance value. In response to determining the lateral distance d.sub.lat is less than the lateral threshold distance value and the longitudinal distance d.sub.long is less than the longitudinal threshold distance value, the method 300 may proceed to block 310. Otherwise, the method 300 terminates.
[0056] In block 310, in response to determining the lateral distance d.sub.lat and the longitudinal distance d.sub.long are less than the respective threshold distance values, the one or more automated driving controllers 20 determine a lane of travel of the remote vehicle 14 based on the sensory data. In the example as shown in
[0057] In block 312, the one or more automated driving controllers 20 compare the lane of travel of the remote vehicle 14 with a current lane of the autonomous vehicle 10. In response to the one or more automated driving controllers 20 determining both the autonomous vehicle 10 and the remote vehicle 14 are traveling in the same lane, the method 300 may terminate. However, in response to determining the lane of travel of the remote vehicle 14 is different than the current lane of the autonomous vehicle 10, the method 300 may then proceed to block 314.
[0058] In block 314, in response to determining the lane of travel of the remote vehicle 14 is different than the current lane of the autonomous vehicle 10, the one or more automated driving controllers 20 predict the location-based maneuver of the remote vehicle 14 based on the aggregated vehicle metrics.
[0059] In the example as shown in
[0060] In block 316, the one or more automated driving controllers 20 determine the adaptive maneuver that the autonomous vehicle 10 performs in response to predicting the location-based maneuver of the remote vehicle 14. That is, in the example as shown in
[0061]
[0062] Referring now to
[0063] In block 404, the one or more automated driving controllers 20 identify the remote vehicle 14 located in the specific geographical location relative to the autonomous vehicle 10 based on the sensory data. In addition to the specific geographical location, the one or more automated driving controllers 20 also determine a direction of travel of the remote vehicle 14 relative to the autonomous vehicle 10. In the example as shown in
[0064] In block 406, the one or more automated driving controllers 20 determine the lateral distance d.sub.lat and the longitudinal distance d.sub.long between the remote vehicle 14 and the autonomous vehicle 10. The method 400 may then proceed to block 408.
[0065] In block 408, the one or more automated driving controllers 20 compare the lateral distance d.sub.lat and the longitudinal distance d.sub.long with respective threshold distance values. That is, the lateral distance d.sub.lat is compared with the lateral threshold distance value and the longitudinal distance d.sub.long is compared with the longitudinal threshold distance value. In response to determining the lateral distance d.sub.lat is less than the lateral threshold distance value and the longitudinal distance d.sub.long is less than the longitudinal threshold distance value, the method 400 may proceed to block 410. Otherwise, the method 400 terminates.
[0066] In block 410, in response to determining the lateral distance d.sub.lat and the longitudinal distance d.sub.long are less than the respective threshold distance values, the one or more automated driving controllers 20 determine a lane of travel of the remote vehicle 14 based on the sensory data. In the example as shown in
[0067] In block 412, the one or more automated driving controllers 20 compare the lane of travel of the remote vehicle 14 with a current lane of the autonomous vehicle 10. In response to the one or more automated driving controllers 20 determining the autonomous vehicle 10 and the remote vehicle 14 are traveling in different lanes, the method 400 may terminate. However, in response to determining the lane of travel of the remote vehicle 14 is the same as the current lane of the autonomous vehicle 10, the method 400 may then proceed to block 414.
[0068] In block 414, in response to determining the lane of travel of the remote vehicle 14 is the same as the current lane of the autonomous vehicle 10, the one or more automated driving controllers 20 predict the change in vehicle speed of the remote vehicle 14 based on the aggregated vehicle metrics that are based on historical data collected at the specific geographical location relative to the autonomous vehicle 10. In the example as shown in
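The speed-change prediction of block 414 can be sketched as choosing whichever speed-change event, deceleration or acceleration, was more frequent in the historical data for the location. The metric names and counts below are hypothetical:

```python
def predict_speed_change(location_metrics):
    """Return "deceleration" or "acceleration", whichever event occurred
    more often in the historical data for this geographical location.
    The dictionary keys are illustrative, not from the disclosure."""
    decel = location_metrics.get("deceleration_events", 0)
    accel = location_metrics.get("acceleration_events", 0)
    return "deceleration" if decel >= accel else "acceleration"
```

The predicted event would then feed block 416, where the adaptive maneuver (for example, decelerating or increasing the longitudinal distance) is determined.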
[0069] In block 416, the one or more automated driving controllers 20 determine the adaptive maneuver that the autonomous vehicle 10 performs in response to predicting the change in vehicle speed of the remote vehicle 14. That is, in the example as shown in
[0070] Referring generally to the figures, the disclosed system provides various technical effects and benefits by providing an approach to predict the behavior of vehicles surrounding the host or autonomous vehicle. The prediction is determined based on aggregated vehicle metrics that are based on historical data collected at the specific geographical location of the remote vehicle. The system also determines adaptive maneuvers for the autonomous vehicle to perform to accommodate the behavior of the remote vehicle. Thus, the disclosed system anticipates likely maneuvers by surrounding vehicles and instructs the autonomous vehicle to react to the likely maneuvers, thereby allowing the autonomous vehicle to operate more naturalistically in traffic.
[0071] The controllers may refer to, or be part of, an electronic circuit, a combinational logic circuit, a field programmable gate array (FPGA), a processor (shared, dedicated, or group) that executes code, or a combination of some or all of the above, such as in a system-on-chip. Additionally, the controllers may be microprocessor-based, such as a computer having at least one processor, memory (RAM and/or ROM), and associated input and output buses. The processor may operate under the control of an operating system that resides in memory. The operating system may manage computer resources so that computer program code embodied as one or more computer software applications, such as an application residing in memory, may have instructions executed by the processor. In an alternative embodiment, the processor may execute the application directly, in which case the operating system may be omitted.
[0072] The description of the present disclosure is merely exemplary in nature and variations that do not depart from the gist of the present disclosure are intended to be within the scope of the present disclosure. Such variations are not to be regarded as a departure from the spirit and scope of the present disclosure.