MOTOR VEHICLE WITH TURN SIGNAL-BASED LANE LOCALIZATION
20220355864 · 2022-11-10
Assignee
Inventors
CPC classification
B62D15/0255
PERFORMING OPERATIONS; TRANSPORTING
B60Q1/343
PERFORMING OPERATIONS; TRANSPORTING
International classification
Abstract
A method increases fidelity of a lane localization function aboard a motor vehicle by receiving input signals indicative of a relative position of the vehicle with respect to a roadway. The input signals include GPS and geocoded mapping data, and an electronic turn signal indicative of activation of the turn signal lever. Sensor-specific lane probability distributions are calculated via the lane localization function using the input signals. The various distributions are fused via the localization function to generate a host lane assignment. The host lane assignment corresponds to a lane of the roadway having a highest probability among a set of possible lane assignments. An autonomous steering control action is performed aboard the motor vehicle using an Advanced Driver Assistance System (ADAS) in response to the host lane assignment. A motor vehicle has a controller that performs the method, e.g., by executing instructions from computer-readable media.
Claims
1. A method for performing a lane localization function aboard a motor vehicle having a turn signal lever, the method comprising: receiving GPS data and geocoded mapping data, via a controller, together indicative of a relative position of the motor vehicle with respect to a roadway; in response to enabling conditions, receiving an electronic turn signal indicative of a present activation state of the turn signal lever, wherein the GPS data, the geocoded mapping data, and the present activation state collectively form a set of input signals; in response to the set of input signals, calculating a plurality of lane probability distributions via the controller using the lane localization function; automatically fusing the plurality of lane probability distributions, via the lane localization function, to thereby generate a host lane assignment of the motor vehicle, wherein the host lane assignment corresponds to a lane of the roadway having a highest probability among a set of possible lane assignments; and executing an autonomous steering control action aboard the motor vehicle in response to the host lane assignment, via the controller using an Advanced Driver Assistance System (ADAS).
2. The method of claim 1, wherein the motor vehicle includes a video camera configured to collect real-time video image data of the roadway, the method further comprising receiving the real-time video image data, via the controller, as part of the input signals.
3. The method of claim 2, further comprising: determining a lane marker type via the controller using the real-time video image data of the roadway, wherein the enabling conditions include a predetermined lane marker type on a side of a lane matching a direction of the turn signal.
4. The method of claim 2, wherein the motor vehicle also includes a radar system and/or a lidar system respectively configured to collect radar data and/or lidar data of the roadway, the method further comprising receiving the radar data and/or the lidar data as part of the set of input signals.
5. The method of claim 4, further comprising automatically fusing the radar data and/or the lidar data with the real-time video image data using an object fusion logic block of the controller.
6. The method of claim 1, wherein the lane localization function includes a Markov localization function, and wherein calculating the plurality of lane probability distributions includes using the Markov localization function.
7. The method of claim 1, wherein the enabling conditions include a lane change value indicative of an elapsed time since a detected lane change in a direction of the electronic turn signal and/or an elapsed time since the electronic turn signal has been set in a particular direction.
8. The method of claim 1, further comprising: determining a driver attention score via the controller, wherein the enabling conditions include the driver attention score exceeding a calibrated threshold attention score.
9. The method of claim 1, wherein the enabling conditions include a look-ahead value indicative of an existence of and/or an estimated width of an upcoming lane of the roadway.
10. The method of claim 1, wherein executing an autonomous steering control action includes executing one or more of a lane centering control maneuver, a driver-requested automatic lane change maneuver, and/or a system-initiated automatic lane change maneuver.
11. A motor vehicle comprising: a vehicle body; a set of road wheels connected to the vehicle body; a turn signal lever configured to generate an electronic turn signal; an Advanced Driver Assistance System (ADAS) configured to control a dynamic state of the motor vehicle based on a host lane assignment; and a controller configured to execute instructions for performing a lane localization function aboard the motor vehicle using the electronic turn signal, wherein the controller is configured to: receive a set of input signals indicative of a relative position of the motor vehicle with respect to a roadway, the set of input signals including GPS data and geocoded mapping data; in response to enabling conditions, receive the electronic turn signal as part of the input signals, the electronic turn signal being indicative of a present activation state of the turn signal lever; calculate multiple lane probability distributions, via the lane localization function, using the set of input signals; automatically fuse the lane probability distributions, via the lane localization function, to thereby generate the host lane assignment, wherein the host lane assignment corresponds to a lane of the roadway having a highest probability among a set of possible lane assignments; and execute an autonomous steering control action aboard the motor vehicle in response to the host lane assignment using the ADAS.
12. The motor vehicle of claim 11, wherein the motor vehicle includes a video camera configured to collect real-time video image data of the roadway, and wherein the set of input signals includes the real-time video image data.
13. The motor vehicle of claim 12, wherein the controller is configured to: determine a lane marker type via the controller using the real-time video image data of the roadway, wherein the enabling conditions include a predetermined crossable lane marker type on a side of a lane matching a direction of the electronic turn signal.
14. The motor vehicle of claim 11, further comprising a radar system and/or a lidar system configured to collect radar data and/or lidar data of the roadway, respectively, as part of the set of input signals, and wherein the controller is configured to automatically fuse the radar data and/or the lidar data together or with other sensor data using an object fusion logic block.
15. The motor vehicle of claim 11, wherein the enabling conditions include a lane change value indicative of an elapsed time since a detected lane change in a direction of the turn signal and/or an elapsed time since the turn signal has been set in one direction.
16. The motor vehicle of claim 11, wherein the controller is configured to determine a driver attention score, wherein the enabling conditions include the driver attention score exceeding a calibrated threshold attention score.
17. The motor vehicle of claim 11, wherein the enabling conditions include a look-ahead value indicative of an existence of and/or an estimated width of an upcoming lane of the roadway.
18. The motor vehicle of claim 11, wherein the autonomous steering control action includes a lane centering control maneuver, a driver-requested automatic lane change maneuver, and/or a controller-initiated automatic lane change maneuver.
19. A computer-readable medium on which is recorded instructions for selectively increasing fidelity of a lane localization function aboard a motor vehicle having a turn signal lever, wherein the instructions are selectively executed by a processor of the motor vehicle in response to a set of fidelity enhancement enabling conditions to thereby cause the processor to: receive an electronic turn signal indicative of a present activation state of the turn signal lever, wherein the electronic turn signal is part of a set of input signals; receive GPS data and geocoded mapping data, as part of the set of input signals and indicative of a relative position of the motor vehicle with respect to a roadway; calculate a plurality of lane probability distributions, via the lane localization function, using the input signals; automatically fuse the plurality of lane probability distributions, via the lane localization function, to thereby generate a host lane assignment of the motor vehicle, wherein the host lane assignment corresponds to a lane of the roadway having a highest probability among a set of possible lane assignments; and transmit control signals to an Advanced Driver Assistance System (ADAS) of the motor vehicle in response to the host lane assignment to thereby perform an autonomous steering control action of the motor vehicle, including one or more of a lane centering control maneuver, a driver-requested automatic lane change maneuver, and/or a system-initiated automatic lane change maneuver.
20. The computer-readable medium of claim 19, wherein the enabling conditions include a detected predetermined crossable lane marker type on a side of a lane matching an activation direction of the turn signal lever.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0019]
[0020]
[0021]
[0022]
[0023]
DETAILED DESCRIPTION
[0024] The present disclosure is susceptible of embodiment in many different forms. Representative examples of the disclosure are shown in the drawings and described herein in detail as non-limiting examples of the disclosed principles. To that end, elements and limitations described in the Abstract, Introduction, Summary, and Detailed Description sections, but not explicitly set forth in the claims, should not be incorporated into the claims, singly or collectively, by implication, inference, or otherwise.
[0025] For purposes of the present description, unless specifically disclaimed, use of the singular includes the plural and vice versa, the terms “and” and “or” shall be both conjunctive and disjunctive, “any” and “all” shall both mean “any and all”, and the words “including”, “containing”, “comprising”, “having”, and the like shall mean “including without limitation”. Moreover, words of approximation such as “about”, “almost”, “substantially”, “generally”, “approximately”, etc., may be used herein in the sense of “at, near, or nearly at”, or “within 0-5% of”, or “within acceptable manufacturing tolerances”, or logical combinations thereof.
[0026] Referring to the drawings, wherein like reference numbers refer to like features throughout the several views, a motor vehicle 10 is depicted in
[0027] For illustrative simplicity, select components of the motor vehicle 10 are shown and described while other components are omitted. In the depicted representative embodiment of
[0028] Within the scope of the present disclosure, the motor vehicle 10 is equipped with a plurality of lane localization input sensors and/or devices, hereinafter referred to as a lane localization suite 18 for simplicity. Collectively, the constituent components of the lane localization suite 18 provide input signals (arrow 30) to the controller 50 indicative of a relative position of the motor vehicle 10 with respect to/on a roadway having the road surface 16. The capabilities of the lane localization suite 18 are thus relied upon in real-time by the controller 50 when performing autonomous or semi-autonomous steering functions, such as but not necessarily limited to lane keep assist with lane departure warning, lane change assist, lane centering, etc.
[0029] The composition of the lane localization suite 18 will vary with the particular equipment configuration of the motor vehicle 10. Typically, however, the lane localization suite 18 will include or have access to at least a geocoded mapping database 22 and a GPS receiver 24, the latter of which receives GPS signals 25 from an orbiting constellation of GPS satellites (not shown), as is well understood in the art. Thus, the input signals (arrow 30) typically include geocoded mapping data and the GPS signals 25 from the respective geocoded mapping database 22 and GPS receiver 24, with the mapping data provided by such sources displayed to the driver of the motor vehicle 10 via a touch screen (not shown) or other suitable display arranged in a center stack or other convenient location within the motor vehicle 10, or on a similar touch screen of a smartphone or other portable electronic device.
[0030] Additionally, the lane localization suite 18 may include a video camera 20 and one or more remote sensing transceivers 26, e.g., a radar sensor and/or a lidar sensor. With respect to the video camera 20, such a device may be securely connected to the vehicle body 14 at a suitable forward-facing location thereof, such as behind a rearview mirror 17 attached to a windshield 19, to a dashboard (not shown), or at another application-suitable location providing good visibility of the roadway lying before the motor vehicle 10. The video camera 20 is configured to collect real-time video image data of the roadway, with the input signals (arrow 30) including the real-time video image data. The remote sensing transceiver(s) 26 in turn are configured to transmit electromagnetic energy at sensor-specific wavelengths toward a target as an interrogation signal, and to receive a reflected portion of the electromagnetic waveform from the target as a response signal. In
[0031] In addition to the video camera 20, the geocoded mapping database 22, the GPS receiver 24, and the remote sensing transceivers 26, the method 100 and controller 50 of
[0032] For the purposes of executing the method 100, the controller 50 shown schematically in
[0033] Software may include, in non-limiting examples, routines, programs, objects, components, and data structures that perform particular tasks or implement particular data types. The software may form an interface to allow a computer to react according to a source of input. The software may also cooperate with other code segments to initiate a variety of tasks in response to data received in conjunction with the source of the received data. The software may be stored on a variety of memory (M), such as but not limited to CD-ROM, magnetic disk, solid-state memory, etc. Similarly, the method 100 or parts thereof may be executed by a device other than the controller 50 and/or embodied in firmware or dedicated hardware in an available manner, such as when implemented by an ASIC, a programmable logic device, a field programmable logic device, discrete logic, etc.
[0034] Still referring to
[0035] Referring now to
[0036] In the illustrated scenario, a driver of the motor vehicle 10 traveling in lane L2 may decide to merge into lane L1, e.g., in preparation for an upcoming offramp or when passing another vehicle. Such a lane change maneuver is indicated in
[0037] Referring to
[0038] At block B104 (“TS-ENBL Cond?”), the controller 50 of
[0039] As part of the present method 100, the controller 50 of
[0040] In an optional embodiment of the motor vehicle 10 in which an interior camera and/or other hardware and associated software evaluates and assigns a numeric score to a driver's attention level, e.g., a gaze-tracking camera collocated with the video camera 20 on the windshield 19 of
[0041] Still other exemplary enabling conditions usable as part of block B104 include a particular detected lane marker type. More particularly, the controller 50 could evaluate, for instance using resident image processing software, whether lines detected on the side of a lane L1, L2, L3, or L4 matching a turn signal direction of the turn signal lever 28 are dashed or another crossable line type, and/or that a line marker located one over from a side of the lane matching the direction of the turn signal is valid, i.e., is not a road edge corresponding to the boundary lines 144 of
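By way of non-limiting illustration only, the crossable-marker enabling condition described above may be sketched as follows. The marker-type labels, function name, and data representation are hypothetical assumptions for illustration, not the disclosed implementation:

```python
# Illustrative sketch of the lane-marker enabling condition: the electronic
# turn signal is used by the lane localization function only when the lane
# marker on the signaled side is a crossable type. Marker labels are assumed.
CROSSABLE_MARKERS = {"dashed", "broken", "dashed_solid"}  # hypothetical labels

def turn_signal_enabled(signal_direction, left_marker, right_marker):
    """Return True when the marker on the signaled side is crossable."""
    if signal_direction == "left":
        return left_marker in CROSSABLE_MARKERS
    if signal_direction == "right":
        return right_marker in CROSSABLE_MARKERS
    return False

# A left signal next to a dashed left marker enables the turn signal input;
# a right signal next to a road edge does not.
enabled = turn_signal_enabled("left", "dashed", "road_edge")
```

In a production controller this check would be combined with the other enabling conditions, e.g., the elapsed-time and driver attention conditions described above.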
[0042] At block B105 (“DISBL CC.sub.TS”), the controller 50 temporarily disables use of the turn signal state by preventing its use in the lane localization function 51, with “CC.sub.TS” shown in
[0043] Block B106 (“CALC Bel(L.sub.T=1)”) includes calculating multiple lane probability distributions using the lane localization function 51 shown in
[0044] With respect to lane localization and its related statistical probability analysis, in general the controller 50 performs real-time calculation of the probability of the motor vehicle 10 being present in a particular lane at a given moment in time. Referring briefly again to
[0045] Precisely how a given input probability distribution (P(s.sub.T|l)) is calculated in a given application may differ depending on the sensor being considered, i.e., the various sensors or components in the lane localization suite 18 of
[0046] For the turn signal indication contemplated herein for the purpose of increasing fidelity of the lane localization function 51, lanes meeting the criterion of having an available adjacent lane in the direction of the turn signal, for instance a lane present on the left of the motor vehicle 10 when the turn signal lever 28 is used to signal a left-hand turn, are given a higher probability than lanes that do not meet this criterion. The exact values of the high and low probabilities may be tunable in order to adjust the weight of a given sensor or sensor input. Once a given sensor input distribution is calculated, it may be applied in the same way as other available sensor inputs to produce a final belief/probability distribution.
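A minimal sketch of such a turn signal distribution follows, assuming a simple indexing scheme in which lane 0 is the leftmost lane and tunable high/low weights; the function name and parameter values are illustrative assumptions, not the claimed implementation:

```python
# Illustrative turn-signal sensor distribution: lanes with an available
# adjacent lane in the signaled direction receive a tunable high weight,
# other lanes a tunable low weight; the result is normalized to sum to 1.
def turn_signal_distribution(num_lanes, direction, p_high=0.8, p_low=0.2):
    """direction: 'left' or 'right'. Lane 0 is the leftmost lane."""
    raw = []
    for lane in range(num_lanes):
        has_adjacent = (lane > 0) if direction == "left" else (lane < num_lanes - 1)
        raw.append(p_high if has_adjacent else p_low)
    total = sum(raw)
    return [r / total for r in raw]

# With a left turn signal on a four-lane roadway, the leftmost lane has no
# lane to its left and therefore receives the low weight.
dist = turn_signal_distribution(4, "left")
```

Tuning `p_high` and `p_low` adjusts how strongly the turn signal input influences the fused belief relative to the other sensors.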
[0047] As part of block B106 of the method 100, the controller 50 may calculate a turn signal probability distribution P(S.sub.T|L.sub.T=l), where S.sub.T is the turn signal direction at time T, and L.sub.T is the set of available lanes at time T, e.g., Lanes L1, L2, L3, and L4 of
[0049] Referring briefly to the representative probability sequence 80 of
[0050] The probability distribution 82 from a sensor update is then applied to the prior probability distribution 84 at time T-1, i.e., Belief(L.sub.T-1=1). Thus, the controller 50 multiplies the probability distributions 82 and 84 together to calculate an updated probability distribution 86, i.e., Belief(L.sub.T=1), at the present time T. The effect in the representative scenario of
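The sensor update described above, i.e., multiplying a sensor's probability distribution by the prior belief and normalizing the result, may be sketched as follows; the function name and numeric values are illustrative assumptions only:

```python
# Illustrative Bayesian lane-belief update: element-wise product of the
# prior belief and the sensor distribution, followed by normalization.
def update_belief(prior, sensor_dist):
    """Apply one sensor update to the prior lane belief."""
    posterior = [p * s for p, s in zip(prior, sensor_dist)]
    total = sum(posterior)
    if total == 0.0:
        # Degenerate case: retain the prior rather than divide by zero.
        return list(prior)
    return [p / total for p in posterior]

# Example with four lanes L1..L4: the prior favors L2, and a sensor update
# consistent with L1/L2 sharpens the belief toward L2.
prior = [0.10, 0.60, 0.20, 0.10]
sensor = [0.40, 0.40, 0.10, 0.10]
belief = update_belief(prior, sensor)
```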
[0051] Referring again to
[0052] Execution of the method 100 described above could be facilitated using the representative control logic 50L as depicted schematically in
[0053] In a possible signal configuration, for example, the video camera 20 may output camera data CC.sub.20B indicative of detected lane marker types, camera data CC.sub.20A indicative of the presence, size, and shape of, and range to, detected objects in proximity to the motor vehicle 10, and camera data CC.sub.20C indicative of lateral positions of detected broken or solid line lane markers, e.g., lines 44 and 144 of
[0054] The lane localization function 51, i.e., an encoded or programmed implementation of the present method 100, is thereafter used to generate multiple lane probability distributions. Specifically, logic blocks 60, 62, 64, 66, and 68 may be used to independently generate corresponding sensor-specific probability distributions, which are then collectively processed via a fusion block 69 (“Fusion”) to generate the above-noted host lane assignment. The ADAS equipment 70 is then informed by the host lane assignment, with the controller 50 thereafter executing a corresponding control action aboard the motor vehicle 10 of
[0055] Logic block 60 may receive fused object data CC.sub.27 from an object fusion block 27 and lane layout data CC.sub.22C from the geocoded mapping database 22, and then output a lane probability distribution (arrow P.sub.60) indicative of relative lanes of fused objects versus a map-based lane layout. In terms of data fusion at block 27, which in general may be implemented in an analogous manner to implementation of the fusion block 69 described below, various possibilities exist within the scope of the disclosure, including fusing video image data, radar data, and/or lidar data, i.e., any or all available sensor data depending on sensor availability and application requirements. Logic block 62 similarly may determine a lane probability distribution (arrow P.sub.62) indicative of lane marker type using the video camera 20 and the geocoded mapping database 22. Likewise, logic block 64 may produce a lane probability distribution (arrow P.sub.64) indicative of the GPS location of the motor vehicle 10, while logic block 66 produces a lane probability distribution (arrow P.sub.66) based on a detected lane change informed solely by the video camera 20. To account for turn signal information, the logic block 68 produces a lane probability distribution (arrow P.sub.68) based solely on the state of the turn signal lever 28 of
[0056] As noted above, sensor updates are applied in the same way, i.e., by multiplying probability distributions by a prior belief and thereafter normalizing to produce a new belief. Input probability distributions, e.g., P.sub.60, P.sub.62, P.sub.64, P.sub.66, P.sub.68 in
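A minimal sketch of this sequential fusion, assuming illustrative four-lane distributions, follows; the host lane assignment is taken as the highest-probability lane, and all names and values are hypothetical:

```python
# Illustrative fusion: each available sensor distribution is applied in turn
# as a multiplicative update to the running belief, normalizing after each
# step; the host lane assignment is the argmax of the fused belief.
def fuse(prior, sensor_dists):
    belief = list(prior)
    for dist in sensor_dists:
        belief = [b * d for b, d in zip(belief, dist)]
        total = sum(belief)
        belief = [b / total for b in belief]
    return belief

prior = [0.25, 0.25, 0.25, 0.25]   # uniform initial belief over four lanes
p_map = [0.30, 0.30, 0.20, 0.20]   # illustrative map/GPS distribution
p_ts  = [0.10, 0.40, 0.40, 0.10]   # illustrative turn-signal distribution
belief = fuse(prior, [p_map, p_ts])
host_lane = max(range(len(belief)), key=belief.__getitem__)
```

Because every sensor update has the same multiplicative form, additional inputs such as the lane-marker or lane-change distributions may be appended to the list without changing the fusion logic.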
[0057] Those skilled in the art will appreciate that, by using the present method 100, the fidelity of the lane localization function 51 may be situationally improved aboard the motor vehicle 10 of
[0058] The detailed description and the drawings or figures are supportive and descriptive of the present teachings, but the scope of the present teachings is defined solely by the claims. While some of the best modes and other embodiments for carrying out the present teachings have been described in detail, various alternative designs and embodiments exist for practicing the present teachings defined in the appended claims. Moreover, this disclosure expressly includes combinations and sub-combinations of the elements and features presented above and below.