DRIVER SEAT AND SIDE MIRROR-BASED LOCALIZATION OF 3D DRIVER HEAD POSITION FOR OPTIMIZING DRIVER ASSIST FUNCTIONS
20230130201 · 2023-04-27
CPC classification
B60W2040/0881 (Performing operations; transporting)
G06V20/597 (Physics)
Abstract
A motor vehicle includes a body defining a passenger compartment and having opposing driver and passenger sides. The vehicle includes driver and passenger side mirrors. The driver side mirror has a sweep angle (α) and an elevation angle (γ). The passenger side mirror has a sweep angle (β). The side mirrors are separated from each other by a distance (D). An adjustable driver seat has a height (H). An electronic controller, in response to position signals inclusive of angles (α), (β), and (γ), the distance (D), and the height (H), calculates a three-dimensional (3D) driver head position of a driver of the vehicle, and thereafter uses the 3D driver head position to improve performance of a driver assist system device. Functions of the controller may be implemented as a method or recorded on a computer readable medium for execution by a processor.
Claims
1. A motor vehicle comprising: a vehicle body defining a passenger compartment, the vehicle body including a driver side and a passenger side; a driver side mirror connected to the driver side of the vehicle body, the driver side mirror having a sweep angle (α) and an elevation angle (γ); a passenger side mirror connected to the passenger side of the vehicle body and having a sweep angle (β), wherein the passenger side mirror is separated from the driver side mirror by a distance of separation (D); an adjustable driver seat connected to the vehicle body within the passenger compartment and having a height (H); an electronic controller configured, in response to electronic position signals inclusive of the sweep angle (α), the sweep angle (β), the elevation angle (γ), the distance of separation (D), and the height (H), to calculate a three-dimensional (3D) driver head position of a driver of the motor vehicle when the driver is seated within the passenger compartment; and at least one driver assist system (DAS) device in communication with the electronic controller and configured to execute a corresponding driver assist control function in response to the 3D driver head position.
2. The motor vehicle of claim 1, wherein the electronic controller is configured to output the 3D driver head position as a numeric triplet value [x, y, z] corresponding to an x-position (P.sub.x), a y-position (P.sub.y), and a z-position (P.sub.z) within a nominal xyz Cartesian frame of reference.
3. The motor vehicle of claim 2, wherein the electronic controller is configured to calculate the x-position (P.sub.x) using the following equation:
4. The motor vehicle of claim 3, wherein the function of the x-position (P.sub.x) is P.sub.y=P.sub.x tan(α), and the electronic controller is configured to calculate the z-position (P.sub.z) as a function of the x-position (P.sub.x).
5. The motor vehicle of claim 4, wherein the function of the x-position (P.sub.x) is
6. The motor vehicle of claim 1, further comprising a microphone array, wherein the at least one DAS device includes an acoustic beamforming block coupled to the microphone array and configured to process received acoustic signatures therefrom, wherein the acoustic beamforming block is configured to use the 3D driver head position to perform a speech recognition function as the corresponding driver assist control function.
7. The motor vehicle of claim 1, further comprising a driver monitoring system (DMS) having at least one camera positioned within the passenger compartment, wherein the at least one DAS device includes the DMS and an associated logic block configured to perform a gaze tracking and/or facial expression recognition function as the corresponding driver assist control function.
8. The motor vehicle of claim 1, further comprising a heads up display (HUD) device positioned within the passenger compartment, wherein the at least one DAS device includes the HUD device and an associated logic block configured to adjust a setting of the HUD device as the corresponding driver assist control function.
9. The motor vehicle of claim 1, further comprising a height-adjustable seat belt assembly mounted to the vehicle body within the passenger compartment, wherein the at least one DAS device includes the height-adjustable seat belt assembly and an associated logic block configured to adjust a height of the height-adjustable seat belt assembly as the corresponding driver assist control function.
10. The motor vehicle of claim 1, wherein the motor vehicle is characterized by an absence of a driver monitoring system (DMS).
11. A method for use aboard a motor vehicle having a vehicle body defining a passenger compartment, the vehicle body including a driver side mirror connected to a driver side of the vehicle body and having a sweep angle (α) and an elevation angle (γ), a passenger side mirror connected to a passenger side of the vehicle body, having a sweep angle (β), and separated from the driver side mirror by a distance of separation (D), and an adjustable driver seat connected to the vehicle body within the passenger compartment and having a height (H), the method comprising: receiving, via an electronic controller, a set of position signals inclusive of the sweep angle (α), the sweep angle (β), the elevation angle (γ), the distance of separation (D), and the height (H); calculating, using the set of position signals, a three-dimensional (3D) driver head position of a driver of the motor vehicle when the driver is seated within the passenger compartment; and transmitting the 3D driver head position to at least one driver assist system (DAS) device in communication with the electronic controller to thereby request execution of a corresponding driver assist control function aboard the motor vehicle.
12. The method of claim 11, wherein calculating the 3D driver head position includes calculating a numeric triplet value [x, y, z] corresponding to an x-position (P.sub.x), a y-position (P.sub.y), and a z-position (P.sub.z) within a nominal xyz Cartesian frame of reference.
13. The method of claim 12, wherein calculating the 3D driver head position includes calculating the x-position (P.sub.x) using the following equation:
14. The method of claim 13, further comprising calculating the z-position (P.sub.z) as a function of the x-position (P.sub.x) using the following equation:
15. The method of claim 12, wherein transmitting the 3D driver head position to the at least one DAS device includes transmitting the 3D driver head position to an acoustic beamforming block coupled to a microphone array to thereby cause the at least one DAS device to perform a speech recognition function, using the 3D driver head position, as the corresponding driver assist control function.
16. The method of claim 12, wherein transmitting the 3D driver head position to the at least one DAS device includes transmitting the 3D driver head position to a logic block associated with a driver monitoring system (DMS) having at least one camera positioned within the passenger compartment, the at least one DAS device including the DMS, to thereby cause the DMS to perform a gaze tracking function and/or facial expression recognition function as the corresponding driver assist control function.
17. A computer readable medium on which instructions are recorded for localizing a three dimensional (3D) driver head position of a driver of a motor vehicle, wherein execution of the instructions by at least one processor of an electronic controller causes the electronic controller to: receive a set of position signals inclusive of a sweep angle (α) and an elevation angle (γ) of a driver side mirror connected to a driver side of a vehicle body of the motor vehicle, a sweep angle (β) of a passenger side mirror connected to a passenger side of the vehicle body, a distance of separation (D) between the driver side mirror and the passenger side mirror, and a height (H) of an adjustable driver seat; calculate the 3D driver head position using the set of position signals when the driver is seated within a passenger compartment of the motor vehicle; and transmit the 3D driver head position to at least one driver assist system (DAS) device of the motor vehicle for use in executing a corresponding driver assist control function aboard the motor vehicle.
18. The computer readable medium of claim 17, wherein execution of the instructions causes the electronic controller to transmit an optimization request signal to the at least one DAS device concurrently with the 3D driver head position to thereby request use of the 3D driver head position in an optimization subroutine of the at least one DAS device.
19. The computer readable medium of claim 17, wherein execution of the instructions causes the electronic controller to calculate the 3D driver head position as a numeric triplet value [x, y, z] corresponding to an x-position (P.sub.x), a y-position (P.sub.y), and a z-position (P.sub.z) within a nominal xyz Cartesian frame of reference.
20. The computer readable medium of claim 19, wherein execution of the instructions causes the electronic controller to respectively calculate the x-position (P.sub.x), the y-position (P.sub.y), and the z-position (P.sub.z) using the following equations:
Description
BRIEF DESCRIPTION OF THE DRAWINGS
DETAILED DESCRIPTION
[0016] The present disclosure is susceptible of embodiment in many different forms. Representative examples of the disclosure are shown in the drawings and described herein in detail as non-limiting examples of the disclosed principles. To that end, elements and limitations described in the Abstract, Introduction, Summary, and Detailed Description sections, but not explicitly set forth in the claims, should not be incorporated into the claims, singly or collectively, by implication, inference, or otherwise.
[0017] For purposes of the present description, unless specifically disclaimed, use of the singular includes the plural and vice versa, the terms “and” and “or” shall be both conjunctive and disjunctive, “any” and “all” shall both mean “any and all”, and the words “including”, “containing”, “comprising”, “having”, and the like shall mean “including without limitation”. Moreover, words of approximation such as “about”, “almost”, “substantially”, “generally”, “approximately”, etc., may be used herein in the sense of “at, near, or nearly at”, or “within 0-5% of”, or “within acceptable manufacturing tolerances”, or logical combinations thereof.
[0018] Referring to the drawings, wherein like reference numbers refer to like features throughout the several views, a motor vehicle 10 includes a vehicle body 12 defining a passenger compartment 16.
[0019] The vehicle body 12 includes a driver side 12D and a passenger side 12P.
[0020] Within the scope of the present disclosure, the motor vehicle 10 includes an electronic controller (C) 50 in the form of one or more computer hardware and software devices collectively configured, i.e., programmed in software and equipped in hardware, to execute computer readable instructions embodying a method 100. In executing the present method 100, the electronic controller 50 is able to optimize one or more driver assist functions aboard the motor vehicle 10, with such functions possibly ranging from automatic speech and/or facial recognition/gaze tracking functions to direct or indirect component control actions, several examples of which are described in greater detail below.
[0021] In accordance with the present method 100, the vehicle body 12 includes respective first (“driver side”) and second (“passenger side”) adjustable side mirrors 20D and 20P. The respective driver and passenger side mirrors 20D and 20P are configured as reflective panes of glass, each selectively positioned by the driver 18 using a corresponding joystick or other suitable device (not shown). The driver side mirror 20D, which is connected to the driver side 12D of the vehicle body 12, has a corresponding sweep angle (α) and elevation angle (γ), both of which are measured, monitored, and reported to the electronic controller 50 as part of a set of position signals (arrow CC.sub.I) over the vehicle communications bus, e.g., a controller area network (CAN) bus as appreciated in the art, in the course of operation of the motor vehicle 10.
[0026] The motor vehicle 10 as contemplated herein includes at least one driver assist system (DAS) device 11 in communication with the electronic controller 50 over hardwired transfer conductors and/or a wireless communications pathway using suitable communications protocols, e.g., a Wi-Fi protocol using a wireless local area network (LAN) per IEEE 802.11, a 3G, 4G, or 5G cellular network-based protocol, BLUETOOTH, BLUETOOTH Low Energy (BLE), and/or another suitable protocol. Each DAS device 11 in turn is operable to execute a corresponding driver assist control function in response to the received 3D driver head position (P.sub.18) as set forth herein.
[0028] Computer readable non-transitory instructions or code embodying the method 100 and executable by the electronic controller 50 may include one or more separate software programs, each of which may include an ordered listing of executable instructions for implementing the stated logical functions.
[0029] Similarly, the processor (P) receives and processes measured position signals from the respective driver and passenger side mirrors 20D and 20P, as well as stored calibrated data such as the above-noted distance of separation (D) along a lateral axis (xx) extending between the mirrors 20D and 20P. In response to these signals, which collectively form the position signals (arrow CC.sub.I), the electronic controller 50 calculates the 3D driver head position (P.sub.18) of the driver 18.
[0031] Moreover, digital signal processing (DSP) techniques such as acoustic beamforming may be used to shape received acoustic waveforms 32 from the various microphones 30 of the microphone array 30A.
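One way the localized 3D head position could feed such a beamformer is by computing per-microphone steering delays toward the head, as in classic delay-and-sum beamforming. The sketch below is illustrative only: the array geometry, sample rate, and function names are assumptions, not details taken from the disclosure.

```python
import math

SPEED_OF_SOUND = 343.0  # m/s, approximate value at room temperature

def steering_delays(mic_positions, head_position, sample_rate=16000):
    """Compute delay-and-sum steering delays (in samples) that align
    each microphone channel on a 3D target point, e.g., the localized
    driver head position (P_18). Hypothetical helper for illustration.

    mic_positions: list of (x, y, z) microphone coordinates in meters.
    head_position: (x, y, z) target point in meters.
    """
    distances = [math.dist(head_position, m) for m in mic_positions]
    nearest = min(distances)
    # Delay each channel by its extra path length so that wavefronts
    # emitted at the head arrive time-aligned across the array.
    return [(d - nearest) / SPEED_OF_SOUND * sample_rate for d in distances]
```

Summing the delayed channels then emphasizes speech arriving from the head position while attenuating off-axis cabin noise.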
[0032] With respect to vision systems, modern vehicles having higher trim levels benefit from the integration of cameras and related image processing software that together identify unique characteristics of the driver 18, and that thereafter use such characteristics in the overall control of the motor vehicle 10. For example, facial recognition software may be used to estimate the cognitive state of the driver 18, such as by detecting facial expressions or other facial characteristics that may be indicative of possible drowsiness, anger, or distraction. Gaze detection is used in a similar manner to help detect and locate the pupils of the driver 18, and to thereafter calculate a line of sight of the driver 18. Refined location and orientation of the driver 18 in the motor vehicle 10 can also help improve gaze detection and task completion, providing more accurate results for voice-based virtual assistants.
[0033] In order to locate the face of the driver 18 within the passenger compartment 16, the electronic controller 50 uses setting profiles of the driver side mirror 20D, the passenger side mirror 20P, and the adjustable driver seat 19. The electronic controller 50 performs its localization functions without specialized sensors, instead using position data from integrated position sensors of the respective driver and passenger side mirrors 20D and 20P and the adjustable driver seat 19, i.e., data that is already customarily reported via a resident CAN bus of the motor vehicle 10.
[0034] The electronic controller 50 according to a representative embodiment is configured, for a nominal xyz Cartesian reference frame in which the electronic controller 50 derives and outputs the numeric triplet value P[x,y,z], to calculate an x-position (P.sub.x) of a head of the driver 18 as a function of the sweep angles (α) and (β) and the distance of separation (D), and to calculate a y-position (P.sub.y) as a function of the x-position (P.sub.x). The y-position may be expressed mathematically as a function of the x-position, i.e., P.sub.y=P.sub.x tan(α), with the electronic controller 50 further configured to calculate a z-position (P.sub.z) as a function of the x-position (P.sub.x).
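The specific x- and z-equations appear only in the figures of the published application, so the sketch below reconstructs one geometry that is consistent with the stated relation P_y = P_x tan(α): the head lies at the intersection of rays from the two mirrors, which are separated laterally by D. The triangulation formulas and the use of γ and H in the z-term are assumptions for illustration, not the patent's verbatim equations.

```python
import math

def head_position(alpha, beta, gamma, D, H):
    """Sketch of a two-mirror triangulation of the 3D driver head
    position P[x, y, z]. Angles are in radians; D is the lateral
    mirror separation; H is the adjustable driver seat height.
    ASSUMED geometry: driver mirror at the origin, x rearward into
    the cabin, y lateral toward the passenger side, z vertical.
    """
    # From the driver mirror:    P_y = P_x * tan(alpha)
    # From the passenger mirror: D - P_y = P_x * tan(beta)
    # Adding the two relations and solving for P_x:
    p_x = D / (math.tan(alpha) + math.tan(beta))
    p_y = p_x * math.tan(alpha)
    # Elevation angle lifts the driver-mirror ray out of the xy-plane;
    # the seat height offsets the result vertically (assumption).
    p_z = p_x * math.tan(gamma) + H
    return (p_x, p_y, p_z)
```

With symmetric 45° sweep angles, zero elevation, D = 2 m, and H = 1 m, the head resolves to roughly (1, 1, 1), i.e., midway between the mirrors and one meter into the cabin.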
[0038] As part of the method 100, a 3D position estimator block 102 of the electronic controller 50, in response to the input signals (arrow CC.sub.I), estimates the 3D driver head position (P.sub.18) of the driver 18.
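The estimator-to-DAS handoff described here, including the optimization request signal transmitted concurrently with the head position, can be sketched as a simple publish step from the controller to each registered DAS device. The message and callback types below are hypothetical illustrations, not names from the disclosure.

```python
from dataclasses import dataclass
from typing import Callable, List, Tuple

@dataclass
class HeadPositionMessage:
    """3D driver head position (P_18) plus an optimization request
    flag, mirroring the concurrently transmitted signals described
    in the disclosure. Hypothetical message type."""
    position: Tuple[float, float, float]  # (P_x, P_y, P_z)
    optimize: bool = True

def publish_head_position(position,
                          das_handlers: List[Callable[[HeadPositionMessage], None]]):
    """Transmit the localized head position to every registered DAS
    device handler (beamformer, DMS logic block, HUD, seat belt, ...)."""
    msg = HeadPositionMessage(position=position, optimize=True)
    for handler in das_handlers:
        handler(msg)
    return msg
```

Each handler then decides how to apply the triplet in its own optimization subroutine, keeping the estimator block decoupled from the individual DAS devices.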
[0040] In a similar vein, the method 100 may be used to improve the available accuracy and/or detection speed of a driver monitoring system (DMS) device 60 having one or more cameras 62 disposed thereon. Such cameras 62 may operate at a required resolution and in an application-specific, eye-safe frequency range. Output images (arrow 160) may be fed from the DMS device 60 into a corresponding processing block, e.g., a facial expression recognition (FXR) block 44 or a gaze control (GZC) block 54, which in turn are configured to generate respective output files (arrows 144 and 154) and communicate the same to the DAS APPS block 40. Facial expressions can be used for various purposes, including sentiment analysis, which is useful, for instance, for adapting a voice user interface and its feedback to the driver 18. A better estimate of user gaze and facial expression would therefore lead to more accurate classification of the user's sentiment.
[0041] Other vision-based applications may be used along with or instead of the representative FXR block 44 and GZC block 54 without departing from the intended scope of the present disclosure.
[0042] As an example, a calculated line of sight determined in logic block 54 could be used by the DAS APPS block 40 to detect or estimate possible distraction of the driver 18, with the DAS APPS block 40 thereafter executing a control action responsive to the estimated alertness or distraction level, e.g., activating an alarm to alert the driver 18 and/or performing an autonomous control action such as steering or braking.
[0043] As noted above, the present method 100 is not limited to use with speech recognition and vision-based applications. For instance, one or more additional DAS devices 11X could be used aboard the motor vehicle 10, e.g., a heads up display (HUD) device 28 and/or a height-adjustable seat belt assembly 24.
[0044] Alternatively to or concurrently with operation of the HUD device 28, the associated control logic block 64 for the height-adjustable seat belt assembly 24 may output electronic control signals to raise or lower a shoulder harness or other restraint to a more comfortable or suitable position. Other DAS devices 11X that may benefit from improved locational accuracy of the 3D driver head position (P.sub.18) may be contemplated in view of the disclosure, such as but not limited to possible deployment trajectories of airbags, positioning of a rear view mirror, etc., and therefore the various examples described herein are illustrative and non-limiting.
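As a sketch of how such a DAS device might consume the triplet, the snippet below maps the z-component of a received head position onto a clamped percentage setpoint for an actuator such as the HUD projection height or the guide-loop height of the seat belt assembly 24. The head-height range and the linear mapping are hypothetical choices for illustration.

```python
def actuator_setpoint(head_z, z_min=0.9, z_max=1.3, lo=0.0, hi=100.0):
    """Map the z-component of the 3D head position (meters) onto a
    clamped 0-100% actuator setpoint, e.g., HUD projection height or
    seat belt guide height. The z_min/z_max head-height travel range
    is an assumed example, not a value from the disclosure."""
    z = min(max(head_z, z_min), z_max)    # clamp to the travel range
    frac = (z - z_min) / (z_max - z_min)  # normalize to 0..1
    return lo + frac * (hi - lo)
```

A linear map keeps the control logic trivial; a production calibration would more likely use a vehicle-specific lookup table tuned to the optics or restraint geometry.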
[0045] Those skilled in the art will recognize that the method 100 may be used aboard the motor vehicle 10 in a variety of embodiments.
[0046] In another aspect of the disclosure, the memory (M) of the electronic controller 50 may serve as a computer readable medium on which instructions for localizing the 3D driver head position (P.sub.18) are recorded, with execution of the instructions by the processor (P) causing the electronic controller 50 to receive the set of position signals (arrow CC.sub.I) described above.
[0047] Additionally, the execution of the instructions causes the electronic controller 50 to calculate the 3D head position (P.sub.18) using the position signals (arrow CC.sub.I) when the driver 18 is seated within the passenger compartment 16, and to transmit the 3D head position (P.sub.18) to the driver assist system (DAS) device(s) 11 for use in execution of a corresponding driver assist control function aboard the motor vehicle 10. Execution of the instructions in some embodiments causes the electronic controller 50 to transmit optimization request signals (arrow CC.sub.O) to the DAS device(s) 11 concurrently with the 3D head position (P.sub.18) to thereby request use of the 3D head position (P.sub.18) in an optimization subroutine of the DAS device(s) 11.
[0048] As will be appreciated by those skilled in the art in view of the foregoing disclosure, the method 100 enables localization of the 3D driver head position (P.sub.18) using position data already reported aboard the motor vehicle 10, without reliance on specialized sensors.
[0049] The detailed description and the drawings or figures are supportive and descriptive of the present teachings, but the scope of the present teachings is defined solely by the claims. While some of the best modes and other embodiments for carrying out the present teachings have been described in detail, various alternative designs and embodiments exist for practicing the present teachings defined in the appended claims. Moreover, this disclosure expressly includes combinations and sub-combinations of the elements and features presented above and below.