AUTONOMOUS VEHICLES AND METHODS OF USING SAME
20200331486 · 2020-10-22
Inventors
- Romeo Wieczorek (Stuttgart, DE)
- Reinhold Langbein (Stuttgart, DE)
- Matthias Koller (Stuttgart, DE)
- Yijun Zhao (Stuttgart, DE)
- Thomas Agung Nugraha (Stuttgart, DE)
- Gianluca Caretta (Stuttgart, DE)
- David-Kenneth Jaeger (Stuttgart, DE)
- Ilka Rötzer (Denkendorf, DE)
- Andreas Herrmann (Stuttgart, DE)
- Mukesh Patel (Stuttgart, DE)
- Jan Dormanns (Göttingen, DE)
- Torsten Weingärtner (Göttingen, DE)
- Yann Buchet (Göttingen, DE)
- Alf Liesener (Bruchköbel, DE)
- Stefanie Göttlicher (Bruchköbel, DE)
CPC classification
- PERFORMING OPERATIONS; TRANSPORTING: B60R2300/202, B60W50/14, B60K2360/167, B60W50/045, B60K35/29, B60K2360/149, B60K2360/84, B60W2420/40, B60W40/08, B60W2540/221, B60K35/28, B60W2540/22, B60W2420/54, B60K35/65, B60K2360/146, B60R1/00, B60K2360/186, B60K2360/682, B60N3/102, B60K35/60, B60K2360/741, B60W2420/403, B60K35/00, B60N3/18, B60K35/10
- PHYSICS: G06V20/597, G06V40/28
International classification
- PERFORMING OPERATIONS; TRANSPORTING: B60W50/04, B60K35/00, B60N3/18, B60R7/04, B60W40/08
Abstract
A vehicle includes one or more sensors arranged on at least one of a dashboard, a roof, and a center console of the vehicle, or one or more image capturing devices for capturing one or more images from a left side and a right side of the vehicle, an electronic control unit (ECU) configured to communicate with the one or more sensors or the one or more image capturing devices, and at least one of a morphing surface, a windshield display, and one or more displays configured to be controlled by the ECU, where the one or more sensors are arranged under a black panel surface.
Claims
1-39. (canceled)
40. A vehicle, comprising: one or more sensors arranged on at least one of a dashboard, a roof, and a center console of the vehicle, or one or more image capturing devices for capturing one or more images from a left side and a right side of the vehicle; an electronic control unit (ECU) configured to communicate with the one or more sensors or the one or more image capturing devices; and at least one of a morphing surface, a windshield display, and one or more displays configured to be controlled by the ECU, wherein the one or more sensors are arranged under a black panel surface, the black panel surface is comprised by a seamless surface of a dashboard which includes a structure of a partially translucent layer which is designed as a decorative cover layer, and a carrier layer having recesses in certain areas which are used to position sensors via fastening devices attached to the carrier layer so that no bumps on the top layer arise and thus appearance is not disturbed, and wherein the ECU receives data from the one or more sensors on one or more of a position of an occupant, a position of an object, a size of an object, or a shape of an object.
41. The vehicle according to claim 40, wherein the one or more sensors comprise at least one of a Time of Flight (ToF) sensor, a camera, an infrared (IR) sensor, a radar, an ultrasound, a capacitive sensor, a brightness sensor, and a LIDAR sensor.
42. The vehicle according to claim 40, wherein the one or more sensors comprises a plurality of sensors, and a first sensor of the plurality of sensors is configured to detect a position of the occupant to activate or deactivate a second sensor or a third sensor of the plurality of sensors.
43. The vehicle according to claim 42, wherein, in response to a first sensor detecting that a position of the occupant is outside a range of the second sensor and inside a range of the third sensor, the third sensor is activated, and in response to the first sensor detecting that a position of the occupant is outside a range of the third sensor and inside a range of the second sensor, the second sensor is activated.
44. The vehicle according to claim 43, wherein the second sensor and the third sensor are positioned on a left side and a right side of an instrument cluster display at a distance ranging from about 15 cm above to about 15 cm below the instrument cluster display.
45. The vehicle according to claim 40, wherein the one or more sensors are configured to detect a position of at least one of a center of a face of the occupant, an outer edge of a head of the occupant, a shoulder of the occupant, and a head of the occupant with respect to a shoulder of the occupant.
46. The vehicle according to claim 40, wherein each of the one or more sensors has a field of view of at least about 30 degrees and operates at a close range of at least about 5 cm.
47. The vehicle according to claim 40, wherein the vehicle comprises the windshield display, and content on the windshield display is configured to be moved depending on the data received from the one or more sensors.
48. The vehicle according to claim 47, wherein the content on the windshield display is configured to be moved to compensate for movement of eyes of the occupant with respect to at least one of an icon on the windshield display or an external object outside the vehicle.
49. The vehicle according to claim 40, wherein the one or more sensors are configured to detect one or more of an identification, a drowsiness or fatigue, a distraction, a head orientation, an eye gaze, a facial expression, a gender classification, an age classification, a body type, a quantity, a hand gesture, a thumbs up gesture, an open palm gesture, a fist or fist action gesture, a grabbing of the object, a releasing of the object, a proximity, and a proximity to the object of or by the occupant.
50. The vehicle according to claim 40, wherein the one or more sensors are configured to detect one or more of a type of the object, a size of the object, an orientation of the object, and a position of the object.
51. The vehicle according to claim 40, further comprising one or more of an air vent, dashboard lighting, switches, a smartphone, a cup holder, a door pocket, a door armrest, a center console, a trunk, a seat, a seat back, and a roof which is configured to be controlled by the ECU.
52. The vehicle according to claim 40, wherein the ECU is configured to initiate a self-test process to determine whether the one or more sensors and the ECU are operating properly, in response to detecting a malfunction of any of the one or more sensors and the ECU, the ECU is configured to display the detected malfunction or initialize a programmed malfunction protocol to self-cure the detected malfunction, in response to no malfunction being detected, the ECU is configured to read an input of a first sensor of the one or more sensors until an arm movement is detected by the first sensor, in response to an arm movement being detected by the first sensor, the ECU is configured to read a second sensor of the one or more sensors to determine whether a recognized movement or gesture is performed, and in response to determining that a recognized movement or gesture is performed, the ECU is configured to transmit a signal to one or more vehicle components based on the recognized movement or gesture.
53. A method of using a vehicle according to claim 40, the method comprising: initiating a self-test process to determine whether the one or more sensors and the ECU are operating properly; displaying the detected malfunction or initializing a programmed malfunction protocol to self-cure the detected malfunction in response to detecting a malfunction of any of the one or more sensors and the ECU; reading an input of a first sensor of the one or more sensors until an arm movement is detected by the first sensor in response to no malfunction being detected; reading a second sensor of the one or more sensors to determine whether a recognized movement or gesture is performed in response to an arm movement being detected by the first sensor; and transmitting a signal to one or more vehicle components based on the recognized movement or gesture in response to determining that a recognized movement or gesture is performed.
54. A vehicle, comprising: one or more sensors arranged on at least one of a dashboard, a roof, and a center console of the vehicle; an electronic control unit (ECU) configured to communicate with the one or more sensors; and at least one morphing surface configured to be controlled by the ECU, wherein the ECU receives data from the one or more sensors on one or more of a position of an occupant, a position of an object, a size of an object, or a shape of an object, and wherein one or more of a cup holder, a door pocket, a door armrest, a center console, a trunk, a seat, a seat back, and a roof comprises the at least one morphing surface which is configured to be controlled by the ECU.
55. A vehicle, comprising: one or more image capturing devices for capturing one or more images from a left side and a right side of the vehicle; an electronic control unit (ECU) configured to communicate with the one or more image capturing devices; and one or more displays configured to be controlled by the ECU, wherein the ECU is configured to control display of the one or more captured images on the one or more displays, the ECU is configured to initiate a self-test process to determine whether the one or more displays, the one or more image capturing devices, and the ECU are operating properly, in response to detecting a malfunction of any of the one or more displays, the one or more image capturing devices, and the ECU, the ECU is configured to display the detected malfunction or initialize a programmed malfunction protocol to self-cure the detected malfunction, in response to no malfunction being detected, the ECU is configured to read or receive vehicle data to evaluate driving conditions, in response to receiving vehicle data, the ECU is configured to determine whether to use a comfort display or a full display, in response to determining to use the full display, the ECU is configured to process the one or more captured images and display the full display on the one or more displays, and in response to determining to use the comfort display, the ECU is configured to process the one or more captured images and display the comfort display on the one or more displays.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0060] The foregoing summary, as well as the following detailed description, will be better understood when read in conjunction with the appended drawings. For the purpose of illustration, certain examples of the present disclosure are shown in the drawings. It should be understood, however, that the present disclosure is not limited to the precise arrangements and instrumentalities shown. The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate an implementation of system, apparatuses, and methods consistent with the present disclosure and, together with the detailed description, serve to explain advantages and principles consistent with the present disclosure, wherein:
DETAILED DESCRIPTION
[0088] It is to be understood that the disclosure is not limited in its application to the details of construction and to the arrangements of the components set forth in the following description or illustrated in the drawings. The Figures and written description are provided to teach those skilled in the art to make and use the inventions for which patent protection is sought. The disclosure is capable of other embodiments and of being practiced and carried out in various ways. Those skilled in the art will appreciate that not all features of a commercial embodiment are shown for the sake of clarity and understanding. Those skilled in the art will also appreciate that the development of an actual commercial embodiment incorporating aspects of the present disclosure may require numerous implementation-specific decisions to achieve the developer's ultimate goal for the commercial embodiment.
[0089] In addition, it is to be understood that the phraseology and terminology employed herein are for the purpose of describing the present disclosure and should not be regarded as limiting. For example, the use of a singular term, such as "a" or "an", is not intended as limiting of the number of items. Also, the use of relational terms, such as but not limited to, top, bottom, left, right, upper, lower, down, up, and side, are used in the description for clarity in specific reference to the Figures and are not intended to limit the scope of the present disclosure. Further, it should be understood that any one of the features may be used separately or in combination with other features. Other systems, methods, features, and advantages will be or become apparent to those skilled in the art upon examination of the Figures and the description. The term driver is used throughout this disclosure but is not limited to a person who is operating or controlling the vehicle; it may refer to any vehicle occupant, person, passenger, or user inside the vehicle, or, in certain circumstances, a person who is outside the vehicle but controlling the vehicle or interested in movement of the vehicle. It is intended that all such additional systems, methods, features, and advantages be included within this description, and be within the scope of the present disclosure.
[0090] Sensor Arrangements.
[0092] Three eye tracking sensors 102, 103, 104 are attached to the dashboard 110; each of the sensors 102, 103, 104 may include an IR lamp and a camera. In an example, the sensors 102, 103, 104 are mounted on the dashboard 110 as illustrated.
[0093] For example, movement of the center of the face may be detected by a first of the sensors 102, 103, 104 using parameters which indicate whether the center of the face has moved out of the detection range of an active sensor of the sensors 102, 103, 104. A number of different detecting processes may also be used to detect the position of the head and whether the head has moved outside the range of a sensor 102, 103, 104. For example, the outer edge of the head or the axis of the head may be detected/followed. Further, a position of the user's shoulder may be detected/followed, and a position of the head with respect to the shoulders may be followed. Once the position is detected, the other sensor that can detect the face is activated. For example, one of the sensors 102, 103, 104 may be used to detect the position of the user while two of the sensors 102, 103, 104 may be activated or deactivated depending on the detected position of the user in order to detect the user's face. Any one of the sensors 102, 103, 104 may be a time of flight (hereinafter ToF) sensor or a camera.
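The hand-over between sensors described above can be sketched as follows. This is a minimal illustration under assumed names and detection ranges; the patent does not give concrete coordinates or an implementation, so everything here beyond the activate/deactivate logic is hypothetical.

```python
# Hypothetical sketch: the sensor whose detection range covers the
# detected face-center position becomes the active face-tracking
# sensor; the others are deactivated.
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class FaceSensor:
    name: str
    range_min: float  # horizontal limits of the face-center detection
    range_max: float  # range, in metres (assumed coordinate system)
    active: bool = False

    def covers(self, x: float) -> bool:
        return self.range_min <= x <= self.range_max

def update_active_sensor(face_x: float,
                         sensors: List[FaceSensor]) -> Optional[FaceSensor]:
    """Activate the first sensor whose range covers the detected face
    center and deactivate the others, mirroring the hand-over between
    sensors 102, 103, 104 when the head leaves a sensor's range."""
    chosen: Optional[FaceSensor] = None
    for s in sensors:
        s.active = chosen is None and s.covers(face_x)
        if s.active:
            chosen = s
    return chosen

left = FaceSensor("left", -0.6, 0.0)
right = FaceSensor("right", 0.0, 0.6)
active = update_active_sensor(0.3, [left, right])  # head moved to the right
```

The same routine would run each time the tracked position (face center, head edge, or head-relative-to-shoulder) is updated.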
[0094] In an example, each of the sensors 102, 103, 104 may have a 30° FOV, and with two of the sensors 102, 103, 104 attached, a rotation of the head can be detected up to 100° in the horizontal direction. In another example, the FOV of each sensor may be at least, at most, or about any of 10°, 20°, 30°, 40°, 50°, 60°, 70°, 80°, 90°, 100°, 110°, 120°, 130°, 140°, or 150°. Additionally, the rotation of the head of the driver 100 may be detected in the horizontal direction over at least, at most, or about any of 20°, 30°, 40°, 50°, 60°, 70°, 80°, 90°, 100°, 110°, 120°, 130°, 140°, 150°, or 160°. In one example, the distance between two of the sensors 102, 103, 104 may be approximately 0.5 m. In another example, the distance between the two sensors may be at least, at most, or about any of 0.1 m, 0.2 m, 0.3 m, 0.4 m, 0.5 m, 0.6 m, 0.7 m, 0.8 m, 0.9 m, or 1 m. The distance of the occupant 100 to each of the sensors 102, 103, 104 can be up to one meter; the minimum distance differs from person to person, but may be about 10 cm, below which the sensor 102, 103, 104 may become saturated.
[0097] In this example, there is a ToF sensor 105 in the roof console or at the transition between the windscreen and roof 112 on the driver's 100 side for determining the head position of the driver 100. In addition, another ToF sensor 106 may be located in the center console for Human Machine Interface (hereinafter HMI) or other applications.
[0098] The ToF sensor 105 may be used to detect the position of the occupant 100. Mainly the head position is detected, as well as the body. This information may be used to adjust the Head Up Display (hereinafter HUD) according to the head position since objects displayed in the HUD must be moved according to the head position. For example, if a particular location is to be marked with a symbol in the HUD, the location of the head should be known accurately.
[0099] The sensor 106 in the center console may detect gestures and approach, and may be used for swiping the display contents from the HMI/display of the center console to the dashboard displays. In addition, when approaching the display, a user 100 may see the position of his or her hand with respect to the display content. The displayed hand can be used to enlarge symbols or fields that may otherwise be missed. The displayed hand combines gesture control with a touch display. According to this example, the display may then be divided into quadrants, and the respective quadrant can then be zoomed in or zoomed out.
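The quadrant behaviour described above can be sketched as follows; the function names and normalised 0..1 coordinates are assumptions for illustration, not from the patent.

```python
# Hypothetical sketch: the display is divided into four quadrants and
# the quadrant under the approaching hand is zoomed in.
def quadrant_for_hand(x: float, y: float) -> int:
    """Return the quadrant index (0..3, numbered left-to-right then
    top-to-bottom) containing a normalised hand position."""
    col = 0 if x < 0.5 else 1
    row = 0 if y < 0.5 else 1
    return row * 2 + col

def zoomed_quadrant(x: float, y: float, hand_near: bool):
    """Zoom in on the quadrant under the hand only while the sensor
    reports the hand approaching the display; otherwise no zoom."""
    return quadrant_for_hand(x, y) if hand_near else None
```

In a real system the approach detection would come from the proximity output of sensor 106 rather than a boolean flag.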
[0102] Further aspects include, for example, moving content on a display, such as a head-up display, in particular on the windshield, depending on head, eye and/or body position and/or on the field of view. Content may be adapted in relation to the vehicle surroundings by taking at least three positions into account: the eyes, the image being displayed, and the actual object.
[0103] In an example, to compensate for a relative movement between eyes and image on display and/or actual object, the position of an icon on a display may be adapted. There may be a fixed relationship between the position of the eyes and the display such that the icon moves on the display to follow an object. In another example, the eyes may move relative to the display and the icon may follow this movement, i.e. the icon aligns itself relative to eyes. In a further example, a warning icon moves into the field of view of the eyes due to movement of the vehicle relative to the object.
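The compensation described above can be sketched geometrically. The math below is an assumption for illustration (the patent gives no formulas): the icon is drawn where the line from the eyes to the external object crosses the display plane, here approximated as the vertical plane y = plane_y in a vehicle-fixed frame (x right, y forward, z up).

```python
# Hypothetical sketch: keep an icon aligned with an external object by
# intersecting the eye-to-object line with the display plane.
from typing import Tuple

Vec3 = Tuple[float, float, float]

def icon_position(eye: Vec3, obj: Vec3, plane_y: float) -> Tuple[float, float]:
    """Intersect the eye-to-object line with the plane y = plane_y and
    return the (x, z) point at which to draw the icon."""
    ex, ey, ez = eye
    ox, oy, oz = obj
    t = (plane_y - ey) / (oy - ey)  # fraction of the way to the object
    return (ex + t * (ox - ex), ez + t * (oz - ez))

# When the eyes move, recomputing keeps the icon on the line of sight:
straight = icon_position((0.0, 0.0, 1.2), (0.0, 10.0, 1.2), 1.0)
shifted = icon_position((0.2, 0.0, 1.2), (0.0, 10.0, 1.2), 1.0)
```

Recomputing this intersection whenever the tracked eye position changes realises the "icon aligns itself relative to eyes" behaviour.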
[0104] Other aspects may include attracting the attention of a driver 100 to ensure awareness. In order to attract the attention of a driver 100, a light, a vibration and/or a sound effect may be triggered to alert of a dangerous circumstance. The light effects can be in the field of view of the driver 100, provided by light modules already installed within the car, such as door illumination, dashboard illumination, window illumination, and the like. Vibration can be generated by the seat, the steering wheel or an element in direct contact with the driver 100. The effect may be preselected by the driver 100, or automatically selected based on detected conditions and/or a detected status of the driver 100.
[0105] In addition, sensor arrangements may provide additional control features. In an example, selection of a menu function, changing between menus, and other user interface functions may be performed via eye, head and/or body movement detection. In one example, recognizing the identity or condition of a driver 100 results in automatic vehicle configurations such as seat position, steering wheel position, language, address book, favorites, navigation places, limiting of velocity, or blocking of selected or all functions. Recognition may be achieved via pupil identification or face identification, and may include detection of a condition of the driver 100 such as mood, tiredness, and drug or alcohol influence.
[0106] The sensors 102, 103, 104, 105, 106 described above may be used to provide the example functions summarized in Table 1 below.
TABLE 1 - Example Relevance of Sensor Arrangements (ratings given as: sensors 102, 103, 104 of FIG. 1 / sensor 105 of FIG. 2 / sensor 106 of FIG. 3)

Occupant Monitoring: + / ++ / -
- Driver identification; drowsiness and fatigue detection; distraction detection; head orientation; eye gaze tracking; facial expression and analysis; number of occupants; occupant classification (gender, age, body type, etc.)

Gesture Recognition: o / - / ++
- Hand gestures of driver and passenger (left and right hand): thumb up, open palm, fist, fist action; grabbing of object; releasing of object; proximity to an object; proximity to a sensor

Object Recognition/Classification: o / + / ++
- Type of object (bottle, smart phone, spectacles, keys); classification of object (size, orientation, position)
[0109] Materials, actuators, and structures used for forming a morphing surface such as morphing surface 236 are known to a person of ordinary skill in the art, for example, as described in U.S. Pat. Nos. 7,858,891 B2 and 10,173,615 B2.
[0112] One practical example of the above process being implemented may be for controlling the morphing surface 236 to accommodate a bottle of a driver 100. In this example, a driver 100 may grab a bottle out of his pocket or bag, and the ToF sensor 105 may detect arm movement. The ECU 220 may determine that the bottle does not approach the cup holder, and detect the bottle in the driver's hand. The one or more sensors 102, 103, 104, 105, 106 may estimate the size of the bottle for classification purposes. If the driver 100 opens the bottle cap, the one or more sensors 102, 103, 104, 105, 106 may monitor the movement of the hand and arm of the driver 100, and the ECU 220 may determine that this movement is not relevant for an actuation mechanism to be initiated. However, the classification of the bottle may change from closed bottle to open bottle. If the driver 100 drinks from the bottle, the ECU 220 may again determine that this is not relevant for an actuation mechanism. If the driver 100 closes the bottle and moves the bottle towards the cup holder, the one or more sensors 102, 103, 104, 105, 106 may detect the movement, and the ECU 220 may determine the movement is a relevant movement and calculate the position of the movement. The ECU 220 may calculate the expected position on the morphing surface 236 and the size of the bottle, and actuate the forming of a hole in the morphing surface 236 based on the calculated expected position and the size of the bottle. The sensors 102, 103, 104, 105, 106 may detect placement of the bottle, at which point movement has stopped, and the ECU 220 may initiate a command to lock the bottle and send the command to the morphing surface 236 actuators. At this point, the driver 100 may release his or her grip on the bottle, and the sensors 102, 103, 104, 105, 106 may detect the released hand.
The ECU 220 may store the position and classification of the bottle including the consumed volume and type of drink, and may forward this information to an Internet of Things (IoT) environment.
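The bottle scenario above can be sketched as a simple state machine. The event and action names are invented for illustration; the point being shown is that only the movement toward the cup holder triggers actuation of the morphing surface 236, while other hand movements (such as drinking) merely update or preserve the classification.

```python
# Hypothetical state-flow sketch of the ECU 220 handling the bottle
# example; unknown (state, event) pairs keep the state and trigger no
# actuation.
TRANSITIONS = {
    ("idle", "arm_movement"): ("tracking", None),
    ("tracking", "bottle_in_hand"): ("holding_closed", "estimate_size"),
    ("holding_closed", "cap_opened"): ("holding_open", "reclassify_open"),
    ("holding_open", "cap_closed"): ("holding_closed", "reclassify_closed"),
    ("holding_closed", "moved_to_cupholder"): ("placing", "form_hole"),
    ("placing", "movement_stopped"): ("placed", "lock_bottle"),
    ("placed", "hand_released"): ("idle", "store_and_forward"),
}

def ecu_step(state: str, event: str):
    """Advance the simplified ECU state machine by one sensor event."""
    return TRANSITIONS.get((state, event), (state, None))

# Replay the scenario from the text:
state, actions = "idle", []
for event in ["arm_movement", "bottle_in_hand", "cap_opened", "drinking",
              "cap_closed", "moved_to_cupholder", "movement_stopped",
              "hand_released"]:
    state, action = ecu_step(state, event)
    if action:
        actions.append(action)
```

Note that the "drinking" event falls through without an action, matching the ECU 220 deeming it irrelevant for actuation.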
[0113] The above-described example is only one practical example of the illustrated process.
[0114] Uniform Overlay Surface for Sensor and Display.
[0116] The topcoat may be decorated by various coating methods such as varnishing, PVD, IML, IMD, PUR flooding, and others. In addition, paints that do not serve for appearance, but are applied against environmental influences such as scratching or imprints, may be used. A self-healing coating can also be applied. Another example includes an anti-reflective surface to avoid unwanted reflections.
[0118] Sensors which can be installed include but are not limited to ToF sensors, cameras, IR sensors, radar, ultrasound, capacitive sensors, brightness sensors, LIDAR sensors, among others.
[0119] In a further example, a plurality of frameless displays can be mounted. These displays may be technically and operationally linked with one another via software so that they can appear as a single, wide screen when viewed from the outside.
[0120] Integration of Holographic Displays in Bionic Structure or Dashboard.
[0122] The recent trend in automotive displays has been the use of Heads-Up Displays, which allow the projection of information in front of the driver and align the projected images and objects with reality. The projection medium can be a combiner screen, the car windshield, or a hologram that is projected in front of the car. In all these examples, a calibration of the display device is typically needed so that the alignment can be done precisely. It is easy to do this alignment with precise measuring equipment during research and development, but not in mass production.
[0123] A calibration method is described below that includes steps that can be performed in mass production. For example, these steps can be performed on an assembly line, such as an assembly line of the module supplier (tier 1) of the dashboard integrating the display device, or a car assembly line where all the components, including the display device, the windscreen, the car body and the dashboard, are joined. The disclosed method may, for example, be used to anticipate the tolerances from the car body, the dashboard, and the windscreen mounting, and to provide means to measure and compensate for the deviations in an efficient way.
[0125] Full Windshield Head Up Display.
[0128] The windshield 400 may be provided with a foil to enhance image quality, either applied on the inside or between two glass layers. The foil may be arranged so that it can reflect specific wavelengths. The foil may be placed on the windshield 400 by gluing or by adhesion, and can also be placed between the glass layers of the windshield 400 to substitute for the typically used foil, which serves safety aspects.
[0129] In an example, advertisements may be displayed on the windshield 400 when the car is not driving. Pedestrians or other road users may also be able to see the advertisements on the display which is used as a screen.
[0130] In one example, the brightness is the maximum brightness allowed by the projector 410. The projector 410 may conform to laser class 1, and the minimum contrast to achieve may be 1:5000. The optical output power of the projector 410 may be 0.01 mW, and the luminance may be 100 candela.
[0131] Because the windshield 400 is curved, sharpness may need to be taken into consideration. In typical prior applications, stitching has not been implemented. In this example, if there is a pair of projectors 410, stitching in the middle of the two pictures may be applied. The advantage is a bigger area for projection. With that, a seamless presentation of the display with uniform luminance may be achieved. A single calibration may be done when installing the system, where the projectors 410 are aligned with the shape of the windshield 400.
[0134] Transparent Dashboard with Integrated Light Guide.
[0135] In recent trends in the automotive industry, more and more parts of the vehicle dashboard are made of plastic. At the same time, more lighting systems and light guides are used in the interior of the car. Also, new developments allow the possibility of making plastic parts using advanced technologies such as bionic structures, 3D printing, etc. However, separate lighting units/light guides are still used and attached to the main structure for lighting.
[0137] Moving Touch Pad to Follow Passenger Position.
[0138] As autonomous vehicles become more prevalent, vehicle drivers and passengers may desire to select and change positions more freely. As the focus of the driver on the pure driving function can be relaxed, the strict requirements for passive safety (airbags, seat belts) can also be partly relaxed and replaced by better active safety. Self-driving vehicles according to SAE Levels 3 and 4 may allow drivers to recline to a more relaxed position (for Level 3), similar to aircraft seats, and potentially to a totally reclining position (for Level 4). As a result, the established ergonomics and operation of car systems such as infotainment, car settings, climate control and others may no longer be accessible or work properly. Inaccessibility of the car systems results in the driver/passenger not being able to easily reach the controls, which are usually mounted in the dashboard or the center console of a conventional car.
[0139] Accordingly, an exemplary autonomous driving vehicle may include a movable center console and a touch pad mounted on the center console at an adaptable angle. The center console may have several degrees of freedom (e.g. movable in the x-direction and the z-direction, and rotatable about the z-axis). The touch pad may be turned about several axes (for example, the y-axis and the z-axis). As such, the touch pad may be properly seen and reached by the driver/passenger in at least three positions: normal driving position, relaxed driving position, and sleeping/lying position. Sensors in the car and in the car drive train can determine the status of the car and the driver, and based on the determined status, the sensors may send signals directing a change in the position to a maximum comfort level.
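The posture-to-pose mapping described above can be sketched as follows. The three postures come from the text; the offsets and tilt angles are invented for illustration only and carry no meaning from the patent.

```python
# Hypothetical mapping from a detected seating posture to a target
# console/touch-pad pose.
CONSOLE_POSES = {
    # posture: (console_x_mm, console_z_mm, pad_tilt_deg about the y-axis)
    "normal_driving": (0, 0, 0),
    "relaxed_driving": (150, -40, 20),
    "sleeping_lying": (400, -120, 45),
}

def console_pose_for(posture: str):
    """Return the target console pose for a detected posture, falling
    back to the normal driving pose for anything unrecognised."""
    return CONSOLE_POSES.get(posture, CONSOLE_POSES["normal_driving"])
```

In practice the posture would be classified from the occupant-monitoring sensors, and the returned pose sent to the console actuators.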
[0140] Display Cooling with Ventilation Elements.
[0141] As cars continue to have more and more additional monitors in a dashboard, climate management for the electronic components of the displays is already becoming a problem. Climate management is more and more important as the number of displays increases because the overheating of displays may also affect the vehicle occupants' climate.
[0142] Modifications to air conditioner openings and the elimination of heat accumulation are described. In an example, a ventilation system includes an air duct system positioned within the dashboard, which receives its air from the existing air conditioning system before the air passes through a directional distributor into the vehicle interior. For this purpose, a defined opening feeds into a designated channel structure that is guided close to the back of the monitor. There, the air warms and therefore continues to rise up and to the rear. This arrangement allows the cooling outlets to be positioned below the monitors so that the air is led up past the monitors. The air supply upstream of the distributor should not be switched off. The locking mechanisms, for example via actuators, may lie between the distributor system and the lamella outlets. Accordingly, a ventilation system with an internal air-channel structure is provided.
[0143]
[0144]
[0145] Touchskin Keys.
[0146]
[0147] Referring to
[0148] As illustrated in
[0149]
[0150] Referring to
[0151] Referring to
[0152] See Through Vehicle Display.
[0153]
[0154] Referring to
[0155]
[0156] The display 800 may present a number of different values or warnings. For example, the display 800 may give visual information 810 on the distance to surrounding objects. The display 800 may provide warnings such as sounds 820 or flashing lights when an object is within a threshold distance. The display 800 may provide different warning sounds 820 or lights depending on the size or type of the object detected, on whether the object is moving or stationary, and on whether it is approaching the vehicle or moving away from it. In an example, the see-through display 800 may blend in with the interior of the vehicle and not operate the see-through function until it is activated by a user. In another example, the see-through function is operated automatically when the vehicle is in a specific mode, such as a parking mode, or when the car is in reverse.
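The distance-threshold and object-dependent warning selection described above might look like the following sketch. The function name, parameters, and the escalation order (none, visual, sound, urgent) are assumptions; the description only states that warnings vary with distance, object size/type, and motion.

```python
def select_warning(distance_m: float,
                   threshold_m: float,
                   large_object: bool,
                   moving: bool,
                   approaching: bool) -> str:
    """Pick a warning level for the see-through display 800.

    Hypothetical escalation: a closing object gets the strongest cue,
    a large or moving object gets an audible warning, and a static
    small object within the threshold gets a flashing light only.
    """
    if distance_m > threshold_m:
        return "none"      # object is outside the threshold distance
    if moving and approaching:
        return "urgent"    # object is closing on the vehicle
    if large_object or moving:
        return "sound"     # sound 820 for large or moving objects
    return "visual"        # flashing light for a static small object
```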
[0157] It should be appreciated that the see-through display 800 significantly enhances the driver experience by allowing a user to see a position that is typically impossible to see. For example, in larger vehicles such as pickup trucks, the door occupies a very large area and would prevent a user from seeing a short object or person, such as a child standing beside the door, whereas a vehicle that is low to the ground provides a significantly larger view to the side of the vehicle. In addition, such see-through systems may significantly ease compliance with the regulations of certain countries, such as Japan, where vehicles must allow the driver to view poles of certain heights beside the vehicle in order to pass compliance. While the see-through display 800 is described in reference to autonomous vehicles, it should be appreciated that this feature, as well as all features of this disclosure, is also described in reference to non-autonomous vehicles, and this disclosure covers all vehicles.
[0158] Dashboard with Rearranged Camera Monitoring Systems.
[0159]
[0160] Referring to
[0161] As a result of the arrangement of the displays 1, 2, 3, 4, 5, other passengers are also able to see the displayed image. Further, content that is not governed by legal requirements can be projected onto the image. For example, augmented reality applications for warning or other informational purposes may be projected onto the image.
[0162] In addition, such displayed images may also be used to adjust the FOV of the simulated mirror. In this example, the driver transfers the image from 1 to 3, adjusts the FOV with touch or gesture control on the HMI, and then sends the image back to screen 1.
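The transfer-adjust-return workflow described above can be sketched as a small state object. The class, its method names, and the FOV limits are illustrative assumptions, not part of the disclosure; only the workflow itself (move the image to another screen, adjust the FOV, send it back) comes from the text.

```python
class CmsRouter:
    """Routes a simulated-mirror image between dashboard screens."""

    def __init__(self, home_screen: int = 1):
        self.home_screen = home_screen
        self.active_screen = home_screen
        self.fov_deg = 60.0  # assumed default horizontal FOV

    def move_to(self, screen: int) -> None:
        """Transfer the mirror image to another display, e.g. screen 3."""
        self.active_screen = screen

    def adjust_fov(self, delta_deg: float) -> None:
        """Widen or narrow the simulated-mirror FOV via touch or gesture,
        clamped to assumed limits of 20-120 degrees."""
        self.fov_deg = max(20.0, min(120.0, self.fov_deg + delta_deg))

    def send_home(self) -> None:
        """Send the adjusted image back to the original screen."""
        self.active_screen = self.home_screen
```

A usage matching the example in the text: `move_to(3)`, adjust the FOV on the HMI, then `send_home()` returns the image to screen 1 with the new FOV retained.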
[0163]
[0164] However, in certain jurisdictions, the ISO standard requires that CMS information captured from the left side of a vehicle be displayed on a display to the left of the driver, and that information captured from the right side of a vehicle be displayed on a display to the right of the driver. The reason is that drivers tend to look toward the side on which they sense a danger, for example, based on the noise of another vehicle from that side. Thus, a display should be on the same side as the side where the image is being captured. This arrangement is mandatory in a number of jurisdictions, including Europe, Japan, and Korea.
[0165] To meet regulatory requirements while still offering a conveniently displayed, combined, stitched image, a system and algorithm may be used to display either a stitched image (hereinafter a comfort display) or separate images on displays to the left and right of a vehicle occupant (hereinafter an ISO-compliant display).
[0166] Referring back to
[0167] A CMS system, as described, may provide the required mirror/CMS information in the most convenient way, maintain improved safety features by combining the information from the vehicle, the ADAS system and the CMS, switch to safe and ISO-compliant information when required, and free the displays 1, 2, 3, 4, 5, 900 for other purposes when they are not required for the CMS.
[0168]
[0169] In an example, the ISO-compliant display may be displayed in a single display of the displays 1, 2, 3, 4, 5, 900. That is, if a display 1, 2, 3, 4, 5, 900 is large enough, the full CMS may be shown in a window or overlay. Displaying in a single display may provide additional awareness for the danger situation and better catch the attention of a driver 100. Also, once the danger is determined to be no longer at issue, the ISO-compliant CMS image can fade into a comfort display instead of switching off the display 1, 2, 3, 4, 5, 900 or returning it to its original content.
[0170] In one example, the criteria for determining whether a comfort display should be used, i.e. whether it is safe to use a comfort display, or whether an ISO-compliant display should be used, is provided in Table 2.
TABLE 2
Criteria for Safe/Unsafe Driving and Comfort Display

  Speed    Variation of     Duration   Number     Blind spot   Brake activation   Decision
  (km/h)   speed (km/h)                of lanes   warning      (>0.1 g)
  <6       x                x          x          x            x                  show full CMS
  >6       +/-10            1 minute   1          off          no                 show comfort CMS
  >6       x                x          >1         off          no                 show comfort CMS
  x        x                x          x          x            yes                show full CMS
  x        x                x          x          on           x                  show full CMS
[0171] In Table 2, an x signifies that the input is not used for a particular scenario. Thus, in the first example, when the speed of the vehicle is less than 6 km/h, the full CMS is displayed. In another example, when the speed is greater than or equal to 6 km/h, the speed varies by no more than +/-10 km/h over a duration of 1 minute, there is 1 lane, the blind spot warning is off, and there is no brake activation, the comfort display is shown. In another example, when the speed is greater than or equal to 6 km/h and there is more than one lane, the comfort display is shown. In further examples, when the brake is activated, or when the blind spot warning is on, the full CMS is shown. These examples are provided only for explanation, and it should be appreciated that the inputs for determining whether or not it is safe to show a comfort display are not limited to these examples.
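The decision rules of Table 2 can be expressed as a short function. This is a sketch under stated assumptions: the parameter names, the precedence of the safety-critical rows, and the fallback to the full CMS when no row matches are choices made here for illustration, not details given in the disclosure.

```python
def cms_mode(speed_kmh: float,
             speed_stable_1min: bool,  # speed within +/-10 km/h for 1 minute
             lanes: int,
             blind_spot_warning: bool,
             brake_over_0_1g: bool) -> str:
    """Decide between the full (ISO-compliant) CMS and the comfort
    display according to the rows of Table 2."""
    if speed_kmh < 6:
        return "full CMS"      # row 1: low-speed maneuvering
    if brake_over_0_1g:
        return "full CMS"      # row 4: hard braking (>0.1 g)
    if blind_spot_warning:
        return "full CMS"      # row 5: blind spot warning active
    if lanes > 1:
        return "comfort CMS"   # row 3: multi-lane road, no warnings
    if speed_stable_1min:
        return "comfort CMS"   # row 2: steady speed, single lane
    return "full CMS"          # assumed safe fallback for unmatched cases
```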
[0172] It will be appreciated by those skilled in the art that changes could be made to the embodiments described above without departing from the broad inventive concept thereof. It is understood, therefore, that the present disclosure disclosed herein is not limited to the particular embodiments disclosed, and it is intended to cover modifications within the spirit and scope of the present disclosure.