COCKPIT SYSTEM AND CONTROL SYSTEM
20260001570 · 2026-01-01
CPC classification
B60N2220/20
PERFORMING OPERATIONS; TRANSPORTING
B60W50/14
PERFORMING OPERATIONS; TRANSPORTING
B60N2/0023
PERFORMING OPERATIONS; TRANSPORTING
G06V20/597
PHYSICS
B60W2540/223
PERFORMING OPERATIONS; TRANSPORTING
G10L15/22
PHYSICS
B60R2300/8006
PERFORMING OPERATIONS; TRANSPORTING
B60N2210/00
PERFORMING OPERATIONS; TRANSPORTING
B60R1/20
PERFORMING OPERATIONS; TRANSPORTING
International classification
B60W50/14
PERFORMING OPERATIONS; TRANSPORTING
B60N2/00
PERFORMING OPERATIONS; TRANSPORTING
B60R1/20
PERFORMING OPERATIONS; TRANSPORTING
Abstract
A cockpit system for use in a vehicle is provided. The cockpit system includes a camera, a flexible touch screen, a steering wheel, and a controller. The camera provides a driving image. The flexible touch screen has a flexible mechanism to control a flexible state of the flexible touch screen. The steering wheel has a foldable mechanism to control a folding state and turning of the steering wheel, and the flexible touch screen is arranged on the steering wheel. The controller is coupled to the camera, the flexible touch screen, and the steering wheel, and the controller determines an operational requirement of a driver based on the driving image and controls the flexible mechanism and the foldable mechanism according to the operational requirement.
Claims
1. A cockpit system for use in a vehicle, the cockpit system comprising: a camera, providing a driving image; a flexible touch screen, having a flexible mechanism to control a flexible state of the flexible touch screen; a steering wheel, having a foldable mechanism to control a folding state of the steering wheel, the flexible touch screen being arranged on the steering wheel; and a controller, coupled to the camera, the flexible touch screen, and the steering wheel, determining an operational requirement of a driver according to the driving image, and controlling the flexible mechanism and the foldable mechanism according to the operational requirement.
2. The cockpit system according to claim 1, further comprising a microphone for providing a driver's voice, wherein the controller is further coupled to the microphone to determine the operational requirement based on the driving image and the driver's voice.
3. The cockpit system according to claim 2, wherein the controller determines whether at least one voice command exists in the driver's voice by applying a voice recognition algorithm to determine the operational requirement based on the at least one voice command.
4. The cockpit system according to claim 1, further comprising a driver seat sensor for providing a seat inclination angle signal, wherein the controller is further coupled to the driver seat sensor to determine the operational requirement based on the driving image and the seat inclination angle signal.
5. The cockpit system according to claim 1, wherein the controller determines whether the driver has a preset steering wheel setting based on the driving image by applying a face recognition algorithm to determine the operational requirement of the driver.
6. The cockpit system according to claim 1, wherein the controller determines whether the driver has entered a sleep state based on the driving image by applying a passenger monitoring system algorithm to determine the operational requirement of the driver.
7. The cockpit system according to claim 1, wherein the controller determines a passenger status of the driver based on the driving image by applying a posture recognition algorithm to determine the operational requirement of the driver.
8. The cockpit system according to claim 1, wherein the controller determines a passenger status of the driver based on the driving image by applying a gesture recognition algorithm to determine the operational requirement of the driver.
9. The cockpit system according to claim 1, wherein when the operational requirement is manual driving, the steering wheel is fully unfolded and oriented toward the driver, and the flexible touch screen is bent to expose a portion of the flexible touch screen to the driver.
10. The cockpit system according to claim 1, wherein when the operational requirement is a tablet operation, the steering wheel and the flexible touch screen are fully unfolded and oriented toward the driver.
11. The cockpit system according to claim 10, further comprising a front display and/or at least one head-up display, wherein at least one of the front display and the at least one head-up display is synchronized with the flexible touch screen.
12. The cockpit system according to claim 1, wherein when the operational requirement is a laptop operation, the steering wheel and the flexible touch screen are unfolded and oriented toward the driver.
13. The cockpit system according to claim 1, wherein when the operational requirement is a sleep operation, the steering wheel and the flexible touch screen are fully folded and retracted into an accommodation space of the vehicle.
14. The cockpit system according to claim 1, wherein the foldable mechanism of the steering wheel further controls a turning of the steering wheel.
15. A control system for use in a vehicle, the control system comprising: a camera, providing an occupant image; a flexible touch screen, having a flexible mechanism to control a flexible state of the flexible touch screen; and a controller, coupled to the camera and the flexible touch screen, determining an operational requirement of an occupant according to the occupant image, and controlling the flexible mechanism according to the operational requirement.
16. The control system according to claim 15, further comprising: a steering wheel, coupled to the controller, having a foldable mechanism to control a folding state and turning of the steering wheel, the flexible touch screen being arranged on the steering wheel, wherein the controller further controls the foldable mechanism according to the operational requirement.
17. The control system according to claim 15, further comprising a microphone for providing an occupant's voice, wherein the controller is further coupled to the microphone to determine the operational requirement based on the occupant image and the occupant's voice.
18. The control system according to claim 15, further comprising an occupant seat sensor for providing a seat inclination angle signal, wherein the controller is further coupled to the occupant seat sensor to determine the operational requirement based on the occupant image and the seat inclination angle signal.
19. The control system according to claim 15, wherein the controller determines whether the occupant has entered a sleep state based on the occupant image by applying a passenger monitoring system algorithm to determine the operational requirement of the occupant.
20. The control system according to claim 15, wherein the controller determines a passenger status of the occupant based on the occupant image by applying a posture recognition algorithm to determine the operational requirement of the occupant.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
DESCRIPTION OF THE EMBODIMENTS
[0023] Unless otherwise defined, all terminologies (including technical and scientific terminologies) used herein have the same meaning as commonly understood by people having ordinary skill in the art to which the disclosure belongs. It is understood that these terminologies, such as those defined in commonly used dictionaries, should be interpreted as having meanings consistent with the relevant art and the background or context of the disclosure, and should not be interpreted in an idealized or overly formal way, unless otherwise defined in the embodiments of the disclosure.
[0024] It should be understood that, although the terminologies first, second, third, and so forth may serve to describe various elements, components, regions, layers, and/or sections in this disclosure, these elements, components, regions, layers, and/or sections shall not be limited by these terminologies. These terminologies merely serve to distinguish one element, component, region, layer, and/or section from another element, component, region, layer, or section. Thus, a first element, component, region, layer, or section discussed below may be termed a second element, component, region, layer, or section without departing from the teachings herein.
[0025] The terminologies used herein are only for the purpose of describing particular embodiments and are not restrictive. As used herein, the singular forms "a", "an", and "the" are intended to include the plural forms (including "at least one") unless the content clearly indicates otherwise. As used herein, the terminology "and/or" includes any and all combinations of one or more of the associated listed items. It should also be understood that, when used in this disclosure, the terminologies "include" and/or "comprise" indicate the presence of the described features, regions, integers, steps, operations, elements, and/or components but do not exclude the presence or addition of one or more other features, regions, integers, steps, operations, elements, components, and/or combinations thereof.
[0027] In this embodiment, the camera 120 provides a driving image Xmg to the controller 170. The flexible touch screen 160 has a flexible mechanism to control a flexible state of the flexible touch screen 160. The steering wheel 150 has a foldable mechanism to control a folding state and turning of the steering wheel 150, and the flexible touch screen 160 is arranged on the steering wheel 150. The controller 170 determines an operational requirement of a driver (or an occupant) based on the driving image Xmg, and controls the flexible mechanism and the foldable mechanism according to the operational requirement.
[0028] Based on the above, the controller 170 determines the operational requirement of the driver according to the driving image Xmg and, in response to that requirement, adjusts the retraction/unfolding and turning of the steering wheel 150 and the flexible touch screen 160, thereby enhancing convenience of use and space utilization.
[0029] In this embodiment, the microphone 140 can be configured to provide a driver's voice Xad. At this time, the controller 170 can determine the operational requirement according to the driving image Xmg and the driver's voice Xad.
[0030] In this embodiment, the driver seat sensor 180 can be configured to provide a seat inclination angle signal Xse. At this time, the controller 170 can determine the operational requirement based on the driving image Xmg and the seat inclination angle signal Xse.
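As a non-limiting illustration of how a controller such as controller 170 might fuse the driving image Xmg, the driver's voice Xad, and the seat inclination angle signal Xse described above, consider the sketch below. All thresholds, keywords, type choices, and names are hypothetical assumptions (the disclosure does not fix them), and the voice is assumed to have already been transcribed to text:

```python
from dataclasses import dataclass
from enum import Enum, auto
from typing import Optional

class OperationalRequirement(Enum):
    # Operational requirements named in claims 9-13.
    MANUAL_DRIVING = auto()
    TABLET_OPERATION = auto()
    LAPTOP_OPERATION = auto()
    SLEEP_OPERATION = auto()

@dataclass
class SensorInputs:
    driving_image: bytes                 # Xmg from the camera 120
    voice_transcript: Optional[str]      # Xad from the microphone 140, transcribed
    seat_angle_deg: Optional[float]      # Xse from the driver seat sensor 180

def determine_requirement(inputs: SensorInputs) -> OperationalRequirement:
    """Hypothetical fusion logic; the priority ordering and the 45-degree
    recline threshold are illustrative, not taken from the disclosure."""
    if inputs.seat_angle_deg is not None and inputs.seat_angle_deg > 45.0:
        return OperationalRequirement.SLEEP_OPERATION   # reclined seat -> rest
    if inputs.voice_transcript and "tablet" in inputs.voice_transcript.lower():
        return OperationalRequirement.TABLET_OPERATION  # explicit voice request
    return OperationalRequirement.MANUAL_DRIVING        # default fallback
```

In the disclosure the image itself is also analyzed (face, posture, and gesture recognition); here it is simply carried through as opaque bytes.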
[0031] In this embodiment of the disclosure, in certain operational scenarios, at least one of the front display 130 and the HUDs 110-1 to 110-3 can be synchronized with the flexible touch screen 160.
[0037] If the operational requirement of the driver is the automated driving mode, i.e., when a determination result of step S130 is yes, step S140 is executed to enter the automated driving mode, and an algorithm automatically determines whether the steering wheel is required. On the contrary, if the operational requirement of the driver is not the automated driving mode, i.e., when the determination result of step S130 is no, the process returns to step S110.
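The S110-S140 flow described in this paragraph can be rendered as a simple polling loop. A minimal sketch, assuming a string label for the mode and callables for the sensing and mode-entry steps (all names, and the safety bound, are hypothetical):

```python
def mode_loop(get_requirement, enter_automated_mode, max_iterations=1000):
    """Steps S110-S140 as a loop: sample the operational requirement
    (S110/S120); if it is the automated driving mode (S130 yes), enter that
    mode (S140) and stop, after which a further algorithm would decide
    whether the steering wheel is required; otherwise return to S110 and
    sample again. max_iterations is an illustrative safety bound only."""
    for _ in range(max_iterations):
        requirement = get_requirement()          # S110/S120: sense and classify
        if requirement == "automated_driving":   # S130: decision
            enter_automated_mode()               # S140: enter automated mode
            return True
    return False
```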
[0039] In this embodiment, the controller 170 can apply a face recognition algorithm to determine whether the driver has a preset steering wheel setting (in response to the operational requirement of the driver) based on the driving image Xmg, so as to decide whether the steering wheel should be extended or retracted.
[0040] In this embodiment, the controller 170 can apply a passenger monitoring system algorithm to determine a passenger status of the driver based on the driving image Xmg and then analyze the passenger status (e.g., whether the passenger enters a sleep state) through a determination logic to obtain the corresponding operational requirement of the driver, ultimately deciding whether the steering wheel should be extended or retracted.
[0041] In this embodiment, the controller 170 can apply a posture recognition algorithm to determine the passenger status of the driver based on the driving image Xmg and then analyze the passenger status through the determination logic to obtain the corresponding operational requirement of the driver, ultimately deciding whether the steering wheel should be extended or retracted.
[0042] In this embodiment, the controller 170 can apply a gesture recognition algorithm to determine the passenger status of the driver based on the driving image Xmg and then analyze the passenger status through the determination logic to obtain the corresponding operational requirement of the driver, ultimately deciding whether the steering wheel should be extended or retracted.
[0043] In this embodiment, the controller 170 can apply a voice recognition algorithm to determine whether at least one voice command exists in the driver's voice Xad, so as to determine the passenger status of the driver based on the at least one voice command and then analyze the passenger status through the determination logic to obtain the corresponding operational requirement of the driver, ultimately deciding whether the steering wheel should be extended or retracted.
[0045] In step S230, it is determined whether the posture recognition algorithm detects a hand moving forward and whether the gesture recognition algorithm detects a gesture B. When the posture recognition algorithm detects a hand moving forward and the gesture recognition algorithm detects the gesture B, i.e., a determination result of step S230 is yes, step S240 is executed to extend the steering wheel. On the contrary, when the posture recognition algorithm does not detect any hand moving forward or the gesture recognition algorithm does not detect the gesture B, i.e., the determination result of step S230 is no, the process returns to step S230. After step S240, the process returns to step S210.
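The S230 condition is a simple conjunction of the two recognizers' outputs. Sketched as a predicate (the meaning of "gesture B" is left opaque in the disclosure, so it is kept as a bare label here):

```python
def should_extend_wheel(hand_moving_forward: bool, detected_gesture: str) -> bool:
    """Step S230: extend the steering wheel (S240) only when the posture
    recognizer sees a hand moving forward AND the gesture recognizer sees
    gesture B; any other combination keeps the system waiting at S230."""
    return hand_moving_forward and detected_gesture == "B"
```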
[0047] In step S330, it is determined whether the voice recognition algorithm detects a specific voice command B. When the voice recognition algorithm detects the specific voice command B, i.e., a determination result of step S330 is yes, step S340 is executed to extend the steering wheel. On the contrary, when the voice recognition algorithm does not detect the specific voice command B, i.e., the determination result of step S330 is no, the process returns to step S330. After step S340, the process returns to step S310.
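Assuming the speech recognizer has already produced a text transcript, the S330 check reduces to matching "specific voice command B" against a registered phrase list. The phrases below are invented purely for illustration; the disclosure does not specify the command's content:

```python
def detects_command_b(transcript: str,
                      command_b_phrases=("extend the steering wheel",
                                         "steering wheel out")) -> bool:
    """Step S330: return True (-> S340, extend the steering wheel) when the
    transcript contains any phrase registered for voice command B."""
    text = transcript.strip().lower()
    return any(phrase in text for phrase in command_b_phrases)
```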
[0050] In step S550, it is determined whether a steering wheel application is specified. When the steering wheel application is specified, i.e., a determination result of step S550 is yes, step S560 is executed to change a steering wheel angle (A1, A2). On the contrary, when no steering wheel application is specified, i.e., the determination result of step S550 is no, step S570 is executed, so that the steering wheel angle (A1, A2) remains unchanged, and A2 defaults to 180 degrees, corresponding to a fully unfolded state (i.e., fully deployed).
[0051] Here, D1 is a distance by which the steering wheel 150 extends outward from the dashboard, A1 is an angle between the upper edge of the steering wheel 150 and the steering column, A2 is an angle between the upper edge and the lower edge of the steering wheel 150 (i.e., the angle of the flexible touch screen 160), D2 is a distance between the seat center and the dashboard, D3 is the seat height, A3 is an angle between the floor and the seat surface, and A4 is an angle between the seat surface and the seat back.
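The geometric parameters just listed and the S550-S570 angle logic can be grouped as in the sketch below. The units (centimeters/degrees), the mutable-update style, and the requested replacement angles are all illustrative assumptions:

```python
from dataclasses import dataclass

@dataclass
class CockpitGeometry:
    d1_cm: float   # D1: wheel extension distance from the dashboard
    a1_deg: float  # A1: angle between wheel upper edge and steering column
    a2_deg: float  # A2: angle between wheel upper and lower edges
                   #     (i.e., the angle of the flexible touch screen)
    d2_cm: float   # D2: distance between seat center and dashboard
    d3_cm: float   # D3: seat height
    a3_deg: float  # A3: angle between floor and seat surface
    a4_deg: float  # A4: angle between seat surface and seat back

def resolve_wheel_angles(geom: CockpitGeometry,
                         application_specified: bool,
                         requested=(90.0, 120.0)) -> CockpitGeometry:
    """Steps S550-S570: if a steering wheel application is specified (S550
    yes), change (A1, A2) to the requested angles (S560); otherwise (S570)
    leave them unchanged, with A2's default of 180 degrees read as the fully
    unfolded state. The requested angles here are placeholder values."""
    if application_specified:
        geom.a1_deg, geom.a2_deg = requested
    return geom
```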
[0054] At this time, when it is necessary to control audio-visual entertainment content, the flexible touch screen 160 on the steering wheel 150 can be used for the control, so that the driver/passenger does not have to get up to touch the front display 130. The driver can execute personal private operations through the flexible touch screen 160, such as conducting private conferences, running chat applications, or making personal investments. Moreover, given the size of the flexible touch screen 160, its distance from the driver, and its operability, it is suitable for creative applications and/or gaming applications.
[0057] In one or more embodiments of the disclosure, when the driver reclines, the steering wheel 150 can automatically extend the flexible touch screen 160 in front of the driver based on the seat back reclining angle and the driver's setting data. Moreover, the angle of the flexible touch screen 160 can be changed according to the needs of the driver.
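One way to read this behavior: the recline angle A4 is compared against a threshold, and the extension angle comes from the driver's stored setting data. The threshold value, the preference key, and the string-encoded return value below are all assumptions made for illustration:

```python
def screen_action_on_recline(a4_deg: float, driver_prefs: dict,
                             recline_threshold_deg: float = 120.0) -> str:
    """When the seat back reclines past the (assumed) threshold, extend the
    flexible touch screen in front of the driver at the driver's preferred
    angle; otherwise leave the screen as it is."""
    if a4_deg >= recline_threshold_deg:
        angle = driver_prefs.get("screen_angle_deg", 180.0)
        return f"extend_screen@{angle:g}deg"
    return "no_change"
```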
[0058] Based on the above, one or more embodiments of the disclosure are directed to the field of vehicle technology, proposing an operation method for a foldable steering wheel and a flexible touch screen. In this method, the steering wheel can be automatically folded and retracted, increasing interior space when the driver/passenger reclines the seat back to rest. Moreover, the foldable flexible touch screen is arranged on the foldable steering wheel. When it is necessary to control audio-visual entertainment content, the flexible touch screen on the steering wheel can be used for the control, so that the driver/passenger does not have to get up to touch the front display, thereby enhancing user experience. Moreover, when the vehicle is manually driven, the flexible touch screen is minimized to reduce the chance of driver distraction.
[0059] To sum up, in the cockpit system provided in one or more embodiments of the disclosure, the controller determines the operational requirement of the driver based on the driving image, and adjusts the folding/unfolding and turning of the steering wheel and the flexible touch screen in response to the operational requirement of the driver, thereby enhancing convenience of use and space utilization.
[0060] It will be apparent to those skilled in the art that various modifications and variations can be made to the disclosed embodiments without departing from the scope or spirit of the disclosure. In view of the foregoing, it is intended that the disclosure covers modifications and variations provided that they fall within the scope of the following claims and their equivalents.