SUPPORT TO HANDLE AN OBJECT WITHIN A PASSENGER INTERIOR OF A VEHICLE
20180297471 · 2018-10-18
Assignee
Inventors
CPC classification
B60R11/04
PERFORMING OPERATIONS; TRANSPORTING
B60Q9/00
PERFORMING OPERATIONS; TRANSPORTING
G02B2027/0141
PHYSICS
G02B2027/0187
PHYSICS
B60R2300/8006
PERFORMING OPERATIONS; TRANSPORTING
B60Q3/225
PERFORMING OPERATIONS; TRANSPORTING
B60Y2400/902
PERFORMING OPERATIONS; TRANSPORTING
B60K35/00
PERFORMING OPERATIONS; TRANSPORTING
B60K35/28
PERFORMING OPERATIONS; TRANSPORTING
International classification
B60K35/00
PERFORMING OPERATIONS; TRANSPORTING
B60Q9/00
PERFORMING OPERATIONS; TRANSPORTING
B60R11/04
PERFORMING OPERATIONS; TRANSPORTING
Abstract
The disclosure relates to a system to support handling of an object located within a passenger interior of a motor vehicle and not connected to the motor vehicle. The system has a sensor unit to detect a distance of the object to a storage area available within the passenger interior. An evaluation unit receives and processes sensor signals generated by the sensor unit and controls a signaling unit that emits optic, acoustic and/or haptic signals within the passenger interior. The evaluation unit determines if a hand of the driver moves towards the storage area and activates the signaling unit if the hand moves towards the storage area and if the hand is located, for a predetermined period of time, within the storage area of a predetermined size.
Claims
1. A vehicle system comprising: a sensor configured to detect a distance of an object to a storage area within an interior; an evaluation unit configured to process signals generated by the sensor to determine if a driver hand moves towards the storage area and is located, for a predetermined time period, within the storage area; and a signaling unit configured to emit haptic signals within the interior responsive to being activated by the evaluation unit.
2. The system as claimed in claim 1, wherein the evaluation unit is configured to control an activated signaling unit such that signals emitted from the signaling unit are varied depending on a distance of the driver hand to the storage area.
3. The system as claimed in claim 1, wherein the signaling unit includes at least one display unit configured to display an image representation, which is formed from optic signals, of a captured area of the passenger interior.
4. The system as claimed in claim 3, wherein the display unit has at least one image projection unit configured to project the image representation of the captured area of the passenger interior onto a component of the motor vehicle.
5. The system as claimed in claim 3, wherein the display unit is a screen arranged in a dashboard.
6. The system as claimed in claim 3, wherein the display unit is formed by smart glasses.
7. The system as claimed in claim 3, wherein the evaluation unit is configured to control the display unit such that a displayed size of an image range of the storage area is varied depending on a current distance of the driver hand to the storage area.
8. The system as claimed in claim 1 further comprising at least one activation unit to activate the sensor unit and the evaluation unit, wherein the activation unit is configured to activate the sensor unit and the evaluation unit responsive to a detected vehicle speed exceeding a predetermined limit value or if an activation command via an interface is detected.
9. The system as claimed in claim 1, wherein the evaluation unit is configured to determine a position and shape of the storage area when controlling the signaling unit.
10. The system as claimed in claim 1, wherein the evaluation unit is configured to determine if an object is in the driver hand from the sensor signals.
11. The system as claimed in claim 3, wherein the evaluation unit is configured to control the display unit such that the driver hand and the storage area are highlighted on a visual level.
12. The system as claimed in claim 3, wherein the evaluation unit is configured to determine a virtual movement path of the driver hand from a captured movement of the driver hand and control the display unit such that the image representation contains the virtual movement path.
13. A method to support handling of an object within a motor vehicle that is not connected to the motor vehicle comprising: detecting a distance of an object to at least one storage area available within a passenger interior; determining if a driver hand moves towards the storage area and is located, for a predetermined period of time, in the storage area; and emitting optic, acoustic and haptic signals within the passenger interior if the driver hand moves towards the storage area and if the driver hand is located within a predetermined size of the storage area for the predetermined period of time.
14. The method as claimed in claim 13 further comprising varying the emitted signals depending on the distance of the driver hand to the storage area.
15. The method as claimed in claim 13 further comprising varying a displayed size of an image range of an image representation formed from optic signals depending on a current distance of the driver hand to the storage area.
16. The method as claimed in claim 13, wherein detecting the distance of the object to the storage area occurs when a detected vehicle speed exceeds a predefined limit and if an activation command is detected via an interface.
17. The method as claimed in claim 13 further comprising determining a position and a shape of the storage area, which are taken into consideration during the emitting.
18. The method as claimed in claim 13 further comprising determining if an object is in the driver hand.
19. The method as claimed in claim 13 further comprising highlighting the driver hand and the storage area on a visual level in an image representation formed by optic signals.
20. The method as claimed in claim 13 further comprising detecting a virtual path of movement of the driver hand from a detected movement of the driver hand and displaying the virtual path of movement in an image representation formed by optic signals.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0050]
[0051]
DETAILED DESCRIPTION
[0052] As required, detailed embodiments of the present disclosure are disclosed herein; however, it is to be understood that the disclosed embodiments are merely exemplary of the disclosure that may be embodied in various and alternative forms. The figures are not necessarily to scale; some features may be exaggerated or minimized to show details of particular components. Therefore, specific structural and functional details disclosed herein are not to be interpreted as limiting, but merely as a representative basis for teaching one skilled in the art to variously employ the present disclosure.
[0053]
[0054] The system 1 comprises a sensor unit 7 arranged on the motor vehicle 3, which is set up to detect a distance 14 of an object 13 to at least one storage area 8 available within the passenger interior 2 and/or to detect at least one area of the passenger interior 2 or the entire passenger interior 2, wherein the detected area has at least one storage area 8 to store the object 13. The sensor unit 7 may be a camera (not shown) that is arranged within an upper area of the passenger interior 2.
[0055] Furthermore, the system 1 comprises at least one evaluation unit 9, which is set up to receive and process sensor signals generated by the sensor unit 7, and a signaling unit 10 that can be controlled using the evaluation unit 9, which is set up to emit optic, acoustic and/or haptic signals within the passenger interior 2.
[0056] The evaluation unit 9 is set up to determine from the sensor signals if a hand (not shown) and/or an arm (not shown) of the driver (not shown) moves towards the storage area 8 and/or if the hand and/or the arm of the driver is located, for a predetermined period of time, within an environment comprising the respective storage area 8 having a predetermined size of the respective storage area 8. Furthermore, the evaluation unit 9 is set up to activate the signaling unit 10 if the hand and/or the arm of the driver moves towards the respective storage area 8 and/or if the hand and/or the arm of the driver is located within an environment comprising the respective storage area 8 having a predetermined size for a predetermined period of time. Thereby, the evaluation unit 9 is set up to control the activated signaling unit 10 in such a way that signals emitted from the signaling unit 10 are varied depending on a distance of the hand and/or of the arm of the driver or the object to the respective storage area 8. The evaluation unit 9 is set up to determine if an object is in the hand of the driver using the sensor signals.
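The evaluation logic described in paragraph [0056] can be sketched in pseudocode-style Python. This is a minimal illustration only, not the disclosed implementation: the class name, the dwell-time and area-size thresholds, and the position interface are all assumptions made for the example.

```python
import math

# Illustrative thresholds; the disclosure leaves the "predetermined" values open.
DWELL_TIME_S = 0.5      # predetermined period of time within the storage area
AREA_RADIUS_M = 0.15    # predetermined size of the environment around the area


class EvaluationUnit:
    """Hypothetical sketch of the evaluation unit 9 in paragraph [0056]."""

    def __init__(self):
        self._dwell_start = None  # time at which the hand entered the area

    @staticmethod
    def distance_to_area(hand_pos, area_pos):
        """Euclidean distance of the driver's hand to the storage area."""
        return math.dist(hand_pos, area_pos)

    def should_signal(self, hand_pos, prev_hand_pos, area_pos, now):
        """Activate the signaling unit if the hand moves towards the storage
        area, or if it stays within the area's environment long enough."""
        d = self.distance_to_area(hand_pos, area_pos)
        d_prev = self.distance_to_area(prev_hand_pos, area_pos)
        moving_towards = d < d_prev            # hand approaches the storage area
        if d <= AREA_RADIUS_M:                 # hand is within the environment
            if self._dwell_start is None:
                self._dwell_start = now
            dwelled = (now - self._dwell_start) >= DWELL_TIME_S
        else:
            self._dwell_start = None
            dwelled = False
        return moving_towards or dwelled, d

    @staticmethod
    def signal_intensity(distance):
        """Vary the emitted signal depending on the hand's distance,
        growing stronger as the hand gets closer (clamped at zero)."""
        return max(0.0, 1.0 - distance / 1.0)
```

A caller would feed successive hand positions from the sensor unit into `should_signal` and scale the optic, acoustic, or haptic output by `signal_intensity`.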
[0057] The signaling unit 10 can have at least one display unit (not shown) that is set up to display an image representation, which is formed from optic signals, of the captured area of the passenger interior 2. The display unit can have an image projection unit (not shown), with which the image representation of the detected area of the passenger interior 2 can be projected onto a component (not shown), particularly on a front windshield (not shown), of the motor vehicle 3. Alternatively, the display unit can be designed as a screen (not shown) arranged in the dashboard 5 or a center console (not shown) of the motor vehicle 3. Alternatively, the display unit is formed by smart glasses (not shown). The evaluation unit 9 can be set up to control the display unit in such a way that a displayed size of an image range comprising the respective storage area 8 is varied depending on a current distance of the hand and/or the arm of the driver to the respective storage area 8.
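The distance-dependent zoom behaviour described at the end of paragraph [0057] could take the form of a simple mapping from hand distance to a display zoom factor. The function below is an illustrative assumption; the zoom limits and range are not specified in the disclosure.

```python
def zoom_factor(distance_m, min_zoom=1.0, max_zoom=4.0, max_range_m=1.0):
    """Hypothetical zoom mapping: magnify the image range comprising the
    storage area as the driver's hand approaches it (paragraph [0057])."""
    d = min(max(distance_m, 0.0), max_range_m)  # clamp to [0, max_range_m]
    return max_zoom - (max_zoom - min_zoom) * (d / max_range_m)
```

At arm's length (`distance_m >= 1.0`) the display shows the unmagnified view; as the hand closes in, the image range around the storage area is enlarged linearly up to `max_zoom`.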
[0058] Furthermore, the evaluation unit 9 can be set up to control the display unit in such a way that the hand of the driver and the storage area 8 are highlighted on a visual level. In addition, the evaluation unit 9 can be set up to determine a virtual path of movement of the hand and the arm from a captured movement of the hand and/or the arm of the driver as well as control the display unit in such a way that the image representation contains the virtual path of movement.
[0059] The evaluation unit 9 can be set up to determine data concerning a position, in particular, location coordinates and/or a shape of the respective storage area 8 and take these data into account when controlling the signaling unit 10. As an alternative, the system 1 can have an electronic information storage unit (not shown), in which motor vehicle-relevant data concerning the position and/or the shape of the respective storage area 8 is stored, wherein the evaluation unit 9 is set up to take these data into account when controlling the signaling unit 10.
[0060] The system 1 comprises an activation unit 11 to activate the sensor unit 7 and/or the evaluation unit 9, wherein the activation unit 11 is set up to activate the sensor unit 7 and/or the evaluation unit 9 if a detected speed of the motor vehicle 3 exceeds a predetermined limit value, particularly for the first time after starting the motor vehicle and/or if an activation command of the driver via a man/machine interface (not shown) is detected.
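The activation condition of paragraph [0060] amounts to a simple boolean rule. The sketch below is an assumption for illustration: the 5 km/h limit value and the function signature are not taken from the disclosure.

```python
SPEED_LIMIT_KMH = 5.0  # illustrative "predetermined limit value"


def should_activate(speed_kmh, exceeded_before, driver_command):
    """Activate the sensor and evaluation units when the vehicle speed
    exceeds the limit for the first time after starting the motor vehicle,
    or when the driver issues a command via the man/machine interface."""
    first_exceed = speed_kmh > SPEED_LIMIT_KMH and not exceeded_before
    return first_exceed or driver_command
```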
[0061] Furthermore, the system 1 comprises an apparatus 12 to monitor attentiveness of the driver, whereby the apparatus 12 is set up to generate an activation signal and send it to the sensor unit 7 and/or to the evaluation unit 9 if the apparatus 12 detects that the driver is inattentive.
[0062]
[0063] In process step 100, at least one area of the passenger interior or the entire passenger interior is captured by a camera, wherein the area of the passenger interior has at least one storage area to store at least one object.
[0064] In process step 200, it is detected if a hand and/or an arm of a driver of the motor vehicle moves towards the storage area and/or if the hand and/or arm of the driver is located within an environment comprising the storage area having a predetermined size for a predetermined period of time.
[0065] If the hand and/or the arm of the driver moves towards the storage area and/or if the hand and/or the arm of the driver is located within the environment comprising the storage area of a predetermined size for a predetermined period of time, in process step 300, an optic, acoustic and/or haptic signal is emitted within the passenger interior. The emitted signals may be varied depending on a distance of the hand and/or the arm of the driver to the storage area. If the hand and/or the arm of the driver does not move towards the storage area and/or if the hand and/or the arm of the driver is not located within the environment comprising the storage area of a predetermined size for a predetermined period of time, a skip is made to process step 100.
[0066] If optic signals are emitted in process step 300 in the form of an image representation formed by the detected area of the passenger interior, a portrayed size of an image range comprising the storage area can vary depending on a current distance of the hand and/or the arm of the driver to the storage area. In the image representation, the hand of the driver and the storage area can be highlighted on a visual level.
[0067] The detection of the at least one area of the passenger interior can take place in process step 100 when a detected speed of the motor vehicle exceeds a predefined limit value and/or if an activation command of the driver is detected via a man/machine interface. In process step 100, data concerning a position and/or a shape of the storage area can be determined from detection of at least one area of the passenger interior and be taken into consideration during the emission of signals. As an alternative, in process step 100, when emitting the signals, stored motor vehicle-specific data concerning a position and/or a shape of the storage area can be taken into account.
[0068] In process step 200, it is additionally determined if an object is in the hand of the driver. In addition, in process step 200, a virtual path of movement of the hand or the arm can be determined from a detected movement of the hand and/or the arm of the driver and the virtual path of movement can be displayed in the image representation.
[0069] In process step 400, it is detected if the driver has grasped the object located in the storage area with his/her hand or if the driver has stored or placed an object located in his/her hand into the storage area. If this is the case, in process step 500, the support of handling the object ends and the emission of the signals is stopped, and then a skip is made to process step 100.
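The process flow of steps 100 through 500 in paragraphs [0063] to [0069] can be summarized as a small state machine. This is a schematic sketch under assumed names; the event dictionary stands in for the sensor and evaluation results described above.

```python
def run_cycle(events):
    """Walk one pass of the method: 100 (capture interior) -> 200 (detect
    hand near storage area) -> 300 (emit signal) -> 400 (check grasp/store)
    -> 500 (stop signals), returning the sequence of visited steps.
    `events` is a dict of hypothetical detection outcomes."""
    visited = [100]                        # step 100: capture the passenger interior
    if not events.get("hand_near_area"):   # step 200 negative: skip back to 100
        visited += [200, 100]
        return visited
    visited.append(200)
    visited.append(300)                    # step 300: emit optic/acoustic/haptic signal
    if events.get("object_grasped_or_stored"):
        visited += [400, 500, 100]         # step 500: end support, skip to 100
    else:
        visited.append(400)                # keep signaling until step 400 succeeds
    return visited
```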
[0070] While exemplary embodiments are described above, it is not intended that these embodiments describe all possible forms of the disclosure. Rather, the words used in the specification are words of description rather than limitation, and it is understood that various changes may be made without departing from the spirit and scope of the disclosure. Additionally, the features of various implementing embodiments may be combined to form further embodiments of the disclosure.