ONBOARD SYSTEM FOR A VEHICLE AND PROCESS FOR SENDING A COMMAND TO A PARK AREA ACCESS SYSTEM
20190088058 · 2019-03-21
Assignee
Inventors
CPC classification
G06V20/59 (PHYSICS)
G07C9/00563 (PHYSICS)
H04W4/44 (ELECTRICITY)
G06V20/597 (PHYSICS)
International classification
Abstract
An onboard system (4) for a vehicle (2) comprising: an emitter circuit (8) suitable to send a command (C) to a park area access system (22); an image sensor (6) suitable to capture a sequence of images (S) of at least part of a body of a driver (D) of the vehicle (2); and a control module (10) suited to process said sequence of images (S) so as to identify a behavioral feature and then control the emitter circuit (8) to send the command (C) to the park area access system (22), provided the identified behavioral feature corresponds to a predetermined behavioral feature. A corresponding process is also proposed.
Claims
1. An onboard system for a vehicle comprising: an emitter circuit for sending a command to a park area access system; an image sensor for capturing a sequence of images of at least part of a body of a driver of the vehicle; and a control module for processing said sequence of images so as to identify a behavioral feature and then control the emitter circuit to send the command to the park area access system provided the identified behavioral feature corresponds to a predetermined behavioral feature.
2. The onboard system according to claim 1, wherein the control module associates the identified behavioral feature with said command among a plurality of commands.
3. The onboard system according to claim 1, wherein the predetermined behavioral feature is a predetermined gesture.
4. The onboard system according to claim 1, wherein the predetermined behavioral feature is a predetermined lip movement.
5. The onboard system according to claim 1, wherein the control module acquires, in a learning mode, a set of sequenced images showing said predetermined behavioral feature and records a data representation of said predetermined behavioral feature in a memory.
6. The onboard system according to claim 1, wherein the control module analyzes said sequence of images so as to produce a driving ability level.
7. A method of sending a command to a park area access system, comprising: capturing, by an image sensor, a sequence of images of at least part of a body of a vehicle driver; processing said sequence of images so as to identify a behavioral feature; and controlling an emitter circuit to send the command to the park area access system provided the identified behavioral feature corresponds to a predetermined behavioral feature.
8. The method according to claim 7, wherein the predetermined behavioral feature is a predetermined gesture.
9. The method according to claim 7, wherein the predetermined behavioral feature is a predetermined lip movement.
10. The method according to claim 7, further comprising: receiving said command at said park area access system; and operating a mechanism of said park area access system, thereby enabling the vehicle to access said park area.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0022]
[0023]
DETAILED DESCRIPTION OF EXAMPLE(S)
[0024]
[0025] In this context, a vehicle 2 is about to enter a park area 20 (here a garage) secured by an access system 22.
[0026] The access system 22 here includes a garage door 24; according to a possible variation, the access system may include an elevator making it possible for the vehicle to access the park area.
[0027] The access system 22 also includes a mechanism 26 for operating (e.g. opening or closing) the garage door 24. The mechanism 26 can be remotely controlled, i.e. activated when receiving a wireless command C with valid credentials. The mechanism 26 is designed to operate the access system 22 (here to open the garage door 24) when receiving the wireless command C (e.g. a radiofrequency signal as mentioned below).
[0028] The vehicle 2 is equipped with an onboard system 4 comprising an image sensor 6 (here a video camera), an emitter circuit 8 (for instance a UGDO transceiver) and a control module 10.
[0029] The image sensor 6 is directed towards the driver D of the vehicle 2 and is therefore suited to capture a sequence of images S showing at least part (of the body) of the driver D (for instance at least a space in which the driver D is expected to gesture with his hands, or the face of the driver D, in the respective examples given below).
[0030] The emitter circuit 8 is suited to send the above-mentioned wireless command C (with valid credentials) when receiving a corresponding instruction I from the control module 10 (for instance via a bus connecting the control module 10 to the emitter circuit 8).
[0031] In the present embodiment, the wireless command C is a sub GHz radiofrequency signal (i.e. a radiofrequency signal having a main frequency below 1 GHz) suitable to be received by the mechanism 26. According to a possible variation, the wireless command C could be sent via a wireless datalink established between the emitter circuit 8 and the mechanism 26 (such as a Bluetooth datalink), or via a wireless local area network (WLAN).
[0032] In addition, in another possible context, the emitter circuit 8 could also send another wireless command to control comfort equipment (e.g. garage lighting) associated with the access system (or, in another embodiment, independent of the access system).
[0033] As will be further explained below with reference to
[0034]
[0035] In practice, some of these elements (such as units 14, 16, 18 described below) may each be implemented by the execution of a specific set of computer program instructions on a processor of the control module 10. These computer program instructions are for instance stored in a memory 12 of the control module 10.
[0036] In the present embodiment, the memory 12 also stores a plurality of data representations, each corresponding to a behavioral feature of the driver D. Each data representation is furthermore associated with a particular command that may be sent by the emitter circuit 8, here by storing in the memory 12 a table (e.g. in the form of a matrix) associating each data representation with a particular command (the possible commands thus forming a dictionary of commands).
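The table described in paragraph [0036] can be pictured as a simple list of (representation, command) pairs. The sketch below is purely illustrative: the patent does not specify a data format, and the feature vectors and command names are hypothetical.

```python
# Illustrative sketch of the table stored in memory 12: each stored
# data representation (here a toy feature vector) is paired with the
# command it triggers; the commands together form the "dictionary of
# commands". All values and names are hypothetical.
command_table = [
    ([0.12, 0.85, 0.33], "OPEN_GARAGE_DOOR"),    # e.g. a hand-wave gesture
    ([0.91, 0.10, 0.44], "CLOSE_GARAGE_DOOR"),   # e.g. a fist gesture
    ([0.05, 0.40, 0.77], "GARAGE_LIGHTING_ON"),  # comfort-equipment command
]
```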
[0037] The control module 10 comprises an association unit 14 suited to process (e.g. analyze) the sequence of images S (received from the image sensor 6) and to associate it with a behavioral feature corresponding to one of the stored data representations (when the driver D behaves in accordance with such a behavioral feature). Identification of the behavioral feature is for instance performed by processing the sequence of images S to obtain representative data and by comparing these representative data to each of the data representations stored in the memory 12.
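The comparison step of paragraph [0037] can be sketched as a nearest-neighbor lookup. The patent does not name a comparison metric, so the Euclidean distance and the threshold used below are assumptions for illustration only.

```python
import math

def identify_command(representative_data, command_table, threshold=0.5):
    """Compare the data extracted from the image sequence S with each
    stored data representation and return the associated command, or
    None if nothing is close enough. Euclidean distance is one
    plausible metric; the patent does not specify one."""
    best_cmd, best_dist = None, float("inf")
    for stored, command in command_table:
        dist = math.dist(representative_data, stored)
        if dist < best_dist:
            best_cmd, best_dist = command, dist
    return best_cmd if best_dist <= threshold else None

# Hypothetical stored representations (feature vectors) and commands.
table = [([0.12, 0.85, 0.33], "OPEN_GARAGE_DOOR"),
         ([0.91, 0.10, 0.44], "CLOSE_GARAGE_DOOR")]

print(identify_command([0.10, 0.80, 0.30], table))  # prints OPEN_GARAGE_DOOR
print(identify_command([0.50, 0.50, 0.50], table))  # prints None (no close match)
```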
[0038] According to the proposed embodiment, the behavioral feature is a particular gesture of the driver D. According to a possible variation, the behavioral feature may be a particular movement of the lips of the driver D.
[0039] Once such a behavioral feature is identified in the sequence of images S (identified as a specific one among the data representations stored in the memory 12), the association unit 14 associates this behavioral feature with a specific command (i.e. the command associated with the particular data representation in the table mentioned above). This command may for instance be the wireless command C mentioned above.
[0040] In practice, upon associating the representative data identified from the sequence of images S with the wireless command C, the association unit 14 sends the instruction I to the emitter circuit 8 via the bus, which results in the emitter circuit 8 sending the wireless command C to the park area access system 22 and thus in the garage door 24 being opened.
[0041] The operation of the association unit 14 as just mentioned occurs in a normal usage mode.
[0042] In a training mode (distinct from the normal usage mode), the association unit 14 is deactivated and the sequence of images S is received by a training unit 16 (also part of the control module 10).
[0043] The training unit 16 (which operates in the training mode only) is suited to capture an image sequence S (i.e. a set of sequenced images) showing a behavioral feature of the driver D, to process the captured image sequence S into a corresponding data representation and to record the resulting data representation in the memory 12.
[0044] In practice, a specific command to be emitted by the emitter circuit 8 (such as the wireless command C) may for instance be selected by the driver D. This selection could be performed by selecting the name of this command on a user interface (not shown) provided in the vehicle 2.
[0045] While in training mode, the driver then behaves in a specific manner (e.g. makes a particular gesture in the present example), that is captured by the image sensor 6.
[0046] The image sensor 6 thus delivers a sequence of images S showing this particular behavioral feature. As noted above, this sequence of images S is processed by the training unit 16 to produce a corresponding data representation, which is then stored in the memory 12 and associated with the command selected by the driver D.
[0047] Various data representations (each corresponding to a particular behavioral feature) can be stored in the memory 12 in association with respective commands (including the wireless command C to be sent to the park area access system 22 to command its operation, i.e. here to command the opening of the garage door 24).
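The training-mode flow of paragraphs [0042] to [0047] can be sketched as follows. The feature-extraction step is a toy stand-in: the patent does not describe how the training unit 16 reduces the image sequence to a data representation, so `extract_representation` and all names below are hypothetical.

```python
def extract_representation(image_sequence):
    """Toy stand-in for the image processing performed by the training
    unit 16: average each pixel position across the captured frames."""
    n = len(image_sequence)
    return [sum(frame[i] for frame in image_sequence) / n
            for i in range(len(image_sequence[0]))]

def train_command(image_sequence, selected_command, memory):
    """Training-mode sketch: reduce the captured image sequence S to a
    data representation and record it in memory 12 together with the
    command previously selected by the driver on the user interface."""
    representation = extract_representation(image_sequence)
    memory.append((representation, selected_command))

memory = []                           # stands in for memory 12
frames = [[0.1, 0.8], [0.2, 0.9]]     # two tiny "images" of a gesture
train_command(frames, "OPEN_GARAGE_DOOR", memory)
print(memory[0][1])  # prints OPEN_GARAGE_DOOR
```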
[0048] In the embodiment described here, the control module 10 embeds a driver monitoring unit 18 (operable in particular in the normal usage mode).
[0049] The driver monitoring unit 18 analyzes the sequence of images S and produces (based on this analysis) a driving ability level L. This driving ability level L can be representative of the ability of the driver D to drive the vehicle 2, or of the inability of the driver D to drive the vehicle 2. In this respect, the driving ability level L could be a distraction level or a drowsiness level.
[0050] The driver monitoring unit 18 may for instance determine the distraction level by evaluating the gaze direction of the driver D and the variation of this gaze direction over time. The driver monitoring unit 18 may for instance determine the drowsiness level based on the frequency and/or the duration of the driver's eye blinks.
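A drowsiness level combining blink frequency and blink duration, as paragraph [0050] suggests, could be computed along the following lines. The scoring formula and all thresholds are illustrative assumptions, not taken from the patent.

```python
def drowsiness_level(blink_durations_s, window_s=60.0,
                     max_rate=0.5, max_dur=0.4):
    """Toy drowsiness score in [0, 1] from the blinks observed over a
    time window: half the score comes from blink frequency, half from
    mean blink duration. The thresholds (0.5 blinks/s, 0.4 s mean
    duration saturate the score) are illustrative only."""
    if not blink_durations_s:
        return 0.0
    rate = len(blink_durations_s) / window_s               # blinks per second
    mean_dur = sum(blink_durations_s) / len(blink_durations_s)
    return (0.5 * min(rate / max_rate, 1.0)
            + 0.5 * min(mean_dur / max_dur, 1.0))

alert = drowsiness_level([0.1] * 12)   # short, infrequent blinks
drowsy = drowsiness_level([0.5] * 30)  # long, frequent blinks
print(alert < drowsy)  # prints True
```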