HUMAN-MACHINE INTERFACE, IN PARTICULAR FOR A VEHICLE OR FOR A DEVICE

20240045462 · 2024-02-08

    Abstract

    The present disclosure relates to a human-machine interface, in particular for a vehicle or for a device, comprising at least one gripping element comprising at least one transducer transmitting a command depending on at least one item of input information. The human-machine interface comprises at least one sensor and calculation means configured to determine a position of at least one surface of interaction between an operator and the gripping element, the calculation means also being configured to determine whether the command is intentional by determining whether at least one first predefined area lies at least partially within the interaction surface, so as to authorise the transmission of the command by the human-machine interface when it is determined that the command is intentional.

    Claims

    1. A human-machine interface for a vehicle or for a device, the human-machine interface comprising: at least one gripping element for transmission of a command according to at least one item of input information, the at least one item of input information comprising an angle of inclination or an angle of rotation of the gripping element with respect to a reference position and/or a force, a load or a direction applied to the gripping element, at least one sensor comprising an array of sensitive elements disposed over at least one portion of an external surface of the gripping element and configured to transmit a detection signal following an action carried out by an operator on the gripping element, and calculation means configured to determine a position of at least one interaction surface between the operator and the gripping element, the calculation means being configured to determine whether the command is intentional by determining whether at least one first predefined area is included at least partially in the interaction surface, so as to authorise transmission of the command by the human-machine interface when it is determined that the command is intentional.

    2. The human-machine interface according to claim 1, wherein the at least one sensor is of a capacitive, resistive or inductive type.

    3. The human-machine interface according to claim 1, further comprising at least one second predefined area distinct from the first predefined area, wherein the calculation means are configured to transmit the command associated with the second predefined area if the second predefined area is included at least partially in the interaction surface.

    4. The human-machine interface according to claim 1, further comprising at least one directional contactor, wherein the calculation means are configured to determine a press-and-drag movement of the operator on the directional contactor in at least one predefined area according to an origin position and an end position of the movement, and to transmit a command different from the command emitted during a simple press on the predefined area.

    5. The human-machine interface according to claim 4, wherein the calculation means are configured to transmit a command associated with a simultaneous interaction on at least two predefined areas.

    6. The human-machine interface according to claim 1, further comprising at least one information layer comprising a delimitation and/or command indication associated with at least one predefined area.

    7. The human-machine interface according to claim 1, further comprising means for illuminating at least one predefined area.

    8. The human-machine interface according to claim 1, further comprising information feedback means for warning the operator about a command associated with the predefined area.

    9. A vehicle, controlled by a human-machine interface according to claim 1, wherein the vehicle is at least one of: an aircraft, a drone, a spacecraft, a construction machine, a motor vehicle, or a ship.

    Description

    BRIEF DESCRIPTION OF THE DRAWINGS

    [0026] The invention will be better understood, and other features and advantages will become more apparent, upon reading the following detailed description, which comprises embodiments given by way of illustration with reference to the appended figures. These figures are presented as non-limiting examples that may complete the understanding of the invention and of the disclosure of how to make it and, where appropriate, contribute to its definition, wherein:

    [0027] FIG. 1 illustrates a first embodiment of the human-machine interface according to the invention, and

    [0028] FIG. 2 illustrates a second embodiment of the human-machine interface according to the invention.

    DETAILED DESCRIPTION

    [0029] It should be noted that, in the figures, the structural and/or functional elements common to the different embodiments may have the same references. Thus, unless stated otherwise, such elements have identical structural, dimensional and material properties.

    [0030] The human-machine interface 1 according to the invention comprises at least one gripping element 10, such as a joystick or a command stick, also called stick or grip.

    [0031] The gripping element 10 is provided with at least one transducer transmitting at least one command according to at least one item of input information, such as, for example, an angle of inclination of the gripping element 10 with respect to a reference position, a force applied on the gripping element 10, etc.

    [0032] The human-machine interface 1 enables an operator, such as a pilot or a driver, to transmit commands to a vehicle, in particular an aircraft, a drone, a spacecraft, a construction machine, a motor vehicle, a ship, etc. Complementarily or alternatively, the human-machine interface 1 enables an operator, such as a pilot or a driver, to transmit commands to a tool, in particular in remote operation, in civil engineering, etc.

    [0033] In a first embodiment illustrated in FIG. 1, the gripping element 10 is in the form of a joystick or command stick, also called a centre stick. The gripping element 10 comprises a transducer determining an angle of inclination forwards or rearwards and/or an angle of inclination leftwards or rightwards of the human-machine interface 1.

    [0034] In a complementary or alternative particular embodiment, the gripping element 10 comprises a transducer determining an angle of rotation to the left or to the right and/or in a clockwise or counterclockwise direction of the human-machine interface 1.

    [0035] In addition, the gripping element 10 may comprise a transducer determining a force exerted by the operator on the human-machine interface 1. The exerted force corresponds to an intensity applied by the operator and to a direction of application of the operator on the gripping element 10, such as, in particular, a pull, a push or a rotation.

    [0036] In a second embodiment illustrated in FIG. 2, the gripping element 10 is in the form of a throttle control stick, also called a throttle stick or side stick. Such a throttle control stick is used in particular in a "3M"-type control system (the French equivalent of hands on throttle and stick), also called HOTAS, an acronym for Hands On Throttle And Stick.

    [0037] Hence, the human-machine interface 1 provided with such a gripping element comprises a transducer determining an angle of inclination, an angle of rotation, a force, a direction of the gripping element 10, etc.

    [0038] According to the invention, all or part of an external surface of the gripping element 10 is provided with at least one sensor, in particular a sensitive element, preferably with an array of at least one sensor, in particular an array of at least one sensitive element.

    [0039] The sensor, in particular the sensitive element, is designed so as to transmit or emit a detection signal in response to an action carried out by the operator. Such an action carried out by the operator may be a press, a contact, a movement, a force applied by the operator on the sensor. More generally, the action carried out by the operator consists of an interaction between the operator and the gripping element 10 detected by the sensor, in particular the sensitive element, preferably the array of at least one sensor, in particular the array of at least one sensitive element.

    [0040] The sensor may be of the capacitive, resistive or inductive type.

    [0041] The human-machine interface 1 also comprises calculation means configured to determine a position of at least one surface of interaction between the operator and the gripping element 10, corresponding in particular to the surface over which the sensor emits or transmits the detection signal.

    [0042] For example, the calculation means consist of software means duly programmed to determine the position of the sensor emitting or transmitting the detection signal.

    [0043] The position of the sensor, in particular of the sensitive element, emitting or transmitting the detection signal is determined thanks to a unique addressing of the sensor.
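The addressing scheme described above can be sketched as follows. This is an illustrative sketch, not taken from the patent: the grid size, threshold and cell coordinates are assumptions. Each sensitive element has a unique (row, column) address, so the set of addresses that emitted a detection signal directly gives the position of the interaction surface.

```python
ROWS, COLS = 8, 4  # hypothetical array wrapped around the grip


def interaction_surface(readings, threshold=0.5):
    """Return the set of (row, col) addresses whose sensitive element
    emitted a detection signal above the threshold."""
    return {
        (r, c)
        for r in range(ROWS)
        for c in range(COLS)
        if readings.get((r, c), 0.0) > threshold
    }


# A palm resting on the lower part of the grip:
palm = {(6, 0): 0.9, (6, 1): 0.8, (7, 0): 0.95, (7, 1): 0.7}
surface = interaction_surface(palm)  # -> {(6, 0), (6, 1), (7, 0), (7, 1)}
```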

    [0044] The calculation means may be physically comprised in the human-machine interface 1 or be remotely arranged in a calculator connected to the human-machine interface 1, in particular to the array of sensors or sensitive elements and to the gripping element 10.

    [0045] The calculation means may also be configured to determine whether the transmitted command, respectively the transmitted commands, is/are intentional or accidental. Indeed, the gripping element 10 may be touched without actually being grasped. This is the case, for example, during an erroneous movement by the operator, without the latter intending to generate a command.

    [0046] To this end, the human-machine interface 1, in particular the gripping element 10, includes at least one first predefined area 11. The first predefined area 11 allows determining whether a body part of the operator, advantageously a hand of the operator, is placed on the gripping element 10 within an interaction surface.

    [0047] In particular, the calculation means compare the position of the surface of interaction with the operator with the position of the first predefined area 11. According to the invention, it is determined whether an overlap, at least partial, of the first predefined area 11 is achieved by the surface of interaction with the operator.

    [0048] Thus, the human-machine interface 1 allows differentiating between a handling error and an intentional command so as to ensure the operating safety of the vehicle or of the controlled device. In other words, it is possible to determine transparently for the operator whether a command made via the human-machine interface 1 is an intentional command or a handling error. If the command is intentional or voluntary, the calculation means authorise the transmission of commands by the human-machine interface 1.
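A minimal sketch of this intent check, assuming both the interaction surface and the first predefined area 11 are expressed as sets of sensor addresses (the palm-rest cells and command name are hypothetical):

```python
FIRST_PREDEFINED_AREA = {(6, 0), (6, 1), (7, 0), (7, 1)}  # assumed palm rest


def command_is_intentional(interaction_surface):
    # A partial overlap is enough: one shared cell means the grip is held.
    return not FIRST_PREDEFINED_AREA.isdisjoint(interaction_surface)


def transmit(command, interaction_surface):
    """Forward the command only when it is determined to be intentional."""
    if command_is_intentional(interaction_surface):
        return command  # transmission authorised
    return None         # handling error: command suppressed
```

An accidental brush that misses the palm rest thus produces no command, transparently for the operator.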

    [0049] Depending on the field of use of the human-machine interface 1, the intentional command detection condition may be limited to some commands of the human-machine interface 1.

    [0050] In a particular embodiment, the first predefined area 11 is generally the portion of the human-machine interface 1 in contact with the palm of the hand of the operator.

    [0051] For a gripping element 10 of the stick or joystick type, it generally consists of a base of the gripping element 10 over which the palm of the hand of the operator is pressed. It may be completed by lateral pressing areas and/or opposing areas.

    [0052] For a gripping element 10 of the throttle control stick type, it generally consists of an upper portion of the gripping element 10 over which the palm of the hand of the operator rests.

    [0053] Arranged this way, an operator hand detection function is thus achieved.

    [0054] The human-machine interface 1, in particular the gripping element 10, may include at least one second predefined area 12a, in particular two second predefined areas 12a and 12b.

    [0055] The second predefined area 12a, respectively the second predefined area 12b, allows carrying out a command conventionally associated with buttons, switches or contactors.

    [0056] Thus, a command conventionally associated with buttons, switches or contactors may be carried out virtually by determining a contact and/or a press on the second predefined area 12a, respectively the second predefined area 12b.

    [0057] Advantageously, the second predefined area 12a, respectively the second predefined area 12b, is distinct from the first predefined area 11 participating in the operator hand detection function.

    [0058] To carry out a command conventionally associated with buttons and switches, the calculation means of the human-machine interface 1 may also be configured to determine whether the second predefined area 12a, respectively the second predefined area 12b, of the human-machine interface 1 is included, at least partially, in the surface of interaction with the operator.

    [0059] To this end, the calculation means compare the position of the surface in contact with the operator with a position of the second predefined area 12a, respectively the second predefined area 12b.

    [0060] According to the invention, it is determined whether an overlap, at least partial, of the second predefined area 12a, respectively the second predefined area 12b, is achieved by the surface of interaction with the operator.

    [0061] If it results that such an overlap is achieved, the calculation means then emit or transmit the command associated with the second predefined area 12a, respectively with the second predefined area 12b, whose position is included at least partially in the surface of interaction with the operator.

    [0062] Thus, the emission and/or the transmission of the command associated with the second predefined area 12a, respectively with the second predefined area 12b, is carried out in a manner similar to that of a press on a conventional button, switch or contactor of a conventional command interface.
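The virtual-contactor behaviour described in the preceding paragraphs can be sketched as follows. The cell layouts and command names are illustrative assumptions: each second predefined area emits its associated command when it overlaps, at least partially, the interaction surface, exactly as a press on a conventional contactor would.

```python
SECOND_AREAS = {
    "12a": ({(0, 0), (0, 1)}, "CMD_A"),
    "12b": ({(2, 2), (2, 3)}, "CMD_B"),
}


def virtual_presses(interaction_surface):
    """Return the command of every second predefined area touched."""
    return [
        command
        for cells, command in SECOND_AREAS.values()
        if cells & interaction_surface  # at least partial overlap
    ]
```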

    [0063] It is possible to define several virtual contactors on the gripping element 10 of the human-machine interface 1, as well as several types of contactors. It is also possible to determine whether they are activated.

    [0064] Different types of virtual contactors may be defined, like push-buttons, two-position switches, four-way contactors or multi-way analog contactors.

    [0065] The human-machine interface 1 may also comprise at least one directional contactor determining a direction and/or an amplitude of a press-and-drag movement of the operator according to an origin position, an end position of the movement and the position of the predefined area. The directional contactor may be of a two-position type, such as a switch, of a four-position type, or multi-directional.
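One possible way to derive direction and amplitude from the origin and end positions of the movement is sketched below; the dead-zone value, under which the movement falls back to a simple press, is an assumption for illustration.

```python
import math


def classify_movement(origin, end, dead_zone=1.0):
    """Return a (direction, amplitude) pair in sensor-grid units."""
    dr, dc = end[0] - origin[0], end[1] - origin[1]
    amplitude = math.hypot(dr, dc)
    if amplitude < dead_zone:
        return ("press", 0.0)  # too short to count as a drag
    if abs(dr) >= abs(dc):
        return ("down" if dr > 0 else "up", amplitude)
    return ("right" if dc > 0 else "left", amplitude)
```

A simple press and a drag over the same predefined area can thus be bound to different commands.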

    [0066] Advantageously, the directional contactor is associated with the second predefined area 12a, respectively the second predefined area 12b, and/or the second predefined areas 12a and 12b.

    [0067] FIGS. 1 and 2 illustrate embodiments including two second predefined areas 12a and 12b, which can be associated respectively with different contactors and/or with distinct commands.

    [0068] It is also possible to transmit or emit a command associated with a simultaneous interaction, in particular a simultaneous press, over at least two predefined areas of the human-machine interface 1, such as the first predefined area 11, the second predefined area 12a, respectively the second predefined area 12b, and/or the second predefined areas 12a and 12b.
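A chorded command of this kind could be resolved as sketched below; the area layouts and the chord table are hypothetical.

```python
AREAS = {"11": {(7, 0)}, "12a": {(0, 0)}, "12b": {(2, 2)}}
CHORDS = {frozenset({"12a", "12b"}): "CMD_COMBINED"}


def chord_command(interaction_surface):
    """Return the command bound to the set of areas touched at once."""
    touched = frozenset(
        name for name, cells in AREAS.items() if cells & interaction_surface
    )
    return CHORDS.get(touched)  # None if no chord matches
```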

    [0069] In another embodiment, the human-machine interface 1 comprises at least one informative layer for informing the operator of the position and/or of the command associated with at least one predefined area, such as the first predefined area 11, the second predefined area 12a, respectively the second predefined area 12b, and/or the second predefined areas 12a and 12b.

    [0070] Such an informative layer may comprise an indication, in particular graphical, of the delimitation of the predefined area and/or of the command associated with the predefined area.

    [0071] The human-machine interface 1 may also comprise means for illuminating at least one predefined area, allowing highlighting of the area and/or visual feedback related to the command, its availability and/or its activation.

    [0072] In a particular case, the illumination means may consist of light-emitting diodes. They may then advantageously be combined with a liquid crystal layer. According to this particular configuration, it is then possible to combine the illumination function and the information function of the informative layer.

    [0073] Alternatively, self-emissive devices, for example of the OLED type, an acronym for Organic Light-Emitting Diode, may be used to combine the illumination function and the information function in a single layer.

    [0074] Moreover, the human-machine interface 1 may comprise information feedback means, in particular haptic information feedback means. The information feedback means allow the operator to be warned about the command associated with the predefined area.

    [0075] In particular, the information feedback means allow the operator to be informed of an activation, an unavailability or an error related to the command associated with the predefined area.

    [0076] A feedback, in particular a haptic information feedback, may be limited to the second predefined area 12a, respectively the second predefined area 12b, or associated with all or part of the human-machine interface 1, in particular of the gripping element 10.

    [0077] The first predefined area 11 and the second predefined area 12a, respectively the second predefined area 12b, are defined so as to be decorrelated from a physical structure of the gripping element 10 of the human-machine interface 1. Consequently, the position of the first predefined area 11 and of the second predefined area 12a, respectively the second predefined area 12b, may be modified according to the device or the vehicle to which the human-machine interface 1 is connected, or be modified during operation according to external parameters.

    [0078] For example, the commands of the human-machine interface 1 may change according to the flight phase of an aircraft, so as to prioritise access to the relevant commands for the considered flight phase.
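Because the predefined areas are decorrelated from the physical structure of the grip, rebinding them at runtime reduces to a table lookup, as sketched below; the phase names and command labels are illustrative assumptions.

```python
LAYOUTS = {
    "takeoff": {"12a": "GEAR_UP", "12b": "FLAPS_RETRACT"},
    "cruise":  {"12a": "AUTOPILOT_TOGGLE", "12b": "TRIM_SET"},
    "landing": {"12a": "GEAR_DOWN", "12b": "FLAPS_EXTEND"},
}


def command_for(area, flight_phase):
    """Resolve the command currently bound to a predefined area."""
    return LAYOUTS[flight_phase][area]
```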

    [0079] Different embodiments have been described independently hereinabove. Of course, the invention is not limited to the embodiments described above, which are provided only as examples; they may be used separately or combined together depending on the intended use of the human-machine interface 1. The invention encompasses the various modifications, alternative forms and other variants that a person skilled in the art could consider within the context of the invention, and in particular all combinations of the different operating modes described above, considered separately or in combination.