METHOD AND DEVICE FOR HELPING TO UNDERSTAND AN AUDITORY SENSORY MESSAGE BY TRANSFORMING IT INTO A VISUAL MESSAGE
20170018281 · 2017-01-19
CPC classification (PHYSICS)
G06F40/58
G02B2027/0187
G02B27/0093
Abstract
A method and a device for helping a hearing-impaired person to understand an auditory sensory message by transforming the auditory message into a visual message and projecting this message on a support in the field of vision of the hearing-impaired person. The device comprises a support (11) carrying a sensor in the form of microphones (16, 17) to pick up the message, and a recording and memorizing module (19) for recording and storing the auditory message in real time and transforming it into a visual message. A screen (18) is placed in the field of vision of the user, and at least one projector (20, 21) projects the visual message on the screen. A sensor detects pupil movement of the user, a mechanism converts the pupil movement into a display command for the visual message, and the command is carried out by projecting the visual message.
Claims
1-15. (canceled)
16. A method for helping a hearing-impaired person called a user to understand an auditory sensory message by transforming the auditory sensory message into a visual message and projecting the visual message on a surface positioned in a field of vision of the user, the method comprising: picking up the auditory sensory message by at least one sensor mounted on a support of a spectacle or equivalent component, recording the auditory message in real time to allow re-using the recorded auditory message in deferred time, transforming the auditory sensory message into a visual message, projecting the visual message, following a user command, on a screen integral with the support of the spectacle or equivalent component placed in the field of vision of the user, carrying out the command by at least one pupil movement of the user, and performing the projection of the visual message in a static or scrolling way, one single time or repeatedly, immediately or with a delay, depending on the user command that is picked up by a member detecting the pupil movement.
17. The method for helping the hearing-impaired person according to claim 16, further comprising carrying out the projection of the visual message on at least two display lines, with the first line being assigned to the display of the visual transcription of the auditory message, and the second line being assigned to additional information useful for the comprehension of the message.
18. The method for helping the hearing-impaired person according to claim 17, further comprising translating the additional information, useful for the comprehension of the message, into a language selected by the user.
19. The method for helping the hearing-impaired person according to claim 17, further comprising carrying out the projection of the visual message with a predetermined lag with respect to communication of the auditory message.
20. A device (10) implementing the method for helping a hearing-impaired person called a user to understand an auditory sensory message, the device including means for transforming the auditory sensory message into a visual message and means for projecting the visual message on a surface positioned in a field of vision of the user, wherein the device comprises: a support (11) arranged to carry the set of functional elements involved in the fulfillment of the functions of the method, the set of functional elements comprising: at least one sensor (16, 17) mounted on the support and arranged to pick up the auditory sensory message, recording and memorizing means (19a) for recording and memorizing the auditory message in real time, transformation means (19b) for transforming the auditory sensory message into a visual message, a screen (18) integral with the support (11) and placed in the field of vision of the user, at least one element (20, 21) for projecting the visual message on the screen of the support, at least one sensor (25, 26) for sensing pupil movement of the user, means (19c) for converting the pupil movement of the user into a display command for the visual message, and means (19d) for carrying out the command by projecting the visual message in a static or a scrolling way, one single time or repeatedly, immediately or with a delay.
21. The device according to claim 20, wherein the support is of a spectacle or equivalent component and comprises a front bar (12) with a bridge (13), the front bar carrying two temples (14, 15) articulated at the ends of the front bar (12).
22. The device according to claim 21, wherein the at least one element to project the visual message includes two projectors (20, 21) respectively mounted on the temples (14, 15), each of the projectors being arranged so as to project at least a part of the visual message corresponding to the auditory sensory message.
23. The device according to claim 20, wherein the device comprises means (25, 26) for picking up at least one pupil movement of the user, means for interpreting the at least one pupil movement and means for carrying out a command for projecting the visual message according to the interpretation of the pupil movement.
24. The device according to claim 22, wherein the means (19d) for carrying out a command for projecting the visual message are coupled with the two projectors (20, 21) arranged to project at least a part of the visual message.
25. The device according to claim 20, wherein the at least one sensor arranged for picking up the auditory sensory message is a directional microphone (16, 17) mounted on the front section of the support.
26. The device according to claim 20, wherein the device includes an electronic unit (19) powered by an electrical power source (22), the electronic unit comprising a first memorization module (19a) arranged to memorize temporarily and in real time the auditory sensory message, a second transcription module (19b) arranged to transcribe the auditory message into visual data, a third control module (19c) arranged for receiving and interpreting the pupillary command message and working out the corresponding visual message, and a fourth display triggering module (19d) arranged for carrying out the display of the message according to the pupillary command.
27. The device according to claim 26, wherein at least one of the first memorization module (19a), the second transcription module (19b), the third control module (19c) and the fourth display triggering module (19d) of the electronic unit (19) is arranged remotely from the support of the device (10) and comprises means (30, 31) for transmitting and receiving information to and from equipment external to the device.
28. The device according to claim 21, wherein the device comprises a display device (18) mounted on the front bar and arranged for displaying the visual message in the form of a signal made of at least one of words, written phrases and informative acronyms.
29. The device according to claim 26, wherein the display device comprises a screen (18) including at least two lines (18a, 18b) arranged for displaying the visual message in a static or scrolling way, one single time or repeatedly, immediately or with a delay.
30. The device according to claim 26, wherein the device is connectable, through the electronic unit (19), with any external device having a suitable connection, for receiving an auditory signal to be processed in a similar way and according to a similar procedure as a signal provided by the microphones (16, 17).
31. The device according to claim 30, wherein the device is connectable through the electronic unit (19), which is of the Bluetooth transceiver type.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0040] The present invention and its advantages will be better revealed in the following description of an embodiment, given as a non-limiting example, with reference to the appended drawings, in which:
[0041]
[0042]
[0043]
BEST WAYS OF REALIZING THE INVENTION
[0044] The device according to the invention comprises several components that allow, on the one hand, picking up vocal information and transcribing this vocal information into visual information and, on the other hand, displaying said visual information in the field of vision of a user, at his request, and if necessary scrolling the visual information in a predetermined area of said field of vision. The information can be memorized; its display can be static, but is preferably moving, and scrolling can be initiated instantaneously or with a previously defined lag, or even repeated, so that the user can access the information at any time he considers necessary, according to his level of comprehension of the communicated vocal information.
[0045] It is essential that the visual data be put at the disposal of the user in an efficient, quasi-automatic and discreet way, so that the user is not penalized by the handicap of poor auditory perception or of a wrong comprehension of the vocal message, and is able to take part normally in a discussion or even a debate.
[0046] Referring to the figures and especially to
[0047] Support 11 carries two highly directional microphones 16 and 17 that are mounted on either side of bridge 13 of front bar 12 and that preferentially pick up the sounds emitted by an interlocutor located in front of the user. Front bar 12 is advantageously arranged at the level of the user's eyebrows and includes in its upper section a rectilinear strip 18 having a height sufficient to display at least two lines of text, said strip 18 forming a screen on which is projected the visual information transcribing the vocal information that the hearing-impaired user has difficulty hearing and/or understanding.
[0048] The vocal information picked up is processed by an electronic unit 19 that includes different areas with different functions. In the example represented in
[0049] Electronic unit 19 is powered by an electrical power source 22, preferably a rechargeable battery located for example in temple 14, by means of wiring integrated in support 11. Front bar 12 also serves as the support for spectacle lenses 23, 24. If the support is carried by a classical spectacle frame, the spectacle lenses can be mounted directly in this frame, and support 11 of device 10 then does not carry the spectacle lenses, but only the functional components of device 10.
[0050] The mentioned communication with a mobile phone, a tablet or any other equipment provided with a display screen allows transferring the sound signal to a signal processing device integrated in these devices, but also receiving from the latter a return signal, in the form of a phrase written in a suitable language or of particular symbols, that can be shown on the display of strip 18.
[0051] The first module 19a of electronic unit 19, called the memorization module, is arranged for memorizing the auditory information; the third module 19c, called the command interpretation module, and the fourth module 19d, called the display triggering module, are arranged for re-using the data memorized in module 19a in order to delay, if necessary by a few seconds, the display of this information transcribed in written form.
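By way of non-limiting illustration, the retention behaviour of memorization module 19a described above can be sketched as follows. This is a minimal Python sketch; the class and method names are assumptions made for illustration only and do not form part of the disclosure.

```python
import time
from collections import deque


class MemorizationModule:
    """Illustrative sketch of module 19a: retains the last few seconds
    of transcribed information so that it can be replayed, delayed or
    repeated, on request (modules 19c/19d in the description)."""

    def __init__(self, window_s=10.0):
        self.window_s = window_s          # retention window in seconds
        self.entries = deque()            # (timestamp, text) pairs

    def record(self, text, now=None):
        """Store a transcribed fragment and discard expired fragments."""
        now = time.monotonic() if now is None else now
        self.entries.append((now, text))
        while self.entries and now - self.entries[0][0] > self.window_s:
            self.entries.popleft()

    def replay(self, now=None):
        """Return the retained fragments, oldest first, for deferred display."""
        now = time.monotonic() if now is None else now
        return [text for ts, text in self.entries if now - ts <= self.window_s]
```

For example, a fragment recorded more than the window length ago is no longer returned by `replay`, which matches the "last few seconds" behaviour described in paragraph [0059].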
[0052] The fourth module 19d, called the display triggering module, is arranged to allow displaying, if necessary by scrolling, the transcribed visual message, delayed or repeated as the case may be, and possibly with a translation, at the request of the user. The operation is triggered at the user's request by the detection of a predetermined movement of his eyeball by means of at least one pupil movement detector 25, 26 located on the internal side of front bar 12, in front of each eye.
[0053] Device management software allows the user to select operating options relating, for example, to the display of the information transcribed from phonic to visual form, such as selection of the static display mode, selection of the scrolling display mode, repetition of a piece of information, selection of a translation language, or the integral transcription of a telephone conversation. The active pupil movement could, for example, be an upward movement with a first meaning, namely the command to start a projection; a leftward movement having a second meaning, such as the command for a time lag; a rightward movement having a third meaning, such as the command to accelerate the scrolling; and a downward movement having a fourth meaning, such as access to the selection menu of the display options and their validation.
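The gesture-to-command assignment of paragraph [0053] can be sketched, by way of non-limiting illustration, as a simple lookup performed by control module 19c. The gesture labels and command identifiers below are assumptions chosen for illustration; only the four movement meanings themselves come from the description.

```python
# Illustrative sketch of module 19c: each predetermined pupil movement
# maps to one display command, per paragraph [0053].
GESTURE_COMMANDS = {
    "up": "start_projection",        # first meaning: start a projection
    "left": "apply_time_lag",        # second meaning: command a time lag
    "right": "accelerate_scrolling", # third meaning: accelerate scrolling
    "down": "open_options_menu",     # fourth meaning: display options menu
}


def interpret_gesture(gesture):
    """Return the command for a detected pupil movement, or None when the
    movement is not one of the predetermined command gestures."""
    return GESTURE_COMMANDS.get(gesture)
```

Movements outside the predetermined set are ignored, so ordinary eye activity does not trigger a display command.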
[0054] Further possibilities can be developed, such as displaying at least one light signal when an interlocutor is perceived, in order to alert the user; providing a written description of an abnormal sound environment; displaying alarm acronyms for certain sounds, such as the ringing of a phone, a doorbell, an alarm clock, an abnormally high sound level, an approaching vehicle, breaking glass or metal, a fire, boiling water, flowing water, a thunderstorm, a falling object, the noise made by a person, or barking; or communicating in writing orally transmitted instructions.
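The alarm-acronym possibility mentioned above can be sketched, as a non-limiting illustration, as a mapping from a recognized sound class to a short acronym shown on strip 18. The sound-class names and the acronyms themselves are assumptions for illustration; the description only lists the kinds of sounds concerned.

```python
# Illustrative sketch: recognized sound classes displayed as short
# informative acronyms on the display strip (paragraph [0054]).
ALARM_ACRONYMS = {
    "phone_ringing": "TEL",
    "doorbell": "DOOR",
    "alarm_clock": "CLK",
    "approaching_vehicle": "CAR",
    "breaking_glass": "GLASS",
    "fire": "FIRE",
    "boiling_water": "BOIL",
}


def acronym_for(sound_class, default="?"):
    """Return the display acronym for a recognized sound class, or a
    neutral placeholder for an unrecognized one."""
    return ALARM_ACRONYMS.get(sound_class, default)
```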
[0055]
[0056] The operation consists in picking up the auditory sensory message with the two highly directional microphones 16 and 17 and in sending the signals into electronic circuit 19, more specifically into first module 19a of this circuit, which plays the role of the receiver, powered by electrical current source 22. The auditory sensory message is transformed into a visual message by module 19b. The connection between microphones 16 and 17 and the electronic circuit can be ensured by means of a cable mounted in support 11. Module 19c is designed to interpret a control signal transmitted by pupil movement detectors 25 and 26. Module 19d, called the display triggering module, triggers the display of this message according to the interpretation provided by pupil movement detectors 25 and 26. The display can be subdivided into two complementary message elements C1 and C2 by transmission to projectors 20 and 21, which project it on one line 18a or on two lines 18a and 18b of the display carried by strip 18.
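The subdivision of the transcribed message into the two complementary elements C1 and C2 can be sketched, by way of non-limiting illustration, as follows. The fixed line width and the function name are assumptions for illustration; only the C1/C2 split onto two projectors comes from the description.

```python
def split_message(text, line_width=32):
    """Illustrative sketch of the C1/C2 split: divide the transcribed
    message into two complementary elements so that each projector
    (20, 21) displays one part on lines 18a/18b. Words fill the first
    element up to the line width; the remainder goes to the second."""
    words = text.split()
    c1, c2 = [], []
    for word in words:
        # keep filling C1 until it would overflow; then everything
        # remaining goes to C2, preserving word order
        if not c2 and len(" ".join(c1 + [word])) <= line_width:
            c1.append(word)
        else:
            c2.append(word)
    return " ".join(c1), " ".join(c2)
```

A short message fits entirely in C1 and is projected on a single line 18a; a longer message overflows into C2 and occupies both lines 18a and 18b.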
[0057]
[0058] In the second variant, only the transformation of the vocal message into a visual message and the possible addition of additional data or the transformation of the signals is carried out on equipment external to device 10.
[0059] The visual message, which consists of one or two phrases, or the displayed acronym, corresponds to the last few seconds elapsed and recorded by the auditory sensor. The user can then make sure that he has heard the sound message correctly, without errors, thus ensuring perfect comprehension even in the absence of total perception.
[0060] This operating mode can be used for situations requiring perfect comprehension, even when there is no perception deficiency. This is the case for a discussion between two persons speaking different languages, when the user does not perfectly master the language of his interlocutor. In this case, the first line displays the message in the language of the interlocutor and allows the user to make sure he perceived the right words, and thus to improve his knowledge of this language. He can also use the second line to display the translation of the upper line into his own language, thereby ensuring good comprehension even if he does not master the language concerned at all.
[0061] The device can be used by a person suffering from total or partial deafness for all everyday activities, in particular for perceiving alarms or for verbal or telephone communication; by a person suffering from a cognitive impairment, for example dementia or Alzheimer's disease, that prevents understanding the meaning of a sound signal; or by a person performing work far away from a signal transmitter or in a noisy environment that does not allow proper comprehension of a phonic signal.
[0062] The device might also be used as a prompter for lecturers or actors in the theatre, or to learn music, with one line displaying the score and the second its literal translation or the like.
[0063] The invention is not restricted to the described embodiments and can have different aspects within the framework defined by the claims. The realization and layout of the various components could be modified while still respecting the functionalities of the device. The applications could be extended beyond the support for people suffering from total or partial deafness.
[0064] In particular, the device can be arranged to be connected through electronic unit 19, for example of the Bluetooth transceiver type, with any external device having a suitable connection, and to receive an auditory signal to be processed in the same way and according to the same procedure as the signal provided by microphones 16 and 17. This variant allows in particular visually processing a telephone communication or a command given remotely on a building site, for example by means of a walkie-talkie or any other auditory or electrical means.
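The point of this variant is that electronic unit 19 applies one and the same processing procedure whatever the origin of the signal. As a non-limiting illustration, this can be sketched with a common source interface; all class and method names are assumptions for illustration and do not form part of the disclosure.

```python
class AudioSource:
    """Illustrative common interface: unit 19 processes any source the
    same way, per paragraph [0064]."""

    def read_chunk(self):
        raise NotImplementedError


class MicrophoneSource(AudioSource):
    """Signal from the on-board microphones 16 and 17."""

    def __init__(self, samples):
        self.samples = list(samples)

    def read_chunk(self):
        return self.samples.pop(0) if self.samples else None


class BluetoothSource(AudioSource):
    """Signal received from an external device (phone, walkie-talkie)."""

    def __init__(self, samples):
        self.samples = list(samples)

    def read_chunk(self):
        return self.samples.pop(0) if self.samples else None


def process(source):
    """Drain a source and return its chunks in order: the identical
    procedure applies regardless of which concrete source is connected."""
    chunks = []
    while (chunk := source.read_chunk()) is not None:
        chunks.append(chunk)
    return chunks
```

Because `process` depends only on the common interface, a telephone communication received over Bluetooth is handled exactly like the microphone signal, as the variant requires.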