Beam former calibration of a hearing device
11510016 · 2022-11-22
Abstract
A method for adjusting a hearing device (12) adapted to be worn behind an ear (28) comprises: determining a cymba angle (54) between a cartilage (50) above the cymba (46) of the ear (28) and a viewing direction (38) of the user; estimating a tilt angle (39) of the hearing device (12) with respect to the viewing direction (38) from the cymba angle (54); and adjusting a beam former direction (37) of a beam former (34) of the hearing device (12), such that the beam former direction (37) is aligned with the viewing direction (38).
Claims
1. A method for adjusting a hearing device adapted to be worn behind an ear of a user, the method comprising: receiving image data from the ear, the image data containing at least one image of the ear; determining, from the image data, a cymba angle between a cartilage above a cymba of the ear and a viewing direction of the user, wherein a direction of the cartilage is determined by averaging a curve along the cartilage and the cymba angle is determined as an angle between the direction of the cartilage and the viewing direction; estimating a tilt angle of the hearing device worn by the user with respect to the viewing direction from the cymba angle; adjusting a beam former direction of a beam former of the hearing device, such that the beam former direction is aligned with the viewing direction.
2. The method of claim 1, wherein the cymba angle is determined with an image recognition algorithm adapted for identifying parts of the ear.
3. The method of claim 1, wherein the cymba angle is determined with a machine learning algorithm trained with image data of ears with known cymba angles.
4. The method of claim 1, wherein the image data contains images of the ear from different directions and a three-dimensional representation is determined from the image data; wherein the cymba angle is determined from the three-dimensional representation.
5. The method of claim 1, wherein the image data contains an image of a marker provided beside the ear, the marker having at least one of a scale or an indication of the viewing direction.
6. The method of claim 1, further comprising: determining an ear size from the image data; wherein a distance from a front of a helix of the ear to an ear channel is determined from the image data.
7. The method of claim 1, further comprising: determining an optimal tube length of a tube interconnecting a part of the hearing device behind the ear with a part of the hearing device in the ear from the image data.
8. The method of claim 1, further comprising: determining, from the image data, whether the user wears glasses.
9. The method of claim 1, wherein the tilt angle is determined from a lookup table.
10. The method of claim 1, wherein the tilt angle is determined with a machine learning algorithm, which has been trained with known cymba angles.
11. The method of claim 1, wherein the tilt angle is determined from the cymba angle and at least one of: an ear size, a selected tube length of a tube interconnecting a part of the hearing device behind the ear with a part of the hearing device in the ear, information about whether the user wears glasses or not.
12. A non-transitory computer-readable medium storing instructions, which when executed by a processor, cause a hearing system to perform operations, the operations comprising: receiving image data from an ear of a user, the image data containing at least one image of the ear; determining, from the image data, a cymba angle between a cartilage above a cymba of the ear and a viewing direction of the user, wherein a direction of the cartilage is determined by averaging a curve along the cartilage and the cymba angle is determined as an angle between the direction of the cartilage and the viewing direction; estimating a tilt angle of the hearing device worn by the user with respect to the viewing direction from the cymba angle; adjusting a beam former direction of a beam former of the hearing device, such that the beam former direction is aligned with the viewing direction.
13. The non-transitory computer-readable medium of claim 12, wherein the cymba angle is determined with an image recognition algorithm adapted for identifying parts of the ear.
14. The non-transitory computer-readable medium of claim 12, wherein the cymba angle is determined with a machine learning algorithm trained with image data of ears with known cymba angles.
15. The non-transitory computer-readable medium of claim 12, wherein the image data contains images of the ear from different directions and a three-dimensional representation is determined from the image data; wherein the cymba angle is determined from the three-dimensional representation.
16. The non-transitory computer-readable medium of claim 12, wherein the image data contains an image of a marker provided beside the ear, the marker having at least one of a scale or an indication of the viewing direction.
17. The non-transitory computer-readable medium of claim 12, further comprising: determining an ear size from the image data; wherein a distance from a front of a helix of the ear to an ear channel is determined from the image data.
18. The non-transitory computer-readable medium of claim 12, further comprising: determining an optimal tube length of a tube interconnecting a part of the hearing device behind the ear with a part of the hearing device in the ear from the image data.
19. The non-transitory computer-readable medium of claim 12, further comprising: determining, from the image data, whether the user wears glasses.
20. The non-transitory computer-readable medium of claim 12, wherein the tilt angle is determined from a lookup table.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
(1) Below, embodiments of the present invention are described in more detail with reference to the attached drawings.
(8) The reference symbols used in the drawings, and their meanings, are listed in summary form in the list of reference symbols. In principle, identical parts are provided with the same reference symbols in the figures.
DETAILED DESCRIPTION OF EXEMPLARY EMBODIMENTS
(10) The adjusting system 14 may comprise a camera 16, which is used for acquiring images and/or image data 17 of the ear and/or a marker 18, such as a cartoon strip, which comprises a scale and/or indicators for a viewing direction of the user. Furthermore, the adjusting system 14 may comprise a computing unit 20, which automatically may determine control parameters for the hearing device 12 from the images/the image data 17. The adjusting system 14 may establish data communication with the hearing device 12 and may implement the control parameters in the hearing device 12.
(11) The hearing device 12 comprises at least two microphones 22 and a loudspeaker 24, which are also shown in
(12) As shown in
(13) The part 26 carries the microphones 22 and further electronics, which provide a beam former 34 as described above and below.
(14) It may be that the loudspeaker 24 is situated in the part 26. In this case, the tube 32 may be a sound conductor into the ear channel. It also may be that the loudspeaker 24 is provided in the part 30. In this case, the tube 32 may house a line for transmitting signals to the loudspeaker 24.
(17) The components 34 and 40 of the hearing device 12 may be implemented as software modules in the hearing device 12, which may comprise a processor for executing these modules.
(18) The direction and angle width of the beam former 34 may be set and/or adjusted with control parameters that are stored in the hearing device 12.
(20) In step S12, a cymba angle 54 between a cartilage 50 above the cymba 46 of the ear 28 and a viewing direction 38 of the user is determined.
(21) A definition of the cymba angle is given with respect to
(22) The ear 28 has a cartilage 50 arranged above the cymba 46, which is slanted with respect to the viewing direction 38. The cartilage 50, which may be called “rook”, is related to the anatomy behind the ear 28 and influences how the part 26 of the hearing device 12 is positioned behind the ear 28. To this cartilage 50, a cartilage direction 52 may be associated, which may run along a longitudinal extension of the cartilage 50.
(23) The cymba angle 54 is determined as the angle between the cartilage 50 and/or the cartilage direction 52 and the viewing direction 38, which may be defined as the horizontal axis of the head, when the elevation of the head is 0 degrees.
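The determination described in paragraphs (20) to (23) can be sketched in code. The following Python function is illustrative only (its name and the use of a least-squares line fit are assumptions, not taken from the patent): it averages a curve sampled along the cartilage 50 by fitting a line and returns the angle of that line relative to the viewing direction 38, taken as the image x axis.

```python
import math

def cymba_angle_deg(points):
    """Estimate the cymba angle 54 from (x, y) pixel coordinates
    sampled along the cartilage 50, with the x axis aligned with
    the viewing direction 38.  The cartilage direction 52 is
    obtained by averaging the curve with a least-squares line fit;
    the returned angle lies between that direction and the x axis."""
    n = len(points)
    mx = sum(x for x, _ in points) / n
    my = sum(y for _, y in points) / n
    sxx = sum((x - mx) ** 2 for x, _ in points)
    sxy = sum((x - mx) * (y - my) for (x, y) in points)
    # slope = sxy / sxx; atan2 is used so a horizontal fit gives 0.
    return math.degrees(math.atan2(sxy, sxx))
```

For points sampled along a 45-degree line the function returns 45.0; for points along the viewing direction it returns 0.0.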
(24) Experiments have shown that the larger the cymba angle 54, the larger the tilt angle 39.
(25) The cymba angle 54 may be determined from a picture, an image or image data 17 of the ear 28, such as shown in
(27) For example, a hearing aid specialist may position the marker 18 at the ear 28 and may take one or more pictures with the camera 16, which then sends the image data 17 to the computing unit 20. As an example, the camera 16 may be a component of a smartphone. The computing unit 20 then receives the image data 17 from the ear 28 and automatically determines the cymba angle 54 from it.
(28) The images and/or image data 17 may be acquired with a software that gives the photographer feedback for taking pictures of usable quality. A predefined symbol of a potential ear on a display, which also shows the live image of the camera 16, may show the photographer how to place the camera 16. There also may be visual feedback on the camera while taking the picture, which shows whether the framing of the ear is correct. Also, a message may be provided that informs the photographer whether too much hair covers the ear. This feedback may be given visually by detecting the hairs and highlighting them.
(29) It also may be that the software is adapted for being used by the user of the hearing device 12 himself. If the user is taking a photo of himself, the software may help him achieve an accurate framing by guiding his hand to the right location with acoustic notifications (such as voice messages or beep tones).
(30) Furthermore, as soon as the position of the camera 16 is correct, the picture and/or the image data 17 may be acquired automatically, i.e. without any interaction of the photographer (see also “automatic release” below). When the framing of the picture is usable, a display of the camera may turn green to signal to the photographer to acquire the picture, and/or the camera may capture the picture itself.
(31) It may be that the image data 17 contains pictures of the ear 28 from different directions and that a three-dimensional representation is determined from the image data 17. The cymba angle 54 may be determined from the three-dimensional representation. By moving the camera 16 around the ear, a 3D scan may be performed. Also, a 3D image processing algorithm may be applied, which may capture spatial information, such as the angle from the camera 16 to the head, the elevation of the head, the size relations of components of the ear 28, etc.
(32) As already mentioned, a marker 18 also may have been positioned beside the ear 28, and the image data 17 may contain a picture of the marker 18. Also from this marker, the angle from the camera 16 to the head, the elevation of the head, the size relations of components of the ear 28, etc. may be determined.
(33) The marker 18 may be a cartoon as shown in
(34) A mobile device, such as a smartphone, which provides the camera 16, also may display augmented reality images to show the user how the hearing device 12 may look in dependence of different parameters. For example, different hearing devices 12 with different housings and/or with longer and shorter tubes 32 may be projected into the image 17. Also, for the current configuration, a beam former performance may be added in the form of a description (excellent, good, poor, . . . ) and/or color labels (green, orange, red).
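A minimal sketch of how such descriptions and color labels might be assigned from the remaining tilt: the function name and the 5- and 12-degree thresholds below are invented for illustration and are not values from the patent; real limits would come from acoustic measurements of the beam former performance.

```python
def beam_former_rating(tilt_deg):
    # Hypothetical mapping from the tilt angle to the description
    # and color label of the beam former performance; the
    # thresholds are assumptions for illustration only.
    if abs(tilt_deg) < 5.0:
        return ("excellent", "green")
    if abs(tilt_deg) < 12.0:
        return ("good", "orange")
    return ("poor", "red")
```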
(35) There are several possibilities how the cymba angle 54 may be determined by the computing unit 20.
(36) For example, an image recognition and/or image processing algorithm may identify at least some of the parts 42, 44, 46, 48, 50 of the ear 28.
(37) Alternatively or additionally, the cymba angle 54 may be determined with a machine learning algorithm, which has been trained with image data 17 of ears 28, where the cymba angles 54 were known.
(38) During step S12, further parameters may be determined that may be useful during the next step S14, in which the tilt angle 39 is determined.
(39) For example, an ear size may be determined from the image data 17. As a further example, a distance from a front of the helix 48 of the ear 28 to the ear channel 44 may be determined from the image data 17. Here, the marker 18 with the scale 56 and/or a 3D scan may be used, in which the size of different parts of the ear 28 may be estimated.
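For the marker-based variant, the scale 56 of known physical length calibrates pixel measurements in the image. A minimal sketch (the function and parameter names are illustrative, not from the patent):

```python
def distance_mm(scale_px, scale_mm, distance_px):
    # The scale 56 on the marker 18 has a known physical length
    # (scale_mm) and a measured image length (scale_px); the
    # resulting mm-per-pixel factor converts any pixel distance in
    # the same image plane, e.g. from the front of the helix 48 to
    # the ear channel 44.
    return distance_px * (scale_mm / scale_px)
```

For example, if a 20 mm scale spans 200 pixels, a 350-pixel distance corresponds to 35 mm.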
(40) Furthermore, an optimal tube length of a tube 32 interconnecting a part 26 of the hearing device 12 behind the ear 28 with a part 30 of the hearing device 12 in the ear 28 may be determined from the image data 17. This tube length may be determined from the distance mentioned above.
(41) It may be that different tube lengths are offered by the method; for example, the method may determine that a tube with length “2” or “3” or “2.5” may be needed.
(42) Also, a hearing aid specialist may enter which length of the tube 32 has been chosen, for example one of the proposed different lengths. The method then may recognize whether the length is longer than, shorter than, or exactly matches the actual size of the user's anatomy, i.e. the distance determined above.
(43) Furthermore, during step S12, it may be determined whether the user wears glasses. This may be done with the image data 17, for example automatically by the image recognition algorithm and/or the machine learning algorithm. Alternatively, a hearing aid specialist may enter manually and/or via a conversational user interface whether the user is wearing glasses or not.
(44) Also, the type of hearing device may be determined during step S12. For example, a hearing aid specialist may enter this information into the adjusting system 14. Alternatively, the adjusting system 14 may detect the type of hearing device 12, for example via data communication with the hearing device 12.
(45) In step S14, a tilt angle 39 of the hearing device 12 with respect to the viewing direction 38 is estimated from the cymba angle 54 and optional further data determined during step S12. The tilt angle 39 may be determined from the cymba angle 54 and at least one of: an ear size, a selected tube length of a tube 32 interconnecting a part 26 of the hearing device 12 behind the ear 28 with a part 30 of the hearing device 12 in the ear 28, and information about whether the user wears glasses or not.
(46) The tilt angle 39 may be determined from a lookup table. Such a lookup table may be provided by a database with measurements of resulting tilt angles dependent on the defined parameters. For values between the discrete data points, the method may interpolate between the two nearest entries of the lookup table.
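Such a lookup with linear interpolation between the two nearest entries can be sketched as follows; the function name is illustrative and the table values in the example are invented, standing in for the measured database entries.

```python
from bisect import bisect_left

def tilt_from_lookup(table, cymba_deg):
    """table: (cymba angle, measured tilt angle) pairs, sorted by
    cymba angle.  Interpolates linearly between the two nearest
    entries and clamps outside the measured range."""
    xs = [x for x, _ in table]
    i = bisect_left(xs, cymba_deg)
    if i == 0:
        return table[0][1]          # below the measured range
    if i == len(xs):
        return table[-1][1]         # above the measured range
    (x0, y0), (x1, y1) = table[i - 1], table[i]
    return y0 + (cymba_deg - x0) * (y1 - y0) / (x1 - x0)
```

With an invented table [(10, 4), (20, 8), (30, 14)], a cymba angle of 15 degrees yields a tilt of 6 degrees, halfway between the two nearest entries.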
(47) It also may be that the tilt angle 39 is determined with a machine learning algorithm, which has been trained with known cymba angles 54 and/or further parameters. The machine learning algorithm may be trained offline with data such as determined during step S12 and matching tilt angles 39.
(48) It also may be that a 3D scan is performed with the camera 16, after the hearing device 12 has been put on the ear 28. The tilt angle of the microphones 22 may then be calculated directly while the hearing device 12 is on the ear 28.
(49) In step S16, a beam former direction 37 of the beam former 34 of the hearing device 12 may be adjusted, such that the beam former direction 37 is aligned with the viewing direction 38.
(50) For example, dependent on the tilt angle, the adjusting system 14 may determine the control parameters of the beam former 34, such that its direction is parallel to the viewing direction 38. These control parameters may be implemented in the hearing device 12 via data communication.
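One common way to realize such a steering (an assumption for illustration; the patent does not specify the device's actual parameterization) is a delay-and-sum beam former, where the inter-microphone delay sets the look direction:

```python
import math

SPEED_OF_SOUND = 343.0  # m/s, at room temperature

def steering_delay_s(mic_spacing_m, tilt_deg):
    # For a plane wave arriving from the viewing direction 38, the
    # path difference between the two microphones 22 is the
    # projection of the microphone axis onto that direction,
    # d * cos(tilt).  Applying this delay to the front microphone's
    # signal realigns the beam former direction 37 with the viewing
    # direction despite the tilt angle 39.
    return mic_spacing_m * math.cos(math.radians(tilt_deg)) / SPEED_OF_SOUND
```

With no tilt, the full endfire delay d/c is applied; as the tilt grows, the effective path difference, and hence the compensating delay, shrinks.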
(51) It also may be that the adjusting system 14 provides feedback on chosen hearing devices 12 and/or a chosen length of the tube 32.
(52) For example, if the tilt angle 39 is very disadvantageous to the beam former performance, the adjusting system 14 may advise the hearing aid specialist to choose a shorter tube length. If the hearing aid specialist does so and enters the other tube length he has chosen, the adjusting system 14 may determine the tilt angle 39 and/or the control parameters based on this choice.
(53) While the invention has been illustrated and described in detail in the drawings and foregoing description, such illustration and description are to be considered illustrative or exemplary and not restrictive; the invention is not limited to the disclosed embodiments. Other variations to the disclosed embodiments can be understood and effected by those skilled in the art and practising the claimed invention, from a study of the drawings, the disclosure, and the appended claims. In the claims, the word “comprising” does not exclude other elements or steps, and the indefinite article “a” or “an” does not exclude a plurality. A single processor or controller or other unit may fulfill the functions of several items recited in the claims. The mere fact that certain measures are recited in mutually different dependent claims does not indicate that a combination of these measures cannot be used to advantage. Any reference signs in the claims should not be construed as limiting the scope.
LIST OF REFERENCE SYMBOLS
(54) 12 hearing device
(55) 14 adjusting system
(56) 16 camera
(57) 18 marker
(58) 20 computing unit
(59) 22 microphone
(60) 24 loudspeaker
(61) 26 part behind the ear
(62) 28 ear
(63) 30 part in the entrance of the ear channel
(64) 32 tube
(65) 34 beam former
(66) 36 amplification curve
(67) 37 beam former direction
(68) 38 viewing direction
(69) 39 tilt angle
(70) 40 processing unit
(71) 42 concha
(72) 44 cavum
(73) 46 cymba
(74) 48 helix
(75) 50 cartilage
(76) 52 cartilage direction
(77) 54 cymba angle
(78) 56 scale
(79) 58 indicator for viewing direction