HEARING DEVICE

20230209298 · 2023-06-29


    Abstract

    The present disclosure relates to a hearing device comprising a first radar sensor configured for obtaining first radar data indicative of a difference in the orientation of a head of the user and a further body part of the user. A processing unit of the hearing device is configured to receive the first radar data from the first radar sensor and determine, based on the first radar data, a first relative orientation, wherein the first relative orientation is a difference in the orientation of the head of the user and the further body part of the user.

    Claims

    1-15. (canceled)

16. A hearing device, comprising a first housing for being worn at and/or in a first ear of a user, and a processing unit configured to control functionality of the hearing device, wherein the hearing device further comprises a first radar sensor configured for obtaining first radar data indicative of a difference in the orientation of a head of the user and a further body part of the user, an orientation sensor configured for obtaining orientation data regarding a head orientation of the user, and wherein the processing unit is further configured to: receive the first radar data from the first radar sensor, receive the orientation data from the orientation sensor, determine, based on the orientation data, the head orientation of the user, output the head orientation of the user, determine, based on the first radar data, a first relative orientation, wherein the first relative orientation is a difference in the orientation of the head of the user and the further body part of the user, output the first relative orientation, and based on the first relative orientation, calibrate the orientation sensor.

    17. A hearing device according to claim 16, wherein the orientation sensor is one or more of the following: a gyroscope, an accelerometer, a magnetometer, and an inertial measurement unit, IMU.

    18. A hearing device according to claim 16, wherein the first radar sensor is configured to emit a radio signal at a frequency of 40 GHz-120 GHz, preferably 50 GHz-100 GHz, and even more preferably 55 GHz-65 GHz.

    19. A hearing device according to claim 16, wherein the first radar sensor comprises a phased array.

    20. A hearing device according to claim 19, wherein the phased array comprises at least four antennas.

    21. A hearing device according to claim 19, wherein the processing unit is further configured to: control a phase relation of antennas within the phased array to perform a scan of a user wearing the hearing device.

    22. A hearing device according to claim 16, wherein the processing unit is further configured to: receive an audio signal, and process the received audio signal based on the determined first relative orientation.

    23. A hearing device according to claim 16, wherein the hearing device is a headset, or a pair of earbuds.

    24. A hearing device according to claim 16, wherein the hearing device further comprises a second housing for being worn at and/or in a second ear of a user, and a second radar sensor configured for obtaining second radar data indicative of the orientation of a head of the user relative to a further body part of the user, and wherein the processing unit is further configured to: receive the second radar data from the second radar sensor, and determine, based on the received first radar data and the received second radar data, the orientation of a head of the user relative to a further body part of the user.

    25. A hearing system, comprising: a hearing device, comprising a first housing for being worn at and/or in a first ear of a user, and a processing unit configured to control functionality of the hearing device, wherein the hearing device further comprises a first radar sensor configured for obtaining first radar data indicative of a difference in the orientation of a head of the user and a further body part of the user, an orientation sensor configured for obtaining orientation data regarding a head orientation of the user, and wherein the processing unit is further configured to: receive the first radar data from the first radar sensor, receive the orientation data from the orientation sensor, output the head orientation of the user, determine, based on the first radar data, a first relative orientation, wherein the first relative orientation is a difference in the orientation of the head of the user and the further body part of the user, output the first relative orientation, and based on the first relative orientation, calibrate the orientation sensor, wherein the hearing system further comprises an external processing unit, wherein the external processing unit is configured to: receive an audio signal, receive the first relative orientation from the processing unit, process the received audio signal based on the received first relative orientation, and transmit the processed audio signal to the processing unit.

    26. A hearing system according to claim 25, wherein the external processing unit is further configured to: based on the first relative orientation, render the received audio signal into a 3D audio signal.

    27. A hearing system according to claim 25, wherein the hearing device comprises an orientation sensor configured for obtaining head orientation data regarding a head orientation of the user, wherein the external processing unit is further configured to: receive the head orientation data from the hearing device, determine a head orientation, based on the head orientation and the first relative orientation, determine a head related transfer function, HRTF, and process the received audio signal based on the determined HRTF.

    28. A method for determining an orientation of a head of the user relative to a further body part of the user, the method comprising the steps of: receiving first radar data from a first radar sensor connected to a first housing worn at and/or in a first ear of the user, wherein the first radar data is indicative of an orientation of a head of the user relative to a further body part of the user, determining, based on the received first radar data, an orientation of a head of the user relative to a further body part of the user, receiving an audio signal, processing the received audio signal based on the determined orientation, and calibrating an orientation sensor based on the determined orientation.

    Description

    BRIEF DESCRIPTION OF THE DRAWINGS

    [0098] The disclosure will be explained in more detail below together with preferred embodiments and with reference to the drawings in which:

    [0099] FIG. 1 shows a block diagram of a hearing device according to an embodiment of the disclosure.

    [0100] FIG. 2 shows a schematic drawing of a hearing system according to an embodiment of the disclosure.

    [0101] FIG. 3 shows a schematic top view of a radar sensor according to an embodiment of the disclosure.

    [0102] FIG. 4 shows a flow diagram of a method according to an embodiment of the disclosure.

    [0103] The figures are schematic and simplified for clarity, and they show only those details essential to understanding the disclosure, while other details may be left out. Where practical, like reference numerals and/or labels are used for identical or corresponding parts.

    DETAILED DESCRIPTION OF THE DRAWINGS

    [0104] The detailed description given herein and the specific examples indicating embodiments of the disclosure are intended to enable a person skilled in the art to practice the disclosure and should thus be regarded as an illustration of the disclosure. The person skilled in the art will be able to readily contemplate further applications of the present disclosure as well as advantageous changes and modifications from this description without deviating from the scope of the disclosure. Any such changes or modifications mentioned herein are meant to be non-limiting for the scope of the disclosure. An advantage described in conjunction with a particular embodiment is not necessarily limited to that embodiment and can be practiced in any other embodiments even if not so illustrated, or if not so explicitly described.

    [0105] Referring initially to FIG. 1, which shows a block diagram of a hearing device 1 according to an embodiment of the disclosure. The hearing device 1 comprises a first housing 11 for being worn at and/or in a first ear of a user. The first housing 11 may comprise attachment means for attaching the hearing device at and/or in the first ear of the user. The first housing 11 may be an earcup, an earbud housing, or similar. The hearing device 1 comprises a processing unit 12. The processing unit 12 is configured to control functionality of the hearing device 1. Functionality may comprise the processing of a received audio signal, control of playback of an audio signal, a shutdown function of the hearing device, or other features of the hearing device. The hearing device 1 further comprises a first radar sensor 13 configured for obtaining first radar data indicative of an orientation of a head of the user relative to a further body part of the user. The processing unit 12 is configured to receive the first radar data from the first radar sensor 13, and, based on the first radar data, to determine a first relative orientation, wherein the first relative orientation is a difference in the orientation of the head of the user and the further body part of the user. The hearing device 1 further comprises an orientation sensor 14 configured for obtaining orientation data regarding a head orientation of the user. The hearing device 1 further comprises a second housing 15 for being worn at and/or in a second ear of the user, and a second radar sensor 16 configured for obtaining second radar data indicative of the orientation of a head of the user relative to a further body part of the user. The processing unit 12 is further configured to receive the second radar data from the second radar sensor 16, and to determine, based on the received first radar data and the received second radar data, the first relative orientation. In the shown embodiment, the orientation sensor 14 is an inertial measurement unit (IMU).
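    The calibration principle described above — using the radar-derived head-vs-torso orientation as an external reference to remove accumulated IMU drift — can be illustrated with a minimal sketch. The sketch below is not part of the disclosed or claimed subject-matter; all function names and the simple yaw-offset drift model are assumptions chosen purely for illustration.

```python
# Illustrative sketch only (not from the disclosure): correcting accumulated
# yaw drift in an IMU estimate using the radar-derived relative orientation.

def calibrate_imu_yaw(imu_yaw_deg, radar_relative_yaw_deg, torso_yaw_deg=0.0):
    """Return a drift offset so the corrected IMU yaw agrees with the
    head-vs-torso orientation measured by the radar sensor."""
    # The radar gives head orientation relative to the torso; combined with a
    # torso reference this yields an absolute head yaw to compare against.
    radar_head_yaw = torso_yaw_deg + radar_relative_yaw_deg
    drift_offset = imu_yaw_deg - radar_head_yaw
    return drift_offset

def corrected_yaw(imu_yaw_deg, drift_offset):
    """Apply the previously determined drift offset to a raw IMU reading."""
    return imu_yaw_deg - drift_offset
```

    In this sketch, a raw IMU yaw of 35° together with a radar-measured relative yaw of 30° yields a 5° drift offset, after which the corrected reading matches the radar reference.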

    [0106] Referring to FIG. 2, which shows a schematic drawing of a hearing system 10 according to an embodiment of the disclosure. The hearing system 10 comprises a hearing device communicatively connected to an external processing unit 21, either through a wired or a wireless connection. The external processing unit 21 is arranged within an external device 2, such as a mobile phone, a computer, or a smart device. The hearing device 1 is provided as a headset 1. The headset comprises a first earcup 11 and a second earcup 15. The earcups 11 and 15 are connected via a head strap 17. The earcups 11 and 15 are configured to be arranged on the ears of a user. Connected to the first earcup 11 are a first radar sensor 13 and an orientation sensor 14. In other embodiments a second radar sensor and/or an additional orientation sensor may be connected to the second earcup 15. In yet other embodiments the sensors 13 and 14 are connected to the head strap 17 of the hearing device 1. The external processing unit 21 is configured to receive an audio signal. The audio signal may be received by the external processing unit 21 via a cellular network, a wired network, or other kinds of wireless networks. The external processing unit 21 is configured to receive the first radar data from the processing unit 12, determine, based on the first radar data, a first relative orientation, wherein the first relative orientation is a difference in the orientation of the head of the user and the further body part of the user, and process the received audio signal based on the determined first relative orientation. The orientation sensor 14 is configured for obtaining orientation data regarding a head orientation of the user wearing the hearing device. The external processing unit 21 is further configured to determine a head related transfer function, HRTF, based on the first relative orientation and/or orientation data received from the orientation sensor 14.
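    The orientation-dependent audio processing described for the external processing unit can be sketched in simplified form. The sketch below is an assumption for illustration only and is not the disclosed implementation: a real system would apply a measured head-related transfer function (HRTF), whereas this toy version merely pans channel gains by the source azimuth relative to the head.

```python
import math

# Illustrative sketch only (not the disclosed HRTF processing): a crude
# head-orientation-dependent stereo rendering using a constant-power pan.

def render_stereo(samples, source_azimuth_deg, head_yaw_deg):
    """Scale left/right channel gains by the azimuth of the audio source
    relative to the current head orientation."""
    relative_az = math.radians(source_azimuth_deg - head_yaw_deg)
    # Constant-power pan: equal gains when the source is dead ahead,
    # and left_gain**2 + right_gain**2 == 1 at every angle.
    left_gain = math.cos(relative_az / 2 + math.pi / 4)
    right_gain = math.sin(relative_az / 2 + math.pi / 4)
    left = [s * left_gain for s in samples]
    right = [s * right_gain for s in samples]
    return left, right
```

    When the head turns toward the source, the relative azimuth approaches zero and both channels receive equal gain, which is the behavior a full HRTF-based renderer would refine with frequency-dependent filtering.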

    [0107] Referring to FIG. 3, which shows a schematic top view of a radar sensor 13 according to an embodiment of the disclosure. The radar sensor 13 comprises a substrate 135. The substrate 135 is depicted as being substantially circular when viewed from the top; however, the substrate 135 is not limited to a circular shape but may assume any appropriate shape and may be formed with a shape complementary to a hearing device 1 or a hearing device housing 11 and 15. Arranged on the substrate 135 are four antennas 131, 132, 133, and 134. The antennas 131, 132, 133, and 134 are configured to emit a radio signal at a frequency of 40 GHz-120 GHz, preferably 50 GHz-100 GHz, and even more preferably 55 GHz-65 GHz. The antennas 131, 132, 133, and 134 arranged on the substrate 135 form a phased array. The processing unit 12 of the hearing device 1 is configured to control a phase relation of the antennas 131, 132, 133, and 134 in the phased array to perform a scan of a user wearing the hearing device 1.
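    The phase control that steers a phased array can be sketched for a simplified geometry. The sketch below is illustrative only and assumes a uniform linear array with equal element spacing; the disclosure does not specify the array geometry or the steering computation, so every name and formula here is an assumption.

```python
import math

# Illustrative sketch only (assumed uniform linear array geometry): per-antenna
# phase offsets that steer the main lobe of a phased array toward a given angle.

C = 299_792_458.0  # speed of light in m/s

def steering_phases(n_antennas, spacing_m, freq_hz, steer_angle_deg):
    """Phase shift (radians) for each antenna so that wavefronts add
    coherently in the steering direction."""
    wavelength = C / freq_hz
    angle = math.radians(steer_angle_deg)
    # Classic linear-array relation: each successive element is shifted by
    # 2*pi * d * sin(theta) / lambda.
    delta = 2 * math.pi * spacing_m * math.sin(angle) / wavelength
    return [i * delta for i in range(n_antennas)]
```

    Sweeping the steering angle across a range of values and measuring the return at each angle would constitute the scan of the user mentioned above; at 60 GHz the wavelength is about 5 mm, so half-wavelength spacing on the order of 2.5 mm would fit on a small in-ear substrate.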

    [0108] Referring to FIG. 4, which shows a flow diagram of a method 100 according to an embodiment of the disclosure. The method 100 may be a computer implemented method. The method 100 is for determining an orientation of a head of the user relative to a further body part of the user. In a first step the method comprises receiving 101 first radar data from a first radar sensor 13, where the first radar data is indicative of an orientation of a head of the user relative to a further body part of the user. In a second step the method comprises determining 102, based on the received first radar data, an orientation of a head of the user relative to a further body part of the user. In a third step the method may comprise receiving 103 an audio signal. The audio signal may be received from an external source, e.g., a smart device or a computer. The audio signal may be a binaural audio signal, a stereo audio signal, or a mono audio signal. The audio signal may comprise meta data to facilitate processing of the audio signal. In a fourth step the method may comprise processing 104 the received audio signal based on the determined orientation. In a fifth step the method may comprise calibrating 105 an orientation sensor 14 based on the determined orientation. Orientation data from the calibrated orientation sensor 14 may be used in processing the audio signal in the fourth step.
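    The five steps of the method can be sketched as a single pass over plain data. The sketch below is illustrative only: the averaging stand-in for step 102 and the placeholder gain in step 104 are assumptions, not the disclosed estimation or audio processing.

```python
# Illustrative sketch only of steps 101-105 (all helpers are assumptions):
# radar_data   - raw radar returns received in step 101
# audio_samples - the audio signal received in step 103
# imu_state    - mutable orientation-sensor state calibrated in step 105

def determine_and_process(radar_data, audio_samples, imu_state):
    """One pass through the method of FIG. 4 on toy data."""
    # Step 102: stand-in estimator; a real device would fit the radar
    # returns to a body model to obtain head-vs-torso orientation.
    relative_orientation_deg = sum(radar_data) / len(radar_data)
    # Step 104: placeholder processing keyed on the determined orientation.
    gain = 1.0 if abs(relative_orientation_deg) < 90.0 else 0.5
    processed = [s * gain for s in audio_samples]
    # Step 105: calibrate the orientation sensor against the radar estimate.
    imu_state["drift_offset"] = imu_state["yaw"] - relative_orientation_deg
    return processed, imu_state
```

    As the paragraph notes, a subsequent pass could then use the calibrated sensor state when processing the next audio frames in step 104.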

    [0109] The disclosure is not limited to the embodiments disclosed herein, and the disclosure may be embodied in other ways within the subject-matter defined in the following claims. As an example, further features of the described embodiments may be combined arbitrarily, e.g., to adapt devices according to the disclosure to specific requirements.

    [0110] Any reference numerals and labels in the claims are intended to be non-limiting for the scope of the claims.