SYSTEM AND METHOD FOR ASSESSING EYESIGHT ACUITY AND HEARING ABILITY
20170258319 · 2017-09-14
Inventors
CPC classification
A61B3/0025
HUMAN NECESSITIES
A61B3/028
HUMAN NECESSITIES
A61B2560/0247
HUMAN NECESSITIES
A61B5/398
HUMAN NECESSITIES
A61B3/032
HUMAN NECESSITIES
A61B5/4803
HUMAN NECESSITIES
A61B5/6898
HUMAN NECESSITIES
International classification
A61B3/032
HUMAN NECESSITIES
A61B3/00
HUMAN NECESSITIES
A61B3/14
HUMAN NECESSITIES
A61B5/00
HUMAN NECESSITIES
Abstract
The present invention relates to a system (10) for assessing eyesight acuity of a subject (20), comprising: a display (12) for displaying graphics and/or text to the subject (20); an eye movement sensor (14, 14′) for monitoring an eye movement of the subject (20) while the subject (20) is watching the graphics and/or text displayed on the display (12); a processing unit (16) for assessing the eyesight acuity of the subject (20) based on an analysis of the monitored eye movement, wherein the analysis of the monitored eye movement includes an analysis of at least one of a duration of eye saccades, a frequency of eye saccades, a duration of eye fixations and a frequency of eye fixations in the monitored eye movement; and an output unit (18, 18′) for indicating the result of the assessment of the eyesight acuity. The present invention furthermore relates to a system (110) for assessing hearing ability of a subject (20).
Claims
1. A system for assessing eyesight acuity of a subject, comprising: a display configured to display graphics and/or text to the subject; an eye movement sensor configured to monitor an eye movement of the subject while the subject is watching the graphics and/or text displayed on the display; a processing unit configured to assess the eyesight acuity of the subject based on an analysis of the monitored eye movement, wherein the analysis of the monitored eye movement includes an analysis of at least one of a duration of eye saccades, a frequency of eye saccades, a duration of eye fixations and a frequency of eye fixations in the monitored eye movement; and an output unit configured to indicate the result of the assessment of the eyesight acuity.
2. (canceled)
3. (canceled)
4. A system as claimed in claim 1, wherein the processing unit is configured to control the display to vary at least one of a size, colour and sharpness of the graphics and/or the text over time, and wherein the processing unit is configured to assess the eyesight acuity of the subject based on the analysis of the monitored eye movement and the variation of the at least one of the size, colour and sharpness of the graphics and/or the text over time.
5. A system as claimed in claim 1, further comprising a proximity sensor for measuring a distance between the display and an eye of the subject, wherein the processing unit is configured to assess the eyesight acuity of the subject based on the analysis of the monitored eye movement and the distance between the display and the eye of the subject.
6. A system as claimed in claim 1, further comprising a microphone for recording a voice of the subject while reading out loud the graphics and/or the text displayed on the display, wherein the processing unit is configured to perform a speech recognition, and wherein the processing unit is configured to assess the eyesight acuity of the subject based on the analysis of the monitored eye movement and a comparison of the speech recognition with the graphics and/or the text displayed on the display.
7. A system as claimed in claim 1, wherein the processing unit is configured to adjust at least one of (i) a size of the graphics and/or text displayed on the display, (ii) a luminosity of the display, and (iii) a contrast of the display based on the analysis of the monitored eye movement.
8. A system as claimed in claim 1, further comprising an ambient light sensor for measuring a brightness value of light in the ambience of the system, wherein the processing unit is configured: to generate a feedback regarding the measured brightness value of the ambient light, which feedback is indicated via the output unit; or to adjust at least one of (i) a size of the graphics and/or text displayed on the display, (ii) a luminosity of the display, and (iii) a contrast of the display based on the measured brightness value of the ambient light, or to incorporate the measured brightness value of the ambient light in the assessment of the eyesight acuity of the subject.
9. A system as claimed in claim 1, wherein the processing unit is configured to output instructions for the subject via the output unit, wherein said instructions support the subject using the system.
10. A system as claimed in claim 1, further comprising a storage unit for storing the result of the assessment of the eyesight acuity each time the eyesight acuity of the subject is assessed by the processing unit, wherein the processing unit is configured to compare each new assessment with former assessments stored in the storage unit in order to derive a trend of the eyesight acuity of the subject over time.
11. (canceled)
12. A method for assessing eyesight acuity of a subject, comprising: displaying graphics and/or text on a display; monitoring an eye movement of the subject by means of an eye movement sensor while the subject is watching the graphics and/or text displayed on the display; assessing the eyesight acuity of the subject based on an analysis of the monitored eye movement, wherein the analysis of the monitored eye movement includes an analysis of at least one of a duration of eye saccades, a frequency of eye saccades, a duration of eye fixations and a frequency of eye fixations in the monitored eye movement; and indicating the result of the assessment of the eyesight acuity.
13. A computer program comprising program code means for causing a computer to carry out the steps of the method as claimed in claim 12 when said computer program is carried out on a computer.
14. A system for assessing hearing ability of a subject, comprising: a loudspeaker for generating a sound; a sound variation unit for varying a frequency and/or loudness of the sound; a feedback unit for receiving feedback at what frequency and/or loudness the sound may be recognized by the subject; a proximity sensor for measuring a distance between the loudspeaker and the subject; a processing unit for assessing the hearing ability of the subject based on an analysis of the received feedback and the distance between the loudspeaker and the subject; and an output unit for indicating the result of the assessment of the hearing ability.
15. A system as claimed in claim 14, wherein the sound variation unit comprises a user interface for manually varying the frequency and/or loudness of the sound and wherein the sound variation unit is configured to automatically vary the frequency and/or loudness of the sound over time.
16. (canceled)
17. A system as claimed in claim 14, further comprising a microphone for measuring a sound level in the ambience of the system, wherein the processing unit is configured: to generate a feedback regarding the measured ambient sound level, which feedback is indicated via the output unit; or to adjust the frequency and/or loudness of the generated sound based on the measured ambient sound level, or to incorporate the measured ambient sound level in the assessment of the hearing ability of the subject.
18. A method for assessing hearing ability of a subject, comprising: generating a sound by means of a loudspeaker; varying a frequency and/or loudness of the sound; receiving feedback at what frequency and/or loudness the sound may be recognized by the subject; measuring a distance between the loudspeaker and the subject; assessing the hearing ability of the subject based on an analysis of the received feedback and the distance between the loudspeaker and the subject; and indicating the result of the assessment of the hearing ability.
19. A computer program comprising program code means for causing a computer to carry out the steps of the method as claimed in claim 18 when said computer program is carried out on a computer.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0077] These and other aspects of the invention will be apparent from and elucidated with reference to the embodiments described hereinafter. In the following drawings:
[0078]
[0079]
[0080]
[0081]
[0082]
[0083]
DETAILED DESCRIPTION OF THE INVENTION
[0084]
[0085] In the shown example the system 10 is comprised in a tablet PC. However, the system 10 may also be comprised in a portable smartphone or in another portable computing device. The system 10 may also be comprised in a stationary desktop computing device.
[0086] The system 10 comprises a display 12, an eye movement sensor 14, a processing unit 16 and an output unit 18.
[0087] The display 12 is configured to display graphics and/or text to a subject 20. The display 12 may comprise any type of display means, such as an LCD display, an LED display or a plasma display. The display 12 may furthermore be operable to function as a touchscreen.
[0088] The eye movement sensor 14 is configured to monitor an eye movement of the subject 20 while the subject 20 is watching the graphics and/or reading the text displayed on the display 12. The eye movement sensor 14 according to the first embodiment comprises a digital camera, such as a digital video camera of the kind commonly embedded in tablet PC devices. The camera 14 preferably comprises an automatic focus unit which is configured to focus on the eyes of the subject 20.
[0089] The processing unit 16 is configured to assess the eyesight acuity of the subject 20 based on an analysis of the eye movement monitored by the camera 14. According to the first embodiment of the system 10, the processing unit 16 is preferably configured to perform a preprocessing of the camera signal in order to derive an eye movement signal which is indicative of the movement of the eyes of the subject 20. This is preferably done by means of an eye and/or iris detection during image processing, such that the gaze of the subject 20 is determined. The processing unit 16 may comprise a microchip or any other type of CPU having software stored thereon for carrying out the above-mentioned image and signal processing steps.
[0090] The output unit 18 is configured to indicate the results of the assessment of the eyesight acuity performed by the processing unit 16 to the subject 20. According to the first embodiment of the system 10, the output unit 18 may be a part of the display 12. The results of the eyesight acuity assessment may thus be fed back to the subject 20 in written form. Alternatively, it is also possible to realize the output unit 18 in the form of a loudspeaker, as will become apparent from the second embodiment shown in
[0091]
[0092] An example of such an eye movement signal 22 is shown in
[0093] In step S103 the eye movement signal 22 is analysed in order to assess the eyesight acuity of the subject 20. This is done in the processing unit 16. The processing unit 16 is preferably configured to analyse the eye movement signal 22 in the time domain. However, it is also possible to transfer the eye movement signal 22 into the frequency domain and to analyse it in the frequency domain.
[0094] The analysis of the eye movement signal 22 performed by the processing unit 16 preferably includes a detection and analysis of saccades and fixations of the subject's eyes. This may be done by means of an automatic detection of peaks within the eye movement signal 22. The processing unit 16 is configured to detect and calculate differences between two adjacent peaks in the eye movement signal amplitude. Adjacent peaks with almost negligible amplitude differences (smaller than a first threshold value), i.e. an approximately flat section of the eye movement signal 22, are indicative of an eye fixation F, which results from the subject 20 maintaining the position of the eyes over a comparatively longer period of time. Adjacent peaks with medium amplitude differences (larger than the first threshold value but smaller than a second threshold value) may be indicative of a short saccade S_S, which may result from a rapid short-distance eye movement from left to right. Adjacent peaks with a significant amplitude difference (larger than the second threshold value) may be indicative of a long saccade S_L, which usually results from a rapid longer-distance eye movement in the left-down direction. In other words, eye fixations F occur when the subject's eyes focus for a certain period of time on one and the same text position or word in the text. Short saccades S_S occur when the subject 20 is reading along a text line and the eyes move within the line from left to right over or along the words in said text line. Longer saccades S_L occur when the subject 20 jumps with his/her eyes from the end of one text line to the beginning of the next text line.
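The two-threshold peak classification described above can be sketched as follows. This is a hypothetical illustration: the threshold values, the peak-amplitude representation and the function name are assumptions for the sketch, not values taken from the disclosure.

```python
def classify_eye_events(peak_amplitudes, first_threshold=0.05, second_threshold=0.5):
    """Classify amplitude differences between adjacent peaks of the eye
    movement signal into fixations (F), short saccades (S_S) and long
    saccades (S_L), per the two-threshold scheme described above."""
    events = []
    for prev, curr in zip(peak_amplitudes, peak_amplitudes[1:]):
        diff = abs(curr - prev)
        if diff < first_threshold:
            events.append("F")    # near-flat section: eye fixation
        elif diff < second_threshold:
            events.append("S_S")  # medium jump: short saccade within a line
        else:
            events.append("S_L")  # large jump: long saccade to the next line
    return events

# e.g. classify_eye_events([0.00, 0.01, 0.20, 0.90]) yields ["F", "S_S", "S_L"]
```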
[0095] The eyesight acuity of the subject 20 may be assessed by means of an analysis of a duration of the eye saccades detected in the eye movement signal 22, a frequency of the eye saccades detected in the eye movement signal 22, a duration of eye fixations detected in the eye movement signal 22 and/or a frequency of the eye fixations detected in the eye movement signal 22. An increasing trend of the frequency of eye fixations F may be an indication of an on-going deterioration of the eyesight acuity. Other indicators for an on-going deterioration of the eyesight acuity of the subject 20 are an increasing tendency of the fixation durations or a slower saccadic movement of the eyes. The processing unit 16 is preferably configured to monitor such indicators over time (not only over time during one assessment/test, but also by comparing different test results with each other over longer periods of time, such as a plurality of weeks or months). It is especially preferred that the size, colour and/or sharpness of the displayed text is decreased over time during each session/test in order to be able to monitor the influence of such a reduction/change on the reading speed and reading behaviour. This enables conclusions to be drawn regarding the eyesight ability.
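The long-term trend monitoring mentioned above could, for instance, be realized with an ordinary least-squares slope over the per-session fixation frequencies stored by the system. This is a minimal sketch; the function name and the example values are illustrative assumptions.

```python
def trend_slope(values):
    """Ordinary least-squares slope of a series of per-session
    measurements against the session index; a positive slope over the
    fixation frequency hints at deteriorating eyesight acuity."""
    n = len(values)
    xs = list(range(n))
    mean_x = sum(xs) / n
    mean_y = sum(values) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, values))
    den = sum((x - mean_x) ** 2 for x in xs)
    return num / den

# e.g. fixation frequency (fixations per minute) over five weekly sessions:
fixation_freqs = [12.0, 12.5, 13.1, 13.8, 14.2]
deteriorating = trend_slope(fixation_freqs) > 0  # rising fixation frequency
```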
[0096] The finally derived result of the assessment of the eyesight acuity is indicated to the subject 20 in step S104.
[0097]
[0098] The eye movement sensor 14′ according to the second embodiment comprises an electrooculographic (EOG) sensor 14′ instead of a camera 14. Such EOG sensors 14′ include a pair of electrodes or a plurality of electrodes which are attached to the head of the subject 20 either above and below the eyes or to the left and the right of the eyes. The EOG sensor 14′ monitors the eye movement by detecting a potential difference between said electrodes and assuming that the resting potential is constant, such that the recorded potential is a measure of the position of an eye of the subject 20. The result of the EOG measurement is again an eye movement signal 22 as exemplarily shown in
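The EOG principle described above can be sketched as a simple linear mapping: assuming a constant resting potential, the recorded potential difference is approximately proportional to the gaze angle over small angles. The gain value below is an illustrative assumption (EOG sensitivities are typically on the order of microvolts per degree), not a figure from the disclosure.

```python
def eog_to_gaze_deg(potential_uv, resting_uv=0.0, gain_uv_per_deg=16.0):
    """Map a recorded EOG potential difference (microvolts) to an
    approximate horizontal gaze angle (degrees), assuming a constant
    resting potential and a linear response over small angles."""
    return (potential_uv - resting_uv) / gain_uv_per_deg

# e.g. a 160 uV deflection above the resting potential maps to ~10 degrees
```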
[0099] A further difference to the first embodiment is the fact that the output unit 18′ according to the second embodiment includes a loudspeaker. The results of the assessment of the eyesight acuity may thus be fed back to the subject 20 in audible form. The loudspeaker 18′ also enables instructing the subject 20 by means of audible instructions. Such instructions may include instructions on how to use the device 10, e.g. instructing the subject 20 how fast he/she should try to read and when he/she should start reading the text displayed on the display 12.
[0100] The system 10 according to the second embodiment furthermore comprises a proximity sensor 24. The proximity sensor 24 is configured to measure a distance between the display 12 and an eye/the eyes of the subject 20. The proximity sensor 24 may e.g. include an electromagnetic (e.g. infrared), photoelectric, radar-based or ultrasonic-based proximity sensor.
[0101] Still further, the device 10 according to the second embodiment includes the following additional components: A microphone 26 for recording the voice of the subject 20; an ambient light sensor 28 for measuring a brightness value of light in the ambience of the device 10; and a storage unit 30 for storing the result of an assessment of the eyesight acuity each time the eyesight acuity of the subject 20 is assessed by the processing unit 16. The latter-mentioned storage unit 30 may be either realized as a hard drive that is integrated in the device 10, or as an external storage unit, such as an external server, which is connected to the processing unit 16 via a data network.
[0102] The above-mentioned additional components of the system 10 according to the second embodiment enable the following improvements:
[0103] 1. The processing unit 16 may assess the eyesight acuity of the subject 20 not only based on the above-mentioned analysis of the eye movement signal 22, but also by taking into account the distance between the display 12 and the eyes of the subject 20. The processing unit 16 may in this case particularly be configured to control the display 12 to vary at least one of a size, colour and sharpness of the displayed text over time. Together, this enables monitoring the reaction of the subject 20 to a variation of the size, colour and/or sharpness of the displayed text. The system 10 may, for example, detect the type of eyesight disability of the subject 20 using the following indications: If the subject 20 reacts to a decrease in the size of the displayed text by moving the display 12 closer to his/her eyes, the likelihood that the subject 20 suffers from myopia is increased. If the subject 20 reacts to a decrease in the size of the displayed text by moving the display 12 farther away from his/her eyes, the likelihood that the subject 20 suffers from astigmatism, hypermetropia or presbyopia is increased.
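The distance-reaction heuristic above can be sketched as follows. The labels, the minimum-change threshold and the function name are hypothetical choices for illustration only.

```python
def guess_disability(dist_before_cm, dist_after_cm, min_change_cm=3.0):
    """Given the eye-display distance before and after the displayed text
    was made smaller, return which type of eyesight disability becomes
    more likely, per the indications described above."""
    change = dist_after_cm - dist_before_cm
    if change < -min_change_cm:
        return "myopia"  # subject moved the display closer
    if change > min_change_cm:
        return "astigmatism/hypermetropia/presbyopia"  # moved farther away
    return "inconclusive"  # no significant reaction
```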
[0104] 2. The possibility to store the results and parameters of former assessments of the eyesight acuity of the subject 20 in the storage unit 30 enables a comparison of different assessments taken over a longer time period. This specifically allows detecting trends of a deterioration of the subject's eyesight ability at a very early stage.
[0105] 3. The inclusion of a microphone 26 for recording the voice of the subject 20 enables an assessment mode in which the subject 20 reads out loud the displayed text and the processing unit 16 performs a speech recognition and a comparison of the speech recognition with the text that is actually displayed on the display 12. This increases the robustness of the assessment.
[0106] 4. The provision of an ambient light sensor 28 allows accounting for the ambient conditions in the surroundings of the system 10. Based on the brightness value of the ambient light measured by the sensor 28, the subject 20 may receive feedback via the loudspeaker 18 or via the display 12 inviting him/her to move to another place with more light before starting the assessment. Alternatively, the processing unit 16 may be configured to adjust at least one of a size of the displayed text, a luminosity of the display 12, and a contrast of the display 12 based on the measured brightness value of the ambient light. According to a still further alternative, the processing unit 16 may incorporate the measured brightness value of the ambient light into the assessment of the eyesight acuity of the subject 20, e.g. by correcting the assessment results depending on the brightness level of the ambient light.
[0107] In addition to the assessment of the eyesight ability, the system 10 may also support the subject 20 during regular usage. As long as the system 10 is not used for assessing the eyesight of the subject 20, the processing unit 16 may be configured to adjust at least one of (i) a size of the graphics and/or text displayed on the display 12, (ii) a luminosity of the display 12, and (iii) a contrast of the display 12 based on the analysis of the monitored eye movement. This may be particularly done when the system 10 is used for regular reading. The system 10 may then adjust the text size, the luminosity and/or the contrast of the display 12 to ease the reading process. As these parameters are adjusted, the ease of the reading may be assessed again to check whether the parameters have been adapted optimally given the disability of the subject 20. In other words, the system changes the settings in a personalized way for the subject 20.
[0108]
[0109] It shall be noted that the components of the system 10 explained above with reference to the first embodiment shown in
[0110]
[0111] The system 110 is, in the presented example, also included in a portable computing device, e.g. a tablet PC (similar to the system 10).
[0112] The system 110 for assessing hearing ability of the subject 20 includes an output unit 112 in the form of a display, a processing unit 116, a loudspeaker 118, a proximity sensor 124, a sound variation unit 132 and a feedback unit 134. Optionally, the system 110 may further comprise a microphone 126 and a storage unit 130. The loudspeaker 118 is configured to generate a sound. The sound variation unit 132 is configured to vary a frequency and/or loudness of the sound. This sound variation unit 132 may either be hardware-implemented or software-implemented. The sound variation unit 132 may, according to a first alternative, comprise a button with which the subject 20 may vary the frequency and/or loudness of the generated sound. This button does not have to be a physical button, but may also be a part of a touchscreen integrated into the display 112. According to a second alternative, the sound variation unit 132 may be a part of the processing unit 116 and configured to automatically vary the frequency and/or the loudness of the sound over time. The feedback unit 134 preferably comprises a user interface in the form of a button or a touchscreen integrated into the display 112, which enables the subject 20 to provide the system 110 with feedback at what frequency and/or loudness he/she may recognize the generated sound. The proximity sensor 124 may be realized in a similar or even in the same manner as the proximity sensor 24 of the system 10 mentioned above. This proximity sensor 124 is configured to measure a distance between the loudspeaker 118 and the subject 20. The processing unit 116 may be realized as a microchip or another type of CPU (similar to the processing unit 16 of the system 10).
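One way the measured loudspeaker-subject distance could enter the hearing assessment is by compensating the generated loudness for free-field attenuation (about -20·log10(d/d_ref) dB). This is an idealized physical assumption sketched for illustration; the function name and reference distance are not from the disclosure.

```python
import math

def level_at_subject_db(level_at_ref_db, distance_m, ref_distance_m=1.0):
    """Estimate the sound pressure level reaching the subject, given the
    level at a reference distance from the loudspeaker, using the
    idealized free-field attenuation of -20*log10(d/d_ref) dB."""
    return level_at_ref_db - 20.0 * math.log10(distance_m / ref_distance_m)

# e.g. a tone at 60 dB at 1 m arrives at roughly 54 dB at 2 m
```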
[0113]
[0114] The optional microphone 126 may be used to account for sounds in the ambience of the system 110. The processing unit 116 may be configured to react to the ambient sound level recorded by the microphone 126 in the following ways: According to a first alternative, the processing unit may generate feedback to the subject 20 inviting the subject 20 to move to another place where the ambient sound level is lower. According to a second alternative, the processing unit 116 may be configured to adjust the frequency and/or loudness of the generated sound depending on the measured ambient sound level. According to a third alternative, the processing unit 116 may be configured to incorporate the measured ambient sound level into the assessment of the hearing ability of the subject, e.g. by subtracting the measured ambient sound level from the loudness at which the subject 20 indicated that he/she recognized the generated sound.
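The third alternative above can be sketched as a simple correction of the recognition threshold. Treating the decibel values as directly subtractable, and clamping at a floor, are simplifying assumptions made purely for illustration.

```python
def corrected_threshold_db(recognized_db, ambient_db, floor_db=0.0):
    """Correct the loudness at which the subject recognized the tone by
    the ambient sound level measured by the microphone 126, clamped so
    the corrected threshold cannot fall below a floor value."""
    return max(recognized_db - ambient_db, floor_db)

# e.g. recognition at 55 dB with 20 dB of background noise yields 35 dB
```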
[0115] As in the system 10, the storage unit 130 may be used to compare current assessments of the hearing ability of the subject 20 with former assessments stored in the storage unit 130 in order to derive a trend over time of the hearing ability of the subject 20.
[0116] It shall be noted that the system 10 for assessing the eyesight acuity of a subject 20 may also be combined with the system 110 for assessing the hearing ability of the subject 20 without leaving the scope of the present invention. Lastly, it shall be also noted that the herein presented methods in practice include a lot of concurrent method steps instead of sequential method steps and that
[0117] While the invention has been illustrated and described in detail in the drawings and foregoing description, such illustration and description are to be considered illustrative or exemplary and not restrictive; the invention is not limited to the disclosed embodiments. Other variations to the disclosed embodiments can be understood and effected by those skilled in the art in practicing the claimed invention, from a study of the drawings, the disclosure, and the appended claims.
[0118] In the claims, the word “comprising” does not exclude other elements or steps, and the indefinite article “a” or “an” does not exclude a plurality. A single element or other unit may fulfill the functions of several items recited in the claims. The mere fact that certain measures are recited in mutually different dependent claims does not indicate that a combination of these measures cannot be used to advantage.
[0119] A computer program may be stored/distributed on a suitable medium, such as an optical storage medium or a solid-state medium supplied together with or as part of other hardware, but may also be distributed in other forms, such as via the Internet or other wired or wireless telecommunication systems.
[0120] Any reference signs in the claims should not be construed as limiting the scope.