SYSTEMS AND METHODS FOR TEACHING USERS HOW TO READ VIA INTERACTIVE GRAPHICAL USER INTERFACES
20230154352 · 2023-05-18
CPC classification
G06F3/04842 (PHYSICS)
G06F3/0488 (PHYSICS)
G06F3/0481 (PHYSICS)
Abstract
A computer system displays a user interface that presents text through an improved phonics-based approach to teaching reading, especially for users with dyslexia or other neurological disorders. The interface enlarges the line of text to be read and places a cursor/pointer under the first word of that line, in the direction of reading, on the computer screen. The user interacts with the interface through the cursor/pointer. The system calculates the cursor/pointer position and highlights the font and background of the traversed part of the text line, i.e., the element under which the cursor is located. The elements can be letters, syllables, and words. In voiceover mode, the cursor/pointer moves automatically, the user follows it at the speed suggested by the system, and the system voices the word or syllable being read. In non-voiceover mode, the user drags the cursor along the text line.
Claims
1. A system for facilitating reading, the system comprising: an electronic device including a display screen and a programmable processor configured to generate a graphical interface on the display screen of the electronic device, the display screen being a touch screen, the graphical interface including: a line of text to be read next by a user, the line of text to be read next by the user containing at least one syllable; a first interactive graphical element representing a cursor or pointer configured to: move along the line of text to be read next by the user in response to a finger of the user dragging the cursor or pointer; or automatically move along the line of text to be read next by the user so long as the finger of the user is following the cursor or pointer; wherein, during movement of the cursor or pointer along the at least one syllable contained in the line of text to be read next by the user, the electronic device generates an audio output to the user corresponding to the at least one syllable.
2. The system of claim 1, wherein the graphical interface further displays at least one line of text previously read by the user and at least one line of text to be read after the line of text to be read next by the user, wherein the line of text to be read next by the user is displayed in larger size font relative to the at least one line of text previously read by the user and the at least one line of text to be read after the line of text to be read next by the user.
3. The system of claim 2, wherein the graphical interface further displays the at least one line of text previously read by the user with a highlight overlay indicating to the user that the at least one line of text was previously read by the user.
4. The system of claim 1, wherein, during movement of the cursor or pointer along the at least one syllable contained in the line of text to be read next by the user, the graphical interface generates a first static background color that covers the at least one syllable and a second dynamic background color that moves across the at least one syllable in sync with the movement of the cursor or pointer across the at least one syllable and in sync with the generation of the audio output by the electronic device corresponding to the at least one syllable.
5. The system of claim 1, wherein the graphical interface includes a second graphical element representing a forward navigation button that enables the user to advance to a screen displaying another line of text to be read next by the user, and a third graphical element representing a backward navigation button that enables the user to go back to a screen displaying a line of text previously read by the user.
6. The system of claim 5, wherein the graphical interface includes a fourth graphical element representing a sound toggle button that permits the user to turn the audio output by the electronic device on and off.
7. The system of claim 1, wherein the programmable processor is programmed to interpret a detection that the finger of the user is not following the cursor or pointer as an error, and to stop the automatic movement of the cursor or pointer, and to reset the cursor or pointer to a beginning of the at least one syllable where the error was detected.
8. A method of facilitating reading, the method comprising: generating, on a touch display screen of an electronic device including a programmable processor, a graphical interface, the graphical interface including: a line of text to be read next by a user, the line of text to be read next by the user containing at least one syllable; a first interactive graphical element representing a cursor or pointer configured to: move along the line of text to be read next by the user in response to a finger of the user dragging the cursor or pointer; or automatically move along the line of text to be read next by the user so long as the finger of the user is following the cursor or pointer; in response to movement of the cursor or pointer along the at least one syllable contained in the line of text to be read next by the user, generating, via a speaker of the electronic device, an audio output to the user corresponding to the at least one syllable.
9. The method of claim 8, further comprising displaying, in the graphical interface, at least one line of text previously read by the user and at least one line of text to be read after the line of text to be read next by the user, wherein the line of text to be read next by the user is displayed in larger size font relative to the at least one line of text previously read by the user and the at least one line of text to be read after the line of text to be read next by the user.
10. The method of claim 9, further comprising displaying, in the graphical interface, the at least one line of text previously read by the user with a highlight overlay indicating to the user that the at least one line of text was previously read by the user.
11. The method of claim 8, further comprising, during movement of the cursor or pointer along the at least one syllable contained in the line of text to be read next by the user, generating in the graphical interface a first static background color that covers the at least one syllable and a second dynamic background color that moves across the at least one syllable in sync with the movement of the cursor or pointer across the at least one syllable and in sync with the generation of the audio output by the electronic device corresponding to the at least one syllable.
12. The method of claim 8, wherein the graphical interface includes a second graphical element representing a forward navigation button that enables the user to advance to a screen displaying another line of text to be read next by the user, and a third graphical element representing a backward navigation button that enables the user to go back to a screen displaying a line of text previously read by the user.
13. The method of claim 12, wherein the graphical interface includes a fourth graphical element representing a sound toggle button that permits the user to turn the audio output by the electronic device on and off.
14. The method of claim 8, further comprising, by the programmable processor: interpreting a detection that the finger of the user is not following the cursor or pointer as an error, stopping the automatic movement of the cursor or pointer, and resetting the cursor or pointer to a beginning of the at least one syllable where the error was detected.
15. A non-transitory medium holding computing-device executable instructions for facilitating a user of an electronic device to read text displayed on a display screen of the electronic device, the instructions when executed via a programmable processor causing the electronic device to: generate a graphical interface on the display screen of the electronic device, the display screen being a touch screen, the graphical interface including: a line of text to be read next by the user, the line of text to be read next by the user containing at least one syllable; a first interactive graphical element representing a cursor or pointer configured to: move along the line of text to be read next by the user in response to a finger of the user dragging the cursor or pointer; or automatically move along the line of text to be read next by the user so long as the finger of the user is following the cursor or pointer; wherein, during movement of the cursor or pointer along the at least one syllable contained in the line of text to be read next by the user, the electronic device generates an audio output to the user corresponding to the at least one syllable.
16. The non-transitory medium of claim 15, wherein the graphical interface further displays at least one line of text previously read by the user and at least one line of text to be read after the line of text to be read next by the user, wherein the line of text to be read next by the user is displayed in larger size font relative to the at least one line of text previously read by the user and the at least one line of text to be read after the line of text to be read next by the user.
17. The non-transitory medium of claim 16, wherein the graphical interface further displays the at least one line of text previously read by the user with a highlight overlay indicating to the user that the at least one line of text was previously read by the user.
18. The non-transitory medium of claim 15, wherein, during movement of the cursor or pointer along the at least one syllable contained in the line of text to be read next by the user, the graphical interface generates a first static background color that covers the at least one syllable and a second dynamic background color that moves across the at least one syllable in sync with the movement of the cursor or pointer across the at least one syllable and in sync with the generation of the audio output by the electronic device corresponding to the at least one syllable.
19. The non-transitory medium of claim 15, wherein the graphical interface includes a second graphical element representing a forward navigation button that enables the user to advance to a screen displaying another line of text to be read next by the user, and a third graphical element representing a backward navigation button that enables the user to go back to a screen displaying a line of text previously read by the user.
20. The non-transitory medium of claim 19, wherein the programmable processor of the electronic device is programmed to interpret a detection that the finger of the user is not following the cursor or pointer as an error, and to stop the automatic movement of the cursor or pointer, and to reset the cursor or pointer to a beginning of the at least one syllable where the error was detected.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
DETAILED DESCRIPTION
[0036] As mentioned above, existing approaches and methods for teaching reading in the education system are problematic. The situation is amplified by an increase in the number of people with neurological disorders, such as dyslexia and ADHD, whose consequences include problems with reading. Although early literacy intervention can significantly reduce or even entirely eliminate reading issues, it is not regular practice. Early literacy intervention requires a specialist teacher, books and handouts, time, and money. Moreover, specialists cannot officially diagnose dyslexia before the age of 8-10 years, which means the child receives professional help too late.
[0037] The systems, methods, and interactive graphical user interfaces described herein improve phonics-based interventions and make them accessible from an early age.
[0038] Embodiments of the invention will be described more fully hereinafter with reference to the accompanying drawings, in which embodiments of the invention are shown. This invention may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the invention to those skilled in the art. It will also be understood that, although the terms first, second, etc. are, in some instances, used herein to describe various elements, these elements should not be limited by these terms. These terms are only used to distinguish one element from another. For example, a first text line could be termed a second text line, and, similarly, a second text line could be termed a first text line, without departing from the scope of the various described embodiments. The first text line and the second text line are both text lines, but they are not the same text line, unless the context clearly indicates otherwise.
[0039] The terminology used herein in describing exemplary embodiments is for the purpose of describing such embodiments only and is not intended to be limiting. As used in the description of the described embodiments and the appended claims, the singular forms “a,” “an,” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will also be understood that the term “and/or” as used herein refers to and encompasses any and all possible combinations of one or more of the associated listed items. It will be further understood that the terms “includes,” “including,” “comprises,” and/or “comprising,” when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
[0040] As used herein, the term “if” is, optionally, construed to mean “when” or “upon” or “in response to determining” or “in response to detecting,” depending on the context. Similarly, the phrase “if it is determined” or “if [a stated condition or event] is detected” is, optionally, construed to mean “upon determining” or “in response to determining” or “upon detecting [the stated condition or event]” or “in response to detecting [the stated condition or event],” depending on the context.
[0042] In some computer systems (e.g., 101-a in
[0043] In a portable multifunction device embodiment (e.g., 300 in
[0044] In some computer systems (e.g., 101-b), in addition to integrated input device(s) 102, the presented interface logic 103, display generation component(s) 104, camera(s) 105, and other input or control devices 106 of electronic device 200, the computer system is also in communication with additional devices that are separate from the computer system, such as separate input device(s) 107 such as a touch-sensitive surface, a wand, a remote control 402, or such as separate output device(s) 108 such as a virtual voice assistant 403, or the like and/or separate display generation component(s) 109 such as a virtual reality headset 401 or augmented reality glasses that overlay virtual objects on a physical environment (e.g., embodiment computer system with additional devices 400 in
[0046] A distinctive feature of any embodiment of this interface is the presence of a cursor or pointer 501-a, through which the user interacts with the text in the current enlarged text line 502 and in the other text line(s) 505. In some embodiments, the cursor/pointer may take the form of an image or an animated sequence (e.g., 501-a in
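The cursor-position calculation and highlighting of the traversed portion of the line described above can be sketched as follows. This is a minimal illustration rather than the actual implementation; the `TextElement` structure, pixel bounds, and the example syllable split are assumptions:

```python
from dataclasses import dataclass

@dataclass
class TextElement:
    """A letter, syllable, or word with its horizontal bounds on screen."""
    text: str
    x_start: float
    x_end: float

def traversed_elements(elements, pointer_x):
    """Return the elements the cursor/pointer has reached so far.

    An element counts as traversed once the pointer has reached its left
    edge; the highlight of the current element can then be clipped at
    pointer_x so it follows the finger smoothly.
    """
    return [e for e in elements if pointer_x >= e.x_start]

# Hypothetical layout: the word "reading" split into two syllables.
line = [TextElement("read", 0.0, 40.0), TextElement("ing", 40.0, 70.0)]
assert [e.text for e in traversed_elements(line, 55.0)] == ["read", "ing"]
assert [e.text for e in traversed_elements(line, 10.0)] == ["read"]
```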
[0047] The current enlarged text line 502 is the second integral part of any embodiment of the interface, on a par with the cursor/pointer 501-a, 501-b. The increased size of the current enlarged text line 502 helps the user focus on the text it contains without being distracted by the rest of the text in the other text line(s) 505, which is especially important for users with ADHD and dyslexia.
[0048] In some embodiments (e.g.
[0049] The interface embodiment illustrated
[0051] The main difference between these methods is that in the voiceover method 600-a, the user's goal is to follow the cursor/pointer 501-a, 501-b with his/her finger 507. This method 600-a is used in the initial stages of teaching reading. The effect is based on activating the frontal lobes of the brain through fine motor skills, which increases the efficiency of perceiving any sensory information. Simultaneously with the movement of the cursor/pointer 501-a, 501-b, the interface highlights the text elements above it and voices them. The text elements are letters, phonemes in the form of the letters encoding them, syllables, and words (e.g., in
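One way the voiceover mode's automatic cursor movement, together with the error handling recited in the claims (stopping and resetting the cursor to the beginning of the current element when the finger stops following it), might be sketched is shown below. The tolerance value, speeds, and function name are hypothetical:

```python
def step_voiceover(cursor_x, finger_x, speed, dt, element_start, tolerance=25.0):
    """Advance the cursor by one time step in voiceover mode 600-a.

    If the finger has drifted more than `tolerance` pixels from the cursor,
    treat it as an error: stop the automatic movement and reset the cursor
    to the beginning of the current element. Returns (new_cursor_x, error).
    """
    if abs(finger_x - cursor_x) > tolerance:
        return element_start, True       # error: reset to start of the element
    return cursor_x + speed * dt, False  # finger is following: keep moving

# Finger keeps up: cursor advances at the suggested speed.
assert step_voiceover(100.0, 95.0, 40.0, 0.25, 80.0) == (110.0, False)
# Finger lags too far behind: cursor resets to the element start.
assert step_voiceover(100.0, 60.0, 40.0, 0.25, 80.0) == (80.0, True)
```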
[0052] For a younger user, or if the user has previously made a large number of mistakes, the system offers a slow cursor/pointer 501-a, 501-b movement speed. A slow speed is one at which the duration of voicing the text element under which the cursor/pointer 501-a, 501-b moves is at least one and a half times shorter than the time the cursor/pointer 501-a, 501-b takes to traverse that element. The user selects text elements for training depending on his or her level and stage of training.
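The "slow" speed rule above fixes an upper bound on cursor speed: traversing an element must take at least one and a half times as long as voicing it, so speed ≤ width / (1.5 × voicing duration). A sketch of that bound, with an illustrative function name and units:

```python
def slow_cursor_speed(element_width, voicing_duration, ratio=1.5):
    """Maximum cursor speed (pixels/second) satisfying the 'slow' rule:
    traversing the element must take at least `ratio` times as long as
    voicing it."""
    return element_width / (ratio * voicing_duration)

# A 60-pixel syllable voiced in 0.5 s: traversal must take >= 0.75 s,
# so the cursor may move at no more than 80 pixels per second.
assert slow_cursor_speed(60.0, 0.5) == 80.0
```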
[0053] The method without voiceover 600-b is used to reinforce the reading skill. After completing the text with the voiceover method 600-a, the user switches to the method without voiceover 600-b using button 503. The user then drags the cursor/pointer 501-a, 501-b along the text and speaks aloud the elements under which the cursor/pointer 501-a, 501-b passes. As in the voiceover method 600-a, the interface highlights the elements under which the cursor/pointer 501-a, 501-b passes.
[0054] In both methods 600-a, 600-b, to emphasize text elements beyond highlighting them, the cursor/pointer 501-a, 501-b slows down or stops completely in the gaps between elements, including at punctuation marks and spaces.
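The slow-down/stop behavior between elements could be expressed as a per-character speed multiplier; the particular character classes and scaling factors below are illustrative assumptions, not values from this disclosure:

```python
def speed_multiplier(char):
    """Scale the cursor speed in the gaps between text elements:
    stop completely at sentence-ending punctuation, slow down at
    other punctuation and spaces, run at full speed elsewhere."""
    if char in ".!?":
        return 0.0   # complete stop between sentences
    if char in ", ;:":
        return 0.5   # slow down at pauses and spaces
    return 1.0       # normal speed within an element

assert speed_multiplier(".") == 0.0
assert speed_multiplier(" ") == 0.5
assert speed_multiplier("a") == 1.0
```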
The foregoing description has, for the purpose of explanation, been presented with reference to specific embodiments. However, the illustrative discussions above are not intended to be exhaustive or to limit the invention to the precise forms disclosed. Many modifications and variations are possible in view of the above teachings. The embodiments were chosen and described in order to best explain the principles of the invention and its practical applications, to thereby enable others skilled in the art to best use the invention and the various described embodiments with various modifications as are suited to the particular use contemplated.