Communication Application for Blind and Normal People with Deaf People (HOPE Tech)
20220103679 · 2022-03-31
Inventors
- Mohammadhossein Zare (Feasterville Trevose, PA, US)
- Aliakbar Sheikhy (Amol, IR)
- Asghar Akbary Kamangar (Babol, IR)
Abstract
Communicating with deaf people is challenging for anyone who does not know sign language, and the difficulty is greatest in the worst case, when one party is deaf and the other is blind. This problem is resolved by this new communication app: an independent, single-device, two-way communication software or application App. People of any level of literacy, nationality, or language can use this app to convey their words, phrases, or requests to deaf people on a daily basis. The application has two interactive features: the first section takes audio conversation from blind or other people and plays the corresponding sign language video; the second section converts typed text to an audio message, making communication with the deaf person possible without knowing sign language.
This new communication method is not restricted by weather conditions or industrial environments (noise, dust, and high temperatures). This application App can run on an Android phone, is very user friendly, and is extendable to additional languages.
Claims
1. A method of two-way communication between a deaf person who uses sign language and a blind-normal person who speaks, comprising: a) receiving a text input from the deaf person and converting the text to an audible voice for the blind-normal person using only one communication device with real-time software; b) receiving speech from the blind-normal person and converting the speech to a sign language video for the deaf person using the same communication device with real-time software; and wherein the deaf person and the blind-normal person use only one communication device with real-time software to communicate.
2. The method of claim 1, wherein the single communication device uses Google Services support for real-time speech-to-text conversion.
3. The method of claim 1, wherein the communication device is a single phone without any other external gadgets or assisting devices.
4. The method of claim 1, wherein the communication device works in any environmental conditions.
5. The method of claim 1, wherein the communication device has unlimited and extendable support for different languages as a built-in feature.
6. The method of claim 1, wherein the communication device has databases or libraries of technical words and phrases, with customization capability based on user queries for professional use.
7. A system of two-way communication between a deaf person who uses sign language and a blind-normal person who speaks, comprising: a) a communication device that receives a text input from the deaf person and converts the text to speech in real time for the blind-normal person; b) the same communication device that receives speech from the blind-normal person and converts the speech to a sign language video in real time for the deaf person; and wherein the deaf person and the blind-normal person use only one communication device with real-time software to communicate.
8. The system of claim 7, wherein the single communication device uses Google Services support for real-time speech-to-text conversion.
9. The system of claim 7, wherein the communication device is a single phone without any other external gadgets or assisting devices.
10. The system of claim 7, wherein the communication device works in any environmental conditions.
11. The system of claim 7, wherein the communication device has unlimited and extendable support for different languages as a built-in feature.
12. The system of claim 7, wherein the communication device has databases or libraries of technical words and phrases, with customization capability based on user queries for professional use.
13. A method of two-way communication between a deaf person who uses sign language and a blind-normal person who speaks, using a built-in program algorithm of real-time software, comprising: providing only one communication device that supports dual-usage real-time software for deaf and blind-normal persons; wherein said dual-usage real-time software provides: a) text-to-speech real-time software for the deaf person; b) speech-to-sign-language real-time software for the blind-normal person; wherein, when the blind-normal person communicates with the deaf person, said dual-usage real-time software converts inputs through the following steps: i) speech is captured by the communication device; ii) the speech is converted to text by the Google service; iii) a sign language video corresponding to the text is displayed to the deaf person on the same communication device; and wherein, when the deaf person communicates with others, said dual-usage real-time software receives typed text from a provided text box and converts it to audible speech for the blind-normal person.
14. The method of claim 13, wherein the single communication device uses Google Services support for real-time speech-to-text conversion.
15. The method of claim 13, wherein the communication device is a single phone without any other external gadgets or assisting devices.
16. The method of claim 13, wherein the communication device works in any environmental conditions.
17. The method of claim 13, wherein the communication device has unlimited and extendable support for different languages as a built-in feature.
18. The method of claim 13, wherein the communication device has databases or libraries of technical words and phrases, with customization capability based on user queries for professional use.
Description
BRIEF DESCRIPTION OF THE SEVERAL VIEWS OF THE DRAWING(S)
[0036] Various other objects, features and attendant advantages of the present invention will become fully appreciated as the same becomes better understood when considered in conjunction with the accompanying drawings, in which like reference characters designate the same or similar parts throughout the several views, and wherein:
DETAILED DESCRIPTION OF THE INVENTION
[0046] Aspects of the present invention may be used for interactive communication between blind people, deaf people, and normal people. Hereinafter, the term “application App” refers to “the present invention”.
[0047] A unique advantage of this mobile device application App is that a blind person and a deaf person can hold a two-way conversation on only one mobile device; there is no need for the blind and the deaf persons to each have a separate phone equipped with the software in order to communicate. This is described as follows.
[0048] The user can install and run this mobile device software or App on Android phones running version 4.0 or later. The basic version of the application App requires a minimum of 200 megabytes of memory, which may grow with user customizations.
[0049] This mobile device software or App includes different layers and pages that provide communication capabilities for users with different needs: normal, blind, and deaf people.
[0050] The main page shows the “MIC” tab selected as the default when the application App is opened. The second layer (103) provides different functional capabilities depending on which tab was selected in the previous (first) layer. Likewise, the third (104) and fourth (105) layers depend on which function was used in the previous steps.
[0051] In layer 2 shown (103) in
[0052] In layer 3 (104), depending on the tab selected in layer 1 and the function activated in layer 2, the user finds the function outcomes: “Google Speech Service”, “Play Audio”, and “Words list vs. new selected language”. If the “MIC” tab was selected, the voice data is received at the 3rd layer and, after audio processing, the final outcome is shown in layer 4 (105).
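The tab-to-function mapping across layers can be sketched in plain Java. The tab names and function labels come from the description above; the class and method names (LayerDispatch, layer3Outcome) are illustrative assumptions, not the app's actual code.

```java
// Sketch of the layered navigation described above: layer 1 is the
// selected tab, and layer 3 exposes the function outcome for that tab.
// LayerDispatch and layer3Outcome are hypothetical names.
public class LayerDispatch {
    public static String layer3Outcome(String tab) {
        switch (tab) {
            case "MIC":     return "Google Speech Service"; // voice in, text out
            case "SPEECH":  return "Play Audio";            // typed text, audio out
            case "LIBRARY": return "Words list vs. new selected language";
            default:        return "informational page";    // CONTACT US, HELP
        }
    }
}
```

In this sketch the final layer-4 outcome (e.g. the rendered sign language video) would be produced from the layer-3 result.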
[0053] This mobile device software was developed using the “Android Studio” programming environment. In
[0054] The “TTSFragment.java” class is used in conjunction with the “SPEECH” tab (201) to convert the user input text in the “Textbox” into playable audio. The “STTFragment.java” and “RecognizerIntent” classes (202) are used in conjunction with the text output from the Google Speech Service, incorporating variables defined in the “Config.java” class. The “AndroidManifest.xml” is developed to configure the general view of, and access permissions for, the application App (203). The tab appearances are set up with six different “.xml” layout files (204) shown in
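The division of labor between the text-to-speech and speech-to-text fragments can be sketched in plain Java, with Android's TextToSpeech and RecognizerIntent machinery hidden behind a hypothetical SpeechEngine interface so the flow is testable off-device. All names here (SpeechEngine, DualModeApp, FakeEngine) are illustrative assumptions, not the patent's actual classes.

```java
import java.util.ArrayList;
import java.util.List;

// SpeechEngine stands in for Android's TextToSpeech (SPEECH tab) and
// the RecognizerIntent-based recognizer (MIC tab); both names below
// are assumptions made for this sketch.
interface SpeechEngine {
    void speak(String text);          // text-to-speech
    String recognize(byte[] audio);   // speech-to-text
}

public class DualModeApp {
    private final SpeechEngine engine;

    public DualModeApp(SpeechEngine engine) { this.engine = engine; }

    // SPEECH tab: the deaf user types, the hearing/blind user listens.
    public void onSpeechTab(String textboxInput) {
        engine.speak(textboxInput);
    }

    // MIC tab: the hearing/blind user speaks; the deaf user is shown
    // the sign language video matching the recognized text.
    public String onMicTab(byte[] audio) {
        String recognized = engine.recognize(audio);
        return "sign_video_for:" + recognized; // placeholder video lookup
    }

    // Minimal fake engine for off-device illustration.
    static class FakeEngine implements SpeechEngine {
        final List<String> spoken = new ArrayList<>();
        public void speak(String t) { spoken.add(t); }
        public String recognize(byte[] a) { return "hello"; }
    }
}
```

On a real device, FakeEngine would be replaced by implementations backed by the Android framework classes named in the paragraph above.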
[0055] The logic that runs this application App, using the Android programming classes and layers explained above, is shown in
[0056] Through
[0058] After the application App has been downloaded from the app store and installed on the Android phone (401), the application icon (403) appears on the phone's applications screen (402). The stairs-to-the-sky symbol was selected for this application icon; it represents the path of success and guidance for everyone in the world, regardless of personal constraints and limitations.
[0059] On launching the application App, the main page view (404) shows the “MIC” tab as the default. The main page, as described in (100), includes the selected-language indicator (405), which is set to English by default when the application App is launched, and the menu options (406), which the user selects by touch according to the required function. Each tab is highlighted when selected. Since the “MIC” tab is the default tab on the main page, the microphone icon (407) is located at the bottom center of the screen.
[0060] In
[0061] By selecting the “MIC” tab symbol (501), the relevant screen view (502) for this tab is depicted, and the written instruction “click on microphone icon to start . . . ” and the microphone icon (503) are displayed. This tab is used for communication from normal and blind people to a deaf person.
[0062] By selecting the “SPEECH” tab symbol (504), the relevant screen view (505) for this tab is depicted. Within this view, the textbox (506) with the written instruction “type a text” and the speech button (507) are displayed. Deaf people can use this tab's functionality to communicate with blind people.
[0063] After selecting the “LIBRARY” tab symbol (508), the relevant screen view (509) for this tab is depicted. The word list (511) is displayed according to the selected language shown in (510). This tab provides word and sentence lists stored in the library database for the selected language.
[0064] The “CONTACT US” tab (511) and “HELP” tab (513) are informational pages: the former presents a list of communication methods (512) for contacting the support team with any query, and the latter explains the general functionality and usage instructions for this application (514).
[0065] Regarding the logic and steps shown in
[0066] In
[0067]
[0068] The functionality of the third tab, “LIBRARY”, is shown in
[0069] By selecting the “LIBRARY” tab, the words list (801) is depicted by default in view (8a), based on the last language selected in the other tabs; the preselected language in (8a) is English. To switch among the eight language word lists, each click on the selected-language indicator changes the language and displays the corresponding word list on the screen. The language order is English (EN), Persian/Farsi (FA), German (GE), French (FR), Italian (IT), Spanish (SP), Japanese (JP), and Chinese (CH), shown respectively at (801) in view (8a), (802) in view (8b), (803) in view (8c), (804) in view (8d), (805) in view (8e), (806) in view (8f), (807) in view (8g), and (808) in view (8h).
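The language-indicator cycling described above can be sketched as a small Java class: each click advances through the eight supported languages in the stated order and wraps back to English. The class name LanguageCycler is an assumption for illustration.

```java
// Sketch of the selected-language indicator: clicking it cycles
// through the eight languages in the order given in the description.
public class LanguageCycler {
    static final String[] ORDER = {"EN", "FA", "GE", "FR", "IT", "SP", "JP", "CH"};
    private int index = 0; // English by default, as on first launch

    public String current() { return ORDER[index]; }

    // Called on each click of the selected-language indicator;
    // wraps back to EN after CH.
    public String next() {
        index = (index + 1) % ORDER.length;
        return ORDER[index];
    }
}
```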
[0070] The user can play the sign language video incorporated with each word by clicking on the intended word.
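The word-to-video behavior of the LIBRARY tab can be sketched as a simple lookup table mapping each stored word to the file name of its sign language video clip. The class name, words, and file names below are illustrative assumptions, not the app's actual database contents.

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical sketch of the LIBRARY tab lookup: clicking a word in
// the list plays the sign language video associated with it.
public class SignLibrary {
    private final Map<String, String> videos = new HashMap<>();

    public void add(String word, String videoFile) {
        videos.put(word.toLowerCase(), videoFile);
    }

    // Returns the video file for a clicked word, or a sentinel
    // when the word is not in the library database.
    public String videoFor(String word) {
        return videos.getOrDefault(word.toLowerCase(), "video_not_found");
    }
}
```

In the real app, one such table per supported language would back the eight word lists shown in views (8a) through (8h).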
[0071] In
[0072] In
[0073] The flowchart and block diagrams in the Figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and programming according to various embodiments of the present invention. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems that perform the specified functions or acts, or combinations of special purpose hardware and programming instructions.