SYSTEMS AND METHODS OF PORTABLE THERAPEUTICS OF EYE DISORDERS
20220265502 · 2022-08-25
Inventors
CPC classification
G06N5/01
PHYSICS
A61H2201/5048
HUMAN NECESSITIES
A61H2230/085
HUMAN NECESSITIES
International classification
Abstract
A fully wearable, wireless soft electronic system that offers portable, highly sensitive tracking of eye movements (vergence) via the combination of skin-conformal sensors and a virtual reality system. Advancements in material processing and printing technologies based on aerosol jet printing enable reliable manufacturing of skin-like sensors, while the flexible hybrid circuit based on elastomer and chip integration allows comfortable integration with a user's head. Analytical and computational study of a data classification algorithm provides a highly accurate tool for real-time detection and classification of ocular motions. In vivo demonstration with 14 human subjects captures the potential of the wearable electronics as a portable therapy system, whose minimized form factor facilitates seamless interplay with traditional wearable hardware.
Claims
1. A portable system comprising: a wearable ocular device configured to be worn by a wearer; and skin-conformal electronics.
2. The system of claim 1, wherein the skin-conformal electronics comprise at least one skin-like electrode that is configured to make conformal proximal contact with the nose of the wearer.
3. The system of claim 2, wherein the skin-conformal electronics further comprise at least one flexible electronic circuit that is configured to make conformal proximal contact with the back of the neck of the wearer.
4. The system of claim 2, wherein each skin-like electrode comprises a stretchable aerosol jet printed electrode.
5. The system of claim 1 further comprising a processing system for running a therapy environment that simulates continuous movements of multiple objects at three varying depths (near, intermediate, and far) via the wearable ocular device.
6. The system of claim 5, wherein the three varying depths correspond to 1°, 2°, and 3° of eye motions.
7. The system of claim 5 further comprising an audio system configured to guide the wearer through the therapy environment.
8. A portable system comprising: a wearable ocular device comprising a virtual reality (VR) headset configured to be worn by a wearer; skin-conformal electronics comprising: a first skin-like stretchable aerosol jet printed biopotential electrode configured for: fitting under the VR headset; and high-fidelity detection of slight eye movements via conformal lamination on contoured areas around the eyes and nasal region of the wearer; and a second wireless circuit configured for lamination on the back of the neck of the wearer; and a processing system configured to provide accurate, real-time detection and classification of multi-degree eye vergence in a VR environment toward portable therapeutics of eye disorders.
9. The system of claim 8, wherein the portable system is an electrooculography (EOG)-based detection system of eye vergence with at least a 90% classification accuracy of multi-degree motions of eyes of the wearer.
10. The system of claim 9, wherein the processing system includes a mobile application configured to present a visual therapy program for eye convergence and divergence motions.
11. The system of claim 10, wherein the processing system further includes a MATLAB program configured to train and validate precise eye vergence motions for classification.
12. The system of claim 11, wherein the classification comprises an ensemble classifier based on subspace discriminant methods.
13. The system of claim 11, wherein the classification comprises a random forest classification algorithm that yields the at least a 90% classification accuracy.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0052] The patent or application file contains at least one drawing executed in color. Copies of this patent or patent application publication with color drawing(s) will be provided by the Office upon request and payment of the necessary fee.
[0053] The accompanying Figures, which are incorporated in and constitute a part of this specification, illustrate several aspects described below.
DETAILED DESCRIPTION OF THE INVENTION
[0070] To facilitate an understanding of the principles and features of the various embodiments of the invention, various illustrative embodiments are explained below. Although exemplary embodiments of the invention are explained in detail, it is to be understood that other embodiments are contemplated. Accordingly, it is not intended that the invention is limited in its scope to the details of construction and arrangement of components set forth in the following description or illustrated in the drawings. The invention is capable of other embodiments and of being practiced or carried out in various ways. Also, in describing the exemplary embodiments, specific terminology will be resorted to for the sake of clarity.
[0071] It must also be noted that, as used in the specification and the appended claims, the singular forms “a,” “an” and “the” include plural references unless the context clearly dictates otherwise. For example, reference to a component is intended also to include composition of a plurality of components. References to a composition containing “a” constituent is intended to include other constituents in addition to the one named.
[0072] Also, in describing the exemplary embodiments, terminology will be resorted to for the sake of clarity. It is intended that each term contemplates its broadest meaning as understood by those skilled in the art and includes all technical equivalents which operate in a similar manner to accomplish a similar purpose.
[0073] Ranges may be expressed herein as from “about” or “approximately” or “substantially” one particular value and/or to “about” or “approximately” or “substantially” another particular value. When such a range is expressed, other exemplary embodiments include from the one particular value and/or to the other particular value.
[0074] Similarly, as used herein, “substantially free” of something, or “substantially pure”, and like characterizations, can include both being “at least substantially free” of something, or “at least substantially pure”, and being “completely free” of something, or “completely pure”.
[0075] By “comprising” or “containing” or “including” is meant that at least the named compound, element, particle, or method step is present in the composition or article or method, but does not exclude the presence of other compounds, materials, particles, method steps, even if the other such compounds, material, particles, method steps have the same function as what is named.
[0076] It is also to be understood that the mention of one or more method steps does not preclude the presence of additional method steps or intervening method steps between those steps expressly identified. Similarly, it is also to be understood that the mention of one or more components in a composition does not preclude the presence of additional components than those expressly identified.
[0077] The materials described as making up the various elements of the invention are intended to be illustrative and not restrictive. Many suitable materials that would perform the same or a similar function as the materials described herein are intended to be embraced within the scope of the invention. Such other materials not described herein can include, but are not limited to, for example, materials that are developed after the time of the development of the invention.
[0078] As discussed, ocular disorders currently affect the developed world, causing loss of productivity in adults and children. While the cause of such disorders is not clear, neurological issues are often considered the most likely origin. Treatment of strabismus and vergence disorders requires invasive surgery or clinic-based vision therapy that has been used for decades due to the lack of alternatives such as portable therapeutic tools. Recent advancements in electronic packaging and image processing techniques have opened the possibility for optics-based portable eye tracking approaches, but several technical and safety hurdles limit the implementation of the technology in wearable applications.
[0079] The present invention is a fully wearable, wireless soft electronic system that offers portable, highly sensitive tracking of eye movements (vergence) via the combination of skin-conformal sensors and a virtual reality system. Advancements in material processing and printing technologies based on AJP enable reliable manufacturing of skin-like sensors, while a flexible electronic circuit is prepared by the integration of chip components onto a soft elastomeric membrane.
[0080] Analytical and computational study of a data classification algorithm provides a highly accurate tool for real-time detection and classification of ocular motions. In vivo demonstration with human subjects captures the potential of the wearable electronics as a portable therapy system, which can be easily synchronized with a virtual reality headset.
[0081] Recording eye vergence via EOG has been deemed difficult by ocular experts because of the required signal sensitivity, which is compromised by the lower conformality and greater motion artifacts of conventional gel electrodes in comparison to skin-like, soft electrodes. Additionally, a pragmatic experimental setup that can invoke a precise eye vergence response is lacking.
[0082] The present invention incorporates nanostructured membrane circuits and skin-like electrodes which are stretchable and flexible enough to compress under a VR headset as well as conform to the contour of the human nose. The stretchable hybrid electronics are made of ultrathin, biocompatible materials which enable ergonomic, continuous sensing of electrooculograms. A VR headset with a customized android application presents the visual therapy program for eye convergence and divergence motions.
[0083] An external MATLAB program is used to train and validate precise eye vergence motions for classification. The classification system uses an ensemble classifier based on subspace discriminant methods, a random-subspace ensemble approach related to random forests. This classification method yields higher than 90% accuracy for the best eye vergence test subjects.
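The subspace-discriminant ensemble described above can be sketched as follows. This is a minimal, hedged stand-in rather than the patent's MATLAB implementation: each learner here is a nearest-class-mean discriminant trained on a random feature subspace, with majority voting across learners; the function names and toy data are illustrative only.

```python
import numpy as np

def fit_subspace_ensemble(X, y, n_learners=30, subspace_frac=0.5, seed=0):
    """Train an ensemble of simple discriminants, each on a random
    feature subspace (simplified stand-in for subspace discriminant)."""
    rng = np.random.default_rng(seed)
    n_features = X.shape[1]
    k = max(1, int(subspace_frac * n_features))
    classes = np.unique(y)
    learners = []
    for _ in range(n_learners):
        idx = rng.choice(n_features, size=k, replace=False)
        # Nearest-class-mean discriminant restricted to the chosen subspace.
        means = np.array([X[y == c][:, idx].mean(axis=0) for c in classes])
        learners.append((idx, means))
    return classes, learners

def predict_subspace_ensemble(model, X):
    """Majority vote over all subspace learners."""
    classes, learners = model
    votes = np.zeros((X.shape[0], len(classes)), dtype=int)
    for idx, means in learners:
        d = np.linalg.norm(X[:, None, idx] - means[None, :, :], axis=2)
        votes[np.arange(X.shape[0]), d.argmin(axis=1)] += 1
    return classes[votes.argmax(axis=1)]

# Toy demo: two well-separated clusters of EOG-like feature vectors.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 0.2, (40, 10)), rng.normal(1, 0.2, (40, 10))])
y = np.array([0] * 40 + [1] * 40)
model = fit_subspace_ensemble(X, y)
acc = (predict_subspace_ensemble(model, X) == y).mean()
```

In MATLAB the equivalent is `fitcensemble(...,'Method','Subspace','Learners','discriminant')`; the sketch above only mirrors the random-subspace voting structure.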
[0084] Analytics also suggest improvement in eye vergence with successive training with the VR headset program and the sensors. Further analysis of a patient indicates the system can evoke strabismus motions that are recordable and classifiable. This invention improves current home-based vision therapy methods by allowing optometrists to prescribe patients an alternative to archaic pencil pushups.
[0089] As shown, the present portable therapeutic system incorporates a set of stretchable aerosol jet printed electrodes 110, soft wireless electronics 120 on the back of the user's neck, and a VR device 130. The portable system offers a highly sensitive, automatic detection of eye vergence via a data classification algorithm.
[0091] A scalable additive manufacturing method using AJP was used to fabricate the skin-wearable sensor, which was connected to a fully portable, miniaturized wireless circuit that is ultralight and highly flexible for gentle lamination on the back of the neck. For optimization of sensor location and signal characterization, a commercial data acquisition system (BioRadio, Great Lakes NeuroTechnologies) was initially utilized.
[0092] TABLE 1 illustrates a feature comparison between the BioRadio and the present periocular wearable electronics. Integration of advanced chip components allows the periocular wearable electronics to match the electronic performance of the BioRadio while achieving an extremely light weight.
TABLE 1
                        BioRadio                 Periocular Wearable Electronics
Wireless Connection     Bluetooth Classic + LE   Bluetooth LE (4.2)
Data Rate               190 kbps                 120 kbps
Differential Channels   4                        Up to 8
Sampling Resolution     Up to 24-bit             24-bit
Common Mode Rejection   −100 dB                  −110 dB
Input Impedance         500 MΩ                   1000 MΩ
Storage Capacity        8 GB                     No storage
Sample Rate             250 Hz-16 kHz            250 Hz-16 kHz
Battery Life            ~9 hours (1320 mAh)      ~1 hour (105 mAh)
Battery Type            Lithium-Ion Polymer      Lithium-Ion Polymer
Device Weight           115 g                    5.5 g (3.3 g without battery)
[0094] The systematic integration of lithography-based open-mesh interconnects on a polyimide (PI) film, the bonding of small chip components, and the encapsulation with an elastomeric membrane each enable the benefits of the present soft and compliant electronic system. The flexible electronic circuit includes different modules, including the antenna, that allow for wireless, high-throughput EOG detection.
[0096] The ultrathin, dry electrode (thickness: 67 μm including a backing elastomer) makes an intimate and conformal bond to the skin.
Conformal Contact Analysis for Aerosol Jet-Printed Electrodes
[0098] For conformal contact to occur, the magnitude of the adhesion energy must be larger than the sum of the bending and elastic energies, so the total energy must be negative. Equation (1) describes this condition:
U_bending + U_skin + U_adhesion < 0 (1)
[0099] Each of these energy terms is then defined. The bending energy is defined as:
[0100] The bending stiffness, EI.sub.electrode, is calculated via:
EI_electrode = αEI_PI/Ag + (1 − α)EI_silicone (3)
[0101] where α is the PI/Ag area fraction of the skin-like electrode. The bending stiffness is split into that for the silicone elastomer layer and the PI/Ag pattern:
[0102] The skin surface is modeled with a sine wave as:
[0103] While the displacement of the electrode is defined as:
[0104] We assume the following dimensions to represent the properties of human skin: h_rough = 55 μm; λ_rough = 140 μm; and E_skin = 130 kPa,
[0108] where h_rough is the roughness amplitude, λ_rough is the wavelength, and E_skin is the modulus of skin.
[0109] The elastic energy of the skin, due to normal stress, is defined as:
[0110] The interfacial adhesion energy is calculated as:
[0111] The work of adhesion value is dominated by the elastomer, and the electrode's total value is:
γ = (1 − α)γ_silicone/skin (12)
[0112] Minimizing the total energy to express the maximum deflection of the electrode yields a result in terms of h:
[0113] Then, substituting into Equation (1):
U_bending + U_skin + U_adhesion = −0.150 J (14)
[0114] Thus, conformal contact occurs between the electrode and skin.
[0115] Parameters used in this calculation include: E_silicone = 7.85 kPa; γ_silicone/skin = 0.89; E_PI = 2.5 GPa; E_Ag = 69 GPa; h_silicone = 65 μm; h_PI = 1 μm; h_Ag = 1 μm; and α = 33.47%.
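The bookkeeping of Equations (3) and (12) can be checked numerically with the listed parameters. This is a hedged sketch: the per-layer bending stiffness EI = E·h³/12 (per unit width) and the simple summing of the PI and Ag layer stiffnesses are assumptions, since the document's full per-layer expressions are not reproduced here.

```python
# Hedged numerical sketch of Equations (3) and (12). The per-layer
# bending stiffness EI = E*h^3/12 (per unit width) is an assumed form;
# neutral-axis shifts in the PI/Ag bilayer are ignored for simplicity.
alpha = 0.3347                 # PI/Ag area fraction of the electrode
E_PI, h_PI = 2.5e9, 1e-6       # polyimide modulus (Pa), thickness (m)
E_Ag, h_Ag = 69e9, 1e-6        # silver modulus (Pa), thickness (m)
E_sil, h_sil = 7.85e3, 65e-6   # silicone modulus (Pa), thickness (m)
gamma_sil_skin = 0.89          # work of adhesion, silicone on skin

def plate_EI(E, h):
    """Bending stiffness per unit width of a single layer (assumed form)."""
    return E * h**3 / 12.0

# Eq. (3): rule of mixtures over patterned (PI/Ag) and bare silicone regions.
EI_PI_Ag = plate_EI(E_PI, h_PI) + plate_EI(E_Ag, h_Ag)  # simplistic bilayer sum
EI_electrode = alpha * EI_PI_Ag + (1 - alpha) * plate_EI(E_sil, h_sil)

# Eq. (12): effective work of adhesion is dominated by the exposed silicone.
gamma = (1 - alpha) * gamma_sil_skin
```

The negative total energy reported in Equation (14) then confirms conformal contact: the adhesion term outweighs the bending and skin-deformation penalties.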
Characterization and Fabrication of Wearable Skin-Like Electrodes Via AJP
[0124] A highly sensitive detection of EOG signals requires a low skin-electrode contact impedance and minimized motion artifacts. Even though a conventional electrode offers a low contact impedance due to the electrolyte gel, the combination of the sensor rigidity, size, and associated adhesive limits application onto the sensitive and highly contoured regions of the nose and eye areas.
[0130] As a potentially low-cost and scalable printing method, AJP allows direct printing of an open-mesh structure onto a soft membrane.
[0131] Plasma treatment along with platen heat was used to produce fine features.
[0132] As expected, multi-layer printing of AgNPs provides lowered resistance.
[0133] An FTIR analysis further verifies the dissociation of the binder. The FTIR transmittance spectra for the printed electrodes before and after sintering indicate the disappearance of the 'NH stretch', 'NH bend', and 'C-N stretch' bands, which confirms the dissociation of oleylamine.
[0137] The mesh design of an electrode provides highly flexible and stretchable mechanics upon multi-modal strain, supported by computational modeling. A set of experimental tests validates the mechanical stability of the sensor, where the structure can be stretched up to 100% before a change in resistance is observed.
[0138] Fabrication of the printed electrode concludes with dry etching of the PI and transfer onto an elastomeric membrane by dissolving a sacrificial layer, which prepares the sensor to be mounted on the skin.
[0139] Skin-electrode contact impedances are measured by connecting the pair of electrodes (positioned above and below each eye) to test equipment (Checktrode Model 1089 MK III, UFI) and recording the measurement values at both the start and finish of the experimental session. A one-hour period is chosen to represent the average duration of vergence therapeutic protocols.
[0140] At the start of the session, the average impedance values for the two types of electrodes show 7.6 kΩ and 8.4 kΩ for gel and skin-like electrodes, respectively. After one hour, the respective impedance values change to 1.2 kΩ and 6.3 kΩ, respectively, indicating the formation of excellent skin-to-electrode interfaces by both types of electrodes.
[0141] Next, the subject performs a simple vertical eye movement protocol by looking up and down to generate EOG signals for signal-to-noise ratio (SNR) comparison. The SNR analysis, which is carried out by finding the log ratio of root mean square (RMS) values for each EOG signal (up and down) and the baseline signal from ten trials, is also performed at the start and finish of the one hour session.
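The SNR metric described above, a log ratio of RMS values between an EOG deflection and the baseline, can be sketched as follows. The 20·log10 dB convention and the synthetic signal parameters are assumptions, not values from the document.

```python
import numpy as np

def rms(x):
    """Root mean square of a signal segment."""
    return np.sqrt(np.mean(np.square(x)))

def snr_db(eog_segment, baseline_segment):
    """SNR as the log ratio of RMS values (20*log10 dB convention assumed)."""
    return 20.0 * np.log10(rms(eog_segment) / rms(baseline_segment))

# Synthetic example: a 'look up' EOG deflection vs. a resting baseline.
rng = np.random.default_rng(0)
t = np.linspace(0, 1, 250)                   # 1 s at an assumed 250 Hz
baseline = rng.normal(0, 2e-6, t.size)       # ~2 uV noise floor
eog = 50e-6 * np.exp(-((t - 0.5) / 0.1)**2) + rng.normal(0, 2e-6, t.size)
snr = snr_db(eog, baseline)
```

Averaging this quantity over the ten up/down trials at the start and end of the session reproduces the comparison protocol described above.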
[0142] While the SNR results show comparable performances and no meaningful changes in the two types of electrodes over the one hour period, the skin-like electrodes maintained higher SNR values in all cases.
[0143] Lastly, the electrode's sensitivity to movement artifacts is quantified by requesting the user to walk on a treadmill at a speed of 3.2 mph for one minute. The RMS values of the data from each electrode type are quantified to be 2.36 μV and 2.45 μV for gel and skin-like electrodes, respectively, suggesting that, in terms of movement artifact, the use of skin-like electrodes does not present a realistic disadvantage relative to gel adhesive electrodes.
[0144] The details of the experimental setup and analysis results are shown in the accompanying figures.
Study of Ocular Vergence and Classification Accuracy Based on Sensor Positions
[0145] Recording of ocular vergence via EOG requires meticulous optimization to produce the maximum functionality. A series of distances with eye vergence was assessed to establish a metric for classification. The most common distances that humans observe in daily life were the basis for the procedure in the physical and virtual domains.
[0146] The discrepancy of the degrees of eye motion is a necessary physical attribute for eye vergence classification.
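The link between viewing depth and vergence demand can be illustrated with simple geometry: the total vergence angle for a target at distance d is 2·atan(IPD/2d). The 63 mm interpupillary distance and the example depths below are assumed illustrative values, not taken from the document.

```python
import math

def vergence_angle_deg(distance_m, ipd_m=0.063):
    """Total vergence angle (degrees) for a target at the given distance,
    assuming a 63 mm interpupillary distance (assumed value)."""
    return math.degrees(2.0 * math.atan(ipd_m / (2.0 * distance_m)))

# Vergence demand at representative near / intermediate / far depths
# (illustrative distances, not from the document).
angles = {d: vergence_angle_deg(d) for d in (0.6, 1.2, 3.6)}
```

The monotonic fall-off of the angle with depth is the physical discrepancy that the classifier exploits to separate near, intermediate, and far fixations.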
[0148] An experimental setup is shown in the accompanying figures.
[0149] The classification accuracy for each sensor location is the averaged value from three human subjects (details of individual confusion matrices with the associated accuracy calculation are summarized in the accompanying figures).
[0150] Additionally, the recording setup was evaluated for the second case of ocular vergence.
Optimization of Vergence Analysis Via Signal Processing and Feature Extraction
[0152] The variability of ocular vergence in both physical and VR domains was assessed to realize a fully portable vergence therapeutic program.
[0153] As discussed, the acquired EOG signals from vergence motions require a mathematical translation using statistical analysis for quantitative signal comparison. Prior to algorithm implementation, raw EOG signals are preprocessed.
[0154] The bandpass filter removes the drift and high-frequency noise, so the derivative is much cleaner.
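The drift-removal and derivative step can be sketched with a zero-phase Butterworth bandpass over the 0.01-10 Hz band described in the Methods. The sampling rate, filter order, and synthetic trace below are assumptions for illustration.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt

fs = 250.0  # sampling rate in Hz; an assumed, typical EOG value

# Zero-phase 0.01-10 Hz Butterworth bandpass (order is an assumption);
# second-order sections keep the very low cutoff numerically stable.
sos = butter(2, [0.01, 10.0], btype="bandpass", fs=fs, output="sos")

def preprocess_eog(raw):
    """Remove drift and high-frequency noise, then differentiate."""
    filtered = sosfiltfilt(sos, raw)          # zero-phase: no time shift
    derivative = np.gradient(filtered) * fs   # amplitude change per second
    return filtered, derivative

# Synthetic trace: slow drift + a vergence-like step + sensor noise.
rng = np.random.default_rng(0)
t = np.arange(0, 10, 1 / fs)
raw = 5e-4 * t + 1e-4 * (t > 5) + rng.normal(0, 5e-6, t.size)
filtered, derivative = preprocess_eog(raw)
```

Peaks in the derivative then mark the convergence/divergence transitions that the threshold stage operates on.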
[0155] Increasing the derivative filter from 2nd order to 6th order can assist in positively altering the range of thresholds by changing the signal-to-noise ratio.
[0156] The result of the wrapper feature selection indicates a saturation of the accuracy at a mean accuracy of 95%.
[0157] The integration of the training procedure with filters, thresholds, and the ensemble classifier enables the present high classification accuracies. Consequently, the presented set of high-quality EOG measurements, integrated algorithm, and training procedures allows the calibration of vergence classification specific to the user, regardless of the variabilities in EOG arising from individual differences, such as face size or shape. Multiple classifiers were tested in MATLAB's Classification Learner application; however, the results show that k-nearest neighbor (KNN) and support vector machine (SVM) classifiers were inferior to the ensemble subspace discriminant, which showed accuracies above 85% for subject 12.
Methods for Cross-Validation
[0158] After the data is recorded, it is stored into a .mat file for further processing. The data is stored in a double structure, which is then separated into MATLAB cell arrays. The cell arrays are passed through a Butterworth bandpass filter between 0.01 Hz and 10 Hz. A zero-phase filter is implemented after data is recorded, while the real-time data is parsed into 500 ms windows that are filtered after five seconds of data is recorded. The filtered data is separated into ten features that are inserted into the classifier for cross-validation. Numerous classifiers were compared in the Classification Learner application prior to establishing the ensemble subspace discriminant as the best classifier, by utilizing five-fold cross-validation with each classifier. MATLAB's cross-validation algorithm applies randomization of the training and testing data by splitting the data into five folds and six groups. Five out of the six groups are randomly indicated as training and the last group is randomly indicated as the testing group. The groups are divided into n groups equal to the number of classes. The output class is presented in a confusion matrix at the end of recording as well as in real time on the graphical user interface. This approach was utilized with all 14 test subjects, with a minimum of at least three attempts from each subject for training and testing.
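The five-fold randomized train/test partitioning can be sketched as follows. This mirrors the general scheme described (random permutation, five folds, each fold held out once) rather than MATLAB's exact algorithm; the window count is illustrative.

```python
import numpy as np

def five_fold_indices(n_samples, seed=0):
    """Randomly split sample indices into five folds; each fold serves
    once as the test set while the other four train the classifier."""
    rng = np.random.default_rng(seed)
    order = rng.permutation(n_samples)
    folds = np.array_split(order, 5)
    for k in range(5):
        test = folds[k]
        train = np.concatenate([folds[j] for j in range(5) if j != k])
        yield train, test

# Example: 120 feature windows -> five disjoint train/test partitions.
splits = list(five_fold_indices(120))
```

Averaging the per-fold accuracies of these five partitions gives the cross-validation figures reported in the tables below.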
Comparison of Ocular Classification Accuracy Between VR and Physical Apparatus
[0159] This section summarizes the experimental results of the ocular classification comparison between the soft ocular wearable electronics used with a VR headset and with the physical apparatus.
[0160] The VR headset enabled the capture of ideal eye vergence motions because head motions are disabled and the stimulus is perfectly aligned with the user's binocular vision. This is evident from the averaged signal and standard deviation of the normalized position of the ideal datasets.
[0161] The physical apparatus data show a larger variation across trials in comparison to the VR headset.
[0164] Utilizing a rise time algorithm, the amplitude changes and variation of all datasets from the physical and VR environments are shown in the accompanying figures.
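A rise-time computation of the kind referenced above can be sketched as follows. The 10-90% thresholds and the synthetic sigmoidal transition are assumptions; the document does not specify the algorithm's parameters.

```python
import numpy as np

def rise_time(signal, fs, lo=0.1, hi=0.9):
    """Time for the signal to climb from lo*peak to hi*peak of its
    baseline-to-peak amplitude (10-90% thresholds are an assumption)."""
    s = signal - signal[0]               # reference to the starting level
    peak = s.max()
    i_lo = np.argmax(s >= lo * peak)     # first crossing of each threshold
    i_hi = np.argmax(s >= hi * peak)
    return (i_hi - i_lo) / fs, peak

# Synthetic vergence-like EOG step with a smooth rising edge.
fs = 250.0
t = np.arange(0, 2, 1 / fs)
sig = 1e-4 / (1 + np.exp(-(t - 1.0) / 0.05))  # ~100 uV sigmoidal transition
rt, amp = rise_time(sig, fs)
```

The returned rise time and amplitude are the two quantities whose changes and variation are compared across the physical and VR datasets.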
[0165] The intrinsic quality of the ensemble classifier shows high variance in cross-validation assessments. Even with the real-time classification, the ensemble classifier yields about 83% and 80% accuracy for the VR environment and the physical apparatus, respectively. The VR real-time classification is higher than that of the physical apparatus due to less variation between opposing motions of positive and negative changes. Details of the classification accuracies in cross-validation and real time are summarized in TABLES 2-5.
[0166] TABLE 2 presents cross-validation accuracies of subjects 1 to 5 using the physical apparatus. Subjects 1-5 conducted eye vergence training with all nine positions; the corresponding accuracies are shown here.
TABLE 2
Percent accuracy of each subject, all nine positions (N = 20)
Subject  Center   0°      45°     90°     135°    180°    215°    270°    315°
S1       96.67    96.67   95.00   98.33   97.50   99.17   96.67   97.50   95.83
S2       85.00    82.50   85.83   83.33   88.33   90.83   93.33   89.17   89.17
S3       88.33    91.67   95.00   94.17   95.00   93.33   91.67   92.50   88.33
S4       96.67    87.50   95.00   96.67   98.33   90.00   89.17   93.33   95.00
S5       100.00   90.83   95.83   91.67   95.83   95.00   100.00  95.00   93.33
[0167] TABLE 3 presents a cross-validation assessment of subjects 6 to 10 using the physical apparatus. Subjects 6-10 conducted eye vergence training with all nine positions; the corresponding accuracies are shown here.
TABLE 3
Percent accuracy of each subject, all nine positions (N = 20)
Subject  Center   0°      45°     90°     135°    180°    215°    270°    315°
S6       86.67    76.67   94.17   85.00   82.50   80.00   72.50   88.33   90.00
S7       77.50    90.00   98.33   90.83   96.67   96.67   99.17   99.17   100.00
S8       76.67    91.67   93.33   87.50   98.33   98.33   94.17   94.17   95.00
S9       87.50    92.50   84.17   85.00   92.50   90.83   97.50   97.50   94.17
S10      71.67    61.67   65.00   72.50   60.00   82.50   85.83   75.83   80.00
[0168] TABLE 4 presents real-time classification results for test subjects 6 to 10 using the physical apparatus. Subjects 6-10 underwent eye vergence training with all nine positions with the ocular mount. Real-time classification algorithm results under the mounted condition are presented.
TABLE 4
Percent accuracy of each subject, all nine positions (N = 3)
Subject  Center   0°      45°     90°     135°    180°    215°    270°    315°
S6       66.67    83.33   83.33   72.22   77.78   88.89   83.33   77.78   77.78
S7       94.44    88.89   100.00  94.44   88.89   88.89   88.89   100.00  100.00
S8       83.33    88.89   88.89   94.44   94.44   94.44   94.44   88.89   100.00
S9       83.33    88.89   77.78   72.22   83.33   88.89   100.00  100.00  94.44
S10      72.22    72.22   66.67   72.22   77.78   88.89   88.89   100.00  66.67
[0169] TABLE 5 presents real-time classification results for test subjects 4, 8, and 9 using the physical apparatus. Subjects 4, 8, and 9 underwent eye vergence training with all nine positions without the ocular mount. Real-time classification algorithm results under the dismounted condition are presented.
TABLE 5
Percent accuracy of each subject, all nine positions (N = 3)
Subject  Center   0°      45°     90°     135°    180°    215°    270°    315°
S4       66.67    66.67   83.33   83.33   88.89   88.89   88.89   94.44   94.44
S8       83.33    88.89   94.44   88.89   100.00  100.00  88.89   94.44   94.44
S9       72.22    83.33   88.89   88.89   94.44   88.89   83.33   88.89   94.44
VR-Enabled Vergence Therapeutic System
[0170] As the gold-standard method, conventional home-based vergence therapy utilizes pencil pushups in conjunction with office-based visual therapy. Adding a VR headset to home-based therapy with a portable data recording system allows a patient to use the same therapeutic program both at the optometrist's office and at home. To integrate with the developed ocular wearable electronics, we designed two ocular therapy programs in the Unity engine (unity3d.com).
[0171] The first program enables a patient to use the virtual 'Brock String'.
[0172] The second therapy program uses a set of two cards with concentric circles on each card, referred to as 'Eccentric Circle'.
[0174] As noted, continuous use of the VR headset program improves eye vergence with successive training.
DISCUSSION
[0175] Collectively, the development of a fully portable, all-in-one, periocular wearable electronic system with a wireless, real-time classification of eye vergence is introduced. The comprehensive study of soft and nanomaterials, stretchable mechanics, and processing optimization and characterization for aerosol jet printed EOG electrodes enabled the first demonstration of highly stretchable and low-profile biopotential electrodes that allowed a comfortable, seamless integration with a therapeutic VR environment. Highly sensitive eye vergence detection was achieved by the combination of the skin-like printed EOG electrodes, optimized sensor placement, and novel signal processing and feature extraction strategy. When combined with a therapeutic program-embedded VR system, the users were able to successfully improve the visual training accuracy in an ordinary office setting. Through the in vivo demonstration, we showed that the soft periocular wearable electronics can accurately provide quantitative feedback of vergence motions, which is directly applicable to many CI and strabismus patients. Overall, the VR-integrated wearable system verified its potential to replace archaic home therapy protocols such as the pencil pushups.
[0176] While the present invention focuses on the development of the integrated wearable system and demonstration of its effectiveness on vergence monitoring with healthy population, additional work investigates the use of the wearable system for home-based and long-term therapeutic effects with eye disorder patients. The quantitative detection of eye vergence can also be utilized for diagnosis of neurodegenerative diseases and childhood learning disorders, both of which are topics of high impact research studies that are bottlenecked by the lack of low-cost and easy-to-use methods for field experiments. For example, Parkinson's patients exhibit diplopia and convergence insufficiency while rarer diseases such as spinocerebellar ataxia type 3 and type 6 can demonstrate divergence insufficiency and diplopia as well. Although these diseases are not yet fully treatable, quality of life can be improved if the ocular conditions are treated with therapy. Beyond disease diagnosis and treatment domains, the presented periocular system may serve as a unique and timely research tool in the prevention and maintenance of ocular health, a research topic with increasing interests due to the excessive use of smart devices.
Materials and Methods
[0177] Fabrication of a Soft, Flexible Electronic Circuit
[0178] The portable and wearable flexible electronic circuit was fabricated to integrate a set of skin-like electrodes for wireless detection of eye vergence. The combination of newly developed transfer printing and hard-soft integration with a conventional microfabrication technique allowed for successful manufacture of the flexible electronics (details of the device fabrication follow).
[0180] The fabrication process of the flexible devices, including conventional microfabrication techniques, a double transfer printing process, direct writing on soft material with additive manufacturing, and chip mounting, is presented below.

a) Preparation of a Carrier Wafer
1. Clean a silicon wafer with acetone, IPA, and DI water.
2. Dehydrate on a hot plate at 110° C. for three min.
3. Apply O2 plasma at 50 W for 60 sec.
4. Spincoat PMMA A7 at 4000 rpm for 30 sec.
5. Bake on a hot plate at 180° C. for two min 30 sec.
6. Spincoat PI at 4000 rpm for one min.
7. Pre-bake on a hot plate at 150° C. for five min.
8. Hard bake on a hot plate at 250° C. for 55 min.

b) Material Deposition and Photolithography
1. Deposit 1 μm-thick copper (Cu) using sputtering systems.
2. Spincoat photoresist (AZ 4620) at 2000 rpm for 30 sec.
3. Bake on a hot plate at 110° C. for five min.
4. Align with a photomask and expose UV light, exposure dose of 320 mJ/cm².
5. Develop patterns with a developer (AZ 400K, 1:3 dilution).
6. Etch Cu using copper etchant.
7. Remove photoresist using acetone.
8. Dehydrate on a hot plate at 110° C. for three min.
9. Apply O2 plasma at 50 W for 30 sec.
10. Spincoat PI at 900 rpm for one min.
11. Bake on a hot plate at 150° C. for five min and 200° C. for 15 min.
12. Spincoat second PI at 900 rpm for one min.
13. Bake on a hot plate at 150° C. for five min and 200° C. for 45 min.
14. Apply O2 plasma at 50 W for 30 sec.
15. Spincoat photoresist (AZ 4620) at 2000 rpm for 30 sec.
16. Bake on a hot plate at 110° C. for five min.
17. Align with a photomask and expose UV light, exposure dose of 320 mJ/cm².
18. Develop patterns with a developer (AZ 400K, 1:3 dilution).
19. Etch PI using a reactive ion etcher (RIE) at 150 W, 150 mTorr, and 20 SCCM O2 for 18 min.
20. Remove photoresist using acetone.
21. Dehydrate on a hot plate at 110° C. for three min.
22. Apply O2 plasma at 50 W for 30 sec.
23. Deposit 2 μm-thick Cu using sputtering systems.
24. Spincoat photoresist (AZ 4620) at 2000 rpm for 30 sec.
25. Bake on a hot plate at 110° C. for five min.
26. Align with a photomask and expose UV light, exposure dose of 320 mJ/cm².
27. Develop patterns with a developer (AZ 400K, 1:3 dilution).
28. Etch Cu using copper etchant.
29. Remove photoresist using acetone.
30. Dehydrate on a hot plate at 110° C. for three min.
31. Apply O2 plasma at 50 W for 30 sec.
32. Spincoat PI at 4000 rpm for one min.
33. Bake on a hot plate at 150° C. for five min and 200° C. for 45 min.
34. Apply O2 plasma at 50 W for 30 sec.
35. Spincoat photoresist (AZ 4620) at 2000 rpm for 30 sec.
36. Bake on a hot plate at 110° C. for five min.
37. Align with a photomask and expose UV light, exposure dose of 320 mJ/cm².
38. Develop patterns with a developer (AZ 400K, 1:3 dilution).
39. Etch PI using a reactive ion etcher (RIE) at 150 W, 150 mTorr, and 20 SCCM O2 for 30 min.
40. Remove photoresist using acetone.

c) Preparation of a Thin Elastomeric Membrane
1. Prepare 4 g of 1:1 Ecoflex 00-30 and 6 g of 1:1 Ecoflex Gel and mix them together.
2. Spincoat the mixture at 200 rpm for 30 sec on a five inch plastic petri dish.
3. Cure at room temperature.

d) Pick Up and Transfer Printing of the Electronic Device
1. Immerse the fabricated electronic circuit on the wafer in acetone overnight.
2. Pick up the electronic circuit using water-soluble tape.
3. Transfer onto the thin elastomeric membrane.
4. Dissolve the water-soluble tape by gently applying water.

e) Mounting of Electronic Chips
1. Screen print low-temperature solder paste (alloy of Sn/Bi/Ag (42%/57.6%/0.4%), Chip Quik Inc.) with alignment on the electronic circuit.
2. After mounting all necessary chips on the proper contact pads, apply heat according to the recommended reflow profile.
3. Apply soldering flux if necessary.
4. Attach a 1×1×1 mm³ neodymium magnet to the sensor pads to complete the circuit.

f) Fabrication of the Skin-Like Electrode with Aerosol Jet Printing
1. Prepare a glass slide by cleaning with acetone and IPA.
2. Spin coat PMMA A7 to a thickness of 700 nm at 4000 RPM for 30 seconds.
3. Bake the PMMA for two min 30 sec.
4. Spin coat a layer of PI-2545 to a thickness of 1 μm at 5000 RPM for one minute.
5. Bake the PI-2545 for 60 minutes at 250° C. on a hotplate.
6. Surface treat the sample with air plasma for 30 seconds.
7. Load the sample on the aerosol jet printer (AJP) platen and set the temperature to 70° C.
8. Turn on the sheath flow rate at 30 SCCM.
9. Turn on the atomizer flow rate at 20 SCCM.
10. Turn on the atomizer current at 0.6 Amps.
11. Deposit silver by running the program.
12. Sinter the nanoparticles at 200° C. for one hour.
13. Load the sample on the AJP platen and set the temperature to 70° C.
14. Turn on the sheath flow rate at 30 SCCM.
15. Turn on the atomizer flow rate at 20 SCCM.
16. Turn on the atomizer current at 0.6 Amps.
17. Align the silver layer with the next layer using fiducial markers.
18. Deposit diluted SC1813 by running the program.
19. Bake for five minutes at 110° C.
20. Etch PI using a reactive ion etcher (RIE) at 250 W, 150 mTorr, and 20 SCCM O2 for 20 min.
21. Remove photoresist with acetone.

g) Pick Up and Transfer Printing of the AJP Skin-Like Electrode
1. Clean a glass slide with acetone and IPA and dehydrate at 100° C.
2. Laminate a film of polyvinyl alcohol (PVA) onto the glass slide with scotch tape.
3. Prepare 2.5 g of 1:1 Ecoflex gel mixture and spin coat it at 2000 RPM for one min.
4. Let the Ecoflex gel cure at room temperature for two hours.
5. Take the skin-like electrode sample and place it into a bath of acetone.
6. Heat the acetone bath at 60° C. for one hour.
7. Transfer the sample with a PVA-based water-soluble tape.
8. Place the transferred sample and the tape onto the Ecoflex gel substrate.
9. Hydrate the tape to dissolve it.
10. Attach anisotropic conductive film (ACF) wires to the pad side of the skin-like electrode.
11. Attach a 1×1×1 mm³ magnet to the wire with silver paint.
[0279] Fabrication of Soft, Skin-Like, Stretchable Electrodes
[0280] An AJP method was used to design and manufacture the skin-like electrodes (details of this method are recited above).
[0281] In Vivo Experiment With Human Subjects
[0282] The eye vergence study involved 14 volunteers, ages 18 to 40, and was conducted following the approved IRB protocol (#H17212) at the Georgia Institute of Technology. Prior to the in vivo study, all subjects agreed to the study procedures and provided signed consent forms.
[0283] Vergence Physical Apparatus
[0284] An aluminum frame-based system was built to acclimate a human subject to natural eye vergence motions in the physical domain.
[0285] Thus, after device and sensor fabrication, the system is ready to be used on a test subject with either the physical apparatus or the virtual reality apparatus.
[0286] a) Physical Apparatus
[0287] 1. Place the electrodes in the desired position: conventional, OV1, or OV2.
[0288] 2. Place the device on the shirt if using a conventional radio, or place the flexible device on the back of the neck.
[0289] 3. Set up the physical apparatus in a room with 500×500 cm of space.
[0290] 4. Assist the participant by placing their head on the ocular headstand for stability during testing.
[0291] 5. Instruct the participant to follow the commands from the MATLAB program.
[0292] 6. After 70 seconds, record the data by selecting each trial and position on the apparatus individually.
[0293] 7. After five trials are recorded, press the cross-validation button to assess the dataset.
[0294] b) Virtual Apparatus
[0295] 1. Place the electrodes in the desired position: conventional, OV1, or OV2.
[0296] 2. Place the device on the shirt if using a conventional radio, or place the flexible device on the back of the neck.
[0297] 3. Launch the correct application on the Samsung S6 and then place the Samsung S6 in the Gear VR.
[0298] 4. Assist the participant by placing the Samsung S6 and Samsung Gear VR on the head.
[0299] 5. After tightening the straps, instruct the user to initiate the program at the same time as the MATLAB program.
[0300] 6. After 70 seconds, record the data by selecting each trial and position on the apparatus individually.
[0301] 7. After five trials are recorded, press the cross-validation button to assess the dataset.
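The 70-second recordings described above must be segmented into analysis windows before feature extraction and classification. The sketch below illustrates that step in Python; the sampling rate, window length, and 50% overlap are assumptions for illustration, not values specified in the protocol.

```python
import numpy as np

FS = 250            # assumed sampling rate (Hz); not specified in the protocol
TRIAL_SEC = 70      # recording length per trial, per the protocol above
WIN_SEC = 1.0       # assumed analysis-window length
STEP_SEC = 0.5      # assumed 50% overlap between consecutive windows

def segment_trial(signal, fs=FS, win_sec=WIN_SEC, step_sec=STEP_SEC):
    """Split one recorded trial into overlapping analysis windows."""
    win, step = int(win_sec * fs), int(step_sec * fs)
    return np.stack([signal[i:i + win]
                     for i in range(0, len(signal) - win + 1, step)])

trial = np.random.randn(TRIAL_SEC * FS)   # stand-in for one 70 s recording
windows = segment_trial(trial)
print(windows.shape)   # -> (139, 250)
```

Each row of `windows` would then be reduced to a feature vector and labeled with the trial's vergence position before cross-validation.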
[0302] VR Vergence Training Program
[0303] A portable VR headset running on a smartphone (Samsung Gear VR) was used for all training and therapy programs (above). The Unity engine simplified development of the VR applications by accurately positioning items that mimic human binocular vision. We simulated our eye vergence physical apparatus on the VR display and constrained head motions by disabling the feature for idealistic geometric positioning. The training procedure was cued with audio feedback from the MATLAB application.
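Positioning targets to mimic binocular vision amounts to placing each target at a distance whose symmetric vergence angle matches the intended eye motion. A minimal sketch of that geometry follows; the 63 mm interpupillary distance and the specific target distances are illustrative assumptions, not values from the disclosure.

```python
import math

IPD_M = 0.063  # assumed interpupillary distance (63 mm), for illustration only

def vergence_angle_deg(target_distance_m, ipd_m=IPD_M):
    """Symmetric vergence angle (degrees) subtended by a target straight ahead."""
    return math.degrees(2 * math.atan(ipd_m / (2 * target_distance_m)))

# Example near / intermediate / distance stand-ins: closer targets
# require a larger convergence angle.
for d in (0.3, 0.6, 1.0):
    print(f"{d:.1f} m -> {vergence_angle_deg(d):.2f} deg")
```

In a Unity scene the same relation determines where to place the left- and right-eye images of each target so the rendered disparity drives the desired vergence motion.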
[0304] VR Vergence Therapy Program
[0305] The eye therapy programs were chosen based on eye vergence and accommodation therapy guidelines from the literature. Two home-based visual therapy techniques were reproduced: Phase One, Brock String, and Phase Two, Eccentric Circles. The Brock String involved three dots at variable distances to simulate near, intermediate, and distance positions. Each individual dot can be moved within the near (20 cm to 40 cm), intermediate (50 cm to 70 cm), and distance (80 cm to 100 cm) ranges. Eccentric Circles allowed the user to move the cards laterally outward and inward to make cross-eye motions more difficult. This motion was controlled by the touchpad and buttons on the Samsung Gear VR.
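The Brock String dot ranges above translate directly into a small configuration that a therapy program could enforce when the user moves a dot. The sketch below is illustrative only; the function and dictionary names are assumptions, not part of the disclosed application.

```python
# Distance ranges (cm) for the three Brock String dot positions,
# taken directly from the therapy description above.
BROCK_STRING_RANGES_CM = {
    "near": (20, 40),
    "intermediate": (50, 70),
    "distance": (80, 100),
}

def clamp_dot_position(position_cm, depth):
    """Keep a user-moved dot inside the allowed range for its depth."""
    lo, hi = BROCK_STRING_RANGES_CM[depth]
    return max(lo, min(hi, position_cm))
```

For example, a near dot dragged to 10 cm would snap back to 20 cm, the lower bound of its range.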
[0306] Classification Feature Selection
[0307] The first feature (Equation 15) is the cumulative trapezoidal sum, in which the filtered signal f(t) is integrated over each unit step, i to i+1, using the trapezoidal method for quick computation. The next feature (Equation 16) is the variance of the filtered signal. The root mean square (Equation 17) is used in conjunction with the peak-to-root-mean-square ratio (Equation 18). The final feature (Equation 19) is the ratio of the maximum over the minimum of the filtered window.
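The five window features can be sketched in a few lines of NumPy; this is a minimal illustration assuming unit sample spacing and a window that does not cross zero (the max/min ratio is undefined when the minimum is zero), and the function name is an assumption rather than the patent's code.

```python
import numpy as np

def extract_features(f):
    """Compute the five window features described above (Equations 15-19)."""
    f = np.asarray(f, dtype=float)
    area = np.sum((f[:-1] + f[1:]) / 2.0)   # cumulative trapezoidal sum, unit steps
    var = np.var(f)                          # variance of the filtered signal
    rms = np.sqrt(np.mean(f ** 2))           # root mean square
    peak2rms = np.max(np.abs(f)) / rms       # peak-to-RMS ratio
    max_min = np.max(f) / np.min(f)          # max/min ratio (assumes min != 0)
    return np.array([area, var, rms, peak2rms, max_min])

print(extract_features([1.0, 2.0, 3.0, 4.0]))
```

For the window [1, 2, 3, 4] this yields an area of 7.5, a variance of 1.25, an RMS of √7.5 ≈ 2.739, a peak-to-RMS ratio of about 1.461, and a max/min ratio of 4.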
[0308] The choice of the ensemble classifier is supported by MATLAB's Classification Learner application, which assesses numerous classifiers by applying k-fold cross-validation using the aforementioned features and sixty additional features. On the datasets from our in vivo test subjects, two classifiers, the quadratic support vector machine and the ensemble subspace discriminant, were consistently more accurate than the others, and the latter was consistently higher in accuracy across test subjects in cross-validation assessments. This ensemble classifier uses a random subspace with discriminant classification rather than nearest-neighbor classification. Unlike the other ensemble classifiers, random subspace does not use decision trees; the discriminant classification combines the strengths of the feature set and discriminant classifiers while removing the weak decision trees, yielding its high accuracy. A custom feature-selection script incorporating the ideas of wrapper and embedded methods was run with the ensemble classifier.
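A subspace-discriminant ensemble of this kind can be approximated outside MATLAB with scikit-learn's bagging machinery: each discriminant learner is trained on all samples but only a random subset of the features. This is a sketch of the general technique on synthetic data, not a reproduction of the patent's classifier or dataset; the parameter values are illustrative assumptions.

```python
import numpy as np
from sklearn.ensemble import BaggingClassifier
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score
from sklearn.datasets import make_classification

# Synthetic stand-in for the in vivo feature set (3 vergence classes).
X, y = make_classification(n_samples=300, n_features=20, n_informative=8,
                           n_classes=3, random_state=0)

# Random-subspace ensemble of discriminant classifiers: bootstrap=False means
# every learner sees all samples, while max_features=0.5 gives each learner
# a random half of the feature space.
clf = BaggingClassifier(LinearDiscriminantAnalysis(),
                        n_estimators=30, max_features=0.5,
                        bootstrap=False, random_state=0)

scores = cross_val_score(clf, X, y, cv=5)   # 5-fold cross-validation
print(scores.mean())
```

Passing the base estimator positionally keeps the sketch compatible across scikit-learn versions that renamed the `base_estimator` parameter to `estimator`.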
[0309] Numerous characteristics and advantages have been set forth in the foregoing description, together with details of structure and function. While the invention has been disclosed in several forms, it will be apparent to those skilled in the art that many modifications, additions, and deletions, especially in matters of shape, size, and arrangement of parts, can be made therein without departing from the spirit and scope of the invention and its equivalents as set forth in the following claims. Therefore, other modifications or embodiments as may be suggested by the teachings herein are particularly reserved as they fall within the breadth and scope of the claims here appended.