METHODS AND PRINTED INTERFACE FOR ROBOTIC PHYSICOCHEMICAL SENSING
20230158686 · 2023-05-25
Assignee
Inventors
Cpc classification
B25J13/087
PERFORMING OPERATIONS; TRANSPORTING
B25J13/006
PERFORMING OPERATIONS; TRANSPORTING
B25J9/163
PERFORMING OPERATIONS; TRANSPORTING
International classification
B25J13/08
PERFORMING OPERATIONS; TRANSPORTING
Abstract
Systems and methods for an electronic skin based robotic system including a robotic interface and a human subject are provided. An e-skin may be applied to the robotic interface. The e-skin applied to the robotic interface may include a plurality of physicochemical sensors. An e-skin may also be applied to the human subject. The e-skin may include electrodes for sensing muscular contractions associated with hand and arm movements as well as electrodes for stimulation. Machine learning techniques may enable decoding of signals to control the robotic hand and arm. The robotic hand and arm may be controlled to approach unknown compounds that may be hazardous. The sensors making up the physicochemical sensors on the e-skin on the robotic hand and arm may include tactile, pressure, temperature, and chemical sensors, as well as other useful sensors. These sensors may enable detection of explosives, organophosphates, pathogenic proteins, and other hazardous compounds.
Claims
1. A multimodal robotic sensing system, comprising: a robotic interface; a first printed flexible electronic skin applied to the robotic interface, wherein the first printed flexible electronic skin comprises a first substrate layer, a first array of electrodes and sensors disposed on the first substrate layer, and a first encapsulation layer covering the first array of electrodes and sensors; a second printed flexible electronic skin applied to a human subject, wherein the second printed flexible electronic skin comprises a second substrate layer, a second array of electrodes and sensors disposed on the second substrate layer, and a second encapsulation layer covering the second array of electrodes and sensors; and a wireless communication module that transmits information between the first printed flexible electronic skin and the second printed flexible electronic skin.
2. The multimodal robotic sensing system of claim 1, wherein the robotic interface comprises: a robotic hand; and a robotic arm connected to the robotic hand; wherein the first printed flexible electronic skin is applied to the robotic hand.
3. The multimodal robotic sensing system of claim 1, wherein the second printed flexible electronic skin is applied to a human forearm of the human subject and the human forearm controls and receives feedback from a corresponding robotic arm.
4. The multimodal robotic sensing system of claim 1, wherein the first array of electrodes and sensors of the first printed flexible electronic skin further comprises: printed nanoengineered multimodal physicochemical sensors; and engraved kirigami structures.
5. The multimodal robotic sensing system of claim 1, wherein the second array of electrodes and sensors of the second printed flexible electronic skin further comprises: sEMG electrode arrays printed onto a PDMS substrate; and electrical stimulation electrodes printed onto the PDMS substrate.
6. The multimodal robotic sensing system of claim 1, wherein the physicochemical sensors further comprise a tactile sensing module.
7. The multimodal robotic sensing system of claim 1, wherein the physicochemical sensors further comprise a temperature sensing module.
8. The multimodal robotic sensing system of claim 1, wherein the physicochemical sensors further comprise an autonomous dry-phase analyte detection module.
9. The multimodal robotic sensing system of claim 1, further comprising a machine learning module, wherein the robotic interface leverages sensor data and machine learning techniques to improve movement of the robotic interface.
10. A remote robotic control method, comprising: applying a first e-skin to a human subject, the first e-skin comprising: sEMG electrode arrays printed onto a PDMS substrate; and electrical stimulation electrodes printed onto the PDMS substrate; collecting sEMG signals from the first e-skin on the human subject through the sEMG electrode arrays; decoding the collected sEMG signals, wherein the collected sEMG signals are representative of movements made by the human subject; and controlling movements of a robotic arm based on the decoded sEMG signals, wherein the movements of the robotic arm are managed by movements of the human subject.
11. The remote robotic control method of claim 10, further comprising a threat feedback method, comprising: moving the robotic arm into contact with an object, wherein the robotic arm is equipped with a second e-skin having physicochemical sensors; upon contact with the object, detecting properties of the object with the physicochemical sensors; determining whether the object poses a threat based on collected sensor data representative of the properties of the object; and stimulating the human subject using the electrical stimulation electrodes if the threat is detected.
12. The remote robotic control method of claim 11, wherein the physicochemical sensors comprise a Pt-nanoparticle decorated graphene electrode configured to detect TNT.
13. The remote robotic control method of claim 11, wherein the physicochemical sensors comprise a MOF-808 modified gold nanoparticles electrode configured to detect OP.
14. The remote robotic control method of claim 11, wherein the physicochemical sensors comprise a carbon nanotube (CNT) electrode configured to detect pathogenic proteins.
15. The remote robotic control method of claim 10, wherein decoding the collected sEMG signals further comprises decoding the collected sEMG signals with a machine learning module programed to leverage machine learning techniques to improve movements of the robotics arm.
16. An electronic skin fabrication method, comprising: printing a silver (AgNWs) layer for interconnects and reference electrodes using a modified inkjet printer; printing a carbon (Pt-graphene) layer counter electrode and temperature sensor layer onto the silver layer; printing a polyimide (Au) encapsulation layer onto the carbon layer; and printing a target-selective nanoengineered (MOF-808) sensing layer onto the polyimide layer, wherein the target-selective nanoengineered (MOF-808) sensing layer comprises a tactile sensor and biochemical sensing electrodes.
17. The electronic skin fabrication method of claim 16, further comprising: cutting a polyimide substrate with kirigami structures by automatic precision cutting; and treating the polyimide surface with O2 plasma.
18. The electronic skin fabrication method of claim 16, further comprising: printing AgNWs layers onto a nanotextured substrate to form a tactile sensor; and cutting the substrate with AgNWs printed layers into a semicircle shape and applying the AgNWs printed layers to the electronic skin.
19. The electronic skin fabrication method of claim 16, further comprising printing a CNT film onto an IPCE to form a biohazard protein sensor.
20. The electronic skin fabrication method of claim 16, further comprising coating chemical sensors with flexible gelatin hydrogel.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0016] The technology disclosed herein, in accordance with one or more various embodiments, is described in detail with reference to the following figures. The drawings are provided for purposes of illustration only and merely depict typical or example embodiments of the disclosed technology. These drawings are provided to facilitate the reader's understanding of the disclosed technology and shall not be considered limiting of the breadth, scope, or applicability thereof. It should be noted that for clarity and ease of illustration these drawings are not necessarily made to scale.
[0030] The figures are not intended to be exhaustive or to limit the invention to the precise form disclosed. It should be understood that the invention can be practiced with modification and alteration, and that the disclosed technology be limited only by the claims and the equivalents thereof.
DETAILED DESCRIPTION
[0031] Systems and methods described herein relate to controllable human-machine interactive robotic interfaces. Robotic interfaces may be equipped with both physical and chemical sensing capabilities. Robotic interfaces may be configured to perform point-of-use analysis for relevant substances and environments. Robotic interfaces may be highly flexible and conformable, mimicking human skin, to form a flexible electronic skin (“e-skin”). An e-skin may enable seamless and efficient interaction between electronics and human and/or robot bodies to improve physical and chemical sensing applications. Robotic interfaces may improve sensing in a wide variety of environments, including consumer electronics, digital medicine, smart implants, environmental surveillance, and others.
Robotic E-Skin Sensing System Embodiments
[0032] In various embodiments of the present disclosure, a robotic sensing system may be configured to measure an environment using tactile, chemical, and/or temperature sensors. A robotic sensing system may include a robot. A robotic sensing system may also include a human operator. The human operator may control the robot. The human operator may also receive feedback from the robot. A robotic sensing system may make use of electronic skins (“e-skins”) to measure an environment, operate a robot, and receive feedback from the robot based on environmental measurements. E-skins may be applied to the human subject as well as to the robot. The robot may take the form of a human hand and/or arm. The system including a robotic hand/arm equipped with a first e-skin and a human operator equipped with a second e-skin may communicate such that the human operator may control and receive feedback from the robot (“first” and “second” are nominal descriptors and one skilled in the art would understand that either e-skin may be designated as “first” or “second”). A robot equipped with an e-skin may also take other useful forms. For example, a robotic boat may be equipped with an e-skin.
[0033] The e-skin may be applied to the robot. The e-skin may include printed nanoengineered multimodal physicochemical sensors. The robot may be a robotic hand and/or arm. The sensors may be applied to the palm and/or fingers of a robotic hand. The sensors may be fabricated using a drop-on-demand inkjet technology. The e-skin may be engraved with kirigami structures. The kirigami structures may have a high stretchability without a change in conductivity. These properties may result in a high degree of freedom of movement of the robotic hand. The e-skin may leverage its physicochemical sensors to perform several measurements. For instance, the e-skin may perform proximity sensing using a laser proximity sensor. The e-skin may also sense properties of an object through a tactile sensor which may, for example, detect pressure to determine the weight of an object. The e-skin may also perform temperature measurements. The e-skin may be able to leverage its sensors, including its tactile sensors, to perform a perceptual mapping. For instance, the e-skin may be able to determine the shape of an object. The robotic hand may be controlled to grasp an object, via the human subject wearing a second e-skin and controlling movement, and the e-skin applied to the robotic hand, equipped with tactile sensors, may be able to determine the shape of the object.
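The pressure-to-weight determination mentioned above can be sketched as a simple calibration step. This is an illustrative sketch, not the disclosed implementation; the function name, the assumed contact area, and the assumption that the grasp fully supports the object are all hypothetical.

```python
# Hypothetical sketch: estimating object weight from an e-skin tactile
# (pressure) reading during a grasp. Assumes the grasp fully supports the
# object against gravity over a known contact area.

G = 9.81  # standard gravity, m/s^2

def weight_from_pressure(pressure_pa, contact_area_m2):
    """Estimate object mass (kg) from normal pressure over the contact area."""
    force_n = pressure_pa * contact_area_m2  # F = P * A
    return force_n / G                       # m = F / g

# Example: 2 kPa measured over a 5 cm^2 fingertip pad
mass_kg = weight_from_pressure(2000.0, 5e-4)  # ≈ 0.102 kg
```

In practice, a calibration curve against known weights would replace the idealized full-support assumption.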
[0034] The e-skin may, in addition to the above-described sensor types, also be equipped with chemical sensors. The chemical sensors may be able to perform both solution-phase and dry-phase sampling of compounds. A chemical sensor may be coated with a hydrogel to aid in the sampling and detection process. The e-skin may be equipped with many custom chemical sensors which may be configured to detect numerous substances, including explosives, such as 2,4,6-trinitrotoluene (TNT); organophosphates (OPs), such as pesticides and chemical warfare agents, for example, sarin; and biohazard materials, such as pathogenic proteins. Pathogenic proteins may include those of the SARS-CoV-2 virus.
[0035] Referring now to
[0036] Referring now to
[0037] Referring now to
[0038] An e-skin may be applied to a human subject. A human-applied e-skin may contain electrodes that may be configured to collect physiological data from the human subject. Physiological data collected from a human subject may be decoded using machine learning techniques to assess movement patterns or control movements, generally. Decoded data may then be used to control or manage a corresponding robot. A human-applied e-skin may also be equipped with electrodes configured to provide stimulation to a human subject. The stimulation may provide feedback to a human subject and may alert a human subject to potential threats. The feedback, signals, and/or other information/data may be transmitted between e-skins via a wireless communication module such as Wi-Fi, Bluetooth, or other transceivers included on the e-skin. For example, the wireless communication module may transmit physiological data between a first printed flexible electronic skin on a human subject and a second printed flexible electronic skin on a robotic interface.
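The machine-learning decoding step described above can be sketched as follows. The disclosure says only that machine learning techniques decode the signals; the windowed RMS features and the nearest-centroid classifier here, and all names, are illustrative assumptions.

```python
# Minimal sketch of decoding windowed sEMG data into gesture classes with a
# nearest-centroid classifier over per-channel root-mean-square (RMS)
# features. Feature choice and classifier are illustrative, not disclosed.
import math

def rms_features(window):
    """One RMS value per electrode channel; window is [channel][sample]."""
    return [math.sqrt(sum(s * s for s in ch) / len(ch)) for ch in window]

class CentroidDecoder:
    """Assigns each new window to the class with the nearest feature centroid."""

    def fit(self, windows, labels):
        sums, counts = {}, {}
        for window, label in zip(windows, labels):
            feats = rms_features(window)
            acc = sums.setdefault(label, [0.0] * len(feats))
            for i, v in enumerate(feats):
                acc[i] += v
            counts[label] = counts.get(label, 0) + 1
        self.centroids = {
            label: [v / counts[label] for v in acc] for label, acc in sums.items()
        }
        return self

    def predict(self, window):
        feats = rms_features(window)
        return min(
            self.centroids,
            key=lambda label: sum(
                (a - b) ** 2 for a, b in zip(feats, self.centroids[label])
            ),
        )

# Two 2-channel training windows: low-amplitude "rest", high-amplitude "grip".
decoder = CentroidDecoder().fit(
    [[[0.1] * 8, [0.1] * 8], [[1.0] * 8, [1.0] * 8]], ["rest", "grip"]
)
```

A decoded label such as "grip" would then be mapped to the corresponding robotic hand/arm command.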
[0039] Referring now to
[0040] The sEMG electrode arrays 202 collect signals generated from muscular contractions of the human subject. The collected signals may be analyzed using machine learning techniques and artificial intelligence (“AI”) techniques to categorize, understand, and predict human motions. These human motions may then be used to control a robotic hand. A robotic hand equipped with an e-skin having physicochemical sensors, as discussed with reference to
[0041] Referring now to
[0042] Referring now to
[0043] The robot may be equipped with an e-skin having physicochemical sensors. A fifth operation 510 may include moving a robotic arm into contact with an object and then grasping the object, as shown in
[0044] Such a method may be useful in a variety of contexts. For example, the physicochemical sensors may include a Pt-nanoparticle decorated graphene electrode. This electrode may be effective for detecting explosive compounds, such as TNT. The robotic hand may be able to contact and detect the compound while the human operator controls the robot from a safe distance. In another example, the physicochemical sensors may include a MOF-808 modified gold nanoparticle electrode. This electrode may be effective for detecting OPs such as pesticides or chemical warfare agents. OPs can be incredibly dangerous to human beings. Contact may cause disease or even death. Using the robot to identify and detect an OP avoids putting a human operator at risk. In another example, the physicochemical sensors may include a carbon nanotube (CNT) electrode. This electrode may be effective for detecting pathogenic proteins and other biohazards. For example, a CNT electrode may be effective for detecting the SARS-CoV-2 virus. The robotic hand may be able to detect the virus and provide immediate feedback to the human subject while the human subject remains at a safe distance to avoid contagion.
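The threat-feedback steps above (detect properties on contact, decide whether a threat exists, stimulate the operator if so) can be sketched as a small decision loop. The threshold values, analyte channel names, and stimulation callback are hypothetical assumptions, not values from the disclosure.

```python
# Hedged sketch of the threat-feedback method: compare each chemical channel
# reading against a per-analyte threshold and trigger the operator-side
# stimulation electrodes for every analyte that exceeds its threshold.
# Thresholds and channel names below are arbitrary illustrative values.

THRESHOLDS = {"TNT": 0.5, "OP": 0.2, "pathogenic_protein": 0.1}

def assess_threat(readings):
    """Return the analytes whose reading exceeds its detection threshold."""
    return [a for a, v in readings.items() if v > THRESHOLDS.get(a, float("inf"))]

def feedback_step(readings, stimulate):
    """Invoke the stimulation callback once per detected threat; return threats."""
    threats = assess_threat(readings)
    for analyte in threats:
        stimulate(analyte)  # e.g., pulse the stimulation electrodes
    return threats
```

In a full system, `stimulate` would drive the electrical stimulation electrodes on the operator's e-skin over the wireless link.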
Inkjet Printer Fabrication Embodiment
[0045] Many of the embodiments discussed above include flexible, printable e-skins. E-skins, including electrodes and/or sensors, may be printed using a modified inkjet printer. Printing e-skins with a modified inkjet printer is highly advantageous because it allows e-skins to be produced easily in large quantities at low cost. Referring now to
[0046] Next, a third operation 606 may include printing a substrate using a serial printing method with a modified inkjet printer. The substrate may support sensors and may be integrated with the kirigami structures to form a soft, flexible e-skin. First, a silver layer of the substrate may be printed 608. The silver layer may serve an interconnecting purpose. For instance, the silver layer may include pins to connect e-skins to other e-skins or to other interfaces, such as circuitry or power sources. The silver layer may also include reference electrodes. Next, a carbon layer may be printed on top of the silver layer 610. The carbon layer may include counter electrodes. The carbon layer may also include a temperature sensor layer. A polyimide encapsulation layer may be printed on top of the carbon layer 612. In some embodiments, a gold nanoparticles (AuNPs) layer may be printed on top of the carbon layer 612. Finally, a target-selective nanoengineered sensing layer (MOF-808) may be printed on top of the polyimide layer 614. This layer may allow for detection of hazardous compounds, such as OPs.
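The serial layer sequence above can be captured as an ordered stack that a print controller walks bottom-up. The layer names follow the text; the `print_e_skin` function and its callback are hypothetical scaffolding, not a real printer API.

```python
# Illustrative sketch of the serial inkjet layer sequence: silver first,
# then carbon, then polyimide encapsulation, then the MOF-808 sensing layer.
# The controller interface here is an assumption for illustration only.

LAYER_STACK = [
    ("silver (AgNWs)", "interconnects and reference electrodes"),
    ("carbon", "counter electrodes and temperature sensor layer"),
    ("polyimide", "encapsulation"),
    ("MOF-808", "target-selective nanoengineered sensing layer"),
]

def print_e_skin(print_layer):
    """Print the layers bottom-up in the order the method requires."""
    for material, role in LAYER_STACK:
        print_layer(material, role)  # hand one layer to the (hypothetical) printer
    return [material for material, _ in LAYER_STACK]
```

Encoding the order in data rather than in control flow makes it easy to insert the optional AuNPs layer between carbon and MOF-808 in embodiments that use it.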
[0047] Additionally, a tactile and/or pressure sensor may also be printed. Layers of silver, for example, AgNWs, may be printed on top of a nanotextured PDMS substrate 616. In one embodiment, 30 AgNWs layers may be printed to form the tactile sensor. Next, the printed AgNWs-PDMS may be cut into a semicircle shape and set onto the e-skin 622. A protein sensor to detect biohazards may also be included. The protein sensor may be printed on CNT film 618. The CNT film may be printed onto inkjet printed carbon electrodes (“IPCE”), as discussed above. Chemical sensors, such as the protein sensor and the OP sensor, may be coated with a flexible gelatin hydrogel 620 to aid in the collection of samples of compounds.
[0048] The e-skin may be completed by connecting pins of e-skins with silver connection pins 624. For instance, as described above, a robotic hand may have a palm e-skin and finger e-skins. These e-skins may be interconnected with silver pins. The connection may be secured by applying conductive tape. Finally, the completed e-skins may be applied to a robotic hand to detect compounds 626.
Source Tracking Embodiment
[0049] In addition to human-robot interfaces, e-skins may have other applications. For instance, an e-skin may be applied to a robotic boat to detect the source of a leak. E-skins may also be applied to and/or used in conjunction with other robots.
[0050] Referring now to
[0051] In an embodiment, robotic boats equipped with e-skins may leverage machine learning techniques to detect the source of a leak. The boats may perform measurements and communicate with each other to advance toward a point having a high or higher concentration of compounds and/or a fluid flow consistent with being the point of origin of a leak of compounds. The robots may perform this process autonomously and may transmit a signal when they have identified the origin of a leak.
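The advance-toward-higher-concentration behavior above amounts to hill-climbing on the measured concentration field. The grid representation and greedy step rule below are illustrative assumptions; the disclosure does not specify the search algorithm.

```python
# Sketch of autonomous source seeking: at each step the boat moves to the
# neighboring grid cell with the highest measured concentration and stops at
# a local maximum, which it treats as the candidate leak origin.

def seek_source(field, start):
    """Greedy hill-climb on a 2D concentration grid; returns the cell found."""
    x, y = start
    while True:
        best = (x, y)
        for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nx, ny = x + dx, y + dy
            if (0 <= nx < len(field) and 0 <= ny < len(field[0])
                    and field[nx][ny] > field[best[0]][best[1]]):
                best = (nx, ny)
        if best == (x, y):
            return best  # local maximum: candidate leak origin
        x, y = best

# Example concentration field rising toward the lower-right corner
field = [[0, 1, 2],
         [1, 2, 3],
         [2, 3, 9]]
origin = seek_source(field, (0, 0))  # → (2, 2)
```

Multiple boats sharing measurements, as described above, would help escape local maxima that a single greedy climber can get stuck in.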
[0052] While various embodiments of the present invention have been described above, it should be understood that they have been presented by way of example only, and not of limitation. Likewise, the various diagrams may depict an example architectural or other configuration for the invention, which is done to aid in understanding the features and functionality that can be included in the invention. The invention is not restricted to the illustrated example architectures or configurations, but the desired features can be implemented using a variety of alternative architectures and configurations. Indeed, it will be apparent to one of skill in the art how alternative functional, logical or physical partitioning and configurations can be implemented to implement the desired features of the present invention. Also, a multitude of different constituent module names other than those depicted herein can be applied to the various partitions. Additionally, with regard to flow diagrams, operational descriptions and method claims, the order in which the steps are presented herein shall not mandate that various embodiments be implemented to perform the recited functionality in the same order unless the context dictates otherwise.
[0053] Although the invention is described above in terms of various exemplary embodiments and implementations, it should be understood that the various features, aspects and functionality described in one or more of the individual embodiments are not limited in their applicability to the particular embodiment with which they are described, but instead can be applied, alone or in various combinations, to one or more of the other embodiments of the invention, whether or not such embodiments are described and whether or not such features are presented as being a part of a described embodiment. Thus, the breadth and scope of the present invention should not be limited by any of the above-described exemplary embodiments.
[0054] Terms and phrases used in this document, and variations thereof, unless otherwise expressly stated, should be construed as open ended as opposed to limiting. As examples of the foregoing: the term “including” should be read as meaning “including, without limitation” or the like; the term “example” is used to provide exemplary instances of the item in discussion, not an exhaustive or limiting list thereof; the terms “a” or “an” should be read as meaning “at least one,” “one or more” or the like; and adjectives such as “conventional,” “traditional,” “normal,” “standard,” “known” and terms of similar meaning should not be construed as limiting the item described to a given time period or to an item available as of a given time, but instead should be read to encompass conventional, traditional, normal, or standard technologies that may be available or known now or at any time in the future. Likewise, where this document refers to technologies that would be apparent or known to one of ordinary skill in the art, such technologies encompass those apparent or known to the skilled artisan now or at any time in the future.
[0055] The presence of broadening words and phrases such as “one or more,” “at least,” “but not limited to” or other like phrases in some instances shall not be read to mean that the narrower case is intended or required in instances where such broadening phrases may be absent. The use of the term “module” does not imply that the components or functionality described or claimed as part of the module are all configured in a common package. Indeed, any or all of the various components of a module, whether control logic or other components, can be combined in a single package or separately maintained and can further be distributed in multiple groupings or packages or across multiple locations.
[0056] Additionally, the various embodiments set forth herein are described in terms of exemplary block diagrams, flow charts and other illustrations. As will become apparent to one of ordinary skill in the art after reading this document, the illustrated embodiments and their various alternatives can be implemented without confinement to the illustrated examples. For example, block diagrams and their accompanying description should not be construed as mandating a particular architecture or configuration.