SYSTEM AND METHOD TO INTEGRATE EMOTION DATA INTO SOCIAL NETWORK PLATFORM AND SHARE THE EMOTION DATA OVER SOCIAL NETWORK PLATFORM
20220036481 · 2022-02-03
Inventors
CPC classification
G16H20/70
PHYSICS
A61B5/165
HUMAN NECESSITIES
G16H50/20
PHYSICS
A61B2503/12
HUMAN NECESSITIES
G16H50/30
PHYSICS
A61B5/00
HUMAN NECESSITIES
G16H15/00
PHYSICS
A61B5/16
HUMAN NECESSITIES
G06F3/14
PHYSICS
H04L51/046
ELECTRICITY
H04W4/70
ELECTRICITY
International classification
G06Q50/00
PHYSICS
A61B5/00
HUMAN NECESSITIES
A61B5/16
HUMAN NECESSITIES
G06F3/14
PHYSICS
G16H15/00
PHYSICS
G16H50/30
PHYSICS
Abstract
Disclosed is a system and method for integrating emotion data into a social network platform and sharing the emotion data over the social network platform connected through a communication network. The method includes the step of collecting biorhythm data of the user through a wearable user device. The method includes the step of receiving the biorhythm data of the users through a computing device communicatively connected with the wearable user device over the communication network. The method includes the step of integrating the emotion data through an integration module. The method includes the step of determining an emotional state of the user through an emotional state determination module. The method includes the step of analyzing and displaying emotional data of the users in real-time through an emotional data displaying module.
Claims
1- A system to integrate emotion data into a social network platform and share the emotion data over the social network platform connected through a communication network, the system comprising: a wearable user device to collect biorhythm data of the user; and a computing device communicatively connected with the wearable user device to receive the biorhythm data of the users over the communication network, wherein the computing device comprises: a processor; and a memory communicatively coupled to the processor, wherein the memory stores instructions executed by the processor, and wherein the memory comprises: an integration module to integrate the emotion data, comprising: a physiological data collection engine to collect physiological data of at least one physiological property of the user; a biosignal generating engine to process the physiological data into at least one biosignal; a score calculating engine to monitor and measure the biosignal for determining at least one score pertaining to at least one of an emotion of the user and stress of the user; and a social integrating and information overlay engine to integrate the scores with at least one of a social media post associated with the social network platform, a textual conversation, and a multimedia conversation, and to overlay information pertaining to the emotion of the user and stress of the user; an emotional state determination module to determine an emotional state of the user, comprising: an analysis module to analyze the emotional state of the user on receiving the biorhythm data from the wearable user device; an emotive module to associate the analyzed emotional state of the user with at least one of one or more posts being shared by the user, one or more pieces of content being shared to the user, one or more reactions to the posts, and one or more responses to the posts over the social network platform; and a display module to display a representation of the emotional state of the user in the social network platform; and an emotional data displaying module to analyze and display emotional data of the users in real-time, wherein the emotional data displaying module comprises: an algorithmic module to analyze the biorhythm data and compute an emotional score of the user to generate one or more insights, wherein the emotional score is indicative of the emotional state of the user during the interactions; and a visualization module to visually represent a plurality of emotional cycles for a specific time-duration for the user, wherein the visualization module displays the insights and emotional scores of the users on the computing device associated with the users.
2- The system according to claim 1, wherein the emotive module facilitates the user to initiate a command to associate the emotional state of the user with the posts being shared by the user and the content being shared to the user.
3- The system according to claim 1, wherein the display module displays the representation of the emotional state with respect to the post being shared by the user and the content shared to the user, on receiving a request command from the user.
4- The system according to claim 1, wherein the visualization module displays emotional data in a plurality of manners on at least one of a two-dimensional (2D) graph and a three-dimensional (3D) graph by using at least one of a plurality of alpha-numeric characters, a plurality of geometric shapes, a plurality of holograms, and a plurality of symbols.
5- A method for integrating emotion data into a social network platform and sharing the emotion data over the social network platform connected through a communication network, the method comprising the steps of: collecting biorhythm data of the user through a wearable user device; receiving the biorhythm data of the users through a computing device communicatively connected with the wearable user device over the communication network; integrating the emotion data through an integration module, wherein the integration module performs a plurality of steps comprising: collecting physiological data of at least one physiological property of the user through a physiological data collection engine; processing the physiological data into at least one biosignal through a biosignal generating engine; monitoring and measuring the biosignal for determining at least one score pertaining to at least one of an emotion of the user and stress of the user through a score calculating engine; and integrating the scores with at least one of a social media post associated with the social network platform, a textual conversation, and a multimedia conversation, and overlaying information pertaining to the emotion of the user and stress of the user, through a social integrating and information overlay engine; determining an emotional state of the user through an emotional state determination module, wherein the emotional state determination module performs a plurality of steps comprising: analyzing the emotional state of the user on receiving the biorhythm data from the wearable user device through an analysis module; associating the analyzed emotional state of the user with at least one of one or more posts being shared by the user, one or more pieces of content being shared to the user, one or more reactions to the posts, and one or more responses to the posts over the social network platform through an emotive module; and displaying a representation of the emotional state of the user in the social network platform through a display module; and analyzing and displaying emotional data of the users in real-time through an emotional data displaying module, wherein the emotional data displaying module performs a plurality of steps comprising: analyzing the biorhythm data and computing an emotional score of the user to generate one or more insights through an algorithmic module, wherein the emotional score is indicative of the emotional state of the user during the interactions; and visually representing a plurality of emotional cycles for a specific time-duration for the user through a visualization module, wherein the visualization module displays the insights and emotional scores of the users on the computing device associated with the users.
6- The method according to claim 5, wherein the emotive module facilitates the user to initiate a command to associate the emotional state of the user with the posts being shared by the user and the content being shared to the user.
7- The method according to claim 5, wherein the display module displays the representation of the emotional state with respect to the post being shared by the user and the content shared to the user, on receiving a request command from the user.
8- The method according to claim 5, wherein the visualization module displays emotional data in a plurality of manners on at least one of a two-dimensional (2D) graph and a three-dimensional (3D) graph by using at least one of a plurality of alpha-numeric characters, a plurality of geometric shapes, a plurality of holograms, and a plurality of symbols.
Description
BRIEF DESCRIPTION OF DRAWINGS
[0036] In the figures, similar components and/or features may have the same reference label. Further, various components of the same type may be distinguished by following the reference label with a second label that distinguishes among the similar components. If only the first reference label is used in the specification, the description applies to any one of the similar components having the same first reference label irrespective of the second reference label.
DESCRIPTION OF EMBODIMENTS
[0044] The present disclosure is best understood with reference to the detailed figures and description set forth herein. Various embodiments have been discussed with reference to the figures. However, those skilled in the art will readily appreciate that the detailed descriptions provided herein with respect to the figures are merely for explanatory purposes, as the methods and systems may extend beyond the described embodiments. For instance, the teachings presented and the needs of a particular application may yield multiple alternative and suitable approaches to implement the functionality of any detail described herein. Therefore, any approach may extend beyond certain implementation choices in the following embodiments.
[0045] References to “one embodiment,” “at least one embodiment,” “an embodiment,” “one example,” “an example,” “for example,” and so on indicate that the embodiment(s) or example(s) may include a particular feature, structure, characteristic, property, element, or limitation but that not every embodiment or example necessarily includes that particular feature, structure, characteristic, property, element, or limitation. Further, repeated use of the phrase “in an embodiment” does not necessarily refer to the same embodiment.
[0046] Methods of the present invention may be implemented by performing or completing selected steps or tasks manually, automatically, or by a combination thereof. The term “method” refers to manners, means, techniques, and procedures for accomplishing a given task, including, but not limited to, those manners, means, techniques, and procedures either known to, or readily developed from known manners, means, techniques, and procedures by, practitioners of the art to which the invention belongs. The descriptions, examples, methods, and materials presented in the claims and the specification are not to be construed as limiting but rather as illustrative only. Those skilled in the art will envision many other possible variations within the scope of the technology described herein.
[0048] The computing device 104 is communicatively connected with the wearable user device 102 to receive the biorhythm data of the users over a communication network 106.
[0049] Communication network 106 may be a wired or a wireless network, and the examples may include but are not limited to the Internet, Wireless Local Area Network (WLAN), Wi-Fi, Long Term Evolution (LTE), Worldwide Interoperability for Microwave Access (WiMAX), General Packet Radio Service (GPRS), Bluetooth (BT) communication protocols, Transmission Control Protocol and Internet Protocol (TCP/IP), User Datagram Protocol (UDP), Hypertext Transfer Protocol (HTTP), File Transfer Protocol (FTP), ZigBee, EDGE, infrared (IR), Z-Wave, Thread, 5G, USB, serial, RS232, NFC, RFID, WAN, and/or IEEE 802.11, 802.16, 2G, 3G, 4G cellular communication protocols.
[0050] Examples of the computing device 104 include, but are not limited to, a laptop, a desktop, a smartphone, a smart device, a smartwatch, a phablet, and a tablet. The computing device 104 includes a processor 110, a memory 112 communicatively coupled to the processor 110, and a user interface 114. The computing device 104 is communicatively coupled with a database 116. The database 116 receives and stores the emotional data and referral data, which can be used for further analysis and prediction so that the present system can learn and improve the analysis by using the historical emotional data. Although the present subject matter is explained considering that the present system 100 is implemented on a cloud device, it may be understood that the present system 100 may also be implemented in a variety of computing systems, such as an Amazon Elastic Compute Cloud (Amazon EC2), a network server, and the like.
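A minimal sketch of how the database 116 might persist the emotional and referral data for later analysis is shown below, assuming a simple SQLite table; the schema and field names are illustrative assumptions, not taken from the disclosure.

```python
# A minimal sketch of persisting emotional data records in database 116,
# assuming SQLite; the table schema is an illustrative assumption.
import sqlite3
from datetime import datetime, timezone

def init_store(path="emotion_history.db"):
    conn = sqlite3.connect(path)
    conn.execute(
        """CREATE TABLE IF NOT EXISTS emotional_data (
               user_id TEXT NOT NULL,
               recorded_at TEXT NOT NULL,
               emotional_score REAL,   -- score computed from biorhythm data
               stress_score REAL,      -- optional stress component
               source TEXT             -- e.g. 'wearable' or 'referral'
           )"""
    )
    return conn

def save_record(conn, user_id, emotional_score, stress_score=None, source="wearable"):
    conn.execute(
        "INSERT INTO emotional_data VALUES (?, ?, ?, ?, ?)",
        (user_id, datetime.now(timezone.utc).isoformat(),
         emotional_score, stress_score, source),
    )
    conn.commit()
```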
[0051] Processor 110 may include at least one data processor for executing program components for executing user- or system-generated requests. A user may include a person, a person using a device such as those included in this invention, or such a device itself. Processor 110 may include specialized processing units such as integrated system (bus) controllers, memory management control units, floating point units, graphics processing units, digital signal processing units, etc.
[0052] Processor 110 may include a microprocessor, such as an AMD® ATHLON® microprocessor, DURON® microprocessor, or OPTERON® microprocessor, ARM's application, embedded, or secure processors, an IBM® POWERPC®, or Intel's CORE® processor, ITANIUM® processor, XEON® processor, CELERON® processor, or another line of processors, etc. Processor 110 may be implemented using mainframe, distributed processor, multi-core, parallel, grid, or other architectures. Some embodiments may utilize embedded technologies like application-specific integrated circuits (ASICs), digital signal processors (DSPs), Field Programmable Gate Arrays (FPGAs), etc.
[0053] Processor 110 may be disposed in communication with one or more input/output (I/O) devices via an I/O interface. The I/O interface may employ communication protocols/methods such as, without limitation, audio, analog, digital, RCA, stereo, IEEE-1394, serial bus, universal serial bus (USB), infrared, PS/2, BNC, coaxial, component, composite, digital visual interface (DVI), high-definition multimedia interface (HDMI), RF antennas, S-Video, VGA, IEEE 802.11a/b/g/n/x, Bluetooth, cellular (e.g., code-division multiple access (CDMA), high-speed packet access (HSPA+), global system for mobile communications (GSM), long-term evolution (LTE), WiMax, or the like), etc.
[0054] Memory 112 may be a non-volatile memory or a volatile memory. Examples of non-volatile memory may include, but are not limited to, flash memory, a Read Only Memory (ROM), a Programmable ROM (PROM), Erasable PROM (EPROM), and Electrically EPROM (EEPROM) memory. Examples of volatile memory may include, but are not limited to, Dynamic Random Access Memory (DRAM) and Static Random-Access Memory (SRAM).
[0055] The user interface 114 may present the integrated emotion data and the shared emotion data as per the request of an administrator of the present system. In an embodiment, the user interface (UI or GUI) 114 is a convenient interface for accessing the social network platform and viewing the biorhythm data of the connected users. The biorhythm data includes, but is not limited to, heart rate, heart rate variability, electrodermal activity (EDA)/galvanic skin response (GSR), breathing rate, 3D accelerometer data, gyroscope data, and body temperature, among others. The biorhythm data can be processed to generate signals based on mathematical descriptions or algorithms. The algorithms may be introduced via software. Data may be processed on the wearable user device itself and may also be stored there temporarily before being acted upon.
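As a hedged sketch of turning raw wearable readings into a biosignal, the snippet below computes a standard heart-rate-variability measure (RMSSD); the patent does not prescribe this particular algorithm, and the sample field names are assumptions.

```python
# A sketch of processing one window of wearable readings into a biosignal.
# RMSSD is a standard HRV measure, used here only as an illustrative algorithm.
from math import sqrt

def rmssd(rr_intervals_ms):
    """Root mean square of successive differences between heartbeats (ms)."""
    diffs = [b - a for a, b in zip(rr_intervals_ms, rr_intervals_ms[1:])]
    return sqrt(sum(d * d for d in diffs) / len(diffs)) if diffs else 0.0

def to_biosignal(sample):
    """Condense one window of readings into a compact biosignal dictionary."""
    return {
        "heart_rate": sample["heart_rate"],
        "hrv_rmssd": rmssd(sample["rr_intervals_ms"]),
        "eda": sample["eda"],                    # electrodermal activity / GSR
        "breathing_rate": sample["breathing_rate"],
    }
```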
[0057] In an aspect, the network architecture of the wearable user device 102 and the computing device 104 can include one or more Internet of Things (IoT) devices. A typical network architecture of the present disclosure can include a plurality of network devices such as transmitters, receivers, and/or transceivers that may include one or more IoT devices.
[0058] In an aspect, the wearable user device 102 can directly interact with the cloud and/or cloud servers and IoT devices. The data and/or information collected can be stored directly in the cloud server without taking any space on the user's mobile and/or portable computing device. The mobile and/or portable computing device can directly interact with a server and receive information for feedback activation, trigger the feedback, and deliver the feedback. Examples of the feedback include, but are not limited to, auditory feedback, haptic feedback, tactile feedback, vibration feedback, or visual feedback from a primary wearable device, a secondary wearable device, a separate computing device (e.g., a mobile device), or an IoT device (which may or may not be a computing device).
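The feedback flow described above can be sketched as follows; the device methods (vibrate, play_tone, flash_led) are hypothetical stand-ins, since real wearables and IoT devices expose their own interfaces.

```python
# A hedged sketch of the feedback flow: the portable computing device receives
# a trigger from the server and delivers one of the feedback types listed above.
class DemoDevice:
    def vibrate(self):   print("haptic feedback delivered")
    def play_tone(self): print("auditory feedback delivered")
    def flash_led(self): print("visual feedback delivered")

def deliver_feedback(kind, device):
    """Dispatch a feedback trigger received from the server to a device."""
    actions = {
        "haptic": device.vibrate,
        "auditory": device.play_tone,
        "visual": device.flash_led,
    }
    action = actions.get(kind)
    if action is None:
        raise ValueError(f"unsupported feedback type: {kind}")
    action()

deliver_feedback("haptic", DemoDevice())   # prints: haptic feedback delivered
```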
[0059] As used herein, an IoT device can be a device that includes sensing and/or control functionality as well as a WiFi™ transceiver radio or interface, a Bluetooth™ transceiver radio or interface, a Zigbee™ transceiver radio or interface, an Ultra-Wideband (UWB) transceiver radio or interface, a WiFi-Direct transceiver radio or interface, a Bluetooth™ Low Energy (BLE) transceiver radio or interface, and/or any other wireless network transceiver radio or interface that allows the IoT device to communicate with a wide area network and with one or more other devices. In some embodiments, an IoT device does not include a cellular network transceiver radio or interface, and thus may not be configured to directly communicate with a cellular network. In some embodiments, an IoT device may include a cellular transceiver radio and may be configured to communicate with a cellular network using the cellular network transceiver radio.
[0060] A user may communicate with the computing devices using an access device that may include any human-to-machine interface with network connection capability that allows access to a network. For example, the access device may include a stand-alone interface (e.g., a cellular telephone, a smartphone, a home computer, a laptop computer, a tablet, a personal digital assistant (PDA), a computing device, a wearable device such as a smartwatch, a wall panel, a keypad, or the like), an interface that is built into an appliance or other device (e.g., a television, a refrigerator, a security system, a game console, a browser, or the like), a speech or gesture interface (e.g., a Kinect™ sensor, a Wiimote™, or the like), an IoT device interface (e.g., an Internet-enabled device such as a wall switch, a control interface, or other suitable interface), or the like. In some embodiments, the access device may include a cellular or other broadband network transceiver radio or interface and may be configured to communicate with a cellular or other broadband network using the cellular or broadband network transceiver radio. In some embodiments, the access device may not include a cellular network transceiver radio or interface.
[0061] In an embodiment, the users may be provided with an input/display screen which is configured to display information to the user about the current status of the system. The input/display screen may take input from an input apparatus, in the current example, buttons. The input/display screen may also be configured as a touch screen or may accept input for determining vitals or bio-signals through a touch- or haptic-based input system. The input buttons and/or screen are configured to allow a user to respond to input prompts from the system regarding needed user input.
[0062] The information which may be displayed on the screen to the user may be, for instance, the number of treatments provided, bio-signal values, vitals, the battery charge level, and the volume level. The input/display screen may take information from a processor which may also be used as the waveform generator, or from a separate processor. The processor provides available information for display to the user, allowing the user to initiate menu selections. The input/display screen may be a liquid crystal display to minimize power drain on the battery. The input/display screen and the input buttons may be illuminated to provide a user with the capability to operate the system in low light levels. Information can be obtained from a user through the use of the input/display screen.
[0064] The integration module 202 integrates the emotion data into the social network platform. The emotional state determination module 204 determines an emotional state of the user. The emotional data displaying module 206 analyzes and displays emotional data of the users in real-time.
[0065] The integration module 202 includes a physiological data collection engine 208, a biosignal generating engine 210, a score calculating engine 212 and a social integrating and information overlay engine 214. The physiological data collection engine 208 collects physiological data of at least one physiological property of the user. The biosignal generating engine 210 processes the physiological data into at least one biosignal. The score calculating engine 212 monitors and measures the biosignal for determining at least one score pertaining to at least one of the emotions of the user, and stress of the user. The social integrating and information overlay engine 214 integrates the scores with at least one of a social media post associated with the social network platform, a textual conversation, and a multimedia conversation (audio, video) and overlays information pertaining to the emotion of the user, and stress of the user.
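The score-calculating and overlay stages of this pipeline (engines 212 and 214) can be summarized in a short, hedged sketch; the scoring rule below is an illustrative assumption building on the biosignal dictionary sketched earlier, not a formula disclosed in the patent.

```python
# A hedged sketch of engines 212 and 214: map a biosignal to emotion/stress
# scores and overlay them on a post. The weighting is an illustrative assumption.
def score_emotion(biosignal):
    """Map a biosignal to rough emotion and stress scores in [0, 100]."""
    # Assumption: higher electrodermal activity and lower HRV suggest more stress.
    stress = min(100.0, biosignal["eda"] * 10.0 + max(0.0, 60.0 - biosignal["hrv_rmssd"]))
    emotion = 100.0 - stress
    return {"emotion": round(emotion, 1), "stress": round(stress, 1)}

def overlay_on_post(post_text, scores):
    """Attach the scores to a social media post as an information overlay."""
    return {
        "post": post_text,
        "overlay": f"emotion {scores['emotion']}/100, stress {scores['stress']}/100",
    }

post = overlay_on_post("Great day!", score_emotion({"eda": 2.5, "hrv_rmssd": 42.0}))
```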
[0066] The emotional state determination module 204 includes an analysis module 216, an emotive module 218, and a display module 220. The analysis module 216 analyzes the emotional state of the user on receiving the biorhythm data from the wearable user device. The emotive module 218 associates the analyzed emotional state of the user with at least one of one or more posts being shared by the user, one or more pieces of content being shared to the user, one or more reactions to the posts, and one or more responses to the posts over the social network platform. In an embodiment, the emotive module 218 facilitates the user to initiate a command to associate the emotional state of the user with the posts being shared by the user and the content being shared to the user. The display module 220 displays a representation of the emotional state of the user in the social network platform. In an embodiment, the display module 220 displays the representation of the emotional state with respect to the post being shared by the user and the content shared to the user, on receiving a request command from the user.
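As a minimal sketch of the emotive-module behaviour, the association below happens only when the user explicitly issues the command; the field names are hypothetical, not taken from the patent.

```python
# A minimal sketch of the emotive module 218: the emotional state is attached
# to a post only when the user issues the command. Field names are hypothetical.
def associate_emotion(post, emotional_state, user_command_given):
    """Attach the analyzed emotional state to a post if the user requests it."""
    if not user_command_given:
        return post                                  # nothing shared without a command
    tagged = dict(post)
    tagged["emotional_state"] = emotional_state      # e.g. "calm", "excited", "stressed"
    return tagged
```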
[0067] The emotional data displaying module 206 includes an algorithmic module 222 and a visualization module 224. The algorithmic module 222 analyzes the biorhythm data and computes an emotional score of the user to generate one or more insights. The emotional score is indicative of the emotional state of the user during the interactions. The visualization module 224 visually represents a plurality of emotional cycles for a specific time duration for the user and displays the insights and emotional scores of the users on the computing device associated with the users. In an embodiment, the visualization module 224 displays emotional data in a plurality of manners on at least one of a two-dimensional (2D) graph and a three-dimensional (3D) graph by using at least one of a plurality of alpha-numeric characters, a plurality of geometric shapes, a plurality of holograms, and a plurality of symbols, which include colors or moving shapes.
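One possible 2D rendering of the emotional cycles described above is sketched below using matplotlib; the patent leaves the visual form open (alpha-numeric characters, shapes, holograms, symbols), so the line plot is only an illustrative assumption.

```python
# A sketch of the visualization module 224: plot one user's emotional score
# over a chosen time window. The 2D line plot is one of many allowed forms.
import matplotlib.pyplot as plt

def plot_emotional_cycle(timestamps, emotional_scores):
    """Plot the emotional score over a specific time duration for one user."""
    fig, ax = plt.subplots()
    ax.plot(timestamps, emotional_scores, marker="o")
    ax.set_xlabel("time")
    ax.set_ylabel("emotional score")
    ax.set_title("Emotional cycle for the selected duration")
    fig.autofmt_xdate()    # readable tick labels when timestamps are datetimes
    return fig
```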
[0068] The present specification further describes various use cases. In the first use case, the user is feeling fantastic in the present moment. However, in the next minute, the user comes across a post on social media, receives a text from a friend, or sees a random news post on the internet. Suddenly there is a change in emotions based on such new information. So, the change in the user's emotion needs to be determined and simultaneously shared with any secondary user.
[0069] In an embodiment, this change of emotions is dynamic and real-time. This change occurs because of the change in the state of another user. The change is determined by the biosensors connected to the user's body wirelessly or by a wired medium. The change in the user's biorhythm determines the change in emotions, which is conveyed over the communication channel to the second user if the user grants permission. In an embodiment, a threshold can be pre-determined; when the threshold is breached, the system and the method switch the user's communication to another channel to relax and/or calm the user. In an embodiment, the other user can also receive information, based on the score, about the user's emotions over the communication channel.
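The threshold behaviour described above can be sketched as follows; the threshold value, channel objects, and notification callback are illustrative assumptions rather than elements disclosed in the patent.

```python
# A sketch of the pre-determined threshold: when the stress score breaches it,
# the conversation is moved to a calming channel, and the second user is
# notified of the score only if permission was granted. Values are illustrative.
STRESS_THRESHOLD = 75.0

def route_conversation(scores, current_channel, calming_channel, share_allowed, notify_peer):
    """Pick the channel for the next message and optionally share the score."""
    channel = calming_channel if scores["stress"] >= STRESS_THRESHOLD else current_channel
    if share_allowed:
        notify_peer(scores)          # the other user receives the score
    return channel
```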
[0074] Thus the present invention provides a social network platform that allows the users to add more connections, whereupon an invitation is sent to other users for making connections. After acceptance from the other users, the user can share data in various forms including, but not limited to, photos, messages, attachments in various document, image, or video file formats, audio clips, videos, and animations/GIFs. Based on the shared data or information, users can respond and share their emotions and feelings via emoticons by clicking the corresponding buttons that represent, for example, sad, happy, laugh, love, and like. The system enables the user to request to associate the emotional state of the user with at least one of a piece of content being shared by the user and content being shared to the user in the social network. The user can place the emoticons either adjacent to, or superimposed on top of, the piece of content. The user can change the emoticons to be semi-transparent or use some other visual effect to respond to the piece of content on the social networks.
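As an illustrative sketch of the semi-transparent, superimposed emoticon treatment mentioned above, the snippet below uses Pillow; the file paths and alpha value are placeholders, and this is just one of several visual effects the description allows.

```python
# A sketch of superimposing a semi-transparent emoticon on shared content
# using Pillow; file paths and the alpha value are illustrative placeholders.
from PIL import Image

def superimpose_emoticon(content_path, emoticon_path, position=(0, 0), alpha=128):
    """Paste an emoticon onto the content with partial transparency."""
    content = Image.open(content_path).convert("RGBA")
    emoticon = Image.open(emoticon_path).convert("RGBA")
    faded = emoticon.getchannel("A").point(lambda v: v * alpha // 255)
    emoticon.putalpha(faded)                       # scale existing transparency
    content.alpha_composite(emoticon, dest=position)
    return content
```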
[0075] While embodiments of the present invention have been illustrated and described, it will be clear that the invention is not limited to these embodiments only. Numerous modifications, changes, variations, substitutions, and equivalents will be apparent to those skilled in the art, without departing from the scope of the invention, as described in the claims.