CONTROLLING PROGRESS OF AUDIO-VIDEO CONTENT BASED ON SENSOR DATA OF MULTIPLE USERS, COMPOSITE NEURO-PHYSIOLOGICAL STATE AND/OR CONTENT ENGAGEMENT POWER

Provided is a system for controlling progress of audio-video content based on sensor data of multiple users, composite neuro-physiological state (CNS) and/or content engagement power (CEP). Sensor data is received from sensors positioned on an electronic device of a first user to sense the neuro-physiological responses of the first user and of second users who are in the field-of-view (FOV) of the sensors. Based on the sensor data and at least one of a CNS value for a social interaction application and a CEP value for immersive content, recommendations of action items for the first user are predicted. Content of a feedback loop, created from the sensor data, the CNS value, the CEP value, and the predicted recommendations, is rendered on an output unit of the electronic device during play of the social interaction application and/or the immersive content experience. Progress of the social interaction and immersive content experience is controlled by the first user based on the predicted recommendations.
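The abstract above describes aggregating per-user neuro-physiological responses into a composite value and mapping that value to recommended action items. A minimal sketch of that idea, assuming hypothetical 0-to-1 response scores and an illustrative threshold rule (none of these names or thresholds appear in the source):

```python
from statistics import mean

def composite_neuro_state(responses):
    """Average per-user neuro-physiological response scores (0..1)
    into a single composite (CNS-like) value for the group."""
    return mean(responses)

def recommend_action(cns_value, cep_value, threshold=0.5):
    """Pick a coarse action item from the two composite scores.
    The labels and the threshold are purely illustrative."""
    if cns_value < threshold and cep_value < threshold:
        return "suggest a break or a content change"
    if cep_value >= threshold:
        return "continue the current content"
    return "prompt more social interaction"
```

For example, scores of 0.7, 0.4 and 0.6 from three users average to roughly 0.57, which the toy rule would read as "continue" when engagement is also above threshold.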

SYSTEMS AND METHODS FOR TERMINAL CONTROL

The embodiments of the present disclosure disclose a system and method. The system may include at least one storage device configured to store computer instructions, and at least one processor in communication with the storage device. When executing the computer instructions, the at least one processor is configured to direct the system to perform operations including: obtaining a sensing signal of at least one sensing device; identifying a signal feature of the sensing signal; and determining, based on the signal feature, an operation of a target object associated with the at least one sensing device.

DYNAMIC EXPANSION AND CONTRACTION OF EXTENDED REALITY ENVIRONMENTS
20230052418 · 2023-02-16

In one example, a method performed by a processing system including at least one processor includes rendering an extended reality environment, monitoring social interactions of a plurality of users within the extended reality environment, adjusting the extended reality environment in response to the social interactions of the plurality of users, and adjusting a rule associated with the extended reality environment in response to adjusting the extended reality environment.

Metaverse multimedia system based on brainwaves
11550393 · 2023-01-10

A metaverse multimedia system based on user brainwaves captures emotional messages, voice messages and color messages from the brainwaves of the users. The emotional messages are integrated into the user's role in the metaverse space, so that the role can respond to the user's emotions, while the voice messages and color messages are integrated into the sceneries in the metaverse selected by the user. As a result, the metaverse space fully presents the user's states of mind as captured from the brainwaves. Furthermore, the sceneries and the presentations of the roles in the metaverse space adjust with changes in the users, and can express the user's personality. Interaction modes between the users and visitors entering the user's metaverse space can also be analyzed based on the method disclosed in the present invention.

MOOD ORIENTED WORKSPACE
20230041497 · 2023-02-09

A system detects a user's mood and in response establishes computer settings including computer game settings, recommends social network interactions, advises other users, alters task scheduling, and in general enhances collective group mood, collective productivity, social interaction, and engagement.

Dynamic emotion detection based on user inputs
11593243 · 2023-02-28

A method by a network device for dynamically detecting emotional states of a user operating a client end station to interact with an application. The method includes receiving information regarding user inputs received by the client end station from the user while the user interacted with the application during a particular time period, and determining an emotional state of the user by analyzing that information together with information regarding user inputs received during one or more previous time periods that, together with the particular time period, form a time window.
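The time-window scheme above can be sketched as a sliding buffer of per-period input statistics whose aggregate classifies the current state. The feature names (keystroke rate, backspace ratio) and the frustration rule below are illustrative assumptions, not details from the abstract:

```python
from collections import deque

class EmotionWindow:
    """Keep per-period user-input statistics over a sliding window of
    time periods and classify the current emotional state from the
    window-wide averages."""

    def __init__(self, window_periods=5):
        # deque(maxlen=...) drops the oldest period automatically
        self.periods = deque(maxlen=window_periods)

    def add_period(self, keystrokes_per_min, backspace_ratio):
        self.periods.append((keystrokes_per_min, backspace_ratio))

    def state(self):
        if not self.periods:
            return "unknown"
        avg_rate = sum(p[0] for p in self.periods) / len(self.periods)
        avg_back = sum(p[1] for p in self.periods) / len(self.periods)
        # Toy rule: fast typing with many corrections reads as frustration.
        if avg_back > 0.2 and avg_rate > 80:
            return "frustrated"
        return "calm"
```

The fixed-length deque is what makes the "previous periods plus the current period form a window" behavior automatic: adding a new period evicts the oldest one.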

SYSTEMS AND METHODS FOR BIOMECHANICALLY-BASED EYE SIGNALS FOR INTERACTING WITH REAL AND VIRTUAL OBJECTS

Systems and methods are provided for discerning the intent of a device wearer primarily based on movements of the eyes. The system may be included within unobtrusive headwear that performs eye tracking and controls screen display. The system may also utilize remote eye tracking camera(s), remote displays and/or other ancillary inputs. Screen layout is optimized to facilitate the formation and reliable detection of rapid eye signals. The detection of eye signals is based on tracking physiological movements of the eye that are under voluntary control by the device wearer. The detection of eye signals results in actions that are compatible with wearable computing and a wide range of display devices.
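One common way to detect a deliberate, voluntary eye signal of the kind described above is a dwell test: the gaze must remain near an on-screen target for a minimum number of consecutive samples. This is a generic sketch under that assumption, not the patent's actual detection method:

```python
def detect_dwell(gaze_samples, target, radius=30.0, min_samples=12):
    """Return True when consecutive gaze points stay within `radius`
    pixels of `target` for at least `min_samples` samples - a crude
    stand-in for detecting a voluntary fixation on a screen element."""
    run = 0
    for (x, y) in gaze_samples:
        if (x - target[0]) ** 2 + (y - target[1]) ** 2 <= radius ** 2:
            run += 1
            if run >= min_samples:
                return True
        else:
            run = 0  # leaving the target resets the dwell
    return False
```

At a 60 Hz eye tracker, `min_samples=12` corresponds to a 200 ms dwell, long enough to separate intentional fixations from passing saccades.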

DISPLAY CASE
20180011512 · 2018-01-11 ·

A display case includes a transparent LCD panel for viewing display case contents therethrough.

VARIABLE COMPUTING ENGINE FOR INTERACTIVE MEDIA BASED UPON USER BIOMETRICS
20180011682 · 2018-01-11

A system and method for implementing interactive media content are provided. Interactive media content is received for communication to a user through at least wireless earpieces. User biometrics are measured utilizing the wireless earpieces. A user condition associated with the user biometrics is determined. Branching patterns of the interactive media content are modified in response to the user condition. The interactive media content may be a game or story.
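Modifying branching patterns in response to a user condition can be sketched as a lookup over candidate branches, each tagged with the condition it requires. The branch identifiers and condition labels below are hypothetical:

```python
def next_branch(branches, user_condition):
    """Choose the next content branch whose required condition matches
    the user's measured condition; fall back to the first (default)
    branch when nothing matches."""
    for branch in branches:
        if branch.get("requires") == user_condition:
            return branch["id"]
    return branches[0]["id"]

# Hypothetical story graph for one decision point.
story = [
    {"id": "calm_path", "requires": None},
    {"id": "action_scene", "requires": "excited"},
    {"id": "quiet_scene", "requires": "fatigued"},
]
```

A biometric condition of "excited" steers the story into the action scene, while an unrecognized condition falls back to the default path.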

Determining a mood for a group
11710323 · 2023-07-25

A system and method for determining a mood for a crowd are disclosed. In example embodiments, a method includes identifying an event that includes two or more attendees, receiving at least one indicator representing emotions of attendees, determining a numerical value for each of the indicators, and aggregating the numerical values to determine an aggregate mood of the attendees of the event.
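The indicator-to-number-to-aggregate pipeline above can be sketched in a few lines. The valence scores and mood buckets are illustrative assumptions, not values from the patent:

```python
# Hypothetical numeric mapping from emotion indicators to valence scores.
EMOTION_SCORES = {"happy": 1.0, "neutral": 0.0, "bored": -0.5, "angry": -1.0}

def aggregate_mood(indicators):
    """Map each attendee's indicator to a score, average the scores,
    then bucket the average into a coarse crowd mood."""
    scores = [EMOTION_SCORES[i] for i in indicators]
    avg = sum(scores) / len(scores)
    if avg > 0.25:
        return "positive"
    if avg < -0.25:
        return "negative"
    return "mixed"
```

Two happy attendees and one neutral attendee average about 0.67, so the crowd reads as "positive"; one happy and one angry attendee cancel out to "mixed".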