A63F13/212

CONTROLLING PROGRESS OF AUDIO-VIDEO CONTENT BASED ON SENSOR DATA OF MULTIPLE USERS, COMPOSITE NEURO-PHYSIOLOGICAL STATE AND/OR CONTENT ENGAGEMENT POWER

Provided is a system for controlling progress of audio-video content based on sensor data of multiple users, composite neuro-physiological state (CNS) and/or content engagement power (CEP). Sensor data is received from sensors positioned on an electronic device of a first user to sense neuro-physiological responses of the first user and of second users who are in the field-of-view (FOV) of the sensors. Based on the sensor data and at least one of a CNS value for a social interaction application and a CEP value for immersive content, recommendations of action items for the first user are predicted. Content of a feedback loop, created based on the sensor data, the CNS value, the CEP value, and the predicted recommendations, is rendered on an output unit of the electronic device during play of the at least one of the social interaction application and the immersive content experience. Progress of the social interaction and the immersive content experience is controlled by the first user based on the predicted recommendations.
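One pass of the described feedback loop can be sketched as follows. This is an illustrative assumption about the computation, not the patented method: the weighting scheme (simple averaging of normalized sensor channels into a CNS value, averaging CNS values into a CEP value) and the recommendation thresholds are hypothetical.

```python
# Hypothetical sketch: per-user sensor readings are fused into a composite
# neuro-physiological state (CNS), a content engagement power (CEP) is
# derived across users in the sensors' field of view, and a recommendation
# of an action item is predicted for the first user.

def composite_state(readings):
    """CNS for one user: mean of normalized sensor channels."""
    return sum(readings) / len(readings)

def engagement_power(states):
    """CEP across all sensed users: mean of their CNS values."""
    return sum(states) / len(states)

def recommend(cep, low=0.3, high=0.7):
    """Map engagement to an action item (thresholds are illustrative)."""
    if cep < low:
        return "slow down / re-engage viewers"
    if cep > high:
        return "advance to next scene"
    return "maintain current pacing"

# One iteration of the loop for a first user and two second users.
states = [composite_state(r) for r in ([0.2, 0.4], [0.9, 0.7], [0.6, 0.8])]
cep = engagement_power(states)
print(recommend(cep))
```

The loop would re-run each time fresh sensor data arrives, with the rendered recommendation closing the feedback cycle.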

Whole-body human-computer interface
11579692 · 2023-02-14

A human-computer interface system having an exoskeleton including a plurality of structural members coupled to one another by at least one articulation configured to apply a force to a body segment of a user, the exoskeleton comprising a body-borne portion and a point-of-use portion; the body-borne portion configured to be operatively coupled to the point-of-use portion; and at least one locomotor module including at least one actuator configured to actuate the at least one articulation, the at least one actuator being in operative communication with the exoskeleton.

Input apparatus, method, and game processing method

An example input apparatus includes an elastic member, a base portion and a strain gauge. The elastic member includes a first end portion and a second end portion, and at least a part of the elastic member is elastically deformable. The base portion holds the opposite end portions of the elastic member so that a ring is formed by the base portion and the elastic member. The strain gauge is provided on the base portion and detects a strain generated on the base portion due to deformation of the elastic member in response to an input from a user. Note that instead of the configuration where a ring is formed by the base portion and the elastic member, the input apparatus may be configured so that two elastic members are held by the base portion.
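A minimal sketch of turning such a strain reading into a game input axis. The linear strain-to-deformation relationship, the `full_scale` value, and the clamped [-1, 1] axis are assumptions for illustration, not taken from the abstract.

```python
# Illustrative conversion of a strain-gauge reading to a game input value.
# The ring-shaped elastic member deforms when squeezed or stretched; the
# strain at the base portion is assumed (hypothetically) to vary roughly
# linearly with the applied deformation within the working range.

def strain_to_input(strain, rest_strain=0.0, full_scale=1.0e-3):
    """Normalize a strain reading to a [-1.0, 1.0] input axis.

    Positive values could mean the ring is squeezed, negative that it is
    stretched; full_scale is the strain at maximum expected deformation.
    """
    value = (strain - rest_strain) / full_scale
    return max(-1.0, min(1.0, value))  # clamp to the axis range

print(strain_to_input(5.0e-4))  # half-squeeze maps to 0.5
print(strain_to_input(2.0e-3))  # past full scale, clamped to 1.0
```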

Range of motion control in XR applications on information handling systems

More realistic experiences can be provided to a user through the use of a wearable suit. The xR wearable suit may include materials with adjustable characteristics, such as friction, and electronics for controlling the materials to provide feedback to the user wearing the xR suit. In an xR game, the materials may be used to translate virtual damage into physical constraints on the user. For example, when an avatar is shot in the leg and debilitated, the user's leg motion can be constricted so that the user perceives that limitation and stays in sync with the avatar. Examples of such feedback materials include inflating ribs, sheet jamming, and mechanical devices.
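The damage-to-constraint translation in the shot-leg example could be sketched as below. The per-limb stiffness command in [0, 1] and the linear damage scaling are hypothetical; the abstract does not specify the control interface.

```python
# Hypothetical mapping from virtual damage to a physical motion constraint.
# The suit electronics are assumed to accept a per-limb stiffness command in
# [0, 1], where 1 fully restricts the limb (e.g. via sheet jamming).

def constraint_for_damage(damage, max_damage=100):
    """Scale limb stiffness with damage taken, clamped to [0, 1]."""
    return min(1.0, max(0.0, damage / max_damage))

def apply_leg_hit(suit_state, damage):
    """Update the suit command for a leg hit, as in the shot-avatar example."""
    suit_state["left_leg_stiffness"] = constraint_for_damage(damage)
    return suit_state

state = apply_leg_hit({"left_leg_stiffness": 0.0}, damage=40)
print(state)  # left leg restricted in proportion to damage
```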

Realistic virtual/augmented/mixed reality viewing and interactions

The present invention discloses systems and methods for both viewing and interacting with a virtual reality (VR), an augmented reality (AR) or a mixed reality (MR). More specifically, the systems and methods allow the user to interact with aspects of such realities including virtual items presented in such realities or within such environments by manipulating a control device that has an inside-out camera mounted on-board. The apparatus or system uses two distinct representations including a reduced representation in determining the pose of the control device and uses these representations to compute an interactive pose portion of the control device to be used for interacting with the virtual item. The reduced representation is consonant with a constrained motion of the control device.
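One way to picture the two-representation idea is below. Everything concrete here is an assumption: the constrained motion is taken to be rotation about a fixed pivot in 2D, so the reduced representation keeps only a bearing angle, and the interactive pose is recomposed on the constraint circle. The patent's actual representations and pose math are not specified in the abstract.

```python
# Hedged sketch of two distinct pose representations: a full pose estimate
# (as might come from the inside-out camera) and a reduced representation
# consonant with a constrained motion -- here, assumed rotation about a pivot.

import math

def reduced_pose(full_pose, pivot):
    """Project a full (x, y, angle) pose onto rotation-about-pivot motion."""
    x, y, _angle = full_pose
    px, py = pivot
    # Keep only the bearing from the pivot; the radius is fixed by the
    # constraint, so it is not part of the reduced representation.
    return math.atan2(y - py, x - px)

def interactive_pose(full_pose, pivot, radius):
    """Recompose the pose used for interacting with the virtual item."""
    bearing = reduced_pose(full_pose, pivot)
    px, py = pivot
    return (px + radius * math.cos(bearing),
            py + radius * math.sin(bearing),
            full_pose[2])

# A noisy full pose is snapped back onto the constrained circle of radius 1.
print(interactive_pose((1.1, 0.0, 0.0), pivot=(0.0, 0.0), radius=1.0))
```

The point of the reduced representation in this sketch is noise rejection: camera error off the constraint surface is discarded rather than passed on to the interaction.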

MOTION BLUR COMPENSATION THROUGH EYE TRACKING
20230042920 · 2023-02-09

A user's eyes, and optionally the user's head, are tracked as the user's gaze follows a moving object on a display. Motion blur of the moving object is keyed to the eye/head tracking. Motion blur of other objects in the frame may also be keyed to the eye/head tracking.
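A minimal sketch of gaze-keyed blur, under the assumption that blur length is proportional to an object's on-screen velocity relative to the tracked gaze velocity (its retinal slip); the abstract does not give the actual formula.

```python
# Gaze-keyed motion blur (illustrative): an object the eye smoothly pursues
# has near-zero retinal slip and stays sharp, while objects moving relative
# to the gaze receive blur proportional to that relative velocity.

def blur_length(obj_velocity, gaze_velocity, shutter_time=1 / 60):
    """Blur extent in pixels over one frame's shutter interval."""
    vx = obj_velocity[0] - gaze_velocity[0]
    vy = obj_velocity[1] - gaze_velocity[1]
    return (vx * vx + vy * vy) ** 0.5 * shutter_time

gaze = (600.0, 0.0)                     # eye pursuing at 600 px/s rightward
print(blur_length((600.0, 0.0), gaze))  # followed object: no blur
print(blur_length((0.0, 0.0), gaze))    # static background: blurred
```

This inverts conventional renderer motion blur, which is keyed to camera motion rather than to where the viewer is actually looking.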

Thermopile array fusion tracking

A simultaneous location and mapping (SLAM)-enabled video game system, a user device of the video game system, and a computer-readable storage medium of the user device are disclosed. Generally, the video game system includes a video game console, a plurality of thermal beacons, and a user device communicatively coupled with the video game console. The user device includes a thermopile array, a processor, and a memory. The user device may receive thermal data from the thermopile array, the thermal data corresponding to a thermal signal emitted from a thermal beacon of the plurality of thermal beacons and detected by the thermopile array. The user device may determine, based on the thermal data, its location in 3D space, and then transmit that location to the video game system.
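One plausible way a device could turn beacon measurements into a location is standard trilateration, sketched below. This assumes, hypothetically, that each thermal reading yields a distance estimate to a beacon at a known position; the patent's actual fusion method is not specified in the abstract, and the sketch is 2D for brevity.

```python
# Illustrative trilateration from three beacons at known positions.
# Subtracting pairs of circle equations (x - x_i)^2 + (y - y_i)^2 = d_i^2
# yields two linear equations A @ (x, y) = b, solved here by Cramer's rule.

def trilaterate(beacons, distances):
    """Solve for (x, y) given three (x_i, y_i) beacons and distances d_i."""
    (x1, y1), (x2, y2), (x3, y3) = beacons
    d1, d2, d3 = distances
    a11, a12 = 2 * (x2 - x1), 2 * (y2 - y1)
    a21, a22 = 2 * (x3 - x1), 2 * (y3 - y1)
    b1 = d1**2 - d2**2 + x2**2 - x1**2 + y2**2 - y1**2
    b2 = d1**2 - d3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a11 * a22 - a12 * a21  # zero iff the beacons are collinear
    return ((b1 * a22 - b2 * a12) / det, (a11 * b2 - a21 * b1) / det)

beacons = [(0.0, 0.0), (4.0, 0.0), (0.0, 4.0)]
# True position (1, 2): distances are sqrt(5), sqrt(13), sqrt(5).
print(trilaterate(beacons, (5 ** 0.5, 13 ** 0.5, 5 ** 0.5)))
```

In practice the distance estimates would be noisy, so a real system would fuse many readings (e.g. via least squares or a filter) rather than solve exactly from three.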

Body size estimation apparatus, body size estimation method, and program

Provided are a body size estimation apparatus, a body size estimation method, and a program that enable estimation of the body size of a user even when the user has not taken a T-pose in advance. A body size data storage unit (50) stores body size data indicating a body size of a user. A posture data acquisition unit (52) acquires position data indicating positions of a plurality of mutually spaced body parts of the user. A body size estimation unit (54) estimates a body size of the user based on positions of two or more body parts indicated by the position data. A body size update unit (56) updates, in a case where the estimated body size is larger than the body size indicated by the body size data stored in the body size data storage unit (50), the body size indicated by the body size data to the estimated body size.
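The estimate-and-update flow of units (52)-(56) can be sketched as below. The choice of head-to-foot distance as the size proxy is a hypothetical simplification; the key behavior from the abstract is that the stored size is only ever raised, so a crouching or seated posture never shrinks it.

```python
# Sketch of the body-size estimate/update flow: estimate a size from tracked
# body-part positions, then keep the running maximum as the stored size.

def estimate_size(head_pos, foot_pos):
    """Estimate body size as the distance between two tracked body parts."""
    return sum((h - f) ** 2 for h, f in zip(head_pos, foot_pos)) ** 0.5

def update_size(stored, estimate):
    """Update rule of unit (56): only replace when the estimate is larger."""
    return max(stored, estimate)

stored = 0.0
foot = (0.0, 0.0, 0.0)
for head in [(0.0, 1.2, 0.0),   # user crouching
             (0.0, 1.7, 0.0),   # user standing upright
             (0.0, 1.0, 0.0)]:  # user seated -- does not lower the size
    stored = update_size(stored, estimate_size(head, foot))
print(stored)  # the tallest observed posture wins
```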
