A63F2300/6045

Systems and methods for interfacing video games and user communications
09737804 · 2017-08-22

Systems and methods for interfacing video games and user communications are provided. Communications amongst users can affect a video game provided to the users. For example, the communications can be monitored, and when a triggering event is detected in them, a corresponding video game event can be generated. Conversely, one or more aspects of the video game can affect communications amongst the users. For example, which user is the active user of the video game may be used to adjust the prominence of communications amongst the users.
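A minimal sketch of the first mechanism described above: chat messages are monitored for a trigger phrase, and a game event is generated when one appears. The class and trigger phrases are illustrative assumptions, not from the patent.

```python
# Hypothetical sketch: monitor user chat for trigger phrases and raise
# game events. ChatMonitor and the trigger table are assumed names.
from dataclasses import dataclass, field

@dataclass
class GameEvent:
    kind: str
    source_user: str

@dataclass
class ChatMonitor:
    triggers: dict                      # phrase -> game event kind
    events: list = field(default_factory=list)

    def on_message(self, user, text):
        # Generate a game event when a monitored phrase appears.
        for phrase, kind in self.triggers.items():
            if phrase in text.lower():
                self.events.append(GameEvent(kind, user))

monitor = ChatMonitor(triggers={"attack": "battle_start"})
monitor.on_message("alice", "Let's attack the base!")
print(monitor.events[0].kind)  # -> battle_start
```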

Projection of interactive environment

An interactive environment image may be projected onto one or more surfaces, and interaction with the projected image may be detected within a three-dimensional space above those surfaces. The interactive environment image may be three-dimensional or two-dimensional. The image projected onto a surface provides a visual representation of a virtual space containing one or more spatially positioned virtual objects. User interaction with the projected representation of the virtual space may be detected and, in response, the projected representation may be changed.
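The detect-and-respond loop above can be sketched as a hit test of a 3-D touch point against the projected objects' bounds. The geometry, the interactive-volume height, and all names here are assumptions for illustration.

```python
# Sketch (assumed geometry): detect interaction with a projected virtual
# object by hit-testing a 3-D point against 2-D object bounds, then
# "change" the projection by marking the object selected.

def hit_test(point, objects, height_limit=0.3):
    """Return the first object whose XY bounds contain the point,
    provided the touch is within the interactive volume above the surface."""
    x, y, z = point
    if z > height_limit:              # outside the 3-D interaction space
        return None
    for obj in objects:
        (x0, y0), (x1, y1) = obj["bounds"]
        if x0 <= x <= x1 and y0 <= y <= y1:
            return obj
    return None

objects = [{"name": "button", "bounds": ((0.0, 0.0), (1.0, 1.0)), "selected": False}]
hit = hit_test((0.5, 0.5, 0.1), objects)
if hit:
    hit["selected"] = True            # change the projected representation
print(objects[0]["selected"])  # -> True
```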

Video game using dual motion sensing controllers

An inclination of a first unit of a controller is detected based on the output of a first acceleration sensor provided in that unit, and an inclination of a second unit, separate from the first, is detected based on the output of a second acceleration sensor provided in the second unit. The difference between the two inclinations is detected, and game control is performed using that difference. Thus, in a game apparatus using a plurality of acceleration sensors, or a plurality of sensors capable of detecting motion or posture, dynamic play with a high degree of freedom of motion becomes possible and intuitive motion input is realized.
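The inclination-difference idea can be sketched as follows, assuming each unit reports static (gravity) acceleration on its X and Z axes; the function names and axis convention are illustrative, not from the patent.

```python
import math

# Sketch: derive each unit's pitch from its accelerometer, then use the
# difference between the two pitches as the game control input.

def inclination(ax, az):
    """Pitch angle (radians) of a unit from its static acceleration."""
    return math.atan2(ax, az)

def control_value(first_accel, second_accel):
    """Game control input derived from the inclination difference."""
    return inclination(*first_accel) - inclination(*second_accel)

# First unit tilted 45 degrees, second unit held level:
delta = control_value((1.0, 1.0), (0.0, 1.0))
print(round(math.degrees(delta)))  # -> 45
```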

Controlling objects in a virtual environment

Methods, systems, and computer-storage media having computer-usable instructions embodied thereon for controlling objects in a virtual environment are provided. Real-world objects may be received into a virtual environment; a real-world object may be any non-human object. An object skeleton may be identified and mapped to the object. A user skeleton of the real-world user may also be identified and mapped to the object skeleton. By mapping the user skeleton to the object skeleton, movements of the user control the movements of the object in the virtual environment.
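The skeleton-to-skeleton mapping can be sketched as a joint correspondence table that retargets the user's joint positions onto the object's joints. The joint names and the example correspondence (user limbs to a chair's legs) are assumptions for illustration.

```python
# Sketch: drive an object skeleton from a user skeleton via an assumed
# joint correspondence table.

user_pose = {"left_arm": (0.2, 1.3), "right_arm": (0.8, 1.3), "head": (0.5, 1.7)}

# Correspondence from user joints to object-skeleton joints (e.g. a chair).
joint_map = {"left_arm": "left_leg", "right_arm": "right_leg", "head": "back"}

def drive_object(user_pose, joint_map):
    """Move each mapped object joint to follow the user's joint."""
    return {obj_joint: user_pose[user_joint]
            for user_joint, obj_joint in joint_map.items()}

object_pose = drive_object(user_pose, joint_map)
print(object_pose["back"])  # -> (0.5, 1.7)
```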

Video analysis device, video analysis method, and point-of-gaze display system

A video acquirer acquires video captured by an imaging element that moves in association with the motion of a user's head. The imaged area includes the reflected light of each of two light beams irradiated onto one of the user's eyeballs from two light sources: a first light source that moves in association with the motion of the user's head, and a second light source whose position is fixed relative to a video presenter serving as the observation target for the user. A head movement estimator estimates the motion of the user's head based on the position of the second light source's reflection relative to the first light source's reflection in the acquired video.
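The estimator can be sketched as tracking the offset between the two corneal reflections (glints): the head-mounted source's glint is roughly stable in the head-mounted camera's frame, so changes in the fixed source's glint relative to it track head motion. Coordinates, the gain, and the function name are assumptions.

```python
# Sketch: head motion estimated from the change over time of the
# fixed-source glint's position relative to the head-mounted-source glint.

def head_motion(fixed_glint, head_glint, prev_offset, gain=1.0):
    """Return (current offset, estimated motion since prev_offset)."""
    offset = (fixed_glint[0] - head_glint[0], fixed_glint[1] - head_glint[1])
    motion = ((offset[0] - prev_offset[0]) * gain,
              (offset[1] - prev_offset[1]) * gain)
    return offset, motion

# Frame 1 establishes a baseline offset; frame 2 shows the head turned.
offset0, _ = head_motion((12.0, 5.0), (10.0, 5.0), (0.0, 0.0))
_, motion = head_motion((15.0, 5.0), (10.0, 5.0), offset0)
print(motion)  # -> (3.0, 0.0)
```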

Multi-image interactive gaming device

An image capture device includes: a housing; a first camera defined along a front surface of the housing; a first camera controller configured to control the first camera to capture images of an interactive environment during user interactivity at a first exposure setting; a second camera defined along the front surface of the housing; and a second camera controller configured to control the second camera to capture images of the interactive environment during the user interactivity at a second exposure setting lower than the first exposure setting, the captured images from the second camera being analyzed to identify and track an illuminated object in the interactive environment.
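The analysis of the low-exposure stream can be sketched simply: at a low exposure, only the illuminated object remains bright, so thresholding the frame and taking the centroid of the bright pixels localizes it. The toy 2-D frame and threshold value are assumptions.

```python
# Sketch: track an illuminated object in a low-exposure frame by
# thresholding and computing the centroid of the bright pixels.

def track_illuminated(low_exposure_frame, threshold=200):
    """Centroid (x, y) of pixels at or above the threshold, else None."""
    hits = [(x, y)
            for y, row in enumerate(low_exposure_frame)
            for x, v in enumerate(row) if v >= threshold]
    if not hits:
        return None
    return (sum(x for x, _ in hits) / len(hits),
            sum(y for _, y in hits) / len(hits))

frame = [[10, 10, 10],
         [10, 255, 10],
         [10, 10, 10]]
print(track_illuminated(frame))  # -> (1.0, 1.0)
```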

Individual discrimination device and individual discrimination method

A frame storage stores images obtained by imaging a region of at least part of a user's body. A vital sign signal detector detects, from plural imaged regions of the user's body, the signal sequence of a cyclically varying vital sign, using captured images of a predetermined number of frames stored in the frame storage. A correlation calculator obtains the correlation between the signal sequences detected from the respective imaged regions. An identity determining section determines, based on that correlation, whether or not the respective imaged regions belong to the same user.
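The correlation-and-decision step can be sketched with a Pearson correlation between the two regions' pulse-like signals and an assumed decision threshold; the sample signals and threshold are illustrative.

```python
# Sketch: two body regions are judged to belong to the same user when
# their vital-sign sequences are strongly correlated.

def pearson(a, b):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = sum((x - ma) ** 2 for x in a) ** 0.5
    sb = sum((y - mb) ** 2 for y in b) ** 0.5
    return cov / (sa * sb)

def same_user(signal_a, signal_b, threshold=0.8):
    return pearson(signal_a, signal_b) >= threshold

face = [0.0, 1.0, 0.0, -1.0, 0.0, 1.0, 0.0, -1.0]
hand = [0.1, 1.1, 0.1, -0.9, 0.1, 1.1, 0.1, -0.9]
print(same_user(face, hand))  # -> True
```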

Augmented reality and physical games

Augmented reality and physical game techniques are described. In one or more implementations, a computing device receives an indication of the location of a physical gaming piece of a game. Based on the indication, the computing device computes an augmentation to be displayed as part of the game. The computing device displays the augmentation on a display device that is at least partially transparent, such that a physical portion of the game is viewable through the display device concurrently with the augmentation.
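Computing the augmentation from the piece's location can be sketched as mapping the physical board coordinate into display coordinates so the augmentation appears registered with the piece. The simple scale-and-offset calibration here is an assumption standing in for a real display calibration.

```python
# Sketch: map a physical board coordinate to display pixels so an
# augmentation lands on the piece seen through the transparent display.

def augmentation_position(piece_xy, scale=100.0, offset=(50.0, 50.0)):
    """Map a physical board coordinate (metres) to display pixels,
    using an assumed linear calibration."""
    return (piece_xy[0] * scale + offset[0],
            piece_xy[1] * scale + offset[1])

print(augmentation_position((0.5, 0.25)))  # -> (100.0, 75.0)
```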

Quality of experience reverse control for electronic games
09717991 · 2017-08-01

Technologies and implementations for managing an experience during play of an interactive electronic game are generally disclosed.

Virtual controller for touch display

Systems and methods are provided for use with a computing device having a touch-sensitive display including a touch sensor configured to detect touches of a digit of a user. The method may include detecting an initial digit-down position on the display via the touch sensor, and establishing a neutral position for a virtual controller at the digit-down position. The method may further include detecting a subsequent movement of the digit relative to the initial digit-down position, and determining a controller input parameter based on that movement. The method may further include generating a controller input message indicating the determined controller input parameter.
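The steps above can be sketched as a virtual joystick: the initial touch sets the neutral position, and subsequent movement yields a clamped two-axis input. The class name and the clamping radius are illustrative assumptions.

```python
# Sketch: virtual controller whose neutral position is set at digit-down
# and whose input parameter is the clamped deflection from neutral.

class VirtualController:
    def __init__(self, max_radius=50.0):
        self.neutral = None
        self.max_radius = max_radius

    def on_digit_down(self, pos):
        self.neutral = pos            # establish neutral position

    def input_parameter(self, pos):
        """Normalized stick deflection in [-1, 1] per axis."""
        dx = (pos[0] - self.neutral[0]) / self.max_radius
        dy = (pos[1] - self.neutral[1]) / self.max_radius
        return (max(-1.0, min(1.0, dx)), max(-1.0, min(1.0, dy)))

vc = VirtualController()
vc.on_digit_down((100.0, 100.0))
print(vc.input_parameter((125.0, 100.0)))  # -> (0.5, 0.0)
```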