Patent classifications
A63F2300/6045
Method, apparatus, device, and storage medium for running stand-alone program
A method for running a stand-alone program includes: running the stand-alone program in a first main program; establishing a network connection between a first terminal and a second terminal through the first main program run in the first terminal and a second main program run in the second terminal; and, through the network connection, transmitting image information from the first terminal to the second terminal and transmitting a second operation command from the second terminal to the first terminal. In this way, a first operation object and a second operation object in the stand-alone program are separately controlled.
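The division of labor described above — the first terminal running the program and streaming frames, the second terminal sending back operation commands — can be sketched roughly as follows. All message shapes and names (`object2`, `delta`, and so on) are hypothetical; the abstract specifies no transport or message format.

```python
# Illustrative sketch only: the first terminal simulates the stand-alone
# program and emits image information; the second terminal emits operation
# commands for the second operation object. Names are assumptions.

def first_terminal_step(program, incoming_commands):
    # Apply the second terminal's commands to the second operation object.
    for cmd in incoming_commands:
        program["object2"] += cmd["delta"]
    # Produce image information describing the current frame (here, a
    # snapshot of program state stands in for rendered pixels).
    return {"frame": dict(program)}

def second_terminal_step(local_input):
    # Translate local input into an operation command for object 2.
    return {"delta": local_input}
```

In a real system the frame and command exchange would run over the established network connection; here the two steps are simply called in sequence.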
Storage medium storing information processing program, information processing apparatus, information processing system, and information processing method
It is determined whether execution is to be carried out in a first mode or in a second mode, according to a user's selection operation. In the first mode, a movement of a player object is controlled according to the user's movement operation, and a movement of a non-player object is automatically controlled. Positions of the player object and the non-player object are changed according to the user's position changing operation such that a relative positional relationship between the player object and the non-player object for use in the second mode is a first positional relationship. In the second mode, movements of the player object and the non-player object are automatically controlled while maintaining the first positional relationship.
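A minimal sketch of the two modes, assuming 2-D positions held as `[x, y]` lists; the update rules and names are illustrative, not taken from the patent. In the second mode, the non-player object is repositioned relative to the player object so that the first positional relationship is maintained.

```python
def update(mode, player, npc, user_move, auto_move):
    """Advance one step; positions and moves are [x, y] lists (assumed)."""
    if mode == 1:
        # First mode: the user moves the player object; the non-player
        # object is controlled automatically.
        player = [p + m for p, m in zip(player, user_move)]
        npc = [n + m for n, m in zip(npc, auto_move)]
    else:
        # Second mode: both objects move automatically while keeping the
        # relative offset established before the mode switch.
        offset = [n - p for n, p in zip(npc, player)]
        player = [p + m for p, m in zip(player, auto_move)]
        npc = [p + o for p, o in zip(player, offset)]
    return player, npc
```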
Hardware acceleration and event decisions for late latch and warp in interactive computer products
The disclosure provides features or schemes that improve a user's experience with an interactive computer product by reducing latency through late latching and late warping. The late warping can be applied by imaging hardware based on late latch inputs and is applicable for both local and cloud computing environments. In one aspect, the disclosure provides a method of operating an imaging system employing late latching and late warping. In one example the method of operating an imaging system includes: (1) rendering a rendered image based on a user input from an input device and scene data from an application engine, (2) obtaining a late latch input from the input device, (3) rendering, employing imaging hardware, a warped image by late warping at least a portion of the rendered image based on the late latch input, and (4) updating state information in the application engine with late latch and warp information.
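The four numbered steps can be pictured as a per-frame loop. Everything below is an illustrative toy: a real system renders with a GPU and applies the warp in imaging hardware, whereas this "image" merely records the viewpoint it was rendered for, so the warp reduces to an offset.

```python
def render(user_input, scene_data):
    # Step 1: render using the input sampled at the start of the frame.
    # The toy "image" records which viewpoint it was rendered for.
    return {"viewpoint": user_input, "pixels": scene_data}

def late_warp(rendered, late_input):
    # Step 3: shift the already-rendered image by the difference between
    # the render-time input and the late-latched input.
    delta = late_input - rendered["viewpoint"]
    return {"viewpoint": rendered["viewpoint"] + delta,
            "pixels": rendered["pixels"],
            "delta": delta}

def frame(app_state, input_device, scene_data):
    rendered = render(input_device(), scene_data)    # step 1: render
    late_input = input_device()                      # step 2: late latch
    warped = late_warp(rendered, late_input)         # step 3: late warp
    app_state["last_warp_delta"] = warped["delta"]   # step 4: update state
    return warped
```

The point of the pattern is that the input is sampled a second time, after rendering has begun, so the displayed image reflects fresher input than the one rendering started from.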
INTERACTIVE ENVIRONMENT WITH VIRTUAL ENVIRONMENT SPACE SCANNING
An interactive environment image may be displayed in a virtual environment space, and interaction with it may be detected within a three-dimensional space that corresponds to the virtual environment space. The interactive environment image may be three-dimensional or two-dimensional. An image is displayed to provide a visual representation of the interactive environment image, including one or more virtual objects, which may be spatially positioned. User interaction with the visual representation in the virtual environment space may be detected and, in response, the interactive environment image may be changed.
System, method, and graphical user interface for controlling an application executing on a server
A system, method, and graphical user interface for playing games and/or executing applications on a tablet-based client. One embodiment of a graphical user interface (GUI) for playing a video game on a tablet-based client device comprises a virtual controller rendered on the display of the client device, the virtual controller substantially mimicking the control provided by the thumb stick of a physical game controller and providing omnidirectional, free-form movement synchronized with the direction in which a user moves a finger on the display.
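The omnidirectional mapping from a finger position to a thumb-stick deflection might look like the sketch below. The anchor point and stick radius are hypothetical parameters, not specified by the patent.

```python
import math

def thumbstick_vector(anchor, touch, radius=50.0):
    """Map a touch position to a (dx, dy) deflection in [-1, 1] per axis.

    anchor: screen position where the virtual stick is centred (assumed).
    radius: distance at which the stick saturates (assumed).
    """
    dx, dy = touch[0] - anchor[0], touch[1] - anchor[1]
    dist = math.hypot(dx, dy)
    if dist == 0:
        return (0.0, 0.0)
    magnitude = min(dist, radius) / radius  # 0..1, saturates at the rim
    # Unit direction toward the finger, scaled by the deflection magnitude.
    return (dx / dist * magnitude, dy / dist * magnitude)
```

Because direction and magnitude are computed from the raw offset rather than snapped to axes, the control is free-form and omnidirectional, as the abstract describes.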
INTERACTIVE VIDEO SYSTEM AND A METHOD OF CONTROLLING AN INTERACTIVE VIDEO SYSTEM
An interactive video system includes: a display arranged to display a video; a motion sensor and/or sound sensor integrated with the display; and a controller responsive to detection of a signal from the motion sensor, generated when the display is moved, to cause a change in the displayed video.
Storage medium storing game program and game apparatus
A computer generates a three-dimensional space and the images to be shown on a display device. The computer: sets a first angle of view of a virtual camera; displays an image in accordance with the first angle of view; detects a position on the displayed image pointed to by an input device; calculates a straight line in the three-dimensional space passing through the detected position and the virtual camera; identifies an object intersecting the straight line; automatically sets a second angle of view of the virtual camera so as to zoom in on the identified object; and displays the identified object on the display device from the perspective of the second angle of view.
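The pick-then-zoom flow above can be sketched as casting a ray from the camera through the pointed position, testing it against scene objects, and narrowing the angle of view to frame the first hit. Sphere-shaped objects and the zoom heuristic are illustrative assumptions.

```python
import math

def ray_hits_sphere(origin, direction, center, radius):
    # Standard ray-sphere test: substitute the ray into the sphere
    # equation and check the discriminant of the resulting quadratic.
    oc = [o - c for o, c in zip(origin, center)]
    a = sum(d * d for d in direction)
    b = 2 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    return b * b - 4 * a * c >= 0

def pick_and_zoom(camera_pos, ray_dir, objects, first_fov=60.0):
    """Return (picked_object_or_None, angle_of_view_in_degrees)."""
    for obj in objects:
        if ray_hits_sphere(camera_pos, ray_dir, obj["center"], obj["radius"]):
            # Second angle of view: just wide enough to frame the object
            # at its distance (an assumed zoom heuristic).
            dist = math.dist(camera_pos, obj["center"])
            return obj, 2 * math.degrees(math.atan2(obj["radius"], dist))
    return None, first_fov
```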
Visual target tracking
A method of tracking a target includes classifying a pixel having a pixel address with one or more pixel cases. The pixel is classified based on one or more observed or synthesized values. An example of an observed value for a pixel address includes an observed depth value obtained from a depth camera. Examples of synthesized values for a pixel address include a synthesized depth value calculated by rasterizing a model of the target; one or more body-part indices estimating a body part corresponding to that pixel address; and one or more player indices estimating a target corresponding to that pixel address. One or more force vectors are calculated for the pixel based on the pixel case, and the force vector is mapped to one or more force-receiving locations of the model representing the target to adjust the model representing the target into an adjusted pose.
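One way to picture the pixel cases and forces: compare the observed depth and the synthesized depth at each pixel address, classify the mismatch, and emit a corrective force along the depth axis. The case names, threshold, and force formula below are illustrative assumptions, not the patent's actual pixel cases.

```python
def classify_pixel(observed_z, synthesized_z, tolerance=0.05):
    # Assumed pixel cases based on depth agreement; None means "no value
    # at this pixel address" (nothing observed, or model not rasterized).
    if synthesized_z is None:
        return "grow"     # model missing where the target was observed
    if observed_z is None:
        return "shrink"   # model present where nothing was observed
    if abs(observed_z - synthesized_z) <= tolerance:
        return "refine"   # small mismatch: nudge the model
    return "pull"         # large mismatch: stronger correction

def force_vector(observed_z, synthesized_z, case):
    # Toy force along the camera's depth axis, proportional to the
    # mismatch; a real system maps this to force-receiving locations
    # on the model to adjust its pose.
    if case in ("refine", "pull"):
        return (0.0, 0.0, observed_z - synthesized_z)
    return (0.0, 0.0, 0.0)
```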
Method for realizing user interface using camera and mobile communication terminal for the same
A method for realizing a user interface using a camera module, and a mobile communication terminal for the same. If a user makes a predetermined motion while the camera module of the mobile communication terminal is activated, the terminal recognizes the motion, converts it into a motion pattern, and performs a predetermined action according to that pattern. The action performed according to the motion pattern corresponds to mouse control in a mouse mode, game control in a game mode, and character input in a character input mode.
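The mode-dependent mapping from a recognised motion pattern to an action could be as simple as a per-mode dispatch table. The pattern and action names below are invented for illustration.

```python
# Hypothetical per-mode dispatch of recognised motion patterns to actions.
ACTIONS = {
    "mouse": {"swipe_left": "move_cursor_left", "circle": "click"},
    "game":  {"swipe_left": "turn_left",        "circle": "jump"},
    "input": {"swipe_left": "delete_char",      "circle": "letter_o"},
}

def act(mode, pattern):
    # Unrecognised modes or patterns fall through to a no-op.
    return ACTIONS.get(mode, {}).get(pattern, "ignore")
```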
Storage medium having information processing program stored therein, information processing device, and coordinate calculation method
A touch panel detects a point in one of a plurality of unit areas at which an input was made, the unit areas being arranged in a matrix in an instruction plane. A game apparatus repeatedly acquires detection coordinates for locating a unit area detected by a pointing device. Also, the game apparatus repeatedly calculates, in response to the acquisition of the detection coordinates, detailed coordinates by which a point can be represented with accuracy in more detail than by the detection coordinates. The detailed coordinates indicate a point in the direction of a unit area indicated by previously acquired detection coordinates, as viewed from a predetermined reference point within a unit area indicated by currently acquired detection coordinates.
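The detailed-coordinate calculation can be sketched as follows: the detailed point lies within the currently detected unit area, offset from its centre (the predetermined reference point) in the direction of the previously detected unit area. The cell size and offset fraction are illustrative assumptions.

```python
CELL = 8          # unit-area size in pixels (assumed)
OFFSET = 0.25     # fraction of a cell to shift toward the previous area

def detailed_coords(prev_cell, curr_cell):
    """Sub-cell coordinates from two successive unit-area detections."""
    # Reference point: centre of the current unit area.
    cx = curr_cell[0] * CELL + CELL / 2
    cy = curr_cell[1] * CELL + CELL / 2
    # Direction from the current unit area toward the previous one.
    dx = prev_cell[0] - curr_cell[0]
    dy = prev_cell[1] - curr_cell[1]
    norm = max(abs(dx), abs(dy), 1)
    # Shift the reference point toward the previous unit area.
    return (cx + dx / norm * CELL * OFFSET,
            cy + dy / norm * CELL * OFFSET)
```

The effect is that the reported point resolves finer than the unit-area grid: repeated inputs along a stroke bias each reading toward where the finger came from.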