Patent classifications
A63F13/56
DIGITAL CHARACTER WITH DYNAMIC INTERACTIVE BEHAVIOR
A virtual experience system provides dynamically interactive virtual characters in an environment. The system may be built in a modular fashion, enabling new behaviors and functionality to be added without significant alterations to the existing parts of the system. For example, behavior may be modeled using a model with multiple layers. Each layer determines responses to external stimuli based on one or more factors, with the behavior of a character being determined based on a combination of the layers of the model.
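The layered behavior model described above can be sketched as follows. This is a minimal illustration, not the patented implementation: the layer names, the stimulus encoding, and the weighted-sum combination rule are all assumptions chosen to show how per-layer responses to external stimuli could be combined into one character behavior.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class BehaviorLayer:
    """One layer of the model: maps an external stimulus to action scores."""
    name: str
    weight: float
    response_fn: Callable[[dict], dict]  # stimulus -> {action: score}

def combined_behavior(layers, stimulus):
    """Blend every layer's response and pick the highest-scoring action."""
    scores = {}
    for layer in layers:
        for action, value in layer.response_fn(stimulus).items():
            scores[action] = scores.get(action, 0.0) + layer.weight * value
    return max(scores, key=scores.get)

# Two illustrative layers: a reflex layer reacting to threat, a mood
# layer reacting to friendliness. A new layer can be appended to the
# list without altering the existing ones, matching the modular design.
reflex = BehaviorLayer("reflex", 0.7, lambda s: {"flee": s.get("threat", 0.0)})
mood = BehaviorLayer("mood", 0.3, lambda s: {"greet": s.get("friendliness", 0.0)})

print(combined_behavior([reflex, mood], {"threat": 0.9, "friendliness": 0.2}))
# prints: flee
```

Adding a behavior here means adding one `BehaviorLayer` to the list, which mirrors the abstract's claim that new functionality can be added without significant alterations to existing parts.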
Computing images of dynamic scenes
Computing an output image of a dynamic scene. A value of a parameter E is selected that describes the desired dynamic content of the scene in the output image. Using selected intrinsic camera parameters and a selected viewpoint, the method computes, for individual pixels of the output image to be generated, a ray that goes from a virtual camera through the pixel into the dynamic scene. For individual ones of the rays, at least one point is sampled along the ray. For individual ones of the sampled points, a machine learning model is queried with the sampled point, a viewing direction (being the direction of the corresponding ray), and E, to produce colour and opacity values at the sampled point with the dynamic content of the scene as specified by E. For individual ones of the rays, a volume rendering method is applied to the colour and opacity values computed along that ray to produce a pixel value of the output image.
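The ray-marching pipeline in this abstract can be sketched with a stub in place of the machine learning model. Everything model-specific here is an assumption: `query_model` is a toy density field whose centre moves with E, and the sample count and near/far bounds are arbitrary. Only the structure (ray per pixel, samples per ray, colour/opacity query per sample, alpha compositing per ray) follows the abstract.

```python
import numpy as np

def query_model(point, direction, E):
    """Stand-in for the learned model: returns (colour, density) at a point.
    Density falls off with distance from a centre whose x-position is E."""
    centre = np.array([E, 0.0, 2.0])
    density = np.exp(-np.sum((point - centre) ** 2))
    colour = np.array([1.0, 0.5, 0.0])  # constant orange for the sketch
    return colour, density

def render_pixel(origin, direction, E, n_samples=32, near=0.5, far=4.0):
    """March one ray through the scene and alpha-composite the samples."""
    ts = np.linspace(near, far, n_samples)
    dt = ts[1] - ts[0]
    pixel = np.zeros(3)
    transmittance = 1.0
    for t in ts:
        point = origin + t * direction
        colour, density = query_model(point, direction, E)
        alpha = 1.0 - np.exp(-density * dt)  # standard volume-rendering alpha
        pixel += transmittance * alpha * colour
        transmittance *= 1.0 - alpha
    return pixel

# One ray from a camera at the origin, straight down the z-axis, E = 0.
pixel = render_pixel(np.zeros(3), np.array([0.0, 0.0, 1.0]), E=0.0)
```

Changing E moves the dynamic content (here, the toy density centre) without changing the camera or the rendering loop, which is the role E plays in the claim.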
Virtual vehicle control method in virtual scene, computer device, and storage medium
A virtual vehicle control method in a virtual scene, performed by a terminal, is provided. The method includes providing a display interface of an application program, the display interface including a scene picture of the virtual scene, and the virtual scene including a virtual vehicle; obtaining a moving speed of the virtual vehicle; and adjusting, based on the moving speed of the virtual vehicle being greater than a moving speed threshold, the scene picture to a picture of the virtual vehicle being observed in the virtual scene by using a camera model in a predetermined viewing angle direction, the camera model being located at a position with respect to the virtual vehicle.
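The control flow of this claim reduces to a threshold test on the vehicle's moving speed. The sketch below is illustrative only: the threshold value, the camera offset, and the pitch angle are invented stand-ins for the "predetermined viewing angle direction" and the camera model's position with respect to the vehicle.

```python
SPEED_THRESHOLD = 20.0            # assumed speed units per second
CHASE_OFFSET = (0.0, 3.0, -8.0)   # assumed camera position relative to vehicle
CHASE_PITCH_DEG = -15.0           # assumed predetermined viewing angle

def select_camera(vehicle_speed, default_camera, chase_camera):
    """Return the camera used to observe the virtual vehicle: switch to
    the predetermined chase camera when the speed exceeds the threshold."""
    if vehicle_speed > SPEED_THRESHOLD:
        return chase_camera
    return default_camera

camera = select_camera(25.0, "default", "chase")  # fast vehicle -> chase view
```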
VIRTUAL CHARACTER INTERACTION METHOD AND APPARATUS, COMPUTER DEVICE, AND STORAGE MEDIUM
Disclosed is a virtual character interaction method, including: displaying a viewing-only picture of a virtual character interaction scene; switching from the viewing-only picture to a virtual character simulation interaction interface in response to an interaction simulation operation triggered during displaying of the viewing-only picture; simulating, in the virtual character simulation interaction interface, virtual characters in the virtual character interaction scene, and displaying a scene status of interaction between the virtual characters, wherein the virtual characters include a simulated virtual character controlled by a user of the computer device; and controlling the simulated virtual character to perform a corresponding action in response to a control operation triggered on the virtual character simulation interaction interface by the user of the computer device.
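The interaction flow above is essentially a two-state machine: a viewing-only state, a switch triggered by the interaction-simulation operation, and control operations that are honoured only on the simulation interface. The state names and event vocabulary below are assumptions for illustration.

```python
class InteractionClient:
    """Minimal sketch of the viewing-only -> simulation interaction flow."""

    def __init__(self):
        self.state = "viewing_only"
        self.actions = []  # actions performed by the simulated character

    def trigger_simulation(self):
        """Interaction simulation operation: switch to the simulation UI."""
        if self.state == "viewing_only":
            self.state = "simulation"

    def control(self, action):
        """Control operations act only on the simulation interface."""
        if self.state == "simulation":
            self.actions.append(action)

client = InteractionClient()
client.control("jump")       # ignored: still in the viewing-only picture
client.trigger_simulation()  # user triggers the interaction simulation
client.control("jump")       # performed by the simulated virtual character
```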
SKELETON MODEL UPDATING APPARATUS, SKELETON MODEL UPDATING METHOD, AND PROGRAM
Provided are a skeleton model updating apparatus, a skeleton model updating method, and a program by which time and effort for changing the pose of a skeleton model to a known standard pose can be reduced. A target node identifying section (80) identifies a plurality of target nodes from among a plurality of nodes included in a skeleton model that is in a pose other than a known standard pose. A reference node identifying section (82) identifies a reference node that is positioned closest to the side of the plurality of target nodes, from among nodes that are connected to all of the target nodes via one or more bones. A position deciding section (84) decides positions of the plurality of target nodes such that relative positions of the plurality of target nodes with respect to the position of the reference node are adjusted to predetermined positions. A pose updating section (56) updates the pose of the skeleton model to the known standard pose on the basis of the decided positions of the target nodes.
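The position-deciding step (section 84) can be sketched as placing each target node at a predetermined offset from the reference node. The node names and offset values below are illustrative assumptions, not values from the patent; only the rule, target position = reference position + predetermined relative position, follows the abstract.

```python
import numpy as np

# Predetermined relative positions of the target nodes with respect to
# the reference node in the known standard pose (assumed values).
STANDARD_OFFSETS = {
    "left_hand": np.array([-0.6, 0.0, 0.0]),
    "right_hand": np.array([0.6, 0.0, 0.0]),
}

def decide_positions(reference_pos, standard_offsets):
    """Adjust each target node so its position relative to the reference
    node matches the predetermined offset for the standard pose."""
    return {node: reference_pos + off for node, off in standard_offsets.items()}

# Reference node (e.g. a chest node) at an arbitrary assumed position.
positions = decide_positions(np.array([0.0, 1.5, 0.0]), STANDARD_OFFSETS)
```

A pose updater would then solve the remaining joints against these decided target positions, which is what saves the manual effort of re-posing the skeleton by hand.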
INFORMATION PROCESSING APPARATUS, INFORMATION PROCESSING APPARATUS CONTROL METHOD, AND PROGRAM
Methods and apparatus provide for acquiring, from a plurality of sensors attached to a plurality of body parts of a target person, at least information regarding movement acceleration and posture angular velocity of each of body parts to which the sensors are attached, and estimating, on the basis of the acquired information regarding the movement acceleration and posture angular velocity, the movement velocity in a predetermined coordinate system of each of the body parts to which the sensors are attached. Subsequently, on the basis of the information regarding the estimated movement velocity of each of the body parts, the methods and apparatus provide for estimating the positions of predetermined body parts of the target person.
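One estimation step for a single body-worn sensor can be sketched as: update the orientation from the posture angular velocity, rotate the measured acceleration into the world coordinate system, remove gravity, and integrate to get movement velocity. This is a bare dead-reckoning sketch under stated assumptions (first-order integration, no drift correction); a full tracker would also fuse constraints across body parts.

```python
import numpy as np

GRAVITY = np.array([0.0, 0.0, -9.81])  # world-frame gravity, m/s^2

def update_rotation(rotation, omega_body, dt):
    """First-order orientation update from posture angular velocity."""
    wx, wy, wz = omega_body * dt
    skew = np.array([[0.0, -wz, wy],
                     [wz, 0.0, -wx],
                     [-wy, wx, 0.0]])
    return rotation @ (np.eye(3) + skew)

def update_velocity(velocity, rotation, specific_force_body, dt):
    """One velocity-integration step. Accelerometers measure specific
    force (acceleration minus gravity), so the true world-frame
    acceleration is R @ f_body + g."""
    accel_world = rotation @ specific_force_body + GRAVITY
    return velocity + accel_world * dt
```

For a stationary sensor with identity orientation, the accelerometer reads +9.81 m/s² upward, gravity cancels it, and the estimated movement velocity stays at zero, as it should.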
INFORMATION PROCESSING DEVICE, INFORMATION PROCESSING METHOD, AND PROGRAM
The present invention enables both a player and a third party other than the player to enjoy a game. A server 4 comprises a game execution unit 51, a behavior information acquisition unit 53, and a reflection unit 54. The game execution unit 51 executes a game in which an object is caused to act in accordance with the movements of a player P. The behavior information acquisition unit 53 acquires behavior information pertaining to the behavior of a third party related to the game other than the player P. The reflection unit 54 generates changes that affect the game, including the object, on the basis of the behavior information.
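The three server units named above can be sketched as methods of one class. The cheer-counting rule and the speed boost are invented examples of "behavior information" and a "change that affects the game"; only the unit structure follows the abstract.

```python
class GameServer:
    """Sketch of server 4: game execution, behavior acquisition, reflection."""

    def __init__(self):
        self.object_speed = 1.0  # state of the object driven by the player

    def execute_game(self, player_movement):
        """Game execution unit (51): the object acts per the player's movement."""
        return player_movement * self.object_speed

    def acquire_behavior(self, third_party_events):
        """Behavior information acquisition unit (53): here, count cheers
        from spectators (an assumed kind of third-party behavior)."""
        return sum(1 for event in third_party_events if event == "cheer")

    def reflect(self, cheer_count):
        """Reflection unit (54): third-party behavior changes the game."""
        self.object_speed += 0.1 * cheer_count

server = GameServer()
server.reflect(server.acquire_behavior(["cheer", "clap", "cheer"]))
```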