Patent classifications
A63F13/537
PROGRAM, METHOD, ELECTRONIC DEVICE, AND SYSTEM FOR GAME INVOLVING MULTI-BATTLE
Provided are a program and related method, device, and system for further encouraging a player to play a game. The program is for a game including a multi-battle in which a plurality of players battle against a common enemy character, and causes a computer to execute: a step of accepting a multi-battle initiation input from a player; a step of initiating a multi-battle against a predetermined enemy character on the basis of the accepted initiation input; a step of accepting, from the player, a participation request input that requests another player to participate in the initiated multi-battle; and a step of executing a process for inflicting damage on the enemy character upon acceptance of the participation request input.
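The claimed flow, in which merely sending a participation request damages the enemy, could be sketched as follows. This is purely illustrative: the damage value, HP, and all names are assumptions, as the abstract specifies none of them.

```python
from dataclasses import dataclass, field

@dataclass
class Enemy:
    hp: int

@dataclass
class MultiBattle:
    enemy: Enemy
    players: list = field(default_factory=list)

# Assumed value: the abstract does not say how much damage a
# participation request inflicts.
PARTICIPATION_REQUEST_DAMAGE = 50

def start_multi_battle(player_id: str, enemy_hp: int = 1000) -> MultiBattle:
    """Initiate a multi-battle against a predetermined enemy character."""
    return MultiBattle(enemy=Enemy(hp=enemy_hp), players=[player_id])

def accept_participation_request(battle: MultiBattle) -> None:
    """Accepting the participation request itself damages the enemy,
    rewarding the player for inviting others to join."""
    battle.enemy.hp = max(0, battle.enemy.hp - PARTICIPATION_REQUEST_DAMAGE)

battle = start_multi_battle("player1")
accept_participation_request(battle)
```

The notable design point is that the damage step is tied to the invitation itself, not to another player actually joining.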
INFORMATION PROCESSING SYSTEM, INFORMATION PROCESSING METHOD, AND NON-TRANSITORY COMPUTER READABLE MEDIUM
An information processing system comprises a computer configured to execute: setting a first area and a second area as areas in which game media of a first player are disposed, the second area being different from the first area; setting a third area and a fourth area as areas in which game media of a second player are disposed, the fourth area being different from the third area; displaying the game media disposed in the first area, the second area, and the fourth area in an identifiable manner; displaying the game media disposed in the third area in an unidentifiable manner; setting a game medium disposed in the first area as a first base medium; setting a game medium disposed in an area where game media of the first player are disposed as a first material medium, the area being an area other than the second area; and fusing the first material medium with the first base medium to change the ability of the first base medium, while deleting the first material medium or changing the ability thereof.
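The area layout and fusion step could be sketched as below. The interpretation of the areas (field vs. hand), the stat-bonus rule, and the choice to delete rather than modify the material medium are all assumptions for illustration.

```python
from dataclasses import dataclass

@dataclass
class GameMedium:
    name: str
    power: int

# Assumed layout: areas 1 and 2 hold the first player's media, areas 3
# and 4 the second player's; only area 3 (e.g. the opponent's hand) is
# displayed in an unidentifiable manner.
areas = {
    1: [GameMedium("knight", 3), GameMedium("squire", 2)],  # player 1, visible
    2: [GameMedium("ogre", 2)],                             # player 1, visible
    3: [GameMedium("dragon", 5)],                           # player 2, hidden
    4: [GameMedium("slime", 1)],                            # player 2, visible
}
VISIBLE_AREAS = {1, 2, 4}

def fuse(base_idx: int, material_area: int, material_idx: int) -> None:
    """Fuse a material medium into a base medium in area 1, boosting the
    base's ability and deleting the material. The material must come from
    a first-player area other than area 2, per the abstract."""
    assert material_area != 2
    base = areas[1][base_idx]
    material = areas[material_area].pop(material_idx)
    base.power += material.power

fuse(base_idx=0, material_area=1, material_idx=1)
```

Here the knight absorbs the squire's power and the squire is removed from play, matching the delete-on-fuse branch of the claim.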
PROGRAM, METHOD, ELECTRONIC DEVICE, AND SYSTEM
An electronic device 10 is an electronic device that includes a display device 13 and that executes a game, the electronic device having: a game control unit 23 that executes content of the game and that makes the display device 13 display a screen related to the content; and a determination unit 24 that determines whether a release condition for allowing second content different from first content to be executable has been satisfied, on the basis of execution of the first content performed by the game control unit 23. The display device 13 displays a screen related to the second content while displaying a main screen G1 related to the first content, in the case where the determination unit 24 determines that the release condition has been satisfied.
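The release-condition check could be sketched as follows. The concrete condition (a number of clears of the first content) is an assumption; the abstract leaves it unspecified.

```python
class GameControl:
    """Executes the first content and tracks progress. Clearing the
    first content three times is an assumed release condition."""
    REQUIRED_CLEARS = 3

    def __init__(self):
        self.clears = 0

    def play_first_content(self) -> None:
        self.clears += 1

    def release_condition_met(self) -> bool:
        return self.clears >= self.REQUIRED_CLEARS

def screens_to_display(game: GameControl) -> list:
    """Main screen G1 is always shown; once the release condition is
    satisfied, the second-content screen is shown alongside it."""
    screens = ["main_screen_G1"]
    if game.release_condition_met():
        screens.append("second_content_screen")
    return screens

game = GameControl()
for _ in range(3):
    game.play_first_content()
```

The key behavior is that the second-content screen appears while the main screen stays displayed, rather than replacing it.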
PROGRAM, METHOD, ELECTRONIC DEVICE, AND SYSTEM
An electronic device 10 is an electronic device that includes a display device 13 and that executes a game, the electronic device having: a game control unit 23 that executes content of the game and that makes the display device 13 display a screen related to the content; and a determination unit 24 that determines whether a release condition for allowing second content different from first content to be executable has been satisfied, on the basis of execution of the first content performed by the game control unit 23. The display device 13 displays a screen related to the second content while displaying a main screen G1 related to the first content, in the case where the determination unit 24 determines that the release condition has been satisfied.
HMD transitions for focusing on specific content in virtual-reality environments
Methods and systems for presenting an object on a screen of a head mounted display (HMD) include receiving an image of a real-world environment in proximity to a user wearing the HMD. The image is received from one or more forward-facing cameras of the HMD and processed for rendering on a screen of the HMD by a processor within the HMD. A gaze direction of the user wearing the HMD is detected using one or more gaze-detecting cameras of the HMD that are directed toward one or both eyes of the user. Images captured by the forward-facing cameras are analyzed to identify an object in the real-world environment that is in line with the gaze direction of the user, wherein the image of the object is rendered at a first virtual distance that causes the object to appear out of focus when presented to the user. A signal is generated to adjust a zoom factor for the lens of the one or more forward-facing cameras so as to bring the object into focus. The adjustment of the zoom factor causes the image of the object to be presented on the screen of the HMD at a second virtual distance at which the object is discernible by the user.
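The gaze-target selection and zoom adjustment could be sketched geometrically as below. The angular tolerance, the target focus distance, and the zoom rule are assumptions; real HMD pipelines use calibrated camera models.

```python
import math

def gaze_target(objects, gaze_dir, eye=(0.0, 0.0, 0.0), tol_deg=5.0):
    """Return the object most closely in line with the gaze direction.
    gaze_dir is assumed to be a unit vector; tol_deg is an assumed
    angular tolerance."""
    best_ang, best = None, None
    for obj in objects:
        v = [p - e for p, e in zip(obj["pos"], eye)]
        norm = math.sqrt(sum(c * c for c in v))
        cos = sum(a * b for a, b in zip(v, gaze_dir)) / norm
        ang = math.degrees(math.acos(max(-1.0, min(1.0, cos))))
        if ang <= tol_deg and (best_ang is None or ang < best_ang):
            best_ang, best = ang, obj
    return best

def zoom_factor(obj, focus_distance=2.0):
    """Zoom needed to bring the object from its actual distance to an
    assumed comfortable virtual distance."""
    actual = math.sqrt(sum(c * c for c in obj["pos"]))
    return actual / focus_distance

scene = [{"name": "sign", "pos": (0.0, 0.0, 10.0)},
         {"name": "tree", "pos": (5.0, 0.0, 10.0)}]
target = gaze_target(scene, (0.0, 0.0, 1.0))
```

Here the sign lies directly along the gaze ray, so it is selected and a 5x zoom would present it at the 2 m virtual focus distance.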
User immersion context-based notifications on a user display
Techniques for managing notifications are described. In an example, a computing device receives notification data. The computing device compares a first context associated with the notification data and a second context associated with an application executing in an application window on a display. The computing device determines whether the notification data is to be presented in the application window or is to be sent to a queue based on the comparison, the queue storing other notification data corresponding to another notification. The computing device also presents, based on the comparison, at least the notification data or a notification in the application window over at least a portion of content from execution of the application while a presentation of the content continues, the notification distinct from the notification data and indicating that the notification data is presented.
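The present-or-queue decision could be sketched as below. The exact comparison rule (context match or urgency overrides immersion) is an assumption; the abstract only says the decision is based on comparing the two contexts.

```python
from collections import deque

def route_notification(notification, app_context, queue):
    """Present the notification in the application window when its
    context matches the running application's context (or it is urgent);
    otherwise send it to the queue. The urgency override is assumed."""
    if notification["urgent"] or notification["context"] == app_context:
        return "present_in_window"
    queue.append(notification)
    return "queued"

queue = deque()
chat = {"context": "game", "urgent": False}
email = {"context": "work", "urgent": False}
alarm = {"context": "system", "urgent": True}

results = [route_notification(n, "game", queue) for n in (chat, email, alarm)]
```

With the app in a "game" context, the matching chat message and the urgent alarm are presented over the running content, while the work email is queued without interrupting the user's immersion.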
AUGMENTED REALITY PLACEMENT FOR USER FEEDBACK
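A purely illustrative sketch of placing an artificial intelligence element (AIE) near the real-world object a user's hand is reaching toward: the ray-distance heuristic, object positions, and AIE offset are all assumptions.

```python
import math

def object_in_reach_direction(objects, hand_pos, reach_dir):
    """Pick the real-world object nearest to the ray cast from the hand
    along reach_dir (assumed to be a unit vector)."""
    best, best_dist = None, float("inf")
    for obj in objects:
        rel = [o - h for o, h in zip(obj["pos"], hand_pos)]
        t = sum(r * d for r, d in zip(rel, reach_dir))
        if t <= 0:
            continue  # object is behind the reaching hand
        closest = [h + t * d for h, d in zip(hand_pos, reach_dir)]
        dist = math.dist(closest, obj["pos"])
        if dist < best_dist:
            best, best_dist = obj, dist
    return best

def place_aie(obj, offset=(0.0, 0.1, 0.0)):
    """Render an AIE proximate to the target object (offset assumed)."""
    return {"anchor": obj["name"],
            "pos": tuple(p + o for p, o in zip(obj["pos"], offset))}

room = [{"name": "cup", "pos": (0.1, 0.0, 1.0)},
        {"name": "lamp", "pos": (2.0, 0.0, 1.0)}]
target = object_in_reach_direction(room, (0.0, 0.0, 0.0), (0.0, 0.0, 1.0))
aie = place_aie(target)
```

The AIE is anchored to the cup because it lies nearest the hand's reach ray, matching the claim that elements are rendered proximate to the object the hand is detected to be reaching toward.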
Methods and systems are provided for generating augmented reality (AR) scenes where the AR scenes include one or more artificial intelligence elements (AIEs) that are rendered as visual objects in the AR scenes. The method includes generating an AR scene for rendering on a display; the AR scene includes a real-world space and virtual objects projected in the real-world space. The method includes analyzing a field of view into the AR scene; the analyzing is configured to detect an action by a hand of the user when reaching into the AR scene. The method includes generating one or more AIEs rendered as virtual objects in the AR scene, each AIE is configured to provide a dynamic interface that is selectable by a gesture of the hand of the user. In one embodiment, each of the AIEs is rendered proximate to a real-world object present in the real-world space; the real-world object is located in a direction of where the hand of the user is detected to be reaching when the user makes the action by the hand.