G06F3/04815

INFORMATION PROCESSING DEVICE, INFORMATION PROCESSING TERMINAL, AND PROGRAM

An information processing device is provided which includes: a first acquisition unit (214) configured to acquire a control command that is input by a first user and corresponds to presentation unit information designating a presentation unit for presenting a tactile stimulus by a tactile presentation device, and to mode information designating a mode of the tactile stimulus; a generation unit (218) configured to generate a tactile control signal for presenting the tactile stimulus to the presentation unit in accordance with the control command; and a first distribution unit (222) configured to distribute the tactile control signal to the tactile presentation device worn on a body of a second user. The tactile control signal corresponds to at least one of a presentation timing, frequency, interval, waveform, presentation time, and intensity of the tactile stimulus.
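
The command-to-signal flow the abstract describes (acquire a command naming a presentation unit and a mode, generate the stimulus parameters, distribute to the worn device) could be sketched as below. All field names, mode presets, and parameter values are hypothetical illustrations, not taken from the patent:

```python
from dataclasses import dataclass

@dataclass
class TactileControlCommand:
    """Control command entered by the first user (illustrative fields)."""
    presentation_unit: str   # which actuator on the tactile presentation device
    mode: str                # designated mode of the tactile stimulus

@dataclass
class TactileControlSignal:
    """The signal parameters the abstract enumerates."""
    presentation_timing_ms: int
    frequency_hz: float
    interval_ms: int
    waveform: str
    presentation_time_ms: int
    intensity: float

def generate_tactile_signal(cmd: TactileControlCommand) -> TactileControlSignal:
    """Map a designated mode to concrete stimulus parameters (made-up presets)."""
    presets = {
        "pulse": TactileControlSignal(0, 200.0, 100, "sine", 50, 0.8),
        "buzz":  TactileControlSignal(0, 80.0, 0, "square", 300, 0.5),
    }
    return presets[cmd.mode]

def distribute(signal: TactileControlSignal, device_id: str) -> dict:
    """Stand-in for the first distribution unit: package the signal for the
    tactile presentation device worn by the second user."""
    return {"device": device_id, "signal": signal}
```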

Method and Device for Managing Interactions Directed to a User Interface with a Physical Object

The method includes: displaying first graphical elements associated with a first plurality of output modalities within an XR environment; while displaying the first graphical elements, detecting movement of a physical object; and in response to detecting the movement of the physical object: in accordance with a determination that the movement of the physical object causes the physical object to breach a distance threshold relative to a first graphical element among the first graphical elements, selecting a first output modality associated with the first graphical element as a current output modality for the physical object; and in accordance with a determination that the movement of the physical object causes the physical object to breach the distance threshold relative to a second graphical element among the first graphical elements, selecting a second output modality associated with the second graphical element as the current output modality for the physical object.
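
The core selection logic (pick the modality of whichever graphical element the physical object comes within a distance threshold of) reduces to a simple nearest-breach test. The positions, modality names, and threshold below are illustrative assumptions:

```python
import math

def select_output_modality(object_pos, elements, threshold):
    """Return the output modality of the first graphical element whose
    distance to the physical object falls below the threshold, else None.

    elements: list of (position, modality) pairs, positions as 3-tuples.
    """
    for pos, modality in elements:
        if math.dist(object_pos, pos) < threshold:
            return modality
    return None
```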

DIGITAL AUDIO WORKSTATION AUGMENTED WITH VR/AR FUNCTIONALITIES
20230044356 · 2023-02-09

Embodiments of the present technology are directed at features and functionalities of a VR/AR-enabled digital audio workstation. The disclosed audio workstation can be configured to allow users to record, produce, mix, and edit audio in virtual 3D space based on detecting and manipulating human gestures in a virtual reality environment. The audio can relate to music, voice, background noise, speeches, one or more musical instruments, special effects music, electronic humming or noise from electrical/mechanical equipment, or any other type of audio.
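
The gesture-driven control the abstract mentions amounts to mapping recognized hand gestures to workstation commands. The gesture names and actions below are invented for illustration only:

```python
# Hypothetical mapping from recognized VR hand gestures to DAW actions.
GESTURE_ACTIONS = {
    "pinch_drag": "move_clip",
    "fist_hold": "record",
    "two_hand_spread": "zoom_timeline",
    "swipe_left": "undo",
}

def dispatch_gesture(gesture: str) -> str:
    """Translate a detected gesture into a workstation command,
    ignoring gestures with no assigned action."""
    return GESTURE_ACTIONS.get(gesture, "ignore")
```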

Electronic apparatus, control method for electronic apparatus, and non-transitory computer-readable storage medium
11558599 · 2023-01-17

An electronic apparatus comprising: a processor; and a memory storing a program which, when executed by the processor, causes the electronic apparatus to: acquire a VR image; read viewpoint information indicating a plurality of viewpoints with respect to the VR image; control a display device so that the part of the VR image corresponding to each viewpoint is automatically switched in order and displayed on a screen on the basis of the viewpoint information; and change the viewpoint so that a predetermined subject is included in the part of the VR image displayed on the screen.
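
The final step, changing a viewpoint so that a predetermined subject is included in the displayed part of the VR image, can be sketched as a yaw adjustment in one angular axis. The function name and the single-axis simplification are assumptions for illustration:

```python
import math

def adjust_viewpoint(view_yaw_deg, subject_yaw_deg, fov_deg):
    """Shift a viewpoint's yaw just enough that the subject's direction
    falls inside the displayed field of view (single-axis sketch)."""
    half = fov_deg / 2
    # signed angular offset of the subject from the view center, in (-180, 180]
    delta = (subject_yaw_deg - view_yaw_deg + 180) % 360 - 180
    if abs(delta) <= half:
        return view_yaw_deg  # subject already visible; leave viewpoint as-is
    # rotate toward the subject until it sits at the edge of the field of view
    return (view_yaw_deg + delta - math.copysign(half, delta)) % 360
```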

VOICE INFORMATION PROCESSING METHOD AND ELECTRONIC DEVICE
20230010969 · 2023-01-12 ·

A voice information processing method and an electronic device are provided. The voice information processing method may include: a first device (1100) obtains first voice information and, when the first voice information includes a wakeup keyword, sends a voice assistant wakeup instruction to a second device (1200), such that the second device (1200) launches a voice assistant. The first device (1100) then obtains second voice information and sends it to the second device (1200); the second device (1200) determines a voice-triggered event corresponding to the second voice information by using the voice assistant, and feeds target information associated with performance of the voice-triggered event back to the first device (1100), such that the first device (1100) performs the voice-triggered event based on the target information. The method can reduce the computing burden of the first device (1100).
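
The delegation pattern described (the first device only spots the wake word and forwards audio; the second device runs the assistant and resolves the event) could be sketched as below. Class names, the wake word, and the trivial event resolution are all hypothetical stand-ins:

```python
WAKEUP_KEYWORD = "hello assistant"  # hypothetical wake word

class SecondDevice:
    """More capable device: runs the voice assistant and resolves events."""
    def __init__(self):
        self.assistant_running = False

    def launch_assistant(self):
        self.assistant_running = True

    def resolve_event(self, text):
        # Heavy speech understanding would run here; return target
        # information for the first device to act on.
        return text.strip()

class FirstDevice:
    """Low-power device: only detects the wake word and forwards voice input."""
    def __init__(self, second_device):
        self.second = second_device

    def on_voice(self, text):
        if WAKEUP_KEYWORD in text:
            self.second.launch_assistant()       # wakeup instruction
            return None
        if self.second.assistant_running:
            target_info = self.second.resolve_event(text)
            return self.perform(target_info)     # perform the triggered event
        return None

    def perform(self, target_info):
        return f"performed: {target_info}"
```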