Matched array general talent architecture system and method
11508103 · 2022-11-22 · ·

A matched array technology system and method for displaying in a two-dimensional array the structured interactions between management and a plurality of employees in an organization. Axes contain proxy values of employee and manager expectations scaled to yield a matched array and an alignment vector containing cells representing target alignment between employee and manager expectations. A scatter plot of multiple employee positions portrays the pattern of talent alignment and distribution, representing the talent architecture for the organization. The talent architecture is characterized by multiple static and dynamic metrics that identify normative opportunities to improve organization alignment, and measure organization talent management performance, especially in relation to the reference and general alignment vectors of the array.
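The matched-array idea above can be sketched as a small script. This is an illustrative assumption, not the patented method: employees are plotted by proxy values of employee expectation and manager expectation on a shared 1-5 scale, the alignment vector counts employees in the diagonal (matched) cells, and one static metric reports the aligned fraction.

```python
# Hypothetical sketch of a matched array: each employee is a pair of proxy
# values (employee expectation, manager expectation) on a 1-5 scale; cells
# on the diagonal represent target alignment between the two expectations.
# Function names and the alignment rule are illustrative assumptions.

def alignment_vector(employees, scale=5):
    """Count employees per diagonal cell (employee expectation == manager expectation)."""
    diag = [0] * scale
    for emp_exp, mgr_exp in employees:
        if emp_exp == mgr_exp:
            diag[emp_exp - 1] += 1
    return diag

def alignment_ratio(employees):
    """Static metric: fraction of employees whose expectations match."""
    if not employees:
        return 0.0
    aligned = sum(1 for e, m in employees if e == m)
    return aligned / len(employees)

staff = [(3, 3), (4, 2), (5, 5), (2, 3), (4, 4)]
print(alignment_vector(staff))  # [0, 0, 1, 1, 1]
print(alignment_ratio(staff))   # 0.6
```

A scatter of the same pairs over the full array would portray the talent distribution the abstract describes; the diagonal counts are only the alignment vector.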

Smart phones for motion capture

A series of smart phones are mounted on respective tripods to capture motion of a person wearing markers, such as marker balls or reflectors. The videos from the phones are stripped of objects other than the markers, and the videos of the markers are combined to render a 3D motion capture structure that may be applied to an image of a VR icon to cause the VR icon to move as the person originally moved.
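The view-combining step can be sketched under a simplifying assumption the abstract does not state: with one phone facing the subject and a second at 90 degrees, a marker's front-view pixel gives (x, y) and the side-view pixel gives (z, y), so merging the two 2D tracks yields a rough 3D trajectory under an orthographic-camera assumption. Real multi-camera capture would calibrate and triangulate instead.

```python
# Illustrative sketch (not the patented pipeline): merge per-frame marker
# positions from two orthogonal phone views into a rough 3D track.

def merge_views(front_track, side_track):
    """front_track: list of (x, y) pixels; side_track: list of (z, y) pixels per frame."""
    frames = []
    for (x, yf), (z, ys) in zip(front_track, side_track):
        y = (yf + ys) / 2.0  # average the two height estimates
        frames.append((x, y, z))
    return frames

front = [(100, 200), (102, 198)]
side = [(50, 202), (51, 199)]
print(merge_views(front, side))  # [(100, 201.0, 50), (102, 198.5, 51)]
```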

Information processing apparatus that changes viewpoint in virtual viewpoint image during playback, information processing method and storage medium
11494067 · 2022-11-08 · ·

The information processing apparatus of the present invention is an information processing apparatus that outputs viewpoint information for generation of a virtual viewpoint image based on image data obtained by performing image capturing from directions different from one another by a plurality of image capturing apparatuses and comprises: an acquisition unit configured to acquire viewpoint information having a plurality of virtual viewpoint parameter sets respectively indicating positions and orientations of a virtual viewpoint at a plurality of points in time; a change unit configured to change a virtual viewpoint parameter set included in the viewpoint information based on a user operation during playback of a virtual viewpoint image in accordance with viewpoint information acquired by the acquisition unit; and an output unit configured to output viewpoint information having a virtual viewpoint parameter set changed by the change unit.
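The acquire-change-output flow in the abstract can be sketched with a minimal data structure. The field names and the edit operation below are assumptions for illustration; the patent only requires that viewpoint information hold parameter sets (position and orientation per point in time) and that one set be changeable based on a user operation during playback.

```python
# Minimal sketch of viewpoint information: a list of virtual viewpoint
# parameter sets, one of which is changed in response to a user operation.
from dataclasses import dataclass, replace
from typing import List, Optional, Tuple

@dataclass(frozen=True)
class ViewpointParams:
    time: float                                   # point in time in the sequence
    position: Tuple[float, float, float]          # virtual viewpoint position
    orientation: Tuple[float, float, float]       # e.g. pan, tilt, roll

def change_viewpoint(info: List[ViewpointParams], index: int,
                     new_position: Optional[Tuple[float, float, float]] = None,
                     new_orientation: Optional[Tuple[float, float, float]] = None
                     ) -> List[ViewpointParams]:
    """Return viewpoint information with one parameter set changed."""
    edited = list(info)
    updates = {}
    if new_position is not None:
        updates["position"] = new_position
    if new_orientation is not None:
        updates["orientation"] = new_orientation
    edited[index] = replace(edited[index], **updates)
    return edited

info = [ViewpointParams(0.0, (0, 0, 0), (0, 0, 0)),
        ViewpointParams(1.0, (1, 0, 0), (0, 0, 0))]
out = change_viewpoint(info, 1, new_position=(2, 0, 0))
print(out[1].position)  # (2, 0, 0)
```

The output unit of the abstract would then emit `out` in place of the originally acquired `info`; the frozen dataclass keeps the acquired information unchanged.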

Systems and methods for lighting subjects for artificial reality scenes

A computer-implemented method for lighting subjects for artificial reality scenes may include (i) identifying (a) a physical camera configured to capture a physical subject for insertion into an artificial reality scene, (b) a physical light source that is positioned such that the physical light source lights the physical subject recorded by the physical camera, and (c) lighting conditions in the artificial reality scene, (ii) determining at least one lighting parameter to light the physical subject such that lighting conditions of the physical subject blend visually with the lighting conditions in the artificial reality scene, and (iii) configuring the physical light source to light the physical subject according to the at least one lighting parameter. Various other methods, systems, and computer-readable media are also disclosed.
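Step (ii), determining a lighting parameter so the physical subject blends with the artificial-reality scene, can be sketched with an assumed blending rule: drive the physical light with the scene lights' average color, scaled by an intensity factor. The rule and function names are illustrative, not the claimed method.

```python
# Hedged sketch of step (ii): derive one lighting parameter (an RGB drive
# level for the physical light) from the virtual scene's lighting conditions.
# Averaging the scene light colors is an assumption for illustration.

def lighting_parameter(scene_lights, intensity=1.0):
    """scene_lights: list of (r, g, b) tuples, 0-255. Returns one RGB drive level."""
    if not scene_lights:
        return (0, 0, 0)
    n = len(scene_lights)
    avg = tuple(sum(c[i] for c in scene_lights) / n for i in range(3))
    return tuple(min(255, round(ch * intensity)) for ch in avg)

scene = [(200, 180, 160), (100, 120, 140)]
print(lighting_parameter(scene))  # (150, 150, 150)
```

Step (iii) of the abstract would then configure the physical light source with the returned value.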

ANIMATION PRODUCTION SYSTEM
20230089238 · 2023-03-23 ·

To enable a user to create animations in a virtual space, an animation production system comprises: a virtual camera that shoots a character placed in a virtual space; a user input detection unit that detects an input of a user from at least one of a head mounted display and a controller worn by the user; a character control unit that controls an action of the character in response to the input; and a preset storage unit that stores expressions of the character, wherein the character control unit sets, for the character, the stored expression corresponding to an input that does not affect the action of the character.
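The control split in this abstract, where some inputs drive the character's action and others select a stored expression preset without affecting the action, can be sketched as follows. The specific input names and presets are assumptions.

```python
# Illustrative sketch: route action-affecting inputs to the character's
# action, and non-action inputs to a preset expression from storage.

EXPRESSION_PRESETS = {"button_a": "smile", "button_b": "surprised"}  # preset storage unit
ACTION_INPUTS = {"stick_forward": "walk", "stick_back": "step_back"}

def control_character(state, user_input):
    """Apply one detected user input to the character state."""
    if user_input in ACTION_INPUTS:
        state["action"] = ACTION_INPUTS[user_input]
    elif user_input in EXPRESSION_PRESETS:
        # Input does not affect the action: set only the preset expression.
        state["expression"] = EXPRESSION_PRESETS[user_input]
    return state

state = {"action": "idle", "expression": "neutral"}
control_character(state, "stick_forward")
control_character(state, "button_a")
print(state)  # {'action': 'walk', 'expression': 'smile'}
```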

METHOD AND APPARATUS FOR PRODUCTION OF A REAL-TIME VIRTUAL CONCERT OR COLLABORATIVE ONLINE EVENT
20220343951 · 2022-10-27 ·

Methods and apparatus for producing virtual concerts or other online collaborative events. One or more musicians or other presenters or contributors at disparate geolocations transmit continuums of audio and/or image data as well as synchronization data continuums. Digital processing may be used to superimpose musicians or other presenters or contributors onto a virtual background (such as a concert stage or other virtual environment). Audio and video feeds from multiple musicians or other presenters or contributors may be synchronized to give the appearance of each being on the same stage in a 2D, 3D or virtual reality (VR) environment. Multiple continuums of audio data may be transmitted at different speeds and quality in order to allow musicians, presenters or other contributors to remain synchronized while a remote production studio receives high quality data transmissions for generation of a combinative, immersive, multi-dimensional end user audio/visual experience.
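The multiple-continuum idea, where audio is transmitted at different speeds and quality so performers stay synchronized while the studio receives high-quality data, can be sketched with an assumed simplification: the low-latency sync stream is a decimated copy of the full-quality stream. Real systems would use different codecs and bitrates rather than decimation.

```python
# Hypothetical sketch: each contributor sends a small, fast stream for
# mutual synchronization and a full-quality stream for the remote studio.

def make_continuums(samples, decimate=4):
    """Return (sync_stream, studio_stream) from one block of audio samples."""
    sync_stream = samples[::decimate]   # smaller, faster to transmit
    studio_stream = list(samples)       # full quality, may arrive later
    return sync_stream, studio_stream

audio = list(range(16))
sync, studio = make_continuums(audio)
print(len(sync), len(studio))  # 4 16
```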

COLOR AND LIGHTING ADJUSTMENT FOR IMMERSIVE CONTENT PRODUCTION SYSTEM

In some implementations, a computing device in communication with an immersive content generation system may generate a first set of user interface elements configured to receive a first selection of a shape of a virtual stage light. In addition, the device may generate a second set of user interface elements configured to receive a second selection of an image for the virtual stage light. Also, the device may generate a third set of user interface elements configured to receive a third selection of a position and an orientation of the virtual stage light. Further, the device may generate a fourth set of user interface elements configured to receive a fourth selection of a color for the virtual stage light. Numerous other aspects are described.
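The four selections the interface collects map naturally onto one configuration record per virtual stage light. The field names and defaults below are illustrative assumptions.

```python
# Minimal sketch of the four UI selections for one virtual stage light:
# shape, image, position/orientation, and color.
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class VirtualStageLight:
    shape: str = "circle"                                    # first selection
    image: Optional[str] = None                              # second selection
    position: Tuple[float, float, float] = (0.0, 0.0, 0.0)   # third selection
    orientation: Tuple[float, float, float] = (0.0, 0.0, 0.0)
    color: Tuple[int, int, int] = (255, 255, 255)            # fourth selection

light = VirtualStageLight(shape="rect", image="window_gobo.png",
                          position=(1.0, 3.0, 0.5), color=(255, 200, 150))
print(light.color)  # (255, 200, 150)
```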

CREATING AND DISTRIBUTING INTERACTIVE ADDRESSABLE VIRTUAL CONTENT
20230082513 · 2023-03-16 ·

Systems and methods create and distribute addressable virtual content with interactivity. The virtual content may depict a live event and may be customized for each individual user based on dynamic characteristics (e.g., habits, preferences, etc.) of the user that are captured during user interaction with the virtual content. The virtual content is generated with low latency between the actual event and the live content that allows the user to interactively participate in actions related to the live event. The virtual content may represent a studio with multiple display screens that each show different live content (of the same or different live events), and may also include graphic displays that include related data such as statistics corresponding to the live event, athletes at the event, and so on. The content of the display screens and graphics may be automatically selected based on the dynamic characteristics of the user.
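The screen-selection step, choosing which live feeds appear on the studio's display screens based on the user's dynamic characteristics, can be sketched with an assumed scoring rule: rank feeds by overlap between their tags and the user's captured preferences. The feed names, tags, and rule are illustrative only.

```python
# Hedged sketch: pick the top feeds for the studio's display screens by
# scoring each feed's tags against the user's preference tags.

def select_feeds(feeds, user_tags, screens=2):
    """feeds: {name: set of tags}. Return the top-scoring feed names."""
    scored = sorted(feeds,
                    key=lambda name: len(feeds[name] & user_tags),
                    reverse=True)
    return scored[:screens]

feeds = {"court_cam": {"basketball", "live"},
         "stats_panel": {"basketball", "statistics"},
         "concert": {"music", "live"}}
print(select_feeds(feeds, {"basketball", "statistics"}))  # ['stats_panel', 'court_cam']
```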

Information processing apparatus and information processing method
11480787 · 2022-10-25 · ·

The present technology relates to an information processing apparatus, an information processing method, and a program that make it possible to eliminate or reduce VR sickness while maintaining an enhanced feeling of immersion. On the basis of the head posture of a user, a video generation section generates a video resulting from control of an angle of view of a virtual camera, the angle of view corresponding to a field of view of the user travelling in a virtual space. When the user is in an acceleration state in the virtual space, the video generation section changes the angle of view of the virtual camera from a first angle of view, used when the user is in a non-acceleration state, to a second angle of view based on an acceleration direction of the user. The present technology can be applied to, for example, an HMD.
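The angle-of-view rule can be sketched with assumed numbers: keep a wide first angle of view while the user is not accelerating, and narrow it toward a second angle of view during acceleration, a common way to reduce VR sickness. The specific degrees and the linear blend are assumptions, not values from the patent.

```python
# Sketch of the two-state angle-of-view rule (assumed numbers): wide field
# of view at rest, narrowed field of view while accelerating in the
# virtual space.

def camera_fov(accelerating, base_fov=110.0, reduced_fov=70.0, blend=1.0):
    """Return the virtual camera's angle of view in degrees."""
    if not accelerating:
        return base_fov
    # Blend from the first angle of view toward the second (blend in 0..1).
    return base_fov + (reduced_fov - base_fov) * blend

print(camera_fov(False))             # 110.0
print(camera_fov(True))              # 70.0
print(camera_fov(True, blend=0.5))   # 90.0
```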

Video Game Engine Assisted Virtual Studio Production Process
20230077552 · 2023-03-16 ·

A production process involves a predetermined number of cameras simultaneously filming a background at predetermined angles, and filming actors in a studio with the same number of cameras at the same angles, used in conjunction with a virtual studio system. In the studio, the actors perform before a green screen, and the virtual studio system composites the actors onto the background in real time. Camera tracking allows the in-studio cameras to pan, tilt, focus, zoom, and make limited other movements as the virtual studio system adjusts display of the background in a corresponding manner, resulting in a realistic scene without transporting actors and crew to the background location.
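The real-time compositing step rests on standard chroma keying, which can be sketched per pixel; the threshold and the green-dominance test below are assumptions standing in for whatever keyer the virtual studio system actually uses.

```python
# Illustrative green-screen composite: replace strongly green foreground
# pixels with the corresponding background pixels, keep the rest.

def chroma_key(fg_pixels, bg_pixels, green_threshold=100):
    """fg_pixels/bg_pixels: parallel lists of (r, g, b) tuples, 0-255."""
    out = []
    for (fr, fg_, fb), bg in zip(fg_pixels, bg_pixels):
        if fg_ > green_threshold and fg_ > fr and fg_ > fb:
            out.append(bg)             # key out the green screen
        else:
            out.append((fr, fg_, fb))  # keep the actor
    return out

fg = [(20, 200, 30), (180, 150, 140)]   # a green-screen pixel, a skin-tone pixel
bg = [(0, 0, 255), (0, 0, 255)]
print(chroma_key(fg, bg))  # [(0, 0, 255), (180, 150, 140)]
```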