G06F3/04815

Four dimensional energy-field package assembly

Four dimensional (4D) energy-field package assembly for projecting energy fields according to a 4D coordinate function. The 4D energy-field package assembly includes an energy-source system having energy sources capable of providing energy to energy locations, and energy waveguides for directing energy from the energy locations, from one side of each waveguide to the other, along energy propagation paths.
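The abstract does not specify the 4D coordinate function. As an illustration only, a light-field-style mapping from a 2D energy location to a 4D coordinate (the covering waveguide's position plus the exit angles it imparts) might look like the following sketch; the paraxial model, the parameter names, and the uniform waveguide pitch are all assumptions:

```python
import numpy as np

def four_d_coordinate_function(x, y, waveguide_pitch, focal_length):
    """Illustrative 4D mapping: an energy location (x, y) maps to the center
    (x_w, y_w) of the waveguide covering it, plus propagation angles (u, v)
    set by the location's offset from that center (paraxial approximation)."""
    # Which waveguide element covers this energy location.
    x_w = np.round(x / waveguide_pitch) * waveguide_pitch
    y_w = np.round(y / waveguide_pitch) * waveguide_pitch
    # Offset from the waveguide center determines the exit angle.
    u = np.arctan2(x - x_w, focal_length)
    v = np.arctan2(y - y_w, focal_length)
    return x_w, y_w, u, v
```

An energy location directly under a waveguide center exits on-axis (u = v = 0); off-center locations exit at proportionally larger angles.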

Dynamic image capturing apparatus and method using arbitrary viewpoint image generation technology

Embodiments relate to a dynamic image capturing method and apparatus using arbitrary viewpoint image generation technology, in which an image of background content, either displayed on a background content display unit or implemented in a virtual space through a chroma key screen, is generated with a view matching the camera's view of the subject, and a final image including the background content image and a subject area is obtained.
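The final compositing step described above can be sketched minimally: once the background has been rendered from the camera's viewpoint, chroma-key pixels in the camera frame are replaced by the rendered background, leaving the subject area intact. The key color, tolerance, and function name below are assumptions, not the disclosed method:

```python
import numpy as np

def composite_final_image(camera_frame, background_render,
                          key_color=(0, 255, 0), tol=60):
    """Replace chroma-key pixels in the camera frame with a background
    rendered from the same viewpoint, keeping the subject area."""
    # Per-pixel color distance to the key color.
    diff = np.linalg.norm(camera_frame.astype(int) - np.array(key_color),
                          axis=-1)
    is_background = diff < tol            # pixels showing the key screen
    out = camera_frame.copy()
    out[is_background] = background_render[is_background]
    return out
```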

Mid-air volumetric visualization movement compensation

A wearable computing device generates a volumetric visualization at a first position in a three-dimensional space. The wearable computing device includes a volumetric source configured to create the volumetric visualization, and one or more sensors configured to determine movement of the device. The wearable computing device identifies its own movement and, based on that movement, adjusts the volumetric source.
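The compensation described above amounts to applying the inverse of the device's motion so the visualization stays anchored in world space. A minimal 2D, yaw-only sketch (the names and the simplified rigid-motion model are assumptions):

```python
import numpy as np

def compensate(anchor_world, device_translation, device_yaw):
    """Re-express a world-space anchor point in the moved device's frame
    by applying the inverse of the device's rigid motion (2D, yaw only)."""
    c, s = np.cos(device_yaw), np.sin(device_yaw)
    rot = np.array([[c, -s], [s, c]])     # device orientation in world frame
    # Inverse transform: undo the translation, then undo the rotation.
    return rot.T @ (np.asarray(anchor_world) - np.asarray(device_translation))
```

Feeding the sensor-reported pose into this inverse keeps the volumetric visualization at the same world position as the wearer moves.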

Blending virtual environments with situated physical reality

Various embodiments are provided herein for tracking a user's physical environment, to facilitate on-the-fly blending of a virtual environment with detected aspects of the physical environment. Embodiments can be employed to facilitate virtual roaming by compositing virtual representations of detected physical objects into virtual environments. A computing device coupled to an HMD can select portions of a depth map generated based on the user's physical environment, to generate virtual objects that correspond to the selected portions. The computing device can composite the generated virtual objects into an existing virtual environment, such that the user can traverse the virtual environment while remaining aware of their physical environment. Among other things, the computing device can employ various blending techniques for compositing, and further provide image pass-through techniques for selective viewing of the physical environment while remaining fully immersed in virtual reality.
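The depth-map selection step above can be sketched as a simple mask: portions of the depth map within some distance of the user are selected and would then be meshed and composited into the virtual environment. The threshold and the treatment of zero-depth (no reading) pixels are assumptions:

```python
import numpy as np

def select_near_objects(depth_map, max_distance=1.5):
    """Select depth-map portions for physical objects within reach.
    Returns a boolean mask of pixels to turn into virtual proxy objects."""
    valid = depth_map > 0                 # 0 = no depth reading
    return valid & (depth_map < max_distance)
```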

Color-sensitive virtual markings of objects
11582312 · 2023-02-14

Disclosed are systems, methods, and non-transitory computer readable media for making virtual colored markings on objects. Instructions may include receiving an indication of an object; receiving from an image sensor an image of a hand of an individual holding a physical marking implement; detecting in the image a color associated with the marking implement; receiving from the image sensor image data indicative of movement of a tip of the marking implement and locations of the tip; determining from the image data when the locations of the tip correspond to locations on the object; and generating, in the detected color, virtual markings on the object at the corresponding locations.
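The last two steps of the claimed pipeline, matching tip locations against the object and emitting markings in the detected color, can be sketched as follows; the data structures (a set of object pixel coordinates, a list of marking dicts) are illustrative assumptions:

```python
def generate_markings(tip_positions, object_region, marker_color):
    """Keep only tracked tip locations that fall on the object and emit
    virtual markings in the color detected from the marking implement."""
    markings = []
    for x, y in tip_positions:
        if (x, y) in object_region:       # object_region: set of pixel coords
            markings.append({"pos": (x, y), "color": marker_color})
    return markings
```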

Artificial reality collaborative working environments

Aspects of the present disclosure are directed to creating and administering artificial reality collaborative working environments and providing interaction modes for them. An XR work system can provide and control such artificial reality collaborative working environments to enable, for example, A) links between real-world surfaces and XR surfaces; B) links from multiple real-world areas to XR areas with dedicated functionality; C) maintaining access, while inside the artificial reality working environment, to real-world work tools such as the user's computer screen and keyboard; D) various hand and controller modes for different interaction and collaboration modalities; E) use-based, multi-desk collaborative room configurations; and F) context-based auto population of users and content items into the artificial reality working environment.
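The surface-linking features (A and B above) imply a record tying a tracked real-world surface to an XR surface with a dedicated function. A hypothetical data-model sketch, not the disclosed implementation, with all field names assumed:

```python
from dataclasses import dataclass

@dataclass
class SurfaceLink:
    """Illustrative record linking a tracked real-world surface to an
    XR surface; fields are assumptions, not the disclosed data model."""
    real_surface_id: str      # e.g. a tracked desk plane
    xr_surface_id: str        # the XR whiteboard or screen it maps to
    dedicated_function: str   # e.g. "keyboard-passthrough", "shared-whiteboard"

def links_for_function(links, function):
    """Return all real-to-XR surface links serving a given dedicated function."""
    return [link for link in links if link.dedicated_function == function]
```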