G06F3/0346

Systems and methods for providing mixed-reality experiences under low light conditions

Systems and methods are provided for facilitating computer vision tasks (e.g., simultaneous localization and mapping) and pass-through imaging. The systems include a head-mounted display (HMD) with a first set of one or more cameras configured for performing computer vision tasks and a second set of one or more cameras configured for capturing image data of an environment for projection to a user of the HMD. The first set of one or more cameras is configured to detect at least visible-spectrum light and at least a particular band of wavelengths of infrared (IR) light. The second set of one or more cameras includes one or more detachable IR filters configured to attenuate IR light, including at least a portion of the particular band of wavelengths of IR light.

Artificial reality collaborative working environments

Aspects of the present disclosure are directed to creating and administering artificial reality collaborative working environments and providing interaction modes for them. An XR work system can provide and control such artificial reality collaborative working environments to enable, for example, A) links between real-world surfaces and XR surfaces; B) links between multiple real-world areas to XR areas with dedicated functionality; C) maintaining access, while inside the artificial reality working environment, to real-world work tools such as the user's computer screen and keyboard; D) various hand and controller modes for different interaction and collaboration modalities; E) use-based, multi-desk collaborative room configurations; and F) context-based auto population of users and content items into the artificial reality working environment.

Systems, methods, and graphical user interfaces for updating display of a device relative to a user's body

An electronic device, while the electronic device is worn over a predefined portion of the user's body, displays, via a display generation component arranged on the electronic device opposite the predefined portion of the user's body, a graphical representation of an exterior view of a body part that corresponds to the predefined portion of the user's body. The electronic device detects a change in position of the electronic device with respect to the predefined portion of the user's body. In response to detecting the change in position, the electronic device modifies the graphical representation of the exterior view of the body part that corresponds to the predefined portion of the user's body in accordance with the detected change in position of the electronic device with respect to the predefined portion of the user's body.
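As a rough sketch of the described behavior (the 1:1 compensation and the 2-D coordinate model are assumptions for illustration, not details from the abstract), the device could shift the rendered body-part view opposite to its own detected displacement on the body:

```python
# Sketch: compensate the on-display body-part image for a detected change
# in the device's position on the body (e.g., a watch sliding along the
# wrist). A simple 1:1 opposite shift is an illustrative assumption.

def compensate_offset(image_origin: tuple, device_delta: tuple) -> tuple:
    """Move the rendered view opposite to the device's displacement so the
    depicted body part stays aligned with the real one underneath."""
    return (image_origin[0] - device_delta[0],
            image_origin[1] - device_delta[1])
```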

Wearable electronic systems having variable interactions based on device orientation
11580849 · 2023-02-14

Wearable electronic systems having varying interactions based on device orientation are described herein. The systems include a first wearable electronic device and a second wearable electronic device, the second having an input device and a device orientation sensor. The device orientation sensor detects a device orientation of the second wearable electronic device and generates a device orientation signal. In a first mapping orientation mode, when the second wearable electronic device has a first device orientation, the systems perform a first mapping between inputs from the input device and functions of a user interface displayed on the first wearable electronic device. In a second mapping orientation mode, when the detected device orientation is a second device orientation, the systems perform a second mapping between inputs from the input device and functions of the user interface.
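A minimal sketch of this remapping idea, assuming hypothetical orientation, input, and function names (none appear in the abstract), is to select an input-to-function table based on the reported orientation:

```python
# Sketch: orientation-dependent mapping of input events to UI functions.
# Orientation names, inputs, and functions are illustrative assumptions.

ORIENTATION_MAPPINGS = {
    "face_up": {"tap": "select", "swipe": "scroll"},     # first mapping mode
    "face_down": {"tap": "dismiss", "swipe": "volume"},  # second mapping mode
}

def map_input(orientation: str, input_event: str) -> str:
    """Return the UI function bound to an input under the current orientation."""
    return ORIENTATION_MAPPINGS[orientation][input_event]
```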

Systems and methods for detecting environmental occlusion in a wearable computing device display
11580324 · 2023-02-14

Systems and methods are provided for displaying a visual interface in a display of a head-mounted wearable device when the display may occlude objects in the user's physical environment while in use. An image detection device oriented generally in line with the user's line of sight is used to capture at least one image. One or more objects are detected in the at least one image and, based on the detection of the one or more objects, an environmental interaction mode may be activated or deactivated. In the environmental interaction mode, the user interface may be modified or disabled to facilitate viewing of the physical environment.
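The mode toggle could be sketched as follows; the object detector itself is stubbed out, and the set of "relevant" object classes is an illustrative assumption:

```python
# Sketch: activate an "environmental interaction mode" (dim or disable the
# UI) when detected objects suggest the display is occluding something the
# user should see. Object class names are illustrative assumptions.

RELEVANT_CLASSES = frozenset({"person", "door", "stairs"})

def environmental_mode_active(detected_objects: list) -> bool:
    """Return True if any detected object warrants clearing the display."""
    return any(obj in RELEVANT_CLASSES for obj in detected_objects)
```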

Three-dimensional object position tracking system
11579711 · 2023-02-14

A hand-held controller and a positional reference device are provided for determining the position and orientation of the hand-held controller within a three-dimensional volume relative to the location of the positional reference device. An input/output subsystem, in conjunction with processing and memory subsystems, can receive reference image data captured by a beacon sensing device combined with inertial measurement information from inertial measurement units within the hand-held controller. The position and orientation of the hand-held controller can be computed based on the linear distance between a pair of beacons on the positional reference device, the reference image data, and the inertial measurement information.
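One way the known beacon separation can anchor the computation is a pinhole-camera range estimate: if two beacons with known physical spacing appear some number of pixels apart in the sensor image, similar triangles give the distance. This is a hedged sketch of that geometric step only (the focal length and spacing values are assumptions), not the patent's full pose pipeline:

```python
# Sketch: estimate range to the positional reference device from the known
# physical separation of two beacons and their apparent pixel separation
# in the beacon-sensing camera (pinhole model: Z = f * B / d).

def estimate_distance(baseline_m: float, pixel_separation: float,
                      focal_length_px: float) -> float:
    """Range via similar triangles: farther targets subtend fewer pixels."""
    return focal_length_px * baseline_m / pixel_separation
```

For example, beacons 0.2 m apart that appear 100 px apart under a 1000 px focal length imply the reference device is about 2 m away.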

Control method for terminal, terminal, intelligent wearable device, and system

A control method for controlling terminal state switching is provided. When a terminal in a lock-screen state receives an unlocking instruction, it obtains first data of the terminal and second data from a wearable device that is bound to the terminal. If the first data and the second data meet a preset condition, the terminal performs the unlock operation.
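The abstract leaves the "preset condition" unspecified; as a hedged sketch, one plausible instance is agreement between motion events sensed independently by the terminal and the bound wearable (the data type and tolerance here are assumptions):

```python
# Sketch of one possible preset condition: unlock only if the terminal and
# the bound wearable sensed motion at (nearly) the same moment, suggesting
# both are on the same user. Timestamps and tolerance are illustrative.

def should_unlock(terminal_motion_ts: float, wearable_motion_ts: float,
                  tolerance_s: float = 0.5) -> bool:
    """True when the two devices' motion timestamps agree within tolerance."""
    return abs(terminal_motion_ts - wearable_motion_ts) <= tolerance_s
```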

Content Transmission Method, Device, and Medium
20230042460 · 2023-02-09

A content transmission method is provided. In the method, a first device determines that a distance between the first device and a second device is less than a distance threshold, and provides a user with a prompt that content transmission can be performed between the first device and the second device. The first device recognizes a gesture operation performed by the user on the first device and, based on the recognized gesture operation, determines the transmission content and the transmission direction of that content between the first device and the second device. The first device then receives the transmission content from the second device or sends the transmission content to the second device based on the determined transmission direction.
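The proximity check and gesture-to-direction decision could be sketched as below; the gesture names and the threshold value are illustrative assumptions, not details from the abstract:

```python
# Sketch: once the devices are within the distance threshold, map the
# recognized gesture to a transmission direction. Gesture names and the
# 0.3 m threshold are illustrative assumptions.
from typing import Optional

DISTANCE_THRESHOLD_M = 0.3

def transfer_direction(distance_m: float, gesture: str) -> Optional[str]:
    """Return 'send', 'receive', or None when transfer is unavailable."""
    if distance_m >= DISTANCE_THRESHOLD_M:
        return None  # devices too far apart; no prompt, no transfer
    return {"push": "send", "pull": "receive"}.get(gesture)
```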