
GLASSES-TYPE INFORMATION DISPLAY DEVICE, DISPLAY CONTROL METHOD, AND DISPLAY CONTROL PROGRAM
20230273441 · 2023-08-31

A glasses-type information display device includes a first transmission unit and a second transmission unit, a projection unit that projects an image onto the first transmission unit, a visibility reduction unit that reduces visibility of a real image visually recognized by a user through the second transmission unit, and a processor that performs a control to cause the visibility reduction unit to reduce the visibility on the basis of a display priority given to the image.
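The control the abstract describes — dimming the real scene seen through the second transmission unit only when the projected image's display priority warrants it — can be sketched as follows. This is a minimal illustration, not the patented implementation; the class, threshold, and dimming level are all hypothetical.

```python
from dataclasses import dataclass

DIM_THRESHOLD = 5  # hypothetical priority at or above which the real image is dimmed

@dataclass
class ProjectedImage:
    name: str
    display_priority: int

def visibility_reduction_level(image: ProjectedImage) -> float:
    """Return 0.0 (real scene fully visible) .. 1.0 (fully dimmed) for the
    second transmission unit, based on the image's display priority."""
    if image.display_priority >= DIM_THRESHOLD:
        return 0.8  # strongly reduce visibility of the real image
    return 0.0      # leave the real scene untouched
```

A low-priority notification would leave the see-through view unchanged, while a high-priority image would trigger the visibility reduction unit.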

Head-mountable augmented vision system for displaying thermal images

The invention concerns an augmented vision system (1) comprising: a displaying device (3) comprising: a fixing element (31) for coupling the displaying device (3) to a head-mountable component (90) configured to be positioned in front of a face (102) of a user (100), and a display body (32) coupled to said fixing element (31) by means of a pivoting link (321, 322, 323) providing a rotation of the display body with respect to the fixing element around a rotational axis (38); the display body (32) comprising a display (33) configured to display thermal data and/or images provided by a thermal sensing device (2); and a first surface portion (325) configured to come into contact with a forehead of a user, when said head-mountable component (90) is positioned in front of the face of the user, so as to rotate the display body (32) to a predefined angular positioning (381) around the rotational axis (38).

Tying a virtual speaker to a physical space
11748056 · 2023-09-05

A method for tying a virtual speaker to a physical space. A first user with a first wearable extended reality appliance enters an area associated with a virtual speaker. First sounds are transmitted from the virtual speaker to the first wearable extended reality appliance. The first user hears the first sounds at first settings of the virtual speaker. The first user changes the settings of the virtual speaker to second settings. Second sounds are transmitted from the virtual speaker to the first wearable extended reality appliance, such that the first user hears the second sounds at the second settings. After the first user leaves the area, a second user with a second wearable extended reality appliance enters the area. Third sounds are transmitted from the virtual speaker to the second wearable extended reality appliance, such that the second user hears the third sounds at the second settings.
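The behavior described — speaker settings persisting with the physical area rather than with a user — amounts to keying the virtual speaker's state to the space itself. A minimal sketch, with all class and field names hypothetical:

```python
class VirtualSpeaker:
    """A virtual speaker whose settings belong to a physical area."""
    def __init__(self, area_id: str, volume: float = 0.5):
        self.area_id = area_id
        self.settings = {"volume": volume}

class SpeakerSpace:
    """Persists speaker settings per physical area, so settings changed by
    one user are heard by the next user who enters the same area."""
    def __init__(self):
        self._speakers: dict[str, VirtualSpeaker] = {}

    def enter(self, area_id: str) -> VirtualSpeaker:
        # returns the area's speaker, creating it with first settings if new
        return self._speakers.setdefault(area_id, VirtualSpeaker(area_id))

    def change_settings(self, area_id: str, **new) -> None:
        self._speakers[area_id].settings.update(new)

space = SpeakerSpace()
first = space.enter("lobby")                 # first user hears first settings
space.change_settings("lobby", volume=0.9)   # changed to second settings
second = space.enter("lobby")                # second user hears second settings
```

The key design point is that `enter` resolves to the area's speaker object, not a per-user copy, so the second user inherits the second settings.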

TYING A VIRTUAL SPEAKER TO A PHYSICAL SPACE
20230139626 · 2023-05-04

A method for tying a virtual speaker to a physical space. A first user with a first wearable extended reality appliance enters an area associated with a virtual speaker. First sounds are transmitted from the virtual speaker to the first wearable extended reality appliance. The first user hears the first sounds at first settings of the virtual speaker. The first user changes the settings of the virtual speaker to second settings. Second sounds are transmitted from the virtual speaker to the first wearable extended reality appliance, such that the first user hears the second sounds at the second settings. After the first user leaves the area, a second user with a second wearable extended reality appliance enters the area. Third sounds are transmitted from the virtual speaker to the second wearable extended reality appliance, such that the second user hears the third sounds at the second settings.

INITIATING SENSORY PROMPTS INDICATIVE OF CHANGES OUTSIDE A FIELD OF VIEW

Methods, systems, apparatuses, and computer-readable media are provided for initiating location-driven sensory prompts reflecting changes to virtual objects. In one implementation, the computer-readable medium includes instructions to cause a processor to enable interaction with a virtual object located in an extended reality environment associated with a wearable extended reality appliance; receive data reflecting a change associated with the virtual object; determine whether the virtual object is within or outside a field of view of the wearable extended reality appliance; cause initiation of a first sensory prompt indicative of the change when the virtual object is determined to be within the field of view; and cause initiation of a second sensory prompt indicative of the change when the virtual object is determined to be outside the field of view, the second sensory prompt differing from the first sensory prompt.
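The core branch — one prompt when the changed object is visible, a different one when it is not — can be sketched in a few lines. The angle-based visibility test, the field-of-view width, and the prompt names are all hypothetical stand-ins:

```python
def select_prompt(obj_angle_deg: float, fov_deg: float = 90.0) -> str:
    """Choose which sensory prompt to initiate for a change to a virtual
    object, depending on whether the object lies within the appliance's
    horizontal field of view (angle measured from the gaze direction)."""
    half = fov_deg / 2.0
    if -half <= obj_angle_deg <= half:
        return "visual_highlight"   # first prompt: the object is visible
    return "spatial_audio_cue"      # second prompt: the object is out of view
```

An object 10 degrees off-center would get the in-view prompt; one at 120 degrees, behind the user's peripheral vision, would get the out-of-view prompt instead.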

ENHANCING VIDEOS OF PEOPLE INTERACTING WITH VIRTUAL OBJECTS IN AN EXTENDED REALITY ENVIRONMENT

A system and method for generating videos of individuals interacting with virtual objects. A wearable extended reality appliance generates an extended reality environment including at least one virtual object. First image data reflects a first perspective of an individual wearing the wearable extended reality appliance, including physical hand movements interacting with the at least one virtual object from the first perspective. Second image data reflects a second perspective facing the individual, including the physical hand movements interacting with the virtual object from the second perspective. The first image data and the second image data are analyzed to determine an interaction with the virtual object. A rendered representation of the virtual object from the second perspective is melded with the second image data to generate a video of the individual interacting with the virtual object from the second perspective.

Directional backlit display device with eye tracking
11796808 · 2023-10-24

A directional backlit display device includes a light source module, a reflective narrow-angle diffuser, a backlit display device panel, an eye tracking device, and a controller. The diffuser includes a shaft and reflects and uniformly diffuses light from the light source module to provide a uniform directional light beam. The backlit display device panel is positioned on the projecting optical path of the uniform directional light beam. The uniform directional light beam illuminates an image displayed on the backlit display device panel and projects it to a projection area. The controller receives eye position information from the eye tracking device and determines a coordinate. When the coordinate deviates from the projection area, the controller determines a corrective projection area, and a drive module rotates the reflective narrow-angle diffuser around an axis of the shaft according to the corrective projection area, so the image projection area moves with the eye.
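The correction step — rotate the diffuser only when the tracked eye coordinate falls outside the current projection area — might look like the sketch below. The 1-D geometry, the gain constant, and the function name are all hypothetical simplifications of what the abstract describes:

```python
def corrective_rotation(eye_x_mm: float,
                        projection_area: tuple[float, float],
                        deg_per_mm: float = 0.05) -> float:
    """Return a signed rotation (degrees) for the reflective narrow-angle
    diffuser about its shaft axis, so the projection area follows the eye.
    Returns 0.0 while the eye coordinate is still inside the area."""
    x_min, x_max = projection_area
    if x_min <= eye_x_mm <= x_max:
        return 0.0                          # eye still inside the projection area
    center = (x_min + x_max) / 2.0
    return (eye_x_mm - center) * deg_per_mm  # proportional correction
```

In a real device the mapping from eye displacement to diffuser angle would come from the optical geometry rather than a fixed gain.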

Controlling duty cycle in wearable extended reality appliances
11809213 · 2023-11-07

Systems, methods, and non-transitory computer readable media including instructions for performing duty cycle control operations for a wearable extended reality appliance are disclosed. Performing the duty cycle control operations includes receiving data representing virtual content in an extended reality environment associated with a wearable extended reality appliance; identifying in the extended reality environment a first display region and a second display region separated from the first display region; determining a first duty cycle configuration for the first display region; determining a second duty cycle configuration for the second display region, wherein the second duty cycle configuration differs from the first duty cycle configuration; and causing the wearable extended reality appliance to display the virtual content in accordance with the determined first duty cycle configuration for the first display region and the determined second duty cycle configuration for the second display region.
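Assigning distinct duty cycles to two separated display regions reduces to a per-region configuration step. A minimal sketch, where the focus-based rule and the specific duty-cycle values are invented for illustration:

```python
def duty_cycle_for_region(region: dict) -> float:
    # hypothetical rule: the region the user attends to refreshes at full
    # duty cycle, while a separated peripheral region runs at a reduced one
    return 1.0 if region["focused"] else 0.25

regions = [
    {"name": "primary", "focused": True},     # first display region
    {"name": "peripheral", "focused": False}, # second, separated region
]

# one duty cycle configuration per region, as the abstract requires
config = {r["name"]: duty_cycle_for_region(r) for r in regions}
```

The essential property is simply that the two regions end up with differing configurations, which the appliance then applies when displaying the virtual content.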

Interpreting commands in extended reality environments based on distances from physical input devices

Systems, methods, and non-transitory computer readable media including instructions for selectively operating a wearable extended reality appliance are disclosed. Selectively operating the wearable extended reality appliance includes establishing a link between a wearable extended reality appliance and a keyboard device; receiving sensor data from at least one sensor associated with the wearable extended reality appliance, the sensor data being reflective of a relative orientation of the wearable extended reality appliance with respect to the keyboard device; based on the relative orientation, selecting from a plurality of operation modes a specific operation mode for the wearable extended reality appliance; identifying a user command based on at least one signal detected by the wearable extended reality appliance; and executing an action responding to the identified user command in a manner consistent with the selected operation mode.
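The mode-selection and command-dispatch flow can be sketched with head pitch standing in for the sensed relative orientation. The threshold, mode names, and command table are all hypothetical:

```python
def select_operation_mode(pitch_deg: float) -> str:
    """Pick an operation mode from the wearer's head pitch relative to the
    keyboard device: looking down toward it vs. looking up and away."""
    if pitch_deg < -20.0:
        return "typing_mode"   # oriented toward the keyboard
    return "immersive_mode"    # oriented away from the keyboard

def execute_command(command: str, mode: str) -> str:
    # the same user command maps to a different action per operation mode
    actions = {
        ("select", "typing_mode"): "place_text_cursor",
        ("select", "immersive_mode"): "grab_virtual_object",
    }
    return actions.get((command, mode), "ignore")
```

A "select" gesture thus places a text cursor while the user faces the keyboard, but grabs a virtual object once they look away — one command, two mode-consistent actions.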

Moving content between a virtual display and an extended reality environment

Systems, methods, and non-transitory computer readable media including instructions for extracting content from a virtual display are disclosed. Extracting content from a virtual display includes generating a virtual display via a wearable extended reality appliance, wherein the virtual display presents a group of virtual objects and is located at a first virtual distance from the wearable extended reality appliance; generating an extended reality environment via the wearable extended reality appliance including at least one additional virtual object at a second virtual distance from the wearable extended reality appliance; receiving input for causing a specific virtual object to move from the virtual display to the extended reality environment; and in response, generating a presentation of a version of the specific virtual object in the extended reality environment at a third virtual distance different from the first virtual distance and the second virtual distance.
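The re-placement step — presenting the extracted object at a third virtual distance distinct from both the display's and the environment's — can be sketched as below. Placing it at the midpoint is an arbitrary illustrative choice; the abstract only requires that the third distance differ from the other two:

```python
def move_to_environment(obj: dict,
                        display_distance: float,
                        env_distance: float) -> dict:
    """Re-place a virtual object extracted from the virtual display at a
    third virtual distance distinct from both the display's distance and
    the other environment object's distance."""
    third = (display_distance + env_distance) / 2.0  # hypothetical choice
    assert third != display_distance and third != env_distance
    obj["distance"] = third
    return obj
```

For a virtual display at 1.0 m and an environment object at 3.0 m, the extracted object would land at 2.0 m under this rule.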