Short distance illumination of a spatial light modulator using a single reflector
11556013 · 2023-01-17 · ·

A display device includes a light source, a spatial light modulator, and an optical element. The optical element includes a reflective surface. The optical element is positioned relative to the light source so that at least a portion of the illumination light received by the optical element is reflected at the reflective surface back toward the light source. The spatial light modulator is positioned to receive at least a portion of the illumination light reflected by the reflective surface. A method performed by the display device is also disclosed.

Devices, systems and methods for predicting gaze-related parameters using a neural network
11556741 · 2023-01-17 · ·

A method for creating and updating a database is disclosed. In one example, the method includes presenting a first stimulus to a first user wearing a head-wearable device and, when the first user is expected to respond to the first stimulus or expected to have responded to it, using a first camera of the head-wearable device to generate a first left image of at least a portion of the left eye of the first user and using a second camera of the head-wearable device to generate a first right image of at least a portion of the right eye of the first user. A data connection is established between the head-wearable device and the database. A first dataset is generated comprising the first left image, the first right image, and a first representation of a gaze-related parameter, the first representation being correlated with the first stimulus, and the first dataset is added to the database.
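The dataset assembly described above can be sketched as follows. The patent does not specify a data model, so the class names, fields, and the list-backed database below are illustrative assumptions only:

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class GazeDataset:
    """One record: paired eye images plus the correlated gaze parameter."""
    left_image: bytes            # first left image (left-eye camera)
    right_image: bytes           # first right image (right-eye camera)
    gaze_parameter: Tuple[float, float]  # representation correlated with the stimulus

@dataclass
class Database:
    """Stand-in for the database; assumes the data connection is established."""
    records: List[GazeDataset] = field(default_factory=list)

    def add(self, dataset: GazeDataset) -> None:
        self.records.append(dataset)

db = Database()
db.add(GazeDataset(left_image=b"...", right_image=b"...",
                   gaze_parameter=(0.1, -0.2)))
```

In a real system the images would come from the head-wearable device's cameras at the moment the user responds to the stimulus, and the database would live on a server reached over the established data connection.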

Head-up display apparatus
11556005 · 2023-01-17 · ·

The head-up display apparatus includes a display-light emitting unit having a second inclination angle around an axis parallel to the vertical direction, so that its first direction side is closer to the user than its second direction side, the unit being configured to emit display light forming a rectangular image corresponding to the rectangular virtual image. The apparatus also includes a reflecting mirror having a third inclination angle around the axis parallel to the vertical direction, so that its first direction side is closer to the user than its second direction side, the mirror being configured to reflect the display light and emit the reflected light toward the virtual-image display unit.

Robotic assistant and method for controlling the same

A robotic assistant includes a base; an elevation mechanism positioned on the base; a display rotatably mounted on the elevation mechanism; a camera positioned on the display; and a control system that receives command instructions. In response to the command instructions, the control system detects movement of a face of a user in a vertical direction based on images captured by the camera. In response to detecting that movement, the control system rotates the display and actuates the elevation mechanism to move the display up and down, allowing the camera to face the user's face throughout the vertical movement.
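The tracking behaviour claimed above amounts to a simple closed-loop follow: compare the face's vertical position with the display's, and actuate when they diverge. The function below is a minimal sketch; the normalised coordinates, dead band, and command strings are assumptions, not part of the patent text:

```python
def track_face(face_y: float, display_y: float, dead_band: float = 0.05) -> str:
    """Return an elevation command that keeps the camera facing the face.

    face_y    -- vertical position of the detected face (normalised 0..1)
    display_y -- current vertical position of the display (normalised 0..1)
    dead_band -- tolerance below which no actuation occurs
    """
    error = face_y - display_y
    if abs(error) <= dead_band:
        return "hold"            # face already centred; no actuation
    return "raise" if error > 0 else "lower"
```

Running this once per captured frame drives the elevation mechanism up or down as the user stands or sits, while the dead band prevents the display from jittering around small detection noise.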

MIXED, VIRTUAL AND AUGMENTED REALITY HEADSET AND SYSTEM
20230011002 · 2023-01-12 ·

A mixed, virtual and augmented reality headset having a front casing (2) with a housing receiving a smartphone (19) facing a curved holographic display (5); the holographic display (5), in the front portion of the headset, reflects an image (11) projected by the display of the smartphone (19) while simultaneously allowing the user to see through it; a motorised mirror (14), positioned either withdrawn or extended in front of the holographic display (5), reflects the image (11) projected by the smartphone (19); two motorised lenses (15) are positioned either withdrawn or extended in front of the pupils (13) of the user; a mirror system (16) reflects a real image (10), external to the headset (1), toward a camera of the smartphone (19); and a control unit (50) controls the positions of the motorised lenses (15) and mirror (14).

CAMERA CONTROL USING SYSTEM SENSOR DATA

A method for using cameras in an augmented reality headset is provided. The method includes receiving a signal from a sensor mounted on a headset worn by a user, the signal being indicative of a user intention for capturing an image. The method also includes identifying the user intention for capturing the image based on a model that classifies the signal from the sensor according to the user intention, selecting a first image capturing device in the headset based on a specification of the first image capturing device and the user intention, and capturing the image with the first image capturing device. An augmented reality headset, a memory storing instructions, and a processor that executes the instructions to cause the headset to perform the method are also provided.
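The classify-then-select pipeline can be sketched as below. The sensor feature, intent labels, and camera specifications are invented for illustration; the patent's "model" would be a trained classifier rather than this threshold stand-in:

```python
# Hypothetical camera inventory, keyed by name, with per-device specifications.
CAMERA_SPECS = {
    "wide": {"fov_deg": 120, "resolution": (1280, 720)},
    "tele": {"fov_deg": 30,  "resolution": (3840, 2160)},
}

def classify_intent(head_motion: float) -> str:
    """Stand-in for the trained model: a fixated head suggests a detail
    shot, a moving head suggests a scene capture."""
    return "detail" if head_motion < 0.1 else "scene"

def select_camera(intent: str) -> str:
    """Match the classified intent against each device's specification."""
    if intent == "detail":
        # Highest-resolution device for close inspection.
        return max(CAMERA_SPECS, key=lambda c: CAMERA_SPECS[c]["resolution"][0])
    # Widest field of view for scene capture.
    return max(CAMERA_SPECS, key=lambda c: CAMERA_SPECS[c]["fov_deg"])
```

The key idea is the separation of concerns the claim describes: one component maps raw sensor data to an intent, and a second maps intent plus device specifications to a camera choice.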

Integration of a two-dimensional input device into a three-dimensional computing environment

A workstation enables operation of a 2D input device with a 3D interface. A cursor position engine determines the 3D position of a cursor controlled by the 2D input device as the cursor moves within a 3D scene displayed on a 3D display. The cursor position engine determines the 3D position of the cursor for a current frame of the 3D scene based on a current user viewpoint, a current mouse movement, a control-display (CD) gain value, a Voronoi diagram, and an interpolation algorithm, such as the Laplacian algorithm. A CD gain engine computes CD gain optimized for the 2D input device operating with the 3D interface. The CD gain engine determines the CD gain based on specifications for the 2D input device and the 3D display. The techniques performed by the cursor position engine and the techniques performed by the CD gain engine can be performed separately or in conjunction.
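Control-display gain is conventionally defined as on-screen cursor distance per unit of hand distance, which can be computed from the device and display specifications the abstract mentions. The function below is a sketch of that standard definition, not the patent's optimized engine; the parameter names are assumptions:

```python
def cd_gain(mouse_dpi: float, display_ppi: float, scale: float = 1.0) -> float:
    """Control-display gain: on-screen cursor distance per unit hand distance.

    One inch of mouse travel yields `mouse_dpi` counts; each count moves the
    cursor one pixel, i.e. 1/display_ppi inches on screen. `scale` stands in
    for any OS- or application-level pointer acceleration multiplier.
    """
    return scale * mouse_dpi / display_ppi

# An 800 DPI mouse on a 100 PPI display gives a gain of 8:
# one inch of hand movement moves the cursor eight inches on screen.
```

The patent's CD gain engine would tune this value (e.g. via `scale`) for the 3D interface, trading pointing speed against precision.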

ADJUSTING CONTENT OF A HEAD MOUNTED DISPLAY

Apparatuses, methods, systems, and program products are disclosed for adjusting content of a head mounted display. An apparatus includes a processor and a memory that stores code executable by the processor to determine a field of view for a user relative to a display of a head-mounted display unit, detect that the user is attempting to look at content that is presented on the display of the head-mounted display unit but is out of the user's field of view, and adjust the content that is out of the user's field of view to make the content visible to the user.
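The adjustment step can be sketched as a clamp: if the content's position lies outside the user's field of view, move it to the nearest visible position. The one-dimensional angles, names, and clamping strategy below are illustrative assumptions; the patent does not commit to a particular adjustment:

```python
def adjust_content(content_angle: float,
                   fov_center: float,
                   fov_half_width: float) -> float:
    """Return a (possibly adjusted) horizontal angle for the content.

    If the content already falls inside [center - half, center + half] it is
    left alone; otherwise it is clamped to the nearest edge of the field of
    view so that it becomes visible to the user.
    """
    lo = fov_center - fov_half_width
    hi = fov_center + fov_half_width
    return min(max(content_angle, lo), hi)
```

A full implementation would first detect the "attempting to look" condition (e.g. from gaze or head pose trending toward the off-screen content) before applying the adjustment, and would clamp in two dimensions.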

Hybrid IGZO pixel architecture

A display device includes a silicon wafer including digital circuits, a micro-light emitting diode (micro-LED) wafer including an array of micro-LEDs, and an indium-gallium-zinc-oxide (IGZO) layer between the silicon wafer and the micro-LED wafer and including analog circuits. The digital circuits are characterized by a first operating supply voltage and are configured to generate digital control signals based on digital display data of an image frame. The analog circuits are characterized by a second operating supply voltage higher than the first operating supply voltage. The analog circuits include analog storage devices configured to store analog signals, and transistors controlled by the digital control signals and the analog signals to generate drive currents for driving the array of micro-LEDs. The digital circuits on the silicon wafer or the analog circuits in the IGZO layer include level-shifting circuits at interfaces between the digital circuits and the analog circuits.

System for gaze interaction

The present invention provides improved methods and systems for assisting a user interacting with a graphical user interface by combining gaze-based input with gesture-based user commands. It enables a user of a computer system without a traditional touch-screen to interact with graphical user interfaces in a touch-screen-like manner using a combination of gaze-based input and gesture-based user commands. The invention also offers touch-screen-like interaction using gaze and gesture input as a complement or an alternative to touch-screen interaction on a computer device that does have a touch-screen, for instance in situations where using the touch-screen is cumbersome or ergonomically challenging, such as when the touch-screen is arranged unfavourably for the user or when it is simply more comfortable for the user to interact by gaze and gesture than by touch.
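The division of labour the abstract describes, gaze selecting *what* to act on and the gesture selecting *how*, can be sketched as below. The widget layout, gesture names, and action mapping are assumptions for illustration:

```python
def hit_test(gaze_point, widgets):
    """Return the widget under the gaze point, if any (gaze picks the target)."""
    x, y = gaze_point
    for name, (x0, y0, x1, y1) in widgets.items():
        if x0 <= x <= x1 and y0 <= y <= y1:
            return name
    return None

def interact(gaze_point, gesture, widgets):
    """Combine gaze-based targeting with a gesture-based command."""
    target = hit_test(gaze_point, widgets)
    if target is None:
        return None  # user is not looking at any interactive element
    actions = {"pinch": "activate", "swipe": "scroll", "hold": "context-menu"}
    return (target, actions.get(gesture, "ignore"))
```

This mirrors how a touch-screen collapses targeting and commanding into one touch: here the gaze point replaces the finger position, and the hand gesture replaces the tap, drag, or long-press.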