Patent classifications
G02B2027/0198
WEARABLE VIDEO HEADSET AND METHOD FOR CALIBRATION
During calibration, a wearable video headset displays a pattern on a partially transparent display positioned in a field of view of a user's eye. The user has a hand-held marker that includes a scaled version of the displayed pattern. The user moves the marker toward or away from the user's eye until the pattern on the marker appears to be the same size as the pattern on the display. When the sizes match, the headset measures a distance between a forward-facing camera and the hand-held marker. The headset uses the measured distance, and geometrical relationships, to determine the spacing between the user's eye and the display. Such calibration can ensure that the images displayed to the user mesh realistically with the surroundings, which remain partially visible through the partially transparent display of the video headset.
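The "geometrical relationships" of this abstract reduce to similar triangles: if the marker pattern is printed at k times the physical size of the displayed pattern, the two subtend the same visual angle when the marker sits k times as far from the eye as the display. A minimal sketch, where the scale factor, the camera-to-eye offset, and all numeric values are illustrative assumptions rather than figures from the patent:

```python
def eye_to_display_distance(cam_to_marker_mm: float,
                            cam_to_eye_mm: float,
                            marker_scale: float) -> float:
    """Estimate the eye-to-display spacing via similar triangles.

    When the marker pattern (printed at `marker_scale` times the size of
    the displayed pattern) appears equal in size to the displayed pattern,
    both subtend the same visual angle, so:

        eye_to_display == eye_to_marker / marker_scale
    """
    # The headset measures camera-to-marker; the small camera-to-eye
    # offset is assumed known from the headset's mechanical design.
    eye_to_marker_mm = cam_to_marker_mm + cam_to_eye_mm
    return eye_to_marker_mm / marker_scale

# e.g. marker printed at 20x scale, measured 390 mm from the camera,
# camera sitting 10 mm in front of the eye:
# eye_to_display_distance(390.0, 10.0, 20.0) -> 20.0 (mm)
```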
Wearable computer using programmed local tag
A wearable computing device includes a head-mounted display (HMD) that provides a field of view in which at least a portion of the environment of the wearable computing device is viewable. The HMD is operable to display images superimposed over the field of view. When the wearable computing device determines that a target device is within its environment, the wearable computing device obtains target device information related to the target device from a programmed local tag. The target device information may include information that defines a virtual control interface for controlling the target device and an identification of a defined area of the target device on which the virtual control image is to be provided. The wearable computing device controls the HMD to display the virtual control image as an image superimposed over the defined area of the target device in the field of view.
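One way to picture the tag-to-overlay flow is a small parser: the tag payload names the target device, the area of the device over which the virtual control image is to be provided, and the controls making up the virtual control interface. The JSON encoding and every field name below are invented for illustration; the patent does not specify a payload format:

```python
import json
from dataclasses import dataclass

@dataclass
class VirtualControl:
    label: str   # e.g. "power", "volume"
    kind: str    # e.g. "button", "slider"

@dataclass
class ControlOverlay:
    target_id: str
    area: tuple          # (x, y, w, h) on the target device's surface
    controls: list       # list[VirtualControl]

def parse_tag_payload(payload: str) -> ControlOverlay:
    """Decode a hypothetical JSON tag payload into an overlay description."""
    data = json.loads(payload)
    return ControlOverlay(
        target_id=data["target_id"],
        area=tuple(data["area"]),
        controls=[VirtualControl(c["label"], c["kind"])
                  for c in data["controls"]],
    )

tag = ('{"target_id": "tv-42", "area": [0, 0, 120, 40],'
       ' "controls": [{"label": "power", "kind": "button"}]}')
overlay = parse_tag_payload(tag)
# The HMD would then render overlay.controls superimposed over
# overlay.area of device overlay.target_id in the field of view.
```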
Method for customizing a head mounted device adapted to generate a virtual image
A method for customizing a head-mounted device adapted to generate a virtual image for a wearer, the method comprising: obtaining a 3D eye pupil position of the wearer in a reference frame, corresponding to a predetermined gaze direction; and customizing the head-mounted device based on the 3D eye pupil position so that the position of a focus area (EMB) at least partly coincides with the position of the eye pupil in the predetermined gaze direction, wherein customizing the head-mounted device comprises controlling a recording of a holographic mirror based on the 3D eye pupil position in the reference frame.
Camera System
A device for MR/VR systems that includes a two-dimensional array of cameras that capture images of respective portions of a scene. The cameras are positioned along a spherical surface so that the cameras have adjacent fields of view. The entrance pupils of the cameras are positioned at or near the user's eye while the cameras also form optimized images at the sensor. Methods for reducing the number of cameras in an array, as well as methods for reducing the number of pixels read from the array and processed by the pipeline, are also described.
Eyewear-type terminal and method of controlling the same
Provided is an eyewear-type terminal including a display unit on which picture information is displayed; a sensing unit that senses the period of time for which a user's gaze has been fixed in a state where the user wears the eyewear-type terminal; and a controller that collects information relating to what the user gazes toward, in a case where the user's gaze has been fixed for a reference period of time or longer, and controls the display unit in such a manner that at least one piece of the collected information is displayed on the display unit.
CONTROL DEVICE, CONTROL METHOD, AND PROGRAM
Provided is a configuration for executing display information output control with improved visibility on a display unit worn or carried by a user. A controller configured to execute display information output control on the display unit is included. The controller sets a turning-on (ON) period, an output period of display information to the display unit, and a turning-off (OFF) period, a non-output period of display information to the display unit, and controls switching between afterimage-consideration pulse display, in which the turning-off period is set to be within an afterimage recognition period, and normal pulse display, in which the turning-off period is set to be longer than or equal to the afterimage recognition period. The controller executes this switching control depending on the eye velocity of the user: it executes the afterimage-consideration pulse display in a case where the eye velocity is less than a threshold and executes the normal pulse display in a case where the eye velocity is greater than or equal to the threshold.
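The switching rule in this abstract is a simple threshold comparison on eye velocity. A minimal sketch, assuming illustrative values for the afterimage recognition period and the eye-velocity threshold (the patent specifies neither):

```python
AFTERIMAGE_RECOGNITION_MS = 50.0   # assumed value; not given in the patent

def select_off_period_ms(eye_velocity_deg_s: float,
                         threshold_deg_s: float = 100.0) -> float:
    """Pick the display OFF period per the abstract's switching rule.

    Below the eye-velocity threshold: afterimage-consideration pulse
    display, with an OFF period within the afterimage recognition period.
    At or above the threshold: normal pulse display, with an OFF period
    longer than or equal to the afterimage recognition period.
    """
    if eye_velocity_deg_s < threshold_deg_s:
        return AFTERIMAGE_RECOGNITION_MS * 0.5   # within the recognition period
    return AFTERIMAGE_RECOGNITION_MS             # at least the recognition period

# select_off_period_ms(30.0)  -> 25.0  (slow eye: afterimage-consideration)
# select_off_period_ms(150.0) -> 50.0  (fast eye: normal pulse display)
```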
REPROJECTION AND WOBULATION AT HEAD-MOUNTED DISPLAY DEVICE
A head-mounted display device including one or more position sensors and a processor. The processor may receive a rendered image of a current frame. The processor may receive position data from the one or more position sensors and determine an updated device pose based on the position data. The processor may apply a first spatial correction to color information in pixels of the rendered image at least in part by reprojecting the rendered image based on the updated device pose. The head-mounted display device may further include a display configured to apply a second spatial correction to the color information in the pixels of the rendered image at least in part by applying wobulation to the reprojected rendered image to thereby generate a sequence of wobulated pixel subframes for the current frame. The display may display the current frame by displaying the sequence of wobulated pixel subframes.
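The two-stage correction can be caricatured in a few lines: a late-stage image shift standing in for full pose-based reprojection, followed by a sequence of offset subframes standing in for wobulation. This is a toy model of the pipeline, not the patent's implementation; real reprojection warps by the full updated pose, and wobulation shifts by fractions of a pixel rather than whole pixels:

```python
def reproject(image, shift_px):
    """First spatial correction: shift the rendered frame toward the
    updated device pose. `image` is a list of rows; `shift_px` is a
    horizontal offset (positive = right, with wrap-around for brevity)."""
    w = len(image[0])
    return [[row[(x - shift_px) % w] for x in range(w)] for row in image]

def wobulate(image, offsets=((0, 0), (1, 0))):
    """Second spatial correction: emit a sequence of subframes at small
    offsets (whole-pixel here, subpixel on real hardware) that the
    display presents in turn for the current frame."""
    return [reproject(image, dx) for dx, _dy in offsets]

frame = [[1, 2, 3, 4],
         [5, 6, 7, 8]]
subframes = wobulate(reproject(frame, 1))
# subframes[0][0] -> [4, 1, 2, 3]; subframes[1][0] -> [3, 4, 1, 2]
```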
Determining inter-pupillary distance
A head-mounted display device is disclosed that includes an at least partially see-through display, a processor, and a non-volatile storage device holding instructions executable by the processor to: select an image that corresponds to a physical object viewable by the user; display the image at a perceived offset to the physical object; in response to alignment user input, move a perceived position of the image relative to the physical object; output an instruction to provide completion user input when the image appears to align with the physical object; when the completion user input is received, determine the inter-pupillary distance of the user; and calibrate the head-mounted display device based on the inter-pupillary distance.
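One plausible mapping from the alignment input to an inter-pupillary distance, assumed here for illustration rather than taken from the patent: rendering starts from a default IPD, and the per-eye adjustment the user makes before signalling completion measures how far the true IPD departs from that default.

```python
def calibrated_ipd_mm(default_ipd_mm: float,
                      per_eye_shift_px: float,
                      pixel_pitch_mm: float) -> float:
    """Hypothetical alignment-to-IPD mapping.

    Rendering assumes a default IPD; if the user has shifted each eye's
    image outward by `per_eye_shift_px` to make the virtual image line
    up with the physical object, the eyes sit wider than assumed by
    twice that shift converted to millimetres on the display.
    """
    return default_ipd_mm + 2.0 * per_eye_shift_px * pixel_pitch_mm

# default 63 mm, each eye nudged 25 px outward at 0.02 mm per pixel:
# calibrated_ipd_mm(63.0, 25.0, 0.02) -> 64.0 (mm)
```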
Head-mount type display device, control system, method of controlling head-mount type display device, and computer program
A transmissive head-mount type display device includes an image display section adapted to display a virtual image while transmitting an external sight, an object acquisition section adapted to obtain a selectable object located within a predetermined distance range from the image display section and the position of a specific object included in the external sight, and a control section. The control section displays an object-correspondence virtual image associated with the obtained object as the virtual image using the image display section, identifies a change in the position of the specific object based on the obtained position, selects the object based on a relationship between the identified change in the position of the specific object and the position of the obtained object, and displays a specific check image associated with the selected object as the virtual image using the image display section.
GAZE TRACKING APPARATUS AND SYSTEMS
A gaze tracking system comprising: one or more cameras operable to capture one or more images of a side view of one or both of a user's eyes; a cornea identification unit operable to identify the location and size of a cornea in one or more of the captured images; and a gaze detection unit operable to determine a direction of the user's gaze in dependence upon the identified location of the cornea in the one or more captured images.
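A side-view camera sees the cornea as a bulge riding on the eyeball's profile, so its displacement from the eyeball centre gives one component of gaze direction. A minimal sketch under that assumed spherical-eye model (the patent does not prescribe this particular geometry, and all values are illustrative):

```python
import math

def gaze_pitch_deg(cornea_center_y_px: float,
                   eyeball_center_y_px: float,
                   eyeball_radius_px: float) -> float:
    """Estimate vertical gaze angle from a side-view image.

    The cornea sits on the front of the eyeball along the optical axis,
    so in profile its centre rides on the eyeball sphere: its elevation
    above or below the eyeball centre maps to gaze pitch by arcsine.
    """
    ratio = (eyeball_center_y_px - cornea_center_y_px) / eyeball_radius_px
    ratio = max(-1.0, min(1.0, ratio))   # guard against segmentation noise
    return math.degrees(math.asin(ratio))

# cornea detected 30 px above an eyeball centre, radius 60 px:
# gaze_pitch_deg(70.0, 100.0, 60.0) -> ~30 degrees (looking upward)
```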