Patent classifications
G02B2027/0187
Gaze tracking apparatus and systems
A system configured to perform an eye tracking process using a head-mountable eye-tracking arrangement. The system comprises an eye tracking unit, located on the head-mountable arrangement, operable to detect motion of one or both of the user's eyes; a relative motion identification unit operable to identify motion of the head-mountable arrangement relative to the user's head; and a correction unit operable to determine a correction to the eye tracking process in dependence upon the identified motion of the head-mountable arrangement relative to the user's head.
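The correction step described above can be sketched very simply. Assuming, purely for illustration (the abstract does not specify a model), that a rotational slip of the headset relative to the head adds an equal apparent offset to the tracked gaze angles, the correction unit's job reduces to subtracting the identified offset:

```python
# Hypothetical sketch of the correction described in the abstract.
# The additive slip model and all names are assumptions, not from the patent.

def correct_gaze(raw_gaze_deg, headset_offset_deg):
    """Remove the apparent gaze shift caused by headset slippage.

    raw_gaze_deg: (yaw, pitch) reported by the headset-mounted eye tracker.
    headset_offset_deg: (yaw, pitch) rotation of the headset relative to
    the head, as identified by the relative motion identification unit.
    """
    return (raw_gaze_deg[0] - headset_offset_deg[0],
            raw_gaze_deg[1] - headset_offset_deg[1])

# A 2-degree slip of the headset makes the eye appear to look 2 degrees
# further than it really does; the correction removes that bias.
corrected = correct_gaze((10.0, -3.0), (2.0, 0.5))
```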
Short distance illumination of a spatial light modulator using an optical element with an aperture
A display device includes a light source, a spatial light modulator, and an optical assembly. The light source is configured to provide illumination light and the spatial light modulator is positioned to receive the illumination light. The optical assembly includes a first reflective surface with an aperture and a second reflective surface that is opposite to the first reflective surface. The optical assembly is positioned relative to the light source so that at least a first portion of the illumination light received by the optical assembly is reflected by the second reflective surface toward the first reflective surface, is reflected by the first reflective surface toward the second reflective surface, and is transmitted through the second reflective surface. A method performed by the display device is also disclosed.
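The folded geometry above is what permits short-distance illumination: per the abstract, the first portion of the light crosses the gap between the two reflective surfaces three times (toward the first surface, back toward the second, then out through it), so the effective propagation distance is roughly three times the physical gap. A minimal sketch of that relationship (the factor of three follows from counting the abstract's traversals; real values would depend on geometry the abstract does not give):

```python
def effective_path_mm(gap_mm, crossings=3):
    """Effective propagation distance for light that crosses the gap
    between the two reflective surfaces `crossings` times before being
    transmitted through the second (partially transmissive) surface."""
    return gap_mm * crossings

# A 5 mm physical gap then behaves like ~15 mm of illumination distance.
```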
INFORMATION PROCESSING DEVICE THAT DISPLAYS A VIRTUAL OBJECT RELATIVE TO REAL SPACE
An information processing device including a display unit, a detector, and a first control unit, and a method of using the same. The display unit may be a head-mounted display and is capable of providing a user with a field of view of a real space and a virtual object. The detector detects an azimuth of the display unit around at least one axis, and display of the virtual object is controlled based on the detected azimuth.
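As a hedged illustration of azimuth-driven display control (the linear angle-to-pixel mapping, field of view, and names are assumptions, not taken from the abstract), a virtual object anchored at a fixed world azimuth can be mapped to a horizontal display position from the detected azimuth of the display unit:

```python
def object_screen_x(object_azimuth_deg, display_azimuth_deg,
                    fov_deg=40.0, width_px=1280):
    """Horizontal pixel position of a world-anchored virtual object,
    given the detected azimuth of the display unit. Returns None when
    the object falls outside the horizontal field of view."""
    # Signed angular offset in (-180, 180], handling wrap-around at 360.
    delta = (object_azimuth_deg - display_azimuth_deg + 180.0) % 360.0 - 180.0
    if abs(delta) > fov_deg / 2.0:
        return None
    return int(round((delta / fov_deg + 0.5) * width_px))
```

As the user turns, the detected azimuth changes and the object slides across the field of view, staying anchored to its real-space direction.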
INTELLIGENT SYSTEM FOR CONTROLLING FUNCTIONS IN A COMBAT VEHICLE TURRET
A system for controlling turret functions of a land-based combat vehicle includes: a plurality of image detection sensors for recording sequences of images having an at least partial view of a 360° environment of the land-based combat vehicle; at least one virtual, augmented or mixed reality headset to be worn by an operator, the headset presenting the at least partial view of the environment of the land-based combat vehicle on a display and including a direction sensor for tracking the orientation imparted to the headset by movement of the operator's head, and eye tracking means for tracking the operator's eye movements; and a control unit including at least one computing unit that receives as input, and processes, the images supplied by the plurality of image detection sensors, the headset position and orientation data supplied by the direction sensor, and the eye position data supplied by the eye tracking means.
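The computing unit's fusion of the two tracked quantities can be sketched under a simple assumption (not stated in the abstract): the operator's world-frame line of sight is the headset orientation from the direction sensor plus the eye deflection from the eye tracking means:

```python
def aim_azimuth_deg(headset_yaw_deg, gaze_yaw_deg):
    """World azimuth the operator is looking at: headset yaw from the
    direction sensor plus eye yaw from the eye tracker (hypothetical
    additive model), normalized to [0, 360)."""
    return (headset_yaw_deg + gaze_yaw_deg) % 360.0
```

A control unit could then, for example, slew the turret toward `aim_azimuth_deg(...)` as a commanded bearing.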
INTERACTION PERIPHERAL, DETECTION METHOD, VIRTUAL REALITY HEADSET, METHOD FOR REPRODUCING A REAL POINT IN VIRTUAL SPACE, DEVICE AND METHOD FOR VIRTUALISING A REAL SPACE
An interaction peripheral, a method for detecting a real point, a virtual reality headset, a method for reproducing a real point in virtual space, and a device and method for virtualising a real space, in particular allowing a plane of a real space to be obtained in two, three or n dimensions and reproduced in virtual reality. The interaction peripheral, which can be connected to a virtual reality headset, includes a range finder that can supply to the headset a measurement signal including a relative position measurement of a real point of a real space, the real point being sighted by the range finder. The measurement signal enables the measured real point to be reproduced in a virtual space generated by the headset. The real point can thus be reproduced in virtual space while reducing the risk of error, because the measurement tools are simple interaction peripherals handled by a user.
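The reproduction step amounts to a coordinate transform: the range finder's relative measurement (a distance along the peripheral's sighting direction) is placed into the headset's virtual space using the peripheral's pose. A 2-D sketch under assumed names (the abstract does not define a coordinate convention):

```python
import math

def to_virtual_point(peripheral_pos, peripheral_yaw_deg, distance):
    """Place the sighted real point into virtual-space coordinates.

    peripheral_pos: (x, y) position of the peripheral in virtual space.
    peripheral_yaw_deg: sighting direction of the range finder.
    distance: range measurement from the measurement signal.
    """
    yaw = math.radians(peripheral_yaw_deg)
    return (peripheral_pos[0] + distance * math.cos(yaw),
            peripheral_pos[1] + distance * math.sin(yaw))
```

Sighting several such points would yield the vertices from which a plane of the real space can be rebuilt in virtual reality.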
Head Up Display Apparatus With a Bright Energy Efficient Backlight for a Vehicle
A head up display apparatus for a vehicle includes an imaging unit that generates a projection light beam carrying display content. The imaging unit includes a transmissive display indication layer with selectively controllable display elements distributed over an area; a matrix backlight that provides backlighting for the display indication layer and includes selectively controllable light sources distributed along it; and a collimation array with collimators arranged between the light sources and the transmissive display indication layer. A projection panel in the beam path of the projection light beam generated by the imaging unit reflects the projection light beam to a user, the projection panel being arranged in the beam path such that a virtual display image is generated behind it in the visual field of the user.
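The energy efficiency in the title plausibly comes from the selectively controllable light sources: a backlight zone only needs to be lit when some display element in front of it is switched on. A hypothetical local-dimming sketch of that idea (zone layout and names are assumptions, not from the abstract):

```python
def zones_to_enable(pixel_on, zone_w, zone_h):
    """For a grid of display-element states (True = element lit), return
    a grid of backlight-zone states: a zone's light source is enabled
    only if at least one element in the area it backlights is on."""
    rows, cols = len(pixel_on), len(pixel_on[0])
    zones = []
    for zr in range(0, rows, zone_h):
        zone_row = []
        for zc in range(0, cols, zone_w):
            lit = any(pixel_on[r][c]
                      for r in range(zr, min(zr + zone_h, rows))
                      for c in range(zc, min(zc + zone_w, cols)))
            zone_row.append(lit)
        zones.append(zone_row)
    return zones
```

Dark zones draw no backlight power, which is what makes the matrix backlight both bright where needed and efficient overall.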
MULTI-LAYER REPROJECTION TECHNIQUES FOR AUGMENTED REALITY
This disclosure provides systems, devices, apparatus, and methods, including computer programs encoded on storage media, for multi-layer reprojection techniques for augmented reality. A display processor may obtain a layer of graphics data including a plurality of virtual objects. Each of the plurality of virtual objects may be associated with at least one bounding box of a plurality of bounding boxes. The display processor may further obtain metadata indicative of at least one edge of the at least one bounding box of the plurality of bounding boxes, and metadata corresponding to reprojection instructions associated with each of the plurality of bounding boxes. The display processor may reproject the plurality of virtual objects based on the metadata indicative of the at least one edge of the at least one bounding box and the metadata corresponding to the reprojection instructions.
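The per-bounding-box reprojection can be sketched in its simplest form: each box carries its own reprojection instruction, here reduced (as an assumption; the abstract does not constrain the instruction format) to a 2-D translation applied to that box's edges:

```python
def reproject_boxes(boxes, instructions):
    """Apply per-box reprojection: each bounding box (x0, y0, x1, y1)
    is shifted by the (dx, dy) from its own reprojection metadata,
    leaving other boxes in the layer untouched."""
    out = []
    for (x0, y0, x1, y1), (dx, dy) in zip(boxes, instructions):
        out.append((x0 + dx, y0 + dy, x1 + dx, y1 + dy))
    return out
```

Reprojecting box-by-box, rather than warping the whole layer at once, is what lets virtual objects at different depths move by different amounts when the viewer's head moves.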
HEAD-UP DISPLAY DEVICE AND MOBILE OBJECT
A display region has a curved surface shape in which upper and lower end portions are disposed at positions closer to a visual field than a reference plane, and a central portion is disposed at a position farther from the visual field than the reference plane. A first convergence angle difference, between a convergence angle from the eye position to the upper end portion and a convergence angle from the eye position to a first point on the reference plane seen through the upper end portion; a second convergence angle difference, between a convergence angle to the central portion and a convergence angle to a second point on the reference plane seen through the central portion; and a third convergence angle difference, between a convergence angle to the lower end portion and a convergence angle to a third point on the reference plane seen through the lower end portion, each falls within four milliradians.
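The four-milliradian criterion can be checked numerically. For a point straight ahead at distance d, the convergence angle subtended at the two eyes is 2·atan((IPD/2)/d); the abstract's condition compares this angle for a point on the display region against a point on the reference plane. A sketch under assumed numbers (the 64 mm IPD and the distances below are illustrative, not from the abstract):

```python
import math

def convergence_angle(ipd_m, distance_m):
    """Convergence angle (radians) subtended at the eyes by a point
    straight ahead at the given distance, for interpupillary distance ipd_m."""
    return 2.0 * math.atan((ipd_m / 2.0) / distance_m)

def within_limit(ipd_m, d_surface_m, d_reference_m, limit_rad=4e-3):
    """True if the convergence angle difference between a point on the
    display surface and the corresponding reference-plane point falls
    within the stated limit (four milliradians by default)."""
    diff = abs(convergence_angle(ipd_m, d_surface_m)
               - convergence_angle(ipd_m, d_reference_m))
    return diff <= limit_rad
```

For a 64 mm IPD, a surface point at 2.0 m against a reference point at 2.1 m differs by about 1.5 mrad (within limit), while a reference point at 5.0 m differs by about 19 mrad (outside it).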
ULTRASOUND DEVICES FOR MAKING EYE MEASUREMENTS
The disclosed ultrasound devices may include at least one ultrasound transmitter positioned and configured to transmit ultrasound signals toward a user's face to reflect off a facial feature of the user's face and at least one ultrasound receiver positioned and configured to receive and detect the ultrasound signals reflected off the facial feature. At least one processor may be configured to receive data from the at least one ultrasound receiver and to determine, based on the received data from the at least one ultrasound receiver, at least one of the following eye measurements: an interpupillary distance of the user; an eye relief; or a position of a head-mounted display relative to the facial feature of the user. Various other devices, systems, and methods are also disclosed.
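The underlying measurement is ultrasound time of flight: the one-way distance to the reflecting facial feature is half the round-trip time multiplied by the speed of sound. A sketch under assumed names (the abstract does not specify the computation; the pupil-position inputs in the second helper are hypothetical intermediate values):

```python
import math

SPEED_OF_SOUND_M_S = 343.0  # in air at roughly 20 degrees C

def echo_distance_m(round_trip_s):
    """One-way distance to the facial feature from a round-trip
    ultrasound time of flight (signal travels out and back)."""
    return SPEED_OF_SOUND_M_S * round_trip_s / 2.0

def interpupillary_distance_m(left_pupil_x_m, right_pupil_x_m):
    """IPD from per-eye pupil x-positions, assumed to have been
    recovered from the left- and right-side echoes."""
    return abs(right_pupil_x_m - left_pupil_x_m)
```

Eye relief and headset position relative to the face would follow the same pattern: convert echo times to distances, then difference them against known transducer positions.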
AUGMENTED REALITY DEVICE AND METHOD FOR DETECTING GAZE OF USER
A method, performed by an augmented reality (AR) device including a vision correction lens, of detecting a gaze of a user is provided. The method includes obtaining lens characteristic information about the vision correction lens, which is arranged to overlap a light guide plate in a gaze direction of the user; emitting light for gaze tracking toward a light reflector through a light emitter, wherein the emitted light is reflected by the light reflector and then directed to an eye of the user; receiving light reflected by the eye of the user through a light receiver; obtaining an eye image of the user based on the received light; adjusting the eye image of the user based on the lens characteristic information about the vision correction lens; and obtaining gaze information based on the adjusted eye image.
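The adjustment step can be sketched under a deliberately simple assumption: the vision correction lens is modeled as a pure magnification about its optical center, so the apparent pupil position in the eye image is pulled back toward where it would be without the lens. The model and names are illustrative only; the abstract does not state what the lens characteristic information contains:

```python
def adjust_pupil_position(measured_xy, lens_magnification,
                          optical_center_xy=(0.0, 0.0)):
    """Undo the apparent displacement of the pupil caused by the vision
    correction lens, modeled (as an assumption) as a magnification of
    factor `lens_magnification` about the lens's optical center."""
    cx, cy = optical_center_xy
    x, y = measured_xy
    return (cx + (x - cx) / lens_magnification,
            cy + (y - cy) / lens_magnification)
```

Gaze information computed from the adjusted position then refers to the eye's true geometry rather than its lens-distorted image.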