Patent classifications
G02B2027/0138
Gaze tracking apparatus and systems
A system configured to perform an eye tracking process using a head-mountable eye-tracking arrangement, the system comprising: an eye tracking unit, located on the head-mountable arrangement, operable to detect motion of one or both of the user's eyes; a relative motion identification unit operable to identify motion of the head-mountable arrangement relative to the user's head; and a correction unit operable to determine a correction to the eye tracking process in dependence upon the identified motion of the user's head relative to the head-mountable arrangement.
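As a rough, illustrative sketch only (not taken from the patent), a correction unit of this kind might subtract the apparent gaze-angle offset produced when the headset slips relative to the head. The function name, the lever-arm parameter, and the small-angle model below are all assumptions:

```python
import numpy as np

def corrected_gaze(raw_gaze_deg, headset_shift_mm, lever_arm_mm=30.0):
    """Apply a first-order slip correction to a raw gaze estimate.

    raw_gaze_deg: (yaw, pitch) gaze angles from the eye tracking unit.
    headset_shift_mm: (dx, dy) translation of the headset relative to the
        head, as reported by the relative motion identification unit.
    lever_arm_mm: assumed eye-to-camera distance; a hypothetical
        parameter, not a value from the disclosure.
    """
    raw = np.asarray(raw_gaze_deg, dtype=float)
    shift = np.asarray(headset_shift_mm, dtype=float)
    # A small lateral slip of the tracking camera appears as an angular
    # offset of roughly atan(shift / lever_arm); subtract that offset.
    apparent_deg = np.degrees(np.arctan2(shift, lever_arm_mm))
    return raw - apparent_deg
```

A 30 mm lateral slip at a 30 mm lever arm would, under this toy model, masquerade as a 45° gaze error, which the correction removes.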
Short distance illumination of a spatial light modulator using an optical element with an aperture
A display device includes a light source, a spatial light modulator, and an optical assembly. The light source is configured to provide illumination light and the spatial light modulator is positioned to receive the illumination light. The optical assembly includes a first reflective surface with an aperture and a second reflective surface that is opposite to the first reflective surface. The optical assembly is positioned relative to the light source so that at least a first portion of the illumination light received by the optical assembly is reflected by the second reflective surface toward the first reflective surface, is reflected by the first reflective surface toward the second reflective surface, and is transmitted through the second reflective surface. A method performed by the display device is also disclosed.
Head mounted display apparatus
A head mounted display apparatus or the like faithfully expresses occlusion even in binocular vision in an AR display. A head mounted display apparatus 10 includes a lens 12, a lens 13, a camera 14, a camera 15, and a control processor 16. A CG image for the right eye is displayed on the lens 12, and a CG image for the left eye is displayed on the lens 13. The camera 14 captures an image for the right eye, and the camera 15 captures an image for the left eye. Based on the images captured by the cameras 14 and 15, the control processor 16 generates a CG image for the right eye in which occlusion as seen by the right eye is expressed and a CG image for the left eye in which occlusion as seen by the left eye is expressed, and projects the generated CG images onto the lenses 12 and 13, respectively. The center of the lens of the camera 14 is located at the same position as the center of the lens 12, and the center of the lens of the camera 15 is located at the same position as the center of the lens 13.
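One common way to express occlusion per eye, shown here only as a hedged illustration of the general idea (the patent does not specify this formulation), is a per-pixel depth test between the CG image and the real scene captured by that eye's camera:

```python
import numpy as np

def occlude_cg(cg_rgb, cg_depth, real_depth):
    """Mask out CG pixels that lie behind a real object, for one eye.

    cg_rgb:     H x W x 3 CG image rendered for this eye.
    cg_depth:   H x W depth of each CG pixel (metres).
    real_depth: H x W depth of the real scene from this eye's camera.
    All names and the depth-test formulation are illustrative
    assumptions, not the patent's actual method.
    """
    visible = cg_depth < real_depth   # CG pixel is in front of the real object
    out = np.zeros_like(cg_rgb)       # occluded pixels stay black (not drawn)
    out[visible] = cg_rgb[visible]
    return out
```

Running this test independently for the right-eye and left-eye camera images yields the per-eye occlusion the abstract describes.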
Wearable device and control method therefor
A wearable device is disclosed. The wearable device comprises: a camera; a sensor; a display; a laser projector; and a processor for identifying a user's sight line on the basis of sensing data obtained by the sensor, identifying location information of at least one object included in an image obtained by the camera, and controlling the laser projector to provide, on the display, augmented reality (AR) information related to the object on the basis of the identified user's sight line and the location information of the object. The laser projector comprises: a light emitting element for emitting laser light; a focus adjusting element for adjusting a focus of the emitted laser light; and a scanning mirror for controlling a scan direction of the light having the adjusted focus. The processor controls a driving state of the focus adjusting element on the basis of the identified user's sight line and the location information of the object.
INTELLIGENT SYSTEM FOR CONTROLLING FUNCTIONS IN A COMBAT VEHICLE TURRET
A system for controlling turret functions of a land-based combat vehicle includes: a plurality of image detection sensors for recording sequences of images having an at least partial view of a 360° environment of the land-based combat vehicle; at least one virtual, augmented or mixed reality headset to be worn by an operator, the headset presenting the at least partial view of the environment of the land-based combat vehicle on a display, the headset including a direction sensor for tracking an orientation of the headset imparted during a movement of a head of the operator and eye tracking means for tracking eye movements of the operator; and a control unit including at least one computing unit for receiving as input and processing: images supplied by the plurality of image detection sensors; headset position and orientation data supplied by the direction sensor; and eye position data supplied by the eye tracking means.
INFORMATION PROCESSING METHOD AND ELECTRONIC DEVICE
An information processing method and an electronic device are provided. The method is performed by a first wearable device that includes a first image collector, and includes: in a case that the first wearable device and a second wearable device are in a preset positional relationship, obtaining a second face image by the first image collector and receiving a first face image from the second wearable device; and in a case that the first face image matches the second face image, processing first target information from the second wearable device.
WEARABLE APPARATUS FOR ACTIVE SUBSTITUTION
A hearing aid and related systems and methods. In one implementation, a hearing aid system may comprise a wearable camera configured to capture images from an environment of a user, a microphone configured to capture sounds from the environment of the user, and a processor. The processor may be programmed to receive images captured by the camera; receive audio signals representative of sounds captured by the microphone; operate in a first mode to cause a first selective conditioning of a first audio signal; determine, based on analysis of at least one of the images or the audio signals, to switch to a second mode to cause a second selective conditioning of the first audio signal; and cause transmission of the first audio signal selectively conditioned in the second mode to a hearing interface device configured to provide sound to an ear of the user.
ULTRASOUND DEVICES FOR MAKING EYE MEASUREMENTS
The disclosed ultrasound devices may include at least one ultrasound transmitter positioned and configured to transmit ultrasound signals toward a user's face to reflect off a facial feature of the user's face and at least one ultrasound receiver positioned and configured to receive and detect the ultrasound signals reflected off the facial feature. At least one processor may be configured to receive data from the at least one ultrasound receiver and to determine, based on the received data from the at least one ultrasound receiver, at least one of the following eye measurements: an interpupillary distance of the user; an eye relief; or a position of a head-mounted display relative to the facial feature of the user. Various other devices, systems, and methods are also disclosed.
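For illustration only, the basic arithmetic behind such ultrasound echo measurements is simple; the speed-of-sound constant and the helper names below are assumptions, not details from the disclosure:

```python
SPEED_OF_SOUND_M_S = 343.0  # in air at roughly 20 °C (assumed medium)

def echo_distance_m(round_trip_s):
    """Transducer-to-feature distance from an echo's round-trip time.

    The signal travels to the facial feature and back, hence the
    division by two.
    """
    return SPEED_OF_SOUND_M_S * round_trip_s / 2.0

def interpupillary_distance_m(left_eye_x_m, right_eye_x_m):
    """IPD as the separation of two estimated eye positions (toy model)."""
    return abs(right_eye_x_m - left_eye_x_m)
```

For example, a 0.2 ms round trip corresponds to about 34 mm of eye relief under these assumptions.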
3D MAPPING IN 2D SCANNING DISPLAY
A wearable display device includes a light source, a beam scanner, a pupil-replicating lightguide, and a detector. The light source is configured to emit an image beam and a ranging beam, and the beam scanner co-scans both beams. The image beam is used to form an image in angular domain for displaying to a user of the wearable display device, while the ranging beam is used to scan the outside environment at the same time. Light reflected from objects in the outside environment is detected by the detector, and a 3D map of the outside environment is built using time-of-flight measurements of the reflected signal and/or triangulation. For triangulation measurements, the detector may include a digital camera.
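As a hedged sketch of the time-of-flight path (the scan-angle convention and function below are illustrative assumptions, not the patent's method), each ranging-beam sample pairs the scanner's known beam direction with a measured round-trip time to produce one 3D map point:

```python
import math

SPEED_OF_LIGHT_M_S = 299_792_458.0

def tof_point(scan_azimuth_deg, scan_elevation_deg, round_trip_s):
    """Convert one ranging-beam sample to a 3D point (x, y, z) in metres.

    The range is half the round-trip optical path; the angles are the
    scanner's beam direction at the moment of emission (assumed
    spherical convention with z pointing forward).
    """
    r = SPEED_OF_LIGHT_M_S * round_trip_s / 2.0
    az = math.radians(scan_azimuth_deg)
    el = math.radians(scan_elevation_deg)
    x = r * math.cos(el) * math.sin(az)
    y = r * math.sin(el)
    z = r * math.cos(el) * math.cos(az)
    return (x, y, z)
```

Accumulating one such point per scan position over a full frame yields the 3D map the abstract describes.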
AUGMENTED REALITY DEVICE AND METHOD FOR DETECTING GAZE OF USER
A method, performed by an augmented reality (AR) device including a vision correction lens, of detecting a gaze of a user is provided. The method includes obtaining lens characteristic information about the vision correction lens arranged to overlap a light guide plate in a gaze direction of the user, emitting light for gaze tracking toward a light reflector through a light emitter, wherein the emitted light is reflected by the light reflector and then directed to an eye of the user, receiving, through a light receiver, light reflected by the eye of the user, obtaining an eye image of the user based on the received light, adjusting the eye image of the user based on the lens characteristic information about the vision correction lens, and obtaining gaze information based on the adjusted eye image.
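Purely as an illustration of the adjustment step (the patent does not disclose this formula), a scalar magnification derived from the lens characteristic information could be undone on detected pupil coordinates before gaze estimation; every name and the single-scalar lens model here are hypothetical:

```python
import numpy as np

def adjust_eye_image_coords(pupil_px, image_center_px, magnification):
    """Undo the apparent magnification a vision-correction lens adds to
    the eye image, so gaze estimation sees undistorted coordinates.

    magnification: assumed scalar from the lens characteristic
        information (e.g. < 1 for a minus-power lens); hypothetical.
    """
    p = np.asarray(pupil_px, dtype=float)
    c = np.asarray(image_center_px, dtype=float)
    # Rescale the pupil offset about the image center.
    return c + (p - c) / magnification
```

With a magnification of 0.5, a pupil detected 10 px from center would be restored to a 20 px offset before the gaze computation.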