Patent classifications
G02B2027/0187
ANGULAR LIGHT SENSOR AND EYE-TRACKING
Angular sensors that may be used in eye-tracking systems are disclosed. An eye-tracking system may include a plurality of light sources to emit illumination light and a plurality of angular light sensors to receive returning light, i.e. the illumination light reflected from an eyebox region. The angular light sensors may output angular signals representing an angle of incidence of the returning light.
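Two such angular signals suffice to locate the reflection point by triangulation. A minimal 2-D sketch of that geometry (the sensor layout, function names, and ray-intersection approach are illustrative assumptions, not taken from the patent):

```python
import math

def intersect_rays(p1, theta1, p2, theta2):
    """Locate the point where two sensor bearings cross.

    Each angular sensor at position p reports the direction angle theta
    (radians) from which returning light arrives; intersecting the two
    rays gives a 2-D estimate of the reflection point on the eye.
    """
    d1 = (math.cos(theta1), math.sin(theta1))
    d2 = (math.cos(theta2), math.sin(theta2))
    # Solve p1 + t*d1 = p2 + s*d2 for t via the 2x2 determinant.
    det = d1[0] * (-d2[1]) - (-d2[0]) * d1[1]
    if abs(det) < 1e-12:
        raise ValueError("rays are parallel; no unique intersection")
    bx, by = p2[0] - p1[0], p2[1] - p1[1]
    t = (bx * (-d2[1]) - (-d2[0]) * by) / det
    return (p1[0] + t * d1[0], p1[1] + t * d1[1])
```

For example, sensors at (0, 0) and (2, 0) reporting bearings of 45° and 135° place the reflection point at (1, 1).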
Head-up display capable of adjusting imaging position
A head-up display capable of adjusting an imaging position is provided. The head-up display includes an image generation module, a reflector, a holographic diffraction optical element and a control unit. The image generation module is configured to display and project an image. The reflector is configured to reflect the image and project it onto a transparent screen. The holographic diffraction optical element is disposed on the transparent screen to reflect the image into the visible range of the user's eyes. The control unit is coupled to the reflector or the transparent screen to adjust the viewing angle of the holographic diffraction optical element, which forms a predetermined angle with the reflector.
Augmented reality for vehicle operations
A method includes: saving in-flight data from an aircraft during a simulated training exercise, wherein the in-flight data includes geospatial locations of the aircraft, positional attitudes of the aircraft, and head positions of a pilot operating the aircraft; saving simulation data relating to a simulated virtual object presented to the pilot as augmented-reality content in flight, wherein the virtual object was programmed to interact with the aircraft during the simulated training exercise; and representing the in-flight data from the aircraft and the simulation data relating to the simulated virtual object as a replay of the simulated training exercise.
Color-sensitive virtual markings of objects
Disclosed are systems, methods, and non-transitory computer readable media for making virtual colored markings on objects. Instructions may include receiving an indication of an object; receiving from an image sensor an image of a hand of an individual holding a physical marking implement; detecting in the image a color associated with the marking implement; receiving from the image sensor image data indicative of movement of a tip of the marking implement and locations of the tip; determining from the image data when the locations of the tip correspond to locations on the object; and generating, in the detected color, virtual markings on the object at the corresponding locations.
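The final two steps — matching tip locations against the object and emitting colored markings — can be sketched as follows (the rectangle representation of the object and all names are illustrative assumptions; the patent does not specify the geometry model):

```python
def virtual_markings(tip_positions, object_bounds, color):
    """Keep only tip locations that fall on the object and tag them
    with the color detected on the physical marking implement.

    object_bounds is a hypothetical axis-aligned rectangle
    (x0, y0, x1, y1) standing in for the detected object region.
    """
    x0, y0, x1, y1 = object_bounds
    return [(x, y, color)
            for (x, y) in tip_positions
            if x0 <= x <= x1 and y0 <= y <= y1]
```

Tip positions outside the object region are discarded rather than drawn, matching the claim's "when the locations of the tip correspond to locations on the object".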
Imaging display device and wearable device
An imaging display device includes an imaging unit, a processing unit, a display unit, and a pupil detection unit. The imaging unit includes a plurality of photoelectric conversion elements and is configured to acquire first image information. The processing unit is configured to process a signal from the imaging unit and generate second image information. The display unit is configured to display an image that is based on the signal from the processing unit. The pupil detection unit is configured to detect vector information of a pupil. The processing unit generates the second image information by processing the first image information based on the vector information of the pupil.
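One plausible reading of "processing the first image information based on the vector information of the pupil" is cropping the captured image around the gaze point before display. A toy sketch under that assumption (image-as-nested-lists and all names are illustrative):

```python
def crop_around(image, center, half):
    """Derive second image information by cropping the first image
    around the gaze point indicated by the pupil vector.

    image  : 2-D list of pixel values (rows of columns)
    center : (row, col) gaze point from the pupil detection unit
    half   : half-width of the cropped window, in pixels
    """
    r, c = center
    rows = image[max(0, r - half): r + half + 1]
    return [row[max(0, c - half): c + half + 1] for row in rows]
```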
Pixel intensity modulation using modifying gain values
A visual perception device has a look-up table stored in a laser driver chip. The look-up table includes relational gain data to compensate for brighter areas of a laser pattern, in which pixels are located more closely than in areas where the pixels are farther apart, and to compensate for differences in intensity of individual pixels when pixel intensities are altered due to design characteristics of an eyepiece.
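The compensation idea can be sketched numerically: densely packed pixels make a region brighter, so the look-up table assigns them a lower gain. The gain model below is a simplified assumption of what such a table might contain, not the patent's actual data:

```python
def density_gains(pixel_spacing, reference_spacing):
    """Relational gain per pixel: closer spacing -> brighter region ->
    lower gain. Normalized so pixels at the reference spacing get 1.0.
    (Simplified stand-in for the look-up table contents.)
    """
    return [min(1.0, s / reference_spacing) for s in pixel_spacing]

def apply_gain(intensities, gains):
    """Scale each commanded pixel intensity by its relational gain,
    clamping to the valid 8-bit range."""
    return [min(255, round(i * g)) for i, g in zip(intensities, gains)]
```

A pixel in a region at half the reference spacing is driven at half intensity, evening out perceived brightness across the laser pattern.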
Method and device for carrying out eye gaze mapping
The invention relates to a device and a method for performing an eye gaze mapping (M), in which at least one point of vision (B) and/or a viewing direction of at least one person (10) in relation to at least one scene recording (S) of a scene (12) viewed by the at least one person (10) is mapped onto a reference (R). At least a part of an algorithm (A1, A2, A3) for performing the eye gaze mapping (M) is thereby selected from multiple predetermined algorithms (A1, A2, A3) as a function of at least one parameter (P), and the eye gaze mapping (M) is performed on the basis of the at least one part of the algorithm (A1, A2, A3).
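Selecting part of the mapping algorithm as a function of a parameter is essentially a dispatch table. A minimal sketch (the parameter values and the A1–A3 behaviors are illustrative assumptions; the claim leaves both open):

```python
# Hypothetical dispatch: pick the gaze-mapping algorithm (A1..A3)
# based on a parameter P such as the scene type.
ALGORITHMS = {
    "static_scene": lambda gaze: ("A1", gaze),
    "moving_scene": lambda gaze: ("A2", gaze),
}

def map_gaze(gaze_point, parameter):
    """Map a point of vision B onto the reference R using the
    algorithm selected by the parameter; A3 is the fallback."""
    algorithm = ALGORITHMS.get(parameter, lambda g: ("A3", g))
    return algorithm(gaze_point)
```

The dict keeps algorithm selection declarative: adding a fourth algorithm is one new entry, with no change to the mapping call site.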
Method and device for refraction adjustment, and augmented reality apparatus
A method and device for refraction adjustment in an augmented reality apparatus, and an augmented reality apparatus. The method for refraction adjustment includes: receiving light rays reflected from the eyes of a user wearing an augmented reality apparatus; determining a pupil distance of the user according to the reflected light rays; and generating a refraction correction signal, according to the pupil distance of the user and a desired diopter, for correcting the diopters of the user's eyes by means of a refraction adjustment element.
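The two computational steps — pupil distance from detected pupil centers, and a correction signal from the diopter gap — can be sketched as below (the sign convention and function names are assumptions; the claim does not fix a formula):

```python
import math

def pupil_distance(left_pupil, right_pupil):
    """Euclidean distance between the detected pupil centers,
    in whatever units the eye-facing sensor reports."""
    return math.dist(left_pupil, right_pupil)

def refraction_correction(measured_diopter, desired_diopter):
    """Correction signal as the diopter delta the refraction
    adjustment element must apply (sign convention assumed:
    positive means 'add converging power')."""
    return desired_diopter - measured_diopter
```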
Head-up display and moving body with head-up display mounted thereon
A head-up display is configured to project an image on a transparent reflection member to cause an observer to visually recognize a virtual image, and includes a display device configured to display the image, and a projection optical system configured to project the image displayed by the display device as the virtual image for the observer. The projection optical system is configured to form an image as an intermediate image, and includes a first lens configured to condense light, and a first optical element configured to diffuse light. The first lens and the first optical element are disposed in this order along an optical path from the display device. The first lens is inclined with respect to a reference beam which is defined as a beam reaching a center of a viewpoint region of the observer and corresponding to a center of the virtual image.
Measurement method and system
Methods and systems for determining an individual gaze value are disclosed herein. An exemplary method involves: (a) receiving gaze data for a first wearable computing device, wherein the gaze data is indicative of a wearer-view associated with the first wearable computing device, and wherein the first wearable computing device is associated with a first user-account; (b) analyzing the gaze data from the first wearable computing device to detect one or more occurrences of one or more advertisement spaces in the gaze data; (c) based at least in part on the one or more detected advertisement-space occurrences, determining an individual gaze value for the first user-account; and (d) sending a gaze-value indication, wherein the gaze-value indication indicates the individual gaze value for the first user-account.
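Steps (b) and (c) reduce to counting ad-space occurrences in the gaze data and pricing them. A sketch under a flat per-occurrence pricing model (the claim leaves the valuation formula open, so the rate and all names here are illustrative assumptions):

```python
def individual_gaze_value(gaze_events, ad_spaces, value_per_occurrence):
    """Count gaze events in which the wearer-view contains a known
    advertisement space, and value them at a flat per-occurrence rate.

    gaze_events          : iterable of detected region identifiers
    ad_spaces            : set of known advertisement-space identifiers
    value_per_occurrence : placeholder pricing model
    """
    occurrences = sum(1 for event in gaze_events if event in ad_spaces)
    return occurrences * value_per_occurrence
```

The returned value is what step (d) would send back as the gaze-value indication for the first user-account.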