Patent classification: G02B27/0093
Visual-inertial tracking using rolling shutter cameras
Visual-inertial tracking of an eyewear device using one or more rolling shutter cameras. The eyewear device includes a position-determining system. Visual-inertial tracking is implemented by sensing motion of the eyewear device. An initial pose is obtained for a rolling shutter camera, and an image of an environment is captured. The image includes feature points, each captured at a particular capture time. A number of poses for the rolling shutter camera are computed based on the initial pose and the sensed movement of the device, with the number of computed poses responsive to the sensed movement. A computed pose is selected for each feature point in the image by matching the capture time of the feature point to the time associated with the computed pose. The position of the eyewear device within the environment is then determined using the feature points and the selected computed poses.
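As a rough illustration of the pose-selection step, the sketch below matches each feature point's row-dependent capture time to the nearest computed pose. The row readout time, helper names, and nearest-timestamp matching are assumptions; the abstract does not specify an implementation.

```python
import numpy as np

# A rolling shutter camera exposes image rows sequentially, so each feature
# point's capture time depends on its row. Poses are assumed to have been
# propagated from an initial pose using IMU-sensed motion.

def feature_capture_time(row, frame_start, row_readout_s):
    """Capture time of a feature point, derived from its image row."""
    return frame_start + row * row_readout_s

def select_pose(feature_time, pose_times, poses):
    """Pick the computed pose whose timestamp is closest to the feature's
    capture time (the abstract matches capture time to computed-pose time)."""
    idx = int(np.argmin(np.abs(np.asarray(pose_times) - feature_time)))
    return poses[idx]

# Example: 5 poses computed across one 480-row frame's readout interval.
frame_start, row_readout_s = 0.0, 30e-6            # 30 us per row (assumed)
pose_times = np.linspace(0.0, 480 * row_readout_s, 5)
poses = [np.eye(4) for _ in pose_times]            # placeholder 4x4 poses
t = feature_capture_time(row=200, frame_start=frame_start,
                         row_readout_s=row_readout_s)
pose_for_feature = select_pose(t, pose_times, poses)
```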
Depth estimation using biometric data
A method of generating a depth estimate based on biometric data starts with a server receiving positioning data from a first device associated with a first user. The first device generates the positioning data based on analysis of a data stream comprising images of a second user who is associated with a second device. The server then receives biometric data of the second user from the second device. The biometric data is based on output from a sensor or a camera included in the second device. The server then determines a distance of the second user from the first device using the positioning data and the biometric data of the second user. Other embodiments are described herein.
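One plausible way biometric data could inform such a distance estimate is via the pinhole camera model, using a known interpupillary distance; this formula is an illustrative assumption, not taken from the abstract.

```python
# Illustrative sketch only: refine a depth estimate from a known
# interpupillary distance (a biometric quantity) and its apparent pixel span.

def distance_from_ipd(focal_length_px: float,
                      real_ipd_m: float,
                      pixel_ipd_px: float) -> float:
    """Distance to a face whose eyes span pixel_ipd_px pixels in the image,
    given the subject's actual interpupillary distance in meters."""
    return focal_length_px * real_ipd_m / pixel_ipd_px

# Example: 800 px focal length, 63 mm IPD, eyes 40 px apart in the image.
print(distance_from_ipd(800.0, 0.063, 40.0))  # ~1.26 m
```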
Augmented reality for vehicle operations
A method includes: saving in-flight data from an aircraft during a simulated training exercise, wherein the in-flight data includes geospatial locations of the aircraft, positional attitudes of the aircraft, and head positions of a pilot operating the aircraft; saving simulation data relating to a simulated virtual object presented to the pilot as augmented reality content in-flight, wherein the virtual object was programmed to interact with the aircraft during the simulated training exercise; and representing the in-flight data from the aircraft and the simulation data relating to the simulated virtual object as a replay of the simulated training exercise.
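A minimal sketch of the kind of record such a replay might aggregate, assuming hypothetical field names (the abstract only names the data categories):

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class ReplayFrame:
    timestamp: float
    aircraft_geo: Tuple[float, float, float]       # lat, lon, altitude
    aircraft_attitude: Tuple[float, float, float]  # pitch, roll, yaw
    pilot_head_pose: Tuple[float, float, float]    # head orientation
    virtual_object_state: dict                     # simulated object's state

@dataclass
class TrainingReplay:
    frames: List[ReplayFrame] = field(default_factory=list)

    def add(self, frame: ReplayFrame) -> None:
        self.frames.append(frame)

    def between(self, t0: float, t1: float) -> List[ReplayFrame]:
        """Frames for replaying a segment of the exercise."""
        return [f for f in self.frames if t0 <= f.timestamp <= t1]
```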
Color-sensitive virtual markings of objects
Disclosed are systems, methods, and non-transitory computer readable media for making virtual colored markings on objects. Instructions may include receiving an indication of an object; receiving from an image sensor an image of a hand of an individual holding a physical marking implement; detecting in the image a color associated with the marking implement; receiving from the image sensor image data indicative of movement of a tip of the marking implement and locations of the tip; determining from the image data when the locations of the tip correspond to locations on the object; and generating, in the detected color, virtual markings on the object at the corresponding locations.
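The sketch below illustrates two of the listed steps, detecting the implement's color and recording markings where the tip lands on the object, under assumed helper names and a simple patch-averaging color detector:

```python
import numpy as np

# Hedged sketch: sample the ink color near the implement's tip, then keep
# only the tip locations that correspond to locations on the object.

def detect_marker_color(image: np.ndarray, tip_xy: tuple) -> tuple:
    """Average the pixels in a small patch around the tip as the ink color."""
    x, y = tip_xy
    patch = image[max(y - 3, 0):y + 4, max(x - 3, 0):x + 4]
    return tuple(patch.reshape(-1, 3).mean(axis=0).astype(int))

def record_strokes(tip_positions, on_object, color):
    """Virtual markings: tip locations on the object, tagged with the color."""
    return [(xy, color) for xy, hit in zip(tip_positions, on_object) if hit]

# Example: a 480x640 RGB frame with a red implement tip at (205, 105).
frame = np.zeros((480, 640, 3), dtype=np.uint8)
frame[100:110, 200:210] = (220, 30, 30)
ink = detect_marker_color(frame, (205, 105))
strokes = record_strokes([(205, 105), (207, 108)], [True, False], ink)
```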
Imaging display device and wearable device
An imaging display device includes an imaging unit, a processing unit, a display unit, and a pupil detection unit. The imaging unit includes a plurality of photoelectric conversion elements and is configured to acquire first image information. The processing unit is configured to process a signal from the imaging unit and generate second image information. The display unit is configured to display an image that is based on the signal from the processing unit. The pupil detection unit is configured to detect vector information of a pupil. The processing unit generates the second image information by processing the first image information based on the vector information of the pupil.
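One illustrative reading of "processing based on the pupil's vector information" is a gaze-dependent crop of the first image; the mapping and crop size below are assumptions:

```python
import numpy as np

# Sketch: derive second image information from the first by cropping a
# region of interest where the pupil's vector indicates the user is looking.

def gaze_to_pixel(gaze_vec, width, height):
    """Map a normalized gaze vector (x, y in [-1, 1]) to image coordinates."""
    gx, gy = gaze_vec
    return int((gx + 1) / 2 * (width - 1)), int((gy + 1) / 2 * (height - 1))

def crop_around(image, center_xy, half=64):
    """Crop a (2*half)-pixel window around the gaze point, clamped to bounds."""
    x, y = center_xy
    h, w = image.shape[:2]
    return image[max(y - half, 0):min(y + half, h),
                 max(x - half, 0):min(x + half, w)]

first_image = np.zeros((480, 640, 3), dtype=np.uint8)  # from the imaging unit
center = gaze_to_pixel((0.25, -0.1), 640, 480)
second_image = crop_around(first_image, center)        # gaze-dependent output
```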
Systems and methods for providing mixed-reality experiences under low light conditions
Systems and methods for facilitating computer vision tasks (e.g., simultaneous localization and mapping) and pass-through imaging include a head-mounted display (HMD) that has a first set of one or more cameras configured for performing computer vision tasks and a second set of one or more cameras configured for capturing image data of an environment for projection to a user of the HMD. The first set of one or more cameras is configured to detect at least visible spectrum light and at least a particular band of wavelengths of infrared (IR) light. The second set of one or more cameras includes one or more detachable IR filters configured to attenuate IR light, including at least a portion of the particular band of wavelengths of IR light.
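A configuration-style sketch of the two camera sets, with field names and the IR band chosen for illustration only:

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass
class CameraSet:
    purpose: str                 # "computer_vision" or "passthrough" (assumed)
    visible_light: bool          # detects visible spectrum light
    ir_band_nm: Tuple[int, int]  # IR band the sensor can detect
    ir_filter_attached: bool     # detachable filter attenuating that band

cv_cameras = CameraSet("computer_vision", True, (840, 870), False)
passthrough = CameraSet("passthrough", True, (840, 870), True)

def low_light_mode(cam: CameraSet) -> CameraSet:
    """Detach the IR filter so a pass-through camera admits IR light."""
    cam.ir_filter_attached = False
    return cam
```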
Eyewear use detection
Eyewear including a support structure defining a region for receiving a head of a user. The support structure supports optical elements, electronic components, and a use detector. The use detector is coupled to the electronic components and is positioned to identify when the head of the user is within the region defined by the support structure. The electronic components monitor the use detector and transition from a first mode of operation to a second mode of operation when the use detector senses the head of the user in the region.
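The mode transition can be pictured as a small state machine; the mode names, the polling interface, and the reverse transition below are assumptions beyond the abstract:

```python
from enum import Enum

class Mode(Enum):
    LOW_POWER = 1    # first mode of operation (assumed semantics)
    ACTIVE = 2       # second mode of operation (assumed semantics)

class Eyewear:
    def __init__(self):
        self.mode = Mode.LOW_POWER

    def monitor(self, head_in_region: bool) -> Mode:
        """Transition modes based on the use detector's output. The
        ACTIVE -> LOW_POWER transition is an assumption for symmetry."""
        if head_in_region and self.mode is Mode.LOW_POWER:
            self.mode = Mode.ACTIVE
        elif not head_in_region and self.mode is Mode.ACTIVE:
            self.mode = Mode.LOW_POWER
        return self.mode

device = Eyewear()
device.monitor(head_in_region=True)   # -> Mode.ACTIVE
```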
Method and device for carrying out eye gaze mapping
The invention relates to a device and a method for performing an eye gaze mapping (M), in which at least one gaze point (B) and/or a viewing direction of at least one person (10) in relation to at least one scene recording (S) of a scene (12) viewed by the at least one person (10) is mapped onto a reference (R). At least a part of an algorithm (A1, A2, A3) for performing the eye gaze mapping (M) is selected from multiple predetermined algorithms (A1, A2, A3) as a function of at least one parameter (P), and the eye gaze mapping (M) is performed on the basis of the at least one part of the algorithm (A1, A2, A3).
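Parameter-dependent algorithm selection might look like the dispatch below; the parameter values and algorithm variants are invented for illustration:

```python
# Sketch: choose the mapping algorithm (A1, A2, A3) as a function of a
# parameter (P). Bodies are elided; only the selection logic is shown.
def map_fast(gaze_point, scene): ...
def map_accurate(gaze_point, scene): ...
def map_low_light(gaze_point, scene): ...

ALGORITHMS = {
    "fast": map_fast,           # e.g., a real-time constraint (assumed)
    "accurate": map_accurate,   # e.g., offline analysis (assumed)
    "low_light": map_low_light,
}

def select_algorithm(parameter: str):
    """Select the mapping algorithm as a function of a parameter, falling
    back to the accurate variant when the parameter is unknown."""
    return ALGORITHMS.get(parameter, map_accurate)

mapper = select_algorithm("fast")
```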
Method and device for refraction adjustment, and augmented reality apparatus
A method and device for refraction adjustment in an augmented reality apparatus, and an augmented reality apparatus. The method for refraction adjustment includes: receiving light rays reflected from the eyes of a user wearing the augmented reality apparatus; determining a pupil distance of the user according to the reflected light rays; and generating a refraction correction signal according to the pupil distance of the user and one or more desired diopters, for correcting the diopters of the user's eyes by means of a refraction adjustment element.
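A purely illustrative sketch of the two computed quantities, pupil distance from detected pupil centers and the resulting correction signal; the pixel-to-millimeter scale and signal format are assumptions:

```python
import math

def pupil_distance(left_px, right_px, mm_per_px: float) -> float:
    """Euclidean distance between pupil centers detected in the
    reflected-light image, converted to millimeters."""
    dx, dy = right_px[0] - left_px[0], right_px[1] - left_px[1]
    return math.hypot(dx, dy) * mm_per_px

def correction_signal(pd_mm: float, desired_diopter: float) -> dict:
    """Bundle the values the refraction adjustment element would act on
    (the signal's actual format is not given in the abstract)."""
    return {"pupil_distance_mm": pd_mm, "target_diopter": desired_diopter}

pd = pupil_distance((310, 242), (505, 240), mm_per_px=0.32)  # ~62.4 mm
signal = correction_signal(pd, desired_diopter=-1.5)
```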
Measurement method and system
Methods and systems for determining an individual gaze value are disclosed herein. An exemplary method involves: (a) receiving gaze data for a first wearable computing device, wherein the gaze data is indicative of a wearer-view associated with the first wearable computing device, and wherein the first wearable computing device is associated with a first user-account; (b) analyzing the gaze data from the first wearable computing device to detect one or more occurrences of one or more advertisement spaces in the gaze data; (c) based at least in part on the one or more detected advertisement-space occurrences, determining an individual gaze value for the first user-account; and (d) sending a gaze-value indication, wherein the gaze-value indication indicates the individual gaze value for the first user-account.
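Steps (b) and (c) might reduce to the dwell-time weighting sketched below; the per-second value and the gaze-data schema are assumptions, since the abstract does not define how detected occurrences translate into a gaze value:

```python
from typing import List, Dict

def detect_ad_occurrences(gaze_data: List[Dict]) -> List[Dict]:
    """Keep gaze samples in which an advertisement space was visible."""
    return [s for s in gaze_data if s.get("ad_space_visible")]

def individual_gaze_value(occurrences: List[Dict],
                          value_per_second: float = 0.01) -> float:
    """Value a user-account's gaze by total dwell time on ad spaces
    (linear dwell-time weighting is an assumed model)."""
    return sum(o.get("dwell_s", 0.0) for o in occurrences) * value_per_second

gaze_data = [
    {"ad_space_visible": True, "dwell_s": 2.5},
    {"ad_space_visible": False, "dwell_s": 1.0},
    {"ad_space_visible": True, "dwell_s": 0.8},
]
value = individual_gaze_value(detect_ad_occurrences(gaze_data))  # 0.033
```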