Patent classifications
G02B2027/0198
REPROJECTION AND WOBULATION AT HEAD-MOUNTED DISPLAY DEVICE
A head-mounted display device including one or more position sensors and a processor. The processor may receive a rendered image of a current frame. The processor may receive position data from the one or more position sensors and determine an updated device pose based on the position data. The processor may apply a first spatial correction to color information in pixels of the rendered image at least in part by reprojecting the rendered image based on the updated device pose. The head-mounted display device may further include a display configured to apply a second spatial correction to the color information in the pixels of the rendered image at least in part by applying wobulation to the reprojected rendered image to thereby generate a sequence of wobulated pixel subframes for the current frame. The display may display the current frame by displaying the sequence of wobulated pixel subframes.
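The two-stage correction in this abstract can be illustrated with a minimal Python sketch, assuming integer pixel shifts stand in for full pose-based homographic reprojection; the function names and the four-offset wobulation pattern are illustrative choices, not taken from the patent:

```python
import numpy as np

def reproject(image, dx, dy):
    # First spatial correction: shift the rendered image by a pixel
    # offset derived from the updated device pose (a simplified stand-in
    # for a full late-stage homographic reprojection).
    return np.roll(np.roll(image, dy, axis=0), dx, axis=1)

def wobulate(image, offsets=((0, 0), (1, 0), (0, 1), (1, 1))):
    # Second spatial correction: emit a sequence of subframes, each
    # shifted by a small offset, which the display flashes in rapid
    # succession within a single frame period.
    return [np.roll(np.roll(image, oy, axis=0), ox, axis=1)
            for ox, oy in offsets]

frame = np.arange(16, dtype=float).reshape(4, 4)   # toy rendered image
corrected = reproject(frame, dx=1, dy=0)           # pose-based reprojection
subframes = wobulate(corrected)                    # wobulated pixel subframes
```

Real wobulation uses sub-pixel optical offsets rather than whole-pixel rolls; the sketch only shows the frame-to-subframe data flow.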
ADVANCED HEAD DISPLAY UNIT FOR FIRE FIGHTERS
In this patent, an advanced head display unit designed primarily for fire fighters is disclosed. The advanced augmented reality/virtual reality (AR/VR) head display unit improves coordination between teammates through eye tracking coupled with augmented reality features. This allows one fire fighter to know where another fire fighter is looking and helps coordinate tasks by dividing the scene into sectors and visibly marking each sector. Further, the system helps determine where to aim the hose stream with a smart target system. Further, multiple sensors are utilized together to triangulate the location of a victim's voice. Additional advantages are also disclosed herein.
METHOD AND DEVICE FOR CONTROLLING THE POSITIONING OF A MOUNTED INFORMATION DISPLAY DEVICE
A method, implemented in a mounted information display device incorporating a main sensor and an inertial sensor, determines positioning by a hybrid inertial method that combines a calculated position, obtained by a main method from data acquired by the main sensor, with a succession of estimated positions derived from the calculated position and data acquired by the inertial sensor. The method includes: obtaining, at a first calculation time instant T1, a first estimated position of the device at a reference time instant, calculated by the hybrid inertial method; obtaining, at a second calculation time instant T2, a second estimated position of the device at the same reference time instant, calculated by the main method; comparing the difference between the first and second positions with a tolerance threshold; and, if the difference is less than the threshold, validating the positioning calculation by the hybrid inertial method, otherwise raising an alert.
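The validation step reduces to comparing two position estimates for the same reference instant against a tolerance. A minimal sketch, with illustrative function and variable names of my own choosing and Euclidean distance assumed as the comparison metric:

```python
def validate_positioning(pos_hybrid, pos_main, tolerance):
    # Compare the T1 hybrid-inertial estimate with the T2 main-method
    # estimate, both referred to the same reference time instant.
    # Returns True to validate the hybrid result, False to raise an alert.
    diff = sum((a - b) ** 2 for a, b in zip(pos_hybrid, pos_main)) ** 0.5
    return diff < tolerance

# Estimates agree to within the 5 cm tolerance: hybrid positioning validated.
ok = validate_positioning((1.00, 2.00, 0.50), (1.02, 1.99, 0.51), 0.05)
# Estimates diverge by 40 cm: an alert would be raised instead.
alert = not validate_positioning((1.00, 2.00, 0.50), (1.40, 2.00, 0.50), 0.05)
```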
Optical hybrid reality system having digital correction of aberrations
A virtual/augmented reality head-mounted display system and method with means to correct optical aberrations is disclosed. The system includes initial profiling of the head-mounted display system and eye-tracking means.
Pupil steering: flexure guidance systems
A flexure guidance system may be provided for controlling movement of an optical subassembly and/or a connected combiner lens. For instance, the flexure guidance system may include a distal end piece, a proximal end piece, and multiple wire flexures that link the distal end piece to the proximal end piece. The linking wire flexures may be spaced to form an interior cavity between the distal end piece and the proximal end piece. This interior cavity may house various electronic components. One or more actuators in the system may move the electronic components, according to input signals, along different axes of movement provided by the wire flexures. Various other methods, systems, and computer-readable media are also disclosed.
Predictive dimming of optical passthrough displays
In one implementation, a method of controlling a dimming level of a dimmable optical element based on a predicted ambient light level is performed at a device including one or more processors, non-transitory memory, and a dimmable optical element. The method includes predicting a change, in a first direction, in an ambient light level at a future time. The method includes changing, at a first time in advance of the future time and in the first direction, a transmission coefficient of the dimmable optical element based on the predicted change in the ambient light level. The method includes changing, at a second time after the first time and in a second direction opposite the first direction, the transmission coefficient of the dimmable optical element based on the ambient light level at the second time.
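The two timed adjustments can be sketched in Python under simplifying assumptions: a linear transmission-versus-lux model and a step change in ambient light, both illustrative rather than anything the abstract specifies:

```python
def schedule_dimming(t_future, predicted_delta, lead_time,
                     coeff_for_level, ambient_at):
    # Two-step predictive dimming: pre-adjust the transmission
    # coefficient before the predicted ambient change, then correct it
    # afterwards from the ambient level actually measured.
    t1 = t_future - lead_time                      # first time, in advance
    pre = coeff_for_level(ambient_at(t1) + predicted_delta)
    t2 = t_future + lead_time                      # second, later time
    correction = coeff_for_level(ambient_at(t2))   # measured, not predicted
    return [(t1, pre), (t2, correction)]

# Illustrative models (not from the patent): transmission falls as light rises,
# and ambient light steps up at t = 10, but by less than predicted.
coeff = lambda lux: max(0.1, 1.0 - lux / 1000.0)
ambient = lambda t: 200.0 if t < 10 else 500.0

plan = schedule_dimming(t_future=10, predicted_delta=600.0, lead_time=2,
                        coeff_for_level=coeff, ambient_at=ambient)
```

In this run the element dims ahead of the predicted brightening (coefficient drops to about 0.2 at t = 8), then, because the measured level is lower than predicted, the second change moves in the opposite direction (back up to 0.5 at t = 12), matching the claimed sequence.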
Systems and methods for facilitating the identifying of correspondences between images experiencing motion blur
A system for facilitating the identifying of correspondences between images experiencing motion blur obtains a reference frame captured by a reference camera at a reference camera timepoint and obtains a match frame captured by a match camera at a match camera timepoint. The system identifies a motion attribute that includes (1) a reference camera motion attribute associated with the reference camera at the reference camera timepoint, and/or (2) a match camera motion attribute associated with the match camera at the match camera timepoint. The system determines a downsampling resolution using as input at least one of: the motion attribute, a camera exposure time, a camera field of view, or a camera angular resolution. The system generates a downsampled reference frame and a downsampled match frame based on the downsampling resolution. The system identifies correspondences between the downsampled reference frame and the downsampled match frame.
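One plausible reading of how those inputs combine is to estimate the expected blur in pixels and downsample until it shrinks to roughly one pixel. A hedged sketch; the formula, naive decimation, and all names are assumptions for illustration, not the patent's method:

```python
def downsample_factor(angular_velocity_dps, exposure_s, fov_deg, width_px):
    # Expected blur in pixels = angular speed x exposure x pixels per degree.
    # Downsample so the blur collapses to about one pixel, keeping
    # features matchable between the reference and match frames.
    px_per_deg = width_px / fov_deg                # camera angular resolution
    blur_px = abs(angular_velocity_dps) * exposure_s * px_per_deg
    return max(1, round(blur_px))

def downsample(frame, factor):
    # Naive decimation as a stand-in for a proper area-averaging resize.
    return [row[::factor] for row in frame[::factor]]

# A 90 deg/s pan with a 20 ms exposure on a 90-degree, 1000-px sensor
# smears ~20 px of blur, so both frames would be downsampled ~20x.
factor = downsample_factor(90.0, 0.020, 90.0, 1000)
small = downsample([[r * 10 + c for c in range(10)] for r in range(10)], 2)
```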
AUGMENTED OR VIRTUAL REALITY CALIBRATION AND ALIGNMENT SYSTEM AND METHOD
An augmented reality (AR) or virtual reality (VR) calibration method including the steps of: (a) providing a computing device for displaying a base image of a surrounding environment; (b) obtaining location coordinates of the computing device; (c) initiating an executable application program for processing location data and generating an overlay image over the base image; (d) generating a virtual asset container and at least one digital object corresponding to the computing device; (e) determining a first location of the computing device at a local position within the asset container; (f) moving the computing device to a second location that is a determined distance in a direction from the first location; and (g) calculating a local rotation angle relative to a positive axis of the asset container and a rotation angle relative to a positive axis of a real-world coordinate system to determine an orientation difference.
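The final angle-comparison step can be illustrated with a short Python sketch, assuming 2D positions and headings measured against each frame's positive x-axis; the function name and the choice of `atan2` are illustrative assumptions:

```python
import math

def orientation_difference(p1_local, p2_local, p1_world, p2_world):
    # Heading of the first-to-second-location displacement, measured
    # against the positive x-axis of the asset container's local frame
    # and of the real-world frame; their difference aligns the two systems.
    local = math.atan2(p2_local[1] - p1_local[1], p2_local[0] - p1_local[0])
    world = math.atan2(p2_world[1] - p1_world[1], p2_world[0] - p1_world[0])
    return math.degrees(world - local) % 360.0

# Moving 1 m along the container's +x axis registers as 1 m along the
# world's +y axis: the overlay must be rotated 90 degrees to line up.
diff = orientation_difference((0, 0), (1, 0), (0, 0), (0, 1))
```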
Smartphone-assisted portable augmented reality (AR) device and clip-on unit for adjustable attachment to a user's spectacles
A detachable spectacles-mounted augmented reality (AR) device and clip-on unit wherein the device has a housing (31) configured for detachably supporting the clip-on unit, an exit window (30) and an entrance window (30′) in the housing through which the user observes a scene, a communications interface (71, 74) for coupling to a hand-held device, and a camera (37) inside the housing for imaging the scene observed by the user through a camera window (36) and configured to convey an image of the scene to the hand-held device. A line-of-sight guide unit (39) displays at least one marker at the user's field of view for directing a line of sight of the user toward a designated feature in the scene, and optics (40) within the housing projects the marker at a distance for superimposing on to the scene viewed by the user.
STEREO ALIGNMENT ASSESSMENT FOR HEAD-MOUNTED DISPLAY
A head-mounted display system includes a left display assembly configured to provide left-side display light and left-side test light. A left waveguide incouples the left-side display light and outcouples the left-side display light for viewing. A left optical sensor is positioned to measure the left-side test light. A left inertial measurement unit (IMU) is configured to measure an orientation of the left display assembly. A right display assembly is configured to provide right-side display light and right-side test light. A right waveguide incouples the right-side display light and outcouples the right-side display light for viewing. A right optical sensor is positioned to measure the right-side test light. A right IMU is configured to measure an orientation of the right display assembly. A logic machine is configured to assess a stereo alignment for the left- and right-side display light.