Patent classifications
H04N2213/008
System and method for presenting image content on multiple depth planes by providing multiple intra-pupil parallax views
An augmented reality display system is configured to direct a plurality of parallactically-disparate intra-pupil images into a viewer's eye. The parallactically-disparate intra-pupil images provide different parallax views of a virtual object, and impinge on the pupil from different angles. In the aggregate, the wavefronts of light forming the images approximate a continuous divergent wavefront and provide selectable accommodation cues for the user, depending on the amount of parallax disparity between the intra-pupil images. The amount of parallax disparity is selected using a light source that outputs light for different images from different locations, with spatial differences in the locations of the light output providing differences in the paths that the light takes to the eye, which in turn provide different amounts of parallax disparity. Advantageously, the wavefront divergence, and the accommodation cue provided to the eye of the user, may be varied by appropriate selection of parallax disparity, which may be set by selecting the amount of spatial separation between the locations of light output.
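The relationship the abstract describes — spatial separation between light-output locations setting the parallax disparity, which in turn sets the approximated wavefront divergence and accommodation cue — can be sketched numerically. The following is a small-angle, thin-lens illustration only; the function name, the collimator model, and all numbers are assumptions, not taken from the patent:

```python
import math

def accommodation_cue(source_separation_mm: float,
                      collimator_focal_mm: float,
                      pupil_diameter_mm: float = 4.0) -> float:
    """Estimate the virtual-object distance (metres) cued by two
    intra-pupil images whose light originates from source locations
    separated by `source_separation_mm` behind a collimating lens.

    Small-angle sketch: the two beams enter the pupil with angular
    disparity theta ~= separation / focal_length; a divergent
    wavefront from distance D subtends ~pupil/D across the pupil,
    so matching disparities gives D ~= pupil / theta.
    """
    theta = source_separation_mm / collimator_focal_mm  # radians
    if theta == 0:
        # Zero separation: collimated light, cue at optical infinity.
        return math.inf
    return (pupil_diameter_mm / theta) / 1000.0  # mm -> m
```

Under this toy model, a 0.1 mm source separation behind a 20 mm collimator yields a 0.005 rad disparity and an accommodation cue at roughly 0.8 m, while zero separation cues optical infinity — consistent with the abstract's claim that the cue is selectable via spatial separation alone.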
Ergonomic protective eyewear
Using two or more cameras attached to the eyewear, three-dimensional views with accurate and natural depth perception of the working area can be displayed for users, so that the user can maintain healthy sitting or standing posture while working on patients or objects located below horizontal eye level. Additional functions including eye protection, zoom-in, zoom-out, on-off, lighting control, overlapping, and teleconference capabilities are also supported using electronic, video and audio devices attached to the eyewear. The eyewear can also comprise a face shield designed to protect the user from hazardous droplets, aerosols, harmful wavelengths of light, heat, sparks, flash burn, debris and/or flying objects.
NEAR-EYE DISPLAY AND METHOD FOR ADJUSTING BRIGHTNESS OF A NEAR-EYE DISPLAY
The present disclosure relates to a near-eye display and a method for adjusting the brightness of a near-eye display. The near-eye display includes a first display screen for displaying one of a left eye image and a right eye image, a second display screen for displaying the other of the left eye image and the right eye image, a detector configured to detect a first detected value indicating brightness of the first display screen and a second detected value indicating brightness of the second display screen, and a controller configured to adjust brightness of at least one of the first display screen and the second display screen based on the first detected value and the second detected value.
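One plausible controller policy for the two detected values is to nudge each screen toward their mean so the eyes perceive matched luminance. This is a hypothetical sketch — the patent does not specify the control law, and the gain, clamping range, and function names are assumptions:

```python
def balance_brightness(left_detected: float, right_detected: float,
                       left_setting: float, right_setting: float,
                       gain: float = 0.5) -> tuple:
    """Adjust each screen's brightness setting (0..1) toward the mean
    of the two detected brightness values, moving only a fraction
    `gain` of the error per update to avoid visible oscillation."""
    target = (left_detected + right_detected) / 2.0
    new_left = left_setting + gain * (target - left_detected)
    new_right = right_setting + gain * (target - right_detected)
    clamp = lambda v: max(0.0, min(1.0, v))
    return clamp(new_left), clamp(new_right)
```

With detected values 0.6 and 0.4, one update moves the settings to roughly 0.55 and 0.45; repeated updates converge both screens to matched brightness.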
Surgical suite integration and optimization
Systems, methods, and computer-readable media for integrating and optimizing a surgical suite. An ophthalmic suite can include a surgical console, a heads-up display communicatively coupled with a surgical camera for capturing a three-dimensional image of an eye, and a surgical suite optimization engine. The surgical suite optimization engine can perform a wide variety of actions in response to action codes received from the other components in the surgical suite. The surgical suite optimization engine can be integrated within another component of the surgical suite, can be a stand-alone module, or can be a cloud-based tool.
METHOD FOR THE USER-SPECIFIC CALIBRATION OF A DISPLAY APPARATUS, WEARABLE ON THE HEAD OF A USER, FOR AN AUGMENTED PRESENTATION
A method is provided for the user-specific calibration of a display apparatus, wearable on the head of a user, for an augmented presentation, wherein, in a basic calibration step A), the position of at least one eye relative to the display apparatus is ascertained in all three spatial directions while the display apparatus is worn on the head of the user, and said position is saved in the display apparatus as adjustment data in such a way that an image generating module of the display apparatus generates the image taking account of the adjustment data.
EYEWEAR AND CONTROL METHOD THEREOF
The present disclosure relates to an eyewear and a control method thereof. The eyewear includes an optic, a camera device and a frame; wherein the optic and the camera device are respectively located on the frame; and the optic is a transparent display screen. The control method of the eyewear includes: detecting a light intensity of ambient light; obtaining a current light intensity of transmitted light generated after the ambient light passes through the optic, based on the light intensity of the ambient light and current light transmittance of the eyewear; adjusting the light transmittance of the optic; starting the camera device to acquire an environment image; performing enhancement processing on the environment image to obtain an enhanced environment image; and controlling the optic to display according to the enhanced environment image.
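The transmitted-light computation described here is a simple product of ambient intensity and transmittance, and the adjustment step can be sketched as solving for the transmittance that holds the light reaching the eye near a target level. The target value, clamping limits, and function name below are illustrative assumptions, not from the disclosure:

```python
def adjust_transmittance(ambient_lux: float,
                         current_transmittance: float,
                         target_lux: float) -> tuple:
    """Return (current transmitted intensity, new transmittance).

    Transmitted light is ambient intensity times the optic's
    transmittance; the new transmittance is chosen so that
    ambient * transmittance ~= target, clamped to the optic's
    physical range (assumed 0.05..1.0 here).
    """
    transmitted = ambient_lux * current_transmittance
    if ambient_lux <= 0:
        # Dark scene: make the optic fully transparent.
        return transmitted, 1.0
    desired = target_lux / ambient_lux
    return transmitted, max(0.05, min(1.0, desired))
```

For example, in 10,000 lux ambient light with a 0.5 transmittance, 5,000 lux reaches the eye; holding a 2,000 lux target calls for dimming the optic to 0.2 transmittance.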
HEAD-MOUNTED DISPLAY
A predetermined convergence angle is maintained, and strength is improved. Accordingly, there are provided an image generator that generates an image, a pair of optical plates to be arranged in front of both eyes of an observer who observes the image, the pair of optical plates causing the image to be displayed, a first frame that holds the pair of optical plates, and a second frame including a part to be fixed, wherein a fixing part to which that part is fixed is provided at the central part of the first frame in the horizontal direction.
Wearable apparatus, anti-peeping display system and anti-peeping display method
A wearable apparatus, an anti-peeping display system and an anti-peeping display method are disclosed. The anti-peeping display system includes a display apparatus and a wearable apparatus, the display apparatus is configured to display infrared images, the wearable apparatus acquires the infrared images and converts the infrared images to original images, and the original images are visible images.
Real-time aliasing rendering method for 3D VR video and virtual three-dimensional scene
Provided is a real-time aliasing rendering method for a 3D VR video and a virtual three-dimensional scene, including: capturing 3D camera video signals in real time and processing them to generate texture data; creating a virtual three-dimensional scene according to the proportions of the real scene; generating virtual camera rendering parameters according to the physical position and shooting angle of the 3D camera; aliasing (superimposing) the texture data onto a virtual three-dimensional object in the virtual scene, and adjusting the position of the virtual three-dimensional object according to the physical positional relationship between the virtual three-dimensional scene and the real scene, so as to form a complete combined virtual-reality three-dimensional scene; and rendering the combined scene by using the virtual camera rendering parameters to obtain a simulated rendering picture.
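The step of generating virtual camera rendering parameters from the physical camera's position and shooting angle can be illustrated as deriving a viewpoint (position, forward vector, field of view) for the virtual renderer. This is a generic pinhole/yaw-pitch sketch under assumed conventions (y-up, yaw about y, pitch about x); the actual patent does not specify this parameterization:

```python
import math

def virtual_camera_params(cam_pos: tuple, yaw_deg: float,
                          pitch_deg: float, fov_deg: float = 60.0) -> dict:
    """Build virtual-camera rendering parameters matching the physical
    3D camera's pose, so the virtual scene is rendered from the same
    viewpoint as the real footage it is combined with."""
    yaw, pitch = math.radians(yaw_deg), math.radians(pitch_deg)
    # Forward direction from yaw (heading) and pitch (elevation).
    forward = (math.cos(pitch) * math.sin(yaw),
               math.sin(pitch),
               math.cos(pitch) * math.cos(yaw))
    return {"position": cam_pos, "forward": forward, "fov": fov_deg}
```

A renderer would then place its virtual camera at `position`, aim it along `forward`, and use `fov` for the projection, keeping the rendered virtual objects registered with the live texture data.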
OPEN VIEW, MULTI-MODAL, CALIBRATED DIGITAL LOUPE WITH DEPTH SENSING
A digital loupe system is provided which can include a number of features. In one embodiment, the digital loupe system can include a stereo camera pair and a distance sensor. The system can further include a processor configured to perform a transformation to image signals from the stereo camera pair based on a distance measurement from the distance sensor and from camera calibration information. In some examples, the system can use the depth information and the calibration information to correct for parallax between the cameras to provide a multi-channel image. Ergonomic head mounting systems are also provided. In some implementations, the head mounting systems can be configurable to support the weight of a digital loupe system, including placing one or two oculars in a line of sight with an eye of a user, while improving overall ergonomics, including peripheral vision, comfort, stability, and adjustability. Methods of use are also provided.
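The parallax correction described — using the distance measurement and camera calibration to align the two channels — follows directly from stereo geometry: for a pinhole stereo pair, the pixel disparity of a point at the working distance is baseline × focal length ÷ distance, and shifting each channel by half of that re-centres both views on the same point. A minimal sketch, with illustrative parameter names and values not taken from the patent:

```python
def parallax_shift_px(baseline_mm: float, focal_px: float,
                      distance_mm: float) -> float:
    """Pixel disparity between a stereo camera pair for a point at the
    measured working distance (pinhole model):
        disparity = baseline * focal_length / distance.
    Shifting each channel's image by half this disparity toward the
    centre aligns the two views on the working plane."""
    if distance_mm <= 0:
        raise ValueError("distance must be positive")
    return baseline_mm * focal_px / distance_mm
```

For example, a 60 mm baseline, a 1000 px focal length, and a 300 mm working distance give a 200 px disparity, so each channel would be shifted by 100 px; as the distance sensor reports a new working distance, the shift is recomputed per frame.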