H04N13/327

Vanishing point stereoscopic image correction
11570425 · 2023-01-31

Three-dimensional image calibration and presentation for stereoscopic imaging systems, such as eyewear including a first camera and a second camera, are described. The calibration and presentation include obtaining a calibration offset using vanishing points obtained from images captured by the first camera and the second camera to accommodate rotation of the first and second cameras with respect to one another, adjusting a three-dimensional rendering offset by the obtained calibration offset, and presenting stereoscopic images using the three-dimensional rendering offset.
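
The calibration step described above can be sketched minimally. This is an illustrative assumption, not the patent's actual method: it supposes the shared vanishing point of parallel scene lines has already been detected in each image (as coordinates relative to each principal point), models the cameras' relative rotation as a yaw difference under a pinhole model, and folds that angle back into the pixel rendering offset.

```python
import math

def calibration_offset(vp_left, vp_right, focal_px):
    """Angular offset (radians) between the two cameras, estimated from
    the horizontal shift of the same vanishing point in each image.
    vp_left / vp_right: (x, y) pixel coordinates of the vanishing point,
    measured relative to each image's principal point."""
    yaw_left = math.atan2(vp_left[0], focal_px)
    yaw_right = math.atan2(vp_right[0], focal_px)
    return yaw_right - yaw_left

def adjusted_rendering_offset(base_offset_px, vp_left, vp_right, focal_px):
    """Adjust the 3D rendering offset (pixels) by the calibration offset,
    converting the angle back to pixels via the pinhole projection."""
    delta = calibration_offset(vp_left, vp_right, focal_px)
    return base_offset_px - focal_px * math.tan(delta)

# Cameras rotated with respect to one another: the vanishing point of
# parallel scene lines lands at different positions in each image.
offset = adjusted_rendering_offset(12.0, (40.0, -5.0), (55.0, -4.0), 1000.0)
```

With identical vanishing points in both images the offset is unchanged; any rotation between the cameras shifts it.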

Head-up display device
11704779 · 2023-07-18

A head-up display device includes: a display unit that produces a display light for the left eye by modulating a first illumination light derived from superimposing first illumination light beams output from a first region of a fly-eye lens, and produces a display light for the right eye by modulating a second illumination light derived from superimposing second illumination light beams output from a second region of the fly-eye lens; an image processing unit that produces an image for the left eye and an image for the right eye; and a display control unit that causes the display unit to display the image for the left eye when the first illumination light is produced and causes the display unit to display the image for the right eye when the second illumination light is produced.
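
The display control unit's time-multiplexing can be sketched as below. All names are illustrative assumptions, not the patent's implementation: the active illumination region of the fly-eye lens selects which eye's image the display unit shows.

```python
from dataclasses import dataclass

@dataclass
class Frame:
    left_image: str   # image produced for the left eye
    right_image: str  # image produced for the right eye

def select_display_image(frame, active_region):
    """Display-control sketch: show the left-eye image while the first
    illumination region of the fly-eye lens is producing light, and the
    right-eye image while the second region is producing light."""
    if active_region == "first":
        return frame.left_image
    if active_region == "second":
        return frame.right_image
    raise ValueError("unknown illumination region")

frame = Frame(left_image="L0", right_image="R0")
# Alternating illumination regions produce an alternating left/right sequence.
sequence = [select_display_image(frame, r) for r in ("first", "second", "first")]
```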

MULTI-VIEWPOINT 3D DISPLAY APPARATUS, DISPLAY METHOD AND DISPLAY SCREEN CORRECTION METHOD
20220417491 · 2022-12-29

A method for multi-viewpoint 3D display screen correction is provided, comprising: determining a correction area in a multi-viewpoint 3D display screen, and detecting a reference correlation between subpixels in composite subpixels in composite pixels in the multi-viewpoint 3D display screen and viewpoints of the multi-viewpoint 3D display screen; and determining a correction correlation between the subpixels in the correction area and the viewpoints based on the reference correlation. The method reduces the difficulty of correcting the actual correlation between pixels and viewpoints. A multi-viewpoint 3D display method, a multi-viewpoint 3D display apparatus, a computer-readable storage medium, and a computer program product are also provided.
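
The idea of deriving a correction correlation from a small measured reference can be sketched as follows. This is a simplified assumption, not the patent's actual procedure: it supposes the subpixel-to-viewpoint mapping repeats with a fixed period, so measuring one subpixel's viewpoint fixes the phase for the whole correction area.

```python
def correction_correlation(reference, correction_indices, num_viewpoints):
    """Given a detected reference correlation {subpixel_index: viewpoint}
    for a measured subpixel, infer viewpoints for the subpixels in the
    correction area, assuming the mapping is periodic in the subpixel
    index with period num_viewpoints (a simplifying assumption)."""
    # Estimate the phase of the periodic mapping from one measurement.
    idx, vp = next(iter(reference.items()))
    phase = (vp - idx) % num_viewpoints
    # Extend the mapping across the correction area.
    return {i: (i + phase) % num_viewpoints for i in correction_indices}

ref = {0: 2}  # measured: subpixel 0 maps to viewpoint 2
corr = correction_correlation(ref, [1, 2, 3], num_viewpoints=6)
```

One measurement thus replaces per-subpixel detection over the whole correction area, which is the sense in which correction difficulty is reduced.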

VIRTUAL REALITY SURGICAL CAMERA SYSTEM

A system includes a console assembly, a trocar assembly operably coupled to the console assembly, a camera assembly operably coupled to the console assembly and having a stereoscopic camera assembly, and at least one rotational position sensor configured to detect rotation of the stereoscopic camera assembly about at least one of a pitch axis or a yaw axis. The console assembly includes a first actuator and a first actuator pulley operably coupled to the first actuator. The trocar assembly includes a trocar having an inner and an outer diameter, and a seal sub-assembly comprising at least one seal, the seal sub-assembly operably coupled to the trocar. The camera assembly includes a camera support tube having a distal end and a proximal end, the stereoscopic camera assembly operably coupled to the distal end of the support tube and having first and second camera modules with first and second optical axes.

AUTOCALIBRATED NEAR-EYE DISPLAY

A near-eye display device comprises right and left display projectors, expansion optics, and inertial measurement units (IMUs), in addition to a plurality of angle-sensitive pixel (ASP) elements and a computer. The right and left expansion optics are configured to receive respective display images from the right and left display projectors and to release expanded forms of the display images. The right IMU is fixedly coupled to the right display projector, and the left IMU is fixedly coupled to the left display projector. Each ASP element is responsive to an angle of light of one of the respective display images as received into the right or left expansion optic. The computer is configured to receive output from the right IMU, the left IMU and the plurality of ASP elements, and render display data for the right and left display projectors based in part on the output.
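
The rendering loop described above can be sketched as a small function. The names and the symmetric drift split are illustrative assumptions, not the patent's algorithm: the IMUs report each projector's orientation, the angle-sensitive pixels report the angle at which display light actually enters the expansion optic, and the computer derives a per-eye angular correction for the rendered display data.

```python
def render_corrections(imu_right_deg, imu_left_deg, asp_angles_deg):
    """Autocalibration sketch: combine IMU orientation output with
    angle-sensitive-pixel (ASP) measurements into per-eye angular
    corrections (degrees) applied when rendering display data."""
    # Relative drift between the rigidly mounted projectors.
    relative_drift = imu_right_deg - imu_left_deg
    # Average the ASP-measured entry angle of the display light.
    measured_angle = sum(asp_angles_deg) / len(asp_angles_deg)
    # Split the drift symmetrically and fold in the ASP measurement.
    right_corr = -relative_drift / 2 - measured_angle
    left_corr = relative_drift / 2 - measured_angle
    return right_corr, left_corr

right, left = render_corrections(1.0, 0.4, [0.2, 0.0, 0.1])
```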

EYEWEAR SURFACE TEMPERATURE EVALUATION

A method and apparatus for monitoring a surface temperature of eyewear proximate a processor are described, to understand the surface temperature as a function of computer instructions, such as when the computer instructions are modified during software design. A sensor is coupled to the eyewear proximate the processor, such as at a temple of the eyewear that includes the processor, using one or more layers of tape. A server provides instructions to the processor for execution, such as instructions of an application, which vary the utilization of the processor. A testing device, such as a digital multimeter, is coupled to the sensor and to the server, and displays the surface temperature as a function of processor utilization. The surface temperature of the eyewear is monitored to ensure it does not exceed a temperature threshold.
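
The threshold check at the end can be sketched as below. The function name and the reading list are illustrative assumptions: the readings stand in for sensor samples taken while the server varies processor utilization with different instruction workloads.

```python
def monitor_surface_temperature(readings_c, threshold_c):
    """Return (peak_temperature, within_limit) for a series of surface
    temperature readings in degrees Celsius, flagging whether the peak
    stayed at or below the allowed threshold."""
    peak = max(readings_c)
    return peak, peak <= threshold_c

# Readings taken at increasing processor utilization.
peak, ok = monitor_surface_temperature([31.5, 36.2, 41.8], threshold_c=43.0)
```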

Display apparatus and method
11506897 · 2022-11-22

An apparatus including a first display for a user's first eye; a first motion sensor and/or a first externally facing image capture device configured to capture at least a first image of the user's real-world point of view; a second display for the user's second eye; a second motion sensor and/or a second externally facing image capture device configured to capture at least a second image of the user's real-world scene; and a controller configured to: receive a first signal from the first motion sensor and/or the first image capture device, receive a second signal from the second motion sensor and/or the second image capture device, determine, based on the first and second signals, a change in orientation of the first display with respect to the second display, and control display of first content on the first display in dependence on the determined change in orientation.
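
The controller's determination step can be sketched minimally. The 1-D angle model and all names are illustrative assumptions, not the patent's method: each motion sensor's reading is compared against its reference to get per-display drift, and the difference gives the change in orientation of the first display relative to the second, which the first display's content then counter-rotates against.

```python
def relative_orientation_change(first_deg, second_deg,
                                ref_first_deg, ref_second_deg):
    """Change in orientation (degrees) of the first display with respect
    to the second, from two motion-sensor readings and their reference
    values captured when the displays were last aligned."""
    return (first_deg - ref_first_deg) - (second_deg - ref_second_deg)

def rotate_first_content(change_deg):
    """Counter-rotate the first display's content so it stays aligned
    with the second display after the device frame flexes."""
    return -change_deg

change = relative_orientation_change(2.0, 0.5, 0.0, 0.0)
correction = rotate_first_content(change)
```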