H04N13/324

WEARABLE ELECTRONIC DEVICE AND METHOD OF OUTPUTTING THREE-DIMENSIONAL IMAGE
20220360764 · 2022-11-10

A wearable electronic device includes a left-eye display configured to output light of a first color corresponding to a 3D left-eye image, a right-eye display configured to output light of a second color corresponding to a 3D right-eye image, a left-eye optical waveguide configured to adjust a path of the light of the first color and output the light of the first color, a right-eye optical waveguide configured to adjust a path of the light of the second color and output the light of the second color, a left-eye display control circuit configured to supply a driving power and a control signal to the left-eye display, a right-eye display control circuit configured to supply a driving power and a control signal to the right-eye display, a communication module configured to communicate with a mobile electronic device, and a second control circuit configured to supply a driving power and a control signal to the communication module.

Stereo viewing

The invention relates to creating and viewing stereo images, for example stereo video images, also called 3D video. At least three camera sources with overlapping fields of view are used to capture a scene, so that an area of the scene is covered by at least three cameras. At the viewer, a camera pair is chosen from the multiple cameras to create a stereo camera pair that best matches the location of the eyes of the user if they were located at the place of the camera sources. That is, a camera pair is chosen so that the disparity created by the camera sources resembles the disparity that the user's eyes would have at that location. If the user tilts their head, or the view orientation otherwise changes, a new pair can be formed, for example by switching one camera of the pair. The viewer device then forms the images of the video frames for the left and right eyes by picking the best sources for each area of each image for realistic stereo disparity.
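The pair-selection step can be illustrated with a small sketch. The abstract does not specify a selection metric, so the cost function below (sum of distances from each candidate camera to the corresponding eye position, mapped into the camera rig's frame) is an assumption; `best_stereo_pair` and its arguments are hypothetical names.

```python
import math

def best_stereo_pair(cameras, left_eye, right_eye):
    """Pick the ordered camera pair that best matches the viewer's eyes.

    cameras: list of (x, y, z) camera positions on the rig.
    left_eye, right_eye: eye positions expressed in the rig's frame.
    Illustrative metric only; the patent does not define it.
    """
    best, best_cost = None, float("inf")
    for i, ci in enumerate(cameras):
        for j, cj in enumerate(cameras):
            if i == j:
                continue
            # Cost: how far each camera sits from the eye it would feed.
            cost = math.dist(ci, left_eye) + math.dist(cj, right_eye)
            if cost < best_cost:
                best, best_cost = (i, j), cost
    return best
```

When the head tilts, the eye positions in the rig frame change, and re-running the selection naturally "switches one camera of the pair" as the abstract describes.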

Reprojection and wobulation at head-mounted display device

A head-mounted display device including one or more position sensors and a processor. The processor may receive a rendered image of a current frame. The processor may receive position data from the one or more position sensors and determine an updated device pose based on the position data. The processor may apply a first spatial correction to color information in pixels of the rendered image at least in part by reprojecting the rendered image based on the updated device pose. The head-mounted display device may further include a display configured to apply a second spatial correction to the color information in the pixels of the rendered image at least in part by applying wobulation to the reprojected rendered image to thereby generate a sequence of wobulated pixel subframes for the current frame. The display may display the current frame by displaying the sequence of wobulated pixel subframes.
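The first spatial correction (reprojection from the rendered pose to the updated pose) is often approximated as rotation-only late-stage reprojection, which reduces to a homography on pixel coordinates. The sketch below assumes a pinhole camera model; the function and argument names are illustrative, not taken from the patent.

```python
import numpy as np

def reproject(image_pts, K, R_delta):
    """Rotation-only reprojection of rendered pixels to an updated pose.

    image_pts: (N, 2) pixel coordinates in the rendered frame.
    K: 3x3 pinhole intrinsics matrix.
    R_delta: 3x3 rotation from the rendered pose to the updated pose.
    A common simplification; translation and depth are ignored here.
    """
    pts_h = np.hstack([image_pts, np.ones((len(image_pts), 1))])
    # Homography for a pure rotation: H = K @ R_delta^T @ K^-1
    H = K @ R_delta.T @ np.linalg.inv(K)
    out = (H @ pts_h.T).T
    return out[:, :2] / out[:, 2:3]
```

The second correction, wobulation, then shifts the reprojected image by sub-pixel offsets across the subframes of the current frame; it operates on the output of this step rather than on the original render.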

STEREOSCOPIC IMAGE DISPLAY DEVICE
20230097546 · 2023-03-30

According to one embodiment, a stereoscopic image display device includes a three-dimensional pixel unit, a backlight, and an arithmetic/control circuit. The three-dimensional pixel unit includes a plurality of pixel cells that are formed of an optical material having electrically changeable optical characteristics, are arranged in a mutually separated manner and in a three-dimensional manner, and are electrically connected with transparent wiring patterns. The backlight is configured to emit illumination light to the three-dimensional pixel unit. The arithmetic/control circuit is configured to control the plurality of pixel cells individually via the wiring patterns on the basis of input three-dimensional image data to cause the three-dimensional pixel unit to function as a transmissive hologram.
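How per-cell electrical control can make a pixel array act as a transmissive hologram can be illustrated with the textbook Fresnel point-source pattern: each cell is driven to impose the phase delay of the optical path from that cell to a target point. This is standard hologram physics, not the patent's arithmetic/control circuit; all names and values below are illustrative.

```python
import numpy as np

def point_hologram_phase(nx, ny, pitch_m, z_m, wavelength_m):
    """Per-cell phase pattern that focuses transmitted light to a point.

    nx, ny: cell counts; pitch_m: cell spacing in metres.
    z_m: depth of the target point; wavelength_m: illumination wavelength.
    Returns phases in [0, 2*pi) for each cell.
    """
    xs = (np.arange(nx) - nx / 2) * pitch_m
    ys = (np.arange(ny) - ny / 2) * pitch_m
    X, Y = np.meshgrid(xs, ys)
    # Path length from each cell to the point, wrapped to one wave.
    r = np.sqrt(X**2 + Y**2 + z_m**2)
    return (2 * np.pi / wavelength_m * r) % (2 * np.pi)
```

A full 3D image would superpose many such point contributions, which is why the device computes the cell drive values from the input three-dimensional image data rather than storing a fixed pattern.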

System and method for presenting image content on multiple depth planes by providing multiple intra-pupil parallax views

An augmented reality display system is configured to direct a plurality of parallactically-disparate intra-pupil images into a viewer's eye. The parallactically-disparate intra-pupil images provide different parallax views of a virtual object, and impinge on the pupil from different angles. In the aggregate, the wavefronts of light forming the images approximate a continuous divergent wavefront and provide selectable accommodation cues for the user, depending on the amount of parallax disparity between the intra-pupil images. The amount of parallax disparity is selected using a light source that outputs light for different images from different locations, with spatial differences in the locations of the light output providing differences in the paths that the light takes to the eye, which in turn provide different amounts of parallax disparity. Advantageously, the wavefront divergence, and the accommodation cue provided to the eye of the user, may be varied by appropriate selection of parallax disparity, which may be set by selecting the amount of spatial separation between the locations of light output.
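The geometric idea behind the accommodation cue can be sketched to first order: two beams entering the pupil at a small lateral offset and a small angular difference appear to diverge from a point at roughly offset/angle, which the eye reads as the virtual object's depth. This is a small-angle approximation for illustration, not the patent's optical model, and the function name is hypothetical.

```python
def accommodation_distance(beam_offset_mm, angle_diff_rad):
    """First-order depth cue from two intra-pupil beams.

    beam_offset_mm: lateral separation of the beams at the pupil.
    angle_diff_rad: angular difference between the beams (the parallax
    disparity). Returns the apparent source distance in metres.
    """
    return (beam_offset_mm / 1000.0) / angle_diff_rad
```

Increasing the spatial separation of the light-output locations increases the angular disparity, pulling the apparent depth closer, which is the selectable accommodation cue the abstract describes.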

System for 3D Image Projections and Viewing

Shaped glasses have curved surface lenses with spectrally complementary filters disposed thereon. The filters on the curved surface lenses are configured to compensate for wavelength shifts occurring due to viewing angles and other sources. Complementary images are projected for viewing through projection filters having passbands that pre-shift to compensate for subsequent wavelength shifts. At least one filter may have more than 3 primary passbands. For example, two filters include a first filter having passbands of low blue, high blue, low green, high green, and red, and a second filter having passbands of blue, green, and red. The additional passbands may be utilized to more closely match a color space and white point of a projector in which the filters are used. The shaped glasses and projection filters together may be utilized as a system for projecting and viewing 3D images.
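The "spectrally complementary" property means the two eyes' comb filters must have non-overlapping passbands, so each eye sees only its own projected image. A minimal check of that invariant, with illustrative wavelength values not taken from the patent:

```python
def passbands_disjoint(filter_a, filter_b):
    """True if no passband of filter_a overlaps any passband of filter_b.

    Each filter is a list of (low_nm, high_nm) passband intervals.
    Overlapping passbands would leak one eye's image into the other
    (crosstalk), so complementary filters must satisfy this check.
    """
    for lo_a, hi_a in filter_a:
        for lo_b, hi_b in filter_b:
            if lo_a < hi_b and lo_b < hi_a:  # interval overlap test
                return False
    return True
```

A pre-shift of the projection-filter passbands (as the abstract describes) moves each interval slightly so that, after the angle-dependent shift at the curved lens, the transmitted bands still land inside the correct eye's comb.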