Patent classifications
G02B27/42
IMAGE LIGHT GUIDE WITH ZONED DIFFRACTIVE OPTIC
An image light guide for conveying a virtual image, including a waveguide, an in-coupling diffractive optic operable to direct image-bearing light beams into the waveguide, and an out-coupling diffractive optic operable to direct the image-bearing light beams from the waveguide toward an eyebox. The out-coupling diffractive optic has two or more zones, each including a set of diffractive features, and successive zones along one dimension of the out-coupling diffractive optic have different respective sets of diffractive features. The diffractive features are operable to direct image-bearing light beams of a first pixel incident upon them at a first angle so that the directed beams continue to propagate within the waveguide, and to out-couple a portion of the image-bearing light beams of the first pixel incident upon them at a second angle.
CURABLE RESIN COMPOSITION, CURED PRODUCT, DIFFRACTIVE OPTICAL ELEMENT, AND MULTILAYER DIFFRACTIVE OPTICAL ELEMENT
Provided are: a curable resin composition including a near-ultraviolet-absorbing organic compound, indium tin oxide particles, and a polymer having a constitutional unit represented by General Formula (P) and an acidic group at one terminal, in which the near-ultraviolet-absorbing organic compound has an absorption maximum at 300 to 400 nm within the 300 to 800 nm wavelength region and does not substantially absorb light at 410 to 800 nm; a cured product formed of the curable resin composition; a diffractive optical element; and a multilayer diffractive optical element.
[General Formula (P): chemical structure not reproduced]
Ar^P represents an aryl group, and L^P and R^P1 each represent a specific group.
OPTICAL DISPLAY SYSTEM AND AUGMENTED REALITY ELECTRONIC DEVICE
An optical display system and an augmented reality electronic device are disclosed. The optical display system comprises: a waveguide; an input coupler, provided at the input end of the waveguide, which couples image light into the waveguide; and a two-dimensional grating, provided at the output end of the waveguide. The waveguide delivers the image light to the two-dimensional grating, which expands the pupil of the image light and out-couples the expanded light. The two-dimensional grating has rhombic lattices. Viewed from the top, the unit cells of the two-dimensional grating are asymmetric about the respective axes parallel to the propagation direction of the image light incident on the grating. The unit cells are oriented with the propagation direction of the image light, and each unit cell has at least two vertices at its end side.
Holographic mode filter for super-resolution imaging
A method includes receiving collimated light from an optical imaging system and dividing the received light into multiple wavelength bands. Each band is refocused onto a corresponding diffraction grating whose amplitude function is matched to the point spread function (PSF) of the optical imaging system. Light that is not filtered out by the diffraction grating is transmitted onto a corresponding pixel array, and an image is reconstructed from the data provided by the pixel arrays for each band. The intensity of light scattered by each diffraction grating may also be detected, with the image reconstructed as a function of the average detected intensity of scattered light, which is used to scale the known zero-order mode profile added to the image on the pixel array.
Devices and methods employing optical-based machine learning using diffractive deep neural networks
An all-optical Diffractive Deep Neural Network (D²NN) architecture learns to implement various functions or tasks after deep learning-based design of the passive diffractive or reflective substrate layers that work collectively to perform the desired function or task. This architecture was experimentally confirmed by creating 3D-printed D²NNs that learned to perform handwritten digit classification and a lens imaging function in the terahertz spectrum. This all-optical deep learning framework can perform, at the speed of light, various complex functions and tasks that computer-based neural networks can implement, and will find applications in all-optical image analysis, feature detection, and object classification, also enabling new camera designs and optical components that learn to perform unique tasks using D²NNs. In alternative embodiments, the all-optical D²NN is used as a front-end in conjunction with a trained, digital neural network back-end.
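The layered computation described above can be sketched numerically: each passive layer applies a phase mask to the field, followed by free-space propagation to the next layer. The sketch below is a loose numpy illustration of that idea using angular-spectrum propagation; the function names, layer count, and terahertz-scale parameter values are illustrative assumptions, not the patent's implementation.

```python
import numpy as np

def propagate(field, dist, wavelength, dx):
    """Angular-spectrum free-space propagation over distance `dist`."""
    n = field.shape[0]
    fx = np.fft.fftfreq(n, d=dx)
    FX, FY = np.meshgrid(fx, fx)
    arg = 1.0 / wavelength**2 - FX**2 - FY**2
    kz = 2 * np.pi * np.sqrt(np.maximum(arg, 0.0))
    # transfer function; evanescent components (arg < 0) are discarded
    H = np.where(arg > 0, np.exp(1j * kz * dist), 0.0)
    return np.fft.ifft2(np.fft.fft2(field) * H)

def d2nn_forward(field, phase_masks, dist, wavelength, dx):
    """Forward pass: each layer modulates the phase, then the field propagates."""
    for phi in phase_masks:
        field = propagate(field * np.exp(1j * phi), dist, wavelength, dx)
    return np.abs(field) ** 2          # a detector measures intensity

n = 64
masks = [np.zeros((n, n)) for _ in range(3)]     # untrained (flat) layers
inp = np.zeros((n, n)); inp[28:36, 28:36] = 1.0  # toy input aperture
# assumed terahertz-scale numbers: 0.75 mm wavelength, 0.4 mm pixels, 3 cm gaps
out = d2nn_forward(inp, masks, dist=0.03, wavelength=750e-6, dx=4e-4)
print(out.shape)
```

Training would optimize the entries of each `phi` mask by backpropagation through this differentiable forward model; here the masks are left flat only to keep the sketch short.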
Diffractive optical elements with mitigation of rebounce-induced light loss and related systems and methods
Display devices include waveguides with in-coupling optical elements that mitigate re-bounce of in-coupled light to improve overall in-coupling efficiency and/or uniformity. A waveguide receives light from a light source and/or projection optics and includes an in-coupling optical element that in-couples the received light to propagate by total internal reflection in a propagation direction within the waveguide. Once in-coupled into the waveguide, the light may undergo re-bounce, in which the light reflects off a waveguide surface and, after the reflection, strikes the in-coupling optical element again. Upon striking the in-coupling optical element, the light may be partially absorbed and/or out-coupled by that element, effectively reducing the amount of in-coupled light propagating through the waveguide. The in-coupling optical element can be truncated, or can have reduced diffraction efficiency along the propagation direction, to reduce light loss due to re-bounce of in-coupled light, so that less in-coupled light is prematurely out-coupled and/or absorbed during subsequent interactions with the in-coupling optical element.
Method for designing diffraction suppression optical component, display screen and under-screen camera apparatus
A method for designing a phase-type diffraction-suppressing optical device (12) for a transparent display screen (11) is disclosed, which comprises: acquiring the complex amplitude distribution U(x2, y2, d) = A(x2, y2, d)·exp(iφ20(x2, y2, d)) of the light field on a plane at distance d from the transparent display screen (11) after a plane wave is transmitted through the screen; and designing the diffraction-suppressing optical device (12) so that it has a transmittance function t2(x2, y2) = exp(iφ21(x2, y2)) satisfying φ20(x2, y2, d) + φ21(x2, y2) = C, where C is a constant. A diffraction-suppressing optical device (12) and an under-screen camera apparatus (1) comprising the same are also disclosed. The phase-type diffraction-suppressing optical device (12) suppresses the diffraction effect in the under-screen camera apparatus (1) by providing phase modulation, thereby improving the quality of under-screen imaging.
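The design rule above reduces to choosing the optic's phase as the complement of the measured field phase, φ21 = C − φ20, so their sum is constant across the plane. A minimal numpy sketch of that rule, using a synthetic stand-in field (the quadratic toy phase and the helper name `compensating_phase` are assumptions for illustration, not the patent's procedure):

```python
import numpy as np

def compensating_phase(U, C=0.0):
    """Return phi21 such that phi20 + phi21 == C everywhere on the plane."""
    phi20 = np.angle(U)          # phase of the field transmitted by the screen
    return C - phi20             # required phase of the suppressing optic

# toy complex field A*exp(i*phi20) with a spatially varying phase
x = np.linspace(-1, 1, 64)
X, Y = np.meshgrid(x, x)
U = np.exp(1j * 2 * np.pi * (X**2 + Y**2))

phi21 = compensating_phase(U, C=0.0)
total = np.angle(U * np.exp(1j * phi21))   # residual phase phi20 + phi21
print(np.allclose(total, 0.0, atol=1e-9))  # flat phase: diffraction suppressed
```

With the residual phase flattened to the constant C, the transmitted wavefront behaves like an undisturbed plane wave, which is the stated suppression effect.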
Structured light projection module and depth camera
A structured light projection module and a depth camera are provided. The structured light projection module includes: a light source array including a plurality of sub-light sources arranged in a two-dimensional pattern and configured to emit array beams corresponding to the two-dimensional pattern; a lens configured to receive and converge the array beams; and a diffractive optical element configured to receive the array beams converged by the lens and project beams in a structured light speckle pattern. The structured light speckle pattern is formed through staggered superposition of at least two secondary structured light speckle patterns. Each secondary structured light speckle pattern is formed through a tiling arrangement of multiple sub-speckle patterns generated by a portion of the sub-light sources, and comprises speckles formed by diffracting light from an individual sub-light source via the diffractive optical element.
Projection module and terminal
A projection module and a terminal are provided. The projection module includes a base, a housing, a first light source, a second light source and an optical element. The housing is disposed on the base, and defines an accommodating cavity together with the base. The first light source is disposed on the base and arranged in the accommodating cavity. The second light source is disposed on the base and arranged in the accommodating cavity. The optical element is disposed on the housing and includes a diffraction area and a diffusion area. The first light source aligns with the diffraction area, the second light source aligns with the diffusion area, the diffraction area is configured to diffract light passing through the diffraction area, and the diffusion area is configured to diffuse light passing through the diffusion area.
GEOMETRIC INTRINSIC CAMERA CALIBRATION USING DIFFRACTIVE OPTICAL ELEMENT
Provided are methods for geometric intrinsic camera calibration using a diffractive optical element. Some methods described include receiving, by at least one processor, at least one image captured by a camera based on a plurality of light beams received from a diffractive optical element aligned with an optical axis of the camera, the plurality of light beams having a plurality of propagation directions associated with a plurality of view angles. The at least one processor identifies a plurality of shapes in the image, determines a correspondence between the plurality of shapes in the image and the plurality of light beams, and identifies one or more intrinsic parameters of the camera that minimize a reprojection error function based on the plurality of shapes in the image and the plurality of propagation directions. Systems and computer program products are also provided.
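For a simple pinhole model, the reprojection-error minimization described above becomes linear in the intrinsics: each DOE beam contributes one equation per image axis relating its known propagation direction to its observed spot position. The sketch below illustrates that idea with synthetic, noise-free data; the parameter names (fx, fy, cx, cy) and the least-squares setup are conventional pinhole-model assumptions, not the patent's specific method.

```python
import numpy as np

# synthetic DOE beams: known unit propagation directions with positive Z
rng = np.random.default_rng(0)
dirs = rng.normal(size=(50, 3))
dirs[:, 2] = np.abs(dirs[:, 2]) + 1.0
dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)

fx, fy, cx, cy = 800.0, 810.0, 320.0, 240.0   # ground-truth intrinsics
u = fx * dirs[:, 0] / dirs[:, 2] + cx         # observed spot centers (pixels)
v = fy * dirs[:, 1] / dirs[:, 2] + cy

# reprojection model u = fx*(X/Z) + cx is linear: solve [X/Z, 1] @ [fx, cx] = u
Au = np.stack([dirs[:, 0] / dirs[:, 2], np.ones(len(dirs))], axis=1)
fx_est, cx_est = np.linalg.lstsq(Au, u, rcond=None)[0]
Av = np.stack([dirs[:, 1] / dirs[:, 2], np.ones(len(dirs))], axis=1)
fy_est, cy_est = np.linalg.lstsq(Av, v, rcond=None)[0]
print(round(fx_est, 1), round(cx_est, 1))     # recovers 800.0 320.0
```

With noisy spot detections, or with distortion parameters added to the model, the same residual would instead be minimized by a nonlinear least-squares solver, which is what a reprojection-error objective typically requires in practice.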