G06T15/55

Augmented reality lighting effects
11488366 · 2022-11-01

The present invention embraces a system, device, and method for adding lighting effects to augmented reality (AR) content (i.e., virtual objects). Light sensors in an AR system monitor an environment's lighting conditions to acquire lighting data that can be used to create (or update) virtual light sources. Depth sensors in the AR system sense the environment to acquire mapping data that can be used to build a 3D model of the environment while tracking the system's location within it. Algorithms running on a processor may then add the virtual light sources to the 3D model of the environment so that, when AR content is created, lighting effects corresponding to the virtual light sources can be added. The resulting AR content with virtual lighting effects appears more realistic to the user.
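The core idea — virtual light sources placed in the 3D model and used to shade AR content — can be sketched as simple per-vertex Lambertian shading. This is an illustrative minimal model, not the patent's actual implementation; all function and variable names here are hypothetical.

```python
import math

def shade_vertex(vertex, normal, lights, albedo=0.8):
    """Lambertian shading of one AR-content vertex against virtual lights.

    `lights` is a list of (position, intensity) tuples built from
    light-sensor readings placed into the environment's 3D model.
    """
    radiance = 0.0
    for pos, intensity in lights:
        # Vector and squared distance from the vertex to the virtual light.
        d = [p - v for p, v in zip(pos, vertex)]
        dist2 = sum(c * c for c in d) or 1e-9
        d = [c / math.sqrt(dist2) for c in d]
        # Lambert's cosine law with inverse-square falloff.
        cos_theta = max(0.0, sum(n * c for n, c in zip(normal, d)))
        radiance += intensity * cos_theta / dist2
    return albedo * radiance

# One virtual light created from a sensed reading, 2 m above the vertex.
lights = [((0.0, 2.0, 0.0), 10.0)]
print(shade_vertex((0.0, 0.0, 0.0), (0.0, 1.0, 0.0), lights))  # → 2.0
```

As the light sensors report changing conditions, the `lights` list would be updated and the AR content re-shaded, keeping virtual objects consistent with the real environment.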

Efficiently determining an absorption coefficient of a virtual volume in 3D computer graphics
11481959 · 2022-10-25

Disclosed is a method to derive the absorption coefficient, transparency, and/or the scattering coefficient from user-specified parameters, including roughness, phase function, index of refraction (IOR), and color, by performing the simulation once and storing the results in an easy-to-retrieve representation, such as a lookup table or an analytic function. To create the analytic function, one or more analytic functions can be fitted to the simulation results across the multiple parameters, including roughness, phase function, IOR, and color. The lookup table can also be combined with the analytic representation. For example, the lookup table can represent the color, roughness, and phase function, while the IOR is represented by an analytic function; in that case the lookup table becomes three-dimensional, and when the IOR is above 2 the IOR dependence is calculated using the analytic function.
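A hybrid representation of this kind might look like the following sketch: a precomputed 3D table over (roughness, phase-function parameter, color) combined with an analytic factor for the IOR. The table contents, the analytic fit, and all names are stand-ins for illustration — the real values would come from the one-time simulation described above.

```python
import numpy as np

# Hypothetical precomputed table: simulation results indexed by
# roughness, phase-function parameter g, and a color-channel value.
N = 8
axes = np.linspace(0.0, 1.0, N)
R, G, C = np.meshgrid(axes, axes, axes, indexing="ij")
table = 1.0 + R + 0.5 * G + 2.0 * C  # stand-in absorption-coefficient samples

def ior_factor(ior):
    # Illustrative analytic fit for the IOR dependence; a real fit would
    # be regressed from the simulation results.
    return 1.0 / (ior * ior)

def absorption(roughness, g, color, ior):
    """Hybrid lookup: 3D table over (roughness, g, color), analytic in IOR.

    Nearest-neighbor lookup for brevity; a production version would
    interpolate (e.g. trilinearly) between table entries.
    """
    idx = [min(int(round(x * (N - 1))), N - 1) for x in (roughness, g, color)]
    return table[tuple(idx)] * ior_factor(ior)

print(absorption(0.0, 0.0, 0.0, 1.0))  # → 1.0
print(absorption(1.0, 1.0, 1.0, 2.0))  # → 1.125
```

Factoring the IOR out analytically keeps the table at three dimensions instead of four, which is the storage saving the abstract alludes to.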

Light field imaging system by projecting near-infrared spot in remote sensing based on multifocal microlens array

The present disclosure provides a light field imaging system by projecting near-infrared spot in remote sensing based on a multifocal microlens array. The light field imaging system includes a near-infrared spot projection apparatus (100) and a light field imaging component (200), where the near-infrared spot projection apparatus (100) is configured to scatter near-infrared spots on a to-be-observed object to add texture information to a target image, and the light field imaging component (200) is configured to image a target scene light ray with additional texture information. The present disclosure can extend a target depth-of-field (DOF) detection range, and particularly, reconstruct a surface of a weak-texture object.

ILLUMINATION RENDERING METHOD AND APPARATUS, COMPUTER DEVICE, AND STORAGE MEDIUM
20230076326 · 2023-03-09

This application relates to an illumination rendering method performed by a computer device. The method includes: determining light source change information when a light source changes in a virtual scene; determining a current light source projection coefficient corresponding to the changed light source according to the light source change information; determining an indirect illumination value of a target pixel point in the virtual scene according to a radiance transfer parameter corresponding to the target pixel point in the virtual scene and the current light source projection coefficient; determining a direct illumination value corresponding to the target pixel point under the changed light source; and performing illumination rendering on the target pixel point according to the direct illumination value and the indirect illumination value.
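The indirect-illumination step resembles precomputed radiance transfer: the per-pixel transfer parameters and the changed light's projection coefficients live in the same basis (commonly spherical harmonics), so relighting reduces to a dot product plus the direct term. A minimal sketch, with all names illustrative:

```python
import numpy as np

def relight(transfer, light_coeffs, direct):
    """Per-pixel relighting under a changed light source.

    `transfer` is the precomputed radiance-transfer vector for the pixel,
    `light_coeffs` the projection coefficients of the changed light onto
    the same basis, and `direct` the direct illumination evaluated for
    that light.
    """
    indirect = float(np.dot(transfer, light_coeffs))  # basis-space dot product
    return direct + indirect

# 4-coefficient example: the transfer/light dot product contributes 0.5.
t = np.array([1.0, 0.0, 0.5, 0.0])
c = np.array([0.2, 0.3, 0.6, 0.1])
print(relight(t, c, direct=0.4))  # → 0.9
```

Because only `light_coeffs` and `direct` change when the light moves, the expensive transfer vectors can stay fixed per pixel, which is what makes this formulation suitable for dynamic lights.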

Indoor scene illumination

Techniques for illuminating an indoor scene. A directional distribution associated with the indoor scene is received. The indoor scene has a first scene element and a first quadrilateral. The first scene element has a first shading point disposed thereon. The directional distribution is reparametrized such that the first quadrilateral as viewed from the first shading point corresponds to an axis-aligned rectangular region in the reparametrized directional distribution. The scene element is illuminated using one or more samples drawn from the shading point by performing importance sampling based on the reparametrized directional distribution.
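The benefit of the reparametrization is that the quadrilateral light becomes an axis-aligned rectangle, over which uniform sampling and its probability density are trivial. A hedged sketch of drawing samples in that reparametrized domain (bounds, counts, and names are hypothetical):

```python
import random

def sample_rect(u_min, u_max, v_min, v_max, n, rng=random.Random(0)):
    """Draw n samples from the axis-aligned rectangle that the
    quadrilateral maps to in the reparametrized directional domain.

    Returns ((u, v), pdf) pairs; because the region is axis-aligned,
    the density is a constant 1/area over the rectangle.
    """
    area = (u_max - u_min) * (v_max - v_min)
    pdf = 1.0 / area  # constant density over the rectangle
    samples = []
    for _ in range(n):
        u = u_min + (u_max - u_min) * rng.random()
        v = v_min + (v_max - v_min) * rng.random()
        samples.append(((u, v), pdf))
    return samples
```

Each (u, v) sample would then be mapped back through the inverse reparametrization to a direction from the shading point, and the shading integral estimated with the returned constant PDF.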

SYSTEM AND METHOD FOR REAL TIME DYNAMIC LIGHTING SIMULATION
20170345208 · 2017-11-30

Sustainable building lighting and energy modelling and control, and the associated computer graphics, including real-time dynamic lighting simulation, are concerned with: an optimized method for radiance modelling, including its application to predictive daylight harvesting; and the real-time simulation of physically-based electric lighting and daylighting for architectural, horticultural, and theatrical lighting systems visualization. In order to display and analyze in real time a photometrically accurate representation of an environment, thousands of lighting channels may have their intensity settings continually varied such that a user may interactively view the three-dimensional environment without the need for ongoing global illumination calculations. This can be accomplished utilizing texture maps as a multiplicity of canonical radiosity solutions, each representing a lighting channel for dynamic lighting simulation, and storing the solutions in the texture memory of a graphics processing unit.
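The key trick — radiosity solved once per lighting channel, then relit in real time by scaling and summing the canonical solutions — is linear superposition. On a GPU this is a texture fetch and multiply-add per channel; the NumPy sketch below stands in for that (array shapes and names are illustrative):

```python
import numpy as np

def composite(channel_solutions, intensities):
    """Real-time relighting as a weighted sum of canonical radiosity
    solutions, one texture map per lighting channel.

    Changing a channel's dimmer setting only changes its weight, so no
    global-illumination recomputation is needed.
    """
    out = np.zeros_like(channel_solutions[0])
    for tex, w in zip(channel_solutions, intensities):
        out += w * tex  # scale each canonical solution by its channel intensity
    return out

# Two 2x2 canonical solutions with channel intensities 0.5 and 2.0.
a = np.ones((2, 2))
b = np.full((2, 2), 3.0)
print(composite([a, b], [0.5, 2.0]))  # each texel: 0.5*1 + 2.0*3 = 6.5
```

With thousands of channels stored as texture maps in GPU memory, varying the intensity vector lets a user explore the photometrically accurate scene interactively, exactly as the abstract describes.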