G06T15/04

DISPLACEMENT MAPS

Examples of methods for determining displacement maps are described herein. In some examples of the methods, a method includes determining a displacement map for a three-dimensional (3D) object model based on a compensated point cloud. In some examples, the method includes assembling the displacement map on the 3D object model for 3D manufacturing.
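As an illustration of the idea, the sketch below computes a per-vertex displacement value for a 3D object model from a compensated point cloud: each vertex records the signed offset, along its normal, to the nearest cloud point. This is a minimal stand-in, not the patented method; the brute-force nearest-neighbor search and the per-vertex (rather than per-texel) map are simplifying assumptions.

```python
import numpy as np

def displacement_map(vertices, normals, compensated_cloud):
    """For each model vertex, find the nearest point in the compensated
    point cloud and record its signed offset along the vertex normal.
    Returns one displacement value per vertex (a per-vertex "map")."""
    displacements = np.empty(len(vertices))
    for i, (v, n) in enumerate(zip(vertices, normals)):
        offsets = compensated_cloud - v  # vectors from vertex to every cloud point
        nearest = offsets[np.argmin(np.linalg.norm(offsets, axis=1))]
        displacements[i] = nearest @ n   # signed component along the normal
    return displacements
```

Applying such a map during 3D manufacturing would then mean offsetting each surface point along its normal by the stored value.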

TEXTURE FILTERING OF TEXTURE REPRESENTED BY MULTILEVEL MIPMAP
20230050686 · 2023-02-16 ·

Texture filtering is applied to a texture represented with a mipmap comprising a plurality of levels, wherein each level of the mipmap comprises an image representing the texture at a respective level of detail. A texture filtering unit has minimum and maximum limits on an amount by which it can alter the level of detail when it filters texels from an image of a single level of the mipmap. The range of level of detail between the minimum and maximum limits defines an intrinsic region of the texture filtering unit. If it is determined that a received input level of detail is in an intrinsic region of the texture filtering unit, texels are read from a single mipmap level of the mipmap, and the read texels from the single mipmap level are filtered to determine a filtered texture value representing part of the texture at the input level of detail. If it is determined that the received input level of detail is in an extrinsic region of the texture filtering unit, texels are read from two mipmap levels of the mipmap, and the read texels from the two mipmap levels are processed to determine a filtered texture value representing part of the texture at the input level of detail.
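The intrinsic/extrinsic distinction can be sketched as follows: if the input level of detail lies within the unit's adjustment limit of an integer mip level, a single level is read; otherwise two bracketing levels are read and blended. The `delta` limit, the nearest-texel `sample` helper, and the linear blend are all illustrative assumptions, not details from the patent.

```python
import math

def sample(levels, level, u, v):
    """Nearest-texel lookup in one mipmap image (stand-in for the
    unit's single-level filtering). levels[0] is the finest image."""
    img = levels[level]
    h, w = len(img), len(img[0])
    return img[min(int(v * h), h - 1)][min(int(u * w), w - 1)]

def filter_texture(levels, lod, u, v, delta=0.25):
    """If lod lies within +/-delta of an integer mip level (the unit's
    intrinsic region), alter the level of detail and read texels from
    that single level. Otherwise (extrinsic region), read texels from
    the two bracketing levels and blend them linearly."""
    lo = int(math.floor(lod))
    frac = lod - lo
    if frac <= delta:                  # intrinsic: snap down to level lo
        return sample(levels, lo, u, v)
    if frac >= 1.0 - delta:            # intrinsic: snap up to level lo+1
        return sample(levels, min(lo + 1, len(levels) - 1), u, v)
    # extrinsic: combine texels from the two bracketing mip levels
    a = sample(levels, lo, u, v)
    b = sample(levels, min(lo + 1, len(levels) - 1), u, v)
    return (1.0 - frac) * a + frac * b
```

The single-level branches are where the unit exercises its permitted level-of-detail adjustment; only LODs outside that range incur a two-level read.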

VOLUMETRIC VIDEO FROM AN IMAGE SOURCE

A method for generating one or more 3D models of at least one living object from at least one 2D image comprising the at least one living object. The one or more 3D models can be modified and enhanced. The resulting one or more 3D models can be transformed into at least one 2D display image; the point of view of the output 2D image(s) can be different from that of the input 2D image(s).
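The final step, rendering the 3D model to a 2D image from a point of view different from the input's, can be illustrated with a basic pinhole projection of the model (here a point cloud stands in for the full model). The yaw rotation and focal length `f` are assumed parameters for the sketch, not details from the abstract.

```python
import numpy as np

def render_view(points3d, yaw_deg, f=1.0):
    """Project a 3D point cloud (stand-in for the reconstructed model)
    to 2D image coordinates from a camera rotated yaw_deg about the
    vertical axis, i.e. a point of view different from the input image.
    Simple pinhole model with focal length f."""
    th = np.radians(yaw_deg)
    rot = np.array([[np.cos(th), 0.0, np.sin(th)],
                    [0.0,        1.0, 0.0],
                    [-np.sin(th), 0.0, np.cos(th)]])
    cam = points3d @ rot.T                 # rotate into the new camera frame
    z = cam[:, 2]
    return f * cam[:, :2] / z[:, None]     # perspective divide
```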

ANISOTROPIC TEXTURE FILTERING USING WEIGHTS OF AN ANISOTROPIC FILTER THAT MINIMIZE A COST FUNCTION
20230050797 · 2023-02-16 ·

A method of performing anisotropic texture filtering includes generating one or more parameters describing an elliptical footprint in texture space; performing isotropic filtering at each sampling point of a set of sampling points in an ellipse to be sampled to produce a plurality of isotropic filter results, the ellipse to be sampled based on the elliptical footprint; selecting, based on one or more parameters of the set of sampling points and one or more parameters of the ellipse to be sampled, weights of an anisotropic filter that minimize a cost function that penalizes high frequencies in the filter response of the anisotropic filter under a constraint that the variance of the anisotropic filter is related to an anisotropic ratio squared, the anisotropic ratio being the ratio of a major radius of the ellipse to be sampled to a minor radius of the ellipse to be sampled; and combining the plurality of isotropic filter results using the selected weights of the anisotropic filter to generate at least a portion of a filter result.
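A Gaussian is the classic minimizer of high-frequency energy for a fixed variance, so the weight-selection step can be sketched as Gaussian weights over the sampling points, with the variance tied to the squared anisotropic ratio. The proportionality constant `k` and the evenly spaced sample positions are assumptions of this sketch, not values from the patent.

```python
import math

def anisotropic_weights(n_samples, aniso_ratio, k=1.0 / 12.0):
    """Gaussian weights over n_samples points spaced along the ellipse's
    major axis. The variance is set proportional to the anisotropic
    ratio squared (k is an assumed proportionality constant)."""
    sigma2 = k * aniso_ratio ** 2
    # sample positions centred on the ellipse, one unit apart
    xs = [i - (n_samples - 1) / 2.0 for i in range(n_samples)]
    ws = [math.exp(-x * x / (2.0 * sigma2)) for x in xs]
    total = sum(ws)
    return [w / total for w in ws]  # normalize so the weights sum to 1

def anisotropic_filter(iso_results, aniso_ratio):
    """Combine per-sample isotropic filter results with the weights."""
    ws = anisotropic_weights(len(iso_results), aniso_ratio)
    return sum(w * r for w, r in zip(ws, iso_results))
```

Because the weights are normalized, filtering a constant texture region returns that constant, as a correct filter must.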

3D Imaging and Texture Mapping for Apparel Imagery

A manufacturing flow for apparel such as jeans uses a laser to finish the products. The products are designed using a digital design tool, where photorealistic previews are generated in three dimensions and two dimensions. Imagery of the products is sent to retailers, where customers can order the products, such as through online orders. Imagery of the products is also sent to factories where the products are finished. Based on the imagery, the factories adjust their processes as needed so that the actual products have the appearance shown in the received imagery. As orders are received by the retailers, the factories can manufacture the desired products on demand, and the products can be delivered to customers.

SYSTEM AND METHOD FOR GENERATING VIRTUAL PSEUDO 3D OUTPUTS FROM IMAGES
20230052169 · 2023-02-16 ·

A method for generating virtual pseudo three-dimensional 360-degree outputs from 2D images of an object 102 is provided. An image viewer plane of the object 102 in the 3D image to be rendered on a user device 108 is detected using an augmented reality technique. The image viewer plane is placed facing the user device 108, rendering ‘Image 0’, and movement coordinates of the user device 108 with respect to the image viewer plane are detected to calculate the virtual pseudo 3D image set to be displayed, based on at least one angle of view, by performing interpolation between two consecutive virtual pseudo 3D images. The image viewer plane is changed with respect to the movement of the user device 108 to change the virtual pseudo 3D image and the interpolated virtual pseudo 3D image on the plane, and that image is displayed as an augmented reality object in real time on the user device 108.
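The interpolation between two consecutive pseudo-3D images can be sketched as a cross-fade between the two pre-rendered views that bracket the device's current viewing angle. Scalars stand in for images here (a real implementation would blend per pixel), and the even angular spacing and linear blend are assumptions of this sketch.

```python
def pseudo_3d_view(views, angle_deg):
    """views: pre-rendered views at evenly spaced angles over 360
    degrees, with views[0] as 'Image 0'. Returns the linear blend of
    the two consecutive views bracketing the device's current angle.
    Scalars stand in for images; a real blend would be per pixel."""
    n = len(views)
    step = 360.0 / n                 # angular spacing between views
    pos = (angle_deg % 360.0) / step
    i = int(pos)
    t = pos - i                      # fraction of the way to the next view
    a, b = views[i], views[(i + 1) % n]
    return (1.0 - t) * a + t * b
```

At an angle exactly matching a stored view, the blend degenerates to that view; between views, the result interpolates smoothly, which is what lets a sparse set of renders appear as a continuous 360-degree object.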