Patent classifications
G06T2210/16
TEXTURE FILTERING OF TEXTURE REPRESENTED BY MULTILEVEL MIPMAP
Texture filtering is applied to a texture represented with a mipmap comprising a plurality of levels, wherein each level of the mipmap comprises an image representing the texture at a respective level of detail. A texture filtering unit has minimum and maximum limits on the amount by which it can alter the level of detail when it filters texels from an image of a single level of the mipmap. The range of level of detail between the minimum and maximum limits defines an intrinsic region of the texture filtering unit. If it is determined that a received input level of detail is in the intrinsic region of the texture filtering unit, texels are read from a single mipmap level, and the read texels are filtered to determine a filtered texture value representing part of the texture at the input level of detail. If it is determined that the received input level of detail is in an extrinsic region of the texture filtering unit, texels are read from two mipmap levels, and the read texels from the two mipmap levels are processed to determine a filtered texture value representing part of the texture at the input level of detail.
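As an illustration of the intrinsic/extrinsic decision described above, here is a minimal sketch. The 1-D mipmap, the ±0.25 intrinsic limits, and all function names are assumptions for illustration, not taken from the abstract:

```python
def build_mipmap(base):
    """Build a 1-D mipmap: each level halves the previous by averaging pairs."""
    levels = [list(base)]
    while len(levels[-1]) > 1:
        prev = levels[-1]
        levels.append([(prev[i] + prev[i + 1]) / 2 for i in range(0, len(prev) - 1, 2)])
    return levels

def filter_texture(mipmap, u, lod, intrinsic_min=-0.25, intrinsic_max=0.25):
    """Return a filtered value at coordinate u (0..1) for the given level of detail.

    If the offset from the nearest level lies inside the unit's intrinsic
    region, texels are read from that single level; otherwise (extrinsic
    case) the two bracketing levels are read and blended.
    """
    def sample(level):
        img = mipmap[min(level, len(mipmap) - 1)]
        idx = min(int(u * len(img)), len(img) - 1)
        return img[idx]

    nearest = round(lod)
    delta = lod - nearest
    if intrinsic_min <= delta <= intrinsic_max:
        # Intrinsic region: a single mipmap level suffices.
        return sample(nearest)
    # Extrinsic region: blend the two levels bracketing the input LOD.
    lo = int(lod)
    frac = lod - lo
    return (1 - frac) * sample(lo) + frac * sample(lo + 1)
```

For example, an input LOD of 1.5 falls outside the ±0.25 intrinsic region around either level, so levels 1 and 2 are both read and blended equally.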
SYSTEM AND METHOD FOR GENERATING VIRTUAL PSEUDO 3D OUTPUTS FROM IMAGES
A method for generating virtual pseudo three-dimensional 360 degree outputs from 2D images of an object 102 is provided. An image viewer plane of the object 102 in the 3D image to be rendered on a user device 108 is detected using an augmented reality technique. The image viewer plane is placed facing the user device 108, rendering ‘Image 0’, and movement coordinates of the user device 108 with respect to the image viewer plane are detected to determine the virtual pseudo 3D image to be displayed, based on at least one angle of view, by performing interpolation between two consecutive virtual pseudo 3D images. The image viewer plane is changed with respect to the movement of the user device 108 to change the virtual pseudo 3D image and the interpolated virtual pseudo 3D image on the plane, and that image is displayed as an augmented reality object in real time on the user device 108.
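The angle-to-image mapping and interpolation step might look like the following sketch, where the 360-degree capture is assumed to be a fixed number of evenly spaced images and the blend is a simple per-pixel cross-fade (both assumptions for illustration):

```python
def select_images(angle_deg, num_images):
    """Map a viewing angle to the indices of the two consecutive images
    bracketing it, plus the interpolation weight between them."""
    step = 360.0 / num_images          # angular spacing between captures
    pos = (angle_deg % 360.0) / step   # fractional position in the ring
    i = int(pos) % num_images
    j = (i + 1) % num_images           # wrap around at 360 degrees
    return i, j, pos - int(pos)

def interpolate(img_a, img_b, t):
    """Linear cross-fade between two images (flat lists of pixel values)."""
    return [(1 - t) * a + t * b for a, b in zip(img_a, img_b)]
```

With 8 captures, a view angle of 50 degrees falls between images 1 (45°) and 2 (90°), so those two are interpolated with a small weight toward image 2.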
SYSTEM AND METHOD FOR GENERATING 3D OBJECTS FROM 2D IMAGES OF GARMENTS
A system for generating three-dimensional (3D) objects from two-dimensional (2D) images of garments is presented. The system includes a data module configured to receive a 2D image of a selected garment and a target 3D model. The system further includes a computer vision model configured to generate a UV map of the 2D image of the selected garment. The system moreover includes a training module configured to train the computer vision model based on a plurality of 2D training images and a plurality of ground truth (GT) panels for a plurality of 3D training models. The system furthermore includes a 3D object generator configured to generate a 3D object corresponding to the selected garment based on the UV map generated by a trained computer vision model and the target 3D model. A related method is also presented.
ANISOTROPIC TEXTURE FILTERING USING ADAPTIVE FILTER KERNEL
A texture filtering unit applies anisotropic filtering using a filter kernel which can be adapted to apply different amounts of anisotropy, up to a maximum amount of anisotropy, along an input direction of anisotropy. If it is determined that a received input amount of anisotropy is not above the maximum amount of anisotropy, the filter kernel applies the input amount of anisotropy, and texels of a texture are sampled using the filter kernel to determine a filtered texture value. If it is determined that the input amount of anisotropy is above the maximum amount of anisotropy, the filter kernel applies an amount of anisotropy that is not above the maximum, a plurality of sampling operations are performed to sample texels of the texture using the filter kernel to determine a respective plurality of intermediate filtered texture values, and the intermediate filtered texture values are combined to determine a filtered texture value which has been filtered in accordance with the input amount of anisotropy and the input direction of anisotropy.
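The split-and-combine behaviour for an over-limit anisotropy ratio can be sketched as below. The `sample_fn` kernel, the default limit of 4, and the even spread of passes along the anisotropy axis are all illustrative assumptions:

```python
import math

def anisotropic_filter(sample_fn, center, direction, anisotropy, max_anisotropy=4.0):
    """Filter at `center` with the requested anisotropy ratio.

    If the ratio exceeds the kernel's maximum, perform several sampling
    operations spread along the direction of anisotropy, each within the
    kernel limit, and combine (average) the intermediate results.
    """
    if anisotropy <= max_anisotropy:
        # Within the kernel's limit: a single sampling operation suffices.
        return sample_fn(center, direction, anisotropy)
    # Number of kernel invocations needed to cover the full footprint.
    n = math.ceil(anisotropy / max_anisotropy)
    per_pass = anisotropy / n  # each pass stays within the kernel limit
    dx, dy = direction
    results = []
    for k in range(n):
        offset = (k - (n - 1) / 2) * per_pass  # spread passes along the axis
        pos = (center[0] + offset * dx, center[1] + offset * dy)
        results.append(sample_fn(pos, direction, per_pass))
    return sum(results) / n
```

For a requested ratio of 8 with a limit of 4, two passes of ratio 4 are taken at offsets on either side of the centre and averaged.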
SYSTEM AND METHOD FOR PROVIDING PERSONALIZED TRANSACTIONS BASED ON 3D REPRESENTATIONS OF USER PHYSICAL CHARACTERISTICS
The disclosed systems, components, methods, and processing steps are directed to determining user-item fit characteristics of an item for a user body part. A three-dimensional (3D) reconstructed model of the user body part is accessed, along with information about one or more 3D reference models of the item, the information for each 3D reference model including respective dimensional measurement, spatial, and geometrical attributes. A 3D matching process is performed based on the 3D reconstructed model and the accessed information to determine a best-fitting 3D reference model from the one or more 3D reference models. The best-fitting 3D reference model is integrated with the 3D reconstructed model to provide a 3D best-fit representation, which is displayed along with visual indications of the user-item fit characteristics.
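One way to picture the matching step is a nearest-neighbour search over dimensional measurements; the measurement names, the Euclidean metric, and the dict-based model store here are assumptions for illustration only:

```python
def best_fit_model(user_measurements, reference_models):
    """Pick the reference model whose dimensional measurements are closest
    (Euclidean distance) to the reconstructed user model's measurements.

    `reference_models` maps a model name to its measurement dict; the dicts
    share measurement keys with `user_measurements`, e.g. "width", "length".
    """
    def distance(ref):
        return sum((user_measurements[k] - ref[k]) ** 2
                   for k in user_measurements) ** 0.5
    return min(reference_models, key=lambda name: distance(reference_models[name]))
```

A real matching process would also weigh the spatial and geometrical attributes mentioned in the abstract, not just scalar measurements.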
Virtually modeling clothing based on 3D models of customers
Three-dimensional models (or avatars) may be defined based on imaging data captured from a customer. The avatars may be based on a virtual mannequin having one or more dimensions in common with the customer, a body template corresponding to the customer, or imaging data captured from the customer. The avatars are shown on displays or in user interfaces and may be used for various purposes, such as depicting how clothing will appear or behave while worn by the customer, alone or with other clothing. Customers may drag and drop images of clothing onto the avatars. One or more of the avatars may be displayed on any display, such as a monitor or a virtual reality headset, in a static or dynamic mode. Images of avatars and clothing may be used to generate print catalogs depicting the appearance or behavior of the clothing while worn by the customer.
Laser finishing design tool with image preview
A tool allows a user to create new designs for apparel and preview these designs before manufacture. Software and lasers are used in finishing apparel to produce a desired wear pattern or other design. Based on a laser input file with a pattern, a laser will burn the pattern onto apparel. With the tool, the user can create a design, make changes to it, and view images of it, in real time, before it is burned by a laser. Input to the tool includes fabric template images, laser input files, and damage input. The tool allows adding tinting and adjusting intensity and bright point. The user can also move, rotate, scale, and warp the image input.
Guide-assisted capture of material data
A material data collection system allows capturing of material data, such as digital image data for materials. The system may ensure that captured digital image data is properly aligned, so that material data can be easily recalled for later use while the proper alignment of the captured digital image is maintained. The system may use a capture guide to provide cues on how to orient a mobile device used with the material data collection system.
Computer-Implemented Method For Positioning Patterns Around An Avatar
A computer-implemented method for designing a virtual garment or upholstery (G) in a three-dimensional scene comprising the steps of: a) providing a three-dimensional avatar (AV) in the three-dimensional scene; b) providing at least one pattern (P) of said virtual garment or upholstery in the three-dimensional scene; c) determining a distance field from a surface of the avatar; d) positioning the pattern relative to the avatar by keeping a fixed orientation with respect to said distance field; and e) assembling the positioned pattern or patterns around the avatar to form said virtual garment or upholstery, and draping it onto the avatar. A computer program product, a non-volatile computer-readable data-storage medium, and a Computer Aided Design system for carrying out such a method are also described, along with application of the method to the manufacturing of a garment or upholstery.
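Steps c) and d) can be sketched with an analytic distance field. Here the "avatar" is assumed to be a sphere centred at the origin, so the field and its gradient have closed forms; a real avatar would use a sampled signed-distance field:

```python
import math

R = 1.0  # illustrative avatar radius

def distance_field(p):
    """Signed distance from point p (a 3-tuple) to the sphere surface."""
    return math.sqrt(sum(c * c for c in p)) - R

def field_gradient(p):
    """Unit gradient of the distance field (the outward surface direction)."""
    norm = math.sqrt(sum(c * c for c in p))
    return tuple(c / norm for c in p)

def position_pattern_point(p, offset):
    """Move p along the field gradient so it sits at the fixed `offset`
    from the avatar surface, keeping its orientation with respect to the
    distance field."""
    d = distance_field(p)
    g = field_gradient(p)
    return tuple(c - (d - offset) * gc for c, gc in zip(p, g))
```

For example, a pattern point at (3, 0, 0) positioned with offset 0.5 moves inward along the gradient to (1.5, 0, 0), exactly 0.5 from the unit-sphere surface.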
Replacing imagery of garments in an existing apparel collection with laser-finished garments
A system allows a user to create new designs for apparel and preview these designs before manufacture. Software and lasers are used in finishing apparel to produce a desired wear pattern or other design. The system swaps garments in a digital asset for garments that are designed using the system. The wear pattern is created by a laser using a laser input file. Generating the preview image comprises combining first and second contributions to obtain a combined value for a pixel at each pixel location of the preview image.
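The per-pixel combination step might be sketched as follows. The abstract does not specify how the two contributions are combined, so the linear blend, the fixed weight, and the 0-255 clamp here are all illustrative assumptions:

```python
def combine_pixel(base_value, laser_value, weight):
    """Combine a first (base fabric) contribution with a second (laser
    pattern) contribution for one pixel: a linear blend clamped to the
    0-255 pixel range."""
    v = (1 - weight) * base_value + weight * laser_value
    return max(0, min(255, round(v)))

def preview_image(base, laser, weight=0.5):
    """Per-pixel combination over two equally sized images (flat lists)."""
    return [combine_pixel(b, l, weight) for b, l in zip(base, laser)]
```

An equal-weight blend of base pixel 100 and laser pixel 200 yields a combined value of 150 at that pixel location.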