THREE-DIMENSIONAL (3D) IMAGE MODELING SYSTEMS AND METHODS FOR AUTOMATICALLY GENERATING PHOTOREALISTIC, VIRTUAL 3D PACKAGING AND PRODUCT MODELS FROM 2D IMAGING ASSETS AND DIMENSIONAL DATA
20230038240 · 2023-02-09
Inventors
- Rachel Wiley (Cincinnati, OH, US)
- David A. Lombardi, Jr. (Cincinnati, OH, US)
- Diana Jobson Cheshire (Wyoming, OH, US)
CPC classification
G06T19/20
PHYSICS
G06T17/20
PHYSICS
International classification
G06T17/20
PHYSICS
Abstract
Three-dimensional (3D) modeling systems and methods are described for automatically generating photorealistic, virtual 3D package and product models from two-dimensional (2D) imaging assets and dimensional data. The 3D modeling systems and methods include storing, by a memory with one or more processors, 2D imaging assets and dimensional datasets, obtaining, with an imaging asset manipulation script, a shape classification defining a real-world product or product package to be virtually modeled in 3D space, generating, with the imaging asset manipulation script, a spline based on an alpha channel extracted from a 2D imaging asset depicting the real-world product or package, and generating, with the imaging asset manipulation script, a parametric model based on the spline, the dimensional dataset, and the shape classification. A virtual 3D model is generated based on the parametric model and rendered, via a graphical display or environment, as a photorealistic image representing the real-world product or product package.
Claims
1. A three-dimensional (3D) image modeling system configured to automatically generate photorealistic, virtual 3D package and product models from two-dimensional (2D) imaging assets and dimensional data, the 3D image modeling system comprising: one or more processors; an imaging asset manipulation script comprising computing instructions configured to execute on the one or more processors; and a memory configured to store 2D imaging assets and dimensional datasets accessible by the one or more processors and the computing instructions of the imaging asset manipulation script, wherein the computing instructions of the imaging asset manipulation script, when executed by the one or more processors, cause the one or more processors to: obtain a shape classification defining a real-world product or product package to be virtually modeled in 3D space, obtain a dimensional dataset defining product or package measurements of the real-world product or product package to be virtually modeled in 3D space, generate a spline, based on a 2D image asset, wherein the 2D image asset is selected from the 2D imaging assets, and wherein the 2D image asset depicts the real-world product or product package, the spline comprising a plurality of points positioned along a perimeter of a shape silhouette of the real-world product or product package depicted in the 2D image asset, generate a parametric model based on the spline, the dimensional dataset, and the shape classification, generate a virtual 3D model of the real-world product or product package based on the parametric model and one or more attributes corresponding to the real-world product or product package, and render, via a graphical display or environment, the virtual 3D model as a photorealistic image representing the real-world product or product package in a virtual 3D space.
2. The 3D image modeling system of claim 1, wherein the shape classification comprises at least one of: a bottle classification, a symmetrical pump classification, a tube classification, a symmetrical bottle classification, a tottle classification, an asymmetrical bottle classification, a box classification, a pouch classification, a bag classification, a handled bottle classification, or a blister pack classification.
3. The 3D image modeling system of claim 1, wherein the computing instructions of the imaging asset manipulation script, when executed by the one or more processors, cause the one or more processors to: adjust the spline to optimize the mapping of the spline to the alpha channel or reduce the data size of the spline by reducing or adjusting one or more of the plurality of points positioned along the perimeter of the alpha channel.
4. The 3D image modeling system of claim 1, wherein the computing instructions of the imaging asset manipulation script, when executed by the one or more processors, cause the one or more processors to: generate, based on the 2D image asset, a UV graphical texture defining at least one of: a first graphical representation or a second graphical representation of the real-world product or product package.
5. The 3D image modeling system of claim 4, wherein each of the first graphical representation and the second graphical representation are generated as a different graphical representation or a same graphical representation for rendering as a virtual first side UV texture and a virtual second side UV texture of the virtual 3D model of the real-world product or product package.
6. The 3D image modeling system of claim 1, wherein the spline is a first spline, comprising a plurality of points positioned along a perimeter of a first shape silhouette of a first portion of the real-world product or product package depicted in the 2D image asset, and wherein the computing instructions of the imaging asset manipulation script, when executed by the one or more processors, cause the one or more processors to: generate a second spline comprising a plurality of points positioned along a perimeter of a second shape silhouette of the second portion of the real-world product or product package depicted in the 2D image asset.
7. The 3D image modeling system of claim 6, wherein the computing instructions of the imaging asset manipulation script, when executed by the one or more processors, cause the one or more processors to: add a first graphical texture or a first color to the first portion of the real-world product or product package to be virtually modeled in 3D space; add a second graphical texture or a second color to the second portion of the real-world product or product package to be virtually modeled in 3D space, the second graphical texture or the second color being different from the first graphical texture or the first color.
8. The 3D image modeling system of claim 6, wherein the computing instructions of the imaging asset manipulation script, when executed by the one or more processors, cause the one or more processors to: add a first graphical texture or a first color to the first portion of the real-world product or product package to be virtually modeled in 3D space; add a second graphical texture or a second color to the second portion of the real-world product or product package to be virtually modeled in 3D space, the second graphical texture or the second color being consistent with the first graphical texture or the first color.
9. The 3D image modeling system of claim 1, wherein the computing instructions of the imaging asset manipulation script, when executed by the one or more processors, cause the one or more processors to: update the spline or the parametric model by applying one or more refinements.
10. The 3D image modeling system of claim 9, wherein one or more points of the spline or the parametric model are configured to be selected or dragged, and wherein the one or more refinements comprise receiving a selection or drag command to adjust or reduce the one or more points of the spline or the parametric model.
11. The 3D image modeling system of claim 9, wherein one or more image features are configured to be adjusted for or applied to the virtual 3D model, and wherein the one or more refinements comprise receiving an adjustment or application command to update the virtual 3D model.
12. The 3D image modeling system of claim 1, wherein the virtual 3D model is a high fidelity polygonal model representation of the real-world product or product package.
13. The 3D image modeling system of claim 1, wherein the computing instructions of the imaging asset manipulation script, when executed by the one or more processors, cause the one or more processors to: initiate creation of at least a portion of the real-world product or product package based on the virtual 3D model.
14. The 3D image modeling system of claim 13, wherein the computing instructions of the imaging asset manipulation script, when executed by the one or more processors, cause the one or more processors to: create, with a 3D printer, at least a portion of the real-world product or product package based on the virtual 3D model.
15. The 3D image modeling system of claim 12, wherein the high fidelity polygonal model is rendered in the virtual 3D space as part of a mixed reality environment.
16. The 3D image modeling system of claim 1 further comprising a server comprising at least one processor of the one or more processors, wherein at least a portion of 2D imaging assets and dimensional datasets are retrieved via a computing network.
17. The 3D image modeling system of claim 1, wherein the one or more processors are further configured to launch a graphical user interface (GUI), the GUI configured to load into memory, or render on the graphical display, any one or more of the 2D imaging assets, the dimensional datasets, the spline, the parametric model, or the virtual 3D model.
18. The 3D image modeling system of claim 17, wherein the GUI is configured to receive user selections to manipulate any of the 2D imaging assets, the dimensional datasets, the spline, the parametric model, or the virtual 3D model.
19. The 3D image modeling system of claim 18, wherein the one or more processors are further configured to generate or render a new virtual 3D model based on the user selections, the new virtual 3D model representing an updated product or product package corresponding to the user selections.
20. The 3D image modeling system of claim 19, wherein the one or more processors are further configured to store the virtual 3D model in a memory such that the virtual 3D model is accessible to the imaging asset manipulation script or the visualization editor.
21. The 3D image modeling system of claim 1, wherein the computing instructions of the imaging asset manipulation script, when executed by the one or more processors, cause the one or more processors to: extract an alpha channel from the 2D image asset, wherein the shape silhouette is a shape silhouette of the alpha channel extracted from the 2D image asset.
22. A three-dimensional (3D) image modeling method for automatically generating photorealistic, virtual 3D package and product models from two-dimensional (2D) imaging assets and dimensional data, the 3D image modeling method comprising: obtaining, by one or more processors, a shape classification defining a real-world product or product package to be virtually modeled in 3D space, obtaining, by the one or more processors, a dimensional dataset defining product or package measurements of the real-world product or product package to be virtually modeled in 3D space, generating, by the one or more processors, a spline, based on the 2D image asset, wherein the 2D image asset is selected from the 2D imaging assets, and wherein the 2D image asset depicts the real-world product or product package, the spline comprising a plurality of points positioned along a perimeter of a shape silhouette of the real-world product or product package depicted in the 2D image asset, generating, by the one or more processors, a parametric model based on the spline, the dimensional dataset, and the shape classification, generating, by the one or more processors, a virtual 3D model of the real-world product or product package based on the parametric model and one or more attributes corresponding to the real-world product or product package, and rendering, by the one or more processors, via a graphical display or environment, the virtual 3D model as a photorealistic image representing the real-world product or product package in a virtual 3D space.
23. A tangible, non-transitory computer-readable medium storing three-dimensional (3D) image modeling instructions for automatically generating photorealistic, virtual 3D package and product models from two-dimensional (2D) imaging assets and dimensional data, that, when executed by one or more processors, cause the one or more processors to: obtain a shape classification defining a real-world product or product package to be virtually modeled in 3D space, obtain a dimensional dataset defining product or package measurements of the real-world product or product package to be virtually modeled in 3D space, generate a spline, based on the 2D image asset, wherein the 2D image asset is selected from the 2D imaging assets, and wherein the 2D image asset depicts the real-world product or product package, the spline comprising a plurality of points positioned along a perimeter of a shape silhouette of the real-world product or product package depicted in the 2D image asset, generate a parametric model based on the spline, the dimensional dataset, and the shape classification, generate a virtual 3D model of the real-world product or product package based on the parametric model and one or more attributes corresponding to the real-world product or product package, and render, via a graphical display or environment, the virtual 3D model as a photorealistic image representing the real-world product or product package in a virtual 3D space.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0014] The Figures described below depict various aspects of the systems and methods disclosed herein. It should be understood that each Figure depicts an aspect of the disclosed systems and methods, and that each of the Figures is intended to accord with a possible aspect thereof. Further, wherever possible, the following description refers to the reference numerals included in the Figures, in which features depicted in multiple Figures are designated with consistent reference numerals.
[0015] There are shown in the drawings arrangements which are presently discussed, it being understood, however, that the present aspects are not limited to the precise arrangements and instrumentalities shown, wherein:
[0032] The Figures depict preferred aspects for purposes of illustration only. Alternative aspects of the systems and methods illustrated herein may be employed without departing from the principles of the invention described herein.
DETAILED DESCRIPTION
[0034] Server(s) 102 may include one or more processor(s) 104 as well as one or more computer memories 106. Memories 106 may include one or more forms of volatile and/or non-volatile, fixed and/or removable memory, such as read-only memory (ROM), erasable programmable read-only memory (EPROM), random access memory (RAM), electrically erasable programmable read-only memory (EEPROM), and/or other hard drives, flash memory, MicroSD cards, and others. Memories 106 may store an operating system (OS) (e.g., Microsoft Windows, Linux, Unix, etc.) capable of facilitating the functionalities, apps, methods, or other software as discussed herein. Memories 106 may also store machine readable instructions, including any of one or more application(s), one or more software component(s), and/or one or more application programming interfaces (APIs), which may be implemented to facilitate or perform the features, functions, or other disclosure described herein, such as any methods, processes, elements or limitations, as illustrated, depicted, or described for the various flowcharts, illustrations, diagrams, figures, and/or other disclosure herein. For example, at least some of the software, instructions, scripts, applications, software components, or APIs may include, or otherwise be part of, an imaging asset manipulation script, machine learning component, and/or other such software, where each is configured to facilitate their various functionalities as described herein. It should be appreciated that one or more other applications or scripts, such as those described herein, may be envisioned and executed by processor(s) 104. In addition, while
[0035] Processor(s) 104 may be connected to memories 106 via a computer bus responsible for transmitting electronic data, data packets, or otherwise electronic signals to and from processor(s) 104 and memories 106 in order to implement or perform the machine readable instructions, methods, processes, scripts, elements or limitations, as illustrated, depicted, or described for the various flowcharts, illustrations, diagrams, figures, and/or other disclosure herein.
[0036] Processor(s) 104 may interface with memory 106 via the computer bus to execute the operating system (OS). Processor(s) 104 may also interface with computer memory 106 via the computer bus to create, read, update, delete, or otherwise access or interact with the data stored in memory, including in memories 106 and/or the database 105 (e.g., a relational database, such as Oracle, DB2, MySQL, or a NoSQL based database, such as MongoDB). The data stored in memories 106 and/or the database 105 may include all or part of any of the scripts, data, or information described herein, including, for example, the imaging asset manipulation script and/or the 2D imaging assets as accessible by the imaging asset manipulation script.
[0037] As described herein, a “memory” may refer to either memory 106 and/or database 105. Such memory may be configured to store 2D imaging assets accessible by processor(s) 104, scripts, applications, or other software, e.g., including an imaging asset manipulation script described herein.
[0038] In some aspects, database 105 may be a product lifecycle management (PLM) database or system. Generally, a PLM database or system is implemented as an information management system that can integrate data, processes, and other business systems within an enterprise or platform, such as the platform depicted for 3D modeling system 100. A PLM database or system generally includes software for managing information (e.g., 2D imaging assets) throughout an entire lifecycle of a product/package in an efficient and cost-effective manner. The lifecycle may include lifecycle stages from ideation, design, and manufacture, through service and disposal. In some aspects, database 105 may store digital PLM objects (e.g., digital 2D and 3D imaging assets as described herein). Such digital objects or assets can represent real-world physical parts, assemblies, or documents, customer requirements or supplier parts, a change process, and/or other data types relating to the lifecycle management and development of a product and/or package. For example, digital objects or assets can include computer-aided design (CAD) file(s) that depict or describe (e.g., via measurements, sizes, etc.) parts, components, or complete (or partially complete) models or designs of products and/or packages. Generally, non-CAD files can also be included in database 105. Such non-CAD files can include text or data files describing or defining parts, components, and/or product or package specifications, vendor datasheets, or emails relating to a design. For example, a PLM database or system can index and access text contents of a file, which can include metadata or other information regarding a product or package for design purposes.
[0039] In addition, PLM objects or assets, and/or corresponding data records, such as those that may be stored in database 105, can contain properties regarding an object's or an asset's parameters or aspects of its design lifecycle. For example, PLM databases or systems can generally store different classes of objects or assets (primarily parts (e.g., as CAD files), documents, and change forms) with distinct properties and behaviors. Such properties can include metrics or metadata such as part/document number, item category, revision, title, unit of measure, bill of materials, cost, mass, regulatory compliance details, file attachments, and other such information regarding product(s) and/or package(s) of a company. In addition, such PLM objects or assets may be linked, e.g., within database 105 (e.g., as a relational database), to other objects or assets within database 105 for the association, generation, or construction of a product structure. In this way, a PLM database can be flexibly used to identify objects and assets and to create and define relationships among such objects and assets. Such flexibility provides a basis for the creation, customization, revision, and/or reuse of virtual models (e.g., virtual 3D models) as described herein, and also of the 2D imaging assets on which they are based.
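By way of a non-limiting illustration only, the linking of PLM objects into a product structure described above might be sketched as follows; the class, field, and part-number names here are hypothetical and are not part of any actual PLM system:

```python
from dataclasses import dataclass, field

@dataclass
class PLMObject:
    """One record in a PLM database (illustrative schema only)."""
    part_number: str
    item_category: str          # e.g., "part", "document", "change form"
    revision: str = "A"
    title: str = ""
    unit_of_measure: str = "each"
    bill_of_materials: list = field(default_factory=list)  # linked child objects
    attachments: list = field(default_factory=list)        # e.g., CAD file paths

    def link(self, child: "PLMObject") -> None:
        """Associate a child object to build up a product structure."""
        self.bill_of_materials.append(child)

    def structure(self, depth: int = 0) -> list:
        """Flatten the linked product structure for inspection."""
        rows = [(depth, self.part_number)]
        for child in self.bill_of_materials:
            rows.extend(child.structure(depth + 1))
        return rows

# Build a tiny product structure: a bottle assembly linked to a cap and a body.
bottle = PLMObject("PKG-001", "part", title="Bottle assembly")
bottle.link(PLMObject("PKG-001-CAP", "part", title="Cap"))
bottle.link(PLMObject("PKG-001-BODY", "part", title="Body"))
```

Traversing `bottle.structure()` then yields the assembly and its two linked children, mirroring how linked PLM records form a product structure.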
[0040] For example, in some aspects, processor(s) 104 may store virtual 3D model(s) in memory 106 and/or database 105 such that virtual 3D model(s) are accessible to an imaging asset manipulation script or a visualization editor. In this way, an imaging asset manipulation script or the visualization editor, in a new or next iteration of a product lifecycle or introduction of new product lifecycle, may generate one or more new or additional virtual 3D models corresponding to one or more new or additional real-world products or product packages, or one or more new or additional virtual 3D models corresponding to updated versions of existing real-world products or product packages.
[0041] In various aspects described herein, database 105, implemented as a PLM database or system, can support CAD files for components or parts of existing or future (i.e., to be designed) products and/or packages. Such a PLM database or system can be implemented, for example, via third party software such as ALTIUM DESIGNER, ORCAD component information system (CIS), or the like.
[0042] While a PLM based database and system are described in various aspects herein, it is to be understood that other database or memory management systems (e.g., standard relational databases, NoSQL databases, etc.) may likewise be used in accordance with the disclosure of the 3D modeling systems and methods herein. As a non-limiting example, a PLM based database and/or system may comprise a “data lake” or the like, where a data lake or similar such database can comprise a system or repository of data stored in its natural/raw format, for example, as object blobs, raw bytes, and/or data files.
[0043] Further, with respect to
[0044] Server(s) 102, via processor(s) 104, may further include, implement, or launch a visualization editor, or otherwise operator interface, to render models or photorealistic images, present information to a user, and/or receive inputs or selections from the user. As shown in
[0045] Server(s) 102 may also provide I/O components (e.g., ports, capacitive or resistive touch sensitive input panels, keys, buttons, lights, LEDs), which may be directly accessible via or attached to server(s) 102 or may be indirectly accessible via or attached to terminal 109. According to some aspects, a user may access the server 102 via terminal 109 to render models or photorealistic images (e.g., via a visualization editor), review information, make changes, input data, and/or perform other functions.
[0046] As described above herein, in some aspects, server(s) 102 may perform the functionalities as discussed herein as part of a “cloud” network or may otherwise communicate with other hardware or software components within the cloud to send, retrieve, or otherwise analyze data or information (e.g., virtual 3D model(s)) as described herein.
[0047] In various aspects herein, a computer program, script, code, or application (e.g., an imaging asset manipulation script) may comprise computer-readable program code or computer instructions, in accordance with aspects herein, and may be stored on a computer usable storage medium, or tangible, non-transitory computer-readable medium (e.g., standard random access memory (RAM), an optical disc, a universal serial bus (USB) drive, or the like). Such computer-readable program code or computer instructions may be installed on or otherwise adapted to be executed by processor(s) 104 (e.g., working in connection with the respective operating system in memories 106) to facilitate, implement, or perform the machine readable instructions, methods, processes, elements or limitations, as illustrated, depicted, or described for the various flowcharts, illustrations, diagrams, figures, and/or other disclosure herein. In this regard, the program code or scripts may be implemented in any desired program language, and may be implemented as machine code, assembly code, byte code, and/or interpretable source code or the like (e.g., via Golang, Python, C, C++, C#, Objective-C, Java, Scala, ActionScript, JavaScript, HTML, CSS, XML, etc.). For example, as described herein, server(s) 102, implementing processor(s) 104, may execute one or more imaging asset manipulation scripts to assemble or otherwise manipulate or generate parametric-based CAD models or other models described herein.
[0048] In the example aspect of
[0049] For example, as shown for
[0050] In some aspects, modeling server(s) 102 may download or retrieve 2D imaging assets over computer network 120. For example, 2D imaging assets may be downloaded, by modeling server(s) 102, from remote server(s) 140 which may store 2D imaging assets. Remote server(s) 140 may be those of a third-party or of the company designing or developing product(s) and/or product package(s) as described herein. In some aspects, a portion or subset of 2D imaging assets and/or dimensional data required to design product(s) and/or product package(s) may be retrieved from the remote server(s) 140.
[0052] In the aspect of
[0053] As shown at
[0056] In some examples, the imaging asset manipulation script 108 may additionally isolate (310) segments of the spline 304 that are associated with particular elements within the shape of the real-world product or product package to be virtually modeled in 3D space. For instance,
[0058] Referring back to
[0059] In some examples, separate UV texture maps corresponding to various sides or views of the real-world product or product package (e.g., a front graphical representation, a rear graphical representation, various side graphical representations, a top graphical representation, a bottom graphical representation, etc.) may be respectively applied to the respective portions (e.g., a front portion, a rear portion, various side portions, a top portion, a bottom portion, etc.) of the parametric model 404. In other words, each of the graphical representations of each side or view may be generated as a different graphical representation for rendering as a virtual UV texture of the real-world product or product package. Additionally, in some examples, if a UV texture map is available for only one side or representation (e.g., a front graphical representation) of the real-world product or product package, that UV texture map may be duplicated and applied (408) to multiple portions of the parametric model 404. In other words, each of the front graphical representation and the rear graphical representation may be generated as the same graphical representation for rendering as a virtual front UV texture and a virtual back UV texture of the virtual 3D model 410 of the real-world product or product package, i.e., based on the availability of graphical representations of the front or rear of the real-world product or product package.
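As a non-limiting sketch of the duplication case described above, the following assumes (hypothetically) that a texture is a 2D array of texels and that the front occupies the left half of a combined UV atlas and the back the right half; the function name is illustrative only:

```python
def build_uv_atlas(front, back=None):
    """Lay out front and back textures side by side in a single UV atlas
    (front in u in [0, 0.5), back in u in [0.5, 1.0)).

    If no back texture is available, the front texture is duplicated for
    the back, as in the single-representation case described above."""
    if back is None:
        back = [row[:] for row in front]  # duplicate the front texture
    if len(back) != len(front):
        raise ValueError("front and back textures must share a height")
    # Concatenate each row: left half = front texels, right half = back texels.
    return [f_row + b_row for f_row, b_row in zip(front, back)]

front = [[1, 2],
         [3, 4]]                  # toy 2x2 "front" texture
atlas = build_uv_atlas(front)     # back half duplicates the front half
```

With only the toy front texture supplied, the resulting atlas is twice as wide, and its right half is a texel-for-texel copy of its left half.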
[0060] For instance,
[0062] For instance,
[0063] Referring back to
[0064] For example,
[0066] The flow diagram 500 may further include, in some cases, adjusting or refining the virtual 3D model 410 by “autopainting” (508), i.e., applying textures or refinements differently to different portions of the textured live package geometry 502 associated with the virtual 3D model 410, e.g., by the imaging asset manipulation script 108 and/or a visualization editor 504. As one example, a first graphical texture or a first color may be added or applied (508) to a first portion of the real-world product or product package to be virtually modeled in 3D space, while a second, different, graphical texture or a second color is added or applied (508) to a second portion of the real-world product or product package to be virtually modeled in 3D space. For instance, certain textures or other refinements may be applied (508) to a portion of the textured live package geometry 502 corresponding to the “cap” portion 312 of the spline 304, e.g., as shown at
[0067] Additionally, the flow diagram 500 may further include automatically reloading, e.g., by the imaging asset manipulation script 108, any changes to be made to the editable textured live package geometry 502 of the virtual 3D model 410, in order to produce a textured virtual reality package model with updated image textures 514. The imaging asset manipulation script 108 may render the textured virtual reality package model with updated image textures 514 in virtual 3D space, e.g., as part of a mixed reality environment, such as an augmented reality (AR) environment.
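The autopainting step described in paragraph [0066] could, purely for illustration, be sketched as mapping each labeled face of the geometry to a material color; the face labels, palette, and default color below are hypothetical stand-ins, not the actual data model:

```python
def autopaint(faces, palette, default=(200, 200, 200)):
    """Color each face of the package geometry according to the portion
    ("cap", "body", ...) its spline segment was isolated into.

    Portions with no palette entry fall back to a default material color."""
    return {face_id: palette.get(portion, default)
            for face_id, portion in faces.items()}

# Hypothetical face-to-portion labeling of the textured geometry.
faces = {0: "cap", 1: "cap", 2: "body", 3: "label"}
# A first color for the cap portion and a second, different color for the body.
palette = {"cap": (20, 60, 200), "body": (255, 255, 255)}
painted = autopaint(faces, palette)
```

Applying the same palette entry to both portions instead would give the "consistent texture or color" variant of claim 8.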
[0068] In some examples, the imaging asset manipulation script 108 may further initiate the creation of at least a portion of the real-world product or product package based on the virtual 3D model. For instance, the imaging asset manipulation script 108 may cause or initiate the creation of at least a portion of the real-world product or product package based on the virtual 3D model using a 3D printer.
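Initiating creation via a 3D printer typically involves exporting the virtual 3D model's mesh to a printable interchange format. As one hedged sketch (the function name is illustrative; real pipelines would more likely use a binary STL or a mesh library), triangles of the mesh could be serialized to ASCII STL:

```python
import os
import tempfile

def write_ascii_stl(triangles, path, name="package"):
    """Serialize (v1, v2, v3) vertex triples to an ASCII STL file.

    Facet normals are written as 0 0 0; most slicers recompute them
    from the vertex winding order."""
    with open(path, "w") as f:
        f.write(f"solid {name}\n")
        for tri in triangles:
            f.write("  facet normal 0 0 0\n    outer loop\n")
            for x, y, z in tri:
                f.write(f"      vertex {x} {y} {z}\n")
            f.write("    endloop\n  endfacet\n")
        f.write(f"endsolid {name}\n")

# Write a single triangle of a toy mesh to a temporary STL file.
triangle = [((0, 0, 0), (1, 0, 0), (0, 1, 0))]
path = os.path.join(tempfile.mkdtemp(), "package.stl")
write_ascii_stl(triangle, path)
```

The resulting file can be handed to slicing software to drive the 3D printer described above.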
[0070] At block 602, 3D modeling method 600 includes storing, by a memory (e.g., memories 106 and/or database 105) with one or more processors (e.g., processor(s) 104), one or more 2D imaging assets (e.g., 2D imaging asset(s) 204) and dimensional datasets (e.g., dimensional dataset(s) 206) accessible by the one or more processors and the computing instructions of an imaging asset manipulation script (e.g., imaging asset manipulation script 108).
[0071] At block 604, 3D modeling method 600 further includes obtaining, with the imaging asset manipulation script (e.g., imaging asset manipulation script 108) implemented on the one or more processors (e.g., processor(s) 104), a shape classification (e.g., shape classification 208) defining a real-world product or product package to be virtually modeled in 3D space.
[0072] At block 608, 3D modeling method 600 further includes obtaining, with the imaging asset manipulation script (e.g., imaging asset manipulation script 108) implemented on the one or more processors (e.g., processor(s) 104), a dimensional dataset (e.g., of the dimensional dataset(s) 206) defining product or package measurements of the real-world product or product package to be virtually modeled in 3D space.
[0073] At block 610, 3D modeling method 600 further includes generating, with the imaging asset manipulation script (e.g., imaging asset manipulation script 108) implemented on the one or more processors (e.g., processor(s) 104), a spline (e.g., spline 304), based on a 2D image asset, wherein the 2D image asset is selected from the 2D imaging assets (e.g., 2D imaging asset(s) 204), and wherein the 2D image asset depicts the real-world product or product package, the spline comprising a plurality of points (e.g., points 306) positioned along a perimeter (e.g., perimeter 308) of a shape silhouette of the real-world product or product package depicted in the 2D image asset.
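For illustration only, the spline-generation step of block 610, together with the point reduction recited in claim 3, might be sketched as follows, assuming the extracted alpha channel is available as a 2D array of opacity values; the function names and the edge-tracing strategy (row-wise left/right edges of the silhouette) are simplifying assumptions, not the claimed method:

```python
def silhouette_points(alpha, threshold=128):
    """Trace the left and right edges of the opaque region of an alpha
    channel, returning perimeter points in order (down the left edge,
    then back up the right edge)."""
    left, right = [], []
    for y, row in enumerate(alpha):
        xs = [x for x, a in enumerate(row) if a >= threshold]
        if xs:
            left.append((xs[0], y))
            right.append((xs[-1], y))
    return left + right[::-1]

def simplify(points, tol=1.0):
    """Ramer-Douglas-Peucker reduction of the spline's control points,
    dropping points within `tol` of the chord between their neighbors."""
    if len(points) < 3:
        return points
    (x1, y1), (x2, y2) = points[0], points[-1]
    dx, dy = x2 - x1, y2 - y1
    norm = (dx * dx + dy * dy) ** 0.5 or 1.0
    # Find the interior point farthest (perpendicularly) from the chord.
    dmax, idx = 0.0, 0
    for i in range(1, len(points) - 1):
        px, py = points[i]
        d = abs(dy * (px - x1) - dx * (py - y1)) / norm
        if d > dmax:
            dmax, idx = d, i
    if dmax <= tol:
        return [points[0], points[-1]]
    return simplify(points[:idx + 1], tol)[:-1] + simplify(points[idx:], tol)

# Example: a 6x6 alpha channel whose opaque region is a 4x4 square.
alpha = [[255 if 1 <= x <= 4 and 1 <= y <= 4 else 0 for x in range(6)]
         for y in range(6)]
points = silhouette_points(alpha)    # 8 perimeter points
spline = simplify(points, tol=0.5)   # reduced to the 4 corner points
```

On the toy square silhouette, the traced perimeter collapses to its four corners, the kind of point reduction contemplated when optimizing the spline's mapping to the alpha channel.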
[0074] At block 612, 3D modeling method 600 further includes generating, with the imaging asset manipulation script (e.g., imaging asset manipulation script 108) implemented on the one or more processors (e.g., processor(s) 104), a parametric model (e.g., parametric model 404) based on the spline (e.g., spline 304), the dimensional dataset (e.g., dimensional dataset 206), and the shape classification (e.g., shape classification 208).
[0075] At block 614, 3D modeling method 600 further includes generating, with the imaging asset manipulation script (e.g., imaging asset manipulation script 108) implemented on the one or more processors (e.g., processor(s) 104), a virtual 3D model (e.g., virtual 3D model 410) of the real-world product or product package based on the parametric model (e.g., parametric model 404) and one or more attributes corresponding to the real-world product or product package.
[0076] At block 616, 3D modeling method 600 further includes rendering, via a graphical display or environment (e.g., via terminal 109), the virtual 3D model as a photorealistic image representing the real-world product or product package in a virtual 3D space.
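The pipeline of blocks 602 through 616 — extracting an alpha channel from a 2D imaging asset, tracing spline points along the silhouette perimeter, and scaling them with the dimensional dataset — can be illustrated with a minimal NumPy sketch. The helper names and the synthetic RGBA asset below are hypothetical; the disclosure does not prescribe any particular implementation:

```python
import numpy as np

def extract_alpha_silhouette(rgba):
    """Extract the alpha channel from an RGBA imaging asset and
    threshold it into a binary shape silhouette."""
    alpha = rgba[..., 3]
    return alpha > 0

def spline_points_from_silhouette(mask, n_samples=16):
    """Sample points along the right-hand edge of the silhouette, one per
    sampled row -- a half-profile suitable for a symmetrical shape
    classification (e.g., a symmetrical bottle)."""
    rows = np.linspace(0, mask.shape[0] - 1, n_samples).astype(int)
    pts = []
    for r in rows:
        cols = np.nonzero(mask[r])[0]
        if cols.size:
            pts.append((r, cols[-1]))   # rightmost opaque pixel in this row
    return np.array(pts)

def scale_to_dimensions(points, mask_height_px, real_height_mm):
    """Scale pixel coordinates to real-world units using the
    dimensional dataset (here, a single overall height measurement)."""
    return points * (real_height_mm / mask_height_px)

# Synthetic 2D imaging asset: a 100x60 RGBA image with an opaque band
# standing in for the depicted product or product package.
img = np.zeros((100, 60, 4), dtype=np.uint8)
img[10:90, 20:40, 3] = 255          # opaque region = product silhouette

mask = extract_alpha_silhouette(img)
pts = spline_points_from_silhouette(mask)
pts_mm = scale_to_dimensions(pts, img.shape[0], real_height_mm=200.0)
```

The per-row edge sampling is one simple way to place "a plurality of points positioned along a perimeter of a shape silhouette"; a production script would typically use a full contour trace and point-reduction pass instead.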
[0077] Aspects of the Disclosure
[0078] 1. A three-dimensional (3D) image modeling system configured to automatically generate photorealistic, virtual 3D package and product models from two-dimensional (2D) imaging assets and dimensional data, the 3D image modeling system comprising: one or more processors; an imaging asset manipulation script comprising computing instructions configured to execute on the one or more processors; and a memory configured to store 2D imaging assets and dimensional datasets accessible by the one or more processors and the computing instructions of the imaging asset manipulation script, wherein the computing instructions of the imaging asset manipulation script, when executed by the one or more processors, cause the one or more processors to: obtain a shape classification defining a real-world product or product package to be virtually modeled in 3D space, obtain a dimensional dataset defining product or package measurements of the real-world product or product package to be virtually modeled in 3D space, generate a spline, based on a 2D image asset, wherein the 2D image asset is selected from the 2D imaging assets, and wherein the 2D image asset depicts the real-world product or product package, the spline comprising a plurality of points positioned along a perimeter of a shape silhouette of the real-world product or product package depicted in the 2D image asset, generate a parametric model based on the spline, the dimensional dataset, and the shape classification, generate a virtual 3D model of the real-world product or product package based on the parametric model and one or more attributes corresponding to the real-world product or product package, and render, via a graphical display or environment, the virtual 3D model as a photorealistic image representing the real-world product or product package in a virtual 3D space.
[0079] 2. The 3D image modeling system of aspect 1, wherein the shape classification comprises at least one of: a bottle classification, a symmetrical pump classification, a tube classification, a symmetrical bottle classification, a tottle classification, an asymmetrical bottle classification, a box classification, a pouch classification, a bag classification, a handled bottle classification, or a blister pack classification.
[0080] 3. The 3D image modeling system of any one of aspects 1-2, wherein the computing instructions of the imaging asset manipulation script, when executed by the one or more processors, cause the one or more processors to: adjust the spline to optimize the mapping of the spline to the alpha channel or reduce the data size of the spline by reducing or adjusting one or more of the plurality of points positioned along the perimeter of the alpha channel.
[0081] 4. The 3D image modeling system of any one of aspects 1-3, wherein the computing instructions of the imaging asset manipulation script, when executed by the one or more processors, cause the one or more processors to: generate, based on the 2D image asset, a UV graphical texture defining at least one of: a first graphical representation or a second graphical representation of the real-world product or product package.
[0082] 5. The 3D image modeling system of aspect 4, wherein each of the first graphical representation and the second graphical representation are generated as a different graphical representation or a same graphical representation for rendering as a virtual first side UV texture and a virtual second side UV texture of the virtual 3D model of the real-world product or product package.
[0083] 6. The 3D image modeling system of any one of aspects 1-5, wherein the spline is a first spline, comprising a plurality of points positioned along a perimeter of a first shape silhouette of a first portion of the real-world product or product package depicted in the 2D image asset, and wherein the computing instructions of the imaging asset manipulation script, when executed by the one or more processors, cause the one or more processors to: generate a second spline comprising a plurality of points positioned along a perimeter of a second shape silhouette of a second portion of the real-world product or product package depicted in the 2D image asset.
[0084] 7. The 3D image modeling system of aspect 6, wherein the computing instructions of the imaging asset manipulation script, when executed by the one or more processors, cause the one or more processors to: add a first graphical texture or a first color to the first portion of the real-world product or product package to be virtually modeled in 3D space; add a second graphical texture or a second color to the second portion of the real-world product or product package to be virtually modeled in 3D space, the second graphical texture or the second color being different from the first graphical texture or the first color.
[0085] 8. The 3D image modeling system of aspect 6, wherein the computing instructions of the imaging asset manipulation script, when executed by the one or more processors, cause the one or more processors to: add a first graphical texture or a first color to the first portion of the real-world product or product package to be virtually modeled in 3D space; add a second graphical texture or a second color to the second portion of the real-world product or product package to be virtually modeled in 3D space, the second graphical texture or the second color being consistent with the first graphical texture or the first color.
[0086] 9. The 3D image modeling system of any one of aspects 1-8, wherein the computing instructions of the imaging asset manipulation script, when executed by the one or more processors, cause the one or more processors to: update the spline or the parametric model by applying one or more refinements.
[0087] 10. The 3D image modeling system of aspect 9, wherein one or more points of the spline or the parametric model are configured to be selected or dragged, and wherein the one or more refinements comprise receiving a selection or drag command to adjust or reduce the one or more points of the spline or the parametric model.
[0088] 11. The 3D image modeling system of any one of aspects 8-9, wherein one or more image features are configured to be adjusted for or applied to the virtual 3D model, and wherein the one or more refinements comprise receiving an adjustment or application command to update the virtual 3D model.
[0089] 12. The 3D image modeling system of any one of aspects 1-11, wherein the virtual 3D model is a high fidelity polygonal model representation of the real-world product or product package.
[0090] 13. The 3D image modeling system of any one of aspects 1-12, wherein the computing instructions of the imaging asset manipulation script, when executed by the one or more processors, cause the one or more processors to: initiate creation of at least a portion of the real-world product or product package based on the virtual 3D model.
[0091] 14. The 3D image modeling system of aspect 13, wherein the computing instructions of the imaging asset manipulation script, when executed by the one or more processors, cause the one or more processors to: create, with a 3D printer, at least a portion of the real-world product or product package based on the virtual 3D model.
[0092] 15. The 3D image modeling system of aspect 12, wherein the high fidelity polygonal model is rendered in the virtual 3D space as part of a mixed reality environment.
[0093] 16. The 3D image modeling system of any one of aspects 1-15, further comprising a server comprising at least one processor of the one or more processors, wherein at least a portion of the 2D imaging assets and the dimensional datasets are retrieved via a computing network.
[0094] 17. The 3D image modeling system of any one of aspects 1-16, wherein the one or more processors are further configured to launch a graphical user interface (GUI), the GUI configured to load into memory, or render on the graphical display, any one or more of the 2D imaging assets, the dimensional datasets, the spline, the parametric model, or the virtual 3D model.
[0095] 18. The 3D image modeling system of aspect 17, wherein the GUI is configured to receive user selections to manipulate any of the 2D imaging assets, the dimensional datasets, the spline, the parametric model, or the virtual 3D model.
[0096] 19. The 3D image modeling system of aspect 18, wherein the one or more processors are further configured to generate or render a new virtual 3D model based on the user selections, the new virtual 3D model representing an updated product or product package corresponding to the user selections.
[0097] 20. The 3D image modeling system of aspect 19, wherein the one or more processors are further configured to store the virtual 3D model in a memory such that the virtual 3D model is accessible to the imaging asset manipulation script or a visualization editor.
[0098] 21. The 3D image modeling system of any of aspects 1-19, wherein the computing instructions of the imaging asset manipulation script, when executed by the one or more processors, cause the one or more processors to: extract an alpha channel from the 2D image asset, wherein the shape silhouette is a shape silhouette of the alpha channel extracted from the 2D image asset.
[0099] 22. A three-dimensional (3D) image modeling method for automatically generating photorealistic, virtual 3D package and product models from two-dimensional (2D) imaging assets and dimensional data, the 3D image modeling method comprising: obtaining, by one or more processors, a shape classification defining a real-world product or product package to be virtually modeled in 3D space, obtaining, by the one or more processors, a dimensional dataset defining product or package measurements of the real-world product or product package to be virtually modeled in 3D space, generating, by the one or more processors, a spline, based on a 2D image asset, wherein the 2D image asset is selected from the 2D imaging assets, and wherein the 2D image asset depicts the real-world product or product package, the spline comprising a plurality of points positioned along a perimeter of a shape silhouette of the real-world product or product package depicted in the 2D image asset, generating, by the one or more processors, a parametric model based on the spline, the dimensional dataset, and the shape classification, generating, by the one or more processors, a virtual 3D model of the real-world product or product package based on the parametric model and one or more attributes corresponding to the real-world product or product package, and rendering, by the one or more processors, via a graphical display or environment, the virtual 3D model as a photorealistic image representing the real-world product or product package in a virtual 3D space.
[0100] 23. A tangible, non-transitory computer-readable medium storing three-dimensional (3D) image modeling instructions for automatically generating photorealistic, virtual 3D package and product models from two-dimensional (2D) imaging assets and dimensional data, that, when executed by one or more processors, cause the one or more processors to: obtain a shape classification defining a real-world product or product package to be virtually modeled in 3D space, obtain a dimensional dataset defining product or package measurements of the real-world product or product package to be virtually modeled in 3D space, generate a spline, based on a 2D image asset, wherein the 2D image asset is selected from the 2D imaging assets, and wherein the 2D image asset depicts the real-world product or product package, the spline comprising a plurality of points positioned along a perimeter of a shape silhouette of the real-world product or product package depicted in the 2D image asset, generate a parametric model based on the spline, the dimensional dataset, and the shape classification, generate a virtual 3D model of the real-world product or product package based on the parametric model and one or more attributes corresponding to the real-world product or product package, and render, via a graphical display or environment, the virtual 3D model as a photorealistic image representing the real-world product or product package in a virtual 3D space.
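For a symmetrical shape classification such as a bottle, the step of generating a virtual 3D model from the parametric model can be sketched as a lathe (surface-of-revolution) operation: the half-profile derived from the spline is revolved about the vertical axis to produce the vertex grid of a polygonal mesh. The profile values and helper name below are hypothetical illustrations, not the disclosed implementation:

```python
import numpy as np

def lathe_profile(profile, n_segments=32):
    """Revolve a 2D half-profile -- (height, radius) pairs derived from
    the spline and scaled by the dimensional dataset -- about the
    vertical (y) axis, returning an (N, 3) array of mesh vertices."""
    angles = np.linspace(0.0, 2.0 * np.pi, n_segments, endpoint=False)
    verts = []
    for h, r in profile:
        for a in angles:
            verts.append((r * np.cos(a), h, r * np.sin(a)))
    return np.array(verts)

# Hypothetical half-profile for a simple bottle: a cylindrical body
# tapering to a narrower neck, in millimeters.
profile = [(0.0, 30.0), (80.0, 30.0), (100.0, 12.0), (140.0, 12.0)]
verts = lathe_profile(profile, n_segments=32)
# 4 profile points x 32 angular segments = 128 vertices
```

Connecting ring i of vertices to ring i+1 with quads (or triangle pairs) yields the high-fidelity polygonal representation; asymmetrical classifications (e.g., tottles or handled bottles) would instead require separate front and side profiles.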
[0101] Additional Considerations
[0102] Although the disclosure herein sets forth a detailed description of numerous different aspects, it should be understood that the legal scope of the description is defined by the words of the claims set forth at the end of this patent and their equivalents. The detailed description is to be construed as exemplary only and does not describe every possible aspect since describing every possible aspect would be impractical. Numerous alternative aspects may be implemented, using either current technology or technology developed after the filing date of this patent, which would still fall within the scope of the claims.
[0103] The following additional considerations apply to the foregoing discussion. Throughout this specification, plural instances may implement components, operations, or structures described as a single instance. Although individual operations of one or more methods are illustrated and described as separate operations, one or more of the individual operations may be performed concurrently, and nothing requires that the operations be performed in the order illustrated. Structures and functionality presented as separate components in example configurations may be implemented as a combined structure or component. Similarly, structures and functionality presented as a single component may be implemented as separate components. These and other variations, modifications, additions, and improvements fall within the scope of the subject matter herein.
[0104] Additionally, certain aspects are described herein as including logic or a number of routines, subroutines, applications, or instructions. These may constitute either software (e.g., code embodied on a machine-readable medium or in a transmission signal) or hardware. In hardware, the routines, etc., are tangible units capable of performing certain operations and may be configured or arranged in a certain manner. In example aspects, one or more computer systems (e.g., a standalone, client or server computer system) or one or more hardware modules of a computer system (e.g., a processor or a group of processors) may be configured by software (e.g., an application or application portion) as a hardware module that operates to perform certain operations as described herein.
[0105] In various aspects, a hardware module may be implemented mechanically or electronically. For example, a hardware module may comprise dedicated circuitry or logic that is permanently configured (e.g., as a special-purpose processor, such as a field programmable gate array (FPGA) or an application-specific integrated circuit (ASIC)) to perform certain operations. A hardware module may also comprise programmable logic or circuitry (e.g., as encompassed within a general-purpose processor or other programmable processor) that is temporarily configured by software to perform certain operations. It will be appreciated that the decision to implement a hardware module mechanically, in dedicated and permanently configured circuitry, or in temporarily configured circuitry (e.g., configured by software) may be driven by cost and time considerations.
[0106] Accordingly, the term “hardware module” should be understood to encompass a tangible entity, be that an entity that is physically constructed, permanently configured (e.g., hardwired), or temporarily configured (e.g., programmed) to operate in a certain manner or to perform certain operations described herein. Considering aspects in which hardware modules are temporarily configured (e.g., programmed), each of the hardware modules need not be configured or instantiated at any one instance in time. For example, where the hardware modules comprise a general-purpose processor configured using software, the general-purpose processor may be configured as respective different hardware modules at different times. Software may accordingly configure a processor, for example, to constitute a particular hardware module at one instance of time and to constitute a different hardware module at a different instance of time.
[0107] Hardware modules may provide information to, and receive information from, other hardware modules. Accordingly, the described hardware modules may be regarded as being communicatively coupled. Where multiples of such hardware modules exist contemporaneously, communications may be achieved through signal transmission (e.g., over appropriate circuits and buses) that connect the hardware modules. In aspects in which multiple hardware modules are configured or instantiated at different times, communications between such hardware modules may be achieved, for example, through the storage and retrieval of information in memory structures to which the multiple hardware modules have access. For example, one hardware module may perform an operation and store the output of that operation in a memory device to which it is communicatively coupled. A further hardware module may then, at a later time, access the memory device to retrieve and process the stored output. Hardware modules may also initiate communications with input or output devices, and may operate on a resource (e.g., a collection of information).
[0108] The various operations of example methods described herein may be performed, at least partially, by one or more processors that are temporarily configured (e.g., by software) or permanently configured to perform the relevant operations. Whether temporarily or permanently configured, such processors may constitute processor-implemented modules that operate to perform one or more operations or functions. The modules referred to herein may, in some example aspects, comprise processor-implemented modules.
[0109] Similarly, the methods or routines described herein may be at least partially processor-implemented. For example, at least some of the operations of a method may be performed by one or more processors or processor-implemented hardware modules. The performance of certain of the operations may be distributed among the one or more processors, not only residing within a single machine, but deployed across a number of machines. In some example aspects, the processor or processors may be located in a single location, while in other aspects processors may be distributed across a number of locations.
[0110] The performance of certain of the operations may be distributed among the one or more processors, not only residing within a single machine, but deployed across a number of machines. In some example aspects, the one or more processors or processor-implemented modules may be located in a single geographic location (e.g., within a home environment, an office environment, or a server farm). In other aspects, the one or more processors or processor-implemented modules may be distributed across a number of geographic locations.
[0111] This detailed description is to be construed as exemplary only and does not describe every possible aspect, as describing every possible aspect would be impractical, if not impossible. A person of ordinary skill in the art may implement numerous alternate aspects, using either current technology or technology developed after the filing date of this application.
[0112] Those of ordinary skill in the art will recognize that a wide variety of modifications, alterations, and combinations can be made with respect to the above described aspects without departing from the scope of the invention, and that such modifications, alterations, and combinations are to be viewed as being within the ambit of the inventive concept.
[0113] The patent claims at the end of this patent application are not intended to be construed under 35 U.S.C. § 112(f) unless traditional means-plus-function language is expressly recited, such as “means for” or “step for” language being explicitly recited in the claim(s). The systems and methods described herein are directed to an improvement to computer functionality, and improve the functioning of conventional computers.