METHOD AND DEVICE FOR PROVIDING AT LEAST ONE CUTTING PATTERN FOR AN ARTICLE OF CLOTHING TO BE PRODUCED INDIVIDUALLY FOR A CUSTOMER
20230248099 · 2023-08-10
Inventors
CPC classification
G06T17/20
PHYSICS
International classification
A41H3/00
HUMAN NECESSITIES
Abstract
The present invention relates to a method, a computer program product, and a device for providing at least one cutting pattern of a garment to be made individually for a customer. The method comprises the steps of: creating a virtual individual 3D body model (20) of the customer based on individually determined 3D body shell data (10) of the customer and provided general skeletal data (15); creating a virtual individual 3D garment (40) based on a virtual 3D ideal design (30) and the created virtual individual 3D body model (20); and creating the at least one cutting pattern by flattening/developing the virtual individual 3D garment (40), wherein flattening/developing the virtual individual 3D garment (40) takes place algorithmically on the basis of at least one set of cutting rules (52, 54).
Claims
1. A method for providing at least one cutting pattern of a garment to be made individually for a customer, comprising the steps of: creating a virtual individual 3D body model of the customer based on individually determined 3D body shell data of the customer and provided general skeletal data; creating a virtual individual 3D garment based on a virtual 3D ideal design and the created virtual individual 3D body model; and creating the at least one cutting pattern by flattening/developing the virtual individual 3D garment, wherein flattening/developing the virtual individual 3D garment comprises, in this order, generating one or more possible cuts on the basis of a first set of cutting rules, generating one or more possible cuts on the basis of a second set of cutting rules, and generating one or more possible cuts on the basis of an automated flattening/development algorithm, the first set of cutting rules taking into account specified style elements of the garment and the second set of cutting rules taking into account specified effect elements of the garment.
2. The method according to claim 1, wherein creating a virtual individual 3D body model further takes place on the basis of provided general 3D body shell data.
3. (canceled)
4. (canceled)
5. The method according to claim 1, wherein generating one or more possible cuts takes place on the basis of an automated flattening/development algorithm using a merit/target function to be minimized.
6. The method according to claim 1, wherein flattening/developing the virtual individual 3D garment comprises minimizing a merit/target function, and wherein the merit/target function comprises a distortion energy term and/or a length regularization energy term and/or a strain energy term.
7. The method according to claim 6, wherein the merit/target function is defined by the equation E(p) = αE.sub.D(p) + βE.sub.L(p) + γE.sub.S(p), where E.sub.D(p) represents the distortion energy term, E.sub.L(p) the length regularization energy term, E.sub.S(p) the strain energy term, and α, β, γ > 0 represent weight factors associated with the respective energy terms.
8. The method according to claim 1, wherein creating a virtual individual 3D garment takes place on the basis of user specifications and in particular a selection from a predetermined clothing catalog, a predetermined style element catalog, a predetermined effect element catalog and/or a predetermined material catalog.
9. The method according to claim 1, wherein creating a virtual individual 3D garment comprises: enlarging a volume of the virtual individual 3D body model to form an enlarged virtual individual 3D body model depending on the type of clothing and/or depending on a selected material and/or depending on a simulated movement of the virtual individual 3D body model.
10. The method according to claim 9, wherein creating a virtual individual 3D garment further comprises: projecting the virtual 3D ideal design onto a shell of the enlarged virtual individual 3D body model.
11. The method according to claim 1, wherein at least one final virtual individual 3D garment is created and displayed on the basis of the at least one created cutting pattern.
12. The method according to claim 1, further comprising: determining at least one linear measurement and/or at least one body circumference of the customer on the basis of the determined individual 3D body shell data.
13. The method of claim 12, further comprising: comparing the at least one linear measurement and/or body circumference of the customer on the basis of the determined individual 3D body shell data with at least one standard cutting pattern and/or with at least one predetermined measurement chart.
14. The method according to claim 1, further comprising: storing the created virtual individual 3D body model of the customer in an avatar comparison database, the avatar comparison database providing a large number of virtual individual 3D body models from different customers in order to compare or match these provided virtual individual 3D body models with one another.
15. The method of claim 14, further comprising: comparing the created virtual individual 3D body model of the customer to other virtual individual 3D body models provided by the avatar comparison database.
16. The method according to claim 14, further comprising: assigning the created virtual individual 3D body model of the customer to at least one avatar group from a large number of predefined avatar groups.
17. A computer program product comprising computer-readable instructions which, when loaded into a memory of a computer and executed by the computer, cause the computer to perform a method according to claim 1.
18. A device for providing at least one cutting pattern of a garment to be made individually for a customer, comprising: a processor configured to: create a virtual individual 3D body model of the customer on the basis of individually determined 3D body shell data of the customer and provided general skeletal data, create a virtual individual 3D garment on the basis of a virtual 3D ideal design and the created virtual individual 3D body model, and create the at least one cutting pattern by flattening/developing the virtual individual 3D garment, wherein flattening/developing the virtual individual 3D garment comprises, in this order, generating one or more possible cuts on the basis of a first set of cutting rules, generating one or more possible cuts on the basis of a second set of cutting rules, and generating one or more possible cuts on the basis of an automated flattening/development algorithm, the first set of cutting rules taking into account specified style elements of the garment and the second set of cutting rules taking into account specified effect elements of the garment.
19. The device according to claim 18, further comprising: an avatar comparison database configured to store the created virtual individual 3D body model of the customer, the avatar comparison database comprising a large number of virtual individual 3D body models from different customers in order to compare the provided virtual individual 3D body models with one another.
20. The device according to claim 19, further comprising: an avatar matching device configured to compare the created virtual individual 3D body model of the customer with other virtual individual 3D body models provided by the avatar comparison database.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0092]
[0093]
[0094]
[0095]
[0096]
DETAILED DESCRIPTION OF THE DRAWINGS
[0097]
[0098] In addition, general 3D body shell data (or model data) 13 can be provided by a general ideal scan, in particular of a model, and is likewise linked to the general skeletal data 15 in order to create or provide a general perfect 3D body model. The quality of the virtual individual 3D body model can be increased by linking the provided general perfect 3D body model to the virtual individual 3D body model created from the 3D scan. By such linking or projection, defects or holes in the 3D mesh can be corrected. In other words, a high-quality virtual individual 3D body model 20 can be created on the basis of the determined individual 3D body shell data 10, the provided general skeletal data 15 and the provided general 3D body shell data 13.
[0099] It is pointed out that some of the lines or arrows shown in the flowchart in
[0100] The customer’s scan can be available e.g. as a 3D surface mesh model in medium scan quality (with occasional holes in the scan). The virtual skeleton, including nodes (joints) and connecting elements (bones), is placed into this scan; the placement of the skeleton can be done via a neural network. In parallel, there can be a perfect base mesh (i.e. a model topology, which in particular comprises the model data and the general skeletal data, or a combination, linking and/or convolution thereof), which includes a virtual skeleton with exactly the same nodes and connecting elements. After all nodes and/or connecting elements have been identified with each other, the scan skeleton (i.e. the virtual individual 3D body model) is placed on the perfect base mesh (i.e. the virtual individual 3D body model is mapped to or matched with the perfect base mesh), and the base mesh is adapted to the proportions of the 3D body scan via the scan skeleton or its nodes (through surface deformation in all mesh areas that are close to the affected skeleton nodes). This allows working with a closed mesh model, free of defects or holes, in the further course. The “mapping” or “matching” of the virtual individual 3D body model with the perfect base mesh comprises or is in particular a “mesh registration”. The 3D body model (avatar) can be edited with regard to size, physique, shape, posture and even dynamic (motion) properties by means of an editable set of input parameters.
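The adaptation of the base mesh to the scan proportions (surface deformation in mesh areas close to the matched skeleton nodes) can be sketched as follows. The distance-weighted blend and the `radius` cutoff are assumptions made for illustration; the description does not specify the deformation model.

```python
import numpy as np

def adapt_base_mesh(base_vertices, base_nodes, scan_nodes, radius=0.3):
    """Deform the closed base mesh toward the scan proportions.

    Each skeleton node of the base mesh is matched to the corresponding
    scan-skeleton node; vertices close to a node are translated by a
    distance-weighted blend of the node displacements (illustrative
    sketch only).
    """
    displacements = scan_nodes - base_nodes           # (n_nodes, 3)
    adapted = base_vertices.astype(float).copy()
    for i, v in enumerate(base_vertices):
        d = np.linalg.norm(base_nodes - v, axis=1)    # distance to each node
        w = np.maximum(0.0, 1.0 - d / radius)         # only nearby nodes pull
        if w.sum() > 0:
            adapted[i] += (w[:, None] * displacements).sum(axis=0) / w.sum()
    return adapted
```

Because the base mesh is closed, the deformed result remains a hole-free model regardless of defects in the raw scan.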
[0101] A virtual 3D ideal design 30 is created or provided in order to create a virtual individual 3D garment 40. This virtual 3D ideal design can be based on the specifications of the designer and/or the customer from a clothing catalog 32, a style element catalog 34, an effect element catalog 36, and/or a material catalog 38. From the clothing catalog 32, the type of clothing (e.g. jacket, trousers, sweater, etc.) can be selected. Style elements such as puffed sleeves or a zipper can be selected from the style element catalog 34. Certain effect elements such as broad shoulders, narrow waist, concealed breast, etc. can be selected from the effect element catalog 36. Also, the material of the garment to be made (e.g. jeans, poplin, jersey, etc.) can be selected from the material catalog 38.
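The selections from the four catalogs can be thought of as populating a simple design-specification record. The following sketch is purely illustrative; the class and field names are not taken from the patent:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class DesignSpec:
    """Hypothetical container for the user's selections from the four
    catalogs (reference numerals 32, 34, 36, 38)."""
    garment_type: str                                        # clothing catalog 32
    style_elements: List[str] = field(default_factory=list)  # style element catalog 34
    effect_elements: List[str] = field(default_factory=list) # effect element catalog 36
    material: str = "jersey"                                 # material catalog 38

# Example selection: a poplin jacket with a zipper, emphasizing a narrow waist.
spec = DesignSpec("jacket", ["zipper"], ["narrow waist"], "poplin")
```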
[0102] Style elements comprise in particular so-called “add-ons” that are placed on the basic garment, e.g. widening of the skirt plate, insertion of puffed sleeves on the T-shirt, waterfall collar on the blouse, colored folds on the trousers, shape of the trousers (“carrot shape”, “slim fit”, “high waist”, “⅞ length”, “flares”). There are also zippers, buttons, Velcro fasteners or elastic bands to choose from as additional style elements.
[0103] Effect elements are personal preferences of the designer or customer, which emphasize or cover up parts of the body. In a query selection, the customer can e.g. indicate whether they have narrow shoulders and would like to visually broaden them, have large breasts and would like to conceal them optically, have a narrow waist and would like to emphasize it.
[0104] Before the created avatar 20 appears in the frontend, a semantic mesh segmentation and a proportional increase in volume of the avatar 20 take place in the backend. This is not visible in the frontend. For each body segment, a function is placed parallel to the axis of rotation of the body part, which models the edge of the enlarged volume. In particular, distances to the surface, which vary depending on the body part, are provided by means of a signed distance function. The specifically selected distance depends in particular on the location, the type of textile and/or the style of the garment. Since the 3D body model (avatar) can be edited with regard to size, physique, shape, posture and even dynamic (motion) properties by means of an editable set of input parameters, the movement distances of the individual limbs produce different movement radii and therefore different distances (for freedom of movement, or as a support function for body parts that are to be emphasized). For example, more distance can be calculated on the upper arm than on the forearm, or a narrower circumference at the waist in order to emphasize it. Likewise, at an elbow the circumferential distance is greater than at the forearm, and in the armpit area it is greater than at the upper arm. This ensures a certain freedom of movement: different limbs require different freedoms of movement, have different angles of rotation and require different degrees of freedom, which results in different distances from the body shell to the garment.
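A minimal sketch of the segment-dependent enlargement, assuming per-vertex surface normals and a per-segment ease table; the distances below are invented for illustration (the description only states that the distance varies with body part, textile and garment style):

```python
import numpy as np

# Illustrative per-segment ease distances in metres (assumed values).
EASE = {"upper_arm": 0.04, "forearm": 0.02, "waist": 0.005}

def enlarge_avatar(vertices, normals, segments, ease=EASE):
    """Proportionally enlarge the backend avatar by moving each vertex
    outward along its surface normal by the ease distance assigned to
    its semantic body segment."""
    out = vertices.astype(float).copy()
    for i, seg in enumerate(segments):
        out[i] += ease[seg] * normals[i]
    return out
```

A signed distance function over the enlarged shell would then describe the garment surface at the chosen offset from the body; the direct normal offset above is a simplification of that idea.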
[0105] In the frontend, a virtual ideal draft of the basic garment is projected onto the virtual (frontend) avatar (without any visible increase in volume). Within the context of the invention, “frontend” is understood to mean in particular a mobile application (app) or a website (or a portal) that is available to the user (customer, end consumer, designer or tailor) as a graphical user interface. In the frontend, the customer can see information and act themselves. “Backend” means internal processes within the application or website which the customer does not see and with which they cannot interact. In particular, “backend” means a part of an IT system that deals with data processing in the background (i.e. the “data layer”). The term “backend” can therefore comprise e.g. calculation and optimization software, which is preferably stored on a server and which executes the necessary calculations or algorithms (in the background).
[0106] Style elements and/or effect elements are selected in the frontend. In the backend, the implementation of the added “add-ons” means a regional deformation of the surface of the backend avatar. For example, for a puffed-sleeve T-shirt, a sphere which faithfully replicates the volume of the puffed sleeve is modeled onto the shoulder of the backend avatar. A skirt, for example, is modeled as a sleeve wrapped around the legs and connected at the outside of the legs; in the backend, the rest of the body is cut off, in particular at the level of the skirt length (in this case the legs) and at the level of the skirt waistband, so that the skirt body can be developed. In the case of a fitted blouse with a flared hip, for example, a cone is modeled onto the side of the hip. In addition, a minimum and maximum parameterization can be stored algorithmically: if the cross-section of a sleeve falls below the radius of a hand, suggestions are made to use a zipper or a slit so that the garment can be put on. In the case of a neckline, for example, the head circumference is calculated from the scan to ensure that the garment can be put on; if this radius is not reached, suggestions are automatically made as to whether a zipper, a slit or something else should be used. A similar procedure is used for a skirt or dress, for which in particular the pelvis and shoulder circumferences are measured. In the backend, the effect elements, e.g. based on a query of body parts that are to be emphasized or concealed, are translated in particular into the number and main orientation of the cutting lines. For example, in the case of large breasts that are to be concealed, the pattern parts in the segment of the breast area can be arranged in small parts, i.e. more cutting lines than necessary can be set. In the case of narrow shoulders that should appear wider, cutting lines can be oriented perpendicular to the body axis (90 degrees). When a narrow waist is to be emphasized, the cutting lines in the abdominal segment can be oriented lengthwise to the body axis.
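The minimum-parameterization check described above (sleeve cross-section versus hand radius, neckline versus head circumference, skirt versus pelvis and shoulder circumferences) reduces to a simple comparison. The function below is an assumed sketch, not the patent's implementation:

```python
def suggest_closures(opening_circumference_cm, body_circumference_cm):
    """If the garment opening is smaller than the body part that has to
    pass through it (hand, head, pelvis or shoulders), suggest closure
    elements so that the garment can still be put on."""
    if opening_circumference_cm < body_circumference_cm:
        return ["zipper", "slit"]   # candidate style elements to offer
    return []                       # opening is large enough as designed
```

For example, a 20 cm cuff against a 25 cm hand circumference triggers the suggestion, while a 60 cm neckline against a 55 cm head circumference does not.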
[0107] As can be seen from
[0108] After the virtual individual 3D garment 40 has been generated, the virtual individual 3D garment 40 is flattened/developed from 3D to 2D with the aid of an algorithm 50, i.e. the virtual individual three-dimensional garment 40 is projected onto a two-dimensional plane.
[0109]
[0110]
[0111]
[0112]
[0113] The set of cutting-line rules leads to the creation of cutting patterns advantageously generated for the person: depending on the body shape, the cuts are placed on different parts of the body. This setting of cutting lines arises from the topology of the respective body scan and is created by the algorithmic segmentation (dissection) and development of doubly curved (three-dimensional) surfaces with low, optimized angular distortions. The approach of segmenting curved surfaces (into multiple patches by means of cutting lines) is in particular a component of the present method in order to create planar surfaces from doubly curved surfaces and to map structures that can be produced from flat materials, such as paper or textiles. It is mathematically, and therefore technologically, impossible to map doubly curved surfaces into two dimensions exactly and without distortion; this can only be done in an approximation or optimization process. In the method described here, the doubly curved surfaces can not only be evaluated based on their characteristics (curvature and length of the curves): the projection of the segments onto a planar (flat) plane produces distortions or warping, which the applied cutting curves are intended to resolve. Specifically, this means that cutting lines are set automatically in those places where, from a purely technological point of view, the 3D surfaces cannot be flattened/developed into 2D because this would lead to warping or overlapping of fabric webs. These cutting lines created from the topology also result in the best possible fit for the garment to be made and guarantee the least amount of creases in the garment after the fabric has been sewn together, because they represent the most curved points (high-low points) of the body. The segmentation can be done according to various criteria as an initialization for the algorithm. For example, one criterion can be that contiguous round areas are to be clustered so that the number of residual areas (waste) is as small as possible. Another conceivable criterion is to create horizontal or vertical surfaces that are as continuous as possible, in order to emphasize body proportions or make them appear narrower. An initialization of small “patches” (pattern parts) is also possible, in order to make a large bust size appear smaller in the chest area, for example.
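The development of a patch into the plane can be sketched as a small optimization. The sketch below minimizes only the strain-energy term E.sub.S(p) of the merit function E(p) = αE.sub.D(p) + βE.sub.L(p) + γE.sub.S(p) named in the claims; the distortion and length-regularization terms are omitted, and plain gradient descent stands in for whichever optimizer the method actually uses:

```python
import numpy as np

def flatten_patch(pts3d, edges, steps=2000, lr=0.05):
    """Flatten a 3D patch into 2D by gradient descent on a strain energy
    E_S(p) = sum over edges of (|p_i - p_j| - L_ij)^2, where L_ij is the
    edge length measured on the 3D surface (illustrative sketch)."""
    L = {e: np.linalg.norm(pts3d[e[0]] - pts3d[e[1]]) for e in edges}
    p = pts3d[:, :2].astype(float).copy()      # initial guess: drop z
    for _ in range(steps):
        grad = np.zeros_like(p)
        for (i, j) in edges:
            d = p[i] - p[j]
            n = np.linalg.norm(d) + 1e-12      # avoid division by zero
            g = 2.0 * (n - L[(i, j)]) * d / n  # gradient of one edge term
            grad[i] += g
            grad[j] -= g
        p -= lr * grad
    energy = sum((np.linalg.norm(p[i] - p[j]) - L[(i, j)]) ** 2
                 for (i, j) in edges)
    return p, energy
```

A residual energy that cannot be driven to zero indicates a non-developable (doubly curved) patch; in the method's terms, that is exactly where an additional cutting line would be set.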
[0114] In summary, the present invention comprises a new method for the true-to-size and individual production of cutting patterns with the help of an algorithm based on 3D body scans. In particular, starting from a 3D body scan, a topology with many different fixed points of the body is generated, onto which a design is projected. The resulting projection is scaled (developed) from a 3D representation to a 2D representation to create useful pieces of fabric. In particular, darts and cutting patterns are automatically created by the algorithm. With the help of the present invention, individual or personalized unique cutting patterns can be generated. In particular, the invention distinguishes itself from conventional methods or systems in that it simplifies the production of cutting patterns and at the same time makes the garments even more true to size or individually adapted and also more cost-effective.
[0115] In contrast to conventional methods, in which an existing 2D cutting pattern (standard cutting pattern) is always used to produce garments and is adapted, where applicable, on the basis of the customer’s individual scan data, the present invention can be used to directly create an individual cutting pattern based on 3D body shell data or based on scan data. An existing 2D standard cutting pattern is therefore no longer required. While previous methods are based on a two-dimensional “basic pattern”, which is based e.g. on symmetrical pattern parts that can be mirrored, the method according to the invention can be used in particular to take into account the topology of an individual 3D scan of the customer’s body in order to individually divide the body surfaces of the customer into segments that can be flattened/developed two-dimensionally. In this way, an individual cutting pattern can be created, which can also produce non-symmetrical patterns. The present invention thus makes it possible to take into account individual body characteristics of the customer (such as a hunchback, deformities, a shorter arm or leg, disabilities, etc.) for the production of garments.
List of reference numerals
10 individual 3D body shell data (scan data)
13 general 3D body shell data (model data)
15 general skeletal data (virtual skeleton)
16 nodes (anchor points)
17 connecting element
20 virtual individual 3D body model (customer’s avatar)
22 model topology (virtual general 3D body model)
24 scanned model (virtual individual 3D body model or avatar)
25 adapted or deformed model topology
26 refined topology (high-quality avatar of the customer)
30 virtual 3D ideal design of the garment to be made
32 clothing catalog
34 catalog of style elements
36 catalog of effect elements
38 catalog of materials
39 design basic rule
40 virtual individual 3D garment
50 flattening/development algorithm
52 first set of cutting rules
54 second set of cutting rules
56 algorithm for further development using a merit/target function
S1 display of several cutting pattern solutions in a suitable form
S1a viewing of a final virtual 3D garment using virtual reality and/or augmented reality
S2 manual selection of a cutting pattern solution
S3 output of the selected cutting pattern