METHODS FOR FABRICATION OF ARTICLES FROM THREE-DIMENSIONAL MODELS
20250163620 · 2025-05-22
Inventors
- William Samosir (Brooklyn, NY, US)
- Lawrence Panozzo (San Antonio, TX, US)
- Spencer Sherk (Brooklyn, NY, US)
- Garrett Li Gerson (Malibu, CA, US)
CPC Classification
International Classification
Abstract
Methods for fabrication of articles, in particular knitted articles, using computer-controlled machines. A three-dimensional (3D) model defined in a 3D space may be transformed into a two-dimensional (2D) knitting map that specifies respective locations of stitches for a knitted article. Groups of the stitches form courses and wales of the knitted article. The 2D knitting map contains apexes which terminate an end of respective pairs of the courses. A spatial distance between respective ones of the apexes within a portion of the 2D knitting map may be decreased or increased. An amount of the spatial distance decreased or increased between respective ones of the apexes of the portion of the 2D knitting map may be based on a user-provided input. The 2D knitting map is subsequently converted to knitting instructions for a computer-controlled knitting machine.
Claims
1. A computer-implemented method, comprising: transforming a three-dimensional (3D) model defined in a 3D space into a two-dimensional (2D) knitting map that specifies respective locations of stitches for a knitted article, groups of the stitches forming courses and wales of the knitted article, the 2D knitting map containing apexes which terminate an end of respective pairs of the courses, wherein all of the courses of the 2D knitting map extend along lines that are arranged parallel to one another; for a portion of the 2D knitting map, decreasing or increasing a spatial distance between respective ones of the apexes within the portion of the 2D knitting map, wherein an amount of the spatial distance decreased or increased between respective ones of the apexes of the portion of the 2D knitting map is based on a user-provided input; converting the 2D knitting map to knitting instructions for a computer-controlled flatbed knitting machine; and transmitting said knitting instructions to said computer-controlled flatbed knitting machine so as to produce the knitted article in accordance with the knitting instructions.
2. The computer-implemented method of claim 1, further comprising: prior to the converting of the 2D knitting map to the knitting instructions, presenting the 2D knitting map for review and edit by a user; and updating the 2D knitting map according to revisions made by the user.
3. The computer-implemented method of claim 1, further comprising: prior to the converting of the 2D knitting map to the knitting instructions, defining an updated 3D model based on the 2D knitting map; and updating the 2D knitting map using the updated 3D model.
4. The computer-implemented method of claim 1, wherein the 3D model is one of: selected from a library, produced from imaging of a physical article, produced by a user algorithmically, or produced by the user manually.
5. The computer-implemented method of claim 1, further comprising: receiving a texture map representing a design, logo, pattern, or other features to be produced as part of the knitted article knitted by the computer-controlled flatbed knitting machine, wherein the texture map specifies one or more of texture or color on the surface of the 3D model; applying the texture map to the 3D model; and immediately after the 2D knitting map has been produced, transferring information represented in the texture map from the 3D model to the 2D knitting map.
6. The computer-implemented method of claim 5, wherein the texture map specifies one or more of texture or color on the surface of the 3D model.
7. The computer-implemented method of claim 1, wherein the 3D model represents a physical body as a collection of points in the 3D space that are connected by geometric primitives.
8. The computer-implemented method of claim 1, wherein the 3D model is formed by scanning a real-world object.
9. The computer-implemented method of claim 8, wherein the 3D model formed by scanning the real-world object is augmented through user manipulation of the 3D model in order to remove artifacts of the scanning and/or to provide customizations in one or more of shape or appearance.
10. The computer-implemented method of claim 1, wherein the 2D knitting map comprises a checkerboard-like array in which each pixel represents a loop of the knitted article.
11. A non-transitory machine-readable medium for a computer system comprising a processor, the non-transitory machine-readable medium comprising instructions that, when executed by the processor, cause the processor to: transform a three-dimensional (3D) model defined in a 3D space into a two-dimensional (2D) knitting map that specifies respective locations of stitches for a knitted article, groups of the stitches forming courses and wales of the knitted article, the 2D knitting map containing apexes which terminate an end of respective pairs of the courses, wherein all of the courses of the 2D knitting map extend along lines that are arranged parallel to one another; for a portion of the 2D knitting map, decrease or increase a spatial distance between respective ones of the apexes within the portion of the 2D knitting map, wherein an amount of the spatial distance decreased or increased between respective ones of the apexes of the portion of the 2D knitting map is based on a user-provided input; convert the 2D knitting map to knitting instructions for a computer-controlled flatbed knitting machine; and transmit said knitting instructions to said computer-controlled flatbed knitting machine so as to produce the knitted article in accordance with the knitting instructions.
12. The non-transitory machine-readable medium of claim 11, wherein the non-transitory machine-readable medium further comprises instructions that, when executed by the processor, cause the processor to: prior to the converting of the 2D knitting map to the knitting instructions, present the 2D knitting map for review and edit by a user; and update the 2D knitting map according to revisions made by the user.
13. The non-transitory machine-readable medium of claim 11, wherein the non-transitory machine-readable medium further comprises instructions that, when executed by the processor, cause the processor to: prior to the converting of the 2D knitting map to the knitting instructions, define an updated 3D model based on the 2D knitting map; and update the 2D knitting map using the updated 3D model.
14. The non-transitory machine-readable medium of claim 11, wherein the 3D model is one of: selected from a library, produced from imaging of a physical article, produced by a user algorithmically, or produced by the user manually.
15. The non-transitory machine-readable medium of claim 11, wherein the non-transitory machine-readable medium further comprises instructions that, when executed by the processor, cause the processor to: receive a texture map representing a design, logo, pattern, or other features to be produced as part of the knitted article knitted by the computer-controlled flatbed knitting machine, wherein the texture map specifies one or more of texture or color on the surface of the 3D model; apply the texture map to the 3D model; and immediately after the 2D knitting map has been produced, transfer information represented in the texture map from the 3D model to the 2D knitting map.
16. The non-transitory machine-readable medium of claim 15, wherein the texture map specifies one or more of texture or color on the surface of the 3D model.
17. The non-transitory machine-readable medium of claim 11, wherein the 3D model represents a physical body as a collection of points in the 3D space that are connected by geometric primitives.
18. The non-transitory machine-readable medium of claim 11, wherein the 3D model is formed by scanning a real-world object.
19. The non-transitory machine-readable medium of claim 18, wherein the 3D model formed by scanning the real-world object is augmented through user manipulation of the 3D model in order to remove artifacts of the scanning and/or to provide customizations in one or more of shape or appearance.
20. The non-transitory machine-readable medium of claim 11, wherein the 2D knitting map comprises a checkerboard-like array in which each pixel represents a loop of the knitted article.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0009] The present invention is illustrated by way of example, and not limitation, in the figures of the accompanying drawings, in which:
DETAILED DESCRIPTION
[0028] The present invention addresses challenges, such as those described above, in the fabrication of articles using computer-controlled machines, and provides methods for such fabrication from instructions automatically generated from 3D models of said articles. In one embodiment, the present invention provides methods for transforming 3D meshes associated with 3D models of articles into instructions for a computer-controlled flatbed knitting machine.
[0029] As will become apparent in the following discussion of embodiments of the invention, various operations referred to herein are machine operations. Useful machines for performing the operations of the present invention include both the target fabrication machines which will produce the desired articles being constructed, and digital computer systems or other similar devices. The present invention involves, to some degree, the production of instructions for operating, that is, controlling the operation of, the target fabrication machines to produce a desired result. Those instructions by which the target fabrication machine will produce the desired result are created, in part, using one or more programmed digital computer systems, which in some cases may intercommunicate with one another. For example, in one embodiment of the invention, a first computer system, referred to as a client, is used to construct and/or customize a 3D model of the article to be fabricated, and that model is then passed to a second computer system, referred to as a server or host, where the 3D model is converted to a 2D bitmap or other representation suitable for translation into instructions for the target fabrication machine. In other cases, a single digital computer system may be used for both aspects of the operation, for example in a service-as-a-platform based approach in which a client computer system is used merely as a visualization and instruction instrument to observe, direct, and control processes executing on a server.
[0031] As illustrated, computer system 400 generally includes a communication mechanism such as a bus 410 for passing information (e.g., data and/or instructions) between various components of the system, including one or more processors 402 for processing the data and instructions. Processor(s) 402 perform(s) operations on data as specified by the stored computer programs on computer system 400, such as the stored computer programs for running a web browser and/or for constructing and/or customizing a 3D model of the article to be fabricated and visualizing results of the server-based operations that will be described in greater detail below. The stored computer programs for computer system 400 and server 492 may be written in any convenient computer programming language and then compiled into native instructions for the processors resident on the respective machines.
[0032] Computer system 400 also includes a memory 404, such as a random access memory (RAM) or any other dynamic storage device, coupled to bus 410. Memory 404 stores information, including processor-executable instructions, data, and temporary results, for performing the operations described herein. Computer system 400 also includes a read only memory (ROM) 406 or any other static storage device coupled to the bus 410 for storing static information, including processor-executable instructions, that is not changed by the computer system 400 during its operation. Also coupled to bus 410 is a non-volatile (persistent) storage device 408, such as a magnetic disk, optical disk, solid-state disc, or similar device for storing information, including processor-executable instructions, that persists even when the computer system 400 is turned off. Memory 404, ROM 406, and storage device 408 are examples of a non-transitory computer-readable medium.
[0033] Computer system 400 may also include human interface elements, such as a keyboard 412, display 414, and cursor control device (e.g., a mouse or trackpad) 416, each of which is coupled to bus 410. These elements allow a human user to interact with and control the operation of computer system 400. For example, these human interface elements may be used for controlling a position of a cursor on the display 414 and issuing commands associated with graphical elements presented thereon. In the illustrated example of computer system 400, special purpose hardware, such as an application specific integrated circuit (ASIC) 420, is coupled to bus 410 and may be configured to perform operations not performed by processor 402; for example, ASIC 420 may be a graphics accelerator unit for generating images for display 414.
[0034] To facilitate communication with external devices, computer system 400 also includes a communications interface 470 coupled to bus 410. Communication interface 470 provides bi-directional communication with remote computer systems such as server 492 and host 482 over a wired or wireless network link 478 that is communicably connected to a local network 480 and ultimately, through Internet service provider 484, to Internet 490. Server 492 is connected to Internet 490 and hosts a process that provides a service in response to information received over the Internet. For example, server 492 may host some or all of a process that provides a user the ability to construct and/or customize a 3D model of an article to be fabricated, which model is then converted to a 2D bitmap or other representation suitable for translation into instructions for the target fabrication machine, in accordance with embodiments of the present invention. It is contemplated that components of an overall system can be deployed in various configurations within one or more computer systems (e.g., computer system 400, host 482 and/or server 492).
[0035] Referring now to
[0036] The surface of the 3D model may also be defined by a texture map that includes details such as colors and textures. Generally, the texture map is a 2D construct, such as a bitmap image, although procedural textures may also be used. Often in the fabrication of custom articles, the user will create, as part of the modeling process, a custom texture map or UV map that forms part of the bespoke nature of the article to be fabricated. By UV map, we mean a 2D image of a pattern or other texture that is to be projected onto the 3D model's surface. The UV in UV map refers to the bi-dimensional nature of the process: the letters U and V denote orthogonal axes of the 2D texture because, by convention, X, Y and Z are used to denote orthogonal axes of the 3D model. UV maps allow for the application of designs, logos, patterns, and other features that will be produced as part of the finished article to be constructed.
[0037] When the 3D model with its associated texture map is ready, it is passed to a flattening process (step 104). The flattening process creates a 2D representation of the 3D model with its applied customizations (e.g., textures). The resulting 2D representation may be in the form of a bitmap and serves as a basis for producing machine instructions for the fabrication of the article represented by the 3D model. Before those instructions are generated, however, the user is afforded an opportunity to review and edit the 2D bitmap (step 106). While optional, this step is preferably included in the process 100 as the conversion from a 3D model to a 2D representation of the model may result in artifacts which can be edited/corrected at the individual pixel level. Any corrections thus made can then be reviewed, either in the form of the 2D bitmap and/or in the form of a revised 3D model. That is, the 2D bitmap can be represented as a 3D model in the same way as an original scan of an article can be so visualized. Or, the 2D bitmap may be applied as a texture map to the original 3D model for viewing. Thus, the above procedures 102-106 may be made iterative in nature (No branch of step 108) and the user may be afforded the opportunity to carefully refine his/her design until satisfied.
[0038] Once so satisfied (Yes branch of step 108), the 2D bitmap is used to produce fabrication instructions for the target fabrication equipment (step 110). For example, in the case of machines that operate in a raster-like fashion, the information provided in the 2D bitmap may be translated to cutting, stitching, knitting, welding, gluing, fastening, binding, folding, or other instructions for the fabrication machine(s). Each pixel in the 2D bitmap may represent, for example, a cut, a stitch, a weld, an application of glue or other fastener, a fold, or another operation to be performed by the fabrication machine(s). The translation into machine instructions will depend on the nature of the target fabrication machine(s). For example, a single pixel may be translated into one instruction or, more likely, more than one instruction. Where a cutting machine is involved, for example, the presence or absence of information in a pixel of the bitmap may determine the creation of a cutting instruction or an instruction not to cut a workpiece at a corresponding position. In other instances, information included in a pixel may define the nature of a stitch to be made, as well as the color of the yarn or thread to be used. Other target fabrication machines will have their own necessary instruction set and the 2D bitmap may be used to allocate instructions from that instruction set to spatial positions in or on a workpiece that represents the article to be manufactured.
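By way of illustration only, the per-pixel translation described above may be sketched as follows. The pixel codes and instruction names here are hypothetical and do not correspond to any particular fabrication machine's instruction set:

```python
# Illustrative sketch only: translating a 2D bitmap into per-pixel
# fabrication instructions. The pixel codes and instruction names
# are hypothetical, not those of any real machine.

def bitmap_to_instructions(bitmap):
    """Map each pixel of a row-major bitmap to zero or more instructions.

    Assumed pixel codes:
      0 -> no operation at this position (e.g., do not cut)
      1 -> cut the workpiece here
      2 -> knit a stitch with yarn A
      3 -> knit a stitch with yarn B
    """
    instructions = []
    for y, row in enumerate(bitmap):
        for x, pixel in enumerate(row):
            if pixel == 0:
                # Absence of information: no instruction for this position.
                continue
            if pixel == 1:
                instructions.append(("CUT", x, y))
            elif pixel == 2:
                instructions.append(("STITCH", x, y, "yarn_A"))
            elif pixel == 3:
                instructions.append(("STITCH", x, y, "yarn_B"))
    return instructions
```

In practice a single pixel would more likely expand to several low-level instructions, with the mapping table derived from the target machine's own instruction set.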
[0040] At step 204, the 3D model with its associated texture map may be passed to a flattening process. The flattening process may create a 2D knitting map for the article to be fabricated based on the 3D model with its applied customizations (e.g., textures). The 2D knitting map serves as a basis for producing knitting machine instructions for the fabrication of the knitted article represented by the 3D model. As discussed above, the user may be afforded an opportunity to review and edit the 2D knitting map (step 206), and any corrections can then be reviewed, either in the form of the 2D knitting map and/or in the form of a revised 3D model. Once the user is satisfied (Yes branch of step 208), the 2D knitting map is used to produce knitting instructions for the target knitting machine (step 210). For example, the 2D knitting map may be regarded as a checkerboard-like array in which each square (or pixel) represents a loop. The square may be coded (e.g., by color or other indicia) to provide information about the machining operations required, such as transferring loops, increasing or inactivating needles of the knitting machine, which bed of the knitting machine the needle is on, etc. Commercially available knitting machine software, which may execute on the computer system hosting the 2D knitting map, can transform such 2D knitting maps to g-code for a target knitting machine. For example, KnitPaint, produced by Shima Seiki of Wakayama, Japan, produces g-code for Shima Seiki knitting machines from pixel-level knitting maps. Additional parameters, such as tension settings, needle gauge, yarn type (elasticity), etc. may need to be provided in order for the knitting machine software to adapt the 2D knitting map to g-code in order to achieve a desired fit of the knitted article.
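A minimal sketch of the checkerboard-like 2D knitting map described above follows; each cell represents one loop, and the integer codes are illustrative stand-ins for the color or other indicia mentioned, not the encoding used by KnitPaint or any commercial package:

```python
# Illustrative encoding of a 2D knitting map: one integer per loop.
# These codes are assumptions for this sketch only.
INACTIVE, FRONT_KNIT, BACK_KNIT, TRANSFER = 0, 1, 2, 3

def course_operations(knitting_map):
    """Summarize, course by course, which needles perform which operation.

    knitting_map: list of courses (rows); each entry is a needle's code.
    Returns one dict per course mapping operation code -> needle indices.
    """
    summary = []
    for course in knitting_map:
        ops = {}
        for needle, code in enumerate(course):
            if code != INACTIVE:
                ops.setdefault(code, []).append(needle)
        summary.append(ops)
    return summary
```

A downstream translator (such as the commercial software mentioned above) would consume a map of this kind, together with tension, gauge, and yarn parameters, to emit machine-specific g-code.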
[0041] Referring now to
[0042] Computer-controlled knitting machines, including flatbed knitting machines, are capable of producing knitted articles with 3D geometries. To create such articles, a knitting pattern is needed that describes the article to be knitted as a series of 2D, line-by-line instructions. Popescu et al. have described an approach for creating knitting patterns from a 3D mesh. Popescu et al. Automated generation of knit patterns for non-developable surfaces. Humanizing Digital Reality: Design Modelling Symposium Paris 2017. Springer Singapore, 2018. Briefly, given a 3D mesh, a user-defined knitting direction, and desired loop parameters for a target knitting machine, courses and so-called short rows are generated. In knitting, a wale is a column of loops running lengthwise, corresponding to a warp of a woven fabric, and a course is a crosswise row of loops, corresponding to the filling. Short rows, also known as partial or turning rows, are used for shaping purposes (e.g., to fit curved areas such as shoulders, and to impart design elements, such as staggered stripes). In the method of Popescu et al., the courses are sampled with the defined loop width for the target machine to create a final topology, which is turned into a 2D knitting pattern in the form of squares representing loops course by course.
[0043] Narayanan et al. Automatic machine knitting of 3D meshes. ACM Transactions on Graphics (TOG) 37.3 (2018) described a computational approach to transform 3D meshes created by a conventional modeling program into instructions for a computer-controlled circular knitting machine. In the described method, an oriented, manifold 3D triangle mesh with two or more boundaries and a monotonic knitting time function specified by a scalar value at each vertex of the mesh are needed as inputs. Knitting machine instructions that approximate the mesh are developed in three main steps: First, a remeshing is performed to create a specially-structured knitting graph on the 3D surface. Second, tracing creates knitting instructions from the graph. Third, the knitting instructions are scheduled to needle locations on the knitting machine. The approach recognizes that knit objects have an intrinsic row-column structure, where rows arise from yarn-wise connections, and columns arise from loop-wise connections. The remeshing phase produces a directed graph to guide the row-column structure of the final knit object. Each node in the graph represents two knit loops in the fabricated pattern.
[0044] The present invention provides an automated method for transforming 3D meshes into instructions for a computer-controlled flatbed knitting machine. In one embodiment of the invention, pattern customization and flattening are merged into a single workflow, as represented in
[0045] Regardless of how it is created, the 3D model is preferably characterized by a 3D polygonal mesh that defines a surface of the article represented by the 3D model. As noted, that surface may be further defined by texture mapping (step 304). For example, details such as colors, patterns, shapes, etc. may be applied to customize the 3D model. This texture mapping may be performed using conventional 3D modeling tools, such as a personal computer or workstation running a 3D modeling program that allows for the described customization and visualization of the model. For example, in one embodiment of the invention, a UV map for the model may be provided or created and applied to the surface of the 3D model. Other processes can be used in lieu of UV mapping to add a pattern or texture to the 3D model. Customization of the 3D model through the application of surface textures or otherwise allows a user to create custom patterns, designs, logos, etc.
[0046] Returning to
[0047] The same 3D modeling tool(s) used to create and/or customize the 3D model may be used to define (e.g., draw) the streamline on the model and the resulting computer file that describes the customized model may be packaged and provided (step 308) to a server-based flattening tool, for example as a .glb file. In other instances, the model creation or importing and subsequent customization may be performed on a remote (e.g., cloud-based) system accessed by a client computer or workstation over a computer network or network of networks. This service-as-a-platform based approach allows the customization and subsequent flattening to be performed by the same cloud-based or network-based system, under the direction of a remote client at which a user is located.
[0048] In step 310, the flattening process, whether performed at a remote server or a local computer system (from the standpoint of the user), receives the input texture-mapped 3D model that includes the streamline. Using the streamline as an origin, the flattening approach begins by defining a set of isolines over the surface described by the 3D mesh (step 312). In this context, isolines are geodesics that are equally spaced from the streamline and from one another. In some embodiments of the invention, the isolines are determined according to the so-called heat method for computing geodesic distances described by Crane et al., Geodesics in Heat: A New Approach to Computing Distance Based on Heat Flow, ACM Transactions on Graphics, Vol. 32, No. 5 (September 2013). In other embodiments, other approaches for defining isolines may be used, for example, Sethian's Fast Marching Method, window propagation, etc.
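As an illustrative approximation only, the isoline computation can be sketched by running Dijkstra's algorithm over the mesh edge graph from the streamline vertices and banding vertices by distance. The heat method of Crane et al. cited above is the approach used in some embodiments; the adjacency-dict graph representation here is an assumption for this sketch:

```python
import heapq

# Illustrative stand-in for geodesic distance computation: Dijkstra's
# algorithm over the mesh edge graph, with the streamline vertices as
# sources. The graph format (dict: vertex -> [(neighbor, edge_length)])
# is assumed for this sketch.

def geodesic_distances(edges, sources):
    """Shortest-path distance from any source vertex to every vertex."""
    dist = {v: 0.0 for v in sources}
    heap = [(0.0, v) for v in sources]
    heapq.heapify(heap)
    done = set()
    while heap:
        d, v = heapq.heappop(heap)
        if v in done:
            continue
        done.add(v)
        for w, length in edges.get(v, []):
            nd = d + length
            if nd < dist.get(w, float("inf")):
                dist[w] = nd
                heapq.heappush(heap, (nd, w))
    return dist

def isoline_bands(dist, spacing):
    """Group vertices into equally spaced distance bands (isoline regions)."""
    return {v: int(d // spacing) for v, d in dist.items()}
```

Band boundaries in the output correspond to the equally spaced geodesic isolines from which courses are later generated.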
[0049] Referring to
[0050] With the quantization points defined, one or more cut lines may be automatically defined on the surface (step 316). A cut line, such as cut line 1100 shown in
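The quantization points referenced above may, in one simple construction assumed here for illustration, be placed along each isoline at equal arc-length spacing corresponding to the loop width of the target machine:

```python
import math

# Illustrative construction (assumed for this sketch): quantization
# points are placed along an isoline, represented as a polyline of
# coordinate tuples, at equal arc-length spacing (the loop width).

def quantize_polyline(points, spacing):
    """Return points sampled every `spacing` units of arc length."""
    result = [points[0]]
    carried = 0.0  # distance traveled since the last placed point
    for a, b in zip(points, points[1:]):
        seg = math.dist(a, b)
        t = spacing - carried
        while t <= seg:
            frac = t / seg
            # Linear interpolation within the current segment.
            result.append(tuple(pa + frac * (pb - pa) for pa, pb in zip(a, b)))
            t += spacing
        carried = (carried + seg) % spacing
    return result
```

Connecting such points across neighboring isolines, subject to knitting rules, yields the courses described in step 318.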
[0051] Next in the overall flattening process, courses are generated by connecting quantization points 1000 of the isolines 900 based on knitting rules (step 318).
[0053] In
[0054] programmed to use yarn with a first property (e.g., color, elasticity, etc.) when knitting stitches within the black regions 1306a of the knitting regions 1302, and use yarn with a second property (e.g., color, elasticity, etc.) when knitting stitches within the checkered regions 1306b of the knitting regions 1302. The use of black and checkered regions is only an example and it should be understood that the same information could have been visually represented using two different colors, two different patterns, etc.
[0055] In
[0056] This flattening process involves constructing quadrangles/triangles between the isolines by setting the locations of apexes of the quadrangles/triangles at the locations defined by the equidistant points on the isolines. Isolines require fast geodesic distance computation, which typically can only be performed quickly vertex to vertex. Because it is only vertex to vertex, retriangulating the original mesh to include (as new vertices) all intersections that the input path makes with existing edges and vertices allows for accurate calculations of distances from the path.
[0057] Liu et al., Knitting 4D Garments with Elasticity Controlled for Body Motion, ACM Transactions on Graphics, Vol. 40, No. 4 (August 2021) describe an approach for transforming a 3D model to a flattened bitmap by assigning each quad/triangle of a 3D mesh with corresponding wale and course indices. This process envisions each quad as a stitch and the stitches are mapped in order so as to be joined to one another, thereby defining the wales and courses. The present invention adopts a somewhat different approach. Unlike Liu et al., embodiments of the present invention employ a two-bed jersey with elastic interlock. This allows for a framework that has minimal stretch and distortion. Further, whereas Liu et al. do suggest that apex diffusion improves 3D shaping, the present inventors have determined this to be true only in areas of relatively constant (i.e., Gaussian) curvature. Planar regions of an object should not have any diffused apexes. Each diffused apex creates some amount of goring, which results in 3D convexity in the final knitted article. To accommodate these demands, the present invention employs apex attraction, pulling apexes together into a relatively smooth goring edge that, once relief lines are added, is pulled up well by the knitting machine without holes in the result. The relief lines may be added automatically or manually. This is used in conjunction with apex diffusion.
[0058] In one embodiment then, both apex diffusion, to handle areas of constant curvature, and apex attraction are employed in the same mesh, and even within the same wale. The apex diffusion and apex attraction function with weights, allowing a user to tune how much diffusion some areas will receive and how much attraction other areas will receive. This provides improved 3D shaping beyond the 3D processed tessellation's initial output of wales and courses. Additionally, embodiments of the invention may add point relaxation that may provide better knits as well by tessellating more square-shaped stitch mesh elements in place of parallelogram-shaped elements.
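The weighted apex attraction and diffusion described above can be illustrated, in one dimension, by the following sketch; the linear pull toward (or push away from) the apexes' mean position is an assumed simplification used only to show how the per-region weights act:

```python
# One-dimensional illustration of weighted apex attraction/diffusion.
# The linear rescaling about the mean is an assumed simplification,
# not the actual mesh operation of the embodiment.

def adjust_apexes(positions, weight, mode):
    """Rescale apex positions about their mean.

    positions: apex coordinates (floats) within one region
    weight:    0.0 (no change) .. 1.0 (full effect)
    mode:      "attract" pulls apexes together toward a smooth goring
               edge; "diffuse" spreads them apart, as appropriate for
               areas of constant curvature.
    """
    mean = sum(positions) / len(positions)
    factor = (1 - weight) if mode == "attract" else (1 + weight)
    return [mean + (p - mean) * factor for p in positions]
```

Per-region weights would let different portions of the same wale receive attraction while others receive diffusion, as the paragraph above describes.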
[0059] Optionally, a new 3D mesh may be generated (step 324) from the 2D knitting map. Providing a new 3D mesh allows for iteration by the user. That is, the new 3D mesh can be passed back to the original 3D modeling program (e.g., which may be running on the user's local computer or running on the server in a service-as-a-platform model) (step 326), where the new 3D mesh can be visualized (step 328), reviewed and, if desired, edited (step 330), and further edited as needed (No branch of step 332). The visualization may be done using a shader application or another visualization routine. For example, the shader may receive the new 3D mesh and a texture defined by a 2D context (fillmap) and user-specified colors to indicate which areas of the 3D mesh should display which knit structure/yarn color. At the pixel/texel level, the shader may determine what the current structure should be (knit structures may be obtained from a library and/or may be user-specified), and with the current structure determined, then draw the diffuse and normal map. Yarn structures may be defined using quadratic Bezier curves and, as an optimization, Cardano's root-finding method may be used to derive basic curve structures. After defining the underlying curve structure, the shader may derive a signed distance field to give the curves a tunable/variable thickness, and derive a gradient from the line's midpoint. Using the gradient strength at any point on the line, the curvature/slope value of the yarn may then be derived, as may the direction from any point on the gradient to the curve's midpoint (e.g., by evaluating root values returned from a findRoots process). This allows for orienting the curvature value derived in the previous step. With the curvature and orientation of a current point on a current curve or yarn available, the curvature weights in x and y directions may be calculated and these values used to assign color. Lastly, depth may be derived from the gradient strength.
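The yarn-rendering step above derives a signed distance field around quadratic Bezier curves. A simplified unsigned-distance sketch appears below; it substitutes dense sampling for the Cardano closed-form root finding mentioned in the embodiment, and all names are illustrative:

```python
import math

# Simplified sketch of the yarn distance computation. The embodiment
# uses Cardano's closed-form cubic roots to find the nearest point on
# a quadratic Bezier curve; this sketch substitutes dense sampling and
# returns an unsigned distance. Names are illustrative only.

def bezier_point(p0, p1, p2, t):
    """Evaluate a quadratic Bezier curve at parameter t in [0, 1]."""
    u = 1.0 - t
    return tuple(u * u * a + 2.0 * u * t * b + t * t * c
                 for a, b, c in zip(p0, p1, p2))

def distance_to_bezier(q, p0, p1, p2, samples=256):
    """Approximate the distance from point q to the curve by sampling."""
    best = float("inf")
    for i in range(samples + 1):
        t = i / samples
        x, y = bezier_point(p0, p1, p2, t)
        best = min(best, math.hypot(x - q[0], y - q[1]))
    return best
```

Thresholding such a distance field against a tunable thickness is one way to give rendered yarn curves variable width, as the shader description above suggests.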
[0060] As part of the visualization and editing process, the user can perform pixel-level (stitch-level) editing of the 2D knitting map (step 330). This editing may result in a new customization of the article for production and the entire procedure may be iterated (No branch of step 334), until the user is satisfied with the result. Once so satisfied (Yes branch of step 334), the final version of the 2D knitting map is converted to machine instructions (step 336), and sent for execution on the target knit machine (step 338).
[0062] If the reader closely inspects the 3D stitch map 1500, one will see that the pixels of the 3D stitch map 1500 are not arranged along straight lines, due to the 3D shape of the knitted article and the use of so-called short rows to effect the shaping of the knitted article. This means that there are bends or kinks in the courses (or wales) of the knitted article. Conceptually, to convert the 3D stitch map 1500 into the 2D knitting map 1400, one would straighten all of the courses into straight rows, and further orient the straightened courses into rows that are parallel to one another. In the present example, the courses are being straightened into straight rows, but in another example (not depicted), the wales could be straightened into straight columns. The third dimension of the 3D stitch map 1500, however, is not lost in the 2D knitting map 1400, as it is captured in the relative spacing between adjacent courses of the 3D stitch map 1500 which have been transformed into non-adjacent courses of the 2D knitting map 1400. For instance, portions of courses 15i and 15j in the 3D stitch map 1500 are adjacent to one another, whereas the corresponding courses 13i and 13j in the 2D knitting map 1400 are non-adjacent to one another (i.e., are separated from one another by goring region 12).
[0063] The rows of the 2D knitting map 1400 may also be referred to as the courses of the 2D knitting map 1400, as these rows map to the courses of the knitted article. Importantly, the courses of a 2D knitting map extend along straight lines which are arranged parallel to one another, as shown in
[0064] To describe
[0065] If the reader closely inspects 2D knitting map 1400, one will see that the courses are arranged in pairs of courses (also called a course pair). For clarity, each course pair has been depicted as two adjacent rows, one in a light grayscale color, and one in a darker grayscale color. For instance, the light rows may indicate stitches (or loops) that are knitted in the right direction, whereas darker rows may indicate stitches (or loops) that are knitted in the left direction. Each course pair may have two ends. For ease of discussion, these ends may be called the left end and the right end of a course pair. In the present example, many of the course pairs have a right end which is terminated by an apex (e.g., 14a, 14b, 14c, 14d, 14e and 14f). In the 2D knitting map 1400, an apex is indicated by two (or more) vertically aligned and contiguous white squares. For ease of illustration, it is noted that only some of the apexes in the 2D knitting map 1400 have been labeled. It is also noted that four contiguous courses are terminated at the left edge by apex 14h. Therefore, it should be clear that apexes can occur on both the left and right edges of a 2D knitting map. In most cases, however, when apex attraction is applied to a region of a 2D knitting map, the region contains the apexes from only one of the edges of the 2D knitting map (right or left edge).
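Under the same hypothetical grid representation ('k' for a stitch, '.' for an empty cell), the right-end apexes described above can be located by grouping rows into course pairs and finding, for each pair, the column just past its last stitch, where the two vertically aligned empty cells begin. This is an illustrative sketch, not the system's detection logic:

```python
def right_end_apexes(grid):
    """Find the right-end apex of each course pair in a 2D knitting map.

    Rows are grouped in pairs (one course knitted rightward, one
    leftward). A pair's apex is reported as (pair_index, column) at the
    first of the vertically aligned empty cells past its last stitch.
    """
    apexes = []
    for pair in range(len(grid) // 2):
        top, bottom = grid[2 * pair], grid[2 * pair + 1]
        end = max(max((i for i, c in enumerate(top) if c == 'k'), default=-1),
                  max((i for i, c in enumerate(bottom) if c == 'k'), default=-1))
        if end + 1 < len(top) and top[end + 1] == '.' and bottom[end + 1] == '.':
            apexes.append((pair, end + 1))
    return apexes
```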
[0066] For courses 13a-13f in the 2D knitting map 1400, corresponding courses 15a-15f have been labeled in the 3D stitch map 1500, with course 13a corresponding to course 15a, course 13b corresponding to course 15b, and so on. For apexes 14a-14h in the 2D knitting map 1400, corresponding apexes 16a-16h have been labeled in the 3D stitch map 1500, with apex 14a corresponding to apex 16a, apex 14b corresponding to apex 16b, and so on. In the context of courses, an apex may be understood as the transition from a course knitted in the right (or left) direction to the immediately adjacent course knitted in the left (or right) direction, respectively. The apexes appear as pointy structures in a knitted article, hence the name apex. Typically, apexes are located within the interior surface of a knitted article, as opposed to at the seam or edge of a knitted article. In the context of wales, an apex may be understood as the terminus of two or more wales, or two or more wales being combined into a single wale.
[0067] While the apexes of some adjacent course pairs are separated by a small distance in the 2D knitting map 1400 (e.g., between apexes 14a and 14b, or between apexes 14c and 14d), the apexes of other adjacent course pairs are separated by larger distances (e.g., between apexes 14e and 14f, or between apexes 14f and 14g). In apex attraction (the result of which is depicted below in
[0068] The region between non-adjacent course pairs in the 2D knitting map 1400 may be called a goring region 12, as it contributes to the goring (or the shaping) of the knitted article. Conceptually, when a 2D knitting map is transformed into a 3D stitch map (or when a knitted article is knitted in accordance with instructions generated from a 2D knitting map), goring regions are collapsed into no region at all. For example, while courses 13i and 13j are separated by the goring region 12 in 2D knitting map 1400, no corresponding goring region is present in stitch map 1500. Rather, goring region 12 is completely collapsed in 3D stitch map 1500 such that a portion of the course 15i (corresponding to course 13i) is in direct contact with a portion of course 15j (corresponding to course 13j) in the 3D stitch map 1500. Stated differently, individual stitches of courses 13i and 13j that face one another across the goring region 12 in 2D knitting map 1400 are in direct contact with one another in the 3D stitch map 1500.
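Both operations discussed in the preceding paragraphs can be reduced to simple one-dimensional sketches (conceptual only; the actual system operates on full 2D knitting maps, and the parameter and function names here are hypothetical). Apex attraction pulls apex positions toward one another by an amount set by a user-provided strength, and goring collapse removes the empty space so that facing courses come into direct contact:

```python
def attract_apexes(apex_cols, strength):
    """Pull apex column positions toward their mean.

    `strength` plays the role of the user-provided input: 0.0 leaves the
    spacing unchanged, 1.0 collapses all apexes onto a single column
    (a sharp goring line); intermediate values merely tighten spacing.
    """
    mean = sum(apex_cols) / len(apex_cols)
    return [round(c + strength * (mean - c)) for c in apex_cols]

def collapse_goring(grid):
    """Drop all-empty rows of a stitch grid.

    Goring regions vanish when the map is knitted, leaving the stitches
    that faced one another across the region in direct contact.
    """
    return [row for row in grid if any(cell == 'k' for cell in row)]
```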
[0069]
[0070] As depicted in
[0071] It is noted, however, that a pronounced goring line is not always a desired feature. For instance in the shaping of a knit hat (e.g., a knit beanie), it may be more desirable for the apexes to be arranged in a diffuse manner without the appearance of any goring line. As mentioned above, techniques for performing apex diffusion on a 2D knitting map are described in Liu et al.
[0072] Thus, methods have been described for fabricating articles using computer-controlled machines from instructions automatically generated from 3D models of those articles, and in particular, methods for transforming 3D meshes associated with 3D models of articles into instructions for a computer-controlled flatbed knitting machine.