SYSTEMS AND METHODS FOR CUSTOM FOOTWEAR, APPAREL, AND ACCESSORIES
20220061463 · 2022-03-03
Inventors
- Craig D. Vanderoef (Costa Mesa, CA, US)
- Safir Bellali (Pasadena, CA, US)
- Longtao Wang (Alhambra, CA, US)
- Henry Song (Glendale, CA, US)
CPC classification
A43D2200/60
HUMAN NECESSITIES
B33Y80/00
PERFORMING OPERATIONS; TRANSPORTING
A43B13/181
HUMAN NECESSITIES
A43D1/025
HUMAN NECESSITIES
International classification
Abstract
An example method may comprise determining, based at least on one or more images, one or more areas of wear indicative of worn portions of an article such as an article of footwear. The example method may comprise mapping the one or more areas of wear to a two dimensional wear model. The example method may comprise determining, based at least on the one or more images, a severity of wear of one or more of the one or more areas of wear. The example method may comprise determining, based on the two dimensional wear model and the severity of wear of each of the one or more areas of wear, a pattern for a custom article. The example method may comprise outputting a machine-readable code representing the pattern. The machine-readable code may be configured to be processed by a machine to cause manufacture of at least a portion of the custom article.
Claims
1. A method of making custom knit footwear, the method comprising: receiving one or more images of three-dimensional footwear; determining, based at least on the one or more images, one or more areas of wear indicative of worn portions of the footwear; mapping the one or more areas of wear to a two dimensional wear model; determining, based at least on the one or more images, a severity of wear of each of the one or more areas of wear; determining, based on the two dimensional wear model and the severity of wear of each of the one or more areas of wear, a knit pattern for a custom knit upper; and outputting a machine-readable code representing the knit pattern, wherein the machine-readable code is configured to be processed by a knitting machine to cause knitting of at least a portion of the custom knit upper.
2. The method of claim 1, wherein the one or more images are received via a mobile application.
3. The method of claim 1, wherein the one or more images comprises a top-down view or a side view, or both.
4. The method of claim 1, wherein the one or more areas of wear are determined using computer vision.
5. The method of claim 1, wherein the one or more areas of wear are determined using a machine learning algorithm trained on a plurality of footwear images.
6. The method of claim 1, wherein the mapping comprises point-to-point positioning.
7. The method of claim 1, wherein the two dimensional model is based on a pattern of a footwear upper.
8. The method of claim 1, wherein the severity of wear is determined using a machine learning algorithm trained on a plurality of footwear images.
9. The method of claim 1, wherein the knit pattern comprises a reinforced region spatially disposed based on a location of the one or more areas of wear.
10. The method of claim 1, wherein the knit pattern comprises a reinforced region spatially disposed based on the severity of wear of the one or more areas of wear.
11. The method of claim 1, wherein the knit pattern comprises a reinforced region spatially disposed based on a location and severity of wear of the one or more areas of wear.
12. An article of footwear manufactured using the method of claim 1.
13. The article of claim 12, wherein the article comprises a skate shoe.
14. A method of making custom footwear, the method comprising: determining, based at least on one or more images of footwear, one or more areas of wear indicative of worn portions of the footwear; mapping the one or more areas of wear to a two dimensional wear model; determining, based at least on the one or more images, a severity of wear of one or more of the one or more areas of wear; determining, based on the two dimensional wear model and the severity of wear of each of the one or more areas of wear, a pattern for a custom upper; and outputting a machine-readable code representing the pattern, wherein the machine-readable code is configured to be processed by a machine to cause manufacture of at least a portion of the custom upper.
15. A method of making a custom article, the method comprising: receiving one or more images of a three-dimensional article; determining, based at least on the one or more images, one or more areas of wear indicative of worn portions of the article; mapping the one or more areas of wear to a two dimensional wear model; determining, based at least on the one or more images, a severity of wear of each of the one or more areas of wear; determining, based on the two dimensional wear model and the severity of wear of each of the one or more areas of wear, a pattern for a custom article; and outputting a machine-readable code representing the pattern, wherein the machine-readable code is configured to be processed by a machine to cause formation of at least a portion of the custom article.
16. The method of claim 15, wherein the pattern comprises an outsole, a midsole, or an upper, or a component of apparel.
17. The method of claim 15, wherein the one or more images are received via a mobile application.
18. The method of claim 15, wherein the one or more images comprises a top-down view or a side view, or both.
19. The method of claim 15, wherein the one or more areas of wear are determined using computer vision.
20. The method of claim 15, wherein the one or more areas of wear are determined using a machine learning algorithm trained on a plurality of article images having various wear patterns.
21. The method of claim 15, wherein the mapping comprises point-to-point positioning.
22. The method of claim 15, wherein the severity of wear is determined using a machine learning algorithm trained on a plurality of article images.
23. The method of claim 15, wherein the pattern comprises a reinforced region spatially disposed based on a location of the one or more areas of wear.
24. The method of claim 15, wherein the pattern comprises a reinforced region spatially disposed based on the severity of wear of the one or more areas of wear.
25. The method of claim 15, wherein the pattern comprises a reinforced region spatially disposed based on a location and severity of wear of the one or more areas of wear.
26. An article manufactured using the method of claim 15.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0016] The following drawings show generally, by way of example, but not by way of limitation, various examples discussed in the present disclosure. In the drawings:
DETAILED DESCRIPTION
[0024] Skating is an art form. As an illustrative example, the way each skater grinds, kick-flips, and ollies their way through the streets and ramps creates a style and genre unique to them, in the same way each brush stroke and chisel strike belongs to Monet or Michelangelo. The medium that captures the skater's art is not a flat canvas or a chunk of marble, but the shoes in which they skate. Each trick, each push, each grind, even each fall and failed attempt, leaves a permanent mark on the shoes they skate in. Their art is etched in leather and canvas, left as a reminder of what was and a hint at what is next. The shoe as canvas for the artwork of skating has a fatal flaw: the shoe is fleeting. In the creation of their art, skaters destroy the canvas itself, and so their ability to create is hindered.
[0025] As a further example, footwear, apparel, or accessories may experience wear in a manner that is particular to a wearer and/or a specific activity. As such, the present disclosure may be used for various articles.
[0026] Extending the skate example, the performance customization model of the present disclosure offers an innovative solution to the continuance of art through skating by using the skater's (or artist's) previous work as the means to create a new canvas made specifically for them and their unique style of skating. Through image capture of the old shoe (the manifestation of the previous works), the systems and methods may determine how the skater's creative expression wears and tears their current shoes and, through a powerful algorithm, create a new shoe pattern that may be more durable for their specific style of skating. This pattern may be sent directly to a knitting machine, which may then knit highly specialized yarns in specific areas, utilizing performance structures to extend the ability to create in the skater's key zones, resulting in a fully knit shoe built to the needs of the individual skater's art.
[0027] This process may extend product longevity for each skater in a unique way and may allow the skater greater confidence that their product will move forward with them as they push the envelope of their art. This new process may make a real-time connection between their past skating and their future via a truly unique model of product creation.
[0028] Although reference is made to footwear, and in particular skate footwear, the processes, systems, and methods of the present disclosure may be applied to various footwear, apparel, accessories, and articles of manufacture without departing from the spirit of the invention.
[0030] The one or more shoes 102 may comprise a pair of shoes. The one or more shoes 102 may comprise shoes for skateboarding. The one or more shoes 102 may have wear and tear from use. Although reference to footwear, and illustrations thereof, are made herein, other articles such as apparel or accessories may be used.
[0031] At least a portion of the network 104 may comprise a private network. At least a portion of the network 104 may comprise a public network. At least a portion of the network 104 may comprise the internet.
[0032] The remote computing device 106 may be associated with a clothing manufacturer. The remote computing device 106 may be associated with a shoe manufacturer. The remote computing device 106 may comprise one or more servers. The remote computing device 106 may comprise a cloud computing environment. The remote computing device 106 may comprise a network of computing devices. The remote computing device 106 may comprise a deep learning architecture. The remote computing device 106 may comprise a convolutional neural network. The remote computing device 106 may be configured to communicate with instances of the application.
[0033] The remote computing device 106 may use computer vision to help identify areas of wear and/or to help define a severity associated with each identified area of wear. The remote computing device 106 may use image digitization to help identify areas of wear and/or to help define a severity associated with each identified area of wear. The remote computing device 106 may use image extraction to help identify areas of wear and/or to help define a severity associated with each identified area of wear. The remote computing device 106 may use image recognition to help identify areas of wear and/or to help define a severity associated with each identified area of wear.
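As a non-limiting illustration of the wear-identification idea in paragraph [0033], the sketch below flags pixels where a worn shoe image deviates strongly from a pristine reference image and groups them into coarse cells. The function name, the differencing approach, and the thresholds are illustrative assumptions; the disclosure does not specify a particular computer vision algorithm.

```python
import numpy as np

def find_wear_areas(shoe_img, reference_img, threshold=0.25):
    """Flag pixels where the worn shoe deviates strongly from a
    pristine reference image, then group them into coarse grid cells.

    Both images are float arrays in [0, 1] of identical shape.
    Returns a set of (row, col) grid-cell coordinates showing wear.
    """
    diff = np.abs(shoe_img - reference_img)
    mask = diff > threshold                      # per-pixel wear flag
    cell = 4                                     # coarse 4x4-pixel cells
    h, w = mask.shape
    worn_cells = set()
    for r in range(0, h, cell):
        for c in range(0, w, cell):
            block = mask[r:r + cell, c:c + cell]
            if block.mean() > 0.5:               # majority of cell worn
                worn_cells.add((r // cell, c // cell))
    return worn_cells
```

In practice the reference could be a catalog image of the unworn model, aligned to the captured photo before differencing.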
[0034] The manufacturing machine 108 may comprise a knitting machine or other machine used in making or assembling footwear, apparel, or accessories. Although reference is made to knitting techniques, other manufacturing or assembly techniques may be used, such as digital printing, robotic assembly, adhesive or welding (including sonic welding) techniques, laminating, etc. The manufacturing machine 108 may be in communication with the remote computing device 106. The manufacturing machine 108 may be in direct communication with the remote computing device 106. The manufacturing machine 108 may be in communication with the remote computing device 106 via a network, such as the network 104. The manufacturing machine 108 may take an image file (e.g., JPEG, bitmap, etc.) as input. The manufacturing machine 108 may take instructions (e.g., Make One (M1) patterns, Make-One-Left (M1L) patterns, etc.) as input. The manufacturing machine 108 may output instructions (e.g., Make One (M1) patterns, Make-One-Left (M1L) patterns, etc.). The manufacturing machine 108 may output manufactured (e.g., knitted) apparel, such as an upper for footwear, a midsole, an outsole, an apparel component, or an accessory.
[0035] A user may have an article (e.g., a pair of shoes), such as the one or more shoes 102. The article may exhibit wear and tear from use. The user may execute an application on a user device, such as the user device 100. The application may be associated with a manufacturer of the article. The user may capture one or more images of the article with the user device and use the application to transmit the one or more images through a network, such as the network 104, to a cloud computing environment associated with the manufacturer, such as the remote computing device 106.
[0036] The cloud computing environment may identify one or more locations based on the one or more images, wherein the identified one or more locations are indicative of wear and tear. The cloud computing environment may create a two-dimensional (2-D) pattern based on the identified one or more locations and/or the one or more images. The cloud computing environment may determine a severity degree associated with each of the one or more identified locations. The cloud computing environment may send instructions to create one or more articles (e.g., a pair of uppers for shoes) based on the 2-D pattern and/or the one or more determined severity degrees to a device (e.g., machine, knitting machine, computing device), such as the manufacturing machine 108. As an example, the manufacturing machine 108 may construct or fabricate a custom pair of uppers for shoes for the user based on her particular wear on the pair of shoes. Although reference is made to uppers for shoes, other footwear components may be made such as a midsole or outsole, or apparel components or accessories.
[0038] The application may capture the top-down view images and/or the side view images. The application may process the top-down view images and/or the side view images. The application may use image extraction to define the boundaries of the shoes in the images and remove everything else. The application may use data transformation to align multiple images of the same shoe or shoe pair. The application may be in communication with an artificial intelligence (AI) engine via an application programming interface (API) to help identify the boundaries of the shoes. After receiving the images and the form information from the user, the application may cause the images and the form information to be transmitted across a network to a back-end computing system, such as the remote computing device 106.
[0040] The back-end of the application may collect images for training. The back-end of the application may use image digitization to model readable data. The back-end of the application may use data cleaning to remove noise from image data, such as top-down view and side view images of shoes. The back-end of the application may split the image data and prepare the image data for modeling. The back-end of the application may select a particular algorithm from a plurality of algorithms to use for a particular image of a shoe. The back-end of the application may comprise a modeling pipeline for identifying worn areas on a shoe. The back-end of the application may use model training and/or tuning to teach a model to learn patterns from images of shoes. The back-end of the application may determine an evaluation of a model for identifying worn areas in a shoe. The back-end of the application may send model reports to a user interface of the application.
[0044] The back-end of the application may comprise the coordinate system. The back-end of the application may comprise a 3-D digital model. The back-end of the application may comprise a 2-D digital model. The back-end of the application may comprise a 3-D to 2-D PTP mapping module.
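As a non-limiting illustration of the 3-D to 2-D point-to-point (PTP) mapping module described in paragraph [0044], the sketch below maps a point on a 3-D shoe model to a flattened 2-D pattern via a table of landmark correspondences. The landmark coordinates and nearest-landmark assignment are invented for illustration; a production mapper would likely interpolate between landmarks.

```python
import math

# Hypothetical landmark correspondences between points on a 3-D shoe
# model (x, y, z) and points on the flattened 2-D upper pattern (u, v).
LANDMARKS_3D = [(0.0, 0.0, 0.0), (10.0, 0.0, 0.0), (5.0, 4.0, 3.0)]
LANDMARKS_2D = [(0.0, 0.0), (10.0, 0.0), (5.0, 6.0)]

def map_point_to_pattern(p3d):
    """Map a 3-D surface point into 2-D pattern space by finding its
    nearest landmark and reusing that landmark's 2-D position.
    Nearest-landmark assignment illustrates the point-to-point idea."""
    idx = min(range(len(LANDMARKS_3D)),
              key=lambda i: math.dist(p3d, LANDMARKS_3D[i]))
    return LANDMARKS_2D[idx]
```

A worn area detected on the 3-D model can thus be carried over to the 2-D wear model cell by cell.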
[0048] Mapping 440 shows a mapping for a pattern for a right upper for a shoe.
[0049] Mapping 440 comprises area 422c to compensate for the first location 422a, 422b. Area 422c may receive extra material. Area 422c may receive extra knitting material. Mapping 450 shows a mapping for a pattern for a left upper for a shoe. Mapping 450 comprises area 424c to compensate for the second location 424a, 424b. Area 424c may receive extra material. Area 424c may receive extra knitting material. Mapping 450 comprises area 426c to compensate for the third location 426a, 426b. Area 426c may receive extra material. Area 426c may receive extra knitting material.
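As a non-limiting illustration of how compensation areas such as 422c, 424c, and 426c might be assigned extra material, the sketch below maps each mapped 2-D wear cell and its severity to a reinforcement choice. The severity thresholds and material names are illustrative assumptions, not taken from the disclosure.

```python
def reinforcement_plan(wear_cells, severities):
    """Given mapped 2-D wear cells and a 1-5 severity per cell, decide
    how much extra material each compensation area should receive.
    Thresholds and material names are illustrative only."""
    plan = {}
    for cell in wear_cells:
        s = severities[cell]
        if s >= 4:
            plan[cell] = "double-layer knit"    # heavy wear: extra material
        elif s >= 2:
            plan[cell] = "reinforced yarn"      # moderate wear
        else:
            plan[cell] = "standard knit"        # light wear
    return plan
```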
[0051] The back-end of the application may be trained to recognize patterns and regularities in data automatically using training data comprising images of worn articles (e.g., shoes), such as images 500, 502, 504, 506, 508, 510, 512, 514, 516. The back-end of the application may use AI and/or machine learning to determine severity degrees of worn areas. The back-end of the application may use computer vision to determine severity degrees of worn areas. The back-end of the application may use image digitization to determine severity degrees of worn areas. The back-end of the application may use data mining to determine severity degrees of worn areas. The back-end of the application may use user input to determine severity degrees of worn areas. The back-end of the application may use knowledge discovery in image databases to determine severity degrees of worn areas.
[0052] The back-end of the application may prepare severity data, such as images 500, 502, 504, 506, 508, 510, 512, 514, 516. The back-end of the application may select an algorithm for severity degree determination. The back-end of the application may comprise a severity assessment model. The back-end of the application may comprise severity assessment model training and/or tuning. The back-end of the application may comprise severity assessment model evaluation. The back-end of the application may comprise severity assessment reporting to a front-end instance of the application.
[0053] The back-end of the application may comprise a severity degree identification engine. The back-end of the application may comprise a deep regression neural network. The back-end of the application may be trained on worn-out severity data, such as images 500, 502, 504, 506, 508, 510, 512, 514, 516, to automatically assess a degree of severity from an image of an upper of a shoe. Severity may be represented on a scale (e.g., color or numerical) or by thresholds (e.g., pre-defined categories). The back-end application may receive an image of worn shoes and output a severity degree associated with each worn area.
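As a non-limiting illustration of the severity representations described in paragraph [0053] (a scale or pre-defined threshold categories), the sketch below bins a wear patch's mean intensity into a 1-5 degree. The disclosure contemplates a deep regression neural network; this stand-in replaces the network's output with a simple mean, and its bin edges are illustrative assumptions.

```python
import numpy as np

def severity_degree(wear_patch):
    """Score a wear patch (float array in [0, 1], where 1 = fully worn
    through) on a 1-5 scale by binning mean wear intensity into
    pre-defined threshold categories."""
    score = float(np.mean(wear_patch))           # crude regression stand-in
    thresholds = [0.2, 0.4, 0.6, 0.8]            # illustrative bin edges
    return 1 + sum(score > t for t in thresholds)
```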
[0055] At step 604, one or more areas of wear indicative of worn portions of the footwear may be determined based at least on the one or more images, for example by the remote computing device 106.
[0056] At step 606, the one or more areas of wear may be mapped to a two dimensional wear model, for example by the remote computing device 106.
[0057] At step 608, a severity of wear of each of the one or more areas of wear may be determined based at least on the one or more images, for example by the remote computing device 106.
[0058] At step 610, a custom pattern, such as a knit pattern, may be determined for a custom knit upper based on the two dimensional wear model and the severity of wear of each of the one or more areas of wear, for example by the remote computing device 106. Although a knit pattern is referenced for illustration, other material patterns or components may be used, such as an upper, midsole, outsole, apparel, or accessories.
[0059] At step 612, a machine-readable code representing the knit pattern may be outputted, for example by the remote computing device 106.
[0060] As an example, rather than generating a custom knit pattern, one or more available pre-set patterns may be selected based on the two dimensional wear model and the severity of wear of each of the one or more areas of wear. As an example, steps 610 and 612 may be embodied as a suggestion engine that recommends an available article from a pre-set catalogue (e.g., inline styles) that best matches the custom article based on the two dimensional wear model and the severity of wear. As used herein, “inline style” may refer to one or a plurality of pre-designed styles for a particular season (or seasons) of footwear, apparel or accessories. Other inputs may be used, such as expert feedback, historical style data (i.e., of the user/purchaser), preference data for a particular wearer or activity, etc. Suggestion engine recommendations may be based on a single user or may be aggregated based on preferences of cohorts or other like users.
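As a non-limiting illustration of the suggestion engine described in paragraph [0060], the sketch below recommends the pre-set inline style whose reinforced zones best match the wearer's worn zones. The catalogue, zone names, and use of Jaccard similarity are illustrative assumptions, not part of the disclosure.

```python
# Hypothetical inline catalogue: each style lists which pattern zones
# it reinforces. Zone names and styles are invented for illustration.
CATALOGUE = {
    "street-pro": {"toe", "ollie"},
    "vert-classic": {"heel"},
    "all-rounder": {"toe", "heel", "ollie"},
}

def suggest_style(worn_zones):
    """Recommend the inline style whose reinforced zones best cover the
    wearer's worn zones, scored by Jaccard similarity."""
    def jaccard(a, b):
        return len(a & b) / len(a | b) if a | b else 0.0
    return max(CATALOGUE, key=lambda s: jaccard(worn_zones, CATALOGUE[s]))
```

Aggregating worn-zone sets across a cohort of similar users, as the disclosure suggests, would simply change the input to this matcher.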
[0061] An article, such as an article of footwear or component thereof, apparel or a component thereof, or an accessory or component thereof, may be manufactured using any combination of any portions of the steps described herein.
[0063] At step 704, the one or more areas of wear may be mapped to a two dimensional wear model, for example by the remote computing device 106.
[0064] At step 706, a severity of wear of one or more of the one or more areas of wear may be determined based at least on the one or more images, for example by the remote computing device 106.
[0065] At step 708, a pattern may be determined for a custom component, such as a footwear upper, based on the two dimensional wear model and the severity of wear of each of the one or more areas of wear, for example by the remote computing device 106. Determining a custom component may comprise selecting an available component from a pre-set catalogue (e.g., inline styles) of available patterns or components. Additionally or alternatively, a custom component may be specifically designed for a particular user on an ad hoc basis. Determining a custom component may be based on historical data, wearer information, wearer style, expert input, or the like.
[0066] At step 710, a machine-readable code representing the pattern may be outputted, for example by the remote computing device 106.
[0067] An article, such as an article of footwear or component thereof, apparel or a component thereof, or an accessory or component thereof, may be manufactured using any combination of any portions of the steps described herein.
[0068] Additionally or alternatively, the present disclosure relates to receiving image data (e.g., directly from a wearer), such as images of worn articles (e.g., footwear or apparel). Images and/or other information may be received over a period of time, for example, to develop a history of wear and a personalized wear experience. As a non-limiting example, expert information may be received that relates to the article and/or an end-use. As an illustration, a subject-matter expert may review the history of wear or other details relating to a wearer and may provide expert information relating to style or wear. A technical skate expert may advise on the type of skate style a particular wearer may have, and thus the skate style may be used to determine an expected wear pattern. An expert trail runner may advise on the type of running style a particular wearer may have, and thus the runner style may be used to determine an expected wear pattern. A model may be created representing the wearer style and end-use needs. The model may comprise AI-based or machine learning-based models. The model may be trained or tested on data such as image data. The model may be tuned based on expert information or other details relating to the wearer. From the model, a suggestion of an inline article may be provided to a wearer. Additionally or alternatively, a customized article may be manufactured (e.g., on demand) based on the model for the particular wearer.
EXAMPLES
[0069] Example 1: A method of making custom knit footwear, the method comprising: [0070] receiving one or more images of three-dimensional footwear; determining, based at least on the one or more images, one or more areas of wear indicative of worn portions of the footwear; mapping the one or more areas of wear to a two dimensional wear model; determining, based at least on the one or more images, a severity of wear of each of the one or more areas of wear; determining, based on the two dimensional wear model and the severity of wear of each of the one or more areas of wear, a knit pattern for a custom knit upper; and outputting a machine-readable code representing the knit pattern, wherein the machine-readable code is configured to be processed by a knitting machine to cause knitting of at least a portion of the custom knit upper.
[0071] Example 2: The method of example 1, wherein the one or more images are received via a mobile application.
[0072] Example 3: The method of any of examples 1-2, wherein the one or more images comprises a top-down view or a side view, or both.
[0073] Example 4: The method of any of examples 1-3, wherein the one or more areas of wear are determined using computer vision.
[0074] Example 5: The method of any of examples 1-4, wherein the one or more areas of wear are determined using a machine learning algorithm trained on a plurality of footwear images.
[0075] Example 6: The method of any of examples 1-5, wherein the mapping comprises point-to-point positioning.
[0076] Example 7: The method of any of examples 1-6, wherein the two dimensional model is based on a pattern of a footwear upper.
[0077] Example 8: The method of any of examples 1-7, wherein the severity of wear is determined using a machine learning algorithm trained on a plurality of footwear images.
[0078] Example 9: The method of any of examples 1-8, wherein the knit pattern comprises a reinforced region spatially disposed based on a location of the one or more areas of wear.
[0079] Example 10: The method of any of examples 1-9, wherein the knit pattern comprises a reinforced region spatially disposed based on the severity of wear of the one or more areas of wear.
[0080] Example 11: The method of any of examples 1-10, wherein the knit pattern comprises a reinforced region spatially disposed based on a location and severity of wear of the one or more areas of wear.
[0081] Example 12: An article of footwear manufactured using the method of any one of examples 1-11.
[0082] Example 13: The article of example 12, wherein the article comprises a skate shoe.
[0083] Example 14: A method of making custom footwear, the method comprising: [0084] determining, based at least on one or more images of footwear, one or more areas of wear indicative of worn portions of the footwear; mapping the one or more areas of wear to a two dimensional wear model; determining, based at least on the one or more images, a severity of wear of one or more of the one or more areas of wear; determining, based on the two dimensional wear model and the severity of wear of each of the one or more areas of wear, a pattern for a custom upper; and outputting a machine-readable code representing the pattern, wherein the machine-readable code is configured to be processed by a machine to cause manufacture of at least a portion of the custom upper.
[0085] Example 15: The method of any of examples 1-11 or 14, wherein the one or more images are received via a mobile application.
[0086] Example 16: The method of any of examples 1-11 or 14-15, wherein the one or more images comprises a top-down view or a side view, or both.
[0087] Example 17: The method of any of examples 1-11 or 14-16, wherein the one or more areas of wear are determined using computer vision.
[0088] Example 18: The method of any of examples 1-11 or 14-17, wherein the one or more areas of wear are determined using a machine learning algorithm trained on a plurality of footwear images.
[0089] Example 19: The method of any of examples 1-11 or 14-18, wherein the mapping comprises point-to-point positioning.
[0090] Example 20: The method of any of examples 1-11 or 14-19, wherein the two dimensional model is based on a pattern of a footwear upper.
[0091] Example 21: The method of any of examples 1-11 or 14-20, wherein the severity of wear is determined using a machine learning algorithm trained on a plurality of footwear images.
[0092] Example 22: The method of any of examples 1-11 or 14-21, wherein the pattern comprises a reinforced region spatially disposed based on a location of the one or more areas of wear.
[0093] Example 23: The method of any of examples 1-11 or 14-22, wherein the pattern comprises a reinforced region spatially disposed based on the severity of wear of the one or more areas of wear.
[0094] Example 24: The method of any of examples 1-11 or 14-23, wherein the pattern comprises a reinforced region spatially disposed based on a location and severity of wear of the one or more areas of wear.
[0095] Example 25: An article of footwear manufactured using the method of any one of examples 1-11 or 14-24.
[0096] Example 26: The article of example 25, wherein the article comprises a skate shoe.
[0097] Example 27: A method of making a custom article, the method comprising: receiving one or more images of a three-dimensional article; determining, based at least on the one or more images, one or more areas of wear indicative of worn portions of the article; mapping the one or more areas of wear to a two dimensional wear model; determining, based at least on the one or more images, a severity of wear of each of the one or more areas of wear; determining, based on the two dimensional wear model and the severity of wear of each of the one or more areas of wear, a pattern for a custom article; and outputting a machine-readable code representing the pattern, wherein the machine-readable code is configured to be processed by a machine to cause formation of at least a portion of the custom article.
[0098] Example 28: The method of example 27, wherein the pattern comprises an outsole, a midsole, or an upper, or a component of apparel.
[0099] Example 29: The method of any one of examples 27-28, wherein the one or more images are received via a mobile application.
[0100] Example 30: The method of any one of examples 27-29, wherein the one or more images comprises a top-down view or a side view, or both.
[0101] Example 31: The method of any one of examples 27-30, wherein the one or more areas of wear are determined using computer vision.
[0102] Example 32: The method of any one of examples 27-31, wherein the one or more areas of wear are determined using a machine learning algorithm trained on a plurality of article images having various wear patterns.
[0103] Example 33: The method of any one of examples 27-32, wherein the mapping comprises point-to-point positioning.
[0104] Example 34: The method of any one of examples 27-33, wherein the severity of wear is determined using a machine learning algorithm trained on a plurality of article images.
[0105] Example 35: The method of any one of examples 27-34, wherein the pattern comprises a reinforced region spatially disposed based on a location of the one or more areas of wear.
[0106] Example 36: The method of any one of examples 27-35, wherein the pattern comprises a reinforced region spatially disposed based on the severity of wear of the one or more areas of wear.
[0107] Example 37: The method of any one of examples 27-36, wherein the pattern comprises a reinforced region spatially disposed based on a location and severity of wear of the one or more areas of wear.
[0108] Example 38: An article of footwear manufactured using the method of any one of examples 27-37.
[0109] Example 39: The article of example 38, wherein the article comprises a skate shoe.