METHOD AND SYSTEM FOR INSPECTING UPPER OF SHOE
20200151951 · 2020-05-14
CPC classification
A43B9/00
HUMAN NECESSITIES
G06T17/10
PHYSICS
A43D2200/60
HUMAN NECESSITIES
A43D21/16
HUMAN NECESSITIES
G06T19/00
PHYSICS
Abstract
A system for inspecting an upper of a shoe includes an optical sub-system and a processor. The optical sub-system captures an image related to a work piece which includes the upper and a last so as to output image data representing the image. The processor receives the image data, establishes a work piece model that is a three-dimensional model of the work piece based on the image data, obtains a cross section of the work piece model, obtains an entry of section data related to the cross section of the work piece model, and compares the entry of section data and an entry of predetermined standard data so as to generate a result of inspection indicating whether the upper is normal.
Claims
1. A method for inspecting an upper of a shoe, the upper being pulled around a last by a shoe lasting machine where the upper and the last cooperatively serve as a work piece, the method to be implemented by a processor and comprising: a) establishing, based on at least one image related to the work piece, a work piece model that is a three-dimensional model of the work piece; b) obtaining a cross section of the work piece model along an imaginary cutting plane passing through an imaginary section line; c) obtaining an entry of section data related to the cross section of the work piece model; and d) comparing the entry of section data and an entry of predetermined standard data so as to generate a result of inspection indicating whether the upper is normal.
2. The method as claimed in claim 1, wherein step a) includes sub-steps of: a-1) obtaining two images of the work piece which are captured by two image capturing devices spaced apart from each other; a-2) comparing the two images of the work piece to obtain plural pairs of corresponding image points in the two images; a-3) for each of the pairs of corresponding image points, determining a disparity of the pair of corresponding image points, and calculating spatial coordinates of a common point (p) in a three-dimensional space based on the disparity and according to triangulation, where the pair of corresponding image points are projections of the common point in the two images; a-4) obtaining a point cloud associated with the work piece, where the point cloud is formed by the common points each correlating to a respective one of the pairs of corresponding image points in the three-dimensional space; and a-5) establishing the work piece model based on the spatial coordinates of the common points in the point cloud.
3. The method as claimed in claim 1, wherein step a) includes sub-steps of: a-1) controlling a light source device to emit a structured light beam on the upper to form a projected pattern on the work piece; a-2) obtaining an image of the projected pattern formed on the work piece; a-3) calculating spatial coordinates of each of elements of the projected pattern based on a degree of deformation of the projected pattern in the image and a positional relationship between the light source device and the work piece; and a-4) establishing the work piece model based on the spatial coordinates of the elements of the projected pattern.
4. The method as claimed in claim 3, wherein: the structured light beam has a predefined pattern that is one of a pattern consisting of multiple stripes and a single light stripe; and the structured light beam is emitted from one of a laser projector and a digital light processing (DLP) projector.
5. The method as claimed in claim 1, wherein: the entry of section data related to the cross section of the work piece model includes coordinates of a geometric center of the cross section of the work piece model; the entry of predetermined standard data is a set of threshold values defining an allowable range; and when it is determined that the coordinates of the geometric center of the cross section of the work piece model are outside of the allowable range, the result of inspection thus generated indicates that the upper is abnormal.
6. The method as claimed in claim 1, wherein: the entry of section data related to the cross section of the work piece model includes a moment of inertia of the cross section of the work piece model; the entry of predetermined standard data is a set of threshold values defining an allowable range; and when it is determined that the moment of inertia of the cross section of the work piece model is outside of the allowable range, the result of inspection thus generated indicates that the upper is abnormal.
7. The method as claimed in claim 1, wherein: the entry of section data related to the cross section of the work piece model includes a slope of a line passing through two predefined points on a contour of the cross section of the work piece model; the entry of predetermined standard data is a set of threshold values defining an allowable range; and when it is determined that the slope of the line passing through the two predefined points on the contour of the cross section of the work piece model is outside of the allowable range, the result of inspection thus generated indicates that the upper is abnormal.
8. The method as claimed in claim 1, wherein: the entry of section data related to the cross section of the work piece model includes a contour of the cross section of the work piece model; the entry of predetermined standard data is a standard contour; and when it is determined that the contour of the cross section of the work piece model does not match the standard contour according to a contour matching method, the result of inspection thus generated indicates that the upper is abnormal.
9. The method as claimed in claim 1, subsequent to step d), further comprising: e) generating, when it is indicated by the result of inspection thus generated that the upper is abnormal, a warning signal to control a warning device to output one of sound, light and a text or graphics message, or to be transmitted to an electronic device to notify that the upper is abnormal.
10. A system for inspecting an upper of a shoe, the upper being pulled around a last by a shoe lasting machine where the upper and the last cooperatively serve as a work piece, said system comprising: an optical sub-system configured to capture at least one image related to the work piece so as to output image data representing the image thus captured; and a processor configured to receive the image data, to establish a work piece model that is a three-dimensional model of the work piece based on the image data, to obtain a cross section of the work piece model along an imaginary cutting plane passing through an imaginary section line, to obtain an entry of section data related to the cross section of the work piece model, and to compare the entry of section data and an entry of predetermined standard data so as to generate a result of inspection indicating whether the upper is normal.
11. The system as claimed in claim 10, wherein: said optical sub-system includes two image capturing devices which are located respectively at two fixed locations and are spaced apart from each other, and each of which is configured to capture an image of the work piece, and to output the image data representing the image of the work piece to said processor so as to enable said processor to obtain the image of the work piece; and said processor is configured to compare the two images of the work piece from the two image capturing devices to obtain plural pairs of corresponding image points in the two images, for each of the pairs of corresponding image points, determine a disparity of the pair of corresponding image points, and calculate spatial coordinates of a common point in a three-dimensional space based on the disparity and according to triangulation, where the pair of corresponding image points are projections of the common point in the two images, obtain a point cloud associated with the work piece, where the point cloud is formed by the common points each correlating to a respective one of the pairs of corresponding image points in the three-dimensional space, and establish the work piece model based on the spatial coordinates of each of the common points in the point cloud.
12. The system as claimed in claim 10, wherein: said optical sub-system includes a light source and an image capturing device that are located respectively at two fixed locations and are spaced apart from each other; said light source is configured to emit a structured light beam onto the upper to form a projected pattern on the work piece; said image capturing device is configured to capture an image of the projected pattern formed on the work piece, and to output the image data related to the image of the projected pattern to said processor so as to enable said processor to obtain the image of the projected pattern; and said processor is configured to calculate spatial coordinates of each of elements of the projected pattern based on a degree of deformation of the projected pattern in the image and a positional relationship between said light source and the work piece, and to establish the work piece model based on the spatial coordinates of the elements of the projected pattern.
13. The system as claimed in claim 12, wherein: said light source includes one of a laser projector and a digital light processing (DLP) projector that emits the structured light beam having a predefined pattern that is one of a pattern consisting of multiple stripes and a single light stripe.
14. The system as claimed in claim 10, wherein: the entry of section data related to the cross section of the work piece model includes coordinates of a geometric center of the cross section of the work piece model; the entry of predetermined standard data is a set of threshold values defining an allowable range; and said processor is configured to, when it is determined that the coordinates of the geometric center of the cross section of the work piece model are outside of the allowable range, generate the result of inspection indicating that the upper is abnormal.
15. The system as claimed in claim 10, wherein: the entry of section data related to the cross section of the work piece model includes a moment of inertia of the cross section of the work piece model; the entry of predetermined standard data is a set of threshold values defining an allowable range; and said processor is configured to, when it is determined that the moment of inertia of the cross section of the work piece model is outside of the allowable range, generate the result of inspection indicating that the upper is abnormal.
16. The system as claimed in claim 10, wherein: the entry of section data related to the cross section of the work piece model includes a slope of a line passing through two predefined points on a contour of the cross section of the work piece model; the entry of predetermined standard data is a set of threshold values defining an allowable range; and said processor is configured to, when it is determined that the slope of the line passing through the two predefined points on the contour of the cross section of the work piece model is outside of the allowable range, generate the result of inspection indicating that the upper is abnormal.
17. The system as claimed in claim 10, wherein: the entry of section data related to the cross section of the work piece model includes a contour of the cross section of the work piece model; the entry of predetermined standard data is a standard contour; and said processor is configured to, when it is determined that the contour of the cross section of the work piece model does not match the standard contour according to a contour matching method, generate the result of inspection indicating that the upper is abnormal.
18. The system as claimed in claim 10, further comprising: a warning device, wherein said processor is further configured to, when it is indicated by the result of inspection thus generated that the upper is abnormal, generate a warning signal to control said warning device to output one of sound, light and a text or graphics message, or to be transmitted to an electronic device to notify that the upper is abnormal.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0015] Other features and advantages of the disclosure will become apparent in the following detailed description of the embodiments with reference to the accompanying drawings.
DETAILED DESCRIPTION
[0028] Before the disclosure is described in greater detail, it should be noted that where considered appropriate, reference numerals or terminal portions of reference numerals have been repeated among the figures to indicate corresponding or analogous elements, which may optionally have similar characteristics.
[0029] Referring to
[0030] Referring to
[0031] Specifically speaking, referring to
[0032] The processor 5 is configured to compare the two images 91 of the work piece 3 to obtain plural pairs of corresponding image points (p) in the two images 91 (only one pair is shown). For each of the pairs of corresponding image points (p), the processor 5 is configured to determine a disparity of the pair of corresponding image points (p), and to calculate a set of spatial coordinates of a common point (p) in a three-dimensional space based on the disparity and according to triangulation (such as the computer vision triangulation technique), where the pair of corresponding image points (p) are projections of the common point (p) in the two images 91. In this embodiment, the set of spatial coordinates of the common point (p) is represented as P(x,y,z) in a Cartesian coordinate system, but implementation of representation of the set of spatial coordinates of the common point (p) is not limited to the disclosure herein and may vary in other embodiments. The processor 5 is configured to obtain a point cloud associated with the work piece 3, where the point cloud is formed by multiple common points (p) each correlating to a respective one of the pairs of corresponding image points (p) in the three-dimensional space, and to establish the work piece model 3 based on the set of spatial coordinates of the common points (p) in the point cloud.
[0033] It is worth noting that implementations of calculating the set of spatial coordinates of the common point (p) and of establishing the work piece model 3 (e.g., technologies of computer stereo vision) are well known to one skilled in the relevant art, so a detailed explanation thereof is omitted herein for the sake of brevity.
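The disparity-to-depth calculation described above can be sketched in a few lines. This is a minimal illustration only, assuming an ideal rectified (parallel) stereo rig with focal length `f` in pixels and baseline `B` between the two image capturing devices 41; the function name and parameters are illustrative and not part of the disclosure:

```python
def triangulate(xl, yl, xr, f, B):
    """Recover the spatial coordinates P(x, y, z) of a common point from a
    pair of corresponding image points in two rectified images.

    (xl, yl): image point in the left image (pixels, principal point at origin)
    xr:       column of the matching point in the right image (same row after
              rectification)
    f:        focal length in pixels; B: baseline between the two cameras.
    """
    d = xl - xr                 # disparity of the pair of corresponding points
    if d <= 0:
        raise ValueError("non-positive disparity: pair does not triangulate")
    z = f * B / d               # depth from triangulation
    x = xl * z / f              # back-project the image point into 3-D space
    y = yl * z / f
    return (x, y, z)
```

For example, with `f = 500` pixels and `B = 1.0` (in some metric unit), a pair at `(100, 50)` and `(90, 50)` has disparity 10 and triangulates to the point `(10.0, 5.0, 50.0)`.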
[0034] Referring to
[0035] Referring to
[0036] In step 601, the processor 5 controls each of the two image capturing devices 41 that are spaced apart from each other to capture an image 91 of the work piece 3, and to output image data (O) representing the image 91 of the work piece 3 to the processor 5.
[0037] In step 602, the processor 5 obtains the two images 91 of the work piece 3 from the image data (O) received from the two image capturing devices 41.
[0038] In step 603, the processor 5 compares the two images 91 of the work piece 3 to obtain plural pairs of corresponding image points (p) in the two images 91. It is noted that only one pair of corresponding image points (p) is illustrated in
[0039] In step 604, for each of the pairs of corresponding image points (p), the processor 5 determines a disparity of the pair of corresponding image points (p), and calculates a set of spatial coordinates of a common point (p) corresponding to the pair of corresponding image points (p) in three-dimensional space based on the disparity and according to computer vision triangulation techniques, where the pair of corresponding image points (p) are projections of the common point (p) in the two images 91.
[0040] In step 605, the processor 5 obtains a point cloud associated with the work piece 3, where the point cloud is formed by the common points (p) of the respective pairs of corresponding image points (p) in the three-dimensional space.
[0041] In step 606, the processor 5 establishes the work piece model 3 based on the sets of spatial coordinates of the common points (p) in the point cloud. The flow of procedure then proceeds to the inspection procedure 7.
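Steps 603 to 606 can be sketched as assembling one common point (p) per valid pair of corresponding image points into a point cloud. The following is a minimal sketch under the same rectified-rig assumption, with hypothetical names; a production system would use a calibrated stereo-matching library rather than a hand-rolled loop:

```python
import numpy as np

def build_point_cloud(pairs, f, B):
    """Steps 603-606 in miniature: turn pairs of corresponding image points
    (((xl, yl), (xr, yr)) per pair) into a point cloud of common points.
    Assumes a rectified rig, so the disparity is xl - xr and z = f * B / d."""
    cloud = []
    for (xl, yl), (xr, _yr) in pairs:
        d = xl - xr                      # step 604: disparity of the pair
        if d <= 0:
            continue                     # skip invalid / unmatched pairs
        z = f * B / d                    # triangulated depth
        cloud.append((xl * z / f, yl * z / f, z))
    return np.asarray(cloud)             # step 605: point cloud as an (N, 3) array
```

The resulting array of spatial coordinates would then feed a surface-reconstruction step to establish the work piece model.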
[0042] Referring to
[0043] In step 701, after the system performs optical scanning on the work piece 3 so as to establish the work piece model 3 that is the three-dimensional model of the work piece 3 in the model establishment procedure 6, the processor 5 obtains the cross section 30 of the work piece model 3 along the imaginary cutting plane passing through the imaginary section line (L) on an X-Z plane as shown in
[0044] In step 702, the processor 5 obtains an entry of section data (D) related to the cross section 30 of the work piece model 3.
[0045] In step 703, the processor 5 compares the entry of section data (D) and an entry of predetermined standard data (T) so as to determine whether the entry of section data (D) matches the entry of predetermined standard data (T). When it is determined that the entry of section data (D) matches the entry of predetermined standard data (T), the flow of procedure of this method proceeds to step 704. Otherwise, the flow of procedure proceeds to step 705.
[0046] In step 704, the processor 5 generates the result of inspection indicating that the upper 32 is normal. Based on the result of inspection, the shoe lasting machine 2 is permitted to proceed to procedures of glue spreading and sole attaching.
[0047] In step 705, the processor 5 generates the result of inspection indicating that the upper 32 is abnormal. Furthermore, the processor 5 generates the warning signal (W) to control the warning device 22 to output sound, light, and/or a text/graphics message. Alternatively, the warning signal (W) may be a digital signal to be transmitted to an electronic device for notification of the result of inspection.
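Steps 703 to 705 amount to a range check of the entry of section data (D) against the entry of predetermined standard data (T). A minimal sketch, with illustrative names and an allowable (low, high) range per value:

```python
def inspect(section_entry, standard_entry):
    """Compare an entry of section data D against predetermined standard
    data T, where T is a sequence of (low, high) allowable ranges, one per
    value of D (steps 703-705). Returns "normal" or "abnormal"."""
    for value, (low, high) in zip(section_entry, standard_entry):
        if not (low <= value <= high):
            return "abnormal"   # step 705: would trigger the warning signal W
    return "normal"             # step 704: permitted to proceed to glue spreading

# e.g. geometric-center coordinates checked against their allowable ranges
result = inspect((12.1, 3.4), [(11.5, 12.5), (3.0, 4.0)])
```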
[0048] Referring to
[0049] Referring to
[0050] Referring to
[0051] Referring to
[0052] It is worth noting that the comparison between the entry of section data (D) and the entry of predetermined standard data (T) is not limited to a single aspect, i.e., only one of the geometric center (C), the moment of inertia (I), the slope (M) and the contour 301 being taken into account. In some embodiments, the comparison may be performed in multiple aspects, i.e., two or more of the geometric center (C), the moment of inertia (I), the slope (M) and the contour 301 may be taken into account for inspection of a single upper 32. Alternatively, multiple cross sections respectively on multiple X-Z planes along a Y axis may be extracted from the single work piece model 3 and subjected to their respective comparison processes, such that the result of inspection may be more accurate.
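For illustration, three of the aspects above, the geometric center (C), the moment of inertia (I) and the slope (M), could be computed from a sampled cross-section contour as follows. This is a sketch with hypothetical names; a production system would integrate over the enclosed area rather than averaging sampled contour points:

```python
import numpy as np

def section_metrics(contour, i, j):
    """Illustrative section data for a cross-section contour given as an
    (N, 2) array of X-Z points: the geometric center C, a point-based
    moment of inertia I about C, and the slope M of the line through the
    two predefined contour points with indices i and j."""
    pts = np.asarray(contour, dtype=float)
    c = pts.mean(axis=0)                  # geometric center C of the samples
    inertia = np.sum((pts - c) ** 2)      # second moment of the samples about C
    (x1, z1), (x2, z2) = pts[i], pts[j]
    slope = (z2 - z1) / (x2 - x1)         # slope M between the predefined points
    return c, inertia, slope
```

For a square contour sampled at its four corners, the geometric center is the square's center and the slope between opposite corners is the diagonal's slope; each metric would then be checked against its allowable range as in steps 703 to 705.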
[0053] Referring to
[0054] Referring to
[0055] The light source 43 is configured to emit a structured light beam 431 of a predefined pattern toward the upper 32 to form a projected pattern (P) on the work piece 3. The projected pattern (P) appears to be a distortion of the predefined pattern from viewpoints other than that of the light source 43 due to the three-dimensionally varying surface of the work piece 3. The predefined pattern may consist of multiple stripes, or be a single light stripe, in which case the light source 43 is configured to scan the work piece 3 with the single-stripe structured light beam 431, but implementation of the predefined pattern of the structured light beam 431 is not limited to the disclosure herein and may vary in other embodiments. The light source 43 may include one of a laser projector and a digital light processing (DLP) projector, but implementation is not limited to the disclosure herein and may vary in other embodiments. In this embodiment, the light source 43 includes a DLP projector for emitting the structured light beam 431 with the predefined pattern consisting of multiple stripes and covering the whole work piece 3, and an angle of projection of the structured light beam 431 emitted by the DLP projector is adjustable without having to physically move the light source 43 around.
[0056] The image capturing device 44 is configured to obtain an image 92 of the projected pattern (P) formed on the work piece 3, and to output image data (O) representing the image 92 of the projected pattern (P) to the processor 5 so as to enable the processor 5 to obtain the image 92 of the projected pattern (P) from the image data (O).
[0057] The processor 5 is configured to calculate spatial coordinates of each element (e.g., each stripe) of the projected pattern (P) in the image 92 based on a degree of deformation of the projected pattern (P) in the image 92 (with respect to the non-distorted predefined pattern of the structured light beam 431) and a positional relationship among the light source 43, the image capturing device 44 and the work piece 3, and to establish the work piece model 3 as shown in
[0058] It should be noted that since implementation of establishing the work piece model 3 in the three-dimensional space based on technologies of structured light (e.g., structured light 3D scanning) has been well known to one skilled in the relevant art, detailed explanation of the same is omitted herein for the sake of brevity.
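For the single-stripe case, the deformation-to-depth relation can be illustrated with similar triangles: a lateral shift of the projected stripe in the image is proportional to the height of the surface above a reference plane. The following is a minimal sketch under simplifying assumptions (pinhole camera, small field angles); all names are illustrative and not part of the disclosure:

```python
import math

def stripe_height(pixel_shift, pixel_size, f, standoff, theta_deg):
    """Single-stripe structured-light triangulation in miniature.

    pixel_shift: observed lateral displacement of the stripe in the image (pixels)
    pixel_size:  metric size of one pixel on the sensor
    f:           camera focal length (same metric unit as pixel_size)
    standoff:    nominal distance from the camera to the reference plane
    theta_deg:   angle between the light plane and the camera optical axis
    """
    # Back-project the pixel shift to a metric shift on the reference plane,
    # then convert the shift to height via the projection angle.
    shift_on_object = pixel_shift * pixel_size * standoff / f
    return shift_on_object / math.tan(math.radians(theta_deg))
```

For example, a 10-pixel stripe shift with 0.01 mm pixels, a 50 mm lens, a 500 mm standoff and a 45-degree projection angle corresponds to a height of about 1 mm; repeating this for every stripe element yields the spatial coordinates from which the work piece model is established.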
[0059] Next, a second embodiment of the method according to the disclosure is discussed. The second embodiment of the method is to be implemented by the processor 5 of the second embodiment of the system that is previously described. As shown in
[0060] Referring to
[0061] In step 611, the processor 5 controls the light source 43 to emit the structured light beam 431 onto the upper 32 to form a projected pattern (P) on the work piece 3.
[0062] In step 612, the processor 5 controls the image capturing device 44 to capture an image 92 of the projected pattern (P) formed on the work piece 3, and to output image data (O) representing the image 92 of the projected pattern (P) to the processor 5.
[0063] In step 613, the processor 5 obtains, from the image data (O), the image 92 of the projected pattern (P) formed on the work piece 3.
[0064] In step 614, the processor 5 calculates spatial coordinates of each element of the projected pattern (P) based on a degree of deformation of the projected pattern (P) in the image 92 and the positional relationship among the light source 43, the image capturing device 44 and the work piece 3.
[0065] In step 615, the processor 5 establishes the work piece model 3 based on the spatial coordinates of the elements of the projected pattern (P). Subsequently, the inspection procedure 7 as shown in
[0066] To sum up, the system and the method for inspecting an upper of a shoe according to the disclosure utilize an optical sub-system to capture an image related to a work piece that includes the upper and a last, and utilize a processor to establish a work piece model based on the image, to obtain a cross section of the work piece model, and to generate a result of inspection based on a comparison between an entry of section data related to the cross section and an entry of predetermined standard data. Therefore, when the result of inspection indicates that the upper is abnormal, the position of the upper relative to the last can be adjusted in a timely manner, reducing the failure rate of the step of pulling the upper around the last and improving the quality of shoe production. Accordingly, materials of the upper may be used efficiently. Moreover, inspection of the upper performed by the system and the method according to the disclosure is automatic and more efficient than the conventional manual approach, and may generate a result of inspection that is more consistent and precise than that of the conventional manual approach. Consequently, labor cost is reduced and quality control may be enhanced.
[0067] In the description above, for the purposes of explanation, numerous specific details have been set forth in order to provide a thorough understanding of the embodiments. It will be apparent, however, to one skilled in the art, that one or more other embodiments may be practiced without some of these specific details. It should also be appreciated that reference throughout this specification to one embodiment, an embodiment, an embodiment with an indication of an ordinal number and so forth means that a particular feature, structure, or characteristic may be included in the practice of the disclosure. It should be further appreciated that in the description, various features are sometimes grouped together in a single embodiment, figure, or description thereof for the purpose of streamlining the disclosure and aiding in the understanding of various inventive aspects, and that one or more features or specific details from one embodiment may be practiced together with one or more features or specific details from another embodiment, where appropriate, in the practice of the disclosure.
[0068] While the disclosure has been described in connection with what are considered the exemplary embodiments, it is understood that this disclosure is not limited to the disclosed embodiments but is intended to cover various arrangements included within the spirit and scope of the broadest interpretation so as to encompass all such modifications and equivalent arrangements.