INFORMATION PROCESSING APPARATUS, INFORMATION PROCESSING METHOD, AND STORAGE MEDIUM
20230162427 · 2023-05-25
Inventors
CPC classification
B29C37/00
PERFORMING OPERATIONS; TRANSPORTING
International classification
Abstract
An information processing apparatus includes a shape data acquisition unit configured to acquire shape data indicating a three-dimensional shape of a mold for forming a molded product, a mold release direction acquisition unit configured to acquire a mold release direction in separating the molded product from the mold, a processing parameter acquisition unit configured to acquire a processing parameter for processing to be applied to a surface of the mold, a calculation unit configured to calculate, based on the shape data, the mold release direction and the processing parameter, a difference between a plurality of processing parameter maps each indicating a correspondence between a position on the surface of the mold and the processing parameter, and a notification unit configured to notify information about the difference.
Claims
1. An information processing apparatus comprising: a shape data acquisition unit configured to acquire shape data indicating a three-dimensional shape of a mold for forming a molded product; a mold release direction acquisition unit configured to acquire a mold release direction in separating the molded product from the mold; a processing parameter acquisition unit configured to acquire a processing parameter for processing to be applied to a surface of the mold; a calculation unit configured to calculate, based on the shape data, the mold release direction and the processing parameter, a difference between a plurality of processing parameter maps each indicating a correspondence between a position on the surface of the mold and the processing parameter; and a notification unit configured to notify information about the difference.
2. The information processing apparatus according to claim 1, wherein the plurality of processing parameter maps includes at least a processing parameter map related to surface texture of the molded product.
3. The information processing apparatus according to claim 1, wherein the plurality of processing parameter maps includes a first processing parameter map emphasizing fidelity of surface texture of the molded product, and a second processing parameter map emphasizing continuity of the surface texture of the molded product, and wherein the calculation unit calculates a difference between the first processing parameter map and the second processing parameter map.
4. The information processing apparatus according to claim 1, wherein the notification unit notifies the information, by displaying information indicating a region where the difference is not substantially zero in the shape data.
5. The information processing apparatus according to claim 1, wherein the notification unit notifies the information, by displaying information indicating color having intensity corresponding to amount of the difference in the shape data.
6. The information processing apparatus according to claim 1, wherein the calculation unit further acquires an adjustment parameter related to the processing parameter map based on the information about the difference, and regenerates the processing parameter map based on the adjustment parameter.
7. The information processing apparatus according to claim 6, wherein the adjustment parameter is a parameter for adjusting a balance between the plurality of processing parameter maps.
8. The information processing apparatus according to claim 1, further comprising a display unit configured to display a rendering image corresponding to each of the plurality of processing parameter maps.
9. The information processing apparatus according to claim 1, wherein the mold is configured to form minute irregularities on a surface of the molded product.
10. The information processing apparatus according to claim 1, wherein the mold release direction acquisition unit acquires a three-dimensional vector indicating the mold release direction, as the mold release direction.
11. An information processing method comprising: acquiring shape data indicating a three-dimensional shape of a mold for forming a molded product; acquiring a mold release direction in separating the molded product from the mold; acquiring a processing parameter for processing to be applied to a surface of the mold; calculating, based on the shape data, the mold release direction and the processing parameter, a difference between a plurality of processing parameter maps each indicating a correspondence between a position on the surface of the mold and the processing parameter; and notifying information about the difference.
12. A non-transitory computer-readable storage medium storing instructions that, when executed by a computer, cause the computer to perform an information processing method, the information processing method comprising: acquiring shape data indicating a three-dimensional shape of a mold for forming a molded product; acquiring a mold release direction in separating the molded product from the mold; acquiring a processing parameter for processing to be applied to a surface of the mold; calculating, based on the shape data, the mold release direction and the processing parameter, a difference between a plurality of processing parameter maps each indicating a correspondence between a position on the surface of the mold and the processing parameter; and notifying information about the difference.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
DESCRIPTION OF THE EMBODIMENTS
[0025] Modes (exemplary embodiments) for carrying out the present disclosure will be described with reference to the drawings. Not all combinations of the features described in the exemplary embodiments are necessarily essential to the solution of the present disclosure. In the description, identical configurations are assigned the same reference numerals.
[0026] A first exemplary embodiment of the present disclosure will be described.
[0029] As illustrated in
[0030] Meanwhile, as illustrated in
[0031] As illustrated in
[0032] The uneven structure described above can be formed on the surface of a molded product by processing (micro processing) the surface of a mold to form inverted irregularities, using a processing machine such as a cutting machine or a laser beam machine. For example, a depressed portion formed on the surface of the mold by processing (micro processing) is transferred to the resin as the projected portion 321 or the projected portion 331 of the molded product, so that the depth of the depressed portion formed on the surface of the mold determines the height of the projected portion 321 or the projected portion 331 on the molded product. In the present exemplary embodiment, a processing diameter r, a processing depth d, and a processing density p are each used as a parameter (hereinafter referred to as a “processing control parameter”) for controlling processing (micro processing) by the processing machine.
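As an illustration only (not part of the disclosed apparatus), the three processing control parameters can be grouped in a simple structure; the class and field names below are hypothetical:

```python
from dataclasses import dataclass

@dataclass
class ProcessingControlParams:
    # processing diameter r of each depressed portion on the mold
    diameter_r: float
    # processing depth d; transferred as the projection height on the molded product
    depth_d: float
    # processing density p (depressed portions per unit area)
    density_p: float

params = ProcessingControlParams(diameter_r=0.05, depth_d=0.02, density_p=100.0)
```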
[0034] In
[0035] When the mold is actually fabricated, data (hereinafter referred to as a “processing pattern”) indicating the correspondence between a position on the surface of the mold and the processing depth d is generated, based on the shape data of the surface of the mold before the uneven structure is formed by processing (micro processing), and the processing density p. This processing pattern is input to a computer aided manufacturing (CAM) system, which converts it into a processing program such as numerical control (NC) data; the processing program is then sent to a computer numerical control (CNC) processing machine, which executes the processing.
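To make the notion of a processing pattern concrete, the following sketch lays out depressed portions on a uniform grid whose pitch is derived from the processing density p. The function name and the grid layout are assumptions for illustration only, not the CAM input format described above:

```python
import math

def make_processing_pattern(width_mm, height_mm, depth_d, density_p):
    """Lay out depressed-portion centres on a uniform grid whose pitch
    follows the processing density p (features per mm^2); each entry is
    a (u, v, depth) triple associating a position with a depth d."""
    pitch = 1.0 / math.sqrt(density_p)      # grid spacing in mm
    pattern = []
    y = pitch / 2.0
    while y < height_mm:
        x = pitch / 2.0
        while x < width_mm:
            pattern.append((x, y, depth_d))
            x += pitch
        y += pitch
    return pattern

# 1 mm x 1 mm patch at 100 features/mm^2 -> a 10 x 10 grid of centres
pts = make_processing_pattern(1.0, 1.0, 0.02, 100.0)
```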
[0036] In general, in a case where an uneven structure is provided on the surface of a molded product, the mold release resistance tends to increase. Accordingly, giving texture to the surface of the molded product can make the mold release difficult. For example, in a case where the mold is moved in a direction indicated by an arrow E in
[0038] Points P1 to P6 on the surface of the mold in
[0040] In
[0041] In
[0042] Two-dimensional image data in which the processing depth d is recorded in association with the position on the surface of the mold will be referred to as “processing parameter map”. In the present exemplary embodiment, a processing parameter map emphasizing each item is generated for each evaluation item in the surface texture of the molded product, and a region where a trade-off occurs (i.e., a trade-off region), which is a portion to be adjusted, is notified based on the difference between processing parameter maps.
<Hardware Configuration>
[0043] A hardware configuration of the information processing system 10 including the information processing apparatus 100 according to the present exemplary embodiment will be described with reference to
[0044] As illustrated in
[0045] As illustrated in
[0046] The CPU 101 executes an operating system (OS) and various programs stored in devices such as the ROM 103 and the external storage device 200, using the RAM 102 as a work memory.
[0047] The OS and various programs may be stored in an internal storage device. The CPU 101 controls each hardware configuration via the system bus 107. Program code stored in the ROM 103, the external storage device 200, or the like is loaded into the RAM 102 and executed by the CPU 101, whereby the processing in the flowcharts described below is performed.
[0048] The external storage device 200 is connected to the SATA I/F 104 via the serial bus 510. The external storage device 200 is a hard disk drive (HDD) or a solid state drive (SSD).
[0049] The display 300 is connected to the VC 105 via the serial bus 520.
[0050] The input device 400 including a mouse and a keyboard is connected to the general-purpose I/F 106 via the serial bus 530.
[0051] The CPU 101 displays a graphical user interface (GUI) provided by a program on the display 300 via the VC 105, and receives input information representing a user instruction obtained via the input device 400.
[0052] The information processing apparatus 100 is, for example, implemented by a desktop personal computer (PC). Alternatively, the information processing apparatus 100 may be implemented by a notebook PC or tablet PC integrated with the display 300.
[0053] The external storage device 200 can be implemented by a medium (a storage medium) and an external storage drive for accessing this medium. A flexible disk (FD), a compact disc read only memory (CD-ROM), a digital versatile disc (DVD), a Universal Serial Bus (USB) memory, a magneto-optical disc (MO), or a flash memory can be used for the medium.
<Logical Configuration>
[0054] A logical configuration of the information processing apparatus 100 according to the present exemplary embodiment will be described with reference to
[0055] The CPU 101 illustrated in
[0056] As illustrated in
[0057] The shape data acquisition unit 110 is, for example, a shape data acquisition unit configured to acquire shape data indicating a three-dimensional shape of a mold for forming a molded product, from the ROM 103, the external storage device 200, or the like, based on a user instruction input via the input device 400. Specifically, the shape data in the present exemplary embodiment is polygon data in which the surface shape of the mold before the uneven structure is formed by processing (micro processing) is expressed by a group of a plurality of planes. In other words, the shape data represents the shapes of the planes of the mold that are in contact with the resin for forming the molded product before texture is given. The shape data consists of a list of the three-dimensional xyz coordinates of the vertexes forming the plurality of planes, and the two-dimensional uv coordinates (so-called texture coordinates) corresponding to the three-dimensional xyz coordinates.
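A minimal sketch of such shape data, assuming simple Python containers (the class and field names are hypothetical): each vertex carries xyz coordinates and matching uv texture coordinates, and each element plane is a triple of vertex indices:

```python
from dataclasses import dataclass

@dataclass
class MoldShapeData:
    # one entry per vertex: 3-D position on the mold surface
    xyz: list   # [(x, y, z), ...]
    # matching texture coordinates, used to index the processing parameter map
    uv: list    # [(u, v), ...]
    # triangular element planes expressed as triples of vertex indices
    faces: list # [(i0, i1, i2), ...]

mesh = MoldShapeData(
    xyz=[(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0)],
    uv=[(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)],
    faces=[(0, 1, 2)],
)
```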
[0060] The shape data acquired by the shape data acquisition unit 110 is transmitted to the calculation unit 140 and the notification unit 150.
[0061] The mold release direction acquisition unit 120 is, for example, a mold release direction acquisition unit configured to acquire a mold release direction in separating a molded product from a mold based on a user instruction input via the input device 400. Specifically, in the present exemplary embodiment, the mold release direction acquisition unit 120 acquires a three-dimensional vector (a mold release direction vector) indicating a mold release direction, as the mold release direction. The mold release direction vector acquired by the mold release direction acquisition unit 120 is transmitted to the calculation unit 140.
[0062] The processing parameter acquisition unit 130 is, for example, a processing parameter acquisition unit that acquires a processing parameter for processing (micro processing) to be applied to the surface of a mold based on a user instruction input via the input device 400. Specifically, in the present exemplary embodiment, the processing parameter acquisition unit 130 acquires each of the above-described processing control parameters and a processing upper limit look-up table (LUT) to be described below, as the processing parameter. The processing control parameters include the processing depth d for reproducing desired surface texture (i.e., a target processing depth). The processing parameter acquisition unit 130 acquires a LUT (a processing upper limit LUT) indicating the correspondence between a draft and an upper limit value of the processing depth d enabling the mold release, from the ROM 103, the external storage device 200, or the like. Here, the draft is an angle representing an inclination of the surface of the mold with respect to the mold release direction, and is illustrated as an angle φ in
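The processing upper limit LUT can be pictured as a table of (draft, depth upper limit) pairs that is queried for an arbitrary draft. The sketch below is one plausible realization; linear interpolation between LUT entries and the particular table values are assumptions for illustration, not something the present disclosure specifies:

```python
import bisect

def depth_upper_limit(lut, draft_deg):
    """Return the upper limit of the processing depth d enabling mold
    release for a given draft, linearly interpolating between entries.
    `lut` is a list of (draft_deg, d_limit) pairs sorted by draft."""
    angles = [a for a, _ in lut]
    if draft_deg <= angles[0]:
        return lut[0][1]
    if draft_deg >= angles[-1]:
        return lut[-1][1]
    i = bisect.bisect_right(angles, draft_deg)
    (a0, d0), (a1, d1) = lut[i - 1], lut[i]
    t = (draft_deg - a0) / (a1 - a0)
    return d0 + t * (d1 - d0)

# hypothetical table: steeper walls (small draft) allow only shallow processing
lut = [(0.0, 0.0), (1.0, 0.01), (3.0, 0.03), (5.0, 0.05)]
```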
[0063] The calculation unit 140 is a calculation unit configured to generate a plurality of processing parameter maps each indicating the correspondence between the position on the surface of the mold and the processing parameter, based on the received shape data, mold release direction vector, and processing parameter, and to calculate the difference between the plurality of processing parameter maps. Specifically, in the present exemplary embodiment, the plurality of processing parameter maps includes at least a processing parameter map related to the surface texture of the molded product. To be more specific, in the present exemplary embodiment, the plurality of processing parameter maps includes a first processing parameter map emphasizing the fidelity of the surface texture of the molded product and a second processing parameter map emphasizing the continuity of the surface texture of the molded product. In this case, the calculation unit 140 may be configured to calculate the difference between the first processing parameter map emphasizing the fidelity of the surface texture of the molded product and the second processing parameter map emphasizing the continuity of the surface texture of the molded product. The difference between the plurality of processing parameter maps calculated by the calculation unit 140 is transmitted to the notification unit 150.
[0064] The notification unit 150 is a notification unit configured to notify information about the difference between the plurality of processing parameter maps transmitted from the calculation unit 140. Specifically, in the present exemplary embodiment, the notification unit 150 provides the notification by displaying a trade-off region, which is a portion to be adjusted, on the display 300, based on the difference between the plurality of processing parameter maps.
<Processing to Be Executed>
[0066] First, in step S101 in
[0068] In a GUI 900 illustrated in
[0069] In step S102 in
[0070] In step S103 in
[0071] In step S104, the calculation unit 140 generates a processing parameter map emphasizing the continuity of the surface texture of the molded product, using the various kinds of information acquired in step S101 to step S103. Specifically, the calculation unit 140 generates the processing parameter map emphasizing the continuity, using the shape data acquired in step S101, the processing upper limit LUT acquired in step S102, and the processing parameter map emphasizing the fidelity generated in step S103. The processing parameter map emphasizing the continuity generated in step S104 corresponds to the “second processing parameter map”. In the present exemplary embodiment, the processing parameter map emphasizing the continuity, in which the processing depth changes gradually, is generated by sequentially securing region widths, starting from a region with a shallow processing depth on the surface of the mold, based on the processing parameter map emphasizing the fidelity generated in step S103. The detailed processing procedure of step S104 will be described below with reference to
[0072] In step S105, the calculation unit 140 calculates a difference map representing the difference between the processing parameter map emphasizing the fidelity generated in step S103 and the processing parameter map emphasizing the continuity generated in step S104. Specifically, the calculation unit 140 calculates a pixel value Δf(p(ij)) of the difference map, based on the following equation (1), for all pixels.
Δf(p(ij))=f1(p(ij))−f2(p(ij)) (1)
In the equation (1), p(ij) represents an ij-th pixel in the map. In the equation (1), f1(p(ij)) represents the pixel value of a pixel p(ij) in the processing parameter map emphasizing the fidelity (i.e., the processing depth emphasizing the fidelity). In the equation (1), f2(p(ij)) represents the pixel value of a pixel p(ij) in the processing parameter map emphasizing the continuity (i.e., the processing depth emphasizing the continuity).
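Equation (1) amounts to a per-pixel subtraction of the two maps. The sketch below also extracts the pixels where Δf is not substantially zero, anticipating the trade-off region used in step S106; the list-of-lists representation and the eps threshold are illustrative assumptions:

```python
def difference_map(f1, f2):
    """Per-pixel difference Δf = f1 - f2 between the map emphasizing
    the fidelity (f1) and the map emphasizing the continuity (f2),
    as in equation (1)."""
    return [[a - b for a, b in zip(r1, r2)] for r1, r2 in zip(f1, f2)]

def trade_off_pixels(diff, eps=1e-6):
    """Pixels where |Δf| is not substantially zero, i.e. where the
    fidelity and the continuity are not compatible with each other."""
    return [(i, j) for i, row in enumerate(diff)
                   for j, v in enumerate(row) if abs(v) > eps]

f1 = [[0.02, 0.02], [0.02, 0.02]]   # target depths (fidelity emphasized)
f2 = [[0.02, 0.01], [0.02, 0.02]]   # depths after continuity adjustment
d = difference_map(f1, f2)
```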
[0073] In step S106, the notification unit 150 displays a trade-off region, which is a portion of the shape data to be adjusted, on the display 300, based on the shape data acquired in step S101 and the difference map calculated in step S105. A region where Δf(p(ij)) in the above-described equation (1) is not zero (including a region that can be regarded as not substantially zero) is a region where the processing depth emphasizing the fidelity and the processing depth emphasizing the continuity differ from each other, and can therefore be regarded as a region where the fidelity and the continuity are not compatible with each other. For this reason, in the present exemplary embodiment, the difference map is texture-mapped onto the surface of the mold expressed by the shape data, and the result is rendered to generate an image indicating the trade-off region. The trade-off region is notified by displaying this image on a GUI. Known computer graphics techniques may be used for the texture mapping and the generation of the rendering image.
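One way to picture turning the difference map into a displayable notification is to map |Δf| to a color intensity, so that larger disagreement renders as a more saturated color on the mold surface; the normalization scheme below is an assumption for illustration:

```python
def trade_off_overlay(diff, max_abs=None):
    """Convert a difference map into 8-bit intensities in [0, 255]:
    larger |Δf| -> more saturated color when texture-mapped onto the
    mold surface. `max_abs` optionally fixes the normalization peak."""
    flat = [abs(v) for row in diff for v in row]
    peak = max_abs if max_abs is not None else (max(flat) or 1.0)
    return [[int(round(255 * min(abs(v), peak) / peak)) for v in row]
            for row in diff]

overlay = trade_off_overlay([[0.0, 0.01], [0.005, 0.0]])
```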
[0074] The process in step S106 will be described with reference to
[0075] In the GUI 900 illustrated in
[0076] Upon completion of the process in step S106, the processing in the flowchart illustrated in
<Detailed Processing Procedure in Step S103 in FIG. 8>
[0077] The detailed processing procedure of the processing for generating the processing parameter map emphasizing the fidelity in step S103 in
[0079] First, in step S201 in
[0080] In step S202, the calculation unit 140 determines whether there is an element plane P.sub.T1P.sub.T2P.sub.T3 including a pixel p(ij) as illustrated in
[0081] As a result of the determination in step S202, in a case where there is an element plane including the pixel p(ij) (YES in step S202), the processing proceeds to step S203.
[0082] In step S203, the calculation unit 140 calculates a normal direction vector N to the element plane, by acquiring vertex coordinates (i.e., the coordinates of the points P.sub.T1, P.sub.T2, and P.sub.T3 in
[0083] In step S204, the calculation unit 140 calculates the draft φ described above with reference to
[0084] In step S205, the calculation unit 140 acquires a processing depth upper limit d.sub.limit corresponding to the draft φ calculated in step S204, with reference to the processing upper limit LUT sent from the processing parameter acquisition unit 130.
[0085] In step S206, the calculation unit 140 determines whether the processing depth upper limit d.sub.limit acquired in step S205 is smaller than the target processing depth d.sub.target transmitted from the processing parameter acquisition unit 130.
[0086] As a result of the determination in step S206, in a case where the processing depth upper limit d.sub.limit is smaller than the target processing depth d.sub.target (i.e., the mold release is difficult in a case where processing is performed using a depth desirable for reproduction of desired texture) (YES in step S206), the processing proceeds to step S207.
[0087] In step S207, the calculation unit 140 records the processing depth upper limit d.sub.limit for the pixel p(ij), as the pixel value of the pixel p(ij).
[0088] As a result of the determination in step S206, in a case where the processing depth upper limit d.sub.limit is not smaller than the target processing depth d.sub.target (NO in step S206), the processing proceeds to step S208.
[0089] In step S208, the calculation unit 140 records the target processing depth d.sub.target for the pixel p(ij), as the pixel value of the pixel p(ij).
[0090] In a case where the process in step S208 is completed, in a case where the process in step S207 is completed, or in a case where it is determined that there is no element plane including the pixel p(ij) in step S202 (NO in step S202), the processing proceeds to step S209.
[0091] In step S209, the calculation unit 140 increases the index ij by 1.
[0092] In step S210, the calculation unit 140 determines whether the index ij is larger than or equal to a total pixel number NUM.sub.pix of the processing parameter map. As a result of this determination, in a case where the index ij is smaller than the total pixel number NUM.sub.pix of the processing parameter map (NO in step S210), the processing returns to step S202 to perform the processes in step S202 and in steps after step S202 again.
[0093] As a result of the determination in step S210, in a case where the index ij is larger than or equal to the total pixel number NUM.sub.pix of the processing parameter map (YES in step S210), the processing in the flowchart illustrated in
[0094] The processing in step S201 to step S210 in
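The per-pixel loop of steps S201 to S210 can be sketched as follows, under two illustrative assumptions: the draft is taken as 90 degrees minus the angle between the element-plane normal and the mold release direction vector, and each pixel is supplied directly with its plane normal (the helper names are hypothetical):

```python
import math

def draft_angle_deg(normal, release):
    """Draft: inclination of the element plane with respect to the mold
    release direction. One plausible formulation: 90 degrees minus the
    angle between the plane normal and the release vector."""
    dot = sum(n * r for n, r in zip(normal, release))
    nn = math.sqrt(sum(n * n for n in normal))
    nr = math.sqrt(sum(r * r for r in release))
    theta = math.degrees(math.acos(max(-1.0, min(1.0, dot / (nn * nr)))))
    return 90.0 - theta

def fidelity_map(pixels, d_target, limit_fn):
    """Steps S201-S210 per pixel: record d_limit when it is smaller than
    the target depth (S207), otherwise the target depth (S208); pixels
    with no element plane stay unset (S202: NO)."""
    out = []
    for normal, release in pixels:
        if normal is None:
            out.append(None)
            continue
        d_limit = limit_fn(draft_angle_deg(normal, release))
        out.append(min(d_target, d_limit))
    return out

limit_lut = lambda a: 0.001 if a < 1.0 else 0.05   # toy stand-in for the LUT
m = fidelity_map(
    [((1.0, 0.0, 0.0), (0.0, 0.0, 1.0)),   # vertical wall: draft ~0 -> clamped
     ((0.0, 0.0, 1.0), (0.0, 0.0, 1.0)),   # face normal to release: large draft
     (None, (0.0, 0.0, 1.0))],             # pixel outside every element plane
    0.02, limit_lut)
```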
<Detailed Processing Procedure in Step S104 in FIG. 8>
[0095] Next, the detailed processing procedure of the processing for generating the processing parameter map emphasizing the continuity in step S104 in
[0097] In step S301 in
[0098] In step S302, the calculation unit 140 sets an index n indicating the processing depth d to NUM.sub.step. NUM.sub.step is the number of steps of the processing upper limit LUT. An n-th deepest processing depth in the processing upper limit LUT will be hereinafter referred to as “d.sub.n”.
[0099] In step S303, the calculation unit 140 sets the index ij indicating the target pixel to 0.
[0100] In step S304, the calculation unit 140 determines whether the processing depth is d.sub.n, for a point P on the surface of the mold corresponding to the pixel p(ij). Specifically, in step S304, the calculation unit 140 determines whether an element plane including the pixel p(ij) is present and f2(p(ij))=d.sub.n is satisfied.
[0101] As a result of the determination in step S304, in a case where the processing depth is d.sub.n for the point P on the surface of the mold corresponding to the pixel p(ij) (YES in step S304), the processing proceeds to step S305.
[0102] In step S305, the calculation unit 140 calculates the xyz coordinates of the point P on the surface of the mold corresponding to the pixel p(ij) based on the shape data. In the process, the xyz coordinates of the point P can be calculated by interpolation using the xyz coordinates of the vertexes of the element plane including the pixel p(ij).
[0103] In step S306 to step S316, the processing depth is checked for each point Q within a predetermined distance L from the point P on the surface of the mold, and in a case where the checked processing depth is deeper than that of the point P, the processing depth of the point Q is changed to a processing depth d.sub.n-1 that is deeper than the processing depth d.sub.n of the point P by one step. A region having the processing depth d.sub.n-1 and a width of L or more is thereby secured around the region having the processing depth d.sub.n. The value of L is determined by, for example, generating samples beforehand in which the processing depth d is changed gradually using various region widths, and determining, through a subjective evaluation experiment, a region width at which a gap in texture is not perceived.
[0104] Specifically, in step S306, the calculation unit 140 adds the element plane including the pixel p(ij) to a processing wait list.
[0105] In step S307, the calculation unit 140 determines whether there is an unprocessed element plane in the processing wait list.
[0106] As a result of the determination in step S307, in a case where there is an unprocessed element plane in the processing wait list (YES in step S307), the processing proceeds to step S308.
[0107] In step S308, the calculation unit 140 extracts one unprocessed element plane T from the processing wait list.
[0108] In step S309, the calculation unit 140 sets an index ij′ indicating a processing target pixel to 0.
[0109] In step S310, the calculation unit 140 determines whether a pixel q(ij′) is included in the element plane T extracted in step S308 and whether f2(q(ij′))>d.sub.n is satisfied.
[0110] As a result of the determination in step S310, in a case where the pixel q(ij′) is included in the element plane T extracted in step S308 and f2(q(ij′))>d.sub.n is satisfied (YES in step S310), the processing proceeds to step S311.
[0111] In step S311, the calculation unit 140 calculates the xyz coordinates of the point Q on the surface of the mold corresponding to the pixel q(ij′) based on the shape data.
[0112] In step S312, first, the calculation unit 140 calculates the distance between the points P and Q in the xyz coordinate space, using the xyz coordinates of the point P calculated in step S305 and the xyz coordinates of the point Q calculated in step S311. Subsequently, the calculation unit 140 determines whether the calculated distance between the points P and Q in the xyz coordinate space is smaller than or equal to the distance L determined beforehand.
[0113] As a result of the determination in step S312, in a case where the calculated distance between the points P and Q in the xyz coordinate space is smaller than or equal to the distance L determined beforehand (YES in step S312), the processing proceeds to step S313.
[0114] In step S313, the calculation unit 140 records the processing depth d.sub.n-1 that is deeper than the processing depth d.sub.n by 1 step, as the pixel value of the pixel q(ij′) in the processing parameter map emphasizing the continuity.
[0115] In a case where the process in step S313 is completed, in a case where the result of the determination in step S310 is negative (NO in step S310), or in a case where the result of the determination in step S312 is negative (NO in step S312), the processing proceeds to step S314.
[0116] In step S314, the calculation unit 140 increases the index ij′ of the processing target pixel by 1.
[0117] In step S315, the calculation unit 140 determines whether the index ij′ is larger than or equal to the total pixel number NUM.sub.pix of the processing parameter map. As a result of the determination, in a case where the index ij′ is smaller than the total pixel number NUM.sub.pix of the processing parameter map (NO in step S315), the processing returns to step S310 to perform the processes in step S310 and in steps after step S310 again.
[0118] As a result of the determination in step S315, in a case where the index ij′ is larger than or equal to the total pixel number NUM.sub.pix of the processing parameter map (YES in step S315), the processing proceeds to step S316.
[0119] In step S316, the calculation unit 140 determines an element plane adjacent to the element plane T extracted in step S308 with reference to the shape data, and adds the determined element plane to the processing wait list. Upon completion of the process in step S316, the processing returns to step S307 to perform the processes in step S307 and in steps after step S307 again.
[0120] In a case where the determination in step S304 is negative (NO in step S304) or in a case where the determination in step S307 is negative (NO in step S307), the processing proceeds to step S317.
[0121] In step S317, the calculation unit 140 increases the index ij of the target pixel by 1.
[0122] In step S318, the calculation unit 140 determines whether the index ij is larger than or equal to the total pixel number NUM.sub.pix of the processing parameter map. As a result of the determination, in a case where the index ij is smaller than the total pixel number NUM.sub.pix of the processing parameter map (NO in step S318), the processing returns to step S304 to perform the processes in step S304 and in steps after step S304 again.
[0123] As a result of the determination in step S318, in a case where the index ij is larger than or equal to the total pixel number NUM.sub.pix of the processing parameter map (YES in step S318), the processing proceeds to step S319.
[0124] In step S319, the calculation unit 140 decreases the value of the index n indicating the processing depth by 1.
[0125] In step S320, the calculation unit 140 determines whether the index n is 1 or less. As a result of this determination, in a case where the index n is larger than 1 (NO in step S320), the processing returns to step S303 to perform the processes in step S303 and in steps after step S303 again.
[0126] As a result of the determination in step S320, in a case where the index n is 1 or less (YES in step S320), the processing in the flowchart illustrated in
[0127] The processing in step S301 to step S320 in
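A one-dimensional sketch of the region-widening idea in steps S301 to S320: starting from the shallowest step, every deeper point within the distance L of a point at depth d.sub.n is clamped to the next-deeper step d.sub.n-1, so the depth ramps gradually instead of jumping. The 1-D geometry and the function signature are simplifications for illustration:

```python
def continuity_map(depths, steps, positions, L):
    """Steps S301-S320 in one dimension. `steps` lists the LUT depth
    steps deepest first (steps[0] = d_1); `depths` is the fidelity map
    restricted to a line of points at the given positions."""
    out = list(depths)
    for n in range(len(steps) - 1, 0, -1):      # n = NUM_step .. 2 (S302, S319)
        d_n, d_n1 = steps[n], steps[n - 1]      # current step and next-deeper step
        for i, dp in enumerate(out):
            if dp != d_n:                       # S304: only points at depth d_n
                continue
            for j, dq in enumerate(out):
                # S310-S313: deeper neighbor within distance L -> clamp to d_{n-1}
                if dq > d_n and abs(positions[j] - positions[i]) <= L:
                    out[j] = d_n1
    return out

steps = [0.03, 0.02, 0.01]                      # depth steps, deepest first
result = continuity_map([0.01, 0.03, 0.03, 0.03, 0.03],
                        steps, [0.0, 1.0, 2.0, 3.0, 4.0], 1.0)
```

After the run, the abrupt jump from 0.01 to 0.03 has been replaced by an intermediate band at 0.02 with width L, which is exactly the gradual change the continuity-emphasizing map aims for.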
[0128] In the information processing apparatus 100 according to the first exemplary embodiment described above, the shape data acquisition unit 110 acquires the shape data indicating the three-dimensional shape of the mold for forming the molded product.
[0129] The mold release direction acquisition unit 120 acquires the mold release direction in separating the molded product from the mold, and the processing parameter acquisition unit 130 acquires the processing parameter for processing (micro processing) to be applied to the surface of the mold. The calculation unit 140 generates the plurality of processing parameter maps each indicating the correspondence between the position on the surface of the mold and the processing parameter based on the shape data and the mold release direction described above, and calculates the difference between the plurality of processing parameter maps. The notification unit 150 notifies the information about the difference between the plurality of processing parameter maps calculated by the calculation unit 140.
[0130] According to such a configuration, it is possible to easily recognize the trade-off region to be adjusted, which is generated when desired texture is given to the surface of the molded product (e.g., the uneven structure is provided), when the molded product is formed using the mold.
[0131] In the present exemplary embodiment, the example in which the user inputs only one direction as the mold release direction is described. However, for a mold composed of a plurality of pieces varying in opening direction, the mold release direction may be input for each of the pieces.
[0132] In the present exemplary embodiment, the example in which the processing depth d is recorded in the processing parameter map is described. However, another processing control parameter, such as the processing diameter, or a combination of the values of a plurality of processing control parameters may be recorded.
[0133] In the present exemplary embodiment, the example of the case where the fidelity and the continuity of the surface texture of the molded product are emphasized is described, but the item to be emphasized may be another evaluation item related to surface texture. For example, the present exemplary embodiment is also applicable to a case where an item whose evaluation increases as the height difference of the uneven structure increases, such as matteness or brightness, is used in place of the fidelity.
[0134] A resin material or a molding condition to be used may be input in step S101 in
[0135] A second exemplary embodiment of the present disclosure will be described. In the description of the second exemplary embodiment, a description of matters common to the above-described first exemplary embodiment will be omitted, and a matter different from the above-described first exemplary embodiment will be mainly described.
[0136] In the above-described first exemplary embodiment, the mode is described in which the processing parameter map emphasizing each item is generated for each evaluation item, and the trade-off region to be adjusted is notified based on the difference between the generated processing parameter maps. In contrast, in the second exemplary embodiment, a mode will be described in which, in a case where a trade-off is determined to occur based on the difference between processing parameter maps, a processing parameter map is regenerated by acquiring an adjustment parameter from a user.
[0137] A hardware configuration of an information processing system 10 including an information processing apparatus 100 according to the second exemplary embodiment is similar to the hardware configuration of the information processing system 10 including the information processing apparatus 100 according to the first exemplary embodiment illustrated in
[0138] In addition to having the function described in the above-described first exemplary embodiment, a calculation unit 140 in the second exemplary embodiment accepts an instruction from a user depending on information about an obtained difference, and regenerates a processing parameter map based on an adjustment parameter obtained via an input device 400. Further, the calculation unit 140 in the second exemplary embodiment generates a processing pattern based on shape data and a processing parameter map.
<Processing to Be Executed>
[0139]
[0140] In the flowchart illustrated in
[0141] In step S407, the calculation unit 140 determines whether the difference map calculated in step S405 includes a trade-off region, i.e., a region in which a pixel value is not zero (a pixel value that can be regarded as substantially zero is treated as zero).
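The determination in step S407 can be sketched as follows. The tolerance used to regard a pixel value as substantially zero is an assumed parameter, not specified in the disclosure:

```python
import numpy as np

def has_tradeoff(diff_map, tol=1e-6):
    # A pixel whose absolute difference can be regarded as substantially
    # zero (|value| <= tol) is treated as zero; any remaining non-zero
    # pixel marks a trade-off region.
    return bool(np.any(np.abs(diff_map) > tol))
```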
[0142] As a result of the determination in step S407, in a case where there is a trade-off region (YES in step S407), the processing proceeds to step S408.
[0143] In step S408, the calculation unit 140 acquires an adjustment parameter by accepting a user instruction via the GUI 900. In the second exemplary embodiment, a coefficient indicating a balance between emphasis on fidelity and emphasis on continuity is acquired via a slider 909 illustrated in
[0144] In step S409, the calculation unit 140 regenerates a processing parameter map based on the adjustment parameter acquired in step S408. Specifically, the calculation unit 140 executes the above-described processing in step S301 to step S320 using a value obtained by multiplying the value of the distance L by the coefficient α (i.e., a value αL replacing the distance L), and generates the processing parameter map in which the region width is adjusted based on the coefficient α. The adjusted processing parameter map is identical to the processing parameter map emphasizing the fidelity in the case of α=0, and is identical to the processing parameter map emphasizing the continuity in the case of α=1. An example in which the processing parameter map after the adjustment is displayed on the GUI 900 is illustrated in
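The region-width adjustment by the coefficient α can be illustrated as follows, under the assumption (made here for illustration only) that the continuity-emphasizing map reduces the processing depth within the distance L from a region boundary, so that replacing L by αL scales the width of the reduced band:

```python
import numpy as np

def regenerate_map(dist_to_edge, depth_full, depth_reduced, L, alpha):
    # Replace the distance threshold L by alpha*L: pixels closer to the
    # boundary than alpha*L receive the reduced (continuity-friendly)
    # depth, all others the full (fidelity-friendly) depth.
    # alpha=0 reproduces the fidelity-emphasizing map; alpha=1 reproduces
    # the continuity-emphasizing map.
    return np.where(np.asarray(dist_to_edge) < alpha * L, depth_reduced, depth_full)
```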
[0145] In a case where the process in step S409 is completed or in a case where it is determined that there is no trade-off region in step S407 (NO in step S407), the processing proceeds to step S410.
[0146] In step S410, the calculation unit 140 generates the above-described processing pattern based on the shape data and the processing parameter map, and stores the generated processing pattern into an external storage device 200 or the like.
[0147] Upon completion of the process in step S410, the processing in the flowchart illustrated in
[0148] The processing pattern generated in step S410 may be output to a CAM system connected via a network, and processing by a processing machine may be executed based on the processing pattern. In a case where it is determined in step S407 that there is a trade-off region (YES in step S407), the processing parameter map after the adjustment regenerated in step S409 is used for the generation of the processing pattern. In a case where it is determined that there is no trade-off region in step S407 (NO in step S407), for example, the processing parameter map emphasizing the fidelity generated in step S403 is used. In a case where there is no trade-off region, the processing parameter map emphasizing the fidelity and the processing parameter map emphasizing the continuity are identical to each other.
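The selection of the map used for generating the processing pattern, described above, reduces to a simple conditional (a sketch with assumed names):

```python
def select_map(tradeoff_found, adjusted_map, fidelity_map):
    # When a trade-off region exists, the map regenerated in step S409 is
    # used; otherwise the fidelity- and continuity-emphasizing maps are
    # identical, and the fidelity-emphasizing map (step S403) suffices.
    return adjusted_map if tradeoff_found else fidelity_map
```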
[0149] In the second exemplary embodiment, the calculation unit 140 generates an image 730 illustrated in
[0150] The details will be described with reference to
[0151] In the information processing apparatus 100 according to the second exemplary embodiment described above, the calculation unit 140 acquires the adjustment parameter related to the processing parameter map based on the above-described information about the difference, and regenerates the processing parameter map based on the adjustment parameter.
[0152] According to the second exemplary embodiment, in addition to having the effect in the above-described first exemplary embodiment, it is possible to adjust the processing control parameter easily in a case where the trade-off occurs between the evaluation items.
[0153] In the present exemplary embodiment, the example in which the entire processing parameter map is regenerated in step S409 in
[0154] In the case of the above-described mode in which only the partial region is regenerated, for example, upon generation of the entire processing parameter map after the adjustment, a region 1501 selected by the user via a GUI 1500 illustrated in
[0155] In the present exemplary embodiment, a simulation image indicating a molded product outer appearance in a case where an uneven structure is provided using the generated processing pattern may be displayed on the GUI, as displayed in a display region 908 in
[0156] In the present exemplary embodiment, the example in which the coefficient indicating the balance between the emphasis on fidelity and the emphasis on continuity is acquired from the user as the adjustment parameter is described, but an adjustment amount corresponding to the value of the distance L described above may be input instead.
[0157] According to the exemplary embodiments of the present disclosure, the portion to be adjusted can be recognized when the molded product is formed using the mold.
Other Embodiments
[0158] Embodiment(s) of the present disclosure can also be realized by a computer of a system or apparatus that reads out and executes computer executable instructions (e.g., one or more programs) recorded on a storage medium (which may also be referred to more fully as a ‘non-transitory computer-readable storage medium’) to perform the functions of one or more of the above-described embodiment(s) and/or that includes one or more circuits (e.g., application specific integrated circuit (ASIC)) for performing the functions of one or more of the above-described embodiment(s), and by a method performed by the computer of the system or apparatus by, for example, reading out and executing the computer executable instructions from the storage medium to perform the functions of one or more of the above-described embodiment(s) and/or controlling the one or more circuits to perform the functions of one or more of the above-described embodiment(s). The computer may comprise one or more processors (e.g., central processing unit (CPU), micro processing unit (MPU)) and may include a network of separate computers or separate processors to read out and execute the computer executable instructions. The computer executable instructions may be provided to the computer, for example, from a network or the storage medium. The storage medium may include, for example, one or more of a hard disk, a random-access memory (RAM), a read only memory (ROM), a storage of distributed computing systems, an optical disk (such as a compact disc (CD), digital versatile disc (DVD), or Blu-ray Disc (BD)™), a flash memory device, a memory card, and the like.
[0159] While the present disclosure has been described with reference to exemplary embodiments, it is to be understood that the disclosure is not limited to the disclosed exemplary embodiments. The scope of the following claims is to be accorded the broadest interpretation so as to encompass all such modifications and equivalent structures and functions.
[0160] This application claims the benefit of Japanese Patent Application No. 2021-188574, filed Nov. 19, 2021, which is hereby incorporated by reference herein in its entirety.