Numerical control device
10915087 ยท 2021-02-09
Assignee
Inventors
CPC classification
G05B19/402
PHYSICS
G05B19/182
PHYSICS
B23Q11/1076
PERFORMING OPERATIONS; TRANSPORTING
G05B2219/50248
PHYSICS
B23Q17/2409
PERFORMING OPERATIONS; TRANSPORTING
B23Q17/249
PERFORMING OPERATIONS; TRANSPORTING
International classification
G05B19/402
PHYSICS
B23Q17/24
PERFORMING OPERATIONS; TRANSPORTING
B23Q11/10
PERFORMING OPERATIONS; TRANSPORTING
Abstract
To provide a numerical control device capable of directly determining whether or not a cutting fluid is applied to a cutting point. A numerical control device includes a determination unit configured to determine, on the basis of image data acquired when a vision sensor photographs a cutting fluid jetted from an injection nozzle toward a cutting point, whether or not the cutting fluid is applied to the cutting point, and an instruction unit configured to issue an instruction, on the basis of the result of the determination by the determination unit, to a nozzle control device configured to control the position and the attitude of the injection nozzle.
Claims
1. A numerical control device comprising: a memory that stores a program; and a processor configured to execute the program to: make, on a basis of image data acquired when a vision sensor photographs a cutting fluid jetted from an injection nozzle toward a cutting point, a determination of whether or not the cutting fluid is applied to the cutting point; and issue an instruction to a nozzle control device configured to control a position and an attitude of the injection nozzle on a basis of a result of the determination, wherein the processor makes the determination on a basis of a shape of a path of the cutting fluid extracted from the image data by detecting edge portions of the path by pattern matching.
2. The numerical control device according to claim 1, wherein the processor makes the determination by machine learning.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
DETAILED DESCRIPTION OF THE INVENTION
(6) One embodiment of the present invention will be described below in detail with reference to the drawings.
(7) The following description refers to the drawings.
(8) When machining a workpiece (not shown), the machine tool 1 shown in the drawings applies a cutting fluid F to a cutting point P of a cutting tool T.
(9) More specifically, the machine tool 1 according to the present embodiment controls the position and the attitude of an injection nozzle 3 (in the present embodiment, the angle of the injection nozzle) so that the cutting fluid is applied to the cutting point P in the state prior to cutting of the workpiece, that is, in the state where the cutting tool T is positioned away from the workpiece before approaching it and where a spindle 2 to be described below is not rotating. Whether or not the cutting fluid is applied to the cutting point P is then determined on the basis of the image data acquired by the vision sensor 5. The machine tool 1 according to the present embodiment is characterized in that it determines whether or not the cutting fluid is actually applied to the cutting point P; this determination will be described below in detail. After determining that the cutting fluid is actually applied to the cutting point P, the machine tool 1 brings the cutting tool T into contact with the workpiece while keeping the relative positions of the cutting tool T and the injection nozzle 3, and then executes the cutting work.
(10) The machine tool 1 includes the spindle 2, the injection nozzle 3, a nozzle control device 4, the vision sensor 5, the numerical control device 10, and the like.
(11) A cutting tool T selected from among a plurality of cutting tools is attached to the spindle 2 in an exchangeable manner. The spindle 2 is controlled by the numerical control device 10.
(12) The injection nozzle 3 is disposed in the vicinity of the spindle 2 so as to jet the cutting fluid F toward the cutting point P of the cutting tool T. The position and the attitude (the angle) of the injection nozzle 3 are controlled by the nozzle control device 4. The injection nozzle 3 according to the present embodiment is disposed so as to be rotatable in the vertical direction (the Z direction in the drawings).
(13) The nozzle control device 4 controls the position and the attitude of the injection nozzle 3 on the basis of an instruction issued by the numerical control device 10, so that the cutting fluid F jetted from the injection nozzle 3 is applied to the cutting point P. The nozzle control device 4 according to the present embodiment controls the rotation of the injection nozzle 3 in the vertical direction, thereby controlling the angle of the injection nozzle 3 with respect to the cutting tool T (refer to the drawings).
(14) The vision sensor 5 acquires image data by photographing the cutting fluid F jetted from the injection nozzle 3 toward the cutting point P of the cutting tool T, and inputs the image data as a signal into the numerical control device 10. The vision sensor 5 may be fixed to the machine tool 1, or may be attached to a robot (not shown).
(15) The numerical control device (CNC) 10 includes a CPU 11, a memory 12, a display unit 13, an input unit 14, an interface 15, a bus 16 and the like.
(16) The CPU 11 is a processor that performs overall control of the machine tool 1. The CPU 11 is connected to the memory (storage unit) 12, the display unit 13, the input unit 14, and the interface 15 via the bus 16.
(17) The memory 12 is configured with a ROM, a RAM, a nonvolatile memory, and the like. The memory 12 stores the position of the cutting point P as position data. The display unit 13 displays information necessary for an operator. The input unit 14 is a keyboard or the like by which various types of instructions and data are input. The interface 15 is connected to an external storage medium, a host computer or the like, to exchange various types of instructions and data.
(18) The CPU 11 includes a determination unit 17 and an instruction unit 18.
(19) The determination unit 17 determines whether or not the cutting fluid F is applied to the cutting point P, by using the image data acquired by the vision sensor 5, and inputs the determination result into the instruction unit 18. The determination unit 17 extracts the path of the cutting fluid F from the image data input as a signal by the vision sensor 5, and determines whether or not the cutting fluid F is applied to the cutting point P on the basis of the shape of the path of the cutting fluid F.
(20) Specifically, in the present embodiment, the determination unit 17 focuses on the edge portions of the path of the cutting fluid F in the image data acquired by the vision sensor 5, and detects those edge portions by pattern matching. In an example, when, in detecting the edge portions of the path by pattern matching, the determination unit 17 detects that the path of the cutting fluid F has a linear shape (refer to the drawings), the determination unit 17 determines that the cutting fluid F is applied to the cutting point P.
(21) On the other hand, when, in detecting the edge portions of the path, the determination unit 17 detects that the path of the cutting fluid F does not have a linear shape but is bent in the middle, or that the path spreads widely and is diffused in the middle (refer to the drawings), the determination unit 17 determines that the cutting fluid F is not applied to the cutting point P.
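The linearity criterion described in paragraphs (20) and (21) can be sketched in code. The patent does not disclose a concrete algorithm; the least-squares fit, the point-extraction step, and the residual threshold below are illustrative assumptions, not the claimed pattern-matching implementation.

```python
# Hedged sketch: classify an extracted cutting-fluid path as linear
# (the fluid reaches the cutting point) or bent/diffused (it does not).
# Thresholds and the point representation are illustrative assumptions.

def fit_line(points):
    """Least-squares fit of y = a*x + b through (x, y) points."""
    n = len(points)
    sx = sum(x for x, _ in points)
    sy = sum(y for _, y in points)
    sxx = sum(x * x for x, _ in points)
    sxy = sum(x * y for x, y in points)
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - a * sx) / n
    return a, b

def is_fluid_applied(path_points, max_residual=2.0):
    """True when the extracted path is close to a straight line."""
    a, b = fit_line(path_points)
    worst = max(abs(y - (a * x + b)) for x, y in path_points)
    return worst <= max_residual

# Toy paths standing in for edge points extracted from the image data.
straight = [(x, 0.5 * x + 1.0) for x in range(20)]          # straight jet
bent = [(x, 0.5 * x) for x in range(10)] + \
       [(x, 13.0 - 0.3 * x) for x in range(10, 20)]         # deflected jet
print(is_fluid_applied(straight))  # True
print(is_fluid_applied(bent))      # False
```

A real implementation would extract the edge points from the vision sensor's image data first; here they are supplied directly to keep the sketch self-contained.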
(22) The instruction unit 18 issues an instruction to the nozzle control device 4 on the basis of the determination result input as a signal by the determination unit 17.
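The feedback loop between the determination unit 17 and the instruction unit 18 can be sketched as follows. The sweep strategy, step size, and callback names are illustrative assumptions; the patent only states that an instruction is issued to the nozzle control device on the basis of the determination result.

```python
# Hedged sketch of the feedback loop: keep adjusting the nozzle angle
# until the determination unit reports that the fluid is applied.

def adjust_nozzle(determine, set_angle, start_deg=0.0, step_deg=1.0,
                  max_deg=90.0):
    """Sweep the nozzle angle; `set_angle` stands in for the nozzle
    control device and `determine` for the determination unit.
    Returns the first angle at which the fluid is applied, else None."""
    angle = start_deg
    while angle <= max_deg:
        set_angle(angle)          # instruction to the nozzle controller
        if determine(angle):      # determination result fed back
            return angle
        angle += step_deg
    return None

# Toy stand-in: the fluid hits the cutting point only near 30 degrees.
applied = lambda a: abs(a - 30.0) < 0.5
found = adjust_nozzle(applied, set_angle=lambda a: None)
print(found)  # 30.0
```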
(23) The numerical control device 10 according to the present embodiment including the above-described configuration exhibits the following effects.
(24) In the numerical control device 10 according to the present embodiment, into which the vision sensor 5 is introduced, the determination unit 17 determines whether or not the cutting fluid F is applied to the cutting point P and feeds back the determination result to the nozzle control device 4. Accordingly, the determination of whether or not the cutting fluid is applied to the cutting point is made more directly and reliably than in the prior art.
(25) Further, in the numerical control device 10 according to the present embodiment, the state in which the cutting fluid is applied to the cutting point is detected directly. Accordingly, the position and the attitude of the injection nozzle 3 can be adjusted even if the angle of the injection nozzle 3 for each cutting tool T is not stored in advance in the memory 12. In addition, the correspondence relation between the cutting tool T and the injection nozzle 3 need not be set manually in advance. Therefore, even when the data (length and diameter) on the cutting tool T stored in the memory 12 is changed, the correspondence relation between the cutting tool T and the injection nozzle 3 need not be revised.
(26) Although the embodiment of the present invention has been described so far, the present invention is not limited to the embodiment described above. The effects described in the present embodiment are listed merely as the most preferable effects produced by the present invention. The effects produced by the present invention are not limited to those described in the present embodiment.
(27) In the description of the above embodiment, the determination unit 17 makes the determination by detecting the edge portions of the path of the cutting fluid F extracted from the image data by pattern matching. The present invention is not limited thereto.
(28) In an example, the determination unit 17 may make the determination by detecting feature points on the path of the cutting fluid by pattern matching. Alternatively, the determination unit 17 may make the determination by detecting the shape of the entire path of the cutting fluid by pattern matching. In this case, the determination unit 17 extracts feature points correlated with the shape of the entire path of the cutting fluid F from the image data input as a signal by the vision sensor 5, and makes the determination on the basis of the feature points. That is, the determination unit 17 makes the determination on the basis of the similarity between the feature points extracted from the image data and the feature points stored in the memory 12.
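The feature-point variant in paragraph (28) can be sketched as a similarity comparison between points extracted from the image and reference points held in the memory 12. The point pairing, the mean-distance measure, and the threshold are illustrative assumptions; the patent does not specify the similarity metric.

```python
# Hedged sketch: compare feature points extracted from the image data
# with reference feature points stored in the memory, and decide by a
# similarity threshold. Metric and threshold are assumptions.
import math

def similarity(extracted, stored):
    """Mean distance between corresponding feature points
    (lower means more similar)."""
    dists = [math.dist(p, q) for p, q in zip(extracted, stored)]
    return sum(dists) / len(dists)

def matches_stored_path(extracted, stored, threshold=3.0):
    return similarity(extracted, stored) <= threshold

stored = [(0, 0), (10, 5), (20, 10), (30, 15)]   # reference: straight jet
observed = [(0, 1), (10, 6), (20, 9), (30, 16)]  # points from the image
print(matches_stored_path(observed, stored))  # True
```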
(29) Each of the drawings referred to here illustrates the path of the cutting fluid F extracted from the image data.
(30) More specifically, the edge portions of the cutting fluid F within a range L (a range in the X direction in each drawing) are detected, and the determination is made on the basis of the shape of the path of the cutting fluid F within that range.
(31) In an example, the determination unit 17 may make the determination by detecting the shape of the path of the cutting fluid by machine learning. The machine learning herein may be supervised learning or unsupervised learning. In the case where the determination is made by machine learning, the determination unit 17 takes the image data acquired by the vision sensor 5 as an input value and outputs a value regarding the determination result of whether or not the cutting fluid is applied to the cutting point, on the basis of a large amount of image data stored in advance in the memory 12.
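As a minimal supervised stand-in for the machine-learning determination of paragraph (31), a nearest-neighbour classifier over stored labelled examples can be sketched. The two-component feature vectors, labels, and classifier choice are illustrative assumptions; the patent leaves the learning method open.

```python
# Hedged sketch: 1-nearest-neighbour over labelled image features stored
# in advance, standing in for the machine-learning determination.
import math

def nearest_neighbor_label(features, training):
    """training: list of (feature_vector, label) pairs; returns the
    label of the closest stored example."""
    best = min(training, key=lambda ex: math.dist(features, ex[0]))
    return best[1]

# Stored examples: (path straightness, spread) -> determination result.
training = [
    ((0.95, 0.10), "applied"),
    ((0.90, 0.20), "applied"),
    ((0.40, 0.80), "not applied"),
    ((0.30, 0.90), "not applied"),
]
print(nearest_neighbor_label((0.92, 0.15), training))  # applied
print(nearest_neighbor_label((0.35, 0.85), training))  # not applied
```

A production system would use a far larger stored dataset and a stronger model; the sketch only shows the input/output shape the paragraph describes.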
EXPLANATION OF REFERENCE NUMERALS
(32) 1: MACHINE TOOL; 2: SPINDLE; 3: INJECTION NOZZLE; 4: NOZZLE CONTROL DEVICE; 5: VISION SENSOR; 10: NUMERICAL CONTROL DEVICE; 11: CPU; 12: MEMORY (STORAGE UNIT); 13: DISPLAY UNIT; 14: INPUT UNIT; 15: INTERFACE; 16: BUS; 17: DETERMINATION UNIT; 18: INSTRUCTION UNIT; T: CUTTING TOOL; P: CUTTING POINT; F: CUTTING FLUID