Method and apparatus for laser projection, and machining method
09644942 · 2017-05-09
Assignee
Inventors
- Hiroyuki NAKANO (Tokyo, JP)
- Nobuhisa Seya (Tokyo, JP)
- Daisuke Igarashi (Tokyo, JP)
- Kazuhiro IGARASHI (Tokyo, JP)
- Youhei Maekawa (Tokyo, JP)
- Katsuto Numayama (Tokyo, JP)
CPC classification
G05B2219/41168
PHYSICS
G05B19/401
PHYSICS
G05B2219/37205
PHYSICS
G05B2219/31048
PHYSICS
B23Q17/2428
PERFORMING OPERATIONS; TRANSPORTING
G05B2219/37288
PHYSICS
Y02P90/02
GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
G01B11/2545
PHYSICS
B23Q17/2233
PERFORMING OPERATIONS; TRANSPORTING
G05B19/4097
PHYSICS
B23Q17/2414
PERFORMING OPERATIONS; TRANSPORTING
G01B11/2513
PHYSICS
G01B11/25
PHYSICS
International classification
G05B19/18
PHYSICS
G05B19/401
PHYSICS
B23Q17/22
PERFORMING OPERATIONS; TRANSPORTING
G01B11/25
PHYSICS
G01B11/00
PHYSICS
B23Q17/24
PERFORMING OPERATIONS; TRANSPORTING
Abstract
A laser projection method including the steps of: irradiating, from a laser projection unit, a workpiece that is a measurement object, with a laser while controlling a plurality of mirror angles; imaging the workpiece with a stereo camera, extracting a contour of the workpiece, and calculating a three-dimensional coordinate; calculating a positional relationship between the laser projection unit and the workpiece by comparing the calculated three-dimensional coordinate of the workpiece contour with the mirror angles; and performing coordinate transformation of CAD data information and drawing CAD data from the laser projection unit to the workpiece, based on the positional relationship between the laser projection unit and the workpiece. A machining method including the steps of: selecting a component of a tool; assembling the component; imaging the assembled tool; and determining whether or not a desired tool has been assembled.
Claims
1. A laser projection method comprising: irradiating, from a laser projection unit, a laser on a workpiece at at least three different positions on the workpiece, which is a measurement object, by controlling respective angles of a plurality of mirrors; storing the angles at which each of the plurality of mirrors are disposed when each of the at least three different positions are irradiated on the workpiece; measuring three-dimensional coordinates of the at least three positions with a stereo camera; calculating a positional relationship between the laser projection unit and the workpiece by comparing the at least three three-dimensional coordinates with the stored angles of the plurality of mirrors; performing coordinate conversion of Computer Aided Design (CAD) data based on the positional relationship between the laser projection unit and the workpiece to generate data for irradiating the laser; and controlling the angles of the plurality of mirrors and irradiating the laser on the workpiece to draw on the workpiece based on the generated data for irradiating the laser.
2. The laser projection method according to claim 1, further comprising the steps of: comparing a three-dimensional coordinate of said CAD data with the three-dimensional coordinate of an actual contour of said workpiece and indicating this comparison result.
3. The laser projection method according to claim 2, wherein said comparison result is indicated on a display.
4. A laser projection apparatus comprising: a laser projection unit to irradiate a laser on a workpiece at at least three different positions on the workpiece, which is a measurement object, by respectively controlling a plurality of mirror angles; a memory to store the angles at which each of the plurality of mirrors is disposed when each of the at least three different positions is irradiated on the workpiece; a coordinate calculation unit to measure three-dimensional coordinates of the at least three positions with a stereo camera; a relative positional relationship calculation unit to calculate a positional relationship between the laser projection unit and the workpiece by comparing the at least three three-dimensional coordinates with the stored angles of the plurality of mirrors; and a CAD data conversion unit to perform coordinate conversion of Computer Aided Design (CAD) data based on the positional relationship between the laser projection unit and the workpiece to generate data for irradiating the laser, wherein the laser projection unit controls the angles of the plurality of mirrors and irradiates the laser on the workpiece to draw on the workpiece based on the generated data for irradiating the laser.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
DESCRIPTION OF THE EMBODIMENTS
(46) A method and apparatus for laser projection of the present invention relate to methods and apparatuses for drawing design information on a workpiece using a laser beam. Hereinafter, an embodiment of the method and apparatus for laser projection of the present invention is described using
(48) A laser beam 200 oscillated from the laser source 1 is focused by a focusing lens 3 at a desired distance. In order to focus the beam at a desired distance, the focusing lens 3 is mounted on a linearly-moving stage 2 that moves linearly in the optical axis direction. The position of the linearly-moving stage 2 is controlled by a linearly-moving stage control unit 11. Specifically, the position of the linearly-moving stage is calculated and controlled by a linearly-moving stage position indication/detection unit 14 so that the laser beam is focused at a laser drawing position determined by a CAD (Computer Aided Design) data conversion unit 21 to be described later, and so that the linearly-moving stage moves to the calculated position. Note that the linearly-moving stage 2 is supplied with electric power from a motor drive power supply 25 via the linearly-moving stage control unit 11. Moreover, the linearly-moving stage control unit 11 is also supplied with electric power from a circuitry power supply 24.
(49) A focused beam 201 emitted from the focusing lens 3 is projected on a workpiece via a first galvanomirror 4 and a second galvanomirror 5. The angles of the first galvanomirror 4 and the second galvanomirror 5 are controlled by a first angle control unit 12 and a second angle control unit 13, respectively. Specifically, a first angle and a second angle are calculated by the mirror position indication/detection unit 15 so that the focused beam 201 travels toward a laser drawing position determined by the CAD data conversion unit 21 to be described later, and the first galvanomirror 4 and the second galvanomirror 5 are controlled so as to rotate to the calculated angles, respectively. The first galvanomirror 4 and the second galvanomirror 5 are supplied with electric power from the motor drive power supply 25 via the first angle control unit 12 and the second angle control unit 13. Moreover, the first angle control unit 12 and the second angle control unit 13 are also supplied with electric power from the circuitry power supply 24.
(50) Next, a coordinate detection unit is described. In this embodiment, the coordinate detection unit comprises a stereo camera. A stereo camera 8 comprises a left camera 6 and a right camera 7. Images captured by the left camera 6 and the right camera 7 are acquired into a computer 23 via an image capturing unit 17. The acquired image is processed by an image processing unit 18, where contour extraction and the like to be described later are performed. Subsequently, a three-dimensional coordinate of an extracted contour is calculated by a coordinate calculation unit 19.
(51) Here, the current positions (angles) of the first angle control unit 12 and second angle control unit 13 are continuously detected by the mirror position indication/detection unit 15. In a relative positional relationship calculation unit 20, the three-dimensional coordinate extracted by the coordinate calculation unit 19 is compared with the angles detected by the mirror position indication/detection unit 15 so as to calculate a relative positional relationship between the laser projection unit 9 and the stereo camera 8, a positional relationship between the stereo camera 8 and a workpiece 26, and furthermore a positional relationship between the laser projection unit 9 and the workpiece 26. In the CAD data conversion unit 21, based on the relative positional relationship between the laser projection unit 9 and the workpiece 26 calculated by the relative positional relationship calculation unit 20, information of CAD data 22 is subjected to coordinate conversion, thereby generating data that is drawn on the workpiece by the laser projection unit 9.
(52) Next, this embodiment is described using
(58) In
(59) The remaining amount may be defined as shown in
(60) Here, in order to extract an arc contour of a cylindrical bore, processing shown in
(61) In order for an operator to perform such processing, for example as shown in
(62) The laser projection unit can also draw a text, and therefore as shown in
(63) Next, using
(64) First, a laser beam is projected to an adequate position on a workpiece (
(65) In the laser projection unit 9, only two angles, i.e., a first angle θ and a second angle φ, are specified, and therefore it is not possible to know at which distance the workpiece has been irradiated with the projected laser beam. That is, with only the information of a point P1 (θ1, φ1) (r1 is uncertain), a point P2 (θ2, φ2) (r2 is uncertain), and a point P3 (θ3, φ3) (r3 is uncertain), it is not possible to determine where the work surface is. However, at the same time, P1 (x1, y1, z1), P2 (x2, y2, z2), and P3 (x3, y3, z3), which are the three-dimensional coordinates of the points P1, P2, and P3, are grasped by the stereo camera 8, and therefore, if these relationships are used, r1, r2, and r3 can be uniquely calculated. Thus, at the same time, the relative positional relationship between the stereo camera 8 and the laser projection unit 9 can also be calculated (
(66) Specifically, first, each (θn, φn, rn) is converted to a rectangular coordinate system.
(67) P1: (θ1, φ1, r1) → (r1·cos φ1·cos θ1, r1·cos φ1·sin θ1, r1·sin φ1)
(68) P2: (θ2, φ2, r2) → (r2·cos φ2·cos θ2, r2·cos φ2·sin θ2, r2·sin φ2)
(69) P3: (θ3, φ3, r3) → (r3·cos φ3·cos θ3, r3·cos φ3·sin θ3, r3·sin φ3)
(70) Here, unknown values are r1, r2, and r3.
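As an illustration, the conversion in (67) to (69) can be sketched in Python. The function name and the angle convention (first angle as azimuth, second angle as elevation) are assumptions made for illustration, not details fixed by the embodiment.

```python
import math

def projector_to_cartesian(theta, phi, r):
    """Convert a projector direction (theta, phi) and a range r to
    rectangular coordinates, following the pattern of (67)-(69):
    (theta, phi, r) -> (r cos(phi) cos(theta), r cos(phi) sin(theta), r sin(phi))."""
    return (r * math.cos(phi) * math.cos(theta),
            r * math.cos(phi) * math.sin(theta),
            r * math.sin(phi))

# A point straight ahead (theta = phi = 0) at range 2 lies on the x-axis.
print(projector_to_cartesian(0.0, 0.0, 2.0))  # -> (2.0, 0.0, 0.0)
```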
(71) On the other hand, the coordinates of the laser bright spots seen from the stereo camera are as follows.
(72) p1: (x1, y1, z1)
(73) p2: (x2, y2, z2)
(74) p3: (x3, y3, z3)
(75) Here, because the distances between the respective points are the same both in a coordinate system of the laser projector and in a coordinate system of the stereo camera, the following formulas are established.
(76) |P1P2|=|p1p2|
(77) |P2P3|=|p2p3|
(78) |P3P1|=|p3p1|
(79) As described above, because there are three formulas for three unknown values, the unknown values r1, r2, and r3 can be uniquely calculated. Now assume that the coordinates of the laser bright spots in the stereo camera coordinate system are (x1, y1, z1), (x2, y2, z2), and (x3, y3, z3), and that the coordinates of the laser bright spots in the laser projector coordinate system are (X1, Y1, Z1), (X2, Y2, Z2), and (X3, Y3, Z3). Then, the circumcenter of (x1, y1, z1), (x2, y2, z2), and (x3, y3, z3) is calculated and designated by (x0, y0, z0). Next, the circumcenter of (X1, Y1, Z1), (X2, Y2, Z2), and (X3, Y3, Z3) is calculated and designated by (X0, Y0, Z0). Here, the vector heading from the point of origin of the stereo camera coordinate system toward the circumcenter (x0, y0, z0) is designated by A, and the vector heading from the point of origin of the laser projector coordinate system toward the circumcenter (X0, Y0, Z0) is designated by B. Although seen from different coordinate systems, (x0, y0, z0) and (X0, Y0, Z0) are the same point in a global coordinate system. This point is therefore set as the point of origin of the global coordinate system. Then, the vector heading from the point of origin of the global coordinate system toward the point of origin of the stereo camera coordinate system is −A, and the vector heading from the point of origin of the global coordinate system toward the point of origin of the laser projector coordinate system is −B. Accordingly, the positional relationship between the stereo camera coordinate system and the laser projector coordinate system can be easily calculated from the vector A and the vector B.
Note that, if the three laser bright spots lie on the same straight line when seen from the stereo camera and the laser projector, the mutual positional relationship cannot be calculated; therefore, the laser bright spots should not lie on the same straight line when seen from either coordinate system.
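The invariance that (75) to (79) rely on, namely that the pairwise distances between the three bright spots are the same in the projector frame and in the camera frame, can be checked with a small sketch. The specific points and the rigid transform below are hypothetical stand-ins, not values from the embodiment.

```python
import math

def dist(a, b):
    """Euclidean distance between two 3-D points."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

# Hypothetical bright spots in the projector frame (not collinear).
P = [(1.0, 0.0, 0.0), (0.0, 2.0, 0.0), (0.0, 0.0, 1.5)]

# A rigid transform (rotation about z by 30 degrees plus a translation)
# standing in for the unknown camera pose.
c, s = math.cos(math.radians(30)), math.sin(math.radians(30))
def to_camera(p):
    x, y, z = p
    return (c * x - s * y + 0.5, s * x + c * y - 1.0, z + 2.0)

p = [to_camera(q) for q in P]

# Pairwise distances are preserved, which is what yields three
# equations |PiPj| = |pipj| for the three unknown ranges r1, r2, r3.
for i, j in [(0, 1), (1, 2), (2, 0)]:
    assert abs(dist(P[i], P[j]) - dist(p[i], p[j])) < 1e-9
print("distance invariance holds")
```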
(80) Next, a specific procedure for calculating the three-dimensional coordinate of a laser bright spot position on the workpiece with the stereo camera 8 is described using
(81) In
(82) Next, a specific procedure for calculating, with the stereo camera 8, the three-dimensional coordinate of a position serving as a reference on the workpiece, for example a feature point such as a reference marker or a corner, is described, again using
(83) As shown in
(84) With the above processing, as shown in
(85) By performing coordinate conversion of the CAD data 22 by the CAD data conversion unit 21 in accordance with this positional relationship between the workpiece 26 and the laser projection unit 9, the data for laser projection is generated. Based on this data for laser projection, the stage position indication/detection unit 14 and the mirror position indication/detection unit 15 drive the linearly-moving stage 2, the first galvanomirror, and the second galvanomirror via the linearly-moving stage control unit 11, the first angle control unit 12, and the second angle control unit 13 to draw.
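A minimal sketch of this coordinate conversion step, assuming the calculated relative pose has already been reduced to a rotation matrix R and a translation t. The function name, R, and t below are illustrative assumptions, not values from the embodiment.

```python
def transform_point(p, R, t):
    """Apply rotation matrix R (3x3, row-major nested lists) and
    translation t to a CAD point p, giving its coordinates in the
    laser projection unit's frame for drawing."""
    return tuple(sum(R[i][k] * p[k] for k in range(3)) + t[i]
                 for i in range(3))

# Hypothetical pose: 90-degree rotation about z, plus a small offset.
R = [[0.0, -1.0, 0.0],
     [1.0,  0.0, 0.0],
     [0.0,  0.0, 1.0]]
t = (0.1, 0.0, 0.0)

print(transform_point((1.0, 0.0, 0.0), R, t))  # -> (0.1, 1.0, 0.0)
```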
(86) The method and apparatus for laser projection of the present invention are laser projection techniques that can be effectively utilized to prevent wrong cutting, confirm processing states, and check for omitted machining in machine work.
(87) Hereinafter, a machining method according to an embodiment of the present invention is described using
(88) First, using
(90) In the working machine of this embodiment, registered tool image information is already stored in a database 31. A computer for NC program 32 includes an NC program generation unit 33 and an NC program simulator 34. The details of each of these components will be described later using
(91) In a tool assembly unit 311, a tool taken out from a tool storage unit 310 is assembled. A tool image information acquisition unit 312a acquires image information of a tool assembled by the tool assembly unit 311. Then, a tool image information determination unit 313a compares the image information acquired by the tool image information acquisition unit 312a with the registered tool image information taken out from the database 31 to determine the tool. The details of the operation of the tool image information acquisition unit 312a and the tool image information determination unit 313a are described later using
(92) A tool measurement unit 37 includes an NC simulator's tool shape data generation unit 38 and a tool-dimension measurement unit 39. The details of each of these components will be described later using
(93) A computer for tool measurement unit 36 prepares a label by a label printer 314, and also prepares a tag by a tag writer 315. A tool's label/tag attaching unit 316 attaches the prepared label and tag to a tool.
(94) A tool information read unit A 317 reads information from the label/tag attached to the tool. The tool image information acquisition unit A 312a acquires the image information of the tool. A tool image information determination unit B 313b determines the tool from the image information acquired by the tool image information acquisition unit A 312a. A comprehensive determination unit A 345a comprehensively determines from the information read by the tool information read unit A 317 and the information acquired by the tool image information determination unit B 313b. A comprehensive information generation unit A 346a generates comprehensive information obtained by putting together the information recorded on the label/tag and the image information, and sends the same to the database.
(95) An NC control working machine (MC) 344 with an ATC includes a machine's X-axis/Y-axis/Z-axis control unit 328, a tool information read unit C 329, a tool image information acquisition unit C 330, an NC control panel 331, a communication terminal unit 332, and an automatic tool change unit (ATC) 318. The automatic tool change unit (ATC) 318 includes a tool information read unit B 319, a tool storage unit 320, an ATC arm 321, and an ATC control unit 322. The NC control working machine (MC) 344 is controlled by a computer for ATC/MC 347.
(96) Next, using
(98) In machining a workpiece, first in Step S100, by an NC programmer, an NC program is generated in the NC program generation unit 33 of the computer for NC program 32. Next, in Step S110, simulation of the NC program to be executed by the working machine 344 is performed by the NC program simulator 34 of the computer for NC program 32, and in Step S120, an error, collision hazard prevention, and the like are checked. An NC program confirmed as not having a problem is stored into the database 31 via a network in Step S130.
(99) Next, using
(101) First, in Step S200, a desired tool is selected from the tool storage unit 310 by a working machine's operator. A selected tool is moved to the tool assembly unit 311, and is then assembled by the tool assembly unit 311 in Step S210. Next, in Step S220, an image of the assembled tool is captured by the tool image information acquisition unit 312a.
(102) Here, using
(104) First, in Step S220A in
(105) Here,
(106) Here, in capturing an image, as shown in
(107) Then, in Step S220D of
(108) Note that this embodiment shows an example in which the entire circumference image of a tool is captured when capturing an image for registration; when an assembled tool is imaged, an image is captured from one direction, and the one image captured after assembly is collated with the plurality of registered images.
(109) Next, returning to Step S230 of
(110) When no coincidence has been determined even after collation with all the registered images is complete, a wrong tool has been assembled, and therefore the tool is reconfirmed and re-assembled.
(111) Here, the method for collating images, i.e., comparing the coincidences between images, in Step S240 in
(112) For SSD (Sum of Squared Differences) shown in Formula (1), a template is raster-scanned, and the sum of squared differences between the brightness values of the pixels at the same positions is used. The smaller the value of SSD, the more similar the images are.
(113) $R_{SSD}(x,y)=\sum_{j=0}^{N-1}\sum_{i=0}^{M-1}\bigl(I(x+i,\,y+j)-T(i,j)\bigr)^{2}$ (1)
(114) For SAD (Sum of Absolute Differences) shown in Formula (2), a template is raster-scanned, and the sum of the absolute values of the differences between the brightness values of the pixels at the same positions is used. The smaller the value of SAD, the more similar the images are.
(115) $R_{SAD}(x,y)=\sum_{j=0}^{N-1}\sum_{i=0}^{M-1}\bigl|I(x+i,\,y+j)-T(i,j)\bigr|$ (2)
(116) For normalized cross-correlation (NCC) shown in Formula (3), the normalized cross-correlation below is used as the similarity between a template image and a captured image. The closer the similarity is to 1, the more similar the images are.
(117) $R_{NCC}=\dfrac{\sum_{j=0}^{N-1}\sum_{i=0}^{M-1}I(x+i,\,y+j)\,T(i,j)}{\sqrt{\sum_{j=0}^{N-1}\sum_{i=0}^{M-1}I(x+i,\,y+j)^{2}\times\sum_{j=0}^{N-1}\sum_{i=0}^{M-1}T(i,j)^{2}}}$ (3)
(118) This calculation formula is the same as the formula obtained by transforming the formula of an inner product into the formula of cos θ. If the formula above is transformed into Formula (4) below, it becomes the inner product of a vector I of MN dimensions and a vector T of MN dimensions.
(119) $R_{NCC}=\dfrac{\vec{I}\cdot\vec{T}}{|\vec{I}|\,|\vec{T}|}=\cos\theta$ (4)
(120) Here, because the value of RNCC is equivalent to cos θ, the value of RNCC lies in the range from −1 to +1. When RNCC = 1, the two are completely the same image, and when RNCC = −1, the two are negative-positive inverted images.
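The three similarity measures discussed so far can be sketched as plain Python over flattened, equally-sized image patches. The function names and the toy patch values are assumptions made for illustration.

```python
def ssd(img, tmpl):
    """Sum of squared differences (Formula (1)); smaller is more similar."""
    return sum((i - t) ** 2 for i, t in zip(img, tmpl))

def sad(img, tmpl):
    """Sum of absolute differences (Formula (2)); smaller is more similar."""
    return sum(abs(i - t) for i, t in zip(img, tmpl))

def ncc(img, tmpl):
    """Normalized cross-correlation (Formula (3)); closer to 1 is more similar."""
    num = sum(i * t for i, t in zip(img, tmpl))
    den = (sum(i * i for i in img) * sum(t * t for t in tmpl)) ** 0.5
    return num / den

# Identical patches give the best possible score under all three measures.
patch = [10, 20, 30, 40]
template = [10, 20, 30, 40]
print(ssd(patch, template), sad(patch, template), ncc(patch, template))
```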
(121) In the cross-correlation coefficient of the above-described NCC, if the brightness of the template or the captured image fluctuates, the value of NCC also fluctuates. In contrast, in Zero-mean Normalized Cross-Correlation (ZNCC) shown in Formula (5), by subtracting the average of the brightness values of the template from each brightness value of the template, and subtracting the average of the brightness values of the captured image from each brightness value of the captured image, the similarity can be calculated stably even if there is a fluctuation in brightness.
(122) $R_{ZNCC}=\dfrac{\sum_{j=0}^{N-1}\sum_{i=0}^{M-1}\bigl(I(x+i,\,y+j)-\bar{I}\bigr)\bigl(T(i,j)-\bar{T}\bigr)}{\sqrt{\sum_{j=0}^{N-1}\sum_{i=0}^{M-1}\bigl(I(x+i,\,y+j)-\bar{I}\bigr)^{2}\times\sum_{j=0}^{N-1}\sum_{i=0}^{M-1}\bigl(T(i,j)-\bar{T}\bigr)^{2}}}$ (5)
(123) Here, in this Formula (5), the average of the brightness values inside the area of the template is calculated first and then subtracted from each brightness value, so programming it as-is results in an inefficient program. The formula of RZNCC is therefore transformed. The average brightness value of the template and the average of the brightness values of the image in the same area as the template can be calculated by Formula (6) and Formula (7) below.
(124) $\bar{T}=\frac{1}{MN}\sum_{j=0}^{N-1}\sum_{i=0}^{M-1}T(i,j)$ (6), $\bar{I}=\frac{1}{MN}\sum_{j=0}^{N-1}\sum_{i=0}^{M-1}I(x+i,\,y+j)$ (7)
(125) Thus, if these values are substituted into Formula (5) of RZNCC and arranged, Formula (8) below is obtained.
(126) $R_{ZNCC}=\dfrac{MN\sum\sum I(x+i,y+j)\,T(i,j)-\sum\sum I(x+i,y+j)\,\sum\sum T(i,j)}{\sqrt{\Bigl(MN\sum\sum I(x+i,y+j)^{2}-\bigl(\sum\sum I(x+i,y+j)\bigr)^{2}\Bigr)\Bigl(MN\sum\sum T(i,j)^{2}-\bigl(\sum\sum T(i,j)\bigr)^{2}\Bigr)}}$ (8)
(127) If this formula is used, calculation efficiency improves because the calculation can be done in a single pass.
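The point of the rearrangement into Formula (8) can be checked numerically: a direct two-pass ZNCC and the raw-sums single-pass form give the same value. The brightness vectors below are hypothetical samples chosen only for the check.

```python
def zncc_two_pass(I, T):
    """ZNCC per Formula (5): subtract each signal's mean first (two passes)."""
    n = len(I)
    mi, mt = sum(I) / n, sum(T) / n
    num = sum((i - mi) * (t - mt) for i, t in zip(I, T))
    den = (sum((i - mi) ** 2 for i in I) *
           sum((t - mt) ** 2 for t in T)) ** 0.5
    return num / den

def zncc_one_pass(I, T):
    """ZNCC per the rearranged Formula (8): raw sums only, single pass."""
    n = len(I)
    si, st = sum(I), sum(T)
    sii = sum(i * i for i in I)
    stt = sum(t * t for t in T)
    sit = sum(i * t for i, t in zip(I, T))
    num = n * sit - si * st
    den = ((n * sii - si * si) * (n * stt - st * st)) ** 0.5
    return num / den

# Same pattern with a brightness offset and small noise: both forms agree.
I = [12, 34, 56, 78, 90]
T = [14, 30, 60, 75, 95]
print(zncc_two_pass(I, T), zncc_one_pass(I, T))
```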
(128) Note that, when there is rotation and/or scale fluctuation between the template image and the captured image, matching may be performed after applying an affine transformation (scaling, rotation, shearing) to either of the images.
(129) In this manner, in this embodiment, whether or not a tool is the desired tool is determined not by confirming the number on a tool assembly work instruction document or the like, but by using the image of the actually assembled tool. Therefore, a mechanism can be constructed for catching, at this point, a selection error and/or assembly error of a tool component caused by human error, and for reliably assembling the desired tool.
(130) Next, using
(132) A tool, which is assured to be a desired tool by the tool information determination unit A313a of
(133) The tool-dimension measurement unit 39 of the tool measurement unit 37 measures the dimensions of a tool, e.g., diameter (D) and length (L), in Step S300. For example, if the measured value of the length L of an end mill, which is an example of a tool, is 10.05 mm and the design value L0 of this tool is 10.0 mm, then the error ΔL (= L − L0) is +0.05 mm. Because this error results in a machining error when performing NC machining using this tool, the information regarding the diameter and/or length is sent to the database 31 via the computer for tool measurement unit 36, with the value of this error ΔL as a tool correction value. Moreover, in Step S310, the NC simulator's tool shape data generation unit 38 sends information of the NC simulator's tool shape data to the database 31 via the computer for tool measurement unit 36. The computer for NC program 32 simulates the NC program using this data. Moreover, this information is also transferred to the NC control panel 331.
(134) Here,
(135) In Step S320, in the NC simulator's tool shape data generation unit 38, the shape of the tool used in the NC program simulator 34 is measured. Thus, the NC program simulator 34 can simulate based on the shape of the tool that is actually used for machining. The shape of the measured tool is sent to the database 31 via the computer for tool measurement unit 36 in Step S330.
(136) Next, using
(138) First, in Step S400 in
(139) Then, in Step S410 in
(140) Then, in Step S420 in
(141) However, a management system using a person or a label/tag cannot recognize this attachment error. In this embodiment, after this, tool information determination using the above-described image is performed.
(142) That is, in Step S430 in
(143) However, with the determination by an image, only the fact that the tool is T315 can be recognized; whether or not the correct label/tag is attached to the tool cannot be determined.
(144) Then, in Step S450 in
(145) Next, in Step S460 in
(146) Next, using
(148) A tool is transported to the NC control working machine (MC) 344 attached with the automatic tool change unit (ATC) 318.
(149) First, in Step S500, tool information is read by the tool information read unit B 319. Then, in Step S510, the tool is stored into the tool storage unit 320. Furthermore, in Step S520, information regarding which rack of the tool storage unit 320 the tool has been stored into is sent to the database 31 via the computer for ATC/MC 347. At this time, it is assured that the desired label/tag has been mounted on the desired tool, and therefore only the information regarding which numbered rack each tool has been stored into needs to be recognized here.
(150) Next, using
(152) A workpiece 326 is prepared on a table 327.
(153) Then, in Step S600, the operation of the NC control panel 331 is performed by a working machine operator to specify an NC program and the NC program is transferred via a communication terminal 332. Next, in Step S610, this information is sent to and read into the ATC control unit 322. Then, in Step S620, based on the read information, the ATC arm 321 selects a desired tool (Tool D, 325) inside the tool storage unit 320, and this tool is mounted on a main shaft 324 in Step S630.
(154) Here, in Step S640 before starting machining, the tool information of the tool that is supposed to be attached to the main shaft is read by the computer for ATC/MC 347. Moreover, in Step S650, tool information is read by the tool information read unit C 329. Then, in Step S660, the information read in Step S640 and the information read in Step S650 are collated. Even if a wrong tool has been selected by the ATC 318, the tool information read unit C will notice that a wrong tool has been selected. Therefore, wrong cutting will not occur if the machining is stopped at the time when it is detected that a wrong tool is mounted. However, at an actual machining site, a tool may be exchanged manually without going via the ATC. In this case, as already described, with the information of the tool information read unit C alone, whether or not the desired tool has been attached to the main shaft 324 cannot be reliably determined.
(155) Then, in this embodiment, in Step S670, the tool information determination unit B313b reads the image information attached to the tool. Here, the image information of the tool to be read is the one acquired in advance: image information obtained by expanding an image, captured while rotating the tool from 0 degrees to 360 degrees, over the range from 0 to 360 degrees in the direction of rotation about the axis of rotation. Moreover, in Step S680, the tool image information acquisition unit C 330 images the tool to acquire tool image information. The acquired tool image information is sent to the tool information determination unit B313b via a network.
(156) Then, in Step S690, the tool information determination unit B313b can determine whether or not the tool is the desired tool by collating the image information of Step S670 with the image information of Step S680. Here, the acquisition of the tool image information in Step S680 is performed by one of the following methods. In a first method, the main shaft 324 is not rotated, and at the position where the tool is initially attached to the main shaft 324, an image of the tool in the attached state is captured from one direction. In this case, in Step S690, the 360-degree expanded image information acquired in Step S670 is compared with the image information from one direction to determine whether or not the tool is the desired tool. In a second method, the main shaft 324 is slowly rotated from 0 degrees to a predetermined angle (e.g., 90 degrees, 180 degrees, or the like), and an image of the tool in the attached state is captured over the range from 0 degrees to the predetermined angle. In this case, in Step S690, the 360-degree expanded image information acquired in Step S670 is compared with the image information over the range from 0 degrees to the predetermined angle to determine whether or not the tool is the desired tool. This method provides higher determination accuracy than the first method. In a third method, the main shaft 324 is rotated from 0 degrees to 360 degrees, and an image of the tool in the attached state is captured over the range from 0 degrees to 360 degrees. In this case, in Step S690, the 360-degree expanded image information acquired in Step S670 is compared with the image information over the range from 0 degrees to 360 degrees to determine whether or not the tool is the desired tool. This method provides higher determination accuracy than the second method. In a fourth method, while the main shaft 324 is rotated sequentially from 0 degrees, images of the tool in the attached state are captured. In this case, in Step S690, the 360-degree expanded image information acquired in Step S670 is compared with the image information at each angle acquired while rotating the main shaft 324 from 0 degrees, and the rotation is continued until the tool can be determined to be the desired tool. This method obtains determination accuracy nearly equal to that of the third method and, in addition, can make the determination in a shorter time than the third method.
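The stop-early behavior of the fourth method can be sketched as follows, assuming a hypothetical registered-image table indexed by angle and a stand-in capture function; all names and values here are illustrative, not the embodiment's units.

```python
def similarity(a, b):
    """Toy similarity measure: fraction of equal pixels."""
    return sum(x == y for x, y in zip(a, b)) / len(a)

def match_progressively(registered, capture_at, threshold=0.95, step=10):
    """Rotate the spindle in `step`-degree increments and stop as soon as
    a captured view matches the registered image for that angle.
    `registered` maps angle -> reference image; `capture_at(angle)` is a
    stand-in for the camera. Returns the matching angle, or None."""
    for angle in range(0, 360, step):
        ref = registered.get(angle)
        if ref is not None and similarity(capture_at(angle), ref) >= threshold:
            return angle
    return None

# Hypothetical data: views only start matching once the tool has
# rotated to 20 degrees, so the search stops there.
registered = {a: [a, a + 1, a + 2] for a in range(0, 360, 10)}
capture = lambda a: [a, a + 1, a + 2] if a >= 20 else [99, 99, 99]
print(match_progressively(registered, capture))  # -> 20
```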
(157) Then, if the tool is determined as a desired one, then in Step S700, a predetermined machining operation is specified by the control unit 328, and X, Y, and Z-axes 323 and the main shaft 324 operate to perform the predetermined machining operation.
(158) According to this embodiment described above, because the collation is always performed with an image of the present tool, whether or not the present tool is the desired tool can be reliably determined. Thus, wrong cutting due to a tool-mounting error caused by human error can be prevented.
(159) It should be further understood by those skilled in the art that although the foregoing description has been made on embodiments of the invention, the invention is not limited thereto and various changes and modifications may be made without departing from the spirit of the invention and the scope of the appended claims.