METHOD AND OPERATING SYSTEM FOR SETTING UP A MACHINING DEVICE, MACHINING DEVICE, AND COMPUTER PROGRAM FOR SETTING UP A MACHINING DEVICE
20220326679 · 2022-10-13
Inventors
CPC classification
G05B19/401
PHYSICS
G05B19/402
PHYSICS
G06F3/017
PHYSICS
B23Q15/12
PERFORMING OPERATIONS; TRANSPORTING
G05B2219/35444
PHYSICS
International classification
G05B19/402
PHYSICS
Abstract
The invention relates to a method for setting up a machining device (10), in particular a machining device (10) for machining workpieces (11) which are at least partially made of wood, wood materials, synthetic material, composite materials or the like, comprising the steps of optically detecting the machining device (10) at least in some areas by means of an image capture device (13); detecting, by means of the image capture device (13), an individual first gesture (21) of an operator of the machining device (10) that is performed in the optically detected area; and triggering at least one function of the machining device (10) depending on the detected individual first gesture (21), and to an operating system (12) for setting up a machining device (10), a machining device (10) and a computer program for setting up a machining device (10).
Claims
1. Method for setting up a machining device comprising the steps of optically detecting the machining device at least in some areas by means of an image capture device, detecting, by means of the image capture device, an individual first gesture of an operator of the machining device that is performed in the optically detected area, and triggering at least one function of the machining device depending on the detected individual first gesture.
2. Method according to claim 1, wherein depending on the individual first gesture of the operator, a display device outputs optical, acoustic and/or haptic information for setting up the machining device.
3. Method according to claim 2, wherein the display device projects the information onto a machining area of the machining device and/or onto the workpiece provided in the machining area.
4. Method according to claim 2 or 3, wherein defined information is selected by at least one further gesture of the operator, wherein each item of information is linked to at least one function relating to the machining process of the workpiece.
5. Method according to one of the preceding claims, wherein, by performing and detecting the individual first gesture, an input mask and/or keyboard is projected onto the machining area of the machining device and/or onto the workpiece provided in the machining area, wherein at least one function relating to the machining process of the workpiece is programmed via the input mask and/or keyboard and by the performance and detection of the at least one further gesture.
6. Operating system for setting up a machining device, in particular a wood working device, having an image capture device, by means of which at least a partial area of the machining device can be optically detected, and a display device, by means of which optical, acoustic and/or haptic information for setting up the machining device can be output in a machining area of the machining device, wherein the setting up of the machining device is provided according to the method of claim 1.
7. Machining device for machining workpieces, having an image capture device, which optically detects at least a partial area of the machining device, and a display device, by means of which optical, acoustic and/or haptic information for setting up the machining device can be output in a machining area of the machining device, wherein the setting up of the machining device is provided according to the method of claim 1.
8. Machining device according to claim 7, wherein the machining device is CNC controlled or is configured as a CNC controlled machining center or is a unit of a CNC controlled machining center.
9. Computer program for setting up a machining device, the computer program being stored in a control device of the machining device, wherein the method according to claim 1 can be executed.
10. Method according to claim 1, wherein the machining device is a machining device for machining workpieces which are at least partially made of wood, wood materials, synthetic material or composite materials.
11. Method according to claim 4, wherein the further gesture of the operator is a hand movement, a pointing movement or touching of a surface of the workpiece or of the machining device in the area of the projected information to be selected.
12. Operating system according to claim 6, wherein the machining device is a wood working device.
Description
[0013] The invention as well as further advantageous embodiments and further developments thereof will be described and explained in more detail below on the basis of the examples shown in the drawings. The features that are apparent from the description and the drawings can be applied individually or jointly in any combination in accordance with the invention. The drawings show the following:
[0014]
[0015]
[0016]
[0017]
[0018]
[0019]
[0020] Preferably, the machining device 10 is provided for machining workpieces 11 which are at least partially made of wood, wood materials, synthetic material, composite materials or the like. Such workpieces 11 are used, for example, in the field of furniture and components manufacturing. These can be a wide variety of workpieces 11, for example solid wood or chipboard, lightweight boards, sandwich boards, skirting boards, profiles for profile wrapping and the like. However, the present invention is not limited to such workpieces 11.
[0021] The machining device 10 comprises an operating system 12 for setting up or operating the machining device 10. Setting up or operating the machining device 10 is understood to mean, in particular, controlling, regulating and/or programming the machining device 10, in particular the process for machining the workpiece 11, by an operator. The operating system 12 comprises an image capture device 13 and a display device 14, which are connected to the machining device 10 via a control device 16.
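The relationship described above, in which the control device links detected gestures to functions of the machining device, can be sketched as follows. This is a minimal illustration only; the patent specifies no implementation, and all class, method and gesture names here are assumptions:

```python
# Sketch of the control device (16) linking gestures detected by the image
# capture device (13) to functions of the machining device (10).
# All names are illustrative assumptions, not taken from the patent.

class ControlDevice:
    """Connects image capture and display devices to the machining device."""

    def __init__(self):
        # Gesture identifier -> callable triggering a machine function.
        self.gesture_functions = {}

    def register(self, gesture, function):
        """Associate an individual gesture with a machine function."""
        self.gesture_functions[gesture] = function

    def handle(self, gesture):
        """Trigger the function linked to a detected gesture, if any."""
        function = self.gesture_functions.get(gesture)
        if function is None:
            return None
        return function()

ctrl = ControlDevice()
ctrl.register("draw_circle", lambda: "open_borehole_input_mask")
print(ctrl.handle("draw_circle"))  # -> open_borehole_input_mask
```

An unregistered gesture simply yields no function, which mirrors the document's premise that only individual, defined gestures trigger setup functions.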
[0022] The image capture device 13 comprises cameras 17 which optically capture the machining device 10 at least in some areas. The image capture device 13 particularly captures a machining area 18 of the machining device 10. This machining area 18 is preferably a support, console or a machining table on which the workpiece 11 is placed by the operator or automatically, and is prepared for the machining process and/or subsequently machined. The machining area 18 can also be understood as an area surrounding the machining device 10, in which the operator moves to set up the machining device 10.
[0023] As shown in
[0024] The display device 14 comprises a projector 19 such as a laser projector, LED projector, or similar projection device. Preferably, the display device 14 is arranged above the machining device 10. In particular, the display device 14 is arranged in such a way that it can output, in particular display or project, information 22 into the machining area 18.
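For a projector mounted above the machining area as described, projecting information 22 at a given position requires mapping machining-area coordinates to projector pixels. The patent does not describe this calibration; the following is an assumed minimal affine (scale and offset) mapping, which suffices when the projector faces the table squarely:

```python
# Illustrative calibration sketch (assumed, not from the patent): map a
# point in the machining area 18 (millimetres) to projector pixel
# coordinates for the display device 14.

def area_to_pixel(x_mm, y_mm, scale_px_per_mm=2.0, offset_px=(100, 50)):
    """Map a machining-area point to projector pixel coordinates."""
    return (offset_px[0] + x_mm * scale_px_per_mm,
            offset_px[1] + y_mm * scale_px_per_mm)

print(area_to_pixel(200.0, 100.0))  # -> (500.0, 250.0)
```

A tilted or off-axis projector would instead need a full homography, but the scale-and-offset form illustrates the principle.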
[0025] In addition, the display device 14 can be provided to output acoustic and/or haptic information. Acoustic information can be, for example, signals or tones that convey particular information to the operator. Haptic information can be conveyed to the operator, for example, by a vibration of at least parts of the machining device 10, in particular a vibration of the support, console or machining table or of an input device.
[0026] The setting up of the machining device 10 using the operating system 12 will hereinafter be explained by means of
[0027]
[0028] To perform the function, it may be necessary to program the machining device 10 with further defined parameters. Using the example of the borehole, this can be the exact position of the borehole, the borehole diameter, the borehole depth, a sinking of the borehole or similar parameters. In order to program these parameters, the display device 14 is controlled depending on the detected gesture 21. The display device 14 projects information 22 associated with the detected gesture 21 onto a surface of the workpiece 11. In particular, the information 22 is displayed two-dimensionally on the surface of the workpiece 11, comparable to a virtual window on a computer screen. This is schematically shown in
[0029] Entering the parameters is carried out by a further gesture 21, in particular by a pointing movement at the position of the projected information 22 to be selected or entered. This further gesture 21 is also detected by the image capture device 13, and a function associated with the selected or input information 22 is triggered. According to
Finally, the programmed parameters can be confirmed by means of a further gesture 21 detected by the image capture device 13, and thus the machining process of the workpiece 11 is started by the machining device 10.
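The parameter-entry flow just described, in which pointing gestures fill in borehole parameters and a final confirming gesture starts machining, can be sketched as below. The event and field names are illustrative assumptions:

```python
# Sketch of the borehole programming flow (all names are assumptions):
# pointing gestures at the projected input mask collect parameters one at a
# time; machining starts only after a confirming gesture and a complete set.

def program_borehole(events):
    """events: sequence of (gesture, value) pairs detected by the camera."""
    params = {}
    for gesture, value in events:
        if gesture == "point_at_field":
            field, entered = value          # e.g. ("diameter_mm", 8)
            params[field] = entered
        elif gesture == "confirm":
            required = {"x_mm", "y_mm", "diameter_mm", "depth_mm"}
            if required <= params.keys():
                return ("start_machining", params)
            return ("incomplete", params)   # confirmation refused
    return ("not_confirmed", params)

events = [
    ("point_at_field", ("x_mm", 120)),
    ("point_at_field", ("y_mm", 45)),
    ("point_at_field", ("diameter_mm", 8)),
    ("point_at_field", ("depth_mm", 20)),
    ("confirm", None),
]
print(program_borehole(events)[0])  # -> start_machining
```

Refusing confirmation while parameters are missing reflects the document's point that the operator is guided through a complete setup before the machining process begins.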
[0030]
[0031] In order to program the machining device 10 with the parameters required for the saw cut, the display device 14 is triggered depending on the detected gesture 21, wherein the display device 14 projects the information 22 associated with the detected gesture 21 onto the surface of the workpiece 11. The information 22 can again be an input mask 23 projected onto the surface of the workpiece 11 and/or a projected keyboard 24 for entering parameters concerning the saw cut, as already described with respect to
[0032] In the following, some further individual gestures 21 are listed by way of example, by means of which corresponding functions for setting up the machining device 10 can be triggered, wherein the gestures 21 are performed on the surface of the workpiece 11, and the list is to be considered non-exhaustive:
[0033] Gesture: Draw a rectangle
[0034] Function: Machining of a contour of the workpiece
[0035] Gesture: Tapping a drill hole in the workpiece
[0036] Function: Gluing the drill hole
[0037] Gesture: Drawing a rectangle around a hinge hole
[0038] Function: Setting a hinge
[0039] Gesture: Drawing a line along an edge of the workpiece
[0040] Function: Providing edge material at the edge
[0041] Gesture: Drawing a question mark
[0042] Function: Help menu is opened
[0043] Gesture: Drawing a line starting from an edge of the workpiece
[0044] Function: Input menu is opened
[0045] Gesture: Double-tapping the workpiece with the finger
[0046] Function: Selection of specific information
[0047] Gesture: Drawing a “C”
[0048] Function: Copy all programmed functions for another workpiece
[0049] Gesture: Drawing an arc-shaped line
[0050] Function: Mirroring of programmed functions to an opposite workpiece side
[0051] Gesture: Drawing an “X”
[0052] Function: Delete a programmed function
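The gesture-to-function pairs listed above amount to a lookup table. One way to express it, as an illustrative sketch (the patent does not prescribe an implementation, and the identifiers are assumptions):

```python
# The example gesture list above as a dispatch table (names are assumed).

GESTURE_FUNCTIONS = {
    "draw_rectangle": "machine_contour",
    "tap_drill_hole": "glue_drill_hole",
    "draw_rectangle_around_hinge_hole": "set_hinge",
    "draw_line_along_edge": "apply_edge_material",
    "draw_question_mark": "open_help_menu",
    "draw_line_from_edge": "open_input_menu",
    "double_tap": "select_information",
    "draw_c": "copy_programmed_functions",
    "draw_arc": "mirror_programmed_functions",
    "draw_x": "delete_programmed_function",
}

def function_for(gesture):
    """Return the setup function for a recognized gesture."""
    return GESTURE_FUNCTIONS.get(gesture, "no_function")

print(function_for("draw_question_mark"))  # -> open_help_menu
```

Since the list is explicitly non-exhaustive, such a table would simply grow as further individual gestures are defined.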
[0053] By means of the image capture device 13 and/or the display device 14, further information 22 can also be output or further functions performed. In particular, it is thereby provided that, by means of the projected information 22 and/or the performance and capture of the gestures 21, the operator is guided in a largely automated manner through the setting up of the machining process for the workpiece 11.
[0054] By means of the image capture device 13, for example, clamping means for holding the workpiece 11 can be detected, wherein, depending on the position of the clamping means required for holding the workpiece 11, missing and/or incorrectly positioned clamping means can be indicated to the operator by a corresponding projection of the display device 14. It can thereby be provided that the control device 16 automatically activates the next function as soon as all clamping means are correctly positioned by the operator.
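The clamping check described above can be sketched as a comparison of detected clamp positions against the required ones, with the next setup step enabled only when everything matches. The data layout and tolerance are assumptions for illustration:

```python
# Sketch of the clamping-means check (assumed names and tolerance): report
# missing or misplaced clamps; "ready" gates the next setup function.

def check_clamps(required, detected, tolerance_mm=5.0):
    """required/detected: dicts mapping clamp_id -> (x_mm, y_mm)."""
    missing = [c for c in required if c not in detected]
    misplaced = [
        c for c in required if c in detected and
        max(abs(required[c][0] - detected[c][0]),
            abs(required[c][1] - detected[c][1])) > tolerance_mm
    ]
    return {"missing": missing, "misplaced": misplaced,
            "ready": not missing and not misplaced}

required = {"A": (0, 0), "B": (500, 0)}
detected = {"A": (2, 1), "B": (520, 0)}
print(check_clamps(required, detected))
# -> {'missing': [], 'misplaced': ['B'], 'ready': False}
```

The `missing` and `misplaced` lists correspond to what the display device would project for the operator to correct.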
[0055] It may also be provided that the display device 14 projects the outline of the workpiece 11 in the required position into the machining area 18 of the machining device 10. In this way, simple and precise positioning of the workpiece 11 can be performed by the operator. In addition, dimensions can also be projected into the machining area 18 of the machining device 10. When the workpiece 11 is correctly positioned, the operator may receive visual, audible, and/or haptic feedback from the display device 14.
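The positioning feedback described above reduces to comparing the detected workpiece corners against the projected target outline within a tolerance. A minimal sketch, with assumed corner ordering and tolerance:

```python
# Sketch of workpiece-positioning feedback (assumed logic): the projected
# outline defines target corner points; feedback fires once every detected
# corner lies within tolerance of its target.

def position_ok(target_corners, detected_corners, tolerance_mm=2.0):
    """Both arguments: lists of (x_mm, y_mm) corner points, same order."""
    return all(
        abs(tx - dx) <= tolerance_mm and abs(ty - dy) <= tolerance_mm
        for (tx, ty), (dx, dy) in zip(target_corners, detected_corners)
    )

target = [(0, 0), (600, 0), (600, 400), (0, 400)]
detected = [(1, 0), (601, 1), (600, 401), (0, 399)]
print(position_ok(target, detected))  # -> True
```

A `True` result would trigger the visual, audible and/or haptic confirmation mentioned in the text.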
[0056] The display device 14 can also output information 22 comprising further parameters relating to the machining process. These can be, for example, a remaining time of the machining process, safety instructions, context information on the machine status, error messages and the like.
[0057] In an alternative embodiment of the machining device 10, it may further be provided that the gestures 21 are not detected by the previously described operating system 12, but that the operator performs the gestures 21 on a touch-sensitive surface, in particular a tablet or touchscreen. Such an embodiment can also be easily integrated into existing machining devices 10 or, if necessary, retrofitted.
[0058] List of reference numbers
[0059] 10 machining device
[0060] 11 workpiece
[0061] 12 operating system
[0062] 13 image capture device
[0063] 14 display device
[0064] 15
[0065] 16 control device
[0066] 17 camera
[0067] 18 machining area
[0068] 19 projector
[0069] 20
[0070] 21 gesture
[0071] 22 information
[0072] 23 input mask
[0073] 24 keyboard