A METHOD FOR MANUFACTURING CONSTRUCTION COMPONENTS, A PORTABLE MANUFACTURING UNIT, A SOFTWARE APPLICATION EXECUTABLE ON A MACHINE TOOL SYSTEM FOR CONTROLLING A TOOL, THE MACHINE TOOL SYSTEM, AND A METHOD OF MACHINING THE WORKPIECE USING THE TOOL

20230251631 · 2023-08-10


    Abstract

    A software application executable on a machine tool system for machining a workpiece is configured for executing the steps of: instructing an image recording device to capture an image of a first workpiece having one or more hand-drawn notations, associating the one or more notations with at least one machining step of the machine tool system, presenting on a display device the at least one machining step of the machine tool system on or in relation to the captured image, and/or the first workpiece as the first workpiece will appear after machining the first workpiece according to the at least one machining step of the machine tool system, and optionally controlling a tool of the machine tool system for machining the first workpiece according to the one or more notations.

    Claims

    1. A method for manufacturing construction components at a building site, the method comprising the steps of loading a material stack to be manufactured onto a bed of a transportable frame or next to the transportable frame, wherein the transportable frame carries an industrial manipulator and an external axis, transferring by the industrial manipulator a piece of the loaded material stack onto the external axis, moving the transferred piece by the external axis for allowing a desired position of the piece to be manufactured by the manipulator, manufacturing the transferred piece by the manipulator into a manufactured construction component, and transferring the manufactured construction component to a second stack of manufactured construction components.

    2. The method according to claim 1, wherein the material stack comprises uni-directionally elongated pieces, such as boards, pipes or rods.

    3. The method according to claim 1 or 2, wherein the transportable frame comprises a cover, wherein the material stack or the piece of the material stack is loadable/transferrable through an opening in the cover.

    4. The method according to claim 3, wherein the cover comprises a longer side and a shorter side and the opening stretches along the longer side.

    5. The method according to any of the preceding claims, wherein the manipulator changes manipulation heads between the steps of transferring the piece onto the external axis and of manufacturing the transferred piece, the manipulator simultaneously comprises the manipulation heads for transferring the piece onto the external axis and for manufacturing the transferred piece, or the manipulator for transferring the piece onto the external axis and the manipulator for manufacturing the transferred piece are two different manipulators.

    6. The method according to any of the preceding claims, wherein the method comprises the step of receiving data about final length of at least one of the manufactured construction components and/or angle of machining.

    7. The method according to any of the preceding claims, wherein length and/or width and/or thickness of the piece to be manufactured and/or of the manufactured construction component is/are determined.

    8. The method according to any of the preceding claims, wherein the manipulator and/or the external axis is/are controlled by a mobile computing device.

    9. The method according to any of the preceding claims, wherein each component in a series of manufactured construction components is at least partly uniquely labelled.

    10. A portable manufacturing unit for manufacturing construction components, the manufacturing unit comprising a bed configured for receiving a material stack to be manufactured, an external axis, an industrial manipulator configured for transferring a piece of the material stack to the external axis, wherein the external axis is configured for moving the transferred piece on top of the external axis for allowing a desired position of the piece to be manufactured by the manipulator, wherein the manipulator is further configured for manufacturing the moved piece into a manufactured construction component, and wherein the external axis or the manipulator is configured for transferring the manufactured construction component to a second stack of manufactured components.

    11. The portable manufacturing unit according to claim 10, wherein the manufacturing unit has a longer side and a shorter side, and the manufacturing unit comprises a cover covering the manufacturing unit, wherein the cover has a first opening along the longer side configured for receiving the material stack and/or the piece of the material stack.

    12. The portable manufacturing unit according to claim 10 or 11, wherein the manufacturing unit has a longer side and a shorter side, and the manufacturing unit comprises a cover covering the manufacturing unit, wherein the cover has a second opening in the shorter side, wherein the second opening is positioned for allowing at least an end of the transferred piece on top of the external axis to be moved past the cover.

    13. The portable manufacturing unit according to any of the claims 10-12, wherein the manipulator changes manipulation heads between the steps of transferring the piece onto the external axis and of manufacturing the transferred piece, the manipulator simultaneously comprises the manipulation heads for transferring the piece onto the external axis and for manufacturing the transferred piece, or the manipulator for transferring the piece onto the external axis and the manipulator for manufacturing the transferred piece are two different manipulators.

    14. The portable manufacturing unit according to any of the preceding claims 10-13, wherein the portable manufacturing unit is controlled by a mobile computing device.

    15. The portable manufacturing unit according to any of the preceding claims 10-14, wherein the portable manufacturing unit comprises a sensor configured for determining length and/or width and/or thickness of the piece to be manufactured and/or of the manufactured construction component.

    16. The portable manufacturing unit according to any of the preceding claims 10-15, wherein the portable manufacturing unit comprises a labelling machine configured for labelling the manufactured construction components at least partly uniquely.

    17. A software application executable on a machine tool system for machining a workpiece, wherein the software application is configured for executing the steps of: instructing an image recording device to capture an image of a first workpiece having one or more hand-drawn notations, associating the one or more notations with at least one machining step of the machine tool system, presenting on a display device the at least one machining step of the machine tool system on or in relation to the captured image, and/or the first workpiece as the first workpiece will appear after machining the first workpiece according to the at least one machining step of the machine tool system, and optionally controlling a tool of the machine tool system for machining the first workpiece according to the one or more notations.

    18. The software application according to claim 17, wherein the software application is configured for executing the steps of presenting on the display device a distance from a first point on the first workpiece to a second point on the first workpiece, where the machining step will be machining the first workpiece according to the one or more notations, and/or an angle between a first direction of the first workpiece and a second direction of the first workpiece, in which the machining step will be machining the first workpiece according to the one or more notations.

    19. The software application according to claim 17 or 18, wherein the display device is an electronic screen, augmented reality glasses, virtual reality glasses, a hologram illuminated by light, and/or a brain-computer interface.

    20. The software application according to any of the preceding claims 17-19, wherein the software application is configured for executing the steps of: presenting on the display device the option of either including or excluding the one or more hand-drawn notations in the part of the first workpiece that is removed when machining the first workpiece.

    21. The software application according to any of the preceding claims 17-20, wherein the image recording device is a camera, preferably a digital camera like a CCD camera or a camera with an active-pixel-sensor like e.g. a CMOS sensor.

    22. The software application according to any of the preceding claims 17-21, wherein the software application is configured for executing the steps of: presenting on the display device different types of connections between the first workpiece and a second workpiece, receiving instruction about the type of connection to connect the first workpiece and the second workpiece, and optionally controlling the tool of the machine tool system for machining the first workpiece according to the one or more notations and according to the type of connection to connect the first workpiece and the second workpiece, and optionally controlling the tool of the machine tool system for machining the second workpiece according to the one or more notations and according to the type of connection to connect the first workpiece and the second workpiece.

    23. The software application according to any of the preceding claims 17-22, wherein the software application is configured for executing the step of determining for each of the one or more hand-drawn notations whether the one or more hand-drawn notations is/are a straight or curved line, a number and/or one or more letters, a symbol, or a combination thereof.

    24. The software application according to claim 23, wherein the symbol is a first symbol, e.g. an arrow, for indicating a length, or a second symbol, e.g. a circular arc, for indicating an angle.

    25. The software application according to claim 23 or 24, wherein the software application is configured for executing the step of associating a first hand-drawn notation determined to be a number with a second hand-drawn notation determined to be a straight or curved line, or a symbol.

    26. The software application according to claim 25, wherein the second hand-drawn notation is a straight or curved line, and wherein if the position of the machining step according to the second hand-drawn notation does not coincide with the machining step according to the number, the software application is configured for executing the step of interpreting the machining step indicated by the second hand-drawn notation to be in accordance with the number.

    27. The software application according to any of the preceding claims 17-26, wherein the software application is configured for interpreting a hand-drawn notation within the content of a recorded image in order to extract from the hand-drawn notation a number, and for manipulating a location of a line in the captured image based on the number.

    28. A machine tool system for machining a workpiece, wherein the machine tool system is configured to be controlled by the software application according to any of the preceding claims 17-27.

    29. The machine tool system according to claim 28, wherein the machine tool system comprises an image recording device positioned on a joint arm.

    30. The machine tool system according to claim 28 or 29, wherein the machine tool system is integrated in the portable manufacturing unit according to any of the claims 10-16.

    31. A method of machining a workpiece using a tool of a machine tool system, comprising the steps of hand-drawing one or more hand-drawn notations on a first workpiece, capturing an image of the first workpiece having the one or more hand-drawn notations using an image recording device, associating the one or more hand-drawn notations with at least one machining step of the machine tool system, and controlling a tool of the machine tool system for machining the first workpiece according to the one or more notations.

    32. The method according to claim 31, wherein the method further comprises any of the steps of claims 1-9.

    Description

    BRIEF DESCRIPTION OF THE DRAWINGS

    [0144] The invention will in the following be described in greater detail with reference to the accompanying drawings:

    [0145] FIG. 1 shows a schematic view of a flow-chart diagram for machining a workpiece;

    [0146] FIG. 2 shows a schematic view of a computer-controlled fabrication system;

    [0147] FIG. 3 shows a schematic view of the computer-controlled fabrication system in more detail;

    [0148] FIG. 4 shows a schematic view of a flow-chart diagram of the algorithmic process of line/curve detection;

    [0149] FIG. 5a shows a schematic view of a diagrammatic description of the algorithmic process to detect a straight line;

    [0150] FIG. 5b shows a schematic view of a diagrammatic description of the algorithmic process to detect a curved line;

    [0151] FIG. 6a shows a schematic view of a computer-controlled fabrication system positioned in a trailer;

    [0152] FIG. 6b shows a schematic view of another embodiment of a computer-controlled fabrication system framed within a trailer;

    [0153] FIG. 6c shows a schematic view of another embodiment of a computer-controlled fabrication system positioned in a trailer;

    [0154] FIG. 7 shows a schematic view of a rotatable worktable and a robotic manipulator with a tool;

    [0155] FIG. 8 shows a schematic view of a flow-chart diagram showing the step of interpretation of hand-drawn notations;

    [0156] FIG. 9 shows a schematic view of different notations;

    [0157] FIG. 9a shows a schematic view of a board with notations;

    [0158] FIG. 9b shows a schematic view of a board with other notations; and

    [0159] FIG. 10 shows a schematic view of a flow-chart diagram of symbol processing.

    DETAILED DESCRIPTION OF THE INVENTION

    [0160] FIG. 1 shows a diagram that describes the workflow for machining a workpiece from start to finish.

    [0161] The workflow starts by the user drawing one or more hand-drawn notations on the workpiece in a step 101. In a step 102, the workpiece is placed in front of the computer-controlled fabrication system.

    [0162] If a thickness of the workpiece has been given as an input to the computer-controlled fabrication system already by the user, the computer-controlled fabrication system will adapt the machining process based on the given thickness as in a step 104.

    [0163] In a step 103, if a thickness of the workpiece has not been given by the user, the computer-controlled fabrication system will instruct a camera or an image recording device to capture an image of a side of the workpiece. The side of the workpiece may be visible from the position of the camera, or the workpiece is rotated to expose the side of the workpiece to the camera, or the camera is moved to better capture an image of the side of the workpiece. Alternatively, a combination of two or all three of the alternatives is also possible. The thickness of the workpiece can be determined by an earlier calibration with the camera in the same position, where the ratios between distance in the image and length on the side of the workpiece are determined.

    [0164] By determining the thickness of the workpiece, the computer-controlled fabrication system will know whether a tool attached to and controlled by the computer-controlled fabrication system is able to machine the side of the workpiece in one stroke, or whether a lower part of the workpiece has to be machined in a second stroke or round by the tool.

    [0165] Alternatively, if the thickness of the workpiece is too large for the tool, the computer-controlled fabrication system might indicate to the user that the tool has to be exchanged for another, larger tool that is able to machine the side of the workpiece in one stroke.
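    As an illustration of the thickness determination and the one-stroke/two-stroke decision described above, a minimal sketch follows; the function names, the reference values and the maximum cut depth are hypothetical, not taken from the application:

```python
def thickness_mm(thickness_px, reference_mm, reference_px):
    """Thickness of the workpiece side from the image, using the ratio between
    a known physical length and its length in pixels from an earlier
    calibration with the camera in the same position."""
    return thickness_px * (reference_mm / reference_px)

def strokes_needed(thickness, max_cut_depth):
    """One stroke if the tool reaches through the workpiece, otherwise the
    lower part must be machined in a second stroke or round."""
    return 1 if thickness <= max_cut_depth else 2

t = thickness_mm(88, reference_mm=100.0, reference_px=400)  # 0.25 mm/px -> 22.0 mm
strokes_needed(t, max_cut_depth=60.0)  # -> 1 (tool cuts through in one stroke)
strokes_needed(t, max_cut_depth=20.0)  # -> 2 (lower part needs a second round)
```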

    [0166] In a step 105, the camera or the image recording device is used to capture an image of the one or more hand-drawn notations drawn on the workpiece. The digital information from the camera is algorithmically interpreted in a step 106, which can, inter alia, mean that pixels in the image that depict dye or pigment from e.g. ink or graphite are set apart from pixels in the image that do not depict dye or pigment. The pixels determined to depict dye or pigment together form notations.

    [0167] The notations can be compared to stored notations. An arrow up to a line across a first workpiece in the form of a board to be machined, combined with a handwritten number on the first workpiece, can mean that the first workpiece should be machined or cut the handwritten number of e.g. mm away from the short end of the first workpiece from which the arrow originates. The software can be set to interpret the handwritten number in another length unit than mm, e.g. cm or inch. A circular arc between a side of the first workpiece and a hand-drawn line across the first workpiece, combined with a handwritten number on the first workpiece, can mean that the first workpiece should be machined or cut at an angle corresponding to the handwritten number in e.g. degrees or radians, depending on how the interpreting software is set.
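    The mapping from a recognised symbol and handwritten number to a machining instruction can be sketched as follows; this is an illustrative reading of the rules above, and the names, the units table and the instruction tuples are assumptions:

```python
import math

# Conversion factors to mm for the configurable length unit (assumed settings).
UNIT_FACTORS = {"mm": 1.0, "cm": 10.0, "inch": 25.4}

def interpret(symbol, value, length_unit="mm", angle_unit="deg"):
    """Map a recognised symbol plus a handwritten number to an instruction:
    an arrow indicates a cut at a distance from the short end, a circular
    arc indicates a cut at an angle."""
    if symbol == "arrow":
        return ("cut_at_distance_mm", value * UNIT_FACTORS[length_unit])
    if symbol == "arc":
        angle = value if angle_unit == "deg" else math.degrees(value)
        return ("cut_at_angle_deg", angle)
    raise ValueError(f"unknown symbol: {symbol}")

interpret("arrow", 5, length_unit="cm")          # -> ("cut_at_distance_mm", 50.0)
interpret("arc", math.pi / 4, angle_unit="rad")  # cut at roughly 45 degrees
```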

    [0168] A hand-drawn notation in the form of a straight or curved line on the first workpiece can tell that the first workpiece should be machined along the straight or curved line.

    [0169] In a step 107, how the software has interpreted the one or more hand-drawn notations as they appear in the captured image is visualised on a display device such as a tablet.

    [0170] The user can approve or disapprove the interpretation in a step 108. The disapproval can be an indication that the user wants to add one or more custom details to the digital model or to modify the registered drawing.

    [0171] The user can add one or more custom details to the digital model in a step 109, where the added custom detail can be joinery information between a first workpiece and a second workpiece. The joinery information can e.g. be the angle at which the two workpieces are supposed to meet after the machining processes of the two workpieces. The joinery information can e.g. be that both workpieces, supposed to meet each other at a certain angle, should be machined or cut with half that angle. The joinery information can e.g. be that the two workpieces should meet each other with an overlap so that an upper part of the first workpiece is removed while the lower part of the first workpiece is not removed, and so that a lower part of the second workpiece is removed while the upper part of the second workpiece is not removed, and wherein the upper parts are correspondingly machined to meet each other when the two workpieces are assembled, and wherein the lower parts are correspondingly machined to meet each other when the two workpieces are assembled. Such a joint gives an extra strong connection between the two workpieces. The joinery information can e.g. be that the two workpieces should meet each other with an overlap in three layers, where an end of the first workpiece has a U-shape and an end of the second workpiece has a corresponding T-shape that fits in the U-shaped end of the first workpiece.
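    The mitre rule mentioned above, where two workpieces supposed to meet at a certain angle are each cut with half that angle, amounts to a one-line computation; a sketch with hypothetical naming:

```python
def mitre_cut_angles(meeting_angle_deg):
    """Each of the two workpieces is cut with half the angle at which the
    assembled workpieces are supposed to meet."""
    half = meeting_angle_deg / 2.0
    return half, half

mitre_cut_angles(90.0)  # -> (45.0, 45.0): two 45-degree cuts form a right angle
```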

    [0172] The added custom detail can be whether the machining shall leave or not leave the hand-drawn notation in the form of a line on the first workpiece after the machining.

    [0173] Instead of or in addition to the step 109, the user can modify the registered drawing in a step 110.

    [0174] The modification can e.g. be a modification of the interpretation of a handwritten number, e.g. by typing the right number as an input to the computer-controlled fabrication system replacing the wrong interpretation of the handwritten number.

    [0175] If a straight line has wrongly been interpreted as a curved line, the modification can be a modification of the curved line to a straight line.

    [0176] The modification can also be a supplementary hand-drawn notation on the first workpiece so that the process starts again in the step 101. In many situations it is easier to add a hand-drawn notation on the first workpiece than to modify the visualized input on the display device.

    [0177] After the addition(s) in the step 109 and/or the modification in the step 110 the diagram leads back to the step 107 for another approval in the step 108.

    [0178] When the user approves of the visualized input or interpretation of the computer-controlled fabrication system, the first workpiece is machined or fabricated in a step 111 by a tool controlled by the computer-controlled fabrication system.

    [0179] When the joinery information is chosen in the step 109, and the first workpiece has been fabricated in the step 111, the second workpiece can be placed in front of the system as in the step 102 without any hand-drawn notation on the second workpiece; the software can store the information chosen for the first workpiece and use that information when machining the second workpiece. Alternatively, notations are hand-drawn on the second workpiece to instruct the computer-controlled fabrication system as was done for the first workpiece.

    [0180] In FIG. 2, the computer-controlled fabrication system is shown comprising a detection device such as a digital camera 201, e.g. a CCD camera, for capturing images of a workpiece 205 containing the hand-drawn drawing, where the workpiece 205 is placed on a worktable 206. The camera 201 can be mounted externally or on a computer-controlled fabrication device 204.

    [0181] The computer-controlled fabrication device 204 can be a robotic arm with a tool mounted on it. The robotic arm can have one, two, three or more linkages and be positioned on a base that can be rotated 360° and/or can be slidable along a direction parallel or perpendicular to the workpiece 205 to give the computer-controlled fabrication device 204 the mobility that the task requires.

    [0182] The computer-controlled fabrication device 204 can be a computer numerical control (CNC) machine controlling machining tools (drills, boring tools, lathes) or 3D printers by means of a computer.

    [0183] The worktable 206 can be moved horizontally in one or two dimensions for moving the workpiece 205 so that all parts of the workpiece 205 can be machined by the computer-controlled fabrication device 204.

    [0184] The computer-controlled fabrication system also comprises a server 202, which can be in the cloud, and a display device such as a tablet user interface (UI) 203, where the tablet UI can be any form of electronic interface having a display device for showing an image and having an electronic connection for wired or wireless two-way communication with the server 202.

    [0185] The camera 201 is in a wired or wireless communication with the server 202.

    [0186] The server 202 is also in a wired or wireless communication with the computer-controlled fabrication device 204 for controlling the computer-controlled fabrication device 204.

    [0187] When machining the workpiece 205, the user draws the hand-drawn notation on the workpiece 205 to instruct the computer-controlled fabrication system how to machine the workpiece 205. The workpiece 205 is placed on top of the worktable 206.

    [0188] The user then instructs the computer-controlled fabrication system through the tablet 203 to start the machining process of the workpiece 205 by capturing an image of the workpiece 205 by the camera 201. The image data of the image is sent from the camera 201 to the server 202. The server 202 processes the image data including the hand-drawn notation to find out how to machine the workpiece 205. The processing can preferably be a translation of the image data into CAD information that can be used to control the computer-controlled fabrication device 204.

    [0189] The processed or translated image data is sent to the tablet 203, where the workpiece 205 is shown in the image together with the machining step to be performed as understood according to the processing of the server 202. If the computer-controlled fabrication device 204 controls a tool e.g. in the form of a circular cutting blade, the tool will preferably be machining the workpiece 205 by making straight cuts. If the computer-controlled fabrication device 204 controls a tool e.g. in the form of a rotary cutter, the tool can be machining the workpiece 205 along straight lines and/or curved lines.

    [0190] If the machining step is machining along a straight line, the angle of the straight line in relation to another suitable direction as e.g. an edge of the workpiece 205 is preferably shown in the processed image.

    [0191] If the machining step is machining along a curved line, the angle of a tangent of the curved line at an edge of the workpiece 205 in relation to a suitable direction as e.g. the same or another edge of the workpiece 205 is preferably shown in the processed image.
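    The tangent angle discussed above can be approximated from the first two sampled points of the detected curve at the edge of the workpiece; a minimal sketch, where the sampling and the reference direction are assumptions, not the application's method:

```python
import math

def tangent_angle_deg(points, edge_direction=(1.0, 0.0)):
    """Approximate the angle between the curve's tangent at its first point
    and a reference direction, e.g. an edge of the workpiece."""
    (x0, y0), (x1, y1) = points[0], points[1]
    tangent = math.atan2(y1 - y0, x1 - x0)
    edge = math.atan2(edge_direction[1], edge_direction[0])
    return abs(math.degrees(tangent - edge)) % 180.0

# A curve entering the workpiece at 45 degrees to a horizontal edge:
tangent_angle_deg([(0.0, 0.0), (1.0, 1.0), (2.0, 1.5)])  # -> 45.0
```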

    [0192] The processed image as shown on the tablet 203 will show whether the hand-drawn line along which the machining should be performed is left on the workpiece 205 after the machining process. If the processed image shows that the line is left on the workpiece 205 after the machining process even though the line should be removed, or vice versa, the user can give that input on the tablet 203 to the server 202 so that the processed image as shown on the tablet 203 is updated with this new information.

    [0193] The hand-drawn notation can e.g. also be a handwritten number, a circular arc and/or an arrow as discussed above in relation to the diagram in FIG. 1.

    [0194] The tablet 203 communicates with the computer-controlled fabrication device 204 via the server 202. Once an approval or a machining signal is received, the computer-controlled fabrication device 204 will start machining the workpiece 205.

    [0195] FIG. 3 shows the computer-controlled fabrication system of FIG. 2, where the computer-controlled fabrication device is a robot manipulator 304 with a tool 305 mounted on the robot manipulator for machining a workpiece 306 positioned on a worktable 307. The worktable has an external rotating axis and/or an ability to displace the workpiece linearly so that the workpiece can be positioned in a good position during the machining.

    [0196] A camera 301 captures an image of the workpiece 306. The camera 301 can be positioned at a fixed place, on a joint arm so that the position of the camera can be changed (e.g. to be able to capture images of the side of the workpiece, or of all of the workpiece even if the workpiece is long), or on the joint arm of the robot manipulator 304.

    [0197] A local server 302 receives information from the camera 301 and translates the information into CAD information that is sent to a tablet UI 303. The tablet 303 is used to interpret, alter and add information to the captured information and present the information to the user on a user interface (UI) like a screen. Once satisfied, the user will send a fabrication signal to the server via the tablet 303, which in turn sends fabrication information to the robot manipulator 304.

    [0198] In FIG. 4, a diagram is shown describing the algorithmic process of line/curve detection. This is one example. Other examples are possible.

    [0199] In a step 401, the camera generates image data by capturing an image of the workpiece.

    [0200] In a step 402, a camera calibration is applied to the image data, ensuring that there is no spherical distortion in the image/video from which the drawing is detected.
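    The application does not specify the calibration model; a common choice for removing spherical (radial) distortion is a one-coefficient radial model, sketched here purely as an illustration:

```python
def undistort_point(x, y, cx, cy, k1):
    """Correct a pixel coordinate for radial distortion around the image
    centre (cx, cy). This applies a one-coefficient radial model directly;
    a real calibration pipeline estimates k1 and inverts the lens model."""
    dx, dy = x - cx, y - cy
    r2 = dx * dx + dy * dy
    scale = 1.0 + k1 * r2
    return cx + dx * scale, cy + dy * scale

undistort_point(110.0, 100.0, cx=100.0, cy=100.0, k1=0.0)  # -> (110.0, 100.0)
```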

    [0201] It is determined whether the hand-drawn notation is a straight line or a curved line. Manual selection is also possible.

    [0202] The curved line can be a freestanding curved line, or it can be part of a number or a sign having a curved segment. If a line, number or sign has both a straight line and a curved line connected to each other, the line, number or sign may be dealt with as a curved line, or may be considered as separate lines: one or more straight lines and one or more curved lines.

    [0203] Dependent on the outcome of whether a line is straight or curved, a selection mechanism in the step 402 sends the analysis further along steps 403-409 for analysing straight lines or along steps 410-415 for analysing curved lines. The steps 403-409 and the steps 410-415 resemble each other.

    [0204] The workpiece will not be a perfect, white piece, and the hand-drawn notation will not be totally black, but will have structures that can resemble e.g. the ink or graphite from a pen or pencil. A workpiece of concrete will be greyish with some small cracks having a darker colour. A wooden workpiece will have structures in the surface of the wooden workpiece due to e.g. growth rings. The cracks and growth rings must not be interpreted as hand-drawn notations or as parts of a handwritten sign. For that reason, the image of the workpiece with the handwritten sign has to be processed so that the handwritten sign is emphasised and lines in the image due to structures in the workpiece itself are eliminated.

    [0205] In a step 403 for line detection, the colour of the image/video is calibrated, by making colours more vibrant or by removing certain colour values and substituting them with white/black. The advantage of the calibrated colours is that the handwritten sign(s) is/are accentuated and background noise is removed so that it is easier for the system to correctly read and interpret the handwritten sign(s).

    [0206] In a step 404, the image is converted to greyscale (black, grey and white) to simplify the computational process.

    [0207] In a step 405, the image is blurred to remove noise.

    [0208] In a step 406, edge detection is performed on the blurred image. The edge is determined to be between adjacent pixels having a difference in darkness above a certain threshold.

    [0209] In a step 407, the edge detection is used to make an alpha map (containing only white/black colours). Inside the edges the pixels will be only black and outside the edges the pixels will be only white, or vice versa. The edges are removed so that the alpha map does not have the edges.

    [0210] In a step 408, a feature detection is performed based on the alpha map, where a line is applied somewhere in the middle of the black or white pixels, which earlier were inside the edges.

    [0211] In a step 409, the hand-drawn notation is detected and distinguished from the rest of the image.
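    The steps 404-409 above can be sketched end-to-end with plain NumPy; the box blur, the darkness threshold and the least-squares line fit stand in for the blur, alpha map and feature detection of the application, and all values are illustrative:

```python
import numpy as np

def detect_line(grey):
    """Blur a greyscale image, mark dark (notation) pixels in an alpha map,
    and fit a straight line y = a*x + b through the marked pixels."""
    h, w = grey.shape
    # Blur with a 3x3 box filter to remove noise (stands in for step 405).
    padded = np.pad(grey.astype(float), 1, mode="edge")
    blurred = sum(padded[i:i + h, j:j + w]
                  for i in range(3) for j in range(3)) / 9.0
    # Alpha map: dark pixels become True (illustrative threshold).
    alpha = blurred < 180
    # Least-squares line fit through the marked pixels (stands in for step 408).
    ys, xs = np.nonzero(alpha)
    slope, intercept = np.polyfit(xs, ys, 1)
    return slope, intercept

# A synthetic 20x20 white board with a dark diagonal stroke y = x:
img = np.full((20, 20), 255.0)
for i in range(20):
    img[i, i] = 0.0
slope, intercept = detect_line(img)  # slope close to 1, intercept close to 0
```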

    [0212] In a step 410 for curve detection, the colour is calibrated as in the step 403 and for the same reason.

    [0213] In a step 411, the image is blurred as in the step 405 and for the same reason.

    [0214] In a step 412, the image is converted to an alpha map (similarly to the output of step 407) using a threshold technique for the same reason.

    [0215] In a step 413, 2D contour is performed on the image.

    [0216] In a step 414, centre lines are extracted from the 2D contour.

    [0217] In a step 415, the curve is detected based on the extracted line.
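    The centre-line extraction of steps 413-414 can be illustrated in its simplest form: for each image row crossed by the stroke, the midpoint of the marked pixels is taken as one centre-line sample. This row-wise sketch is an assumption, not the application's contour algorithm:

```python
import numpy as np

def centre_line(alpha):
    """Extract centre-line samples from an alpha map by taking, per row,
    the mean column of the marked (True) pixels."""
    points = []
    for y in range(alpha.shape[0]):
        xs = np.nonzero(alpha[y])[0]
        if xs.size:  # the row intersects the stroke
            points.append((float(xs.mean()), float(y)))
    return points

# A 3-pixel-wide vertical stroke occupying columns 4-6 of a 5x10 map:
alpha = np.zeros((5, 10), dtype=bool)
alpha[:, 4:7] = True
centre_line(alpha)  # -> [(5.0, 0.0), (5.0, 1.0), (5.0, 2.0), (5.0, 3.0), (5.0, 4.0)]
```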

    [0218] In a step 416, the curve/line is scaled from pixel space to metric space so that the robot manipulator and the tool of the robot manipulator can be controlled to perform the right machining at the right scale or dimension. It is important that the scaling is correct so that the cut or machining will be at the correct location.
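    Step 416 is a uniform change of units; assuming a single mm-per-pixel factor from the calibration, the scaling of the detected points can be sketched as:

```python
def to_metric(points_px, mm_per_px):
    """Scale detected curve/line points from pixel space to metric space so
    that the tool is driven at the correct physical scale."""
    return [(x * mm_per_px, y * mm_per_px) for x, y in points_px]

to_metric([(0, 0), (400, 300)], mm_per_px=0.25)  # -> [(0.0, 0.0), (100.0, 75.0)]
```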

    [0219] In a step 417, a metric CAD representation of the drawing is visualized on the tablet so that the user can approve, modify or disapprove the way the computer-controlled fabrication system has understood the hand-drawn notation.

    [0220] FIG. 5a shows a diagrammatic description of the algorithmic process to detect a line as described in the steps 403-409. A captured, coloured image is processed so that the colour of the captured image is calibrated, and we have a colour calibrated image a501.

    [0221] The colour calibrated image a501 is converted to a black and grey (100=white and 0=black) image a502.

    [0222] The black and grey image a502 is blurred by performing a Gaussian blur, which averages neighbouring pixels, and we have a blurred image a503.

    [0223] Edge detection is performed on the blurred image a503 so that sudden changes in intensity (changes in intensity between neighbouring pixels) can be identified, and we have an image with edge detection a504.

    [0224] The image with edge detection a504 is converted by alpha mapping to an alpha map image a505.

    [0225] In the alpha map a505 feature detection is performed, which returns a generalized description of a line or a generalized line in the form of a point and angle as shown in a feature detection image a506.

    [0226] The generalized line in the feature detection image a506 is converted to a conventional line information image a507 with a start point and an end point, or with start coordinates for the start point and end coordinates for the end point. In this way, the hand-drawn notation in the form of a straight line can be translated into a line with a well-defined start point and a well-defined end point. The conventional line information image a507 with the well-defined line can be sent to the tablet for approval by the user, where the well-defined line will represent a machining process such as a straight cut or milling process along the well-defined line.
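
The conversion from a506 to a507 can be sketched as follows: a generalized line given as a point and an angle is clipped to the extent of the detected notation pixels, yielding a conventional start and end point. This is an illustrative simplification (clipping only against a horizontal extent), not the disclosed implementation.

```python
import math

def generalized_to_segment(point, angle_rad, x_min, x_max):
    """Return (start, end) of the line through `point` at `angle_rad`,
    clipped to the horizontal extent [x_min, x_max] of the notation."""
    x0, y0 = point
    slope = math.tan(angle_rad)
    start = (x_min, y0 + (x_min - x0) * slope)
    end = (x_max, y0 + (x_max - x0) * slope)
    return start, end

# A horizontal stroke detected around y = 50, spanning x = 10..90.
start, end = generalized_to_segment((50, 50), 0.0, 10, 90)
# start == (10, 50.0), end == (90, 50.0)
```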

    [0227] FIG. 5b shows a diagrammatic description of the algorithmic process to detect a curve as described in the steps 410-415. First, an imported image b501 is captured by the camera and sent to the server to be processed.

    [0228] The imported image b501 is processed so that the colour of the captured image is calibrated, enhancing the colour spectrum, and we have a colour calibrated captured image b502. The colour calibrated captured image b502 may have been optimized through an application of the blur algorithm used above when processing the black and grey image a502 to achieve the blurred image a503.

    [0229] The colour calibrated captured image b502 is converted by alpha mapping to an alpha map captured image b503.

    [0230] The alpha map captured image b503 is further processed for extracting a 2D contour by adjusting the colour value to 0 for pixels within a predefined threshold limit and to 100 for pixels outside the threshold, yielding a 2D contour image b504.

    [0231] Finally, a centreline for the contour is found in and added to the 2D contour image b504 to yield a centreline image b505. There are many mathematical models that can be utilised to achieve a good and correct position of the centreline. In this way, the hand-drawn notation in the form of a curved line can be translated into a line with a well-defined start point and a well-defined end point, where the centreline image b505 with the centreline can be sent to the tablet for approval by the user, where the centreline will represent a machining process such as a curved milling process along the centreline.
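
One simple model, among the many mathematical models mentioned above, for placing the centreline is to take, per image row, the midpoint between the leftmost and rightmost stroke pixels. This sketch is purely illustrative and assumes the 0 = black / 100 = white alpha-map convention used earlier.

```python
def centreline(alpha_rows):
    """alpha_rows: rows with 0 = black (stroke) and 100 = white.
    Returns one (x, y) centre point per row containing stroke pixels."""
    points = []
    for y, row in enumerate(alpha_rows):
        xs = [x for x, v in enumerate(row) if v == 0]
        if xs:
            points.append(((min(xs) + max(xs)) / 2.0, y))
    return points

# A 3-pixel-wide diagonal stroke.
stroke = [
    [0, 0, 0, 100, 100],
    [100, 0, 0, 0, 100],
    [100, 100, 0, 0, 0],
]
# centreline(stroke) == [(1.0, 0), (2.0, 1), (3.0, 2)]
```

A per-row midpoint works for mostly vertical strokes; a production system would use a direction-independent thinning or skeletonization method instead.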

    [0232] In FIG. 6a a computer-controlled fabrication system positioned in a trailer 602 pulled by a car/van 608 is shown.

    [0233] A user has made a hand-drawn notation on a workpiece 603. The workpiece is positioned in the trailer on or in a worktable 606 in such a way that a camera 604 has been able to capture an image of at least that part of the workpiece 603 that has the hand-drawn notation and will be machined. The image has been sent from the camera 604 to a robotic controller or server 607, where the captured image has been processed so that the image could be presented on a tablet UI 601 and used to instruct a robotic manipulator 605 how to machine the workpiece 603. The user has approved the interpretation of the hand-drawn notation and how to machine the workpiece 603, and instructions have been sent from the controller or server 607 to the robotic manipulator 605 on how to machine the workpiece 603. As shown, the robotic manipulator 605 is machining the workpiece 603 at this very moment.

    [0234] Next to the trailer 602, there is further material 609 to process.

    [0235] In FIG. 6b, another embodiment of the computer-controlled fabrication system is shown, where the computer-controlled fabrication system is framed within a trailer 601b, which can be attached to and pulled by an ordinary car or a light truck having a tow bar. Further, the system may be devised to have an opening 602b along the whole length of the trailer for loading construction materials by means of a forklift 604b or related manual loading methods. The trailer according to any embodiment may have support legs 603b for supporting the trailer. When the trailer rests on the support legs instead of on its springs, the trailer is prevented from moving during the loading process, which makes loading easier.

    [0236] In FIG. 6c, another embodiment of the trailer 608c with the computer-controlled fabrication system is shown. This embodiment of the associated production system may further be configured such that it entails a 6-axis industrial manipulator 604c; an onboard power supplying device 601c, such as a generator or battery supply; and a control device 602c for the industrial manipulator. Further, the configuration may entail a bed 606c from which loaded material 607c can be picked. A tool changing device 603c enables the manipulator to interchangeably switch between two or more end-effectors. Finally, the embodiment may entail a longitudinal external axis 600c, on which timber beams can be placed and shifted forwards and backwards in position along the axis 600c. Openings 609c in the short sides 610c of the trailer allow even a board or a beam (not shown) on top of the external axis 600c to be moved by the external axis 600c without being limited by the short sides. Collectors 605c collect machined material, such as sawdust from the machining process, so that the interior of the trailer stays clean. A vacuum pump (not shown) can be connected to the collectors for sucking up the machined material so that the collectors are not overfilled.

    [0237] The totality of this embodiment as such allows for a process in which: a) a material stack is loaded onto the station; b) a sensor device mounted on the manipulator or in the structural frame detects the position and dimension of the loaded material stack; c) the manipulator, using a robotic gripper, picks one timber item from the stack and positions the timber item in the external axis; d) the manipulator changes to a machining end-effector such as a milling spindle or saw; e) the external axis moves the timber item back and forth on the external axis to achieve a desired position, which enables the manipulator to machine the timber item within the confinement of the trailer frame; f) processed pieces are, after a second tool change, picked by the manipulator and placed in a second stack of processed elements, which are positioned within reach of a second opening inside the trailer; g) the processed elements are picked by a forklift or similar device for use in construction.

    [0238] A rotatable worktable 701 is shown in FIG. 7, on which a workpiece 702 is resting. A user 705 is hand-drawing a sign 704 on the workpiece 702 using a pen 703. A robotic manipulator 706 with a tool 707 attached is positioned behind the user 705, ready to receive instructions to machine the workpiece 702.

    [0239] FIG. 8 shows in a flowchart diagram how the step of the computer-implemented interpretation 104 in FIG. 1 of hand-drawn symbols on the workpiece may be processed in detail.

    [0240] In a first step, a camera input from a camera is received in the form of a video or images 801. The images are segmented 802 and the segments are formatted against a group of pre-determined image classifier standards 803. The classification is executed using an instance of an Artificial Intelligence (AI) system 804-807, which is pre-trained to detect either text, symbols, paths (see FIG. 9) or a combination thereof within an image. This type of classification using an AI system is a standard procedure and well-known in the art, see e.g. Pattern Recognition and Machine Learning, Christopher Bishop, Springer, 2006; Machine Learning for Text, Charu C. Aggarwal, Springer, 2018; and Understanding Machine Learning: From Theory to Algorithms, Shai Shalev-Shwartz and Shai Ben-David, Cambridge University Press, 2014.

    [0241] Upon analysis, the AI system will return a value to denote if actionable content was detected in the camera input 808. In case no actionable content is detected, the system will prompt the user and await input through the display unit 809. In case actionable content is detected, an Action Identifier 811 determines the further process against the classification of the found content (path, text, symbol or combinations thereof).

    [0242] In case of detecting paths only 812, the information may be processed according to the preceding steps 401-417 in FIG. 4. For combinations of paths and texts 813, the path and text objects are segmented from one another. The path segment 814 is processed in process drawing 815 according to the steps 401-417 of preceding FIG. 4.

    [0243] The isolated text segment 816 is formatted against the requirements of the text interpreter 817. The text interpreter 818 represents a second AI system, which may rely on multi-classification or convolutional neural nets to classify individual, hand-written letters. The text interpreter may be trained 819 against formatted training data 820, e.g. the MNIST database of handwritten digits or similar databases.

    [0244] In a further step 821, the processed texts and paths are analysed in combination, and the spatial relation between individual path segments and individual text objects is determined. By measuring the distances from any given point on a path segment to the area centroid of a text object, pairs are created between text and path segments. Upon determining pairs, the text object is screened for numerical content according to any number system.

    [0245] Upon determining a numeric value, the value will be assumed to represent dimensional information for the path segment, provided by the user. The interpreted dimension information is measured against the real-world measures of hand-drawn elements on the workpiece. In case of discrepancies, such as e.g. the dimensional information stating 45 cm in real units and the measured length of a segment being registered as 43.7 cm, the interpretation module adjusts the interpreted length of the path segment (43.7 cm) to the identified dimensional value read from the text object (45 cm) and prompts the user through the display port for acceptance or correction of the interpretation.
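
The pairing and adjustment described above can be sketched as follows. Each text object's area centroid is paired with the nearest path segment (here simplified to the distance to the segment midpoint), and the measured segment length is then snapped to the numeric value read from the text. The 45 cm / 43.7 cm figures follow the example in the paragraph above; the function names are illustrative.

```python
import math

def nearest_segment(text_centroid, segments):
    """Pair a text object's area centroid with the closest segment,
    measured to the segment midpoint (a simplification)."""
    def dist(seg):
        (x1, y1), (x2, y2) = seg
        mx, my = (x1 + x2) / 2, (y1 + y2) / 2
        return math.hypot(text_centroid[0] - mx, text_centroid[1] - my)
    return min(segments, key=dist)

def snap_length(segment, target_len):
    """Scale the segment about its start point so its length matches
    the dimensional value read from the paired text object."""
    (x1, y1), (x2, y2) = segment
    cur = math.hypot(x2 - x1, y2 - y1)
    f = target_len / cur
    return ((x1, y1), (x1 + (x2 - x1) * f, y1 + (y2 - y1) * f))

segments = [((0, 0), (43.7, 0)), ((0, 30), (20.0, 30))]   # lengths in cm
seg = nearest_segment((22, 3), segments)   # the "45" text pairs with seg 1
adjusted = snap_length(seg, 45.0)          # measured 43.7 cm -> stated 45 cm
```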

    [0246] In case of identifying the presence of both symbols and text in the drawn content 822, text and symbols are separated and text elements are processed according to steps 816-818. The isolated symbol 823 is processed to generate a numeric instance of the symbol 824. This instance may be achieved according to the steps outlined in FIG. 10. In a further step 825, the numeric instance of the symbol may be augmented against text data found in the text interpretation process. This could entail situations such as augmenting the length of an arrow or diameter of a circle based on dimensional information given in text form. In case of identifying only symbolic image content 826, the symbol is processed according to step 824.

    [0247] As stated in preceding descriptions, one instance of the disclosed method (FIG. 1), may be to identify image content according to three classes: a) paths; b) text; and c) symbols. Paths are considered any instance of open or closed curves, drawn by the user on the workpiece. An open curve denotes a curve, in which the starting point and end point of the curve are not coinciding. This may take the form of straight lines 900 or curved lines 903, as shown in FIG. 9.

    [0248] Closed curves denote any curve in which the start point and end point are coinciding. These may be distinguished into closed curves which can be conformed to geometric primitives, such as circles, ellipsoids, triangles, rectangles and other polygons 901; or arbitrary closed curves that do not conform to known geometric primitives 902.
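
The open/closed distinction defined above can be sketched in a few lines: a curve is treated as closed when its start and end points coincide within a small tolerance, since hand-drawn strokes rarely meet exactly. The tolerance value is an illustrative assumption.

```python
import math

def is_closed(curve, tol=3.0):
    """True when the curve's start point and end point coincide
    (within `tol`, to tolerate imprecise hand-drawing)."""
    (x1, y1), (x2, y2) = curve[0], curve[-1]
    return math.hypot(x2 - x1, y2 - y1) <= tol

open_line = [(0, 0), (50, 0), (100, 0)]                    # straight line 900
rough_rect = [(0, 0), (40, 0), (40, 20), (0, 20), (1, 2)]  # rough polygon 901
# is_closed(open_line) is False; is_closed(rough_rect) is True
```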

    [0249] In the context of the present disclosure, the utility of paths is considered the interpretation of user hand-drawn curves to designate a desired toolpath for a machining process, e.g. a straight line to be followed by a digitally controlled saw to perform a cut; or a curve to be followed by a digitally controlled point-tool to machine any arbitrary shape.

    [0250] Image content in the form of text denotes any numeral system used to indicate numeric information hand-drawn on the workpiece by the user, or any elements of an alphabetic system used to supply additional information, such as part numbering or units of measurement. In the context of the present disclosure, the utility of texts is for the user to provide, by hand-writing on the workpiece, supplementary information to a hand-drawn notation. This could be in the form of e.g. stating in numeric form a desired dimension of a line; a desired angle of a line; or a desired diameter of a circle, etc.

    [0251] Image content in the form of symbols denotes any recognizable combination of curve segments that through digital processes can be abstracted to generic symbols, or system-specific symbols stored in a library. In terms of generic symbols, this could be in the form of dimensional arrows or dimension lines 904; cross-markings 905; or angled lines denoting perpendicularity. In the context of the present disclosure, the utility of symbols is to enable the user, through hand-drawing thereof, to instruct a machining system with specific, pre-determined commands that are associated with other drawn content.

    [0252] FIG. 9a exemplifies a use scenario of the disclosed method, pertaining to the preceding descriptions. A workpiece, consisting of a rectangular piece of hardwood, needs to be cut so that the length of the piece after the cut is 400 mm.

    [0253] To instruct the digital production system through hand-drawing, the user performs the following notations on the workpiece: first, a cutting line is drawn 904a. The cutting line is drawn per eye-measure of the user at the approximate location of the cut. Through the steps disclosed in preceding descriptions, the cutting line is recognized as a path-object in the form of an open curve, representing an approximation of a straight line. The line is conformed to an exact straight line, and the location of its start point is measured to 385 mm 906a from an edge 903a of the workpiece.

    [0254] In addition to the cutting line, the user has further drawn a cross-mark 905a; an arrow 902a; and the digits "40" 901a. According to preceding procedures, the cross-mark and arrow are identified as symbols, recognizable within an associated library and denoting a pre-determined instruction. The cross-mark denotes an instruction that the area delimited by the cutting line be considered the cut-off. Hereby, the system is implicitly instructed to position the saw-blade such that the width of the saw-blade is subtracted from the cut-off piece. Further, the arrow is recognized as a dimension instruction, designating the value "40". By associating the dimension instruction to the cut-line position, the system infers that while the measured position of the line is 385 mm, the instructed distance is "40". By proportionally relating "40" to the physical measures of the workpiece, the system will infer that the implied unit of the dimension instruction is centimetres, and thus the instructed dimension target is 400 mm. Following these steps, the system will prompt the user with the result of its interpretation, query the user's acceptance or adjustment of the interpreted instructions, and await a signal for execution.
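
The unit inference described above can be sketched as follows: the written value "40" is compared against the measured position (385 mm) under each candidate unit, and the unit whose metric value lies closest to the measurement wins. This selection rule is an assumed illustration of "proportionally relating" the written value to the workpiece measures.

```python
def infer_dimension_mm(written_value, measured_mm):
    """Pick the unit (mm, cm, m) that makes the hand-written value best
    match the measured real-world distance, and return the target in mm."""
    candidates = {"mm": 1.0, "cm": 10.0, "m": 1000.0}
    unit = min(candidates,
               key=lambda u: abs(written_value * candidates[u] - measured_mm))
    return unit, written_value * candidates[unit]

# The FIG. 9a example: the user wrote "40"; the line is measured at 385 mm.
unit, target_mm = infer_dimension_mm(40, 385.0)
# unit == "cm", target_mm == 400.0 -> the instructed cut is at 400 mm
```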

    [0255] FIG. 9b shows another use scenario, in which the user has performed a hand-drawn notation on a workpiece 905b. The content is identified as: four curves, two linear and two curved; and the numbers 30, 50, 115 and 135. Through the drawing interpretation steps outlined in the preceding descriptions, the system infers the following information: two straight lines are identified as paths with associated text, denoting dimension instructions 904b and 907b; two curved segments are identified as angle demarcations with associated angle values 901b, 902b.

    [0256] The interpreted content is thus two straight cut lines, with an interior angle of 115 degrees, meeting the boundary of the workpiece at an angle of 35 degrees. The measured start points of the cutting lines are placed respectively 100 mm 906b and 425 mm 903b from the shared corner within the area delimited by the cutting lines. This interpretation is prompted to the user for acceptance or editing, upon which execution may be initiated.

    [0257] In FIG. 10 an embodiment of the present disclosure is shown, where the method of symbol processing denoted in FIG. 8 (steps 823-826) may entail the following steps: at the initiation of the symbol processing 1001, the symbol is segmented from the image content 1002, and the segmented image is converted to an alpha map 1003. Supplementarily, an image classifier 1009 is used to determine the recognized type of symbol against an electronic library of available symbols, which the system is trained to detect, 1010. In parallel, 2D contours of the image are extracted 1004. In a further step, contours are clustered into subgroups according to their direction within the image space 1005. Based on this information an abstracted, numeric instance of the symbol is obtained 1006.
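
The direction clustering of step 1005 can be sketched as follows: consecutive contour points are turned into edge direction angles, which are then binned into subgroups (here four 45-degree-wide bins over 0-180 degrees). The bin width and the dictionary representation are illustrative assumptions.

```python
import math

def cluster_by_direction(contour, bins=4):
    """Group contour edges by their direction in image space.
    Returns {bin_index: [edge, ...]} with bins spanning 0..180 degrees."""
    groups = {}
    for p, q in zip(contour, contour[1:]):
        # Edge direction, folded into [0, 180) so opposite directions match.
        angle = math.degrees(math.atan2(q[1] - p[1], q[0] - p[0])) % 180.0
        idx = int(angle // (180.0 / bins)) % bins
        groups.setdefault(idx, []).append((p, q))
    return groups

# A right-angled corner: a horizontal run followed by a vertical run.
contour = [(0, 0), (1, 0), (2, 0), (2, 1), (2, 2)]
groups = cluster_by_direction(contour)
# two subgroups: horizontal edges in bin 0, vertical edges in bin 2
```

For a symbol such as a dimension arrow 904, such subgroups (shaft direction versus arrowhead directions) give the abstracted numeric instance of step 1006 something to be built from.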

    [0258] By distance association of the area centroid of a text group with the closest points of any lines of the symbol, numeric values from hand-written texts of the text interpreter 1011, 1012 are attributed to the symbol 1007. Hereby, an augmented instance of the symbol is achieved 1008, in which an implied machining instruction is contained.