METHOD AND SYSTEM FOR GENERATING A 3D MODEL OF A PLANT LAYOUT
20230142309 · 2023-05-11
Inventors
CPC classification
Y02P90/02
GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
G06F3/04815
PHYSICS
G05B2219/32085
PHYSICS
G06F30/13
PHYSICS
International classification
G06F30/13
PHYSICS
G06F30/27
PHYSICS
Abstract
A system and method for generating a 3D plant layout model departing from a 2D schema of the layout provide access to a plant catalogue of identifiers of 3D plant objects. At least one 3D plant object identifier is associated with a 2D plant object identifier. Data on a given 2D schema of a layout are received as input data. A function trained by a machine learning algorithm is applied to the input data for detecting a set of 2D plant objects. A set of identifier and location data on the detected 2D plant object set is provided as output data. A set of 3D plant objects is selected from the plant catalogue with identifiers associated with the set of 2D plant object identifiers of the output data. A 3D model of the layout is generated by arranging the selected set of 3D plant objects according to the location data of the output data.
Claims
1-20. (canceled)
21. A method for generating, by a data processing system, a 3D-model of a plant layout departing from a 2D-schema of the plant-layout, the plant-layout including an arrangement of a plurality of plant objects, the plant-layout being representable by a 2D-schema and by a 3D model, the plant-layout 2D schema including a 2D arrangement of a plurality of 2D plant objects and the plant-layout 3D model including a 3D arrangement of a plurality of 3D plant objects, the method comprising: a) providing access to a plant catalogue of a plurality of identifiers of a plurality of 3D plant objects, at least one of the 3D plant object identifiers being associated with an identifier of a corresponding 2D plant object; b) receiving data on a given 2D schema of a plant-layout as input data; c) applying a function trained by a machine learning algorithm to the input data for detecting a set of 2D plant objects, and providing a set of identifier and location data on the detected 2D plant object set as output data; d) selecting a set of 3D plant objects from the plant catalogue having identifiers associated with the set of 2D plant objects identifiers of the output data; and e) generating a 3D model of the plant-layout by arranging the selected set of 3D plant objects in accordance with the corresponding location data of the output data.
22. The method according to claim 21, which further comprises providing the plant layout 2D schema with a set of schema annotations providing schema information.
23. The method according to claim 21, which further comprises providing additional layout data.
24. The method according to claim 21, which further comprises interpreting at least one of the additional layout data or schema annotation information by a coded rule module to provide a selection of adjusting steps to the plant layout 3D model.
25. The method according to claim 24, which further comprises at least one of providing the coded rule module as a knowledge graph or providing additional layout data with manufacturing process semantic information.
26. The method according to claim 21, which further comprises providing the plant catalogue as a standard catalogue, a specific catalogue or a combination of a standard catalogue and a specific catalogue.
27. The method according to claim 21, which further comprises providing digital plant objects as CAD objects.
28. The method according to claim 21, which further comprises training a Machine Learning function with a You Only Look Once algorithm.
29. A data processing system, comprising: a processor; and an accessible memory; the data processing system configured to: a) provide access to a plant catalogue of a plurality of identifiers of a plurality of 3D plant objects, at least one of the 3D plant object identifiers being associated with an identifier of a corresponding 2D plant object; b) receive data on a given 2D schema of a plant-layout as input data; c) apply a function trained by a machine learning algorithm to the input data for detecting a set of 2D plant objects, and provide a set of identifier and location data on the detected 2D plant object set as output data; d) select a set of 3D plant objects from the plant catalogue having identifiers associated with the set of 2D plant object identifiers of the output data; and e) generate a 3D model of the plant-layout by arranging the selected set of 3D plant objects in accordance with the corresponding location data of the output data.
30. The data processing system according to claim 29, wherein the plant layout 2D schema includes a set of schema annotations providing schema information.
31. The data processing system according to claim 29, wherein additional layout data are provided.
32. The data processing system according to claim 29, wherein at least one of additional layout data or schema annotation information are interpreted by a coded rule module to provide a selection of adjusting steps to the plant layout 3D model.
33. The data processing system according to claim 32, wherein at least one of the coded rule module is provided as a knowledge graph or additional layout data include manufacturing process semantic information.
34. The data processing system according to claim 29, wherein the plant catalogue is a standard catalogue, a specific catalogue or a combination of a standard catalogue and a specific catalogue.
35. The data processing system according to claim 29, wherein digital plant objects are provided as CAD objects.
36. A non-transitory computer-readable medium encoded with executable instructions that, when executed, cause one or more data processing systems to: a) provide access to a plant catalogue of a plurality of identifiers of a plurality of 3D plant objects, at least one of the 3D plant object identifiers being associated with an identifier of a corresponding 2D plant object; b) receive data on a given 2D schema of a plant-layout as input data; c) apply a function trained by a machine learning algorithm to the input data for detecting a set of 2D plant objects, and provide a set of identifier and location data on the detected 2D plant object set as output data; d) select a set of 3D plant objects from the plant catalogue having identifiers associated with the set of 2D plant objects identifiers of the output data; and e) generate a 3D model of the plant-layout by arranging the selected set of 3D plant objects in accordance with the corresponding location data of the output data.
37. The non-transitory computer-readable medium according to claim 36, wherein the plant layout 2D schema includes a set of schema annotations providing schema information.
38. The non-transitory computer-readable medium according to claim 36, wherein additional layout data are provided.
39. A method for providing a function trained by a machine learning algorithm for generating a 3D model of a plant-layout, the method comprising: a) receiving as input training data a plurality of 2D plant-layout schemas each including a 2D arrangement of a plurality of 2D plant objects; b) for each 2D plant-layout schema, receiving, as output training data, identifiers and location data associated with one or more of the plurality of 2D plant objects; c) training by a machine learning algorithm a function based on the input training data and on the output training data; and d) providing the trained function for generating a 3D model of a plant-layout.
40. A method for generating, by a data processing system, a 3D-model of a plant layout departing from a 2D-schema of the plant-layout, the plant-layout including an arrangement of a plurality of plant objects, the plant-layout being representable by a 2D-schema and by a 3D model, the plant-layout 2D schema including a 2D arrangement of a plurality of 2D plant objects and the plant-layout 3D model including a 3D arrangement of a plurality of 3D plant objects, the method comprising: a) providing access to a plant catalogue of a plurality of identifiers of a plurality of 3D plant objects, at least one of the 3D plant object identifiers being associated with an identifier of a corresponding 2D plant object; b) receiving as input training data a plurality of 2D plant-layout schemas each including a 2D arrangement of a plurality of 2D plant objects; c) for each 2D plant-layout schema, receiving as output training data, identifiers and location data associated with one or more of the plurality of 2D plant objects; d) training by a machine learning algorithm a function based on the input training data and on the output training data; e) providing the trained function for generating a 3D model of a plant-layout; and f) generating a 3D model of a plant layout by applying the trained function to a given 2D schema of a plant-layout as input data.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0017] For a more complete understanding of the present disclosure, and the advantages thereof, reference is now made to the following descriptions taken in conjunction with the accompanying drawings, wherein like numbers designate like objects, and in which:
DETAILED DESCRIPTION
[0024] Previous techniques for generating a 3D model of a plant layout departing from a 2D schema of the plant layout have some drawbacks. The embodiments disclosed herein provide numerous technical benefits, including but not limited to the following examples.
[0025] Embodiments enable automatic generation of a 3D CAD model of a plant layout departing from its 2D schema without requiring human intervention by the plant layout engineer.
[0026] Embodiments render the process of generating a 3D model of a plant layout more efficient.
[0027] Embodiments enable upgrading the capability of several existing manufacturing planning software applications.
[0028] Embodiments enable time savings.
[0029] Embodiments allow providing layout planners with a Software as a Service ("SaaS") module whereby they can upload a 2D layout schema and receive as a result a populated 3D digital scene in which plant equipment objects are automatically positioned.
[0031] Other peripherals, such as local area network (LAN)/Wide Area Network/Wireless (e.g. WiFi) adapter 112, may also be connected to local system bus 106. Expansion bus interface 114 connects local system bus 106 to input/output (I/O) bus 116. I/O bus 116 is connected to keyboard/mouse adapter 118, disk controller 120, and I/O adapter 122. Disk controller 120 can be connected to a storage 126, which can be any suitable machine-usable or machine-readable storage medium, including but not limited to nonvolatile, hard-coded type mediums such as read-only memories (ROMs) or erasable, electrically programmable read-only memories (EEPROMs), magnetic tape storage, user-recordable type mediums such as floppy disks, hard disk drives, and compact disk read-only memories (CD-ROMs) or digital versatile disks (DVDs), and other known optical, electrical, or magnetic storage devices.
[0032] Also connected to I/O bus 116 in the example shown is audio adapter 124, to which speakers (not shown) may be connected for playing sounds. Keyboard/mouse adapter 118 provides a connection for a pointing device (not shown), such as a mouse, trackball, trackpointer, touchscreen, etc.
[0033] Those of ordinary skill in the art will appreciate that the hardware illustrated in
[0034] A data processing system in accordance with an embodiment of the present disclosure can include an operating system employing a graphical user interface. The operating system permits multiple display windows to be presented in the graphical user interface simultaneously, with each display window providing an interface to a different application or to a different instance of the same application. A cursor in the graphical user interface may be manipulated by a user through the pointing device. The position of the cursor may be changed and/or an event, such as clicking a mouse button, generated to actuate a desired response.
[0035] One of various commercial operating systems, such as a version of Microsoft Windows™, a product of Microsoft Corporation located in Redmond, Wash. may be employed if suitably modified. The operating system is modified or created in accordance with the present disclosure as described.
[0036] LAN/WAN/Wireless adapter 112 can be connected to a network 130 (not a part of data processing system 100), which can be any public or private data processing system network or combination of networks, as known to those of skill in the art, including the Internet. Data processing system 100 can communicate over network 130 with server system 140, which is also not part of data processing system 100, but can be implemented, for example, as a separate data processing system 100.
[0037] Embodiments include one or more of the following steps:
[0038] preparing input and output training data;
[0039] training a Machine Learning ("ML") function;
[0040] applying a function trained by a ML algorithm;
[0041] generating a 3D model of the plant layout; and
[0042] adjusting the 3D model by applying received additional layout data including Manufacturing Process Semantics ("MPS") information.
Example Embodiments of Preparing of Input and Output Training Data
[0043] In embodiments, input training data and output training data are prepared for training a function by a ML algorithm.
[0044] As input training data, a plurality of 2D schemas of a plurality of plant layouts are generated with standard CAD software tools. The generated plant-layout schema drawings include a set of standardized plant object icons and schema annotations in the form of text and shapes. In embodiments, data of the 2D schemas are preferably provided in a digital image format. In other embodiments, when data of the 2D schemas are provided in other, non-image formats (e.g. DXF or other CAD file formats), such data are converted into a digital image format.
[0045] As output training data, for each generated 2D schema, a set of bounding boxes around each plant object icon is automatically or manually generated with CAD software tools. The bounding boxes are preferably rectangles around the plant objects with a label identifying the type of plant object. The rectangle position identifies the object position.
[0050] Preferably, a large amount of input and output training data is automatically generated for training the ML function.
Example Embodiments of Machine Learning Training
[0051] In embodiments, in case the format of the input training data is different from a digital image format, the input training data may conveniently be pre-processed to transform the input training data format into a digital image format. In embodiments, examples of pre-processing include scanning a paper printout with the plant layout 2D schema or transforming a CAD file with the plant layout 2D schema into a digital image.
[0052] In embodiments, the output training data are pre-processed into a numerical format in which the output training data comprise a numerical object identifier and a set of coordinates defining the bounding box position.
[0053] Table 1 below shows an example embodiment of output training data in a numerical format.
[0054] The first column of Table 1 includes the identifiers of the plant object icons delimited by the corresponding bounding boxes. The remaining columns of Table 1 include four coordinates for determining the size and position of the bounding boxes according to YOLO requirements (x_center, y_center, width, height). Table 2 provides an example of the association between the value of the object identifier and the corresponding label of the plant object.
TABLE 1. Example of bounding boxes in a numerical data format

Object identifier  Coordinate_1  Coordinate_2  Coordinate_3  Coordinate_4
 9                 0.335677      0.195076      0.030729      0.301136
17                 0.539062      0.380682      0.037500      0.060606
17                 0.647396      0.595170      0.034375      0.072917
10                 0.621615      0.594697      0.016146      0.054924
 0                 0.342708      0.194129      0.017708      0.299242
 0                 0.254948      0.296402      0.118229      0.030303
 0                 0.202344      0.457860      0.020313      0.358902
 0                 0.917133      0.802557      0.017708      0.330492
 7                 0.322656      0.286458      0.032813      0.108902
 7                 0.320333      0.859848      0.062500      0.058712
 7                 0.527033      0.883049      0.056250      0.055871
 7                 0.700781      0.950758      0.032813      0.093435
 8                 0.745573      0.783617      0.299479      0.098482
14                 0.718750      0.685606      0.089533      0.125000
14                 0.888802      0.669981      0.058854      0.086174
15                 0.696094      0.611269      0.017188      0.023674
15                 0.744792      0.611269      0.016667      0.025568
 9                 0.474479      0.748106      0.036458      0.147727
 9                 0.387240      0.902462      0.031771      0.096591
11                 0.533073      0.948864      0.039062      0.066288
16                 0.622917      0.665720      0.020833      0.085227
16                 0.648698      0.668561      0.021354      0.071970
TABLE 2. Example of association between identifiers and labels of plant objects

Object identifier  Object label
 0                 Fence
 1                 Tool_changer
 2                 OverHead_conveyer
 3                 Robot
 4                 Turn_table
 5                 Assembly_rail
 6                 Robot_controller
 7                 Electrical_cabinet
 8                 Fixture
 9                 Sealer
10                 Tip_dresser
[0055] It is noted that in the example embodiments of Table 1 the bounding boxes are defined by four coordinates only. In this example embodiment, the boxes are assumed to be rectangles with sides parallel to those of the plant layout cell, and no orientation is considered. In other embodiments, more than four object coordinates may be used and the orientation of the bounding box may also be considered.
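The four-coordinate format of Table 1 can be illustrated with a small conversion helper that maps a pixel-space box to normalized YOLO coordinates; the function name, image size, and example box are illustrative assumptions, not values from the disclosure:

```python
# Sketch: converting a pixel-space bounding box to the four normalized
# YOLO coordinates used in Table 1 (x_center, y_center, width, height).
# Names and example values are illustrative, not from the disclosure.

def to_yolo_box(x_min, y_min, x_max, y_max, img_w, img_h):
    """Return (x_center, y_center, width, height), each normalized to [0, 1]."""
    x_center = (x_min + x_max) / 2.0 / img_w
    y_center = (y_min + y_max) / 2.0 / img_h
    width = (x_max - x_min) / img_w
    height = (y_max - y_min) / img_h
    return (x_center, y_center, width, height)

# Example: a 100 x 50 px icon at (200, 300) in a 1920 x 1056 px schema image.
box = to_yolo_box(200, 300, 300, 350, 1920, 1056)
```

In this format all four values are dimensionless fractions of the image size, which is why no image resolution appears in Table 1.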
[0056] The input training data, with the generated images of 2D schemas of plant layouts, and the output training data, with the data on the bounding boxes (e.g. position parameters and identifier) of the corresponding tagged plant objects, are processed to train a ML function.
[0057] The tagged plant objects are used for training the ML algorithm for object detection. As used herein, the term "object detection" denotes determining the locations in the image where certain objects are present, as well as classifying those objects.
[0058] In embodiments, the desired data format of the input and output training data is obtained by applying one or more pre-processing steps on the data so as to transform the original data format into the desired data format.
[0059] In embodiments, the ML algorithm is a deep learning algorithm, preferably a convolutional neural network algorithm. Examples of object detection systems include, but are not limited to, the You Only Look Once ("YOLO") algorithm.
[0060] In embodiments, the automatically generated and tagged images are used to train a dedicated neural network such as a YOLO neural network. In embodiments, other types of ML object detection algorithms may be used.
[0061] In embodiments, the resulting data of the ML trained function are used to generate a module for detecting 2D plant objects from input data of a given 2D schema of a plant layout.
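The detection output of such a trained function can be post-processed back into labeled, pixel-space boxes. A minimal sketch, assuming detections arrive in the numerical line format of Table 1 and using an abridged version of the Table 2 mapping (the helper names and image dimensions are illustrative):

```python
# Sketch: turning raw detections in the numerical format of Table 1 back into
# labeled pixel-space boxes, using the identifier-to-label mapping of Table 2.
# The label table is abridged and the names are illustrative assumptions.

def parse_detections(lines, id_to_label, img_w, img_h):
    """Each line: 'id x_center y_center width height' (normalized values)."""
    detections = []
    for line in lines:
        fields = line.split()
        obj_id = int(fields[0])
        xc, yc, w, h = (float(v) for v in fields[1:5])
        detections.append({
            "label": id_to_label.get(obj_id, "Unknown"),
            "x_center_px": xc * img_w,
            "y_center_px": yc * img_h,
            "width_px": w * img_w,
            "height_px": h * img_h,
        })
    return detections

rows = ["9 0.335677 0.195076 0.030729 0.301136",
        "7 0.322656 0.286458 0.032813 0.108902"]
dets = parse_detections(rows, {0: "Fence", 7: "Electrical_cabinet", 9: "Sealer"},
                        img_w=1920, img_h=1056)
```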
[0062] In embodiments, the training data may be stored at a local machine/server or in a remote location, e.g. in the cloud. In embodiments, training data may be supplied by proprietary data sources, by public data sources, or by a combination thereof. In embodiments, the training of the ML function may be done at a local machine/server or at a remote location, e.g. in the cloud.
[0063] In embodiments, the training step may be done as a Software as a Service (“SaaS”), either on a local machine/server or on remote machine/server, e.g. in the cloud.
Example Embodiments of Applying a Function Trained by a ML Algorithm
[0064] In embodiments, the detection module may be used as a SaaS cloud service. In embodiments, the detection module may be used as a stand-alone module at a local site or in a remote location.
[0065] In embodiments, the detection module may be used as a stand-alone module by a manufacturing planning system. In other embodiments, the detection module may be embedded within a manufacturing planning system.
[0066] Data on a given 2D schema of a plant layout are received as input data. In embodiments, the 2D schema data are provided in the form of a digital image of a 2D plant layout drawing. In other embodiments, the 2D schema of the plant layout may be provided in other formats, e.g. as a CAD file or as a hardcopy printout, and the data are pre-processed so as to obtain the desired digital image format.
[0067] The 2D schema includes a plurality of 2D plant objects, preferably in the form of icons, representing a plurality of plant objects. In embodiments, at least one of the 2D plant objects is accompanied by a schema annotation in the form of text and/or symbols including schema information. Examples of schema information include, but are not limited to, Product Manufacturing Information ("PMI"), information on equipment vendors and models, information on units, information on measurements such as distance from a wall, information on scales, and other relevant schema information.
[0069] A plant catalogue, or access to the plant catalogue, is provided. A plant catalogue of plant objects comprises identifiers of 3D plant objects, wherein at least one of the identifiers is associated with an identifier of a corresponding 2D plant object. In embodiments, examples of ways of implementing the association between 2D and 3D identifiers include, but are not limited to, tables of key/value pairs and JSON, XML, or TXT files with pairs of an identifier index and a path to the 3D CAD model.
[0070] In embodiments, a plant catalogue may be a library of 3D CAD models of plant objects with their associated 2D identifier indexes. The plant catalogue may be a standard catalogue with plant objects which are widely used in an industry, or it may be a specialized plant catalogue with plant objects which are vendor and/or project specific.
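The 2D-to-3D identifier association mentioned above can be sketched, for instance, as JSON key/value pairs; the CAD file paths and identifier values below are hypothetical examples, not an actual catalogue:

```python
# Sketch: one possible encoding of the 2D-to-3D identifier association as
# JSON key/value pairs, as suggested in the text. Paths and identifiers are
# hypothetical illustrations, not taken from a real plant catalogue.
import json

catalogue_json = """
{
  "3": "catalogue/standard/robot_6axis.jt",
  "7": "catalogue/standard/electrical_cabinet.jt",
  "9": "catalogue/vendor_x/sealer_station.jt"
}
"""

def cad_path_for(obj_id, catalogue):
    """Return the 3D CAD model path associated with a 2D object identifier."""
    return catalogue.get(str(obj_id))

catalogue = json.loads(catalogue_json)
path = cad_path_for(7, catalogue)
```

A lookup that misses (e.g. an identifier absent from the catalogue) simply returns `None`, which a caller could treat as "no associated 3D model".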
[0071] The 2D digital images of the 2D schemas of the plant layout are analyzed by applying the function trained with the ML algorithm. Plant object types, bounding rectangles, and positions are recognized inside the 2D layout schema by means of neural network inference.
Example Embodiments of Generating a 3D Model of the Plant Layout
[0073] In embodiments, the 3D models of the recognized plant objects are automatically selected from the associated plant catalogue, e.g. a ready-made 3D CAD library and/or a specific 3D CAD library supplied by the user. The 3D model of a recognized plant object is selected based on the 2D plant object type detected within the input 2D drawings.
[0074] The selected 3D models of the plant objects are populated in a 3D scene with positions based on the positions of the detected bounding boxes. Information on the orientation of the 3D models, where available from the bounding box coordinates, may also be used. In embodiments, orientation information may be obtained by cropping the icon image inside the bounding box and analyzing it to extract the orientation of the identified plant object.
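The population step can be sketched as a mapping from normalized box centers to plant-floor coordinates, assuming the real-world extent of the drawing is known (e.g. from a scale annotation); the units, field names, and fixed floor height are assumptions for illustration only:

```python
# Sketch: placing selected 3D objects in a scene from the detected normalized
# box centers. Assumes the drawing's real-world extent (in meters) is known,
# e.g. from a scale annotation. Field names are illustrative assumptions.

def place_objects(detections, cell_width_m, cell_depth_m):
    """Map normalized (x_center, y_center) to plant-floor coordinates in meters.

    Each detection is (object_id, x_center, y_center); z is fixed to the floor.
    """
    placements = []
    for obj_id, xc, yc in detections:
        placements.append({
            "object_id": obj_id,
            "position_m": (xc * cell_width_m, yc * cell_depth_m, 0.0),
            "rotation_deg": 0.0,  # may be refined later from icon orientation or MPS rules
        })
    return placements

# Example: the first Sealer detection of Table 1 in a 20 m x 12 m cell.
scene = place_objects([(9, 0.335677, 0.195076)], cell_width_m=20.0, cell_depth_m=12.0)
```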
[0075] In embodiments, schema information is extracted from the 2D schema drawings, for example via OCR from the schema annotations of the digital image, or extracted from the CAD file when still available. In embodiments, the extracted schema information may be used to select the appropriate 3D model of a plant object, e.g. a specific model/type of a machine or a robot, and/or it may be used to attach a payload and/or to reposition or orient the 3D model.
Example Embodiments of Adjusting the 3D Model by Applying MPS Information
[0078] In embodiments, additional layout data may be provided, for example data with information on the scale of the drawings and/or data with MPS information. MPS information includes manufacturing process information which may be used to improve the location and orientation accuracy of the plant objects and/or to add more details to the 3D model of the plant layout. Examples of MPS information include, but are not limited to, weld point parameter information, equipment payload information, and electric constraints information. In embodiments, additional layout data may be provided with access to a repository such as a database, with data files (e.g. JSON, CSV, Excel, XML, or TXT files), or via external inputs in the form of a list of paths. In embodiments, MPS information may be automatically extracted from a data center of a PLM system such as Teamcenter.
[0079] In embodiments, based on MPS information provided as additional layout data, the 3D model of the plant layout may conveniently be adjusted, for example by inserting additional 3D objects into the 3D scene and/or by adjusting the position and orientation of the already arranged 3D plant objects.
[0080] For example, if the MPS information includes information on weld points, payload, and/or electricity requirement parameters of a robot, the correct robot tool type may be automatically selected, e.g. a weld gun instead of another tool such as a paint gun or a laser welder. Additionally, based on the weld point voltage parameter information included in the MPS information, the correct weld gun may be chosen. The robot 3D model may be reoriented to be directed towards the location where the robot needs to perform its task, e.g. the task of welding a car body recognized by weld point features derived from the CAD model of the car body.
[0081] In embodiments, the MPS information may conveniently be interpreted by means of a coded rule module whereby coded rules are defined for arranging plant objects in plant layouts, where the rule module output is a selection of suggested adjusting steps to the 3D model of the plant layout. The coded rule module may be provided with standard or specific industry rules and constraints.
[0082] In embodiments, the coded rule module may be a knowledge graph of relations among different plant object components. This knowledge graph may be generated manually or automatically so as to define relations among different components. Examples of defined relations of the graph include, but are not limited to:
[0083] a robot having overlapping coordinates with a conveyor shall be placed above the conveyor in a feasible way;
[0084] an electrical cabinet which is located near a wall should have its rear side placed towards the wall;
[0085] a robot which is placed near weld points is to be equipped with a weld gun; and
[0086] the type of weld gun depending on voltage information.
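A coded rule module in the spirit of the relations listed above can be sketched as a set of small rule functions, each inspecting a placed object plus its context and suggesting an adjusting step; the rule set, thresholds, and data fields below are illustrative assumptions rather than the disclosed rules:

```python
# Sketch: a minimal coded rule module suggesting adjusting steps for the 3D
# model. Rules, thresholds, and context fields are illustrative assumptions.

def rule_cabinet_to_wall(obj, context):
    # An electrical cabinet near a wall should face its rear side to the wall.
    if obj["label"] == "Electrical_cabinet" and context.get("near_wall"):
        return "orient rear side towards wall"
    return None

def rule_weld_gun(obj, context):
    # A robot near weld points gets a weld gun; the gun type depends on voltage.
    if obj["label"] == "Robot" and context.get("near_weld_points"):
        voltage = context.get("weld_voltage_v", 0)
        gun = "high-voltage weld gun" if voltage > 400 else "standard weld gun"
        return f"equip with {gun}"
    return None

def suggest_adjustments(obj, context, rules=(rule_cabinet_to_wall, rule_weld_gun)):
    """Return the list of adjusting steps suggested by all matching rules."""
    return [s for rule in rules if (s := rule(obj, context)) is not None]

steps = suggest_adjustments({"label": "Robot"},
                            {"near_weld_points": True, "weld_voltage_v": 500})
```

A knowledge-graph implementation would express the same relations as graph edges between component types, with the rule evaluation becoming a graph traversal.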
[0087] In embodiments, the orientation of a 3D model of a plant object may be adjusted by using spatial information (e.g. orienting the rear side of a cabinet towards a wall) and/or by using MPS information (e.g. turning a robot towards the welding points).
[0088] Advantageously, the coded rule module enables combining information from the PLM software backbone with information from the 2D plant layout schema in order to adjust the 3D model of the plant layout.
[0090] At act 505, access to a plant catalogue of a plurality of identifiers of a plurality of 3D plant objects is provided, wherein at least one of the 3D plant object identifiers is associated with an identifier of a corresponding 2D plant object. In embodiments, the plant catalogue is a standard catalogue, a specific catalogue, or a combination of the two. In embodiments, the digital plant objects are CAD objects.
[0091] At act 510, data on a given 2D schema of a plant layout are received as input data. In embodiments, the plant layout 2D schema comprises a set of schema annotations providing schema information. In embodiments, additional layout data are provided. Examples of additional layout data include, but are not limited to, manufacturing process semantic information.
[0092] At act 515, a function trained by a machine learning algorithm is applied to the input data for detecting a set of 2D plant objects, wherein a set of identifier and location data on the detected 2D plant object set is provided as output data.
[0093] At act 520, a set of 3D plant objects is selected from the plant catalogue whose identifiers are associated with the set of 2D plant object identifiers of the output data.
[0094] At act 525, a 3D model of the plant-layout is generated by arranging the selected set of 3D plant objects in accordance with the corresponding location data of the output data.
[0095] In embodiments, the additional layout data and/or the schema annotation information are interpreted by a coded rule module so as to provide a selection of adjusting steps to the plant layout 3D model. In embodiments, the coded rule module is a knowledge graph.
[0096] Of course, those of skill in the art will recognize that, unless specifically indicated or required by the sequence of operations, certain steps in the processes described above may be omitted, performed concurrently or sequentially, or performed in a different order.
[0097] Those skilled in the art will recognize that, for simplicity and clarity, the full structure and operation of all data processing systems suitable for use with the present disclosure is not being illustrated or described herein. Instead, only so much of a data processing system as is unique to the present disclosure or necessary for an understanding of the present disclosure is illustrated and described. The remainder of the construction and operation of data processing system 100 may conform to any of the various current implementations and practices known in the art.
[0098] It is important to note that while the disclosure includes a description in the context of a fully functional system, those skilled in the art will appreciate that at least portions of the present disclosure are capable of being distributed in the form of instructions contained within a machine-usable, computer-usable, or computer-readable medium in any of a variety of forms, and that the present disclosure applies equally regardless of the particular type of instruction or signal bearing medium or storage medium utilized to actually carry out the distribution. Examples of machine usable/readable or computer usable/readable mediums include: nonvolatile, hard-coded type mediums such as read only memories (ROMs) or erasable, electrically programmable read only memories (EEPROMs), and user-recordable type mediums such as floppy disks, hard disk drives and compact disk read only memories (CD-ROMs) or digital versatile disks (DVDs).
[0099] Although an exemplary embodiment of the present disclosure has been described in detail, those skilled in the art will understand that various changes, substitutions, variations, and improvements disclosed herein may be made without departing from the spirit and scope of the disclosure in its broadest form.
[0100] None of the description in the present application should be read as implying that any particular element, step, or function is an essential element which must be included in the claim scope: the scope of patented subject matter is defined only by the allowed claims.