SYSTEM AND METHOD FOR FACTORY AUTOMATION AND ENHANCING MACHINE CAPABILITIES
20230110483 · 2023-04-13
Inventors
Cpc classification
Y02P90/02
GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
Y02P90/80
GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
G05B19/4155
PHYSICS
G05B19/4184
PHYSICS
G05B2219/35444
PHYSICS
International classification
Abstract
System and Method for Factory Automation and Enhancing Machine Capabilities. The present invention relates to a system and method for factory automation and for enhancing machine capabilities. The system is configured to extract data from a machine through a data capturing module, a data extraction module and a data conversion module, to jointly analyze it with data from other equipment and sensors through a data analysis engine, and to transmit it to factory systems in a user configured protocol through a protocol conversion module. It also accepts machine control commands from the factory systems and executes them through the command processor and mouse and keyboard simulator modules. It further involves combining GUIs from multiple equipment/sensors and sending them to a single display device through the GUI manager and video output modules, and mapping the required actions of mouse clicks and keyboard entries from the new to the old user interface. The GUI manager module detects GUI elements from captured images and applies user configuration to automatically design a new enhanced GUI.
Claims
1. A system for factory automation and enhancing machine capabilities, said machine being connected with a machine controller (2) having a CPU connected to a display and a keyboard/mouse, the system comprising: a video capturing module (101) electrically connected with the display and the CPU of the machine controller (2); a data extraction module (102) communicating with the video capturing module (101); a data conversion module (103) communicating with the data extraction module (102); a data analysis engine (104) communicating with the data conversion software module (103); a protocol conversion module (105) configured to convert analyzed data into a user configured protocol; a command processor module (106) communicating with the protocol conversion module (105); a keyboard and mouse simulator hardware module (107) communicating with the command processor module (106) and also electrically connected to the CPU of the machine controller (2); and a GUI manager module (108) communicating with the data analysis engine (104) and with a video output module (109), which is electrically connected to the display of the machine controller (2); the video capturing module (101) configured to capture a screen image of contents displayed on the machine controller's (2) display periodically; the data extraction module (102) configured to receive the screen images captured and outputted from the video capturing module (101) and extract the data from the screen image; the data conversion software module (103) configured to convert the extracted data from the image into a user configured format; characterized in that the data analysis engine (104) is configured to combine, correlate and analyze data of the machine controller (2) received from the data conversion software module (103) and from external sensors, equipment and devices (4), and is configured to send analyzed data to a factory equipment controlling device (3) through the protocol conversion module (105); the command processor module (106) is configured to identify a series of mouse clicks and keyboard entries required to be performed to execute a control signal, command, action or instruction sent by the factory equipment controlling device (3); and the keyboard and mouse simulator hardware module (107) is configured to simulate the mouse clicks and/or keyboard entries to execute the control signal sent by the command processor module (106).
2. The system for factory automation and enhancing machine capabilities as claimed in claim 1, wherein the data extraction module (102) is configured to perform optical character recognition (OCR) to convert the information displayed on the screen image into digital information.
3. The system for factory automation and enhancing machine capabilities as claimed in claim 1, wherein the protocol conversion module (105) includes a SECS/GEM software module (105a), a Modbus software module (105b), an OPC software module (105c), a JSON software module (105d), an MQTT software module (105e) and a custom protocol software module (105f).
4. The system for factory automation and enhancing machine capabilities as claimed in claim 1, wherein the video capturing module (101) is configured to capture the screen image of PLC, HMI or other parts of the equipment through an external image capturing device such as a camera, or from the display of the machine controller (2) through a video recording device.
5. The system for factory automation and enhancing machine capabilities as claimed in claim 1, wherein the GUI manager module (108) is configured to detect GUI elements, such as push buttons, drop-down selections, text boxes, etc., from the captured images through computer vision techniques and apply user configuration to automatically design a new enhanced GUI, in a user configured language, that includes data from other external sensors/devices/equipment along with user specified enhancements, and send it to the display of the machine controller (2) through a video output module (109).
6. The system for factory automation and enhancing machine capabilities as claimed in claim 5, wherein the command processor module (106) is configured to detect the keyboard and mouse events on the enhanced GUI, map or translate them to the corresponding keyboard entries and mouse clicks of the original GUI of the equipment controller (2), and send the same activity to the equipment controller (2) through the mouse and keyboard simulator module (107).
7. A method for factory automation and enhancing machine capabilities comprising the following steps: a) configuring the system to define the data to be received from a machine controller (2) of the factory equipment and to define the command to be sent from a factory equipment controlling system (3) to the machine controller (2) of the factory equipment; b) capturing screenshots of the machine controller's (2) display unit or other parts of the machine through a video capturing module (101); c) applying user configuration to extract data from the captured images of the machine controller or machine (2) through OCR (Optical Character Recognition) technology or other computer vision techniques and converting the information displayed on the image into digital information through a data extraction module (102); d) converting the data extracted from the captured image into the user configured format through a data conversion software module (103), categorizing the data into Alarm, Event, Image or Data "tags" and sending them to a data analysis engine (104); e) combining, correlating and analyzing the data received from the data conversion module (103) and external sensors, devices or equipment through the data analysis engine (104) and sending the analyzed or raw data to the factory equipment controlling system (3) in SECS/GEM, Modbus, OPC, JSON, MQTT or custom protocol format;
f) sending a command from the factory equipment controlling system (3) in either SECS/GEM, Modbus, OPC, JSON, MQTT or custom protocol format to a SECS/GEM software module, a Modbus software module, an OPC software module, a JSON software module, an MQTT software module or a custom protocol software module (105a, 105b, 105c, 105d, 105e, 105f) to interpret the command, whereupon the command is sent to a command processor software module (106); g) identifying and converting the command into the series of mouse clicks and keyboard entries through the command processor software module (106) to execute the command sent by the factory host, MES or ERP system (3); h) simulating the mouse and keyboard through a mouse and keyboard simulator hardware module (107) to execute the command for controlling the factory equipment (2); i) combining the graphical user interface (GUI) from the machine controller (2) with those of external sensors, devices or equipment to prepare a uniform, coherent graphical user interface (GUI) and displaying it on the display device of the machine controller (2).
Description
BRIEF DESCRIPTION OF THE DRAWING
[0018] Objects and advantages of the invention will be apparent from the following detailed description taken in conjunction with the accompanying figures of the drawing wherein:
DETAILED DESCRIPTION OF THE INVENTION
[0021] For simplicity and clarity of illustration, elements in the figures are not necessarily drawn to scale. Also, descriptions and details of well-known steps and elements may be omitted for simplicity of the description. Furthermore, in the following detailed description of the present disclosure, numerous specific details are set forth in order to provide a thorough understanding of the present disclosure. However, it will be understood that the present disclosure may be practiced without these specific details. In other instances, well-known methods, procedures, components, and circuits have not been described in detail so as not to unnecessarily obscure aspects of the present disclosure.
[0022] The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the present disclosure. It will be further understood that the terms “comprises”, “comprising”, “includes”, and “including” when used in this specification, specify the presence of the stated features, integers, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, operations, elements, components, and/or portions thereof.
[0023] The present invention discloses a system and method for factory automation and enhancing machine capabilities by extracting data from a machine, jointly analyzing it with data from other equipment and sensors, and transmitting it to factory controlling systems. This enables automation capabilities such as extracting data from the equipment, sending commands such as START, ABORT, RECIPE SELECT, etc. to the equipment, querying its processing state, and adding or modifying recipe or setpoint values and equipment configuration parameters on a factory equipment that is otherwise not automation capable. It also involves combining the user interfaces from multiple equipment and sensors, sending them to a single display device, and mapping the required actions of mouse clicks and keyboard entries from the new to the old user interface. It also involves sending equipment data to a cloud server for analysis and taking action suggested by a machine learning engine on the cloud, thereby converting an old/legacy factory into a smart factory.
[0024] It is to be understood that the term “factory equipment/machine” includes any kind of object, machine or equipment wherefrom the data is to be extracted, transmitted and/or controlled by the user through the system according to present invention.
[0025] The term “data” includes but is not limited to real-time process parameters, such as temperature, pressure, power, gas flows, etc., machine configuration, such as base pressure, type of hardware, etc., alarms or errors occurring on the equipment and critical events related to the processing of raw material, such as processing started, door opened, etc., or equipment state such as processing, idle, stopped, etc.
[0026] Automation here includes but is not limited to extracting data, including but not limited to real-time process parameters, recipe parameters, equipment configuration parameters, alarms and events from factory equipment and transmitting to a factory host, ERP, MES, database or other enterprise system. It includes sending that data to cloud servers for analyzing and correlating with data from other equipment and sensors. It also includes enabling the factory equipment to accept commands from a factory host, ERP, MES or other such systems. It also includes mixing user interfaces of multiple equipment and sensors into a single GUI of all relevant manufacturing information and displaying on a video output device such as a PC monitor.
[0028] If the factory equipment is not PC based and does not have these ports, then the system according to the present invention is connected through an external camera overlooking the HMI, PLC or other display of the factory equipment. By processing images of the equipment PC or display and applying OCR technology with the system's algorithm, it extracts various information from the screen and categorizes it as image, data, alarm or event. The computer of the factory equipment (2) includes a CPU and a display unit for displaying the data related to process parameters. For connecting the system according to the present invention with the computer (machine controller) of the factory equipment, the display unit and the CPU of the computer are connected with the system rather than directly with each other. Hence, the system according to the present invention is configured to intercept between the CPU and the display device, and between the CPU and the keyboard and mouse.
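By way of a non-limiting illustration, the screen-data extraction and categorization described above may be sketched as follows; the region names, coordinates and the `ocr()` stub are hypothetical placeholders for a real OCR engine and a real user configuration:

```python
# One user-configured screen region per tag: where to look on the captured
# image, and how to categorize what is found (Alarm, Event, Image or Data).
REGIONS = {
    "chamber_temp": {"box": (120, 40, 220, 60), "category": "Data"},
    "alarm_text":   {"box": (10, 400, 600, 420), "category": "Alarm"},
}

def ocr(image, box):
    """Stand-in for a real OCR call (e.g. Tesseract) on a cropped region."""
    return image.get(box, "")

def extract(image):
    """Apply the user configuration to one captured screen image."""
    tags = []
    for name, cfg in REGIONS.items():
        text = ocr(image, cfg["box"])
        if text:  # only emit tags for regions that produced text
            tags.append({"tag": name, "category": cfg["category"], "value": text})
    return tags

# A fake "image" keyed by region box, standing in for a captured frame.
frame = {(120, 40, 220, 60): "245.3", (10, 400, 600, 420): "DOOR OPEN"}
```

In practice the cropped pixels would be passed to an actual OCR library; the categorization logic, however, follows the image/data/alarm/event split described above.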
[0029] The system according to the present invention has the capability to collect data from other equipment or sensors in the same way or through some other protocol or method. The system can then analyze and correlate data from all the sources either locally or transmit it to a cloud based machine learning capable server, and send both raw and analyzed data to the factory host, MES or ERP system. The system according to the present invention accepts commands based on the analysis from the local or cloud server, factory host, MES or ERP systems and sends them to the equipment.
[0030] It is to be understood that if the user just wants to collect data and does not want to control the machine (one side communication), the user only needs to connect the system with the CPU through the VGA or other such display port. In such a case, no USB or other connection is required.
[0031] Further, the system according to the present invention is configured with a factory equipment controlling system (3), such as a factory host computer, an MES (Manufacturing Execution System), an ERP (Enterprise Resources Planning) system, or a local or cloud based machine learning capable data analysis server, by defining the data to be captured and its data format. The factory host, MES or ERP systems are designed to understand and interpret data transmitted from the factory equipment/machine (2) and configured to transmit a control signal, command, action or instruction to the machine (2).
[0032] Controlling the factory equipment (2) by the factory equipment controlling system (3) requires the user to configure the system so as to map the name of a command to a series of mouse clicks and keyboard entries on the user interface of the equipment controller software. The user configuration is a collection of files and databases storing information supplied by the user at the time of one-time setup of the system. The information that the user needs to supply to configure the system includes, but is not limited to, where on the controller software's GUI to collect which information, what the user wants to call it, what its data type is, and which button or set of buttons to click in response to which command received from the factory host (3).
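Such a command-to-actions mapping may, for illustration only, take the following form; the command names, coordinates and text are hypothetical examples of what a user would supply during the one-time setup:

```python
# Each factory-host command name maps to the series of mouse clicks and
# keyboard entries to be replayed on the equipment controller's GUI.
COMMAND_MAP = {
    "START":         [("click", 50, 300), ("click", 120, 300)],
    "ABORT":         [("click", 50, 340)],
    "RECIPE_SELECT": [("click", 200, 100), ("type", "recipe_7"), ("key", "ENTER")],
}

def actions_for(command):
    """Look up the click/keystroke sequence for a received command."""
    if command not in COMMAND_MAP:
        raise KeyError(f"command {command!r} is not configured")
    return COMMAND_MAP[command]
```

An unconfigured command is rejected rather than guessed at, mirroring the requirement that every supported command be defined by the user beforehand.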
[0033] Referring again to
[0034] The video capturing module (101) is configured to periodically capture screenshots or pictures of what is displayed on the machine controller's (2) display, as fast as every few milliseconds, based on user configuration. Said data extraction module (102) breaks the images captured from the video capturing module (101) into many pieces and applies user configuration to extract data from the screenshots or pictures through OCR (Optical Character Recognition) technology. The data conversion software module (103) is configured to convert the extracted data from the images into a user configured format, categorize the data into Alarm, Event, Image or Data "tags" and send them to the data analysis engine (104), which is configured to combine, correlate and analyze data from external sensors, devices or equipment (4) using various machine learning (ML) or deep learning (DL) techniques and then send both analyzed and raw data to the factory equipment controlling system (3). The GUI manager (108) is configured to combine the user interface of the machine controller (2) and other devices (4) and send it to the display of the machine controller (2) through the video output module (109). Said data analysis engine (104) also sends raw and analyzed data to a protocol conversion module (105), which formats the data into the user configured protocol before sending it to the factory equipment controlling system (3).
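The conversion and joint-analysis stage may be illustrated, without limitation, as follows; the field names, the type table, and the trivial over-temperature check stand in for the user configured format and the ML/DL analysis, which are not specified here:

```python
def convert(raw_tags, formats):
    """Cast OCR'd string values to the user-configured types."""
    out = {}
    for name, value in raw_tags.items():
        # default to str when no type is configured for a tag
        out[name] = formats.get(name, str)(value)
    return out

def combine(machine_data, sensor_data):
    """Join machine-controller data with external sensor/device data."""
    merged = dict(machine_data)
    merged.update(sensor_data)
    # trivial stand-in "analysis": flag over-temperature using both sources
    merged["overtemp"] = merged.get("chamber_temp", 0) > merged.get("limit", 1e9)
    return merged
```

The point of the sketch is the data flow: typed conversion first, then a merge of machine and sensor data on which any analysis, local or cloud based, can operate.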
[0035] For machine control, the system further comprises a command processor module (106) that uses the user configuration to identify the series of mouse clicks and keyboard entries required to be performed to execute the command sent by the factory host, MES or ERP system (3), and a keyboard and mouse simulator hardware module (107) that performs the required mouse clicks and/or keyboard entries to execute the command. Said keyboard and mouse simulator hardware module (107) is connected to the CPU of the machine controller (2) via USB, PS/2, RS232 or other cables and is a combination of hardware and software that simulates the mouse and keyboard for the factory equipment PC. The GUI manager module (108) combines the GUI of the equipment controller (2) and those of other devices, equipment or sensors (4) and displays them on the monitor through a VGA/DVI/HDMI or other video output cable.
[0036] Said protocol conversion module (105) includes a SECS/GEM software module (105a), a Modbus software module (105b), an OPC software module (105c), a JSON software module (105d), an MQTT software module (105e) and a custom software module (105f). It is to be understood that other protocols may also be added as per requirements through the custom software module (105f).
[0037] Now, depending on which industrial data transmission protocol, such as SECS/GEM, Modbus, OPC, JSON, MQTT or custom, the user has configured the system for, the SECS/GEM, Modbus, OPC, JSON, MQTT or custom software module (105a, 105b, 105c, 105d, 105e, 105f) prepares the message to transmit the data to the factory host, MES or ERP server locally or on the cloud (3). The factory host, MES, cloud server or ERP systems (3) are designed to understand and interpret data transmitted by the equipment using the SECS/GEM, Modbus, OPC, JSON, MQTT or custom protocols (105a, 105b, 105c, 105d, 105e, 105f).
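By way of illustration only, the protocol conversion dispatch may be sketched as below; only the JSON path and a much-simplified Modbus register packing (scaling by 10, an assumed convention) are shown, as real SECS/GEM, OPC or MQTT stacks would be used in practice:

```python
import json

def to_json(tags):
    """Serialize the tag dictionary as a JSON message."""
    return json.dumps(tags, sort_keys=True)

def to_modbus_registers(tags):
    """Pack numeric values into 16-bit holding registers, scaled x10."""
    return [int(v * 10) & 0xFFFF for v in tags.values()]

# One formatter per user-configurable protocol; others would plug in here.
FORMATTERS = {"JSON": to_json, "MODBUS": to_modbus_registers}

def format_message(protocol, tags):
    """Format the same analyzed data for the configured protocol."""
    return FORMATTERS[protocol](tags)
```

The design point is that the data analysis engine emits one internal representation, and the configured protocol only selects how that representation is framed on the wire.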
[0038] Now, the process of extracting and transmitting data and controlling the factory equipment using the system according to the present embodiment includes the following steps:
[0039] In the first step, the user is required to configure the system according to the present invention to define the data to be received from the machine controller (2) of the factory equipment and the commands to be sent from the factory equipment controlling system (3) to the machine controller of the factory equipment (2). This application is used for configuration of the system only, and hence it is used initially to set up the system and then whenever the configuration needs to be changed. This software shows the user the screen of the equipment PC and allows the user to define which data to capture and what to call it. This software also allows the user to define the mouse clicks and keyboard entries corresponding to various commands.
[0040] In the second step, the video capturing module (101) captures screenshots or pictures of the equipment's display unit periodically at a user defined frequency, which can be as fast as every few milliseconds.
[0041] In the third step, the data extraction module (102) breaks the images captured from the video capturing module (101) into many pieces, applies user configuration to extract data from the image through OCR (Optical Character Recognition) technology and converts the information displayed on the image into digital information.
[0042] In the fourth step, if the system is configured to receive data from external equipment, devices or sensors using either the same method or different protocols, the data from all these sources is combined and analyzed as per user configuration in the data analysis engine (104).
[0043] In the fifth step, depending on which format (SECS/GEM, Modbus, OPC, JSON, MQTT or custom) the system is configured to send the data in, the protocol conversion module (105) formats messages in the appropriate protocol through the SECS/GEM software module (105a), the Modbus software module (105b), the OPC software module (105c), the JSON software module (105d), the MQTT software module (105e) or the custom software module (105f) and sends them to the factory equipment controlling system (3).
[0044] The aforementioned steps are performed for extracting and jointly analyzing the data from the machine controller (2) of the factory equipment and other external devices, equipment and sensors. Now, for controlling the factory equipment, where a command from the factory equipment controlling system (3) is sent to the machine controller (2) of the machine through the system according to the present invention, the following steps are performed.
[0045] In the first step, the factory equipment controlling system (3) sends a command in either SECS/GEM, Modbus, OPC, JSON, MQTT or custom format to the protocol conversion module (105), which comprises the SECS/GEM, Modbus, OPC, JSON, MQTT or custom software module (105a, 105b, 105c, 105d, 105e, 105f), to interpret the command, whereupon the command is sent to the command processor software module (106).
[0046] In the second step, the command processor software module (106) of the system uses the user configuration to identify the command and convert it into the series of mouse clicks and keyboard entries required to be performed to execute the command sent by the factory host, MES or ERP system or cloud server (3).
[0047] Then, the mouse and keyboard simulator hardware module (107) that is connected to the CPU of the machine controller (2) of the factory equipment through USB, PS/2, RS232 or other cable performs the required mouse clicks and/or keyboard entries to execute the command.
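For the USB case, the keystroke half of the simulation may be illustrated by framing each key as a standard 8-byte USB HID boot-keyboard report (modifier byte, reserved byte, up to six key usage codes); the usage IDs below follow the published USB HID usage tables, while the wire transport itself is omitted:

```python
# Usage IDs from the USB HID usage tables (keyboard/keypad page).
HID_USAGE = {"a": 0x04, "b": 0x05, "ENTER": 0x28}
MOD_LSHIFT = 0x02  # left-shift bit in the modifier byte

def key_report(key, shift=False):
    """Build the key-down report; a zeroed report afterwards releases it."""
    report = bytearray(8)
    report[0] = MOD_LSHIFT if shift else 0x00  # byte 0: modifiers
    # byte 1 is reserved; bytes 2-7 carry up to six concurrent keys
    report[2] = HID_USAGE[key]
    return bytes(report)

RELEASE = bytes(8)  # all-zero report = all keys released
```

A PS/2 or RS232 variant of the simulator would frame the same logical keystrokes differently; only the report layout above is USB specific.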
[0048] In this way, the system executes the user configuration, defined as a "Recipe", and keeps collecting data and sending it to external factory systems through the SECS/GEM, Modbus, OPC, JSON, MQTT or custom protocol. The "Recipe" also stores the configuration of commands, such as start processing, abort, etc., from the external factory system, mapping them to the set of mouse clicks and keyboard entries required to execute those commands on the factory equipment's machine controller (2).
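A "Recipe" of this kind might, purely by way of example, be stored as a JSON document combining both halves described above: the screen regions to collect, the command-to-actions mappings, and the configured protocol. All field names below are assumptions for illustration:

```python
import json

# A hypothetical "Recipe" as it might be stored on disk.
RECIPE = json.loads("""
{
  "collect": [
    {"tag": "chamber_temp", "box": [120, 40, 220, 60], "type": "float"}
  ],
  "commands": {
    "START": [["click", 50, 300]],
    "ABORT": [["click", 50, 340]]
  },
  "protocol": "MQTT"
}
""")
```

Keeping both the data-collection and command-mapping configuration in one document means a single Recipe fully describes the system's behavior for one piece of equipment.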
[0049] In one embodiment, the system and method according to the present invention also enhance equipment capabilities without any software modification or other software installation. According to this embodiment, the present invention may enhance the capabilities of equipment by adding data from external devices, sensors or equipment and providing other user defined software GUI modifications, such as auto adjustment of certain set points depending on the analysis of data from external devices or sensors, without installing any extra software or modifying the existing equipment software.
[0050] Now as shown in
[0051] Thus, the present invention enables combining, correlating and analyzing data coming from the factory equipment with data coming from other sensors, devices or equipment of the same or different kind and taking necessary action, either on its own or by accepting commands from factory systems, to prevent misprocessing of production material or failure of the equipment or its parts. In this way, the present invention converts old legacy equipment into "Smart" equipment that supports Industry 4.0 smart manufacturing and Industrial Internet of Things (IIoT) initiatives. The present invention does this by not only collecting, correlating and analyzing data but also combining the user interfaces into a single uniform and coherent graphical user interface (GUI).
[0052] Further, the present invention also enables localization or translation of a factory equipment's GUI, where the GUI manager (108) extracts GUI elements from the captured images and redraws them using a user configured language, such as Chinese, Japanese or any such language, before sending the new translated or localized GUI to the display device through the video output module (109). Whenever the user clicks the mouse or enters localized text in the new localized GUI, the data analysis engine (104) translates these to the equivalent mouse clicks and text entries of the original GUI and sends them to the command processor module (106) and the mouse and keyboard simulator module (107), which executes the mouse clicks and keyboard entries on the original GUI.
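The click-translation step described above may be sketched, with assumed widget names and coordinates, as a lookup from a widget's bounding box on the redrawn GUI to the equivalent click point on the original GUI:

```python
# new-GUI widget -> its bounding box on the localized GUI and the click
# point that activates the same control on the original GUI.
WIDGET_MAP = {
    "start_button": {"new_box": (10, 10, 90, 40), "orig_click": (412, 305)},
}

def translate_click(x, y):
    """Return the original-GUI click for a click on the localized GUI."""
    for widget in WIDGET_MAP.values():
        x0, y0, x1, y1 = widget["new_box"]
        if x0 <= x <= x1 and y0 <= y <= y1:
            return widget["orig_click"]
    return None  # click fell outside any mapped widget
```

Text entries would be translated analogously, mapping a localized input field back to its counterpart on the original GUI before the keystrokes are replayed.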
[0053] The invention has been explained in relation to specific embodiments. It is to be understood that the foregoing description is only illustrative of the present invention and it is not intended that the invention be limited thereto. Many other specific embodiments of the present invention will be apparent to one skilled in the art from the foregoing disclosure. The scope of the invention should therefore be determined not with reference to the above description but with reference to the claims, along with the full scope of equivalents to which such claims are entitled.