METHOD FOR THE MACHINE-BASED DETERMINATION OF THE FUNCTIONAL STATE OF SUPPORT ROLLERS OF A BELT CONVEYOR SYSTEM, COMPUTER PROGRAM AND MACHINE-READABLE DATA CARRIER
20230242350 · 2023-08-03
Inventors
- Phillip Esser (Köln, DE)
- Sophie Ruoshan Wei (München, DE)
- Martin Krex (Bottrop, DE)
- David Handl (München, DE)
CPC classification
B65G43/02
PERFORMING OPERATIONS; TRANSPORTING
International classification
B65G43/02
PERFORMING OPERATIONS; TRANSPORTING
Abstract
The present invention relates to a method for the machine-based determination of the functional state of support rollers (13) of a belt conveyor system (1) during operation of the belt conveyor system, wherein at least one unmanned vehicle (2) with at least one imaging sensor system is provided, by means of which at least sections of the belt conveyor system can be sensed in the form of image data, wherein image data of at least one subregion of the belt conveyor system is captured as thermal image data. In the captured image data of the belt conveyor system, at least one identification image region position is determined automatically, in which at least one subregion of a support roller (13) is imaged. For each identification image region position determined from the image data, an analysis image region position is automatically defined in the thermal image data. In each defined analysis image region position, thermal image data is automatically analyzed and the functional state of support rollers (13) is determined. Furthermore, the present invention relates to a method for the identification of functionally impaired support rollers (13), a computer program configured to carry out these methods, a machine-readable data carrier containing such a computer program, and a device with a data processing device (3) for the evaluation of the captured image data.
Claims
1. A method for the machine-based determination of the functional state of the support rollers (13) of a belt conveyor system (1) during operation of the belt conveyor system (1), wherein at least one unmanned vehicle (2) with at least one imaging sensor system is provided, by means of which at least sections of the belt conveyor system (1) can be sensed in the form of image data, wherein image data of at least one subregion of the belt conveyor system (1) is captured as thermal image data, characterized in that in the captured image data of the belt conveyor system (1) at least one identification image region position, in which at least one subregion of a support roller (13) is imaged, is automatically determined, for each determined identification image region position from the image data an analysis image region position is automatically defined in the thermal image data, and in each defined analysis image region position, thermal image data is automatically analyzed in order to determine the functional state of support rollers (13).
2. The method as claimed in claim 1, wherein image data of at least one subregion of the belt conveyor system (1) is captured as thermal image data by moving the at least one unmanned vehicle (2) with the at least one imaging sensor system, comprising at least one thermal image sensor device (21) for capturing the thermal image data, along at least one subregion of the belt conveyor system (1), image data from at least one subregion of the belt conveyor system (1) is captured with the imaging sensor system and the image data comprises at least thermal image data, in the captured image data of the belt conveyor system (1) the at least one identification image region position is automatically determined by automatically detecting image data regions in the captured image data, in which regions at least one subregion of a support roller (13) is imaged, and the position of the respectively detected image data region is automatically provided as the identification image region position of the respective support roller (13), for each identification image region position determined from the image data an analysis image region position is automatically defined in the thermal image data by automatically defining, for each identification image region position of the support roller (13) identified from the image data, an analysis image region position of the support roller (13) in the thermal image data, which position corresponds spatially to the respective identification image region position from the image data, and thermal image data is automatically analyzed in each defined analysis image region position by automatically determining temperature data of the relevant support roller (13) from the thermal image data in the defined analysis image region position of the support roller (13) and automatically assigning the temperature data determined from the analysis image region position to a functional state of the respective support rollers (13).
3. The method as claimed in claim 1 or 2, wherein the image data that can be detected by the imaging sensor system also comprises photographic image data in addition to the thermal image data, wherein image data of at least one subregion of the belt conveyor system (1) is also captured as photographic image data, the captured image data of the belt conveyor system (1), in which at least one identification image region position is determined automatically, is photographic image data in which photographic image data regions are automatically detected as image data regions, and the position of each detected photographic image data region is automatically provided as the identification image region position.
4. The method as claimed in any one of claims 1 to 3, wherein the at least one identification image region position is automatically determined in the captured image data of the belt conveyor system (1) by automatically detecting image data regions in the image data in which at least one detectable object is imaged, automatically detecting support rollers (13) or subregions of support rollers (13) among the detectable objects in the image data regions thus detected, and for each detected support roller (13) and for each detected subregion of a support roller (13), providing the position of the image data region in which the support roller (13) or the subregion of the support roller (13) is imaged as an identification image region position.
5. The method as claimed in any one of claims 1 to 4, wherein the recognition quality in the automatic determination of the identification image region position in the captured image data, in particular the recognition quality of the automatic detection of image data regions in which at least one detectable object is imaged, in particular at least one subregion of a support roller (13), and/or the recognition quality of the automatic detection of support rollers (13) or subregions of support rollers (13) in the detected image data regions is improved by means of a learning procedure carried out by means of an artificial neural network, in particular by means of a single-level or multi-level convolutional neural network.
6. The method as claimed in any one of claims 1 to 5, wherein in each defined analysis image region position, thermal image data is automatically analyzed by automatically selecting thermal image data in the defined analysis image region positions, which is thermal data of the support rollers (13), and the selected thermal data in the defined analysis image region positions is automatically analyzed.
7. The method as claimed in claim 6, wherein thermal image data is automatically selected that lies in those subregions within the defined analysis image region positions in which thermal image data is arranged in circular or near-circular contours and/or in which thermal image data corresponds to the temperatures of an equal temperature level.
8. The method as claimed in claim 6 or 7, wherein the recognition quality in the automatic selection of the thermal image data, which is thermal image data of the support rollers (13), is improved by means of a learning procedure carried out by means of an artificial neural network, in particular the recognition quality in the automatic selection of the thermal image data, which lies in such subregions within the defined analysis image region positions in which thermal image data is arranged in circular or near-circular contours and/or in which thermal image data corresponds to the temperatures of an equal temperature level.
9. The method as claimed in any one of claims 1 to 8, wherein the image data of the belt conveyor system (1) is assigned to a position in the belt conveyor system (1) and/or for defined analysis image region positions the thermal image data or evaluation information is assigned to an individual support roller (13) of the belt conveyor system (1), the assignment being performed in particular via a radio-based position determination, by comparing captured image data against reference image data, by recording the route traveled by the unmanned vehicle (2), and/or by detecting the orientation of the imaging sensor system.
10. The method as claimed in claim 2 or any one of claims 3 to 9 referring back to claim 2, wherein the temperature data determined from the analysis image region position is automatically assigned to a functional state of a support roller (13), by either automatically classifying the determined temperature data of the respective support roller (13) to one functional state of a plurality of previously defined functional states or by automatically assigning said data to a functional state as part of a cluster analysis, in particular as part of a multivariate cluster analysis.
11. The method as claimed in any one of claims 1 to 10, wherein the recognition quality in the automatic determination of the functional state of support rollers (13), in particular the classification of the thermal image data, is improved by means of a learning procedure carried out by means of an artificial neural network.
12. The method as claimed in any one of claims 1 to 11, wherein the functional state, the image data and/or, if applicable, the determined temperature data for the respective support roller (13), are recorded in a functional state data collection.
13. A method for identifying functionally impaired support rollers (13) of a belt conveyor system (1) during operation of the belt conveyor system (1), the method comprising the method for the machine-based determination of the functional state of the support rollers (13) of a belt conveyor system (1) as claimed in any one of claims 1 to 12, wherein functionally impaired support rollers (13) are detected, a time for replacement of the respective support roller (13) is determined based on a comparison with historical data from a functional state database, and for this support roller (13) the determined time is output to a communication interface.
14. A computer program which is configured to execute each step of a method as claimed in any one of claims 1 to 13.
15. A machine-readable data storage carrier on which a computer program according to claim 14 is stored.
16. A device for the machine-based determination of the functional state of support rollers (13) of a belt conveyor system (1) during operation of the belt conveyor system (1), the device comprising at least one unmanned vehicle (2) with at least one imaging sensor system that can be moved along at least one subregion of the belt conveyor system (1), by means of which sensor system at least some sections of the belt conveyor system (1) can be captured by sensors in the form of image data, and which comprises at least one thermal image sensor device (21) for capturing thermal image data, a data processing device (3) for evaluating the captured image data in order to determine the functional state of support rollers (13) of a belt conveyor system (1) during operation of the belt conveyor system (1), the data processing device (3) having an image region identification module (31) which is configured to automatically identify in captured image data at least one identification image region position in which at least one subregion of a support roller (13) is imaged, an image region definition module (34) which is configured to automatically define an analysis image region position in the thermal image data for an identification image region position identified from the image data, and a state identification module (35) which is configured to automatically analyze thermal image data at a defined analysis image region position to automatically determine the functional state of support rollers (13).
17. The device as claimed in claim 16, wherein the image region identification module (31) has an image region detection module (32) and an interface module (33), the image region detection module (32) of which is configured to automatically detect image data regions in the captured image data in which at least one subregion of a support roller (13) is imaged, and the interface module (33) of which is configured to automatically provide the position of the detected image data region as the identification image region position of the respective support roller (13), and wherein the image region definition module (34) is configured to automatically define, for each identification image region position of a support roller (13) determined from the image data, an analysis image region position of the support roller (13) in the thermal image data that corresponds spatially to the respective identification image region position from the image data, and wherein the state identification module (35) has an analysis module (36) and an assignment module (37), the analysis module (36) of which is configured to automatically determine temperature data of the respective support roller (13) in the defined analysis image region position of the support roller (13) from the thermal image data, and the assignment module (37) of which is configured to automatically assign the temperature data determined from the analysis image region position to a functional state of the respective support rollers (13).
18. The device as claimed in claim 17, wherein the data processing device (3) has at least one trainable artificial neural network, by means of which at least one of the following detections is carried out: the detection of support rollers (13) or subregions of support rollers (13) for the automatic determination of image data regions in the captured image data, in which at least one subregion of a support roller (13) is imaged, the detection of thermal image data of a support roller (13) for the automatic determination of temperature data of the support roller (13) in a defined analysis image region position of the support roller (13), the detection of functional states of a support roller (13) from the determined temperature data of the support roller (13) for the automatic assignment of the functional state of the support rollers (13).
Description
[0059] In order to clarify the invention, representative examples are described in general in the following with reference to the corresponding figures. From these examples, further advantages and possible applications emerge, which will be described in more detail in the following without limiting the general inventive idea underlying these exemplary embodiments; individual features and sub-steps are also explained, which can be linked together in almost arbitrary ways depending on the particular desired objectives of such a method. In the drawings, schematically in each case:
[0064] An unmanned vehicle 2 is arranged on the belt conveyor system 1. This is an autonomously traveling vehicle or a remote-controlled vehicle controlled by an operator, for example by means of a remote radio control. The vehicle 2 is a ground-based land vehicle, but it can also be designed differently, for example as an aircraft. The unmanned vehicle 2 moves along the entire belt conveyor system 1. In addition, multiple unmanned vehicles can also be assigned to the belt conveyor system 1. For example, it can be provided that each vehicle from a plurality of unmanned vehicles only moves in one subsection of the belt conveyor system 1, so that all vehicles of this plurality together are required to sense the entire belt conveyor system 1.
[0065] The unmanned vehicle 2 has an imaging sensor system that includes a thermal imaging sensor device 21 and a photographic image sensor device 22. The thermal imaging sensor device 21 and the photographic image sensor device 22 are located close together and have the same orientation and essentially the same imaging angle, so that they capture essentially the same object space. The thermal imaging sensor device 21 is a thermal imaging camera and the photographic image sensor device 22 is a photographic camera (monoscopic or stereoscopic), wherein both cameras record the captured image data electronically in digital form. The cameras can record single images or corresponding moving images (film recordings) as image data. In addition, the unmanned vehicle 2 has a position determination device 23 in order to record, as position data, the current position of the vehicle 2 in correlation with the image data captured in each case; in this case, it is a GPS module, and the position information is inserted into the image data as meta-information. Alternatively, the photographic image sensor device 22 can be used for position determination, and even the thermal imaging sensor device 21 if necessary. In addition, the sensor system of the vehicle 2 has an orientation detection in order to determine the respective orientation as orientation information when capturing the image data, in this case an inertial measurement unit.
[0066] The recorded image data, the position data and the orientation information are transmitted via a data transmission module 24 to a data processing device 3 where they are evaluated. In this case, the data processing device 3 is implemented separately from the vehicle 2, for example in the vicinity of the control station of the belt conveyor system 1. Instead, the data processing device 3 can also be arranged on the vehicle 2.
[0067] In the data processing device 3, the data transmitted by the data transmission module 24 of the unmanned vehicle 2, including the image data—the thermal image data from the thermal imaging camera and the photographic image data from the optical camera—is received by the data transmission module 38 of the data processing device 3. The photographic image data (and also, if applicable, the thermal image data) is first fed to an image region identification module 31, where it serves as input data for an image region detection module 32 of the image region identification module 31. In the image region detection module 32, image regions in which support rollers 13 or subregions of support rollers 13 are imaged are automatically detected. The positions of the detected image regions (identification image region positions) are provided for further processing via an interface module 33 of the image region identification module 31. From the interface module 33, the image region definition module 34 obtains the positions of the image regions detected in the photographic image data and automatically defines appropriate spatially corresponding image region positions in the thermal image data as analysis image region positions. The thermal image data with the analysis image region positions thus defined is forwarded to the state identification module 35. Within the state identification module 35, the thermal image data is automatically analyzed in an analysis module 36, wherein the analysis is limited here to the appropriately defined image region positions of the support rollers 13. The results of this analysis are entered as temperature data into the assignment module 37, which, based on the temperature data of a support roller 13 obtained from the thermal image data in the respective analysis image region positions, automatically assigns a functional state to this support roller 13.
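Purely for illustration, the data flow through the modules 31 to 37 described above can be sketched as follows. All class names, function names and the temperature threshold are hypothetical assumptions for this sketch and are not part of the disclosed implementation:

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Region:
    # Axis-aligned selection frame, given by two opposite corner points.
    x1: int
    y1: int
    x2: int
    y2: int

def detect_roller_regions(photo_image) -> List[Region]:
    # Stand-in for the image region detection module 32: in practice a
    # trained detector; here a fixed region is returned for illustration.
    return [Region(40, 10, 80, 30)]

def define_analysis_regions(ident_regions: List[Region]) -> List[Region]:
    # Image region definition module 34: both cameras share orientation
    # and imaging angle, so the region is carried over unchanged here.
    return list(ident_regions)

def analyze_thermal(thermal_image, region: Region) -> float:
    # Analysis module 36: maximum temperature inside the analysis region
    # (thermal_image is a row-major grid of temperature values in °C).
    return max(max(row[region.x1:region.x2])
               for row in thermal_image[region.y1:region.y2])

def assign_state(temp_c: float) -> str:
    # Assignment module 37: simple threshold classification; the 60 °C
    # threshold is an invented example value.
    return "impaired" if temp_c > 60.0 else "normal"
```

In this sketch the thresholding stands in for the classification or cluster analysis described in the claims; the module boundaries mirror those of the data processing device 3.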
[0068] The data processing device 3 has a plurality of trainable artificial neural networks (not shown). One of these trainable artificial neural networks is used to detect support rollers 13 and subregions of support rollers 13 in order to identify image data regions in the captured image data, in which at least one subregion of a support roller 13 is imaged. Another of these trainable artificial neural networks is used to detect thermal image data of a support roller 13 in order to automatically determine the temperature data of the support roller 13 in a defined analysis image region position of the support roller 13. Another of these trainable artificial neural networks is used to detect functional states of a support roller 13 from the determined temperature data of the support roller 13, in order to automatically assign a functional state to the respective support rollers 13. Artificial neural networks are also used for the corresponding machine learning processes.
[0069] A schematic perspective partial view of a belt conveyor system 1 with a special device for the machine-based determination of the functional state of the support rollers 13 of the belt conveyor system 1 during operation of belt conveyor system 1 according to a particular embodiment of the present invention is shown in
[0070] The following describes a computer-based method for identifying functionally impaired support rollers 13, based on the method for the machine-based determination of the functional state of support rollers 13 of a belt conveyor system 1 during operation of the belt conveyor system 1. This method can be used in particular in the case of belt conveyor systems 1 described in connection with
[0071] The method provides for an unmanned vehicle 2 with at least one imaging sensor system, by means of which at least sections of the belt conveyor system 1 can be sensed in the form of image data, wherein image data from at least one subregion of the belt conveyor system 1 is captured as thermal image data (step 200). In the captured image data of the belt conveyor system 1, at least one identification image region position is determined automatically (step 300), in which at least one subregion of a support roller 13 is imaged. For each identification image region position determined from the image data, an analysis image region position is automatically defined in the thermal image data (step 410). In each defined analysis image region position, thermal image data is automatically analyzed (step 400) to automatically determine the functional state of support rollers 13.
[0072] The at least one unmanned vehicle 2 with the at least one imaging sensor system comprising at least one thermal imaging sensor device 21 is moved along at least one subregion of the belt conveyor system 1 to capture the thermal image data. Using the imaging sensor system of the unmanned vehicle 2, which moves along the main section (conveying direction) of the belt conveyor system 1, photographic image data is captured (step 100) and thermal image data is captured (step 200) in sections of the belt conveyor system 1. The unmanned vehicle 2 can be an autonomous unmanned vehicle or a remote-controlled unmanned vehicle. In particular, drones or robots can also be considered as autonomous unmanned vehicles. Such a vehicle 2 can be designed in principle as a track-bound or track-less vehicle, in particular as a land vehicle or as an aircraft (for example as a quadricopter).
[0073] In step 100, the imaging sensor system is used to capture photographic image data from at least one subregion of the belt conveyor system 1 and in step 200, the imaging sensor system is used to capture thermal image data from this at least one subregion of the belt conveyor system 1. The image data captured by the imaging sensor system of this vehicle 2 represents support rollers 13 of a subregion of the belt conveyor system 1 either completely or partially, in particular including the bearings (anti-friction bearings) of the corresponding support rollers 13. The photographic image data and thermal image data in this case are static monoscopic recorded images, i.e. digital photographs or digital thermal images. However, other types of image data can be used instead, such as corresponding moving images (film recordings, videos), stereoscopic images and the like.
[0074] In addition to the spatially resolved image information, the photographic image data also contains the exact position (location position) at which the picture was taken (i.e. in the form of the position of the vehicle 2), as well as the orientation information indicating the orientation at which the photographic image sensor device 22 took the picture (alternatively or additionally, the corresponding information may also be included in the thermal image data). This additional information is stored in the corresponding image files in addition to the pixel data as metadata. Instead, this information can of course also be stored and transmitted separately from the image data, provided the information can be uniquely assigned to the corresponding image data, for example via a common time stamp. The position is determined by means of a position determination device 23 that is provided on the vehicle 2 itself or in a data processing device 3. In the present case, the position determination device 23 is a device for radio-based position determination, namely for satellite-based position determination by means of GPS. Instead, other devices and methods can also be used for determining position, for example a position determination based on optical environment detection, for example by comparing image data with previously captured image data or by comparing image data with a previously created three-dimensional terrain model, such as a three-dimensional point cloud, which also images the corresponding subregion of the belt conveyor system 1.
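Where the position and orientation information is stored separately from the image data and linked via a common time stamp, as mentioned above, the pairing could look roughly like the following sketch; the record layout and all numerical values are illustrative assumptions:

```python
import bisect

def nearest_record(records, t):
    # records: position/orientation log of the vehicle, sorted by time,
    # as tuples (timestamp_s, latitude, longitude, heading_deg).
    # Returns the record temporally closest to the image time stamp t,
    # which is then treated as the capture position and orientation.
    times = [r[0] for r in records]
    i = bisect.bisect_left(times, t)
    candidates = [c for c in (i - 1, i) if 0 <= c < len(records)]
    return min((records[c] for c in candidates), key=lambda r: abs(r[0] - t))
```

This temporal nearest-neighbor pairing is only one conceivable realization of the unique assignment via a common time stamp described in the text.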
[0075] The image data is buffered during or after the recording and then transmitted—for example immediately afterwards, at fixed time intervals, or after the vehicle 2 has returned to a base station—to a data processing device 3. In the data processing device 3, the data is automatically processed and stored together with the results of such a data processing as target data, for example in the form of databases. The target data obtained in this way is made available to potential users via an interactive communication interface, for example to the operator's employees, the service personnel, maintenance technicians and the like.
[0076] In step 300, image data regions are automatically detected in the captured image data of the belt conveyor system 1, in which regions at least one subregion of a support roller 13 is imaged. This step includes automatic detection of image data regions (step 320) and automatic provision of detected image data regions as identification image region positions (step 330), as well as improving the recognition quality of the automatic detection (step 310). At the end of step 300, the position of the detected image data region is automatically provided as the identification image region position of the respective support roller 13 (step 330).
[0077] In this case, the image data is photographic image data and the image data regions are photographic image data regions. In the photographic image data, photographic image data regions in which at least one identifiable object is imaged are thus automatically recognized (step 320). In the photographic image data regions detected in this way, support rollers 13 or their subregions are automatically detected among the identifiable objects. This enables the detection and identification of objects, in particular of support rollers 13 and the associated bearings (anti-friction bearings), on the basis of this photographic image data. For each detected support roller 13 and for each detected subregion of a support roller 13, the position of the photographic image data region in which the support roller 13 or the subregion of the support roller 13 is imaged is provided for further processing as an identification image region position (object position and object class) (step 330).
[0078] The automatic detection (step 320) of image data regions represents a central operation within step 300. In this case, the automatic detection 320 of image data regions is based on a machine learning algorithm. The recognition quality in the automatic determination of the identification image region position in the captured photographic image data is improved by means of a learning procedure (step 310); this applies in particular to the recognition quality of the automatic recognition of photographic image data regions in which at least one identifiable object is imaged (primarily at least one subregion of a support roller 13) and/or to the recognition quality of the automatic recognition of support rollers 13 or subregions of support rollers 13 in the detected photographic image data regions. The learning procedure is carried out using an artificial neural network based on training data (training images), for example photographic image data in which the image regions containing the objects to be learned, in particular support rollers 13 or subregions of support rollers 13, have been manually marked with position and object designation. By means of the training data, the recognition quality in the classification is improved, among other things. In addition, other objects characteristic of the belt conveyor system 1 may be marked in the training data, such as subregions of support frame elements, the conveyor belt 11 and the like. In this case, the training data also contains the verified object class assigned to the detected object as an identifier. The artificial neural network here is a multi-level convolutional neural network consisting of two “convolutional neural network” modules. The first module is designed to provide a preliminary classification of image data regions in the photographic image data where different objects might be located.
The result of this preliminary classification is then transferred to the second module to perform an object-based classification in the potentially relevant image data regions identified in this way. The result of this classification of photographic image data returns the object class assigned to the detected object (the object name, for example “support roller”, “bearing”, “steel cable” or the like), which is assigned a probability value in the range from 0 to 1. The object class with the highest probability value represents the result of this evaluation. The evaluation of the image also returns the associated position of the classified object within the image data. In this case, the position of such an object is defined by four numbers, namely by two pairs of x/y coordinates, each corresponding to point positions in the image. The two points in the photographic image data associated with the point positions span a rectangle as opposite corners of a rectangular selection frame in which the object is centrally positioned. Depending on the photographic image data examined, multiple objects can be identified simultaneously in one photographic image.
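The evaluation result described above, an object class with a probability value between 0 and 1 plus a selection frame given by two opposite corner points, might be represented as in the following sketch; the class names, scores and coordinates are invented examples:

```python
# One detection result: class probabilities plus the selection frame,
# given by two opposite corner points (x1, y1) and (x2, y2) in pixels.
detection = {
    "scores": {"support roller": 0.92, "bearing": 0.05, "steel cable": 0.03},
    "frame": ((412, 230), (498, 291)),
}

def best_class(det):
    # The object class with the highest probability value is the result
    # of the evaluation.
    return max(det["scores"], key=det["scores"].get)

def frame_size(det):
    # Width and height of the rectangular selection frame spanned by the
    # two corner points.
    (x1, y1), (x2, y2) = det["frame"]
    return abs(x2 - x1), abs(y2 - y1)
```

A single photographic image may yield several such detection records, one per identified object.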
[0079] The image data of the belt conveyor system 1 is assigned to a position in the belt conveyor system 1. Additionally or instead, for defined analysis image region positions, the thermal image data or evaluation information is assigned to an individual support roller 13 of the belt conveyor system in each case. The assignment is performed in particular using a radio-based position determination, from a comparison of captured image data with reference image data, by detecting a route traveled by the unmanned vehicle 2, and/or by means of an orientation detection of the imaging sensor system. Such an assignment can be made at any point in the method before a comparison with historical data takes place. For this purpose, the belt conveyor system 1 is initially measured and the exact detection position (such as the GPS position) of each individual support roller 13 is determined. For a particularly reliable analysis, a three-dimensional digital model of the belt conveyor system 1 can also be created for this purpose (this can also be used as an intuitively comprehensible display of the identified information). Each image recording and thermal image recording then includes the position of the vehicle 2 (in the form of GPS coordinates) as metadata, as well as information on the orientation of the image sensor devices during data capture (such as via an internal electronic compass, an inertial measurement unit, or the pivot angle display of a robotic arm), thus enabling the precise assignment of the identified objects from the image recognition (step 320) to a specific support roller 13 of the belt conveyor system 1. In order to ensure an exact assignment, a GPS module is used here that offers position detection based on a differential GPS procedure with a resolution of the GPS position in the centimeter range. Alternatively, the position can also be determined optically via an image comparison.
The current (two-dimensional) photographic camera image (captured by means of a monoscopic camera) or a point cloud (a 3D point cloud, captured by means of a stereoscopic camera, a moving monoscopic camera, a scanning laser, lidar, radar or the like) is compared with two-dimensional photographic images with known positions or with a geo-referenced point cloud. In addition, it is also possible to determine the position via near-range radio fields, for example via RFID transponders that are arranged in the support roller stations of the respective support rollers 13 and that are read out via a corresponding reader device in the vehicle 2. Position determination by radio-based methods is carried out at the time of the image data capture; a position determination by image comparison can also be carried out at a later time on the basis of the captured image data. In the present case, the position determination is carried out using GPS simultaneously with step 100 and/or 200, so that the location information is stored together with the image data and the orientation data, and an assignment to specific support rollers 13 is then performed after the automatic recognition of image region data (step 320).
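The assignment of a capture position to an individual, previously surveyed support roller can be sketched as a nearest-neighbor lookup. This is a hedged illustration, not the claimed procedure; the roller identifiers and planar coordinates (standing in for differential GPS fixes) are assumptions.

```python
# Illustrative sketch: assign a capture position to the nearest surveyed
# support roller. In a real system the positions would be DGPS coordinates
# determined when the belt conveyor system was initially surveyed.
import math

surveyed_rollers = {        # hypothetical roller id -> surveyed (x, y) in metres
    "R-001": (0.0, 0.0),
    "R-002": (1.2, 0.0),
    "R-003": (2.4, 0.0),
}

def assign_to_roller(capture_pos, rollers=surveyed_rollers):
    """Return the id of the surveyed roller closest to the capture position."""
    return min(rollers, key=lambda rid: math.dist(capture_pos, rollers[rid]))

print(assign_to_roller((1.15, 0.03)))  # "R-002"
```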
[0080] In step 410, for each identification image region position of a support roller 13 determined from the image data and provided, an analysis image region position of the support roller 13 is automatically defined in the thermal image data, which corresponds spatially to the respective identification image region position from the image data. In the subsequent automatic analysis of the thermal image data (step 400), the corresponding temperature data of the support rollers 13 is determined only for the thermal image data in the analysis image region positions and subjected to an analysis (thermal analysis), in order then to assign the respective functional state (wear state) to the support rollers 13 and their bearings. Since the thermal imaging sensor device 21 and the photographic image sensor device 22 are only a short distance apart, are rigidly aligned in the same direction with respect to the vehicle 2, and both sensor devices capture the image data approximately simultaneously, the information from the object detection described above in the photographic image data (step 300) can be transferred to the thermal image data (from the known spacing between the two image sensor devices, a corresponding pixel offset can be calculated). If an object in the photographic image data has been identified and marked as a support roller 13, then in step 410 the rectangular selection frame, which encloses the support roller 13 in the photographic image data, is defined in the thermal image data as an analysis image region position and is therefore also marked in the thermal image. The subsequent thermal analysis is then limited to thermal image data within this selected analysis image region.
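The transfer of the selection frame from the photographic to the thermal image described in step 410 can be sketched as a simple shift by the known inter-sensor offset. The offset values below are illustrative assumptions, not values from the description.

```python
# Minimal sketch of the frame transfer in step 410: the selection frame found
# in the photographic image is carried over to the thermal image by applying
# the fixed pixel offset resulting from the known spacing of the two rigidly
# mounted sensor devices. The offset (8, -3) is a hypothetical example value.
def transfer_frame(photo_frame, offset=(8, -3)):
    """Shift an (x1, y1, x2, y2) selection frame by the inter-sensor offset."""
    x1, y1, x2, y2 = photo_frame
    dx, dy = offset
    return (x1 + dx, y1 + dy, x2 + dx, y2 + dy)

print(transfer_frame((120, 80, 180, 140)))  # (128, 77, 188, 137)
```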
[0081] In step 400, the selected thermal data is automatically analyzed in the defined analysis image region position of the support roller 13. To do this, temperature data of the respective support roller 13 is first automatically determined from the thermal image data in the defined analysis image region positions (step 420). The temperature data determined in the analysis image region is then automatically assigned to a functional state of the respective support rollers 13 (step 430). Step 400 therefore includes the automatic determination of temperature data (step 420) and the automatic assignment of the temperature data to a functional state (step 430), but not the automatic definition of the analysis image region position (step 410—however, a method sequence would also be possible in which the automatic definition of the analysis image region position is part of an automatic analysis of the thermal image data).
[0082] In step 420, thermal image data is automatically analyzed at each defined analysis image region position. To this end, the thermal image data that images the support rollers 13 is first automatically selected within the defined analysis image region positions. The thermal image data thus selected in the defined analysis image region positions is then automatically analyzed.
[0083] To ensure that the thermal analysis is carried out on the support roller 13 and its bearings and not on other objects such as the conveyor belt 11 or steel cables, which may also be located in the rectangular selection frame defined as the analysis image region, subregions are defined within the analysis image region position using two different selection procedures. This makes it possible to keep the object region as small as possible for the subsequent actual data analysis. To do this, subregions of the thermal image data that contain thermal image data arranged in circular or near-circular contours are automatically selected within the defined analysis image region positions. In addition, subregions of the thermal image data in which the thermal image data corresponds to temperatures of an equal temperature level are automatically selected within the defined analysis image region positions. In the first selection procedure, the selection frame is therefore examined for circular contours. The region for inspecting the bearing of the support roller 13 is then defined as the average of all circular contours found, which are consequently selected. In the second selection procedure, multiple regions with the same temperature level are selected based on the temperature distribution in the selection frame (the term temperature level encompasses a certain temperature range, the width of which must be selected according to the specific operating conditions of the belt conveyor system 1). To combine the two selection procedures, the positions of the subregions selected in the first selection procedure are compared with the positions of the subregions selected in the second selection procedure within each analysis image region position. The subregion whose thermal image data is subjected to the subsequent actual thermal analysis is given by the joint intersection set (intersection surface) of the subregions selected in the two selection procedures.
The combination of the two selection procedures provides a particularly high level of confidence that the automatically selected thermal image data is actually representative of the support rollers 13 and their bearings. In addition, the recognition quality in the automatic selection of the thermal image data of the support rollers 13 (in particular the selection of thermal image data lying in subregions within the defined analysis image region positions where the thermal image data is arranged in circular or near-circular contours and/or corresponds to temperatures of an equal temperature level) is improved by a learning procedure carried out using an artificial neural network, namely a multi-level convolutional neural network comprising two convolutional neural network modules.
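The combination of the two selection procedures by intersection can be sketched with simple pixel sets. The pixel coordinates below are toy data chosen for illustration; a real system would derive the first set from a circular-contour search (e.g. a Hough circle transform) and the second from a temperature-level segmentation of the thermal image.

```python
# Sketch of combining the two selection procedures: pixels selected by the
# circular-contour search and pixels selected by the equal-temperature-level
# search are intersected; only this joint subregion (the intersection set)
# is then subjected to the actual thermal analysis.
circular_pixels = {(x, y) for x in range(3, 8) for y in range(3, 8)}
equal_level_pixels = {(x, y) for x in range(5, 10) for y in range(5, 10)}

analysis_pixels = circular_pixels & equal_level_pixels  # intersection set

print(len(analysis_pixels))  # 9 pixels in the overlapping 3x3 subregion
```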
[0084] In step 430, the temperature data determined from the analysis image region position is automatically assigned to a functional state of a support roller 13, either by automatically classifying the determined temperature data of the respective support roller 13 into one functional state of a plurality of previously defined functional states or by automatically assigning said data to a functional state as part of a cluster analysis, in particular a multivariate cluster analysis. In addition, the recognition quality in the automatic determination of the functional state of the support rollers 13, in particular the classification of the thermal image data, is improved by means of a learning procedure carried out with an artificial neural network. In the selected regions, characteristic features of the temperature, such as the maximum temperature, the minimum temperature, the temperature mean and the temperature median, as well as characteristic features of the temperature distribution (e.g. its width, symmetry, or any hotspots), are extracted. The automatic assignment to a functional state (step 430) is performed here by means of a multivariate clustering algorithm, by which the totality of the characteristic features of each support roller 13 and its bearings is assigned to one of a number of wear states. In the present case, this is an assignment to “Replacement required”, “Replacement required soon” or “Replacement not yet foreseeable”. This clustering algorithm is continuously trained; for this, suitably verified information about the actual functional state of the support rollers 13 is entered as training data, such as information obtained from a manual check of the functional state as part of an inspection or maintenance.
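Feature extraction and assignment to a wear state can be sketched as follows. This replaces the trained multivariate clustering algorithm of the description with fixed, purely illustrative centroids in a two-dimensional (maximum, mean) feature space; the centroid values and temperatures are assumptions, not values from the description.

```python
# Hedged sketch of step 430: characteristic temperature features are extracted
# per support roller and assigned to the nearest of the named wear states.
# A trained multivariate clustering model is stood in for by fixed centroids.
import math
from statistics import mean

wear_state_centroids = {            # hypothetical centroids in (max, mean) space
    "Replacement not yet foreseeable": (45.0, 35.0),
    "Replacement required soon":       (80.0, 60.0),
    "Replacement required":            (95.0, 75.0),
}

def extract_features(temps):
    """Characteristic features of the temperature data in the subregion."""
    return (max(temps), mean(temps))

def assign_state(temps):
    """Assign the feature vector to the wear state with the nearest centroid."""
    feats = extract_features(temps)
    return min(wear_state_centroids,
               key=lambda s: math.dist(feats, wear_state_centroids[s]))

print(assign_state([42.0, 44.0, 47.0, 38.0]))  # a roller at normal temperature
print(assign_state([88.0, 92.0, 95.0, 90.0]))  # a strongly overheated roller
```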
[0085] The acquisition of the image data and its analysis, i.e. the method for the machine-based determination of the functional state of the support rollers 13 of a belt conveyor system 1 during operation of the belt conveyor system 1 (steps 100-400), is carried out at regular time intervals. For each support roller 13, the functional state, the image data and the determined temperature data are recorded in a functional state data collection, in particular in a functional state database (not shown). In this functional state data collection, each support roller 13 of the belt conveyor system 1 is recorded with its exact GPS position and its respective specification; this includes, for example, the size, material, manufacturer, type number, installation date and the like of the support roller 13. For each support roller 13, the storage paths of the associated image data (as raw data and as processed data) and the analysis results are also stored with a time stamp in each acquisition cycle. In addition to the position (of the support roller 13 and its bearings) and the temperature data, in particular the characteristic temperature parameters such as maximum temperature, minimum temperature, mean temperature, median temperature and the temperature distribution, the stored analysis results also include the determined functional states (as wear state classes). Furthermore, additional information such as temperature limits (for example, 80° C. for an imminent replacement and 90° C. for an immediate replacement), pending maintenance operations and the like can be stored for each support roller 13.
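One record of the functional state data collection described above can be sketched as a plain data structure. The field names and types are assumptions for illustration; only the listed contents (GPS position, specification, storage paths, time stamp, temperature data, functional state, temperature limits) follow the description.

```python
# Illustrative sketch of one entry in the functional state data collection.
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class RollerRecord:
    roller_id: str                 # hypothetical identifier of the roller
    gps_position: tuple            # (lat, lon), surveyed once per roller
    specification: dict            # size, material, manufacturer, type number, ...
    timestamp: datetime = field(default_factory=datetime.now)
    image_paths: list = field(default_factory=list)      # raw and processed data
    temperature_data: dict = field(default_factory=dict) # max, min, mean, median, ...
    functional_state: str = "Replacement not yet foreseeable"
    temperature_limits: dict = field(
        default_factory=lambda: {"imminent": 80.0, "immediate": 90.0})

rec = RollerRecord("R-002", (51.52, 6.94), {"manufacturer": "(example)"})
print(rec.functional_state)
```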
[0086] Via an interactive communication interface (local user interface or online user interface, for example via locally installed computer programs or via mobile or web-based application programs or applications), the data stored in the functional state data collection can be accessed and retrieved, for example in order to display the corresponding information visually. In this way, service personnel or maintenance technicians can display markings and listings of all support rollers 13 requiring maintenance along with their GPS positions, or add relevant photographs and comments interactively during inspection and maintenance of support rollers 13 for documentation purposes, for example, a manual assessment of the wear state or a list of the specific maintenance work and replacement activities carried out. This information is then stored in a database with a time stamp and can be used to improve the quality of the classification as part of a machine learning procedure. Furthermore, the functional state data collection can also be used to display the exact positions of the support rollers 13 on a map display based on available map material, satellite images, or a three-dimensional digital model of the belt conveyor system 1. For example, a user can interactively select individual support rollers 13 on the map in order to view additional information on these support rollers 13. In addition, current functional states/wear states as well as selected temperature characteristics of all support rollers 13 can be displayed in an overview diagram. 
The data on the support rollers 13 can also be retrieved in list form. In this display, too, individual support rollers 13 can be selected interactively, their information viewed and comments on individual support rollers 13 stored. The temperature profile of the respective support roller 13 can be displayed, as can its thermal image data and the photographic image data, also as a chronology of the historical values including all changes.
[0087] In addition to the method described above for the machine-based determination of the functional state of support rollers 13, the method shown in
[0088] The method described above for the machine-based determination of the functional state of support rollers 13 of a belt conveyor system 1 during operation of the belt conveyor system 1 and in particular also the method described above for the identification of functionally impaired support rollers 13 of a belt conveyor system 1 during operation of the belt conveyor system 1 are each implemented in the form of a computer program which is configured to perform each step of these methods. This computer program is stored on a machine-readable data carrier.
LIST OF REFERENCE SIGNS
[0089] 1 belt conveyor system
[0090] 11 conveyor belt
[0091] 12 deflection roller
[0092] 13 support roller
[0093] 2 unmanned vehicle
[0094] 21 thermal image sensor device
[0095] 22 photographic image sensor device
[0096] 23 position determination device
[0097] 24 data transfer module
[0098] 3 data processing device
[0099] 31 image region identification module
[0100] 32 image region detection module
[0101] 33 interface module
[0102] 34 image region definition module
[0103] 35 state identification module
[0104] 36 analysis module
[0105] 37 assignment module
[0106] 38 data transfer module
[0107] 100 capturing photographic image data
[0108] 200 capturing thermal image data
[0109] 300 automatic determination of at least one identification image region position
[0110] 310 improving the recognition quality of the automatic identification
[0111] 320 automatic detection of image data regions
[0112] 330 automatic provision of identified image data regions as identification image region position
[0113] 400 automatic analysis of thermal image data
[0114] 410 automatic determination of analysis image region position
[0115] 420 automatic determination of temperature data
[0116] 430 automatic assignment to a functional state
[0117] 500 comparison with historical data from functional state database
[0118] 600 determination of the time for replacement