CABLE PROCESSING DEVICE AND METHOD

20230081730 · 2023-03-16

    Abstract

    The present invention comprises a cable processing device comprising a first receiving device adapted to receive and fix a first cable end of a cable in a predetermined position, an image recording device which is designed to capture at least one image of the first cable end, and an evaluation device which is designed to apply a trained algorithm to the at least one image, and to generate and output a control signal on the basis of at least one result output by the trained algorithm, wherein the trained algorithm is adapted to identify a predetermined feature in the at least one image, respectively, and to output a positive result if the predetermined feature is identifiable in the image. Further, the present invention discloses a corresponding method.

    Claims

    1. A cable processing device comprising: a first receiving device adapted to receive a first cable end of a cable and fix the first cable end in a predetermined position; an image recording device which is adapted to capture at least one image of the first cable end; and an evaluation device which is adapted to apply a trained algorithm to the at least one image, and to generate and output a control signal based on a result output by the trained algorithm; and wherein the trained algorithm is adapted to identify a predetermined feature in the at least one image, respectively, and to output a positive result when the predetermined feature is identifiable in the image.

    2. The cable processing device of claim 1, comprising: a second receiving device adapted to receive and fix a second cable end of the cable in a predetermined position, wherein the image recording device is further adapted to capture at least one image of the second cable end.

    3. The cable processing device of claim 1, wherein the image recording device comprises a first camera, which is configured to capture images in a top view of the first cable end.

    4. The cable processing device of claim 3, wherein the image recording device comprises a second camera adapted to capture images in a perspective view of the first cable end.

    5. The cable processing device of claim 1, wherein the trained algorithm comprises a convolutional neural network.

    6. The cable processing device of claim 1, wherein the predetermined feature comprises a marker at the first cable end and the trained algorithm is trained to identify the marker at the first cable end.

    7. The cable processing device of claim 6, wherein the evaluation device is adapted to output a positive control signal when the marker is identified at exactly the first cable end in the at least one image, and to output an error signal if the marker is not identified at exactly the first cable end in the at least one image.

    8. The cable processing device of claim 1, wherein the predetermined feature comprises a contact locking device, and the trained algorithm is trained to identify a presence of the contact locking device at the first cable end.

    9. The cable processing device of claim 8, wherein the evaluation device is adapted to output a positive control signal when the contact locking device is identified in the at least one image, and to output an error signal if the contact locking device is not identified in the at least one image.

    10. The cable processing device of claim 1, wherein the predetermined feature identifies a state of a locking device, and the trained algorithm is trained to identify the state of the locking device at the first cable end.

    11. The cable processing device of claim 10, wherein at least one of: wherein the evaluation device is adapted to output a positive control signal when the state of the locking device in the at least one image is identified as a locked state, and to output an error signal when the state is not identified as a locked state, or wherein the evaluation device is adapted to output an error signal when the state of the locking device is identified as a locked state in the at least one image and to output a positive control signal when the state is not identified as a locked state.

    12. A method for automatically processing cables, comprising: fixing at least one cable end of a cable in a predetermined position; capturing at least one image of the at least one cable end; applying a trained algorithm to the at least one image, wherein the trained algorithm is adapted to recognize a predetermined feature in the at least one image and to output a positive result when the predetermined feature is identifiable in the image; and generating and outputting a control signal based on at least one result output from the trained algorithm.

    13. The method of claim 12, wherein at least one of: wherein the at least one image is taken from a top view of the at least one cable end, or wherein the at least one image is taken in a perspective view of the at least one cable end.

    14. The method of claim 12, wherein the predetermined feature comprises a mark at the at least one cable end and the trained algorithm is trained to identify the mark at the at least one cable end, wherein a positive control signal is output when the mark is identified at exactly the at least one cable end in the at least one image, and an error signal is output when the mark is not identified at exactly the at least one cable end in the at least one image.

    15. The method of claim 12, wherein said predetermined feature comprises a locking device and the trained algorithm is trained to identify a presence of the locking device at the at least one cable end, wherein a positive control signal is output when the locking device is identified in the at least one image, and an error signal is output when the locking device is not identified in the at least one image.

    16. The method of claim 12, wherein the predetermined feature identifies a state of a locking device and the trained algorithm is trained to identify the state of the locking device at the at least one cable end, and at least one of: wherein a positive control signal is output when the state of the locking device is identified in the at least one image as a locked state, and an error signal is output when the state is not identified as a locked state, or wherein an error signal is output when the state of the locking device is identified as a locked state in the at least one image, and a positive control signal is output when the state is not identified as a locked state.

    17. A non-volatile computer program product comprising instructions which, when executed by a processor, cause the processor to execute an operation, the operation comprising: fixing at least one cable end of a cable in a predetermined position; capturing at least one image of the at least one cable end; applying a trained algorithm to the at least one image, wherein the trained algorithm is adapted to recognize a predetermined feature in the at least one image and to output a positive result when the predetermined feature is identifiable in the image; and generating and outputting a control signal based on at least one result output from the trained algorithm.

    18. The non-volatile computer program product of claim 17, wherein at least one of: wherein the at least one image is taken from a top view of the at least one cable end, or wherein the at least one image is taken in a perspective view of the at least one cable end.

    19. The non-volatile computer program product of claim 17, wherein the predetermined feature comprises a mark at the at least one cable end and the trained algorithm is trained to identify the mark at the at least one cable end, wherein a positive control signal is output when the mark is identified at exactly the at least one cable end in the at least one image, and an error signal is output when the mark is not identified at exactly the at least one cable end in the at least one image.

    20. The non-volatile computer program product of claim 17, wherein said predetermined feature comprises a locking device and the trained algorithm is trained to identify a presence of the locking device at the at least one cable end, wherein a positive control signal is output when the locking device is identified in the at least one image, and an error signal is output when the locking device is not identified in the at least one image.

    Description

    BRIEF FIGURE DESCRIPTION

    [0121] Advantageous embodiments of the invention are explained below with reference to the accompanying figures. They show:

    [0122] FIG. 1 a schematic representation of an embodiment of a cable processing device according to the present invention;

    [0123] FIG. 2 a schematic representation of another embodiment of a cable processing device according to the present invention;

    [0124] FIG. 3 a schematic representation of still another embodiment of a cable processing device according to the present invention;

    [0125] FIG. 4 a schematic representation of a cable end for processing in one embodiment of a cable processing device according to the present invention;

    [0126] FIG. 5 a schematic representation of a further cable end for processing in one embodiment of a cable processing device according to the present invention;

    [0127] FIG. 6 a schematic representation of a further cable end for processing in one embodiment of a cable processing device according to the present invention;

    [0128] FIG. 7 a further representation of the cable end of FIG. 6;

    [0129] FIG. 8 a schematic representation of a further embodiment of a cable processing device according to the present invention;

    [0130] FIG. 9 a schematic representation of a further cable end for processing in one embodiment of a cable processing device according to the present invention; and

    [0131] FIG. 10 a flowchart of an embodiment of a method according to the present invention.

    [0132] The figures are schematic representations only and are for the purpose of explaining the invention. Identical or like-acting elements are indicated throughout by the same reference numerals.

    DETAILED DESCRIPTION

    [0133] FIG. 1 shows a cable processing device 100. The cable processing device 100 has a first receiving device 101, an image recording device 102 and an evaluation device 104.

    [0134] The first receiving device 101 is adapted to receive a first cable end 191 of a cable 190 and to fix it in a predetermined position. FIG. 1 shows an example of a cable 190 having a sheath 192 enclosing a braided shield 193 disposed on an insulator 194. The insulator 194 encloses the conductor 195 of the cable 190. In particular, the cable 190 may be a coaxial cable. It is understood that the cable processing device 100 may be used with any other type of cable as well. A mark 196 is provided on the sheath 192 of the cable 190, the mark 196 representing the feature to be identified by a trained algorithm 105. It is understood that the mark 196 is merely exemplary as a feature to be identified, and that other types of features may also be used. Possible features have been discussed above and are shown in FIGS. 5-7 and 9. Exemplary features for identification by the trained algorithm 105 may comprise, for example, connectors, markings on connectors, in particular codes, locking devices on cable ends or connectors, and in particular the state of such a locking device, as explained in connection with FIGS. 5-7. Other exemplary features may include, for example, contact tongues of connectors and, in particular, the positions of the contact tongues in the connectors, as explained in connection with FIG. 9.

    [0135] The image recording device 102 captures at least one image 103 of the first cable end 191 and sends the image data to the evaluation device 104. It is understood that typically a single image 103 is captured of each cable end 191. However, if necessary, the image recording device 102 may capture multiple images 103 of the cable end 191.

    [0136] The evaluation device 104 applies the trained algorithm 105 to the one image 103 and generates a control signal 106 based on a result output from the trained algorithm 105. In particular, it is provided that the evaluation device 104 applies the trained algorithm to the directly captured image 103 or to a partial section of the directly captured image 103. In particular, the evaluation device 104 does not apply any further image processing to the image 103. For providing the partial section of the directly captured image 103, an intermediate program (in particular image processing software, for example the publicly available image processing program ‘Eye Vision Technology’ (EVT)) can be interposed between the image recording device 102 and the evaluation device 104, the sole task of which is to crop the directly captured image 103 and to forward only a defined section as a partial section to the evaluation device 104. This task of cropping a partial section from the directly captured image 103 can also be performed within the evaluation device 104. It is understood, however, that the directly captured image 103, and not just a partial section, may also be evaluated in the evaluation device 104.
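
    By way of illustration, the cropping of such a partial section may be sketched as follows (a minimal Python sketch; the row-and-column image layout and the crop region are merely exemplary and are not taken from the EVT software):

```python
# Minimal sketch of cropping a defined partial section from a directly
# captured image. The image is represented as a list of pixel rows; the
# crop region (top, left, height, width) is a hypothetical example.

def crop_partial_section(image, top, left, height, width):
    """Return the defined partial section of the directly captured image."""
    return [row[left:left + width] for row in image[top:top + height]]

# 4x6 toy "image" of grayscale pixel values
image = [[c + 10 * r for c in range(6)] for r in range(4)]
section = crop_partial_section(image, top=1, left=2, height=2, width=3)
print(section)  # [[12, 13, 14], [22, 23, 24]]
```

    Only the forwarded partial section would then be passed on for evaluation; the same function could equally be performed within the evaluation device itself.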

    [0137] The trained algorithm 105 may, for example, comprise a neural network, in particular a convolutional neural network. This neural network may be formed and trained to identify a predetermined feature 196 in the at least one image 103 and output a positive result if the predetermined feature 196 is identifiable in the image 103. A possible embodiment for such a neural network is already described above, as is a possible training procedure. In particular, it may be provided that the neural network is designed and trained to identify the predetermined feature 196 in the at least one directly captured image 103. This avoids the need for the directly captured image 103, as raw data, to first undergo image processing, image evaluation, or image analysis before being evaluated in a processed form by the neural network in a subsequent step.
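
    The control flow of the evaluation device can be sketched as follows (a schematic Python sketch; the trained convolutional neural network is replaced by a stand-in predicate, since the actual network and its weights are not specified here):

```python
# Schematic sketch of the evaluation device: apply a (stand-in) trained
# algorithm to an image and derive a control signal from its result.

POSITIVE, ERROR = "positive", "error"  # possible control signals

def evaluate(image, feature_identifiable):
    """Return a positive control signal if the predetermined feature
    is identifiable in the image, otherwise an error signal."""
    result = feature_identifiable(image)  # positive result if feature found
    return POSITIVE if result else ERROR

# Toy stand-in for the trained algorithm: the "feature" is a pixel value 255
has_mark = lambda img: any(255 in row for row in img)

print(evaluate([[0, 255], [0, 0]], has_mark))  # positive
print(evaluate([[0, 0], [0, 0]], has_mark))    # error
```

    In the device itself, `feature_identifiable` would be the trained neural network applied to the directly captured image.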

    [0138] The control signal 106 may be evaluated in a cable processing system in which the cable processing device 100 is used, and may influence the further processing of the cable 190. For example, if the control signal 106 indicates that the mark 196 on the sheath 192 has been identified as expected, the next processing step may be initiated. If, on the other hand, the control signal 106 indicates that the mark 196 could not be identified, the cable 190 may be identified as defective, for example, and may be excluded from further processing.

    [0139] FIG. 2 shows a cable processing device 200. The cable processing device 200 builds on the cable processing device 100 described in more detail in FIG. 1 and extends it in that a second receiving device 201-2 is provided. Consequently, the cable processing device 200 has a first receiving device 201-1, a second receiving device 201-2, an image recording device 202, and an evaluation device 204. The evaluation device 204 further comprises a trained algorithm 205. The above explanations regarding the cable processing device 100 are mutatis mutandis also applicable to the cable processing device 200.

    [0140] As with the cable processing device 100 described in more detail with reference to FIG. 1, the first receiving device 201-1 receives a first cable end 291-1 of a cable. The second receiving device 201-2 receives a second cable end 291-2 of the cable. Thus, the cable may be clamped in the form of a loop in a cable clamp of a processing device and the two cable ends 291-1, 291-2 may be simultaneously captured and evaluated in the cable processing device 200.

    [0141] The image recording device 202 captures one or more images 203 of each of the two cable ends 291-1, 291-2 in the cable processing device 200.

    [0142] The evaluation device 204 may process the images 203 sequentially, thus applying the trained algorithm 205 to each of the images separately, and generating a corresponding control signal 206. With respect to the trained algorithm 205, the above explanations regarding the trained algorithm 105 apply. In particular, it may be provided that the evaluation device 204 applies the trained algorithm 205 to each of the directly captured images 203 without applying any further image processing to the images 203.

    [0143] Consequently, the trained algorithm 205 may be configured to detect a respective mark applied to the sheath of one of the cable ends 291-1, 291-2. The respective mark thus represents the feature to be identified by the trained algorithm 205. It is understood that the mark is merely mentioned as an example of the feature to be identified, and that other types of features may also be used. Possible features have been discussed above and are shown in FIGS. 5-7 and 9. Exemplary features for identification by the trained algorithm 205 may comprise, for example, connectors, markings on connectors, in particular codes, locking devices on cable ends or connectors, and in particular the state of such a locking device, as explained in connection with FIGS. 5-7. Other exemplary features may include, for example, contact tongues of connectors and, in particular, the positions of the contact tongues in the connectors, as explained in connection with FIG. 9.

    [0144] For example, the evaluation device 204 may generate a positive control signal 206 when exactly one of the two cable ends 291-1, 291-2 has a mark or a predetermined feature. In other embodiments, the evaluation device 204 may generate a positive control signal 206 when both cable ends 291-1, 291-2 have a marking or predetermined feature.
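
    The two decision rules above can be sketched as follows (a Python sketch; the per-end results are assumed to come from the trained algorithm, and the boolean modeling is merely illustrative):

```python
# Sketch of the two rules of paragraph [0144]: a positive control signal
# when exactly one cable end carries the mark, or when both ends do.

def signal_exactly_one(end_has_mark):
    """Positive only if exactly one cable end shows the predetermined feature."""
    return sum(end_has_mark) == 1

def signal_both(end_has_mark):
    """Positive only if every cable end shows the predetermined feature."""
    return all(end_has_mark)

results = [True, False]  # per-end results for cable ends 291-1, 291-2
print(signal_exactly_one(results))  # True
print(signal_both(results))         # False
```

    Which rule applies depends on the embodiment, i.e., on whether a mark is expected on one or on both cable ends.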

    [0145] FIG. 3 shows a cable processing device 300. The cable processing device 300 is based on the cable processing device 200 and is an extension thereof in that the image recording device includes a first camera 302-1 and a second camera 302-2. Consequently, the cable processing device 300 has a first receiving device 301-1, a second receiving device 301-2, a first camera 302-1, a second camera 302-2, and an evaluation device 304. The evaluation device 304 further comprises a trained algorithm 305. It is understood that the second receiving device 301-2 is only optional, and the cable processing device 300 may also have only one receiving device 301-1. The above discussion regarding the cable processing devices 100 and 200 is mutatis mutandis applicable to the cable processing device 300 as well.

    [0146] The first camera 302-1 captures images 303-1 of the cable ends 391-1, 391-2 from a top view. In contrast, the second camera 302-2 is offset or pivoted about the longitudinal axis of the cable ends 391-1, 391-2 by a predetermined angle relative to the first camera 302-1 and captures images 303-2 of the cable ends 391-1, 391-2 from a perspective view.

    [0147] For example, with two cable ends 391-1, 391-2 to be examined, the cable processing device 300 may capture a total of four images 303-1, 303-2, two per cable end.

    [0148] The evaluation device 304 may process the images 303-1, 303-2 one after the other, thus applying the trained algorithm 305 to each of the images separately, and generating a corresponding control signal 306. Regarding the trained algorithm 305, the above explanations on the trained algorithm 105 and 205 apply. In particular, it may be provided that the evaluation device 304 processes the directly captured images 303-1, 303-2, thus applying the trained algorithm 305 to each of the directly captured images, i.e., to the raw data, without performing any further image processing.

    [0149] For example, the trained algorithm 305 may be configured to recognize a respective mark applied to the sheath of one of the cable ends 391-1, 391-2. The respective mark thus represents the feature to be identified by the trained algorithm 305. It is understood that the mark is merely mentioned as an example of the feature to be identified, and that other types of features may also be used. Possible features have been discussed above and are shown in FIGS. 5-7 and 9. Exemplary features for identification by the trained algorithm 305 may comprise, for example, connectors, markings on connectors, in particular codes, locking devices on cable ends or connectors, and in particular the state of such a locking device, as explained in connection with FIGS. 5-7. Other exemplary features may include, for example, contact tongues of connectors and, in particular, the positions of the contact tongues in the connectors, as explained in connection with FIG. 9.

    [0150] The evaluation device 304 may, for example, generate a positive control signal 306 if exactly one of the two cable ends 391-1, 391-2 has a marking or a predetermined feature. In this regard, in one embodiment, the marking may be formed such that it is only visible in one of the two images 303-1, 303-2 of one of the cable ends 391-1, 391-2 at a time. In such an embodiment, the evaluation device 304 may generate the positive control signal 306, for example, only when the marking has been identified in only one of the images 303-1, 303-2. In other embodiments, the evaluation device 304 may generate a positive control signal 306 if both cable ends 391-1, 391-2 have a mark or a predetermined feature. Again, the marking may be such that it is only visible in one of two images 303-1, 303-2 of a cable end 391-1, 391-2 at a time.

    [0151] FIG. 4 shows a cable end 491 as it may be processed in one of the exemplary cable processing devices 100, 200, 300, 800 described above in FIG. 1, 2, or 3 or described in more detail below in FIG. 8, in a side view (top) and a top view (bottom). The cable end 491 corresponds to the cable end 191 shown in FIG. 1. Consequently, the cable end 491 has a sheath 492 which encloses a braided shield 493 arranged on an insulator 494. The insulator 494 encloses the conductor 495 of the cable.

    [0152] On the sheath 492 of the cable, by way of example only, is a rectangular mark 496 representing the feature to be identified by the trained algorithm. It is understood that other shapes of the mark 496 are also possible.

    [0153] It is further understood that the marking 496 can be used as the feature to be identified on other types of cables, i.e., not only on coaxial cables. For example, such marking 496 may also be used in connection with multi-core data cables.

    [0154] FIG. 5 shows another cable end 591 as can be processed in one of the exemplary cable processing devices 100, 200, 300, 800 described above in FIG. 1, 2, or 3 or in more detail below in FIG. 8.

    [0155] Unlike the cable end 491, the cable end 591 does not have a marking. Instead, in the case of the cable end 591, the feature to be identified is provided by a connector 597. Consequently, in such an embodiment, the trained algorithm is trained to identify whether or not the connector 597 is present at the cable end.

    [0156] The connector 597 is a circular connector, which is why the cable end 591 is not shown in two different views. It is understood that any other type of connector may be provided instead of a circular connector.

    [0157] In another embodiment, a connector may be mounted on the cable end 591. However, in this embodiment, this connector need not be the feature to be identified. Rather, a marking may be provided on the connector that represents the feature to be identified. In such an embodiment, the marking on the connector is recognized by the trained algorithm.

    [0158] FIG. 6 shows yet another cable end 691 as may be processed in one of the cable processing devices 100, 200, 300, 800 described above in FIG. 1, 2, or 3 or described below in FIG. 8 by way of example. A connector 697 is provided at the cable end 691, but this connector is not the feature to be identified. Rather, the connector 697 includes a locking device 698, in this case a secondary locking device. The locking device 698 has a web from which pins 699 extend into the connector 697. These pins 699 positively engage the contacts of the connector 697, thereby securing them from slipping out of the connector 697.

    [0159] In FIG. 6, the locking device 698 is shown in the open state, and the pins 699 are not fully recessed in the connector 697.

    [0160] FIG. 7 shows the cable end 691 in a locked state, in which the pins 699 are fully recessed into the connector 697. The locking device 698 can be locked, for example, by pushing it into the connector 697.

    [0161] In one embodiment, if the feature to be identified is formed by the locking device 698, the trained algorithm may be trained to detect the presence or absence of the locking device 698.

    [0162] In another embodiment, the trained algorithm may be trained to detect the state of the locking device 698, i.e., whether it is open or locked.

    [0163] For better understanding, the reference signs of FIGS. 1 to 7 are also used in the following description of FIG. 8.

    [0164] FIG. 8 shows another embodiment of a cable processing device 800. The cable processing device 800 is based on the cable processing device 300. Consequently, the cable processing device 800 comprises a first receiving device 801-1 for receiving a cable end 891-1, a second receiving device 801-2 for receiving a second cable end 891-2, a first camera 802-1, a second camera 802-2, and an evaluation device 804. The evaluation device 804 further comprises a trained algorithm 805. The second camera 802-2 and the second receiving device 801-2 are merely optional, and the cable processing device 800 may also comprise only one camera 802-1 and/or only one receiving device 801-1. Consequently, the cable processing device 800 may correspond to any of the cable processing devices 100, 200, 300 with respect to the elements disclosed herein. Therefore, the above explanations regarding the cable processing devices 100, 200, 300 apply analogously to the cable processing device 800.

    [0165] The cable processing device 800 further comprises an image processing device 810. The image processing device 810 is arranged in parallel with the evaluation device 804 and receives images 803-1, 803-2 from the first camera 802-1 and, if present, from the second camera 802-2 in parallel with the evaluation device 804.

    [0166] The evaluation device 804 evaluates the images 803-1, 803-2 as already described above in connection with FIGS. 1-3. Consequently, the evaluation device 804 applies the trained algorithm 805 to the images 803-1, 803-2 and generates a control signal 806 based on a result output from the trained algorithm 805. In particular, it is provided that the evaluation device 804 applies the trained algorithm to the directly captured images 803-1, 803-2 or to a partial section of each of the directly captured images 803-1, 803-2. In particular, it may be provided that the evaluation device 804 does not apply any further image processing to the images 803-1, 803-2.

    [0167] The trained algorithm 805 may comprise, for example, a neural network, in particular a convolutional neural network. The trained algorithm 805, in particular the neural network, may be formed and trained to identify a predetermined feature in the images 803-1, 803-2 and output a positive result if the predetermined feature is appropriately identifiable. A mark may be provided on the sheath of at least one of the cable ends 891-1, 891-2, wherein such mark may represent the feature to be identified by a trained algorithm 805. It is understood that such a marking is merely exemplified as a feature to be identified, and that other types of features may also be used. Possible features have been discussed above and are shown in FIGS. 5-7 and 9. Exemplary features for identification by the trained algorithm 805 may comprise, for example, connectors, markings on connectors, in particular codes, locking devices on cable ends or connectors, and in particular the state of such a locking device, as explained in connection with FIGS. 5-7. Other exemplary features may include, for example, contact tongues of connectors and, in particular, the positions of the contact tongues in the connectors, as explained in connection with FIG. 9.

    [0168] The image processing device 810 evaluates the received images 803-1, 803-2 independently of the evaluation device 804 and autonomously and generates an image processing signal 811, in particular exactly one single image processing signal 811, which represents a kind of second control signal. The image processing signal 811 can be evaluated in a corresponding processing system for cables as a second signal in parallel with the control signal 806, which indicates whether the respective cable is free of defects or not. Such a processing system for cables can, for example, only identify a cable as fault-free if both the control signal 806 and the image processing signal 811 identify a fault-free cable.

    [0169] The combination of the two signals, i.e., the control signal 806 and the image processing signal 811, may also be performed in the image processing device 810. For this purpose, the evaluation device 804 may provide the control signal 806 to the image processing device 810 (shown in dashed lines). The image processing device 810 may, for example, perform a logical AND operation on the control signal 806 and the image processing signal 811, and output the AND-operated signal to the cable processing device.
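
    The AND operation described above can be sketched as follows (a Python sketch; the signal values are modeled as booleans purely for illustration):

```python
# Sketch of the logical AND combination of the control signal 806 and the
# image processing signal 811: the cable is reported fault-free only if
# both signals indicate a fault-free cable.

def and_combined(control_signal_ok, image_processing_ok):
    """Combined signal forwarded to the cable processing system."""
    return control_signal_ok and image_processing_ok

# Truth table of the combination
for cs in (True, False):
    for ips in (True, False):
        print(cs, ips, "->", and_combined(cs, ips))
```

    The same combination could equally be evaluated in the downstream processing system instead of in the image processing device 810.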

    [0170] The image processing device 810 may evaluate the images 803-1, 803-2 according to predetermined criteria and perform, for example, pattern recognition, edge detection, or other functions of known image processing systems that, in particular, do not use artificial intelligence for image evaluation. The image processing device 810 may consequently evaluate the received images 803-1, 803-2 for the presence of other features, in particular also multiple features, than the evaluation device 804.

    [0171] It may be provided that the image processing device 810 processes the directly captured images 803-1, 803-2 or partial sections of the directly captured images 803-1, 803-2. For providing the partial sections of the directly captured images 803-1, 803-2, an intermediate program (image processing software, in particular ‘Eye Vision Technology’ (EVT)) can be interposed between the cameras 802-1, 802-2 and the image processing device 810, the sole task of which is to crop the directly captured images 803-1, 803-2 and to forward only a defined section as a partial section, or several defined partial sections in succession, to the image processing device 810. This task of cropping a partial section from the directly captured images 803-1, 803-2 can also be performed within the image processing device 810. However, it is understood that the directly captured images 803-1, 803-2, and not only a partial section, may also be evaluated in the image processing device 810. A single intermediate program may be provided which performs the cropping of the directly captured images 803-1, 803-2 for both the image processing device 810 and the evaluation device 804. If the intermediate program is integrated in the image processing device 810, the latter can provide the respective image crops to the evaluation device 804.

    [0172] In a further embodiment, the trained algorithm 805 in the evaluation device 804 may be trained to replace the image processing device 810 and additionally analyze the features in the images 803-1, 803-2 that the image processing device 810 analyzes as described above. Such a trained algorithm 805 outputs a positive control signal 806 only if all features in the images 803-1, 803-2 have been positively checked.

    [0173] FIG. 9 shows still another cable end 991 as may be processed in one of the exemplary cable processing devices 100, 200, 300, 800 described above with reference to FIG. 1, 2, 3 or 8.

    [0174] The cable end 991 is shown in a frontal view along the longitudinal axis of the cable end 991 and has a connector 997 in which an insulator 994 with four openings 979-1, 979-2, 979-3, 979-4 is arranged. Such a connector 997 may be, for example, an HSD (“High Speed Data”) connector. In a properly assembled connector 997, two contact tongues 980-1, 980-2, 980-3, 980-4, 980-5, 980-6, 980-7 are arranged in each of the four openings 979-1, 979-2, 979-3, 979-4. It is understood that the number of two contact tongues 980-1, 980-2, 980-3, 980-4, 980-5, 980-6, 980-7 per opening is selected merely by way of example and that more than two contact tongues 980-1, 980-2, 980-3, 980-4, 980-5, 980-6, 980-7 may be present in any of the four openings 979-1, 979-2, 979-3, 979-4. It is also understood that fewer or more than four openings 979-1, 979-2, 979-3, 979-4 may be provided.

    [0175] For testing the cable end 991, the trained algorithm may be trained to detect whether a corresponding number of contact tongues 980-1, 980-2, 980-3, 980-4, 980-5, 980-6, 980-7, e.g., two here, are present in each of the four openings 979-1, 979-2, 979-3, 979-4.

    [0176] The trained algorithm may be trained to recognize, in each case for the image of a single one of the four openings 979-1, 979-2, 979-3, 979-4, i.e., a corresponding section of an overall image of the cable end 991, whether the corresponding number of contact tongues 980-1, 980-2, 980-3, 980-4, 980-5, 980-6, 980-7 is present in the respective opening 979-1, 979-2, 979-3, 979-4. The corresponding sections of the overall image can be generated, for example, by an intermediate program as described earlier.

    [0177] The results of the trained algorithm for all four openings 979-1, 979-2, 979-3, 979-4 can then be combined into an overall result and output as a control signal. All partial results can be linked by means of a logical AND operation. A positive overall result is therefore only output if all partial results are positive.
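The AND-linking of the partial results can be sketched as follows. This is a minimal illustration under the assumption that a stub function stands in for the trained algorithm applied to one partial section; the function and data names are hypothetical.

```python
# Sketch of combining per-opening results into an overall result via a
# logical AND, as described above. `classify_opening` is a hypothetical
# stand-in for the trained algorithm applied to one image section.

def classify_opening(section, expected_tongues=2):
    """Stub: True if the expected number of contact tongues is detected."""
    return section["tongues_detected"] == expected_tongues

def evaluate_cable_end(sections):
    """AND-link all partial results; positive only if every opening passes."""
    partial_results = [classify_opening(s) for s in sections]
    return all(partial_results)

# Example mirroring FIG. 9: three openings with two tongues each, and a
# fourth opening in which only one tongue is present.
sections = [{"tongues_detected": 2}] * 3 + [{"tongues_detected": 1}]
print(evaluate_cable_end(sections))  # False -> cable end evaluated as defective
```

With all four openings correctly populated, the same function would return a positive overall result.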

    [0178] As can be seen in FIG. 9, only one contact tongue 980-7 is arranged in the right opening 979-4 of the four openings 979-1, 979-2, 979-3, 979-4. Consequently, the trained algorithm would detect two contact tongues 980-1, 980-2, 980-3, 980-4, 980-5, 980-6 for each of the three openings 979-1, 979-2, 979-3 and identify them as being free of defects. However, the trained algorithm would only detect one contact tongue 980-7 in the fourth opening 979-4 and thus identify it as faulty. The overall result output would thus be negative and the cable end 991 would be evaluated as defective.

    [0179] FIG. 10 shows a flow chart of a method for automatic processing of cables 190.

    [0180] In a first step S1, at least one cable end 191, 291-1, 291-2, 391-1, 391-2, 491, 591, 691, 891-1, 891-2, 991 of a cable 190 is fixed in a predetermined position. In step S2, at least one image 103, 203, 303-1, 303-2, 803-1, 803-2 of the at least one cable end 191, 291-1, 291-2, 391-1, 391-2, 491, 591, 691, 891-1, 891-2, 991 is captured. In step S3, a trained algorithm 105, 205, 305, 805 is applied to the at least one image 103, 203, 303-1, 303-2, 803-1, 803-2. In particular, it may be provided that the trained algorithm 105, 205, 305, 805 in step S3 is applied directly, without any further intermediate steps, to the image 103, 203, 303-1, 303-2 of the at least one cable end previously captured in step S2. Finally, in step S4, a control signal 106, 206, 306, 806 is generated and output based on at least one result output by the trained algorithm 105, 205, 305, 805.
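The sequence of method steps S1 to S4 can be sketched as a simple pipeline. This is a schematic illustration only; all function bodies are hypothetical stubs standing in for the physical fixing, image capture, and trained algorithm described above.

```python
# Schematic sketch of method steps S1-S4. Each function is a hypothetical
# stub; in the device, S1 and S2 are physical operations and S3 is the
# trained algorithm applied directly to the captured image.

def fix_cable_end(cable):
    """S1: fix the cable end in a predetermined position (stub)."""
    return {"cable": cable, "position": "predetermined"}

def capture_image(cable_end):
    """S2: capture at least one image of the fixed cable end (stub)."""
    return {"of": cable_end}

def trained_algorithm(image):
    """S3: apply the trained algorithm to the image (stub: always positive)."""
    return True

def process_cable(cable):
    cable_end = fix_cable_end(cable)        # S1
    image = capture_image(cable_end)        # S2: image of the fixed end
    result = trained_algorithm(image)       # S3: applied directly, no intermediate steps
    return "positive" if result else "error"  # S4: generate and output control signal

print(process_cable("cable-190"))  # positive
```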

    [0181] The trained algorithm 105, 205, 305, 805 mentioned above with reference to the embodiments of FIGS. 1, 2 and 3 may comprise, for example, a neural network, in particular a convolutional neural network. Such a trained algorithm 105, 205, 305, 805 may, for example, identify a predetermined feature 196, 496 in each of the at least one image 103, 203, 303-1, 303-2, 803-1, 803-2 and output a positive result if the predetermined feature 196, 496 is identifiable in the image 103, 203, 303-1, 303-2, 803-1, 803-2.

    [0182] In one embodiment, images 103, 203, 303-1, 303-2, 803-1, 803-2 may be captured either in a top view of the respective cable end 191, 291-1, 291-2, 391-1, 391-2, 491, 591, 691, 891-1, 891-2, 991 or in a perspective view of the respective cable end 191, 291-1, 291-2, 391-1, 391-2, 491, 591, 691, 891-1, 891-2, 991. In another embodiment, images 103, 203, 303-1, 303-2, 803-1, 803-2 may be captured both in a top view of the respective cable end 191, 291-1, 291-2, 391-1, 391-2, 491, 591, 691, 891-1, 891-2, 991 and in a perspective view of the respective cable end 191, 291-1, 291-2, 391-1, 391-2, 491, 591, 691, 891-1, 891-2, 991.

    [0183] In one embodiment, the predetermined feature 196, 496 may include a marker at the cable end 191, 291-1, 291-2, 391-1, 391-2, 491, 591, 691, 891-1, 891-2, 991. In such an embodiment, the trained algorithm 105, 205, 305, 805 may be trained to identify the marker at the cable end 191, 291-1, 291-2, 391-1, 391-2, 491, 591, 691, 891-1, 891-2, 991. A positive control signal 106, 206, 306, 806 may be output, in particular, when the marker is identified at exactly one cable end 191, 291-1, 291-2, 391-1, 391-2, 491, 591, 691, 891-1, 891-2, 991 in at least one of the images 103, 203, 303-1, 303-2, 803-1, 803-2. An error signal may be output if the marker is not identified at exactly one cable end 191, 291-1, 291-2, 391-1, 391-2, 491, 591, 691, 891-1, 891-2, 991 in at least one of the images 103, 203, 303-1, 303-2, 803-1, 803-2.

    [0184] In another embodiment, the predetermined feature 196, 496 may include a locking device 698, in particular a contact locking device 698. The trained algorithm 105, 205, 305, 805 may be trained to identify the presence of the locking device 698 at the cable end 191, 291-1, 291-2, 391-1, 391-2, 491, 591, 691, 891-1, 891-2, 991. In particular, a positive control signal 106, 206, 306, 806 may be output when the locking device 698 is identified in at least one of the images 103, 203, 303-1, 303-2, 803-1, 803-2. An error signal may be output if the locking device 698 is not identified in at least one of the images 103, 203, 303-1, 303-2, 803-1, 803-2.

    [0185] In yet another embodiment, the predetermined feature 196, 496 may include the state of a locking device 698, in particular of a contact locking device 698. The trained algorithm 105, 205, 305, 805 may be trained to identify the state of the locking device 698 at the cable end 191, 291-1, 291-2, 391-1, 391-2, 491, 591, 691, 891-1, 891-2, 991. A positive control signal 106, 206, 306, 806 may be output when the state of the locking device 698 is identified as a locked state in at least one of the images 103, 203, 303-1, 303-2, 803-1, 803-2. An error signal may be output if the state is not identified as a locked state. Alternatively, an error signal may be output if the state of the locking device 698 is identified as a locked state in at least one of the images 103, 203, 303-1, 303-2, 803-1, 803-2, and a positive control signal 106, 206, 306, 806 may be output when the state is not identified as a locked state.
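The two alternative signal polarities for the locking-device state check can be sketched as follows. This is an illustrative sketch only; the function and parameter names are hypothetical.

```python
# Sketch of the two alternative polarities described above: either a
# locked state yields the positive control signal, or, inverted, an
# unlocked state does. Names are illustrative placeholders.

def control_signal(identified_state, locked_is_positive=True):
    """Map the identified locking-device state to a control signal."""
    locked = (identified_state == "locked")
    positive = locked if locked_is_positive else not locked
    return "positive" if positive else "error"

print(control_signal("locked"))                            # positive
print(control_signal("unlocked"))                          # error
print(control_signal("locked", locked_is_positive=False))  # error (inverted polarity)
```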

    [0186] Since the devices and methods described in detail above are examples of embodiments, they may be modified in a customary manner by those skilled in the art to a wide extent without departing from the scope of the invention. In particular, the mechanical arrangements and the proportions of the individual elements with respect to each other are merely exemplary.

    TABLE-US-00001 LIST OF REFERENCES
    100, 200, 300, 800 cable processing device
    101, 201-1, 201-2, 301-1, 301-2, 801-1, 801-2 receiving device
    102, 202, 302-1, 302-2, 802-1, 802-2 image recording device
    103, 203, 303-1, 303-2, 803-1, 803-2 image
    104, 204, 304, 804 evaluation device
    105, 205, 305, 805 trained algorithm
    106, 206, 306, 806 control signal
    810 image processing device
    811 image processing signal
    190 cable
    191, 291-1, 291-2, 391-1, 391-2, 491, 591, 691, 891-1, 891-2, 991 cable end
    192, 492 sheath
    193, 493 shielding
    194, 494, 994 insulator
    195, 495 conductor
    196, 496 characteristic
    597, 697, 997 connector
    698 locking device
    699 pin
    979-1, 979-2, 979-3, 979-4 opening
    980-1, 980-2, 980-3, 980-4, 980-5, 980-6, 980-7 contact tongue
    S1, S2, S3, S4 method steps