Method and device for detecting anomalies in technical systems

11686651 · 2023-06-27

Abstract

A computer-implemented method for detecting an anomaly in a technical system. The method includes detecting an environment state vector and a system state vector, the environment state vector including at least one first value which characterizes a physical environment condition or a physical operating condition of the technical system, and the system state vector including at least one second value which characterizes a physical condition of the technical system; ascertaining, using an environment anomaly model, an environment value which characterizes a probability or a probability density value with which the environment state vector occurs; ascertaining, using a system anomaly model, a system value which characterizes a conditional probability or a conditional probability density value with which the system state vector occurs if the environment state vector occurs; signaling the presence of an anomaly or signaling the absence of an anomaly based on the environment value and/or the system value.

Claims

1. A computer-implemented method for a technical system, comprising the following steps: obtaining sensor output; generating from the sensor output: an environment state vector that includes at least one first value characterizing a physical environment condition of an environment in which the technical system operates; and a system state vector that includes at least one second value characterizing a physical condition of the technical system; using an environment anomaly model to ascertain as output an environment value based on the environment state vector, which is applied as input to the environment anomaly model, the environment value characterizing a probability or a probability density value of an occurrence of the input environment state vector; using a system anomaly model to ascertain as output a system value based on a combination of the environment state vector and the system state vector, which are both applied as input to the system anomaly model, the system value characterizing a conditional probability or a conditional probability density value of an occurrence of the input system state vector when the input environment state vector occurs; determining whether an anomaly is present based on the environment value and the system value; and controlling the technical system based on a result of the determination.

2. The method as recited in claim 1, wherein the determining of whether the anomaly is present includes comparing the environment value to a predefined first threshold value and comparing the system value to a predefined second threshold value.

3. The method as recited in claim 1, wherein the environment anomaly model is a normalizing flow model and/or the system anomaly model is a conditional normalizing flow model.

4. The method as recited in claim 2, further comprising retraining the environment anomaly model in response to results of the determining of whether the anomaly is present being a predefined inconsistency between results of the comparison to the predefined first threshold value and the comparison to the predefined second threshold value, the predefined inconsistency being that: (a) a result of the comparison to the predefined first threshold value is that the environment state vector is anomalous; and (b) a result of the comparison to the predefined second threshold value is that the system state vector is not anomalous.

5. The method as recited in claim 1, wherein the controlling includes, when the anomaly is determined to be present, at least temporarily stopping the technical system.

6. A device comprising: a processor; a memory storing an environment anomaly model and a system anomaly model; wherein the processor is configured to: obtain sensor output; generate from the sensor output: an environment state vector that includes at least one first value characterizing a physical environment condition of an environment in which the technical system operates; and a system state vector that includes at least one second value characterizing a physical condition of the technical system; use the environment anomaly model to ascertain as output an environment value based on the environment state vector, which is applied as input to the environment anomaly model, the environment value characterizing a probability or a probability density value of an occurrence of the input environment state vector; use the system anomaly model to ascertain as output a system value based on a combination of the environment state vector and the system state vector, which are both applied as input to the system anomaly model, the system value characterizing a conditional probability or a conditional probability density value of an occurrence of the input system state vector when the input environment state vector occurs; determine whether an anomaly is present based on the environment value and the system value; and control the technical system based on a result of the determination.

7. A method for training an anomaly detection device, the method comprising the following steps: ascertaining a plurality of environment state vectors and a plurality of respective corresponding system state vectors, each of the environment state vectors including at least one first value that characterizes a physical environment condition of an environment in which the technical system operates, and each of the respective corresponding system state vectors including at least one second value that characterizes a physical condition of the technical system; training an environment anomaly model of the anomaly detection device and a system anomaly model of the anomaly detection device based on the ascertained plurality of environment state vectors and the respective corresponding system state vectors; wherein the method includes at least one of the following two features (a) and (b): (a) the training includes iteratively modifying parameters of at least one of the environment anomaly model and the system anomaly model, each of one or more of the iterations including: obtaining a respective value pair formed of a respective environment value characterizing a probability or a probability density value of an occurrence of one of the environment state vectors and a respective system value characterizing a conditional probability or a conditional probability density value of an occurrence of one of the system state vectors when the one of the environment state vectors occurs; comparing the obtained respective value pair to an expected value pair; and modifying the parameters based on a result of the comparison; and (b) the training is performed in response to the anomaly detection device generating a predefined inconsistency, the predefined inconsistency being that one of the environment state vectors is determined by the anomaly detection device to be anomalous and that a corresponding one of the system state vectors is not anomalous.

8. A training device configured to train an anomaly detection device, the training device comprising: a processor; and a memory storing an environment anomaly model and a system anomaly model; wherein: the processor is configured to: ascertain a plurality of environment state vectors and a plurality of respective corresponding system state vectors, each of the environment state vectors including at least one first value that characterizes a physical environment condition of an environment in which the technical system operates, and each of the respective corresponding system state vectors including at least one second value that characterizes a physical condition of the technical system; train the environment anomaly model of the anomaly detection device and the system anomaly model of the anomaly detection device based on the ascertained plurality of environment state vectors and the respective corresponding system state vectors; and the training device includes at least one of the following two features (a) and (b): (a) the training includes iteratively modifying parameters of at least one of the environment anomaly model and the system anomaly model, each of one or more of the iterations including: obtaining a respective value pair formed of a respective environment value characterizing a probability or a probability density value of an occurrence of one of the environment state vectors and a respective system value characterizing a conditional probability or a conditional probability density value of an occurrence of one of the system state vectors when the one of the environment state vectors occurs; comparing the obtained respective value pair to an expected value pair; and modifying the parameters based on a result of the comparison; and (b) the training is performed in response to the anomaly detection device generating a predefined inconsistency, the predefined inconsistency being that one of the environment state vectors is determined by the anomaly detection device to be anomalous and that a corresponding one of the system state vectors is not anomalous.

9. A non-transitory machine-readable memory medium on which is stored a computer program for a technical system, the computer program, when executed by a computer, causing the computer to perform the following steps: obtaining sensor output; generating from the sensor output: an environment state vector that includes at least one first value characterizing a physical environment condition of an environment in which the technical system operates; and a system state vector that includes at least one second value characterizing a physical condition of the technical system; using an environment anomaly model to ascertain as output an environment value based on the environment state vector, which is applied as input to the environment anomaly model, the environment value characterizing a probability or a probability density value of an occurrence of the input environment state vector; using a system anomaly model to ascertain as output a system value based on a combination of the environment state vector and the system state vector, which are both applied as input to the system anomaly model, the system value characterizing a conditional probability or a conditional probability density value of an occurrence of the input system state vector when the input environment state vector occurs; determining whether an anomaly is present based on the environment value and the system value; and controlling the technical system based on a result of the determination.

10. A non-transitory machine-readable memory medium on which is stored a computer program for training an anomaly detection device, the computer program, when executed by a computer, causing the computer to perform a method, the method including the following steps: ascertaining a plurality of environment state vectors and a plurality of respective corresponding system state vectors, each of the environment state vectors including at least one first value that characterizes a physical environment condition of an environment in which the technical system operates, and each of the respective corresponding system state vectors including at least one second value that characterizes a physical condition of the technical system; training an environment anomaly model of the anomaly detection device and a system anomaly model of the anomaly detection device based on the ascertained plurality of environment state vectors and the respective corresponding system state vectors; wherein the method includes at least one of the following two features (a) and (b): (a) the training includes iteratively modifying parameters of at least one of the environment anomaly model and the system anomaly model, each of one or more of the iterations including: obtaining a respective value pair formed of a respective environment value characterizing a probability or a probability density value of an occurrence of one of the environment state vectors and a respective system value characterizing a conditional probability or a conditional probability density value of an occurrence of one of the system state vectors when the one of the environment state vectors occurs; comparing the obtained respective value pair to an expected value pair; and modifying the parameters based on a result of the comparison; and (b) the training is performed in response to the anomaly detection device generating a predefined inconsistency, the predefined inconsistency being that one of the environment state vectors is determined by the anomaly detection device to be anomalous and that a corresponding one of the system state vectors is not anomalous.

11. The training device as recited in claim 8, wherein the training includes the iteratively modifying the parameters of the at least one of the environment anomaly model and the system anomaly model, the each of one or more of the iterations including: the obtaining of the respective value pair formed of a respective environment value characterizing a probability or a probability density value of an occurrence of one of the environment state vectors and a respective system value characterizing a conditional probability or a conditional probability density value of an occurrence of one of the system state vectors when the one of the environment state vectors occurs; the comparing of the obtained respective value pair to the expected value pair; and the modifying of the parameters based on the result of the comparison.

12. The training device as recited in claim 8, wherein the training is performed in response to the anomaly detection device generating the predefined inconsistency, the predefined inconsistency being that the one of the environment state vectors is determined by the anomaly detection device to be anomalous and that the corresponding one of the system state vectors is not anomalous.

Description

BRIEF DESCRIPTION OF THE DRAWINGS

(1) FIG. 1 schematically shows the structure of an anomaly detector in accordance with an example embodiment of the present invention.

(2) FIG. 2 schematically shows a structure of a control system for activating an actuator, which uses the anomaly detector.

(3) FIG. 3 schematically shows an exemplary embodiment for controlling an at least semi-autonomous robot with the aid of the control system.

(4) FIG. 4 schematically shows an exemplary embodiment for controlling a manufacturing system with the aid of the control system.

(5) FIG. 5 schematically shows an exemplary embodiment for controlling an access system with the aid of the control system.

(6) FIG. 6 schematically shows an exemplary embodiment for controlling a monitoring system with the aid of the control system.

(7) FIG. 7 schematically shows an exemplary embodiment for controlling a personal assistant with the aid of the control system.

(8) FIG. 8 schematically shows an exemplary embodiment for controlling a medical imaging system with the aid of the control system.

(9) FIG. 9 schematically shows an exemplary embodiment for training an anomaly detector.

DETAILED DESCRIPTION OF EXAMPLE EMBODIMENTS

(10) FIG. 1 shows an anomaly detector 70. Anomaly detector 70 receives a plurality of first and second values x.sub.70, which are divided in a dividing unit 71 into an environment state vector x.sub.U of first values and a system state vector x.sub.S of second values. Environment state vector x.sub.U is fed to an environment anomaly model 72, which is designed to ascertain an environment value v.sub.U that characterizes a probability or a probability density value with which environment state vector x.sub.U occurs. Environment anomaly model 72 is preferably a normalizing flow model.

(11) Furthermore, environment state vector x.sub.U and system state vector x.sub.S are fed to a system anomaly model 73, which is designed to ascertain, for system state vector x.sub.S and as a function of environment state vector x.sub.U, a system value v.sub.S that characterizes a conditional probability or a conditional probability density value with which system state vector x.sub.S occurs. System anomaly model 73 is preferably a conditional normalizing flow model.
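The two density models of paragraphs (10) and (11) can be sketched as follows. This is a minimal Python illustration only: one-dimensional Gaussian densities stand in for the normalizing flow and conditional normalizing flow models, and all function names, the linear coupling between x.sub.U and x.sub.S, and the parameter values are hypothetical.

```python
import math

# Illustrative stand-ins for environment anomaly model 72 and system
# anomaly model 73 of FIG. 1; a real implementation would use a
# normalizing flow and a conditional normalizing flow, respectively.

def environment_value(x_u, mean=0.0, std=1.0):
    """v_U: probability density p(x_U) under a Gaussian stand-in."""
    z = (x_u - mean) / std
    return math.exp(-0.5 * z * z) / (std * math.sqrt(2.0 * math.pi))

def system_value(x_s, x_u, coupling=0.5, std=1.0):
    """v_S: conditional density p(x_S | x_U); the mean of x_S is
    assumed (hypothetically) to depend linearly on x_U."""
    mean = coupling * x_u
    z = (x_s - mean) / std
    return math.exp(-0.5 * z * z) / (std * math.sqrt(2.0 * math.pi))

# A typical reading: x_U near its usual range, x_S consistent with x_U.
v_u = environment_value(0.1)
v_s = system_value(0.04, 0.1)
```

An implausible environment state (far from the training distribution) yields a small v.sub.U, and a system state inconsistent with the given environment state yields a small v.sub.S, which is what the comparison units downstream act on.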

(12) Environment value v.sub.U is compared in a first comparison unit 74 with an environment threshold value T.sub.U. If environment value v.sub.U falls below environment threshold value T.sub.U, the presence of an environment anomaly is reported to an evaluation unit 76. In further specific embodiments, it is possible that the presence of an environment anomaly is reported to evaluation unit 76 even if environment threshold value T.sub.U and environment value v.sub.U match.

(13) System value v.sub.S is compared in a second comparison unit 75 with a system threshold value T.sub.S. If system value v.sub.S falls below system threshold value T.sub.S, the presence of a system anomaly is reported to evaluation unit 76. In further specific embodiments, it is possible that the presence of a system anomaly is reported to evaluation unit 76 even if system threshold value T.sub.S and system value v.sub.S match.

(14) Evaluation unit 76 then ascertains, based on the presence or absence of an environment anomaly and/or a system anomaly, an anomaly detection output y.sub.70 which characterizes the presence or the absence.
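The comparison units 74 and 75 and evaluation unit 76 of paragraphs (12) to (14) amount to a thresholding rule, sketched below in Python. The threshold values and the option of treating an exact match as anomalous are hypothetical parameters of this illustration.

```python
# Sketch of comparison units 74/75 and evaluation unit 76 of FIG. 1.

def detect_anomaly(v_u, v_s, t_u, t_s, anomalous_on_match=False):
    """Return (environment_anomaly, system_anomaly, anomaly_present).

    v_u, v_s: environment value and system value from the two models.
    t_u, t_s: environment threshold T_U and system threshold T_S.
    anomalous_on_match: if True, a value equal to its threshold also
    counts as anomalous (the further embodiments described above).
    """
    if anomalous_on_match:
        env_anomaly = v_u <= t_u
        sys_anomaly = v_s <= t_s
    else:
        env_anomaly = v_u < t_u
        sys_anomaly = v_s < t_s
    # Evaluation unit 76: signal an anomaly if either value is too low.
    return env_anomaly, sys_anomaly, env_anomaly or sys_anomaly
```

The returned triple corresponds to anomaly detection output y.sub.70, which characterizes the presence or absence of an anomaly.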

(15) FIG. 2 shows an actuator 10 in its environment 20 in interaction with a control system 40. At preferably regular time intervals, environment 20 is detected by multiple sensors 30. Sensors 30 include sensors by which environment states and system states of control system 40 may be measured. In addition, sensors 30 also include imaging sensors, such as cameras, for example. The signals S of sensors 30 are transmitted to control system 40. Control system 40 thus receives a sequence of sensor signals S. From these, control system 40 ascertains activation signals A, which are transmitted to actuator 10.

(16) Control system 40 receives the sequence of sensor signals S from sensors 30 in a receiving unit 50, which converts the sequence of sensor signals S into a sequence of input images x.sub.60 and a sequence of environment states and system states x.sub.70. Input image x.sub.60 may be, for example, a detail or a further processing of a sensor signal from a camera, which is contained in sensor signal S. Input image x.sub.60 includes individual frames of a video recording. In other words, input image x.sub.60 and operating states x.sub.70 are ascertained as a function of sensor signal S. The sequence of input images x.sub.60 is fed to an image classifier 60. Control system 40 also includes anomaly detector 70, to which the sequence of environment states and system states x.sub.70 is fed.

(17) Image classifier 60 is preferably parameterized by first parameters Φ.sub.1, which are stored in a parameter memory P and are made available by the latter. Anomaly detector 70 is preferably parameterized by second parameters Φ.sub.2, which are likewise stored in the parameter memory and are made available by the latter.

(18) From input images x.sub.60, image classifier 60 ascertains output variables y.sub.60 which characterize a classification of input images x.sub.60. From operating states x.sub.70, anomaly detector 70 ascertains an anomaly detection output y.sub.70 which characterizes whether or not an anomaly is present.

(19) Output variables y.sub.60 and anomaly detection output y.sub.70 are fed to a forming unit 80, which from these ascertains activation signals A which are fed to actuator 10 in order to activate actuator 10 accordingly. An output variable y.sub.60 includes information about objects that may be seen on a corresponding input image x.sub.60.

(20) Actuator 10 receives activation signals A, is activated accordingly, and carries out a corresponding action. Actuator 10 may in this case include activation logic (not necessarily structurally integrated), which from activation signal A ascertains a second activation signal, with which actuator 10 is then activated.

(21) If an anomaly is present, activation signal A may be selected in such a way that the possible actions of actuator 10 are restricted. If no anomaly is present, it is possible that the possible actions are restricted not on the basis of an anomaly, but rather on the basis of environment 20 of control system 40 that has been ascertained by image classifier 60. It is also possible that, if an anomaly is present, at least some of sensor signals S are transmitted to a manufacturer or operator of control system 40.

(22) In further specific embodiments of the present invention, control system 40 includes sensor 30. In yet further specific embodiments, control system 40 alternatively or additionally also includes actuator 10.

(23) In further preferred specific embodiments, control system 40 includes one or multiple processors 45 and at least one machine-readable memory medium 46, on which instructions are stored which, when executed on processors 45, prompt control system 40 to carry out a method according to the present invention.

(24) In alternative specific embodiments of the present invention, a display unit 10a is provided as an alternative or in addition to actuator 10.

(25) FIG. 3 shows how control system 40 may be used to control an at least semi-autonomous robot, here an at least semi-autonomous motor vehicle 100.

(26) Sensors 30 may in this case include, for example, a video sensor, which is preferably situated in motor vehicle 100, as well as sensors for measuring the ambient temperature, sensors for measuring the light intensity, GPS sensors, sensors for measuring the fuel consumption, and/or sensors for measuring the engine speed.

(27) Image classifier 60 is designed to identify objects from input images x.sub.60.

(28) Actuator 10, which is preferably situated in motor vehicle 100, may be, for example, a brake, a drive or a steering system of motor vehicle 100. Activation signal A may then be ascertained in such a way that the one or multiple actuators 10 is/are activated in such a way that motor vehicle 100 prevents for example a collision with objects identified by image classifier 60, particularly if these are objects of particular classes, for example pedestrians.

(29) In the case that anomaly detector 70 detects an environment anomaly or a system anomaly, activation signal A may be selected in such a way that the vehicle may no longer change lanes and the speed is restricted to a predefined value. Alternatively, it is likewise possible that the control of the vehicle is handed over to a driver or operator (not necessarily located in the vehicle). It is also possible that the anomaly and sensor signals S that have resulted in the anomaly are stored in a fault memory of the control system, and/or a warning message is output on a display device 10a.

(30) Alternatively, the at least semi-autonomous robot may also be a different mobile robot (not shown), for example a robot that moves by flying, swimming, diving or walking. The mobile robot may also be, for example, an at least semi-autonomous lawnmower or an at least semi-autonomous cleaning robot. In these cases, too, activation signal A may be ascertained in such a way that the drive and/or steering of the mobile robot are activated in such a way that the at least semi-autonomous robot prevents for example a collision with objects identified by image classifier 60.

(31) FIG. 4 shows an exemplary embodiment in which control system 40 is used to activate a manufacturing machine 11 of a manufacturing system 200 by activating an actuator 10 which controls this manufacturing machine 11. Manufacturing machine 11 may be, for example, a machine for punching, sawing, drilling and/or cutting.

(32) Sensors 30 may include, for example, an optical sensor which detects for example properties of manufactured products 12a, 12b. The sensors may also include such sensors that may measure the ambient temperature, the air pressure, the light intensity, the radiation, the speed of travel of a conveyor belt and/or the power consumption of manufacturing machine 11.

(33) It is possible that manufactured products 12a, 12b are movable. It is possible that actuator 10 which controls manufacturing machine 11 is activated as a function of an assignment of detected manufactured products 12a, 12b, so that manufacturing machine 11 accordingly executes a subsequent processing step of the correct manufactured product 12a, 12b. It is also possible that, by identifying the correct properties of manufactured products 12a, 12b (i.e., without misclassification), manufacturing machine 11 accordingly adapts the same manufacturing step to process a subsequent manufactured product.

(34) If anomaly detector 70 detects an anomaly, manufacturing machine 11 may be stopped for example, and maintenance may be automatically requested. Alternatively, it is also possible that the presence of an anomaly is indicated to an appropriate technician for closer observation, but the operation of manufacturing machine 11 is maintained.

(35) FIG. 5 shows an exemplary embodiment in which control system 40 is used to control an access system 300. Access system 300 may include a physical access control, for example a door 401. Sensors 30 may include a video sensor, which is designed to detect a person. Sensors 30 may also include such sensors that may measure an ambient temperature, the light intensity, the time of day, the air pressure and/or the power consumption of actuator 10.

(36) The detected image may be interpreted with the aid of image classifier 60. If multiple persons are detected at the same time, the identity of the persons may be ascertained in a particularly reliable manner by associating the persons (i.e., the objects) with one another, for example by analyzing their movements. Actuator 10 may be a lock which releases the access control, or not, as a function of activation signal A, for example opens door 401 or not. For this purpose, activation signal A may be selected as a function of the interpretation by object identification system (image classifier) 60, for example as a function of the ascertained identity of the person. Instead of the physical access control, a logical access control may also be provided.

(37) For example, if an anomaly is detected, a technician may be contacted automatically in order to check the correct functioning of access system 300.

(38) FIG. 6 shows an exemplary embodiment in which control system 40 is used to control a monitoring system 400. This exemplary embodiment differs from the exemplary embodiment shown in FIG. 5 in that, instead of actuator 10, display unit 10a is provided, which is activated by control system 40. For example, an identity of the objects recorded by the video sensor may be reliably ascertained by image classifier 60 in order for example, as a function thereof, to deduce those that are suspicious, and activation signal A may then be selected in such a way that this object is displayed by display unit 10a in a manner highlighted in color.

(39) FIG. 7 shows an exemplary embodiment in which control system 40 is used to control a personal assistant 250. Sensors 30 preferably include an optical sensor which receives images of a gesture of a user 249.

(40) As a function of the signals of sensor 30, control system 40 ascertains an activation signal A for personal assistant 250, for example as a result of image classifier 60 carrying out gesture recognition. This ascertained activation signal A is then transmitted to personal assistant 250 and thus activates the latter accordingly. This ascertained activation signal A may in particular be selected in such a way that it corresponds to a presumed desired activation by user 249. This presumed desired activation may be ascertained as a function of the gesture recognized by image classifier 60. Control system 40 may then select activation signal A for transmission to personal assistant 250 as a function of the presumed desired activation.

(41) This corresponding activation may include, for example, that personal assistant 250 retrieves information from a database and reproduces it for user 249 in a receivable manner.

(42) If an anomaly is detected, personal assistant 250 may communicate this to user 249 or may automatically inform a technician.

(43) Instead of personal assistant 250, a household appliance (not shown), in particular a washing machine, a stove, an oven, a microwave or a dishwasher, may also be provided so as to be activated accordingly.

(44) FIG. 8 shows an exemplary embodiment in which control system 40 is used to control a medical imaging system 500, for example an MRI, X-ray or ultrasound machine. Sensors 30 may include, for example, an imaging sensor. Sensors 30 may also include such sensors that may measure the ambient temperature, the humidity, the radiation within imaging system 500, the operating temperature of the imaging system and/or the power consumption of the imaging system.

(45) Display unit 10a is activated by control system 40. For example, image classifier 60 may ascertain whether an area recorded by the imaging sensor is suspicious, and activation signal A may then be selected in such a way that this area is highlighted in color by display unit 10a.

(46) In further exemplary embodiments (not shown), anomaly detector 70 may also monitor a control system which does not use an image classifier 60. Particularly, if the behavior of the control system is determined by a multitude of executable rules, the anomaly detector may monitor the parameters of the environment and of the system itself, as in the exemplary embodiments above.

(47) In the exemplary embodiments shown above, it is also possible that environment anomaly model 72 is retrained with at least environment state vector x.sub.U if evaluation unit 76 is notified of an anomaly concerning environment value v.sub.U but is not notified of an anomaly concerning system value v.sub.S. This scenario may be regarded as a situation for which anomaly detector 70 has not been tested, but nevertheless the system functions correctly.
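The retraining trigger described in paragraph (47) — environment value anomalous while the system value is not — can be sketched as a simple filter over detector outputs that collects the environment state vectors x.sub.U with which environment anomaly model 72 would be retrained. All names and numeric values below are illustrative.

```python
# Sketch of the retraining trigger: if the environment value signals an
# anomaly (below T_U) but the system value does not (at or above T_S),
# the situation is treated as untested-but-valid and x_U is queued for
# retraining the environment anomaly model.

def retraining_candidates(readings, t_u, t_s):
    """Collect environment state vectors x_U whose environment value
    falls below T_U while the system value stays at or above T_S."""
    queue = []
    for x_u, v_u, v_s in readings:
        if v_u < t_u and v_s >= t_s:
            queue.append(x_u)
    return queue

readings = [
    # (x_U, v_U, v_S) -- hypothetical values
    ([0.1], 0.80, 0.90),  # normal on both counts
    ([5.0], 0.01, 0.70),  # environment anomalous, system consistent
    ([0.2], 0.02, 0.01),  # anomalous on both counts: no retraining
]
```

Only the second reading qualifies: the environment looks novel, yet the system behaves consistently with it, so the environment model's coverage, not the system, is at fault.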

(48) FIG. 9 shows an exemplary embodiment of a training system 140 which is designed to train an anomaly detector 70. For the training, a training data unit 150 accesses a computer-implemented database St.sub.2, database St.sub.2 including at least one training data set T, training data set T including pairs x.sub.i of environment state vectors and system state vectors.

(49) Training data unit 150 ascertains at least one pair x.sub.i of environment state vector and system state vector of training data set T and transmits pair x.sub.i to anomaly detector 70 to be trained. For the environment state vector and the system state vector, anomaly detector 70 determines an environment value with the aid of the environment anomaly model and a system value with the aid of the system anomaly model.

(50) The environment value and the system value are transmitted as an output pair ŷ.sub.i to a change unit 180.

(51) Based on ascertained output pair ŷ.sub.i and a desired output pair y.sub.i of environment value and system value, change unit 180 then determines new model parameters Φ′ for the environment anomaly model and the system anomaly model. It is possible, for example, that both anomaly models are given by neural networks. In this case, change unit 180 may ascertain new model parameters Φ′ with the aid of a gradient descent method, such as Stochastic Gradient Descent or Adam.

(52) Ascertained new model parameters Φ′ are stored in a model parameter memory St.sub.1.

(53) Desired output pair y.sub.i may in particular be made up of desired environment value and desired system value, the desired environment value being a desired probability for the environment state vector or a desired probability density value, and the desired system value being a desired probability for the system state vector or a desired probability density value.

(54) In further exemplary embodiments, the training described is repeated iteratively for a predefined number of iteration steps or is repeated iteratively until a difference between ascertained output pair ŷ.sub.i and desired output pair y.sub.i falls below a predefined threshold value. In at least one of the iterations, new model parameters Φ′ determined in a previous iteration are used as model parameters Φ of the anomaly detector.
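The iterative training of paragraphs (48) to (54) can be sketched as a gradient descent loop that drives the output pair ŷ.sub.i toward the desired pair y.sub.i. The linear placeholder models, the learning rate, and the stopping threshold below are illustrative only; in practice the parameters Φ of the actual anomaly models (e.g., neural networks trained with SGD or Adam) would be updated.

```python
# Minimal sketch of the training loop of FIG. 9: compare the output
# pair with the desired pair, update the parameters by gradient
# descent, and stop when the difference falls below a threshold or an
# iteration budget is exhausted.

def train(pairs, desired, phi, lr=0.1, tol=1e-4, max_steps=1000):
    """pairs: training pairs (x_U, x_S) from training data set T.
    desired: desired output pairs y_i = (desired v_U, desired v_S).
    phi = (a, b): placeholder parameters producing the output pair
    yhat_i = (a * x_u, b * x_s) -- a stand-in for the two models.
    """
    a, b = phi
    for _ in range(max_steps):
        total = 0.0
        for (x_u, x_s), (y_u, y_s) in zip(pairs, desired):
            # Output pair yhat_i under the current parameters.
            e_u = a * x_u - y_u
            e_s = b * x_s - y_s
            # Gradient step on the squared difference to y_i;
            # the updated parameters play the role of Phi'.
            a -= lr * 2.0 * e_u * x_u
            b -= lr * 2.0 * e_s * x_s
            total += e_u * e_u + e_s * e_s
        if total < tol:  # difference below the predefined threshold
            break
    return a, b
```

Each pass through the outer loop corresponds to one iteration step in which new model parameters Φ′ replace the previous parameters Φ, as described in paragraph (54).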

(55) Training system 140 may also include at least one processor 145 and at least one machine-readable memory medium 146 containing commands which, when executed by processor 145, prompt training system 140 to carry out a training method according to one of the aspects of the present invention.

(56) The term “computer” encompasses arbitrary devices for processing predefinable computing rules. These computing rules may be in the form of software, or in the form of hardware, or else in a mixed form of software and hardware.