NON-INTRUSIVE REDUCTANT INJECTOR CLOGGING DETECTION
20230147578 · 2023-05-11
Assignee
Inventors
Cpc classification
F01N3/208; F01N2550/02; F01N2610/02; F01N2610/14; F01N2900/00; F01N2900/0402; F01N2900/0601; F01N2900/1812; F01N2900/1821; F01N2900/1822 (MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING)
Y02A50/20; Y02T10/12; Y02T10/40 (GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS)
International classification
Abstract
A computer-implemented method for determining whether a reductant (e.g. urea) injector is clogged is provided. The method includes receiving data indicative of an injector duty cycle and a pump duty cycle. Using a trained machine learning module, at least a first value is calculated, indicative of a probability of the injector being clogged. The method further includes providing, based on the first value, an indication of whether the reductant injector is clogged. A device for providing the indication using the method, a computer program, a reductant injector system, and e.g. a combustion engine including such a reductant injector system are also provided.
Claims
1. A computer-implemented method for determining whether a reductant injector is clogged, comprising: receiving data indicative of at least i) an injector duty cycle for the reductant injector and ii) a pump duty cycle for a pump providing reductant to the reductant injector; providing the data as input to a determination module, wherein the determination module uses one or more machine learning algorithms trained to, based on the data indicative of the injector duty cycle and the pump duty cycle, infer a set of one or more statistics including at least a first value indicative of a probability of the reductant injector being clogged, and providing, based on the one or more statistics, an indication of whether the reductant injector is clogged.
2. The method according to claim 1, wherein: the determination module is at least partially implemented using an artificial neural network, ANN, including at least an input layer, a hidden layer connected to the input layer, and an output layer connected to the hidden layer, and the hidden layer or the output layer includes at least one neuron for providing the first value.
3. The method according to claim 2, wherein: the input layer includes at least 2N input neurons, a first set of N input neurons are configured to receive values indicative of the injector duty cycle sampled across a first time window of N different first time instances, and a second set of N input neurons are configured to receive values indicative of the pump duty cycle sampled across a second time window of N different second time instances.
4. The method according to claim 3, wherein: the first time window equals the second time window, and all of the first time instances and the second time instances are separated by a same time difference (dt).
5. The method according to claim 1, wherein: providing the indication that the reductant injector is clogged includes confirming a fulfilment of a full set of one or more conditions, and the one or more conditions include a first condition that, for at least one time instance, the first value, or a value of a first function applied to the first value, is at least not below a first threshold.
6. The method according to claim 5, wherein: the one or more statistics further include a second value indicative of a probability of the reductant injector not being clogged, and the one or more conditions further include a second condition that, for the at least one time instance, the second value, or a value of a second function applied to the second value, is at least not above a second threshold.
7. The method according to claim 5, wherein: the one or more conditions further include a third condition that, during at least a first time period preceding and optionally also including the at least one time instance, a requested mass flow, and/or a time average of a requested mass flow, for the reductant injector has at least not been below a mass flow threshold.
8. The method according to claim 5, wherein: the one or more conditions further include a fourth condition that, during at least a second time period preceding and optionally also including the at least one time instance, a count of time instances at which all of the other of the one or more conditions have been fulfilled is at least not below a first voting threshold, and the second time period at most extends back to a time instance at which the fourth condition was last fulfilled.
9. The method according to claim 8, wherein: the second time period at most extends back to a most recent of the time instance at which the fourth condition was last fulfilled and a time instance at which a count of time instances at which all of said other of the one or more conditions have not been fulfilled was last at least not below a second voting threshold.
10. The method according to claim 5, wherein: the one or more conditions further include a fifth condition that, a second count that is increased at each time instance at which all of the other of the one or more conditions have been fulfilled, and decreased at each time instance at which all of the other of the one or more conditions have not been fulfilled, is at least not below a third voting threshold.
11. The method according to claim 1, wherein: the one or more statistics further includes at least one additional value indicative of a probability of the reductant injector being only partially clogged.
12. A device for determining whether a reductant injector is clogged, comprising processing circuitry configured to cause the device to: receive data indicative of at least i) an injector duty cycle for the reductant injector and ii) a pump duty cycle for a pump providing reductant to the reductant injector; generate an indication of whether the reductant injector is clogged by performing a method according to claim 1, and output a signal including the indication.
13. A computer program for determining whether a reductant injector is clogged, the computer program comprising computer code which, when run on processing circuitry of a device, causes the device to: receive data indicative of at least i) an injector duty cycle for the reductant injector and ii) a pump duty cycle for a pump providing reductant to the reductant injector; generate an indication of whether the reductant injector is clogged by performing a method according to claim 1, and output a signal including the indication.
14. A reductant injection system for a combustion engine, comprising: a reductant injector configured to inject reductant into an exhaust system of the engine; a pump configured to provide reductant to the reductant injector; a control unit configured to control an injector duty cycle for the reductant injector and a pump duty cycle for the pump, and a device for determining whether the reductant injector is clogged according to claim 12.
15. A combustion engine, comprising a reductant injection system according to claim 14.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0031] Exemplifying embodiments will now be described below with reference to the accompanying drawings, in which:
[0032]
[0033]
[0034]
[0035]
[0036]
[0037] In the drawings, like reference numerals will be used for like elements unless stated otherwise. Unless explicitly stated to the contrary, the drawings show only such elements that are necessary to illustrate the example embodiments, while other elements, in the interest of clarity, may be omitted or merely suggested. As illustrated in the Figures, the (absolute or relative) sizes of elements and regions may be exaggerated or understated vis-à-vis their true values for illustrative purposes and, thus, are provided to illustrate the general structures of the embodiments.
DETAILED DESCRIPTION
[0038] Exemplifying embodiments of the envisaged method, device, computer program, reductant injection system, exhaust system, combustion engine and vehicle according to the present disclosure will now be described more fully hereinafter with reference to the accompanying drawings. The drawings show currently preferred embodiments; the invention may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein. Rather, these embodiments are provided for thoroughness and completeness, and to fully convey the scope of the present disclosure to the skilled person.
[0039] As described earlier herein, it is envisaged that a “reductant” may be for example urea (or a mixture of urea and e.g. water). Although it is envisaged also that a reductant may be something else, such as e.g. ammonia, urea will in what follows (for exemplary reasons only) be used as an example of the reductant. For this reason, the terms “reductant” and “urea” will be used interchangeably. Similarly, the terms “reductant/urea injector” and just “injector” will be used interchangeably as well.
[0040]
[0041] The method 100 includes a determination module 110 configured to determine whether a reductant/urea injector is clogged. To do so, the determination module 110 includes a machine learning module 120 that implements one or more machine learning algorithms, and receives as its input at least data indicative of an injector duty cycle X.sub.1 for the injector, and data indicative of a pump duty cycle X.sub.2 for a pump which provides urea to the injector. There may of course, in this or other embodiments, optionally be other, additional input data (in) provided to the machine learning module 120 as well. As will be described later herein in more detail, the machine learning module 120 has been trained to infer a set P={P.sub.clog, . . . } of one or more statistics, based on at least the input data X.sub.1 and X.sub.2 (and in some embodiments also based on the optional input data in). The set P includes at least a first value (or statistic) P.sub.clog which is indicative of a probability that the injector is clogged. For example, after proper scaling, the value P.sub.clog may assume a decimal value between 0 and 1, where e.g. 0 represents an estimated 0% probability of the injector being clogged, 0.5 represents an estimated 50% probability, and 1 represents an estimated 100% probability. Other scalings of the values of the set P are of course also possible. The machine learning module 120 may of course, in this or in other embodiments, optionally output additional output data 112 as well.
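The data flow just described can be sketched in Python; all names below are hypothetical, and the fixed return value merely stands in for a trained model:

```python
def machine_learning_module(x1_window, x2_window):
    """Stand-in for the trained module 120: maps windows of injector (X1)
    and pump (X2) duty-cycle samples to a set P of statistics.

    A real implementation would evaluate the trained model here; the
    fixed return value is purely illustrative."""
    return {"P_clog": 0.5}

def decision_module(P, threshold=0.9):
    """Stand-in for module 130: maps the statistics P to the indication I,
    here a logical 1 ("clogged") or 0 ("not clogged")."""
    return 1 if P["P_clog"] >= threshold else 0

P = machine_learning_module([0.4] * 10, [0.6] * 10)
I = decision_module(P)   # 0, since 0.5 < 0.9
```

The threshold value of 0.9 is an assumption for illustration; the decision logic the disclosure actually envisages is the condition-based scheme described below.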
[0042] The output from the machine learning module 120, including at least the set P and optionally also the additional output data 112, is provided to a decision module 130 which, based on the set P (including the first value P.sub.clog), makes a decision whether the injector is clogged or not, and outputs a signal I which is indicative of the decision made. For example, if it is decided by the decision module 130 that the injector is clogged, the signal I may correspond to a logical “1”, while, if the decision is that the injector is not clogged, the signal I may instead correspond to a logical “0” or e.g. to an undetermined state. Of course, the exact form of the signal I is not important, as long as the signal I contains enough information to infer therefrom the decision made by the decision module 130. The signal I may be an analog signal, a digital signal, or similar. To make its decision, the decision module 130 may also, in this or in other embodiments, optionally receive and use additional input data 113.
[0043] The input data X.sub.1 and X.sub.2 are preferably provided as samples of the true injector duty cycle and pump duty cycle, respectively, taken at discrete time instances t.sub.i, where i is an integer corresponding to a particular sample, such that X.sub.1(i)=X.sub.1(t=t.sub.i) and X.sub.2(j)=X.sub.2(t=t.sub.j). In some embodiments, all time instances are equidistant in time. In other embodiments, it is envisaged that the spacing in time between time instances may be different. The time instances are preferably the same for both the injector and pump duty cycles, but it may also be envisaged that e.g. the data for the injector duty cycle is provided at time instances which are different from those of the pump duty cycle. For example, the time between consecutive time instances may be the same for both duty cycles, but the time instances of e.g. the injector duty cycle may lag those of the pump duty cycle by a fixed amount, or vice versa. In other embodiments, it may e.g. be such that one duty cycle is sampled at a higher or lower frequency than the other, etc. In any case, it is envisaged that the machine learning module 120 and the corresponding machine learning algorithm(s) are preferably trained using training data sampled in the same way as the actual, live data on which the method 100 is going to operate.
[0044] In what follows, it will be assumed that all signals are sampled at discrete time instances.
The Decision Module 130
[0045]
[0046]
[0047]
[0048] In
[0049] In
[0050] In common to all methods 100 illustrated in
[0051]
[0052] In
[0053] By using the above “voting system”, the reliability of the method 100 can be improved in that several equal outputs from the confirmation module 131 are needed during a certain time interval before the state of the method (i.e. the indication I) may change to indicate that the injector is clogged. This prevents e.g. a single erroneous output from the confirmation module 131 from greatly affecting the outcome. Likewise, by only waiting for a certain second time period T.sub.vote before the voting process starts over, a potential “unclogging” of the injector, occurring at a later time, can also be picked up by the method 100 and the indication I changed accordingly. The indication I is here provided by the confirmation module 131 checking whether all of the conditions c.sub.1, c.sub.3 and c.sub.4 are true, i.e. such that I=(c.sub.1{circumflex over ( )}(c.sub.3{circumflex over ( )}c.sub.4)).
[0054]
[0055] However, in some embodiments, the voting timer module 133 may also keep a second count C.sub.good of the number of time instances, during the second time period T.sub.vote, at which the conditions c.sub.1, c.sub.2 and c.sub.3 have failed to be true at the same time (that is, how many times the output from the confirmation module 131 has been false). If it is decided that, during the second time period T.sub.vote, the second count C.sub.good is at least not below a second voting threshold CT.sub.good, the counters C.sub.clog and C.sub.good may be reset and a new second time period T.sub.vote started. This makes it harder for the condition c.sub.4 to be fulfilled, as the counter C.sub.clog may be reset earlier than it would otherwise have been (i.e. before lapse of the full second time period T.sub.vote). This may further enhance the reliability of the output from the method 100, as a larger number of occurrences where all conditions c.sub.1, c.sub.2 and c.sub.3 are not simultaneously fulfilled may indicate a higher level of indecisiveness for the method 100. Phrased differently, with the voting system as illustrated in and described with reference to
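One possible reading of the voting logic in the two preceding paragraphs can be sketched as follows; the class name, the sample-counted window, and the exact reset semantics are assumptions for illustration, not the disclosure's definitive implementation:

```python
class VotingTimer:
    """Sketch of a voting timer (module 133), under assumed semantics:
    C_clog counts instances where all other conditions held, C_good counts
    instances where they did not; either a lapsed voting window T_vote or
    C_good reaching CT_good resets both counters."""

    def __init__(self, T_vote, CT_clog, CT_good):
        self.T_vote = T_vote        # voting window length, in samples (assumed)
        self.CT_clog = CT_clog      # first voting threshold
        self.CT_good = CT_good      # second voting threshold
        self.reset()

    def reset(self):
        self.C_clog = 0
        self.C_good = 0
        self.elapsed = 0

    def step(self, other_conditions_fulfilled):
        """Process one time instance; return True if condition c4 holds."""
        self.elapsed += 1
        if other_conditions_fulfilled:
            self.C_clog += 1
        else:
            self.C_good += 1
        c4 = self.C_clog >= self.CT_clog
        # Early reset on too many "good" votes, or when the window lapses.
        if self.C_good >= self.CT_good or self.elapsed >= self.T_vote:
            self.reset()
        return c4
```

With this sketch, three consecutive "clogged" votes (with CT.sub.clog=3) fulfil c.sub.4, while two "good" votes (with CT.sub.good=2) reset the counters early, illustrating how the early reset makes c.sub.4 harder to fulfil.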
[0056] In another envisaged embodiment of the method 100 as illustrated e.g. in
[0057] In common to all methods 100 illustrated in
[0058] Although not illustrated specifically in any Figure herein, it may also be envisaged that the machine learning module 120 is configured to use more classes than just a single class (“clogged”), or more than just two classes (“clogged” and “not clogged”/“good”). For example, it is envisaged that there may also be a class for the injector being only “partially clogged”, with a corresponding value P.sub.clog.sub.
The Machine Learning Module 120
[0059] As envisaged in various embodiments herein, the machine learning module 120 may for example operate not only on the most recent time instance samples of X.sub.1 and X.sub.2, but also take into account a history of each duty cycle. For example, if using two classes, where P.sub.1(i) indicates the estimated probability of the injector belonging to the first class at time instance i, and where P.sub.2 (i) indicates the estimated probability of the injector belonging to the second class at time instance i, the output of the machine learning module 120 can be formulated as follows:
[P.sub.1(i),P.sub.2(i)]=F[X.sub.1(i),X.sub.1(i−1), . . . ,X.sub.1(i−L.sub.1);X.sub.2(i),X.sub.2(i−1), . . . ,X.sub.2(i−L.sub.2)], (1)
where L.sub.1 and L.sub.2 are the number of historical time instances taken into account for each of the duty cycles X.sub.1 and X.sub.2. It may for example be assumed that L.sub.1=L.sub.2=L, but it may also, as described earlier herein, be assumed that L.sub.1≠L.sub.2. Thus, the machine learning module 120 may rely also on historical values of the variables X.sub.1 and X.sub.2, and not only on the most recent available samples of the duty cycles.
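The assembly of the history windows in Eq. (1) can be illustrated with a small helper; the function name and sample values below are hypothetical:

```python
def build_input_window(X1, X2, i, L1, L2):
    """Assemble the input to F in Eq. (1): the most recent L1+1 samples
    of X1 and the most recent L2+1 samples of X2, newest first."""
    x1_hist = [X1[i - k] for k in range(L1 + 1)]
    x2_hist = [X2[i - k] for k in range(L2 + 1)]
    return x1_hist + x2_hist

# Hypothetical duty-cycle samples (already scaled to [0, 1]).
X1 = [0.40, 0.42, 0.45, 0.43, 0.44]   # injector duty cycle
X2 = [0.60, 0.61, 0.59, 0.62, 0.60]   # pump duty cycle
x = build_input_window(X1, X2, i=4, L1=2, L2=2)
# x = [0.44, 0.43, 0.45, 0.60, 0.62, 0.59]
```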
[0060] If using e.g. only a single class, it may be envisaged that the output from the machine learning module 120 is just P.sub.1(i). If using more than two classes, the output from the machine learning module 120 may instead be [P.sub.1(i), P.sub.2(i), . . . , P.sub.Q (i)], where Q is an integer indicating the total number of classes used. As used herein, the first class may correspond to the injector being clogged, e.g. such that P.sub.1(i)=P.sub.clog(i). Likewise, the second class may correspond to the injector not being clogged, e.g. such that P.sub.2 (i)=P.sub.good(i). Of course, the exact numbering of the various classes is irrelevant as long as consistency is kept between all performed operations and calculations.
[0061] In other embodiments of the machine learning module 120, regression may be used instead of classification. In such a case, there may be no specific classes, but instead a single output P(i) corresponding to an estimated level of clogging of the injector. For example, after suitable scaling, an estimate of P(i)=0.5 may for example correspond to the injector (as estimated by the machine learning module 120) being 50% clogged, while P(i)=0.1 may correspond to the injector being 10% clogged, etc. Such a method may be desirable due to its simplicity and analog nature.
[0062] As envisaged herein, the machine learning module 120 may implement any particular type of machine learning algorithm(s), as long as such algorithm(s) can be trained and are suitable to provide either the classification probabilities P.sub.1(i), P.sub.2(i), . . . or the analog value P(i) as discussed above. Examples of algorithms may for example be Artificial Neural Networks (ANNs) in general, including e.g. Multi-Layer Perceptrons (MLPs), but also other examples from the realm of machine learning techniques, such as for example Support Vector Machines (SVMs), Decision Trees, Random Forests, and Long Short-Term Memory Neural Networks (LSTMs), etc.
[0063] With reference to
[0064]
x.sub.j=[X.sub.1(j),X.sub.1(j−1), . . . ,X.sub.1(j−N+1),X.sub.2(j),X.sub.2(j−1), . . . ,X.sub.2(j−N+1)],
where, as earlier described herein, X.sub.1(j)=X.sub.1(t=t.sub.j) for example corresponds to a particular sample of the signal X.sub.1 taken at the time instance t.sub.j. By using the latest N samples of each duty cycle, it can be said that the size of the time window used is N. The input layer includes a total of N.sup.I=2N input neurons 211, divided into a first group 212 of N input neurons and a second group 213 also of N input neurons. The first group 212 of input neurons receives the first N samples [X.sub.1(j), . . . , X.sub.1(j−N+1)] of the input signal x.sub.j, while the second group 213 of input neurons receives the remaining N samples [X.sub.2(j), . . . , X.sub.2(j−N+1)] of the input signal x.sub.j. The output Z.sub.i.sup.I of each i:th input neuron 211 is the same as its input, i.e. Z.sub.i.sup.I=x.sub.j(i), where x.sub.j(i) is the i:th element of the vector x.sub.j.
[0065] The hidden layer 220 includes a total of N.sup.H hidden neurons 221. The input layer 210 and the hidden layer 220 are fully connected, i.e. such that each input neuron 211 forms connections 214 to each one of the hidden neurons 221. Each such connection between a particular i:th input neuron 211 and a corresponding h:th hidden neuron 221 is denoted w.sub.i,h.sup.1, and corresponds to a particular weight for that connection.
[0066] The output Z.sub.h.sup.H from each h:th hidden neuron 221 can be written as
Z.sub.h.sup.H=σ.sup.H(Σ.sub.i=1.sup.N.sup.I w.sub.i,h.sup.1Z.sub.i.sup.I+b.sup.H),
where σ.sup.H is a (non-linear) activation function used in the hidden neurons 221, and where b.sup.H is a bias term. For the hidden neurons 221, the activation function σ.sup.H may for example be a Rectified Linear Unit (ReLU) of the form
σ.sup.H(x)=max(0,x).
[0067] The output layer 230 includes a total of N.sup.O output neurons 231. The output neurons 231 are fully connected with the hidden neurons 221, such that each hidden neuron 221 forms connections 222 to each one of the output neurons 231. Each such connection between a particular h:th hidden neuron 221 and a corresponding o:th output neuron 231 is denoted w.sub.h,o.sup.2, and corresponds to a particular weight for that connection.
[0068] The output Z.sub.o.sup.O from each o:th output neuron 231 can be written as
Z.sub.o.sup.O=σ.sup.O(Σ.sub.h=1.sup.N.sup.H w.sub.h,o.sup.2Z.sub.h.sup.H+b.sup.O),
where σ.sup.O is a (non-linear) activation function used in the output neurons 231, and where b.sup.O is a bias term. For the output neurons 231, the activation function σ.sup.O may for example be a Sigmoid function of the form
σ.sup.O(x)=1/(1+e.sup.−x).
If using classifiers as output, each output Z.sub.o.sup.O may e.g. correspond to a respective classifier Y.sub.o. As envisaged herein, such a classifier Y.sub.o may correspond to an estimated probability P.sub.o of the injector belonging to the o:th class (e.g. P.sub.clog, P.sub.good, etc.).
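Taken together, the two layer equations define a complete forward pass. A minimal NumPy sketch follows, with randomly initialized weights standing in for a trained network (the layer dimensions are assumptions):

```python
import numpy as np

def mlp_forward(x, W1, b_H, W2, b_O):
    """Forward pass of an MLP like the MLP 200: ReLU hidden layer and
    sigmoid output layer, matching the Z^H and Z^O expressions above."""
    z_h = np.maximum(0.0, x @ W1 + b_H)            # sigma^H = ReLU
    z_o = 1.0 / (1.0 + np.exp(-(z_h @ W2 + b_O)))  # sigma^O = sigmoid
    return z_o  # e.g. [P_clog, P_good] for two output neurons

rng = np.random.default_rng(0)
N = 10                             # time-window size, so 2N input neurons
W1 = rng.normal(size=(2 * N, 8))   # input -> hidden weights w^1 (8 hidden neurons assumed)
W2 = rng.normal(size=(8, 2))       # hidden -> output weights w^2 (two classes)
x = rng.uniform(size=2 * N)        # one window of duty-cycle samples
p = mlp_forward(x, W1, np.zeros(8), W2, np.zeros(2))
```

Because the output activation is a sigmoid, each entry of `p` lies in (0, 1) and can be read as a per-class probability estimate.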
[0069] The MLP 200 can be trained using commonly known backpropagation methods, using e.g. a gradient descent-based Adam optimizer and a mean squared error (MSE) as the error function. It is also envisaged that there may be more than a single hidden layer, for example two hidden layers, three hidden layers or more, each including its own number of hidden neurons.
Generation of Training Data
[0070] One exemplary way of obtaining data suitable for training the machine learning module 120 as envisaged herein will now be described in more detail.
[0071] The topology of the MLP 200 illustrated in
[0072] A laboratory engine rig setup for urea dosage testing was used, wherein the (diesel) engine could be run using both a known clogged injector and a known non-clogged injector in the exhaust system. For each type of injector, the engine was run for a total of five World Harmonized Transient Cycles (WHTC) and a total of one In-Service Conformity Cycle (ISC). While running, data was recorded using a constant sampling rate of 100 milliseconds (ms), i.e. such that the time instances of the sampling were defined as t.sub.j=t.sub.0+j*0.1 s, where t.sub.0 was the starting time of the signal sampling. From the recorded data, two signals were extracted: the injector control duty cycle X.sub.1 and the pump control duty cycle X.sub.2. These two signals were extracted while requiring that the urea dosing system was running under normal operation conditions (i.e. a running state), and that the mass flow of urea requested by the emission strategy was above a predefined threshold. The predefined threshold was, in this particular example, 0.5 grams per second (g/s). The sampled data was then concatenated to form a two-dimensional matrix {circumflex over (D)}, wherein each row {circumflex over (D)}.sub.i of said matrix was defined as
{circumflex over (D)}.sub.i=[X.sub.1(i),X.sub.1(i−1), . . . ,X.sub.1(i−N+1),X.sub.2(i),X.sub.2(i−1), . . . ,X.sub.2(i−N+1),Y.sub.1(i),Y.sub.2(i)],
where the additional columns Y.sub.1 and Y.sub.2 correspond to the actual class of the injector at time sample/instance i. For example, if it was known that the injector was not clogged at time instance i, Y.sub.1(i) was set to “0” and Y.sub.2(i) was set to “1”. Similarly, if it was known that the injector was clogged at another time instance j, Y.sub.1(j) was set to “1” and Y.sub.2(j) was set to “0”. After having inserted all rows corresponding to the non-clogged injector followed by all rows corresponding to the clogged injector, the rows of the matrix {circumflex over (D)} were then reshuffled into a new matrix {tilde over (D)}. The shuffling was of course such that it preserved the integrity of each time window and associated class, i.e. the order of the columns was not affected.
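The construction of {circumflex over (D)} and its row-wise shuffle into {tilde over (D)} might be sketched as follows; the signal values and window size N are invented for illustration:

```python
import numpy as np

def build_dataset(X1, X2, N, clogged):
    """Build rows D_i = [X1 window, X2 window, Y1, Y2] as described,
    with (Y1, Y2) = (1, 0) for a clogged injector and (0, 1) otherwise."""
    y = [1.0, 0.0] if clogged else [0.0, 1.0]
    rows = []
    for i in range(N - 1, len(X1)):
        w1 = [X1[i - k] for k in range(N)]   # latest N injector samples
        w2 = [X2[i - k] for k in range(N)]   # latest N pump samples
        rows.append(w1 + w2 + y)
    return rows

# Concatenate both recordings, then shuffle whole rows only, so each
# time window stays paired with its class labels (columns untouched).
D_hat = np.array(build_dataset([.4] * 6, [.6] * 6, N=3, clogged=False)
                 + build_dataset([.8] * 6, [.9] * 6, N=3, clogged=True))
rng = np.random.default_rng(0)
D_tilde = D_hat[rng.permutation(len(D_hat))]
```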
[0073] Connecting back to the description of the input data to the MLP 200 illustrated in
[0074] After constructing the matrix {tilde over (D)}, the first X % of the rows were selected as a training set, the next Y % of the rows were selected as a validation set, and the remaining Z % of the rows were selected as a test set. Here, X+Y+Z=100%, and in the particular example described X=60%, Y=20% and Z=20%. In addition, all values relating to the duty cycles X.sub.1 and X.sub.2 were normalized by dividing by 100, resulting in values between 0.0 and 1.0.
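A sketch of the described split and normalization, assuming the duty cycles are recorded as percentages (the helper name and column layout are hypothetical):

```python
import numpy as np

def split_and_normalize(D, x_cols, fractions=(0.6, 0.2, 0.2)):
    """60/20/20 row split into train/validation/test sets, with the
    duty-cycle columns divided by 100 to land in [0.0, 1.0]."""
    D = D.astype(float).copy()
    D[:, :x_cols] /= 100.0            # normalize duty-cycle percentages
    n = len(D)
    n_train = int(fractions[0] * n)
    n_val = int(fractions[1] * n)
    return D[:n_train], D[n_train:n_train + n_val], D[n_train + n_val:]

# Toy matrix: 4 duty-cycle columns at 50%, plus the two label columns.
D = np.hstack([np.full((10, 4), 50.0), np.tile([1.0, 0.0], (10, 1))])
train, val, test = split_and_normalize(D, x_cols=4)
```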
Verification of the Envisaged Machine Learning Module 120
[0075] Using the training set as gathered according to the above, the MLP 200 was then trained by updating the weights w.sub.i,h.sup.1 and w.sub.h,o.sup.2 using backpropagation and a gradient descent-based Adam optimizer with a learning rate of 0.001. The total number of samples in the training set was 288219, and the batch size used (i.e. the number of samples over which error was accumulated before each weight update in the MLP) was 32. The error function was MSE, and the whole training set was used 30 times (i.e. for 30 epochs). After epoch 30, it was confirmed that the model loss had stabilized at a sufficiently low value, and the validation set was then used to confirm that no overfitting of the model had occurred. At each step, the MSE was calculated using the recently estimated values of P.sub.1(i) and P.sub.2(i), as well as the “correct” answers found in the last two columns of each row {tilde over (D)}.sub.i.
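The training configuration (Adam with learning rate 0.001, MSE loss, batch size 32, 30 epochs) can be illustrated with a minimal NumPy sketch; a single sigmoid unit stands in for the full MLP 200, and the data are synthetic:

```python
import numpy as np

def adam_step(p, g, m, v, t, lr=0.001, b1=0.9, b2=0.999, eps=1e-8):
    """One Adam parameter update, as used for the weight updates above."""
    m = b1 * m + (1 - b1) * g
    v = b2 * v + (1 - b2) * g * g
    m_hat = m / (1 - b1 ** t)
    v_hat = v / (1 - b2 ** t)
    return p - lr * m_hat / (np.sqrt(v_hat) + eps), m, v

rng = np.random.default_rng(0)
X = rng.uniform(size=(320, 20))                 # synthetic input windows
Y = (X.mean(axis=1) > 0.5).astype(float)        # synthetic class labels
w, m, v, t = np.zeros(20), np.zeros(20), np.zeros(20), 0
for epoch in range(30):                         # 30 epochs
    for i in range(0, len(X), 32):              # batch size 32
        xb, yb = X[i:i + 32], Y[i:i + 32]
        p = 1.0 / (1.0 + np.exp(-(xb @ w)))     # sigmoid prediction
        # Gradient of the MSE loss through the sigmoid.
        g = xb.T @ ((p - yb) * p * (1 - p)) / len(xb)
        t += 1
        w, m, v = adam_step(w, g, m, v, t)
```

Starting from zero weights the MSE is exactly 0.25 (all predictions 0.5), and the loop drives it lower; a full implementation would apply the same update rule to every weight matrix of the MLP.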
[0076] The remaining test data set (i.e. the last 20% of the rows of the matrix {tilde over (D)}) was then used to check the performance of the model. The performance criterion used was a simple average precision (AP), defined as
AP=TP/(TP+FP),
where TP is the ratio of true positive classifications, and FP the ratio of false positive classifications. A performance of AP=0.927 was determined (following from TP=0.4721 and FP=0.0372). The determined ratio of true negative classifications was TN=0.4721, and the ratio of false negative classifications was FN=0.0186. With the AP as high as ˜93%, it was thus concluded that the performance of the envisaged method was satisfactory, and that the model had, to a high degree of certainty, learned to estimate reasonable probabilities for the injector being clogged or not clogged.
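The reported AP is consistent with computing TP/(TP + FP) from the stated ratios, as a small sketch shows:

```python
def average_precision(tp, fp):
    """Simple average precision from true/false positive ratios:
    AP = TP / (TP + FP)."""
    return tp / (tp + fp)

ap = average_precision(0.4721, 0.0372)   # the ratios reported in the text
# ap ≈ 0.927
```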
Verification of the Envisaged Decision Module 130 and Method 100
[0077] Using a method 100 as illustrated in
[0078]
[0079] As can be seen from
[0080]
[0081] As can be seen in
[0082] As can be seen in
[0083] In summary of
Other Aspects
[0084]
[0085] As used herein, “processing circuitry” may for example include any integrated circuit capable of performing instructions stored e.g. as machine language instructions in some memory to which the processing circuitry has access, or where the memory is included as part of the processing circuitry itself. Examples of processing circuitry include, but are not limited to, central processing units (CPUs), graphics processing units (GPUs), field-programmable gate arrays (FPGAs), etc. The processing circuitry may be provided as part of e.g. a computer, an engine control unit, an after-treatment control module (ACM), or similar.
[0086] It is also envisaged to provide a computer program for determining whether the injector is clogged. The computer program includes computer code which, when run on processing circuitry such as the processing circuitry 410 of the device 400 illustrated in
[0087] As used herein, the envisaged computer program may e.g. be stored or distributed on a data carrier. A “data carrier” may e.g. be a transitory data carrier, such as a modulated electromagnetic wave or a modulated optical wave. A data carrier may also be non-transitory, including e.g. volatile and non-volatile memories such as permanent or non-permanent storage media of magnetic, optical or solid-state type. Such memories may be portable (as found e.g. in/on USB-sticks, CD-roms, DVDs, etc.), or fixedly mounted (such as e.g. HDDs, SSDs, etc.).
[0088]
[0089] Envisaged herein is also an aspect including the engine 600, and also including e.g. the exhaust system 610 and in particular the reductant (e.g. urea) injection system 500. Although not illustrated in any particular Figure herein, a vehicle including the engine 600 (and the exhaust system 610 and urea injection system 500) is also envisaged as another aspect of the present disclosure. Another aspect of the present disclosure includes only the exhaust system 610 and the reductant/urea injection system 500.
[0090] As envisaged herein, a “combustion engine” (or Internal Combustion Engine, ICE) is not necessarily a diesel-driven engine. A combustion engine may for example, in some envisaged embodiments, instead be e.g. a spark-ignition (SI) combustion engine. Such a combustion engine may for example be driven by hydrogen. Likewise, as envisaged herein, the combustion engine can be used to propel numerous different types of vehicles, including e.g. trucks, buses, cars, working machines, and similar. The combustion engine may also form part of e.g. a marine vessel, or any other type of vessel wherein propulsion is provided by combustion engines. It is also envisaged that the combustion engine may be used to drive something which is not a vehicle itself, such as for example a (stationary) generator or similar. In particular, it is envisaged that the proposed method (and device) for detecting whether a reductant injector is clogged can also be applied to other types of injectors, in particular any kind of injector configured to be driven using a duty cycle (e.g. based on pulse-width modulation, PWM, or similar), and configured to inject any kind of pressurized non-compressible fluid. Such other injectors may for example be diesel injectors forming part of engine diesel injection systems, or for example Aftertreatment Hydrocarbon Injectors (AHIs) used to e.g. inject diesel at a muffler to help raise temperatures to the higher levels required for regeneration.
[0091] Although features and elements may be described above in particular combinations, each feature or element may be used alone without the other features and elements or in various combinations with or without other features and elements. Additionally, variations to the disclosed embodiments may be understood and effected by the skilled person in practicing the claimed invention, from a study of the drawings, the disclosure, and the appended claims.
[0092] In the claims, the words “comprising” and “including” do not exclude other elements, and the indefinite article “a” or “an” does not exclude a plurality. The mere fact that certain features are recited in mutually different dependent claims does not indicate that a combination of these features cannot be used to advantage.