CALCULATION OF EXHAUST EMISSIONS OF A MOTOR VEHICLE
20200240346 · 2020-07-30
Inventors
- Martin Schiegg (Korntal-Muenchingen, DE)
- Heiner Markert (Stuttgart, DE)
- Stefan Angermaier (Stuttgart, DE)
CPC classification
F02D41/28
MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
F02D2041/286
MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
F02D41/26
MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
F02D41/1439
MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
F01N11/007
MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
F02D41/2429
MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
G01M15/05
PHYSICS
F02D41/1458
MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
F02D41/1452
MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
F02D2041/1433
MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
G06N7/01
PHYSICS
F02D2041/1437
MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
F02D41/1401
MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
F02D41/1462
MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
F02D41/1453
MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
F02D41/1465
MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
F01N2900/0402
MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
F02B77/086
MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
F02D41/1444
MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
F01N11/00
MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
International classification
F02D41/14
MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
F02B77/08
MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
F01N11/00
MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
Abstract
A method for ascertaining emissions of a motor vehicle driven with the aid of an internal combustion engine in a practical driving operation. A machine learning system is trained, with the aid of measured time curves of operating variables of the motor vehicle and/or of the internal combustion engine, to generate time curves of the operating variables; the emissions are then ascertained as a function of these generated time curves.
Claims
1-17. (canceled)
18. A method for ascertaining emissions of a motor vehicle driven using an internal combustion engine in a practical driving operation, the method comprising the following steps: training a machine learning system to generate time curves using measured time curves of the operating variables of the internal combustion engine and/or of the motor vehicle; and ascertaining the emissions as a function of the generated time curves generated by the machine learning system.
19. The method as recited in claim 18, wherein the machine learning system includes a first part which initially transforms the measured time curves into first variables, each of which characterizes latent variables, a space of latent variables having a reduced dimensionality, and the machine learning system includes a second part which generates, as a function of the latent variables, second variables, each of which characterizes the generated time curves of the operating variables.
20. The method as recited in claim 19, wherein the first variables, which characterize the latent variables, are the latent variables themselves, and the second part includes a Gaussian process model parameterized by third parameters, and the third parameters and the latent variables are adapted during the training of the machine learning system in such a way that a marginal probability of a reconstruction of the measured time curves is maximized under these latent variables.
21. The method as recited in claim 20, wherein the first part includes a neural network parameterized by fourth parameters, and the adaptation of the latent variables occurs during the training by adapting the fourth parameters.
22. The method as recited in claim 19, wherein the first part and the second part of the machine learning system form an autoencoder.
23. The method as recited in claim 19, wherein the first part and the second part of the machine learning system form a variational autoencoder.
24. The method as recited in claim 19, wherein the first variables, which characterize the latent variables, are the latent variables themselves, and the first part ascertains the latent variables from the measured time curves using a sparse dictionary learning method, the latent variables representing coefficients of the measured time curves in a representation as a linear combination of the dictionary learned using this method.
25. The method as recited in claim 18, wherein latent variables are predefined and the machine learning system generates the time curves of the operating variables as a function of the predefined latent variables, and the emissions are then ascertained as a function of the generated time curves.
26. The method as recited in claim 25, wherein the latent variables are ascertained using a method of statistical test planning.
27. The method as recited in claim 25, wherein a probability density distribution of the latent variables resulting as a function of the measured time curves is ascertained and the predefined latent variables are drawn as a random sample from the estimated probability density distribution.
28. The method as recited in claim 18, wherein the machine learning system includes a first part to which either the measured time curves of the operating variables or time curves of the operating variables generated by a second part of the machine learning system are fed, the first part being trained to decide whether it is fed a measured time curve of the operating variables or a generated time curve of the operating variables, and the second part being trained to generate the time curves of the operating variables as a function of randomly selected input variables.
29. The method as recited in claim 28, wherein the second part is trained to generate the time curves of the operating variables as the function of the randomly selected input variables in such a way that the first part is only poorly able to decide whether it is fed the measured time curve or the generated time curve of the operating variables.
30. The method as recited in claim 28, wherein randomly selected input variables are predefined and the machine learning system generates the time curves of the operating variables as a function of the randomly selected input variables, and the emissions are then ascertained as a function of the generated time curves.
31. The method as recited in claim 30, wherein at least some of the randomly selected input variables are ascertained using a method of statistical test planning.
32. The method as recited in claim 18, wherein the motor vehicle is controlled as a function of the ascertained emissions.
33. A non-transitory machine-readable memory medium on which is stored a computer program for ascertaining emissions of a motor vehicle driven using an internal combustion engine in a practical driving operation, the computer program, when executed by a computer, causing the computer to perform the following steps: training a machine learning system to generate time curves using measured time curves of the operating variables of the internal combustion engine and/or of the motor vehicle; and ascertaining the emissions as a function of the generated time curves generated by the machine learning system.
34. A computer configured to ascertain emissions of a motor vehicle driven using an internal combustion engine in a practical driving operation, the computer configured to: train a machine learning system to generate time curves using measured time curves of the operating variables of the internal combustion engine and/or of the motor vehicle; and ascertain the emissions as a function of the generated time curves generated by the machine learning system.
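Claims 25 and 27 describe predefining latent variables, drawn from a probability density estimated over the latent variables obtained from measured curves, and decoding them into new operating-variable curves. The following is a minimal, hypothetical sketch of that idea; the Gaussian density, the stand-in decoder, and all names are illustrative and not prescribed by the claims.

```python
# Hypothetical sketch of claims 25/27: fit a density to latent variables
# ascertained from measured curves, draw a new latent sample from it, and
# decode the sample into a generated operating-variable time curve.
import random
import statistics

# Latent variables ascertained from measured time curves (toy 1-D values).
latents_from_measurements = [0.9, 1.1, 1.0, 0.8, 1.2, 1.05]

# Estimate a simple Gaussian probability density over the latent variables.
mu = statistics.mean(latents_from_measurements)
sigma = statistics.stdev(latents_from_measurements)

def draw_latent(rng):
    """Draw a predefined latent variable as a random sample (claim 27)."""
    return rng.gauss(mu, sigma)

def generate_curve(z, n=5):
    """Stand-in decoder: maps a latent variable to a time curve (claim 25)."""
    return [z * (1 + 0.1 * t) for t in range(n)]

rng = random.Random(0)
z_new = draw_latent(rng)
curve = generate_curve(z_new)
```

The emissions would then be ascertained as a function of such generated curves, e.g., by feeding them to an emissions model.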
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0083]
[0084]
[0085]
[0086]
[0087]
[0088]
DETAILED DESCRIPTION OF EXAMPLE EMBODIMENTS
[0089]
[0090]
[0091]
[0092]
[0093]
[0094]
[0095] Coder (K) and decoder (D) in this case may form an autoencoder, for example, or a variational autoencoder, or implement a sparse dictionary learning.
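The sparse-dictionary variant can be illustrated with a minimal, pure-Python sketch: a measured curve is expressed as a linear combination of dictionary atoms, and the combination coefficients act as the latent variables. The two atoms and the least-squares fit below are invented for illustration; actual sparse dictionary learning would learn the atoms from many measured curves and enforce sparsity.

```python
# Hypothetical sketch: represent a measured time curve as a linear
# combination of two fixed dictionary atoms; the coefficients are the
# latent variables (z) of the curve.

# Two dictionary atoms (e.g., a constant profile and a ramp profile).
d1 = [1.0, 1.0, 1.0, 1.0]
d2 = [0.0, 1.0, 2.0, 3.0]

def dot(a, b):
    return sum(u * v for u, v in zip(a, b))

def coefficients(x):
    """Least-squares coefficients of x in the dictionary (2x2 normal equations)."""
    a11, a12, a22 = dot(d1, d1), dot(d1, d2), dot(d2, d2)
    b1, b2 = dot(d1, x), dot(d2, x)
    det = a11 * a22 - a12 * a12
    c1 = (b1 * a22 - b2 * a12) / det
    c2 = (a11 * b2 - a12 * b1) / det
    return c1, c2  # latent variables of the curve

def reconstruct(c1, c2):
    """Decode the latent variables back into a time curve."""
    return [c1 * u + c2 * v for u, v in zip(d1, d2)]

x = [2.0, 2.5, 3.0, 3.5]  # measured curve, here exactly 2*d1 + 0.5*d2
c1, c2 = coefficients(x)
x_hat = reconstruct(c1, c2)
```

Coder (K) corresponds to `coefficients` and decoder (D) to `reconstruct`; training would minimize the reconstruction error between `x` and `x_hat` over the dictionary.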
[0096] It is also possible in this case that decoder (D) includes a Gaussian process. Then it is possible either that coder (K) includes a neural network, which ascertains latent variables (z) as a function of parameters (v), these parameters (v) also being varied during the training, in addition to the parameters which characterize the Gaussian process, in such a way that a marginal probability (p(x|z)) of the reconstruction of measured time curves (x) is maximized under these latent variables (z). Or, it is possible that coder (K) is omitted and latent variables (z) are directly predefined, so that, in addition to parameters (y), learning unit (L) also adapts these latent variables (z) in such a way that a cost function, which includes a reconstruction error between measured time curve (x) and the associated curve generated from the selected latent variables (z), is minimized.
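The Gaussian-process decoder can be sketched as follows: given training pairs of latent variables and observed curve values, the GP posterior mean predicts a value for a new latent variable. This is a toy, pure-Python illustration with an invented RBF kernel length scale and noise level; a full GP latent variable model would additionally adapt the latent variables to maximize the marginal probability p(x|z).

```python
# Hypothetical sketch of a Gaussian-process decoder: predict an observed
# value from a latent variable via the GP posterior mean.
import math

zs = [0.0, 1.0]   # latent variables (z) of two training curves
xs = [1.0, 2.0]   # corresponding observed values

def k(a, b, length=1.0):
    """Squared-exponential (RBF) kernel."""
    return math.exp(-0.5 * ((a - b) / length) ** 2)

noise = 1e-6
# Gram matrix K + noise*I for the two training points (2x2).
K = [[k(zs[0], zs[0]) + noise, k(zs[0], zs[1])],
     [k(zs[1], zs[0]), k(zs[1], zs[1]) + noise]]

# Solve K * alpha = xs by hand for the 2x2 case.
det = K[0][0] * K[1][1] - K[0][1] * K[1][0]
alpha = [(xs[0] * K[1][1] - xs[1] * K[0][1]) / det,
         (K[0][0] * xs[1] - K[1][0] * xs[0]) / det]

def predict(z_new):
    """GP posterior mean at a new latent variable."""
    return k(z_new, zs[0]) * alpha[0] + k(z_new, zs[1]) * alpha[1]
```

With near-zero noise the posterior mean interpolates the training values, so `predict` maps a latent variable back to (approximately) its observed curve value.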
[0097]
[0098] Such generated time curves and measured time curves (x) are fed alternatingly to first block (U), i.e., first block (U) is fed either a generated time curve or a measured time curve (x). It is also possible that first block (U) is fed both of these curves if first block (U) has an internal selection mechanism (not shown), which in each case selects one of the two curves.
[0099] First block (U) is trained as shown in
[0100] First block (U) and second block (H) are then trained in alternation: parameters (Y) of first block (U) are adapted so that the classification of first block (U) is correct as often as possible, and parameters of second block (H) are adapted so that the classification of first block (U) is incorrect as often as possible.
[0101]