ELECTRIFIED VEHICLE WITH INDICATION OF ADEQUATE DRIVING RANGE BASED ON AUTOENCODER
20220055488 · 2022-02-24
Assignee
Inventors
CPC classification
B60L15/2045, B60L2240/70, B60L2260/52, B60L2240/40, B60L50/60, B60L50/15, B60L3/12, B60K35/00, B60K6/28, B60W20/00
(Section B: PERFORMING OPERATIONS; TRANSPORTING)
Y02T10/70, Y02T90/16, Y02T10/72, Y02T10/84, Y02T10/7072
(Section Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS)
International classification
B60L15/20, B60L50/60
(Section B: PERFORMING OPERATIONS; TRANSPORTING)
Abstract
An electrified vehicle and associated method for controlling an electrified vehicle having an electric machine powered by a traction battery include an autoencoder trained with training data indicative of a remaining driving range of the traction battery. The trained autoencoder processes vehicle operating data to generate a reference data record and determines a value indicative of a similarity between the vehicle operating data and the reference data record. The autoencoder generates an output data record if the value indicative of the similarity is below a predetermined threshold value. The output data record may be used to display an alert or message to a vehicle occupant and/or control the vehicle to reduce power consumption to increase vehicle driving range.
Claims
1. A method for controlling an electrified vehicle having an electric machine powered by a traction battery, the method comprising: training an autoencoder with training data indicative of a remaining vehicle driving range associated with the traction battery; processing electrified vehicle operating data by the autoencoder and generating a reference data record based on the operating data; generating an output data record by the autoencoder in response to a value indicative of similarity between the electrified vehicle operating data and the reference data record provided by the trained autoencoder being below a predetermined threshold value; and controlling the electrified vehicle in response to the output data record.
2. The method of claim 1 wherein controlling the electrified vehicle comprises reducing electric load on the traction battery.
3. The method of claim 1 wherein controlling the electrified vehicle comprises displaying a message that the traction battery capacity may not be sufficient to reach a designated destination.
4. The method of claim 1 wherein the autoencoder comprises a generative adversarial autoencoder implemented by a remote server, and wherein the output data record is wirelessly transmitted by the remote server to the electrified vehicle.
5. The method of claim 4 further comprising training the autoencoder by unsupervised learning.
6. The method of claim 1 wherein the electrified vehicle comprises one or more controllers and wherein the one or more controllers process stored data representing instructions to implement the autoencoder.
7. The method of claim 6 wherein the autoencoder comprises a generative adversarial autoencoder.
8. The method of claim 1 wherein training the autoencoder comprises wirelessly transmitting internal data and external data to the autoencoder, wherein the internal data includes at least one of vehicle sensor data, navigation data, drive train parameters, and user data, and the external data includes at least one of traffic data, weather data, calendar data, telephone connection data, and charging station data.
9. An electrified vehicle comprising: a traction battery; an electric machine powered by the traction battery and configured to provide propulsive force to the electrified vehicle; a human-machine interface (HMI); and a controller configured to communicate with the HMI, to control the traction battery, or to control the electric machine in response to an output data record from an autoencoder trained using vehicle data associated with a driving range of the electrified vehicle, the output data record generated in response to a value indicative of similarity between electrified vehicle operating data and reference data being below a predetermined threshold value.
10. The electrified vehicle of claim 9 wherein the controller reduces accessory load of the electrified vehicle in response to the output data record indicating the driving range is less than a distance to a current trip destination.
11. The electrified vehicle of claim 9 wherein the controller causes display of a message on the HMI that current traction battery charge may not be sufficient to reach a designated destination in response to the output data record indicating the driving range is insufficient.
12. The electrified vehicle of claim 9 wherein the controller is further configured to wirelessly receive the output data record from a remotely located autoencoder.
13. The electrified vehicle of claim 12 wherein the remotely located autoencoder is trained by unsupervised learning.
14. The electrified vehicle of claim 9 wherein the autoencoder is implemented by the controller.
15. The electrified vehicle of claim 9 wherein the autoencoder comprises a generative adversarial autoencoder including a first artificial neural network designed as a generator and a second artificial neural network designed as a discriminator.
16. A non-transitory computer readable storage medium including stored data representing instructions executable by a processor to implement a trained autoencoder that generates an output data record in response to a value indicative of similarity between electrified vehicle operating data and reference data being below a predetermined threshold value, the autoencoder trained using electrified vehicle operating data associated with an electrified vehicle driving range.
17. The non-transitory computer readable storage medium of claim 16 further comprising data representing instructions to reduce electrified vehicle accessory load in response to the output data record and a predetermined electrified vehicle destination.
18. The non-transitory computer readable storage medium of claim 16 further comprising data representing instructions to control the electrified vehicle in response to the value indicative of similarity being below the predetermined threshold value.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
DETAILED DESCRIPTION
[0018] As required, detailed embodiments are disclosed herein; however, it is to be understood that the disclosed embodiments are merely representative and may be embodied in various and alternative forms. The figures are not necessarily to scale; some features may be exaggerated or minimized to show details of particular components. Therefore, specific structural and functional details disclosed herein are not to be interpreted as limiting, but merely as a representative basis for teaching one skilled in the art to variously employ the claimed subject matter.
[0019] Referring to
[0020] The drive train 8 comprises all components that generate the power for propulsion of the motor vehicle 4 and transfer it to the ground. The data processing unit 6 can be a server, a cloud computer, or a computer network. To perform the tasks of the system 2 described below, the motor vehicle 4 and the data processing unit 6 each include hardware and/or software components. The various hardware and software components described may have one or more associated controllers, processors, or computers to control and monitor operation of the components and implement an autoencoder based on sensor data associated with one or more vehicle components. The controllers, computers, processors, etc. may communicate via a vehicle network that may be implemented as a serial bus (e.g., Controller Area Network (CAN)) or via discrete conductors, in addition to wirelessly as previously described.
[0021] It should be understood that any one of the representative data processing units or controllers can collectively be referred to as a data processing unit, controller, computer, etc. that controls various actuators in response to signals from various sensors to control the vehicle in response to stored instructions processed or executed by one or more microprocessors, some of which may implement an autoencoder as described herein. Each data processing unit may include a microprocessor or central processing unit (CPU) in communication with various types of memory or non-transitory computer readable storage devices or media. Computer readable storage devices or media may include volatile and nonvolatile or persistent storage in read-only memory (ROM), random-access memory (RAM), and keep-alive memory (KAM), for example. KAM is a persistent or non-volatile memory that may be used to store various operating variables while the processor is powered down. Computer-readable storage devices or media may be implemented using any of a number of known memory devices such as PROMs (programmable read-only memory), EPROMs (electrically PROM), EEPROMs (electrically erasable PROM), flash memory, or any other electric, magnetic, optical, or combination memory devices capable of storing data, some of which represent executable instructions, used by the controller in generating display messages or signals and controlling the vehicle via various vehicle components or subsystems.
[0022] Control logic, functions, code, software, strategy etc. performed by one or more processors or controllers may be represented by block diagrams, flow charts, or similar diagrams in one or more figures. These figures provide representative control strategies, algorithms, and/or logic that may be implemented using one or more processing strategies such as event-driven, interrupt-driven, multi-tasking, multi-threading, and the like. As such, various steps or functions illustrated may be performed in the sequence illustrated, in parallel, or in some cases omitted. Although not always explicitly illustrated, one of ordinary skill in the art will recognize that one or more of the illustrated steps or functions may be repeatedly performed depending upon the particular function being performed and processing strategy being used. Similarly, the order of processing is not necessarily required to achieve the features and advantages described herein, but is provided for ease of illustration and description. The control logic may be implemented primarily in software executed by a microprocessor-based data processing unit located on-board the vehicle or remotely in the cloud. Of course, the control logic may be implemented in software, hardware, or a combination of software and hardware in one or more controllers depending upon the particular application. When implemented in software, the control logic may be provided in one or more non-transitory computer-readable storage devices or media having stored data representing code or instructions executed by a computer to implement an autoencoder to control the vehicle or its subsystems or cause a message or alert to be displayed. The computer-readable storage devices or media may include one or more of a number of known physical devices which utilize electric, magnetic, and/or optical storage to keep executable instructions and associated calibration information, operating variables, and the like.
[0023] Referring to
[0024] The traffic data can be for example indicative of a traffic jam corresponding to the current GPS location of the vehicle 4. The traffic data can be accessed directly from a satellite or online. The traffic data can also be read via V2V (Vehicle to Vehicle) or V2I (Vehicle to Infrastructure) communication.
[0025] The weather data can come from weather stations and are indicative of expected daytime temperatures. The weather data can be accessed directly from satellites or online or read in via V2V (Vehicle to Vehicle) or V2I (Vehicle to Infrastructure) communication.
[0026] The calendar data contain information from a digital calendar of the driver. They do not include private information, only the driver's availability for a journey. For example, the motor vehicle 4 will not be used for the next 8 hours due to business hours, or a journey of a certain distance is planned in the coming hours or days. No travel details are provided, only information representative of a route or distance to the destination.
[0027] The phone connection data are representative, for example, of the number of calls or messages in a single day, which can be used to estimate whether a journey is likely.
[0028] Charging station data contain, for example, data from the providers of charging stations, including the GPS positions and distances of nearby charging stations as well as data indicative of their occupancy and use.
[0029] The sensor data come from vehicle-side sensors 22 of the motor vehicle 4, such as GPS, cameras, radar or lidar systems, gyroscopes, and similar sensors. The sensor data can describe the distance to the nearest motor vehicle, a route, the inclination or slope of the road, or the number of lanes.
[0030] The navigation data contain information from digitized maps to determine whether the motor vehicle 4 is on a motorway or within a city or on a long route without an exit. For the determination, GPS information can be evaluated additionally, for example.
[0031] The drive train parameters are provided by the motor control unit, for example, and are indicative of the energy consumed during a journey and the remaining energy reserve, as well as of related data such as torque and speed.
[0032] The user data relate to the driver and/or driving profile. For this purpose, travel times and the length of journeys or even days of the week of journeys can be evaluated.
[0033] The Electronic Horizon Data are based at least on the navigation data or the digitized maps to predict a driving route and road conditions. They include road gradients and slopes, road types, and speed limits.
[0034] In other words, the external data ED can be considered as vehicle-independent data and the internal data ID can be understood as motor vehicle data or also as motor vehicle operating data.
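By way of a non-limiting illustration, the grouping of internal data ID and external data ED into a single operating data record may be sketched as follows. All field names and values here are hypothetical and chosen only for illustration; they are not part of the described embodiment.

```python
from dataclasses import dataclass, asdict

@dataclass
class InternalData:
    """Vehicle-side (internal) data ID; field names are illustrative."""
    sensor_distance_m: float      # distance to nearest vehicle from radar/lidar
    road_grade_pct: float         # inclination/slope of the road
    energy_remaining_kwh: float   # drive train parameter from the motor control unit
    avg_trip_length_km: float     # user/driving-profile statistic

@dataclass
class ExternalData:
    """Vehicle-independent (external) data ED; field names are illustrative."""
    traffic_jam: bool             # from V2V/V2I or an online traffic service
    forecast_temp_c: float        # expected daytime temperature
    nearest_charger_km: float     # from charging-station provider data

def build_operating_record(internal: InternalData, external: ExternalData) -> dict:
    """Flatten ID and ED into one operating data record for the autoencoder."""
    record = {f"id_{k}": v for k, v in asdict(internal).items()}
    record.update({f"ed_{k}": v for k, v in asdict(external).items()})
    return record

bd = build_operating_record(
    InternalData(35.0, 2.5, 18.4, 42.0),
    ExternalData(False, 21.0, 6.3),
)
```

Such a flattened record would be transmitted wirelessly to the data processing unit 6 as described below.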
[0035] Furthermore, of the components of the data processing unit 6,
[0036] Of the components of the motor vehicle 4, in addition to the drive train 8, a modem 16 for wireless data transmission, a battery management system 18 for the operation of the traction battery and an HMI (human machine interface) for informing the driver of the motor vehicle 4 as well as vehicle-side sensors 22 are shown in
[0037] In operation, the captured internal data ID are transmitted wirelessly to the data processing unit 6 via the modem 16, read in by the network layer 14, and then temporarily stored in the memory 10. Furthermore, an output data record AD provided by the data processing unit 6 is read in with the modem 16 and then evaluated by the battery management system 18 to inform the driver of the vehicle 4 by outputting via the HMI 20 and/or to generate a control signal AS for controlling the drive train 8 in order to reduce the energy consumption. The HMI 20 may be designed to inform the driver by means of an acoustic and/or optical and/or haptic signal.
[0038] As generally illustrated in
[0039] Reference is now additionally made to
[0040] An autoencoder algorithm is designed to provide a representation of the original input, i.e. the training data TD or the operating data BD, using the encoder 24 and the decoder 26.
[0041] After the training phase I with the training data TD, the autoencoder simply provides a copy of its input, now the operating data BD, as the output in the normal phase II. If this copy differs from the original data record, an anomaly can be inferred.
[0042] In operation, the decoder 26 provides the output data record AD as the output variable, which can be or may contain a logical variable that is assigned the value logical zero for sufficient traction battery capacity or remaining range and the value logical one for insufficient traction battery capacity or remaining range.
[0043] The logical variable is assigned the logical value one, for example during the training phase I, if the decoder 26 cannot distinguish reference data RD from the training data TD within predetermined limits or accuracy. Otherwise, the logical variable is assigned the value logical zero during the training phase I.
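The reconstruction-based similarity check of paragraphs [0040] to [0042] can be sketched, purely for illustration, with a small linear autoencoder. The synthetic data, the similarity measure W, and the threshold value SW below are assumptions of this sketch rather than part of the embodiment.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "training data TD": records lying near a 2-D subspace of R^3.
latent = rng.normal(size=(500, 2))
basis = np.array([[1.0, 0.5, -0.3], [0.2, -1.0, 0.8]])
TD = latent @ basis + 0.01 * rng.normal(size=(500, 3))

# Linear autoencoder: encoder weights We and decoder weights Wd, trained
# by plain gradient descent on the squared reconstruction error.
We = 0.1 * rng.normal(size=(3, 2))
Wd = 0.1 * rng.normal(size=(2, 3))
lr = 0.02
for _ in range(3000):
    Z = TD @ We                      # encode
    E = Z @ Wd - TD                  # reconstruction error (RD - TD)
    gWd = Z.T @ E / len(TD)
    gWe = TD.T @ (E @ Wd.T) / len(TD)
    We -= lr * gWe
    Wd -= lr * gWd

def similarity(bd: np.ndarray) -> float:
    """Value W: similarity between operating data BD and its reconstruction RD."""
    rd = (bd @ We) @ Wd
    return 1.0 / (1.0 + float(np.sum((bd - rd) ** 2)))

SW = 0.5  # predetermined threshold value SW (illustrative)

def output_record(bd: np.ndarray) -> int:
    """Logical output AD: one (anomaly / insufficient range) if W < SW, else zero."""
    return 1 if similarity(bd) < SW else 0

ad_ok = output_record(np.array([1.0, 0.5, -0.3]))   # lies in the training subspace
ad_bad = output_record(np.array([5.0, 5.0, 5.0]))   # far from the training data
```

An in-distribution record reconstructs almost exactly and yields a similarity near one, while an off-distribution record reconstructs poorly and triggers the output data record.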
[0044] For this purpose—as is later explained by
[0045] The generative adversarial network 28 has a first and a second artificial neural network. The first artificial neural network is in the form of a generator and the second artificial neural network is in the form of a discriminator.
[0046] The first artificial neural network, the generator, creates candidates, while the second artificial neural network, the discriminator, evaluates the candidates. Typically, the generator maps from a vector of latent variables to the desired result space. The goal of the generator is to learn to generate results according to a certain distribution. The discriminator, on the other hand, is trained to distinguish the results of the generator from the data from the real, predetermined distribution. The target function of the generator is then to generate results that the discriminator cannot distinguish. As a result, the generated distribution will gradually conform to the real distribution.
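The interplay of generator and discriminator described above may be illustrated with a deliberately simple one-parameter example: the generator shifts a normal distribution toward a real distribution that the discriminator tries to tell apart. The distributions, learning rates, and non-saturating generator objective are illustrative assumptions, not the claimed networks.

```python
import numpy as np

rng = np.random.default_rng(1)

REAL_MU = 4.0                 # "real, predetermined distribution": N(4, 1)
theta = 0.0                   # generator parameter: g(z) = theta + z
w, b = 0.0, 0.0               # discriminator: D(x) = sigmoid(w*x + b)
lr_g, lr_d, batch = 0.05, 0.05, 64

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-np.clip(x, -60, 60)))

for _ in range(3000):
    # Discriminator step: distinguish real samples from generated candidates,
    # ascending log D(xr) + log(1 - D(xg)).
    xr = REAL_MU + rng.normal(size=batch)
    xg = theta + rng.normal(size=batch)
    pr, pg = sigmoid(w * xr + b), sigmoid(w * xg + b)
    w += lr_d * (np.mean((1 - pr) * xr) - np.mean(pg * xg))
    b += lr_d * (np.mean(1 - pr) - np.mean(pg))
    # Generator step: produce candidates the discriminator cannot distinguish
    # (ascend log D(g(z)); d/d(theta) = (1 - D) * w).
    xg = theta + rng.normal(size=batch)
    pg = sigmoid(w * xg + b)
    theta += lr_g * np.mean((1 - pg) * w)
```

As training proceeds, theta drifts toward the mean of the real distribution, so the generated distribution gradually conforms to the real one, as stated above.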
[0047] The generative adversarial network 28 is used to continuously update the decoder 26 in operation by evaluating the reference data RD.
[0048] The generative adversarial network 28 and the autoencoder 12 are trained together in two subphases. In the first subphase, the autoencoder 12 updates the encoder 24 and the decoder 26 to minimize a reconstruction error of the input data. In the second subphase, the generative adversarial network 28 is updated to distinguish real input data from generated input data, i.e. the data provided by the autoencoder 12. The generative adversarial network 28 then updates its generator so as to again challenge its discriminator.
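The two training subphases may be sketched as follows with a linear encoder and decoder and a discriminator acting on the latent variable, in the spirit of an adversarial autoencoder. All dimensions, rates, and the choice of a standard normal prior are assumptions of this sketch.

```python
import numpy as np

rng = np.random.default_rng(2)

# Training data: 2-D records driven by one hidden factor of std 2; the
# adversarial subphase should push the encoded latent toward the N(0, 1) prior.
t = 2.0 * rng.normal(size=(400, 1))
X = t @ np.array([[0.8, 0.6]]) + 0.05 * rng.normal(size=(400, 2))

We = 0.1 * rng.normal(size=(2, 1))   # encoder weights
Wd = 0.1 * rng.normal(size=(1, 2))   # decoder weights
d0 = d1 = d2 = 0.0                   # discriminator on features (1, z, z^2)

def disc(z):
    u = np.clip(d0 + d1 * z + d2 * z * z, -60, 60)
    return 1.0 / (1.0 + np.exp(-u))

lr = 0.02
for _ in range(3000):
    # Subphase 1: encoder/decoder gradient step on the reconstruction error.
    Z = X @ We
    E = Z @ Wd - X
    Wd -= lr * (Z.T @ E) / len(X)
    We -= lr * (X.T @ (E @ Wd.T)) / len(X)
    # Subphase 2: the discriminator learns to separate prior samples ("real")
    # from encoded latents ("generated"); the encoder then updates to fool it.
    zr = rng.normal(size=(len(X), 1))
    zg = X @ We
    pr, pg = disc(zr), disc(zg)
    d1 += lr * float(np.mean((1 - pr) * zr) - np.mean(pg * zg))
    d2 += lr * float(np.mean((1 - pr) * zr ** 2) - np.mean(pg * zg ** 2))
    d0 += lr * float(np.mean(1 - pr) - np.mean(pg))
    pg = disc(zg)
    We += lr * (X.T @ ((1 - pg) * (d1 + 2 * d2 * zg))) / len(X)

recon_mse = float(np.mean((X @ We @ Wd - X) ** 2))
latent_std = float(np.std(X @ We))
```

After training, the records are reconstructed accurately while the latent distribution has been pulled toward the unit-variance prior by the adversarial subphase.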
[0049] In the present representative embodiment, the autoencoder 12 is trained by means of unsupervised learning. In a departure from the present embodiment, however, training can also be carried out by means of supervised learning, semi-supervised learning or reinforcement learning.
[0050] After completing the training phase I, the decoder 26 of the autoencoder 12 forms a generative model, which maps input data to a data distribution. In normal operation II, when the encoder 24 is loaded with the operating data BD, it can thus be detected whether there is a deviation or anomaly, which is then considered indicative of insufficient traction battery capacity or remaining range.
[0051] A procedure for the operation of the system 2 is now explained with additional reference to
[0052] In a further step S600 it is checked whether the training was successful or not. For this purpose, the precision AG of the autoencoder 12 is compared with a predetermined limit value GW. If the precision AG is less than or equal to the limit value GW, the training is continued, i.e. the procedure is continued with step S200. If the precision AG is above the limit value GW, the procedure is continued with a further step S700.
[0053] In the next step S700, the trained autoencoder 12 is loaded with the read-in current operating data BD and then provides a reference data record RD. Furthermore, the autoencoder 12 determines the value W indicative of a similarity between the read-in operating data BD and the reference data RD provided by the trained autoencoder 12 and generates the output data record AD if the value W indicative of a similarity is below the predetermined threshold value SW. In a further step S800, the output data record AD is transmitted wirelessly to the motor vehicle 4. In a further step S900, the output data record AD is evaluated in the motor vehicle 4 in order to generate the signal HS for informing a driver of the motor vehicle 4 and/or the control signal AS for controlling the drive train 8. The procedure then continues with the next step S200.
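The sequence of steps described above, training until the precision AG exceeds the limit value GW and then repeatedly inferring, transmitting, and evaluating, can be summarized in the following control-flow sketch. The training dynamics and similarity values are mocked placeholders, not the claimed implementation.

```python
GW, SW = 0.9, 0.5        # limit value GW for precision AG, threshold value SW

state = {"ag": 0.0}

def train_step():
    """S200-S500: read in data and train the autoencoder (precision AG mocked)."""
    state["ag"] = min(1.0, state["ag"] + 0.25)

def infer(bd_similarity):
    """S700: return similarity value W and, if W < SW, an output data record AD."""
    return bd_similarity, (1 if bd_similarity < SW else None)

def vehicle_evaluate(ad):
    """S900: map AD (received wirelessly in S800) to driver signal HS and
    drive train control signal AS."""
    return {"hs": "range warning", "as": "reduce load"} if ad == 1 else {}

# S600: continue training until the precision AG exceeds the limit value GW.
while state["ag"] <= GW:
    train_step()

# S700-S900 for two consecutive operating records (similarity values assumed).
actions = [vehicle_evaluate(ad) for _, ad in (infer(0.9), infer(0.2)) if ad is not None]
```

Only the second record falls below the threshold SW, so only it produces an output data record and the corresponding driver warning and load-reduction action.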
[0054] In a departure from the present embodiment, the order of the steps may also be different. In addition, multiple steps can also be performed at the same time or simultaneously. Furthermore, in a departure from the present embodiment, individual steps can be skipped or omitted.
[0055] Thus, direct detection of the traction battery capacity or of the remaining amount of energy is dispensed with; instead, insufficient traction battery capacity or remaining range is inferred indirectly. In this way, range anxiety can be counteracted in a particularly simple and at the same time reliable way.
[0056] While representative embodiments are described above, it is not intended that these embodiments describe all possible forms of the claimed subject matter. The words used in the specification are words of description rather than limitation, and it is understood that various changes may be made without departing from the spirit and scope of the claimed subject matter. Additionally, the features of various implementing embodiments may be combined to form further embodiments that may not be explicitly illustrated or described.