PREDICTING SUN LIGHT IRRADIATION INTENSITY WITH NEURAL NETWORK OPERATIONS
20210165130 · 2021-06-03
Inventors
- Ti-chiun Chang (Princeton Junction, NJ)
- Patrick Reeb (Adelsdorf, DE)
- Joachim Bamberger (Stockdorf, DE)
CPC classification
Y02E10/56
GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
H02J3/003
ELECTRICITY
H02J3/004
ELECTRICITY
International classification
H02J3/00
ELECTRICITY
Abstract
A method of predicting the intensity of sun light irradiating the ground. At least two input images are provided of a time series of images captured from the sky; a plurality of image features are extracted from the at least two input images; a set of meta data associated with the at least two input images are determined; the image features and the meta data are supplied as input data to a neural network; and neural network operations predict the future intensity of the sun light as a function of the input data. Further, a data processing unit and a computer program for controlling or carrying out the described method are described, as well as an electric power system with such a data processing unit.
Claims
1-15. (canceled)
16. A method for predicting the intensity of sun light irradiating onto ground, the method comprising providing at least two input images of a time series of images captured from the sky; extracting a plurality of image features from the at least two input images; determining a set of meta data associated with the at least two input images; supplying the image features and the meta data as input data to a neural network; and predicting, by way of neural network operations, a future intensity of the sun light as a function of the input data.
17. The method according to claim 16, further comprising performing a first cloud segmentation of a first input image and a second cloud segmentation of a second input image; calculating cloud velocities for cloud portions identified by way of the first cloud segmentation and the second cloud segmentation; and prescribing a spatial irradiation prediction zone within each one of the at least two input images based on the calculated cloud velocities and a predetermined position of the sun within the input images; and thereby extracting image features solely from the prescribed spatial irradiation prediction zone.
18. The method according to claim 16, wherein the step of extracting the plurality of image features comprises subdividing the spatial irradiation prediction zone into a plurality of parallel pixel stripes which are oriented at least approximately perpendicular to a direction of a general cloud velocity; determining, for each pixel stripe of the plurality of pixel stripes, several characteristic pixel intensity values; and using the characteristic pixel intensity values obtained in the determining step as the image features supplied to the neural network.
19. The method according to claim 18, wherein each pixel stripe has a width of one pixel.
20. The method according to claim 18, wherein the several characteristic pixel intensity values include, for each of the pixel stripes, at least one of a mean intensity value of all individual pixel values of pixels assigned to the respective pixel stripe; a maximum intensity value being a highest intensity value of all pixels of the pixel stripe; or a minimum intensity value being a lowest intensity value of all pixels of the pixel stripe.
21. The method according to claim 18, wherein: each one of the input images is a color image captured within a color space having at least a first color, a second color, and a third color; and the several characteristic pixel intensity values include first characteristic pixel intensity values assigned to the first color, second characteristic pixel intensity values assigned to the second color, and third characteristic pixel intensity values assigned to the third color.
22. The method according to claim 16, wherein the meta data include at least one of the following information: sun light intensity measured at a time of capturing at least one of the at least two input images; several sun light intensities measured within a predefined time window, and average sun light intensity measured within a predefined time interval at the time of capturing at least one of the at least two input images.
23. The method according to claim 16, which comprises measuring the sun light intensity on the ground.
24. The method according to claim 23, which comprises measuring the sun light intensity by way of pyranometry and/or with a pyranometer apparatus.
25. The method according to claim 16, wherein the meta data include at least one of the following information: a time of the day when the at least one of the at least two input images is captured; a day of the year when at least one of the at least two input images is captured; and a geographic location of the ground.
26. The method according to claim 16, wherein the neural network comprises an input layer receiving the image features and the meta data; a Long Short-Term Memory layer processing the received image features and meta data and outputting a data set; and at least one further neural network layer receiving the processed image features and meta data as a neural data set and further processing the neural data set; wherein the predicted future intensity of the sun light depends on the further processed neural data set.
27. The method according to claim 26, wherein the neural network further comprises: a further input layer receiving at least one weighting factor and outputting a corresponding weighting data set; and a weighting layer receiving the further processed neural data set and the output weighting data set, wherein the predicted future intensity of the sun light further depends on the weighting data set.
28. The method according to claim 16, wherein the step of providing the at least two input images comprises capturing at least two images from the sky by employing a wide-angle lens; and transforming respectively one of the captured images to one of the at least two input images by applying an unwarping image processing operation.
29. A data processing unit for predicting an intensity of sun light irradiating onto ground, wherein the data processing unit is configured for carrying out the method according to claim 16.
30. A non-transitory computer program for predicting an intensity of sun light irradiating onto ground, the computer program, when being executed by a data processing unit, being configured for carrying out the method according to claim 16.
31. An electric power system, comprising: a power network; a photovoltaic power plant for supplying electric power to the power network; at least one further power plant for supplying electric power to the power network and/or at least one electric consumer for receiving electric power from the power network; a control device for controlling an electric power flow between the at least one further power plant and the power network and/or between the power network and the at least one electric consumer; and a prediction device for producing a prediction signal that is indicative of a predicted intensity of sun light being captured by the photovoltaic power plant in the future; wherein said prediction device includes a data processing unit for predicting an intensity of sun light irradiating onto ground, and the data processing unit is configured for carrying out the method according to claim 16; said prediction device is communicatively connected to said control device, and said control device is configured to control the electric power flow in the future based on the prediction signal.
Description
BRIEF DESCRIPTION OF THE DRAWING
DETAILED DESCRIPTION
[0061] The illustration in the drawing is schematic. It is noted that, in different figures, similar or identical elements or features are provided with the same reference signs or with reference signs which differ from the corresponding reference signs only in the first digit. In order to avoid unnecessary repetition, elements or features which have already been elucidated with respect to a previously described embodiment are not elucidated again at a later position in the description.
[0063] In order to determine the spatial irradiation prediction zone Z, the following steps are carried out:
[0064] (A) A cloud segmentation is performed within at least two (different) input images yielding two spatial cloud distributions.
[0065] (B) The spatial difference between the two spatial cloud distributions is determined.
[0066] (C) Based on the spatial difference and the time difference between capturing the two input images, a general cloud velocity gv is determined.
[0067] (D) Based on the general cloud velocity gv, at least the length of the spatial irradiation prediction zone Z along the direction of the general cloud velocity gv is determined. The width of the prediction zone Z (perpendicular to the direction of the general cloud velocity gv) can be selected based on a-priori knowledge about possible wind direction changes, which may be characteristic for the geographic position for which the intensity of sun light is to be predicted.
[0068] According to the exemplary embodiment described here, for carrying out the method for predicting the intensity of sun light irradiating onto ground, only image features are used which are located within the spatial irradiation prediction zone Z.
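Purely as an illustration, the prescription of the spatial irradiation prediction zone Z according to steps (A) to (D) may be sketched in Python roughly as follows. The brightness-threshold segmentation, the centroid-based displacement estimate, and all function names and numerical values are assumptions of this sketch, not the specific segmentation or optical-flow technique of the embodiment.

```python
import numpy as np

def segment_clouds(img_gray, threshold=0.6):
    # Placeholder cloud segmentation: bright pixels are treated as cloud portions.
    return img_gray > threshold

def general_cloud_velocity(img1_gray, img2_gray, dt_seconds):
    # Steps (A)-(C): segment both input images, take the centroid shift of the
    # cloud masks as the spatial difference, and divide by the capture time difference.
    mask1 = segment_clouds(img1_gray)
    mask2 = segment_clouds(img2_gray)
    centroid1 = np.argwhere(mask1).mean(axis=0)   # (row, col) cloud centroid, image 1
    centroid2 = np.argwhere(mask2).mean(axis=0)   # (row, col) cloud centroid, image 2
    return (centroid2 - centroid1) / dt_seconds   # general cloud velocity gv in pixels/s

def prediction_zone(sun_pos, gv, horizon_s, half_width):
    # Step (D): the zone Z extends from the sun position against the cloud motion;
    # its length is the distance clouds cover within the prediction horizon, its
    # width reflects a-priori knowledge about possible wind direction changes.
    gv = np.asarray(gv, dtype=float)
    sun_pos = np.asarray(sun_pos, dtype=float)
    speed = np.linalg.norm(gv)
    direction = gv / speed if speed > 0 else gv
    far_corner = sun_pos - direction * speed * horizon_s
    lower = np.minimum(sun_pos, far_corner) - half_width
    upper = np.maximum(sun_pos, far_corner) + half_width
    return lower, upper                           # bounding box of zone Z in pixel coordinates
```

In such a sketch, only pixels inside the returned bounding box would be passed on to the feature extraction described below.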
[0069] In the following, a preferred embodiment of a method for predicting the intensity of sun light irradiating onto ground in accordance with various embodiments of the invention is described in detail. This preferred method combines (a) strong a-priori knowledge, (b) “loose” image features extracted from at least two input images, (c) “loose” meta data associated with the at least two input images, and (d) data-driven learning for feature representation and regression into a seamless framework. Most of the method steps are carried out by means of a recurrent neural network (RNN).
[0070] Descriptively speaking, given a sequence of sky images in which the position of the sun S is known, e.g. by means of a calibration step, the optical flow describing the (general) cloud velocity gv is computed first. Processing in two dimensions and looking from the position of (the center of) the sun, the cloud motion orientation makes it possible to prescribe an area of interest within a restricted zone, referred to in this document as the spatial irradiation prediction zone Z.
[0071] According to the exemplary embodiment described here, the spatial irradiation prediction zone Z is subdivided into a number n of parallel pixel stripes PS which are oriented at least approximately perpendicular to the direction of the general cloud velocity gv.
[0072] For the following description it is assumed that the number n of pixel stripes PS is equal to 200. Further, it is assumed that each pixel comprises three sub-pixels, each being assigned to one color of a color space. In the image I under consideration, each sub-pixel (of each pixel) has a certain intensity value. The colors may be, e.g., red (R), green (G), and blue (B).
[0073] In order to deal with a limited amount of image features (of the prediction zone Z), in the embodiment described here the following procedure is carried out (for keeping the amount of data to be handled within acceptable limits): Within each pixel stripe PS and for each color R, G, B there are determined (a) the biggest intensity value I_max, (b) the smallest intensity value I_min, and (c) the mean intensity value I_mean. This means that for each pixel stripe PS there are determined 3×3=9 image features, and that from the entire prediction zone Z there are determined 200×9=1800 image features. These 1800 image features correspond to an N-tuple feature vector (here N is equal to 1800) consisting of simple statistics from each pixel stripe PS.
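As a minimal sketch of this feature extraction, the following Python function reduces the pixels of the prediction zone Z to the 200 × 9 = 1800 per-stripe statistics; the array layout (stripes along the first image axis) and the function name are assumptions of this illustration.

```python
import numpy as np

def stripe_features(zone_rgb, n_stripes=200):
    # zone_rgb: pixels of the prediction zone Z as an (H, W, 3) array, H >= n_stripes,
    # with the stripe axis (perpendicular to the general cloud velocity gv) along axis 0.
    stripes = np.array_split(zone_rgb, n_stripes, axis=0)   # parallel pixel stripes PS
    features = []
    for ps in stripes:
        for color in range(3):                               # sub-pixel colors R, G, B
            values = ps[..., color]
            features.append(values.max())                    # I_max
            features.append(values.min())                    # I_min
            features.append(values.mean())                   # I_mean
    return np.asarray(features, dtype=np.float32)            # 200 * 9 = 1800 image features
```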
[0074] This feature vector is then supplemented or concatenated with meta data. According to the exemplary embodiment described here the following meta data are used:
[0075] (1) The past solar irradiance, i.e. the intensity of sun light measured before at different points in time within a certain time window: In the following it is assumed that this time window is 1 minute and that the time interval between two subsequent images is 5 seconds. Together with the current irradiance measurement this amounts to 13 meta data values.
[0076] (2) The hour and the minute of the day: These are two further meta data values.
[0077] (3) The (general) cloud velocity: This is one further meta data value.
[0078] This means that, according to the exemplary embodiment described here, a 1816-tuple vector is used as the input for an RNN in order to predict the intensity of sun light which will irradiate onto ground in the (near) future, e.g. within a time horizon of 20 minutes. The characteristic features can be trained in the RNN in order to match the measured irradiance (i.e., the ground truth used as the supervision). All training data may be prepared from images and pyranometry values acquired over the past several years, so there will be enough data to train the RNN.
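A minimal sketch of how this 1816-tuple input vector may be assembled is given below; the argument names are illustrative, while the assumption that the past irradiance window holds exactly 12 samples (1 minute at 5-second spacing) follows the numbers of this embodiment.

```python
import numpy as np

def build_rnn_input(image_features, past_irradiance, current_irradiance,
                    hour, minute, cloud_speed):
    # image_features: the 1800 per-stripe statistics extracted from the prediction zone Z.
    # past_irradiance: 12 pyranometer readings, 5 s apart, covering a 1 minute window.
    meta_data = np.concatenate([
        np.asarray(past_irradiance, dtype=np.float32),        # 12 values
        np.asarray([current_irradiance], dtype=np.float32),   # 1 value -> 13 irradiance values
        np.asarray([hour, minute], dtype=np.float32),         # 2 time-of-day values
        np.asarray([cloud_speed], dtype=np.float32),          # 1 general cloud velocity value
    ])
    # Resulting length: 1800 + 13 + 2 + 1 = 1816.
    return np.concatenate([np.asarray(image_features, dtype=np.float32), meta_data])
```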
[0079] In the following, an exemplary network architecture 350 of the neural network used for predicting the intensity of sun light is described.
[0080] A first layer 352 is an input layer with which the above described N-tuple vector (N=1816) is received. A next layer of the network 350 is a Long Short-Term Memory (LSTM) layer 354, in which most of the data processing of the described method is carried out. As has already been mentioned above, an LSTM is a well-known key RNN structure. Due to its capability of memorizing, an LSTM virtually increases the number of layers towards infinity relative to a feedforward neural network.
[0081] After the LSTM layer 354 there are provided, just as an example, two further neural network layers 356 and 358. These layers are used for reducing or consolidating the dimensionality N (here N is a hyperparameter of the LSTM) of the LSTM-processed N-tuple vector. In other words, the dimensionality N of the vector is reduced to N−dN, wherein dN corresponds to the amount of this (data) reduction.
[0082] As can be seen from the drawing, the network 350 further comprises a further input layer 362 with which at least one weighting factor is received and with which a corresponding weighting data set is output.
[0083] Within a weighting layer 372 of the depicted exemplary network 350 a weighting operation is carried out. As has already been mentioned above, in this weighting layer 372 the impact or the weight of some selected values of the processed vector (having a reduced dimensionality) can be reduced and/or the impact or the weight of some other selected values can be increased.
[0084] With an output layer 374 of the network 350 the result, namely the predicted intensity of sun light irradiating onto ground, is output. Thereby, predicted sun light intensity values for several points in time within a predetermined future time horizon can be provided.
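The layer sequence described in paragraphs [0080] to [0084] can be sketched, for example, with the Keras functional API as shown below. The hidden-layer sizes, the sequence length fed to the LSTM, and the number of predicted future time points are illustrative assumptions; the embodiment does not fix these hyperparameters.

```python
import tensorflow as tf
from tensorflow.keras import layers, Model

N = 1816   # length of the combined image-feature / meta-data vector
T = 12     # number of past time steps presented to the LSTM (assumed)

# Input layer 352: a sequence of T input vectors.
features = layers.Input(shape=(T, N), name="input_352")
# LSTM layer 354: recurrent processing of the sequence.
x = layers.LSTM(128, name="lstm_354")(features)
# Further layers 356 and 358: reduce/consolidate the dimensionality (N -> N - dN).
x = layers.Dense(64, activation="relu", name="dense_356")(x)
x = layers.Dense(32, activation="relu", name="dense_358")(x)

# Further input layer 362: weighting factors for the reduced representation.
weighting = layers.Input(shape=(32,), name="weighting_input_362")
# Weighting layer 372: element-wise weighting of the processed vector.
weighted = layers.Multiply(name="weighting_372")([x, weighting])

# Output layer 374: predicted sun light intensities for several future points in time,
# e.g. one value per minute over a 20 minute horizon.
prediction = layers.Dense(20, name="output_374")(weighted)

model = Model(inputs=[features, weighting], outputs=prediction)
model.compile(optimizer="adam", loss="mse")   # trained against measured irradiance (ground truth)
```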
[0085] The predicted sun light intensity values are indicative of the power generation of a photovoltaic power plant that is expected in the near future. This information can be used for controlling the operation of a power system in which, apart from the photovoltaic power plant, at least one further electric power plant of a different type feeds electric power into a power network. Further details are given in the following.
[0086] The electric power system 400 comprises a power network 410 and a photovoltaic power plant 420 for supplying electric power to the power network 410.
[0087] The photovoltaic power plant 420 is driven by the sun S irradiating on non-depicted solar panels of the photovoltaic power plant 420. In order to predict the electric power which can be generated by the photovoltaic power plant 420 in the near future, a prediction device 430 is provided. The prediction device 430 comprises a camera 432 for capturing a series of images of the sky over the photovoltaic power plant 420. The captured images, two of which are the input images I1 and I2 as described above, are forwarded to a data processing and control device 434. A data processing section or data processing unit of the data processing and control device 434 is configured for carrying out the method as described above for predicting the intensity of sun light irradiating onto ground. A control section of the data processing and control device 434 is communicatively connected with (at least some of) the power plants 442 and 444 and with (at least some of) the electric consumers 446 and 448. The corresponding wired or wireless data connections are indicated in the drawing.
[0088] With (the data processing unit of) the data processing and control device 434 carrying out the described method, a prediction of the expected solar irradiance within the near future can be made. This irradiance prediction directly corresponds to a prediction of the power which can be supplied from the photovoltaic power plant 420 to the power network 410 in the near future. This allows the operation of the power plants 442, 444 and/or of the electric consumers 446, 448 to be controlled, by means of (the control section of) the data processing and control device 434, in such a manner that the power flow to and the power flow from the power network 410 are at least approximately balanced. Hence, the stability of the power network 410 and, as a consequence, also the stability of the entire electric power system 400 can be increased.
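As a very simple illustration of how the prediction signal could be turned into control actions (this dispatch rule is not part of the described embodiment and is assumed only for the sketch), the residual between the forecast demand and the predicted photovoltaic power could be assigned to the dispatchable power plants:

```python
def dispatch_setpoints(predicted_pv_power_kw, forecast_demand_kw):
    # For each future time step, whatever the photovoltaic power plant 420 is predicted
    # not to cover must be supplied by the further power plants 442/444; a negative
    # residual would indicate a surplus that consumers 446/448 or storage could absorb.
    return [max(demand - pv, 0.0)
            for demand, pv in zip(forecast_demand_kw, predicted_pv_power_kw)]
```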
[0089] It is pointed out that in the embodiment described here the data processing unit and the control section are realized by one and the same device, namely the data processing and control device 434. However, it should be clear that the data processing unit and the control section can also be realized by different devices which are communicatively connected in order to forward the prediction signal from the data processing unit to the control section.
[0090] It should be noted that the term “comprising” does not exclude other elements or steps and that the use of the articles “a” or “an” does not exclude a plurality. Also, elements described in association with different embodiments may be combined. It should further be noted that reference signs in the claims should not be construed as limiting the scope of the claims.
LIST OF REFERENCE SIGNS
[0091] I input image
[0092] S sun
[0093] C clouds
[0094] Z spatial irradiation prediction zone
[0095] gv general cloud velocity
[0096] PS pixel stripe
[0097] n number of pixel stripes
[0098] 350 network architecture
[0099] 352 input layer
[0100] 354 LSTM layer
[0101] 356 first dense layer
[0102] 358 second dense layer
[0103] 362 further input layer
[0104] 372 weighting layer
[0105] 374 output layer
[0106] 400 electric power system
[0107] 410 power network
[0108] 420 photovoltaic power plant
[0109] 430 prediction device
[0110] 432 camera
[0111] 434 data processing and control device
[0112] 442 coal-fired power plant/gas-fired power plant
[0113] 444 hydroelectric power plant
[0114] 446 industrial complex/factory
[0115] 448 household(s)/domestic home(s)