Dynamic shading system

11580923 · 2023-02-14

Abstract

A dynamic shading system is disclosed. The system comprises a screen and a control system. The screen comprises a plurality of light valves. Each light valve has an adjustable translucency so that the screen can present an image on one side of the screen. The control system is configured to determine what image is to be presented on the one side of the screen in dependence of light intensity incident on another side of the screen. The control system is further configured to control each light valve of the screen to have a translucency so that the plurality of the light valves forms the determined image on the one side of the screen.

Claims

1. A dynamic shading system comprising a screen comprising a plurality of light valves, each light valve having an adjustable translucency so that the screen can present an image on one side of the screen; and a control system that is configured to determine what image is to be presented on the one side of the screen in dependence of light intensity incident on another side of the screen, and to control each light valve of the screen to have a translucency so that the plurality of the light valves forms the determined image on the one side of the screen, wherein the control system is configured to perform steps of obtaining a first light intensity value indicative of a first light intensity incident on the other side of the screen, determining, based on the first light intensity value, a first to-be-presented image, controlling each light valve of the screen to have a translucency so that the plurality of the light valves forms the first image on the one side of the screen, obtaining a second light intensity value indicative of a second light intensity incident on the other side of the screen, the second light intensity value being different from the first light intensity value, determining, based on the second light intensity value, a second to-be-presented image, the second image being different from the first image, and controlling each light valve of the screen to have a translucency so that the plurality of light valves forms the second image on the one side of the screen, wherein the first and second image each comprises one or more image elements formed by adjacent image pixels having the same or similar pixel values, each image element having a respective size and shape and relative position within the image in question, wherein the first and second image differ in that a particular image element in the second image, the particular image element having a particular size and particular shape and particular relative position, does not have in the first image a corresponding image element having the same particular size and the same particular shape and the same particular relative image position.

2. The dynamic shading system according to claim 1, wherein determining the first to-be-presented image comprises determining a first set of image pixel values comprising, for each light valve of the screen, a greyscale value, the first set of image pixel values representing the first image, and determining the second to-be-presented image comprises determining a second set of image pixel values comprising, for each light valve of the screen, a greyscale value, the second set of image pixel values representing the second image.

3. The dynamic shading system according to claim 1, wherein the control system is configured to perform a step of determining, for each light valve out of the plurality of light valves, a first translucency value, wherein controlling each light valve of the screen to have a translucency so that the plurality of the light valves forms the first image on the one side of the screen is performed based on the first translucency values determined for the respective light valves, and the control system is configured to perform a step of determining, for each light valve out of the plurality of light valves, a second translucency value, wherein controlling each light valve of the screen to have a translucency so that the plurality of the light valves forms the second image on the one side of the screen is performed based on the second translucency values determined for the respective light valves.

4. The dynamic shading system according to claim 3, wherein the first translucency value for each light valve is determined on the basis of a light intensity value indicative of light intensity incident on the other side of the screen when the plurality of the light valves will form the first image on the one side of the screen and the second translucency value for each light valve is determined on the basis of a light intensity value indicative of light intensity incident on the other side of the screen when the plurality of the light valves will form the second image on the one side of the screen.

5. The dynamic shading system according to claim 1, wherein the second light intensity as indicated by the second light intensity value is higher than the first light intensity as indicated by the first light intensity value, wherein on average the second image is darker than the first image.

6. The dynamic shading system according to claim 1, wherein the control system is configured to determine what movie is to be presented on the one side of the screen in dependence of light intensity on the other side of the screen, and wherein the control system is configured to perform steps of determining, based on the first light intensity value, a first to-be-presented movie comprising a first set of successively presented images, the first set of successively presented images comprising the first to-be-presented image, and controlling each light valve of the screen to sequentially have first translucencies so that the plurality of the light valves forms the first movie on the one side of the screen, and determining, based on the second light intensity value, a second to-be-presented movie different from the first to-be-presented movie, the second to-be-presented movie comprising a second set of successively presented images, the second set of successively presented images comprising the second to-be-presented image, and controlling each light valve of the screen to sequentially have second translucencies so that the plurality of light valves forms the second movie on the one side of the screen.

7. The dynamic shading system according to claim 6, wherein the second light intensity as indicated by the second light intensity value is higher than the first light intensity as indicated by the first light intensity value, wherein, on average, the second movie is darker than the first movie.

8. The dynamic shading system according to claim 7, wherein determining the first movie comprises generating the first movie based on the first light intensity value and/or wherein determining the second movie comprises generating the second movie based on the second light intensity value.

9. The dynamic shading system according to claim 1, wherein the first and/or second image depicts one or more plants and/or animals.

10. The dynamic shading system according to claim 6, wherein the second light intensity as indicated by the second light intensity value is higher than the first light intensity as indicated by the first light intensity value, and wherein the first movie and the second movie depict growth of one or more plants, wherein the depicted growth in the second movie is faster than the depicted growth in the first movie.

11. The dynamic shading system according to claim 1, wherein each light valve comprises a liquid crystal display (LCD) pixel.

12. The dynamic shading system according to claim 1, further comprising a light intensity sensor for measuring light intensity incident on the other side of the screen.

13. A computer-implemented method for determining images that are to be presented on a side of a screen in dependence of light intensity incident on another side of the screen, the screen comprising a plurality of light valves, each light valve having an adjustable translucency so that the screen can present an image on one side of the screen, the method comprising: obtaining a first light intensity value indicative of a first light intensity incident on the other side of the screen, determining, based on the first light intensity value, a first to-be-presented image, controlling each light valve of the screen to have a translucency so that the plurality of the light valves forms the first image on the one side of the screen, obtaining a second light intensity value indicative of a second light intensity incident on the other side of the screen, the second light intensity value being different from the first light intensity value, determining, based on the second light intensity value, a second to-be-presented image, the second image being different from the first image, and controlling each light valve of the screen to have a translucency so that the plurality of light valves forms the second image on the one side of the screen, wherein the first and second image each comprises one or more image elements formed by adjacent image pixels having the same or similar pixel values, each image element having a respective size and shape and relative position within the image in question, wherein the first and second image differ in that a particular image element in the second image, the particular image element having a particular size and particular shape and particular relative position, does not have in the first image a corresponding image element having the same particular size and the same particular shape and the same particular relative image position.

14. A non-transitory computer-readable storage medium having stored thereon a computer program comprising instructions which, when the program is executed by a computer, cause the computer to carry out the method of claim 13.

Description

BRIEF DESCRIPTION OF THE DRAWINGS

(1) Aspects of the invention will be explained in greater detail by reference to exemplary embodiments shown in the drawings, in which:

(2) FIG. 1 depicts a dynamic shading system according to one embodiment;

(3) FIGS. 2A-2D illustrate how two images may be understood to be different from each other;

(4) FIGS. 3A-3D illustrate different versions of the same image;

(5) FIG. 4 illustrates a method for determining to-be-presented images;

(6) FIG. 5 illustrates a particular method for determining a translucency value for each light valve;

(7) FIGS. 6A-6E show movies that may be determined in accordance with methods described herein;

(8) FIG. 7 is an artist's impression of the screen of the dynamic shading system presenting an image of plants on the one side of the screen; and

(9) FIG. 8 illustrates a data processing system according to an embodiment.

DETAILED DESCRIPTION OF THE DRAWINGS

(10) In the figures, identical reference numbers indicate identical or similar elements.

(11) FIG. 1 depicts a dynamic shading system 2 according to one embodiment. The system 2 comprises a screen 4 and a control system 8.

(12) The screen 4 comprises a plurality of light valves 6 that have an adjustable translucency, e.g. an adjustable transparency. The screen 4 provides protection from a light source 10, such as the sun, and can thus protect against UV and/or IR radiation, and/or against glare caused by direct incident light from the light source and/or against high perceived brightness caused by direct or indirect light from the light source. In one example, the screen is placed near a boundary between the interior I of a building and the exterior E. In FIG. 1, the screen 4 forms such a boundary, e.g. is used as a façade. The screen 4 may cast a shadow that provides a comfortable area 12 wherein an observer 14 does not experience glare and/or wherein a temperature is kept suitably low as a result of the screen 4 blocking incident sunlight.

(13) The light intensity incident on the screen 4 may be a measure of a (time-averaged) amount of radiant power incident on the screen 4. The light valves 6 may be unable to generate light autonomously, e.g. without being backlit. In one embodiment, the sun is used as a variable backlight source. As a result, during operation, the screen may consume less than 10 W/m², for example less than 4 W/m².

(14) Since the translucency of a light valve 6 relates to an amount of light passing through the light valve 6, it also relates to a perceived brightness of the light valve 6. Hence, given a certain light intensity incident on the screen 4, a high translucency of the light valve 6 relates to a high perceived brightness of the light valve and a low translucency value of a light valve relates to a low perceived brightness of the light valve 6. The screen 4 may be said to use the incident light as a variable backlight for displaying images.
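The proportionality described in this paragraph can be sketched in a few lines of Python. The function name and the representation of translucency as a fraction in [0, 1] are illustrative assumptions, not part of the disclosure:

```python
def perceived_brightness(incident_intensity: float, translucency: float) -> float:
    """Perceived brightness of a light valve: the incident light (acting as
    a variable backlight) scaled by the fraction the valve passes."""
    if not 0.0 <= translucency <= 1.0:
        raise ValueError("translucency must be a fraction in [0, 1]")
    return incident_intensity * translucency
```

Under this sketch, for a fixed incident intensity, a higher translucency directly yields a higher perceived brightness, as the paragraph states.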

(15) The plurality of light valves 6 may be LCD pixels. The plurality of light valves 6 may be regularly arranged. In particular, the plurality of light valves may be regularly arranged pixels. The height and/or the width of each light valve may be in the range 0-20 m, preferably 0 mm-5 m, more preferably 5-50 mm or <1 mm. In one example, each light valve is a pixel sized 14.7 mm by 16.4 mm. The screen has a height and a width, wherein the height may be 1-10 m, preferably 1-3 m, and the width may be in the range of 0.5-500 m, preferably 4-200 m, more preferably 8-100 m. In one example, the screen is sized 3.4 m by 6.85 m. The screen may comprise 3428 light valves per m². The screen may comprise approximately 2,000,000 pixels per 1.25 m² (1920×1080 pixels on 1.44 m×0.82 m, for example).

(16) The light valves 6 may be electronically controllable. In one embodiment, the translucency of each light valve 6 may be dependent on an electrical current or voltage being applied to the light valve 6. The control system 8 may be configured to apply a specific electrical current or a specific voltage to each light valve 6 for controlling the translucency of each light valve 6, for example by applying pulse width modulation (PWM). Each light valve may comprise a transistor, such as a thin-film transistor (TFT), that is configured to control a voltage that is provided to, for example, the liquid crystal display pixel. Each thin-film transistor may receive the same driving voltage.

(17) The translucency of a light valve and the voltage/current to be applied may or may not possess a linear relationship. The translucency of a light valve and the voltage/current applied to it may possess a negative or positive relationship. In an example, said relationship is negative and a zero applied voltage/current results in the light valves having a high translucency, e.g. a high transparency. Hence, if a power failure occurs, the screen beneficially will not revert to an all-black state wherein it blocks substantially all incident light.

(18) The control system 8 may be configured to separately adjust the translucency of each light valve 6. The control system may be configured to control the translucency of each light valve 6 to become either of two values: a maximum value corresponding to a maximum percentage, preferably 100% or close to 100%, of incident light intensity passing through the light valve 6 and to a maximum brightness of the light valve given the circumstances; and a minimum value corresponding to a minimum percentage, preferably 0% or close to 0%, of incident light intensity passing through the light valve 6 and to a minimum brightness, preferably blackness, of the light valve 6. The translucency of each light valve 6 may be adjustable to a wide range of values and the control system 8 may be configured to control the translucency of each light valve 6 to be any value between said minimum and maximum value, i.e. the translucency may be controlled in a stepless manner. In an example, the control system 8 is configured to control the translucency of each light valve 6 to be one value out of a fixed number of values. Said fixed number of values is for example two, three, five, ten, sixteen, et cetera. Said fixed number of values may depend on the number of bits that represent a greyscale value as explained below. In a particular example, 8 bits are used for representing the possible greyscale values for an image pixel value. In such case, the image pixel value can have 256 different values.
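The quantization described above, in which each valve adopts one value out of a fixed number of translucency levels, can be sketched as follows. The function name and the linear mapping from greyscale to translucency fraction are assumptions for illustration only:

```python
def greyscale_to_translucency(grey: int, levels: int = 256) -> float:
    """Map an 8-bit greyscale value (0 = black, 255 = white) to a
    translucency fraction in [0.0, 1.0], quantized to `levels` discrete
    steps, e.g. 2, 16 or 256 levels as mentioned in the text."""
    if not 0 <= grey <= 255:
        raise ValueError("greyscale value must be in 0..255")
    step = round(grey / 255 * (levels - 1))  # nearest discrete level
    return step / (levels - 1)               # fraction of light passed
```

With two levels the sketch reduces to the binary maximum/minimum control also described in the paragraph.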

(19) In FIG. 1, the control system 8 is shown to control the light valves 6a to have a high translucency, the light valves 6b to have an intermediate translucency, and the light valves 6c to have a low translucency. Hence, an observer 14 perceives light valves 6a as lighter than light valves 6b, and perceives light valves 6b as lighter than light valves 6c. The observer may perceive light valves 6a as white, even if they have a translucency lower than 100%, and light valves 6c as black. The control system 8 may be configured to adjust the translucency of the light valves at least 25 times per second, preferably at least 60 times per second.

(20) Thus, in the depicted embodiment, the control system 8, which is configured to determine what image is to be presented on the one side of the screen in dependence of light intensity incident on the other side of the screen, has determined that an arrow of a certain size is to be presented on the screen.

(21) Optionally, the dynamic shading system comprises a light intensity sensor 16 for measuring light intensity incident on the other side of the screen. The sensor 16 may be positioned on the light receiving side of the screen 4 and/or on another side of the screen. In the latter example, the sensor 16 may be positioned to measure an amount of light intensity passing through one or more light valves. These one or more light valves may then be controlled to adopt one or more predetermined translucencies, e.g. cycle through a number of translucencies, and the control system may be configured to associate each predetermined translucency value with a light intensity passing through the one or more light valves. Based on this, an indication of the light intensity incident on the screen may be determined. The light intensity value indicative of the light intensity incident on the screen may be an indication of the ambient light's intensity. In one example, the shading system comprises multiple light intensity sensors 16, for example a first light intensity sensor for at least one light valve and a second light intensity sensor for at least one other light valve. In one example, the control system comprises at least one light intensity sensor per light valve. The sensor 16 advantageously enables the shading system to quickly adapt to changing lighting conditions.
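The indirect measurement described above, inferring incident intensity from light measured behind a valve of known translucency, may be sketched as below. The function names, and the assumption that transmitted intensity equals incident intensity times translucency, are illustrative and not taken from the disclosure:

```python
def estimate_incident_intensity(measured: float, translucency: float) -> float:
    """Estimate the light intensity incident on the front of the screen from
    the intensity measured behind a valve whose translucency (fraction of
    light passed) is known."""
    if not 0.0 < translucency <= 1.0:
        raise ValueError("translucency must be in (0, 1]")
    return measured / translucency


def estimate_from_cycle(samples) -> float:
    """Average the estimates over (translucency, measured) pairs taken while
    the valve cycles through predetermined translucencies, as described in
    the text."""
    estimates = [measured / t for t, measured in samples if t > 0]
    return sum(estimates) / len(estimates)
```

Cycling through several translucencies and averaging, as in `estimate_from_cycle`, would reduce sensitivity to sensor noise at any single translucency setting.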

(22) Optionally, the control system 8 comprises a person sensor, e.g. a movement sensor, that is configured to detect a person 14 near the screen, e.g. on said one side of the screen, and in response output a signal, and the control system 8 may be configured to control the light valves based on this signal. This advantageously allows the system, for example, to temporarily improve its glare control function or climate control function when a person 14 passes by the screen 4.

(23) The control system may also be configured to connect to other devices, for example using Internet of Things technology known in the art. As such, the control system and the dynamic shading system can be a part of a smart façade of a building and can be used in Smart Building applications. The control system may for example be configured to connect to mobile devices of respective users, so that the users can control, to some extent, the images that are presented on the screen. In one example, users can, via their respective mobile device, e.g. smart phones, play interactive games on the screen against each other. To this end, the control system may be configured to receive control signals from a plurality of mobile devices.

(24) In one embodiment, the control system 8 comprises a user interface through which a user can change settings of the shading system and/or input light intensity values.

(25) The control system 8 may be embodied as a data processing system 100 further described below.

(26) In one embodiment, each light valve comprises an adjustable translucency in the sense that each light valve comprises an adjustable transparency, which transparency the control system 8 is configured to control. Transparency may thus be regarded as a species of translucency. If an area has a high transparency value, then not only does a relatively large percentage of incident light intensity pass through the area, but an observer 14 at one side of the area can also clearly see objects at the other side of the area, because light passing through the area does not scatter. This embodiment makes it possible to construct transparent shading systems. In other words, the screen of the system may be a transparent screen.

(27) FIGS. 2A-2D illustrate how two images may be understood to be different from each other. FIG. 2A shows a first image 20 and FIG. 2C shows a different, second image 26. The first and second image comprise image pixels 22, 28, respectively, i.e. the rectangular areas in the images. Each image pixel may be understood to be a data structure indicating a relative position in an image as well as one or more intensities for that relative position, e.g. a greyscale value or RGB values. In FIGS. 2A and 2C, pixels 22_1 and 28_1 may be understood to have the same relative position within the image, being the sixth pixel from the top and the first pixel from the left of the images. Likewise, the pixels 22_2 and 28_2 also have the same relative position.

(28) The first image 20 and the second image 26 are both greyscale images in the sense that each image pixel is associated with a single brightness value indicating the brightness for the image pixel in question. Herein, white image pixels may be understood to have a relatively high brightness, black image pixels a relatively low brightness and grey image pixels an intermediate brightness.

(29) FIG. 2B shows the image elements A-P for the first image and FIG. 2D shows the image elements A-C, E-J and L-R for the second image. As is clear from these figures, image elements are formed by adjacent image pixels having the same or similar pixel values. To illustrate, image element A in first image 20 (and also image element A in second image 26) is formed by four adjacent image pixels (the four adjacent image pixels being illustrated in FIGS. 2A and 2C) that all have the same value, namely the relatively high brightness value. The same holds for image elements D, K, P in first image 20. Also, image element B in first image 20 (and also image element B in second image 26) is formed by four adjacent image pixels all having the intermediate brightness value. The same holds for image elements E, G, I, L, N.

(30) Each image element shown in FIGS. 2B and 2D has a respective size and shape and relative position within the image in question. To illustrate, image element C has a size of 2 pixels in width and 2 pixels in height, a relative position of (0,8), and a rectangular shape. Note that the position of image element C is indicated by the coordinates of the top left of the image element, wherein “0” indicates that the top left sits at zero x-displacement (horizontal displacement in the figure) from the left side of the image and “8” indicates a displacement of eight times a pixel height in the y-direction (the vertical direction in the figure) from the bottom of the image.

(31) Image element C in the first image 20 has a corresponding image element in the second image 26, namely image element C in the second image 26. Image element C in the second image 26 namely also has a size of 2 pixels in width and 2 pixels in height, a relative position of (0,8), and a rectangular shape.

(32) However, the second image comprises at least one image element, in the illustrated embodiment image elements Q, R, S and T, each of which does not have a corresponding image element in the first image 20. To illustrate, image element S in the second image has a different shape than image element D in the first image. Image element S namely has a hole in the middle where image element Q is present, whereas image element D does not comprise such a hole. The same applies, mutatis mutandis, to image elements T and K. Hence, the second image (FIGS. 2C, 2D) may be understood to be different from the first image (FIGS. 2A, 2B).

(33) In FIGS. 2B and 2D, the identified image elements are formed by adjacent pixels having equal image pixel values. However, they may also be formed by adjacent pixels having similar pixel values in the sense that all image pixels in an image element have an image pixel value within a certain range. It should be appreciated that multiple, preferably non-overlapping, ranges may be used for identifying the image elements. Further, it should be appreciated that the ranges used for identifying image elements in the first image may be different from the ranges used for identifying image elements in the second image.
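The grouping of adjacent image pixels with the same or similar values into image elements, as described in the preceding paragraphs, amounts to connected-component labeling. A minimal sketch follows; the function name, the 4-connectivity choice, and comparing each pixel against the element's seed value are illustrative assumptions:

```python
from collections import deque


def image_elements(pixels, tolerance=0):
    """Group adjacent (4-connected) image pixels whose values differ by at
    most `tolerance` from the element's seed pixel into image elements.
    Returns a list of sets of (row, col) coordinates, one set per element.
    With tolerance=0, elements are formed by equal pixel values only."""
    rows, cols = len(pixels), len(pixels[0])
    seen = [[False] * cols for _ in range(rows)]
    elements = []
    for r in range(rows):
        for c in range(cols):
            if seen[r][c]:
                continue
            seed = pixels[r][c]
            queue, element = deque([(r, c)]), set()
            seen[r][c] = True
            while queue:  # breadth-first flood fill from the seed pixel
                y, x = queue.popleft()
                element.add((y, x))
                for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    ny, nx = y + dy, x + dx
                    if (0 <= ny < rows and 0 <= nx < cols
                            and not seen[ny][nx]
                            and abs(pixels[ny][nx] - seed) <= tolerance):
                        seen[ny][nx] = True
                        queue.append((ny, nx))
            elements.append(element)
    return elements
```

A nonzero `tolerance` corresponds to the "similar pixel values within a certain range" reading in paragraph (33), while `tolerance=0` corresponds to equal pixel values as in FIGS. 2B and 2D.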

(34) The control system may thus be configured to determine that the first image 20 is to be presented on the screen based on a first light intensity value and that the second image 26 is to be presented based on a second light intensity value. Note that the second image is on average darker than the first image because the average brightness value of the image pixels of the second image is lower than the average brightness value of the image pixels of the first image. Thus, the second image enables the shading system to better perform a shading function, which would be appropriate if high light intensities are incident on the screen.

(35) FIGS. 3A-3D illustrate an example in which two images may be understood as being the same, in particular as being different versions of the same image. FIG. 3A shows a first image and FIG. 3C a second image. The second image is a darker version and the first image a brighter version. FIG. 3B shows the image elements of the image in FIG. 3A and FIG. 3D shows the image elements of the image in FIG. 3C. Clearly, each image element A, B, C in the second image has a corresponding image element of the same size, relative position and shape in the first image. Therefore, the first and second image depicted in FIG. 3A and FIG. 3C may be understood to be the same image.
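The sameness criterion used here, every image element in one image having a counterpart of identical size, shape and relative position in the other, can be expressed compactly if each element is represented by its set of pixel coordinates, since that set fully determines size, shape and relative position. The representation and the function name are illustrative assumptions:

```python
def same_image(elements_a, elements_b) -> bool:
    """Two images count as (versions of) the same image when their image
    elements match one-to-one in size, shape and relative position.
    Each element is given as a set of (row, col) pixel coordinates; equal
    coordinate sets imply equal size, shape and position, irrespective of
    brightness."""
    return ({frozenset(e) for e in elements_a}
            == {frozenset(e) for e in elements_b})
```

Under this sketch, the darker and brighter versions in FIGS. 3A and 3C compare as the same image, because brightness plays no role in the comparison.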

(36) FIG. 4 illustrates an, optionally computer-implemented, method for determining images that are to be presented on a side of a screen in dependence of light intensity incident on another side of the screen, the screen comprising a plurality of light valves, each light valve having an adjustable translucency so that the screen can present an image on one side of the screen. The method comprises a step 30 of obtaining, e.g. receiving, a light intensity value indicative of a light intensity incident on the other side of the screen. The light intensity value may be received from a light intensity sensor described herein and/or may be determined based on weather forecasts. The method further comprises a step 32 of determining, based on the light intensity value, what image is to be presented on the one side of the screen. This step may comprise determining a set of image pixel values comprising, for each light valve of the screen, an image pixel value, the set of image pixel values representing the image.

(37) Then, once it has been determined what image is going to be presented, the dynamic shading system may determine translucency values for the respective light valves of the system so that indeed the determined image is properly rendered on the screen. Such translucency values may then be used to control the translucency of the light valves. Determining such translucency values may be a relatively straightforward conversion from image pixel values to translucency values. However, as explained with reference to FIG. 5, such conversion may also take into account incident light intensity. This may be performed by determining the translucency values based on the light intensity value used for determining what image is to be presented. Alternatively, converting image pixel values into translucency values may also be based on a light intensity that is incident on the screen at a later time, e.g. at a time when the determined image is actually going to be rendered. Hence, the method comprises an optional step of obtaining a further light intensity value indicative of light intensity incident on the other side of the screen when the plurality of the light valves will form the determined image on the one side of the screen.

(38) The method also comprises a step 34 of controlling each light valve of the screen to have a translucency so that the plurality of the light valves forms the determined image on the one side of the screen. This step may of course be performed based on the translucency values that are optionally determined for the light valves.

(39) Preferably, the method is performed repeatedly so that the images that are presented on the screen remain appropriate for the current incident light intensity, as indicated by the arrow from step 34 to step 30. Such repeatedly determined images may be still images of a movie. Hence, the method can be used for generating a movie based on light intensity values indicative of respective light intensities incident on the screen at respective times.
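The repeated sequence of steps 30, 32 and 34 of FIG. 4 can be sketched as a control loop. The loop takes the sensing, image-determination and valve-driving operations as injected callables so the sketch stays hardware-independent; all names are assumptions for illustration:

```python
def shading_loop(read_intensity, determine_image, set_translucencies, frames):
    """Run the FIG. 4 method repeatedly: obtain a light intensity value
    (step 30), determine the to-be-presented image (step 32), and control
    the light valves accordingly (step 34), once per frame.
    Returns the list of images determined, i.e. the stills of a movie."""
    shown = []
    for _ in range(frames):
        intensity = read_intensity()          # step 30: obtain intensity
        image = determine_image(intensity)    # step 32: determine image
        set_translucencies(image, intensity)  # step 34: drive the valves
        shown.append(image)
    return shown
```

Each pass corresponds to the arrow from step 34 back to step 30, and the collected images form the movie mentioned in paragraph (39).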

(40) FIG. 5 shows a table that illustrates a particular method for determining a translucency value for each light valve.

(41) The top row shows light intensity values 1-10 (arbitrary units). Higher light intensity values are indicative of higher light intensities being incident on the screen 4. The leftmost column shows greyscale values 0-15. In this example, a greyscale value of 0/15 is associated with black, whereas a greyscale value of 15/15 is associated with white. Each combination of light intensity value and greyscale value corresponds with a translucency value. In this example, each translucency value is expressed as a percentage of incident light that passes through the light valve.

(42) Thus, if a to-be-presented image has a particular image pixel associated with a particular light valve, wherein the particular image pixel has a value of 10/15, and the light intensity value is 5/10, then the translucency value for the particular light valve will be 60%. Note that the light intensity value used for this determination may be the same light intensity value used for determining what image is to be presented, or a light intensity value indicative of the incident light intensity at a later time.
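The table lookup of FIG. 5 can be sketched as a dictionary keyed by (greyscale value, light intensity value). Only the single cell stated in the text (greyscale 10/15 at intensity 5/10 yields 60%) is taken from the description; the dictionary structure and names are assumptions:

```python
# One known cell from the description; the full FIG. 5 table would hold an
# entry for every (greyscale 0-15, intensity 1-10) combination.
TRANSLUCENCY_TABLE = {
    (10, 5): 60,  # greyscale 10/15 at light intensity 5/10 -> 60 % passed
}


def lookup_translucency(greyscale: int, intensity: int) -> int:
    """Return the stored translucency percentage for a (greyscale,
    intensity) combination, as the control system would look it up."""
    return TRANSLUCENCY_TABLE[(greyscale, intensity)]
```

Storing the table and looking values up, rather than computing them, matches the preproduction approach described in paragraph (44).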

(43) It should be appreciated that determining a translucency value may comprise simply converting a greyscale value to a translucency value without taking into account any light intensity value.

(44) In an embodiment, the control system 8 is configured to perform the step of storing transformation data associating combinations of light intensity value and greyscale value with translucency values. The control system may further be configured to perform the step of transforming the image pixel values into translucency values based on said transformation data. In one example, the control system may thus have stored the table as depicted in FIG. 5. Advantageously, the control system then may only have to look up an associated translucency value given a light intensity value and given an image pixel value. In an example, during a preproduction process, for each light valve and for every time instant in a movie, a table such as that depicted in FIG. 5 is determined, and these tables are stored by the control system for use during the display of the movie.

(45) Said transformation data may comprise one or more predefined mathematical operations, and in one embodiment the control system is configured to perform the step of calculating for each image pixel value the associated translucency value using the one or more predefined (mathematical) operations.
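As an alternative to the stored table, a predefined mathematical operation could compute the translucency directly. The particular formula below, scaling the greyscale fraction down as incident intensity rises, is purely an illustrative assumption and is not taken from the disclosure:

```python
def translucency_from_formula(greyscale: int, intensity: int,
                              max_grey: int = 15,
                              max_intensity: int = 10) -> int:
    """Illustrative predefined mathematical operation: translucency (in %)
    falls as ambient intensity rises, so the same pixel value blocks more
    light under brighter conditions, consistent with the shading function."""
    grey_fraction = greyscale / max_grey              # 0.0 (black) .. 1.0 (white)
    dimming = 1.0 - 0.5 * (intensity / max_intensity)  # 1.0 .. 0.5
    return round(100 * grey_fraction * dimming)
```

A calculated mapping like this trades the storage cost of per-valve, per-instant tables for a small amount of computation per frame.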

(46) FIG. 6A shows a movie A that, in an embodiment, may be determined to be presented on the one side of the screen based on light intensity incident on the other side of the screen. FIG. 6A depicts four still images (which may also be referred to as frames or video frames) of movie A, namely one still image for each of the playtimes t1, t2, t3 and t4. In this example, the movie depicts branches with leaves that develop and grow as the movie proceeds. Indeed, at playtimes t3 and t4, more branches and leaves are depicted than at playtime t2. Further, the still images for t3 and t4 may be understood to be different versions of the same image. In the image for t4, some of the leaves are darker than the corresponding leaves in the image for t3. Hence, the average brightness value for the image for t4 is lower than the average brightness value for the image for t3. Thus, likely, when the image for t4 is presented on the screen, more of the incident light intensity will be blocked by the screen.

(47) Determining a movie that is to be presented on the basis of incident light intensity may comprise generating the movie based on a light intensity value indicative of light intensity incident on the other side of the screen. Thus, the determined movie need not be prestored on some computer-readable storage medium but may be generated on the fly. The four still images for t1, t2, t3 and t4 may all be determined based on the same light intensity value.

(48) Determining a movie may be understood to comprise determining a plurality of images. Each of these images may be determined in accordance with the methods described herein. Thus, in contrast to determining the entire movie A based on a single light intensity value as per above, it may also be that the still image for t1 for movie A is determined based on a first light intensity value, the still image for t2 for movie A based on a second light intensity value, the still image for t3 for movie A based on a third light intensity value, and the still image for t4 for movie A based on a fourth light intensity value. These first, second, third and fourth light intensity values may all be different, although this need not be the case.
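The per-frame determination of paragraph (48) can be sketched as below. The frame representation, the function names, and the rule mapping light intensity to a target brightness are all assumptions for illustration; the disclosure only requires that each still image may be determined from its own light intensity value.

```python
def determine_frame(light_intensity_value: int, playtime: int) -> dict:
    """Determine one still image based on the light intensity value
    obtained around the time the frame is generated.
    Hypothetical rule: higher incident light -> lower target brightness,
    consistent with the shading behaviour described in the text."""
    target_brightness = max(0, 15 - light_intensity_value)  # greyscale 0..15 scale
    return {"playtime": playtime, "target_brightness": target_brightness}

def determine_movie(light_intensity_values: list[int]) -> list[dict]:
    """Determine a movie frame by frame; the per-frame light intensity
    values may all differ, but need not (paragraph (48))."""
    return [determine_frame(v, t)
            for t, v in enumerate(light_intensity_values, start=1)]

# Four frames for playtimes t1..t4, each from its own light intensity value.
movie_a = determine_movie([3, 3, 7, 9])
```

Generating frames one at a time in this way is what allows the movie to be produced on the fly, as noted in paragraph (47), rather than prestored.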

(49) FIG. 6B illustrates a second movie that may be determined to be presented on the one side of the screen based on another light intensity, as e.g. indicated by another light intensity value. In movie B of FIG. 6B, the still image for playtime t1 is identical to the still image for playtime t1 for movie A. However, the still image for t2 in movie B is different from the still image for playtime t2 for movie A. In fact, in this example, the still image for t2 in movie B is identical to the still image for t3 in movie A. Thus, in movie B, the branches with leaves are depicted to develop and grow faster than in movie A. Movie B may be a sped-up version of movie A. These two movies may be understood to be different because the two still images for the same playtime t2 are different. Namely, the still image for t2 in movie B has an image element A, in this example depicting a leaf, that does not have a corresponding image element in the still image for t2 in movie A. Hence, these images are different and the movies may be understood to be different.

(50) Movie A (and thus each of the images for t1, t2, t3 and t4) may be determined to be presented based on a first light intensity value and movie B (and thus the images for t1 and t2) may be determined to be presented based on a second light intensity value. If the second light intensity as indicated by the second light intensity value is higher than the first light intensity as indicated by the first light intensity value, then, on average, the second movie may be darker than the first movie. The second movie being darker than the first movie may be understood as meaning that an average brightness value of the second movie is lower than an average brightness value of the first movie. Determining an average brightness value for a movie may be performed by determining an average of all average brightness values of the respective still images of the movie in question. An average brightness value for a still image may be determined in accordance with methods described above.
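The averaging described in paragraph (50) can be sketched as follows. Frames are modelled here as flat lists of greyscale pixel values on the 0-15 scale; that representation, and the two example movies, are assumptions for illustration.

```python
def frame_brightness(frame: list[int]) -> float:
    """Average brightness value of one still image: the mean of its
    greyscale pixel values (0 = black, 15 = white)."""
    return sum(frame) / len(frame)

def movie_brightness(frames: list[list[int]]) -> float:
    """Average brightness value of a movie: the mean of the average
    brightness values of its still images, per paragraph (50)."""
    return sum(frame_brightness(f) for f in frames) / len(frames)

# Two hypothetical two-frame movies with three pixels per frame.
bright_movie = [[12, 14, 13], [11, 15, 13]]
dark_movie = [[2, 4, 3], [1, 5, 3]]

# Per paragraph (50), the movie determined for the higher incident light
# intensity would be the darker one, i.e. the one with the lower average.
assert movie_brightness(dark_movie) < movie_brightness(bright_movie)
```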

(51) It should be appreciated that movement and/or growth of image elements as depicted in a movie may also be determined based on incident light intensity on the screen. To illustrate, if the light intensity incident on the other side of the screen is relatively low, then growth of image elements or growth of the number of image elements may be limited, whereas if the light intensity incident on the screen is relatively high, then growth of image elements or growth of the number of image elements may be significant. Likewise, if the light intensity incident on the screen is relatively low, then image elements may move significantly in order to create more open, bright regions in the still images of the movie, thereby increasing the average brightness value for the movie.
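One minimal way to realize the light-dependent growth of paragraph (51) is a mapping from the light intensity value to the number of new image elements added per frame. The linear mapping and its parameters below are assumptions, not taken from the disclosure.

```python
def growth_per_frame(light_intensity_value: int, max_new_elements: int = 5) -> int:
    """Hypothetical growth rule: number of new image elements (e.g. leaves)
    added per frame, limited under low incident light and significant under
    high incident light. light_intensity_value uses the 1-10 scale from
    the description of FIG. 5."""
    return round((light_intensity_value / 10) * max_new_elements)
```

A movie generator could call this once per frame with the then-current light intensity value, so that the depicted foliage thickens exactly when more shading is needed.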

(52) FIGS. 6C, 6D and 6E each show a first image and a second image that may be determined in accordance with the methods described herein. Again, each pair of first and second images may be still images for playtimes t1 and t2, respectively, of a movie. FIGS. 6C, 6D and 6E illustrate that a movie may be generated such that the average brightness values of the still images decrease over time, which would be appropriate if the light intensity incident on the screen were to increase. Note that in FIGS. 6C, 6D and 6E the still images for t2 have a lower average brightness value than the still images for t1, for example due to thicker lines, more and/or darker image elements, et cetera.

(53) FIG. 7 is an artist impression of the screen of the dynamic shading system presenting an image of plants on the one side of the screen.

(54) FIG. 8 depicts a block diagram illustrating a data processing system according to an embodiment.

(55) As shown in FIG. 8, the data processing system 100 may include at least one processor 102 coupled to memory elements 104 through a system bus 106. As such, the data processing system may store program code within the memory elements 104. Further, the processor 102 may execute the program code accessed from the memory elements 104 via the system bus 106. In one aspect, the data processing system may be implemented as a computer that is suitable for storing and/or executing program code. It should be appreciated, however, that the data processing system 100 may be implemented in the form of any system including a processor and a memory that is capable of performing the functions described within this specification.

(56) The memory elements 104 may include one or more physical memory devices such as, for example, local memory 108 and one or more bulk storage devices 110. The local memory may refer to random access memory or other non-persistent memory device(s) generally used during actual execution of the program code. A bulk storage device may be implemented as a hard drive or other persistent data storage device. The processing system 100 may also include one or more cache memories (not shown) that provide temporary storage of at least some program code in order to reduce the number of times program code must be retrieved from the bulk storage device 110 during execution.

(57) Input/output (I/O) devices depicted as an input device 112 and an output device 114 optionally can be coupled to the data processing system. Examples of input devices may include, but are not limited to, a keyboard, a pointing device such as a mouse, a touch-sensitive display, a light intensity sensor as described herein, or the like. Examples of output devices may include, but are not limited to, a monitor or a display, speakers, a screen and in particular the light valves described herein, or the like. Input and/or output devices may be coupled to the data processing system either directly or through intervening I/O controllers.

(58) In an embodiment, the input and the output devices may be implemented as a combined input/output device (illustrated in FIG. 8 with a dashed line surrounding the input device 112 and the output device 114). An example of such a combined device is a touch sensitive display, also sometimes referred to as a “touch screen display” or simply “touch screen”. In such an embodiment, input to the device may be provided by a movement of a physical object, such as e.g. a stylus or a finger of a user, on or near the touch screen display.

(59) A network adapter 116 may also be coupled to the data processing system to enable it to become coupled to other systems, computer systems, remote network devices, and/or remote storage devices through intervening private or public networks. The network adapter may comprise a data receiver for receiving data that is transmitted by said systems, devices and/or networks to the data processing system 100, and a data transmitter for transmitting data from the data processing system 100 to said systems, devices and/or networks. Modems, cable modems, and Ethernet cards are examples of different types of network adapter that may be used with the data processing system 100.

(60) As pictured in FIG. 8, the memory elements 104 may store an application 118. In various embodiments, the application 118 may be stored in the local memory 108, the one or more bulk storage devices 110, or apart from the local memory and the bulk storage devices. It should be appreciated that the data processing system 100 may further execute an operating system (not shown in FIG. 8) that can facilitate execution of the application 118. The application 118, being implemented in the form of executable program code, can be executed by the data processing system 100, e.g., by the processor 102. Responsive to executing the application, the data processing system 100 may be configured to perform one or more operations or method steps described herein.

(61) In one aspect of the present invention, the data processing system 100 may represent a control system as described herein.

(62) Various embodiments of the invention may be implemented as a program product for use with a computer system, where the program(s) of the program product define functions of the embodiments (including the methods described herein). In one embodiment, the program(s) can be contained on a variety of non-transitory computer-readable storage media, where, as used herein, the expression “non-transitory computer readable storage media” comprises all computer-readable media, with the sole exception being a transitory, propagating signal. In another embodiment, the program(s) can be contained on a variety of transitory computer-readable storage media. Illustrative computer-readable storage media include, but are not limited to: (i) non-writable storage media (e.g., read-only memory devices within a computer such as CD-ROM disks readable by a CD-ROM drive, ROM chips or any type of solid-state non-volatile semiconductor memory) on which information is permanently stored; and (ii) writable storage media (e.g., flash memory, floppy disks within a diskette drive or hard-disk drive or any type of solid-state random-access semiconductor memory) on which alterable information is stored. The computer program may be run on the processor 102 described herein.

(63) The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used herein, the singular forms “a,” “an,” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms “comprises” and/or “comprising,” when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.

(64) The corresponding structures, materials, acts, and equivalents of all means or step plus function elements in the claims below are intended to include any structure, material, or act for performing the function in combination with other claimed elements as specifically claimed. The description of embodiments of the present invention has been presented for purposes of illustration, but is not intended to be exhaustive or limited to the implementations in the form disclosed. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the present invention. The embodiments were chosen and described in order to best explain the principles and some practical applications of the present invention, and to enable others of ordinary skill in the art to understand the present invention for various embodiments with various modifications as are suited to the particular use contemplated.