SHADING AND ILLUMINATION SYSTEM
20220128206 · 2022-04-28
Inventors
CPC classification
F21S11/007
MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
H05B47/11
ELECTRICITY
F21V23/0478
MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
F21V11/04
MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
Y02B20/40
GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
Y02B80/00
GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
E06B9/24
FIXED CONSTRUCTIONS
Y02A30/24
GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
International classification
F21S11/00
MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
F21V11/04
MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
Abstract
A shading and illumination system includes a shading device for shading viewing openings, an illumination device for illuminating a room, an external sensor for detecting an external parameter acting on the room, an internal sensor for detecting a 3D image of the room, a position of a person present in the room in the 3D image, and a viewing direction of the person, and a control unit for actuating the shading device and the illumination device. The shading device and the illumination device are actuatable depending on the values measured by the external sensor and by the internal sensor. A light parameter acting on the person is determinable depending on the detected viewing direction, on the detected position, on the 3D image of the room, and on the external parameter. The shading device and/or the illumination device are/is actuatable depending on the light parameter acting on the person.
Claims
1. Shading and illumination system for a room in a building, comprising: at least one shading device for shading viewing openings, in particular windows, of the building, at least one illumination device for illuminating the room, at least one external sensor for detecting at least one external parameter (Pout) acting on the room from outside, at least one internal sensor for detecting a 3D image of the room, at least one position of a person present in the room in this 3D image and a viewing direction of this at least one person and a control unit for actuating the at least one shading device and the at least one illumination device, wherein the at least one shading device and the at least one illumination device are actuatable by the control unit depending on the values measured by the external sensor and by the internal sensor, wherein a light parameter acting on the person, in particular on an eye of the person, is determinable depending on the detected viewing direction and on the detected position and on the 3D image of the room and depending on the at least one external parameter, wherein the at least one shading device and/or the at least one illumination device are/is actuatable depending on this light parameter acting on the person (P).
2. The shading and illumination system according to claim 1, wherein the at least one external parameter is based on an irradiance and/or an illuminance and/or a radiation density and/or an external temperature.
3. The shading and illumination system according to claim 2, wherein the at least one external parameter represents values measured and/or calculated by the external sensor and/or conditions based on the irradiance and/or the illuminance and/or the radiation density and/or the external temperature.
4. The shading and illumination system according to claim 1, wherein the at least one external sensor is a pyranometer.
5. The shading and illumination system according to claim 1, wherein the at least one internal sensor for detecting the 3D image is a ToF camera.
6. The shading and illumination system according to claim 1, wherein several internal sensors for detecting several persons are arranged in the room.
7. The shading and illumination system according to claim 1, wherein the 3D image contains surfaces arranged in a coordinate system, wherein reflectances of these surfaces of the room are determinable via the at least one internal sensor, preferably by estimation or calculation.
8. The shading and illumination system according to claim 1, wherein the light parameter corresponds to an illumination parameter, wherein this illumination parameter is defined by the vertical illuminance.
9. The shading and illumination system according to claim 1, wherein the light parameter corresponds to a glare parameter, wherein this glare parameter is defined on the basis of a daylight glare probability or an approximation method of glare metrics.
10. The shading and illumination system according to claim 1, wherein the shading device is formed as a mechanical shading element, e.g. as a venetian blind, as a curtain, as an awning or as a roller shutter, and/or as intelligent glass.
11. The shading and illumination system according to claim 1, wherein the shading device has several shading panels, preferably arranged in the manner of a matrix, wherein at least two, preferably at least four, shading increments can be set for each shading panel.
12. The shading and illumination system according to claim 1, wherein the illumination device has a lighting means, for example a light-emitting diode, a support, which is preferably integrated, mounted, suspended or standalone, and a power supply means.
13. The shading and illumination system according to claim 1, wherein the control unit is formed as a switch cabinet with an arithmetic unit and modules connectable to the arithmetic unit for the illumination device and for the shading device.
14. A building automation system for monitoring, controlling and adjusting facilities in a building with the shading and illumination system according to claim 1.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0092] Further details and advantages of the present invention are explained in more detail below with reference to the embodiments represented in the drawings, in which:
DETAILED DESCRIPTION OF THE INVENTION
[0101] The majority of these viewing openings 5 are provided with a shading device 4. It is indicated schematically that a shading device 4 in the form of a lamellar curtain (e.g. a venetian blind) is lowered.
[0102] At least one room 2 (indicated by the dashed lines) is formed in the building 3. An (artificial) illumination device 6 is arranged in this room 2.
[0103] In addition, at least one internal sensor 8 is arranged in this room 2.
[0104] Outside the room 2 (e.g. on the external façade 11) there is an external sensor 7 for detecting at least one external parameter P.sub.out acting on the room 2 from outside.
[0105] Furthermore, a control unit 9 is provided for actuating the at least one shading device 4 and the at least one illumination device 6.
[0106] The shading device 4, the illumination device 6, the external sensor 7, the internal sensor 8 and the control unit 9 together form the shading and illumination system 1 for the room 2 of the building 3.
[0108] A room 2 in a building 3 is represented schematically.
[0109] Furniture 12 (e.g. a desk) is arranged in the room 2. A person P sits at this furniture 12 formed as a desk.
[0110] A monitor 13 is arranged on the desk. This monitor 13 can act as a user interface for the control unit 9.
[0111] An internal sensor 8 is arranged on the ceiling of the room 2. This internal sensor 8 is configured to detect a 3D image 3D of the room 2, at least one position x, y, z of the person P located in the room 2 in this 3D image 3D and a viewing direction x.sub.view, y.sub.view, z.sub.view of this at least one person P.
[0112] In addition, an illumination device 6 is located on the ceiling of the room 2, wherein two lamps of this illumination device 6 are represented in this case.
[0113] Furthermore, a further sensor 14 is arranged on the ceiling of the room 2. The values detected by this sensor 14 are used to regulate the luminous flux toward target illuminances. This type of sensor is generally termed a look-down sensor. This sensor 14 can also be part of the internal sensor 8.
[0114] Optical properties (e.g. reflectances) of surfaces (furniture surface 15 and floor surface 16) can also be detected by the internal sensor 8.
[0115] Outside the room 2, the sun 17 is indicated schematically. The position of the sun 17 can be read from a world coordinate system and is set in relation to the room 2. Sun rays S1, S2 and S3 penetrate the room 2 through the viewing opening 5 and fall on the person P. The sun ray S1 falls directly on the face of the person P and results in glare. The sun ray S2 is reflected via the furniture surface 15 onto the person P. The sun ray S3 results in glare for the person P via a reflection on the floor surface 16.
[0116] An external sensor 7 is arranged on the outside of the room 2. The control unit 9 is indicated schematically in the room 2. The individual components of the shading and illumination system 1 have a signalling connection to one another.
[0117] Starting from the detected viewing direction x.sub.view, y.sub.view, z.sub.view and from the detected position x, y, z of the person P and from the 3D image (reference sign 3D) of the room 2 and depending on the at least one external parameter P.sub.out, the control unit 9 is configured to determine a light parameter L acting on the person P, in particular on the face of the person P. This light parameter L is then used by the control unit 9 again as a basis for the actuation of the at least one shading device 4 and/or of the at least one illumination device 6.
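How such a light parameter L might be derived can be sketched in code. The following is a minimal illustration under stated assumptions, not the patented method: it approximates only the direct component of the vertical illuminance at the eye from the detected viewing direction and the direction of the incoming sun ray; the function name and the values are illustrative.

```python
import numpy as np

def vertical_eye_illuminance(sun_direction, view_direction, direct_lux):
    """Approximate the direct part of the vertical illuminance at the eye.

    sun_direction: unit vector along the travel of the sun ray (e.g. S1);
    view_direction: the detected viewing direction x_view, y_view, z_view;
    direct_lux: illuminance on a plane normal to the sun ray.
    """
    sun = np.asarray(sun_direction, dtype=float)
    view = np.asarray(view_direction, dtype=float)
    sun /= np.linalg.norm(sun)
    view /= np.linalg.norm(view)
    # Cosine law: a ray travelling against the viewing direction hits the
    # face frontally; rays arriving from behind contribute nothing.
    cos_incidence = -float(np.dot(view, sun))
    return max(0.0, cos_incidence) * direct_lux

# Frontal glare as with sun ray S1: person faces -x, ray travels toward +x.
L_direct = vertical_eye_illuminance([1.0, 0.0, 0.0], [-1.0, 0.0, 0.0], 80000.0)
```

Reflected contributions such as the rays S2 and S3 would additionally require the surface reflectances detected by the internal sensor 8.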
[0118] In the embodiment example represented, the shading device 4 has several shading panels 4.1 to 4.20.
[0119] Each shading panel 4.1 to 4.20 can assume several—in this case six—different shading settings A, B, C, D, E and F. In the shading setting A, there is no dimming or tinting of the corresponding panel. In the shading setting F, there is maximum dimming of the corresponding panel. The shading settings B, C, D and E form different shading stages (e.g. 20%, 40%, 60% and 80%) of maximum dimming.
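The six shading increments can be modelled as a simple lookup table; a minimal sketch, assuming the example stages of 20%, 40%, 60% and 80% named above (the names are illustrative):

```python
# Shading increments A (no dimming) to F (maximum dimming) mapped to
# dimming fractions, using the example stages from the description.
SHADING_INCREMENTS = {"A": 0.0, "B": 0.2, "C": 0.4, "D": 0.6, "E": 0.8, "F": 1.0}

def transmitted_fraction(increment):
    """Fraction of incident light a panel at this increment still passes."""
    return 1.0 - SHADING_INCREMENTS[increment]
```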
[0120] If the shading panels 4.1 to 4.20 are part of an intelligent glass pane—as depicted—it is possible to set them in 1% increments or finer shading increments (with 1% increments there are 101 shading settings).
[0121] However, the shading panels 4.1 to 4.n can also be produced differently from the representation shown.
[0123] Of course, other settings can also be made here according to individual wishes.
[0124] Such shadings and settings are not limited to intelligent glass. The same approach also works with shading devices 4 in the form of venetian blinds, roller shutters, etc.
[0125] However, not only the shading device 4 can be actuated in this way depending on the detected light parameter L; the illumination device 6 can also be actuated correspondingly. Thus, in the event of sun rays S1 falling frontally on the face, complete dimming via the shading device 4 can be necessary, in which case the illumination device 6 must then provide correspondingly more artificial light.
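A hedged sketch of this compensation logic, assuming a simple target-illuminance rule (the setpoints and the function name are illustrative, not taken from the patent):

```python
def artificial_light_level(daylight_lux, target_lux=500.0, max_lux=600.0):
    """Raise the illumination device just enough to reach the target
    illuminance once shading has reduced the daylight share; the output
    is clamped to the luminaire's maximum."""
    return min(max_lux, max(0.0, target_lux - daylight_lux))
```

For example, after complete dimming against frontal sun rays S1 the remaining daylight share drops, and the artificial share rises accordingly.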
[0126] A further embodiment example of a shading and illumination system 1 for a room 2 of a building 3 is represented.
[0127] Thus, the shading panel 4.8 is completely dimmed with the shading increment F. Direct solar radiation onto the head of the person P is thereby prevented. At the same time, however, the shading panel 4.3 arranged above it is not dimmed at all (shading increment A). Furthermore, in these settings, the shading panels 4.12 to 4.14 are set at shading increment C, so that greater reflection via the sun ray S2 is permitted. The shading panels 4.4, 4.5, 4.9 and 4.10, however, are set to the shading increment E.
[0128] The field of view of the person P is represented schematically.
[0129] A matrix can be laid over this field of view.
[0130] This view matrix is represented in the drawings.
[0132] In the 3-phase method (Radiance), the transfer from a discrete area of the sky (sky matrix S) to a calculation point in the room is calculated via three matrices. The daylight matrix D describes in discrete grids how much of the discrete sky (sky matrix S) is seen from the façade point. The transmission matrix T in turn describes in a discretized manner how much light is transferred from a half-space area outside to another half-space area inside. The view matrix V describes in a discretized manner how much light from the interior half space reaches the calculation point.
[0133] The discretization can be effected in 145 areas. Such a grid is represented by way of example in the drawings.
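The 145 areas correspond to the Tregenza sky subdivision commonly used in Radiance-based workflows: seven bands above the horizon plus one zenith patch. A small sketch to make the patch counts concrete (the band counts are the standard Tregenza values; the index helper is an illustrative assumption):

```python
# Tregenza subdivision: patch counts per band from the horizon upward,
# plus one zenith patch, giving 145 discrete sky areas in total.
TREGENZA_BANDS = [30, 30, 24, 24, 18, 12, 6]
TOTAL_PATCHES = sum(TREGENZA_BANDS) + 1  # + 1 zenith patch

def patch_index(band, position):
    """Linear index of a sky patch from its band (0 = horizon band) and
    its position within that band."""
    return sum(TREGENZA_BANDS[:band]) + position
```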
[0134] A calculation formula for the illuminance is indicated by way of example below.
[0135] The illuminance E at the n calculation points can be calculated by means of matrix multiplication, E=V·T·D·S; it is thus a function f(V, T, D, S).
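The matrix multiplication can be sketched with placeholder matrices. The dimensions below are illustrative (n calculation points, discretized façade half spaces, 145 sky patches), and the random values merely stand in for real Radiance output:

```python
import numpy as np

rng = np.random.default_rng(0)
n, f_in, f_out, s = 4, 16, 16, 145

V = rng.random((n, f_in))      # view matrix: interior half space -> points
T = rng.random((f_in, f_out))  # transmission matrix: current facade setting
D = rng.random((f_out, s))     # daylight matrix: sky patches -> facade
S = rng.random((s, 1))         # sky matrix (vector of patch luminances)

E = V @ T @ D @ S              # illuminance at the n calculation points
```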
[0136] More precise expansions of the 3-phase method are available in the literature.
[0137] While the matrices V, D and S are constant within a time step, T can now be varied. The variation covers all possible settings of the shading and illumination system. For each setting, a parameter is calculated: for glare L.sub.i=f(V,T,D,S), for illuminance E.sub.i=f(V,T,D,S), or for the solar energy input Q, based on an external irradiance and the g-value (angle-dependent total energy transmittance). Depending on the target function, the optimal façade setting is selected and the shading device 4 and the illumination device 6 are finally actuated.
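A minimal sketch of this per-time-step optimization, assuming a set of candidate transmission matrices (uniform dimming stages) and the deviation of the mean illuminance from a setpoint as the target function; all names and values are illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)
n, f, s = 4, 16, 145
V, D = rng.random((n, f)), rng.random((f, s))
S = rng.random((s, 1))                      # fixed within the time step

# Candidate settings of T: uniform transmission from clear to fully dimmed.
candidate_T = [frac * np.eye(f) for frac in (1.0, 0.8, 0.6, 0.4, 0.2, 0.0)]
TARGET_LUX = 500.0

def cost(T):
    """Target function: deviation of the mean illuminance from a setpoint."""
    E = V @ T @ D @ S
    return abs(float(E.mean()) - TARGET_LUX)

best = min(range(len(candidate_T)), key=lambda i: cost(candidate_T[i]))
```

In the same loop, a glare metric L.sub.i or the solar input Q could replace the illuminance cost, depending on the chosen target function.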
[0138] The 3-phase method can be expanded to five or six phases by including further transfer properties. Other methods can, of course, also be used.
[0139] In the next time step, the sky state (sky matrix S) changes and the optimization is carried out afresh. The view matrix V may also have changed, which can again result in a change of the shading or the illumination. For example, if the desk area is dimmed too strongly, the illumination device 6 in the area of the desk can automatically be brightened.
LIST OF REFERENCE SIGNS
[0140] 1 Shading and illumination system
[0141] 2 Room
[0142] 3 Building
[0143] 4 Shading device
[0144] 4.1 to 4.n Shading panels
[0145] 5 Viewing openings
[0146] 6 Illumination device
[0147] 7 External sensor
[0148] 8 Internal sensor
[0149] 9 Control unit
[0150] 10 Building automation system
[0151] 11 External façade
[0152] 12 Furniture (desk)
[0153] 13 Monitor
[0154] 14 Sensor
[0155] 15 Furniture surface
[0156] 16 Floor surface
[0157] 17 Sun
[0158] P.sub.out External parameter
[0159] 3D 3D image
[0160] x, y, z Position of the person in the room
[0161] P Person
[0162] x.sub.view, y.sub.view, z.sub.view Viewing direction
[0163] L Light parameter
[0164] S1, S2, S3 Sun rays
[0165] A, B, C, D, E, F Shading increments
[0166] V View matrix
[0167] T Transmission matrix
[0168] D Daylight matrix
[0169] S Sky matrix