Luminaire network with sensors

11758635 · 2023-09-12

Abstract

The invention relates to a luminaire network, comprising a plurality of luminaires comprising a lighting apparatus, wherein a plurality of the luminaires comprise a communication unit configured to enable communication of data between said plurality of luminaires and/or with a central unit; a processing unit; a control unit configured to control the lighting apparatus as well as the communication and processing units and at least one first sensor configured to output first sensed data. The processing unit of the luminaire is configured to process the first sensed data to produce first processed data, and the luminaire network is further configured such that the first processed data of at least two luminaires is further processed to produce second processed data. The invention further relates to a method of processing sensor data in a luminaire network.

Claims

1. A luminaire network, comprising a plurality of luminaires comprising a lighting apparatus, wherein each of at least two luminaires of the plurality of the luminaires comprises: a communication unit configured to enable communication of data to and from communication units of other luminaires in the plurality of luminaires or to a central unit; a processing unit; a control unit configured to control the lighting apparatus as well as the communication and processing units; and at least one first sensor, configured to output first sensed data, wherein the processing unit is configured to process the first sensed data to produce first processed data, wherein the luminaire network is further configured such that the first processed data of said at least two luminaires is further processed to produce second processed data, and wherein an indication of quality of at least one of the first sensed data and the first processed data is taken into account to produce the second processed data.

2. The luminaire network according to claim 1, wherein the first processed data comprise a first value for a variable, wherein the second processed data comprise a second value for the variable, and wherein the processing of the first processed data to obtain second processed data is performed such that the accuracy of the second value is higher than that of the first value.

3. The luminaire network according to claim 1, wherein the processing units of the luminaires are further configured to determine an indication of estimated quality of the first sensed data or the first processed data, and wherein this estimated quality is taken into account to produce the second processed data.

4. The luminaire network according to claim 3, wherein the indication of quality or estimated quality of the first sensed data or the first processed data is based at least in part on: the first sensed data; at least one of: a dynamic range of a captured image, an exposure time of a camera, a white balance, an ISO-value, a noise level, a signal-to-noise ratio, a processing load, available memory, obstruction data, vibration data, weather data, light data, sunlight data, or another indicator of image quality; or characteristics of the respective first sensor.

5. The luminaire network according to claim 3, wherein, in the processing of the first processed data to produce second processed data, the indications of quality or estimated quality of the first sensed or the first processed data are used to determine averaging weights.

6. The luminaire network according to claim 1, wherein the at least one first sensor comprises a camera.

7. The luminaire network according to claim 1, wherein the at least one first sensor comprises at least one of: a microphone or other sound sensor, a photosensitive sensor, an accelerometer, a vibration sensor, a wind sensor, a thermometer, a heat or thermal sensor, an RF sensor, an electromagnetic sensor, a smoke sensor, a dust sensor, an air quality sensor, another type of environmental sensor, a radar or lidar based sensor, a visibility sensor, a humidity sensor, an IR sensor, a motion sensor, a sonic or ultrasonic sensor, a microwave sensor, a light sensor, and an astronomical clock.

8. The luminaire network according to claim 1, further comprising the central unit comprising a central processing unit and a central communication unit, wherein the communication units of the plurality of luminaires are further configured to enable communication of data between the central communication unit and the respective luminaires, and wherein the central processing unit is configured to perform at least part of the processing of the first processed data to produce second processed data.

9. The luminaire network according to claim 8, wherein the central processing unit is configured to have access to information about the location of the luminaires or to information about expected correlations between the first processed data of the plurality of luminaires.

10. The luminaire network according to claim 1, wherein at least a subset of the processing units of the luminaires are configured to perform at least part of the processing of the first processed data to produce second processed data through distributed computing.

11. The luminaire network according to claim 10, wherein the network is configured such that the assignment of processing to at least one processing unit of a luminaire through distributed computing reduces an amount of data transmission.

12. The luminaire network according to claim 1, further configured such that the first processed data includes data representative of a location of the associated luminaire.

13. The luminaire network according to claim 1, wherein the plurality of luminaires comprises at least one secondary sensor configured to output second sensed data.

14. The luminaire network according to claim 13, wherein the indication of the quality or the estimated quality of the first sensed data or the first processed data is based at least in part on the second sensed data.

15. The luminaire network according to claim 13, wherein the processing unit is configured to process the first sensed data to produce first processed data using the second sensed data, or wherein the at least one secondary sensor comprises an accelerometer, a vibration sensor, a wind sensor, a thermometer, a heat or thermal sensor, a humidity sensor, an environmental sensor, a microphone or other sound sensor, an air quality sensor, a smoke sensor, a dust sensor, an RF sensor, a photosensitive sensor, a visibility sensor, a camera, an IR sensor, a light sensor, an astronomical clock, a radar-based sensor, a lidar-based sensor, a motion sensor, a sonic sensor, an ultrasonic sensor, or a microwave sensor.

16. The luminaire network according to claim 1, wherein the processing of the first processed data and the indication of a quality or an estimated quality of the first sensed data or the first processed data to produce second processed data comprises averaging the first processed data.

17. The luminaire network according to claim 1, wherein first processed data that is found to be an outlier with respect to the first processed data of a plurality of luminaires of which the first processed data is expected to be correlated is disregarded in producing the second processed data, or wherein (i) the communication unit of the luminaires is configured to receive the second processed data and (ii) the control unit is configured to control the lighting apparatus or the processing unit based on the second processed data, or wherein (i) the at least one first sensor comprises an electromagnetic sensor and (ii) the second processed data comprises information about the presence, estimated position, and identifying information of an entity emitting electromagnetic radiation.

18. A method of processing sensor data in a luminaire network comprising a plurality of luminaires, comprising: sensing data at each of at least two luminaires of said plurality of luminaires to produce first sensed data for each of said at least two luminaires; processing each first sensed data to produce first processed data for each of said at least two luminaires; and combining the first processed data from said at least two luminaires to produce second processed data, wherein an indication of quality of at least one of the first sensed data and the first processed data is taken into account to produce the second processed data.

19. A luminaire network comprising a plurality of luminaires comprising a lighting apparatus, wherein each of at least two luminaires of the plurality of the luminaires comprises: a communication unit configured to enable communication of data to and from communication units of other luminaires of the plurality of luminaires or to a central unit; a processing unit; a control unit configured to control the lighting apparatus as well as the communication and processing units; at least one first sensor, configured to output first sensed data; and at least one secondary sensor configured to output second sensed data, wherein the processing unit is configured to process the first sensed data to produce first processed data using the second sensed data.

20. The luminaire network according to claim 19, wherein the at least one first sensor comprises a camera, and wherein the at least one secondary sensor comprises an accelerometer, a vibration sensor, a wind sensor, a thermometer, a heat or thermal sensor, a humidity sensor, an environmental sensor, a microphone or other sound sensor, an air quality sensor, a smoke sensor, a dust sensor, an RF sensor, a photosensitive sensor, a visibility sensor, a camera, an IR sensor, a light sensor, an astronomical clock, a radar-based sensor, a lidar-based sensor, a motion sensor, a sonic sensor, an ultrasonic sensor, or a microwave sensor.

Description

BRIEF DESCRIPTION OF THE FIGURES

(1) The invention will be further elucidated with reference to the attached figures, wherein:

(2) FIG. 1 is a block diagram of an embodiment of the luminaire network which includes a central unit;

(3) FIG. 2 is a block diagram of an embodiment of the luminaire network in which distributed processing is used;

(4) FIG. 3 is a block diagram of a luminaire of a luminaire network wherein secondary sensors are used;

(5) FIG. 4 is a block diagram showing the various data streams in an embodiment of the luminaire network;

(6) FIG. 5 is a block diagram showing the various data streams in another embodiment of the luminaire network;

(7) FIG. 6 is a block diagram showing the various data streams in yet another embodiment of the luminaire network;

(8) FIG. 7 is an illustration of possible image data captured by a camera located at the side of a roadway with a plurality of lanes which may be used in the method according to the second aspect;

(9) FIG. 8 illustrates a method according to the second aspect, and a possibility for visual display;

(10) FIG. 9 illustrates a map of a roadway to which the method according to the second aspect is applicable, as well as an example of visually displaying the results of a method according to the second aspect.

DETAILED DESCRIPTION OF THE FIGURES

(11) In the figures, like reference numbers refer to like elements.

(12) FIG. 1 shows a block diagram of an embodiment of the luminaire network according to the invention which includes a central unit. While only 3 luminaires 20, 30, 40 are shown, it will be clear that luminaire networks may comprise many more luminaires.

(13) Each luminaire comprises a lighting apparatus 21, 31, 41, which is controlled by control unit 23, 33, 43. Each of the shown luminaires further comprises at least one first sensor 22, 32, 42. Note that not all luminaires need to comprise a sensor: some luminaires may be placed in locations where little useful sensor input is to be expected. Furthermore, not all luminaires need to comprise the same sensor or sensors. Non-limitative examples of sensors which may be used in the invention are: a camera, a microphone or other sound sensor, a photosensitive sensor, an accelerometer, a wind sensor, a thermometer, a heat/thermal sensor, an RF sensor, an electromagnetic sensor, a smoke sensor, a dust sensor, an air quality sensor, another type of environmental sensor, a radar or lidar based sensor, a visibility sensor, a humidity sensor, an IR sensor, a motion sensor, an (ultra)sonic sensor, a microwave sensor, etc.

(14) The plurality of luminaires 20, 30, 40 further comprise a processing unit 24, 34, 44, which may receive first sensed data S.sub.1, S.sub.2, S.sub.3 from the at least one first sensor 22, 32, 42 and be controlled by control unit 23, 33, 43. The control unit further controls the communication unit 25, 35, 45, which may receive first processed data P.sub.1, P.sub.2, P.sub.3 from the processing unit 24, 34, 44 and send it to the communication unit 15 of central unit 10. Note that the communication unit 25, 35, 45 may also allow the luminaire to communicate with other luminaires in the luminaire network. Furthermore, while in the figure direct lines are drawn between communication units 25, 35, 45 and central communication unit 15, this does not imply that there needs to be direct communication between each communication unit 25, 35, 45 and the central communication unit 15, as data may be relayed by other luminaires as well. The skilled person will be able to determine the most efficient and cost-effective way to enable communication with the central communication unit 15.

(15) The central communication unit 15 relays the received data P.sub.1, P.sub.2, P.sub.3 to central processing unit 14, which uses the first processed data P.sub.1, P.sub.2, P.sub.3 from the plurality of luminaires 20, 30, 40 to produce second processed data R. The luminaires are preferably chosen such that there is some overlap between what at least some of the luminaires aim to sense, and/or such that the respective first processed data (P.sub.1, P.sub.2, P.sub.3) is expected to be correlated. Therefore, using sensed data S.sub.1, S.sub.2, S.sub.3 of such luminaires, in particular by combining the first processed data P.sub.1, P.sub.2, P.sub.3, may improve data quality. Furthermore, because it is first processed data P.sub.1, P.sub.2, P.sub.3 which is sent to the central unit, and not the raw sensed data S.sub.1, S.sub.2, S.sub.3, the requirements on bandwidth are lessened.

(16) Advantageously, the central unit may further comprise a storage unit. This storage unit may, amongst other things, store information which is useful in the processing of first processed data P.sub.1, P.sub.2, P.sub.3 to produce second processed data R. For instance, the storage unit may store at least one of an ID number of the first sensor; a configuration of and/or an algorithm running in the first sensor electronics itself and/or in the processing unit; a firmware version running on the first sensor; if the first sensor is a camera, what type of lens it has; GPS coordinates; historical data; information about expected correlations, and so on.

(17) FIG. 2 shows an alternate embodiment in which no central unit is present. Note that the presence and use of a central unit is not excluded in this embodiment, as a combination of central and distributed processing is also a possibility, and as the central unit may be useful to provide and keep track of certain data even if processing is done in a distributed manner. Like references refer to like elements: the shown luminaires 20, 30, 40 comprise a lighting apparatus 21, 31, 41; a control unit 23, 33, 43; a processing unit 24, 34, 44 and a communication unit 25, 35, 45. Though not shown, some or all of the luminaires may also comprise a storage unit, which may store information necessary for the performing of the first and/or second processing step. The communication unit 25, 35, 45 enables the luminaires 20, 30, 40 to communicate and to share the first processed data P.sub.1, P.sub.2, P.sub.3 produced by the processing units 24, 34, 44. While the figure depicts lines between all the communication units 25, 35, 45, there does not need to be direct communication between each and every communication unit of the luminaire network: data may also be relayed. The processing of the collected first processed data P.sub.1, P.sub.2, P.sub.3 to produce second processed data (R) may take place at any of the processing units 24, 34, 44 or be distributed among several of these processing units. The skilled person will be aware of how to coordinate such distributed processing. The choice of processing unit or units may be optimized for any of several factors, for instance to minimize bandwidth use, maximize speed, and/or improve accuracy.

(18) FIG. 3 shows a luminaire 20 belonging to an embodiment of a luminaire network according to the invention wherein a secondary sensor 26 is present. Note that a plurality of the luminaires or even all of the luminaires comprising a first sensor may, but do not have to, comprise such a secondary sensor 26. Note also that not every luminaire comprising a secondary sensor needs to comprise a first sensor as well. The second sensed data C.sub.1 sensed by this secondary sensor 26 may be used to estimate the quality of the sensed data S.sub.1 and/or of the first processed data P.sub.1. Note that the quality that is estimated is not just a quality of the image, but also an estimation of the quality of information derived from the sensed data by the processing unit 24, 34, 44. Furthermore, second sensed data C.sub.1, C.sub.2, C.sub.3 may in its turn be processed into additional first processed data and then (collected and) combined into additional second processed data, if desired.
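As a minimal sketch of how second sensed data may be turned into a quality indication, the snippet below maps an assumed RMS vibration reading from a secondary sensor (e.g. an accelerometer on the luminaire pole) to a quality score for a camera's first sensed data. The function name, the linear mapping, and the threshold value are illustrative assumptions, not part of the claimed method.

```python
def estimate_quality(vibration_rms, max_usable_rms=2.0):
    """Map a secondary sensor reading (here: assumed RMS pole vibration)
    to a quality indication Q in [0, 1] for a camera's first sensed data:
    strong vibration blurs images, so Q decreases linearly and reaches 0
    at max_usable_rms. All values are illustrative."""
    q = 1.0 - vibration_rms / max_usable_rms
    return max(0.0, min(1.0, q))
```

Any monotone mapping from secondary sensed data to a quality score could serve the same purpose; the linear clamp above is merely the simplest choice.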

(19) Note that while the luminaires are depicted similarly, there may be various differences. For instance, there may be differences in: the type of first sensor(s), the type of secondary sensor(s), the type of processing unit, the type of configuration of the processing unit, the type of control unit, the type of configuration of the control unit, the type of communication unit, the connections of the communication unit, the type of lighting apparatus, etc.

(20) Furthermore, in some embodiments there may be several results R produced from different (but potentially overlapping) sets of first processed data P.sub.1, P.sub.2, P.sub.3. For instance, a result R.sub.1 could be produced from first processed data P.sub.1, P.sub.2 and P.sub.3, while another result R.sub.2 could be produced from first processed data P.sub.1, P.sub.4, P.sub.5 and P.sub.6. Furthermore, in some embodiments it may be possible to change the configuration of processing units to thus change the resulting processed data P.sub.1, P.sub.2, P.sub.3, such that the resulting results R are improved.

(21) As an example, an embodiment may comprise camera based sensors located in luminaires along a highway segment. Each may individually be configured to compute the average speed per lane along this highway segment. If the result of the second processing (R) shows that the average speed is very low, this will likely mean that there is a traffic jam. The system may be configured such that, when conditions like these are encountered, it (temporarily) changes the processing mode (P.sub.1, P.sub.2, P.sub.3) of relevant luminaires such that the first processing and second processing yield results relevant for ‘traffic queuing detection’ instead of calculating the average speed.
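A minimal sketch of the mode switch described above, assuming each luminaire's processing configuration is represented as a simple record; the threshold value and field names are illustrative assumptions.

```python
JAM_SPEED_KMH = 30  # illustrative threshold below which a jam is assumed

def update_processing_mode(luminaires, avg_speed_r):
    """If the second processed data R (here: a combined average speed in
    km/h) indicates a likely traffic jam, switch the relevant luminaires
    from average-speed computation to traffic queuing detection."""
    if avg_speed_r < JAM_SPEED_KMH:
        for lum in luminaires:
            lum["mode"] = "queue_detection"
    return luminaires
```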

(22) In another example with luminaires positioned along a highway segment, each of the processing units of a plurality of luminaires may be configured such that the first processed data P.sub.1, P.sub.2, P.sub.3, originating from a first processing step processing sensed data captured by the first sensors of respective first, second and third luminaires, relate to traffic data. For instance, the processing may include a per-lane analysis as described elsewhere in the application. P.sub.1, P.sub.2, and P.sub.3 may for instance comprise information about the number of passing vehicles, about the average speed, about the average speed per lane, about the speed of the fastest registered vehicle and/or about the presence of unusual features. During the second processing step, it may be noted that, while P.sub.1 and P.sub.2 indicate fluid traffic and/or at least one fast-moving vehicle, the first processed data P.sub.3, which originates from a luminaire positioned further along the highway, indicates a stoppage or traffic jam, or perhaps a traffic accident. This may indicate that there is a risk of a collision, if the fast-moving vehicles registered by the first sensors of the first and second luminaires are suddenly confronted with the slow-moving or stopped vehicles registered by the first sensor of the third luminaire. Therefore, the result of the second processing, R, may comprise information about this risk.

(23) Any of a number of actions may be taken as a result. For instance, any of a number of warning actions could be taken: at least one luminaire (which may include luminaires along the stretch of highway between the second and third luminaires, even if they are not equipped with sensors and/or processing units) may be instructed to alter its light to some type of warning light or warning light pattern; a display along or above the highway may be used to warn the drivers of vehicles, and in the case of vehicles equipped with V2X systems, a warning could even be displayed in the vehicle itself. In the latter case, it may even be possible to send control signals to specific vehicles such that they automatically slow down. A possible action is also to change the processing setting of the processing unit of the third luminaire such that it is aimed at collision detection specifically.

(24) The above is merely one example showing that combination of first processed data from a plurality of luminaires may be used not just to improve the quality of the results, but also to make it possible to get additional data, for instance by comparing first processed data from a plurality of luminaires. The skilled person will be able to apply the principles of this example to other situations.

(25) FIG. 4 shows the data streams in more detail, for a simple embodiment. First sensed data S.sub.1, S.sub.2, S.sub.3 are processed to produce respective first processed data P.sub.1, P.sub.2, P.sub.3. Note that this first processing step is performed entirely locally and does not require the use of the communication unit 25, 35, 45 at all. Then, the locally produced first processed information P.sub.1, P.sub.2, P.sub.3 is collected and combined, either centrally and/or through distributed processing, to produce second processed data R.

(26) FIG. 5 shows a somewhat more complex embodiment, in which respective indications Q.sub.1, Q.sub.2, Q.sub.3 of estimated quality of this first sensed data S.sub.1, S.sub.2, S.sub.3 and/or of the first processed data P.sub.1, P.sub.2, P.sub.3 are also collected, and are combined with first processed data P.sub.1, P.sub.2, P.sub.3 to produce second processed data R.
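The quality-weighted combination of FIG. 5 can be sketched as a weighted average, with the indications Q.sub.1, Q.sub.2, Q.sub.3 used as averaging weights (cf. claim 5). This assumes scalar first processed data (e.g. an average speed estimate per luminaire) and is only one possible combination rule; the function and variable names are illustrative.

```python
def combine_weighted(first_processed, qualities):
    """Produce second processed data R as a weighted average of the first
    processed data P_i, using the quality indications Q_i as weights.
    Data from low-quality sensors therefore contributes less to R."""
    total = sum(qualities)
    if total == 0:
        raise ValueError("all quality weights are zero")
    return sum(p * q for p, q in zip(first_processed, qualities)) / total
```

For instance, with equal weights the result is the plain average, while tripling one luminaire's quality weight pulls R toward that luminaire's value.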

(27) FIG. 6 shows yet another embodiment, which gives more detail about how the respective indications Q.sub.1, Q.sub.2, Q.sub.3 of estimated quality of this first sensed data S.sub.1, S.sub.2, S.sub.3 and/or of the first processed data P.sub.1, P.sub.2, P.sub.3 may be established. In particular, the respective indications Q.sub.1, Q.sub.2, Q.sub.3 of estimated quality of the first sensed data S.sub.1, S.sub.2, S.sub.3 and/or of the first processed data P.sub.1, P.sub.2, P.sub.3 may be based on the first sensed data S.sub.1, S.sub.2, S.sub.3 itself, and/or on characteristics of the first sensor which sensed the first sensed data, and/or on characteristics of the processing unit, and/or on any other dynamic or static parameter influencing the quality. It may also, additionally or alternately, be based on second sensed data C.sub.1, C.sub.2, C.sub.3 from a secondary sensor.

(28) Though not shown in this figure, the second sensed data C.sub.1, C.sub.2, C.sub.3 from the secondary sensors may itself, after a first local processing step, be collected and combined into additional second processed data R.sub.2. Furthermore, there is no requirement for every indication Q.sub.1, Q.sub.2, Q.sub.3 of estimated quality to be based on different second sensed data C.sub.1, C.sub.2, C.sub.3: in some cases, second sensed data C.sub.1 from a single secondary sensor may be used to estimate the quality of sensed data S.sub.1, S.sub.2, S.sub.3 and/or of the first processed data P.sub.1, P.sub.2, P.sub.3 from a plurality of first sensors.

(29) The skilled person will be able to envisage many alterations, combinations, permutations and elaborations on the systems described above. An inventive idea underlying the first aspect of the invention is that a luminaire network is generally a dense network, and that sensors comprised in luminaires may therefore produce sensed data with a relatively high degree of overlap in what they sense and/or wherein some degree of correlation is to be expected. It is further based on the realization that luminaires may be equipped with the ability to communicate amongst themselves and/or with a central unit and with some processing capabilities, and that therefore a combination of local and distributed processing may be used to minimize requirements on sensors, processors and bandwidth simultaneously. Finally, the combination of data from different luminaires may be able to lead to additional insights. The second aspect of the invention, which may require only a single sensor, relates specifically to improvements of traffic measurement. The invention is not limited to a certain type or number of sensors, nor is it limited to the embodiments described above.

(30) FIG. 7 illustrates in an abstract, simplified way what a camera located along a roadway with a plurality of lanes (L.sub.1, L.sub.2, L.sub.3, L.sub.4) may observe. An average speed may be different for each lane: generally, the average speed is expected to be highest in the innermost lane L.sub.1, lower in the next lane L.sub.2, and lower still in the third lane L.sub.3. Lane L.sub.4 is an off-ramp, and the expected average speed of vehicles in this lane will depend on the length and angle of the off-ramp as well as the type of roadway it may lead to. Many types of traffic information may be determined from such image information, and in particular from a plurality of successively captured images. Examples are an average speed, an average number of vehicles passing through during a predetermined period of time, maximum and minimum speeds, accelerations, etc. In the method according to the second aspect of the invention, traffic information T.sub.1, T.sub.2, T.sub.3, T.sub.4 is determined for each lane L.sub.1, L.sub.2, L.sub.3, L.sub.4 separately. Furthermore, to ascertain the traffic situation, the information is compared to pre-determined, lane-dependent values. These values may comprise a single value per lane, but may also comprise a plurality of lane-dependent values. The comparison results then yield traffic information T.sub.r about the roadway. While FIG. 7 shows an abstraction of possible image data, the method is not limited to image data.

(31) FIG. 8 illustrates the method further with reference to an example. Lines T.sub.1, T.sub.2, T.sub.3, T.sub.4 indicate determined traffic information, in this particular example an average speed. The considered roadway may be the one as seen in FIG. 7, with four lanes. The maximum speed in this example is 120 km/h. The average speed T.sub.1 in lane 1 is only slightly less than that maximum. The average speed T.sub.2 in lane 2 is around 90 km/h. The average speed T.sub.3 in lane 3 is around 80 km/h. The average speed T.sub.4 in lane 4, the off-ramp, is only about 50 km/h.

(32) A method considering only the overall average speed may conclude that traffic is slow in this particular stretch of roadway, since the vehicles on the off-ramp have a lowering effect. The method according to the second aspect, however, is more precise. In particular, in the example of FIG. 8, each lane has two thresholds. First thresholds D.sub.1-1, D.sub.1-2, D.sub.1-3, D.sub.1-4 are chosen such that if the average speed in a lane is above the respective threshold, traffic in that lane can be said to be fluid or fast, and if the average speed in the lane is below it, traffic in that lane can be said to be moderate or slow, or even stationary. For each lane, this threshold is different: if the average speed in lane 1 is 90 km/h, this is not fast, whereas if the average speed in lane 4 is 90 km/h, that would be extraordinarily fast. Second thresholds D.sub.2-1, D.sub.2-2, D.sub.2-3, D.sub.2-4 introduce a further demarcation: if the average speed in a given lane is between the first and second threshold, traffic in that lane can be said to be moderate, whereas if the average speed in a lane is below the second threshold, traffic is slow or even stationary. The second thresholds are also lane-dependent: an average speed of 50 km/h is perfectly normal on the off-ramp, whereas it would signal serious traffic issues in lane 1.
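The lane-dependent, two-threshold classification described above can be sketched as follows. The numeric thresholds are hypothetical values chosen only to be consistent with the FIG. 8 example (lane 4 being the off-ramp); they are not values taken from the application.

```python
# Illustrative first and second thresholds (km/h) per lane; lane 4 is the
# off-ramp, so its thresholds are far lower than those of the through-lanes.
FIRST_THRESHOLD = {1: 110, 2: 95, 3: 85, 4: 45}
SECOND_THRESHOLD = {1: 70, 2: 60, 3: 50, 4: 25}

def classify_lane(lane, avg_speed):
    """Return the per-lane traffic status using the lane-dependent
    thresholds described for FIG. 8: above the first threshold traffic is
    fluid, between the thresholds it is moderate, below the second
    threshold it is slow or stationary."""
    if avg_speed > FIRST_THRESHOLD[lane]:
        return "fluid"
    if avg_speed > SECOND_THRESHOLD[lane]:
        return "moderate"
    return "slow"
```

With these illustrative values, the FIG. 8 example comes out as described: lane 1 at about 118 km/h and the off-ramp at 50 km/h are fluid, while lanes 2 and 3 at 90 and 80 km/h are moderate.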

(33) In the present example, the average speed in lane 1, T.sub.1, is above the first threshold D.sub.1-1: traffic in that lane is fluid or fast. In lanes 2 and 3, T.sub.2 lies between D.sub.1-2 and D.sub.2-2 and T.sub.3 lies between D.sub.1-3 and D.sub.2-3: traffic in those lanes is going at a moderate speed. The average speed in lane 4, T.sub.4, lies above D.sub.1-4, indicating smooth traffic there.

(34) Note that FIG. 8 may also be seen as an example of how to visualize the results of the method according to the second aspect. Presented with such an image, a person can easily and quickly assess the traffic situation. Further note that while FIG. 8 has speed on the y-axis, many other types of traffic information may be used, with appropriate thresholds; furthermore, several types of traffic information may be combined for more nuanced results.

(35) FIG. 9 shows a map of a road with two roadways, each with four lanes. A sensor P may be placed and configured such that it can capture data, for instance image data, relating to the plurality of lanes L.sub.1, L.sub.2, L.sub.3, L.sub.4 of one roadway. For information purposes, a map may be displayed with an indication of the position of sensor P and lines 91 and 92 indicating the field of view. Furthermore, FIG. 9 shows another, simpler way of displaying traffic information to a user. The box T.sub.r comprises one arrow for each lane, using often-used symbols: here, straight arrows A.sub.1, A.sub.2 and A.sub.3 for the through-lanes L.sub.1, L.sub.2, L.sub.3 and bent arrow A.sub.4 for the off-ramp L.sub.4. These arrows may then be color-coded to display the result of the comparison between the determined traffic information for each lane and the associated pre-determined comparison values. For instance, referring back to the example described in conjunction with FIG. 8, A.sub.1 and A.sub.4 may be green to indicate fluid traffic conditions, while A.sub.2 and A.sub.3 may be yellow or orange, for example, to indicate that the average speed is moderate there when compared to expected values. Red could be used to indicate very slow or stationary traffic in a lane. As will be clear, a map of a larger area, displaying such compact visual indications at several points, will allow for a very quick yet precise assessment of the general traffic conditions, which improves on existing methods which merely indicate where the general average speed is moderate or low.
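The color coding of the per-lane arrows described for FIG. 9 amounts to a simple lookup from the comparison result to a display color; the status labels and color names below are assumptions for illustration.

```python
def lane_color(status):
    """Map a per-lane traffic status to the display color described for
    FIG. 9: green for fluid traffic, orange for a moderate average speed,
    red for very slow or stationary traffic."""
    return {"fluid": "green", "moderate": "orange", "slow": "red"}[status]
```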

(36) As will be clear, FIG. 7-9 illustrate only an example of the method according to the second aspect, and many variations will be clear to the skilled person which make use of a lane-dependent evaluation of lane-specific traffic information. Furthermore, while the first and second aspects of the invention have separate advantages, advantageous embodiments may combine these aspects to further improve results.

(37) In embodiments of the invention according to the first aspect, the first sensor may be e.g. a camera. The images captured by the camera may be used e.g. to determine the type of the vehicles that drive on the road adjacent a luminaire, or to count the vehicles driving on the road. For example, three first sensors of three adjacent luminaires may determine the following first processed data:

(38) TABLE-US-00001

                                      first processed data
    Luminaire                      % of cars    % of trucks or busses
    First sensor of luminaire 1        75                25
    First sensor of luminaire 2        55                45
    First sensor of luminaire 3        50                50

(39) The second processed data may consist of an average value of the percentage of sensed cars and of the percentage of sensed trucks or busses, resulting, in the example above, in 60% cars and 40% trucks or busses.
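This averaging step can be sketched as follows, using the values from the table above; the data layout is an assumption for illustration:

```python
# Minimal sketch: each luminaire reports first processed data (class
# percentages); the second processed data is the element-wise average.

first_processed = [
    {"cars": 75, "trucks_or_busses": 25},  # first sensor of luminaire 1
    {"cars": 55, "trucks_or_busses": 45},  # first sensor of luminaire 2
    {"cars": 50, "trucks_or_busses": 50},  # first sensor of luminaire 3
]

def second_processed(reports):
    """Average each class percentage over all reporting luminaires."""
    n = len(reports)
    return {k: sum(r[k] for r in reports) / n for k in reports[0]}

print(second_processed(first_processed))
# {'cars': 60.0, 'trucks_or_busses': 40.0}
```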

(40) In addition or alternatively, three first sensors of three adjacent luminaires may determine the following first processed data:

(41) TABLE-US-00002

                                   first processed data: % of time frames for which the
                                   number of counted vehicles per time frame is within
    Luminaire                          0-50          51-100         101-150
    First sensor of luminaire 1         10              80              10
    First sensor of luminaire 2         12              78              10
    First sensor of luminaire 3         10              74              16

(42) The second processed data may consist of an average value of the percentage of counted vehicles per time frame being within a predetermined range, resulting, in the example above, in: during 11% of the time the number of vehicles per time frame is between 0 and 50; during 77% of the time the number of vehicles per time frame is between 51 and 100; during 12% of the time the number of vehicles per time frame is between 101 and 150.
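The per-bin averaging can be sketched as below, with the values from the table above and rounding to whole percent as in the text:

```python
# Sketch: average each vehicle-count bin over the three luminaires.

bins = ["0-50", "51-100", "101-150"]
first_processed = [
    [10, 80, 10],  # first sensor of luminaire 1
    [12, 78, 10],  # first sensor of luminaire 2
    [10, 74, 16],  # first sensor of luminaire 3
]

# zip(*...) groups the values per bin; round to whole percent.
avg = [round(sum(col) / len(col)) for col in zip(*first_processed)]
print(dict(zip(bins, avg)))
# {'0-50': 11, '51-100': 77, '101-150': 12}
```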

(43) The table below illustrates a couple of exemplary embodiments of the first aspect.

(44) TABLE-US-00003

    Example 1
      First sensor: Camera
      First sensed data: Images
      Goal: Alert to police or operator
      First processed data: Unusual direction / ghost driver detection
      Second processed data: Analysis of how many first sensors are sending the wrong direction information, combined with GPS data, to conclude whether it really is a wrong direction

    Example 2
      First sensor: Camera
      First sensed data: Images
      Goal: Classification (cars or trucks)
      First processed data: Classification of vehicles based on sensed data of a camera
      Second processed data: Classification of vehicles based on the classification data from a plurality of first sensors, i.e. a plurality of cameras

    Example 3
      First sensor: Camera
      First sensed data: Images
      Goal: Speed bin counts
      First processed data: Number of counts in each speed bin
      Second processed data: Combination of all speed bins from relevant first sensors to better estimate speed bins

    Example 4
      First sensor: Vibration sensor / accelerometer sensor
      First sensed data: Vibrations
      Goal: Alert for earthquake
      First processed data: Unusual vibrations
      Second processed data: Analysis of how many sensors detect unusual vibrations and where they are detected

(45) In the first example the first sensor is a camera, and it is desired to alert the police or an operator when an unusual direction of a vehicle, i.e. a ghost driver, is detected. Typically, camera sensors on their own can only be relied upon to determine that something unusual has happened in terms of tracking a vehicle. However, by combining the measurements of several camera sensors, an alert of wrong direction can be sent to the police or to an operator in a reliable manner. The determination of the second processed data based on the first processed data from several luminaires can be done in a luminaire device or in a remote device (e.g. the central unit).
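One way to read this combining step is as a corroboration check across neighbouring sensors. The sketch below is a hedged illustration; the report layout, the agreement threshold, and the simple handling of the GPS data are assumptions, not details from the description:

```python
# Illustrative sketch: a single camera flagging "unusual direction" may be
# a tracking glitch, so an alert is only raised when enough neighbouring
# first sensors agree. Threshold and data layout are assumptions.

def ghost_driver_alert(reports, min_agreeing=2):
    """reports: list of dicts {'wrong_direction': bool, 'gps': (lat, lon)}."""
    agreeing = [r for r in reports if r["wrong_direction"]]
    if len(agreeing) < min_agreeing:
        return None  # not enough corroboration, no alert
    # The GPS data of the agreeing sensors could additionally be checked
    # for a plausible trajectory; here we simply report their positions.
    return {"alert": "wrong direction",
            "positions": [r["gps"] for r in agreeing]}

reports = [
    {"wrong_direction": True,  "gps": (50.85, 4.35)},
    {"wrong_direction": True,  "gps": (50.86, 4.35)},
    {"wrong_direction": False, "gps": (50.87, 4.36)},
]
print(ghost_driver_alert(reports))
# {'alert': 'wrong direction', 'positions': [(50.85, 4.35), (50.86, 4.35)]}
```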

(46) In the second example the first sensor is a camera, and it is desired to count the number of cars and the number of trucks that drive on the road. The first sensor of a luminaire is a camera sensing images, and based on those images, the number of light vehicles and the number of heavy vehicles that pass within a predetermined time period is determined; this is the first processed data.

(47) Typically such first processed data will have an accuracy X which fulfills 0<X<Max, wherein Max may be e.g. the maximum accuracy possible for the first sensor/processing unit. By combining classification information of a plurality of first sensors of adjacent luminaires, it is possible to achieve an increased level of accuracy e.g. by averaging the values of heavy and light vehicles derived from the first sensed data of the plurality of first sensors, as explained above. In that way the accuracy X of the classification determined as the second processed data may approach the value Max at all times of the day.

(48) In the third example the first sensor is a camera, and it is desired to count the number of vehicles that drive on the road. The quality of the second processed data will be improved and the counts in the speed bins will be more accurate by combining the speed bin information of different first sensors.

(49) In the fourth example the first sensor is a vibration sensor or accelerometer sensor, and it is desired to detect earthquakes. Also for this example, the use of multiple first sensors in different luminaires will significantly improve the accuracy of the second processed data and/or allow identification of the epicenter of the earthquake.
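A minimal sketch of such a combination is given below. The threshold, the minimum number of sensors, and the amplitude-weighted centroid as a rough epicenter estimate are all illustrative assumptions; a real implementation would use proper seismic localization:

```python
# Illustrative sketch: each luminaire's vibration sensor reports an
# amplitude; if enough sensors report unusual vibration, an alert is
# raised and the epicenter is roughly estimated as the amplitude-weighted
# centroid of the triggered positions. All parameter values are assumed.

def earthquake_alert(readings, threshold=1.0, min_sensors=3):
    """readings: list of (amplitude, (lat, lon)) tuples, one per luminaire."""
    triggered = [(a, pos) for a, pos in readings if a >= threshold]
    if len(triggered) < min_sensors:
        return None  # too few sensors affected, likely local disturbance
    total = sum(a for a, _ in triggered)
    lat = sum(a * p[0] for a, p in triggered) / total
    lon = sum(a * p[1] for a, p in triggered) / total
    return {"sensors": len(triggered), "epicenter_estimate": (lat, lon)}

readings = [
    (2.0, (50.0, 4.0)),
    (1.5, (50.1, 4.0)),
    (1.0, (50.0, 4.1)),
    (0.2, (51.0, 5.0)),  # below threshold, ignored
]
print(earthquake_alert(readings))
```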

(50) In yet other examples, the first processed data may comprise a variable indicating the detection of an emergency vehicle, of a gunshot or of any other sound, such as a vehicle sound or an animal sound, as has been explained in more detail in the summary.

(51) To further improve the accuracy of the second processed data, an indication of the quality of the first processed data may be taken into account. The table below lists a series of static and dynamic quality parameters that may be taken into account. One or more of those parameters may be taken into account to obtain an indication of the quality of the first sensed data and/or of the first processed data.

(52) TABLE-US-00004

    DYNAMIC QUALITY PARAMETERS
      Signal-to-noise ratio¹:
        Frame rate
        Exposure time
        Gain
        Temperature
      Light level in scene
      Vibration data
      Hardware limits
      Processing load
      Available memory
      Dynamic range of current scene:
        Above specs²
        Below specs
      Obstructions yes/no:
        Bright light
        Sunlight
        Complete black or low light
        Fog
        Insects
        Spiders etc.

    STATIC QUALITY PARAMETERS
      Algorithm running on camera:
        City version
        Highway version
        Two-way street
        One-way street
        Pedestrian detection
        Parking
      Firmware version
      Lens installed
      Viewing angle
      Distortion
      Third party correlation:
        Constructions
        Obstructions
        Traffic info
      GPS location
      Sunrise and sunset times

    ¹ The signal-to-noise ratio (SNR) varies with the exposure time, the temperature and the gain applied to the sensor.
    ² A flag indicating whether or not the dynamic range of the current scene is within the limits of the dynamic range specification of the image sensor.

(53) The dynamic parameters may be estimated e.g. using a secondary sensor as has been described in more detail above. The static parameters may be stored in a database e.g. on the central unit. Alternatively, the static parameters may be stored in a luminaire and may be transmitted to the location where the second processed data is produced, e.g. the central unit. Based on one or more quality parameters related to a first sensor and/or on one or more quality parameters related to the processing unit producing the first processed data, more or less weight may be given to the corresponding first processed data when performing the further processing to produce the second processed data. In that manner, the reliability of the second processed data can be further increased.
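The weighting described in this paragraph can be sketched as a quality-weighted average. The 0-to-1 quality scale and the example values are assumptions for illustration; in practice the weights would be derived from one or more of the quality parameters listed in the table:

```python
# Sketch: combine first processed data into second processed data using a
# weighted rather than plain average, so that a sensor with a low quality
# indication (e.g. poor SNR or low light) contributes less.

def weighted_second_processed(values, qualities):
    """values: first processed data per luminaire; qualities: weights in [0, 1]."""
    total = sum(qualities)
    return sum(v * q for v, q in zip(values, qualities)) / total

# Example: three luminaires report a car percentage; the camera with
# quality 0.2 (e.g. operating in low light) is largely discounted.
print(weighted_second_processed([75, 55, 50], [0.9, 0.9, 0.2]))
```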

(54) Whilst the principles of the invention have been set out above in connection with specific embodiments, it is to be understood that this description is merely made by way of example and not as a limitation of the scope of protection which is determined by the appended claims.