Apparatus and method to calculate energy dissipated from an object
09804031 · 2017-10-31
Assignee
Inventors
- Katrin Strandemar (Rimbo, SE)
- Henrik Jönsson (Stockholm, SE)
- Torbjörn Hamrelius (Sollentuna, SE)
- Gunnar Palm (Jarfalla, SE)
CPC classification
G01J5/064
PHYSICS
G01J5/025
PHYSICS
International classification
Abstract
An IR camera includes: a thermal radiation capturing arrangement for capturing thermal radiation of an imaged view in response to input control unit(s) receiving user inputs from a user of the IR camera; a processing unit arranged to process the thermal radiation data in order for the thermal radiation data to be displayed by an IR camera display as thermal images; and an IR camera display arranged to display thermal images to a user of the IR camera. The processing unit is further arranged to determine at least one temperature reference value representing the temperature of the surrounding environment of the imaged view; and calculate at least one output power value indicative of an amount of energy dissipated in a part of the imaged view by using the temperature value of the thermal radiation data corresponding to said part of the imaged view and the at least one determined temperature reference value.
Claims
1. An apparatus comprising: a thermal radiation capturing arrangement configured to capture thermal radiation data of an imaged view; one or more input control units configured to receive manual inputs from a user of the apparatus and further to output the manual inputs as user input information; and a processor configured to: receive the thermal radiation data from the thermal radiation capturing arrangement, receive the user input information from at least one of the one or more input control units, determine a radiated temperature reference value and a convection temperature reference value representing temperatures of an environment surrounding an object in the imaged view, and calculate an object power value indicative of an amount of energy dissipated from the object in the imaged view based on at least: a radiation power value based on at least the radiated temperature reference value and a portion of the thermal radiation data corresponding to the object, and a convection power value based on at least the convection temperature reference value and the portion of the thermal radiation data corresponding to the object.
2. The apparatus of claim 1, wherein: the processor is further configured to generate a thermal image of the imaged view using the thermal radiation data and the calculated object power value, the thermal image comprising one or more pixels representing the calculated object power value according to a color palette; and the apparatus comprises a display configured to present the thermal image for viewing by the user.
3. The apparatus of claim 1, wherein: the user input information comprises energy price information and time span information; and the processor is further configured to calculate a total energy cost estimate for the object in the imaged view based on the calculated object power value, the energy price information, and the time span information.
4. The apparatus of claim 1, wherein the object power value is calculated in response to at least part of the user input information.
5. The apparatus of claim 4, wherein: the at least part of the user input information comprises an indication of an area in the imaged view representing the object; and the processor is configured to identify the portion of the thermal radiation data corresponding to the object in the imaged view to calculate the object power value, in response to the indication of the area representing the object.
6. The apparatus of claim 4, wherein: the at least part of the user input information comprises a temperature threshold value; and the processor is configured to identify the portion of the thermal radiation data corresponding to the object in the imaged view to calculate the object power value, in response to the temperature threshold value.
7. The apparatus of claim 6, wherein the identified portion of the thermal radiation data has a temperature value equal to or above the temperature threshold value.
8. The apparatus of claim 4, wherein the at least part of the user input information comprises a form indicator that is indicative of the shape or form of the object in the imaged view.
9. The apparatus of claim 4, wherein the at least part of the user input information comprises an emissivity value that is indicative of the emissivity of a surface of the object in the imaged view.
10. The apparatus of claim 1, further comprising a distance determining unit configured to determine distance information representing a distance from the apparatus to the object in the imaged view, wherein the processor is further configured to: determine an object field-of-view associated with the object represented by the portion of the thermal radiation data; estimate an actual physical surface area of the object in the imaged view facing towards the apparatus based upon the distance information and the determined object field-of-view; and calculate the object power value using the estimation of the actual physical surface area of the object.
11. The apparatus of claim 1, wherein the user input information comprises a user temperature reference value; and the processor is configured to determine the radiated temperature reference value and/or the convection temperature reference value based on the user temperature reference value.
12. A method of calculating an amount of energy dissipated from an object in an imaged view, the method comprising: capturing thermal radiation data of the imaged view using an infrared (IR) camera; receiving, at one or more input control units of the IR camera, manual inputs from a user as user input information; determining a radiated temperature reference value and a convection temperature reference value representing temperatures of an environment surrounding an object in the imaged view; and calculating an object power value indicative of the amount of energy dissipated from the object in the imaged view based on at least: a radiation power value based on at least the radiated temperature reference value and a portion of the thermal radiation data corresponding to the object, and a convection power value based on at least the convection temperature reference value and the portion of the thermal radiation data corresponding to the object.
13. The method of claim 12, further comprising: generating a thermal image of the imaged view using the thermal radiation data and the calculated object power value, the thermal image comprising one or more pixels representing the calculated object power value according to a color palette; and displaying the generated thermal image for viewing by the user.
14. The method of claim 12, wherein: the user input information comprises energy price information and time span information; and the method further comprises calculating a total energy cost estimate for the object in the imaged view based on the calculated object power value, the energy price information, and the time span information.
15. The method of claim 12, wherein the object power value is calculated in response to at least part of the user input information.
16. The method of claim 15, wherein: the at least part of the user input information comprises an indication of an area in the imaged view representing the object; and the calculating of the object power value comprises identifying the portion of the thermal radiation data corresponding to the object in the imaged view, in response to the indication of the area representing the object.
17. The method of claim 15, wherein: the at least part of the user input information comprises a temperature threshold value; the calculating of the object power value comprises identifying the portion of the thermal radiation data corresponding to the object in the imaged view, in response to the temperature threshold value; and the identified portion of the thermal radiation data has a temperature value equal to or above the temperature threshold value.
18. The method of claim 15, wherein the at least part of the user input information comprises a form indicator that is indicative of the shape or form of the object in the imaged view.
19. The method of claim 15, wherein the at least part of the user input information comprises an emissivity value that is indicative of the emissivity of a surface of the object in the imaged view.
20. The method of claim 12, wherein: the user input information comprises a user temperature reference value; and the determining the radiated temperature reference value or the convection temperature reference value is based on the user temperature reference value.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
DETAILED DESCRIPTION
(9) From the detector element 5, the captured thermal radiation may be fed as a signal comprising signal values to a signal conditioning unit 6. The signal conditioning unit 6 may perform conventional signal conditioning, such as, for example, corrections for inherent offset, gain drift, etc., and convert the signal values into thermal radiation data T_x,y comprising temperature values. The signal conditioning unit 6 is arranged to output a thermal radiation signal comprising the thermal radiation data T_x,y to a processing unit 7.
(10) The processing unit 7 is arranged to receive the thermal radiation signal comprising the thermal radiation data T_x,y from the signal conditioning unit 6. The processing unit 7 is also arranged to control an IR camera display 11, for example, a viewfinder, a digital display and/or touch screen provided on the IR camera housing. The processing unit 7 is further arranged to process the thermal radiation data T_x,y in order for the thermal radiation data T_x,y to be displayed by the IR camera display 11 as thermal images. The processing unit 7 may output a thermal image signal to the IR camera display 11. The thermal image signal to the IR camera display 11 may also comprise additional information other than the thermal images to be displayed by the IR camera display 11. The thermal radiation data recorded by the IR camera 1 can thus be controlled to be displayed in the IR camera display 11 as thermal images, with or without the additional information, and be presented to a user of the IR camera 1. The operation of the processing unit 7 in the IR camera 1 is described in more detail in the exemplary embodiments presented below with reference to the accompanying figures.
(11) It should be noted that the signal conditioning unit 6 and the processing unit 7 may be provided as one physical unit, or alternatively as a plurality of logically interconnected units. The signal conditioning unit 6 and the processing unit 7 may also comprise processing means or logic for performing the functionality of the IR camera 1. This functionality may be implemented partly by means of a software or computer program. The signal conditioning unit 6 and the processing unit 7 may also comprise storage means or a memory unit for storing such a computer program and processing means or a processing unit, such as a microprocessor, for executing the computer program. The storage means may be a readable storage medium, or a memory storage unit 10 separate from, but connected to, the signal conditioning unit 6 and the processing unit 7. When, in the following, it is described that the IR camera 1, the signal conditioning unit 6 or the processing unit 7 performs a certain function or operation, it is to be understood that the signal conditioning unit 6 and/or the processing unit 7 may use the processing means or logic to execute a certain part of the program which is stored in the storage means.
(12) The processing unit 7 may also be connected to and/or arranged to communicate with a distance determining unit 8 arranged to determine the distance between an object 3 in the imaged view 2 and the IR camera 1 and output the measured distance to the processing unit 7. The distance determining unit 8 may, for example, be the at least one lens arrangement 4 using focusing operations to determine the distance to an object 3, a laser distance measurement unit measuring the distance to the object 3 using laser, or any other type of distance measuring unit. The processing unit 7 may also be connected to and/or arranged to communicate with at least one input control unit(s) 12 arranged to receive manual inputs from a user of the IR camera 1 and output the manual inputs to the processing unit 7. The at least one input control unit(s) 12 may, for example, be buttons and/or joysticks, or be incorporated in the IR camera display 11 as a touch screen functionality. The processing unit 7 may also be connected to and/or arranged to communicate with at least one temperature measurement unit(s) 13 arranged to measure the surrounding air temperature and/or a surface temperature and output the measured air temperature values and/or surface temperature values to the processing unit 7. The at least one temperature measurement unit(s) 13 may be integrated into the IR camera 1 (as shown in the accompanying figures).
(14) By having the processing unit 7 in the IR camera 1 arranged to perform calculations of the power density values PD_x,y, the IR camera 1 is able to calculate and/or display the power density distribution of an imaged view 2. Thus, the IR camera 1 may increase usability for a user of the IR camera 1 when the user is attempting to analyze the imaged view 2 captured by the IR camera 1. This may further allow a user of the IR camera 1 to obtain an estimate of the energy properties of the imaged view 2 in a simple and easy manner, without having to perform any manual calculations.
(15) The processing unit 7 may be arranged to perform the calculation of the power dissipated in the imaged view 2 (referred to herein as the output power values or output power density values, PD_x,y) based on calculated values of the energy exchanged through radiation from a surface(s) in the imaged view 2 to its surrounding environment (referred to herein as the radiated power values, PD_x,y^rad), illustrated by the logical block 21, and on calculated values of the energy exchanged through convection from the surface(s) in the imaged view 2 to the surrounding air (referred to herein as the convection power values, PD_x,y^conv), illustrated by the logical block 22. The output power values PD_x,y may thus be calculated by the processing unit 7 according to the following equation, Eq. 1:
PD_x,y = PD_x,y^rad + PD_x,y^conv (Eq. 1)
(16) Although the output power value PD_x,y according to Eq. 1 is useful since it sums the power outputted in the imaged view 2 through both radiation and convection, and thus may achieve a more accurate estimation of a value indicative of the outputted power in the imaged view, it should be noted that the output power value PD_x,y may also be calculated by the processing unit 7 as either the radiated power value PD_x,y^rad or the convection power value PD_x,y^conv alone.
(17) The radiated power values PD_x,y^rad in the imaged view 2 may be calculated in the logical block 21 by the processing unit 7 according to the following equation, Eq. 2:
PD_x,y^rad = ε · σ · ((T_x,y)^4 − (T_rad,env)^4) (Eq. 2)
wherein ε is the emissivity of the surface(s) in the imaged view 2, σ is the Stefan-Boltzmann constant, 5.67·10^−8 W/(m^2·K^4), T_x,y are the temperature values at positions (x, y) in the thermal radiation data received from the detector element 5 and the signal conditioning unit 6, and T_rad,env is a temperature reference value representing the temperature of the surrounding environment of the imaged view 2.
(18) The convection power values PD_x,y^conv in the imaged view 2 may be calculated in the logical block 22 by the processing unit 7 according to the following equation, Eq. 3:
PD_x,y^conv = h · (T_x,y − T_conv,env) (Eq. 3)
wherein h is a heat transfer coefficient [W/(m^2·K)] and T_conv,env is a temperature reference value representing the temperature of the surrounding environment of the imaged view 2.
(19) Thus, combining Eq. 1-3, the output power values PD_x,y for each of the temperature values of the thermal radiation data T_x,y may be calculated by the processing unit 7 according to the following equation, Eq. 4:
PD_x,y = PD_x,y^rad + PD_x,y^conv = ε · σ · ((T_x,y)^4 − (T_rad,env)^4) + h · (T_x,y − T_conv,env) (Eq. 4)
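As a minimal illustrative sketch (not part of the patent disclosure), Eq. 1-4 may be evaluated over an entire thermal image with array arithmetic; the function name, default values and NumPy representation below are assumptions chosen for the example:

```python
import numpy as np

SIGMA = 5.67e-8  # Stefan-Boltzmann constant [W/(m^2 K^4)]

def output_power_density(t_xy, t_rad_env, t_conv_env, emissivity=1.0, h=5.0):
    """Per-pixel output power density PD_x,y per Eq. 4.

    t_xy       -- 2-D array of temperature values [K] (thermal radiation data)
    t_rad_env  -- radiated temperature reference value [K]
    t_conv_env -- convection temperature reference value [K]
    emissivity -- surface emissivity (default 1, as suggested in the text)
    h          -- heat transfer coefficient [W/(m^2 K)] (assumed default)
    """
    pd_rad = emissivity * SIGMA * (t_xy**4 - t_rad_env**4)  # Eq. 2
    pd_conv = h * (t_xy - t_conv_env)                       # Eq. 3
    return pd_rad + pd_conv                                 # Eq. 1 / Eq. 4

# Example: a 240x320 scene at 300 K with a 320 K hot area, 293 K surroundings.
t = np.full((240, 320), 300.0)
t[100:140, 150:200] = 320.0
pd = output_power_density(t, t_rad_env=293.0, t_conv_env=293.0)  # [W/m^2]
```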
(20) Once the processing unit 7 in the IR camera 1 has calculated the output power values PD_x,y in accordance with the above, the processing unit 7 may be arranged to control the IR camera display 11 to display the calculated output power values PD_x,y to a user of the IR camera 1 in place of the thermal images, e.g. as shown in the accompanying figures.
(21) However, in order for the processing unit 7 in the IR camera 1 to calculate the output power values PD_x,y, the parameters T_rad,env, T_conv,env, h and ε comprised in Eq. 1-4 have to be determined by the processing unit 7.
(22) The temperature reference values T_rad,env and T_conv,env representing the temperature of the surrounding environment of the imaged view 2 may be determined by the processing unit 7 by receiving information from the input control unit(s) 12 comprising at least one temperature value to be used as both or one of the temperature reference values T_rad,env and T_conv,env. The user of the IR camera 1 may thus manually enter a temperature value to the processing unit 7, which may be used by the processing unit 7 as both or one of the temperature reference values T_rad,env and T_conv,env. Alternatively, the processing unit 7 may be arranged to determine either or both of the temperature reference values T_rad,env and T_conv,env by receiving a temperature measurement value(s) from the temperature measuring unit(s) 13. This alternative will also allow the processing unit 7 of the IR camera 1 to automatically determine either or both of the temperature reference values T_rad,env and T_conv,env to be used in the calculation of the output power value PD_x,y. Any one of the alternatives described above may be used by the processing unit 7 in the IR camera 1 in order to determine the temperature reference value T_rad,env and/or the temperature reference value T_conv,env. Alternatively, the IR camera 1 may use a surface temperature measurement unit arranged to detect a surface temperature for determining the temperature reference value T_rad,env, and an air temperature measurement unit arranged to detect an air temperature for determining the temperature reference value T_conv,env.
(23) The heat transfer coefficient h indicates the amount of natural convection occurring at a surface(s) in the imaged view 2, and may be determined by the processing unit 7 by receiving information from the input control unit(s) 12 comprising a heat transfer coefficient value. The heat transfer coefficient value may be inputted manually by a user of the IR camera 1. Alternatively, the heat transfer coefficient value may be based on information indicating the wind speed of the air surrounding the imaged view 2 received from the wind speed measuring unit 15. For a particular wind speed, the processing unit 7 may determine a corresponding heat transfer coefficient value to be used by the processing unit 7 when calculating the output power values PD_x,y. This may be performed by storing, in the processing unit 7 or the memory storage unit 10, a list associating wind speeds with corresponding heat transfer coefficient values. The heat transfer coefficient value may also be set to a suitable default value, such as, for example, 1 ≤ h ≤ 10 for indoor environments, and 1 ≤ h ≤ 100 for outdoor environments.
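A possible realization of such a stored list, sketched here with invented breakpoints (the patent does not specify the wind speeds or coefficients), is a simple lookup from measured wind speed to a heat transfer coefficient:

```python
# Assumed wind-speed breakpoints [m/s] mapped to heat transfer coefficients
# h [W/(m^2 K)]; the values are illustrative only.
WIND_SPEED_TO_H = [
    (0.5, 5.0),    # near-still air
    (2.0, 15.0),   # light breeze
    (5.0, 40.0),   # moderate wind
    (10.0, 80.0),  # strong wind
]

def heat_transfer_coefficient(wind_speed_m_s):
    """Return the h value of the first breakpoint at or above the measured
    wind speed; fall back to the largest tabulated value."""
    for max_speed, h in WIND_SPEED_TO_H:
        if wind_speed_m_s <= max_speed:
            return h
    return WIND_SPEED_TO_H[-1][1]
```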
(24) The emissivity ε of surface(s) in the imaged view 2 may be determined by the processing unit 7 by receiving information from the input control unit(s) 12 comprising a value of the emissivity of the surface(s). The value of the emissivity ε of the surface(s) may be inputted manually by a user of the IR camera 1. The value of the emissivity ε may also be set to a suitable default value, for example, ε=1.
(26) By having the processing unit 7 in the IR camera 1 arranged to perform calculations of the power which is outputted from an object 3 in the imaged view 2, the IR camera 1 may increase usability for a user of the IR camera 1 when the user is attempting to analyze the object 3 by viewing the thermal images of the IR camera 1. This may allow a user of the IR camera 1 to obtain an estimate of the energy properties of the imaged object 3 in a simple and easy manner, without having to perform any manual calculations.
(27) The processing unit 7 may be arranged to perform the calculation of the power outputted from an object 3 in the imaged view 2 (referred to herein as an object output power value, P_TOT^obj) based on a calculated value of the energy exchanged by the object 3 through radiation from its surface to its surrounding environment (referred to herein as the radiated power value, P_rad^obj), illustrated by the logical block 31, and on a calculated value of the energy exchanged by the object 3 through convection from its surface to the surrounding air (referred to herein as the convection power value, P_conv^obj), illustrated by the logical block 32. The object output power value P_TOT^obj may thus be calculated by the processing unit 7 according to the following equation, Eq. 5:
P_TOT^obj = P_rad^obj + P_conv^obj (Eq. 5)
(28) Although the object output power value P_TOT^obj according to Eq. 5 is useful since it sums the power outputted from the object 3 through both radiation and convection, and may thus achieve a more accurate estimation of a value of the outputted power from the object 3, it should be noted that the object output power value P_TOT^obj may also be calculated by the processing unit 7 as either the radiated power value P_rad^obj or the convection power value P_conv^obj alone.
(29) The radiated power value P_rad^obj from the object 3 in the imaged view 2 may be calculated in the logical block 31 by the processing unit 7 according to the following equation, Eq. 6:
P_rad^obj = ε · σ · A · (mean((T_x,y^obj)^4) − (T_rad,env)^4) (Eq. 6)
wherein ε is the emissivity of the surface of the object 3, A is the surface area of the object 3 facing towards, and perceived by, the 2D camera view of the IR camera 1, σ is the Stefan-Boltzmann constant, 5.67·10^−8 W/(m^2·K^4), T_x,y^obj is a first subset of the thermal radiation data T_x,y received from the detector element 5 and the signal conditioning unit 6 which has been determined by the processing unit 7 to represent the object 3 in the imaged view 2 (herein referred to as the thermal image object area 35) for which an object output power value P_TOT^obj is to be determined, T_rad,env is a temperature reference value representing the temperature of the surrounding environment of the object 3 in the imaged view 2, and mean(·) denotes the average taken over the first subset T_x,y^obj.
(31) The convection power value P_conv^obj from the object 3 in the imaged view 2 may be calculated in the logical block 32 by the processing unit 7 according to the following equation, Eq. 7:
P_conv^obj = h · A · (mean(T_x,y^obj) − T_conv,env) (Eq. 7)
wherein h is a heat transfer coefficient [W/(m^2·K)] and T_conv,env is a temperature reference value representing the temperature of the surrounding environment of the object 3 in the imaged view 2.
(33) Thus, combining Eq. 5-7, the object output power value P_TOT^obj may be calculated by the processing unit 7 according to the following equation, Eq. 8:
P_TOT^obj = P_rad^obj + P_conv^obj = ε · σ · A · (mean((T_x,y^obj)^4) − (T_rad,env)^4) + h · A · (mean(T_x,y^obj) − T_conv,env) (Eq. 8)
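Continuing the illustrative sketch from above (names and defaults are assumptions, not the patent's implementation), Eq. 5-8 may be computed from a boolean mask selecting the first subset T_x,y^obj and a known surface area A:

```python
import numpy as np

SIGMA = 5.67e-8  # Stefan-Boltzmann constant [W/(m^2 K^4)]

def object_output_power(t_xy, obj_mask, area_m2, t_rad_env, t_conv_env,
                        emissivity=1.0, h=5.0):
    """Object output power value P_TOT^obj per Eq. 8 [W]."""
    t_obj = t_xy[obj_mask]  # first subset T_x,y^obj (thermal image object area 35)
    p_rad = emissivity * SIGMA * area_m2 * (np.mean(t_obj**4) - t_rad_env**4)  # Eq. 6
    p_conv = h * area_m2 * (np.mean(t_obj) - t_conv_env)                       # Eq. 7
    return p_rad + p_conv                                                      # Eq. 5 / Eq. 8
```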
(35) Once the processing unit 7 in the IR camera 1 has calculated the object output power value P_TOT^obj in accordance with the above, the processing unit 7 may be arranged to control the IR camera display 11 to display the object output power value P_TOT^obj to a user of the IR camera 1 together with the thermal images, e.g. as shown in the accompanying figures.
(36) As can be seen from Eq. 1-4 in the previous embodiment, output power density values may alternatively be calculated for the first subset T_x,y^obj of the thermal radiation data (referred to herein as the object output power density values, PD_x,y^obj), whereby the object output power value P_TOT^obj may be obtained from the object output power density values and the surface area A according to the following equation, Eq. 9:
P_TOT^obj = A · mean(PD_x,y^obj) (Eq. 9)
(38) However, in order for the processing unit 7 in the IR camera 1 to calculate the object output power value P_TOT^obj, the parameters T_x,y^obj, or PD_x,y^obj, and A comprised in Eq. 5-9 have to be determined by the processing unit 7. The parameters T_rad,env, T_conv,env, h and ε may be determined by the processing unit 7 in the same manner as described in the previous embodiment.
(39) The first subset T_x,y^obj of the thermal radiation data T_x,y corresponds to a thermal image object area 35 in the thermal images displayed on the display unit 11. The thermal image object area 35 represents the object 3 in the imaged view 2 for which an object output power value P_TOT^obj is to be determined. The thermal image object area 35 may be determined by the processing unit 7 by receiving information from the input control unit(s) 12 indicating a particular area in the thermal images displayed on the display unit 11 as the thermal image object area 35. By receiving the information indicating a particular area as the thermal image object area 35, the processing unit 7 may identify the first subset T_x,y^obj of the thermal radiation data T_x,y corresponding to the indicated thermal image object area 35. For example, the user of the IR camera 1 may, using the input control unit(s) 12, place a digital marker or indicator around the particular area in the thermal images to be indicated as the thermal image object area 35. Alternatively, the thermal image object area 35 and the corresponding first subset T_x,y^obj of the thermal radiation data T_x,y may be determined by the processing unit 7 by using a temperature threshold value. The temperature threshold value will directly indicate a first subset T_x,y^obj of the thermal radiation data T_x,y as the thermal image object area 35. The temperature threshold value may be a user-settable threshold value or a default threshold value. The user of the IR camera 1 may set the temperature threshold value to be used in determining the thermal image object area 35 and the corresponding first subset T_x,y^obj of the thermal radiation data T_x,y by using the input control unit(s) 12. The same procedures may be used in order to determine the object output power density values PD_x,y^obj from which the object output power value P_TOT^obj may be calculated as well.
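Both selection procedures reduce to building a boolean mask over the thermal radiation data; a short sketch under the same illustrative assumptions as the earlier examples:

```python
import numpy as np

def mask_from_marker(t_xy, top, left, bottom, right):
    """Thermal image object area 35 from a user-placed rectangular marker."""
    mask = np.zeros(t_xy.shape, dtype=bool)
    mask[top:bottom, left:right] = True
    return mask

def mask_from_threshold(t_xy, threshold_k):
    """Pixels with a temperature value equal to or above the threshold are
    taken to represent the object (cf. claims 6-7 and 17)."""
    return t_xy >= threshold_k
```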
(40) The temperature reference values T_rad,env and T_conv,env representing the temperature of the surrounding environment of the object 3 in the imaged view 2 may be determined by the processing unit 7 in the same manner as described in the previous embodiment.
(41) The surface area parameter A represents the actual surface area of the object 3 facing towards, and perceived by, the 2D camera view of the IR camera 1 in the imaged view 2. The surface area parameter A may be determined by the processing unit 7 by receiving information from the input control unit(s) 12 comprising a value of the actual physical surface area of the object 3. The user of the IR camera 1 may thus manually enter an area value to the processing unit 7 corresponding to the surface of the object 3, which may be used by the processing unit 7 as the value of the surface area parameter A. The value of the surface area parameter A may also be set to a suitable default value.
(42) Alternatively, the processing unit 7 may be arranged to receive information from the distance determining unit 8 comprised in the IR camera 1, wherein the information comprises a distance d between the actual physical object 3 in the imaged view 2 and the IR camera 1. Further, the processing unit 7 may be arranged to determine an object field-of-view o_fov 14 of the determined thermal image object area 35 in the thermal radiation data T_x,y. The object field-of-view o_fov 14 may be determined by the processing unit 7 by, for example, determining from the first subset T_x,y^obj how much of the total field-of-view of the IR camera 1, i.e. the entire thermal radiation data, is occupied by the thermal image object area 35. Since the total field-of-view of the IR camera 1 is determined by the type of components incorporated in the IR camera 1, and hence may be known by the processing unit 7, the object field-of-view o_fov 14 may be estimated based upon how much of the total field-of-view of the IR camera 1 is occupied by the thermal image object area 35. Once the distance d to the object 3 has been received and the object field-of-view o_fov 14 has been determined, the processing unit 7 may calculate an estimate of the actual physical surface area of the object 3 in the imaged view 2 facing towards, and perceived by, the 2D camera view of the IR camera 1, i.e. the surface area parameter A, based upon the received distance d and the determined object field-of-view o_fov 14. For example, this may be performed by the processing unit 7 by using the area a_x,y^pix for each thermal image pixel in the object field-of-view o_fov 14 of the determined thermal image object area 35 according to the following equations, Eq. 10-11:
a_x,y^pix = (d · o_fov^pix)^2 (Eq. 10)
A = Σ_(x,y∈obj) a_x,y^pix (Eq. 11)
(44) The field-of-view for each pixel o_fov^pix in the object field-of-view o_fov 14 of the determined thermal image object area 35 may, for example, be determined by dividing the total field-of-view of the IR camera 1 by the resolution of the detector element 5 in the IR camera 1.
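Under the small-angle reading of Eq. 10 (one pixel subtends o_fov^pix radians, so at distance d it covers roughly (d · o_fov^pix)^2 of the object surface), the surface area parameter A can be sketched as follows; the lens and detector figures in the example are invented for illustration:

```python
import math
import numpy as np

def surface_area(obj_mask, distance_m, total_fov_rad, resolution,
                 form_indicator=1.0):
    """Surface area parameter A per Eq. 10-12 [m^2]."""
    o_fov_pix = total_fov_rad / resolution     # per-pixel field of view [rad]
    a_pix = (distance_m * o_fov_pix) ** 2      # Eq. 10: area covered by one pixel
    area = np.count_nonzero(obj_mask) * a_pix  # Eq. 11: summed over object area 35
    return form_indicator * area               # Eq. 12: adapted by form indicator

# Example: 24 degree total FOV on a 320-pixel-wide detector, object 5 m away.
# A = surface_area(mask, 5.0, math.radians(24), 320)
```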
(45) According to another alternative, the processing unit 7 may be arranged to receive information from the input control unit(s) 12 which comprises a form indicator f_ind. The form indicator f_ind may be indicative of the shape or form of the actual physical object 3 in the imaged view 2. The form indicator f_ind may be used to describe and represent the actual shape, form and/or orientation of the surface of the object 3 facing, and perceived by, the 2D camera view of the IR camera 1. Furthermore, if the object 3 in the imaged view 2 can be assumed to emit energy homogenously in every direction and the object output power value P_TOT^obj is to be determined for the entire object 3, that is, not only for the surface of the object 3 facing the IR camera 1, the form indicator f_ind may be used to describe and represent the 3D shape or form of the object 3 in the imaged view 2. The form indicator f_ind, or a corresponding value stored for the particular form indicator in the memory storage unit 10 or in the processing unit 7, may be used by the processing unit 7 when calculating the object output power value P_TOT^obj. For example, this may be performed by the processing unit 7 by modifying Eq. 11 according to the following equation, Eq. 12:
A = f_ind · Σ_(x,y∈obj) a_x,y^pix (Eq. 12)
(47) Thus, the form indicator f_ind may be used to adapt the surface area parameter according to the actual 3D shape and form of the object 3 in the imaged view 2.
(48) The form indicator f_ind may be manually inputted by the user of the IR camera 1 through the input control unit(s) 12. However, the user of the IR camera 1 may also be presented in the display unit 11 with a range of alternative object views from which the user of the IR camera 1 may select the most suitable alternative using the input control unit(s) 12, whereby the form indicator f_ind associated with the selected object view may be used by the processing unit 7 when calculating the object output power value P_TOT^obj. The alternative object views may, for example, be images describing different geometric forms and/or the angle from which the object 3 having a particular geometric form is viewed by the IR camera 1.
(49) According to yet another alternative, if the distance d comprised in the information received from the distance determining unit 8 is a distance map, the processing unit 7 may be arranged to estimate the shape or form of the actual physical object 3 in the imaged view 2 using this distance map. The distance map comprises separate distance values for different subsets of the thermal radiation data T_x,y, for example, one distance value for each temperature value captured by the detector element 5. By receiving the distance map from the distance determining unit 8, the processing unit 7 may determine the distances to each temperature value in the first subset T_x,y^obj of the thermal radiation data T_x,y corresponding to the thermal image object area 35. The processing unit 7 is thus able to achieve a 3D representation of the object 3 in the imaged view 2 and may use the 3D representation of the object 3 when calculating the object output power value P_TOT^obj. This may, for example, be performed by the processing unit 7 by determining a suitable form indicator f_ind to be used in dependence of the 3D representation of the object 3, or by comprising an area estimation algorithm which may estimate the surface area of the object 3 using the 3D representation of the object 3. This may be performed for the surface of the object 3 facing, and perceived by, the 2D camera view of the IR camera 1. For example, the processing unit 7 may be arranged to incorporate the 3D representation of the object 3 in Eq. 8 by having the area a_x,y^pix for each thermal image pixel in the object field-of-view o_fov 14 of the determined thermal image object area 35 included in the calculation of the object output power value P_TOT^obj according to the following equation, Eq. 13:
P_TOT^obj = Σ_(x,y∈obj) a_x,y^pix · (ε · σ · ((T_x,y^obj)^4 − (T_rad,env)^4) + h · (T_x,y^obj − T_conv,env)) (Eq. 13)
wherein each a_x,y^pix is dependent upon its corresponding distance value d in the distance map.
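A sketch of Eq. 13 under the same illustrative assumptions as the earlier examples, with the per-pixel area a_x,y^pix derived from the distance map rather than from a single distance d:

```python
import numpy as np

SIGMA = 5.67e-8  # Stefan-Boltzmann constant [W/(m^2 K^4)]

def object_power_from_distance_map(t_xy, obj_mask, distance_map_m, o_fov_pix,
                                   t_rad_env, t_conv_env, emissivity=1.0, h=5.0):
    """Object output power value P_TOT^obj per Eq. 13 [W]."""
    d = distance_map_m[obj_mask]      # one distance value per object pixel
    a_pix = (d * o_fov_pix) ** 2      # per-pixel area a_x,y^pix from the map
    t_obj = t_xy[obj_mask]
    pd = (emissivity * SIGMA * (t_obj**4 - t_rad_env**4)
          + h * (t_obj - t_conv_env))  # per-pixel power density (cf. Eq. 4)
    return float(np.sum(a_pix * pd))   # Eq. 13
```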
(51) The processing unit 7 may be arranged to combine the use of a form indicator f_ind according to the previous alternative (for example, to determine an object output power value P_TOT^obj for the entire object 3, that is, not only for the surface of the object 3 facing the IR camera 1) with a form indicator f_ind, or 3D representation, according to the latter alternative (for example, to determine the surface of the object 3 facing, and perceived by, the 2D camera view of the IR camera 1) when calculating the object output power value P_TOT^obj.
(52) The processing unit 7 may further be arranged to receive time span information t_meas from the input control unit(s) 12 and to calculate a total energy output value E_TOT^obj for the object 3 in the imaged view 2 based on the time span information t_meas and the calculated object output power value P_TOT^obj according to the following equation, Eq. 14:
E_TOT^obj = ∫_t0^(t0+Δt) P_TOT^obj(t) dt (Eq. 14)
(53) Alternatively, the processing unit 7 may calculate a total energy output value E_TOT^obj for the object 3 in the imaged view 2 based on the time span information t_meas = Δt and the calculated object output power value P_TOT^obj according to the following equation, Eq. 15:
E_TOT^obj = P_TOT^obj · Δt (Eq. 15)
wherein P_TOT^obj is assumed to be constant over time.
(54) The processing unit 7 may calculate a total energy cost value C based on the current energy price information c_meas and the total energy output value E_TOT^obj according to the following equation, Eq. 16:
C = c_meas · E_TOT^obj = c_meas · ∫_t0^(t0+Δt) P_TOT^obj(t) dt (Eq. 16)
(55) The total energy cost value C may also be calculated using Eq. 15.
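As a worked illustration of Eq. 15-16 (the price unit and the numbers in the example are invented), with P_TOT^obj assumed constant over the time span:

```python
def total_energy_output(p_tot_obj_w, delta_t_s):
    """Total energy output value E_TOT^obj per Eq. 15 [J]."""
    return p_tot_obj_w * delta_t_s

def total_energy_cost(p_tot_obj_w, delta_t_s, price_per_kwh):
    """Total energy cost value C per Eq. 16, with c_meas given per kWh."""
    e_kwh = total_energy_output(p_tot_obj_w, delta_t_s) / 3.6e6  # J -> kWh
    return price_per_kwh * e_kwh

# Example: a 150 W object over 30 days at 0.20 per kWh -> 108 kWh, cost 21.6.
# cost = total_energy_cost(150.0, 30 * 24 * 3600, 0.20)
```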
(56) The processing unit 7 may then control the IR camera display 11 to display the total energy cost value C (as shown in the accompanying figures).
(59) In step S71, the processing unit 7 may receive captured thermal radiation data of the imaged view. In step S72, the processing unit 7 may determine the temperature reference values T_rad,env and T_conv,env and the remaining parameters comprised in Eq. 5-8. In step S73, the processing unit 7 may calculate an object output power value for an object in the imaged view as described above.
(60) In step S74, the processing unit 7 may receive a time span and current energy price information. In step S75, the processing unit 7 may calculate a total energy cost value. This may be performed by calculating a total energy output value for the object in the imaged view based on the time span information and the calculated object output power value, and by then calculating a total energy cost value based on the current energy price information and the calculated total energy output value.
(61) The description is not intended to be taken in a limiting sense, but is made merely for the purpose of describing the general principles of several optional embodiments. The scope of the invention should only be ascertained with reference to the issued claims.