SENSOR APPARATUS
20220003542 · 2022-01-06
Assignee
Inventors
- Ashley Napier (Wandsworth, GB)
- Shubham Wagh (Wandsworth, GB)
- Josh Kiff (Wandsworth, GB)
- Laura Moreira (Wandsworth, GB)
- Chris Hamblin (Wandsworth, GB)
- Mathew Holloway (Wandsworth, GB)
CPC classification
G01S17/42
PHYSICS
G01C25/00
PHYSICS
International classification
G01C11/02
PHYSICS
G01C15/00
PHYSICS
G01S17/86
PHYSICS
Abstract
There is provided a portable sensor apparatus (10) for surveying within a room (30) of a building. The sensor apparatus (10) comprises: a sensor unit (12) for temporary insertion into a room (30), the sensor unit (12) being moveable in a scanning motion, and comprising a plurality of outwardly directed sensors (16, 20, 24) arranged to capture sensor data associated with an environment of the sensor apparatus (10) as the sensor unit is moved through the scanning motion. The plurality of sensors (16, 20, 24) comprises: a rangefinder sensor (16); a thermal imaging sensor (20); and a camera (24).
Claims
1. (canceled)
2. A portable sensor apparatus for surveying a building, the sensor apparatus comprising: a sensor unit for temporarily locating at the building, the sensor unit being moveable in a scanning motion and comprising a rangefinder sensor and a plurality of outwardly directed thermal imaging sensors arranged to capture sensor data associated with an environment of the sensor apparatus as the sensor unit is moved through the scanning motion, wherein each of the rangefinder sensor and the plurality of thermal imaging sensors has a field of view, wherein the rangefinder and the thermal imaging sensors are arranged such that the field of view of each of the plurality of thermal imaging sensors at least partially overlaps the field of view of the rangefinder sensor at a given position of the sensor unit, and wherein the field of view of each of the plurality of thermal imaging sensors at least partially overlaps the field of view of at least one of the other thermal imaging sensors of the plurality of thermal imaging sensors.
3. The portable sensor apparatus of claim 2, wherein a combined field of view of the plurality of thermal imaging sensors matches the field of view of the rangefinder sensor.
4. The portable sensor apparatus of claim 2, wherein the sensor unit is configured to rotate about an axis of rotation, and wherein the scanning motion comprises rotation of the sensor unit about the axis of rotation, the rangefinder sensor and the plurality of thermal imaging sensors being mounted for rotation with the sensor unit.
5. (canceled)
6. The portable sensor apparatus of claim 4, wherein a field of view of each of the rangefinder sensor, the one or more cameras and the one or more thermal imaging sensors is such that each of the rangefinder sensor, the camera and the thermal imaging sensor is configured to detect an object spaced by a predetermined minimum distance for each of the rangefinder sensor, the camera and the thermal imaging sensor from the sensor unit in either direction along the axis of rotation.
7. The portable sensor apparatus of claim 4, wherein a field of view of at least one of the plurality of sensors encompasses a first portion of the axis of rotation away from the sensor unit in a first direction and a second portion of the axis of rotation away from the sensor unit in a second direction opposite the first direction.
8. The portable sensor apparatus of claim 4, wherein the or each thermal imaging sensor and the camera are each arranged such that a principal axis of each of the or each thermal imaging sensor and the camera intersects with the axis of rotation of the sensor unit.
9. The portable sensor apparatus of claim 4, wherein the sensor unit comprises a first camera having a first camera principal axis and a second camera having a second camera principal axis, and wherein the first camera and the second camera are each arranged such that the first camera principal axis and the second camera principal axis substantially intersect with a first circle in a plane transverse to the axis of rotation and centred on the axis of rotation.
10. The portable sensor apparatus of claim 4, wherein the sensor unit comprises a plurality of thermal imaging sensors each having a thermal imaging sensor principal axis, and wherein each of the thermal imaging sensors are arranged such that the thermal imaging sensor principal axes intersect with a second circle in a plane transverse to the axis of rotation and centred on the axis of rotation.
11. The portable sensor apparatus of claim 4, further comprising a controller to control the plurality of sensors to capture the sensor data during rotation of the sensor unit.
12. The portable sensor apparatus of claim 11, wherein the controller is configured to control the sensor unit to rotate such that the plurality of sensors are arranged to capture the sensor data associated with 360 degrees of the environment of the sensor apparatus.
13. (canceled)
14. The portable sensor apparatus of claim 4, wherein the plurality of sensors are angularly spaced over less than 90 degrees about the axis of rotation on the sensor unit.
15. The portable sensor apparatus of claim 2, wherein the sensor unit comprises a camera, and wherein the rangefinder sensor and the camera are arranged such that a field of view of the rangefinder is configured to overlap, at least partially, with a field of view of the camera.
16. The portable sensor apparatus of claim 2, wherein the sensor unit comprises a plurality of thermal imaging sensors, and wherein each of the plurality of thermal imaging sensors has a field of view, and wherein the field of view of each of the plurality of thermal imaging sensors at least partially overlaps the field of view of at least one of the other thermal imaging sensors of the plurality of thermal imaging sensors.
17. The portable sensor apparatus of claim 2, further comprising a support structure, wherein the sensor unit is spaced from a ground surface by the support structure.
18. The portable sensor apparatus of claim 17, wherein the sensor unit is mounted to the support structure for rotation relative to the support structure.
19. The portable sensor apparatus of claim 18, wherein the sensor unit is mounted for motorised rotation relative to the support structure.
20-24. (canceled)
25. The portable sensor apparatus of claim 2, further comprising one or more further sensors comprising one or more of a temperature sensor, a humidity sensor, a carbon dioxide sensor, and/or an air particulate sensor.
26. The portable sensor apparatus of claim 25, wherein at least one of the further sensors is separate to the sensor unit.
27. The portable sensor apparatus of claim 25, wherein the further sensor comprises a locating tag configured to be detectable by at least one of the sensors of the sensor unit, whereby to locate the further sensor in the environment sensed by the sensors of the sensor unit.
28. The portable sensor apparatus of claim 25, further comprising a controller and a wireless transceiver configured to be in wireless communication with a further device, and wherein the controller is configured to output sensor data from the plurality of sensors to the further device via the wireless transceiver.
29-39. (canceled)
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0079] Embodiments of the invention are further described hereinafter with reference to the accompanying drawings.
DETAILED DESCRIPTION
[0097] The sensor unit 12 has a housing 14 which comprises an arrangement of sensors 20, 16, 24. In one example, the arrangement of sensors includes a thermal imaging sensor 20, a laser rangefinder 16, and an optical camera 24, for example an RGB camera 24. The thermal imaging sensor 20 and the optical sensor 24 in the form of a camera 24 are arranged to provide a wide field of view of the environment being scanned.
[0098] The front surface 14A of the housing has corresponding ports 18, 22 for each of the cameras 24 and thermal imaging sensors 20A, 20B, 20C, 20D. In the illustrated example, the thermal imaging sensors 20 are arranged in a first plane 66, the cameras 24 are arranged in a second plane 64 and the laser rangefinder 16 is secured within a recess 19.
[0099] The housing 14 is arranged such that one or more of the sensors are secured horizontally beyond an outermost edge of the bottom surface 14C to provide an unobstructed view for the sensors capturing data below the sensor unit 12. The housing 14 is also arranged such that one or more of the sensors are secured horizontally beyond an outermost edge of the top surface 14B to provide an unobstructed view for the sensors capturing data above the sensor unit 12. Arranging the housing 14 in this manner enables data to be captured from directly above and/or beneath the sensor unit 12. In the illustrated example, thermal imaging sensors 20A and 20B and camera 24A capture data in front of and above the sensor unit 12, while thermal imaging sensors 20C and 20D and camera 24B capture data in front of and below the sensor unit 12. While two thermal imaging sensors 20A, 20B are used to capture an upper region of the room, it would be apparent that a single sensor with a sufficiently wide field of view may be used to capture the upper region. In some cases, one thermal imaging sensor with a sufficiently wide field of view may be sufficient to measure data from the region in front of the sensor unit 12. While two cameras 24 have been described, it would be apparent that one camera having a sufficiently wide field of view may be used to view the entire region in front of the sensor unit 12. The base portion 26 in this example includes a light bar 27 to illuminate the region below the sensor unit 12.
[0100] The laser rangefinder 16 preferably has a field of view of greater than 180 degrees. Therefore, as the sensor unit 12 rotates through 360 degrees, the laser rangefinder 16 can capture spatial data of the entire room. Once the sensor unit 12 has rotated through 360 degrees, a complete data set of the room including multiple types of data corresponding to each of the sensors will be captured. While one particular sensor arrangement has been described, it would be apparent that this arrangement is not essential and that the sensors and/or housing 14 may be arranged differently while still providing the requisite coverage. It would also be apparent that more or fewer thermal imaging sensors 20A, 20B, 20C, 20D and cameras 24A, 24B may be used to capture the required data.
[0103] The sensor unit 12 is preferably calibrated at least once before deployment. This allows for accurate sensor measurement, taking account of any manufacturing tolerances of the housing 14 or frame 58 affecting the precise positions of the sensors, and to account for any variations in sensing components or lenses. The calibration process helps to ensure the captured data is accurate and allows for corrections to be applied prior to capturing any field data. Calibration may also be used to correct for temperature of the environment. For example, it will be understood that variations in the temperature of the environment of the sensor apparatus can result in changes in the physical dimensions of one or more components of the sensor apparatus. One method of calibrating the cameras 24A, 24B is to use a calibration grid. Typically, a calibration grid includes contrasting shapes of different known sizes and relative position displayed on a surface. In one example, these can take the form of multiple black squares printed on a white board and placed at a predetermined or otherwise accurately determinable location relative to the sensor unit 12. In one example, calibration of the thermal imaging sensors 20A, 20B, 20C, 20D involves directing the thermal imaging sensors 20 towards multiple thermal surfaces and selectively bringing the thermal surfaces to one or more known temperatures. As the temperature of the thermal surfaces changes, the thermal imaging sensors 20 detect the thermal surfaces and the changes and this data is used to calibrate the thermal imaging sensors 20A, 20B, 20C, 20D. In one example, calibration of the laser rangefinder 16 may be performed by mounting the sensor unit 12 a known distance from a target. The laser rangefinder 16 can then capture data indicating a measured distance to the target and apply any correction factors. 
In one example, the laser rangefinder 16 may be calibrated prior to the thermal imaging sensors 20 and the optical cameras 24. In this case, the thermal imaging sensors 20 can detect the thermal surfaces and relate the locations of the thermal surfaces with the depth data captured by the laser rangefinder 16. Once the thermal imaging sensors 20 have been calibrated, the sensor unit 12 may rotate about the vertical axis 62 and face the calibration grid. The cameras 24 can then be calibrated using the calibration grid and relate the locations of the calibration grid with the depth data captured by the laser rangefinder 16. While separate calibration of the thermal imaging sensors 20 and optical cameras 24 has been described, it would be apparent this need not be the case, and two or more of the sensors may be calibrated using a single surface without needing to rotate the sensor unit 12.
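The rangefinder calibration described above, in which the sensor unit is mounted a known distance from a target and correction factors are derived from the measured distances, can be sketched as follows. The offset-only error model and all names here are illustrative assumptions, not the described apparatus.

```python
# Illustrative sketch: derive an additive correction factor for a laser
# rangefinder mounted a known distance from a target, from repeated
# measurements. An offset-only error model is assumed for simplicity.

def rangefinder_offset(known_distance_m, measured_distances_m):
    """Return an additive correction so that corrected = measured + offset."""
    mean_measured = sum(measured_distances_m) / len(measured_distances_m)
    return known_distance_m - mean_measured

def apply_correction(measured_m, offset_m):
    """Apply the correction factor to a raw measurement."""
    return measured_m + offset_m

# Sensor unit mounted 5.0 m from a target; three repeated measurements.
offset = rangefinder_offset(5.0, [5.02, 5.03, 5.01])
corrected = apply_correction(5.02, offset)
```

In practice a richer error model (for example scale as well as offset) may be needed; the sketch shows only the principle of applying a correction derived from a known target distance.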
[0104] A wireless transceiver 59 is also provided, allowing the sensor unit 12 to be in wireless communication with a further device.
[0106] In the illustrated example, the sensor unit 12 is rotatable about a horizontal axis at pivot 81, which preferably includes a clamp for securing the rotational position of the sensor unit about the horizontal axis. The sensor unit 12 can also be rotated about a vertical axis by adjustment of the adjustable clamp at bracket 17. Therefore, the sensor unit 12 can be positioned with a desired field of view by moving the tripod 15 and then adjusting the position of the sensor unit 12 by using the clamps of the pivot 81 and the bracket 17.
[0107] The sensor unit 12 has a housing 14 which comprises an arrangement of sensors 20, 16, 24. In one example, the arrangement of sensors includes a thermal imaging sensor 20, a laser rangefinder 16, and an optical camera 24, for example an RGB camera 24. The thermal imaging sensor 20, the optical sensor 24 in the form of the camera 24, and the laser rangefinder 16 are arranged in the same plane, having parallel principal axes, to provide a field of view of the environment in front of the scanning unit 12.
[0109] In this example, the fields of view of the laser rangefinder sensor 16, the camera 24, and the combined field of view of the thermal imaging sensors 20A, 20B are substantially the same, for example the same field of view. That is, each of the sensors 20A, 20B, 16, 24, is configured to scan the same area of the environment for any given position of the sensor unit 12.
[0110] Alternatively, the fields of view of the laser rangefinder sensor 16 and the camera 24 and the combined field of view of the thermal imaging sensors 20A, 20B may simply overlap. Specifically, the combined field of view of the thermal imaging sensors 20A, 20B overlaps the field of view of the laser rangefinder 16 and the field of view of the camera 24, the field of view of the camera 24 overlaps the field of view of the laser rangefinder 16 and the combined field of view of the thermal imaging sensors 20A, 20B, and the field of view of the laser rangefinder 16 overlaps the combined field of view of the thermal imaging sensors 20A, 20B and the field of view of the camera 24. Providing overlapping fields of view in this way makes it easier to combine the data from the different sensors.
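The overlapping-field-of-view condition can be sketched as a simple angular-interval check. The specific centre angles and widths below are illustrative assumptions, not values taken from the described apparatus.

```python
# Minimal sketch: model each sensor's field of view as a horizontal
# angular interval (centre, width) in degrees, and test pairwise overlap.
# All angles below are illustrative assumptions.

def fov_interval(centre_deg, width_deg):
    """Return the (low, high) angular limits of a field of view."""
    return (centre_deg - width_deg / 2.0, centre_deg + width_deg / 2.0)

def overlaps(fov_a, fov_b):
    """True when two angular intervals share a non-empty region."""
    (a_lo, a_hi), (b_lo, b_hi) = fov_a, fov_b
    return a_lo < b_hi and b_lo < a_hi

rangefinder = fov_interval(0.0, 190.0)   # wider than 180 degrees
thermal_a = fov_interval(45.0, 110.0)    # upper thermal imaging sensor
thermal_b = fov_interval(-45.0, 110.0)   # lower thermal imaging sensor
```

With these assumed values, each thermal field of view overlaps the rangefinder field of view and the other thermal field of view, matching the arrangement described above.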
[0112] A further example of the sensor unit 12 is now described, in contrast to the example of the sensor unit 12 described above.
[0113] In this example, the laser rangefinder 16 preferably comprises a solid state LiDAR rangefinder. A solid state LiDAR rangefinder 16 will be less susceptible to motion of the sensor unit 12, and so is better suited to applications where a less controlled motion of the sensor unit 12 is employed during the scanning motion. As described above, the solid state LiDAR rangefinder 16 has a field of view that matches or overlaps the field of view of the thermal imaging sensor 20 and the camera 24. In this way, the sensor data captured by the three sensors 20, 16, 24 can be more easily processed to match the corresponding locations of each scan.
[0114] While one particular sensor arrangement has been described, it would be apparent that this arrangement is not essential and that the sensors and/or housing 14 may be arranged differently while still providing the requisite coverage. It would also be apparent that more or fewer thermal imaging sensors 20 and cameras 24 may be used to capture the required data, provided that the combined fields of view of the multiple thermal imaging sensors and/or multiple cameras provide a field of view that matches or overlaps those of the other sensors, including the laser rangefinder 16.
[0115] The sensor unit 12 is preferably calibrated at least once before deployment. This allows for accurate sensor measurement, taking account of any manufacturing tolerances of the housing 14 affecting the precise positions of the sensors, and to account for any variations in sensing components or lenses. The calibration process can be performed in the same manner as described previously.
[0116] The sensor unit 12 also includes a wireless transceiver (not shown) and controller (not shown). The wireless transceiver allows the sensor unit 12 to be in wireless communication with a further device, for example a mobile device, such as a tablet computer, or a server. This enhances the portability of the sensor apparatus 10, as a user can simply pick up and move the sensor apparatus 10 from room to room without worrying about trailing cables. Once data is captured by the sensor unit 12, it can be transmitted to the further device for storage or processing. In one example, the wireless transceiver is a wireless router. In one example, the further device may be part of the sensor apparatus 10 or may be a remote device separate to the sensor apparatus 10. The controller or further device may be configured to process the captured data from the solid state LiDAR rangefinder 16, thermal imaging sensor 20 and camera 24 and to provide the complete dataset of the scanned environment. In one example, data processing may happen offline, for example on the mobile device or the server.
[0122] An exemplary deployment of the sensor apparatus 10 is now described.
[0123] The combined data set is preferably in a format that is compatible with existing building information management databases. For example, the combined data set may comprise database fields that are consistent with existing building information management databases, so that data in these fields can be directly copied or transferred into the existing building information management database. The combined data set may comprise further fields, not present in the building information management databases, that contain further information that is not compatible with the building information management databases. For example, the combined data set may comprise a plurality of text fields for containing information from the scan data, and a plurality of further fields containing further scan information, such as 3D model data. One of the fields of the combined data set may comprise a link to the further scan information hosted on a remote server, for example a cloud server. In this way, existing building information management databases can be used to store the scan data, and where compatibility is not possible, for example for 3D model data, the existing building information management databases can be supplemented by the further scan information stored on the remote server.
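A record of the combined data set of the kind described above might be sketched as follows. All field names are illustrative assumptions, not a prescribed building information management schema, and the URL is a hypothetical placeholder for further scan information hosted on a remote server.

```python
# Sketch of one combined data set record: text fields compatible with an
# existing building information management (BIM) database, plus a further
# field linking to 3D model data on a remote server. All field names and
# values here are illustrative assumptions.

def make_record(room_name, surface_area_m2, mean_surface_temp_c, model_url):
    """Build a record mixing BIM-compatible fields with a link field."""
    return {
        # BIM-compatible text fields, directly transferable to the database
        "room_name": room_name,
        "surface_area_m2": surface_area_m2,
        "mean_surface_temp_c": mean_surface_temp_c,
        # further field: link to scan information the database cannot hold
        "model_data_url": model_url,
    }

record = make_record("Kitchen", 12.5, 19.8,
                     "https://example.com/scans/kitchen-3d-model")
```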
[0124] One or more of the sensors may also be used to detect an identifier in the room 30. The identifier may be used to tag an object or location in the room 30. The identifier may be affixed to an object semi-permanently or may be temporarily inserted into the room 30 by the user only for the duration of the sensing by the portable sensor apparatus 10. The identifier may be used by the user as a placeholder for subsequent additional data input into the combined dataset. The additional data may be data not captured by the sensors. In one example, each identifier may be a visual identifier and the cameras 24 may be used to detect the visual identifiers. One example of a visual identifier is a barcode, such as a QR® code.
[0125] In some cases, the identifier may indicate that an object should be kept in the combined dataset. In some cases, the identifier may indicate that an object should be ignored in the combined dataset. For example, one or more identifiers may be applied to any of electrical appliances 36, tables 40, cupboards 42 or work surfaces 44 in the illustrated room 30. Where these objects are not indicative of the environmental status of the room 30, these can be subsequently ignored in the combined dataset. In one example, an identifier 56 may be attached to a radiator 39 to indicate the radiator 39 should be retained in the combined dataset. This would allow for additional information about the radiator 39 to be included in the combined dataset. A different identifier 48 may be attached to a boiler 46. For example, additional information regarding the status of the boiler 46 can be accurately recorded as part of the combined dataset captured by the sensor unit 12. Further identifiers 52A, 52B may be attached to one or more windows 32. An identifier may be attached to electrical sockets (not shown) within the room 30. One or more identifiers may be attached to one or more pipes (not shown) within the room 30 or on a surface indicating the presence of a pipe behind one or more of the walls 31. An identifier 54 may be attached to a further sensor (not shown) within the room 30. This allows the combined dataset to include data not otherwise provided by the sensors within the sensor unit 12 itself.
[0126] The combined dataset may include data manually input by the user, data recorded by the further sensor, or data captured by other data sources.
[0127] For example, an operator may augment the data by adding annotations, removing data, and/or replacing information in the data set. For example, the operator may select at least some of the data to be deleted, for example the operator may select personal or confidential information captured in the scan data for deletion. Alternatively or additionally, the operator may replace at least some of the data with library information retrieved from a database. For example, data relating to an appliance, such as a refrigerator, may be isolated from other features of the building environment and replaced with library data relating to the refrigerator. The library data may include further information on the appliance, such as manufacturer details, maintenance history, and warranty information. Additionally or alternatively, the library data may comprise 3D model data for the appliance, which can replace or augment the scan data. In another example, the operator may remove all of the data associated with a feature, so that the feature is absent from the building model. In this example, walls behind the removed feature, such as a painting, may be extrapolated to fill the space left by removing the data. Data generated by extrapolating the wall may replace the removed data.
[0128] In one example the further sensor is a damp sensor (not shown). The damp sensor may be portable and be introduced into the room 30 by the user or be fixed within the room 30. In this example, the damp sensor can be used to indicate the presence of damp or mould at a specific location in the room 30, for example, on a particular wall 31D.
[0129] The additional data may be input during data capture or after data capture. The additional data may be input on-site or offline. Examples of additional data may include manufacturer details of the object tagged or in proximity to the tag, dates of installation or maintenance of the object tagged or in proximity to the tag, details of associated componentry or consumables, etc. Importantly, the additional data is localised in the scanned environment as its location is recorded by the sensor unit 12 when scanning the room 30. By integrating the additional data in the combined dataset of the room 30 and associating the dataset with a particular building, a more complete dataset indicative of the structural status of the building can be obtained. This can in turn be used to significantly reduce the inefficiencies with regard to building maintenance.
[0132] In a further example, the sensor unit 12 comprises a rangefinder sensor 16 having a field of view 170 and a plurality of thermal imaging sensors 20A, 20B, 20C, each having a field of view 171.
[0136] As illustrated, the field of view 171 of each thermal imaging sensor 20A, 20B, 20C overlaps a field of view 171 of at least one other thermal imaging sensor 20A, 20B, 20C in overlapping regions 172. The thermal imaging sensors 20A, 20B, 20C thereby have a combined field of view, provided by aggregating the fields of view 171 of all of the thermal imaging sensors 20A, 20B, 20C, defined by the outer limits of each field of view 171.
[0139] As explained above, the field of view 171 of each of the thermal imaging sensors 20A, 20B, 20C also overlaps the field of view of the rangefinder sensor 16, at least during movement of the sensor unit 12 through the scanning motion. Therefore, the field of view 171 of each of the thermal imaging sensors 20A, 20B, 20C at least partially overlaps the field of view 170 of the rangefinder sensor 16, and the field of view 171 of each of the thermal imaging sensors 20A, 20B, 20C at least partially overlaps the field of view 171 of at least one other thermal imaging sensor 20A, 20B, 20C.
[0140] In this way, the thermal imaging sensors 20A, 20B, 20C capture thermal data from a common surface point in the environment of the sensor apparatus 10, and the common surface point is also detected by the rangefinder sensor 16. This arrangement is advantageous for calibration of the thermal imaging sensors 20A, 20B, 20C, as described further hereinafter.
[0142] It will also be appreciated that the sensor unit 12 may further include at least one camera, for example an RGB camera, arranged to capture images of the environment of the portable sensor apparatus 10. The camera or cameras may be arranged to have a field of view or combined field of view that overlaps the fields of view of the rangefinder sensor 16 and the thermal imaging sensors 20A, 20B, 20C.
[0143] In this way, the sensor unit 12 is configured to capture scan data associated with the environment of the portable sensor apparatus 10, in particular scan data corresponding to surfaces of the environment of the portable sensor apparatus 10, i.e. the range of the surfaces, the temperature of the surfaces, and optionally an appearance of the surfaces. This scan data can be used to construct a point cloud model that is representative of the environment of the portable sensor apparatus 10. Such a point cloud model would comprise a plurality of points, each representative of a surface point, and each point having range data, thermal data, and optionally image data.
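Such a point cloud model can be sketched as follows; the field layout is an illustrative assumption rather than a prescribed data format.

```python
# Sketch of a point cloud of the kind described: each surface point
# carries range-derived position data, thermal data, and optional image
# data. The NamedTuple layout is an illustrative assumption.
from typing import NamedTuple, Optional, Tuple

class SurfacePoint(NamedTuple):
    position: Tuple[float, float, float]        # from the rangefinder sensor
    temperature_c: float                        # from a thermal imaging sensor
    rgb: Optional[Tuple[int, int, int]] = None  # from the camera, if present

cloud = [
    SurfacePoint((1.0, 0.2, 2.4), 19.5, (200, 180, 170)),
    SurfacePoint((1.1, 0.2, 2.4), 19.6),  # image data is optional
]
```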
[0145] The thermal imaging sensors 20A, 20B, 20C are sensitive to surface temperatures that are detected in the environment of the portable sensor apparatus 10, and preferably detect temperature to an accuracy of at least 1/10 of a degree Celsius. When detecting surface temperatures using multiple thermal imaging sensors, as described with reference to the embodiments of the portable sensor apparatus, it is advantageous for the thermal imaging sensors to be calibrated to each other so that temperature variations across the surfaces are detected. It is advantageous to be able to detect temperature variations across a surface because temperature variations are indicative of the presence and effectiveness of thermal insulation, or the thermal performance of a scanned object. Moreover, thermal imaging sensors will have a drift or inconsistency that varies during operation of the portable sensor apparatus, meaning that it is advantageous to calibrate them in a constant process during use, based on the scan data, rather than when the portable sensor apparatus is manufactured.
[0146] A method 180 of calibrating the thermal imaging sensors 20 comprises a first step 181 of collecting scan data from the rangefinder sensor 16 and the plurality of thermal imaging sensors 20.
[0147] The data may be collected by performing a calibration scan using the portable sensor apparatus 10 described with reference to the preceding examples.
[0148] The method further includes a step 182 of combining the scan data from the rangefinder sensor 16 and the plurality of thermal imaging sensors 20 into a combined data set. In this step 182, the combined data set comprises a point cloud of the environment of the portable sensor apparatus 10, where each point of the point cloud is representative of a surface point and having an associated value from the rangefinder scan data (i.e. a distance from the scanning unit 12 to the surface point) and at least one thermal data value from at least one of the thermal imaging sensors 20A, 20B, 20C.
[0149] In some examples, the method 180 is performed by a processor of the portable sensor apparatus 10, for example a controller of the sensor unit 12. In other examples, the method 180 is performed on a further device, for example a tablet computer 161.
[0150] From the combined data set generated in step 182, the method 180 performs a step 183 of identifying surface points with more than one thermal data value. That is, step 183 identifies surface points in the combined data set where more than one thermal imaging sensor has provided a thermal data value. Such a surface point is referred to as a ‘common surface point’. Common surface points therefore have range data from the rangefinder sensor 16 and at least two thermal data values from at least two of the thermal imaging sensors 20.
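Step 183 can be sketched as follows; the dictionary layout for the combined data set is an illustrative assumption, not a prescribed format.

```python
# Sketch of step 183: each surface point in the combined data set carries
# its range value and a list of thermal data values (one per thermal
# imaging sensor that detected it). Common surface points are those with
# at least two thermal data values. The dict layout is an assumption.

def find_common_surface_points(combined):
    """combined: {point_id: {"range_m": float, "thermal_c": [values]}}."""
    return {
        point_id: data
        for point_id, data in combined.items()
        if len(data["thermal_c"]) >= 2
    }

combined = {
    "p1": {"range_m": 2.4, "thermal_c": [19.5]},        # seen by one sensor
    "p2": {"range_m": 2.5, "thermal_c": [19.6, 19.9]},  # common surface point
}
common = find_common_surface_points(combined)
```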
[0151] Step 183 may comprise identifying one common surface point, or a plurality of common surface points in the combined data set. As described further below, preferably the method step 183 comprises identifying all common surface points in the combined data set.
[0152] The method 180 further includes a step 184 of calculating a difference between the at least two thermal data values associated with the or each common surface point. The difference is indicative of a difference in thermal data values between two thermal imaging sensors 20 that have detected the common surface point. The difference is calculated by subtracting one thermal data value from the other. A difference can be calculated for each common surface point of the combined data set.
[0153] Next, the method 180 includes a step 185 of determining a calibration factor. The calibration factor is based on the difference calculated in step 184. The calibration factor may be based on only one difference (i.e. only one common surface point), or can be based on a plurality of differences (i.e. a plurality of common surface points). The calibration factor can be applied to the thermal data values obtained by at least one of the thermal imaging sensors 20.
[0154] The calibration factor determined in step 185 is typically a positive or negative value that is applied to at least one of the thermal imaging sensors 20 such that both thermal imaging sensors obtain the same thermal data values for the common surface point. For example, if a first thermal imaging sensor 20A has detected a thermal reading of 27.5 degrees Celsius for a common surface point, and a second thermal imaging sensor 20B has detected a thermal reading of 27.9 degrees Celsius for the same common surface point, the difference is 0.4 degrees Celsius. In this case, a calibration factor of +0.4 degrees Celsius may be applied to the scan data of the first thermal imaging sensor 20A, or a calibration factor of −0.4 degrees Celsius may be applied to the scan data of the second thermal imaging sensor 20B, or a calibration factor of +0.2 degrees Celsius may be applied to the scan data of the first thermal imaging sensor 20A and −0.2 degrees Celsius may be applied to the scan data of the second thermal imaging sensor 20B.
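The worked example above, split symmetrically between the two sensors, can be sketched as follows; the helper name is illustrative.

```python
# Sketch of the worked example: the difference between two readings of a
# common surface point yields equal-and-opposite half corrections for the
# two thermal imaging sensors, so both corrected readings agree.

def split_calibration(reading_a_c, reading_b_c):
    """Return (factor_a, factor_b) such that corrected readings match."""
    difference = reading_b_c - reading_a_c
    return (difference / 2.0, -difference / 2.0)

# Sensor 20A reads 27.5 degrees C, sensor 20B reads 27.9 degrees C.
factor_a, factor_b = split_calibration(27.5, 27.9)
corrected_a = 27.5 + factor_a
corrected_b = 27.9 + factor_b
```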
[0155] In step 186, the calibration factor is then applied to the thermal data values detected by the appropriate thermal imaging sensor 20, such that the thermal data values across the entire combined data set are calibrated. In some examples, the calibration factor is applied only to thermal data values detected by some of the thermal imaging sensors, leaving the thermal data values detected by at least one thermal imaging sensor unchanged. For example, one thermal imaging sensor may be taken as a reference, and the thermal data values detected by the other thermal imaging sensor or sensors are corrected to match the reference thermal imaging sensor.
[0156] In preferred examples, a single calibration factor may be determined based on all common surface points in the combined data set. In this example, described in detail below, a single calibration factor is calculated based on the differences of all common surface points, and the calibration factor is configured to minimise the aggregate difference across all of the common surface points. This calibration factor is then applied to all of the thermal data values across each of the multiple thermal imaging sensors. Any remaining difference between thermal data values of different thermal imaging sensors for common surface points can be rectified by taking one or the other, or by averaging the calibrated thermal data values for that common surface point.
[0157] A further example calibration method is described hereinafter.
[0158] In a first step, analogous to step 181 of the method 180, the range data and thermal data are combined into a combined data set, and the common surface points detected by two of the thermal imaging sensors 20 are identified.
[0159] In a step analogous to step 184 of the method 180, a difference measurement is calculated for each common surface point as:

e_ikjm = (l_i(u_ik, v_ik) + δ_i) − (l_j(u_jm, v_jm) + δ_j)
[0160] where l_i(u_ik, v_ik) and l_j(u_jm, v_jm) are the thermal data values from the two thermal imaging sensors, and where δ_i and δ_j are calibration factors applied to the thermal data values. In some cases there will be a pre-existing calibration factor; otherwise the calibration factors are zero.
[0161] The above difference measurement is calculated for a plurality of, and preferably all of, the common surface points in the combined data set.
[0162] From the calculated differences, an error vector is defined by stacking the individual difference measurements:

e(δ) = [e_ikjm(δ_i, δ_j)]
[0163] If the two thermal imaging sensors are perfectly calibrated to each other (the calibration factors are zero and the differences are all zero), or if the current calibration factors result in perfect calibration (the differences are zero), then the error vector contains only zeros. Otherwise, in a step analogous to step 185 of the method 180, the error vector can be written in matrix form as:

e(δ) = Cδ − m
[0164] where C is a correspondence matrix in which each row consists of all zeros apart from the two columns associated with the calibration factors δ_i and δ_j of that individual difference measurement, which contain a corresponding +1 and −1, e.g.:

C_ikjm = [0, 0, . . . , 1, 0, 0, 0, −1, 0, . . . , 0, 0]
[0165] and where m is a vector of measurements l_j(u_jm, v_jm) − l_i(u_ik, v_ik), the differences between the thermal data values, so that each entry of Cδ − m reproduces the corresponding difference measurement e_ikjm.
[0166] Then, one column of C, together with the corresponding entry of δ, is removed. This fixes the removed calibration factor at zero (making that thermal imaging sensor the reference) and renders C^T C invertible, so that the least squares solution can be computed as:

δ = (C^T C)^−1 C^T m
[0167] The value of the calibration factor, δ, can then be determined to minimise e.
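As an illustrative sketch only (not part of the patent disclosure), the closed form solve can be expressed with NumPy. Each thermal image or sensor i is given its own calibration factor δ_i; the pair-list format, function name, and choice of reference are assumptions made for the example.

```python
import numpy as np

def solve_calibration(pairs, n_images, reference=0):
    """Closed-form thermal calibration sketch.

    pairs: list of (i, j, l_i, l_j) tuples, one per common surface point,
           where l_i and l_j are the thermal readings of images i and j.
    Returns an array delta of calibration factors, with the reference
    image's factor fixed at zero.
    """
    C = np.zeros((len(pairs), n_images))
    m = np.zeros(len(pairs))
    for row, (i, j, li, lj) in enumerate(pairs):
        C[row, i] = +1.0   # column of calibration factor delta_i
        C[row, j] = -1.0   # column of calibration factor delta_j
        m[row] = lj - li   # measurement entry, so e = C @ delta - m
    # Remove the reference column so that C^T C is invertible (gauge fix)
    keep = [k for k in range(n_images) if k != reference]
    C_red = C[:, keep]
    delta_red = np.linalg.solve(C_red.T @ C_red, C_red.T @ m)
    delta = np.zeros(n_images)
    delta[keep] = delta_red
    return delta
```

With the single common surface point of paragraph [0154] (readings 27.5 and 27.9 degrees Celsius), the solve returns δ = [0, −0.4], matching the option in which only the second sensor's data is corrected.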
[0168] The determined calibration factor, δ, is then added to the thermal data values for each point of the point cloud, which provides overall thermal calibration for the 3D point cloud. As mentioned above, any remaining inconsistency between thermal data values of different thermal imaging sensors for common surface points can be rectified by taking one or the other, or by averaging the calibrated thermal data values for that common surface point.
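Applying the determined factors across the point cloud, with residual duplicates averaged as described in paragraph [0168], might be sketched as follows; the data layout and names here are hypothetical, not taken from the patent.

```python
def apply_calibration(points, delta):
    """Apply per-sensor calibration factors to point-cloud thermal values.

    points: list of (sensor_id, point_id, thermal_value) observations.
    delta:  mapping from sensor_id to its calibration factor.
    Returns {point_id: calibrated thermal value}, averaging the calibrated
    values where more than one sensor observed the same surface point.
    """
    sums, counts = {}, {}
    for sensor_id, point_id, value in points:
        calibrated = value + delta[sensor_id]
        sums[point_id] = sums.get(point_id, 0.0) + calibrated
        counts[point_id] = counts.get(point_id, 0) + 1
    return {pid: sums[pid] / counts[pid] for pid in sums}
```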
[0169] The method of calibration described above is a closed form solution that does not require iterative numerical optimisation techniques, meaning that the method is computationally efficient and can work effectively for a large number of images. This is important because the scan data for a normal-sized room (e.g. a living room of a house) might include upwards of 300 thermal images, and the method can efficiently calibrate these images to each other in a single closed form matrix operation, as per the method described above.
[0170] In summary, there is provided a portable sensor apparatus 10 for surveying within a room 30 of a building. The sensor apparatus 10 comprises: a rotatable sensor unit 12 for temporary insertion into a room 30, the rotatable sensor unit 12 being rotatable about a substantially vertical axis of rotation 62 and comprising a plurality of outwardly directed sensors 16, 20, 24 mounted for rotation with the rotatable sensor unit 12 and arranged to capture sensor data associated with an environment of the sensor apparatus 10. The plurality of sensors 16, 20, 24 comprises: a rangefinder sensor 16; one or more thermal imaging sensors 20A, 20B, 20C, 20D; and one or more cameras 24A, 24B.
[0171] There is also provided a portable sensor apparatus for surveying a building. The sensor apparatus comprises a sensor unit for temporarily locating at the building. The sensor unit is moveable in a scanning motion and comprises a plurality of outwardly directed sensors arranged to capture sensor data associated with an environment of the sensor apparatus as the sensor unit is moved through the scanning motion. The plurality of sensors comprises a rangefinder sensor, a thermal imaging sensor, and a camera.
[0172] There is also provided a portable sensor apparatus for surveying a building. The sensor apparatus comprises a sensor unit for temporarily locating at the building, the sensor unit being moveable in a scanning motion. The sensor unit comprises a rangefinder sensor and a plurality of outwardly directed thermal imaging sensors arranged to capture sensor data associated with an environment of the sensor apparatus as the sensor unit is moved through the scanning motion. Each of the rangefinder sensor and the plurality of thermal imaging sensors has a field of view, and the field of view of each of the plurality of thermal imaging sensors at least partially overlaps the field of view of the rangefinder sensor. In addition, the field of view of each of the plurality of thermal imaging sensors at least partially overlaps the field of view of at least one of the other thermal imaging sensors of the plurality of thermal imaging sensors.
[0173] There is also provided a method of managing a record of a state of a building. The method comprises: positioning the portable sensor apparatus described above in a building; activating the sensor apparatus to capture sensor data from each of the plurality of sensors of the sensor apparatus, the sensor data being indicative of the environment of the sensor apparatus; combining the sensor data from each of the plurality of sensors of the sensor apparatus into a combined data set; and storing the combined data set in a database associated with the building as a record of a state of the building.
[0174] There is also provided a method of calibrating a portable sensor apparatus for surveying a building. In these examples, the sensor apparatus comprises a rangefinder sensor adapted to capture range data for surfaces in the environment of the sensor apparatus, and first and second thermal imaging sensors adapted to capture thermal data of the surfaces in the environment of the sensor apparatus. The first and second thermal imaging sensors have at least partially overlapping fields of view. The method of calibrating the portable sensor apparatus comprises: combining the range sensor data and the thermal sensor data into a combined data set representative of the surfaces of the environment of the sensor apparatus; identifying a surface point in the combined data set where both of the first thermal imaging sensor and the second thermal imaging sensor have captured thermal data; determining a difference between the thermal data of the first thermal imaging sensor at the identified surface point and the thermal data of the second thermal imaging sensor at the identified surface point; determining a calibration factor configured to calibrate the first thermal imaging sensor and/or the second thermal imaging sensor; and applying the calibration factor to the thermal data of at least one of the first thermal imaging sensor and the second thermal imaging sensor.
[0175] Throughout the description and claims of this specification, the words “comprise” and “contain” and variations of them mean “including but not limited to”, and they are not intended to (and do not) exclude other moieties, additives, components, integers or steps. Throughout the description and claims of this specification, the singular encompasses the plural unless the context otherwise requires. In particular, where the indefinite article is used, the specification is to be understood as contemplating plurality as well as singularity, unless the context requires otherwise.
[0176] Features, integers, characteristics or groups described in conjunction with a particular aspect, embodiment or example of the invention are to be understood to be applicable to any other aspect, embodiment or example described herein unless incompatible therewith. All of the features disclosed in this specification (including any accompanying claims, abstract and drawings), and/or all of the steps of any method or process so disclosed, may be combined in any combination, except combinations where at least some of such features and/or steps are mutually exclusive. The invention is not restricted to the details of any foregoing embodiments. The invention extends to any novel one, or any novel combination, of the features disclosed in this specification (including any accompanying claims, abstract and drawings), or to any novel one, or any novel combination, of the steps of any method or process so disclosed.