AUTOMATED IMAGING OF PHOTOVOLTAIC DEVICES USING AN AERIAL VEHICLE AND AUTOMATED FLIGHT OF THE AERIAL VEHICLE FOR PERFORMING THE SAME
20240231391 · 2024-07-11
Assignee
Inventors
CPC classification
B64U2101/30
PERFORMING OPERATIONS; TRANSPORTING
B64U50/23
PERFORMING OPERATIONS; TRANSPORTING
B64U2101/26
PERFORMING OPERATIONS; TRANSPORTING
International classification
Abstract
An aspect of the present disclosure relates to automated imaging of photovoltaic devices using an aerial vehicle (20). In one aspect, there is a method (440) for automated imaging of a PV array (310) using an aerial vehicle (20), the PV array (310) corresponding to target points (350) for the aerial vehicle (20). The method (440) comprises: positioning the aerial vehicle (20) at one of the target points (350) corresponding to the PV array (310); and controlling the aerial vehicle (20) for automated manoeuvre between the target points (350) to capture visual datasets of the PV array (310). The automated manoeuvre comprises: aligning a field-of-view (225) of a camera (222) of the aerial vehicle (20) to a PV array subsection of the PV array (310); determining a scanning direction (360) for moving the aerial vehicle (20) between the target points (350); and capturing, using the camera (222), the visual datasets of the PV array (310) starting from the PV array subsection as the aerial vehicle (20) moves along the scanning direction (360) between the target points (350).
Claims
1. A method for automated imaging of a PV array using an aerial vehicle, the PV array corresponding to target points for the aerial vehicle, the method comprising: positioning the aerial vehicle at one of the target points corresponding to the PV array; and controlling the aerial vehicle for automated manoeuvre between the target points to capture a visual dataset of the PV array, the automated manoeuvre performed by the aerial vehicle comprising: aligning a field-of-view (FOV) of a camera of the aerial vehicle to a PV array subsection of the PV array; locating, in the aligned FOV, a target viewpoint of the PV array and corresponding to said one of the target points; determining, with reference to the located target viewpoint, a scanning direction for moving the aerial vehicle between the target points; and capturing, using the camera, the visual dataset of the PV array starting from the PV array subsection as the aerial vehicle moves along the scanning direction between the target points.
2. (canceled)
3. The method according to claim 1, wherein aligning the FOV comprises detecting a polygonal outline of the PV array subsection.
4.-8. (canceled)
9. The method according to claim 3, wherein aligning the FOV comprises calculating a pitch angle misalignment based on the detected polygonal outline.
10. (canceled)
11. The method according to claim 3, wherein aligning the FOV comprises calculating a roll axis misalignment based on the angular difference between a current pitch angle of the camera and a desired pitch angle.
12. (canceled)
13. The method according to claim 3, wherein aligning the FOV comprises aligning a pair of guide lines of the FOV to the PV array subsection, the guide lines being offset from edges of the FOV.
14. The method according to claim 13, wherein aligning the FOV comprises calculating a rotational misalignment between the polygonal outline and the PV array subsection based on the offset guide lines and the polygonal outline.
15. (canceled)
16. The method according to claim 13, wherein aligning the FOV comprises calculating a vertical misalignment based on the offset guide lines and the polygonal outline.
17.-19. (canceled)
20. The method according to claim 1, wherein the scanning direction is determined based on a pitch axis misalignment between the aligned FOV and the PV array subsection.
21. The method according to claim 1, wherein the automated manoeuvre comprises calculating misalignment parameters as the aerial vehicle moves along the scanning direction and reducing movement speed of the aerial vehicle if the misalignment parameters breach predefined conditions.
22. (canceled)
23. The method according to claim 1, further comprising: receiving geolocation data of the aerial vehicle; and returning the aerial vehicle to a previous position based on the geolocation data, wherein at the previous position, the PV array subsection was last visible in the FOV.
24.-46. (canceled)
47. A method for automated flight of an aerial vehicle to perform imaging of a set of PV arrays, the method comprising: determining flight paths between target points for capturing visual datasets of the PV arrays, each PV array corresponding to one or more target points; controlling the aerial vehicle for automated flight along the flight paths to each of the target points; and controlling the aerial vehicle for automated manoeuvre at the respective one or more target points to capture the visual datasets of each PV array using a camera of the aerial vehicle, wherein the automated manoeuvre performed by the aerial vehicle comprises moving the aerial vehicle, from the respective one of the target points, along a scanning direction determined with reference to a target viewpoint of the PV array and corresponding to the respective one of the target points.
48.-49. (canceled)
50. The method according to claim 47, wherein each flight path is calculated using a 3D spline function, and wherein the length of each flight path is minimized by adjusting positions of knots of the 3D spline function, wherein the flight path does not intersect with any obstacle and is above a predefined minimum height.
51.-53. (canceled)
54. The method according to claim 47, wherein controlling the aerial vehicle for automated flight comprises controlling the camera to always face the PV array corresponding to the next target point as the aerial vehicle flies along the respective flight path to the next target point.
55. The method according to claim 54, wherein each flight path to a respective target point comprises an orbiting path circling around and towards the target point, and wherein the orbiting path comprises a target turning circle tangential to the target point.
56. (canceled)
57. The method according to claim 55, wherein the aerial vehicle is controlled to fly along an outer tangent of the target turning circle, and wherein controlling the aerial vehicle for automated flight comprises reducing flight speed of the aerial vehicle as the aerial vehicle approaches the target point, such that the radius of the target turning circle decreases and the orbiting path forms a shorter spiral path.
58. The method according to claim 55, wherein the aerial vehicle is controlled to fly along an inner tangent of the target turning circle and over the PV array, and wherein the camera is configured to change from forward-facing to backward-facing as the aerial vehicle flies over the PV array.
59. The method according to claim 47, further comprising determining a reference point on a PV array subsection of the PV array, the PV array subsection visible in a field-of-view (FOV) of the camera as the aerial vehicle approaches the PV array.
60. The method according to claim 59, further comprising determining, upon losing the PV array subsection from the FOV, a last reference point of the PV array subsection that was last visible in the FOV.
61. The method according to claim 60, further comprising transforming the last reference point into a corrected target point and recalculating the flight path from a current position of the aerial vehicle to the corrected target point.
62.-80. (canceled)
81. A method for automated flight of an aerial vehicle and automated imaging of a set of PV arrays using the aerial vehicle, the method comprising: determining flight paths between target points for capturing visual datasets of the PV arrays, each PV array corresponding to a pair of start and end target points; controlling the aerial vehicle for automated flight along the flight paths to respective start target points of the PV arrays; and controlling the aerial vehicle for automated manoeuvre between the respective start and end target points of each PV array to capture the visual dataset of the PV array using a camera of the aerial vehicle, the automated manoeuvre performed by the aerial vehicle comprising: aligning a field-of-view (FOV) of the camera to a PV array subsection of the PV array; locating, in the aligned FOV, a target viewpoint of the PV array and corresponding to the start target point corresponding to the PV array; determining, with reference to the located target viewpoint, a scanning direction for moving the aerial vehicle between the start and end target points corresponding to the PV array; and capturing, using the camera, the visual dataset of the PV array starting from the PV array subsection as the aerial vehicle moves along the scanning direction between the start and end target points.
82.-156. (canceled)
Description
BRIEF DESCRIPTION OF THE DRAWINGS
DETAILED DESCRIPTION
[0036] For purposes of brevity and clarity, descriptions of embodiments of the present disclosure are directed to aerial vehicles and methods for automated flight and automated imaging of PV arrays, in accordance with the drawings. While aspects of the present disclosure will be described in conjunction with the embodiments provided herein, it will be understood that they are not intended to limit the present disclosure to these embodiments. On the contrary, the present disclosure is intended to cover alternatives, modifications and equivalents to the embodiments described herein, which are included within the scope of the present disclosure as defined by the appended claims. Furthermore, in the following detailed description, specific details are set forth in order to provide a thorough understanding of the present disclosure. However, it will be recognized by an individual having ordinary skill in the art, i.e., a skilled person, that the present disclosure may be practiced without specific details, and/or with multiple details arising from combinations of aspects of particular embodiments. In a number of instances, well-known systems, methods, procedures, and components have not been described in detail so as to not unnecessarily obscure aspects of the embodiments of the present disclosure.
[0037] In embodiments of the present disclosure, depiction of a given element or consideration or use of a particular element number in a particular figure or a reference thereto in corresponding descriptive material can encompass the same, an equivalent, or an analogous element or element number identified in another figure or descriptive material associated therewith.
[0038] References to an embodiment/example, another embodiment/example, some embodiments/examples, some other embodiments/examples, and so on, indicate that the embodiment(s)/example(s) so described may include a particular feature, structure, characteristic, property, element, or limitation, but that not every embodiment/example necessarily includes that particular feature, structure, characteristic, property, element or limitation. Furthermore, repeated use of the phrase in an embodiment/example or in another embodiment/example does not necessarily refer to the same embodiment/example.
[0039] The terms comprising, including, having, and the like do not exclude the presence of other features/elements/steps than those listed in an embodiment. Recitation of certain features/elements/steps in mutually different embodiments does not indicate that a combination of these features/elements/steps cannot be used in an embodiment.
[0040] As used herein, the terms a and an are defined as one or more than one. The use of / in a figure or associated text is understood to mean and/or unless otherwise indicated. The term set is defined as a non-empty finite organization of elements that mathematically exhibits a cardinality of at least one (e.g. a set as defined herein can correspond to a unit, singlet, or single-element set, or a multiple-element set), in accordance with known mathematical definitions. The recitation of a particular numerical value or value range herein is understood to include or be a recitation of an approximate numerical value or value range.
[0042] In the inspection setup 100, the imaging of the PV installation 10 includes electroluminescence (EL) measurements and the visual datasets include EL visual datasets. The PV installation 10 includes one or more PV arrays 11 and one or more PV strings 12 across the PV arrays 11. Each PV array 11 includes one or more PV devices or modules 14. One or more of the PV modules 14 are arranged in one or more of the PV strings 12. Each PV string 12 may extend across one or more PV arrays 11, and each PV array 11 may form part of one or more PV strings 12. In the embodiment as shown in
[0043] The setup 100 further includes a switcher box 32 that includes one or more channels 34. In the embodiment as shown in
[0044] The channels 34 can be selectively activated, such as by an onsite worker or by remote control, to selectively supply the PV strings 12 with an electrical current from the power source 36 which puts the PV strings 12 under forward bias conditions. When put in the forward bias condition, one or more PV modules 14 in the PV string 12 emits light, otherwise known as electroluminescence (EL), and thus produces an EL signal that is detectable by the optical sub-system 220.
[0045] Additionally, it should be noted that multiple PV strings 12 may be connected to one channel 34. For example, all three PV strings 12 of the PV installation 10 may be connected to a single channel 34. In this scenario, all three PV strings 12 are simultaneously put under forward bias conditions, and the EL visual datasets of the entire PV installation 10 are captured. Notably, the electrical current supplied by the power source 36 to each PV string 12 is lower in this scenario compared to when each channel 34 is connected to a respective PV string 12, although this does not prevent the PV strings 12 from being put under forward bias conditions.
[0047] Further with reference to
[0048] The optical sub-system 220 further includes an optical distance measurement device such as a light detection and ranging (Lidar) device 224. The Lidar device 224 has an optical axis 224a that is aligned to the optical axis 222a of the camera 222. The Lidar device 224 is operable to measure distance of the optical sub-system 220 from the PV array 11.
[0049] The optical sub-system 220 further includes a focused light source such as a laser 226 or an LED. The laser 226 has an optical axis 226a that is also aligned to the optical axis 222a of the camera 222. The laser 226 is arranged to emit light in the visible spectrum, and has a beam divergence that is not larger than the camera's 222 field-of-view (FOV) which minimizes optical interference from the laser 226. Furthermore, the laser 226 allows for low power operation, emits light in a narrow waveband, and creates focused shapes which are easily identified. The focused shapes may be symmetrical or non-symmetrical. Non-symmetrical shapes advantageously allow for easier identification of where the camera 222 is pointing at, as well as to determine a rotation of the camera's FOV.
[0050] The optical sub-system 220 further includes a single-axis gimbal 228 which attaches the optical sub-system 220 to the main body 210 of the UAV 20. The controller 250 controls the gimbal 228 to raise/lower the optical axis 222a of the camera 222 with one degree of freedom (i.e. pitch angle). For example, the gimbal pitch may have an angular range of ±35° but is not limited to this. Alternatively, the optical sub-system 220 may be mounted to the main body 210 via a two-axis or a three-axis gimbal 228 to allow for further degrees of freedom (i.e. yaw and roll angles) for adjusting the optical axis 222a of the camera 222 and to provide enhanced stability of the FOV.
[0051] Referring to
[0052] The onboard processing sub-system 240 includes a controller 250 and a memory unit 252. The controller 250 is configured to execute certain functions (as described further below) according to a set of instructions stored in the memory unit 252. The controller 250 receives information from the optical sub-system 220 including the distance from the PV array 11 to the Lidar device 224, as well as the camera's 222 visual feed. Using the information received from the optical sub-system 220, the controller 250 is configured to operate the optical sub-system 220 and propulsion device 230 to execute the functions. Once the EL visual datasets are captured, the UAV 20 may transmit the EL visual datasets to the remote device 260 or return to the home base 330 to transfer the EL visual datasets for further processing to obtain the processed EL images. The remote device 260 includes a processor 262 and a memory unit 264 storing instructions for executing a software application or mobile app to remotely control the UAV 20. For example, the remote device 260 is a computer, laptop, mobile phone, or tablet device.
[0053] There are various ways to control flight motion of the UAV 20, such as by controlling the motor speed of the propellers 232, the UAV height or altitude, the UAV speed, and the UAV position. For UAV speed control, for example, the UAV 20 receives commands containing speed vectors and flies at the defined speeds and directions. For UAV positional control, the UAV 20 receives commands containing GPS coordinates and flies to those coordinates. Similarly, the gimbal 228 can be controlled to adjust the pitch, yaw, and roll angles of the camera 222 by speed control (such as controlling the gimbal pitch speed to adjust the pitch angle) or by positional control (such as directly controlling the pitch angular position).
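The two control modes described above can be sketched as simple command types. This is a minimal illustration only; the class and field names are hypothetical and not part of the disclosure.

```python
from dataclasses import dataclass

# Hypothetical command types mirroring the two control modes described
# above: speed control (velocity vectors and directions) and positional
# control (GPS coordinates to fly to). Names are illustrative.

@dataclass
class SpeedCommand:
    vx: float  # m/s, forward
    vy: float  # m/s, lateral
    vz: float  # m/s, vertical

@dataclass
class PositionCommand:
    lat: float
    lon: float
    alt: float  # metres relative to home base

def describe(cmd) -> str:
    """Return a human-readable summary of a flight command."""
    if isinstance(cmd, SpeedCommand):
        return f"fly at ({cmd.vx}, {cmd.vy}, {cmd.vz}) m/s"
    if isinstance(cmd, PositionCommand):
        return f"fly to ({cmd.lat}, {cmd.lon}) at {cmd.alt} m"
    raise TypeError("unknown command")
```

The same split (rate command vs. setpoint command) applies to the gimbal 228: pitch speed control corresponds to a rate command, direct pitch angular position to a setpoint.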
[0054] In some embodiments, the controller 250 is configured to directly control flight control of the UAV 20 and manoeuvring of the camera 222, as well as for image processing of the visual feed captured by the camera 222, such as for aligning the camera 222 as described further below. In some other embodiments, the remote device 260 performs the flight control and image processing and sends commands to the controller 250 for the UAV 20 to execute those commands. The remote device 260 executes the software thereon and communicates directly with the UAV 20 for controlling the UAV 20. In some other embodiments, the remote device 260 communicates with a handheld remote controller, such as a joystick device, which then communicates with the UAV 20 for flight control. The UAV 20, remote device 260, and remote controller may communicate with each other via suitable wired or wireless communication protocols. Non-limiting examples of wireless communication protocols include Bluetooth, Wi-Fi, telecommunications network (such as 4G, 5G, and LTE), and peer-to-peer or server-based network. The remote device 260 can also be used to communicate with the channels 34 to selectively activate them and supply electrical current to the PV strings 12 and put them under forward bias conditions.
[0055] In many embodiments, the UAV 20 is deployed to perform EL inspection of a PV installation 300 as shown in
[0056] The channels 34 can be selectively activated, such as by an onsite worker or by the remote device 260, to selectively supply the PV strings 312 with an electrical current which puts the PV strings 312 under forward bias conditions. For every PV array 310 and electrical current supplied to the respective PV strings 312, one EL visual dataset (I1 to I6) can be generated. The PV installation 300 further includes obstacles 320, such as buildings and trees, that must be avoided by the UAV 20 during flight. Each obstacle 320 may be defined with a boundary 322 and a safety margin 324 outside the boundary 322. The home base 330 marks the location where the UAV 20 usually starts and lands.
[0057] Further as shown in
[0058] The UAV 20 is preferably configured with a flight plan detailing the flight paths 340 and target points 350 for automated flight along the flight paths 340 and for automated capturing of EL visual datasets of the PV arrays 310. More preferably, the pilot of the UAV 20 is able to remotely control the channels 34 to supply electrical current to the PV strings 312 without being on-site. However, it will be appreciated that the UAV 20 may still perform automated flight and automated capturing of the EL visual datasets in cooperation with a worker (or the pilot) manually controlling the channels 34.
DATA Function
[0060] The process 400 includes a method 410 representing the DATA function. The DATA function determines the information required for the UAV 20 to image and capture visual datasets of the PV arrays 310. In many embodiments, the DATA function determines the information for EL measurements and capturing EL visual datasets of the PV arrays 310. The information determined in the DATA function can be shared among various EL measurement contractors, and missing/inaccurate information can be obtained from prior EL visual datasets, processed EL images, and/or on-site physical measurements at the PV installation 300. The information may be categorized into UAV information, image information, and obstacle information.
[0061] The UAV information includes location details to locate the home base 330 of the UAV 20. The location details may include latitude and longitude geographic coordinates. The UAV information also includes a safe height relative to the home base 330 that is safe for the UAV 20 to fly. The UAV information also includes a minimum distance between the UAV 20 and any obstacles 320 near the flight paths 340 to prevent collisions.
[0062] The image information includes details of EL visual datasets (e.g. I1 to I6) to be captured together with the corresponding PV strings 312 (S1 to S3), connection points 316 (P1,P2), and electrical currents to be supplied to the PV strings 312. For example and as shown in
[0063] The obstacle information includes location details, such as latitude and longitude geographic coordinates, of the boundaries 322 of obstacles 320 at the PV installation 300. In a 3D perspective, the boundary 322 of each obstacle 320 may be described using a prism model with a polygonal base. Obstacles 320 are objects in proximity of the flight paths 340 that must be avoided by the UAV 20 during measurement of the PV arrays 310. The obstacle information further includes the safety margin 324 for each obstacle 320 and the safety margins 324 may be dependent on the size of the obstacles 320. For example, a larger obstacle 320 would typically have a wider safety margin 324 to mitigate risk of collision with the UAV 20. The obstacle information further includes the height of each obstacle 320 relative to the home base 330. The obstacle heights can be obtained from technical drawings and/or measured using measurement devices such as a Lidar device. For example, the Lidar device 224 of the UAV 20 can be used for such purpose.
[0064] There are various ways of obtaining the boundaries 322 of the obstacles 320. For example as shown in
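The prism obstacle model with a polygonal base, a height, and a safety margin can be sketched as a clearance test. The geometry helpers and thresholds below are illustrative assumptions, not the disclosed implementation.

```python
import math

# A sketch of the prism obstacle model described above: a polygonal
# base, an obstacle height, and a safety margin around the boundary.

def _dist_point_segment(p, a, b):
    """Shortest distance from point p to segment a-b (2D)."""
    (px, py), (ax, ay), (bx, by) = p, a, b
    dx, dy = bx - ax, by - ay
    if dx == 0 and dy == 0:
        return math.hypot(px - ax, py - ay)
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)))
    return math.hypot(px - (ax + t * dx), py - (ay + t * dy))

def _inside(p, poly):
    """Ray-casting point-in-polygon test."""
    x, y = p
    inside = False
    for (x1, y1), (x2, y2) in zip(poly, poly[1:] + poly[:1]):
        if (y1 > y) != (y2 > y) and x < (x2 - x1) * (y - y1) / (y2 - y1) + x1:
            inside = not inside
    return inside

def violates_obstacle(p, height, poly, obstacle_height, margin):
    """True if point p at the given height breaches the obstacle prism
    expanded horizontally and vertically by the safety margin."""
    if height >= obstacle_height + margin:
        return False  # safely above the obstacle
    if _inside(p, poly):
        return True
    # otherwise require horizontal clearance of at least the margin
    return min(_dist_point_segment(p, a, b)
               for a, b in zip(poly, poly[1:] + poly[:1])) < margin
```

In practice the safety margin 324 would be chosen per obstacle, with larger obstacles given wider margins as noted above.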
PLAN Function
[0065] The process 400 includes a method 420 representing the PLAN function. The PLAN function generates an imaging measurement plan for the UAV 20 to capture visual datasets of the PV arrays 310. In many embodiments, the PLAN function generates an EL measurement plan for EL measurements and capturing EL visual datasets of the PV arrays 310. The EL measurement plan includes location details of the target points 350 and flight manoeuvres to be executed by the UAV 20 to fly between the home base 330 and the target points 350 to capture EL visual datasets of the corresponding PV arrays 310. The process 400 further includes methods 430 and 440 describing the flight manoeuvres. Specifically, the method 430 represents the TOUR function for automated flight of the UAV 20 for EL measurement of the PV arrays 310, and the method 440 represents the SCAN function for automated EL measurement of each PV array 310. The TOUR and SCAN functions are described in more detail further below.
[0066] In the EL measurement plan, each PV array 310 corresponds to one or more target points 350 for positioning the UAV 20 to capture the EL visual datasets of the PV array 310. In some embodiments as shown in
[0067] The EL measurement plan further includes the tilt angles, bearing angles, and heights of the PV arrays 310. The tilt angle (θ.sub.tilt) and bearing angle (θ.sub.bearing) can be translated into the pitch angle (θ.sub.pitch) and yaw angle (θ.sub.yaw), respectively, of the camera 222. If the camera 222 is forward-facing, i.e. facing the front 20a of the UAV 20, the yaw angle (θ.sub.yaw) of the camera 222 can be set to be equal to the yaw angle of the UAV 20 around the yaw axis 210a. The pitch angle (θ.sub.pitch) of the camera 222 is the angle between the camera's optical axis 222a and the true vertical. The pitch angle (θ.sub.pitch) is positive if the camera 222 is facing the front 20a of the UAV 20 and negative if the camera 222 is facing the back of the UAV 20. The roll angle (θ.sub.roll) of the camera 222 is assumed to be zero if the UAV 20 is aligned to the true horizontal plane.
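One plausible form of the tilt/bearing translation is sketched below. It assumes the camera views the PV array along its surface normal, so that a pitch angle measured from the true vertical equals the array tilt; this mapping is an assumption for illustration, not stated explicitly in the text.

```python
# Hedged sketch: translate array tilt and bearing into camera pitch and
# yaw for a forward-facing camera, assuming the optical axis is aimed
# along the array's surface normal (an illustrative assumption).

def camera_angles(tilt_deg, bearing_deg):
    """Return (pitch, yaw) in degrees.

    An array tilted tilt_deg from the horizontal has a surface normal
    tilted tilt_deg from the vertical, so viewing along that normal
    gives a pitch (measured from true vertical) equal to the tilt.
    The camera yaw follows the array bearing, as the camera yaw equals
    the UAV yaw when forward-facing.
    """
    theta_pitch = tilt_deg
    theta_yaw = bearing_deg % 360.0
    return theta_pitch, theta_yaw
```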
[0068] There are two feasible scanning directions 360 for moving the UAV 20 along the array axis 10a to capture the EL visual datasets of the PV array 310.
[0069] For the sideward scanning direction 360a as shown in
[0070] For the forward scanning direction 360b as shown in
[0071] In the same fashion as obstacles 320, PV arrays 310 are objects that the UAV 20 must avoid colliding with. The boundaries of the PV arrays 310 can be mapped in a similar manner as the obstacles 320. The height (d.sub.z) of a PV array 310 relative to the home base 330 can be estimated during flight of the UAV 20 using the current flight height (d.sub.baro) of the UAV 20 relative to the home base 330, the distance (d.sub.L) between the UAV 20 and the PV array 310 along the surface normal, and the pitch angle (θ.sub.pitch) of the camera 222, as defined below.
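The defining equation does not survive in this text. A plausible reconstruction, treating the pitch angle as measured from the true vertical so that the vertical drop from the UAV to the array is the vertical component of d.sub.L, is d.sub.z = d.sub.baro − d.sub.L·cos(θ.sub.pitch):

```python
import math

def array_height(d_baro, d_l, pitch_deg):
    """Estimate PV array height d_z relative to the home base.

    d_baro:    UAV flight height relative to the home base (m)
    d_l:       distance to the array along its surface normal (m)
    pitch_deg: camera pitch angle from the true vertical (degrees)

    Reconstruction (not verbatim from the text): the vertical drop from
    the UAV to the array is d_l * cos(pitch), so
        d_z = d_baro - d_l * cos(pitch).
    """
    return d_baro - d_l * math.cos(math.radians(pitch_deg))
```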
[0072] The camera 222 normally faces forward in the same direction as the front 20a of the UAV 20. In certain geographical regions such as areas close to the Equator, PV arrays 310 can be tilted in different directions. The PV arrays 310 are often tilted differently not to maximise exposure to solar radiation but to allow rainwater to flow away from the PV arrays 310. As shown in
[0073] The image sensor of the camera 222 used for EL measurements commonly has an aspect ratio different from one. For example, an image sensor with 640×512 pixels has an aspect ratio of 1.25. As shown in
[0074] As shown in
[0075] To ensure that the PV array subsection is fully captured in the image sensor such that the short side of the PV array subsection is fully within the camera's FOV 225, an additional space perpendicular to the scanning direction 360 is provided, resulting in an adjusted imaged distance (d.sub.FOV). For example, a space factor (f.sub.space) of 1.2 allows for an extra space of 10% on both sides.
[0076] With the angle of the camera's FOV 225 (θ.sub.FOV) being perpendicular to the scanning direction 360, the distance (d.sub.L) between the UAV 20 and the PV array 310 along the surface normal becomes
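The formula itself is elided here. A plausible reconstruction from the surrounding definitions, with the space factor f.sub.space padding the subsection's short side to the adjusted imaged distance d.sub.FOV, is d.sub.L = (d.sub.FOV/2)/tan(θ.sub.FOV/2):

```python
import math

def standoff_distance(d_subsection, f_space, fov_deg):
    """Distance d_L from the PV array such that the short side of the
    PV array subsection, padded by the space factor, just fills the FOV
    perpendicular to the scanning direction.

    A reconstruction of the elided formula, not a verbatim quote:
        d_FOV = f_space * d_subsection
        d_L   = (d_FOV / 2) / tan(theta_FOV / 2)
    """
    d_fov = f_space * d_subsection  # adjusted imaged distance
    return (d_fov / 2.0) / math.tan(math.radians(fov_deg) / 2.0)
```

For instance, a 2 m short side with f.sub.space = 1.2 and a 90° FOV gives a standoff of 1.2 m under this reconstruction.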
TOUR Function
[0077] The UAV 20 can be equipped with obstacle detection and collision avoidance devices, such as the Lidar device 224. These devices improve safety during operation as they can raise proximity alerts and stop the UAV 20 from moving towards a detected obstacle 320. However, such obstacle detection devices can be unreliable, especially for translucent obstacles 320 such as glass structures and trees. EL measurements of PV arrays 310 are usually performed at heights of around 4 to 15 metres, which do not clear many buildings and trees, so the risk of collision with these obstacles 320 is higher and it is important to mitigate this risk.
[0078] The process 400 includes the method 430 representing the TOUR function for automated flight of the UAV 20 for imaging of a set of PV arrays 310. In many embodiments, the TOUR function executes automated flight of the UAV 20 for EL measurements. The TOUR function executes automated flight manoeuvres between waypoints along the flight paths 340 while avoiding obstacles 320 along the flight paths 340 based on the EL measurement plan from the PLAN function. These waypoints include the target points 350 corresponding to the PV arrays 310 for measuring them, as well as the home base 330 for the UAV 20 to take off for the EL measurements and to land after capturing the EL visual datasets.
[0079] The TOUR function includes a step of determining the flight paths 340 between the target points 350 for capturing EL visual datasets of the PV arrays 310, each PV array 310 corresponding to one or more target points 350. Each target point 350 can be defined by its coordinates, the height (d.sub.z) of the corresponding PV array 310 relative to the home base 330, and the camera's yaw angle (θ.sub.yaw) and pitch angle (θ.sub.pitch). When the UAV 20 is positioned at a target point 350, the camera 222 is aimed to face the corresponding target view 352, such as the side points A and B mentioned above.
[0080] Each flight path 340 (notably those shown in dashed lines in
[0081] Alternatively, the flight path 340 can be calculated using a 2D spline function for a predefined constant height of the flight path 340, wherein the predefined constant height is above the predefined minimum height 341. The flight path 340 has a number of knots that are iteratively increased until a valid flight path 340 that does not intersect with any obstacle 320 can be found.
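The knot-adjustment idea in the two paragraphs above can be illustrated with a simplified stand-in: a 3D path through control knots (here a plain polyline rather than a true spline), whose length is shortened by nudging interior knots toward their neighbours while keeping them above a minimum height. Obstacle-intersection checks are omitted; everything below is an illustrative sketch, not the disclosed planner.

```python
import math

# Simplified stand-in for spline-based flight-path shortening: pull
# interior knots toward the midpoint of their neighbours, clamping the
# z coordinate to a minimum safe height. Obstacle checks are omitted.

def path_length(knots):
    """Total length of the polyline through the knots."""
    return sum(math.dist(a, b) for a, b in zip(knots, knots[1:]))

def shorten_path(knots, min_height, step=0.5, iterations=100):
    """Greedily shorten the path while keeping interior knots above
    min_height. Endpoints (target points) are left fixed."""
    knots = [list(k) for k in knots]
    for _ in range(iterations):
        for i in range(1, len(knots) - 1):
            mid = [(a + b) / 2 for a, b in zip(knots[i - 1], knots[i + 1])]
            for axis in range(3):
                knots[i][axis] += step * (mid[axis] - knots[i][axis])
            knots[i][2] = max(knots[i][2], min_height)  # z stays safe
    return [tuple(k) for k in knots]
```

A real implementation would additionally reject any candidate path that intersects an obstacle prism, and would iterate the number of knots as described for the 2D spline variant.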
[0082] The TOUR function includes a step of controlling the UAV 20 for automated flight along the flight paths 340 to each of the target points 350. Since all the flight paths 340 between the target points 350 have been determined, such as using the 3D spline function to find the shortest flight paths 340, the UAV 20 can be controlled for automated flight along these flight paths 340 such that the automated flight minimizes the overall flight duration.
[0083] In the PV installation 300 as shown in
[0084] As shown in
Additionally, the camera 222 can scan the PV arrays 310 in the forward or reverse direction (2.sup.n). With n being the number of EL visual datasets, i.e. n=6, the total number (N) of possible flight paths 340 is then
[0085] With n=6, N=23040. There are 23040 permutations to scan the EL visual datasets (I1 to I6) of all the PV arrays 310. Using a computational algorithm such as brute force or k-nearest neighbour, an optimal flight sequence of the target points 350 can be determined. The target points 350 in the optimal flight sequence minimize the overall flight duration of the flight paths 340, enabling the EL measurements to be completed more quickly.
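The formula for N is elided above, but a reconstruction consistent with N = 23040 for n = 6 is N = n!·2^n/2: n! orderings of the datasets, 2^n scan directions, halved because a tour and its exact reverse coincide (the halving is my inference from the stated value). A greedy nearest-neighbour ordering, one simple heuristic of the kind mentioned, is also sketched:

```python
import math

def num_flight_sequences(n):
    """Number of distinct scan sequences: n! orderings times 2**n scan
    directions, halved for tour reversal. A reconstruction consistent
    with the stated N = 23040 for n = 6, not a verbatim formula."""
    return math.factorial(n) * 2 ** n // 2

def nearest_neighbour_order(points, start):
    """Greedy nearest-neighbour ordering of target points from start,
    a cheap stand-in for full brute-force sequence optimisation."""
    remaining = list(points)
    order, current = [], start
    while remaining:
        nxt = min(remaining, key=lambda p: math.dist(current, p))
        remaining.remove(nxt)
        order.append(nxt)
        current = nxt
    return order
```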
[0086] During automated flight along the flight paths 340, the camera 222 faces the front 20a of the UAV 20 such that the yaw angle (θ.sub.yaw) of the camera 222 is the same as the yaw angle of the UAV 20. In one embodiment as shown in
[0087] If the UAV pilot is at the home base 330 and the UAV 20 is flying towards the target points 350 away from the home base 330, the pilot and camera 222 face the same general direction. If the UAV 20 is returning to the home base 330, the pilot and camera 222 face opposite directions. The yaw angle of the UAV 20 may be rotated by 180° so that the back of the UAV 20 faces the pilot, and the pilot and camera 222 continue to face the same general direction. This configuration allows for more intuitive manual control of the UAV 20 by the pilot if necessary.
[0088] As the UAV 20 flies towards a target point 350, the maximum flight speed depends on the current distance between the UAV 20 and the target point 350, as well as on the gradient or curvature of the flight path 340. If the UAV 20 is within a predefined distance from the target point 350, the flight speed is gradually reduced to avoid overshooting and missing the target point 350. If the gradient of the flight path 340 is too steep, the flight speed is also reduced to avoid deviating from the flight path 340. The flight height also gradually lowers until it reaches a predefined minimum height 341 or the distance d.sub.L reaches a predefined value.
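The speed limiting just described can be illustrated with a simple limiter; the linear ramp, minimum speed, and gradient threshold are assumptions for the sketch, not disclosed values.

```python
# Illustrative speed limiter: slow down within a braking distance of
# the target point and on steep path gradients. All thresholds and the
# linear ramp are assumptions, not from the disclosure.

def flight_speed(v_max, dist_to_target, braking_dist, gradient,
                 max_gradient, v_min=0.2):
    """Return a speed limit (same units as v_max)."""
    speed = v_max
    if dist_to_target < braking_dist:
        # ramp down linearly towards v_min as the UAV nears the target
        speed = min(speed, v_min + (v_max - v_min) * dist_to_target / braking_dist)
    if abs(gradient) > max_gradient:
        # scale down in proportion to how much the gradient is exceeded
        speed = min(speed, v_max * max_gradient / abs(gradient))
    return speed
```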
[0089] In the embodiment as shown in
[0090] To address this problem of temporarily losing the target view 352, the flight path 340 to the target point 350 includes an orbiting path 342 that circles or orbits around and towards the target point 350. This allows the UAV 20, on automated flight along the orbiting path 342, to circle or orbit around the target point 350 as the UAV 20 approaches the target point 350. The orbiting path 342 may include a target turning circle tangential to the target point 350. The length of the orbiting path 342 or arc length of the target turning circle is determined by the angular difference between the yaw angles (θ.sub.yaw) of the camera 222 before and at the target point 350. The radius of the target turning circle is determined by the rates of changing the camera's yaw angle (θ.sub.yaw) and pitch angle (θ.sub.pitch), as well as by the minimum turning radius of the UAV 20 at the flight speed when the UAV 20 enters the orbiting path 342. The algorithm to determine the orbiting path 342 is described below with reference to four Cases A to D as shown in
[0091] i. Draw a first line 343 between the target point 350 and corresponding target view 352. As shown in Case A, the first line 343 aligns with the camera's yaw angle (θ.sub.yaw) when the UAV 20 is positioned at the target point 350.
[0092] ii. Draw a second line 344 perpendicular to the first line 343 and through the target point 350.
[0093] iii. Draw a third line 345 perpendicular to the first line 343 and through the target view 352.
[0094] iv. Draw two target turning circles 346 tangential to and on both left and right sides of the first line 343. The centres of the target turning circles 346 intersect with the second line 344.
[0095] v. The sectors of the target turning circles 346 between the target point 350 and target view 352 are referred to as the top sectors, and the other sectors are referred to as the bottom sectors. There are thus four zones created by the two target turning circles 346 and separated by the first line 343 and second line 344: top-left (TL), top-right (TR), bottom-left (BL), and bottom-right (BR) zones.
[0096] vi. The radius of the target turning circles 346 is determined such that the circumference is below the third line 345. If the circumference is above the third line 345, the target turning circles 346 must be shifted towards the bottom until they are tangential to the third line 345. This ensures that the camera 222 can continue to face forward to see the target view 352. In Case D, when the UAV 20 reaches the target point 350, the camera 222 changes from forward-facing to backward-facing in order to see the target view 352.
[0097] vii. For every UAV 20 position along the flight path 340, there are four tangents 347 intersecting with the UAV 20 and tangential with the target turning circles 346. The tangents 347 further away from the target point 350 are referred to as the outer tangents, and the tangents 347 closer to the target point 350 are referred to as the inner tangents.
[0098] viii. If the UAV 20 is in the TL zone as shown in Case B, the UAV 20 follows the outer tangent 347 of the left target turning circle 346. If the UAV 20 is in the TR zone, the UAV 20 follows the outer tangent 347 of the right target turning circle 346. If the UAV 20 is in the BR zone as shown in Case C, the UAV 20 follows the inner tangent 347 of the right target turning circle 346. If the UAV 20 is in the BL zone, the UAV 20 follows the inner tangent 347 of the left target turning circle 346.
[0099] ix. As shown on the right side of
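The zone classification in steps i to v above can be sketched geometrically as follows. This is an illustrative interpretation, not the claimed algorithm: the first line runs from the target point to the target view, its perpendicular through the target point separates top from bottom, and left/right is taken looking from the target point towards the target view:

```python
def classify_zone(target_point, target_view, uav_pos):
    """Classify the UAV position into one of the four zones (TL, TR, BL, BR)
    formed by the first line (target point -> target view) and the second
    line (its perpendicular through the target point)."""
    px, py = target_point
    vx, vy = target_view
    ux, uy = uav_pos
    # Direction of the first line (target point -> target view).
    dx, dy = vx - px, vy - py
    # Component of the UAV offset along the first line: positive = top.
    along = (ux - px) * dx + (uy - py) * dy
    # Cross-product sign gives the side: positive = left of the first line.
    cross = dx * (uy - py) - dy * (ux - px)
    vertical = "T" if along > 0 else "B"
    horizontal = "L" if cross > 0 else "R"
    return vertical + horizontal
```

In Cases B and C, this zone label then selects which tangent of which target turning circle the UAV follows (outer tangent for the top zones, inner tangent for the bottom zones).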
[0100] The rates of change of the yaw angle (θ.sub.yaw) and pitch angle (θ.sub.pitch) can be kept constant as the UAV 20 flies to the target point 350 along the orbiting path 342. As shown in
[0101] Alternatively as shown in
[0102] Target points 350 can be in close proximity to each other without any obstacle 320 in between, such as target points 350 that correspond to adjacent PV arrays 310. When a pair of target points 350 are adjacent to each other, the flight path 340 between them can be simplified to a straight-line shift. This allows the UAV 20 to fly along the straight flight path 340 with a shorter flight duration, while maintaining the same height, yaw angle (θ.sub.yaw), and pitch angle (θ.sub.pitch).
[0103] The TOUR function includes a step of controlling the UAV 20 for automated manoeuvre at the respective one or more target points 350 to capture the EL visual dataset of each PV array 310 using the camera 222. Accordingly, when the UAV 20 reaches a target point 350 (such as target point A) looking at a corresponding target view 352 (such as target view A), the UAV 20 is controlled for automated manoeuvre at the target point A to capture the EL visual dataset of the corresponding PV array 310. In this automated manoeuvre, the UAV 20 may move from target point A (looking at the target view A) to target point B (looking at the target view B) to capture the EL visual dataset of the PV array 310 that is bounded by the side points A and B. The scanning paths 360 of this automated manoeuvre are shown in dotted lines in
[0104] In some cases, the EL measurement plan from the PLAN function may contain errors such as incorrect coordinates for a target point 350. For example, a target point 350 should correspond to a particular target view 352 but if the coordinates for the target point 350 are incorrect, such as due to measurement errors, the UAV 20 may not face the correct target view 352 when it arrives at these coordinates via the TOUR function. Instead, as shown in
[0105] The UAV 20 can be controlled to move from the current target point 350 (with the incorrect coordinates and facing the incorrect target view 354) to the corrected target point 350. The UAV 20 at the corrected target point 350 would be facing the last reference point 380 and a PV array subsection would be visible in the FOV 225, allowing the UAV 20 to continue with the SCAN function. In one embodiment, the TOUR function determines, upon losing the PV array subsection from the FOV 225, the last reference point 380 and corrected target point 350. The TOUR function also recalculates the flight path 340 from the current position of the UAV 20 to the corrected target point 350 and automatically moves the UAV 20 from the current position to the corrected target point 350. In another embodiment, after determining the last reference point 380 and corrected target point 350, the corrected target point 350 is displayed on the remote device 260 controlling the UAV 20. The pilot then selects the corrected target point 350 to recalculate the flight path from the current position of the UAV 20 to the corrected target point 350 and control the UAV 20 to move to the corrected target point 350. Additionally, after losing the EL signal, a warning message may be sent to the remote device 260 to alert the pilot and provide an option to correct the target point 350 based on the last reference point 380.
[0106] Therefore, the TOUR function controls the UAV 20 to fly towards a target point 350 to capture the EL visual dataset of the corresponding PV array 310. If the EL signal of the PV array 310 is detected by the camera 222, the TOUR function proceeds to the method 440 of the process 400 or the SCAN function to capture the EL visual dataset of the PV array 310. After capturing the EL visual dataset of the PV array 310, the UAV 20 flies towards the next target point 350 via the TOUR function to capture the EL visual dataset of the next PV array 310. However, if the EL signal is not detected, this may mean that the PV array 310 is inactive or the target point 350 is erroneous. The TOUR function then proceeds to correct the target point 350, as described above. However, if despite this correction the PV array 310 is still not detected by the camera 222, the TOUR function may proceed to a method 450 of the process 400 representing a FIND function to find the nearest active PV array 310 with an EL signal. Details of the FIND function are described in PCT Application WO 2021137764 (incorporated by reference herein).
[0107] As mentioned above, the TOUR function executes automated flight manoeuvres along the flight paths 340 based on the EL measurement plan which includes location details of the target points 350 and the flight manoeuvres. In some situations, there is no EL measurement plan or the EL measurement plan does not contain sufficient details about the target points 350 and/or flight manoeuvres. The target points 350 would be determined based on the pilot's interaction with the remote device 260 controlling the UAV 20. A software executed in the remote device 260 displays an interactive map 500 as shown in
SCAN Function
[0108] The process 400 includes the method 440 representing the SCAN function for automated imaging of a PV array 310 using the UAV 20, wherein the PV array 310 corresponds to target points 350 for the UAV 20. For example, the target points 350 include the target points A and B as shown in
[0109] The SCAN function includes a step of positioning the UAV 20 at one of the target points 350 corresponding to the PV array 310. For example, the UAV 20 is controlled using the TOUR function for automated flight along the respective flight path 340 to the target point 350 (such as target point A). The SCAN function further includes a step of controlling the UAV 20 for automated manoeuvre between the target points 350 to capture the EL visual dataset of the PV array 310. For example, the UAV 20 performs the automated manoeuvre from the target point A to the target point B along the sideward scanning direction 360 to capture the EL visual dataset.
[0110] The SCAN function can be divided into six incremental levels (Level 0 to Level 5) of automation for the automated EL measurement, wherein SCAN function Level 0 requires full manual control by the UAV pilot and SCAN function Level 5 requires the least manual control or is fully automated.
[0111] In SCAN function Level 0, the UAV 20 is manually controlled by the pilot and flown between the target points 350 (such as between target points A and B). The pilot also manually aligns the camera's optical axis 222a to be perpendicular to the PV array 310. For optimal EL measurement, the pilot sets a distance between the UAV 20 and the PV array 310 such that the short side of a PV array subsection is fully within the FOV 225, preferably filling most of the FOV 225. The pilot may also manually adjust various parameters of the camera 222, some of which are described in PCT Application WO 2021137764 (incorporated by reference herein). Non-limiting examples of the camera parameters include camera focus, exposure, signal-to-noise ratio, and sharpness.
[0112] In SCAN function Level 1, the alignment of the PV array subsection with the FOV 225 is performed via the automated manoeuvre after the UAV 20 is positioned at the target point 350. The automated manoeuvre includes a step of aligning the FOV 225 to the PV array subsection (i.e. the portion of the PV array 310 that is currently captured by the image sensor of the camera 222). As shown in
[0113] In some embodiments, the step of aligning the FOV 225 includes detecting a polygonal outline of the PV array subsection. The EL signal or image of the PV array subsection can be outlined by a polygon such as a quadrilateral or more preferably a rectangle as PV modules 314 are rectangular and PV arrays 310 are usually arranged in a regular fashion. If the camera's optical axis 222a is almost perpendicular to the PV array 310, the polygon can be approximated as a rectangle 520 having a centre 522 (x.sub.R, y.sub.R), width (w.sub.R), height (h.sub.R), and rotation angle (α.sub.R), as shown in
[0114] As an example, detecting the polygonal outline (rectangle 520) can be described by an image processing algorithm as follows. The image processing algorithm includes converting an EL image frame of the PV array subsection from the EL visual dataset into monochrome or grayscale if it originally has multiple colour channels. The image processing algorithm includes binarizing the EL image frame by setting all image intensities above a predefined threshold to a non-zero value (e.g. 1) and other image intensities to zero. Otsu's method for automatic image thresholding can be used to obtain this threshold. The image processing algorithm further includes detecting contours around all non-zero objects in the binarized EL image frame and the rectangle 520 is defined around the detected contours. More specifically, the image processing algorithm excludes small objects from the detected contours to exclude objects from non-EL sources and finds the smallest rectangle 520 around all the remaining detected contours.
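A minimal sketch of this image processing pipeline is given below, assuming a grayscale EL frame as a NumPy array. For brevity it implements Otsu's threshold directly and fits an axis-aligned bounding rectangle to the non-zero pixels; a production implementation would typically use library routines (e.g. OpenCV's thresholding, contour detection, and minimum-area rectangle fitting) and would also exclude small non-EL objects as described above:

```python
import numpy as np

def otsu_threshold(gray):
    """Otsu's method: choose the threshold maximizing the
    between-class variance of the intensity histogram."""
    hist, _ = np.histogram(gray, bins=256, range=(0, 256))
    total = gray.size
    cum = np.cumsum(hist)
    cum_mean = np.cumsum(hist * np.arange(256))
    best_t, best_var = 0, 0.0
    for t in range(1, 256):
        w0 = cum[t - 1]          # pixels below the threshold
        w1 = total - w0          # pixels at or above the threshold
        if w0 == 0 or w1 == 0:
            continue
        m0 = cum_mean[t - 1] / w0
        m1 = (cum_mean[-1] - cum_mean[t - 1]) / w1
        var = w0 * w1 * (m0 - m1) ** 2
        if var > best_var:
            best_var, best_t = var, t
    return best_t

def bounding_rectangle(gray):
    """Binarize the EL frame with Otsu's threshold and fit an
    axis-aligned bounding rectangle around the bright pixels.
    Returns (centre, width, height), or None if no EL signal."""
    binary = gray >= otsu_threshold(gray)
    ys, xs = np.nonzero(binary)
    if ys.size == 0:
        return None
    x0, x1, y0, y1 = xs.min(), xs.max(), ys.min(), ys.max()
    centre = ((x0 + x1) / 2, (y0 + y1) / 2)
    return centre, x1 - x0 + 1, y1 - y0 + 1
```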
[0115] Referring to
[0116] In some embodiments, instead of detecting the polygonal outline, the step of aligning the FOV 225 includes calculating an image histogram of the EL image frame and calculating a centre of the EL image frame based on the image histogram. For example, the EL image frame centre (x.sub.R, y.sub.R) can be calculated from the centre of the peak of the value sums (or averages) of the rows and columns in the image histogram. Although the EL image frame centre and size can be calculated this way, the rotation angle (α.sub.R) may be missing, but this, as well as other missing information, can be obtained through other methods or sources. Alternatively, in some embodiments, a bounding box technique can be used to detect the PV array subsection or each PV module 314 in the PV array subsection.
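As an illustrative sketch of this histogram-based alternative, the centre can be estimated from the row and column intensity profiles; here the "centre of the peak" is approximated by the intensity-weighted centroid of each profile, which is an assumption rather than a detail from the disclosure:

```python
import numpy as np

def profile_centre(gray):
    """Estimate the EL frame centre from row and column intensity
    profiles, using the intensity-weighted centroid of each profile."""
    rows = gray.sum(axis=1).astype(float)  # one value per image row
    cols = gray.sum(axis=0).astype(float)  # one value per image column
    y_c = (rows * np.arange(rows.size)).sum() / rows.sum()
    x_c = (cols * np.arange(cols.size)).sum() / cols.sum()
    return x_c, y_c
```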
[0117] In SCAN function Level 1, the automated manoeuvre automates alignment of the FOV 225 by adjustment of the camera's pitch angle (θ.sub.pitch). The pilot manually controls the UAV 20 to perform other adjustments to improve alignment of the FOV 225 to the PV array subsection.
[0118] In SCAN function Level 2, the automated manoeuvre optimizes the alignment of the FOV 225 to the PV array subsection. More specifically, the step of aligning the FOV 225 may include aligning a pair of guide lines 530 of the FOV 225 to the PV array subsection. In embodiments wherein the SCAN function is performed along the sideward scanning direction 360a, the guide lines 530 are upper and lower lines offset by distance (d.sub.T) from the top and bottom edges of the FOV 225. A smaller offset distance (d.sub.T) allows the PV array subsection to fill a larger area of the FOV 225, but requires the camera 222 to have better positional stability and precision. Preferably, the offset distance (d.sub.T) is about 10% of the overall height (h.sub.FOV) of the FOV 225.
[0119] The rectangle 520 may be rotated relative to the PV array subsection and the step of aligning the FOV 225 may include calculating a rotational misalignment (m.sub.yaw) between the rectangle 520 and the PV array subsection based on the offset guide lines 530 and the rotated rectangle 520. The rotational misalignment (m.sub.yaw) is calculated from the smallest angular difference between the rotation angle (α.sub.R) of the rotated rectangle 520 and the offset guide lines 530. The step of aligning the FOV 225 further includes adjusting the yaw angle (θ.sub.yaw) of the camera 222 by the rotational misalignment (m.sub.yaw), such as by rotating the UAV 20 about the yaw axis 210a.
[0120] The step of aligning the FOV 225 further includes calculating a roll axis misalignment (m.sub.y) based on the angular difference between the current pitch angle (θ.sub.pitch) and desired pitch angle of the camera 222. Notably, the desired pitch angle is equivalent to the tilt angle (θ.sub.tilt) of the PV array 310.
[0121] The step of aligning the FOV 225 further includes calculating a vertical misalignment (m.sub.z) based on the offset guide lines 530 and the rotated rectangle 520, as follows.
[0122] The vertical misalignment (m.sub.z) is negative if the PV array subsection fills up too small an area within the FOV 225 and the UAV 20 should be shifted vertically downwards to enlarge the PV array subsection in the FOV 225. However, if the distance between the UAV 20 and the PV array 310 is below a predefined minimum safety distance, the negative vertical misalignment (m.sub.z) should be reset to zero to prevent the UAV 20 from colliding with the PV array 310. The vertical misalignment (m.sub.z) is positive if the PV array subsection fills up too large an area within the FOV 225 and the UAV 20 should be shifted vertically upwards to shrink the PV array subsection in the FOV 225. However, if the vertical misalignment (m.sub.z) is positive and the roll axis misalignment (m.sub.y) is positive, this means the UAV 20 is vertically too close to the PV array 310 and the roll axis misalignment (m.sub.y) should be reset to zero to prevent the UAV 20 from colliding with the PV array 310.
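The safety resets described in the preceding paragraph can be sketched as follows; the function names and parameters are illustrative only:

```python
def safe_misalignments(m_z, m_y, distance_to_array, min_safety_distance):
    """Apply the safety resets: zero out a negative vertical misalignment
    (m_z) when the UAV is already below the minimum safety distance, and
    zero out the roll axis misalignment (m_y) when both m_z and m_y are
    positive (UAV vertically too close to the PV array)."""
    if m_z < 0 and distance_to_array < min_safety_distance:
        # Would shift downwards into the safety margin: suppress.
        m_z = 0.0
    if m_z > 0 and m_y > 0:
        # Vertically too close to the array: suppress the roll axis move.
        m_y = 0.0
    return m_z, m_y
```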
[0123] In SCAN function Level 2, the automated manoeuvre automates alignment of the FOV 225 by one or more of the following misalignment parameters: pitch angle misalignment (m.sub.pitch), rotational misalignment (m.sub.yaw), roll axis misalignment (m.sub.y), and vertical misalignment (m.sub.z). Adjustment of the UAV 20 and/or camera 222 by the misalignment parameters for alignment with the PV array subsection can be done by speed control and/or positional control as described above. Once the PV array subsection edges align with the guide lines 530 through the automated manoeuvre, all misalignment parameters are reset to zero, causing the UAV 20 to hover in place. The pilot may manually control the UAV 20 to perform other alignment adjustments if necessary, or to override the automated manoeuvre. For example, the pilot may need to counteract alignment actions by the automated manoeuvre due to inaccurate or false calculations of the misalignment parameters.
[0124] In SCAN function Level 3, after the FOV 225 has been aligned to the PV array subsection, the automated manoeuvre controls movement of the UAV 20 to move the UAV 20 along the PV array 310 and capture the EL visual dataset of the PV array 310. The automated manoeuvre includes a step of determining a scanning direction 360 for moving the UAV 20 between the target points 350. For example, if the positions of the target points A and B are known beforehand, the scanning direction 360 can be determined from the known positions, wherein the UAV 20 is controlled to move from start target point A to end target point B.
[0125] In some embodiments, the scanning direction 360 can be determined based on the pitch axis misalignment (m.sub.x) between the aligned FOV 225 and the PV array subsection. The pitch axis misalignment (m.sub.x) can be calculated as follows.
[0126] With the PV array subsection aligned inside the FOV 225, the scanning direction 360 can be determined if a side point 352 of the PV array subsection is inside the FOV 225. With reference to
[0127] After determining the scanning direction 360, i.e. either sidewards to the left or right, the automated manoeuvre includes a step of capturing the EL visual dataset of the PV array 310 starting from the PV array subsection as the UAV 20 moves along the scanning direction 360 between the target points 350. More specifically, at different points along the scanning direction 360, the UAV 20 captures an EL visual dataset, comprising a series of EL image frames and/or a video, of different PV array subsections. The automated manoeuvre may gradually increase the UAV movement speed as the UAV 20 moves along the scanning direction 360 to minimize jerky movements. Additionally, the automated manoeuvre may continue calculating the misalignment parameters as the UAV 20 moves along the scanning direction 360 and reduce the movement speed if the misalignment parameters breach certain predefined conditions, such as if the absolute value of an individual misalignment parameter is too large, which may suggest the FOV 225 is no longer properly aligned to a PV array subsection. The automated manoeuvre may stop the capturing once the UAV 20 reaches the other target point 350 (end target point B).
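The scanning speed profile just described can be sketched as below; the ramp shape, the halving factor, and the parameter names are assumptions for illustration:

```python
def scan_speed(step, ramp_steps, v_max, misalignments, limit=0.7):
    """Speed profile while scanning: ramp up gradually over the first
    ramp_steps to minimize jerky movements, and slow down if any
    misalignment parameter exceeds the predefined absolute limit."""
    v = v_max * min(1.0, (step + 1) / ramp_steps)
    if any(abs(m) > limit for m in misalignments):
        # FOV may no longer be properly aligned: reduce movement speed.
        v *= 0.5
    return v
```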
[0128] In SCAN function Level 4, the SCAN function may include a step of receiving geolocation data of the UAV 20. The geolocation data, such as from a GPS module in the UAV 20, improves positioning of the UAV 20 during the automated manoeuvre. As the UAV 20 approaches the other target point 350 (end target point B) at the end of the scanning direction or path 360, the absolute value of the pitch axis misalignment (m.sub.x) increases and the movement speed decreases so that the UAV 20 should stop at the end target point B. However, due to the UAV's inertia, the deceleration might be insufficient to overcome the movement speed as the UAV 20 approaches the end target point B. This causes the pitch axis misalignment (m.sub.x) to continue increasing and the UAV 20 to overshoot the end target point B. Moreover, the UAV 20 may overshoot and hover at a position where no PV array subsection is visible in the FOV 225. Without any PV array subsection visible in the FOV 225, the automated manoeuvre cannot be performed, and the UAV 20 may not be able to fly to the next target point 350 to continue capturing the EL visual dataset. To counteract this issue, the SCAN function may include a step of returning the UAV 20 to a last position based on the geolocation data, wherein at that last position, the PV array subsection was last visible in the FOV 225 and/or the pitch axis misalignment (m.sub.x) was below a predefined absolute value (e.g. 0.7).
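Tracking the last good position from the geolocation data can be sketched as a small helper; the class and attribute names are hypothetical:

```python
class LastGoodPosition:
    """Track the most recent geolocation at which a PV array subsection
    was visible and |m_x| was below the limit, so the UAV can return
    there after overshooting the end target point."""

    def __init__(self, m_x_limit=0.7):
        self.m_x_limit = m_x_limit
        self.position = None

    def update(self, geolocation, subsection_visible, m_x):
        # Only record positions where the scan could still proceed.
        if subsection_visible and abs(m_x) < self.m_x_limit:
            self.position = geolocation
        return self.position
```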
[0129] In some cases, the PV array 310 may have gaps and/or broken PV modules 314, resulting in some areas of the PV array 310 without any EL signal and the UAV 20 would not be able to detect these areas for EL measurement. If these areas are wider than the overall width (w.sub.FOV) of the FOV 225, the automated manoeuvre may stop the capturing of EL visual dataset before or after the UAV 20 reaches the end target point 350. For example, the automated manoeuvre may stop the capturing at the start of these areas without any EL signal from the perspective of the FOV 225 as the UAV 20 moves along the scanning direction 360.
[0130] In SCAN function Level 5, details of the target points 350 and the orientation of the UAV 20 and camera 222 to align to the corresponding PV array 310 at the respective target points 350 and target views 352 are known beforehand, such as from the PLAN function.
[0131] The SCAN function includes the step of positioning the UAV 20 at one of the target points 350 corresponding to the PV array 310. For example, the UAV 20 is controlled using the TOUR function for automated flight to the start target point 350. The SCAN function includes the step of controlling the UAV 20 for automated manoeuvre between the target points 350 to capture the EL visual dataset of the PV array 310. The automated manoeuvre includes the steps of aligning the camera's FOV 225 to a PV array subsection of the PV array 310 and determining the scanning direction 360 for moving the UAV 20 between the target points 350. As the orientation of the UAV 20 and camera 222 is known beforehand, the FOV 225 would be properly aligned upon positioning of the UAV 20 at the start target point 350 and the scanning direction 360 is also known (from the start to the end target points 350). The automated manoeuvre includes the step of capturing the EL visual dataset of the PV array 310 starting from the PV array subsection as the UAV 20 moves along the scanning direction 360 between the start and end target points 350. The SCAN function Level 5 thus fully automates EL measurement of the PV array 310, starting the capturing of EL visual dataset once the UAV 20 is positioned at the start target point 350 and ending once the UAV 20 reaches the end target point 350.
[0132] Although the SCAN function is described in embodiments herein wherein the UAV 20 captures the EL visual dataset while moving along the sideward scanning direction 360a (see
COMBI Function
[0133] In some embodiments, the TOUR and SCAN functions can be combined as a COMBI function or method 460 of the process 400. The COMBI function provides a method for automated flight of the UAV 20 and automated imaging of a set of PV arrays 310 using the UAV 20. In many embodiments, the COMBI function executes automated flight of the UAV 20 and automated EL measurement of the PV arrays 310 using the UAV 20, and can be initiated if a complete EL measurement plan from the PLAN function is available.
[0134] The COMBI function includes a step of determining flight paths 340 between target points 350 for capturing EL visual datasets of the PV arrays 310, each PV array 310 corresponding to a pair of start and end target points 350 (e.g. target points A and B). The COMBI function includes a step of controlling the UAV 20 for automated flight along the flight paths 340 to the respective start target points 350 of the PV arrays 310. The COMBI function includes a step of controlling the UAV 20 for automated manoeuvre between the respective start and end target points 350 of each PV array 310 to capture the EL visual dataset of the PV array 310 using the camera 222. For each PV array 310, the automated manoeuvre includes steps of aligning the camera's FOV 225 to a PV array subsection of the PV array 310, determining a scanning direction 360 for moving the UAV 20 between the start and end target points 350, and capturing, using the camera 222, the EL visual dataset of the PV array 310 starting from the PV array subsection as the UAV 20 moves along the scanning direction 360 between the start and end target points 350.
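The steps above can be sketched as a loop combining the two functions; `fly_to` and `scan` are hypothetical placeholders standing in for the TOUR and SCAN functions:

```python
def combi(plan, fly_to, scan):
    """Sketch of the COMBI loop: for each PV array's (start, end) target
    point pair in the measurement plan, fly to the start point (TOUR)
    and then scan from start to end (SCAN), collecting one EL visual
    dataset per PV array."""
    datasets = []
    for start, end in plan:
        fly_to(start)                      # TOUR: automated flight
        datasets.append(scan(start, end))  # SCAN: automated manoeuvre
    return datasets
```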
[0135] It will be appreciated that various aspects of the TOUR and SCAN functions described above can apply similarly or analogously to the COMBI function, and vice versa, and are not further described for purpose of brevity.
[0136] As described above, the channels 34 can be selectively activated, such as by an onsite worker or by the remote device 260, to put the PV strings 312 under forward bias conditions and generate EL signals from the PV arrays 310. Preferably, the remote device 260 is able to control the UAV 20 and channels 34 so that execution of the EL measurement plan via the COMBI function can be fully automated with minimal or no human intervention. However, even if the COMBI function is fully automated, the pilot and other on-site workers should remain observant of the UAV 20 to ensure safe management and to prevent accidents. For example, the remote controller held by the pilot can include a dead man's switch that has to be pressed continuously or periodically. For example, the UAV 20 may be configured with another camera or eye tracker that tracks the pilot's eyes to ensure the pilot's attention on the UAV 20.
[0137] After completing the SCAN or COMBI function, the captured EL visual datasets of the PV arrays 310 may undergo further image processing to obtain the processed EL images of the PV arrays 310. Examples of such image processing are described in PCT application WO 2021137764 (incorporated by reference herein). For example, raw EL images from the EL visual datasets may be extracted and sent for an image enhancement process. Multiple processed EL images of the PV array subsections may be aligned and combined to form a combined processed EL image of the corresponding PV array 310. The processed EL images of the PV arrays 310 may be mapped onto a base map of the PV installation 300 for better visualization of the processed EL images against the PV installation 300.
[0138] Details of the processed EL images, such as image properties, electrical currents used for the EL measurements, and identifiers of the respective PV arrays 310, may be stored on a database for further analysis. This database allows for quantitative comparison of the processed EL images with historical EL images of the same PV arrays 310. For example, a pixel intensity histogram of an EL image depends on the electrical current as well as camera parameters such as exposure time, gain, ISO, aperture, and transmissivity of the lens 223. The pixel intensities have a range of values, such as 0 to 255 for an 8-bit camera 222, and these values can be mapped to absolute luminescence intensity values based on a technical standard. Since properties of the EL images can vary depending on the type of camera 222 and the camera properties, different EL images of the same PV array 310 can be mapped according to the absolute luminescence intensity values for quantitative comparison. This technical standard can also be used by different EL measurement contractors so that their EL images are comparable with others.
[0139] Embodiments of the present disclosure herein describe the process 400 for automated flight to the PV arrays 310 and automated imaging (such as EL measurements) of the PV arrays 310. The process 400 describes the various key functions including the DATA, PLAN, TOUR, SCAN, and COMBI functions, as well as the various automation levels of the SCAN function. The automated flight and automated imaging reduce the overall time taken to complete the imaging of the PV arrays 310. The UAV 20 can execute the process 400 and operate independently with minimal or no human intervention. The process 400 is thus feasible for imaging of large PV installations 300 with many PV arrays 310. More PV arrays 310 can be measured at a faster rate and with less manpower, thereby improving overall efficiency. The quality of the EL visual datasets and processed EL images would also be better as the UAV 20 is more properly aligned to the PV arrays 310 by the automated manoeuvre.
[0140] In the foregoing detailed description, embodiments of the present disclosure in relation to aerial vehicles and methods for automated flight and automated imaging of PV arrays according to the present disclosure are described with reference to the provided figures. Although these embodiments are described in relation to EL or electroluminescence measurements of PV arrays, the aerial vehicles and methods described herein may be used with other imaging technologies such as visual, thermal (infrared), ultra-violet (UV) fluorescence, and photoluminescence (PL) imaging. For example for PL imaging, photoluminescence of the PV arrays can be created with a light source instead of a power supply. For example for UV fluorescence, fluorescence signals from the PV arrays can be created by illuminating the PV arrays with UV light. It will be appreciated that various aspects of the embodiments described herein in relation to EL imaging or measurements may apply similarly or analogously to other imaging technologies such as visual, infrared, UV, and PL imaging.
[0141] The description of the various embodiments herein is not intended to call out or be limited only to specific or particular representations of the present disclosure, but merely to illustrate non-limiting examples of the present disclosure. The present disclosure serves to address at least one of the mentioned problems and issues associated with the prior art. Although only some embodiments of the present disclosure are disclosed herein, it will be apparent to a person having ordinary skill in the art in view of this disclosure that a variety of changes and/or modifications can be made to the disclosed embodiments without departing from the scope of the present disclosure. Therefore, the scope of the disclosure as well as the scope of the following claims is not limited to embodiments described herein.