METHOD, SYSTEM, AND IMAGE PROCESSING DEVICE FOR CAPTURING AND/OR PROCESSING ELECTROLUMINESCENCE IMAGES, AND AN AERIAL VEHICLE
20230005115 · 2023-01-05
Assignee
Inventors
CPC classification
G05D1/0094
PHYSICS
H04N23/67
ELECTRICITY
G01C11/02
PHYSICS
B64C39/024
PERFORMING OPERATIONS; TRANSPORTING
G06F18/2414
PHYSICS
B64U2201/10
PERFORMING OPERATIONS; TRANSPORTING
G06T3/4053
PHYSICS
G06T3/4038
PHYSICS
H02S50/00
ELECTRICITY
B64U2101/30
PERFORMING OPERATIONS; TRANSPORTING
International classification
G05D1/00
PHYSICS
Abstract
A method (400) of capturing and processing electroluminescence (EL) images (1910) of a PV array (40) is disclosed herein. In a described embodiment, the method (400) includes controlling the aerial vehicle (20) to fly along a flight path to capture EL images (1910) of corresponding PV array subsections (512b) of the PV array (40); deriving respective image quality parameters from at least some of the captured EL images; dynamically adjusting a flight speed of the aerial vehicle along the flight path, based on the respective image quality parameters for capturing the EL images (1910) of the PV array subsections (512b); extracting a plurality of frames (1500) of the PV array subsection (512b) from the EL images (1910); determining a reference frame having a highest image quality of the PV array subsection (512b) from among the extracted frames (2100); performing image alignment of the extracted frames (2100) to the reference frame to generate image aligned frames (2130); and processing the image aligned frames (2130) to produce an enhanced image (2140) of the PV array subsection (512b) having a higher resolution than the reference frame. A system, image processing device, and aerial vehicle for the method thereof are also disclosed.
Claims
1. A method of processing electroluminescence (EL) images of a PV array, comprising extracting a plurality of frames of a PV array subsection of the PV array from the EL images, the PV array subsection including one or more PV modules of the PV array; determining a reference frame having a highest image quality of the PV array subsection from among the extracted frames; performing image alignment of the extracted frames to the reference frame to generate image aligned frames by arranging the extracted frames in a stacked arrangement, wherein respective corner points of the PV modules are stacked; and aligning the respective corner points of each PV module in the extracted frames to the corresponding corner points of the PV module in the reference frame; and processing the image aligned frames to produce an enhanced image of the PV array subsection having a higher resolution than the reference frame.
2. A method according to claim 1, wherein extracting the frames from the images comprises determining the respective corner points of each PV module in the images; and constructing respective frames for each PV module based on the identified corner points of each PV module.
3. (canceled)
4. A method according to claim 1, wherein determining a reference frame having a highest image quality comprises evaluating the image quality of each frame based on at least one of sharpness, signal-to-noise ratio, and completeness of the frames.
5. A method according to claim 1, wherein processing the image aligned frames comprises grouping the image aligned frames according to the PV module in each frame; and performing image averaging on each group of image aligned frames to obtain respective enhanced frames for each PV module.
6. A method according to claim 5, wherein the image averaging is based on weighted image stack averaging, and/or a deep convolutional neural network structure.
7. A method according to claim 5, further comprising associating each enhanced frame with a horizontal index and a vertical index according to each PV module's position in the PV array subsection; and arranging the enhanced frames according to its horizontal and vertical index to produce the enhanced image of the PV array subsection.
8. A method according to claim 5, further comprising scaling respective image intensities of each enhanced frame.
9. A method according to claim 1, further comprising mapping the enhanced image of the PV array subsection onto a base-map of the PV array subsection, the base-map including geo-location of each PV module.
10. A method according to claim 9, wherein mapping the enhanced image onto the base-map comprises orientating the enhanced image to align the PV array subsection in the enhanced image to the PV array subsection in the base-map.
11. (canceled)
12. (canceled)
13. (canceled)
14. An image processing device for processing EL images of a PV array, comprising an image processor configured to extract a plurality of frames of a PV array subsection of the PV array from the EL images, the PV array subsection including one or more PV modules of the PV array; determine a reference frame having a highest image quality of the PV array subsection from among the extracted frames; perform image alignment of the extracted frames to the reference frame to generate image aligned frames by arranging the extracted frames in a stacked arrangement, wherein respective corner points of the PV modules are stacked; and aligning the respective corner points of each PV module in the extracted frames to the corresponding corner points of the PV module in the reference frame; and process the image aligned frames to produce an enhanced image of the PV array subsection having a higher resolution than the reference frame.
15. An image processing device according to claim 14, wherein the image processor is further configured to extract the frames from the images by determining respective corner points of each PV module in the images; and constructing respective frames for each PV module based on the identified corner points of each PV module.
16. (canceled)
17. An image processing device according to claim 14, wherein the image processor is further configured to determine a reference frame having a highest image quality by evaluating the image quality of each frame based on at least one of sharpness, signal-to-noise ratio, and completeness of the frames.
18. An image processing device according to claim 14, wherein the image processor is further configured to process the image aligned frames by grouping the image aligned frames according to the PV module in each frame; and perform image averaging on each group of image aligned frames to obtain respective enhanced frames for each PV module.
19. An image processing device according to claim 18, wherein the image processor is further configured to perform image averaging based on weighted image stack averaging, and/or a deep convolutional neural network structure.
20. An image processing device according to claim 18, wherein the image processor is further configured to associate each enhanced frame with a horizontal index and a vertical index according to each PV module's position in the PV array subsection; and arrange the enhanced frames according to its horizontal and vertical index to produce the enhanced image of the PV array subsection.
21. An image processing device according to claim 18, wherein the image processor is further configured to scale respective image intensities of each enhanced frame.
22. An image processing device according to claim 14, wherein the image processor is further configured to map the enhanced image of the PV array subsection onto a base-map of the PV array subsection, the base-map including geo-location of each PV module.
23. An image processing device according to claim 22, wherein the image processor is further configured to map the enhanced image onto the base-map by orientating the enhanced image to align the PV array subsection in the enhanced image to the PV array subsection in the base-map.
24-64. (canceled)
65. A method of obtaining an enhanced image of a PV array subsection of a PV array from EL images of the PV array subsection captured by an aerial vehicle having a camera, the method comprising controlling the aerial vehicle to fly along a flight path to capture EL images of corresponding PV array subsections of the PV array; deriving respective image quality parameters from at least some of the captured EL images; dynamically adjusting a flight speed of the aerial vehicle along the flight path, based on the respective image quality parameters for capturing the EL images of the PV array subsections; extracting a plurality of frames of the PV array subsection from the EL images; determining a reference frame having a highest image quality of the PV array subsection from among the extracted frames; performing image alignment of the extracted frames to the reference frame to generate image aligned frames by arranging the extracted frames in a stacked arrangement, wherein respective corner points of the PV modules are stacked; and aligning the respective corner points of each PV module in the extracted frames to the corresponding corner points of the PV module in the reference frame; and processing the image aligned frames to produce an enhanced image of the PV array subsection having a higher resolution than the reference frame.
66. (canceled)
67. (canceled)
68. (canceled)
69. (canceled)
70. (canceled)
71. (canceled)
72. (canceled)
73. (canceled)
74. (canceled)
Description
BRIEF DESCRIPTION OF DRAWINGS
[0072] Exemplary embodiments will be described with reference to the accompanying drawings.
DETAILED DESCRIPTION
[0105] The following description includes specific examples for illustrative purposes. The person skilled in the art would appreciate that variations and alterations to the specific examples are possible and within the scope of the present disclosure. The figures and the following description of the particular embodiments should not take away from the generality of the preceding summary.
[0107] The setup 100 further includes a switcher box 32 that includes three channels 34. Each PV string 12 of the PV array 10 is connected to a respective channel 34 of the switcher box 32. The setup 100 further includes a power supply 36 connected to the switcher box 32. The power supply 36 is configured to supply each PV string 12 with up to 1000 volts of electricity and a minimum current equal to 10% of the short circuit current of the PV modules 14. By selectively activating the channels 34, an on-site worker 30 selectively supplies the PV strings 12 with an electrical current from the power supply 36, which puts the PV strings 12 under forward bias conditions. When put in the forward bias condition, one or more PV modules 14 in the PV string 12 emit light, otherwise known as electroluminescence (EL), and thus produce an EL signal that is detected by an optical sub-system of an aerial vehicle (e.g. an unmanned aerial vehicle (UAV) 20).
[0109] While the assistant 22 is provided in this embodiment, it should be clear that the worker 30 may deploy the UAV 20 without help from the assistant 22. Additionally, it should also be noted that multiple PV strings 12 may be connected to one channel 34. For example, all three PV strings 12 of the PV array 10 may be connected to a single channel 34. In this scenario, all three PV strings 12 are simultaneously put under forward bias conditions, and the EL images of the entire PV array 10 are captured. Notably, the current received by each PV string 12 is lower in this scenario compared to when each channel 34 is connected to a respective PV string 12, although this does not prevent the PV strings 12 from being put under forward bias conditions.
[0110] Furthermore, a larger PV array may include multiple combiner boxes 16 which are then connected to the inverter box (not shown). Alternatively, the PV array 10 may not include the combiner box 16, and instead, the PV strings 12 are directly connected to the inverter box.
[0111] Preferably, each PV string 12 is supplied with 100% of the short circuit current of the PV modules 14. However, this is not necessary. For example, each PV string 12 may be supplied with a current equal to 60% of the short circuit current of the PV modules 14. A measurement of the same PV array subsection at multiple injection currents may be used to estimate electrical properties of the PV modules 14 and to identify current-dependent defects.
[0113] The optical sub-system 220 is described first with reference to
[0114] The optical sub-system 220 further includes an optical distance measurement device (such as a Light Detection And Ranging device (LIDAR) 224). The LIDAR's optical axis (224a) is aligned to the optical axis 222a of the camera 222. The LIDAR 224 is operable to measure distance of the optical sub-system 220 from the PV array 10.
[0115] The optical sub-system 220 further includes a focused light source (such as a laser 226). The laser's optical axis (226a) is also aligned to the optical axis 222a of the camera 222. The laser 226 is arranged to emit light in the visible spectrum, and has a beam divergence that is not larger than the camera's field-of-view (FOV), which minimizes optical interference from the laser. Furthermore, the laser 226 allows for low power operation, emits light in a narrow waveband, and creates focused shapes which are easily identified by the worker 30. The focused shapes are non-symmetrical, which beneficially allows the worker 30 to identify where the camera 222 is pointing, and also to identify a rotation of the camera's FOV.
[0116] The optical sub-system 220 further includes a single-axis gimbal 228 (shown in
[0117] Alternatively, the optical sub-system 220 may be mounted to the main body 210 via a two-axis or a three-axis gimbal to allow for further degrees of freedom (yaw, roll) for adjusting the optical axis 222a of the camera 222 and provide enhanced stability of the FOV. Furthermore, the focused shapes created by the laser 226 may be symmetrical. An LED may also be used in place of the laser 226. The focus of the lens 223 may be adjustable, either mechanically or electrically driven.
[0118] Referring to
[0119] The onboard processing sub-system 240 includes a controller 250 and a memory unit 252. The controller 250 is configured to execute six functions (POINT, FIND, FOCUS, ALIGN, SCAN, AUTO) according to a set of instructions stored in the memory unit 252. The controller 250 receives information from the optical sub-system 220 including the distance from the PV array 10 to the LIDAR 224, as well as the camera's visual feed. Using the information received from the optical sub-system 220, the controller 250 is configured to operate the optical sub-system 220 and the propulsion device 230 to execute the POINT, FIND, FOCUS, ALIGN, SCAN and AUTO functions. Once the EL images are captured, the UAV 20 returns to its base to transfer the EL images to the image processing device 260 for further processing.
[0120] The image processing device 260 is configured to execute the functions FREEZE and MAP. The image processing device 260 includes a frame extraction module 270, an image enhancement module 280, a mapping module 290, and an image processor 300. The image processing device 260 takes the EL images as input, and outputs an enhanced EL image of the PV array 10.
[0121] The operation of each component of the aerial vehicle is described in more detail in the following section.
[0123] At step 410 of the method 400, the controller 250 executes the POINT function.
[0124] The worker 30 consults a string connection schematic which informs the worker 30 which PV string is put under forward bias according to the channel that is activated/opened. In this embodiment, the string connection schematic contains an error, and the worker 30 is informed that, for a particular channel, a PV string 512a is put under forward bias. In actuality, another PV string 512b is put under forward bias, and one or more PV modules 514b of the PV string 512b emit an EL signal. The PV strings 512a, 512b are part of the PV array 40, and are also referred to as PV array subsections 512a, 512b of the PV array 40.
[0125] After activating the particular channel, the worker 30 manually guides the UAV 20 to the PV string 512a along a flight path 520. The worker 30 notices that no EL signal is being emitted by the PV string 512a and deduces that there is an error in the string connection schematic. In order to determine the location of the PV string 512b that is under forward bias i.e. emitting an EL signal, the worker 30 directs the controller 250 to initiate the FIND function.
[0126] It should be noted that it is not necessary for the worker 30 to manually guide the UAV 20 to the PV string 512a. The worker 30 may initiate the FIND function immediately after deployment of the UAV 20 thus obviating the POINT function. Alternatively, the controller 250 may also be configured to initiate the FIND function automatically upon deployment of the UAV 20.
[0127] At step 420 of the method 400, the controller 250 executes the FIND function.
[0128] In addition, the controller 250 is further configured to dynamically adjust the propulsion device 230 to maneuver the UAV 20 (along the UAV's yaw axis 210a) to a predefined elevation 610 from the ground.
[0130] To give an example, in the initial position, the UAV hovers at a height of 10 m above the ground. If the camera's optical axis angle is moved from 0° to 70°, a radial distance of 55 m is observed (using the law of sines). The camera 222 has an angle-of-view of 60°, which results in a field of view of 10 m when pointing directly towards the ground. Thus, three rotations are sufficient to cover the PV array 40, which has a 25 m radius.
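The coverage estimate above can be checked numerically. The Python sketch below is illustrative only (the function names are not part of the embodiment) and assumes the ground swath of a downward-pointing camera is set by its angle-of-view:

```python
import math

def ground_swath_width(height_m: float, angle_of_view_deg: float) -> float:
    """Width of the camera footprint on the ground when pointing straight down."""
    return 2.0 * height_m * math.tan(math.radians(angle_of_view_deg / 2.0))

def rotations_to_cover(array_radius_m: float, swath_m: float) -> int:
    """Number of concentric sweeps needed so adjacent swaths tile the array radius."""
    return math.ceil(array_radius_m / swath_m)

swath = ground_swath_width(10.0, 60.0)      # ~11.5 m, close to the 10 m quoted above
rotations = rotations_to_cover(25.0, 10.0)  # 3 sweeps for a 25 m array radius
```

At 10 m height and a 60° angle-of-view the swath comes out slightly above the quoted 10 m, and three sweeps suffice for the 25 m array radius, consistent with the example.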
[0131] Notably, the camera's optical axis angle is increased at a decreasing pitch speed. From the perspective of the gimbal 228, the gimbal's pitch speed decreases with increasing pitch angle. Since the length of the scanning path 810 increases with 2π times the radius, the camera's FOV travels along a five times larger distance in its last rotation (2π×5 m≈31 m; 2π×25 m≈157 m). In consequence, the camera's rotation 710, or yaw speed, is adjusted to be five times lower at the last rotation.
[0132] The camera's yaw speed depends on the amount of motion blur that is acceptable in a frame during an exposure time. For a maximum deflection during exposure of 10 pixels, an exposure time of 7 ms and a horizontal sensor resolution of 640 px, a yaw speed of 22 m/s is possible. With an approximate travelled distance for the whole spiral of 282 m (the sum of three circles at 5 m, 15 m and 25 m radius), the FIND function may take at most 13 seconds if no PV string under forward bias is detected. While the FIND function is in progress, the controller 250 checks the images captured by the camera 222 for features of the forward biased PV string 512b. The FIND function stops when an EL signal is detected from the PV string 512b. Upon EL detection, the controller 250 is configured to dynamically adjust the propulsion device 230 to maneuver the UAV 20 towards the EL signal. In this way, the EL signal is used as an optical marker to guide the UAV 20.
[0133] At step 430 of the method 400, the controller 250 executes the FOCUS function. The controller 250 receives, from the LIDAR 224, information regarding the distance of the camera 222 to the one or more PV modules in the PV string 512b. The controller 250 is configured to dynamically adjust the camera's focus according to the measured distance, so that the distance between the camera lens 223 and its focal point matches the distance between the camera lens 223 and the imaged object, thereby maintaining the camera lens' focus.
[0134] At step 440 of the method 400, the controller 250 executes the ALIGN function which is described next in relation to
[0135] Referring to
[0136] The controller 250 receives an EL image 1100 from the camera 222 (as depicted in
[0137] Further, the algorithm also determines an appropriate elevation of the UAV 20 relative to the PV string 512b which puts the PV string 512b at a predefined size ratio within the camera's FOV 910. The predefined size ratio is set to keep a space of about 5-10% between the top and bottom of the PV string 512b and image border 1410 to allow a tolerance to positional oscillations of the UAV 20. In other words, the PV string 512b occupies 80% to 90% of the camera's FOV at the predefined size ratio.
[0138] The algorithm then determines a perspective transformation to align the key points 1320 to the aligned points 1330 (as depicted in
[0139] Notably, the controller 250 is configured to execute the ALIGN function repeatedly while the SCAN function is in progress. This ensures that the camera's optical axis is perpendicular to the PV string's planar surface while the EL images are being captured during the SCAN function. Advantageously, this minimizes perspective distortion and increases the image resolution of the EL images captured by the camera 222. Further, this allows the camera 222 to capture EL images with a more consistent focus across the EL image. In addition, the EL intensity from each PV module 514b is captured accurately, which is important for analysis purposes.
[0140] Notably, if the controller 250 does not detect an end 1210 of the PV string 512b in the EL image 1200, the controller 250 is configured to adjust the propulsion device 230 to maneuver the UAV 20 along the PV string's longitudinal axis 10a (refer to
[0141] At step 450, the controller 250 executes the SCAN function which is described next alongside
[0142] When considering large PV installations and a limited flight time of the UAV 20, scanning speed becomes a crucial parameter in determining system efficiency. Long camera exposure times typically result in better image quality, i.e. better signal-to-noise ratio (SNR). However, a long camera exposure time coupled with a fast scanning speed causes motion blur which reduces the image quality. On the other hand, a short camera exposure time results in EL images with too much noise, especially when the injected current is low, which also reduces image quality.
[0143] Since the PV module 1510 appears in
[0144] Furthermore, even with the ALIGN function being executed repeatedly during the SCAN function, it is difficult for the camera's FOV to remain completely stable throughout the SCAN function. This is especially so considering the positional oscillations of the UAV 20 due to external forces (such as wind) acting on the UAV 20. This is evident in
[0145] Limiting Noise:
[0146] The controller 250 is configured to perform optical flow analysis (e.g. Lucas-Kanade method) during the SCAN function. For each frame in
[0148] The controller 250 calculates a line 1730 through the image centre 1720 at the deflection angle. The line 1730 intersects the image border 1740 at intersection points 1750, 1760. The distance between the intersection points 1750, 1760 represents the object distance travelled through the image plane. The controller 250 then calculates n.sub.frames by taking the ratio of the distance between the intersection points 1750, 1760 to the length of the deflection vector d.sub.f2f.
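The border-to-border travel distance and the resulting n.sub.frames can be sketched as follows. This is an illustrative Python reading (names are assumptions) that interprets n.sub.frames as the number of frames an object remains within the image:

```python
import math

def chord_through_centre(width, height, angle_rad):
    """Length of the line through the image centre, at the given deflection
    angle, between its two intersections with the image border."""
    dx, dy = math.cos(angle_rad), math.sin(angle_rad)
    # Distance from the centre to the nearest border along (dx, dy)
    tx = (width / 2) / abs(dx) if dx else float("inf")
    ty = (height / 2) / abs(dy) if dy else float("inf")
    return 2 * min(tx, ty)

def n_frames(width, height, angle_rad, d_f2f):
    """Frames in which an object stays visible: border-to-border travel
    distance divided by the per-frame deflection d_f2f."""
    return chord_through_centre(width, height, angle_rad) / d_f2f

# A 640x512 image with a purely horizontal deflection of 25 px per frame:
print(n_frames(640, 512, 0.0, 25.0))  # 640 px / 25 px per frame = 25.6 frames
```

A purely vertical deflection would instead use the 512 px image height, so the same object would leave the frame after about 20 frames.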
[0149] The impact of noise on the image quality can be quantified with the SNR. For an average of n.sub.frames aligned frames, the SNR improves with the square root of the number of frames:

SNR.sub.avg=√n.sub.frames×SNR.sub.frame  (1)
[0150] In this embodiment, the SNR of a frame is calculated in the following manner. Otsu's method is applied to the captured EL image to obtain a threshold (t.sub.Otsu) between the dark background 1110 and the PV string 512b (or a portion of the PV string 512b), which appears brighter due to the EL signal emitted by the one or more PV modules 514b. The 'Signal' value is obtained by averaging the intensity of all pixels brighter than the threshold, t.sub.Otsu. The 'Noise' value is obtained before EL measurement from an average of the standard deviation of a pixel over multiple images taken in series with similar or comparable imaging parameters (e.g. exposure time, sensor temperature and gain).
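The Signal and Noise computation can be sketched as below. This is an illustrative Python sketch using a minimal re-implementation of Otsu's method; the pre-measured noise figure from the dark calibration series is passed in as a plain number:

```python
import numpy as np

def otsu_threshold(img: np.ndarray) -> float:
    """Minimal Otsu's method on a greyscale image (any intensity range)."""
    hist, bin_edges = np.histogram(img, bins=256)
    centres = (bin_edges[:-1] + bin_edges[1:]) / 2
    w0 = np.cumsum(hist)                 # pixels at or below each threshold
    w1 = w0[-1] - w0                     # pixels above each threshold
    m0 = np.cumsum(hist * centres)
    mu0 = np.divide(m0, w0, out=np.zeros_like(m0), where=w0 > 0)
    mu1 = np.divide(m0[-1] - m0, w1, out=np.zeros_like(m0), where=w1 > 0)
    between = w0 * w1 * (mu0 - mu1) ** 2  # between-class variance
    return centres[np.argmax(between)]

def snr(img: np.ndarray, noise: float) -> float:
    """'Signal' = mean of pixels above the Otsu threshold; 'noise' is the
    pre-measured per-pixel standard deviation from a dark series."""
    t = otsu_threshold(img)
    signal = img[img > t].mean()
    return signal / noise
```

On a synthetic image whose background sits at 10 counts and whose EL-bright string sits at 200 counts, a calibrated noise of 4 counts yields an SNR of 50.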
[0151] An SNR-dependent scanning speed factor (or simply SNR scanning factor), f.sub.SNR, is applied to the current scanning speed to ensure that the SNR of the image average 1520, SNR.sub.avg, matches a target SNR, SNR.sub.target, using Equation (2):

f.sub.SNR=SNR.sub.avg/SNR.sub.target=(SNR.sub.frame×√n.sub.frames)/SNR.sub.target  (2)
[0152] For example, SNR.sub.target is set at 45 for lab measurements. The camera exposure time is adjusted during the SCAN function to keep SNR.sub.frame at 5 (the minimum requirement for outdoor measurements). The controller 250 then estimates that twenty-five EL images are available for image averaging (n.sub.frames=25). Based on Equation (1), the SNR of the image average 1520 is 5×√25=25, which gives f.sub.SNR=25/45≈0.56 from Equation (2). In other words, the current scanning speed should be reduced to 56% of its current value. In essence, lowering the scanning speed increases the number of frames (n.sub.frames) available for creating the image average 1520.
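Under the reading that the SNR of an n-frame average improves with √n, the worked numbers (SNR.sub.frame of 5, 25 frames, target 45, reduction to 56%) can be reproduced with an illustrative Python sketch (the function name is hypothetical):

```python
import math

def f_snr(snr_frame: float, n_frames: int, snr_target: float) -> float:
    # Averaging n frames improves SNR by sqrt(n); the factor compares the
    # achievable averaged SNR against the target SNR.
    return snr_frame * math.sqrt(n_frames) / snr_target

print(f_snr(5, 25, 45))  # 25/45 ≈ 0.56 → reduce to 56% of the current speed
```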
Limiting Motion Blur:
[0153] To avoid effects of motion blur, the object deflection during the exposure of a frame should also be below a pre-determined maximum value (d.sub.exp_max). A value of 0.75 pixel per exposure time is suggested. The frame-to-frame deflection (d.sub.f2f) can be scaled into the exposure-time deflection (d.sub.exp) using the time difference between two frames (t.sub.f2f) and the exposure time (τ.sub.exp) according to Equation (3):

d.sub.exp=d.sub.f2f×(τ.sub.exp/t.sub.f2f)  (3)
[0154] A motion blur dependent scanning factor (f.sub.blur) is equal to the ratio of the maximum object deflection d.sub.exp_max to the current object deflection d.sub.exp, as shown in Equation (4):

f.sub.blur=d.sub.exp_max/d.sub.exp  (4)
[0155] A scanning speed factor (f.sub.scan) is obtained from a minimum of both factors (f.sub.SNR, f.sub.blur) as shown in Equation (5):
f.sub.scan=min(f.sub.SNR,f.sub.blur) (5)
[0156] To ensure high EL image quality, a maximum set scanning speed, v.sub.quality is obtained using Equation (6):
v.sub.quality=min(v.sub.max,f.sub.scan×v.sub.cur) (6)
[0157] The maximum set scanning speed, v.sub.quality defines the maximum scanning speed at which a high EL image quality can still be achieved.
[0158] According to Equation (6), a target scanning speed is calculated by multiplying the scanning speed factor, f.sub.scan with the current flight speed, v.sub.cur. If the target scanning speed is below a maximum flight speed, v.sub.max of the UAV 20, then the target scanning speed is selected as the maximum set scanning speed, v.sub.quality. In other words, even though the UAV 20 is able to move faster up to its maximum flight speed, v.sub.max, since this reduces the image quality of the EL images, the maximum set scanning speed, v.sub.quality is set below the maximum flight speed v.sub.max.
[0159] If the maximum set scanning speed exceeds the maximum flight speed, v.sub.max of the UAV 20, then the maximum flight speed, v.sub.max is selected as the maximum set scanning speed, v.sub.quality.
[0160] The target flight speed, v.sub.target is obtained from the maximum set scanning speed, v.sub.quality, and a user input factor, f.sub.user according to Equation (7).
v.sub.target=v.sub.quality×f.sub.user  (7)
[0161] The user input factor, f.sub.user is obtained from a deflection of a joystick controlled by the worker 30 remotely, and ranges from 0% to 100%. At 100%, the target flight speed, v.sub.target is simply the maximum set scanning speed, v.sub.quality.
[0162] A smoothing technique is applied to the target flight speed, v.sub.target to minimise jerky movement of the UAV 20. In this embodiment, exponential moving average is used to obtain a set speed, v.sub.set according to Equation (8). α is a smoothness factor within a range of 0 to 100%.
v.sub.set=(α×v.sub.cur)+((1−α)×v.sub.target)  (8)
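Equations (3) to (8) can be combined into a single control-loop step. The sketch below is an illustrative Python reading of the embodiment; the function and parameter names are assumptions:

```python
def set_speed(d_f2f_px, t_f2f_s, exp_s, d_exp_max_px,
              f_snr, v_cur, v_max, f_user, alpha):
    """One iteration of the SCAN speed control loop (Equations (3)-(8))."""
    d_exp = d_f2f_px * exp_s / t_f2f_s             # (3) deflection during exposure
    f_blur = d_exp_max_px / d_exp                  # (4) motion-blur factor
    f_scan = min(f_snr, f_blur)                    # (5) combined scanning factor
    v_quality = min(v_max, f_scan * v_cur)         # (6) quality-limited speed
    v_target = v_quality * f_user                  # (7) apply pilot input
    return alpha * v_cur + (1 - alpha) * v_target  # (8) exponential smoothing
```

With f_user at 100% and alpha at 0 the UAV is commanded directly to the quality-limited speed; larger alpha values smooth the transition, as described for Equation (8).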
[0163] Two exemplary embodiments of the SCAN function are described next with reference to
[0164] Referring to
[0165] Notably, since the user input factor, f.sub.user is 100%, the maximum set scanning speed, v.sub.quality is also the target flight speed, v.sub.target according to Equation (7).
[0166] The controller 250 then dynamically decreases the current flight speed of the aerial vehicle until the target flight speed, v.sub.target is achieved. The smoothing technique according to Equation (8) is applied to minimise the jerky movement of the UAV 20, and this can be seen in the smooth transition of the current flight speed, v.sub.cur of the UAV 20 from 6 m/s (at time=1s) to 3 m/s (at time=2s).
[0167] At time=2s, the controller 250 obtains a scanning speed factor, f.sub.scan2 of 100% from Equation (5). At this point, the current flight speed, v.sub.cur of the UAV 20 matches the maximum set scanning speed, v.sub.quality.
[0168] Referring to
[0169] Since the target scanning speed is below the maximum flight speed, v.sub.max of the UAV 20, instead of selecting the maximum flight speed, v.sub.max as the maximum set scanning speed, v.sub.quality, the target scanning speed is selected as the maximum set scanning speed, v.sub.quality according to Equation (6).
[0170] Similarly, since the user input factor, f.sub.user is 100%, the maximum set scanning speed, v.sub.quality is also the target flight speed, v.sub.target according to Equation (7). The smoothing technique according to Equation (8) is also applied to minimise the jerky movement of the UAV 20.
[0171] The controller 250 then dynamically increases the current flight speed, v.sub.cur of the aerial vehicle until the target flight speed, v.sub.target is achieved. At time=2s, the controller 250 obtains a scanning speed factor, f.sub.scan2 of 100% from Equation (5). At this point, the current flight speed, v.sub.cur of the UAV 20 matches the maximum set scanning speed, v.sub.quality.
[0172] Notably, the controller 250 continuously performs the SCAN function until an opposing end of the PV string 512b is detected. Once the opposing end of the PV string 512b is detected, the controller 250 terminates the SCAN function and the EL images are stored in the memory unit 252.
[0173] At step 460, the controller 250 is configured to execute the AUTO function.
[0174] At t1, as illustrated in
[0175] At t3, as illustrated in
[0177] At t5 in
[0179] The camera 222 captures/digitizes the EL images 1910 at a bit depth larger than 8-bit (e.g. 14- or 16-bit). This allows an image intensity range to be resolved more precisely than within the 256 brightness steps of a monochromatic 8-bit sensor. To reduce the file size, an image encoder based on 8-bit images is used. An upper and lower intensity range of each EL image 1910 is stored in the metadata 1920. The upper and lower intensity range is obtained from the effective dynamic range of each EL image 1910 captured by the camera 222. The range can be used to scale every 8-bit EL image back to the respective lower and upper intensity range of the original higher-bit-depth camera image.
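The range-preserving 8-bit encoding can be sketched as follows. This is an illustrative Python sketch; the metadata key names are assumptions, not the patent's:

```python
import numpy as np

def encode_8bit(img16: np.ndarray):
    """Compress a high-bit-depth EL image to 8-bit, keeping the effective
    dynamic range in metadata so the intensities can be restored later."""
    lo, hi = int(img16.min()), int(img16.max())
    scaled = (img16.astype(np.float64) - lo) / max(hi - lo, 1) * 255.0
    return np.round(scaled).astype(np.uint8), {"intensity_lo": lo, "intensity_hi": hi}

def decode_8bit(img8: np.ndarray, meta):
    """Rescale the 8-bit image back into the original intensity range."""
    lo, hi = meta["intensity_lo"], meta["intensity_hi"]
    return img8.astype(np.float64) / 255.0 * (hi - lo) + lo
```

The round trip loses at most one 8-bit quantisation step of the original range, which is the trade-off the embodiment accepts for the smaller file size.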
[0180] Once the UAV 20 returns to its base, the stored EL images are then transferred to the image processing device 260 for further processing.
[0181] At step 470, the image processing device 260 executes the FREEZE function which is described next with reference to
[0182] The FREEZE function includes (i) a frame extraction step performed by the frame extraction module 270; and (ii) an image enhancement step performed by the image enhancement module 280.
[0184] All corner points 2010 detected in the first EL frame image 2000a of
[0185] Referring to
[0187] The image processor 300 further controls the image enhancement module 280 to group the frames 2100 according to the PV module 514b contained in each frame (or, equivalently, according to their module index 2110). Each frame 2100 includes four cluster points 2022 (respectively marked 'A' to 'D'). The frames 2100 in each group are then arranged in a stacked arrangement (referred to as a stack) so that the cluster points 2022 that are marked with the same letter ('A' to 'D') are stacked on top of each other. An exemplary stack 2120 having the module index [1,1] is shown in
[0188] The image enhancement module 280 is further configured to discard the area 2124 that are not part of the frames 2100.
[0189] Using the exemplary stack 2120 as an example, the image enhancement module 280 is further configured to determine a reference frame having a highest image quality from the frames 2100 within the exemplary stack 2120. The image quality is evaluated based on sharpness, SNR and completeness of the PV module 514b within the frames.
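A minimal sketch of selecting the reference frame follows; only the sharpness component is modelled here, using a variance-of-Laplacian estimate as a stand-in (the SNR and completeness criteria named in [0189] are omitted for brevity, and the weighting of the combined score is not specified in the embodiment):

```python
import numpy as np

def sharpness_score(img):
    """Variance of a discrete Laplacian -- a common sharpness proxy.
    Higher values indicate more high-frequency detail (sharper image)."""
    img = img.astype(np.float64)
    lap = (-4 * img
           + np.roll(img, 1, axis=0) + np.roll(img, -1, axis=0)
           + np.roll(img, 1, axis=1) + np.roll(img, -1, axis=1))
    return lap.var()

def select_reference_frame(frames):
    """Return the index of the frame with the highest sharpness score."""
    return int(np.argmax([sharpness_score(f) for f in frames]))
```

A blurred or flat frame yields a low Laplacian variance, so the sharpest frame in the stack is chosen as the reference.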
[0190] The image enhancement module 280 is further configured to perform image alignment of the frames 2100. This is done using an image alignment algorithm such as ‘Parametric Image Alignment using Enhanced Correlation Coefficient’. The image enhancement module 280 aligns the remaining frames 2100 in the exemplary stack 2120 to the reference frame. Image alignment is done by aligning the cluster points 2022 that are marked ‘A’ in the remaining frames to the corresponding cluster point 2022 that is marked ‘A’ in the reference frame to obtain image-aligned frames 2130.
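The described embodiment uses the ECC algorithm (implemented, for example, by OpenCV's `findTransformECC`). As a self-contained illustration of aligning frames via corresponding cluster points, the following sketch instead estimates a least-squares affine transform from the 'A'-'D' point pairs; the function names are hypothetical:

```python
import numpy as np

def estimate_affine(src_pts, dst_pts):
    """Least-squares affine transform mapping src_pts to dst_pts.
    src_pts/dst_pts are (N, 2) arrays of corresponding cluster points
    (e.g. the points marked 'A'-'D'); N >= 3 is required."""
    src = np.asarray(src_pts, dtype=np.float64)
    dst = np.asarray(dst_pts, dtype=np.float64)
    A = np.hstack([src, np.ones((len(src), 1))])   # rows: [x, y, 1]
    # Solve A @ M = dst for the 3x2 affine matrix M in least squares.
    M, *_ = np.linalg.lstsq(A, dst, rcond=None)
    return M

def apply_affine(M, pts):
    """Apply the 3x2 affine matrix M to an (N, 2) array of points."""
    pts = np.asarray(pts, dtype=np.float64)
    return np.hstack([pts, np.ones((len(pts), 1))]) @ M
```

In practice the estimated transform would be used to warp each remaining frame onto the reference frame's pixel grid before averaging.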
[0191] The image enhancement module 280 is further configured to perform image averaging on the image-aligned frames 2130 to obtain an enhanced frame 2140 of the particular PV module 2514. Image averaging is performed using a super-resolution routine such as weighted image stack averaging 2032 and/or a dedicated deep convolutional network structure 2034. The enhanced frame 2140 has higher SNR (i.e. lower noise) and higher resolution (up to a resolution improvement factor of three) than the reference frame.
[0192] The same process is repeated for the remaining stacks to obtain respective enhanced frames for the remaining PV modules 514b in the PV string 512b. The image processor 300 further controls the image enhancement module 280 to determine the respective corner points 2010 of each enhanced frame and to remove any remaining perspective distortion in the enhanced frame.
[0193] The image processor 300 further controls the image enhancement module 280 to arrange the enhanced frames according to their module index 2110 to produce an enhanced EL image of the PV string. If distances between the PV modules 514b in the PV string 512b are similar, a single enhanced EL image is produced. If distances vary due to a large gap 2112 between two PV modules (which indicates that one of the PV modules belongs to a separate PV string 512c), then a separate enhanced EL image is produced for the separate PV string 512c.
[0194] The image enhancement module 280 is further configured to scale the image intensities of each enhanced frame to reflect an intensity spectrum from the darkest to the brightest PV module 514b in the PV string 512b. Since the intensity scaling reduces the depth resolution of each PV module's intensity range, the enhanced frames for the respective PV modules 514b are stored together with the enhanced EL image of the PV string.
[0195] During image processing, image intensities are expressed as real or floating-point values. When visually displaying the resulting images, image intensities have to be assigned a brightness value between a darkest and a brightest displayable value. To reduce the influence of pixels with extreme image (or pixel) intensity values, a brightness range is defined by the lowest pixel intensity bin (dotted line 2201a) and the highest pixel intensity bin (dotted line 2201b) that each contain a minimum number of pixels (referred to earlier as the effective dynamic range).
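The effective dynamic range computation implied by [0195] can be sketched as follows; the bin count and the `min_pixels` threshold are illustrative assumptions:

```python
import numpy as np

def effective_dynamic_range(img, bins=256, min_pixels=10):
    """Return the lower edge of the lowest and the upper edge of the
    highest histogram bins containing at least `min_pixels` pixels.
    Bins with fewer pixels (extreme outliers) are excluded, so the
    returned range defines the display brightness range."""
    counts, edges = np.histogram(img.ravel(), bins=bins)
    valid = np.flatnonzero(counts >= min_pixels)
    lo, hi = valid[0], valid[-1]
    return edges[lo], edges[hi + 1]
```

Isolated hot or dark pixels then no longer dominate the display scaling, since their sparsely populated bins fall below the threshold.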
[0196]
[0197]
[0198] Existing known methods may then be used to process the enhanced EL image 2200 to identify any defective PV modules of the PV string 512b based on the EL imaging. With a defective PV module identified, it might be helpful to know the defective PV module's geo-location. For this purpose, the mapping module 290 can be used.
[0199] The image processor 300 controls the mapping module 290 to execute the MAP function as illustrated in
[0200] The image processor 300 controls the mapping module 290 to map the enhanced image onto the base-map by orientating the enhanced image to align the PV array subsection in the enhanced image to the PV array subsection in the base-map. If the geo-location of the PV string 512b in the enhanced EL image 2200 is known, but not its orientation (i.e. camera's yaw), there are four possible orientations with respect to the black triangle 2220 (0°, 90°, 180°, 270° of rotation) to align the image with a PV string in a base-map layer. Commercially available PV modules generally have cell grids of 4×8, 6×10 or 6×12 cells, and are generally rectangular and not square. In such cases, only two of the four orientations are plausible: 0° and 180° of rotation with respect to the black triangle 2220. This is explained in further detail with reference to
[0201] The enhanced EL image 2310 in
[0202] Even though the enhanced EL image 2320 in
[0203] The enhanced EL image 2330 in
[0204]
[0205] When a particular EL image is used for image averaging to produce an enhanced frame, the image processor 300 further instructs the mapping module 290 to associate the image identifier 2420 of the particular EL image with the enhanced frame. For example, the enhanced frame 2430 is associated with unique number identifiers ‘2’,‘3’,‘4’,‘5’ and ‘6’. In other words, the corresponding frames extracted from the EL images ‘2’,‘3’,‘4’,‘5’, and ‘6’ are used for image averaging to produce the enhanced frame 2430.
[0206] The mapping module 290 is then able to determine the orientation (from the two available orientations: 0° and 180° of rotation with respect to the black triangle 2220) of the enhanced image 2200 based on the image identifiers 2420 associated with each enhanced frame. The enhanced frames associated with the image identifier '1' represent the frames captured at the start of the SCAN function, as opposed to the enhanced frames associated with the image identifier '11', which represent the frames captured at the end of the SCAN function.
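The identifier-based disambiguation in [0206] can be sketched as follows, assuming each module position along the enhanced image carries the list of source-image identifiers used to produce its enhanced frame (the function and data layout are hypothetical):

```python
def resolve_orientation(frame_identifiers):
    """Given per-module lists of source-image identifiers, ordered as
    the modules are laid out in the enhanced EL image, return 0 if the
    scan-start frames (lowest identifiers) sit at the front of the
    image, or 180 if the image must be rotated to match the scan
    direction."""
    first = min(frame_identifiers[0])     # earliest source image at one end
    last = min(frame_identifiers[-1])     # earliest source image at the other
    return 0 if first <= last else 180
```

Because the scan direction along the flight path 2450 is known, finding the low identifiers at the wrong end of the string indicates the 180° orientation.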
[0207] Further indicators for position and orientation of the enhanced EL image 2200 within the base-map 2410 are discussed. An approximate center position 2440 of the camera's FOV 910 along the PV string 512b can be calculated from the UAV's flight path 2450 (including flight start 2452 and flight end 2454), flight altitude and camera orientation.
[0208]
d.sub.xy,1=d.sub.L·cos(α)  (9)
[0209] Or based on the flight altitude d.sub.z and the height 2560 of the PV string 512b from the ground d.sub.PV using Equation (4):
d.sub.xy,2=(d.sub.z−d.sub.PV)·tan(α)  (10)
[0210] Equation (9) is preferred over Equation (10) because d.sub.xy,2 requires the height of the imaged object (d.sub.PV) to be known or estimated. Further, the UAV 20 estimates d.sub.z barometrically, which may be erroneous for longer flight times and changing weather.
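Equations (9) and (10) can be expressed directly in code; `alpha` denotes the camera angle α in radians as defined in the surrounding description, and the function names are illustrative:

```python
import math

def dxy_from_laser(d_L, alpha):
    """Equation (9): horizontal distance d_xy,1 from the laser-measured
    range d_L and the angle alpha (radians)."""
    return d_L * math.cos(alpha)

def dxy_from_altitude(d_z, d_PV, alpha):
    """Equation (10): horizontal distance d_xy,2 from the flight
    altitude d_z and the PV string's height d_PV above ground."""
    return (d_z - d_PV) * math.tan(alpha)
```

As noted above, the laser-based form needs no knowledge of the object height and does not depend on the barometric altitude estimate, which is why Equation (9) is preferred.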
[0211] Referring to
[0212]
[0213] Referring to
[0214] Referring to
[0215] Once the enhanced EL image 2200,2610,2630 is mapped onto the base-map 2410,2620,2640, information about the geo-location (such as GPS coordinate) of a PV module defect that is identified in the enhanced EL image 2200,2610,2630 can be readily identified from the base-map 2410,2620,2640 for repair works and/or maintenance.
[0216] Advantageously, in light of the described embodiment, it is possible for the UAV 20 to capture low-resolution, monochromatic videos under dim light conditions, from which images of enhanced resolution and improved quality may nevertheless be produced to identify defective PV modules and to estimate PV module power loss. In particular, the onboard processing sub-system 240 is capable of autonomously navigating the UAV 20 and executing the exemplary method 400.
[0217] Further, since information such as frame-dependent timed geo-location (such as time, latitude, longitude, altitude etc) and camera orientation (e.g. yaw, pitch, roll) are processed, it is possible to reproduce the location of a PV string of a certain EL image reliably and accurately.
[0218] It should be noted that the various embodiments described herein should not be construed as limitative. For example, the UAV 20 may be further equipped with an ultrasound device for additional distance measurements. Furthermore, the camera 222 may capture still EL images of the PV string under forward bias, or record a video of the PV string instead. Further, instead of a monochromatic sensor, a colour sensor can be used. Although the described embodiment uses ‘PV string’ as an example, any other PV electrical connections may be used, and broadly, the embodiment may be used with any PV array.
[0219] Other types of aerial vehicles, such as drones, may be used, and not only UAVs.
[0220] While the exemplary method 400 is described as including all 8 functions: FOCUS, POINT, FIND, ALIGN, SCAN, AUTO, FREEZE, MAP, it is understood that the system 200 may execute any number of the functions, and in any reasonable order. For example, in an alternative embodiment, the onboard processing sub-system 240 may not execute the AUTO function as the worker 30 may want greater control of which PV string to inspect. In this case, the worker manually controls the switcher box 32 and power supply 36 after the SCAN function is completed and initiates the FIND or SCAN function accordingly. The image processing device 260 may also execute the FREEZE function without the MAP function.
[0221] Furthermore, the FOCUS function may be executed at all times throughout the method 400, especially while the SCAN function is in progress, to ensure the captured EL images have a high quality of sharpness. Alternatively, the FOCUS function need not be executed at all if the distance between UAV 20 and PV array 10 can be kept within a narrow range. In such an embodiment, fixed focus lenses without controlled focus adjustments may be used, instead of the focussing lens 223.
[0222] Moreover, in an alternative embodiment, the UAV 20 may remotely transfer the captured EL images to the image processing device 260 without first returning to the base.
[0223] Furthermore, the pre-determined maximum value (d.sub.exp_max) may be set up to 1.5.
[0224] In another example, the laser may not need to be turned off if the lens filter is arranged to filter out any optical interference from the laser.
[0225] In a further example, during the SCAN function, the predefined size ratio may also be set to keep a space of about 15% to 20% (or even higher, e.g. 20% to 25%) between the top and bottom of the PV string 512b and image border 1410 to allow a greater tolerance to positional oscillations of the UAV 20, depending on how unstable the UAV 20 appears to be.