HELIOSTAT CALIBRATION
20230341151 · 2023-10-26
Assignee
Inventors
CPC classification
F24S23/77
MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
F24S50/20
MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
F24S2050/25
MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
International classification
F24S50/20
MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
Abstract
Systems and methods for calibrating a heliostat (104) are disclosed. An imaging device (100) is positioned and oriented so that a calibration target (130) reflected by the heliostat (104) is visible at the imaging device, and an image is taken. Multiple features of the reflected calibration target in the image are identified and used to determine a centroid of reflection within the image, which is then mapped to a corresponding centroid position on the calibration target. A vector associated with the centroid position is then determined and used to calibrate the heliostat.
Claims
1. A method of calibrating a heliostat, comprising: positioning and orienting an imaging device so that a calibration target reflected by the heliostat is visible at the imaging device, the imaging device having a known position; by means of the imaging device, taking an image of the heliostat that includes the reflected calibration target visible on the heliostat; identifying multiple features of the reflected calibration target in the image; using the multiple features to determine a centroid of reflection within the image; mapping the centroid of reflection within the image to a corresponding centroid position on the calibration target; determining a vector associated with the centroid position; and using the vector to calibrate the heliostat.
2. The method as claimed in claim 1, wherein the calibration target is divided into a number of segments, each segment representing one of the features that is identified.
3. The method as claimed in claim 2, wherein the segments have a visual coding applied thereto so that a sub-set of segments can be uniquely identified within all of the segments.
4. The method as claimed in claim 3, wherein the visual coding includes colours applied to at least a portion of each segment.
5. The method as claimed in claim 2, wherein only a sub-set of the segments are visible within the image.
6. The method as claimed in claim 2, wherein determining a centroid of reflection within the image includes determining a weighted average of areas of the segments shown within the image.
7. The method as claimed in claim 6, wherein an area of a segment may be determined based on a number of pixels in the image within the segment.
8. The method as claimed in claim 1, wherein using the multiple features to determine a centroid of reflection within the image and mapping the centroid of reflection within the image to a corresponding centroid position on the calibration target includes: determining coordinates of a centre point (A) in the image that corresponds to the centre position of the heliostat; determining coordinates of the identified features (h1, h2, h3, h4) of the reflected calibration target in the image; determining a projective transformation matrix (H) that transforms the features (h1, h2, h3, h4) to actual coordinates of corresponding features on the calibration target; and applying the projective transformation matrix (H) to the centre point (A) so as to determine coordinates of a point (B) on the target corresponding to the centre point (A), the point (B) being designated as the centroid position on the calibration target.
9. The method as claimed in claim 8, wherein the step of determining the coordinates of the centre point (A) includes analysing the image to identify corners (p1, p2, p3, p4) of the heliostat, and calculating the centre point (A) as the point at which lines connecting the corners intersect.
10. The method as claimed in claim 8, wherein the identified features (h1, h2, h3, h4) of the reflected calibration target are four corners of the reflected calibration target.
11. The method as claimed in claim 1, wherein the imaging device is mounted on an aerial vehicle.
12. The method as claimed in claim 1, wherein the imaging device is mounted on a pole or pedestal.
13. The method as claimed in claim 1, including an initial step of moving the heliostat into a calibration orientation.
14. The method as claimed in claim 1, wherein the steps of the method are repeated in respect of each image recorded by the imaging device, so as to rapidly obtain multiple calibration points.
15. The method as claimed in claim 1, wherein the known position of the imaging device is a position relative to the heliostat to which the sun does not move during the day.
16. The method as claimed in claim 1, wherein the imaging device's position is determined by means of a real-time kinematic (RTK) global positioning system (GPS).
17. The method as claimed in claim 1, wherein the imaging device is capable of calibrating more than one heliostat from each known position by taking images of more than one heliostat.
18. The method as claimed in claim 1, wherein the imaging device is moved to successive known positions so as to successively obtain different calibration points.
19. The method as claimed in claim 1, in which more than one imaging device is employed simultaneously over a field of heliostats.
20. A system for calibrating a heliostat, comprising: an imaging device which is positioned and oriented so that a calibration target reflected by the heliostat is visible at the imaging device; a position detection system to determine the position of the imaging device; and a processor; wherein the imaging device takes an image of the heliostat that includes the reflected calibration target visible on the heliostat; and wherein the processor: identifies multiple features of the reflected calibration target in the image; uses the multiple features to determine a centroid of reflection within the image; maps the centroid of reflection within the image to a corresponding centroid position on the calibration target; determines a vector associated with the centroid position; and uses the vector to calibrate the heliostat.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0051] In the drawings:
DETAILED DESCRIPTION WITH REFERENCE TO THE DRAWINGS
[0062] Embodiments of the invention provide for systems and methods of calibrating a heliostat. An imaging device may be positioned and oriented so that a calibration target reflected by the heliostat is visible at the imaging device. The imaging device may have a known position. The imaging device may then take an image of the heliostat that includes the reflected calibration target visible on the heliostat. Multiple features of the reflected calibration target in the image may then be identified. These features may be used to determine a centroid of reflection within the image. The centroid of reflection may be mapped to a corresponding centroid position on the calibration target. A vector associated with the centroid position may then be determined and used to calibrate the heliostat.
[0065] Each imaging device (100) moves above a field of heliostats (104.1, 104.2, 104.3) that operate by reflecting sunlight onto a central thermal receiver (108), where a working fluid may be heated. The heated fluid may be used to drive a turbine to produce electricity, or as a source of process heat, for example to heat manganese ore before it enters a smelter. The schematic shows only three heliostats (104.1, 104.2, 104.3), but it will be appreciated that a typical concentrated solar power (CSP) plant could have thousands of heliostats.
[0066] To properly calibrate a heliostat, it is necessary to measure its normal vector (the vector perpendicular to the plane of its mirror).
[0067] The imaging device (100) may be positioned and oriented so that a calibration target (102) reflected by a heliostat (104) is visible at the imaging device (100). The imaging device (100) has a known three-dimensional position. Its position may be determined by a position detecting system, for example by means of a Global Positioning System (GPS) or by means of photogrammetry in which multiple recognized features enable triangulation to be performed. In the case of a GPS being used, typical onboard GPS devices provided in association with aerial vehicles such as drones may not be accurate enough, so in some embodiments the aerial vehicle may be provided with a real-time kinematic (RTK) GPS system, in which a fixed base station wirelessly sends out corrections to the onboard GPS system to provide centimetre-level positioning accuracy.
[0069] The imaging device, such as an onboard camera on the drone, then takes an image (110) of the heliostat that includes at least a portion of the calibration target (102) visible on the heliostat mirror. Such an image (110) taken by an imaging device is shown in
[0070] Referring to
[0071] To determine the
[0073] In the embodiment shown in
[0074] The visual coding system is designed such that a sub-set of segments can be uniquely identified within all of the segments, so that if only a sub-set of segments is visible within the image, the position of that sub-set within the entire calibration target is known. A visual coding system such as the one illustrated may be determined by an iterative method using only three different colours (represented here by the square, circle and cross symbols) and a constraint that every 3 by 3 group of segments should be unique, so that if any 3 by 3 group of segments is visible in the image, its position within the calibration target is known. Many other visual coding systems can of course be obtained with different requirements and constraints.
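As a rough illustration of how such a coding might be found, the sketch below randomly generates grids over a three-symbol alphabet and keeps the first one in which every 3 by 3 window is unique. The grid size, random search and seed are assumptions for illustration; the patent states only that an iterative method with a 3 by 3 uniqueness constraint was used.

```python
import random

def windows_unique(grid):
    """True if every 3x3 window of symbols in the grid is distinct."""
    rows, cols = len(grid), len(grid[0])
    seen = set()
    for r in range(rows - 2):
        for c in range(cols - 2):
            window = tuple(grid[r + dr][c + dc]
                           for dr in range(3) for dc in range(3))
            if window in seen:
                return False
            seen.add(window)
    return True

def search_coding(rows, cols, symbols=3, seed=0, tries=10000):
    """Randomly search for a coding whose 3x3 windows are all unique."""
    rng = random.Random(seed)
    for _ in range(tries):
        grid = [[rng.randrange(symbols) for _ in range(cols)]
                for _ in range(rows)]
        if windows_unique(grid):
            return grid
    return None

coding = search_coding(8, 8)
```

With 3^9 = 19683 possible 3x3 windows and only 36 windows in an 8 by 8 grid, a random grid satisfies the constraint with high probability, so the search terminates quickly.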
[0075] Referring back to
[0076] Each segment (114) has an x and y position on the calibration target that corresponds to a measurement of the centre of that segment from a starting coordinate, such as a (0, 0) coordinate at the bottom left of the calibration target. For example, if as illustrated in
[0077] The x-coordinate of the centroid (124) can then be calculated by the following formula:

x_c=Σ(x_i·I_i)/Σ(I_i)  (1)

where i is the segment number, x_i is the x-coordinate of segment i and I_i is the number of pixels measured within segment i in the image (110), thereby also corresponding to the area of that segment in the image (110). The symbol I is used because the number of pixels represents an intensity of light: I_i is thus equivalent to the amount of light that would have fallen onto segment i if the imaging device had been the sun.
[0078] The y-coordinate of the centroid (124) can be calculated by the same formula applied to the y-coordinates of the segments:

y_c=Σ(y_i·I_i)/Σ(I_i)  (2)
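The weighted-average centroid described above can be sketched as follows; the segment coordinates and pixel counts are hypothetical:

```python
def centroid_of_reflection(segments):
    """Centroid as the pixel-count-weighted average of segment centres.

    Each entry is (x_i, y_i, I_i): the segment centre on the calibration
    target (e.g. in metres) and the number of image pixels falling inside
    that segment.
    """
    total = sum(I for _, _, I in segments)
    x_c = sum(x * I for x, _, I in segments) / total
    y_c = sum(y * I for _, y, I in segments) / total
    return x_c, y_c

# Hypothetical measurements: three segments visible in the image.
segments = [(0.5, 0.5, 200), (1.5, 0.5, 200), (0.5, 1.5, 100)]
centroid = centroid_of_reflection(segments)  # ≈ (0.9, 0.7)
```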
[0079] Since the (x, y) coordinates of the centroid (124) are now determined, the corresponding three-dimensional position of the centroid in space can also be determined, since the position of the calibration target is known.
[0080] From the centroid position and the known position of the imaging device, the H-vector can be determined.
[0081] The H-vector is then used to calibrate the heliostat by updating parameters of a tracking model of the heliostat according to existing methods.
[0082] This method is then repeated with the heliostat moved to a different calibration position so as to obtain multiple calibration points.
[0083] An aerial vehicle like a drone can rapidly be flown into different positions to acquire multiple calibration points.
[0084] Furthermore, the drone can be flown into positions through which the sun does not move during the day, making it possible to define a heliostat tracking model more accurately and with a wider range of calibration points than would be available when using the sun's reflection for calibration.
[0085] Since multiple drones can be utilized, and individual drones can be configured to calibrate multiple heliostats, calibration of a field of heliostats can be carried out much more quickly than when using a beam characterization system (BCS). This allows calibration to be performed more often and permits a less expensive and more robust heliostat structure to be employed. When new CSP plants are being erected, the method only needs a target with a known location to be set up initially, and this can be done before the entire tower is erected, which can shorten calibration time.
[0087] In the case where a single imaging device is used to calibrate multiple heliostats, a wide-angle lens can be used so as to capture multiple heliostats within the single image, or alternatively more than one camera on the drone can be provided with the cameras oriented differently to take multiple images.
[0088] With reference to
A.x=(p1.x+p2.x+p3.x+p4.x)/4 (4)
A.y=(p1.y+p2.y+p3.y+p4.y)/4 (5)
The next step is to determine, in the captured image, the pixel coordinates of at least one feature (h) of the reflected calibration target. While a single point such as a central marking in the reflected calibration target could work, a higher degree of accuracy is achieved by obtaining multiple features. Where the calibration target is rectangular, the four corner points (h1, h2, h3, h4) of the calibration target can be identified by image processing software. Each corner point (h1, h2, h3, h4) is an x-y pixel coordinate within the image.
[0089] Then, using homography, a projective transformation matrix (H) is determined that transforms the four corner points (h1, h2, h3, h4) to actual coordinates of the corresponding corners on the calibration target (i.e. positions on a plane of the calibration target). MATLAB™ has a command called "fitgeotrans" which achieves this. In a projective transformation, straight lines remain straight, but parallel lines do not necessarily remain parallel. This transformation is suitable because it was found that the images obtained had little to no image distortion due to the camera lens, and the heliostat mirrors themselves are flat. The projective transformation matrix (H) may be represented by the following equation:

s·[B.x, B.y, 1]^T=H·[A.x, A.y, 1]^T  (3)

where H is a 3×3 matrix and s is a non-zero scale factor.
[0090] The projective transformation matrix (H) is then applied to the centre point (A) so as to determine coordinates of a point (B) on the target (130) as seen in
[0091] In one example, the target's centre can be designated coordinates (0, 0). For a target that is 2 m by 2 m in size, the coordinates of the corners are R1=(1,1), R2=(1,−1), R3=(−1,−1) and R4=(−1,1). When point h1 is transformed using the matrix H, the result is (1,1) or the coordinates of R1. When h2 is transformed, the result is (1,−1) or R2. When A is transformed, the result is point B. The three-dimensional coordinates of point B in space can then be determined since the position of the calibration target is known.
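The homography step can be sketched without MATLAB's fitgeotrans by solving directly for the eight unknowns of H (with H[2][2] normalised to 1). The pixel coordinates of the corners h1..h4 and of the centre point A below are hypothetical; the target corners R1..R4 follow the 2 m by 2 m example above.

```python
def solve_linear(A, b):
    """Gaussian elimination with partial pivoting for a small square system."""
    n = len(A)
    M = [row[:] + [rhs] for row, rhs in zip(A, b)]
    for col in range(n):
        pivot = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[pivot] = M[pivot], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def fit_homography(src, dst):
    """3x3 projective transform mapping four src points to four dst points."""
    A, b = [], []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -x * u, -y * u]); b.append(u)
        A.append([0, 0, 0, x, y, 1, -x * v, -y * v]); b.append(v)
    h = solve_linear(A, b) + [1.0]   # append the normalised H[2][2]
    return [h[0:3], h[3:6], h[6:9]]

def apply_homography(H, pt):
    x, y = pt
    w = H[2][0] * x + H[2][1] * y + H[2][2]
    return ((H[0][0] * x + H[0][1] * y + H[0][2]) / w,
            (H[1][0] * x + H[1][1] * y + H[1][2]) / w)

# Hypothetical pixel coordinates of the reflected target corners h1..h4,
# and the known corners R1..R4 of a 2 m x 2 m target.
h = [(420, 310), (610, 330), (600, 520), (415, 505)]
R = [(1, 1), (1, -1), (-1, -1), (-1, 1)]
H = fit_homography(h, R)

A_pt = (510, 415)                  # hypothetical centre point A of the heliostat
B_pt = apply_homography(H, A_pt)   # corresponding point B on the target plane
```

By construction, applying H to each corner point reproduces the corresponding target corner exactly, which is a convenient sanity check on the fit.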
[0093] Once point B is known, the vector between point B and the heliostat can be determined.
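One standard way to use this vector, sketched below as an assumption based on ordinary reflection geometry rather than text recovered from the patent, is to estimate the mirror's normal vector as the bisector of the unit vectors from the mirror centre toward the camera and toward point B:

```python
import math

def unit(v):
    """Scale a vector to unit length."""
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def mirror_normal(camera, mirror_centre, target_point):
    """Mirror normal as the bisector of the unit vectors from the mirror
    centre toward the camera and toward the reflected point on the target."""
    to_camera = unit(tuple(c - m for c, m in zip(camera, mirror_centre)))
    to_target = unit(tuple(t - m for t, m in zip(target_point, mirror_centre)))
    return unit(tuple(a + b for a, b in zip(to_camera, to_target)))

# Hypothetical positions in metres (east, north, up):
camera = (5.0, -10.0, 30.0)   # drone-mounted imaging device (RTK-GPS)
mirror = (0.0, 0.0, 2.0)      # heliostat mirror centre
point_b = (-3.0, 40.0, 20.0)  # point B on the calibration target
normal = mirror_normal(camera, mirror, point_b)
```

In the symmetric case where the camera and point B sit at mirror-image positions about a vertical axis through the mirror centre, the computed normal points straight up, as reflection geometry requires.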
[0094] The embodiment illustrated with reference to
[0095] Experimental Results
[0096] The feasibility of the method according to the second embodiment was determined experimentally. A heliostat was initially calibrated with the existing BCS calibration method and then held stationary in place (i.e. frozen). From this the actual orientation of the heliostat can be determined using the azimuth and elevation angles of the sun at the time at which the BCS calibration was carried out, or
[0097] A drone with RTK-GPS was then flown over this frozen heliostat to capture the reflection of the target with the drone-mounted camera. The measured azimuth and elevation angles are shown in the table below.
TABLE 1

Measurement number | Azimuth of heliostat (degrees) | Elevation of heliostat (degrees)
1 | −3.697 | 28.685
2 | −3.488 | 26.778
3 | −3.324 | 25.477
4 | −4.408 | 26.282
5 | −4.377 | 26.501
6 | −3.478 | 26.160
7 | −2.976 | 25.872
Average of measurements using drone system (n = 20) | −3.6626 | 25.8520
Control | −3.755 | 25.701
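As a quick arithmetic check of the agreement reported in the table, the drone-derived average orientation can be compared with the BCS control:

```python
# Figures transcribed from the experimental-results table above.
drone_average = {"azimuth": -3.6626, "elevation": 25.8520}  # n = 20 drone images
control = {"azimuth": -3.755, "elevation": 25.701}          # BCS-derived control

error = {key: abs(drone_average[key] - control[key]) for key in drone_average}
# error ≈ {"azimuth": 0.0924, "elevation": 0.1510} degrees
```

The averaged drone measurement thus agrees with the BCS control to within about 0.1–0.15 degrees on both axes.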
[0098] The foregoing description has been presented for the purpose of illustration; it is not intended to be exhaustive or to limit the invention to the precise forms disclosed. Persons skilled in the relevant art can appreciate that many modifications and variations are possible in light of the above disclosure.
[0099] The language used in the specification has been principally selected for readability and instructional purposes, and it may not have been selected to delineate or circumscribe the inventive subject matter. It is therefore intended that the scope of the invention be limited not by this detailed description, but rather by any claims that issue on an application based hereon. Accordingly, the disclosure of the embodiments of the invention is intended to be illustrative, but not limiting, of the scope of the invention set forth in any accompanying claims.
[0100] Any of the steps, operations, components or processes described herein may be performed or implemented with one or more hardware or software units, alone or in combination with other devices. Components or devices configured or arranged to perform described functions or operations may be so arranged or configured through computer-implemented instructions which implement or carry out the described functions, algorithms, or methods. The computer-implemented instructions may be provided by hardware or software units. In one embodiment, a software unit is implemented with a computer program product comprising a non-transient or non-transitory computer-readable medium containing computer program code, which can be executed by a processor for performing any or all of the steps, operations, or processes described.
[0101] Finally, throughout the specification and any accompanying claims, unless the context requires otherwise, the word ‘comprise’ or variations such as ‘comprises’ or ‘comprising’ will be understood to imply the inclusion of a stated integer or group of integers but not the exclusion of any other integer or group of integers.