IMAGE DATA CAPTURING ARRANGEMENT
20200333140 · 2020-10-22
Inventors
- Andrew ELSON (Wells, GB)
- Mike ROBERTS (Wells, GB)
- Steve HANCOCK (Southampton, GB)
- Simon ASHBY (Southampton, GB)
Cpc classification
H04N23/11
ELECTRICITY
G01C11/02
PHYSICS
H04N23/90
ELECTRICITY
Abstract
An image data capturing arrangement (1) has at least two image data capturing devices. At least one first image data capturing device (2) is adapted to capture data of one or more first images of an object along a first image capturing axis (4), and at least one second image data capturing device (6) is adapted to capture data of one or more second images of stars along a second image capturing axis (8). The second image capturing axis (8) has a known orientation relative to the first image capturing axis (4). A reference clock assigns a time stamp to each first and second image data.
Claims
1. An image data capturing arrangement for an aerial vehicle, comprising: at least one first image data capturing device adapted to capture data of one or more first images of an object along a first image capturing axis, at least one second image data capturing device adapted to capture data of one or more second images of stars along a second image capturing axis, the second image capturing axis having a known orientation relative to the first image capturing axis, and a reference clock for assigning a time stamp to each first and second image data.
2. An image data capturing arrangement according to claim 1, wherein the first image data capturing device or second image data capturing device is adapted to capture data of images based on one or more ranges of wavelengths of electromagnetic radiation within a spectrum, the range of wavelengths corresponding to that of visible light or ultra violet or X-ray or gamma ray or infra-red or microwave or radio waves or any other part of the spectrum.
3. An image data capturing arrangement according to claim 1, wherein the first image data capturing device or second image data capturing device is a camera, a radio detecting and ranging (RADAR) sensor or a Light Detection and Ranging (LiDAR) sensor.
4. An image data capturing arrangement according to claim 1, wherein the first image data capturing device is a different type of device than the second image data capturing device.
5. An image data capturing arrangement according to claim 4, wherein the first image data capturing device is adapted to capture data of one or more object images based on a first range of wavelengths of electromagnetic radiation within a spectrum, and the second image data capturing device is adapted to capture data of one or more star images based on a second range of wavelengths of electromagnetic radiation within the spectrum different than the first range of wavelengths.
6. An image data capturing arrangement according to claim 1, wherein the object is the Earth.
7. An image data capturing arrangement according to claim 1, wherein the second image data capturing device includes an infra-red filter.
8. An image capturing arrangement according to claim 1, wherein the image data capturing arrangement includes a data storage module.
9. An image capturing arrangement according to claim 1, wherein the image data capturing arrangement includes a data transmission module.
10. An image data capturing arrangement according to claim 1, wherein the image data capturing arrangement includes an image data processing module.
11. An image data capturing system comprising the image data capturing arrangement according to claim 1, and further comprising a position determining device arranged to determine a spatial position of the image data capturing arrangement relative to the object.
12. An image data capturing system according to claim 11, wherein the position determining device is configured to determine a spatial position of the image data capturing arrangement by accessing a repository of star image data and correlating star image data from the repository with a plurality of second image data of stars captured by the second image data capturing device.
13. An image data capturing system according to claim 11, further comprising a receiver for receiving satellite signals and the position determining device is arranged to determine the spatial position of the image data capturing arrangement relative to the object according to the satellite signals received.
14. An image data capturing system according to claim 11, wherein the position determining device is configured to determine one or more of the latitude, longitude, altitude and attitude of the image data capturing arrangement.
15. An image data capturing system according to claim 11, wherein at least a part of the position determining device is arranged remotely from the image data capturing arrangement.
16. The image data capturing system according to claim 11, further comprising an information repository providing the object's position and orientation over time with respect to the stars.
17. The image data capturing system according to claim 11, further comprising a processor configured to use the object's position and orientation correlated with the reference clock together with the spatial position of the image data capturing device in order to determine the location of the or each captured first image data on the object and assign object reference location data to the or each first image data.
18. The image data capturing system according to claim 17, wherein the processor is remote from the image data capturing arrangement.
19. An aerial vehicle comprising the image data capturing arrangement according to claim 1.
20. A method of capturing image data comprising the steps of: providing an image data capturing arrangement including at least one first image data capturing device adapted to capture data of one or more first images of an object along a first image capturing axis, at least one second image data capturing device adapted to capture data of one or more second images of stars along a second image capturing axis and a common reference clock, capturing data of one or more first images of the object with the first image data capturing device, capturing data of one or more second images with the second image data capturing device, the second image capturing axis having a known orientation relative to the first image capturing axis, assigning a time stamp to each first image data and each second image data with the common reference clock.
21. A method according to claim 20, further comprising correlating the one or more first image data with the one or more second image data according to the time stamp of each first image data and each second image data.
22. A method according to claim 20, further comprising determining a spatial position of the image data capturing arrangement.
23. A method according to claim 22, further comprising determining the spatial position of the image data capturing arrangement relative to the object according to satellite signals received.
24. A method according to claim 22, further comprising determining the spatial position of the image data capturing arrangement by accessing a repository of star image data and correlating star image data from the repository with a plurality of second image data of stars captured by the second image data capturing device.
25. A method according to claim 22, further comprising determining one or more of the latitude, longitude, altitude and attitude of the image data capturing arrangement.
26. A method according to claim 22, further comprising determining the location on the object of the or each captured first image data using the spatial position of the image data capturing device together with information on the object's position and orientation at the time stamp of the or each first image data.
27. A method according to claim 26, wherein the method step of determining the location on the object of each captured first image data is repeated for each first image data in a series of first image data in order to map at least part of the object.
28. A method according to claim 26, wherein one or more of the steps of correlating first image data with second image data, determining image data capturing device spatial position, and determining the location on the object of the or each captured first image data occurs remotely of the image data capturing arrangement.
29. A method according to claim 20, wherein the object is the Earth.
30. A method according to claim 20, further comprising mounting the image data capturing arrangement to an aerial vehicle, flying the aerial vehicle and capturing first and second image data during flight.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0031] Embodiments of the invention will now be described with reference to the accompanying drawings.
DETAILED DESCRIPTION OF EMBODIMENT(S)
[0042] In an embodiment, an image capturing arrangement is elevated above the Earth to the stratosphere and arranged to capture images of the Earth's surface with the first image capturing device. The second image capturing device captures images of stars at generally the same time as the first image capturing device is capturing images.
[0044] In
[0045] The second camera 6 is fitted with an infra-red lens filter 11, which allows only light at infra-red wavelengths of the electromagnetic spectrum to pass through to the second camera lens 7. The filter 11 absorbs visible light, in order to filter out daylight when capturing images of stars during the day. The filter 11 may not be required if images are being captured at night.
[0046] The term optical axis or principal axis is used to refer to the image capturing axis since in this embodiment the camera records visible light. However, it is to be understood that in alternative embodiments, the camera may be a sensor capturing images in a non-visible part of the electromagnetic spectrum.
[0047] The camera arrangement 1 also includes an ancillary unit 10 including a reference clock 12, such that when capturing images, the time of taking the images can be recorded and stored with each image. The time recorded includes date as well as time information. Images captured by the first camera 2 and the second camera 6 may be taken simultaneously or may be phased over time.
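As a rough sketch of this time-stamping scheme (all names hypothetical; the patent does not specify an implementation), both cameras could stamp every capture from one shared reference clock:

```python
from dataclasses import dataclass

@dataclass
class ImageRecord:
    camera: str       # "first" (Earth-facing) or "second" (star-facing)
    timestamp: float  # seconds on the shared reference clock
    data: bytes

class ReferenceClock:
    """Single clock shared by both cameras (illustrative sketch only)."""
    def __init__(self, start: float = 0.0):
        self._now = start

    def now(self) -> float:
        return self._now

    def tick(self, dt: float) -> None:
        self._now += dt

def capture(camera: str, data: bytes, clock: ReferenceClock) -> ImageRecord:
    # Every capture, from either camera, is stamped against the same clock,
    # so first and second images can later be correlated in time.
    return ImageRecord(camera, clock.now(), data)
```

Because both records carry the same clock's time, simultaneous and phased capture schedules are handled identically downstream.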
[0048] In
[0049] The image capturing arrangements in
[0050] The camera arrangement 1, 20, 30, 40 is elevated to a target altitude by an aerial vehicle. The vehicle in this embodiment is an unmanned aerial vehicle (UAV) as shown in
[0051] The exemplary UAV 50 shown in
[0052] The wings 52 are elongate in a spanwise direction with a total wingspan of around 20 to 60 metres, extending either side of the fuselage 54. Each wing 52 comprises a space frame having a plurality of interlocking ribs and spars.
[0053] Each of the wings 52 carries a motor driven propeller 59 which may be powered by rechargeable batteries; the batteries may be recharged during flight via solar energy collecting cells (not shown) located on the external surface of the aircraft, e.g. on the wings 52. The UAV 50 can therefore fly for extended periods of time, for days or months at a time. The vehicle 50 typically includes an automated control system for flying the UAV 50 on a predetermined flight path. The UAV 50 is capable of straight and level flight, and can turn, fly at inclined angles, and roll, pitch and yaw.
[0054] In this embodiment, a camera arrangement 70 is located within the wing structure 72 of the UAV, as shown in the chordwise cross sectional view through the aerofoil in
[0055] Ribs 74 extend chordwise across the wing 72, and are spaced equidistantly apart in a spanwise direction. Each rib 74 interlocks with a series of spars (not shown) extending generally perpendicularly to the ribs 74. The spars and ribs 74 have slots 75 which enable interlocked joints to be formed. In this manner, hollow cells are formed between adjacent ribs 74 and spars. Upper and lower covers are then placed over the upper and lower surfaces of the space frame to form the wing. The camera arrangement 70 is located within a hollow cell at approximately the quarter chord position of the wing since this is the largest cell within the wing 52 and provides optimal weight balance in the chordwise direction. Any other hollow cell within the wing could alternatively be used, and the weight balanced in conjunction with, for example, payload distribution.
[0056] The first camera 76 in the camera arrangement 70 of
[0059] Multiple images are captured by the first camera 2, typically at a rate of around 5 frames per second. The position of the images on the Earth's surface E is calculated according to the steps above in order to build up a set of images mapping the Earth's surface E.
[0060] Whilst this embodiment relates to mapping the Earth's surface, it will be appreciated that images could be captured by the first camera 2 of, for example the Earth's atmosphere, e.g. cloud patterns, or the Moon, Mars or other celestial body.
Image Capture Operation:
[0061] Once the UAV 50 is at the target altitude and on its intended flight path, the camera arrangement 1 can be brought online ready to capture images. Control of the camera arrangement 1 and each first 2 and second 6 camera occurs in this embodiment via the UAV's control system. The flight path of the UAV is calculated such that the camera arrangement 1 will be optimally located to capture images of relevant parts of the Earth's surface E. As the first camera 2 captures images of the Earth E, so the second camera 6 captures images of stars 9 along the second optical axis 8 in the generally opposite and known direction to the optical axis 4 of the first camera 2. All images have a time stamp associated with the time the image is captured, provided by the reference clock 12 located in the ancillary unit 10.
[0062] The first camera 2 and second camera 6 may be arranged to capture images simultaneously, so they are directly correlated in time. Alternatively, first image and second image capture may occur at different times, in which case each first image will correlate to a point in time offset from the time of adjacent second images.
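The phased case above could be handled by matching each first image to the nearest second image in time. A minimal sketch (hypothetical names; the patent only requires that the data be correlated by time stamp):

```python
from bisect import bisect_left

def correlate(first_times, second_times):
    """For each first-image time stamp, find the index of the closest
    second-image time stamp. second_times must be sorted ascending."""
    pairs = []
    for t in first_times:
        i = bisect_left(second_times, t)
        # The nearest neighbour is either just before or just after t.
        candidates = [j for j in (i - 1, i) if 0 <= j < len(second_times)]
        best = min(candidates, key=lambda j: abs(second_times[j] - t))
        pairs.append((t, best))
    return pairs
```

Simultaneous capture reduces to the trivial case where each first time stamp matches a second time stamp exactly.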
Determination of the Location of the Image Capturing Arrangement in Space:
[0063] Accurate positioning in terms of latitude, longitude and altitude of the camera at the time of image capture may be obtained via triangulation of multiple images of stars or by the use of a positioning system such as a GPS (Global Positioning System) device.
[0064] For example, the camera arrangement 1 could be equipped with a positioning receiver which records the latitude, longitude and altitude of the camera arrangement 1 relative to the Earth E. A commonly available system such as a GPS receiver could be used, calculating position according to information received from satellites in orbit around the Earth. Other positioning systems are available and could alternatively be used, for example GLONASS (GLObal NAvigation Satellite System). If the object is not the Earth but another celestial body such as the Moon or Mars, then the positioning receiver would need to access satellites in orbit around that celestial body, which would need to be in place and providing communication signals for location purposes.
[0065] Alternatively, the position of the camera arrangement can be determined from the star images captured by the second camera 6. By comparing star images captured by the second camera with star images from a known repository, such as those mentioned below under determination of the orientation of the camera arrangement, and triangulating a plurality of star images, the position in space of the camera arrangement at the time of each second image can be determined. Use of a GPS receiver together with analysis of star images to provide latitude, longitude and altitude information is also possible.
Determination of the Orientation of the Camera Arrangement:
[0066] Using the star images captured by the second camera 6 and comparing these with a repository of known star images it is possible to determine the orientation of the second camera 6. Since the orientation of the first camera 2 relative to the second camera 6 is known, this allows the orientation of the first camera 2 and the first optical axis 4 to be determined. Having retrieved the first and second images from the camera arrangement 1, star images are uploaded to a star tracking system, e.g. Astrometry.net or a similar astrometry plate solving system.
[0067] The system compares an index of known star locations with the second images. The Astrometry.net index is based on star catalogues: USNO-B, which is an all-sky catalogue, and TYCHO-2, which covers the 2.5 million brightest stars. Alternative star catalogues exist and could be used. Stars and galaxies in each second image are identified and compared with the index, and a position and rotation of the second image 101 on the sky 100 is returned. A schematic example is as shown in
[0068] Alternative star tracking systems are known, for example the Star Tracker 5000 from the University of Wisconsin-Madison, which determines its attitude with respect to an absolute coordinate system by analysing star patterns in a particular image frame. The ST5000 also uses a star catalogue for reference.
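Whichever plate-solving system is used, its output for a second image (e.g. a right ascension and declination for the image centre) can be converted into a pointing direction for the second optical axis. A minimal sketch, assuming an equatorial frame and ignoring the image rotation (roll) angle for brevity:

```python
import math

def axis_from_plate_solve(ra_deg: float, dec_deg: float):
    """Convert the right ascension / declination returned by a plate-solving
    system into a unit pointing vector for the second optical axis in an
    equatorial frame. Illustrative only; frame conventions vary by system."""
    ra, dec = math.radians(ra_deg), math.radians(dec_deg)
    return (math.cos(dec) * math.cos(ra),
            math.cos(dec) * math.sin(ra),
            math.sin(dec))
```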
[0069] Knowing where on the sky the image captured by the second camera 6 is located, and how the image is angled, enables the orientation of the second camera 6 to be established. The angle of declination, and therefore the location of the principal axis of the first or mapping camera, can thereby be provided to sub-arc-second accuracy.
[0070] Since the orientation of the first camera 2 to the second camera 6 is known, the location and direction of the first optical or principal axis 4 is therefore determined.
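A sketch of this step, assuming the known relative orientation between the two cameras is expressed as a 3x3 rotation matrix derived from the mechanical mounting (names hypothetical):

```python
def first_axis_from_second(second_axis, r_rel):
    """Apply the fixed, known rotation between the two cameras to the
    second optical axis to obtain the first optical axis.
    r_rel is a 3x3 rotation matrix, assumed known from the mounting."""
    return tuple(sum(r_rel[i][k] * second_axis[k] for k in range(3))
                 for i in range(3))

# For cameras mounted back-to-back (axes in generally opposite directions),
# r_rel is a 180-degree rotation, e.g. about the x-axis:
R_OPPOSITE = ((1, 0, 0),
              (0, -1, 0),
              (0, 0, -1))
```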
[0071] The steps above may be carried out in a different order to that described above. For example, positional information may be recorded via a GPS receiver at the same time as images are being captured; this may be relevant if the aerial vehicle is travelling at a significant speed. Equally, the orientation of the camera arrangement may be processed on board as the image(s) are captured. Alternatively, the camera arrangement may simply capture images with a corresponding time stamp and transmit this data via a data link to a remote station for analysis. The location of the camera arrangement may be determined from the second images, in which case the location and orientation determination can occur as a single step.
Determination of the Position of the Object, e.g. the Earth:
[0072] Information on the Earth's location in its orbit and its rotational position at the time stamp of each first image allows the first image capturing axis 4 to be correlated with the Earth's position and orientation, so that the location on the Earth's surface of each first image can be determined.
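One way to obtain the Earth's rotational position at a given time stamp is the standard low-precision sidereal-time formula. The patent does not prescribe any particular method, so the following is only an illustrative sketch:

```python
def gmst_degrees(days_since_j2000: float) -> float:
    """Approximate Greenwich Mean Sidereal Time in degrees, i.e. the Earth's
    rotational position at a given epoch, using the standard low-precision
    formula (Astronomical Almanac approximation)."""
    return (280.46061837 + 360.98564736629 * days_since_j2000) % 360.0

def earth_fixed_longitude(inertial_lon_deg: float,
                          days_since_j2000: float) -> float:
    # Rotate an inertial-frame longitude into the Earth-fixed frame so the
    # ground location of a first image can be expressed in latitude/longitude.
    return (inertial_lon_deg - gmst_degrees(days_since_j2000)) % 360.0
```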
[0073] Using a series of images of the Earth E captured and processed according to this method enables the Earth's surface to be accurately mapped, to an accuracy of 1 square metre on the surface of the Earth.
[0074] In alternative embodiments, the object may be a celestial object other than the Earth; for example, images may be captured of the Moon, its surface, atmosphere, orbit etc., or similarly for Mars or other stars or galaxies. For example, the camera arrangement 30 of
[0075] Although the invention has been described above with reference to one or more preferred embodiments, it will be appreciated that various changes or modifications may be made without departing from the scope of the invention as defined in the appended claims.