A METHOD OF OPERATING A TIME OF FLIGHT CAMERA

20230095342 · 2023-03-30

Abstract

In one aspect the invention provides a method of operating a time of flight camera which includes the steps of capturing a sequence of time of flight camera data frames using a set of step frequency modulation signals to provide a time of flight camera data set, then completing a spectral analysis of the dataset which identifies frequency and phase value pairs indicative of the range of the camera to an object represented in the data frames. Next an estimated camera range value to an object represented in the data frames is determined using the frequency value, then a corrected camera range value is determined using the estimated camera range value and the phase value. A camera output is then provided which identifies the corrected range values of at least one object represented in the data frames of the dataset.

Claims

1. A time of flight camera which includes: a signal generator configured to generate a source modulation signal and to modify the frequency of the source modulation signal by at least one multiple of an offset frequency; a camera light source configured to transmit light modulated by a modulation signal generated by the signal generator; a camera sensor configured to capture time of flight camera data frames from received reflected light; and a processor configured to compile a data set from captured time of flight data frames, to complete a spectral analysis of the received data set which identifies frequency and phase value pairs indicative of the range of the camera to an object represented in the data frames, to determine an estimated camera range value to an object represented in the data frames using the frequency value, to determine a corrected camera range value using the estimated camera range value and the phase value, and to provide a camera output which identifies the corrected range values of objects represented in the data frames of the data set.

2. The time of flight camera of claim 1 wherein the processor is configured to apply a calibration to the frames of the captured data set or during the capture of the data set so that the results of the spectral analysis yield a zero phase value when interpolated to a zero frequency value.

3. The time of flight camera of claim 2 wherein the calibration applied specifies a rotation to be applied to a phase value associated with each modulation frequency used to capture a data frame.

4. The time of flight camera of claim 2 wherein the calibration applied implements a windowing function.

5. The time of flight camera of claim 1 wherein the estimated camera range value is determined by the expression: Estimated range=cω.sub.est/(2B) where c is the speed of light, ω.sub.est is a frequency exhibiting a peak in the spectral analysis and B is the bandwidth of the frequencies used by the camera as modulation signals.

6. The time of flight camera of claim 1 wherein the estimated camera range value is determined by multiplying an index value associated with the frequency by the range resolution of the camera.

7. The time of flight camera of claim 1 wherein the corrected camera range value is determined by adding or subtracting from the estimated range value a correction variable defined by the expression: Corrected range=Estimated range±c/(2B+KΔƒ) where c is the speed of light, B is the bandwidth of the frequencies used by the camera as modulation signals, Δƒ is an offset frequency value and K is a scaling factor.

8. The time of flight camera of claim 7 wherein the correction variable is added when the dataset is ordered with the lowest modulation frequency captured frame first, and the correction variable is subtracted when the data set is ordered with the highest modulation frequency captured frame first.

9. The time of flight camera of claim 1 wherein the captured data frames of the dataset are ordered prior to spectral analysis being completed to present the camera data frame captured using light modulated with the highest frequency modulation signal as the first data frame of the camera data set with each subsequent frame being captured using the next highest frequency modulation signal.

10. The time of flight camera of claim 1 wherein the data frames of the dataset are captured using light modulated with the highest frequency modulation signal as the first data frame of the camera data set with each subsequent frame being captured using the next highest frequency modulation signal.

11. The time of flight camera of claim 1 wherein the captured data frames of the dataset are ordered prior to spectral analysis being completed to present the camera data frame captured using light modulated with the lowest frequency modulation signal as the first data frame of the camera data set with each subsequent frame being captured using the next lowest frequency modulation signal.

12. The time of flight camera of claim 1 wherein the data frames of the dataset are captured using light modulated with the lowest frequency modulation signal as the first data frame of the camera data set with each subsequent frame being captured using the next lowest frequency modulation signal.

13. The time of flight camera of claim 1 wherein the processor is further configured to validate a corrected camera range value against known harmonic artefacts and to remove invalidated corrected range values from the camera output.

14. A set of computer executable instructions for a processor of a time of flight camera, said instructions executing the steps of: capturing a sequence of time of flight camera data frames using a set of step frequency modulation signals to provide a time of flight camera data set, completing a spectral analysis of the dataset which identifies frequency and phase value pairs indicative of the range of the camera to an object represented in the data frames, and determining an estimated camera range value to an object represented in the data frames using the frequency value, and determining a corrected camera range value using the estimated camera range value and the phase value, and providing a camera output which identifies the corrected range values of at least one object represented in the data frames of the dataset.

15. The set of computer executable instructions of claim 14 which includes the additional instruction step of applying a calibration to the frames of the captured data set so that the results of the spectral analysis yield a zero phase value when interpolated to a zero frequency value.

16. The set of computer executable instructions of claim 15 wherein the calibration applied specifies a rotation to be applied to a phase value associated with each modulation frequency used to capture a data frame.

17. A method of operating a time of flight camera which includes the steps of: capturing a sequence of time of flight camera data frames using a set of step frequency modulation signals to provide a time of flight camera data set, completing a spectral analysis of the data set which identifies frequency and phase value pairs indicative of the range of the camera to an object represented in the data frames, and determining an estimated camera range value to an object represented in the data frames using the frequency value, and determining a corrected camera range value using the estimated camera range value and the phase value, and providing a camera output which identifies the corrected range values of at least one object represented in the data frames of the data set.

18. The method of claim 17 which includes the additional step of applying a calibration to the frames of the captured data set so that the results of the spectral analysis yield a zero phase value when interpolated to a zero frequency value.

19. The method of claim 18 wherein the calibration applied specifies a rotation to be applied to a phase value associated with each modulation frequency used to capture a data frame.

Description

BRIEF DESCRIPTION OF THE DRAWINGS

[0108] Additional and further aspects of the present invention will be apparent to the reader from the following description of embodiments, given by way of example only, with reference to the accompanying drawings in which:

[0109] FIG. 1 shows a block schematic diagram of the components of the time-of-flight camera provided in accordance with one embodiment of the invention,

[0110] FIG. 2 shows a flowchart of a program of computer executable instructions arranged to operate the time of flight camera of FIG. 1 as provided in accordance with one embodiment,

[0111] FIG. 3 shows a flowchart of a program of computer executable instructions arranged to operate the time of flight camera of FIG. 1 as provided in accordance with an alternative embodiment to that described with respect to FIG. 2,

[0112] FIG. 4 shows a plot of single pixel raw amplitude values recorded during the capture of a sequence of camera data frames by a time of flight camera programmed with the executable instructions illustrated with respect to FIG. 3,

[0113] FIGS. 5a, 5b show comparative plots of phase versus frequency of modulation signals used by the invention prior to and after the application of a calibration,

[0114] FIGS. 6a, 6b show comparative plots of amplitude versus modulation frequency for frame data prior to and after the application of a calibration which also implements a Hanning window function to reduce spectral leakage noise,

[0115] FIGS. 7a, 7b show comparative plots of the spectral analysis and object range results obtained with the prior art, and with use of the invention in one embodiment, and

[0116] FIGS. 8a, 8b show comparative plots of the spectral analysis and object range results obtained with the prior art, and with use of the invention in a further embodiment which utilises the Hanning window function illustrated with respect to FIG. 6b.

[0117] Further aspects of the invention will become apparent from the following description of the invention which is given by way of example only of particular embodiments.

BEST MODES FOR CARRYING OUT THE INVENTION

[0118] FIG. 1 shows a block schematic diagram of the components of the time-of-flight camera 1 provided in accordance with one embodiment of the invention. The camera 1 incorporates the same components as those utilised with a prior art step frequency continuous wave time of flight camera including a signal generating oscillator 2, light source 3, light sensor 4 and processor 5. The processor 5 is programmed with a set of executable instructions which control the operation of each of the remaining components, as described further with respect to FIG. 2.

[0119] FIG. 2 shows the first step A of this operational method where the signal oscillator generates a source modulation signal. Step B is then executed with the light source transmitting light modulated by the source modulation signal and the light sensor capturing a camera data frame.

[0120] At step C instructions are executed to modify the source modulation signal with the subtraction of a frequency offset value to provide a stepped modulation signal.

[0121] Step D is then executed with the light source transmitting light modulated by the modulation signal generated at step C with the light sensor capturing a further camera data frame.

[0122] At step E an assessment is made of the number of data frames captured so far when compared with the number of data frames required for a complete time of flight camera data set. If the data set is incomplete the process returns to step C where the frequency of the last modulation signal used is modified with the subtraction of the frequency offset value.
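The capture loop of steps A through E can be sketched in outline as follows. This is a minimal illustration only: the helper `capture_frame`, the 150 MHz starting frequency, the 5 MHz offset and the 29-frame data set size are assumptions borrowed from the FIG. 4 example (run in descending order, since this method subtracts the offset), not values fixed by the embodiment.

```python
# Sketch of the FIG. 2 capture loop (steps A-E); helper names are hypothetical
# stand-ins for camera hardware calls.
START_HZ = 150e6        # assumed initial source modulation frequency (step A)
OFFSET_HZ = 5e6         # assumed frequency offset subtracted at step C
NUM_FRAMES = 29         # assumed frames in a complete data set

def capture_dataset(capture_frame):
    """capture_frame(freq_hz) stands in for steps B/D: modulate the light
    source at freq_hz and read one data frame off the sensor."""
    freq = START_HZ
    frames = [capture_frame(freq)]           # steps A-B: first frame
    while len(frames) < NUM_FRAMES:          # step E: is the data set complete?
        freq -= OFFSET_HZ                    # step C: subtract the offset
        frames.append(capture_frame(freq))   # step D: capture a further frame
    return frames
```

Each pass through the loop steps the modulation frequency down by one offset until the assessment at step E finds the data set complete.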

[0123] Once a complete data set has been captured step F is completed to perform a spectral transformation on the captured data frames. In the embodiment shown the spectral transformation is performed using a Fourier transform.

[0124] Lastly at step G the values present in the data set are analysed to identify particular frequency and phase values which are associated with objects reflecting light to the camera sensor. Frequency values correlating with the object’s range to the camera are identified, and the phase values associated with them are used to correct or refine the range value provided by the camera for an object.

[0125] In this embodiment an estimated range for an object is calculated by identifying intensity peaks in the results of the spectral analysis associated with particular frequency, phase paired values. For a particular intensity peak at frequency ω.sub.est an estimated range is calculated from the expression:

[00005] Estimated range=cω.sub.est/(2B)

[0126] Once this estimated range has been calculated a corrected range is then calculated at step G using the expression:

[00006] Corrected range=Estimated range±c/(2B+KΔƒ)

This corrected range value can then be provided as camera output to complete step G and terminate the operational method of this embodiment. In this embodiment an image is presented as the camera output, where the colour of individual pixels of this image indicates both the position and the corrected range value of an object in the field of view of the camera.
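The coarse-then-fine estimation of step G can be illustrated with a simulated single-pixel data set. The sweep parameters are borrowed from the FIG. 4 example and the object range from FIGS. 7a, 7b; the coarse estimate follows claim 6 (peak bin index multiplied by the range resolution c/2B), while the refinement here uses a zero-padded Fourier transform as a stand-in for the patent's phase-based correction, which it approximates but does not reproduce.

```python
import numpy as np

C = 299_792_458.0                          # speed of light (m/s)

# Sweep borrowed from the FIG. 4 example: 10..150 MHz in 5 MHz steps.
freqs = np.arange(10e6, 150e6 + 1, 5e6)    # 29 modulation frequencies
n = freqs.size
d_true = 2.5                               # simulated object range (m), as in FIG. 7

# Idealised single-pixel samples: round-trip phase 4*pi*f*d/c per frame.
samples = np.exp(1j * 4 * np.pi * freqs * d_true / C)

# Steps F/G: spectral transformation, then locate the intensity peak.
spectrum = np.fft.fft(samples)
peak = int(np.argmax(np.abs(spectrum[: n // 2])))
phase = float(np.angle(spectrum[peak]))    # phase value paired with the peak

bandwidth = n * 5e6                        # B, span of the modulation sweep
resolution = C / (2 * bandwidth)           # range resolution c/(2B)
estimated = peak * resolution              # coarse range per claim 6 (~2.07 m)

# Stand-in refinement: zero-padding interpolates between the ambiguous
# 2 m / 3 m bins of FIG. 7a, pulling the peak toward the true 2.5 m.
pad = 16
fine = np.fft.fft(samples, pad * n)
corrected = int(np.argmax(np.abs(fine[: pad * n // 2]))) * resolution / pad
```

With these parameters the coarse estimate lands on the 2 m bin and the refined value falls within a few centimetres of 2.5 m, mirroring the ambiguity resolution described for FIGS. 7a and 7b.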

[0127] FIG. 3 shows a flow chart of an alternative program of computer executable instructions which can also be arranged to operate the time of flight camera of FIG. 1.

[0128] Again the first step A of this operational method is executed to operate the signal oscillator to generate a source modulation signal. This modulation signal is generated with the use of a calibration which makes an adjustment to the phase of the signal so that the results of a spectral analysis yield a zero phase value when interpolated to a zero frequency value. In this embodiment the calibration applied specifies a rotation to be applied to a phase value associated with each modulation frequency used to capture a data frame.

[0129] In various additional embodiments this calibration can also be used to ensure that the phase of the modulation signal varies linearly with respect to the frequency of the modulation signal. Additional embodiments can also utilise this calibration to implement a windowing function in addition to adjustments to the phase of the modulation signal as referenced above.
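One plausible way to derive the per-frequency phase rotations described above is a linear fit against reference phase measurements, sketched below. The reference slope, offset and noise values are invented purely for illustration and are not taken from this document.

```python
import numpy as np

# Hypothetical per-frequency calibration data: measured reference phases
# (radians) deviate from the ideal line and carry a fixed offset, as in FIG. 5a.
freqs = np.arange(10e6, 150e6 + 1, 5e6)
rng = np.random.default_rng(0)
ideal_slope = 2.0e-8                       # assumed rad/Hz for the reference path
measured = ideal_slope * freqs + 0.4 + 0.05 * rng.standard_normal(freqs.size)

# Least-squares line through the measured phases; the calibration rotation is
# whatever takes each measured phase onto a line with zero intercept (FIG. 5b).
slope, intercept = np.polyfit(freqs, measured, 1)
rotation = (slope * freqs) - measured      # per-frequency rotation to apply
calibrated = measured + rotation           # exactly linear, zero at 0 Hz
```

After the rotations are applied the phase response is linear in modulation frequency and interpolates to zero phase at zero frequency, as FIG. 5b illustrates.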

[0130] Step B is then executed with the light source transmitting light modulated by the source modulation signal and the light sensor capturing a camera data frame. In this embodiment a captured camera data frame is supplied as an input to a ‘first in last out’ or FILO buffer memory structure implemented by the camera processor.

[0131] At step C instructions are executed to modify the source modulation signal with the addition of a frequency offset value to provide a stepped modulation signal. Again the same calibration described with respect to step A is used to adjust the phase of the resulting stepped modulation signal.

[0132] Step D is then executed with the light source transmitting light modulated by the modulation signal generated at step C with the light sensor capturing a further camera data frame. Again this captured data frame is supplied as the next input to the above referenced FILO buffer.

[0133] At step E an assessment is made of the number of data frames captured so far when compared with the number of data frames required for a complete time of flight camera data set. If the data set is incomplete the process returns to step C where the frequency of the last modulation signal used is modified with the addition of the frequency offset value.

[0134] At step F an ordering process is completed to compile the full set of captured data frames into a single data set. In this embodiment the contents of the FILO buffer are read out, thereby reordering the stored data frames into the sequence provided in accordance with the invention.
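The FILO buffer behaviour of steps B through F can be sketched with an ordinary stack; the dictionaries below stand in for real captured data frames, and the 10-150 MHz sweep is taken from the FIG. 4 example.

```python
# FILO (first-in, last-out) buffer sketch: frames are pushed in capture order
# (lowest modulation frequency first, as in the FIG. 3 method) and popped out
# highest-frequency first, producing the ordering described for FIG. 4.
buffer = []
for freq_mhz in range(10, 151, 5):            # 29 capture steps, 5 MHz apart
    buffer.append({"freq_mhz": freq_mhz})     # push the newly captured frame

dataset = [buffer.pop() for _ in range(len(buffer))]   # read out: reversed order
# dataset[0] holds the 150 MHz frame, dataset[-1] the 10 MHz frame.
```

Reading the buffer out in this way yields the highest-frequency frame first without any explicit sort of the captured data.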

[0135] Once the complete correctly ordered data set has been compiled step G is completed to perform a spectral transformation. In the embodiment shown the spectral transformation is performed using a Fourier transform.

[0136] Lastly at step H the values present in the data set are analysed to identify particular frequency and phase values which are associated with objects reflecting light to the camera sensor. Again frequency values correlating with the object’s range to the camera are identified, and the phase values associated with them are used to correct or refine the range value provided by the camera for an object. In such embodiments step H executes a similar process to that discussed with respect to step G of FIG. 2. In particular, an estimated range for an object is calculated by identifying intensity peaks in the results of the spectral analysis associated with particular frequency, phase paired values. For a particular intensity peak at frequency ω.sub.est an estimated range is calculated from the expression:

[00007] Estimated range=cω.sub.est/(2B)

[0137] Once this estimated range has been calculated a corrected range is then calculated at step H using the expression:

[00008] Corrected range=Estimated range±c/(2B+KΔƒ)

This corrected range value can then be provided as camera output to complete step H and terminate the operational method of this embodiment. In this embodiment camera output is provided to a machine vision system, where the format and content delivered is determined by the requirements of the receiving system.

[0138] FIG. 4 shows a plot of single pixel raw amplitude values recorded during the capture of a sequence of camera data frames undertaken by a time of flight camera programmed with the executable instructions illustrated with respect to FIG. 3. This plot illustrates how 29 camera data frames are captured using a step frequency value of 5 MHz. The modulation frequency used starts at 10 MHz with the final data frame captured at the modulation frequency of 150 MHz.

[0139] Raw amplitude values are captured over time as modulation frequencies are increased and FIG. 4 shows a clear oscillating signal with a defined frequency. Spectral analysis of this data will identify a power peak at the frequency of this oscillating signal, with this frequency correlating to the range of an object reflecting light onto the camera sensor.

[0140] Using the operational method described with respect to FIG. 3 a camera data set is compiled from the raw frame amplitude values shown, with the first element of the data set being the measurement captured with the modulation frequency of 150 MHz. The next frame integrated into the data set is the measurement captured at 145 MHz, with the final frame integrated into the data set being the measurement captured at 10 MHz.

[0141] FIGS. 5a, 5b show comparative plots of phase versus frequency of modulation signals used by the invention prior to and after the application of a calibration.

[0142] As can be seen from FIG. 5a the actual phase values indicated by the dashed data points cycle above and below the solid line identifying a linear response to modulation frequency. The phase response with frequency is also offset so that a non-zero phase will be present at a 0 Hz modulation frequency.

[0143] FIG. 5b illustrates the results of the calibration applied in accordance with various embodiments of the invention. In this embodiment the phase of the modulation signal has been adjusted to vary linearly with frequency. The offset illustrated with respect to FIG. 5a has also been removed so that a zero phase value will result at a 0 Hz modulation frequency.

[0144] FIGS. 6a, 6b show comparative plots of amplitude versus modulation frequency for frame data prior to and after the application of a calibration which also implements a Hanning window function to reduce spectral leakage noise. Sufficient data frames have been captured in the embodiment shown to allow this data to be formatted as a combination of real (X data points) and imaginary (dashed data points) components.

[0145] As can be seen from FIG. 6a no restrictions are applied to the amplitude results obtained from these frames. FIG. 6b shows the application of a Hanning window function within a calibration equivalent to that discussed with respect to FIG. 5b. As can be seen from FIG. 6b amplitude values are scaled to sit underneath the solid curve shown at the uppermost region of this plot. To reduce spectral leakage noise amplitude values are attenuated by the windowing function as the minimum and maximum modulation frequency used are approached.
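The window-based attenuation of FIG. 6b can be sketched as a straightforward element-wise taper; the flat stand-in samples below are illustrative only, and the 29-frame count is borrowed from the FIG. 4 sweep.

```python
import numpy as np

n = 29                                   # frames, as in the FIG. 4 sweep
window = np.hanning(n)                   # tapers to zero at both sweep ends

samples = np.ones(n, dtype=complex)      # stand-in for complex frame data
windowed = samples * window              # attenuate values toward the lowest
                                         # and highest modulation frequencies,
                                         # reducing spectral leakage in the FFT
```

Because the taper falls to zero at the first and last modulation frequencies, the abrupt truncation of the sweep no longer leaks energy into distant range bins, which is the attenuation of far-range noise peaks visible in FIGS. 8a, 8b.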

[0146] FIGS. 7a, 7b show comparative plots of the spectral analysis and object range results obtained with the prior art, and with use of the invention in one embodiment. For convenience range to target in meters has been derived from frequency values for both plots shown. Each plot also identifies the correct actual range of an object in the field of view of the camera at a range of 2.5 m.

[0147] FIG. 7a shows results obtained with the prior art where an estimated range value only is available, determined using frequency information in isolation. As can be seen from this figure an ambiguous range result is obtained from the 3.sup.rd and 4.sup.th data point peaks. This prior art implementation therefore identifies two possible objects, present at 2 m and 3 m respectively.

[0148] FIG. 7b shows results obtained by the invention in one embodiment where the estimated range values illustrated by FIG. 7a are used in combination with phase information to result in the corrected range value illustrated as the 3.sup.rd data point. In the embodiment shown this phase based correction applied to estimated range values combines the two adjacent ambiguous peaks of FIG. 7a into a single accurate 2.5 m corrected range value.

[0149] FIGS. 8a, 8b show comparative plots of the spectral analysis and object range results obtained with the prior art, and with use of the invention in a further embodiment which utilises the Hanning window function illustrated with respect to FIG. 6b. FIGS. 8a and 8b also illustrate the same circumstances as the plots of FIGS. 7a, 7b with an object in the field of view of the camera at 2.5 m.

[0150] Similarly to FIGS. 7a and 7b, in the embodiment shown the use of the invention results in the provision of a single unambiguous peak at 2.5 m with FIG. 8b, compared to the two ambiguous peaks of FIG. 8a at 2 m and 3 m. These figures also show the results of using the Hanning window function discussed with respect to FIG. 6b. As can be seen from a comparison with FIGS. 7a, 7b the prior noise peaks shown at larger ranges have been attenuated due to the windowing function reducing spectral leakage effects.

[0151] In the preceding description and the following claims the word “comprise” or equivalent variations thereof is used in an inclusive sense to specify the presence of the stated feature or features. This term does not preclude the presence or addition of further features in various embodiments.

[0152] It is to be understood that the present invention is not limited to the embodiments described herein and further and additional embodiments within the spirit and scope of the invention will be apparent to the skilled reader from the examples illustrated with reference to the drawings. In particular, the invention may reside in any combination of features described herein, or may reside in alternative embodiments or combinations of these features with known equivalents to given features. Modifications and variations of the example embodiments of the invention discussed above will be apparent to those skilled in the art and may be made without departing from the scope of the invention as defined in the appended claims.