Perspective tracking system
09671876 · 2017-06-06
Assignee
Inventors
CPC classification
G06T7/80
PHYSICS
F41J5/02
MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
G01B11/14
PHYSICS
F41J9/00
MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
F41G3/2605
MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
F41A33/00
MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
F41G3/26
MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
G06F3/0346
PHYSICS
G06F16/7328
PHYSICS
F41G3/2661
MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
F41J9/08
MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
G06F16/7335
PHYSICS
G09B19/00
PHYSICS
International classification
F41A33/00
MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
G09B19/00
PHYSICS
F41J5/02
MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
F41J9/00
MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
F41G3/26
MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
G06F3/0346
PHYSICS
F41J9/08
MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
G01B11/14
PHYSICS
G06T7/80
PHYSICS
Abstract
Resolution of perspective in three dimensions is necessary for intermeshing real players into simulated environments during virtual training exercises. With the advent of high-resolution image sensors, position and orientation can now be sensed using image capture devices. The combination of small sensors and image-recognition tracking algorithms allows the tracking element to be placed directly on the device whose perspective is desired, providing a direct measurement from the center axis of the observer. This invention employs a perspective tracking device to determine a point-of-gaze or point-of-aim in three-dimensional space to a high degree of accuracy. Point-of-gaze may be used to determine views for head-mounted displays and aim-points for weapons. The invention may operate in an unconstrained space, allowing simulation participants to operate in a larger, open environment. Areas of interest in the environment are bounded by area-of-interest markers that identify each region and its physical constraints.
Claims
1. A perspective tracking system comprising: a computer-based perspective tracking device; and an array of emitters configured to define at least one area of interest based on emitted signals from at least two emitters in the array of emitters; wherein the computer-based perspective tracking device is configured to determine a spatial position, orientation, and size of the area of interest based on the emitted signals from the array of emitters; and wherein the computer-based perspective tracking device is configured to measure a light intensity of each of the at least two emitters in the array of emitters over a detection threshold and apply a correction equation to the light intensity measurement that yields temperature-based offsets.
2. The system of claim 1 wherein at least one emitter in the array of emitters comprises a light emitter.
3. The system of claim 2 wherein said at least one emitter in the array of emitters is configured to modulate the light to output an identification of said light emitter.
4. The system of claim 3 wherein said identification comprises one or more of: an identification sequence; an error correction; and a validation sequence.
5. The system of claim 1 wherein said at least one area of interest comprises an n-sided polygon defined by said array of emitters.
6. The system of claim 1 wherein said spatial position, orientation, and size comprise: point of gaze; or point of aim.
7. A perspective tracking method comprising: capturing, using a computer-based system, an image frame wherein said image frame includes a location of a first emitter and a location of a second emitter; determining, using the computer-based system, an area of interest based on the location of the first emitter and the location of the second emitter; determining, based on the locations of the first and second emitters, a spatial position, an orientation, and a size of the area of interest; wherein the capture of the image frame comprises measuring a light intensity of each of the first and second emitters over a detection threshold and applying a correction equation to the light intensity measurement that yields temperature-based offsets.
8. The method of claim 7, wherein the capture of the image frame further comprises calculating a centroid of the light intensity of each of the first and second emitters.
9. A computer program product comprising a non-transitory computer usable medium having control logic stored therein for causing a computer to track a perspective of an array of emitters, the control logic comprising: first computer readable program code for causing the computer to capture an image frame wherein said image frame includes a location of a first emitter and a location of a second emitter; second computer readable program code for causing the computer to determine an area of interest based on the location of the first emitter and the location of the second emitter; third computer readable program code for causing the computer to determine, based on the locations of the first and second emitters, a spatial position, an orientation, and a size of the area of interest; wherein the first computer readable program code further comprises: computer readable program code for causing the computer to measure a light intensity of each of the first and second emitters over a detection threshold; and computer readable program code for causing the computer to apply a correction equation to the light intensity measurement that yields temperature-based offsets.
10. The computer program product of claim 9, wherein the first computer readable program code further comprises: computer readable program code for causing the computer to calculate a centroid of the light intensity of each of the first and second emitters.
11. The computer program product of claim 9, wherein said correction equation yields an optical error correction.
Description
BRIEF DESCRIPTION OF THE DRAWINGS/FIGURES
(1)
(2)
(3)
(4)
(5)
(6)
DETAILED DESCRIPTION OF THE INVENTION
(7) Embodiments of the present invention are now described with reference to the figures. While specific configurations and arrangements are discussed, it should be understood that this is done for illustrative purposes only. A person skilled in the relevant art will recognize that other configurations and arrangements can be used without departing from the spirit and scope of the invention. It will also be apparent to a person skilled in the relevant art that this invention can be employed in a variety of systems and applications other than those disclosed here.
(8) An embodiment of the invention is based on the detection of an array of emitters on a fixed plane in space. In
(9) In
(10) PTDs may be enhanced with accelerometers, gyroscopes, and magnetometers to detect orientation. A PTD enhanced in this manner would calibrate these sensors during periods of image-based track lock, then use them to extrapolate orientation and position when track lock is not possible. Various weighted methods of combining the diverse orientation information may be employed to improve accuracy (such as combining accelerometer and gyroscope data to yield motion and orientation in 6 DOF).
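One common way to realize such a weighted combination is a complementary filter, which blends the smooth-but-drifting angle integrated from a gyroscope with the noisy-but-drift-free angle derived from an accelerometer. The single-axis formulation and the blend weight `alpha` below are illustrative assumptions, not the patent's specific method:

```python
def fuse_orientation(gyro_rate, accel_angle, prev_angle, dt, alpha=0.98):
    """Complementary filter for one rotation axis.

    gyro_rate: angular rate from the gyroscope (deg/s or rad/s)
    accel_angle: absolute angle inferred from the accelerometer
    prev_angle: fused angle from the previous step
    alpha: hypothetical weight favoring the gyro on short timescales
    """
    # Integrate the gyro rate forward from the previous fused angle.
    gyro_angle = prev_angle + gyro_rate * dt
    # Blend: gyro dominates short-term, accelerometer corrects drift.
    return alpha * gyro_angle + (1.0 - alpha) * accel_angle
```

During image-based track lock, `prev_angle` would instead be reset from the image-derived orientation, which is the calibration step described above.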
(11) AOIs are delineated using infrared point light emitters that identify points on the AOI (e.g., corners) as well as the spatial position, orientation and size of the AOI. AOIs may overlap in physical space either on parallel planes or orthogonal planes. AOIs exist to provide precise locations in the field of view, areas of projected blue-screen imagery, or high accuracy aim-point data areas. More than one emitter may be used to delimit an AOI point. This allows the determination of an AOI's orientation, even though its corners or sides may be occluded.
(12) In a three dimensional environment, AOIs are defined where high accuracy tracking of perspective is required. These AOIs may be two-dimensional polygonal regions where each vertex is marked by an emitter. Emitters may be infrared LEDs which are visible to tracking imaging systems. Each emitter source identifies a point in three dimensions in a bounded volume. A PTD detects emitters within its field of view and uses the relationships of the emitters to determine a current perspective in three dimensions.
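As a minimal illustration of using emitter relationships to recover perspective, the range to an AOI can be estimated from the angular separation between two emitters whose physical spacing is known. The face-on viewing assumption and the function name are hypothetical simplifications of the full three-dimensional solution:

```python
import math

def range_from_pair(angle_between, known_separation):
    """Estimate range to an emitter pair from the angle (in radians)
    it subtends, assuming the emitter baseline is viewed roughly
    perpendicular to the line of sight.
    """
    # Half the baseline and half the subtended angle form a right
    # triangle whose adjacent side is the range.
    return known_separation / (2.0 * math.tan(angle_between / 2.0))
```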
(13) The PTD's processor identifies each emitter and determines its physical spatial relationship to other emitters based on the emitter's unique identification coding. In some embodiments of the invention, each TPA emitter uniquely identifies itself to the system by modulating its light output isochronously to the system image capture frame rate. Within a given TPA emitter array modulation may be sent synchronously (i.e., all emitters at once), although each emitter's message may be unique. The emitter identification coding can be modulated at or near the video frame rate, allowing the PTD processor to detect the emitter's coding pattern. The identification coding may include error correction and validation sequences. The emitters of a TPA may be synchronized to the PTD using wireless communication. A TPA's IR emissions may be shortened and sent only during the peak sensitivity of the PTD sensor. Modulation of an emitter would then consist of dropping pulses during this period on a low-duty cycle basis.
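A demodulator for such a coding scheme might accumulate one intensity sample per captured frame and treat a dropped pulse as a 0 bit. The 8-bit identifier plus a single even-parity bit used below is a hypothetical stand-in for the patent's identification, error-correction, and validation sequences:

```python
def demodulate_id(samples, threshold, id_bits=8):
    """Decode an emitter ID from per-frame intensity samples.

    A sample at or above threshold is a 1 bit (pulse present); a
    dropped pulse reads as 0. Frame layout (id_bits ID bits followed
    by one even-parity bit) is a hypothetical validation scheme.
    Returns the integer ID, or None if the frame is invalid.
    """
    bits = [1 if s >= threshold else 0 for s in samples]
    if len(bits) < id_bits + 1:
        return None  # not enough frames captured yet
    ident = 0
    for b in bits[:id_bits]:
        ident = (ident << 1) | b
    parity = bits[id_bits]
    if (sum(bits[:id_bits]) % 2) != parity:
        return None  # parity validation failed; discard this frame
    return ident
```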
(14) The PTD computes a solution for perspective based on each emitter's location in the captured image field. Equations of motion are generated to allow the PTD to compute perspective during times of emitter occlusion. The PTD may use a secondary system to synchronize to an AOI's TPA. For example, a wireless connection may be used to signal the start of a point source modulation so that it may be synchronous with the video capture and allow for point source identity to be demodulated.
(15) Three or more emitters can be used to define a two-dimensional surface having a polygonal outline. An n-sided polygon is defined using n or more emitters. Over-specification of the TPA outline can be used to improve accuracy and to compensate for occlusion of corner emitters.
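Given the ordered emitter locations marking a polygon's vertices, the AOI's size and center follow directly from the shoelace formula. This sketch assumes the vertices have already been identified and ordered around the outline:

```python
def polygon_area_centroid(pts):
    """Signed area and centroid of the polygon whose vertices pts
    (list of (x, y) tuples) are ordered around the outline.
    Uses the shoelace formula; counterclockwise order gives a
    positive area.
    """
    a = cx = cy = 0.0
    n = len(pts)
    for i in range(n):
        x0, y0 = pts[i]
        x1, y1 = pts[(i + 1) % n]  # wrap to close the polygon
        cross = x0 * y1 - x1 * y0
        a += cross
        cx += (x0 + x1) * cross
        cy += (y0 + y1) * cross
    a *= 0.5
    return a, (cx / (6.0 * a), cy / (6.0 * a))
```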
(16) A TPA may be implemented using an IR emitter and a micro-processor, or using an IR emitter and a programmed logic array. A TPA may also be implemented using an IR emitter and a memory cell used to replay the source IR through the emitter. A TPA may be permanently integrated into a device for use in an arena tracking environment.
(17) In
(18) Corrections may be applied to the computation of this centroid. The first is a temperature-based offset of intensity amplitude on a per-cell basis. The second corrects the exact X:Y location of each cell, compensating for errors inherent in the optics of the PTD device. These corrections are applied locally, before the centroid is computed for each emitter. The final emitter centroid is translated into an offset angle from the center of the PTD field of view.
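The per-cell corrections and the centroid computation might be combined as below. `temp_offset` and `optics_correction` are hypothetical calibration functions standing in for the temperature-based and optical corrections described above:

```python
def emitter_centroid(cells, temp_offset, optics_correction):
    """Intensity-weighted centroid of the sensor cells lit by one emitter.

    cells: list of (x, y, intensity) tuples
    temp_offset(intensity): hypothetical per-cell temperature correction
    optics_correction(x, y): hypothetical lens-distortion correction
                             returning the corrected (x, y)
    """
    sx = sy = total = 0.0
    for x, y, intensity in cells:
        w = intensity - temp_offset(intensity)  # temperature-based offset
        cx, cy = optics_correction(x, y)        # optical error correction
        sx += cx * w
        sy += cy * w
        total += w
    return (sx / total, sy / total)
```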
(19) In
(20) On each frame, the motion equations for each known emitter are advanced to the current time (404). The predicted positions are then compared to detected positions (403) using radial distance computations (410) and sorted using a radial bubble sort (401) to yield a best fit. When a radial distance match occurs within a predefined tolerance, the emitter history and equations are updated for the next frame. The result is a new point identification list (420). If an emitter centroid falls outside the maximum allowed radius, it is assumed to be a new emitter and a new set of equations is started. When an emitter is not detected on a given frame, its equations are coasted and the emitter's entry is marked as modulated. Emitter equations are coasted for a fixed number of frames, after which the entry is deleted.
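The frame-to-frame matching loop above might look like the following sketch, which substitutes a plain nearest-neighbor search for the radial bubble sort and uses a linear motion equation per track. The function name and the track-record layout are assumptions:

```python
import math

def associate(tracks, detections, max_radius, dt, max_coast=5):
    """Advance each track's motion equation, match predictions to
    detected centroids by radial distance, coast unmatched tracks,
    and start new tracks for unmatched detections.

    tracks: dict id -> {'pos': (x, y), 'vel': (vx, vy), 'coast': int}
    detections: list of (x, y) centroids from the current frame
    """
    unused = list(detections)
    for tid, t in list(tracks.items()):
        # Advance the motion equation to the current frame time.
        px = t['pos'][0] + t['vel'][0] * dt
        py = t['pos'][1] + t['vel'][1] * dt
        best, best_d = None, max_radius
        for d in unused:
            dist = math.hypot(d[0] - px, d[1] - py)
            if dist < best_d:
                best, best_d = d, dist
        if best is not None:
            # Radial match: update history and motion equation.
            t['vel'] = ((best[0] - t['pos'][0]) / dt,
                        (best[1] - t['pos'][1]) / dt)
            t['pos'], t['coast'] = best, 0
            unused.remove(best)
        else:
            # Emitter occluded or modulated off: coast on the prediction.
            t['pos'], t['coast'] = (px, py), t['coast'] + 1
            if t['coast'] > max_coast:
                del tracks[tid]  # coasted too long; drop the entry
    next_id = max(tracks, default=-1) + 1
    for d in unused:  # detections outside max_radius start new tracks
        tracks[next_id] = {'pos': d, 'vel': (0.0, 0.0), 'coast': 0}
        next_id += 1
    return tracks
```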
(21) Once individual emitters have been identified, the AOI to which they belong may be determined. This may be seen in
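Once each emitter's identity is known, grouping emitters into AOIs can be a simple lookup. The `aoi_of` mapping from emitter ID to AOI is a hypothetical table that would be derived from the TPA configuration:

```python
def group_by_aoi(points, aoi_of):
    """Group identified emitters into their AOIs.

    points: list of (emitter_id, (x, y)) tuples from the point
            identification list
    aoi_of: hypothetical lookup mapping emitter_id -> AOI name
    Returns a dict AOI name -> list of emitter locations.
    """
    aois = {}
    for pid, xy in points:
        aois.setdefault(aoi_of[pid], []).append(xy)
    return aois
```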
(22) While some embodiments of the present invention have been described above, it should be understood that they have been presented by way of example only and are not meant to limit the invention. It will be understood by those skilled in the art that various changes in form and detail may be made therein without departing from the spirit and scope of the invention as defined in the appended claims. Thus, the breadth and scope of the present invention should not be limited by the above-described exemplary embodiments, but should be defined only in accordance with the following claims and their equivalents.