ULTRASONIC POSITION DETECTION SYSTEM

20190162833 · 2019-05-30

    Abstract

    Tracking systems have been successfully applied to immersive simulation systems and virtual environment training in which portable devices (e.g., hand-held military equipment) within the immersive simulation system are tracked using time-of-flight recordings to triangulate each device's position. Until now, tracking systems have not used differential calculations to track these portable devices. The invention uses a single array of sensors mounted above the simulation area to communicate with small transmitters and emitters mounted on each portable device to generate position offsets for each portable device.

    Claims

    1. An immersive simulation environment, comprising a system for tracking the location of a portable device, comprising: a portable device comprising an ultrasonic transmitter and a detector; a stationary sensor array comprising at least three ultrasonic sensors in communication with a sensor array processor; and at least one emitter; wherein the at least one emitter is configured to send a signal to the portable device; wherein the portable device is configured to emit an ultrasonic tone when the detector receives the signal; wherein the at least three ultrasonic sensors are configured to receive the ultrasonic tone; wherein the sensor array processor is configured to use phase differentials and the order in which the at least three ultrasonic sensors receive the ultrasonic tone to calculate the x-position and the y-position of the portable device through a phase angle calculation; and wherein the immersive simulation environment is in the shape of a dome.

    2. The immersive simulation environment according to claim 1, wherein the sensor array processor is further configured to start a counter when the emitter sends the signal and stop the counter when a number of samples per tone reaches a sensor in the array to determine the z-position of the portable device.

    3. The immersive simulation environment according to claim 2, wherein the portable device further comprises an accelerometer/gyroscope/compass device.

    4. The immersive simulation environment according to claim 1, wherein the stationary sensor array comprises three ultrasonic sensors, comprising an x-axis sensor, a y-axis sensor, and a reference sensor.

    5. The immersive simulation environment according to claim 1, wherein the at least one emitter is located on the stationary sensor array.

    6. (canceled)

    7. (canceled)

    8. The immersive simulation environment according to claim 1, wherein the stationary sensor array is located at the top center of the dome.

    9. The immersive simulation environment according to claim 1, wherein the immersive simulation environment is a simulated military environment.

    10. The immersive simulation environment according to claim 9, wherein the portable device is a simulated military device.

    11. The immersive simulation system according to claim 1, comprising at least two rear-mounted image projectors, at least two projector image generators, and at least two SMD image generators, wherein the portable devices receive images from the SMD image generators.

    12. A method for tracking the position of a portable device comprising an ultrasonic transmitter and a detector in an immersive simulation system comprising a stationary sensor array comprising at least three ultrasonic sensors in communication with a sensor array processor, and at least one emitter; the method comprising: sending a signal from the at least one emitter to the portable device; emitting an ultrasonic tone from the portable device when the detector receives the signal; receiving the ultrasonic tone at the at least three ultrasonic sensors; and calculating, in the sensor array processor, phase differentials and the order in which the at least three ultrasonic sensors receive the ultrasonic tone to determine the x-position and the y-position of the portable device through a phase angle calculation; wherein the immersive simulation system is in a dome, wherein the stationary sensor array is mounted at the top center of the dome, and the portable device is located inside of the dome.

    13. The method according to claim 12, further comprising: starting a counter at the sensor array processor when the sensor array emits an infrared burst and stopping the counter when a number of samples per tone reaches a sensor in the array to determine the z-position of the portable device.

    14. (canceled)

    15. The method according to claim 12, wherein the at least one emitter is located on the stationary sensor array.

    16. The method according to claim 15, wherein the stationary sensor array comprises three sensors, an x-axis sensor, a y-axis sensor, and a reference sensor.

    17. The method according to claim 16, wherein the processor is programmed to record a multitude of timer values when the ultrasonic tone is sensed for each of the three ultrasonic sensors; and wherein the x-position and y-position calculation comprises: subtracting the x axis sensor timer value from the reference sensor timer value to create an x axis differential; subtracting the y axis sensor timer value from the reference sensor timer value to create a y axis differential; and applying a configurable scaling factor to the x-axis differential and the y-axis differential to determine x-axis and y-axis offsets for the portable device.

    18. The method according to claim 12, wherein the detector is configured to detect a radio signal or an infrared emission and the at least one emitter is configured to emit a radio signal or an infrared emission.

    19. The method according to claim 18, wherein the detector is an infrared detector and the at least one emitter is an infrared emitter.

    20. The system according to claim 1, wherein the detector is configured to detect a radio signal or an infrared emission and the at least one emitter is configured to emit a radio signal or an infrared emission.

    21. The system according to claim 1, wherein the detector is an infrared detector and the at least one emitter is an infrared emitter.

    Description

    BRIEF DESCRIPTION OF THE DRAWINGS

    [0042] Other features of the invention, as well as the invention itself, will become more readily apparent from the following detailed description when taken together with the accompanying drawings, in which:

    [0043] FIG. 1 is a depiction of an embodiment of the system wherein an ultrasonic sensor array is shown. The sensor array is made up of the Y-axis sensor (102), the X-axis sensor (100) and the Reference sensor (101). In addition, infrared transmitters (103) and a calibration laser (104) are shown. The distance between sensors is calculated so that phase differentials can be obtained within a single cycle of an ultrasonic tone.

    [0044] FIG. 2 is a depiction of a detector/emitter installed on a portable device. The infrared detector (200) captures an infrared message from the present system and the ultrasonic emitter (201) sends an ultrasonic tone back to the present invention.

    [0045] FIG. 3 is a depiction of an embodiment of the present invention showing an example of an immersive simulation system (in this embodiment a dome) and the present invention's location within the immersive simulation system (300). The cone-shaped area of interest (301) is depicted in this illustration as well as the portable devices (302) in use by trainees.

    [0046] FIG. 4 illustrates the eye-point view problems in immersive simulation systems if tracking data is not available.

    [0047] FIG. 5 is a depiction of the communication between a portable device (for example a simulated military device [SMD]) (501) and invention sensor array (500). Initially, the system will send an infrared coded burst to all portable devices in the area of interest. Then only the portable device with the matching unique identifier will emit an ultrasonic tone to the present invention.

    [0048] FIG. 6 is a depiction of an ultrasonic tone reaching the invention sensor array. The ultrasonic tone is arriving from in front of the sensor location (toward, for example, a dome) and to the right (if facing the dome). First, the tone will reach, for example, the Y axis sensor (602). Eventually, the tone will reach, for example, the Reference sensor (600). The difference in timer counters between when the tone reaches the Y axis sensor (602) and when the tone reaches the Reference sensor (600) is used to calculate a Y axis offset. In the depiction, the tone reaches the Reference sensor (600) before it reaches the X axis sensor (601), so the timer counter difference calculation is simply reversed. The Reference sensor (600) also serves as the Z axis sensor by measuring the time-of-flight from the moment the infrared code is transmitted to when the ultrasonic tone is received.

    [0049] FIG. 7 is a simplified depiction of an exemplary immersive simulation system. The simulation system contains a dome (700), multiple rear-mounted image projectors (703), multiple simulated military devices (SMDs) (701), the present invention (702), multiple Projector Image Generators (IGs) (704), multiple SMD Image Generators (IGs) (705), and controller systems (706). The SMDs receive images from the SMD IGs via HDMI protocol (707). The SMDs also communicate with the controller system and the present invention via high-speed Ethernet (708). The Projector IGs send images to the projectors via HDMI protocol (709). The controller systems communicate with the Projector IGs and the SMD IGs via high-speed Ethernet (710).

    [0050] FIG. 8 is a depiction of calculations to determine the intersection between an offset display vector and the dome for the X and Y axes.

    [0051] FIG. 9 is a depiction of calculations to determine the intersection between an offset display vector and the dome for the Z axis.

    [0052] FIG. 10 is a depiction of a phase differential for a single wave cycle of an ultrasonic tone as it is detected by the sensor array in the example depicted in FIG. 6. The sensor array for example detects the ultrasonic wave on the rising edge of the wave. Initially the ultrasonic wave is detected by the Y-axis sensor (1001) (or 602 in FIG. 6) and a sample of a sensor array processor counter is taken (1004). As the same wave cycle moves across the sensor array it is next detected by the reference sensor (1002) (or 600 in FIG. 6) and a sample of a sensor array processor counter is taken (1005). The Y-axis phase differential is the reference sensor sample (1005) less the Y-axis sensor sample (1004). The same wave cycle is then detected by the X-axis sensor (1003) (or 601 in FIG. 6) and a sample of a sensor array processor counter is taken (1006). The X-axis phase differential is the X-axis sensor sample (1006) less the reference sensor sample (1005). Phase differentials are taken for multiple wave cycles in an ultrasonic tone to provide averaging for the X and Y axis phase differentials.
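    The per-cycle sampling sequence described for FIG. 10 can be sketched as follows. This is a minimal illustration in Python; the function names and sample values are assumptions, and only the subtraction order (reference sample less Y-axis sample, X-axis sample less reference sample) and the averaging over multiple wave cycles come from the figure description.

```python
def phase_differentials(y_sample, ref_sample, x_sample):
    """Counter samples taken as one wave cycle crosses the sensor array.

    Per FIG. 10: the Y-axis differential is the reference sensor sample
    less the Y-axis sensor sample; the X-axis differential is the X-axis
    sensor sample less the reference sensor sample.
    """
    y_diff = ref_sample - y_sample
    x_diff = x_sample - ref_sample
    return y_diff, x_diff

def averaged_differentials(cycles):
    """Average the differentials over multiple wave cycles of one tone,
    as the description states, to smooth the X and Y phase differentials."""
    ys, xs = zip(*(phase_differentials(*c) for c in cycles))
    return sum(ys) / len(ys), sum(xs) / len(xs)
```

    For example, counter samples of (100, 140, 190) for the Y-axis, reference, and X-axis sensors yield a Y differential of 40 ticks and an X differential of 50 ticks.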

    [0053] For both the X and the Y axis calculations, the phase differentials for example are phase angle calculations. In this case, the phase angle differential is calculated and used to determine the angle of arrival of the signal, thereby allowing for determination of X and Y positions. The phase angle is the change (horizontal shift) between the sampling of the X or Y axis sensor and the sampling of the reference sensor, i.e., the phase angle differential.

    DETAILED DESCRIPTION OF THE INVENTION

    [0054] The instant system is described in further detail with respect to the enclosed Figures. The following detailed description of the Figures enclosed herewith further illustrates the invention but should not be construed as in any way limiting its scope.

    [0055] FIG. 1 illustrates an exemplary configuration of the present invention. Sensors are arranged in a pattern so as to allow a measured time difference between the arrival of an ultrasonic tone to the X axis sensor (100) and a Reference sensor (101), and a measured time difference between the arrival of an ultrasonic tone to the Y axis sensor (102) and the Reference sensor (101). In this embodiment, the Reference sensor (101) can double as the Z axis sensor. The dimension between the ultrasonic sensors can be set to be the smallest resolvable wavelength of an ultrasonic tone response from the portable devices. Also depicted are infrared emitters (103) which are configured to give maximum range on the infrared transmission within the immersive simulation system. According to this exemplary embodiment, all 3 infrared emitters (103) will emit the same code at the same time. FIG. 3 illustrates the location of the present invention (300) within an immersive simulation system.

    [0056] In FIG. 2, each portable device (for example simulated military device [SMD]) is configured with one or more infrared receivers (200) and ultrasonic transmitters (201).

    [0057] FIG. 3 illustrates the invention sensor array in an immersive simulation system. The sensor array (300) detects the location of portable devices (for example Simulated Military Devices [SMDs]) (302) within a specified area (in this example a cone-shaped field of interest in the dome) (301) to provide positional data used to modify the SMD's perspective (eye point offset). Positional data from a SMD display's view and/or the SMD's position within the dome (i.e., the simulated origination of tracer fire from a hand weapon) are coordinated with the Image Generator (IG) so that the view point through the visual SMD device is coordinated with the same view point that is seen with the naked eye in the dome. As SMDs are being operated, users can look left/right, up/down, and can tilt the device left/right. The images provided for the SMDs' displays are generated by independent Image Generators (IGs) and the images change to reflect the orientation (attitude) of each SMD, as controlled by infrared and ultrasonic transmit and receive signals.

    [0058] Described herein is an immersive simulation system which contains a sensor array that is mounted, for example, at the top of a dome over a center of the area of interest (an area of interest is the cone-shaped field around the floor of the immersive simulation system) and one or more ultrasonic transmitters and infrared receivers (see FIG. 2) embedded in each SMD that requires positional data. Information from the sensor array is fed to individual SMD processing units to adjust the IG view within each SMD device. Some SMD devices, because of the way they are used, require more than one set of transmitters (201) and receivers (200). FIG. 4 illustrates the purpose of providing positional offsets to SMDs in an immersive simulation system. Without a tracking system, the point-of-view at a particular orientation would be the same at different positions within the immersive simulation system.

    [0059] Each portable device (for example SMD) configured in an immersive simulation system (for example dome) is assigned a unique identifier (Tracking ID). A processor configured with the present invention will schedule each Tracking ID to be sampled in a priority-based scheduling algorithm. The processor configured with the present invention will create a message to contain the Tracking ID of the SMD to be sampled and a Send Ultrasonic Tone command. The present invention will transmit this message to all SMDs in the immersive simulation system (dome) as an 8-bit infrared code. At the same time, the sensor measurement timers in the present invention processor are reset to 0 and set to run. Upon receiving the infrared command, each SMD will compare the Tracking ID from the infrared command with its unique identifier. If they match, only that SMD will emit an ultrasonic tone. FIG. 5 illustrates an embodiment of the described system (500) emitting an infrared message, and the SMD (501) responding with an ultrasonic tone.
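    The polling sequence above can be sketched as follows. The class and function names, the message layout, and the priority scheme are assumptions for illustration; the source specifies only a priority-based scheduler, an 8-bit infrared code carrying a Tracking ID with a Send Ultrasonic Tone command, and that each SMD responds only when the broadcast ID matches its own identifier.

```python
import heapq

class TrackingScheduler:
    """Hypothetical priority-based scheduler of Tracking IDs to sample."""
    def __init__(self):
        self._queue = []  # (priority, tracking_id); lower value polls first

    def register(self, tracking_id, priority):
        heapq.heappush(self._queue, (priority, tracking_id))

    def next_id(self):
        priority, tracking_id = heapq.heappop(self._queue)
        return tracking_id

def build_ir_code(tracking_id):
    """Pack a Tracking ID into a single 8-bit infrared code (assumed layout)."""
    assert 0 <= tracking_id < 256
    return tracking_id & 0xFF

def device_should_respond(ir_code, device_id):
    """Each SMD compares the broadcast Tracking ID with its own identifier
    and emits an ultrasonic tone only on a match."""
    return ir_code == device_id
```

    In use, the processor would reset its measurement timers at the same moment `build_ir_code(...)` is transmitted, so subsequent sensor samples are relative to the transmission instant.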

    [0060] The processor configured with the described system is programmed to record a multitude of timer values when the ultrasonic tone is sensed for each of the 3 ultrasonic sensors. The processor measurement timers are sampling at a rate of, for example, 80 MHz (or every 12.5 nsec). The X axis sensor (100) timer value is subtracted from the Reference sensor (101) timer value to create the X axis phase differential. The Y axis sensor (102) timer value is subtracted from the Reference sensor (101) timer value to create the Y axis phase differential. The timer value associated with the Reference sensor will be used for the Z axis calculation.

    [0061] A configurable scaling factor is applied to X and Y axis differentials to scale the differential values to a centimeter (1 differential unit=1 centimeter). These X and Y axis offsets are then forwarded to the specific SMD processor for incorporation into the SMD's display eye-point view.
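    The X and Y offset computation of the two preceding paragraphs can be sketched as follows, as an assumed illustration. The subtraction against the Reference sensor timer comes from paragraph [0060]; the scaling factor is configurable in the source, with 1 differential unit equal to 1 centimeter after scaling.

```python
def xy_offsets(x_timer, y_timer, ref_timer, scale_cm_per_unit):
    """Form X/Y phase differentials against the Reference sensor timer,
    then apply the configurable scaling factor so that one differential
    unit corresponds to one centimeter."""
    x_diff = ref_timer - x_timer  # X axis sensor value subtracted from reference
    y_diff = ref_timer - y_timer  # Y axis sensor value subtracted from reference
    return x_diff * scale_cm_per_unit, y_diff * scale_cm_per_unit
```

    These scaled offsets would then be forwarded to the specific SMD processor for incorporation into the SMD's display eye-point view.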

    [0062] The Z axis position can be calculated as a time-of-flight value from the SMD to the sensor array. Initially, the Z time-of-flight value is subtracted from the number of timer units from the sensor to the floor of the dome (the floor distance is determined during a calibration phase). This calculation is the number of timer units from the floor of the dome to the SMD. A configurable scaling factor is applied to the Z axis offset to scale the offset from timer units to centimeters. The Z axis offset is then forwarded to the SMD for incorporation into the SMD's display eye-point view.
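    The Z-axis calculation above can be sketched as follows. The function and variable names are illustrative; per the source, the floor distance in timer units is measured during a calibration phase and the time-of-flight value is subtracted from it before scaling to centimeters.

```python
def z_offset_cm(z_time_of_flight, floor_distance_units, scale_cm_per_unit):
    """Height of the SMD above the floor, scaled to centimeters.

    The Z time-of-flight from the SMD to the sensor array is subtracted
    from the sensor-to-floor distance (in timer units) obtained during
    calibration; the remainder is the floor-to-SMD distance.
    """
    height_units = floor_distance_units - z_time_of_flight
    return height_units * scale_cm_per_unit
```

    For instance, with a calibrated floor distance of 1000 timer units, a measured time-of-flight of 800 units, and 0.5 cm per unit, the SMD would sit 100 cm above the floor.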

    [0063] FIG. 7 depicts a simplified immersive simulation system. In this exemplary embodiment, the immersive simulation system contains a dome (700) and multiple rear-mounted projectors (703) controlled by multiple Image Generators (IGs) (704); the IGs send display information to the projectors via HDMI protocol (709). Also included in this immersive simulation system are SMDs (701), whose displays are controlled by SMD IGs (705), and a series of controller systems (706). The SMD IGs send display data to the SMD displays via HDMI protocol (707). The controller systems (706), the Projector IGs (704) and the SMD IGs (705) communicate via high-speed Ethernet (710). The controller systems also make requests of the SMDs and the present invention via high-speed Ethernet (708).

    [0064] The Projector IGs (704) generate the scenery of a simulated topical location for displaying on the dome (700). The SMD IGs (705) generate an immersive simulation image of the dome from the perspective of the SMD. The SMD IGs create an eye-point image of the dome image based on the position and orientation of the SMD. The present invention adjusts the SMD eye-point image, at any location within the area of interest in the immersive simulation system, to match the dome image.

    [0065] After the present invention has completed calculating the position offset (in the X, Y and Z axes) for a particular SMD, it will send the tracking data to the SMDs. The specific SMD processors will use the current display vector (using the yaw and pitch orientation of the SMD), and the new tracking offset to calculate new coordinates where the offset display vector intersects with the dome. FIG. 8 illustrates the calculations required for the new display vector intersection with the dome. A person skilled in the art of polynomial mathematics can understand the dome intersection calculations.

    [00001] Dome intersection_x = (−b ± √(b² − 4ac)) / (2a)

    where:

    a = VectorDirection_x² + VectorDirection_y² + VectorDirection_z²

    b = 2(VectorDirection_x)(VectorOrigin_x) + 2(VectorDirection_y)(VectorOrigin_y) + 2(VectorDirection_z)(VectorOrigin_z)

    c = VectorOrigin_x² + VectorOrigin_y² + VectorOrigin_z²
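    The dome intersection can be computed by solving this quadratic for the ray parameter. The sketch below is an assumed implementation, treating the dome as a sphere centered at the origin; the radius term in c is an addition of this illustration (the source lists only the origin terms), and the function name is hypothetical.

```python
import math

def dome_intersection(origin, direction, radius=1.0):
    """Solve a*t^2 + b*t + c = 0 for where the offset display vector
    meets a dome of the given radius centered at the origin.

    a, b follow the source formula; c adds the -radius^2 term (assumed)
    so the roots are the signed distances along the vector direction.
    Returns the two roots, or None if the vector misses the dome.
    """
    ox, oy, oz = origin
    dx, dy, dz = direction
    a = dx * dx + dy * dy + dz * dz
    b = 2 * (dx * ox + dy * oy + dz * oz)
    c = ox * ox + oy * oy + oz * oz - radius * radius
    disc = b * b - 4 * a * c
    if disc < 0:
        return None
    root = math.sqrt(disc)
    return ((-b - root) / (2 * a), (-b + root) / (2 * a))
```

    A vector starting at the dome center exits the dome at exactly one radius in each direction, which the two roots reflect.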

    [0066] FIG. 9 illustrates the position offset calculations required for the Z axis. The number of sensor units from the present invention to the floor is calculated during calibration of the sensor array. The Z height of the SMD is subtracted from the floor distance resulting in the height of the SMD above the floor. This value is scaled by a configurable scaling factor to result in 1 sensor unit equaling 1 centimeter. The Z axis information is included in the quadratic equation above.

    [0067] The Controller Systems (706) request the orientation data (yaw, pitch and roll), the tracking data (X, Y and Z axis offsets), and the newly calculated adjusted orientation (offset yaw, pitch and roll) from each SMD and forward this information on to the SMD IG responsible for a particular SMD's display. The responsible SMD IG uses the adjusted orientation data to create an image of the dome from the perspective of the SMD's eye-point and transmits this image to the SMD's display via HDMI (707).

    [0068] This process occurs multiple times a second for all SMDs registered in the immersive simulation system.

    [0069] Although more specifically described above are immersive simulation systems, the described position tracking systems potentially have application outside of the simulation genre. Multiple sensor arrays can be configured to increase the size of the area of interest. Further, although a simulated military environment is discussed above, it would be understood that the described position tracking system could be used in differing simulation environments. Although the SMDs in the immersive simulation system may be tethered, wireless portable devices can be developed, drawing minimal power, to allow for a freer range of movement.

    [0070] While the exemplary embodiments described above include specifically a dome, it would be understood that the system could potentially be adapted to any three dimensional immersive environment.

    [0071] Simulated Military Devices as used throughout would be any portable device used in a simulated military environment, including but not limited to portable devices such as binoculars for observing distant locations, or other simulated devices which normally would be used by a soldier in a typical military environment.

    [0072] Portable devices as used throughout would be any device which is moveable and for which it is desirable to track the position. While described in more detail above are simulated military devices, it is understood that the instantly described system could be used to track any portable device fitted with an ultrasonic transmitter.

    [0073] Immersive simulation system is any system which allows for three dimensional immersion in a simulated environment. While immersive military simulation environments are described in more detail above, it is understood that the system may be used in other immersive systems. Further, while immersive simulation systems described in more detail above are in the shape of a dome, it is understood that the instantly described system could be configured for use in other three dimensional geometries.

    [0074] Ultrasonic transmitter is a transmitter which is capable of emitting an ultrasonic tone. An ultrasonic tone is a tone which has a frequency above the human ear's audibility limit of 20,000 hertz, for example 40 kHz.

    [0075] All references, including publications, patent applications, and patents, cited herein are hereby incorporated by reference to the same extent as if each reference were individually and specifically indicated to be incorporated by reference and were set forth in its entirety herein.

    [0076] The use of the terms "a" and "an" and "the" and similar referents in the context of describing the invention (especially in the context of the following claims) are to be construed to cover both the singular and the plural, unless otherwise indicated herein or clearly contradicted by context. Recitation of ranges of values herein are merely intended to serve as a shorthand method of referring individually to each separate value falling within the range, unless otherwise indicated herein, and each separate value is incorporated into the specification as if it were individually recited herein. All methods described herein can be performed in any suitable order unless otherwise indicated herein or otherwise clearly contradicted by context. The use of any and all examples, or exemplary language (e.g., "such as") provided herein, is intended merely to better illuminate the invention and does not pose a limitation on the scope of the invention unless otherwise claimed. No language in the specification should be construed as indicating any non-claimed element as essential to the practice of the invention.

    [0077] Preferred embodiments of this invention are described herein, including the best mode known to the inventors for carrying out the invention. It should be understood that the illustrated embodiments are exemplary only, and should not be taken as limiting the scope of the invention.