Patent classifications
G01S5/163
CROSS REALITY SYSTEM WITH MAP PROCESSING USING MULTI-RESOLUTION FRAME DESCRIPTORS
A distributed cross reality system efficiently and accurately compares location information that includes image frames. Each frame may be represented as a numeric descriptor that enables identification of frames with similar content. The resolution of the descriptors may vary across computing devices in the distributed system based on the degree of ambiguity in image comparisons and/or the computing resources of each device. A cloud-based component operating on maps of large areas, where comparisons can result in ambiguous identification of multiple image frames, may use high resolution descriptors, which reduce computationally intensive disambiguation processing. A portable device, which is more likely to operate on smaller maps and less likely to have the computational resources to compute a high resolution descriptor, may use a lower resolution descriptor.
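As a sketch of the multi-resolution idea (invented names; not the patent's actual descriptor), the same descriptor pipeline can emit high- or low-resolution vectors depending on the device, here by average-pooling a grayscale frame to a chosen grid size and normalizing:

```python
import numpy as np

def frame_descriptor(frame, resolution):
    """Hypothetical descriptor: average-pool a grayscale frame to a
    resolution x resolution grid, then L2-normalize the flattened vector."""
    h, w = frame.shape
    cropped = frame[:h - h % resolution, :w - w % resolution]
    pooled = cropped.reshape(resolution, cropped.shape[0] // resolution,
                             resolution, cropped.shape[1] // resolution).mean(axis=(1, 3))
    v = pooled.ravel().astype(float)
    return v / (np.linalg.norm(v) + 1e-12)

def similarity(d1, d2):
    """Cosine similarity of two unit-norm descriptors."""
    return float(d1 @ d2)

# A cloud component might use a 16x16 grid (256-dim descriptors);
# a portable device might use a 4x4 grid (16-dim).
frame = np.random.default_rng(0).random((64, 64))
hi = frame_descriptor(frame, 16)
lo = frame_descriptor(frame, 4)
```

The higher-dimensional descriptor discriminates more frames at the cost of more computation, matching the cloud/portable split the abstract describes.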
System and method for light-based guidance of autonomous vehicles
A method for providing guidance to autonomous vehicles comprising: emitting light signals from a plurality of light sources, wherein each light source emits a light signal with an angular-dependent intensity profile; detecting the plurality of emitted light signals with an on-board light detector; processing the plurality of light signals detected by the light detector to distinguish each one of the detected light signals; comparing the distinguished detected light signals; using the distinguished detected light signals to determine the orientation of the on-board light detector relative to the light sources; generating a control signal from the distinguished detected light signals; and using the control signal to provide navigation guidance to the autonomous vehicle.
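One way to see how angular-dependent intensity profiles can encode orientation (a toy model with assumed cosine lobes, not the patented method): if two sources have cosine intensity profiles aimed at +a and −a, the ratio of the two detected intensities determines the detector's bearing θ, since i1 ∝ cos(θ − a) and i2 ∝ cos(θ + a) imply tan θ = (r − 1) / ((r + 1)·tan a) with r = i1/i2:

```python
import math

def bearing_from_intensities(i1, i2, lobe_offset):
    """Recover bearing theta from two detected intensities, assuming
    i1 ~ cos(theta - a) and i2 ~ cos(theta + a) with a = lobe_offset."""
    r = i1 / i2
    return math.atan((r - 1.0) / ((r + 1.0) * math.tan(lobe_offset)))

# Simulate a detector at bearing 10 degrees with lobes offset by 30 degrees.
a = math.radians(30.0)
theta_true = math.radians(10.0)
i1 = math.cos(theta_true - a)
i2 = math.cos(theta_true + a)
theta_est = bearing_from_intensities(i1, i2, a)
```

The ratio cancels the unknown common emission power, which is why comparing the distinguished signals (rather than using absolute intensities) suffices.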
Method of estimating a direction of absolute orientation of an optronic system
A method for estimating the bearing of an optronic system in a geographical reference frame is provided, the optronic system being situated at a first position and denoted the first optronic system. It comprises the following steps: defining a collaborative configuration by way of the first optronic system and of at least one other optronic system, the optronic systems being respectively situated at separate positions and equipped with means for communicating with one another and with acquisition devices; acquiring, in a scene, one or more objects common to the optronic systems, the direction of orientation between each optronic system and each object being unknown; determining two positions from among those of the optronic systems; for at least one common object: measuring the relative angle by way of a relative angle measurement device fitted to the first optronic system, measuring the elevation of the object by way of an elevation measurement device fitted to the first optronic system, and performing additional measurements by way of each other optronic system, the two positions and the measurements constituting observations; communication, by the other optronic system(s) to the first optronic system, of the observations that the first optronic system does not have; and, on the basis of the observations, estimation, by the first optronic system, of the bearing of the first optronic system.
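The final estimation step can be illustrated with a simplified 2D sketch (assumed conventions, not the patented estimator): if the collaborating system's communicated observations fix the common object's geographic position, the first system's absolute bearing is the geographic azimuth to the object minus the relative angle it measured to that object:

```python
import math

def estimate_bearing(p1, obj, relative_angle):
    """Sketch: p1 and obj are (east, north) coordinates; relative_angle is
    the angle the first system measured to the object from its own zero
    reference. Returns the absolute bearing of that zero reference,
    measured clockwise from north, in [0, 2*pi)."""
    azimuth = math.atan2(obj[0] - p1[0], obj[1] - p1[1])  # azimuth from north
    return (azimuth - relative_angle) % (2 * math.pi)

# Example: object to the north-east (azimuth 45 deg), measured 10 deg
# right of the system's zero reference -> bearing 35 deg.
bearing = estimate_bearing((0.0, 0.0), (100.0, 100.0), math.radians(10.0))
```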
Three-Dimensional Object Position Tracking System
A hand-held controller and a positional reference device for determining the position and orientation of the hand-held controller within a three-dimensional volume relative to the location of the positional reference device. An input/output subsystem in conjunction with processing and memory subsystems can receive reference image data captured by a beacon sensing device combined with inertial measurement information from inertial measurement units within the hand-held controller. The position and orientation of the hand-held controller can be computed based on the linear distance between a pair of beacons on the positional reference device, the reference image data, and the inertial measurement information.
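One piece of such a computation can be sketched under a pinhole-camera assumption (names and parameters hypothetical): the known linear distance between the two beacons and their apparent separation in the sensor image give the range from the controller to the reference device:

```python
def range_from_beacon_pair(beacon_px_a, beacon_px_b, baseline_m, focal_px):
    """Pinhole-model range estimate: range ~ focal_px * baseline_m / separation_px,
    where beacon_px_* are (x, y) pixel coordinates of the two beacons,
    baseline_m is their known physical separation, and focal_px is the
    sensor's focal length in pixels."""
    dx = beacon_px_a[0] - beacon_px_b[0]
    dy = beacon_px_a[1] - beacon_px_b[1]
    separation_px = (dx * dx + dy * dy) ** 0.5
    return focal_px * baseline_m / separation_px

# Beacons 0.5 m apart appear 100 px apart with an 800 px focal length.
r = range_from_beacon_pair((100.0, 0.0), (200.0, 0.0), 0.5, 800.0)
```

In a full system this geometric estimate would be fused with the IMU data to stabilize the pose between image updates.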
Compact star-field sensor (SFS)
A compact SFS can be deployed in small space vehicles. The SFS may have low size, weight, and power requirements. The hardware, software, catalogs, and calibration algorithm of the SFS provide highly accurate attitude information that can be used for pointing. For instance, accurate attitude determination may be provided that supports pointing of a deployable high-gain helical antenna. A full “lost in space” attitude solution, accurate to about an arcminute, may be accomplished in under a minute. The SFS may be fully reprogrammable on orbit, allowing continued algorithm development and deployment after launch.
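The core of any star-field attitude solution is recovering a rotation from matched star directions. A standard textbook method for this (TRIAD; not necessarily the SFS's own algorithm) takes two star directions measured in the body frame and the same directions from the catalog in the inertial frame:

```python
import numpy as np

def triad(b1, b2, r1, r2):
    """TRIAD attitude determination: given two unit vectors measured in the
    body frame (b1, b2) and the matching catalog directions in the inertial
    frame (r1, r2), return the rotation matrix A such that b ~ A @ r."""
    def frame(v1, v2):
        t1 = v1 / np.linalg.norm(v1)
        t2 = np.cross(v1, v2)
        t2 = t2 / np.linalg.norm(t2)
        return np.column_stack([t1, t2, np.cross(t1, t2)])
    return frame(b1, b2) @ frame(r1, r2).T

# Example: body frame rotated 90 degrees about z relative to the catalog frame.
A_true = np.array([[0.0, -1.0, 0.0], [1.0, 0.0, 0.0], [0.0, 0.0, 1.0]])
r1, r2 = np.array([1.0, 0.0, 0.0]), np.array([0.0, 0.0, 1.0])
A_est = triad(A_true @ r1, A_true @ r2, r1, r2)
```

A "lost in space" solver would first identify which catalog stars are in view (e.g. by angular-distance matching) before running an attitude step like this.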
FILTERING AND SMOOTHING SOURCES IN CAMERA TRACKING
Video processing, including: generating first tracking information using a first tracking system coupled to a camera which moves during a video sequence forming a shot including multiple frames, wherein the first tracking information includes information about six degrees of freedom motion of the camera synchronized to the multiple frames in the shot; generating second tracking information using a second tracking system coupled to the camera which moves during the video sequence, wherein the second tracking information includes information about six degrees of freedom motion of the camera synchronized to the multiple frames in the shot; generating, by a tracking tool, a timeline with a first track for the first tracking information and a second track for the second tracking information, wherein the tracking tool is coupled to the first tracking system and the second tracking system, and receives the first tracking information and the second tracking information.
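The timeline-with-tracks structure described above can be sketched as a small data model (names invented for illustration): each track holds per-frame 6-DoF samples from one tracking system, and the timeline lines them up so both sources can be inspected at the same frame:

```python
from dataclasses import dataclass, field
from typing import Dict, Optional, Tuple

# A 6-DoF pose sample: (x, y, z, roll, pitch, yaw).
Pose6DoF = Tuple[float, float, float, float, float, float]

@dataclass
class Track:
    """Per-frame 6-DoF samples from one tracking system."""
    name: str
    samples: Dict[int, Pose6DoF] = field(default_factory=dict)

    def record(self, frame: int, pose: Pose6DoF) -> None:
        self.samples[frame] = pose

@dataclass
class Timeline:
    """Holds one track per tracking system, synchronized by frame number."""
    tracks: Dict[str, Track] = field(default_factory=dict)

    def add_track(self, track: Track) -> None:
        self.tracks[track.name] = track

    def poses_at(self, frame: int) -> Dict[str, Optional[Pose6DoF]]:
        """All tracks' poses at a frame (None where a track has no sample)."""
        return {name: t.samples.get(frame) for name, t in self.tracks.items()}
```

With both sources on one timeline, a filtering or smoothing pass can compare them frame by frame, which is the setup the abstract's tracking tool implies.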
Method and Positioning System for Determining Location and Orientation of Machine
A method and a system determine the location and orientation of a machine in a worksite. The machine is equipped with at least one marker point and the worksite is equipped with at least one reference point. Furthermore, a tracking apparatus is set up at the worksite; data is acquired by the tracking apparatus by tracking reference point and marker point locations with respect to the tracking apparatus; the acquired data is transmitted from the tracking apparatus to at least one position determination unit; and the location and orientation of the machine in the worksite are determined by the at least one position determination unit based at least in part on the acquired data.
Method for Determining Location and Orientation of Machine
A method determines the location and orientation of a machine in a worksite. The worksite is equipped with at least one reference point. The method comprises setting a tracking apparatus on the machine, tracking the machine with the tracking apparatus by determining the location of at least one reference point in the worksite with respect to the tracking apparatus, transmitting data regarding the tracking from the tracking apparatus to a position determination unit, and determining, by the position determination unit, the location and orientation of the machine in the worksite based at least in part on the data received from the tracking apparatus.
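The geometric core shared by both machine-pose abstracts can be sketched in 2D (a simplified illustration, not the patented method): if the tracking apparatus measures the positions of two worksite reference points in its own frame, and their worksite coordinates are known, the machine's heading and location follow from a rigid 2D transform:

```python
import math

def machine_pose_2d(ref_world, ref_measured):
    """Solve world = R(heading) @ measured + t from two point
    correspondences. ref_world and ref_measured each hold two (x, y)
    points: known worksite coordinates and apparatus-frame measurements
    of the same reference points. Returns (x, y, heading)."""
    (wa, wb), (ma, mb) = ref_world, ref_measured
    # Heading: angle of the inter-point vector in world minus apparatus frame.
    heading = (math.atan2(wb[1] - wa[1], wb[0] - wa[0])
               - math.atan2(mb[1] - ma[1], mb[0] - ma[0]))
    c, s = math.cos(heading), math.sin(heading)
    # Translation: t = world_a - R @ measured_a.
    x = wa[0] - (c * ma[0] - s * ma[1])
    y = wa[1] - (s * ma[0] + c * ma[1])
    return x, y, heading

# Apparatus at (5, 3) heading 90 deg sees reference points (10, 0) and
# (0, 10) at (-3, -5) and (7, 5) in its own frame.
x, y, heading = machine_pose_2d(((10.0, 0.0), (0.0, 10.0)),
                                ((-3.0, -5.0), (7.0, 5.0)))
```

The worksite-mounted variant in the previous abstract inverts the same relation: the fixed apparatus observes marker points on the machine instead.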
Tracking using encoded beacons
A tracking system, comprising: multiple beacons, each associated with a different cyclic equivalence class of code-words of length n, and each configured to broadcast a bit-stream comprising a repeating code-word, where the code-word belongs to the associated cyclic equivalence class; and a mobile tracking unit, comprising: a sensor and a processor, wherein the sensor is configured to simultaneously detect at least some of the bit-streams and provide each sensed bit-stream in real-time to the processor, wherein for each bit-stream received by the processor from the sensor, the processor is configured to identify the beacon that broadcast the bit-stream using the first n received bits.
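The cyclic-equivalence-class trick is directly codeable (a sketch with invented names): because each beacon repeats a code-word from a distinct cyclic class, any n consecutive bits of its stream are some rotation of that code-word, so the class, and hence the beacon, can be identified from the first n bits received, regardless of where reception started:

```python
def canonical_rotation(bits):
    """Canonical representative of a cyclic equivalence class: the
    lexicographically smallest rotation of the bit sequence."""
    bits = list(bits)
    n = len(bits)
    return min(tuple(bits[i:] + bits[:i]) for i in range(n))

class BeaconIdentifier:
    """Receiver-side sketch: map each beacon's cyclic class (via its
    canonical rotation) to the beacon id, then identify any n received
    bits by canonicalizing them the same way."""
    def __init__(self, beacon_codewords):
        self.by_class = {canonical_rotation(cw): beacon_id
                         for beacon_id, cw in beacon_codewords.items()}

    def identify(self, first_n_bits):
        return self.by_class.get(canonical_rotation(first_n_bits))

# Two beacons with code-words from distinct cyclic classes of length 5.
ident = BeaconIdentifier({"A": [0, 0, 1, 0, 1], "B": [0, 1, 1, 1, 0]})
```

Identification is immediate after n bits with no need to synchronize to a code-word boundary, which is the point of using cyclic classes rather than fixed code-words.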