Patent classifications
G01S5/163
SYSTEMS AND METHODS FOR CALIBRATING AN INERTIAL MEASUREMENT UNIT AND A CAMERA
The present disclosure relates to a system and a method for calibrating an inertial measurement unit (IMU) and a camera of an autonomous vehicle. The system may perform the method to: obtain a track of the autonomous vehicle traveling straight; determine an IMU pose of the IMU relative to a first coordinate system; determine a camera pose of the camera relative to a second coordinate system; determine a relative coordinate pose between the first coordinate system and the second coordinate system; and determine a relative pose between the camera and the IMU based on the IMU pose, the camera pose, and the relative coordinate pose.
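As a rough illustration of the pose composition this abstract describes, the following sketch composes the three poses as SE(3) rigid transforms. The frame names, the 4x4 homogeneous-matrix representation, and the direction conventions are assumptions for the sketch, not details taken from the patent.

```python
# Minimal sketch: relative camera-to-IMU pose from the IMU pose, camera pose,
# and relative coordinate pose, each stored as a 4x4 homogeneous matrix.
import numpy as np

def se3(R, t):
    """Build a 4x4 homogeneous transform from a 3x3 rotation and a 3-vector."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def camera_to_imu(T_1_imu, T_2_cam, T_1_2):
    """Relative pose of the camera expressed in the IMU frame.

    T_1_imu : IMU pose in coordinate system 1 (maps IMU frame -> system 1)
    T_2_cam : camera pose in coordinate system 2 (maps camera frame -> system 2)
    T_1_2   : relative coordinate pose (maps system 2 -> system 1)
    """
    return np.linalg.inv(T_1_imu) @ T_1_2 @ T_2_cam

# Example with identity rotations and simple translations.
T_imu = se3(np.eye(3), [1.0, 0.0, 0.0])
T_cam = se3(np.eye(3), [1.0, 0.5, 0.0])
T_12  = se3(np.eye(3), [0.0, 0.0, 0.0])
print(camera_to_imu(T_imu, T_cam, T_12))  # camera offset [0, 0.5, 0] in the IMU frame
```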
Positional tracking assisted beam forming in wireless virtual reality systems
Embodiments of the present disclosure support a head-mounted display (HMD) wirelessly coupled to a console. The HMD includes a positional tracking system, a beam controller, and a transceiver. The positional tracking system tracks the position of the HMD and generates positional information describing the tracked position. The transceiver communicates with the console via a wireless channel in accordance with communication instructions, the communication instructions causing the transceiver to communicate over one directional beam of a plurality of directional beams. The beam controller determines a change in the positional information. Based on the change in the positional information, the beam controller determines a directional beam of the plurality of directional beams. The beam controller further generates communication instructions identifying the determined directional beam and provides the communication instructions to the transceiver.
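A minimal sketch of one way such a beam controller could work: pick the candidate beam best aligned with the HMD-to-console bearing, and only re-select when the HMD has moved appreciably. The beam geometry, motion threshold, and instruction format below are assumptions; the abstract only states that a beam is determined from positional change.

```python
# Illustrative beam selection driven by positional change.
import math

BEAM_AZIMUTHS = [i * 360.0 / 16 for i in range(16)]  # 16 candidate beams (assumed)
MOVE_THRESHOLD = 0.05  # metres of HMD motion before re-selecting a beam (assumed)

def select_beam(hmd_pos, console_pos):
    """Return the index of the beam best aligned with the HMD->console bearing."""
    dx = console_pos[0] - hmd_pos[0]
    dy = console_pos[1] - hmd_pos[1]
    bearing = math.degrees(math.atan2(dy, dx)) % 360.0
    return min(range(len(BEAM_AZIMUTHS)),
               key=lambda i: min(abs(BEAM_AZIMUTHS[i] - bearing),
                                 360.0 - abs(BEAM_AZIMUTHS[i] - bearing)))

class BeamController:
    def __init__(self, console_pos):
        self.console_pos = console_pos
        self.last_pos = None

    def update(self, hmd_pos):
        """Emit new communication instructions only on significant motion."""
        if self.last_pos is not None and math.dist(hmd_pos, self.last_pos) < MOVE_THRESHOLD:
            return None  # keep the current beam
        self.last_pos = hmd_pos
        return {"beam_index": select_beam(hmd_pos, self.console_pos)}
```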
METHODS AND SYSTEMS FOR DIGITAL IMAGE-REFERENCED INDIRECT TARGET AIMING
Methods and systems are provided for digital image-referenced indirect target aiming. The systems and methods of the invention measure the angle of rotation of subsequent stable images relative to an initial Reference Image and provide an aiming direction that is collinear with the target's absolute azimuth.
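To make the rotation-then-azimuth step concrete, here is a sketch that estimates the in-plane rotation between the Reference Image and a later stable image from matched keypoints (2D Kabsch alignment), then offsets the known reference azimuth by it. The keypoint matcher and the least-squares formulation are assumptions outside the abstract.

```python
# Sketch: measured image rotation added to the reference azimuth gives the
# absolute aiming azimuth. Matched keypoint pairs are assumed given.
import numpy as np

def image_rotation_deg(ref_pts, cur_pts):
    """Least-squares in-plane rotation (2D Kabsch) between (n, 2) point sets."""
    a = ref_pts - ref_pts.mean(axis=0)
    b = cur_pts - cur_pts.mean(axis=0)
    U, _, Vt = np.linalg.svd(a.T @ b)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:          # guard against reflections
        Vt[-1] *= -1
        R = Vt.T @ U.T
    return np.degrees(np.arctan2(R[1, 0], R[0, 0]))

def aiming_azimuth(reference_azimuth_deg, ref_pts, cur_pts):
    """Absolute aiming azimuth: reference azimuth plus measured rotation."""
    return (reference_azimuth_deg + image_rotation_deg(ref_pts, cur_pts)) % 360.0
```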
Visual inertial odometry health fitting
Systems, methods, and computer readable media to track and estimate the accuracy of a visual inertial odometry (VIO) system. Various embodiments receive one or more VIO feature measurements associated with a set of image frames from a VIO system and generate a plurality of feature models to estimate health values for the VIO system. The various embodiments determine a plurality of feature health values with the feature models based on the VIO feature measurements and compare the feature health values with ground truth health scores associated with the set of image frames to determine one or more errors. The feature model parameters are then updated based on the comparison of the feature health values with the ground truth health scores.
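A hedged sketch of the fitting loop just described: a simple linear model maps per-frame VIO feature measurements to a health value, and its parameters are nudged toward the ground-truth scores by the measured error. The linear model form, loss, and learning rate are assumptions; the abstract does not specify them.

```python
# Sketch: fit feature-health model parameters against ground-truth scores.
import numpy as np

def fit_health_model(features, ground_truth, lr=1e-2, epochs=200):
    """features: (n_frames, n_features); ground_truth: (n_frames,) in [0, 1]."""
    w = np.zeros(features.shape[1])
    b = 0.0
    for _ in range(epochs):
        pred = features @ w + b                  # predicted health values
        err = pred - ground_truth                # comparison with ground truth
        w -= lr * features.T @ err / len(err)    # gradient step on parameters
        b -= lr * err.mean()
    return w, b

def health_value(features, w, b):
    return np.clip(features @ w + b, 0.0, 1.0)
```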
Event driven sensor (EDS) tracking of light emitting diode (LED) array
An event driven sensor (EDS) is used for simultaneous localization and mapping (SLAM), in particular in conjunction with a constellation of light emitting diodes (LEDs) to simultaneously localize all LEDs and track the EDS pose in space. The EDS may be stationary or movable and can track movable LED constellations as rigid bodies. Each individual LED is distinguished at a high rate using minimal computational resources (no image processing). Thus, instead of a camera and image processing, rapidly pulsing LEDs detected by the EDS are used as feature points, such that EDS events are related to only one LED at a time.
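One plausible way to distinguish LEDs without image processing, sketched below, is to match the inter-event period within a small pixel neighborhood to a table of known per-LED pulse rates. The event format, the clustering step, and the frequency table are all assumptions beyond the abstract.

```python
# Illustrative association of EDS events with individual LEDs by pulse rate.
import numpy as np

LED_FREQS_HZ = {0: 1000.0, 1: 1500.0, 2: 2000.0}  # hypothetical per-LED rates

def identify_led(event_times):
    """Match the median inter-event period of one pixel cluster to an LED."""
    periods = np.diff(np.sort(event_times))
    if len(periods) == 0:
        return None
    freq = 1.0 / np.median(periods)
    return min(LED_FREQS_HZ, key=lambda k: abs(LED_FREQS_HZ[k] - freq))

# Events arriving at ~1.5 kHz should map to LED 1.
t = np.arange(0, 0.01, 1 / 1500.0)
print(identify_led(t))  # -> 1
```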
SYSTEM AND METHOD FOR GEOLOCATION OF AN OBJECT IN WATER
A system for geolocation of an object in water includes: first and second devices for immersion in, or flotation on, water, the first device including a light source that emits a light beam and the second device including a camera and a measuring device; and a processing unit, operatively connected to the camera, and configured to: determine a vertical distance between the first and second devices based on the depths of both devices; capture a 2D image of the first device via the camera; calculate the pixel position of the light beam from the light source in the image; and calculate a position of the first device relative to the main reference frame based on the pixel position of the light beam, the orientation of the camera, a position of the second device relative to the main reference frame, and the vertical distance.
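A sketch of that final position calculation under simplifying assumptions: a pinhole camera with known intrinsics K and rotation R from camera to main reference frame, with the frame's z axis pointing down into the water. None of these conventions come from the abstract itself.

```python
# Sketch: back-project the light-source pixel to a ray and scale the ray so
# its vertical component matches the known depth difference.
import numpy as np

def first_device_position(pixel_uv, K, R_cam_to_main, second_pos, vertical_dist):
    u, v = pixel_uv
    ray_cam = np.linalg.inv(K) @ np.array([u, v, 1.0])   # ray in camera frame
    ray_main = R_cam_to_main @ ray_cam                   # ray in main frame
    if abs(ray_main[2]) < 1e-9:
        raise ValueError("ray is horizontal; vertical distance cannot scale it")
    scale = vertical_dist / ray_main[2]                  # reach the known depth
    return np.asarray(second_pos) + scale * ray_main
```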
Landmark placement for localization
Embodiments are provided that include receiving sensor data from a sensor positioned at a plurality of positions in an environment. The environment includes a plurality of landmarks. The embodiments also include determining, based on the sensor data, a subset of the plurality of landmarks detected at each of the plurality of positions. The embodiments further include determining, based on the subset of the plurality of landmarks detected at each of the plurality of positions, a detection frequency of each landmark. The embodiments additionally include determining, based on the determined detection frequency of each landmark, a localization viability metric associated with each landmark. The embodiments still further include providing for display, via a user interface, a map of the environment. The map includes an indication of the localization viability metric associated with each landmark.
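A minimal sketch of the detection-frequency and viability computation described above. Reading the viability metric as the fraction of positions at which a landmark was detected is one plausible interpretation; the embodiments do not pin down a formula.

```python
# Sketch: per-landmark detection frequency -> localization viability metric.
from collections import Counter

def localization_viability(detections_per_position):
    """detections_per_position: list of sets of landmark IDs, one per position."""
    counts = Counter()
    for detected in detections_per_position:
        counts.update(detected)
    n = len(detections_per_position)
    return {landmark: hits / n for landmark, hits in counts.items()}

# Landmark 'A' is seen at all three positions, 'B' at only one.
print(localization_viability([{"A", "B"}, {"A"}, {"A"}]))
# -> {'A': 1.0, 'B': 0.333...}
```

A map renderer could then color each landmark by its viability value to produce the user-interface display the embodiments mention.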
Vehicle Localization using Map and Vision Data
This document describes vehicle localization using map and vision data. For example, this document describes a localization system that obtains a map centerline point and a vision centerline point of a lane of a roadway. The localization system also obtains the position of the vehicle. The localization system can then compare the map centerline point and the vision centerline point to generate a lateral and a longitudinal correction relative to the vehicle's position. The lateral and longitudinal corrections are used to generate a corrected position. In this way, the described localization system can provide accurate vehicle localization that addresses potential drift caused by lapsed or inaccurate positioning data and allows for the operation of assisted-driving and autonomous-driving systems at higher speeds and on roadways with tighter curves.
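An illustrative version of the correction step: the offset between the map and vision centerline points is split into along-track (longitudinal) and cross-track (lateral) components relative to the vehicle's heading, and each component is applied to the position. The heading input is an assumption beyond the abstract; splitting the offset lets each component be filtered or gated independently in practice.

```python
# Sketch: lateral and longitudinal corrections from centerline comparison.
import numpy as np

def corrected_position(vehicle_pos, heading_rad, map_pt, vision_pt):
    forward = np.array([np.cos(heading_rad), np.sin(heading_rad)])
    left = np.array([-forward[1], forward[0]])
    offset = np.asarray(map_pt) - np.asarray(vision_pt)
    lon_corr = offset @ forward    # along-track correction
    lat_corr = offset @ left       # cross-track correction
    return np.asarray(vehicle_pos) + lon_corr * forward + lat_corr * left
```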
IMAGE-BASED TECHNIQUES FOR STABILIZING POSITIONING ESTIMATES
A device implementing a system for estimating device location includes at least one processor configured to receive a first estimated position of the device at a first time. The at least one processor is further configured to capture, using an image sensor of the device, images during a time period defined by the first time and a second time, and determine, based on the images, a second estimated position of the device, the second estimated position being relative to the first estimated position. The at least one processor is further configured to receive a third estimated position of the device at the second time, and estimate a location of the device based on the second estimated position and the third estimated position.
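A sketch of the final estimation step, assuming a simple weighted blend of the absolute fix at the second time with the image-derived dead-reckoned position (first fix plus the VIO-style relative displacement). The blend and its weight are illustrative; the abstract does not state the fusion rule.

```python
# Sketch: stabilize a new positioning fix with an image-derived relative track.
import numpy as np

def estimate_location(first_fix, vio_displacement, third_fix, w_vio=0.7):
    """Blend dead-reckoned and direct position estimates at the second time."""
    dead_reckoned = np.asarray(first_fix) + np.asarray(vio_displacement)
    return w_vio * dead_reckoned + (1.0 - w_vio) * np.asarray(third_fix)

# A noisy new fix is pulled toward the image-stabilized track.
print(estimate_location([0.0, 0.0], [1.0, 0.0], [1.4, 0.3]))  # -> [1.12, 0.09]
```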
Determination of Relative Pose Using Artificial Magnetic Fields
The pose, or position and orientation, of a wearable sensor assembly in a reference coordinate system is to be determined in real time. One or more artificial magnetic field sources are located with known positions and orientations in the reference coordinates. Each source generates a magnetic field which varies in time according to a predetermined AC pulse code, so that each source is uniquely identifiable and distinguishable from ambient DC fields. The magnitude and direction, in wearable coordinates, of the AC magnetic field due to each source varies in a modelable way with the position and orientation of the wearable, which comprises a nine-channel Inertial Measurement Unit (IMU) and a processor. The IMU senses the inertial motion of the wearable and the time-varying magnetic field to which it is exposed. These data are processed to estimate the position and orientation of the wearable in reference coordinates, along with various constant parameters.
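To illustrate how pulse-coded sources can be separated from each other and from ambient DC fields, the sketch below fits the measured signal on one magnetometer axis as a linear combination of the known codes plus a constant offset. The codes, sample count, and least-squares formulation are assumptions for the sketch.

```python
# Sketch: recover per-source AC field amplitudes while absorbing the DC field.
import numpy as np

def per_source_amplitudes(measured, codes):
    """measured: (n_samples,) one magnetometer axis; codes: (n_src, n_samples).

    Solves measured ~ codes.T @ a + dc, so a constant ambient (DC) field is
    absorbed by the offset column rather than biasing the AC estimates.
    """
    A = np.vstack([codes, np.ones(codes.shape[1])]).T   # add a DC column
    coeffs, *_ = np.linalg.lstsq(A, measured, rcond=None)
    return coeffs[:-1], coeffs[-1]                      # per-source amps, DC

rng = np.random.default_rng(0)
codes = rng.choice([-1.0, 1.0], size=(2, 512))          # two hypothetical codes
truth = np.array([0.8, -0.3])
measured = truth @ codes + 5.0 + 0.01 * rng.standard_normal(512)
amps, dc = per_source_amplitudes(measured, codes)
print(np.round(amps, 2), round(float(dc), 2))           # ~[0.8, -0.3], ~5.0
```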