Patent classifications
G01C11/06
Distributed device mapping
The present invention relates to the efficient use of both local and remote computational resources and communication bandwidth to provide distributed environment mapping using a plurality of mobile sensor-equipped devices. According to a first aspect, there is provided a method of determining a global position of one or more landmarks on a global map, the method comprising the steps of determining one or more differences between sequential sensor data captured by one or more moving devices; determining one or more relative localisation landmark positions with respect to the one or more moving devices; determining relative device poses based on one or more differences between sequential sensor data relative to the one or more relative localisation landmark positions; and determining a correlation between each device pose and the one or more relative localisation landmark positions.
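The core geometric step of this abstract, placing a landmark on the global map from device-relative observations, can be sketched in illustrative Python. The function name, the 2D pose representation, and the simple averaging across devices are all assumptions for the sketch; the abstract does not specify an implementation.

```python
import math

def landmark_global_position(device_poses, relative_obs):
    """Fuse several device-relative landmark observations into one
    global estimate (illustrative sketch, not the claimed method).

    device_poses: list of (x, y, theta) global device poses, theta in radians.
    relative_obs: list of (dx, dy) landmark offsets in each device's frame.
    Returns the mean global (x, y) position of the landmark.
    """
    xs, ys = [], []
    for (px, py, th), (dx, dy) in zip(device_poses, relative_obs):
        # Rotate the relative observation into the global frame, then translate.
        xs.append(px + dx * math.cos(th) - dy * math.sin(th))
        ys.append(py + dx * math.sin(th) + dy * math.cos(th))
    return sum(xs) / len(xs), sum(ys) / len(ys)

# Two devices at different poses observe the same landmark; the fused
# estimate is close to the true global position (1, 1).
poses = [(0.0, 0.0, 0.0), (2.0, 0.0, math.pi / 2)]
obs = [(1.0, 1.0), (1.0, 1.0)]
print(landmark_global_position(poses, obs))
```

In a full system each pose would itself come from the differences between sequential sensor frames (visual or inertial odometry), and the averaging would be replaced by a joint optimisation over poses and landmarks.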
CAMERA SYSTEMS AND DEVICES FOR BALLISTIC PARAMETER MEASUREMENTS IN AN OUTDOOR ENVIRONMENT
A ballistic detection system includes a first camera; a second camera; a solar block device associated with at least one camera of the first and second cameras, wherein the solar block device is configured and arranged to block a solar disc in a field of view of the at least one camera; and a ballistics analysis computer configured to obtain image data captured by the first and second cameras, determine at least two points in three-dimensional space, which correspond to image artifacts of a projectile, using intrinsic and extrinsic parameters of the first and second cameras, define a trajectory of the projectile within a target volume using the at least two points in three-dimensional space, and find a point of intersection of the trajectory of the projectile with an object associated with the target volume.
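The final step of this abstract, finding where the reconstructed trajectory meets an object, reduces to a line-plane intersection once the two 3D points are known. The sketch below assumes the object can be locally approximated by a plane; all names are illustrative.

```python
def trajectory_plane_intersection(p1, p2, plane_point, plane_normal):
    """Intersect the line through two reconstructed projectile points
    with a plane (illustrative sketch of the intersection step).

    p1, p2: two (x, y, z) points on the projectile's path.
    plane_point, plane_normal: a point on the target plane and its normal.
    Returns the (x, y, z) intersection, or None if the path is parallel.
    """
    d = tuple(b - a for a, b in zip(p1, p2))  # direction of travel
    dot = lambda u, v: sum(x * y for x, y in zip(u, v))
    denom = dot(plane_normal, d)
    if abs(denom) < 1e-12:
        return None  # trajectory parallel to the target plane
    w = tuple(q - a for a, q in zip(p1, plane_point))
    t = dot(plane_normal, w) / denom
    return tuple(a + t * di for a, di in zip(p1, d))

# Projectile observed at two points; target is the vertical plane x = 10.
hit = trajectory_plane_intersection((0.0, 1.0, 2.0), (1.0, 1.0, 2.0),
                                    (10.0, 0.0, 0.0), (1.0, 0.0, 0.0))
print(hit)  # (10.0, 1.0, 2.0)
```

The two input points would come from triangulating the projectile's image artifacts in the first and second cameras using their intrinsic and extrinsic parameters.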
CAMERA AND RADAR SYSTEMS AND DEVICES FOR BALLISTIC PARAMETER MEASUREMENTS FROM A SINGLE SIDE OF A TARGET VOLUME
A ballistic detection system includes a radar system; electromagnetic radiation detection equipment positioned on only a single side of a target volume; and a ballistics analysis computer configured to obtain image data captured by first and second cameras in accordance with timing specified by radar data, determine points in three-dimensional space, which correspond to image artifacts of a projectile, using intrinsic and extrinsic parameters of the first and second cameras, define a trajectory of the projectile within a target volume using the points in three-dimensional space, and find a point of intersection of the trajectory of the projectile with an object associated with the target volume.
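The distinctive element here is that radar data specifies the timing of image capture. One plausible reading, sketched below with entirely assumed names and a constant-speed assumption, is that the radar's range and speed measurement predicts when the projectile enters the target volume so the cameras can be triggered then.

```python
def camera_trigger_time(detect_time, radar_range, speed, volume_near_edge):
    """Estimate when to capture frames so the projectile is inside the
    target volume (illustrative sketch, constant speed assumed).

    detect_time: time of the radar detection, in seconds.
    radar_range: distance to the projectile at detection, in metres.
    speed: radar-measured closing speed, in metres per second.
    volume_near_edge: distance from the radar to the near edge of the
        target volume, in metres.
    """
    return detect_time + (radar_range - volume_near_edge) / speed

# Detection at t=0 s, 100 m out, closing at 50 m/s; the volume's near
# edge is 10 m away, so trigger the cameras 1.8 s after detection.
print(camera_trigger_time(0.0, 100.0, 50.0, 10.0))  # 1.8
```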
TRACKING WITH REFERENCE TO A WORLD COORDINATE SYSTEM
Examples described herein provide a method that includes capturing data about an environment. The method further includes generating a database of two-dimensional (2D) features and associated three-dimensional (3D) coordinates based at least in part on the data about the environment. The method further includes determining a position (x, y, z) and an orientation (pitch, roll, yaw) of a device within the environment based at least in part on the database of 2D features and associated 3D coordinates. The method further includes causing the device to display, on a display of the device, an augmented reality element at a predetermined location based at least in part on the position and the orientation of the device.
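The last step of this abstract, displaying an AR element at a predetermined location given the device's position and orientation, depends on projecting a world-coordinate point into the device's view. A minimal pinhole-camera sketch follows; the function name, the yaw-only rotation (pitch and roll omitted for brevity), and the parameter layout are assumptions.

```python
import math

def project_to_screen(world_pt, cam_pos, yaw, fx, fy, cx, cy):
    """Project a world (x, y, z) point into pixel coordinates for a
    camera at cam_pos looking along +z when yaw == 0 (illustrative
    pinhole sketch; pitch/roll omitted).

    fx, fy: focal lengths in pixels; cx, cy: principal point.
    Returns (u, v) pixels, or None if the point is behind the camera.
    """
    x = world_pt[0] - cam_pos[0]
    y = world_pt[1] - cam_pos[1]
    z = world_pt[2] - cam_pos[2]
    # Rotate into the camera frame (yaw about the vertical y axis).
    xc = x * math.cos(yaw) - z * math.sin(yaw)
    zc = x * math.sin(yaw) + z * math.cos(yaw)
    if zc <= 0:
        return None  # behind the camera; nothing to draw
    return cx + fx * xc / zc, cy + fy * y / zc

# An AR anchor 4 m straight ahead lands at the principal point.
print(project_to_screen((0.0, 0.0, 4.0), (0.0, 0.0, 0.0),
                        0.0, 500.0, 500.0, 320.0, 240.0))  # (320.0, 240.0)
```

In the described method, the device pose fed into such a projection would be recovered by matching live 2D features against the database of 2D features with associated 3D coordinates.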
MEASURING SYSTEM, MEASURING METHOD, AND MEASURING PROGRAM
In order to efficiently monitor the operation situations of site resources at a construction site, a measuring system uses a surveying apparatus 100 including a camera and a position-determining function using laser light. The measuring system includes a photographing means for continuously photographing, by the camera, construction machines 201 to 204 which are site resources performing operations at the construction site; a recognizing means for recognizing the site resources in the photographed images obtained by the photographing; a tracking means for tracking the images of the site resources recognized in the multiple photographed images obtained by the continuous photographing; and a position-determining means for collimating to the site resources being tracked and determining their positions by the position-determining function, the determining of positions being performed multiple times at intervals.
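The "multiple times at intervals" behaviour, re-measuring each tracked machine's position no more often than a set period, can be sketched as a small scheduler. The class and method names are illustrative only.

```python
class IntervalPositioner:
    """Decide when a tracked site resource is due for a fresh laser
    position measurement (illustrative sketch of interval scheduling)."""

    def __init__(self, interval_s):
        self.interval_s = interval_s
        self._last = {}  # resource id -> time of last measurement

    def should_measure(self, resource_id, now_s):
        """Return True (and record the time) if resource_id has not been
        measured within the last interval_s seconds."""
        last = self._last.get(resource_id)
        if last is None or now_s - last >= self.interval_s:
            self._last[resource_id] = now_s
            return True
        return False

# Re-measure each machine at most every 10 s while tracking continues.
sched = IntervalPositioner(10.0)
print(sched.should_measure("machine_201", 0.0))   # True
print(sched.should_measure("machine_201", 5.0))   # False
print(sched.should_measure("machine_201", 10.0))  # True
```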
Systems and methods for enhanced base map generation
A feature mapping computer system configured to (i) receive a first localized image including a first photo and a first location; (ii) receive a second localized image including a second photo and a second location; (iii) identify a roadway feature depicted in both the first and second photos; (iv) generate, using a photogrammetry module, a point cloud based upon the first and second photos and first and second locations; (v) generate a localized point cloud by assigning a location to the point cloud based upon at least one of the first and second locations; and (vi) generate an enhanced base map that includes a roadway feature by embedding an indication of the identified roadway feature onto the localized point cloud.
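Step (v), assigning a location to the photogrammetry point cloud, amounts to translating locally-referenced points by a chosen anchor location. A minimal sketch, with assumed names and a plain Cartesian offset standing in for whatever geodetic handling a real system would use:

```python
def localize_point_cloud(points, anchor):
    """Assign a world location to a locally-referenced point cloud by
    translating every (x, y, z) point by the anchor offset
    (illustrative sketch; a real system would handle geodetic frames).
    """
    ax, ay, az = anchor
    return [(x + ax, y + ay, z + az) for x, y, z in points]

# A cloud built around a local origin, shifted to the first image's location.
cloud = [(0.0, 0.0, 0.0), (1.0, 2.0, 0.5)]
print(localize_point_cloud(cloud, (100.0, 200.0, 10.0)))
```

The anchor would come from at least one of the first and second image locations, as the abstract describes, and the identified roadway feature would then be embedded into the shifted cloud.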
CONTACTLESS REAL-TIME 3D MAPPING OF SURFACE EQUIPMENT
Systems and methods include a computer-implemented method for providing a photonic sensing system to perform an automated method to characterize displacement of equipment surfaces and monitor changes in real-time. A three-dimensional (3D) point cloud of one or more objects is generated by an analysis and presentation system using light information collected through structured light illumination by an array of structured-light sensors (SLSes) directed toward the one or more objects. Generating the point cloud includes defining points of the 3D point cloud that are relative to reference points on the one or more objects. Real-time contactless 3D surface measurements of the one or more objects are performed using the 3D point cloud. Changes in one or more parts of the one or more objects are determined by the analysis and presentation system by analyzing the real-time contactless 3D surface measurements.
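The change-determination step, comparing successive surface measurements to flag displacement, can be sketched as a point-by-point comparison of two aligned clouds. The names, the assumption that the clouds are pre-registered with matching point order, and the tolerance check are all illustrative.

```python
def detect_displacement(baseline, current, tolerance):
    """Return indices of points that moved more than `tolerance` between
    two aligned, same-order 3D point sets (illustrative sketch of
    change detection on surface measurements)."""
    moved = []
    for i, (p, q) in enumerate(zip(baseline, current)):
        # Compare squared distances to avoid a square root per point.
        d2 = sum((a - b) ** 2 for a, b in zip(p, q))
        if d2 > tolerance ** 2:
            moved.append(i)
    return moved

# The second point's surface has shifted 0.5 units; only it is flagged.
base = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)]
scan = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.5)]
print(detect_displacement(base, scan, 0.1))  # [1]
```

In the described system the two clouds would come from successive structured-light scans, so the comparison can run continuously for real-time monitoring.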