Patent classifications
G01S5/163
State estimation and localization for ROV-based structural inspection
A vision-based framework for estimating the state of an underwater remotely operated vehicle (ROV) used to inspect an underwater structure, for example a nuclear reactor pressure vessel. The framework employs an external overhead pan-tilt-zoom (PTZ) camera as the primary sensing modality and incorporates prior knowledge of the geometry of the structure.
Estimating pose in 3D space
Methods and devices for estimating the position of a device within a 3D environment are described. Embodiments of the methods include sequentially receiving multiple image segments that form an image representing a field of view (FOV) comprising a portion of the environment. The image includes multiple sparse points, each identifiable based in part on a corresponding subset of the image segments. The method also includes sequentially identifying one or more of the sparse points as each corresponding subset of image segments is received, and estimating a position of the device in the environment based on the identified sparse points.
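The sequential identification step can be sketched as follows (a minimal illustration, not the patented implementation; the point table, segment size, and function name are invented for the example): a sparse point becomes identifiable as soon as the image segments covering its pixel rows have arrived.

```python
# Hypothetical sketch: identify sparse points as image segments (row blocks) arrive.
SPARSE_POINTS = {           # point id -> pixel row where the feature lies
    "p0": 12, "p1": 55, "p2": 140, "p3": 230,
}
ROWS_PER_SEGMENT = 64       # each received segment carries this many image rows

def identify_as_received(num_segments):
    """Return (segment index, point id) pairs in the order points become identifiable."""
    identified, order = set(), []
    for seg in range(num_segments):
        rows_done = (seg + 1) * ROWS_PER_SEGMENT
        for pid, row in SPARSE_POINTS.items():
            if row < rows_done and pid not in identified:
                identified.add(pid)
                order.append((seg, pid))
    return order
```

Pose estimation could then start as soon as the first few points are identified, rather than waiting for the full frame.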
Optical tracking system
An electronic device includes an electromagnetic radiation source having an axis, a set of optics disposed about the axis, a reflector disposed about the axis non-symmetrically, and a controller configured to operate the electromagnetic radiation source while controlling a beam steering orientation (e.g., rotation) of the reflector. The reflector is disposed to reflect electromagnetic radiation emitted by the electromagnetic radiation source. The set of optics is disposed to shape electromagnetic radiation emitted by the electromagnetic radiation source and direct electromagnetic radiation received from the reflector into a panoramic field of view about the axis.
DEVICE AND METHOD FOR CALIBRATING CAMERA FOR VEHICLE
In accordance with an aspect of the present disclosure, there is provided a method of calibrating a camera for a vehicle, comprising: obtaining attitude angle information of the vehicle by using a traveling direction of the vehicle obtained based on a satellite signal, and a vertical direction from ground obtained based on a high definition map; obtaining attitude angle information of the camera mounted on the vehicle by matching an image captured by the camera to the high definition map; and obtaining coordinate system transformation information between the vehicle and the camera by using the attitude angle information of the vehicle and the attitude angle information of the camera.
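The final step — combining the two attitude estimates into a vehicle-to-camera transformation — can be sketched with rotation matrices (a yaw-only simplification for brevity; the variable names and angles are assumptions, not from the disclosure):

```python
import numpy as np

def rot_z(yaw):
    """Rotation matrix about the vertical (z) axis, angle in radians."""
    c, s = np.cos(yaw), np.sin(yaw)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

# Assumed inputs: world attitude of the vehicle (satellite-derived heading plus
# map-derived vertical) and of the camera (from matching the image to the map).
R_world_vehicle = rot_z(np.deg2rad(10.0))
R_world_camera = rot_z(np.deg2rad(35.0))

# A vector v in vehicle coordinates maps to camera coordinates via
#   v_cam = R_world_camera^T @ R_world_vehicle @ v_vehicle,
# so the coordinate-system transformation between vehicle and camera is:
R_vehicle_to_camera = R_world_camera.T @ R_world_vehicle
```

With both attitudes expressed in the same world frame, the relative rotation falls out of a single matrix product; translation would be handled analogously with full poses.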
METHOD AND DEVICE FOR PREDICTING THE TRAJECTORY OF A TRAFFIC PARTICIPANT, AND SENSOR SYSTEM
A computer-implemented method for predicting a trajectory of a traffic participant. Sensor data acquired by at least one vehicle sensor at a plurality of acquisition times is received. Based on the received sensor data, values of at least one motion parameter of the traffic participant are determined for each acquisition time. The trajectory of the traffic participant is predicted using a stochastic regression algorithm which receives the determined values of the at least one motion parameter of the traffic participant as an input.
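One common choice of stochastic regression is Gaussian process regression; the sketch below is illustrative only (the patent does not specify the algorithm's form, and all names and numbers here are invented). It fits a GP to observed positions over acquisition times and queries a predictive mean and variance at a future time.

```python
import numpy as np

def rbf(a, b, ell=1.0):
    """Squared-exponential kernel between two 1-D arrays of times."""
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ell**2)

def gp_predict(t_obs, y_obs, t_new, noise=1e-4):
    """Gaussian process regression: predictive mean and variance at t_new."""
    K = rbf(t_obs, t_obs) + noise * np.eye(len(t_obs))
    k_star = rbf(t_new, t_obs)
    mean = k_star @ np.linalg.solve(K, y_obs)
    cov = rbf(t_new, t_new) - k_star @ np.linalg.solve(K, k_star.T)
    return mean, np.diag(cov)

# Positions of a traffic participant moving at constant velocity (x = 2 t).
t = np.array([0.0, 0.5, 1.0, 1.5, 2.0])
x = 2.0 * t
mean, var = gp_predict(t, x, np.array([2.5]))  # predict half a second ahead
```

The predictive variance is what makes the regression "stochastic": it quantifies how uncertain the predicted trajectory is, growing as the query time moves beyond the observations.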
System and method for measuring position and orientation of a rigid body
A system and method for determining a position and orientation (e.g., pose) of a rigid body. The rigid body may be a position enabled projector, a surveying rod, a power tool, a drill robot, etc., in a given space. The position of the rigid body is specified by a set of three coordinates and the orientation is specified by a set of three angles. As such, based on these six values, the position and orientation of the rigid body can be determined.
LOCATING SYSTEM
An object locating system (100) in which an observation device (104) observes at least three datums (106, 112, 114), each of which has a positioning system that reports its position to the observation device (104). The positioning systems of the datums (106, 112, 114) are calibrated so as to accurately report their relative positions. The observation device (104) has a camera whose field of view (116) contains an object (18) to be located as well as at least two of the datums (106, 112, 114), and a range finder that measures the distance (110) between the observation device (104) and at least one object (18) within the field of view (116) of the camera. A computing device calculates an azimuth (X1, X2) and elevation angle (Y1, Y2) between two datums (112, 114), or between the optical axis of the camera and each datum (112, 114), in the image, so as to triangulate the position and attitude of the camera (104) at the time the image was captured using received position data for each datum (106, 112, 114) at that time; it also calculates an azimuth and elevation angle between the optical axis of the camera and the object (18) in the image. Knowing the position and attitude of the camera (104) and the distance (110) to the object (18) at the time the image was captured, it triangulates the position of the object (18) at that time.
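The camera-position triangulation can be illustrated in 2D (a hedged sketch under simplifying assumptions: absolute azimuths to two datums are taken as known, and elevation and camera attitude are ignored; the function name and coordinates are invented):

```python
import numpy as np

def resect(p1, p2, az1, az2):
    """Locate an observer from two known datum positions and the measured
    azimuths (radians, clockwise from +y) at which the observer sees them."""
    d1 = np.array([np.sin(az1), np.cos(az1)])   # unit bearing to datum 1
    d2 = np.array([np.sin(az2), np.cos(az2)])   # unit bearing to datum 2
    # P_i = O + r_i * d_i  =>  r1*d1 - r2*d2 = p1 - p2; solve for the ranges.
    A = np.column_stack([d1, -d2])
    r = np.linalg.solve(A, np.asarray(p1, float) - np.asarray(p2, float))
    return np.asarray(p1, float) - r[0] * d1    # observer position O
```

The full system works the same way with azimuth and elevation pairs in 3D, and then reverses the construction to triangulate the object from the recovered camera pose and the range-finder distance.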
Image-Based Approach for Device Localization based on a Vehicle Location
Disclosed is an image-based approach for device localization. In particular, a mobile device may capture image(s) of a vehicle that is substantially proximate to the mobile device. Based on the image(s), the mobile device may (i) determine parameter(s) associated with the vehicle, and (ii) determine or obtain a location of the vehicle in accordance with the determined parameter(s). Additionally, the mobile device may use the image(s) as basis for determining a relative location indicating where the mobile device is located relative to the vehicle. Based on the location of the vehicle and on the relative location of the mobile device, the mobile device may then determine a location of the mobile device.
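The final combination step — vehicle location plus image-derived relative location — amounts to transforming the offset into world coordinates (a 2D sketch; the function name, heading convention, and numbers are illustrative assumptions):

```python
import numpy as np

def device_location(vehicle_pos, vehicle_heading, relative_offset):
    """World position of the mobile device, given the vehicle's world
    position/heading and the device's vehicle-relative offset from imagery."""
    c, s = np.cos(vehicle_heading), np.sin(vehicle_heading)
    R = np.array([[c, -s], [s, c]])             # vehicle -> world rotation
    return np.asarray(vehicle_pos, float) + R @ np.asarray(relative_offset, float)
```

The accuracy of the result is bounded by how well the images pin down both the vehicle's identity (and hence its reported location) and the device's offset from it.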
3D WIRELESS OPTICAL POSITIONING METHOD AND SYSTEM
The present invention provides a 3D wireless optical positioning method and system, including the steps of: arranging two LED lamps on the ceiling to transmit optical information and provide illumination; arranging a receiver including two photodetectors in a receiving plane; calculating the distance between each LED lamp and each photodetector through the TOA (Time of Arrival) method; and finally determining the actual position and orientation angle of the receiver from the geometrical relationship between the LED lamps and the photodetectors in the XYZ coordinate system. The two photodetectors are separated by a known distance l and lie in the same receiving plane; the receiver is situated below the two LED lamps, and the region in which the receiver is to be positioned may lie on either side of the plane defined by the two LED lamps and the origin.
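The TOA ranging and geometric solution can be sketched in a single vertical plane (illustrative only; the real system solves in XYZ and uses the second photodetector for orientation — the function names and coordinates are assumptions):

```python
import numpy as np

def toa_distance(t_arrival, t_emit, c=3.0e8):
    """Range from one LED to one photodetector via Time of Arrival."""
    return c * (t_arrival - t_emit)

def intersect_circles(c1, r1, c2, r2):
    """Two intersection points of circles (center, radius) in a 2D plane."""
    c1, c2 = np.asarray(c1, float), np.asarray(c2, float)
    d = np.linalg.norm(c2 - c1)
    a = (d**2 + r1**2 - r2**2) / (2 * d)        # distance from c1 to chord midpoint
    h = np.sqrt(max(r1**2 - a**2, 0.0))         # half-length of the chord
    mid = c1 + a * (c2 - c1) / d
    perp = np.array([-(c2 - c1)[1], (c2 - c1)[0]]) / d
    return mid + h * perp, mid - h * perp
```

Of the two candidate intersections, the constraint that the receiver sits below the lamps selects the physical solution; repeating the construction for the second photodetector and using the known separation l yields the orientation angle.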
Image acquisition method, handle device, head-mounted device and head-mounted system
The embodiments of the disclosure relate to an image acquisition method, a handle device, a head-mounted device and a head-mounted system. The handle device comprises a shell and a control module arranged in the shell. A switch control end of an infrared circuit is connected with the control module, and the infrared LED beads of the infrared circuit emit light outward through the shell; likewise, a switch control end of a visible light circuit is connected with the control module, and the visible light strips of the visible light circuit emit light outward through the shell. By means of the visible and infrared light emitted from the handle, the position of the handle device can be determined, enabling high tracking accuracy.