Patent classifications
G01C3/00
Distance measurement method and distance measurement system
The present disclosure relates to a ranging method and a ranging system. The ranging method includes: using a first laser emitting portion of a laser emitting device to emit a vertical laser beam rotating in a vertical plane at a first rotation speed; calculating, by using a first optical detection component and a second optical detection component on a laser receiving device that are at least partially in the same vertical plane and separated by a first spacing, a time difference between the vertical laser beam passing the first optical detection component and the second optical detection component; and calculating a first distance between the laser emitting device and the laser receiving device based on the first rotation speed, the first spacing, and the time difference.
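One plausible geometry behind this abstract (an assumption, not necessarily the patented construction): the beam sweeps at a known angular rate, so the time difference gives the angle the emitter subtends between the two detectors, and the known spacing then fixes the range. A minimal sketch, with the function name `range_from_sweep` being illustrative:

```python
import math

def range_from_sweep(rotation_hz, spacing_m, time_diff_s):
    """Distance from the laser emitting device to the laser receiving device,
    assuming the two detectors subtend exactly the angle the rotating beam
    sweeps between the two detection events."""
    # Angle swept by the beam between hitting the two detectors (radians)
    theta = 2.0 * math.pi * rotation_hz * time_diff_s
    # The detectors, spacing_m apart, subtend theta at range D:
    # tan(theta / 2) = (spacing_m / 2) / D
    return spacing_m / (2.0 * math.tan(theta / 2.0))
```

For small angles this reduces to the familiar approximation `spacing_m / theta`.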
Waveguide-integrated tunable liquid crystal metasurface devices
Optical receivers and transmitters can be used as stand-alone systems or combined together as a transceiver. Each of the receiver and transmitter may include an optically reflective steerable device, such as an optically reflective liquid crystal metasurface (LCM), to steer optical radiation to a target location. A transmit waveguide conveys optical radiation from a light source to the transmitter steerable device. A receive waveguide conveys received optical radiation reflected by the receiver optically steerable device to a sensor. In some embodiments, the transmit waveguide and the receive waveguide may be portions of the same planar waveguide. The receiver includes a holographic lens between the receiver LCM and the receive waveguide to pass through optical radiation received at a first range of incident angles and modify (e.g., collimate and/or spectrally filter) optical radiation reflected by the receiver LCM for conveyance by the receive waveguide to the sensor.
Working apparatus and working method
An apparatus and a method for performing work free from variations among a plurality of work heads that have individual differences. A working apparatus includes a number n (a natural number of 2 or greater) of work heads, a stage retaining a work object, drive devices moving the stage and the work heads relative to each other in three directions, a work amount measurement device measuring a work amount of each work head, and a control device. A first-direction drive device enables the n work heads to be moved independently in a first direction, and a second-direction drive device includes a second-direction main drive device moving the n work heads simultaneously in a second direction, and a second-direction auxiliary drive device moving n−1 of the n work heads independently in the second direction. A working method is carried out using the working apparatus.
Point layout system using single laser transmitter
A laser controller having an electronic distance measuring instrument and a laser light transmitter that creates a vertical laser plane is used with a remote controller and a movable target for point layout tasks. The electronic distance measurer and laser transmitter are mounted on the same vertical pivot axis. Once the system is set up for a particular jobsite, the laser plane can be aimed at a specific point of interest on the jobsite floor, and a visible laser light line will then appear on the floor, extending from the laser controller all the way to that point of interest. The distance measuring instrument is aimed along the same heading as the laser plane and gives the distance to the movable target, which is moved along the visible laser light line until it reaches the specified distance, thereby locating the point of interest.
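Since the laser plane and the distance measurer share one heading, the point of interest is simply a polar offset from the controller's position. A minimal sketch of that conversion, assuming a flat floor coordinate frame and an illustrative function name `layout_point`:

```python
import math

def layout_point(origin_xy, heading_rad, distance_m):
    """Floor coordinates of the point of interest, given the laser
    controller's position, the heading of the laser plane, and the
    distance reported once the movable target reaches the point."""
    x0, y0 = origin_xy
    return (x0 + distance_m * math.cos(heading_rad),
            y0 + distance_m * math.sin(heading_rad))
```

In practice the heading would come from the controller's pivot encoder and the distance from the electronic distance measuring instrument.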
Augmented reality technology as a controller for a total station
An augmented-reality system is combined with a surveying system to make measurement and/or layout at a construction site more efficient. A reflector can be mounted to a wearable device having an augmented-reality system. A total station can be used to track a reflector, and truth can be transferred to the wearable device while an obstruction is between the total station and the reflector. Further, a target can be used to orient a local map of a wearable device to an environment based on a distance between the target and the wearable device.
Three-dimensional dataset and two-dimensional image localization
The disclosure relates to a corresponding apparatus, computer program, and method for receiving three-dimensional, 3D, map-data, in which a plurality of locations within the 3D-map-data are associated with respective 3D-data-capture-locations of a 3D-camera, and in which 3D-camera-timing-information is associated with each of the plurality of locations; receiving one or more two-dimensional, 2D, images from a 2D-camera, in which 2D-camera-timing-information is associated with each 2D-image, and in which each 2D-image is captured when a movement of the 3D-camera is less than a threshold level; identifying 3D-camera-timing-information associated with locations within the 3D-map-data that correspond to 3D-data-capture-locations with a movement level of the 3D-camera less than the threshold level; and associating, in a combined dataset, each 2D-image with a corresponding location within the 3D-map-data by a data processing unit correlating the 2D-camera-timing-information with the identified 3D-camera-timing-information.
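The correlation step described above can be sketched as a nearest-timestamp match restricted to low-motion 3D capture locations. This is a minimal illustration under assumed data shapes (tuples of timestamp, location, and motion level), with `associate_images` and the threshold semantics being assumptions rather than the patented algorithm:

```python
def associate_images(map_entries, image_entries, motion_threshold):
    """Build the combined dataset described in the abstract.

    map_entries:   list of (timestamp, location, motion_level) from the 3D camera
    image_entries: list of (timestamp, image_id) from the 2D camera
    Returns a dict {image_id: location} pairing each 2D image with the
    low-motion 3D capture location whose timestamp is closest."""
    # Keep only locations where the 3D camera moved less than the threshold
    candidates = [(t, loc) for t, loc, motion in map_entries
                  if motion < motion_threshold]
    combined = {}
    for img_t, img_id in image_entries:
        # Correlate 2D timing information with the identified 3D timing information
        _, loc = min(candidates, key=lambda c: abs(c[0] - img_t))
        combined[img_id] = loc
    return combined
```

A real implementation would likely interpolate between capture locations rather than snap to the nearest timestamp, but the pairing logic is the same.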
CAMERA CONFIGURATION ON MOVABLE OBJECTS
A method for controlling a movable object includes receiving, at one or more processors, a plurality of images from a plurality of imaging devices carried by the movable object; determining, with aid of the one or more processors, an environment type of an environment surrounding the movable object; identifying a target based on the plurality of images and the environment type; determining information of the target based on the plurality of images; and controlling the movable object to avoid the target based on the information of the target. The plurality of imaging devices includes a first imaging device arranged at an upper surface of a body of the movable object and a second imaging device arranged at a lower surface of the body. Each of the first imaging device and the second imaging device has a field of view greater than 150 degrees.
Method for distance measurement using trajectory-based triangulation
A method for ascertaining a distance between a vehicle and a projection surface, onto which a characteristic light pattern is projected using a headlight of the vehicle, includes: detecting, in an image of the characteristic light pattern captured by an image capturing unit, a characteristic structure produced by a first light-producing unit, by evaluating a geometric location relationship in the captured image between a trajectory and characteristic structures of the characteristic light pattern that are located in an environment along the trajectory; calculating a point on the ray path that is correlated with a position of the detected characteristic structure on the trajectory in accordance with a transformation rule; and calculating the distance between the vehicle and the projection surface from the calculated point on the ray path.
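The underlying triangulation can be illustrated with textbook geometry: the headlight and the camera form a baseline, and the projector ray and the camera ray to the detected structure close a triangle with the surface point. This is a generic sketch of that triangle, not the patent's trajectory-and-transformation-rule procedure; `triangulate_distance` and the angle parameterization are assumptions:

```python
import math

def triangulate_distance(baseline_m, alpha_rad, beta_rad):
    """Perpendicular distance from the headlight-camera baseline to the
    surface point, by the law of sines in the triangle formed by the
    projector ray (angle alpha from the baseline), the camera ray
    (angle beta from the baseline), and the baseline itself."""
    gamma = math.pi - alpha_rad - beta_rad   # angle at the surface point
    # Law of sines: projector-to-surface range over sin(beta)
    # equals baseline over sin(gamma)
    r = baseline_m * math.sin(beta_rad) / math.sin(gamma)
    # Project that range onto the direction perpendicular to the baseline
    return r * math.sin(alpha_rad)
```

With both rays at 60 degrees to a 1 m baseline the triangle is equilateral, so the distance is sin(60°) of the unit range.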