Patent classification
G06V2201/121
PHASE UNWRAPPING USING SEGMENTATION
Methods, apparatus, and systems for processing interferograms in metrology applications are described. In one example aspect, a method includes obtaining an input phase image based on the interferograms; segmenting the input phase image by classifying it into multiple regions based on the phase value and location of each pixel; assigning an integer phase-offset value to each of the multiple regions; and constructing an output phase image based on the input phase image and the phase offset of each region.
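The abstract's pipeline (classify pixels by phase value and location, label connected regions, give each region an integer number of 2π offsets) can be made concrete. Below is a minimal, illustrative NumPy/SciPy sketch; the bin count, the breadth-first offset assignment, and all function names are my assumptions rather than the patent's actual algorithm, and real interferograms would need noise handling this toy omits.

```python
import numpy as np
from collections import defaultdict, deque
from scipy import ndimage

def segment_phase(wrapped, n_bins=8):
    """Classify pixels by wrapped phase value, then split each class into
    spatially connected regions (i.e. phase value plus pixel location)."""
    bins = np.minimum(((wrapped + np.pi) / (2 * np.pi) * n_bins).astype(int),
                      n_bins - 1)
    labels = np.zeros(wrapped.shape, dtype=int)
    n_regions = 0
    for b in range(n_bins):
        lab, n = ndimage.label(bins == b)
        labels[lab > 0] = lab[lab > 0] + n_regions
        n_regions += n
    return labels, n_regions

def unwrap(wrapped, n_bins=8):
    labels, n_regions = segment_phase(wrapped, n_bins)
    # Mean wrapped-phase jump across each pair of touching regions.
    jump_sum, jump_cnt = defaultdict(float), defaultdict(int)
    for la, lb, dw in (
        (labels[:-1, :], labels[1:, :], wrapped[:-1, :] - wrapped[1:, :]),
        (labels[:, :-1], labels[:, 1:], wrapped[:, :-1] - wrapped[:, 1:]),
    ):
        m = la != lb
        for a, b, d in zip(la[m], lb[m], dw[m]):
            jump_sum[(a, b)] += d
            jump_cnt[(a, b)] += 1
    mean_jump = {p: jump_sum[p] / jump_cnt[p] for p in jump_cnt}
    adj = defaultdict(set)
    for a, b in jump_cnt:
        adj[a].add(b)
        adj[b].add(a)
    # Breadth-first walk over the region adjacency graph: each region gets
    # an integer number of 2*pi offsets relative to the seed region.
    k = {labels.flat[0]: 0}
    queue = deque(k)
    while queue:
        a = queue.popleft()
        for b in adj[a]:
            if b not in k:
                d = mean_jump.get((a, b), -mean_jump.get((b, a), 0.0))
                k[b] = k[a] + int(round(d / (2 * np.pi)))
                queue.append(b)
    offsets = np.zeros(n_regions + 1)
    for region, v in k.items():
        offsets[region] = v
    return wrapped + 2 * np.pi * offsets[labels]

# Toy check: recover a wrapped horizontal ramp (up to a global 2*pi shift).
true = np.tile(np.linspace(0, 6 * np.pi, 256), (64, 1))
recovered = unwrap(np.angle(np.exp(1j * true)))
```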
THREE DIMENSIONAL IMAGING WITH INTENSITY INFORMATION
A method for operating a time-of-flight sensor system includes generating, by an array of pixels of a time-of-flight sensor of the system, signal data representative of reflected light from an environment; generating an intensity representation of an object in the environment based on the signal data; determining that the intensity representation indicates that the object is a target object; and, responsive to that determination, generating a three-dimensional representation of the environment based on the signal data.
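For concreteness, here is a minimal sketch of the gating idea using the standard four-phase continuous-wave ToF reconstruction (the abstract does not commit to a particular pixel design): a cheap intensity (amplitude) image is computed first, and the per-pixel range image is only produced when that intensity image suggests a target is present. The threshold test is a stand-in for whatever detector a real system would use.

```python
import numpy as np

C = 299_792_458.0  # speed of light, m/s

def looks_like_target(amplitude, level=0.2, min_fraction=0.01):
    """Stand-in for the target test: 'enough sufficiently bright pixels'.
    A real system would run a detector or classifier here."""
    return (amplitude > level).mean() > min_fraction

def process_frame(s0, s1, s2, s3, f_mod=20e6):
    """s0..s3: per-pixel correlation samples at 0/90/180/270 degrees."""
    i, q = s0 - s2, s3 - s1
    amplitude = 0.5 * np.hypot(i, q)          # intensity representation
    if not looks_like_target(amplitude):
        return None                            # skip the 3-D reconstruction
    phase = np.mod(np.arctan2(q, i), 2 * np.pi)
    return C * phase / (4 * np.pi * f_mod)     # per-pixel range, metres
```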
Scanned beam display with multiple detector rangefinding
A scanning display system includes two detectors for rangefinding. Round-trip times of flight are measured for reflections of laser pulses received at the detectors. A proportional correction factor is determined based at least in part on the geometry of the scanning display system and applied to the measured times of flight to produce more accurate time-of-flight estimates.
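A toy illustration of a geometry-derived proportional correction, assuming one simple layout (detector displaced from the scanned emitter by a fixed baseline, targets near a nominal working range); the patent's actual geometry and factor derivation are not given in the abstract.

```python
import numpy as np

C = 299_792_458.0  # m/s

def corrected_range(t_measured, baseline_m, nominal_range_m):
    """With a detector displaced from the scanned emitter by a baseline b,
    the optical path is r + sqrt(r^2 + b^2) rather than 2r; near a nominal
    working range r0 this bias is well approximated by a constant scale
    factor on the measured time of flight."""
    r0, b = nominal_range_m, baseline_m
    k = 2 * r0 / (r0 + np.sqrt(r0**2 + b**2))  # proportional correction
    return C * (k * t_measured) / 2.0          # range from corrected ToF

def fused_range(t1, t2, b1, b2, nominal_range_m):
    # One corrected estimate per detector, then a simple average.
    return 0.5 * (corrected_range(t1, b1, nominal_range_m)
                  + corrected_range(t2, b2, nominal_range_m))
```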
Dual-pattern optical 3D dimensioning
An optical dimensioning system includes one or more light-emitting assemblies configured to project one or more predetermined patterns on an object; an imaging assembly configured to sense light scattered and/or reflected off the object and to capture an image of the object while the patterns are projected; and a processing assembly configured to analyze the image of the object to determine one or more dimension parameters of the object. The light-emitting assembly may include a single-piece optical component configured to produce a first pattern and a second pattern. The patterns may be distinguishable based on directional filtering, feature detection, feature-shift detection, or the like. A method for optical dimensioning includes illuminating an object with at least two detectable patterns and calculating dimensions of the object by analyzing the pattern separation of the elements comprising the projected patterns. One or more pattern generators may produce the patterns.
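A short sketch of two pieces of the pipeline, under stated assumptions: separating two overlaid stripe patterns by directional filtering in the Fourier domain (one plausible reading of "distinguishable based on directional filtering"), and the standard triangulation that turns an observed pattern shift into depth. All names and parameters are illustrative.

```python
import numpy as np

def separate_patterns(image):
    """Split one capture into its two projected patterns by directional
    filtering: keep predominantly horizontal vs. vertical spatial
    frequencies."""
    spec = np.fft.fft2(image)
    fy = np.fft.fftfreq(image.shape[0])[:, None]
    fx = np.fft.fftfreq(image.shape[1])[None, :]
    vertical_stripes = np.abs(fx) > np.abs(fy)       # energy along x
    pat_a = np.fft.ifft2(spec * vertical_stripes).real
    pat_b = np.fft.ifft2(spec * ~vertical_stripes).real
    return pat_a, pat_b

def depth_from_shift(shift_px, baseline_m, focal_px):
    """Plain triangulation: a projected element seen shifted by `shift_px`
    sits at depth focal * baseline / shift."""
    return focal_px * baseline_m / shift_px
```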
Medicine inspection assistance device, image processing device, image processing method, and program
A medicine inspection assistance device, an image processing device, and an image processing method are provided that appropriately recognize identification information irrespective of whether that information is an engraved mark or a printed character. The image processing device obtains a plurality of captured images of a medicine; performs a process of enhancing the engraved-mark portion of the medicine based on at least one of the captured images, generating a first enhanced image; performs a process of enhancing the printed-character portion of the medicine based on at least one of the captured images, generating a second enhanced image; collates an integrated image, obtained by integrating the first and second enhanced images, with a master image; and determines whether the medicine to be dispensed and the dispensed medicine are identical.
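A rough sketch of the two enhancement passes and the collation step, with commonplace stand-ins: a morphological black-hat for shallow dark engravings, background subtraction for printed ink, a per-pixel maximum as the integration, and zero-mean correlation as the collation score. None of these operators is confirmed by the abstract; they only make the data flow concrete.

```python
import numpy as np
from scipy import ndimage

def enhance_engraving(img):
    """Engraved marks read as shallow dark grooves: a morphological
    black-hat (closing minus image) is one simple way to pull them out."""
    return ndimage.grey_closing(img, size=7) - img

def enhance_print(img):
    """Printed characters are high-contrast ink: suppress slow shading
    with a large median filter and keep what departs from it."""
    background = ndimage.median_filter(img, size=31)
    return np.clip(background - img, 0, None)

def integrate(engraved, printed):
    # Per-pixel maximum after normalising each channel to [0, 1].
    def norm(x):
        return (x - x.min()) / (np.ptp(x) + 1e-9)
    return np.maximum(norm(engraved), norm(printed))

def matches_master(integrated, master, threshold=0.6):
    """Collation stub: zero-mean correlation against the master image."""
    a = integrated - integrated.mean()
    b = master - master.mean()
    score = (a * b).sum() / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-9)
    return score > threshold
```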
TARGET DETECTION AND CONTROL METHOD, SYSTEM, APPARATUS AND STORAGE MEDIUM
The present disclosure provides a target detection and control method, system, apparatus, and readable storage medium, and relates to the technical field of computer vision. The method may include acquiring a first image captured by an imaging device while a laser light of a first predetermined wavelength is emitted; acquiring a second image captured by the imaging device while a light of a second predetermined wavelength is emitted, where the two lights may have the same wavelength or different wavelengths; obtaining a distance between a target object and the imaging device based on the first image; and identifying the target object based on the second image.
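A minimal sketch of the two-image flow. The abstract does not say how distance is derived from the first image, so the spot triangulation below is purely an assumed stand-in, as is the caller-supplied `classify` function for the identification step.

```python
import numpy as np

def distance_from_spot(image_w1, baseline_m, focal_px, center_col):
    """Assumed ranging scheme: the first-wavelength laser forms a bright
    spot whose column offset from the optical centre triangulates range."""
    _, col = np.unravel_index(np.argmax(image_w1), image_w1.shape)
    disparity = abs(col - center_col)
    return float("inf") if disparity == 0 else focal_px * baseline_m / disparity

def detect_and_identify(image_w1, image_w2, classify,
                        baseline_m=0.05, focal_px=800.0, center_col=320):
    distance = distance_from_spot(image_w1, baseline_m, focal_px, center_col)
    label = classify(image_w2)   # identification from the second image
    return distance, label
```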
POSITION-DETERMINING DEVICE
A position-determining device, and a milking device incorporating it, determine the relative position of an object and include a 3D time-of-flight camera with a 2D arrangement of pixels configured to repeatedly record an image of a space. A connected control unit includes an image-processing device. The 3D time-of-flight camera has a controllable light source and is configured to record a 2D image by means of reflected emitted light and to collect distance information. The image-processing device is configured to recognize an object in the 2D image using image-processing criteria and to determine the distance information and relative position by analysing the 2D image together with the distance information. Because distance information, which is often much noisier than 2D brightness information, can be determined with far fewer image points, the position is determined more quickly and reliably.
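The closing observation suggests a simple structure: localize the object with the dense, low-noise brightness image, then read range from only the few depth samples inside that region. A compact sketch, with an arbitrary threshold standing in for the abstract's image-processing criteria:

```python
import numpy as np

def relative_position(intensity, depth, threshold):
    """Locate the object in the low-noise 2-D brightness image first, then
    read range only from the depth samples inside that region; the noisy
    distance channel then needs far fewer usable image points."""
    mask = intensity > threshold
    if not mask.any():
        return None
    rows, cols = np.nonzero(mask)
    z = np.median(depth[mask])           # robust range over the object's pixels
    return cols.mean(), rows.mean(), z   # x, y in pixels; z in depth units
```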
Article damage evaluation
Implementations of the present specification provide article damage evaluation methods and apparatuses. In one aspect, the method includes: determining, by a terminal device, that a photographing device is in an active state; responsive to that determination, identifying, from the field of view of the photographing device, a particular surface region of the article that encompasses the damage; obtaining an image of that surface region using the photographing device; determining surface structure information and surface material information of the surface region using one or more infrared emitters and one or more infrared receivers; and generating, from the image, the surface structure information, and the surface material information, based on a predetermined damage evaluation model, an output specifying the degree of damage to the article.
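As a data-flow sketch only: the five steps of the method wired together, with hypothetical stubs for the camera, the infrared sensing, and the damage evaluation model (none of these interfaces comes from the abstract).

```python
from dataclasses import dataclass

@dataclass
class DamageReport:
    degree: float  # e.g. 0.0 (none) .. 1.0 (severe)

def evaluate_damage(camera, ir_sensor, model):
    """Pipeline order only; camera, ir_sensor and model are hypothetical
    stubs standing in for the terminal's components."""
    if not camera.is_active():                     # photographing device on?
        return None
    region = camera.find_damage_region()           # surface region with damage
    image = camera.capture(region)                 # image of that region
    structure, material = ir_sensor.probe(region)  # IR structure + material
    return DamageReport(degree=model.predict(image, structure, material))
```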
DEPTH SENSING USING LINE PATTERN GENERATORS
A distance measurement system includes two or more line pattern generators (LPGs), a camera, and a processor. Each LPG emits a line pattern having a set of dark portions separated by a respective set of bright portions. A first line pattern has a first angular distance between adjacent bright portions, and a second line pattern has a second angular distance between adjacent bright portions. The camera captures at least one image of the first and second line patterns and is a first distance from the first LPG and a second distance from the second LPG. The processor identifies a target object illuminated by the first and second line patterns and determines a distance to the target object based on the appearance of the target object as illuminated by both patterns.
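One way two different angular pitches can pin down distance is multi-frequency disambiguation: each pattern alone gives the target's position only up to an unknown integer stripe index, but only one distance is consistent with both observed stripe phases. The sketch below brute-forces that distance under an assumed linear phase-versus-inverse-depth calibration; it illustrates the principle, not the patent's method.

```python
import numpy as np

def distance_from_two_patterns(phase1, phase2, k1, k2, d_min=0.3, d_max=5.0):
    """Toy disambiguation using two stripe pitches. At a fixed pixel,
    triangulation makes the observed fractional stripe phase roughly
    linear in inverse depth (disparity ~ baseline / depth), with slopes
    k1, k2 taken from calibration; those slopes and the brute-force
    search are assumptions.

    phase1, phase2: observed positions within one bright/dark period, [0, 1)
    """
    inv_d = np.linspace(1.0 / d_max, 1.0 / d_min, 20_000)

    def wrap_err(pred, obs):
        e = np.abs(pred % 1.0 - obs)
        return np.minimum(e, 1.0 - e)

    # Pick the distance at which both patterns' predicted phases match.
    cost = wrap_err(k1 * inv_d, phase1) + wrap_err(k2 * inv_d, phase2)
    return 1.0 / inv_d[np.argmin(cost)]
```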