Patent classification: G01B11/03
DEVICE AND METHOD FOR MEASURING TOOLS
A device for determining a dimension of a tool having a cutting edge includes a first light source configured to emit light parallel to a first axis, an image sensor which is associatable with a second axis extending orthogonally to the image sensor, and an analyzing unit. The first axis and the second axis are inclined relative to each other. The device is configured such that the light emitted from the first light source is reflectable by the cutting edge of the tool in such a way that light spots arranged in a line on the image sensor are generatable by the reflected light. The analyzing unit is configured to determine positions of the light spots. The dimension of the tool is determinable based on the positions of the light spots.
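The principle in this abstract can be sketched as a small geometry computation: the lateral displacement of the reflected spots on the sensor, scaled by the pixel pitch and the inclination between the two axes, yields the tool dimension. All parameters below (pixel pitch, tilt angle, reference pixel) are hypothetical calibration values, not figures from the patent.

```python
import math

def tool_dimension_from_spots(spot_px, px_pitch_mm=0.005, tilt_deg=30.0, ref_px=1024.0):
    """Estimate a tool dimension from light-spot positions on an image sensor.

    spot_px: pixel coordinates of the detected spot centroids, which the
    abstract says lie on a line. The model is a simplification: the mean
    spot offset from a calibrated reference pixel, converted to millimeters,
    is projected through the inclination between the illumination axis and
    the sensor axis.
    """
    tilt = math.radians(tilt_deg)
    # The spots lie on a line, so their mean position is a robust estimate.
    mean_px = sum(spot_px) / len(spot_px)
    # Convert the pixel offset into a physical distance on the sensor.
    offset_mm = (mean_px - ref_px) * px_pitch_mm
    # Project through the inclination between the first and second axes.
    return offset_mm / math.sin(tilt)
```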
Aligning parts using multi-part scanning and feature based coordinate systems
Provided are methods and systems for aligning multiple parts using simultaneous scanning of features of different parts and using feature-based coordinate systems for determining relative positions of these. Specifically, a feature-based coordinate system may be constructed using one or more critical dimensions between features of different parts. The scanner may be specifically positioned to capture each of these critical dimensions precisely. The feature-based coordinate system is used to compare the critical dimensions to specified ranges. The position of at least one part may be adjusted based on results of this comparison using, for example, a robotic manipulator. The process may be repeated until all critical dimensions are within their specified ranges. In some embodiments, multiple sets of features from different parts are used such that each set uses its own feature-based coordinate system. The part adjustment may be performed based on the collective output from these multiple sets.
Systems and methods for diffraction line imaging
A novel class of imaging systems that combines diffractive optics with 1D line sensing is disclosed. When light passes through a diffraction grating or prism, it disperses as a function of wavelength. This property is exploited to recover 2D and 3D positions from line images. A detailed image formation model and a learning-based algorithm for 2D position estimation are disclosed. The disclosure includes several extensions of the imaging system to improve the accuracy of the 2D position estimates and to expand the effective field-of-view. The invention is useful for fast passive imaging of sparse light sources, such as streetlamps, headlights at night and LED-based motion capture, and structured light 3D scanning with line illumination and line sensing.
Method and device for analyzing the interaction between a surface of a sample and a liquid
A method for analyzing an interaction between a sample surface and a drop of liquid comprises applying the drop of liquid to the sample surface and illuminating the drop of liquid using at least two light sources. The at least two light sources are each arranged at a light source position surrounding the drop of liquid. Light reflected from the drop of liquid is detected, and a sensor position on a sensor of a camera is determined for each detected light reflection. The detected light reflections are assigned to individual light source positions. A position of the drop of liquid relative to the sensor is calculated and an item of size information of the drop of liquid is determined. The position and the item of size information are calculated from the pairs of one sensor position and one associated light source position.
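The pair-based calculation can be illustrated in a reduced 2D setting: with an orthographic camera looking straight down, each light source produces a glint where the drop's surface normal bisects the viewing and source directions, so each (sensor position, source direction) pair gives one linear equation in the drop's center coordinate and radius. This geometry is a deliberate simplification for illustration, not the patent's full model.

```python
import math

def drop_from_glints(glints):
    """Estimate drop center x and radius from (sensor_x, source_angle_deg) pairs.

    For each pair: sensor_x = cx + r * hx, where h is the normalized
    half-vector between the vertical viewing direction and the source
    direction. Two or more pairs yield a least-squares problem in (cx, r),
    solved here via the 2x2 normal equations.
    """
    a11 = a12 = a22 = b1 = b2 = 0.0
    for u, angle_deg in glints:
        a = math.radians(angle_deg)          # source angle from vertical
        sx, sz = math.sin(a), math.cos(a)    # unit vector toward the source
        hx, hz = sx, sz + 1.0                # add vertical view direction (0, 1)
        hx /= math.hypot(hx, hz)             # x-component of the half-vector
        # Accumulate normal equations for the model cx + r*hx = u.
        a11 += 1.0;  a12 += hx;  a22 += hx * hx
        b1 += u;     b2 += hx * u
    det = a11 * a22 - a12 * a12
    cx = (b1 * a22 - b2 * a12) / det
    r = (a11 * b2 - a12 * b1) / det
    return cx, r
```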
Three-dimensional measurement apparatus, processing method, and non-transitory computer-readable storage medium
A three-dimensional measurement apparatus selects points corresponding to geometric features of a three-dimensional shape model of a target object, projects a plurality of selected points corresponding to the geometric features onto a range image based on approximate values indicating the position and orientation of the target object and imaging parameters at the time of imaging of the range image, searches regions of predetermined ranges respectively from the plurality of projected points for geometric features on the range image which correspond to the geometric features of the three-dimensional shape model, and associates these geometric features with each other. The apparatus then calculates the position and orientation of the target object using differences of distances on a three-dimensional space between the geometric features of the three-dimensional shape model and those on the range image, which are associated with each other.